4 Sources
[1]
Snowflake launches Openflow to help businesses manage data in the age of AI
The news arrives at a time when many businesses are struggling to adapt to a shifting regulatory landscape and the rise of powerful new AI tools. Data is the fuel behind the AI revolution -- the foundational building block for the new technological world order. But data is immaterial, difficult to organize, and subject to an ever-growing mountain of walled gardens and regulatory decrees. Businesses seeking to harness AI therefore often struggle to make the most of their data, this most vital of resources.

Enter Snowflake. At its annual Snowflake Summit user conference, the company announced the release of Openflow, a new service designed to integrate businesses' data into a single, unified, and intelligible channel. Like disparate streams flowing into a single river, Openflow takes the whole of a company's data -- structured, unstructured, batch, and streaming -- and collects it in such a way that it can be more easily visualized and leveraged.

The platform is also intended to simplify the process of creating new AI systems, including agents, which can automatically perform tasks on behalf of human users and work flexibly across an organization's digital ecosystem. "With Snowflake Openflow, we're redefining what open, extensible, and managed data integration looks like, so our customers can quickly build AI-powered apps and agents without leaving their data behind," Chris Child, VP of Product for Data Engineering at Snowflake, said in a statement.

Companies today have to manage a vast amount of data coming from various sources. Every marketing email, internal presentation, customer service interaction, financial statement, video file, and market research survey represents a valuable bit of information that must be collected and stored.
The rise of AI has complicated the picture further, as models are trained on the ingestion of this internal, multimodal data. It's a bit like an international corporation managing a vast network of mines on different continents. Such a corporation would require an equally vast bureaucracy to ensure that the quota for every individual ore is being met, and that each gets subsequently transported to wherever in the world it needs to go.

Snowflake has sought to occupy that managerial role at a time when companies no longer primarily depend on physical materials like coal or iron, but on digital information. Openflow is the company's latest step toward achieving that goal: the platform "makes the process of getting data from where it is created to where it can be used effortless," the company said in a press release. Snowflake isn't the only company with its eye on this burgeoning and valuable niche: Box also recently announced that it will soon release its own AI agents that can help businesses organize and retrieve internal data.

Openflow's ability to operate across all of an organization's data streams -- "interoperability," in tech parlance -- opens the door to some powerful benefits, the company said. For one, Openflow will enable customers to build custom data build tool (dbt) projects directly within the platform, a feature that will soon launch in public preview. It's also integrated with Apache Iceberg, an open table format that makes it easy to manage and track data files from across the full swath of a company's internal data estate.

And thanks to another new feature called Snowpipe Streaming, now available in public preview, data streaming in Openflow has been ratcheted up to 10 gigabytes per second, with "significantly reduced latency," according to the company.
All of these features have been built with data security and governance in mind -- both of which are key considerations at a time when the proliferation of AI tools is also ramping up cybersecurity risks.
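The micro-batched streaming-ingest idea behind features like Snowpipe Streaming can be illustrated with a toy sketch: events are buffered and flushed in small batches so rows become queryable shortly after arrival. The class name, batch size, and in-memory "table" below are illustrative assumptions, not Snowpipe Streaming's actual design.

```python
class StreamIngester:
    """Toy micro-batching ingester: buffer events, flush in small batches."""

    def __init__(self, flush_every=3):
        self.flush_every = flush_every
        self.buffer = []   # events awaiting a flush
        self.table = []    # stand-in for the queryable destination table

    def ingest(self, event):
        """Accept one event; flush automatically once the batch fills."""
        self.buffer.append(event)
        if len(self.buffer) >= self.flush_every:
            self.flush()

    def flush(self):
        """Move the buffered batch into the destination table."""
        self.table.extend(self.buffer)
        self.buffer.clear()


# Usage: two events fill a batch and land in the table; the third waits.
ingester = StreamIngester(flush_every=2)
for event in ["click", "view", "purchase"]:
    ingester.ingest(event)
```

The tradeoff this shape captures is latency versus overhead: smaller batches make data queryable sooner at the cost of more frequent flushes.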
[2]
Snowflake's Openflow tackles AI's toughest engineering challenge: Data ingestion at scale
For anyone in AI, it's no big news that "data is the real prize." If you have strong data foundations, your models and the applications powered by them will be right on the money. But that's where it gets messy. Building that foundation is no piece of cake, especially when there are dozens of data sources, each hosting valuable information. You need to build and maintain integration pipelines for each source -- a massive engineering burden for data teams juggling disparate ETL tools to centralize what's needed to power AI workloads. At scale, these pipelines become rigid bottlenecks -- hard to adapt, extend, or expand.

Snowflake thinks it has an answer. Today, at its annual summit, the company announced the general availability of Openflow -- a fully managed data ingestion service that pulls any type of data from virtually any source, streamlining the process of mobilizing information for rapid AI deployment.

How does it work? Powered by Apache NiFi, Openflow uses connectors -- prebuilt or custom -- with Snowflake's embedded governance and security. Whether it's unstructured multimodal content from Box or real-time event streams, Openflow plugs in, unifies, and makes all data types readily available in Snowflake's AI Data Cloud.

"Data engineers often faced a critical tradeoff: if they wanted highly controllable pipelines, they encountered complexity and significant infrastructure management. If they wanted a simple solution, they encountered issues of limited privacy, flexibility and customization. Openflow meets customers where their data lives, providing deployment flexibility and guaranteeing security and governance along the way," Chris Child, VP of Product, Data Engineering, at Snowflake, told VentureBeat.
While Snowflake has offered ingestion options like Snowpipe for streaming or individual connectors, Openflow delivers a "comprehensive, effortless solution for ingesting virtually all enterprise data."

"Snowflake's Snowpipe and Snowpipe Streaming remain a key foundation for customers bringing data into Snowflake, and focus on the 'load' of the ETL process. Openflow, on the other hand, handles the extraction of data directly from source systems, then performs the transform and load processes. It is also integrated with our new Snowpipe Streaming architecture, so data can be streamed into Snowflake once it is extracted," he explained.

This ultimately unlocks new use cases where AI can analyze a complete picture of enterprise data, including documents, images, and real-time events, directly within Snowflake. Once the insights are extracted, they can return to the source system using the connector.

Over 200 connectors available

Openflow currently supports 200+ ready-to-use connectors and processors, covering services like Box, Google Ads, Microsoft SharePoint, Oracle, Salesforce Data Cloud, Workday, and Zendesk.

"Box's integration with Snowflake Openflow...leverages data extraction from Box using Box AI, honors the original permissions for secure access, and feeds that data into Snowflake for analysis. It also enables a two-way flow in which enriched insights or metadata can be written back to Box, making content smarter over time," Ben Kus, CTO at Box, told VentureBeat.

Creating new connectors takes just a few minutes, speeding up time to value. Users also get security features such as role-based authorization, encryption in transit, and secrets management to keep data protected end-to-end.

"Organizations that require real-time data integration, deal with high volumes of data from various sources, or rely on unstructured data like images, audio, and video to derive value from will benefit immensely from Openflow," Child added.
A retail company, for instance, could unify siloed data from sales, ecommerce, CRM, and social media to deliver personalized experiences and optimized operations. Snowflake customers Irwin, Securonix, and WorkWave are among those set to use Openflow to move and scale global data -- though the company hasn't disclosed exact adoption numbers.

What's next?

As the next step, Snowflake aims to make Openflow the backbone of real-time, intelligent data movement across distributed systems -- powering the age of AI agents. "We're focusing on moving events at a massive scale and enabling real-time, agent-to-agent bi-directional communication, so insights and actions flow seamlessly across distributed systems. For example, a Cortex Agent handing over events to other enterprise agents from other systems, like ServiceNow," Child said. The timeline for these upgrades remains unclear for now.
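Child's distinction between the "load" step (Snowpipe's job) and the full extract-transform-load flow (Openflow's job) can be sketched in miniature. The following Python is a hypothetical stand-in: the function names, record shapes, and in-memory "warehouse" are invented for illustration and are not Snowflake's API.

```python
def extract(source_records):
    """Pull raw records from a source system (here, an in-memory stand-in)."""
    return list(source_records)


def transform(records):
    """Normalize raw records into a common shape before loading."""
    return [
        {"id": r["id"], "text": r.get("body", "").strip().lower()}
        for r in records
    ]


def load(records, table):
    """Append transformed rows to a destination table (a list stand-in)."""
    table.extend(records)
    return len(records)


# Usage: a tiny end-to-end run over fake CRM records.
source = [{"id": 1, "body": "  Hello "}, {"id": 2, "body": "WORLD"}]
warehouse = []
loaded = load(transform(extract(source)), warehouse)
```

A load-only service would own just the final function; the point of an end-to-end ingestion service is owning all three stages, so the source-specific extraction logic no longer has to live in a separate tool.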
[3]
Snowflake Just Killed the Data Pipeline as We Know It | AIM
Once seen purely as a data warehousing powerhouse, Snowflake is undergoing a major reinvention. At its Snowflake Summit 2025, underway in San Francisco, the company unveiled a sweeping set of AI products that reimagine how data is ingested, processed, and turned into intelligence, all within one unified platform.

Traditional ETL (Extract, Transform, and Load) processes often involve integrating multiple separate tools, such as Talend or Informatica for data integration, Airflow for orchestration, and Spark for processing, to build complex data pipelines. Combining these tools to handle extraction, transformation, and loading tasks can lead to complexity, higher costs, and maintenance overhead. Snowflake's Openflow, by contrast, is a new multimodal ingestion service powered by Apache NiFi that helps enterprises pull data from diverse sources and formats into Snowflake's AI Data Cloud.

"This is the productisation of an acquisition we made a few months ago of a company called Datavolo. Openflow is a managed service that helps organisations both extract data from a variety of sources and be able to process it," said Christian Kleinerman, EVP of product at Snowflake, in a media briefing.

Openflow allows customers to move data from where it is created to where it is needed, supporting both batch and streaming modes. It features hundreds of pre-built connectors and processors, and offers extensibility to build custom connectors. The service supports Snowflake's Bring Your Own Cloud deployment model and is now generally available on AWS. The platform also removes existing bottlenecks in data engineering, including rigid pipelines, fragmented stacks, and slow ingestion. Openflow supports both structured and unstructured data and integrates with sources like Box, Google Ads, Oracle, Salesforce Data Cloud, Workday, and Microsoft SharePoint.
"Most of our customers are interested in loading data into Snowflake or making it available to Snowflake," said Kleinerman. He further added that their goal is to simplify data movement and processing from any one source to any other destination. With Openflow, Snowflake is also extending its data engineering capabilities. Customers will soon be able to run dbt Projects natively in Snowflake with support for features like in-line AI code assistance and Git integration. The capability will be available within Snowflake Workspaces, a new file-based development environment. These projects will eventually be powered by dbt Fusion. Snowflake also announced expanded support for Apache Iceberg tables, which allows organisations to build a connected lakehouse view and access semi-structured data using Snowflake's engine. New optimisations for file size and partitions are expected to improve performance and control. Snowpipe Streaming, now in public preview, adds support for high-throughput, low-latency data ingest, with data becoming queryable within 5 to 10 seconds. This further improves Openflow's ability to manage near-real-time data streams. Besides, Snowflake has announced new agentic AI offerings at its annual user conference, including two innovations called Snowflake Intelligence and Data Science Agent. Snowflake Intelligence, launching soon in public preview, allows non-technical users to query and act on structured and unstructured data through natural language prompts. The product is powered by Cortex Agents and LLMs from OpenAI and Anthropic, and runs directly inside customers' Snowflake environments, inheriting security and governance controls. "Snowflake Intelligence breaks down these barriers by democratising the ability to extract meaningful intelligence from an organisation's entire enterprise data estate -- structured and unstructured data alike," said Baris Gultekin, head of AI at Snowflake. 
Snowflake Intelligence also incorporates third-party content through Cortex Knowledge Extensions, including CB Insights, Packt, Stack Overflow, The Associated Press, and USA TODAY.

Data Science Agent, meanwhile, automates core machine learning tasks using Claude from Anthropic. These tasks include data preparation, feature engineering, and model training. The agent provides verified ML pipeline code and allows users to iterate through suggestions or follow-ups. "We're leveraging AI to help customers create machine learning pipelines, writing code, validating it, and ultimately automating the end-to-end ML lifecycle," said Kleinerman. The company claims the agent reduces the time spent on debugging and experimentation, allowing data scientists to prioritise higher-impact work.

These launches are part of Snowflake's broader push to enable enterprise AI use cases. For analytics, Snowflake has also launched AISQL, which extends its SQL language to include AI operations as simple function calls. "The goal of this is to bring the power of AI to analysts and personas that are typically comfortable with database technology," Kleinerman explained. This includes processing text for sentiment analysis and classification, and supporting multimodal data like PDFs, audio, and images. Analysts can now enrich tables with chat transcripts, correlate sensor data with images, and merge structured data with sources like social media sentiment -- all in one interface. The tool integrates with sources like Box, Google Drive, Workday, and Zendesk using Snowflake Openflow and supports natural language conversations that return insights, generate visualisations, and surface business knowledge.

The company also introduced SnowConvert AI, an agent that automates data migrations from platforms such as Oracle, Teradata, and Google BigQuery. It reduces the need for manual code rewriting and validation, and accelerates database, BI, and ETL migration processes by two to three times.
"SnowConvert AI enables organisations to quickly and easily move from legacy data warehouses... while staying supported and without disrupting critical workflows," the company said. With these launches, Snowflake is moving beyond the traditional data warehouse, positioning itself as a full-stack AI platform for enterprises, spanning ingestion, processing, and intelligent automation.
[4]
Snowflake Openflow Unlocks Full Data Interoperability, Accelerating Data Movement for AI Innovation
Snowflake (NYSE: SNOW), the AI Data Cloud company, today announced at its annual user conference, Snowflake Summit 2025, that it is revolutionizing data movement with Snowflake Openflow, a multi-modal data ingestion service that allows users to easily connect to virtually any data source and drive value from any data architecture. Snowflake Openflow underscores Snowflake's commitment to data unification and interoperability, enabling thousands of global customers to integrate their entire enterprise data ecosystem with AI models, apps, and data agents directly in Snowflake through pre-built and extensible connectors. Eliminating fragmented data stacks and the hours of manual labor data teams spend on ingestion, Snowflake Openflow makes data movement effortless, unifying various types of data and formats, so customers can rapidly deploy AI-powered innovations. Snowflake Openflow's Bring Your Own Cloud deployment model is now generally available on AWS.

"Snowflake Openflow dramatically simplifies data accessibility and AI readiness. We're seeing more customers adopt an AI-first data strategy, which is dependent on having access to all of your data in a single platform," said Chris Child, VP of Product, Data Engineering, Snowflake. "With Snowflake Openflow, we're redefining what open, extensible, and managed data integration looks like, so our customers can quickly build AI-powered apps and agents without leaving their data behind."

Snowflake Unveils Limitless Interoperability, Built for the AI Era

Today's enterprises require seamless access to clean, high-volume data to power their AI innovations, regardless of where that data lives. However, data engineers often struggle with rigid data pipelines, fragmented data stacks, and constrained resources -- making it harder to get their data and platforms ready for the AI world.
Current approaches to data integration struggle to handle the scale, responsiveness, and support needed for the multi-modal data that generative AI demands. Tapping into a $15B market, Snowflake Openflow eliminates these roadblocks by supporting customers from the moment data is connected to virtually any source, on-premises or in the cloud. It offers an open, extensible, managed, multi-modal data integration service, integrating structured and unstructured, batch and streaming data. Snowflake Openflow makes the process of getting data from where it is created to where it can be used effortless -- a key element for ensuring seamless data extraction, transformation, and load (ETL) for AI.

Enterprises like Irwin, a FactSet company, Securonix, and WorkWave can now scale their data integrations with confidence, laying the groundwork for AI innovation, without sacrificing control, simplicity, or governance. Snowflake Openflow also embraces open standards, so organizations can bring data integrations into a single, unified platform without vendor lock-in and with full support for architecture interoperability.

Powered by Apache NiFi™, an Apache Software Foundation project built to automate the flow of data between systems, Snowflake Openflow enables data engineers to build custom connectors in minutes and run them seamlessly on Snowflake's managed platform. With Snowflake Openflow, users can harness their data across the entire end-to-end data lifecycle, while adapting to evolving data standards and business demands. Hundreds of ready-to-use connectors and processors simplify and rapidly accelerate data integration from a broad range of data sources including Box, Google Ads, Microsoft Dataverse, Microsoft SharePoint, Oracle, Proofpoint, Salesforce Data Cloud, ServiceNow, Workday, Zendesk, and more, to a wide array of destinations including cloud object stores and messaging platforms, not just Snowflake.
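The "custom connectors in minutes" claim rests on a familiar pattern: a registry of connectors keyed by source name, extensible with a few lines of code. The sketch below is an illustrative Python assumption (`register`, `CONNECTORS`, and the connector functions are invented names, not Openflow's API).

```python
# Registry mapping a source name to the function that pulls from it.
CONNECTORS = {}


def register(source_name):
    """Decorator that adds a connector function to the registry."""
    def wrap(fn):
        CONNECTORS[source_name] = fn
        return fn
    return wrap


@register("sharepoint")
def sharepoint_connector(config):
    """A 'prebuilt' connector shipped with the platform (illustrative)."""
    return f"pulling documents from {config['site']}"


# Adding a custom connector is just another registration -- a few lines.
@register("custom_erp")
def custom_erp_connector(config):
    """A user-supplied connector for an in-house system (illustrative)."""
    return f"pulling orders from {config['host']}"


# Usage: the platform dispatches by source name, prebuilt or custom alike.
result = CONNECTORS["custom_erp"]({"host": "erp.internal"})
```

Because prebuilt and custom connectors share one interface, the platform's governance and scheduling layers can treat them identically, which is what makes extension cheap.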
Snowflake Extends Choice and Flexibility with Expanded Capabilities for Data Engineering

In addition to Snowflake Openflow, Snowflake continues to streamline its data engineering capabilities with additional innovations that help engineers code and automate pipelines with greater confidence, enhancing their existing skills and workflows.

"Apache NiFi" is a registered trademark or trademark of the Apache Software Foundation in the United States and/or other countries. "Apache Iceberg" is a registered trademark or trademark of the Apache Software Foundation in the United States and/or other countries.

Forward Looking Statements

This press release contains express and implied forward-looking statements, including statements regarding (i) Snowflake's business strategy, (ii) Snowflake's products, services, and technology offerings, including those that are under development or not generally available, (iii) market growth, trends, and competitive considerations, and (iv) the integration, interoperability, and availability of Snowflake's products with and on third-party platforms. These forward-looking statements are subject to a number of risks, uncertainties and assumptions, including those described under the heading "Risk Factors" and elsewhere in the Quarterly Reports on Form 10-Q and the Annual Reports on Form 10-K that Snowflake files with the Securities and Exchange Commission. In light of these risks, uncertainties, and assumptions, actual results could differ materially and adversely from those anticipated or implied in the forward-looking statements. As a result, you should not rely on any forward-looking statements as predictions of future events.

© 2025 Snowflake Inc. All rights reserved. Snowflake, the Snowflake logo, and all other Snowflake product, feature and service names mentioned herein are registered trademarks or trademarks of Snowflake Inc.
in the United States and other countries. All other brand names or logos mentioned or used herein are for identification purposes only and may be the trademarks of their respective holder(s). Snowflake may not be associated with, or be sponsored or endorsed by, any such holder(s).

About Snowflake

Snowflake is the platform for the AI era, making it easy for enterprises to innovate faster and get more value from data. More than 11,000 companies around the globe, including hundreds of the world's largest, use Snowflake's AI Data Cloud to build, use, and share data, apps and AI. With Snowflake, data and AI are transformative for everyone. Learn more at snowflake.com (NYSE: SNOW).
Snowflake launches Openflow, a new platform designed to streamline data integration and management for businesses in the age of AI, offering enhanced interoperability and simplified data pipelines.
Snowflake, the AI Data Cloud company, has unveiled Openflow, a multi-modal data ingestion service designed to transform how businesses manage and utilize their data in the age of AI. Announced at the annual Snowflake Summit 2025, Openflow aims to simplify data integration and accelerate AI innovation by providing a unified platform for diverse data types [1].
In today's data-driven landscape, businesses face significant challenges in managing and leveraging their vast data resources. Traditional Extract, Transform, and Load (ETL) processes often involve complex integrations of multiple tools, leading to inefficiencies and increased costs [3]. Openflow addresses these issues by offering a comprehensive solution that streamlines data movement and processing from various sources to any destination [2].
Source: VentureBeat
Openflow boasts several innovative features that set it apart in the data management landscape:
Unified Data Integration: The platform supports both structured and unstructured data, as well as batch and streaming modes, allowing for seamless integration of diverse data types [4].
Extensive Connectivity: With over 200 pre-built connectors and processors, Openflow enables easy integration with a wide range of data sources, including Box, Google Ads, Microsoft SharePoint, Oracle, and Salesforce Data Cloud [2].
Apache NiFi Integration: Powered by Apache NiFi, Openflow allows data engineers to build custom connectors quickly and run them on Snowflake's managed platform [4].
Enhanced Security and Governance: The platform incorporates robust security features, including role-based authorization, encryption in transit, and secrets management [2].
Source: ZDNet
Openflow is designed to accelerate AI innovation by providing a solid foundation for data management. By simplifying data accessibility and AI readiness, the platform enables businesses to rapidly deploy AI-powered applications and agents [4].
Alongside Openflow, Snowflake announced several other AI-focused offerings:
Snowflake Intelligence: A natural language interface for querying and acting on structured and unstructured data [3].
Data Science Agent: An AI-powered tool that automates core machine learning tasks [3].
AISQL: An extension of Snowflake's SQL language to include AI operations as simple function calls [3].
Source: Analytics India Magazine
The introduction of Openflow and related AI tools positions Snowflake as a major player in the evolving landscape of data management and AI integration. As businesses increasingly adopt AI-first data strategies, platforms like Openflow are set to play a crucial role in enabling seamless data accessibility and AI readiness across industries [1].