4 Sources
[1]
Pure Storage launches unified data management cloud and new flash arrays - SiliconANGLE
Pure Storage launches unified data management cloud and new flash arrays Pure Storage Inc. today introduced the Enterprise Data Cloud, calling it a sweeping architectural upgrade to how organizations store, manage, and use data across hybrid environments. The company says EDC enables organizations to manage block, file and object workloads in on-premises, cloud and hybrid deployments. The maker of all-flash storage subsystems said traditional, siloed approaches to storage no longer meet the needs of modern workloads, particularly in artificial intelligence environments, which demand large volumes of structured and unstructured data. "We want to help customers control their data by alleviating some of the constructs associated with the fragmented way data is managed," said Chadd Kenney, vice president of technology at Pure Storage. At the core of EDC is Pure Fusion, a storage-as-code control plane that treats all arrays as endpoints in a unified data mesh. This lets administrators manage fleets of storage devices through a single interface and deploy workloads using intelligent presets that automate deployment variables like quality-of-service, protection levels and performance requirements. Pure also unveiled major upgrades to its automation and orchestration capabilities. Its core platform now supports workflow recipes that integrate storage with computing, networking and applications to enable complex deployments such as replicating a SQL database across multiple data centers and into a public cloud service with a single service ticket. These workflows can be extended to third-party tools and are already in use by partners Rubrik Inc. for ransomware recovery tagging and CrowdStrike Holdings Inc. for historical analysis. Kenney said the automated features address some of the drudgery of deploying new arrays. 
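To make the "storage-as-code" idea above concrete, here is a minimal Python sketch of how a preset might bundle deployment variables (quality-of-service, protection, snapshots) and select a capable array from a fleet. Every class, field, and function name here is a hypothetical illustration, not Pure Fusion's actual API:

```python
from dataclasses import dataclass

@dataclass
class Preset:
    """Hypothetical workload preset bundling deployment variables."""
    name: str
    qos_iops_limit: int         # quality-of-service cap
    replication: bool           # protect with replication
    snapshot_interval_min: int  # snapshot schedule

@dataclass
class Array:
    name: str
    free_tb: float
    max_iops: int

def deploy(preset: Preset, size_tb: float, fleet: list[Array]) -> Array:
    """Pick the first array able to host the workload, then reserve capacity.

    In a real control plane the preset's replication, snapshot, and QoS
    settings would be applied to the chosen array at this point.
    """
    for array in fleet:
        if array.free_tb >= size_tb and array.max_iops >= preset.qos_iops_limit:
            array.free_tb -= size_tb  # reserve capacity on the chosen array
            return array
    raise RuntimeError("no array in the fleet can host this workload")

fleet = [Array("fa-01", free_tb=2.0, max_iops=50_000),
         Array("fa-02", free_tb=40.0, max_iops=500_000)]
oracle = Preset("oracle-prod", qos_iops_limit=200_000,
                replication=True, snapshot_interval_min=15)
target = deploy(oracle, size_tb=10.0, fleet=fleet)
print(target.name)  # fa-02: fa-01 lacks both capacity and IOPS headroom
```

The point of the sketch is the shape of the workflow Kenney describes: the administrator supplies "a few bits of information" (a preset name and a size) and placement plus configuration follow automatically.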
"If someone has an application that needs an Oracle deployment, an administrator has to look at the fleet of storage arrays, figure out which is capable of taking on the new workload, and then create the settings to make sure it's protected with replication, has snapshots and a quality-of-service policy that's tagged correctly," he said. "Workflows allow you to create all of the configurations in one set, so all you have to do is provide a few bits of information, and it auto-deploys, giving you day one compliance." The company's AI Copilot conversational assistant has been integrated with telemetry data from across the EDC. That allows it to respond instantly to fleet-level performance queries, deliver configuration scripts and support rapid policy adjustments without human intervention, the company said. "Within 10 seconds, it will process all the data for 100 different systems, give you the answer and give you the script to run to actually complete that task," Kenney said. In tandem with the EDC launch, Pure also expanded its portfolio of storage arrays. The new FlashArray//XL R5 delivers twice the input/output operations per second (IOPS) per rack unit compared to its predecessor and increases raw capacity by up to 50%. For latency-sensitive workloads like in-memory databases and vector search engines, Pure introduced the new FlashArray//ST, a memory-based system that it said can deliver more than 10 million IOPS in five rack units. The company also rolled out FlashBlade//S R2, the latest version of its FlashBlade//S that is designed to support large-scale data pipelines and AI workloads. In a notable shift, object storage support has been added to FlashArray to allow organizations to consolidate block, file and object storage into a single system. "We build our own devices that are streamlined to only do the things that we want them to do," Kenney said. "We now manage all of the facilities of interacting with flash through the software tier."
On the service front, Pure expanded its Evergreen One as-a-service offering with adaptive performance tiers and usage-based billing aligned to customer workflows. For example, in medical imaging, customers can now pay per scan rather than by capacity, Kenney said, better matching infrastructure costs with business operations. The company has also lowered the minimum commitment for entry-level customers from 1.5 petabytes to 750 terabytes and introduced flat-rate snapshot protection pricing. Pure said customers can access many of these capabilities without hardware changes. "There is nothing involved in migrating," Kenney said, referring to EDC. "You can just upgrade the system's firmware and start creating fleets. Then you can build your workflows to change the way you deploy systems and take advantage of some of the orchestration capabilities." With EDC, Pure said it's rethinking enterprise storage while positioning itself not just as a storage vendor but as a platform for autonomous data operations. "We've solved for a lot of the simplicity at a storage layer, but now we're going to move that simplicity to an even higher level," Kenney said.
[2]
Pure Storage unveils Enterprise Data Cloud to help companies shift from managing storage to managing data in an AI-driven world
Pure Storage formally announced its Enterprise Data Cloud (EDC) platform today at its Accelerate event in Las Vegas, positioning the new offering as a fundamental rethink of how enterprises should approach their increasingly complex data estates. CEO Charlie Giancarlo framed the announcement as a paradigm shift, arguing that organizations need to stop managing storage and start managing data. The timing is deliberate. As enterprises rush to consider their AI opportunities, many are discovering that their traditional storage architectures - built around isolated arrays and manual processes - aren't up to the task. The EDC represents Pure's attempt to address this gap by providing customers with a platform that can not only provide visibility into where enterprise data is stored, but provide tools that support better governance. The idea is that the EDC will allow enterprises to have better control over their data, via a policy-driven platform - giving buyers confidence over their data use, particularly with increasing AI workloads. During his keynote today, Giancarlo painted a stark picture of current enterprise data management challenges. He described traditional enterprise storage architecture as creating "data silos" where "the data itself is captive to the application stack." This contrasts sharply with how hyperscalers operate, where storage forms a unified, virtualized layer that can be assigned dynamically across applications. He explains: AI is going to change the relationship between software and data. Software was the key. But let's just think about ride sharing, right? That's been a big disruptor. We used to take taxis, or black cars when we would go from one place to another. Now everybody does ride sharing, right? Except if you go to San Francisco today, do you know that more rides now are taken in self driving taxis than by ride share apps? Well, what's a self-driving car? It's all based on data. 
It's been trained on data, and the quality of data is, that is what's made the difference. And that's the difference that AI is making - the data is becoming more critical than the software itself. Giancarlo's point is this: in an AI-driven world, data quality and accessibility become the focus. The traditional model of storage arrays dedicated to specific applications no longer makes sense when AI workloads need to access data from across the enterprise. At its core, the EDC builds on Pure's existing platform components - the Purity operating system, Pure Fusion control plane, and Pure1 intelligence layer - but integrates them into what the company describes as a "virtualized cloud of data with unified control." The platform spans on-premises, public cloud, and hybrid environments, promising what Pure calls "intelligent, autonomous data management and governance." At the core of the announcement is Pure Fusion, which Giancarlo positioned as creating "a new intelligent control plane" for storage. He says: What we've done is we've allowed all of our individual arrays to operate as a cloud of data, an enterprise cloud of data. This isn't just about clustering arrays together. Pure Fusion enables organizations to define global data management policies that automatically apply across their entire storage estate, regardless of location. Arrays become self-discoverable and can be managed from any system, since every array functions as an endpoint. The EDC launch includes several new features aimed at reducing manual operations and strengthening security for enterprises. On the security front, Pure announced partnerships with Rubrik and CrowdStrike. Rubrik becomes the first cyber recovery partner to integrate with Pure Fusion's workflow orchestration, enabling automated tagging of SafeMode snapshots when threats are detected.
The CrowdStrike partnership delivers what Pure calls "the first validated on-premises storage solution specifically optimized for Falcon LogScale deployments." The shift from manual to automated operations forms a key theme of Pure's EDC vision. Giancarlo highlighted how current enterprise storage requires array-by-array management, with administrators manually setting snapshot policies, backup policies, and resiliency policies for each system. He explains: Individual arrays are manual - they require manual provisioning, whereas a Data Cloud is auto provisioning. You go from a defined capacity and performance on a per array basis, and you go to shared capacity with auto load balancing across them. So it saves you money. You go from manual governance to software based global cloud governance of different data sets that you define. The implications of this shift extend beyond operational efficiency. Giancarlo emphasized the potential of automated governance: Think of that. You set up the governance and if you ever change your compliance standards or your governance standards, instead of having to go to every single array and make changes, you just change the presets. And it happens automatically. You go from data that is captive to the arrays, to data that can now, with the appropriate governance and authorization, be basically accessible by any application. And this is what we mean by you being able to build your enterprise data cloud. Pure's EDC announcement comes at a time when enterprises face increased pressure to modernize their data infrastructure for AI workloads. Traditional storage vendors have largely pursued acquisition strategies to build out their portfolios. Pure, by contrast, has built most of its platform organically (with Portworx being the notable exception), which the company argues provides a more consistent operational experience. 
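Giancarlo's "you just change the presets" model of governance can be sketched as arrays that hold references to shared policy objects, so a single edit is visible fleet-wide instead of requiring a per-array update. The names and policy fields below are illustrative assumptions, not Pure's implementation:

```python
# Hypothetical sketch: arrays reference a shared governance preset by name,
# so editing the preset once changes the effective policy everywhere.
presets = {"gold": {"snapshots_per_day": 24, "immutable_days": 7}}

# 100 arrays, all bound to the "gold" preset rather than holding local copies.
arrays = {f"fa-{i:02d}": "gold" for i in range(1, 101)}

def effective_policy(array_name: str) -> dict:
    """Resolve an array's policy through its preset reference at read time."""
    return presets[arrays[array_name]]

# Compliance standard changes: one edit instead of 100 per-array updates.
presets["gold"]["immutable_days"] = 30

assert all(effective_policy(a)["immutable_days"] == 30 for a in arrays)
```

The design choice being illustrated is indirection: because policy is resolved through a shared reference rather than copied onto each array, there is no per-array state to drift out of compliance.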
This architectural coherence could prove valuable as enterprises seek to simplify their increasingly complex data environments. Giancarlo argues that the company's commitment to innovation sets it apart in what has often been viewed as a commodity market. We will spend, even this year, more than 20% of our revenue on R&D, and if measured on a GAAP basis, over 25% - this differentiates us because it's Pure that has always considered data storage to be high technology and not a commodity. The competitive landscape includes established players like Dell EMC, NetApp, and HPE, all of whom are pursuing their own visions of unified data management. However, Pure's approach of building a control plane that manages arrays as a unified fleet represents a distinct architectural choice that mirrors how hyperscalers operate their infrastructure. Pure's financial results also provide some indication that customers are embracing this platform approach. The company reported that subscription services now represent more than 50% of total revenue, with Storage-as-a-Service bookings jumping 70%. These metrics suggest enterprises are moving away from traditional capital-intensive storage purchases toward more flexible consumption models. Despite the ambitious vision, Pure faces several challenges in delivering on the EDC promise. Breaking down data silos requires more than technical integration - it demands organizational change, process reengineering, and cultural transformation. Many enterprises have spent decades building application-specific storage architectures, and detangling these dependencies won't happen overnight. There's also the question of whether enterprises are ready for truly autonomous storage management. While automation can reduce errors and improve consistency, it also requires organizations to trust software-defined policies with critical data management decisions. Building that trust will take time and proven results. 
However, it's worth stating that Pure's Enterprise Data Cloud represents more than just a new product launch - it signals a fundamental shift in how Pure thinks about its role in the enterprise. By positioning itself as a data management company rather than a storage vendor, Pure is acknowledging that the traditional boundaries between storage, data management, and application infrastructure are dissolving. Giancarlo was explicit about this transformation: So when you build your Enterprise Data Cloud, you can start to stop managing your storage, and you can start managing your data on a global basis. This isn't just a semantic shift - it represents a fundamental rethinking of storage management and IT operations. It positions data management not as a support function but as a strategic capability that will determine competitive advantage. The CEO emphasized this point: It is a new era of data where data is becoming more dominant relative to everything else. Our job of managing data is going to be critical to the success of our organizations. I'll be diving deeper into the implications of Pure's announcement over the coming days, but several questions emerge. What skills will IT teams need to develop as they shift from managing storage arrays to defining data policies? And how will Pure's vision of an Enterprise Data Cloud coexist with public cloud providers' own data management services? What's clear is that Pure is betting big on enterprises recognizing the need to fundamentally rethink their approach to data management. As Giancarlo summarized: This is a major change, not just in the way you can manage storage. More importantly, it becomes a major change in the way you handle your data. Whether the Enterprise Data Cloud becomes the model for that transformation remains to be seen, but the company has laid out a compelling vision for how enterprises can move beyond the limitations of traditional storage architectures. 
The real test will come as customers begin implementing these technologies in production environments. Pure's success in helping Meta transition to an all-flash architecture for its next-generation data centers provides one proof point, but enterprise requirements often differ significantly from hyperscale deployments. As AI continues to drive demand for more sophisticated data management capabilities, Pure's timing may prove prescient (as was the case when it was pushing flash storage in a hard disk world). Organizations that can successfully transition from managing storage to managing data will be better positioned to capitalize on AI opportunities. Those that remain stuck in silos risk being left behind as competitors leverage their data assets more effectively. The Enterprise Data Cloud launch marks an important milestone in Pure's evolution from flash storage vendor to data platform company. Over the coming months, we'll see whether enterprises embrace this vision and whether Pure can deliver on its ambitious promises.
[3]
Pure Storage's latest big bet - from flash pioneer to enterprise data trust broker
Pure Storage's Enterprise Data Cloud announcement at this week's Accelerate conference in Las Vegas represents far more than a product launch to the vendor. It signals a strategic pivot that could reshape both Pure's position in the market and how enterprises think about their data infrastructure. The company that built its reputation disrupting the storage industry with all-flash arrays is now attempting something arguably more ambitious: positioning itself as the arbiter of enterprise data quality, governance, and readiness for AI. Following conversations this week with Pure executives and customers, this isn't just an evolution from hardware to software, or from products to platforms. Pure is betting that in an AI-driven world, the company that controls data validation, provenance, and governance will hold the keys to enterprise value. Given we are sitting in Las Vegas, it's fitting to describe this as a high-stakes gamble - one that, if successful, could change Pure's positioning from a storage vendor into the essential trust layer for enterprise AI deployments. During my conversations at Accelerate this week, a clear pattern emerged. The company is no longer content to be the best storage vendor in the data center. Instead, Pure wants to become the platform that tells enterprises which data they can trust, where it lives, who owns it, and whether it's ready for AI consumption. Charlie Giancarlo, Pure's CEO, was explicit about this during the event. Giancarlo explains: AI is going to change the relationship between software and data. The data is becoming more critical than the software itself. But here's where Pure's strategy gets interesting - it's not just about managing more data or managing it better. It's about becoming the validation layer that sits between raw enterprise data and AI workloads. This represents a fundamental shift in Pure's value proposition. The company that once sold performance and reliability is now selling trust and governance. 
According to Giancarlo: By doing everything in software with policies set, you know, for things like cyber, there will be data set lifecycle management where you won't have a copy of data that no one knows about anymore. What Pure is really building is a confidence barometer for enterprise data. Think about what happens when an organization wants to deploy AI today. The first question isn't about model selection or compute power - it's about data quality. Can we trust this data? Where did it come from? Who touched it last? Has it been properly secured and governed? Rob Lee, Pure's CTO, articulated this challenge. Lee describes: Most enterprises, most clients I speak with, the bottleneck is actually just figuring out where their data sits. It's very common I'll go speak with a customer CIO, and they want to deploy this great whiz bang model to all their historical data. And I'll ask them, where is all this data sitting today? And you get this look across the face like, 'Oh boy, this is spread across six, seven different systems.' This is where Pure's Enterprise Data Cloud becomes more than just unified storage management. It becomes the system of record for data quality and readiness. The platform's catalog feature, which Giancarlo hinted at with unusual enthusiasm and said is coming early next year, maintains "a record of all copies, all snapshots, all replication" of data across the enterprise. In the context of AI, this isn't just about storage efficiency - it's about establishing a chain of custody for data that will feed AI models. Pure has always been comfortable operating "under the water line" of enterprise infrastructure, as Giancarlo colorfully put it. Giancarlo says: If you get onto the cruise ship, we're in the machine room, in the bilge. Not a lot of companies want to operate there. They want to operate above the water line, if not on the upper decks. 
But by controlling data validation and governance at the infrastructure layer, Pure gains influence that extends far up the technology stack. If Pure knows which data is clean, which is properly governed, and which is ready for AI consumption, it becomes an essential partner for every AI initiative in the enterprise. And I'd argue that's exactly what Pure is betting on. Prakash Darji, General Manager of Pure's Digital Experience Business Unit, made this connection explicit when discussing the challenge of data quality in the AI era. Darji states: I think the internet died about three years ago. At 45 percent polluted data in a dataset, all conclusions you can derive are invalid. Have we hit 45 percent on the internet? I would argue yes. His point is that as enterprises rush to implement AI, the quality of their training data becomes paramount. Pure is positioning itself as the guardian of that quality within the enterprise. Darji emphasizes: The value is having clean, curated data. Where I do think we can provide value is cataloging what's clean and what's right. That's a storage problem, because you need to inventory everything you have. Traditional approaches to data governance have largely been about reporting after the fact. Pure is proposing something fundamentally different: real-time governance enforcement at the point of data creation and movement. Darji explains: Governance is actually enforcing policies. The only way to enforce policies is at the time of the activity, at the event. After the fact it's just reporting. So for me, governance is like day zero. This distinction is critical for understanding Pure's strategic play. By embedding governance into the storage layer itself, Pure can guarantee policy compliance in ways that overlay governance tools cannot. It's the difference between hoping your data is compliant and knowing it is. The implications for AI deployments are interesting. 
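Darji's distinction between after-the-fact reporting and enforcing policy "at the time of the activity" amounts to putting the check in the write path itself: a non-compliant write never lands, rather than being flagged in a later audit. A minimal sketch of the idea follows; the policy shape and function names are assumptions for illustration, not Pure's implementation:

```python
class PolicyViolation(Exception):
    """Raised when a write would violate the governance policy."""

# Hypothetical governance policy enforced at write time, not audited later.
POLICY = {"allowed_regions": {"eu-west", "eu-central"}, "require_encryption": True}

datastore: dict[str, bytes] = {}

def write(key: str, data: bytes, *, region: str, encrypted: bool) -> None:
    """Reject the write at the event itself if it violates policy."""
    if region not in POLICY["allowed_regions"]:
        raise PolicyViolation(f"region {region} not permitted")
    if POLICY["require_encryption"] and not encrypted:
        raise PolicyViolation("unencrypted writes are not permitted")
    datastore[key] = data  # only compliant data ever lands

write("scan-001", b"...", region="eu-west", encrypted=True)  # accepted
try:
    write("scan-002", b"...", region="us-east", encrypted=True)
except PolicyViolation:
    pass  # blocked before the data is stored, not reported afterwards
print(sorted(datastore))  # only the compliant write is present
```

This is the difference Darji is pointing at: an overlay governance tool can only report that "scan-002" exists in the wrong region; enforcement in the storage layer means it never exists there at all.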
When a model produces unexpected results, Lee notes: You don't go fix the model. You go fix the data that went into it. But how do you know what went into it? How do you know what to go touch? Pure's answer is to maintain complete visibility and control from the moment data is created. Interestingly, Pure's strategy puts it on course to compete with multiple categories of vendors. Traditional storage competitors like Dell EMC and NetApp are obvious rivals, but Pure is now also bumping up against data governance platforms, observability vendors, and even elements of the data analytics stack. Giancarlo acknowledges: Boundaries are changing. There'll be different competition at different levels than there was in the past. He specifically called out how AI is disrupting the traditional ETL chain: You don't really need an ETL chain, because AI can work on the raw data. This creates both opportunity and risk for Pure. By positioning itself as the authoritative source for data readiness, Pure could become indispensable to AI initiatives. But it also means competing with established players in adjacent markets who won't cede this territory easily. However, rather than trying to move up the stack into applications or analytics, Pure is doubling down on its infrastructure roots while expanding its definition of what infrastructure means. According to Darji: The one thing that we can do that is probably a bit unusual is we can marry the placement of data, the governance of the data, with full knowledge of the telemetry of the machines in which it's operating. For enterprise technology buyers, Pure's vision raises questions about how they architect their data infrastructure. If Pure succeeds in its ambitions, the traditional boundaries between storage, data management, and governance begin to dissolve. Consider the implications for a typical enterprise AI initiative today. 
Organizations often struggle to identify which data sets are appropriate for training, ensure they're properly governed, track their lineage, and maintain security throughout the process. Pure is proposing to handle all of this at the infrastructure layer, essentially becoming the trust broker for enterprise AI. This could simplify many aspects of AI deployment. Instead of implementing separate tools for storage, governance, lineage tracking, and security, enterprises could rely on Pure's platform to handle these concerns holistically. As Giancarlo puts it: You change the policy, and the changes take place. You just change the policy. But it also creates new dependencies. Organizations that adopt Pure's vision would be betting heavily on a single vendor to manage not just their storage, but their data trust infrastructure. This is a level of strategic commitment that goes well beyond traditional storage purchasing decisions. Pure's ambitions are impressive, but can the company deliver on this expansive vision? There are reasons for both optimism and caution. On the positive side, Pure has a track record of successful market disruption. The company bet early on flash storage when skeptics said it would never be economically viable. Darji reminds me: We built a company on that conviction. Pure also stands apart from competitors in its commitment to R&D, with Giancarlo noting the company spends "more than 20 percent of our revenue on R&D." The company's architectural decisions also position it well for this transition. Unlike competitors who assembled their portfolios through acquisition, Pure built most of its platform organically. Giancarlo explains: We've stayed purposeful. We have developed everything on Purity, and Purity at its core wasn't developed as a block system. At the core of Purity is something called a key value store, which is a very modern way of having very scalable metadata. 
This architectural coherence could prove crucial as Pure attempts to deliver unified data management across diverse storage types and deployment models. Perhaps the biggest obstacle to Pure's vision isn't technical but organizational - both internally and at its customers. Giancarlo acknowledges this directly: The bigger challenge is probably changing not just the sales force and the way they sell, but the extended sales force, meaning the channel and the way they sell, but also the customer's mindset. Enterprises have spent decades building application-specific storage architectures. Asking them to think about storage as a unified data platform requires fundamental changes in how they organize IT, allocate budgets, and think about data ownership. Darji offers an apt analogy: There are people that will choose Android because they want to plug it in and mount the USB thing and see it on Windows. And they want to deal with that. For the people who want the iOS experience, they're like, 'I'd rather have something good that I don't have to worry about.' That's what we're building. The question is whether enterprise IT organizations are ready for the "iOS experience" of data management. Pure is betting they are, particularly as AI initiatives expose the limitations of traditional approaches. At its core, Pure's strategy is about becoming the trust layer for enterprise data. In a world where AI models are only as good as the data they're trained on, the company that can guarantee data quality, governance, and lineage potentially holds tremendous power. This is a marked departure from Pure's historical positioning around performance and efficiency (although it will still push those as competitive advantages). The company is no longer just asking customers to trust its storage arrays to be fast and reliable. 
It's asking them to trust Pure to determine which data is suitable for AI, to enforce governance policies automatically, and to maintain the chain of custody for their most valuable digital assets. It's a big bet that reflects both the opportunities and anxieties of the AI era. As enterprises grapple with how to safely and effectively deploy AI, they need new approaches to data management. Pure is proposing nothing less than a fundamental rethink of how enterprises think about their data infrastructure. Whether Pure can execute on this vision remains to be seen. But if successful, Pure could shift from being just a storage vendor into something far more strategic: the arbiter of enterprise data trust in the age of AI. Pure's evolution from flash storage pioneer to aspiring data trust broker represents a big strategic pivot. The company is essentially betting that the infrastructure layer - traditionally seen as plumbing - can become the control point for enterprise AI initiatives. Whilst many would argue that value in the enterprise always moves up the stack, Pure is arguing that the value sits right at the infrastructure level. However, Pure must convince enterprises to fundamentally rethink their data architectures while simultaneously delivering on complex technical capabilities around governance, lineage, and automation. The company's track record suggests it has the technical chops, but changing enterprise mindsets may prove the greater challenge. The executives I spoke with were candid about the fact that they are leapfrogging ahead of where their customers are at this moment in time - but having spoken to a few buyers at the event, all of them are exploring Enterprise Data Cloud and its capabilities. They are certainly intrigued by the possibility. And the real test will come as early adopters begin implementing these capabilities at scale. 
If Pure can demonstrate tangible benefits in AI deployment speed, data quality, and governance efficiency, it could indeed transform from a storage vendor into the data trust layer for enterprise AI.
[4]
Pure Storage Introduces the Enterprise Data Cloud
Pure Storage® introduced the Enterprise Data Cloud (EDC), a bold new standard in data and storage management simplicity that enables organizations to focus on business outcomes, not infrastructure. Fueled by AI, data volumes are rising and business demands are evolving faster than ever. Traditional storage models create fragmentation, silos, and uncontrolled data sprawl. Organizations must adapt by shifting their mindset from managing storage to understanding how, where, and why their data is used. To eliminate these issues, automation spans the full stack of the platform with policy-driven orchestration and self-service capabilities. Built-in compliance and improved cyber resilience embedded across the platform further minimize risk through security and governance policies. These new capabilities completely redefine intelligent storage management. The Pure Storage platform now delivers orchestrated workflows that can be deployed across the entire IT environment. Built on the thousands of existing connectors to third-party applications including Cisco, Microsoft, VMware, ServiceNow and Slack, presets and application "recipes" can be easily deployed across storage, compute, network, database and application configurations. Customers will be able to run pre-set recipes, build custom ones specific to their environment, or utilize partner recipes for application-to-infrastructure automation. New: World-Class Anomaly Threat Detection with Rubrik Security Cloud: Rubrik is the first cyber recovery partner to integrate with Pure Fusion and its new workflow orchestration, streamlining cyber recovery across data environments. When Rubrik Security Cloud detects a threat, Pure Fusion automates the tagging of indelible SafeMode snapshots with Rubrik's ransomware scanning, pinpointing clean data for fast restore. For surgical or granular recovery needs, Rubrik backups provide a secondary path. 
Managed through Pure1 Workflow Automation, this integration reduces manual effort, improves compliance, and delivers near-zero RTO, so organizations can recover quickly and confidently with minimal disruption. CrowdStrike and Pure Storage have partnered to deliver the first validated on-premises storage solution specifically optimized for Falcon LogScale deployments. Combining Pure Storage's resilient, secure, high-performance storage infrastructure with Falcon LogScale's powerful log analytics, instant search, and security capabilities, organizations gain unmatched scalability and accelerated threat detection, hunting, investigation and response - while maintaining the control of on-premises, self-hosted environments. Now offering recovery for VMware to VMware, in addition to recovery to AWS, on-premises to cloud, and self-service disaster recovery assessments, Pure Protect is designed for today's hybrid environments, streamlining recovery workflows with on-demand recovery and flexible failover options so customers can cost-effectively maintain business continuity. The AI Copilot is now generally available: an always-on assistant that delivers personalized, fleet-aware insights, with agents available for topics including security information, performance issues, digital commerce, sustainable operations, and support center.
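The Rubrik integration described above - a detected threat triggers tagging of immutable snapshots so clean restore points are easy to find - can be sketched as a small event handler. The event shape, tag names, and function below are invented for illustration and are not the actual Rubrik or Pure Fusion API:

```python
from dataclasses import dataclass, field

@dataclass
class Snapshot:
    """Hypothetical immutable snapshot record."""
    snap_id: str
    taken_at: int                       # epoch seconds
    tags: set[str] = field(default_factory=set)

def on_threat_detected(detected_at: int, snapshots: list[Snapshot]) -> list[str]:
    """Tag snapshots taken before the threat as candidate-clean restore points.

    In the real integration, candidates would additionally be verified by
    Rubrik's ransomware scanning before being used for restore.
    """
    clean = []
    for snap in snapshots:
        if snap.taken_at < detected_at:
            snap.tags.add("candidate-clean")
            clean.append(snap.snap_id)
        else:
            snap.tags.add("suspect")  # taken after the detected event
    return clean

snaps = [Snapshot("s1", 100), Snapshot("s2", 200), Snapshot("s3", 300)]
print(on_threat_detected(250, snaps))  # ['s1', 's2'] (pre-infection snapshots)
```

The value of automating this step is the one the press release names: when a threat fires, operators are handed a pre-tagged set of restore candidates instead of manually hunting through snapshot history.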
Pure Storage introduces the Enterprise Data Cloud (EDC), a comprehensive platform designed to revolutionize data management for AI workloads and modern enterprise needs.
Pure Storage, a leader in all-flash storage solutions, has introduced its Enterprise Data Cloud (EDC) platform, marking a significant shift in how organizations manage and utilize data across hybrid environments [1][2]. This launch comes at a crucial time when enterprises are rapidly exploring AI opportunities and finding their traditional storage architectures inadequate for modern workloads.
The EDC represents Pure Storage's response to the evolving needs of data-intensive organizations, especially those leveraging AI technologies. Charlie Giancarlo, CEO of Pure Storage, emphasized the paradigm shift:
"AI is going to change the relationship between software and data. The data is becoming more critical than the software itself." [2]
This perspective underscores the growing importance of data quality and accessibility in an AI-driven world, where traditional siloed approaches to storage no longer suffice.
At the core of the EDC is Pure Fusion, a storage-as-code control plane that unifies data management across on-premises, cloud, and hybrid deployments [1]. The platform introduces several innovative features:
Unified Data Mesh: Treats all arrays as endpoints in a unified data mesh, allowing administrators to manage fleets of storage devices through a single interface [1].
Intelligent Automation: Supports workflow recipes that integrate storage with computing, networking, and applications, enabling complex deployments with minimal manual intervention [1].
AI Copilot: An always-on assistant that delivers personalized, fleet-aware insights on various topics including security, performance, and sustainable operations [4].
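To make the "storage-as-code" idea concrete, here is a minimal, purely illustrative sketch of what a workload preset and automated placement could look like. All names and structures below are hypothetical assumptions for illustration; they are not Pure Fusion's actual API or data model.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class WorkloadPreset:
    """Hypothetical preset bundling settings an admin would otherwise configure by hand."""
    name: str
    qos_iops_limit: int          # quality-of-service cap
    replication_targets: int     # number of remote copies to maintain
    snapshot_interval_min: int   # snapshot cadence in minutes

@dataclass
class Array:
    """Hypothetical stand-in for one storage array in the fleet."""
    name: str
    free_tib: float
    workloads: list = field(default_factory=list)

def deploy(preset: WorkloadPreset, fleet: list, required_tib: float) -> str:
    """Pick the array with the most free capacity that fits, then apply the preset."""
    for array in sorted(fleet, key=lambda a: a.free_tib, reverse=True):
        if array.free_tib >= required_tib:
            array.workloads.append(preset.name)
            array.free_tib -= required_tib
            return array.name
    raise RuntimeError("no array in the fleet can take the workload")

# An Oracle-style deployment, as in Kenney's example: the admin supplies a few
# bits of information and placement plus policy application happen automatically.
oracle = WorkloadPreset("oracle-prod", qos_iops_limit=200_000,
                        replication_targets=2, snapshot_interval_min=15)
fleet = [Array("fa-xl-01", free_tib=40.0), Array("fa-xl-02", free_tib=120.0)]
print(deploy(oracle, fleet, required_tib=60.0))  # prints "fa-xl-02"
```

The point of the sketch is the shape of the abstraction: one declarative preset replaces the per-array manual steps (capacity check, QoS, replication, snapshots) that the article describes.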
Pure Storage's strategic pivot positions the company not just as a storage vendor but as a platform for autonomous data operations. Rob Lee, Pure's CTO, highlighted the challenges enterprises face:
"Most enterprises, most clients I speak with, the bottleneck is actually just figuring out where their data sits." [3]
The EDC aims to address this by providing a system of record for data quality and readiness, maintaining a comprehensive catalog of all data copies, snapshots, and replications across the enterprise [3].
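A catalog of this kind can be pictured as a simple index from dataset to every physical copy. The sketch below is entirely hypothetical (its class and field names are illustrative assumptions, not EDC's actual data model); it only shows the kind of "where does this data sit?" lookup Lee describes.

```python
from collections import defaultdict

class DataCatalog:
    """Hypothetical system of record: one entry per physical copy of a dataset."""
    def __init__(self):
        # dataset name -> list of (array, kind) pairs
        self._copies = defaultdict(list)

    def register(self, dataset: str, array: str, kind: str) -> None:
        """Record a copy, snapshot, or replica living on a given array."""
        self._copies[dataset].append((array, kind))

    def locate(self, dataset: str) -> list:
        """Answer 'where does this data sit?' across the whole fleet."""
        return self._copies.get(dataset, [])

catalog = DataCatalog()
catalog.register("customer-db", "fa-xl-01", "primary")
catalog.register("customer-db", "fa-xl-02", "replica")
catalog.register("customer-db", "fb-s-01", "snapshot")
print(catalog.locate("customer-db"))
```

Running the example lists all three copies of `customer-db`, which is exactly the visibility problem the quote identifies: without such an index, each of those copies has to be hunted down array by array.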
Alongside the EDC launch, Pure Storage introduced new hardware solutions:
FlashArray//XL R5: Delivers twice the I/O operations per second (IOPS) per rack unit compared to its predecessor, with up to 50% more raw capacity [1].
FlashArray//ST: A memory-based system designed for latency-sensitive workloads such as in-memory databases and vector search engines, delivering more than 10 million IOPS in five rack units [1].
FlashBlade//S R2: The latest version supporting large-scale data pipelines and AI workloads [1].
Pure Storage's EDC represents more than just a product launch; it signals a strategic shift in the company's positioning. By focusing on data validation, provenance, and governance, Pure Storage is aiming to become the essential trust layer for enterprise AI deployments [3].
Prakash Darji, General Manager of Pure's Digital Experience Business Unit, emphasized the importance of data quality:
"At 45 percent polluted data in a dataset, all conclusions you can derive are invalid. Have we hit 45 percent on the internet? I would argue yes." [3]
This underscores the critical role that platforms like EDC could play in ensuring the reliability and effectiveness of AI initiatives in the enterprise.
As organizations continue to grapple with the challenges of managing and leveraging data in an AI-driven landscape, Pure Storage's Enterprise Data Cloud offers a compelling vision for unified, intelligent, and secure data management across hybrid environments.