Curated by THEOUTPOST
On Wed, 16 Oct, 8:04 AM UTC
3 Sources
[1]
What Databricks-AWS Partnership Means for Enterprise
Databricks will tap Amazon's Trainium chips to power services for building AI systems, a move that could cut costs for businesses. E-commerce behemoth Amazon and startup Databricks struck a five-year deal focused on using Amazon's Trainium AI chips to cut costs for businesses seeking to build their GenAI apps. Databricks will use AWS Trainium chips to power a service that helps companies customise an AI model or build their own using Mosaic AI.

Databricks acquired AI startup MosaicML last year in a $1.3 billion deal and is expanding its services to democratise AI and position its Lakehouse as the top platform for GenAI and LLMs. MosaicML, which had raised $37 million, offers technology up to 15 times cheaper than competitors and serves clients such as AI2, Replit, and Hippocratic AI. It claims that its MPT-30B LLM, a 30-billion-parameter model, is superior in quality and more cost-effective for local deployment than GPT-3. Meanwhile, Amazon says customers pay less to use its homegrown chips than the competition, such as NVIDIA's GPUs, which dominate the AI chip market.

The partnership also includes support for AWS's Graviton2-based EC2 instances, which can deliver significantly better price-performance ratios, up to 4x, when building lakehouses on AWS. This optimisation is crucial for enterprises aiming to manage costs while maximising performance in their data operations. Databricks VP Naveen Rao highlighted that the partnership will allow companies to build AI models significantly quicker and more cost-effectively, ultimately making AI faster and cheaper for businesses by passing on the savings gained from using Amazon's AI chips.

Databricks has also introduced a pay-as-you-go model for its Lakehouse Platform through AWS Marketplace, allowing customers to easily discover, launch, and build lakehouse environments directly from their AWS accounts. This model simplifies onboarding, consolidates billing under existing AWS management accounts, and enables organisations to leverage their AWS contracts for greater flexibility in resource management. The Lakehouse Platform unifies enterprise analytics and AI workloads on a single platform, eliminating data silos and promoting better collaboration across workflows. Integrated with AWS Lake Formation, it enhances data governance by allowing centralised management of data access policies, ensuring consistent security enforcement across Databricks and AWS services while supporting a wide range of functions, from data processing to machine learning.

The deal comes as Databricks, Amazon, and other enterprise technology companies such as Microsoft, Salesforce, and Snowflake, a rival of Databricks, aggressively court businesses to earn more revenue. Meanwhile, corporate technology executives say it is time to show that AI investment is generating returns. Notably, Databricks will continue using NVIDIA processors as part of a pre-existing agreement with AWS, notwithstanding the change. Along with its Inferentia series of AI chips used to develop and operate AI models, Amazon debuted the second iteration of its Trainium chips in November last year. All in all, the agreement strengthens Amazon's position in the cut-throat AI chip space as it looks to advance further. The two companies also have an existing partnership under which customers can run Databricks data services on Amazon's cloud-computing platform, Amazon Web Services.
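To make the Graviton2 angle concrete, the sketch below shows how a Databricks cluster on AWS might be launched on a Graviton-based EC2 instance type through the Databricks Clusters REST API. The workspace URL, token, runtime version, and node type are placeholder assumptions for illustration, not values from the announcement.

```python
# Minimal sketch: create a Databricks cluster on AWS backed by a Graviton2
# EC2 instance type. Workspace host, token, runtime, and node type are
# illustrative assumptions; check your workspace for what is available.
import requests

DATABRICKS_HOST = "https://<your-workspace>.cloud.databricks.com"  # placeholder
TOKEN = "<personal-access-token>"                                   # placeholder

cluster_spec = {
    "cluster_name": "graviton-lakehouse-demo",
    "spark_version": "13.3.x-scala2.12",   # assumed Databricks runtime version
    "node_type_id": "m6gd.xlarge",         # Graviton2-based EC2 instance type
    "num_workers": 2,
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/clusters/create",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json=cluster_spec,
    timeout=30,
)
resp.raise_for_status()
print("Created cluster:", resp.json().get("cluster_id"))
```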
Databricks also rents NVIDIA's GPUs through AWS and will be using more of them as part of the deal. Customers using AWS have generated over $1 billion in revenue for Databricks, and AWS is the data company's fastest-growing cloud partner, Rao added.

Early AI successes have relied on using a company's private data to customise AI; building a bespoke customer-service chatbot, for instance, can help lower staffing costs. For Amazon, that means continuing to position itself as a neutral provider of technology, offering businesses the capability to use and combine a variety of AI models from many vendors on its platform. Databricks and its rivals also make money by renting out analytics, AI, and other cloud-based software that taps AI-ready data so that companies can build their enterprise technology tools. The San Francisco-based firm said it was valued at $43 billion last September.

[Image caption: An AWS Inferentia chip (left) and an AWS Trainium chip. Photo: Amazon]

Jonny LeRoy, the CTO of Grainger, is using AI to help customers navigate the company's product offerings. The Illinois-based company is using a combination of AI models and a retrieval-augmented generation system from Databricks to build its customer-service tool, and it plans to use Amazon's chips under the hood, LeRoy said.

Although Amazon isn't widely considered a leader in AI innovation, some technology analysts and business leaders say it needs to show that it can compete against Microsoft and Google. Part of Amazon's AI reboot involves its AI chips, Trainium and Inferentia, which are designed specifically for building and using AI models. Compared with NVIDIA's more general-purpose GPUs, such custom chips can be more efficient because they were designed for just one thing.

Amazon's pitch for its custom AI chips is their lower cost. Customers can expect to pay about 40% less than they would using other hardware, said Dave Brown, vice president of AWS compute and networking services. "No customer is going to move if they're not going to save any money, and if their existing solution is working well for them," Brown said. "So it's important to deliver those cost savings." Even so, there is no official statement on how many Amazon customers use its custom chips rather than NVIDIA's GPUs.
[2]
AWS and Databricks to deliver more affordable generative AI for joint customers - SiliconANGLE
Amazon Web Services Inc. and Databricks Inc. say they have signed a five-year deal to help companies cut costs when building and running their artificial intelligence applications. As part of the deal, Databricks says it will use AWS's Trainium AI chips to power its Mosaic AI service, which helps businesses customize existing large language models or build their own from scratch. Amazon says the main benefit for customers is that they'll pay a lot less to use its chips, as opposed to Nvidia Corp.'s graphics processing units, which are the most popular hardware for AI workloads today.

Databricks' AI service stems from its $1.3 billion acquisition of a company called MosaicML Inc. last year. The Mosaic AI service includes tools for model serving, which support a variety of third-party LLMs, including a selection of models available through Amazon Bedrock. The companies explained that Databricks' customers will be able to scale model training on Mosaic AI at lower costs when using the AWS Trainium chips. In addition, they'll provide other capabilities to help their joint customers build, deploy and monitor custom AI applications, without sacrificing control over their data or intellectual property.

The deal will also see Databricks and AWS work together to make it easier for customers to run Databricks' big data services on AWS via new integrations on the AWS Marketplace, including simplified onboarding, configuration and serverless compute from AWS. Databricks said it will work with its system integration partners to create additional technical solutions and implementation resources designed to help customers identify generative AI use cases and on-premises workloads that can be optimized with Databricks on AWS.

According to Databricks, the partnership intends to make it faster and cheaper for businesses to design and build AI applications, primarily through the savings derived from using Amazon's AI chips. "Strengthening our collaboration with AWS allows us to provide customers with unmatched scale and price-performance so they can bring their own generative AI applications to market more rapidly," said Databricks co-founder and Chief Executive Ali Ghodsi.

The deal should help both companies in their efforts to compete with rivals such as Microsoft Corp., Snowflake Inc. and Salesforce Inc., which are all vying with them to court businesses' AI dollars. The most successful AI initiatives tend to be those that rely on businesses' internal data to customize AI applications. For instance, a company might create a bespoke AI chatbot that's fueled with knowledge from its internal databases. Amazon's strategy is to position itself as a neutral provider of AI technology, providing customers with access to a variety of AI models and the infrastructure to run them. AWS CEO Matt Garman said the collaboration with Databricks will enable companies to build AI applications powered by valuable insights from their own data. "We're helping customers innovate faster by focusing on what truly matters most for their business," he said.

Databricks derives its revenue from its data analytics services, AI tools and other software that can tap into AI-ready data, helping companies to build various kinds of AI applications. The two companies have an existing partnership, where customers can run Databricks' data tools and services on AWS.
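The mention of Bedrock-hosted models can be made concrete with a small sketch. The snippet below calls a third-party model through Amazon Bedrock's Converse API via boto3; the AWS region and model ID are assumptions chosen for illustration, and any Bedrock model your account has access to would work the same way.

```python
# Minimal sketch: query a third-party LLM hosted on Amazon Bedrock using
# the Converse API. Region and model ID are illustrative assumptions.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.converse(
    modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
    messages=[
        {"role": "user", "content": [{"text": "Summarize our Q3 sales notes."}]}
    ],
    inferenceConfig={"maxTokens": 256, "temperature": 0.2},
)

# The Converse API returns the assistant reply under output.message.content
print(response["output"]["message"]["content"][0]["text"])
```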
[3]
Amazon Strikes at Nvidia's Stronghold Through Databricks Deal, Cutting AI Costs by 40% - Amazon.com (NASDAQ:AMZN)
Databricks aims to reduce costs for customers, challenging Nvidia's dominance in AI chip technology. Amazon.com Inc AMZN inked a five-year deal with data and AI startup Databricks to provide businesses with cost-effective AI-building capabilities. The financial terms of the contract remain undisclosed. The partnership centers around Amazon's Trainium AI chips, which offer a less expensive alternative to Nvidia Corp's NVDA popular GPUs for companies looking to customize or build their AI models.

Databricks proposes to pass on the savings from using Amazon's chips to customers in order to break into Nvidia's moat. Databricks customers include W.W. Grainger, Inc GWW and the car-shopping site Edmunds.com. Both companies are eyeing the enterprise AI space, intensifying their rivalry with Microsoft Corp MSFT and Snowflake Inc SNOW. Databricks, valued at $43 billion as of 2023, is already deeply involved in the AI market, having acquired AI startup MosaicML for $1.3 billion.

Databricks' existing partnership with Amazon allows customers to access its data services through Amazon Web Services (AWS). As part of the agreement, Databricks will also increase its use of Nvidia GPUs rented through AWS. According to Naveen Rao of Databricks, AWS has generated more than $1 billion in revenue for Databricks and remains the company's fastest-growing cloud partner. Rao told the WSJ that the collaboration with Amazon allows Databricks to pass cost savings to customers by leveraging Amazon's AI chips. Dave Brown of Amazon Web Services told the WSJ that Amazon's Trainium chips, specifically designed for AI tasks, can help businesses cut their AI development costs by up to 40%.

Amazon stock is up over 41% in the last 12 months, while Nvidia is up 195% thanks to the AI wave. JMP Securities analyst Nicholas Jones expects Amazon AWS to outpace Microsoft Azure, citing the latter's softness in some European geographies and capacity constraints. Scotiabank's Nat Schindler and JP Morgan's Doug Anmuth flagged AWS AI's focus on providing flexible, cost-effective AI solutions, positioning it as a critical partner for companies leveraging AI.

Price Action: AMZN stock was down 1.34% at $185.52 at last check on Tuesday.
Amazon Web Services and Databricks have entered a strategic five-year partnership aimed at making generative AI more affordable and accessible for enterprises, leveraging AWS Trainium chips to challenge Nvidia's dominance in the AI chip market.
In a significant move that could reshape the enterprise AI landscape, Amazon Web Services (AWS) and Databricks have announced a five-year strategic partnership aimed at making generative AI more affordable and accessible for businesses [1][2][3]. This collaboration leverages AWS's Trainium AI chips to power Databricks' Mosaic AI service, potentially disrupting Nvidia's dominance in the AI chip market.
At the heart of this partnership is the promise of substantial cost savings for businesses looking to build and run AI applications. Dave Brown, Vice President of AWS compute and networking services, claims that customers can expect to pay about 40% less using AWS's custom AI chips compared to other hardware options [1]. This cost-effectiveness is crucial for enterprises seeking to manage expenses while maximizing performance in their AI and data operations.
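To put the claimed figure in perspective, here is a back-of-the-envelope sketch. Only the 40% discount comes from AWS's statement; the hourly rate and run length are made-up assumptions.

```python
# Hypothetical illustration of the ~40% hardware savings claim.
# Only the 40% figure is from AWS; the other numbers are assumptions.
gpu_hourly_cost = 30.00            # assumed hourly cost of GPU-based training capacity
claimed_discount = 0.40            # AWS's stated saving for its custom chips
trainium_hourly_cost = gpu_hourly_cost * (1 - claimed_discount)

training_hours = 2_000             # assumed length of a model customization run
savings = (gpu_hourly_cost - trainium_hourly_cost) * training_hours

print(f"Trainium-equivalent cost per hour: ${trainium_hourly_cost:.2f}")
print(f"Projected savings over {training_hours:,} hours: ${savings:,.2f}")
```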
Databricks will utilize AWS Trainium chips to power its Mosaic AI service, which helps companies customize existing large language models (LLMs) or build their own from scratch [2]. This service stems from Databricks' $1.3 billion acquisition of MosaicML in 2023 and includes tools for model serving that support various third-party LLMs, including those available through Amazon Bedrock [2].
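For readers unfamiliar with what "customizing an LLM" involves, the sketch below shows the general shape of the workflow, using the open-source Hugging Face stack as a stand-in. This is not the Mosaic AI API; the model name, corpus, and hyperparameters are illustrative assumptions only.

```python
# Hypothetical sketch: fine-tuning a small open model on private text,
# as a stand-in for the kind of LLM customization described above.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from datasets import Dataset

model_name = "gpt2"  # stand-in; a real deployment would use a larger base model
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(model_name)

# Tiny illustrative "private" corpus standing in for a company's internal data
corpus = Dataset.from_dict({"text": [
    "Order #123 ships from the Chicago warehouse within two business days.",
    "Returns are accepted within 30 days with the original packing slip.",
]})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=128)

tokenized = corpus.map(tokenize, batched=True, remove_columns=["text"])

args = TrainingArguments(
    output_dir="custom-llm",
    per_device_train_batch_size=2,
    num_train_epochs=1,
    logging_steps=1,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```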
The partnership also introduces new integrations on the AWS Marketplace, including:
- pay-as-you-go availability of the Databricks Lakehouse Platform directly through AWS Marketplace;
- simplified onboarding and configuration, with billing consolidated under existing AWS management accounts;
- serverless compute from AWS.
These additions aim to streamline the process for customers to discover, launch, and build lakehouse environments directly from their AWS accounts.
This strategic alliance positions AWS and Databricks to compete more effectively against Nvidia's GPUs, which currently dominate the AI chip market. Amazon's custom Trainium and Inferentia chips, designed specifically for AI tasks, offer a more efficient alternative to general-purpose GPUs [1]. However, it's worth noting that Databricks will continue to use Nvidia processors as part of its existing agreement with AWS [1].
The partnership is expected to accelerate AI adoption across various industries. Jonny LeRoy, CTO of Grainger, highlighted how they're using a combination of AI models and Databricks' retrieval-augmented generation system to build customer-service tools, with plans to incorporate Amazon's chips [1].
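Since the Grainger example hinges on retrieval-augmented generation, a minimal sketch of that pattern may help. The documents, the toy embedding function, and the placeholder model call below are all illustrative assumptions, not Grainger's or Databricks' actual implementation.

```python
# Minimal sketch of the retrieval-augmented generation (RAG) pattern:
# embed documents, retrieve the closest matches for a question, and
# pass them to an LLM as context. All pieces here are placeholders.
import numpy as np

documents = [
    "The cordless drill ships with two 20V batteries and a charger.",
    "Safety gloves are sold in packs of 12 and meet ANSI cut level A4.",
    "The shop vacuum has a 10-gallon tank and a 2.5-inch hose.",
]

def embed(text: str) -> np.ndarray:
    # Placeholder embedding: bag-of-characters hashing. A real system would
    # call an embedding model served from Databricks, Bedrock, or similar.
    vec = np.zeros(256)
    for ch in text.lower():
        vec[ord(ch) % 256] += 1.0
    return vec / (np.linalg.norm(vec) + 1e-9)

doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(question: str, k: int = 2) -> list[str]:
    scores = doc_vectors @ embed(question)   # cosine similarity on unit vectors
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

def answer(question: str) -> str:
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
    # Placeholder for the model call; a real tool would send `prompt` to an
    # LLM endpoint (custom, open source, or a Bedrock-hosted model).
    return prompt

print(answer("How many batteries come with the drill?"))
```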
While the financial terms of the deal remain undisclosed, the partnership has significant implications for both companies:
- For Databricks, AWS customers have already generated more than $1 billion in revenue, and AWS remains its fastest-growing cloud partner [1][3].
- For Amazon, the agreement strengthens its position in the fiercely competitive AI chip market, where its Trainium and Inferentia chips are pitched as lower-cost alternatives to Nvidia's GPUs [1][3].
As enterprise technology companies aggressively court businesses for AI-related revenue, this partnership positions AWS and Databricks to compete more effectively against rivals like Microsoft, Snowflake, and Salesforce [2]. The collaboration aims to provide customers with unmatched scale and price-performance, enabling faster development and deployment of generative AI applications [2].
In conclusion, this strategic alliance between AWS and Databricks marks a significant step in democratizing AI technology for enterprises, potentially reshaping the competitive landscape of the AI industry in the coming years.
References
[1] What Databricks-AWS Partnership Means for Enterprise
[2] AWS and Databricks to deliver more affordable generative AI for joint customers - SiliconANGLE
[3] Amazon Strikes at Nvidia's Stronghold Through Databricks Deal, Cutting AI Costs by 40% - Benzinga
Amazon is set to launch its next-generation AI chip, Trainium 2, aiming to reduce reliance on Nvidia and cut costs for AWS customers. The chip, developed by Amazon's Annapurna Labs, is already being tested by major players in the AI industry.
9 Sources
Amazon Web Services unveils new AI chip clusters and supercomputers, shifting focus to Trainium chips to compete with Nvidia in the AI hardware market.
11 Sources
Amazon is developing its own AI chips in a secret Texas lab, aiming to reduce reliance on Nvidia's expensive GPUs. This move could potentially save billions in cloud computing costs for Amazon Web Services (AWS).
4 Sources
Amazon Web Services (AWS) showcases significant AI developments at its annual re:Invent conference, including new Trainium chips, enhancements to SageMaker and Bedrock platforms, and AI-powered tools to compete with Microsoft in the cloud computing market.
6 Sources
Amazon is accelerating the development of its Trainium2 AI chip to compete with Nvidia in the $100 billion AI chip market, aiming to reduce reliance on external suppliers and offer cost-effective alternatives for cloud services and AI startups.
4 Sources