Curated by THEOUTPOST
On Mon, 26 Aug, 4:02 PM UTC
2 Sources
[1]
Internal Amazon sales guidelines spread doubt about OpenAI capabilities, while bad-mouthing Microsoft and Google
Other talking points from the documents include OpenAI's lack of access to third-party AI models and weak enterprise-level contracts. AWS salespeople should dispel the hype around AI chatbots like ChatGPT and steer the conversation toward AWS's strength in running the cloud infrastructure behind popular AI services, the guidelines added.

"For generative AI workloads, AWS will compete most often w/ Microsoft's Azure OpenAI Service, OpenAI (directly), and Google Cloud's Generative AI on Vertex AI," one of the documents stated. "Move beyond the hype with AI chatbots, and focus on the [Foundation Models] that power them and the cloud infrastructure needed to help enterprise customers safely create, integrate, deploy, and manage their own generative AI applications using their own data."

The guideline documents date from late 2023 through spring 2024. They reflect Amazon's urgency to aggressively counter the growth of AI rivals, especially OpenAI. The viral success of ChatGPT put OpenAI at the forefront of the AI pack, even though Amazon has been working on this technology for years. The effort to criticize OpenAI is also unusual for Amazon, which often says it's so customer-obsessed that it pays little attention to competitors.

This is the latest sign that Amazon knows it has work to do to catch up in the AI race. OpenAI, Microsoft, and Google have taken an early lead and could become the main platforms where developers build new AI products and tools. Though Amazon created a new AGI team last year, the company's existing AI models are considered less powerful than those made by its biggest competitors. Instead, Amazon has prioritized selling AI tools like Bedrock, which gives customers access to third-party AI models. AWS also offers cloud access to in-house AI chips that compete with Nvidia GPUs, with mixed results so far.

Amazon's spokesperson told BI that AWS is the "leader in cloud," with projected revenue of more than $100 billion this year. Much of the growth has come from its new AI services, which are on pace to generate "multi-billion dollars" in revenue this year, the spokesperson added. Since 2023, AWS has announced more than twice as many AI features as its three next-closest competitors combined, the spokesperson noted.

"It's still early days for generative AI, and with so many companies offering varied services, we work to equip our sales teammates with the information they need to help customers understand why AWS is the best, easiest, most performant place to build generative AI applications. To parse the language as anything more than that or mischaracterize our leadership position is misguided speculation," the spokesperson wrote in an email. OpenAI's spokesperson declined to comment.

The documents appear to acknowledge that Amazon is playing catch-up to OpenAI. Many AWS customers got started on AI projects with OpenAI technology, like ChatGPT and its GPT models, because of the startup's "timing in the market, ease of use, and overall model intelligence capabilities," Amazon explained in one of the guidelines. But now is a good time to go after those customers and convert them to AWS services, particularly Bedrock, a tool that has partnerships with AI model providers including Anthropic, Meta, and Cohere, the document said.
It also claimed that Anthropic's Claude model, in particular, had surpassed OpenAI's GPT models in terms of "intelligence, accuracy, speed, and cost." The customers most likely to migrate to AWS are the ones who are already "All In" on AWS for the majority of their cloud-computing needs, but "who chose to evaluate OpenAI for their first generative AI workloads," it added.

"This is an important moment for the field to take action on," one of the documents said. "Amazon, in partnership with various foundation model providers, has now created a stronger value proposition for customers that should not only inspire them to migrate their generative AI workloads onto AWS, but also, choose AWS for their next GenAI projects."

Some of those efforts are starting to pay off, according to Amazon's spokesperson. They cited four AWS customers -- HUDstats, Arcanum AI, Forcura, and Experian -- that initially used OpenAI's products but switched to AWS's AI services after facing "limitations with flexibility and scalability."

"In Q2 2024, AWS had its biggest quarter over quarter increase in revenue since Q2 2022, and much of this growth is being fueled by customer adoption of generative AI," Amazon's spokesperson said. "Ultimately, customers are choosing AWS because we continue to be the significant leader in operational excellence, security, reliability, and the overall breadth and depth of our services."

It's not just OpenAI that AWS is going after. The sales guidelines also share how AWS sales reps should respond to customer questions about Microsoft and Google.

If a customer talks about Microsoft's and Google's AI infrastructure and chips, AWS salespeople should say Amazon has more than five years of experience investing in its own silicon processors, including its AI chips, Trainium and Inferentia, the documents advised. The guidelines also highlight AWS's better cost and energy efficiency compared to competing products, and note the limited availability of Microsoft's Maia AI chip.

One of the guidelines also points out Google's limitations in the number of foundation models offered. "We're flattered they're worried about us, but fiction doesn't become fact just because it's in talking points," Google spokesperson Atle Erlingsson told BI. "Not only do we offer more than 150 first, third and open-source models via Vertex AI, our AI infrastructure offers best overall performance, best cost performance, as well as uptime and security." Microsoft's spokesperson declined to comment.

For customers who say Microsoft and OpenAI are at the "cutting edge" of generative AI, AWS wants its salespeople to "cut through the hype" and ensure customers understand how AWS has solutions "across the entire stack" of generative AI technology, from the underlying infrastructure to the AI applications used by end customers, it said.

In situations where Microsoft pitches its AI-powered analytics software, Fabric, to customers, AWS salespeople are instructed to say, "Microsoft Fabric is a new (unproven) offering." The document says Fabric doesn't offer many integration points with Azure's Generative AI services, and that AWS's own analytics services "offer superior functionality" across diverse workloads. Microsoft previously said 67% of Fortune 500 companies use Fabric.

The documents also share AWS "value propositions" that should be emphasized during sales pitches. They cover AWS's ease of use, including "enterprise-grade security and privacy," and the ability to customize AI models using the customer's own data.
The guidelines also stress AWS's price efficiency and the broad set of AI chips it offers, as well as its own AI-powered applications, like Amazon Q. Customers typically consider nine criteria before choosing an AI model and service provider, one of the documents said: customization, personalization, accuracy, security, monitoring, cost, ease of use, responsible AI, and innovation.

Despite the competitive tone of the guidelines, AWS also tells salespeople to use caution and clarity when discussing what data its rivals use for model training. OpenAI, for example, has publicly said that it may use customer data to train the consumer version of ChatGPT, but not the business data shared through its enterprise product. "The APIs and the Enterprise chatbots from Microsoft, Google, and OpenAI all declare product terms specifying that customer data is not used for model training," one of the documents said. "Be careful to not use misleading FUD (Fear, Uncertainty, Doubt) by conflating competitors' enterprise solutions with consumer services."
[2]
Amazon is telling its salespeople to trash talk Google, Microsoft, and OpenAI
Talking points created for Amazon (AMZN) Web Services' salespeople from late 2023 through spring 2024 guide them to emphasize the distinctions between its AI offerings and those of its rivals, according to internal sales guidelines reported by Business Insider. While that's not out of the ordinary, Amazon's sales strategy includes highlighting the shortcomings of each individual company and urging clients to see beyond the AI chatbot craze.

"For generative AI workloads, AWS will compete most often w/ Microsoft's (MSFT) Azure OpenAI Service, OpenAI (directly), and Google (GOOGL) Cloud's Generative AI on Vertex AI," one of the documents said. "Move beyond the hype with AI chatbots, and focus on the [Foundation Models] that power them and the cloud infrastructure needed to help enterprise customers safely create, integrate, deploy, and manage their own generative AI applications using their own data."

Salespeople are guided, for example, to remark to clients that ChatGPT maker OpenAI is just a research company -- not a cloud provider -- that lacks advanced security and customer support, Business Insider reports. When it comes to questions about Microsoft and Google, AWS salespeople are instructed to say that Amazon has more than five years of experience investing in its own silicon processors, including its AI chips, Trainium and Inferentia.

But the documents capture one unavoidable truth at the 22-year-old cloud computing company: It has fallen behind in the AI race. Amazon explained in one guideline that many AWS customers used OpenAI technology like ChatGPT to start their AI projects because of the startup's "timing in the market, ease of use, and overall model intelligence capabilities."

It's not too late for them to switch, however. "This is an important moment for the field to take action on," the guide said. "Amazon, in partnership with various foundation model providers, has now created a stronger value proposition for customers that should not only inspire them to migrate their generative AI workloads onto AWS, but also, choose AWS for their next GenAI projects."

Despite the slow start, Amazon chief Andy Jassy said in his annual letter to shareholders that he is "optimistic" that much of the generative AI transformation will be built on top of Amazon Web Services.

Amazon is reportedly working on an AI chatbot, internally called "Metis," to rival OpenAI's ChatGPT, according to Business Insider. The chatbot will be accessible through a web browser and will be powered by one of the company's internal AI models, Olympus, Business Insider reports, citing unnamed sources familiar with the matter and an internal document. Olympus is reportedly more powerful than Amazon's publicly available AI model, Titan. Meanwhile, the company's AI-powered version of its virtual assistant, Alexa, is reportedly not even close to being ready.
Amazon's cloud computing division, AWS, has reportedly instructed its sales team to highlight potential shortcomings in OpenAI's AI models. This strategy aims to position AWS as a superior choice for enterprise AI solutions.
Amazon Web Services (AWS), the cloud computing arm of e-commerce giant Amazon, has reportedly adopted a bold strategy to compete in the rapidly evolving artificial intelligence (AI) market. According to internal documents obtained by Business Insider, AWS has instructed its sales representatives to cast doubt on the capabilities of OpenAI, a leading AI research company [1].
The sales guidelines provided to AWS team members outline specific talking points designed to highlight potential shortcomings in OpenAI's AI models. These points include emphasizing the alleged inability of OpenAI's models to handle tasks involving more than 4,000 tokens, which could limit their effectiveness for certain enterprise applications [1].
In contrast to the perceived limitations of OpenAI's models, AWS is positioning its own AI solutions as more robust and suitable for enterprise needs. The company is promoting its ability to train custom AI models and handle larger datasets, which it claims gives it an edge over competitors in the AI space [2].
This aggressive sales approach comes as competition in the cloud AI market heats up. With major players like Google, Microsoft, and now Amazon vying for dominance, the stakes are high. Microsoft's partnership with OpenAI has been particularly successful, driving significant growth in its cloud business [2].
The strategy employed by AWS reflects the growing importance of AI capabilities in cloud services. As enterprises increasingly look to integrate AI into their operations, cloud providers are under pressure to demonstrate the superiority of their AI offerings. This competition could ultimately benefit customers by driving innovation and improvements in AI technologies [1].
While AWS's approach may be seen as aggressive by some, it is not uncommon in the highly competitive tech industry. However, the focus on discrediting a specific competitor rather than solely promoting one's own strengths has raised eyebrows among industry observers [2].
As the AI arms race continues, it's clear that cloud providers see AI capabilities as a key differentiator in attracting and retaining enterprise customers. The outcome of this competition could shape the future of AI adoption in businesses across various sectors, potentially influencing the pace of AI innovation and integration into everyday business operations [1][2].