Curated by THEOUTPOST
On Wed, 31 Jul, 12:04 AM UTC
7 Sources
[1]
Apple snubs Nvidia for AI training kit, chooses Google
Apple has detailed in a research paper how it trained its latest generative AI models using Google's neural-network accelerators rather than, say, more fashionable Nvidia hardware. The paper [PDF], titled "Apple Intelligence Foundation Language Models," provides a deep-ish dive into the inner workings of the iPhone titan's take on LLMs, from training to inference.

These language models are the neural networks that turn queries and prompts into text and images, and power the so-called Apple Intelligence features being baked into Cupertino's operating systems. They handle tasks like text summarization and suggesting wording for messages.

While most AI orgs clamor for Nvidia GPUs - especially the H100, at least until Blackwell comes along - and may be eyeing up offerings from AMD, Intel, and others, Apple decided to go with Google's Tensor Processing Unit (TPU) silicon for training its machine learning systems. It's not entirely surprising: the Mac titan and Nvidia have been on bad terms for some years now for various reasons, and it seems Cook & Co have little interest in patching things up for the sake of training Apple Foundation Models (AFMs).

What is surprising is that Apple didn't turn to Radeon GPUs from AMD, which has previously supplied chips for Mac devices. Instead, Apple chose Google and its TPU v4 and TPU v5p processors for training AFMs. Yes, this is the same Google that Apple criticized just last week over user privacy with respect to serving online ads. But on the hardware side of things, everything appears to be cozy.

Apple's server-side AI model, AFM-server, was trained on 8,192 TPU v4 chips, while AFM-on-device used 2,048 newer TPU v5p processors. For reference, Nvidia claims training a GPT-4-class AI model requires around 8,000 H100 GPUs, so it would seem in Apple's experience the TPU v4 is about equivalent, at least in terms of accelerator count. For Cupertino, it might not just be about avoiding Nvidia GPUs.
Since 2021, Google's TPUs have seen explosive growth, to the point where only Nvidia and Intel have greater market share, according to a study in May. Apple claims its models beat some of those from Meta, OpenAI, Anthropic, and even Google itself.

The research paper doesn't go into much detail about AFM-server's specifications, though it does make lots of hay about how AFM-on-device has just under three billion parameters and has been optimized to a quantization of less than four bits per weight on average for the sake of efficiency.

Although AI models can be evaluated with standardized benchmarks, Apple says it "find[s] human evaluation to align better with user experience and provide a better evaluation signal than some academic benchmarks." To that end, the iMaker presented real people with two responses to the same prompt from different models, and asked them to choose which one was better. However, the prompts and responses are not provided, so you'll just have to take Apple's word for it.

While Apple claimed its AFMs are "often preferred over competitor models" by humans grading their outputs, its models actually only seemed to score in second or third place overall. AFM-on-device won more often than it lost against Gemma 7B, Phi 3 Mini, and Mistral 7B, but couldn't get the win against LLaMa 3 8B. The paper did not include numbers for GPT-4o Mini. Meanwhile, AFM-server wasn't really a match for GPT-4 and LLaMa 3 70B. We can probably guess it doesn't fare too well against GPT-4o and LLaMa 3.1 405B either.

Apple sort-of justifies itself by demonstrating that AFM-on-device outperformed all small models on the summarization tool in Apple Intelligence, despite being the smallest model tested. Though that's just one feature, and it's curious why Apple didn't show similar data for other tools. Cupertino also claims a big victory in generating safe content.
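Apple's sub-four-bit figure is an average across the whole model, the kind of number mixed-precision quantization schemes produce: sensitive layers keep more bits while others are compressed harder. Here is a toy sketch of that arithmetic; the layer names, sizes, and bit-widths are invented for illustration and are not Apple's actual scheme.

```python
# Hypothetical per-layer breakdown -- names, sizes, and bit-widths
# below are invented for illustration, not Apple's actual scheme.
layers = {
    # name: (parameter_count, bits_per_weight)
    "embedding":   (250_000_000, 8),    # sensitive weights kept at 8-bit
    "attention":   (1_200_000_000, 4),
    "feedforward": (1_300_000_000, 2),  # compressed hardest
}

def average_bits(layers):
    """Parameter-weighted average bits per weight."""
    total_params = sum(count for count, _ in layers.values())
    total_bits = sum(count * bits for count, bits in layers.values())
    return total_bits / total_params

print(round(average_bits(layers), 2))  # -> 3.42, i.e. under 4 bits on average
```

The point is simply that "less than four bits on average" does not mean every weight is stored in under four bits; a parameter-weighted mix of precisions can land below that threshold.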
While AFM-on-device and -server output harmful responses 7.5 percent and 6.3 percent of the time respectively, all other models did so at least ten percent of the time, apparently. Mistral 7B and Mixtral 8x22B were apparently the biggest offenders at 51.3 percent and 47.5 percent respectively, Apple claimed. ®
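Both the side-by-side preference results and the harmful-response percentages reduce to tallying graders' judgments across prompts. A minimal sketch of how such pairwise judgments turn into the rates reported above; the sample labels are invented:

```python
from collections import Counter

# One entry per prompt: which response the human grader preferred.
# "ours" = model under test, "theirs" = competitor. Labels invented.
judgements = ["ours", "ours", "theirs", "tie", "ours",
              "theirs", "ours", "tie", "ours", "theirs"]

def preference_rates(judgements):
    """Fraction of prompts falling into each preference bucket."""
    counts = Counter(judgements)
    n = len(judgements)
    return {k: counts[k] / n for k in ("ours", "theirs", "tie")}

print(preference_rates(judgements))
# -> {'ours': 0.5, 'theirs': 0.3, 'tie': 0.2}
```

A model "wins more often than it loses" when its preference rate exceeds the competitor's, which is exactly the framing Apple uses; without the underlying prompts and responses, though, the rates cannot be independently checked.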
[2]
Apple appears to have just shunned Nvidia again | Business Insider India
It looks like Apple has confirmed it's using Google's chips for Apple Intelligence -- not Nvidia's. There was no mention of Nvidia in an Apple research paper published on Monday that discussed the foundation language models developed to power Apple Intelligence features. In fact, the tech giant wrote that it used TPUv4 and TPUv5p chips, Google's tensor processing units, to train its artificial intelligence tools.

Nvidia's AI chips, graphics processing units, are in high demand in the tech industry. Companies like Meta spend big to amass large volumes of them to train their models. According to a Mizuho Securities estimate, Nvidia controls more than 70% of the AI chip market, and its chips can cost tens of thousands of dollars apiece. Google uses its own TPU chips instead and rents them out through its cloud service to clients like Apple.

While Apple didn't explicitly say that no Nvidia chips were used in the hardware and software infrastructure of its AI features, it has been somewhat public about its work with Google to train its upcoming Apple Intelligence. In Monday's research paper, Apple wrote that it used 2,048 TPUv5p chips (made by Google) to build the AI model that will reportedly operate on iPhones and other Apple hardware.

Apple, Microsoft, Nvidia and Alphabet are among the top companies in the world by market cap. Nvidia's success in the AI chip market has helped drive it to a more than $2 trillion valuation in the last few years.
[3]
The surprising thing we just learned about Apple Intelligence | Digital Trends
A new research paper from Apple reveals that the company relied on Google's Tensor Processing Units (TPUs), rather than Nvidia's more widely deployed GPUs, to train two crucial systems within its upcoming Apple Intelligence service. The paper notes that Apple used 2,048 Google TPUv5p chips to train its on-device AI model and 8,192 TPUv4 processors for its server AI model.

Nvidia's chips are highly sought after for good reason, having earned their reputation for performance and compute efficiency. Its products and systems are typically sold as standalone offerings, enabling customers to construct and operate them as they see fit.

Google's TPUs, on the other hand, are only available as part of the company's larger cloud services package. That is, you don't own Google TPUs so much as lease access to them: customers are required to develop their AI models within the Google Cloud ecosystem.

This cloud requirement worked in Apple's favor, per the research team. They noted that the ability to cluster Google's TPUs let them harness the processing power needed to train Apple's AI models, and to do so more efficiently than with a standalone system.

Apple's decision to use Google's products is unexpected, and not just because of the two companies' longstanding rivalry. Nvidia boasts the dominant market share in AI chips, its accelerators constituting between 70% and 95% of all sales. However, Apple's decision could be seen as a sign that tech companies are looking to move away from Nvidia's pricey high-end chips. Amazon, for example, recently revealed that it is developing a new line of AI chips that are purportedly 50% more powerful than Nvidia's offerings while using 50% less power.
Microsoft in May announced that it would offer its cloud customers services built atop AMD's AI chips, rather than Nvidia's, while Google made similar moves in April.
[4]
Apple rejected Nvidia in favor of this company to train its AI - Softonic
Nvidia has exploded thanks to AI, becoming one of the most valued companies in the world, but it won't be with Apple's help. According to a research paper, Apple passed over Nvidia when selecting the hardware to train Tim Cook's AI. And that means a lot. It seems that Apple no longer wanted to support Nvidia in the market by handing it the task of training its AI, so it chose Nvidia's long-time competitor and partner: Sundar Pichai's Google.

In its paper, Apple shares that its Apple Foundation Model (AFM), with 2.73 billion parameters in its on-device version, relies on cloud clusters of tensor processing units (TPU) v4 and v5p provided by Google, a subsidiary of parent company Alphabet Inc.

The Apple research paper, published today, covers its training infrastructure and other details for the AI models that will power the features announced at WWDC earlier this year. Apple announced both on-device AI processing and cloud processing, and at the heart of these AI features lies the Apple Foundation Model, nicknamed AFM.

For AFM-server, the model that will power the AI functions in the cloud service called Private Cloud Compute, Apple shared that it trains a 6.3 trillion token AI model "from scratch" on "8,192 TPUv4 chips". Google's TPUv4 chips are available in pods of 4,096 chips each.

Apple added that AFM models (both on-device and in the cloud) are trained on TPUv4 chips and cloud TPU v5p clusters. The v5p is part of Google's cloud AI "supercomputer" and was announced last December. Each v5p pod is made up of 8,960 chips and, according to Google, offers twice the floating-point operations per second (FLOPS) and triple the memory of TPU v4, training models almost three times faster.

For the on-device AI model, used for functions like writing and image selection, Apple starts from a 6.4 billion parameter model "trained from scratch with the same recipe as AFM-server."
Apple also chose to rely on the older TPU v4 chips for the AFM-server model. As mentioned earlier, it used 8,192 TPU v4 chips there, but for the on-device AFM model the company chose the newer silicon: according to Apple, that model was trained with 2,048 TPU v5p chips.
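Google's published ratios make the generational gap easy to quantify. A back-of-envelope sketch using the pod sizes and FLOPS ratio reported above; treating raw pod compute as simply chips times per-chip throughput is a simplifying assumption that ignores interconnect and memory effects:

```python
# Figures as reported: v4 pods of 4,096 chips; v5p pods of 8,960 chips
# with roughly 2x the per-chip FLOPS. Treating raw pod compute as
# chips * per-chip throughput is a simplifying assumption.
pod_v4_chips, pod_v5p_chips = 4_096, 8_960
flops_ratio = 2.0  # v5p vs v4 per-chip throughput, per Google

pod_size_ratio = pod_v5p_chips / pod_v4_chips        # ~2.19x more chips
raw_compute_ratio = pod_size_ratio * flops_ratio     # ~4.38x raw compute

print(f"A v5p pod has {pod_size_ratio:.2f}x the chips of a v4 pod")
print(f"and roughly {raw_compute_ratio:.2f}x the raw compute")
```

Note that Google's "almost three times faster" claim is a per-chip, end-to-end training figure; the tripled memory presumably helps real workloads beat the raw 2x FLOPS number.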
[5]
Apple appears to have just shunned Nvidia again
[6]
Apple Has Relied On Google's Custom Chips For AI Training: What We Know - News18
Apple Intelligence is going to be available on select iPhone 15 Pro models this year, but the company has taken help from an unexpected rival. Apple has disclosed that the AI models that power Apple Intelligence were trained on Google's custom chips. In a research paper released this week, the hardware manufacturer revealed that two important components of its AI system were pre-trained on Google-designed processors. Considering how reliant big tech companies have been on AI chipmaker Nvidia, the decision is unusual.

Google's in-house tensor processing units (TPUs), which are arranged in big clusters of processors, were used in two distinct iterations, according to the company. Apple used 2,048 TPUv5p chips to train its AI model for iPhones and other devices, and 8,192 TPUv4 processors to train its server AI model. Nvidia does not design TPUs; the company concentrates its efforts on GPUs, or graphics processing units, which are extensively utilised in artificial intelligence work. In addition, the iPhone maker offered a peek at its long-awaited Apple Intelligence system.

Currently, Nvidia controls over 80 percent of the AI chip industry thanks to its pricey GPUs, which have been in great demand since the AI boom. In light of the high expense of building the AI infrastructure needed to train large models, several big companies, including Google, Meta, Microsoft and Amazon, have recently turned to creating their own AI chips. Meanwhile, Google's TPUs may be accessed via the Google Cloud Platform, which means that users who wish to use the chips must also build their software on Google Cloud.

Beyond the models described, the research paper stated that Google's chips would make it possible to develop much larger and more complicated models. This is the second technical document on Apple's artificial intelligence technology, following a more generic version published in June.
During its WWDC conference in June 2024, Apple revealed a plethora of new AI tools and capabilities, one of which was the integration of OpenAI's ChatGPT into its software. A few Apple Intelligence capabilities will be made available to beta testers this week. Apple is also slated to release quarterly earnings later this week, which should tell us how Tim Cook and Co. plan to go about AI and other products in the near future.
[7]
Apple used Google Tensor hardware to train its Gemini competitor
Summary: Apple's AI foundation models were trained on Google's Cloud TPUs, potentially hinting at Apple's preference for superior hardware. Apple's late entry into AI saw it renting or acquiring Google's advanced hardware, a notable strategic move in the tech sector. Apple's use of Google's TPUs would be inconsequential for end users, but it indicates a willingness to set aside rivalries to further AI development, with Apple Intelligence already released on select devices.

In the race to create the best AI services and chatbots, most prominent names in tech scope out the most powerful and efficient cloud compute hardware to process LLMs. This equipment can often be expensive to procure, and Nvidia is a leading provider, but in an interesting choice, Apple seems to have used Google's Tensor Processing Units (TPUs) to train early foundation models for its AI, called Apple Intelligence. One could interpret this as a tacit nod to superior Google hardware, but there's more to it than meets the eye.

Not to be conflated with the Tensor G-series chips that power Google's Pixel range of phones, TPUs are purpose-built hardware (application-specific integrated circuits) used in Google data centers since 2015, and available publicly since 2017, rivaling the Nvidia GPUs preferred for LLM processing. CNBC reports Apple released a 47-page technical research paper recently, discussing the Apple Intelligence foundation language models. While companies like Microsoft, OpenAI, and Anthropic don't shy away from discussing their reliance on Nvidia GPUs, Apple's document reveals the company departed from the norm, preferring "v4 and v5p Cloud TPU clusters." Specifically, the research paper seems to say Apple rented Cloud TPU clusters to do the heavy lifting of training Apple Foundation Models for on-device and server deployment.
The on-device model was reportedly trained on Google's latest v5p TPU chips, launched in December.

What does this mean for Google? Apple choosing to rent arguably the most advanced AI processing hardware available at the time isn't a bad thing, even if said chips are supplied by Google. That's because the Cupertino-based company has been fashionably late to announce any interest in AI tech, while most other companies have been rather vocal about their heavy investments in it since OpenAI popularized ChatGPT in 2022. AppleInsider speculates the iPhone maker might have bought Tensor hardware to use locally at its own data centers instead of renting Google-hosted hardware. After all, the AI's handling of human queries will matter more in the long term than the underlying compute hardware.

Apple and Google didn't respond to CNBC's request for comment on the matter, but most related speculation suggests Apple's AI efforts will come to light soon as investments continue increasing. For now, it has released Apple Intelligence for a handful of devices on July 29 in the iOS 18.1 and macOS Sequoia 15.1 developer beta builds.
Apple has reportedly opted for Google's Tensor Processing Units (TPUs) instead of Nvidia's GPUs for its AI training needs, a decision with potentially far-reaching implications for the AI hardware landscape.
In a surprising turn of events, Apple has reportedly chosen Google's Tensor Processing Units (TPUs) over Nvidia's GPUs for its artificial intelligence (AI) training needs. This decision, first reported by The Information, marks a significant shift in the tech industry's AI hardware landscape [1].
Apple's decision to opt for Google's TPUs is particularly noteworthy given Nvidia's dominance in the AI chip market. Nvidia has been the go-to choice for many tech giants, including Microsoft and Meta, for their AI initiatives. However, Apple's move suggests a potential diversification in the AI hardware market [2].
Google's TPUs are custom-built AI accelerators designed specifically for machine learning workloads. These chips are optimized for Google's TensorFlow framework, which is widely used in AI and machine learning applications. The TPUs offer high performance and energy efficiency, making them an attractive option for large-scale AI training [3].
This partnership with Google for AI training hardware aligns with Apple's growing focus on AI technologies. The company has been working on various AI projects, including improvements to Siri and other AI-powered features in its products. By choosing Google's TPUs, Apple may be looking to accelerate its AI development efforts [4].
The tech industry has been closely watching Apple's AI strategy, and this move has sparked considerable interest. Some analysts speculate that this decision could lead to closer collaboration between Apple and Google in the AI space, potentially challenging Nvidia's market position [5].
While Apple's decision is a setback for Nvidia, the company remains a dominant force in the AI chip market. Nvidia's GPUs are still widely used by other tech giants and AI researchers. However, this development may encourage other companies to explore alternative AI hardware options, potentially leading to a more diverse and competitive market [1].
Apple's choice of Google's TPUs could have far-reaching implications for the AI industry. It may accelerate the development of specialized AI hardware and encourage more competition in the market. This could ultimately lead to faster innovation and more efficient AI training solutions across the tech industry [3].
© 2024 TheOutpost.AI All rights reserved