Curated by THEOUTPOST
On Tue, 19 Nov, 4:02 PM UTC
3 Sources
[1]
Apple Intelligence is possible thanks to this key decision from 2017 - Softonic
Apple has shaped its artificial intelligence with a distinctive focus on privacy and local processing, an approach that changes how we interact with our devices and sets the company apart from what its competitors offer. The path that brought this technology to where it is today began earlier than we might imagine: a key decision in 2017, rooted in the forward-thinking vision that characterizes Apple, laid the groundwork for Apple Intelligence on devices with M1 chips and later.

In 2017, Apple's team was already looking beyond the horizon of the technology available at the time. That year the company introduced the first Bionic chip with a Neural Engine, initially conceived as a tool to enhance computational photography on the iPhone. That same year, Apple's engineers recognized the opportunity in front of them when they read a scientific paper titled "Attention Is All You Need," which introduced the world to transformer neural network models.

As Tim Millet, Apple's Vice President of Platform Architecture, explained in an interview on The Circuit podcast (via 9to5Mac), the team saw the potential of this technology to revolutionize the field of artificial intelligence. Rather than settling for the Neural Engine's initial capabilities, Apple made a strategic decision: completely rethink its architecture to pave the way for something much bigger. From the start, the company understood that technologies like transformer networks would take years to mature, but also that adapting its chips to those capabilities would take almost as long. So, according to Millet, starting in 2017 the team began working on a more advanced version of the Neural Engine, which finally debuted with the M1 chip in 2020.
The arrival of the M1 chip four years ago marked a turning point in Apple's transition to its own silicon and laid the foundation for what makes Apple Intelligence possible today. The enhanced Neural Engine allowed M1 devices to run advanced neural networks efficiently and, most importantly, on the device itself. This offers not only greater power and speed but also an unmatched level of privacy, since data does not need to leave the device to be processed: a fundamental difference from artificial intelligence solutions that rely on cloud servers.

It is no coincidence that Apple Intelligence is available on every Mac with an M1 chip or later. As the interview makes clear, it is the result of careful planning and a long-term vision, seven years in the making, that began with that decision in 2017. Thanks to it, today we can enjoy features such as object removal in Photos, intelligent notification sorting, and real-time personalized emoji generation, all processed directly on the device. And that is just the beginning: the second phase of Apple Intelligence, with the Image Tools, will arrive shortly, followed by the new Siri with knowledge of our personal context.

Without that key decision in 2017, the Apple Intelligence landscape would look very different. While we wait to learn when Google Gemini will be integrated with Siri, or figure out how to use ChatGPT to transcribe our handwritten notes for free, it is easy to hear that Apple has missed the train on artificial intelligence; interviews like this one offer another perspective. Seven years ago, Apple was not content with what was possible at the time. It bet on what was to come, and that bet has produced something that today makes our lives easier and, above all, smarter.
[2]
Apple Intelligence on M1 chips happened because of a key 2017 decision, Apple says - 9to5Mac
Apple Intelligence is made possible by Apple's silicon efforts as a whole, as a new interview reveals. And apparently, those efforts took a big shift all the way back in 2017 in preparation for AI.

The Circuit podcast with Ben Bajarin and Jay Goldberg just published a new episode featuring an interview with Apple execs Tom Boger and Tim Millet. Their conversation focuses on Apple silicon and covers the relationship between AI and the company's silicon work. We learn that the first Neural Engine was created as an extension of Apple's computational photography ambitions, but it then set them up for success in AI. Here's an excerpt that relates to a 2017 decision paving the way for AI on M1 devices:

"We introduced [the Neural Engine] in 2017, but another interesting thing happened in 2017, that was the paper that got published, Attention is All [You Need]. This was a paper that sort of led to the transformer networks... Well, my team was paying attention. They were reading the paper back in 2017, and they were like, holy mackerel, this stuff looks like it might be interesting. We need to make sure we can do this. And so, we started working on re-architecting our neural engine the minute we started shipping it, so that by 2020, when we released M1 into the Apple silicon transition, we were in a position to be able to run these networks. Now, what did that mean? Well, that meant that we, as we introduced Apple Intelligence, we can commit to say, we can do that on all the Macs running Apple Silicon, because M1, we had the foresight to be able to look, and we're paying attention to the trends and introduce it, knowing that silicon takes time to get it in there."

Apple Intelligence may not have existed in 2017, but the silicon team's decision to start reworking the Neural Engine right away is what makes it possible for 2020 Macs to run the new software today. The full interview is available here on Apple Podcasts and is well worth a listen.
[3]
This Is How Apple Intelligence Support on M1 Mac Models Was Made Possible
Apple Intelligence is compatible with all M1 chip-equipped devices. Apple engineers made a key decision in 2017 that allowed the company to offer Apple Intelligence even on devices launched in 2020, an executive said. In a podcast, senior executives highlighted that the engineers designing the M1 chipset decided to rework its neural engine to make it artificial intelligence (AI) ready. This is notable since the M1 chipset first launched in 2020, two years before the generative AI trend picked up steam.

The Circuit podcast, in its latest episode, invited Apple's Vice President of Platform Architecture Tim Millet and Tom Boger, Senior Director of Mac & iPad Product Marketing at Apple, for a conversation. The duo discussed the company's approach to AI, the integration of hardware, the importance of architecture, and more.

Interestingly, the executives revealed that Apple's engineers became aware of transformer networks in 2017, soon after the first paper about them was published. That technology is considered the foundation of generative AI. The executives highlighted that the engineers began redesigning the company's Neural Engine for the next generation of Apple silicon, the M1 chip. By the time the chipset debuted with the MacBook Air, the 13-inch MacBook Pro, and the Mac mini in 2020, the company could run neural networks on the processor. However, back then the company did not have much use for them, and the generative AI wave was still two years away.

As a takeaway, the executives said that with M1 "we had the foresight to be able to look, and we're paying attention to the trends and introduce it, knowing that silicon takes time to get it in there." Notably, at the "It's Glowtime" event earlier this year, Apple announced that Apple Intelligence would be compatible with M1 chipsets, bringing new features to hardware that is four years old.
The tech giant's AI offerings are now scheduled to be rolled out to users globally in December. However, users in the European Union (EU) and China will not get it at launch due to regulatory hurdles.
Apple executives reveal how a strategic decision in 2017 to redesign the Neural Engine laid the groundwork for Apple Intelligence, enabling AI capabilities on M1 chips and later models.
In a revealing interview on The Circuit podcast, Apple executives Tim Millet and Tom Boger shed light on a pivotal decision made by the company in 2017 that set the stage for Apple Intelligence on M1 chips and beyond [1]. This strategic move demonstrates Apple's long-term vision and commitment to integrating artificial intelligence into its devices while prioritizing privacy and on-device processing.
The journey began in 2017 with the introduction of the first version of Bionic chips featuring the Neural Engine. Initially conceived to enhance computational photography in iPhones, the Neural Engine soon became the focus of a much more ambitious plan [2].
That same year, Apple's engineering team came across a scientific paper titled "Attention is All You Need," which introduced transformer neural network models. Recognizing the potential of this technology to revolutionize AI, Apple made a crucial decision to completely rethink the architecture of the Neural Engine [1][2].
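The core operation that paper introduced, scaled dot-product attention, is what makes transformer networks so demanding of dedicated hardware like the Neural Engine: it reduces to a handful of dense matrix multiplications. As an illustration only (this is a generic textbook sketch, not Apple's implementation), a minimal NumPy version:

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)     # similarity of each query to each key
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                  # attention-weighted sum of values

# Toy example: 3 tokens, dimension 4 (random data just to show shapes)
rng = np.random.default_rng(0)
Q = rng.standard_normal((3, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4): one output vector per token
```

Because the whole computation is matrix multiplies and a softmax, it maps naturally onto a neural accelerator, which is why redesigning the Neural Engine around such workloads paid off.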
Tim Millet, Apple's Vice President of Platform Architecture, explained that the team understood the development of technologies like transformer networks would take years. However, they also realized that adapting their chips to these capabilities would require a similar timeframe. This foresight led to the development of a more advanced version of the Neural Engine, which debuted with the M1 chip in 2020 [2][3].
The introduction of the M1 chip in 2020 marked a significant milestone in Apple's transition to its own silicon. The enhanced Neural Engine in M1 chips allowed for efficient running of advanced neural networks directly on the device, offering greater power, speed, and an unparalleled level of privacy [2].
Apple's approach to AI stands out due to its focus on on-device processing. This strategy not only enhances performance but also ensures user privacy by keeping data processing local rather than relying on cloud servers [2].
Today, users of M1 chip devices and later models can enjoy features such as object removal in Photos, intelligent notification sorting, and real-time personalized emoji generation. Apple Intelligence is set to expand further with upcoming Image Tools and a new version of Siri with personal context awareness [2].
While some may argue that Apple has lagged in the AI race, this revelation demonstrates the company's deliberate and patient approach to integrating AI technologies. By focusing on hardware capabilities first, Apple has positioned itself to offer unique AI features that align with its privacy-centric philosophy [1][2][3].
As the tech industry continues to evolve rapidly in the AI space, Apple's strategy of long-term planning and hardware-software integration sets it apart from competitors relying heavily on cloud-based AI solutions.
Apple's upcoming AI platform, Apple Intelligence, is set to launch with iOS 18, bringing new features to iPhones, iPads, and Macs. This article explores the platform's capabilities, rollout strategy, and how it compares to competitors.
23 Sources
Apple is reportedly using Google's custom chips to train its AI models, moving away from Nvidia hardware. This collaboration aims to enhance iPhone intelligence and AI capabilities.
2 Sources
Apple introduces on-device AI capabilities for iPhones, iPads, and Macs, promising enhanced user experiences while maintaining privacy. The move puts Apple in direct competition with other tech giants in the AI race.
6 Sources
Apple is set to introduce its new AI-driven technology, Apple Intelligence, across its devices in October. This update promises to enhance user experience with advanced features for productivity, creativity, and accessibility.
12 Sources
Apple's AI initiative, Apple Intelligence, encounters significant setbacks and delays, raising questions about the company's ability to compete in the rapidly advancing AI market.
5 Sources
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved