Curated by THEOUTPOST
On Wed, 11 Sept, 12:03 AM UTC
2 Sources
[1]
Apple partners with third parties, like Google, on iPhone 16's visual search
Apple's relationship with Google as its search partner is taking a new turn with Apple's introduction of visual search, or "Visual Intelligence" as the iPhone maker dubbed it yesterday during its iPhone 16 event. Already, Apple pays Alphabet roughly $20 billion per year to make Google the default search engine in its Safari browser. Now, iPhone 16 users will be able to access Google's search engine -- and its visual search capabilities -- with a click of the device's new Camera Control button.
OpenAI's ChatGPT, which is becoming accessible via Siri, was also shown as a third-party partner in a demo where you could aim your phone's camera at your class notes and, with a click of a button, get help understanding a concept or problem.
With the Camera Control, Apple explained how users can quickly take a photo or record video, and how they'll be able to slide a finger across the button to frame a shot and adjust options like zoom, exposure, or depth of field in a new camera preview experience. The button also gives iPhone 16 users access to Apple's new "visual intelligence" search feature, which is where the Google partnership comes in.
When first introduced, the iPhone 16's Camera Control seemed like Apple lingo for "shutter button," but as the event continued, Apple explained there's more this new hardware feature can do. With Visual Intelligence, users don't just have an easy way to learn about the things in their camera's view; they also have another way to reach third-party services without launching standalone apps. Apple described Visual Intelligence, essentially a visual search feature similar to Google Lens or Pinterest Lens, as a way to instantly learn about everything you see.
Across a few examples, Apple demonstrated how users could click the Camera Control button to pull up information about a restaurant they passed while out in town, or identify the breed of a dog spotted on a walk. The feature could also turn an event poster tacked on a wall into a calendar entry with all the details included. Apple's Senior Vice President of Software Engineering Craig Federighi then casually mentioned that the feature could be used to access Google search, too.
"The Camera Control is also your gateway to third-party tools, making it super fast and easy to tap into their specific domain expertise. So, if you come across a bike that looks exactly like the kind you're in the market for, just tap to search Google for where you can buy something similar," he said.
The demo showed a person tapping the Camera Control button while aiming their iPhone at a bike, then reviewing an array of similar options available for purchase in a pop-up window overlaid on the camera's view. The grid of images and descriptions of matching bikes was followed by a smaller onscreen button reading "More results from Google," indicating you could continue the search with another tap.
What Apple didn't explain is how or when a push of the Camera Control button would know to turn to a third-party partner for an answer rather than a built-in Apple service -- like Apple Maps, which was shown in the restaurant demo. Nor did the company fully explain how users would be able to control or configure the feature. Instead, Federighi said, somewhat vaguely, "Of course, you're always in control of when third-party tools are used." Reached for comment, a Google spokesperson said the company didn't have anything to share on the partnership at this stage. Apple didn't respond to a request for comment.
What's interesting about this feature is that it presents a new paradigm for interacting with software and services beyond those that Apple ships with the iPhone. And it arrives at a time when the concept of an App Store has begun to feel dated. With AI technology, users can ask questions, perform productivity tasks, be creative with images and video, and more. Those are things consumers used to turn to apps to do, but can now do from a new interface of talking and texting with an AI assistant.
Instead of rushing to build its own competitor to ChatGPT, Apple is presenting itself as the platform to reach third-party services, including AI technologies, search services, and likely other providers in the future. What's more, it can make these connections by way of behind-the-scenes deals with partners -- like its partnership with OpenAI on select AI features -- instead of tapping into transactions taking place inside the apps as a means of generating revenue. It also smartly keeps Apple's reputation from taking a hit when a third party, like ChatGPT, gets things wrong (as AIs tend to do) or when a Google search doesn't yield helpful results.
[2]
Apple looks to third parties, like Google, to help power iPhone 16's visual search
Apple announces partnerships with tech giants like Google to enhance the visual search capabilities of the upcoming iPhone 16. This move marks a significant shift in Apple's approach to AI and machine learning technologies.
In a surprising move, Apple has announced collaborations with several third-party companies, including tech giant Google, to power the visual search feature of its upcoming iPhone 16 [1]. This decision marks a significant shift in Apple's traditionally closed ecosystem approach and highlights the company's commitment to enhancing its AI and machine learning capabilities.
The partnership aims to leverage the expertise of industry leaders in AI and machine learning to create a more robust and accurate visual search function. By tapping into Google's vast image recognition database and advanced algorithms, Apple seeks to provide iPhone 16 users with an unparalleled visual search experience [2].
This collaboration between Apple and its competitors in the AI space signals a new era of cooperation in the tech industry. It also demonstrates Apple's recognition of the need to catch up in certain areas of AI development, where companies like Google have made significant strides [1].
The enhanced visual search feature is expected to allow iPhone 16 users to identify objects, landmarks, and text in images with greater accuracy and speed. This improvement could have far-reaching applications in areas such as augmented reality, accessibility, and e-commerce [2].
Despite the collaboration, Apple maintains that user privacy remains a top priority. The company assures that the visual search feature will process data on-device whenever possible and that any cloud-based processing will adhere to strict privacy guidelines [1].
This move by Apple is likely to shake up the smartphone market, potentially setting a new standard for visual search capabilities in mobile devices. It may also pressure other manufacturers to seek similar partnerships or invest heavily in their own AI and machine learning technologies to remain competitive [2].
The collaboration between Apple and third-party AI experts could pave the way for more advanced AI features in future iPhone models and other Apple devices. This partnership may be just the beginning of a broader strategy to integrate cutting-edge AI technologies across Apple's product lineup [1].
The Outpost is a comprehensive collection of curated artificial intelligence software tools that cater to the needs of small business owners, bloggers, artists, musicians, entrepreneurs, marketers, writers, and researchers.
© 2025 TheOutpost.AI All rights reserved