Curated by THEOUTPOST
On Tue, 10 Sept, 12:05 AM UTC
3 Sources
[1]
Apple adds Visual Intelligence - its answer to Google Lens - to the iPhone 16 camera
The company upgraded the iPhone camera with AI-powered Visual Intelligence, giving users visual search straight from the camera app. Here's how it works.

During Monday's iPhone 16 event, Apple announced a brand-new lineup of iPhones, AirPods, AirPods Max, and the Apple Watch Series 10. Apple also took the opportunity to expand on Apple Intelligence, the suite of artificial intelligence (AI) tools coming to the iPhone. That suite includes a new Visual Intelligence feature, which essentially gives the iPhone camera Google Lens capabilities.

Visual Intelligence lets you capture a photo of things around you, like a flyer or a restaurant, and then uses the iPhone's AI capabilities to search for it and give you more information.

Apple says captured data will remain private when handled by Apple Intelligence and the company's Private Cloud Compute, but users can opt in to third-party integrations with the new camera experience. These include the ability to search Google for whatever the camera captures, much like opening Google Lens straight from the iPhone camera app. Users can also enable ChatGPT integration with Visual Intelligence, allowing the chatbot to process image data captured by the camera. All third-party integrations require explicit, opt-in permission from the user.

The iPhone 16 and iPhone 16 Plus are available for pre-order from $799 and $899, respectively, but the Apple Intelligence features won't be available right away. Apple says some of its AI features will begin rolling out in beta next month, with more features to come over the following months.
[2]
Apple adds Google Lens-like Visual Intelligence to the iPhone 16 camera
[3]
AI-powered visual search comes to the iPhone
Visual search is coming to the iPhone, powered by Apple Intelligence, Apple's suite of AI capabilities, the company announced at Monday's Apple event. The Camera Control, the new button on the iPhone 16 and 16 Plus, can launch what Apple calls "visual intelligence" -- essentially a reverse image search combined with some text recognition.

If you use visual intelligence to search for a restaurant, for example, it'll pull up the restaurant's hours and ratings, along with options to check out the menu or make a reservation, Apple says. Or, if you come across a flyer for an event, you can use visual intelligence to quickly add the title, time, date, and location to your calendar. Visual intelligence will launch along with other Apple Intelligence features in beta in October for U.S. English users. Users in other countries can expect it in December 2024 and early 2025.
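Apple hasn't published how the flyer-to-calendar step is implemented. As a purely illustrative sketch, the post-recognition part of such a pipeline might take the text lines produced by OCR and pull out a title and an event date and time; the function name, the assumed flyer layout (title on the first line), and the sample text below are all hypothetical, not Apple's actual method.

```python
import re
from datetime import datetime

def parse_flyer(lines):
    """Extract a title and event datetime from OCR'd flyer text.

    Hypothetical sketch: assumes the first line is the event title
    and that some line contains a date like 'September 21, 2024 7:30 PM'.
    """
    title = lines[0].strip() if lines else None
    when = None
    # Match a long-form date followed by a 12-hour clock time.
    pattern = re.compile(
        r"([A-Z][a-z]+ \d{1,2}, \d{4})\s+(\d{1,2}:\d{2}\s*[AP]M)"
    )
    for line in lines:
        m = pattern.search(line)
        if m:
            when = datetime.strptime(
                f"{m.group(1)} {m.group(2)}", "%B %d, %Y %I:%M %p"
            )
            break
    return title, when

# Hypothetical OCR output from a concert flyer:
title, when = parse_flyer([
    "Jazz in the Park",
    "Join us on September 21, 2024 7:30 PM",
    "Central Park Bandshell",
])
```

A production system would of course need far more robust date handling and layout analysis; this only illustrates the kind of structured extraction the feature's calendar shortcut implies.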
Apple unveils Visual Intelligence, an AI-powered visual search feature for the iPhone 16 camera, rivaling Google Lens. This new technology aims to enhance user experience by providing instant information about objects in photos.
Apple has announced a notable addition to its iPhone 16 camera system: Visual Intelligence. This AI-powered visual search feature changes how users can interact with their surroundings through their smartphone cameras [1].
Visual Intelligence allows iPhone 16 users to point their camera at objects, text, or scenes to receive instant information and interactive options. By leveraging AI models, the feature can recognize and provide details about a wide range of items, from landmarks and plants to products and text [2].
Apple's Visual Intelligence is being viewed as a direct competitor to Google Lens, which has been available on Android devices for several years. While both technologies offer similar functionality, Apple's integration into the native iPhone camera app could provide a more seamless experience for iOS users [3].
Apple emphasizes that Visual Intelligence prioritizes user privacy. According to the company, requests are handled by Apple Intelligence on-device or through its Private Cloud Compute, and image data is passed to third-party services such as Google search or ChatGPT only when users explicitly opt in [1].
The introduction of Visual Intelligence is expected to significantly enhance the iPhone's utility as a tool for exploration and information gathering. Users will be able to interact with their environment in new ways, potentially changing how they shop, travel, and learn [3].
Visual Intelligence will be available exclusively on the iPhone 16 series at launch. Apple has not yet announced plans for backward compatibility with older iPhone models, suggesting that the feature may require hardware capabilities specific to the latest devices [2].
Apple is set to introduce Visual Intelligence, a powerful AI-driven feature for the iPhone 16. This technology aims to revolutionize how users interact with images and the world around them, rivaling Google Lens.
6 Sources
Apple's AI-powered Visual Intelligence feature, previously exclusive to iPhone 16 models, will soon be available on iPhone 15 Pro devices through a future software update, likely iOS 18.4.
12 Sources
Apple's latest iOS 18.4 beta brings Visual Intelligence, an AI-powered visual search feature, to iPhone 15 Pro and Pro Max models, expanding its availability beyond the iPhone 16 lineup.
5 Sources
Apple announces partnerships with tech giants like Google to enhance the visual search capabilities of the upcoming iPhone 16. This move marks a significant shift in Apple's approach to AI and machine learning technologies.
2 Sources
Apple is set to introduce new AI-powered features, collectively known as Apple Intelligence, to the iPhone 16 series. These features are currently in beta testing and are expected to revolutionize user interaction with iOS devices.
4 Sources
© 2025 TheOutpost.AI All rights reserved