Curated by THEOUTPOST
On Tue, 10 Sept, 12:05 AM UTC
6 Sources
[1]
Apple announces Visual Intelligence -- its take on Google Lens
Apple is bringing a new AI-powered feature to the iPhone 16 that will let users turn the camera into a glorified visual search engine. This is similar to Google Lens on Android and is powered by Apple Intelligence, but includes integration with any app or service running on the phone. Visual Intelligence is essentially AI Vision, where a language model can analyze and understand images. This is something Claude, Gemini and ChatGPT are also able to do well. With its deep integration into the iPhone 16, including access to the new Camera Control button, Apple's approach is likely to be much more user-friendly. One example given during the Glowtime event involved using it to add an event from a poster to your phone calendar.

Apple Visual Intelligence was one of the standout announcements for me during Glowtime. Vision AI is likely to be the most user-friendly AI feature as it lets the AI see the world around us. Some Vision AI features are already in use on the iPhone and have been for some time, including being able to copy text from an image or identify an animal in a photo, but this brings those features to the real world via the camera. Using a combination of on-board and cloud-based AI models (the latter through Apple's Private Cloud Compute), the iPhone can analyze what the camera is seeing in near real-time and provide feedback. How it handles the image depends on the user. For example, it could add an event to the calendar if it identifies one in the image, or it could simply tell you a dog's breed. Alternatively, if you see a product you want to buy, you could have Apple Intelligence redirect you to Google. Apple says it doesn't store any images captured as part of an Apple Intelligence search and deletes images sent to the cloud for deeper analysis.
Much of the data gathered by Apple Intelligence features, including Visual Intelligence, is processed on device, particularly on the iPhone 16 with its powerful new A18 processor. Where data does go to the cloud, Apple says it goes to great lengths to protect the information. This is largely driven by Private Cloud Compute, a new cloud system built on Apple Silicon and a custom version of the iPhone operating system. As well as ensuring nothing is accessible to anyone beyond the user, the architecture is open to audit by third parties. If a user opts in to send data to a third party such as Google for search or OpenAI's ChatGPT for deeper analysis, it won't have the same security, but Apple says that will always be opt-in and optional, with nothing sent without express permission.

Apple Visual Intelligence gives the AI a view of the world outside your phone. It can be used to take a photo of a bag of groceries and have the AI generate a recipe, or of an empty fridge and have it generate a shopping list. Outside of food, it could be used for live translation of signs, to identify potentially risky ingredients for someone with food allergies, or to identify a location from a simple photo. If you take a photo of a dog, you can go into Photos and Apple Intelligence will tell you the breed; now you won't even have to take the photo, as holding the camera up to a dog will give you that information. This will also work with spiders or any other animal. There are as many use cases as there are types of things to look at. It could be used to get the history of a building, find a review of a book or even get a link to buy that bike. It is an impressive and logical feature to be built into the iPhone 16.
[2]
5 Apple Intelligence features coming to the iPhone 16
During its 'Glowtime' iPhone and Apple Watch event, Apple declared that the iPhone 16 lineup, including the base and Pro models, was built from the ground up with Apple Intelligence in mind. This includes updated Apple Silicon with improved neural engines, new hardware controls, and changes to the operating system that will be coming as early as next month. One of the most notable hardware upgrades across all four iPhone 16 models is the Camera Control button, which can be used to trigger Apple Intelligence actions such as adding an event to your calendar from a photo of a flyer. You'll also be able to look up information from images you grab with Camera Control, whether it's about a restaurant or a breed of dog -- two examples Apple cited in its Glowtime preview. The iPhone 16 also comes with other Apple Intelligence features such as a new conversational Siri, writing tools integrated with every app, and smart summaries for notifications and emails. Many of the Apple Intelligence features are part of iOS 18 and will also work with the latest iPads and the iPhone 15 Pro models, but with the Action button and Camera Control button, as well as a faster Neural Engine, the iPhone 16 should give the best experience. Initially, it will only be available in U.S. English, coming to other versions of English from December and other languages next year. You also won't get all the features immediately, as many Apple Intelligence updates will be added over the coming months. Initial Apple Intelligence features are part of an iOS 18.1 beta available to developers, and Apple said they would be available in a beta for all users next month. The main iOS 18 update comes out next Monday (September 16). Apple senior vice president of software Craig Federighi says Apple Intelligence can "understand and create language and images, take action on your behalf to simplify daily life and does it all while grounded in your personal context."
Apple is bringing its own version of one of the most powerful forms of AI, Vision, to the iPhone 16 in the form of Apple Visual Intelligence. This uses AI to analyze images and perform tasks based on the content. It works with text in images and location data as well as the image itself. During a demo video, Apple showed someone holding the camera up to a poster on a wall advertising an event and being offered the chance to add it to the calendar. They then held their phone up to a dog, using Apple Intelligence to identify the breed. If these features sound familiar, that's because Google has offered something similar through Google Lens for some time. Vision AI is also something you can do with Claude and ChatGPT on the iPhone already. Apple even said Visual Intelligence in the camera app could be integrated with Google Search or ChatGPT for more detailed responses.

One of the features of Apple Intelligence that most people will likely use all the time involves writing tools. These will be deeply integrated into iOS 18 and accessible through any app where you need to write, including Slack, Messages and the browser. Writing tool features include completely rewriting a paragraph for a specific audience, as well as simpler updates such as spelling and grammar correction. You can also use them for proofreading or for turning paragraphs of prose into a bulleted list.

The iPhone 16 will get a new default app called Image Playground. This will allow you to use AI to create images of yourself and others, as well as general pictures to share on social media. These features, as well as the ability to use AI to create custom emojis, will also be available through Siri and in the Messages and Mail apps. Apple says you'll be able to just describe to Siri what you want to see -- and it will make the image. The most powerful image features won't be around generating pictures, though, but in analyzing the ones you already have.
For example, you'll be able to go into the Photos app, describe a dress that you know someone may have worn once, and it will find all the images featuring that dress. What is potentially more impressive is that it can do this with video, and even find a specific moment within any video stored in your photos library.

A feature that stood out to me during the Apple Intelligence demo involved the automation elements. For example, with Apple Intelligence, instead of seeing the first line of an email in the preview window, you'll see an AI-generated summary of that email to give you a better idea of what it says before you open it. This level of summarization also extends to notifications, where Apple Intelligence will automatically summarize the purpose of any notification to make it easier to know whether it is worth opening or just discarding. Speaking of notifications, your phone's on-board AI will also be able to automatically place the most important or urgent notifications at the top of the pile.

Siri is getting one of the most notable upgrades. With a new look and more conversational language capabilities, it will be the "face" of Apple Intelligence. Where you currently talk to the AI assistant almost like you're typing a query into Google, on the iPhone 16 it will understand you even if you change what you're saying halfway through, and it can hold a natural language conversation. It also works when you are typing your query, offering a similar but cut-down experience to ChatGPT or Google Gemini. This is in part because it has a large language model behind it with an expansive training dataset. This extends to a deep understanding of the phone and operating system, allowing you to get advice on features and how to perform tasks directly from Siri.
[3]
Apple wants you to scan your friend's dog with iPhone 16's Visual Intelligence
Apple revealed the iPhone 16 lineup at the "It's Glowtime" event on September 9, and also showcased its forthcoming Apple Intelligence AI features which will begin rolling out in October. Apple CEO Tim Cook called the iPhone 16 series the "first iPhones designed from the ground up for Apple Intelligence", and the company showed several short videos to demonstrate the day-to-day uses of its new AI tools. Amongst the features shown was Visual Intelligence, activated by the new Camera Control capacitive button, which analyses the images in the field of view of the camera app to provide contextual information about the user's surroundings. Apple's livestream showed a supposed user walking around town, using Visual Intelligence to pull up a restaurant menu and find more details about a concert flyer. However, the example that has stirred the most reactions has to be our protagonist using their iPhone 16 to scan a passer-by's dog, to find the breed of the canine companion through an AI-assisted web search. Online commenters have been going barking mad (sorry) trying to figure out why the user wouldn't just ask the dog walker themselves. As tech reviewer and YouTuber Rjey Tech writes on X (formerly Twitter): "'What kind of dog is that?' Instead of asking the owner, pull out your iPhone and ask Apple Intelligence." Others weren't so quietly sarcastic. App developer kitze referred to "AI morons" silently pointing iPhones at strangers' puppies. Most reactions haven't been quite so strong, but there's certainly a consensus of confusion regarding Apple's choice of demonstration. While most commentators - us included - recognize that Apple's iPhone 16 demonstrations aren't quite true-to-life, the dog-scanning discourse highlights concern over whether Apple Intelligence brings meaningful upgrades to the iPhone 16, iPhone 16 Plus, iPhone 16 Pro, and iPhone 16 Pro Max. 
After all, Apple's annual September events are the most important product launches in its calendar, so fans expect the company's A-game all the way through. So, a somewhat unrealistic demonstration can leave commentators like FlightRadar24's Jacob Rabinowitz thinking that Apple can't bring any tangible use cases to mind. It's too early to draw conclusions about Apple Intelligence, but we think it's most likely Apple was just having a bit of fun - and let's face it, cute dogs are good for business. On the other hand, Apple are always keen to emphasise the constant presence of its products in people's lives, so there is something to glean about how the company may be envisioning a normalisation of AI in day-to-day interactions. Visual Intelligence will be available later this year as part of a staggered rollout of Apple Intelligence features.
[4]
The iPhone 16's Apple Intelligence Revolutionizes the Camera and Photos App
The new Apple Intelligence feature on the iPhone 16 has been built "from the ground up" to understand language and images. Craig Federighi introduced the feature at Apple's Glowtime event today (Monday), which saw the launch of the iPhone 16 and 16 Plus -- the first iPhones fully integrated with Apple Intelligence. Apple users have already been able to search for objects in Photos, but with Apple Intelligence they will be able to get more specific. For example, users will be better able to describe a photo they want to find on their phone using natural language. This feature extends to videos, where the user can find a "specific moment" in a clip. Users can also create a movie with AI: it will automatically find photos and videos and "smartly" arrange them into a storyline. Using the new touch-sensitive Camera Control button, users can point the iPhone 16's camera at a subject and see specific information. Apple gave the example of using Visual Intelligence to get more information about a restaurant or an event poster; for instance, the user can find out the restaurant's opening hours and reviews. Image Playground is a new dedicated app that will be integrated into other apps such as Messages. It works like an AI image generator, allowing users to generate different types of images based on a prompt. Notably, Image Playground doesn't support photorealistic generation. Instead, it has three styles: Animation, Illustration, and Sketch. It can also be used to create novel emojis by typing in a prompt. Away from the visual side of things, Apple says its AI feature is "grounded in personal context" thanks to a unique integration of hardware and software. Apple Intelligence will be found across a multitude of apps and can be used to help write emails or private messages. It will also improve Siri -- Apple's much-maligned personal assistant -- by giving the user "step-by-step guidance on how to do something with your iPhone".
Siri will be able to "tap into your personal context" while gaining "on-screen awareness"; for example, taking prompts from messages and opening a relevant app. Users will be able to use AI-enhanced Siri to send photos, for example: "Hey Siri, send Erica the photos from Saturday's BBQ." The Siri feature will be available in beta from next month. Localized English -- like U.K. and Australian English -- will start rolling out in December, and next year it will add other languages like Chinese, French, Japanese, and Spanish. Meanwhile, the Private Cloud Compute feature will give users access to larger AI models that are more powerful than the ones installed on the iPhone 16. But Apple insists the user's data will be protected and never stored.
[5]
Apple's iPhone 16 Will Have New 'Visual Intelligence' Feature
At Apple's "Glowtime" event on Monday, the company unveiled the brand new iPhone 16 and iPhone 16 Plus. Along with a redesigned camera, the iPhone 16 will have another new feature: Visual Intelligence. Visual Intelligence will use AI to identify the things around you. For example, if you see a dog, you can take a photo of it and your iPhone 16 will identify the breed. Or, if you see a poster promoting an event, you can take a photo of it and the details on the poster will be added to your calendar.
[6]
iPhone 16's Camera Control button has a new Visual Intelligence feature to compete with Google Pixel's Circle to Search
The iPhone 16 was just announced, and it has an amazing new Apple Intelligence feature called Visual Intelligence. Revealed at the September Apple event 'Glowtime', the incredible new AI feature allows you to click the new Camera Control button on the side of the phone and use a multimodal AI to give you search results on the fly. Ever wanted to know what that dog breed walking by is? Sorted. Search for anything you see using Visual Intelligence and the new Camera Control button on the iPhone 16. Visual Intelligence works similarly to the Google Pixel's Circle to Search feature, but it could be even better thanks to the iPhone camera's ability to search for whatever you see. The iPhone 16 and iPhone 16 Plus are powered by a new A18 chip built from the ground up for Apple Intelligence. The Visual Intelligence feature will be available later this year, meaning you'll only be able to use the Camera Control button for, you guessed it, the camera at launch. The iPhone 16 isn't just getting a new Camera Control button; it's also getting the iPhone 15 Pro's Action button, which is capable of launching any shortcut you can think of. The new Visual Intelligence feature is also available on the newly announced iPhone 16 Pro and iPhone 16 Pro Max, which now have larger displays compared to last year's Pro models, alongside the A18 Pro chip. The larger 6.3- and 6.9-inch displays are perfect for Visual Intelligence, allowing you to easily frame whatever you want to use AI to search for and view it on a larger display. Apple Intelligence arrives in beta next month, with future AI abilities like Visual Intelligence and an upgraded Siri with context awareness coming soon after.
Apple is set to introduce Visual Intelligence, a powerful AI-driven feature for the iPhone 16. This technology aims to revolutionize how users interact with images and the world around them, rivaling Google Lens.
Apple is gearing up to launch a groundbreaking AI-powered feature called Visual Intelligence for the upcoming iPhone 16. This innovative technology is poised to transform how users interact with their device's camera and Photos app, offering capabilities similar to Google Lens [1].
Visual Intelligence will enable iPhone 16 users to perform complex image recognition and analysis tasks. By simply pointing their camera at an object, users can identify various elements within their surroundings. The feature is expected to recognize a wide range of items, including animals, landmarks, plants, and even food dishes [3].
One of the key benefits of Visual Intelligence is its ability to automatically organize and categorize photos in the Photos app. The AI will analyze images, identifying people, objects, and scenes, making it easier for users to search and find specific photos within their library. This feature is expected to greatly improve the user experience when managing large photo collections [4].
Visual Intelligence will provide real-time information about objects in view. For instance, users can scan a friend's dog to learn about its breed or point their camera at a landmark to access historical information. This feature is designed to make everyday interactions more informative and engaging [3].
Apple's Visual Intelligence is expected to integrate seamlessly with other Apple services and apps. This integration could potentially allow users to shop for items they've photographed, find recipes for dishes they've captured, or get directions to places they've seen in images [2].
As with any AI-powered feature, privacy and security are paramount. Apple says image analysis is performed on-device where possible, with requests that do go to the cloud handled by its auditable Private Cloud Compute, and that user images are never stored. This approach aligns with Apple's commitment to protecting user data [5].
The introduction of Visual Intelligence could significantly change how iPhone users interact with their surroundings. From simplifying shopping experiences to enhancing educational opportunities, this feature has the potential to make the iPhone an even more integral part of users' daily lives [4].
© 2025 TheOutpost.AI All rights reserved