15 Sources
[1]
Amazon unveils AI smart glasses for its delivery drivers | TechCrunch
Amazon announced on Wednesday that it's developing AI-powered smart glasses for its delivery drivers. The idea behind the glasses is to give delivery drivers a hands-free experience that reduces the need to keep looking between their phone, the package they're delivering, and their surroundings. The e-commerce giant says the glasses will allow delivery drivers to scan packages, follow turn-by-turn walking directions, and capture proof of delivery, all without using their phones. The glasses use AI-powered sensing capabilities and computer vision alongside cameras to create a display that includes things like hazards and delivery tasks. Amazon likely hopes that the new glasses will shave time off each delivery by providing delivery drivers with detailed directions and information about hazards directly in their line of sight. When a driver parks at a delivery location, Amazon says the glasses automatically activate. The glasses help the driver locate the package inside the vehicle and then navigate to the delivery address. The glasses can provide easy-to-follow directions in places like multi-unit apartment complexes and business locations. The glasses are paired with a controller worn in the delivery vest that contains operational controls, a swappable battery, and a dedicated emergency button. Amazon notes that the glasses also support prescription lenses and transitional lenses that automatically adjust to light. The retailer is currently trialing the glasses with delivery drivers in North America and plans to refine the technology before a wider rollout. The announcement doesn't come as a surprise, as Reuters reported last year that Amazon was working on the smart glasses. In the future, Amazon says the glasses will be able to provide drivers with "real-time defect detection" that could notify them if they accidentally drop off a package at the wrong address. 
The glasses will also be able to detect pets in yards and automatically adjust to hazards like low-light conditions. Also on Wednesday, Amazon unveiled a new robotic arm called "Blue Jay" that can work alongside warehouse employees to pick items off shelves and sort them. Additionally, the tech giant announced a new AI tool called Eluna that will help provide operational insights at Amazon warehouses.
[2]
Amazon reveals 'smart delivery glasses' that guide drivers and scan packages
Remember when word got out that Amazon was building smart glasses for its delivery drivers before possibly launching a version for consumers as well? Amazon has just revealed those "Amelia" glasses to the world -- with a built-in display and an always-on camera to assist drivers as they go. They can help drivers find the right packages inside their delivery vans, give them turn-by-turn directions to the right address, and take a hands-free photo of a successful delivery instead of needing them to whip out a phone. (Ever had an Amazon driver tell you "please don't take your package yet, I have to take a pic first?") "If there are hazards, or a need to navigate complex environments like apartment buildings, the glasses will guide [drivers] safely to their destination," Amazon claims, showing off mocked-up video clips of what drivers should see as they work. As you can see in the photo below, the glasses aren't standalone -- they're paired with a vest that contains a swappable battery, and a button the driver can press to take a photo of each successful delivery. There seems to be a dial surrounding that button as well, and Amazon writes that the controller also has a dedicated emergency button to call for help. Amazon hasn't said which sensors are inside the glasses, but images suggest they may have two cameras: one centered above the nose, and one above the temple. Amazon also says the glasses have transition lenses -- they'll tint darker in sunlight and get clearer without it -- and support prescription lenses as well. While the company isn't saying exactly when or where the glasses might roll out, it says hundreds of drivers have already tested early versions of the tech, and that it wants to put more AI features inside. 
"We anticipate future versions of the glasses will provide real-time defect detection, where the glasses can help notify drivers if they've mistakenly dropped a package at a customer doorstep that does not correspond with the house or apartment number on the package, detect hazards like low light and adjust the lenses, notify that there's a pet in the yard, and more," writes Amazon. The frames don't seem particularly thick in these photos Amazon has released, but -- like the Meta Display and other glasses with built-in screens -- they don't seem quite at the level of consumer eyewear just yet. The Information reported in September that Amazon's glasses-with-a-screen for consumers, codenamed "Jayhawk," might launch in 2026 or 2027. Reuters, which originally reported on the glasses for delivery drivers, wrote that those were codenamed "Amelia," and indeed some of Amazon's images today have the word "Amelia" in their filename. Unsurprisingly, Amazon's blog post doesn't discuss any possible ethical concerns that workers or customers might have as these glasses monitor the "last mile" of the delivery process.
[3]
Your Amazon driver is wearing AI smart glasses now - here's what they see
They can help identify hazards and make it easier to find packages. AI-powered smart glasses are having a moment, with Meta adding several new pairs to its collection last month at Meta Connect and Samsung teasing its upcoming pair this week. However, the AI smart glasses on the market have catered to the general consumer, while the ones Amazon just launched have a different purpose: optimizing deliveries. On Wednesday, during its "Delivering the Future" event in San Francisco, Amazon unveiled its smart delivery glasses, designed to help delivery associates deliver packages more safely and efficiently, which in turn improves customer delivery experiences. The glasses can scan packages, display turn-by-turn walking directions, and capture an image of the delivery hands-free, helping drivers stay focused and avoid reaching for their phones. Powering it all is AI and machine learning. The glasses use AI-powered sensing, computer vision, and a camera to create a "heads-up display" that can showcase navigation details, hazards, and delivery tasks, as stated in the release. In Amazon's screenshots, the in-lens display appears to function similarly to the in-lens displays found on the Even Realities smart glasses. For example, the glasses automatically populate delivery information, such as the address and number of packages, within the driver's field of view as soon as they park safely outside the delivery location. The smart glasses also enable drivers to locate the package in the truck, displaying different alerts when the package is identified. Lastly, the glasses navigate the driver to the delivery address with turn-by-turn walking directions powered by Amazon's geospatial technology. If there are hazards on the way, the smart glasses will guide the delivery associate around them to reach the destination safely. 
The glasses can accommodate prescriptions as well as light-adjusting lenses. Additionally, the glasses feature a small controller worn in the delivery vest for operational controls, a swappable battery for all-day use, and a dedicated emergency button. Amazon said in the blog post that the glasses were designed with input from the drivers themselves, with hundreds testing early versions and providing feedback that influenced the design and comfort of the glasses. In the future, the company anticipates that the glasses will be able to detect real-time mistakes, alerting the driver if they mistakenly drop the package at the wrong house or apartment, send notifications when a pet is in the yard, and perform other functions. The company did not provide an exact timeline for when to expect your driver to arrive at your door wearing the smart glasses, or when the rollout will start.
[4]
Amazon straps AI smart specs to delivery drivers
Why monitor staff through phones or cameras when Bezos' boxshifter can strap surveillance to their heads? Amazon is testing AI-powered smart glasses to help its drivers get from their vans to customers' doorsteps. The smart specs combine mounted cameras with computer vision and what Amazon calls "AI-powered sensing capabilities," which project turn-by-turn directions and delivery instructions directly into the driver's field of view. The idea, Amazon said in an announcement, is to create a "hands-free experience" that reduces the need for drivers to repeatedly look between their phones, packages, and surroundings when making a drop-off. In practice, the hands-free part may be limited, given that packages still need to be carried to the customer's doorstep. The glasses are part of a wider push, we're told, to thread AI through Amazon's last-mile delivery network, sitting alongside its expanding fleet of route-planning software, drones, and warehouse automation tools. Once a driver parks, the glasses automatically activate, helping them with everything from finding the right parcel in the van to navigating apartment buildings and confirming the drop-off. A small controller in the delivery vest houses the battery and buttons, including an emergency contact system. The glasses support prescription lenses and transitional lenses that automatically adjust to light. Future iterations of the eyewear could detect when a parcel has been dropped at the wrong address, flag loose pets or trip hazards, or automatically adjust to low-light conditions. The eyewear was developed under Amazon's $16.7 billion Delivery Service Partner program, which funds AI programs across its global network of drivers. The company says the tech was designed with feedback from hundreds of test drivers, with one saying the glasses made him "feel safer the whole time" because the display kept information "right in my field of view." 
Skeptics might note the flip side: wearable cameras and continuous tracking built directly into a driver's line of sight. Amazon insists the aim is to make deliveries safer and more "seamless" - a term it uses a grand total of five times in its announcement. Whether "seamless" also means "easier to observe" is something delivery drivers may soon find out. ®
[5]
Amazon unveils prototype AI smart glasses for its delivery drivers
Amazon is the latest US tech giant to enter an increasingly crowded field of firms experimenting with wearables, but for now it is a product meant for drivers, not customers. Ms Tomay said that drivers "have been doing real deliveries with these" to customers and that the glasses will be initially rolled out in North America. "We custom designed it for that use case," she added. "There's a very specific application here." When asked by the BBC if the Amelia smart glasses might be marketed to consumers at some point in the future, Ms Tomay did not rule out the possibility. Instagram and Facebook-owner Meta has also experimented with smart glasses in recent years. At its Meta Connect conference last month, the company unveiled a range of smart glasses powered by its Meta AI technology, including a pair of Ray-Bans with a built-in display. Unlike Amazon, Meta's smart glasses target the mainstream consumer products market. Meta presented the hardware as a technology that allows users to remain more engaged in the real world compared to smartphones. For Amazon, the Amelia smart glasses could augment efficiency in the "last mile" of its delivery network. Ms Tomay said the smart glasses can detect when they are in a moving vehicle, which prompts them to automatically shut off. "From a safety perspective, we thought that was important. No distractions," Ms Tomay told a group of reporters at an event in California. Ms Tomay estimated that the glasses could create up to 30 minutes in efficiencies per 8- to 10-hour shift by minimising repetitive tasks and helping drivers to quickly locate packages in their vehicles. The smart glasses also include a hardware switch on the controller that lets the driver turn off the glasses and all of its sensors, including the camera and microphone.
[6]
Amazon unveils AI-powered augmented reality glasses for delivery drivers
MILPITAS, Calif. -- Amazon is bringing delivery details directly to drivers' eyeballs. The e-commerce giant on Wednesday confirmed that it's developing new augmented reality glasses for delivery drivers, using AI and computer vision to help them scan packages, follow turn-by-turn walking directions, and capture proof of delivery, among other features. Amazon says the goal is to create a hands-free experience, making the job safer and more seamless by reducing the need for drivers to look down at a device. Scenarios shown by the company make it clear that the devices activate after parking, not while driving, which could help to alleviate safety and regulatory concerns. But the devices play into larger questions about the workload on drivers, and a history of technology efficiencies at Amazon's fulfillment hubs pushing workers to the limits of human capacity and safety. It's also another opportunity for the company to track and monitor the work done by drivers, adding more pressure to a job that already comes with demanding delivery targets. The wearable system was developed with input from hundreds of drivers, according to the company. It includes a small controller worn on the driver's vest that houses operational controls, a swappable battery for all-day use, and a dedicated emergency button. The glasses are also designed to support prescription and transitional lenses. Amazon says future versions could provide real-time alerts for hazards, like pets in the yard, or notify a driver if they are about to drop a package at the wrong address. According to Amazon, the smart glasses are an early prototype, currently in preliminary testing with hundreds of drivers in North America. The company says it's gathering driver feedback to refine the technology before planning a broader rollout. The announcement at Amazon's Delivering the Future event in the Bay Area today confirms a report by The Information last month. 
That report also said Amazon is developing consumer AR glasses to compete with Facebook parent Meta's AI-powered Ray-Ban smart glasses. The enterprise AR market is in flux, with early mover Microsoft pivoting away from HoloLens hardware, creating an opening for players like Magic Leap and Vancouver, Wash.-based RealWear. A demo video released by Amazon shows a delivery driver using augmented reality (AR) glasses throughout their workflow. It begins after the driver parks in an electric Rivian van, where the glasses overlay the next delivery address directly onto a view of the road. "Dog on property," the audio cue cautions the driver. Upon parking, the driver moves to the cargo area. The AR display then activates to help with sorting, with green highlights overlaid on the specific packages required for that stop. As the driver picks each item, it's scanned and a virtual checklist in their vision gets updated. After retrieving all the packages from the cargo hold, the driver begins walking to the house. The glasses project a digital path onto the ground, guiding them along the walkway to the front door. Once at the porch, the display prompts the driver to "Take photo" to confirm the delivery. After placing the items, the driver taps a chest-mounted device to take the picture. A final menu then appears, allowing the driver to "Tap to finish" the stop before heading back to the van.
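The stop-by-stop sequence in the demo video can be sketched as a simple state machine. The state names and transitions below are illustrative assumptions drawn from the demo description, not Amazon's actual glasses software:

```python
# Illustrative sketch of the delivery workflow shown in Amazon's demo video.
# State names and transitions are assumptions for illustration only; they do
# not reflect Amazon's real software.

WORKFLOW = [
    "PARKED",    # glasses activate once the van is parked
    "SORTING",   # green highlights mark the packages for this stop
    "SCANNING",  # each pick is scanned and checked off a virtual list
    "WALKING",   # turn-by-turn path projected toward the front door
    "PHOTO",     # driver taps the vest controller to capture proof
    "FINISHED",  # "Tap to finish" closes out the stop
]

def next_state(state: str) -> str:
    """Advance to the next step of the stop; stay put once finished."""
    i = WORKFLOW.index(state)
    return WORKFLOW[min(i + 1, len(WORKFLOW) - 1)]

def run_stop() -> list[str]:
    """Walk one delivery stop through every state in order."""
    state, trace = WORKFLOW[0], [WORKFLOW[0]]
    while state != "FINISHED":
        state = next_state(state)
        trace.append(state)
    return trace
```

A linear sequence like this matches the demo, where each step (sort, scan, walk, photograph, finish) must complete before the next begins.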
[7]
Amazon will give its delivery drivers AI-powered smart glasses, promising to make the job safer, faster amid automation push | Fortune
For Amazon delivery drivers, new glasses promise something more than just clearer vision or blocked sun glare. Amazon is developing AI-powered smart glasses for its delivery drivers, the company said in a Wednesday blog post. The glasses will allow drivers to scan packages, follow detailed walking directions, and document proof of delivery without their phones. Using cameras, as well as AI-powered sensing abilities, the technology will create an augmented reality display for drivers that includes information like hazards, as well as maps that direct drivers to particular building unit numbers. The glasses will automatically activate once a driver parks at a delivery location and can support prescription and transition lenses. Eliminating the need to use a phone, along with putting instructions conveniently in view, is intended to increase the safety and efficiency of the delivery process, the company said. Future iterations of the glasses aim to give drivers "real-time defect detection" if they drop off a package at a wrong address. The device will also be able to adjust to low-light conditions and detect pets in customers' yards. Expedited delivery has remained a hallmark of Amazon's business as it competes with the growing e-commerce capabilities of Walmart and other retail giants. Amazon announced in June a $4 billion investment in tripling its delivery network size, particularly in rural areas, by 2026. One Amazon delivery driver made on average 65,700 deliveries in 2024, translating to 100,375 packages annually, according to data compiled by CapitalOne Shopping. That's about 27 deliveries per hour. Reuters reported the product's development last November. Anonymous sources told the outlet that while the glasses could increase driver productivity by freeing up hand space for workers to carry more packages, the company may have trouble developing a battery able to last an entire shift, which can be up to 10 hours. 
Drivers may also not want to wear the devices, which may be uncomfortable or distracting, the sources said. Amazon did not respond to Fortune's request for comment on concerns about the battery duration or comfort of the glasses. In addition to AI-powered glasses for drivers, Amazon is also developing operational technologies for warehouse workers, the company announced Wednesday. Blue Jay, a robotics system using multiple arms to lift and sort packages, aims to mitigate the need for employees to lift heavy items. Project Eluna is an agentic AI model that will monitor numerous dashboards and make decisions, such as about reducing sorting bottlenecks, with the goal of lessening the "cognitive load" of workers. The AI agent will be piloted at a Tennessee fulfillment center during the holiday season. The company's automation push has brought with it concern about the future of human employment. Some AI experts have said automation processes will surely displace human workers, with University of Louisville professor of computer science Roman Yampolskiy saying AI could spike unemployment levels up to 99% in the next five years -- a more eye-popping figure than even Anthropic CEO Dario Amodei's projection of the technology replacing 50% of entry-level white-collar jobs in the same period. "Before we always said, 'This job is going to be automated, retrain to do this other job,'" Yampolskiy said in an episode of The Diary of a CEO podcast last month. "But if I'm telling you that all jobs will be automated, then there is no plan B. You cannot retrain." A New York Times investigation published on Tuesday reported, citing internal documents, that Amazon plans to automate 75% of its operations. That translates to roughly 600,000 jobs for which the company would not need to hire in the future. 
Amazon spokesperson Kelly Nantel said the investigation did not accurately reflect the company's hiring strategy, and that the company recently announced plans to fill 250,000 positions ahead of the end-of-year holiday push. "Leaked documents often paint an incomplete and misleading picture of our plans, and that's the case here," Nantel told Fortune in a statement. "In this instance, the materials appear to reflect the perspective of just one team and don't represent our overall hiring strategy across our various operations business lines -- now or moving forward." Amazon executives have made an effort to assuage anxieties about the future of employment. Amazon Robotics' chief technologist Tye Brady told Fortune in May the company's automation advancements are meant to enhance, not replace, the jobs of humans. The interview at Fortune's Brainstorm AI conference in London took place after Amazon announced the launch of Vulcan, a robot arm with a sense of touch. "I will be unabashedly proud that we aim to eliminate, I mean eliminate, every menial, mundane, and repetitive job out there," Brady said. "And if it's repetitive, we want to automate that, because we will never run out of things to do for our employees. We want them to focus on higher-level tasks." "People are amazing at using common sense, reasoning, and understanding complex problems," he continued. "Why would you not use that?"
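The per-driver figures quoted above can be sanity-checked with quick arithmetic. The 240 shifts per year assumed below is a placeholder, not a figure from the article, which gives only the annual total and the up-to-10-hour shift length:

```python
# Back-of-envelope check of the "about 27 deliveries per hour" figure.
# shifts_per_year is an assumption for illustration; the article gives
# only the annual total (65,700) and the shift length (up to 10 hours).

deliveries_per_year = 65_700
shifts_per_year = 240   # assumed working days per year
hours_per_shift = 10    # upper end of the shift length cited

deliveries_per_hour = deliveries_per_year / (shifts_per_year * hours_per_shift)
print(round(deliveries_per_hour, 1))  # → 27.4, in line with the ~27 cited
```

Under those assumptions the cited rate holds up; a shorter average shift or fewer working days would push the hourly figure higher.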
[8]
Amazon Reveals AI Glasses for Delivery Workers to Cut Phone Use | AIM
As drivers arrive at delivery locations, the glasses will activate to provide crucial details, streamlining identification. Amazon has announced a prototype of smart glasses equipped with AI technology, specifically designed for its delivery personnel. The company aims to streamline the delivery process by reducing the need for drivers to check their phones, read package labels, and look around for the correct address. The "Amelia" smart glasses use AI-driven sensing and computer vision technology, along with cameras, to create a heads-up display that shows everything from navigation information to safety hazards to delivery assignments. Once drivers securely park at a delivery spot, the glasses automatically activate, providing the delivery associate with delivery details directly in their line of sight. According to the company, this process begins with identifying the correct packages in their vehicles for the respective residences. "The smart glasses are just one step in our broader effort to innovate in the last-mile delivery process, creating solutions that improve safety," the company said in a statement. Amazon has joined other tech giants in Silicon Valley, like Meta, by investing in and creating smart glasses, but the Seattle-based company has not yet disclosed plans for a consumer version of the "Amelia" smart glasses. The company also introduced a robotic arm named "Blue Jay," intended to assist warehouse workers by retrieving items from shelves and organising them. According to Amazon, this robot, already in operation at a warehouse in South Carolina, aims to minimise injuries and optimise space use in its facilities. Moreover, the tech giant launched a new AI tool called "Eluna," which will offer operational insights for Amazon warehouses. Both Blue Jay and Project Eluna collaborate with operations employees to foster safer and more efficient working environments. 
"These advanced innovations build on several recent AI and automation breakthroughs for our operations, like Vulcan and DeepFleet," Amazon said.
[9]
Amazon melds AI with robotics and smart glasses to streamline deliveries - SiliconANGLE
Amazon melds AI with robotics and smart glasses to streamline deliveries Amazon.com Inc. aims to transform its retail and logistics business with the introduction of more sophisticated robots and artificial intelligence operations tools for warehouses, as well as "smart delivery glasses" for the contractors who deliver its packages to customers' doors. At its annual Delivering the Future event today in the Bay Area, Amazon lifted the lid off Blue Jay, a new robotics system with multiple arms that effectively combines three systems into one, and Project Eluna, which helps warehouse managers make more intelligent decisions on the spot. Meanwhile, the smart delivery glasses are still under development, and will eventually help Amazon delivery associates to navigate more effectively and capture proof of delivery without having to grab their smartphones constantly. The company said that Blue Jay is the culmination of years of advances it has made in robotics, building on earlier systems such as Vulcan and DeepFleet. It's the latest example of what Amazon calls "physical AI," which refers to AI systems that can interact with their physical environment to support humans in their work. Blue Jay is an advanced robotic sorting system that's designed to be installed at the company's logistics facilities. With its multiple arms, it's more like an automated production line that can pick, sort and stow hundreds of different items to organize packages more efficiently. It consolidates what used to be three separate robotic systems into one much more capable station that can process thousands of items per day at faster speeds than before. Amazon said it used AI systems and "digital twins" to accelerate the design of Blue Jay, which meant it was able to go from concept to production in just over a year. That's incredibly quick, considering that earlier systems such as Cardinal, Robin and Sparrow took about three years to develop. 
The company said AI enabled it to condense years of trial-and-error into months of development: its engineers could iterate on dozens of prototypes for Blue Jay with the use of digital twins. Blue Jay is already being tested at one of Amazon's biggest logistics facilities in South Carolina, and it has the capability to organize about 75% of the various items stored at that site. Over time, the system will be introduced in many more facilities, including thousands of Sub-Same Day sites, to help the company deliver parcels to customers faster. Amazon is a pioneer of physical AI. Earlier this year it announced a robot called Vulcan that was the first in the world to possess a sense of touch. It was a key development that aids both operations and safety. For instance, if Vulcan picks up a box and notices that it's starting to crumple at the edges, it will understand that it's gripping it too tightly and reduce the amount of force it's using. It also uses touch to know when it accidentally bumps into a human worker, and will immediately pause when this happens. The Blue Jay system will be aided by Project Eluna, which is an agentic AI platform that's designed to act like an assistant for operations managers. The company explained that its human workers face a never-ending workload, having to monitor dozens of dashboards that overlook their logistics operations and respond to any issues, such as technology breakdowns and bottlenecks that require a reallocation of resources. Project Eluna will help to reduce that cognitive load, helping to oversee operations and recommend actions to operators in real time. It'll be able to anticipate bottlenecks before they become a problem and work out how to deal with them, so that the logistics operation runs as smoothly as possible. Operators will be able to converse with Eluna too, asking questions such as "Where should we shift people to avoid a bottleneck?" and receive immediate recommendations. 
"Our latest innovations are great examples of how we're using AI and robotics to create an even better experience for our employees and customers," said Amazon Robotics Chief Technologist Tye Brady. "The goal is to make technology the most practical, the most powerful tool it can be, so that work becomes safer, smarter, and more rewarding." The smart delivery glasses don't appear to have an official name just yet, but they're already being used in the real world by hundreds of delivery drivers, whose feedback is assisting in their development. Earlier this year, Amazon revealed that it's working on an advanced geospatial mapping technology that provides more granular detail about things such as building shapes and obstacles on pavements, and we now know why. The technology is being integrated into the smart glasses, which feature an embedded display that allows for hands-free operation, so drivers no longer have to keep fiddling with their smartphones. The company said they'll provide navigation instructions while the wearer is driving, and then once they've parked up, it'll continue to tell them where to go if they need to walk to the customer's house, with turn-by-turn walking directions. That will be especially useful in places such as large apartment complexes, the company said. In addition, the glasses will be able to scan packages and take photos of each item once they reach the customer's door as proof of delivery. Amazon said its drivers were a massive help, testing early versions of the smart glasses and providing feedback on their functionality and comfort, so it could tinker with the design and get it just right. Based on the drivers' input, it decided to use a swappable battery to ensure the glasses can be used by those on longer shifts. It also added support for prescription lenses, so that those who need to wear regular spectacles can also use them. A delivery contractor at Maddox Logistics Corp. in Omaha identified as Kaleb M. 
said he felt safer while using the smart spectacles, because they provide all of the information he needs in his field of view. "Instead of having to look down at a phone, you can keep your eyes forward and look past the display," he said. "You're always focused on what's ahead."
[10]
Amazon Just Unveiled AI Smart Glasses for Its Drivers -- Here's What They Actually Do
Amazon said that hundreds of drivers have already tested the devices. Amazon recently unveiled new wearable AI smart glasses to help delivery drivers carry out tasks hands-free, without using their phones. The e-commerce giant announced on Wednesday that it was working on developing the glasses for faster delivery times. The glasses use AI-powered sensors, computer vision and cameras to create a display that shows everything from navigation instructions to delivery tasks. The frames support prescription lenses, as well as transitional lenses that automatically change according to light. The glasses enable drivers to scan packages, follow step-by-step walking instructions to a specific location and verify delivery. The device pairs with a controller, worn within the driver's delivery vest, that includes a swappable battery and a dedicated emergency button to get help along the route if needed. Here's how it works: When a driver parks, the glasses automatically activate and identify the correct package for delivery through camera recognition. After the driver selects the package, the built-in display shows real-time navigation information to the delivery address, allowing drivers to keep their eyes forward instead of switching between a phone and the package. The glasses will navigate around hazards and within complex environments like apartment buildings. If the battery runs low, the driver can swap it out with a battery in their delivery vest. Amazon announced the glasses at its annual "Delivering the Future" logistics event on Wednesday. Beryl Tomay, Amazon's transportation vice president, said at the event that hundreds of delivery drivers had already tested the glasses, with some recording time savings of 30 minutes over a single eight- to ten-hour shift with the help of the devices. 
"It reduces the need to manage a phone and a package," Tomay said at the event, per Reuters. "It helps them stay at attention, which enhances their safety." Tomay said that the glasses would be free and optional for drivers to use. This innovation is still experimental and Amazon's plans for them are undecided, she said. Amazon is piloting the glasses with some delivery drivers in North America and plans to keep working on the technology before a broader rollout, per TechCrunch. The smart glasses could cut the time and cost it takes to deliver packages. Amazon's shipping costs have steadily increased over the years, climbing from $83.5 billion in 2022 to $89.5 billion in 2023 and $95.8 billion in 2024, per Statista estimates. By 2024, the company's combined shipping and fulfillment expenses hit $194.3 billion, accounting for 34% of total operating costs, according to Capital One Shopping Research. The glasses target these expenditures by saving seconds or minutes per delivery, which could translate to major financial savings given Amazon's delivery volume. News of Amazon smart glasses for drivers first leaked in November 2024. A Reuters report, citing five sources, stated that the glasses would feature a screen with navigation instructions to help drivers deliver packages to unfamiliar locations. Another report released this September from The Information also highlighted that Amazon was developing a pair of smart glasses for delivery drivers, as well as another pair of glasses with an embedded lens for the general public. The public-facing glasses are expected to be released in late 2026 or early 2027, per the report. While Amazon is getting into the wearable AI game, Meta is the undisputed leader in the smart glasses market, with a market share of over 70% in the first half of 2025. 
The company's dominance is driven by strong sales of its Ray-Ban Meta smart glasses, which have sold over two million pairs since their October 2023 debut.
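The claim that per-delivery seconds add up to major savings is easy to sketch in back-of-envelope terms. The snippet below is illustrative only: the stops-per-route and driver counts are assumptions for the sake of the arithmetic, not Amazon figures; only the roughly 30 minutes saved per eight- to ten-hour shift comes from the event.

```python
# Back-of-envelope estimate of what a 30-minute-per-shift saving means
# at fleet scale. STOPS_PER_SHIFT and DRIVERS are illustrative
# assumptions, not Amazon-reported numbers.

MINUTES_SAVED_PER_SHIFT = 30      # reported by some pilot drivers
STOPS_PER_SHIFT = 180             # assumed stops on a typical route
DRIVERS = 100_000                 # assumed number of equipped drivers

# 30 minutes spread across 180 stops is about 10 seconds per stop.
seconds_saved_per_stop = MINUTES_SAVED_PER_SHIFT * 60 / STOPS_PER_SHIFT

# Across the assumed fleet, the daily saving compounds quickly.
fleet_hours_saved_per_day = DRIVERS * MINUTES_SAVED_PER_SHIFT / 60

print(f"~{seconds_saved_per_stop:.0f} seconds saved per stop")
print(f"~{fleet_hours_saved_per_day:,.0f} driver-hours saved per day fleet-wide")
```

Under these assumptions, a saving of roughly ten seconds per stop compounds into about 50,000 driver-hours per day across the fleet, which is why shaving seconds off the "last 100 yards" matters at Amazon's delivery volume.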
[11]
Amazon Launches AI Glasses for Delivery Associates With Heads-Up Display
Amazon smart glasses are claimed to offer a hands-free experience. Amazon has launched its first smart glasses for delivery associates, featuring artificial intelligence (AI)-powered sensing capabilities and computer vision, the company announced on Wednesday. The new smart glasses sport a heads-up display that provides turn-by-turn navigation and package information, and notifies wearers of threats and hazards. They also feature a multi-camera setup that lets delivery agents scan packages, and they draw on Amazon's geospatial technology. Amazon AI Smart Glasses for Delivery Associates to Provide Turn-By-Turn Navigation The Seattle-based tech giant announced the launch of its new Amazon smart glasses for delivery associates, claiming they will make the delivery experience "safer and more seamless". The heads-up display offers turn-by-turn navigation and lets the wearer scan packages, providing a hands-free experience that reduces the need to look between a smartphone, the packages, and the surrounding environment. The multi-camera setup feeds the heads-up display, which shows directions, package information, and hazards on the way to the destination. Amazon's AI smart glasses activate automatically when a delivery associate parks their vehicle, projecting the relevant delivery information into their field of view, while Amazon's geospatial technology enables turn-by-turn navigation. The smart glasses also help wearers deliver the correct package to its corresponding destination by reading the barcode on the box; the number of packages, the code, and the address appear on the right side of the heads-up display. 
Amazon's first smart glasses are paired with a controller, worn by the delivery person in a vest, which features operational controls for the smart glasses, along with a swappable battery and a dedicated emergency button. The glasses also support prescription lenses and transitional lenses that "automatically adjust to light". The company said future versions of the smart glasses will be capable of real-time defect detection, which will alert the delivery agent if they "mistakenly" drop a package at the wrong destination. Amazon also expects later iterations to detect hazards, such as low-light environments or a pet in the yard.
[12]
Amazon's New AI Glasses Guide Drivers Hands-Free - Amazon.com (NASDAQ:AMZN)
Amazon.com, Inc. (NASDAQ:AMZN) announced on Wednesday the introduction of smart delivery glasses that combine artificial intelligence and computer vision to enhance delivery safety and efficiency. The technology gives delivery associates real-time navigation and task updates directly in their line of sight. Amazon said it developed the wearable system with extensive feedback from delivery associates and Delivery Service Partner drivers who perform daily routes across neighborhoods. The initiative expands the company's ongoing investment in driver safety and last-mile logistics through digital innovation. Hands-free Navigation And Safety The new delivery glasses eliminate the need for drivers to check their phones while navigating or scanning packages. Instead, key information such as walking routes, package identification, and proof-of-delivery capture appears on a lightweight display. The heads-up interface helps drivers stay aware of their surroundings, reducing distractions during busy deliveries. The glasses integrate AI-powered sensors and computer vision that recognize packages and detect potential obstacles. Once a driver parks, the display activates automatically, showing step-by-step navigation to each address using Amazon's geospatial mapping system. The device can also guide users through complex settings, such as apartment complexes or gated communities. Future Vision Of Delivery Amazon plans to enhance the glasses with features such as real-time package verification, hazard alerts, and adaptive lighting. Future updates may even detect animals or obstacles near delivery points, further minimizing risk. The company says this innovation reflects its broader mission to make last-mile operations safer and more intuitive for every driver in its network. 
AMZN Price Action: Amazon.com shares were up 0.42% at $218.87 at the time of publication on Thursday, according to Benzinga Pro data.
[13]
Amazon Equips Drivers With Smart Glasses as It Deepens AI Fulfillment Investments | PYMNTS.com
The glasses combine computer-vision overlays and AI sensing to guide drivers through package pickup, walking directions and proof of delivery all without the need to look at a phone. The pilot in North America includes a vest-mounted controller with a swappable battery and emergency button. The company said the system activates automatically when drivers park, reducing friction and improving safety across millions of daily stops. Behind the scenes, Amazon is rolling out additional innovations aimed at record-fast delivery in 2025. The company is regionalizing inventory, deploying e-cargo bikes and electric vans, and investing about $4 billion to expand its same-day and next-day delivery network into smaller towns and rural areas. It is also deploying tools such as the "Blue Jay" robotics system that coordinates multiple robotic arms to collapse three production lines into one, the "Project Eluna" agentic-AI model to anticipate bottlenecks and provide real-time operational insights, immersive virtual-reality training for drivers via the EVOLVE simulator, and the "Packaging Decision Engine" plus Project P.I. to reduce packaging waste and detect defects through AI. AI-powered demand-prediction tools place inventory closer to customers while new same-day networks reach perishable grocery and prescription deliveries faster. The company is also reconfiguring its fulfillment centers to move inventory closer to where demand is forecasted, using machine learning to predict order surges down to the ZIP-code level. New tools will allow drivers to consolidate routes dynamically, while AI-based demand models continuously adjust inventory positioning across regions. 
These upgrades are expected to reduce transit times, optimize energy usage, and improve reliability, further reinforcing Amazon's push to convert logistics efficiency into customer retention. From a strategic perspective, shortening the interval between checkout and doorstep strengthens Amazon's value proposition for Prime members and tightens the link between order placement and revenue recognition. Alongside the delivery upgrades, Amazon expanded its Seller Assistant into an "agentic AI" assistant that can reason, plan and act on behalf of merchants, handling inventory optimization, compliance monitoring and ad creation. The tool reflects Amazon's move from a traditional marketplace to an operational co-pilot for millions of sellers. These developments come amid a broader race in wearable AI. Meta recently introduced its "first serious" smart glasses for consumers at $799, underscoring how hardware, AI and commerce are converging. Both launches highlight how real-world productivity and digital intelligence are increasingly being designed into connected devices and platforms. Amazon's latest push, combining driver-facing hardware, logistics infrastructure and seller-facing AI, shows a coordinated strategy to control both the physical and digital ends of fulfillment. For enterprise and payments professionals, the lesson is that value creation is shifting toward intelligent edges: tools that augment human performance, automate commerce decisions, and accelerate the movement of goods. As Amazon expands these systems, the competitive bar for speed, operational visibility and ecosystem support is set to rise.
[14]
Amazon sees faster delivery speeds with hi-tech driver eyeglasses, AI
MILPITAS, California (Reuters) - In its relentless drive to bring everyday items to customers faster, Amazon has shifted expectations from two-day delivery to same-day and even within an hour. Now, with robots, artificial intelligence and even eyewear, it is working to pare seconds off each delivery. On Wednesday, Seattle-based Amazon showed off advanced eyeglasses for delivery workers for the first time publicly, after Reuters exclusively reported last year that the company was developing them. Known internally as Amelia, the glasses have a small screen that gives turn-by-turn directions, scans package codes and takes photos for proof of delivery. The glasses could replace the bulky handheld Global Positioning System devices drivers use today and give helpful navigation tips, like which way to turn when leaving an apartment elevator or how to avoid an aggressive dog at a customer's home. The glasses rely on a paired controller placed in a driver's vest, and Amazon solved the challenge of battery life by using swappable battery packs, the company said. The announcements made at Amazon's annual "Delivering the Future" logistics event are part of the firm's particular focus on the "last 100 yards," which contribute to the expensive final steps in a delivery's journey to customers. Last year, it unveiled a delivery van scanner to direct drivers to packages for each stop by shining a green spotlight on them, shaving seconds off time usually spent reading labels. And in June, it showed reporters new digital maps that gave far more detail about neighborhoods, building shapes and obstacles than, say, Google Maps. Also on Wednesday, the online retailer showed off a robotic arm that it says can work in concert with warehouse employees picking items off shelves and sorting them for faster and more accurate order fulfillment. 
Amazon claimed the robot, dubbed Blue Jay, could reduce injury rates and work in a smaller space than equivalent robots that previously required three separate stations. The robot is in use already at a warehouse in South Carolina, Amazon said, and it plans to roll it out to more facilities in the coming months, particularly what are known as sub-same-day sites focused on delivery in a few hours or less. Additionally, Amazon said it plans to deploy an artificial intelligence system at warehouses, starting at one in Tennessee, that can help manage operations at a high level, to prevent gridlock and other challenges that can slow operations. It was not immediately clear how Amazon planned to institute the software or who would have access to it. The company's expansion of warehouse robots is expected to reduce its U.S. hiring by 160,000 workers over the next two years, the New York Times reported on Tuesday. Amazon said it planned to hire 250,000 temporary workers for the holiday season. (Reporting by Greg Bensinger; Editing by Lisa Shumaker)
[15]
Amazon's AR glasses for delivery drivers: It has some cool features
Wearable tech transforms last-mile logistics for Amazon drivers Amazon has unveiled AI-powered augmented reality (AR) glasses designed to transform the way its delivery drivers operate. The new wearable technology aims to make deliveries faster, safer, and more efficient by offering hands-free assistance, advanced navigation, and real-time information directly in the driver's field of view. By integrating AR and AI, Amazon is experimenting with a future where technology augments the delivery experience rather than replacing human effort. At the core of Amazon's AR glasses is hands-free navigation and package handling. The glasses overlay delivery instructions, addresses, and package details directly onto the driver's view, eliminating the need to constantly check a handheld device. Drivers can follow turn-by-turn walking directions, scan packages, and confirm deliveries without ever taking their eyes off the route or the surroundings. This not only enhances efficiency but also reduces errors, such as scanning the wrong package or misreading delivery instructions. The system integrates smoothly with Amazon's existing delivery apps, presenting information contextually as drivers approach each stop. For example, a driver walking toward a large apartment complex may receive visual cues on the correct entrance, the floor number, or a specific door. This hands-free interaction allows drivers to focus on the job without distractions from handheld screens, which have traditionally slowed down the delivery process. Safety has been a key focus for Amazon in developing these glasses. Early testing shows that future versions may include real-time hazard alerts. Drivers could receive notifications about pets roaming in a yard, children nearby, or even if a package is at risk of being delivered to the wrong address. 
These proactive warnings aim to prevent accidents, ensuring drivers can navigate safely while maintaining the pace of deliveries. The glasses are designed to accommodate a wide range of users. They support prescription and transitional lenses, making them accessible for drivers who normally require corrective eyewear. Additionally, the glasses are paired with a small controller that fits on the driver's vest. This controller houses a swappable battery capable of lasting a full workday, offers operational controls, and includes a dedicated emergency button for quick assistance. Amazon has taken measures to ensure that the glasses are not used while driving. They can only be activated after parking, which addresses safety and regulatory concerns. Currently, hundreds of drivers in North America are participating in preliminary testing. This trial phase allows Amazon to collect feedback on usability, comfort, and efficiency, helping refine the technology before considering a larger rollout. Drivers who have tested the glasses report that they reduce the repetitive back-and-forth between handheld devices and packages, streamlining the daily workflow. By prioritizing controlled testing, Amazon ensures that both driver safety and delivery efficiency are addressed simultaneously. The company has emphasized that the technology is intended to enhance human work, not replace it, reinforcing a collaborative approach between humans and AI in logistics. The introduction of AR glasses is part of Amazon's broader strategy to integrate cutting-edge technology into its logistics operations. Wearable tech like this represents a shift in how delivery systems operate, combining AI, AR, and human skill to create smarter, more intuitive workflows. By improving efficiency and reducing errors, the glasses could redefine last-mile delivery standards, setting a precedent for other companies in the sector. 
Amazon's investment in AR technology also highlights a growing trend in logistics: wearable devices are no longer just experimental tools but practical solutions for real-world challenges. The glasses demonstrate how AI can be applied to make routine work more precise, safer, and less physically taxing. As testing continues and feedback is incorporated, the potential for these AR glasses to become a standard part of delivery operations seems increasingly likely. While the technology is still in its early stages, its potential impact is significant. From hands-free package scanning to real-time hazard alerts and seamless navigation, the glasses aim to make the driver's job smoother and more efficient. For now, Amazon is taking a cautious approach, but the AR glasses may soon become an essential tool for delivery drivers, showcasing the power of AI and AR to transform the everyday logistics experience.
Amazon has introduced AI-powered smart glasses for its delivery drivers, aiming to enhance efficiency and safety in the last-mile delivery process. The glasses offer hands-free package scanning, navigation, and proof of delivery capture.
Amazon has unveiled its latest innovation in the world of e-commerce logistics: AI-powered smart glasses designed specifically for its delivery drivers. The tech giant announced the development of these glasses, codenamed 'Amelia,' which aim to streamline the delivery process and enhance safety for drivers.
The smart glasses are equipped with AI-powered sensing capabilities, computer vision, and cameras to create a heads-up display that provides crucial information directly in the driver's line of sight. Key features include hands-free package scanning, turn-by-turn walking directions, and proof-of-delivery capture.
When a driver arrives at a delivery location, the glasses automatically activate, guiding them to locate the correct package inside their vehicle and navigate to the delivery address.
The glasses are designed with driver comfort in mind, supporting prescription lenses and featuring transitional lenses that adjust to light conditions. A small controller worn in the delivery vest houses operational controls, a swappable battery for all-day use, and a dedicated emergency button. Amazon reports that hundreds of drivers have already tested early versions of the technology, providing feedback that influenced the design and functionality. The company plans to refine the technology further before a wider rollout, with initial deployment expected in North America.
Amazon anticipates future versions of the glasses will include additional AI-powered features, such as real-time defect detection that alerts drivers if a package is dropped off at the wrong address, detection of pets in yards, and automatic adjustment to hazards like low-light conditions.
The smart glasses are expected to save up to 30 minutes per 8- to 10-hour shift by minimizing repetitive tasks and helping drivers quickly locate packages. Safety features include automatic deactivation when the glasses detect they are in a moving vehicle, and a hardware switch allowing drivers to turn off all sensors, including the camera and microphone. Amazon's entry into the smart glasses market follows other tech giants like Meta, which recently unveiled consumer-focused AI-powered smart glasses. While Amazon's current focus is on delivery optimization, the company has not ruled out the possibility of future consumer applications. Some skeptics have raised questions about the privacy and surveillance implications of wearable cameras and continuous tracking built into drivers' line of sight. Amazon maintains that the primary goal is to make deliveries safer and more seamless, but the balance between efficiency and worker privacy remains a topic of discussion.