4 Sources
[1]
Ford is getting ready to put AI assistants in its cars
The annual Consumer Electronics Show is currently raging in Las Vegas, and as has become traditional over the past decade, automakers and their suppliers now use the conference to announce their technology plans. Tonight it was Ford's turn, and it is very on-trend for 2026. If you guessed that means AI is coming to the Ford in-car experience, congratulations, you guessed right. Even though the company owes everything to mass-producing identical vehicles, it says that it wants AI to personalize your car to you. "Our vision for the customer is simple, but not elementary: a seamless layer of intelligence that travels with you between your phone and your vehicle," said Doug Field, Ford's chief EV, design, and digital officer. "Not generic intelligence -- many people can do that better than we can. What customers need is intelligence that understands where you are, what you're doing, and what your vehicle is capable of, and then makes the next decision simpler," Field wrote in a blog post Ford shared ahead of time with Ars. As an example, Field suggests you could take a photo of something you want to load onto your truck, upload it to the AI, and find out whether it will fit in the bed. At first, Ford's AI assistant will just show up in the Ford and Lincoln smartphone apps. Expect that rollout to happen starting early this year. From 2027, the AI assistant will become a native experience as new or refreshed models are able to include it, possibly starting with the cheap electric truck that the automaker tells us is due next year, but also gas models like the Expedition and Navigator. Also expect those new or refreshed models to become software-defined vehicles, where dozens of discrete electronic control units have been replaced by a handful of powerful multitasking computers. 
This is one of the latest trends in automotive design, and at CES this year, Ford is showing off what it calls its "High Performance Compute Center" -- perhaps high-performance computer sounded too pedestrian for something with four wheels. The new computer was designed in-house and is in charge of infotainment, the advanced driver assistance systems, audio, and networking. Ford says the new computer is much cheaper than previous solutions, while taking up half the volume, even as it offers much better performance. "Our upcoming Universal Electric Vehicle (UEV) architecture incorporates a fivefold increase for the in-house module design, giving us 5X more control over critical semiconductors," said Paul Costa, executive director of electronics platforms at Ford. Moving to a software-defined vehicle architecture, with much more powerful processing for things like perception, means Ford can get a little more ambitious with its partially automated driver assists. According to Field, next year will see the debut of a new generation of its BlueCruise assist that has "significantly more capability at a 30 percent lower cost." And in 2028, Ford plans to start offering a so-called "level 3" assist, where the driver can give up situational awareness completely under certain circumstances, like heavy highway traffic.
[2]
Ford's AI voice assistant is coming later this year, L3 driving in 2028
Ford's new AI-powered voice assistant will be rolling out to customers later this year, the company's top software executive said at CES today. And in 2028, the automaker will introduce a hands-free, eyes-off Level 3 autonomous driving feature as part of its more affordable (and hopefully more profitable) Universal Electric Vehicle (UEV) platform, set to launch in 2027. Most importantly, Ford said it would be developing much of the core technology around these products in-house in order to reduce costs and retain greater control over it. Mind you, the company isn't creating its own large language models or designing its own silicon, like Tesla and Rivian. Instead, it will be building its own electronic and computer modules that are smaller and more efficient than the systems currently in place. "By designing our own software and hardware in-house, we've found a way to make this technology more affordable," Ford's chief officer for EVs and software Doug Field wrote in a blog post. "This means we can put advanced hands-free driving into the vehicles people actually buy, not just vehicles with unattainable price points." The news comes as Ford faces increasing pressure to roll out more affordable EVs after its big bet on electric versions of the Mustang and F-150 pickup truck failed to excite customers or make a profit. The company recently cancelled the F-150 Lightning amid cooling EV sales, and said it would make more hybrid vehicles as well as battery storage systems to meet growing demand from AI data center construction. Ford also has been recalibrating its AI strategy after shutting down its autonomous vehicle program with Argo AI in 2022, pivoting from fully autonomous Level 4 vehicles to Level 2 and Level 3 conditional driver assist features.
Amid all this, the company is trying to stake out a middle ground on AI: not going all-in on a robot army like Tesla and Hyundai, while still committing to some AI-powered products, like voice assistants and automated driving features. Ford said its AI assistant will launch on the Ford and Lincoln mobile apps in 2026, before expanding to the in-car experience in 2027. An example would be a Ford owner standing in a hardware store, unsure how many bags of mulch will fit in the bed of their truck. The owner could snap a photo of the mulch and ask the assistant, which could respond with a more accurate answer than, say, ChatGPT or Google's Gemini, because it has all the information about the owner's vehicle, including truck bed size and trim level. At a recent tech conference, Ford's CFO Sherry House said Ford would be integrating Google's Gemini into its vehicles. That said, the automaker is designing its assistant to be chatbot-agnostic, meaning it will work with a variety of different LLMs. "The key part is that we take this LLM, and then we give it access to all the relevant Ford systems so that LLM then knows about what specific vehicle you're using," Sammy Omari, Ford's head of ADAS and infotainment, told me. Autonomous driving features will come later with the launch of Ford's Universal EV Platform. Ford's flagship product right now is BlueCruise, its hands-free Level 2 driver assist feature that's available only on most highways. Ford plans on rolling out a point-to-point hands-free system that can recognize traffic lights and navigate intersections. And then eventually it will launch a Level 3 system where the driver still needs to be able to take over the vehicle upon request but can also take their eyes off the road in certain situations. (Some experts have argued that L3 systems can be dangerous given the need for drivers to stay attentive despite the vehicle performing most of the driving tasks.)
Omari explained that by rigorously scrutinizing every sensor, software component, and compute unit, the team has achieved a system that is approximately 30 percent lower cost than today's hands-free system, while delivering significantly more capability. All of this will depend on a "radical rethink" of Ford's computing architecture, Field said in the blog post. That means a more unified "brain" that can process infotainment, ADAS, voice commands, and more. For almost a decade, Ford has been building a team with the relevant expertise to spearhead these projects. The former Argo AI team, originally focused on Level 4 robotaxi development, was brought on board the mothership for their expertise in machine learning, robotics, and software. And a team of BlackBerry engineers, who were initially hired in 2017, is now working on building next-generation electronic modules to enable some of these innovations, Paul Costa, executive director of Ford's electronics platforms, told me. But Ford doesn't want to get into "a TOPS arms race," Costa added, referring to the metric for measuring an AI processor's speed in trillions of operations per second. Other companies, like Tesla and Rivian, have stressed the processing speed of their AI chips to prove how powerful their automated driving systems will be. Ford's not interested in playing that game. Rather than optimizing for performance alone, the team pursued a balance of performance, cost, and size. The result is a compute module that is significantly more powerful, lower in cost, and 44 percent smaller than the system it replaces. "We're not just choosing one area here to optimize around at the expense of everything else," Costa said. "We've actually been able to optimize across the board, and that's why we're so excited about it."
[3]
Ford is building an AI assistant that knows your vehicle inside out
It will let you do more than just control the AC or open the sunroof. Ford is working on a new AI assistant for its vehicles that goes beyond basic tasks like adjusting the air conditioning or opening the sunroof. At CES, the company shared a few details, revealing that the assistant will offer a more integrated AI experience with a deeper understanding of both the vehicle and the driver's needs. Ford's demo showcases a chat interface similar to ChatGPT or Gemini, which will allow drivers to interact with the assistant using text or voice. Sample prompts shown in the interface suggest that the assistant will be capable of answering vehicle-specific questions, such as "What is Pro Power Onboard?" The interface will also include an attachment menu that will let users capture images or attach photos from their gallery and ask related questions. For example, drivers could take a photo of bags of mulch and ask how many would fit in the truck bed, with the assistant analyzing the image and responding based on the capacity and bed volume of their specific model. Ford notes that its AI assistant "isn't just another LLM or a piece of software" that users talk to occasionally. Instead, the company is positioning it "as an intelligent thread woven seamlessly through every aspect" of a Ford owner's experience, with a deep understanding of their specific vehicle and needs. The Ford AI assistant will initially roll out through the Ford and Lincoln apps in the first half of 2026, with in-vehicle integration planned for new Ford and Lincoln models in 2027. The company has not yet revealed which models will be first to receive the native integration. Ford also hasn't detailed all of the assistant's capabilities or fully outlined the in-car experience.
More information is expected closer to the in-app rollout.
[4]
CES 2026: Ford Is Launching Its Own AI Assistant
Listen up, Ford drivers: You're getting a new AI assistant this year. During a decidedly low-key CES keynote, the company announced Ford AI Assistant, a new AI-powered bot coming to Ford customers in the early half of 2026. While the company has plans to integrate the assistant into Ford vehicles directly, that isn't how you'll first experience this new AI. Instead, Ford is rolling out Ford AI Assistant to an upgraded version of its Ford app first, and plans on shipping cars with the assistant built-in sometime in 2027. In effect, Ford has added a proprietary version of ChatGPT or Gemini to its app. Ford's idea here is to offer users a smart assistant experience directly tied to their Ford vehicle. In one example, the company suggests a customer could visit a hardware store looking to buy mulch. Said customer could take a photo of a pile of bags of mulch, and ask the assistant "how many bags can I fit in the bed of my truck?" Ford AI Assistant could then run the numbers, and offer an educated estimate of how much mulch the customer can buy and take with them at one time. Of course, other AI assistants can do similar calculations. Send ChatGPT the same photo, and ask the same question -- specifying the model of your truck -- and the bot will run the numbers itself. The difference, in Ford's view, is that Ford AI Assistant is connected to your vehicle specifically. It can read all the sensors in your car, so it knows, for example, how many people are currently traveling with you, your current tire pressure, or, really, anything and everything about your car. According to Doug Field, Ford's Chief Officer of EVs, Digital, and Design, the company's goal with the assistant is to offer answers customers can't get from other sources. ChatGPT certainly doesn't have access to every sensor embedded in your car, so Ford does have the advantage there. Ford didn't go out and build its AI tech from scratch, however.
The company tells TechCrunch that Ford AI Assistant is hosted by Google Cloud, and is run using "off-the-shelf LLMs." Still, that likely won't have much of an impact on whether or not customers use this new assistant. Instead, that will come down to how useful they find the AI assistant in the app. As someone who rarely uses AI assistants, I'd imagine I'd find little use for it if I owned a Ford. That being said, there are some times when it could genuinely be useful to have external access to your car's information. I could probably eyeball how many bags of mulch would fit in my trunk, but I can't tell you my exact odometer reading without starting up my car. The same goes for my tire pressure: It'd be helpful to know my tire pressure before getting in my car, to know whether I should be headed somewhere I can fill up before going to my destination. Of course, there's also a privacy discussion to be had here. Modern cars are already privacy nightmares, but there's something a bit unnerving about an AI assistant that knows everything about my car.
Ford announced its AI assistant will debut in mobile apps in early 2026, with in-car integration starting in 2027. The assistant uses vehicle-specific data to answer questions generic AI can't, like calculating cargo capacity from photos. Ford also plans Level 3 autonomous driving by 2028, developed in-house at 30 percent lower cost than current systems.
Ford revealed its strategy to integrate AI into the driving experience at CES, with Doug Field, the company's chief EV, design, and digital officer, outlining a vision for personalized in-car experiences that extend beyond generic intelligence [1]. The Ford AI assistant will first arrive in the Ford and Lincoln mobile apps during the first half of 2026, before expanding to native in-vehicle integration starting in 2027 with new or refreshed models [2][3]. This phased approach allows Ford to test and refine the technology before embedding it directly into vehicle systems.
Source: Lifehacker
The AI voice assistant distinguishes itself by leveraging vehicle-specific data that generic chatbots cannot access. Field emphasized that customers need "intelligence that understands where you are, what you're doing, and what your vehicle is capable of" [1]. A practical example involves photographing bags of mulch at a hardware store and asking the assistant how many will fit in the truck bed; the system can provide accurate answers because it knows the exact bed dimensions and current cargo status of that specific vehicle [2].
Source: Digital Trends
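Ford hasn't published how the assistant performs this estimate; as a toy sketch of the kind of arithmetic involved, a naive axis-aligned packing calculation could look like the following. All dimensions here are made-up round numbers, not actual Ford specifications.

```python
def bags_that_fit(bed_l, bed_w, bed_h, bag_l, bag_w, bag_h):
    """Estimate how many rigid bag_l x bag_w x bag_h packages fit in a
    bed_l x bed_w x bed_h truck bed, stacking along each axis independently.
    All dimensions in inches."""
    per_layer = (bed_l // bag_l) * (bed_w // bag_w)  # bags per flat layer
    layers = bed_h // bag_h                          # stacked layers
    return int(per_layer * layers)

# Hypothetical figures: a 2-cu-ft mulch bag at roughly 28 x 16 x 6 in,
# and a short pickup bed at roughly 67 x 50 x 21 in.
print(bags_that_fit(67, 50, 21, 28, 16, 6))  # → 18
```

A real system would also need to account for the bags' deformability, payload limits, and whatever cargo is already in the bed, which is where the assistant's access to live vehicle state would matter.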
Ford is pursuing in-house development for critical components while relying on established large language models (LLMs) from partners. The assistant is hosted by Google Cloud and uses off-the-shelf LLMs, but Ford designs the integration layer that connects these models to vehicle systems [4]. Sammy Omari, Ford's head of ADAS and infotainment, explained that the key differentiator is giving the LLM "access to all the relevant Ford systems so that LLM then knows about what specific vehicle you're using" [2]. The chatbot-agnostic design means Ford can work with various AI models, including Gemini, which CFO Sherry House confirmed would be integrated into vehicles [2].
Source: The Verge
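Ford hasn't disclosed its actual integration API, but the "chatbot-agnostic" pattern Omari describes can be illustrated with a minimal sketch: vehicle state is rendered into context that any LLM backend can consume, and the backend itself is swappable. The function names, vehicle fields, and stub backend below are all hypothetical.

```python
def build_vehicle_context(vehicle: dict) -> str:
    """Render live vehicle state into a system prompt for any LLM backend."""
    return (
        f"You are an assistant for a {vehicle['model']} ({vehicle['trim']}). "
        f"Bed volume: {vehicle['bed_cu_ft']} cu ft. "
        f"Tire pressure: {vehicle['tire_psi']} psi."
    )

def ask(llm, vehicle: dict, question: str) -> str:
    """Chatbot-agnostic: `llm` is any callable taking (system, user) strings,
    so Gemini or another model could be slotted in without other changes."""
    return llm(build_vehicle_context(vehicle), question)

# Stub backend that just echoes its inputs, standing in for a real LLM call.
echo_llm = lambda system, user: f"[{system}] {user}"
truck = {"model": "F-150", "trim": "Lariat", "bed_cu_ft": 62.3, "tire_psi": 35}
print(ask(echo_llm, truck, "How many bags of mulch fit in my bed?"))
```

The design choice this illustrates is separating the vehicle-data layer (which Ford owns) from the model layer (which it buys off the shelf), so swapping LLM providers only changes the `llm` callable.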
This strategy extends to hardware, where Ford developed its "High Performance Compute Center" in-house. The new computer consolidates infotainment, driver-assist systems, audio, and networking functions while occupying half the volume of previous solutions at significantly lower cost [1]. Paul Costa, executive director of electronics platforms at Ford, noted that the Universal Electric Vehicle (UEV) architecture incorporates a fivefold increase in in-house module design, giving Ford 5X more control over critical semiconductors [1].
The transition to software-defined vehicles represents a fundamental shift in automotive architecture, replacing dozens of discrete electronic control units with a handful of powerful multitasking computers [1]. This unified "brain" processes multiple functions simultaneously, from infotainment to voice commands to advanced driver assistance [2]. The architecture change enables Ford to deliver more sophisticated features without the cost penalties that have plagued premium autonomous systems. Field emphasized that this approach puts "advanced hands-free driving into the vehicles people actually buy, not just vehicles with unattainable price points" [2]. The strategy reflects lessons learned from Ford's pivot away from fully autonomous robotaxis after shutting down Argo AI in 2022, with the company now focusing on Level 2 and Level 3 conditional driver assist features instead [2].
Ford plans to introduce Level 3 autonomous driving capabilities in 2028, where drivers can take their eyes off the road completely under certain circumstances like heavy highway traffic [1]. Before that milestone, a new generation of BlueCruise will debut in 2027 with "significantly more capability at a 30 percent lower cost" [1]. Omari explained that rigorous scrutiny of every sensor, software component, and compute unit achieved this cost reduction while delivering enhanced functionality [2].
The enhanced BlueCruise system will offer point-to-point hands-free capability that recognizes traffic lights and navigates intersections, expanding beyond the current highway-only operation [2]. However, some experts have raised concerns about Level 3 systems requiring drivers to stay attentive despite the vehicle performing most driving tasks, creating potential safety challenges during transition periods [2].
The AI assistant's deep integration with vehicle sensors raises privacy concerns, as it can access tire pressure, odometer readings, passenger count, and virtually every data point generated by the car [4]. While this comprehensive access enables genuinely useful features, like checking tire pressure before entering the vehicle or receiving maintenance alerts based on actual usage patterns, it also means the system maintains constant awareness of vehicle status and potentially driver behavior [4].
Ford positions the assistant as "an intelligent thread woven seamlessly through every aspect" of ownership rather than occasional software interaction [3]. The chat interface resembles ChatGPT or Gemini, supporting both text and voice input, with an attachment menu for capturing or uploading images [3]. Sample prompts shown at CES included vehicle-specific queries like "What is Pro Power Onboard?", demonstrating the system's ability to explain model-specific features [3]. Ford hasn't yet detailed all capabilities or fully outlined the in-car experience, with more information expected closer to the 2026 app rollout [3].
Summarized by Navi