3 Sources
[1]
Broadcom just announced an AI chipset that translates audio in real time directly on the device
Broadcom and a company called CAMB.AI are teaming up to bring on-device audio translation to a chipset. This would allow devices that use the SoC to handle translation, dubbing and audio description tasks without having to dip into the cloud. In other words, it could massively improve accessibility for consumers. The companies promise ultra-low latency and enhanced privacy, since all processing is kept local to the user's device. Wireless bandwidth use should also be drastically reduced. As for the audio description piece, there's a demo video of the tool being used on a clip from the film Ratatouille. The AI can be heard describing the scene in various languages, in addition to a written translation appearing on-screen. This looks incredibly useful, particularly for those with vision issues. There is a major caveat, though: the demo is a tightly controlled clip with plenty of edits, so we have no idea how the tech will perform in a real-world scenario, or how accurate the information will be. It does feature a voice model that's already being used by organizations like NASCAR, Comcast and Eurovision. The companies boast that this will enable "on-device translation in over 150 languages." We don't know when these chips will begin showing up in TVs and other gadgets. The tech is in the testing phase for now, so it's going to be a while. Broadcom also recently teamed up with OpenAI to help that company manufacture its own chips.
[2]
CAMB.AI, Broadcom Bring Voice AI Directly to Chip
CAMB.AI and Broadcom have announced a collaboration that embeds CAMB.AI's generative voice model, MARS, directly into Broadcom's neural processing unit (NPU) chipsets. The partnership enables text-to-speech and localisation features to run natively on consumer devices without depending on the cloud. The integration allows real-time translation, dubbing, captioning, and audio descriptions to function locally, eliminating latency and privacy concerns while reducing costs for users and content providers. Akshat Prakash, co-founder and CTO of CAMB.AI, said, "By partnering with Broadcom, we can deliver this capability to consumers globally in a way that is faster, more private, and more integrated into everyday devices than ever before." Running on Broadcom's SoC-integrated NPU, CAMB.AI's text-to-speech model converts written text into natural speech in multiple languages. This approach supports accessibility for visually impaired users, improves communication in e-learning and customer service, and cuts reliance on external servers. Rich Nelson, SVP and GM of Broadcom's broadband video group, said, "We are enabling next-generation user experiences that are both highly intelligent and privacy-first." The next phase of the collaboration will explore moving CAMB.AI's real-time translation model to Broadcom's on-device NPU, enabling translation across more than 150 languages. Broadcom's chips already power over 500 million devices globally, including set-top boxes and broadband gateways, meaning the new capability could bring multilingual and accessible content to homes worldwide. CAMB.AI, known for its multilingual AI localisation work with organisations such as IMAX, Comcast NBCUniversal, and Major League Soccer, has built MARS and BOLI models that are already available on AWS Bedrock and Google Vertex AI. The partnership with Broadcom marks a move towards embedding localisation and accessibility at the hardware level, bringing AI-powered communication closer to users than ever before.
[3]
New Broadcom chip could make your next TV an instant translator
Broadcom and CAMB.AI are developing an AI chipset capable of real-time, on-device audio translation. Broadcom has announced it is teaming up with a company called CAMB.AI to develop a new AI chipset capable of performing real-time audio translation and dubbing directly on a user's device. The new SoC is being designed to run CAMB.AI's audio translation models locally, without needing to connect to the cloud. The companies state this on-device processing will provide enhanced privacy, ultra-low latency, and drastically reduced wireless bandwidth consumption. In addition to real-time audio translation, the chipset will also be able to generate audio descriptions of what is happening on a screen, a feature aimed at improving accessibility for users with vision issues. The companies have announced that the technology will be able to support on-device translation in over 150 languages. A demo video of the audio description feature showed the AI describing a scene from a film in various languages, with a written translation also appearing on-screen. The voice model used for the technology is reportedly already in use by organizations such as NASCAR, Comcast, and Eurovision. The technology is currently in the testing phase, and there is no timeline for when the new chipsets will begin to appear in consumer gadgets like televisions. The announcement follows other recent AI-related news from Broadcom, including a partnership with OpenAI to help manufacture its own chips.
Broadcom and CAMB.AI are collaborating to embed AI-powered real-time translation capabilities directly into chipsets, enabling on-device processing for over 150 languages without cloud dependency. The technology promises enhanced privacy, ultra-low latency, and improved accessibility features including audio descriptions for visually impaired users.
Broadcom, a leading semiconductor company, has announced a partnership with CAMB.AI to develop AI chipsets capable of performing real-time audio translation and dubbing directly on consumer devices [1]. This collaboration represents a significant shift toward on-device AI processing, eliminating the need for cloud connectivity while promising enhanced privacy and ultra-low latency performance.
The partnership integrates CAMB.AI's generative voice model, MARS, directly into Broadcom's neural processing unit (NPU) chipsets [2]. This integration enables text-to-speech and localization features to run natively on consumer devices, supporting real-time translation, dubbing, captioning, and audio descriptions without external server dependency.

The new system-on-chip (SoC) technology promises to support on-device translation across more than 150 languages [3]. Running on Broadcom's SoC-integrated NPU, CAMB.AI's text-to-speech model converts written text into natural speech in multiple languages, offering significant improvements in processing speed and user privacy.

Akshat Prakash, co-founder and CTO of CAMB.AI, emphasized the global impact potential: "By partnering with Broadcom, we can deliver this capability to consumers globally in a way that is faster, more private, and more integrated into everyday devices than ever before" [2].
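Neither company has published a developer interface for the chipset, but the data flow described above (speech recognized, translated, and re-synthesized entirely on the SoC's NPU) can be sketched in a few lines. Everything below is a hypothetical illustration: the `OnDeviceModel` runtime, the firmware paths, and the three-stage split are assumptions, not a real Broadcom or CAMB.AI API. The point is that each stage is a local call, so audio never leaves the device.

```python
from dataclasses import dataclass


@dataclass
class AudioChunk:
    samples: bytes            # raw PCM from the device's audio decoder
    sample_rate: int = 48_000


class OnDeviceModel:
    """Hypothetical stand-in for a model running on the SoC's NPU."""

    def __init__(self, weights_path: str):
        # Weights live in device storage; nothing is fetched remotely.
        self.weights_path = weights_path

    def run(self, payload):
        # Placeholder: a real runtime would dispatch this to NPU hardware.
        return payload


def translate_locally(chunk: AudioChunk, target_lang: str):
    """Audio in, dubbed audio out, with no network round-trip."""
    transcribe = OnDeviceModel("/firmware/models/asr.bin")  # speech -> text
    translate = OnDeviceModel("/firmware/models/mt.bin")    # text -> text
    synthesize = OnDeviceModel("/firmware/models/tts.bin")  # text -> speech

    text = transcribe.run(chunk)
    translated = translate.run((text, target_lang))
    return synthesize.run(translated)
```

A cloud pipeline would instead pay an uplink, inference, and downlink round-trip per chunk while streaming raw audio off the device, which is where the latency, privacy, and bandwidth claims come from.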
Beyond translation, the chipset includes accessibility features, particularly audio descriptions for visually impaired users [1]. A demonstration video showcased the technology describing scenes from the film Ratatouille in various languages, with simultaneous written translations appearing on-screen.

The voice model powering these features is already used by major organizations including NASCAR, Comcast, and Eurovision, suggesting proven reliability in professional applications [3]. The same capabilities also stand to improve communication in e-learning and customer service applications.
Broadcom's existing chip infrastructure already powers over 500 million devices globally, including set-top boxes and broadband gateways [2]. This extensive deployment base means the new AI translation capabilities could reach homes worldwide through existing hardware ecosystems.

Rich Nelson, Senior Vice President and General Manager of Broadcom's broadband video group, highlighted the privacy-focused approach: "We are enabling next-generation user experiences that are both highly intelligent and privacy-first" [2].

The technology remains in the testing phase, with no specific timeline announced for when these chipsets will appear in consumer devices like televisions [1]. However, CAMB.AI's existing partnerships with organizations such as IMAX, Comcast NBCUniversal, and Major League Soccer demonstrate the company's established presence in multilingual AI localization. Its MARS and BOLI models are currently available on AWS Bedrock and Google Vertex AI, indicating the technology's readiness for broader deployment [2]. The partnership with Broadcom represents a strategic move toward embedding localization and accessibility features directly at the hardware level, bringing AI-powered communication capabilities closer to end users than previously possible.
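Since MARS is reportedly reachable today through cloud platforms, a rough sketch of the cloud-side alternative offers a useful contrast with the on-device path. The boto3 client and `invoke_model` call below are AWS Bedrock's standard runtime API, but the model ID and request fields are placeholders, not CAMB.AI's published schema.

```python
import json

import boto3

# Bedrock's generic inference entry point. The client and invoke_model
# call are standard boto3; the model ID and payload are hypothetical
# placeholders, since CAMB.AI's listing defines its own ID and schema.
client = boto3.client("bedrock-runtime", region_name="us-east-1")

response = client.invoke_model(
    modelId="placeholder.camb-mars-v1",             # hypothetical model ID
    contentType="application/json",
    accept="application/json",
    body=json.dumps({
        "text": "Welcome to tonight's broadcast.",  # text to synthesize
        "target_language": "es",                    # hypothetical field
    }),
)
result = json.loads(response["body"].read())
```

Every such call is a network round-trip; the Broadcom integration is designed to eliminate exactly that step for consumer playback.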