2 Sources
[1]
Don't hold your breath for OpenAI's new model to run on your phone's Snapdragon chip
This could mean faster, more private AI features on your phone, just not yet.

When you use an AI model like ChatGPT, it runs in the cloud rather than on your phone or laptop, but Qualcomm seems eager to change that. The company has announced that OpenAI's first open-source reasoning model, with the less-than-catchy name "gpt-oss-20b," is now capable of running directly on Snapdragon-powered devices.

In a press release, Qualcomm says this is the first time OpenAI has made one of its models available for on-device use. Previously, the company's most advanced models could only run on powerful cloud infrastructure, but with help from Qualcomm's AI Engine and AI Stack, this 20-billion-parameter model has now been tested locally.

However, that doesn't mean your phone is ready for it. Despite the references to Snapdragon devices, this isn't aimed at smartphones just yet. The model is still pretty beefy and requires 24GB of RAM, and Qualcomm's integration work appears targeted at developer-grade platforms, not the chip in your pocket. It's more about Snapdragon-powered PCs than a simple AI upgrade for your Android device.

Still, Qualcomm calls this a milestone moment, with potential benefits in areas like privacy, speed, and personalization. Because everything runs directly on the device, there's no need to send data elsewhere, and tasks like reasoning or assistant-style interactions can happen faster and even offline.

While OpenAI is initially targeting developers, if the approach scales, it could change how AI tools behave on your Snapdragon phone in the future. Think faster responses and no delays when your internet connection is playing up. It could also open the door to future apps that use local AI without sacrificing privacy. Developers can now access the model through platforms like Hugging Face and Ollama, with Qualcomm saying more deployment info will appear soon on its AI Hub.
[2]
OpenAI's AI model can now run directly on Snapdragon hardware - Phandroid
Most AI tools like ChatGPT rely on the cloud to function, but Qualcomm wants to bring that power directly to your device. The company just announced that OpenAI's first open-source reasoning model, called gpt-oss-20b, is now capable of running locally on Snapdragon-powered devices.

This marks a first for OpenAI. Until now, its models have only worked through cloud infrastructure, requiring serious server-grade hardware. But with help from Qualcomm's AI Engine and AI Stack, the 20-billion-parameter model has been successfully tested on local hardware, no internet required.

Before you get too excited, this doesn't mean your phone will be running full-blown ChatGPT anytime soon. The model still needs 24GB of RAM, which rules out most smartphones. Qualcomm is clearly aiming this at developer-grade platforms and Snapdragon-powered PCs, not everyday Android phones. At least not yet.

Still, it's a significant milestone. Running powerful AI models on-device could mean faster responses, better privacy, and improved personalization. With no need to send data back and forth to the cloud, tasks like reasoning or virtual assistant responses could happen in real time, even offline.

For now, the focus is on developers. They can access the model via Hugging Face and Ollama, with more deployment details coming soon on Qualcomm's AI Hub. If the tech continues to scale, it could eventually reshape how AI apps run on future Snapdragon phones -- no cloud, no lag, just smarter local performance.
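Both source pieces point developers to Hugging Face and Ollama as the places to pull the model from. As a rough, non-authoritative sketch of what that could look like in practice (nothing below comes from the articles themselves), loading the checkpoint with the Hugging Face transformers library might go roughly as follows; the "openai/gpt-oss-20b" model ID, the prompt, and the generation settings are assumptions for illustration, and you would need memory on the order of the 24GB the articles mention.

# Illustrative sketch only: assumes the checkpoint is published on Hugging Face
# under the "openai/gpt-oss-20b" ID and that roughly 24GB of memory is available.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="openai/gpt-oss-20b",
    device_map="auto",  # let transformers place the 20B weights on whatever hardware is present
)

messages = [
    {"role": "user", "content": "In one sentence, what does on-device inference mean?"},
]

output = generator(messages, max_new_tokens=64)
# With chat-style input, the pipeline returns the conversation with the
# assistant's reply appended as the final message.
print(output[0]["generated_text"][-1]["content"])

Once the weights are cached locally, subsequent runs don't need to re-download anything, which is the whole point of the on-device story.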
Qualcomm announces successful testing of OpenAI's gpt-oss-20b model on Snapdragon-powered devices, marking a significant step towards on-device AI processing.
In a groundbreaking development, Qualcomm has announced that OpenAI's first open-source reasoning model, dubbed "gpt-oss-20b," can now run directly on Snapdragon-powered devices [1]. This marks a significant shift from the cloud-based operation of AI models like ChatGPT towards on-device processing, potentially revolutionizing the way AI functions on mobile and personal computing devices.
The 20-billion-parameter model, whose local execution is enabled by Qualcomm's AI Engine and AI Stack, is the first of OpenAI's models to be made available for on-device use [2]. The technology is not yet ready for smartphones, however: the model currently requires 24GB of RAM, making it better suited to developer-grade platforms and Snapdragon-powered PCs than to typical Android devices [1].
Despite its current limitations, this development holds promise in several key areas, including privacy, response speed, personalization, and offline capability.
The model is now accessible to developers through platforms like Hugging Face and Ollama, with more deployment information expected soon on Qualcomm's AI Hub [1]. While the immediate focus is on developer applications, the long-term implications could be significant for consumer devices.
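For developers who prefer Ollama's tooling, the equivalent call is similarly short. The snippet below is an illustrative sketch rather than anything drawn from Qualcomm's or OpenAI's documentation: it assumes a local Ollama server is already running and that the model is published under a tag along the lines of "gpt-oss:20b".

# Illustrative sketch: assumes a local Ollama server is running and that the
# model is available under the (assumed) "gpt-oss:20b" tag.
import ollama  # pip install ollama

response = ollama.chat(
    model="gpt-oss:20b",
    messages=[
        {"role": "user", "content": "Why does running a model on-device help with privacy?"},
    ],
)

print(response["message"]["content"])

Because the request never leaves the machine, this is the kind of workflow the privacy and latency arguments above are really about.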
If successfully scaled, this technology could reshape how AI tools function on Snapdragon-powered phones in the future. Users might experience faster AI responses, improved offline capabilities, and enhanced privacy features [2]. However, it's crucial to temper expectations, as widespread implementation in consumer smartphones is likely still some time away.
This milestone represents a convergence of OpenAI's advanced language models and Qualcomm's hardware expertise, potentially opening new avenues for AI application development and usage across a range of devices. As the technology evolves, it could lead to more powerful, efficient, and private AI experiences directly on our personal devices.
Summarized by Navi