Akamai Launches Cloud Inference: Revolutionizing AI Performance with Edge Computing

Akamai Technologies introduces Cloud Inference, a distributed AI inference platform promising improved performance, lower latency, and cost savings compared to traditional cloud infrastructures.

Akamai Unveils Cloud Inference: A Game-Changer for AI Performance

Akamai Technologies, a leader in cybersecurity and cloud computing, has launched Akamai Cloud Inference, a groundbreaking service designed to revolutionize AI inference capabilities. This new offering leverages Akamai's globally distributed network to address the limitations of centralized cloud models, promising significant improvements in AI application performance and efficiency.

The Power of Distributed AI Inference

Akamai Cloud Inference runs on Akamai Cloud, which the company touts as the world's most distributed cloud infrastructure platform. With over 4,100 points of presence across more than 1,200 networks in over 130 countries, Akamai's platform is uniquely positioned to bring AI inference closer to end users and data sources.
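The performance case for such a footprint is simple: route each request to the nearest point of presence. As a toy illustration of that routing idea (the PoP names and coordinates below are invented for this sketch, not Akamai's actual topology), nearest-PoP selection can be sketched as:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points, in kilometers."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical points of presence: name -> (latitude, longitude)
POPS = {
    "frankfurt": (50.11, 8.68),
    "singapore": (1.35, 103.82),
    "sao-paulo": (-23.55, -46.63),
    "virginia": (38.95, -77.45),
}

def nearest_pop(user_lat, user_lon):
    """Pick the PoP with the smallest great-circle distance to the user."""
    return min(POPS, key=lambda name: haversine_km(user_lat, user_lon, *POPS[name]))

# A user in Paris is served from Frankfurt rather than a distant region.
print(nearest_pop(48.86, 2.35))  # -> frankfurt
```

Real request routing also weighs live network conditions and capacity, but physical distance dominates first-byte latency, which is why a dense PoP footprint matters for inference.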

Adam Karon, Chief Operating Officer and General Manager of Akamai's Cloud Technology Group, explains the significance: "Getting AI data closer to users and devices is hard and it's where legacy clouds struggle. While the heavy lifting of training LLMs will continue to happen in big hyperscale datacenters, the actionable work of inference will take place at the edge."

Key Features and Benefits

Akamai Cloud Inference offers a comprehensive suite of tools for developers and platform engineers:

  1. Compute Resources: A range of options from CPUs for fine-tuned inference to powerful GPUs and ASICs, integrated with Nvidia's AI Enterprise ecosystem.

  2. Data Management: Partnership with VAST Data for a cutting-edge data fabric optimized for AI workloads, complemented by scalable object storage and integration with vector database vendors.

  3. Containerization: Leveraging Kubernetes for improved scalability, resilience, and portability of AI workloads.

  4. Edge Compute: WebAssembly capabilities for executing LLM inference directly from serverless apps at the network edge.
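Of these, the vector-database integration is what enables retrieval-style inference near users: document embeddings are stored at the edge and the closest matches are fetched per query. A minimal, dependency-free sketch of that core similarity lookup (the document ids and embedding values here are toy inventions, and production systems use approximate-nearest-neighbor indexes rather than a linear scan):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def top_k(query, index, k=2):
    """Return the ids of the k stored vectors most similar to the query."""
    ranked = sorted(index, key=lambda doc_id: cosine_similarity(query, index[doc_id]),
                    reverse=True)
    return ranked[:k]

# Toy embedding index: document id -> embedding vector
INDEX = {
    "refund-policy": [0.9, 0.1, 0.0],
    "shipping-times": [0.1, 0.9, 0.2],
    "api-reference": [0.0, 0.2, 0.9],
}

print(top_k([0.8, 0.2, 0.1], INDEX, k=1))  # -> ['refund-policy']
```

Keeping this lookup at the edge, next to the user, is what lets a lightweight model answer with local context without a round trip to a distant region.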

Performance and Cost Advantages

The distributed nature of Akamai Cloud Inference translates into tangible benefits:

  • 3x improvement in throughput
  • Up to 2.5x reduction in latency
  • Cost savings of up to 86% on AI inference workloads compared to traditional hyperscaler infrastructure
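Taken at face value, these multipliers are easy to sanity-check. The baseline figures below are invented for illustration; only the ratios come from Akamai's claims:

```python
# Hypothetical hyperscaler baseline; only the ratios reflect Akamai's stated claims.
baseline_latency_ms = 250.0
baseline_throughput_rps = 100.0
baseline_cost_per_million_tokens = 10.00  # dollars

edge_latency_ms = baseline_latency_ms / 2.5                 # "up to 2.5x reduction in latency"
edge_throughput_rps = baseline_throughput_rps * 3           # "3x improvement in throughput"
edge_cost = baseline_cost_per_million_tokens * (1 - 0.86)   # "up to 86% cost savings"

print(edge_latency_ms)      # -> 100.0
print(edge_throughput_rps)  # -> 300.0
print(round(edge_cost, 2))  # -> 1.4
```

In other words, a workload that cost $10 per million tokens on a hyperscaler would, at the claimed ceiling, cost about $1.40, with a quarter of the latency and triple the throughput.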

Shifting Focus from Training to Inference

As AI adoption matures, there's a growing recognition that the emphasis on large language models (LLMs) may have overshadowed more practical AI solutions. Akamai's platform caters to this shift, enabling enterprises to leverage lightweight AI models optimized for specific business problems.

The Future of Distributed AI

Gartner has predicted that by 2025, 75% of data will be generated outside of centralized data centers or cloud regions. That trend underscores Akamai's approach of processing data closer to its point of origin.

Akamai Cloud Inference represents a significant step towards more efficient, responsive, and cost-effective AI applications. By bringing inference capabilities to the edge of the network, Akamai is positioning itself at the forefront of the next wave of AI innovation, promising to deliver faster, smarter, and more personalized experiences for users across the globe.

TheOutpost.ai

© 2025 Triveous Technologies Private Limited