Curated by THEOUTPOST
On Wed, 13 Nov, 12:03 AM UTC
4 Sources
[1]
Amazon attempts to lure AI researchers with $110M in grants and credits | TechCrunch
There's an AI chip battle brewing among the major cloud vendors. Google's Trillium, a custom chip for training and running AI models, recently entered preview, and Microsoft's Maia is expected to follow in short order. Not to be outdone, Amazon Web Services has AI chips, too: Trainium, Inferentia, and Graviton.

In an effort to draw attention to Trainium in particular, the company is launching a new grant program for AI research. Called Build on Trainium, the program will award $110 million in total to institutions, scientists, and students researching AI. AWS will give up to $11 million each in Trainium credits to universities with which it has strategic partnerships, along with individual grants of up to $500,000 to the broader AI research community. AWS also says that it's establishing a "research cluster" of up to 40,000 Trainium chips that research teams and students can access through self-managed reservations.

Gadi Hutt, senior director at AWS' Annapurna Labs, a chipmaking firm AWS acquired in 2015, said that Build on Trainium is intended to furnish researchers with the hardware support they need to pursue their work. Grant participants will also be connected with Trainium educational resources and enablement programs, Hutt added.

"AI academic research today is severely bottlenecked by a lack of resources and, as such, the academic sector is falling behind quickly," Hutt said. "With Build on Trainium, AWS is investing in a new wave of AI research guided by leading AI research in universities that will advance the state of generative AI applications, libraries, and optimizations."

Indeed, academics in the AI field lack the considerable infrastructure tech giants have at their disposal. Meta, for example, has procured well over 100,000 AI chips to develop its flagship models. In contrast, Stanford's Natural Language Processing Group has 68 GPUs for all of its work.

But not everyone believes that AWS will be a benevolent sponsor.
"This feels like an effort on generalizing a corruption of academic research funding," Os Keyes, a PhD candidate at the University of Washington who studies the ethical impact of emerging technologies, told TechCrunch.

With Build on Trainium, AWS will have the final say on which projects receive grants. The selection process is opaque; Hutt would only say AWS will allocate funds "based on research merit and needs" and "evaluate program success and outcomes." An AWS spokesperson later clarified that a committee of "AI and application practitioners" will review proposals and select "the most impactful and promising projects that will help advance machine learning science forward."

There's evidence suggesting corporate-backed AI research tends to favor work with commercial applications over other areas of study. In a recent paper, researchers found that leading AI firms produce significantly less research critically examining AI's ethical implications than conventional studies. Moreover, the "responsible" AI research big firms do conduct is narrower in scope, according to the co-authors, and lacks diversity in the topics addressed. Researchers have pushed for legal and technical protections to scrutinize AI without fear that vendors will suspend their accounts or threaten legal action.

Build on Trainium is advertising for, well, Trainium. But is AWS also angling to woo researchers onto its platform? I asked whether grant recipients will be "locked in" to the AWS ecosystem or Trainium if they accept an award. Hutt said they won't, and that the only requirements they'll have to meet are publishing a paper and "open-sourcing" their work on GitHub under a permissive license.

"There is no contractual lock that makes universities exclusive technology partners," he said. "What we ask in return is that the outcomes of the research will be open-sourced for the benefit of the community."
In any event, it's not clear Build on Trainium will do much to bridge the gap between AI academia and industry. In 2021, U.S. government agencies other than the Department of Defense allocated $1.5 billion for academic AI research. That same year, the AI industry worldwide spent more than $340 billion overall (not just on research).

Nearly 70% of people with Ph.D.s in AI end up in private industry, lured not only by competitive salaries but by access to essential compute and data (and the means to process them). In recent years, companies have ramped up their poaching of faculty AI researchers and set aside larger grants for Ph.D. students performing research. The end result? The largest AI models developed in any given year now come from industry more than 90% of the time, while the number of AI papers published with industry co-authors has nearly doubled since 2000.

Policymakers have taken some steps to address the academia-industry funding gap. Last year, the National Science Foundation announced a $140 million investment to launch seven university-led National AI Research Institutes to examine how AI could mitigate the effects of climate change and improve education. Elsewhere, efforts are underway to establish the U.S. National AI Research Resource, a $2.6 billion initiative that would provide AI researchers and students with access to computational resources and datasets.

But these efforts remain small compared to corporate programs. And there's little reason to think the status quo will change anytime soon.
[2]
AWS woos academics with cluster of 40K Trainium chips
Amazon wants more people building applications and frameworks for its custom Trainium accelerators and is making up to 40,000 chips available to university researchers under a $110 million initiative announced on Tuesday.

Dubbed "Build on Trainium," the program will provide compute hours to AI academics developing new algorithms, working to increase accelerator performance, or scaling compute across large distributed systems. "A researcher might invent a new model architecture or a new performance optimization technique, but they may not be able to afford the high-performance computing resources required for a large-scale experiment," AWS explained in a recent blog post. And perhaps more importantly, the fruits of this labor are expected to be open-sourced by researchers and developers so that they can benefit the machine learning ecosystem as a whole.

As altruistic as this all might sound, it's to Amazon's benefit: The cloud giant's custom silicon, which now spans the gamut from CPUs and SmartNICs to dedicated AI training and inference accelerators, was originally designed to improve the efficiency of its internal workloads. Developing low-level application frameworks and kernels isn't a big ask for such a large company. However, things get trickier when you open the hardware up to the public, which in large part lacks those resources and expertise, necessitating a higher degree of abstraction. This is why we've seen Intel, AMD, and others gravitate toward frameworks like PyTorch and TensorFlow to hide the complexity associated with low-level coding, and we've certainly seen it with AWS products like SageMaker.

Researchers, on the other hand, are often more than willing to dive into low-level hardware if it means extracting additional performance, uncovering hardware-specific optimizations, or simply getting access to the compute necessary to move their research forward. What is it they say about necessity being the mother of invention?
"The knobs of flexibility built into the architecture at every step make it a dream platform from a research perspective," Christopher Fletcher, an associate professor at the University of California at Berkeley, said of Trainium in a statement.

It isn't clear from the announcement whether all 40,000 of those accelerators are first- or second-generation parts. We'll update if we hear back on this. The second-generation parts, announced roughly a year ago during Amazon's re:Invent event, saw the company shift focus toward everyone's favorite flavor of AI: large language models. As we reported at the time, Trainium2 is said to deliver 4x faster training performance than its predecessor and triple the memory capacity.

Since any innovations uncovered by researchers -- optimized compute kernels for domain-specific machine learning tasks, for example -- will be open-sourced under the Build on Trainium program, Amazon stands to benefit from this crowdsourcing of software development.

Naturally, throwing hardware at academics is a tale as old as university computer science programs, and to support these efforts, Amazon is extending access to technical education and enablement programs to get researchers up to speed. This will be handled through a partnership with the Neuron Data Science community, an organization led by Amazon's Annapurna Labs team. ®
[3]
Amazon Invests USD 110 Million to Boost AI Research with Free Access to Trainium Chips
Neuron Kernel Interface (NKI) enables researchers to directly optimise chip performance for AI models.

Amazon's cloud computing division (AWS) announced on Tuesday that it will provide free computing power to researchers interested in using its custom artificial intelligence chips, aiming to challenge Nvidia's dominance in the field. Amazon has announced a USD 110 million investment in the Build on Trainium program, aimed at expanding AI research and training opportunities.

The initiative will provide university-led research teams with access to AWS Trainium UltraClusters (collections of AI accelerators that work together on complex computational tasks) to explore new AI architectures, machine learning (ML) libraries, and performance optimisations. AWS Trainium, the ML chip built specifically for deep learning training and inference, will enable researchers to tackle large-scale AI challenges.

As part of Build on Trainium, AWS created a Trainium research UltraCluster with up to 40,000 Trainium chips, which are optimally designed for the unique workloads and computational structures of AI, Amazon said. AWS and AI research institutions are also establishing dedicated funding for new research and student education. "A researcher might invent a new model architecture or a new performance optimisation technique, but they may not be able to afford the high-performance computing resources required for a large-scale experiment," Amazon noted.

The program will support a range of research areas, including algorithmic improvements and distributed systems, and foster collaborations between AI experts and research institutions such as Carnegie Mellon University (CMU) and the University of California at Berkeley.
"Trainium is beyond programmable -- not only can you run a program, you get low-level access to tune features of the hardware itself," said Christopher Fletcher, an associate professor of computer science research at the University of California at Berkeley, and a participant in Build on Trainium. "The knobs of flexibility built into the architecture at every step make it a dream platform from a research perspective."

As part of Build on Trainium, selected research teams will receive AWS Trainium credits, technical support, and access to educational resources. The initiative will also make its advancements open-source, according to Amazon.

Amazon said these advancements are made possible, in part, by a new programming interface for AWS Trainium and Inferentia called the Neuron Kernel Interface (NKI). "This interface gives direct access to the chip's instruction set and allows researchers to build optimised compute kernels (core computational units) for new model operations, performance optimizations, and science innovations," Amazon said.
[4]
Amazon Puts $110M Into Academic Generative AI Research
A new initiative will help universities access powerful AI tools

Generative AI is increasingly used to synthesize images, video, text, and code. Now Amazon says it will invest US $110 million in university-led research into generative AI to help drive breakthroughs in the field, the company announced.

Generative AI systems such as DALL-E, Midjourney, and Stable Diffusion now regularly conjure photorealistic images. ChatGPT, perhaps the most well-known generative AI chatbot, has passed law school and business school exams, successfully answered interview questions for software-coding jobs, written real estate listings, and developed ad content. However, developing novel generative AI applications increasingly requires a lot of computing power. Such resources are often well beyond the reach of academic researchers, as two researchers noted in a tongue-in-cheek paper that ran in the Proceedings of the IEEE in January.

"AI academic research today is severely bottlenecked by a lack of resources and, as such, the academic sector is falling behind quickly," says Gadi Hutt, senior director of business development at Amazon Web Services' Annapurna Labs.

As part of the new initiative from Amazon, called Build on Trainium, Amazon Web Services (AWS) has created a computer cluster where researchers can make reservations to access up to 40,000 Trainium chips, Hutt says. AWS developed these processors for high-performance, low-cost deep learning. "With Build on Trainium, AWS is investing in a new wave of AI research and classes guided by leading AI research in universities that will advance the state of generative AI applications, libraries, and optimizations," Hutt says.

A new programming interface for Trainium called the Neuron Kernel Interface gives researchers "bare-metal" chip programming capabilities, allowing direct access to the chip's instruction set and enabling users to build compute kernels for new model operations and performance optimizations, Amazon says.
"That's great," says Julian Togelius, an associate professor of computer science and engineering at New York University. "A lot of companies like to share resources, but only if you use a specific proprietary tool chain. Giving researchers low-level access to tune features of the hardware itself sounds awesome." Togelius was one of the two researchers who authored the Proceedings of the IEEE piece.

The AWS initiative will develop strategic partnerships with universities, including Carnegie Mellon University and the University of California, Berkeley, with more to be announced in coming weeks, Hutt says. In addition, the company will provide grant-level allocations of Trainium credits to the broader research community through multiple rounds of Amazon research award calls for proposals over the next three to five years. Researchers can start applying for these grants immediately, Hutt says.

All in all, researchers will get the opportunity to build new AI architectures, machine learning libraries, and performance optimizations for large-scale distributed Trainium clusters. All code developed through the new initiative will be made available via open-source machine learning software libraries.

"Amazon has been very aggressively reaching out to startups, so now targeting academics makes perfect sense," Togelius says. "They're getting more people to do research with their cloud system, getting researchers to found their own companies who may work with AWS."

Amazon notes the initiative will also help train future AI experts. For instance, those participating in Build on Trainium will receive access to AWS's technical education and enablement programs for Trainium, Hutt says, in partnership with the Neuron Data Science community led by Amazon's chip developer Annapurna.

A key factor in the new initiative's success will be how it chooses whom it supports, Togelius says.
"The question is how will Amazon distribute these resources so people do things that big tech could or would not have done?" Togelius says. "If Amazon distributes these resources like it would distribute compute within Amazon, then it hasn't gained much. You want to focus on what the large tech companies are not investing in for one reason or another -- it's too narrow, it's experimental, it's got weird optics -- the kind of things that you can't sell to a venture capitalist or to shareholders. Academics need to double down on the weirdness."
Amazon Web Services launches the "Build on Trainium" program, offering $110 million in grants and compute credits to academic researchers for AI development using its custom Trainium chips.
Amazon Web Services (AWS) has announced a significant initiative aimed at boosting artificial intelligence research in academia. The "Build on Trainium" program, with a substantial investment of $110 million, seeks to provide researchers with access to cutting-edge AI hardware and resources [1].
At the heart of this initiative is AWS's custom AI chip, Trainium. The company is making up to 40,000 Trainium chips available to university researchers, creating a powerful research cluster [2]. This move is designed to address the growing compute resource gap between academia and industry in AI research.
The Build on Trainium program offers:
- Up to $11 million each in Trainium credits to strategic partner universities
- Individual grants of up to $500,000 to the broader AI research community
- Access to a research cluster of up to 40,000 Trainium chips through self-managed reservations
- Technical support, educational resources, and enablement programs
The initiative aims to support a wide range of research areas, including algorithmic improvements, distributed systems, and new AI architectures [3].
AWS has introduced the Neuron Kernel Interface (NKI), a new programming interface for Trainium. This interface provides researchers with "bare-metal" chip programming capabilities, allowing direct access to the chip's instruction set and enabling the development of optimized compute kernels [3][4].
This move by Amazon comes amidst an ongoing AI chip battle among major cloud vendors. Google's Trillium and Microsoft's Maia are also entering the market, intensifying competition in the AI hardware space [1].
The initiative addresses a critical issue in AI research: the widening resource gap between academia and industry. With private industry spending over $340 billion on AI in 2021, compared to $1.5 billion in U.S. government funding for academic AI research, Amazon's investment could help level the playing field [1].
While the program promises significant benefits, some experts have raised concerns:
- The selection process is opaque, with AWS holding final say over which projects receive grants
- Corporate-backed AI research tends to favor commercially applicable work over critical study of AI's ethical implications
- It remains unclear whether the program will fund the kind of experimental work that large tech companies would not pursue themselves
A key aspect of the Build on Trainium program is its commitment to open-sourcing research outcomes. AWS requires grant recipients to publish their work and make it available on GitHub under a permissive license, potentially benefiting the broader AI community [1][2].
As the program unfolds, its impact on bridging the academia-industry gap in AI research remains to be seen. The success of Build on Trainium could influence future collaborations between tech giants and academic institutions, shaping the landscape of AI innovation and education [4].