Curated by THEOUTPOST
On Sat, 23 Nov, 8:01 AM UTC
2 Sources
[1]
AI chip startup MatX reportedly raises $80M funding round - SiliconANGLE
MatX Inc., a chip startup led by former Google LLC engineers, has reportedly raised $80 million in fresh funding. TechCrunch today cited sources as saying that Anthropic PBC backer Spark Capital led the investment. The raise, described as a Series A round, reportedly values MatX at more than $300 million. The milestone comes a few months after the company raised $25 million in initial funding from a group of prominent investors.
MatX was founded in 2022 by Chief Executive Officer Reiner Pope and Chief Technology Officer Mike Gunter. The duo previously worked at Google, where they helped develop the company's TPU line of artificial intelligence processors and contributed to other machine learning projects.
MatX is developing chips for training AI models and performing inference, the task of running a neural network in production after it's trained. According to the company, customers will be able to build machine learning clusters containing hundreds of thousands of its chips. MatX estimates that such clusters will be capable of powering AI models with millions of simultaneous users.
The company's website states that it's prioritizing cost-efficiency over latency in its chips' design. Nevertheless, MatX expects the processors to be "competitive" on latency: for AI models with 70 billion parameters, it is promising latencies of less than a hundredth of a second per token, or roughly 100 tokens per second and up.
MatX plans to give customers "low-level control over the hardware." Several existing AI processors, including Nvidia Corp.'s market-leading graphics cards, provide similar capabilities, allowing developers to modify how computations are carried out in ways that improve the performance of AI models. Nvidia's chips, for example, provide low-level controls that make it easier to implement operator fusion, a machine learning optimization that reduces the number of times an AI model must move data to and from a graphics card's memory. Such data transfers incur processing delays, so lowering their frequency speeds up calculations (see the sketch after this article).
MatX says that another contributor to its chips' performance is their no-frills architecture. The processors lack some of the components included in standard GPUs, which frees room for additional AI-optimized circuits.
Earlier this year, MatX told Bloomberg that its chips will be at least ten times better at running large language models than Nvidia silicon. The company further claims that AI clusters powered by its silicon will be capable of running LLMs with ten trillion parameters. For smaller models with around 70 billion parameters, MatX is promising training times of a few days or weeks. The company expects to complete the development of its first product next year.
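To make the operator-fusion point concrete, here is a minimal sketch using JAX/XLA as a stand-in, since that toolchain exposes the same idea on today's GPUs and TPUs. MatX has not published its programming interface, so the function names and shapes below are illustrative assumptions, not its actual API.

```python
# Minimal sketch of operator fusion, using JAX/XLA as a stand-in.
# MatX's low-level interface is not public; this only illustrates why
# fusing elementwise ops reduces round trips to accelerator memory.
import jax
import jax.numpy as jnp

def gelu_bias(x, b):
    # Unfused view: the bias add and each elementwise step of the
    # tanh-approx GELU would read and write the full tensor in memory.
    y = x + b
    return 0.5 * y * (1.0 + jnp.tanh(0.7978845608 * (y + 0.044715 * y**3)))

# jax.jit hands the whole function to the XLA compiler, which fuses the
# elementwise chain into a single kernel: one read of x and b, one write
# of the result, instead of a memory round trip per operation.
fused_gelu_bias = jax.jit(gelu_bias)

x = jnp.ones((4096, 4096), dtype=jnp.bfloat16)
b = jnp.zeros((4096,), dtype=jnp.bfloat16)
out = fused_gelu_bias(x, b)  # compiled and fused on first call
print(out.shape, out.dtype)
```

Here a compiler performs the fusion automatically; the appeal of "low-level control over the hardware" is being able to dictate this kind of data movement explicitly rather than relying on the compiler's choices.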
[2]
Exclusive: MatX, chip startup founded by Google alums, raised Series A at valuation of $300M+, sources say
MatX, a startup designing chips that support large language models, has raised a Series A of approximately $80 million, three sources say, less than a year after raising its $25 million seed round. Spark Capital led the investment, valuing the company at a pre-money valuation in the mid-$200 million range and a post-money valuation in the low $300 million range, a person who reviewed the deal told TechCrunch.
MatX was co-founded two years ago by Mike Gunter, who previously worked at Google on the design of Tensor Processing Units (TPUs), the tech giant's AI chips, and Reiner Pope, who also came from Google's TPU team, where he wrote AI software. Gunter and Pope hope to help ease the shortage of chips designed to handle AI workloads. They say the sweet spot for their chips is AI workloads of "at least" 7 billion, and "ideally" 20 billion or more, activated parameters. And they boast that their chips deliver high performance at more affordable prices, according to MatX's website. The startup says its chips are particularly good at scaling to large clusters because of MatX's advanced interconnect, that is, the communication pathways AI chips use to transfer information. The pair told Bloomberg that the company's goal is to make its processors ten times better at training LLMs and delivering results than NVIDIA's GPUs.
The Information reported last month that MatX was looking to raise between $75 million and $100 million for the round. The startup's seed round was announced last December, led by high-profile AI angel investors Nat Friedman, former CEO of GitHub, and Daniel Gross, who previously ran search and AI at Apple after Apple bought his startup, Cue, in 2013. Friedman and Gross frequently angel invest together. Gross has since co-founded a new AI company, Safe Superintelligence, with former OpenAI chief scientist Ilya Sutskever.
Companies designing chips have seen increased interest from investors amid the AI boom and stratospheric demand for NVIDIA's processors. Groq, a chip startup founded by former TPU engineer Jonathan Ross, saw its valuation nearly triple to $2.8 billion in August, up from its previous valuation of $1 billion set in April. MatX and Spark didn't respond to a request for comment.
MatX, founded by former Google engineers, raises $80 million in a Series A round led by Spark Capital, valuing the company at over $300 million. The startup aims to develop AI chips that outperform NVIDIA's GPUs in large language model processing.
MatX Inc., an AI chip startup founded by former Google engineers, has reportedly raised $80 million in a Series A funding round led by Spark Capital. The investment values the company at over $300 million, marking a significant milestone in its growth trajectory [1][2].
MatX was co-founded in 2022 by CEO Reiner Pope and CTO Mike Gunter, both of whom previously worked on Google's Tensor Processing Unit (TPU) team. Their experience in developing AI processors at Google has positioned MatX as a potentially disruptive force in the AI chip market [1][2].
The startup is developing chips specifically designed for training AI models and performing inference. MatX aims to build processors that are at least ten times better at running large language models (LLMs) than NVIDIA's market-leading GPUs [1].
MatX is targeting AI workloads with "at least" 7 billion, and "ideally" 20 billion or more activated parameters. The company claims its chips are particularly adept at scaling to large clusters due to advanced interconnect technology [2].
The AI chip market has seen increased investor interest amid the AI boom and high demand for NVIDIA's processors. Other startups, such as Groq, have also secured significant funding, highlighting the competitive landscape [2].
MatX expects to complete the development of its first product in 2025. The company aims to enable AI clusters capable of running LLMs with ten trillion parameters, potentially revolutionizing the field of large-scale AI processing [1].
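For a rough sense of what a ten-trillion-parameter model implies, the back-of-the-envelope estimate below assumes 16-bit weights and a hypothetical 100 GB of memory per accelerator; both figures are illustrative assumptions, not MatX specifications.

```python
# Back-of-the-envelope weight-memory estimate for a 10T-parameter LLM.
# Assumes 2 bytes per parameter (bf16/fp16); activations, KV cache and
# optimizer state are deliberately ignored.
params = 10e12                       # ten trillion parameters
bytes_per_param = 2                  # bf16/fp16 weight storage (assumption)
weight_bytes = params * bytes_per_param
print(f"Weights alone: {weight_bytes / 1e12:.0f} TB")   # ~20 TB

# With a hypothetical 100 GB of usable memory per accelerator, holding
# just the weights already requires on the order of 200 devices.
per_device_gb = 100
print(f"Devices for weights only: {weight_bytes / (per_device_gb * 1e9):.0f}")
```

Serving such a model at interactive latency would therefore hinge on the cluster-scale interconnect MatX emphasizes, since no single accelerator can hold the weights.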
As the AI industry continues to grow, MatX's ambitious goals and strong funding position it as a potential challenger to established players like NVIDIA in the rapidly evolving AI chip market.