4 Sources
[1]
Fundamental raises $255 million Series A with a new take on big data analysis
An AI lab called Fundamental emerged from stealth on Thursday, offering a new foundation model to solve an old problem: how to draw insights from the huge quantities of structured data produced by enterprises. By combining the old systems of predictive AI with more contemporary tools, the company believes it can reshape how large enterprises analyze their data.

"While LLMs have been great at working with unstructured data, like text, audio, video, and code, they don't work well with structured data like tables," CEO Jeremy Fraenkel told TechCrunch. "With our model Nexus, we have built the best foundation model to handle that type of data."

The idea has already drawn significant interest from investors. The company is emerging from stealth with $255 million in funding at a $1.2 billion valuation. The bulk of it comes from the recent $225 million Series A round led by Oak HC/FT, Valor Equity Partners, Battery Ventures, and Salesforce Ventures; Hetz Ventures also participated in the Series A, with angel funding from Perplexity CEO Aravind Srinivas, Brex co-founder Henrique Dubugras, and Datadog CEO Olivier Pomel.

Called a Large Tabular Model (LTM) rather than a Large Language Model (LLM), Fundamental's Nexus breaks from contemporary AI practice in several significant ways. The model is deterministic -- that is, it will give the same answer every time it is asked a given question -- and doesn't rely on the transformer architecture that defines models from most contemporary AI labs. Fundamental calls it a foundation model because it goes through the normal steps of pre-training and fine-tuning, but the result is profoundly different from what a client would get when partnering with OpenAI or Anthropic. Those differences matter because Fundamental is chasing a use case where contemporary AI models often falter.
Because transformer-based AI models can only process data that's within their context window, they often have trouble reasoning over extremely large datasets -- analyzing a spreadsheet with billions of rows, for instance. But that kind of enormous structured dataset is common within large enterprises, creating a significant opportunity for models that can handle the scale. As Fraenkel sees it, Fundamental can use Nexus to bring contemporary techniques to Big Data analysis, offering something more powerful and flexible than the algorithms currently in use.

"You can now have one model across all of your use cases, so you can now expand massively the number of use cases that you tackle," he told TechCrunch. "And on each one of those use cases, you get better performance than what you would otherwise be able to do with an army of data scientists."

That promise has already brought in a number of high-profile deals, including seven-figure contracts with Fortune 100 clients. The company has also entered into a strategic partnership with AWS that will allow AWS users to deploy Nexus directly from existing instances.
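To put the scale mismatch in rough numbers -- the per-row token count and window size below are illustrative assumptions, not figures from Fundamental:

```python
# Back-of-the-envelope arithmetic: why a billion-row table overflows an
# LLM context window. Both constants are illustrative assumptions.
rows = 1_000_000_000        # a billion-row enterprise table
tokens_per_row = 10         # assumed: a few short numeric fields per row
table_tokens = rows * tokens_per_row

context_window = 200_000    # assumed: on the order of a large LLM window
chunks_needed = table_tokens // context_window
print(f"{table_tokens:,} tokens = {chunks_needed:,} full context windows")
```

Even with generous assumptions, the table is tens of thousands of times larger than a single prompt, which is why chunk-at-a-time summarization is not a substitute for a model that operates over whole tables.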
[2]
Fundamental emerges from stealth with first major foundation model trained for tabular data
The deep learning revolution has a curious blind spot: the spreadsheet. While Large Language Models (LLMs) have mastered the nuances of human prose and image generators have conquered the digital canvas, the structured, relational data that underpins the global economy -- the rows and columns of ERP systems, CRMs, and financial ledgers -- has so far been treated as just another file format, similar to text or PDFs. That's left enterprises to forecast business outcomes using the typical bespoke, labor-intensive data science process of manual feature engineering and classic machine learning algorithms that predate modern deep learning.

But now Fundamental, a San Francisco-based AI firm co-founded by DeepMind alumni, is launching today with $255 million in total funding to bridge this gap. Emerging from stealth, the company is debuting NEXUS, a Large Tabular Model (LTM) designed to treat business data not as a simple sequence of words, but as a complex web of non-linear relationships.

The tech: moving beyond sequential logic

Most current AI models are built on sequential logic -- predicting the next word in a sentence or the next pixel in a frame. However, enterprise data is inherently non-sequential. A customer's churn risk isn't just a timeline; it's a multi-dimensional intersection of transaction frequency, support ticket sentiment, and regional economic shifts. Existing LLMs struggle with this because they are poorly suited to the size and dimensionality constraints of enterprise-scale tables.

"The most valuable data in the world lives in tables, and until now there has been no good foundation model built specifically to understand it," said Jeremy Fraenkel, CEO and co-founder of Fundamental. In a recent interview with VentureBeat, Fraenkel emphasized that while the AI world is obsessed with text, audio, and video, tables remain the largest modality for enterprises.
"LLMs really cannot handle this type of data very well," he explained, "and enterprises currently rely on very old-school machine learning algorithms in order to make predictions."

NEXUS was trained on billions of real-world tabular datasets using Amazon SageMaker HyperPod. Unlike traditional XGBoost or Random Forest models, which require data scientists to manually define features -- the specific variables the model should look at -- NEXUS is designed to ingest raw tables directly. It identifies latent patterns across columns and rows that human analysts might miss, effectively reading the hidden language of the grid to understand non-linear interactions.

The tokenization trap

A primary reason traditional LLMs fail at tabular data is how they process numbers. Fraenkel explains that LLMs tokenize numbers the same way they tokenize words, breaking them into smaller chunks. "The problem is they apply the same thing to numbers. Tables are, by and large, all numerical," Fraenkel noted. "If you have a number like 2.3, the '2', the '.', and the '3' are seen as three different tokens. That essentially means you lose the understanding of the distribution of numbers. It's not like a calculator; you don't always get the right answer because the model doesn't understand the concept of numbers natively."

Furthermore, tabular data is order-invariant in a way that language is not. Fraenkel uses a healthcare example to illustrate: "If I give you a table with hundreds of thousands of patients and ask you to predict which of them has diabetes, it shouldn't matter if the first column is height and the second is weight, or vice versa." While LLMs are highly sensitive to the order of words in a prompt, NEXUS is architected to understand that shifting column positions should not impact the underlying prediction.
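The fragmentation Fraenkel describes can be shown with a toy tokenizer. This is a simplified stand-in, not the tokenizer of any particular LLM, but real subword tokenizers often fragment numerals in a similar way:

```python
import re

def toy_tokenize(text):
    # Toy subword-style tokenizer: whole words survive as single tokens,
    # but every digit and punctuation mark becomes its own token,
    # mimicking how LLM tokenizers commonly fragment numbers.
    return re.findall(r"[A-Za-z]+|\d|[^\w\s]", text)

print(toy_tokenize("revenue grew 2.3 percent"))
# -> ['revenue', 'grew', '2', '.', '3', 'percent']
# The value 2.3 is no longer a single unit, so its numeric meaning
# (magnitude, distribution) is invisible to the model.
```

The words come through intact while the single quantity 2.3 is split into three unrelated pieces, which is exactly the loss of numeric structure the quote describes.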
Operating at the predictive layer

Recent high-profile integrations, such as Anthropic's Claude appearing directly within Microsoft Excel, have suggested that LLMs are already solving tables. However, Fraenkel distinguishes Fundamental's work as operating at a fundamentally different layer: the predictive layer. "What they are doing is essentially at the formula layer -- formulas are text, they are like code," he said. "We aren't trying to allow you to build a financial model in Excel. We are helping you make a forecast."

NEXUS is designed for split-second decisions where a human isn't in the loop, such as a credit card provider determining whether a transaction is fraudulent the moment you swipe. While tools like Claude can summarize a spreadsheet, NEXUS is built to predict the next row -- whether that is an equipment failure in a factory or the probability of a patient being readmitted to a hospital.

Architecture and availability

The core value proposition of Fundamental is a radical reduction in time-to-insight. Traditionally, building a predictive model could take months of manual labor. "You have to hire an army of data scientists to build all of those data pipelines to process and clean the data," Fraenkel explained. "If there are missing values or inconsistent data, your model won't work. You have to build those pipelines for every single use case." Fundamental claims NEXUS replaces this entire manual process with just one line of code. Because the model has been pre-trained on a billion tables, it doesn't require the same level of task-specific training or feature engineering that traditional algorithms do.

As Fundamental moves from its stealth phase into the broader market, it does so with a commercial structure designed to bypass the traditional friction of enterprise software adoption.
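Fundamental has not published the NEXUS API, so the class and method names below are hypothetical. The stub only illustrates the shape of the "raw table in, forecast out" workflow Fraenkel describes, with a naive baseline standing in for the actual model:

```python
import csv, io, statistics

class TabularModelStub:
    """Hypothetical stand-in for a pretrained tabular foundation model.
    A real LTM would learn cross-column patterns; this stub simply fills
    missing target values with the column mean as a naive baseline."""

    def predict(self, rows, target):
        # Known values in the labeled target column anchor the forecast.
        known = [float(r[target]) for r in rows if r[target] != ""]
        baseline = statistics.mean(known)
        # Return a prediction for every row, including the missing ones.
        return [float(r[target]) if r[target] != "" else baseline
                for r in rows]

# A raw table with a missing 'risk' value for machine C -- no cleaning
# pipeline or feature engineering, just point the model at a column.
table = list(csv.DictReader(io.StringIO(
    "machine_id,temp,risk\nA,71,1\nB,64,0\nC,88,\n")))

print(TabularModelStub().predict(table, target="risk"))  # [1.0, 0.0, 0.5]
```

The point of the sketch is the interface, not the math: the caller labels a target column and gets regressions or classifications back, with no per-use-case pipeline in between.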
The company has already secured several seven-figure contracts with Fortune 100 organizations, a feat facilitated by a strategic go-to-market architecture in which Amazon Web Services (AWS) serves as the seller of record on the AWS Marketplace. This allows enterprise leaders to procure and deploy NEXUS using existing AWS credits, effectively treating predictive intelligence as a standard utility alongside compute and storage.

For the engineers tasked with implementation, the experience is high-impact but low-friction; NEXUS operates via a Python-based interface at a purely predictive layer rather than a conversational one. Developers connect raw tables directly to the model and label specific target columns -- such as a credit default probability or a maintenance risk score -- to trigger the forecast. The model then returns regressions or classifications directly into the enterprise data stack, functioning as a silent, high-speed engine for automated decision-making rather than a chat-based assistant.

The societal stakes: beyond the bottom line

While the commercial implications of demand forecasting and price prediction are clear, Fundamental is emphasizing the societal benefit of predictive intelligence. The company highlights key areas where NEXUS can prevent catastrophic outcomes by identifying signals hidden in structured data. By analyzing sensor data and maintenance records, NEXUS can predict failures like pipe corrosion; the company points to the Flint water crisis -- which cost over $1 billion in repairs -- as an example where predictive monitoring could have prevented life-threatening contamination. Similarly, during the COVID-19 crisis, PPE shortages cost hospitals $323 billion in a single year. Fundamental argues that by using manufacturing and epidemiological data, NEXUS can predict shortages four to six weeks before peak demand, triggering emergency manufacturing in time to save lives.
On the climate front, NEXUS aims to provide 30-to-60-day flood and drought predictions -- relevant to disasters like the 2022 Pakistan floods, which caused $30 billion in damages. Finally, the model is being used to predict hospital readmission risks by analyzing patient demographics and social determinants. As the company puts it: "A single mother working two jobs shouldn't end up back in the ER because we failed to predict she'd need follow-up care."

Performance vs. latency

In the enterprise world, the definition of "better" varies by industry. For some, it is speed; for others, it is raw accuracy. "In terms of latency, it depends on the use case," Fraenkel explains. "If you are a researcher trying to understand what drugs to administer to a patient in Africa, latency doesn't matter as much. You are trying to make a more accurate decision that can end up saving the most lives possible." In contrast, for a bank or hedge fund, even a marginal increase in accuracy translates to massive value. "Increasing the prediction accuracy by half a percent is worth billions of dollars for a bank," Fraenkel says. "For different use cases, the magnitude of the percentage increase changes, but we can get you to a better performance than what you have currently."

Ambitious vision receives big backing

The $225 million Series A, led by Oak HC/FT with participation from Salesforce Ventures, Valor Equity Partners, and Battery Ventures, signals high-conviction belief that tabular data is the next great frontier. Notable angel investors, including leaders from Perplexity, Wiz, Brex, and Datadog, further validate the company's pedigree. Annie Lamont, co-founder and managing partner at Oak HC/FT, articulated the sentiment: "The significance of Fundamental's model is hard to overstate -- structured, relational data has yet to see the benefits of the deep learning revolution." Fundamental is positioning itself not just as another AI tool, but as a new category of enterprise AI.
With a team of approximately 35 based in San Francisco, the company is moving away from the bespoke-model era and toward a foundation-model era for tables. "Those traditional algorithms have been the same for the last 10 years; they are not improving," Fraenkel said. "Our models keep improving. We are doing the same thing for tables that ChatGPT did for text."

Partnering with AWS

Through a strategic partnership with Amazon Web Services (AWS), NEXUS is integrated directly into the AWS dashboard, and AWS customers can deploy the model using their existing credits and infrastructure. Fraenkel describes this as a "very unique agreement," noting Fundamental is one of only two AI companies to have established such a deep, multi-layered partnership with Amazon.

One of the most significant hurdles for enterprise AI is data privacy: companies are often unwilling to move sensitive data to third-party infrastructure. To solve this, Fundamental and Amazon achieved a massive engineering feat: the ability to deploy fully encrypted models -- both the architecture and the weights -- directly within the customer's own environment. "Customers can be confident the data sits with them," Fraenkel said. "We are the first, and currently only, company to have built such a solution."

Fundamental's emergence is an attempt to redefine the OS for business decisions. If NEXUS performs as advertised -- handling financial fraud, energy prices, and supply chain disruptions with a single, generalized model -- it will mark the moment AI finally learned to read the spreadsheets that actually run the world. The power to predict is no longer about looking at what happened yesterday; it is about uncovering the hidden language of tables to determine what happens tomorrow.
[3]
Fundamental launches with $255M and an AI model optimized for tabular data - SiliconANGLE
Artificial intelligence startup Fundamental Technologies Inc. launched today with $255 million in initial funding from a group of prominent investors. The company raised the bulk of the capital through a $225 million Series A round led by Oak HC/FT. The firm was joined by Salesforce Ventures, Valor Equity Partners, Battery Ventures and Hetz Ventures. Fundamental says that its investor roster also includes the chief executives of Wiz Inc., Perplexity AI Inc., Datadog Inc. and Brex Inc.

The company's flagship offering is an AI model called Nexus. It's specifically optimized to process tabular data, or information organized into rows and columns. According to TechCrunch, Nexus is not based on the transformer architecture that underpins most large language models. Fundamental opted against the technology partly because it uses a mechanism called a tokenizer to process prompts. A tokenizer breaks down input data into smaller chunks to ease processing. That approach is effective for text input, but can reportedly create quality issues during tabular data analyses.

Fundamental trained Nexus on billions of tabular datasets. It carried out the training using Amazon Web Services Inc.'s Amazon SageMaker HyperPod service. The service enables developers to spin up cloud-based AI clusters with thousands of chips. It also speeds up administrative tasks such as defining cluster configuration settings and recovering from training errors.

Large tabular datasets often have to be organized into a form that lends itself better to analysis before they can be analyzed by an AI model. Nexus removes that requirement. The model thereby reduces the need for manual data preparation, which lowers the costs associated with AI projects. One way Nexus speeds up the data preparation workflow is by automatically interpreting ambiguous records.
For example, the model could determine whether spreadsheet fields that contain the values "Yellowstone" and "Yosemite" are describing national parks or conference room names. That automation removes the need for developers to manually find and remove potentially ambiguous table elements.

Companies can use Nexus to generate forecasts based on tabular data. For example, a manufacturer could upload a spreadsheet that contains equipment telemetry and ask the model to estimate when the next malfunction will occur. Nexus also lends itself to tasks such as predicting floods and store traffic.

"We've built a generalized foundation model specifically to leverage the world's most valuable data: the billions of tables that underpin predictions in every enterprise, across every vertical," said Fundamental co-founder and Chief Executive Officer Jeremy Fraenkel (pictured, right, with co-founders Annie Lamont and Gabriel Suissa).

The AI developer says that it has already nabbed multiple seven-figure contracts with Fortune 100 companies. Those customers are using Nexus for tasks such as forecasting customer churn. Additionally, Fundamental has partnered with AWS to make it simple for enterprises to deploy its AI model in their cloud environments.
[4]
Fundamental Raises $255 Million for AI Large Tabular Model | PYMNTS.com
The company, which has developed a "large tabular model" (LTM) to glean predictions from enterprise data, said Thursday (Feb. 5) it will use the funding to scale compute, expand enterprise deployments, and add to its research, engineering, and go-to-market teams.

"Enterprises have historically relied on antiquated machine learning algorithms that predate deep learning to analyze their data, inform decisions, and make predictions," the announcement from Amazon said. "In contrast, recent advances in deep learning have largely centered on LLMs [large language models] and related architectures optimized for unstructured, sequential data such as text, images, and video."

Thus, these models are ill-suited to capturing the "non-sequential, nonlinear relationships inherent in tabular data," the announcement added, and struggle to "process enterprise-scale tables at all due to size and dimensionality constraints." This means they're not designed to derive value from the tabular datasets that inform every crucial enterprise decision. With NEXUS, the announcement said, companies can predict with greater accuracy, replacing legacy predictive analytics with a purpose-built foundation model designed specifically for tabular data.

"Fundamental enables enterprises to move beyond analysis of past events to answer forward-looking questions like what will happen next, when risks will emerge, or where opportunities exist -- all with fast time-to-value and enterprise-grade deployment on any cloud infrastructure," Amazon added.

In addition to the funding, Fundamental has launched a collaboration with Amazon Web Services (AWS) to accelerate enterprise adoption of its model among AWS customers, who can now purchase and deploy Fundamental's NEXUS LTM in their AWS environment.
"Fundamental's structured data prediction model builds on AWS' advanced AI offerings, helping enterprise customers fill a crucial gap in comprehensive tabular data analysis at scale," said Dave Brown, VP of Compute, Platforms & ML Services at AWS. "By partnering with Fundamental, we are making it seamless for customers to transform tabular data -- the backbone of enterprise decision-making -- into a powerful predictive asset. This collaboration exemplifies our commitment to bringing transformative AI solutions to market with the enterprise-grade security and scalability our customers demand."

Writing about Amazon's place in the AI race earlier this week, PYMNTS CEO Karen Webster argued that the challenge facing the company is not a technical one. "The company knows how to build and scale new business units and monetize them. And they have done so well with AWS," she argued. "But doing so required separating AWS from retail economics and allowing it to serve a broad ecosystem of companies, including competitors."
DeepMind alumni launch Fundamental with $255 million in funding to solve a critical gap in AI: analyzing structured tabular data. The startup's Nexus model brings deep learning to enterprise spreadsheets, promising faster predictions than traditional machine learning algorithms. Fortune 100 companies have already signed seven-figure contracts.

An AI lab called Fundamental has emerged from stealth with $255 million in total funding to address a persistent challenge in enterprise technology: how to extract insights from massive structured datasets. The San Francisco-based company, co-founded by DeepMind alumni including CEO Jeremy Fraenkel, secured the bulk of its capital through a $225 million Series A round led by Oak HC/FT, Valor Equity Partners, Battery Ventures, and Salesforce Ventures [1]. Hetz Ventures also participated, alongside angel investors including Perplexity CEO Aravind Srinivas, Brex co-founder Henrique Dubugras, and Datadog CEO Olivier Pomel. The funding values the company at $1.2 billion [1].
Fundamental's flagship product, Nexus, represents a departure from contemporary AI development. Rather than a Large Language Model, the company has built what it calls a Large Tabular Model (LTM) specifically designed for structured data like spreadsheets and database tables [1]. The model doesn't rely on the transformer architecture that defines models from OpenAI and Anthropic. Instead, Nexus is deterministic, meaning it delivers the same answer every time for a given question. It was trained on billions of real-world tabular datasets using Amazon SageMaker HyperPod [2].

The deep learning revolution has largely bypassed the spreadsheet, leaving enterprises to forecast business outcomes using legacy machine learning algorithms that predate modern AI [2]. Fraenkel explained that LLMs struggle with tabular data because of how they handle tokenization. "LLMs tokenize numbers the same way they tokenize words, breaking them into smaller chunks," he told VentureBeat. "If you have a number like 2.3, the '2', the '.', and the '3' are seen as three different tokens. That essentially means you lose the understanding of the distribution of numbers" [2]. Additionally, transformer-based models can only process data within their context window, making them ineffective for analyzing spreadsheets with billions of rows [1].
While recent integrations like Anthropic's Claude appearing in Microsoft Excel suggest LLMs are already solving tables, Fundamental operates at a fundamentally different layer. "What they are doing is essentially at the formula layer -- formulas are text, they are like code," Fraenkel explained. "We aren't trying to allow you to build a financial model in Excel. We are helping you make a forecast" [2]. Nexus is designed for split-second decisions where humans aren't in the loop, such as determining if a credit card transaction is fraudulent the moment a customer swipes. The model can predict customer churn, equipment failures, hospital readmissions, and store traffic patterns [3].
One of the most significant advantages of Nexus is its ability to ingest raw tables directly, eliminating the manual feature engineering that traditional approaches require. Unlike XGBoost or Random Forest models, where data scientists must manually define which variables the model should examine, Nexus identifies latent patterns across columns and rows automatically [2]. The model can automatically interpret ambiguous records, determining whether fields containing "Yellowstone" and "Yosemite" describe national parks or conference room names [3]. This automation drastically reduces the time required for Big Data analysis, which traditionally could take months of manual labor.

Fundamental has entered into a strategic partnership with AWS that allows customers to purchase and deploy the Nexus LTM directly within their AWS environment [4]. "Fundamental's structured data prediction model builds on AWS' advanced AI offerings, helping enterprise customers fill a crucial gap in comprehensive tabular data analysis at scale," said Dave Brown, VP of Compute, Platforms & ML Services at AWS [4]. The company has already secured multiple seven-figure contracts with Fortune 100 companies for tasks including forecasting customer churn [3]. Fundamental plans to use the funding to scale compute resources, expand enterprise deployments, and grow its research, engineering, and go-to-market teams [4].