6 Sources
[1]
OpenAI's advisory board calls for continued and strengthened nonprofit oversight
OpenAI should continue to be controlled by a nonprofit because the artificial intelligence technology it is developing is "too consequential" to be governed by a corporation alone. That is the message from an advisory board convened by OpenAI to give it recommendations about its nonprofit structure -- delivered in a report released Thursday, along with a sweeping vision for democratizing AI and reforming philanthropy. "We think it's too important to entrust to any one sector, the private sector or even the government sector," said Daniel Zingale, the convener of OpenAI's nonprofit commission and a former adviser to three California governors. The nonprofit model, he said, allows for what the commission calls a "common sector" that facilitates democratic participation. The recommendations are not binding on OpenAI, but the advisory commission, which includes the labor organizer Dolores Huerta, offers a framework that may be used to judge OpenAI in the future, whether or not the company adopts it. In the commission's view, communities that are already feeling the impacts of AI technologies should have input on how they are developed, including how data about them is used. But there are currently few avenues for people to influence the tech companies that control much of the development of AI. OpenAI, the maker of ChatGPT, started in 2015 as a nonprofit research laboratory and has since incorporated a for-profit company with a valuation that has grown to $300 billion. The company has tried to change its structure since the nonprofit board ousted its CEO, Sam Altman, in November 2023. He was reinstated days later and continues to lead OpenAI. It has run into hurdles escaping its nonprofit roots, including scrutiny from the attorneys general in California and Delaware, who have oversight of nonprofits, and a lawsuit by Elon Musk, an early donor to and co-founder of OpenAI.
Most recently, OpenAI has said it will turn its for-profit company into a public benefit corporation, which must balance the interests of shareholders and its mission. Its nonprofit will hold shares in that new corporation, but OpenAI has not said how much. Zingale said Huerta told the commission their challenge was to help make sure AI is a blessing and not a curse. To grapple with those stakes, they envision a nonprofit with an expansive mandate to help everyone participate in the development and trajectory of AI. "The measure of this nonprofit will be in what it builds, who it includes, and how faithfully it endures to mission and impact," they wrote. The commission toured California communities and solicited feedback online. They heard that many were inspired by OpenAI's mission to create artificial intelligence that benefits humanity and to ensure those benefits are felt widely and evenly. But Zingale said many people feel they are in the dark about how it's happening. "They know this is profoundly important what's happening in this 'Age of Intelligence,' but they want to understand better what it is, how it's developed, where are the important choices being made and who's making them?" he said. Zingale said the commission chose early on not to interact with Altman in any way in order to maintain its independence, though the report quotes him. The commissioners did, however, speak with the company's senior engineers, who, they said, "entered our space with humility, seriousness, and a genuine desire to understand how their work might translate into democratic legitimacy." The commission proposed that OpenAI immediately provide significant resources to the nonprofit for use in the public interest. For context, the nonprofit reported $23 million in assets in 2023, the most recent year for which its tax filing is available.
The commission recommended focusing on closing gaps in economic opportunity, investing in AI literacy and creating an organization that is accessible to and governed by everyday people. "For OpenAI's nonprofit to fulfill its mandate, it should commit to more than just doing good -- it should commit to being known, seen, and shaped by the people it claims to serve," they wrote. The commission suggested opening a rapid response fund to help reduce economic strains now. Zingale said they specifically recommended funding theater, art and health. "We're trying to make the point that they need to dedicate some of their resources to human-to-human activities," he said. The commission also recommended a requirement that a human lead the nonprofit, which Zingale said is a serious recommendation and "a sign of the times." ___ Associated Press coverage of philanthropy and nonprofits receives support through the AP's collaboration with The Conversation US, with funding from Lilly Endowment Inc. The AP is solely responsible for this content. For all of AP's philanthropy coverage, visit https://apnews.com/hub/philanthropy.
An advisory board convened by OpenAI recommends that the company should continue to be controlled by a nonprofit, emphasizing the need for democratic participation in AI development and governance.
An advisory board convened by OpenAI has delivered a clear message: the artificial intelligence technology being developed by the company is "too consequential" to be governed solely by a corporation. The board, which includes labor organizer Dolores Huerta, released a report on Thursday outlining recommendations for OpenAI's nonprofit structure and presenting a vision for democratizing AI and reforming philanthropy.
Daniel Zingale, the convener of OpenAI's nonprofit commission and a former adviser to three California governors, emphasized the importance of maintaining nonprofit control. "We think it's too important to entrust to any one sector, the private sector or even the government sector," Zingale stated. The commission argues that the nonprofit model facilitates what it calls a "common sector," enabling democratic participation in AI development and governance.
OpenAI, the creator of ChatGPT, began as a nonprofit research laboratory in 2015 but has since incorporated a for-profit arm valued at $300 billion. The company has faced challenges in altering its structure, particularly following the brief ousting and reinstatement of CEO Sam Altman in November 2023. OpenAI has encountered scrutiny from the attorneys general in California and Delaware, as well as a lawsuit from early donor and co-founder Elon Musk.
Source: Inc. Magazine
The advisory commission's report outlines several key recommendations:
Community Input: Communities already impacted by AI technologies should have a say in their development, including how their data is used.
Significant Resource Allocation: OpenAI should immediately provide substantial resources to the nonprofit for public interest use. For context, the nonprofit reported $23 million in assets in 2023.
Focus Areas: The commission recommends concentrating on closing economic opportunity gaps, investing in AI literacy, and creating an organization accessible to and governed by everyday people.
Rapid Response Fund: A suggestion to open a fund to help reduce current economic strains, with specific recommendations for funding theater, art, and health.
Human Leadership: A requirement that a human lead the nonprofit, which Zingale described as "a sign of the times."
The commission conducted tours of California communities and solicited online feedback. They found that while many were inspired by OpenAI's mission to create AI that benefits humanity, there was a widespread desire for greater understanding and transparency in the development process.
"They know this is profoundly important what's happening in this 'Age of Intelligence,' but they want to understand better what it is, how it's developed, where are the important choices being made and who's making them?" Zingale explained.
While these recommendations are not binding, they provide a framework that may be used to evaluate OpenAI's future actions and decisions. The commission's vision extends beyond mere oversight, calling for a nonprofit structure that is "known, seen, and shaped by the people it claims to serve."
Summarized by Navi