Microsoft Copilot Exposes Thousands of Private GitHub Repositories, Raising Security Concerns

Security researchers discover that Microsoft's AI assistant Copilot can access and expose data from over 20,000 private GitHub repositories, affecting major tech companies and posing significant security risks.

AI-Powered Security Breach: Copilot Exposes Private GitHub Repositories

In a startling revelation, security researchers have uncovered a significant vulnerability in Microsoft's AI assistant, Copilot, which has been exposing data from thousands of private GitHub repositories. This discovery, made by Israeli cybersecurity firm Lasso, has sent shockwaves through the tech industry, highlighting the potential risks associated with AI-powered tools and data caching mechanisms [1].

The Scope of the Breach

Lasso's investigation revealed that over 20,000 GitHub repositories, which had been set to private in 2024, were still accessible through Copilot. This security breach affected more than 16,000 organizations, including major technology companies such as IBM, Google, PayPal, Tencent, Microsoft, and Amazon Web Services [2].

How the Breach Occurred

The root cause of this breach lies in the caching mechanism of Microsoft's Bing search engine. When repositories were temporarily made public, Bing indexed and cached their contents. Even after these repositories were switched back to private, Copilot retained access to the cached data, making it potentially accessible to anyone using the AI assistant [3].
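The failure mode described above can be illustrated with a minimal sketch. The class and method names here are hypothetical, and this is of course not Bing's or GitHub's actual implementation; the point is that an index which snapshots content while a repository is public will keep serving that snapshot unless something invalidates it when the repository goes private.

```python
# Minimal sketch of a cache-invalidation gap (hypothetical classes,
# not Bing's or GitHub's actual implementation).

class Repository:
    def __init__(self, name, content, public=False):
        self.name = name
        self.content = content
        self.public = public


class SearchIndex:
    """Caches a snapshot of any repository it can see while crawling."""

    def __init__(self):
        self._cache = {}

    def crawl(self, repo):
        if repo.public:
            self._cache[repo.name] = repo.content  # snapshot taken here

    def cached_lookup(self, name):
        # The gap: nothing re-checks the repository's *current* visibility,
        # so the stale snapshot is served even after the repo goes private.
        return self._cache.get(name)


repo = Repository("acme/internal-tools", "build scripts and keys", public=True)
index = SearchIndex()
index.crawl(repo)            # indexed while briefly public

repo.public = False          # owner flips it back to private
print(index.cached_lookup("acme/internal-tools"))  # stale snapshot still served
```

The fix is to invalidate cached entries, or re-check visibility, when access is revoked; Lasso's finding was effectively that this invalidation did not reach the data Copilot could retrieve.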

Sensitive Information at Risk

The exposed repositories contained highly sensitive data, including:

  1. Intellectual property
  2. Confidential corporate information
  3. Access keys and security tokens
  4. Tools for bypassing AI safety measures [4]

This breach has raised concerns about the potential for cybercriminals to manipulate Copilot into revealing confidential information, posing a significant security threat to affected organizations.

Microsoft's Response and Mitigation Efforts

When informed about the issue in November 2024, Microsoft initially classified it as a "low-severity" problem, describing the caching behavior as "acceptable." However, the company took some steps to address the situation:

  1. Removed links to Bing's cache from search results in December 2024
  2. Disabled public access to a special Bing user interface at cc.bingj.com [5]

Despite these measures, Lasso researchers found that Copilot could still access the cached data, indicating that the fix was only partial and temporary.

Implications and Recommendations

This incident highlights the challenges of managing data privacy and security in the age of AI-powered tools. It also underscores the importance of proper security practices when handling sensitive information in code repositories.

Experts recommend that affected organizations take the following steps:

  1. Rotate or revoke any compromised security credentials
  2. Review and update their data handling practices
  3. Implement stricter controls on repository visibility
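For the first of these steps, a quick way to find credentials that may need rotation is to scan repository files for well-known token formats. The sketch below is a deliberately minimal illustration covering just two common formats, AWS access key IDs and GitHub personal access tokens; production scanners cover far more patterns and also check commit history.

```python
import re

# Minimal illustrative scanner: two well-known credential formats only.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "github_pat": re.compile(r"\bghp_[A-Za-z0-9]{36}\b"),
}


def scan_text(text):
    """Return (kind, match) pairs for every credential-shaped string found."""
    hits = []
    for kind, pattern in SECRET_PATTERNS.items():
        for match in pattern.findall(text):
            hits.append((kind, match))
    return hits


# AWS's documented example key, used here as safe test data.
sample = "aws_key = 'AKIAIOSFODNN7EXAMPLE'  # leaked in a commit"
print(scan_text(sample))  # → [('aws_access_key_id', 'AKIAIOSFODNN7EXAMPLE')]
```

Any real hit from such a scan should be treated as compromised and rotated, since cached copies of the file may persist outside the organization's control.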

As AI technologies continue to evolve, this incident serves as a stark reminder of the need for robust security measures and careful consideration of the potential risks associated with these powerful tools.

TheOutpost.ai

© 2025 Triveous Technologies Private Limited