Israeli cybersecurity firm Lasso has revealed a major data-leak risk in Microsoft’s Copilot AI. According to the report, historical data from GitHub repositories that are now private remains accessible via Copilot. Here are the details…
More than 20,000 private GitHub repositories exposed through Microsoft Copilot!
Lasso’s research found that some GitHub projects that were accidentally made public for a short time in 2024 remained accessible to Copilot even after being made private again. In particular, the report states that private data from more than 16,000 organizations, including large companies such as Microsoft, Amazon AWS, Google, IBM, PayPal and Tencent, was at risk.

Due to this vulnerability, source code, API keys and other sensitive information that companies previously exposed by accident is still indexed in Microsoft’s Copilot AI system. According to Lasso’s research, the Bing search engine indexed these repositories on GitHub while they were briefly public, and that cached index leaves the data permanently accessible to Copilot even after the repositories are made private.
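Because cached copies of briefly public repositories can outlive the repositories themselves, the practical defense is to keep secrets out of commits in the first place. As an illustration only (this is not Lasso’s tooling, and the patterns and names below are assumptions for demonstration), a minimal sketch of the kind of secret scan a team might run over a checkout before pushing:

```python
import re
from pathlib import Path

# Illustrative patterns only; real scanners ship far larger rule sets.
SECRET_PATTERNS = {
    "aws_access_key_id": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "generic_api_key": re.compile(
        r"(?i)\b(?:api[_-]?key|secret)\s*[:=]\s*['\"][^'\"]{16,}['\"]"
    ),
}

def scan_text(text: str) -> list[tuple[str, int]]:
    """Return (pattern_name, line_number) for every suspected secret in `text`."""
    hits = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        for name, pattern in SECRET_PATTERNS.items():
            if pattern.search(line):
                hits.append((name, lineno))
    return hits

def scan_repo(root: str) -> dict[str, list[tuple[str, int]]]:
    """Walk a checkout and report suspected secrets per text file."""
    findings = {}
    for path in Path(root).rglob("*"):
        if not path.is_file():
            continue
        try:
            hits = scan_text(path.read_text(encoding="utf-8"))
        except (UnicodeDecodeError, OSError):
            continue  # skip binaries and unreadable files
        if hits:
            findings[str(path)] = hits
    return findings
```

The key point the leak illustrates is that such a scan must run before the first push: once an external crawler has indexed a file, deleting it or flipping the repository to private no longer removes the cached copy.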
Although Microsoft disabled Bing’s caching system in December 2024, the Lasso team noted that this was only a temporary fix and that the data was still available through Copilot. Microsoft, for its part, classified the vulnerability as a “low-risk issue” and argued that the system’s caching behavior was acceptable. Security experts counter that the unauthorized presence of sensitive company data in AI systems poses a serious security threat.
The company’s decision to treat this as a low-risk issue has only deepened concern among corporate users. If the situation is not fixed, experts warn, the risk that more companies’ private data will be unintentionally surfaced by Copilot will persist.