Microsoft's AI research team mistakenly leaked 38TB of the company's private data

A robot sitting on a desk with a computer with plain papers falling to the ground
(Image credit: Bing Image Creator/Kevin Okemwa)

What you need to know

  • Amid today's wave of Microsoft leaks, the company's AI research team was also found to have mistakenly exposed 38TB of private data.
  • The leaked data included backups of Microsoft employees' computers.
  • Data stored in the computers included passwords to Microsoft services, secret keys, and over 30,000 internal Teams messages from over 350 company employees.
  • Microsoft has since issued a statement indicating that customer data wasn't compromised, and neither were its internal services. 
  • Wiz, a cybersecurity firm, traced the root cause of the leak to a misconfigured Azure Shared Access Signature (SAS) token, which granted users full access to the underlying Azure Storage account.

Microsoft must be having a rough day after what can be termed the "biggest leak in Xbox history." That leak revealed details of a refreshed Xbox Series X console codenamed 'Brooklin,' which sports a cylindrical design and is expected to go on sale this November.

And now, Microsoft's AI research team has also mistakenly leaked 38TB of private company data. According to Wiz, a cybersecurity firm, the leaked data included a link to backups of Microsoft employees' computers. Those backups contained passwords to Microsoft services, secret keys, and over 30,000 internal Teams messages from more than 350 employees.

In a statement to TechCrunch, Microsoft quickly assured concerned customers that their data wasn't at risk and that its internal services weren't affected either.

According to the report, the AI research team uploaded training material that included open-source code and AI models for image recognition, and the sensitive data was exposed alongside it. Anyone who came across the GitHub repository could follow the Azure link to download the models.

And while Microsoft assured customers that no data was compromised and no internal services were affected, the link gave anyone who found it complete access to the Azure Storage account. Worse still, that access wasn't read-only: visitors could alter the data stored in the account, uploading, overwriting, and even deleting existing files.

The cybersecurity company traced the issue to an Azure feature, Shared Access Signature (SAS) tokens. Essentially, a SAS token is a URL that grants scoped access rights to Azure Storage resources. While the feature can be configured to limit access to specific files, this particular link was set up to grant unrestricted access to the account.
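To make that concrete: a SAS token is simply a set of query parameters appended to a storage URL, and two of those fields, `sp` (permissions) and `se` (expiry), determine how much access the link grants and for how long. Below is a minimal Python sketch, using only the standard library and a made-up example URL, that inspects those fields; the account name, blob path, and signature are all hypothetical:

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical SAS URL; the query string after '?' is the SAS token itself.
sas_url = (
    "https://exampleaccount.blob.core.windows.net/models/weights.bin"
    "?sv=2022-11-02"              # storage service version
    "&sp=racwdl"                  # permissions: read, add, create, write, delete, list
    "&se=2024-12-31T23:59:59Z"    # expiry time
    "&sr=c"                       # resource scope: 'c' = the whole container
    "&sig=FAKESIGNATURE"          # HMAC signature (placeholder, not real)
)

params = parse_qs(urlparse(sas_url).query)
permissions = params["sp"][0]
expiry = params["se"][0]

# A tightly scoped token would carry sp=r (read-only); this one also
# grants write and delete -- the kind of over-broad grant behind the leak.
print(f"permissions:   {permissions}")
print(f"expires:       {expiry}")
print(f"write access:  {'w' in permissions}")
print(f"delete access: {'d' in permissions}")
```

Because the whole grant lives in the URL, anyone who obtains the link holds the access it encodes, which is why an over-permissive token in a public repository is so dangerous.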

Wiz flagged and reported the issue to Microsoft on June 22, 2023, and the company responded quickly, revoking the SAS token on June 23, 2023. Microsoft also said it has rescanned its public repositories, noting that its systems had originally detected the offending link but marked it as a false positive.

Remedies for the future

The potential damage if data like this fell into the wrong hands is hard to overstate. Luckily, in this case, the issue was flagged and remedied quickly.

Microsoft has since published a comprehensive list of best practices for handling SAS tokens. The lesson for users is clear: handle the feature with care, and put restrictions in place so that an incident like this, one that could have caused far more damage, isn't repeated.
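Two of the practices generally recommended for SAS tokens, short expiry times and minimal permissions, can be enforced with a simple pre-publish audit. The sketch below is a standard-library-only illustration, with a hypothetical policy (read/list only, expiry within one hour) and hypothetical token fields, not Microsoft's actual tooling:

```python
from datetime import datetime, timedelta, timezone

MAX_LIFETIME = timedelta(hours=1)   # policy: tokens must expire within an hour
ALLOWED_PERMS = set("rl")           # policy: read ('r') and list ('l') only

def audit_sas(permissions: str, expiry_iso: str) -> list[str]:
    """Return a list of policy violations for a SAS token's sp/se fields."""
    problems = []
    extra = set(permissions) - ALLOWED_PERMS
    if extra:
        problems.append(f"over-broad permissions: {''.join(sorted(extra))}")
    expiry = datetime.fromisoformat(expiry_iso.replace("Z", "+00:00"))
    if expiry - datetime.now(timezone.utc) > MAX_LIFETIME:
        problems.append("expiry too far in the future")
    return problems

# A token like the one behind the leak: full access, far-future expiry.
print(audit_sas("racwdl", "2051-10-05T00:00:00Z"))
# -> ['over-broad permissions: acdw', 'expiry too far in the future']
```

A compliant token, say `audit_sas("r", <one hour from now>)`, would come back with an empty list. Checks like this are cheap to run in CI before anything is pushed to a public repository.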

Kevin Okemwa
Contributor

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya, with years of experience covering the latest trends and developments in the industry. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.