Microsoft Accidentally Exposes 38TB of Sensitive Data

Microsoft's AI team unknowingly exposes 38 terabytes of internal data, including employee info and passwords. Discover how one URL led to this blunder.

If you've ever deleted an important email or misplaced your house keys, you know the sting of a minor mistake with major consequences. Now, imagine being Microsoft and accidentally exposing not one, not two, but 38 terabytes of highly sensitive internal data! A tiny configuration error became a digital Pandora's box, and you won't believe how it happened.

Cloud security startup Wiz, on its quest to probe unintentional data exposures, stumbled upon a treasure trove of Microsoft secrets. Microsoft's AI research division had set up a GitHub repository, loaded with cutting-edge open source code and AI models for image recognition. Sounds harmless, right? Except, the Azure Storage URL, where the repository directed users to download the models, was like an open door to a bank vault!

The fault lay in an overly generous "shared access signature" (SAS) token. These tokens are Azure's method of creating shareable links to data storage, each scoped with its own permissions and expiry. However, Microsoft's SAS token didn't just say, "come on in"; it practically yelled, "take whatever you want!"

Wiz discovered that anyone with the URL had not just "read-only" access but "full control": they could delete, replace, or even inject malicious content. We're talking 38 terabytes of sensitive data here, including personal backups of two Microsoft employees, a hoard of internal messages on Microsoft Teams, and enough passwords and secret keys to turn any hacker into a digital Houdini!
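How would you even spot a token like this? An Azure SAS token carries its permissions (`sp`) and expiry (`se`) right in the URL's query string, so an over-broad grant can be caught by simple inspection. Here's a minimal audit sketch in Python; the URL, thresholds, and `audit_sas_url` helper are hypothetical illustrations, not details from the incident or an official Azure tool:

```python
from urllib.parse import urlparse, parse_qs
from datetime import datetime, timezone

def audit_sas_url(url, max_days=7):
    """Flag a SAS URL whose token grants more than read/list
    access or stays valid longer than max_days."""
    params = parse_qs(urlparse(url).query)
    findings = []

    # 'sp' lists granted permissions: r=read, l=list, a=add,
    # c=create, w=write, d=delete. Anything beyond r/l is risky
    # for a public download link.
    perms = params.get("sp", [""])[0]
    risky = set(perms) - set("rl")
    if risky:
        findings.append(f"excess permissions: {''.join(sorted(risky))}")

    # 'se' is the signed expiry timestamp (ISO 8601, UTC).
    expiry = params.get("se", [""])[0]
    if expiry:
        exp = datetime.fromisoformat(expiry.replace("Z", "+00:00"))
        days_left = (exp - datetime.now(timezone.utc)).days
        if days_left > max_days:
            findings.append(f"expiry {days_left} days out (limit {max_days})")

    return findings

# A made-up URL with the kind of grant described above:
# full read/add/create/write/delete/list access, valid for decades.
url = ("https://example.blob.core.windows.net/models?"
       "sv=2021-08-06&sp=racwdl&se=2051-10-06T00:00:00Z&sig=REDACTED")
for finding in audit_sas_url(url):
    print(finding)
```

Run against a properly scoped link (`sp=r`, expiry a few days out), the function returns an empty list; against the example above, it flags both the write/delete permissions and the far-future expiry.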

"In the race to advance AI, data security often takes a backseat," warns Ami Luttwak, CTO of Wiz. Massive amounts of data and multi-team collaboration make such lapses increasingly difficult to prevent. Thankfully, Wiz alerted Microsoft, which immediately plugged the leak, revoked the faulty SAS token, and upped its internal security checks. Microsoft confirmed that "no customer data was exposed," but you've got to wonder—how many more ticking data time bombs are out there?

So the next time you beat yourself up over a tiny mistake, remember: even tech giants like Microsoft aren't immune to human error. But let's hope this serves as a wake-up call for all companies in the booming AI industry to double-check their data doors!