The never-ending story: Microsoft AI team accidentally exposed 38 Terabytes of internal data

The accidental sharing of cloud access is an all-too-familiar story.

In the latest such incident, Microsoft's AI research team accidentally exposed 38 terabytes of private data to the public, including internal messages, private keys, and passwords, according to a recent report [1]. And all it took to cause this gigantic exposure was a few errant clicks in a configuration menu.

How did it happen?

To share some of their AI models with the public, Microsoft’s AI research team configured a shared access URL to their models stored in Azure Storage and shared the URL on GitHub.

The URL was intended to grant access only to the models, but a misconfigured embedded signature in fact allowed complete read and write access to the entire Azure Storage account, which included complete workstation backups, private keys, passwords, and hundreds of thousands of internal messages.

The mistake was enabled by Azure’s Shared Access Signature feature. To make matters worse, the existence of active Shared Access Signatures is difficult to monitor and audit. In fact, the Wiz research team that discovered the exposure believes the exposure had been in place since July 20, 2020!
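To see why a single misconfigured signature can be so dangerous, here is a toy sketch of the principle behind a Shared Access Signature. All names and the token layout are hypothetical (Azure's real SAS format carries many more fields), but the core idea is the same: the account key signs a grant of permissions and an expiry time, and whoever holds the resulting token can do exactly what was signed, for as long as the token says.

```python
import base64
import hashlib
import hmac
import time

# Hypothetical account key; in Azure this would be the storage account key.
SECRET_KEY = b"storage-account-key"

def make_sas(permissions: str, expiry: float) -> str:
    """Sign a grant (e.g. 'r' for read-only) together with its expiry."""
    payload = f"{permissions}\n{expiry}"
    sig = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).digest()
    return f"{permissions}|{expiry}|{base64.urlsafe_b64encode(sig).decode()}"

def check_sas(token: str, requested: str, now: float) -> bool:
    """Allow a request only if the token is authentic, unexpired,
    and actually grants every requested permission."""
    permissions, expiry, sig = token.split("|")
    payload = f"{permissions}\n{expiry}"
    expected = hmac.new(SECRET_KEY, payload.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(base64.urlsafe_b64encode(expected).decode(), sig):
        return False  # forged or tampered token
    if now > float(expiry):
        return False  # token has expired
    return all(p in permissions for p in requested)

# A narrowly scoped token: read-only, one hour of validity.
good = make_sas("r", time.time() + 3600)

# The dangerous shape: read+write+list access with decades of validity.
# Nothing about the token itself looks wrong -- the over-broad grant is
# baked invisibly into the signature, which is why auditing is so hard.
bad = make_sas("rwl", time.time() + 30 * 365 * 24 * 3600)
```

The point of the sketch: once an over-broad, long-lived token is published, the server cannot tell a legitimate request from an attacker's, because the signature itself is the authorization.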

We all strive to be careful in configuring cloud storage, but as the unending string of similar incidents shows, costly slip-ups are bound to happen sooner or later, especially in the absence of systematic guardrails to prevent them.

Does your organization use cloud storage? Here are my recommendations to minimize your risk of similar exposure.

How to minimize your risk of exposure
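If you must issue SAS URLs at all, scope them as narrowly as the use case allows. As a sketch, assuming the Azure CLI and hypothetical account and container names, a read-only SAS limited to a single container with a short expiry looks like this:

```shell
# Hypothetical account and container names; adjust to your environment.
# Grants read-only access to one container, expiring in 2 days, rather
# than read/write access to the entire storage account.
az storage container generate-sas \
  --account-name myaiaccount \
  --name public-models \
  --permissions r \
  --expiry "$(date -u -d '+2 days' '+%Y-%m-%dT%H:%MZ')" \
  --auth-mode key \
  --output tsv
```

Contrast this with the incident above, where the signature carried write permission over the whole account: the narrower the permissions and the shorter the expiry, the smaller the blast radius when a URL inevitably leaks.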
