Microsoft AI Researchers Expose 38TB of Sensitive Data via SAS Tokens
Microsoft says no customer data was exposed.
Microsoft artificial intelligence (AI) researchers accidentally exposed 38 terabytes of sensitive data, including private keys and passwords, according to Wiz.
The researchers did so while publishing a storage bucket of open-source training data on GitHub. The exposed data included disk backups of two employees’ workstations, which contained secrets, private keys, passwords, and more than 30,000 internal Microsoft Teams messages.
According to Wiz, the researchers shared their files using an Azure feature called Shared Access Signature (SAS) tokens, which allows users to grant access to data in Azure Storage accounts. A token’s access can be scoped to specific files only; in this case, however, the link was configured to share the entire storage account, private files included.
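The scope difference is visible in the token itself: per Azure’s documented SAS query parameters, an account-level SAS carries `ss` (signed services) and `srt` (signed resource types), while a narrower service SAS carries `sr` (e.g. `b` for a single blob). The sketch below, using hypothetical URLs patterned on those parameters, shows how the two kinds of link can be told apart.

```python
from urllib.parse import urlparse, parse_qs

def sas_scope(url: str) -> str:
    """Classify a SAS URL by the scope its query parameters imply.

    Account SAS tokens carry `ss` and `srt`; service SAS tokens carry
    `sr` (`b` = one blob, `c` = one container).
    """
    q = parse_qs(urlparse(url).query)
    if "ss" in q and "srt" in q:
        return "account"  # grants access across the whole storage account
    sr = q.get("sr", [""])[0]
    return {"b": "blob", "c": "container"}.get(sr, "unknown")

# Hypothetical example URLs (not the real exposed link).
blob_url = ("https://example.blob.core.windows.net/data/model.ckpt"
            "?sv=2021-06-08&sr=b&sp=r&se=2021-10-01T00:00:00Z&sig=x")
account_url = ("https://example.blob.core.windows.net/data"
               "?sv=2021-06-08&ss=b&srt=sco&sp=rl&se=2021-10-01T00:00:00Z&sig=x")

print(sas_scope(blob_url))     # scoped to a single file
print(sas_scope(account_url))  # scoped to the entire account
```

A token like the second one is what Wiz describes: instead of exposing only the intended training files, it opened the full account, backups and all.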