Microsoft's AI research division accidentally exposed 38 terabytes of sensitive internal data, starting in July 2020, while publishing open-source AI models to a public GitHub repository.
The exposure went unnoticed for almost three years until cloud security firm Wiz discovered it: Wiz researchers found that a Microsoft employee had inadvertently shared the URL of a misconfigured Azure Blob storage container holding the leaked data.
Microsoft traced the exposure to an excessively permissive Shared Access Signature (SAS) token, which granted full control over the shared files rather than read-only access. SAS tokens are an Azure feature for delegating access to storage data, but Wiz researchers described them as difficult to monitor and revoke: Azure keeps no record of issued account-level tokens, so one can only be invalidated by rotating the account key that signed it.
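For context on what such a misconfiguration looks like, here is a minimal sketch using Azure's Python SDK (azure-storage-blob). It contrasts an over-permissive, long-lived account SAS of the kind Wiz described with a narrowly scoped, short-lived read-only token; the account, container, and blob names are illustrative assumptions, not details recovered from the leak.

```python
# A minimal sketch with the azure-storage-blob Python SDK. The account name,
# key, container, and blob below are placeholders for illustration only.
from datetime import datetime, timedelta, timezone

from azure.storage.blob import (
    AccountSasPermissions,
    BlobSasPermissions,
    ResourceTypes,
    generate_account_sas,
    generate_blob_sas,
)

ACCOUNT_NAME = "examplestorage"        # placeholder
ACCOUNT_KEY = "<account-access-key>"   # placeholder; keep keys out of source

# Risky pattern: an account-wide SAS with full permissions and an expiry
# decades away. Anyone holding the URL can read, write, and delete every
# blob in the account, and the token cannot be revoked without rotating
# the account key that signed it.
risky_sas = generate_account_sas(
    account_name=ACCOUNT_NAME,
    account_key=ACCOUNT_KEY,
    resource_types=ResourceTypes(service=True, container=True, object=True),
    permission=AccountSasPermissions(
        read=True, write=True, delete=True, list=True
    ),
    expiry=datetime.now(timezone.utc) + timedelta(days=365 * 30),
)

# Safer pattern: read-only access to a single blob, expiring in one hour.
scoped_sas = generate_blob_sas(
    account_name=ACCOUNT_NAME,
    container_name="open-models",      # placeholder container
    blob_name="model-weights.bin",     # placeholder blob
    account_key=ACCOUNT_KEY,
    permission=BlobSasPermissions(read=True),
    expiry=datetime.now(timezone.utc) + timedelta(hours=1),
)

# Either token is appended to the storage URL as a query string, e.g.:
#   https://examplestorage.blob.core.windows.net/open-models/model-weights.bin?<scoped_sas>
```

The scoped token limits both what can be accessed and for how long, which matters precisely because an issued SAS token cannot be individually revoked.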
Read more: https://www.bleepingcomputer.com/news/microsoft/microsoft-leaks-38tb-of-private-data-via-unsecured-azure-storage/