Microsoft has confirmed a significant error by its artificial intelligence researchers that inadvertently exposed 38 terabytes of data, including private keys, passwords, and internal Microsoft Teams conversations from hundreds of employees.
A report published on Monday by cloud security company Wiz disclosed that the exposed data included complete backups of two employees' computer systems. Wiz researchers Hillai Ben-Sasson and Ronny Greenberg detailed the incident in the report.
“Our scan shows that this account contained 38TB of additional data, including Microsoft employees’ personal computer backups,” Ben-Sasson and Greenberg said. “The backups contained sensitive personal data, including passwords to Microsoft services, secret keys, and over 30,000 internal Microsoft Teams messages from 359 Microsoft employees.”
In a blog post, Microsoft confirmed that an employee accidentally exposed internal data in a public GitHub repository. The exposed data included backups of two former employees' workstations and internal Teams messages.
“No customer data was exposed, and no other internal services were put at risk because of this issue. No customer action is required in response to this issue,” Microsoft’s Security Response Center said.
A data leak is the inadvertent or unauthorized disclosure of sensitive information to individuals who are not permitted to access it. Such leaks can carry significant repercussions both for the organizations responsible and for the individuals whose data is compromised.