Microsoft Error Exposes Confidential Emails to AI Tool Copilot

A configuration error at Microsoft allowed its AI assistant Copilot to access and surface confidential internal emails, raising fresh concerns about the security implications of integrating artificial intelligence into enterprise software.

The company acknowledged the issue after cybersecurity researchers flagged it, saying the problem had been addressed and that it “did not provide anyone access to information they weren’t already authorised to see.”

However, security experts disputed Microsoft’s characterization, noting that the error effectively gave Copilot access to email threads that were restricted to specific individuals and departments.

“This is exactly the kind of incident that enterprise security teams have been warning about,” said Bruce Schneier, a prominent cybersecurity researcher. “When you give an AI system broad access to organizational data, the blast radius of any misconfiguration is enormous.”

The incident has reignited debate about the appropriate level of data access for AI tools in corporate environments, with several Fortune 500 companies reportedly pausing their Copilot deployments pending security reviews.
