Microsoft has admitted that a coding bug accidentally allowed Copilot Chat to access and summarize confidential emails.
As BleepingComputer reports, the flaw bypasses data loss prevention (DLP) policies enabled by customers who want to keep their data shielded from Microsoft’s AI. The issue, first reported on Jan. 21, affects the Work tab of Copilot Chat, a feature that began rolling out to Microsoft 365 business users via Word, Excel, PowerPoint, Outlook, and OneNote in September.
Microsoft has traced the problem to a code issue in Copilot. "A code issue is allowing items in the sent items and draft folders to be picked up by Copilot even though confidential labels are set in place," the company tells BleepingComputer.
Users can mark their files and emails as sensitive, or let Microsoft 365 do it automatically. Once the label is applied, Microsoft is supposed to keep the data “compliant with your organization's information protection policies.”
A fix began rolling out earlier this month, though Microsoft hasn’t said when the issue will be fully resolved. The company is still monitoring the fix and reaching out to affected customers to confirm it is working. The number of organizations hit by the bug is unclear, but the UK’s National Health Service (NHS) appears to be among them.
Microsoft’s integration of AI features into its products has been anything but smooth. Features like Windows Recall and Copilot Vision have raised privacy concerns, and the company is also reportedly planning to scale back Copilot across Windows 11 apps.
About Our Expert
Jibin is a tech news writer based out of Ahmedabad, India. Previously, he served as the editor of iGeeksBlog and is a self-proclaimed tech enthusiast who loves breaking down complex information for a broader audience.