Microsoft 365's buggy Copilot 'Chat' has been summarizing confidential emails for a month — yet another AI privacy nightmare
A recent Microsoft 365 Copilot Chat bug analyzed and summarized confidential emails, raising more concerns about AI.
Generative AI has evolved beyond chatbots that merely generate text and images from prompts. The technology is now more sophisticated, and its impact is already visible in the job market, with executives at top AI labs, including Microsoft AI CEO Mustafa Suleyman and Anthropic CEO Dario Amodei, claiming it could start phasing out entry-level white-collar jobs within the next 18 months.
However, AI hasn't quite hit its prime, partly because some users remain reluctant to adopt it into their workflows, citing privacy and security concerns. The latest example comes from Microsoft itself, which has identified a bug in Microsoft 365 Copilot Chat that has allowed the service to access and summarize confidential emails without consent since late January.
According to Microsoft: "The Microsoft 365 Copilot 'work tab' Chat is summarizing email messages even though these email messages have a sensitivity label applied and a DLP policy is configured. Users' email messages with a confidential label applied are being incorrectly processed by Microsoft 365 Copilot chat." (via BleepingComputer).
Microsoft began rolling out Copilot Chat across its productivity tools, including Word, Excel, PowerPoint, Outlook, and OneNote, for users with a Microsoft 365 license, aiming to make Copilot a personal AI assistant for work.
The bug, tracked as CW1226324, allowed Copilot Chat to incorrectly access, read, and summarize emails not only in a user's inbox but also in their Sent Items and Drafts folders, including messages explicitly labeled confidential precisely to restrict processing by automated tools.
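For the curious, here's a minimal sketch of how you could audit your own Sent Items and Drafts folders, the same folders Microsoft says the bug touched, for messages marked confidential via the Microsoft Graph API. It assumes you already have a Graph access token with the Mail.Read scope (the ACCESS_TOKEN placeholder below). One caveat: the Purview sensitivity labels at issue here are a separate mechanism from Outlook's older per-message sensitivity field, which is what Graph exposes directly on messages, so treat this as a rough proxy rather than a definitive check.

```python
# Minimal sketch: list messages in Sent Items and Drafts that carry the
# legacy "confidential" sensitivity marking, via Microsoft Graph.
# ACCESS_TOKEN is a placeholder; obtain a real delegated token via MSAL or similar.
import requests

ACCESS_TOKEN = "<your-token-here>"
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
GRAPH = "https://graph.microsoft.com/v1.0"

def confidential_messages(folder: str):
    """Yield (subject, sensitivity) for confidential items in a well-known folder."""
    url = f"{GRAPH}/me/mailFolders/{folder}/messages?$select=subject,sensitivity&$top=50"
    while url:
        resp = requests.get(url, headers=HEADERS)
        resp.raise_for_status()
        data = resp.json()
        for msg in data.get("value", []):
            if msg.get("sensitivity") == "confidential":
                yield msg.get("subject"), msg["sensitivity"]
        url = data.get("@odata.nextLink")  # follow pagination, if any

for folder in ("sentitems", "drafts"):  # Graph well-known folder names
    for subject, sensitivity in confidential_messages(folder):
        print(f"[{folder}] {subject!r} is marked {sensitivity}")
```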
Microsoft has narrowed the issue down to an unspecified code error and says it began rolling out a fix at the beginning of February 2026. "A code issue is allowing items in the sent items and draft folders to be picked up by Copilot even though confidential labels are set in place," the software giant added.
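Microsoft hasn't published the faulty code, but conceptually this reads like a classic filtering slip: a confidentiality guard has to apply to every folder an AI assistant indexes, not just the inbox. Here's a purely hypothetical Python sketch (the Message type and eligible_for_copilot function are my own illustration, not Microsoft's code) of what a correct guard might look like:

```python
# Purely hypothetical illustration; not Microsoft's actual code.
# The point: exclude labeled content in *every* folder, not just the inbox.
from dataclasses import dataclass

@dataclass
class Message:
    folder: str                    # e.g. "inbox", "sentitems", "drafts"
    subject: str
    has_confidential_label: bool   # sensitivity label or DLP policy match

def eligible_for_copilot(msg: Message) -> bool:
    """A message may be summarized only if no confidentiality control applies."""
    # The reported bug behaved as if this check skipped some folders;
    # the guard must apply regardless of msg.folder.
    return not msg.has_confidential_label

msgs = [
    Message("inbox", "Lunch plans", False),
    Message("sentitems", "Q3 layoffs draft", True),
    Message("drafts", "Merger memo", True),
]
print([m.subject for m in msgs if eligible_for_copilot(m)])  # ['Lunch plans']
```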
The company says it is monitoring the situation closely and reaching out to affected users to confirm the fix is working. It's still unclear when the rollout will be complete, and Microsoft hasn't disclosed how many users were impacted.
So, my take: AI might look like a productivity booster, taking the grunt work out of repetitive and redundant tasks, but it still feels too half-baked to be trusted fully in our workflows, where a high standard of professionalism (and confidentiality) is non-negotiable.
Over to you
Microsoft admits Copilot exposed confidential emails due to a bug. Do you still trust AI? Let me know in the comments.
Join us on Reddit at r/WindowsCentral to share your insights and discuss our latest news, reviews, and more.

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya, covering the latest trends and developments in the industry for Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.
