Microsoft 365's buggy Copilot 'Chat' has been summarizing confidential emails for a month — yet another AI privacy nightmare

Microsoft has identified the new bug in 365 Copilot. (Image credit: Getty Images | Thomas Trutschel)

Generative AI has evolved beyond mere chatbots that generate text and images from prompts. The technology is now more sophisticated, and its impact is already visible in the job market, with executives at top AI labs, including Microsoft AI CEO Mustafa Suleyman and Anthropic CEO Dario Amodei, claiming it could start phasing out entry-level white-collar jobs as soon as 18 months from now.

However, AI hasn't fully hit its stride, partly because some users remain reluctant to adopt it into their workflows, citing privacy and security concerns. The latest example comes from Microsoft itself, which has identified a bug in Microsoft 365 Copilot Chat that has allowed the service to access and summarize confidential emails without consent since late January.

Microsoft has traced the issue to an unspecified code error and says it began rolling out a fix at the beginning of February 2026. "A code issue is allowing items in the sent items and draft folders to be picked up by Copilot even though confidential labels are set in place," the software giant added.

The company is monitoring the situation closely and is reaching out to affected users to confirm whether the fix is working. It's still unclear when Microsoft will complete the rollout, and the company hasn't disclosed how many users were impacted by the issue.

So, my take is that AI can seem like a genuine productivity booster, taking the grunt work out of repetitive and redundant tasks, but it still feels too half-baked to be trusted with workflows that demand a high standard of professionalism (and confidentiality).

Over to you

Microsoft admits Copilot exposed confidential emails due to a bug. Do you still trust AI? Let me know in the comments.


Join us on Reddit at r/WindowsCentral to share your insights and discuss our latest news, reviews, and more.


Kevin Okemwa
Contributor

Kevin Okemwa is a seasoned tech journalist based in Nairobi, Kenya, with years of experience covering the latest trends and developments in the industry at Windows Central. With a passion for innovation and a keen eye for detail, he has written for leading publications such as OnMSFT, MakeUseOf, and Windows Report, providing insightful analysis and breaking news on everything revolving around the Microsoft ecosystem. While AFK and not busy following the ever-emerging trends in tech, you can find him exploring the world or listening to music.
