
Microsoft Copilot bug exposed customers' confidential emails for four weeks, including UK NHS data

A bug (CW1226324) allowed Microsoft Copilot Chat to read and summarize customers' confidential emails without permission for approximately four weeks (January 21 to mid-February 2026). Emails marked with confidentiality labels and protected by DLP policies were incorrectly processed across Word, Excel, and PowerPoint. Affected organizations included the UK's National Health Service. Microsoft did not disclose the number of affected customers or what data was accessed. This was the second trust boundary violation in eight months, following CVE-2025-32711 'EchoLeak' in June 2025 (CVSS 9.3).

Scoring Impact

Topic           Direction   Relevance   Contribution
AI Safety       against     secondary   -0.50
Data Security   against     primary     -1.00
User Privacy    against     primary     -1.00

Overall incident score = -0.357

Score = avg(topic contributions) × significance (high, ×1.5) × confidence (0.57) × agency (negligent, ×0.5)
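The formula above can be sketched as a small calculation. This is an illustrative reconstruction from the factors listed in this report, not the tracker's actual code; the function name and structure are assumptions.

```python
def incident_score(contributions, significance, confidence, agency):
    """avg(topic contributions) x significance x confidence x agency."""
    avg = sum(contributions) / len(contributions)
    return avg * significance * confidence * agency

score = incident_score(
    contributions=[-0.50, -1.00, -1.00],  # AI Safety, Data Security, User Privacy
    significance=1.5,                     # high
    confidence=0.57,
    agency=0.5,                           # negligent
)
print(round(score, 3))
```

With the rounded factors shown, this yields -0.356 rather than the reported -0.357, suggesting the published confidence value (0.57) is itself rounded from a more precise figure.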

Evidence (1 signal)

Confirms · product_decision · Feb 18, 2026 · documented

TechCrunch reported Microsoft Copilot bug exposed confidential emails for four weeks including NHS data

TechCrunch reported on February 18, 2026 that a Microsoft Copilot Chat bug (CW1226324) allowed the AI to read and summarize confidential emails without permission for approximately four weeks. Affected organizations included the UK's National Health Service. This was the second trust boundary violation in eight months.
