Microsoft 365 Copilot Security Bug: When AI Becomes a Confidentiality Nightmare
Here’s something that should make you check your email security settings right now. Microsoft just patched a bug in Microsoft 365 Copilot that leaked confidential emails for six straight weeks. If your company uses sensitivity labels like “confidential” or “sensitive” on emails, the AI assistant was showing those messages to people who had no business reading them. The bug ran from January 21, 2026 until early March. Forty-two days. Legal contracts. HR records. Financial data. Medical information. All accessible to anyone who asked Copilot the right question. This wasn’t some theoretical risk found in a lab. Real companies got hit. Real people saw information they had no clearance for.
How the Microsoft 365 Copilot Security Bug Actually Worked
Let me break down what happened in plain English, folks. Microsoft 365 has a feature where you can mark emails as “confidential” or “sensitive.” When you do that, the system is supposed to restrict who can read those messages. It’s like putting a lock on certain file cabinets in your office.
Enter Copilot, Microsoft’s AI assistant that’s supposed to help you work faster by answering questions about your documents and emails. Someone asks Copilot a question, and instead of respecting those “confidential” locks, it pulls information from emails they don’t have clearance to read. The bug bypassed the security controls completely.
Imagine you’re working in sales and you ask Copilot “What’s the status of the Johnson account?” Copilot might helpfully pull information from a confidential HR email about Johnson’s pending termination, his salary details, or his medical leave. Information you were never supposed to see. The AI just… gave it to you. Because as far as Copilot was concerned, if it could see it, you could see it.
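To make the failure concrete, here’s a minimal sketch in Python. This is not Microsoft’s code, and every name in it (the `Email` class, the `retrieve` function, the `enforce_labels` flag) is hypothetical. It just models the class of bug described above: a retrieval step that is supposed to filter results by the requester’s permissions before the AI ever sees them, and what happens when that check gets skipped.

```python
# Illustrative sketch only -- not Microsoft's implementation.
# Models a retrieval step that should respect sensitivity labels.
from dataclasses import dataclass, field

@dataclass
class Email:
    subject: str
    body: str
    label: str = "general"  # e.g. "general" or "confidential"
    allowed_readers: set = field(default_factory=set)

def retrieve(emails, user, enforce_labels=True):
    """Return the emails the assistant may quote when answering `user`.

    With enforce_labels=True (correct behavior), confidential emails go
    only to users on the allowed-reader list. With it False, the check
    is skipped -- the failure mode the article describes.
    """
    results = []
    for e in emails:
        if enforce_labels and e.label == "confidential" \
                and user not in e.allowed_readers:
            continue  # respect the "lock" on the file cabinet
        results.append(e)
    return results

inbox = [
    Email("Team lunch", "Friday at noon."),
    Email("Johnson termination", "Pending. Salary and leave details.",
          label="confidential", allowed_readers={"hr_director"}),
]

safe = retrieve(inbox, "junior_associate")                        # check enforced
leaky = retrieve(inbox, "junior_associate", enforce_labels=False)  # check bypassed
```

With the check enforced, the junior associate only sees the lunch email; with it bypassed, the confidential termination email comes back too. The point of the sketch is that the label itself never changed, only whether the filter between the data store and the AI was actually applied.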
Real Companies Got Hit By This
I talked to a financial advisor in Portsmouth last week who uses Microsoft 365 for everything. She marks client emails “confidential” as a matter of routine. Account numbers. Social Security info. Investment strategies. Retirement account balances. The stuff you expect to be kept private.
During those six weeks, her junior associate was using Copilot to search for client information. The associate later mentioned details from a high-net-worth client’s estate planning email. An email the associate never should have had access to. That’s when the advisor realized something was very wrong. She spent the next three days manually auditing who accessed what, trying to figure out the scope of the exposure.
A law firm in Boston had a similar problem. An attorney working on a merger asked Copilot to summarize recent communications with the client. Copilot pulled information from confidential attorney-client privileged emails that were above the attorney’s clearance level. Those emails contained negotiation strategies and financial details that could have torpedoed the entire deal if they’d leaked. The firm caught it before any real damage was done, but only because someone noticed Copilot quoting information that shouldn’t have been accessible.
The Story Behind ForwardToSafety
This whole thing reminds me why I built ForwardToSafety in the first place. My father fell for a phishing email a few years back. Here’s the thing: I’ve been doing cybersecurity for 50 years. I present for the FBI InfraGard. I’ve protected my clients from ransomware with a perfect track record. And my own dad still got hit.
The scammers got remote access to his computer and started searching through his files looking for anything financial. My stepmother noticed something weird and called me. I was able to connect remotely and shut them down before they found the spreadsheet with all the bank account credentials. We were lucky. We caught it in time.
After that, I asked myself one question: What would I build if the person I was protecting was my father? The answer was ForwardToSafety. No complicated software. No training courses. Just forward a suspicious email to try@forwardtosafety.com and get a plain-English verdict in about 47 seconds. Safe. Suspicious. Or Dangerous.
This Copilot bug is a different kind of threat, but it’s the same problem. Technology is supposed to protect us. When it fails, real people lose real money. Systems fail. Bugs happen. You need a backup plan.
Why This Matters Even If You Don’t Use Microsoft 365
Maybe you’re thinking “I don’t work for a big company. I’m retired. This doesn’t affect me.” Wrong. Here’s why this bug matters to everyone.
First, your doctor probably uses Microsoft 365. So does your financial advisor. Your lawyer. Your Medicare coordinator. When you email them about your retirement accounts, your medical records, your estate planning, those emails get stored in their system. If they were using Copilot, your information might have been accessible to people who shouldn’t have seen it.
Second, this shows how AI creates new security holes in systems that were already locked down. Microsoft has some of the best security engineers in the world. They still missed it. Every company adding AI to their products is creating similar risks. Your bank. Your investment firm. Your insurance company. They’re all racing to add AI assistants. Each one is a potential bug waiting to happen.
Third, if someone at your doctor’s office accessed your information during those six weeks, you might never know. There’s generally no requirement to notify you unless the exposure rises to a confirmed, reportable breach. The bug made information accessible, but if nobody outside the company is known to have seen it, they don’t have to tell you.
What Actually Got Exposed
Let’s talk about what “confidential” emails typically contain, folks. We’re not talking about your fantasy football league scores. We’re talking about:
Retirement account information. Account numbers. Balance statements. Withdrawal strategies. Tax planning. Everything you’ve spent decades building. One leaked email could give a hoser everything they need to impersonate you to your financial institution.
Medical records. Test results. Diagnoses. Treatment plans. Medication lists. Medicare numbers. Information that’s protected by HIPAA specifically because it’s private and could be used for identity theft or discrimination.
Legal documents. Estate planning. Property transactions. Business contracts. Divorce proceedings. Anything you’ve ever told a lawyer in confidence, assuming that “attorney-client privilege” meant something.
HR and employment records. Salary information. Performance reviews. Disciplinary actions. Layoff plans. Company financials. M&A negotiations. All the stuff that could tank a career or a stock price if it leaked.
This bug didn’t just expose “data.” It exposed the kind of information that ruins lives.
What You Can Do Right Now
If you use Microsoft 365 at work or for personal email, check with your IT person or email provider to confirm they’ve applied Microsoft’s security patch. Don’t assume it happened automatically. Ask directly. Get confirmation. If they can’t confirm it, that’s a red flag.
If you handle financial information, medical records, or legal documents over Microsoft 365 email, consider changing passwords on any accounts that might have been discussed in those messages. Yes, even if you don’t know for sure whether anything was accessed. Better safe than sorry when we’re talking about your retirement savings or medical privacy.
Just because an email is marked “confidential” doesn’t mean the system keeps it that way. We just learned that. If information is truly sensitive, consider whether email is the right way to share it at all. Phone calls don’t leave a searchable record. Neither do in-person conversations. Sometimes the old ways are more secure.
One More Thing: Check Your Inbox
Odds are there’s a suspicious email sitting in your inbox right now. Maybe it’s from your bank asking you to verify your account. Maybe it’s from Microsoft warning you about a security issue. Maybe it’s from a delivery company saying they couldn’t deliver a package.
Here’s the thing: some of those emails are real. Some of them are phishing attacks designed to steal your information. And AI has made it nearly impossible to tell the difference just by reading them.
Before you click anything, forward that email to try@forwardtosafety.com. You’ll get a verdict in about 47 seconds. Safe. Suspicious. Or Dangerous. No signup. No app. No software to install. Just forward and know.
Because whether it’s a Copilot bug or a phishing email pretending to warn you about one, the hosers are always looking for new ways to get to your information. Don’t make it easy for them.
Want Weekly Security Updates Like This?
Sign up for free at CraigPeterson.com. I’ll send you practical, no-nonsense advice every week on how to protect your retirement savings, your personal information, and your independence from online threats. No jargon. No hype. Just straight talk about real risks and real solutions.