The role of healthcare privacy manager has never been more complex. Five years ago, your biggest concerns were probably lost laptops and nosy employees. Today, you're navigating artificial intelligence tools that your clinical staff found on their own, cloud applications that multiply faster than you can vet them, and threat actors who've figured out that healthcare data is worth more than credit card numbers on the dark web.

I've spent the last few months talking to privacy managers at practices of all sizes, and the same themes keep coming up. The landscape has shifted, and the old playbook isn't enough anymore. Here are five things that should be on every healthcare privacy manager's radar in 2026.

1. AI Tools Are Already in Your Organization — Whether You Know It or Not

Let's start with the elephant in the room. Generative AI tools like ChatGPT, Claude, Gemini, and dozens of healthcare-specific applications have exploded in popularity. And here's the uncomfortable truth: your staff is probably already using them.

A physician wants to draft a patient letter faster. A billing specialist needs help with a complex coding question. A nurse is trying to understand a rare diagnosis. The temptation to paste that information into an AI chatbot is enormous — and many people don't realize the privacy implications.

Under HIPAA, entering PHI into an AI tool can constitute a disclosure to that tool's vendor. If that AI tool doesn't have a Business Associate Agreement with your organization, you've got a problem. And most consumer AI tools explicitly state in their terms of service that they don't sign BAAs.

What you need to do:

Inventory AI usage — Survey your departments. Find out what tools people are using. You can't manage what you don't know about. (A simple log-scanning sketch follows this list.)

Create an AI acceptable use policy — Be specific about what's allowed and what isn't. "Don't put patient information in ChatGPT" needs to be in writing.

Vet healthcare-specific AI tools — Some vendors offer HIPAA-compliant AI solutions with proper BAAs. If your clinicians and staff want to use AI (and they will), give them a sanctioned option.

Train relentlessly — Your workforce needs to understand that "helpful" and "compliant" aren't always the same thing.
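If your firewall, secure web gateway, or DNS filter can export logs, even a rough script can jump-start the inventory. Below is a minimal sketch in Python; the CSV columns and the domain list are illustrative assumptions, so swap in whatever your environment actually produces and maintain your own list of AI services.

```python
"""Minimal sketch: flag traffic to generative AI services in a proxy/DNS log export.

Assumptions (adjust for your environment):
  - the log is a CSV export with 'timestamp', 'user', and 'domain' columns
  - AI_DOMAINS is an illustrative, incomplete starting list
"""
import csv
from collections import Counter

# Illustrative list only; maintain your own based on the tools your staff mention.
AI_DOMAINS = {
    "chat.openai.com", "chatgpt.com", "claude.ai",
    "gemini.google.com", "copilot.microsoft.com",
}

def ai_usage_summary(log_path: str) -> Counter:
    """Count hits to known AI domains, grouped by user and domain."""
    hits = Counter()
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            domain = row.get("domain", "").strip().lower()
            if any(domain == d or domain.endswith("." + d) for d in AI_DOMAINS):
                hits[(row.get("user", "unknown"), domain)] += 1
    return hits

if __name__ == "__main__":
    for (user, domain), count in ai_usage_summary("proxy_log.csv").most_common():
        print(f"{user:20s} {domain:30s} {count}")
```

A report like this won't tell you what was typed into the tools, but it tells you who to talk to, which is where the survey and the acceptable use policy come in.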

HHS has been increasingly focused on AI in healthcare. Its guidance emphasizes that covered entities remain responsible for PHI regardless of what technology processes it. The tool might be new, but the obligation isn't.

2. Your Cloud Footprint Has Probably Tripled — And Your BAA Coverage Hasn't Kept Up

Remember when "the cloud" meant maybe your EHR and an email provider? Those days are long gone.

Today's average healthcare practice uses cloud services for scheduling, telehealth, patient communication, document storage, faxing (yes, still faxing), billing, analytics, and a dozen other functions. Each one of those services potentially touches PHI. Each one needs a Business Associate Agreement.

The HealthIT.gov guidance on cloud computing is clear: cloud service providers that create, receive, maintain, or transmit PHI on behalf of a covered entity are business associates. No exceptions for "it's just temporary storage" or "we only use it for backups."

Here's what I see all the time: a practice signed a BAA with their main cloud provider three years ago. Since then, they've added six more cloud services. Someone in billing signed up for a productivity tool. The marketing team started using a cloud-based design platform to create patient education materials. Nobody thought to involve compliance.

The real risk:

When a breach occurs — and with cloud services, "when" is more accurate than "if" — you need to be able to demonstrate that you had appropriate agreements in place. If OCR comes knocking and you can't produce a BAA for a service that was storing PHI, you're facing potential penalties for the breach and for the missing agreement.

What you need to do:

Conduct a cloud audit — Work with IT to identify every cloud service in use. Check credit card statements if you have to. Shadow IT is real.

Map data flows — For each service, determine if PHI could reasonably end up there. Be conservative in your assessment.

Verify BAA coverage — Check that you have current, signed BAAs for every service that touches PHI. Note expiration dates. (A small tracking sketch follows this list.)

Establish a vetting process — No new cloud service gets implemented without privacy review. Make this part of your procurement workflow.
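You don't need fancy software to track this; a spreadsheet plus a small script will do. Here's a minimal sketch that flags gaps, assuming a vendor inventory CSV with the column names shown in the docstring (those names are my assumption, not a standard, so match them to your own file).

```python
"""Minimal sketch: flag gaps in BAA coverage from a vendor inventory.

Assumes a CSV ('vendors.csv') with columns:
  vendor, touches_phi (yes/no), baa_on_file (yes/no), baa_expires (YYYY-MM-DD or blank)
Column names and format are illustrative; match them to your own inventory.
"""
import csv
from datetime import date, datetime, timedelta

WARN_WINDOW = timedelta(days=90)  # surface BAAs expiring within 90 days

def review_vendors(path: str) -> None:
    today = date.today()
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row.get("touches_phi", "").strip().lower() != "yes":
                continue  # only services that touch PHI need a BAA
            vendor = row.get("vendor", "unknown")
            if row.get("baa_on_file", "").strip().lower() != "yes":
                print(f"MISSING BAA: {vendor}")
                continue
            expires = row.get("baa_expires", "").strip()
            if expires:
                exp_date = datetime.strptime(expires, "%Y-%m-%d").date()
                if exp_date < today:
                    print(f"EXPIRED BAA: {vendor} (expired {exp_date})")
                elif exp_date - today <= WARN_WINDOW:
                    print(f"EXPIRING SOON: {vendor} ({exp_date})")

if __name__ == "__main__":
    review_vendors("vendors.csv")
```

Run it monthly and the "we added six cloud services and nobody told compliance" problem at least gets caught before OCR catches it for you.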

3. Data Breach Notification Rules Are Being Enforced More Aggressively Than Ever

If you've been following the HHS Breach Portal — sometimes called the "Wall of Shame" — you've noticed the numbers climbing. Healthcare data breaches affected over 133 million individuals in 2023 alone, and 2024 and 2025 continued the trend.

What's changed isn't just the volume of breaches. It's how aggressively OCR is pursuing enforcement, particularly around breach notification requirements.

The HIPAA Breach Notification Rule requires covered entities to notify affected individuals without unreasonable delay, and in no case later than 60 days after discovering a breach. For breaches affecting 500 or more individuals, you also need to notify HHS within that same window and alert prominent media outlets serving the affected state or jurisdiction. These aren't suggestions — they're legal requirements with real penalties for non-compliance.
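When a breach hits, the dates should not be something you work out on a whiteboard under pressure. Here's a minimal sketch of a deadline calculator, assuming the 60-day outer limit runs in calendar days from discovery; confirm the specifics, including any stricter state timelines, with counsel.

```python
"""Minimal sketch: work out notification deadlines and obligations for a breach.

Assumes the 60-day outer limit runs in calendar days from the date of discovery.
State law may impose shorter timelines; this only reflects the federal rule.
"""
from datetime import date, timedelta

def notification_plan(discovered: date, affected_count: int) -> dict:
    deadline = discovered + timedelta(days=60)
    return {
        "individual_notice_no_later_than": deadline,
        "notify_hhs": "within 60 days of discovery" if affected_count >= 500
                      else "annual log, within 60 days after the end of the calendar year",
        "media_notice_required": affected_count >= 500,
    }

if __name__ == "__main__":
    for item, value in notification_plan(date(2026, 3, 1), affected_count=750).items():
        print(f"{item}: {value}")
```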

Recent enforcement actions have targeted organizations not for the breach itself, but for how they handled the aftermath. Delayed notifications, inadequate risk assessments, failure to notify all affected individuals — these failures are drawing seven-figure penalties.

What's catching organizations off guard:

The 60-day clock starts at discovery — Not when you've completed your investigation. Not when legal signs off. Discovery.

"Discovery" is broadly interpreted — If a reasonably diligent organization would have found the breach, you're deemed to have discovered it, even if you actually didn't.

Business associate breaches are your problem too — When your BA gets breached, your notification obligations kick in.

State laws may be stricter — Many states have breach notification requirements that are more demanding than HIPAA. You need to comply with both.

What you need to do:

Have an incident response plan ready — Not a template sitting in a drawer. An actual, tested plan with assigned roles and contact information.

Know your notification templates — Draft notification letters in advance. When a breach happens, you won't have time to start from scratch.

Document everything — Your risk assessment, your decision-making process, your notification timeline. If OCR investigates, documentation is your defense.

Test your response — Run tabletop exercises. Find the gaps before a real incident exposes them.

4. Paste Sites and Data Dumps Are Exposing Healthcare Organizations Daily

Here's something that doesn't get enough attention: the proliferation of paste sites, dark web marketplaces, and public data dumps as vectors for healthcare data exposure.

Sites like Pastebin, GitHub (when misused), and their countless imitators have become repositories for stolen data. Threat actors who breach healthcare organizations often dump samples of the data on these sites — sometimes to prove they have it, sometimes to pressure organizations into paying ransoms, sometimes just for notoriety.

But it's not always malicious actors. Sometimes it's your own people.

A developer testing code pastes a database connection string that includes credentials. An IT admin shares a configuration file that contains patient information. A researcher uploads a dataset they thought was de-identified but wasn't. These accidental exposures happen constantly, and they can sit on public sites for months before anyone notices.

The Cybersecurity and Infrastructure Security Agency (CISA) has repeatedly warned about credential exposure through paste sites and code repositories. For healthcare organizations, the risk extends beyond credentials to actual patient data.
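One cheap layer of defense is a scan that runs before code or configuration files leave your environment, for example as a pre-commit or pre-upload check. The sketch below uses a few illustrative regular expressions; these patterns are my assumptions, they will miss plenty, and they're meant to supplement, not replace, a proper DLP or secret-scanning tool.

```python
"""Minimal sketch: scan files for obvious credentials or patient identifiers
before they leave your environment (e.g., as a pre-commit or pre-upload check).

The patterns are illustrative and incomplete; a real DLP or secret-scanning
tool should back this up.
"""
import re
import sys

PATTERNS = {
    "possible SSN": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "possible password/key assignment": re.compile(
        r"(?i)\b(password|passwd|secret|api[_-]?key)\s*[:=]\s*\S+"),
    "possible connection string with credentials": re.compile(
        r"(?i)\b\w+://\w+:\S+@\S+"),
}

def scan(path: str) -> list[str]:
    findings = []
    with open(path, errors="ignore") as f:
        for lineno, line in enumerate(f, start=1):
            for label, pattern in PATTERNS.items():
                if pattern.search(line):
                    findings.append(f"{path}:{lineno}: {label}")
    return findings

if __name__ == "__main__":
    hits = [finding for p in sys.argv[1:] for finding in scan(p)]
    print("\n".join(hits) or "no findings")
    sys.exit(1 if hits else 0)  # non-zero exit blocks the commit/upload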

What you need to do:

Monitor for exposure — Services exist that scan paste sites and dark web forums for your organization's data. Consider implementing one.

Educate developers and IT staff — Make sure anyone who handles code or configurations understands the risks of public sharing.

Implement data loss prevention — DLP tools can detect and block PHI from being uploaded to unauthorized sites.

Review your de-identification practices — If you share datasets for research or analytics, make sure they're truly de-identified per HIPAA's Safe Harbor or Expert Determination methods. (A partial Safe Harbor sketch follows this list.)

Have a response plan for discovered exposures — If you find your data on a paste site, you need to act fast. Know who to contact and what steps to take.
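To make the de-identification point concrete, here's a sketch of part of the Safe Harbor approach: dropping direct identifiers and generalizing dates, ages, and ZIP codes. It covers only a subset of the 18 Safe Harbor identifiers, the field names are assumptions about your data model, and even the three-digit ZIP rule has population-based exceptions, so treat this as an illustration, not a compliance tool.

```python
"""Partial illustration of the Safe Harbor method: strip direct identifiers
and generalize dates, ages, and ZIP codes.

Covers only a subset of the 18 Safe Harbor identifiers; field names are
assumptions about your data model.
"""
DIRECT_IDENTIFIER_FIELDS = {
    "name", "street_address", "phone", "fax", "email", "ssn",
    "mrn", "health_plan_id", "account_number", "license_number",
    "device_id", "url", "ip_address", "photo",
}

def deidentify(record: dict) -> dict:
    out = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIER_FIELDS or value is None:
            continue                      # drop direct identifiers outright
        if field == "zip":
            out["zip3"] = str(value)[:3]  # first three digits only (with exceptions)
        elif field.endswith("_date"):
            out[field[:-5] + "_year"] = str(value)[:4]  # keep the year only
        elif field == "age":
            out["age"] = "90+" if int(value) >= 90 else value  # aggregate 90 and over
        else:
            out[field] = value
    return out

if __name__ == "__main__":
    print(deidentify({
        "name": "Jane Doe", "mrn": "12345", "zip": "02139",
        "admit_date": "2025-11-03", "age": 93, "diagnosis": "I10",
    }))
```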

5. Mobile Device Management Isn't Optional Anymore — It's Essential

Mobile devices have been a HIPAA concern for over a decade. So why is this still on the list for 2026? Because the problem has gotten worse, not better.

The pandemic permanently changed how healthcare is delivered. Telehealth exploded. Remote work became normal. Clinicians started using personal devices to access patient information from home, from their cars, from coffee shops. And many of those behaviors stuck around after the emergency declarations ended.

The HHS guidance on mobile devices and health apps emphasizes that the same security requirements apply regardless of the device. A smartphone accessing your EHR needs the same protections as a desktop workstation in your office.

Yet I consistently see healthcare organizations with mature security programs for their on-premises infrastructure and almost nothing for mobile devices. No mobile device management (MDM) solution. No ability to remotely wipe a lost device. No visibility into what apps are installed or what data is being accessed.

The risk landscape:

Lost and stolen devices — A phone left in an Uber can expose thousands of patient records.

Unencrypted data — Many devices aren't encrypted by default, and users disable security features for convenience.

Malicious apps — Personal devices often have apps that access contacts, calendars, and files — including PHI.

Unsecured networks — Accessing PHI over public WiFi without a VPN is distressingly common.

Messaging apps — Clinicians texting about patients through iMessage, WhatsApp, or SMS. Convenient, yes. Compliant, no.

What you need to do:

Implement MDM — If devices access PHI, you need to manage them. Period. MDM solutions can enforce encryption, enable remote wipe, and restrict app installation. (A simple compliance-check sketch follows this list.)

Establish BYOD policies — If you allow personal devices, be explicit about the requirements. Users should understand that their personal phone becomes subject to organizational controls when it accesses PHI.

Provide secure communication tools — Give clinicians a compliant way to communicate. If you don't, they'll find their own way, and it won't be compliant.

Train on physical security — Don't leave devices unattended. Use strong passcodes. Report lost devices immediately. These basics prevent breaches.
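Once an MDM is in place, make its inventory work for you. Here's a minimal sketch that flags non-compliant devices from an exported inventory; the field names are assumptions, since every MDM reports these controls differently, so map them to whatever yours actually exports.

```python
"""Minimal sketch: flag non-compliant mobile devices from an MDM inventory export.

Assumes device records with 'encrypted', 'passcode_set', and 'os_up_to_date'
booleans plus an 'owner' field; real MDM exports differ, so map the field
names to whatever your MDM actually reports.
"""
REQUIRED_CONTROLS = ("encrypted", "passcode_set", "os_up_to_date")

def noncompliant_devices(devices: list[dict]) -> list[tuple[str, list[str]]]:
    problems = []
    for device in devices:
        missing = [c for c in REQUIRED_CONTROLS if not device.get(c)]
        if missing:
            problems.append((device.get("owner", "unknown"), missing))
    return problems

if __name__ == "__main__":
    inventory = [
        {"owner": "dr.smith", "encrypted": True, "passcode_set": True, "os_up_to_date": False},
        {"owner": "j.nguyen", "encrypted": False, "passcode_set": True, "os_up_to_date": True},
    ]
    for owner, missing in noncompliant_devices(inventory):
        print(f"{owner}: missing {', '.join(missing)}")
```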

The Common Thread

If you look at these five issues, there's a pattern: technology is evolving faster than policies and training can keep up. AI, cloud services, mobile devices, paste sites — these aren't new concepts, but their prevalence and risk profile in healthcare have changed dramatically.

The organizations that stay ahead of these challenges share some common characteristics:

• They treat compliance as an ongoing process, not an annual checkbox.

• They invest in visibility — you can't protect what you can't see.

• They make it easy for staff to do the right thing, with clear policies and sanctioned tools.

• They document everything, knowing that good documentation is their best defense.

• They test their assumptions through audits, tabletop exercises, and honest self-assessment.

Taking Action

Reading about these challenges is one thing. Doing something about them is another.

If your documentation hasn't been updated to address AI tools, cloud proliferation, or mobile device management, you've got gaps. If your incident response plan hasn't been tested in the last year, you don't know if it works. If you can't produce a current BAA for every vendor touching PHI, you're exposed.

At hipaa.app, we built a platform specifically to help healthcare organizations get their compliance documentation in order. Not generic templates — actual policies and procedures customized to your organization, your technology environment, and your specific risks.

The platform guides you through a comprehensive assessment, generates your documentation, and gives you the tools to maintain it over time. Most organizations complete the initial process in under 30 minutes.

Whether you use our platform or not, the important thing is to take action. The threat landscape isn't waiting for you to catch up.

Create a free account at hipaa.app and see where your compliance stands. No credit card required, no pressure — just clarity on what you need to do to protect your organization and your patients.

Because in 2026, "we didn't know" isn't an acceptable answer anymore.