The Health Data Privacy Gap Telehealth Apps Exploit — and How to Close It
Why Your Health Data May Be Less Protected Than You Think
When you download a telehealth app to talk to a doctor, check your symptoms, or manage a prescription, you probably assume your personal health information is protected. After all, there are laws for that — right?
The answer is more complicated than most people realize. A significant gap exists between what people expect and what the law actually requires. Many telehealth apps operate in a legal space where the nation’s main health privacy law simply does not apply. That gap is real, it is growing, and it puts millions of people at risk.
What HIPAA Actually Covers — and What It Doesn’t
The Health Insurance Portability and Accountability Act, commonly known as HIPAA, has been the backbone of health data protection in the United States since 1996. It sets rules for how medical information can be collected, stored, shared, and used. It applies to "covered entities" — health care providers, health plans, and health care clearinghouses — and to the "business associates" that handle health information on their behalf.
Here is the problem: HIPAA was written long before smartphones existed. The law was not designed with consumer health apps in mind. As a result, many telehealth companies and digital health platforms do not qualify as covered entities under HIPAA. They are not legally required to follow its rules.
This means a mental health app that stores your therapy notes, a period-tracking app that records reproductive health data, or a symptom checker that logs your medical concerns may be completely free to share or sell that information — without telling you, without your clear consent, and without breaking a single law.
The Telehealth Boom Made the Problem Bigger
The COVID-19 pandemic pushed millions of people toward telehealth practically overnight. Virtual doctor visits, online therapy, and app-based health monitoring became the new normal. That was largely a good thing — it expanded access to care for people in rural areas, people with disabilities, and those with limited time or transportation.
But the rapid growth of telehealth also attracted a wave of companies looking to profit from health data. Health information is incredibly valuable. It can be used to target advertisements, influence insurance decisions, and build consumer profiles that data brokers bundle and resell to anyone willing to pay.
Studies and investigative reports have found that many popular health apps send user data to third parties including Facebook, Google, and various data brokers. In some cases, this happens even when users have not given clear permission. The data shared can include sensitive details like diagnoses, medications, mental health status, and reproductive choices.
Real-World Examples of the Privacy Gap
This is not a hypothetical problem. Here are some documented examples of how telehealth privacy gaps have played out in real life:
- Mental health apps sharing therapy data: Several popular mental health platforms were found to be transmitting session data and user behavior information to advertising networks, even as they marketed themselves as private and secure.
- Reproductive health tracking: After the Supreme Court overturned Roe v. Wade in 2022, researchers and advocates raised urgent concerns about period-tracking apps storing data that could be used in legal proceedings. Many of these apps had privacy policies that allowed broad data sharing.
- FTC enforcement actions: The Federal Trade Commission has taken action against several health app companies for deceptive data practices — including its 2023 cases against GoodRx and BetterHelp over sharing user health data with advertising platforms — signaling that regulators are paying attention, but also revealing just how widespread the problem is.
- Prescription management platforms: Some companies that help users manage medications and refills were found to be sharing data with pharmacy benefit managers and marketing firms without clear user consent.
Why Healthcare Law Has Struggled to Keep Up
Lawmakers and regulators face a real challenge. Technology moves fast. Laws move slowly. By the time a regulation is written, debated, and passed, the technology it is meant to address has often changed dramatically.
HIPAA has not seen a major statutory update since the HITECH Act of 2009. While the Department of Health and Human Services has issued guidance and rule updates over the years, the core framework still reflects a world of paper records and physical clinics rather than cloud-based apps and artificial intelligence-driven health tools.
There have been efforts to close the gap. The FTC has used its authority over deceptive business practices to go after health app companies. Some states, including California and Washington, have passed their own stronger data protection laws. But these measures are uneven and incomplete. A person in one state may have significantly more protection than someone in another.
What Data Is Being Collected and Who Is Getting It
To understand the risk, it helps to know what kinds of data telehealth apps typically collect and where it can end up.
Common types of data collected by health apps include:
- Personal identifiers such as name, email address, date of birth, and location
- Medical history, symptoms, and diagnoses entered by users
- Mental health information from therapy sessions or mood tracking
- Reproductive and sexual health data
- Prescription and medication details
- Device information and browsing behavior linked to health searches
- Payment information connected to health services
This data can flow to a range of third parties, including:
- Advertising networks that use it for targeted marketing
- Data brokers who aggregate and resell consumer profiles
- Employers or insurance companies in some circumstances
- Law enforcement agencies that request it through legal processes
- Investors and acquirers if a company is bought or goes bankrupt
What “Privacy Policy” Really Means
Most health apps have a privacy policy. Most users never read it. And even those who do often find language so broad and complicated that it is nearly impossible to understand what they are actually agreeing to.
Common phrases like “we may share your information with trusted partners” or “we use data to improve your experience” can mean almost anything. These vague terms give companies enormous flexibility to use and share data in ways users would never expect if they were told plainly what was happening.
Privacy policies are also one-sided agreements. You either accept them entirely or you cannot use the service. There is rarely a middle ground where you can consent to some uses of your data but not others.
How to Protect Yourself Right Now
While the legal landscape catches up, there are practical steps you can take to reduce your risk when using telehealth apps.
- Check if the app is actually covered by HIPAA: Look for a clear statement that the company is a covered entity or business associate under HIPAA. Keep in mind that "HIPAA-compliant" in marketing copy is not the same as being legally bound by HIPAA. If you cannot find a clear statement, ask directly or look elsewhere.
- Read the privacy policy — or at least scan it: Look specifically for sections about data sharing with third parties and data selling. If the language is vague or allows broad sharing, be cautious.
- Limit what you share: Only enter information that is strictly necessary for the service you need. Avoid sharing sensitive details in apps that do not clearly explain how the data is protected.
- Use established healthcare providers: Telehealth services offered directly through hospitals, established medical groups, or licensed insurers are more likely to be covered by HIPAA than standalone apps.
- Check app permissions on your phone: Review what access the app has to your location, contacts, camera, and microphone. Revoke any permissions that do not seem necessary.
- Look for apps with strong independent privacy certifications: Some apps voluntarily submit to audits or certifications that confirm stronger data practices beyond what the law requires.
How Lawmakers and Regulators Can Close the Gap
Individual caution only goes so far. Real protection requires systemic change. There are several meaningful steps that policymakers could take to address the telehealth privacy gap.
Update HIPAA for the Digital Age
The most straightforward solution is to modernize HIPAA so that it clearly covers consumer health apps and digital health platforms. This would mean expanding the definition of covered entities and ensuring that companies collecting health data in any form — regardless of whether they are traditional healthcare providers — must meet the same standards.
Pass a Comprehensive Federal Privacy Law
The United States is one of the few wealthy nations without a broad national data privacy law. A comprehensive federal privacy law that covers all types of personal data — not just health information — would give everyone a baseline of protection and make it easier to enforce consistent standards across industries.
Require Meaningful Consent
Laws should require that health apps obtain genuine, informed consent before sharing sensitive data. That means plain language, specific descriptions of what will be shared and with whom, and real choices — not take-it-or-leave-it checkboxes buried in lengthy documents.
Restrict Data Selling and Brokering
Health data should not be sold to data brokers, period. Specific prohibitions on the commercial sale of health information — separate from using it to provide the service a user signed up for — would close one of the most significant loopholes that exists today.
Strengthen State-Level Protections
In the absence of federal action, states can lead the way. Washington State's My Health My Data Act, passed in 2023, is one example of a state going further than federal law to protect health information collected by non-HIPAA-covered entities. More states following this model would help fill the gap in the near term.
The Stakes Are High
Health data is among the most sensitive information that exists about a person. It touches on physical conditions, mental health, reproductive choices, and personal behaviors. When that data is shared without knowledge or consent, the consequences can be serious — discrimination, embarrassment, financial harm, and in some cases, legal jeopardy.
Telehealth has genuine potential to improve access to care and make healthcare more convenient. But that potential comes with responsibility. The companies building these tools, the lawmakers overseeing them, and the regulators enforcing the rules all have a role to play in making sure that better access to care does not come at the hidden cost of lost privacy.
Until the law catches up, the burden falls too heavily on individual users to protect themselves — often without the information they need to do so effectively. That is not a system that deserves trust. Closing the health data privacy gap is not just a legal issue. It is a matter of basic fairness.