Health IT and Patient Privacy

VI. Key Issues: Financing and Delivery >> A. Health Spending >> Health Cost Containment >> Improve Administration >> Health IT >> Health IT and Patient Privacy (last updated 1.27.22)

Overview

The collection and sharing of patient health records spurred by the Health Insurance Portability and Accountability Act (HIPAA), the Health Information Technology for Economic and Clinical Health (HITECH) Act, and the Affordable Care Act have created a conflict between those who advocate sharing patient records widely and those who advocate for health record privacy.

  • Tech and Health Care Need Their Own ‘Hippocratic Oath’ to Make Digital Health Work. “When a whistleblower revealed the details of Project Nightingale, a collaboration between Google and the Ascension health system, he or she also surfaced critical flaws in the ways that health care and tech work together. As part of the deal, Ascension, a nonprofit Catholic hospital system that operates in 21 states, gave Google access to millions of patient records, including names and birth dates. The goal of Project Nightingale was to build new tools that help doctors extract key information from patients’ medical records and deliver more targeted medical treatments. It would also make it possible for doctors to spend more time with patients and less time combing through endless layers of electronic health data. The problem was that the hospital system gave Google access to this mountain of data without the knowledge of doctors or patients. After the news broke, stories emerged questioning compliance with privacy laws and whether Google had plans to monetize the data it received. Lawmakers have voiced similar concerns. This isn’t an isolated incident. There have been other hiccups over the years as tech and health care have increasingly gravitated towards one another. Think IBM Watson, Memorial Sloan Kettering’s alliance with Paige.AI, and the fall of Outcome Health. These gaffes are exacerbating an already frayed trust between the public and the tech industry.” (StatNews.com, 12.23.19)
  • Healthcare Interconnectivity Opens the Door to Cyberthreats. “For some health systems and physicians, the quality of patient care is the top priority, while cybersecurity is near the bottom of that list. However, the cybersecurity of devices and systems that support health care delivery is closely tied to patient safety. For example, cardiac rhythmic devices might transfer protected health information (PHI) through a smartphone, or a physician might transmit potentially lifesaving prescriptions to a pharmacy via the internet. If devices do not have protections in place to prevent cyberthreats, the health and safety of the patient could be at risk. Recall August 2017, when the U.S. Food and Drug Administration issued a voluntary recall of 465,000 pacemakers after cyber vulnerabilities were identified. Large hospital systems and health plans typically have more sophisticated tools, vendor relationships and dedicated staff to ensure that patient data, PHI and the systems and devices that support care delivery are as secure as possible. However, that same type of information also resides in the EHRs of a small physician practice, the computer system used by a mom-and-pop drug store, or the records of a rural dentist office. Smaller entities typically do not have the personnel, money or resources needed to safeguard patient data, or to patch their systems enough to protect them from threats.” (SMLR Group, 3.13.18)
  • Basic Health Privacy FAQs. “Q. Who has access to my health records? A. Many more people than you would ever want, including people outside the health care industry: Insurance companies; Government agencies especially if you receive Medicare, Medicaid, SCHIP, SSI, Workers Comp or any local, state or federal assistance; Employers; Banks and Financial Institutions; Researchers; Marketers; Drug companies; Data miners; Transcribers in and outside the U.S.; Websites that collect information about you… Even health information about procedures paid for privately can be reported. The original Privacy Rule stated that information about procedures paid for out of pocket would not be disclosed, but that statement was in the context of a discussion of the right of consent which was included in the original (HIPAA) Rule but repealed in the Amended Rule. See 65 Fed. Reg. at 82,512. Since the Amended Rule allows for the use and disclosure without consent of personal health information for the insurance company’s business operations, clearly such information can be used and disclosed regardless of whether the individual paid out-of-pocket.” (Patient Privacy Rights Foundation, 2016)
  • Are Medical Records Private? “How Private Are Your Medical Records? Not Very. In the United States, most patients believe that Health Insurance Portability and Accountability Act (HIPAA) laws keep our medical records private, shared only among our doctors, ourselves and maybe a loved one or caregiver. But those who believe that are wrong! In fact, there are dozens of individuals and organizations that are legally allowed to access our medical records for a variety of reasons, either by request or by purchase. In some cases, we provide permission for their access. In others, permission isn’t necessary. In still other cases, we provide permission without even realizing we’ve done so.” (VeryWell.com, 5.9.16)
  • Your Medical Records Aren’t Secure. “In 2002, under President George W. Bush, the right of a patient to control his most sensitive personal data—from prescriptions to DNA—was eliminated by federal regulators implementing the Health Insurance Portability and Accountability Act. Those privacy notices you sign in doctors’ offices do not actually give you any control over your personal data; they merely describe how the data will be used and disclosed… Today our lab test results are disclosed to insurance companies before we even know the results. Prescriptions are data-mined by pharmacies, pharmaceutical technology vendors, hospitals and are sold to insurers, drug companies, employers and others willing to pay for the information to use in making decisions about you, your job or your treatments, or for research. Self-insured employers can access employees’ entire health records, including medications. And in the past five years, according to the nonprofit Privacy Rights Clearinghouse, more than 45 million electronic health records were either lost, stolen by insiders (hospital or government-agency employees, health IT vendors, etc.), or hacked from outside.” (Wall Street Journal, 3.23.10)

Health Information Exchange and Patient Privacy

Health Information Mining

  • The Data Privacy Lab: This program was established in the Institute for Quantitative Social Science (IQSS) at Harvard University. The Lab started in 2001 and relocated to Harvard University in 2011. “The overall objective of the Data Privacy Lab is to provide intellectual leadership to society in shaping the evolving relationship between technology and the legal right to or public expectation of privacy in the collection and sharing of data… A goal of the Data Privacy Lab is to inform on-going discussions and to assess and propose balanced approaches in which data can be shared but in which inferences about the identities of people and organizations contained in the released data cannot reliably be made.” One project under this organization, thedataMap, explores the sharing of health data with various entities. The lab offers a flow chart of health sharing in 1997, before implementation of HIPAA, and a more current (2013) interactive map.


  • Seven Ways Data Currently Being Collected About You Could Hurt Your Career or Personal Life. “One thing is becoming clear with data brokers: it is almost impossible to keep track of where they’re getting their data. Consider all the sources that could collect ‘health-inflected’ information, such as bills for pills or GPS records of an emergency room visit… Unfortunately, most data isn’t covered by FCRA or HIPAA. So we’re going to need new laws to help rein in the worst abuses of the new data landscape. Data brokers need to document where they get their data from, and to whom they sell it. We deserve the right to access all files kept on us and the right to correct them. Until that happens, the brave new world of runaway data will continue to threaten our reputations, opportunities and livelihoods.” Pasquale, Frank. (Huffington Post, 11.6.14)
  • Health Data Privacy Concerns Top HIE Barrier, Study Finds. “Concerns over health data privacy and potential confidentiality issues were one of the top barriers to HIE, according to a recent study published by the Robert Wood Johnson Foundation.” (HealthIT Security, 9.29.15)
  • How Companies Scour Our Digital Lives for Clues to Our Health. “People typically touch their phones 2,617 times per day, according to one study — leaving a particularly enticing trail of data to mine. ’Our interactions with the digital world could actually unlock secrets of disease,’ said Dr. Sachin H. Jain, chief executive of CareMore Health, a health system, who has helped study Twitter posts for signs of sleep problems. Similar approaches, he said, might someday help gauge whether patients’ medicines are working. ’It could help with understanding the effectiveness of treatments,’ he said. The field is so new and so little studied, however, that even proponents warn that some digital phenotyping may be no better at detecting health problems than a crystal ball. If a sociable person suddenly stopped texting friends, for instance, it might indicate that he or she had become depressed, said Dr. Steve Steinhubl, director of digital medicine at the Scripps Translational Science Institute in San Diego. Or ‘it could mean that somebody’s just going on a camping trip and has changed their normal behavior,’ he said. ’It’s this whole new potential for snake oil,’ Dr. Steinhubl said. That is not stopping the rush into the field — by start-ups and giants like Facebook — despite questions about efficacy and data privacy… ‘It’s like we’re in school forever,’ Professor Pasquale said, ‘and we’re being graded in all these ways forever by all the companies that have the most data about us.’” (New York Times, 2.25.18)

Pharmaceutical Use

  • Big Data Peeps At Your Medical Records To Find Drug Problems. “To do a better job of spotting unforeseen risks and side effects, the Food and Drug Administration is trying something new — and there’s a decent chance that it involves your medical records. It’s called Mini-Sentinel, and it’s a $116 million government project to actively go out and look for adverse events linked to marketed drugs. This pilot program is able to mine huge databases of medical records for signs that drugs may be linked to problems… Their health records include nearly 180 million Americans. If you have insurance through a private health plan, the chances are ‘pretty good’ that your data may have been used in one of these studies, says Dr. Richard Platt, the principal investigator for Mini-Sentinel and a professor at Harvard Medical School’s Department of Population Medicine.” (NPR, 7.21.14)
  • Government is Now Tracking Your Medications. “The digital age is leading to the end of centuries-old constitutional privacy guarantees, as evidenced by the growth of Internet cloud-based data storage and electronic records, both of which are too easily accessible to prying eyes enabled by power-hungry politicians. For instance, most Americans are unaware that state and federal governments are tracking – and accessing – your prescription medication records. Dozens of states allow federal and state law enforcement agencies free, warrantless access to databases that contain your drug history. What’s more, the federal Drug Enforcement Agency is scrambling for authority to search databases in states where there are pharmaceutical privacy protections, reports Scripps News. At present, 31 states grant authorities this kind of carte blanche access; only one state – Missouri – and the District of Columbia do not have prescription drug monitoring programs. But these protections are likely to fall as well; Missouri’s program is currently on target for state approval, while D.C. officials say theirs will be up and running by year’s end.” (Natural News, 9.12.16)

Employer Wellness Programs

  • Bosses Tap Outside Firms to Predict Which Workers Might Get Sick. “Employee wellness firms and insurers are working with companies to mine data about the prescription drugs workers use, how they shop and even whether they vote, to predict their individual health needs and recommend treatments…Federal health-privacy laws generally bar employers from viewing workers’ personal health information, though self-insured employers have more leeway, says Careen Martin, a health-care and cybersecurity lawyer at Nilan Johnson Lewis PA. Instead, employers contract with wellness firms who have access to workers’ health data. ‘There are enormous potential risks’ in these efforts, such as the exposure of personal health data to employers or others, says Frank Pasquale, a law professor at the University of Maryland, who studies health privacy.” (Wall Street Journal, 2.17.16)
  • Wellness Programs at Work: Could Your Boss be Spying on Your Health? “[P]rivacy experts have pointed out that employers do not have access to employees’ medical histories unless they are provided to them by the employees through the company’s wellness program. Such wellness programs have the potential to become surveillance programs, according to Ifeoma Ajunwa, who teaches law at the David A Clarke School of Law at University of the District of Columbia. ‘I don’t think all employers employ them that way,’ she said. ‘I am not saying that all wellness programs are surveillance programs, but what we are seeing with the current status of the law, they do have that potential for unscrupulous employers to use them as a way to check on their employees and investigate the health of their employees.’” (The Guardian, 2.29.16)

Health Records Sales

  • Your Medical Records Are for Sale. “As hospitals shift to digital medical records, administrators promise patients better care and shorter waits. They often neglect to mention that they share files with state health agencies, which in turn sell the information to private data-mining companies. The records are stripped of names and addresses, and there’s no evidence that data miners are doing the legwork to identify individual patients. Yet the records often contain patients’ ages, Zip Codes, and treatment dates—enough metadata for an inquiring mind to match names to files or for aggressive companies to target ads or hike insurance premiums.” (Bloomberg Business, 8.8.13)
  • How Data Brokers Make Money Off Your Medical Records. “Once upon a time, simply removing a person’s name, address and Social Security number from a medical record may well have protected anonymity. Not so today. Straightforward data-mining tools can rummage through multiple databases containing anonymized and nonanonymized data to reidentify the individuals from their ostensibly private medical records…’It is getting easier and easier to identify people from anonymized data,’ says Chesley Richards, director of the Office of Public Health Scientific Services at the Centers for Disease Control and Prevention… ‘I personally believe that at the end of the day, individuals own their data,’ says Pfizer’s Berger. ‘If somebody is using [their] data, they should know. And if the collection is only for commercial purposes, I think patients should have the ability to opt out.’” (Scientific American, 2.1.16)

Data Privacy Litigation

  • U.S. Supreme Court Deals Blow to CT Health Data Collection Effort. “In a ruling that could have reverberations for a Connecticut health reform effort, the U.S. Supreme Court ruled Tuesday that certain health plans could not be required by a state to disclose data for use in a health care claims database… Close to 20 states, including Connecticut, are developing or already have databases of medical, dental and pharmacy claims that can show what health care services residents used and what was paid for them. Proponents of the databases say they can be a key tool in better understanding health care service usage and costs, including price variation and gaps in access to care… Wadleigh said the database could begin producing reports as early as this summer. Individuals would not be identified in the data, but some critics have said the database nonetheless raises privacy concerns and have sought the ability to opt out of having their information submitted.” (Connecticut Mirror, 3.1.16)

Data Anonymization/De-identification

Most current data-sharing arrangements involve health researchers hoping to discover medical breakthroughs. With some exceptions, these researchers are obligated to remove 18 categories of personal identifiers, analyzing and sharing only “de-identified” or “anonymized” versions of patients’ health records. Much of the expanded data-sharing enabled by HIPAA relies upon robust data de-identification. However, due to advances in technology, patient records can now be re-identified.
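The “Safe Harbor” method referenced above works by stripping direct identifiers and coarsening quasi-identifiers. A minimal sketch in Python is below; the field names and the `deidentify` function are illustrative assumptions, not drawn from any real schema or library.

```python
# Minimal sketch of HIPAA "Safe Harbor"-style de-identification: drop direct
# identifiers and generalize quasi-identifiers. Field names are hypothetical.

# A few of the 18 Safe Harbor identifier categories, as dictionary keys.
DIRECT_IDENTIFIERS = {
    "name", "street_address", "phone", "email", "ssn",
    "medical_record_number", "health_plan_id", "device_id", "photo_url",
}

def deidentify(record: dict) -> dict:
    """Return a copy of `record` with direct identifiers removed and
    quasi-identifiers generalized."""
    out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
    # Safe Harbor keeps only the first three digits of a ZIP code (and only
    # when the resulting geographic unit contains more than 20,000 people).
    if "zip" in out:
        out["zip"] = out["zip"][:3] + "**"
    # Safe Harbor reduces all dates more specific than a year to the year.
    if "birth_date" in out:
        out["birth_year"] = out.pop("birth_date")[:4]
    return out

patient = {
    "name": "Jane Doe",
    "ssn": "123-45-6789",
    "zip": "06510",
    "birth_date": "1980-07-14",
    "diagnosis": "E11.9",
}
print(deidentify(patient))
# → {'zip': '065**', 'diagnosis': 'E11.9', 'birth_year': '1980'}
```

Note that even after this transformation the record retains a (truncated ZIP, birth year, diagnosis) profile, which is exactly the kind of residual signal the re-identification studies cited below exploit.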

  • The Importance and Value of Protecting the Privacy of Health Information: The Roles of the HIPAA Privacy Rule and the Common Rule in Health Research. “[D]e-identification (and the less stringent anonymization) of information is particularly troublesome with respect to detailed databases containing genotypic and phenotypic data. The increase in genomic data coupled with the increase of computerization of other records about individuals, many of which are publicly available, increases the likelihood that data subjects can be re-identified. Single nucleotide polymorphisms (SNPs) contain information that can be used to identify individuals. Even a small number of SNPs can identify an individual almost as precisely as a social security number does. People who have access to individual data can potentially perform matches to public SNP data leading to matching and identification of individuals. Similarly, researchers with access to a large number of SNPs and corresponding phenotype data can potentially re-identify some individuals even if the information had been encrypted… Thus, it will become more questionable to treat this information as if the use and disclosure of this information poses no risk at all to the individual.” (National Academy of Sciences, 2008)
  • Big Data and Privacy: A Technological Perspective — The President’s Council of Advisors on Science and Technology (PCAST, May 2014) Summary: “This report begins by exploring the changing nature of privacy as computing technology has advanced and big data has come to the forefront.  It proceeds by identifying the sources of these data, the utility of these data — including new data analytics enabled by data mining and data fusion — and the privacy challenges big data poses in a world where technologies for re-identification often outpace privacy-preserving de-identification capabilities, and where it is increasingly hard to identify privacy-sensitive information at the time of its collection.”
    • Data Privacy: “The same data and analytics that provide benefits to individuals and society if used appropriately can also create potential harms – threats to individual privacy according to privacy norms both widely shared and personal. For example, large‐scale analysis of research on disease, together with health data from electronic medical records and genomic information, might lead to better and timelier treatment for individuals but also to inappropriate disqualification for insurance or jobs… With a broad perspective, scholars today recognize a number of different legal meanings for ‘privacy.’”
    • Environmental Sensors: “Environmental sensors that enable new food and air safety may also be able to detect and characterize tobacco or marijuana smoke. Health care or health insurance providers may want assurance that self‐declared non‐smokers are telling the truth.”
    • Anonymization or De-identification: “[Y]ou may not mind if your medical record is used in research as long as you are identified only as Patient X and your actual name and patient identifier are stripped from that record… Unfortunately, it is increasingly easy to defeat anonymization by the very techniques that are being developed for many legitimate applications of big data. In general, as the size and diversity of available data grows, the likelihood of being able to re‐identify individuals (that is, re‐associate their records with their names) grows substantially… by fusing public, Personal Genome Project profiles containing zip code, birthdate, and gender with public voter rolls, and mining for names hidden in attached documents, 84‐97 percent of the profiles for which names were provided were correctly identified. Anonymization remains somewhat useful as an added safeguard, but it is not robust against near‐term future re-identification methods. PCAST does not see it as being a useful basis for policy. Unfortunately, anonymization is already rooted in the law, sometimes giving a false expectation of privacy.”
  • Every Patient a Subject. “[C]urrent norms for medical research permit a scientist who gets a sample of blood, tissue, or saliva to sequence and use that genome without the donor’s specific consent, or even without her knowledge. The scientist then may share those genomic data with others, including a database maintained by the U.S. National Institutes of Health that’s used by researchers and companies worldwide. This can all happen without any notice to the people whose DNA was sequenced. (In fact, if the study is federally funded, in some cases the scientist must share the information.)… ‘[D]e-identification’ is becoming only a reassuring myth. Subjects of genomic research should not confidently expect to remain anonymous. The possibility of ‘re-identifying’ people from either their genomes or the health or demographic data connected with those genomes is real… If the research community truly believes that science must conscript patient genomes for public benefit, it should make that case openly, explaining how notice and consent will impose undue burdens on crucial research.” (Slate, December, 2014)
  • Analyst: Private Firms’ Access to Obamacare User Info ‘Incomprehensible.’ “[W]ith today’s technology, Wright said, even with names and addresses stripped from the data collected by these firms, other companies and outside groups need only a small amount of information to identify users. ‘It’s gotten to the point now on the Internet where there’s so much data floating out there, it takes very small steps to create a profile on you, sir, to understand what you do, where you live, what your interests are,’ Wright said. He pointed to a recent study by MIT researchers that showed marketers can identify you ‘with more than 90 percent accuracy by looking at just four purchases, three if the price’ is included. ‘And this is after companies ‘anonymized’ the transaction records,’ Wright added… Not only did users of the site not authorize the collection of their personal data by private firms, they also didn’t know that collection was going on in the first place, De Mooy explained.” (PJ Media, 2.22.15)
  • The Illusion of Patient Privacy and Private Practice. “Our health insurance claims data are being released to researchers and government agencies by the All-Payer Claims Databases (APCDs). These have been created by 14 states (with five states in implementation, including Connecticut and New York) to mandate that the health plans turn over medical and pharmacy claims data, including all diagnoses, procedures, tests, drugs prescribed, providers’ names with dates and identifiers (which can include enrollment data and Social Security numbers), all to a massive database managed by the state or a private company under state contract. These data are then sent to researchers in identified or de-identified forms, with varying degrees of privacy protections that attempt to prevent re-identification and leaks. As noted earlier, identified medical information can be released for the purposes of the APCDs, public health, researchers, government oversight agencies, healthcare operations, etc., without patient consent. But even with the 18 identifiers removed as specified by HIPAA for de-identification (Safe Harbor method), it is no longer as private as it might have been 15 years ago, before the explosion of online databases.” (Journal of American Physicians and Surgeons, Winter 2015)
  • De-Identification and the Health Insurance Portability and Accountability Act (HIPAA). “But there is a problem with de-identification. We know that there are de-identified datasets in which some of the records can be re-identified. That is, they can be linked back to the original data subject. Sometimes this is because the records were not properly de-identified in the first place. Other times, it is because information in the dataset is distinctive in some way that was not realized at first. This distinctiveness can be used to link the data back to the original identity.” (National Committee on Vital and Health Statistics, 5.24.16)
  • The Privacy Delusions of Genetic Testing. “Customers are wrong to think their information is safely locked away. It’s not; it’s getting sold far and wide. Many testing firms that generally don’t sell patient information, such as Ambry and Invitae, give it away to public databases. Such transfers, as privacy consultant Bob Gellman puts it, leave a ‘big gap in protections.’ Hacks are inevitable. Easily accessible, public genetic depositories are obvious targets. If genetic data does fall into the hands of nefarious actors, it’s relatively easy for them to de-anonymize it. New lab techniques can unearth genetic markers tied to specific, physical traits, such as eye or hair color. Sleuths can then cross-reference those traits against publicly available demographic data to identify the donors. Using this process, one MIT scientist was able to identify the people behind five supposedly anonymous genetic samples randomly selected from a public research database. It took him less than a day. Likewise, a Harvard Medical School professor dug up the identities of over 80% of the samples housed in his school’s genetic database. Privacy protections can be broken. Indeed, no less than Linda Avey, a cofounder of 23andMe, has explicitly admitted that ‘it’s a fallacy to think that genomic data can be fully anonymized.’” (Forbes, 2.15.17)
  • Feasibility of Reidentifying Individuals in Large National Physical Activity Data Sets From Which Protected Health Information Has Been Removed With Use of Machine Learning. “Policymakers have been concerned about the possibility of identifying individuals or their actions based on activity data, whereas device manufacturers and exercise-focused social networks believe that sharing deidentified physical activity data poses no privacy risks. Because it was recently reported that location information from activity trackers could be used to identify the location of military sites, these groups have begun to restrict which location data are shared. However, device manufacturers continue to share deidentified physical activity data with individuals’ employers, advertisers, and health care organizations. Thus, it is vital to be able to quantify the privacy risks from sharing such data. Our results suggest that partially aggregated PAM data with geographic and protected health information removed can be reidentified using machine learning.” (JAMA Network, 12.21.18)
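The linkage attacks described in the excerpts above (fusing de-identified records with public rosters such as voter rolls) can be sketched in a few lines. The datasets, names, and the `link` function below are entirely invented for illustration; real attacks operate on the same join logic at much larger scale.

```python
# Toy demonstration of a linkage ("re-identification") attack: a de-identified
# health dataset is joined to a public roster on the quasi-identifiers
# ZIP code, birth date, and sex. All records here are fabricated.

deidentified_claims = [
    {"zip": "02138", "birth_date": "1945-07-21", "sex": "F", "diagnosis": "I10"},
    {"zip": "02139", "birth_date": "1982-03-02", "sex": "M", "diagnosis": "J45"},
]

public_voter_roll = [
    {"name": "A. Smith", "zip": "02138", "birth_date": "1945-07-21", "sex": "F"},
    {"name": "B. Jones", "zip": "02139", "birth_date": "1990-11-30", "sex": "M"},
]

def link(claims, roster):
    """Re-identify claims whose (zip, birth_date, sex) triple matches
    exactly one person on the public roster."""
    index = {}
    for person in roster:
        key = (person["zip"], person["birth_date"], person["sex"])
        index.setdefault(key, []).append(person["name"])
    matches = []
    for row in claims:
        key = (row["zip"], row["birth_date"], row["sex"])
        names = index.get(key, [])
        if len(names) == 1:  # unique match: probable re-identification
            matches.append((names[0], row["diagnosis"]))
    return matches

print(link(deidentified_claims, public_voter_roll))
# → [('A. Smith', 'I10')]
```

The first claim shares a unique (ZIP, birth date, sex) combination with one voter, so the “anonymous” diagnosis attaches to a name; the second does not match and stays anonymous. This is why coarsening quasi-identifiers (as Safe Harbor requires) matters, and why researchers cited above still succeed when enough auxiliary data exists.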

Privacy and the Affordable Care Act 

Federal Health Insurance Exchange

Opportunities for Fraud/Identity Theft

Threats to Privacy

State-based Insurance Exchanges

  • California’s Obamacare Exchange to Collect Insurance Data on Patients. “California’s health insurance exchange wants to know why you got sick this summer. With 1.4 million people enrolled, the state-run marketplace is embarking on an ambitious effort to collect insurance company data on prescriptions, doctor visits and hospital stays for every Obamacare patient. Covered California says this massive data-mining project is essential to measure the quality of care that patients receive and to hold health insurers and medical providers accountable under the Affordable Care Act. The state in April signed a five-year, $9.3-million contract with Truven Health Analytics Inc. of Michigan to run the database. The effort has raised questions about patient privacy and whether the state is doing enough to inform consumers about how their data will be used. There are also worries about security amid massive breaches at Anthem Inc. and other health insurers affecting millions of Americans. Peter Lee, executive director of Covered California, said protecting sensitive information was a top priority and that consumers stand to benefit from the collection of medical data. He acknowledged the state had no plans to let consumers opt out and keep their records out of the database… (Covered California) shared details on Covered California enrollees with researchers at UC San Francisco and UC San Diego, and those names were compared with a state database of patients who received hospital care in 2012.” (Los Angeles Times, 6.21.15)

Analysis

  • The ACA and the Death of Medical Privacy. “I never sign medical release forms anymore. That’s because I read them. These forms tend to be lengthy documents which ultimately state that your medical records can be shared with just about everyone on the planet. Don’t believe me? Here’s the first paragraph of a 2,000-word explanation of how PHI (protected health information) can be used by a nationally recognized pediatric provider: ‘Quality Improvement Activities: Information may be shared to improve the quality or cost of care. For example, your PHI may be reviewed by XXX XXX or outside agencies to evaluate and improve the quality of care and services we provide.’ Outside agencies? Are you kidding me? Why would you sign that release?… If my records are accessible to a RHIO (regional health information organization), the probability that I have medical privacy is near zero… The Affordable Care Act has exacerbated the problem considerably, and I read all too much from healthcare IT industry pundits about the need for increased sharing of information and more ‘visibility.’ This is all rationalized by dubious claims about saving lives and ‘improving outcomes.’” (CIO, 8.2.16)

ACA’s Risk Adjustment Program

  • Obamacare Stability Rests On Shaky Risk Adjustment. Author points out that, for the ACA’s Risk Adjustment provision to function properly, government must have access to individual health records. “[E]ven though insurers can’t price individuals based on their medical risk, the government will be provided information on each individual’s medical risk. The government will then make payments to insurers so that the insurers get something like the same amount of money that they would receive if they could rate on medical risk… government ultimately has to have access to medical records. If government can’t directly or indirectly ultimately access individual medical records to see how the insurer is classifying people, the system doesn’t work… For all the privacy intrusions raised by Risk Adjustment — and I am surprised more has not been written about this — it turns out that government has not been auspicious in its efforts to predict medical risk.” Chandler, Seth. (Forbes, 1.21.16)

Resources