Even HIPAA Can’t Protect You: Personalized Prohibitions Based On Your Medical Data Could Be Coming

Interestingly, two readers contacted me today in response to my comment in The SDK: The Privacy Edition 2015.04.01 on the Slate article “Our Data, Our Health: A Future Tense Event Recap.” One, from the legal privacy side, was excited to learn more about how the legal side can assuage the general public’s fears regarding access to medical data. The other, speaking as a layperson, wanted to remind me that access to medical data isn’t the only concern: the even more critical concern is what those who legitimately have access to the data may do with it.

That led me to think back to a time, not so long ago, when patients worried that their insurance provider would discover their preexisting conditions, because insurance companies could refuse to cover patients who had them. Patients no longer have to worry about that, now that (at least in the United States) the Affordable Care Act requires insurers to cover patients with preexisting conditions. But what about when organizations, schools, or companies can decide what you may or may not do based on the medical data they have access to? How many people will be prohibited from participating in activities once “automated review of medical history” becomes a prerequisite for them?

PADI already does this. If you try to go scuba diving, you will be asked to complete a “Participant Record” (available here). The Participant Record is actually PADI’s way of compelling prospective divers to divulge their “relevant” medical history. If you have no preexisting conditions, lucky you: you simply sign the form, declaring essentially that you “have no medical conditions and take no prescription drugs.” But it’s a very detailed form, and any prospective diver with a history of any one of the enumerated conditions must get a letter from BOTH their general practitioner AND, as appropriate, the treating specialist, stating that the diver’s latest physical showed no problems that would preclude diving. Some dive shops will also require that the physical have taken place within the last 30 days.
The Participant Record gets even worse. There are levels of risk, with the highest ones all but disqualifying a prospective diver from scuba diving. My reader calls foul. He is willing to give the dive shop the benefit of the doubt that it believes it has a legitimate reason to review a prospective diver’s medical history, except that its computer would instantly disqualify him: an unfair assessment, since he scuba dives without any trouble. And remember: just because a shop can come up with a reason to review a prospective participant’s medical history doesn’t mean that its request should be granted.

What if the dive shop skipped the middleman and put in automated review tomorrow? The shop would no longer need you as a go-between to access your medical information from your doctors; suddenly it wouldn’t even need you to fill out the form. You would show up at the shop and provide your driver’s license (something most of us wouldn’t think twice about). But once the shop learned your identity, its computer would automatically access your medical history. It turns out your medical history includes back pain, so the shop’s computer simply denies you the ability to sign up for a dive. Case closed. No data breach caused this. This was “legitimate access.”

Now let’s assume the privacy folks try to write some regulation to protect your data, to keep Joe at the Surf Shack from knowing that you stubbed your toe in the second grade. With the best of intentions, they could write regulation dictating that the software may ping the medical database, asking whether your medical history contains any of PADI’s red flags, but that the database may only return a positive or a negative. The engineers could implement that easily enough. The dive shop’s computer would then have no way of knowing why you were rejected, and the fact that you have back pain would stay private.
The downside? You would also be clueless as to why you were rejected, with zero recourse. To give you some notice, the PADI Participant Record could sit on the counter, giving you a heads-up as to what the dive shop’s computer is querying the medical database for. But even then, you might think, “hm, I probably have at least three of these, but they’re all low-risk factors.” The problem of a binary “accepted”/“rejected” response still stands, and if the medical data is pulled from multiple doctors, you would have absolutely no idea which record produced the positive match to the dive shop’s query. You might not even want Joe’s Surf Shack to be able to tell you which record produced the match: a positive match from a Hernia Center could be awkward for you and for Joe.
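To make the “positive or negative only” idea concrete, here is a minimal sketch of such an interface. Everything in it is hypothetical: the function name, the red-flag list, and the records store are invented for illustration, not drawn from any real PADI or medical-database system.

```python
# Hypothetical sketch of a "positive/negative only" medical-history query.
# The database never reveals WHICH condition matched -- only whether any did.

# Invented stand-in for PADI's list of medical-history red flags.
PADI_RED_FLAGS = {"asthma", "back pain", "heart disease", "epilepsy"}

# Invented stand-in for aggregated medical records, keyed by patient.
MEDICAL_DB = {
    "alice": {"back pain", "seasonal allergies"},
    "bob": {"seasonal allergies"},
}

def has_any_red_flag(patient_id: str, red_flags: set) -> bool:
    """Return only True/False -- never the matching condition itself."""
    history = MEDICAL_DB.get(patient_id, set())
    return not history.isdisjoint(red_flags)

# The dive shop's computer learns a single bit, nothing more:
print(has_any_red_flag("alice", PADI_RED_FLAGS))  # True  -> rejected, no reason given
print(has_any_red_flag("bob", PADI_RED_FLAGS))    # False -> accepted
```

The single returned bit is exactly what makes the scheme privacy-preserving for the database and maddening for the rejected diver: there is nothing in the response to appeal or correct.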

But let’s take a step back. In the previous scenario, the data is pretty clearly protected by the Health Insurance Portability and Accountability Act (HIPAA). What if you could automatically prohibit people from participating in activities without ever accessing medical data from a physician, hospital, etc.? This concern led me to imagine a scenario wherein access to mere “health data,” which as of now is not covered by HIPAA, would lead to an outcome very similar to the one I just described with the dive shop. The technology for this already exists, and even I’m not sure whether any regulations would prevent it:

Say you’re on vacation at Disneyland, and Disneyland offers an App for your iPhone which has your schedule, your Fastpasses, etc. (Disneyland already has at least an App for wait times, plus there’s Disney’s $1 Billion Bet on a Magical Wristband). Now, let’s say that Disney’s App has a “feature” which allows it to sync with Apple HealthKit. And like everything else in the land of user privacy preferences, this feature would more likely than not be “opt-out,” with the default allowing both read and write of Apple HealthKit data. At first glance, that might look like a genuine feature: you’re going to be walking A LOT at Disneyland! You want to know exactly how many steps you took during your journey through the most Magical Place on Earth! Not so fast.
You just let Disneyland see everything else you sync with Apple HealthKit. Suddenly, Disneyland has the data to replace this Prohibitions & General Warnings sign:
[Photo: Disneyland’s Prohibitions & General Warnings sign]
with a digital sign displaying warnings specifically tailored to you, based on all of the medical data the Disneyland App obtained from Apple HealthKit. An almost harmless example: the sign could display the prohibition for obesity, based on the weight you entered into Apple HealthKit. In an effort to maintain privacy, the warning could instead simply appear in your App. If you decide that you are willing to defy the warning, it’s enter-at-your-own-risk. But Disneyland now knows what you did, and could potentially be absolved of anything that might happen to you.
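A sketch of how such tailoring might work. All of the record keys, thresholds, and warning text below are invented for illustration; none of this reflects Disney’s or Apple’s actual APIs or policies.

```python
# Hypothetical sketch: turning synced health data into personalized warnings.
# Keys and cutoffs are invented; real HealthKit data is far richer than this.

def tailored_warnings(health_data: dict) -> list:
    """Map a guest's synced health data to ride-specific warnings."""
    warnings = []
    weight = health_data.get("body_mass_kg")
    if weight is not None and weight > 120:  # invented weight cutoff
        warnings.append("Warning: you exceed the posted weight limit")
    if "heart disease" in health_data.get("conditions", set()):
        warnings.append("Warning: not recommended for guests with heart conditions")
    if health_data.get("is_pregnant"):
        warnings.append("Warning: expectant mothers should not ride")
    return warnings

# One guest's synced data yields a sign written just for them:
rider = {"body_mass_kg": 130, "conditions": {"heart disease"}}
for warning in tailored_warnings(rider):
    print(warning)
```

Note that the logic is trivial; the only hard part ever was getting the data, and the sync “feature” hands it over.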

To take this scenario a step further, the App could physically prevent people whose health data indicates that they fall under a prohibited category of riders from going through the gate for a particular ride. The App could even get aggressive, requiring that only people who log at least 20,000 steps per week can ride the high-intensity roller coasters.
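A gate check like that is trivial to write. Here is a hypothetical sketch; the 20,000-step threshold comes from the scenario above, but the function and data shape are invented:

```python
# Hypothetical turnstile check: only riders averaging 20,000+ steps per week
# may board the high-intensity roller coasters.

WEEKLY_STEP_MINIMUM = 20_000

def may_board(weekly_step_counts: list) -> bool:
    """True if the rider's average weekly step count meets the minimum."""
    if not weekly_step_counts:
        return False  # no data synced -> no ride
    average = sum(weekly_step_counts) / len(weekly_step_counts)
    return average >= WEEKLY_STEP_MINIMUM

print(may_board([25_000, 22_000, 18_000]))  # True  (average ~21,667)
print(may_board([5_000, 8_000]))            # False
```

Notice the quiet policy choice buried in the no-data branch: a guest who declines to sync is treated the same as one who fails the check.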

Finally, imagine a ride that prohibits pregnant women from riding, and a woman who uses a fertility-tracking App is suddenly prohibited from riding a roller coaster she was able to ride the previous day. How could Disneyland know she was pregnant when she likely didn’t even know herself? Disneyland likely doesn’t know for certain, but all her App has to do is extrapolate from {insert lady-talk} that she must be pregnant, and sync that information with Apple HealthKit (Target already did something similar in 2012, figuring out that a teen girl was pregnant before her dad did). Now the poor woman can’t get through the gate to Space Mountain, and with any luck it was her App that suddenly showed “No Pregnant Women” and not a digital sign within view of her entire group.
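The inference itself requires nothing exotic. A hedged sketch of how a tracking app might flag a likely pregnancy from cycle timing alone; the function, dates, and grace period are invented for illustration, and a real fertility app would use many more signals:

```python
from datetime import date

# Hypothetical sketch: flag "likely pregnant" purely from cycle timing.

def likely_pregnant(last_period: date, avg_cycle_days: int, today: date,
                    grace_days: int = 14) -> bool:
    """Flag a likely pregnancy when the next expected period is well overdue."""
    days_since = (today - last_period).days
    return days_since > avg_cycle_days + grace_days

print(likely_pregnant(date(2015, 2, 1), 28, date(2015, 4, 1)))   # True: 59 days since, well overdue
print(likely_pregnant(date(2015, 3, 10), 28, date(2015, 4, 1)))  # False: only 22 days since
```

One comparison of dates, synced to HealthKit, and the gate has its answer before the woman has hers.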

Creepy enough yet?
