As if you didn't already share enough (maybe too much?) on Facebook and Twitter, a lot of personal data that you didn't even know existed is being shared online—records such as insurance claims, digital health records, housing records and all the things your friends and acquaintances have to say about you. And lest you think otherwise, the market for this data is huge.
According to a Politico report, companies are cashing in on Americans' health care data. Not only are they not asking consumers' permission; many consumers don't even know it's happening.
One of the things companies are using that data for is creating patient “risk scores,” which are then sold to doctors, insurers and hospitals to help them identify patients in certain risk categories, such as a higher risk of opioid addiction or overdose. Based on this information, an institution may deny coverage, raise rates or otherwise alter a patient's treatment plan.
According to Politico, Cigna and Optum are already using the risk scores those algorithms produce. LexisNexis and other data brokers scoop up the data, strip out the personally identifying details and feed it into algorithms that can over- or underestimate a patient's risk. Both errors are problematic: as Politico puts it, “overestimating risk might lead health systems to focus their energy on the wrong patients; a low risk score might cause a patient to fall through the cracks.”
The opioid epidemic has spurred this latest twist on data collection and sales, with providers trying to judge how much opioid medication, and for how long, a patient can safely take. But health care safety advocates fear, among other worries, that an inaccurate algorithm could end up denying pain medication to patients who genuinely need it and are not at high risk of addiction. And while doctors can share those risk scores with patients if they choose, they aren't obliged to.
According to Politico, “The algorithms assign each patient a number on a scale from zero to 1, showing their risk of addiction if prescribed opioids. The risk predictions sometimes go directly into patients' health records, where clinicians may use them, for example, to turn down or limit a patient's request for a painkiller.”
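To make that mechanism concrete, here is a minimal, purely illustrative Python sketch of how a 0-to-1 score might be attached to a patient record and used to flag a prescription request for extra scrutiny. The field names, weights and 0.7 threshold are hypothetical assumptions for illustration; they are not drawn from any vendor's actual model.

```python
# Purely illustrative sketch: NOT any vendor's actual scoring model.
# Feature names, weights, and the 0.7 threshold are hypothetical.
from dataclasses import dataclass


@dataclass
class PatientRecord:
    prior_opioid_rx_count: int   # hypothetical feature
    household_flagged: bool      # hypothetical feature
    high_volume_pharmacy: bool   # hypothetical feature
    risk_score: float = 0.0      # 0.0 (low risk) to 1.0 (high risk)


def score_patient(p: PatientRecord) -> float:
    """Combine the hypothetical features into a 0-1 score (toy logic only)."""
    score = 0.3 * min(p.prior_opioid_rx_count, 5) / 5
    if p.household_flagged:
        score += 0.4
    if p.high_volume_pharmacy:
        score += 0.3
    return min(score, 1.0)


def flag_for_review(p: PatientRecord, threshold: float = 0.7) -> bool:
    """A clinician-facing system might flag prescription requests above a threshold."""
    p.risk_score = score_patient(p)
    return p.risk_score >= threshold


patient = PatientRecord(prior_opioid_rx_count=3,
                        household_flagged=True,
                        high_volume_pharmacy=False)
print(flag_for_review(patient), round(patient.risk_score, 2))  # False 0.58
```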
And therein lies the rub, according to Lorraine Possanza of the ECRI Institute, a critic of the process: those risk scores could classify people without their even knowing about it, and provide doctors with an excuse to prevent them from “getting the drugs they need.”
“Consumers, clinicians and institutions need to understand that personalized health is a type of surveillance,” Harvard University professor Eric Perakslis told Politico. “There is no way around it, so it needs to be recognized and understood.”
With the horse already out of the barn, it could be tough to close the door. But big data used in this manner goes well beyond a privacy issue: it “impinges on human rights beyond simple violation of privacy,” data governance expert Martin Tisne writes in a recent issue of Technology Review, arguing for a data-specific Bill of Rights that includes the right to be secure against “unreasonable surveillance” and against discrimination based on data.
Still, it seems likely that risk scores aren't going away, with data aggregators all jumping on the bandwagon and creating their own algorithms that may or may not be reasonably accurate. But hazards remain. LexisNexis sells a tool to health plans that tags patients who may already have opioid use disorder—but if a relative or roommate is at the same address and happens to have the same health plan, or if they use a pharmacy with a reputation for dispensing pills in high volumes, that evaluation could go awry.
How? According to LexisNexis's Shweta Vyas, the company can draw “relatively strong connections” between people based on public records indicating that they live at the same address. If both individuals are also covered by the same health plan, Vyas says in the report, the software can find patterns “in the aggregate behavior of those two people.”
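To illustrate how that kind of linkage can spill over, here is a minimal Python sketch built on entirely made-up record layouts and identifiers. It is not LexisNexis's software; it simply assumes that de-identified claims grouped by a shared address and health plan could let a flag attached to one household member color another person's evaluation.

```python
# Purely illustrative household-linkage sketch: NOT LexisNexis code.
# Record layout, identifiers, and the pharmacy watch list are made up.
from collections import defaultdict

# Hypothetical de-identified claims rows: (member_id, address_id, plan_id, pharmacy_id)
claims = [
    ("A1", "addr-42", "planX", "pharm-9"),
    ("B7", "addr-42", "planX", "pharm-9"),  # different person, same address and plan
    ("C3", "addr-77", "planY", "pharm-2"),
]

# Group members who share both an address and a health plan.
households = defaultdict(set)
for member_id, address_id, plan_id, _pharmacy in claims:
    households[(address_id, plan_id)].add(member_id)

# Hypothetical watch list of pharmacies reputed to dispense in high volumes.
flagged_pharmacies = {"pharm-9"}

# Any group containing a claim tied to a flagged pharmacy gets marked,
# so a relative or roommate in the same group inherits the signal.
flagged_groups = {
    (address_id, plan_id)
    for _member, address_id, plan_id, pharmacy_id in claims
    if pharmacy_id in flagged_pharmacies
}

for key, members in households.items():
    if key in flagged_groups:
        print(f"{sorted(members)} share the flag via address/plan {key}")
        # -> ['A1', 'B7'] share the flag via address/plan ('addr-42', 'planX')
```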