by Anne Zimmerman, JD, MS

I use the phrase “Barcode me” to verbalize the idea that my data is my own and that when anyone stands to use it for financial gain, I should be paid. The big data landscape with its massive capabilities can trace data at many points in the aggregation process. Pieces of metadata are merged into larger aggregate pools and deidentified along the way. Whether I am barcoded or represented by a QR code or even a shape or algorithm, there are ways to compensate me. I prefer Venmo but PayPal is fine too.

Privacy is crucial in the big data landscape, particularly for personal data and especially health data, and many people wonder whether deidentification is enough to keep data safe. Health data has corporate value and is reused and pooled many times. The risk of reidentification or any other privacy breach should make the use of the data even more costly to those who profit from it; that risk must not become an excuse for failing to compensate people for the use of their data. As a separate issue, even assuming no risk of a breach, the corporate use of data for financial gain provides a windfall.

Data miners have more uses and outlets for data than ever. Pharmaceutical marketing is increasingly protected as constitutional free speech, benefiting the corporations that use the data. New data, like that derived from facial recognition technology combined with genetic data, leads to broader aggregate information. Facial recognition's biometric data is by nature personally identifiable and is HIPAA protected, but some of the data collected could have commercial uses. If adequately deidentified, no law appears to prohibit the sale of medical or genetic data derived from facial recognition. While a full facial image must be kept private, correlations between physical characteristics and genetic or medical conditions are likely fair game for big data pools. Even if the data is used for medical diagnoses or, arguably, for some public health knowledge, financial gains will ensue. The sale of equipment, medical devices, and pharmaceutical products, with the added twist of data-generating patents, can lead to windfalls for manufacturers and data miners.

Those asserting a right to be forgotten, one idea behind some consumer privacy laws, focus on the ability to opt out of data collection, an option that is inconvenient and unrealistic for many people. People bear the personal risks of a privacy breach, although newer legal structures impose more penalties for corporate privacy failures. The realistic, ethical solution, however, is to compensate the person who provides the data, as its owner. Rather than waiving privacy rights as we do when we "accept cookies" and give consent in exchange for nothing, people could be asked to sell their data for money. From rent to royalties, many structures would allow people to be compensated. Most importantly, people should have a property right in their individual data; they deserve to share in the financial rewards generated by big data.
