Surveillance Capitalism + the Medical Gaze = Yikes


by Adam Omelianchuk, PhD

Consider what the future holds for mental health treatment options. One could authorize an “Internet of Things” to detect mood changes from an app that monitors one’s social media posts; stress levels from a smartwatch; anxiety symptoms from tapping and scrolling patterns on a touchscreen; signs of cognitive impairment from speech pattern analysis through anything with a microphone; therapeutic conversation from an A.I. chatbot; the presence of gut microbes associated with autism from the examination of feces deposited in a smart toilet; sleep quality from a smart mattress; and medication compliance from a smart pill box (see Joshua Skorburg’s insightful commentary, from which I borrowed this list). All of this would create what is called a digital phenotype of your activity, which promises to be useful to mental health professionals.

An important and attentive article argues that if appropriate measures of accountability, protection of user data, transparency, and informed consent are in place, then these interventions could be aptly described as “enabling passive, continuous, quantitative, and ecological measurement-based care” (emphasis added). But I’m not optimistic that such measures will be put in place by those who develop these data-mining technologies for the agents of Big Tech. Yes, one can imagine how these technologies could be beneficial if they were used for a restricted time while feeding their results into a secure electronic health record under the watchful eye of a psychiatrist. Yet they are meant to be mediated through devices designed to be permanent fixtures in our lives, hosting apps designed to eliminate the need for expenses like a psychiatrist. Smartphones, smartwatches, and the rest are not like the clumsy EKG monitor I wore for twenty-four hours to help my cardiologist better understand my baseline heart rate. They are meant to be quietly invasive, much like the GPS tracker on my smartphone. Yes, I could turn it off, but, annoyingly, I would have to turn it back on if I wanted to use it for the noble purpose of not getting lost. So, like many others, I put up with the creepy “How was [wherever you went]?” question that repeatedly arises. Being digitally stalked by faceless tech companies is just a way of life in the modern world, one we are made to believe we endorse by ticking the “terms and conditions” boxes on our devices.

Snark aside, isn’t this what we want? Take the “snoring” app I downloaded a few months ago. Festooned with color-coded waveforms and graphs, it shows me the times when my snoozing was “quiet,” “light,” “loud,” and “epic” (most of it was “epic”). Having no idea how to interpret my “snore score” beyond worrying that I should see a specialist, I wondered if there were any analytic tools available to provide recommendations. It turns out there were none, but my desire for them was strong. I wanted a definite recommendation based on my snore recording, which is just the sort of thing an algorithm, digital or not, can provide.

I say “digital or not” because algorithms are just models that take some data as input and, by way of some set of constraints and rules of inference, produce an output, ideally a recommendation or prediction. The “if this, then that” style of reasoning is everywhere in medicine, from medical training, to the mundane use of UpToDate, to the guidelines and best-practices statements of specialist societies. Feedback from users, in the peer-reviewed literature and elsewhere, enables refinement of these tools to produce better outcomes, and so on. That one would try to replicate something like this process digitally, pairing a machine-learning system with the stunning ability of smart devices to gather data from their users, looks pretty reasonable. Got a snoring problem? There’s an app for that.
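To see how thin that logic can be, here is a minimal sketch of such an “if this, then that” recommender in Python. Every threshold, category label, and piece of advice below is invented for illustration; it is not the logic of my snoring app or any real product.

```python
# A toy "if this, then that" recommender for snore data.
# All thresholds, labels, and advice strings are hypothetical,
# made up purely to illustrate the rule-based structure.

def classify_snoring(avg_decibels: float) -> str:
    """Map a nightly average loudness (dB) to a category."""
    if avg_decibels < 40:
        return "quiet"
    elif avg_decibels < 50:
        return "light"
    elif avg_decibels < 60:
        return "loud"
    else:
        return "epic"

def recommend(category: str) -> str:
    """Produce a recommendation from a category: if this, then that."""
    rules = {
        "quiet": "No action needed.",
        "light": "Consider sleeping on your side.",
        "loud": "Track for another week and mention it to your doctor.",
        "epic": "See a sleep specialist to rule out apnea.",
    }
    return rules[category]

if __name__ == "__main__":
    nightly_avg_db = 62.0  # e.g., averaged from one night of recordings
    category = classify_snoring(nightly_avg_db)
    print(f"Snore score: {category} -> {recommend(category)}")
```

A machine-learning system replaces the hand-written thresholds with ones learned from user data, but the shape of the thing, data in, recommendation out, is the same.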

Here’s another thing: I don’t want to pay for any of this. And by “I” I mean “we.” 

Where this leaves us, and the developers to whom we make these demands, is precisely in the space mediated by the powerful logic of what Shoshana Zuboff calls “surveillance capitalism.” According to this logic, we trade private data about ourselves for the ability to use a neat and convenient product that promises to maximize some benefit without the use of our credit card. If all goes well, we won’t be able to live without the app and the developer will be able to sell our digital profile for maximum profit to third parties who have their own interests in modifying our behavior. Who those parties would be in the case of healthcare apps is not clear, but it would not be surprising if the drug and device companies got into the trade. 

And this is very concerning, because it will further the cause of medicalization.  

To get a handle on the effects of that, consider one of my favorite lines from Gattaca. Vincent says to Irene, “You are the authority… on what is not possible, aren’t you, Irene? They have got you looking so hard for any flaw… that after a while that’s all that you see.” This reflexive fault-finding is caused, in part, by the so-called “medical gaze,” which the famed mid-twentieth-century philosopher Michel Foucault outlined in his 1963 book The Birth of the Clinic. To take up the medical gaze is to view the body as an object of manipulation for the sake of medical analysis and the treatment of disease, usually in the interest of society, without reference to the personal identity or subjective experience of the patient. To be on the receiving end of it is to be “medicalized,” to become an object that “indicates” treatment interventions. It is widely noted that the direct-to-consumer marketing of pharmaceutical companies is “exhibit A” of modern medicalization. This is because it incentivizes the design of messages that make healthy people feel unhealthy in order to motivate them to seek out the marketed product to feel whole again. I mean, my snoring must be a “serious problem” because the app said so, and, hey, here’s a link to a store where I can buy sleep remedy stuff!

Zuboff begins her book The Age of Surveillance Capitalism with a reference to the story of Odysseus and Calypso. Although the daughter of Atlas is willing to meet his every need with wine, bread, and bed, the hero is miserable to the point of tears because he is not permitted to leave her island. As the Encyclopedia Britannica puts it, “she could not overcome his longing for home even by promising him immortality.” It is a haunting metaphor for our growing discontent with an economic order that, while promising to enhance human experience by facilitating greater access to goods like knowledge, friendship, consumer products, special services, and perhaps even justice, actually treats that experience as raw material from which to extract information, though not necessarily for us. I say that if this economy is combined with the powers of medicalization in our society, we will suffer something worse: an acute alienation from a true sense of well-being, wrought by the pursuit of a socially constructed view of health through a combination of powerful surveillance technologies, the algorithmic nature of medical reasoning, and the desire for cheap and convenient solutions to complex health problems. Barring divine intervention, the only way I see off this island is to pay the developers not to sell our data.

I suppose the real trick, however, will be to avoid being on the island at all.
