Sometime last week, Laurie Garrett, the Pulitzer Prize-winning author of The Coming Plague, proposed a thought experiment to her interviewer on MSNBC. She had been describing the lockdown procedures in place in China, and mulling the far more limited options available to the US for mitigating the spread. Imagine, she said (more or less), the police out on the interstate pulling over a truck driver “with his gun rack”, demanding a swab, running a test, and then and there ordering the driver to abandon the truck and putting him in isolation.
Um…even without the gun rack detail…
The 1980s AIDS crisis may have been the first time my generation became aware of the tension between privacy and epidemiology. Understanding what was causing the then-unknown “gay cancer” involved tracing contacts, asking intimate questions, and, once it was better understood, telling patients to contact their former and current sexual partners. At a time when many gay men were still closeted, this often meant painful conversations with wives as well as ex-lovers. (Cue a well-known joke from 1983: “What’s the hardest part of having AIDS? Trying to convince your wife you’re Haitian.”)
The descriptions emerging of how China is working to contain the virus indicate a level of surveillance that – for now – is still unthinkable in the West. In a Hangzhou project, for example, citizens are required to install the Alipay Health Code app on their phones; it assigns them a traffic-light code based on their recent contacts and movements, which in turn determines which public and private spaces they’re allowed to enter. Paul Mozur, who co-wrote that piece for the New York Times with Raymond Zhong and Aaron Krolik, has posted video clips on Twitter showing how this works on the ground, while Ryutaro Uchiyama marvels at Singapore’s command and open publication of highly detailed data. This is a level of control that severely frightened people, even in the West, might accept temporarily or in specific circumstances – we do, after all, accept being data-scanned and physically scanned as part of the price of flying. I have no difficulty imagining we might accept barriers and screening before entering nursing homes or hospital wards, but under what conditions would the citizens of democratic societies accept being stopped randomly on the street to have their phones scanned for location and personal contact histories?
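The mechanics behind such a code are not mysterious. As a purely illustrative sketch – the names, inputs, and rules below are my own invention based on the reporting, not Alipay’s actual criteria, which citizens reportedly cannot inspect – the core logic could be as small as this:

```python
# Hypothetical sketch of a traffic-light health-code check.
# Everything here (class names, inputs, rules) is invented for illustration.
from dataclasses import dataclass
from enum import Enum


class Code(Enum):
    GREEN = 1   # free movement
    YELLOW = 2  # short self-quarantine, barred from most venues
    RED = 3     # extended quarantine, barred from public spaces


@dataclass
class TravelHistory:
    # Stand-in inputs; the real system reportedly draws on location,
    # payment, and contact data the citizen never sees.
    visited_outbreak_area: bool
    contact_with_confirmed_case: bool
    reported_symptoms: bool


def assign_code(history: TravelHistory) -> Code:
    """Assign a traffic-light code from recent contacts and movements."""
    if history.contact_with_confirmed_case or history.reported_symptoms:
        return Code.RED
    if history.visited_outbreak_area:
        return Code.YELLOW
    return Code.GREEN


def may_enter(holder: Code, venue_threshold: Code = Code.GREEN) -> bool:
    """Checkpoint logic: a venue admits only codes at or below its threshold."""
    return holder.value <= venue_threshold.value


if __name__ == "__main__":
    me = TravelHistory(visited_outbreak_area=True,
                       contact_with_confirmed_case=False,
                       reported_symptoms=False)
    print(assign_code(me))             # Code.YELLOW
    print(may_enter(assign_code(me)))  # False: turned away at the door
```

The point of the sketch is how little logic is needed once the surveillance data exists; the hard part, and the threat, is the data collection feeding it.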
China has, in effect, automated just such checks. Quite reasonably, at the Guardian Lily Kuo wonders whether the system will be made permanent, essentially hijacking this virus outbreak in order to implement a much deeper system of social control than existed before. Along with all the other risks of this outbreak – deaths, widespread illness, overwhelmed hospitals and medical staff, extensive economic damage, and the mental and emotional stress of isolation, loss, and lockdown – there is a genuine risk that “the new normal” that emerges post-crisis will have vastly more surveillance embedded in it.
Not everyone may think this is bad. On Twitter, Stewart Baker, whose long-held opposition to “warrant-proof” encryption we noted last week, suggested it was time for him to revive his “privacy kills” series. What set him off was a New York Times piece about a Washington-based lab that was not allowed to test swabs it had collected from flu patients for coronavirus, on the basis that the patients would have to give consent for the change of use. Yes, the constraint sounds stupid and, given the situation, was clearly dangerous. But it would be more reasonable to say that either *this* interpretation or *this* set of rules needs to be changed than to conclude unilaterally that “privacy is bad”. Making an exemption for epidemics and public health emergencies is a pretty easy fix that doesn’t require up-ending all patient confidentiality on a permanent basis. The populations of even the most democratic, individualistic countries are capable of understanding the temporary need for extreme measures in a crisis. Even the famously national ID-shy UK accepted identity papers during wartime – and then rejected them after the war ended (PDF).
The irony is that lack of privacy kills, too. At The Atlantic, Zeynep Tufekci argues that extreme surveillance and suppression of freedom of expression paradoxically result in what she calls “authoritarian blindness”: a system designed to suppress information can’t find out what’s really going on. At The Bulwark, Robert Tracinski applies Tufekci’s analysis to Donald Trump’s habit of labeling anything he doesn’t like “fake news” and blaming any events he doesn’t like on the “deep state”, and concludes that this, too, engenders widespread and dangerous distrust. It’s just as hard for a government to know what’s really happening when the leader doesn’t want to know as when the leader doesn’t want anyone *else* to know.
In most countries these are still early days, and as both the virus and fear of it spread, people will be willing to consent to any measure they believe will keep them and their loved ones safe. But, as Access Now agrees, there will come a day when this crisis is past and we begin again to think about other issues. When that day comes, it will be important to remember that privacy is one of the tools needed to protect public health.
Illustrations: Alipay Health Code in action (press photo).
Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard – or follow on Twitter.