Do you eat sand? Privacy in the age of AI

Privacy, pioneering activist Simon Davies writes in his new book, Privacy: A Personal Chronicle, “varies widely according to context and environment to the extent that even after decades of academic interest in the subject, the world’s leading experts have been unable to agree on a single definition.” In 2010, I suggested defining it as being able to eat sand without fear. The reference was to the prospect that small children and their parents face from detailed electronic school records, which store data on everything the children do. It didn’t occur to me at the time, but in a data-rich future where eating sand has been outlawed (because some pseudoscientist believes it leads to criminality) and someone asks, “Did you eat sand as a child?”, saying no because you’ve forgotten the incident (because you were *three* and now you’re 65) will make you a dangerous liar.

The fact that even innocent pastimes – like eating sand – look sinister when the beholder is already prejudiced is one reason why we sometimes need privacy even from the people we’re supposed to be able to trust. This year’s Privacy Law Scholars Conference tossed up two examples, provided by Najarian Peters, whose project examines the reasons why black Americans adopt educational alternatives – home-schooling, “un-schooling” (children follow their own interests, Summerhill-style), and self-directed education (children direct their own activities) – and Carleen M. Zubrzycki, who has been studying privacy from doctors. Cue Greg House: everybody lies. Judging from the responses Zubrzycki is getting from everyone she talks to about her project, House is right, but, as he would not accept, we have our reasons.

Sometimes lying is essential to getting a new opinion untainted by previous incorrect diagnoses or dismissals (women in pain, particularly). In some cases, the problem isn’t the doctor but the electronic record and the wider health system that may see it. In others, lying may protect the doctor, too; under the new, restrictive Alabama law that makes performing an abortion a felony, doctors would depend on their patients’ silence. This last topic raised a question: given that women are asked the date of their last period at every medical appointment, will states with these restrictive laws (if the laws are allowed to stand) begin demanding to inspect women’s menstrual apps?

The intriguing part of Peters’ project is that most discussions of home-schooling and other alternative approaches to education focus on the stereotype of parents who don’t want their kids to learn about evolution, climate change, or sex. But her interviewees have a different set of concerns: they want a solid education for their children, but they also want to protect them from prejudice, stigmatization, and the underachievement that comes from being treated as though you can’t achieve much. The same infraction that is minor for a white kid may be noted and used to confirm teachers’ prejudices against a black child. And so on. It’s another reminder of how little growing up white in America may tell you about growing up black in America.

Zubrzycki and Peters were not alone in finding gaps in our thinking. Anne Toomey McKenna, Amy C. Gaudion, and Jenni L. Evans have discovered that existing laws do not cover the use of data collected by satellites and aggregated via apps – think of last year’s Strava incident, in which a heat map the company published from aggregated data exposed the location of military bases and the identities of personnel – while PLSC co-founder Chris Hoofnagle began the initial spadework on the prospective privacy impacts of quantum computing.

Both of these are gaps in current law. GDPR covers the processing of data; it says little about how the predictions derived from that data may be used. GDPR also doesn’t cover the commercial aggregation of satellite data, an intersectional issue requiring expertise in both privacy law and satellite technology. Yet all data may eventually be personal data, as 100,000 porn stars may soon find out. (Or they may not; the claim that a programmer has been able to use facial recognition to match porn performers to social media photographs is considered dubious, at least for now.) For this reason, Margot Kaminski is proposing “binary governance”, in which one prong governs the use of data and the other ensures due process.

Tl;dr: it’s going to be rough. Quantum computing is expected to expose things that today can successfully be hidden, while protecting others – including stealth surveillance technologies. It’s long been mooted, for example, that quantum computing will render all of today’s encryption crackable, opening up all our historical encrypted data. The PLSC discussion suggested it will also vastly increase the speed of communications. More interesting was a comment from Pam Dixon, whose research shows that high-speed biometric analysis is already beginning to happen, as companies in China find new, much faster search methods that are bringing “profound breakthroughs” in mass surveillance.

“The first disruption was the commodification of data and data brokers,” she said. “What’s happening now is the next phase, the commodification of prediction. It’s getting really cheap.” If the machine predicts that you fit the profile of people who ate sand, what will it matter if you say you didn’t? Even if it’s true.

Illustrations: Sand box (via Janez Novak at Wikimedia).

Wendy M. Grossman is the 2013 winner of the Enigma Award. Her Web site has an extensive archive of her books, articles, and music, and an archive of earlier columns in this series. Stories about the border wars between cyberspace and real life are posted occasionally during the week at the net.wars Pinboard – or follow on Twitter.
