SHENZHEN, CHINA: Facial recognition technology is used at street crossings to identify jaywalkers and automatically issue them fines by text. Offenders' faces are displayed on screens at the crossings. Photo: StreetVJ/Shutterstock
Privacy was something that used to be taken for granted.
Ordinarily, the private life of an individual was not open to scrutiny, while public life was the concern of law and order and decency. In communication terms, privacy meant that only the addressee could open letters or telegrams and telephone operators would not listen in to conversations. Unauthorised disclosure could be sanctioned.
In “Privacy is the new wilderness we must protect” (openDemocracy 2 August 2019), Maciej Ceglowski calls for an updated concept of privacy as a public good if we are to save our rights as individuals in today’s digital world.
Ceglowski calls this ambient privacy – “the understanding that there is value in having our everyday interactions with one another remain outside the reach of monitoring, and that the small details of our daily lives should pass by unremembered. What we do at home, work, church, school, or in our leisure time does not belong in a permanent record. Not every conversation needs to be a deposition.”
The digital surveillance system functions effectively and efficiently because it is automated. Computers monitor and sift, identifying key attributes, making connections and assessing probabilities. Commercial entities and security services use that data: the former to compile marketing trends, the latter to assess imminent culpability. China is using deep-learning systems to search, in real time, video feeds that capture millions of faces. State security is building an archive that will be used to identify suspicious behaviour and predict who will become an “unsafe” social actor.
In this context, Ceglowski argues:
“Our discourse around privacy needs to expand to address foundational questions about the role of automation: To what extent is living in a surveillance-saturated world compatible with pluralism and democracy? What are the consequences of raising a generation of children whose every action feeds into a corporate database? What does it mean to be manipulated from an early age by machine learning algorithms that adaptively learn to shape our behavior?”
The alternative is a world with no ambient privacy and little data protection, dominated by hostile governments and big corporations. It is the kind of dystopia that George Orwell identified in his novel Nineteen Eighty-Four in which “Power is in tearing human minds to pieces and putting them together again in new shapes of your own choosing.”
What shall we do to prevent it?