Posted by Sacha Saint-Leger on November 28, 2019
My city and state are Rome — as Antoninus. But as a human being? The world. So for me, good can only mean what’s good for both communities.
— Marcus Aurelius, Notes to Himself
I participated in the most significant change in the history of American espionage — the change from targeted surveillance of individuals to mass surveillance of entire populations. I helped make it technologically feasible for a single government to collect all the world’s digital communications, store them for ages, and search through them at will.
— Edward Snowden, Permanent Record
The system of near-universal surveillance had been set up not just without our consent, but in a way that deliberately hid every aspect of its programs from our knowledge. At every step, the changing procedures and their consequences were kept from everyone, including most lawmakers.
— Edward Snowden, Permanent Record
Where exactly is the maximum tolerable level of surveillance, beyond which it becomes oppressive? That happens when surveillance interferes with the functioning of democracy: when whistleblowers (such as Snowden) are likely to be caught.
— Richard Stallman, How Much Surveillance Can Democracy Withstand?
Contrary to assertions that people don’t care about privacy in the digital age, the vast majority of Americans believe that they should have more control over their data.
According to a 2015 survey by Pew Research, 93% of Americans believe that being in control of who can get information about them is important. At the same time, a similarly large majority — 90% — believe that controlling what information is collected about them is important.
However, these views stand in stark contrast to how the FBI and NSA currently operate, even post-Snowden.
At a House Oversight Committee hearing in June of this year, an FBI witness revealed that the agency can match or request a match of an American’s face against at least 640 million images of adults living in the U.S. This includes driver’s license photos from 21 states, including states that do not have laws explicitly allowing them to be used in this way.
At the same hearing we also learned that the FBI believes it can use face recognition on individuals without a warrant or probable cause.
According to the ACLU:
Under FBI guidelines, agents can open an assessment without any fact-based suspicion whatsoever. Even preliminary investigations may be opened only in cases where there is mere “information or allegation” of wrongdoing, which the FBI interprets to cover mere speculation that a crime may be committed in the future.
How do we explain this inconsistency between what the public wants and what the authorities are doing?
It comes down to one word — framing — or the metaphors and moral narrative associated with an idea.
In the words of Steve Rathje:
We often metaphorically think of the mind as a machine, saying that it is “wired” to behave in certain ways. But, the mind is not simply a machine, engineered to behave entirely rationally. Instead, like a work of art, the mind thrives on metaphor, narrative, and emotion — which can sometimes overtake our rationality.
In other words, the way we frame something determines how we think about it. And mass surveillance is, of course, no exception. Whether we realise it or not, mass surveillance has been brilliantly framed by authority to influence our thoughts in a particular direction, even if we instinctively know there’s something not quite right about it.
To quote directly from Phillip Rogaway’s seminal paper on the moral dimension of cryptography:
I think people know at an instinctual level that a life in which our thoughts, discourse, and interactions are subjected to constant algorithmic or human monitoring is no life at all. We are sprinting towards a world that we know, even without rational thought, is not a place where man belongs.
In part 3 of The Moral Character of Cryptographic Work, Rogaway outlines what he calls the law-enforcement framing of mass surveillance. This persuasive and well-crafted framing is often espoused by intelligence communities around the world to justify their actions. He describes it like this:
Reading it evokes a sense of fear. A fear of crime, a fear of losing our parents’ protections, even a fear of the dark… This is no accident.
Rogaway goes on to contrast it with the — almost orthogonal — cypherpunk / surveillance-studies framing:
Reading this evokes a sense of injustice, a sense that our liberty has been infringed upon, even taken away, without our realising it.
So which narrative brings us closer to the truth? And where should we start to try to find out?
Let’s try to answer this from first principles.
If you re-read the two narratives, you should see that points 4 and 5 of the cypherpunk narrative are incompatible with the first three points of the law-enforcement narrative. And since the last three points of the law-enforcement narrative follow from the first three, comparing these parts seems like a good place to start.
To put it simply, the key disagreement here seems to hinge on whether privacy is a personal or collective good, and whether it is wrong to regard privacy and security as conflicting values.
Let’s address these separately.
While it’s self-evident that privacy can be a personal good, it’s not so obvious that it can be a collective good too. How can it be?
Privacy is a collective good if the limitations placed on our privacy result in a world in which there is less space for personal exploration, and less space to challenge social norms.
I’m on the Tianjin to Beijing train and the automated announcement just warned us that breaking train rules will hurt our personal credit scores!
— Emily Rauhala (@emilyrauhala), January 3, 2018
In such a world, progress slows, since human progress depends on the ability of individuals to challenge authority and the status quo.
To borrow two of Bertrand Russell’s Ten Commandments:
Have no respect for the authority of others, for there are always contrary authorities to be found.
Do not fear to be eccentric in opinion, for every opinion now accepted was once eccentric.
Put another way, if the space in which we can express eccentric opinions or challenge authority gets smaller, then effective political dissent becomes harder. And — to paraphrase Rogaway again — without dissent, social progress is unlikely.
Since social progress is a collective good, it’s clear that things aren’t as simple as the law-enforcement framing suggests.
That brings us to the second point of disagreement: whether it is wrong to regard privacy and security as conflicting values. Another way of asking this question is the following: can lack of privacy make us less secure?
We don’t have to look very hard to see that the answer is yes, it can.
Last month, for example, it was reported that at least 100 journalists, human rights activists and political dissidents had their smartphones attacked by spyware that exploited a vulnerability in WhatsApp.
It should be obvious that a world in which journalists and activists don’t dare to reveal crimes, is a world which is at the same time less secure for the individual and worse for society.
In sum, there are important links between privacy, the security of individuals, and the collective good. Again, things aren’t as simple as the law-enforcement framing would have us believe.
Now that we’ve effectively dismantled the first half of the law-enforcement narrative, the rest of it falls apart by itself, since the second half builds on the first.
Let’s turn to the remaining parts of the cypherpunk narrative to see whether it’s just as misleading as the law-enforcement narrative, or whether it can get us closer to the truth.
The first and fifth points — “Surveillance is an instrument of power” and “In a world of ubiquitous monitoring there is no space to challenge social norms” — are true almost by definition.
And the second and third — “Surveilling everyone has become cheaper than figuring out whom to surveil” and “Governmental surveillance is strongly linked to cyberwar, since security vulnerabilities that enable one enable the other” — are also obviously true.
So the only point left to verify is 6: “Creeping surveillance is hard to stop, because of interlocking corporate and governmental interests.”
In our manifesto we wrote that:
Governments, corporations, and anybody with the faintest aspiration to power have recognised that understanding as much as possible about as many as possible is the lever of influence in the modern world. Whether it’s news targeting, influencing politics, or nudging purchasing decisions, it comes down to the same thing: shaping human behaviour.
We see evidence for this everywhere we look. Just last month, news came out that China is partnering with its tech sector to roll out emotion-recognition systems at airports and subway stations — the idea being that, using video footage, emotion-recognition technology can rapidly identify criminal suspects by analysing their mental state…
We also learned last month that Microsoft beat out Amazon to win a $10bn US defence contract called the Joint Enterprise Defense Infrastructure cloud project — or JEDI for short. For those of you who are unfamiliar, JEDI is the information platform on which the Pentagon plans to fight its future wars, pooling all the information available to its generals to “heighten their effectiveness on the battlefield”.
To quote US Air Force brigadier general David Krumm:
When you think about IT, you don’t think about killing people and breaking things… [The whole point of the new Pentagon cloud] is to make sure more bad guys meet their makers.
Again, these are far from isolated cases. They are simply the most recent examples of a general trend. The incentives driving this trend are clear. Governments need corporations to build the tech, and the data generated by the tech to exercise more power. Corporations need the data to make more profits, and governments to defend them from foreign competitors.
It’s clear that one of the most pressing challenges facing modern society today is the chronic, imperceptible advancement of mass surveillance technology. How this advancement is framed in the public’s mind may well dictate the future of our democracy.
As for the question posed in the title of this essay: is mass surveillance compatible with the collective good?
We’ll leave the answer to you ;)