Privacy Through Visibility: Disrupting NSA Surveillance With Algorithmically Generated "Scary" Stories
Computational artists engage the politics of networked communication through code. By
creating net art, hacktivist projects, and "tactical media," artists illuminate the dark sides of
networks, challenge the notion of the network as a liberating force, and propose mechanisms
for tweaking the "evil media" these networks facilitate. A primary example of network-based
politics is the US National Security Agency's (NSA) email surveillance efforts recently revealed
by Edward Snowden. Using systems to examine our text-based digital communications, the
NSA algorithmically collects and searches everything we write and send in a futile effort to
predict behaviors based on words in emails. Large collections of words have thus become
codified as something to fear, as an indicator of intent. This presentation will explore the
methods of artists who engage the politics of digital surveillance using algorithmically
generated language, and will ask whether computationally produced text
can combat computational text analysis. A focus will be the author's project ScareMail,
a web browser extension that makes email "scary" in order to disrupt NSA surveillance.
Extending Google's Gmail, the project adds to every new email's signature an algorithmically
generated narrative containing a collection of probable NSA search terms. This "story" acts as
a trap for NSA programs like PRISM and XKeyscore, forcing them to look at nonsense. Each
email's story is unique in an attempt to avoid automated filtering by NSA search systems.
ScareMail attempts to disrupt the NSA's surveillance efforts by making NSA search results
useless. Searching is about finding the needles in haystacks. By filling all email with "scary"
stories, ScareMail thwarts NSA search algorithms by overwhelming them with too many results.
If every email contains the word "plot," or "facility," for example, then searching for those
words becomes a fruitless exercise. A search that returns everything is a search that returns
nothing of use. ScareMail thus proposes, through its algorithmic generation of "scary" stories,
an alternative model of privacy built on visibility and noise rather than encryption and silence.
(Source: author's abstract)
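The abstract describes the generation technique only at a high level. As a rough illustration of the idea, the sketch below weaves probable search terms into randomly chosen sentence templates and appends the result to an email signature. This is a hypothetical TypeScript sketch: the term list, templates, and function names are assumptions for illustration and are not taken from ScareMail's actual source code.

```typescript
// Hypothetical sketch of a ScareMail-like generator: weave probable
// surveillance search terms into simple sentence templates so that every
// generated "story" is different. Word lists and names are illustrative only.

// A small sample of plausible trigger terms (illustrative, not ScareMail's list).
const TERMS = ["plot", "facility", "agent", "cloud", "leak", "virus", "strain"];

// Simple sentence templates; {t} marks where a trigger term is inserted.
const TEMPLATES = [
  "She waited by the {t} until the {t} was called in.",
  "The {t} had been moved twice before anyone noticed the {t}.",
  "He wrote down the {t} and walked toward the {t}.",
];

// Pick a random element from an array.
function pick<T>(items: T[]): T {
  return items[Math.floor(Math.random() * items.length)];
}

// Build a short narrative; random selection makes each story unique, which is
// the property the abstract describes as a defense against automated filtering.
function generateScaryStory(sentences = 3): string {
  const lines: string[] = [];
  for (let i = 0; i < sentences; i++) {
    lines.push(pick(TEMPLATES).replace(/\{t\}/g, () => pick(TERMS)));
  }
  return lines.join(" ");
}

// Append the story to an outgoing email's signature, as a browser extension might.
function appendToSignature(signature: string): string {
  return `${signature}\n\n-- \n${generateScaryStory()}`;
}

console.log(appendToSignature("Best,\nA. Sender"));
```

Because terms and templates are drawn at random on every call, repeated signatures are unlikely, which reflects the abstract's claim that unique stories resist automated filtering while still flooding keyword searches with matches.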
Works referenced:
| Title | Author | Year |
| --- | --- | --- |
| ScareMail | Ben Grosser | 2013 |
Critical writing referenced:
| Title | Author | Year |
| --- | --- | --- |
| Software Studies: A Lexicon | Matthew Fuller (ed.) | 2008 |
| The Interface Effect | Alexander R. Galloway | 2012 |