“If you have nothing to hide you have nothing to fear.” That phrase carries me back to elementary school, anxiously tracing the patterned floor with my eyes and counting the students who made the slow loop out of the confessional booth and up towards the altar. I was next: with difficulty I lifted my feet out of the kneeler and shuffled over to the secluded corner where Father Gary sat. By the time I sat down I had solidified the words in my mind to such an extent that they tumbled out of me, as if I had just turned on a recording. I never felt like I covered everything, but I had mustered up something to let me go home free.
I never quite knew if I had something to hide. If I did, how did I stop hiding it? I confessed sins that were unintentional but revealed later as wrong. For instance, I remarked to my dad as we drove home from church how strange it was that we drank Jesus’ blood, and he said sternly, “That’s disrespectful.” I sat in silence and contemplated how my statement constituted disrespect, and later confessed what I felt I had done.
Whenever I thought about my sins, I was considering how my words or actions had been interpreted, what message they carried. Perhaps my actions were malicious, but when reflecting on them I had this lingering sense that they were a misunderstanding I couldn’t set straight. So when I hear this phrase, “nothing to hide: nothing to fear,” I am reminded of my anxious elementary school days trying to situate myself among voices of authority, to ensure I was not hiding anything. It seems to me that these debates about surveillance center on a similar concern: how our correspondences are used, what status they have in this stage of international politics and telecommunications.
The Ars Technica article “American Spies” touched on the ambiguous political terminology around surveillance practices. A collection of data that made use of search terms, for instance, would no longer be called “bulk,” which refers to the massive, indiscriminate collection of randomized data. We don’t really have legalistic terms with sufficient granularity to capture the many ways we handle and manipulate data. I noticed this in the article about Obama’s talk at SXSW, where, though admitting that he was no software engineer, he kept wanting to apply metaphors of keys and locked boxes. Cryptography and codebreaking have propelled each other to the point that P and NP must be brought in to form a complete picture of the dilemma, and common metaphors won’t carry much weight. Microsoft has to map its software in the reverse direction, fitting its technical potential into existing legal models. It pleaded freedom of speech to allow notifications for users when a government intercepts their information. We never think of software features as an instance of free speech, but why should this private company be restricted from alerting users to information as it comes under their grasp? All these controversies remind me of what Baudrillard called a reality principle in distress: we are yearning to establish some stable ground for the drift of the signifier, to know how our messages will be used and what implications they carry.