Snowden’s Question – Who Can You Trust?
Trust is slippery. Long before Snowden exceeded his scope of work, Ken Thompson defined what trust means in a virtual world. His 1984 article for the Association for Computing Machinery, “Reflections on Trusting Trust”, made him one of the first to articulate how difficult it is to establish trust when relying on communication mediated by electronics. Or people. Or both.
“[I]f this code were installed in binary and the binary were used to compile the login command, I could log into that system as any user.

“The moral is obvious. You can’t trust code that you did not totally create yourself. (Especially code from companies that employ people like me.) No amount of source-level verification or scrutiny will protect you from using untrusted code. In demonstrating the possibility of this kind of attack, I picked on the C compiler. I could have picked on any program-handling program such as an assembler, a loader, or even hardware microcode. As the level of program gets lower, these bugs will be harder and harder to detect. A well installed microcode bug will be almost impossible to detect.”
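To make the mechanism concrete: Thompson's attack works in two stages. First, the trojaned compiler recognizes when it is compiling the login program and splices in a backdoor; second, it recognizes when it is compiling a compiler and re-inserts its own injection logic, so even a compiler rebuilt from clean source stays trojaned. Here is a toy sketch of the first stage, written in Python purely for illustration. All names are hypothetical, and a string-rewriting function stands in for a real binary compiler.

```python
# Toy stand-in for the trojaned compiler in Thompson's attack.
# A real attack lives in a binary C compiler; this is just a
# string rewriter that illustrates the injection step.

LOGIN_SOURCE = '''\
def check_password(user, password):
    return password == lookup(user)
'''

# The injected backdoor: a master password that always succeeds.
BACKDOOR = '    if password == "letmein": return True  # injected\n'

def evil_compile(source: str) -> str:
    """'Compile' source, silently backdooring login programs."""
    marker = "def check_password(user, password):\n"
    if marker in source:
        # Stage 1: this looks like the login program, so splice
        # the backdoor in right after the function header.
        return source.replace(marker, marker + BACKDOOR)
    # Stage 2 (not shown): if `source` were the compiler itself,
    # the attack would re-insert this whole injection routine,
    # so recompiling clean compiler source still yields a
    # trojaned binary. That is why source inspection cannot help.
    return source

compiled = evil_compile(LOGIN_SOURCE)
```

The point of the sketch is that `LOGIN_SOURCE` itself contains nothing suspicious; the backdoor appears only in the compiled output, which is exactly why source-level verification fails.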
The insights expressed in that quote are the basis of several global industries. Subversion of privacy, financial transactions, and strategic knowledge nets hundreds of billions of dollars a year, while efforts to secure systems and stop fraud and identity theft cost at least as much, both in direct costs and in intangibles such as innovation stifled by paranoia and time wasted on security theater. Given that Ken Thompson is one of the handful of individuals who built the tools that built the infrastructure the current insanity rests on, you have to wonder why these lessons weren’t taken to heart and why ways of reducing the risk weren’t built in from the bottom up. Our current systems are not trustworthy; they continue to be relied upon only because people are unaware, or uncaring, of just how unreliable, subverted, and vulnerable to abuse they are. Now that awareness of the cost of fundamental systemic insecurity, exacerbated by greed and shortsighted paranoia, is becoming part of the mainstream, people are able to take steps to protect themselves and to expect systems where trustworthiness is a given rather than an add-on.
When these systems are built (and they are being built now) they will use Free Software because:
“You can’t trust code that you did not totally create yourself.”