Attempting to add more encryption is just likely to have more broken encryption on the internet. – Poul-Henning Kamp, FOSDEM 2014
Edward Snowden spoke at SXSW on Monday; the theme of the ACLU’s panel was that more encryption will protect us all from NSA spying. Snowden went as far as to make an unfortunate Harry Potter reference, calling encryption a “defense against the dark arts”. More encryption is supposed to make it more expensive for the US government to spy on everybody.
Snowden’s interview at SXSW was all sorts of irresponsible. First and foremost, I’ll point out that while Snowden told us that encrypting hard drives and networks is the key to blocking out spooks, he didn’t name which programs he uses to accomplish these two things. For someone with the public’s interest at heart, he’s very shy about helping the public protect their interests by sharing details. If I were Snowden, sharing which code is ‘clean’ would be at the top of my agenda, especially since the US Congress has shown itself to be so ineffectual. (I’d feel better if Snowden didn’t look so happy as he skirted this important issue.)
Instead, Snowden states that any mathematician will tell you the encryption math works. Here’s the rub: pure math rarely translates into perfect code. What do I mean by that? ‘Shortcuts’ – or at least subjective coding decisions – are made for the sake of code that works. Sometimes, nefarious sabotage is hidden in this ‘subjectivity’. Remember BULLRUN, and RC4’s supposedly random keystream that wasn’t actually random? For a long time everybody assumed RC4 did what it was supposed to do even though it was compromised; compromised despite the existence of sound math behind why random numbers are good for cryptography!
Here’s another example of pure math not making it into the code. SHA-256 is a hashing algorithm that is widely used in cryptography. It relies on a mathematical operation (the circular shift) that can’t be directly expressed in C (a programming language), so coders have to approximate it with constructs that produce a result that looks right. Check out all the opinions (subjectivity) that coding ‘circular shift’ generates on StackOverflow. Take your pick, because if there’s an exploitable situation, the NSA sure will.
Now, I’m not saying SHA-256 is cracked. I’m saying that I found this ‘window for subjectivity’ after about five minutes of looking; so imagine what I could do with a full-time job looking for holes. (Then imagine what a techie could do.) There’s bound to be one window that’s NSA-exploitable. Edward: pure math is no proof that encryption works in practice. You know that. (And so does Vladimir.)
I hope that these last four paragraphs explain why it’s so important that Edward tell us what programs/encryption implementations he’s using to secure his data. (There are over 1300 implementations of SHA-256 out there; are they all equally good?!) If you’re not technically minded though, here’s one more explanation. There’s no single, all-purpose word for ‘no’ in Chinese. Therefore, people express ‘no’ in a myriad of roundabout ways that are all subject to misinterpretation. See the problem? Turning pure math into working code has similar challenges. We need the program/implementation details, Ed.
So that’s my beef with calling for ‘more encryption’ in general; what about the specific implementation of network ‘encryption’ through Tor? Amazingly, none of the MSM journos noticed that Snowden praised and damned Tor in the same breath. It’s good for blocking run-of-the-mill, “default” dragnet surveillance, says Edward, but if you’re personally targeted by the spooks, you’re done for. Contrast that with how Appelbaum sells Tor: encryption for political activists and journalists. Ya think political activists and journalists are personally targeted for surveillance? Maybe? Again, the responsible thing for Ed to have done was advise against using Tor. He didn’t do that.
Unlike Ed and the ACLU and the Kremlin, I don’t think encryption is a magic bullet. My opinion is based on the advice of two well respected people, William Binney and Poul-Henning Kamp. These are the guys who, I believe, really know what they’re talking about. So I’ll let them speak in their own words.
Last year, I posted a link to a video of Bill Binney speaking at MIT about his work for the NSA. Binney designed the NSA’s massive data collection systems. Here’s what he has to say about encryption:
[Question about securing communications]
[Binney] We were talking about this earlier [securing communications]. In my mind, the safest way to do it– no online cypher is safe, simply because if they don’t have the key, they’ll come across and get it from you. That’s assuming they didn’t already plant it in your system. The safest way is to do everything offline: encrypt offline, then go online and send it. Then decrypt offline. But you have to have an air gap there. If you don’t have that air gap, you’re not safe.
[Question about embedding.]
[Binney] That’s embedding. That’s not safe.
[Question about whether the NSA can break standard encryption.]
[Binney] You mean like 128 or 256 [bit encryption]? I don’t think they [the NSA] have to break them at all. They already have them. I mean, that’s my point. You’ve got back doors. No online system is safe. [Unintelligible] Or, they’re [the company] required by law to put a back door into the system so that the FBI can listen to it.
Q: A lot of times we hear discussion that you can see the patterns and that will tell us who the bad guys are. What you’re presenting here is that they have lots of information and if they want to go after David Petraeus, or for that matter, Bill Binney, they can then turn the spotlight and get all of your data, all the data that they’ve collected. Is there a capability at this point that you’re aware of or that you can foresee that by taking all the data then suddenly it’s going to tell you who the bad guy is? They’re talking about predictive policing.
[Binney] They already did that. That’s what we maintained was the way to do it. We had no problem making those decisions. They apparently still do. And the reason, I thought, was that they really wanted to spy on the United States. And they claim it as an excuse: “We’re still having trouble, we can’t do it– unless we can take in all this data we won’t be able to achieve that end [safety from terrorism]”. That’s simply false.
[Question about what communications make you a suspicious person in the NSA's eyes.]
[Binney]…would default out or you mean if they [the NSA] couldn’t break it? If they didn’t have a key that would work? Yes, generally they say that’s very suspicious. That’s what the FBI says. If you’re encrypting, it’s suspicious. You’re already, you know, that’s tick number one for the hit list. Okay? I don’t know how many more ticks you have to have before you get on the list. Or what it takes to get off once you’re on. After all, Sen. Kennedy up here was on the no-fly list for so many months; he had trouble getting off. Well, if he got on it and had trouble getting off, imagine any one of us, what problems we would have.
The take-home is that even if you encrypt, your data is still vulnerable after you send/receive it, because of spy-bugs in the software/hardware you used to view/store it.
What Binney says is a big deal. I wonder how many journalists/bloggers who proudly display their public encryption keys on their Twitter profile keep their private key on a separate, off-line computer? My guess is not many. In effect, they are signalling to the FBI/NSA “Look at me!” while offering up their private key to the government on a platter.
Snowden and the ACLU would say this: if enough people use encryption– if enough people put a target on their back– it’s just too expensive to watch them all. I refer the ACLU to the ‘subjective/broken code’ issue I’ve just described, and to Poul-Henning Kamp’s FOSDEM 2014 talk, quoted below. Poul-Henning Kamp is a widely-respected security expert who has spent a lot of time working in the open-source community– he’s seen stuff go down.
[The following quote is taken from FOSDEM 2014, where Kamp conducts a ‘thought experiment’, pretending to be an NSA ‘executive’ briefing the brass on what his department does to monitor the internet. The ‘thought experiment’ showcases what Kamp suspects the NSA currently does.]
This is another interesting program that we’re [the NSA are] working on. It’s actually inspired by a field accident. We had to evacuate a high-risk, and high-quality, resource. We had no facilities nearby his location that could be used. So we set him up as an independent contractor. Story was, you know, “tired of the boss, company politics” yada, yada. Started for himself, was lucky to find some customers, and so on. While sitting there, he spent some time on some open-source software project– had to do something! He spotted some opportunities for ground work.
Most open-source projects are based on trust. There’s no formal vetting, there’s no checking people’s resumes, there’s no checking if they’re really who they say they are. It’s like: “Oh, I’m this dude who sits in Ulaanbaatar and here’s some patches!” And if you send good patches for some years, people will start to trust you, and not check your patches; and they will give you a commit bit so that you can add them yourself, because that’s really much easier. It’s a fantastic environment. People can come in and nobody knows that you’re a ‘dog’ or an NSA agent.
So not only can you collect information about the project’s interior that way, once the trust is in place you can start to influence their code. Perception is an easy thing to get right here, it’s like: “Yeah, you know, I sit at this non-profit thing, and as long as the email works and the printers print– it’s all humming. I’m sitting here seven hours a day doing not a shit.” “Okay, cool! Here’s a dude who has time and he’s delivering good code.” And one of the things we’ve found out here is that we can do– in reality, this is one of our people, and he’s sitting somewhere in a shop-front that looks like it’s this non-profit thing for stopping, I don’t know, oak trees falling over. And actually, that’s our neighbors who have this shop-front, and they need somebody to get the computers to work in this little stealth setup they have, and we need to have an ethernet and a desk for our man, so it’s very convenient, and the bosses can claim “We’re doing cooperative work and saving money!” It’s really well run.
And of course, you cannot go in and add obvious vulnerabilities to source code; people will spot that. It has to be more subtle than that. ‘Programming mistakes.’ The careless semicolon, off-by-one errors, all these classics. It’s kind of dangerous to do it yourself; it’s better to say “Oh, I got this patch and I looked at it, and it looks okay,” and stick it in. People will start to notice if your own code quality sucks, but if you accept patches which are not quite up to standard– well, everybody can have a bad day.
So in general, obfuscating the code, making it harder to understand critical bits of the code, makes it easier to make it almost do what it’s supposed to do. And misleading documentation is always a wonderful thing, particularly for crypto-sensitive stuff, and deceptive defaults so that things don’t do what people think. It doesn’t have to be the core code. It doesn’t have to be the operating system kernel. In the FreeBSD ports collection, [there were] 20k packages of software built with patches needed for FreeBSD– nobody ever looks at those patches. Whoever ported this piece of software just sticks a patch in and that’s it; it’s never reviewed. In general, nobody has ever looked at all the patches in the FreeBSD ports collection. They should.
So, this is our poster boy: the Debian random number generator. [Applause] This was really beautifully executed: this dude who sends in the patch says, “This patch gets Valgrind to complain and it doesn’t seem to do anything sensible, you should just remove it.” And they did. So for two years, all the Debians had lousy random numbers, which made brute-forcing SSL keys [snaps fingers] “Done!”
He earned a pretty good bonus!
OpenSSL is the crown jewel. [Applause] OpenSSL is the standard library if you want crypto. Getting SSL to work with all browsers without OpenSSL is very tricky. Reading the OpenSSL manuals or source code is not tricky– that’s close to impossible. And that’s 300,000 lines of code. So good luck with that. The documentation is deficient and misleading, and the defaults are deceptive. They don’t do what you think they do. This saves so much money in collection you have no idea.
The sad truth is that ‘open-source’ and ‘widely-used’ are qualities the NSA can’t resist; they present an information-rich opportunity for cracking. There is no white-knight brotherhood of open-source wizards who will write a secure encryption program with a cute GUI that will work properly for long.
If we’re going to reclaim online privacy, the general public will need to be more savvy and suspicious than we have been in the past. The easy-to-use and effective encryption program that the ACLU and Snowden call for will be a constantly moving target. Is the general public ready for the hunt? Until they are, shouting for ‘more encryption’ is counterproductive.
Edward Snowden did a great thing for humanity when he exposed the NSA’s, and partner spying operations’, offensive practices against the people they’re supposed to serve. I don’t begrudge him trying to scrape out a life from what’s left to him; kowtowing to Putin and his American Buddies is part of that. However, it is now in the public’s interest to take whatever Edward says with a grain of salt.
So what do we do? I’ll end with another quote from Kamp’s FOSDEM 2014 talk, where he addresses the audience directly, NOT as an imaginary NSA agent!
The standard reaction in the open-source environment to Edward Snowden’s disclosures has been “We need to strengthen the protocols! We need to have SSL everywhere!” I think that misses the point by a large margin. The things that have been published from the Snowden documents, by now, are the things that the general public can understand reading their newspaper. The stuff we would be interested in has not been published and maybe never will be. Attempting to add more encryption is just likely to have more broken encryption on the internet. This is not a technical problem, this is a political problem. It must be solved by political means. That means find politicians in your country who can understand this and make sure they understand it. If you cannot find these politicians, get you some… This is your children’s, grandchildren’s future society you’re looking at.
I think Kamp understands the ACLU better than most.
