The revelation last week that researchers Thai Duong and Juliano Rizzo had developed a new attack on SSL that gives them the ability to decrypt some protected sessions on the fly sparked a lot of discussion about the inherent problems of the protocol and whether it has outlived its usefulness. But it’s not just SSL that’s the problem; it’s the slow accumulation of security problems in the key protocols and systems on which the Internet–and much of our world–rely that has become the real issue.
It’s beyond cliche at this point to talk about how much our financial system, utilities, transportation, food supply and other necessities depend on computers and the Internet. Suffice it to say, it’s a lot. Without the Internet–or at least computers–most of these things simply wouldn’t work very well, if at all.
And what the companies that run these industries, and many others besides, need in order to operate is a way to move data around securely. In a lot of cases, that’s accomplished through the use of some form of SSL or TLS. Retailers need it to conduct credit-card transactions, financial companies need it for secure online banking, and companies in virtually every conceivable industry need it to communicate securely with their partners. Absent the ability to do those things, many companies really wouldn’t know how to continue doing business. Companies can and do plan for a lot of scenarios, including natural disasters, large-scale Internet or power outages and interruptions to their supply chains. But it’s unlikely that anyone has a backup plan for the possibility of the main means of secure electronic communications becoming unusable.
The good news is that Rizzo and Duong’s attack doesn’t constitute that kind of problem. Their attack, which Rizzo demonstrated on Friday evening at the Ekoparty conference in Argentina, involves a new method for attacking the way that SSL/TLS handles encrypted data when using CBC (cipher-block chaining) mode. It’s an improvement on an existing attack and goes after a long-known vulnerability in the protocol, one that has been fixed in the newer versions, TLS 1.1 and 1.2. (For an excellent description of the technical details of the issue, see Eric Rescorla’s writeup.) In short, if an attacker has a man-in-the-middle position on a network, he could use Duong and Rizzo’s attack to find and decrypt encrypted cookies for high-value sites such as banks or retailers and then hijack those sessions.
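To see why that long-known CBC flaw matters, here is a rough sketch of the core trick in Python, using AES-CBC from the pyca/cryptography package: in TLS 1.0 the IV for each record is the last ciphertext block of the previous record, so an attacker who can inject chosen plaintext can test guesses at an encrypted block one at a time. The key, the toy one-block “cookie” and the record layout below are illustrative assumptions, not Duong and Rizzo’s actual code.

```python
# Sketch of the predictable-IV weakness in TLS 1.0's CBC mode.
# Illustrative only; not the BEAST tool itself.
import os
from cryptography.hazmat.backends import default_backend
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

BLOCK = 16
key = os.urandom(32)

def cbc_encrypt(iv: bytes, plaintext: bytes) -> bytes:
    enc = Cipher(algorithms.AES(key), modes.CBC(iv), default_backend()).encryptor()
    return enc.update(plaintext) + enc.finalize()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Record 1 carries the secret (think: one 16-byte block of a session cookie).
secret = b"cookie=SECRET..."                 # exactly one block
iv1 = os.urandom(BLOCK)
record1 = cbc_encrypt(iv1, secret)
c_secret = record1[:BLOCK]                   # E(secret XOR iv1), visible on the wire

# The TLS 1.0 flaw: the IV of the next record is the last ciphertext block of
# the previous one, so a network eavesdropper knows it before it is used.
iv2 = record1[-BLOCK:]

# To test a guess, the attacker injects the block (guess XOR iv1 XOR iv2);
# CBC then produces E(guess XOR iv1), which matches c_secret exactly when the
# guess is right.
guess = b"cookie=SECRET..."
record2 = cbc_encrypt(iv2, xor(xor(guess, iv1), iv2))

print("guess correct:", record2[:BLOCK] == c_secret)
```

In the real attack, that guess-and-verify step is repeated byte by byte against the victim’s cookie; TLS 1.1 and 1.2 close the hole by giving each record its own unpredictable IV.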
In the demo, Rizzo used PayPal as an example and, using the custom BEAST tool that he and Duong built, was able to decrypt the victim’s cookie in a short period of time. It’s a serious problem, but as both Rescorla and Adam Langley of Google point out, there are much simpler attacks that accomplish the same goal of defeating the protection SSL provides.
“The attack is still a difficult one; the attacker has to have high-bandwidth MITM access to the victim. This is typically achieved by being on the same wireless network as the victim. None the less, it’s a much less serious issue than a problem which can be exploited by having the victim merely visit a webpage. (Incidentally, we pushed out a fix to all Chrome users for such a Flash bug only a few days ago.),” Langley writes in a blog post on the attack.
“Also, an attacker with MITM abilities doesn’t need to implement this complex attack. SSL stripping and mixed-scripting issues are much easier to exploit and will take continued, sustained effort to address. Duong and Rizzo have highlighted this fact by choosing to attack one of the few HSTS sites.”
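For reference, an HSTS site is one that sends the Strict-Transport-Security response header, which tells browsers to refuse plain-HTTP connections to it for a set period; that is what blunts SSL stripping. Here is a minimal, purely illustrative sketch in Python; the handler, port and max-age value are assumptions for the example, not any real site’s configuration.

```python
# Purely illustrative: the thing that makes a site an "HSTS site" is this
# response header. The port and max-age below are example values only.
from http.server import BaseHTTPRequestHandler, HTTPServer

class ExampleHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        # Browsers that see this header over a valid HTTPS connection remember
        # it and upgrade every future request to HTTPS, which is what defeats
        # an SSLstrip-style downgrade to plain HTTP.
        self.send_header("Strict-Transport-Security",
                         "max-age=31536000; includeSubDomains")
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"served over HTTPS\n")

if __name__ == "__main__":
    # A real deployment would wrap this server in TLS; browsers ignore the
    # HSTS header when it arrives over plain HTTP.
    HTTPServer(("localhost", 8443), ExampleHandler).serve_forever()
```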
But it’s not the BEAST attack or Moxie Marlinspike’s SSLstrip attack or any other individual vulnerability or exploit that’s the problem. No, it’s the gradual accretion of all of these things and more over the years that is making it all too clear how ill-suited the current Internet infrastructure is for the task of transporting data securely.
Think of the current situation with DigiNotar. This is a relatively small certificate authority, based in The Netherlands, that was doing a nice little business selling certificates to companies and the Dutch government. An attacker decided to go after the company and was able to compromise its CA infrastructure and issue himself valid certificates for hundreds of sites, including major e-commerce and portal sites as well as many government sites. Within a few days of the attack coming to light, the major browser vendors had revoked trust in the DigiNotar CAs and the Dutch government had ceased doing business with the company. A couple of weeks later, the company was declared bankrupt and out of business.
That’s one small CA. A larger CA, Comodo, was similarly compromised earlier this year, but both of those attacks were detected and dealt with. Consider the damage that could be done by an attacker who was able to perform the same kind of intrusion at a company the size of VeriSign or Thawte or one of the larger CAs and remain undetected for a long time. It’s not something you’ll want to think about for long.
These attacks have highlighted the considerable weakness of the entire CA system, and, coupled with the real problems in SSL/TLS, they do not make for a comforting picture of Internet security at the moment. There are a lot of smart people working on ways to make both SSL and the CA system itself more resilient and reliable, but it’s not clear what the long-term answer is. Moving off of SSL and CAs would take a tremendous amount of time and effort, and no clear alternative has emerged to force the issue.
The closest thing we have to a single precipitating event that would get everyone scrambling for a solution is the steady drip, drip, drip of smaller incidents, each of which looks solvable on its own. Taken together, though, they add up to a truly ugly problem.