Survey Finds Secure Sites Not So Secure

A new project that was set up to monitor the quality and strength of the SSL implementations on top sites across the Internet found that 75 percent of them are vulnerable to the BEAST SSL attack and that just 10 percent of the sites surveyed should be considered secure.

The SSL Pulse project, set up by the Trustworthy Internet Movement, looks at several components of each site’s SSL implementation to determine how secure the site actually is. The project looks at how each site is configured, which versions of the TLS and SSL protocols the site supports, whether the site is vulnerable to the BEAST or insecure renegotiation attacks and other factors. The data that the SSL Pulse project has gathered thus far shows that the vast majority of the 200,000 sites the project is surveying need some serious help in fixing their SSL implementations.

There is quite a bit of alarming data in what the project has gathered, and one of those pieces of information is that more than 148,000 of the sites surveyed are vulnerable to the BEAST attack, which was developed by researchers Juliano Rizzo and Thai Duong and disclosed last year. Their attack uses what's known as a chosen-plaintext attack against the CBC-mode AES encryption in the TLS 1.0 protocol and enables them to use a custom tool they wrote to steal and decrypt supposedly secure HTTPS cookies. The attacker can then hijack the victim's secure SSL session with a site such as an e-commerce site or online banking site.
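The root cause BEAST exploits is that TLS 1.0 uses the last ciphertext block of one record as the IV of the next, so an attacker who can inject chosen plaintext can verify guesses about an earlier secret block. The following toy Python sketch illustrates only that guess-verification step; the "block cipher" is a keyed hash stand-in, not real AES, and all names and values are illustrative:

```python
# Toy illustration of the TLS 1.0 predictable-IV weakness behind BEAST.
# NOT real TLS or AES: the block cipher is a deterministic keyed-hash stand-in,
# which is all the attacker's guess-check needs.
import hashlib

BLOCK = 16

def toy_encrypt_block(key: bytes, block: bytes) -> bytes:
    # Deterministic stand-in for E_k(block); real TLS 1.0 used AES or 3DES in CBC mode.
    return hashlib.sha256(key + block).digest()[:BLOCK]

def cbc_encrypt(key: bytes, iv: bytes, plaintext: bytes) -> list:
    # Standard CBC chaining: each plaintext block is XORed with the previous
    # ciphertext block (or the IV) before encryption.
    blocks, prev = [], iv
    for i in range(0, len(plaintext), BLOCK):
        c = toy_encrypt_block(key, bytes(a ^ b for a, b in zip(plaintext[i:i + BLOCK], prev)))
        blocks.append(c)
        prev = c
    return blocks

key = b"server-secret-k!"
iv0 = b"\x00" * BLOCK
secret = b"GET /x HTTP/1.1 " + b"Cookie: SECRET!!"   # two 16-byte blocks

# Victim sends the secret; the attacker observes the ciphertext on the wire.
c = cbc_encrypt(key, iv0, secret)
target, c_prev = c[1], c[0]    # attacker wants to confirm the cookie block

# TLS 1.0 flaw: the IV of the NEXT record is the LAST ciphertext block,
# so the attacker knows it before injecting data.
next_iv = c[-1]

# To test a guess G, the attacker injects G XOR C_prev XOR next_iv as the
# next record; if the resulting ciphertext matches, the guess was right.
guess = b"Cookie: SECRET!!"
probe = bytes(g ^ p ^ n for g, p, n in zip(guess, c_prev, next_iv))
c2 = cbc_encrypt(key, next_iv, probe)
print(c2[0] == target)   # True -> guess confirmed
```

The real attack refines this by shifting the secret so only one unknown byte falls in each block, turning a whole-block guess into at most 256 tries per byte.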

The BEAST attack is complex, but it’s a serious concern and the fact that three quarters of the top sites that the project surveyed are still vulnerable to the attack is troubling. Sites can protect against the attack by implementing mitigations in their TLS 1.0 deployments, including configuring their servers to only use the RC4 cipher during TLS 1.0 or SSL 3.0 sessions.
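Server-side, that RC4 workaround was typically applied through TLS configuration directives. An illustrative Apache mod_ssl fragment of the kind recommended at the time follows; the exact cipher string is an example, and note that RC4 itself was later prohibited (RFC 7465), so this reflects the guidance of that era only:

```apacheconf
# Historical BEAST workaround (circa 2012): force the RC4 stream cipher so no
# CBC-mode suite is negotiated under TLS 1.0 / SSL 3.0.
# RC4 was later prohibited by RFC 7465; shown here for historical context.
SSLProtocol all -SSLv2
SSLCipherSuite RC4-SHA:HIGH:!ADH:!aNULL
SSLHonorCipherOrder on
```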

The other major concern in the data compiled by the SSL Pulse survey is that a third of the sites still support SSL 2.0, a protocol that is considered insecure. Experts recommend that sites not use SSL 2.0 at all because of its weaknesses. 

The Trustworthy Internet Movement, formed earlier this year and backed by Qualys CEO Philippe Courtot, boasts a task force that comprises some of the top SSL experts in the industry, including Ivan Ristic of Qualys, Moxie Marlinspike of Whisper Systems and Twitter and Adam Langley of Google. 

Ristic said in a blog post that while the data compiled by the survey is not definitive, it’s a good indication of what’s happening on the sites secured by SSL.

“Looking at the SSL Labs grades, which are designed to sum up the quality of SSL configuration, we can see that about 50% (99,903 sites) got an A, which is a good result. Previous global SSL Labs surveys reported about 33% well-configured sites, which means that more popular sites are better configured. Unfortunately, many of these A-grade sites (still) support insecure renegotiation (8,522 sites, or 8.5% of the well-configured ones) or are vulnerable to the BEAST attack (72,357 sites, or 72.4% of the well-configured ones). This leaves us with only 19,024 sites (or 9.59% of all sites) that are genuinely secure at this level of analysis,” Ristic wrote.

  • Curlybird on

    It is my experience that presently every computer and network in the world is compromised to some extent.  Even if the front door is barred and chained, there are so many unknown back doors extant that no one can say, for certain, they are 100% secure, even if they have stand alone systems.

    The world was shown how the BEAST exploit could compromise TLS/SSL security in September 2011, in Argentina.  Yet, the fix cannot be fully implemented as it is not compatible with most existing protocols.

    The only way security can be tightened is by the application of hardware and software modifications on an ongoing basis.  Security breaches have become too profitable for those involved to give up easily.  The beneficiaries of such breaches seem to always be several steps ahead of any security modifications, so by the time an issue is discovered and published, the professionals are generally long gone.

  • Chase on

    I think it's very ironic that viewing this threatpost blog entry over SSL shows both warnings due to insecure elements, and the use of AES-CBC. I suppose threatpost is vulnerable to BEAST too.

    That said, it is important to remember that BEAST also requires an XSS vulnerability to function, and the recommendation is to drop a generally strong cipher for a much weaker one (RC4).

  • Mcihael R on

    I see that this site is using TLS 1.0 with resources from non-secure servers.

  • Anonymous on

    I agree with Curlybird -- it's a fractal problem. No matter how far in you zoom, there are vulnerabilities below the visibility of the humans tasked with solving them. Vulnerability is a chaotic function. Drawing a single unbroken line to represent the threat surface isn't realistic anymore, due to the interconnectedness of technology with the businesses and people who use these networks. There are thousands of infinite fissures. The prevalence rate of a specific threat scenario speaks to two things: 1) how many other things must be broken that people are ignoring this; and 2) how much higher-level "zoomed out" activity is changing that it keeps them busy at that level. Aligning the threat vector with the sensitivity of the data protected is an economic tradeoff on the same order of magnitude as familial dysfunction, crime, malnutrition and war. Is it really a serious concern? To whom? To sites that are also vulnerable to XSS.
