It seems the massive volume of bad traffic used in politically motivated DDoS attacks last year was not an isolated phenomenon.
Distributed denial-of-service attacks that congest Internet connectivity and disrupt online services reached unprecedented levels in 2013, shoving aside the stealthier application-layer attacks hackers preferred in previous years.
“It seems that attackers are trying to achieve a goal, be it to impact service availability or, as part of a much broader attack campaign, to distract from financial fraud and theft,” said Arbor Networks solutions architect Darren Anstee. “They’ve gone back to volumetric attacks because they are aware that better defenses are in place and this is a way to get around those.”
Arbor released its Worldwide Infrastructure Security Report this week; regardless of whether respondents were in service provider or enterprise environments, DDoS attacks were the No. 1 operational threat to their organizations.
While it’s generally accepted that a 20 Gbps attack is enough to overrun a website or a Web-based service, a substantial number of attacks were more than 100 Gbps, topping out at 309 Gbps in an attack against spam blacklist provider Spamhaus.
“Far more respondents are telling us about attacks larger than 100 Gbps than in 2011 and 2012,” Anstee said. “We saw others at 191, 152 and 130 Gbps.”
Attacks such as the Spamhaus takedown are outliers, to be sure, with three times the traffic seen in the multiple attacks targeting Bank of America, PNC, Wells Fargo and other large American financial institutions, allegedly by the al-Qassam Cyber Fighters. Spikes in the Spamhaus DDoS attack reached 309 Gbps as attackers took advantage of open DNS resolvers to amplify attacks against the Swiss volunteer organization.
The availability of open DNS resolvers gave the Spamhaus attackers the ability to spoof Spamhaus IP addresses in their queries, directing massive volumes of DNS responses at the site; there was collateral damage in those attacks as well, impacting online streaming media services such as Netflix.
“Spamhaus made people aware of the threat of reflection amplification attacks. It does appear attackers have learned to leverage the infrastructure available on the Internet to help them in attacks,” Anstee said.
Within the last month, NTP amplification attacks have been used to take down sites as well, causing US-CERT to issue an advisory warning enterprises and service providers of the risk. Attackers are taking advantage of the monlist feature in older NTP servers, which returns the IP addresses of the last 600 machines that interacted with the server. By sending a monlist (MON_GETLIST) request spoofed with a victim’s source address to an NTP server, an attacker can overrun that IP with unwanted traffic in no time.
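As a rough illustration, the Python sketch below sends the same eight-byte mode 7 monlist probe, unspoofed, and reports whether a server answers it. It is a minimal check for NTP servers you operate yourself; the hostname is a placeholder, and any reply at all means the amplification vector is still open.

import socket

# Mode 7 monlist probe: 0x17 = version 2, mode 7; 0x03 = implementation
# XNTPD; 0x2a = request code MON_GETLIST_1; padded to eight bytes.
MONLIST_PROBE = b"\x17\x00\x03\x2a" + b"\x00" * 4

def monlist_enabled(host, timeout=3.0):
    # Send the probe with our real source address and wait for a reply.
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(MONLIST_PROBE, (host, 123))
        sock.recvfrom(4096)
        return True          # any response means monlist is exposed
    except socket.timeout:
        return False         # silence: disabled, patched or filtered
    finally:
        sock.close()

# "ntp.example.com" is a placeholder; only probe servers you are
# authorized to test.
print(monlist_enabled("ntp.example.com"))

In an attack, the identical request is sent with the victim’s address forged as the source, so the server’s much larger response lands on the victim instead.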
Arbor’s report indicates that few companies have security staff dedicated to infrastructure such as DNS, or to locking down those and related services. Coupled with the availability of open DNS resolvers, that presents a problem for high-value targets.
“If you’ve got open DNS resolvers you can use and if you’ve got a botnet that can generate a good volume of traffic and point it at a list of open DNS resolvers, you can use those resolvers to amplify the capabilities you have for your botnet,” Anstee said, adding that attackers can get a 30x improvement with amplification in some cases. “Unfortunately, it’s not that hard; the know-how is available.”
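To make that 30x figure concrete, the sketch below (which assumes the third-party dnspython library, and is not drawn from Arbor’s report) measures the size asymmetry between a small ANY query and its response; that asymmetry is exactly what a spoofed source address turns into amplification. The resolver address is a documentation placeholder.

import dns.message
import dns.query
import dns.rdatatype

def amplification_factor(resolver_ip, name="example.com"):
    # A small ANY query with a large EDNS buffer; DNSSEC records in the
    # answer are what inflate the response and make amplification pay.
    query = dns.message.make_query(name, dns.rdatatype.ANY, want_dnssec=True)
    query.use_edns(0, payload=4096)
    response = dns.query.udp(query, resolver_ip, timeout=3.0)
    # Response bytes divided by query bytes is the amplification factor.
    return len(response.to_wire()) / len(query.to_wire())

# 192.0.2.53 is a documentation address; substitute a resolver you run.
print(amplification_factor("192.0.2.53"))

A factor in the tens means every megabit per second a botnet can send becomes tens of megabits per second arriving at the victim.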
Survey respondents said their top concern for 2014 is DDoS attacks against infrastructure, given the ease with which amplification attacks have been happening. Volumetric attacks that consume bandwidth are a top attack vector, along with TCP state-exhaustion attacks that consume connection state tables in load balancers, firewalls and application servers, and application-layer attacks that target aspects of applications or services. A good number of those attacks are even conducted against HTTPS websites and services.
“Well-formed attacks that targeted encrypted Web services were much higher than expected,” Anstee said. Such attacks, like those carried out in Operation Ababil by the al-Qassam Cyber Fighters, require a bit of reconnaissance by the attacker, who must determine firsthand which files, such as log-in forms or PDFs of annual reports and investment details, are available on an open page. The attacker then sends GET requests for that file over and over, putting a load on the server until it is overrun.
“They are carrying out a normal operation over an encrypted connection,” Anstee said, adding that such attacks are difficult to detect. “You don’t see bots doing much of that.”
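Detection therefore comes down to spotting the pattern rather than any malformed payload. One plausible heuristic, sketched below under the assumption of a simplified access-log format of "client-ip method path" per line (a placeholder format, not any particular server’s), is to flag clients hammering a single heavy resource:

from collections import Counter

def flag_repeat_getters(log_lines, threshold=1000):
    # Count GET requests per (client, resource) pair; one client pulling
    # the same PDF or log-in form thousands of times stands out even
    # though each individual request is perfectly well formed.
    hits = Counter()
    for line in log_lines:
        parts = line.split()
        if len(parts) >= 3 and parts[1] == "GET":
            hits[(parts[0], parts[2])] += 1
    return [pair for pair, count in hits.items() if count >= threshold]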