I was sent this link today on Clickbot.A, written by the Google AdWords guys. It’s a pretty interesting high-level read for the most part, if you don’t know much about click fraud, and it does get into some of the technical details near the end on how the bot actually worked. While the conclusions of the paper are fine, I was struck that the authors failed to address the most important point.
That point is this: the only reason this bot existed, and the only reason the attackers used it to compromise 100,000+ machines, is that it was economically lucrative to do so. In other words, Google’s detection was too slow to stop the attackers before they made enough money to make the effort worthwhile. And the cost fell on the advertisers, as well as on the unfortunate web sites that were compromised for the purpose. That means Google’s detection methods need to improve to catch not just this particular variant but also polymorphic versions that are far harder to detect. So while it is commendable that Google fixed this one issue, it suggests they lack the technology to proactively defend against future, more sophisticated variants.
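To make the detection problem concrete, here is a minimal sketch of the kind of naive rate-based click filter one might write. The log format, field names, and thresholds are my own assumptions for illustration; this is not Google’s actual method.

```python
from collections import defaultdict
from datetime import timedelta

# Purely illustrative thresholds -- real systems would tune these
# and combine many more signals (cookies, conversion data, etc.).
WINDOW = timedelta(minutes=10)
MAX_CLICKS_PER_WINDOW = 5

def flag_suspicious_ips(clicks):
    """clicks: iterable of (ip, timestamp) tuples, assumed sorted by timestamp.
    Returns the set of IPs whose click rate exceeds a simple threshold."""
    recent = defaultdict(list)   # ip -> timestamps seen within the current window
    flagged = set()
    for ip, ts in clicks:
        # Keep only clicks from this IP that fall inside the sliding window.
        recent[ip] = [t for t in recent[ip] if ts - t <= WINDOW]
        recent[ip].append(ts)
        if len(recent[ip]) > MAX_CLICKS_PER_WINDOW:
            flagged.add(ip)
    return flagged
```

The catch, and the reason I think this matters, is that a botnet like Clickbot.A sidesteps exactly this kind of filter: spread the clicks across 100,000+ compromised machines and each individual IP stays comfortably under any per-IP rate threshold. Detecting low-and-slow, polymorphic variants requires far more than this, which is the gap I’m pointing at.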
While Google’s executive management feels that economics will solve this issue, I feel that Google is failing to see how detrimental this is to the advertisers who depend on quality click traffic. In the absence of that quality, alternative solutions must be in place to allow advertisers to recoup their costs while Google struggles to build new technology to defeat the issue. However, without access to the actual landing pages that the advertisers use, Google cannot have deep insight into the full picture. Ultimately, this gap will widen over time, and attackers will exploit it on the vast majority of sites that don’t use alternative click-quality tools. Until Google comes up with a creative solution, companies like Click Forensics fill that void.