The NYT ran an article today citing some results from an Acunetix scan that says 70% of websites are vulnerable. To quote the article, “On average 91% of these websites, contained some form of website vulnerability, ranging from the more serious such as SQL Injection and Cross Site Scripting to more minor ones such as local path disclosure or directory listing.” Glad to see that XSS is being tossed a bone in the article (it’s the little vulnerability that could!).
But for some reason these numbers seem WAY low to me. Jeremiah has found about 70% of sites to be vulnerable to XSS alone, and I’ve found closer to 80% of the one thousand or so sites I’ve manually looked at to be vulnerable. And that’s not all of them either; that’s just what I found! I’d love to get a list of the sites they say aren’t vulnerable and re-test a segment of them.
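For anyone curious what that kind of manual spot-check looks like in practice, here’s a minimal sketch of a reflected-XSS probe in Python. The target URL and parameter name are made up for illustration, and a single canary string is nowhere near a real testing methodology (mine or Jeremiah’s); it just shows the basic idea of checking whether user input comes back in the page unencoded.

```python
# Minimal sketch of a reflected-XSS probe. The endpoint and parameter below are
# hypothetical; only test sites you have permission to test.
import urllib.parse
import urllib.request

CANARY = '"><xss-canary-1337>'          # harmless marker unlikely to occur naturally
TARGET = "http://example.com/search"    # hypothetical endpoint
PARAM = "q"                             # hypothetical query parameter


def probe(url: str, param: str) -> bool:
    """Return True if the canary is reflected unencoded in the response body."""
    full_url = url + "?" + urllib.parse.urlencode({param: CANARY})
    with urllib.request.urlopen(full_url, timeout=10) as resp:
        charset = resp.headers.get_content_charset() or "utf-8"
        body = resp.read().decode(charset, errors="replace")
    # If the raw canary shows up without HTML encoding, the parameter is a
    # candidate for reflected XSS and worth digging into by hand.
    return CANARY in body


if __name__ == "__main__":
    print("reflection found:", probe(TARGET, PARAM))
```

A real assessment obviously goes much further (different contexts, encodings, stored and DOM-based variants), which is exactly why automated percentages tend to come in lower than what a person finds by hand.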
I have a feeling these numbers understate the real problem. Just because you can’t find the problem doesn’t mean it’s not there. Still, 70% is a scary enough number that most people can’t really comprehend it anyway. Telling them that 80% or 90% or more is vulnerable wouldn’t change their perception much, I’d imagine. Even worse, it could make them give up hope: “Well, if everyone else is getting it wrong, what hope do I have?” Scary stuff.