So I’m talking with MustLive the other day about XSS issues in public websites. Normally XSS vulnerabilities in random websites are boring because of their pervasiveness, but this conversation was a little bit different. MustLive was going after military intelligence websites in his country (Ukraine), my country (U.S.), and the U.K. for good measure. The CIA, FBI, NSA, and MI5 (SIS) were just a few on his list. Not to mention Yahoo as well. Check out his blog for details. I mentioned he might want to be careful, but we both concluded his risk of prosecution is limited given his geographic location. At the end of the conversation I jokingly said:
“Now if there was only a way to find the next million XSS issues faster we'd be getting somewhere. Finding and fixing 1 at a time is never going to get us anywhere.”
Had I said that to anyone besides MustLive, they would have just laughed it off and been done with it. But noooo, MustLive says, “there is such way (to find next million XSS) - it's me :-).”, and then proceeds to start performing some rudimentary metrics. When I made the comment I was thinking in terms of unique XSS issues, not aggregate totals, but MustLive looked at it another way I didn’t immediately consider. Check this out:
"Look at my holes in Google Search Appliance - There are 4 XSS holes (in 4 parameters) in this engine and there are up to 207000 sites out there with this engine. If all of them are vulnerable (not all, but many of them) there will be 207000 sites x 4 holes = 828000 XSS (up to million new XSS holes). It's mathematics dude ;-). Google is one of the top vulns (XSS) makers."
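MustLive's back-of-envelope math is simple enough to sanity-check. A minimal sketch, using the numbers from his quote (the site count is his Google-dork estimate, not a verified figure, and it assumes every installation is vulnerable, which he concedes it isn't):

```python
# MustLive's aggregate-XSS estimate for Google Search Appliance installs.
# Both inputs come from his quote above; treat them as rough upper bounds.
sites = 207_000        # sites found running the engine (his dork estimate)
holes_per_site = 4     # XSS-vulnerable parameters per installation

aggregate_xss = sites * holes_per_site
print(aggregate_xss)   # 828000 -- "up to a million new XSS holes"
```

The point isn't the precision; it's that one vulnerability class in one widely deployed product multiplies into six figures of individual instances.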
He certainly has a point, but he didn’t stop there.
"And yes, don't forget about UXSS holes - like Google say via another dork, there are up to 302 millions sites with pdfs and almost all of them are vulnerable to UXSS. So there are some ways to find a lot of XSS faster."
As if we're really going to remove PDFs from the Web... but... talk about an interesting view of the world, and that's just considering a few samplings of what's actually out there. We could easily add dozens if not hundreds more issues to compile larger and larger numbers. Because I enjoy hearing how MustLive views the world, I asked what he thought would make a difference. He replied:
“There are some possible solutions, like making browsers less XSS-sensitive (with more protection), making server languages and frameworks more secure (with more anti XSS filters) and through teaching people (what I do every day when finding another XSS hole and notifying site's administrator). There are a lot of people (and security guys also) which need to be taught: site owners and administrators, web developers (site and web app developers), security specialist (who must be aware of a real state of security) and Internet community in whole.
We need productive solutions (because teaching is slow one), so protection on client-side (browsers) and server-side (languages and frameworks) can help us, but even finding and fixing (and teaching) one XSS is a small step to better and secure world.”
MustLive is talking about a holistic approach to web security, very similar to what I’ve been describing. I wonder, though, what the timeline for improvement looks like based on these "best practices".