Update 07.28.2009: Salesforce.com has published its "Vulnerability Reporting Policy," becoming the latest large corporation to "legalize it." I was privy to early drafts for feedback and I must say, the final product looks pretty good! They even provide a safe testing playground for security researchers. Nice touch! Hopefully more organizations will follow suit.
Update 06.20.2009: Jack Mannino offers well-thought-out and persuasive counterpoints to my suggestion below. I'll have to take some time to consider his arguments and respond accordingly. Boiled down, Jack reminds us that private websites and government systems exist for very different purposes, so the latter may not automatically benefit from crowd-sourced pen-testing. His second main point concerns saturating already limited incident-response resources.
I’d wager fewer than ten percent of United States .GOV and .MIL websites are professionally tested for custom Web application vulnerabilities. The reasons why are probably the same as in the private sector. Those responsible don’t know, or don’t want to know, that problems exist. Statistically, of course, they do, and our statistics are validated by a recent Federal Aviation Administration security report indicating that 70 of its websites tallied thousands of vulnerabilities. Those who do acknowledge the problem and wish to address it often lack the budget or authority to initiate a project. Consequently, enemies both foreign and domestic likely know more about where our government’s website vulnerabilities are located than the defenders do.
This is a vital concern and an issue I think could be solved through policy or legislation. I believe there are hundreds, maybe thousands, of vulnerability researchers ready and willing to volunteer themselves to find and disclose vulnerabilities -- for free -- if only allowed to do so. What every information security professional knows is that there are the penetration tests you pay for and those you get every day for free, no matter who you are. What they also know is that testing anything you don’t own, or lack written consent to test, runs the risk of legal prosecution. You are especially not supposed to touch government or military systems, as these organizations have an effectively unlimited amount of time and money to go after someone. This is vastly different from the approach of a private enterprise, whose investigation eventually comes with a cost-benefit analysis attached. Generally, no more is invested than the amount lost.
Even so, some researchers are comfortable with harmlessly poking at private sector websites for Cross-Site Scripting, Cross-Site Request Forgery, SQL Injection, and other bugs. XSSed.com serves as a good example of open disclosure, and it also demonstrates that no top-level domain is off limits despite the legal consequences. It is time to do something new, because we know Web applications are the biggest InfoSec risk we face. This is an extraordinarily large problem. And so, what if, to meet this challenge, we leveraged people’s willingness to find vulnerabilities on their own time, eliminated their risk of prosecution, and instead provided a mechanism for disclosure like a government version of the MSRC? That’s right, let us hack .GOV and .MIL as a veritable army of volunteer pen-testers. How cool would that be! It is not like anyone is currently being prosecuted for simply finding a government website vulnerability, so no loss there. Sound crazy? Maybe, but hear me out first.
Consider that several prominent companies such as PayPal, Microsoft, and Google have already successfully taken such measures. I have firsthand knowledge that more are on the way. Their policies state that as long as researchers follow the rules of engagement -- essentially not doing any damage or defrauding the system, and discreetly disclosing their findings so the companies can create a fix -- no legal measures will be taken. These organizations have matured and learned to work with the community. After a fix has been issued, the researcher may tell the world to bolster his reputation in the security community. No dollars are exchanged, but impressive work has led companies to single out specific researchers and thank them. Yes, there are potential downsides, but in my humble opinion the gain more than justifies the risks.
Assuming reported vulnerabilities are fixed promptly, a similar approach would benefit the government while measurably raising the bar for the bad guys. Currently, it seems that for many governmental Internet-connected systems the bar is set quite low. By allowing the good guys to assist, the government could gain access to a qualified pool of security talent to fill its internal security positions. Wasn’t the Pentagon looking to hire high school students for this sort of thing anyway? Open source and commercial vendors could get a new playground to test and improve their vulnerability scanning products. Hard to beat free pen-tests. College students and security training professionals could apply and safely hone their skills. Fake websites are nice and all, but nothing in Web application security compares to experience on real systems. Everyone wins!
I know I’ve painted in some broad strokes, but I’d very much like to hear what others think. If reasonable, perhaps this is something Jeff Moss, our de facto hacker representative to the Department of Homeland Security, could run up the flagpole.