Sunday, September 03, 2006

Vulnerability "discovery" more important than disclosure

The vulnerability "disclosure" debate isn't going away. A post at StillSecure, After All These Years has some nice links to experts boiling down their respective arguments, attempting to balance researcher ethics, user security, and vendor responsibility. My question is, "what happens to our security when researchers lose the legal ability to discover vulnerabilities in the software that's the most important (custom web applications)?"

By their nature, custom web applications are hosted on someone else's servers and available nowhere else. Attempting to find vulnerabilities of any kind on machines other than your own is frowned upon as potentially illegal. Who cares about disclosure when we can't even go about finding security issues without running the risk of going to jail? Those who say, "do not test a system without written consent," offer good but short-sighted advice. The InfoSec community hasn't dealt with the legal issues of "discovering" vulnerabilities, only with "disclosing" them.

Traditionally, researchers have played the role of Good Samaritan by finding vulnerabilities in software readily available to them. We're rapidly moving towards a world where the software that holds our most sensitive information (online banks, stores, IRS, etc.) is no longer desktop PC software. The same people who provide that layer of community oversight now run into a very real problem beyond ethics: a threat to their personal freedom. I'd wager few top researchers are willing to risk incarceration in pursuit of a few Cross-Site Scripting and SQL Injection issues. Organizations providing the web-based services are also not going to be handing out hack-me-if-you-can authorization letters. And with few people looking, software security naturally degrades. That's probably why 8 out of 10 websites have vulnerabilities.


Drew Hintz said...
This comment has been removed by the author.
Jeremiah Grossman said...

"I wonder when we ever had the legal right to test others' production systems."

I meant that from the perspective that security researchers in the past could test "important" software on their own machines. Today's important software now runs on someone else's machine, hence the loss of ability.

"I would argue 10 out of 10 have vulnerabilities."

If you restrict your sample set to non-static, non-brochureware websites, sure. I have hopes there might be one "secure" website out there in the world. :)

"Testing custom web apps without the owner's permission is similar to me attempting to break in to your company's mail server. It doesn't seem very white-hat to me."

My point is that the larger popular websites are getting banged on by thousands of people 24x7x365 anyway. Legal or not. If an organization is able to pull some vulnerabilities out of circulation for a nominal fee, then why not? Best to bring whomever they can onto their side.