If web application scanning tools are the power tools used for broad application assessment, then the more sophisticated penetration tester will extend and refine their results with finely tuned scalpels. Myself? I've always favored Netcat, Paros and human intelligence. That is not to say there aren't many other powerful tools available; these just happen to be my scalpels of choice.
Whenever I hear people say they wish there were an open source web application scanning tool - something like Metasploit, but for the application layer - I'm genuinely puzzled. I wish for something even more basic: a solid, mature open source framework from which to perform web application assessments. I want a framework from which I can begin with an architectural risk analysis, and move forward, collecting and trending SDL artifacts - through to a platform from which I can proxy, build, fuzz and report on my assessment. Am I missing something?
Free & Open Source Tools
In the free and open source category of web application assessment tools, there is no shortage of current projects: Burp proxy, Grabber, Pantera, Paros, SPIKE Proxy, WebScarab, Wapiti and W3AF, to name a few. Many of these even have capabilities or extended features that allow for basic web application scanning. The immediate question is why one of these products is not able to make the leap to being the "Metasploit" for web applications. The short answer? It ain't easy. Commercial web application scanner vendors invest significant amounts of time in two core areas of intellectual property: application scanning, and broadly applicable security tests and validation.
Application Scanning
While scanning web applications the way Google does - a mile wide and an inch deep, with no session state or awareness - is relatively simple, the applications of today demand much more. The advent of rich internet applications (RIAs) means that scanners can no longer just parse the DOM for links. They must be able to build virtual pages. They must be able to execute JavaScript and Flash. They must be able to crawl through an application depth first (rather than breadth first). Consider applications that perform client-side validation for performance (while also performing server-side validation for security). Without executing that client-side JavaScript in the context of a virtual browser, a scanner cannot determine whether the application is vulnerable to DOM-based cross-site scripting. Automating repeatable and comprehensive exercise of an application requires blood, sweat and tears. This alone is difficult for an open source project to replicate.
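To make the gap concrete, here is a minimal sketch (not production crawler code) contrasting the two approaches: a static pass only sees anchors literally present in the raw HTML, while a browser-driven pass - here using Selenium, which a commenter mentions below - also sees links that client-side JavaScript writes into the DOM. The target URL is hypothetical.

```python
# Minimal sketch: static link extraction vs. browser-driven extraction.
# The target URL is hypothetical; requires Selenium and a Firefox driver.
import re
import urllib.request

from selenium import webdriver
from selenium.webdriver.common.by import By

URL = "http://example.test/app"  # hypothetical target

# 1) Static pass: only finds hrefs literally present in the HTML source.
html = urllib.request.urlopen(URL).read().decode("utf-8", "replace")
static_links = set(re.findall(r'href="([^"]+)"', html))

# 2) Browser-driven pass: executes the page's JavaScript first, so
#    links that client-side code writes into the DOM become visible.
driver = webdriver.Firefox()
try:
    driver.get(URL)
    dynamic_links = {
        a.get_attribute("href")
        for a in driver.find_elements(By.TAG_NAME, "a")
        if a.get_attribute("href")
    }
finally:
    driver.quit()

# Anything left here was invisible to the static pass - exactly the
# attack surface a DOM-unaware scanner never exercises.
print(dynamic_links - static_links)
```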
Broadly Applicable Security Tests
The second significant challenge for the open source community is creating tests that work across a large number of applications and organizations. Creating tests that work for organization A is easy. Creating tests for organization B is relatively easy. However, creating tests that work broadly across radically different industries and implementations takes a tremendous amount of experience, feedback and time - again, something that is not well suited for the open source industry.
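As an entirely illustrative sketch of why such tests don't generalize: a naive SQL injection check keyed on one stack's error strings works for organization A and silently fails for every stack missing from its signature list. The signatures below are a tiny, incomplete sample.

```python
# Sketch: an error-signature check only generalizes as far as its
# signature list; every stack not represented is a silent false negative.
ERROR_SIGNATURES = {
    "mysql": ["You have an error in your SQL syntax"],
    "mssql": ["Unclosed quotation mark"],
    "oracle": ["ORA-01756"],
    # ...every other database, framework and custom error page an
    # assessor has ever seen in the field belongs here too.
}

def looks_injectable(response_body: str) -> bool:
    """Naive check: did our injected quote provoke a known DB error?"""
    return any(
        signature in response_body
        for signatures in ERROR_SIGNATURES.values()
        for signature in signatures
    )
```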
My Wish for the Future
So if building a mature scanning solution with broadly applicable testing is difficult, what is it that I would like? I want a more humble solution from the open source community, and a more flexible product from the commercial vendors.
From the open source community, I would like a single mature product where I can both collect my artifacts (predictive threat index, architectural risk analysis, threat modeling, etc) and where I can exercise my scalpels of choice (network requests, proxy tools, fuzzers, statistical analysis, etc). This consolidation is something that just doesn't exist today. Much of the pain comes when information from disparate sources all needs to be "brought together" into that final report. Unlike network penetration testing, where much of the work is a binary break-it-or-not exercise, web application security is a puzzle whose pieces need to be massaged, manipulated, and correlated before the findings can be understood. That requires a platform to work from.
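As a sketch of what that missing consolidation layer might look like (all names here are invented for illustration): a single findings schema that every scalpel writes into, so the final report becomes a query over one store rather than a copy-and-paste exercise.

```python
# Sketch of a unified findings store: every tool (proxy, fuzzer,
# threat model, manual note) records into one schema, and the report
# is generated from that single source. All names are invented.
from dataclasses import dataclass, field


@dataclass
class Finding:
    source: str         # e.g. "WebScarab", "fuzzer", "threat-model"
    phase: str          # e.g. "risk-analysis", "fuzzing", "validation"
    target: str         # URL or component the finding applies to
    title: str
    severity: str       # e.g. "high" / "medium" / "low"
    evidence: str = ""


@dataclass
class Assessment:
    name: str
    findings: list[Finding] = field(default_factory=list)

    def add(self, finding: Finding) -> None:
        self.findings.append(finding)

    def report(self) -> str:
        """Correlate everything into one report, grouped by target."""
        lines = [f"Assessment: {self.name}"]
        for f in sorted(self.findings, key=lambda f: f.target):
            lines.append(
                f"  [{f.severity}] {f.target}: {f.title} (via {f.source})"
            )
        return "\n".join(lines)
```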
From the commercial web application scanning vendors (of which we are certainly a part), I would like a flexible platform from which to work. Don't lock me in. Let me build my own extensions. And I don't want to manually consolidate automated scanning results, artifacts, and findings from the scalpels and manual assessment. I want a platform from which I can begin, perform and complete an assessment. Give me a platform that will automate the drudgery (maintaining session state, report creation, artifact inclusion, etc) and from which I can do what humans do best: apply human intelligence to solving the puzzle.
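The session-state drudgery in particular is easy to describe and tedious to do by hand. Here is a minimal sketch of the kind of re-authentication housekeeping I want a platform to own, using Python's requests library with invented URLs, credentials and logout markers:

```python
# Sketch: keep an assessment session alive automatically so the human
# never notices a logout. URLs, credentials and the logout marker are
# invented; any real application would need its own.
import requests

LOGIN_URL = "http://example.test/login"
CREDS = {"username": "auditor", "password": "secret"}


def login(session: requests.Session) -> None:
    session.post(LOGIN_URL, data=CREDS)


def fetch(session: requests.Session, url: str) -> requests.Response:
    """GET a page, transparently re-authenticating if the application
    bounced us back to the login form mid-assessment."""
    resp = session.get(url)
    if "Please log in" in resp.text:  # app-specific logout marker
        login(session)
        resp = session.get(url)
    return resp


s = requests.Session()
login(s)
page = fetch(s, "http://example.test/admin/report")
```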
But hey, that's just me and my wishes.
You said, "I want a framework from which I can begin with an architectural risk analysis, and move forward, collecting and trending SDL artifacts - through to a platform from which I can proxy, build, fuzz and report on my assessment. Am I missing something?".
Yes, you're missing quite a lot. I'm not sure what you mean with regard to Metasploit. I know Mark Curphey has mentioned this on his blog in the context of wfuzz, and I pointed him towards w3af.
Metasploit is the wrong comparison; AttackAPI is closer to what Metasploit does. You're thinking of Nessus, which is a scanning tool. Metasploit is an exploitation engine.
Also, the focus on penetration testing in the security world usually makes me wonder how developers got left out of the big picture. What about tools for them? What about automated static code analyzers, model checkers, coverage tools, and dynamic analysis / hybrid analysis tools?
First you say, "They must be able to crawl through an application depth first (rather than breadth first)".
Oh, so open-source projects don't do this? I guess you have never heard of the ELZA Project then. SPI Dynamics has - they stole the scanning ideas directly from this project and then had their lawyers threaten the creator of ELZA with violation of their patents.
Then you go on to say, "Automating repeatable and comprehensive exercise of an application requires blood, sweat and tears. This alone is difficult for an open source project to replicate", with specific reference to the RIA, Flash, and Ajax scanning problems.
It is my opinion that these issues are best addressed with open-source. In fact, I haven't seen them well addressed by commercial scanners. Maybe you are confusing the two!
You would only need to look at the Sahi project to understand how far ahead open-source is for web application testing. But I'll mention the OpenQA organization as well. Especially since they release quality open-source work such as Watir, Selenium IDE, and Selenium Remote Control. But these tools are geared more towards developers than towards pen-testers (aka point-and-clickers), so they don't count, right?
Furthermore, you contend, "However, creating tests that work broadly across radically different industries and implementations takes a tremendous amount of experience, feedback and time - again, something that is not well suited for the open source industry".
What? How is that task not suited for the open source industry? That's exactly the benefit of open-source.
Finally, you end with, "I would like a single mature product where I can ... exercise my scalpels of choice (network requests, proxy tools, fuzzers, statistical analysis, etc). This consolidation is something that just doesn't exist today".
ProxMon does this. It's basically a proxy fuzzer that handles the network requests and does statistical analysis. It does require WebScarab, but it's open-source so you can change that requirement if you'd like to.
The problem with not using manual local proxies to test is well-defined. It's why you lose business to all the best assessors in web application security. You need to use multiple browsers from multiple platforms to test effectively. Manually walking the site with a browser has different state issues than an automated scraper does, although twill, another open-source tool, is pretty good for this purpose - much better than LWP and WWW::Mechanize, which I believe WI and AppScan are based on.
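For readers who haven't tried twill: it scripts a stateful walk through a site, carrying cookies and form state along, which is exactly the property being pointed at here. A minimal sketch against an invented target, using twill's Python-level commands (check twill's documentation for exact signatures):

```python
# Sketch: a stateful site walk with twill. Cookies and form state
# persist across commands, unlike a stateless scraper. The target
# URL and form fields are invented for illustration.
from twill.commands import go, fv, submit, code, follow

go("http://example.test/login")   # fetch the login page
fv("1", "username", "auditor")    # fill form 1's username field
fv("1", "password", "secret")
submit()                          # POST the form; cookies are retained
code(200)                         # assert the response code

follow("Account settings")        # click a link, session state intact
code(200)
```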
I'm at a loss for words. Luckily for IBM, software patents are their primary LOB. You're in good hands. I bet they put you up to this post - what with their whole, "We love open-source; if it could only be as good as DB2 we might even use it ourselves" motto.
HP has most of the other IP. Funny that the name Selenium comes from the cure for Mercury poisoning, as in, "Mercury Interactive". I would think it's highly likely that OpenQA picked that name on purpose - and it appears that they can also read minds, as WI is the only commercial tool that lacks the critical RIA support you speak of.
Also - you may want to check out a list I built of mostly open-source web application security tools. Everything I mentioned in this post is available via one of the links on this page. Available here:
http://owasp.org/index.php/Phoenix/Tools
Posted by: dre | July 10, 2007 at 01:26 PM
dre,
Thanks for your comments. I actually agree with many of the points that you raise. For example, you mention that developers have been left out of the picture. I agree. In fact, my own thought is that this is part of the problem and has brought us to the (in)security that we have today. We've been so focused on security issues, and on communicating in terms of really technical exploits, that we have failed to address the causes of the problem or to help actually "do" something about it. However, my post here was not written from the perspective of solving the problem or helping developers, but from a very different perspective - that of audit and penetration.
This is why I actually use the Metasploit example, rather than the Nessus example. As an auditor, some of the value which I bring is around context. Using DREAD, this might fall in the exploitability and damage categories. These are risk measurements which are best left to a human who can understand context. While Nessus might do a scanning vulnerability assessment, Metasploit gives me an exploitation engine. You're correct in pointing out that AttackAPI is very close to this.
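For readers unfamiliar with DREAD: it rates Damage, Reproducibility, Exploitability, Affected users and Discoverability, and one common convention scores each category from 1 to 10 and averages the five. A minimal sketch of that convention (other weightings exist):

```python
# Sketch: DREAD risk scoring. Each category is rated 1-10 and the
# overall risk is the average - one common convention among several.
from statistics import mean


def dread_score(damage: int, reproducibility: int, exploitability: int,
                affected_users: int, discoverability: int) -> float:
    ratings = [damage, reproducibility, exploitability,
               affected_users, discoverability]
    assert all(1 <= r <= 10 for r in ratings), "ratings run from 1 to 10"
    return mean(ratings)


# Exploitability and damage are exactly the inputs where a human
# auditor's sense of context matters most.
print(dread_score(damage=8, reproducibility=6, exploitability=9,
                  affected_users=7, discoverability=5))  # -> 7.0
```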
Looking at this from an audit perspective, I still have these frustrations. If you were to look at my toolbox, for every commercial tool I use, I have at least 15-20 free or open source tools. I like these tools. I use them daily. I don't mean to belittle the open source industry. Percentage-of-time-wise, I spend more time using open source tools than commercial tools. The commercial tools, for me, are generally for scale and the broad brush-stroke approach, and that's why the manual local proxies (as you point out) are still needed. My frustration with the open source tools is that I have so many of them. Again, "I would like a single mature product where I can both collect my artifacts (predictive threat index, architectural risk analysis, threat modeling, etc) and where I can exercise my scalpels of choice (network requests, proxy tools, fuzzers, statistical analysis, etc)."
Perhaps I didn't explain this well enough. I think it's valuable to be able to tie together a security requirement from the requirements phase, with an actual exploitation test as part of my audit, to a report. Currently, doing this simple step requires a handful of powerful, independent tools. And, even then, I'm forced to manually tie all this together for the final report.
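To make that "simple step" concrete, here is a sketch of the traceability I mean - a requirement, to the test that exercised it, to a report line - with every identifier and result invented for illustration:

```python
# Sketch: tie a security requirement to the test that exercised it and
# emit a report line - the linkage I currently maintain by hand.
# All IDs, names and results below are invented.
requirements = {
    "SR-12": "Session tokens must be invalidated on logout",
}

test_results = [
    {"req": "SR-12", "test": "replay session token after logout",
     "result": "FAIL", "evidence": "old token still accepted"},
]

for t in test_results:
    req_text = requirements.get(t["req"], "(unlinked requirement)")
    print(f"{t['req']}: {req_text}")
    print(f"  test: {t['test']} -> {t['result']} ({t['evidence']})")
```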
I don't want to say that the open source community cannot produce a web application scan and audit tool. (Consider something as simple as a JavaScript-based crawler like CSpider [ http://devedge-temp.mozilla.org/toolbox/examples/2003/CSpider/index_en.html ].) I'm just saying that when I hear people ask for this from the open source community, I take a more humble approach and wish that I had a better platform from which to perform my audit. I know from experience that building (a) a really good crawler with session management and (b) a set of tests that fits a broad market takes a lot of time and resources. As an auditor, I would much rather see the current proxy tools we have be enhanced into a platform that takes me from security requirements to audit (exploitation) to report.
Posted by: Danny Allan | July 10, 2007 at 04:53 PM
Hi,
I'd like to emphasize a point that Danny is making regarding the abundance and redundancy of webappsec open source tools, which, instead of being tied together, just sit side by side, barely complementing each other.
Take OWASP for example. Under OWASP you have:
1) WebScarab
2) Pantera
3) DirBuster
4) Sprajax
5) WSFuzzer
6) Interceptor (XML)
7) JBroFuzz
I actually remember that during the OWASP conference last year in Seattle, when Interceptor was introduced, some OWASP people were wondering why it wasn't a part of WebScarab. Since I personally use WebScarab a lot, I think it should have been turned into a framework, with all of the other products plugging into it, including projects such as Report Generator.
Posted by: AppSecInsider | July 10, 2007 at 08:06 PM
I did a small writeup on the new w3af GTK interface:
http://fuzion.rootmybox.org/?p=11
Posted by: fuzion | January 16, 2008 at 02:38 PM