Application Security Evangelist for Veracode, with 15 years of application and information security industry experience, and a contributing member of the Cloud Security Alliance. I've worked for a number of organizations (logos omitted); most of my past has been spent as a product manager for application and information security companies. silvexis.com (blog)
• Long con ‣ A confidence trick that takes place over a long period of time
• Dynamic Analysis ‣ Black-box, external automated pen testing performed against a running application. Sometimes called Dynamic Application Security Testing, or DAST
• Static Analysis ‣ White-box, automated source/binary analysis performed against a non-running collection of source code or compiled binaries
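To make the black-box/white-box distinction concrete, here is a deliberately naive Python sketch. The marker payload, the regex, and the function names are all illustrative inventions; a real DAST or SAST engine is vastly more sophisticated than a string match.

```python
import re

# Hypothetical probe a dynamic scanner might inject into a parameter.
XSS_MARKER = "<script>alert(1)</script>"

def dynamic_check(response_body: str) -> bool:
    """Black box: did the *running* application echo our probe back
    unescaped? We only see the response, never the source."""
    return XSS_MARKER in response_body

def static_check(source_code: str) -> bool:
    """White box: does the *non-running* source write request input to
    the page without encoding it? (A crude, illustrative pattern.)"""
    sink = re.compile(r"document\.write\(.*request\.", re.IGNORECASE)
    return bool(sink.search(source_code))

# A vulnerable page reflects the probe verbatim...
assert dynamic_check(f"<html>You searched for {XSS_MARKER}</html>")
# ...while a safe page HTML-encodes it.
assert not dynamic_check("<html>You searched for &lt;script&gt;...</html>")
```

The point of the sketch: dynamic analysis needs a live target and judges only observable behavior, while static analysis needs the code and judges only its structure. Neither view alone is complete.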
web application security scanning products available today1
‣ 17 are commercial products, the rest open source
• Most claim to fully automate the process of finding web vulnerabilities
‣ An educated guess says that all of the Fortune 500 own or use at least one of these tools
• Yet websites are still vulnerable, people are still getting hacked, and we don't seem to be any better off

1. The History of Web Application Scanning Project, http://silvexis.com/research/hwas/
first web application scanner, "Whisker," created by Rain Forest Puppy
‣ A "CGI scanner," back when CGI was how it was done
‣ Focused on known attacks and misconfigurations
‣ Whisker was eventually discontinued, and its core features became libwhisker, which was used by tools like Nikto
‣ Both libwhisker and Nikto are still maintained, which makes Nikto the longest-running web application scanner in existence
v1
• Focused on ease of use and the less experienced user
• Linux server-hosted web proxy, web based
• No automated crawler
• Searched for unknown vulnerabilities only

WebInspect v1
• Focused on the expert user/pen tester
• Windows desktop application
• Fully automated crawler
• Searched for known vulnerabilities

Both products rapidly converged into two very similar products... and got acquired by giants
Cenzic, KavaDo, NT Objectives, Rapid7, nCircle, Qualys, Veracode, Whitehat, etc.
‣ They all work more or less the same way
‣ Crawl something, audit something, try to be smart about it
‣ Some work better than others, but it doesn't seem to be solving the problem
• Veracode and Whitehat have a managed service model, which just might be on to something
‣ Managed service means when the scan fails, it's their fault, not yours
‣ Not to be confused with Qualys, which is hosted, not managed
‣ Disclosure: I work for Veracode, so don't just take my word for it
up with two things
‣ The technology platform (e.g. "standards")
‣ The attacks and weaknesses
• The Web technology platform is moving as fast as or faster than the attacks and weaknesses
‣ Other than the web, is there any other area in security technology where the platform has consistently moved faster than the threat landscape?
‣ This is also why web applications are consistently the source of the majority of the defects1,2
• Nobody can exactly agree on just what a Web Application is
‣ It's a moving target: yesterday it was a collection of web pages, today it's a multi-headed monster

1. The majority of vulnerabilities found were web vulnerabilities. Veracode State of Software Security Report, Vol. 2
2. 54% of hacking breaches were targeted at web applications. Verizon Data Breach Report, 2010
the Internet is that there are no standards
• The second rule of the Internet is that THERE ARE NO STANDARDS
• Anything even remotely resembling a standard is only loosely enforced
• Internet Explorer is the "compiler for the web": if it works in IE, ship it

(Slide graphic: a word cloud of HTML5, AJAX, JSON, rfc-3514, rfc-3093)
too ‣ An expert with nothing is still better than a novice with a tool, yet we are giving novices tools and expecting the same result ‣ We are too obsessed with trying to control change as a means to improve security ‣ We are not thinking about scalability ‣ We are too focused on scanning
dead.
• If you are a vendor, stop offering this as a viable option; you are lying to your customers
‣ Using only an automated crawler is just asking for it; demand that they record the important test cases
• Ultimately, web security testing will look a lot like desktop application testing
‣ There is no such thing as an automated crawler for desktop applications
‣ This means you must get to know the application in advance
‣ BTW, the application owner would be much better at this than you
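The "record the important test cases" idea can be sketched as replaying captured requests instead of crawling blind. Everything below is a hypothetical illustration, assuming cases captured from a proxy while the application owner walked through the workflows that matter; the case format, the `build_requests` helper, and the example URLs are invented for this sketch.

```python
from urllib.parse import urlencode, urljoin

# Hypothetical recorded test cases, e.g. captured from an intercepting
# proxy during a walkthrough by the application owner.
RECORDED_CASES = [
    {"method": "GET",  "path": "/search",        "params": {"q": "test"}},
    {"method": "POST", "path": "/account/login", "params": {"user": "demo"}},
]

def build_requests(base_url: str, cases: list[dict]) -> list[tuple[str, str]]:
    """Turn recorded cases into (method, url) pairs to replay against a
    target, instead of hoping an automated crawler stumbles onto them."""
    requests = []
    for case in cases:
        url = urljoin(base_url, case["path"])
        if case["method"] == "GET" and case["params"]:
            url = f"{url}?{urlencode(case['params'])}"
        requests.append((case["method"], url))
    return requests

print(build_requests("https://app.example.com", RECORDED_CASES))
# → [('GET', 'https://app.example.com/search?q=test'),
#    ('POST', 'https://app.example.com/account/login')]
```

The design point: the valuable knowledge (which workflows exist and matter) comes from a human who knows the application; automation is reduced to faithfully replaying and auditing those known-good paths.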
scan-centric mode of thinking
‣ Unless you are only interested in compliance, ugh.
• This will be your only hope for keeping track of your application perimeter
Unless you happen to be in the testing business
• You have 3,000 applications; your small team of 3 will never scale1
• You want to manage the process, not be the process
• If you or your application development org can't handle security testing, consider a managed security testing service

1. Actual size of one Fortune 100 company's application security team
Constant
• If your process requires that you can control change, you have already failed
• Even if you have change control, you are hurting your business by keeping it
• Set the expectation of continuous, anytime-anywhere assessment
‣ If you can't do it, find a company that can do it for you
‣ Hackers don't schedule their pen tests
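One way to read "continuous, anytime-anywhere assessment" is that no application's last assessment should ever be allowed to age out. A toy Python sketch of that bookkeeping, with an arbitrary 30-day window and invented application names (neither is a recommendation from the talk):

```python
from datetime import datetime, timedelta

def overdue_apps(last_assessed: dict[str, datetime],
                 now: datetime,
                 max_age: timedelta = timedelta(days=30)) -> list[str]:
    """Return the applications whose most recent assessment is older
    than max_age -- i.e. where 'continuous' has quietly lapsed."""
    return sorted(app for app, when in last_assessed.items()
                  if now - when > max_age)

now = datetime(2011, 6, 1)
inventory = {
    "billing-portal": datetime(2011, 5, 25),   # assessed last week
    "legacy-intranet": datetime(2010, 11, 2),  # assessed seven months ago
}
print(overdue_apps(inventory, now))  # → ['legacy-intranet']
```

Tracking the inventory this way turns "we scan sometimes" into a measurable coverage question, which is the scalable posture the slide argues for.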
They will never scale • There will always be false positives or negatives • They can’t improve as fast as the web evolves • They are too complex to operate consistently • They are not capable of continuous assessment
perimeter must protect itself; it can't rely on external controls
• Must be resilient to a continuous assessment of its strength
• Rugged Software Manifesto
‣ http://www.ruggedsoftware.org/