The Long Con of Application Scanning - Security B Sides ATL

Presented on October 8th 2010

Erik Peterson

Transcript

  1. The Long Con of Automated Web Application Security Testing Erik

    Peterson / erik@silvexis.com / @silvexis ATL
  2. Who is Erik Peterson erik@silvexis.com @silvexis (twitter) I am an

    Application Security Evangelist for Veracode with 15 years of application and information security industry experience and a contributing member of the Cloud Security Alliance. I’ve worked for the following organizations: Most of my past has been spent as a product manager for Application & Information Security Companies silvexis.com (blog)
  3. A Few Quick Definitions • Long Con ‣ A long

    con is a confidence trick that takes place over a long period of time • Dynamic Analysis ‣ Black box, external automated pen testing performed against a running application. Sometimes called Dynamic Application Security Testing or DAST • Static Analysis ‣ White box, source/binary automated analysis performed against a non- running collection of source code or compiled binaries
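The static/dynamic distinction above can be made concrete with a toy example (not from the talk). This is a minimal sketch of the static-analysis side: a pattern check over source text for dangerous sinks, with the code never executed. The patterns and messages are invented for illustration; real static analyzers build ASTs and track data flow rather than matching regexes.

```python
import re

# Illustrative "static analysis" in miniature: inspect source text for
# dangerous sinks without running it. Patterns here are hypothetical
# examples, not a real analyzer's rule set.
DANGEROUS_SINKS = {
    r"\beval\s*\(": "code injection via eval()",
    r"\bos\.system\s*\(": "command injection via os.system()",
    r"\bexecute\s*\(\s*['\"].*%s": "SQL built by string formatting",
}

def static_scan(source: str) -> list[str]:
    """Return a finding for every line matching a dangerous-sink pattern."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for pattern, description in DANGEROUS_SINKS.items():
            if re.search(pattern, line):
                findings.append(f"line {lineno}: {description}")
    return findings

sample = "user = input()\neval(user)\n"
print(static_scan(sample))
```

A dynamic (DAST) tool, by contrast, would never see this source at all: it would only send requests to the running application and observe responses.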
  4. What is The Long Con? • There are 27 dynamic

    web application security scanning products available today1 ‣ 17 are commercial products, the rest open source • Most claim to fully automate the process of finding web vulnerabilities ‣ An educated guess says that all of the Fortune 500 own or use one of these tools • Yet, Web Sites are still vulnerable, people are still getting hacked, and we don’t seem to be any better off 1. The History of Web Application Scanning Project, http://silvexis.com/research/hwas/
  5. The Cake is a Lie  But what about Dynamic

    Scanning? • Just what have we been trying to do for 10 years? • Why is it so damn hard? • Is there a better approach?
 These are today's questions
  6. 10 years of history in 60 seconds Attempts at Dynamic

    Web Scanning
  7. It was 1999 and the web was young... • The

    first web application scanner “Whisker”, created by Rain Forest Puppy ‣ “CGI Scanner” back when CGI was how it was done ‣ Focused on known attacks and misconfigurations ‣ Whisker was discontinued eventually and the core features became libwhisker which was used by tools like Nikto ‣ Both of which are still maintained which makes Nikto the longest running web application scanner in existence
  8. A year later, the Start of the Commercial Era AppScan

    1.0 Released July 25th 2000 WebInspect 1.0 Released Feb 1st 2001
  9. Two dramatically different starting points With one common ending AppScan

    v1 • Focused on ease of use and the less experienced user • Linux Server Hosted Web Proxy, Web Based • No Automated Crawler • Searched for unknown vulnerabilities only WebInspect v1 • Focused on the expert user/Pen tester • Windows Desktop Application • Fully Automated Crawler • Searched for known vulnerabilities Both products rapidly converged into two very similar products...and got acquired by giants
  10. A lot of other tools arrived since then • Acunetix,

    Cenzic, KavaDo, NT Objectives, Rapid7, nCircle, Qualys, Veracode, Whitehat etc.. ‣ They all work more or less the same way ‣ Crawl something, Audit something, Try to be smart about it ‣ Some work better than others, but it doesn’t seem to be solving the problem • Veracode and Whitehat have a managed service model, which just might be on to something ‣ Managed service means when the scan fails, it’s their fault, not yours ‣ Not to be confused with Qualys, which is hosted not managed ‣ Disclosure: I work for Veracode, so don’t just take my word for it
  11. So after 10 years, Why is Dynamic Web Scanning still

    so Damn Hard?
  12. A Never Ending Battle A scanning tool has to keep

    up with two things ‣ The technology platform (e.g. “standards”) ‣ The attacks and weaknesses The Web technology platform is moving as fast as or faster than the attacks and weaknesses ‣ Other than the web, is there any other area in security technology where the platform has consistently moved faster than the threat landscape? ‣ This is also why web applications are consistently the source of the majority of the defects1,2 Nobody can exactly agree on just what a Web Application is ‣ It’s a moving target; yesterday it was a collection of web pages, today it’s a multi-headed monster 1. Majority of vulnerabilities found were web vulnerabilities, Veracode State of Software Security Report, Vol 2 2. 54% of hacking breaches were targeted at web applications, Verizon Data Breach Report, 2010
  13. False Positives False Negatives Pick One

  14. How Would You Expect an Automated Crawl of this Application

    to Perform?
  15. And Just What is a Web Application Anyway? Web Application

    * P.S. - This is also your new “perimeter”
  16. Web Standards will save us? • The first rule of

    the Internet is that there are no standards • The second rule of the Internet is that THERE ARE NO STANDARDS • Anything even remotely resembling a standard is only loosely enforced • Internet Explorer is the “compiler for the web”
 If it works in IE, ship it rfc-3514 HTML5 rfc-3093 AJAX JSON
  17. What about the People and Process part? • That’s broken

    too ‣ An expert with nothing is still better than a novice with a tool, yet we are giving novices tools and expecting the same result ‣ We are too obsessed with trying to control change as a means to improve security ‣ We are not thinking about scalability ‣ We are too focused on scanning
  18. Suggestions for a better approach End the madness

  19. Accept that Automated  Crawling is Dead • Really, it’s

    dead. • If you are a vendor, stop offering this as a viable option, you are lying to your customers ‣ Using only an automated crawler is just asking for it, demand that they record the important test cases • Ultimately, Web security testing will look a lot like desktop application testing ‣ There is no such thing as an automated crawler for desktop applications ‣ This requires that you get to know the application in advance ‣ BTW, the application owner would be much better at this than you
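The "record the important test cases" idea can be sketched in a few lines. Assuming a hypothetical recorded-case format (the field names, paths, and payloads below are invented for illustration, not any vendor's actual format), an auditor replays owner-supplied requests with injected payloads instead of crawling blindly:

```python
from urllib.parse import urlencode

# Hypothetical recorded test cases: the application owner captures the
# requests that matter, rather than hoping a crawler discovers them.
RECORDED_CASES = [
    {"method": "POST", "path": "/login", "params": {"user": "alice", "pass": "secret"}},
    {"method": "GET",  "path": "/search", "params": {"q": "shoes"}},
]

# Illustrative audit payloads (SQL injection, reflected XSS).
AUDIT_PAYLOADS = ["' OR '1'='1", "<script>alert(1)</script>"]

def generate_audit_requests(cases, payloads):
    """For each recorded case, emit one variant per (parameter, payload) pair."""
    for case in cases:
        for name in case["params"]:
            for payload in payloads:
                params = dict(case["params"], **{name: payload})
                yield case["method"], case["path"] + "?" + urlencode(params)

for method, url in generate_audit_requests(RECORDED_CASES, AUDIT_PAYLOADS):
    print(method, url)
```

The coverage now depends entirely on what was recorded, which is the slide's point: the application owner, not the crawler, is the one who knows which test cases matter.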
  20. Stop Scanning Support A Continuous Assessment Model • Abandon the

    scan-centric mode of thinking ‣ Unless you are only interested in compliance, ugh. • This will be your only hope for keeping track of your application perimeter
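One way to picture continuous assessment versus one-off scanning: findings become state that persists across runs, and each run reports the delta. A minimal sketch, with invented finding identifiers:

```python
# Continuous assessment in miniature: rather than a standalone scan
# report, keep a rolling set of findings and classify each one as
# new, fixed, or persistent between runs. Identifiers are illustrative.
def diff_findings(previous: set[str], current: set[str]) -> dict:
    """Compare two assessment runs and classify each finding."""
    return {
        "new": sorted(current - previous),
        "fixed": sorted(previous - current),
        "persistent": sorted(previous & current),
    }

run_1 = {"XSS /search q", "SQLi /login user"}
run_2 = {"SQLi /login user", "CSRF /profile"}
print(diff_findings(run_1, run_2))
```

The persistent bucket is what a scan-centric process tends to lose track of: the same finding rediscovered every quarter looks "new" each time unless state is carried forward.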
  21. Don’t expect the web to stand still Be Ready to

    Evolve with the Technology A constant feedback loop is required
  22. Figure out how to get out of the testing business

    Unless you happen to be in the Testing Business • You have 3000 applications, your small team of 3 will never scale1 • You want to manage the process, not be the process • If you or your application development org can’t handle security testing consider a managed security testing service 1. Actual size of one Fortune 100 company’s application security team
  23. Go/No Go Authority is a Myth Accept that Change is

    Constant • If your process requires that you can control change, you have already failed • Even if you have it, you are hurting your business by keeping it • Set the expectation for continuous anytime anywhere assessment ‣ If you can’t do it, find a company that can do it for you ‣ Hackers don’t schedule their pen tests
  24. My Ultimate Recommendation Give up on Stand Alone Tools •

    They will never scale • There will always be false positives or negatives • They can’t improve as fast as the web evolves • They are too complex to operate consistently • They are not capable of continuous assessment
  25. Final Thought Your Applications are the new Perimeter • A

    perimeter must protect itself, can’t rely on external controls • Must be resilient to a continuous assessment of its strength • Rugged Software Manifesto ‣ http://www.ruggedsoftware.org/
  26. silvexis.com erik@silvexis.com @silvexis (twitter) ATL Thank You BSides ATL!