Security Regression Testing on OWASP Zap Node API

Kim Carter
February 17, 2021

There is this problem that we (Development Teams and their businesses) are still struggling with after adding all the security bolt-ons and improvements. It’s called application security (AppSec).

As Developers, we’re still creating defective code. There are many areas we’ve been able to configure and automate to help improve security, but the very human aspect of creating secure code is still a dark art, and in many cases our single point of failure.

We’re going to discuss the traditional approaches to addressing security in our software, and why they’re just not cutting it any more. A red teaming engagement can be very expensive and comes too late in the SDLC for finding and then fixing bugs. In many cases we’re pushing code to production continuously, so the traditional approaches and security checks are no longer viable.

In this session, Kim will attempt to demystify how security can become less of a disabler/blocker and more of an enabler/selling point, allowing you to create and deliver robust software with security baked in as frequently and confidently as your business demands.
We’re going to unlock the secrets of building and running a Development Team with security superpowers (the purpleteam), finding and fixing defects at the very point they’re introduced.

One of the tools often used is OWASP ZAP, and we now have an officially supported Node API for it. In this talk we build on the Node API to create a fully featured security regression testing CLI that can be consumed by your CI/nightly builds.
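
To give a concrete feel for what the Node API wraps, here is a minimal sketch (not purpleteam itself) that drives ZAP's spider and active scanner through ZAP's JSON REST endpoints and fails a build step when high-risk alerts are found. The ZAP address, API key environment variable, target URL and pass/fail threshold are assumptions made for this example only.

    // Minimal sketch, Node 18+ (built-in fetch). Assumes ZAP is listening on
    // localhost:8080 and the SUT is reachable by ZAP at http://pt-sut-cont:4000.
    const zap = 'http://localhost:8080';
    const apikey = process.env.ZAP_API_KEY || '';
    const target = 'http://pt-sut-cont:4000';

    const zapGet = async (path, params) => {
      const qs = new URLSearchParams({ apikey, ...params });
      const res = await fetch(`${zap}${path}?${qs}`);
      return res.json();
    };

    // ZAP reports scan progress as a percentage string ("0" to "100").
    const waitForScan = async (statusPath, scanId) => {
      let status = '0';
      while (status !== '100') {
        await new Promise((resolve) => setTimeout(resolve, 2000));
        ({ status } = await zapGet(statusPath, { scanId }));
      }
    };

    (async () => {
      const { scan: spiderId } = await zapGet('/JSON/spider/action/scan/', { url: target });
      await waitForScan('/JSON/spider/view/status/', spiderId);

      const { scan: ascanId } = await zapGet('/JSON/ascan/action/scan/', { url: target });
      await waitForScan('/JSON/ascan/view/status/', ascanId);

      const { alerts } = await zapGet('/JSON/core/view/alerts/', { baseurl: target });
      const highs = alerts.filter((alert) => alert.risk === 'High');
      console.log(`${alerts.length} alerts found, ${highs.length} high risk`);
      process.exitCode = highs.length ? 1 : 0; // non-zero exit fails the CI step
    })();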

Transcript

  1. @binarymist

  2. InfoSecNZ Slack

  3. TRADITIONALLY
     How have we found bugs in software?

  4. TRADITIONALLY
     How have we found bugs in software?
     Um...

  5. TRADITIONALLY
     How have we found bugs in software?
     Um... We haven't really

  6. The catch-all Red Teaming Exercise

  7. The catch-all Red Teaming Exercise

  8. The catch-all Red Teaming Exercise

  9. The catch-all Red Teaming Exercise
     ≈$20k per week

  10. The catch-all Red Teaming Exercise
      ≈$20k per week
      ≈Engagement: two weeks

  11. The catch-all Red Teaming Exercise
      ≈$20k per week
      ≈Engagement: two weeks
      ≈Software project before release: six months

  12. The catch-all Red Teaming Exercise
      ≈$20k per week
      ≈Engagement: two weeks
      ≈Software project before release: six months
      ≈$40k per six months - per project

  13. The catch-all Red Teaming Exercise
      ≈$20k per week
      ≈Engagement: two weeks
      ≈Software project before release: six months
      ≈$40k per six months - per project
      Found: 5 crit, 10 high, 10 med, 10 low severity bugs

  14. The catch-all Red Teaming Exercise
      ≈$20k per week
      ≈Engagement: two weeks
      ≈Software project before release: six months
      ≈$40k per six months - per project
      Found: 5 crit, 10 high, 10 med, 10 low severity bugs
      Many bugs left unfound, waiting to be exploited

  15. The catch-all Red Teaming Exercise
      ≈$20k per week
      ≈Engagement: two weeks
      ≈Software project before release: six months
      ≈$40k per six months - per project
      Found: 5 crit, 10 high, 10 med, 10 low severity bugs
      Many bugs left unfound, waiting to be exploited
      Business decides to only fix the 5 criticals

  16. The catch-all Red Teaming Exercise
      ≈$20k per week
      ≈Engagement: two weeks
      ≈Software project before release: six months
      ≈$40k per six months - per project
      Found: 5 crit, 10 high, 10 med, 10 low severity bugs
      Many bugs left unfound, waiting to be exploited
      Business decides to only fix the 5 criticals
      Each bug: avg cost 15+ x the cost of fixing it when introduced

  17. The catch-all Red Teaming Exercise
      ≈$20k per week
      ≈Engagement: two weeks
      ≈Software project before release: six months
      ≈$40k per six months - per project
      Found: 5 crit, 10 high, 10 med, 10 low severity bugs
      Many bugs left unfound, waiting to be exploited
      Business decides to only fix the 5 criticals
      Each bug: avg cost 15+ x the cost of fixing it when introduced
      5 bugs x 15 x $320 = $24,000

  18. Bottom line:
      Red Teaming
      6 months (2 week engagement): $40,000
      Only 5 Red Team bugs fixed, cost: $24,000

  19. Bottom line:
      Red Teaming
      Too expensive
      Too late
      Too many bugs left unfixed

  20. We can do better

  21. We can do better
      And we have to

  22. Things are changing
      But some are not

  23. WHAT'S CHANGED?

  24. WHAT'S CHANGED?
      We no longer release every 6 months or year
      Now it's weekly, daily, hourly, etc
      More than ever we need to deliver faster

  25. The Internet has grown up
      And so have our attackers

  26. More than ever we need to lift our game

  27. THE MORE THINGS CHANGE
      THE MORE THEY STAY THE SAME

  28. THE MORE THINGS CHANGE
      THE MORE THEY STAY THE SAME
      What's the No. 1 area we as Developers/Engineers need the most help with?

  29. THE MORE THINGS CHANGE
      THE MORE THEY STAY THE SAME
      What's the No. 1 area we as Developers/Engineers need the most help with?
      APPSEC

  30. Establish a Security Champion

  31. Establish a Security Champion
      Hand-crafted Penetration Testing

  32. Establish a Security Champion
      Hand-crafted Penetration Testing
      Pair Programming

  33. Establish a Security Champion
      Hand-crafted Penetration Testing
      Pair Programming
      Code Review

  34. Establish a Security Champion
      Hand-crafted Penetration Testing
      Pair Programming
      Code Review
      Techniques for Asserting Discipline

  35. Establish a Security Champion
      Hand-crafted Penetration Testing
      Pair Programming
      Code Review
      Techniques for Asserting Discipline
      Techniques for dealing with Consumption of Free & Open Source

  36. Establish a Security Champion
      Hand-crafted Penetration Testing
      Pair Programming
      Code Review
      Techniques for Asserting Discipline
      Techniques for dealing with Consumption of Free & Open Source
      Security Focussed TDD

  37. Establish a Security Champion
      Hand-crafted Penetration Testing
      Pair Programming
      Code Review
      Techniques for Asserting Discipline
      Techniques for dealing with Consumption of Free & Open Source
      Security Focussed TDD
      Evil Test Conditions

  38. Establish a Security Champion
      Hand-crafted Penetration Testing
      Pair Programming
      Code Review
      Techniques for Asserting Discipline
      Techniques for dealing with Consumption of Free & Open Source
      Security Focussed TDD
      Evil Test Conditions
      Security Regression Testing

  39. SECURITY REGRESSION TESTING

  40. WHAT IS

  41. WHAT IS
      SECURITY REGRESSION TESTING?

  42. WHY?

  43. Bottom line:
      Red Teaming
      6 months (2 week engagement): $40,000
      Only 5 Red Team bugs fixed, cost: $24,000

  44. Purple Teaming

  45. Purple Teaming
      ≈$160 per hour per Engineer

  46. Purple Teaming
      ≈$160 per hour per Engineer
      Almost every security bug found+fixed as introduced

  47. Purple Teaming
      ≈$160 per hour per Engineer
      Almost every security bug found+fixed as introduced
      Almost 0 cost. Call each bug fix ≈2 hours (≈$320)

  48. Purple Teaming
      ≈$160 per hour per Engineer
      Almost every security bug found+fixed as introduced
      Almost 0 cost. Call each bug fix ≈2 hours (≈$320)
      If we fixed every (35) bug found in the red teaming exercise it would cost 35 x ≈$320 = ≈$11,200

  49. Purple Teaming
      ≈$160 per hour per Engineer
      Almost every security bug found+fixed as introduced
      Almost 0 cost. Call each bug fix ≈2 hours (≈$320)
      If we fixed every (35) bug found in the red teaming exercise it would cost 35 x ≈$320 = ≈$11,200
      As opposed to fixing 5 bugs & costing $24,000

  50. Purple Teaming
      ≈$160 per hour per Engineer
      Almost every security bug found+fixed as introduced
      Almost 0 cost. Call each bug fix ≈2 hours (≈$320)
      If we fixed every (35) bug found in the red teaming exercise it would cost 35 x ≈$320 = ≈$11,200
      As opposed to fixing 5 bugs & costing $24,000
      >2 x the cost, to fix only 14% of the bugs found in Red Teaming

  51. Purple Teaming
      ≈$160 per hour per Engineer
      Almost every security bug found+fixed as introduced
      Almost 0 cost. Call each bug fix ≈2 hours (≈$320)
      If we fixed every (35) bug found in the red teaming exercise it would cost 35 x ≈$320 = ≈$11,200
      As opposed to fixing 5 bugs & costing $24,000
      >2 x the cost, to fix only 14% of the bugs found in Red Teaming
      As opposed to fixing all 35 for < ½ the $ of the 5 crit Red Teaming fixes

  52. Purple Teaming
      Security regression testing will always find many more defects
      Not constrained by time
      Red Team: ≈2 weeks to hack
      Automated security regression testing:
      Every day (CI) to hack
      Every night (nightly build) to hack

  53. The Evolution of...

  54. Developers write imperative tests for everything

  55. Developers write imperative tests for everything
      All components required manual setup and config

  56. Developers write imperative tests for everything
      All components required manual setup and config
      Components need to be kept up to date

  57. Developers write imperative tests for everything
      All components required manual setup and config
      Components need to be kept up to date
      Minimum of three months work

  58. Developers write a little config
      No additional setup
      No updating components
      No writing tests

  59. Consumable by your CI/nightly builds
      Backed by a SaaS
      Pluggable Testers

  60. PURPLETEAM ARCHITECTURE

  61. The manual steps, everything else is automatic:

  62. The manual steps, everything else is automatic:
      1. Run docker-compose-ui

  63. The manual steps, everything else is automatic:
      1. Run docker-compose-ui
      2. Host Lambda functions

  64. The manual steps, everything else is automatic:
      1. Run docker-compose-ui
      2. Host Lambda functions
      3. Run your SUT

  65. The manual steps, everything else is automatic:
      1. Run docker-compose-ui
      2. Host Lambda functions
      3. Run your SUT
      4. Run the main docker-compose -> npm run dc-up

  66. The manual steps, everything else is automatic:
      1. Run docker-compose-ui
      2. Host Lambda functions
      3. Run your SUT
      4. Run the main docker-compose -> npm run dc-up
      5. Run CLI -> purpleteam test

  67. The manual steps, everything else is automatic:
      1. Run docker-compose-ui
      2. Host Lambda functions
      3. Run your SUT
      4. Run the main docker-compose -> npm run dc-up
      5. Run CLI -> purpleteam test
      6. Once test has finished, check artefacts

  68. As a consumer:
      1. Run docker-compose-ui
      2. Host Lambda functions
      3. Run your SUT
      4. Run the main docker-compose -> npm run dc-up
      5. Run CLI -> purpleteam test
      6. Once test has finished, check artefacts

  69. As a consumer:
      3. Run your SUT
      5. Run CLI -> purpleteam test
      6. Once test has finished, check artefacts

  70. As a consumer:
      1. Run your SUT
      2. Run CLI -> purpleteam test
      3. Once test has finished, check artefacts
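
In other words, when purpleteam is consumed as a SaaS (slide 59), steps 1, 2 and 4 are presumably hosted for you; what remains for the consumer is to run the SUT, run the CLI, and check the artefacts.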

  71. ORCHESTRATOR

  72. TESTERS

  73. TESTERS
      app-scanner

  74. TESTERS
      app-scanner
      server-scanner

  75. TESTERS
      app-scanner
      server-scanner
      tls-checker

  76. TESTERS
      app-scanner
      server-scanner
      tls-checker
      Your tester here?

  77. SLAVES

  78. Prod
      Dev

  79. App Testing Slaves

      # docker-compose up --scale zap=2
      version: "3.6"
      networks:
        compose_pt-net:
          external: true
      services:
        zap:
          image: owasp/zap2docker-stable
          networks:
            compose_pt-net:
          # Soft limit of 12 test sessions.
          ports:
            - "8080-8091:8080"
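
A note on the scaling shown above: the published host port range 8080-8091 covers 12 ports, which is where the soft limit of 12 test sessions comes from; docker-compose up --scale zap=N can therefore run up to 12 ZAP containers, each mapped to its own host port.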

  80. App Testing Slave helper (Selenium instance) (one for each App Testing Slave)

      version: "3.6"
      networks:
        compose_pt-net:
          external: true
      services:
        chrome:
          image: selenium/standalone-chrome
          networks:
            compose_pt-net:
          ports:
            - "4444-4455:4444"
          shm_size: 1G
        firefox:

  81. CLI

  82. CLI
      purpleteam

  83. Notable dependencies:
      "blessed",
      "blessed-contrib",
      "chalk",
      "convict",
      "eventsource",
      "purpleteam-logger",
      "request",
      "request-promise",
      "request-promise-native",
      "sywac"
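
As a hedged illustration of how a couple of these dependencies might be used, the sketch below subscribes to Server-Sent Events for tester progress with the eventsource package. The orchestrator origin, port, route and event name are invented for this example; they are not purpleteam's actual interface.

    // Hypothetical illustration only: the URL, route and 'testerProgress' event
    // name are assumptions, not purpleteam's real contract.
    const EventSource = require('eventsource');

    const subscribeToTesterProgress = (orchestratorOrigin, testerName) => {
      const es = new EventSource(`${orchestratorOrigin}/${testerName}-progress`);
      es.addEventListener('testerProgress', (event) => {
        const { progress } = JSON.parse(event.data);
        console.log(`[${testerName}] ${progress}`);
      });
      es.onerror = (error) => console.error(`[${testerName}] SSE error`, error);
      return es;
    };

    // One subscription per tester (matching the testers named earlier).
    ['app-scanner', 'server-scanner', 'tls-checker']
      .forEach((tester) => subscribeToTesterProgress('http://localhost:3000', tester));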

  84. Notable dev dependencies:
      "code",
      "lab",
      "mocksee",
      "sinon"

  85. about.js
      test.js
      testplan.js

  86. PURPLETEAM IN ACTION

  87. npm install -g purpleteam

  88. npm install -g purpleteam
      Define SUT in the build user config file

  89. {
        "data": {
          "type": "testRun",
          "attributes": {
            "version": "0.1.0-alpha.1",
            "sutAuthentication": {
              "route": "/login",
              "usernameFieldLocater": "userName",
              "passwordFieldLocater": "password",
              "submit": "btn btn-danger",
              "expectedResponseFail": "Invalid"
            },
            "sutIp": "pt-sut-cont",
            "sutPort": 4000,
            "sutProtocol": "http",
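
Roughly, the sutAuthentication block tells the app-scanner how to log in to the SUT: the login route, the locators for the username and password fields, the submit button, and response text that indicates a failed login; sutIp, sutPort and sutProtocol point the testers at the system under test.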

  90. purpleteam test
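
For the CI/nightly-build consumption mentioned at the start, a wrapper can be as small as the following sketch, which shells out to the CLI and fails the build when it exits non-zero. Gating on the exit code is an assumption for the example rather than a documented purpleteam contract.

    // Minimal CI wrapper sketch: run the purpleteam CLI and fail the build on a
    // non-zero exit code.
    const { spawn } = require('child_process');

    const run = spawn('purpleteam', ['test'], { stdio: 'inherit' });

    run.on('close', (code) => {
      if (code !== 0) {
        console.error(`purpleteam test exited with code ${code}, failing the build.`);
        process.exit(code);
      }
      console.log('Security regression tests finished, check the artefacts for details.');
    });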

  91. CAN'T WAIT?

  92. CAN'T WAIT?
      Help Build it -> gitlab.com/purpleteam-labs

  93. CAN'T WAIT?
      Help Build it -> gitlab.com/purpleteam-labs
      Try old PoC -> github.com/binarymist/NodeGoat/wiki/