Automation Testing of Legacy Applications

Everyone loves to work on a greenfield project. You can choose the language, architecture, and design from scratch: all the new, shiny stuff. The reality is that software engineers at some point need to work on or support legacy applications. Such an application is alive because it has some business value. The code is so messed up (“I just need to add one more ‘if’ to this method”), and the whole thing is held together by spit and baling wire. There is no automated test safety net. Every change requires manual tests and a prayer to the software gods.

This talk will show you how to start small and work your way up until you reach a state of confidence. It will show you how to optimize for team happiness. It will touch on topics such as unit testing, acceptance tests, static code analysis, continuous integration, and architecture for testability. The talk is inspired by real-life experience: working on three legacy projects over the span of more than five years.


emanuil

June 04, 2015

Transcript

  1. AUTOMATION TESTING OF LEGACY APPLICATIONS @EmanuilSlavov

  2. Your application is legacy if you don’t have automated tests.

  3. GREENFIELD PROJECT

  4. BROWNFIELD PROJECT

  5. WHY INVEST IN LEGACY SYSTEM? TEAM HAPPINESS

  6. LOW TEAM MORALE

  7. Fragile Software. Slow Feedback. Stupid Errors. Repetitive Work.

  8. Quality software is a team effort.

  9. SHIFT LEFT

  10. WHAT TO DO ABOUT IT

  11. Start with basic acceptance tests

  12. Functionality that makes money. Must-have functionality - compliance, security.
      Repeating manual tests - save time. Pareto principle - 80/20.

  13. Do not test through the UI. (if possible)

  14. None
  15. result = RestClient.post(REGISTER_URL, user_details.to_json, {:content_type => :json})
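
      A fuller sketch of such an API-level check. The endpoint, payload, and response fields below are assumptions for illustration, not the talk's actual service:

          require 'rest-client'
          require 'json'

          # Hypothetical registration endpoint and payload; adjust to the real API.
          REGISTER_URL = 'https://app.example.com/api/v1/register'
          user_details = { email: "qa+#{Time.now.to_i}@example.com", password: 'Secret123!' }

          response = RestClient.post(REGISTER_URL, user_details.to_json, {:content_type => :json})

          # Assert on the API response instead of driving the UI.
          raise "Unexpected status #{response.code}" unless response.code == 200
          user = JSON.parse(response.body)
          raise 'Registration did not return a user id' unless user['id']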

  16. This saved us: 800 tests x 10 seconds = 2h 13min

  17. Limit calls to external dependencies.

  18. Need to call an external system: does the call come from an automated test?
      Yes: fake the response. No: talk to the real system.
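
      The decision above happens inside the application. A minimal sketch of the idea, assuming the app can recognise test traffic (here via a hypothetical X-Automated-Test header); the talk's application is PHP, Ruby is used only for illustration:

          # Wrapper around an external push service: fake the response for
          # automated-test traffic, call the real system otherwise.
          class PushGateway
            def initialize(real_client, request)
              @real_client = real_client
              @request     = request
            end

            def subscribe_device(user_id, device_id)
              if automated_test?
                { 'status' => 'ok', 'faked' => true }   # canned response, no external call
              else
                @real_client.subscribe_device(user_id, device_id)
              end
            end

            private

            def automated_test?
              @request.headers['X-Automated-Test'] == 'true'   # hypothetical flag
            end
          end
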
  19. Tests should create the data they need.

  20. Scenario: Client admin should not be able to view master's agencies
        Given a master user
        And master creates agency
        And a client admin
        When client admin views master's agency
        Then client admin should get an error

  21. Set test data via API or DB.
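
      A sketch of a step definition that sets up its data straight through the API rather than the UI. The step wording, endpoint, payload, and token handling are made up for illustration:

          require 'rest-client'
          require 'json'

          When(/^master creates agency$/) do
            payload  = { name: "Agency #{Time.now.to_i}" }
            response = RestClient.post(
              'https://app.example.com/api/v1/agencies',
              payload.to_json,
              {:content_type => :json, :authorization => "Bearer #{@master_token}"}
            )
            # Remember the created record so later steps can refer to it.
            @agency_id = JSON.parse(response.body)['id']
          end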

  22. Poll for results from API/DB operations.

  23. sleeping(1).seconds.between_tries.failing_after(10).tries do
        result = some_operation
        raise 'No Data' if result == []
      end
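
      If the fluent polling helper above is not available, a plain-Ruby equivalent is easy to sketch (the lookup call is hypothetical):

          # Retry the block until it returns data, sleeping between tries.
          def poll(tries: 10, interval: 1)
            tries.times do
              result = yield
              return result unless result.nil? || result == []
              sleep interval
            end
            raise 'No Data'
          end

          user = poll { find_user_in_db('qa@example.com') }   # waits for the record to appear
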
  24. Run a test 20 times consecutively. Commit only if the test does not fail.

  25. for i in {1..20}; do your_test || exit 1; done

  26. Make async tasks sync
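
      One way to read this slide: in the test environment, background work runs inline so the test can assert right after the call. A minimal sketch with made-up names:

          class JobRunner
            def initialize(queue, sync: ENV['TEST_MODE'] == 'true')
              @queue = queue
              @sync  = sync
            end

            def enqueue(job)
              if @sync
                job.perform        # synchronous: result visible to the test immediately
              else
                @queue.push(job)   # production path: processed asynchronously
              end
            end
          end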

  27. CODE CHANGES

  28. First Order of Business: Remove Unused Code

  29. KNIGHT CAPITAL MELTDOWN LOST $440 MILLION IN 45 MINUTES

  30. Second Order of Business: Stop The Rot

  31. None
  32. CONTINUOUS INTEGRATION

  33. None
  34. Run on every commit. Fast. Hook in all the checks, one by one. Run longer tests periodically.

  35. Developers need to receive feedback about their new code within 5 minutes.

  36. WHAT CHECKS TO RUN ON COMMIT?

  37. The PHP Case

  38. None
  39. Linter:
      php -l api/models/mobile_push_model.php
      PHP Parse error: api/models/mobile_push_model.php on line 61
      Errors parsing api/models/mobile_push_model.php

  40. None
  41. HHVM:
      UnknownObjectMethod in file: api/models/mobile_push_model.php, line: 55, problem entry:
      $pusher->reallyUnsubscribeDevice($params['user_id'], $params['device_id'], $actions)

  42. STATIC CODE QUALITY

  43. CYCLOMATIC COMPLEXITY
      function testPrint() {
          echo('Hello World');
      }
      Complexity: 1
      function testPrint($parameter) {
          if ($parameter) {
              echo('Hello World');
          }
      }
      Complexity: 2

  44. Method complexity should be less than 10.

  45. 12 Fatalities. $1.2 Billion Settlement.

  46. ”The throttle angle function scored [complexity] over 100 (unmaintainable)” - Michael Barr

  47. Complexity 82 → Complexity 10: constantly refactor to decrease complexity.

  48. Method size should be less than 100 lines (ideally less than 50).

  49. Improve the code - then lower the threshold on the commit check. Then repeat.

  50. FIGHT LEGACY CODE: WRITE UNIT TESTS

  51. Written by Developers. Fast, Independent. Test Technical Aspects. Cooperation between QA & Developers.

  52. [Demo]

  53. 100% test coverage is not sufficient!
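
      A small illustration (not from the talk) of why full coverage proves little: the first test executes every line of the method, yet asserts nothing, so a bug in the logic would still pass.

          require 'minitest/autorun'

          def discounted_price(price, percent)
            price - price * percent / 100.0
          end

          class DiscountTest < Minitest::Test
            def test_covers_but_checks_nothing
              discounted_price(100, 20)   # 100% line coverage, zero verification
              assert true
            end

            def test_twenty_percent_off   # a meaningful test pins the behaviour down
              assert_in_delta 80.0, discounted_price(100, 20)
            end
          end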

  54. None
  55. SECURITY TESTS

  56. SQL Injection Detection (PHP and ADOdb)
      Vulnerable:    $dbConn->GetRow("SELECT * FROM users WHERE id = $user_id")
      Parameterized: $dbConn->GetRow("SELECT * FROM users WHERE id = ?", array($user_id))

  57. Those errors can be caught with code analysis.

  58. There was no such tool. So we developed one.

  59. github.com/emanuil/php-reaper
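
      The detection idea can be sketched in a few lines: flag ADOdb calls whose SQL string interpolates a PHP variable instead of using a ? placeholder. This toy scanner (in Ruby, with a crude regex) only illustrates the concept and is no substitute for the real tool; it will miss cases and report false positives.

          ADODB_CALL = /->\s*(GetRow|GetAll|GetOne|Execute)\s*\(\s*"(?<sql>[^"]*)"/

          Dir.glob('api/**/*.php').each do |file|
            File.readlines(file).each_with_index do |line, idx|
              if (m = line.match(ADODB_CALL)) && m[:sql].include?('$')
                puts "#{file}:#{idx + 1}: possible SQL injection: #{line.strip}"
              end
            end
          end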

  60. None
  61. MONITORING

  62. Your second line of defense.

  63. Show a lot with a TV and a Raspberry Pi.

  64. None
  65. Live Graphs + Deploys

  66. CONCLUSION

  67. Automate the most important functionalities. Continuously improve static code quality.
      Write unit tests for changed/new code. Expand checks on commit. Enable monitoring.

  68. None
  69. @EmanuilSlavov EmanuilSlavov.com