
EXTENT-2016: Industry Practices of Advanced Program Analysis

EXTENT-2016: Software Testing & Trading Technology Trends
22 June, 2016, 10 Paternoster Square, London

Industry Practices of Advanced Program Analysis
Alexey Khoroshilov, Snr. Researcher, ISP RAS

Would you like to know more?
Visit our website: extentconf.com
Follow us:
https://www.linkedin.com/company/exactpro-systems-llc?trk=biz-companies-cym
https://twitter.com/exactpro
#extent2016
#exactpro

Exactpro

June 27, 2016

Transcript

  1. Institute for System Programming of the Russian Academy of Sciences
     Industry Practices of Advanced Program Analysis. Alexey Khoroshilov, [email protected]. ExTENT-2016, London, 22 June 2016
  2. The Goal of Verification: to make sure there are no bugs in any possible execution of the target software • with minimal cost. (Diagram: Fresh Software → Verification → Verified Software)
  3. The Goal of Verification: to achieve maximal assurance that there are no bugs in any possible execution of the target software • within reasonable cost. (Diagram: Fresh Software → Verification → Verified Software)
  4. Testing. (Diagram: coverage plane ranging from "1 kind of bugs in 1 execution" to "all kinds of bugs in all executions", with the coverage of a single Test and of a Test Suite marked.)
  5. Inductive Reasoning • Conclusions are supported by their premises • If the premises are true, it is unlikely, but not impossible, for the conclusions to be false • Example: • Every time I’ve walked by that dog, he hasn’t tried to bite me. • So, the next time I walk by that dog he won’t try to bite me.
  6. Inductive Reasoning • Conclusions are supported by their premises • If the premises are true, it is unlikely, but not impossible, for the conclusions to be false • Example: • The component behaves correctly on all test data. • So, the component always behaves correctly.
  7. Inductive Reasoning on Program Correctness • The component behaves correctly on all test data • Tests are quite representative => • The component always behaves correctly
  8. Deductive Reasoning • Conclusions certainly follow from their premises • If the premises are true, it is impossible for the conclusions to be false • Example: • All men are mortal. • Socrates is a man. • Therefore, Socrates is mortal.
  9. Deductive Reasoning on Program Correctness • If the assumptions hold • regarding compiler & linker • regarding environment behaviour • regarding input data • then the component always behaves correctly
  10. Deductive Reasoning on Program Correctness • If the assumptions hold • regarding compiler & linker • regarding environment behaviour • regarding input data • then the component always behaves correctly (i.e. terminates and its output satisfies the postcondition)
  11. Deductive Verification: Historical Perspective • 1947: lecture of Alan Turing to the London Mathematical Society • 1970: methods of Floyd/Hoare
  12. Deductive Verification: prove program correctness by induction (flowchart program computing integer division of x1 by x2).
     START: (y1, y2) ← (0, x1)
     B: while y2 ≥ x2: (y1, y2) ← (y1 + 1, y2 - x2)
     HALT: (z1, z2) ← (y1, y2)
     A: Precondition: (x1 ≥ 0) ∧ (x2 ≥ 0)
     B: Invariant: (x1 = y1·x2 + y2) ∧ (y2 ≥ 0)
     C: Postcondition: (x1 = z1·x2 + z2) ∧ (0 ≤ z2 < x2)
     Induction over program paths:
     • Let the precondition hold at A; after the path A → B the invariant holds at B.
     • Let the invariant hold at B; after the path B → (T) → B the invariant holds at B.
     • Let the invariant hold at B; after the path B → (F) → C the postcondition holds at C.
     => Whenever the precondition holds at A and execution reaches C, the postcondition holds at C.
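     A minimal sketch (not from the slides) of how the same program and annotations can be written today as a C function with ACSL contracts for a deductive verifier such as Frama-C. The function name div_mod is illustrative, and the precondition is strengthened to x2 > 0 so that the loop terminates:

     /*@ requires x1 >= 0 && x2 > 0;
       @ requires \valid(z1) && \valid(z2);
       @ assigns *z1, *z2;
       @ ensures x1 == *z1 * x2 + *z2;
       @ ensures 0 <= *z2 < x2;
       @*/
     void div_mod(int x1, int x2, int *z1, int *z2)
     {
         int y1 = 0;   /* quotient so far  */
         int y2 = x1;  /* remainder so far */

         /*@ loop invariant x1 == y1 * x2 + y2 && y2 >= 0;
           @ loop variant y2;
           @*/
         while (y2 >= x2) {
             y1 = y1 + 1;
             y2 = y2 - x2;
         }
         *z1 = y1;     /* corresponds to z1 on the slide */
         *z2 = y2;     /* corresponds to z2 on the slide */
     }

     The loop invariant and the pre-/postconditions map one-to-one onto the flowchart annotations above; the verifier generates the same path-induction proof obligations and discharges them with automatic provers.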
  13. Deductive Verification: Historical Perspective • 1947: lecture of Alan Turing to the London Mathematical Society • 1970: methods of Floyd/Hoare • 2000: deductive verification tools for Ada, C, Java, C# • SunRise, ESC/Java, Frama-C, LOOP, Boogie/VCC
  14. Deductive Verification: Historical Perspective • 1947: lecture of Alan Turing to the London Mathematical Society • 1970: methods of Floyd/Hoare • 2000: deductive verification tools for Ada, C, Java, C# (SunRise, ESC/Java, Frama-C, LOOP, Boogie/VCC) • 2010: application in real-life projects for small-size components • Nuclear power (UK, France) • Avionics (Airbus, NASA, UK Air Traffic Control) • Components of operating systems (seL4, Verisoft, Verisoft-XT)
  15. Industry Applications • UK Air Traffic Management System • 250 KLOC (logical lines of code, in Ada) • proof of type safety, functional correctness for a small portion of the code • 153K VCs, of which 98.76% are proven automatically (*) Angela Wallenburg, “Safe and Secure Programming Using Spark”
  16. OS Deductive Verification
     Project | Years | Tools | Target code | Scope | Size
     Verisoft | 2004-2008 | Isabelle | designed for verification | hw / kernel / compiler / libraries / apps | 10 kLOC (kernel)
     L4.verified seL4 | 2004-2009 | Isabelle | designed for verification, performance oriented | microkernel, security model (no MMU) | 7.5 kLOC (without asm and boot)
     Verisoft-XT small-hv | 2007-2013 | VCC | designed for verification | separation property only | 2.5 kLOC
     Verisoft-XT Hyper-V | 2007-2013 | VCC | industrial | separation property only | 100 kLOC
     Verisoft-XT PikeOS | 2007-2013 | VCC | industrial, simplicity for performance | some system calls | 10 kLOC
  17. Linux Verification Center, founded in 2005 • OLVER Program • Linux Standard Base Infrastructure Program • Linux Driver Verification Program • Linux File System Verification Program • Linux Deductive Verification Program
  18. AstraVer Project (Astra Linux). (Architecture diagram: security requirements and security model; model of security requirements in mathematical notation; formalized security model and formalized low-level security model, checked with a toolset for verification of Event-B models; pre-/post-conditions of LSM operations and specifications of library functions; LSM source code in the Linux kernel, which implements the security architecture; manual steps combined with automated verification using the AstraVer Toolchain (*) for deductive verification of C programs, based on Frama-C, Jessie and Why3.) (*) The research on deductive verification tools development was carried out with funding from the Ministry of Education and Science of Russia (project unique identifier RFMEFI60414X0051).
  19. Deductive Verification Status • Reasonable tool support • Ada, C, C#, Java • Functional specifications as comments, even natively supported in Ada-2012 • Dedicated languages: Boogie, Why3 • Manual efforts still significant • up to 10x of the development effort • highly skilled team required
  20. Testing and Deductive Verification. (Diagram: the coverage plane from "1 kind of bugs in 1 execution" to "all kinds of bugs in all executions", with Test, Test Suite, and Deductive verification marked.) Deductive verification: 1. Proof of complete correctness under some assumptions 2. Very labour intensive and time consuming
  21. Testing and Deductive Verification
     Criterion | Testing | Deductive Verification
     Kind of bugs | almost all | almost all
     Executions under analysis | small | almost all
     Development cost | linear | huge
     Execution cost (hw) | small (target hw) | small
     Result analysis cost | small | big
     Maintenance cost | small to big | huge
  22. SVACE by ISPRAS • Static analysis of C/C++/Java code, Linux/Windows • 150+ kinds of defects • Buffer overflows, NULL-pointer dereferences • Memory management, tainted input • Concurrency issues • Lightweight analysis of semantic patterns • Eclipse plugin or WebUI
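     For illustration only (this code is not from the slides), here is the kind of C defect that static analyzers in this class are built to report: an unchecked allocation that can lead to a NULL-pointer dereference, and a memory leak on an early-return path.

     #include <stdlib.h>
     #include <string.h>

     /* Hypothetical buggy helper: copies a string into a fresh buffer. */
     char *make_copy(const char *src, int fail_fast)
     {
         char *buf = malloc(strlen(src) + 1);

         if (fail_fast)
             return NULL;      /* defect: 'buf' is leaked on this path        */

         strcpy(buf, src);     /* defect: 'buf' may be NULL if malloc failed, */
                               /* a possible NULL-pointer dereference         */
         return buf;
     }

     Both issues are path-sensitive: each shows up only on one execution path, which is why a lightweight per-path static analysis finds them cheaply without running the code.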
  23. Testing and Program Analysis
     Criterion | Testing | Static Analysis | Deductive Verification
     Kind of bugs | almost all | safety only | almost all
     Executions under analysis | small | almost all | almost all
     Development cost | linear | 0 to small | huge
     Execution cost (hw) | small (target hw) | small | small
     Result analysis cost | small | medium (false alarms) | big
     Maintenance cost | small to big | small | huge
  24. Testing and Program Analysis
     Criterion | Testing | Static Analysis | Deductive Verification
     Kind of bugs | almost all | safety only | almost all
     Executions under analysis | small | big | almost all
     Development cost | linear | 0 to small | huge
     Execution cost (hw) | small (target hw) | small | small
     Result analysis cost | small | medium (false alarms) | big
     Maintenance cost | small to big | small | huge
  25. Testing and Program Analysis. (Diagram: the coverage plane with Test, Test Suite, Deductive verification, and now SVACE marked.) SVACE: 1. Static analysis 2. Quickly finds potential bugs 3. No guarantees
  26. Model Checking [Clarke/Emerson, Sifakis 1981] • Iterative fixpoint computation of the reachable states via the post operator: R0 ⊆ R1 ⊆ R2 ⊆ … until the fixpoint R is reached
  27. Error Location?
     int f(int y) {
         struct urb *x;
         x = usb_alloc_urb(0, GFP_KERNEL);
         ...
         usb_free_urb(x);
         return y;
     }
  28. Error Location?
     int f(int y) {
         struct urb *x;
         x = usb_alloc_urb(0, GFP_KERNEL);  // allocate new URB
         ...
         usb_free_urb(x);  // deallocate URB: assert(x is NULL or a previously allocated URB)
         return y;
     }
     ...
     // after module exit: assert(all allocated URBs are deallocated)
  29. Instrumentation
     // original code:
     int f(int y) {
         struct urb *x;
         x = usb_alloc_urb(0, GFP_KERNEL);
         ...
         usb_free_urb(x);
         return y;
     }
     // instrumented code (model of the rule being checked):
     set URBS = empty;
     int f(int y) {
         struct urb *x;
         x = usb_alloc_urb();
         add(URBS, x);
         ...
         assert(contains(URBS, x));
         usb_free_urb(x);
         remove(URBS, x);
         return y;
     }
     ...
     // after module exit:
     assert(is_empty(URBS));
  30. Bounded Model Checking • finite unfolding of the transition relation. (Diagram: an execution unrolled for a bounded number of transition steps.)
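     A minimal sketch (not from the slides) of what bounded unfolding means in practice: the loop below is manually unrolled to a fixed bound of 3 steps, so checking the assertions becomes a finite problem, while executions that would need more iterations are simply outside the bound. Tools in this class (e.g. CBMC for C) perform such unrolling automatically; the bound and the function names here are illustrative.

     #include <assert.h>

     /* Original program fragment: decrement x until it reaches zero. */
     static int step(int x) { return x - 1; }

     /* Bounded view of "while (x > 0) x = step(x);" unfolded for k = 3.
      * Each step is guarded, so the unfolding covers every execution of
      * length <= 3; longer executions are not analysed. */
     static void check_up_to_3(int x)
     {
         if (x > 0) { x = step(x); assert(x >= 0); }   /* step 1 */
         if (x > 0) { x = step(x); assert(x >= 0); }   /* step 2 */
         if (x > 0) { x = step(x); assert(x >= 0); }   /* step 3 */
         /* executions needing a 4th iteration are beyond the bound */
     }

     int main(void)
     {
         /* A real BMC tool would treat x as symbolic; here we just
          * exercise a few concrete inputs within the bound. */
         check_up_to_3(0);
         check_up_to_3(2);
         check_up_to_3(3);
         return 0;
     }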
  31. Counter-Example Guided Abstraction Refinement • Detailed model of a program is huge • Detailed model of a program is not needed to check a particular property • Detailed model of a concrete path in a program is suitable for analysis => • Build a model that is just enough to check the particular property
  32. SV-COMP-2013 Competition Results. (Chart comparing tools based on CEGAR, BMC, and shape analysis on benchmark categories including Drivers, Bit operations, and Pointers.)
  33. Linux Verification Center, founded in 2005 • OLVER Program • Linux Standard Base Infrastructure Program • Linux Driver Verification Program • Linux File System Verification Program • Linux Deductive Verification Program
  34. SVACE vs. LDV
     Metric | SVACE | LDV-CPAchecker
     Time of analysis | 2 hrs | 111 hrs (4.5 days)
     Warnings | 35 | 328
     True bugs | 8 | 103
     True positive rate | 23% | 31%
     • Target code: Linux kernel 3.17-rc1, allmodconfig, x86-64 • 3 223 modules, 33 373 source files
     • Target bugs: double free, memory leaks
     Not a single bug was found by both tools!
  35. Testing and Program Analysis. (Diagram: the coverage plane with Test, Test Suite, SVACE, Deductive verification, and now LDV marked.) LDV: 1. Investigates all possible paths 2. Is able to prove the absence of bugs of a particular kind 3. Domain-specific (framework) adaptation (environment model, library model) 4. Applicability limited to medium-sized components (up to 50 KLoC) 5. Requires nonzero hardware resources: time, 15 minutes per rule per module; memory, 15 GB
  36. Testing and Program Analysis
     Criterion | Testing | Static Analysis | Software Model Checking | Deductive Verification
     Kind of bugs | almost all | safety only | safety only | almost all
     Executions under analysis | small | big | almost all | almost all
     Development cost | linear | 0 to small | medium | huge
     Execution cost (hw) | small (target hw) | small | big | small
     Result analysis cost | small | medium (false alarms) | medium (false alarms) | big
     Maintenance cost | small to big | small | small | huge
  37. Conclusions • There is no silver bullet • The key is in a competent combination of available techniques
  38. Conclusions (2) • Two advanced program analysis techniques: • Deductive verification • + proof of complete correctness under some assumptions • – significant manual efforts • – highly skilled team required • => only for really important code • Software model checking • + investigates almost all possible paths • + complementary to static analysis • – per framework/domain adaptation required • – limited to medium-sized components (up to 50 KLoC)
  39. Institute for System Programming of the Russian Academy of Sciences
     Thank you! Alexey Khoroshilov, [email protected], http://linuxtesting.org/ Morris Kline, “Mathematics: The Loss of Certainty”, Oxford University Press, 1980