
Passive Testing Techniques in Practice

Exactpro
November 08, 2019


Ana Rosa Cavalli
Professor, Télécom SudParis

International Conference on Software Testing, Machine Learning and Complex Process Analysis (TMPA-2019)
7-9 November 2019, Tbilisi

Video: https://youtu.be/agKI0sbj9B8

TMPA Conference website https://tmpaconf.org/
TMPA Conference on Facebook https://www.facebook.com/groups/tmpaconf/



Transcript

  1. ANA ROSA CAVALLI
     Institut Mines-Télécom / Télécom SudParis
     Tbilisi, November 7-9

  2. Show the evolution from active testing to monitoring (passive testing) techniques
     Explain the differences and complementarity of these techniques
     Present some representative examples

  3. Our research model is based on:
     • Basic and applied research
     • Evaluation of results in real environments
     • Strong collaboration with industrial partners
     [Figure: basic research feeding application domains]

  4. [figure]

  5. Testing: the process of executing software with the intent of finding and correcting faults
     Conformance testing: the process of checking whether the implementation under test conforms to the specification
     • Two techniques: active and passive testing (monitoring)
     • This presentation focuses mostly on monitoring, but it shares many objectives and challenges with active testing

  6. • Usually called Model-Based Testing (MBT)
     • It is assumed that the tester controls the implementation. Control means: after sending an input and receiving an output, the tester knows which input to send next
     • The tester can guide the implementation towards specific states
     • Automatic test generation methods can be defined
     • Usually a test case is a set of input sequences
     [Figure: Formal Specification → Test Suites → Active Tester ↔ IUT; Verdict: PASS, FAIL, INCONC.]
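To make the control loop concrete, here is a minimal sketch, not the speaker's tooling: the SPEC table anticipates the vending-machine example a few slides below, and all names are invented for illustration.

```python
# Sketch of an active tester for a Mealy-machine specification.
# SPEC maps (state, input) -> (expected output, next state).
SPEC = {
    (1, "1EUR"):   ("another 1EUR", 2),
    (2, "1EUR"):   ("OK", 3),
    (1, "2EUR"):   ("OK", 3),
    (3, "Choice"): ("Soda, Juice", 1),
}

def active_test(iut_step, inputs, spec=SPEC, start=1):
    """Drive the IUT: send each input, compare its output with the
    specification. Because the tester controls the IUT, after every
    input/output exchange it knows which input to send next."""
    state = start
    for i in inputs:
        if (state, i) not in spec:
            return "INCONC"            # input not specified in this state
        expected, nxt = spec[(state, i)]
        observed = iut_step(i)         # stimulate the implementation
        if observed != expected:
            return "FAIL"              # output fault detected
        state = nxt
    return "PASS"
```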


  7. • Passive testing consists in analyzing traces recorded from the IUT and trying to find faults by comparing these traces either with the complete specification or with some specific requirements (or properties) verified during normal runtime
     • No interference with the IUT
     • It is also referred to as monitoring
     [Figure: System User ↔ IUT, observed at a point of observation (PO) → Trace Collection → Passive Tester + System Specification; Verdict: PASS, FAIL, INCONC.]
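By contrast, a passive tester only reads a recorded trace. A minimal sketch, reusing the illustrative SPEC table above: since the monitor does not know which state the running IUT is in, it checks the trace against every state the specification could be in.

```python
# Sketch of a passive tester: no stimulation, only trace analysis.
# `trace` is a list of observed (input, output) pairs collected at a
# point of observation (PO) while normal users drive the system.
def passive_test(trace, spec=SPEC):
    candidates = {s for (s, _i) in spec}     # the IUT could be in any state
    for i, o in trace:
        # keep only the states whose specified reaction explains (i, o)
        candidates = {spec[(s, i)][1] for s in candidates
                      if (s, i) in spec and spec[(s, i)][0] == o}
        if not candidates:
            return "FAIL"   # no specification state can explain the trace
    return "PASS"           # the trace is consistent with the specification
```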


  8. [figure]

  9. Specification (a vending machine with states 1, 2, 3): 1€ / another 1€, then 1€ / OK, or directly 2€ / OK; then Choice / Soda, Juice
     Implementation I1: identical, except that 2€ produces "yet another 1€" instead of OK: an output fault

  10. Implementation I2 (two states): 1€ / another 1€; Choice / Soda, Juice; 2€ / OK; but the 1€ / OK transition leads to the wrong state: a transfer fault
     Implementation I3 (a single state): 1€ / another 1€; 1€ / OK; Choice / Soda, Juice; 2€ / OK, all as self-loops
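With the passive_test sketch from the passive-testing slide, hypothetical traces (invented here to mirror the two faults on these slides) would be flagged as follows:

```python
# Output fault (I1): in the specification, 2EUR must answer "OK".
trace_i1 = [("2EUR", "yet another 1EUR")]
passive_test(trace_i1)   # -> "FAIL": no spec state explains this output

# Transfer fault (I2): every step looks legal in isolation, but the
# sequence is inconsistent with any walk through the specification.
trace_i2 = [("1EUR", "another 1EUR"), ("1EUR", "OK"),
            ("Choice", "Soda, Juice"), ("1EUR", "OK")]
passive_test(trace_i2)   # -> "FAIL" on the last step: from the initial
                         #    state, 1EUR must answer "another 1EUR"
```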


  11. How to bring the finite state machine implementation into any given state at any given time during testing?
     A non-trivial problem, because of the limited controllability of the finite state machine implementation
     It may not be possible to put the finite state machine into the head state of the transition being tested without executing several transitions

  12. [Figure: specification a/b and four implementations. Imp1: a/b (controllable); Imp2: a/b, a/b (non-controllable); Imp3: ε/b, a/b (non-controllable); Imp4: a/b, a/c (controllable under a fairness assumption)]

  13. How to verify that the finite state machine implementation is in the correct state after an input/output exchange?
     This is the state identification problem. It is difficult because of the limited observability of the finite state machine implementation: it may not be possible to directly verify that the machine is in the desired tail state after the transition has been fired

  14. To solve this problem, different methods have been proposed:
     DS (Distinguishing Sequence)
     UIO (Unique Input/Output Sequence)
     W (characterization set)

  15. Define an input sequence for each state such that the output sequence generated is unique to that state. Detects output and transfer faults.
     [Figure: three-state FSM (S1, S2, S3) with transitions labelled a/x, a/y, b/x, b/y, b/z, c/x, c/y, c/z; transitions (1) and (2) are under test]
     State UIO sequences: S1: c/x; S2: c/y; S3: b/y
     Test of (1): a/y a/x b/y
     Test of (2): a/y c/z b/y
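A UIO sequence can be checked mechanically: apply the candidate input sequence from every state and require that the output it produces be unique. A sketch follows; the transition table below is a hypothetical machine consistent with the UIO table on this slide, not necessarily the exact machine in the figure.

```python
# Hypothetical Mealy machine: (state, input) -> (output, next state).
FSM = {
    ("S1", "a"): ("y", "S2"), ("S1", "b"): ("x", "S1"), ("S1", "c"): ("x", "S1"),
    ("S2", "a"): ("x", "S3"), ("S2", "b"): ("z", "S1"), ("S2", "c"): ("y", "S2"),
    ("S3", "a"): ("y", "S1"), ("S3", "b"): ("y", "S2"), ("S3", "c"): ("z", "S3"),
}

def run(fsm, state, inputs):
    """Return the output sequence produced from `state` on `inputs`."""
    outs = []
    for i in inputs:
        o, state = fsm[(state, i)]
        outs.append(o)
    return tuple(outs)

def is_uio(fsm, state, inputs):
    """True iff `inputs` yields an output sequence unique to `state`."""
    states = {s for (s, _i) in fsm}
    expected = run(fsm, state, inputs)
    return all(run(fsm, s, inputs) != expected for s in states - {state})

# The UIO table of this slide: S1 -> c/x, S2 -> c/y, S3 -> b/y.
assert is_uio(FSM, "S1", ["c"])
assert is_uio(FSM, "S2", ["c"])
assert is_uio(FSM, "S3", ["b"])
```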


  16. [Figure: a faulty implementation of the FSM from the previous slide]
     Test of (1): a/y a/x b/y
     Test of (2): a/y c/z b/y
     Applying test (1) to the implementation: a/y a/x b/z (transfer error)
     Applying test (2) to the implementation: a/y c/x (output error)

  17. Not applicable when there is no direct access to the implementation under test
     Semi-controllable interfaces (component testing)
     Interference with the behaviour of the implementation

  18. Test in context, embedded testing:
     Tests focused on some components of the system, to avoid redundant tests
     Semi-controllable interfaces
     In some cases it is not possible to apply active testing
     [Figure: embedded module A inside context module C; external interfaces a, a', b, b', c, c'; internal messages ia, ib between the context module and the embedded module]

  19. Conformance testing is essentially focused on verifying the conformity of a given implementation to its specification
     It is based on the ability of a tester to stimulate the implementation under test and check the correctness of the answers provided by the implementation
     Closely related to the controllability of the IUT
     In some cases this activity becomes difficult, in particular:
     if the tester has no direct interface with the implementation
     or when the implementation is built from components that have to run in their environment and cannot be shut down or interrupted (for a long time) in order to test them

  20. Controllability
     No controllability issue, because there is no interaction with the implementation under test
     Observability
     It is assumed that to perform passive testing it is necessary to observe the messages exchanged between modules
     Passive testing is a grey-box testing technique
     Fault detection using passive testing
     It is possible to detect output faults
     It is possible to detect transfer faults under some hypotheses: initialise the IUT in order to be sure that the implementation is in the initial state, and then perform passive testing

  21. In this approach a set of properties is extracted from the specification or proposed by the protocol experts, and the trace produced by the implementation is then analyzed to determine whether it satisfies this set of properties.
     These extracted properties are called invariants, because they have to hold true at every moment.

  22. Definition: an invariant is a property that is always true.
     Two test steps:
     Extraction of invariants from the specification, or invariants proposed by protocol experts
     Application of the invariants to execution event traces from the implementation
     Solution: I/O invariants

  23. An invariant is composed of two parts:
     The test (an input or an output)
     The preamble (an I/O sequence)
     Three kinds of invariants:
     Output invariant (simple invariant)
     Input invariant (obligation invariant)
     Succession invariant (loop invariant)

  24. Definition: an invariant in which the test is an output
     Meaning: "immediately after the preamble sequence there is always the expected output"
     Example: (i1/o1) (i2/o2), where (i1/o1) is the preamble and o2 is the expected output

  25. Definition: an invariant in which the test is an input
     Meaning: "immediately before the preamble sequence there is always the input test"
     Example: (i1/o1) (i2/o2), where (i2/o2) is the preamble and i1 is the input under test

  26. Definition: an I/O invariant for complex properties (loops, …)
     Example: the 3 invariants below build the property "only the third i2 is followed by o3":
     (i1/o1) (i2/o2)
     (i1/o1) (i2/o2) (i2/o2)
     (i1/o1) (i2/o2) (i2/o2) (i2/o3)

  27. A trace such as i1/O1, …, i(n-1)/O(n-1), in/O is a simple invariant if, each time the trace i1/O1, …, i(n-1)/O(n-1) is observed and we obtain the input in, we necessarily get an output belonging to O, where O is included in the set of expected outputs.
     i/o, *, i'/O means that if we detect the transition i/o, then the first occurrence of the symbol i' is followed by an output belonging to the set O.
     * replaces any sequence of symbols not containing the input symbol i', and ? replaces any input or output.

  28. [Figure: the three-state FSM (states 1, 2, 3) from the UIO slide, with transitions labelled a/x, a/y, b/x, b/y, b/z, c/x, c/y, c/z]
     Traces:
     a/y c/z b/y a/y a/x c/z b/y
     c/x a/y a/x c/z b/y
     c/y a/x b/z b/x a/y
     Invariants:
     a/?, c/z, b/{y}
     b/z, a/{x}
     a/x, *, b/{y, z}
     a/y, ?/{z}
     a/x, *, ?/{y}
     a/{x}
     Verdicts (as given on the slide): False, True, False, True, False, True
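A checker for such invariants can be written directly from the definitions on the previous slide. In the sketch below all function and variable names are invented; it handles the "?" wildcard and the "*" operator and can be run over the traces above.

```python
# Sketch of a simple-invariant checker over recorded traces.
# An invariant is given as: a preamble (list of (input, output) pairs,
# where "?" matches anything, optionally ending in "*"), plus the test
# input and its set of allowed outputs.
def check_invariant(trace, preamble, test_input, allowed):
    star = bool(preamble) and preamble[-1] == "*"
    pre = preamble[:-1] if star else preamble
    n = len(pre)
    for k in range(len(trace) - n + 1):
        window = trace[k:k + n]
        if not all(pi in ("?", i) and po in ("?", o)
                   for (pi, po), (i, o) in zip(pre, window)):
            continue                     # preamble does not match here
        j = k + n
        if star:                         # skip pairs without the test input
            while j < len(trace) and trace[j][0] != test_input:
                j += 1
        if j < len(trace) and trace[j][0] == test_input \
                and trace[j][1] not in allowed:
            return False                 # invariant violated
    return True

traces = [
    [("a","y"),("c","z"),("b","y"),("a","y"),("a","x"),("c","z"),("b","y")],
    [("c","x"),("a","y"),("a","x"),("c","z"),("b","y")],
    [("c","y"),("a","x"),("b","z"),("b","x"),("a","y")],
]
# e.g. the invariant "a/?, c/z, b/{y}" evaluated on all three traces:
[check_invariant(t, [("a","?"),("c","z")], "b", {"y"}) for t in traces]
```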


  29. Active testing:
     Possibility to focus on a specific part of the specification
     Full test generation automation
     Needs a model
     May modify (crash) the IUT behavior
     [Figure: Formal Specification → Test Suites → Active Tester ↔ IUT; Verdict: PASS, FAIL, INCONC.]
     Passive testing:
     No interference with the IUT
     No models needed
     Full monitoring automation
     Grey-box testing
     [Figure: System User ↔ IUT, observed at a point of observation (PO) → Trace Collection → Passive Tester + System Specification; Verdict: PASS, FAIL, INCONC.]

  30. Approach proposed by researchers from the verification (model checking) community
     EAGLE and RuleR tools proposed by Barringer et al. in 2004 and 2010 respectively, based on temporal logics and rewriting rules for property description
     Other tools: Tracematches [Avgustinov et al. 2007], J-LO [Bodden 2005] and LSC [Maoz and Harel 2006]
     Snort (Cisco): protocol analysis, content searching and matching, detection of a variety of attacks
     Bro (UC Berkeley, now called Zeek): a real-time, network-based IDS

  31. [figure]

  32. Monitoring the traces of a running system (e.g., traffic or message flows), online or offline.
     Non-obtrusive (i.e., execution traces are observed without interfering with the behaviour of the system).
     Analyzing collected data according to functional and non-functional requirements:
     Security properties described in a formal specification (temporal logic, regular expressions) describing behaviour involving several events over time.
     Performance, to get real-time visibility over traffic statistics, KPIs, delivered QoS, etc.
     Can be extended to perform counter-measures.
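As a toy illustration of the "regular expressions over events" idea (this is not MMT, and the property and event names are invented): a monitor that flags any sensitive READ not covered by a LOGIN since the last LOGOUT.

```python
# Toy online monitor: a security property as a regular expression over
# an event stream (illustrative only; MMT uses richer property languages).
import re

# Assumed property: a READ is a violation unless a LOGIN has occurred
# since the last LOGOUT (i.e., the reader is authenticated).
VIOLATION = re.compile(r"(?:^|LOGOUT,)(?:(?!LOGIN,)[A-Z]+,)*READ,$")

def monitor(events):
    history = ""
    for e in events:
        history += e + ","
        if VIOLATION.search(history):   # the match ends at the new event
            yield f"violation at {e!r}"

list(monitor(["LOGIN", "READ", "LOGOUT", "READ"]))
# -> ["violation at 'READ'"]  (the second READ, after the LOGOUT)
```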


  33. [figure]

  34. [figure]

  35. [figure]

  36. INTER-TRUST project: a three-year project with many academic and industrial partners
     Security properties of services
     Detection of attacks using active testing and monitoring techniques

  37. Why testing? (testing phase)
     Vulnerabilities can be introduced by the AOP (Aspect Oriented Programming) used in INTER-TRUST
     Functional testing
     Check that the woven security policies (aspects) are respected
     Check the robustness of the target application
     Detect vulnerabilities
     Simulate attacks
     Why monitoring? (testing & operation phases)
     Same as above
     + detecting context changes (context awareness) at runtime

  38. [figure]

  39. • Generation of tests from the IF model and test purposes
     Targets: functional and security properties, attacks
     • Execution relying on Selenium (Web interface)
     • Detection of failures using MMT

  40. The e-voting application has been specified as an extended finite state machine (IF language). The main states:
     • First step: the user is asked to enter a login and password. If the login and password are not correct, an error message is displayed; otherwise the user is connected.
     • In this state the user is asked to choose his privacy options (authentication, encryption, signature).
     • This state presents the elections available to the user.
     • In this state the user chooses a list of elections in which he will vote.
     • This state means that there is a warning regarding the security policy; the user must choose other options.
     • In this step the vote choices are displayed and the user fills in the vote form; this step is the effective vote.
     • In this step the user has to verify his vote: he can confirm or change it.
     • The vote is validated. The user cannot modify his vote anymore, but he can choose another election or log out.

  41. This part of the TestGen-IF tool is used to choose the test objective. Each test objective is presented with its description and formal specification.

  42. The generation of abstract test cases is based on an algorithm called "Hit or Jump".
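For background, the Hit-or-Jump idea (after Cavalli et al.) combines bounded local search with random jumps. Below is a schematic sketch only: the graph representation, depth bound and all names are assumptions, not TestGen-IF's actual code.

```python
# Schematic Hit-or-Jump: bounded breadth-first search toward the next
# test objective ("hit"); if the objective is not reached within the
# depth bound, move to a randomly chosen frontier state ("jump") and
# search again from there.
import random

def bounded_search(succ, start, goal, depth):
    """BFS up to `depth` levels. Returns ("hit", (state, path)) when a
    goal state is found, otherwise ("frontier", frontier_entries)."""
    frontier, seen = [(start, [])], {start}
    for _ in range(depth):
        nxt = []
        for s, p in frontier:
            for a, t in succ(s):        # succ(s) -> [(action, next_state)]
                if t in seen:
                    continue
                seen.add(t)
                if goal(t):
                    return "hit", (t, p + [a])
                nxt.append((t, p + [a]))
        frontier = nxt
        if not frontier:
            break
    return "frontier", frontier

def hit_or_jump(succ, start, objectives, depth=3, max_jumps=100,
                rng=random.Random(0)):
    """Build one test sequence reaching each objective in order."""
    path, state = [], start
    for goal in objectives:
        for _ in range(max_jumps):      # cap so the sketch always stops
            if goal(state):
                break
            kind, res = bounded_search(succ, state, goal, depth)
            if kind == "hit":
                state, steps = res      # HIT: extend the path to the goal
            elif res:
                state, steps = rng.choice(res)   # JUMP to a frontier state
            else:
                return None             # no frontier left: unreachable
            path += steps
        else:
            return None                 # gave up after max_jumps
    return path
```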


  43. [figure]

  44. Detecting failures using MMT
     Event-based detection
     Properties expressed as FSMs or as LTL properties

  45. Two main uses:
     During the testing phase, to complement the testing tools and provide a verdict
     During the operation phase, to monitor security and the application context
     Relies on data collected at different levels:
     Network
     Application internal events (notification module)
     System status (CPU and memory usage)

  46. [figure]

  47. E-voting test case: advanced authentication option
     Example of property: only authenticated voters can cast their votes
     [Property FSM: Init --Login--> Logged_In; Logged_In --Logout--> Init; Cast vote in Init → Failure; Cast vote in Logged_In → Success]
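A monitor for this property can be a direct encoding of the FSM above. A minimal sketch, with event names assumed for illustration:

```python
# The property FSM from this slide as a trace monitor:
# a Cast_vote succeeds only in the Logged_In state.
MONITOR = {
    ("Init", "Login"):       "Logged_In",
    ("Logged_In", "Logout"): "Init",
}

def check(events):
    state = "Init"
    for e in events:
        if e == "Cast_vote":
            yield (e, "Success" if state == "Logged_In" else "Failure")
        else:
            state = MONITOR.get((state, e), state)

list(check(["Cast_vote", "Login", "Cast_vote", "Logout", "Cast_vote"]))
# -> [('Cast_vote', 'Failure'), ('Cast_vote', 'Success'),
#     ('Cast_vote', 'Failure')]
```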


  48. [figure]

  49. [figure]

  50. Model-based test generation for security purposes (TestGen-IF)
     Correlation of data from different sources (network, application, system)
     Detection of attacks and failures, and runtime reaction
     Brings dynamicity to the system by adapting to different contexts

  51. Related Works
     Monitoring of routing protocols for ad hoc networks (OLSR protocol) and for mesh networks based on a distributed approach (B.A.T.M.A.N. protocol) (Télécom SudParis)
     Monitoring for secure interoperability: application to a multi-source information system (Télécom SudParis)
     Monitoring with time constraints (C. Andrés, M. Núñez and M. Merayo)
     Monitoring with asynchronous communications (M. Núñez and R. Hierons)
     Other works by T. Jéron and H. Marchand, and by A. Ulrich and A. Petrenko

  52. It is now easier to support active testing and monitoring and to integrate them with other development activities
     Modeling technology has matured (FSMs, EFSMs, different UML profiles (SysML), temporal logics)
     Much research and innovation is still required, and it should involve collaboration between research and industry

  53. 1. Raul A. Fuentes-Samaniego, Vinh Hoa La, Ana Rosa Cavalli, Juan Arturo Nolazco-Flores, Raúl V. Ramírez-Velarde: A monitoring-based approach for WSN security using IEEE 802.15.4/6LoWPAN and DTLS communication. IJAACS 12(3): 218-243 (2019).
     2. Rabéa Ameur-Boulifa, Ana R. Cavalli, Stephane Maag: Verifying Complex Software Control Systems from Test Objectives: Application to the ETCS System. ICSOFT 2019: 397-406.
     3. Thierno Birahime Sambe, Stephane Maag, Ana R. Cavalli: A Methodology for Enterprise Resource Planning Automation Testing: Application to the Open Source ERP-ODOO. ICSOFT 2019: 407-415.
     4. Diego Rivera, Edgardo Montes de Oca, Wissam Mallouli, Ana R. Cavalli, Brecht Vermeulen, Matevz Vucnik: Industrial IoT Security Monitoring and Test on Fed4Fire+ Platforms. ICTSS 2019: 270-278.
     5. Georges Ouffoué, Fatiha Zaïdi, Ana R. Cavalli: Attack Tolerance for Services-Based Applications in the Cloud. ICTSS 2019: 242-258.
     6. Khalifa Toumi, Mohamed H. E. Aouadi, Ana R. Cavalli, Wissam Mallouli, Jordi Puiggalí Allepuz, Pol Valletbó Montfort: A Framework for Testing and Monitoring Security Policies: Application to an Electronic Voting System. Comput. J. 61(8): 1109-1122 (2018).
     7. Sarah A. Dahab, Erika Silva, Stephane Maag, Ana Rosa Cavalli, Wissam Mallouli: Enhancing Software Development Process Quality based on Metrics Correlation and Suggestion. ICSOFT 2018: 154-165.
     8. Pamela Carvallo, Ana R. Cavalli, Wissam Mallouli, Erkuden Rios: Multi-cloud Applications Security Monitoring. GPC 2017: 748-758.
     9. Georges L. A. Ouffoué, Fatiha Zaïdi, Ana R. Cavalli, Mounir Lallali: An Attack-Tolerant Framework for Web Services. SCC 2017: 503-506.

  54. 11. Pamela Carvallo, Ana R. Cavalli, Natalia Kushik: Automatic Derivation and Validation of a Cloud Dataset for Insider Threat Detection. ICSOFT 2017: 480-487.
     12. Ana R. Cavalli, Antonio M. Ortiz, Georges Ouffoué, Cesar A. Sanchez, Fatiha Zaïdi: Design of a Secure Shield for Internet and Web-Based Services Using Software Reflection. ICWS 2018: 472-486.
     13. Vinh Hoa La, Raul A. Fuentes-Samaniego, Ana R. Cavalli: Network Monitoring Using MMT: An Application Based on the User-Agent Field in HTTP Headers. AINA 2016: 147-154.
     14. Vinh Hoa La, Raul Fuentes, Ana R. Cavalli: A novel monitoring solution for 6LoWPAN-based Wireless Sensor Networks. APCC 2016: 230-237.