Slide 1

ANA ROSA CAVALLI, INSTITUT MINES-TELECOM / TELECOM SUDPARIS, TBILISI, NOVEMBER 7-9

Slide 2

 Show the evolution from active testing to monitoring (passive testing) techniques
 Explain the differences and complementarity of these techniques
 Present some representative examples

Slide 3

 Our research model is based on:
• Basic and applied research
• Evaluation of results in real environments
• Strong collaboration with industrial partners
[Diagram: from Basic Research to Application Domains]

Slide 4


Slide 5

 Testing: the process of executing software with the intent of finding and correcting faults
 Conformance testing: the process of checking whether the implementation under test conforms to the specification
• Two techniques: active and passive testing (monitoring)
• This presentation will focus mostly on monitoring, but there are many common objectives and challenges with active testing

Slide 6

• Usually called Model-Based Testing (MBT)
• It is assumed that the tester controls the implementation. Control means: after sending an input and after receiving an output, the tester knows what the next input to be sent is
• The tester can guide the implementation towards specific states
• Automatic test generation methods can be defined
• Usually a test case is a set of input sequences
[Diagram: Formal Specification → Test Suites → Active Tester ↔ IUT; Verdict: PASS, FAIL, INCONC.]
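To make the control loop concrete, here is a minimal sketch of an active tester; the FSM encoding, the example machine and the Implementation class are my assumptions, not material from the slides.

```python
# Minimal active-testing sketch: the tester sends each input, observes the
# output, and only then decides on the next input (full control of the IUT).

SPEC = {  # (state, input) -> (output, next_state)
    (0, "a"): ("x", 1),
    (1, "b"): ("y", 0),
    (0, "b"): ("z", 0),
}

class Implementation:
    """Simulated IUT; in a real setting this would wrap the live system."""
    def __init__(self, table, initial=0):
        self.table, self.state = table, initial

    def step(self, inp):
        out, self.state = self.table[(self.state, inp)]
        return out

def active_test(iut, inputs):
    """Drive the IUT through `inputs`, comparing each observed output
    with the one the specification expects."""
    state = 0
    for inp in inputs:
        expected, state = SPEC[(state, inp)]
        if iut.step(inp) != expected:
            return "FAIL"
    return "PASS"

print(active_test(Implementation(SPEC), ["a", "b", "b"]))  # -> PASS
```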

Slide 7

• Passive testing consists of analyzing the traces recorded from the IUT and trying to find a fault by comparing these traces either with the complete specification or by verifying some specific requirements (or properties) during normal runtime
• No interference with the IUT
• It is also referred to as monitoring
[Diagram: System User ↔ IUT; traces collected at a point of observation (PO) and checked by the Passive Tester against the System Specification; Verdict: PASS, FAIL, INCONC.]
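By contrast with the active loop above, a passive tester only checks what it observes. A minimal sketch (the encoding is my assumption): since the IUT's state at the start of the trace is unknown, every specification state still consistent with the trace is tracked.

```python
SPEC = {  # (state, input) -> (output, next_state)
    (0, "a"): ("x", 1),
    (1, "b"): ("y", 0),
    (0, "b"): ("z", 0),
}

def passive_check(spec, states, trace):
    """Check a recorded trace of (input, output) pairs against the spec
    without stimulating the IUT; fail when no spec state explains it."""
    candidates = set(states)  # states the IUT could currently be in
    for inp, out in trace:
        candidates = {nxt for s in candidates
                      if (s, inp) in spec and spec[(s, inp)][0] == out
                      for nxt in [spec[(s, inp)][1]]}
        if not candidates:
            return "FAIL"  # no specification state can explain the trace
    return "PASS"

print(passive_check(SPEC, [0, 1], [("a", "x"), ("b", "y")]))  # -> PASS
print(passive_check(SPEC, [0, 1], [("a", "z")]))              # -> FAIL
```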

Slide 8


Slide 9

[Diagram: Specification FSM with states 1, 2, 3 and transitions 1€ / another 1€, 1€ / OK, 2€ / OK and Choice / Soda, Juice]
[Diagram: Implementation I1, identical to the specification except that the transition 2€ / OK is replaced by 2€ / yet another 1€: an output fault]

Slide 10

[Diagram: Implementation I2, with states 1 and 2, in which the transition 1€ / OK leads to the wrong state: a transfer fault]
[Diagram: Implementation I3, a single-state machine with transitions 1€ / another 1€, 1€ / OK, 2€ / OK and Choice / Soda, Juice]
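A sketch of how the two fault kinds show up when comparing behaviour; the vending-machine transition endpoints are my reading of the diagrams and should be treated as assumptions.

```python
SPEC = {(1, "1€"): ("another 1€", 2), (2, "1€"): ("OK", 3),
        (1, "2€"): ("OK", 3),         (3, "Choice"): ("Soda, Juice", 1)}

I1 = dict(SPEC); I1[(1, "2€")] = ("yet another 1€", 3)  # output fault
I2 = dict(SPEC); I2[(2, "1€")] = ("OK", 1)              # transfer fault

def compare(spec, impl, inputs, start=1):
    """Feed the same inputs to both machines; report the first visible
    difference. Undefined inputs are modelled as 'no response'."""
    s_spec = s_impl = start
    for i in inputs:
        o_spec, s_spec = spec.get((s_spec, i), ("no response", s_spec))
        o_impl, s_impl = impl.get((s_impl, i), ("no response", s_impl))
        if o_impl != o_spec:
            return f"fault visible on input {i}: got '{o_impl}', expected '{o_spec}'"
    return "no difference observed"

# The output fault is visible immediately:
print(compare(SPEC, I1, ["2€"]))
# The transfer fault produces correct outputs at first and shows up later:
print(compare(SPEC, I2, ["1€", "1€", "Choice"]))
```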

Slide 11

 How to bring the finite state machine implementation into any given state at any given time during testing?
 A non-trivial problem, because of the limited controllability of the finite state machine implementation
 It may not be possible to put the finite state machine into the head state of the transition being tested without executing several transitions

Slide 12

[Diagram: a specification with transition a/b, compared with four implementations Imp1–Imp4 (variants with duplicated a/b transitions, a spontaneous ε/b transition, and an a/c branch), classified as controllable, non-controllable, or controllable under a fairness assumption]

Slide 13

 How to verify that the finite state machine implementation is in the correct state after an input/output exchange?
 This is the state identification problem. It is difficult because of the limited observability of the finite state machine implementation: it may not be possible to directly verify that the finite state machine is in the desired tail state after the transition has been fired

Slide 14

To solve this problem, different methods have been proposed:
 DS (Distinguishing Sequence)
 UIO (Unique Input/Output Sequence)
 W (Characterization Set)

Slide 15

Define an input sequence for each state such that the output sequence generated is unique to that state. Detects output and transfer faults.

State | UIO sequence
S1 | c/x
S2 | c/y
S3 | b/y

Test of (1): a/y a/x b/y
Test of (2): a/y c/z b/y
[Diagram: FSM with states S1, S2, S3 and transitions labelled a/x, c/z, b/y, a/y, b/z, c/y, a/y, b/x, c/x; (1) and (2) mark the transitions under test]
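A small sketch verifying that the listed sequences really are UIOs. The complete transition table is only partly recoverable from the slide, so the table below is an assumed one, chosen to be consistent with the UIO sequences shown.

```python
M = {  # (state, input) -> (output, next_state); assumed, consistent with slide
    ("S1", "a"): ("y", "S2"), ("S1", "b"): ("x", "S1"), ("S1", "c"): ("x", "S3"),
    ("S2", "a"): ("x", "S3"), ("S2", "b"): ("z", "S1"), ("S2", "c"): ("y", "S2"),
    ("S3", "a"): ("y", "S1"), ("S3", "b"): ("y", "S2"), ("S3", "c"): ("z", "S3"),
}
STATES = ["S1", "S2", "S3"]

def run(machine, state, inputs):
    """Output sequence produced when `inputs` is applied from `state`."""
    outs = []
    for i in inputs:
        o, state = machine[(state, i)]
        outs.append(o)
    return outs

def is_uio(machine, state, inputs):
    """A UIO for `state`: no other state yields the same output sequence."""
    ref = run(machine, state, inputs)
    return all(run(machine, s, inputs) != ref for s in STATES if s != state)

for st, seq in [("S1", ["c"]), ("S2", ["c"]), ("S3", ["b"])]:
    print(st, seq, is_uio(M, st, seq))   # all three print True
```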

Slide 16

Test of (1): a/y a/x b/y
Test of (2): a/y c/z b/y
Application of test (1) to the implementation: a/y a/x b/z (transfer error)
Application of test (2) to the implementation: a/y c/x (output error)
[Diagram: faulty implementation FSM with states S1, S2, S3 and transitions labelled a/y, a/x, b/x, b/y, b/z, c/x, c/y]

Slide 17

 Not applicable when there is no direct access to the implementation under test
 Semi-controllable interfaces (component testing)
 Interference with the behaviour of the implementation

Slide 18

 Test in context, embedded testing:
 Tests focused on some components of the system, to avoid redundant tests
 Semi-controllable interfaces
 In some cases it is not possible to apply active testing
[Diagram: context module C and embedded module A exchanging internal messages (a, b, c, a', b', c', ia, ib) within their environment]

Slide 19

 Conformance testing is essentially focused on verifying the conformity of a given implementation to its specification
 It is based on the ability of a tester to stimulate the implementation under test and to check the correctness of the answers provided by the implementation
 Closely related to the controllability of the IUT
 In some cases this activity becomes difficult, in particular:
 if the tester does not have a direct interface with the implementation
 or when the implementation is built from components that have to run in their environment and cannot be shut down or interrupted (for a long time) in order to test them

Slide 20

 Controllability
 No controllability issue, because there is no interaction with the implementation under test
 Observability
 To perform passive testing, it is assumed that the messages exchanged between modules can be observed
 Passive testing is a grey-box testing technique
 Fault detection using passive testing
 It is possible to detect output faults
 It is possible to detect transfer faults under some hypotheses: initialise the IUT, in order to be sure that the implementation is in the initial state, and then perform passive testing

Slide 21

 In this approach, a set of properties is extracted from the specification or proposed by protocol experts, and then the trace resulting from the implementation is analyzed to determine whether it validates this set of properties.
 These extracted properties are called invariants, because they have to hold true at every moment.

Slide 22

 Definition: an invariant is a property that is always true.
 Two test steps:
 Extraction of invariants from the specification, or invariants proposed by protocol experts
 Application of the invariants to execution event traces from the implementation
 Solution: I/O invariants

Slide 23

 An invariant is composed of two parts:
 The test (an input or an output)
 The preamble (an I/O sequence)
 Three kinds of invariants:
 Output invariant (simple invariant)
 Input invariant (obligation invariant)
 Succession invariant (loop invariant)

Slide 24

 Definition: an invariant in which the test is an output
 Meaning: "immediately after the preamble sequence there is always the expected output"
 Example: (i1/o1) (i2/o2) (preamble in blue, expected output in red)

Slide 25

 Definition: an invariant in which the test is an input
 Meaning: "immediately before the preamble sequence there is always the test input"
 Example: (i1/o1) (i2/o2) (preamble in blue, test in red)

Slide 26

 Definition: an I/O invariant for complex properties (loops, …)
 Example: the three invariants below build the property "only the third i2 is followed by o3":
(i1/o1) (i2/o2)
(i1/o1) (i2/o2) (i2/o2)
(i1/o1) (i2/o2) (i2/o2) (i2/o3)

Slide 27

 A trace i1/O1, …, in-1/On-1, in/O is a simple invariant if, each time the trace i1/O1, …, in-1/On-1 is observed, whenever we obtain the input in we necessarily get an output belonging to O, where O is included in the set of expected outputs.
 i/o, *, i'/O means that if we detect the transition i/o, then the first occurrence of the input symbol i' is followed by an output belonging to the set O.
 * replaces any sequence of symbols not containing the input symbol i', and ? replaces any input or output.
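A sketch of a checker that follows these definitions directly. Traces are lists of (input, output) pairs; "?" matches any symbol; the encoding of an invariant as (preamble, star, test_input, allowed_outputs) is mine, as is the example trace.

```python
def check_invariant(trace, preamble, star, test_input, allowed):
    """Return False iff the trace violates the invariant somewhere."""
    def match(pair, pattern):
        return all(p == "?" or p == v for v, p in zip(pair, pattern))

    n = len(preamble)
    for k in range(len(trace) - n + 1):
        if not all(match(trace[k + j], preamble[j]) for j in range(n)):
            continue
        pos = k + n
        if star:  # the "*": skip pairs until the first occurrence of test_input
            while pos < len(trace) and trace[pos][0] != test_input:
                pos += 1
        if pos < len(trace):
            i, o = trace[pos]
            if i == test_input and o not in allowed:
                return False
    return True

trace = [("a", "y"), ("c", "z"), ("b", "y"), ("a", "x"), ("c", "z"), ("b", "z")]
# Invariant a/?, c/z, b/{y}: after any a followed by c/z, b must output y.
print(check_invariant(trace, [("a", "?"), ("c", "z")], False, "b", {"y"}))  # False
# Invariant a/x, *, b/{y, z}: after a/x, the first b must output y or z.
print(check_invariant(trace, [("a", "x")], True, "b", {"y", "z"}))          # True
```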

Slide 28

[Diagram: FSM with states 1, 2, 3 and transitions a/x, c/z, b/y, a/y, b/z, c/y, a/y, b/x, c/x]
Traces: a/y c/z b/y a/y a/x c/z b/y c/x a/y a/x c/z b/y c/y a/x b/z b/x a/y
Invariants: a/?, c/z, b/{y} · b/z, a/{x} · a/x, *, b/{y, z} · a/y, ?/{z} · a/x, *, ?/{y} · a/{x}
Verdicts (as laid out on the slide): False, True, False, True, False, True

Slide 29

Active testing:
 Possibility to focus on a specific part of the specification
 Full test generation automation
 Needs a model
 May modify (crash) the IUT behavior
[Diagram: Formal Specification → Test Suites → Active Tester ↔ IUT; Verdict: PASS, FAIL, INCONC.]

Passive testing (monitoring):
 No interference with the IUT
 No models needed
 Full monitoring automation
 Grey-box testing
[Diagram: System User ↔ IUT; traces collected at a point of observation (PO) and checked by the Passive Tester against the System Specification; Verdict: PASS, FAIL, INCONC.]

Slide 30

 An approach proposed by researchers of the verification (model checking) community
 EAGLE and RuleR tools, proposed by Barringer et al. in 2004 and 2010 respectively, based on temporal logics and rewriting rules for property description
 Other tools: Tracematches [Avgustinov et al. 2007], J-LO [Bodden 2005] and LSC [Maoz and Harel 2006]
 SNORT (Cisco): protocol analysis, content searching and matching, detection of a variety of attacks
 BRO (University of California, Berkeley; now called Zeek): a real-time, network-based IDS

Slide 31


Slide 32

 Monitoring the traces of a running system (e.g., traffic or message flows), online or offline.
 Non-obtrusive (i.e., execution traces are observed without interfering with the behaviour of the system).
 Analyzing the collected data according to functional and non-functional requirements:
 Security properties described in a formal specification (temporal logic, regular expressions) describing behaviour involving several events over time.
 Performance, to get real-time visibility of traffic statistics, KPIs, delivered QoS, etc.
 Can be extended to perform counter-measures.
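An illustrative sketch (not any of the tools cited in this deck) of both requirement kinds at once: a timed property, "every request is answered within 2 s", checked over an event stream, plus a simple KPI derived from the same events. The event format is an assumption.

```python
events = [  # (timestamp in seconds, event name, flow id) -- assumed format
    (0.0, "request", 1), (0.4, "response", 1),
    (1.0, "request", 2), (3.8, "response", 2),
]

def monitor(events, deadline=2.0):
    """Match responses to pending requests; flag deadline violations and
    compute the mean latency as a KPI."""
    pending, latencies, violations = {}, [], []
    for t, name, flow in events:
        if name == "request":
            pending[flow] = t
        elif name == "response" and flow in pending:
            latency = t - pending.pop(flow)
            latencies.append(latency)
            if latency > deadline:
                violations.append(flow)  # non-functional requirement violated
    return sum(latencies) / len(latencies), violations

mean_latency, violations = monitor(events)
print(f"mean latency: {mean_latency:.2f} s, deadline violations on flows: {violations}")
```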

Slide 33


Slide 34


Slide 35


Slide 36

INTER-TRUST project: a three-year project with many academic and industrial partners
 Security properties of services
 Detection of attacks using active testing and monitoring techniques

Slide 37

 Why testing? (testing phase)
 Vulnerabilities can be introduced by the AOP (Aspect Oriented Programming) used in Inter-Trust
 Functional testing
 Check that the woven security policies (aspects) are respected
 Check the robustness of the target application
 Detect vulnerabilities
 Simulate attacks
 Why monitoring? (testing & operation phases)
 Same as above
 + detecting context changes (context awareness) at runtime

Slide 38


Slide 39

• Generation of tests from the IF model and test purposes. Targets: functional and security properties, attacks
• Execution relying on Selenium (Web interface)
• Detection of failures using MMT
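A hedged sketch of how one concrete test step ("log in with valid credentials") could be driven through the Web interface with Selenium. The URL and the element ids are hypothetical placeholders, not the project's actual ones.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Firefox()
try:
    driver.get("https://evoting.example/login")                  # assumed URL
    driver.find_element(By.ID, "login").send_keys("alice")       # assumed id
    driver.find_element(By.ID, "password").send_keys("secret")   # assumed id
    driver.find_element(By.ID, "submit").click()                 # assumed id
    # Oracle step: check that the expected page was reached.
    assert "elections" in driver.current_url, "login step failed"
finally:
    driver.quit()
```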

Slide 40

The e-voting application has been specified as an extended finite state machine (IF language). Its states correspond to the following steps:
 First step: the user is asked to enter his login and password; if the login and password are not correct, an error message is displayed, otherwise the user is connected.
 In this state the user is asked to choose his privacy options (authentication, encryption, signature).
 This state presents the elections available to the user.
 In this state the user chooses a list of elections in which he will vote.
 This state means that there is a warning regarding the security policy; the user must choose other options.
 In this step the vote choices are displayed and the user has to fill in the vote form; this step is the effective vote.
 In this step the user has to verify his vote: he can confirm or change it.
 The vote is validated; the user cannot modify his vote anymore, but he can choose another election or log out.

Slide 41

This part of the TestGen-IF tool is used to choose the test objectives. Each test objective is presented with its description and formal specification.

Slide 42

The generation of abstract test cases is based on an algorithm called "Hit or Jump".
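A simplified sketch of the Hit-or-Jump idea (the real TestGen-IF works on IF models with formal test purposes; here the model is a plain labelled graph and the test purposes are just transition labels to cover). From the current state, a bounded search looks for a target transition: if one is found ("hit"), the path to it is kept; otherwise the search "jumps" to a random frontier state to escape local traps without backtracking.

```python
import random
from collections import deque

def hit_or_jump(transitions, start, targets, depth=3, max_steps=100, seed=0):
    """`transitions`: state -> list of (label, next_state);
    `targets`: set of transition labels the test purpose must cover."""
    rng = random.Random(seed)
    state, path, remaining = start, [], set(targets)
    for _ in range(max_steps):
        if not remaining:
            return path  # all test purposes covered
        queue, frontier, hit, seen = deque([(state, [])]), [], None, {state}
        while queue and hit is None:  # bounded BFS from the current state
            s, p = queue.popleft()
            if len(p) == depth:
                frontier.append((s, p))
                continue
            for label, nxt in transitions.get(s, []):
                step = p + [(s, label, nxt)]
                if label in remaining:
                    hit = (nxt, step, label)  # "hit": target reachable
                    break
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, step))
        if hit:
            state, steps, label = hit
            path += steps
            remaining.discard(label)
        elif frontier:
            state, steps = rng.choice(frontier)  # the "jump"
            path += steps
        else:
            return path  # nothing left to explore
    return path

G = {0: [("idle", 0), ("login", 1)], 1: [("vote", 2)], 2: [("logout", 0)]}
print(hit_or_jump(G, 0, targets={"vote", "logout"}))
```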

Slide 43


Slide 44

 Detection of failures using MMT
 Event-based detection
 Properties expressed as FSMs or as LTL properties

Slide 45

 Two main uses:
 During the testing phase, to complement the testing tools and provide a verdict
 During the operation phase, to monitor security and the application context
 Relies on data collected at different levels:
 Network
 Application internal events (notification module)
 System status (CPU and memory usage)

Slide 46


Slide 47

 E-voting test case, advanced authentication option
 Example of property: only authenticated voters can cast their votes
[Diagram: property automaton with states Init and Logged_In; Login: Init → Logged_In; Logout: Logged_In → Init; Cast vote → Failure in Init, Cast vote → Success in Logged_In]
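A minimal sketch of this property automaton: "Cast vote" may only succeed between Login and Logout. The event names follow the slide; the encoding as a transition table is mine, not MMT's property language.

```python
PROPERTY = {  # (state, event) -> next state
    ("Init", "Login"): "Logged_In",
    ("Logged_In", "Cast vote"): "Logged_In",  # authenticated: success
    ("Logged_In", "Logout"): "Init",
}

def monitor(trace):
    """Replay an observed event trace through the property automaton."""
    state = "Init"
    for event in trace:
        if (state, event) in PROPERTY:
            state = PROPERTY[(state, event)]
            print(f"{event}: ok")
        elif event == "Cast vote":  # vote attempted without authentication
            print(f"{event}: FAIL (not authenticated)")
        else:
            print(f"{event}: ignored")

monitor(["Cast vote", "Login", "Cast vote", "Logout", "Cast vote"])
# -> the first and last casts fail; the one after Login succeeds
```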

Slide 48


Slide 49


Slide 50

 Model-based test generation for security purposes (TestGen-IF)
 Correlation of data from different sources (network, application, system)
 Detection of attacks and failures, and runtime reaction
 Brings dynamicity to the system by adapting to different contexts

Slide 51

Related Works
 Monitoring of routing protocols for ad hoc networks (OLSR protocol) and for mesh networks based on a distributed approach (BATMAN protocol) (Telecom SudParis)
 Monitoring for secure interoperability, with an application to a multi-source information system (Telecom SudParis)
 Monitoring with time constraints (C. Andrés, M. Núñez and M. Merayo)
 Monitoring with asynchronous communications (M. Núñez and R. Hierons)
 Other works by T. Jéron and H. Marchand, and by A. Ulrich and A. Petrenko

Slide 52

 It is now easier to support active testing and monitoring and to integrate them with other development activities
 Modeling technology has matured (FSMs, EFSMs, different UML profiles (SysML), temporal logics)
 Much research and innovation is still required, and it should involve collaborations between research and industry

Slide 53

1. Raul A. Fuentes-Samaniego, Vinh Hoa La, Ana Rosa Cavalli, Juan Arturo Nolazco-Flores, Raúl V. Ramírez-Velarde: A monitoring-based approach for WSN security using IEEE-802.15.4/6LowPAN and DTLS communication. IJAACS 12(3): 218-243 (2019).
2. Rabéa Ameur-Boulifa, Ana R. Cavalli, Stephane Maag: Verifying Complex Software Control Systems from Test Objectives: Application to the ETCS System. ICSOFT 2019: 397-406.
3. Thierno Birahime Sambe, Stephane Maag, Ana R. Cavalli: A Methodology for Enterprise Resource Planning Automation Testing. Application to the Open Source ERP ODOO. ICSOFT 2019: 407-415.
4. Diego Rivera, Edgardo Montes de Oca, Wissam Mallouli, Ana R. Cavalli, Brecht Vermeulen, Matevz Vucnik: Industrial IoT Security Monitoring and Test on Fed4Fire+ Platforms. ICTSS 2019: 270-278.
5. Georges Ouffoué, Fatiha Zaïdi, Ana R. Cavalli: Attack Tolerance for Services-Based Applications in the Cloud. ICTSS 2019: 242-258.
6. Khalifa Toumi, Mohamed H. E. Aouadi, Ana R. Cavalli, Wissam Mallouli, Jordi Puiggalí Allepuz, Pol Valletb Montfort: A Framework for Testing and Monitoring Security Policies: Application to an Electronic Voting System. Comput. J. 61(8): 1109-1122 (2018).
7. Sarah A. Dahab, Erika Silva, Stephane Maag, Ana Rosa Cavalli, Wissam Mallouli: Enhancing Software Development Process Quality based on Metrics Correlation and Suggestion. ICSOFT 2018: 154-165.
8. Pamela Carvallo, Ana R. Cavalli, Wissam Mallouli, Erkuden Rios: Multi-cloud Applications Security Monitoring. GPC 2017: 748-758.
9. Georges L. A. Ouffoue, Fatiha Zaïdi, Ana R. Cavalli, Mounir Lallali: An Attack-Tolerant Framework for Web Services. SCC 2017: 503-506.

Slide 54

10. Pamela Carvallo, Ana R. Cavalli, Natalia Kushik: Automatic Derivation and Validation of a Cloud Dataset for Insider Threat Detection. ICSOFT 2017: 480-487.
11. Ana R. Cavalli, Antonio M. Ortiz, Georges Ouffoué, Cesar A. Sanchez, Fatiha Zaïdi: Design of a Secure Shield for Internet and Web-Based Services Using Software Reflection. ICWS 2018: 472-486.
12. Vinh Hoa La, Raul A. Fuentes-Samaniego, Ana R. Cavalli: Network Monitoring Using MMT: An Application Based on the User-Agent Field in HTTP Headers. AINA 2016: 147-154.
13. Vinh Hoa La, Raul Fuentes, Ana R. Cavalli: A novel monitoring solution for 6LoWPAN-based Wireless Sensor Networks. APCC 2016: 230-237.