Automated Cross-Browser Compatibility Testing

saltlab
October 13, 2014


Transcript

  1. Automated Cross-Browser Compatibility Testing
     Ali Mesbah, University of British Columbia, Vancouver, BC, Canada
     Mukul R. Prasad, Fujitsu Laboratories of America, Sunnyvale, CA, USA
  2. Move to the Web

  3. Browsing Environments

  4. Cross-Browser Compatibility (CBC) Problem
     • Web browsers render web content differently
     • Increase in client-side code
     • Developers
       – build web apps with one browser
       – test manually for a few more as an afterthought
  5. Look & Feel versus Functionality (Firefox vs. Chrome)

  6. Look & Feel versus Functionality (Firefox vs. IE)

  7. Behind the Scenes
     [Diagram: the Browser Engine processes JavaScript, HTML, and CSS to produce the DOM]
  8. Our Definition
     We define Cross-Browser Compatibility testing as:
     "Checking the functional consistency of a web application across different browsing environments."
  9. Our Goal
     Automatically detect inconsistencies at:
     • Trace-level
       – Example: click on "New User" -> "Create User"
       – State transitions
     • Screen-level
       – Example: a screen widget's structure and position
  10. Our Approach
      [Diagram: a dynamic crawler loads http://www.example.com in each execution environment (OS + Browser) and infers a state-graph model for that environment]
  11. State Exploration
      Automatically derive a model of the user interaction
      – Simulate a user's interaction with the web application
      – Gain access to various dynamic DOM states
      – Incrementally build a State Graph
      http://crawljax.com
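
The exploration step is built on the Crawljax crawler linked above. As a rough illustration, a minimal setup looks something like the sketch below; it assumes the Crawljax 3.x builder API (method names vary across releases), and the URL and state limit are placeholders, not the configuration used in the talk.

    import com.crawljax.core.CrawljaxRunner;
    import com.crawljax.core.configuration.CrawljaxConfiguration;
    import com.crawljax.core.configuration.CrawljaxConfiguration.CrawljaxConfigurationBuilder;

    public class ExploreStates {
        public static void main(String[] args) throws Exception {
            // Placeholder URL for the web app under test.
            CrawljaxConfigurationBuilder builder =
                    CrawljaxConfiguration.builderFor("http://www.example.com");

            // Simulate a user: click the default clickable elements (anchors, buttons, ...).
            builder.crawlRules().clickDefaultElements();

            // Bound the exploration so the inferred state graph stays finite.
            builder.setMaximumStates(100);

            // Run the crawl; the state graph is built incrementally during exploration.
            new CrawljaxRunner(builder.build()).call();
        }
    }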
  12. State Graph
      [Diagram: a state graph with nodes Index and States 2-5, connected by click edges labeled with XPath expressions: Click: //BODY/DIV/A[2], Click: //BODY/DIV[2]/A[3], Click: //BODY/SPAN[3], Click: //BODY/DIV[2]/IMG, Click: //BODY/SPAN[3]/A[1]]
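
Conceptually, the inferred model is just a labeled graph: each node is a distinct DOM state and each edge carries the XPath of the element whose click caused the transition. A minimal illustrative encoding (the class and method names here are ours, not the tool's):

    import java.util.HashMap;
    import java.util.Map;

    /** Illustrative state graph: nodes are state names; every edge is labeled
     *  with the XPath of the clicked element that triggered the transition. */
    class StateGraph {
        // source state -> (XPath of clicked element -> target state)
        private final Map<String, Map<String, String>> edges = new HashMap<>();

        void addTransition(String from, String clickXPath, String to) {
            edges.computeIfAbsent(from, k -> new HashMap<>()).put(clickXPath, to);
        }

        public static void main(String[] args) {
            StateGraph g = new StateGraph();
            // Two edges using XPaths from the slide; the real topology comes from crawling.
            g.addTransition("Index", "//BODY/DIV/A[2]", "State 2");
            g.addTransition("State 2", "//BODY/DIV[2]/A[3]", "State 3");
        }
    }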
  13. Equivalence Check: Trace-level
      [Diagram: two inferred state graphs compared for equivalence; one has states 1-4, the other states 1-5]
      • Graph isomorphism
      • Linear time
      • Maximal match
      Examples of mismatches:
      • Edge 3 -> 5 missing in IE
      • Edge 2 -> 1 missing in Chrome
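
Once corresponding states are matched up across the two graphs, comparing edge sets is a single linear pass per graph. The sketch below simplifies by assuming matched states already share names across browsers; the actual technique computes that matching as part of the isomorphism check.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.Set;

    /** Simplified trace-level check: flag transitions present in one browser's
     *  state graph but absent from the other's. */
    class TraceLevelCheck {
        record Edge(String from, String clickXPath, String to) {}

        static List<String> mismatches(Set<Edge> graphA, String browserA,
                                       Set<Edge> graphB, String browserB) {
            List<String> report = new ArrayList<>();
            // One linear pass over each edge set; records give us value equality.
            for (Edge e : graphA)
                if (!graphB.contains(e))
                    report.add("Edge " + e.from() + " -> " + e.to() + " missing in " + browserB);
            for (Edge e : graphB)
                if (!graphA.contains(e))
                    report.add("Edge " + e.from() + " -> " + e.to() + " missing in " + browserA);
            return report;   // e.g. "Edge 3 -> 5 missing in IE"
        }
    }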
  14. Screen Models
      [Diagram: each node (1-5) of the State Graph maps to a DOM tree, e.g. HTML with HEAD and BODY, where BODY contains DIV class=news, SPAN #content, and a P holding the text "World News"]
  15. Equivalence Check: Screen-level
      [Diagram: the two DOM trees for Screen 4 compared; both contain HTML, HEAD, BODY, DIV class=news, and SPAN #content, but one side has an extra LABEL node]
      • Tree comparison
      • DOM differencing
      • Filters to prune syntactic differences
        – Case sensitivity, white space, etc.
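
A simplified version of that check, under two assumptions of ours: the DOM snapshots are serialized as well-formed XML, and only element structure is compared. The two filters from the slide show up as the case-insensitive tag comparison and the skipping of text nodes (which absorbs whitespace-only differences).

    import java.io.StringReader;
    import java.util.ArrayList;
    import java.util.List;
    import javax.xml.parsers.DocumentBuilderFactory;
    import org.w3c.dom.Element;
    import org.w3c.dom.Node;
    import org.w3c.dom.NodeList;
    import org.xml.sax.InputSource;

    /** Simplified screen-level check: compare two DOM snapshots as trees,
     *  pruning the syntactic differences named on the slide. */
    class ScreenLevelCheck {
        static boolean equivalent(String domA, String domB) throws Exception {
            return sameTree(parse(domA), parse(domB));
        }

        static Element parse(String xml) throws Exception {
            return DocumentBuilderFactory.newInstance().newDocumentBuilder()
                    .parse(new InputSource(new StringReader(xml)))
                    .getDocumentElement();
        }

        static boolean sameTree(Node a, Node b) {
            // Filter 1: tag-name case differences are not real differences.
            if (!a.getNodeName().equalsIgnoreCase(b.getNodeName())) return false;
            List<Node> childrenA = elementChildren(a);
            List<Node> childrenB = elementChildren(b);
            // A missing or extra child (e.g. the LABEL above) is a real difference.
            if (childrenA.size() != childrenB.size()) return false;
            for (int i = 0; i < childrenA.size(); i++)
                if (!sameTree(childrenA.get(i), childrenB.get(i))) return false;
            return true;
        }

        /** Filter 2: keep only element children, so whitespace-only text is ignored. */
        static List<Node> elementChildren(Node n) {
            List<Node> out = new ArrayList<>();
            NodeList kids = n.getChildNodes();
            for (int i = 0; i < kids.getLength(); i++)
                if (kids.item(i).getNodeType() == Node.ELEMENT_NODE)
                    out.add(kids.item(i));
            return out;
        }
    }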
  16. Empirical Evaluation
      Research Question:
      – Effectiveness in revealing trace-level and screen-level CBC differences
      Experimental Subjects:
      – 1 open source (The Organizer)
      – 3 industrial (IND1, IND2, TMMS)
      – 1 public domain (CNN)
  17. Results

      Subject     Browser    Detected    Detected       Trace-level    Screen-level
                             states      transitions    differences    differences
      Organizer   Firefox    13          53             7              23
                  Chrome     14          60             0              33
      IND-1       Firefox    100         107            0              43
                  Chrome     100         107            0              18
      IND-2       Firefox    200         200            7              40
                  IE         200         199            2              37
      TMMS        IE         31          48             14             50
                  Chrome     24          34             0              13
      CNN         Firefox    100         99             16             41
                  IE         100         99             0              12

      Screen-level:
      – False positives: 12-37%
      – False negatives:
        • Cases 1 and 4: 0%
      Trace-level:
      – False positives: 0-2
      – False negatives: 0
  18. Firefox:
          <TD>
            <LABEL class="538"></LABEL>
            <A href="javascript:void(null);"
               onclick="return yahoo.widget.my item searches (0)"></A>
          </TD>
      Chrome:
          <TD>
            <A class="538" href="javascript:void(null);"
               onclick="return yahoo.widget.my item searches (0)"></A>
          </TD>
      In Firefox the class attribute sits on an empty LABEL element wrapping the link; in Chrome the LABEL is absent and the class attribute moves to the A element itself.
  19. None
  20. [Venn diagram]
      A: Trace/Screen-level Differences
      B: Observable Differences
      C: Ideal set of differences to detect
      D: Differences detected by our tool
      • Observable Differences: where the end human user can visually see a difference
      • Trace/Screen-level Differences: where our technique can potentially observe a difference
  21. Related Work
      • Emulate browser environments
        – Adobe BrowserLab, BrowserCamp
        – ieCapture, BrowserCam, Browser Photo (capture screenshots)
        – No clear definition of CBC; manual checking
      • Automatic checking
        – WebDiff, Roy Choudhary and Orso
        – DOM and image comparison
        – Single screen, look & feel issues
  22. Summary
      • The trend is: move your software to the Web
      • Various Browser/OS/Platform combinations
        – Cross-browser compatibility is more than look & feel
        – Treat CBC as a functional requirement
      • We provide an automated approach to detect
        – trace-level and
        – screen-level differences