SOFTWARE ENGINEERING RESEARCH COMMUNITY VIEWPOINTS ON RAPID REVIEWS

Bruno Cartaxo
September 20, 2019

Presentation of the paper with the same title at the 13th IEEE/ACM International Symposium on Empirical Software Engineering and Measurement (ESEM), Porto de Galinhas, Brazil.


Transcript

  1. SOFTWARE ENGINEERING RESEARCH COMMUNITY VIEWPOINTS ON RAPID REVIEWS

    Bruno Cartaxo - Federal Institute of Pernambuco - IFPE
    Gustavo Pinto - Federal University of Pará - UFPA
    Baldoino Fonseca - Federal University of Alagoas - UFAL
    Márcio Ribeiro - Federal University of Alagoas - UFAL
    Pedro Pinheiro - Federal University of Alagoas - UFAL
    Maria Teresa Baldassarre - University of Bari - UNIBARI
    Sergio Soares - Federal University of Pernambuco - UFPE
    Porto de Galinhas, Pernambuco - Brazil
  2. MANY ARE CLAIMING THAT SOFTWARE ENGINEERING RESEARCH LACKS CONNECTION WITH

    PRACTICE, IN PARTICULAR SYSTEMATIC REVIEWS
  3. EVIDENCE-BASED MEDICINE HAS BEEN ADDRESSING THE LACK OF CONNECTION

    BETWEEN SLRs AND PRACTICE WITH RAPID REVIEWS
  4. RAPID REVIEWS

    RRs are lightweight secondary studies focused on delivering evidence to practitioners in a timely manner¹,²
    TO ACHIEVE THIS GOAL, some steps of SLRs are deliberately omitted or simplified in RRs
    1. Tricco et al. A scoping review of rapid review methods. BMC Medicine, 2015.
    2. Hartling et al. A taxonomy of rapid reviews links report types and methods to specific decision-making contexts. Journal of Clinical Epidemiology, 2016.
  5. RAPID REVIEW CHARACTERISTICS

    • They reduce costs of heavyweight methods
    • They deliver evidence in a timely manner
    • They are performed in close collaboration with practitioners
  6. EVIDENCE BRIEFINGS¹

    1. Bruno Cartaxo et al. Evidence Briefings: Towards a Medium to Transfer Knowledge from Systematic Reviews to Practitioners, 2016.
  7. (image-only slide)

  8. HOWEVER, THE OPINIONS OF THE RESEARCH COMMUNITY SEEM TO

    BE POLARIZED
  9. THE GOAL OF THIS RESEARCH IS TO INVESTIGATE THE SOFTWARE ENGINEERING

    RESEARCH COMMUNITY'S VIEWPOINTS ABOUT RAPID REVIEWS
  10. Q-METHODOLOGY

    • employs both qualitative and quantitative methods to shed light on complex issues regarding human subjectivity
    • focuses on discovering the existing viewpoints about a particular topic
  11. Defining the Q-SET → Defining the P-SET → Q-SORTing process →

    Conducting Factor Analysis → Conducting Factor Interpretation
  12. SET OF OPINION STATEMENTS, COMPOSED OF 50 STATEMENTS¹

    1. Kelly et al. Expediting evidence synthesis for healthcare decision-making: exploring attitudes and perceptions towards rapid reviews using Q methodology. PeerJ, 2016.
  13. SET OF OPINION STATEMENTS, COMPOSED OF 50 STATEMENTS¹

    Example: "Rapid reviews do not replace systematic reviews"
    1. Kelly et al. Expediting evidence synthesis for healthcare decision-making: exploring attitudes and perceptions towards rapid reviews using Q methodology. PeerJ, 2016.
  14. SET OF OPINION STATEMENTS, COMPOSED OF 50 STATEMENTS¹

    Examples: "Rapid reviews do not replace systematic reviews"; "Rapid reviews are ’quick and dirty’ systematic reviews"
    1. Kelly et al. Expediting evidence synthesis for healthcare decision-making: exploring attitudes and perceptions towards rapid reviews using Q methodology. PeerJ, 2016.
  15. RESEARCHERS FROM ICSE, ESEM, EASE 2015, 2016, 2017

    • 37 PARTICIPANTS
    • > 90% HOLD A PhD
    • > 90% HAVE READ AN SLR
    • > 70% HAVE CONDUCTED AN SLR
  16. Participants rank the opinion statements following the Q-SORT structure,

    according to their level of agreement with the statements
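The forced Q-SORT ranking described above can be sketched in code. This is an illustrative sketch only: the grid shape below (a quasi-normal distribution over a -4 to +4 agreement scale, summing to 50 statements) is an assumption for demonstration, not the exact grid used in the study.

```python
# Hypothetical Q-SORT grid: a forced quasi-normal distribution over an
# agreement scale from -4 (strongly disagree) to +4 (strongly agree).
# Column capacities are illustrative; they sum to 50 statements.
GRID = {-4: 3, -3: 4, -2: 6, -1: 7, 0: 10, 1: 7, 2: 6, 3: 4, 4: 3}

def is_valid_q_sort(ranking):
    """Check that a participant's ranking fills the forced grid exactly.

    `ranking` maps each statement id to the agreement column it was
    placed in; a valid Q-sort uses every slot of every column.
    """
    counts = {col: 0 for col in GRID}
    for col in ranking.values():
        if col not in counts:
            return False  # column outside the scale
        counts[col] += 1
    return counts == GRID

# Build one valid ranking that fills each column to capacity.
ranking, sid = {}, 0
for col, capacity in GRID.items():
    for _ in range(capacity):
        ranking[sid] = col
        sid += 1

print(is_valid_q_sort(ranking))  # True
```

The forced distribution is what distinguishes a Q-sort from a free rating scale: participants must trade statements off against each other rather than agreeing strongly with everything.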
  17. Diagram: participants' Q-SORTs are fed into dimensionality reduction

    algorithms to extract the factors/viewpoints
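The factor-extraction step sketched in the diagram can be illustrated as follows. This is a minimal sketch, not the study's actual analysis pipeline: the data are random placeholders, and principal-components extraction with a Kaiser (eigenvalue > 1) cutoff stands in for whichever factor-analysis variant the authors used.

```python
import numpy as np

# Hypothetical Q-sort data: rows = participants, columns = statements,
# values = agreement ranks (e.g., -4 .. +4 from the Q-sort grid).
rng = np.random.default_rng(0)
q_sorts = rng.integers(-4, 5, size=(6, 10))  # 6 participants, 10 statements

# 1. Correlate each pair of participants' Q-sorts: similar sorts
#    indicate participants who share a viewpoint.
corr = np.corrcoef(q_sorts)

# 2. Extract factors via eigen-decomposition of the correlation matrix
#    (principal components as a stand-in for centroid factor analysis).
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# 3. Retain factors with eigenvalue > 1 (Kaiser criterion); each
#    retained factor groups participants who loaded onto one viewpoint.
keep = eigvals > 1.0
loadings = eigvecs[:, keep] * np.sqrt(eigvals[keep])
print(loadings.shape)  # (participants, retained factors)
```

Each column of `loadings` corresponds to one candidate viewpoint; participants with a high loading on the same column are the ones whose comments and rankings are read together during factor interpretation.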
  18. • Analyze how the statements are ranked in each factor

    • Analyze the distinguishing statements of each factor
    • Analyze the participants' comments on the extremes
    • Write a story about each factor/viewpoint
    • Define a name for each factor/viewpoint
  19. (image-only slide)

  20. VIEWPOINT A: unconvinced/undecided

    VIEWPOINT B: enthusiastic
    VIEWPOINT C: picky
    VIEWPOINT D: pragmatic
  21. Unconvinced/Undecided

    Further research comparing RRs with SLRs is required before I decide how I feel about rapid reviews.
  22. Unconvinced/Undecided

    Further research comparing RRs with SLRs is required before I decide how I feel about rapid reviews.
    I put more confidence in evidence produced by an SLR than by an RR.
    A well-conducted RR may produce better evidence than a poorly conducted SLR.
  23. Enthusiastic

    RRs can be timely and valid, even when methodological concessions are made, and I strongly disagree that RRs are ’quick and dirty’ SLRs.
    A well-conducted RR may produce better evidence than a poorly conducted SLR.
  24. Enthusiastic

    RRs can be timely and valid, even when methodological concessions are made, and I strongly disagree that RRs are ’quick and dirty’ SLRs.
    A well-conducted RR may produce better evidence than a poorly conducted SLR.
    However, I believe that minimum standards to conduct and report RRs are essential.
  25. Picky

    RRs that omit an assessment of the quality of included studies are useless to practitioners, because practitioners do not fully understand the implications of streamlining evidence synthesis methods to produce a more timely evidence product.
  26. Picky

    RRs that omit an assessment of the quality of included studies are useless to practitioners, because practitioners do not fully understand the implications of streamlining evidence synthesis methods to produce a more timely evidence product.
    I believe that using no evidence to inform decisions may be better than using RRs.
  27. Pragmatic

    Appropriateness of an RR varies with the type of decision being made.
    The evidence from rapid reviews is good enough to inform low-risk decisions.
  28. Pragmatic

    Appropriateness of an RR varies with the type of decision being made.
    The evidence from rapid reviews is good enough to inform low-risk decisions.
    Transparency of process is more important than the actual methods used to produce an RR, as transparency allows practitioners to make their own assessment of validity and appropriateness.
  29. Consensuses and Dissensions among the Viewpoints

    • All evidence synthesis products, including rapid reviews and systematic reviews, can be conducted very well or very poorly
    • It is important to have minimum standards for the reporting of rapid reviews
  30. Researchers’ concerns related to Rapid Reviews

    • More evidence about RRs
    • Minimum standards for RRs
    • Quality assessment of studies included in RRs
    • Transparency of RR results
  31. Conclusion

    • Four viewpoints constituting a typology
    • Despite the differences, we also identified some consensus
    • With this typology, one can better address the main concerns of researchers and promote better understanding about RRs
    • As a consequence, we can pave a road better connecting SE research with practice and make the SE research community more impactful and relevant.
  32. FUTURE DIRECTIONS

    • More RRs in SE
    • Compare RRs with SLRs
    • Impact of RRs in practice
  33. (image-only slide)
  34. brunocartaxo.com/talks @brunocartaxo facebook.com/bruno.cartaxo