SOFTWARE ENGINEERING RESEARCH COMMUNITY VIEWPOINTS ON RAPID REVIEWS

Bruno Cartaxo
September 20, 2019

Presentation of the paper with the same title at the 13th IEEE/ACM International Symposium on Empirical Software Engineering and Measurement (ESEM), Porto de Galinhas, Brazil

Transcript

  1. SOFTWARE ENGINEERING RESEARCH COMMUNITY VIEWPOINTS ON RAPID REVIEWS
     Bruno Cartaxo - Federal Institute of Pernambuco - IFPE
     Gustavo Pinto - Federal University of Pará - UFPA
     Baldoino Fonseca - Federal University of Alagoas - UFAL
     Márcio Ribeiro - Federal University of Alagoas - UFAL
     Pedro Pinheiro - Federal University of Alagoas - UFAL
     Maria Teresa Baldassarre - University of Bari - UNIBARI
     Sergio Soares - Federal University of Pernambuco - UFPE
     Porto de Galinhas, Pernambuco - Brazil
  2. EVIDENCE-BASED MEDICINE HAS BEEN ADDRESSING THE LACK OF CONNECTION BETWEEN SLRs AND PRACTICE WITH RAPID REVIEWS
  3. RAPID REVIEWS
     RRs are lightweight secondary studies focused on delivering evidence to practitioners in a timely manner [1,2].
     TO ACHIEVE THIS GOAL, some steps of SLRs are deliberately omitted or simplified in RRs.
     1. Tricco et al. A scoping review of rapid review methods. BMC Medicine, 2015.
     2. Hartling et al. A taxonomy of rapid reviews links report types and methods to specific decision-making contexts. Journal of Clinical Epidemiology, 2016.
  4. RAPID REVIEW CHARACTERISTICS
     • They reduce the costs of heavyweight methods
     • They deliver evidence in a timely manner
     • They are performed in close collaboration with practitioners
  5. EVIDENCE BRIEFINGS [1]
     1. Bruno Cartaxo et al. Evidence Briefings: Towards a Medium to Transfer Knowledge from Systematic Reviews to Practitioners, 2016.
  7. THE GOAL OF THIS RESEARCH IS TO INVESTIGATE THE SOFTWARE ENGINEERING RESEARCH COMMUNITY VIEWPOINTS ABOUT RAPID REVIEWS
  8. Q-METHODOLOGY
     • Employs both qualitative and quantitative methods to shed light on complex issues regarding human subjectivity
     • Focuses on discovering the existing viewpoints about a particular topic
  9. Defining the Q-SET → Defining the P-SET → Q-SORTing process → Conducting Factor Analysis → Conducting Factor Interpretation
  10. Defining the Q-SET: SET OF OPINION STATEMENTS, COMPOSED OF 50 STATEMENTS [1]
      1. Kelly et al. Expediting evidence synthesis for healthcare decision-making: exploring attitudes and perceptions towards rapid reviews using Q methodology. PeerJ, 2016.
  11. Example statement from the Q-SET: "Rapid reviews do not replace systematic reviews."
  12. Example statements from the Q-SET: "Rapid reviews do not replace systematic reviews." / "Rapid reviews are 'quick and dirty' systematic reviews."
  13. Defining the P-SET: RESEARCHERS FROM ICSE, ESEM, AND EASE (2015, 2016, 2017)
      37 PARTICIPANTS • > 90% HOLD A PhD • > 90% HAVE READ AN SLR • > 70% HAVE CONDUCTED AN SLR
  14. Q-SORTing process: participants rank the opinion statements following the Q-SORT structure, according to their level of agreement with the statements.
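
For illustration, a minimal Python sketch of what a forced Q-SORT structure can look like; the -4..+4 agreement scale and the column capacities below are assumptions for the example, not the exact grid reported in the study.

```python
# Hypothetical forced Q-SORT grid: agreement column -> how many of the
# 50 statements a participant must place in that column (quasi-normal shape).
from collections import Counter

GRID = {-4: 3, -3: 4, -2: 6, -1: 7, 0: 10, 1: 7, 2: 6, 3: 4, 4: 3}  # sums to 50

def is_valid_qsort(scores):
    """scores: dict mapping statement id -> agreement score in -4..+4."""
    counts = Counter(scores.values())
    return (len(scores) == sum(GRID.values())
            and all(counts.get(col, 0) == cap for col, cap in GRID.items()))
```
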
  15. Conducting Factor Analysis: participants' Q-SORTs are fed into dimensionality reduction algorithms, which extract the factors/viewpoints.
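
As a rough sketch of the idea behind this step, the Q-SORTs can be correlated person by person and reduced to a few shared factors. The snippet below uses plain principal components with an eigenvalue cutoff as a stand-in; the actual Q-methodology tooling and factor rotation used in the paper are not specified in the slides.

```python
import numpy as np

def extract_factors(qsorts, eigenvalue_cutoff=1.0):
    """qsorts: array of shape (n_participants, n_statements) with agreement scores."""
    corr = np.corrcoef(qsorts)                     # participant-by-participant correlations
    eigenvalues, eigenvectors = np.linalg.eigh(corr)
    order = np.argsort(eigenvalues)[::-1]          # strongest factors first
    keep = eigenvalues[order] > eigenvalue_cutoff  # a common (assumed) retention rule
    loadings = eigenvectors[:, order][:, keep]     # how strongly each participant loads on each factor
    return loadings, eigenvalues[order][keep]
```

Each retained column of `loadings` corresponds to one candidate factor/viewpoint; participants who load strongly on the same column share that viewpoint.
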
  16. Conducting Factor Interpretation:
      • Analyze how the statements are ranked in each factor (a small sketch follows this list)
      • Analyze the distinguishing statements of each factor
      • Analyze the participants' comments on the extremes
      • Write a story about each factor/viewpoint
      • Define a name for each factor/viewpoint
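
A small sketch of the first step above, under the same assumptions as the previous snippet: a factor's idealized statement ranking can be computed as a loading-weighted average of the participants' Q-SORTs, and the statements at its extremes are the ones examined during interpretation.

```python
import numpy as np

def factor_ranking(qsorts, loadings, factor_index):
    """Return statement indices ordered from most agreed to most disagreed for one factor."""
    weights = loadings[:, factor_index]
    idealized = weights @ qsorts / np.abs(weights).sum()  # loading-weighted average Q-SORT
    return np.argsort(idealized)[::-1]
```
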
  18. Viewpoint: Unconvinced/Undecided
      "Further research comparing RRs with SLRs is required before I decide how I feel about rapid reviews."
  19. Viewpoint: Unconvinced/Undecided (continued)
      "I put more confidence in evidence produced in an SLR than in an RR."
      "A well-conducted RR may produce better evidence than a poorly conducted SLR."
  20. Viewpoint: Enthusiastic
      "RRs can be timely and valid, even when methodological concessions are made, and I strongly disagree that RRs are 'quick and dirty' SLRs."
      "A well-conducted RR may produce better evidence than a poorly conducted SLR."
  21. Viewpoint: Enthusiastic (continued)
      "However, I believe that minimum standards to conduct and report RRs are essential."
  22. Viewpoint: Picky
      "RRs that omit an assessment of the quality of included studies are useless to practitioners, because practitioners do not fully understand the implications of streamlining evidence synthesis methods to produce a more timely evidence product."
  23. Viewpoint: Picky (continued)
      "I believe that using no evidence to inform decisions may be better than using RRs."
  24. Viewpoint: Pragmatic
      "Appropriateness of RRs varies with the type of decision being made. The evidence from rapid reviews is good enough to inform low-risk decisions."
  25. Viewpoint: Pragmatic (continued)
      "Transparency of process is more important than the actual methods used to produce an RR, as transparency allows the practitioners to make their own assessment of validity and appropriateness."
  26. Consensuses and Dissensions among the Viewpoints
      • All evidence synthesis products, including rapid reviews and systematic reviews, can be conducted very well or very poorly
      • It is important to have minimum standards for the reporting of rapid reviews
  27. Researchers' concerns related to Rapid Reviews
      • More evidence about RRs
      • Minimum standards for RRs
      • Quality assessment of studies included in RRs
      • Transparency of RR results
  28. Conclusion
      • Four viewpoints constituting a typology
      • Despite the differences, we also identified some consensus
      • With this typology, one can better address the main concerns of researchers and promote a better understanding of RRs
      • As a consequence, we can pave a road that better connects SE research with practice and makes the SE research community more impactful and relevant