
Evidence Briefings: Towards a Medium to Transfer Knowledge from Systematic Reviews to Practitioners

Gustavo Pinto
January 14, 2018


Transcript

  1. Evidence Briefings: Towards a Medium to Transfer Knowledge from Systematic

    Reviews to Practitioners @brunocartaxo @gustavopinto @scbs @soueltonvieira
  2. Evidence-Based Medicine EBM is the conscientious, explicit, judicious and reasonable

    use of modern, best evidence in making decisions about the care of individual patients. It integrates clinical experience and patient values with the best available research. It aims to increase the use of high-quality clinical research in clinical decision making. Acta Informatica Medica'2008
  3. Evidence-Based Medicine One of the greatest achievements of EBM has

    been the development of systematic reviews and meta-analyses, methods by which researchers identify multiple studies on a topic, separate the best ones and then critically analyze them to come up with a summary of the best available evidence. Acta Informatica Medica'2008
  4. Evidence-Based Software Engineering Evidence-Based Software Engineering (EBSE) is a way

    to integrate the best research evidence with practice. Barbara Kitchenham Keele University ICSE'2004
  5. 1. Convert the need for information into answerable questions 2.

    Identify the best evidence with which to answer these questions 3. Appraise the evidence critically: assess its validity and usefulness 4. Implement the results of this appraisal in software engineering practice 5. Evaluate the performance of this implementation The 5 Steps of Evidence-Based Practice
  6. 1. Convert the need for information into answerable questions 2.

    Identify the best evidence with which to answer these questions 3. Appraise the evidence critically: assess its validity and usefulness 4. Implement the results of this appraisal in software engineering practice 5. Evaluate the performance of this implementation The 5 Steps of Evidence-Based Practice } Systematic Literature Reviews (SRs)
  7. How to find the best evidence? X, Y, and Z:

    A Systematic Literature Review
  8. 1. Convert the need for information into answerable questions 2.

    Identify the best evidence with which to answer these questions 3. Appraise the evidence critically: assess its validity and usefulness 4. Implement the results of this appraisal in software engineering practice 5. Evaluate the performance of this implementation The 5 Steps of Evidence-Based Practice } Systematic Literature Reviews (SRs)
  9. It involves reading the right papers and then changing behavior

    in the practice of the discipline. Trisha Greenhalgh Oxford University Evidence-based practice is not only about reading and summarizing papers
  10. Fábio Queda CIn @ UFPE There is a lack of

    connection between systematic reviews and software engineering practice IST'2011
  11. SRs do not provide guidelines for practitioners Most SR

    authors affirmed that their reviews had no direct impact on industrial practice Lack of connection with industry is the 6th top barrier
  12. 1. Convert the need for information into answerable questions 2.

    Identify the best evidence with which to answer these questions 3. Appraise the evidence critically: assess its validity and usefulness 4. Implement the results of this appraisal in software engineering practice 5. Evaluate the performance of this implementation The 5 Steps of Evidence-Based Practice } Systematic Literature Reviews (SRs)
  13. Evidence-Based Medicine (EBM) researchers argue that SRs: Focus on a

    narrow question Are time consuming: they usually require between 6 months and 2 years to complete (1,139 hours, on average) On the other hand, policymakers often require access to contextualized resources that address a broader scope of scientific evidence quickly
  14. Rapid Reviews 56% of the rapid reviews were conducted

    in the last 3 years "Rapid reviews simplify the process of SRs to produce information in a timely manner"
  15. Rapid Reviews Briefings and Summaries 56% of the rapid

    reviews were conducted in the last 3 years "Rapid reviews simplify the process of SRs to produce information in a timely manner" "[Briefings] translate existing SRs into actionable messages in the form of short accessible briefings"
  16. Evidence Briefings: Towards a Medium to Transfer Knowledge from Systematic

    Reviews to Practitioners @brunocartaxo @gustavopinto @scbs @soueltonvieira
  17. "Evidence Briefings" A one-page document, extracted from a systematic

    review, that contains findings useful for practitioners
  18. Approach 1 Systematic Reviews Selection 2 Systematic Review Data

    Extraction 3 Evidence Briefings Generation 4 Evidence Briefings Evaluation
  19. Approach 1 Systematic Reviews Selection 2 Systematic Review Data

    Extraction 3 Evidence Briefings Generation 4 Evidence Briefings Evaluation
  20. 1 Systematic Reviews Selection 120 Systematic Reviews 32 Systematic

    Reviews with guidelines Fábio Queda CIn @ UFPE YAY! citation++
  21. 1 Systematic Reviews Selection 120 Systematic Reviews 32 Systematic

    Reviews with guidelines 24 Systematic Reviews with search strings Fábio Queda CIn @ UFPE YAY! citation++
  22. 1 Systematic Reviews Selection 120 Systematic Reviews 32 Systematic

    Reviews with guidelines 24 Systematic Reviews with search strings 12 Selected SRs Fábio Queda CIn @ UFPE YAY! citation++ Subjects of the 12 selected SRs: Global software development (2), Agile software development (3), Software testing (1), Software requirements (1), Model-based software development (1), Software development productivity (1), Cost and effort estimation (1), Code duplication (1), Software engineering knowledge management (1)
  23. Approach 1 Systematic Reviews Selection 2 Systematic Review Data

    Extraction 3 Evidence Briefings Generation 4 Evidence Briefings Evaluation
  24. 2 Systematic Reviews Data Extraction Paper title: Original: The

    effectiveness of pair programming: A meta-analysis Briefing: The effectiveness of pair programming
  25. 2 Systematic Reviews Data Extraction Paper title: Research goals:

    Original: The effectiveness of pair programming: A meta-analysis Briefing: The effectiveness of pair programming Template: This briefing reports evidence on <GOAL> based on scientific evidence from a systematic review. Briefing: This briefing reports evidence on the effectiveness of pair programming around quality, duration and effort based on scientific evidence from a systematic review.
  26. 2 Systematic Reviews Data Extraction Paper title: Research goals:

    Research findings: Original: The effectiveness of pair programming: A meta-analysis Briefing: The effectiveness of pair programming Template: This briefing reports evidence on <GOAL> based on scientific evidence from a systematic review. Briefing: This briefing reports evidence on the effectiveness of pair programming around quality, duration and effort based on scientific evidence from a systematic review. Finding 1: Pairing up of individuals seems to elevate the junior pairs up to near senior pair performance. Finding 2: If you do not know the seniority or skill levels of your programmers, but do have a feeling for task complexity, then employ pair programming either when task complexity is low and time is of the essence, or when task complexity is high and correctness is important.
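The goal sentence shown on these slides is plain template instantiation: the <GOAL> slot is filled with the goal extracted from the review. A minimal sketch, assuming a hypothetical helper (the template sentence is from the slide; the function and field names are my own, not the authors' tooling):

```python
from string import Template

# Illustrative sketch of the goal-sentence step of briefing generation.
# The wording is the slide's template; "goal" is an assumed field name.
GOAL_TEMPLATE = Template(
    "This briefing reports evidence on $goal "
    "based on scientific evidence from a systematic review."
)

def briefing_goal(goal: str) -> str:
    """Fill the <GOAL> slot of the briefing template."""
    return GOAL_TEMPLATE.substitute(goal=goal)

sentence = briefing_goal(
    "the effectiveness of pair programming around quality, duration and effort"
)
print(sentence)
```

The same slot-filling idea extends to the findings and reference sections of the one-page document.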
  27. Approach 1 Systematic Reviews Selection 2 Systematic Review Data

    Extraction 3 Evidence Briefings Generation 4 Evidence Briefings Evaluation
  28. 3 Evidence Briefings Generation: Principles Similarity: elements that are

    similar are more likely to be organized together Proximity: closer elements are more likely to be perceived as a group Continuation: elements will be grouped as a whole if they are co-linear Unity: elements that have a visual connection should belong to a uniform group
  29. 3 Evidence Briefings Generation 1. The title of the

    briefing 2. The goal of the briefing 3. The findings extracted from the original review 4. An informative box with general information 5. The reference to the original review 6. The logos of our research group and university
  30. 3 Evidence Briefings Generation 1. The title of the

    briefing 2. The goal of the briefing 3. The findings extracted from the original review 4. An informative box with general information 5. The reference to the original review 6. The logos of our research group and university The template of Evidence Briefings is licensed under the CC-BY license!
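The six elements listed on this slide can be modeled as a small record type. A sketch, assuming hypothetical field names (the slide describes a page layout, not a data model):

```python
from dataclasses import dataclass, field

@dataclass
class EvidenceBriefing:
    """The six elements of an Evidence Briefing, per the slide.

    Field names are illustrative assumptions, not the authors' code.
    """
    title: str                                          # 1. title of the briefing
    goal: str                                           # 2. goal of the briefing
    findings: list[str] = field(default_factory=list)   # 3. findings from the review
    info_box: str = ""                                  # 4. informative box
    reference: str = ""                                 # 5. reference to the original SR
    logos: list[str] = field(default_factory=list)      # 6. group/university logos

b = EvidenceBriefing(
    title="The effectiveness of pair programming",
    goal="the effectiveness of pair programming around quality, duration and effort",
)
print(b.title)
```

A record like this would feed the one-page layout governed by the similarity/proximity/continuation/unity principles of the previous slide.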
  31. Approach 1 Systematic Reviews Selection 2 Systematic Review Data

    Extraction 3 Evidence Briefings Generation 4 Evidence Briefings Evaluation
  32. RQ: How do practitioners and researchers perceive the content and format

    of Evidence Briefings? 4 Evidence Briefings Evaluation
  33. 4 Evidence Briefings Evaluation: Practitioners 1 Queried StackExchange websites

    (programmers.stackexchange.com, sqa.stackexchange.com, pm.stackexchange.com, reverseengineering.stackexchange.com, softwarerecs.stackexchange.com) with the SRs' search strings: 1,738 related questions 2 Removed false-positive questions: 473 related questions 3 Classified questions using open card sort (Kappa: 0.72)
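The Kappa of 0.72 reported for the card sort is an inter-rater agreement statistic. Assuming Cohen's kappa for two raters, it can be computed as below (the labels are made-up illustration, not the study's classification data):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters labeling the same items."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement if the two raters labeled independently.
    count_a, count_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(count_a[c] * count_b[c] for c in count_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical card-sort labels from two raters.
rater1 = ["related", "related", "unrelated", "related", "unrelated", "related"]
rater2 = ["related", "unrelated", "unrelated", "related", "unrelated", "related"]
print(round(cohens_kappa(rater1, rater2), 2))
```

Values above roughly 0.6 are conventionally read as substantial agreement, which is why 0.72 supports the classification step.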
  34. 4 473 StackExchange users who asked questions related to

    the SRs Evidence Briefings Evaluation: Practitioners
  35. 4 473 StackExchange users who asked questions related to

    the SRs; only 146 of them had a public profile (LinkedIn, GitHub, Twitter, etc.) Evidence Briefings Evaluation: Practitioners
  36. 4 The 22 authors of the 12 Systematic Reviews

    Evidence Briefings Evaluation: Researchers
  37. 4 Evidence Briefings Evaluation: Survey Survey principles: Reciprocity (e.g.,

    we raffled a 100 USD Amazon gift card) Authority & Credibility (e.g., Ph.D., university professors) Liking (e.g., personalized emails) Scarcity (e.g., we defined a deadline) Brevity (e.g., we asked closed questions as much as possible) Social Benefit (e.g., 1 USD for the Brazilian Red Cross)
  38. Evidence Briefings Evaluation: Survey 4 Survey principles: Reciprocity (e.g.,

    we raffled a 100 USD Amazon gift card) Authority & Credibility (e.g., Ph.D., university professors) Liking (e.g., personalized emails) Scarcity (e.g., we defined a deadline) Brevity (e.g., we asked closed questions as much as possible) Social Benefit (e.g., 1 USD for the Brazilian Red Cross)
  39. 4 Evidence Briefings Evaluation: Survey Survey with Practitioners

    (17 questions, 2 open-ended): 32 respondents (22% response rate) Survey with Researchers (8 questions, 2 open-ended): 7 respondents (31% response rate)
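A quick arithmetic check on the response rates, assuming the denominators are the 146 contacted practitioners with public profiles and the 22 authors of the selected SRs (7/22 is ≈31.8%, which the slide reports as 31%):

```python
# Sanity check of the reported survey response rates.
# Denominators are assumptions taken from the earlier slides:
# 146 practitioners contacted, 22 SR authors contacted.
practitioner_rate = 32 / 146
researcher_rate = 7 / 22
print(f"practitioners: {practitioner_rate:.1%}, researchers: {researcher_rate:.1%}")
# → practitioners: 21.9%, researchers: 31.8%
```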
  40. Survey with Practitioners: Demographics Q1. What is your current

    position? Q2. How many years of experience do you have in your current position?
  41. Survey with Practitioners: Demographics Q1. What is your current

    position? Q2. How many years of experience do you have in your current position?
  42. Survey with Practitioners: Demographics Q3. Where do you work?

    Q4. What is your level of educational attainment?
  43. Survey with Practitioners: Demographics Q3. Where do you work?

    Q4. What is your level of educational attainment?
  44. Survey with Practitioners: Acquiring Knowledge Q5. How often do

    you use StackExchange websites? Q6. How often do you read software engineering research papers?
  45. Q7. Have you ever read a systematic review

    paper? Q8. For what reason did you read a systematic review paper? Survey with Practitioners: Acquiring Knowledge
  46. Q10. To what degree do you think the information

    available in the briefing we sent to you can answer your question on StackExchange? Survey with Practitioners: Briefings' Content
  47. Q10. To what degree do you think the information

    available in the briefing we sent to you can answer your question on StackExchange? Q11. Why? Survey with Practitioners: Briefings' Content "The question is too specific" "The question expected more than one answer" "The question touched a slightly different issue" "The briefing lacks details"
  48. Survey with Practitioners: Briefings' Content Q12. Regardless of whether

    the briefing answers your question, how important do you think the research presented in the briefing is?
  49. Q12. Regardless of whether the briefing answers your question,

    how important do you think the research presented in the briefing is? Survey with Practitioners: Briefings' Content Q13. Why? "Agile is not a one-size-fits-all methodology. To make it work you need to see what works for you and your team. [...] Making bold high-level statistical statements about Agile software development will only hurt it, whereas it can shine in truly Agile organizations."
  50. Q14. How do you compare the answers from the

    StackExchange community to the findings presented in the briefing? Survey with Practitioners: Briefings' Content
  51. Survey with Practitioners: Briefings' Format Q15. How easy was

    it to find the information in the briefing? Q16. Is the briefing interface clear and understandable? Q17. Does the briefing look reliable?
  52. Q15. How easy was it to find the information in

    the briefing? Q16. Is the briefing interface clear and understandable? Q17. Does the briefing look reliable? Survey with Practitioners: Briefings' Format
  53. Survey with Researchers: Sharing Knowledge Q1. How important

    is it for you to share research results with practitioners?
  54. Survey with Researchers: Sharing Knowledge Q1. How important

    is it for you to share research results with practitioners?
  55. Survey with Researchers: Sharing Knowledge Q2. How often do

    you share research results with practitioners?
  56. Survey with Researchers: Sharing Knowledge Q2. How often do

    you share research results with practitioners?
  57. Survey with Researchers: Sharing Knowledge Q2. How often do

    you share research results with practitioners? Q3. How do you do that? "Teaching" "Seminars" "Writing" "Advisory work" "Social Networks"
  58. Survey with Researchers: Briefings' Content Q4. How well does the

    briefing that we sent to you cover the main findings of your paper?
  59. Survey with Researchers: Briefings' Content Q4. How well does the

    briefing that we sent to you cover the main findings of your paper? Q5. Why? YAY! :-)
  60. Survey with Researchers: Briefings' Format Q6. How easy was

    it to find the information in the briefing? Q7. Is the briefing interface clear and understandable? Q8. Does the briefing look reliable?
  61. Revisiting Findings Practitioners rarely use research papers as a medium

    to acquire knowledge. Software engineering practice still has many beliefs with no evidence basis.
  62. Revisiting Findings Both researchers and practitioners evaluated the

    evidence briefings positively. The briefings covered the main findings of the original systematic reviews well.
  63. The Yin-Yang of Research and Practice Researchers want to

    transfer knowledge. But not all of them do so. Practitioners want to be more aware of software engineering research. But few of them are.
  64. Are your search strings well-designed? "quality + model", "quality

    + model driven" and "model driven + experience" (software AND ((cost OR effort OR productivity) WITH (factors OR indicators OR drivers OR measure)))
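A search string like the last one is a boolean filter over titles and abstracts. A rough sketch of how it screens a title (a simplification: digital libraries treat WITH as a proximity operator, which is reduced to a plain AND here):

```python
import re

def matches(title: str) -> bool:
    """Approximate the slide's search string against a paper title.

    Simplified sketch: WITH (a proximity operator in some digital
    libraries) is treated as AND; matching is on whole lowercase words.
    """
    words = set(re.findall(r"[a-z]+", title.lower()))
    return (
        "software" in words
        and bool(words & {"cost", "effort", "productivity"})
        and bool(words & {"factors", "indicators", "drivers", "measure"})
    )

print(matches("A systematic review of software cost estimation factors"))
print(matches("Agile practices in the wild"))
```

A well-designed string balances recall (synonyms in each OR group) against precision (the AND constraints), which is the point the slide is making.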
  65. Evidence Briefings: Towards a Medium to Transfer Knowledge from Systematic

    Reviews to Practitioners @brunocartaxo @gustavopinto @scbs @soueltonvieira