A minimum standard for publishing computational results

This paper was presented at the AMOS National Conference, Brisbane Convention and Exhibition Centre, Brisbane, Queensland.

A more detailed account of the work can be found at Authorea: https://www.authorea.com/users/5641/articles/15874/_show_article


Damien Irving

July 17, 2015


  1. Damien Irving, AMOS Conference, Brisbane 2015
     A minimum standard for publishing computational results in the weather and climate sciences
  2. Overview
     1.  The reproducibility crisis
     2.  A reproducible paper
     3.  A new minimum standard
  3. The reproducibility crisis
     •  Our field has rapidly transitioned to a computational science
     •  Conventions around communicating our methods have hardly changed
        –  Have you ever seen a paper provide code and software details?
     •  It's impossible to replicate the results presented in journal papers today
  4. The crisis response
     •  Funding agencies + journals1
        –  Some progress on dataset disclosure
           •  Most weather/climate journals have policies
           •  Not consistently enforced
        –  Weak or non-existent code requirements
     •  It's not their fault
        –  No examples to base new standards on
        –  I set about addressing this deficiency…
     1. Stodden et al. (2013). PLoS ONE, 8, e67111
  5. A reproducible paper - rationale
     •  The procedure needs to:
        –  Minimise the time involved1
        –  Minimise the complexity of the required tools
        –  Be consistent with computational best practices
     1. Stodden (2010). doi:10.2139/ssrn.1550193
  6. A reproducible paper - components
     •  Irving D, Simmonds I (in press). A novel approach to diagnosing Southern Hemisphere planetary wave activity and its influence on regional climate variability. Journal of Climate. doi:10.1175/JCLI-D-15-0287.1
        –  Preprint: https://www.authorea.com/users/5641/articles/12197/_show_article
        –  Includes a brief computation section…
  7. Computation section
     •  Cites key software packages
     •  Points to supplementary material
        –  Software description, code repository, log files
        –  Hosted at GitHub & Figshare: http://dx.doi.org/10.6084/m9.figshare.1385387
  8. Software description
     •  Name, version number, release date, institution and DOI or URL
  9. Code repository
     •  Consistent with computational best practice1
        –  Write scripts
        –  Modularise, don't copy/paste -> build a code library
        –  Use version control
     •  Your everyday repository is fine: https://github.com/DamienIrving/climate-analysis
     1. Wilson et al. (2014). PLoS Biol, 12, e1001745
  10. Log files
      •  Follow the NCO / CDO approach…
         –  Timestamps can be generated with any language
         –  Features:
            •  Simple
            •  Readable and writeable by anyone
            •  Easy to regenerate (no manual editing)
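The NCO/CDO approach above can be sketched in a few lines of Python. This is a minimal illustration, not code from the talk: it mimics the `history` attribute that NCO and CDO write, by prepending a timestamp plus the exact command that was run to a plain-text log file (the file name passed to `write_log_entry` is the caller's choice).

```python
"""Minimal sketch of an NCO/CDO-style log file.

Each entry records a timestamp and the exact command that was run,
newest entry first, so the full processing history of a data file
can be regenerated without any manual editing.
"""
import sys
from datetime import datetime


def write_log_entry(logfile):
    """Prepend a timestamped record of the current command to logfile."""
    timestamp = datetime.now().strftime('%a %b %d %H:%M:%S %Y')
    entry = '{}: {}'.format(timestamp, ' '.join(sys.argv))
    try:
        with open(logfile) as f:
            previous = f.read()
    except FileNotFoundError:
        previous = ''
    with open(logfile, 'w') as f:
        f.write(entry + '\n' + previous)


if __name__ == '__main__':
    write_log_entry('my_analysis.log')
```

Because the log is plain text, it can be read and written by anyone, regardless of which language produced it.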
  11. A new minimum standard
      •  Authors must include a brief computation section which cites key software and points to supplementary materials:
         –  Software description
         –  Code repository (public, version controlled)
         –  Log files
      •  Authors are not obliged to provide assistance
      •  Reviewers only need to check availability
  12. Aim higher!
      •  The minimum standard is reproducible, but not very comprehensible
         –  Encouraging to see initiatives like the CWSLab workflow tool: http://cwslab.nci.org.au/
  13. Conclusion
      •  There is a reproducibility crisis in weather/climate/ocean research
      •  It can be solved by adding a brief computation section to papers which points to supplementary materials:
         –  Software description
         –  Code repository (public, version controlled)
         –  Log files
      •  Journals could adopt this framework as a formal minimum standard
  14. Look out for the BAMS essay! https://www.authorea.com/users/5641/articles/15874/_show_article