
Benchmarking your Scientific Python Packages Using ASV and GitHub Actions


Anissa Zacharias

May 02, 2024



  1. Preview: What We're Covering and Why

    We wanted to implement benchmarking that:
    ‣ we could develop alongside our package code
    ‣ wouldn't clutter up our package repo with results
    ‣ would show us how our benchmark runs changed over time

    Here we'll cover:
    ‣ Airspeed Velocity (ASV), the package we used for benchmarking
    ‣ how to (or at least how we) set up benchmarking and a benchmark archive
    ‣ some details about our GitHub Actions configuration
    ‣ discussion
  2. Airspeed Velocity: Background

    • benchmarking over a package's lifetime
    • can track runtime, memory consumption, and custom values
    • highly customizable
    • a fair amount of initial setup work

    (Shown: numpy's old ASV setup.)
    https://github.com/airspeed-velocity/asv
    https://asv.readthedocs.io
  3. Airspeed Velocity: Writing Benchmarks

    • Benchmarks go in a folder in the benchmarked repo (location configured in asv.conf.json); benchmarks can be organized into separate files.

    Types of benchmarks:
    1. Timing - time_*()
    2. Memory - mem_*()
    3. Peak memory - peakmem_*()
    4. Raw timing - timeraw_*()
    5. Tracking - track_*()

    Example (benchmarks/import.py):

        class Import:
            """Benchmark importing geocat-comp."""

            def timeraw_import_geocat_comp(self):
                return "import geocat.comp"

    You can set individual benchmark attributes like timeout, setup, teardown, repeat counts, and more. Benchmarks can have version numbers if you decide to alter a benchmark.
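To make those naming conventions concrete, here is a hedged sketch of what a benchmark file might look like. The class, method names, and workload are all hypothetical stand-ins for real geocat-comp operations; ASV discovers methods by their prefixes and runs `setup()` before each one.

```python
# Hypothetical benchmarks/arrays.py: a sketch of ASV's conventions,
# not code from the geocat-comp suite.

class ArraySuite:
    """ASV discovers time_*/mem_*/track_* methods on classes like this."""

    # Optional per-benchmark attributes (applied to every method here):
    timeout = 60   # seconds before a run is aborted
    repeat = 3     # how many times each timing is repeated
    version = 1    # bump if you change what the benchmark measures

    def setup(self):
        # setup() runs before each benchmark and is excluded from timing
        self.data = list(range(10_000))

    def time_sum(self):
        # Timing benchmark: ASV reports how long this body takes
        sum(self.data)

    def mem_copy(self):
        # Memory benchmark: ASV reports the size of the returned object
        return list(self.data)

    def track_length(self):
        # Tracking benchmark: ASV plots the returned number over time
        return len(self.data)
```

Running `asv run` against a repo containing this file would time `time_sum`, size the return value of `mem_copy`, and chart `track_length` across commits.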
  4. Airspeed Velocity: Important Commands

    $ asv quickstart
        Generate config files and set up a new benchmarking suite
    $ asv run main..mybranch
        Run all commits on a branch since branching off main
    $ asv run v0.1^!
        Benchmark a single commit or tag
    $ asv run <range> --skip-existing
        Skip running benchmarks that have existing results
    $ asv publish
        Generate HTML from benchmark results
    $ asv preview
        Preview the results locally

    https://asv.readthedocs.io/en/stable/commands.html
  5. Structure: Overall

    (Diagram.) Two repositories work together:

    Python package repository
    ‣ package code
    ‣ benchmarks directory: benchmark Python files and a configuration file
    ‣ on push to main, a GitHub Action runs the benchmarks and pushes results to the archive repo

    Benchmark archive repository
    ‣ configuration file
    ‣ archived results (benchmarks.json plus result files)
    ‣ on push to main, a GitHub Action makes HTML from the archived results and deploys the static site to a GitHub Pages website
  6. Structure: Overall (annotated)

    (Same diagram, annotated:)
    ‣ (1) the package repository holds the original code that we wanted to benchmark
    ‣ (2, 2.5) after setup, everything from the archive repository onward happens automatically
    ‣ the GitHub Action deploys to a hosted website: https://ncar.github.io/geocat-comp-asv/
  7. Structure: The Benchmarked Package Repo (1)

    To add ASV to an existing project:
    • make a benchmarks directory
    • add benchmarks
    • add a configuration file
    • add GitHub Actions for automation

    (Diagram: Python package repository — package code; benchmarks directory with benchmark Python files and a configuration file; on push to main, benchmarks run and results are pushed to the archive repo.)
  8. Structure: The Benchmark Archive Repo (2, 2.5)

    • a separate benchmark archive repo avoids cluttering up the main package repository with results

    To set up the archive repo:
    • add a top-level config file
    • add a GitHub Action to make static HTML and deploy GitHub Pages

    (Diagram: benchmark archive repository — configuration file; archived results (benchmarks.json plus result files); on push to main, HTML is made from the archived results and deployed to a GitHub Pages website.)
  9. Airspeed Velocity: Setting Up Config Files

    Package repo (for running the benchmarks):

        {
            "version": 1,
            "project": "geocat-comp",
            "project_url": "https://geocat-comp.readthedocs.io",
            "repo": "..",
            "branches": ["main"], // for git
            "environment_type": "conda",
            "install_timeout": 600,
            "show_commit_url": "http://github.com/NCAR/geocat-comp/commit/",
            "pythons": ["3.10"],
            "conda_channels": ["conda-forge"],
            "conda_environment_file": "../build_envs/asv-bench.yml",
            "matrix": {
                "python": [""],
            },
            "benchmark_dir": ".",
        }

    Archive repo (for generating the publish/preview HTML hosted on GitHub Pages):

        {
            "version": 1,
            "project": "geocat-comp",
            "project_url": "https://github.com/NCAR/geocat-comp",
            "repo": "https://github.com/NCAR/geocat-comp",
            "branches": ["main"],
            "dvcs": "git",
            "environment_type": "conda",
            "benchmark_dir": "benchmarks"
        }
  10. GitHub Actions: Automating Benchmarks

    The workflow that runs the benchmarks and pushes results to the archive repo lives in the package repository:
    https://github.com/NCAR/geocat-comp/blob/main/.github/workflows/asv-benchmarking.yml
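The real workflow is at the URL above; the following is only a hedged sketch of what such a workflow might contain. The step names, the credentials handling, and the archive-repo layout here are assumptions, not the actual NCAR configuration.

```yaml
# Hypothetical sketch of an ASV benchmarking workflow (not the real
# asv-benchmarking.yml linked above).
name: ASV Benchmarking
on:
  push:
    branches: [main]

jobs:
  benchmark:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
        with:
          fetch-depth: 0            # ASV needs full git history
      - uses: conda-incubator/setup-miniconda@v3
        with:
          environment-file: build_envs/asv-bench.yml
      - name: Run benchmarks
        shell: bash -l {0}
        run: |
          cd benchmarks
          asv machine --yes
          asv run HEAD^! --skip-existing
      - name: Push results to archive repo
        shell: bash -l {0}
        run: |
          # Assumes credentials for the archive repo are available,
          # e.g. via a deploy key or token configured in secrets.
          git clone https://github.com/NCAR/geocat-comp-asv archive
          cp -r benchmarks/results/* archive/results/
          cd archive
          git add . && git commit -m "Add benchmark results" && git push
```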
  11. GitHub Actions: Automating Dashboard Creation

    The workflow that builds the HTML dashboard and deploys it to GitHub Pages lives in the archive repository:
    https://github.com/NCAR/geocat-comp-asv/blob/main/.github/workflows/make-publish.yml
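Again, the real file is at the URL above; this is a hedged sketch of one way to publish an ASV dashboard to GitHub Pages. The actions chosen, the Python version, and the `html` output path (ASV's default for `asv publish`) are assumptions.

```yaml
# Hypothetical sketch of a dashboard-publishing workflow (not the real
# make-publish.yml linked above).
name: Make and Publish Dashboard
on:
  push:
    branches: [main]

permissions:
  contents: read
  pages: write
  id-token: write

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.10"
      - name: Build static HTML from archived results
        run: |
          pip install asv
          asv publish          # reads archived results, writes html/
      - uses: actions/upload-pages-artifact@v3
        with:
          path: html
      - uses: actions/deploy-pages@v4
```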