
Devops Behavioral Economics: Displayed behavior as compared to reported behavior

What can you learn about devops and software delivery practices by looking at data from a platform with more than 300,000 developers, 25,000 organizations, and 25+ million builds per month? Having recently joined CircleCI, I was interested in this new data set from a large SaaS developer platform and the kinds of questions it could answer: What trends and patterns pop out of the data? Do they differ from what surveys show, where respondents opt in to participating, as compared to behavior aggregated through platform usage?

I wanted to apply a behavioral economics methodology to this data set and compare reported behaviors with actual behaviors across a large sample. In this talk, I’ll share a view into anonymized team data from millions of builds: insights, behaviors, and metrics that help teams build better software faster, along with characteristics of success that we can measure through data. Finally, I’ll cover what we can infer from team behavior, call out a few tools, start a language war, and provide takeaways you can use to benchmark your own software delivery teams.

Michael Stahnke

August 27, 2019

Transcript

  1. @stahnma Hard Hitting Stats • The average length of a repository name is 17.6 characters • The most common project name is “api”, followed by “terraform”, “infrastructure”, and “web” • After environment names (e.g., dev, test, stage) the most common branch name is “update-readme” • 0.01% of workflows have English curse words in the branch name
  2. @stahnma Mapping Metrics • MTTR = time from red to green • Delivery lead time = time of workflow? • Deployment frequency = how often do you kick off workflows? • Change failure rate = workflow failure rate?
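To make the mapping concrete, here is a minimal sketch of computing two of these proxies from build records; the record layout ("status", "finished_at") is an assumption for illustration, not CircleCI’s actual schema.

    # Minimal sketch, assuming each run is a dict with "status"
    # ("success" or "failed") and a datetime "finished_at".
    def mttr(runs):
        """Median time from going red (a failed run) to the next green one."""
        recoveries, red_since = [], None
        for run in sorted(runs, key=lambda r: r["finished_at"]):
            if run["status"] == "failed" and red_since is None:
                red_since = run["finished_at"]        # project just went red
            elif run["status"] == "success" and red_since is not None:
                recoveries.append(run["finished_at"] - red_since)  # back to green
                red_since = None
        recoveries.sort()
        return recoveries[len(recoveries) // 2] if recoveries else None

    def change_failure_rate(runs):
        """Mapped here to the plain workflow failure rate."""
        return sum(r["status"] == "failed" for r in runs) / len(runs) if runs else 0.0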
  3. @stahnma Lead Time Some people would cry for that 50p length; the question is whether they would be tears of sorrow or joy

     Min     50p     95p      Max
     2.1s    3m27s   28m45s   3.3 days
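Reproducing that distribution is straightforward once workflow durations are in hand; a pandas sketch, where the file name and column names are assumptions:

    import pandas as pd

    # Hypothetical anonymized export; "started_at"/"finished_at" are
    # assumed column names, not CircleCI's actual schema.
    runs = pd.read_csv("workflow_runs.csv", parse_dates=["started_at", "finished_at"])
    duration = runs["finished_at"] - runs["started_at"]
    print(duration.min(), duration.quantile(0.50), duration.quantile(0.95), duration.max())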
  4. @stahnma Deployment Frequency [histogram: number of projects building this many times per day vs. number of runs per day per project]
  5. @stahnma Deployment Frequency

                                                          50p   95p   99p    Max
     workflows per day per project                          3    29    74   2488
     workflows per org per day                              6    85   250   3036
     workflows per project on master branch per day         1    13    39   2443
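Reading deployment frequency as workflow kickoffs, the per-project-per-day row above could be rebuilt roughly like this (a pandas sketch; the file name and column names are assumptions):

    import pandas as pd

    # Hypothetical anonymized export; "project_id" and "started_at" are
    # assumed column names.
    runs = pd.read_csv("workflow_runs.csv", parse_dates=["started_at"])

    # Workflow kickoffs per project per day, then the distribution
    # across all project-days.
    per_day = runs.groupby(["project_id", runs["started_at"].dt.date]).size()
    print(per_day.quantile([0.50, 0.95, 0.99]), per_day.max())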
  6. @stahnma Manual Approval Jobs • 50p of approvals happen in 3.53 minutes • 3% for workflows with approval jobs • 20% for all workflows
  7. @stahnma Manual Approval Jobs • 50p length (without an approval): 3m27s • 50p length with approvals: 14m38s • Some workflows have more than one approval job • 3.5% of jobs use manual approvals
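For reference, an approval job in CircleCI is a workflow job declared with type: approval; the workflow pauses there until someone clicks Approve, which is presumably where that extra median wait comes from. A minimal config sketch (job and workflow names are illustrative):

    version: 2.1
    jobs:
      build:
        docker:
          - image: cimg/base:stable
        steps:
          - checkout
          - run: echo "build and test"
      deploy:
        docker:
          - image: cimg/base:stable
        steps:
          - run: echo "deploy"
    workflows:
      build-hold-deploy:
        jobs:
          - build
          - hold:                # pauses until a human approves in the UI
              type: approval
              requires: [build]
          - deploy:
              requires: [hold]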
  8. @stahnma

     Language      Fail rate on master   Median branches   Median users
     PHP                   7.0%                 5                2
     JavaScript            9.5%                 5                2
     Python               10.6%                 5                2
     Ruby                 10.9%                 9                3
     Go                   12.4%                 4                2
     Java                 13.1%                 4                2