Engineering Metrics Benchmarks

Set team improvement goals using industry-standard benchmarks. Each metric below is benchmarked across four tiers: Elite, Strong, Fair, and Needs Focus.

Cycle Time
Cycle Time measures the time it takes for a single engineering task to go through the different phases of the delivery process, from 'code' to 'production'.
Elite: < 48 hours | Strong: 48 - 118 hours | Fair: 118 - 209 hours | Needs Focus: 210+ hours
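
Concretely, Cycle Time spans from a branch's first commit to its release, and it decomposes into the four phase metrics defined below (Coding, Pickup, Review, and Deploy Time). The sketch below shows one way to compute those deltas from pull request timestamps; the field names are hypothetical and would map onto whatever your git provider or metrics tool exposes.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class PullRequest:
    # Hypothetical event timestamps; real field names depend on your git provider.
    first_commit_at: datetime   # first commit on the branch
    opened_at: datetime         # pull request created
    first_review_at: datetime   # first review activity
    merged_at: datetime         # pull request merged
    released_at: datetime       # merged code deployed to production


def phase_durations(pr: PullRequest) -> dict[str, timedelta]:
    """Split a pull request's lifetime into the four Cycle Time phases."""
    return {
        "coding_time": pr.opened_at - pr.first_commit_at,
        "pickup_time": pr.first_review_at - pr.opened_at,
        "review_time": pr.merged_at - pr.first_review_at,
        "deploy_time": pr.released_at - pr.merged_at,
    }


def cycle_time(pr: PullRequest) -> timedelta:
    """End-to-end Cycle Time: first commit to production release."""
    return pr.released_at - pr.first_commit_at
```

Under this breakdown, an Elite team keeps cycle_time(pr) under 48 hours.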

Coding Time
Coding Time measures the time from the first commit until a pull request is issued. Short Coding Time correlates with low WIP, small PR size, and clear requirements.
Elite: < 12 hours | Strong: 12 - 24 hours | Fair: 24 - 38 hours | Needs Focus: 39+ hours

Pickup Time
Pickup Time measures the time a pull request waits for someone to start reviewing it. Low Pickup Time represents strong teamwork and a healthy review process.
Elite: < 7 hours | Strong: 7 - 12 hours | Fair: 12 - 18 hours | Needs Focus: 19+ hours

Review Time
Review Time measures the time it takes to complete a code review and get a pull request merged. Low Review Time represents strong teamwork and a healthy review process.
Elite: < 6 hours | Strong: 6 - 13 hours | Fair: 13 - 28 hours | Needs Focus: 29+ hours

Deploy Time
Deploy Time measures the time from when a branch is merged to when the code is released. Low Deploy Time correlates with high deployment frequency.
Elite: < 4 hours | Strong: 4 - 48 hours | Fair: 2 - 7 days | Needs Focus: 8+ days

Deploy Frequency
Deploy Frequency measures how often code is released. Elite Deploy Frequency represents a stable and healthy continuous delivery pipeline.
Elite: Daily+ | Strong: > 1 / week | Fair: 1 / week | Needs Focus: < 1 / week
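
As a rough illustration, Deploy Frequency can be bucketed into the tiers above by counting releases over an observation window. The 5-per-week cut-off for "Daily+" is an assumption (roughly one deploy per working day), not part of the benchmark data.

```python
from datetime import datetime


def deploy_frequency_tier(deploy_times: list[datetime], weeks: float) -> str:
    """Classify how often code was released over an observation window of `weeks` weeks."""
    per_week = len(deploy_times) / weeks
    if per_week >= 5:   # assumption: ~one deploy per working day counts as "Daily+"
        return "Elite (Daily+)"
    if per_week > 1:
        return "Strong (> 1 / week)"
    if per_week >= 1:
        return "Fair (1 / week)"
    return "Needs Focus (< 1 / week)"
```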

Pull Request Size
Pull Request Size measures the number of code lines modified in a pull request. Smaller pull requests are easier to review, safer to merge, and correlate with lower Cycle Time.
Elite: < 225 code changes | Strong: 225 - 400 code changes | Fair: 400 - 800 code changes | Needs Focus: 800+ code changes
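
For pull requests hosted on GitHub, for example, the REST API's pull request object includes `additions` and `deletions` counts, so size can be computed as their sum. A minimal sketch (the repository and token arguments are placeholders):

```python
import requests


def pr_size(owner: str, repo: str, number: int, token: str) -> int:
    """Total changed lines (additions + deletions) for a single pull request."""
    resp = requests.get(
        f"https://api.github.com/repos/{owner}/{repo}/pulls/{number}",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": "application/vnd.github+json",
        },
        timeout=10,
    )
    resp.raise_for_status()
    data = resp.json()
    return data["additions"] + data["deletions"]
```

The result can then be compared against the 225 / 400 / 800 line thresholds above.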

Rework Rate
Rework Rate measures the amount of changes made to code that is less than 21 days old. A high Rework Rate signals code churn and is a leading indicator of quality issues.
Elite: < 8% | Strong: 8% - 11% | Fair: 11% - 14% | Needs Focus: 15%+
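
One way to approximate Rework Rate is to look at every line a team modified in a period and check, via the version history (for example, a blame of the parent revision), when that line was last touched; lines younger than 21 days count as rework. The helper below is an illustrative sketch that assumes those timestamps have already been extracted:

```python
from datetime import datetime, timedelta

REWORK_WINDOW = timedelta(days=21)  # code younger than this counts as rework when changed


def rework_rate(changed_lines: list[tuple[datetime, datetime | None]]) -> float:
    """Fraction (0.0-1.0) of changed lines that rewrite code less than 21 days old.

    Each tuple is (time_of_this_change, time_the_line_was_previously_touched);
    the second element is None for brand-new lines, which do not count as rework.
    """
    if not changed_lines:
        return 0.0
    reworked = sum(
        1
        for changed_at, last_touched_at in changed_lines
        if last_touched_at is not None and changed_at - last_touched_at < REWORK_WINDOW
    )
    return reworked / len(changed_lines)
```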

Planning Accuracy
Planning Accuracy measures the ratio of planned work to what is actually delivered during a sprint or iteration. High Planning Accuracy signals a high level of predictability and stable execution.
Elite: > 80% | Strong: 65 - 79% | Fair: 40 - 64% | Needs Focus: < 40%
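
Planning Accuracy reduces to a simple ratio: of the issues committed to at sprint start, what share shipped by sprint end. A minimal sketch, with the tier cut-offs taken from the table above (the intersection-based definition is one reasonable interpretation, not necessarily how any specific tool computes it):

```python
def planning_accuracy(planned: set[str], delivered: set[str]) -> float:
    """Percentage of planned issues that were delivered within the same sprint."""
    if not planned:
        return 100.0
    return 100.0 * len(planned & delivered) / len(planned)


def planning_accuracy_tier(pct: float) -> str:
    """Map a Planning Accuracy percentage onto the benchmark tiers above."""
    if pct > 80:
        return "Elite"
    if pct >= 65:
        return "Strong"
    if pct >= 40:
        return "Fair"
    return "Needs Focus"
```

For example, delivering 3 of 4 planned issues yields 75%, which lands in the Strong tier.
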
The Engineering Metrics Benchmarks Study

The Engineering Metrics Benchmarks were created from a study of 1,971 dev teams and 847k branches. For the first time since DORA published their research in 2014, engineering teams are able to benchmark their performance against data-backed industry standards. Continue reading to learn more about our data collection and metric calculations.

Compare your team metrics to industry standards in < 5 mins

Blog: Engineering Metrics Benchmarks. Read more about how we calculated each metric.

Product: Engineering Metrics with LinearB. See why engineering teams love using LinearB to improve.

Join our community of data-driven dev leaders.