Engineering Metrics Benchmarks

Set team improvement goals using industry-standard benchmarks.
Cycle Time
Cycle time measures the time it takes for a single engineering task to move through the phases of the delivery process, from 'code' to 'production'.
Elite: < 42 hours | Strong: 42-95 hours | Fair: 95-188 hours | Needs Focus: 188+ hours

Coding Time
Coding time measures the time from the first commit until a pull request is opened. Short coding time correlates with low WIP, small PR size, and clear requirements.
Elite: < 0.5 hours | Strong: 0.5-1 hour | Fair: 1-4.5 hours | Needs Focus: 4.5+ hours

Pickup Time
Pickup time measures how long a pull request waits for someone to start reviewing it. Low pickup time reflects strong teamwork and a healthy review process.
Elite: < 1 hour | Strong: 1-3 hours | Fair: 3-14 hours | Needs Focus: 14+ hours

Review Time
Review time measures the time it takes to complete a code review and get a pull request merged. Low review time reflects strong teamwork and a healthy review process.
Elite: < 1 hour | Strong: 1-5 hours | Fair: 5-21 hours | Needs Focus: 21+ hours

Deploy Time
Deploy time measures the time from when a branch is merged to when the code is released. Low deploy time correlates with high deployment frequency.
Elite: < 1 hour | Strong: 1-20 hours | Fair: 20-196 hours | Needs Focus: 196+ hours

Deploy Frequency
Deployment frequency measures how often code is released. Elite deploy frequency reflects a stable and healthy continuous delivery pipeline.
Elite: Daily+ | Strong: more than once per week | Fair: once per week | Needs Focus: less than once per week

PR Size
Pull request size measures the number of code lines modified in a pull request. Smaller pull requests are easier to review, safer to merge, and correlate with lower cycle time.
Elite: < 105 code changes | Strong: 105-155 code changes | Fair: 155-229 code changes | Needs Focus: 229+ code changes

Rework Rate
Rework rate measures the amount of changes made to code that is less than 21 days old. A high rework rate signals code churn and is a leading indicator of quality issues.
Elite: < 8% | Strong: 8-11% | Fair: 11-14% | Needs Focus: 15%+

Planning Accuracy
Planning accuracy measures the ratio of planned work to what is actually delivered during a sprint or iteration. High planning accuracy signals predictability and stable execution.
Elite: > 80% | Strong: 65-79% | Fair: 40-64% | Needs Focus: < 40%
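The thresholds above can be applied programmatically to your own data. As a minimal sketch (the function name and bucket labels are ours, not part of any LinearB API), here is how a team's cycle time would map to a benchmark tier:

```python
def classify_cycle_time(hours: float) -> str:
    """Bucket a team's average cycle time (in hours) into a benchmark tier.

    Thresholds come straight from the benchmarks table:
    elite < 42h, strong 42-95h, fair 95-188h, needs focus 188h+.
    """
    if hours < 42:
        return "elite"
    if hours < 95:
        return "strong"
    if hours < 188:
        return "fair"
    return "needs focus"

print(classify_cycle_time(30))   # elite
print(classify_cycle_time(120))  # fair
```

The same pattern works for any of the duration-based metrics; only the threshold numbers change per row of the table.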
Data Sourced From

1,971+ teams and 4.5M+ branches

The Engineering Metrics Benchmarks Study

The Engineering Metrics Benchmarks were created from a study of 1,971 dev teams and 4.5M+ branches. For the first time since DORA published its research in 2014, engineering teams can benchmark their performance against data-backed industry standards. Continue reading to learn more about our data collection and metric calculations.
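To make one of those metric calculations concrete, planning accuracy (the planned-vs-delivered ratio described above) can be sketched as follows. This is our illustrative interpretation; the ticket IDs are hypothetical and LinearB's exact formula may differ:

```python
def planning_accuracy(planned: set[str], delivered: set[str]) -> float:
    """Fraction of the planned work items actually delivered in the iteration.

    Unplanned items that shipped anyway do not raise the score, since
    the ratio is taken against what was planned.
    """
    if not planned:
        return 0.0
    return len(planned & delivered) / len(planned)

# Hypothetical sprint: 5 planned issues, 3 of them shipped (plus 1 unplanned).
sprint_plan = {"ENG-101", "ENG-102", "ENG-103", "ENG-104", "ENG-105"}
shipped = {"ENG-101", "ENG-102", "ENG-104", "ENG-106"}
print(f"{planning_accuracy(sprint_plan, shipped):.0%}")  # 60%
```

At 60%, this hypothetical team would land in the "fair" band (40-64%) of the benchmarks table.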

See how your organization compares to industry standards in < 5 mins
