We're excited to partner with LinearB on this year's Accelerate State of DevOps Report, which helps shape the future of developer productivity. LinearB's benchmarks research provides quantitative support for DORA's research and adds essential insight into how engineering teams continue to evolve.
Head of Google's DORA Team
DORA, Efficiency & Planning Benchmarks
The best way to improve any team is to understand what separates a well-performing engineering org from one that's struggling. The 2023 Engineering Benchmarks Report is the most thorough, comprehensive look at the performance metrics that make engineering orgs elite, average, or in need of improvement. See benchmarks for essential metrics like:
Cycle time, deployment frequency, mean time to restore, and more.
Pull-request size, code-review time, pickup time, and more.
Planning accuracy, capacity accuracy, investment distribution, and more.
Benchmarks Based On The Size & Maturity Of Orgs
Small teams need to understand how they perform compared to an org with 5 engineers, not 5,000. For the first time, this year's report outlines benchmarks and insights segmented by organization size. See benchmarks for companies in each tier:
Startups: 0-100 employees
Scale-ups: 100-1,000 employees
Enterprise: 1,000+ employees
For the First Time: Investment Benchmarks
With the creation of investment metrics (data on the types of work dev teams focus on), leaders can see how much of their team's time and resources goes toward each initiative. With visibility into those work investments, engineering leaders can strategize with other business leaders about how work should be prioritized to benefit the larger company. See investment benchmarks based on different types of resource deployment:
Keeping the lights on