Virtual Workshop
2026 Benchmarks Report
Join our AI Productivity roundtable
Register for this virtual event and be the first to explore new AI insights from the 2026 Software Engineering Benchmarks Report – backed by 8.1M+ PRs across 4,800 organizations and 42 countries.
*Event registration includes early access to the 2026 Report delivered right to your inbox.
Sessions
1pm ET on December 10, 2025
11am GMT on December 11, 2025
Speakers
Rob Zuber
CTO, CircleCI
Yishai Beeri
CTO, LinearB
About the workshop
No fluff. Just insights drawn from millions of data points, plus a 35-minute roundtable discussion breaking down:
- State of the Market: Results and reflections from our 2026 AI in Engineering Leadership survey.
- 2026 Benchmarks: This year’s benchmarks include 20 metrics spanning the entire SDLC – plus 3 all-new AI metrics.
- [NEW] AI insights: A brand-new segment breaking down the impact AI tools are having on delivery velocity, code quality, and team health.
Benchmark insights from:
8.1+ million pull requests
4,800 organizations
42 countries
About the report
This year’s report takes a hard look at AI’s impact on productivity. Here’s a preview of a few standout findings from the data:
- AI PRs wait 4.6x longer before review – but are reviewed 2x faster once picked up.
- Acceptance rates for AI-generated PRs are significantly lower than for manual PRs (32.7% vs. 84.4%).
- Bot acceptance rates vary widely by tool, with Devin’s rising since April and Copilot’s slipping since May.
Everything here – and plenty more – will be covered by our experts live in the workshop. Register above to join us.