Stop measuring AI adoption. Start measuring AI impact. | LinearB’s APEX framework

By Dan Lines

Are your AI coding tools actually making your team faster, or are they just creating downstream chaos? This week, Ben Lloyd Pearson and Dan Lines introduce APEX, LinearB’s new engineering leadership framework built explicitly to measure and manage software delivery in the AI era. Moving beyond traditional frameworks like DORA and SPACE, APEX balances AI Leverage, Predictability, Efficiency, and Developer experience to ensure upstream code generation translates into actual business value. Tune in to learn how to break past the illusion of coding speed, prevent AI slop from clogging your review pipelines, and discover which pillar of the APEX framework your team needs to tackle first.

Show Notes

  • LinearB APEX Framework: Explore the full operating model, visual breakdowns, and the guide to operationalizing the metrics.
  • Workflow Automation: Learn about LinearB's gitStream (policy-as-code for PR automation) and WorkerB (developer bot for minimizing idle time).

Transcript 

(Disclaimer: may contain unintentionally confusing, inaccurate and/or amusing transcription errors)

[00:00:00] Ben Lloyd Pearson: All right. Welcome back to Dev Interrupted, everyone. Today we've got a, a very special guest in my opinion. I'm joined by my fellow host and LinearB COO, Dan Lines. Dan, it's really great to have you back on the show again. It's been a little while.

[00:00:16] Dan Lines: What's up, BLP. Awesome to be here. Super excited to catch up. An exciting topic today.

[00:00:22] Ben Lloyd Pearson: Yeah. Yeah. And of course I'm joking about, it's been a little while, 'cause I, I think you've been on the, the episode just a couple of weeks ago. So it's actually really nice to get you back, uh, for, for multiple episodes in quick succession. So.

[00:00:34] Dan Lines: Love being here.

[00:00:35] Ben Lloyd Pearson: Yeah. Yeah. So, all right. So the topic that we want to, we want to cover today.

[00:00:39] Ben Lloyd Pearson: So, you know, we spent many years on this show, uh, you know, both at Dev Interrupted but also at LinearB, talking about things like DORA, like SPACE, you know, all these frameworks that really try to measure how effectively engineering teams are operating. You know, with the idea being that, like, part of our core mission at both LinearB and Dev [00:01:00] Interrupted is really to help engineering teams move from, like, this more, like, gut-feel-driven decision making to data-driven engineering.

[00:01:10] Ben Lloyd Pearson: And, you know, and it's been working well, I feel like, but the world has really sort of changed in, in the last year or so as AI coding tools like Copilot, Cursor, Claude, Codex, you know, all these new tools are hitting the mainstream and executives are seeing the bills for these tools. They're seeing all of these viral things about how teams are getting crazy productivity and writing tons of code with AI, and, and all of these claims and hype that are out there that people are making about AI.

[00:01:41] Ben Lloyd Pearson: And meanwhile, we, we encounter people, I feel like, on a weekly basis, that are asking us like, does this actually make us better? Like, are we actually more productive? Are we delivering more value to our customers? And that's the topic that we're gonna talk about today. 'cause you know, we've, we've got all these dashboards out there

[00:01:59] Ben Lloyd Pearson: Uh, that [00:02:00] show you things like AI adoption. You can get your DORA metrics, your cycle time, your CFR, your predictability. But we started developing this new operating model at LinearB that we are calling APEX. And that's really what I want to talk about today, Dan, because this, this really gets into why, you know, these playbooks that we have, they're still great, but they may not be up to snuff for the AI era.

[00:02:24] Ben Lloyd Pearson: Uh, and, and this is the framework that we hope is gonna prove that value. So before we get into the background behind APEX and how to implement it and all the details, uh, of what's in it, I want to just maybe start with a high level overview. So from your perspective, Dan, like what is APEX and, and why should our audience care about it?

[00:02:41] Dan Lines: Well, I mean, first of all, coolest name ever APEX. Gotta have a cool name. And, uh, why I love APEX and we're gonna, uh, talk about, uh, each aspect of it. But APEX is a framework that was made by the people, [00:03:00] for the people. And what I mean by that is it was really organically made, through our customer base.

[00:03:06] Ben Lloyd Pearson: Yeah.

[00:03:07] Dan Lines: It wasn't something like, okay, you know, Dora was more like, Hey, let's do like a research assignment. Let's go

[00:03:12] Ben Lloyd Pearson: Yeah.

[00:03:13] Dan Lines: and let's do all, it's very, like, uh, I would say research focused, you know, really specific on getting code through the pipeline. You know, also change failure rate, balanced with quality.

[00:03:26] Dan Lines: But I think what's great about APEX is it came from actual users, actual usage, and, and from our customers.

[00:03:34] Ben Lloyd Pearson: and I actually wanna point out that you know, that that research focus, you know, 'cause this is something we've seen time and time again like that. Um, you know, what works in the lab and, and what you can observe through research experiments and all of that stuff. That doesn't always play out in real life, you know, real life.

[00:03:48] Ben Lloyd Pearson: It can be quite a bit more messy than that. It's, it's, uh, you're dealing with all these different constraints. So I really do think it's important to point out how we're trying to take a much more practical and pragmatic approach that's based on [00:04:00] the things we see from, from customers and from, from our community of experts.

[00:04:04] Ben Lloyd Pearson: Like every single day, right?

[00:04:06] Dan Lines: And no knock to DORA or Space

[00:04:08] Ben Lloyd Pearson: Yeah. Yeah. Yeah.

[00:04:09] Dan Lines: frameworks, but, uh, revolutions and evolutions have happened,

[00:04:13] Ben Lloyd Pearson: Yeah.

[00:04:14] Dan Lines: You mentioned, uh, one of them obviously AI

[00:04:18] Ben Lloyd Pearson: taking

[00:04:18] Dan Lines: over the world, eating the world. So we know that, you know, some of the other, uh, frameworks have been around now for years and years, probably outdated. But there's another thing that I think happened in between, let's say, like, DORA, SPACE, and then where we came to, uh, with APEX, is engineering organizations, over the last, I don't know, five years, are also no longer just responsible to ship code.

[00:04:46] Ben Lloyd Pearson: Hmm.

[00:04:47] Dan Lines: They are also responsible to the business for value.

[00:04:52] Ben Lloyd Pearson: Yeah.

[00:04:52] Dan Lines: When you look at APEX, right, the A in APEX, AI leverage; the P, predictable delivery, predictability; [00:05:00] the E, efficiency, still gotta be efficient with shipping code

[00:05:04] Ben Lloyd Pearson: Yeah.

[00:05:04] Dan Lines: and then the X, DevEx. What I like about APEX, and what our customer base, uh, has told us, is it's up to date because it's AI forward. Okay? It starts with the A, it's AI, but it's more about the balance it takes into consideration. Of course, we need to ship code. It needs to be efficient and high quality, but there also has to be the value of the predictability

[00:05:31] Ben Lloyd Pearson: Mm-hmm.

[00:05:31] Dan Lines: Are we actually delivering value, uh, stories in a predictable way every sprint? Then it's got the A in there, right? Okay. The AI's gotta be there, gotta be AI forward, gotta take it into consideration, and then it rounds everything out with developer experience. So I think APEX, really what I've been thinking about is, wow, this is a really balanced framework, uh, for the times that we're living in today.

[00:05:58] Ben Lloyd Pearson: Yeah, and I, and I think one [00:06:00] thing that's really important to point out is how we designed this framework really to, um, to focus on AI as, like, a central component of your SDLC. Like, it is now a primary contributor or a primary operator within your SDLC, or, or at least it is becoming more and more of that over time, and like, yes, we have, we have A for AI leverage that explicitly calls out AI.

[00:06:23] Ben Lloyd Pearson: But I also do think it's important to recognize that AI is gonna impact every single aspect of this framework. So, you know, it, it impacts your cycle time, you know, your efficiency. Um, it impacts the experience that your developers have at your organization. So, so not only are we giving it its own explicit, like, sort of boundary where we're saying, this is the AI measurement that you need.

[00:06:47] Ben Lloyd Pearson: We're also showing how AI is impacting all of the aspects of your SDLC and how you need to factor that into your, your visibility layer.

[00:06:55] Dan Lines: They all play together, right? The A, the P, the E. So the [00:07:00] AI, the predictability, the efficiency, they all, these all relate to each other.

[00:07:03] Ben Lloyd Pearson: Mm-hmm.

[00:07:04] Dan Lines: And you're right. I mean, at least, you know, the customers that I'm working with in terms of their AI journey, I think, uh, there's different stages. If you said, like, I don't know, stage one is something like, hey, maybe, um, we're just getting started. And some, and some companies are there, like some large enterprise: Hey, we're, we are just getting started with this.

[00:07:25] Ben Lloyd Pearson: Yeah.

[00:07:26] Dan Lines: And maybe on the far extreme, you know, you have, uh, some of our customers are doing, like, spec-driven development. It's established, like they're trying to be, like, cutting, cutting edge, and probably most are in the middle.

[00:07:38] Dan Lines: They're in an experimentation phase. AI tools have been rolled out.

[00:07:42] Ben Lloyd Pearson: Yeah.

[00:07:43] Dan Lines: Claude is kind of taking over now, they probably started with, with Copilot, but I think what's cool about this framework is it doesn't really matter where you are in that, in the journey. You don't have to be, like, extreme with AI in order to get value out of it. A lot of the customers that, that I, I talk [00:08:00] to, we even start with the P and E. So kind of the middle of APEX, we say, Hey, you know what's core, uh, to delivery? Let's make sure we have great planning and capacity accuracy, delivering value on time, delivering stories on time, still the number one thing. And then efficiency.

[00:08:17] Dan Lines: Let's make sure the code's getting out in an efficient way with high quality. You can even just start there. Then layer in, okay, how is AI adoption affecting this?

[00:08:26] Ben Lloyd Pearson: Yeah.

[00:08:27] Dan Lines: Um, so yeah, that's kind of what I'm seeing, at least in the market and the customers I'm talking to.

[00:08:32] Ben Lloyd Pearson: Yeah. Yeah. And we'll get in more into these individual pillars, uh, quite a bit more here in a minute. Uh, but before we do that, I want to, I wanna just take a step back and just sort of look at the, the, the context behind where we are today and why we felt this need to shift to a framework like APEX. Um, and let's, let's start with this notion that, um, it comes up almost on a weekly basis, it feels like, here at Dev Interrupted.

[00:08:57] Ben Lloyd Pearson: And this, this idea that coding [00:09:00] faster is an illusion. You know, we hear it all the time how developers are feeling faster. You know, either they can write more code or they can do research more quickly. Um, yet often we're hearing that, like, the bottom line for these companies isn't really moving. So I'm just curious from your perspective, like.

[00:09:16] Ben Lloyd Pearson: Why is there this massive disconnect between the tool adoption? You know, I feel like tools, AI usage is basically ubiquitous at this point. Um, but then the actual delivery isn't quite like matching the expectations around this tool adoption. So what do you think is going on here?

[00:09:31] Dan Lines: Great question. I mean, first and foremost, I, I do believe like AI is a different beast maybe

[00:09:37] Ben Lloyd Pearson: Yeah.

[00:09:37] Dan Lines: than, like, tools that have been adopted, I don't know, over the last few years. Like it is actually game changing. I think anyone using AI, uh, developers, you just feel like you can do way more than you did before.

[00:09:50] Dan Lines: There's a lot of, let's say, uh, volume to it. That's the best way that I can describe it

[00:09:54] Ben Lloyd Pearson: Yeah.

[00:09:55] Dan Lines: end user. Even, even like, uh, uh, if I'm not using, uh, AI [00:10:00] to develop, just like doing day, uh, day-to-day life, I feel like I can do more volume of

[00:10:05] Ben Lloyd Pearson: Makes you feel bigger, like you have a bigger presence. Yeah.

[00:10:08] Dan Lines: if I wanna be, I can be like five people in parallel.

[00:10:11] Ben Lloyd Pearson: Yeah.

[00:10:11] Dan Lines: five developers in parallel. So there, there is that, uh, volume feel. But then I, I think what's also happened is if I, uh, move away from the developer and let's say, uh, when I'm talking to, like, CTOs and SVPs, there is a combination of a promise or expectation to the business, usually coming, like, from the CEO, uh, or the board, of like, Hey, we're in the AI era. Uh, there's an expectation to do more, move faster, deliver more code, deliver more value faster. So you got that expectation, oh, okay, now that I'm supposed to, uh, my org's supposed to be adopting AI, guess we gotta do everything at like 10x speed. But then the, the reality sets in of, hey, just because you're developing more code or more volume in the early [00:11:00] stages of the SDLC, doesn't mean that it can actually get out to production. It doesn't mean that stories are actually getting, like, completed on time. There's bottlenecks, uh, after the coding. Probably the rest of the SDLC hasn't caught up quite yet. When you put that triangle together, that's where I think the faster illusion comes from.

[00:11:20] Ben Lloyd Pearson: Yeah. And, and, uh, you know, I keep coming back to this, this concept that came up in last year's DORA report where, uh, you know, a lot of it this year was, or last year, was focused on AI in particular, uh, even more so than they have in recent years. And, uh, there was a phrase in there that really, that basically been stuck in my head ever since I heard it.

[00:11:39] Ben Lloyd Pearson: And that is that upstream acceleration is lost to downstream chaos. You know, like you have these downstream bottlenecks and things like code reviews and uh, deployment, stuff like that. Um, and, and I think it's becoming very apparent that if those downstream systems aren't ready for ai, the, the upstream gains are just [00:12:00] gonna get lost and go nowhere.

[00:12:01] Ben Lloyd Pearson: Right.

[00:12:02] Dan Lines: Yeah.

[00:12:03] Ben Lloyd Pearson: Um, but I'm in a, yeah, go ahead.

[00:12:05] Dan Lines: We did, we did the Benchmark, the Benchmarks podcast

[00:12:08] Ben Lloyd Pearson: Yep. Yep.

[00:12:10] Dan Lines: I don't know if we, we can pull, uh, the data on that, but I think you, we had something in there that said, Hey, even a lot of code that's being either created by AI or fully created by an AI agent, maybe a pull request goes up.

[00:12:24] Ben Lloyd Pearson: Mm-hmm. Yeah.

[00:12:26] Dan Lines: part of that chaos,

[00:12:28] Ben Lloyd Pearson: Yeah. It's more likely to not get merged, more likely to have a longer review time. Um, uh, you know, and I think we even found some interesting conflicting ideas where, like, it might sit for a review longer. Like someone doesn't pick it up for days. But then once they do, they just, they just give a thumbs up, they give an LGTM, and move on.

[00:12:46] Dan Lines: it.

[00:12:47] Ben Lloyd Pearson: Yeah, yeah, yeah. So it's,

[00:12:48] Dan Lines: of the chaos.

[00:12:49] Ben Lloyd Pearson: yeah.

[00:12:50] Dan Lines: that's like the data behind that, uh, upstream versus downstream chaos that you're talking about.

[00:12:55] Ben Lloyd Pearson: Yeah. Yeah, exactly. And, and you know, and I mentioned DORA and, and, and I wanna [00:13:00] touch on that a little bit 'cause we, we've already hinted at this, but, you know, we've had DORA, SPACE. They, they've been around for quite a while, very battle tested at this point, and well respected in the industry. There's been just a proliferation, I feel like, in recent years of, like, general productivity frameworks that, um, often, you know, to me just kind of feel like it's, it's someone who just, like, invents a dashboard that they want you to buy from them or something like that.

[00:13:25] Ben Lloyd Pearson: But you know, we've, so we've always been a fan of, like, frameworks in general, especially some of the ones that are more well established. But, so I want to just touch into why we're expanding upon them with APEX. Like, do you feel like it's something where the, the existing metrics fell short of something, or is it just that we need something that expands the scope of, of what they're doing?

[00:13:46] Dan Lines: Yeah. Yeah, like, like I said before, I mean DORA, I mean, these are great frameworks.

[00:13:50] Ben Lloyd Pearson: Yeah.

[00:13:51] Dan Lines: really are. I mean, we're, we're all about them. And like a, a lot of the customers that we work with like set metrics and we like improve cycle time or [00:14:00] CFR or MTTR, deployment frequency, like all of that kind of stuff.

[00:14:03] Dan Lines: It makes sense. I just feel like there's been an organic evolution. There's been an evolution in, in the market, in what you can do with AI, that kind of just, uh, I would say naturally made some of the, those frameworks to be out of date, no fault of their own. And even, probably, I think some of, like, the creators of DORA are, like, working on the next thing and all of that.

[00:14:27] Ben Lloyd Pearson: Yeah. Yeah.

[00:14:28] Dan Lines: our customers basically came to us and said, Hey, you know, DORA, it's been great. We need something new. And the reason we need something new: AI is here and we need to be measuring the adoption, the impact of it. That's the expectation. And also what I said before, I think there's the business value of it, meaning, or the business side of it, are we actually creating value, delivering value, and are we doing it on time? And those are the two, like, I think, primary additions that, uh, [00:15:00] APEX, uh, addresses that maybe some of the earlier frameworks, no fault of their own, uh, just didn't focus on.

[00:15:07] Ben Lloyd Pearson: Yeah. Yeah. Yeah. I mean, built for a different era. I mean, Dora I think was really built for like the cloud computing era, and you, and if you think about where, where we were like 10 years ago as cloud computing took over, like, yeah, there's a lot of parallels between that. But you know, as you said, this is a real game changer.

[00:15:23] Ben Lloyd Pearson: Like the, the rate of change that AI is creating is on a different scale than, you know, what we were dealing with back then. And, and, and I just wanna get into, like, one last topic before we move on to the, the framework itself. And that's this notion of AI in the critical path. So one of the principles of APEX, as I mentioned, is treating AI as this, like, first-class production contributor.

[00:15:47] Ben Lloyd Pearson: So I'm wondering, like, how, how does the mindset, of like, you know, I think a lot of organizations see AI, so far, as like a side experiment. Um, but we're rapidly shifting into this [00:16:00] world where AI becomes a core part of the delivery system. So, you know, how, like, reflect on that shift a little bit, what you're seeing from LinearB customers, and also, like, how you think APEX is here to, to help address that.

[00:16:13] Dan Lines: Yeah, like I, I'll even just, like, reference how my, how my day went today. I'm coming off a call with, I, I would just say, uh, a CTO of one of the biggest, uh, let's say, like, retail manufacturers in the US. We'll put, we'll put it that way. And my conversation with him, what do you think is the first thing that he came to me with?

[00:16:35] Dan Lines: He said, Hey, you know, we are committed. We're rolling out AI tooling. Um, and there is an expectation now that we're able to manage adoption, provide visibility with adoption, and even more so than that. And like I, I have other customers saying this to me: trying to figure out, okay, I have lots of teams.

[00:16:59] Dan Lines: Let's say I have [00:17:00] hundreds of teams, and the big ones, even like thousands of teams. Where is AI being adopted? Uh, where is it effective? Which teams? And then once we know that, how can we replicate those behaviors across the other

[00:17:15] Ben Lloyd Pearson: Yeah.

[00:17:15] Dan Lines: are maybe struggling a little more? And the reason that I tell that story like that is because you asked, like, AI in the critical path. Yeah, it's in the critical path now. These are the types of, uh, questions that engineering organizations, uh, have to answer, to be, uh, modernized, or however you wanna put it. That's the expectation now. And of course, like, the APEX, uh, framework is well suited to address that.

[00:17:40] Ben Lloyd Pearson: Yeah. It's like when you move from, from AI experimentation to, like, actually having AI in your critical paths, it, it creates data anomalies, right? Like there's a, there's a change in your data and suddenly a team is operating differently than they were before. Hopefully much more productively.

[00:17:58] Ben Lloyd Pearson: And once you see that, you [00:18:00] want to like naturally, you're gonna wanna replicate that across other teams and get them out of the experimentation stage. Right.

[00:18:06] Dan Lines: Exactly, and it's like a business commitment now, it's like a commitment

[00:18:10] Ben Lloyd Pearson: I.

[00:18:10] Dan Lines: of companies. It's like, yes, I am committed to moving past just like early experimentation to actually operationalizing and by the way, showing the impact. Like, hey, all of this AI rollout, Hey, is uh, Claude actually doing anything for us

[00:18:25] Ben Lloyd Pearson: Yeah.

[00:18:25] Dan Lines: terms of like the, the bottom line of like the value and getting high quality code out to production.

[00:18:31] Ben Lloyd Pearson: Yeah. All right. Well, I think we've done a, a great job at sort of providing the background context for, for how we got there or to here to this moment. Um, now I want to get into the pillars individually and sort of break down, um, what APEX consists of and, and why we've built it this way. So, you know, there's four distinct pillars.

[00:18:49] Ben Lloyd Pearson: We, we mentioned them briefly at the top, uh, of the show, but just real quickly, I'll list them again. So we have a, for AI leverage. We have P for predictability. We have E for efficiency, and [00:19:00] then X for developer experience or DevEx. So let's, let's just start at the top with, with AI leverage, 'cause it's kind of the hot button topic, obviously.

[00:19:08] Ben Lloyd Pearson: Uh, and for this, you know, we, we define it as measuring how effectively AI is embedded into your production systems. Uh, and the North Star metric that we have for this is AI-assisted PRs. So what number of PRs over a specified time period, um, or what percent of PRs, were assisted in some way by AI. And we kind of break this down by, like, coding assistance, code review, fully agentic systems.

[00:19:33] Ben Lloyd Pearson: Like there's a variety of ways that this shows up. And you know, one of the things that we note with this framework is that, like, usage dashboards alone aren't going to validate your impact. Like, I imagine if you were to just go and look at raw usage today, you would see most of your developers are using AI at least on a weekly basis, probably daily at this point.

[00:19:54] Ben Lloyd Pearson: Um, but we take it a step further than that and we tie usage to the actual pull requests that are [00:20:00] entering into your, your code base, um, and, and looking at how AI is contributing to the unit of work through the system. So let, let's, let's start there with AI-assisted PRs, Dan. So what role do you think this fits within APEX?

[00:20:14] Dan Lines: Yeah. Yeah. Okay. So a few, a few things there. Uh, I do think impact matters the most. Okay, so, and, and, and that is what, what I'm hearing from the customer base,

[00:20:25] Ben Lloyd Pearson: Mm-hmm.

[00:20:26] Dan Lines: hey, I still have to say, how does this affect things like cycle time, rework rate, PR size, change failure rate?

[00:20:34] Ben Lloyd Pearson: Yeah.

[00:20:35] Dan Lines: How does it affect my planning, uh, accuracy, my delivery, all of that?

[00:20:39] Dan Lines: It, it, uh, it matters and that's why it's not only about adoption

[00:20:43] Ben Lloyd Pearson: Yeah.

[00:20:44] Dan Lines: And usually I, I'm hearing that from, uh, customers that feel, Hey, I've already kind of rolled out AI, I am ready for, for the impact side of it. But I will, I will say there are also a lot of engineering organizations that are just, uh, [00:21:00] in the adoption phase. When you look at AI-assisted PRs, I think it's a really easy way, and I'll go back to the teams: which teams within your engineering organization have, like, 70% AI-assisted PRs and greater, which teams have 50% and greater, and which teams are kind of just, like, starting out, let's say, uh, 10, 20%, that, that type of benchmark. I do think that gives, kind of like, okay, I can now have an understanding of, like, what maturity level my, my adoption is at amongst my teams. And like I said earlier, uh, usually what people are doing is trying to look at those teams that have a, a high adoption rate and, uh, and high impact metrics, like cycle time has decreased, change failure rate has decreased.

[00:21:50] Dan Lines: Now, when I identify those teams, I, I can go in and, and inspect what are these teams doing with AI that maybe other teams aren't. And then you go and try to [00:22:00] replicate those behaviors.

[00:22:01] Ben Lloyd Pearson: Yeah.

[00:22:02] Dan Lines: that I'm seeing.
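
To make that North Star concrete, here is a minimal sketch of the kind of calculation being described: the share of merged PRs per team that had any AI assistance, bucketed into rough maturity tiers along the lines Dan mentions (70%+, 50%+, just starting out). The record fields and tier cutoffs are illustrative assumptions, not LinearB's actual schema or API.

```python
from collections import defaultdict

# Hypothetical PR records; in practice these would come from your own
# export or tooling. The field names here are illustrative, not a real API.
prs = [
    {"team": "payments", "ai_assisted": True},
    {"team": "payments", "ai_assisted": True},
    {"team": "payments", "ai_assisted": False},
    {"team": "platform", "ai_assisted": False},
    {"team": "platform", "ai_assisted": True},
]

def ai_leverage_by_team(prs):
    """Percent of merged PRs per team that had any AI assistance."""
    totals, assisted = defaultdict(int), defaultdict(int)
    for pr in prs:
        totals[pr["team"]] += 1
        if pr["ai_assisted"]:
            assisted[pr["team"]] += 1
    return {team: 100 * assisted[team] / totals[team] for team in totals}

def maturity_tier(pct):
    """Rough benchmark tiers along the lines described in the conversation."""
    if pct >= 70:
        return "high adoption"
    if pct >= 50:
        return "moderate adoption"
    return "starting out"

for team, pct in ai_leverage_by_team(prs).items():
    print(f"{team}: {pct:.0f}% AI-assisted PRs ({maturity_tier(pct)})")
```

The useful part is the team-level split rather than the exact cutoffs: teams at the top of the range are candidates for inspecting and replicating behaviors, teams at the bottom are candidates for enablement.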

[00:22:03] Ben Lloyd Pearson: Yeah. And it's like, it is really, again, it's like, it's like anomaly detection, right? So, like, a team with really high AI usage, it could be one of two things. It could be a team that has learned something really good, that is super beneficial, that you should propagate to other teams where you can. The other side of the equation might be a team that's currently overwhelmed by AI slop, right?

[00:22:24] Ben Lloyd Pearson: Like maybe, maybe they're just creating tons of AI-generated code. They're not reviewing it, they're just pushing it to production and things are blowing up constantly. And yeah, so, and that's where I think going from the A, the AI leverage, to these other metrics is really critical, right? Because if you're just getting more AI assistance, but it's, it's messing up all the other stuff like predictability.

[00:22:45] Ben Lloyd Pearson: The next thing that I want to talk about, then it's all for nothing. Uh, so let, let's just move on to predictability then, which, you know, to us that's ensuring that your commitments are reliable, and that, again, the AI volume isn't adding instability into your systems. [00:23:00] So we have planning and capacity accuracy.

[00:23:03] Ben Lloyd Pearson: As the, the North Star metrics for this. And of course, as AI increases this output variability that I'm describing, um, how do managers, like, control planning and capacity accuracy, Dan, like, just to make sure that, that the volume isn't, like, upending their sprints?

[00:23:20] Dan Lines: Yeah. And this, and this is the balance aspect of it. I, I'm happy you brought it up that way. I tried to talk to you and Ori yesterday about Mr. Miyagi from Karate Kid, the Balance,

[00:23:31] Ben Lloyd Pearson: Yeah.

[00:23:32] Dan Lines: neither of you are watching Cobra Kai,

[00:23:34] Ben Lloyd Pearson: Yeah. Yeah.

[00:23:35] Dan Lines: gotta, you gotta watch. But yeah, I

[00:23:37] Ben Lloyd Pearson: Required watching for all engineering leaders.

[00:23:40] Dan Lines: like APEX balance, Mr.

[00:23:42] Dan Lines: Miyagi, all of this goes together. Yes. The balance side of it is, if you're just generating a bunch of, uh, slop or junk, whatever, uh, with AI, you would see, okay, my planning accuracy in my, uh, sprints is decreasing. I'm not actually [00:24:00] doing what I say I'm gonna do. It's not, uh, stories aren't getting completed.

[00:24:03] Dan Lines: Bugs aren't getting fixed. I'm not actually providing value back to the business. I'm just messing around with AI and doing a bunch of volume stuff, like we said

[00:24:11] Ben Lloyd Pearson: Mm-hmm.

[00:24:12] Dan Lines: of the pod. And so I feel like that, uh, that's one of the balancers: Hey, let's look for teams that, okay, you do have, uh, let's say, uh, nice AI adoption, yeah, maybe like 65, 70% of your PRs, uh, are AI-assisted, your planning accuracy is increasing, your capacity accuracy is increasing, you're doing what you say you will. That's kind of like, uh, when I say, like, predictability, I think we wrote here, ensure delivery commitments remain, remain reliable.

[00:24:45] Dan Lines: You're still a reliable delivery team even though you're, uh, utilizing AI. Like that's, I think, the balancing aspect of it.
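
To make the predictability side a bit more tangible, here is a simplified sketch of a per-sprint planning accuracy check, under the assumption that planning accuracy roughly means the share of issues committed at sprint start that were actually completed by sprint end. The issue records and field names are illustrative only, not LinearB's exact definition.

```python
# Hypothetical sprint issues; "planned" means committed at sprint start,
# "done" means completed by sprint end. Field names are illustrative only.
sprint_issues = [
    {"key": "PAY-101", "planned": True,  "done": True},
    {"key": "PAY-102", "planned": True,  "done": True},
    {"key": "PAY-103", "planned": True,  "done": False},
    {"key": "PAY-104", "planned": False, "done": True},  # unplanned work pulled in
]

def planning_accuracy(issues):
    """Share of planned issues completed within the sprint, as a percentage."""
    planned = [issue for issue in issues if issue["planned"]]
    if not planned:
        return 0.0
    return 100 * sum(issue["done"] for issue in planned) / len(planned)

print(f"planning accuracy: {planning_accuracy(sprint_issues):.0f}%")  # 67% in this example
```

Tracked sprint over sprint, a falling number alongside rising AI-assisted PR volume is exactly the "volume without value" pattern being discussed.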

[00:24:54] Ben Lloyd Pearson: Yeah, and this is where, uh, you know, we haven't talked about quality indicators very much, but I do think this is [00:25:00] where we, where the quality starts to really come into the picture. Um, because if, if you have high, if you're creating software with high defect rates, or if you're slow at responding to outages or failures.

[00:25:13] Ben Lloyd Pearson: Um, or if you are constantly reworking existing code and, and refactoring it on, like, just a constant basis, because it wasn't architected the right way. Like, all of these things are gonna impact your predictability, and through the lens of AI it's gonna get, it's gonna amplify, it's gonna make it even worse.

[00:25:31] Ben Lloyd Pearson: You know, it's why this stuff is, is so important. And speaking of important things, let's, let's talk about the, the sort of tried and true though, probably not the most exciting letter within this, and that's efficiency, you know? No, no one really is excited about efficiency, but everyone wants to be efficient, right?

[00:25:48] Dan Lines: I'm excited about it.

[00:25:50] Ben Lloyd Pearson: Yeah. Yeah. Yeah. I mean, it certainly feels really good to be efficient. Uh, and you know, and this is all about optimizing how your work flows from start to merge. Um, our North [00:26:00] Star within this is cycle time, but we also, you know, getting to the quality side of things, we also throw in CFR, change failure rate, as a, a core metric to this.

[00:26:09] Ben Lloyd Pearson: Because if you move faster, but you're creating more failures, you're losing the gains of being more efficient. So, you know, Dan, my question to you is, you know, if you're adopting AI and your coding time drops dramatically because AI is doing all the writing for you, but the review time spikes, your constraints just moved, right?

[00:26:29] Ben Lloyd Pearson: Like, isn't, isn't that the case? Like how, how do we, how do we use cycle time as the north star within this?

[00:26:34] Dan Lines: Yeah. So I, I said, Hey, I, I like efficiency. Uh, let's pay homage to Dora cycle time and CFR.

[00:26:42] Ben Lloyd Pearson: Yeah. Mm-hmm.

[00:26:44] Dan Lines: It matters, the efficiency of how code flows through your SDLC and the end game, actually making it out to production, into the hands of a customer. That matters a lot.

[00:26:54] Ben Lloyd Pearson: Yeah.

[00:26:55] Dan Lines: so, you know, to your point, hey, let's say coding time decreases, uh, [00:27:00] by a ton. And let's say even, uh, PRs opened increase by a ton. If these PRs, uh, no one's picking them up for review, they're not actually making it out to production, they're getting thrown back. Maybe you do have standards in place that are kind of blocking these PRs. Uh, maybe the test coverage is there, maybe it isn't, uh, maybe they do make it out to production and your change failure rate is spiking.

[00:27:25] Dan Lines: The customer that I was talking about earlier, quality was actually their, uh, this person's number one thing on their mind in the AI era. The mindset was

[00:27:34] Ben Lloyd Pearson: Yeah.

[00:27:34] Dan Lines: yeah, okay. We're we're deploying. Uh, I think they, I think they were using Claude. Hey, everyone's using Claude, and I can see a lot more stuff is happening.

[00:27:41] Dan Lines: But you know what they also told me? My number one problem is incidents in production. Uh, this person told me they're caused by code. Like these are direct bugs in prod. So on the quality side, again, the change failure rate, paying homage, uh, to Dora, the [00:28:00] rework rate, are we having to rework the code over and over again?

[00:28:03] Dan Lines: That's where the efficiency is, that balancer, uh, to the speed of ai. So,

[00:28:08] Ben Lloyd Pearson: Yeah.

[00:28:08] Dan Lines: You know, I'm in on efficiency. I still like the E.

[00:28:12] Ben Lloyd Pearson: Yeah. And I, and I think it's really important to, to break cycle time down. 'cause you know, we've, we've been touching on this, uh, multiple times, but cycle time is a fairly complex metric. There's a lot of stages that go into it. Uh, and you really need to break down each individual component part.

[00:28:27] Ben Lloyd Pearson: So what's your, your coding time, how long does it take PRs to get picked up for review? How long does that review take? And then once it's been approved, how long is it taking you to get it deployed to production? Like, any one of these stages could be your constraint within the system, and you, you really need to go from that, like, high-level efficiency view all the way down to, like, looking at the actual, the actual segments that are getting bottlenecked and, and which PRs are the ones that are causing those bottlenecks.
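
As a rough illustration of that breakdown, here is a small sketch that splits one PR's cycle time into the coding, pickup, review, and deploy segments just listed and flags the largest segment as the likely constraint. The timestamp fields are hypothetical placeholders for whatever your Git and deployment tooling records, not a specific LinearB schema.

```python
from datetime import datetime, timedelta

# Hypothetical timestamps for one PR; in practice these come from your
# Git provider and deployment pipeline. Field names are illustrative only.
pr = {
    "first_commit":   datetime(2025, 1, 6, 9, 0),
    "pr_opened":      datetime(2025, 1, 6, 15, 0),
    "review_started": datetime(2025, 1, 8, 10, 0),
    "pr_merged":      datetime(2025, 1, 8, 16, 0),
    "deployed":       datetime(2025, 1, 9, 11, 0),
}

def cycle_time_breakdown(pr):
    """Split cycle time into the stages discussed: coding, pickup, review, deploy."""
    return {
        "coding time": pr["pr_opened"] - pr["first_commit"],
        "pickup time": pr["review_started"] - pr["pr_opened"],
        "review time": pr["pr_merged"] - pr["review_started"],
        "deploy time": pr["deployed"] - pr["pr_merged"],
    }

stages = cycle_time_breakdown(pr)
total = sum(stages.values(), timedelta())
for stage, duration in stages.items():
    print(f"{stage}: {duration} ({duration / total:.0%} of cycle time)")
print(f"total cycle time: {total}")

# The largest segment is the current constraint worth drilling into.
bottleneck = max(stages, key=stages.get)
print(f"likely bottleneck: {bottleneck}")
```

Run across many PRs, the same breakdown shows whether AI-accelerated coding time is simply shifting the wait into pickup, review, or deploy.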

[00:28:56] Ben Lloyd Pearson: All right, so let's move on to the last pillar then. And this is the [00:29:00] X, developer experience. And the purpose of this is just to ensure that any gains you get from, uh, AI are sustainable and human-centered. You know, the humans are the ones that have to operate these systems, and at the end of the day, they should be satisfied with how they're performing.

[00:29:16] Ben Lloyd Pearson: And that's why we picked the North Star metric for this to be developer satisfaction. So, you know, APEX kind of treats, uh, DevEx as almost like a guardrail, right? So if, if your cycle times are improving, but your developer satisfaction drops, um, like, maybe that's an indicator that your productivity gains are actually just fake.

[00:29:37] Ben Lloyd Pearson: They're an illusion or unsustainable. So let's talk about developer experience, Dan. Like, like how, how does this work within APEX? Um.

[00:29:45] Dan Lines: Yeah. Like we said earlier, the A, okay, the AI leverage, the P, the predictability, the E, the efficiency, and now the X to me, they all relate together. They all relate, and the way that I [00:30:00] like to think about it is if I'm leveraging AI in the right way. And what we talked about is making sure that AI has the right impact, not just adoption.

[00:30:10] Ben Lloyd Pearson: Mm-hmm.

[00:30:11] Dan Lines: right impact. If I am improving my predictability, devs want predictability. I mean, when I, when I was a, a developer, like chaos kind, yeah, it's

[00:30:23] Ben Lloyd Pearson: Yeah.

[00:30:23] Dan Lines: might be fun on a Friday when I'm experimenting, but not when I'm actually like on the hook to deliver my sprint on time. I

[00:30:29] Ben Lloyd Pearson: Yeah.

[00:30:30] Dan Lines: chaos, so I wanna deliver great work. I want it to be in an efficient way, right? I don't wanna create a bunch of PRs and have them get stuck in the review process or not be able to get deployed. That's the E. And when these things come together, yeah, I think kind of like the final check is, okay, let's just validate, uh, that if we're doing well with the A, the P, and the E, the satisfaction is, is there. And usually what I see is, like, if the AI [00:31:00] leverage, the predictability, and the efficiency are looking good, the satisfaction is usually there. Now you might catch, like, a red flag or something like that. But anyways, I, I kind of think that's okay. Maybe that's, like, the final check to make sure that we are listening and, and we're on point, uh, with the other aspects of APEX.

[00:31:17] Ben Lloyd Pearson: Yeah, and I think it's important to remember that satisfaction can be measured at multiple levels. So you can look at like the overall satisfaction, like how, how, how do you feel about the way we do work and your job here, and all of that. Um, but then the qualitative side of this is, is really great for like digging all the way down to the specifics.

[00:31:35] Ben Lloyd Pearson: So, like, looking at the teams that are frustrated with AI and getting direct feedback from them about why they're frustrated so that you can address it. Or looking at the teams that have been incredibly successful with AI and seeing: what are the things that, that make them flow really well that you could maybe, uh, take from them and apply to other teams within the organization?

[00:31:55] Ben Lloyd Pearson: So, um, that's, I think, what I really love most about this last one is how you, you [00:32:00] can both measure it at the high level, but then go all the way down to, like, the individual teams and, and individuals themselves who are being impacted by, uh, AI, and, and just get direct feedback from them.

[00:32:11] Dan Lines: Yep. Well said.

[00:32:13] Ben Lloyd Pearson: So let's, let's talk before moving on, just about connecting this all together.

[00:32:17] Ben Lloyd Pearson: So we've, we intro, we've been introducing this framework to a lot of people so far. Um, I think a lot of people look at it and they just wonder, like, where do I start? You know, there's, there's four pillars here. Uh, which one's for me? So, so Dan, is it, is it a matter of, like, I have to pick one of these that is the most important to me?

[00:32:36] Ben Lloyd Pearson: Or do, do we LinearB, have an opinion about which one you have to come in and pick first, or how, how does that work?

[00:32:42] Dan Lines: No, I, I, I think the letter picks you,

[00:32:44] Ben Lloyd Pearson: Yeah.

[00:32:45] Dan Lines: It's like, uh, I don't know, like in Harry Potter, when you get selected to one of the houses,

[00:32:49] Ben Lloyd Pearson: Yeah.

[00:32:49] Dan Lines: you don't pick, it picks you

[00:32:51] Ben Lloyd Pearson: You get a nice leather shoe and it's like this shoe was for me.

[00:32:54] Dan Lines: It, it, it picks you. And, and what I mean by that is, uh, what I'm, what I'm [00:33:00] seeing is, okay, when customers come in and they're kinda like, let's say that you're early on in the AI adoption journey. Let's say you're kind of at the starting point. Usually the P and the E, the predictability and the efficiency, is the place to start

[00:33:15] Ben Lloyd Pearson: Mm-hmm.

[00:33:15] Dan Lines: That's how APEX selects you. Why? 'cause still at the end of the day, you gotta be predictable.

[00:33:21] Dan Lines: You gotta be efficient.

[00:33:22] Ben Lloyd Pearson: Mm-hmm.

[00:33:23] Dan Lines: That's the business of engineering.

[00:33:24] Ben Lloyd Pearson: deliver value on time and do it in an efficient way.

[00:33:27] Ben Lloyd Pearson: And more importantly.

[00:33:28] Dan Lines: In later.

[00:33:29] Ben Lloyd Pearson: Yeah. Well, and more importantly, um, you know, if we know that AI is an amplifier, if you have bad efficiency, if you have bad predictability, AI is gonna make it worse rather than better. Right. Yeah.

[00:33:40] Dan Lines: So I think that middle is, like, the co, the core of it. If you are on the other side and you're saying, Hey, you know what? I made a bunch of promises, uh, to the business around AI adoption and the impact of AI, or, Hey, my P and my E are really solid, like you said, uh, Ben, and I'm really looking [00:34:00] to amplify, so I'm getting that AI leverage. Then you start with the A.

[00:34:04] Ben Lloyd Pearson: Mm-hmm.

[00:34:05] Dan Lines: Then you work your way to the, to the right. Okay. More AI adoption. And I'm gonna make sure that my predictability, my efficiency, and my satisfaction, uh, remain constant.

[00:34:15] Ben Lloyd Pearson: Yeah, I like that approach. Very, very flexible. Kind of meets you where you are rather than trying to form fit you into it. Right.

[00:34:23] Dan Lines: selects you. Yep.

[00:34:25] Ben Lloyd Pearson: Alright, well I feel like we've gotten a really great breakdown of what APEX is, why we brought it to the market, and, and why we think engineering leaders should be leveraging it today.

[00:34:34] Ben Lloyd Pearson: Um, before we close out, I wanna cover just a couple more things related to this. And, and the first is, is how you put APEX into operation, right? So as a part of this, we have a guide that, that we will include in the show notes with this. Um, but we also included a recommendation for a specific rhythm of, um, operations around these.

[00:34:53] Ben Lloyd Pearson: So, for example, you might want to track your AI leverage on a weekly basis because it's [00:35:00] changing so frequently. Or you might wanna look at predictability once per sprint because it's just a natural cadence to analyze that. Whereas something like DevEx, or your metrics there, could be monthly or quarterly, depending on, like, how you as an organization, um, feel about this.

[00:35:14] Ben Lloyd Pearson: So, so why do you think, Dan, is the, is the cadence so important to APEX actually working within organizations?

[00:35:21] Dan Lines: I mean, I just feel like the business of engineering, the whole per, like, one of the main, I don't know, tenets of engineering is to have a repeatable cadence.

[00:35:32] Ben Lloyd Pearson: Yeah.

[00:35:32] Dan Lines: It is like a, a machine that needs to be operating in a smooth way. When the cadence gets broken, uh, engineering is broken, the business is broken. So, like you said, yeah, like predictability, the P of APEX, go with the flow of your, uh, sprints. Maybe it's two weeks, maybe it's three weeks. If you're working in, in Kanban, it's weekly. On the efficiency side, yeah, I mean, I look, I like to look at it, uh, let's say, uh, twice a month. Some [00:36:00] organizations look at it monthly. Uh, on the AI leverage side, that's a monthly mode for me right now. And I can see actually some, some of our customers are even like, okay, we're kind of under the gun here. So it's, like, weekly

[00:36:12] Ben Lloyd Pearson: Yeah.

[00:36:13] Dan Lines: on all of these? Yeah. You report it to the business. You, you owe it to the business. It's a quarterly executive report.

[00:36:19] Dan Lines: Or if you're gonna survey developers, yeah, you do it at a quarterly, uh, cadence. But the whole key to me is, like, the business of engineering runs on cadences. And therefore, like, the APEX framework, I think, fits into the natural cadence of how engineering orgs operate.

[00:36:37] Ben Lloyd Pearson: Yeah. Uh, that, that makes a lot of sense. And, and I, and, and it's one, one of the things I like the most about the framework is it's, it's very easy to just, you know, whatever cadence works for you. Like, like you said, if you want to, if you wanna keep tabs on ai, like every day, like, because you're moving that fast, like do it, you know, it makes, it, makes sense for you.

[00:36:53] Ben Lloyd Pearson: But if you're an organization that's moving a little more slowly or uses it more for executive reporting, maybe a monthly or quarterly [00:37:00] cadence makes a lot more sense. But the, the flexibility of APEX, I think, is one of the things that makes it really powerful. All right. And then I wanna close out by just talking about how we actually operationalize this data.

[00:37:11] Ben Lloyd Pearson: So, you know, a constant theme we've always had, uh, at LinearB and at Dev Interrupted is that, you know, visibility doesn't matter if you're not taking action upon it. So, you know, I'm curious, Dan, from your perspective, like, what's the first step for an engineering leader to take advantage of APEX, and what should their goal be by adopting this framework?

[00:37:33] Dan Lines: Yeah, great question. I mean, the, the engineering leaders that I

[00:37:37] Ben Lloyd Pearson: Yeah.

[00:37:38] Dan Lines: work with, when we first start together, it's getting to a benchmark. Let's come benchmark against APEX. You gotta see where you are. Yeah. Like we said before, if you come in and, you know, Hey, I'm already predictable, I got great efficiency, I need to start with AI.

[00:37:50] Dan Lines: So, okay, the A selects you, but oftentimes, again, APEX is a balanced, uh, framework. Let's see where you are from a benchmark perspective, and [00:38:00] then let's decide together. uh, usually when you come in, okay, you come in, you start using LinearB you know, you start using the platform, it will show you, show you on the benchmarks.

[00:38:10] Dan Lines: It's pretty obvious where you'll, uh, the same way the benchmark will select you. It kinda lets you know, Hey, let's

[00:38:17] Ben Lloyd Pearson: Yeah,

[00:38:18] Dan Lines: and then build a progressive

[00:38:19] Ben Lloyd Pearson: it's like a, it's like a flashing warning sign. It's like, here, here's the thing for you to solve today, kind of thing.

[00:38:24] Dan Lines: your benchmark, and then see where you wanna improve. Set a goal. Those are the first two steps.

[00:38:29] Ben Lloyd Pearson: Yeah. And then, and then of course, uh, you know, I think, uh, when someone does that, they're gonna encounter a lot of common bottlenecks that we see time to time, things like code reviews, for example, are a frequent bottleneck. Uh, so I'm just curious, like, what, what's your opinion on, you know, once you've got these metrics and they've benchmarked it, like, what's the next step that someone should be taking?

[00:38:52] Dan Lines: Yeah, yeah, yeah. So at, at LinearB we're always doing, uh, metric and then action. Once you see all of your metrics, [00:39:00] you're gonna look at them, you're gonna get benchmarked, and what's a natural thing to do? Okay, I see these metrics. Maybe I'm talking to, like, the LinearB MCP, I'm getting my insights.

[00:39:09] Dan Lines: At the end of the day, you gotta take the next step to, uh, action. So, like, for us at, at LinearB, we would say, Hey, let's go roll out the AI code review. Let's put in some gitStream rules. Let's work with, uh, let's turn on WorkerB. These are the things that make the A, the P, the E, the X actually improve.

[00:39:28] Ben Lloyd Pearson: Yeah.

[00:39:29] Dan Lines: so yeah, action's the next step.

[00:39:30] Ben Lloyd Pearson: And I think in particular with AI, when you're adopting it into your, you're again putting it into your critical paths, uh, it, it's really useful to, to have deterministic controls that keep it on guard, like keep, keep it, like, inside of guardrails, you know? Uh, and yeah, I think without that it's, it's very easy for, for, for a team that might otherwise be successful to be one of those teams that gets overwhelmed by AI slop.

[00:39:55] Ben Lloyd Pearson: Right.

[00:39:57] Dan Lines: Let's keep 'em in line.

[00:39:59] Ben Lloyd Pearson: [00:40:00] Yeah. Yeah. All right, Dan. Well, it is, it's always great talking to you about these topics. I, I think this is just a really great reminder today that, you know, the tools are changing very rapidly at times, especially now. But the goals really are, have always been the same. You know, we want to deliver value to customers, and I feel like we've really only scratched the surface of, of what we could get into with, with APEX and, you know, how it can impact engineering leaders.

[00:40:25] Ben Lloyd Pearson: Uh, so I definitely think we're gonna have you back at some point to, to talk about this. Um, there's, I think, a lot of room for follow-up content that I'm hoping we'll, we'll be able to bring on to Dev Interrupted in the future. But yeah, thanks Dan for, for coming out today.

[00:40:38] Dan Lines: Thanks for having me, man. Can't wait to be back. Uh, next time when you call, I'll be there.

[00:40:46] Ben Lloyd Pearson: All right. Awesome. Well, for those of you that are listening, if you wanna see the full APEX operating model with the metrics and the visual breakdown of these pillars that we discussed, you know, you can head over to your favorite search engine and look up the LinearB APEX [00:41:00] framework. We'll also have a link in the show notes if that's easier for you.

[00:41:03] Ben Lloyd Pearson: Dan, thanks to you and our audience for joining us today. If you found this episode helpful, feel free to share it with another engineering leader who's navigating this AI transition. I'm sure they would really appreciate the help right now, and you, our listener, are in a place to help them because of what you've learned here today.

[00:41:20] Ben Lloyd Pearson: So help us out, help your friends out, share this episode, share the guide. Uh, we'd love to hear what you all think about it, and we'll see you all in the next episode.
