"Just having software engineering intelligence... that used to be enough, but that's no longer the case... the difference is we now need to take that information and we need to apply immediate actions."
Are your teams feeling the intense pressure to produce more in an era increasingly dominated by AI?
Join hosts Ben Lloyd Pearson and Dan Lines as they unpack a major shift in how engineering organizations must now approach productivity. Dan reveals the urgent challenges he hears directly from CTOs and VPs, who are grappling with how to define their AI strategy for genuine productivity gain, accurately measure its true impact, and understand the resulting implications for their workforce.
In this episode, Ben and Dan explore why traditional software engineering intelligence (simply having metrics and information) is no longer sufficient in 2025. They dig into productivity's nuanced meaning, how organizations can shift from passive data observation to an active improvement mindset, and what defines a developer productivity insights platform.
Transcript
(Disclaimer: may contain unintentionally confusing, inaccurate and/or amusing transcription errors)
[00:00:06] Andrew Zigler: Welcome to Dev Interrupted. I'm your host, Andrew Zigler.
[00:00:09] Ben Lloyd Pearson: And I'm your host Ben Lloyd Pearson.
[00:00:12] Andrew Zigler: Before we actually get started on this week's news, you might have noticed the new icon for Dev Interrupted wherever you're listening to us today. That's right. Dev Interrupted has undergone a, a quick rebrand. We have some new visuals representing us out in, uh, the internet world. So if you see a different icon, a
[00:00:28] Andrew Zigler: header image, even different photos for our guests that come onto the show, just know that's part of our audience growing and evolving, us listening to our community, and updating our visuals.
[00:00:38] Ben Lloyd Pearson: AI is changing everything. It's changed the LinearB brand too. No, I, I joke we've been using the same designs for quite a while, you know, and, and I imagine long time listeners and, and Substack subscribers out there, they've probably noticed, you know, like you mentioned the new icon.
[00:00:54] Ben Lloyd Pearson: We've got a new color scheme on our Substack, just a whole bunch of updated visuals for everything [00:01:00] that we publish. But, you know, don't worry, this is mostly a visual refresh. We just felt it was
[00:01:04] Andrew Zigler: Yeah.
[00:01:04] Ben Lloyd Pearson: time to give a facelift to Dev Interrupted, and just make it a little sharper, a little more 2025, I guess, for lack of a better way to describe it.
[00:01:14] Ben Lloyd Pearson: I'm not a designer,
[00:01:15] Andrew Zigler: Yes. No, we, we did, we did not design it. We entrusted it to the skills of a talented designer who made it, and we're excited. It has a lot of black and white themes. And actually, I just realized today, I'm totally twinning with our new branding 'cause I'm wearing black and white. So definitely check it out and let us know what you think.
[00:01:30] Ben Lloyd Pearson: Yeah. Yeah. And absolutely. And, and you absolutely don't want me designing this either, 'cause it would be the absolute worst design.
[00:01:37] Andrew Zigler: Yeah, no, we just, we talk about engineering news. We don't design podcast rebrands.
[00:01:40] Ben Lloyd Pearson: Yeah, but I wanna point out that, you know, Dev Interrupted and LinearB, we have this very close relationship. You know, LinearB has like this wealth of data and research within the organization that we love tapping into. Whereas on the other, on the Dev Interrupted side, we have this incredible community of engineering leaders. And you know, what we're really trying to [00:02:00] do, like sort of behind the scenes, you know, as this visual refresh is happening, is really to leverage both sides of that equation a little bit better.
[00:02:08] Ben Lloyd Pearson: So, be on the lookout for new types of content from us, from deeper content that gets really deep into research and into the opinions of our experts. You know, I think a really great example of this, and we'll mention this a little bit later, but we've got this really cool workshop coming out tomorrow about how organizations are moving beyond copilot-style AI tools to do more AI-driven software development. We've got some incredible research from LinearB, but then also some really fantastic past guests from Dev Interrupted that are gonna join us to share their expertise on it. So, you know, moving forward, the brand is updated, we're still gonna be the same. The content is mostly gonna follow the same structure we've always followed, but just keep an eye out for additional new content opportunities, new research, and new types of material coming from us.
[00:02:56] Ben Lloyd Pearson: So, there is a little bit changing on the back end, but for now it's [00:03:00] mostly just a visual refresh and we hope you all like it. Let us know what you think on LinkedIn.
[00:03:03] Andrew Zigler: Yeah. And with that out of the way, let's move on to some of this week's industry news, 'cause we have a really cool roundup of some interesting topics people were talking about this week in software engineering. One of them is a recent agentic AI primer doc that dropped from an expert in the field who we've also covered in the past.
[00:03:19] Andrew Zigler: We have an update on how he's thinking about agentic coding. We have an article about why pen and paper is still a developer's best friend. Uh, and to round things out, we're looking at some of the modern forms that shadow IT is taking. So Ben, what do you wanna dive into first?
[00:03:34] Ben Lloyd Pearson: Well, it's a little refreshing to have AI content being in the minority this week,
[00:03:39] Andrew Zigler: That's right.
[00:03:40] Ben Lloyd Pearson: in quite a while. So maybe we just get that outta the way. Let's start there.
[00:03:43] Andrew Zigler: Well then let's dive into this really cool article from Pete Hodgson about why your AI coding assistant keeps doing things wrong and ultimately how to fix it. This is a primer that outlines some of the types of tasks that folks are using agentic tools in a coding environment to, [00:04:00] to work on software, right?
[00:04:01] Andrew Zigler: And using an LLM in your IDE can take a lot of different forms, and the questions that you can ask it really vary depending on your own expertise level and what you're trying to accomplish and how complex your code base is. Um, so this article, it offers a nuanced perspective on some of the more polarized discussions around using AI-assisted coding in this way.
[00:04:21] Andrew Zigler: Is it good, is it bad? And ultimately what it boils down to is it's good at certain things and it's bad at other things. You know, this is, um, asking your LLM to spell strawberry all over again. It's important that you're using your tools in a way that they're best set up to give you success. Um, so he introduced this idea called a constraint, uh, context matrix.
[00:04:42] Andrew Zigler: It categorizes tasks on, on axes. If you've been following our content, we love mapping things out on graphs and axes in terms of adoption. Um, so we're going to include some links in this coverage. Um, I myself have been talking a bit with Pete. We'd love to probably have him on the podcast in the future, so stay tuned, 'cause we're [00:05:00] gonna really dive into the evolution of the agentic coding space.
[00:05:03] Andrew Zigler: Ben, what did you think?
[00:05:05] Ben Lloyd Pearson: Yeah, I've been learning a lot from reading his content, so it'd be, it'd be wonderful to, to bring him on. So one of the things Pete said in this article was that the question of whether or not AI can write good code is actually just a false dichotomy. He does a really good job in this article of breaking down how to modify tasks to bring them into what he's calling like the AI sweet spot. You mentioned this matrix. It actually looks very similar to a matrix we published recently on the Dev Interrupted Substack where we, you know, again, were looking at AI-driven software development.
[00:05:36] Ben Lloyd Pearson: So, yeah, I think it's a really great way to just sort of like think about challenges like this. And I think the whole point really is that AI can write good code, but only when it's given clear constraints and adequate context for the problem that you're trying to solve. He had this really awesome comparison. It was, he compared AI to an engineer with the coding ability of a senior level [00:06:00] engineer, but the decision making skills of a junior developer. In all situations, the AI agents are basically like an engineer that's spending their first hour ever working on your code base.
[00:06:11] Andrew Zigler: Yeah.
[00:06:12] Ben Lloyd Pearson: Every time you open up a new discussion, it's like you just hired a new developer that's really great at writing code, but not so great at, at making decisions. And a lot of the recommendations he has really just resonate with my experience. You know, my favorite approach that he, he outlines in this is, when I'm having a really hard time solving a problem with AI, I like to break that problem down into much smaller chunks that require less context to solve and can be more constrained. So I can give it all the information it needs, 'cause it's a smaller problem to solve, and I can tell it to solve it in a way that is much more controlled and constrained, in a predictable environment. And you know, he, he mentions that like the danger zone for AI is when you have too much implied context and too open-ended of solutions.
[00:06:56] Ben Lloyd Pearson: Like if you're counting on, on a model [00:07:00] to figure out the context on its own and choose from a whole wealth of solutions to the problem, then that's where things really start to fall off the rails.
[00:07:08] Andrew Zigler: I agree. So we'll include the, the link to the article so folks can check it out and share what you think as well about, you know, how you're using these tools.
[00:07:16] Ben Lloyd Pearson: what's our next story, Andrew?
[00:07:17] Andrew Zigler: Yeah. Our, our next one is, uh, diving back into the world of pen and paper and why that's important for developers. This was a, a piece... Yeah, I, I know you do. That's why we made sure it was in here. I knew that you had lots of good feedback on how to use pen and paper successfully. As a developer, as an engineer, as a product-minded person, I saw a lot of
[00:07:35] Andrew Zigler: myself in this article as I was reading it. This comes from Juha-Matti Santala, uh, on his website, reflecting on his experiences as a developer. There's things that really stood out to him about being a developer that I resonated with. Like when you go to write code and you open your IDE, it just feels like all of the creative energy and the thoughts that you had about what you were gonna write just fly outta your brain, right?
[00:07:56] Andrew Zigler: And you're now in, you're in produce mode, you're in write mode. And [00:08:00] the trick with pen and paper is being able to capture those long form creative thoughts in the places where they pop up, which aren't necessarily when you're sitting in front of an editor. It's about having a, a guerrilla note taking style, being
[00:08:14] Andrew Zigler: unafraid to take notes anywhere, everywhere, when they come up, and when your best ideas strike you. Um, I very much work this way. You know, I, Ben, I know you're a big pen and paper person and we're gonna get your scoop on how you use that or why it's important to you. But I'm a digital note taker.
[00:08:29] Andrew Zigler: Before, all of my notes were in things like Trello cards and Notion and Apple Notes and things across all my devices. In the last three years, I've consolidated into an Obsidian vault, where I keep daily notes and write topical notes on things that interest me, things that we cover on this show.
[00:08:44] Andrew Zigler: So many of those discussions grow out of reflections that I put into that note journal. Because I think the most important thing that you can do as a developer, as you're working and expanding how you think about the things you build and the creative outputs that you have, is to [00:09:00] keep your notes in a flat and future-compatible way.
[00:09:03] Andrew Zigler: Um, markdown sits at this perfect intersection in my mind of human readable and machine readable, and it's really good to help augment new tools that come out too. I've been able to take things from my zettelkasten and from my notebook, and just plug my markdown stuff into AI, for example, and it's context ready to go.
[00:09:22] Andrew Zigler: What about you, Ben?
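To make the "context ready to go" idea above concrete, here's a minimal sketch in Python, assuming an Obsidian-style vault of plain .md files. The vault path, the size cap, and the function name are illustrative assumptions, not anything Andrew specifically uses.

```python
# Rough illustration of the "flat, future-compatible, context ready" idea:
# gather plain-text markdown notes from an Obsidian-style vault and bundle them
# into a single block you could hand to an AI tool as context.
# The vault path and the size cap are placeholder assumptions.
from pathlib import Path


def build_context(vault: str, max_chars: int = 50_000) -> str:
    """Concatenate markdown notes, newest first, up to a rough size cap."""
    notes = sorted(
        Path(vault).expanduser().rglob("*.md"),
        key=lambda p: p.stat().st_mtime,
        reverse=True,
    )
    chunks, total = [], 0
    for note in notes:
        text = f"## {note.stem}\n{note.read_text(encoding='utf-8')}\n"
        if total + len(text) > max_chars:
            break
        chunks.append(text)
        total += len(text)
    return "\n".join(chunks)


if __name__ == "__main__":
    context = build_context("~/notes-vault")  # hypothetical vault location
    print(f"{len(context):,} characters of note context ready to hand to an AI assistant")
```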
[00:09:23] Ben Lloyd Pearson: Yeah, instead of markdown, I prefer my chicken scratches that are only readable by me. No one else can read them. But seriously, I do all of my weekly planning and meeting note taking, and much of my planning and design decisions all happen with, you know, I say pen and paper, but today it's now a digital writing device.
[00:09:43] Ben Lloyd Pearson: Like I have a, a tablet that's specifically for handwriting. And, and I, you know, honestly, I think it's just, it's much better for long-term knowledge retention, and that's something that is just really hard to come by in this day and age. Like we, we've all, I, I think we all struggle with attention spans [00:10:00] these days, partly because of the social media era, but now we're in the AI era and I feel like it's almost compounding the challenges that we face just focusing on things that aren't on a screen in front of us. But I also, I also love it because I like any activity that kind of forces you into like a single threaded mindset. When you're sitting at a computer, there's so many different things in front of you that you could immediately like attach to and, and do something with.
[00:10:27] Ben Lloyd Pearson: And it's really easy for your attention to sort of get fragmented across like multiple tasks, or multiple components of a problem that you're working on, or conversations, or things that you're reading. And this author also mentioned that he likes taking the notebook and pen on walks with him.
[00:10:45] Andrew Zigler: Yeah.
[00:10:45] Ben Lloyd Pearson: And I think that's a great idea. My personal choice is I love trail running in a very similar manner, you know, getting out in the woods and just being away from everything for a moment, even in the middle of the day. So obviously I can't bring my notebook for that, but, uh, but I [00:11:00] think the critical thing is that you need these regular breaks from the high functioning knowledge work.
[00:11:05] Ben Lloyd Pearson: If all you're doing is trying to do like this really high functioning effort or high functioning work, you're not giving your mind the space that it needs to break out of the minutiae of completing those tasks to think about the bigger picture. And I think this article in general is just a great reminder to think about practices that improve mental functioning, mental clarity, uh, you know, and even mental health. You know, there's a lot of ways to accomplish this, but I think practices like this are a really great way to just sort of break you out of that mindset that you get stuck in when you're busy producing.
[00:11:40] Andrew Zigler: I agree. And so we wanna hear from, you know, y'all, our listeners, about what are the mindful practices that you do to help yourself stay focused and, and protect that creative energy. Maybe you can settle the virtual note versus physical note debate between us, 'cause I know we're both firmly in our own camps.
[00:11:55] Ben Lloyd Pearson: Yeah. Yeah. It, it blows my mind that my handwriting is so bad with how much I [00:12:00] actually like to do it. But,
[00:12:00] Andrew Zigler: Yeah. Well, let's round things out with our last story here. So, this is an article on LeadDev by Kelli Korducki. I really loved this piece. I had to pick it up and include it in here for y'all. And it talks about a form that shadow IT takes that maybe is not talked about as much as traditional shadow IT. And just to back up for a second.
[00:12:18] Andrew Zigler: Shadow IT as we know it is, it's when, you know, you work for a company or you're on a team and you're using a tool or a software that's not sanctioned by your company, or maybe it's outside of budget or outside of scope. That could be for financial reasons, for security reasons. It could be for any number of reasons, really, and shadow IT happens when folks use those tools anyways.
[00:12:39] Andrew Zigler: We saw this, uh, really take a, a new form when AI came on the scene. You had shadow AI, where folks were using AI or LLMs outside of their company's sanctioned methods, or maybe their company was forbidding it, 'cause many did that at the time. And so people were using these tools without supervision or approval from those above [00:13:00] them. And another format of shadow IT that this article covers is folks using unsanctioned workflows or working in different toolings or processes outside of what's established.
[00:13:11] Andrew Zigler: So this kind of veers into a more nebulous realm. Maybe they are using a tool that's not approved or established, or maybe they're using a tool that's approved, but they're using it in an unintended way to short-circuit a process that maybe adds a lot of baggage to them or makes their day-to-day work really hard.
[00:13:28] Andrew Zigler: And systems like this pop up when you have disconnects between leaders and the doers, right? So that was a really interesting angle that this article dived into. I know, Ben, you've experienced some of this before in past roles. What do you think about this perspective?
[00:13:43] Ben Lloyd Pearson: Yeah, I can, I can relate very well to this article because back in a past life when I worked in platform engineering, one of my responsibilities was maintaining shadow Dev tools. Funnily enough,
[00:13:56] Andrew Zigler: And that's funny.
[00:13:57] Ben Lloyd Pearson: some of them did eventually become [00:14:00] officially sanctioned, but many of them were either unofficially sanctioned or not sanctioned.
[00:14:05] Andrew Zigler: Oh, so you were like the keeper of the forbidden tools. You were kind of like the guy with the trench coat that had like the software in it no one could use.
[00:14:11] Ben Lloyd Pearson: was, I was the one with the,
[00:14:13] Ben Lloyd Pearson: credentials that our centralized IT department didn't know about.
[00:14:16] Andrew Zigler: You know, we laugh, but this is like a, this is actually a real thing that happens in so many organizations. I'm sure many folks agree.
[00:14:22] Ben Lloyd Pearson: And I mean, the reason, the reason this happened is because in our case it was either, it was either we roll our own tools, or there were aspects of our jobs, like within our development workflows, that would just be literally impossible for us to complete. Like one really great example: I worked with a bunch of open source engineers, like people who contribute to open source projects, and, you know, if you're sending code to your collaborators out in the open source space through email, and that email has like a little footer at the bottom that says everything in this email is the intellectual property of the company that that developer works for.
[00:14:56] Ben Lloyd Pearson: Like, that's just not compatible with the community that you're working [00:15:00] in. So what do you do? You spin up your own email server that doesn't have those footers, you know? And I think like the, the lesson that I want to just, just take away from this is that engineering leaders kind of have this choice.
[00:15:13] Ben Lloyd Pearson: Either you can institutionalize the tools that your developers need to standardize their adoption and usage, or you're gonna have your developers adopt those tools on their own, and potentially without your knowledge. So if you have a platform engineering function, or a DevOps or DevEx team, or engineering enablement, like these are the types of people that should be finding where tooling isn't matching the, the needs and the expectations of your developers and, and making it match those needs and expectations.
[00:15:42] Andrew Zigler: So, Ben, what's coming up next?
[00:15:45] Ben Lloyd Pearson: Yeah. So, uh, this week I am sitting down with Dan Lines and we're gonna discuss the future of developer productivity, developer experience, and get a little bit into AI as well. This is actually a, a bigger discussion. It was a very long [00:16:00] discussion that he and I had that we ended up breaking into two episodes. This first half is gonna focus more on the developer productivity, developer experience, and we'll transition into, uh, the AI stuff for next week. So stay tuned after the break. It's a great discussion.
[00:16:16] Andrew Zigler: Okay. Quick interruption, because I'm hosting a workshop tomorrow. That's right, Beyond Copilot: What's Next for AI and Software Development is going down June 4th, and yours truly will be hosting it live. We have an absolutely stacked 35 minute panel with past guests from the pod, like Adnan Ijaz from Amazon Q and Birgitta Bockeler from ThoughtWorks, along with Suzie Prince from Atlassian.
[00:16:40] Andrew Zigler: We're digging into how top teams are pushing past Copilot, testing out agentic AI, measuring real world impact, and actually improving their developer experience. And don't worry if you can't make it live. Everyone who registers gets the full recording plus access to an accompanying guide. It has tools, prompts, insights, basically [00:17:00] everything you need from our recent 2025 AI Impact Report.
[00:17:04] Andrew Zigler: So what are you waiting for? Check out that link in the show notes and let's hang out tomorrow. I'll see you there.
[00:17:12] Ben Lloyd Pearson: Hey everyone, welcome back to the show. Today we're talking about a major shift in the way that engineering organizations think about productivity, and I've got the perfect guest to dig into it. You all already know and love him. Joining me today is Dan Lines, founder of LinearB and fellow host here at Dev Interrupted. Dan, it's really great to be across the mic from you once again.
[00:17:36] Dan Lines: what's up Ben? Yeah, it's awesome to be on here with you.
[00:17:39] Ben Lloyd Pearson: Yeah. So if you've been following the evolution of developer tooling and engineering intelligence, you probably know that LinearB, the company that, that Dan founded and that I work for, has really been at the center of it. But things are changing everywhere when you look around. AI is taking over software pipelines, and with those changes, you know, we've made [00:18:00] some shifts to how LinearB approaches the challenges that engineering teams are facing today.
[00:18:05] Ben Lloyd Pearson: So, Dan, we've got a lot to talk about today. I want to unpack what's changing at LinearB for the Dev Interrupted audience and explain what's been happening behind the scenes. So let's just start with the big picture. What has been changing around this space of software engineering intelligence and developer productivity?
[00:18:25] Dan Lines: Yeah, there is a lot to unpack here, and I think, in order to understand some of the great changes that are happening with LinearB, maybe we start with some of the changes that are happening in the world. You mentioned AI. I don't know if you mentioned productivity, but these two things tie together.
[00:18:44] Dan Lines: How are we gonna be more productive? What's happening with AI? And also, what, what's happening for our customers? So let me start with what I'm seeing. For our customer base, for developer experience teams, for CTOs, for VPs, and for [00:19:00] CEOs, we have now entered into this urgency period to produce productivity.
[00:19:06] Dan Lines: That's what I'm hearing out there. So when I go and talk to a CTO or a VP of engineering, or a head of, like a director of, developer experience, what they're saying back to me is, Hey Dan, we have this extreme urgency coming from our business onto us, and they're saying to us, you need to be more productive. You need to produce more.
[00:19:28] Dan Lines: And we can get into what productive actually means, but that's the urgency that's coming onto them. And specifically, I hear three questions that are being asked of our customer base, and I, and I'll say what they are. The first question that they're being asked is, what is your strategy to deploy AI in terms of productivity gain? Not just deploy it anywhere, but how does it actually help us be more efficient or improve our quality, or whatever it is?
[00:19:59] Dan Lines: That's the first [00:20:00] question. The second question that they're being asked is, okay, assuming that you have a strategy for AI and you're deploying, how are you measuring the impact of it? So they're on the hook for that. And then the third thing comes back to either cost or, Hey, what are you doing differently with the workforce?
[00:20:21] Dan Lines: If you have this strategy right now to deploy a bunch of AI, or we bought a ton of Copilot licenses, let's say, how does that impact our workforce? And so I think that's the biggest shift that we're seeing right now in the industry. And with us being, you know, CTOs, VPs, DevEx leaders, that's what's being asked of us by the business now.
[00:20:45] Ben Lloyd Pearson: I love the, the idea of this urgency to be productive. You know, I think everyone is feeling this pressure, particularly with the advent of AI-driven tools in software development, there's this pressure, and a lot of hype, right? Like [00:21:00] there's this hype that we're gonna be enabling these like 10x productivity workflows, right?
[00:21:05] Ben Lloyd Pearson: And so, I think there really is sort of a lot of confusion in the space around like what it's gonna take to be more productive in this environment. So.
[00:21:14] Dan Lines: absolutely.
[00:21:15] Ben Lloyd Pearson: Yeah. So, and one of the big things that, you know, has sort of been central to LinearB's existence really from the very start is engineering intelligence.
[00:21:25] Ben Lloyd Pearson: Like this idea that you need to be able to measure the productivity of your organization, measure the developer experience, show how engineering impacts the business bottom line. But let's talk for a moment about why you, and us here at LinearB, you know, view that as not being enough anymore.
[00:21:44] Dan Lines: Yeah, exactly. So yeah, I mean, six years ago, five years ago, just having software engineering intelligence, meaning having a bunch of metrics, having a bunch of surveys, having a bunch of information coming to you, being the [00:22:00] CTO or, like, the head of DevEx, that used to be enough, but that's no longer the case.
[00:22:05] Dan Lines: And the reason that it's no longer the case, I kind of think about it like this: think about, like, the smartest person you know, and, and there are some folks out there like this, like the smartest person I know kind of knows everything, but that doesn't necessarily mean that they are capitalizing on that knowledge.
[00:22:24] Dan Lines: Meaning they're super, super smart, but they might be a little bit lazy. They're not applying their knowledge. And so I think about it the same way for, for software engineering intelligence. It's amazing to have all of this information, again, whether it's a bunch of metrics or a bunch of survey results.
[00:22:42] Dan Lines: That's okay, and an okay start, but the difference is we now need to take that information and we need to apply immediate actions. And for LinearB, that's gonna be our AI automations, that's gonna be our gitStream technology, all of that, we can [00:23:00] get into it, but I need to take all of that information and I need to
[00:23:04] Dan Lines: apply it in a way that I can show my organization is being more productive right now. And I think that's the difference from, you know, five years ago to the urgency that there is today. We can't, we are not, at least at LinearB, just stopping with metrics, saying, Hey, here's all the metrics in the world that you could possibly have.
[00:23:23] Dan Lines: No, we're not stopping there. We're now saying, of course, you get all of the metrics and all of the survey results. But here are the next steps that you're going to deploy into your SDLC to actually capture productivity gain. That's the difference.
[00:23:38] Ben Lloyd Pearson: And you know, we've been throwing this word productivity around quite a bit, and it's a very complicated term for sure. And it brings out a lot of different opinions about what it means and, and what you should value when you think about developer or engineering productivity. So let's think about where, you know, here we are in 2025, [00:24:00] like, for engineering organizations that are adapting to this modern AI-driven way, what does productivity actually mean in that environment?
[00:24:10] Dan Lines: Yeah, you're right. It is nuanced. It needs to be described. So let, let's try to break it down, and I, I think the best way that we can break it down is by role: what productivity means to a CTO or a VP of engineering, I think, is a little bit different than what productivity means to a developer or a developer experience team.
[00:24:34] Dan Lines: So when I hear a CEO or a CTO or a VPE or someone in the C-suite talk about productivity, what they're looking at is how much value can I output versus how much does it cost me? So, value to cost. Now, for an engineering organization, their job typically is to produce value [00:25:00]
[00:25:01] Dan Lines: back to the business and to the organization. And they have a workforce, and they also have tooling, and they also have everything else. And they have a cost that's associated with that value. And what they're being asked to do typically is, Hey, maintain the costs that you have today. We're not gonna increase your budget.
[00:25:24] Dan Lines: And we're probably not gonna increase your workforce, but I need to see more value coming out from the engineering organization. And you could say the value is, and we, we have a bunch of ways to measure this within LinearB, but you can measure new value, you can measure value enhancements, you can measure stories delivered.
[00:25:43] Dan Lines: You can measure, you know, pull requests that have been delivered. But at the end of the day, it's, I need more value per cost. That's the way that a VP or a CTO is looking at this. Now, the way that that translates then for the developer [00:26:00] experience team and for developers: the developer experience team is on the hook to now figure out how to actually be hands-on and deliver more value.
[00:26:08] Dan Lines: Tough job. They're serving developers, and the way that I look at it there is, okay, I cannot ask my developers to work any harder. They're already working a ton. We're not gonna get more value there. If I came back to you, Ben, and said, Hey, you're already working eight to 11 hour days, Ben, I'm gonna need you to work the weekend.
[00:26:30] Ben Lloyd Pearson: Yeah.
[00:26:31] Dan Lines: that's not gonna work.
[00:26:32] Ben Lloyd Pearson: Not gonna work.
[00:26:34] Dan Lines: So the way that value or productivity is provided then for developers is taking some job off of their plate that they're doing, and probably usually doing manually, without sacrificing any quality. So you used to be doing, for example, Ben used to have to do X amount of code reviews per week.
[00:26:55] Dan Lines: I'm going to replace part of that time with an AI code [00:27:00] review. Maybe you're a team leader. You used to be leading retrospective ceremonies that take a few hours every two weeks. I'm gonna cut that time in half because I'm gonna provide all of the information that you need for that ceremony. Those types of things that are actually taking work, or a piece of the job, off of a developer's plate, that's what productivity increase means.
[00:27:24] Ben Lloyd Pearson: Yeah. And I think we encounter this same scenario quite often in this story, where these teams, whether you're at the executive level or more at the developer level, they get metrics in their hands and then they just ask, what next? You know, like, great, I have this data, but what next?
[00:27:42] Ben Lloyd Pearson: Because this is very passive. It's not, it's not an active improvement. And I like how you've already illustrated a couple of examples of how engineering organizations can then start to think about productivity improvements, but describe what an organization goes through as they switch from like this passive observation into more [00:28:00] of an active improvement mindset.
[00:28:03] Dan Lines: Yeah, absolutely. I think you described it pretty well. Like, a passive mindset is, I am looking at a bunch of metrics and maybe I'm seeing some problems, but I'm having a hard time taking the next step of what to do with the data or the survey results. An active platform, and what we like to talk about at LinearB is, hey, we have this active productivity platform.
[00:28:26] Dan Lines: The platform itself will actually tell you, Ben, these are the automations that you need to deploy next in order to fix this bottleneck problem that you have. So for example, if you were working with the LinearB software, it will actually say to you, Hey Ben, we noticed in these exact repos that there seems to be a bottleneck in your review process, or these services or these teams.
[00:28:53] Dan Lines: If you deploy this set of automations, and the automations may look like, Hey, you should have an AI PR [00:29:00] description, you should have an AI code review. Areas where you have Dependabot raising a bunch of PRs, we should automate the merge process, for example.
[00:29:10] Ben Lloyd Pearson: Right.
[00:29:11] Dan Lines: Click this button and deploy automations so that you can actually gain productivity.
[00:29:15] Dan Lines: That's what an active platform looks like from our perspective.
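As a rough illustration of the kind of merge automation Dan describes for Dependabot PRs, here's a minimal Python sketch against the GitHub REST API. To be clear, this is not LinearB's gitStream, which expresses rules like this declaratively; the repo names, token handling, and the "checks are green" gate below are simplified assumptions for the sake of the example.

```python
# Illustrative sketch only: auto-approve and squash-merge open Dependabot PRs
# whose combined commit status is green, using the GitHub REST API directly.
# OWNER, REPO, and GITHUB_TOKEN are placeholders you would supply yourself.
import os
import requests

OWNER, REPO = "your-org", "your-repo"  # placeholder repository
API = f"https://api.github.com/repos/{OWNER}/{REPO}"
HEADERS = {
    "Authorization": f"Bearer {os.environ['GITHUB_TOKEN']}",
    "Accept": "application/vnd.github+json",
}


def merge_safe_dependabot_prs() -> None:
    """Approve and squash-merge Dependabot PRs that have a passing combined status."""
    prs = requests.get(f"{API}/pulls", headers=HEADERS, params={"state": "open"}).json()
    for pr in prs:
        if pr["user"]["login"] != "dependabot[bot]":
            continue  # only touch dependency-update PRs
        status = requests.get(
            f"{API}/commits/{pr['head']['sha']}/status", headers=HEADERS
        ).json()
        if status.get("state") != "success":
            continue  # simplistic "safe change" gate: skip anything not green
        requests.post(
            f"{API}/pulls/{pr['number']}/reviews",
            headers=HEADERS,
            json={"event": "APPROVE"},
        )
        requests.put(
            f"{API}/pulls/{pr['number']}/merge",
            headers=HEADERS,
            json={"merge_method": "squash"},
        )


if __name__ == "__main__":
    merge_safe_dependabot_prs()
```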
[00:29:20] Ben Lloyd Pearson: And I think what we're really illustrating is, like, you know, LinearB's role in helping organizations move from like a metrics-oriented view into, like, productivity mastery in some sense. For our audience out there, who's maybe like a DevEx leader, what advice would you give to them on being more proactive, on acting on insights rather than just looking at them, for example?
[00:29:44] Dan Lines: You know what, I can just give kind of our perspective, like the developer experience leaders that we work with. What we talk to them about is guaranteeing hours returned to the developer. That's their job. The way to improve [00:30:00] developer experience is saying, Hey, this week I need to guarantee X amount of hours back to all of the developers that I'm responsible for.
[00:30:10] Dan Lines: So my advice to a DevEx leader is to think in that mindset: what am I doing this week? And of course, you know, it's going to be with AI and automations. What is my strategy with AI and automations to actually return hours back to developers? And if I have a path, and this is what we do at LinearB, we guarantee a path for improvement.
[00:30:33] Dan Lines: Now I know I'm on the right track. On the flip side, if I am a developer experience leader and I cannot guarantee a path for hours returned, I'm probably kind of stuck in that, uh, looking at metrics or looking at survey results, and I'm not actually moving my roadmap forward. So that, that would be my main advice.
[00:30:55] Ben Lloyd Pearson: Yeah. And if they, and if they were to plug LinearB into their stack tomorrow, like what do you think [00:31:00] is the first aha moment that they would encounter?
[00:31:03] Dan Lines: There's two, two that I see usually. The first aha moment is, they'll say, whoa, I didn't realize I had so many hours that I could save in this certain area, in this set of repos and these services. 'Cause with LinearB, it's gonna put it right in your face: Hey, here's an area where you can actually get hours returned.
[00:31:24] Dan Lines: Maybe it's approving safe changes for pull requests, for example. Hey, uh, it looks like your team is doing a lot of looks-good-to-me and really not doing a code review in this area. We could save hours right there. But then I, I think the second aha moment is more so, okay, you're telling me that I can click a few buttons and resolve that, that I can deploy automations?
[00:31:47] Dan Lines: Yes, you can. So it's those two, combined with the LinearB platform.
[00:31:52] Ben Lloyd Pearson: Yeah, and, and I really like how we've taken this focus on finding aspects of the developer's life where there's [00:32:00] some, some level of toil that's introduced to it, right? Like, and I think the simplest example is our new AI-powered PR descriptions, which we'll get into, but, you know, you think about how long it takes to actually write a PR description. If you're lucky, it's like a five minute task. If you're not so lucky, it's maybe a 20 minute task or even longer sometimes. And it doesn't really add value to your work. It's something you have to do to help your team, but it's not value-add work.
[00:32:27] Ben Lloyd Pearson: So like, I, I know particularly for developers, when they can see that you can just open a pull request and not have to worry about this moment of toil, that really leads to some profound rethinking of our workflows. Like, I think developers start to think, what else can we do that to, you know, within our workflow?
[00:32:44] Dan Lines: Yeah, I mean, like, the, okay, what's good about the PR summary? If you haven't been thinking in terms of how am I gonna return hours back to my developers, it's really easy to think about. That's why I think it's probably a good place to start, or why you brought it up. And [00:33:00] you're right. Let's say that you have a, uh, developer, English is their first language.
[00:33:05] Dan Lines: For some reason, they're really, really good at writing out descriptions of what they did in code changes. Maybe it would take 5 minutes. Now, in a situation, and a lot of the companies that we work with, that we're working with, are global companies, there's teams all over the world.
[00:33:20] Dan Lines: Sometimes English is their second language, so that might be a 15 minute experience that you can give back to each developer for every PR, 'cause writing that PR description is cumbersome and maybe the least fun part of the day. The most fun part of the day was coding and actually getting the PR up.
[00:33:38] Dan Lines: The least fun part of the day was trying to describe what you did in English. And so even just right there, if you say, Hey, I'm gonna start with something simple, I know I can give 5 to 15 minutes back for each developer that's raising a PR this week, that's a really concrete way to start. And I think it goes back to the question that you asked earlier of, Hey, what is the mindset [00:34:00]
[00:34:00] Dan Lines: of, like, kind of the best DevEx leaders. I think that's the mindset: what am I gonna do next to get a few minutes back? And those few minutes back add up over a year period of time, and that then translates to real productivity back to the organization.
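For a sense of how those minutes compound, here's a quick back-of-the-envelope calculation. Only the 5-to-15-minute range comes from Dan; the PR volume, working weeks, and team size are illustrative assumptions.

```python
# Back-of-the-envelope math for the "minutes add up over a year" point.
# All inputs except the 5-15 minute range are illustrative assumptions.
minutes_saved_per_pr = 10      # midpoint of the 5-15 minute range Dan mentions
prs_per_dev_per_week = 4       # assumed PR volume per developer
working_weeks_per_year = 48    # assumed
developers = 200               # assumed org size

hours_per_dev = minutes_saved_per_pr * prs_per_dev_per_week * working_weeks_per_year / 60
org_hours = hours_per_dev * developers
print(f"~{hours_per_dev:.0f} hours per developer per year, ~{org_hours:,.0f} hours org-wide")
# -> ~32 hours per developer per year, ~6,400 hours org-wide
```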
[00:34:14] Ben Lloyd Pearson: And a phrase that I've been hearing a lot around the LinearB halls recently is this idea of providing a control plane for developer experience. So would you help me unpack that for a moment? Like what would you view as like the ultimate control plane for developer experience?
[00:34:31] Dan Lines: Yeah, I think there's three parts to what a control plane is for a developer experience team. The first part is understanding what you're going to do next and why you're going to do it. So if you think about being a DevEx leader, you might have a lot of options for how you think you can contribute a better experience, and therefore productivity gain, back to the organization.
[00:34:58] Dan Lines: But sometimes it's hard to [00:35:00] pinpoint where and what you should do. So if you're thinking about a platform like our platform, the LinearB platform, it will actually tell you, Hey, here are the first three things to do to get the most bang for your buck. You can help with these, let's say five services because you have the most to give back to the developer experience team.
[00:35:21] Ben Lloyd Pearson: Yeah.
[00:35:22] Dan Lines: So that's the first part of the control plane, kind of navigating what to do. The second part is, are you actually able to take action? So that's not getting stuck in survey results or, or metrics. The second part, I think, of a great control plane is actually being able to push a button that says deploy.
[00:35:41] Dan Lines: Deploy AI here, deploy these sets of rules there, deploy these automations there. And then the third part of a good, that's more like the management and deployment side. And the third part of it is actually looping back and saying, did the deployments that I just did actually return [00:36:00] those minutes and hours back to the organization? And can I, can I see a graph of that, hopefully increasing over time?
[00:36:07] Dan Lines: That's what a great, I think control plane looks like for a DevEx team.
[00:36:11] Ben Lloyd Pearson: Yeah. And, and I sometimes get the sense that DevEx is often treated more like a vibe rather than like a strategic investment, you know? And, and I think probably what you just described is a part of that. But do you think there's more to that? Like, why is it that teams don't always view developer experience as something that needs to be invested in, particularly at like the enterprise level?
[00:36:34] Dan Lines: I think a few years ago, or maybe outside of 2025, developer experience teams maybe were thought of more as a vibe, or went on, as, as a vibe, or acted on vibes, and, and that is gonna be seen by the business, unfortunately, as a nice-to-have. Hey, we'd love to have a developer [00:37:00] experience team if we could, if there's a surplus of funds.
[00:37:03] Dan Lines: That's like a, a vibe, uh, style team. I think the difference now, and this is good for DevEx teams, is moving from vibe to I can prove productivity output back to my CEO. And you go from having like a nice to have team to, whoa, this team is super important for hitting the goals of the, of the CEO.
[00:37:30] Dan Lines: Let's say that's where you wanna be. Like, imagine coming to like a meeting with your CEO and saying, Hey, can I just show you what we did over the last six months? Here's how many hours we returned back to the organization. And by the way, we did it with AI deployment and I can measure every step of the process and it's managed.
[00:37:48] Dan Lines: Would you like to give me more funds so I could do this more? They're gonna say yes.
[00:37:53] Dan Lines: As
[00:37:54] Dan Lines: opposed to, Hey, we kind of like asked about the vibe of how developers, and it seems like it's [00:38:00] okay and some improvements, and that's what we, we learned. That's that's more nice to have.
[00:38:06] Ben Lloyd Pearson: I mean, it, it feels easy to say, but what, what do you think are the biggest blockers that organizations are facing to actually making that their reality?
[00:38:14] Dan Lines: Great question. I think what's happening, and, and maybe especially for like the larger organizations, is they're lacking a concrete strategy to deploy AI and automations to receive that productivity gain across their 5,000, 10,000, 60,000 repos in a controlled way. That's what I see the most. And with LinearB, we actually provide that control with gitStream.
[00:38:46] Dan Lines: It's a programmatic language that gives them the confidence to execute on their roadmap and prove the gain back to the business. I actually think that's the biggest blocker.
[00:38:58] Ben Lloyd Pearson: And, and then in terms of [00:39:00] like actually making improvements, like why do you think that so many leaders like struggle to drive DevEx improvement within their organization?
[00:39:10] Dan Lines: It's getting stuck in the quantitative and qualitative information gathering. It kind of goes back to what we were saying in the beginning. Usually they might get stuck in the data or they might get stuck in the feedback from the developers. And even if you have a ton of intelligence or knowledge, it doesn't mean you know exactly what to do to take the next action to ensure the biggest gain, and the biggest gain in experience, back for your organization.
[00:39:38] Dan Lines: I think it's going from that step one to step two where I see them kind of getting stuck in the mud a bit.
[00:39:45] Ben Lloyd Pearson: Yeah. Awesome. Well, Dan, I want to thank you for coming in today to share your opinions on developer productivity, developer experience, and the future of engineering intelligence. And to our loyal listeners, thank you for making it this far. If you've enjoyed [00:40:00] this episode, subscribe, share, and check out our Substack for more content. Dan, thanks for joining us today. Stay tuned for part two of this conversation coming up here in a future episode.