The debate on measuring developer productivity has arrived - and it’s here to stay. 

On this week's episode of Dev Interrupted, co-host Conor Bronsdon welcomes LinearB co-founder & CEO Ori Keren and Kelly Vaughn, Director of Engineering at Spot AI, to weigh in on a debate that has captured the attention of the engineering community: can you measure developer productivity?

Consulting giant McKinsey published an article that ignited a firestorm, prompting industry leaders Kent Beck and Gergely Orosz to counter with a detailed two-part response in The Pragmatic Engineer.

Believing the industry to be at a crossroads, Ori and Kelly combine forces to offer their perspective on the debate, sharing why it’s an opportunity for dev teams everywhere to “roll out metrics the right way.”

Episode Highlights:

  • (4:00) Initial reaction to the article from McKinsey
  • (12:30) Highlights from the Orosz & Beck response
  • (14:30) What Orosz & Beck missed
  • (23:30) Opportunity to educate the community
  • (27:30) What's best for the engineering community
  • (33:30) How to have this conversation with CEOs & CFOs
  • (38:00) Why this discourse is "exactly what we needed."

Episode Transcript:

(Disclaimer: may contain unintentionally confusing, inaccurate and/or amusing transcription errors)

Conor Bronsdon: Hey everyone, welcome to Dev Interrupted. This is your co-host, Conor Bronsdon, and today I'm delighted to be joined by two of my favorite guests.

LinearB co-founder and CEO, Ori Keren, and Kelly Vaughn, Director of Engineering at Spot AI. Ori, Kelly, welcome back to the show.

Kelly Vaughn: Yeah, thanks for having us. I'm particularly excited to be doing this with Ori because we were talking a little bit earlier that we've listened to each other on this podcast multiple times, but this is the first time we've ever actually been able to do this together.

Ori Keren: Same goes here. Excited to be here and to be part of this panel.

Conor Bronsdon: It was the same thought I had coming into this. I've talked to each of you separately about a lot of these topics I know you're passionate about, in fact on shows you've both been on, and Ori's been like, oh yeah, Kelly said this thing, and I was like, oh yeah, she recommended a book to me. And Kelly, you've said, oh, this thing around developer productivity, around engineering metrics, I heard Ori say this, that was really interesting.

Thank you, Kelly. So it is really great to have you both actually on the same episode; we need to do more of these. And the impetus behind this episode is a controversial one, actually. There was an article released by McKinsey titled "Yes, You Can Measure Software Developer Productivity."

And it immediately ignited debates within the software development community, with Kent Beck, software engineer and creator of Extreme Programming, calling the report so absurd and naive that it makes no sense to critique it in detail. That said, he did to some extent go on to critique it in detail, after Gergely Orosz, the mind behind The Pragmatic Engineer, catalogued what was happening and reached out to Kent to join forces.

The two of them wrote a two-part series in response to the McKinsey piece, definitely worthwhile reading as well. We'll link all these articles in the show notes. And as I alluded to at the top of the show, you're both very passionate about this topic. Ori, you've obviously dedicated much of your life, including founding LinearB, to many of the principles discussed in this debate.

And Kelly, you are one of the most vocal and outspoken engineering leaders I know. We've talked about this topic multiple times. If there's a cause to stand behind what's best for engineers, the two of you stand there ten times over. You truly have what's best for the community in mind.

You're both authorities on the subject. And we want to start by asking you both, and Ori, we'll kick off with you: what were your first impressions as you were reading that McKinsey article?

Ori Keren: Yeah, the initial thought was great. It's great that people, more people care about this.

More people want to express their opinion and want to participate in this debate. I think it's an important space where we need multiple views. And actually, reading through it, the first parts made sense. The motivation was there, and they reference DORA, which is like the de facto standard now for some of these things.

I also think McKinsey has the advantage of coming in and representing, because we've all been there, the voice of the CEO and CFO. That voice exists. We can ignore it, or we might not want to listen to it, but it exists in this conversation. So those are the parts that I liked. But the parts where it started breaking for me were where they started talking about individual metrics for developers. That's the area where I feel strongly that you cannot use a system, or definitely not the same system, to measure individual metrics.

It's almost like creating an autoimmune system that attacks its own culture. So at the high level, again, I like the motivation, and I like that more people want to express opinions, but those are the areas where it started breaking for me.

Conor Bronsdon: Kelly, I see you nodding along to that.

Kelly Vaughn: Yeah, no, I love the autoimmune analogy. That is so incredibly true. I can echo that same sentiment. When I started reading it, I'm like, okay, this is a very hot topic. Why are people so upset about this intro?

Oh, I'm just in the intro, I should say. This seems fine right now. But same exact thing: it started breaking down for me as soon as we started digging into the individual. Because I've seen it time and time again, and it does not work. And then I got a good laugh once I got to the very end of it, which we'll probably dig into eventually here, around letting developers do what they do best, as in avoiding the non-coding activities. I have a lot of opinions on that one as well.

Conor Bronsdon: Kelly, it sounds like you have a strong opinion around this "what devs focus on" piece. I want to talk more about some of the things the two of you alluded to: this autoimmune response we're talking about and how it can be dangerous, and the fact that CEOs and CFOs do want this and we have to acknowledge that.

So why don't we just jump in there?

Kelly Vaughn: Yeah. Talking through the example, let me, I can read it word for word so I don't fumble exactly what it said here. So it said, for example, one company found that its most talented developers were spending excessive time on non coding activities, such as design sessions or managing interdependencies across teams.

In response, the company changed its operating model and clarified roles and responsibilities to enable those highest value developers to do what they do best. When I read that, I actually laughed out loud. Because, okay, the best engineers I work with are cross functional. The best engineers I work with understand those interdependencies.

They understand how their work impacts the business, how their work impacts the others who are also doing work in the same general space. And if you focus entirely on what you're doing as far as just coding, you're going to miss so much context. And that's how we start to go down rabbit holes of we're introducing features that the customers don't actually want, for example.

Or we're introducing something that somebody's already working on. Or now you're introducing two very different types of patterns of coding into the same repo, where you could have avoided doing that had you had these interdependent conversations. So I'll pause there for a second, Ori, I'm curious what your take is.

Ori Keren: I totally agree, and I would add another thing. I think the best engineers I've worked with are the best readers of code, which is a very rare quality to have. And one of the best things they can do to contribute to a team (and we should talk about the sports analogy of how a team works, because, again, I love sports) is exactly that: the best developers can read code.

They have the patience to read code. They've seen code. They can do the best reviews. So yeah, it started to feel a little bit like a one-threaded thought, one thing in mind there, because yes, the best developers can do a lot of things, and if you just let them code, you miss some of their greatest qualities.

Kelly Vaughn: Completely agree. I would be terrible, I would actually be absolutely awful, if I were the developer on the receiving end of this and all I was told to do was just code. Because where I thrive is on the people and process side of things. And you would miss all of my greatest qualities, all of my terrible jokes you would never get to hear, because I'm just so focused on coding.

Conor Bronsdon: Ori, based on what Kelly's saying, do you think that same challenge would face these top engineers that you've known over the years?

Ori Keren: Yeah, absolutely. So I think, like we said, the top engineers bring many more capabilities and qualities than just coding, and they have this overview of the system.

They can be the best reviewers, the best system designers. I also think there are a couple of other things I saw there that caught my attention. One was the inner loop and outer loop that we all know developers have, where the deployment part was placed in the outer loop.

And we say now: you build it, you ship it, you own it. And I think, again, in the utopia, in a perfect process, you build the thing, you code it, you test it, and if you can ship it right away, it's still part of your inner loop. So I think that's also something they missed with the outer and inner loop.

And the last thing that caught my attention there: there's a lot of buzz around GenAI and what it brings to the table, and it's a topic I'm really passionate about. I think one of the things the article said is that it was proven that developers can go 2x faster.

Now, what I really want to emphasize here is, let's say you can even stream code 10x faster, right? If your pipes, your development pipes, are still the same pipes that exist today, the old narrow ones, and you don't enable improving your development pipeline, you won't move 10x faster.

The code is still going to be stuck there. So that's an area I'll be happy to develop later on in this conversation.

Conor Bronsdon: Kelly, were there other key points of the article that stood out to you that you want to highlight?

Kelly Vaughn: I do think the inner/outer loop piece, I went back and read that again after I went through the article the first time, because that was a little bit surprising to me as well: the idea that all of these quote-unquote outer loop activities are more of a nuisance than a necessary piece of it.

There was also a section in here, and I don't remember exactly where it was, but I think it was in relation to this, where it almost came off as downplaying the roles that do manual QA, the roles around the deployment processes. These are important roles for a reason, and they should be treated as such, not as this extra task that you just have to do.

Conor Bronsdon: To both of your points here, I think there is this consideration of the software development lifecycle sometimes that overly focuses on just, oh, the coding part, and not the: how do you get the code into production? How do you QA it? How do you make sure it's getting reviewed by the right people?

How do you ensure that it fits within the context of what your team needs? How do we ensure it fits within the context of the business needs? I know I've talked to you both individually about elements of this, and frankly it's part of why CEOs and CFOs are going to keep caring about engineering metrics and developer productivity.

We can't get away from this conversation. That's why it's so important that we frame it in the right way and that this conversation is engineering-led, so that it doesn't damage culture, like you alluded to earlier, Ori. And I know that's part of the concern that Gergely and Kent had in their critique.

What resonated with the two of you about the critique from Gergely and Kent of McKinsey's article?

Kelly Vaughn: Yeah, I really liked the mental model that they use throughout both parts 1 and 2, where they're thinking about it as effort, output, outcome, impact. I had to reread that multiple times because I kept forgetting the order, but it makes complete sense as far as where a lot of these metrics tend to lean and why you end up missing that greater picture. The outcome and the impact are so incredibly important to the work that you're doing. And when you start to dig into some of the developer metrics you're looking at from these systems, when you're thinking about lines of code and story pointing and things like that, you're looking really far into the effort and output side of things, and you're completely missing that outcome and impact.

And it drives us back to that beginning conversation: developers are more than just the code they're outputting. Developers need to know the context, so they know the impact that they're going to have.

They know what outcomes are actually coming from the work that they're working on. So that part really resonated with me.

Ori Keren: I totally agree. I love that framework: effort, output, outcome, impact. I love that they push back on the individual metrics and talk about those areas.

I love the sports analogies; they made references to those as well. I do think we can talk about some things where I'm in disagreement with them, but overall I think it was a great response, painting a perspective of how people who come from this industry are thinking about it.

I had the privilege, you know, to speak to Kent Beck once, one of the founding fathers of the Agile Manifesto, the Extreme Programming person. So overall, I really liked the response, and I really liked that framework.

Conor Bronsdon: So what are the parts you thought they missed, Ori?

Ori Keren: Again, these are not big misses, but one of the things that caught me really strongly is when they took the DORA metrics and tried to map them into that framework: effort, output, outcome, impact.

And if I remember correctly, I don't have it in front of me, they positioned the DORA metrics in outcome and impact. I think CFR was in impact and the rest were in outcome. And I think the DORA metrics are actually more output and outcome, not impact. I see this when engineering leaders come and present these metrics: the business can't understand something like CFR as an impact. It can understand, oh, so the outcome is you roll back fewer versions, you have more successful rollouts. That's a great outcome, but impact here is the impact on the business.

It's still hard sometimes for the business to tie the two together. And that's, I think, a little bit of a blind spot that people like us, who come from this space, tend to have. We think, hey, DORA metrics will cover it, and that's enough to paint a picture of how our engineering organization performs.

So that's one that really stood out for me, which I thought could be improved.

Kelly Vaughn: I do think that's a little bit of a side effect of trying to fit one mental model into another mental model. I'm in agreement. I think, when you're asking, when you put change failure rate into impact, the question you have to ask yourself is why does this matter?

And the why is not change failure rate. The why is the impact resulting from a decreased change failure rate or an increased change failure rate. And that is why these all need to be shifted by one because you're still talking about what is the impact of that.
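
To make the framework concrete for readers following along, here is one way to write down the re-mapping Ori and Kelly describe, as a minimal sketch. The category assignments reflect their argument in this conversation rather than any official DORA or McKinsey definition, and the example impact measures are illustrative assumptions.

    # Illustrative only: one reading of where the four DORA metrics sit in the
    # effort -> output -> outcome -> impact model discussed above.
    DORA_METRIC_LAYER = {
        "deployment_frequency":  "output",   # how often we ship
        "lead_time_for_changes": "output",   # how quickly a change reaches production
        "change_failure_rate":   "outcome",  # fewer rollbacks, more successful rollouts
        "time_to_restore":       "outcome",  # how quickly we recover from incidents
    }

    # Impact lives one layer further out and is expressed in business terms,
    # which DORA alone does not capture (hypothetical examples).
    IMPACT_MEASURES = ["revenue retained during incidents", "churn avoided", "infrastructure cost saved"]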

Conor Bronsdon: This speaks to something that we hear a lot in business where it's, is this an operational metric for the business, for your business unit, or is this an actual business metric, the impact piece?

And I'm hearing that from both of you. I know, Ori, we've talked a bit about what we can learn from other business units, like sales and marketing; they're farther along in their approach to metrics. And I also know there are some things we should avoid there. What would you take away from how other business units are approaching their metrics that we can apply to this developer productivity conversation?

Ori Keren: They spoke about these analogies and touched on some interesting aspects of them, which I really appreciated. By the way, I think there was some description in there about sales reps who only care about their own things.

Conor Bronsdon: I think Kent had some certain opinions about sales, it seemed like.

Ori Keren: Development is definitely a team sport. After running a business now, I can say sales is also a team sport. I love basketball, so maybe you want to compare it to professional basketball versus college basketball, where in college basketball you work more together, and in professional basketball you still try to win as a team, but yeah, it's more about the individual.

So there are still some analogies there. But here's the thing: the biggest blind spot that I think everybody who talks about this space is missing is the big opportunity. And by the way, they talk about why you shouldn't measure the effort, and I think it is a risk, measuring the effort, if you're talking about individual metrics. But mapping the entities in the effort and looking for inefficiencies there, and these are metrics that are only interesting to the engineering side, that's really important.

This is developer experience, right? If you have a long build time, if you have bottlenecks in your effort, you want to uncover them. So that's one thing. The second, biggest opportunity that I see here: I like the analogy to sales, because if you think about Salesforce, you can see the funnel. The funnel in sales is like the cycle time in engineering, right? That's a great analogy, but those systems took off because you mapped the entities, and now you could apply policies to actually accelerate the workflow. And nobody's speaking about that. Okay, yeah, there are arguments about how we should go about it, but hey, let's take this huge opportunity that we have, now that we're mapping all the entities in the universe. Maybe code reviews could be 10x faster, and maybe CI and build times can be shorter, because we have the data around them. We can find the inefficiencies in them.

So that's the analogy I like with sales, and where I think there's an opportunity to learn about engineering efficiency from how sales efficiency evolved throughout the years.

Kelly Vaughn: I completely agree with that. When sales is looking at optimizing their time to close, how long it takes from outbound or inbound lead to actually converting that into a paying customer, we're doing the same thing in engineering.

For example, for my team, I know that we can do a lot better on how long it takes us to actually close PRs, from the time a PR is opened to actually deploying it into production, or even into staging before we get to production. There's a lot of work we can continue to do in our own development funnel and deployment funnel; we're actually looking at that right now around our API deployment process, for example. These are metrics that you can be analyzing and using. Where things start to get dangerous is when you start using these to measure the effectiveness of any individual engineer.

And so that's why it's really important to map these to the entity and not to the individual.
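
To make the team-level measurement Kelly describes concrete, here is a minimal sketch in Python. The data shape is a hypothetical export (merged PRs with opened and deployed timestamps plus a team label); the point is that the aggregation happens per team, never per individual author.

    from collections import defaultdict
    from datetime import datetime
    from statistics import median

    # Hypothetical input: one record per merged PR, with timestamps your own
    # tooling already has (opened, deployed) and the owning team.
    prs = [
        {"team": "api",      "opened": "2023-09-01T09:00", "deployed": "2023-09-04T15:00"},
        {"team": "api",      "opened": "2023-09-02T10:00", "deployed": "2023-09-02T18:00"},
        {"team": "frontend", "opened": "2023-09-01T11:00", "deployed": "2023-09-08T12:00"},
    ]

    def hours_between(start: str, end: str) -> float:
        fmt = "%Y-%m-%dT%H:%M"
        return (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 3600

    # Aggregate cycle time (open -> deploy) per team, not per person.
    by_team = defaultdict(list)
    for pr in prs:
        by_team[pr["team"]].append(hours_between(pr["opened"], pr["deployed"]))

    for team, cycle_times in by_team.items():
        print(f"{team}: median PR cycle time = {median(cycle_times):.1f}h over {len(cycle_times)} PRs")

The same shape works for any stage you want to inspect (open to first review, review to merge, merge to deploy), which is where the funnel analogy to sales comes from.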

Ori Keren: Couldn't agree more.

Conor Bronsdon: Yeah, Ori, I know you and I have talked about this before, but a big focus for us is saying, let's look at team-centric metrics, because engineering is a team sport, as you alluded to. We need to win together, and we're all relying on each other's workflows.

Ori Keren: One point that I'm not sure I hit all the way: once you map the entities around your process or your effort, and you see where you have problems, theoretically you can automate much more.

That's what Salesforce did, or even Jira, with how you input data and move things around. And again, I don't want to push our own product, but that's what gitStream does, right? Since I have a pull request and I have its context, because we mapped all the entities, now maybe 20 percent of them can be auto-approved. Or the opposite: if it's one that's really important in the roadmap and touches very important things, let's bring in more reviewers, or let's run only a selective portion of the CI. You can actually automate things, and that's the blind spot. Everybody's speaking about the metrics, but this is true workflow improvement that can come from all of the movement in this space.
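
Since this part of the conversation is about workflow automation rather than metrics, here is a rough sketch of the routing idea Ori describes: give each pull request more or less process based on its context, instead of treating every PR the same. This is plain Python pseudologic, not gitStream's actual rule syntax; the field names and thresholds are invented for illustration.

    # Hypothetical PR context that your tooling might already know about.
    pr = {
        "changed_files": ["docs/README.md"],
        "touches_critical_paths": False,  # e.g. auth, billing, deploy scripts
    }

    def routing_policy(pr: dict) -> dict:
        """Decide how much process a PR needs based on its context (illustrative rules)."""
        docs_only = all(f.endswith((".md", ".rst")) for f in pr["changed_files"])
        if docs_only and not pr["touches_critical_paths"]:
            # Low-risk change: approve automatically and run only the cheap checks.
            return {"auto_approve": True, "required_reviewers": 0, "ci_suite": "lint-only"}
        if pr["touches_critical_paths"]:
            # High-risk change: pull in more reviewers and run the full CI suite.
            return {"auto_approve": False, "required_reviewers": 2, "ci_suite": "full"}
        # Everything else takes the default path.
        return {"auto_approve": False, "required_reviewers": 1, "ci_suite": "standard"}

    print(routing_policy(pr))  # {'auto_approve': True, 'required_reviewers': 0, 'ci_suite': 'lint-only'}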

Kelly Vaughn: I think that's important, though, because much like the engineering space we're in is evolving, so are our workflows, and we're going to have to continue to invest time and resources into our workflows, into our reporting, into our process to identify these bottlenecks. And as technology continues to evolve, we're going to find ways to automate more things, and that's exactly what you're talking about there.

Conor Bronsdon: Yeah, Ori, you alluded to this earlier when you said, even if we're moving 10x faster on our coding, if there's a blocker in that software development lifecycle, if the workflow is messed up and code just sits there for days in things like PRs or other issues, that's not really going to solve the problems of developer productivity.

I think it's a great point to bring up.

We've seen some, I think, negative examples in the last year or two around people saying, Oh, lines of code is how I'm going to measure which devs I keep and which devs I lay off. And these other decisions that, as you alluded to at the start of the show, Ori, become really toxic for an organization.

Ori Keren: Yeah, absolutely. Especially if we use their framework, especially when you move into the effort zone, I agree with Kelly: measure the entities and the process. And if you have to, if you think it's right to also benchmark and look at what other parts of your organization are doing, stay with teams.

Never go to individual metrics, but stay with teams, because with teams you can find, hey, maybe this team has some sort of inefficiency, like Kelly said, in the PR process. For some it's, I don't know, they're releasing every two weeks because they don't have CD.

Okay, so let's fix that. If you stay with teams and you focus on the entities and the process, there are still a lot of great things you can do in the effort area, even before outcome and impact.

Conor Bronsdon: And Ori, the way I know you're passionate about this is that LinearB has actively stepped out of deals where certain companies have said, hey, we want you to drill down to these individual metrics, because you're so bought into this long-term belief in the health of the engineering org.

And you're saying, look, companies that are going to drive into individual metrics are the ones that are going to fail to actually achieve the outcome and impact they need long term. This isn't the kind of partner we want.

Ori Keren: Yeah, this is a great question, and it's always a dilemma, because think about DORA and all the academic research that laid the foundation for companies like us to come in and productize this. It comes with responsibility, right?

And I'll be very open and honest: maybe in the beginning you can't stand firm, but now we are big enough to say, hey, this is our philosophy, this is how we think you should roll it out, and in some extreme cases to say, it's fine if that's your philosophy, but we're probably not a good match, there are other tools. That's how we recommend rolling out these types of solutions. I think I've said it in the past: we're building a business and we of course want to make tons of revenue and be great, but I'm also super happy, and I feel lucky, that I have the opportunity to be here at this crossroads where this is being implemented.

You can see this space crossing the chasm. Now a lot of big companies are coming in. McKinsey's writing about it. Big enterprises are coming in. So to be the ones who say, hey, this is the right way to roll this out, I love it. I love this responsibility and helping educate the market.

Kelly Vaughn: I think you nailed it with that last point. If there's anything I've learned from working in video intelligence, it's that if you give people the tools to do something, there are always going to be people who abuse those tools. And so the best thing you can do from a business ownership standpoint, or as an engineering leader, when you're having an impact on what your company is going to be building, is to think about how it should be used, but also think about how else it could be used, in not-so-great ways.

And you can't always prevent people from doing that; humans are going to be humans. People are always going to try to game systems. They're always going to try to find a way to do something the way you don't want them to do it. However, you can have a firm stance of, this is how we intend for this to be used, and this is how we think you should be using it, and you can provide those educational opportunities and educational materials around it.

You can't prevent everyone from doing any one particular thing, but at least you can help steer the ship in the right direction.

Conor Bronsdon: Absolutely, and I think this is a really important and nuanced piece of what we're talking about here. It's great that Gergely and Kent talk about DORA metrics and the SPACE framework.

I think these are important things to apply. But if we're not considering the long-term implications of what we're measuring on engineering teams, we're falling into the same trap as McKinsey. And this is where I think we see a lot of positive intent from this debate, where people are saying, oh, we see this negative strain in some of the stuff that McKinsey's putting out; how can we do it better?

But it feels like there's still an internal debate happening within the engineering community about how to move forward. So the question I'm curious to ask you both is: how do you see us having to change the tenor of that debate, or what can we do to try to solidify the engineering community, so that we can go to our CEOs and CFOs together and say, this is the right approach to take, one that's going to be healthy for our teams in the long term?

Kelly Vaughn: My biggest concern here is that, again, we can't develop this one-size-fits-all solution. I think what's best for us as an engineering community is that we have this conversation. We talk about what works, what doesn't work, where these dangers exist.

And we lay out, here are the options that we could recommend, and then you take this to your team. You take this to your CEO, to your board, whoever's asking these questions and say, based on the way that our team works and our team functions, based on our beliefs, we believe that this is the best path forward.

And I'm leaving that very broad, very vague, for a reason, because it's not going to be the same from one company to the next. The way that LinearB works is not the same way that Spot AI works. We're going to have differences, especially cultural differences, in how different engineers work and how different teams function. In the way we're working, we're dealing with both hardware and software.

And so there are all kinds of fun intricacies that come with that, which we can't measure the same way a totally cloud-based software company can measure. So I think it's important to have these conversations. I think it's important to start laying these out and to have the experts in the field have these conversations, especially with the McKinseys of the world, and be working closely with them. Those who have the voice should be including those who are deeply immersed in the experience, contributing to that voice.

Ori Keren: Yeah, absolutely. We started with the fact that there's a reason, right? The CEOs and the CFOs of the world are saying, hey, how do you measure that black box? That's the one department I don't have visibility into, and it's expensive. The temptation is to go into individual metrics, and we've talked about how risky that is. But there is something we could do, and I think it's the responsible thing for an engineering organization to do, which is to map the investments. And be bold, by the way, when you map the investment: show how much you're investing in innovation versus keeping the lights on versus enhancing features, or even, these are the projects we're working on and here's the investment. And put the non-functional projects, the big infrastructure changes that every engineer needs and knows you have to do, put them in bold.

Hey, this is untouchable, and I'll explain why. I think that's the right way to have those conversations with the CFOs and the CEOs of the world, and it's something we owe them. So we can say, hey, we have internal metrics. We can show DORA, which is maybe at a level they can start to grasp. All these DX metrics are probably something they couldn't, or shouldn't, care about. But if we move to the effort area, that's definitely an area where I've seen engineering leaders being proactive: hey, these are the projects we're working on, and I'll explain why.

And have conversations around it. That's where I see it click. That's where I can see an engineering leader get a seat at the table, get positive feedback from a CFO or CEO, and spark good conversations. Some people are afraid. They're saying, oh, if I show this non-functional project, they'll say: stop, no, this project you should stop. So yeah, there's no silver bullet here, but that's how we get to know each other's crafts, and they can understand where we invest our efforts.
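
As a concrete illustration of the investment mapping Ori describes, here is a small sketch that rolls shipped work up into the buckets he mentions (innovation, keeping the lights on, feature enhancement, non-functional infrastructure). The items and labels are invented; in practice they would come from however your issue tracker categorizes work.

    from collections import Counter

    # Hypothetical export from an issue tracker: each shipped item tagged with
    # an investment bucket, however your organization chooses to define them.
    shipped_items = [
        {"title": "New reporting module",        "bucket": "innovation"},
        {"title": "Upgrade database cluster",    "bucket": "non_functional"},
        {"title": "Fix flaky checkout tests",    "bucket": "keep_the_lights_on"},
        {"title": "Bulk edit for admin screen",  "bucket": "feature_enhancement"},
        {"title": "Prototype AI-powered search", "bucket": "innovation"},
    ]

    counts = Counter(item["bucket"] for item in shipped_items)
    total = sum(counts.values())

    # The board-level story: share of engineering effort per investment area.
    for bucket, n in counts.most_common():
        print(f"{bucket:>20}: {n / total:.0%} of shipped work")

A real version would weight by effort or cost rather than item count, but the shape of the conversation with a CFO is the same: here is where the investment went, and here is why the non-functional slice is untouchable.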

Kelly Vaughn: There's a level of patience required, especially when you're coming from the engineering space to a seat at the table with the CEO, with the CFO, with others at the company who are not in engineering.

What we're working on is extremely complex. There's a reason why we see it as this black box. And so when you're in the position where you're trying to explain this, it can get really easy to get frustrated: this is just how it is, and if you don't understand it, I don't know what to tell you. It requires a lot of patience and a lot of education, breaking this down to help them understand why this is such a challenging thing to measure. But it's better to take the time to do that than to just jump to, okay, I guess we can just use these metrics since you understand these numbers. If our head of finance came to me and started talking about all the reports he's working on, I'm like, okay, I know what numbers look like, but I don't know what any of these things mean. It's going to be the same exact thing; he would have to explain to me why we're making certain investments in particular areas.

And to bring this back around to the topic of investments: there's a reason why engineering often sits under R&D. A lot of it is, you're taking a bet on what you're working on, and talking about the potential impact and the investment you're making in that bet. That is something that can translate a little more easily, not directly one to one. If you have the customer background and you understand the context of why customers are asking for something, that's a much easier thing to explain. But there's still the other side of it: you're going to have these moonshots. Part of being successful in the startup space is being innovative.

And so you can't always have this perfect example of, this company did this and this is how it impacted them, this is how it was great for them, so here's why we're doing it. If it's totally new in the space, spending time understanding how you can explain that, why this is a gamble you want to make, why this is a bet you're making, how you're investing time and resources and money into it, and what you intend to get out of it, that will land a little bit better than just saying, this is how it is, this is what we're doing and what we're spending our time on.

Ori Keren: I've seen it make a huge impact. Exactly like you said, it takes time, but after the second and the third meeting, these are great conversations to have.

Conor Bronsdon: How should we start those conversations?

Ori Keren: So it depends on a lot of things, the size of the company, et cetera. But I'm a true believer in this, because we spoke about almost three areas, right? The first is around developer experience, which, again, if you have to map it back to the mental model, is more in the effort area. Say, hey, we have inefficiencies here.

It's not assigned to a specific individual. We have inefficiencies in the process because, you know how it is, that's how development processes work; there's inherent friction when you have context switches. Here's how we're measuring it and here's how we're improving it. That's the first slide. If I need to bring three metrics to a board meeting, that's the one that represents it.

Then the second is DORA, which is becoming a good standard. It's the transition from that area almost all the way to the impact, though like I said, in my opinion it's not deep enough into the impact. So talk about the DORA metrics. These are standards. Send them to go read about it; it's interesting, and it tells them how we operate, how smoothly the engine is working.

And the third one is the effort and the framework around where we're investing, exactly like Kelly said: these are the areas where we're taking moonshots, right? We're taking bets. Why? Because we're an innovative company and we have to invest in innovation. And these are the areas where we're investing in the non-functional stuff that we have to build. So I think if you come with this framework and you frame it like that, good things will happen. By the way, in the first meeting people will nod; they'll say, yeah. In the second meeting, they'll listen more.

By the third one, I've seen it happen, they start asking you questions. And this is the best part, because when I was a VP of engineering, and my background before becoming a CEO is twice as a VP of engineering, I was envious of the fact that the CEO was having those conversations only with sales: hey, if you change this in the process, maybe this will happen.

I wanted to have those same conversations. So all of a sudden now I've seen this conversation starting to happen with engineering, which is great.

Conor Bronsdon: Kelly, does that kind of framework of let's extend our ability to have these conversations resonate with you?

Kelly Vaughn: Absolutely. And I think the emphasis on the fact that you're not going to get the questions immediately is very true and something that really, you know, just because you're not getting any feedback doesn't mean you're doing everything right.

I think that's a very important thing to emphasize here, especially as you're introducing new patterns of thinking and new metrics that people need to understand. They're going to need time to sit and think about this and understand what it means. How does it impact the business? How does it impact me? And with these regular conversations, that second time, that third time, the questions will start coming up, and people will start to see the patterns of how it relates to the work everyone else is doing and really start to understand what is happening in engineering land, why we are leaning into these particular metrics, and why we are investing the time in fixing particular things or upgrading infrastructure, which everyone loves to do.

We take it back to our roots here at Spot when we're looking at a mental model, or a theme, of: it just works. It doesn't matter what moonshots we're building if we're having outages, if our customers are having to spend a lot of time speaking with support to fix particular issues that should just work. And so putting this into a mindset of, here is why we're investing this time in fixing these things, so we can grow faster and scale faster, that helps it resonate with the CFOs and the CEOs of the world when you say, hey, I need these particular resources as well going into the next year.

Conor Bronsdon: I'll shout out the fact that, Ori and our CTO, Yishai, have built some templated slides that anyone can download for free that you can use to start these conversations at the boardroom level, at the exec team level, to showcase both the developer experience metrics that are important to your team and also, what your team's investing in.

So if you need them we'll drop a link in the show notes. We want to make sure everyone's armed to have these conversations. I think this is a great moment for us to zoom out and say, what are the major takeaways that you each had from this initial part of the debate and where do you want to see it go next as this conversation explodes into, the engineering and technology consciousness?

Ori Keren: To wrap it up like we started, I like it when people express their opinions; it's great. Again, when we started a business like this in 2019, we had to explain why it's even important to start measuring. It's such an exciting time, when more people with more views are coming into this space and expressing their opinions.

So opinions are great. I think there's going to be a point where there's more consolidation, more standardization. DORA is becoming a standard, but we need to iterate on that. So my biggest takeaway, and maybe I'm optimistic, is that it's great that different people come with different views, and it's on companies to find their middle ground between what they think is right and what they think is wrong.

But one thing: don't use individual metrics. You have other ways, like performance reviews and all the other things you can do, to measure or figure out what you need to do to help your developers improve.

Kelly Vaughn: Yeah, I completely agree.

Regardless of how you feel about the McKinsey report, it got people talking, and this discourse was exactly what we needed, because we can't find a solution to this if we're not having a conversation about it. The more voices we have in the room, the more ideas we're going to have. There are going to be some great ideas and some not-so-great ideas.

That's part of the conversation, but welcoming those different opinions is what's going to actually get us to a point where we can agree on what's going to be best, satisfying all parties to some degree. What can we actually get to the CFOs of the world, who are going to be asking for these metrics, without harming the developers on our team and the developer community, and without creating an environment where people are going to find a way to game the system because they're up against individual metrics they know how to work around? The more conversation we can have about this, the better off we're going to be in the long term.

Thank you.

Conor Bronsdon: And I'll say, if you're a listener to Dev Interrupted who's hearing this conversation and you want to put your two cents in, we would love to hear from you. You can reach out to us on social media, on Twitter and LinkedIn, and you can also drop a comment on our Substack. I'll say we are always interested to hear from folks who say, hey, look, I think you guys are totally wrong.

Let's have you write a guest article. Let's have you come on and chat with Ori or I for a few minutes. We'd love to hear from you. This is a really important debate. Engineering teams all over the planet are having these conversations and it's our hope that leaders lean into healthy metrics, into team metrics, but as both Kelly and Ori have pointed out, this is something that differs from organization to organization.

The needs of Spot AI versus Google versus Meta versus LinearB are all very different, and we need to approach that with respect for each other's organizations and the ability, as Kelly pointed out, to be patient as we begin this educational process. So I hope that we're helping to continue a discourse that creates a healthy environment for dev teams everywhere.

Ori, Kelly, thank you so much for coming on the show. I really enjoy talking with you both, always. You can read more about Kelly and her thoughts on the subject at her newsletter, Lessons in Engineering Leadership, which can be found at engleadership.xyz. And you can hear more of Ori's thoughts here on the Dev Interrupted Substack at devinterrupted.substack.com.

Ori, I know we have several pieces we're working on with you right now. Very excited to see some of those come out; they'll all be on the Dev Interrupted Substack. And of course, keep following along every Tuesday as we share more articles about this debate, plus information in that same Substack in our Dev Interrupted download, along with highlights of engineering content from all over the web and our podcast.

Ori, Kelly, any closing thoughts before we wrap up here?

Ori Keren: Just thank you, Conor, and thank you very much, Kelly, it was great being on the panel with you and exchanging ideas.

Kelly Vaughn: Likewise, thank you so much. And I really encourage those of you who do have opinions and thoughts on this to share them, because I would absolutely love to hear them.

I'd love to see you tell me I'm wrong and tell me why.

Conor Bronsdon: Kelly is the best at Twitter of the three of us, by far. Or sorry, X.com, whatever we're calling it now. If you have an opinion, I'm sure she and I will both be tweeting about this, her much more poignantly than I will.

So please reach out to us when you see us drop these videos or this episode, or comment on it; we would love to hear from you, and that feedback from the community is so important. Ori, Kelly, thank you so much for coming on. I really enjoyed this conversation. It's great to hear from you both, and let's do it again soon.

Ori Keren: Sure. Thank you.