What trends do engineering leaders need to pay attention to, and how will they impact your teams in 2024?
This week, co-host Conor Bronsdon is joined by LinearB co-founder and CEO Ori Keren to discuss his predictions for next year.
Together they discuss why dev team metrics are here to stay, why Ori doesn’t like the term ‘developer productivity’ [hint: he prefers ‘engineering efficiency’], how the rise of gen AI written code will create a problem for development pipelines everywhere, and the potential friction points inherent to remote work.
Ori concludes the episode by offering advice to engineering leaders and startup founders on the need to adopt a metrics program or risk getting left behind.
Episode Highlights:
- 01:00 Trends & predictions for 2024
- 04:00 Role of data in engineering leadership
- 11:00 The impact of AI on software development
- 16:00 Importance of efficient code reviews
- 22:00 Starting a metrics program
- 26:00 Advice for founders and eng leaders
Episode Transcript:
(Disclaimer: may contain unintentionally confusing, inaccurate and/or amusing transcription errors)
Conor Bronsdon: Hey everyone. Welcome back to Dev Interrupted. I'm your host, Conor Bronsdon, and I'm delighted to be joined in the Dev Interrupted Dome by LinearB Co-founder and CEO Ori Keren. Ori, welcome back to the podcast.
Ori Keren: It's great to be here and it's great to be in the Dome for the first time.
Conor Bronsdon: Yeah, it's a fun experience.
If you're not watching this on YouTube and are instead listening on Spotify, consider checking it out; it's a lot of fun to watch the whole thing happen. Ori and I are here to talk about the fascinating topics we're hearing all about at DevOps Enterprise Summit 2023. And as we head toward the end of the year, teams worldwide are thinking about the trends they need to pay attention to in 2024.
What are OKRs for next year looking like? What are our key initiatives? So Ori, given your years, really your decades now, of development experience, your years as a founder, and your experience with startups, scale-ups, and enterprises, I wanted to get your take. What are the key predictions that you have about the state of software development and engineering teams in 2024?
Ori Keren: Yeah, so I'll start with the topic that's closest to what we do and what I do, around developer productivity. And I think we're going to see a little bit of a conflicting trend. So one: engineering efficiency, or developer productivity, measuring that is here to stay. It's almost like 2008 and the challenges that sales had then; strong sales efficiency technologies evolved out of that.
And I believe that out of these tough times, these tough macroeconomic times, you're going to see that engineering efficiency is here to stay, as the inheritance of this period. So that's the first thing. Companies who didn't start in 2023 are going to start in 2024, and in 2025 it's a no-brainer. It's here to stay.
Another two interesting aspects related to that. The first is: even in the past, you had companies that said, hey, I'm going to do it myself. DIY, I'm going to build a custom solution for that. We're already starting to see less and less of that, and my prediction is that in 2024 it's going to almost go away. The same thing happened with CRM. People said, oh, okay, do we really need a CRM? Do we really need all the opportunities? We can have it in a spreadsheet, or build something of our own. And the technology evolved so much that you're behind if you're not adopting it.
So you're going to see fewer companies choosing to do it themselves, and more going with some of the existing solutions. And I would say the last trend that I think we're going to continue to see in engineering efficiency and developer productivity: there's a saying that in these tough economic times, the CFO is almost like the new CIO.
They're making decisions on what information technology you buy. You've got to put a lot of ROI justification into that, and that's not going to go away. They're still going to be, in spirit at least, in those conversations. And I think that engineering leaders who don't control their data, understand what's happening, run a metrics program, and map what's happening in their organization are going to have a tough time controlling the narrative of why they're operating the way they are, in a good way, and why there's justification to maybe grow the team. The times when I was an engineering leader and just came to my CEO, hey, I need two or three more people, and convinced him by talking about hunches and not with data, those times are over.
My other prediction is that engineering leaders who do make this transformation will control the data, will control the narrative, will be able to talk to the CFO, and will achieve great things for their organizations, and vice versa.
Conor Bronsdon: What would be your advice to engineering leaders who are trying to have that conversation and trying to be the ones who control the narrative, to your point, so that they're not fully beholden to the board and can instead say, hey, look, my data is here, my input is here?
Ori Keren: I would say, again, it starts with having the data. If you don't have the data, it's all hunches, it's all guesses. Then there are different types of data, right? So people double down on DORA metrics; that's probably going to be the theme of this event. But there are other areas, where you measure allocation: how much investment is going into each type of work and each project. This is the type of information that is super interesting for the CFO, the board, et cetera. So I would have the data in front of you, and be able to initiate the conversation.
If you wait, the conversation will come to you, and not necessarily at a moment when you're ready. If you initiate it, you can control the narrative, or at least lead the discussion to where you want it to be.
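Ori's allocation point can be made concrete with a small sketch. The categories and data shape below are illustrative, not LinearB's actual model: the idea is simply to group engineering effort by investment category and report each category's share of the total.

```python
from collections import defaultdict

def allocation_report(work_items):
    """Summarize the share of engineering effort per investment category.

    Each work item is a (category, effort) pair, e.g. ("new_features", 13.0),
    where effort could be days, story points, or PR count. The category
    names are made up for this example.
    """
    totals = defaultdict(float)
    for category, effort in work_items:
        totals[category] += effort
    grand_total = sum(totals.values()) or 1.0  # avoid dividing by zero
    # Percent of total effort per category, largest first.
    return {
        category: round(100.0 * effort / grand_total, 1)
        for category, effort in sorted(totals.items(), key=lambda kv: -kv[1])
    }

items = [
    ("new_features", 30.0),
    ("keeping_the_lights_on", 12.0),
    ("tech_debt", 8.0),
    ("new_features", 10.0),
]
print(allocation_report(items))
# {'new_features': 66.7, 'keeping_the_lights_on': 20.0, 'tech_debt': 13.3}
```

A report like this is the kind of artifact a CFO or board can engage with, because it frames engineering time the way they already frame other investments.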
Conor Bronsdon: Yeah, you and I have talked about this before, which is: when you fail to do that, you run into things like the McKinsey framework, where it says, okay, here's how you need to measure developer productivity.
And we've all seen many of the critiques of that. If you haven't already listened to Ori, myself, and Kelly Vaughn critiquing it on this podcast, definitely go check that out. But to sum it up: when you don't consider the impact of your engineering metrics program, of your analytics, of these decisions you're making at the leadership layer, on the culture of your engineering team, on its retention, on your engineers' ability to actually deploy, when you risk gamification, you create these massive vulnerabilities.
I believe you put it as: you create an autoimmune disease within the company. So this is a huge risk.
Ori Keren: Yeah. So that's the other side. The allocation piece is more like: okay, I'm a VPE, I'm a CTO, I need to have a data-driven conversation with my financial folks. The other side of it is to try to stay away from those individual metrics, at least in those conversations with a CFO. And my belief is that you need a different system to assess your talent. If you use the same system where you measure your DORA metrics and your team metrics, you create, like you said, that autoimmune disease, because then people are gaming the metrics; they know they're being measured on them, so they're going to change their behaviors.
So I totally recommend using something different to assess the talent of your team, which you probably should also do, and using the metrics more for the team-related things.
Conor Bronsdon: And this is why I hear you using the phrasing engineering efficiency.
Ori Keren: So if you call something developer productivity, you're using the singular rather than the plural. Even in sales, you measure sales efficiency, and sales is much closer to an individual sport. Competitive. Yeah. You don't talk about sales rep efficiency.
You say sales efficiency, and you measure the friction in your process. At the end, you will still manage the individual reps. So yeah, I don't like the term developer productivity. If we can use dev team efficiency, or engineering efficiency, I think that's the right term to use.
Conor Bronsdon: And I know that, as we all look at these solutions for developer productivity, or engineering efficiency, whichever way you want to phrase it, there is also an issue within companies of tool fatigue. Not just at the CFO level, where they're saying, oh, I'm worried about how much we're spending, but also at the dev level and the team leader level, where they're saying, okay, what's this new tool? Why should I start using it? How can the community, the dev tools community, avoid having our cutting-edge tools, which should hopefully be improving the lives of engineering leaders, actually become a headache?
Ori Keren: Yeah, so that's a great question. There's this term, developer experience. We've all heard it. It's gaining momentum, but it's still not fully operationalized at this point in time. And what we see from our customer base depends on size. If you're an enterprise customer, you probably have a developer experience team.
If you're a smaller one, you probably have somebody who's very passionate about it, or what we call the AOR, area of responsibility. So the rise of those developer experience teams and of IDPs, internal developer portals: I think they're good, and I predict that in 2024 they're going to be rolled out even more broadly within teams.
Those things can help with what you talked about, because it's a central place for people to understand: what's stopping me from delivering on my task? If I'm talking about myself as an individual developer, I come in, I want to complete the task end to end. If I need to learn new tools all day long, that hurts my productivity. If the tests I need to run take too long, that hurts my productivity. So that's productivity through the eyes of the developer, and developer experience is super, super important. I think 2024 is going to be the year where it becomes much more operationalized, with two things in mind. Our philosophy is always to use both methods.
On one hand, there are empiric things you can measure in developer experience. For example, merge frequency. If I'm a developer and I'm not pushing PRs out, forget about how my manager looks at it; from my side, I don't like it. Because every developer wants to achieve things. I want to come in, complete a task, and go to the next task.
There are empiric metrics like flaky tests, tests that sometimes pass and sometimes fail. You want to kick them out, because they're not stable and they're confusing you. There's the length of the CI. Those are empiric metrics, and we're going to see a lot of companies invest heavily in them.
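As a rough illustration of the empiric metrics Ori lists, here is a sketch in Python. The data shapes are invented for the example: merge frequency is merged PRs per developer per week, and a test counts as flaky if it both passed and failed on the same commit.

```python
from collections import defaultdict
from datetime import date

def merge_frequency(merges, weeks):
    """Average merged PRs per developer per week.

    `merges` is a list of (developer, merge_date) pairs collected over a
    window of `weeks` weeks.
    """
    per_dev = defaultdict(int)
    for dev, _day in merges:
        per_dev[dev] += 1
    return {dev: round(count / weeks, 2) for dev, count in per_dev.items()}

def flaky_tests(runs):
    """A test that both passed and failed on the same commit is flaky.

    `runs` is a list of (test_name, commit_sha, passed) tuples.
    """
    outcomes = defaultdict(set)
    for test, sha, passed in runs:
        outcomes[(test, sha)].add(passed)
    return sorted({test for (test, _sha), seen in outcomes.items() if len(seen) == 2})

merges = [("dana", date(2023, 11, 6)), ("dana", date(2023, 11, 9)),
          ("avi", date(2023, 11, 7))]
print(merge_frequency(merges, weeks=1))   # {'dana': 2.0, 'avi': 1.0}

runs = [("test_login", "abc123", True), ("test_login", "abc123", False),
        ("test_search", "abc123", True)]
print(flaky_tests(runs))                  # ['test_login']
```

In practice this data would come from your Git host and CI system; the point is that both metrics fall out of event logs you almost certainly already have.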
There's also the qualitative side, where I see a lot of good movement: asking the developers, via surveys, what is the thing that's blocking you? And if they say tool fatigue, or I want a new tool, you should listen to them.
Conor Bronsdon: So this is a really interesting thing to dig into, given that, if we haven't already said it in this episode: AI is here.
It's here to stay. You're not going to get away from engaging with AI tools. We're already seeing some of these trends in our data. We're seeing some of the trends in the DORA research, which we partnered with Google on. And in that DORA research, we saw that companies leveraging AI tooling were actually starting to improve their developer experience metrics, those satisfaction surveys, because of improved automation: leveraging programmable workflows and AI to make sure that some of the less fun tasks, some of those frustrating friction points, get automated away.
What do you see happening as that trend continues to evolve with AI in the coming years?
Ori Keren: Yeah. So there are a couple of interesting aspects to that. I think even in the Google report they're saying, yeah, we're starting to see the trend, but it's still really early. It's being incorporated, and it's going to be much more incorporated in 2024. So that's a very important prediction. Now, I have a lot of ideas and thoughts around it. The first one: Gen AI is going to have more impact on how fast code is generated in 2024, for sure. So you're going to get some fully automated Gen AI code going in, and you're going to get some semi-automated Gen AI code. And here's the thing. I keep saying it and I'll say it again: you're going to try to stream more water into the same old narrow, rusty pipes. So I foresee in 2024, all of a sudden, the acknowledgement: whoa, we're generating code, but have we thought about these pipelines, and how we can open them up so they can do reviews faster, CI faster, CD faster? And the key is not faster in terms of throwing more machines and more money at the problem; it's being smart: classify the work that's coming in, and then send it down different routes.
So some of the things can go faster, and some of the things need more human intervention. That's one thing I think we're going to see in 2024 with Gen AI. Our customers are already coming to us and saying, we're experimenting with things around Gen AI; can you help us figure out what type of impact it has?
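The classify-and-route idea can be sketched as a toy policy. This is illustrative only, not gitStream or any real product's rules; the thresholds and lane names are invented for the example.

```python
def route_change(lines_changed, generated, touches_critical_path):
    """Pick a pipeline lane for an incoming change.

    Toy policy: small, machine-generated changes that stay off critical
    paths can take a fast lane (lighter review, quicker CI); anything
    large or risky gets full human review.
    """
    if touches_critical_path:
        return "senior-review"          # always needs an experienced reader
    if generated and lines_changed <= 50:
        return "fast-lane"              # quick sanity check plus automated tests
    if lines_changed <= 200:
        return "standard-review"
    return "split-into-smaller-prs"     # too big to review well in one pass

print(route_change(20, generated=True, touches_critical_path=False))
# fast-lane
print(route_change(500, generated=False, touches_critical_path=False))
# split-into-smaller-prs
```

The specific rules matter less than the principle: not every change needs the same pipeline, and triaging up front is cheaper than widening every pipe.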
So of course you can use LinearB to measure the impact: is our cycle time faster with Gen AI? So those are my high-level thoughts around Gen AI. There are two other interesting things. One is that the ability to read code is going to become much more important, because code is going to be generated faster, like we said.
But you need humans to read the code and approve it. And here's the thing: developers don't like to read code; they like to write code. Having senior developers who know how to read and analyze code, and understand how it's going to impact their specific system, is a very important quality for tech companies to keep investing in and make sure they have, because if they have it, it opens up the pipeline.
And if not, again, more code is going to sit in the pipe, stuck. My other, very philosophical, thought that I had yesterday: you know how, when you suddenly had an iPhone, you stopped remembering phone numbers? Sadly true. And then you got Google Maps, Waze, whatever you use, and we lost our navigation capabilities, or at least some of them.
And my worry is that, again, not next year but two or three years ahead, when there's more and more dependency, hey, I need Gen AI to help me generate code, the innovation level will decrease. Because a lot of good ideas come when I'm working with my hands, I'm doing something, and all of a sudden: oh, this is a great idea, let's do this. And your dependency on something that helps you generate the code will block your innovation.
So I think companies, two or three years from now, are going to start thinking about how to preserve that knowledge, not lose it, and decrease their dependency. But that's a little bit further ahead.
Conor Bronsdon: That's an interesting prediction. Because I know you and I have talked before about how you believe the best developers are the ones who are really incredible at, and willing to do, reading code. And I wonder if, to your point, that skill set will become even more important, because you need to go beyond just what Gen AI has done for you.
Yeah, great, you can now get some of the basics done with Gen AI, and I'm sure that's going to continue to improve. But to actually create something innovative and new is going to take potentially more work, because of this expectation that so much of it is, in some ways, cookie cutter.
Ori Keren: Yeah, definitely.
I think, again, the developers with the ability to read code are going to be the most attractive ones; people will look for them. And companies will need to invest resources in training people: okay, how do I increase those capabilities?
Conor Bronsdon: Are there any new challenges that you see developers facing around another topic you brought up earlier, which is resource allocation? And how do you see engineering leaders needing to mitigate the resource challenges they may face in these choppy economic times, in order to continue improving productivity?
Ori Keren: Yeah, so we divide it into two interesting things that we're seeing. First of all, there's this topic of remote work, and it's not solved. That's our belief internally, because what we found in our research is that teams that are working, not necessarily co-located, but in the same time zones, are much more collaborative and work better together.
So you know, there's this brave new world where everybody's working wherever they want, from different time zones, et cetera. I still believe hybrid is somewhere near the right way to go, and each company needs to find theirs. But we're seeing that companies working more or less in the same time zone are able to break the biggest blockers in engineering efficiency.
And we know that code reviews are something we keep flagging as one of the main blockers; this year Google also mentioned it as one of the main things you need to improve. It's done better when you're in the same time zone. So I think companies need to say: hey, this is our strategy. It's fine if we still decide to work across different time zones, but let's acknowledge the price and come up with programs that fix it. Because again, if I'm issuing a PR and I'm waiting three hours for somebody to wake up, it's a productivity killer.
I think companies need to figure out their strategy around remote work, hybrid, time zones, all of that, in order to be more successful. So that's one thing that I see. And the second thing is very simple. Like we talked about in the beginning, and I'll say it again: if you want to get better and face the challenges, you've got to start with a metrics program.
We talked about it, but a metrics program is just the first step. After you do that, you've got to make sure you set OKRs and have an operational cadence. Allocate your resources. Figure out how you change your resource allocation. Automate as much as you can. Those are the things that I think will give the teams that start doing them a competitive edge.
Conor Bronsdon: Yeah, so you mentioned the automation, the programmable workflows piece. That seems like a really crucial element of predictable delivery.
And that's really what we're all going for: high-quality, predictable delivery that can drive the innovation we talked about earlier. And you mentioned the DORA research this year, and now the 2023 DORA report that just came out, which we're very proud to have partnered with Google on. One of the really awesome insights in there was that teams with faster code reviews perform 50 percent better on software delivery.
It's such a substantial improvement, but it really aligns with what we've seen in the research as to where the friction points are in development: code reviews are one of the main ones we hear about, both in quantitative metrics through our own data and in qualitative ones, whether it's conversations, surveys, et cetera.
Ori Keren: Yeah, absolutely. There's this triangle. Once you figure out the value of small pull requests, you've got to break your work into small chunks. And then you understand what that does for you: it increases your merge frequency, which is a very important developer experience metric, I'll tell you.
Because if developers are merging two or three PRs a week, they're happy. We want to complete tasks (I say we; I still see myself as a developer). That's what we want to do. And on the other side, it keeps the pickup, the review, and all the handoffs faster.
Because here's what's happening, and it's a harsh truth, everybody knows it. If I'm getting a PR that I need to review, the first thing I'm going to do, if I have an estimated time to review, great. But if I don't have it, I'm going to assess really quickly: is this a 30-minute thing, or can I do it in two minutes?
And here's what I'm going to do; it's a very binary classification. Two or three minutes? Okay, I'm going to take it. Thirty minutes? Throw it in the queue. Maybe it's tomorrow morning, maybe in two weeks, and that's it. When this triangle works, small PRs, high merge frequency, fast pickup and review, there are fewer handoffs, or even if there are handoffs, they're fast. Then you get things out to production while they're still fresh in your head, so when there's a problem you can roll back quickly. That's the magic. When that happens, a lot of the problems are solved, and that's why Google identified it, and why we're seeing it with the leading indicators we've been talking about for the last two or three years.
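The pickup side of the triangle can be measured from review events. Here is a sketch, with made-up field names: bucket PRs by size and compare the median pickup time, the gap between the review request and the first review.

```python
from statistics import median

def pickup_by_size(prs, small_threshold=100):
    """Median review pickup time (hours) for small vs large PRs.

    Each PR is a dict with `lines_changed`, `requested_at`, and
    `first_review_at`, the timestamps expressed as hours since some epoch.
    Field names are illustrative, not any real tool's schema.
    """
    buckets = {"small": [], "large": []}
    for pr in prs:
        wait = pr["first_review_at"] - pr["requested_at"]
        bucket = "small" if pr["lines_changed"] <= small_threshold else "large"
        buckets[bucket].append(wait)
    return {name: (median(waits) if waits else None)
            for name, waits in buckets.items()}

prs = [
    {"lines_changed": 40,  "requested_at": 0.0, "first_review_at": 0.5},
    {"lines_changed": 60,  "requested_at": 1.0, "first_review_at": 1.5},
    {"lines_changed": 800, "requested_at": 0.0, "first_review_at": 26.0},
]
print(pickup_by_size(prs))  # {'small': 0.5, 'large': 26.0}
```

If the gap between the two buckets is wide, that is the binary classification Ori describes showing up in your data: small PRs get picked up now, big ones get thrown in the queue.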
And by the way, that's just the first blocker in a development pipeline. Once you solve that, there's more. Why is my CI sometimes failing and sometimes not? Solve that. Why is my CI taking so long? Solve that. Can I release directly to production? But that's the main blocker hurting productivity at this point.
Conor Bronsdon: And it really aligns with DORA's research, which shows that quality and speed actually go together. When you build that triangle you talked about, small PRs and quick review cycles, you create a much faster software delivery pipeline, and you also improve quality. Instead of having to look at a massive review, where maybe you're missing something or lacking context, I can get a small review done quicker; it's a lot easier to reason about, there's less context switching involved, and you're improving not only the speed but the quality of what you do.
And that brings me to my next question, which is: what should engineering leaders' top priorities be when it comes to DORA and its insights as we head into the next year? You've alluded to code reviews; what else should they pay attention to?
Ori Keren: You mean when they think about the metrics program or when they think about DORA?
Conor Bronsdon: I'd be curious about both.
Ori Keren: Yeah. We're seeing this a lot, so I think organizations, they know themselves best. So they need to start a metrics program. And we're going to have a talk at this event with Syngenta, one of our customers.
What I liked about how they operate is that they said to themselves: okay, there's DORA, there are other metrics; we're going to choose the things we believe in, and that's what we're going to measure. And by the way, once they did that, they went quickly to the next level of maturity, saying: okay, measuring is not enough.
How do we create an operational cadence, where everybody is accountable, everybody's presenting their metrics and sharing them? That's a very important phase. I think once you have those, you get to the next level: okay, I see problems, how do I fix them? And that's why we love gitStream, because it's a great venue to start coding yourself out of the problems and improving them.
So my recommendation is: start a metrics program, pick the things that are important to you, and focus on what you're trying to achieve and how it aligns best with your culture. Then I think what's going to happen is the third-quarter or fourth-quarter syndrome, where you're saying: okay, now I need more; I need allocation use cases, I need automation.
There's no magic; you've just got to start. Like I said in the beginning, and it ties really nicely back to the beginning: engineering efficiency is here to stay. The gap between those who started, who are now in their second or third year and deep in advanced use cases, for whom metrics are table stakes, and those who didn't start yet, is growing bigger.
And the competitive edge is real; you can really see it with the customers who run this. Just by seeing those metrics, they already save a lot of time. It's hard to compete with them if you're in the same industry.
Conor Bronsdon: Absolutely. And this is why, as a company, LinearB has now made DORA metrics free for everyone worldwide.
With our free dashboard that we've launched (you're probably going to hear an ad for it on this podcast), it's completely free, and you can use it for a team of any size. We want to make sure that every team has those table-stakes metrics you talked about, so you can start identifying the areas of friction for your team to improve. Because it's so dependent on how your team is set up. Is it spread across multiple time zones? Then you're going to have different challenges than a co-located team. They may be harder challenges, they may not. Are you a hybrid team that meets a couple of times a year, versus one that's in the office a couple of days a week?
Are you doing more pair programming, or less? So many things, and I'm just rattling off a few, can really affect how your team approaches this. And that begs a question for me. We've given a lot of advice for engineering leaders; what about the rest of the C-suite? What about founders in particular? As a founder yourself, and someone who's been around a lot of incredible founders over the years, how should founders think about engineering efficiency in 2024 and beyond, and what else should they prioritize as they make these really critical decisions and partner with engineering leaders?
Ori Keren: I think it boils down to some of the same things. First of all, when you think about your talent, think about what we spoke about before. I don't want founders to say, hey, generating code is easy now with Gen AI, so I don't need... no, you need strong developers.
You need the ones who will help you review the code, read it, approve it. So that's one thing I think they should be thinking about. Another thing: when you start your journey, put those metrics and allocation practices in from the beginning.
I think if you start the relationship with your board, with your CEO, with your peers, by putting in this culture where engineering is accountable like any other department and can say, here are our investments, it's dramatic. Because you start the relationship with your board on the right foot.
So if founders are thinking about starting a company now: yeah, think about Gen AI; think about strong talent that can still read and review the important code; think about workflow automation for your pipelines, because even if you move fast, you can still get stuck there; and think about communicating allocation to your board.
Because if this is what you do from the first board meetings, and you get used to it, there's going to be a lot of trust and a good relationship with your engineering team. And you can probably avoid some of those hard times when people say, hey, maybe we're not delivering as fast. No: you have the data, all the time, from the beginning.
Conor Bronsdon: And by doing that, you change the conversation from engineering as a cost center to engineering as a value driver for the business. You start translating those key operational metrics for the board in a way that lets them understand the business impact. And that speaks to an important skill for founders in 2024 and beyond: the ability to bridge from the go-to-market side, the sales, CS, and marketing teams that have been using these operational and business metrics for years, to the engineering side of things.
But what other skills do you think are really crucial for founders? What advice would you give founders who are getting started today or are starting to learn on their journey?
Ori Keren: I have general advice for founders, which is: always be ready to fall, a lot of times, and bounce back. Learn from it.
I think even in these conditions, that's not going to change. You're still going to make a lot of mistakes. Hire great people; that's true for every period. I don't think those things change when we think about 2024; it's still the same.
Maybe one thing, from what I'm seeing out there: the jump from raising your seed to raising your A round, that's still okay; people fund those. Then the toughest one is: okay, I need to prove that I have a valid business, that it's repeatable.
So I would make sure that you have everything aligned when you raise your A, and that you have a lot of time until you need to raise your B, to figure things out. That's where I see a lot of companies struggle now. But that's general advice.
Conor Bronsdon: See, I feel like I have much simpler advice for founders around making sure you can raise your next round, and that's just: buy an .ai domain. You should be good to go.
Ori Keren: That too.
Conor Bronsdon: Fantastic. Ori, this has been great. Any closing thoughts you want to share? Any last predictions you want to get in?
Ori Keren: No, I would just say: stay adaptive. Because every prediction I suggested here is only as good as any prediction somebody else will give you.
And probably by February 2024 we'll have already learned that some of them missed. So we've just got to always adapt.
Conor Bronsdon: That speaks to another great piece of advice, not just for founders but for anyone in tech, which is continuous learning. Always keep learning, always keep iterating. It's something we alluded to on the show about engineering teams, and something you alluded to about falling down and getting up.
And hopefully we're doing some of that on the podcast here: improving what we're providing to you, and giving you an opportunity to learn from leaders like Ori. Ori, thank you so much for coming back on the show.
If you enjoyed this episode, let us know. We'd love to hear about the types of formats you're looking for; we're trying to experiment with new conversations. And if you really enjoyed it, consider leaving us a review: on YouTube, we always love a thumbs up; on Spotify, five stars; same with Apple Podcasts. Wherever you leave your reviews, we'll have a link in the comments. Thank you so much for listening.
Ori Keren: Thanks for having me. It was great being here in the Dome.
Conor Bronsdon: Yeah, it was a ton of fun.