"Developer experience is like achieving both of those goals. Take the highest impact things and help [developers] do it more effectively. Take the lower impact things that have to be done and automate it."
Live from San Francisco, Dev Interrupted explores the future of developer productivity with Ori Keren (CEO, LinearB) and Dharmesh Thakker (General Partner, Battery Ventures). Recorded before a live audience, the conversation examines the exciting possibilities and potential challenges of AI-powered tools, from code completion and review to more advanced agentic AI.
Moderated by Ben Lloyd Pearson, this episode captures the excitement of the event, including audience Q&A and feedback. Ori and Dharmesh share their insights on how these trends will shape the future of software development over the next 12 to 24 months, offering predictions and practical advice for engineering teams and leaders. Discover how these shifts will impact your work and the broader tech industry.
Show Notes
- 8 Habits of Highly Productive Engineering Teams
- Beyond the DORA Frameworks
- Introducing AI-Powered Code Review with gitStream
- Book a demo
Transcript
Ben Lloyd Pearson: 0:08
Welcome to Dev Interrupted. I'm your host, Ben Lloyd Pearson.
Andrew Zigler: 0:11
And I'm your host, Andrew Zigler. And, Ben, what's on your mind this week?
Ben Lloyd Pearson: 0:17
Yeah. So I have something to confess. I have a credit addiction.
Andrew Zigler: 0:22
Oh,
Ben Lloyd Pearson: 0:23
I signed up for a new AI service that promised all sorts of agents and other things that would do various tasks and solve problems for me. And they gave me a whole bunch of monthly credits to, like, hire these agents, but it really sent me down this rabbit hole about the future of work and how we value things. Like, are we going to be tokenizing ourselves, or our work, or the tasks that we carry out every week? Basically everything's priced in tokens now, and hard work costs more tokens, but you never actually know what the cost is until after the prompt. So, yeah, somehow I find myself wondering, how can I maximize the amount of tokens that I spend? Because, you know, spending more tokens means you're accomplishing more work, right? But anyways, maybe we'll talk about this more later. Let's dive into this week's news.
Andrew Zigler: 1:15
into this week's news, I got a great one about OpenAI's CEO Sam Altman in a recent Reddit AMA, but before we get there, this token thing you're talking about is stuck in my brain. It's so funny to me because if you think about it in some ways, when you go and you have your tokens and you're trying to get the result you want, it kind of becomes like the modern loot box. Like you have all these tokens and you're trying to get that like rare shiny thing. It only comes out every once in a while, so you gotta put a lot of tokens in the machine. And so, the idea that, you know, doing more work, spending more tokens means that you're more valuable, that's, that's fascinating. Maybe it's about the things that you can't get out of the machine that you put the tokens into.
Ben Lloyd Pearson: 1:55
I wonder how many tokens it's going to cost for me to have an agent that figures out how to best spend my tokens.
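Ben's joke about never knowing the cost until after the prompt has a grain of truth: per-token pricing means the bill depends on input and output lengths you can't fully predict up front. A back-of-the-envelope sketch in Python (the prices here are invented for illustration; real providers publish their own and change them often):

```python
def prompt_cost(input_tokens, output_tokens,
                input_price_per_m=3.00, output_price_per_m=15.00):
    """Estimate the dollar cost of one LLM call.

    Prices are hypothetical, in dollars per million tokens. Output
    tokens are typically priced several times higher than input tokens.
    """
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1_000_000

# A long prompt with a short answer vs. a short prompt with a long
# answer: the second costs more because output tokens dominate.
print(prompt_cost(10_000, 500))   # -> 0.0375
print(prompt_cost(500, 10_000))   # -> 0.1515
```

Under these made-up rates, the agent Ben describes would only pay for itself if the tokens it spends deciding are cheaper than the tokens it saves.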
Andrew Zigler: 2:01
Well, I really hope that people let us know in the comments on Substack this week how many tokens they think that would be worth. But now, to move on to this week's news, I do want to cover an interesting one from a Reddit AMA. Now, we all know Reddit AMAs, right? Where somebody in a high position somewhere descends into the throng of the masses and answers questions for the common folk. And we had one last week from Sam Altman. Someone asked: would you consider releasing some model weights and publishing some research about it? And in this AMA, Sam indicated that he thinks they're on the wrong side of history right now when it comes to open sourcing models and the technology around AI, and that it's something they might do in the future. Now, of course, in a Reddit AMA you should take everything with a grain of salt, but it definitely sparked a lot of interesting conversations online.
Ben Lloyd Pearson: 2:55
Yeah, you know, honestly, my opinion is that the risk of open sourcing these models and some of these weights is probably extremely low. The real commercial value comes from all of the training and the reasoning that they add to these models. So I do kind of agree that he's probably on the wrong side of history, and we're already seeing a lot of innovation and rapid iteration coming out of the open source space on this. So yeah, I kind of feel like this is a no-brainer.
Andrew Zigler: 3:22
When it comes to open sourcing AI, you know, it's a complicated formula. It's not like other open source technologies in the past, where you release the source code under a license that may or may not be permissive, that may or may not let someone build something of commercial value on it, but ultimately, when you release those things, someone else can replicate your success and carry the torch of that technology forward. With AI, when you open source it, it's a little more complex, because you might open source the model and maybe even the weights that you used. But if you don't open source the training code or the hyperparameters you used to get it into that state, or if you don't share your training data so that folks can understand what went into the model and then maybe further train it, then you're only giving them a really small part of the equation. And when companies release their technology in that way and call it open source, they still hold all the keys, because they're the ones that can run that model, continue to train it, and make it succeed at scale. So it's a very complicated conversation within the open source communities right now about what that definition even means.
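Andrew's breakdown amounts to a checklist: weights alone are only a small part of what a third party needs to reproduce or extend a model. A toy sketch of that point in Python (the artifact names simply mirror his list; this isn't modeled on any real release or tool):

```python
# Everything a third party needs to retrain or meaningfully extend a
# model, per the breakdown above (illustrative, not exhaustive).
FULLY_OPEN = {
    "model_architecture",
    "weights",
    "training_code",
    "hyperparameters",
    "training_data",
}

def still_missing(released):
    """Given the artifacts a company actually published, return what a
    third party still lacks to carry the torch forward themselves."""
    return FULLY_OPEN - set(released)

# A typical "open weights" release holds back most of the equation:
gaps = still_missing({"model_architecture", "weights"})
print(sorted(gaps))
# -> ['hyperparameters', 'training_code', 'training_data']
```

The set difference makes the argument concrete: calling a weights-only drop "open source" leaves three of the five reproduction ingredients in the vendor's hands.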
Ben Lloyd Pearson: 4:30
A story that I've been reading this week is actually a really cool project from this gentleman named Chris Kiehl, who's a software engineer at Amazon. He published an article on his personal blog titled Software Development Topics I've Changed My Mind On After 10 Years in the Industry, and he actually did this previously, four years ago, when he was six years into his career. It's a really cool list that sort of serves as a document of where he is personally on a journey, but it also shares a lot of interesting little tidbits that he's picked up. And there were two that stood out, things he changed his mind on: first, simple is not a given, it takes constant work; and second, most programming should be done long before a single line of code is written. I actually think maybe these two points are related, because good planning and good preparation take a lot of time, but they create long-term efficiency gains if you do them right. The thing I take away from this is that habits in particular can be a really powerful way of making these types of improvements. If you're habitually simplifying things in your life, it's a really great way to just keep accumulating benefits.
Andrew Zigler: 5:45
And also just habitually reflecting on things that you're learning as you go. This is a great article from Chris. I love the idea of him reflecting year after year on practices within engineering that he agrees with and that he doesn't agree with, and seeing how his own sentiment changes over time. You know, I keep a daily journal; I'm going to steal this idea for myself and re-evaluate how I think about technology over time. I think it makes for a really great reflection exercise. And the things that stood out to me in his roundup for this article were very similar to yours. An opinion of his that changed is that there's no pride in managing or understanding complexity, which really ties into your idea that simple is not a given. What I love about the nuance of this is that there's no pride in it: if you create an incredibly complex beast, where you maintain all the knowledge, you hold all the keys, and you understand how it works on an intricate level, you're really just creating a nightmare for your future self, because that becomes very hard to manage and ultimately to scale. And there's no real pride in building things that way.
Ben Lloyd Pearson: 6:47
yeah, you're reminding me of a time that I had to understand some Microsoft API documentation and it was the most convoluted, complex thing I think I've ever experienced.
Andrew Zigler: 6:57
And someone would probably be really proud of that, you know? But there's
Ben Lloyd Pearson: 7:01
Well, I assure you, I felt no pride trying to understand this mess.
Andrew Zigler: 7:09
And something from his lineup that I liked too was about what didn't change: an opinion of his that stayed the same, that he stayed firm on, which is that code coverage has absolutely nothing to do with code quality. We actually have an upcoming guest on the show from Diffblue who's talking about code quality versus code coverage, so definitely stick around for a future episode where we're going to dive more into that one, because I totally agree with Chris here.
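Chris's unchanged opinion, that coverage says nothing about quality, is easy to demonstrate: a test can execute every line of a function while asserting nothing, so a real bug sails through at 100% coverage. A minimal sketch (the function and its bug are ours for illustration, not from the article):

```python
def absolute(x):
    """Intended to return |x|, but contains a real bug."""
    if x < 0:
        return x  # bug: should be -x
    return x

# This "test" executes every line of absolute(), so a coverage tool
# reports 100%, yet it asserts nothing and the bug goes undetected.
def test_full_coverage_no_quality():
    absolute(-5)
    absolute(5)

test_full_coverage_no_quality()  # passes silently despite the bug

# A quality-focused test would catch it immediately:
# assert absolute(-5) == 5  # would raise AssertionError
```

Coverage measures which lines ran, not whether their behavior was checked, which is exactly the distinction the upcoming episode digs into.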
Ben Lloyd Pearson: 7:31
So what else have you been reading about, Andrew?
Andrew Zigler: 7:33
Yeah, I read an article about how chat is a bad UI pattern for a lot of development tools, which really stood out to me. There's a conversation I know many folks have had about, like, what's the best format for working with a tool like AI? And up until now, we've kind of been shoving LLMs into chat windows, because that was the easiest way to start extracting immediate value. But it causes us to get stuck, right, in this mental perception. And now the whole world is trained on LLM equals chat interface, which is just one way of interfacing with the technology. And, you know, this reminded me of another article I read on Artificial Ignorance, a really great Substack that I highly recommend everyone check out, which talked back in November about how, when we force AI into chat windows, we fail to capture its true potential. And that holds true for things like development tools with AI as well.
Ben Lloyd Pearson: 8:29
You know, I feel like we have pigeonholed it into a chatbot because it is so effective at that model; being able to talk to any sort of knowledge base has been pretty revolutionary, actually. I think a company that really understands this super well is Cursor; they come to mind. And there have been some rumors recently, which I don't believe have been confirmed, that they are now the fastest company to ever scale to a hundred million in ARR; it was, I think, 12 months or something like that. And I think the key to their success is that they have varying degrees of interfaces that are scaled to the level of challenge you're trying to resolve. So sometimes it's just trying to predict your next step, the things you want to type in the next moment. But other times you're doing things like selecting a section of the code base and having a conversation with the model about how to change it, sort of iterating on it through conversation. This ties in well to today's episode, because one of my favorite moments from this episode was actually from some audience engagement we had, just as a little foreshadowing. Rob Zuber, the CTO at CircleCI, touched on this point really well, and he provided a great example. He said: today, we ask these GPT models to create a web page based on certain user criteria. Tomorrow, we may just have an API on your website, listening for users to show up, that can create completely custom content in real time based on what that user is trying to achieve. So we may have to completely shift our perception of how we integrate AI into all of our platforms. You definitely want to stick around to hear about that in the episode.
Andrew Zigler: 10:17
Yeah, that'll definitely be an insightful one for sure.
Ben Lloyd Pearson: 10:20
And then one last fun story that I wanted to share before we head off to our full interview today is that NASA is going to be doing a Twitch stream from the International Space Station.
Andrew Zigler: 10:32
Oh, nice.
Ben Lloyd Pearson: 10:33
Yeah. So the plan is to discuss daily life aboard the space station and some of the research they're conducting. It's happening on February 12th, so go check it out if you have the chance. They've also said they want to do more of these in the future, so it's really cool. I love space. And by the way, we have always wanted to get an astronaut on this show. So if you're an astronaut listening right now, or if you know one, or even if you just work at NASA or do cool space things, reach out to us, because we would love to have somebody who's been to space, or helped people get to space, on this show.
Andrew Zigler: 11:09
that would be a cool field to definitely cover for sure. There's so much expertise to explore there. I love the idea of NASA doing a Twitch stream. What a great way to engage with folks. Um, but also especially with like kids and get them excited about science and space. I think that that's a perfect platform for them and with a big audience. So I'm definitely going to tune in.
Ben Lloyd Pearson: 11:29
Alright Andrew, so this is the point where we get to make a prediction about an event that will happen in the future to us, but our audience will find out after the event and they'll get to see whether or not we are accurate.
Andrew Zigler: 11:41
Oh, I love this part where all of our biases go on display and we figure out if we're right or wrong.
Ben Lloyd Pearson: 11:47
So what's your Super Bowl prediction? Everyone's got a prediction. What's yours, Andrew?
Andrew Zigler: 11:52
Oh, man. Well, for American football, my prediction is that, well, you know, I can't even necessarily think about what's going to be happening in the game or who's going to be playing. All I know is I still need to look up the Puppy Bowl information for myself and the people in my household. But the biggest thing that I'm already anticipating, and future me, hopefully, fingers crossed, is on the finish line saying yes, I was right, is that there's probably going to be some kind of either tragically good or tragically bad AI-generated commercial. Maybe we'll get both, but there's definitely going to be a lot of talk about how AI was used in the Super Bowl commercials. That's my personal prediction, but maybe that's my own bias speaking.
Ben Lloyd Pearson: 12:32
Yeah, so I actually made this point in a post on LinkedIn that went nowhere, and I think it's because people are just sick of seeing AI-generated videos at this point. But I agree with you: I think we're going to see at least one, maybe multiple, commercials making AI part of the punchline. But also, you know, I was talking about my token addiction: I spent a quarter of my monthly tokens generating videos for this LinkedIn post before I finally just went to Sora, which I have as part of my monthly membership there. But why are footballs so hard for AI to understand? I don't get it. In every video there are like 10 footballs flying everywhere; people are morphing into footballs, and footballs are scoring touchdowns by themselves. But there's only ever one football on the field in a game. So something is training these models to put footballs everywhere.
Andrew Zigler: 13:33
Or there's a lack of training, maybe. What I like about your post is you really accurately called out that a lot of that video footage is fake; those photos are licensed, and they're owned, maybe, by private parties who won't let them be trained on. And so maybe the AI just really does lack that kind of understanding. It kind of reminds me of how you're getting all these crazy results, and I'm sorry to hear that you spent a quarter of your monthly budget of tokens trying to get this one. If anyone in our audience is good at the economy, please help him budget these tokens so he can make it through. But I will say that LLMs just lack a really good spatial understanding. A fun example, which I tried just before we hopped on here: I asked ChatGPT to finish the sentence, "The football flew 200 feet overhead. I jumped as high as I could and..." And of course it wants to finish the sentence, and it says: "stretched my fingers towards it, feeling the leather brush against my fingertips, just enough to tip it back into my grasp as I crashed to the ground, breathless but victorious." But as we all know, humans can't jump 200 feet in the air to catch a football. It only has a semantic understanding of our world, not a spatial one.
Ben Lloyd Pearson: 14:45
Yeah, man. I wonder what universe it comes from. Like either it's a universe where 200 feet is actually not very much distance or, people are 200 feet tall,
Andrew Zigler: 14:55
Or, Ben, it assumes that maybe you're in space, or you're on the moon, and you're jumping and you could maybe get it. So maybe it's just future thinking.
Ben Lloyd Pearson: 15:07
Yeah.
Andrew Zigler: 15:08
But with all that said, this has been a really fun news roundup, and thanks for sticking around. Coming up, we have Ben's conversation about the future of developer productivity with LinearB CEO Ori Keren and Dharmesh Thakker, General Partner at Battery Ventures. The format for this one is a little different than what we normally do, but there's really great audience engagement as well. This conversation was recorded in front of a whole bunch of folks at The Melody in San Francisco, which is a beautiful venue. We have lots of video of it too, so be sure to check it out. It features some really great Q&A, like I said, and you don't want to miss it.
Ben Lloyd Pearson: 15:43
Yep. Stick around. Habits are a powerful thing. Maybe you've read one of the many books out there about the habits of highly successful people. Well, LinearB is out with their own book, The 8 Habits of Highly Productive Engineering Teams. This practical guide offers advice and templates to help you establish durable, data-driven habits. It covers things like setting actionable team goals, coaching developers to level up their skills, using monthly metrics check-ins to unblock friction, and running more efficient and effective sprint retrospectives. This guide has something for everyone on your engineering team, so check out the link in the show notes for The 8 Habits of Highly Productive Engineering Teams. Today we're talking about the future of developer productivity, and I've got Ori Keren from LinearB and Dharmesh Thakker from Battery Ventures. So just give us a moment, a round of applause for our guests that have joined us today. Now, this session is a little bit about the opinions of the people on this stage, but we also have a lot of really brilliant people in this room, and it's why we've brought you here. So, you know, we're going to share our predictions about the next 12 to 24 months, but we're also just as interested in what you out there in the audience think is going to happen. There will be some opportunities for you to share your perspectives during this session, so don't be shy. I know you're all eating now, but as you finish up your food you can participate a little more; we really want you to learn from us, but also from the people around you. So, to get to our predictions: we have five predictions about the future of developer productivity. And before we get into that, I've heard, Ori, that you have a bold prediction you want to make.
Ori Keren: 17:41
Yeah. We were talking about it in the podcast we recorded together, and my bold prediction is that, even though there are a lot of levers to pull to increase developer productivity in the next year, developer productivity is actually going to decline in 2025, because we're still in the storming and norming phase. I think people are still experimenting with a lot of technologies, and they still don't know how to bring them together. In every change, what happens is you get a small dip before you get your processes together. I think 2026 or 2027 is where we see the peak, but 2025 is still going to be challenging, with a lot of experiments, so I actually predict developer productivity will decline a bit.
Ben Lloyd Pearson: 18:44
Yeah, I think that's a really great observation, because 2024 almost felt like it was the year to experiment with a lot of the AI technologies that are out there. Businesses and engineering leaders get some room to experiment, but eventually you have to start showing the fruits of that experimentation, and I think people are going to see that. A lot of our predictions today really center around all of the new generative AI changes that are happening: how they're impacting your senior developers versus non-technical people within your organization, and also how this is going to shape how organizations respond to DevEx concerns over the next couple of years. So with that said, I want to jump into our first prediction, about AI-powered development tools. I feel like a day doesn't go by at this point where I'm not talking about these things; particularly, I feel like agentic AI is coming up a lot. But we should all be familiar already with the code generation and auto-completion tools that are out there. This is things like GitHub Copilot; OpenAI, I believe, has a tool in this space; and quite a few other competitors have emerged on the market. Really what these are doing is enabling developers to write code faster, often with fewer errors, with some exceptions to that. And I think one thing we can expect is more specialized AI models tailored to specific programming contexts. I wonder if that aligns with what you two are hearing from the market, or is the name of the game with AI, particularly around auto-completion, going to be customizing it to workflows?
Ori Keren: 20:28
Yes, I think I would divide it in two. The industry now is ready to experiment with what we call assisted AI, and we've seen it in the last year: every second software company is experimenting with a copilot, or something that helps you with AI-based review. I can see a lot of energy being put into that. We're trying to measure the productivity coming out of it, and it's still controversial: you can see a lot more throughput, but on the other hand, sometimes the code churns faster out of those systems. So definitely, in everything that's assisted, where you have a developer who still leads the effort, we can see organizations experimenting with it, and a lot of cool things are happening. Now, if you move to agentic use cases, which is where the future is, I think it's almost like a year behind. You see early experiments, but you don't see confidence from organizations to go and deploy, say, an agent that takes a simple task from a Jira queue and issues a PR, or an agent that fixes a bug, or an agent that looks into SRE problems and fixes them, or security scans, or whatever. And I think it's going to take time, probably two or three more years, for people to feel more confident with that. It's almost like autonomous cars: the technology was there, but then you had to put a lot of regulation around it. The same thing, I think, is going to happen in software. The technology is going to be there next year, but there's still a lot of problems to solve around IP, like who owns the IP, around legal implications, around compliance, SOC 2, all that type of problem. So technology will move faster, but it's still going to take time to streamline all of these things into mainstream development.
Dharmesh Thakker: 22:41
To add some color: first and foremost, thank you so much for having me here. There was so much depth that we covered in the last panel; I'm a big-picture guy, but it's great to see how all this great work is coming together. Your question, Ben, about whether AI tools will be ubiquitous makes total sense. Think about the roughly 5 million developers in the U.S., on average getting paid $100K to $200K; that's almost $500 billion being spent. But so much of that work is BS work. You've got to fix syntax issues, you've got to do documentation, you've got to create unit tests. All this ancillary work, which no developer got a CS degree to go work on, can be automated, and it just makes economic sense for companies, because they can take their most valuable resources and point them at driving innovation instead of fixing syntax issues. The developer's experience is so much better, because they can focus on more fulfilling work. And for the customer it's better, because you can push code, hopefully higher quality code, faster. So for all those reasons, the economics of these tools make a ton of sense. The problem right now is there's just so much fragmentation. It's like the games business: every week there's a new tool. There's Copilot, now there's Codium, and Cursor, and Magic, and Poolside. There's like 20 different tools. So I feel like in 2025, or perhaps 2026, you'll see the market converge to a couple of these tools that become the enterprise standard or the startup standard. But right now there's a lot of fragmentation. Agentic is like two steps beyond that: first you standardize on a set of tools and best practices for how code assist works, how PR automation works, what you guys talked about with LinearB last time, how you push that forward to customers and do A/B testing with customers.
Once we standardize on that process, then we can go implement it in an agentic manner. But I think that's at least two to three years away. So that's my prediction on that.
Ben Lloyd Pearson: 24:42
Yeah, and I think you're actually answering one of the questions I have, which is: by the year 2026, so after this next year, will developers actually trust the reviews they get from these AI models? Would we be able to get there that quickly? It seems like both of you think we're more at the two-to-three-year mark before we can really maximize the productivity impact of these tools. Do you think that's accurate?
Ori Keren: 25:10
You're asking if developers will get a review from a model and say, hey, I trust the review 100%?
Ben Lloyd Pearson: 25:17
Yeah, I mean, I think today there's so much skepticism. it's like a trust, but verify, right? Like we always have to verify everything that comes out of these models.
Ori Keren: 25:24
I'm still looking for the developer who will look at a peer review and trust it 100%. So I think it's going to be the same: some of the comments you'll accept, and for some of the comments you'll say, hey, maybe you don't understand the broader context. But that's when it becomes reality.
Dharmesh Thakker: 25:43
Developers trust open source, they trust the cloud environments the code runs on, they trust peer reviews. If a system is going to learn from prior PR merges and apply a bunch of automation and intelligence to something they don't want to spend a lot of time on anyway, because they would rather be writing more source code, I think there's every reason why the system can be trusted. However, you would still go verify some of the more extreme cases, some of the more important builds. But we're certainly seeing, across the call it 100 to 200 companies our firm is involved with, ranging anywhere from $5 million to $3 or $4 billion, that the whole notion of PR merges getting automated, at least low-end PR merges getting automated, is pretty standard right now. I don't think there is a trust issue, but you've got to verify the more complex changes.
Ben Lloyd Pearson: 26:32
So I'm curious, from the audience: we have three technologies listed here that we think are going to be a big part of AI over the next three years: auto-completion and text generation, agentic AI, and AI code reviews. Just by a show of hands, how many of you are actively adopting or experimenting with at least one of these? Yeah, that's what I expected; it's definitely most of the audience. Now I'm curious if any of you out there are doing all three of them. I've got one hand in the back. Would you be interested in sharing anything about your story of adopting these? Not to put you on the spot, you can tell me now,
Audience member 1: 27:12
There was a lot of hesitance among some of the members of our team. Our team is small, eight people, all engineers, and some of them have been professionals for 20 years, so they've seen all of these waves of AI, and they've been very against it. Where they have relented is on the more mundane boilerplate code that all these frameworks require, so with code generation and auto-completion it was really easy for them to see the value. We rely heavily on Rust, so when the automation makes mistakes, the compiler will tell us; those are the combinations, and Rust kind of makes it a little safer to adopt this. I think other languages will start to adopt some of that intelligence in their compilers over time. On the automated bug fixing and code reviews: we have thousands and thousands of dependencies, and keeping those dependencies updated would have been a whole job function for us maybe two or three years ago. Now we don't even review them; we have a bot that reviews another bot opening those update PRs, or dependency maintenance PRs, which is really nice. And then for the agentic side, we try to keep a pulse on what's happening outside of our world. We have our customers, we have our Slack, and then we have our internal communications, but we felt, probably about eight months ago, that the world was moving much faster and we had to be really concerned about local maxima. So we're constantly pulling in data: new posts that match certain keywords, coming from a variety of feeds ranging from Google News to Hacker News. And we're experimenting with an agent that we have nothing to do with: it'll propose new prototypes, new bounties even, to bring in other humans. We're constantly trying to keep our finger on the pulse, and so are our customers.
Ben Lloyd Pearson: 29:26
That's great. It's a great story, and actually very amazing. I mean, it seems like you're ahead of the curve in this audience. Yeah. And it's great to see so much adoption.
Audience member 1: 29:37
Weaviate, for instance. So Dharmesh and I are both involved in this company called Weaviate, and they do all three of these too, for sure. That's a vector database, and it's open source. Bob says hi. Their agent lives in Slack, and if you ask a question, like, you know, you need some help with a feature or a new API or new syntax, it'll just respond and drop in the docs. That's one example, but there's a ton of other examples of agentic workflows within their business, within our business, within most businesses, I would say, at least that I know about.
Ben Lloyd Pearson: 30:16
Awesome. I really appreciate the insight, and for anyone else out there that wants to share their thoughts, there will be more opportunities, so don't worry. So before we move on to the next insight, I want to ask both of you: what do you think is going to be the biggest impact on developers over the next year or so, in terms of these new AI tools?
Dharmesh Thakker: 30:37
I think the near term impact, in 12 months: engineers are doubling down as data scientists, right? It used to be that you create a model and then you provide that to engineers to go write an application on top. Now the role of the AI engineer is doubling down, saying, hey, you have these extremely powerful foundational models accessible through an API. You can iterate on your application logic, make sure the models aren't hallucinating, and you're doubling down as a data scientist and a software engineer, implementing much higher order logic that customers benefit from. So I think the AI engineer role is getting super elevated, with much higher efficacy, more so than throughput. It's not about the lines of code, it's how we can impact the business by combining application logic with foundational models wrapped into a killer application at record speed. So the role is becoming really important, and they're doubling down as data scientists.
Ori Keren: 31:35
If I can add, I think it's almost like every developer role, you will need to complement with some AI capabilities. We're hiring right now, and we need a full stack engineer with AI capabilities, we need a back end engineer with AI capabilities. By the way, they don't need to write the models or be the best data scientists, but they need to know how to leverage them: where do the models live, what are the dos and don'ts, what can they consume, et cetera. So I think it's going to transform a lot of their roles.
Ben Lloyd Pearson: 32:11
Yeah, we'll definitely get into that. And it's a great segue, actually, into our next key prediction, and that is that the senior developer is going to become more valuable in your organization. I think there's this common belief that generative AI is having a bigger impact on the lower skill developers, like your junior developers or fresh graduates. But more and more research is coming out that seems to indicate that it's actually your senior developers, in fact, that are not only going to reap the most benefits, but are going to be the ones you depend on the most heavily. The key things driving this, and again, AI is going to show up in a lot of these: natural language interfaces are dramatically shifting how people learn new technologies and skills. In fact, this technology really is here today. If you're looking for one use case to adopt when it comes to AI, it's giving your developers something like ChatGPT or Copilot, a tool where they can have a natural language interface into their code base, with the key benefit being knowledge and skills acquisition, so they can more rapidly achieve proficiency in new tools, technologies, or processes. Gen AI is also likely to introduce a lot of new bottlenecks to your organization. If all you're doing with AI is producing more code, that doesn't actually mean you're going to be shipping more code into production. There are still going to be code reviews required, you're going to have to run everything through your CI and CD pipelines, and it's senior developers who feel the biggest negative impact from these problems. And in fact, I really believe that structured rollouts are the name of the game here. There was a recent research article from Google that found that, using a structured rollout, there was a 21 percent velocity improvement for what they call enterprise grade tasks. They gave a range of developers, from the most junior all the way up to the most senior, a structured way to use generative AI, and on average they saw a 21 percent improvement, but the improvement was actually bigger than that for the senior developers. And then lastly, security always comes up; it's a very recurring topic. It's going to continue to shift left and be more integrated into your developer's environment. You're going to be detecting vulnerabilities much earlier; developers will be empowered to catch issues before they even hit your CI/CD pipeline, for example. And your senior developers really need the support and enablement so that the burden of maintaining a security posture doesn't fall onto them. So I want to get your insight: how do you think senior developers are going to be impacted over the next year?
Ori Keren: 35:07
Yeah,
Ben Lloyd Pearson: 35:07
that's a really
Ori Keren: 35:08
interesting topic. I think like, there's two things that are gonna happen with senior developers. First of all, Whoever is a senior developer today, they kind of grew without, AI, you know, supporting them. And I think what it will cause, or what causes like you have now in, um, key positions in some companies, people who actually learn the code base on their own, you know, AI is assisting you to like, Accelerate fast. and these people are going to be assets. It's going to be really, really hard to lose them. there's something also, with senior developers that are, again, didn't grew up on, on AI supporting them. that they spark their life. I think, and it's debatable, uh, willing to hear like other opinions, this creativity thing going on because. They don't just say, you know, they get a task, hey, you should do this. They're the people that can say, hey, you know, let's refactor this entire area. Maybe let's change the product and do this type of things. And I think the senior developers of 10 years from now won't have the same life knowledge because, They're going to use a lot of AI like to gain the knowledge and rely on it a lot more. So I predict the senior developers of now are going to be a major asset. I would say the second thing, the senior developers that are Going to be here in 10 years from now are going to be a, five people team. They want to activate agents that take like, you know, pick up, you know, simple tasks from a JIRA queue and work for them. They're going to have agents that do AI based review. They're going to have agents that fix SRE problems that they see in security. So it's almost like, a strong senior developer would be like a 10 X or a a 50 X multiplier. So it's really, really going to be interesting, how senior developers evolve over the next years.
Ben Lloyd Pearson: 37:22
Are there any skills that you think senior developers should be focusing on right now?
Ori Keren: 37:28
Definitely: go and experiment with AI, lead the way, don't just settle for the assisted stuff. I'm asking my senior developers right now, my engineering manager is asking senior developers right now, to go in and experiment with agentic use cases, those types of things. But I think, again, like I said before, it's also very, very important not to lose this spark of creativity, and to offer ideas and all these types of things that make a senior developer great.
Ben Lloyd Pearson: 38:04
Yeah, you know, something that I'm actually hearing a lot is the notion of a product oriented engineer: engineers who can actually take an idea or a technical challenge all the way down to the end user of that challenge. So even if you're a backend engineer optimizing a database query or something like that, you should still be thinking about the real world usage of that query to make sure it's optimized for that situation.
Dharmesh Thakker: 38:30
Just on that point, you asked the question, what should senior developers be thinking about? I think of senior developers as what the industry calls 10x developers, right? It's not necessarily that they produce 10x the amount of code; it's that they have 10x the amount of impact through architectural choices, best practices, what have you. And all this talk we hear about AI is so focused on developer productivity as measured by throughput, saying, hey, AI is going to generate more lines of code. Who cares? Doesn't matter. The question is, what code is going to drive the most impact for your end customers? So I think senior developers can lead the charge in focusing on developer efficacy as opposed to throughput. And by that, what I mean is: when you contribute a set of code, how do you then A/B test it? How does it impact the end customer? Does it drive the intended business impact? And then how do you bring it back, close the loop, and figure out how to prioritize the source code that can ultimately impact the end user? So I think that's where a 10x developer can have a 100x impact: by closing the loop end to end, pushing code faster, but higher quality code, measured by end customer impact, and using that to learn better best practices. And then that can become a training ground or coaching ground for some of the junior developers to focus on higher impact work. So this whole concept of developer efficacy, where productivity is measured by customer impact, not lines of code in terms of throughput, is an evolution this industry needs to go towards.
Ben Lloyd Pearson: 40:00
Yeah, so I want to throw it back to the audience for just a moment. I'm curious, through a show of hands again: are you implementing a practice today to help your senior engineers specifically adopt generative AI, versus just a general practice? Again, one hand. I don't know who that is, but do you want to share what you're doing?
Audience member 2: 40:21
Hi. It's more kind of encouraging my seniors to explore, basically just so we can save time. So, go explore and do more of that, just so we can cut off time. And often what I see, and maybe it's a follow up question here: developers, they code in a programming language, that's how they express themselves. And now all of a sudden they need to express themselves in English, which kind of makes product management in a way redundant, I would say. So I'm kind of pushing my engineers to think more like product managers: set the prompt to solve the problem in the context of a user, or the value that you're trying to produce, kind of as an outcome, and then, okay, now help me to code it. So that's really where I'm kind of pushing my seniors to work.
Ben Lloyd Pearson: 41:30
Yeah, and I think, you know, something that resonates with me on that is that these AI tools can actually be pretty good devices to role play your user, right? I think developers aren't always as accustomed to meeting customers and talking directly to users; it can be difficult for them sometimes. But ChatGPT or tools like that can actually be really great if you just tell them to be this user and answer my questions, or ask how a user would interact with a system like this. So I think there definitely is a lot of potential there to help your developers think more like a product manager and think more about product design. And I want to keep us moving forward into our next prediction, and it's a topic that I think you really can't discuss without talking about developer experience, because they really do seem to go hand in hand. You know, I hear a lot from engineering leaders that they have a hard time understanding how to invest in developer experience. I think it's not going to go away, but if you can do it in a strategic way, there are some pretty big benefits you can achieve. Some of the technologies we're looking at in this space are particularly around unifying development environments. Things will become more integrated over time; we'll see more seamless coding-to-testing-to-deployment tool chains. Imagine a world where code reviews start to become more automated inside of the IDE as the developer writes their code: you can have AI agents or other services running within the IDE, telling them there's a problem before they even send it to their team. I also think that documentation and knowledge sharing are potentially going to shift dramatically. The point we have on the slide, I believe, says it's the end of documentation.
But really, I think what we're seeing is developers relying more on automated documentation and AI assistance to generate and maintain their documentation. And I really think there are going to be some significant improvements to knowledge sharing that come out of this, as knowledge bases start to get replaced by hyper focused LLMs. Think about moving from an internal forum to a knowledge base that is updated automatically through a natural language interface. And then there's also just the constant force of asynchronous development. The pandemic normalized remote work culture, and despite these recent return to office mandates that have come out, virtually every modern enterprise is distributed to some degree today. So if your tools don't support asynchronous development today, you really are sort of handicapping yourself, even if you are an office based culture. So, let's see what you guys have to say about that.
Ori Keren: 44:21
Really good topic, because the way you ask the question is about the problems that cause developer experience not to be great. When I take a step back, there's a debate out there: how should we measure developer experience? There have been companies saying quantitative metrics are the best way to do it: you've got to measure build times, you've got to measure flaky tests, or how much time I wait for an environment, et cetera. And then others say, hey, let's do an academic survey or something like that. I say nobody wins. The most important thing is to have a framework to fix the problems. Whatever metrics you collect, whether you do it quant or qual, or both, it's just the beginning. And the topics you spoke about, like, how do I improve the day to day of the developers to have a better developer experience? By the way, some of it is that I want my developers to have a work life balance, et cetera, but at the end of the day, I want them to be more productive. So what we're missing in this space, I think, and it's hard for me not to say that at LinearB we're attacking this problem, is some sort of framework and infrastructure that will help you solve these problems. That will help you see, hey, you know what, reviewing code, like we talked about PRs and small PRs and all that in the previous session, reviewing code is my bottleneck. Let me deploy an automation or an AI based solution that helps me overcome this problem. So instead of staying in the argument, let's take it to the next step. Let's build an infrastructure and a product that lets you solve these problems. I think it's really, really important to go to the next phase. And here's the future that I see, maybe it's a bold prediction. We started seeing it with customers, but I think it's going to get more and more extreme: people that have awareness of developer experience and want to improve it,
they're going to go from an hour from commit to production to minutes, because they're going to have AI agents working for them, solving pipeline issues. And people who are stuck behind are going to stay with 20 days of cycle time. So the gap is going to get so big between these companies. That's one of the problems I'm really passionate about, and I'm eager to help organizations solve it.
Dharmesh Thakker: 47:11
Perhaps, in my mind, when I think about developer experience, it comes up as a new term, and I'm like, what does that even mean? Because we talk to 6,000 companies a year, and now there's this trend of developer experience, developer productivity, and it keeps coming up. So when we think about it internally, it's kind of a simple framework: developers are doing so much more. I told you they're doubling down as data scientists, they're scanning code in some cases, they're taking security responsibility. These are things they want to do, and these are things that have an impact on customers and the business, right? And then there are things that can be automated. Developer experience is achieving both of those goals. Take the highest impact things and help them do it more effectively. Take the lower impact things that have to be done and automate them. And then you iterate on that so that your most valuable resources, developers, can drive the biggest impact to your business without losing sight of all the other things that need to be done. Finding that balance, I think, is super important, and without a focus on developer experience, you can't achieve that. And what's encouraging is more and more companies now seem to have a dedicated title for developer experience, a dedicated budget for developer experience. So companies are recognizing that, hey, by focusing on developer experience, you can get a very clear ROI, driving much more quality code and business output while keeping developers happy, so you don't lose them to your competition.
Ben Lloyd Pearson: 48:35
Yeah, and this is actually something I would like to take to our audience now. So how many, through a raise of hands, have ever heard, usually from a non engineering person, the statement, "I don't understand why we invest in DevEx"? Have you ever heard that statement? Wow, no one has ever heard that. Oh, I've got Tara in the back. Thank you, Tara. Tara, did you successfully navigate that situation and make the argument for DevEx? And if so, would you like to come share how you did that?
Audience member 3: 49:05
God, I have so many hot takes right now, so I'm trying to focus on this particular question. You know, whether you call it developer experience or developer productivity, back in the day we called it the build release team, or the infrastructure team, or the platform team, right? The act of supporting the ecosystem and the environment within which a developer works has had many names over the years. And as someone who has done almost entirely this work, most of the time I've had to justify my existence, because, you know, I'm listening to Dharmesh, and it's like, oh, look at the developers focusing on the right thing. Well, how do you define the right thing? That, I think, is the interesting part. And so if a platform team, infrastructure team, or DevEx team is able to be a multiplier, and you can show the data that supports that, then that becomes a really easy argument, which is why things like these reports come out. I think it's increasingly less of an issue, but for the first 20 years of my career it was absolutely an issue, and what you had to show every year when budget season came around is what value add you are contributing. So, I will pause there, otherwise I will totally derail your whole conversation.
Ben Lloyd Pearson: 50:10
Yeah, I appreciate it. No, that's wonderful. So I'd actually like to ask both of you: if someone were investing in DevEx today, where would you recommend they start?
Ori Keren: 50:19
I would say always start with measuring, but don't stop there. Find your best way to measure. I'm a strong believer in quantitative metrics. When you're only based on qual, I always give this analogy of a sales leader coming into a staff meeting, Dharmesh. Imagine a sales leader coming into a staff meeting and saying, I don't need the empiric metrics around my funnel in sales. I don't need to know how many SQLs, opportunities, or customers I have. I'm just basing myself on what the reps are telling me. I think the same thing needs to happen in engineering: you've got to have empiric data, and qualitative data can definitely support it. So I would start with measuring, definitely, but I wouldn't stop there, because the most frustrating thing is to see the problems and not have the tools to fix them.
Ben Lloyd Pearson: 51:18
Yeah, that's a great insight. I think I'll just move on to our next prediction, then. And that starts to get into how the roles of developers are likely to change over the coming years. We've talked a little bit tonight about how AI, Gen AI, machine learning, stuff like that, is becoming a thing that really all developers need to be focused on. So developers are going to be asked to have a broader skill set. There's going to be more emphasis on integrating AI and machine learning into applications, using various models, and many developers are going to need to become much more familiar with things like building, deploying, leveraging, and monitoring AI models and integrating them into applications, even if they don't specialize in these tools. In fact, there are a lot of products coming out specifically for developers who aren't specialists in this. There's also, we think, a good chance that productivity as a whole is going to start to become a job function for people, to the point where we may even see the rise of DevOps engineers who focus more on productivity, like a productivity engineer, for example. And then the last two I actually added at your request, Ori, so I'm interested to see what you have to say about them. Two technologies that really seem to be becoming more commonplace at a pretty rapid rate are cloud based IDEs and serverless technologies. I know these get some groans from people because they've tried these tools in the past and haven't been completely convinced, but more things are moving into the cloud, and they're offering better integrations, debugging, and collaboration. And then serverless is just making it easier to decouple development from the actual deployment services. So, how do you think developers' roles are going to change over the coming years?
Ori Keren: 53:12
Yeah, so cloud based IDEs definitely are very important; it's interesting the way they evolve. But I actually want to focus on the roles I see that I think are going to be new. We talked about the fact that almost every engineer, every developer, whether full stack or backend or whatever, will need to know how to adopt AI. Beyond that, I think there are some interesting roles. One is, think about InfoSec. I think there are going to be people who specialize in InfoSec for AI. I can see it right now in contracts that we have: which models does it use? Where do you deploy them? How do we protect IP? How do we make sure the models don't learn on the data we send them? There's going to be a role of InfoSec specializing in AI. So that's one thing. And then I think there are two more interesting roles that are going to evolve. I think there's going to be a head of AI development. If we build the right infrastructure for these people, they're going to be the ones who say, okay, here's where we're allowing agents to take a simple task out of a Jira queue and start working on it, and here's the policy where we're not allowing you to do it. Here is where we're allowing you to find a bug and fix it automatically, and here's what we're not allowing you to do. So you're going to have a head of AI development who defines the policies and implements them. And then you're going to have the practitioners, and hopefully they have a product to do it with. They kind of write the rules, write the policies that say, maybe in this microservice I'm allowing more, maybe in this microservice I'm less permissive. So I foresee these two very, very interesting roles, a head of AI development and the practitioner who does it day to day. I don't know if they're going to sit inside a developer experience team or elsewhere, but these are two very interesting roles that I can see.
Dharmesh Thakker: 55:29
Maybe a slightly different take: full stack engineering used to be kind of a merger between front end and back end. I think in the next couple of years, full stack implies front end, back end, security scanning, and AI engineering. A full stack engineer will be expected to touch all those aspects, right? And the second thing I see changing is that even though engineers are not responsible for some of the downstream things, when it comes to CI and CD selection, observability tools, feature flagging tools, if you will, they still become heavy influencers. A lot of developers are trying these tools and influencing what the standard should be for other teams to standardize on. So a lot of purchase decisions within IT are going through the route of developers trying them and influencing them, and the things developers do encompass a much larger exposure area, if you will. You may ask, how do they do all that? It's because a lot of the work they had to do can be automated now, which frees up time for the more strategic work.
Ben Lloyd Pearson: 56:32
That's great. So I'm wondering, from our audience, by show of hands again: how many of you have put somebody in place whose job it is to lead your effort to work AI into your product, either on the engineering or the product side? Yeah, I see a couple hands back there. Do either of you want to share your opinion?
Audience member 4: 56:52
Absolutely. So, for us, it's people who already do CI/CD. We have kind of two facets of that. One, we look at how we can improve the workflow for our customers, right? How do we integrate capabilities into our product? I'll spare you the details of all that. The other is that we have a large number of customers who are doing the same thing, right? They're trying to figure out how to deliver software with AI capabilities, as everyone described, backed by LLMs. How do we help them build and validate those capabilities so they can confidently put them out into production? So our whole ecosystem is changing. And now I'm going to go on a tangent and talk about what you were talking about before. As we look at this, I think a lot of what you're talking about here is the next year, the next two years, right? How do we subtly change what we do from a software perspective? How do I get some code generated so I don't just write the code, that kind of thing. But one of the things we spend a lot of time thinking about, given that we're in the flow of software delivery, is how software fundamentally changes five years from now, ten years from now. There was a quote up there about 2040, right? Are we just generating the same programming languages that we're writing ourselves in 2024? I don't think so. I think we'll construct software in entirely different ways that we haven't reasoned about yet. And so, if you're in the software delivery industry, you need to be thinking about that. Not just, can I write code faster because I have a coding assistant, but: do we write code at all? Is that how we give machines instructions when the machines are generating the instructions and the machines are receiving the instructions? Why are we writing it in a programming language designed for humans?
And what happens then? Right now we ask a model: can you please generate a webpage, so that if someone hits this URL, this webpage will be returned. Why don't we just tell the model: when someone hits that URL, return a webpage that looks like this? Why are we writing code at that point? I don't know the answer to that question, but I think there are much more interesting problems 5 and 10 years out than there are in the next 12 months. So we're thinking about all of them. That's my answer.
Ben Lloyd Pearson: 59:10
Yeah, and it's great that you can have that long term of an insight, too; I think it's very valuable to have. So that brings us to our last prediction. This is actually one that I added to this list because it's come up in a couple of conversations I've had on Dev Interrupted, specifically one with Bernd Ruecker that should be coming out next month in January. And that's the rise of this concept of citizen developers. Specifically, I think you're going to see non technical and less technical people start to play more technical roles within your organization. Low code, no code: these platforms have been around for a while. Usually when I bring them up, audiences like this roll their eyes and think, oh, that'll never happen, it's not going to work for us. But these platforms are continuing to evolve, and non developers are being given more and more capabilities to deploy applications and integrate with various systems, APIs, and databases. Teams are going to become less dependent on developers to produce things that sometimes actually go into production. It may not be your central app or service, but they are often building things that touch customers in some way. And as a part of this, I think we're also likely to see quite a few more integrated collaboration tools. Expect more sophisticated tooling that blends developer tools and IDEs with more feature rich capabilities; kind of imagine Google Docs for code. This would allow teams of all technical levels to collaborate on code, discuss changes, and resolve issues in environments that are more familiar to them. So I'd like to hear what you all think about this rise. We've already talked about senior developers and how they're likely to be impacted, but there are also going to be a lot of impacts to people who aren't nearly as advanced in their capabilities.
So what do you think about that?
Ori Keren: 1:01:08
Yeah, I think it didn't start yesterday. Low code, no code, especially in internal IT applications, is getting a lot of momentum. I think AI will accelerate it, because if I have a challenge and I don't know what to write in the Apex trigger inside Salesforce and the custom objects, I will ask ChatGPT and it will tell me the answer. So it will only accelerate these types of things. Just for us, I've seen three cases lately where, across the go to market sales stack, we have people coming and saying, hey, how can we create a better architecture, et cetera, with all the challenges we know are going to come with people working with low code. I even had the VP of finance, who is here, saying, hey, can we get somebody to integrate Zapier with another system, et cetera. It's happening all over the place. I think there are two challenges it will bring. Number one, it will create a lot of shadow IT, and not just, hey, unvetted apps that you can't use; even with apps that you can use, you might use them extensively and give them more access than you want to give them. So that's one problem. And the second problem: unless you put a lot of energy into it and treat it almost like regular development, it will create major technical debt. I can see companies that have broken Salesforce architecture, and the price they pay in the speed of go to market operations or other areas is big. So it is interesting, it is going to accelerate, and the problems are going to get bigger.
Dharmesh Thakker: 1:03:01
In many ways, we've kind of seen this before. If you remember, like 10 years ago, it used to be that any company that was getting started had to first spend 10 million on getting a data center and buying servers. And then cloud came along, and you're like, oh, I can build an application on AWS in a matter of months, for one-tenth the cost, right? And then the focus shifted to software engineering. Now the same thing seems to be happening here. Now it feels like, hey, you don't need software developers to get the first application going. You need a citizen developer who can go build it. A product manager can use a natural language interface and get some prototype of a product they can test with a customer. And we're seeing this happen. We're seeing companies that are five, ten people get their first product out in six months and get to like two million dollars in revenue across a hundred customers.
Ben Lloyd Pearson: 1:03:49
You're like,
Dharmesh Thakker: 1:03:50
wow, this is crazy. And I think what the trend is pointing to is that it's becoming much easier to generate logical blocks of code using AI, automate the pull request merges and downstream delivery of that code, and ultimately test with customers much faster. So this trend is going to continue. And as a result, I think the emphasis of software is going to shift a lot more toward customer delight, you know, the user experience, time to value. It definitely accelerates the pace of innovation, the same way we saw cloud accelerate the pace of company launches 10 years ago.
Ben Lloyd Pearson: 1:04:30
Yeah, and I think one thing that sometimes gets overlooked, that's really critical here, is that when you have people getting involved in the development process who don't have a lot of experience doing it, they aren't familiar at all with the processes a typical development team works with. They may not even know things as simple as, PRs should be smaller because they're easier to review, right? So there's actually a level of normalizing that has to happen. We've seen this with larger companies we've spoken to, where they hire large numbers of developers straight out of college who maybe come from less technical backgrounds, and some of the most basic practices, things developers normalize pretty quickly after maybe just a year or two of experience, these people have almost no awareness of. So, being able to establish norms around what we expect in our code, and making sure that they also use the same tools you use in your development stack. Even if it's low code, it should still be run through your CI/CD system and have a QA process, and all the various typical things that would come along with software development. But you're often dealing with a group of people that really don't have a lot of experience with those things, and I think that has to be accounted for.
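The idea of routing low-code work through the same CI/QA gates, and catching the over-broad access grants Ori mentioned, can be sketched as a small check script. This is a hypothetical illustration, not a real tool or platform API: it assumes a low-code platform can export each app's configuration as JSON with a `permissions` list, and the scope names are invented for the example.

```python
import json

# Hypothetical CI gate for low-code app exports. Assumes the platform
# dumps each app's configuration as JSON with a "permissions" list;
# the scope names below are made up for illustration.
BROAD_SCOPES = {"admin:*", "read:all_records", "write:all_records"}

def audit_low_code_export(export_json: str) -> list[str]:
    """Return warnings for any overly broad access grants in an export."""
    app = json.loads(export_json)
    warnings = []
    for perm in app.get("permissions", []):
        if perm in BROAD_SCOPES:
            warnings.append(
                f"{app.get('name', 'unknown app')}: requests broad scope '{perm}'"
            )
    return warnings

if __name__ == "__main__":
    # Example: a citizen-developer integration asking for blanket write access.
    sample = json.dumps({
        "name": "zapier-finance-sync",
        "permissions": ["read:invoices", "write:all_records"],
    })
    for warning in audit_low_code_export(sample):
        print(warning)
```

A check like this could run as one step in the same pipeline that builds the main product, so low-code changes get at least a minimal review gate instead of living entirely outside engineering's visibility.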
Ori Keren: 1:05:44
Yeah, it's even like, most of the code you develop in Salesforce and in these IT applications isn't even maintained in source control; it's local to the environment. I know there are companies trying to solve this problem, but yeah, the practices and the methodologies are behind.
Ben Lloyd Pearson: 1:06:06
Yeah, so those are our predictions for today. Unfortunately, we don't have time for a Q&A because we're a little behind schedule. But Ori, Dharmesh, I want to thank you for coming in today. If you all can give them a round of applause. Thank you very much.