We're so early — I don't think the learnings are there yet to be able to give really good concrete prescriptive advice.

I think companies are going to have to learn some of that as they go. What are the problems? Are there certain types of code, is it backend that does better with Gen AI right now, as opposed to React or Angular, which might be a little more framework-based and less language-based?

I don't have anything very prescriptive other than I would start as a normal engineer, start small, build upon that, go slow, you know, walk, crawl, run.

Gen AI for dev teams has been a focal point of conversation for the last few years, but the technology and application are both still very nascent. How can you find the best Gen AI use case for your team, and implement it safely?

This week, our host Dan Lines sits down with Peter McKee, Vice President of Developer Relations and Community at Sonar. They explore the benefits and risks associated with Gen AI, and whether this new tooling is most impactful for junior or senior developers. Regardless of the persona, there needs to be an emphasis on quality control, static code analysis, and new coaching strategies to help manage the influx of new code.

Tune in to hear Dan and Peter offer practical advice for engineering leaders on safely experimenting with and integrating Gen AI tools to enhance productivity without sacrificing quality.

Episode Highlights:

  • 00:33 The ins and outs of being a VP of Developer Relations and Community
  • 04:48 The importance of wisdom and experience when applying Gen AI
  • 08:32 Is there more of a risk for junior developers in this age?
  • 19:51 How tooling can help with the influx of Gen AI code
  • 26:02 The safe ways to roll out Gen AI to developers
  • 29:21 Where to start applying Gen AI for your team

Show Notes:

Peter McKee: 0:00

We're so early. I don't think the learnings are there yet to be able to give really, really good concrete prescriptive advice. I think companies are going to have to learn some of that as they go. What are the problems? How does upping the amount of code and the speed we're trying to move at affect things? Is it working or not working? Are there certain types of code, is it backend that does better with Gen AI right now, as opposed to React or Angular, which might be a little more framework-based and less language-based? I don't have anything very prescriptive other than: I would start as a normal engineer, start small, build upon that, go slow, you know, walk, crawl, run. And as an engineer, I think you need to push back on that with the business. The business, you still need to push, right? But I think we walk through it slowly.

0:46

Developer productivity can make or break whether companies deliver value to their customers. But are you tracking the right metrics that truly make a difference? Understanding the right productivity metrics can be the difference between hitting your goals and falling behind. To highlight how you can adjust your approach to both measure what matters and identify the right corrective strategies, LinearB just released the Engineering Leader's Guide to Accelerating Developer Productivity. Download the guide at the link in the show notes and take the steps you need to improve your organization's developer productivity today.

Dan Lines: 1:20

Hey everyone, what's up, and welcome back to another episode of Dev Interrupted. I'm your host Dan Lines, LinearB co-founder and COO, and today I'm joined by Peter McKee, Vice President of Developer Relations and Community at Sonar. Welcome to the show, Peter.

Peter McKee: 1:41

Yeah. Thanks for having me. Glad to be here. Glad to be here.

Dan Lines: 1:43

We were joking before, I love that title. That must be a really fun job to have, to be a VP of, like... how is that for you?

Peter McKee: 1:53

It's great, it's great. I was a software engineer for 20-some years, 25 years or so. You know, really enjoyed it. I thought I was going to be the guy you put in the dark room, slide pizzas to, you know, leave alone. And I was that early in my career, but, as you can tell, I turned out to be the smiley, talkative engineer. When I was at Docker, they were like, this role might be good for you. And yeah, I love it. I love it. You get to talk about tech, stay involved in it, be hands-on, but I don't have any deadlines for production on the dev team. So, you know, there's that. I really, really enjoy it.

Dan Lines: 2:25

Yeah, you thought you were gonna live off of pizza, Mountain Dew, maybe some Reese's Pieces. I remember that's what I used to eat, a lot of Reese's Pieces, when I was a developer.

Peter McKee: 2:33

Love Reese's. Yeah.

Dan Lines: 2:35

You're the guy that can also talk, smile, be on a pod, relate to people. Who gave you your first shot at it? You said it was Docker?

Peter McKee: 2:44

It was.

Dan Lines: 2:44

That's cool.

Peter McKee: 2:46

Yeah, we had sold off the enterprise business to a company called Mirantis, and we kind of started over and recapitalized. I kind of lucked out, right? Because there was a huge community already; Docker had a massive brand. So I was able to kind of step into that brand. Yeah, it was a great first dev relations role, and I loved it.

Dan Lines: 3:04

That's really great. Yeah, thanks for sharing your background a little bit. I know some of the developers listening might be wondering, how do I get a shot at that? What do I need to do to be like Peter? But today we are going to continue. My co-host Conor and I have been interviewing leaders from across the engineering spectrum; we have this series going on Gen AI, obviously the hot topic, with a variety of viewpoints on different approaches with AI and what it means for developers. It's been really fun so far to get everybody's perspective. So I guess we'll start out, Peter: what's your perspective on how Gen AI is impacting software development processes? You can take it from wherever you want to take it from.

Peter McKee: 3:54

Yeah, man. That's a huge question, right? It's super interesting, super interesting. There are so many ways that it's definitely impacting development, and I think it will just continuously impact it; "impact" might even be a strong word. Let's think three to five years from now: the progression of these LLMs, the speed it's going, is crazy, and I don't think we know exactly what it's going to look like in three to five years. I was thinking about this the other day; take a step back. When I first started developing in the nineties, assembly was still around, C was around; you know, I walked uphill both ways to work. And then C++. But as we went, you abstract and you get higher and higher and higher, and the barrier to entry becomes lower and lower, right? And I think that's what Gen AI is, if we look at it right now as a tool: you're going to start programming at a higher level. Does English become the new programming language? If you think about it, we're already writing English, just in specific keywords, to give commands. It's not that big of a leap to go from general English to something that gets transformed into JavaScript, like everything else, and run in the browser. Who knows, right? But it's very interesting. So it's affecting development; it's affecting the way we think, the way tools are being built. It's going to change not only us but knowledge workers in general. And there's a whole other thing too, right? We thought it was going to be the robots, the autonomous vehicles, but no, it's coming for the lawyers and the software engineers and the doctors. It's pretty interesting. Very, very interesting.

Dan Lines: 5:38

So first of all, I do think it's affecting all industries, right? We're going to talk about software development, but I think you're right: everything's going to have its own copilot, I guess. I'm a lawyer, I have my copilot to help me formulate, I don't know what lawyers do, but help me formulate my deposition or whatever. But I do think everyone's going to get paired up with some type of bot that helps them be more productive. And I think it's pretty fitting that it's starting with developers, the ones that are creating these bots anyway. Do you see any difference between, I don't know, entry-level developers versus more, like, hey, I started developing in the 90s, versus I'm coming out of school now, or I'm in my first few years? Are you seeing any differences with that?

Peter McKee: 6:32

Yeah, definitely, definitely. Someone with 30 years' experience, 20 years, 15 years of experience, right. Let me take a step back; I kind of think of things through this model. So I have seven children; we homeschooled them all. My wife, really; I acted like the principal, right? There's this classical education model called the Trivium. You start out with your language and grammar, your basics, right? Then you learn logic, and it's traditional logic, but it's logic in how solutions work, how solutions work logically to solve a problem. And then above that is rhetoric, where you start to argue for different solutions to problems. And so if you map that onto engineering: entry-level, mid-level, junior, even senior folks can stay in that logical phase, and principals and staff go up into that rhetoric, right? They're working across the organization. So if I apply that to Gen AI: those junior developers really just know the mechanics of writing software. They can't argue at the rhetoric level. And I think to really use Gen AI where it is now, powerfully and quickly, you have to understand your language really well, your environments really well. You have to be able to see what Gen AI produces and know how to shape that into your domain space and the way you construct software. Junior developers don't do that as easily. So have we accelerated the copy-and-paste from Stack Overflow with Gen AI? Kind of just sped it up, right? And I think that's the real risk: you still, at this point, have to understand what the code is doing and how it does it, but you can offload things, right? Like, I can never remember how to read in a file and parse a CSV file. Stack Overflow is gold for me for that.
And Gen AI, boom, boom, boom, can do it in two seconds, and I can read the code very well and make sure it's solid. So I think senior-level folks, and I'm just putting people into two big buckets, are able to reason quickly about the Gen AI code that's been produced. And it's just experience, right? It's wisdom, let's say, not knowledge. The more wisdom you have, the quicker you're able to see the shit code versus the gold code. So those are the two big buckets I think about, and how you should maybe, as a dev manager, think about this.
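Peter's CSV example is a good illustration of the review burden. Here is a minimal sketch, in JavaScript, of the kind of naive parser a Gen AI tool might hand back; the function and data are hypothetical, and an experienced reviewer would immediately notice it doesn't handle quoted fields with embedded commas:

```javascript
// Naive CSV parser of the kind Peter describes offloading to Gen AI.
// It splits on bare commas only; quoted fields with embedded commas
// would break it, which is exactly the edge case a senior reviewer
// is trained by "wisdom" to spot in generated code.
function parseCsv(text) {
  const lines = text.trim().split(/\r?\n/);
  const headers = lines[0].split(',').map((h) => h.trim());
  return lines.slice(1).map((line) => {
    const values = line.split(',').map((v) => v.trim());
    // Pair each header with the value in the same column.
    return Object.fromEntries(headers.map((h, i) => [h, values[i]]));
  });
}

const rows = parseCsv('name,age\nAda,36\nGrace,45');
console.log(rows[0].name); // Ada
```

The point isn't the parser itself; it's that the reader needs enough experience to know where a snippet like this breaks down before trusting it.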

Dan Lines: 8:48

That's interesting. So you gave the example of parsing a file or something like that. Now, if you have the wisdom, you're a developer, you've been around a while, you would say: hey, for parsing a file, that's not where I want to spend my brain power. I'll review the code, I'll make sure that it's quality, but that's not what I'm really trying to do; I'm not writing this parser for its own sake, it's a means to the end of the actual business value I'm trying to achieve. Now, maybe there's some danger there, because if a junior developer is using Copilot, and this is just speculation, but before, at least, I had to go to the internet, search, like, Stack Overflow, do some copy-paste, maybe read what people are saying: hey, this is a good solution, not a good solution. I remember doing all of this, trying to find it. Now, if it's just auto-populating for me, maybe the quality will go down, or maybe there is a little bit more risk. Do you think there is more risk for junior developers?

Peter McKee: 9:50

A hundred percent, a hundred percent. And that's what we worry about at Sonar, right? We're really worried about quality, and we do static code analysis. And for us, with Gen AI now, our big message is: hey, this is even more reason why you should be scanning your code and looking for quality. There's just tons more code being produced. And what I've learned working at Sonar is there are security risks that are very, very hard to spot in your code. There are things in greps and in regular expressions that look good, but the machine is way better at catching them. So yeah, I think there's a big risk; there's always a risk in going faster. Gen AI is producing more code; hey, let's use it, it works, right? But functionality and quality are two different things. You can have something functioning, but it could be shit quality. Excuse my language; it could be poor quality. You know the three little pigs and the wolf, right? How they built their houses of straw, sticks, and brick? Sure, the doors all worked, they had windows in all the houses, but the one that was built on a solid foundation lasted. Sorry for using a kids' story, but it's true: build on quality, and statically analyze your code for quality first, especially if you're using Gen AI, because Gen AI just cannot ensure the quality of what's being produced. That's not what it does.

Dan Lines: 11:06

Yeah, I mean, I agree with you there. My co-founding partner, his name is Ori, he's our CEO at LinearB, and what we're observing is the volume of code being developed is increasing. Now it might be developed by a human, but it's actually also developed by some type of AI. So there's a lot more code now, and, I know you're at Sonar, that's going to put a lot of stress on tools like Sonar. You need more usage, right? I opened up a PR: what's the quality of it? What needs to be reviewed? Is there a vulnerability here? And I also think there is pressure on engineers to move faster. I'm talking with a lot of VPs of engineering; they're coming to LinearB, our company, and they want to measure the impact of generative AI. I'm spending millions of bucks on this: are my juniors faster? What's the impact of it? And so I do think it's maybe double dangerous, because there's that expectation: you've got to move faster, I just bought this thing for you.

Peter McKee: 12:21

Right. Yeah, exactly. It just compounds it. Most business leaders, and probably yourself, right, you think of ROI.

Dan Lines: 12:30

They've got to; that's what they're judged on, right?

Peter McKee: 12:34

Exactly. And coupled with that is how hard it is to quantify how good a dev team is in the first place. It's so hard. You can't look at numbers, lines of code. Maybe it's solid features that the business wants delivered, something like that, but it's not lines of code, and it's not issues fixed. Because what will happen if we go with those? We'll become sales folks, right? Sorry, failed sales folks, and we'll game them. Sure, you measure my performance on the number of defects I fix, or even the number of features I develop, and I'm grabbing the smallest features I can, the easiest ones to do. Or I'm going to produce a bunch of defects and then fix them and look like a hero, right? So even that's an issue. And now we're throwing more code into the mix and saying go faster, faster, faster. In some sense we do need these abstractions, so we can go quicker and still bring more value to companies, but at what cost?

Dan Lines: 13:33

It's usually always like that in software engineering, it's the balance, right? You want to go fast, but you've got to have the quality. You want to get out to production as fast as possible, but if there's a bug found in production, it costs you like 10x more, so you don't want that. I think that's kind of what everyone's dealing with, and then you've got the security issues on top of it. Yeah, really interesting times. Have you seen anything around coaching developers on using some of these generative tools? Like, is any organization trying to say, okay, here's how to use them if you're a junior, even just like what we said in this pod? Have you seen any coaching stuff going on?

Peter McKee: 14:11

I haven't. We do some of that, Clean as You Code, learn as you code, but it's really more around vulnerabilities and issues, why something's an issue and why you shouldn't code like that, and that's really good for junior developers. But no, I've seen other companies and YouTube content coming out for the general public around prompt engineering, I'm sure you've seen it, which is powerful, but nothing specific for developers, at least that I've seen. Like, how do you truly integrate these in? What does that workflow look like? What does that pipeline look like? It's very, very interesting. I don't know. Does that plug into some of these DevEx platforms that are coming out?

Dan Lines: 14:48

It certainly does. For us, you know, we'll say: hey, this pull request had, like, Gen AI-modified code in it, or Sonar found something on the PR scan, so set up a rule for multiple reviewers. We're doing that type of stuff with companies. That's kind of putting policy, and I would say guardrails, around it. And then the other thing that we're exploring a bit is: how could we coach engineers earlier on, so you don't have to get to the point where you need all the rules? But yeah, I haven't seen anyone doing that yet.

Peter McKee: 15:29

One thing that popped into my head, I don't know if it's germane to the conversation, but we've been thinking about it: just like everything, it's junk in, junk out, right? And so I think part of the problem, and I'm not sure, is that some of these LLMs, you know, Gen AI, are matching a pattern that's probably not the best pattern for that solution, and a little bit of randomness in there gets it a little bit off. If Sonar is at the front, right, and you're scanning and cleaning the code that you're then feeding into your LLMs, or at least, I forget the term, but you have your LLM and then you're putting your own data into it also, right? If that code is clean, I think that's a great start; you should be starting there, feeding your LLM some good quality code. And it could be specific to your organization: how you do, you know, remote procedure calls. You can do them a handful of different ways, but we do it here at Visa this way, right? And maybe LLMs could then be powerful to help that way, to say: I'm going to generate you code, but I know the pattern that you use. So it's still not quite in the training, but that's one thing we're thinking about at Sonar that would help with some of this, with junior developers, so they can start trusting the answers a little bit better.

Dan Lines: 16:43

Yeah, that makes sense. Have you seen anything around Gen AI affecting quality? Like, are there any data reports or anything that you've seen?

Peter McKee: 16:52

I've written a couple of articles pointing to GitHub, Microsoft, and the like, about the amount of code churn that's happening.

Dan Lines: 17:02

Yep. We see that too. Yep.

Peter McKee: 17:05

The quality is definitely down, right? Developer love, developer joy, is dropping. Because, well, you know what it's like as an engineer: you want to work on the code that you wrote, that's nice and formatted, nice and neat, that you can understand right away.

Dan Lines: 17:24

Yeah, like the art of it.

Peter McKee: 17:26

Yeah, but it's rare, right? You get into a code base that was built 15 years ago, and everything's so bound together, and you work a lot on fixing other people's code. And now if the code's being generated faster, and it's Gen AI, and it's not that great code, now you've got to fix even more code. So developers are getting upset with that, and it's still a little bit of a seller's market, right? Top engineers still have a lot of leeway to move. So if you go somewhere and you're like, this code base is terrible, I'm not getting the help to fix it, I'm going to leave. And so we're seeing developer churn, people leaving places to find that Holy Grail.

Dan Lines: 18:06

That's one of the good things about being a developer: usually you can always get a job, right? So, for the VPs and the managers listening, I think most people know that's why developer experience matters so much. That's why companies like us are measuring it, and that's why companies like yours, I think, are saying: let's keep the quality high. Nobody wants a crappy code base; that sucks. Developers can go wherever they want, but the fact is, most of the VPs I'm talking to are experimenting with Copilot. They are rolling out initiatives, they're testing it, they want to measure the impact. It's here, it's real.

Peter McKee: 18:48

I think you have to.

Dan Lines: 18:49

You've gotta, yeah. You don't want to be left behind. Now, we were talking about the junior developers a little bit, but probably on the flip side of that, maybe if you're intermediate, or, I think you were saying, senior developers, the ones before principal, we have been seeing increased throughput for sure.

Peter McKee: 19:08

I mean, like I said, I take notes, right? Imagine, every day I have my Sublime and I do markdown in there, and I have some little goofy markdown for tasks. And day to day, I'd go across each one of those files, pull the tasks up, and put them into another file. Don't ask why it's this weird process, but I'm like, well, I'm an engineer, I should write a script to do this, right? And I'm like, I'm going to write it in C. Why? Because I haven't written C in a while. Threw that out very quickly, because there are no strings, and went back to JavaScript. So I jumped over to ChatGPT, and I was using, I think, 3.5, maybe even something before that, and just started firing off questions. And I hesitate to say 80 percent; it got me probably more like 65, 70-ish percent of the way there. It would start to break down as I got into a little bit of refactoring; it would add modules that weren't compatible with other modules. My point is, I'm almost 30 years into software engineering, heavily in JavaScript since like 2003, so I know it really, really well. It didn't matter to me; I was able to take that last 40, 30, 20 percent and code it really easily. So it was very powerful for me; it helped tremendously, because I could see what it was doing. I knew what it was doing, I could fix the versions real quick, I could refactor really quick: yeah, you shouldn't really use that method, this is a cleaner way to do that.
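For context, the task roll-up Peter describes might look something like the sketch below. He only says he uses "goofy" custom markdown, so the checkbox marker syntax and the file names here are assumptions for illustration:

```javascript
// Sketch of a daily-notes task roll-up: scan markdown note files for
// unchecked task lines ("- [ ] ...") and collect them into one list.
// notesByFile maps a file name to its markdown text, so the sketch
// stays self-contained without touching the filesystem.
function collectOpenTasks(notesByFile) {
  const tasks = [];
  for (const [file, text] of Object.entries(notesByFile)) {
    for (const line of text.split('\n')) {
      // Match "- [ ] some task", capturing the task description.
      const m = line.match(/^\s*-\s\[\s\]\s+(.*)$/);
      if (m) tasks.push({ file, task: m[1] });
    }
  }
  return tasks;
}

const notes = {
  'mon.md': '# Monday\n- [ ] email Dan\n- [x] ship build',
  'tue.md': '- [ ] review PR',
};
console.log(collectOpenTasks(notes));
```

This is the shape of script where a 65-70 percent Gen AI draft plus an experienced reviewer's cleanup pays off quickly.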

Dan Lines: 20:34

But you can clean it up. It's like the 80/20 rule, or the 60/40; you can do that last 40, 30, 20 percent. At the end of the day, what you're saying is you're moving faster, which means more code is going to get pushed at a faster rate in general, let's say. If we're in agreement that more code in general is going to be pushed faster throughout the planet, do you have any thoughts on processes or infrastructure to accommodate that load? Have you thought about that at all?

Peter McKee: 21:12

Other than AI monitoring AI?

Dan Lines: 21:13

I mean, that is a thing. We're working with some customers where, let's say, Copilot has assisted the developer in creating some of the code. They open up a pull request, and on the LinearB side we can see that it was assisted by Copilot. So then what they're also doing is experimenting with some of the AI review. Right.

Peter McKee: 21:40

Right.

Dan Lines: 21:40

What we see them doing is then writing some rules, some automation, to say: okay, if there is this much code that's, like, Gen AI-orchestrated, or if we have a reviewer that's an AI bot, I do want to still make sure there's a human there too. I see a lot of that happening. Again, that's kind of streamlining, I would call it, end to end on the SDLC, because the load is moving to that PR, basically.
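The kind of guardrail rule Dan describes could be sketched like this. The PR shape and field names are hypothetical, not LinearB's actual API; the point is the policy: AI-touched code, or a bot-only review, requires a human sign-off.

```javascript
// Sketch of a merge guardrail: block the merge when a PR contains
// AI-assisted code (or its only reviewer is a bot) and no human has
// reviewed it yet. The pr object's shape is a hypothetical example.
function needsHumanReview(pr) {
  const humanReviews = pr.reviewers.filter((r) => !r.isBot).length;
  const aiInvolved = pr.aiAssisted || pr.reviewers.some((r) => r.isBot);
  // Require at least one human reviewer whenever AI touched the PR.
  return aiInvolved && humanReviews === 0;
}

const pr = { aiAssisted: true, reviewers: [{ name: 'review-bot', isBot: true }] };
console.log(needsHumanReview(pr)); // true, so block merge until a human signs off
```

A real implementation would hang this predicate off the code host's PR webhook or a workflow-automation tool rather than run it by hand.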

Peter McKee: 22:10

Yeah. I think, you know, AI is really good at pattern matching and predicting, and we might not use these large LLMs for code reviews; maybe those become specialized tools that are really tweaked, with thresholds. But yeah, I think that's where we're going to go. To answer what you should do now, and I'm extremely biased, so I'll talk in terms of the solution and not so much Sonar, but Sonar has a great product: static code analysis. You should be statically analyzing your code. AI, I think, will eventually get there and will help augment static analysis, but right now, handwritten parsers and static analyzers are written by PhD students... not even students, graduates, doctors, right? Really, really good. So I think quality first: you've got to be checking your quality. And I think that's the bar we need to get to, like where we are with unit testing and functional testing. No good, professional software engineering team or DevOps team will ship code without any unit tests; you just would not do that, right? And we don't do that with quality. Why aren't we doing that with quality? We should be. I think DevOps personnel should be going: great, your unit tests don't turn green, you're not going. If your quality gate isn't passed, it's not going, right? I think we need to get there, and I think that'll help with Gen AI and all this code being produced. And then, how do we look at AI in general? How can it help streamline the processes? How can it help with code reviews? To your point, how can it help take a junior developer to a senior developer faster? I think there's huge power there.
And you were hinting at and talking about this earlier; maybe you and I should start a second company and do that, right?
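Peter's "if your quality gate isn't passed, it's not going" can be sketched as a simple CI decision. The gate-status shape below is an assumption for illustration, not Sonar's actual API response; in practice a CI step would fetch this status from the analysis server and fail the build on a non-passing gate:

```javascript
// Sketch of a CI quality-gate check: ship only when the overall gate
// and every individual condition passed. The status object's shape
// is a hypothetical stand-in for a real analysis server's response.
function shouldShip(gateStatus) {
  // Treat anything other than an explicit OK as a failure.
  return (
    gateStatus.status === 'OK' &&
    gateStatus.conditions.every((c) => c.status === 'OK')
  );
}

const passing = { status: 'OK', conditions: [{ metric: 'coverage', status: 'OK' }] };
const failing = { status: 'ERROR', conditions: [{ metric: 'new_bugs', status: 'ERROR' }] };
console.log(shouldShip(passing)); // true
console.log(shouldShip(failing)); // false
```

Wiring this into the pipeline the same way a red unit test blocks a merge is exactly the parity with testing that Peter is arguing for.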

Dan Lines: 24:02

Cut the time in half from junior developer to senior developer with our new AI software.

Peter McKee: 24:09

right.

Dan Lines: 24:10

It's honestly a compelling thing. Again, the people putting up millions of dollars in budget, your VPEs, your CTOs, they report to the CEO; they are on the hook to show this investment moved projects along faster and gave more ROI to the business. So it's actually not even that much of a joke. I bet they'd go for it, you know?

Peter McKee: 24:35

Yeah, a hundred percent. This might be contrarian, and I know we're talking to the developer tech crowd, maybe, but businesses look at us as a liability, right? We're a cost center, usually. Putting code into production is a security risk; code is a risk, right? And the more and more of it you have, the harder it is to maintain, the technical debt gets worse, it becomes more expensive. Software engineers are expensive. So whatever we can do to lower that risk for the business is what I think an engineering and development team should look to do, because then that brings even more value to the business, and they love you and keep paying you to do what you do. So I think a lot of tools are going to focus on that. And we can't, as engineers, hold onto our domain too tightly. Like: I have to write code, that's what I do. And we do so much more than just typing in code, but you've got to let some of that go. It's just a step up in the abstractions; we're just moving up higher. You know, I would never come to you, Dan, and say: hey, at LinearB you should absolutely go to assembly language, it's the future. We'd never do that; it's always higher and higher and higher. So you can't beat them, you've got to join the robots. I think we all just accept it, but, you know, think deeply about how we bring these tools in and how we protect our code bases. We can't let them become more of a liability by just going faster and pushing more code into them.

Dan Lines: 26:02

Yeah, couldn't agree with you more. One of the things I wanted to ask you: there are certainly different types of defects. I'm not an expert in Sonar, but I do know there are different types of defects, different severities. Let's say some of the defects are security incidents, but other ones might just be bugs; it's not a security thing, it is a bug. Is there any data around teams that are using Copilot or another Gen AI solution, on the types of issues that are surfaced more than others? Is it security stuff, or is it smaller stuff? Is there anything around that yet?

Peter McKee: 26:41

You know what? I'm not sure. I'm not sure. I haven't seen anything.

Dan Lines: 26:43

That would be cool to see. Yeah.

Peter McKee: 26:46

It's definitely affecting it, but what is the effect, right? Because I can probably infer from what you're saying: if it's all security, we've got a major issue. If it's just that you're missing your tabs and formatting, then it's a different story.

Dan Lines: 26:59

Yeah. I mean, there's a wide range of severity, right? I'm thinking, if I'm an engineering manager or a VPE or a CTO listening to this, and I know I need to start dabbling in some Copilot stuff — or I've maybe started to put a little bit of money in — but now I'm listening to this and saying, oh, I don't know. I don't want anyone to be afraid; I think everyone actually should go and do this. But is there anything around the safe way to roll this out, or to start experimenting, so that you're not just throwing all of your developers into a situation that could get the company into trouble a bit? You know what I mean?

Peter McKee: 27:44

I think you need to look at your policies and your data retention — start there and make sure that's all in place. Make sure you're not leaking any IP, or unintentionally bringing in IP that you don't want to; I think that's an issue too, and it's a little hard to tell. Get all that sorted — cross your T's and dot your I's. And then, like anything, start a POC. Start small, maybe on a line-of-business application that's not public facing. If you're a large organization, maybe start with customer success — they might have some homegrown ticketing system — start there with a small project, and be intentional about it. I don't think it's smart to just say, okay, let's open it up to these 5,000 developers at, you know, Visa or whatever. Start small, do a little POC, build off of that. Because we're so early — I don't think the learnings are there yet to be able to give really good concrete prescriptive advice. I think companies are going to have to learn some of that as they go. What are the problems? How does upping the amount of code, and the speed we're trying to move at, affect things? Is it working or not? Are there certain types of code — is it backend that does better with Gen AI right now, as opposed to React or Angular, which might be a little more framework based and less language based? I don't have anything very prescriptive other than: I would start as a normal engineer, start small, build upon that, go slow — you know, walk, crawl, run. And as an engineer, I think you need to push back on the business a bit. The business is still going to push, and I think you should push too, but walk through it slowly.

Dan Lines: 29:32

It's good advice. And I think between us we can get to something more concrete, because you're thinking about it the right way. The first piece of advice is: use your engineering instinct. Okay, what do I do with my engineering instinct? I start small, with a control. What I know is what other VPs of engineering are doing. First of all, they're starting with a few groups of engineers, not rolling it out to everybody — that's one thing. And they're measuring. The idea is, if I start with group A and give everyone licenses, make sure you have measurement in place — your bug rate, your change failure rate, your cycle time, your PR size — so you can do the comparison. And I'm sure on the Sonar side you can see more hits or fewer hits, whatever it is. So now you have some data in place. That's one tip, and it's basically what you're saying — I do see VPs doing that. But the other thing you and I talked about: which groups are you going to pick? Is it a frontend group? Is it a more junior group? You have geography, you have skill level, you have area of code, right?
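The pilot-versus-control comparison Dan describes can be sketched in a few lines. The per-PR records and field names below are hypothetical stand-ins — real data would come from your Git provider and deployment pipeline, not be hardcoded:

```python
from statistics import mean

# Hypothetical per-PR records for the pilot group (Copilot licenses)
# and a control group (no Gen AI tooling).
pilot = [
    {"cycle_hours": 20, "loc_changed": 180, "caused_incident": False},
    {"cycle_hours": 16, "loc_changed": 240, "caused_incident": True},
    {"cycle_hours": 12, "loc_changed": 150, "caused_incident": False},
]
control = [
    {"cycle_hours": 30, "loc_changed": 120, "caused_incident": False},
    {"cycle_hours": 28, "loc_changed": 90,  "caused_incident": False},
]

def summarize(prs):
    """Roll up the metrics Dan names: cycle time, PR size, change failure rate."""
    return {
        "avg_cycle_hours": mean(p["cycle_hours"] for p in prs),
        "avg_pr_size": mean(p["loc_changed"] for p in prs),
        "change_failure_rate": sum(p["caused_incident"] for p in prs) / len(prs),
    }

print("pilot:  ", summarize(pilot))
print("control:", summarize(control))
```

With those two summaries side by side, the experiment has a baseline: faster cycle time in the pilot group means little if its change failure rate also jumped.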

Peter McKee: 30:47

Yeah, it's interesting, right?

Dan Lines: 30:48

So I think that's pretty good advice: intentionally pick who is going to try it, right?

Peter McKee: 30:57

Yeah. Who would Gen AI benefit the most?

Dan Lines: 30:59

right.

Peter McKee: 30:59

That's a hard question, right? Because I could see saying, let's take a more junior team and see if we can up-level them through Gen AI and some of these tools. Or do you go the other way — take a more senior team and see how much more our top-performing senior teams can do?

Dan Lines: 31:15

Yeah, I'm not sure it even matters that much. It's more about the intention of the experiment. It's like, hey, I have a theory, and I'm going to measure it. Maybe I have a lot of juniors — a really junior team — and I feel like it can help them. Okay, so deploy it there and measure against the average. Or, hey, I really think my intermediates and seniors are going to crush it, because they have the wisdom and they'll use it the right way, so I'm going to hold off on the juniors. I think it's the intentionality of the experiment. That's what I'm seeing from the more progressive VPs who are pushing forward with it, before dropping millions of dollars on the thousand devs that you have.

Peter McKee: 32:05

No, I love that. As a VP, as an executive at a company, your main goal is to drive massive change in the organization, right? They're looking to make big changes — that's where the VP stands, what they're thinking about, what they're looking to do. And so having that strong technical team underneath them that can temper it is very important. But yeah, I think you need to do those experiments. And I think what you laid out is right — it doesn't really matter which group, but to your point, be intentional.

Dan Lines: 32:41

That's what I see the most progressive VPs doing — the ones I'm working with — so I'm just going to throw that advice out to everyone. And on the ROI side, you're totally right: you do have to show the ROI at the end of the day. But if you go to your CEO and say, this is the intention of my AI program, or my Copilot experiment, I will report and update you on it, and when I feel like we're in a good spot, then I'm going to ask you for all the money — not before. So it's controlled, I guess, in a sense. Is there any topic, Peter, that we needed to hit on that we didn't?

Peter McKee: 33:21

No, I think this was a great conversation — I really enjoyed it. We covered a lot. Quality, of course — I'm always going to be preaching quality; I think it's very, very important. Use our open source tools, right? They're free — try them out, use them as a tool, and if they work for you, go forward with it. I think you need to be doing that in tandem in a professional setting where you use Gen AI. It's your safety net a little bit: when you're flipping on the trapeze, something's going to catch you. So that was the one thing I really wanted to talk about. And I just wanted to see where this Gen AI conversation went — super interesting. So thanks for having me.

Dan Lines: 34:01

Yeah, really appreciate you coming on. I'd love to have you on again — and next time, maybe enough time will have passed that we can even get some data on the types of findings, on the quality side. We're all learning what the impact is, so that would be really cool. But yeah, Peter, thank you so much for coming on the pod today. It's been a pleasure.

Peter McKee: 34:26

Yeah, thank you — likewise. Have me back; I'd love to come back.

Dan Lines: 34:29

Awesome. And for you listeners, please subscribe to our Dev Interrupted YouTube channel to watch this episode and tons of other behind-the-scenes content. See you all soon. And Peter, again, thanks for coming on, man.

Peter McKee: 34:44

Thank you. Thanks a lot.