
The CTO must now think like the CFO to survive

By Lake Dai

"In the AI-native SaaS companies... 30 to 50% of operating costs are compute cost. So think about it, if you're engineer manager... your CFO is gonna ask you the question, 'So how much are we gonna spend this quarter?'... Now, engineering leaders have to think from the lens as a CFO."

AI is forcing engineering leaders to become part-CFO, part-governance expert, and part-business strategist. Are you ready for the shift? We're joined by Lake Dai, a globally recognized AI expert, professor at Carnegie Mellon, and founder of Sancus Ventures, to explore the new operating strategies required in an AI-first era. She explains why AI has evolved from a simple tool to a core business metric that leaders are held accountable for on earnings calls. This new reality introduces massive new compute costs—sometimes 30-50% of OpEx—forcing leaders to adopt the financial foresight of a CFO to forecast and justify spending.

 

Beyond the balance sheet, Lake identifies AI governance as the biggest blind spot for most leaders today, outlining the urgent need for an AI handbook to manage unit, system, and ethical risks. This strategic shift also reshapes the engineering org itself, from managing hybrid teams of humans and agents to the need for new training environments, almost like "AI flight simulators." This episode is an essential briefing on these new complexities, all centered on Lake's most urgent advice: in a world moving this fast, the best strategy is to slow down and focus on the fundamentals.

Show Notes

Transcript 

(Disclaimer: may contain unintentionally confusing, inaccurate and/or amusing transcription errors)

[00:00:00] Andrew Zigler: Welcome back to Dev Interrupted. I'm your host, Andrew Zigler, and today we're sitting down with Lake Dai, a globally recognized expert on AI innovation, ethics, and governance. She's an adjunct professor of applied AI and AI governance at Carnegie Mellon University.

[00:00:17] Andrew Zigler: She's advised institutions like the UK Parliament and even the state of California, and is currently the founder and managing partner of Sancus Ventures. Lake's background also includes pivotal roles at Apple, Alibaba, and Yahoo, where she was head of search in China. Lake has been working with machine learning platforms as far back as 2002, and she's been named one of the top 100 women in AI and a top AI boardroom talent.

[00:00:44] Andrew Zigler: And today she's sitting down with us on Dev Interrupted. I'm really excited to have you here, Lake. Thanks for sitting down with us.

[00:00:50] Lake Dai: Thank you for having me.

[00:00:51] Andrew Zigler: Amazing. So let's go ahead and dive into it because we're here at ELC, the Engineering Leadership Conference here in San Francisco, and you gave a talk.

[00:00:58] Andrew Zigler: And this talk was about [00:01:00] industry shifts that are impacting engineering leaders and the new operating strategies that they have to adopt in an AI first world.

[00:01:08] Andrew Zigler: Yeah. Let's dig into that a little bit. What was your talk about?

[00:01:11] Lake Dai: Yeah, so there are a lot of AI trends, right? Particularly in 2025, we see that agentic AI is really starting to pick up. We see compute continuing to be a constraint. And among so many different trends, I think there are a few things we see that are very relevant to engineering leaders.

[00:01:31] Lake Dai: The first thing is that AI right now has become a core operating metric. So what does that mean? Engineering leaders looked at a lot of metrics in the past, but what's interesting is that AI is now an operating metric. So for example, in [00:02:00] S&P 500 earnings calls, 287 out of 500 have quoted AI in their earnings.

[00:01:57] Lake Dai: Right? So what does that mean? It means that AI adoption is of strategic importance. So if those earnings calls have been quoting so much about AI development, what does it mean? It means that as an engineering leader, you have to think about that as well.

[00:02:12] Andrew Zigler: Yeah.

[00:02:13] Lake Dai: So that's the first thing we see: AI becoming an operating metric. And to be very specific, people ask, what does that mean?

[00:02:20] Lake Dai: What kind of metrics are we looking at? So let's say as an engineering leader, in the past you may have looked at response time, scalability metrics, model accuracy, et cetera. But now, how do you evaluate AI adoption, for example? What is your organization using? Are you using an AI chatbot for customer service?

[00:02:41] Lake Dai: How much cost has it reduced

[00:02:44] Andrew Zigler: And

[00:02:45] Lake Dai: how much time has it reduced? Right? So those are operating metrics to measure those AI adoptions. That's why I'm saying we're seeing the switch from engineering metrics to more and more operating metrics. The second thing that's quite [00:03:00] interesting is the compute constraint, right?

[00:03:03] Lake Dai: So people talk about, well, the models are getting smaller, or compute is getting more efficient. But we are actually just at the very beginning of using compute. Compute consumption since GPT launched to today has already increased by a hundred x, and that's just on the language models. Now we're moving to multimodality, world models, robotics.

[00:03:27] Lake Dai: So as you can imagine, compute consumption will continue to increase, right? And the other interesting metric is that AI-native SaaS companies are reporting about 30 to 50% of operating costs are compute costs.

[00:03:45] Andrew Zigler: Wow.

[00:03:47] Lake Dai: So

[00:03:47] Lake Dai: think about it: if you're an engineering manager, you work for an AI startup or an AI growth company, and 30% or even 50% of the costs are actually coming from compute, your CFO is gonna ask you the question, [00:04:00] "So how much are we gonna spend this quarter?"

[00:04:02] Lake Dai: Right? Can you give a prediction? Can we reduce the cost? What if, for example, you cannot get hold of the supply? What will happen to the main business? So now engineering leaders have to think through the lens of a CFO,

[00:04:16] Lake Dai: right? You have to answer those questions. That's why the compute cost is another impact.
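To make that CFO conversation concrete, here is a minimal sketch of a back-of-the-envelope quarterly compute forecast. The prices, request volumes, and growth rate are hypothetical placeholders, not figures from the conversation.

```python
# Hypothetical quarterly compute-spend forecast (all numbers are illustrative).
PRICE_PER_1K_INPUT_TOKENS = 0.003   # assumed $/1K input tokens
PRICE_PER_1K_OUTPUT_TOKENS = 0.015  # assumed $/1K output tokens

def quarterly_compute_spend(requests_per_day: float, in_tokens: int, out_tokens: int,
                            growth_per_month: float = 0.10) -> float:
    """Project spend over a quarter with simple month-over-month traffic growth."""
    cost_per_request = (in_tokens / 1000) * PRICE_PER_1K_INPUT_TOKENS \
                     + (out_tokens / 1000) * PRICE_PER_1K_OUTPUT_TOKENS
    total, daily = 0.0, requests_per_day
    for _ in range(3):                      # three months in the quarter
        total += daily * 30 * cost_per_request
        daily *= 1 + growth_per_month
    return total

# Example: 50K requests/day, 2K input + 500 output tokens per request.
print(f"${quarterly_compute_spend(50_000, 2_000, 500):,.0f}")
```

Even a rough model like this lets an engineering leader answer "how much will we spend this quarter?" with stated assumptions the CFO can challenge.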

[00:04:21] Lake Dai: And then the third one, and I'm sure a lot of your guests have already talked about it, is uplifting your engineering organizations. That means that as we continue to build more AI-native infrastructure, we're switching from what in the past was maybe more of a focus on user interfaces, or applications, or infra.

[00:04:42] Lake Dai: But now we'll switch more to what you could call agentic-AI-friendly or agent infrastructures, meaning orchestration, API layers. So you have to think about those new architectures. You have to think about skillsets.

[00:04:57] Lake Dai: A lot of organizations talk about how, [00:05:00] with Cursor and vibe coding or copilots, you do not need as many junior developers anymore.

[00:05:06] Lake Dai: So what about the junior developers you already hired? What about the senior developers who now need to upskill to understand the new technology and new infrastructure?

[00:05:16] Lake Dai: So

[00:05:17] Lake Dai: what do you do? And maybe you are even managing a team of humans and agents simultaneously,

[00:05:24] Andrew Zigler: Yeah. I think you end up in this world where you have these hybridized teams where part of the team is a person and part of the team is a compute layer.

[00:05:30] Lake Dai: I know. And then imagine how much more complex that will be when your vendors also have that infrastructure.

[00:05:35] Andrew Zigler: Of course, and then you're interacting with other versions of that ecosystem within their own organization.

[00:05:40] Andrew Zigler: You're kind of blind to it. You have to trust that what goes in is gonna get sorted correctly and used correctly.

[00:05:46] Lake Dai: We're actually talking about a matrix now, right? In the past, we talked about collaboration, internal and external. That's it. But now, internal and external, you talk about human to human, human to AI, and AI to [00:06:00] AI, right? Those are the three relationships I was talking about. In the future, we'll see more of that evolving, starting with the human-to-AI relationship.

[00:06:08] Lake Dai: In the past we thought about humans training AI, but now we're talking about AI copilots. And as we have more and more, for example, students or junior developers who need training, maybe we need a simulated environment to train the junior developers to become senior developers.

[00:06:26] Andrew Zigler: Oh, that's fascinating. Creating this safe space for them to figure out how to work with these tools. Almost like a flight simulator for a pilot before they start

[00:06:33] Andrew Zigler: flying. Did you get enough hours in the simulator first, before you were flying a real plane with real people on it? The idea of applying the same thing

[00:06:41] Andrew Zigler: to engineering is fascinating, because the thing about AI is that it's so powerful. It's a force multiplier, you know, for both good and bad, and you have to have really good process and really good alignment within your org, or it can really go awry. Right?

[00:06:54] Lake Dai: Right, right. I actually think an AI-simulated training environment is one of the keys [00:07:00] to upskilling your organization. So that's the human-to-AI relationship. Now let's talk about the AI-to-AI relationship. That's also fascinating. We're just talking about my agent working with your agent, but my agent can also have sub-agents, right? So this is a whole... remember that game where you have 10 people line up, you say a sentence, and it goes to the end of the

[00:07:23] Lake Dai: line

[00:07:23] Andrew Zigler: a game of telephone

[00:07:24] Andrew Zigler: and it changes, right? Yeah,

[00:07:25] Lake Dai: So, okay, now imagine: how we actually measure the outcome of a model is that we provide a training data set, and we do the evaluation of the outcome.

[00:07:38] Lake Dai: That's okay. But then they can start to produce sub-agents on demand, and their outcome evaluations continue to be passed down. How do you know they will continue to align with the initial intention? Right? And then you have vendors' agents, and then you [00:08:00] have, you know, we're talking about collaboration, but there's also the opposite; for example, if you have AI criminals attacking

[00:08:07] Lake Dai: your system, you need AI police to protect you, right?

[00:08:10] Andrew Zigler: Right?

[00:08:11] Lake Dai: So there's also the opposite position,

[00:08:13] Andrew Zigler: Mm.

[00:08:13] Lake Dai: so it's quite interesting.
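Circling back to the sub-agent question above, here is a toy illustration of one way to keep delegations anchored to the original intention: every hand-off carries the original intent, and each result is checked against it before it is passed further down the chain. The intent spec and the keyword-overlap scoring are deliberately naive and hypothetical.

```python
# Toy check that a sub-agent's output still matches the original intent.
# The intent spec and scoring method are hypothetical, purely for illustration.
ORIGINAL_INTENT = {
    "goal": "summarize Q3 compute spend",
    "must_mention": ["compute", "Q3", "cost"],
}

def intent_score(output: str, intent: dict) -> float:
    """Fraction of required terms the output actually covers."""
    text = output.lower()
    hits = sum(term.lower() in text for term in intent["must_mention"])
    return hits / len(intent["must_mention"])

def accept(output: str, intent: dict, threshold: float = 0.67) -> bool:
    """Gate the hand-off: below threshold, flag for review instead of passing it on."""
    return intent_score(output, intent) >= threshold

# A sub-agent several levels deep returns something plausible but off-intent:
subagent_output = "Here is a summary of our hiring plan for next year."
print(accept(subagent_output, ORIGINAL_INTENT))  # False -> escalate to a human
```

In practice you would use richer evaluations, but the shape is the same: the further the chain of agents extends, the more valuable an explicit, propagated statement of intent becomes.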

[00:08:15] Andrew Zigler: using AI defensively within

[00:08:17] Lake Dai: Yeah. We've seen... I know you mentioned that we also run a VC firm; we invest in early-stage AI, infra, and native apps. So we're seeing a lot of innovation now, even in the cybersecurity space, using reinforcement learning, where you have a red team and a blue team training against each other to get much, much better solutions than the current human-dominant ones.

[00:08:47] Andrew Zigler: Right. They can work together in a way where they better understand each other, you know?

[00:08:50] Lake Dai: Right, right. So this is getting similar to... basically you have generative models and you have discriminative models. One [00:09:00] side keeps creating new solutions. The other side keeps saying, this is right, this is wrong. And after running reinforcement learning for many rounds, both models become so much better.

[00:09:12] Lake Dai: Yeah. So that's a sparring partner; you make it spar with itself.

[00:09:16] Lake Dai: So what's happening in the cybersecurity space is that you do want to have a sparring partner here. Then you can be so much better.

[00:09:22] Andrew Zigler: Something that you mentioned: engineers using agents, and those agents have sub-agents. This is a world I was recently in. I recently did a hackathon with Block open source, and they have Goose, you know, their tool for vibe coding. And we had a vibe coding hackathon, and I was one of the participants, and I had a partner, so for two parts of the hackathon we were building entirely with AI and AI agents.

[00:09:43] Andrew Zigler: And the point was, they use sub-agents as part of that process. And it was so hard in that hackathon environment for me and my partner to get our agentic teams aligned with each other on building the same thing within a really limited timeframe. It was actually really hard. And at the end, you know, [00:10:00] they asked me, what would you have done differently now that you've done it?

[00:10:03] Andrew Zigler: It was a four-hour hackathon. Like, what would you have done differently? And I said, you know, I would've used all four hours to plan. I would've just used all four hours to talk about what we wanted to do, because that ultimately was the missing piece: building that intent of what we were trying to accomplish before picking up the tools, before assembling the teams.

[00:10:21] Andrew Zigler: Because like you said, once you start going down the layers, you start to lose context. You're a little blind to what's happening. So it's really important to have that top down alignment.

[00:10:30] Lake Dai: Yeah, so this is actually quite interesting because it's related to my early work in search. Because in a search engine, remember, in the early days, whether on websites or mobile, you're putting in search keywords. The hardest part of a search engine, unlike what people think, is not crawling all the information; it's actually what we call search intent.

[00:10:51] Lake Dai: Meaning that people have explicit intent or implicit intent, and intent mining is so difficult because quite [00:11:00] often you say A, you meant B, you actually need C, and then we give you D.

[00:11:05] Andrew Zigler: Exactly. I love that. That's a really great point, because especially in a world like search, you are kind of guessing ahead of the user, and in a way that's where you start to build an experience that's delightful and you get that widespread adoption. And it's why things like search are fundamental building blocks of our world today.

[00:11:22] Andrew Zigler: Right. And I wanna also dig into your discussion of operating metrics, about understanding that cost and being able to communicate between your financial leaders and your engineering leaders, and even doing forecasting on how you're using these tools. I think that's really fascinating, and a problem that a lot of engineering leaders are still grappling with and figuring out.

[00:11:43] Andrew Zigler: You know, we talk about that a lot on Dev Interrupted, and at LinearB as well. Our whole frame is about understanding those operating metrics and really boiling it down so that you get the impact of what it is that you're getting across the finish line.

[00:11:57] Andrew Zigler: Like are you shipping more code? But is that code better? [00:12:00] Is it

[00:12:00] Lake Dai: Yeah. Yeah.

[00:12:01] Andrew Zigler: when it hits production? Are things failing? Actually having that full picture is a really important part of understanding operating costs. 'Cause if you have a lot of tools and they get introduced, and then maybe a few months later you have a really bad outage or something happens, it's because you got this proliferation of bad code that maybe you didn't see.

[00:12:18] Andrew Zigler: You know? So I think it's a really interesting problem space. And I'm curious, what advice do you give to engineering leaders to be really confident going into that financial conversation around the compute that their org is using, and justifying it?

[00:12:34] Lake Dai: This is actually a great question because we actually literally talk about mindset shifting, right? So,

[00:12:41] Lake Dai: I think at my earlier talk, the audience was asking, can you give me, basically, a recommendation? What are the operating metrics that we should be looking at? So interestingly, it's really case by case.

[00:12:55] Lake Dai: Let's say you are an e-commerce company and you [00:13:00] apply AI in your internal processes. As the CEO of the e-commerce company, I'll say, well, let's say, Andrew, you adopted AI. So let me know: using this specific technology, whatever technology you're using, how much have you lifted

[00:13:18] Lake Dai: conversion? Because e-commerce runs on conversion, right? So let's say we also need to purchase a lot of ads for user acquisition. How much more efficient has that become?

[00:13:31] Andrew Zigler: right?

[00:13:31] Lake Dai: So those are very specific questions related to user acquisition and also customer conversion, or, you know, upsell percentage.
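As a rough illustration of turning that into numbers, here is a before/after comparison for an e-commerce funnel. All figures are hypothetical placeholders, not data from the episode.

```python
# Illustrative before/after comparison for AI adoption in an e-commerce funnel.
# Every number below is a made-up placeholder, not a figure from the episode.

def pct_change(before: float, after: float) -> float:
    """Relative change, e.g. 0.12 means +12%."""
    return (after - before) / before

baseline = {"conversion_rate": 0.021, "cost_per_acquisition": 38.0}  # pre-AI quarter
with_ai  = {"conversion_rate": 0.026, "cost_per_acquisition": 31.0}  # post-AI quarter

conversion_lift = pct_change(baseline["conversion_rate"], with_ai["conversion_rate"])
cac_reduction = -pct_change(baseline["cost_per_acquisition"], with_ai["cost_per_acquisition"])

print(f"Conversion lift: {conversion_lift:+.1%}")  # e.g. +23.8%
print(f"CAC reduction:   {cac_reduction:+.1%}")    # e.g. +18.4%
```

The point is the framing, not the formula: the engineering leader reports lift in the business metric the CEO already cares about, rather than model- or tooling-level statistics.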

[00:13:40] Andrew Zigler: It's about not being blind to all of those little multipliers that are across your organization where you're tweaking and making it more efficient. 'cause that all boils down to that metric that you're trying to

[00:13:49] Lake Dai: Yeah. So, put another way, if I'm a CEO thinking that way, then I'll ask my engineering leaders: I don't care about which model you are using,

[00:13:58] Andrew Zigler: Right.

[00:13:59] Lake Dai: nor do I [00:14:00] care how fast you go,

[00:14:01] Andrew Zigler: or how many lines you made.

[00:14:04] Lake Dai: so what's the impact?

[00:14:05] Andrew Zigler: Exactly. Yeah. What did it do? Did it ship something? Did it make something safer? Did it save a lot of time in a really eye-opening way? Like, oh, I didn't even know we could do it this way before.

[00:14:16] Andrew Zigler: Those are the real force multipliers, where you find that hidden unlock. And a lot of times when I sit down and talk with an engineering leader about where the biggest unlock in their org was as people started experimenting with AI, a lot of times it comes from the non-engineers, the folks who've never picked up those tools before.

[00:14:32] Andrew Zigler: Someone in finance figuring out how to pick it up in a workflow and build something out to make a cost calculation way more efficient or way more accurate. And when we look at those at an aggregate level, that really compounds and adds up. And that's the impact of the non-technologist in the conversation.

[00:14:48] Andrew Zigler: Right. Which is really profound and interesting, because I think we're entering a world where everyone's gonna become a native technologist, and we're all getting closer to expressing our [00:15:00] intent in words and then getting machine computation output, getting algorithmic things. We've talked with guests here about personalized software, idiosyncratic things that you spin up and throw away, stuff that you would never have built in a world before AI.

[00:15:13] Andrew Zigler: But now it's actually easier to build it with AI than it is to do anything else. So I'm curious, as an educator, how are you thinking about that, and how are you equipping the future?

[00:15:23] Lake Dai: Oh, there's so much to talk about. A couple of things you said are quite interesting: when everyone starts building, who's creating value, and who's capturing the value, right? And for education, now I'll put my professor hat on. When I think about education, particularly higher education, we're really looking at providing students three things. The first is knowledge transfer.

[00:15:50] Lake Dai: I would say 80% or 90% of what the university does is knowledge transfer.

[00:15:53] Andrew Zigler: Yes.

[00:15:54] Lake Dai: And then there is the branding of a university, like the Stanford, Yale, you know, [00:16:00] Harvard brand association. And there's the whole network of friends you went to college with, who will go to your wedding, go to all the things, like, for your

[00:16:07] Andrew Zigler: In your company later on, like all those friends that you make and you stick with.

[00:16:11] Lake Dai: The biggest shift is that now, for that 70, 80% knowledge transfer, the university may not be the primary source anymore, because you can get it elsewhere. You can just talk to ChatGPT, you can go to Coursera; there are online courses, and universities provide those as well. So knowledge is not hard to find. So what is the purpose of a university? What should it transfer, what kind of skillset should we

[00:16:38] Andrew Zigler: right.

[00:16:38] Lake Dai: And I would say it's asking the right questions. Because even today, you have access to ChatGPT and I have access to ChatGPT, but how come our results are different? Do you know how to ask the right question?

[00:16:52] Andrew Zigler: Yeah, exactly. You have to be in the moment, you have to be plugged in, and you have to have taste and instincts about what's going on around [00:17:00] you. And you're right that you can use those things as building blocks to get that base knowledge, but if you're not curious and don't ask those deeper questions and dig into it more... that's what really makes the knowledge set in so you can build upon it.

[00:17:14] Andrew Zigler: Right. And I think that's an interesting scenario that you've laid out: that for knowledge transfer, maybe the university is not the vehicle anymore. I agree with you. My background as an educator and working within the space has always been around empowering people through career and technical education.

[00:17:32] Andrew Zigler: And a big part of that is just being driven and being constantly curious to learn. Right? And those folks really set themselves apart, because if you are willing to learn and you don't give up, you can pick up new skills and you can pivot in this world that we live in. It really opens a new door for a new type of thinker to have a lot of impact in the world we live in.

[00:17:52] Lake Dai: Yeah, and I know a lot of people talk about education with a little more negative tone. Oh my god, you [00:18:00] know,

[00:18:00] Andrew Zigler: Oh yeah. They're using it to cheat.

[00:18:01] Lake Dai: got a

[00:18:02] Andrew Zigler: No one's gonna write anymore. Yeah, but on the other hand,

[00:18:04] Lake Dai: But on the other hand, I'll give you an example. When vibe coding had just started, I think within a week I had already demoed Lovable in my class.

[00:18:14] Lake Dai: So this is a master of engineering class at Carnegie Mellon University, and I also had a group of high school students join us to audit the class. So when I demoed that, and actually the guest speaker who came was Ted N., who was the CTO of GitHub, and he showed it, I could see the faces of my students saying, oh my God, this is something I used to spend months coding. Now you did it in seven

[00:18:42] Lake Dai: minutes. I could see the look on their faces, but I could also see, on the faces of the high schoolers, like, oh, does that mean I can do X, Y,

[00:18:50] Andrew Zigler: And it does. And what's so exciting is that it pulls them up at a way earlier age to participate in and build the world that we live in.

[00:18:58] Lake Dai: Yeah. So [00:19:00] again, if we go back, if you have the beginner's mind, drop the baggage and go back to your high school mind, you're like, oh my God, the world has completely opened up. So many possibilities, and that's

[00:19:14] Andrew Zigler: Yeah. I'm curious too, from your perspective: you have a very high-level view, and you gave us some insights at the very top about these pervasive conversations within the boardroom about measuring and understanding AI impact.

[00:19:27] Andrew Zigler: I'm curious, from your perspective, what do you think is the biggest blind spot that engineering leaders have right now about it?

[00:19:34] Lake Dai: AI governance. Yeah. So AI governance. We talked about operating metrics, but I think there's another new set of metrics coming that has become so critical to leaders, not only engineering leaders, and that is the governance element. Because we did talk about major trends.

[00:19:54] Lake Dai: One of the trends I didn't talk about is regulatory trends. There are [00:20:00] so many AI policy, compliance, and regulatory requirements. It's very difficult for both sides: the policymakers trying to catch up with the development of AI, and the AI innovators trying to catch up with the compliance

[00:20:15] Andrew Zigler: requirements.

[00:20:16] Andrew Zigler: Yeah. Or build something that's not gonna get regulated to death in a year, or really think ahead about how policy is gonna impact what I'm building.

[00:20:23] Lake Dai: Yeah. So I think it is very important for both sides to have a full conversation, because I think everyone has the best intentions. Right. So as an engineering leader, I think understanding where the trends are going and creating a framework really allows you to continue to innovate while protecting your organization, and also protecting your consumers, customers,

[00:20:48] Andrew Zigler: Yeah.

[00:20:48] Lake Dai: and their partners.

[00:20:49] Lake Dai: That's what's super important. So I'm gonna quote a really good talk from the founder of Bot Auto. [00:21:00] He said it really perfectly: there are three frameworks. The first framework is unit risk, meaning at the coding level. The second is system risk, which means how you think about the engineering development process.

[00:21:15] Lake Dai: The third is the ethical level: how you organizationally and culturally think about this problem. And he said the unit level is really difficult to eliminate, because as with any software, you will have bugs, right? However, systematically, how does your organization look for potential bias in the models, hallucinations, misinformation, disinformation, safety issues, or, for example, potential IP issues? Put a process around it and document it. And that's very important, because it doesn't have to be perfect, but you can start with something. Just like, you know, surgeons have a checklist before surgery: these are the one, two, three, four things you wanna [00:22:00] check before you conduct the surgery. I think that's really essential

[00:22:04] Andrew Zigler: Yeah. Building those policies, those playbooks, those internal frameworks for how you take something that lives in the latent space and make it more deterministic and repeatable and measurable over time.

[00:22:16] Lake Dai: You continue to perfect the list, but having a list is so much better than having no list.

[00:22:22] Andrew Zigler: Absolutely.

[00:22:23] Lake Dai: And then there's the ethical element, which is the company culture: what do you think of this? And with all of this, I think each organization needs to create their own. Each organization is different. You need to have your own, I'll say, handbook. And it's good to have it documented, to protect you and also to support you in continuing to innovate.
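To make the "surgeon's checklist" idea concrete, here is a small sketch of what a pre-release AI review gate could look like. The specific items and the gate logic are hypothetical examples, not a checklist from the episode.

```python
# Hypothetical pre-release AI review checklist, in the spirit of the
# "surgeon's checklist" analogy above; items and gating are illustrative only.
AI_RELEASE_CHECKLIST = [
    "Bias evaluation run on representative user segments",
    "Hallucination / factuality spot-check on sampled outputs",
    "Misinformation and safety filters enabled and tested",
    "IP review: training data and output licensing checked",
    "Fallback behavior defined for when the model is unavailable",
    "Owner and escalation path documented",
]

def release_gate(signed_off: set[str]) -> bool:
    """Block release until every checklist item has been signed off."""
    missing = [item for item in AI_RELEASE_CHECKLIST if item not in signed_off]
    for item in missing:
        print(f"BLOCKED: {item}")
    return not missing

# Example: only the first two items are signed off, so the gate fails.
done = {AI_RELEASE_CHECKLIST[0], AI_RELEASE_CHECKLIST[1]}
print("Release approved" if release_gate(done) else "Release blocked")
```

As she says, the list does not have to be perfect on day one; having any documented, enforced list already beats having none.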

[00:22:47] Andrew Zigler: I'm curious too how you think about this: as a leader, how do you tackle the problem of technical debt in a world with AI? How does it influence how you think about it within an [00:23:00] engineering org?

[00:23:00] Lake Dai: well that's such a big question.

[00:23:03] Andrew Zigler: Right.

[00:23:04] Lake Dai: When you say technical debt, there's a lot to it. So I think you need to make a comparison: is it faster and easier to patch it up, or to just completely use something new? And we see that quite often in infrastructure, 'cause infrastructure takes time to build. The ones who are more successful have larger, more complex infra, and it's actually sometimes harder for them to,

[00:23:29] Lake Dai: I would say, adopt the new technologies.

[00:23:32] Lake Dai: So we're gonna see this whole wave where maybe the tier two is catching up, because they have less.

[00:23:40] Lake Dai: They're like, forget about this, I wanna just start everything anew. I'm not gonna suggest the names, but I know in the infra, you know, cloud space, we can see some tier-two cloud providers now making really good progress. I don't think they think they're tier two, by the way.

[00:23:55] Andrew Zigler: I know what you mean.

[00:23:57] Andrew Zigler: I think that's an interesting way to look [00:24:00] at it too. And I'm curious: we're all in this world where you have this new compute layer that's taking up a huge portion of mindshare and actual, tangible costs within an engineering org. And we're in a world where that access to compute is largely subsidized.

[00:24:19] Andrew Zigler: It's at our fingertips. We're all still figuring out how to use it. I'm curious what your perspective is, especially from the really high-level view that you have, about how AI and access to AI might evolve over time, and how engineering leaders can get ahead of that. There are some that try to be more insular: have their own models, spin up their own compute layer.

[00:24:37] Andrew Zigler: That doesn't necessarily scale. You have others that lean entirely on third-party providers; they get all of it inbound, and that adds risk to their org. So how do you navigate that?

[00:24:47] Lake Dai: Wow. Those are really good questions. Each question could be its own paper.

[00:24:53] Andrew Zigler: We'll spin this up into a paper, and then we'll get it published.

[00:24:56] Lake Dai: So I think there are multiple layers [00:25:00] to answering this question. The first thing is that when we think about what's gonna drive the next generation of changes, I think engineering leaders should pay really close attention to what the fundamental movers are.

[00:25:12] Andrew Zigler: Yeah. Right. So, I did mention a little bit that, for example, the agentic infrastructure is something that takes some time to build, right?

[00:25:22] Lake Dai: So if you are not paying attention to what's happening, it's harder to catch up. I know everyone has a really busy schedule, but in the meantime, carve out some time to watch the most important technical trends that are coming; they may not arrive in the same year.

[00:25:39] Lake Dai: But all these trends start with research, then move to industry and get adopted very, very quickly at scale.

[00:25:48] Lake Dai: So agentic infrastructure is one. And the other thing is, you know, we're moving from pre-training to post-training. We see a lot of [00:26:00] inference now moving to the edge.

[00:26:04] Lake Dai: And that is something very interesting. What does it mean to move to the edge,

[00:26:08] Lake Dai: right? We've talked about edge computing for a

[00:26:09] Andrew Zigler: Yeah. We've sat down with Google DeepMind and dug into Gemma, you know; there are small, on-prem, embedded models, and what does it even mean to pick those up and use them, especially combined with a world like robotics too.

[00:26:22] Lake Dai: Yeah. So the first question I would ask you: why would people think about edge? Why edge? Why can't everything just be, you know, in the cloud?

[00:26:31] Andrew Zigler: Oh, because of the cost. Because of the risk if everything, I guess, is in the cloud. Like, let's say that your entire compute layer relies on making those API calls into Claude Code or into ChatGPT.

[00:26:43] Andrew Zigler: And then, you know, you could be like a frog in a pot of water that's slowly getting warmer and warmer as those prices go up and up, and that fights against the whole operating metrics that you're trying to be super efficient on. So you end up kind of trading one thing for a new [00:27:00] problem. It becomes difficult, I think, for leaders.

[00:27:03] Lake Dai: So cost and dependency, as you mentioned. There are also other elements. For example, privacy reasons, right? Some of the information you do not want to put in the cloud;

[00:27:14] Lake Dai: you wanna put it on the devices, which gives you privacy. And then sometimes it's the response time as well. Sometimes you need to make really quick decisions; you want the models to be running.

[00:27:25] Andrew Zigler: It needs to be at hand, it needs to be right there.

[00:27:27] Lake Dai: So that's another reason. And I'd also say that the foundation models in the future are not only large language models; there's multimodality, and sometimes running them on the devices will be more efficient. And then there's how we look at all the information, or how things connect to each other.

[00:27:43] Lake Dai: For example, let's just take traffic. You don't always have to call the cloud to see where the traffic is, and if you have all the cars in the same region, car-to-car communication could be more efficient. You don't...

[00:27:58] Andrew Zigler: you get these different communication [00:28:00] models.

[00:28:00] Lake Dai: yeah, it has nothing to do with

[00:28:03] Andrew Zigler: You don't gotta deal with those. You don't need to query some big master list of all of it. You can just work with it locally. So it's also about, in your opinion, localized knowledge. Yeah. Localizing and structuring that knowledge so it's close at hand when it needs to be.
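A toy sketch of that edge-versus-cloud routing decision, just to make the privacy, latency, and cost trade-offs concrete. The flags, threshold, and model tiers are hypothetical, not something prescribed in the conversation.

```python
# Toy edge-vs-cloud routing decision; the flags and threshold are illustrative.
from dataclasses import dataclass

@dataclass
class InferenceRequest:
    contains_private_data: bool   # e.g. health or payment details
    latency_budget_ms: int        # how long the caller can wait
    needs_large_model: bool       # reasoning the small on-device model can't handle

def route(req: InferenceRequest) -> str:
    if req.contains_private_data:
        return "edge"    # keep sensitive data on the device
    if req.latency_budget_ms < 100:
        return "edge"    # real-time decisions can't wait on a network round trip
    if req.needs_large_model:
        return "cloud"   # fall back to the hosted model for heavy lifting
    return "edge"        # default to the cheaper local path

print(route(InferenceRequest(False, 50, False)))    # -> edge
print(route(InferenceRequest(False, 2000, True)))   # -> cloud
```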

[00:28:15] Andrew Zigler: Yeah. That's really cool. You know, one of the last things I wanna dig into here is: how do you prepare new leaders to build something in AI, or in engineering, if maybe it's their first time really entering the scene? Do you have strong advice for how engineering leaders can use the opportunities in our market, in our industry today, to really get ahead?

[00:28:40] Andrew Zigler: 'Cause there are a lot of transformative opportunities. Is there good advice that you'd wanna give to a new engineering leader, somebody who's picking up a problem?

[00:28:47] Lake Dai: It's almost too much, too fast right now. Then you have to be like The Matrix: you do the slow-motion move.

[00:28:54] Andrew Zigler: Oh yeah. Lean back with the bullets going by. It feels that way to be an engineering leader right now, [00:29:00] with the way the problems are coming at you and how you dodge 'em.

[00:29:03] Lake Dai: You should slow down, not go faster. Yeah. So first of all, I wanna say something a little unconventional: I can't wait for us to not talk about AI, because we don't talk about the internet anymore. We don't talk about mobile anymore. Nobody comes out to say, I'm building an internet company, or I'm building a mobile company.

[00:29:25] Lake Dai: Right,

[00:29:25] Andrew Zigler: Right. No one says that.

[00:29:26] Lake Dai: Right. Because it's only a technology that facilitates, that supports you in creating something new, right? So now we talk about: we're creating an e-commerce company, we've got this company, we solved this problem. I think, hopefully, I'll see this hype of talking about AI

[00:29:45] Andrew Zigler: Decrease. It needs to cool off. It needs to just embed itself into our world as this new

[00:29:49] Lake Dai: Talk about the real problem, the solution we're trying to provide.

[00:29:52] Andrew Zigler: Yeah. And when it slows down and cools off, it's kind of like the earth's crust: it cools off and then things can finally start growing on it.

[00:29:59] Lake Dai: Yeah. [00:30:00] And seriously, I think you should just slow down. If I wanted to catch every piece of AI news and all the names...

[00:30:09] Andrew Zigler: I try to do that every week right here on Dev Interrupted. They all know I'm trying to keep up with the news. My head is spinning some weeks.

[00:30:14] Lake Dai: Oh, I'll tell you something really funny. So I did this research. It was like, okay, what were the biggest trends being tackled in the media this quarter, and give me the 10 most insightful papers to read.

[00:30:29] Lake Dai: And I just had, you know, the best model run it. And it came back, and like three out of 10 papers were written by ChatGPT. Yeah, yeah. I'm gonna say, I can imagine what it pulled from.

[00:30:44] Lake Dai: Right. So basically, you look at it and it's completely written by ChatGPT, with no new insights.

[00:30:53] Andrew Zigler: Yeah. So

[00:30:54] Lake Dai: So this is interesting: as we retrieve information from [00:31:00] those models, and the models pull the information, and humans generate more content with the assistance of AI, you get into that loop.

[00:31:08] Andrew Zigler: It's a feedback loop. Yeah. Junk in, junk out eventually.

[00:31:12] Lake Dai: Yeah. So that's what I'm saying: trying to catch up with everything is hard.

[00:31:16] Andrew Zigler: Yeah. And go slow. Go slow, people. If you're listening to this, AI is moving real fast, and slow and steady is gonna win this race.

[00:31:24] Lake Dai: Think about the real fundamentals. When you have a few quiet moments, then you can think about what the true fundamentals are, what the trends are that have continued to repeat themselves. We see very clear trends whenever there's a technology

[00:31:39] Lake Dai: making a major shift, in terms of the internet and mobile. Look at those histories, how they repeat themselves.

[00:31:46] Andrew Zigler: you can learn so much

[00:31:47] Lake Dai: I learned so much.

[00:31:48] Andrew Zigler: History is very cyclical. Like when the Macintosh hit the scene and people were creating graphic arts, and you had this revolt from traditional, you know, media artists. And then [00:32:00] even before that, when you had computer-aided design and CAD software.

[00:32:04] Andrew Zigler: And it displaces the work of so many architects, who are then freed up to do higher-level architectural work. Right. And it's a really interesting reaction, but then also a harnessing of that power.

[00:32:14] Lake Dai: And then in those trends you will see something very consistent, which is that whenever there's a new technology, there's so much passion to create applications. Yeah. You're gonna see the first wave of people trying: oh, I can use the technology to do A, B, C, D.

[00:32:31] Lake Dai: But because they didn't have really good infrastructure supporting that, a lot of 'em actually died off,

[00:32:37] Lake Dai: and then everyone is like, okay, let's build the infrastructure for this new technology. Right. And then you're gonna see a wave of building infra. And then you see an overbuild of infra.

[00:32:46] Andrew Zigler: Yeah. It comes in stages

[00:32:47] Lake Dai: But then, because of the overbuilding, the infra cost gets driven down, and all of a sudden the real boom starts.

[00:32:55] Lake Dai: So we're actually seeing that with AI right now, right? Because we saw a little bit of [00:33:00] applications very early, like when ChatGPT first came out, but at that time we didn't really have a full AI-native infrastructure. And then over the last three years, a lot of companies have been building it and improving it.

[00:33:11] Lake Dai: So now we see that building AI apps, in terms of tools, compute calls, everything, has actually become so much easier, and then we see more people building.

[00:33:22] Andrew Zigler: Yeah.

[00:33:23] Lake Dai: So I hope to see the, you know, App Store moment for AI apps.

[00:33:31] Andrew Zigler: You know, Lake, I think we're gonna have to sit down with you again in the future and take another temperature check on how AI is evolving. And maybe by the time we talk it will have cooled off a little bit, people aren't talking about it as much, and we can talk about that next platform, that next-level evolution.

[00:33:44] Andrew Zigler: But I have enjoyed our conversation so much today, and you bring so much amazing insight from your perspective and your role as an educator and an advisor. So I really want to thank you for sitting down with us, and I want to ask if there are final things that you're working on right now, places where [00:34:00] people can go to learn more about Lake and the work that you do.

[00:34:03] Lake Dai: Can I recommend my Substack?

[00:34:05] Andrew Zigler: Yeah, please.

[00:34:06] Lake Dai: So my name's Lake Dai, D-A-I, so I have "AI" in my name. So I created this Substack called Lake D-AI Unbundled. It's a little long; it's not a really good name, now that I think about it. But what's important is that I continue to monitor the trends. And like I said, trend watching is really important for my investing and for my

[00:34:31] Andrew Zigler: right.

[00:34:32] Lake Dai: And so I write maybe once a month, once every other month only when something important

[00:34:37] Andrew Zigler: Right? You write when it matters.

[00:34:39] Lake Dai: I know. So I think this works, 'cause I have too many people asking, what do you think will happen next? And I figured this is actually a really good way to show people: just read the article.

[00:34:48] Andrew Zigler: Amazing. Well, we're gonna plug that into our show notes, make sure folks can go check out your Substack and follow the work that you do. And again, I really want to thank you for sitting down with us on Dev Interrupted. And to those listening, we'll see you [00:35:00] next time.

[00:35:00] Lake Dai: See you next time.
