What if deploying a new capability to an industrial robot arm was as seamless as pushing an update to a web app? This week, Andrew sits down with Brian Gerkey, CTO of Intrinsic and a titan of the open-source robotics community, to discuss how modern AI is finally giving robotics the "brains" to handle the unpredictable physical world. Brian breaks down how to move away from rigid, monolithic automation toward software-defined, modular robotics using tools like ROS (Robot Operating System) and digital twins. Finally, Brian invites developers of all backgrounds to test their skills in Intrinsic's new AI for Industry Challenge by using AI to solve one of the hardest problems in manufacturing: plugging in a tangled cable.
Show Notes
- Brian’s Website: Learn more about Brian's work, research, and history in the open-source robotics community.
- Intrinsic: Explore the robotics software company led by Brian, which provides tools like Flowstate to build and deploy intelligent robotic solutions.
- AI for Industry Challenge: Get the toolkit and register to compete in Intrinsic's new challenge to solve complex dexterous manipulation and cable routing.
- ROS (Robot Operating System): The open-source middleware suite that has become the global standard for robotics software development.
- Open Robotics: The organization Brian co-founded to support the development and distribution of open-source robotics software.
- Google DeepMind: Intrinsic's sister organization, collaborating on foundation models and advanced perception for robotics.
Transcript
(Disclaimer: may contain unintentionally confusing, inaccurate and/or amusing transcription errors)
[00:00:00] Andrew Zigler: Brian, just before we roll into the script: you yourself are an engineer and an engineer-minded person. What is most exciting to you right now on a personal level about how fast all the technology is moving and the things that you're experimenting with outside of Intrinsic? And are you doing anything fun outside of applying it towards robotics?
[00:00:19] Brian Gerkey: Well, I mean, I think I'm mostly focused on robotics, and for me the exciting part is really just seeing that robotics is, you know, I feel like I've been waiting for robotics to have its moment for about 25 years.
[00:00:33] Andrew Zigler: Yeah,
[00:00:34] Brian Gerkey: We've been at this for a long time, mostly as an intellectual pursuit, and now it's starting to become, not only a hot topic, 'cause that wouldn't be quite enough, but it's a hot topic in part for a reason, because it's also becoming really, really useful.
[00:00:48] Brian Gerkey: So that, that to me is just, it's very satisfying.
[00:00:50] Andrew Zigler: Absolutely. It is like transformative to see this technology that has so much potential finally be able to manifest it. Robotics [00:01:00] is something that I've always found personally interesting just in my own interest with technology and experimenting with it, just as a solo dev or even as a teenager.
[00:01:10] Andrew Zigler: Just trying to figure out how a breadboard works is really complex to get into. Um, so it's really amazing now to see how it can become more accessible too. And I'm excited to talk about that today.
[00:01:21] Brian Gerkey: Yep.
[00:01:22] Andrew Zigler: And to everyone listening, welcome back to Dev Interrupted. Today we're looking at the physical AI revolution. For decades, the hardware in robotics was way ahead of the software. We had the bodies, but they lacked the brains to handle the messy, unpredictable world that we all live in. But our guest today has spent his career solving that gap. Brian Gerkey is the CTO of Intrinsic. Many of you know him as a titan of the open source community.
[00:01:51] Andrew Zigler: He was a co-founder and CEO of Open Robotics and one of the primary architects of the Robot Operating System. And now at Intrinsic, which [00:02:00] just officially moved from an Alphabet moonshot to an integrated core unit within Google, he's leading the charge to make robots as easy to program as web apps. And Brian, it's so great to have you here today.
[00:02:12] Brian Gerkey: Well, thank you for inviting me, Andrew. It's great to be here.
[00:02:15] Andrew Zigler: Absolutely. And I want to dive right in to the breakthrough that we're all talking about and experiencing in robotics. You feel it yourself. You said at the top of our conversation that, you know, you feel like robotics is finally having its time in the sun and is gonna be able to make huge strides.
[00:02:31] Andrew Zigler: And you're right at the forefront of that. And a lot of that is made possible by things like artificial intelligence and new developments like LLMs, but applying that to a heavy robot arm is like an entirely different beast to tackle. So what was the missing link from your perspective that's finally made this software driven robotics viable for like more industrial work?
[00:02:54] Brian Gerkey: I think that's a great characterization, 'cause I think the robot hardware has been pretty capable [00:03:00] for quite a long time. Like if you look at the kind of applications we're tackling at Intrinsic with our customers, we're basically using robots of a kind that have been around for a long time.
[00:03:09] Brian Gerkey: They've gotten better, they've gotten lighter, they've gotten smaller, they've become more collaborative robots that might be safe to use without having a cage around them. So there are some improvements along the way. But by and large, it's kind of the same hardware.
[00:03:22] Brian Gerkey: Uh, but historically those robots have only been used in really fixed, rigid environments where, in order to use the robot, you engineered out all the variance so that the robot could operate in almost a blind fashion, and it just would do the same thing again and again. Pick from here, put to there. And that's really useful in those high volume, uh, manufacturing type applications.
[00:03:43] Brian Gerkey: But there's so much more that could be done, and we've been wanting to do that for a long time. I think what has really been the breakthrough is the availability of better software, and that software goes all the way from perception. So how do we understand the world? I've got cameras.
[00:03:59] Brian Gerkey: I've [00:04:00] got lasers. I've got other sensors that are gonna help me see the world. Now I need to make sense of it, and then I need to decide, well, how can I act in that world in a safe way? Can I move my body so that I don't collide with the world? And I go to the place where I want to get the object, and I figure out where it is.
[00:04:17] Brian Gerkey: I figure out how to grasp it. I pick it up, and then I do whatever it is I'm supposed to do with it. Solving all the components of that perception, planning, and control problem took us as a field just decades to get right. Um, I mean, I shouldn't say it's right yet.
[00:04:35] Brian Gerkey: We're still, it's always a work in progress, but we've made a ton of progress, and now it's being accelerated by the advent of what I'll call modern AI. 'Cause I've been at this long enough that AI has been a term, I mean, it's been a term since the fifties. Uh, when I was in school, in undergrad in the nineties, I took an AI class, right?
[00:04:56] Brian Gerkey: And there are lots of techniques that are AI. But now when we say [00:05:00] AI, we mean a specific type of AI, which is these big neural networks. We still use a lot of the older AI in the things that we do, and that's also really good. But these big neural networks have now allowed us to really solve some problems in ways that we didn't previously think were possible.
[00:05:16] Brian Gerkey: And perception is a great example. Think back to: how do I get the robot to do applications where I haven't engineered out all the variance, so it's gonna encounter a different situation from run to run, from job to job? A key way to deal with that is to make it smarter about how it sees the world.
[00:05:35] Brian Gerkey: And that's where the neural networks come in: to handle that camera data and allow the robot to figure out, what is it seeing? Where is the thing that it needs to interact with? Where are the obstacles?
[00:05:45] Andrew Zigler: Absolutely. And so there's multimodality that has to be, uh, you know, tackled with technology, with software. There's also the reality of latency, because when you are a robot operating in the real world and you're ingesting all of this information [00:06:00] from sensors, it immediately goes stale, because you exist in a living, changing world. And so having to tackle all these problems requires compounding a lot of innovation. And that's what, you know, LLMs and AI and neural networks and advancements in this kind of technology have really unlocked for robotics. And it's really amazing to see how this was initially more like an experimental route into what is the future of robotics going to look like?
[00:06:31] Andrew Zigler: And now it's transitioned into something more deeply integrated within Google. And so from your own perspective as a CTO, how does having access now, closer to Google's frontier research and things like Gemini, change what's possible in your mind for where Intrinsic can go with these compounding software problems?
[00:06:52] Brian Gerkey: So we've been collaborating with our colleagues at DeepMind for years now. Um, in fact, if you look at the perception [00:07:00] aspects of the Intrinsic product that's out there on the market, we've taken innovations that originated in DeepMind, and we built on top of them and fine-tuned them and customized them and also, uh, hardened them, which is a topic we can talk about in some detail,
[00:07:16] Brian Gerkey: to put them into production. And so we've been working with them for years now. Now that we are officially part of Google, we can work even better together. Previously we were two separate companies, if even under the same umbrella. And that always adds, uh, a little bit of a barrier between the two teams.
[00:07:33] Brian Gerkey: And so now we've taken that away, and we've got the opportunity to work much more closely together. We're now meeting very regularly with the leadership team over there and figuring out: what's your objective? What's our objective? What's our broader strategy?
[00:07:47] Brian Gerkey: And how do we bring that together in a way that's gonna allow us to both do what we're trying to do here? And I think the future is very bright there.
[00:07:55] Andrew Zigler: Absolutely. I like how you put that: that you can both do better because you're working [00:08:00] together in that way. And I think that becomes really key when, uh, technology companies partner so closely with more foundational lab type environments. Because we've had guests on the show before, like Philipp Schmid from Google DeepMind, who's talked about being plugged into this virtuous cycle, this virtuous loop, where you have the research that informs the development of the baseline technology that informs the innovation on top, and from the innovation on top you learn so much that fuels that
[00:08:29] Andrew Zigler: baseline, uh, that makes everything better. So now that Intrinsic and robotics can be closer to its own virtuous loop, that unlocks, like you said, a lot of upside for everybody. I also like, Brian, how you called out how we think about robotics more traditionally, like maybe an arm or something that does something very standardized. I think about when people think about assembly lines in a factory: imagine if that robot got off by an [00:09:00] inch or something. Now suddenly the whole thing has to shut down, because it's not an intelligent machine as much as it is an automated one. So in this new world where those machines can become more intelligent, you know, how do you characterize that at Intrinsic, and how do you then build for that unknowable world based on these kinds of sensor inputs that you can get now?
[00:09:22] Brian Gerkey: Yeah, I think you've drawn just the right distinction there between it being, you know, what you might call an intelligent robot versus being an automated robot. And by the way, robots are very widely deployed today, right? You can go into lots of factories, like you said, you can go to assembly lines, from automotive to, uh, electronics manufacturing, and find
[00:09:45] Brian Gerkey: the use of a lot of robots. They are by and large, as I think you correctly characterized, automation. They're often very well designed and [00:10:00] built, uh, and in a very high performance way, but they're designed and built to do one specific task. So let's take electronics manufacturing as an example.
[00:10:09] Brian Gerkey: What is common today is: if I'm gonna build, say, a mobile phone, then I'm gonna bring up an assembly line for that phone, and I'm gonna figure out all the steps to put that phone together, and I'm gonna automate as many of them as I can, because that's gonna be more efficient and more productive.
[00:10:26] Brian Gerkey: And I'm gonna build robot cells that are really specific for doing one task for that phone, for the time I'm gonna be making that phone. And when I wanna switch the model of the phone next year, I'm gonna kind of start from scratch. I'm gonna basically take those robots, I'm gonna tear that whole thing down.
[00:10:44] Brian Gerkey: I'm gonna build up a new line and I'm gonna program it. Maybe I reuse some of the hardware, but I'm kind of starting over every time. And that's been the journey of deploying automation kind of since the beginning: you tend to start over. Each new task is kind of like [00:11:00] its own bespoke snowflake, basically.
[00:11:02] Brian Gerkey: The different situation that we think we can provide now is: what if you have the same hardware and it becomes a software-defined resource for you? It's something where what it does is a result of the software that you put into it.
[00:11:22] Brian Gerkey: And that software gives it better capabilities. 'Cause, for example, instead of always reaching to the same place, where if the object it's gonna pick up is off by one inch, as you said, it just, you know, misses, instead it looks over there and says, oh, I see where it is. I'm gonna pick it up wherever it happens to be.
[00:11:39] Brian Gerkey: I have an expectation of approximately where that is, but I'm gonna go pick it up from exactly where I understand it to be. That makes it more capable. Also, then, we can update that software over time, just like you can update software on all the other systems you interact with, from your phone to your laptop to your cloud-based systems. I mean, today your car gets software updates on a regular basis.
[00:11:59] Brian Gerkey: [00:12:00] So if we start to think about these robots as not being single purpose, one-time-programmed bespoke resources, but rather these reusable, reconfigurable, software-defined resources, we can fundamentally change how we think about using robots, even just narrowly in manufacturing. We could talk about other domains too. At Intrinsic,
[00:12:21] Brian Gerkey: we're currently focusing on rolling out in manufacturing, but even just within manufacturing, which is itself a big industry, we think there's a sea change that can be effected there, if we can have this kind of software-defined robotics view.
[00:12:36] Andrew Zigler: Absolutely. And it's tackling the hardware and the physical problem with the same mindset that an engineer is tackling a software problem, using modern software engineering solutions: the idea of modularizing it, breaking up this monolithic thing into smaller component parts. And in doing so, you're actually able to democratize the ability to create the [00:13:00] best version of all of those individual parts.
[00:13:02] Andrew Zigler: Because in the previous world where you're creating this singular monolithic solution, one-off every time, it's the equivalent of just, you know, one-shot prompting an entire repo on your computer and then trying to use it, hoping it works, having it work for that immediate thing, and then throwing it away.
[00:13:19] Andrew Zigler: That's not a very engineering-minded way to build, because there's so much potential to extract from that. And this is what's getting unlocked, like you're calling out: software's allowing us to extract the value, the domain expertise that goes into creating that super specialized phone production line.
[00:13:36] Andrew Zigler: Right.
[00:13:37] Brian Gerkey: I agree with that, and I think you're hinting at something which I think is really important to recognize about robotics as a field: it's very interdisciplinary. An engineering approach to solving a problem is to break it up into pieces.
[00:13:51] Brian Gerkey: And in robotics, you need people to be involved in the endeavor who are expert at mechanical [00:14:00] engineering and electrical engineering on the hardware side, um, as well as, you know, user experience and human-robot interaction at the other end, where you're starting to put the system together with people,
[00:14:10] Brian Gerkey: 'cause the systems are always gonna interact with people, even if you consider them to be autonomous. In the middle, as we think about putting the software together, you need people who are experts at the cloud infrastructure and the data handling and vision and state estimation and planning and control and so on.
[00:14:30] Brian Gerkey: And so you need to have a software framework that allows all those people in, 'cause nobody is an expert in all those things, right? There's no renaissance man of robotics these days. It's too broad. And so you need to be able to bring in all the people,
[00:14:49] Brian Gerkey: uh, to contribute, and give them a place to exercise their expertise.
[00:14:54] Andrew Zigler: Exactly. A lot of your work in open [00:15:00] source and as a technologist has been focused on building tools that enable other developers to bring that domain expertise closer to the problem. You know, where does your builder mindset come from, and how does it shape what you do at Intrinsic?
[00:15:14] Brian Gerkey: I think my first experience with this was when I was in grad school. I was at USC, and I was, uh, pursuing a PhD in computer science in a robotics lab. And like every robotics lab, we had some robots, we had mobile robots. They were small wheeled platforms that we had to work with.
[00:15:34] Brian Gerkey: And our goal as researchers, as grad students, is to do new science, right? That's the top level objective: you're meant to discover things. Uh, in this case it's kind of an experimental science. And so you invent algorithms and you test them. So the robots are part of your testing.
[00:15:52] Brian Gerkey: They're instruments for you to exercise your ideas, and then you gather data and you report on that. That's what you write about. Now, [00:16:00] in order to use the hardware, we needed software. And we had a very common experience, which is that we bought the robots from a company that made robots for the research market.
[00:16:10] Brian Gerkey: They came with some software from the manufacturer, but it didn't do everything that we wanted. We wanted to make changes, we wanted to fix bugs, we wanted to add things, but it was proprietary software that came from the manufacturer. We couldn't change it. But we were, you know, programmers.
[00:16:24] Brian Gerkey: So we thought, well, we could write our own software instead of using that software. So it was kind of like, if you imagine you get a computer, it has an operating system, and instead of using that one, you write your own. A much grander version of this story is what is at the core of, like, you know, the origination of Unix, actually of BSD in particular.
[00:16:44] Brian Gerkey: So what we did is we wrote our own software, and then it served our purposes in the lab. It was infrastructure software to use the robot and interact with all its sensors. And then we didn't just keep it to ourselves, but we shared it. We put it up as open source. Uh, we put it on SourceForge, which was a, [00:17:00] you know, that was a thing that came before GitHub, right?
[00:17:02] Brian Gerkey: This is quite a while ago. It was circa 2000. And the thing that was magic for me was the first time after sharing it that we got contributions. We got another set of grad students in another lab in another state who said, hey, I found that useful. Here are the drivers that allow that same software to support my robot, which is a different robot that you've never seen before, and now I'm using your software with my robot.
[00:17:26] Brian Gerkey: And that to me was just mind blowing. Like, wow: we put this out there, and then somebody picks it up and finds a way to use it in a way that we didn't even imagine. Like, that for me is where it started. And since then, I've just always been dedicated to being a tool builder, putting these platforms out there, because I love to be surprised by what people do with it.
[00:17:49] Brian Gerkey: And so now, you know, fast forward several years. The project I'm alluding to there was called the Player/Stage Project, and what replaced it eventually was the ROS project, the [00:18:00] Robot Operating System. And we built that, again, kind of with research users in mind.
[00:18:08] Brian Gerkey: We were building a large, uh, wheeled, human-scale robot called the PR2. You could think of it kind of like a humanoid, though not on legs: wheels, two arms, a head with lots of sensors. We wanted it to come with software. We made all that software open source. That was ROS. But the key thing is that it wasn't just used with the PR2, and by now it's been used for everything from indoor robots, outdoor robots, wheeled robots, legged robots, flying robots, swimming robots, robots on the space station. Like, anything you can imagine. And that's one of my very favorite things to do: to go to ROSCon, a developers conference that we run every year, and just see the variety of applications that people make.
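For readers who haven't touched ROS: the mechanism that let the same software jump from the PR2 to flying and swimming robots is anonymous publish/subscribe over named topics, so a driver for one robot can feed a planner written for another. The toy bus below mimics that pattern in plain Python. It is an illustration of the concept only, not the real ROS (`rclpy`) API, which adds node discovery, message serialization, QoS, and inter-process transport:

```python
from collections import defaultdict
from typing import Any, Callable

class Bus:
    """Toy topic-based publish/subscribe, in the spirit of ROS topics.
    Publishers and subscribers only agree on a topic name and message
    shape; neither knows who is on the other end."""
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[Any], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
        self._subs[topic].append(callback)

    def publish(self, topic: str, msg: Any) -> None:
        # Deliver the message to every subscriber of this topic.
        for cb in self._subs[topic]:
            cb(msg)

bus = Bus()
seen = []
bus.subscribe("/scan", seen.append)            # e.g. a planner listening for laser scans
bus.publish("/scan", {"ranges": [1.2, 0.9]})   # e.g. a driver publishing sensor data
```

Swapping the driver for a different robot's driver leaves the subscriber untouched, which is exactly the portability Brian describes.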
[00:18:48] Brian Gerkey: So, back to your question of, you know, why am I a builder? It's because of that. It's because you build tools and put them out there, and then, for me, there's just nothing more satisfying than seeing what people [00:19:00] think of to do with what you shared with them.
[00:19:02] Andrew Zigler: Absolutely, so well said. That's the magic of open source: you share something that you build, and then you become surprised and delighted by how others find ways to innovate and build and be inspired by what you put out there. And that's what makes open source such a great thing to be a part of, and to contribute back to. I, you know, also have had a lot of really amazing experiences in open source where I share something or I put something out there, and it's amazing to see what other people will do once they get a glimpse of it. But it's even more magical, like what you were describing, to think about it extended into our world: the technology that you made influenced the physical world in other places and ways that you didn't even see. You weren't even there.
[00:19:48] Andrew Zigler: And that
[00:19:48] Brian Gerkey: that's totally true.
[00:19:49] Andrew Zigler: has a kind of magic to it that is just something totally different than, oh, someone ran my software.
[00:19:54] Brian Gerkey: I have to say, like, I, you know, I travel a fair bit for work, and [00:20:00] it's not uncommon these days to get off the plane at an airport and see robots roaming around the airport. Um, they're doing cleaning or security, or they're, um, you know, a rolling kiosk. Especially in Asia, there are, you know, kind of helper robots that you can ask questions and things.
[00:20:17] Andrew Zigler: Yep.
[00:20:18] Brian Gerkey: I always make a point to take pictures and see, you know, who's making that.
[00:20:21] Brian Gerkey: And I go look it up. And, you know, very often, uh, they're running ROS. It's like, wow, that's kind of cool. I landed at some new place in the world I'd never been before, and there's a robot in this airport which is running some software that we created, and that's really cool.
[00:20:36] Andrew Zigler: That is really cool. I mean, now, you know, as another aside: I live in West Hollywood, and in West LA they're piloting all of these, you know, robotic delivery robots, because it's an extremely walkable neighborhood, and there's two or three fleets that are always roaming around.
[00:20:52] Andrew Zigler: And I see them, and they are tackling the problems differently. I am always fascinated on an engineering level when one of 'em rolls by me, and [00:21:00] now I can't help but wonder if maybe they're rolling by me powered by ROS.
[00:21:04] Brian Gerkey: It would depend on the brand, but there's a very good chance they are. Yeah.
[00:21:09] Andrew Zigler: Cool. So let's talk about that. Let's talk about the democratization of influencing that physical world. You know, there's millions of software developers globally, but only a small fraction of them have ever really worked on a robot. What do you think is the biggest barrier right now keeping everyday coders out of the robotics space?
[00:21:26] Brian Gerkey: I think that historically, if we roll back, you know, I don't know, 10 years, it would've been that you needed to build and understand too much of it yourself. Coming in as an individual to build an application, you kind of needed to be that, um, you know, imaginary robotics renaissance man or woman that I mentioned earlier. You just needed to have too much in your head about how the low level details of the robot work in order to program a robot in a useful way.[00:22:00]
[00:22:00] Brian Gerkey: Uh, I think that platforms like ROS have lowered that barrier a lot, because they come with a lot of the things that you need sort of taken care of, and taken care of in a way that you can kind of treat like a black box. You know that it works, but you don't need to know how it works.
[00:22:19] Brian Gerkey: I think in addition, it's been difficult to get hardware. Robot hardware has tended to be, uh, expensive, with long lead times to manufacture, and so that's also been a barrier. I think what we're seeing now is that we've got really good software that's available.
[00:22:40] Brian Gerkey: Some of it's open source, like what is in the ROS ecosystem. Some of it is proprietary, and maybe builds on top of things that come out of the open source ecosystem, like what Intrinsic is offering. But both are available, and they both do the job of encapsulating a lot of that stuff that you need to [00:23:00] use but don't need to understand how it works internally.
[00:23:02] Brian Gerkey: And then on the hardware side, I think there's two interesting developments. One is just the availability of low cost hardware. So you look at what comes out of the LeRobot effort. They've got a great open source robot arm that you can buy from them. You can also just download the designs and 3D print it, and you've got a low-cost robot arm that you can work with at home.
[00:23:24] Brian Gerkey: They've got some great tutorials that tie it into using machine learning models, so that you can grab, like, uh, you know, one of the great open source neural networks. You can bring that in, you can train it with your robot, and you can actually play with the robot interactively. I mean, I would encourage people to do this.
[00:23:41] Brian Gerkey: It's actually a pretty low cost, quick and easy thing to do at home. And an intermediate step there is simulation. So, like, you know, one of my colleagues has long said: the lowest cost robot you can find is the one that lives inside your computer. So if you can have a good software simulation of a robot, then that can get you a lot of the [00:24:00] way to exploring the idea you have for what you want the robot to do.
[00:24:04] Brian Gerkey: So you can test, develop, and iterate in simulation, and then, once you've figured out what kind of robot you want to use, what the application is, how you want the software to look, you can actually defer quite far the time and cost of getting it onto the hardware. You still need to get to the hardware, right?
[00:24:22] Brian Gerkey: Unless you're building a movie or a video game, which also have their own value, just having it in simulation is not enough, right? For most of what I would consider robotics applications, the value manifests itself once it gets out into the physical world. So you do need to go to the hardware, and you're probably gonna have to make some changes versus what happened in simulation,
[00:24:43] Brian Gerkey: 'cause simulation's getting better every day, but it's also still not perfect. Um, but the simulation gives you a big leg up.
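One common way teams make that sim-to-hardware handoff cheap is to write application code against a small robot interface, so a simulated arm and a physical arm are interchangeable. Here is a minimal Python sketch of that pattern; the interface and class names are invented for illustration, and a real version would wrap a physics simulator on one side and a vendor driver on the other:

```python
from typing import Protocol

class Arm(Protocol):
    """Whatever the application needs from an arm, simulated or real."""
    def move_to(self, x: float, y: float, z: float) -> None: ...
    def position(self) -> tuple[float, float, float]: ...

class SimArm:
    """The 'lowest cost robot': the one inside your computer.
    Motion here is instant and perfect; a real driver would not be."""
    def __init__(self) -> None:
        self._pos = (0.0, 0.0, 0.0)

    def move_to(self, x: float, y: float, z: float) -> None:
        self._pos = (x, y, z)

    def position(self) -> tuple[float, float, float]:
        return self._pos

def run_task(arm: Arm) -> tuple[float, float, float]:
    """Application logic written once against Arm; swapping SimArm
    for a hardware-backed implementation needs no changes here."""
    arm.move_to(0.3, 0.1, 0.25)   # approach the part
    return arm.position()
```

The point Brian makes still holds: the swap to hardware will surface differences the simulator hid, but the application code you iterated on in simulation carries over.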
[00:24:50] Andrew Zigler: The simulations, as they get better, allow you to, like you said, defer that cost, that physical investment, until as late as possible. And then when you [00:25:00] finally do make that jump, it's as accurate as it could possibly be. And that becomes the thing you're tuning toward on the larger loop that then enables all of this simulation-based kind of innovation inside of it.
[00:25:13] Andrew Zigler: Because, I like what you called out, I think that the sim is really critical. And in this kind of environment that you're talking about, where you're solving a physical problem, you need to represent it in a simulated way so that you can gather enough data to solve it effectively and quickly. Because that's what, you know, these types of neural networks and this type of technology is actually unlocking: your ability to
[00:25:35] Andrew Zigler: rapidly iterate. So the simulation to me always seems like such a key piece, and digital twins are something that are really fascinating to me. Uh, trying to mirror the precision of the real world in that simulated way. And along that path there's a lot of obstacles. And so in the robotics industry, uh, you know, what do you think becomes the difference [00:26:00] between a compelling prototype and something that actually starts to deliver value rapidly? And is the simulation what closes that gap? Is it something else that the simulation enables? I'm curious how, uh, you see teams bridge that gap.
[00:26:15] Brian Gerkey: The simulation helps. Like I said, it helps a lot. Um, at Intrinsic, we take a digital-first approach. So if you're gonna sit down and try to program a robot, what we, and I, would prescribe is that you start with a digital twin: model it in a 3D environment, maybe using a standard CAD tool.
[00:26:38] Brian Gerkey: There are lots of different authoring tools that you could use. The Intrinsic product, Flowstate, is a 3D environment where you can author directly. You can also import your assets from a CAD system. But however you do it, first create a 3D representation of the system you want to interact with.
[00:26:57] Brian Gerkey: And then that allows you to explore, like, well, which robot [00:27:00] do I want to use? 'cause different robots have different, uh, you know, workspaces. If it's a robot arm, they'll have different payload capacities. Where do I wanna put the sensors? Right? Those are all things that are easy to play around with in a 3D environment, much faster and much less expensively than you could with the hardware.
[00:27:15] Brian Gerkey: And then bring up the software stack, get it working, convince yourself that you've got at least approximately the right behavior before you go to the hardware. So let's say that you get it working in simulation; now you get it working on the robot.
[00:27:30] Brian Gerkey: Um, working means different things to different people, right? A lot of times we see really great videos of robots doing just amazing things. There is generally a pretty big gap between a video of a robot doing something, you know, a couple of times in a reasonably controlled environment, and a real product.
[00:27:49] Brian Gerkey: You don't know how many outtakes there were, right? Not everybody's good about showing the blooper reel. That kind of video is still valuable, right? It's a [00:28:00] demonstration of what's possible. That's like, oh my God, we did this thing we didn't even think was feasible before.
[00:28:05] Brian Gerkey: But there's still a gap from there to a product that I can convince a customer to put on, let's say, their shop floor and rely on for their business. Right? There are amazing things now that we can do with robots that have, like, an 80% effectiveness rate, or even a 90% reliability rate.
[00:28:26] Brian Gerkey: Like, a robot can do some amazing dexterous manipulation task 90% of the time. That is extraordinary. We could not do that five years ago, even one year ago. At the same time, if I'm a customer working in a manufacturing context, a 90% reliability rate is, like, useless. I'm not gonna deploy something that one time out of 10 fails to do what I want it to do.
[00:28:49] Brian Gerkey: Right. We need so much more reliability before we're willing, as a customer. Like, if I'm [00:29:00] an end customer, maybe I like robots, maybe I'm enamored of them. But if I'm really gonna use it for my business, the point for me needs to be not that it's a robot, not that robots are cool, but that it does useful work for me, and that it does it reliably enough for me.
[00:29:11] Brian Gerkey: And so on that basis, it needs to be much more reliable than 90%. So then what does it take to get there? One of my, uh, friends and mentors is Rod Brooks, uh, who founded iRobot. And he has his own laws of robotics, and one of them is that going from a lab demo to, I think he says, 99.9% reliability takes 10 years, and then every nine you want to add after that takes another 10 years.
[00:29:40] Brian Gerkey: I think technology is changing faster now than it has been historically. So I think that time period is probably shrinking. It might not always be 10 years like it has been up till now, but I would still measure it in years. So if you see a first [00:30:00] demonstration of some compelling application of a technology, from there till it's gonna be in productive use, I still think we measure that in years.
[00:30:08] Brian Gerkey: Even more years, depending on the domain, if you're gonna put it into a mission-critical, uh, application. I think this has been the journey of autonomous vehicles, right? I mean, this is a very recent and timely example. I don't know if you've got autonomous cars driving around in LA; I know we've got 'em in Northern California, like
[00:30:25] Andrew Zigler: we got the, we got them. They're awesome.
[00:30:27] Brian Gerkey: so, and they are awesome, and it's worth recognizing that, you know, a lot of that work's been going on for many decades, but.
[00:30:35] Brian Gerkey: There was a real focus of attention in, um, these DARPA-funded challenges that happened in the mid-2000s, from like 2004 to 2008, which really showed, first in off-road and then in on-road conditions, what was possible. But it took another 15 years to go from there to something that is the beginning of a [00:31:00] business that is exercising that technology.
[00:31:03] Andrew Zigler: And even 15 years seems so fast when you think about how many hoops that technology has to jump through, because like you said, it's existing in a world where 90% doesn't cut it. That's the brutal reality of taking robotics to production: you start with as many nines on that uptime as you can, you aim for a full 100%, and your customers don't accept anything less because they're operating at scale
[00:31:29] Brian Gerkey: Absolutely.
[00:31:30] Andrew Zigler: and it's part of bringing people together to solve this problem, which we've acknowledged now needs to be modularized. It needs to have these hyper-specific learning loops so we can take advantage of all of this new compute and ability to learn rapidly with these machines, and get them specified for what we need.
[00:31:50] Andrew Zigler: Part of that is things in the Intrinsic ecosystem like embedding domain expertise and skills into things that robots can access and then apply [00:32:00] to their environment. What does that look like to you as a roboticist? On our show, when a lot of our listeners hear skills right now, they think about how do I encode my knowledge and working expertise into something that Claude Code could use to be accurate like me. What is the analog for that in the physical world?
[00:32:21] Brian Gerkey: It's a great question. So I mentioned earlier the idea of a good software platform. To allow you to build an application, it takes things that you need but don't need to understand how they work, and it encapsulates them. And that's the concept of a skill for us: it takes some functionality that I need and makes it, basically, like a function I can call.
[00:32:43] Brian Gerkey: So, examples in the Intrinsic stack are, you know, pose estimation. This is the problem of: I have a camera image, and I have, let's say, a CAD model or some other way to represent the object I wanna find in the scene. [00:33:00] Where is it in the scene, in 3D space? That's a computer vision problem, which neural networks are now very good at solving.
[00:33:07] Brian Gerkey: We use neural networks almost exclusively for solving that problem. But we can give that to you as a skill, and that's in our catalog. So if you go build an application, you can just drag and drop the, uh, I think it's the estimate pose, uh, skill. You drop it in and you say, you know, take your input from this camera, look for this object.
[00:33:26] Brian Gerkey: And then it outputs a pose. It gives you a location and an orientation in space where that thing exists, and you don't need to know how it works inside. And then you go from there to, we've got another skill called move robot. That's where I can say, for a robot arm, for example, I want the hand to end up in this place, which might be the output of that estimate pose skill.
[00:33:47] Brian Gerkey: Like, move the robot to where the object is, 'cause I want to, I don't know, pick it up. And then when you say go, what happens underneath is it looks at where all the things in the environment are. It runs a high-degree-of-freedom [00:34:00] motion planner to decide how to move through space in a safe way that respects all the constraints of the robot.
[00:34:05] Brian Gerkey: Doesn't collide with anything. But again, you get to take that as just a building block. You don't need to know how it works internally. You didn't have to implement it yourself. Importantly, you can bring, as you mentioned, your own skills right there. You can extend the platform by encoding your own skills, and they could be those kinds of underlying bits of functionality.
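As a rough sketch of the composition being described here: each skill behaves like a function with a well-defined interface, and the application author wires outputs to inputs. The names, signatures, and data shapes below are invented for illustration; they are not the actual Flowstate API:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A position and orientation in 3D space."""
    xyz: tuple    # (x, y, z) in meters
    quat: tuple   # orientation quaternion (x, y, z, w)

# Hypothetical stand-ins for catalog skills. Their internals are
# hidden from the application author, which is the whole point.
def estimate_pose(camera: str, cad_model: str) -> Pose:
    # The real skill runs a neural network over the camera image;
    # this stub just returns a fixed pose for illustration.
    return Pose(xyz=(0.4, 0.1, 0.05), quat=(0.0, 0.0, 0.0, 1.0))

def move_robot(target: Pose) -> bool:
    # The real skill runs a collision-aware motion planner;
    # this stub simply reports success.
    return True

# Application logic: compose skills without knowing their internals.
target = estimate_pose(camera="cam0", cad_model="connector.step")
assert move_robot(target)
```

The design point is that `estimate_pose` could be swapped for any other implementation with the same interface without touching the application logic.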
[00:34:25] Brian Gerkey: Like I mentioned, they could also be specific to an application. And this is where it gets pretty interesting. At Intrinsic, I think we're expert at building the underlying robotics infrastructure, like perception, planning, things like that. We are not expert in any particular application domain for robots.
[00:34:46] Brian Gerkey: So let's say you want to have a robot that is gonna do, uh, welding. It turns out there's a lot of what's called process knowledge that goes into welding. To do it well, you have to know a lot about the materials. You have to have a really [00:35:00] tight feedback loop. You have to understand how to produce a weld of really good quality.
[00:35:03] Brian Gerkey: I have no idea how to do that, and nor does our team. So we are not the ones to develop the skill for controlling the tip of a welding robot. We can give you the bits and pieces to control a robot generally, but if you are an expert in that area, then what you might bring to the table is that process knowledge, and we can give you a way to encode it so that you have a skill that does that task really well.
[00:35:26] Andrew Zigler: And that's the transformation that's affecting robotics: the ability now to open source and share these modularized pieces of intelligence and action so that teams can build on top of them instead of constantly reinventing the wheel. It sounds, you know, fundamental, but these are core unlocks that just weren't possible before recent advances in technology and neural networks. And like you called out, so much of this is based in [00:36:00] perception and understanding the world, and the world is different everywhere. So the only way to actually grow the capabilities of the robotics industry and what it can achieve is to, like you said, democratize it, so the welders of the world can bring their process closer to the technology that they're building with. They can create reusable, amazing machines that are highly specialized, and then they can even share this knowledge with each other. And this goes back to the idea of the open source ecosystem scaling and winning in these kinds of cases. You know, we've had a lot of software leaders on the show who have made really big bets on their open source ecosystems, making everything available on a
[00:36:43] Andrew Zigler: base level for, uh, people to build and to innovate, because when everyone can innovate in this shared space, everyone benefits from the gains. And this is a key flywheel for anything that involves learning. Which, as you've rightly called out, robotics [00:37:00] is all about: extracting learning from places and embedding it into machines. And you spent over a decade, you know, leading Open Robotics and the ROS community. Looking back, what did the success of open source robotics teach you about tackling these complex engineering problems?
[00:37:18] Brian Gerkey: If I had to pick maybe two lessons out of the experience with ROS, and what has made it now a de facto software standard for robotics globally, it's: one, we took a distributed systems approach. And this goes back to the earlier part of the conversation, where we talked about how you get an interdisciplinary team to be able to contribute.
[00:37:42] Brian Gerkey: So we took an approach where we said, we're gonna break down this problem into pieces, and we're gonna provide the tools to create applications which are made up of pieces. By contrast, you could take a monolithic [00:38:00] application approach, and that might have some benefits to it.
[00:38:03] Brian Gerkey: If all you want to do is build one application, if you know that you wanna build exactly one application, with a monolithic approach you might be able to squeeze more efficiency out of it, for example, 'cause you don't pay the penalty; there's always some cost to having abstractions, uh, between components. But we said, well, yeah, but what we're trying to do here is allow a lot of people to build a lot of different applications.
[00:38:24] Brian Gerkey: So let's take a distributed systems approach, where an application, when it's running, has multiple, uh, we call them nodes in the ROS ecosystem. They might be processes, they might be threads, they could be in the same process; there are some details there. But the functions that make up the application are broken up into pieces, which can kind of stand alone.
[00:38:44] Brian Gerkey: And then what that means is that you get to come to the table with your expertise, and when you wanna contribute, you're not contributing to a big monolithic stack. What you're contributing is a piece. It's a package, it's a node, it's a process that has a well-defined interface [00:39:00] and it can run as part of the larger application.
[00:39:02] Brian Gerkey: And then it solves part of that overall problem. So that's one: distributed systems. Let people contribute just the part that they're good at. The second thing is: lower the barrier to entry to joining the ecosystem. And part of that is about, um, the licensing.
[00:39:20] Brian Gerkey: So for open source, uh, we've always chosen permissive licenses. We used to use the BSD 3-clause license. These days we use the Apache 2 license, which is kind of the spiritual successor to the BSD license that is more aware of modern, like, software patent law. But let people take the software without asking for permission, without registering.
[00:39:41] Brian Gerkey: And importantly, give them the ability to modify it themselves and to use it commercially without any constraints, other than, like, carrying forward copyright and things like that. That means they can build on it with the understanding that they're gonna be able to go use it in their business, and that's gonna give them more of an incentive to participate.
[00:39:59] Brian Gerkey: So [00:40:00] lower their barrier to entry in that way. Also, when you combine those two things together, with the distributed systems approach, my contribution to ROS doesn't need to be a patch applied to the core. It could be, but more often I'm gonna come to the table with a new thing, a thing that isn't in the core.
[00:40:18] Brian Gerkey: It's a device driver for a new camera that just came out, or it's a machine learning perception system that I just built myself. I get to share that with the community as a standalone package. I can contribute that atom to the system, rather than having to, like, uh, you know, figure out how to modify the underlying core.
[00:40:36] Brian Gerkey: So I think the combination of those two things has just allowed people to come in really, really easily to the community, and that's what's caused it to grow.
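The node-based decomposition described here can be sketched in a few lines of plain Python. This is a toy message bus, not real ROS (actual nodes would use rclpy or rclcpp over DDS transport), but it shows the shape: independent pieces that interact only through named channels with agreed message formats:

```python
import queue

# A toy "topic" bus: publishers push messages, subscribers poll.
topics = {}

def publish(topic, msg):
    topics.setdefault(topic, queue.Queue()).put(msg)

def poll(topic):
    q = topics.get(topic)
    return None if q is None or q.empty() else q.get()

# "Camera node": publishes images, knows nothing about consumers.
def camera_node():
    publish("/image", {"pixels": [0, 1, 2]})

# "Perception node": consumes /image, publishes /pose. It could be
# replaced by any other node that honors the same two topics.
def perception_node():
    if poll("/image") is not None:
        publish("/pose", (0.4, 0.1, 0.05))

camera_node()
perception_node()
print(poll("/pose"))  # the pose the perception node published
```

A contributed package in this model is just another function on the bus, which is why newcomers can add a driver or perception system without touching the core.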
[00:40:45] Andrew Zigler: It's amazing hearing you describe it like that, because I know for me, and for many of our listeners as well, who maybe are in, uh, a more traditional software engineering world: right now, with the innovations that neural networks and LLMs [00:41:00] provide, they spend a lot of their time, too, breaking problems that used to be big and monolithic down into smaller ones, then democratizing it, handing it over to the stakeholder, to the domain expertise holder, to the person who sends that email every Friday, and letting them make the workflow with that tool you made. And I think those are a lot of the really powerful unlocks that engineers in more generalized software engineering environments are experiencing right now. So it's really amazing to hear how that same thing applies, uh, to robotics and beyond. I also wanna call out the really smart thing that you said about creating an open source community that actually grows and becomes, like, this rich garden with just so many things in it.
[00:41:46] Andrew Zigler: People want to come in, they want to plant their plants here. They want to be in this soil because it's rich and it's letting everybody actually own what they build and what they grow. And that's what's really exciting, I think, about making that kind of ecosystem: everyone [00:42:00] has a stake in the environment and in making it better.
[00:42:03] Andrew Zigler: And that's really the magic of creating that virtuous environment. So it's amazing to hear that you've taken so much joy and care and attention in fostering it. I'm curious, too, just from a leadership perspective: how do you think about, as an ecosystem leader, partnering with other ecosystem leaders, and how do you think about mutual growth on those kinds of levels?
[00:42:24] Brian Gerkey: Robotics is a very big space, right? I wouldn't even say that robotics is necessarily a single ecosystem. It's like an ecosystem of ecosystems. So there's the ROS ecosystem. There's an NVIDIA ecosystem; there's a lot of energy going into robotics.
[00:42:43] Brian Gerkey: I was at GTC a couple of weeks ago, and there's a ton of interest in applying NVIDIA technology to robotics. There's a Google DeepMind ecosystem; likewise, a lot of interest in applying things to robotics. There's an Intrinsic ecosystem, uh, where we [00:43:00] are, you know, we've got a particular view on applying robotics in a manufacturing context, and we've got a particular product offering.
[00:43:06] Brian Gerkey: All of these intersect with each other, and so it's super important to understand, you know, what each ecosystem is trying to achieve, who the people are who are really driving those ecosystems forward, and then finding those places of intersection and figuring out, well, how do we work together?
[00:43:24] Brian Gerkey: Um, you know, our friends in DeepMind have long been proponents of the open source physics engine called MuJoCo. And so, you know, to interact more effectively with them, since we've been using a different physics engine historically, we ask: well, what if we both supported the same physics engine? Then our applications and our test suites move back and forth more easily.
[00:43:48] Brian Gerkey: Uh, we look at NVIDIA and say, okay, they've gone all in on the OpenUSD standard for representing digital twins and 3D environments. Well, um, let's make sure between that ecosystem and the [00:44:00] ROS ecosystem that there's a good interchange, because ROS has its own way of representing 3D environments that actually predates USD. But let's, you know, make sure there's good interchange there, and then, hey, we'll just see where people vote with their feet, and maybe USD becomes a more commonly used representation.
[00:44:18] Brian Gerkey: But I think just being aware of what's happening in each of those ecosystems, and then being willing to find those areas of overlap and figure out how to work together, that's what leads to a successful outcome for everybody.
[00:44:31] Andrew Zigler: It's really inspiring to hear you talk about how accessible robotics is now, and I wanna touch on something that I learned about Intrinsic: y'all have an AI for Industry Challenge that's running right now. Could you maybe give me some more details about it, and how folks who have been listening to this conversation could participate if they're curious about taking a dive into robotics?
[00:44:54] Brian Gerkey: Absolutely. There's a long, proud history in robotics of, uh, challenges really [00:45:00] motivating a field. I think it's a great, uh, forcing function to basically put a problem in front of a group of people and say, okay, this is a problem we think is an important one, 'cause we've drawn it from real experience.
[00:45:16] Brian Gerkey: Also, we've specified it clearly, so it becomes a task where everybody can enter and participate, and then we can measure performance across, uh, folks. And hey, there's a dollar prize associated with it, along with bragging rights. So in this case, for what we call the AI for Industry Challenge, this is the first time we've done it.
[00:45:35] Brian Gerkey: And what we're focused on is, uh, a particular task that is incredibly relevant in electronics manufacturing, and that's handling of cables. Pretty much every electronics device, even pretty small ones, ends up having some cables in it. And what we're working on here is at kind of a macro scale: if you look at servers, there are lots of servers going [00:46:00] into data centers these days.
[00:46:01] Brian Gerkey: And there are cables to be connected inside those servers when they're being manufactured. This is something that's been traditionally very difficult for robots to deal with. You know, if I give you a bin of cables, they're all kind of intertwined, interwoven, tangled.
[00:46:17] Brian Gerkey: Can you pick out one cleanly from that scene using perception? Can you decide, well, how do I pick it up? Having picked it up, are you able to feel well enough to, you know, insert it where it needs to go? These are intellectually challenging problems for robotics.
[00:46:36] Brian Gerkey: They're also incredibly relevant for an end customer in manufacturing. So we took that together and designed a competition where we've formulated a specific set of cable handling tasks you need to do: you know, you need to pick up a cable, you need to plug it in in a particular way.
[00:46:54] Brian Gerkey: We give you a simulation environment that you can run at home. So there's a repo you can [00:47:00] clone, uh, off of GitHub, and you'll get a simulator that you can run at home. In fact, I think we even support three different simulation environments, depending on which one you prefer to work in. Our assumption is you're probably gonna use a modern AI, neural-network-based approach to come up with a policy to solve this problem.
[00:47:16] Brian Gerkey: Although, hey, if you took a classical approach and just, like, wrote some code to solve it in a more imperative fashion, that's up to you; a solution is a solution. And then you can submit your solution, your policy, to us, and we test it and we score it. We've got a leaderboard, and we've already got folks up there on the leaderboard now. And we've got a few phases to this competition.
[00:47:37] Brian Gerkey: So we're gonna take the highest-scoring, um, folks out of the first phase and get them into a second phase, where they're gonna start interacting with a cloud-hosted system that Intrinsic runs. And then the people who do well there are gonna progress to running on actual hardware. And then there's prize money at the end of it.
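For listeners wondering what "submit a policy" means in practice: a policy is typically just a function from observations to actions, called step by step by the simulator. The observation and action formats below are invented for illustration; the real ones are defined by the challenge's participant toolkit:

```python
import random

def policy(observation):
    """Map an observation (e.g. joint angles, camera features)
    to an action (e.g. joint velocity targets).

    A trained neural network would normally live here; this stub
    emits small random actions purely to show the interface."""
    n = len(observation["joint_angles"])
    return [random.uniform(-0.1, 0.1) for _ in range(n)]

# One step of a simulated control loop with a dummy observation.
obs = {"joint_angles": [0.0, 0.5, -0.3, 1.2, 0.0, 0.7]}
action = policy(obs)
assert len(action) == len(obs["joint_angles"])
```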
[00:47:54] Andrew Zigler: It's like a multi-phase hackathon with all these different gates that you jump through. And along the way, you're learning [00:48:00] about this simulation space that we talked about today, which allows folks to innovate and to build. And it's actually really fun to hear you describe all the things that robots can tackle,
[00:48:12] Andrew Zigler: and then here they are, you know, tripped up by a cable, something that as humans we just take for granted all the time. We just plug it in with full accuracy, not even thinking about it.
[00:48:23] Brian Gerkey: Yeah, you
[00:48:23] Andrew Zigler: of course,
[00:48:24] Brian Gerkey: with your eyes closed, right? You could reach into the bin and kind of use your fingers and pick it up, and then you could kind of go wiggle it in. It's super hard for robots, and that's a real obstacle to manufacturing at a time when there's a lot of demand for more devices like that.
[00:48:39] Andrew Zigler: That's very cool. It's exciting to see y'all open up the challenge. Uh, I know I'm gonna go check it out, especially since you've packaged up a really great starting kit. There's no reason why I can't throw a Claude Code session at it and see what me and it can do. And
[00:48:52] Brian Gerkey: Absolutely.
[00:48:52] Andrew Zigler: I invite our listeners to definitely explore and do the same.
[00:48:56] Andrew Zigler: And Brian, thanks so much for sitting down with me today. [00:49:00] Uh, besides the hackathon, where can folks go to learn more about you, what you're doing at Intrinsic, and to follow the latest from, uh, the challenge as well?
[00:49:09] Brian Gerkey: I'm sure we can get a link in the notes for the challenge website. That's where we'll point you to the participant toolkit, so you can get the code to run at home. You can, uh, register to participate. There's a whole community where people are starting to talk with each other about what's going on there.
[00:49:27] Brian Gerkey: More broadly, your entry point for all things related to ROS is ROS.org. That's where you'll find, uh, an entry point to that whole community, including all the discussion forums, pointers to the documentation, the tutorials, things to get going. I would, you know, again, suggest starting with one of those low-cost robot platforms, like a LeRobot arm, or a TurtleBot if you're into mobile robots.
[00:49:47] Brian Gerkey: A TurtleBot is a low-cost mobile robot that you can, uh, get to work with if you find any of that interesting. Uh, we do ROS meetups and developer conferences throughout the year. Our global ROSCon that [00:50:00] we do every year is gonna be in Toronto this year, in the fall. So, uh, look that up and, you know, come join us in person and get to know some of the folks behind the ecosystem.
[00:50:10] Brian Gerkey: There'll be a bunch of folks from the broader ROS ecosystem, and there'll be a strong contingent from Intrinsic representing there as well.
[00:50:15] Andrew Zigler: Very cool. Well, we're definitely gonna share all of those notes, and I'm definitely gonna be checking out ROSCon. So thank you again for joining, and like I said, uh, you'll be able to find all of the links and things that we talked about today in the show notes. But don't forget to find us on LinkedIn or Substack as well, uh, where we include this with our weekly newsletter.
[00:50:34] Andrew Zigler: You can find myself there, as well as our guest, and we would love to continue this conversation with you. So if you have any questions, or you're inspired, or you're challenged by things in the world that we talked about today, we want to hear from you. Just let us know, uh, and we're gonna follow up with you.
[00:50:49] Andrew Zigler: And thanks again for joining us on Dev Interrupted, everybody. We'll see you next time. And Brian, thanks again for joining me today. It was a true pleasure to have you here.
[00:50:58] Brian Gerkey: My pleasure. Thanks, Andrew.



