Hermes Frangoudis (00:07) Welcome to the Convo AI World Podcast, where we talk with the builders, developers, and teams really pioneering solutions for conversational voice and the AI space in general. Today I'm joined by Amogha, the Co-founder and CEO of Carbon Origins. So excited to have you today. Amogha Srirangarajan (00:25) Thanks for having me, Hermes. It's exciting to be here. Hermes Frangoudis (00:29) I hear so much about Carbon Origins around the office from Blaise. I gotta give Blaise a shoutout. He's always talking about the latest thing you guys are doing and it's just mind-blowing. It's so cool, and I've been so excited for this episode. So let's get into it. You ready? Amogha Srirangarajan (00:48) Yeah, definitely a big shoutout to Blaise. He's been a huge evangelist of Carbon Origins since the founding, since we were a tiny little team out of Techstars. And yeah, I love the partnership that we have built over the years. Hermes Frangoudis (01:00) So speaking of over the years, let's start from the beginning, really. What inspired Carbon Origins? Amogha Srirangarajan (01:05) Yeah, you know, the name Carbon Origins has a pretty deep meaning. You and I are carbon-based life forms, and at Carbon Origins, we're building intelligence that's trained on data created by the most intelligent carbon-based life form we know on Earth. And that's how we think we're going to solve the labor shortage in critical industries across the world. We train AI by having experts do real jobs in the real world through teleoperation, and then use that data to build advanced AI models with superhuman skills. And unlike teaching people new skills, teaching an AI is hyper-scalable and it builds on itself. Teaching a human is one-to-one; teaching an AI is one-to-infinity. Hermes Frangoudis (01:59) That's an interesting perspective, that teaching a human is one-to-one and teaching an AI is one-to-infinity, because it can replicate and share that knowledge at so much greater scale than we ever could as carbon life forms, as you would put us. Amogha Srirangarajan (02:16) Yeah, but you have to pay homage. Soon, maybe in our lifetime, there are going to be more non-carbon-based intelligent agents than carbon-based intelligent agents in the world. The name Carbon Origins is kind of paying homage to all that. Hermes Frangoudis (02:36) I love this thought of where the Carbon Origins name comes from. But you weren't always doing what you're doing now, right? You started in a completely different way and pivoted. You were doing what, last-mile delivery, right? Amogha Srirangarajan (02:49) Yeah, yeah. I mean, the vision, the mission has always been to make robots commonplace, right? If you see our logo, that's all the planets in our solar system. We're going to have robots mining and pioneering humanity's future as a space-faring civilization. But when you're starting out of your garage, bootstrapped, CAPEX is a pretty big limiter. You can't build off-world robots or mining robots or construction robots. So we started with something that I could build out of my garage with my limited financial resources, and that was sidewalk delivery robots. And this was also kind of inspired by the pandemic, the surge in demand for last-mile deliveries. Everybody was ordering groceries and takeout to their home. The streets were empty, sidewalks were empty.
It was like the perfect environment for sidewalk delivery robots to come into the market. So I literally built sidewalk delivery robots in my garage during the pandemic, put them out, built an app, started taking orders. Initially we started with pure teleoperation, eventually started building autonomy, and ended up with a hybrid teleop-autonomy stack. And we had robots in LA, in Minneapolis, and in multiple cities in Spain. And we grew that quite a bit. But you know, that was never the end game. That was just the first step in our billion-mile or trillion-mile journey to conquer the solar system with robots. So when we had the opportunity to pivot beyond that, go beyond that, we took it. Hermes Frangoudis (04:19) And now you're doing heavy machinery teleoperation, right? Is that how you guys would classify it, or what would you call it? Amogha Srirangarajan (04:28) Yeah, we do heavy robot labor. Hermes Frangoudis (04:32) And that is one of your pieces behind you, right? Amogha Srirangarajan (04:35) Yeah, if you see in my background, that's MARS, Multi-Purpose Autonomous Robotic Skid Steer. And we started with skid steers because it's like the Swiss Army knife of the heavy equipment trades. You have hundreds of attachments for agriculture, forestry, construction, site keeping, firefighting, you name it. There's an attachment manufacturer, or a whole set of attachment manufacturers, building application-specific attachments for these machines. So very quickly we can iterate and test where we can solve labor shortages with this initial flagship product, MARS, and then apply that to larger machines over time, like bulldozers, haul trucks, excavators, surface miners, drills, and so on. Hermes Frangoudis (05:24) Super cool, and it all just starts with this one versatile machine that allows you to build off of that base, right? Like you built a real strong base there. Amogha Srirangarajan (05:32) Yeah. You know, none of us has a crystal ball. We have a lot of hypotheses on where we can apply robot labor most effectively to solve labor shortages, and there are so many different machines you could start with. So we took a very data-driven approach. We looked at what the most sold piece of heavy equipment in North America is, and it's skid steers. They far outsell excavators, bulldozers, haul trucks, any of these other types of machines. Every construction company has several skid steers in their fleet. Every landscaping company has them. Recycling, refineries, they all have skid steers because they're so versatile. And it's one of those things where you don't need a ton of skill to use it, but you need a ton of experience to be effective with it. It's the perfect platform for AI.
We were cooking rocket fuel in the kitchen, literally building rocket engines in the living room and assembling rockets in the garage. It was crazy. There are some articles about this; you can go find them. But during that time, one big focus of the tech stack was live video and telemetry. We had many goals, but that's secondary. That's when I really got exposed to high-quality hardware compression for video. How do we beam down a ton of video and telemetry during flight? Because there's no guarantee you're going to recover that vehicle, you want to get that data off the machine while it's flying into space at supersonic speeds. So that was my first exposure to high-quality, ultra-low-latency video downlink, with controls on the ground processing all that data. Because you can put a lot more compute in an RV or a trailer than you can put in that rocket. So if you had a thousand different sensors on the machine, and you could figure out a high-bandwidth, low-latency pipeline from the rocket to the ground station, you could throw a ton of GPUs at the problem and send control signals back to the rocket. So yeah, the rocket side was real rocket science in every aspect. And then, you know, I've always been obsessed with how the human brain perceives the world. A lot of people don't realize how much compression happens in the brain. We only see the 10 degrees in front of us at high resolution. Everything else is super low resolution and decimated, and your brain is able to construct this three-dimensional world, localize you in that world, and plan actions. It's insane how complex the human brain is. And VR is a great window into figuring out how the brain works, and into building the highest-bandwidth human-machine interface. So I got fascinated with VR with the launch of the first Oculus on Kickstarter. The concept had been around for a long time, but with Oculus it became affordable for developers to buy a VR headset and start building applications. And I worked with a company called The VOID, which was developing location-based VR entertainment. You'd put on a VR headset and get transported to Jumanji or Ghostbusters in this multi-story warehouse building. It was large-scale, location-based VR. There are companies like Sandbox VR that are doing really well today and have taken that to the next level, but The VOID was one of the pioneers, and I got to work with that early team and got a ton of insights into how to leverage VR to build worlds for the brain to live in. During that period, I learned a lot about how the brain interprets visual data, spatial audio data, haptics. I had some aha moments during that time. And that's when the germ of the idea started: okay, we can use VR, the highest-bandwidth interface to the human brain, to train the silicon brains that will do the boring, dirty, dangerous jobs in the world that humans just don't want to do anymore. So it was combining my background in aerospace, with ultra-low-latency video, high-quality hardware-accelerated compression, and real-time modeling on that data, with this deep insight from VR into how the human brain works and interprets visual and audio data and haptics, how we can fool the brain, how we can extract information. I think one of the aha moments was eye tracking and head tracking.
It's like the window into your brain. You see something before you take action on it. So your eyes... Hermes Frangoudis (11:12) So your eyes pick it up. Your eyes pick up before your brain even hits it, so your eyes are almost telling the story. Or before you're executing... Amogha Srirangarajan (11:18) Before you're executing something, yeah. Your eyes move way faster than your hands move. So we can tell what an operator is about to do even before that teleop command is coming from the operator, and we can do some really interesting stuff with that. We can also use eye tracking and head tracking for even more advanced compression on the video, because we don't need to send everything to the headset. We just need to send what the operator is looking at, at the highest resolution and highest bitrate. Everything else can be decimated. So there's so much you can do with VR teleoperation. It is the way to control robots. It is the way to train AI. Hermes Frangoudis (11:59) So you've kind of danced around it. We've seen kind of the piece that you guys are operating. But really walk us through: what's the operator experience like? Headset, feedback, controls, awareness. What is the operator really going through when they're in one of these machines? Amogha Srirangarajan (12:18) Yeah, they feel like superheroes. They feel like Tony Stark from Iron Man, right? There are some fundamental things that we do very differently from everybody else that's trying to do remote ops, remote-control teleops. We project reality one-to-one in the headset, so everything you see in the headset matches the size and angular accuracy of the real world. If you're sitting in that machine and seeing stuff around you, it's better than that inside the headset, because one, we project everything at one-to-one scale, but we also get rid of all the blind spots. And then on top of that, we overlay sensor data, so you're getting synthetic data overlaid on the real-world data. And then we overlay computer vision output on that data. So if there is a person or a vehicle coming from your right rear, we can give you spatial cues. So we enhance the situational awareness of the operator significantly. It's like getting spidey senses: something is tingling behind me, what is going on? Oh, there's a truck coming at me, right? Or I'm backing up towards that excavator, I need to start slowing down. And with AI co-pilots, we can override some of the errors that an operator might make. Hermes Frangoudis (13:36) So if I'm tracking right, HD is better than real life and you have AI co-pilots with you. So your spidey sense is tingling, but maybe you're not picking up on it. The AI operator can step in and help you avoid what I'm going to assume is tragedy or some really undesirable outcome, right? Amogha Srirangarajan (13:59) Yeah, safety first, right? So we applied AI a lot towards enhancing situational awareness and safety. And then we started applying it to automate the routines, the repetitive stuff and the predictable stuff. Driving is at a very high level of autonomy: the machine more or less drives itself from anywhere on the job site to anywhere else on the job site. Now we're applying it to actions.
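To make the gaze-driven compression idea above concrete, here is a minimal sketch of foveated tile weighting, where the tile under the operator's gaze keeps full quality and the periphery is decimated. The tile size, falloff, and bitrate figures are illustrative assumptions, not Carbon Origins' actual encoder:

```python
# Illustrative sketch only -- not Carbon Origins' actual pipeline.
# Gaze-contingent ("foveated") tile weighting: the tile under the
# operator's gaze keeps full quality; quality falls off with distance
# from the gaze point, mirroring the ~10 degrees of high-acuity
# central vision mentioned earlier.
import numpy as np

TILE = 128        # tile size in pixels (assumed)
FOVEA_PX = 300    # radius of the full-quality region (assumed)

def tile_quality_map(frame_h, frame_w, gaze_x, gaze_y):
    """Per-tile quality weight in [0.1, 1.0] around the gaze point."""
    rows, cols = frame_h // TILE, frame_w // TILE
    ys = (np.arange(rows) + 0.5) * TILE   # tile-center y coordinates
    xs = (np.arange(cols) + 0.5) * TILE   # tile-center x coordinates
    dist = np.hypot(xs[None, :] - gaze_x, ys[:, None] - gaze_y)
    # Full quality inside the fovea, smooth falloff outside it.
    return np.clip(1.0 - (dist - FOVEA_PX) / (3 * FOVEA_PX), 0.1, 1.0)

def split_bit_budget(quality, total_kbps=40_000):
    """Divide a fixed bitrate budget across tiles by quality weight."""
    return total_kbps * quality / quality.sum()

# Example: 4K frame, operator looking slightly right of center.
q = tile_quality_map(2160, 3840, gaze_x=2400, gaze_y=1080)
kbps = split_bit_budget(q)
print(f"{q.size} tiles; fovea tile {kbps.max():.0f} kbps, "
      f"far periphery {kbps.min():.0f} kbps")
```

In a real pipeline, a weight map like this would more likely drive a hardware encoder's per-region quality settings than a literal bitrate split.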
On the action side, that's things like scooping dirt and loading a truck, digging a trench, augering holes, drilling, spraying fire lines using thermal camera data, creating dozer lines. These are action models that are very specific to the tool, the environment, the machine, and the end result the customer wants. So those are very much unique models that we train based on expert operators teaching the AI how to do it. And then we have general-purpose models, like driving and safety, that scale very quickly across all the machines on our platform. Hermes Frangoudis (15:01) So cool. So your operators, when they're in there, they're relying on that millisecond-level, low-latency video and audio streaming. So really, what do you do to keep that tight? What's your trick? What's your secret sauce there? Amogha Srirangarajan (15:18) Yeah, there's a lot we have pioneered in that space. I mentioned the eye tracking and head tracking; we use that data for advanced compression. We're prioritizing video and data that the operator is looking at, or is about to look at. And then beyond that, the timing is great, because all these automotive companies building autonomous vehicles have driven the price down on automotive-grade GPUs, which is very important from a hardware-accelerated compression standpoint. So we can capture that high-quality video and compress it. And there are so many tangents aligning. You have autonomous vehicles driving the cost of edge compute down. You have 5G and Starlink getting mature and ubiquitous, so connectivity is really good. And then there's market readiness from a mindset standpoint: people are seeing Waymos and robotaxis in the world, so it's no longer this weird sci-fi concept to have an autonomous vehicle, remote-operated vehicle, or remote-supervised autonomous vehicle on a job site. It's becoming commonplace to the general public. So market timing, people's mindsets, technology readiness, market forces, they're all lining up. This is the time to do what we're doing. Hermes Frangoudis (16:48) That's the most exciting thing as a founder, when those blocks and pieces in varying industries line up, right? It's not just one industry lining up with one other one, it's multiple industries lining up. The technology is really coming to fruition. And I think you nailed it: public perception, the willingness, right? People are expecting these things around job sites, in places where the work is not something you're going to find a lot of people who want to do. Like digging trenches in the middle of the summer, in the burning heat. That's dangerous for people, especially with the way the heat's been this year. It's been like crazy heat domes. You're talking real inhospitable climates. And that's just the precursor for what you guys want to do, right? You're talking about putting these things in space, and that's the most inhospitable climate. Amogha Srirangarajan (17:23) Yup, yeah. It's such a waste of money to send an astronaut to sit in a machine and go mine the Moon, or build roads on the Moon, or habitats. Robots are going to do that. And they have to do it. Hermes Frangoudis (17:55) They gotta do it. They gotta be there anyway. The machine has to be there anyway. Amogha Srirangarajan (17:58) Yeah.
But even on Earth, right, as you've identified, there are so many jobs that suffer from the limitations of the human body and mind. We call this the gray tsunami. All the experienced operators in the market are retiring, and there aren't enough new operators entering the market. There's a big imbalance between people exiting the industry and people coming into the industry. And when an experienced operator with 30, 40 years of experience retires, they take with them all that institutional knowledge and experience. And there's just not enough time to get new operators trained on all of that. So time is of the essence. So our approach is: okay, let's extend your career by a couple more years. Don't sit in that compactor that's shaking you to bits, or that excavator where you're doing this boring job in the middle of the Texas sun, or on the front line during a wildfire cutting dozer lines. Do that from the safety and comfort of an air-conditioned trailer or your home office, right? It's a much better quality-of-life experience. And then that allows us to collect data to train these AI models. It makes the whole heavy-equipment operations job sexy for the next generation. And it empowers them with AI models that have been trained on expert operators who have done this for the last 30, 40 years. So yeah, as I like to say, we're transitioning operators from playing the instrument in the orchestra to conducting the orchestra, right? Hermes Frangoudis (19:44) Yeah, because they're leading the AI orchestras now, right? So we talked a lot about VR headsets, and I imagine these things are using hand controls, but are you guys doing anything with voice and conversational-type AI, or voice activity detection, anything like that? Amogha Srirangarajan (20:03) Oh yeah. I was one of the early, early adopters of ChatGPT, and even before ChatGPT I was playing around with IBM Watson, using a lot of the text-to-speech and speech-to-text stuff to build hands-free controls for teleop. But now with LLMs and reasoning, my God, the things that you can do. For example, operators have a huge checklist they have to run through. That's all automated today. You just say, okay, run me through the checklist and make sure all systems are online. It checks all the cameras, checks the radios, checks latency, sets the right bandwidth limits on the radios based on the quality of the connection. It turns on safety systems, hazard lights, flood lights, turns on the engine. It figures out what tool is attached to the machine and sets the RPM for that tool. Take the Fyrebx on the machine behind me as an example: depending on how much water is in that Fyrebx, you want to change your RPM so there's enough hydraulic pressure on the arms and the pivot system. It automates all that. It has a chart to look at, and it figures out, "Okay, I need to be at 2,500 or 2,600, and this is what the hydraulic flow rate should be: low, medium, high". These things are a cognitive offload for the operators, so they can focus on the human stuff. "Okay, let me look at the blueprint, let me strategize how I need to tackle this task, while the co-pilot I gave one high-level instruction to is getting the system up and running". Hermes Frangoudis (21:39) And that's so cool, because those are the tedious yet super important to-get-started pieces, right?
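As a rough sketch of the one-instruction checklist expansion described above: in practice, an LLM with tool calling would turn the spoken request into a sequence of calls like these. Every function name, value, and the RPM chart here is a hypothetical stand-in, not the actual Carbon Origins interface:

```python
# Hypothetical sketch; all names and numbers are illustrative, not
# Carbon Origins' real interface. An LLM with function calling would
# sit on top, mapping "run me through the checklist" onto calls like
# these and reading the results back over text-to-speech.
from dataclasses import dataclass, field

@dataclass
class Machine:
    tool: str = "fyrebx"        # attachment detected on the machine
    water_level: float = 0.6    # tank fill fraction, 0..1 (assumed)
    log: list = field(default_factory=list)

    def check_cameras(self): self.log.append("cameras: 16/16 online")
    def check_radios(self): self.log.append("radios: linked, latency OK")
    def set_bandwidth(self, kbps): self.log.append(f"bandwidth cap: {kbps} kbps")
    def safety_systems(self): self.log.append("hazard + flood lights on")
    def start_engine(self): self.log.append("engine running")

    def set_rpm_for_tool(self):
        # Stand-in for the tool/RPM chart mentioned above: a fuller
        # water tank needs higher RPM for hydraulic pressure.
        rpm = 2600 if self.water_level > 0.5 else 2500
        self.log.append(f"{self.tool}: RPM {rpm}, hydraulic flow medium")

def run_checklist(machine, link_quality="good"):
    """One high-level instruction expands into the whole routine."""
    machine.check_cameras()
    machine.check_radios()
    machine.set_bandwidth(40_000 if link_quality == "good" else 12_000)
    machine.safety_systems()
    machine.start_engine()
    machine.set_rpm_for_tool()
    return machine.log

for item in run_checklist(Machine()):
    print("OK -", item)
```

The deterministic tool layer does the actual work; the LLM's job is just planning the sequence and narrating the visual confirmations the operator sees.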
The safety check, the tech check, making sure everything's up and running. But like you said, there's no reason it couldn't be automated. An AI running through the steps, making sure the logs are showing up, the buttons are lighting up, the pieces are initializing the way they should. For our brain, it's a very important step, but it's also a really tedious and kind of boring step. Amogha Srirangarajan (22:09) And you could make mistakes, you could forget, you could just be lazy and not set it properly and be like, eh, whatever. Hermes Frangoudis (22:15) You're like, oh man, I'm running late, let's go. The AI's not gonna let you just go. Amogha Srirangarajan (22:21) Yeah, and it does it so fast, and it shows you while it's doing it, so you're getting visual confirmation of these things happening around you. It's a magical experience. And even bug reporting, right? And synthesizing telemetry for you: instead of you trying to interpret the data, it's interpreting the data and providing you a summary, but you can always go and check what's happening. Hermes Frangoudis (22:51) So how do you balance all that, right? You have these voice AI features, you have haptic feedback. How do you balance that and not cognitively overload the operator? I know there are warnings and summaries, but is some of that tucked away for later when it's needed, or how much of it is brought to the forefront? Amogha Srirangarajan (23:13) Yeah, we spent a ton of time on user interface design and user experience. And I've spent hundreds and hundreds of hours, maybe even thousands of hours, physically inside these machines, operating them the good old way and learning from the best. I went through a heavy equipment academy to learn how to operate excavators, bulldozers, skid steers, skidders, haul trucks. We worked very closely with dealerships and with expert operators within our customers' workforce to try to understand what their brain needs in order to do what they need to do, and then we built the user experience and interface to match that. It is as much an art as it is engineering and science. I've seen some of the alternative solutions out there in the market, and they're very surface level. We dig very, very deep into how to make the experience magical. And it's all about reducing the cognitive overload, improving situational awareness, and making the operators feel like they have superpowers. They can do a couple orders of magnitude more with our system than they could ever do sitting in the cab with legacy interfaces and legacy controls. Hermes Frangoudis (24:30) These systems, you're talking about taking old-school operators that are just about to retire, right? Or somewhere close in that age range, because you're talking about extending their time on site by not having to be in these machines. But these are blue-collar, construction-worker, operator kind of guys. What does that shift look like, going from being in one of these to being in a VR headset and navigating this whole thing? I imagine AI is definitely helping, right? But how does that shift happen, and what's it been like for them? Do they like it? Do they prefer it?
Amogha Srirangarajan (25:10) So we offer two paths for operators, and most people use both depending on what they're doing. You can use the VR headset as your primary visual interface, but we also have a desktop interface with big 4K monitors. The cognitive load is actually much higher on desktop, because you can't really project reality at one-to-one scale; you're compressing it to the size of the monitor or monitors. So your brain has to work extra hard to figure out, "Okay, how far is that?". There's no depth perception, no one-to-one scaling. In the VR headset it's a lot more natural, and we're projecting real-world video. We've done a ton of work on running these at really high frame rates on the headsets, so you don't feel nausea, and your brain isn't interpolating some virtual world. It's seeing real-world video, which is what we've evolved to act upon and live in. So most people start off with the desktop, and then they're like, okay, let me give that VR headset a try. And then they try it and they're like, oh my god. Hermes Frangoudis (26:19) Why have you been hiding this from me? Amogha Srirangarajan (26:21) Yeah, yeah. Right? The spatial audio is great, the depth perception is great. But it's not for everybody, and that's okay. We have a face-tracking camera on the desktop platform, which also gives us the eye-tracking and head-tracking information. And some people are just never going to do this, and that's okay too. They can sit in the machine and we'll still collect their data, because we can put those face-tracking, eye-tracking cameras in the cab to see what they're looking at, and we get all the data from the sensors and the cameras. And they have a nice touchscreen inside the cab, so it's like driving a Tesla: you get your GPS, your camera views, your safety systems. You get a lot of the value of sitting in the VR cockpit or the desktop cockpit while sitting in the cab of one of our upfitted machines, for those operators who just don't want to do it the new way. We've also put a lot of emphasis on audio and haptics. In the desktop interface, the joysticks have haptics, the steering wheel has haptics, and we use these high-end butt-kicker haptics on the chairs. So when you go over a bump, you can feel it. And the human brain is a beautiful thing. It very quickly starts translating that experience of sitting in the cab to these new feelings you're experiencing. Hermes Frangoudis (27:48) So you're able to mind-map it, right? In UX terms, your mind is able to map it to existing pathways, so this correlates to a feeling that they know. Amogha Srirangarajan (27:50) Mm-hmm. Yeah, and our ratio is more like one to five when we look at experienced operators versus brand-spanking-new operators who have never sat inside one of these vehicles. And that's really the other problem we're trying to solve, right? One problem is data collection: train high-quality AI models with experts training them live in the real world, doing real jobs. But the other problem is, okay, they're still going to retire. And we want humans in the loop. There's so much context: cultural context, company-wide context, processes, standard operating procedures that are different at each company.
In construction, if you're looking at just trench digging, for example, between any two companies there are differences in how they tackle trench digging, backfilling, things like that. So you really need humans in the loop for a while. As I said, they're going to conduct these orchestras. And a lot of our really good operators on the platform started off as teleoperators, and then we put them in the real machine to get some feel. But they're coming from a gaming background. They're young, they're hungry, they want to do something, but they don't want to do what their granddad did. Hermes Frangoudis (29:12) That's super cool. And it's interesting that you bring up all these other elements to it, right? You have the haptics, you have the full spectrum: audio, video, multicam. So how does this differentiate you from some of the other startups out there in a similar field that just aren't tackling the problem the same way? Amogha Srirangarajan (29:36) Yeah, I don't spend too much time looking at what the others are doing. I do spend some time on it. I consistently see a lot of companies getting overly obsessed with getting to full autonomy on day one, or very quickly, which I guess makes sense to some extent because of what people have seen with ChatGPT and autonomous vehicles on the road. The big difference is there was a lot of data to begin with there. Large language models, vision models, video models: these have billions of data points that you can very quickly put together, scrape from the internet, add weights and annotations, and start training models on. Even cars on roads: it's a fairly structured environment. There are rules of engagement on a road, there are ways you drive, and there's so much dash cam footage. What Tesla did with collecting data on their EVs to train their AI models was a brilliant move. There is no equivalent to that in our space, with heavy equipment in highly unstructured worlds. I think it's really important to build for the remote operator first, to really focus on the best data collection tool, the best teleoperation tool, because when that AI fails, you want the best teleoperation solution at your fingertips for recovery, to continue the job. You don't want these machines stopping because the AI model wasn't trained on some edge-case scenario. That edge case is when you need the best teleoperation platform, period. So that's where a lot of our focus went, and still is. But we know where the puck is going. Hermes Frangoudis (31:15) There will be autonomy at some point, right? But right now, human-in-the-loop is going to be integral to actually making it happen. Amogha Srirangarajan (31:21) If you think about it, it's like the graphical user interface for personal computers. There were a lot of computer engineers and computer scientists using computers before Apple and Microsoft came to market with the GUI, the graphical user interface. The same goes for robots. For robots to be effective in the real world, and for everyday people who don't have a degree in engineering or a ton of technical experience to get the most value out of robotic equipment, you need a really good interface to them. Teleoperation does for robots almost exactly what the GUI did for personal computers. So I think the company that really nails it, like Carbon Origins, will dominate the space. Hermes Frangoudis (32:06) So exciting.
And you talk about how there's no one really doing these things in the teleop heavy-machinery space. You guys are really leading this. Can you tell us a little bit about your recent partnership that accelerated these real-world deployments and shaped some of your product requirements and roadmap? Amogha Srirangarajan (32:31) Yeah, yeah. You know, our domain expertise is in robotics, in deep tech, in compute and AI, and in working closely with domain experts. And the domain experts are usually the guys making the business end of these machines, like the Fyrebx attachment on the machine behind me. They have been powering firefighting contractors with advanced firefighting attachments for their skid steers, skidders, and bulldozers for longer than we've been building AI or robotic solutions for firefighting. So partnering with people like Fyrebx is the way, probably the only way, to deliver the right value to the customers in that market. And we're going to replicate that across different industries: figure out who is already selling tools into that industry with deep domain expertise, domain knowledge, and existing channels to get the product to those customers, convince them, and they'll convince their customers. Hermes Frangoudis (33:29) I'm sure it's not a hard sell, because the value this kind of technology brings to the market is very clear, along with the value it brings to productivity in general and the ability to operate in, like we said earlier, adverse conditions. Conditions that are adverse to human existence or to the ability of humans to thrive. You're talking about firefighting. That's probably one of the most inhospitable environments, and there are people who put their lives on the line all the time to save others. And here's an opportunity for one of these machines that can withstand the heat to go in there and put the fire out, operated by a human, by someone making intelligent decisions about how to put it out. That's super cool. Amogha Srirangarajan (34:18) Yeah, I mean, fire is a national security risk if you think about it. Anybody with a matchstick and some gasoline can bring down a military base, can bring down a whole community, can bring down an airport. And wildfires are on the rise. Hermes Frangoudis (34:32) I mean, Mother Nature can do it too. Yeah, it's happening. Amogha Srirangarajan (34:44) Exactly, and it's on the rise for all sorts of reasons. We don't need to get into why it's on the rise, but it's on the rise, and we've got to do something about it. And the number of firefighters isn't increasing proportionally to the need and the demand for firefighters on the front line. With robots today, like our machines, we're far better at doing things like initial attack, point protection, fire watch, mop-up and recovery, building fire lines, dozer lines, wet lines, and helping with prescribed burns. We are a force multiplier in that market. We're not replacing anybody. We're giving people this incredible force-multiplier tool. Take prescribed burns: a ground crew with a traditional fire engine, which can't go on all terrain and has to be confined to roads or access paths, might do 100 acres a day of prescribed burn. We could do 1,000 acres per day with a machine like this supporting that crew. And then, bundling AI on top of that, these machines are much safer to operate around.
There's 360-degree situational awareness for the remote operator and the machine itself. We can use drones to map what's ahead, update the terrain in real time, and give that data to the remote operators. Level-four autonomous driving can do most of the driving, so the operator is mainly focused on what's happening at the business end with the tool: creating the dozer line or the wet line, coordinating with the fire chief and incident command, and point protection. These things can be airdropped with a Sikorsky Skycrane to protect an electric substation, so they can do point protection of critical assets, critical infrastructure in the path of a wildfire. And then once the fire has been put out, you've got to do this thing called fire watch, where people sit in trucks just watching the fire line for re-ignition, because sometimes these embers are several feet under the ground, slowly smoldering, and they can re-ignite. So fire watch and mop-up are a big part of post-suppression. There again, we can relieve the human crew, because this machine can stay in sentry mode, use its thermal cameras to keep scanning the fire line, and if it sees hot spots, go investigate, do a burst of fire retardant, and put it out. No human can do that. Hermes Frangoudis (37:13) Kind of stopping the problem before it ever even bubbles up past the surface. That's so cool. And these things are all modular, right? You can take the front piece off, you can put a new piece on, they can do different versatile actions. So how do you decide what hardware factors to build and integrate? Is that really driven by customers? Amogha Srirangarajan (37:17) Absolutely. Hermes Frangoudis (37:42) Or market demand, like you said, looking for the customers that have these things already in market and trying to meet their existing platforms? Amogha Srirangarajan (37:50) Yeah, it's a multifaceted question, I guess. The skid steer itself is multipurpose; that's why we call it MARS, Multi-Purpose Autonomous Robotic Skid Steer. And there it's driven by customer demand and insights we're getting from attachment manufacturers on where the labor shortage is greatest. If we can solve the labor shortage, we increase their total addressable market; the ceiling on how many attachments they can sell keeps rising. Because if a customer can only buy two or three, because that's how big the crew is, that's what they can use. But with our system, they can buy 10 times that much, right? So we can increase the total addressable market for our partners by 10x. So they're very motivated, and they know where to look, where to apply this technology. And then we work with them and with lighthouse customers, early adopters, to figure out how to accelerate that. And then beyond the skid steer, beyond MARS, our upfit technology is also modular and agnostic to the kind of machine underneath. So we're working with mines in Latin America that have a lot of regulations pushing them to start transitioning towards unmanned equipment operation: high-risk mines, decommissioned tailings dams, lithium mines that are corrosive and massive, where there just aren't enough people to put in the middle of Chile to go mine for lithium while demand is skyrocketing. So there's a lot of opportunity in the mining space, which is a well-established market for teleoperation and autonomy, for better solutions than what the OEMs are providing.
If you want to buy something from Caterpillar or Komatsu, they're going to force you to buy a new machine, and the premium to get autonomy and teleoperation is like one and a half to two times the cost of the base machine, which I think is ridiculous. We can go upfit the fleet that you already have, and we can upskill the crew that you already have, which is a big deal. Hermes Frangoudis (39:50) I mean, that's huge for business, right? They've already sunk the costs into the machinery. They've sunk the costs into the crew. So for them to have to go sink more money just to get that one feature doesn't make sense. Whereas you guys are kind of untying their hands in that sense. That's so cool. Amogha Srirangarajan (40:10) Yeah, we're upgrading their existing infrastructure and upskilling their existing crew. There's modularity there. And there's also a lot of modularity in the software: the safety systems scale pretty well, the self-driving systems scale pretty well, and the pipeline to train action models scales pretty well. And the network infrastructure scales incredibly well on the backend. Hermes Frangoudis (40:32) Yeah, that's huge. Like having a network that scales with the demand, because how many cameras are you guys broadcasting at once? Amogha Srirangarajan (40:40) It really depends on what the operator is trying to do. There are 16 cameras on the machines, a combination of monoscopic and stereoscopic cameras. Our goal is zero blind spots and views you could never get while sitting in the cab, to give the operator those superpowers and that situational awareness. And beyond that, we have LIDAR, we have radar. The LIDARs are not that useful; they're mostly a calibration tool, because we work in such dirty, dusty, grimy environments. But the radars are really good. Again, autonomous vehicles have driven the price down on high-quality automotive-grade radars. And Starlink and Viasat for satellite links, 5G, and 4G LTE are becoming ubiquitous. You can put up your own private 4G tower if you need to, get similar-quality data links off these machines, and backhaul over mesh networks. Hermes Frangoudis (41:33) The infrastructure is getting there. It's no longer things of the future, it's things of today. But I do want to pick your brain a little bit about things of the future. I know we're coming up on time and I want to be conscious of your time, but I do want to hear some of your ideas, because we were talking the other day and I think everyone needs to hear where you're thinking this is going to go. Earlier you kind of alluded to it, like space and mining. Amogha Srirangarajan (41:35) Absolutely. Yeah. Hermes Frangoudis (42:01) Really, where do you see the future of this being? Amogha Srirangarajan (42:03) Yeah, I mean, back in university I started our university's Lunabotics mining team. NASA used to have this Lunabotics mining competition at Kennedy Space Center in Florida, where you'd build your lunar mining robot in a year and go compete against other universities. I really got the lunar bug, you know. And I immediately started seeing some really key unlocks for human civilization in taking industry off-world to the Moon. And more so recently, over the last two to three years, our energy consumption as a civilization is growing exponentially, access to AI models and to the internet is also increasing exponentially, and the human population is increasing.
So soon we're going to run into massive bottlenecks in energy on Earth. These data centers are really power-hungry. The actual numbers are all over the place, but a query on ChatGPT or Grok or Gemini consumes an order of magnitude or more power than a simple Google search. And you and I grew up on Google searches. My parents didn't even have that when they were in their 20s and 30s. And for the next few generations, search is going to be very different. It's all going to be driven by these really power-hungry AI models. And as efficient as we make them, they're going to get more and more complex, so they're going to need more and more energy. And we're going to enter this generative-first world where information is generated rather than searched. Anyway, all of these trends point at one thing: we need to solve the energy crisis before it becomes a massive bottleneck for future generations. And the way I see it, how we're going to solve that at Carbon Origins is to build massive, really massive mirrors orbiting the Sun and orbiting Earth. The Sun is the most efficient nuclear fusion reactor we have access to. It's bigger than any nuclear fusion reactor we can build on Earth. And the Moon is the perfect place to build these mirrors, because there's aluminum oxide just sitting there in the lunar regolith. And if you think about how they built mirrors back in the day, a long time ago, they used aluminum. So we can mine for aluminum and build massive mirrors with aluminum vapor deposition. And because there's no atmosphere and there's low gravity, you can build these massive mirrors and launch them without a big rocket with aerodynamic features. So building is easier, and launching into Earth orbit, lunar orbit, or solar orbit is easier. And we figured out how to point things, how to point lasers, and how to hold mirrors in position super accurately a long time ago. That's how Hubble works, that's how James Webb works, all these telescopes, and Starlink with their laser links. Holding position in space is a solved problem. So we can build these massive mirrors and collectors in space and beam that coherent energy down to Earth. You might have seen a much smaller version of this in Arizona or Nevada, where massive mirror arrays focus light on a pot of molten salt, and you run distilled water through it, run a turbine, and you get energy. But what if we could hyperscale that? Have these energy generators offshore, several nautical miles offshore. Most of humanity is coastal; we all live near the coast. So we could have collection happening in space, beaming coherent sunlight to energy generators offshore, using the thermal mass of the ocean for cooling, and then running these turbines. Clean, renewable energy, powered by the biggest nuclear fusion reactor we have access to, the Sun, can be powering our data centers of the future. So I think I'm probably going to be doing that in the late 2030s, early 2040s. First it'll start with landing bases for rockets, habitats, and construction facilities, but as soon as possible we start building these massive mirrors and essentially making energy ubiquitous. Energy should no longer be a bottleneck. And that's how we become a Type II civilization.
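For a sense of scale on the mirror idea, a hedged back-of-envelope using only the textbook solar constant (about 1,361 W/m² at Earth's distance); the collector area and conversion efficiency below are pure assumptions:

```python
# Back-of-envelope only; mirror area and efficiency are assumptions.
SOLAR_CONSTANT = 1361   # W/m^2 at Earth's distance (measured value)
mirror_km2 = 10         # assumed collector area
efficiency = 0.25       # assumed end-to-end beam-to-turbine efficiency

area_m2 = mirror_km2 * 1e6
power_gw = SOLAR_CONSTANT * area_m2 * efficiency / 1e9
print(f"{mirror_km2} km^2 mirror -> ~{power_gw:.1f} GW delivered")
# ~3.4 GW: roughly the output of three large power plants from one array.
```

Even at that conservative assumed efficiency, a single 10 km² collector lands in the multi-gigawatt range, which is data-center scale.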
Hermes Frangoudis (46:30) That's super interesting, because it's not even just pointing it at Earth. You could bring it to the Moon and to other areas where you have less opportunity to set up energy capture. Going there with an idea, and the ability to send and beam energy around, is going to be huge just in terms of energy transport. Amogha Srirangarajan (46:58) Yeah, I mean, a coherent beam of sunlight that we've concentrated can travel at the speed of light anywhere. And the biggest market is going to be Earth. But if you look beyond Earth, people have had this crazy idea of nuking the ice caps on Mars to create an atmosphere there. That's a popular idea. You don't need to throw nukes at the caps; you can just melt them with concentrated sunlight. It's also a great way to power interstellar spaceships. You can build massive solar sails, and you can use light. Light exerts force on reflective surfaces: if a photon hits a mirror and bounces off, that energy is going to move the mirror forward. So you could have massive solar sails and have this beam of energy pushing those spacecraft to relativistic speeds, certain percentages of the speed of light, and get probes off-world to go explore Alpha Centauri in our lifetime. That would be insane. Hermes Frangoudis (47:45) It'd be wild, that'd be so cool. All right. So I know you have a bunch of testing to do and I've already taken up so much of your time. I've enjoyed every minute of this conversation, and I'd probably keep going for hours if you let me, but right now you have to actually go do real-world stuff in reality to keep your business rolling. So I'm going to leave you with... yeah, you guys don't ever stop, right? I've got one more question for you, and it's a bit of a wild card. Amogha Srirangarajan (48:07) Yeah, our machines run 24/7. Okay. Hermes Frangoudis (48:30) If someday you had unlimited lab time and budget, what kind of moonshot robotics would you be prototyping? Amogha Srirangarajan (48:43) Yeah, we are a moonshot company, right? I have like a million ideas. You have to go back to first principles: how we would mine on the Moon is going to be very different from how we mine on Earth. The excavators on the Moon, the bulldozers on the Moon, the haul trucks on the Moon are not going to look anything like those on Earth, because you have a sixth of the gravity and you're operating in vacuum. Thermodynamics is a completely different challenge compared to thermodynamics on Earth. So that's where, I mean, that's why I'm building Carbon Origins. It's going to be the cash cow that funds our moonshot machines, the moonshot robots of the future. So yeah, if I had unlimited funding, that's what's coming. It's on the horizon. I think the countries, the organizations, the companies that dominate the Moon will define the next 100 years of human civilization. And I want to figure out the best way to mine the Moon, build habitats on the Moon, build these mirrors on the Moon, and build generational spaceships that we can use to get humanity out of the solar system and into new adventures in the final frontier. That's where I would be putting my money, and that's what we're working towards. Hermes Frangoudis (50:05) That's amazing. You're getting to live the dream, man.
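On the solar-sail point from a moment ago: a perfectly reflective sail intercepting beam power P feels a force F = 2P/c. A quick worked number, with the beam power and sail mass as assumptions:

```python
# Radiation pressure on a perfect reflector: F = 2P / c.
# Beam power and sail mass below are assumptions for illustration.
C = 299_792_458          # speed of light, m/s
beam_power_w = 1e9       # assumed 1 GW beam held on the sail
sail_mass_kg = 100       # assumed probe + sail mass

force_n = 2 * beam_power_w / C     # ~6.7 N of thrust
accel = force_n / sail_mass_kg     # ~0.067 m/s^2
year_s = 365.25 * 24 * 3600
v = accel * year_s                 # ignores beam spread and relativity
print(f"thrust {force_n:.1f} N; after one year ~{v/1000:.0f} km/s "
      f"({v / C:.1%} of light speed)")
```

The thrust is tiny, but it never runs out of propellant; sustained beaming is what makes "certain percentages of the speed of light" plausible.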
And I know it's a lot of hard work. What's that meme with the iceberg, where this is what people see versus what it actually is? So I'm so thankful that you took this time to chat with me and with all our listeners, because it's huge. Amogha Srirangarajan (50:08) Yeah, yeah. Hermes Frangoudis (50:32) You guys are doing some really momentous, amazing things, and you have even more momentous and ambitious ideas that I wholeheartedly believe you're going to be able to hit. If you've done this now, I can't imagine what you're going to be doing in 2030, 2040. Amogha Srirangarajan (50:45) Well, it's been a pleasure working with you guys too. It takes a lot of partners to pull off what we're doing. So yeah, thanks for the amazing relationship so far. I know we're going to be building some amazing things. Hermes Frangoudis (50:57) Well, I also want to thank our audience, our listeners, and our viewers live. For everyone on social media, like & subscribe. For everyone listening to the podcast, follow along for the next episode. Thanks so much for joining us, and we'll see you on the next one. Amogha Srirangarajan (51:12) See y'all, take care, bye.