MVPro Media – The Vision Podcast #18
Guest – Chris Matthieu, VP of Developer Ecosystem at RealSense
In this episode, Josh speaks with Chris Matthieu, VP of Developer Ecosystem at RealSense, about the company’s evolution following its spin-out from Intel and what independence means for the future of AI-powered vision. The conversation explores RealSense’s expanding role in robotics, access control, and industrial automation, as well as the partnerships and platform developments shaping the next phase of Physical AI.
“I wholeheartedly believe in physical AI. I believe that programming ROS robots is a thing of the past. Like, I do believe it’s all going to be about missions, VLMs, VLAs, AI controlling the body of these robots.” – Chris Matthieu
On this page:
- Podcast player
- Guest information
- Useful links
- Episode chapters (coming soon)
- Episode transcript
Listen to the Episode:
About our Guest:

Chris Matthieu is VP of Developer Ecosystem at RealSense, where he leads engagement with developers and partners building AI-powered vision systems for robotics, biometrics, and industrial automation. A serial entrepreneur with five startup acquisition exits, Chris has held senior technology leadership roles (VP/CTO) across IoT, edge computing, and automation, including co-founding Octoblu, an IoT platform acquired by Citrix. At RealSense, he focuses on expanding industrial-ready vision solutions and enabling Physical AI systems used in autonomous mobile robots, humanoid robotics, and secure access control. He is also a frequent speaker and educator in the computer vision and robotics community.

Useful Links:
- RealSense | realsenseai.com
- Chris Matthieu on LinkedIn
- RealSense Completes Spinout from Intel, Raises $50 Million to Accelerate AI-Powered Vision for Robotics and Biometrics
Episode Chapters:
Click on the chapters to jump to the relevant sections of the transcript below (coming soon).
Chapter 1 — From Entrepreneur to Vision Builder
Chris reflects on his unconventional career path, from early leadership roles and startup exits to finding a long-term home in computer vision and robotics.
Chapter 2 — How RealSense Found Its Market
A look back at RealSense’s early days, its time inside Intel, and why depth perception and on-camera compute became foundational to its dominance in robotics.
Chapter 3 — From Robotics to Access Control
Chris explains how RealSense’s core depth technology is now powering new use cases beyond robotics, including facial authentication and secure access systems.
Chapter 4 — Industrial Readiness and Deployment Reality
The conversation turns to what industrial customers actually need: lifecycle stability, rugged hardware, longer cable runs, and standards-friendly integration.
Chapter 5 — The Future of Physical AI
Chris shares his vision of Physical AI, where robots reason, adapt, and interact naturally with humans—moving beyond scripted behavior toward embodied intelligence.
Episode Transcript:
Josh Eastburn (host)
In July of 2025, leading AI vision company RealSense announced the completion of its spin-out from Intel, along with a $50 million Series A funding round aimed at helping it expand its market share as an independent company. To talk about this move and what it means for the future of RealSense, I’m joined today by VP of Developer Ecosystem at RealSense, Chris Matthieu. As VP, Chris helps developers and partners turn AI-powered vision into real-world robotics and biometrics solutions. A serial entrepreneur himself with five startup acquisition exits, Chris has held executive technology leadership roles and has built products across IoT, edge computing, and automation. RealSense delivers industry-leading depth cameras and vision technology used in autonomous mobile robots, access control, industrial automation, healthcare, and more. With a mission to deliver world-class perception systems for Physical AI and safely integrate robotics and AI into everyday life, RealSense provides intelligent, secure, and reliable vision systems that help machines navigate and interact with the human world. The company is headquartered in Cupertino, California, with operations worldwide, and claims an installed base including 60% of the world’s AMRs and humanoid robots. Now here’s Chris to talk about the future.
Josh Eastburn
Tracing the arc of your career has been super fun. You’ve done a lot of stuff in tech. I was trying to list it out. Lottery, I think, is kind of where you started.
Chris Matthieu
Oh, yeah. I was like CTO of the Arizona Lottery when I was 25 years old. That was crazy, yeah.
Josh Eastburn
Yeah. Wow. Voice and phone services, language platforms, IoT, supercomputing.
Chris Matthieu
Yeah. I’ve built and sold five companies and the last one was a decentralized mesh supercomputing platform bought by Magic Leap. They were the augmented reality glasses company that had raised like $6 billion or something crazy.
Josh Eastburn
Yeah. I remember that.
Chris Matthieu
They wanted to use it to keep all these digital dragons flying in the sky, so that everyone could see the same dragon, and all this crazy real-world metaverse stuff. They needed this infinite computing, and that’s what they bought. Crazy.
Josh Eastburn
Okay. Okay. Wow. So was it the metaverse conversation that came first or had you gotten into the mesh computing some other way?
Chris Matthieu
No. The company I built before that was an IoT company I sold to Citrix, and what I had noticed building that company was that there were going to be like 50 billion IoT devices connected by 2020 or something like that. And most of these devices had like nothing happening on them. Just sitting there waiting to chirp every 15 minutes on sensor data. And I was like, look at all this wasted latent compute laying around. What if we could somehow harness all of that? And that’s what Computes.com was all about. Anything with a processor could connect, communicate, and compute together like one global mesh computer.
Josh Eastburn
So I feel like, and you tell me, but is your disposition just such that if you see a problem, if you see a gap in the market, you’re just going to jump in there and, you know, put your option out there?
Chris Matthieu
Yeah, I love the business side of it. And like you, I love the engineering side of it. And when I see one plus one equals four, I see those opportunities. I try to connect the dots. I love building things, and I love building companies.
Josh Eastburn
I was watching a lot of your educational content around RealSense. What ultimately brought you into the computer vision and imaging market?
Chris Matthieu
I’ve always kind of been into computer vision. If you look at all the shiny objects that you see along the way, a lot of it, even in IoT, dealt with computer vision. But what got me to RealSense: Magic Leap had laid off like half its staff, so then I started building this real-world metaverse idea. And timing is everything, and the timing wasn’t quite right; I wasn’t getting the right business opportunity leads. A friend reached out to me who was at Intel and said, we’ve got this incubator discovering technologies inside of Intel, trying to get them funding and spin them out as new companies. He said, we really need an entrepreneur in residence who can help turn these tech ideas into companies. I was like, that’s what I do. I joined Intel, and the RealSense team was part of the incubator program. So I got exposed to lots of different startups. I would come in and get a million for this one, a couple million for that one, and turn them into companies. But RealSense was the most mature one that stuck, and I was asked to see what I could do with RealSense.
Chris Matthieu
And I was like, oh my gosh, robots, AI, computer vision, it’s all coming together. I was just a kid in a candy store with RealSense. It just stuck. I really love doing what I do.
Josh Eastburn
And now RealSense has such a presence, particularly in robotics, right? Vision applied to robotics, correct?
Chris Matthieu
Huge. RealSense has about 60% of the worldwide robotics market, and I think 80% of the humanoids. When you talk to people and they see RealSense cameras on these robots, it’s inevitable. They’re all RealSense. And when I go to a conference and I see one that’s not running RealSense, I want to stop and talk and ask why. Why are you not? Are you missing something? Are we missing something? Yeah, it’s market dominance. And it’s easy to understand why. These cameras have built-in compute inside of them, so they’re able to do all the depth calculations in real time. I think it’s like one millisecond per frame getting all the depth points. So you have RGB, red, green, blue, plus depth. You’ve got a depth pixel on every RGB pixel that the camera sees, and it’s all happening in real time on the camera compute.
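To make the per-pixel depth idea concrete: with the open-source RealSense SDK’s Python wrapper (pyrealsense2), aligning a depth frame to the color frame and reading a distance takes only a few lines. This is a minimal sketch, not code from the episode; the stream resolutions and frame rates are just common defaults.

```python
import pyrealsense2 as rs  # open-source RealSense SDK Python wrapper

# Configure matching color and depth streams (illustrative settings).
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 640, 480, rs.format.z16, 30)
config.enable_stream(rs.stream.color, 640, 480, rs.format.bgr8, 30)
pipeline.start(config)

# Align depth pixels to the color image so every RGB pixel has a depth value.
align = rs.align(rs.stream.color)

try:
    frames = align.process(pipeline.wait_for_frames())
    depth = frames.get_depth_frame()
    # Distance in meters at the center pixel of the aligned depth frame.
    d = depth.get_distance(depth.get_width() // 2, depth.get_height() // 2)
    print(f"Distance at image center: {d:.3f} m")
finally:
    pipeline.stop()
```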
Josh Eastburn
Ultimately, kind of going back to those incubator days, is that what you think led to the strength of RealSense? Was it just, we’re bringing this technology to market that no one else has seen?
Chris Matthieu
Absolutely, yeah.
Chris Matthieu
I think from Intel’s perspective, they had always seen robotics as an up-and-coming trend. And, you know, timing is everything. I think Intel has built and stopped and started its robotics programs over and over again, but RealSense saw the opportunity and stuck with it. And while robotics is still, I think, the number one market for these stereo depth cameras, I still get amazed when I see other use cases, like in retail or manufacturing as well. Like this new D555 Power over Ethernet camera. Ethernet goes like 100 meters, so you can string a hundred-meter Ethernet cable out to the entryway of your retail store. And then it’s kind of like Google Analytics for the real world. You’re tracking people coming in and out, where people are moving around, what displays they’re looking at. So it’s cool to hear of new use cases that are not robotics for this type of technology.
Josh Eastburn
Do you have, and maybe you’re going to have to think over a lot of different applications here, but do you have a favorite vision application, either one that you’ve built, a project that you’ve been part of, something like that?
Chris Matthieu
So I’ve built a lot of different ones. I had a salesperson on our team say, hey, we’re meeting with this airline; they were thinking that maybe they could use RealSense to get the dimensions of suitcases. So I pulled out Cursor AI, and 30 minutes later I had a full working demo where it sees a suitcase, it gives you the dimensions, gives you the volume, and they’re like, what took you so long? AI is writing really good code for RealSense, and it still blows my mind, because the RealSense SDK is open source and a lot of the algorithms like YOLO are all open source. So AI just says, let me write that for you, and it builds this type of stuff. And I did have some fun. I ran a week-long coding competition, and we gave out camera prizes for developers that built cool things. I would give examples for each, and I had AI write me this game, like a one-player pong game, where the camera’s looking at you, so if you move close to the camera, the paddle goes up; if you move away from the camera, the paddle goes down. It wrote that in 15 minutes. So now you’ve got this game with RealSense. And I did something similar with music. I made a theremin, is that what you call it? Theremin, yeah. You would get close to the camera and the pitch would go up, and so you were doing all these fun little martial-arts-looking moves and making music with the RealSense camera. I was like, this is bizarre.
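The pong and theremin demos Chris describes reduce to one trick: the distance of whatever is in front of the camera becomes a control value. Here is a rough sketch of that mapping, again with pyrealsense2; the working range and the generic “control” output are hypothetical, since the episode doesn’t spell out the actual demo code.

```python
import pyrealsense2 as rs

NEAR_M, FAR_M = 0.3, 1.5  # hypothetical working range, in meters

pipeline = rs.pipeline()
pipeline.start()  # default depth stream

try:
    while True:
        depth = pipeline.wait_for_frames().get_depth_frame()
        if not depth:
            continue
        # Distance of whatever is in front of the camera, at the image center.
        d = depth.get_distance(depth.get_width() // 2, depth.get_height() // 2)
        if d <= 0:
            continue  # 0 means no valid depth at that pixel
        # Closer to the camera -> value near 1.0 (paddle up / higher pitch).
        t = (d - NEAR_M) / (FAR_M - NEAR_M)
        control = 1.0 - min(max(t, 0.0), 1.0)
        print(f"distance {d:.2f} m -> control {control:.2f}")
except KeyboardInterrupt:
    pipeline.stop()
```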
Josh Eastburn
Wow. And by the same token, if you look at the newsroom of RealSense today, even just since the announcement about the new ownership and the departure from Intel, there has been a whole range of new partnerships in different industries, right? So it seems like there’s an active move by RealSense to see where else this technology can be applied. Maybe you can put those dots together for us: what’s the big-picture story about the direction RealSense is going with this?
Chris Matthieu
Yeah, so we’re looking at industry leaders, obviously, to partner with to make our cameras a one-plus-one-equals-four story. NVIDIA has just got an amazing foothold in robotics with all their GPU strategies and Thor, the new Jetson Thor, running on robots. It just made logical sense to partner with NVIDIA, where some of those conversations when we were under Intel weren’t really feasible, but now we see NVIDIA and RealSense together everywhere. And when we go to conferences, we’ll bring an NVIDIA Jetson with eight RealSense cameras connected to it. It’s like peanut butter and chocolate. So we’re having fun with NVIDIA. We also released a new access control camera, a facial authentication camera, a little while ago, and we’ve partnered with companies like Dormakaba that are leaders in access control. Finding companies like that with that presence, now they can use the RealSense facial recognition, we call it RealSense ID, to just walk up to a door without your badge and it lets you in. You know, theme parks where people are walking around in flip-flops and swimsuits, or cruise ships, where we’re seeing a lot of deployments of these types of cameras with Dormakaba. So lots of interesting new partnerships are just lining up with us.
Josh Eastburn
So the partnership with NVIDIA, from the tech stack perspective, you sort of feel like that just makes sense. We’re going to be working together, so let’s have a closer alliance.
Chris Matthieu
Yes. In fact, there’s a new technology that we’ve worked with NVIDIA on called Holoscan, and we’re the first camera to leverage it commercially. What it does is take all the camera sensor data right over the Ethernet jack and put it directly into an NVIDIA GPU. Now you’re talking instantaneous AI on that sensor data to make robots operate more smoothly and autonomously. It’s not the janky robot trying to grab something where it moves every second. If you’re getting all of this data into the GPU directly, now you’ve got really smooth, quick responses for robotics and safety. If you’ve got robots working with humans, you need to make sure that you’re responding quickly so that no one gets hurt.
Josh Eastburn
Okay, so tightening up that integration then?
Chris Matthieu
Correct.
Josh Eastburn
Okay.
Chris Matthieu
And in fact, this D555 camera runs Holoscan on the camera. So you literally just plug it into a Power over Ethernet network, the same one the NVIDIA compute is on, and it just finds it and streams right into the GPUs.
Josh Eastburn
And then is the partnership with AVerMedia similar thinking like we’re trying to build out, trying to make this more integrated, more deployable?
Chris Matthieu
Absolutely. I did a whole series of demos and unboxings with AVerMedia recently. They’ve released a new AGX Orin platform that has a GMSL deserializer board attached to it, and it can support eight of our GMSL cameras out of the box. I call it a robot in a box, but literally it’s a developer kit that comes with the Jetson, comes with the deserializer board, comes with one of our GMSL cameras, and everything’s pre-installed: the SDK, some AI models, CUDA. You open it up, you run a setup that makes everything current and fresh, and you just start working on robots. It’s pretty amazing.
Josh Eastburn
And then Dormakaba, you said, it sounds like that partnership is really about accessing this new market, bringing that technology into the secure access market.
Chris Matthieu
Yeah, they’re leaders in gates and access control points, so by being able to put our RealSense ID on their gate systems, they’re able to transition from badges to biometric, facial authentication, like, immediately. And there was another company, Ones Technology, that we partnered with, and together we won Best of Show last year at ISC West for a biometric system and mobile app that use RealSense. Yeah.
Josh Eastburn
Do you feel like this is kind of indicative of the direction RealSense is going? Is it, we have a foothold in robotics and we just want to grow the market share that we have, or, we want to expand our presence into other areas?
Chris Matthieu
Yeah. If you look at the technology, and I know our viewers, our listeners, might not be able to see this, it’s still stereo technology. Our RealSense ID still has multiple lenses; it still has a projector on it. So it’s leveraging a lot of the depth technology that we’ve been doing for a long time and applying it to facial authentication. And I do think there’s a crossover. I do think that soon robots will need to know who is working with them, not just that a person is there. Maybe two use cases: maybe the robot can only take orders from certain operators, so maybe Josh, but not Chris. Or maybe if the robot needs to bring a tool or a beer or something of that nature to a person, it needs to know who to bring it to, rather than that person over there. So I think there’s a crossover. I think there are facial authentication capabilities and bonding, robotic bonding with humans.
Josh Eastburn
That totally makes sense. I don’t know why I haven’t thought of that before, but it seems totally intuitive now that you point it out. Well, that’s really exciting. And actually, that’s a great segue to the next point I want to talk about. You mentioned GMSL, you’re talking about robotics, and naturally we talk about industrial applications a lot on this show, machine vision specifically. Tell me a little bit more about how RealSense is thinking about addressing the needs that are specific to the industrial space.
Chris Matthieu
Yeah, so that’s a great question. All of our earlier cameras were USB, which is great for developers: you plug it in, you get going, and you’re working out of the box. But what I’ve seen with USB is you’ve got distance issues, the cables can only be so long, and I’ve also seen issues where USB hubs and things like that sometimes don’t stay connected very long. I was talking with one of our humanoid customers, and they’re using some older-generation USB cameras, and they built a whole subsystem for rebooting the USB hubs. If they lose contact with the camera, they reboot the hub programmatically and the camera comes back. That’s terrible. So GMSL is becoming, I think, that new robotics standard. I think it’s a standard from automotive way back, but it’s power over coax, basically. So you’ve got like 50-meter range on a coax connector to the camera. It’s very resilient, better throughput, and you don’t have any hubs in the middle that you’re having to tinker with. And similarly, with Power over Ethernet, this new line of cameras, we’re seeing a demand for that as well. The newer D555 cameras have more TOPS, more compute than our previous line, where we can do things like Holoscan running on the camera, as I mentioned, but we can also run ROS 2 on the cameras as well. So you plug these things in and they automatically start publishing all of the ROS topics you would normally need to deploy an SDK on a host to pick up.
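On the host side, consuming those camera-published streams looks like any ordinary ROS 2 subscription. Below is a minimal rclpy sketch; the topic names are typical realsense-ros defaults and will vary with the camera’s configured name and namespace, whether the publisher is the on-camera ROS 2 node Chris describes or a host-side wrapper.

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image

# Typical realsense-ros topic names; adjust to your camera's namespace.
COLOR_TOPIC = "/camera/color/image_raw"
DEPTH_TOPIC = "/camera/depth/image_rect_raw"

class RealSenseListener(Node):
    def __init__(self):
        super().__init__("realsense_listener")
        self.create_subscription(Image, COLOR_TOPIC, self.on_color, 10)
        self.create_subscription(Image, DEPTH_TOPIC, self.on_depth, 10)

    def on_color(self, msg: Image) -> None:
        self.get_logger().info(f"color frame {msg.width}x{msg.height}")

    def on_depth(self, msg: Image) -> None:
        self.get_logger().info(f"depth frame {msg.width}x{msg.height}")

def main() -> None:
    rclpy.init()
    rclpy.spin(RealSenseListener())

if __name__ == "__main__":
    main()
```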
Josh Eastburn
Okay. So a couple examples of integrating standards into the cameras that are hopefully going to make them more compatible in an industrial environment, right?
Chris Matthieu
Yep. Faster, more resilient, further reach on the connectivity.
Josh Eastburn
Is there hope for standards like GenICam in the future, do you think? Or is that too far out to speculate? It may be too far out.
Chris Matthieu
I do see a lot of companies with protocols and other types of cameras approaching us, asking us to implement additional protocols on the cameras. But until the adoption is there, it’s hard to justify the engineering and development, because there are hardware and software components to make these work. The biggest demand I’ve seen beyond USB, obviously, is GMSL and Power over Ethernet at this point.
Josh Eastburn
You know, the life expectancy, right? Or the commitment from a manufacturer, an industrial-type company, an OEM that buys a piece of equipment: they’re really looking for a long lifetime. But maybe more importantly, the cost of upgrading and things like that is different. There’s a bigger impact from downtime and everything related to it than what you might see in the consumer space, right? Have there been any conversations about lifecycle management of the products?
Chris Matthieu
Yeah, and we get asked that frequently, because when we were under Intel, we discontinued our LIDAR camera. At the time, Intel had like three divisions working on different LIDAR products, so I think the CEO at the time made a decision to have Mobileye be the LIDAR company of Intel, and so we discontinued ours. People still ask us to bring it back, and we may at some point; it’s a totally different tech than vision. All of our current products have no end of life on the radar, so they’re all good investments. They’re all great cameras; even cameras that we built five years ago are still product leaders in the market, and we are going to continue to support and maintain them. They all run on the current version of our SDK, so they’re solid bets in the market.
Josh Eastburn
Speaking of solid bets, we’re early in 2026 still, right? Are there upcoming developments this year that people should be on the lookout for from RealSense?
Chris Matthieu
Oh, man, there’s probably a half dozen new cameras in the pipeline, and I don’t think I can talk about them freely just yet, but you can probably imagine what they are just from some of the things we’ve talked about on this show. But yeah, there’s a lot coming. I feel like we’re a rambunctious teenager in the life cycle of our company. Intel was a great parent, a great foster parent, to incubate us. We’ve got 80 patents, great market share, great technology, and I think we just finally reached the point where we needed to leave the house, you know, go spread our wings, go cause our own trouble and fun on our own. And it’s paying off, with all of these new partnerships and robotics companies. It’s super exciting to see robots, too, transition from wheels to legs. That opens up a whole bunch of new tech opportunities for us as well, around visual SLAM, around visual odometry, lots of cool new things you can imagine complementing our camera technologies.
Josh Eastburn
Yeah, I’d like to dig into that a little bit, actually, because as I was looking around the press room on the RealSense website and at some of your social content, the future of Physical AI is a topic that’s come up a lot. Your CEO has talked a lot about that, and RealSense is trying to position itself to be a serious player in that market going forward. On the other hand, I’ll add this: I go to shows like Automate and we see that there is increasing adoption of AMRs and serious investment happening in humanoid robots, and there are other technologies, in addition to vision, that all need to come together to make that work. So what is the future that RealSense is trying to build, or trying to be part of?
Chris Matthieu
Like you, as an engineer, I wholeheartedly believe in Physical AI. I believe that programming ROS robots is a thing of the past. I do believe it’s all going to be about missions, VLMs, VLAs, AI controlling the body, the embodiment, of these robots. There’s a company called OpenMind; they recently raised 20 million and they’re based in San Francisco. I’m going to be giving a talk at the SPIE conference next week at the Moscone Center in San Francisco, and the CTO of OpenMind will be with me on stage. We have 30 minutes to make a Unitree robot dog, a quadruped, follow me on stage. All we’ve got is a RealSense camera on it and OpenMind’s brain pack, as they call it. So we’re going to live code, vibe code, this Physical AI experiment to see if, in 30 minutes live on stage, we can get the robot dog to follow me around. And I have no doubt we will be able to do that successfully, but that just gives you some awareness that these robots need to be spatially aware, not only of people and things to follow, but of where they’re walking, obstacle avoidance. They just need to be given a mission, and they need to figure out how to solve it. Remember in The Matrix when Neo says, I know kung fu? That’s what I’m imagining all of this moving towards, where the robot goes: I see a screwdriver. What is that? It’s a screwdriver. How do I use it? Okay, I got it. That’s the new world we’re moving to.
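The episode doesn’t show the follow-me code, but the control idea behind it, keeping a detected person centered in view while holding a set distance using the depth channel, can be sketched in a few lines. Everything here is hypothetical: detect_person stands in for whatever detector or VLM supplies a bounding box, and send_cmd for the robot’s velocity interface.

```python
# Hypothetical follow-me control step: a detector supplies a person bounding
# box in the color image, the aligned RealSense depth frame supplies range,
# and a simple proportional controller turns that into motion commands.

TARGET_DIST_M = 1.0   # desired following distance (illustrative)
TURN_GAIN = 1.5       # proportional gains (illustrative)
FWD_GAIN = 0.8

def follow_step(color_image, depth_frame, detect_person, send_cmd):
    box = detect_person(color_image)        # (x_min, y_min, x_max, y_max) or None
    if box is None:
        send_cmd(linear=0.0, angular=0.0)   # nobody in view: stop
        return
    cx = (box[0] + box[2]) // 2             # center of the detected person
    cy = (box[1] + box[3]) // 2
    dist = depth_frame.get_distance(cx, cy) # meters, from the depth frame
    if dist <= 0:
        return                              # invalid depth reading, skip frame
    # Steer to keep the person centered; drive to close the distance gap.
    half_width = color_image.shape[1] / 2
    err_x = (cx - half_width) / half_width  # -1 (far left) .. +1 (far right)
    send_cmd(linear=FWD_GAIN * (dist - TARGET_DIST_M),
             angular=-TURN_GAIN * err_x)
```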
Josh Eastburn
Wow. So this kind of just-in-time requesting, or, not quite self-programming, we’re not there yet, but footsteps in that direction: acquiring information, instructions, and programming as needed to fulfill the task at hand.
Chris Matthieu
Exactly.
Josh Eastburn
Wow. Okay. And so RealSense sees itself as part of that as, I don’t even know if it’s right to say the vision provider, because there are a lot of different types of interactions that you’re facilitating through “vision,” right?
Chris Matthieu
Yep. And our CEO, you mentioned he gives a lot of talks and thinks about this a lot; he talks a lot about the vision cortex. Our cameras have a lot of TOPS, a lot of compute. If and when they’re able to start reasoning to the point where they’re like humans, where you don’t have to think about how to grab, how to pick up your cup of coffee, some of these things are just natural, subconscious motions. We think about that: maybe there’s a way to connect the RealSense vision directly to the motor skills of the robot and save important compute for things that the robot has to figure out. We think a lot about that level of vision perception and spatial awareness adding value without consuming robot compute.
Josh Eastburn
Well, okay, this is a pretty exciting picture that you’re painting of the future here. And I want to give our listeners some ideas of kind of how they can follow along or get more involved. Maybe they’re hearing about RealSense for the first time. Where should they be going for education? Yeah, to learn more from you and others in the company.
Chris Matthieu
Great question. So realsenseai.com is our website. We’re on LinkedIn mostly, but also X; my handle, Chris Matthieu, is the same on all the platforms. I’m constantly doing podcasts as well. I love to interview customers and partners that use RealSense, and I love hacking on these ideas, robotics ideas and RealSense experiments; I’ve got a lot of videos on YouTube and on social media. And we are putting together a certification course as well. It’ll be an easy, step-one-two-three way to guide developers through learning all the ins and outs of developing with RealSense stereo vision cameras, and maybe even a test at the end where you get scored for your certificate.
Josh Eastburn
And you mentioned this is open source, right? GitHub.
Chris Matthieu
Oh, yeah, GitHub. Yep, it’s RealSense AI on GitHub as well, and our RealSense SDK is open. Everything we have is open source on GitHub. There’s also a RealSense ID facial authentication set of repos out there as well.
Josh Eastburn
Okay. Awesome. You mentioned SPIE as well. Any other events coming up that people might look out for?
Chris Matthieu
We’re doing a lot in March. We have South by Southwest; I’ll be hosting a panel around robotics, and we have a booth there. We’ll be at ISC West, the security conference, for facial authentication, and I think Hannover Messe, the giant robotics conference in Hannover, Germany, is coming up in April. We’ll be there. I don’t know, I think we’re still putting together the list.
Josh Eastburn
Yeah, that’s great. Well, thank you so much. Yeah, and definitely, if you think of anything else, feel free to send me links and we’ll put them in the show notes and all that good stuff. Our time has flown by. This has been so fun. Thank you.
Chris Matthieu
Yeah, likewise. Thank you, Josh. I’ve enjoyed it.
Josh Eastburn
Anything else you wanted to put in a plug for that I’ve neglected to mention?
Chris Matthieu
RealSense is the future of robotics. Get on the wave.
Josh Eastburn
Thank you. Thanks to our European Editor at Large, Mark Williamson, for connecting me with Chris last year, and to Penny Malsch at RealSense for making this interview happen. You can read all about the spin-out of RealSense in their original press release, linked in the show notes below. Last month our editor-in-chief Tom Tiner had boots on the ground at SPIE Photonics West; tune in to MVPro Media on LinkedIn and at MVProMedia.com for all the coverage. For MVPro Media, I’m Josh Eastburn. Be well.