Ouster, Inc. (OUST)
NASDAQ: OUST

28th Annual Needham Growth Conference (Virtual)

Jan 15, 2026

Casey Huebner
National Sales Manager, Needham & Company

Thank you all for attending the 28th Annual Needham Growth Conference. My name is Casey Huebner. I work on the New York sales desk here at Needham. I'm happy to introduce Ken Gianella, CFO; Chen Geng, SVP, Strategic Finance and Treasurer; and Jim Fanucchi of Ouster. Following the company presentation, we will have investor Q&A. Questions can be sent to me through the conference portal, and I'll have those relayed post-presentation. With that, I'll let the Ouster team take it away. Thank you.

Kenneth C Gianella
CFO, Ouster, Inc.

Thank you so much, Casey. Totally appreciate you guys hosting us here today. Let's just jump right into it, obviously past the safe harbor statement. It's still early within our year-end close, but I want to start first by describing who we are and what Ouster does. A lot of people come to us and say, "Well, you're just a sensor company. What are you just selling?" I think the biggest thing is that people's perception has changed over the last 12 to 24 months: we're not just a hardware company. We are a perception solution platform. What that means is that not only do we handle the perception and sensing side, but we have a perception software layer on top that can think, actuate, and generate actions.

Then we have applications that can drive outcomes or help support the outcomes you're looking for. That sense, think, act is really the platform that Ouster brings to light. When you start thinking about physical AI and what physical AI means, to us, it's really anything that moves. Do you want to monitor something that moves, or do you want to be something that moves within the environment autonomously? How we do that is we take our global leading digital lidar sensors. We are the number one provider of 3D digital lidar within the areas we operate. We offer the only software infrastructure with a very clear perception platform that can help people get to market a lot quicker.

So our over 1,000 customers work with us not only to get the perception and sensing side of it, but to see how we can help them get to market quicker using our more than 10 years of expertise. And then last but not least, it's pulling together this global solution: with our Gemini AI program, we've logged and cataloged over four million endpoints as part of our AI training algorithm, which creates a virtuous cycle for our customers. So it's not just sensing and performing; it's also learning, and taking those learnings and cycling them through with our AI algorithm. Next slide, please. When you think about the market opportunity we go after today, it's a $70 billion market opportunity across four key verticals. In 2024, it was about equally split, 25% across each vertical.

But let me just walk through what we're seeing in each of those today. Everyone understands automotive and what's going on there. Our primary play there is really around robo-trucking and robotaxis, not so much in ADAS systems; we don't have that product line yet. Moving next to smart infrastructure, this is one where we've really been pleased with the performance lately, and that's primarily driven by our intelligent transport systems. What that means is over 300,000 intersections in America use stoplights or controlled signals of some form. We're targeting that market squarely. Our Blue City platform has the ability to control an intersection intelligently, but it's also a force multiplier.

It allows cities not only to control the lights, but in real time to monitor and manage them and, more importantly, get the data from them, so city planners can understand what's going on as part of their long-term planning. This is a brownfield opportunity that we're going into, and it's a very open market; we're seeing a lot of synergies across cities and states that are helping us grow that sector. Next within here, what we really like is perimeter crowd security. We've already secured deployments for the Los Angeles Olympics, as well as several stadiums around America, to help do crowd control and crowd analytics for the upcoming World Cup games later this year. It's also great for logistics and detection within perimeter security.

So on the logistics side of this, think within warehouses, and this starts moving into the industrial complex side of it. Think about that end-to-end supply chain: a truck moves onto a lot, then from that truck it gets offloaded, and then it moves into a warehouse and gets shelved. In each one of those end use cases, our lidar and our perception systems are used, whether that's perimeter security around the warehouse, detecting the truck entering the lot and logging where it parked, or the autonomous or safety-modified forklifts. Many of the forklifts today, while not fully autonomous, use our lidar for safety applications, because you're mixing heavy equipment with humans in the same space.

Then, all the way into the warehouse, we map the warehouse with our lidar and our applications, and then we can monitor and help move pallets, pickers, and so forth autonomously within that warehouse. So smart infrastructure and industrial is a really big space driving our business today. Last but not least is robotics. This is one we've been super pleased with, and we've partnered with folks here for a very long time. One of the best use case examples we have is our partner, Serve Robotics. Serve has been with us for several years now, and we started by helping them get off the ground with proof points that really helped them get their business model to where it needed to be.

From that, we've grown with them from tens of units to hundreds of units, and we're really pleased with their recent announcement that increased their orders with us by the thousands. This is a great example of how partnering with 1,000 customers across multiple use cases, not just one specific use case, gives us visibility into our growth. Our partners' success is really our success. As they grow and as their end use cases break open, that gives us the ability to supply and work with them. The other use case that many folks may know about is in defense and drones. At Ouster, we've actually been in this space for a long time, predominantly on the mapping side, working with our various integrators and drone partners.

We recently announced that we received Blue UAS certification. What that means is the Department of Defense has certified our supply chain to be free of any end users or end states that might compromise the supply chain and production for DoD, military, or other government operations using our lidar sensor. We're the only 3D lidar sensor in the Western space, or actually in the world, that has that certification for these applications. We've been doing drones for a while; it's a big part of our business, not just a new use case for us, but it's something we're also excited about, along with the larger robotics sector. Next slide, please. When we think about what our technology is built on and why we believe we're going to win: lidar has been around for decades now.

What's really changed the face of it is moving from analog-based lidar into the digital lidar space. The second thing is compute power. When you combine compute with our digital lidar, you're truly riding the wave of Moore's Law with every node that we create. We're on the L3 chip today. Each node not only doubles our TAM in the markets that we can serve, but greatly improves efficiency, not just in power, but also in other attributes such as visibility and the range of conditions it can operate under. Our next chip node, which we're super excited about coming out soon, is the L4 chip. That's going to take it even to the next level and open up even more specs for us.

The key about the L3 chip today: that full $70 billion is all on the truck already. We can service that whole market with the existing applications. By adding more products, services, and software to our platform, we not only expand our TAM, but get deeper into our customers' wallets. It helps us get more market share and more of their wallet in this perception layer of physical AI. Next slide, please. So when you look at the actual hardware side of it, that's where our OS products come in. This line is digital 3D lidar for short-, mid-, and long-range sensing. The cool thing about this product set is that while you're really looking at different form factors, the underlying architecture and manufacturing is all one platform.

At the platform layer, for the perception applications, maybe you tweak those a little bit on the software side, but this one set of hardware can serve every single one of those use cases I just talked about. Whether that's working with a large ag producer on a $2 million combine to help guide it through the fields, or with Serve Robotics on a couple-hundred-thousand-dollar unit, it's the exact same perception and sensing system that supports and drives the perception for those physical AI solutions. The DF series, which is in development, is built on another chip set we're developing, called Chronos. That one is also in the works to come out shortly.

But what this will do is enter us directly into the automotive and ADAS space, specifically designed as a truly solid-state digital lidar system, predominantly, again, for the automotive space. Next slide, please. So when you think about the long-term financial framework: we talked about the hardware, and when you add on those software elements and the perception platform that we bring to the table, we believe this can drive 30%-50% growth across those market segments. That growth is predicated on things we have on the truck today. It's going to continue to be fueled by innovation, both organic and inorganic, through software and through continuing to broaden our overall perception and sensing portfolio. Gross margins, we're targeting 35%-40% on a GAAP basis.

This is driven by a mix of all those elements I just talked about: hardware, software, service, and the applications that go with it. We believe, within this marketplace and with the mix that we see, that's a great place to be to continue to grow our business. And then last but not least, something the company was founded on is fiscal responsibility: driving not just cool technology, but cool technology that can be sold into an active marketplace. So we're very, very pleased with the roadmap we've developed, but maintaining our operating expenses to build leverage from that operating base and grow profitably is also something we're focused on.

So when you look at this long-term framework combined, and you look at some of the notes that are out there, this would target us toward profitability, on both a cash flow and an EBITDA basis, somewhere in the late 2027 to early 2028 timeframe. Next. So to end, and before we turn to questions, I just want to re-highlight the investment strategy. We're an AI platform solution player, driving solutions not just at the middleware or perception platform layer; it's the applications and end use cases that our customers can use to speed their time to market and make them successful.

Think of us as being that platform. If you think back to the gold rush, we're selling the picks and shovels of perception solutions to folks driving into this physical AI gold mine that's out there. Next is our digital lidar technology.

Every new node that we bring out lowers our total cost basis, drives higher efficacy, drives higher application use, and opens up new TAM for us. And then last but not least is our diversified and proven business model. We didn't come out 10 years ago focused on just one narrow segment, one narrow use case. We came out with a platform that can operate across all of these use cases ubiquitously, and we're using that platform to penetrate them. We've shipped over 100,000 sensors, and we're really, really pleased. One thing I didn't mention: we merged with Velodyne a few years ago, and that brought the fundamental patent and IP portfolio with it, which really solidifies this overall investment highlight. So we own the patents. We own the IP.

We are one of the founders of this space, and the roadmap we're driving is going to continue to keep us a leader in perception and sensing for years to come. With that, Casey, we'd love to turn back to you to see if we have any questions from the audience.

Casey Huebner
National Sales Manager, Needham & Company

Of course. Thank you so much, Ken. Really appreciate it. First one coming in: at CES, autonomy was pretty front and center, with many transportation and industrial companies talking about their autonomous products. Just want to get a sense, how does your technology and product portfolio fit into these applications?

Kenneth C Gianella
CFO, Ouster, Inc.

Thank you, Casey, and thank you for the question. Number one, you can't have physical AI without some sort of perception and sensing platform. You have to have a sensing and perception platform to work autonomously with whatever end application you're trying to do.

CES was a very successful one for us this year, and we were really pleased, because you can see how physical AI has grown over the years just by looking at CES: from where it started, when no one knew what physical AI was, to a real breakout year this year, where it was front and center across all those segments I just talked about, robotics and industrials, as well as smart infrastructure. What we did this year that was really unique is we took our investors and our analysts on a tour. We didn't have our own booth, but we took them on a tour of all of our partners.

So we literally walked the floor and said, "Look, this is how we're being used on ag machinery. Look, here's how we're being used in security and aviation.

Look, here's how we're being used in logistics," and it was a great proof point for everybody to talk to our customers right there, face to face, and hear what great partners we were. In all my years in finance, CFO, and operations roles, I can't remember walking into partners' booths, or even competitors' booths, and having them say, "Ouster, wow, you guys are really the leaders. You're the ones who really helped us get to where we're at." That's a great feeling, knowing that we're really changing things and helping our partners be successful. At CES, you can just see the focus. You can see physical AI becoming more mainstream across all those segments. Robotics was a big one, but we also can't forget the industrial side, which showed really well there.

Casey Huebner
National Sales Manager, Needham & Company

Of course. No, really, really appreciate it. So would you be able to touch on the various technologies your customers use in their autonomy stacks and how they all fit into your approach?

Chen Geng
SVP, Strategic Finance and Treasurer, Ouster, Inc.

Yeah. So first and foremost, we look at ourselves not as a lidar company, but as a perception platform. When you think about our customers, we really believe they want multiple sensing modalities: lidar, cameras, anything they can put into a perception layer that can sense, think, act. At the end of the day, that's what they're trying to do with our product set. And so what we bring to the table is that full platform: sensing, thinking with our perception layer, and then being able to act in real time.

Now, what we also bring, with our Gemini product set for example, is the ability to take all those learnings from operating and pull them back into the Gemini model, continuing to train their model for the unique use cases they have and to support their R&D development. Then you can push that AI model back into production to operate even more effectively and efficiently based on what you learned in the last cycle. So we really bring that whole end-to-end use case to the table, and our end goal is really to reduce the friction for our customers to get to autonomy. Again, I go back to: I want them to win. I want them to get to market quickly.

And if I can help them with my software and my kits and my sensing platform to get to market quicker, then that's a virtuous cycle we're going to continue to win at.

Casey Huebner
National Sales Manager, Needham & Company

Of course. No, absolutely. And just coming off the back of that question, could you spend a moment breaking down your Blue City and Gemini offerings, in particular, how they're different, what markets do they serve, and what is the go-to-market approach with each?

Chen Geng
SVP, Strategic Finance and Treasurer, Ouster, Inc.

Yeah. So Gemini, think of that as our AI engine, right? We use that as part of the perception platform to detect, classify, and track, and it can be used across multiple applications, mostly tracking stationary items. We've trained it on over 4 million endpoints, and we're continuously training and growing that AI model within Gemini. Think of that as the foundational AI piece for us.

When you look at Blue City, that's one of the true end-to-end applications that we have out there. Blue City is a transportation or smart city solution, if you will. We go to market predominantly through distributors with that application, which allows them to sell to cities, states, and other government entities to monitor and manage traffic flow within intersections, freeways, and thoroughfares more effectively and efficiently. Think of when you pull up to a stoplight, or, if you're in the Northeast, how the plows tear up the induction loops that make the light change. Our system works through rain, snow, sleet, and hail, and in real time can actively monitor the flow of traffic and change the lights.

If grandma's having a tough time crossing the street and it's taking a little bit longer, we can hold the lights until the pedestrians clear out.

We have great use cases and great partners with this, whether it's the city of Nashville, the state of Utah, plenty of cases where this comes together. And the reason why this is a great segment is there's really no competition between cities and states. We're seeing growth just by word of mouth, with people loving what our product does and telling folks at conferences, "Hey, you've got to go pilot this." And why we're seeing such great traction is not just because it's an end-to-end solution, it's easy to use, and, oh, by the way, it's cost-effective and already baked into these cities' and states' budgets; it's because it's a force multiplier for them. We're displacing brownfield technology, but then they're able to take this AI dataset.

They're able to derive other learnings from it for city planning, for making the streets more productive, and for understanding how and where they can manage traffic flow more effectively. It's an added application and analytics layer that we can provide to them for the same cost as they would pay for other technology.

Casey Huebner
National Sales Manager, Needham & Company

No, thank you so much. Having a strong balance sheet and financial position is really helping cement your position as a leader in the perception and sensing market. What benefits or opportunities do you see from this position?

Kenneth C Gianella
CFO, Ouster, Inc.

Well, I mean, think of the example I just gave about Serve, right? It was a three-year journey with them from inception through growth and prototypes to hitting production and really knocking it out of the park.

We have 999-plus other customers just like that who need to partner with someone, not just for a year. They're not just buying a sensor from us; they're buying someone who will help them be successful through their journey. And on that journey, especially in physical AI and especially with large industrials, they're looking for someone who can partner with them for two, three, five years. If I do nothing different with our financial structure, we have over six years' worth of runway on our balance sheet. We're the best-capitalized Western lidar company out there. We have really great partnerships and a very strong balance sheet that allows us to get into doors and stay there, because customers know we're going to be around to support and partner with them throughout their physical AI journey.

Casey Huebner
National Sales Manager, Needham & Company

That's great. Let's see if we have any others come in. We can give it another 30 seconds. But in the meantime, thank you so much for your time. Really appreciate the presentation. From what it sounds like, the conference has been very successful. So best of luck in 2026 and beyond.

Kenneth C Gianella
CFO, Ouster, Inc.

Thank you very much for hosting us today, Casey.

Casey Huebner
National Sales Manager, Needham & Company

Absolutely. Thank you.
