NXP Semiconductors N.V. (NXPI)

CES 2026

Jan 5, 2026

Lars Reger
EVP & CTO, NXP Semiconductors

Hello everyone here from Las Vegas, and you might ask yourself why I'm starting the booth walk-around via this little hole here, kneeling on the ground. Well, the answer is a very simple one: because I was watching the camera through this hole here in the in-wheel motor of this Verge motorcycle. This is a complete in-wheel motor, no separate powertrain anymore in this bike: solid-state battery and NXP-enabled electronics. The i.MX 95 is the central brain of this 200-horsepower, ultra-fast sports motorcycle. So just for the geeks: 0 to 60 miles per hour in 2.5 seconds, which is very close to a rocket launch.

Fantastic motorcycle, and if you are an average motorcycle rider like me, I'm going to show you what NXP electronics is doing for riders like me. We have here a smart, i.MX RT-enabled helmet.

We have the automated brake light in here: if the bike is decelerating, even if I'm not braking, if my head is decelerating, the brake light switches on. I have noise cancellation in here, and if I fall off the bike or have an accident, this helmet performs the automated e-call, automatically calling the ambulance even if I cannot do that anymore. So you have great safety, you have a great riding experience, and we have partners like Enphase who make sure the electrons get back into the machine in 10 minutes via the solid-state battery.
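The automated brake-light logic described above can be sketched as a simple threshold on the helmet's deceleration signal: if the rider's head decelerates hard enough, the light switches on whether or not the brakes are applied. The threshold and smoothing constant below are illustrative assumptions, not NXP specifications.

```python
DECEL_THRESHOLD_MS2 = 2.0   # assumed trigger level, m/s^2
SMOOTHING = 0.3             # low-pass factor to reject helmet vibration

def brake_light_states(accel_samples):
    """Map longitudinal acceleration samples (m/s^2, negative = braking)
    to brake-light on/off states, with simple exponential smoothing."""
    states, filtered = [], 0.0
    for a in accel_samples:
        filtered = SMOOTHING * a + (1 - SMOOTHING) * filtered
        states.append(filtered < -DECEL_THRESHOLD_MS2)
    return states
```

The low-pass filter is the interesting design choice: without it, every bump in the road would flash the brake light.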

You charge that device to get a driving range of roughly 600 kilometers. So you can imagine how much NXP's AI-intelligent systems at the edge are starting to change the traffic experience. And then we continue with other, smaller driving experiences with intelligence.

Just follow me here. We have our AI-enabled systems, a little rover here, and this rover is observing the ground, notices that there is an oil leak on the manufacturing floor, and can start a cleaning action. With our AI systems in the device, like small and large language models, it classifies what it has in front of itself and then takes the right action autonomously. Furthermore, devices with the same functional-safety criticality are getting smaller and getting closer to my body. You see here our insulin pumps.

Of course, these are highly functional-safety-relevant. You see hearing aids, cochlear implants. You see smart glasses, all operated by our ultra-energy-efficient small i.MX RT microcontrollers.

So i.MX, every time the same family: small, tiny, energy-efficient microcontrollers, and the big microprocessors carrying a lot of AI and smartness, energy-efficient for the edge. That is basically what the entire game is about. And of course, that continues here. You have facial biometry. You have systems like scanners for your retail stores, here in a very, very efficient, smart, nicely designed way. We have startup sections here, from the firmly installed form factors to things that you can see here moving again. You have little driving rovers, reference designs from NXP.

You have drone kits from NXP. These here, for example, are drone kits, drone form factors. Again, software-defined radio connectivity and the i.MX microprocessors that are running these devices. And we also have Ultra-Wideband, advanced sensors with millimeter accuracy, not centimeter accuracy anymore.

And with all of that together, our partners, companies like Auterion or Holybro, are using these types of devices for their flying form factors. You might have seen some videos with me on this device here in the past, and here are the Auterion form factors that are really in serious production. So: reference designs from NXP, startup acceleration, and then these guys get going and industrialize what we have on the shelf. We are not a set maker, but we help these set makers.

And of course, my favorite use case: Lars needs a pizza, or Lars needs a beer, and regardless of where Lars is, he gets it delivered automatically, autonomously. And we are working at the moment, with all our radar and sensor electronics, on full autonomy of these drones.

So we have the first demos where the drones do a slalom between trees or other obstacles, and where these drones can hang themselves onto a skyscraper wall, positioned by Ultra-Wideband, the drone on the 10th floor hanging itself onto the wall. You open the window, you empty the drone, you fill the belly of the drone again, close the window, and the drone lifts off. You don't need any landing places in cities, and that makes life, and certification of course, much easier.

Now, all of that on display outside of the booth here is the deep tech stuff; how NXP enables the world that anticipates and automates is happening inside. We have three big themes lined up: brighter journeys, brighter places, and brighter lives. Or in other words, we're bringing intelligent systems to the edge in all three segments.

We are enabling the great robot awakening, AI at the edge. Join me, I'm going to show you what that means. As I said earlier, we have the booth separated into three segments: in green, the brighter lives; in blue, the brighter places and infrastructure; and in orange, so in the NXP colors, the brighter journeys. I'm going to take you first to the brighter lives section, because those are mega-cool use cases that we are especially driving with our latest acquisition.

You might know that we have acquired Kinara, an AI accelerator company, and together with our functionally safe and secure microprocessors and these AI accelerators, we are able to build super-efficient systems at the edge with high functional-safety requirements. Our partner GE HealthCare has two demos here to showcase with us that are really, really outstanding.

So the outstanding part that you're going to see here is basically an anesthesia measurement system. And for my next surgery, whenever it comes, I want to be treated by these types of devices. Why? Well, very simply: the anesthetist has his or her hands full working with all the tubes and needles around my body. Now, this system here contains a little large language model, and you can voice-operate certain parameters. "Give me that level of narcotics. We need more oxygen." All of these types of things.

So you have a robot assisting the anesthetist, making sure that this person is super relaxed and has everything under control, while the manual tasks are taken over by this device here. So this is what a robot looks like today.

These are the robots that I'm talking about, and these robots are going to change your life for the better. What you also have here is a use case that can be very dramatic: the same AI system in a child monitor. You have a newborn baby, and of course it's the worry of every nurse in neonatology that you have 10 of these little babies and you cannot observe how they are all doing. So what you're seeing here: the moment the baby is face down in its bed, the system detects it and warns the nurse, saying, "Hey, look at that baby. It was okay two minutes ago.

Now this little toddler is face down, and we need to do something," just as an early warning. In other words, you are making the life of the intensive care nurses much, much better, much easier, because they don't have these panic moments anymore. They can be there, treat all the kids in a calm way, and in case there is an alarm, hop in and act. That requires, of course, high accuracy, high efficiency, functional safety, and of course also security against hacking. Now, what is happening with the same electronics, if you're moving a little bit closer here, is that you can take our electronics for tasks like access control.

So you see here an NXP setup of how you operate a smart door lock: camera face detection, Ultra-Wideband.

So via the phone, like in the car access case, Lars comes with his phone to the front door. "Okay, is he authorized to open the door?" Door opens. And if it recognizes his face, I even have two-factor authentication and these types of things. Our microcontrollers, microprocessors, and also our Ultra-Wideband and camera electronics are doing all of this. And what that looks like, you see over here in this little form factor. This here is really the condensed version of this door lock. So this is how it will show up at your office door, at your workshop entrance, at your factory, or at home.
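The two-factor decision described above can be sketched as a conjunction: an authorized phone credential ranged by UWB and a face match must both pass before the lock opens. All thresholds and names here are illustrative assumptions, not the actual NXP door-lock logic.

```python
def door_unlocks(phone_authorized: bool, uwb_distance_m: float,
                 face_match_score: float,
                 max_range_m: float = 1.5, face_threshold: float = 0.9) -> bool:
    """Open only when the ranged phone credential AND the face match agree."""
    return (phone_authorized
            and uwb_distance_m <= max_range_m     # UWB: phone really is at the door
            and face_match_score >= face_threshold)  # camera: right person holding it
```

The UWB range check is what defeats relay attacks: a valid credential alone is not enough if the phone is not physically near the door.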

Furthermore, we're using similar connectivity in our electronics for complete home gateways. These home gateways, again, contain Bluetooth, Wi-Fi, and Ultra-Wideband. They contain AI, and they contain NFC.

With all of that, you are able to have a super-energy-efficient device that operates your house, and that device is usually in power-saving mode. Ultra-Wideband here works as a proximity detector. The moment Lars gets close to the device, it recognizes very energy-efficiently, "There is a human being coming my way," and only then wakes up, the same as for the door lock. These devices are mainly asleep, mainly in an ultra-low-power observation state. The moment they pick up the Ultra-Wideband signal, they wake up, and you wake up the different stages of AI or authentication control.
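The staged wake-up just described can be sketched as a small state machine: the device idles in its lowest-power state, a UWB proximity hit advances it one stage, and losing proximity drops it straight back to sleep. The stage names and the 2-meter threshold are assumptions for illustration only.

```python
# Power states from cheapest to most expensive; only the last ones run AI.
STAGES = ["deep_sleep", "uwb_listen", "ai_wake", "authenticate"]

def next_stage(current: str, uwb_distance_m, proximity_m: float = 2.0) -> str:
    """Advance one stage while a person is within UWB proximity,
    otherwise fall back to the lowest-power state."""
    if uwb_distance_m is None or uwb_distance_m > proximity_m:
        return "deep_sleep"          # nobody nearby: save energy
    i = STAGES.index(current)
    return STAGES[min(i + 1, len(STAGES) - 1)]
```

The point of the ladder is that the expensive stages (camera AI, authentication) only ever run after the cheap UWB stage has already confirmed someone is there.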

This is what we are doing here with the NXP silicon. We have here a couple of partners already, partners that are really doing productization again.

So we have 50 customer demos on the booth here, partner demos showing our reference designs, from the eval boards into these form factors, and then into how customers are realizing the products. And you can, of course, take that further. So you have here the different door lock form factors from the different customers. You even have our i.MX here in these reMarkable tablets. There you have basically your e-paper: same electronics, same AI capabilities, same low-power capabilities, and you can operate these devices for weeks on one battery charge.

And then, of course, one of my favorite use cases, I have to admit, I'm a lousy cook. I'm a passionate eater, but a lousy cook. So the Thermomix is one of these devices that makes my life higher quality.

And of course, there again, the same electronics, same connectivity, same AI, and same compute performance can be used in kitchen appliances, in access appliances, and, as you have seen earlier, in medical appliances. If you take that into a more heavy-duty environment, so industrialization, building control, and home control, you're getting close to one of our favorite partnerships, Honeywell and NXP. We have been working for years on driving the portfolio to the next scalable level. You know that companies like Honeywell are broad: valves, building control, avionics, a lot of segments.

And what we are trying to do is use NXP silicon to change the form factors of these sensors, of these control panels, of the compute devices, of the AI.

We make that scalable via, for example, the scalability of the i.MX platform, via the S32 microcontrollers and microprocessors, and also via the AI accelerators and sensing electronics. And from there, you can go from a tiny little valve that has to live for a decade on a lithium battery somewhere in the desert all the way to complete building management units. Here you're seeing a partner wall with very different partners and different form factors. You see iThings, a subsidiary of Carrier, for connectivity to heat pumps. You see the EV charging wallboxes here.

I could now go over tons of these energy and building management use cases into what we have stolen years ago from automotive battery management. And how do you operate your car? Well, take the wheels off that car. I have that thing at home.

I need an energy storage for my solar cells because I don't want to operate my house only when the sun is shining. I want to have that, of course, independent of the sun. So these types of battery storage devices are derived from automotive battery management: 1,500-volt types of systems, lower-voltage systems. But in principle, this is how the energy storage in your house is going to look. And of course, I showed it outside already with our partner Enphase: how I get these electrons that I have stored in the battery into all my appliances, including my rolling appliances.

What that does, in a nutshell, is very nicely visible here, because what you can now start doing is trading the energy in your rolling power bank.

Just imagine your car comes to a parking lot, and you hook it up to the grid. So I move it in here. Then, if you have the right security, if you have the right electronics in there, like in a banking application, you can authenticate the source of energy. You can start trading that energy with the utility supplier, and you can operate and stabilize the entire grid. For example, if the grid goes out of service. We had that incident in Berlin two days ago, where a big energy dispatching system was on fire and the grid got disconnected.

This system here, the rolling power banks, starts taking over the supply. The grid comes on again, but maybe you have some sabotage coming up, so you have an unauthorized energy supplier in the system. What we are doing is taking the chips from the electronic passports.

We are putting the same security in here. We even have systems that carry Post-Quantum Cryptography; our i.MX 93 is the first microprocessor with post-quantum crypto in it. These devices automatically negotiate with the grid provider and make sure that only authorized, non-corrupt energy sources are in the system and that the grid stays stable. Now, if you take all of that, the energy supply is clear. How do you operate your actuators? How do you really operate your robots on a shop floor? How do you operate an entire factory?
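The authorization negotiation described above, where the grid admits only credentialed energy sources, can be sketched as a challenge-response handshake. HMAC stands in here for the post-quantum signature scheme a real system would use; the registry, device IDs, and keys are all invented for illustration.

```python
import hashlib
import hmac

def sign_challenge(device_key: bytes, challenge: bytes) -> bytes:
    # Device side: prove possession of the provisioned credential.
    return hmac.new(device_key, challenge, hashlib.sha256).digest()

def grid_admits(known_keys: dict, device_id: str,
                challenge: bytes, response: bytes) -> bool:
    """Grid side: admit the energy source only if its response matches
    the credential registered for that device ID."""
    key = known_keys.get(device_id)
    if key is None:
        return False  # unknown, possibly sabotaged, supplier
    return hmac.compare_digest(sign_challenge(key, challenge), response)
```

A fresh random challenge per session is what prevents an attacker from replaying an old, legitimate response.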

Well, the first thing is that you have very different old and legacy field bus systems in these factories, for example EtherCAT or Ethernet. Until now, what you needed was lots of electronics here, some older electronics from NXP, with an i.MX 8 inside.

So, bulky boxes; there you have to make sure that, from the manufacturing robot via the field bus, all the control and management is done. You need switches for the connectivity. We have up-integrated all of that in the i.MX 93, including the post-quantum crypto security. In other words, if I switch the field bus protocol here from EtherCAT to Ethernet, that is software-defined architecture, a term you may know only from automotive electronics. This is the Software-Defined Factory. You're just reprogramming this switch here, and you're reprogramming the i.MX 93.

You can upgrade the security standards if needed, and all of that on this tiny little stamp-sized microprocessor here, and all of that across the different field bus systems. Very important: the bill of materials gets very expensive if you try to do this with discrete components.
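The "reprogram the switch" idea above can be sketched as a port whose field-bus personality is just a configuration value: changing one setting stands in for reflashing the i.MX 93. The two-byte headers are the real EtherType values (0x88A4 for EtherCAT, 0x8100 for the VLAN tag TSN traffic rides on), but the framing is otherwise heavily simplified and the class itself is invented for illustration.

```python
class SoftwareDefinedPort:
    """One port whose field-bus protocol is set in software, not soldered in."""

    HEADERS = {
        "ethercat": b"\x88\xa4",      # EtherCAT EtherType
        "ethernet_tsn": b"\x81\x00",  # 802.1Q VLAN tag, carries TSN priority
    }

    def __init__(self, protocol: str = "ethercat"):
        self.protocol = protocol

    def reprogram(self, protocol: str) -> None:
        # In the demo this is reprogramming the i.MX 93; here, one assignment.
        self.protocol = protocol

    def frame(self, payload: bytes) -> bytes:
        """Prepend the (simplified) header of the currently active protocol."""
        return self.HEADERS[self.protocol] + payload
```

The same hardware keeps running; only the dispatch configuration changes, which is the whole point of a software-defined factory network.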

Now, that brings us to one i.MX 93 system here and another i.MX 93 system here. And what you see here is an air gap and antennas. So what have we done? We have taken Ethernet TSN over Wi-Fi. We are sitting here on a post-quantum-crypto-enabled i.MX 93 system, transmitting the time-sensitive Ethernet signals via Wi-Fi to this device here on the shop floor. And this unit here is operating our robot that is at the moment filling the different vessels in NXP colors: a pharmaceutical robot with motor control, with everything in here.

That is how these devices work on the robot side of the factory floor. And you're saving a lot of real estate because you're going wireless with time-sensitive networking.

If you take this to my favorite segment, the rolling robots, you're going to get to these types of demos here. What we have here is a pretty well-known demo from the last years as well: what does a driver workplace, a car cockpit, look like, and how do our electronics evolve the way you drive your car in the future? Now, what you're going to see here is the following. You have an i.MX 95, the same as in the motorcycle and as in a lot of these other appliances here, and this i.MX 95 is operating the displays.

Now, you also have the camera detection here, and the AI accelerator on that i.MX 95 detects that Lars, the driver, is sitting here in the driver workplace, recognizes me, and authorizes me.

And as we can also see here, my friend Brian has his smartwatch connected to the dashboard via Bluetooth. It says, "Okay, a heartbeat of 100 beats per minute at the moment." So he seems to be nervous. But the system is also using our Ultra-Wideband chips, the same chips that we use for car access, for door opening; these chips work like a motion detector, like a radar system. And these systems measure the movement of my chest, or here at the moment, the crazy moves of my hands.

But in principle, what we measure is: "Okay, you are breathing at a certain rate. Lars, you are breathing 20 times per minute; you are nervous. You are breathing 6 times per minute; don't fall asleep." We have vibration control and sound here in the headrests.
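One plausible way to turn the chest-motion trace delivered by such a sensor into a breathing rate is to pick the dominant frequency of the displacement signal inside the breathing band. This is a self-contained, pure-stdlib sketch (a brute-force DFT over the band of interest); a real system would use an optimized FFT, and the sample rate and band limits here are assumptions.

```python
import math

def breathing_rate_bpm(samples, sample_rate_hz, lo_hz=0.05, hi_hz=1.0):
    """Return the dominant frequency of the displacement trace within
    the breathing band, expressed in breaths per minute."""
    n = len(samples)
    mean = sum(samples) / n          # remove the DC offset first
    best_f, best_p = 0.0, -1.0
    for k in range(1, n // 2):
        f = k * sample_rate_hz / n
        if not (lo_hz <= f <= hi_hz):
            continue                 # only search physiological frequencies
        re = sum((s - mean) * math.cos(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        im = sum((s - mean) * math.sin(2 * math.pi * k * i / n)
                 for i, s in enumerate(samples))
        p = re * re + im * im        # spectral power at frequency f
        if p > best_p:
            best_f, best_p = f, p
    return best_f * 60.0
```

Restricting the search to roughly 0.05 to 1 Hz (3 to 60 breaths per minute) is what keeps heartbeat and hand-movement components from being mistaken for breathing.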

So I can easily hear my personal sound space here; the others don't. And if I'm falling asleep, the seat starts vibrating. Also, if an ambulance comes from behind, or a bicycle rider comes from behind and rings the bell, then what happens is that in our car radio chip, a tiny little AI operates the in-cabin microphones and says, "Hey, Lars, outside on the rear right side, a bicycle rider or an ambulance is coming. Vibrate the seats. Get Lars's attention. Hey, there is something happening.

Don't listen to AC/DC at full throttle all the time, but be aware of what is happening here." Now, what the guys have also done is take little loudspeakers here, putting ultrasound on top of the audio signal, and little microphones.

So, redundant to the Ultra-Wideband signals, they also measure my chest movement here via the ultrasound. So we have redundant sensing, ultra-cheap. Car access is in the vehicle anyhow; the other part uses simple infrastructure that we also already have. Then you need to connect this to the outside of the vehicle. This is our connectivity unit, driven by an i.MX 93, operating a complete telematics unit, everything that handles electromagnetic waves to and from the car. So, a super cool thing for the driver workplace.

If this is too boring for you in a car, of course, we are also moving this into very different form factors.

Together with our acquisition TTTech Auto, we are using these types of ECUs; a lot of that electronics and of these use cases sits in these boxes here, in the fusion boxes. We are moving these boxes, for example, into the big snow groomers, into the advanced displays in special vehicles. So it is not only for the normal car driver and the leisure driving of Lars going on vacation. These are very heavy-duty and very dangerous workplaces where we need to make sure that never, ever a mistake happens. There are still way too many fatalities of skiers that we need to get down to zero.

Now, from all of that: I already started talking about software-defined factories earlier. These software-defined factories are a simple derivative of the years-long discussion that we had on software-defined vehicles, software-defined architectures in the widest sense.

You have a simple domain-based architecture, sometimes even splitting the domains into zones, the zonal discussion in the car. Then you take NXP microcontrollers and microprocessors: for the central networking device, our S32N family, and for the body control and zonal control, our S32K family. Here what you're seeing is a very nice partner wall with guys like Quanta, Delta, HiRain, Applied EV, Desay, Aumovio; I cannot read them all. But what you're seeing is the S32N in the center, and you see the S32Ks as zonal microcontrollers.

And all of that delivers you a very clean vehicle architecture setup, connected via SerDes, Ethernet, CAN, and FlexRay. Then what you get is basically these types of reference designs. So what you're going to see here is what we call our CoreRide platform.

We have exactly these ECUs for the zonal control and for the central compute, and we acquired TTTech Auto this year to make sure that we have one software layer on all of these ECUs. Now, what you can do with that, if you have one operating system everywhere, is easily move functionality around. And so we have it here on display. You have your small, medium, and large types of vehicles. You can now populate your function blocks on the various zones or on the central compute. If you have a small, simple car, you have everything on the central unit.

If you have a big and very feature-rich car, you start distributing the functions to the front zones, to the rear zones, and so on. But in principle, you can play with it. It's software movement, not hardware movement anymore.
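The "move functions around in software" idea above can be sketched as a vehicle configuration that maps each function block to whichever ECU runs it; re-homing a function is a config change, not a wiring change. The function names, zone names, and example configurations are invented for illustration, not NXP's actual software stack.

```python
# Two illustrative placements: a small car runs everything centrally,
# a feature-rich car distributes function blocks across zonal ECUs.
SMALL_CAR = {"lighting": "central", "door_locks": "central", "seat_control": "central"}
LARGE_CAR = {"lighting": "front_zone", "door_locks": "rear_zone", "seat_control": "central"}

def migrate(placement: dict, function: str, target_ecu: str) -> dict:
    """Re-home one function block onto another ECU.
    Returns a new placement; the original configuration is untouched."""
    updated = dict(placement)
    updated[function] = target_ecu
    return updated
```

With one operating system layer on every ECU, this kind of table is all that distinguishes the small-car and large-car variants.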

And that, of course, makes the entire thing super, super sexy. The moment you have these software-defined architectures, the car OEMs and the Tier 1s have ultimate flexibility. It's only a horsepower discussion then; in the end, you are just moving stuff around. Now, what is that stuff that you are processing in these zones and on the central units? Well, to a large extent, this is sensor data from within the vehicle. I've shown you all of that in the driver workplace. But it is also sensor data from around the car.

And sensor data from around the car, as most of us know from the autonomous driving discussions, is, of course, radar sensor data. And this here is our next big step in radar system cost-down. And it is also LiDAR data.

Here is a very advanced LiDAR concept with our partner Valeo, where you use the headlamp to send out the signal and get the echo back, and you have an ultra-cheap, headlamp-driven LiDAR proximity detector. You can take both of these signals, radar and LiDAR, bring them together, fuse them, have a camera as well, and detect the object. And then via LiDAR and radar, you detect what that object is doing, whether it's moving towards you or away from you: all the Doppler shifts and all that stuff. And all of that is processed here on this device.
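The Doppler shift mentioned above maps directly to how fast a detected object is closing on the sensor: for a radar with carrier frequency f_c, the radial velocity is v = f_d * c / (2 * f_c). The 77 GHz default below is a common automotive radar band; the example shift value is made up for illustration.

```python
C = 299_792_458.0  # speed of light, m/s

def radial_velocity(doppler_shift_hz: float, carrier_hz: float = 77e9) -> float:
    """Radial velocity of a radar target from its Doppler shift.
    Positive result = target closing on the sensor (positive shift)."""
    return doppler_shift_hz * C / (2.0 * carrier_hz)
```

This is why radar complements camera and LiDAR so well in the fusion described above: it measures velocity directly in one shot, instead of differencing positions over time.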

Now, that is one tiny radar sensor doing an awesome job. But how can we push the boundaries of physics? Well, very simply: by taking our big 5-nanometer device with 13 billion transistors, the S32N.

This device has isolated segments so that no source code can go from left to right and no executable code can corrupt the other segments. So we are stealing one of these segments. We are taking the radar data, not raw data, by the way, and pumping it over a 100-megabit Ethernet line. We already have optimized data here on the central compute, infrastructure-optimized data. And then whoever wants to bring an AI accelerator on top of that, so the NVIDIAs, Mobileyes, and Qualcomms of this world, can use that super-advanced, artifact-free data and do their autonomous driving on top of it.

So the claim to fame of NXP is that we are providing this integration infrastructure. Now, how does that look in real life?

Well, for example, we have this isolation that I mentioned earlier in microcontrollers as well. You have different, separated segments, so source code from one area, from the traction inverter or from the DC/DC controls, cannot cross over into another. If one area stalls, for example an ASIL-B area, the other ASIL-D segments are unaffected, in one piece of silicon, and the function stays safe. Power management ICs, these microprocessors here: you can operate all of that, and you see the functionalities that we can integrate here in the slide, in the lineup.

Whatever energy management use cases you are bringing in, you can combine them in one control unit. And basically, the whole high-voltage, difficult, ASIL-D-relevant case, the high-voltage energy management of that entire rolling robot, gets managed by one ECU.

If you follow me over here: how does that look in real life now? Well, very simply. You have these types of fantastic rolling robots here, more than 30 NXP chips in this Audi SQ6: networking, of course Ultra-Wideband, of course radar, all of these types of things, the in-vehicle networking, the microcontrollers, the microprocessors, more than 35 chips in this device. It's an electric vehicle. And what we have here is partners like EaseLink. EaseLink is driving the charging infrastructure to the next level.

What you are seeing here is very relevant not only for consumer cars but also for all other sorts of warehouse robots and rolling skateboards of all kinds. It's not wireless charging; it's a galvanically connected charging system based on NXP electronics. You see this snorkel here moving down.

This is how robots are going to feed themselves in the future. Yeah? They just drive to the parking spot, the socket comes down, and the vehicle charges without any losses, a galvanic direct connection. You see the metal pins here. Functionally safe. And this is how we autonomously feed our robots that anticipate and automate the world. I hope I could show you, with a bit of a fast-paced walkthrough here, how NXP is really bringing intelligent systems to the edge. None of these systems needs AI in the cloud.

We are dreaming of 50 billion smart connected robots that are making my life easier in the future. I urgently need it. Before I go into retirement age, I need all these helpers around me.

You have seen it in brighter lives, doing my medical care and helping me keep my body healthy and monitored; in brighter places, building the cocoon around me, my house that protects me, is always warm, always climatized, has a filled fridge, and has low maintenance. Then, of course, whenever Lars wants to start roaming around, my robot grabs me by the hand and transports me wherever I like. And if I like to drive myself, either in this fantastic device here or on the Verge motorcycle, it will be a safe and reliable experience.

We are at the moment at the edge where robots are able to take responsibility from Lars, the human being. Lars can trust the devices and doesn't need the cloud for that. That's the important message. NXP is enabling that with a complete portfolio.

If you want to dive deeper into that, if you are a startup, a customer, or a large-scale customer as well, make sure you go to NXP.com; chase me, chase my team. We are keen on talking to you and diving deeper with you, and we are looking forward to the fairs throughout the year and, of course, also to the next CES to come. Thank you.
