Ladies and gentlemen, welcome to the ST Intelligent Sensing Enabling the Physical AI conference call and live webcast. I am Sandra, the Chorus Call operator. I would like to remind you that all participants have been placed in listen-only mode and the conference is being recorded. The presentation will be followed by a Q&A session. You can register for questions at any time by pressing star and one on your telephone. For operator assistance, please press star and zero. The conference must not be recorded for publication or broadcast. At this time, it is my pleasure to hand over to Jerome Ramel, EVP, Corporate Development and Integrated External Communication. Please go ahead, sir.
Thank you. Thank you everyone for joining the ST Intelligent Sensing Enabling the Physical AI conference call. Hosting the call today is Marco Cassis, President, Analog, MEMS and Sensors Group, Head of STMicroelectronics Strategy, System Research and Applications, and Innovation Office. This live webcast and the presentation materials can be accessed on the ST Investor Relations website. A replay will be available shortly after the conclusion of this call. This call will include forward-looking statements that involve risk factors that could cause ST's results to differ materially from management's expectations and plans. We encourage you to review the safe harbor statement contained in the presentation and also in ST's most recent regulatory filings for a full description of these risk factors. Also, to ensure all participants have an opportunity to ask questions during the Q&A session, please limit yourself to one question and a brief follow-up.
I'd like now to turn the call over to Marco Cassis. Marco.
Thank you, Jerome. Good afternoon, everyone, and thank you for joining this conference call dedicated to sensors. I will begin by presenting our sensor product portfolio, the markets we address, and outline our financial ambitions for the next three years. I will then briefly discuss the main benefits for ST of our recently completed MEMS acquisition. Finally, I will focus on the opportunities we see in humanoids before opening the floor for your questions. ST is committed to enabling a smarter, more efficient and safer world through our advanced sensing technologies. In the age of AI, data is the new oil. Just as oil reduces friction and enhances efficiency in engines, sensors bridge the physical and digital worlds to optimize the performance of AI systems by providing precise and relevant data. Our vision is to seamlessly integrate AI capabilities into everyday applications, enhancing performance, efficiency and user experience.
Leveraging our expertise in semiconductor design and manufacturing, we are developing intelligent sensors that not only capture data with high precision, but also process it in real time. Sensors are essential for enabling intelligent behavior and decision-making in AI-related applications such as autonomous vehicles, smart homes, industrial automation, robotics, smart consumer devices, and many other use cases enhanced by AI. In 2025, our sensor revenues, including MEMS sensors and actuators as well as optical sensing solutions, reached $2.2 billion, growing 10% year-over-year. We have been a pioneer in both the MEMS and imaging industries, having shipped more than 30 billion sensors in the past decades. We intend to continue on this path, maintaining and further strengthening our leading market positions.
This performance is driven by our technology leadership, in-house manufacturing, and the advanced hardware and software ecosystem we have developed, including automotive-qualified sensor technologies. We are proud to be one of the few, if not the only player, to master both MEMS sensors as well as optical sensors, which we usually refer to as imaging. We are fully leveraging ST's IDM model to build intelligent sensors. Our highly specialized proprietary process technologies are the foundation for strongly differentiated products, and we have the capabilities to offer either standard products or custom solutions. In MEMS, we have a leading portfolio of sensors embedding increasing levels of intelligence. In imaging, we are a recognized leader in time-of-flight and in specialized CMOS image sensors, and we also offer ambient light sensors and advanced optics. As with our MEMS products, our imaging solutions embed more and more intelligence.
Needless to say, in the event further post-processing computing power is needed, we can easily leverage our leading STM32 general-purpose microcontroller family and ecosystem. Our sensors and associated companion chips leverage our well-balanced manufacturing footprint, with sites distributed across key regions worldwide: on the front end, our fabs in France, including our 300-millimeter digital fab in Crolles, as well as in Italy, including our 300-millimeter fab in Agrate, and in Singapore; on the back end, our sites in Europe and in Asia. This gives us resilience, scale, and control over quality, cost, and capacity ramp-up, which is especially important for our engaged customer programs. Turning now to the markets. These slides show why we see sensors as a structural growth story. They sit at the intersection of several long-term trends across automotive, industrial, consumer, IoT, and healthcare. First, automotive.
Safety and regulatory standards keep tightening globally, which increases the number of sensors per vehicle. On top of that, electrification and autonomous driving demand higher-value inertial units and advanced imaging or lidar. It's not just more sensors, it's more content per car. Second, industrial and infrastructure. Long-term robotization boosts demand for MEMS and machine vision sensors in factory and logistics equipment. At the same time, the digitalization of infrastructure is driving broad deployment of distributed condition-monitoring MEMS and vision systems to improve uptime and efficiency. The energy transition is accelerating adoption of environmental and pressure sensors, plus imaging for asset and emissions monitoring. Third, consumer, IoT, and healthcare. On the consumer side, premium smartphones, wearables, and AR/VR devices keep adding MEMS motion sensors and 3D or depth imagers.
In parallel, the proliferation of low-power IoT nodes in smart homes and smart cities increases demand for ultra-low-power environmental MEMS and compact imagers at the edge. In health and remote care, the shift toward continuous and preventive monitoring raises penetration of motion and pressure sensors. Across all these end markets, as more physical AI is embedded into devices, the quantity and value of sensing content continue to grow, structurally supporting sensor market growth. Here we see how those structural trends translate into a solid long-term growth profile for sensors. The broad sensor market, combining MEMS and imaging, is expected to grow from roughly $49 billion in 2025 to about $57 billion in 2028, a compound annual growth rate of around 4.7%. Within this very large market, the segments that ST specifically targets are growing even faster.
Specialized CMOS image sensors, the main product category addressed by our imaging business, represent today an addressable market of around $4 billion, growing at about a 5.7% CAGR to 2028, with a leading market position for ST. Motion and pressure MEMS sensors represent a $7 billion-plus market growing at roughly a 5.3% CAGR to 2028. Here as well, ST is a key player, with pro forma market share in the low teens and a strong number-two position, including the recent acquisition. We have an expanding total market, and the sensor sub-segments where ST is most active are growing faster than the overall sensor market. In this landscape, our ambition is to significantly outperform the market we serve.
We aim to grow our sensor revenues at a mid-teens compound annual growth rate on a reported basis through 2028, starting from $2.2 billion reported revenues in 2025. Both MEMS and imaging are expected to contribute to this growth. In addition, we see further opportunities to grow beyond this time horizon. Now, a few words on the recent acquisition we did in MEMS. The acquisition is highly complementary in terms of technologies, as it adds to our ThELMA sensor and PETRA actuator technologies two new technologies with additional features: UMems, which brings high-sensitivity capabilities and requires low calibration effort, and PCell, for capacitive pressure sensors. In terms of product portfolio, the complementarity is also high. ST has an established leadership position in personal electronics for MEMS sensors for Android, and in computers and peripherals for actuators, printheads, and inertial MEMS sensors for laptops.
The acquisition creates a strong footprint in automotive, especially leadership in accelerometers for safety applications, and a top-five ranking for pressure sensors. This complements our established leadership in automotive navigation. In industrial, the acquired business brings solutions for medical devices on top of our application-specific solutions. This enables us to significantly rebalance our MEMS end-market exposure. Our revenues are now more aligned to the fast-growing automotive market, which accounts for 37% of our MEMS revenues on a pro forma basis in 2025, while also increasing the share of industrial, now 18% of revenues. Let me now turn to our strategy for intelligent sensing. Intelligent sensors are sensing and computing. Sensors are becoming smarter and are now the primary source of data for AI and the key enabler of human-AI interaction.
They no longer just measure, they also pre-process data so systems can understand and react to their surroundings. Intelligent sensors embed algorithms, machine learning, and processing capabilities to improve context awareness and offload the rest of the system. They are smart, transformative, and accurate. Smart means they process data locally, saving energy, reducing latency, and helping protect privacy. Transformative means they significantly enhance how we interact with the environment. Accurate means they deliver precise, reliable data, reducing calibration effort and energy use. Together, these three attributes allow intelligent sensors to provide meaningful information in an optimal and sustainable way. Intelligent sensing supports AI in two ways. First, as an enabler: sensors capture large data sets and pre-process them at the edge, optimizing throughput, adding tags or security features, and allowing a powerful local host to run AI without always resorting to the cloud. This is edge AI.
Second, intelligence inside the sensor itself handles lighter workloads where ultra-low power and very low latency are critical. Here, the sensor directly computes what it measures: AI at the edge of the edge. Together, these two forms let ST address a vast range of applications, from smartphones, laptops, and wearables to tools, cars, robots, and connected devices that must process data in real time and make autonomous decisions. Bringing intelligence closer to the user reduces latency, enables more personalized features, improves energy efficiency, and lowers environmental impact, as less data must travel to and from the cloud. Let me now show how we enable this with our MEMS and imaging solutions. We offer intelligent MEMS sensors that can process signals right inside the sensor itself.
We already have two generations on the market with more than 20 products deployed, and we are seeing strong momentum, especially in applications where energy and computing efficiency really matter. In-sensor AI alone is not enough. That's why we have built a full development ecosystem, fully compatible with STM32Cube.AI, to support customers through the entire design cycle and enable end-to-end development. We are the leaders in this space, and we are continuing to invest in the next generations of sensors, so we can bring tiny, efficient processing to more and more high-volume applications. Our more advanced smart MEMS integrate a function called an intelligent sensor processing unit, or ISPU. This unit is a true integrated processor that is optimized versus a general-purpose MCU. It can be used to run complex AI algorithms to process raw data and provide to the system only meaningful data.
The main advantage is that, being integrated, it optimizes the required computing power to deliver orders of magnitude higher efficiency than processing on a separate device. Optical sensing solutions are also embedding more and more intelligence. Thanks to die-stacking technology, there is a lot of silicon area available to integrate processing capabilities inside the sensor. This offers higher data accuracy and quality, improves signal-to-noise ratio, and minimizes the overall power consumption of the system. All of ST's optical sensors already integrate 32-bit MCUs, and the new generation also includes a convolutional neural network accelerator, enabling much higher performance in many use cases. In parallel, our imaging business is expanding to new high-growth areas. Our imaging core business is indeed based on three pillars. Smartphone front-facing: ST is number one for face authentication at major OEMs. Smartphone world-facing, including camera assist and light sensing.
PC and laptop, where ST is number one for low-power presence detection, smart sensing, face authentication, Windows Hello, and security. We see three additional growth drivers. Automotive, with driver monitoring and cabin occupancy: ST aims to have a leading position in this still small but fast-growing segment, which is driven by increased regulation. Industrial, including, among others, robotics, people counting, smart home, depth sensing, and security and surveillance. Emerging applications such as AR, VR, and MR, as well as humanoid robots, on which I will give more color later. In summary, AI needs data and therefore sensors. Scalable, sustainable AI depends on intelligent sensors. This creates a major opportunity for companies like ST that truly master the sense-and-compute paradigm. Our strong technology roadmaps, scalable manufacturing model, and broad, increasingly intelligent product portfolio position us as a leader in this space.
Coupled with our deep expertise and close partnerships with the industry's key innovators, we have a robust pipeline of customer programs and are well-positioned to outgrow the market. Let's now dig a little bit further into the humanoids opportunity for ST. Within the wider robotics vertical, which is already a reality today in industrial end markets, ST is well-positioned as a strategic enabler to address the growing humanoid market, which could represent a further long-term growth driver for the industry. We estimate ST's current addressable bill of materials in humanoid robots at about $600 per unit. If you take a typical humanoid, almost every block you see on this slide needs sensing, and that sensing enables further ST component attach. In the head, we have cameras and sensors that let the robot see and understand its position in the environment.
In the body, the main processing units rely on motion sensors to keep balance and to move smoothly. In the arms, legs, and joints, sensors and motor drivers work together so movements are precise and safe. The hands need sensors to detect grip and contact, as well as to understand position. We are already engaged with major OEMs in the U.S., Europe, China, and APAC, and are deeply integrated into the value chain. In particular, ST is working with NVIDIA to further accelerate and streamline the end-to-end development experience for physical AI solutions, leveraging the complementary strengths and portfolios of both companies. Additional information will be made available later today with a press release. Focusing on sensors, we offer a comprehensive portfolio of products.
For sensing, we provide an extensive portfolio of MEMS, including IMUs and accelerometers for motion detection and balance, gyroscopes for orientation and angular velocity sensing, magnetometers for direction sensing, pressure sensors, and sensors for measuring temperature and humidity, plus sensor fusion software libraries that enable motion tracking, fall detection, activity recognition, and contextual awareness. For vision, we offer advanced 2D and 3D vision sensors, including global shutter CMOS sensors and time-of-flight modules, in order to support use cases such as object recognition, simultaneous localization and mapping, obstacle avoidance, gesture, and proximity sensing. Our imaging products support high resolution, low power consumption, and integration with AI processing. Beyond sensors, our broad offering plays a key role.
It includes our industry-leading STM32 microcontrollers and comprehensive ecosystem, as well as our strong portfolio in power and analog products, which enables us to address requirements for motor control and drivers, power and battery management, connectivity, and edge AI. With more than 500 ST components in humanoid robot major system blocks, we are uniquely positioned to address these opportunities. A few words to conclude. Our ambition is to grow our sensor revenues at a mid-teens compound annual growth rate through 2028, with further opportunities to grow beyond this time horizon. We are ideally positioned to win in a growing sensor market, which is increasingly driven by physical AI, thanks to our strong technology and product roadmaps, IDM model, and partnerships with market shapers.
For MEMS, our recent acquisition is further strengthening our technology and product portfolio and rebalancing our market exposure, positioning us as a strong number-two player in the markets we serve. In imaging, while consolidating our leadership position in our core markets, we are expanding into high-growth areas. We are a strategic enabler for humanoid robots, with an overall $600 current bill of materials opportunity per unit, with sensors as the key component to attach further ST content. Thank you.
Yeah. We're gonna start the Q&A.
We will now begin.
Operator, please.
Okay. We will now begin the question and answer session. Anyone who wishes to ask a question or make a comment may press star and one on their telephone. You will hear a tone to confirm that you've entered the queue. If you wish to remove yourself from the question queue, you may press star and two. Participants are requested to disable the loudspeaker mode while asking a question. In the interest of time, please limit yourself to one question only. Anyone with a question may press star and one at this time. Our first question comes from Jakob Bluestone from BNP Paribas. Please go ahead.
Hi. Thanks for taking the question. I just wanted to follow up on humanoid robots. I guess in China it's much further advanced than it is in the West. I was wondering if you could maybe expand a little bit on what your position is there. You obviously mentioned that you're embedded with some of the major OEMs, but I don't know if you can share any market share data or just give us a little bit of a sense of what your position is in China, specifically for humanoid robots.
Is the question specific to humanoid robots or sensing in China?
It was on the humanoid robots specifically, but if there's any broader color you think is relevant, then.
Okay.
Happy to take that as well.
In humanoid robots, what I can say is that, at this stage, and this is not only in China, it's overall, we are present in at least the top 10 humanoid makers. At this stage it's a positioning in terms of product offer, with, of course, as I was saying before, a strong interest in the sensing portion, because the sensing portion represents between 30% and 40% of the overall bill of materials. So it's an important portion, and it is an enabler for the rest of the components. So I would say I feel comfortable in terms of positioning, but again, this is still a market which is at the beginning.
If we expand a little bit more on sensing overall, we see good traction, for example, in automotive, because electrification there is strong. Electric cars are heavier than internal combustion engine cars, which means, for example, there are new growing applications like active suspension that are calling for accelerometers, or even six-axis sensors, to help and support the way these active suspensions work. The same goes for autopilot, et cetera. I will say that China is surely an interesting market for us and our positioning there is good. I hope I answered the question.
It's very helpful. Thank you.
Thank you, Jakob.
The next question comes from Gianmarco Bonacina from Banca Akros. Please go ahead.
Good afternoon. A question on your ambition to grow mid-teens. If I understood correctly, it's on a reported basis, so including NXP. Would it be fair to say that if we exclude this business, which you will consolidate this year, the expected CAGR is more around 10%? And a related question, looking at your slide 21: on the core business, do you expect to grow more in line with the market, so mid-single-digit? And among the high-growth areas, which are the ones where you have more visibility in the next 12 to 24 months for stronger growth, because maybe you are already working with some customers? Thank you.
You are specifically speaking about imaging, correct?
No, I mean the target overall, and on page 21 I was just wondering which are the growth areas where you have visibility to generate, let's say, extra revenues in the next 12 to 24 months, because I think humanoids is more something for the mid-term, but maybe I'm wrong.
Okay. First of all, we are confident that between 2025 and 2028 the market will grow between 5% and 6%. As I was saying before, our aim is to grow the sensor revenues at a mid-teens compound rate. Of course, this is including the NXP acquisition. At constant perimeter, you are right, the growth is more in the low double digits, but still a consistent growth. This trajectory is supported by, as I was saying before, a clear roadmap of engaged customer programs and a pipeline of opportunities. In the end, we expect to outperform, mainly driven by the fact that we have strong technology roadmaps, a wide product portfolio, and we can leverage our IDM model.
In this business, we have customer intimacy and expertise in innovation with market shapers. The exposure that we have to segments that are rising faster, like automotive for MEMS or imaging, is one of the reasons, again, we do believe that we will perform better than the market. Now, going specifically to chart 21, yes, you are right. There are new applications in which we do believe that we will grow faster compared to areas where we have been more traditionally present, what we call, let's say, our core business. Specifically here, I can share with you that cabin monitoring and driver occupancy is an application that is going to grow in the range of a 40% CAGR. There we already have a low double-digit market share, and we do believe that we are going to increase our market share in the next few years down the road. Yes, we have applications that are growing faster and help us to have an overall growth on sensors which is much higher than the expected growth of the sensor market. I hope I answered the question.
Thank you.
Thank you, Gianmarco. Next question, please.
The next question comes from Adithya Metuku from HSBC. Please go ahead.
Yeah. Good afternoon, guys. Two questions, please. Firstly, just on the robotics opportunity, do you need any qualifications to address this market? Are they similar to automotive qualifications? Just any color you can give around that would be helpful. Then secondly, you showed a number of things on that SAM slide, I think it's towards the end of your slide deck. Is there any one particular part of the SAM that tends to be more sticky than others? For example, if you win that, then maybe you can win a lot of the other things within the SAM. Are there any dynamics like that that we need to be aware of? Any color there would also be helpful. Thank you.
Okay. Which was the first part of the question? Sorry, because the line was not great.
Oh, sorry. Just, are there any qualifications needed to address the robotic solution?
Okay. Sorry about that. No, the mission profile for robotics is not as strict as the mission profile required for automotive. Surely there are still safety considerations that need to be taken into account, which come more from the usage of the sensor and so on. Definitely it's not an environment which has the same requirements as automotive; it's much more similar to the industrial requirements. Plus, and this is true for humanoids or for robotics used in industry, clearly you also have robotics that are much more consumer-oriented, let's say, where even the industrial requirements are, quote-unquote, "much lighter." Coming back to the second part of the question.
Yeah, I do believe that there are parts that are more sticky, because they are really enablers of the performance. What I mean is, it's not by chance that of the over $600 of bill of materials that you can have in a humanoid, we have evaluated that between 30% and 40% is sensor-related. You know very well that the sensor offer is something not so many companies can provide, and fewer can do both. And it's not only sensors; it's also the fact that you need to have the modularization of the sensor to make sure that whoever is doing the robotics will be in a position to, quote-unquote, "simulate" how the movements will be, how smooth they will be, and so on. It is products plus ecosystem.
I think there is a strength on which we can leverage, and that's what I was also trying to pass as a message during the presentation: we can attach to this a good part of the ST product portfolio as well. There are other parts that are also important. You can have GaN, because you can make motors that are smaller, et cetera. But I would really distinguish sensing as an enabler, and the fact that as a company we have a wide portfolio that can fit extremely well with the requirements for humanoids. In terms of positioning, I think it's going to be sticky enough and not so easy to replace.
Got it. Essentially sensors are the differentiating element, and then you can stack other products around them.
Yes. I strongly believe in the differentiating element. It's very important that latency is at the minimum, because these are movements, and to do that, you want to do it at the lowest budget of energy. There we have surely a quite interesting offer, because data can be processed with our ISPU at very low power consumption, so only meaningful data are sent from the sensor itself. It fits extremely well with what are, if you want, the requirements of humanoids: they need to be very accurate in the way they move, and here sensors play a role; and they need to save as much energy as possible, and here again sensing plays a role in terms of processing data locally.
Of course, other parts of the portfolio too, like Yann was saying before, will help in that direction in terms of saving energy. I would definitely define sensors as a differentiating factor.
Understood. Thank you.
Thank you, Adi. Next question, please.
The next question comes from Sébastien Sztabowicz from Kepler Cheuvreux. Please go ahead.
Yeah. Hi, everyone, and thanks for taking my question. Coming back to your forecast of mid-teens growth for the business in the coming years: do you see different dynamics between the MEMS and the imaging segments, or do you expect, I would say, broadly similar growth in the two segments in the coming years? The second question is, do you need any specific building block to execute your strategy around intelligent sensing? That is, do you need some M&A to acquire part of the technology, or do you have everything in-house? Thank you.
Okay. For the first part, at this stage, let's say the imaging portion in terms of value is higher than the MEMS portion. I cannot give you granularity. What I can say is that while imaging will keep growing, the MEMS portion will grow faster, so we'll have a little bit more rebalancing in terms of the split between the imaging and the MEMS portions. For the second part of the question, no, we do have what we need in-house, both in terms of mechanics and technologies, and also in terms of IPs. The only thing that we are doing, in order to integrate more digital capabilities, is starting to use more advanced nodes to be sure that we can do much more computation, let's say, at the edge.
In terms of analog IPs and so on, we have everything in-house at this stage, and I do not see that we need to acquire from external sources, also because we can leverage the analog design capabilities that we have in ST in case further IPs could be required. Of course, if down the road we find that something new or something better could be important, we'll always be keen to try to understand and see what would be necessary for us to do. At this stage, I think we are pretty well covered for what we see coming from the market. Again, I underline that we are present with the big major ones, so we already have a good level of interaction that makes us confident that we have the right level of IPs. I hope I answered.
Yeah, very well.
Thank you, Sébastien. Next question, please.
The next question comes from Lee Simpson from Morgan Stanley. Please go ahead.
Great. Thanks for squeezing me in here. I'm fascinated by the humanoids opportunity and the $600 that you've called out as bill of materials. I'm really just trying to understand the go-to-market here. It's clear that in the past you've worked well with integrated big players in other large tentpole markets, and it looks as though you could be in a position to do that here as well. Even so, how are you doing this? Are you going to create reference designs? Are you going to create evaluation platforms, or are you looking to maybe help with the sim-to-real gap and accelerate maybe some standard products for the market? I'm just trying to understand how this all fits together as go-to-market. Thank you.
Yeah. It's a combination, and it is also a work in progress, I have to say. It's a combination of everything you have said, basically. Clearly we work with, let's call them, shapers. Let me put it this way: even if it is much more sophisticated, to give you an example, it's like when you work with Qualcomm, here I'm speaking about smartphones, and you want to have your MEMS or whatever included in the Qualcomm platform with the drivers. After that, your devices have the drivers and can be used by whoever in the Android community is going to use a Qualcomm microprocessor.
Working with and enabling, let's say, the microprocessor offering automatically translates into your devices being available to whoever is going to develop things using a Qualcomm processor. This is one step. The other one, of course, is to work with customers that want something a little bit more specific, and in that case, maybe not that often, I think we could also make dedicated or custom devices. In the first stage it will be much more, quote-unquote, "standard devices" that are already trimmed and have good performance. What we are doing at this stage is making sure that we are present with the majority of those involved in humanoids. We are very humble.
We also try to understand what the needs are, and it's a growing and learning process at this stage. I'm pretty confident that we have the majority of the ingredients that are necessary.
That's precisely what I was asking, and thank you for the discussion. I mean, it sounds like you have the standard parts when needed, and you have custom parts when needed as well.
Yes.
Maybe in the context of the AI vision proposition, it looks as though you could have standard time-of-flight sensors, for instance, but you could even do a vision processor if called upon by a large OEM.
It's always in our DNA. If there is volume and it's something specific, we can always consider it. Of course, we need to have a return on the investment. It needs to make a lot of sense, et cetera.
Volumes are coming, but at least we have the IP, we have the competencies, and as you said, at this stage we have standard products that fit extremely well with the requirements. Here I speak about MEMS, but also imaging: it is not only the capacity to sense, but also the capacity to perform data processing locally at very, very low energy. That improves latency and improves the overall performance. I'm pretty confident that we have a good offer there.
Thank you very much.
Thank you.
Thank you, Lee. Next question, please.
The next question comes from Johannes Schaller from Deutsche Bank. Please go ahead.
Yeah. Hi. Thanks for taking my question. Marco, you already gave the 30%-40% contribution from the sensing side to the $600 opportunity per humanoid robot. Could you maybe help us break down the other 60%-70% a little bit more by product? Then, I think you partially answered this already, but do you see from these robotics companies more of a desire to buy entire systems from companies like ST, or more of a desire to buy individual best-of-breed products, so the best sensor, the best microcontroller, the best Ethernet component, et cetera? Can you maybe talk about that dynamic a little bit more? Thank you.
For the first part, as I said, if you combine the sensors overall, we are in the range between 30%-40%. I think power and discrete will be in the low teens percentage, analog in the range of 20%-25%, and MDG, what you would call microcontrollers, in a similar range between 20%-25%. The portion which is overall the biggest, we believe, is the sensing part.
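Putting these ranges against the $600 bill-of-materials figure cited earlier gives a rough sense of the implied dollar content per humanoid. This is an illustrative back-of-the-envelope only, not figures ST provided; in particular, the 10%-15% reading of "low teens" for power and discrete is an assumption:

```python
# Illustrative split of the ~$600 humanoid bill of materials,
# using the percentage ranges mentioned on the call.
BOM_TOTAL = 600  # dollars per humanoid, as cited on the call

shares = {
    "sensing (MEMS + imaging)": (0.30, 0.40),
    "power and discrete":       (0.10, 0.15),  # assumed reading of "low teens"
    "analog":                   (0.20, 0.25),
    "microcontrollers (MDG)":   (0.20, 0.25),
}

# Print the implied low/high dollar range for each product family.
for product, (lo, hi) in shares.items():
    print(f"{product}: ${BOM_TOTAL * lo:.0f}-${BOM_TOTAL * hi:.0f}")
```

On this reading, sensing at 30%-40% implies roughly $180-$240 of content per robot, consistent with it being the largest single portion.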
Yeah. That's clear. Thank you.
Yeah. The second part of the question was, you were asking me the
Just the desire you see in the market from your customers.
Sure.
To go for best of breed versus system solutions.
No, I think we will have two stages. The first one is, if they can take the overall system, it is easier. Maybe in the second stage there will be more optimization. But I strongly believe that you need to be inside the ecosystem to be a solid supplier, and the more you are in that ecosystem, the more difficult it will be for you to be replaced. Surely, some parts could be a little bit more cherry-picked, but I think that would be a second stage, because you can leverage, you can help. So I think it's maybe not all of the bill of materials, but it is a solid bill of materials that we can address.
Great. Thank you, Marco.
Thank you.
Thank you, Johannes. Next question, please.
As a reminder, if you wish to register for a question, please press star followed by one. The next question comes from Alberto Gegra from Equita SpA. Please go ahead.
Hi, good afternoon. I hope you can hear me. I have a follow-up on competition. Can you elaborate more on the evolution of the competitive landscape in this robotics ecosystem? Particularly, I'm wondering if there are new entrants, for instance in China, where the robotics market is quite developed, just to better understand what the entry barriers in this segment are. Also, compared to your, let's say, traditional competitors, which are IDMs like you, where do you think you are ahead of the competition in this sensors business?
Yeah. I cannot comment so much on competition, but I can underline this again. First of all, we are present independently of geography, which means also in China. We are present with the major players at this stage. I don't have the percentages because volumes are still small; it's more about positioning at this stage. Versus our competitors that also have a broad portfolio, we are the only one that has this kind of sensors, both imaging and MEMS, and I think we are the only one embedding local, low-energy computational capabilities. In microcontrollers, of course, we are very strong, but we also have competitors with a good offer in terms of microcontrollers.
For power it is the same, and for the others it is the same. I really think that the distinctive feature we bring to the table for humanoids is the sensor portion. For this application it is mandatory, because we are speaking about movement, precise movement, if possible at low energy. The performance of humanoids will be linked to how sensing helps them to move and to understand the environment. I think we are the only one with this breadth of sensing capabilities, and I strongly believe this is a good differentiating factor that we bring versus the competition. I hope I answered. Yeah.
Yes. Thank you.
Yeah.
Thank you. We have time for one more question.
We have a follow-up question from Adithya Metuku from HSBC. Please go ahead.
Yeah. Thank you, guys. Just one additional question. Marco, you mentioned you will do a node migration for faster processing. Are you planning on doing this in-house, or will it be outsourced? And secondly, could you also comment a bit on how you intend to integrate the logic with the sensor, what sort of technologies you are thinking of? Any packaging, any color you can give around that would be super helpful. Thank you.
Okay. I missed the first part of the question. On the digital, I understood. The first part, can you-
Just the nodes. You mentioned moving to smaller nodes to increase the computational power.
Yes. Sure.
I just wondered.
No, this one is.
If you will be doing that in-house.
Yeah. We will be doing this in-house. It's a little bit tricky because, in that portion, you typically have 50% which is analog and 50% which is digital. Moving to a very advanced digital node, for what can be advanced at this stage, will not get you all the benefits, because the analog portion, which is big, cannot be shrunk. The performance is strongly linked to the sensitivity of the analog portion. Let's not forget that in MEMS, in motion MEMS, we are basically measuring the movement of a few electrons, so we need to be very precise and very accurate. At this stage, we are using a 130-nanometer node.
I think we'll go down to 90 nanometers, all in-house technologies; we are, quote-unquote, independent there. Maybe the next step could be to go down to 40, but while the move from 130 to 90 is very fast and very straightforward, because the analog basically stays the same, going further down needs to make really a lot of sense. This is true for our sensors. For the NXP portion, the mechanics comes from us and the ASIC is external, but in the next generation, in the future, the ASIC, so the digital portion and so on, will also come from ST.
Got it. When you said you will shrink, you're talking about going down to 90 from 130 on the logic side?
Yes.
Okay. Understood.
Yes. It's more that we want to put in more capabilities. It's not really a reduction of die size; it's more that, in the same die size, we put more computational capabilities.
Got it.
Okay.
Thank you, Adi. Well, thank you, Marco. I think this is the end of our call, so thank you very much, all of you, for joining us today. As a reminder, our quiet period starts this evening, so apologies to those who didn't have time to ask a question. Thank you very much.
Thank you. Thank you all. Thank you for attending.
Ladies and gentlemen, the conference is now over. Thank you for choosing Chorus Call, and thank you for participating in the conference. You may now disconnect your lines. Goodbye.