Welcome, and thank you for standing by. I would like to inform all participants that this conference call, as well as the Q&A, is being recorded and will be available to clients of J.P. Morgan. Parts of this conference call may also be reproduced in J.P. Morgan research. If you have any objections, you may disconnect at this time. Unless otherwise permitted by internal J.P. Morgan policy, members of J.P. Morgan Investment and Corporate Banking are not permitted on this call and should disconnect now. Views and opinions expressed by any external speakers on this call are those of the speakers and not of J.P. Morgan. I would now like to turn the call over to Samik Chatterjee.
Yep, thank you, Operator, and welcome, everyone. I'm Samik Chatterjee from J.P. Morgan Equity Research, and I have the pleasure of hosting the Design Engineering Software Tech Talk with Keysight Technologies today. Before I introduce Niels, who is going to be the host of the presentation here, just a reminder for the audience on asking questions later in the presentation: we're going to take questions by text. You can either submit the question on the portal, or you can email the question to me or to my team, and we'll incorporate the questions that you have into the Q&A section of this presentation. We'll do our best to get through as many of the questions that come in from the audience in the Q&A section.
So with that, let me introduce and hand it over to Niels Faché, Dr. Niels Faché, who's the VP and General Manager of Design and Simulation at Keysight. And, Niels, thank you for the time, and I'll hand it over to you for this Design Engineering Tech Talk. Thank you.
So thank you, Samik, for the introduction and for organizing this Software Tech Talk. Good day, everyone. Welcome to Keysight's Design Engineering Software Presentation, and thank you for joining us today. My name is Niels Faché, Vice President and General Manager of Keysight's Design Engineering Software organization. I started my career as an entrepreneur in electronic design automation and sold my company to Hewlett-Packard. Since that acquisition, I've been with Hewlett-Packard, then Agilent, and now Keysight for almost three decades. I've worked in various roles across the company and have broad experience with our design, emulation, and test portfolio. I returned to our electronic design automation business about three years ago, and since then, I've led an effort to expand our portfolio from EDA to design engineering.
It's my pleasure to be here today and share our journey, business strategy, and portfolio. As always, please refer to our safe harbor statement when it comes to forward-looking statements or any financial measures. So as we think about our business strategy and our portfolio, it's instructive to look at our history. Our roots go back to the early days of Hewlett-Packard, a company founded on electronic measurement innovation. In those days, measurement and test was a new and emerging market, and the products were primarily hardware solutions. In the late '80s, Hewlett-Packard developed design and simulation tools, with the initial target of helping improve the development of our own products. That became a commercial business, and it's one that we still have in our portfolio today.
In 1999, a highly diversified Hewlett-Packard spun out Agilent, and Agilent became the world's premier measurement company for electronic test and life sciences. As there were new generations of wireless and high-speed wireline communications, the test equipment became much more sophisticated, and the software content continued to grow. At the same time, our electronic design automation business became the industry standard for RF and microwave design. Then in 2014, Keysight became an independent company focused on electronic design and test. Since the formation of Keysight, we have embarked on a major transformation, going from hardware-centric products to software-centric solutions. That's been a profound change to our business, and it has had a significant impact. For example, we have substantially increased the investments in product development.
We have also seen a significant increase in the pace of innovation and product releases, and we're using software to connect our instruments and create more complex test solutions. Today, we are still using our electronic design automation tools in-house. As our product development teams look for ways to improve the performance of instruments, improve the development life cycle so we can get products to market faster, and adopt new fabrication technologies, they use our tools for that purpose. We gain a lot of insights from their needs, and that helps us develop tools that stay at the leading edge of design methodologies, while we're also supporting the latest applications, standards, and technologies. So let's take a closer look at our software business.
So as Keysight has executed on its software-centric solution strategy, the software business has tripled over the past decade, from $350 million to $1.2 billion in revenue. Software growth has outpaced the growth in the overall Keysight portfolio, and we expect that to continue to be the case. The portfolio is made up of three types of software. We have the instrument software, which is connected with our core test equipment. All of that software runs directly on instruments. It's been around for a long time, and instrument software enables application-specific, standards-based measurements. For example, we use instruments to analyze a 5G signal. The next type of software is the test solution software.
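As a back-of-the-envelope check on the growth described here, the move from roughly $350 million to $1.2 billion over a decade implies a compound annual growth rate that can be computed in a few lines (a sketch assuming a clean ten-year window, which is an interpretation of "the past decade"):

```python
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate implied by a start value, an end value, and a period."""
    return (end / start) ** (1.0 / years) - 1.0

# Keysight software revenue per the talk: ~$350M a decade ago, ~$1.2B today.
implied = cagr(350e6, 1.2e9, 10)
print(f"Implied CAGR: {implied:.1%}")  # → roughly 13% per year
```

That double-digit rate is consistent with the statement that software growth has outpaced the overall Keysight portfolio.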
This is a much more recent investment, and here we have software that connects different instruments and automates the testing of products with very complex signals. For example, in the last decade, we have made investments in protocol layer emulation. We've made a couple acquisitions with Ixia and Anite that have bolstered our portfolio. This software allows a customer to test a handset in the context of a system with our base station emulation capabilities. And then finally, we have the design engineering software. Here we have a portfolio of physics-based virtual prototyping tools, and these are used in different fields of electronic design, computer-aided engineering, and process and data management.
If we look at these different types of software, as you can see here, they are used across the entire engineering life cycle, from design and simulation to physical prototyping, validation, manufacturing, and optimization. While we expect growth in all these types of software, there is an increased emphasis on virtual prototyping and design engineering software, and that's also the focus of our presentation today. To further set the stage, let's take a closer look at the engineering life cycle and how our design engineering software fits in. In this slide, we use the so-called V-model as a representation of a product development life cycle. On the left-hand side, we have virtual prototyping, whose outputs are virtual twins, which are virtual representations of a physical system.
On the right-hand side, we have the realization, the integration, and the testing of physical systems. So product development starts with a concept, and that concept is translated into a functional model. That functional model describes, from a user perspective, how the product will behave. The model is also used to come up with the specifications. In the subsequent design engineering phase, those specifications are used to create a design. That design is simulated, evaluated, and optimized against those specifications. The outputs of this process are so-called virtual twins, virtual representations of a physical system. And so in this virtual design space, design teams can experiment, they can innovate, they can improve the product, and all of that can be done before there's ever a physical prototype.
Keysight's portfolio is made up of our EDA business, which we have had for over four decades and which is focused on RF, microwave, and high-speed digital design; it also includes the communication, for example, between chiplets and heterogeneous designs. We recently added computer-aided engineering capabilities with the ESI acquisition, and with the Cliosoft acquisition, we also have tools to better connect and automate peripheral processes, such as design data and IP management. So once a virtual prototype is complete, it's ready for a physical realization, and then prototypes are tested and measured against these specifications. Once the specifications are met at a component level, components are integrated in a system. We go back up the V, we test that system against the specifications, and when that system is ready, it's released to manufacturing, and then it's built and tested in volume.
Now, the last concept on this slide is the hybrid twin. With the hybrid twin, we connect both sides of the V. We have data from the simulation domain, and we have data from the physical domain. As we go through a product development life cycle, we collect a lot of measurement data. We measure prototypes, we measure products, and so that data can be used to enhance virtual twins. So hybrid twins bring together simulated data and measured data and create better representations of physical systems. And of course, as these hybrid twins are used in simulations, these simulations become more predictive and more valuable, and as such, a hybrid twin is very valuable IP.
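To illustrate the hybrid-twin idea, the sketch below refines a simulated model parameter with bench measurements so that subsequent simulations track reality more closely. The function name, the linear gain model, and the numbers are all hypothetical, not Keysight's implementation:

```python
def calibrate_gain(sim_gain, measurements):
    """Refine a simulated gain parameter with (input, measured_output) pairs.

    Least-squares fit of output = gain * input; falls back to the purely
    simulated value when no measurement data is available yet.
    """
    if not measurements:
        return sim_gain
    num = sum(x * y for x, y in measurements)
    den = sum(x * x for x, _ in measurements)
    return num / den if den else sim_gain

# The virtual twin predicted a gain of 12.0; bench data suggests slightly less.
hybrid_gain = calibrate_gain(12.0, [(1.0, 11.5), (2.0, 23.4), (3.0, 34.8)])
print(round(hybrid_gain, 2))  # → 11.62
```

The point of the sketch is the direction of data flow: measured data flows back into the virtual model, making later simulations more predictive.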
That IP can also be used for subsequent product designs, and artificial intelligence can be applied here to create these twins and also to reuse the IP that already exists to generate a new design. Our strategic focus is to enable a shift left in product development with an increased emphasis on virtual prototyping, and we do this with our design engineering solutions. Now, this virtual prototyping becomes much more powerful when it's not only applied to the product, but also the processes and the workflows that are associated with that product. As such, it enables concurrent engineering, which further optimizes the product, not only in terms of its performance and cost, but its manufacturability, its serviceability, and ultimately, its time to market. So we've already talked about the product twin.
Here, we use an automobile as an example, and as mentioned before, you know, a good product twin, whether that is a virtual or a hybrid twin, accurately predicts the behavior of a physical system. For this example, simulations can show how this vehicle will perform under crash conditions, and as such, car safety can be improved before there's ever a car built. As a result, fewer physical tests are required, and they can be limited to the minimum that is needed for certification. Now, in addition to the product twins, we also have so-called process twins, which are virtual representations of a process. In this case, we're looking at a casting manufacturing process. As we all know, a chassis is made of different parts, and some of these parts are cast. Now, during the casting, there could be imperfections in that cast part.
There could be mechanical stress. It's very important to understand that and simulate it, so that we also know how that part is going to perform in the context of the system, the vehicle, in particular in safety. And then finally, we have workflow twins, which are virtual representations of workflows, and here we can apply virtual reality to look at a manufacturing process, as is illustrated here. So this is very powerful. Before, a physical prototype was needed so that teams could work on a manufacturing process to assemble that prototype. Today, that's no longer required. It can all be done in the virtual domain, and of course, as such, trade-offs can be made before there's ever a physical prototype available. So in shifting left, products, processes, and workflows can all be constructed at the same time.
So that makes concurrent engineering very powerful, as it creates efficiencies over the entire product development life cycle: engineering teams can make trade-offs between a design, a process, and a workflow, and optimize all of that upfront during the design phase. That will lead to better products, will reduce physical prototyping, will reduce risk, and will ultimately lead to better outcomes. The examples that I've used here are actually supported by the Keysight ESI portfolio. Now, that shift left in the product development life cycle and the implementation of concurrent engineering are profound and impactful changes in product development methodology. However, they are required, and they're driven by product complexity. Today's products are more sophisticated than ever. Products are smart and connected. They offer more features and functionality, and that functionality is more densely integrated.
There are many legal, safety, and regulatory requirements in most of our end markets, and at the same time, our customers deal with shrinking time to market. So the design engineering complexity is growing, and it's growing exponentially, and it's largely driven by Moore's Law. As an example, when you look at a wireless modem in a 2G, 3G handset, that typically has 150 requirements. You look at a wireless modem in a 5G handset, it has 1,500 requirements. As a result, there is a widening gap between complexity and the engineering workforce over time. First of all, there is the cost of engineering, which doesn't keep up with the complexity. The staffing doesn't keep up. It's also very difficult to find skilled talent, so there is a widening gap.
That gap needs to be closed with design engineering tools that help engineering teams manage complexity in a very productive way. Now, the design tools have come a long way. The first design tools date from the seventies and the eighties. At that time, they were replacing manual pen-and-paper exercises and were focused on very simple components. Over time, graphical user interfaces were added to manage more complex steps in a workflow, and today, we have integrated workflows end to end. When you think, for example, about a gallium arsenide IC or a module, Keysight offers an end-to-end flow, from system-level design all the way to a layout that can be realized in a manufacturing process. So we have these single-discipline systems, such as an electronic system or an RF module, and we take advantage of high-performance computing. That is the state of the art.
However, complexity keeps growing, and that really requires a next-generation digital transformation. Given the complexity of systems, there is a need for a hierarchical approach, a top-down approach, from systems to components. We've already talked about concurrent engineering to optimize the product development life cycle. Multiple disciplines, such as electronics and mechanics, are involved. Multiple physics are involved. This is all needed to reduce and eliminate prototyping cycles, so the more virtual prototyping, the more we can reduce physical prototyping. And of course, as we will see in the examples, there's lots of interdependencies that need to be managed.
Throughout the life cycle, a lot of data is generated that needs to be stored, tagged, and analyzed, and finally, we need to take advantage of artificial intelligence technologies to improve the productivity of design engineering teams and elevate their design intelligence. So the next frontier of this digital transformation focuses on these multi-discipline, multi-physics system workflows and on capabilities to connect and automate peripheral processes, such as data management and requirements management, and so we call that process and data management. So let's examine that complexity in a little bit more detail with a couple examples from different markets. The first one is a satellite. A complex product, such as a satellite, is made up of multiple systems, including a payload electronic system and a satellite mechanical bus system, and design engineering of such a system involves multiple applications.
In the lower half of the slide, you have a sample of the type of applications that are involved. Each of these systems is made up of subsystems and components and has a web of interdependent applications. For example, the payload electronic system has a transponder subsystem, and in that transponder subsystem, we have an RF power amplifier. That RF power amplifier connects with an antenna, a communication processing unit, and a power delivery unit. If we make a change in any of these applications, it likely has an impact on some of the other applications, so there's an interdependency between these applications. So engineering teams need to be able to collaborate and exchange data between these various applications. And at the same time, each of these applications has an application-specific workflow and its own life cycle. So this is a very complex setup for engineering teams.
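The interdependency described here, where a change in one application ripples into others, can be modeled as reachability in a dependency graph. Below is a minimal sketch; the component names follow the satellite example from the talk, but the specific edges are illustrative assumptions:

```python
from collections import deque

# Illustrative dependencies: an edge A -> B means "a change to A impacts B".
DEPENDS = {
    "rf_power_amplifier": ["antenna", "power_delivery_unit"],
    "antenna": ["comm_processing_unit"],
    "power_delivery_unit": [],
    "comm_processing_unit": [],
}

def impacted(changed: str) -> set:
    """All applications that may need re-simulation after `changed` is modified."""
    seen, queue = set(), deque([changed])
    while queue:
        node = queue.popleft()
        for nxt in DEPENDS.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

print(sorted(impacted("rf_power_amplifier")))
# → ['antenna', 'comm_processing_unit', 'power_delivery_unit']
```

A breadth-first traversal like this is one simple way engineering teams can answer "what else do we need to re-simulate?" when one application in the web changes.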
So let's take a closer look at this for the RF power amplifier. If we really want to understand how an RF power amplifier works, it involves a lot of different physics, and so predicting the performance of a power amplifier requires multi-physics simulations and a lot of application know-how. Typically, a design engineering effort will start with an electronic design, and that might involve some RF and microwave simulations. But in a power amplifier module, there will be interconnects and there will be packaging, and so we need to understand the physical effects associated with those, and that requires an electromagnetic simulation. A package needs to be designed in a way that it's robust enough to withstand mechanical stress and vibrations, so that requires a mechanical analysis and a vibrational analysis. There are also thermal effects.
There's conduction and convection, which have an impact on the RF performance, so that needs to be analyzed as well. So as you can see, there are multiple physics in play here, and they are not small effects. They actually determine how the RF power amplifier is going to operate, and so these physical effects need to be simulated, and they actually need to be co-simulated because they depend on each other. If the temperature goes up in a power amplifier under operation, its electrical characteristics are also changing. So for that power amplifier, in order to design it, we have an application-specific end-to-end workflow. It starts with a system-level design, where we come up with high-level specifications. Then, we go into a more detailed design, where we create a schematic, and we can simulate it.
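The electro-thermal coupling mentioned here, where temperature shifts the amplifier's electrical characteristics while dissipated power in turn raises the temperature, is commonly resolved with a fixed-point co-simulation loop: alternate the two domain solves until they agree. A toy sketch, with deliberately simplified linear models whose coefficients are illustrative and not from any real device:

```python
def cosimulate(p_in=1.0, ambient=25.0, tol=1e-6, max_iter=100):
    """Alternate electrical and thermal solves until the two domains agree."""
    temp = ambient
    for _ in range(max_iter):
        # Electrical solve: gain drops slightly as temperature rises (toy model).
        gain = 10.0 - 0.02 * (temp - ambient)
        p_dissipated = p_in * gain * 0.3          # fraction of power lost as heat
        # Thermal solve: temperature rise proportional to dissipated power.
        new_temp = ambient + 8.0 * p_dissipated
        if abs(new_temp - temp) < tol:
            break
        temp = new_temp
    return gain, temp

gain, temp = cosimulate()
print(f"converged gain = {gain:.2f}, junction temp = {temp:.1f} C")
```

Real multi-physics solvers are vastly more sophisticated, but the structure, iterating coupled domain solvers to a self-consistent operating point, is the essence of why these effects "need to be co-simulated because they depend on each other."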
At this point, we can already take into account the different technologies we're using, if it is a module. At some point, when that abstract simulation is completed and we believe we're on the right track with this design to meet the specifications, we go to a layout realization. We can design the printed circuit board on which the different components are mounted. We can design the integrated circuits that are part of the amplifier, and the packaging that's being used. And we have a physical realization in terms of a layout. Once that physical realization is there, we can also run physical simulations. We can look at electromagnetic effects. We can look at thermal effects. We can look at the coupling between the two. And ultimately, before we can go through a realization, a fabrication, we also need to characterize the manufacturing processes.
So if we have an integrated circuit that will be developed by a foundry, we need to characterize that foundry process. So as you can see in this illustration, there are a lot of steps, end to end, to create this power amplifier design and have a predictive virtual twin, and so this requires a lot of application-specific know-how and simulation technologies that are tuned for this application, and that's really our DNA. That's what we've been doing for several decades. Let's take a second example: an electric and autonomous-drive vehicle. There are a lot of similarities with the previous example. There is a system hierarchy. There are different engineering disciplines, such as electronics, mechanics, and electrical, and different applications. So you can see that we have a number of these systems and applications that come together to support that system.
In the case of the unibody system, one of the key applications is crash and safety. A physical test of a car, a car crash, can cost as much as $1 million per test, and typically, for a new car, several dozens of tests are required, so this is extremely expensive. Now, if car safety requirements are not met, it will trigger redesign and rework, and so we all understand that this is very costly. So virtual prototyping of a vehicle, simulating car crash conditions, is obviously of very high value. It does require, again, multi-physics. When we look at a car crash and safety, we start with a dynamic structural analysis of the vehicle. We all know that when a car hits a wall, it will deform, so we need to simulate that deformation.
Now, in order to do that well, we need to characterize the manufacturing processes behind the construction of that car. We need to understand the casting process and the imperfections that that introduces. We need to understand how plates are welded and how they will perform when they come under stress. All of that needs to be done. So we may also want to look at an airbag, and how that airbag, you know, explodes and fills up and protects passengers. These are different physical effects that all take place when a car is subjected to a crash condition. Again, there is a very complex end-to-end workflow associated with that.
We start with the design of the vehicle. We characterize different manufacturing processes, such as stamping, casting, and welding, and then we look at various use cases, not only the full vehicle crash simulation. We can use that to look at water management when a car, you know, drives through water. We can look at misuse scenarios. We can zoom in on very specific issues, such as the behavior of an airbag. And then, of course, we really need to have the underlying manufacturing processes well-characterized. So again, this is something we understand very well. We have the application expertise, the simulation technologies, and we can work with our customers on an automated workflow.
So as you've already gathered from these examples, our focus is on high-performance applications, where the behavior and the performance of the product are determined by physics. Usually, there is more than one type of physics involved. It's not purely electronic, and it's not purely mechanical. A number of physics are in play. Our focus in our design engineering organization is to develop these multi-physics automated workflows. As you can see here, we typically start with a product stimulus. A stimulus, which could be an electrical signal, is applied to a product, and then we have different physics simulators that we can apply to analyze the behavior and the performance of that product. For example, in the case of your wireless handset, we may be looking at signal quality.
In the case of a car, we may be looking at how the car behaves in a crash condition and how it impacts the passengers. So we are focused on high-performance use cases, and you have an illustration here of a number of the use cases that we cover. When we think about high frequencies, this could be sub-terahertz frequencies, several hundred gigahertz, for example, for 6G devices. When we think about high-speed data, we're looking at data centers and communication in data centers, where we have 800-gigabit data rates. Lots of physical effects occur there. We've already talked about the high impact in a crash condition. We can look at vibrational analysis, for example, when a satellite is launched into space: will it survive that launch? We can look at power delivery from batteries.
We can look at very complex processes, such as the assembly of a vehicle, and do that with virtual reality tools. We can look at highly dynamic mechanics in, for example, airplanes. That is our focus: these high-performance use cases, where we bring together best-in-class simulation technologies and integrate them in automated workflows with deep application expertise. We leverage that across our Keysight end markets in automotive and energy, aerospace, defense, wireless, and then wireline networks and data centers. So here we have an overview of our portfolio. I've already given a few examples, but for the sake of completeness, this is our entire portfolio. First of all, we have the Keysight core part of the portfolio that's been around for several decades. As you know, that's been focused on RF, microwave, and wireless applications.
We also have high-speed digital and photonics, including heterogeneous designs and chiplets, and then we have communication system tools. With the acquisition of ESI, we have added system solutions for electromechanical systems, mechanical product performance, and the evaluation of manufacturing processes, and then we also have virtual reality tools to validate workflows. And then with the addition of Cliosoft, we also now have capabilities to automate and connect processes with design data and IP management. Our overall addressable market is $2 billion. These markets have very favorable secular growth trends. We allow our customers to efficiently innovate and use the physics-based tools that we have to accurately predict how their products will perform in the real world. So in conclusion, Keysight supports its customers over the entire engineering lifecycle with design, emulation, and test solutions.
Our portfolio is shifting left, with an increased emphasis on virtual prototyping with our design engineering solutions. We have very favorable trends, megatrends, that support our business. There is the growing product complexity and the shorter innovation cycles, and that's driving the need for next-generation workflows, requiring multiple disciplines, multiple physics. We are focused on those applications where characterization and performance analysis requires one or more physics. We have the best-in-class technologies, the deep application expertise, and we can bring it all together in end-to-end automated workflows. We have made a number of acquisitions in recent years. We have continued to make organic investments as well, so it positions us very well to be a strategic partner with our customers and help them digitally transform.
So finally, as a result of the differentiation in our portfolio, the unique capabilities that we offer, our design engineering portfolio is a high-growth engine for our business, with very strong margins and mostly recurring revenue. Thank you very much, and that brings me to the end of my presentation. At this point, I'd like to turn it back over to Samik.
Niels, thank you for that, and that's a great presentation on not only the relevance, but also the complexity of the software capabilities you have. I'll start you off with something much more, sort of from the early part of this presentation, and then we'll sort of build through and go through some questions across all the slides. But before I do that, again, to the audience, if you have a question, you can use the Q&A function in the webcast that you're using, or you can email me the question as well. As a side note, the Keysight team will not be taking any questions on the Spirent acquisition, so that's one topic we're not entertaining questions on today.
Now, to start you off, Niels, I want to go back to slide 4, where you talked about 30% of the revenue being derived from design engineering software in fiscal 2023. I just want to see if you can outline for us how the company has made progress in relation to positioning for this addressable market opportunity over the years. For example, what was the exposure to design engineering in FY 2014? How did the exposure increase, both organically and inorganically, over the years?
Yes, yeah. Yes, Samik, thanks for the question. So, as I mentioned in the slide, the current makeup of our portfolio is instrument software is 25%, the test solution software is 45%, and as you said, design engineering is 30%. It was a very different makeup 10 years ago. If you look at the start of Keysight in 2014, the focus was really on the instrument software, and that was the majority of the business. And in addition, we had our electronic design automation tools, primarily focused on RF and microwave. And we've obviously continued to invest in the instrument software, but the focus has really been on the test solutions and building out that position. That's actually over $500 million today.
As already mentioned, we've made a couple acquisitions with Ixia and Anite, and really built out a portfolio, including protocol layer emulation. So if you look at our history, typically, our focus was on physical layer tests, but we've really gone up the stack with protocol emulation capabilities. So today, when a customer wants to look at a handset and how it operates in the context of a network, we have the test solutions to do that. Now, when it comes to design engineering, we have made both organic and inorganic investments. On the organic side, we have invested heavily in high-speed digital communications. And the reason for that is that as these digital systems go to higher data rates, higher frequencies, they're no longer binary zeros and ones. Now, physical effects become really important.
And when you think about the communication between chiplets, you really have to look at the degradation of a signal as it goes from one chiplet to the other, and that's where our tools come in. And so we have enjoyed a lot of growth in high-speed digital. We've also developed solutions for new markets, like quantum and photonics, so very significant organic investments and addressing new applications. And then, of course, we've also made a couple inorganic investments that have allowed us to step outside of electronic design. With the acquisition of ESI, we have computer-aided engineering capabilities in the portfolio. With the acquisition of Cliosoft, we also now have capabilities that help us connect and automate, you know, processes.
As customers work on more complex designs, design data management, IP management, becomes much more critical, and as such, that portfolio has also grown considerably. So definitely a shift towards test solutions and design engineering, and so we also expect that to continue in the foreseeable future.
Perfect. Very helpful. When investors want to think about the growth opportunity between design engineering, test solution software, and instrument software, how should they think about the right growth vectors for the software opportunities in each of those? I mean, maybe address it either in terms of growth expectations or the relative difference in growth expectations for those sub-segments.
Yes, yeah. Yes, thank you for the question. So overall, we expect the Keysight end markets to grow 4%-6% over a cycle, over a longer period of time, and of course, that can vary from year to year. Given our position of differentiation, and that actually starts with hardware differentiation, and then also differentiation in software, the application know-how and expertise that we have, you know, we believe that we can grow 5%-7%. Within the software portfolio, that will likely vary quite a bit, where we see more growth in test solutions and design engineering. Actually, if you look at the design engineering end markets, the growth rates are typically in the high single digits.
And so our objective is to grow with or better than these markets, and we do that by picking application verticals where we have best-in-class simulation technology, deep application expertise, and we can offer our capabilities to a customer in an automated workflow, which is really what customers need to be productive, predictive with their virtual prototyping efforts.
I'll move to slide 5. You outlined the process from simulation to manufacturing that Keysight supports its customers through. Can you share what a typical timeframe looks like for a customer to move from simulation to manufacturing? And does that timeline vary depending on which end market the customer is in?
Yes. Thank you. So it varies quite a bit. If we look at consumer products, a handset, a smartwatch, a car, these products go through refresh cycles on an annual basis, and there may be updates in between. And that's already an indication of product development life cycles, which can be as short as a few months to a year. In other markets, think about aerospace and defense, highly regulated, highly consequential if something goes wrong with the product, the life cycles tend to be a lot longer, can be several years, given the complexity of system integration and verification. So the cycles in these markets are a lot longer. But independent of the market we're looking at, all of our customers are looking for ways to compress the product development life cycle.
It's very typical for us to hear from customers that they want to reduce a product development life cycle by 20% or even 50%. In particular, the verification of the design and the product is where a lot of time is spent. There's a lot of risk associated with that. There's a lot of iterations, and that is where customers see opportunity to considerably compress the cycle, and I think that's going to continue. I also hear from customers that they're looking for efficiency breakthroughs. We're not talking about 10% or 20%; we're talking about being 2, 3, 4 times more efficient.
So if you think about a semiconductor company developing systems on chip with the same workforce, they want to be able to bring to market 2, 3, 4 times as many SoCs in the same amount of time. In the case of Keysight, we focus on our product development life cycles as well. We're very much dependent on hitting market windows, bringing best-in-class capabilities to market. And so in recent years, we have also reduced our product development life cycles by 50%, and we have done that by using a variety of tools, including our own in-house tools, creating application-specific workflows, and automating them end to end. Now, notwithstanding the progress that many customers have made, and that we have made, there continues to be room to improve the product development life cycle.
For example, in the case of Keysight, we think we can make another 50% improvement by further automating, you know, peripheral processes.
I just want to stick to that slide a bit more here for the next couple of questions. Just wondering, when you think about a typical customer for the software design engineering solution relative to a customer for test solution software, how would you outline what that sort of typical customer looks like between the two? How are they different? How much of your customer base do you consider to be an addressable market for the design engineering solution? And how to think about progress in sort of getting more customer adoption within the customers you already work with?
Yes, thank you. So first of all, the customer base for our test solutions is larger than for our design engineering tools, and that has to do with the fact that our test solutions are available across the entire product development life cycle: not only in product development, but also in manufacturing and even in installation and maintenance. So there's a much broader set of use cases associated with our test equipment portfolio. But when we zoom in on product development, where we have virtual prototyping preceding physical prototyping, that's where we see the combination of our tools. So between our design engineering software and our test equipment, we have a lot of common customers.
Customers that go through an entire product development life cycle are typically going to use tools on both sides of the V from Keysight, design and simulation tools, and test tools.
Maybe, when you think about that simulation-to-manufacturing flow with the customer, how do you think about competitive threats? If a customer is using one and not the other, what do you typically find is the reason?
Yes, that's a good question. So covering both design and test is a unique position for Keysight, and a lot of our customers value that. But of course, there are alternatives on both sides of the V. And so where customers make a decision for a particular tool, a point tool, they'll compare our products with alternatives, and they will look at the product that best fits their needs. In some cases, they may have a dual-vendor strategy, and they'll buy from us and another vendor. However, the trend that we see is that customers are more and more interested in moving away from point tools into workflow solutions, and that's where we offer a great benefit, as we can connect the various stages of the life cycle.
For example, when we perform a signal analysis in the virtual domain, the measurement IP that we use there is the same measurement IP that we use in the test domain. So think about this. You're a product development team. You design and simulate, and you think you have a design that meets the specifications. You have tested that design. You have used Keysight measurement IP to test it. Now you build it. You want, of course, correlation between the physical prototype and the virtual prototype. It takes the same measurement IP to do so. So we also help our customers ensure consistency in how they perform measurements. That trend will continue, and I think it's a favorable trend for us, helping us become more of a preferred supplier to our customers.
I mean, that brings me to the last question on that slide itself, trying to quantify what that impact is. Think of it either as an impact on the win rate or as revenue synergies. You talked about this move to the left: when you start seeing engagement with a customer on the simulation side, how do you quantify the synergy that provides in terms of the win rate with that customer, or the synergy with the rest of the portfolio?
Yes. So there are clearly synergies for us. When a customer is working on a new technology, let's say 6G, chiplets, or new safety regulations for an automobile, they will start in the virtual prototyping space, and that's where they use our tools. We develop measurement capabilities, even compliance measurements, for virtual prototyping, and we get to know the customer and understand their applications better. And so that expertise can then also help us with the test solutions. So even if customers make independent decisions between design and test tools, the fact that we are in the design space, that we are working with customers there, helps us also on the test side.
As I mentioned before, having consistent measurement IP, whether we're dealing with a virtual prototype or a physical prototype, is of great value. The customer can trust our measurements, and so as they compare simulations and tests, they know that they have performed the measurements in a consistent manner. As I said, that is a trend that will continue. That's also why we're focused on bridging the domains from virtual to physical prototyping with design and test verification flows. So we believe there is going to be more downstream leverage: as customers use our design tools, they will then follow on and use our test equipment.
A question that I expect to get from investors quite a bit is, in relation to simulation solutions from EDA companies, can you talk about how Keysight differentiates relative to other EDA companies like Synopsys or Cadence Design Systems?
Yes. So as a first approximation, the focus of Cadence and Synopsys is on digital systems, or systems that largely have digital content. And in a digital system, we abstract out the physics. We're talking about zeros and ones. It's a zero or it's a one, and all the signal processing is based on zeros and ones. In our world, we deal with physics. We deal with a radio signal that goes from a handset to a base station. We deal with components whose behavior depends on different types of physics, and so it requires physics simulations to predict the performance and the behavior of the components. So these are very different.
Now, there are systems that combine digital, analog, and RF, and in that case, customers will put together workflows that are based on various tools. So some customers will use products from Cadence and Keysight or from Synopsys and Keysight, and that's the world we live in. We live in a world where we support an open ecosystem. Our tools can collaborate with other tools, and we focus on the applications where we have the best simulation technologies, the deepest application know-how, and where we can offer integrated workflows to our customers.
A follow-up question on this, particularly, a question that's coming from an investor: How does the acquisition of Ansys by Synopsys change the competitive landscape for Keysight in simulation, particularly in chip design simulation?
Yes, I mean, the acquisition reflects the mega trend of bringing different physics together. And you see that behind the moves that we have made as well with the acquisition of ESI. Today, we are already partnering with Synopsys and Ansys, and so when you think about next-generation RF and microwave designs, this partnership illustrates that we all have best-in-class capabilities, and I expect that to continue. The designs that our customers are working on are very complex. They want to be able to use the best tools that are in the market, and they are asking for an open ecosystem that allows them to bring together tools from different vendors.
I think that that partnership that we have today with Synopsys and Ansys demonstrates that very well. In some cases, there's overlap between tools where customers have a choice, but in general, we believe that by collaborating and continuing to collaborate, that we can offer our customers industry-leading workflows.
Moving to the next topic, going back to slide seven, I think, where you outlined the mega trends that drive the business, including smart products, more features, regulatory requirements, and time to market. Just curious how we should think about what leads customers to adopt Keysight's design engineering solutions: which of these mega trends that you've called out is the biggest driver for customers to move left with you on the portfolio?
Yeah, a good question. So it all comes back to complexity, and the way complexity manifests itself is as a combination of more physics, more complex physics, driving the performance and the behavior of a component, and interdependencies between different types of physics. And it's that interdependency that is very complex. It's one thing to have a complexity where you now have three physical effects instead of two. It's very different when you look at interdependencies. If I go back to the example of an RF power amplifier, a long time ago, that was an electronic simulation. It was a single-physics simulation. Now, you have thermal effects, you have mechanical effects, you have airflow that you have to deal with, and they all interact with each other.
You cannot just take them apart and say, "I'm going to look at one physical effect at a time and add them up, and that's going to tell me how the power amplifier performs." They interact. When a power amplifier heats up, when your phone heats up, it has an impact on the electrical performance, so we need to be able to co-simulate, and that drives a substantial increase in complexity. Simulation times can become much longer. We need to develop capabilities to accelerate these simulations. It all starts from complexity that drives more physics, more interactions, more complex simulations, and then more complex workflows. We need to continue to work on this in ways that customers can manage that complexity.
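To make the interdependency concrete, here is a minimal, purely illustrative Python sketch of a coupled electro-thermal loop. The models, coefficients, and function names are hypothetical (this is not Keysight's tooling): a toy electrical solver whose gain degrades with temperature is alternated with a toy thermal solver until the junction temperature settles, which is the simplest form of co-simulation.

```python
# Illustrative sketch (hypothetical models): the electrical behavior
# depends on temperature, and temperature depends on dissipated power,
# so neither single-physics solver can be run alone.

def electrical_sim(temp_c: float) -> float:
    """Toy electrical model: dissipated power (W). Gain degrades as
    the device heats up; all coefficients are made up for illustration."""
    gain_db = 20.0 - 0.05 * (temp_c - 25.0)   # gain drops when hot
    p_out = 2.0 * (gain_db / 20.0)            # output power scales with gain
    p_in, p_supply = 0.5, 3.0
    return p_in + p_supply - p_out            # crude dissipation estimate

def thermal_sim(p_diss_w: float, ambient_c: float = 25.0) -> float:
    """Toy thermal model: junction temperature from a fixed
    thermal resistance of 40 C/W."""
    return ambient_c + 40.0 * p_diss_w

def cosimulate(tol: float = 1e-6, max_iter: int = 100):
    """Fixed-point iteration: alternate the two single-physics solvers
    until the temperature stops changing between passes."""
    temp = 25.0
    for _ in range(max_iter):
        p = electrical_sim(temp)
        new_temp = thermal_sim(p)
        if abs(new_temp - temp) < tol:
            return new_temp, p
        temp = new_temp
    raise RuntimeError("co-simulation did not converge")

temp, power = cosimulate()
```

The loop converges here because each pass only mildly perturbs the other domain; production multi-physics solvers use far more sophisticated coupling and acceleration schemes, which is part of the complexity Niels describes.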
Maybe the second part to that: talk about how you think about content uplift on design engineering solutions with the customer beyond the initial adoption. You get in with a customer, they start using your portfolio on the simulation side, but then what becomes the biggest driver of continued content uplift with that customer? Is it purely the testing times, or are there other drivers you would point to that drive that content uplift over time?
Yeah, so let's say that a customer decided to adopt our tools when they were working on a 4G phone, an LTE phone. And now they move from 4G to 5G or to 6G. Any time they go from one generation to the next, they're not only changing generations, they have to continue to support the older generation as well. So when you look at a phone of today, it's not supporting a single cellular standard, it supports multiple cellular standards. It doesn't support a single Wi-Fi standard, it supports a collection of standards that were developed over time. And then we have new standards, such as Near Field Communication. So the number of standards keeps growing.
So a customer may start with our tools and cover a number of communication standards, and as they work on next-generation products, they will need to continue to support those standards, and more standards will be added. So it's really additive: more technologies, more standards, and then, of course, you have the interaction between all of these standards. When you think about a handset of today, it receives a lot of different signals, cellular, Wi-Fi, and so on, and it needs to be able to take these signals apart. It needs to be able to look at the signal that's of interest. That has an impact on the front end of that phone, and that is a much more complex design.
That increased complexity drives the need for physics-based simulations, and these simulations are becoming much more complex, and that drives the adoption of our tools and the number of simulations that are done with our tools.
I have three more questions, so let me try and squeeze them in here before we end at the top of the hour. You gave examples of multi-discipline systems in autos and communication. I'm wondering, what does an industrial example look like? What could be some of the opportunities on the industrial side?
Yeah, that's a good question, and it's actually a growing market. When we think about a smart home, we have lots of different radios in our house. It can start at your front door, it can be your fridge, your washer. They're all smart and connected instruments. They have sensors, they have radios, so there are lots of opportunities associated with the smart home. Same for smart cities, where we have different types of sensors and communication. Healthcare is another market. So we see a number of other end markets where we have smart and connected instruments, which involve physics, require multi-discipline, physics-based workflows, and represent growth opportunities for us.
Don't want to miss out talking about AI and the sort of data center build-outs we are seeing relative to that. How are you thinking about design engineering being more relevant for end markets that are influenced by AI right now? How do we see that drive demand?
Yeah, so that's actually a significant growth opportunity for us. When we think about AI, we know that it takes a lot of data to train and refine models. That data needs to be processed. That data is transferred between a computer and a data center over a network. So all these components of a network and a communication system have to deal with vast amounts of data, at higher data rates and higher frequencies, and we know that when we go to higher frequencies and higher data rates, there will be physical effects. And when we transport data from one chiplet to the next, there will be distortion of that signal, and so physical simulations are required.
So it's a major growth driver for us, and some of the tools that we have developed are focused on modeling that communication, so that's one growth opportunity. At some point, the data is no longer transferred in the form of electrical signals. Systems are built based on photonics, and now we deal with a totally different type of physics, and so that represents another growth opportunity for us. So AI as an application, with the vast amounts of data associated with it and the need to process and transfer that data, is a major trend for us, and it's been driving the growth of our portfolio and will continue to do so for a while.
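As a toy illustration of the signal distortion described above (hypothetical code, not one of Keysight's tools): the sketch below pushes a random bit stream through a first-order low-pass "channel" standing in for a bandwidth-limited chiplet-to-chiplet link, and measures how much the received eye closes when the channel is slow relative to the data rate.

```python
# Illustrative sketch: at high data rates the interconnect acts like a
# low-pass filter, so fast edges smear into neighboring bits and the
# eye opening at the receiver shrinks. All parameters are made up.

import random

def transmit(bits, samples_per_bit=8, alpha=0.3):
    """Send a bit stream through a first-order low-pass channel,
    modeled as exponential smoothing; smaller alpha means a more
    bandwidth-limited channel relative to the bit rate."""
    signal, state = [], 0.0
    for b in bits:
        for _ in range(samples_per_bit):
            state += alpha * (float(b) - state)
            signal.append(state)
    return signal

def eye_opening(signal, samples_per_bit=8):
    """Sample each bit at its center and report the worst-case margin
    between received ones and zeros (positive means the eye is open)."""
    centers = signal[samples_per_bit // 2::samples_per_bit]
    ones = [s for s in centers if s > 0.5]
    zeros = [s for s in centers if s <= 0.5]
    return min(ones) - max(zeros)

random.seed(0)
bits = [random.randint(0, 1) for _ in range(200)]
fast = eye_opening(transmit(bits, alpha=0.6))  # fast channel: open eye
slow = eye_opening(transmit(bits, alpha=0.2))  # slow channel: degraded
```

Real channel simulation accounts for reflections, crosstalk, and frequency-dependent loss rather than a single smoothing constant, but the qualitative effect (inter-symbol interference closing the eye) is the one Niels refers to.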
On the flip side, how does AI influence your customers' ability to build, call it, a design engineering solution in-house, rather than rely on your solution? How do you think about AI and the overall productivity gains in coding, et cetera, that we're seeing? How does that play into customers' capabilities being higher as well, as they leverage AI?
Yes, so if you look at our customers, they see the transformative power of artificial intelligence, and for them, it all comes down to: how can they increase productivity and design intelligence? And so it's our opportunity, as a supplier of design engineering tools, to help them with that. We do that in a number of ways, and we have lots of experience in that space. It starts with modeling. We have a number of applications where we start from data that could be simulated or measured. We build models, and we verify those models, and we use those models in simulation. We've done this for a long time around device characterization, so we have a lot of expertise in developing models using artificial intelligence, in this case, neural networks. That's a capability that we can supply to our customers.
That way, they can use their own data, which they may consider proprietary and not be willing to share with anyone else, but we provide them the tools to create their own models. That's the transformative power of AI/ML. Artificial intelligence can also be used to speed up complex simulations or perform global optimization, and in the context of multi-physics, multiple disciplines, that's absolutely critical. Customers are solving problems that are too complex for the technology we have today, and artificial intelligence can help. Again, more tools that we can supply to our customers. And then when it comes to productivity, customers want to have information at their fingertips. They are looking for intelligent chatbots, which are based on our data; it could be based on all of the learning material that we have. Customers may want to augment it with their own.
So that's another area where we can work together. We can create generative capabilities, where customers can tap into our IP or their own IP, and from there, generate new designs. So there's a lot that customers can do with artificial intelligence. Their focus is still on building their products, accelerating time to market, improving performance. Our opportunity is to provide them with the toolkits to do that, and that's going to be a different world. It's still going to rely on best-in-class simulation technology, it still takes our deep application expertise, and we still need to do it in a way that customers can create end-to-end workflows. It does require an open system, a modular approach, standard interfaces, and scripting capabilities so customers can tailor it.
They can maybe add some of their own AI and ML capabilities, but in general, this represents a set of new opportunities for us to help our customers make breakthroughs in their product development life cycle.
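The modeling workflow Niels describes (fit a model to measured device data, verify it, then use it in simulation) can be illustrated with a toy example. This is a hypothetical sketch in plain Python, not Keysight's modeling tooling: a one-hidden-layer neural network is fitted by stochastic gradient descent to a synthetic diode-like I-V curve standing in for proprietary characterization data.

```python
# Illustrative sketch: train a tiny neural network on "measured"
# device samples so the fitted model can stand in for the device in
# simulation. Data, network size, and learning rate are all made up.

import math
import random

random.seed(1)

# Synthetic measurement data: voltage -> current, diode-like curve.
data = [(k / 10.0, math.exp(0.2 * k) - 1.0) for k in range(11)]

H = 6  # hidden units
w1 = [random.uniform(-1.0, 1.0) for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-1.0, 1.0) for _ in range(H)]
b2 = 0.0

def predict(v):
    """Forward pass: one tanh hidden layer, linear output."""
    hidden = [math.tanh(w1[j] * v + b1[j]) for j in range(H)]
    return sum(w2[j] * h for j, h in enumerate(hidden)) + b2, hidden

def mse():
    """Mean squared error of the model over the measured samples."""
    return sum((predict(v)[0] - i) ** 2 for v, i in data) / len(data)

lr = 0.01
loss_before = mse()
for _ in range(3000):                      # stochastic gradient descent
    for v, i in data:
        y, hidden = predict(v)
        err = y - i
        for j in range(H):
            grad_h = err * w2[j] * (1.0 - hidden[j] ** 2)
            w2[j] -= lr * err * hidden[j]  # output-layer update
            w1[j] -= lr * grad_h * v       # hidden-layer updates
            b1[j] -= lr * grad_h
        b2 -= lr * err
loss_after = mse()
```

The point of the pattern is the one Niels makes: the customer's raw measurements never have to leave their hands; only the fitted behavioral model is used downstream in simulation.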
Okay, I think we're right on time here, so I'll wrap it up there, Niels. Great presentation, and thank you for this engaging tech talk, and thank you to the audience as well for tuning in. Thank you.
Thank you so much.
Hope to-
Thank you.