Good afternoon, everyone. Before we begin, I'd like to remind everyone that during the course of this investor meeting, Synopsys will discuss forecasts, targets, and other forward-looking statements regarding the company and its financial results. While these statements represent our best current judgment about future results and performance as of today, our actual results are subject to many risks and uncertainties that could cause actual results to differ materially from what we expect. In addition to any risks that we highlight during this meeting, important factors that may affect our future results are described in our most recent SEC reports and investor presentations. In addition, we will refer to certain non-GAAP financial measures during the discussion. Reconciliations to their most directly comparable GAAP financial measures and supplemental financial information can be found in the presentations and 8-K that we released earlier today.
All of these items, plus the most recent investor presentation, are available on our website at www.synopsys.com.
Please welcome Synopsys Senior Vice President of Investor Relations, Trey Campbell.
Thanks to everyone who's joined us, all the friends who have come to join us here in Santa Clara, and all the folks online. We really appreciate you spending your afternoon with us today. We're excited to share the opportunity we see in front of us and how we're gonna go harvest it. First, let me give you a quick breakdown of the agenda, and then we'll go from there. Sassine is gonna walk us through the unprecedented opportunity we see in front of us and how we're gonna go realize it. After Sassine, we're gonna have Shankar come up.
He's gonna tell us about all the innovations happening in EDA and how those are gonna help us solve a lot of our customers' problems. Following that, Ravi is gonna come up to present our system strategy and how the combination of Synopsys and Ansys is gonna create tremendous capability for our customers in our Silicon to Systems strategy. Then we're gonna go to John, who's gonna talk about our differentiated IP business and how we're gonna continue to deliver market-leading growth there. And then Shelagh is gonna come up, tie all those things together, and talk about the implications for our financials and our long-term guidance. After that, we're gonna do a Q&A with Sassine and Shelagh, and it's gonna be a great day.
One more thing I wanted to tell you: I know you like to hear us talk about why we think we're positioned to win, but I think you'll appreciate hearing it from our customers as well. So after each of these sessions, we're gonna play a customer testimonial video. We're gonna start with our great customer and partner, AWS. Thank you.
From day one, when we established Annapurna Labs, which later became the nucleus of custom silicon development within AWS, we aligned with our customers as well as our partners, like Synopsys, around a common vision: accelerating time to market and improving productivity, with the key goal of delivering new product innovation to our end customers. It all started with aligning on innovative, high-quality interface IP and creating a long-term alignment and a successful partnership. The key principle in this collaboration was allowing AWS to focus on creating our own differentiated technology, while Synopsys provided high-quality, on-time IP for our industry-standard interfaces. What we really wanted was to accelerate product development while chips and the underlying semiconductors were becoming more and more complex. One of the ways we worked very closely with Synopsys was using the elastic and scalable AWS infrastructure to scale our compute.
Sometimes we had to scale our EDA compute usage up to five times or more over our baseline within a few hours. Synopsys EDA tools have been optimized for AWS's powerful and scalable infrastructure. Synopsys has been an early adopter of AWS Graviton processors, enabling us to run Synopsys EDA tools and IP on Graviton. As a matter of fact, our latest Graviton chips are being developed on our existing Graviton chips using Synopsys EDA tools. At the end of the day, it's a win-win on all sides: cost, performance, and energy efficiency.
Please welcome Synopsys President and Chief Executive Officer, Sassine Ghazi.
Thank you. Thank you. It's so great to have you here. I was looking back: the last time we had this meeting was five years ago, and so many things have happened in the last five years. If I'm not mistaken, we roughly doubled the size of the company. We started a new business with the Systems Design Group, and I'll be describing a little bit about it.
... We decided to explore strategic alternatives for SIG. We decided to make a small M&A with Ansys. What else have we done in the last five years? We transitioned to a new CEO. So many things have happened, and we haven't had a chance to talk to you yet. So I hope through the next number of presentations, you'll get a sense of how we're seeing the world, how we're seeing our company, the strategy, the investments we're making, and why we're so excited about the opportunities we have. I have three sections in my presentation. The first section is where we've been. The second will go through the market. The third is specific to Synopsys.
So in terms of where we've been, we like to think of our mission as empowering technology innovators everywhere, with a purpose to empower innovation today that ignites the ingenuity of tomorrow. The value proposition we provide to our customers is to maximize their R&D capabilities to deliver their innovation and multiply their productivity. You're gonna hear us talk quite a bit about how Synopsys technology is empowering our customers' innovation. Our journey: we started the company 37 years ago with synthesis, which truly changed the way digital chip design was done. It drastically improved productivity, quality of results, et cetera, at the time. Since then, we have continued along the EDA journey to develop an end-to-end platform for our customers.
We started our IP business 25 years ago; today it's roughly 25% of our revenue. It started as small components that were needed inside our own EDA products, our synthesis products, so that customers did not have to build those small components themselves; they could just pick them up from Synopsys to achieve the best outcome for their design. Since then, of course, we've built it into something super exciting and mission-critical for our customers, along with EDA, as they develop their future chips. The Software Integrity Group we started roughly 10 years ago. We made a number of investments, both organic and through consolidation, and we became the leader in application security testing. With systems, we started with...
If you recall, about three to four years ago, we started talking about SLM, Silicon Lifecycle Management, which comprises critical components that we sell to our customers as IP to integrate inside their chip design. These are monitors and sensors that they put inside their chip to monitor its health throughout its life cycle, from pre-silicon to manufacturing to post-silicon, as it sits in the field. And we formed SDG, the Systems Design Group, roughly 2-2.5 years ago, as we started seeing our customers heading to Silicon to Systems. And of course, there's the acquisition of Ansys, which we announced and which I'll describe: how does that fit in the overall silicon and system picture and the bridge in between? Now, the graph on top is something beautiful to watch.
The industry was growing mid-single digits for about a decade. Then it moved to double digits for the following decade. And from 2020 to 2023, look at the acceleration in growth: a 17% CAGR, well above market TAM growth, which we were able to achieve with our portfolio. Now, this is a bragging slide. If you look at public companies, both in semiconductor and software, and you set the criteria: companies of $5 billion+ in revenue, 10%+ next-12-month growth, 35% operating margin. Believe it or not, out of hundreds of public companies, that filters down to only seven companies in that bracket.
Seven companies, and we're there on the semiconductor side, with the likes of Broadcom, NVIDIA, and TSMC, and on the software side, with the likes of Adobe, Microsoft, and Intuit. It's a rare combination of scale, growth, and profitability, and we're committed to staying in that class and continuing to lead our market in that grouping of companies across both semiconductor and software. We started reporting market segments at the beginning of 2023. The reason it was important to report market segments: we had been reporting just one big segment for EDA and IP, plus the Software Integrity Group, and there was always the debate: you're losing share, you're gaining share. So the best way to say, here's who we are as a company, was to report three market segments.
There is Design Automation, Design IP, and Software Integrity, which together brought the company to $5.8 billion as we wrapped up FY 2023. And as you can see, we are number one in Design Automation. We are number two in Design IP. But anytime I say we're number two in Design IP, I have to finish the sentence by saying we're number one in interface and foundation IP and security IP, and we'll slice that in more detail. Of course, from an overall TAM point of view, Arm is the leader in IP. In the Software Integrity Group, we're number one in application security testing. Now, as you know, about 4-5 months ago, we communicated that we were exploring strategic alternatives for SIG.
Today, we'd like to communicate that we have made the decision to start the sale process for SIG. So we've moved from exploring strategic alternatives to a decision to sell the business. I'll describe, as we go through the presentation, why we made that decision. Before I get there, some perspective you may find interesting: the moment we announced the exploration of strategic alternatives, we had about three dozen interested parties. Three dozen. It was almost impossible to run due diligence and communication with all of them, so we trimmed it down to a couple dozen. Right now, we're down to half a dozen.
It's a competitive process that we're going through, and we'll continue going through it actively now that we've moved from an exploration to a decision. If you look at the operating margin as well as the growth, you can see where we are across the different market segments of this business. At the beginning of the year, the first opportunity I had when I transitioned into the CEO role was to communicate to our shareholders, our customers, and our employees the priorities we see for the company. Priority number one is technology and innovation leadership. As you can sense, especially this week, whether you were at GTC or, earlier, a number of you were at Broadcom, et cetera, we cannot miss a beat in technology leadership.
In our industry, if you're not constantly on the bleeding edge, delivering your customers the best solutions they need to develop these massive AI chips or networking chips or HPC chips, we won't be the company we are today. So: technology leadership, industry-leading growth. Over the 2020 to 2023 period, we achieved 17% revenue growth, as I stated earlier, which is 300 basis points above the TAM. That's impressive. On margin expansion, we put energy and focus behind optimizing our investments, and we achieved 700 basis points of expansion during that period. And what is equally impressive is the 26% non-GAAP EPS growth we were able to achieve. Now, again, another quick brag.
Look at the Magnificent Seven: they're magnificent, wonderful companies. We're not part of the Mag Seven, but if you compare our TSR against that grouping, we would have ranked sixth among the seven. Excellent performance in the shareholder return we've been able to deliver. Now, let me transition into the market opportunity, because many of the questions that come up concern this: you know that we are highly indexed to our customers' R&D spend, which gives our business fantastic resilience. So many of the questions that come up are: we're seeing nice growth coming from Synopsys; how will that change in the next two, three, five years? That's what I'm hoping you'll get out of this section.
We are seeing this era of pervasive intelligence, where AI is the mega driver creating many opportunities. AI is reinventing compute, with significant opportunity in intelligent systems for various applications, and what's gonna power this opportunity is silicon. And alongside silicon, software-defined systems, because the silicon in these intelligent systems will sit in various markets driven by more and more software content. This is where we see the big opportunity. So we see the opportunity both from the silicon up, where many silicon companies are trying to design these very complex, very sophisticated chips, and from system companies, which are either making the investment to design their own chip or need to make an investment to define the chip and electronics requirements for their systems.
So for Synopsys, we see it as an opportunity with both the chip companies and the system companies. That's how we're envisioning the future, Silicon to Systems, in the era of pervasive intelligence. Now, look at this graph: it took our industry 60 years to get to $500 billion, and by the end of the decade, it will double in size. That was the driver for us to double down on our investment and continue leading in our core technology, which is silicon design, enabling our customers to do their silicon design with our software and IP. As I mentioned earlier, the decision to explore strategic alternatives for the Software Integrity business that we ramped up and created has nothing to do with the business itself.
It was about retargeting our capital, our investment, and our management bandwidth to double down on the opportunity you're seeing right here, which is truly an explosion in silicon demand and the system optimization opportunity. Now, you see the number listed here, about $400 billion for the AI opportunity, and you hear various numbers. This is $400 billion by 2027, and not too long ago that number was in the double-digit billions; it's now $400 billion, given the vast acceleration with AI. Now, what are our customers facing? When you look at the opportunities to deliver chips for AI applications, HPC, and other high-compute-demand applications, the performance requirement is insatiable. There is such a demand to bring more transistors into a piece of silicon in order to support more compute.
Energy efficiency: those chips can consume a lot of power, so how do you improve the performance of the chip while lowering its power consumption? All of that is happening against the backdrop that the traditional Moore's Law innovation, which allowed transistor count and performance to double at the same power envelope, is no longer applicable. You need a new way to innovate, and this is where Synopsys is playing a significant role. Historically, our company engaged with the foundry, so the relationship was typically Synopsys-to-foundry, then Synopsys-to-customer. With the foundry, we enable our tools to support their next node, and with the customer, we make sure we support them as they adopt the technology we've already enabled.
With today's complexity, we're at the heart of a three-way foundry-customer-Synopsys engagement: architecture optimization from the customer, who is designing the chip in the context of a system; the foundry, enabling that next node; and our technology, in this case both the software and the IP. So, Moore's Law is definitely still alive and going. I know there are many claims questioning whether it will continue; it is continuing. The big challenge, and the question our customers are asking themselves, is: Can I afford it? How much of my chip needs to be on the bleeding edge? How much can stay on more mature technology, and what are the alternatives? The alternatives, and we're listing a few here, are multi-die, silicon hardware-software co-design and optimization, and Silicon Lifecycle Management.
How do you manage the health of that silicon, as you optimize those complex systems, all the way into the field? Each one of those is an opportunity for both EDA and IP, because the moment you disaggregate a monolithic chip into multiple dies, there's complexity in how to design and architect it. We have the software for that. It also creates a massive opportunity for IP, because where you had one chip, you're now slicing it into multiple dies. You need to connect them together, so you need new IP to connect those new chips. As we engage with customers, they're typically measuring their own product investment along three vectors. One, can I meet my power-performance-area envelope for the next product? Two, can I get it done in time?
Three, what is my resource utilization, given that the skilled resources needed to design those advanced, complex systems are very hard to find? So: how do I manage the next product innovation while making the power-performance-area targets, making the time to results, and doing it cost-effectively with the resources I have? This is where we play a significant role... And it's really rewarding and amazing to see how our customers speak about us in that context, how critical we are. When I use the words mission-critical, and I believe Shankar will show the same slide we talked about at GTC just a couple of days ago, we are mission-critical to the success of those customers. We're no longer just an important enabler for them; those chips and advanced systems cannot happen without this capability.
And we have to optimize across all three vectors. So how do you do it? If you look at our portfolio today, one thing we're offering our customers is: how do you shift left the architecture exploration? This is where EDA and SDG, the Systems Design Group, come in; both Ravi and Shankar will describe how we're enabling both system companies and silicon companies to architect the system. That system can be a chip or multiple chips. You must have a deeply integrated EDA platform. If you have a fragmented EDA platform, which is where the concept of sign-off becomes very important, you're gonna have many iterations as you design that chip, versus a convergent flow. And there's a massive verification opportunity; this is where our hardware-assisted verification with ZeBu and HAPS becomes very, very relevant. Silicon-proven IP.
Our customers not only lack the resources to invest in non-differentiated IP; the fact that the IP is non-differentiated does not mean it's not complex. The IP we deliver is very complex IP. However, with the scale we have and the investment we've made, we're able, with a high level of trust, to deliver that IP when it's required and needed, on various foundries, on new standards as they emerge, and to make sure it's available across different market segments, from automotive to HPC to mobile, et cetera. With AI, cloud is becoming more relevant. With complexity as well, cloud is becoming more relevant, because instead of taking X hours or X days to do a function, can you accelerate it if you give me more compute?
But it's not only about compute availability. There's a significant innovation and R&D effort to make sure our software can scale to support more cores and more sophisticated compute, from CPUs to GPUs to accelerators, et cetera. Then there's data analytics, which I'll touch on in a moment. That's how we look at our portfolio. So today, when you look at the Synopsys portfolio, what are we offering the customer? We start at architecture, everything they need for design and verification, the IP components, and the ability to deliver cloud scalability and a modern way to manage their data. Now, this is the fun slide, because I know many of you ask this question all the time: how are you growing above semiconductor R&D spend?
If you look back from 2013 till roughly the 2018 timeframe, first, semi sales and semi R&D are nicely correlated with each other, and we don't anticipate that changing. So that will continue. You can see higher sales post-COVID for all the reasons we know, but typically they're tightly correlated. Then look at the EDA TAM: it was very correlated with R&D spend. Around the 2018 timeframe, it started opening up. What changed? There are new entrants: there are hyperscalers that aren't captured as part of semi R&D spend, so that's part of it. Two, complexity.
Complexity of the chip, meaning that when we introduce AI, when we introduce 3DIC Compiler to solve the new complex challenges the most advanced semiconductor companies are facing, it's a new monetization opportunity for Synopsys. So it's new entrants and complexity that explain why the EDA TAM has outpaced and outgrown semi R&D. IP has been consistently above. The reason IP has been consistently above, and took a very nice upward trajectory, is that customers realized, "If I don't have to develop the IP, I can just buy it from Synopsys." There are very few customers remaining who develop IP as a differentiator when we can offer it to them. And there are new IP requirements.
So when a semiconductor company that is or was a mobile company, with only a mobile product, tries to diversify its business into mobile and automotive, that's a new opportunity for us: we sell new IP and new EDA tools to that same customer. I hope you find that useful as you look at how the TAM is accelerating compared to semi R&D... Continuing the TAM story, looking at 2018 to 2023, you can see the historical CAGR: we have a market of $18 billion, where $7 billion is IP and $11 billion is EDA. The total TAM CAGR is 13%.
The 12% I showed on the previous slide is the EDA side, and IP is 14%. As we look ahead, we believe EDA will remain about the same, and IP will slightly accelerate for many of the reasons I just stated. If you round the numbers, the whole TAM will continue at the same pace we've seen before. Now, for those of you reading the bullets carefully, you may be asking yourselves: Where is the AI acceleration? What's gonna happen with AI? Will it accelerate that growth further? How does it change the equation? I'll touch on how and when we believe AI will layer in as an opportunity and a further acceleration above the TAM.
But even just as it is, it's still, with high confidence, a significant growth opportunity for our industry, for both EDA and IP. Go back to the first chart I showed: a mid-single-digit CAGR for one decade, a double-digit CAGR for the next, accelerating to 17%. What drove a lot of that acceleration, we have no doubt, will continue into the future. So that's our TAM. Now, let's talk about Synopsys and how we're positioned to continue being the market leader in growth for the two segments we just described. Our strategy, as we've been describing it, is Silicon to Systems.
At the silicon level, we've done everything possible to continue being the leader in technology innovation and providing the best support and technology for our customers. That will continue, and it will expand into system-level solutions. What we mean by systems is, again, two vectors: silicon companies, which are increasingly delivering silicon in the context of hardware and software to their end customers, and system companies, which are investing deeper into electronics. We also have an amazing opportunity to optimize our go-to-market to expand our reach. What Synopsys has done incredibly well is engage with our leading customers. That has been our DNA. We do incredibly well with those customers, and we have great market share and position with them.
As we start expanding into systems, we need to expand our go-to-market reach, and this is where I'll bring Ansys color into the discussion later, as I have with many of you: Ansys has a strong position in markets we haven't traditionally sold into, like aerospace, industrial, et cetera. Those are the opportunities we see in the future as we become one company that can deliver that go-to-market scale into systems. From a margin expansion point of view, I want to say our company is enthusiastic about the opportunity to leverage AI inside our own company to improve our productivity, as well as looking at every corner for ways to optimize our investments.
So you have our commitment: we're excited not only about technology, innovation, and growth, but about improving and optimizing our bottom line. Shankar will go through this in more detail, but the reason I wanted to show you this slide: when we say Synopsys is the only company with an end-to-end EDA platform, it really is the only company with an end-to-end EDA platform. We are the company that has TCAD, which goes into the deep physics. When a foundry is trying to develop the next angstrom technology, they use our TCAD, an industry-standard technology, to model those devices. And you go all the way up the stack to what we talked about, the system architecture: how do I explore the platform architecture of my entire chip and system? We talk about two other things as well: hyperconvergence and pervasive AI.
When we introduced Fusion, we took assets from various parts of the chip design process, brought them together, and fused the engines, so our customers don't get to step three and realize they have to go back to step one because the results didn't match. That's a huge differentiation for our customers... And that's where hyperconvergence and pervasive AI, applying AI everywhere in the flow, come in: investments we started making with Fusion around the 2016-2017 timeframe. Same thing with AI: with DSO.ai, we started the investment in 2017, with product delivery to production in 2020, and we'll talk about the roadmap we've created.
Number one in EDA: 65% of our revenue comes from that EDA full stack. We are the leader in both digital design and verification. We are the leader in GPU acceleration; you'll see some of the numbers Shankar will present. The kind of acceleration you can get by using a different compute architecture is significant. We're not talking about 30-40% improvement for the customer; we're talking about multiple factors, 5x, 10x, 15x improvement, by leveraging a new compute architecture for our software as we improve the time to results. Layer AI on top of that, which is another booster of acceleration. That's how you end up designing those massive devices with hundreds of billions of transistors.
We talked about being the gold industry standard for sign-off and TCAD, and I know I've had many discussions with a number of you: what is sign-off? Today, in chip design, you have multiple sign-off checks. You have timing sign-off: when you're done with the chip, before you ship it and send it to manufacturing, am I going to meet the frequency, the speed of the chip? Power sign-off: is it going to be within the intended power envelope, et cetera? And TCAD, we touched on. As for Silicon Lifecycle Management, the reason it's a great investment for Synopsys is that it has two components.
It has an IP component, and we know how to sell IP and how to insert IP into a chip; and then you need EDA to make sure we get the data out of those monitors and sensors sitting in the chip. IP: I mentioned 25 years, 25% of our company, the scale we bring, and that we are the leader in interface and foundation IP. If you look at the overall IP TAM, roughly half of it is processor IP. The other half is interface, foundation, security, and some other. John will describe the journey, because this is the current, 2023 IP TAM split. If you go back six, seven, eight years, that picture looked very different: processor IP was a much bigger portion of it.
The reason you're seeing more and more interface IP, foundation IP, security IP, et cetera, comes down to multiple factors. I mentioned one example earlier: multi-die. That requires new standards to stitch those dies together in a package. If you want to go from mobile to automotive to HPC, those are different types of IP that you need. And we're fortunate to be in a position where the scale, the knowledge, and the credibility we have are not something you can replicate overnight. I'm emphasizing this because of both the opportunity and the investment and value we see in our IP as we look ahead into the future.
Now, let's put it in the context of the TAM. EDA, for those of you who are very familiar with it, is typically split into two types. You have digital design and verification, and you have custom design, manufacturing, and PCB. That's the EDA market. As for the TAM growth of those two cohorts, digital design and verification grew at a 15% CAGR from 2020 to 2023; custom design grew at 12%. The mix of the TAM is what you see as well: slightly bigger for digital design and verification. On the IP side, interface IP and foundation IP is one cohort, and processor IP and other is the second.
You can see, for that period, 18% growth for foundation and interface IP and 14% for the other, and you see the split of the TAM. So that's the TAM. Now, how does Synopsys fit within it? With our portfolio today, in terms of usage share and market share, we are highly indexed to the digital design and verification portion of the TAM. Now, you could say we got lucky, right time, right place. That's okay, we'll take it, but there's a huge amount of innovation and investment that got us to this stage, where we are the leader in digital design and verification, and that happens to be the largest portion of Synopsys' own EDA revenue.
On IP, you see an even more drastic view of our revenue split between interface and foundation IP and processor IP. What gives us confidence, as we look ahead, that we will be the market leader in growth for EDA and IP is the portfolio we have and the solutions we offer our customers, both of which are indexed to the higher-growth portions of the TAM. We cannot wrap up this discussion without talking about AI. We've been talking about AI, AI, AI. Where are we? Where is the adoption? What's the monetization of AI? Are you surprised by this?
We're in the early stages, but that does not mean we don't have customers using it and adopting it; it's still early, and I'm gonna try to help you see where we are in these stages. First, when we look at AI, we look at three major categories. The first: we're benefiting greatly from the surge in AI chip designs. Be it semiconductor companies or hyperscalers designing AI chips, we're seeing a beautiful opportunity there; there are simply more chips, meaning more digital design and more interface and foundation IP that we're selling. The category in the middle is our AI products: the AI technology we invested in and how we're monetizing the AI solutions we offer. And the third is what I touched on earlier, which is our own operational efficiency with AI.
But let me focus on the one in the middle, because many can claim the first and the third, that there are benefits and they're working on them. The one in the middle is what we've been talking about in terms of what we're doing and where we are in that monetization. I mentioned we released DSO.ai in 2020, so that is the product with which we have the most experience, outcomes, and results in terms of turning it into a financial outcome. We've been selling it for a number of years, so we know the monetization journey, and I'll double-click into that. But since then, we've expanded to other optimization engines across the EDA stack: verification, test, analog...
And today, what I announced this morning is 3DSO.ai, which uses AI for multi-die-in-a-package applications. The next layer up is data analytics, which sits under the Synopsys.ai umbrella: the .ai products first, then the .da products, which are the data analytics. We announced those products starting in 2022, with a number of them already in production and being monetized. The third layer, which we announced in Q4 last year, is the Copilot.
And the reason it was a significant announcement for our industry is the pressure and opportunity for us to truly modernize the way our customers use Synopsys' EDA products by leveraging large language models and generative AI, and the optimization opportunities we can bring into our platform. So those are the three layers of AI we have today. In terms of continued opportunity, you're gonna see more coming in DSO.ai, and there will be more across EDA. But today, this is where we are. These are products we have today, we're engaged with customers today, and for a number of them, we're in the monetization stage today. Let's use DSO.ai as the example because, again, it's the best one, given we've been selling it for a while.
So in 2021, we had four logos using it, and then you see the acceleration: in 2023, 20 logos. When I say using it, it means we're selling it to them, they're using it in production, they're seeing value, and they're paying money for it. We estimate that by the end of 2024, we'll be at 35 logos adopting the technology. Then, within those logos, you may ask where we are in the adoption. We are in production in 20% of that addressable TAM, and within that, we are at about 10%-15% utilization of the technology. That's why we say it's still early stage. Now, the question in the back of your mind may be: why is it so slow?
Why, if they love it and the results are stunning, isn't it much broader? The reality is that if you are a semiconductor company doing a derivative of a chip that is already working, with some minor tweaks, you don't touch your flow. You don't touch your tool selection; you stay with it for the next evolution of the chip. Where we're seeing rapid adoption is on new designs that require AI to deal with their complexity. Now, looking at that equation, what we have seen consistently with those customers is about a 20% uplift in revenue when they adopt DSO.ai, and that uplift applies where the AI applies, meaning the digital portion of the chip.
You take Fusion Compiler, which is the tool they use, you layer DSO.ai on top, and you get 20% more spend, more revenue, from the customer by enabling DSO.ai. That's the average we're seeing today. In terms of results, and why customers are paying for it, I don't wanna go through them one by one, but you see the outcome: a significant productivity boost, faster turnaround time, better quality of results. Now, how does it impact the TAM? I shared earlier that the EDA TAM has grown at a 12% CAGR. Looking ahead, let's assume, based on what I shared with you, that the next five years are 12% as well. Where is AI? We believe AI will add another 2% of revenue uplift on the overall EDA TAM.
Given we are in the early stages of that deployment, that will happen over time, no question about it. As I said, monetization is happening today. We believe that will take growth roughly from 12% to 14%, a 2-point uplift due to AI, because customers are willing to pay for that technology and differentiation: it helps them reduce their time to design the chip and improves their ability to design a differentiated system or chip, depending on the application. Now, Silicon to Systems design solutions. You've seen that slide when we communicated to our investors the logic behind Ansys.
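As a rough sketch of the arithmetic behind that uplift (the 12% baseline CAGR, the +2-point AI contribution, and the five-year horizon are the figures quoted above; the $11 billion starting EDA TAM is borrowed from later in the talk and used here only as an illustrative placeholder):

```python
# Illustrative sketch of the TAM arithmetic quoted above: a 12% baseline
# EDA CAGR versus 14% with the +2-point AI uplift, compounded five years.
# The $11B starting TAM is an assumed, illustrative figure.
def project_tam(start_billions: float, cagr: float, years: int) -> float:
    """Compound a starting TAM at a constant annual growth rate."""
    return start_billions * (1 + cagr) ** years

baseline = project_tam(11.0, 0.12, 5)   # no AI uplift
with_ai = project_tam(11.0, 0.14, 5)    # with the +2pt AI uplift

print(f"Baseline TAM after 5 years: ${baseline:.1f}B")  # ~$19.4B
print(f"With AI uplift:             ${with_ai:.1f}B")   # ~$21.2B
```

Even a 2-point difference in CAGR compounds to roughly $1.8 billion of additional TAM over five years under these assumptions.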
The transformation, and I could cite many examples, is happening right now: many semiconductor companies are trying to invest and differentiate not only at the silicon layer, but are building silicon, software, and hardware to deliver complete electronics to their system customers. On the system side, hyperscalers, some of the automotive companies, et cetera, at different stages of that maturity or journey, are looking for a way to either design their own chips or provide a spec to their chip suppliers for how to deliver the right chip. That's the Silicon to Systems and Systems to Silicon view that we see. So there are two elements to the Ansys rationale. One is deep inside the silicon opportunity: multi-die, packaging, et cetera.
That multi-die package is by itself a complex multiphysics system that is no longer only electronics. It's electronics, mechanical, fluid, and structural effects that you need to take into account when you're designing it. The same is true when you look at the end product as a system, say, a car: the multiphysics simulation required across electronics, mechanical, et cetera, is again a significant opportunity, especially as both the software content and the electronics content become fairly significant. We see an amazing opportunity here at both the silicon level and the system level. Now, I talked about SDG, the Systems Design Group, and the journey we've been on. Hardware-assisted verification is our hardware systems, ZeBu and HAPS.
Today, that is the industry standard and the leader for hardware systems that achieve the best performance, meaning chip companies use HAPS and ZeBu to accelerate the verification process and to run some software on the chip to see whether it is going to meet their customers' requirements. The same is true if you are a system company buying a chip from a semiconductor company: you use those systems to accelerate your integration process. Now, virtualization is another layer up in abstraction, before the chip is ready. So let's say you're a system company, an automotive OEM or a hyperscaler, et cetera, and you're working with a semiconductor company.
You say, "I want a virtual model of that chip so I can start developing my software early, before the chip is ready, because the chip may not be ready for another year." That's the virtualization layer. Then there is the software in electronics, where you need to look at the whole electronic system and how you test it. We've made the acquisition of PikeTec there, and you continue going up the stack into the virtual product and into the whole product operating in the field. This is where Ansys, SLM, and a number of technologies we have will provide that entire stack we're talking about in terms of digital twinning. What is the TAM?
If you remember when we shared that slide a couple of months ago, for the announcement of the Ansys acquisition, we talked about the $11 billion EDA TAM, the $7 billion IP TAM, and then the $10 billion for simulation and analysis. The reason you're seeing an additional $3 billion of TAM for system software and SLM is that's the opportunity we're carving out in terms of TAM increments for the investments we're making in the whole virtualization of a system, early software development, et cetera. And that CAGR through 2028 is estimated at about a 20% growth opportunity, for all the reasons I've mentioned. And I want to keep emphasizing that point.
Given the strength of our portfolio, and given that we are indexed to the areas of growth, we have confidence that we'll continue to outgrow the rest of the Silicon to Systems market. That's how we're thinking about the future, how we're looking at it across market segments, and how we're looking at the opportunity as we go through the approval process for the Ansys acquisition. I have a number of quotes from customers who came forward with their support of the Ansys acquisition, and you can read them. From AMD's point of view, they see the world becoming more and more about heterogeneous integration of multiple chips into a package.
You see NVIDIA, where they're talking about co-optimizing the silicon and the system. TSMC, who will manufacture, package, and test those multi-die systems, needs that solution from Synopsys and Ansys as well in order to bring it all together. And Tesla, a system company, asks how to approach it from a software-defined vehicle or system perspective, and what the combination brings. Now, we are very fortunate in this case that we have had an established partnership since 2017. With that, I would like to invite Ajei to join me on stage and have a fun chat. Hi, Ajei. So, Ajei, I described, through the Synopsys lens, what our customers are saying, their excitement, and how we see the opportunity. It would be great to hear how you see the macro trends and the opportunity.
Sure. Actually, Sassine, you covered some of the material, so forgive me if I re-emphasize some of the points you made. Obviously, we have a relationship and a partnership that we announced several years ago, and I'm excited about the progress of that partnership, certainly in the area of multi-die. You saw customer comments about what we could do together as part of the joint portfolio, and that's really exciting as we think about the future of that aspect of the business. But if I think about the broader business of building products, the way we've been approaching the market has changed over the years. It's been influenced by a couple of things.
One, it's been influenced by increased levels of complexity and how complexity is changing the way customers think about their markets and businesses. And second, it's being changed by the electronics, the computerized content, within products that were traditionally not smart but have increasingly become smart. If you go back in time and think about our business, years ago, most of our customers would do single-physics analysis. They would do, say, a structural analysis of a component or a product to figure out if it worked, and that single-physics analysis allowed them to be confident in the success of their product. Over the years, that's changed.
Today, customers look not just for individual physics but for an integrated multiphysics solution that allows them to reason effectively about the performance of a product. A very simple example would be a car crash. In years past, you might have done just a structural analysis. Today, you have to do a structural analysis coupled with a CFD analysis: will the airbag deploy in time to protect the passenger? Will the sheet metal avoid injuring the passenger if it breaks around them? Will the electronics trigger the airbag rapidly enough? It's a complex multiphysics analysis. But of course, that's now also changing, and the world has already moved toward the notion of software-defined everything.
You're seeing this certainly in the area of electronics. You're seeing it in automotive; you mentioned software-defined vehicles. You're seeing it with industrial equipment and with medical devices. The basic idea, of course, is that at the time the product is being designed, the designer doesn't completely understand exactly what the operating parameters are and how the product will be used, so the design has to be broad and flexible enough to accommodate change, including changes that may happen in the field. And the ability to do all of this, to analyze and bring these sophisticated products to market, revolves around a few things. It revolves around the ability to do multiphysics analysis.
It revolves around the ability to understand the computing or electronics environment and to integrate these things together. And of course, there is the context of broader IT changes taking place: the use of AI, the use of high-performance computing, the use of model-based analysis. All of this wraps around the fundamental analytics of the computing and electronics environment and, of course, the multiphysics analysis. And that comes together to support this next generation of software-defined everything for essentially all of the industries that we serve.
It's so exciting. I mentioned some of the customers who reach out to Synopsys more naturally from the silicon side of what they care about. What have you heard from customers across high tech, aerospace, industrial, automotive, et cetera? What has been the reaction?
So the reaction has been really positive. Obviously, we have many mutual customers who are using Synopsys products and Ansys products together, and that's been very positive. But even outside of the high-tech industry, which is approximately a third of our business, looking at automotive, for example, as you said, with the transformation taking place in that industry, where folks are looking at building silicon and integrating it together, there's certainly been a lot of excitement.
But in this broader context, because of what I just described, this software-defined everything across industries, you have customers, and I've talked to a number of them since we made this announcement, who are excited about the opportunity because they realize that in the future, they have to understand not only what they've historically been doing but the entire supply chain. I was talking to an industrial customer and an aerospace customer: even if they have historically not been designing silicon, they want to understand the implications of the choice of silicon they're making inside their product design.
So this notion of being able to manage the supply chain, to understand at a model-based level how it all fits together to get to a better outcome, is important. The feedback from our customers has been really positive because they see that opportunity, and it's not just something that we, as vendors or technologists, are telling them about. It's what they're hearing from their customers. The transformation of these industries is manifest, it's palpable, and I believe our two companies working together will be in a position to facilitate it.
That's great. One last question. Our employees are thrilled, and they see the Silicon to Systems opportunity. How has it been received on the Ansys side? It's been a few weeks that you've been talking with them about the future combination.
So one of the great things, of course, is that we have been partners for a number of years, and that partnership has built trust because customers have seen success. As a result, people have worked together to achieve an outcome, and that's been great. But there's a broader context here, which is ultimately about culture. When you put two organizations together, there's obviously the end markets and an understanding of how that works, which is really important. But there's also the aspect of culture. At the end of the day, corporations are about people; when you bring organizations together, you're bringing people together. So what are the cultures of the two companies like? I think that's an important consideration.
There are a lot of similarities. In fact, the two cultures are eerily similar. Look at the focus on technology: both companies share a commitment to developing advanced technology and bringing it to market. You can't fool science and engineering, and that's something we both believe in. That's something important; it's in the DNA of both companies. That's one. The second thing is the nature of customer relationships. We all recognize that at the end of the day, we're here to make our customers successful, and if we don't take that extra step to make them successful, it doesn't matter how great the technology is.
And so that focus on ensuring our customers are successful, where you have people at Ansys who work nights and weekends for customers, and the same at Synopsys, that connection is really important. The third thing, and I think this is also important, is that both companies are fundamentally decent in the way we operate. We operate with respect for our employees and our customers, with a level of decency, and that shows in the way we work internally, in our engagement surveys at both companies, and certainly in the way we partner. That cultural connection helps us as we plan for the future.
That's very exciting. Now, I know we have a journey before we become one company; as we communicated, it's the first half of 2025. Meanwhile, we continue collaborating under the framework we've created while operating as two separate companies. I really appreciate you coming here and giving your perspective. And one thing we announced this morning is that at close, on day one, Ajei will be joining our board. So we look forward to having you as part of the board and part of this success and journey.
Thank you, my friend. Thanks.
Thank you.
Thanks.
Thank you. Thank you. Okay, to wrap up, this is where we've been: a 17% CAGR. Over the 2020 to 2023 period, we expanded our operating margin from 28% to 35%, a 700 basis point improvement; free cash flow improved from 23% to 26%; and EPS grew at an impressive 26% CAGR. You see our combined-company long-term, multi-year objectives as well. As we communicated, it's industry-leading double-digit growth with a continued commitment to improving our operational efficiency and health, so we continue leading above the TAM and delivering an outsized return to our shareholders, as we looked at earlier with the Magnificent Seven and where we fit, just as one measure.
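A minimal sketch of how the growth figures above are derived (the 28% and 35% margin endpoints are the quoted numbers; the three-year revenue multiple is simply implied by the quoted 17% CAGR, not a separately reported figure):

```python
# Sketch of the growth-metric arithmetic behind the figures above.
def cagr(begin: float, end: float, years: int) -> float:
    """Compound annual growth rate between two endpoint values."""
    return (end / begin) ** (1 / years) - 1

# Operating margin: 28% -> 35% over 2020-2023 is a 700 bps improvement.
margin_gain_bps = round((0.35 - 0.28) * 10_000)
print(f"Margin improvement: {margin_gain_bps} bps")  # 700 bps

# A 17% CAGR over three years compounds to about a 1.60x multiple.
print(f"3-year multiple at 17% CAGR: {1.17 ** 3:.2f}x")
```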
When you look at the criteria of the companies in that grouping, we're one of seven, combining semis and software. We really appreciate the trust you have in us and the flexibility you've given us. I know we have put so much into the last several quarters before having this opportunity to discuss with you where we are and how we're looking toward the future, and I truly hope you feel the same level of excitement that we do. In summary, we have been on a multi-decade journey with a track record of doing what we say and saying what we do.
We're passionate about solving customer problems and staying on the bleeding edge of innovation, with technology leadership in multiple areas, and we're committed to our resilient business model while delivering market-leading growth for us, for you, and for our customers, so we can continue our investment. So with that, thank you, and we'll move to Shankar.
TSMC and Synopsys have been partners for more than two decades, since we jointly created the foundry industry's first reference flow in 2001. This marked a critical development in helping designers shorten their development cycle time from concept to silicon. Our long history of collaboration spans Silicon to Systems, enabling our mutual customers to address the most challenging design requirements. Together, we support them at the leading edge of innovation, from Angstrom-scale devices to complex 3D IC systems across AI, high-performance computing, and automotive designs. With Synopsys' leading EDA and IP solutions, we enable customers to reach new levels of performance and power efficiency and accelerate time to market for their products. We put them on their fastest path to silicon success and business prosperity.
Synopsys has been a long-standing partner of TSMC and an important part of our Open Innovation Platform, which has greatly shortened time to market and changed the trajectory of the entire industry by fostering cross-industry collaboration between IP, EDA, memory, substrate, testing, manufacturing, and packaging partners. Our continued partnership with Synopsys and our OIP partners is essential as we continue to innovate in the new era of pervasive intelligence. We look forward to working closely with Synopsys to meet the evolving needs of today's designers with a full spectrum of world-class, proven solutions and services, and to help them achieve even greater heights of innovation.
Please welcome Synopsys GM, Electronic Design Automation Group, Shankar Krishnamoorthy.
Good afternoon. Today, I'm going to share some perspectives, from a Synopsys viewpoint, on how we are delivering market-beating growth in the EDA business on the basis of pioneering innovations. In this era of pervasive intelligence, we are working very closely with many customers and partners, and one comment we hear very frequently is, "Hey, Synopsys, you're moving from being important to being mission-critical for our success." Why is that? For several decades of Moore's Law innovation, every node transition reliably delivered a 2x improvement in performance, power, and area. But over the last few node transitions, we have seen those gains shrink to 15%-30% in terms of performance and power.
There is great innovation ongoing among the fab tool suppliers, the foundries, EDA companies, and IP suppliers to keep Moore's Law innovation going well into the Angstrom era. But if you look at what is happening in end markets, there is a dramatic compute requirement, doubling every six months, needed to keep up with training very large language model AI systems. In spite of Moore's Law slowing down in terms of gains on power and performance, these companies are still keeping up with that compute requirement through tremendous innovation, and we are well on our way, marching toward the trillion-transistor system by the end of this decade. How are these semi and systems companies able to overcome the fact that the Moore's Law progression is not delivering the gains it once did?
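To put the "2x every six months" figure in perspective, here is a short sketch of the implied compounding (the doubling period is the figure quoted above; the one-year and five-year horizons are chosen for illustration):

```python
# Sketch of the compute-demand compounding quoted above: doubling every
# six months is 4x per year, roughly three orders of magnitude in five years.
def demand_multiple(months: float, doubling_period_months: float = 6.0) -> float:
    """Growth factor when demand doubles every `doubling_period_months`."""
    return 2.0 ** (months / doubling_period_months)

print(f"After 1 year:  {demand_multiple(12):.0f}x")   # 4x
print(f"After 5 years: {demand_multiple(60):.0f}x")   # 1024x
```

That gap between demand compounding at 4x per year and per-node silicon gains of 15%-30% is exactly the pressure the architectural and multi-die innovations described next are meant to close.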
There is a lot of innovation underway to enable this. There is continuing innovation in the foundries, in the semiconductor equipment suppliers, and in EDA companies around areas like Design Technology Co-Optimization and manufacturing, keeping the Angstrom era very much on schedule. In addition, this is the golden age of computer architecture, as domain-specific architectures are being broadly deployed across the systems and semi industries. This is also the era of multi-die designs, where multiple dies are pulled together to create superchips that address bandwidth, latency, and other requirements to achieve that level of performance. Energy efficiency is becoming extremely important because we simply cannot keep driving up power consumption linearly; we have to bend that curve to make all this sustainable.
And then workloads like EDA are now becoming the biggest workloads at semi and systems companies, so compute acceleration is of great interest, and a lot of innovation is needed there to keep schedules on track with the pace at which these designs and systems need to be delivered. Last but not least, AI offers a promising and exciting new way to dramatically improve engineering productivity as well as the performance and power of designs. The reason Synopsys is moving from important to mission-critical is that we are the platform on which all these innovations are happening at semi and systems companies. So what do we believe are the fundamental reasons customers see us as mission-critical to their aspirations? The breadth and depth of our EDA portfolio is a big one.
Look at the EDA stack: the whole process from system architecture through design capture, verification, implementation, sign-off, test, lifecycle management, and manufacturing. Synopsys is unique in that we have the broadest and deepest EDA portfolio covering all these areas, with best-in-class offerings in every layer of the stack. This doesn't happen by accident. It is the result of a 30-year journey in which we have kept innovating and delivering best-in-class technologies, resulting in the number one position in EDA, 65% of Synopsys FY 2023 revenue allocated to the Design Automation segment, leadership positions in digital design and verification, and leadership in GPU-accelerated analog verification. Two elements of the stack are especially important.
The first is the sign-off step, where the electrical checks determine whether the chip will actually function in silicon and meet its performance and power requirements. That is an area where Synopsys is the gold standard in the industry. The second is the TCAD step, where process research into transistors, materials, and interconnects happens. All of this is done with Synopsys TCAD, which gives us tremendous visibility into where the industry is going and which technologies are likely to succeed, and we fuse those learnings across our entire stack to deliver the highest performance and the best power, and to do so predictably from a schedule perspective.
We are also a pioneer in new technologies like Silicon Lifecycle Management, where we are essentially introducing telemetry into chips to track the health of the silicon over its entire lifecycle: design, yield ramp, high-volume manufacturing, and in-field operation. Across this broad and deep EDA stack, we are driving two key concepts. The first is hyperconvergence, where we fuse many elements of the stack much more tightly together than ever before to unlock opportunities in PPA, predictability, and engineering productivity. A great example is when we fused synthesis, place and route, and sign-off into a single product, Fusion Compiler, which has revolutionized the way advanced-node implementation happens in the industry today. The other innovation we are driving across the entire stack is pervasive AI.
Across every layer of the stack, we are infusing AI to improve results as well as productivity. But the stack is only as good as every layer in it, and we have some of the industry's most loved products in each one. In design capture, for example, Synopsys synthesis, meaning Design Compiler and Fusion Compiler, is used by a very large percentage of all RTL and front-end designers in the industry to generate functionally correct netlists with the highest PPA. In verification, VCS has been the fastest simulator on the market for decades, and Verdi is the much-loved environment for verification debugging. In implementation, Fusion Compiler is the champion for advanced-node place and route.
We talked about golden sign-off, which most designs in the industry go through to ensure silicon correctness in terms of timing measurements, power measurements, and other metrics. On the manufacturing side, beyond TCAD, our computational lithography solutions in Proteus OPC are used by the most advanced foundries to create mask sets for the most challenging designs in the industry today. So it is a very strong stack, supplemented by the hyperconvergence concept and the pervasive AI concept. Our technical teams work very closely with semi and systems companies to realize their product ambitions, and as part of that, we can see how we are doing.
We track how many designs are being done in the industry, and the graph on the left shows all the advanced-node designs being tracked at 7 nanometers and below. We have a pretty good handle on this because of the breadth and depth of our portfolio. As you can see, over 500 designs are in progress or completed at 7 nanometers and below, with 400 taped out and about 100 in active execution currently. Across this set of active designs, where a design is a full SoC with possibly hundreds of blocks in it, you can see how our customers are using our software. In place and route, over 50% of usage is exclusively Synopsys.
So over 50% of SoCs at 7 nanometers and below are designed exclusively with Synopsys place and route, and an additional 30% are designs where usage is split between us and other vendors. Cumulatively, over 80% of designs in the industry go through Synopsys place and route technology. In synthesis, roughly 84%-85% of designs use Synopsys synthesis to generate netlists. In sign-off, over 90% of designs are signed off by the gold standard in timing analysis, power analysis, extraction, and so on. Putting this all together for the digital full flow, a significant percentage of advanced-node designs, 7 nanometers and below, go through Synopsys technology.
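The cumulative-share arithmetic above can be sketched simply (the 50% exclusive and 30% split figures are the ones quoted; counting any design with split usage as "touched" is the stated assumption behind the 80% total):

```python
# Sketch of the place-and-route usage-share arithmetic quoted above for
# SoCs at 7nm and below. Any design with split usage counts as "touched."
exclusive_share = 0.50  # designs using only Synopsys place and route
split_share = 0.30      # designs splitting usage with other vendors

touched = exclusive_share + split_share
print(f"Designs touched by Synopsys place and route: {touched:.0%}")  # 80%
```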
If you look at digital verification, over 70% of designs use Synopsys digital verification, and in analog verification, cumulatively, about 40% of designs. Specifically in the memory market, flash, DRAM, SRAM, Synopsys analog verification is in a very strong position because of the dramatic accelerations we have delivered with GPU-assisted circuit simulation. At our core, in our DNA, we are a pioneering company: we look at the big, hairy, audacious problems on the horizon, commission teams to solve them before our customers need solutions, and then work closely with customers to deploy the results and help them achieve their ambitions.
Here is a sample, from the last 7-8 years, of some of the pioneering innovations that Synopsys has delivered that have advanced the semiconductor industry: from the introduction of the revolutionary approach called Fusion, where we brought together many, many EDA engines into a single data model to unlock PPA and eliminate iterations from implementation, through our pioneering work in AI, where we introduced the first AI-based applications in EDA, to a brand-new approach to address multi-die, the whole approach towards chip telemetry and Silicon Lifecycle Management.
Using GPUs to accelerate traditionally compute-intensive EDA problems like circuit simulation and OPC, broadening the AI solution to look at verification, at analog design, at test, and then extending it with generative AI, using LLMs to, again, create a whole new set of capabilities to enable EDA users to get a lot more productive. This is our DNA. We pioneer, we look ahead in terms of what's coming down the pipe, we make big investments, and then we deliver solutions ahead of where the market needs them. All this, the mission-critical nature of what we do, as well as this pioneering spirit, has led to a significant uptick in our growth rate.
From a low single-digit CAGR in the 2018-2020 time frame, we saw a significant uptick to a 17% CAGR in the Design Automation business over the FY 2021-2023 time frame. Before we talk about innovations, let's talk about product architecture. I believe this is our sustainable competitive advantage in the EDA space. I want to take you back to 2016. We were just at the transition from 16 nanometers to 7 nanometers. Things were getting really hard. You know, a point tool for synthesis, a point tool for place and route, a point tool for STA. The tools were not correlating, there were a lot of iterations in the flow, performance and power were being left on the table. And the industry, as we were staring ahead into the single-nanometer and Angstrom era, needed a new approach.
Synopsys made a significant, audacious bet that we needed a new product architecture for EDA. A product architecture, where essentially on a single data model, we would fuse all these EDA engines and create a very tight coupling. At the center of it was our golden sign-off, which ensures that there is predictability, convergence across the entire flow, and then around it, we brought our synthesis technology, our place and route technology, and we deeply coupled it in, you know, in a revolutionary way to unlock a lot more PPA. This was introduced in the form of Fusion Compiler back in 2018, and this new product architecture dramatically sped up our pace of innovation.
Over the next five years, we moved to rapidly integrate many of the other EDA engines onto this product architecture because of its modularity, and we're able to unlock PPA and power and test improvements by bringing test much closer into synthesis and place and route, by bringing verification much closer to synthesis, by bringing analog much closer to digital and unlocking opportunities. This also dramatically sped up the pace of our product execution, as we were able to build a product for 3DIC on this platform by rapidly reusing many of the assets we already had, as well as innovating new engines on top of this.
As we look at the horizon with the Ansys acquisition, we see a tremendous opportunity here to solve our customers' high-value problems in designing very complex multi-die systems, trillion-transistor systems, by the end of this decade, by deeply fusing the gold multiphysics engines that Ansys brings for stress and warpage and thermal and many others with all the EDA assets we have in the Fusion platform. We expect that because of the modularity of this, we are going to be able to do this much, much faster than anyone expects, in about one release cycle. Then, of course, there is the promise of AI. We built our AI technology in a way whereby every product benefited from our AI innovation.
So the speed with which we are rolling out the verification space optimization or the analog space optimization or the test space optimization is a product of the modular architecture in the product, as well as the way in which AI is infused into every element of this product architecture. Let's now look at how these pioneering innovations are unleashing new growth vectors for our EDA business. Let me start with what we call the March to Angstroms. You know, for those who have been in this industry for a while, you all recognize this curve very, very well. I know that back at 16 and 10 nanometers, there was all this murmur about how long this was gonna continue. Was it gonna peter out in one node, two nodes?
It's really amazing to watch that we are already at, you know, 18 angstroms. There is already talk about 14 angstroms, and all the research organizations are already working at below 10 angstroms. Moore's Law is alive and well with continuing innovations underway. So how is this happening? It's happening as the product of multiple swim lanes of execution. From the TCAD Design Technology Co-Optimization level, where different types of transistors are being investigated, different types of materials are being investigated, different types of interconnect structures are being investigated, to the swim lane where, essentially, in computational lithography, we are rapidly moving from model-based to curvilinear, to now high-NA EUV on the horizon, to continuing evolution in the sign-off area with new types of modeling for variation and local layout effects.
In the place and route area, again, there are significant innovations around hybrid row technology and backside power and further evolutions of that. Then library architecture itself has changed dramatically in the past few years. So the product of all these swim lanes is what is creating all these additional node offerings, and we are able to still keep this Moore's Law trajectory going, delivering that 15%-30% gain node over node, and the pace of innovation has also really picked up dramatically. As you can see, the end markets are really pulling to find every little edge that they can find. And here again, the important mission-critical nature of Synopsys comes into play. In every swim lane, there is a critical Synopsys technology that is the platform on which much of the innovation happens.
We talked about our TCAD platforms; our computational lithography platform, now accelerated with all this GPU technology, is very, very critical for the progress in that swim lane. Our sign-off technology, we talked about. Fusion Compiler is the place and route platform where much of the advanced node DTCO activity goes on in terms of the new concepts, and then, of course, there is our foundation library portfolio. All this cumulatively has delivered the highest PPA at 2 nanometers and 18 angstroms for all the leading foundries with the Synopsys portfolio. This is validated by the strong endorsements we have received for the Synopsys portfolio from TSMC and Samsung, as well as Intel Foundry Services. Let's now look at the next innovation, which is Synopsys.ai.
When we look at the EDA workflow, I want you all to just pause and appreciate that this is probably the most complex human undertaking ever. You're talking about designing something on the order of 200-300 billion transistors in about 18-24 months, with a team of 500-1,000 people, and ensuring that it works correctly, it hits its performance targets, and all the software works correctly. So really a tremendous, tremendous undertaking across many, many disciplines. So if you look at the workflow that is enabling it, this is why the whole EDA workflow is so critical. You know, every one of the steps of this workflow, from architecture through verification, implementation, and manufacturing, has tremendous opportunities in terms of unlocking productivity bottlenecks.
So in a regime where the total number of engineers and talent availability is not keeping up with the requirements, we have to look at disruptive ways to increasingly make things more autonomous, to have essentially AI pick up the slack in terms of repetitive tasks, in terms of iterations, and thereby open up more cycles for the existing community of EDA engineers. I want to take a couple of examples to illustrate to you how this plays out. Let's take verification as an example. Typically, as a verification engineer, you have an architectural spec from which you write the design, and you write the tests, and then the design and the tests go into a regression system, which then runs the regressions, you know, multiple times a day or every night.
Then you basically look at the results of that regression to either improve the coverage or essentially fix the bugs that were introduced in the coverage improvement cycle or the debug cycle. A lot of the work in this workflow is repetitive. A lot of the work in this workflow is iterative, and so there's a great opportunity here to bring AI to essentially significantly accelerate this workflow and to determine which parts of it can be made autonomous and which parts still need humans to drive them. Another example is implementation. Even with all the great progress we have made with Fusion and Fusion Compiler and so on, there is a lot of iteration and repetitive work that an implementation engineer does. They have to evaluate different floor plans. They have to look at different tool switches and options.
They have to look at different libraries, different layer stacks, as they are searching for: what is that best combination that gives me the best possible PPA, which differentiates my product? Here again, the application of AI can dramatically accelerate not just the workflow, but improve the results, because autonomously, you can search a much larger space versus an individual essentially trying to do as much as they can in the time that's been allotted to them. So this is what is really promising and exciting about applying AI to different parts of the EDA workflow. At Synopsys, we have set a blistering pace of AI innovation. You know, when AlphaGo beat Lee Sedol back in 2016 in Korea, it opened up a lot of eyes at Synopsys to the possibilities of this technology.
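To make the knob search described here concrete, it can be sketched as a simple loop: sample a combination of settings, score it, keep the best. Everything below is hypothetical for illustration; `evaluate_ppa`, the knob names, and the scores are invented stand-ins for a real place-and-route run, not anything from the Synopsys flow.

```python
import random

# Hypothetical stand-in for a full implementation run scored by PPA; in a
# real flow this would be an expensive place-and-route job, not a formula.
def evaluate_ppa(cfg):
    base = {"wide": 0.90, "tall": 0.85, "square": 0.95}[cfg["floorplan"]]
    effort = {"low": 0.0, "medium": 0.03, "high": 0.05}[cfg["effort"]]
    return base + effort - 0.01 * cfg["layers"]  # toy score: higher is better

# Invented knobs: floorplan shape, tool effort level, metal layer count.
SEARCH_SPACE = {
    "floorplan": ["wide", "tall", "square"],
    "effort": ["low", "medium", "high"],
    "layers": [12, 14, 16],
}

def random_search(trials, seed=0):
    """Sample knob combinations and keep the best-scoring one."""
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(trials):
        cfg = {k: rng.choice(v) for k, v in SEARCH_SPACE.items()}
        score = evaluate_ppa(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score
```

With enough trials, the search reliably lands on the best combination, here `square`/`high`/12 layers. The point is that a machine can evaluate thousands of such combinations, while an engineer can only try a handful in the allotted time.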
We commissioned our AI team in 2017, and since then, we've been on this journey where we are rapidly learning, adapting, mapping it to our problem domain, and then essentially driving innovation. So in 2020, we pioneered the application of AI to EDA with the introduction of DSO.ai, and then we had to learn. We had to learn how customers would adopt AI. We got pushback from customers on not trusting the results of AI. We had to integrate AI into customers' flows. We had to figure out the right window of opportunity, where a customer is willing to bring new technology into their flow. So as we went through that process, we learned many, many lessons, and then we started to hit our tempo.
You can see the pace at which the number of tape-outs with DSO has been climbing. By the end of 2024, we will have over 400 tape-outs completed with DSO, moving toward 500. We were also able to then broaden our AI approach to include verification and test and analog. Today, you heard Sassine talk about 3DSO.ai as well. We also broadened our overall offering to include data analytics, where we essentially built a data continuum across design, manufacturing, as well as production, in order to analyze the data from EDA and then optimize the whole flow as well as the results. Then in late 2022, when OpenAI introduced ChatGPT, again, a huge, you know, set of possibilities emerged.
We moved rapidly, and by the end of 2023, we introduced the first LLM-based EDA application, which is basically Synopsys Copilot. And here again, you can see the pace at which we are moving in this area, the pace at which we are also deploying this technology and learning from customers in terms of how to make it scalable. And really, there's a lot of upside as we look at the roadmap going forward, as well as the usage by customers going forward. So market leaders are realizing significant gains with our AI technology. In blue there, you see some of the results that DSO has produced across different types of designs, and it's not just an advanced node thing. It can apply to mature nodes as much as it applies to advanced nodes.
It applies to the most bleeding-edge CPU as much as it applies to an image sensor design. So really, this is a technology that can improve power, improve performance, improve productivity, and essentially have a big impact on implementation flows. Similarly, if you look at verification, as we are broadening our deployment of our verification space optimization, this is a disruptive technology that is improving coverage, finding bugs that were not normally found with the traditional approach to verification, which is obviously huge for any hardware design company: you can find more bugs before your design tapes out. It also dramatically cuts down the total compute needed to achieve a certain level of coverage, because AI autonomously soaks up the coverage space instead of essentially getting trapped.
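As a rough, hypothetical illustration of the coverage idea: if you knew, or could predict, which coverage bins each test tends to hit, you could schedule tests so each run adds the most new coverage and skip runs that add nothing. The test names and bin sets below are invented; in a real VSO-style flow, the predictions would come from learned models rather than a fixed table.

```python
# Invented mapping from test name to the coverage bins it hits; a real
# flow would obtain this from simulation or a learned predictor.
TEST_COVERAGE = {
    "t_reset":  {1, 2, 3},
    "t_dma":    {3, 4, 5, 6},
    "t_cache":  {6, 7},
    "t_random": {2, 3, 4},
}

def greedy_schedule(goal_bins):
    """Order tests so each run adds the most not-yet-covered bins."""
    covered, schedule = set(), []
    remaining = dict(TEST_COVERAGE)
    while covered < goal_bins and remaining:
        # Pick the test with the largest novel contribution.
        name = max(remaining, key=lambda t: len(remaining[t] - covered))
        gain = remaining.pop(name) - covered
        if not gain:
            break  # no remaining test reaches any uncovered bin
        covered |= gain
        schedule.append(name)
    return schedule, covered
```

Running `greedy_schedule(set(range(1, 8)))` reaches full coverage in three runs and never schedules `t_random`, whose bins are all redundant. That pruning of redundant regression work is exactly the kind of compute saving the verification discussion points at.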
So these are all the advantages that are getting unlocked by AI in verification, and you can see that many of the marquee customers have already brought these technologies in-house and are actively deploying them. And to give you a sense of how fast things are moving, we wanted to share with you the design activity, which, in the case of implementation, is how many blocks in a company are using DSO. In the case of verification, it's how many IPs in the company are essentially going through the VSO flow. And really pay close attention to the trajectory.
You know, after those learning years of 2021 and 2022, with customers also essentially ready to embrace AI as a strong addition to their flows, you can see how we are beginning to see that big uptick in terms of adoption, in terms of usage. And by the end of 2024, we expect over 160 IPs at all the marquee customers embracing AI in verification, and over 320 designs on the implementation side. So this is really something that is moving very rapidly across our customer base. And then looking at other AI applications, right?
When you look at analog circuit design, this is one of the most expensive design steps, where an analog circuit engineer spends a lot of time, along with their team, tuning their analog design to basically meet a certain set of criteria at a given node, and then they are tasked with migrating that to another node. It takes significant effort to have that same circuit working at the next node. Here is an opportunity to bring AI to dramatically cut that cycle down, essentially by using AI to autonomously optimize the analog circuit at the next node. We have worked with all the advanced foundries, like Intel and Samsung and TSMC, in order to roll out analog design migration flows to our common customers to help them significantly accelerate their analog design workflows. Another area is test.
Test costs and test cycles are a big, big concern for companies with high-volume parts. And on the left, you can see the graph that shows how test cost is skyrocketing with complexity. So here again, AI can play a big role in dramatically cutting down pattern counts and test cycles, and thereby bending that curve in terms of test costs and test times. And here again, we are early in our engagements with DSO.ai at several market-making companies, and as we broaden the deployment, we will essentially be able to see this across a much broader set of customers. And then that brings us to LLM-based EDA, right? LLM is a powerful new concept, where essentially you can encode a very large amount of data in a very compact model.
I mean, even a trillion-parameter model is a compact model, given how much data was used to build that trillion-parameter model. And on that model, you can build a lot of interesting applications. So we began our journey with LLMs by introducing Synopsys.ai Copilot, which is essentially a generative AI copilot for all the EDA engineers using our EDA tools, in order to essentially have a 24/7, 365 assistant next to them, so that they can cut down the number of times they need to talk to either the Synopsys support engineers or experts within their own organization to get the data they need to keep moving faster with our tools. But we've also been able to now broaden that offering to automatically generate key EDA collateral: generation of RTL, generation of test benches, generation of formal assertions.
All of these are now possible with generative AI, and we are rapidly integrating these into our whole EDA portfolio, thereby driving significant productivity improvements among our customers. So here are some examples from our leading engagements at AMD, Intel, and Microsoft, interviewing some of the engineers that used our GenAI capabilities. They talked about 30% faster ramp times for junior engineers, focusing on critical tasks while GenAI takes care of the mundane stuff, and responses to expert queries that are at least 2x faster versus running it up the chain and trying to get hold of an expert engineer. So all this promises to unlock more productivity and more engineering activity with the application of GenAI. Another pioneering innovation is multi-die.
If you look at the past couple of years, these products from these amazing companies, you know, have come out and have changed the game with respect to AI training, with respect to compute. You can see each of these is an example of what a modern multi-die package looks like. Several hundred billion transistors essentially being brought together in a, you know, square-centimeter package. How is this happening? I mean, how are they able to do this? What kind of complexity are they dealing with? If you look at the pace at which multi-die is being adopted in the industry, for server, AI, and PC client, we expect a rapid uptick even by 2027. Cumulatively, this is going to translate to 30% of EDA software TAM being driven by multi-die by 2027.
So there is a tremendous opportunity here to essentially drive this adoption into our customer base. And if you look at the design of a typical multi-die system, you know, you have multiple high-bandwidth memories talking to compute arrays with AI accelerators, stacking of memories, you know, dozens and dozens of connections between the different dies. And if you look at what it takes to build a multi-die system, you first need the capacity. How do you, you know, represent several hundred billion transistors accurately, effectively, to be able to even see whether things are working as you expected? You need to have a lot of analysis that is native to the platform if you want to optimize bandwidth and latency, which is very critical to system performance. And then you've got to deal with millions of connections, right?
Between dies, between compute and memory, and all of these need automation. So a couple of ways to solve this. Some of our peers are expecting to solve this by extending their PCB and package design tools. The reality is, they are never going to scale, given the capacity limitations, given the need for automation, given the need for deep integration of analysis with design construction. The approach Synopsys introduced is what we call the chiplet disruption, where we essentially build on our Fusion, high-capacity IC platform and represent multi-die as an extension of the single-die problem. So essentially, by being able to co-optimize between 2D and 3D and having a single design environment where you can explore, construct, and sign off these multi-die systems, we are disrupting the space.
This is evidenced by many of the market leaders adopting our multi-die solution, which is called 3DIC Compiler. It's the only unified and scalable design platform for multi-die and 3D IC design, where exploration, construction, and sign-off are all integrated in a single cockpit. We talked about AI-powered exploration with the introduction of 3DSO.ai. That's integral to 3DIC Compiler. Our broad die-to-die IP portfolio is very integral because it's deeply integrated into our platform to automate the connections between the dies. And also with multi-die, test and chip health become a big, big concern, because not only do you have to do it at the single-die level, you now have to orchestrate it across all the different dies in your package.
So again, the proof is in the pudding, and several leading market makers have essentially embraced our multi-die approach and our multi-die platform: to design one of the most complex CPU designs, which was recently announced, 300 billion transistors with, you know, stacking of compute arrays and so on. A leading HPC AI semi company, essentially bringing 10 tiles together to create a Superchip, with both CPU and GPU compute integrated to accelerate training. And then a hyperscaler, essentially using our environment to optimize the thermal properties of their multi-die system, which then multiplies 10,000 times as they implement that multi-die design across their data center, significantly changing the overall thermal profile.
Then today, Sassine announced several new innovations in our multi-die portfolio with the application of AI, as well as native thermal analysis, in order to enable fast iterations and exploration. This brings us to energy efficiency. If you look at the projection of AI data center power over the next few years, it's expected to grow 4x: AI data center power is expected to go from 4.5 GW in 2023 to 18.7 GW in 2028. And this is, of course, because if it's taking 50 GWh to train GPT-4, you can expect that it's going to be even higher for the next generation of large language models. And even as the H100s evolve to the Blackwells, this is going to continue to be a significant challenge.
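As a quick back-of-the-envelope check on those figures (my arithmetic, not from the presentation), 4.5 GW to 18.7 GW over the five years from 2023 to 2028 is indeed roughly a 4x increase, which works out to about 33% compounded annually:

```python
# Figures quoted above: 4.5 GW of AI data center power in 2023, growing
# to a projected 18.7 GW in 2028.
start_gw, end_gw, years = 4.5, 18.7, 5

growth = end_gw / start_gw        # total multiple over the period
cagr = growth ** (1 / years) - 1  # implied compound annual growth rate

print(f"{growth:.1f}x overall, {cagr:.0%} per year")  # → 4.2x overall, 33% per year
```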
So how do you optimize for power and energy efficiency in silicon, from the software and the architecture all the way down to the device level? Here again, the unique Synopsys EDA portfolio, end-to-end, deep and broad, comes in to solve this problem in a pretty significant way. It spans all the way from system architecture, where we've got the industry-leading tools for exploring the architecture of designs and doing power optimization at that level with workload simulation, to design capture, where we can optimize the microarchitecture of the design with our RTL power capabilities, to implementation, where with Fusion Compiler and DSO.ai we can really squeeze power out of your implementation.
The gold sign-off, where essentially you're measuring the power in a silicon-accurate fashion, so you can, with confidence, optimize the design, knowing that you will see the power improvement in silicon, all the way to the health monitoring piece, where, in the field, based on the workload, you can adaptively change the voltage and thereby drop the power. This is an end-to-end approach that we are deploying across our customers, and some of the benefits: a leading AI startup reporting over 2x reduction in power consumption by doing architecture exploration and software profiling of AI designs, another NPU IP company talking about 20% power savings from all the RTL microarchitecture changes that we drove, and an HPC company talking about 12% dynamic power savings using our design construction portfolio. Moving to EDA compute acceleration.
So we talked about EDA becoming one of the most intensive workloads at semi and systems companies, and we have taken an all-of-the-above approach to accelerating EDA workloads. We are innovating on multi-core, as CPUs go from 96 cores to 128 cores to 256 cores, and we are seeing some dramatic speed-ups in areas like static timing analysis. We are innovating in distributed computing, where there are several problem statements in EDA where you can chop the problem up into hundreds or thousands of bite-sized chunks, solve them independently, and then pull all the results back together. An example is formal verification, up to this point considered computationally intractable. Now, we have essentially distributed it, and we are seeing significant improvements in the ability to prove more and more circuits.
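The chop-and-merge pattern described here can be sketched in a few lines. This is a toy stand-in, not Synopsys code: the "workload" is just counting odd values, but the shape, split the data into independent chunks, solve each one in parallel, then merge the partial results, is the same.

```python
from concurrent.futures import ThreadPoolExecutor

# Toy stand-in for a divisible EDA workload: count rule "violations"
# (here, just odd values) across a dataset.
def check_chunk(chunk):
    return sum(1 for v in chunk if v % 2 == 1)

def distributed_count(data, n_chunks=4):
    """Split the data into chunks, check each independently, merge."""
    size = max(1, len(data) // n_chunks)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=n_chunks) as pool:
        partials = list(pool.map(check_chunk, chunks))  # independent solves
    return sum(partials)  # merge step
```

In a real distributed-EDA setting, the chunks would be independent proof obligations or layout partitions and the pool would be a compute farm rather than local threads, but the split/solve/merge structure carries over directly.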
Cloud is an integral part of scaling, and scaling in an elastic fashion. So if you are an AI hardware supplier with a reticle-sized chip, doing physical verification today at an advanced node takes at least a week of runtime. Using our cloud solution with our IC Validator physical verification platform, we can get elastic scaling and do any chip within 24 hours. And then last but not least, there are the tremendous opportunities unlocked by accelerated computing with GPU and CPU/GPU architectures. Speaking about that in more detail, you all saw from Sassine's keynote this morning some of the work that we are doing with NVIDIA to really move the state of the art in this area. So, you know, the acceleration is not just in one area. We've been able to accelerate functional verification.
We've been able to accelerate key steps of place and route, accelerate circuit simulation and SPICE, and this is something that companies with very large analog circuits, like memory companies, really, really benefit from. And then, of course, you saw the big announcement around TSMC, Synopsys, and NVIDIA accelerating one of the biggest semiconductor workloads, which is essentially running computational lithography, with great innovations happening at the OPC layer from Synopsys, at the CPU/GPU layer from NVIDIA with the cuLitho library, and, of course, working closely with our common partner, TSMC, to optimize their whole OPC recipes. There, we've been able to show some dramatic breakthroughs in terms of performance for a very important workload. So pulling this all back together, right? I just want to leave you with four key messages.
The first is that Synopsys EDA is mission-critical in the eyes of our partners, semi companies, and systems companies, because of all the reasons I've laid out. We are very well-positioned with the breadth and depth of our portfolio. We have a next-generation product architecture that is going to enable us to rapidly fuse the electronics and multiphysics to really solve some very important high-value problems for the trillion-transistor multi-die package that we are gonna see before the end of the decade. And then, our pioneering spirit is what enables us to place these big, audacious bets in order to solve these important problems, which then drives significant growth in our EDA business. Thank you for your attention.
Intel and Synopsys have enjoyed a decades-long collaboration, which includes developing advanced design flows, combining Synopsys' AI-driven EDA suite with Intel's technology and design expertise. Our multi-generation partnership has created a robust chip design ecosystem. This approach helps our customers rapidly turn their designs into sophisticated solutions. As Intel Foundry continues to advance its process and packaging technologies, in the world's first Systems Foundry for the AI era, our customers can achieve optimal results with Synopsys EDA solutions. This AI-driven approach to design is now certified on Intel's most advanced process nodes. In addition, Synopsys' broad IP portfolio on Intel processes minimizes integration risk by helping chip designers achieve greater first-pass silicon success. The powerful synergy between Synopsys and Intel Foundry is a testament to pushing the boundaries of what's possible. Together, we are not just designing systems of chips, we're sculpting the future of technology.
Please welcome Synopsys GM Systems Design Group, Ravi Subramanian.
Good afternoon. Today, I'm going to share with you the details of our systems strategy as Synopsys seizes the systems opportunity in the era of pervasive intelligence. As Sassine shared earlier, pervasive intelligence is enabled by the rapidly increasing silicon and software content in systems. Many industries are being reshaped today as autonomy and electrification drive a very software-defined world. Systems companies, faced with these tectonic changes, are reshaping product development as more electronics content and silicon content and software content come in, and changing business models after their products get shipped: the value that they deliver to their customers can change even after they've sold the product, while the product is in the field. All this is driving the demand for more compute and silicon, both at the edge and at the core of these businesses.
Now, if we look at pervasive intelligence, it is driving a silicon-to-systems transformation across the spectrum shown here. If we look at the left side, as semiconductor companies started transforming themselves to deliver more software and silicon and hardware, they ultimately are delivering the heart of electronic systems. Now, going further to the right, as we enter the systems world, we are facing the fact today that virtually every form of control is moving to electronics-based, software-defined control. We've seen this in the past as we went from brakes to electronic braking systems and electronic fuel injection. Now we have MEA, more electric aircraft. All of these are replacing forms of control that were analog, hydraulic, or mechanical with digital forms of control. This is driving systems companies to really take a look at the silicon and the software that's purpose-built for their world.
This is driving the rapidly increasing electronics content in systems products. This is one of the most exciting times in the history of technology as computing permeates so many industries. Pervasive intelligence is truly pervasive, and automotive is a great example to look at. If we look at systems and software, as they become more essential to product development, they also change the way products are developed. R&D processes in systems industries, and automotive is the key example we'll start with, are becoming more and more complex and costly. First, in the automotive industry, it's well known that the silicon content found in cars is rapidly growing and is expected to reach almost 50% of the total cost of a car, the bill of materials.
The serial product development methods that have been historically applied to first get a chip, then build hardware, then put software on the hardware, build the product, and then test the product in the field, are rapidly breaking down. They're breaking down because as the rate of innovation increases due to software, the need for the product development requires a more continuous integration and validation cycle. This will support things like over-the-air software updates and real-time monitoring of the chip in the field. As software is developed, it then is integrated into the platform that is already in the field, and as the rate of innovation increases, more and more software-defined capabilities come to these platforms. These massive challenges that are facing the automotive industry are really starting from the standpoint of: How are their products changing across the systems world?
In the automotive industry, the products used to be about the driving experience. Today, it's a digital platform. Virtually every OEM in the world has recast their products as digital platforms that are gonna transform the automotive business. And we can expect to see this across many industries. Just take a look at the magnitude of this change. In automotive alone, as software comes in, it's staggering. From left to right: the automotive software market is expected to grow to $84 billion by the end of this decade. Just software for automotive platforms. You look at the number of lines of code growing from hundreds of millions to almost a billion as we reach the end of the decade and have full Level 5 capability. And then silicon and software development costs are growing greatly.
Silicon is growing because we now have the first cars with 7-nanometer chips, something that was completely unfathomable a decade ago. Then finally, software is taking an increasing percentage of the overall development cost of electronics products. Previous product testing methods, though, are being challenged in a big way. They're being challenged because more autonomy requires more levels of safety. More levels of safety require more forms of control. Those forms of control need to be validated, and to be validated, these control systems need to be exposed to more scenarios. In the automotive case, billions of miles of driving. And the real question is: is physical testing even practical?
Well, a key RAND report, Driving to Safety, made it clear: legacy ways of building products make achieving these capabilities an impossible proposition. So what does that mean? Solving these challenges requires a fundamental change in the product development paradigm. It requires digital twin technologies. As Sassine mentioned earlier, a digital twin is a virtual representation of a physical product. If you have a physical product with certain characteristics, you can model those characteristics in a virtual world, some or all of them, depending on the use case and the problem. Now, the minute you create something in the virtual world, you can take advantage of something that's impossible to take advantage of in the physical world...
And that is the massive increase in compute power that exists today, and that is really driving the dawn of the digital twin. A 2022 study by McKinsey discussed the key benefits customers see as they adopt digital twins, specifically virtualization technology, in their product development. First and foremost, of course, is the dramatic savings from moving physical testing to virtual testing. But beyond that, there are real benefits in revenue growth, time to market, improved product quality, and, at the end of the day, greater profitability for the product portfolio. As Ajei, who spoke earlier, can attest, Ansys has been a pioneer in multiple domains, physics-based and mechanical systems, and the adoption of virtualization has already started showing its key value. Now, industry leaders across the board agree virtualization is necessary today, given the challenges they're facing with software-defined products.
All of these OEMs have unequivocally said virtualization is essential for them to address the challenges in their product development. If we look at Synopsys, we have been purposefully building our way to delivering the comprehensive digital twin. This began in the mid-2000s as the mobile industry started driving the first complex SoCs. These systems on chips created a demand for software bring-up and hardware-software validation, and Synopsys established a hardware-assisted verification business at that time through its ZeBu and HAPS platforms. Today, Synopsys is the leader in hardware-assisted verification. We then added virtualization technology: virtual models of chips that can be provided to OEMs, enabling software teams to develop software on a model of a chip before the chip has actually arrived.
That capability has been built out over decades, and now Synopsys is the industry leader in virtualization. These two foundational pieces really form the core of electronics virtualization. Recently, we completed the acquisition of PikeTec. This is the bridge from the silicon world to the software residing on that silicon. Our acquisition of PikeTec was motivated by a large number of our virtualization customers telling us their software challenges are the next big challenge. Specifically, as the number of lines of code grows, the type of testing, the amount of testing, and the increasing validation requirements make this a formidable problem. What PikeTec enables our customers to do is master the art of test creation, test validation, results accuracy and collection, and requirements traceability, so you can trace a software test back to a requirement.
These first three pillars, or first three bars here, represent the core of the Electronics Digital Twin: silicon, models of chips, and the software residing on those chips. The next frontier in the comprehensive digital twin is products, products that have electronic forms of control. That is the virtual product: the virtual model of the product, and then the virtual model of the product in its environment. It could be a car in a city, a robot in a factory, a satellite in space, or a drone, but ultimately, it's a model of a product with its electronics and software operating within its environment. These represent the frontier and the culmination of the comprehensive digital twin, and Ansys will dramatically accelerate our ability to achieve it.
Now, our core electronics, software, and systems business really drives what we call the Electronics Digital Twin. This forms the basis to bring together virtualization of silicon, shown on the left here, and virtualization of the electronic product, shown on the right. Virtualization of silicon is about models of chips: the virtual model of the architecture, the virtual model of a chip, and models of combinations of chips. Then you have the environment to develop and test software on virtual models of chips, and ultimately the entire virtual model of the electrical architecture, software and silicon, operating together. All of these are tied together, as you see here, with what we call the Electronics Digital Twin fabric. The Synopsys digital twin fabric enables the connection of electronic digital twins with other domains.
A simple way to think about this is if you had an electronic braking system, you would have a mechanical domain, which is a multiphysics domain, and an electronic system in the digital twin. You can have these working together in order to see how software that controls a braking system can actually impact the mechanical world, and have the mechanical world and the electrical world talking together. This is really the fundamental value as you bring the validation of those systems into a virtual world. This fabric, now with silicon in the end product, allows you to then look at the next element here. When you have electronics in the product, you then have the ability to look at silicon in the product, and that begets Silicon Lifecycle Management, which allows you to understand the behavior of the chip in the product.
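To make that braking example concrete, here is a minimal co-simulation sketch. It is purely illustrative: the function names, timestep, and physics constants are hypothetical, chosen only to show the idea of a software controller (the electronics domain) exchanging signals each timestep with a simple mechanical model of the vehicle.

```python
# Illustrative co-simulation loop: a software brake controller and a
# simple mechanical model exchange signals once per timestep.
DT = 0.01  # timestep in seconds (assumed)

def brake_controller(speed, target_speed):
    """Software domain: compute a brake command (0..1) from observed speed."""
    error = speed - target_speed
    return max(0.0, min(1.0, 0.5 * error))  # clamped proportional control

def mechanical_model(speed, brake_cmd):
    """Multiphysics domain: apply braking deceleration over one timestep."""
    decel = 8.0 * brake_cmd  # m/s^2 at full braking (assumed)
    return max(0.0, speed - decel * DT)

speed = 30.0  # initial speed in m/s
for _ in range(1000):  # simulate 10 seconds
    cmd = brake_controller(speed, 10.0)   # electronics -> mechanical
    speed = mechanical_model(speed, cmd)  # mechanical -> electronics
print(f"final speed: {speed:.1f} m/s")  # converges to the 10 m/s target
```

In a real digital twin fabric, each side would be a full simulator rather than a few lines of Python, but the data exchange per timestep is the essential pattern.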
So let's take a closer look. Silicon Lifecycle Management, or SLM, is becoming essential to understanding the behavior of a product in the field, especially as more and more electronics goes into the product. As Shankar mentioned, Silicon Lifecycle Management begins with inserting monitors into a chip, and then taking advantage of those monitors: margin optimization while you're in design, yield optimization as you ramp production, silicon insights as you reach volume production, and finally, once the chip is in a product in the field, in-field operation, both predictive maintenance and in-field optimization. This is really revolutionary in that product lifecycle management now has Silicon Lifecycle Management as a fundamental core when products contain electronics. SLM enables a complete loop of build, operate, analyze, and optimize for electronics-based products.
Now, Synopsys is uniquely positioned to address these challenges, and I have some key proof points here. First, nine of the top 10 OEMs use Synopsys virtualization technology today. We have over 400 engineers focused on digital twin technology, serving the automotive geographies of Europe, North America, and Asia. Virtually every virtual chip model required by the top automotive semiconductor suppliers and OEMs is delivered by Synopsys. Our OEM partners tell us about the semiconductor partners they're working with: semiconductor companies drive the creation of virtual models to enable the assessment, evaluation, and development of products by OEMs, with the virtual model provided by Synopsys. Altogether, the system software and Silicon Lifecycle Management TAM, as Sassine mentioned earlier, represents a $3 billion opportunity.
If we look at our systems, system software, and SLM revenue, this business has grown at a 30% CAGR over the FY2020-FY2023 period. Now, the acquisition of Ansys accelerates our EDT strategy. The Electronics Digital Twin is an example of an extensible platform, as I mentioned earlier. It enables connections to multiple domains, as we show here. On the far left, you can see multiphysics domains and models of sensors. Then you have vehicle dynamics, which includes the crash simulation Ajei mentioned; driving scenario generators like Omniverse, with coverage; hardware-in-the-loop systems for hardware-based testing; and finally, embedded software tools for safety and security.
Now, by having an extensible platform that connects to these domains, Synopsys is responding to the fact that, more and more, our customers in different end markets are telling us, "We have growing electronics and software content in our domain. Help us virtualize product development so we get the benefits of virtualization in the way we develop products." And Ansys brings to the table a significant portfolio that accelerates the reach of the Electronics Digital Twin into many industries. So if we look at the combination of Synopsys and Ansys: on the left, you see Synopsys bringing the virtual vehicle for electronics and software testing, and the ability to integrate into a rich ecosystem to complete the automotive flow. In the automotive space, Ansys brings its core multiphysics environment, virtual testing, and simulation process and data management.
As you can imagine, there will be a lot of data moving around here, but also model-based systems engineering. The combination allows the creation of things customers do not have today, and it will dramatically accelerate and change the way they develop products, at much lower cost. It begins with the combination enabling comprehensive digital twins for software-defined vehicles. Hopefully that is evident from what we've just shared, but there is also virtual vehicle testing as we combine with scenario generators. Jensen, in his talk with the Synopsys slide you saw on the far right, was actually looking at Synopsys Electronics Digital Twins coming together with Omniverse. And then there is simulation data management, and ultimately, digital twin deployment and the lifecycle management of products. So really outstanding joint opportunities ahead for us to create together.
There are also opportunities in other verticals, and this brings us back to the bigger picture. The secular trend of autonomy and electrification is driving changes across the systems spectrum, and this will accelerate our expansion into new vertical markets: not only automotive, but aerospace, industrial, medical, and other industries. In the era of pervasive intelligence, virtually every one of those products becomes a smart product, controlled by software and digital methods, revolutionizing the product landscape in those industries. This is a dramatic transformation underway across the systems world as silicon brings compute and control into so many new industries at an affordable cost. Ansys will also accelerate our expansion through a very focused go-to-market.
On the left side here, as you heard earlier, Ansys will strengthen our capabilities in advanced chip design technology as we enter the era of multi-die systems. The foundation of that is our seven-year relationship with Ansys: understanding, collaborating, building, and delivering winning solutions for customers recognized the world over. The next chapter is accelerating the expansion into new growth verticals, an additional chapter in the story of the combination. Ansys' go-to-market reaches, through their fast-growing markets, well beyond the high-tech segment: you can see about 69% to 70% of Ansys' business reaching into aerospace, automotive, industrial, et cetera. This go-to-market engine is fundamental to capturing and seizing the digital twin opportunity as Synopsys and Ansys combine.
It opens up high-potential verticals and will drive unlocking the $13 billion systems TAM with Ansys. As you saw earlier, the $13 billion is built up first from our system software and SLM business, a $3 billion TAM growing at about 20% over the five-year period shown, and then the Ansys simulation and analysis TAM of $10 billion, together making up the $13 billion, with the opportunity to expand that TAM further with digital twins. In summary, we have an unprecedented opportunity as silicon, software, and systems meet. This is one of the most exciting times in the history of technology, as electronics, through silicon, becomes pervasive in every industry. As the industry leader in hardware-assisted verification and virtualization, Synopsys today is ready and poised to win.
I've shared some proof points here in automotive, and those successes, with nine of the top 10 OEMs and with virtually every new automotive chip being driven by virtualization from Synopsys, are a key factor in our ability to execute this bigger vision. The Ansys acquisition will certainly accelerate our silicon-to-systems strategy and really catalyze the opportunity presented by pervasive intelligence entering the systems world. Thank you.
Tesla's mission is to accelerate the world's transition to sustainable energy. We are well on our way with the Model Y, which is not only the best-selling electric vehicle, but also the world's best-selling car of any type. We revolutionized the automotive industry by pioneering the concept of software-defined vehicles. These cars are built on the premise that if you design the silicon and systems right, the car's features and functions can be rolled out, modified, and upgraded through software alone. Our partnership with Synopsys is critical to executing on our vision. Their portfolio of automotive-grade IP and AI-driven EDA tools help us create differentiated, safe, and secure SoCs, and their emulation and virtual prototyping solutions reduce development and validation effort for digital twins. Synopsys provides the technology we need to innovate, speeding design time and reducing risk from Silicon to Systems.
In fact, our most recent autopilot chip, AP4, shipped on first-pass silicon. Synopsys is our trusted partner in helping us accelerate our chip and system development efforts, and we look forward to many years of collaboration and innovation together.
Please welcome Synopsys SVP Solutions Group, John Koeter.
Hi, how are you today? My name is John Koeter. I'm the Senior Vice President of Product Management and Strategy for the Synopsys Design IP Group, and I've been in this role since 2007. In today's presentation, I'm going to give you a little bit of a look back at where we have been, and then, of course, I'm going to spend most of the time looking forward to where we're going, to what I consider to be a very bright future. So first, let me start off with the anatomy of a modern SoC. In a modern SoC, you have a heterogeneous compute cluster made up of CPUs, GPUs, NPUs (neural processing units), and DSPs: different compute elements optimized for different software workloads.
Now, of course, the chip needs to communicate with the outside world, and it does that through interfaces: whether it's a memory interface like DDR or HBM, a system interface like PCI Express or Ethernet, or a peripheral interface like USB or HDMI, the chip needs to connect to the outside world. And in the era of multi-die systems, it also has to connect to other dies, so one of the things you're seeing rapid adoption of is die-to-die interfaces. In addition to the compute cluster and the interfaces, another major component of a chip is what we call foundation IP, 'cause it truly is foundational. That's standard cells, memory, general-purpose I/Os, that kind of thing, found in each and every chip that is manufactured. And lastly, I'm going to talk about security IP.
Security IP, as you'll see in a survey coming up in a few slides, is critically important in today's modern SoCs. Now, tying all of these blocks together is an on-chip bus fabric. I spent a minute talking about the major components of an SoC because I want to put this in the context of the third-party IP market. Now, this data is fresh, as fresh as it can be. In fact, these are preliminary results from a market research firm called IPnest. They did me a favor and gave me a preliminary version of this report specifically for this event. According to IPnest, the 2023 third-party IP TAM was just over $7 billion, $7.05 billion if you want to be precise.
Of that, about 48% was processors, 29% interfaces, another 15% foundation IP, and the rest other, primarily security and bus fabrics. Taking a look back, starting in 2015, which was the first year IPnest did this market report, the IP market back then was $3 billion: 56% processors, 18% interfaces, 14% foundation IP. In 2023, again using round numbers, the market size was $7 billion. So over this period, the market CAGR was 11%, but not all segments grew at the market rate. Processors grew at a 9% CAGR, foundation IP grew at the market rate, and interface IP grew at an 18% CAGR.
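As a quick sanity check of the arithmetic, the segment CAGRs quoted above can be recomputed from the 2015 and 2023 breakdowns. This is just a sketch using the rounded dollar figures and share percentages from the talk, with 2015 to 2023 treated as eight years:

```python
# Recompute the IPnest segment CAGRs from the quoted market breakdowns.
def cagr(start, end, years):
    """Compound annual growth rate as a fraction."""
    return (end / start) ** (1 / years) - 1

# Segment sizes in $B: share of the 2015 ($3B) and 2023 ($7B) markets.
segments = {
    "total market": (3.0, 7.0),
    "processors":   (3.0 * 0.56, 7.0 * 0.48),
    "interfaces":   (3.0 * 0.18, 7.0 * 0.29),
}
for name, (start, end) in segments.items():
    print(f"{name}: {cagr(start, end, 8):.0%}")
# total market: 11%, processors: 9%, interfaces: 18%
```

The recomputed rates land on the same 11%, 9%, and 18% figures quoted from the report.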
Now, again according to IPnest, since we don't report Design IP revenue going back that far: during the same period, Synopsys' CAGR was 19%. Okay, so how does that translate into market share? Back in 2015, we had 13% market share. In 2023, we had 22% market share. Every year for the last six or seven years, we've been gaining a point of market share. But over the last year, between 2022 and 2023, the overall third-party IP market grew a little over 6%, and you know from our filings that Synopsys grew a little under 18%.
So we actually grew at about three times the market rate between 2022 and 2023, and as a result we gained two points of market share in the last year. So let me talk a little bit more about Synopsys: 25 years of investment and commitment, about 25% of the company, or about $1.54 billion; the number two provider of IP worldwide as measured by revenue; number one in interfaces; number one in foundation IP; and a growing processor business with our ARC product lines, our neural processing units, and our DSP processors. Now, we have built this business in a very deliberate, very strategic manner, starting 25 years ago. We started with the basic building blocks that are on every single chip. We then added interfaces. We added foundation IP, which is memory and standard cells.
We added processors, security, and sensors, so that we now have the world's broadest IP portfolio, covering most of the common blocks on a modern SoC. And as my boss likes to say, we did this in a very deliberate fashion: with every new area, we first go deep, then we go broad, and then we move to the next area, and we've done that for 25 years. Now, of course, a chip is not just made up of third-party IP. Our customers use our best-in-class EDA tools to take their algorithms and their RTL, including soft IP RTL, and synthesize them down into the foundational elements of the chip. And today, you might have heard this morning that we expanded our IP portfolio by acquiring Intrinsic ID.
Intrinsic ID is a market leader in Physical Unclonable Functions, or PUFs. The reason we're really excited about Intrinsic ID is that a PUF, when combined with an SRAM on the chip, produces a signature that uniquely identifies that chip. That unique signature can then be used for all sorts of things; for example, it can be used to create a hardware root of trust and a trusted execution environment. Okay, the other thing I wanted to highlight here is that if you look at the trailing twelve months through Q1 2024 and compare that to a decade before, we have grown revenue in this business by more than 5x.
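Going back to the PUF for a moment, the SRAM-PUF idea can be sketched as follows. This is a toy model, illustrative only: the function names are hypothetical, and real PUFs, including Intrinsic ID's, use helper data and error correction rather than the simple majority vote modeled here.

```python
# Toy SRAM-PUF model: each chip's SRAM powers up to a mostly stable,
# chip-unique bit pattern; a few cells flip due to noise. Averaging
# readouts recovers the stable pattern, which hashes to a unique key.
import hashlib
import random

CELLS = 256  # number of SRAM cells modeled

def sram_powerup(chip_seed, noise=0.05):
    """One power-up readout: a stable per-chip bias plus random noise."""
    bias = random.Random(chip_seed)           # chip's manufacturing "fingerprint"
    stable = [bias.randint(0, 1) for _ in range(CELLS)]
    return [bit ^ (random.random() < noise) for bit in stable]

def chip_fingerprint(chip_seed, readouts=15):
    """Majority-vote several readouts, then hash into a device-unique key."""
    votes = [sum(cells) for cells in
             zip(*(sram_powerup(chip_seed) for _ in range(readouts)))]
    pattern = bytes(2 * v > readouts for v in votes)
    return hashlib.sha256(pattern).hexdigest()

key_a = chip_fingerprint(chip_seed=1)
key_b = chip_fingerprint(chip_seed=2)
assert key_a != key_b                          # different chips, different keys
assert chip_fingerprint(chip_seed=1) == key_a  # same chip, stable key
```

The point of the sketch is the property the talk describes: no secret is ever stored, yet each chip reproducibly yields its own key, which is what makes the signature usable as a hardware root of trust.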
At the same time, we have more than doubled our non-GAAP operating margin percentage. I call that 10x operating leverage, and that's very impressive. Now, let's look at the future. What's driving the growth of the IP business into the future? Five primary things. Silicon proliferation: silicon is going everywhere, in every application, in every market segment, and in record amounts; just look at automotive as an example. Moore's Law: despite many rumors of its death, Moore's Law continues to march forward; we call it the march to angstroms. Multi-die: a huge tailwind for Synopsys IP in particular, and I'll explain why in a few slides. Pervasive AI: changing the landscape in every market segment, and particularly good for our interface IP business. And accelerated outsourcing.
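The "10x operating leverage" point is just the two factors multiplied together. Here is a tiny worked example; the base revenue and margin figures are made up purely for illustration, only the 5x and 2x multipliers come from the talk:

```python
# Illustrative arithmetic: revenue up 5x and margin percentage doubled
# implies absolute operating profit up roughly 5 x 2 = 10x.
rev_then, margin_pct_then = 300, 15                 # $M and %, assumed
rev_now, margin_pct_now = rev_then * 5, margin_pct_then * 2

profit_then = rev_then * margin_pct_then / 100      # 45.0
profit_now = rev_now * margin_pct_now / 100         # 450.0
print(profit_now / profit_then)                     # 10.0
```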
I'll talk a little bit more about that as well. So let's look at each of these five growth drivers. Silicon proliferation: I'm going to start with Synopsys' internal forecast of design starts for the rest of the decade. Looking out over the rest of the decade, we are predicting a 6.4% increase in design starts. You can see the stacked bar chart here. The bottom segment is high-performance compute, or AI, and you can see that growing very nicely, but it's not the only segment growing: automotive is growing, high-end mobile is growing, consumer and IoT are growing. And each one of these design starts represents a multimillion-dollar IP opportunity for Synopsys; in fact, most of them represent an eight-figure opportunity.
And by the way, when I was explaining that slide, I should have taken a minute to say these are advanced design starts, which we define as 16 nanometer and below. Okay, so despite the rumors of Moore's Law's death, it continues to march on. This is our internal database, our pre-sales database, and between EDA and IP, I'd say we have pretty good insight into almost every chip being done in the industry right now. We monitor it and mine it for trends. So you can look back here at our pre-sales database starting in 2018. First of all, I just want to point out the shape of this curve: you see pre-sales opportunities, which are a proxy for design starts.
You see pre-sales opportunities increasing. Around late 2019 into 2020, you definitely see a knee in the curve as design starts and opportunities kicked up due to the combination of AI and COVID. You see that at the end of 2022 there was a slight dip in design starts for a couple of quarters, not much, due to macroeconomic factors. And then, for the next three or four quarters, things are up and to the right, and pre-sales opportunities are growing again. The other thing I wanted to point out on this slide is the node mix of the pre-sales opportunities. Just look at the green line here: that's 5 nanometer. Look at how nicely that grew starting in about 2019. Look at the yellow bar: 3 nanometer.
That grew nicely too, and down at the very bottom, in purple, you can see 2 nanometer. So Moore's Law is alive and well and continues to evolve. Now, it's changing economically, without a doubt, but it is alive and kicking. And this is important to our IP business because increased process technology complexity means higher ASPs for our IP. And by the way, in case you don't know, we are by far the leader in physical IP, by which I mean IP that is process-specific; at the end of the presentation, I'll give you some stats on that. All right, so we mentioned silicon complexity is exploding: software-driven systems, pervasive intelligence, the march to trillions of transistors.
All of this is transforming what used to be an individual SoC into a multi-die system, and multi-die systems are good for Synopsys IP: more design starts, more interfaces, more IP in multiple foundries, and more IP in multiple nodes. As the company with the broadest IP portfolio and the leader in physical IP, this trend is strongly positive for us because we can take advantage of our scale. Now, one of the biggest megatrends is pervasive AI and how it is changing every single application out there. There's a very well-documented and well-understood phenomenon in the industry: there is a significant gap between the number of CPU cores you can put on a chip and how fast you can get data to those cores.
This graph happens to be from Meta, presented at the OCP event in November. You can have up to a couple hundred CPU cores on your chip today; the bottleneck is that you just can't feed those cores fast enough. The industry knows this, of course, and it has been responding: the period between major interface standard generations, in this case DDR, used to be every three or four years; it's now every two years, even 18 months. And this is not just happening on the memory side. It's happening in PCI Express, in USB, in HBM, and this trend is strongly positive for Synopsys.
The reason is that the faster the data interface and the more complex the interface standard, the higher the ASPs... Now, you may remember the very opening video with Nafea at Amazon. Do you remember what he said? He said, "We have had a long-term partnership with Synopsys on interface IP, so we can focus on other things that differentiate our systems." And he is not alone. System companies are coming into the market in many different areas, not just high-performance compute but also mobile and automotive. System companies are developing custom silicon, and in fact, from the end of 2018 to the end of 2023, our IP business coming from system companies went up by 50%.
As Nafea said, system companies coming in and doing their own custom chips do not have legacy IP, so for them, IP is a greenfield. It's as close to a no-brainer as it can be to just go out and buy IP. Now, one of the other things accelerating outsourcing is what I call supply chain resiliency. Our customers are asking us to provide our IP in many different foundries so they can have the optimum solution for their system. And not only do they want that choice, that optionality, they want the same IP portfolio in whatever foundry and whatever process node they go to.
So whether it's TSMC or Intel or Samsung, or a new Japanese foundry called Rapidus, or GlobalFoundries, or UMC, or any of the other foundries, our customers really trust us and look to us to provide the on-ramp onto these foundries. Now, let me talk about the third-party IP market again. You'll remember from the beginning of the presentation that the third-party IP market is about $7 billion, but the market is actually only about three-quarters outsourced. Long-established semiconductor companies have been doing their own IP for many decades, but they're subject to the same forces I just outlined: supply chain resilience, multi-die, and all the rest. As a result, every year, slowly and steadily, we convert long-established semiconductor companies from make to buy, thereby growing our SAM and our business.
All right, now, we, not I, but we, employed a third-party analyst firm called Channel Media to conduct a survey, which they ran over about the last month and a half, so again, this data is as fresh as it can be. Channel Media reached out to 2,500 engineers in both semiconductor and system companies all over the world, focusing on what they call eight different personas, everything from design engineer to VP of engineering to the C-suite.
They asked them, "When you consider third-party IP, what is most important to you?" Here are the top 10: quality of the IP, understanding their use case, security, customer service, licensing terms, level of engineering expertise, having the best PPA (power, performance, and area), features matching the application, vendor reputation, support and financial stability, and scalability to provide the IP they need in many different nodes and foundries. In this blind survey of 2,500 people, these were the 10 most important reasons for going to third-party IP. They also asked: how does Synopsys stack up? And again, remember, this is a blind survey; the respondents did not know that Synopsys was sponsoring it. When asked, "Who has the best IP?" the answer was Synopsys.
"Who has the highest quality IP?" Synopsys. "Who has the best IP support?" Synopsys. "Who is the overall industry leader for IP?" Synopsys. You can see the percentages; the survey respondents gave Synopsys top marks across the board. Now let's look at what our customers say. Rohit at Ampere said, "We know that they will be there to help us achieve our best designs." Thomas Boehm at Infineon talks about how Synopsys is supporting them in building automotive systems with the highest level of functional safety. Pete Bannon at Tesla, whom you just saw right before this, talks about our automotive-grade IP portfolio and how Synopsys provides the technology to speed design time and reduce risk. In fact, he mentioned that their most recent autopilot chip, AP4, was a first-pass silicon success.
Mark Papermaster, in today's video in Sassine's SNUG keynote, talked about how AMD used Synopsys IP on their MI300 chip. So that's what our customers are saying. Let's look at what our foundry partners are saying. Stu Pann at Intel Foundry: "Synopsys' broad IP portfolio helps chip designers achieve first-pass silicon success."... Jongshin Shin at Samsung Foundry: "Synopsys is our primary IP partner, and they have benefited mutual customers by providing access to high-quality IP." Cliff Hou at TSMC: "Our collaboration with Synopsys leads customers to the fastest path to silicon success." So if I go back to the five drivers I started this section with, silicon proliferation, the march to angstroms, multi-die, pervasive AI, and accelerated outsourcing, I've turned them into an equation, and it's a pretty simple equation.
More design starts, times higher ASPs (driven by Moore's Law and by faster interfaces), times more outsourcing, equals a sustainable mid-teens growth rate. So we are reaffirming and recommitting to a sustainable mid-teens growth rate for the Design IP group at Synopsys. Now, throughout this presentation, I've talked about Synopsys' scale and how it's a sustainable advantage. Let me put that in numbers. We have IP in more than 30 different foundries, covering 380 process technologies. We have 6,300 engineers producing IP each and every day, and that has resulted in a catalog of more than 8,000 IP parts and, over the last five years, more than 10,000 design wins for our IP.
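The growth equation can be illustrated numerically. In this sketch, only the 6.4% design-start figure comes from the forecast quoted earlier (treated here as an annual rate); the ASP and outsourcing rates are assumptions chosen purely to show how the factors compound, not guidance:

```python
# Illustrative compounding of the three growth factors in the "equation".
design_start_growth = 0.064  # from the design-start forecast in the talk
asp_growth = 0.05            # assumed: Moore's Law + faster interfaces
outsourcing_growth = 0.03    # assumed: make-to-buy conversion

total = ((1 + design_start_growth)
         * (1 + asp_growth)
         * (1 + outsourcing_growth)) - 1
print(f"{total:.1%}")  # ~15.1%, i.e. mid-teens
```

The point is that three modest single-digit factors, multiplied rather than added, land in mid-teens territory.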
So if I wrap this up, Synopsys Design IP is operating in a very vibrant third-party IP market with strong tailwinds. We are the leaders in interface IP. We are the leaders in foundation IP. We have a growing processor IP business. We're the number two worldwide IP vendor by revenue, about 25% of Synopsys, 25 years in the making, about $1.54 billion last year. You heard the results from the survey: the best quality IP, the best support, the most recognized IP vendor. And we have a very resilient business model, which is based primarily on design starts. And so the future is bright for our Design IP group. We will continue to grow this business at a sustainable mid-teens revenue growth rate. Thank you very much.
At Ampere, we are building the highest performance and most power-efficient CPUs for compute of all types. To deliver the power efficiency, performance, and security that our customers demand from our processors and platforms, we believe in using the right tools for the job. With exploding demand for AI compute, this approach gives our customers the ability to right-size their AI compute with our innovative processors to create a new era of pervasive intelligence delivered sustainably. Therefore, our collaboration with Synopsys is essential to providing the capability that Ampere's design teams require to rapidly bring our differentiated products to market. Synopsys provides a broad IP portfolio, from memories and logic blocks to complex security and interface subsystems, as well as their full-stack, AI-driven EDA tool suite and hardware-assisted verification systems.
These critical IP and EDA capabilities have enabled Ampere to achieve first-pass silicon success and accelerate time to market for many of our designs. As a partner, there is a tremendous amount of trust that we put in Synopsys because we know they will be there to help us achieve our best designs and celebrate in our success. We look forward to many years of collaboration and developing the future of sustainable AI with Synopsys.
Please welcome Synopsys Chief Financial Officer, Shelagh Glaser.
It's great to see everybody here. So I'm going to wrap this all up with financials. Bringing it all together, we have consistent, strong business execution. We've met or exceeded all financial targets: revenue, non-GAAP operating margin, non-GAAP EPS. And, I mean, you heard today, it's in our DNA to execute, because we have to execute for our customers. We're in the mission-critical part of their business, and as we're executing for them, we're executing on our financials. And our track record is one of resilient, durable, profitable growth. From 2020 to 2023, our revenue CAGR has been 17%, we've expanded non-GAAP operating margin by 700 basis points, and we've grown non-GAAP EPS at a 26% CAGR, delivering both top- and bottom-line expansion.
And as we've talked about today, this era of pervasive intelligence, silicon proliferation, and the movement of designing from the software down has actually accelerated the rate of our growth, and our technology helping unlock the innovation of our customers has been critical. So we have accelerated the top line: in the last several years, we've grown the top line 17%, versus 10% in the prior 10 years. And importantly, our commitment to scaling has allowed us to accelerate the bottom line: in the past several years, we've driven a 26% EPS CAGR, versus 13% in the prior 10 years. And we are focused on delivering shareholder return. Again, that's in our DNA. We've delivered superior value creation. As we've talked about today, we've outperformed all major industries over the last 3 years, and we've outperformed 6 of the Magnificent Seven.
And our focus on this is, again, to unlock value for our customers, which unlocks value for our shareholders. We've built an incredibly high value portfolio. You heard today about the incredible capability that we have. You heard from Shankar, you heard from John, you heard from Sassine about the quality of our portfolio. So in EDA, we're number one. We are the only company in the industry with a full EDA platform, and our design flow from system architecture through manufacturing is unparalleled. And we are the standard, the gold standard, in both power and timing, which is critical to our customers. In IP, you just heard from John, we're number two in the industry, and we've built the broadest portfolio that allows our customers to know they can depend on us to deliver to their IP needs.
In Software Integrity, we've built the world's best platform for application security testing. The quality of this portfolio is driving the double-digit revenue growth, and in each and every business, we've improved operating margin as well. We're also focused on generating free cash flow. We've been generating strong free cash flow over the last three years: we've driven a 22% CAGR, and we've expanded free cash flow margin by 330 basis points. We're driving above-market growth in all of our segments, and it's really driven by the innovation that we've been driving. You heard Shankar talk about Design Automation. The market growth has been 14%; we've been outpacing it at 17%.
As Shankar and Sassine talked about, we are indexed to the part of the market that is growing the fastest, specifically digital design and verification, and most notably, those are also the technologies that are used in AI and HPC. In Design IP, we're outpacing the TAM growth of 13% with our growth of, as John just talked about, 21%, and we're focused on the interface and foundation IP that is outgrowing processor IP. So we're focused on scaling the business, both top line and bottom line, and we've focused on improving profitability in our segments, where we've driven more than 460 basis points of expansion. Let's dive a little bit deeper on Design Automation.
As we heard from Sassine and Shankar, Design Automation is delivering above-market growth, and it's really driven by the unbelievable trends in this business. We expect continued growth, fueled by, as Shankar likes to call it, the industry's relentless march to angstroms. The industry's doing that because more computing is needed, more performance is needed. There's the move to multi-die, which is creating a lot of complexity for customers, and we're creating tools that allow them to tame that complexity. There's the mega trend of AI, which is not only fueling more chips in service of AI, but also the broad adoption of our Synopsys AI as customers ingest it into their design flows. And there's the continuous need for energy-efficient compute.
So we're making sure that power consumption and power optimization are addressed from software all the way down to devices. And as Shankar shared, in EDA we have compute acceleration innovations, so our customers can use multi-core, distributed computing, cloud, and GPU, whatever helps speed up their designs. Our focus in this business is driving double-digit growth against market growth of 12%, and in the future, as Sassine talked about, we expect 2% of additional growth as we infuse AI into the business and monetize it throughout our customer base. In Design IP, this is an extremely resilient business. We just heard today about the number of titles we have.
Because we've driven this, we've been able to drive our revenue CAGR at 21% and our operating margin improvement of 430 basis points. This is fueled by, again, the proliferation of silicon, the explosion of silicon across our customers, and the march to Angstroms. As John shared, each new process node needs new IP. We need to optimize new IP, and that's higher monetization for us as we march forward. And there's the continued focus on multi-die: as customers look to continue their march on delivering performance, that means we're selling IP not just on one chip, but on multiple chiplets.
Then, in the era of pervasive intelligence, the critical IP interfaces that John talked about are moving from four-year refresh cycles to two-year refresh cycles, things like PCIe and USB, and again, that's more monetization for us. The trend to accelerated outsourcing, as John talked about, helps customers have more flexibility in their design. But for us, it means more IP blocks coming to us, because customers are using their scarce engineering resources on their differentiated blocks when we can give them proven IP interface blocks that are on every foundry, allow them to make the right choice for their business, and ensure that they get a high-quality chip on schedule. We have industry-leading growth, and we expect this growth to be in the mid-teens as we move forward.
Let's talk a little bit more about Ansys and Synopsys. As you heard today, we are adding the Ansys leadership portfolio onto our leadership portfolio, and Sassine and Ajei talked about the capabilities that are coming as we bring Ansys in; those capabilities are going to help us serve customers even better. And foundational to that, our strong seven-year relationship means we'll be ready to serve those customers with technology and in go-to-market. We shared this slide when we made the Ansys announcement. Ansys is an extremely high-quality business with durable revenue growth. The combined entity will be an $8 billion company, and we expect the combined company to deliver industry-leading double-digit growth. We expect Ansys to be neutral to our growth in the first year and accretive thereafter.
Post-close, we expect immediate margin expansion, increasing Synopsys non-GAAP operating margin by 125 basis points the first year post-close and by approximately 250 basis points in the medium term. We continue to expect unlevered free cash flow margin to expand by approximately 75 basis points the first year post-close and by approximately 250 basis points in the medium term. We expect significant synergies on both cost and revenue, and expect the transaction will be non-GAAP EPS accretive within the second full year post-close and substantially accretive thereafter. Unpacking the synergy: as we've talked about, on cost, we expect $400 million of actionable, identified cost synergies on a run-rate basis by year three.
We expect to be able to streamline and realize these benefits and be able to integrate our engineering platforms, focusing on infusing AI and cloud. And in terms of revenue, we expect $400 million line-of-sight run-rate synergies by year four, and that's really informed by our understanding of our customer base and their customer base, and I'll unpack this a little bit more. Long term, we expect that $400 million in synergies will translate into $1 billion annually. Unpacking the synergies a little bit more, 'cause I know we've gotten a lot of questions as we've talked to you about this. On cost synergies, the substantial majority of cost synergies are going to come from streamlining and scaling. Again, we have two strong businesses coming together.
We have a great understanding of the company from our seven-year relationship and then also informed by our diligence, and we expect that, bringing the companies together, we're going to be able to achieve significant streamlining and scaling. A lesser portion, about 20% of it, will come from combining our platforms together. As you've heard today, we've got AI deeply infused into our products. Cloud is another opportunity, and we see bringing those platforms together to be able to give, you know, good offerings to our customers, but the cost synergies are very line of sight. We know exactly where we will get these cost synergies. On revenue synergies, the $400 million year-four run rate, a substantial majority of it, about 70%, will come from our existing customer base.
We've talked a lot today about multi-die and the complexity facing our customers as they try to navigate the complexities of Moore's law and balance performance, cost, and schedule, and they're moving to multi-die. We know we need to bring a deeper fusion of everything Shankar discussed with us together with the multiphysics, so that customers can have a design platform and solve those multi-die problems. So about 70% of that synergy will come from our existing customers as we solve that. The other roughly 30%: Ansys has decades-long relationships in automotive, aerospace, and industrial, and that portion will come from us going more deeply into those customers as we leverage their capabilities. We're confident that that $400 million will translate into the $1 billion that we've talked about long term.
Importantly, everything Ravi talked about in terms of the digital twin is above and beyond. As we integrate Ansys, we expect to be able to build out that full digital twin that Ravi was talking about, but those will be additional synergies on top of these committed synergies. As we move to the balance sheet and financing of Ansys, we expect gross debt to adjusted EBITDA to be approximately 3.9x at the transaction close. Given that we expect the combined company to generate substantial and sustained free cash flow, we expect to rapidly delever to less than 2x within two years of the transaction. As we approach 2x leverage, we intend to resume share buybacks.
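For anyone following the balance-sheet math, the leverage ratio cited here is simply gross debt divided by adjusted EBITDA. A minimal sketch of that arithmetic, using hypothetical dollar figures (the actual debt and EBITDA amounts were not given in the remarks):

```python
# Illustrative leverage-ratio arithmetic. All dollar figures below are
# hypothetical, chosen only so the ratios match the targets cited in
# the remarks (~3.9x at close, under 2x after delevering).

def gross_leverage(gross_debt_bn, adjusted_ebitda_bn):
    """Gross debt to adjusted EBITDA, the ratio cited in the remarks."""
    return gross_debt_bn / adjusted_ebitda_bn

# Hypothetical: $15.6B of gross debt against $4.0B adjusted EBITDA
at_close = gross_leverage(15.6, 4.0)          # 3.9x

# Paying down debt while EBITDA grows pulls the ratio under 2x
after_two_years = gross_leverage(8.5, 4.6)    # under 2x

print(f"{at_close:.1f}x at close, {after_two_years:.1f}x after delevering")
```

The sketch shows why delevering happens on both sides of the fraction: debt paydown shrinks the numerator while EBITDA growth expands the denominator.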
Our long-term gross leverage target is below 1x, and we expect to maintain investment-grade credit ratings, given our strong cash generation and commitment to rapidly delever. Now, let me talk a little bit about our integration. We expect to close the transaction in the first half of 2025. We have highly complementary portfolios, as you've heard today, and we are driven by the same customer needs and the same secular trends. Both companies are rooted in technology and science. You heard that in what I think was a great talk with Sassine and Ajei: rooted in technology and science, really solving difficult customer problems, delivered by extremely strong talent, and underpinned by a similar culture. We heard that from Ajei today.
Our integration planning is informed by our seven-year-long relationship and our comprehensive diligence, and we are going to be ready to successfully integrate Ansys in the first half of 2025. Our integration planning approach includes key leaders from both Synopsys and Ansys, and we're focused on ensuring we have a path to accelerate the business, support our customer needs, deliver the synergies, and maintain key talent. We look forward to providing updates as we progress. Now, I want to take a minute to talk about our disciplined capital allocation approach. Throughout this day, you've heard about the tremendous opportunities we have in Design Automation and Design IP. We announced in Q4 2023 that we were exploring a strategic alternatives review process for our Software Integrity Group. As we've talked about, that portfolio is the leading application security testing portfolio in the industry.
We've completed, as Sassine said earlier today, the comprehensive review, and at the conclusion of that, we've made the decision that we're going to move forward with pursuing a sale of SIG. I want to update our 2024 guidance to reflect both the strength of Design Automation and Design IP and our discontinued operations of the Software Integrity Group. For updated guidance for 2024, the full-year targets are revenue of $6.06 billion-$6.12 billion, up approximately 14%-15.1%. Total non-GAAP costs and expenses between $3.76 billion-$3.80 billion, resulting in non-GAAP operating margin of 38% at the midpoint, and non-GAAP earnings of $12.86-$12.94 per share, up approximately 22%.
Our 8-K and investor presentation include additional targets and GAAP to non-GAAP reconciliations. Our long-term financial objectives, on a combined-company basis with Ansys under our Silicon to Systems strategy, are to continue to deliver shareholder value, driven by annual industry-leading double-digit revenue growth, non-GAAP operating margins in the mid-40s, unlevered free cash flow margins in the mid-30s, and high-teens non-GAAP EPS growth. We intend to utilize our strong cash flow for debt paydown as a priority, continue to invest in R&D, and, as we approach leverage below 2x, resume share buybacks. This is consistent with our prior guidance. Now, turning to our commitment to building a smart future together, a few highlights to share. We're enabling our customers to innovate for a greener world.
We are pleased to have been recognized in February 2024 as a sustainability leader by CDP, based on our strategy, our transparency, and our performance on Climate Change. We're dedicated to security across our enterprise and to helping our customers build trusted silicon. And through Synopsys for Good, our foundation, in calendar year 2023, we donated $5.7 million to charities focused on STEM, communities in which we work, and environmental sustainability. In conclusion, through our continued execution, we remain confident and committed to deliver outsized shareholder growth, as we have done for decades, through industry-leading, resilient growth as we solve our customers' most pressing challenges as they face rising complexity.
With expanding margins, as we drive continued scaling of our business while being disciplined stewards of capital, focusing on the highest returns and strong free cash flow management, delevering post-acquisition to less than 2x within 24 months. Thank you for your time and support. Let's take a look at a video from our valued partner, Samsung, and then we'll open it up for Q&A.
... Through our long-standing alliance with Synopsys, we have empowered our customers to design cutting-edge products using Samsung's processes and deliver technology breakthroughs. Synopsys' EDA reference flow, which has been certified by Samsung Foundry, has been adopted by numerous customers, and its AI-powered suite provides significant PPA advantages. Along with Synopsys' broad, silicon-proven IP portfolios in the Samsung Foundry processes, it enables HPC, AI, multi-die, and automotive SoC design for customers' predictable success. Together, Samsung and Synopsys are trailblazing with advanced solutions that expedite development and help our customers achieve complex design goals in the era of pervasive intelligence. It's not just about meeting challenges, it's about exceeding them to pave a new path forward. Synopsys is pivotal, providing reliability and significance to our ongoing journey towards continued success and innovation. We express our incomparable confidence in the strengths of this partnership.
Synopsys is our trusted partner for today and the future.
All right. Well, we're gonna have two mic runners in the room. I'd encourage just one question, so we get as many questions from different folks as we can, and then we'll get going. So...
A lot of hands.
Over here, Charles.
Thanks. Hi, Sassine; hi, Shelagh. I want to ask about AI. I'm glad to see the two points of growth rate uplift coming from AI. This morning, I went to that AI panel hosted by Trey, and one of my takeaways from the discussion with your AI experts was that it looks like all the AI products probably work best if the customer adopts the full stack of Synopsys solutions. But we know that's not the case today; customers pick and choose what they think is best of breed. I'm sure Synopsys has fantastic products. But in your projection of that 2% growth rate uplift, do you embed some sort of market share gains?
Because I'd think that if customers really like your AI products, that will probably force their hand to adopt more of the full-stack Synopsys solutions. So I'm just trying to understand some of the underlying assumptions there.
For sure. Yeah, that's a great question. Our AI solutions, take DSO.ai, it works with our Fusion Compiler design platform. That's what you mean by full stack. So if you look at DSO, VSO, TSO, ASO, the four optimization engines in the stack, each one of them works with the underlying Synopsys technology to do that optimization. With DSO.ai, because we have the most experience with it so far, it absolutely resulted in market share gain, because customers are using the best outcome for their PPA. If they're using Fusion Compiler and DSO.ai, and they're getting the best PPA, it's resulting in that shift.
If you remember one of Shankar's slides where he has the rings of share in digital implementation, that's a space where you may have mixed flows, meaning the customer, on the same chip, may be using a full Synopsys flow or mixing and matching, in some cases, the best-in-class in their mind, or some opportunity for them to use a dual flow. With AI, it naturally gets you to a more integrated flow, with whichever AI engine you're using tied to the underlying technology. So we are seeing it with DSO.ai. With the rest of the .ai engines that we have, it will be very similar, in order to get to the best PPA or verification outcome or test outcome in the flow.
Thanks. Vivek Arya from Bank of America Securities. Thanks for a very informative analyst event. I wanted to break the AI impact in two different ways. I think, Sassine, you mentioned the 2% lift, which I think is more from the monetization of your tools. I want to get at how much of a lift in the EDA market is just from more design starts in AI, right? More customers wanting to do design, and I was hoping you could address two aspects of it. One is, how much of that is in the core AI products in the data center? How much of that are you seeing at the edge? Have you already seen it? Is that still to be seen? And how much of a lift is that?
Because if EDA was growing, you know, 12%, it's still growing 12%, where do we see that impact of AI in more number of design starts?
Very good question. Actually, we debated how to put some data behind that question. If you remember, about three quarters ago, four quarters ago, we did communicate-
Mm-hmm
... that roughly 10% of our overall revenue is attributable to AI chip starts. As you look at where we are today and how our customers are communicating the AI opportunities they see, for example the AI PC, what does that mean from a chip start point of view? If the same chip that applied to the PC now applies to the AI PC, is that a new design start? Will there be two chips? Will it move from that one chip to the next chip? What will happen as our customers are trying to find their way in defining AI on the edge and AI in the data center, inferencing, training, et cetera?
If you fast-forward 5 to 7 years from now, we believe that pretty much every advanced chip is going to have some sort of an AI accelerator inside it, be it on the edge or in the data center, training or inferencing. To us, what's an interesting insight, as you look at those trends, is the type of chip they're building in order to support AI. That typically requires the most advanced node, because you cannot have that computational power on an older technology, so you need to move down the Angstrom path. Most likely, it will require some sort of disaggregation, meaning you need to have multiple dies in a package. Those are the incremental opportunities that we see. Then back to where you were going with the question: how will it change that 12%?
Will that 12% be higher due to what I just described, with core EDA and IP growing beyond the 12%, on top of the AI monetization? At this stage, it's very difficult today, right now, in this meeting, to say the 12% is going to change due to those trends that we're seeing. It's too early. But what we're absolutely seeing is an acceleration of both the IP and advanced EDA.
Thanks. Thanks again for hosting this. Very informative; appreciated. Gary Mobley with Wells Fargo Securities. I wanted to ask about a more boring topic, and that is the IP business. It's a multi-part question. What I heard during the presentation was a story about product breadth: 8,000 products. It's effectively a $2 billion business, so it's got scale and breadth. And so my question to you is: can you turn that into a subscription-based business versus a consumption-based business? What would be the advantages of that? What would be the impediments? And then, as well, with the further rollouts of processor IP, how do you walk that fine line between working collaboratively with Arm and tapping into what is the largest portion of the IP market?
Yeah, just to ask a clarifying question, when you're talking about subscription... Hey, Phil, if you don't mind, when you're talking about moving to a subscription-based business, can you clarify what you mean?
Like a time-based licensing.
Okay. Yeah. The reason I'm asking that clarifying question: today, most of our contracts are a committed agreement with the customer. We call them FSAs, flexible spending accounts, where the customer commits $X million for a duration of time, and they pull the IP when they need it, so they burn down that flexible spending account. So we provide them that flexibility, and it gives us the ability to forecast our business, because those agreements are committed. The reason we don't move to a subscription base, in terms of the time base where you sell them a USB for three years and they can pretty much use it on any chip start, is that it would impact our monetization ability. The customers would like to do what you're proposing.
For us, we'd like to be very disciplined so that we have a very clean, clear business model with IP. Anytime there's a new standard, moving from LPDDR5 to 6, moving to a different process technology, moving to a different protocol in terms of the automotive versus HPC end market, it's a monetization opportunity for Synopsys. The customer pays us even if the chip does not go into production. So if they pull the IP to, I don't want to say experiment with, because there's a lot of resources on their side, but even if they design in the IP and that chip does not go into mass production, we get paid. That's where we're very disciplined on how we do that monetization.
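The FSA mechanics described here amount to a prepaid commitment that is drawn down each time the customer pulls an IP title. A minimal sketch of that concept — the class, titles, and dollar amounts below are invented for illustration and are not Synopsys's actual contract terms:

```python
# Hypothetical sketch of a flexible spending agreement (FSA) drawdown,
# as described in the answer: the customer commits a fixed amount up
# front and "burns" it as they pull individual IP titles. Names and
# figures are illustrative only.

class FlexibleSpendingAgreement:
    def __init__(self, committed_millions):
        self.committed = committed_millions
        self.remaining = committed_millions

    def pull_ip(self, title, price_millions):
        """Draw down the committed pool when the customer pulls a title.

        Payment is triggered by the pull itself, whether or not the
        resulting chip ever reaches mass production.
        """
        if price_millions > self.remaining:
            raise ValueError("commitment exhausted; a new agreement is needed")
        self.remaining -= price_millions
        return self.remaining

# Hypothetical usage: a $10M commitment, drawn down by two IP pulls
fsa = FlexibleSpendingAgreement(committed_millions=10)
fsa.pull_ip("PCIe controller", 4)
fsa.pull_ip("USB PHY", 3)
print(fsa.remaining)  # 3
```

Note that the vendor's revenue visibility comes from `committed`, not from how quickly `remaining` is burned, which is why these agreements forecast well.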
So we provide flexibility from a business model point of view, where we have the flexible spending account and other mechanisms for the customer. Regarding the processor business, we made a strategic decision late last year, driven by two factors. One, we have a processor business today, called the ARC processor. If you think of the processor business at a high level, you can break it into three tiers: low-end, mid-end, and high-end. We've been in the low- to mid-end business with the ARC processor. With the RISC-V momentum happening, and customers asking us to simplify the instruction set offering rather than having both an ARC and a RISC-V, we made the strategic decision to transition our portfolio from an ARC architecture to a RISC-V architecture.
This should give us an opportunity to expand and accelerate our growth in those two markets, the mid- and low-tier markets, and participate in the growth. What we are planning and investing around is growing above the market TAM in those two slices of the overall processor business. Now, to the last point of your question, which is important to address: the relationship with Arm. Arm is both a customer and an ecosystem partner. Today, when you hear Arm talking about their compute subsystem, talking about the advanced processors that they're offering their customers, we have a great R&D and go-to-market approach with Arm, driven by our joint customers. Because when our joint customers buy an Arm IP, right next to it is sitting a Synopsys IP to interface it with the rest of the chip.
So they understand where we are, where we're not planning on being, and how we continue expanding the opportunity.
Jay Vleeschhouwer. A clarification first, if I may, on product development and your synergies comments, and then the real question. So Shankar said earlier that you expect to incorporate, or integrate, Ansys within, quote, "one release cycle." Does that correspond to your four-year timeframe for synergies, or does that mean something else in terms of timeframe? And then relatedly, Ansys does three major releases a year: R1, 2, and 3. Is that something you expect to retain, or perhaps even adopt for yourselves? And then the real question.
So yes, it is part of the synergy. And as Shankar explained, the reason we feel confident in plugging in those engines and integrating them into our Fusion design platform is the architecture, the data model, all the investments that we've made since actually the 2014 timeframe, when we created our data model, then announced Fusion Compiler, et cetera. That gives us the confidence to integrate those engines. Now, that's from the silicon side of the market. Ansys has a certain rhythm of delivering their product for many, many customers outside the silicon part. As we go through the integration process, understanding those customers, their requirements for support, et cetera, we'll make the right decision. Will it stay, will it not?
At this point, if you're asking me, they're doing it for a good reason, and I don't see a reason for us to change. But ask me that question again as we get closer to day one.
Okay. So the real question is about competitive criteria for customer selection in the future. We'll stipulate that the full integrated stack makes perfect sense, and you have it, and you have the leading share. But the reason I'm asking it that way is that for many, many years, when EDA companies put out a press release for a new product, the invariable claim is that it's ten times better than the existing product, either your own or a competitor's. That, I think, is a requirement in the EDA operating manual, that all press releases should say 10x.
So, in future, do you think that you will continue to be able to compete on that multiple of improvement, or, for example, based on what Jensen was saying this morning, are you going to have to go to 100-times improvements per product release as sort of the next criterion of competitiveness?
So thank you for the question. I don't know how many products we said 10x on, but you and I can talk about it. To my mind, there is one. We typically talk percentages in terms of improvement from release to release or product family to product family. The reason for that is that the complexity in order to achieve that incremental turnaround time is not straightforward. What's happened in the last number of years is distributed processing and the availability of compute: how do you re-architect and engineer your product so that it can take advantage of that compute and of distributed processing? So that's element one: with the availability of compute, how do we re-architect the product? Two, the type of compute.
If you go back 10 years, the only type of compute we had for our EDA products was an x86 architecture, CPU-based. Today, the numbers we've shown on the slide, in terms of 10X and 15X improvement, that's coming through a different type of compute that's available: accelerated compute. That's absolutely the way of the future. Now, how fast the adoption is going to be is not a straightforward thing to answer, because today, our customers may not have the availability of that compute to take advantage of it. These are massive workloads, and they're expensive in terms of the type of compute infrastructure needed, from memory, CPU, GPU, et cetera, to run those workloads.
That's why we're offering cloud, hybrid cloud, so they don't have to invest everything on-prem, et cetera. So yes, we are seeing continued improvement in the algorithms, through distributed processing, different types of compute, accelerated compute, and anything where we can put AI to optimize and simplify the learning. Those are the vectors we're operating on in order to achieve a better turnaround time given the complexity. Phil, right there.
We're gonna have one more question. This will be the last one.
Sorry, the last question can be on the guide, if that's okay. So it looks like Q2 guidance is being updated, and the full-year guidance is being updated. With the upcoming SIG divestiture, is there, like, a time when that's being contemplated, since you're giving an updated Q2 guide? And then, is the messaging for the full year that the EDA business is upticking, given the implied growth in SIG this year is declining? Thanks.
Yeah, so I'll answer the second question first. So, absolutely. I mean, what you've heard today is manifesting in the updated guide: the strength we're seeing in both Design Automation and Design IP. So we're flowing that through the guide, and as you said, we've also updated Q2. And really, the decision on SIG culminated the comprehensive review process. As Sassine talked about, we're active in discussions with potential buyers, so we look forward to updating you on the progress at Q2 earnings. But we're moving forward with that, and we'll continue to provide updates, but there's not a specific date.
So with that, given that was the last question, again, thank you very much for spending the last few hours with us. I hope you found it insightful, and we look forward to more interactions. Thank you so much.
Thank you.