Akamai Technologies, Inc. (AKAM)

53rd Annual Nasdaq Investor Conference

Dec 9, 2025

Moderator

All right, almost lunchtime. We're super excited to have the CEO of Akamai, Tom Leighton. Tom, thank you for joining us at Nasdaq London once again.

Tom Leighton
CEO, Akamai

Thank you.

Moderator

I appreciate it, Tom. We're going to focus a lot of the conversation on what I think the key investor excitement is around what you're doing in compute, specifically GPU inference. Before we get there, though, maybe just a level-setting question: as you look at 2025 year-to-date, how would you characterize the IT spend environment across the different stages of business? And what about 2025 has surprised you, either to the upside or to the downside?

Tom Leighton
CEO, Akamai

You mean spend that we're spending in IT or that our customers are spending?

Moderator

It's customers.

Tom Leighton
CEO, Akamai

Customers, yeah. So obviously, security is still very much top of mind. The attack rates have gone up, especially now with GenAI, which gives power to the attackers. And there's a lot of adoption of GenAI capabilities in enterprises, so there's a whole new attack landscape you've got to defend. On the compute side, there's a lot of interest in the new applications that are becoming possible, agents to do all sorts of things. So I think we'll see rapid adoption of AI, and that's a tailwind for Akamai.

Moderator

Awesome. So let's talk about the opportunity with the Akamai Inference Cloud. The Akamai Inference Cloud has launched, and it's getting a lot of investor excitement. Can you talk about how the partnership with NVIDIA came together, and why do you feel Akamai can be a player in the GPU inference market, given all the competition from the hyperscalers and the neoclouds? Maybe just give us a history of how you thought about entering this part of the compute space.

Tom Leighton
CEO, Akamai

Yeah, there are a lot of questions in there. We've always had a good relationship with NVIDIA. I remember way back in the day when there was interest in deploying GPUs in our edge platform to help create gaming as a service. That never really took off because of the financials: it's better to have the user pay for the power in the home to run the console, buy the compute device, and provide the colo in the home than to do it in the cloud. But now, with the rapid rise of GenAI, agents, and inferencing, that makes a lot of sense to do on the edge. And I think there's a lot of excitement there, certainly in our customer base, about what the web is going to become and what's going to be possible. Just take an example, say, in commerce.

We've got customers that say, boy, wouldn't it be nice if, when you bought an article of clothing, it fit and you didn't have to return it? So you can scan your body. The e-commerce company will know your shape, and they'll know the various articles of clothing: do they run large, small, whatever? So when you do buy something, there's a very good chance it'll fit. In addition, you'd like to know how you look in that article of clothing. Today, we have some customers that will show you a picture of that. We've got customers that want to show you a video of you wearing the dress, in different environments. Pretty soon, I think you'll see your personal concierge as an avatar. It'll look like a human being. It'll talk to you.

It'll make suggestions of what you might look good in and show you. The whole experience in commerce could well change. Look at buying tickets for things: a concierge service for that, suggesting what you might like to go see, getting everything scheduled. There'll be agents all over the place, each agent doing a specific task with a medium-sized model, and they'll all be talking to each other and coordinating against data personalized to you. So the whole experience, how the web works, the protocols, are probably going to be different. And we're excited, because that's a perfect area for us. You're going to want those applications to be really fast, reliable, scalable, and secure. That's the way we have built our platform, and a key aspect is that it be distributed, so the compute is done close to the user.

If your particular agent is generating a personalized video, there's no way that scales in a central data center. Just like streaming way back in the day: there was no way you were going to deliver all those streams from a few locations in big data centers. It can't work. You're going to want that done distributed. And if you're buying something, you're going to want low latency. Before now, when you ran the models, they were slow and they were big. As you get the lighter-weight models on things like the NVIDIA RTX 6000, they're fast, and so you do want to be close, so the bottleneck isn't the back and forth between the user and the model. So in some sense the web changes, but the fundamentals you need are very much the fundamentals Akamai has used to help the web scale from the beginning, with delivery.

And then, of course, it's got to be secure.

Moderator

In some sense, you're talking about a continuation of hyper-personalization, but doing it in real time. That's a real-time inference capability. And you're doing things like video, real-time inference on video. That's a whole new world.

Tom Leighton
CEO, Akamai

Yeah, I think you'll even see it in things that are today one-to-many video. The version of the game I see is going to be different than the game you see, because maybe I like this team and you like that team. I was talking to a customer just a couple of days ago in New York with exercise content: you're in the class doing your thing, and you want it personalized in terms of your language, so they can enter other markets. Same instructor, but talking in your language and accent. Everything, I think, is going to change in a way that we're in a very good position to help enable.

Moderator

Yeah, that's super interesting. So let's talk a little bit about the hypothesis around the underlying unit economics of the Akamai Inference Cloud. On the earnings call, you described the current economics as $1 of CapEx basically equals $1 of revenue. Can you unpack that for us? And what gives you the confidence that these inference workloads will have sufficient lifetime value to become a profitable business, not just a high-growth one?

Tom Leighton
CEO, Akamai

Yeah, and the same math looks like it's holding for the newer GPUs. They cost more and they take more power, but they generate more revenue, so the ROI is in the same ballpark. A good question when you're buying the GPUs is, are they going to last? What you've seen so far is GPUs being used in these gigantic farms for gigantic training runs, and there, every year you want to buy the next version because you're trying to handle more variables. You're trying to get to, well, can you do generalized intelligence or something? That's not the game we're in. We're in the business of supporting applications for customers. The nice thing about that is that, hey, the NVIDIA RTX 6000 is good at generating a personalized video on the fly, and it's still going to do that in three or four years.

It's not an issue that we're going to need a bigger one to generate that video for you. So for the specific tasks we're doing, we believe that equipment will be good for us over the long haul, because we're not in the big core training game, taking the model to the next level so it plays chess better or solves super hard math problems.

Moderator

That's a pretty clear distinction; I'm glad you made it. You talked about the technical capabilities of the chips and the hardware. In terms of the workloads, because they're essentially applications, do you also feel that those applications and workloads are going to be around in four or five years? Or do these workloads shift from one cloud to the next as customers shop around for the best unit cost? Just give us your perspective on that.

Tom Leighton
CEO, Akamai

I think the applications, once they're here, will be around. And I think cost does matter to our customer base; it's an area where we have good competitive advantages. For many of our customers that had been using a hyperscaler and have moved to us, we're helping them save money at the same time. And of course, with our distributed architecture, we're very efficient at moving data around. We do more of that than anybody, so I think we're in a good position to offer competitive pricing.

Moderator

Great. In terms of the CapEx build-out to pursue the inference opportunity, how do you ensure that any incremental CapEx investment will ultimately see the traffic? Today, it sounds pretty obvious: you build it and they will come. That's the stage the market is in. As you think about building a durable business, how do you think about the CapEx build-out and the revenue coming online?

Tom Leighton
CEO, Akamai

Yeah, so we build out CapEx for what we think we'll be consuming in the near future. Initially, after Linode, we did do big tranches, and that capacity is in use. So we're now in the mode where we build out based on what we think we're going to sell and use three to six months down the road. A related issue is the data center deals that we do. For the bigger data center footprints, those will be long-term contracts, and we'll often build in room for growth, and we pay as we go. We do get hit with the accounting charge for straight-lining that, so the accounting says we're spending more than we're actually paying in the early years. So we do get hit with some of that.

Moderator

In terms of the Akamai Inference Cloud footprint, I think you're currently in 17 locations. What does that look like over the next year in terms of expanding those 17 locations?

Tom Leighton
CEO, Akamai

Yeah, so there'll be more cities. We're currently building out in India and Southeast Asia. They'll also get deeper, so there'll be bigger footprints; in some of these cities, we're looking at 10-megawatt kinds of data centers, which is larger than we've generally been. We're in over 4,000 POPs, and obviously the vast majority of those are not GPU-equipped POPs; you don't even think about the power there. In fact, we don't even pay for the power in the vast majority of those locations. But for the GPU footprints, there'll be some cities that'll be bigger, with more concentrated GPU capacity.

Moderator

And then in terms of pricing, how does the pricing model work for inference? Can you outline what that looks like? Is it pretty straightforward, just GPU-per-hour type pricing? Or are customers going to make some sort of commitment and burn it down? How does that work?

Tom Leighton
CEO, Akamai

Yeah, so both. So far, it's been on a per-consumption basis: VM hours, or it could be tokens in the case of GPUs. Also, for GPUs, there are some cases where a customer may want a tranche, and then we would sell, OK, this many GPUs committed over some number of years.

Moderator

Like a cluster, OK. The Akamai Inference Cloud is part of a broader business unit called cloud infrastructure services, and that's been the engine powering, I guess, the best growth at the company. Cloud infrastructure services, I think, accelerated in Q3, and the team has been signaling that it could accelerate meaningfully going forward. What drove that acceleration in Q3? To what extent is this being driven by a handful of customers? Are you satisfied with the breadth of the growth coming out of CIS?

Tom Leighton
CEO, Akamai

Yeah, great question. The biggest customers are generally the biggest spenders, so we've had a bunch of growth driven that way. But I think we've been very favorably impressed, and surprised, at the breadth at the same time. Part of that is that our field is now selling it broadly; last year was really the first year of broad sales. We have a specialist team that we're growing. There's more market awareness that, hey, Akamai is a cloud company: what you're doing with the hyperscalers, you can do with Akamai. And of course, now we have Inference Cloud, which gives us another capability in terms of market awareness and something more to sell. So it's both. We've got some very large customers that are growing. Three hyperscalers are now all cloud customers at Akamai, plus a lot of good enterprises; you don't have to be a giant company, just a solid enterprise.

Moderator

If we could go back to the origins of Akamai's entry into the cloud business, it was through the acquisition of Linode, which got you into the classic centralized public cloud offering. Were there any synergies between Linode's data center footprint and the core Akamai network as you expanded it? Or was it a completely different compute stack, and what sort of integration effort was associated with that?

Tom Leighton
CEO, Akamai

Initially, it was a different compute stack: not distributed, not enterprise grade, aimed at small and medium business, but really good stuff, very developer-friendly and very popular with developers. What we've done is get it to be enterprise grade: a huge amount of work on reliability, various certifications, more functionality that major enterprises need. We're also making it more distributed, so we're in over three dozen cities now with full-stack compute and storage, with a new fabric in those cities to make it a lot more reliable. And we've taken those capabilities and put them into our distributed platform. You see that with our managed container service: we take the software that we were running in those core data centers for containers and deploy it into our edge platform.

And in fact, one of the hyperscalers is using that today, because we can get their containers, their business logic, closer to end users in a lot more cities than they can with their hyperscaler approach, which we compete with. So yes, Linode was different, but we're using those capabilities to fully integrate it with our platform. Another thing we'll be doing is making the container deployment serverless. Today, we're serverless with JavaScript, but that's function as a service, which is more session-based. Getting containers to spin up automatically means you don't have to worry about pre-provisioning, like you do today with the hyperscalers' clouds.

Moderator

When you think about that specific element, a managed Kubernetes service running serverless, being unlocked more broadly, plus the GPU Inference Cloud: how much of a demand driver can getting into cloud-native workloads be for CIS going forward?

Tom Leighton
CEO, Akamai

I think a lot. That's where I think a lot of the future demand and excitement is going to be. So yes, I think it's very important to be able to do that.

Moderator

Is that driving a lot of the customer decisions and the sort of contract commitments that you're winning today?

Tom Leighton
CEO, Akamai

I think it's driving customer interest. Because a lot of that is still work to be done. We just started Inference Cloud. We're not fully serverless now with containers. But that roadmap is exciting for customers. And I think it's just, hey, customers are getting aware that we have these capabilities. It performs very well. It's more distributed than the hyperscalers, so it gives better performance. And for a lot of applications, we're a lot less expensive than the hyperscalers. If you're moving a lot of data around, which big media is, if you have really chatty applications, a lot of hits, which commerce does, we're less expensive.

Moderator

This actually brought up a question in my mind. I just hosted the MongoDB management team, and they made the point that in 2025, all three major hyperscalers had some sort of outage. When I think about Akamai's offering and its distributed nature, the 4,000 points of presence and the 36 locations you've got with Linode, does the Akamai Cloud start to become part of the resiliency strategy for particular workloads, given the outages we've seen across the hyperscalers this year?

Tom Leighton
CEO, Akamai

I think that makes a lot of sense, and I think you'll probably see the outages increase. It's just the nature of the beast: as companies get older, your people are single points of failure, and if they move to another company, you've got code bases that nobody at the company really understands. That happened in one of the recent cloud outages. It's an area we worry a lot about at Akamai. Also change management; I'd say the hyperscalers have done a pretty good job there, though they do go down from time to time. It's an area where we make a huge investment at Akamai. The last time we had an event like that would have been four and a half years ago, and our goal is to really be five nines. And we've achieved that.

One of the biggest banks in the world is based here, and the regulators measure it. Five nines is the standard, and they achieved it using Akamai. That's 10 minutes of downtime in two years across everything, including attacks, or a change that fouled something up. So a huge effort. And I give the hyperscalers a pretty good score, even though they've had outages. Some of our other competitors are very poor; they used to go down every quarter, and now you see it happen multiple times in a quarter with big outages. That's not something you change overnight. It has to be a design focus, with a lot of effort put in to make it happen. It's not an accident.
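As a quick check on the arithmetic here, assuming the two-year measurement window Leighton describes, five nines of availability works out to roughly the downtime budget he cites:

```latex
(1 - 0.99999) \times 2\,\text{yr} \times 525{,}960\,\tfrac{\text{min}}{\text{yr}} \approx 10.5\ \text{minutes}
```

So 10 minutes of downtime over two years sits right at the five-nines threshold.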

Moderator

The conversation that you and I have been having over the last 15 minutes has been focused on the compute business, and specifically the cloud infrastructure services piece. That's where the Inference Cloud is; that's where the public cloud business with Linode is. But there's a whole portion of compute revenue that's non-CIS. Can you talk a little bit about what's in that business? And because I think it's kind of a drag on growth for the overall compute business, which is growing mid-teens, is there a world where that business gets sold or moves into discontinued operations, so that investors can see the better growth you're seeing specifically from CIS?

Tom Leighton
CEO, Akamai

Yeah, that's a great question. That business is the stuff that we were doing that was compute-related before Linode, basically. And so there were special compute-related functions we would do for customers. For example, queuing. Customers wanted that, so we did it. It wasn't a real focus area for us. And over time, other companies were created. That's all they did. And so in particular with queuing, we ended up making a relationship with one of those companies where we've migrated our queuing solution to theirs, which is better. Saves us a lot of money because we don't have to worry about keeping a legacy product going. And also, they become a customer of our cloud. And so as their business grows, it grows on Akamai and helps us. We did that with our Media Services Live, something else that we didn't want to keep doing.

We've done it with our Video Manager service. A bunch of those things were in what we call other cloud applications. They're the legacy cloud; there's NetStorage in there and some other things. Some of them are growing, but generally they're not focus areas for us, so it's not something where you would just sell the business. We've got to keep our customers happy. But if the right deal comes along where we can save money and help grow our cloud business by getting more partners on it, yeah, we would do that. Going forward, we've said that other cloud apps business will be roughly flattish. All the investment is in CIS, and that's where all the growth and potential is.

We'll probably still report both, although when we talk about compute, we may just focus on CIS next year, since that's what's important. We'll disclose, I think, [OCACC], what it was. And of course, as CIS gets bigger, if you were to combine them, with this piece pretty static and this piece getting very large, it won't matter.

Moderator

Over time, it'll mix down. So let's move the conversation on to security, which generates the majority of Akamai's revenue. You've been in cybersecurity for over a decade now and have built a fantastic business, a $2+ billion run-rate business. Growth has dipped below 10% on a constant-currency basis. When you take a step back, how much of an investment priority is the security business for Akamai, given all the opportunities around public cloud, the Akamai Inference Cloud, and CIS? In terms of where the incremental dollars are going, how would you stack rank security versus the rest of the business?

Tom Leighton
CEO, Akamai

Security is very important. It's the majority of our revenue and an important growth driver for us. As we look forward, our goal is to try to do around 10%, including M&A. We're always looking for the right addition in terms of M&A, and we've been very happy with Guardicore and Noname. They're both doing great. We saw 30%-35% ARR growth there this year, which is great to see, and I think there's a lot of runway. Separate from M&A, we've launched our Firewall for AI. I think protecting AI is really important, and that'll become an important market where we want to be a player. So yes, it's important, a big area of investment for us. I would say security and CIS are the big investment areas.

Moderator

100%. When you think about the composition of the security business, you have your traditional web security business, businesses that you've been in for a very long time, and then you have the growth engines: Guardicore, your API Security business, which is Noname, and the AI firewall. Is there a point where the core web security business can stabilize? And is there an inflection point where the dollar growth from Guardicore plus API Security scales enough that we see better growth out of the security business? Or does M&A really need to be part of the equation to sustain double-digit growth?

Tom Leighton
CEO, Akamai

Yeah, so the majority of the revenue, as you noted, is in the traditional businesses: web app firewall, bot management, the stack on top of that, stopping DDoS attacks. That is more mature; we're the market leader there by far, and it's growing, but it's not going to grow as fast as it did. Then you've got the newer businesses, where we have a market leadership position and are growing very rapidly. I think M&A is an important part of the future, adding new capabilities. The security landscape is changing very rapidly, and AI should accelerate that. So yes, I think acquisition is important going forward. The key is to get the right value; we're not going to pay crazy amounts of money for something. We're very careful about what we buy.

Moderator

The API Security business, as you mentioned, crossed the $100 million run rate. What does the TAM look like in API Security? And what do you see as the potential penetration of API Security within the Akamai customer base?

Tom Leighton
CEO, Akamai

Yeah, API Security is great because there's still a lot of room in our customer base, where it's growing very rapidly, but also outside the customer base. You don't have to have been a traditional CDN or WAF buyer to need API Security. It also has growth potential in terms of AI: already, the latest version of the product will find your shadow AI, and pretty much every enterprise has shadow AI today. You're going to need to identify and protect it. So it couples very nicely with our Firewall for AI.

Moderator

Let's talk a little bit about Guardicore, which has been one of the major drivers of security for the last few years. What's the runway for growth there? And to what extent is Guardicore emerging as a land product for new customers, versus primarily an installed-base sale?

Tom Leighton
CEO, Akamai

It's both. Good runway. It's something we cross-sell into our base, but Guardicore is, in some sense, even less related to CDN or WAF. It's a behind-the-firewall solution. Any enterprise, really, especially if they're worried about data exfiltration or ransomware, would need Guardicore. And so we have had a lot of customers come to Akamai new through Guardicore sales.

Moderator

We're running up on about two minutes left. As we think about exiting 2025, and we've still got Q4 to go, so I don't want to get ahead of that, what are your priorities for the team in 2026? What are your hopes and aspirations, whether it's investment priorities or growth objectives? What is 2026 going to be about for Akamai?

Tom Leighton
CEO, Akamai

Yeah, it's about stronger growth: rapid adoption of cloud infrastructure services, particularly with Inference Cloud, which is a new capability, and continuing the strong growth in our security capabilities. Also, as you talked about, adding a lot of new customers. Because when you think about the rapidly growing products, API Security, Guardicore segmentation, cloud infrastructure services, those can be sold to a lot of customers that aren't Akamai customers today, that don't worry about CDN and maybe don't worry about web app firewall in the sense that our traditional base does. So we have already started a program to greatly increase and incent hunting for new customers at Akamai. Cross-sell, great, we're doing that, but we've got a lot of opportunity to increase our overall base, and that'll be a big focus area next year.

Moderator

Can you speak to the sales motion, and maybe even the partner strategy, for landing multiple products with customers outside the traditional Akamai customer base?

Tom Leighton
CEO, Akamai

Yeah, partners are even more important there; those products I talked about are often partner-led. And they're partner-friendly products, which is good. So I think we're in a good position now, and we're making the investments, with the associated compensation structure, to realize that.

Moderator

In terms of the partners, should we think about these as GSIs? Are these the hyperscalers? Which partners are these?

Tom Leighton
CEO, Akamai

They're not the hyperscalers. But yes, GSIs, the big firms you would think of. Sometimes carriers are good partners. The hyperscalers are customers, and we compete with them; sometimes our products are listed on their marketplaces, but we generally are not going to market with them.

Moderator

We've gone basically 30 minutes without mentioning the delivery business, so maybe just one question on the state of the delivery business after a couple of years of consolidation. What are your underlying assumptions for that business in terms of pricing and competitive dynamics? And what are your hopes for that segment, which has come down as a percentage of revenue? I think it's under 30% of the business today. What's the state of the delivery business as we head into 2026?

Tom Leighton
CEO, Akamai

Yeah, it's a lot better than it was the last couple of years, and you're right, there's been a lot of consolidation. Probably at least five competitors are gone because they were selling below their cost. That is still happening, but less than before because there are fewer of them. It's still competitive. Traffic growth overall is better, and the pricing declines, which we control to some extent, are less than they were before. We do turn away some business, which hurts on the traffic side but is better on the pricing side. So we're looking towards close to stabilization. Revenue is declining low to mid single digits, and it'll probably be that way for a little while. We'd like to get it to even, and over time we'd actually like to grow it.

We've cut back a lot on the investment there; CapEx is half what it was before. And we're careful about the kinds of big users we take on, to make sure it's really profitable and driving cash flow for us.

Moderator

Tom, thank you for spending 30 minutes with us at Nasdaq London. I enjoyed the conversation. All the best in 2026.

Tom Leighton
CEO, Akamai

Thank you.
