Hello everyone and welcome to F5 Partner Connect. Thank you for joining us today for our quarterly update. My name is Paul Szczepan, and I am the VP of Global Solutions Engineering for the partner community here at F5. I'll be your host for today's session, and before we get started, I'd like to address a few housekeeping items to help maximize your viewing experience. If at any time during the webinar you experience issues with your audio or slides, try refreshing the browser. Should this not resolve the issue, switching to a different browser may help you. You can access links to AI content referenced later in the resource box. As always, we would very much appreciate it if you completed the survey at the end of the webinar to give us your input.
Please submit your questions in the Q&A box, and we will conduct a live Q&A session at the end to answer them. After the webinar, we will also post the recording to Partner Central for your viewing. But first, F5's VP of Partner Ecosystem, Lisa Citron, and F5's Chief Technology Officer, Kunal Anand, were recently together in Tokyo, where they recorded a fireside chat to share their perspectives on the impacts of AI and the top things partners can do now and in the future with AI and F5. We'll then be back live to share how you can help your customers build their AI infrastructure. And now, over to Lisa and Kunal.
Hello partners, thank you for joining this quarter's Partner Connect. We're here at App World Tokyo, and I'm thrilled to be joined here today by Kunal Anand, F5 CTO. Kunal, thank you for joining us.
Lisa, thank you for having me. I'm super delighted to be here and getting to hang out, and we can talk all about technology, but super thrilled to be here with you.
Thank you. You are new to F5, and I'd love for you to give our partners a glimpse into your background and what shaped your path here to F5 as CTO.
So certainly. I joined F5 in April of this year. Prior to that, I was Chief Technology Officer and Chief Information Security Officer at Imperva, and I've spent decades in technology and just love the space, love what's going on in cybersecurity. I think it's probably one of the most interesting times right now, and what attracted me to F5 was obviously the people. And of course, we have incredible technology, incredible solutions, and amazing partners who help us grow with amazing customers.
Absolutely. So in your first few months here, tell me what's influencing your perspective on where we should be heading as a company?
So I always love taking an outside-in view. I think an outside-in view is the most important thing because you actually get to look at the industry, you get to look at the landscape and see how it's evolving. And there's just been quite a bit of change going on. Obviously, AI, the big driving force right now, it's probably one of the most transformative things that's happening in the world today. It's absolutely not theoretical. We'll probably talk more about numbers later on, but what's insane is 18% of IT budgets globally have been allocated to AI. So it's a very, very large change, and Lisa, I'm sure you've seen quite a bit as well.
You know, it is a top topic that I see within the partner community. We're hearing over and over that these are board-level initiatives, that CEOs are looking to understand what they can do to bring AI in. And you know, we see, as you said, that there's budget there and people are moving forward.
You know, the thing that you brought up just now from a board perspective, so spot on. A few weeks ago, I was in New York, and I got to meet with many financial services and banking organizations, and they were talking about the allocation of their budgets and also what's top of mind for them right now in technology. And it's astonishing how much is changing, not just because of AI, but because it happens to coincide with the transformation efforts that people pulled in during the pandemic. And to your point, whether it's AI or security or broad transformation initiatives, these conversations are still at the board level.
And I know this is one of those technology trends that is moving faster than any other technology trend that we've seen. So what are customers putting into production?
So spot on, I think this is where the rubber meets the road, right?
Yeah.
Like when it comes to AI, what do we actually see? And I think it's just important to start with what AI actually is. You know, there's two parts to AI. There's how we build and train models, and then how do we use those models, and that's called inference. And these are two fundamentally different approaches, and they have two different requirements in terms of technology and product architectures. So there's training, and that's how you build a model. And today, people are leveraging their own proprietary data, or they're using third-party models. So they could be using OpenAI, or they could be using Meta's Llama model. And then when it comes to inference, that's where they actually use the model in their environments to do some sort of prediction work or some sort of task completion.
And what we fundamentally are seeing right now is this thing called retrieval-augmented generation, or RAG. We're actually seeing a lot of organizations practice and implement RAG inside their world. Those large enterprises we were talking about, those banks, those financial organizations, are relying on their proprietary data in conjunction with all of these really capable foundational models, niche models, and enterprise models. And it's just crazy to see how fast this is all developing. Now, the other way to think about this is you've got training, you've got inference, but what's super important is APIs.
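The RAG pattern described here, retrieving relevant proprietary documents and feeding them to a foundation model alongside the query, can be sketched in a few lines. This is a minimal illustration only: the toy keyword-overlap retrieval and the sample documents are invented for the example (real systems use vector embeddings and a model API), and nothing here reflects an F5 product.

```python
# Minimal retrieval-augmented generation (RAG) sketch.
# Retrieval here is naive keyword overlap; production systems use embeddings.

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Return the k documents sharing the most words with the query."""
    q_words = set(query.lower().split())
    scored = sorted(documents,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, documents: list[str]) -> str:
    """Prepend retrieved proprietary context to the user query before inference."""
    context = "\n".join(retrieve(query, documents))
    return f"Context:\n{context}\n\nQuestion: {query}"

docs = [
    "Wire transfers over $10,000 require two approvals.",
    "The cafeteria opens at 8am.",
    "International wire transfers settle in two business days.",
]
prompt = build_prompt("How long do international wire transfers take?", docs)
# `prompt` would then be sent to a foundation model (OpenAI, Llama, etc.).
```

The point of the pattern is that the model never needs to be retrained on the proprietary data; the relevant slice of it rides along with each inference request.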
Yeah.
And API, or APIs, I should say, really are the glue that holds this AI world together. And it's amazing to see this happen so quickly and to watch this evolve.
I know we had a Partner Connect where we leveraged [Chad Whalen], and we talked about the fact that you can't actually talk about AI without APIs.
Yes.
And so it's great to hear you validate that as well because API security and how APIs are becoming such a large conversation with customers is definitely a trend our partners are seeing. So what are some examples of what customers are doing now with AI?
Yeah, so I would say there's two general things that are happening. The first is organizations are trying to figure out how they can measure productivity. A lot of people have invested in Copilots. A lot of people have invested in this technology, whether it's to improve the customer experience or whether it's to create new experiences altogether. But within that, the measurement they're trying to figure out is productivity: what's the return on investment right now? And I think a lot of organizations are still struggling with governance.
Sure.
I mean, we hear about it all the time.
Yeah.
A lot of organizations are trying to figure out security. They're trying to figure out how to responsibly do this. And a topic that we've seen, and that our partners have talked about with us, is this concept of an AI Center of Excellence.
Yes. I know that we have partners around the world who have made considerable investments in building out their AI Centers of Excellence in coordination with the likes of NVIDIA and Intel, just to really give customers a place that they can experience what they're going to put into production. So I think it's a key role that our partners are playing in helping to shape what our customers, what our joint customers are doing.
You know, the other thing that's also important to mention is I think everyone tried to get to AI or adopt AI capabilities as fast as possible. And it's okay that it's a little bit messy right now.
Yeah.
I remember when we all started adopting internet technologies as organizations, and it was messy, and then as we started to see the evolution of cloud, that was messy too.
Sure.
And I think we're seeing this pretty similar trend here where everyone kind of rushed to adopt this technology, but we're still in, I would say, the early innings of this.
I think what you said is what, you know, I'm definitely hearing from our partner community. There's a lot of our partners who are working with the likes of Microsoft on things like Copilot initiatives while helping customers explore what is the next. What is the next use case? What is the next place that they can bring that into their environment?
You know, I would say on that point, what we're seeing is these interesting approaches that the partner community, and the partners that are tuning in, are helping organizations with. So on one side, you've got those AI Centers of Excellence, but on the other side, for those organizations that are pretty demanding and want these big buildouts, we're seeing this concept of AI factories.
Right.
Right? And you know, I think it starts with this overall narrative, which is, you know, we believe that over the next decade, every application is going to be an AI app or it's going to be an AI-powered application.
Wow.
It's insane.
Wow.
It's an insane trend that we see, and you know, it's not theoretical, because based on what we've surveyed and the data we've collected and studied, 77% of the largest companies in the world will have an AI application in production within the next 12 months.
77%.
77%.
That's phenomenal.
And so add that up, right? 18% of budgets are allocated to AI. 77% are going to have an AI application in production over the next 12 months. And then over the next decade, particularly in 2027, there's going to be a flip, the flip being that more than 50% of applications are going to be either AI apps or AI-assisted. And 2027 is not too far away.
It's not too far away.
Today it's around, I would say, maybe 10% to 15%, somewhere in that range, that are AI turbocharged or AI-powered, if you will.
Wow. It's phenomenal how fast this is moving and serious dollars, serious dollars going into this, which is a tremendous opportunity for our partners, is a tremendous opportunity for F5 in where we're going, for sure.
The bit that I also think we should just be intellectually honest about is that it's not just all these GPUs that we're leveraging. We're going to need a lot more power, a lot more energy globally, if we want to make this work. So again, we get to spend all this time with our partners, who end up bringing us into great customer discussions. And you know, just to put numbers out there, today planet Earth runs on about 7 TW of power.
Okay.
If we want to make everything intelligent, we'll need 23 TW of power.
How do we get there?
Oh, we're going to need all sorts of new power supply and all sorts of things globally if we want to unlock that. But I bring this up because it's one of the top concerns that organizations have. And not to link it immediately back to ESG, but one of the things that's so important for people is that first we have to secure all the silicon, these GPUs. And then after we get that, we need to figure out how we can sustainably run all of these AI models in these production environments. So it's an increase not just in CapEx to acquire the semiconductors and silicon, but in OpEx, right, for how you choose to run it.
And we just believe that the next couple of years are going to be fundamentally about how we scale energy, how we scale silicon, how we scale those applications. I think that's what every big organization is thinking about now.
Absolutely. And many of our partners have consulting organizations that go along with their teams who sell and their teams who implement. So I think that power conversation, that energy conversation is probably a big piece of how do you plan?
Yeah, it's huge.
Yeah.
It's huge. I mean, like we go back to those AI factories that we were talking about, and again, for those that don't know or maybe you haven't seen that term yet, we all know about a GPU, right, a card that can do some form of acceleration for these AI workloads. Eventually, these GPUs turn into clusters, and clusters would be when you have a bunch of these GPUs together. However, over time, those clusters end up growing as well, and when you see a collection of clusters, those become AI factories.
Okay.
And what's happening now is organizations are going down that path of securing that silicon, spinning up and scaling out these AI factories. Top of mind for people is not just going to be productivity, but return on investment, how they can run all of this. And then more importantly, you know, there's going to be a new set of challenges, unique things that people are going to have to address.
Of course. New technology, new challenges.
Yeah.
So how does F5 come into this conversation? What are the challenges that we're solving?
I'll start by saying I think there's two ways to think about this. There's ADC for AI, which are some of the challenges that we're helping people solve, and then there's also going to be AI for ADC, which is going to be how we're going to incorporate AI technologies across our portfolio.
Sure.
So let's start actually with the latter, which is going to be how are we going to implement generative AI across our portfolio today? And there's four places that we're focusing on right now. The first is going to be around how we're going to implement iRule code generation.
That's a favorite. iRules are a favorite of customers and partners.
So fun fact for the partners, I actually used to be an F5 customer back in the day. I ran technology at the BBC Worldwide. That was my introduction to F5 BIG-IP, and I loved iRules, the fact that I could do all sorts of things. I know your customers love iRules as well. It's so powerful. You can do great things with it. The challenge, though, is when you have enough of these iRules and you've built enough of them, when you go back and revisit them a few years later, it's like, what did I write? What did I do?
Sure.
And then sometimes we've observed, especially as organizations are facing more nation-state cyber attacks, more nation-state issues, or just, I would say, cyber issues in general, what we're finding is organizations want to be able to write iRules as fast as possible. But we also know that not everyone in these organizations is an iRules expert.
Absolutely.
So what we think we can do is leverage generative AI to help organizations go and write iRules just using natural language. It's pretty powerful.
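For partners newer to iRules, the kind of output a natural-language request might target is a short TCL event handler like the one below. The hostname and redirect target are purely illustrative, and this is a hand-written sketch of what such a generated rule could look like, not output from F5's AI tooling.

```tcl
# Illustrative iRule for the request: "redirect all HTTP traffic for
# old.example.com to https://www.example.com, preserving the URI"
when HTTP_REQUEST {
    if { [string tolower [HTTP::host]] equals "old.example.com" } {
        HTTP::respond 301 Location "https://www.example.com[HTTP::uri]"
    }
}
```

Even a rule this small shows why natural-language generation helps: the author needs to know the event model (`HTTP_REQUEST`), the command namespace (`HTTP::host`, `HTTP::respond`), and TCL syntax, none of which a casual operator revisits often.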
That's pretty powerful, and you know, we know that there are a number of our partners who take that role for customers, and I think that will add productivity for them and their teams.
Absolutely. And for the partners that are building these really sophisticated iRules, where sometimes they have to reach back out to F5, the AI will help them self-solve.
That's amazing.
So it's going to be a win-win, a win for our partners, and it's going to be a win for our customers as well. So really excited about that.
Excellent.
The next initiative that we're focusing on is what we call AI-powered WAF. Now, before anyone jumps to this and thinks that we're going to have a WAF that automatically is going to tune itself and do all of that, AI still hallucinates, and what we mean by hallucinate is AI can still make all sorts of stuff up.
Sure.
So what we're talking about is a WAF that is going to learn behavior for applications and APIs, and then it's going to provide a set of suggested configurations and rules and policies after it gets to know the application and API. So still very much human in the loop or human on the loop, but the idea is fundamentally, let's leverage technology to help improve our web application firewall.
Sure, so that human assist.
Yes, that's absolutely right. And again, we know we have many partners who are tuning the WAF and the set of experiences around the WAF for our customers. We think that this is going to be a great opportunity where you can sit in front of the suggestions that we're providing and then go back to the customer with a set of suggestions. You'll look like rock stars.
That's awesome.
The third and fourth will be more around AI assistants and things like predictive operations, so we want to be able to help with posture management effectively. We want to be able to tell you that your environment is set up correctly, or, if you are configuring a solution, whether it's something like NGINX or BIG-IP, and you're configuring it in a non-standard way, we want to be able to get out in front of that. And we think that our partners would love to have a capability like that.
Absolutely. It's a way better place to find the challenge than in production.
As someone who worked with a partner when I was at the BBC, I will tell you that I leaned really heavily on my partner to help me get my deployment right.
Absolutely.
And I think that for our partners, I think you're going to absolutely love how we're bringing AI into our ADC portfolio.
That's amazing.
Now let's talk about the flip side of this.
Yes.
So ADC for AI, specifically, how are we going to help organizations with their unique AI challenges? And there's three challenges that organizations have today with AI. The first is data, and we already talked a little bit about how organizations need data for AI workloads. What we've observed is a lot of people need help with moving data in and out of these large data stores. And for things like retrieval-augmented generation, organizations are going to need to pull a lot of data out of these systems. These systems are great, but they were never designed to handle as many reads as we're seeing now in these environments.
Makes sense.
So what we've heard from really large technology vendors in the space, data vendors in the space, as well as customers and partners, is the desire and the need to have a solution like BIG-IP sitting directly in front of these solutions. And we can absolutely help. We support all the major protocols that these data vendors support or have implemented today. A great example is NetApp, right? Jensen Huang of NVIDIA has said that 80% of enterprise data lives in NetApp appliances. Being able to have BIG-IP directly in front of a NetApp appliance inside these large organizations, we think, is just going to be super powerful for people.
That's amazing, and I love the connection to another key technology vendor that I know many of our partners have important relationships with.
And it doesn't stop with NetApp, right? We want to be able to support the vast gamut of data providers and solutions in the space. And I know Lisa will probably talk more about some of the other ones with our partner community; I'm certain you're going to hear about those as well. The next one is going to be around load balancing for AI workloads. And remember, I walked through this before, but people will go from a single GPU to a cluster of GPUs to a factory of GPUs. And so for the medium to large organizations out there where people are implementing AI on their own, they're going to need the ability to do things like load balancing. They're going to need the ability to apply security before the inference job or the training job makes its way to the GPU cluster downstream.
And I think there's just a phenomenal opportunity for our partners to position all of the capabilities we have at F5 right there. So whether it's right above the AI factory or if it's across multiple AI factories, we just think there's a tremendous opportunity there.
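To make the load-balancing role concrete, here is a minimal NGINX sketch of fronting a pool of GPU-backed inference servers. The hostnames, port, path, and the choice of `least_conn` are all illustrative assumptions, not a reference architecture.

```nginx
# Illustrative NGINX config: load balancing an inference service.
# Backend hostnames below are hypothetical GPU-cluster nodes.
upstream inference_pool {
    least_conn;                      # long-lived inference calls favor least-connections
    server gpu-node-1.internal:8000;
    server gpu-node-2.internal:8000;
    server gpu-node-3.internal:8000 backup;
}

server {
    listen 443 ssl;

    location /v1/completions {
        proxy_pass http://inference_pool;
        proxy_read_timeout 300s;     # model responses can take a while
    }
}
```

Sitting at this tier is also where security policy (authentication, rate limits, a WAF) would be applied before a request ever reaches the GPU cluster downstream.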
Unbelievable.
And the last one is around API.
Okay.
Right? So API is the glue that holds AI together. And when people are going through APIs for AI, they're typically using it for inference, right? I have a query, I want to go, and I want to ask an AI something. It could be OpenAI, it could be AI that you've deployed locally. But we think that there's an awesome opportunity to secure all that inferencing. And so being able to leverage, whether it's distributed cloud or whether it's NGINX or whether it's BIG-IP, we want to be able to sit directly in front of all of those workloads where inference is happening. And it's just an exciting time right now.
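The "sit directly in front of inference" idea can be sketched as a toy gateway check: validate the caller and the basic shape of a request before it would be proxied on to the model. The key store, field names, and limits below are invented for illustration and do not represent any F5 API.

```python
# Toy inference gateway: validate requests before they reach the model endpoint.
VALID_KEYS = {"partner-demo-key"}   # hypothetical API key store
MAX_PROMPT_CHARS = 4000             # hypothetical per-request size limit

def admit(request: dict) -> tuple[bool, str]:
    """Return (allowed, reason) for an incoming inference request."""
    if request.get("api_key") not in VALID_KEYS:
        return False, "unknown API key"
    prompt = request.get("prompt", "")
    if not isinstance(prompt, str) or not prompt:
        return False, "missing prompt"
    if len(prompt) > MAX_PROMPT_CHARS:
        return False, "prompt too large"
    return True, "ok"

ok, reason = admit({"api_key": "partner-demo-key", "prompt": "Summarize Q3 results."})
# Only admitted requests would be proxied on to the inference backend;
# rejected ones are answered at the gateway without touching a GPU.
```

Rejecting bad traffic at this layer is what keeps scarce GPU capacity reserved for legitimate inference work.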
I think when you take a step back and you think about where F5 is today, obviously leading the world in terms of delivery and security at this scale, working with amazing partners with awesome technology, I think we're just in a great spot right now.
Kunal, you are actually talking about what has been our heritage, what customers and partners know us for, application delivery controller. What does this mean for F5 and our partners?
I am so glad that you said that and broke out and expanded application delivery controller. I absolutely love it, and I think right now, ADC is probably one of the most important things in the world, right? Application delivery controller. I actually think the reason for this is that when you take a step back and look at the industry right now, or the ecosystem, nine out of 10 large organizations are hybrid and multi-cloud.
Yep.
Right? They're going to have data everywhere. They're going to have applications everywhere. There's a term called data gravity. And when you think about the way that AI works, you need to bring AI to where data lives. You can't bring data to where the AI lives. And our partners certainly know this because they work with large regulated organizations that need to have their workloads in, for regulatory reasons or legal reasons, the data needs to live in a certain place.
Sure.
And so we think that organizations are going to be way more hybrid and multi-cloud. But we think that there's an opportunity for F5 to, I would just say, redesign and reframe, reimagine ADC for the AI era. And I believe that there's six properties that are going to define next generation ADC. And these properties are going to include that holistic set of capabilities around delivery and security, right?
Sure. That's been the core.
It's the core, and again, F5 started with delivery and then, over the last decade, added more security capabilities. I mean, obviously, F5 has had a web application firewall for a long time, but we've expanded to things like API and got into bot with the Shape acquisition years ago, right? Of course, the second thing, and I think this is also important, is being able to deploy in any form factor, any environment, anywhere, right? That's something that we've been leaning on as a message.
Absolutely.
And it's why we have the product families of BIG-IP and NGINX and Distributed Cloud. You then get into this really important thing, and our partners have told us this, and it's good to bring this up.
Yeah.
This concept of single policy and unified management.
This is a big problem.
Consistent policy management is important. I think every customer out there is looking for the ability not just to consolidate everything into a single pane of glass, but really to write a policy once and have it propagate to all of the F5 capabilities in their world. Of course, there's also rich analytics and insights, and programmability. We talked about iRules.
Yep.
And then the last bit is around full lifecycle automation, especially as people embrace DevOps today.
Sure, and we see a lot of that. There are many of our biggest partners who have built great teams around automation and bringing automation into the customers to really make those F5 owners inside the customer more efficient.
Yes.
You know, bring a lot better capabilities to what they're doing.
Getting to work with some of our partners and getting to meet some customers, there's not a meeting that goes by where a customer doesn't bring up things like infrastructure as code.
Sure.
Or talking about CI/CD and wanting to, in a programmable manner, orchestrate and set up our technologies and solutions.
Yes.
It's great to hear and see that.
Yeah, absolutely. That's wonderful.
But I go back then and say, well, okay, if these are the properties that's needed, how are we going to address that? And so our vision is to build a unified ADC platform that's going to combine all these market-leading capabilities, awesome and advanced intelligence, and we want to unify all of these things together. And today, when you break it out from an F5 perspective, there's really five pillars, and we call them anywhere capabilities.
Okay.
That will include things like application delivery.
Sure.
That's what people know us for.
Of course.
Load balancing, right? LTM, GTM for our partners who know all of those codes, and I'm sure they all do.
They do.
And then you get secure multi-cloud networking because, again, the world is going to be hybrid. The world is multi-cloud. The third pillar is going to be around Zero Trust.
That's a big topic.
I love it. I love it. And yes, it's an absolutely huge topic. And so this would include things like APM, right? So having the richness there, access management. There's a lot of good stuff in the works on the Zero Trust side.
Excellent.
On the F5 front, of course, there's web application and API protection. We've done that, and we keep adding more and more capabilities there. And then, of course, enterprise AI solutions as well. This would be around how we're going to not just load balance these workloads, but secure them. And I won't spoil it now, but there's going to be some fun surprises.
Awesome.
And some really cool things that we're going to be talking about in the near future. But we put all this together in what we call an ADC platform. And that platform comes together and can provide a bunch of capabilities and technology that can be weaved through, whether it's BIG-IP or NGINX or Distributed Cloud Services. So this is our vision of leveraging these rich anywhere capabilities that we've already built.
Sure.
And we want to put them all together into a platform that's easy to digest for organizations. Now, again, our goal is to provide the best of the best capabilities, but do that in concert, hand in hand with our partners.
Absolutely, and this really brings into context the acquisitions we've made and how they all come under this rich heritage we have of application delivery controller.
Absolutely.
This is really, this is the future.
We want to redefine ADC for the AI era.
Amazing. Amazing.
And again, I go back to those six properties that we were outlining. We're the only company in the world that can deliver on those six properties holistically. And of course, we can't do that by ourselves.
No.
We have to do that with our partners, and the one thing that was so amazing about our partner community is how they lean in.
100%.
And my ask to our partners is this: I have a pretty simple email address, comparatively, just my first name, kunal@f5.com. To all of our partners, feel free to reach out and send me an email. I would love to meet you or answer any questions you have about all of this. Again, it's total transparency, an open door with respect to our platform, our vision, and where we're going.
And I really appreciate that in just the few months that you've been on board, you have made sure to meet with our partners in every city that you've gone to. And so as a channel leader, that is a dream to have someone who is listening, engaging, and looking for feedback and how we can be better together.
Without partners, we wouldn't be F5.
No. That is so true. It's woven into who we are and how we've built the company that we have. So let's do a summary. Give us the top three takeaways that you want our partners to remember when talking to customers about our role in AI.
Sure thing. And again, I just want to say thank you for pulling all of this together.
Oh, of course.
And you were saying such nice things about my visits, but I have to reciprocate here and say working at F5 has been amazing for me because, again, I love the fact that our partners are reaching out and sharing opportunities for us to get better, or suggestions. I love that. And so kudos, and a testament to the quality of the team and the program, Lisa, that you're pulling together.
We have an amazing channel team around the world. We have an amazing sales team around the world who spends a lot of time working with our partners. So thank you.
Now, going back to the question that you asked: how do I want people to think about us with respect to this conversation and AI in general? AI is transformative. We all know this. We talked a little bit about the pull forward, the budgets, the growth of AI applications. One, we are going to add AI across our portfolio to improve the customer experience, whether that's helping people write iRules, AI-powered WAF, or these assistants. The second thing is about how we help organizations solve their AI challenges. And again, we walked through three examples. The first is data ingestion: how do we sit on top of these data stores? The second is load balancing for these AI factories.
The third is how do we secure all that inferencing?
Absolutely.
And then the last bit is how do we pull all of this together holistically to make the lives of our partners as well as for the customers easier? I mean, everyone is doing the best and the most they can right now. And we think we're in a great place to connect the dots and really try to reimagine ADC for the AI era.
Absolutely. And I look forward to working with your team and others to bring the reference architectures to life through our partner community. So thank you for being here with us today. I appreciate the overview that you've given and your help in making our partner community understand where F5 is going and our key role in this AI era that we're in right now. Now, we want to show each of you how you can help your customers today. And since this is the recorded part, we're going to bring you over to the regional teams, who are there to answer your questions live. Thank you so much.
What a great, insightful look at what's going on in the AI industry, where F5 is at, and what the opportunity looks like for all of us as we move forward. But now let's move over to our F5 Senior Product Marketing Manager, Hunter Smit, who has some tips and resources to help your customers build their AI infrastructure. Hunter?
Awesome. Thanks, Paul. And hello, everybody. I'm excited to share with you how F5 can help your customers build their AI applications today with three use cases. The evolution over the last 10 years has created application architectures that span public clouds, on-prem deployments, co-locations, and the edge. In the last couple of years, your customers have felt pressure to augment their applications with AI. AI is providing organizations the opportunity to build apps that interact with their customers in a more freeform manner and further leverage their internal data to drive revenue and satisfaction. Building these AI applications further complicates application infrastructure and creates new security risks. You can help your customers build infrastructure that enables them to continue to evolve their business operations securely with three stepping stones that we've outlined here. First, building the foundation with API security for AI workloads.
In the world of AI, if you don't have a foundation, it doesn't matter what else is being built. According to F5's State of Application Strategy Report for 2024, the increasing use of AI means more APIs. Because AI apps are often architected with multiple APIs, further deployment and use of AI will create an even greater flood of APIs. This continued reliance on APIs has ushered in a new era in which APIs can be as critical to the business as the apps themselves. They're also harder to protect, monitor, and manage. F5 Distributed Cloud Services API security can help organizations get a handle on their APIs across all environments, allowing your customers to easily deploy critical monitoring and enforcement wherever needed to keep track of and secure all API connections. Second, we have traffic management for AI data ingest.
Customers are rushing to differentiate their solutions with AI and make their offerings unique, and this takes an enormous amount of proprietary data. F5 technologies can help deliver a complete AI and ML infrastructure for your customers. Traffic management for AI data ingest is how large-scale, media-rich AI data is managed and transported into a data center for machine learning and training purposes. For enterprises handling large-scale, media-rich AI training data sets, F5 VELOS or rSeries with BIG-IP LTM delivers an unparalleled on-premises solution. BIG-IP LTM on rSeries and VELOS can cut costs and accelerate performance while strengthening security for data traffic ingestion. F5 is enabling seamless integration of these hardware and software solutions to ensure secure and efficient handling of large-scale, media-intensive AI data. Third, we have distributed inference for AI models.
F5 Distributed Cloud App Stack drives distributed inference regardless of location to deliver timely, accurate responses using a single control plane across F5's global infrastructure. This seamless integration of web and API protection delivers performance for latency-sensitive applications by using AI and ML at the edge to ensure low latency. Integrations with storage and cloud platforms ensure fast, accurate, and context-aware responses, and when deployed, your customers can build better-performing AI apps via a single virtual data plane. As we head to the next slide: to begin these conversations with your customers today, you can leverage F5's AI channel sales plays, available on Partner Central, covering API security for AI workloads, traffic management for data ingest, and distributed inference. In just 10 minutes or less, these channel sales plays provide focused guidance, enabling you to be ready to go. They're focused on four elements.
First, what to know. This is a quick overview of the F5 solution, the ideal customer profile, the target audience, as well as professional services and larger technology selling opportunities. Second, what to say. This provides the talking points, the questions you need to ask based on the persona, and objection handling. Third, what to show. This provides links, URLs, and resources for customer presentations, solution overviews, blogs, reports, etc., helping you drive the conversation and consideration of F5 solutions. And fourth, what to do. This includes information like deal registration links, as well as the channel quoting guide. Now, I will pass it back to David [Willman] for the Q&A session.
Thanks, Hunter. Appreciate your time today. My name is David [Willman], and I will be your moderator for today's Q&A. I'm going to ask Paul and Ken to join us on this virtual stage. I think everybody knows who Paul is, and he kind of introduced himself earlier. Ken, if you don't mind, if you can take a moment to introduce yourself, that'd be great.
Sure. Hi. I'm Ken Arora. I work for Kunal in the office of the CTO. I'm sort of a hardcore technologist, so I'll give you the technology perspective. I have worked in the area of cybersecurity and AI for many years. Just a little anecdote about my past life, since Kunal mentioned iRules: my first job out of school, many, many years ago, was in fact to do hardware acceleration for neural nets, so there's a long history there. I am so happy to be here, and I look forward to the Q&A.
Awesome. Thanks, Ken. And folks, just as a reminder, if you do have any questions, feel free to put them in the Q&A chat box. We would appreciate it. So we do have some questions that came in. First question, Ken, is probably for you: what are some specific examples of how AI will be used in the upcoming explosion of customer applications?
It's a really good question. Let's be specific. I'll break it into two areas, and I'll make them specific by using F5 as an example, but F5 is just illustrative of many enterprises. The first bucket, how enterprises and customers' companies will use AI, is in customer-facing applications. And again, there's a maturity curve there. The first example of that is just a chatbot. You've seen this roll out in many companies: they've got an AI-powered chatbot to help you with a set of questions. The next level of that, and you heard Kunal talk about this, is when those enterprises say, "A generic ChatGPT is nice, but it doesn't know anything about my company and my enterprise. So how do I teach it about that?" And that's the RAG model, the Retrieval-Augmented Generation model that Kunal talked about.
So what we do there is we pull data from data sources that are internal, some that are maybe partner data sources, and we put that all together, and we help the AI give a better answer with all that information. I don't think Kunal got to talk about it, but that pattern of incorporating domain knowledge for a better customer experience is something that F5 has rolled out with Distributed Cloud, helping users get their questions about Distributed Cloud answered. And you'll see the same thing happening, I think, as we speak, with NGINX. A little bit of a teaser, I guess, for people who might want to dive deeper. I think this pattern is something that's going to have a lot of opportunity, not just for F5 but for F5 partners: data, governance, things like that. Those are teasers.
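The RAG pattern Ken describes, pulling internal or partner data and combining it with the user's question before the model answers, can be sketched minimally as follows. This is a toy illustration, not an F5 or Distributed Cloud API: the document list, the keyword-overlap scoring, and the generate() stub are all hypothetical placeholders, and a real deployment would use a vector database and an actual LLM endpoint.

```python
def retrieve(query, documents, top_k=2):
    """Rank documents by naive keyword overlap with the query (toy retriever)."""
    q_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(q_terms & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

def build_prompt(query, context_docs):
    """Combine the retrieved internal/partner data with the user's question."""
    context = "\n".join(f"- {d}" for d in context_docs)
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

def generate(prompt):
    """Stub for the model call; a real system would hit an LLM endpoint here."""
    return f"[model response to prompt of {len(prompt)} chars]"

# Example: internal knowledge the generic model wouldn't know.
docs = [
    "Distributed Cloud API security discovers and monitors API endpoints.",
    "BIG-IP LTM load balances traffic for AI data ingest.",
    "NGINX serves as a lightweight reverse proxy and load balancer.",
]
question = "How does API security monitoring work?"
answer = generate(build_prompt(question, retrieve(question, docs)))
```

The retrieval step is what grounds the answer in domain data: the model only sees the handful of documents most relevant to the question, rather than being retrained on the enterprise's entire corpus.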
Ask follow-up questions if you're interested. The second chunk, and I'll keep this brief, is how companies are using AI to help their own productivity, and I think Kunal mentioned there are companies working very hard to measure the productivity gains they're getting. A specific example for a software company like F5: how can it help the productivity of software engineers? How can it augment what a software engineer does? A software engineer may not know some specific domain, so they might ask ChatGPT, "How do I start the kernel of some software to do something?" A very specific task. It might be to help understand code: there might be legacy code that's been lying around for 15 years, those particular developers are no longer at the company, and I want to understand it. It could be used by marketing for summarization.
"I need to take this and summarize it." These are all use cases that we're doing. It's being used by our support services to help take knowledge, personalize an answer, and be a starting point for an email that the support engineer might write out. So those are some very specific examples of what F5 is doing, and I think those are typical of most customers.
Awesome. Thanks, Ken. Those are great examples. So another question just came in. Does F5's AI strategy change the way it views partners in terms of skill sets?
Sure. I can start with that, and Paul might want to pick it up. I think it expands the range of partners we have, and it takes some of the areas that have not been in the forefront and brings them to the forefront. I'll give just a couple of specific examples. One is the role of data. Data is, you've heard it, the oil, the lifeblood of AI applications. You heard Kunal talk about how moving data efficiently is going to be a core value proposition for the ADC in the AI age. So you can imagine that there are going to be issues there, not just with delivery, but with the inspection of data: making sure that inappropriate data isn't being leaked, that there's not bias in the data.
Those sorts of things are going to be areas where F5 has some technologies, but we're going to work with partners. I'll throw another one out there: governance. You can't protect, and you can't deliver, what you can't see. We've talked about APIs, and API discovery and API governance are going to be key areas for us. There's some technology F5 has, but there's a lot of technology we're going to leverage from our partners to help us with that visibility and that governance of what's going on with AI. So those are just a couple of examples. Paul, I don't know if there's something more you want to add.
Yeah. Look, I also think that, with what it does, I'm not sure it's a brand new skill set that we're looking to pursue. When you look at the things we're doing with AI, whether it be API security, that's something we already do today, right? We're applying it to a new technology with AI. When you look at traffic management capabilities and the things you have to apply to an AI model, those are things we do today; we're just applying them to a different technology. So I think there are a lot of capabilities and skill sets in the partner community, built over the years, that are very applicable to what we're going to do in the AI marketplace together.
Awesome. Thanks, guys. Following up with our next question here: our customers are telling us that they are overwhelmed with how many vendors are telling them they have the right approach to AI. Do you have a message we can bring that helps cut through the noise? Ken or Paul, either one of you can take that first, and you can both weigh in.
Yeah. Look, I think it is an overwhelming message, because everybody's coming at them, and AI is definitely the buzzword that everybody uses. I think what F5 brings to the table, along with the partner community, and what the real strategy is, is really to look at some of the alliance partners that we have. Typically, we've gone to market together, but I believe this opens up the door to a better conversation about how we bring one and one and one together. So not only the skill sets and capabilities of the partner community, but bringing in the alliance partners that we work with, the alliance partners that you work with, and bringing in F5 to really create holistic solutions for what people are doing in the AI space. Just to name a few, and it's probably the ones everyone's thinking about.
There's the opportunity to have conversations on the value of bringing NVIDIA and F5 together, along with the partner community, and how we build out holistic solutions. I think Kunal brought up some interesting comments around the data stores and how we're actively working with the folks at NetApp to look at how we create holistic solutions, bringing together the power of not only F5 but some of the alliance partners to make sure that we build out these solutions holistically. And there are others we're working with, like Intel, as we continue to refine that message, refine what that architecture looks like, and define what the value is as we move forward with these solution sets and go to market with the partner community to drive that into the market.
Yeah, that's great. I'll just add, from a technology point of view, what we also want to do is build platforms and infrastructure such that, when the customer is saying, "I don't want so many vendors," often what they're really saying is, "I don't want to manage all this complexity. I don't want to manage what looks like eight different things." And from a technology point of view, if we can find a way to integrate what F5 does and what the partners do into a more holistic platform, that will also address a lot of the concern these C-level executives have. An example of that is the hybrid multi-cloud strategy.
If we have a hybrid multi-cloud framework that we can plug components into from our partners, but it all looks and feels like one holistic solution, that's going to go a long way, as just a specific example, in allaying some of the concerns behind the statement, "Hey, I'm drowning in a sea of vendors."
Gotcha. Thanks, Ken. I think, Paul, you already addressed one of the questions about which alliance partners F5 is aligning with. You mentioned Intel and NVIDIA, so if there's anything else you would like to add, go ahead, but I think you pretty much covered that question in our chat there.
Yeah.
So there is another question. Currently, F5 has different personas that we have conversations with: NetOps, SecOps, DevOps, and developers. Is there a certain persona that your AI messaging resonates with more than others? Ken, do you want to take that one?
Sure. I'll start with that. I think that within those personas, they each have their own perspective on it. SecOps is very interested in what data is being sent out to, say, third-party AI services. DevOps is interested in how to bring RAG in. But what we're actually seeing, I think, is another persona emerging: AI Ops. They are very interested in these solutions because they have a more holistic view of what goes on. The other thing I would say is that the AI conversation today ends up being an opportunity to up-level the conversation. You're no longer selling to the middle layer of technology services, with that division between NetOps and SecOps; you're actually selling to a CISO. And I think Kunal and Lisa talked about how these conversations are board-level conversations.
And the CISO is looking out at this portfolio of applications and all the different threats going on, whether those are governance concerns, data leakage threats, or cost: the OpEx of doing these GenAI queries can add up. You find that the AI conversation, especially, is something that the CISO or the CIO of large companies is very interested in, and, as often as not, they are the buying persona.
Paul, do you want to add anything else, or?
Yeah. No, I'd just reiterate what Ken said. And look, I think one thing we're finding is that it's not always easy to find out who's really involved. The question is more around how you figure out who's really involved with the activities around deploying an AI strategy. One of the things we found, talking to partners as well as customers, is that you can go talk to DevOps folks, you can go talk to NetOps folks, and you can ask the folks you deal with on a day-to-day basis what their AI strategy is. What we're finding is that nine out of 10 times they really don't have any idea, because they're not really part of it. And I think that goes to the conversation Ken talks about.
This is a great opportunity for us to really up-level the conversation into the executive suite and talk about how we can collectively take our solutions to market to enable them in driving their strategy, whether that's driving productivity into their organizations, making their applications run more smoothly, or leveraging the data they have. I think it gives us an opportunity to broaden the scope of the conversations we have at the executive level, for sure.
Awesome. Thanks, Paul. It looks like that's all the time we have for questions, but again, thank you, Paul and Ken. Before we conclude, I have some exciting announcements to make. That's right: we have our next Partner Connect, which is going to be November 21st, so please mark your calendars and be on the lookout for the invite to come. And then also, we have App World. This is our large customer event happening in Las Vegas, February 25th through the 27th, and registration opens October 7th. And lastly, as Paul mentioned at the very beginning, if you haven't already, please take a moment to complete the brief survey that will pop up on the screen once the session ends. The recording will be available on Partner Central and linked in the follow-up email as well.
Thank you again for joining us today, and we will see you in November.