Ready to go. Thanks, everybody. I'm Derek Wood, Senior Analyst covering enterprise software at TD Cowen. We've got the AvePoint team here, Mario and Jamie. Mario, why don't you kick off with an introduction about yourself, and then let's start with a brief overview of AvePoint and the core pillars of the products and services.
Sure. Thank you for having us. Mario Carvajal, I serve as the Chief Strategy and Marketing Officer. Been with the team for, I think this is my 17th year. It has been an interesting, long but short journey in many ways. We are continuing to do work that we started a while ago, which is helping organizations with unstructured data. As you can imagine, the growth of unstructured data over the last few years has been phenomenal. We create unstructured data in everything we do. We take a picture, we listen to a conversation, and that transcription service is now unstructured data. We have designed a comprehensive platform that helps organizations deploy the security controls to prepare and optimize their data.
We spend a lot of time with organizations across all industries, really figuring out the best way to deploy automated controls that help secure access to information, ensure that when knowledge workers are sharing data, whether it's with internal employees or external partners, that the data is not only secure but well-governed. We have in the platform three major suites. We like to think of them as themes. We think of the resilience of information, the importance of making sure there's continuity by having information available. There are a lot of companies that will say to us, "Look, I have data sovereignty rules that say when we apply a protection policy to data, it can't leave the borders of our country." Our software allows you to do that, especially in an organization that could be a global organization.
We also introduce in resilience the ability to manage the lifecycle of data. Many organizations will turn a lot of this unstructured data into records. Applying a records policy is not easy to do. Some of the work we've done with organizations is digitizing a lot of data by applying record policies that dictate when the information should be moved, when the information should be archived, deleted, et cetera. That is all part of this resilience approach. We have a control suite, which allows us to deploy all the controls. That works hand in hand with the resilience suite, where an organization will say, "That's great.
I already have a protection policy in place, but how do I actually know if the data is ready for my AI or GenAI tool or product that I purchased?" In the control suite, we allow you to do a number of things. One is we can take an inventory of all of the access controls. We can say we have identified highly sensitive data that's at risk, and here's a policy to make sure that we appropriately implement the controls for access. In the control suite, we will also manage the user lifecycle. Users are coming in and out of the organization, and any time a user comes in, you probably have to provision access to information. How do you do that effectively within the policy? We do that as well.
The third suite at a high level is really one where we're modernizing the data. We have many organizations that love to move data from either other sources, other data stores. When you're moving data, the characteristics of the information should be retained so that you have continuity, so you have data lineage. Our software, over the years, we've built over 30 connectors to legacy systems, to other cloud sources, and we're able to move those data sets around without losing that fidelity. The three suites really are designed to help an organization think about an end-to-end approach to managing the data through its lifecycle.
Yeah. I appreciate your modernization tool. We used that when we migrated over and made sure all my files got moved over, all my old emails got moved over, and I did not lose any of that.
Yeah. In that move, the continuity for you and your team is important. You don't have to sit there and say, "We have an outage," or, "We have to wait."
Yeah. I like that: resilience, control, and modernization are very thematic about what you're doing. That's a good way to think about it, and a helpful foundation. Moving on, let's talk about last quarter and the macro, what you guys are seeing out there in terms of spending behavior, and how you feel you're positioned. I've been asking people this all day, and I think people are coming back to, I'm sick of saying macro. It's just the new normal. There's always uncertainty out there. It seems like that's the world we're living in. What are you seeing out there?
I'll start, and I'm sure Jamie will add some comments. I think many organizations are looking to prioritize spend in areas that really allow them to drive value in their business. On the one hand, the macro creates uncertainty because they're being just more measured in the investments. That in many cases could introduce slower sales cycles and more approvals. I think that's that ongoing macro. When you're working with organizations and solving high-priority issues, you then see an opportunity to grow your own business. In our case, we've been very fortunate. Nine quarters of performance have really shown that we're addressing a core problem. We're working with these organizations in the areas of priority. The transformation of the data set is really important as organizations get ready for what they're going to do with AI capabilities.
We are quite fortunate that we also play, as you know, in the global landscape. We take advantage of really seeing the rhythm across different markets. That has given us a great way to help organizations that are dealing with different challenges of regulatory changes in the environment because of AI. Some of that has created good momentum for us. The pipeline build continues to be in the positive. That is another reason why I think our performance has been what it is.
Yeah. Maybe just to sprinkle a few numbers in on top of that. Total ARR growth was 26%, or 28% when you adjust for FX, which were both accelerations versus Q4. Revenue growth was 25%, or 27% on a constant currency basis. We had record growth in net new ARR, which we're obviously extremely proud of. We had beats versus our guidance for Q1 revenue and op income, which we did flow through to the updated full-year guidance for those metrics. Our ARR guidance for the year, absent updated expectations for FX, we did not raise, which was really just a matter of being prudent and accounting for, again, the "macro uncertainty" that could be in the second half of the year.
There is really nothing that we have seen to suggest that we are not a priority or that the problems we solve are not critical for customers. The demand environment, I think, is very healthy. We are obviously very happy with our continued efficiency and profitability improvements, with about a 14.5% non-GAAP operating margin in Q1. We are expecting about the same for the year. We have been able to consistently, like Mario said, for nine quarters now, deliver really strong performance on the top line while not sacrificing profitability.
Yeah. I mean, to hear the word acceleration, not just this quarter, but we saw that in the numbers a bunch of times last year too. It's pretty rare versus what we've seen from the rest of the software sector this cycle. You guys are certainly on to something. Just before moving on from the macro: the one area that really has more potential uncertainty is the public sector side and what's going on with the DoD. We've seen other companies with higher exposure try to de-risk numbers a little bit. Just give us a sense for what the exposure is for you guys. Especially since we know Q3 is the big quarter, do you feel like you've taken some more conservative assumptions on that vertical?
I think so. It was actually one of the things we talked about, not even on this most recent earnings call, but back at the end of February when we reported for the full year of last year and guided for this year. We actually already accounted for, at that time, what we felt could be uncertainty related to the impact of DOGE. I think it's important to keep in mind that about 11% of our ARR is tied to the "federal government," but that's on a global basis. When you distill that down to just the U.S., and then just the agencies that had actually been discussed and maybe targeted by DOGE, it was about 2% of our total ARR.
Obviously, we didn't view that as sort of a binary thing where it could stay or go, but we just tried to sort of account for kind of potential uncertainty there. I think we certainly accounted for that exposure at the beginning of this year and sort of baked in a little bit more of that uncertainty. That's probably also reflected in sort of the updated guidance for ARR as of Q2.
Yeah. Maybe just one comment I want to add. The other opportunity for us is that even within the DoD movement, a lot of it was about modernizing the infrastructure. A lot of agencies were still sitting on legacy, and legacy also increases cost. Our sales team has done a very good job in moving into that area of value; as part of that modernization strategy, we want to be in there. We want to be included. When we see RFPs come out, a lot of the work we're doing is emphasizing that the platform is designed to enable that modernization.
The beauty of having a platform is you do not have to come in and say, "I am ready to deploy all the governance capabilities." You could start with the connector framework to say, "Wait, let us move the data out of a legacy system, reduce cost." I think those are opportunities for us to navigate the macro or the changes that are occurring. As Jamie mentioned, the 2% we saw in those agencies really was not discouraging for us because, again, we are coming in with a core value proposition to help you also reduce cost. As long as you are saying, "I am helping you reduce cost," you get the next follow-up: "Come in and show me how you can help me reduce cost."
The other exciting value prop and thematic area has been around Copilot, and GenAI more broadly, getting your data estate ready. We recently published a GenAI report, and certainly data cleanliness, data quality, and data governance come up as key constraints to Copilot or AI agent adoption. You guys have been talking about this for a while in terms of helping with proofs of concept. Where are you starting to see proofs of concept move into production? How have you helped with that? How are you thinking about more pull-through as we get deeper into this AI adoption cycle?
Yeah. We are continuing to help organizations in that transformation. The initial pilots we saw have now moved into rollouts that are pointing the attention to the quality of data. I was just seeing a client last week, and one of the challenges they have is they've been sitting on millions of files that do not have all the characteristics so they could catalog the information appropriately. Part of cataloging the information is so that Copilot does not inadvertently surface incorrect information. How do you do it at scale? How do you, without compromising the initial timestamp of when that file was created, who authored it, add additional characteristics so you can make those decisions?
When we're working with an organization in a problem like that, we typically will say, "Run our command center for a quick analysis on what your risk posture is to information." You can do that today in the product. That allows you to get a readiness score, and then the application will start to inform which are the policies that you should turn on based on your own sharing policies, right? This is where we do integrate. Often, if they're using first-party products, let's say they're using a Microsoft Purview, we'll integrate with Purview. We'll say, "We know you have certain class codes in Purview. That could work, but over here, you still have exposure to external access. Do you want us to turn that off?
Do you want us to actually apply a different label to that?" In some cases where the information is just stagnant, it has been sitting there, and you want to actually train on that data, you need to then apply a better labeling system. All of that is part of the story of our platform and the fact that we can do it across not just one of the applications in 365, but all the applications. Even if the company says, "I'm also using Google Workspace over here," or, "I'm using G Drive," right? Maybe I'm storing data there because I happen to have a contract also with Google Cloud. The beauty of our product is that it is multi-cloud. We are able to apply that policy holistically.
That allows the organization to say, "We're now ready to turn on the Copilot agent in this case at global scale." We are part of that motion. We are unlocking the potential. We also, Derek, you may know, announced some of the work we are doing to launch the products now in Google Cloud. The same narrative applies, right? Gemini, I think, is also offering organizations an alternative choice. In some cases, for organizations that made the initial investment in Google Cloud for productivity applications, they are saying, "How do we actually use more Gemini? How do we use more of the capabilities that Google is going to be shipping even with NotebookLM," right? Which is a different way for people to change the way they do research, take notes, et cetera.
If the quality of the business's data set has not been maintained over the years, it will be hard for you to tap into that and extract value. Back to your question, I think that is part of our narrative. Beyond that, establishing a governance framework so that when regulation starts to come in, the organization is ready to adapt to it, is also an opportunity we are giving organizations because of the way we built the platform. I think that is also going to be really interesting in the next few months. The EU data regulation comes into play in September. Especially in the EMEA region, organizations are going to prioritize that. How do we deal with the privacy of information? Do I have a way to audit and keep that data lineage? We are going to be providing that for them.
To us, this is really showing that the momentum is there. As you said, it's not just pilots. It's true implementations that are happening.
How do we think about if customers really start proliferating use of generative AI, let's say within Microsoft? I mean, how does that pull in more use of AvePoint?
I'll give you an example. Just to set some fundamentals, you have Copilot, which is used inside the productivity apps. Then you have Copilot as part of what we're doing in Power Platform for low-code, automated workflows. With Copilot Studio, agent design is going to give organizations an opportunity to transform manual business processes. In that arena, we see a great opportunity for us to be the provider of the governance controls for your agents. The good story there is that that's not only applicable to Copilot Studio; it could also work eventually when Google ships their Agentspace.
For us, what you'll see from us is an opportunity to help organizations with agentic systems where you're designing the agent and you might have your choice of which LLM, but you're still thinking, "Well, how does the governance of what the agent is doing happen? How do I apply a policy? And how do I meet regulatory standards?" That for us will be a way to help organizations continue to not only be ready, but maintain the posture of what the security is for that agent. We think we're uniquely positioned because of the way we've just built the products and the way they interoperate. In many cases, because we're already in these organizations as a default system to monitor the environment, we have the signals already in place to see what's happening with the data set.
Why do customers choose you over Microsoft's governance tools?
Because our system sits outside of Microsoft, we're adding an additional dimension of controls that augment what you have with Microsoft First Party. They'll give you a basic layer of capability, and we'll come in and address all of the other scenarios that are more specific to your business. We still do it with an application that over time scales with your business. There's a lot that we've built in how the software can remediate based on how the environment is changing over time. I think that's quite unique. It's also very different when a company says to us, "I like your policy mechanism, but I also want it to work in data stores outside of Microsoft." The fact that we do that also creates a differentiation and a clear distinction that we are providing unique value versus what they get with First Party.
The same would apply when we are talking and working with a Google Cloud customer where the DLP capabilities that ship with Google are great, but the DLP capabilities do not give you that extra dimension to say it is not just about the data security. It is about what kind of user is coming into the environment and also what is the regulatory requirement. Those pieces we factor in, and that is where we are bringing more value to that sort of opportunity.
Remind us, your pricing model, is it a per-seat model?
Yeah. It is a per seat model primarily. We have cases where we are doing some capacity or consumption. As we continue to transition, we are always looking at what is the path of least resistance to help customers understand our value. Pricing through the seat has been traditionally understood, well understood. It has worked well, but it does not mean that in the future with agent design systems, companies will say, "Do I pay per operation?" Right? We are already seeing some in the consumer landscape charge per operation, right? I think it is a different way of running compute, but for now, we are primarily seat-based.
Just trying to think if there's a proliferation again of GenAI, does that produce more seats?
It'll produce more seats in the sense that we're monitoring the impact of, let's say, a GenAI product, whether it's provided by Microsoft or you're building your own. Because the security impact is based on entitlements, that creates more user-seat opportunity for us. We're also going to look at what the cost of the outcome of that agent is. That's the operational run. If we're going to be running a governance framework around that, then we have an opportunity, I think, to also charge based on that.
You seem like a very critical tool in the AI value chain, but you're still a pretty small company, and market awareness, brand awareness, is probably still a challenge. You're the Chief Marketing Officer. How do you think about getting the world to know more about AvePoint and building more brand awareness?
You know this; you've followed our story since we came into the public markets. Yeah, we weren't that well known. I've worked extensively with a great team to bring more clarity to our story. I think many at the beginning really didn't understand exactly what we were solving for. Over the past two years, I think there's been more clarity that we're a data security company. We come in to help organizations on the unstructured side of the data. We've done a number of campaigns that have really amplified that. What's also interesting, which we haven't talked about, is that we've leveraged our channel to also emphasize the brand. We go to market, as you know, Derek, through an indirect model, meaning channel players or partners that are running IT services for many companies.
We have done a lot to create a program for them so that they get behind the brand and can also promote our message. I think what is happening is you still have a lot of noise out there, and you need to differentiate with smart campaigns. We are now being much more creative in placing the brand in places where not only our customers and partners but also investors understand who AvePoint is. We are constantly attending more and more in-person events to make sure we are represented. I think the work we are doing with Google will also help elevate the brand, quite frankly. Within the Microsoft ecosystem, we are known, but outside of it, in all fairness, we were not as widely recognized, and at times we were misunderstood as being only in the Microsoft world.
I think really positioning the company as a multi-cloud provider has done great for us, but we still have more work to do.
Where are you in that Google journey?
Oh, yeah. We shipped our first iteration on Google Cloud Platform in February, and we're already available in their marketplace. Look, we've been supporting customers with a Google data store for over four years. The signal for us this year was bringing the entire platform stack to run on Google Cloud. That does mean there's a difference in how customers can now leverage the technology; that's why shipping the product was important for us in February. We also attended Google Next and provided some more perspective there on what our story is going to be. We're doing some go-to-market initiatives with them to not only bring more value to Google Workspace customers, but also amplify Gemini deployments.
Because we already have that playbook from how we've seen it work for Microsoft Copilot. We're probably in inning one, I would say, back to your question. We're super optimistic because the problem space of unstructured data is really the same. The challenge of security and the challenge of the regulatory environment really affect companies across all industries. I'm also excited about what they're going to be doing with Agentspace. I think that gives us an opportunity to bring a lot of the governance we've done for Copilot Studio over to that buyer or that ecosystem.
I'm smirking because I was going to ask the ending question on the MSP part of the business. It seems like it's going really well. Where are we in attacking the total opportunity there? Is it still very early?
Jamie reminded me: we're at 5,000 partners in what we call our ecosystem that transact with us. There are probably, at the low end, 40,000 MSPs out there. At our March 3rd Investor Day event, we also talked extensively about the Elements Platform, which is the version of our software designed for the MSP persona, the managed IT service provider. That really allows a partner to manage multiple customers, with a multi-tenant architecture, baseline configuration management, and user management, which are all of the capabilities they need to deliver a comprehensive service to their own customers. We're only getting started with that.
We feel pretty confident that we're going to be able to differentiate by bringing all of the capability we've delivered directly to large enterprises in a very simple way for MSPs to deliver to small and medium businesses, companies that are maybe 100 employees, 50 employees. Below 5,000 is where we see a great opportunity. We've been at this, as you know, building the business model to get to market. We're in over 100 marketplaces presently from a distribution standpoint, which makes it easy to acquire the solution. Quite frankly, the Ydentic acquisition that we announced in January was a signal that we want to bring more IP to that MSP persona and make sure it's already in the platform for them to leverage.
I'm pretty optimistic that we are in the early innings and that there's still a lot of headroom for us.
Yeah, and of the 5,000 partners, all in, about 3,000 are MSPs. One of the other stats we gave at our Investor Day was that back in 2020, ARR from MSPs was low single digits as a percentage of total. At the end of 2024, it was about 12% or 13%. That's about a 60% CAGR over that time, so it's growing very, very rapidly. Like Mario said, it's a big opportunity for us because, given the investments we've made there, they really view us not as a cost but actually as a revenue stream for their business.
Yeah. Maybe just to end with you, Jamie, on the long-term ARR targets. They assume a 25% CAGR, and you just did 28% in the quarter, so you see multiple years of durable growth versus existing levels. What gives you the confidence to put those numbers out there?
Yeah, I think it really goes back to the strategic priorities that we've laid out. The billion-dollar target we announced is for the end of 2029. The priorities and the building blocks are really, as Mario has talked about throughout today, building out the platform with new products and capabilities, continued scaling of the channel strategy, geographic expansion, going deeper where we already are and also into new regions, strategic M&A and investments where appropriate, and then just doing more across the different customer segments. We're very unique in how balanced we are across geography, customer segments, and verticals. When you marry all those together, there are a lot of vectors for growth that we feel support that pretty ambitious target.
Thanks, everybody. I'm Derek Wood, Senior. And then just let's start with just kind of a brief overview of AvePoint and talk about kind of the core pillars of the product services.
Sure. Thank you for having us. Mario Carvajal, I serve as the Chief Strategy and Marketing Officer. Been with the team for, I think this is my 17th year. It has been an interesting, long but short journey in many ways. We are continuing to do work that we started a while ago, which is help organizations with unstructured data. As you can imagine, the growth of unstructured data over the last few years has been phenomenal. We create unstructured data in everything we do. We take a picture, we listen to a conversation, and that transcription service is now unstructured data. We have designed a comprehensive platform that helps organizations deploy the security controls to prepare, optimize their data.
We spend a lot of time with organizations across all industries, really figuring out the best way to deploy automated controls that help secure access to information, ensure that when knowledge workers are sharing data, whether it's with internal employees or external partners, that the data is not only secure, but well governed. We have in the platform three major suites. We like to think of them as themes. We think of the resilience of information, the importance of making sure there's continuity by having information available. There are a lot of companies that will say to us, "Look, I have data sovereignty rules that say when we apply a protection policy to data, it can't leave the borders of our country." Our software allows you to do that, especially in an organization that could be a global organization.
We also introduce in the resilience the ability to manage the lifecycle of data. Many organizations will turn a lot of this unstructured data into records. Applying a records policy is not easy to do. Some of the work that we've done with organizations is digitize a lot of data by applying record policies that dictate when the information should be moved, when the information should be archived, deleted, et cetera. That is all part of this resilience approach. We have a control suite, which allows us to deploy all the controls. That works hand in hand with the resilience suite, where an organization will say, "Well, that's great.
I already have a protection policy in place, but how do I actually know if the data is ready for my AI, GenAI tool or product that I purchased? In the control suite, we allow you to do a number of things. One is we can take an inventory of all of the access control. We can say we have identified highly sensitive data that's at risk, and here's a policy to make sure that we appropriately implement the controls for access. In the control suite, we also manage the user lifecycle. Users are coming in and out of the organization. Anytime a user comes in, you probably have to provision access to information. How do you do that effectively within the policy? We do that as well. The third suite at a high level is really one where we're modernizing the data.
We have many organizations that love to move data from either other sources, other data stores. When you're moving data, the characteristics of the information should be retained so that you have continuity, so you have data lineage. Our software, over the years, we've built over 30 connectors to legacy systems, to other cloud sources. We're able to move those data sets around without losing that fidelity. The three suites really are designed to help an organization think about an end-to-end approach to managing the data through its lifecycle.
Yeah. I appreciate your modernization tool. We used that when we migrated over and made sure all my files got moved over, all my old emails got moved over, and I did not lose any of that.
Yeah. Hopefully. In that move, the continuity to yourself and your team is important. You do not have to sit there and say, "We have an outage," or, "We have to wait.
Yeah. Yeah. I like that, I mean, the resilience and control and modernization, they're very thematic about what you're doing. That's a good way to think about it. That is a helpful foundation. Moving to, let's just talk about the last quarter really and the macro and kind of what you guys are seeing out there in terms of spending behavior, how you feel like you're positioned. I've been asking people this all day and everyone, I think people are coming back to like, I'm sick of saying macro. It's just the new normal. There's always uncertainty out there. It seems like that's the world we're living in. What are you seeing out there?
I'll start, and I'm sure Jamie will add some comments. I think many organizations are looking to prioritize spend in areas that really allow them to drive value in their business. On the one hand, the macro creates uncertainty because they're being just more measured in the investments. That in many cases could introduce slower sales cycles and more approvals. I think that's that ongoing macro. When you're working with organizations and solving high-priority issues, you then see an opportunity to grow your own business. In our case, we've been very fortunate. Nine quarters of performance have really shown that we're addressing a core problem. We're working with these organizations in the areas of priority. The transformation of the data set is really important as organizations get ready for what they're going to do with AI capabilities.
We are quite fortunate that we also play, as you know, in the global landscape. We take advantage of really seeing the rhythm across different markets. That has given us a great way to help organizations that are dealing with different challenges of regulatory changes in the environment because of AI. Some of that has created good momentum for us. The pipeline build continues to be in the positive. That is another reason why I think our performance has been what it is.
Yeah. I guess maybe just to sprinkle a few numbers in on top of that. Total ARR growth was 26%, 28% when you adjust for FX, which were both accelerations versus Q4. Revenue growth was 25% or 27% on a constant currency basis. We had record growth in net new ARR, which we're obviously extremely proud of. I think we had beats versus our guidance for Q1 revenue and op income, which we did flow through to the updated full year guidance for those metrics. I did, with our ARR guidance for the year absent updated expectations for FX, we didn't raise that, which was really just a matter of, we think, being kind of prudent and sort of accounting for, again, the quote, macro uncertainty that could be in the second half of the year.
There is really nothing that we have seen to suggest that we are not a priority and the problems that we solve are not critical for customers. The demand environment, I think, is very healthy. We are obviously very happy with our continued efficiency and profitability improvements with about a 14.5% non-GAAP operating margin in Q1. We are expecting about the same for the year. We have been able to, I think, consistently now, like Mario said, for nine quarters, deliver really strong performance both on the top line and while not sacrificing profitability.
Yeah. I mean, to hear the word acceleration, not just this quarter, but we saw that in numbers a bunch of times last year too. It's pretty rare in what we've seen in the rest of the cycle of software companies. You guys are certainly on to something. Just before moving on from the macro, because the one area that really has kind of more potential uncertainty is on the public sector side and what's going on with DoD. We've seen other companies that have higher exposure kind of try to de-risk numbers a little bit. Just give us a sense for what exposure is there for you guys. Especially we know Q3 is the big quarter. Do you feel like you've taken some more conservative assumptions on that vertical?
I think so. It was actually one of the things we talked about, not even in this most recent earnings, but back at the end of February when we reported for the full year of last year and guided for this year. We actually already accounted for, at that time, what we felt like could be uncertainty related to the impact of DOGE. I think it's important to keep in mind that about 11% of our ARR is tied to the Federal Government, but that's on a global basis. When you distill that down to just the U.S., and then just the agencies that actually had been discussed and maybe targeted by DOGE, it was about 2% of our total ARR.
Obviously, we did not view that as sort of a binary thing where it could stay or go, but we just tried to sort of account for kind of potential uncertainty there. I think we certainly accounted for that exposure at the beginning of this year and sort of baked in a little bit more of that uncertainty. That is probably also reflected in sort of the updated guidance for ARR as of Q2.
Yeah. Maybe just one comment I want to add. The other opportunity for us is that even within the DOGE movement, a lot of it was about modernizing the infrastructure. A lot of agencies were still sitting on legacy, and legacy also increases cost. Our sales team has done a very good job in really moving into that area of value; as part of that modernization strategy, we want to be in there. We want to be included. When we see RFPs that come out, a lot of the work we're doing is emphasizing that the platform is designed to enable that modernization.
The beauty of having a platform is you do not have to come in and say, "I'm ready to deploy all the governance capabilities." You could start with the connector framework to say, "Wait, let's move the data out of a legacy system, reduce cost." I think those are opportunities for us to navigate the macro or the changes that are occurring. As Jamie mentioned, the 2% we saw in those agencies really was not discouraging for us because, again, we are coming in with a core value to help you also reduce cost. As long as you are saying, "I'm helping you reduce cost," you get the next follow-up: "Come in and show me how you can help me reduce cost."
The other exciting value prop and thematic area has been around Copilot, and GenAI in general, getting your data estate ready. We recently published a GenAI tech report, and certainly data cleanliness, data quality, and data governance come up as key constraints to Copilot or AI agent adoption. You guys have been talking about this for a while in terms of helping with proofs of concept. Where are you starting to see proofs of concept move into production? How have you helped with that? How are you thinking about more pull-through as we get more into this AI adoption cycle?
Yeah. We are continuing to help organizations in that transformation. The initial pilots we saw have now moved into rollouts that are pointing the attention to the quality of data. I was just seeing a client last week, and one of the challenges they have is they've been sitting on millions of files that do not have all the characteristics so they could catalog the information appropriately. Part of cataloging the information is so that Copilot does not inadvertently surface incorrect information. How do you do it at scale? How do you, without compromising the initial timestamp of when that file was created, who authored it, add additional characteristics so you can make those decisions?
When we're working with an organization on a problem like that, we typically will say, "Run our command center for a quick analysis on what your risk posture is to information." You can do that today in the product. That allows you to get a readiness score, and then the application will start to inform which policies you should turn on based on your own sharing policies, right? This is where we do integrate. Often, if they're using first-party products, let's say they're using Microsoft Purview, we'll integrate with Purview. We'll say, "We know you have certain class codes in Purview. That could work, but over here, you still have exposure to external access. Do you want us to turn that off? Do you want us to actually apply a different label to that?" In some cases where the information is just stagnant, it's been sitting there, and you want to actually train on that data, you need to then apply a better labeling system. All of that is part of the story of our platform and the fact that we can do it across not just one of the applications in 365, but all the applications, or even if the company says, "I'm also using Google Workspace over here," or, "I'm using G Drive," right? Maybe I'm storing data there because I happen to have a contract also with Google Cloud. The beauty of our product is that it is multi-cloud. We are able to apply that policy holistically.
That allows the organization to say, "We're now ready to turn on the Copilot agent in this case at global scale." We are part of that motion. We are unlocking the potential. We also, Derek, you may know, announced some of the work we are doing to launch the products now in Google Cloud. The same narrative applies, right? Gemini, I think, is also offering organizations an alternative choice. In some cases, for organizations that made the initial investment in Google Cloud for productivity applications, they are saying, "How do we actually use more Gemini? How do we use more of the capabilities that Google is going to be shipping even with NotebookLM," right? That is a different way for people to change the way they do research, take notes, et cetera.
If the quality of the data set of the business over the years has not been maintained, it will be hard for you to tap into that and extract value. Back to your question, I think that is part of our narrative. Beyond that, establishing a governance framework when regulation starts to come in and making sure the organization is ready to adapt to regulation is also an opportunity we are giving organizations because of the way we built the platform. I think that is also going to be really interesting in the next few months. The EU data regulation is coming into play in September. Especially in the EMEA region, organizations are going to prioritize that. How do we deal with the privacy of information? Do I have a way to audit and keep that data lineage? We are going to be providing that for them.
To us, this is really showing that the momentum is there. As you said, it's not just pilots. It's true implementations that are happening.
How do we think about if customers really start proliferating use of generative AI, let's say within Microsoft? I mean, how does that pull in more use of AvePoint?
I'll give you an example. Just to set some fundamentals: you have Copilot, which is used inside the productivity apps. Then you have Copilot as part of what we're doing in Power Platform to build low-code workflows that are automated. For Copilot Studio, the agent design is going to allow organizations an opportunity to transform manual business processes, right? In that arena, we see a great opportunity for us to be the provider that gives you the governance controls for your agents. The good story there is that that's not only applicable to Copilot Studio, but it could also work eventually when Google ships their agent space.
For us, what you'll see from us is an opportunity to help organizations with agentic systems where you're designing the agent and you might have your choice of which LLM, but you're still thinking, "Well, how does the governance of what the agent is doing happen? How do I apply a policy and how do I meet regulatory standards?" That for us will be a way to help organizations continue to not only be ready, but maintain the posture of what the security is for that agent. We think we're uniquely positioned because of the way we've just built the products and the way they interoperate. In many cases, because we're already in these organizations as a default system to monitor the environment, we have the signals already in place to see what's happening with the data set.
Why do customers choose you over Microsoft's governance tools?
Because our system sits outside of Microsoft, we're adding an additional dimension of controls that augment what you have with Microsoft First Party. They'll give you a basic layer of capability, and we'll come in and address all of the other scenarios that are more specific to your business. We still do it with an application that over time scales with your business. There's a lot that we've built in how the software can remediate based on how the environment is changing over time. I think that's quite unique. It's also very different when a company says to us, "I like your policy mechanism, but I also want it to work in data stores outside of Microsoft." The fact that we do that also creates a differentiation and a clear distinction that we are providing unique value versus what they get with First Party.
The same would apply when we are talking and working with a Google Cloud customer where the DLP capabilities that ship with Google are great, but the DLP capabilities do not give you that extra dimension to say it is not just about the data security. It is about what kind of user is coming into the environment and also what is the regulatory requirement. Those pieces we factor in, and that is where we are bringing more value to that sort of opportunity.
Remind us, your pricing model, is it a per-seat model?
Yeah. It is a per seat model primarily. We have cases where we are doing some capacity or consumption. As we continue to transition, we are always looking at what is the path of least resistance to help customers understand our value. Pricing through the seat has been traditionally understood, well understood. It has worked well, but it does not mean that in the future with agent design systems, companies will say, "Do I pay per operation?" Right? We are already seeing some in the consumer landscape charge per operation, right? I think it is a different way of running compute, but for now, we are primarily seat-based.
Just trying to think if there's a proliferation again of GenAI, does that produce more seats?
It'll produce more seats in the sense that we're monitoring the impact of, let's say, a GenAI product, whether it's provided by Microsoft or companies are building their own, right? The fact that the impact of that, from a security standpoint, is based on entitlement creates more user-seat opportunity for us. We're also going to look at what the cost of the outcome of that agent is. That's the operational run. If we're going to be running a governance framework around that, then we have an opportunity, I think, to also charge based on that.
You seem like a very critical tool in the AI value chain, but you're still a pretty small company. I don't know, in terms of market awareness, brand awareness is probably still challenged. You're the Chief Marketing Officer. How do you think about helping get the world to know more about AvePoint and getting more brand awareness?
You know this, you've followed our story since we came into the public markets. Yeah, we weren't that well known. I've worked extensively with a great team to really bring more clarity to our story. I think many at the beginning really didn't understand exactly what we were solving for. Over the past two years or so, I think there's been more clarity that we're a data security company. We come in to help organizations on the unstructured side of the data. We've done a number of campaigns that have really, I think, amplified that. What's also interesting, and we haven't talked about it, is we've leveraged our channel to also emphasize the brand. We go to market, as you know, Derek, through an indirect model, through channel players or partners that are running IT services for many companies.
We have done a lot to create a program for them so that they get behind the brand and can also promote our message. I think what is happening is you still have a lot of noise out there, and you need to differentiate with smart campaigns. We are now being much more creative, I think, in placing the brand in places where not only our customers and partners, but also investors understand who AvePoint is. We are constantly attending more and more in-person events to make sure we are represented. I think the work we are doing with Google will also help elevate the brand, quite frankly. Within the Microsoft ecosystem, we are known, but outside of it, in all fairness, we were not as widely recognized and, in fact, at times misunderstood as being only in the Microsoft world.
I think really positioning the company as a multi-cloud provider has done great for us, but we still have more work to do.
How long, where are you in that Google journey?
Oh, yeah. We shipped our first iteration on Google Cloud Platform in February. We're already available in their marketplace. Look, we've been supporting customers with a Google data store for over four years. The signal for us this year was that we're bringing the entire platform stack to run on Google Cloud. That does mean there's a difference in how customers can now leverage the technology. That's why shipping the product was important for us in February. We also attended Google Next, where we came and provided some more perspective on what our story is going to be.
We're doing some go-to-market initiative with them to not only bring more value to Google Workspace customers, but also the work that we can do in amplifying Gemini deployments because we already have that playbook and how we've seen it work for Microsoft Copilot. We're probably in inning one, I would say, back to your question. We're super optimistic because the problem space really of unstructured data, it's really the same. The challenge of security and the challenge of the regulatory environment really affects companies across all industries. I'm also excited about what they're going to be doing in agent space. I think that gives us an opportunity to bring a lot of the governance we've done for Copilot Studio over to that buyer or that sort of ecosystem.
I'm smirking because I was going to ask the ending question on the MSP part of the business, which seems like it's going really well. Where are we in attacking the surface area of that opportunity? Is it still very early?
Jamie reminded me, we're at 5,000 partners in what we call our ecosystem that transact with us. There are probably, on the low end, 40,000 MSPs out there. At our March 3rd Investor Day event, we also talked extensively about the Elements platform, which is the version of our software designed for the MSP persona, the managed IT service provider. That really allows a partner to manage multiple customers, with a multi-tenant architecture, baseline configuration management, and user management, which are all of the capabilities they need to deliver a comprehensive service to their own customers. We're only getting started with that.
We feel pretty confident that we're going to be able to differentiate by bringing all of the capability we've delivered directly to large enterprises, in a very simple way, for MSPs to deliver to small and medium business, for companies that are maybe 100 employees, 50 employees. Below 5,000 is where we see a great opportunity. Yeah, we've been at this, as you know, building all of the business model to get to market. We're in over 100 marketplaces presently from a distribution standpoint, so it's easier to acquire the solution. Quite frankly, the Ydentic acquisition that we announced in January was a signal that we want to bring more IP to that MSP persona and make sure it's already in the platform for them to leverage.
I'm pretty optimistic that we are in the early innings and that there's still a lot of headroom for us.
Yeah, and I think of the 5,000 partners, that's sort of all in, about 3,000 are MSPs. One of the other stats we gave at our investor day was back in 2020, the ARR from MSPs was low single digits as a percentage of total. At the end of 2024, it was about 12% or 13%. That's about a 60% CAGR over the course of that time. Growing very, very rapidly. Like Mario said, it's a big opportunity for us because the investments we've made there, they really view us not as a cost, but actually as a revenue stream for their business.
Yeah. Maybe just to end with you, Jamie, on the long-term ARR targets. They assume a 25% CAGR, and you just did 28% in the quarter. That implies multiple years of durable growth versus existing levels. What gives you the confidence to put those numbers out there?
Yeah, I think it really goes back to the strategic priorities that we've laid out. The billion-dollar target we announced is for the end of 2029. The priorities and the building blocks are really, as Mario has talked about throughout today, building out the platform with new products and capabilities, continued scaling of the channel strategy, geographic expansion, going deeper where we already are and also into new regions, strategic M&A and investments where appropriate, and then just doing more across the different customer segments. We're very unique in how balanced we are across geography, across customer segments, across verticals. When you marry all those together, there are a lot of vectors for growth that we feel support that pretty ambitious target.
Great. We're out of time. Thanks, everybody. Thank you guys for joining.
Thanks, Derek.
Yeah, thank you so much.