All right. Good afternoon, everyone. Welcome to our afternoon conversation with Tom Alison, who is the head of Facebook. He oversees the development and strategy across all of News Feed, Stories, Groups, video, Marketplace, Gaming, News, Dating, and a lot more. You've been with Meta since 2010.
That's right.
There is a lot that's been going on at Meta over the course of the last 10 years. Well, you and I were talking about, you know, the first eight and the last two. So thank you so much for joining us.
Thank you. It's great to be here.
Let me do the disclosures. First, all important disclosures, including personal holdings disclosures and Morgan Stanley disclosures, appear on the Morgan Stanley public website at www.morganstanley.com/researchdisclosures. They are also available at the registration desk. Some of the statements made today by Meta may be considered forward-looking. These statements involve a number of risks and uncertainties that could cause actual results to differ materially. Any forward-looking statements made today by the company are based on assumptions as of today, and Meta undertakes no obligation to update them. Please refer to Meta's Form 10-K filed with the SEC for a discussion of the factors that may impact actual results. Okay. Well, maybe, maybe we should sorta start a little walk down memory lane.
Yeah, sure.
Across the last decade. So I'd be curious just to sort of talk about your main priorities and focus in your role the first eight years, and then what has changed most in the last two years of kinda how you're managing the businesses and what you're focused on most?
Yeah, sure. So I started at Facebook, like you said, about 13 years ago. When I joined, I joined as an engineer. I had done a startup before that. I joined our user growth team. It was a great starting point at the company 'cause I learned a lot about user growth, and Facebook was kinda, you know, and continues to be excellent at that. I moved into engineering leadership, and I helped us kinda transition over to mobile during those years, getting News Feed running on iOS and particularly focused on Android at the time as we saw global growth of Android. And then over time, you know, I learned how to pick up and manage a bunch of different product lines, you know, Dating, Groups, Events, Search. And I think that helped me really understand how to bring these products together in Facebook.
I'd say in the last two years, I took on the role of head of Facebook, so I'm responsible for our overall product strategy. I'm responsible for growing user engagement and revenue, and I manage all of the kinda product and technical functions that deliver the Facebook experience that folks use today.
Great. Okay. Well, those last two years, you know, I was talking about this offstage. A lot has changed in the last two years. You know, I think some of the investments that the company started making in GPUs, sort of to analyze more first-party data, have really driven a lot of positive results. So maybe can you just sort of unpack that a little bit for us?
Yeah.
Help us understand or give us examples of what you're able to do better now with your first-party data that you couldn't before you did the whole platform reconstruction with the GPUs?
Yeah. Yeah, I'll take a step back. I mean, really, the last two years for me have been setting the foundation for the next several years of Facebook. What our strategy is responding to at the moment are two, you know, really disruptive trends in social media. The first one is generational change. I mean, Facebook is 20 years old, and the generation that we built Facebook for has grown up. The next generation that we're looking at, Gen Z, you know, U.S. young adults ages 18 to 29, they expect different things from social media today. They wanna stay up to date with their friends and with the people that they care about, but they also view social media as something that's really gonna open up their world and help them pursue their interests, and they wanna see content from everywhere.
So that was one change that we're navigating. The second one really is AI and all of the innovation. And I know we'll talk a bunch about generative AI, but there was a bunch of innovation in the recommendation space. And if you think about what Facebook traditionally did, we did kind of social graph content ranking. So, you know, you've got your friends, you've got the Groups that you've joined, you've got the pages that you follow. You might have, you know, a couple hundred friends and things like that. And so maybe we, you know, we need to look at all the updates from those entities and maybe look at ranking thousands or maybe tens of thousands of pieces of content to assemble your feed for you.
But in a world where we're gonna try to show you the best of anything going on on Facebook, now we have to look at billions of pieces of content to figure out what is the right one for you at the time. And so that really forced us to rethink our entire technology strategy. And so, you know, kinda what we did was we said, "Look, we're gonna lean into this social discovery use case, not only helping you stay connected with the people that you care about now, but leaning into recommendations in our products to help you discover new things." And then we're gonna set a new technology strategy. And so we actually created an advanced technology group.
They are kinda homed within the Facebook organization, but their charter is to build the best content recommendation system in the world that can power all of our recommendations products, whether it's in Facebook, whether it's Instagram, whether it's Threads. And, you know, we've set off on that course. In terms of the results that we're seeing, I mean, look, I've been pretty pleased with where we are. There's obviously a lot more work to do. We invested a lot in short-form video, Facebook Reels. Facebook Reels is now about kinda 1/3 of Facebook's video time. Facebook video time is over 50% of our overall time spent. And Facebook Reels has grown something on the order of 70% year-over-year, so good growth there. But it's not just Reels where we're kind of applying this technology. You know, we've really leaned into providing more recommendations content in Feed itself.
So this could be recommended text posts or photo posts or group posts or video. And, you know, right now when you see kind of posts in Feed, content posts, about 30% of those posts on average are delivered by our recommendation system. And that's actually up 2x over the past two years, and that's unlocked a lot of kinda new content inventory for us in Feed. But it goes beyond that too. I mean, you know, we have this product called Marketplace that's very popular. We've been kinda upgrading the technology stack there. We're seeing really good kind of strong year-on-year growth for U.S. young adults using Marketplace. They love that product, and it's really been helping us kinda fuel engagement across the ecosystem. Similar things for products that have been around for a while, Groups, Groups you should join.
We've been putting kinda those recommendation models on the advanced technology stack, and we're seeing engagement gains there. I was admiring this the other day, you know, with the team that builds our People You May Know service. This was Facebook's original recommendation service, the one that recommends friends for you. They put this service on GPUs very recently and are unlocking new gains. So even some of our older use cases are kinda benefiting from the technology investments that we made. Those are the types of things that we're seeing in the business right now.
Yeah. That's very helpful. Let me ask a couple follow-ups there. That 70% time spent in Reels, that's across the entire platform? That's Facebook plus Instagram, or that's?
No, that's just Facebook time.
That's just Facebook. Okay. Great.
That's the 70% growth, yep.
Then Marketplace is an interesting callout 'cause we do surveys on what people do on Facebook, and it always comes up as something that I think the market underappreciates.
Yeah.
So what is it, you know, with the algorithms or the matching that's now sort of driving that incremental Marketplace adoption from here?
I mean, just in general, what we've done is we've really improved relevance across the board. So one of the things that I saw when we were really looking at kind of the attitudes, particularly of kind of, you know, the younger cohort, that Gen Z, 18-29, was they said, "Hey, look, this content just isn't relevant enough to me." So, you know, what we're able to deliver, whether it's in Reels, whether it's in Feed, whether it's in Marketplace, is really personalized kinda relevant content that drives both downstream engagement, that drives repeat usage, and just really improving the relevance of the products has been kinda one of the key things that's really, frankly, driving progress across the board.
Got it. Okay. So there have been investments enabling you to better analyze more of your data and improve relevance. Maybe walk us through how you think about sort of the compounding benefits of that. What inning are we in in sort of that, you know, that benefit you could see? And what are sort of the difficult hurdles you still have to clear to kinda keep this going?
Yeah. So when I think about where we are on the journey with our AI, I'll talk about our kind of recommendation systems and personalization technology in particular. Historically, we have kind of a recommendations model, an AI model per product. So Reels has its own recommendation model that powers those recommendations. Groups has its own recommendations model. Feed has its own recommendations model. I would say, like, phase one of this journey was putting these models on GPUs, right? So putting them on GPUs actually allowed them to kind of learn more efficiently, to perform better. You know, a bunch of the gains that we saw in 2022 were, frankly, just, like, upgrading the kind of technology stack and the infra that powers the models.
Last year, we kinda took a step back, and we said, "Well, is there an innovation story here?" We looked a lot at kind of the innovation that was happening with large language models and generative AI. You know, those models are just, you know, very large models that can handle lots of data and solve kinda very general-purpose types of activities like, you know, chatting. We said, "Well, what would this look like in a recommendation space? What if, instead of these per-product recommendation models, we had one recommendations architecture that could power all of our recommendations products and that could leverage lots of data?" So we actually kind of rearchitected our recommendations stack to be able to kinda do this, with that as the North Star. We basically have a technology roadmap that goes through 2026, so this is kinda part of it.
Late last year, we created this kind of new model architecture, and we decided to test it with Reels to just validate: was this gonna, you know, help us? So we put Facebook Reels on this new model architecture. It used the same data as, you know, the previous model that was on GPUs.
Mm-hmm.
But we got roughly an 8%-10% gain in Reels watch time. So what that told us was this new model architecture is learning from the data much more efficiently than the previous generation. So that was, like, a good sign that says, "Okay, we're on the right track." And so now really in phase three, where we're going, is how do we actually continue to validate and scale this? We think we have an opportunity to have a lot more data, to give these models a lot more data to learn from. And we think we have the opportunity to have these models power more products. So, for example, instead of just powering Reels, we're working on a project to power our entire video ecosystem, with this single model. And then can we add our feed recommendations product to also be served by this model?
If we get this right, not only will the recommendations be kinda more engaging and more relevant, but we think the responsiveness of them can improve as well. So if you see something that you're into in Reels.
Mm-hmm.
And then you go back to Feed, we can kinda show you more similar content in Feed as well, very seamlessly and responsively. Now, look, all of this also requires a bunch of kind of hardware investments and planning, right?
Mm-hmm.
So in addition to this, we're kinda, like, frankly, reconfiguring data centers, figuring out how to wire more GPUs together. This is a big part of what's gonna push model development in our generative AI work. But it's also similar for recommendations. You know, our recommendations data is actually, in a lot of ways, a larger dataset than even some of the large language models use, 'cause you're looking at all the interactions of billions of people every day. And so we've really focused on kind of investing more and making sure that we can scale these models up with the right kinda hardware and data support. But I'm excited about the technology roadmap. I think we have kinda more ahead of us. But look, these are, you know, challenging projects.
But overall, I feel good about the future of the technology investments and where we're going with it.
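To make the shift he describes concrete, here is an editor's sketch of the per-product-models-to-one-architecture idea: one shared backbone scores (user, item) pairs for every surface, with only a small per-surface head on top. All names, weights, and the scoring function are invented for illustration; this is not Meta's actual system.

```python
# Illustrative sketch: one shared recommendation backbone serving
# multiple surfaces, instead of an independent model per product.

def shared_backbone(user: str, item: str) -> float:
    # Stand-in for one large model trained on all interaction data.
    return (hash((user, item)) % 1000) / 1000.0

# Each surface only adds a lightweight head (here, a scalar weight).
SURFACE_HEADS = {"reels": 1.0, "feed": 0.9, "video": 1.1}

def score(surface: str, user: str, item: str) -> float:
    """Every surface reuses the same backbone, so what the model learns
    about a user on one surface carries over to the others."""
    return shared_backbone(user, item) * SURFACE_HEADS[surface]

def rank(surface, user, candidates, k=3):
    return sorted(candidates, key=lambda i: score(surface, user, i),
                  reverse=True)[:k]

items = [f"item_{n}" for n in range(10)]
# The cross-surface responsiveness he mentions falls out naturally:
# a preference learned in Reels reorders Feed the same way.
print(rank("reels", "user_1", items))
print(rank("feed", "user_1", items))
```

The point of the sketch is the sharing: because every surface consults the same backbone, there is no per-product model to retrain when a signal appears on another surface.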
Phase one, two, three into 2026, basically, is kinda the roadmap, I think.
Yeah. Look, I mean, we're gonna be kind of continuing to validate the new model architecture this year. We've got a project that we're working on to kinda unify Facebook's video ecosystem. So I mentioned that Reels is 1/3 of Facebook's video time. That means 2/3 of our video time.
Right.
It's still the video product that we developed years ago. Well, that kinda, you know, older video product is actually not yet on the state-of-the-art ranking stack. So we actually think by bringing these product lines together, unifying the experience, putting all of our video inventory on kinda one state-of-the-art ranking stack, delivering it through this kinda more modern, you know, Reels-centric viewer, we think there's an opportunity to unlock kinda more upside in video down the road. Now, it's a complicated project. There's a lot of ad formats that we need to kinda manage through. There's a lot of user kind of expectations and behavior. But it's the type of thing that we can really do now and contemplate given where the kind of underlying technology is taking us.
Great. So let me ask you a question about total time spent on the platform or total engagement in the platform because one of the consistent debates that we have externally is the incrementality of all of this video time.
Yeah.
Maybe, you know, we've talked a lot about Reels growing quickly, video growing quickly, etc., Marketplace doing well. What can you tell us about overall time spent trends in the U.S. and globally?
Yeah. I'll talk about that, but I'll take a step back and tell you what we look at even more closely than time spent. I mean, a lot of our focus has been really growing daily user visitation and engagement, right? A lot of the motivation for the strategy over the past two years was to reaccelerate daily usage. And, you know, we're feeling better about where we are. You know, we've had kinda quarter-on-quarter growth for the past several quarters in the U.S. So again, even reestablishing kinda growth in the U.S. was a big priority. And then underneath that, you know, we're pleased to see that, you know, daily usage amongst young adults is growing, which is kinda fueling that top-line growth in the U.S. And so that was a big focus area for us as kind of an engagement objective.
The other thing that we look at is kind of organic content impressions. And are those growing? And we'll also look at revenue-weighted content impressions to make sure we're growing impressions in the areas that are gonna really maximize kind of our revenue. And so again, with the, you know, additional recommendations in Feed, with the inventory that's being brought on by Reels, we feel good about that. Now, I'll talk about time for a minute because, you know, overall time spent growth is healthy, but I think we have to be careful about how we look at that, because a thought exercise that you can go through is: would you rather have somebody watch two five-minute videos or five two-minute videos? From an overall time spent perspective, those are absolutely equal scenarios.
From our business perspective, we would much prefer the latter scenario because that's giving us more content impressions. That's giving us more ability to serve ads. So the monetization efficiency of the time in the latter scenario is better. So now what happens? Well, actually, we're seeing this kinda mix between long-form and short-form video to some degree. You know, Reels growth is incremental to kinda top-line time spent. But yes, there are some mix shifts that are happening in between. The reason that I'm optimistic, though, is because you can have scenarios where you might even lose time, but you grow monetizable video impressions because of the shift to short-form video.
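The thought exercise above reduces to simple arithmetic. A quick sketch of why equal watch time can carry different monetization (the ads-per-impression rate is a made-up illustrative number, not a real figure):

```python
# Toy version of the thought exercise: equal total watch time,
# different impression counts, so different ad-serving opportunity.

def session_stats(video_lengths_min, ad_slots_per_impression=0.25):
    """Return (total minutes, content impressions, expected ad slots)."""
    minutes = sum(video_lengths_min)
    impressions = len(video_lengths_min)
    return minutes, impressions, impressions * ad_slots_per_impression

two_long = session_stats([5, 5])             # two five-minute videos
five_short = session_stats([2, 2, 2, 2, 2])  # five two-minute videos

print(two_long)    # (10, 2, 0.5)
print(five_short)  # (10, 5, 1.25)
# Same 10 minutes of time spent, but 2.5x the impressions and ad slots.
```

That ratio is the "monetization efficiency" he contrasts with top-line time spent: impressions per minute, not minutes, is what creates ad inventory.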
So again, with this video unification project that we're working on, by unifying all of our video on the same kind of recommendations technology, by delivering it through the same experience, we're setting ourselves up in a way where we can continue to grow video time long-term, but we have a lot of dials and a lot of options to get the right blend of short-form and long-form video that we think users are gonna want, but that's also gonna increase the monetization efficiency.
Right.
Of that video product. So again, lots of kind of complex things to work through. It's a big focus for us this year. I think we'll continue into 2025 on that. But, you know, we really look at not just the top-line time spent growth, but how efficient are we from a monetization perspective within it? And there are scenarios where I would happily take kind of lower time spent growth if it meant that I could have kinda more ad impressions within that unit of time.
That's good color. Well, let's have a little discussion about generative AI.
Yeah. Let's do it.
Let's get into it now. 'Cause we...
We've heard a lot about it. It's on a lot of people's minds right now.
Yeah. You made it 18 minutes. It's kind of impressive.
Right.
You know, there was a new set of products that you launched last fall around Meta AI and a series of chatbots: the Tom Brady bot, the Kardashian bot, etc.
Mm-hmm.
There seems to be a focus to drive new types of behavior.
Yeah.
What have you seen so far? As you sort of think about gating factors to drive more engagement with these assistants, what do you have to execute on?
Yeah. There's a few things. I mean, we're working on kind of generative AI features all over the company right now, as you can imagine. If I think about just Facebook, there's a couple things I'm excited about. Look, I think people are gonna continue to deepen these relationships with the chat-based assistants. And we're putting a lot of effort into Meta AI. And I think we're really well-positioned because essentially, what we're seeing right now is that the interface to kind of some of these advanced AIs mirrors the interface that you use to communicate with other humans. And Meta is very good at creating kind of, you know, products that help you communicate in this way. So I think the chat's gonna continue to grow.
Right now, in Facebook, we really talk about this idea of social discovery. We're showing you a lot of recommendations in your feed. You can go off, and you can kinda talk to people about them. You can learn more that way. You know, we're looking at, "Hey, look, what would it be like if you also had kinda the Meta AI assistant available with you in your feed?" So you get a recommended post about Taylor Swift, and you say to yourself, "You know what? When's the next Taylor Swift concert?" Well, look, you could go ask in the comments and maybe wait for somebody to respond. Or you could easily just click a button and say, "Meta AI, tell me more about what I'm seeing with Taylor Swift right now.
When's the next concert?" So I think we'll be able to have these bridges to the AI in a way that's really gonna kinda deepen the experience of kinda discovering content. Another one that I'm excited about is we're integrating Meta AI into our Groups product. So if you are a kinda home hobbyist baker, you're probably in a baking group on Facebook. And you can go in and ask a question and say, "Hey, how come my sourdough bread isn't rising properly?" Now, look, there are people in the group that'll come in and answer your question. But if for some reason they don't, we've enabled Meta AI to come in and answer your question in the comments. And then the cool thing is that it can kinda tell you kind of what it knows from its general knowledge.
Over time, it's also gonna be able to pull in other group posts, like, "Hey, this was discussed over in this post. Why don't you go check it out here?" And not only can you interact with the AI in that context, but other group members can interact with the AI in that context too. So I think we have the opportunity to put generative AI in kind of a multiplayer kinda consumer environment and see what comes out of that, which is gonna be exciting. But beyond that, there's a lot of applications for advertisers, creators, developers. I mean, I can talk about some of the things that we're thinking about there.
But overall, I think, look, if we're in the kinda business of content discovery and enabling that for people, I think the generative AI products are gonna be able to, like, allow people to kinda deepen their engagement with the things that they care about, and also even share the things that they're, you know, talking to the AI about in certain contexts back with the folks in their network. And I think that is going to kind of enhance the engagement impact that we see. But it's gonna take us a while to kinda find out what are the right kinda integrations and formula for this. So you'll see us testing a lot of different things and then kind of going with what sticks once we learn more.
Is the hard part just sort of ensuring it's a high-utility experience? Is the hard part sort of making sure you're analyzing the right signal? What is sort of the main regulator at how quickly it gets pushed out?
There's a number of things. I mean, some of it is, like, you know, the quality of the underlying models. So in addition to doing these product integrations, we're continuing to invest a lot of effort in really training kind of our next kinda state-of-the-art generative AI models. We have Llama 2 out now. We're working very closely on kinda Llama 3. So you wanna give these kind of AI agents more expressive power. You wanna give them the ability not only to understand text but to understand images, to understand kinda what we call multimodal applications, which is gonna really help in our kinda rich media ecosystem. So there's things like that that you need to do. And look, anytime you're introducing a new consumer product, it takes some time for people to figure out, "Well, how do I use this? And what is the right place to integrate this into existing products and workflows?"
And how do you educate people on, like, what is this thing valuable for? You know? So I think, you know, not everybody is gonna think to themselves, "Hey, I wanna kind of go and chat with a chatbot today," but I think one of the things that we can do is, in various parts of our kinda product experience, we can actually kinda show people interesting questions that you might wanna ask the chatbot, to help them realize, "Oh, I can go ask an AI about this. Now, actually, I do wanna go and do that." And that helps people kind of progressively understand what the value is.
And then over time, we hope we can kind of build from, like, maybe serendipitous behavior to more intentional behavior, where they're going to the AI for more things. But I think at first, we're gonna have to create a lot of bridges and teach people what these AIs are good for in kind of social contexts or other contexts where you're using Meta products.
On the point of content creation, you know, you showed some examples of what Emu could do last year. You know, we've seen other text-to-video models and applications come out in the last month or so across the ecosystem using GenAI. If you sort of keep that same mindset of, like, the 2026 product pipeline, is it realistic that you could have text-to-video content creation rolled out across Meta? Or is that just too quick?
I don't think it's necessarily too quick. I mean, we've already seen kinda text-to-video applications both in research and, you know, even in some kinda limited production environments today. It's very hard to tell the pace of, you know, how the quality's gonna improve. I think that's one of the big unknown questions in the industry where a lot of people are working on scaling up GPUs. And we'll see how much kinda better quality comes out of that. So I know text-to-video will certainly be possible.
Mm-hmm.
By 2026. The question is, how good will it be? But look, we're already finding applications of some of this stuff within our own product. Even for kind of advertisers, we introduced some generative AI capabilities that allow them to do image outfilling, which is really good. If you have an image that's kinda meant for Feed, can you outfill it and make it available for placement in Stories very easily? Can you change the background kind of with generative AI for your product catalog to give our system more variations to go out and try and to optimize? Can you vary the text? So again, we're already seeing the near-term kinda benefits of the creative possibilities of this and integrating them into our product.
I think as we kinda work with people, with advertisers, with creators, we're in a good position because we're gonna get a lot of kind of feedback and data on, like, how can we really maximize the value of these creative possibilities for the kind of stakeholders in our ecosystem who are most likely to benefit from it.
Yep. Okay. That's great. Well, we're looking forward to seeing more of that. Let's talk a little bit about the ad business.
Mm-hmm.
And you mentioned it a little bit earlier in our discussion. But recently, Meta also talked about how it's managing ad load on a case-by-case basis and almost more dynamically, sort of, you know, the ad load or the number of ads you might see for video versus Feed versus Stories. It could be different, it sounds like, by person or almost by...
Yeah.
Maybe just talk to us about what changed there. What have you learned about ad load and the way you can more dynamically adjust that across the platform?
Yeah. That's a great question. I mean, I've talked a bit about recommendations. But one thing I didn't kinda mention: with these big technology investments and upgrades we're making, we're also improving personalization. So what I mean by personalization is, what is the right blend of content to give to you? So for example, I'm really into recommendations. You might wanna see more friend content. We need to learn that you wanna see more friend content in your feed even if we have recommendations available. So actually, some of the engagement gains that we've seen over the last years have come not only from bringing kinda new content inventory into the system via recommendations but from improving how well we personalize that slate or that mix of content for each individual user. Now, the same applies to ads. Ads are kind of content in our system.
And so for example, I love ads. Like, I can open Facebook, and you can show me an ad at the top of my feed. And I'm like, "Awesome." Right? And, like, a lot of times, I'll click on it. But even if I don't click on it, I don't mind it. I'll keep kinda using Facebook. Right? And you can actually increase my ad load with not very many consequences on my engagement. Now, you might be different. You might be, like, okay with ads in Feed, but ads in Reels you're not as into, and that changes your behavior. And so again, we personalize both kinda the frequency of ads and where and how we're placing them, on a per-user basis.
So as our technology gets better and our personalization capabilities get better, it allows us to actually get better at serving the right or the optimal ad mix to kinda each individual user. But beyond that, a lot of the growth that I expect we'll see in the ads business is continuing to kinda monetize these newer high-growth surfaces like video, like Reels. And I think the other opportunity we have, outside of just getting better at kind of increasing, you know, content supply and increasing ad supply, is, if you think about video, I think we have an opportunity to, like, help more advertisers create video-friendly formats and do format optimization, so that we're not just kind of increasing, like, ad load or the supply of ads. We're also increasing kind of the conversion factor and the efficacy.
I think actually, it's that formula, the fact that we're kinda both increasing supply as well as increasing the efficacy of ads that's really kind of working to deliver a lot of value to advertisers right now.
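A rough sketch of the per-user ad-load idea described above: among candidate ad frequencies, pick the most monetizable one whose predicted engagement cost for that user stays acceptable. The function names, toy models, and thresholds here are hypothetical, not how Meta's delivery system actually works.

```python
# Hypothetical sketch of per-user ad-load personalization.

def pick_ad_load(candidate_loads, predict_revenue,
                 predict_engagement_cost, max_cost=0.05):
    """Return the highest-revenue ad load whose predicted engagement
    cost (relative drop in usage) stays under the tolerance."""
    best, best_rev = None, float("-inf")
    for load in candidate_loads:
        if predict_engagement_cost(load) <= max_cost:
            rev = predict_revenue(load)
            if rev > best_rev:
                best, best_rev = load, rev
    return best

# Toy per-user models: revenue grows linearly with load, while the
# engagement cost grows quadratically. An ads-tolerant user (like the
# "I love ads" case above) would get a flatter cost curve and
# therefore a higher load.
loads = [0.1, 0.2, 0.3, 0.4]
choice = pick_ad_load(loads,
                      predict_revenue=lambda l: l,
                      predict_engagement_cost=lambda l: l * l)
print(choice)  # 0.2 -- 0.3 is rejected because 0.3**2 exceeds 0.05
```

Raising the per-user tolerance (a more ads-tolerant user) lets the same logic select a heavier load, which is the "increase my ad load with not very many consequences" case he describes.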
I love ads. You can keep, keep, keep talking about ads.
Okay. We're gonna say, "All right. Note to the team. Dial up Brian's ads."
You have to remind everybody, "Style man." Yep. And you even got the Taylor Swift ads. Like, you don't have to analyze my feed. I don't know.
Like, we know you're.
Exactly. We probably do. So what about Advantage+? You know, Advantage+ is something that, you know, there's been a lot of ink written about over the last couple years. So a couple things. One, maybe talk to us about what are some of the applications of Advantage+ that have really resonated best with advertisers? And what do you see in terms of uplift in spend per advertiser once they start to take on these tools?
Mm-hmm. So kind of the core theme behind Advantage+ is, rather than having you manage every single detail of your audience or your creative and put together kind of all of the combinations of the campaign yourself, we're offering kinda more and more ad products that allow Meta to optimize more variations of that for you. So for example, with our Advantage+ Audience product, you can give us kinda some signals or some tips on the audience that you want. But then you can say, "Hey, I'm gonna allow Meta to go out and find additional kinda audience candidates that are similar to mine." And we're seeing kinda good results with advertisers. Some may just wanna kinda take pieces of this. We have other Advantage+ products, like our shopping product.
You know, folks that are doing kinda bigger shopping campaigns with lots of variations benefit from this a lot because we can take lots of your creative. We can take kinda audience hints. We can even optimize, like, which surface to show the ads on. So is this ad gonna perform better, in a Reels context? Or is it gonna perform better in a Feed context? And actually being able to optimize kind of across, you know, a lot more variables is allowing us to give advertisers better results. And so, you know, I think you're gonna continue to see us, like, build on this theme of being able to test and allow advertisers to easily test and use our systems to get the most out of finding optimal combination of things.
And honestly, like, what we tend to see is, when these products work well, you know, the cost per click for advertisers goes down. They're more efficient. You know, over time, that gives advertisers the confidence to unlock more budget with us or try kinda new products and get more out of us. So I just think you're gonna see us continue with that theme. And again, we talked about generative AI, right? Like, now part of this suite is our kinda Advantage+ Creative: making it super easy, if you're selling a handbag online, you know, to generate 10 different backgrounds for that. You can kinda feed that into, you know, our Advantage+ products, and we'll find the best variation.
So giving advertisers the ability to give us more variants to go out and optimize very easily is just gonna speed up that cycle of finding kind of the most effective campaign.
Yeah.
I'm excited about that.
You continue to improve. You have over 10 million advertisers. You know, any help at all on what percentage of the advertisers are using Advantage+? And then for the ones that aren't, what do you have to sort of show them? Or what's the hurdle they have to get over to start to adopt those tools?
Yeah. I mean, there's no one-size-fits-all for Advantage+. So we have some advertisers that are using more of the end-to-end. We have some advertisers that are using parts of it. I think for some of our Advantage+ Audience work, I think we're kinda defaulting folks into that now. And I think that's become kinda more popular. A lot of what we really have to do is figure out, you know, how does Advantage+ work with, like, a larger advertiser, maybe with more sophisticated objectives or a big campaign. Right? Because they can give us lots of assets to optimize. They can give us lots of hints on audience. They wanna try out different surfaces. And so we often kinda learn a lot from being able to run those campaigns.
And then we figure out, well, how do we scale this maybe beyond the e-commerce vertical and to other verticals? Or how do we scale this from kind of a high-spend advertiser to a lower-spend advertiser? So it's really kind of in that learning process of, how do we make sure that this technology works for all segments? But I would say overall, like, we're pleased with the progress. We get good reception from this. And I think in the course of time, we'll just continue to see a trend of more and more advertisers at least opting into, you know, some of the Advantage+ products and then more folks who are gonna be able to use the end-to-end solutions.
Great. The other part of the revenue that the company started talking about a little more consistently over the last couple years has been Click-to-Message.
Mm-hmm.
I think maybe just to sort of, you know, help us out a little bit on, you know, where have you made the most progress in Click-to-Message? You know, we think it's sort of approaching 10%+ of total revenue now, growing rather quickly. You know, what has driven that strength over the last couple years in the Click-to-Message go-to-market?
Yeah. So what we're seeing with Click-to-Message, it's essentially a product that allows advertisers to specify an objective of, you know, acquiring a customer over a messaging channel. And we have kind of regions of the world like Southeast Asia, parts of LATAM where messaging is just so ubiquitous in terms of how people communicate, not just with their friends or their family but also with businesses, that there's been a, you know, a growing demand for this type of product. So beyond kinda Click-to-Message, we also offer kind of paid business messaging tools. So not only can we help you establish a relationship over messaging with a prospective customer, we can help you kind of enhance that relationship through, you know, things like marketing or remarketing or customer support or even allowing kind of, you know, purchasing behavior inside of the message thread.
So again, we're kinda seeing this market kinda validated and grow overseas. I think we're starting to see some, you know, interesting traction in the U.S. The thing that I'm excited about, we've been talking about generative AI. Right? So, you know, in a world where chatbots can actually help businesses really scale these types of interactions by taking maybe some of the more routine customer interactions and letting the chatbot handle them and then taking some of the higher-touch customer interactions and having kinda humans handle those, that's gonna create a lot more value for businesses. That's gonna allow new businesses to come on and try some of these things.
And we think it might even kind of unlock kinda regions in the world where maybe people aren't quite ready to invest in kind of these messaging-based products because they're like, "I'm not sure there's an ROI for my business." But it's like, the cost of trying them with a good kinda generative AI solution goes down a lot. Now, look, with generative AI in the context of a business or business messaging, there's still a lot that you need to get right. I mean, I'm sure a lot of you have seen kinda the errors that chatbots still make. Right? In a business context, you know, if you have a chatbot that's doing something that, you know, hurts the brand or has the customer lose trust in the business, that's a very kinda big issue. So we're kind of in smaller-scale testing.
Mm-hmm.
Of this type of product right now. You know, again, with a small number of businesses and, again, seeing really encouraging results, I think it's gonna be a tricky road to get it all right. But I do see kind of an outcome where we're able to offer this to more businesses and actually grow that entire business category.
Great. Last one I had is on Reality Labs.
Mm-hmm.
I think there is this perception, and maybe it's more of a debate externally, about: even if Reality Labs is a success, how does it help the core, you know, the core.
Yeah.
Facebook platform? Or is it just gonna cannibalize? So I know it's very early. But maybe just talk to us about sort of what you've seen so far of people who are using the Reality Labs tools with their engagement on Facebook. And how do you think about the vision of the two of them integrating long-term?
Yeah. I mean, there's a couple of kind of cross-connections that we're exploring across the product lines. You know, one of them is just helping people kind of understand what's possible in a virtual environment in Quest. I mean, the most basic expression of that has been allowing people to choose their own avatar in Facebook or Instagram. It's a 2D avatar. It's, you know, obviously, like, kind of an illustration of their kinda persona. But the cool thing is that once you set that in Facebook or Instagram, as soon as you get a Quest, you've already kinda got your persona ready to go. It just reduces friction into getting more of the experiences. We're also seeing kinda possibilities of working, you know, with gaming developers. You know, Facebook has a gaming ecosystem.
Yep.
You know, we have a whole free-to-play kinda gaming ecosystem that drives engagement. You know, can we have more of the gaming developers develop experiences that you can kinda play in Facebook and also then go play in VR and have more cross-pollination of those users across the different kinda gaming titles to try different experiences? So we're looking at things like that. There's kinda new possibilities opening up in VR with this idea of mixed reality. Right? Because virtual reality, you're in this very immersive experience. And it's like, I guess, you know, you could kind of look around and see your Facebook feed. And that would be cool. But mixed reality offers interesting opportunities. You've seen probably these videos of people, like, washing their dishes with a Quest headset on while watching a video. Right?
So we are gonna be exploring, well, what do our apps look like in mixed reality? Because we know that people, you know, while they might be using the headset for gaming or other things, actually now mixed reality offers kind of the opportunity to use more of the traditional social products and, you know, kind of explore that in new ways. I'll tell you the thing I'm actually most excited about, though. It's the Meta Ray-Bans and augmented reality. So, the video capture on the new Meta Ray-Bans is really cool. One of the things where I think, you know, to your point on how could this drive engagement for Facebook or Instagram, if you look at creators right now and they wanna take first-person video, they're going and sticking a phone in people's face. I've even seen some creators, like, strap it to their head. You know?
I mean, that just feels like a very unnatural way of capturing the world. And what especially young adults want from social media now is kinda more authenticity. So the fact that you can go out, capture first-person views of what you're doing, kinda capture, you know, even a live stream of it, send that back into Facebook, into Instagram, create Reels from that, I think that's, like, an interesting first-person content format that is captured in a very authentic, low-pressure way. And to me, that's interesting content. And we've already seen creators start to really see, "Oh, wow. I can use the glasses for that." You're starting to see more video created from it.
Actually, the product itself, when somebody posts a video from Meta Ray-Ban glasses, we give a little bit of attribution, which helps other creators and people discover that this is possible. So again, like, when I look at, like, what are the creative opportunities that the kinda Reality Labs devices allow for, that's one in particular that I can see, you know, and I've been talking to the team. I said, "How can we get more creators creating content here?" Because the content feels unique. It feels fresh. And it feels like something you can't do. And it's just very convenient for creators to be able to capture it with a pair of glasses versus holding a big camera or something like that.
All right. Well, Tom, it'll be very exciting to see everything that you ship over the next year or so. Thank you so much.
Yeah. Thanks for having me. Appreciate it.