All right. Hi, everyone. Thanks for joining us.
This was a good quarter, and it wrapped up an important year for our community and our company. We estimate that there are more than 3.1 billion people who use at least one of our apps each day. 2023 was our year of efficiency, which focused on making Meta a stronger technology company and improving our business to give us the stability to deliver our ambitious long-term vision for AI and the metaverse. And last year, not only did we achieve our efficiency goals, but we returned to strong revenue growth, saw strong engagement across our apps; shipped a number of exciting new products like Threads, Ray-Ban Meta smart glasses and mixed reality in Quest 3; and of course, established a world-class AI effort that's going to be the foundation for many of our future products. I think that being a leaner company is helping us execute better and faster, and we will continue to carry these values forward as a permanent part of how we operate.
Now moving forward, a major goal will be building the most popular and most advanced AI products and services. And if we succeed, everyone who uses our services will have a world-class AI assistant to help get things done, every creator will have an AI that their community can engage with, every business will have an AI that their customers can interact with to buy goods and get support, and every developer will have a state-of-the-art open-source model to build with. I also think that everyone will want a new category of computing devices that let you frictionlessly interact with AIs that can see what you see and hear what you hear, like smart glasses. And one thing that became clear to me in the last year is that this next generation of services requires building full general intelligence.
Previously, I thought that because many of the tools were social-, commerce- or maybe media-oriented, it might be possible to deliver these products by solving only a subset of AI's challenges. But now it's clear that we're going to need our models to be able to reason, plan, code, remember and many other cognitive abilities in order to provide the best versions of the services that we envision. We've been working on general intelligence research in FAIR for more than a decade. But now general intelligence will be the theme of our product work as well. Meta has a long history of building new technologies into our services, and we have a clear long-term playbook for becoming leaders.
And there are a few key aspects of this that I want to take some time to go through today. The first is world-class compute infrastructure. I recently shared that, by the end of this year, we'll have about 350,000 H100s; and including other GPUs, that will be around 600,000 H100 equivalents of compute. We're well positioned now because of the lessons that we learned from Reels. We initially underbuilt our GPU clusters for Reels. And when we were going through that, I decided that we should build enough capacity to support both Reels and another Reels-sized AI service that we expected to emerge so we wouldn't be in that situation again. And at the time, the decision was somewhat controversial, and we faced a lot of questions about CapEx spending, but I'm really glad that we did this. Now going forward, we think that training and operating future models will be even more compute-intensive. We don't have a clear expectation for exactly how much this will be yet, but the trend has been that state-of-the-art large language models have been trained on roughly 10x the amount of compute each year. And our training clusters are only part of our overall infrastructure, and the rest, obviously, isn't growing as quickly. But overall, we're playing to win here, and I expect us to continue investing aggressively in this area. In order to build the most advanced clusters, we're also designing novel data centers and designing our own custom silicon specialized for our workloads. The second part of our playbook is open-source software infrastructure. Our long-standing strategy has been to build open-source general infrastructure while keeping our specific product implementations proprietary. In the case of AI, the general infrastructure includes our Llama models, including Llama 3, which is training now, and it's looking great so far, as well as industry standard tools like PyTorch that we've developed.
And this approach to open source has unlocked a lot of innovation across the industry, and it's something that we believe in deeply. And I know that some people have questions about how we benefit from open sourcing the results of our research and large amounts of compute. So I thought it might be useful to lay out the strategic benefits here. The short version is that open sourcing improves our models. And because there's still significant work to turn our models into products, and because there will be other open-source models available anyway, we find that there are mostly advantages to being the open-source leader, and it doesn't remove differentiation for our products much anyway. And more specifically, there are several strategic benefits. First, open-source software is typically safer and more secure as well as more compute-efficient to operate due to all the ongoing feedback, scrutiny and development from the community. Now this is a big deal because safety is one of the most important issues in AI. Efficiency improvements and lowering the compute costs also benefit everyone, including us. Second, open-source software often becomes an industry standard. And when companies standardize on building with our stack, it then becomes easier to integrate new innovations into our products. That's subtle, but the ability to learn and improve quickly is a huge advantage. And being an industry standard enables that.
Third, open source is hugely popular with developers and researchers. And we know that people want to work on open systems that will be widely adopted. So this helps us recruit the best people at Meta, which is a very big deal for leading in any new technology area. And again, we typically have unique data and build unique product integrations anyway, so providing infrastructure like Llama as open source doesn't reduce our main advantage. This is why our long-standing strategy has been to open source general infrastructure and why I expect it to continue to be the right approach for us going forward. The next part of our playbook is just taking a long-term approach to development.
While we're working on today's products and models, we're also working on the research that we need to advance for Llama 5, 6 and 7 in the coming years and beyond to develop full general intelligence. It's important to have a portfolio of multiyear investments in research projects, but it's also important to have clear launch vehicles like future Llama models that help focus our work. We've worked on general intelligence in our lab, FAIR, for more than a decade, as I mentioned, and we produced a lot of valuable work. But having clear product targets for delivering general intelligence really focuses this work and helps us build the leading research program.
Now the next key part of our playbook is learning from unique data and feedback loops in our products. When people think about data, they typically think about the corpus that you might use to train a model upfront. And on Facebook and Instagram, there are hundreds of billions of publicly shared images and tens of billions of public videos, which we estimate is greater than the Common Crawl data set. And people share large numbers of public text posts and comments across our services as well. But even more important than the upfront training corpus is the ability to establish the right feedback loops with hundreds of millions of people interacting with AI services across our products. And this feedback is a big part of how we've improved our AI systems so quickly with Reels and Ads, especially over the last couple of years when we had to re-architect them around new rules. Now that brings me to the last part of our playbook for building leading services, which is our culture of rapid learning and experimentation across our apps. When we decide that a new technology, like AI-recommended Reels, is going to be an important part of the future, we're not shy about having multiple teams experimenting with different versions across our apps until we get it right.
And then we learn what works and we roll it out to everyone. And there used to be this meme that we'd probably launch Stories on our Settings page at some point. And look, I think it's kind of funny because it gets to a core part of our approach. We start by learning and tuning our products until they perform the way we want, and then we roll them out very broadly. And occasionally, products will blow up before we're ready for them to, like Threads, although I'll note that Threads now has more people actively using it today than it did during its initial launch peak. So that one is, I think, on track to be a major success. But normally, we learn and we iterate methodically. And we started doing that with our AI services in the fall, launching Meta AI, our assistant; AI Studio, which is the precursor to Creator AIs; our Alpha with Business AIs; and the Ray-Ban Meta smart glasses. And we've been tuning each of these, and we're getting closer to rolling them out more widely. So you should expect that in the coming months. From there, we'll focus on rolling out services until they reach hundreds of millions or billions of people. And usually, only when we reach that kind of scale do we start focusing on what monetization will look like. Although in this case, the way the business AIs will help business messaging grow in WhatsApp, Messenger and Instagram is pretty clear. But that's our basic approach, and I'm really excited about pointing our company at developing so many of these awesome things.
Now we have 2 major parts of our long-term vision, and in addition to AI, the other part is the metaverse. We've invested heavily in both AI and the metaverse for a long time, and we will continue to do so. These days, I get a lot more questions about AI, and that field is moving very quickly. But I still expect this next generation of AR, VR and MR computing platforms to deliver a realistic sense of presence that will be the foundation for the future of social experiences and almost every other category of experiences as well. Reality Labs crossed $1 billion in revenue in Q4 for the first time, with Quest having a strong holiday season. Quest 3 is off to a strong start, and I expect it to continue to be the most popular mixed reality device. With Quest 3 and Quest 2 both performing well, we saw that the Quest app was actually the most downloaded app in the App Store on Christmas Day.
I want to give a shout-out to Asgard's Wrath 2, which was developed by one of our in-house studios and received IGN's 10 out of 10 masterpiece rating, making it one of the best-rated games out there, not just in VR, but on any platform. So it's a really good sign that we're able to deliver that quality of work at Meta. Horizon is growing quickly, too.
It is now one of the top 10 most-used apps on Quest, and we have an exciting road map ahead. This is another example of applying the long-term playbook that I discussed earlier with AI but in another area. We take the time to build up the core technology and tune the experience.
And then when it's ready, we're good at growing things. Now our focus for this year is going to be on growing the mobile version of Horizon as well as the VR one. Ray-Ban Meta smart glasses are also off to a very strong start, both in sales and engagement. Our partner, EssilorLuxottica, is already planning on making more than we both expected due to high demand. Engagement and retention are also significantly higher than the first version of the glasses.
The experience is just a lot better with Meta AI in there, as well as a higher-resolution camera, better audio and more. And we also have an exciting road map of software improvements ahead, starting with rolling out multimodal AI and then some other really exciting new AI features later in the year. I said this before, but I think that people are going to want new categories of devices that seamlessly engage with AIs frequently throughout the day without having to take out your phone and press a button and point it at what you want to see. And I think that smart glasses are going to be a compelling form factor for this, and it's a good example of how our AI and metaverse visions are connected. In addition to AI and the metaverse, we're continuing to improve our Apps and Ads businesses as well.
Reels and our discovery engine remain a priority and major driver of engagement, and Messaging continues to be our focus for building the next revenue pillar of our business before our longer-term work reaches scale. But since I went a bit longer in the other areas today, I'm just going to mention a few highlights here. Reels continues to do very well across both Instagram and Facebook. People re-share Reels 3.5 billion times every day. Reels is now contributing to our net revenue across our apps. The biggest opportunity going forward is unifying our recommendation systems across Reels and other types of video that will help people discover the best content across our systems no matter what format it's in.
WhatsApp is also doing very well. And the most exciting new trend here is that it is succeeding more broadly in the United States, where there's a real appetite for a private, secure and cross-platform messaging app that everyone can use. And given the strategic importance of the U.S. and its outsized importance for revenue, this is just a huge opportunity. Threads is also growing steadily with more than 130 million monthly actives. And I'm optimistic that we can keep the pace of improvements in growth going and show that a friendly, discussion-oriented app can be as widely used as the most popular social apps. All right. That's what I wanted to cover today. Our communities are growing, and our business is back on track. Once again, a big thank you to all of our employees, partners, shareholders and everyone in our community for sticking with us and for making 2023 such a success. I'm looking forward to another exciting year ahead. And now here's Susan.