Why Mark Zuckerberg thinks AR glasses will replace your phone

Meta’s CEO on his first pair of AR glasses, partnering with Ray-Ban, why he’s done with politics, and more.

We had just come off that demo, walked into the podcast studio, sat down, and hit record. It was fresh in our minds, and that’s where we started. Orion is very much the story of AR as a category. It’s something that Meta hoped would be a consumer product but decided toward the end of its development that it wouldn’t be because of how expensive it is to make. So instead, they’ve turned it into a fancy demo that people like me are getting to try around Connect this year.

It’s really meant to signify that, “Hey, we have been building something the whole time. We finally have something that works. It’s just not something that we can ship at commercial scale.”

NP: The first thing that struck me listening to the interview was that Zuckerberg feels like he has control of the next platform shift, that platform shift is going to be glasses, and that he can actually take the fight to Apple and Google in a way that he probably couldn’t when Meta was a younger company, when it was just Facebook.

AH: Yeah, and they’re seeing a lot of early traction with the Meta Ray-Bans. We talked a lot about that, their expanded partnership with EssilorLuxottica, and why he thinks this really storied eyewear conglomerate out of Europe could do for smart glasses what Samsung did for smartphones in Korea. He sees this as becoming a huge millions-of-units-a-year market.

I think everyone here at The Verge can see that the Ray-Bans are an early hit and that Meta has tapped into something here that may end up being pretty big in the long run: not overpacking tech into glasses, so they look good and do a handful of things really well. And Meta is expanding on that rapidly this year with some other AI features that we also talked about.

NP: You got into that in depth, but the other thing that really struck me about this interview is that Zuck just seems loose. He seems confident. He seems almost defiant, in a way.

AH: Yeah, he’s done a lot of self-reflection. In the back half of this interview, we get into a lot of the brand stuff around Meta, how he’s worked through the last few years, and where he sees the company going now, which is, in his own words, “nonpartisan.” He even admits that he may be naive in thinking that a company like Meta can be nonpartisan, but he’s going to try to play a back seat role to all of the discourse that has really engulfed the company for the last 10 years.

And we get into all of the dicey stuff. We get into the link between social media and teen mental health. We get into Cambridge Analytica and how, in hindsight, he thinks the company was unfairly blamed for it. I would say this is a new Zuckerberg, and it was fascinating to hear him talk about all of this in retrospect.

NP: The one thing I’ll say is he was in a very talkative mood with you, and you let him talk. There are some answers in there particularly around the harms to teens from social media where he says the data isn’t there, and I’m very curious how parents are going to react to his comments.

AH: Me, too.

NP: All right, let’s get into it. Here’s Verge deputy editor Alex Heath interviewing Meta CEO Mark Zuckerberg.

The Orion smart glasses have been in the works for almost a decade, but Zuckerberg thinks they aren’t quite ready for the mainstream. Photo by Vjeran Pavic / The Verge

This transcript has been lightly edited for length and clarity. 

Alex Heath: Mark, we just tried Orion together.

Mark Zuckerberg: Yeah. What did you think?

We’re fresh off of it. It feels like true AR glasses are finally getting closer. Orion is a product that you have been working on for five-plus years.

Almost 10.

Take me back to the beginning when you started the project. When it started in research, what were you thinking about? What was the goal for it?

A lot of it goes all the way back to our relationship with mobile platforms. We have lived through one major platform transition already because we started on the web, not on mobile. Mobile phones and smartphones got started around the same time as Facebook and early social media, so we didn’t really get to play any role in that platform transition.

But going through it, where we weren’t born on mobile, we had this awareness that, okay, web was a thing; mobile is a thing that is different. There are strengths and weaknesses of it. There’s this continuum of computing where, now, you have a mobile device that you can take with you all the time, and that’s amazing. But it’s small, and it kind of pulls you away from other interactions. Those things are not great.

There was this recognition that, just like there was the transition from computers to mobile, mobile was not going to be the end of the line. As soon as we started becoming a more stable company, once we found our footing on mobile and we weren’t clearly going to go out of business or something like that, I was like, “Okay, let’s start planting some seeds for what we think could be the future.” Mobile is already getting defined. By 2012, 2014, it was generally too late to really shape that platform in a meaningful way. I mean, we had some experiments, but they didn’t succeed or go anywhere.

Pretty quickly, I was like, “Okay, we should focus on the future because, just like there was the shift from desktop to mobile, new things are going to be possible in the future. So what is that?” I think the simplest version of it is basically what you started seeing with Orion. The vision is a normal pair of glasses that can do two really fundamental things. One is to put holograms in the world to deliver this realistic sense of presence, like you were there with another person or in another place, or maybe you’re physically with a person, but just like we did, you can pull up a virtual Pong game or whatever. You can work on things together. You can sit at a coffee shop and pull up your whole workstation of different monitors. You can be on a flight or in the back seat of a car and pull up a full-screen movie theater. There’s great computing and a full sense of presence, like you’re there with people no matter where they are.

Thing two is that it’s the ideal device for AI. The reason is that glasses are uniquely positioned to let an AI see what you see and hear what you hear. They give you very subtle feedback, where they can speak in your ear or show silent input on the glasses that other people can’t see and that doesn’t take you away from the world around you. I think that is all going to be really profound. Now, when we got started, I had thought that the hologram part of this was going to be possible before AI. It’s an interesting twist of fate that the AI part is actually possible before the holograms are really able to be mass-produced at an affordable price.

But that was the vision. I think that it’s pretty easy to wrap your head around [the idea that] there are already 1 to 2 billion people who wear glasses on a daily basis. Just like everyone who upgraded to smartphones, I think everyone who has glasses is pretty quickly going to upgrade to smart glasses over the next decade. And then I think it’s going to start being really valuable, and a lot of other people who aren’t wearing glasses today are going to end up wearing them, too.

That’s the simple version. Then, as we’ve developed this out, there are more nuanced directions that have emerged. While that was the full version of what we wanted to build, there are all these things where we said, “Okay, maybe it’s really hard to build normal-looking glasses that can do holograms at an affordable price point. So what parts of that can we take on?” And that’s where we did the partnership with EssilorLuxottica.

So it’s like, “Okay, before you have a display, you can get normal-looking glasses that can stream video and capture content and have a camera, a microphone, and great audio.” But the most important feature at this point is the ability to access Meta AI and just have a full AI there, and it’s multimodal because it has a camera. That product is starting at $300. Initially, I thought, “Hey, this is on the technology path to building full holographic glasses.” At this point, I actually just think both are going to exist long term. I think there are going to be people who want the full holographic glasses, and I think there are going to be people who prefer the superior form factor or lower price of a device where they are primarily optimizing for getting AI. I also think there’s going to be a range of things in between.

So there’s the full field of view that you just saw, where it’s 70 degrees, a really wide field of view for glasses. But I think that there are other products in between that, too. There’s a heads-up display version, which, for that, you probably just need 20 or 30 degrees. You can’t do full-world holograms where you’re interacting with things. You’re not going to play ping-pong in a 30-degree field of view, but you can communicate with AI. You can text your friends, you can get directions, and you can see the content that you’re capturing.

I think that there’s a lot there that’s going to be compelling. At each step along this continuum, from display-less to small display to full holographic, you’re packing more technology in. Each step up is going to be a little more expensive and is going to have more constraints on the form factor. Even though I think we’ll get them all to be attractive, you’ll be able to do the simpler ones in much smaller form factors. And then, of course, there are the mixed reality headsets, which kind of took a different direction, which is going toward the same vision. But on that, we said, “Okay, well, we’re not going to try to fit into a glasses form factor.” For that one, we’re going to say, “Okay, we’re going to really go for all the compute we want, and this is going to be more of a headset or goggles form factor.”

My guess is that that’s going to be a long-term thing, too, because there are a bunch of uses where people want the full immersion. And if you’re sitting at your desk and working for a long period of time, you might want the increase in computing power you’re going to be able to get. But I think there’s no doubt that what you saw with Orion is the quintessential vision of what I thought and continue to think is going to be the next major multibillion-person computing platform. And then all these other things are going to get built out around it.

It’s my understanding that you originally hoped Orion would be a consumer product when you first set out to build it.

Yeah. Orion was meant to be our first consumer product, and we weren’t sure if we were going to be able to pull it off. In general, it’s probably turned out significantly better than our 50-50 estimates of what it would be, but we didn’t get there on everything that we wanted to. We still want it to be a little smaller, a little brighter, a little bit higher resolution, and a lot more affordable before we put it out there as a product. And look, we have a line of sight to all those things. I think we’ll probably have the thing that was going to be the version two end up being the consumer product, and we’re going to use Orion with developers to basically cultivate the software experience so that by the time we’re ready to ship something, it’s going to be much more dialed in.

But to be clear, you’re not selling Orion at all. What I’m wondering is, when you made the call, I think it was around 2022, to say Orion is going to be an internal dev kit, how did you feel about that? Was there any part of you that was like, “I really wish this could have just been the consumer product we had built for years”?

I always want to ship stuff quickly, but I think it was the right thing. On this product, there’s a pretty clear set of constraints that you want to hit, especially around the form factor. It is very helpful for us that chunkier glasses are kind of ascendant in the fashion world because that allows us to build glasses that are going to be fashionable but also tech-forward. Even so, I’d say these are unmistakably glasses. They’re reasonably comfortable. They’re under 100 grams.

I wore them for two hours and I couldn’t really tell.

I think we aspire to build things that look really good, and I think these are good glasses, but I want it to be a little smaller so it can fit within what’s really fashionable. When people see the Ray-Bans, there’s no compromise on fashion. Part of why I think people like them is you get all this functionality, but even when you’re not using it, they’re great glasses. For the future version of Orion, that’s the target, too.

Most of the time you’re going through your day, you’re not computing, or maybe something is happening in the background. It needs to be good in order for you to want to keep it on your face. I feel like we’re almost there. We’ve made more progress than anyone else in the world that I’m aware of, but we didn’t quite hit my bar. Similarly, on price, these are going to be more expensive than the Ray-Bans. There’s just a lot more tech that’s going in them, but we do want it to be within a consumer price point, and this was outside of that range, so I wanted to wait until we could get to that range before shipping it.

Are you imagining that the first commercial version — whenever it’s ready in the next couple of years — will be a developer-focused product that you’re selling publicly? Or do you want it to be consumer-ready? 

No, consumer.

That’s why I’m asking about the strategy, because Apple, Snap, and others have decided to do developer-focused plays and get the hardware going with developers early. But are you saying you’re skipping that and just going straight to consumer?

We are using this as a developer kit, but just primarily internally and maybe with a handful of partners. At this point, Meta is by far the premier developer of augmented reality and virtual and mixed reality software and hardware in the world. So you can think about it as a developer kit, but we have a lot of that talent in-house and then we also have well-developed partnerships with a lot of folks externally who we can go to and work with as well.

I don’t think we need to announce a dev kit that arbitrary developers can go buy to get access to the talent that we need to go build out the platform. We’re in a place where we can work with partners and do that, but that’s absolutely what we’re going to do over the next few years. We’re going to hone the experience and figure out what we need to do to really nail it when it’s ready to ship.

A lot has been written about how much you’re spending on Reality Labs. You probably can’t share an exact number, but if you were to guess the cost of building Orion over the last 10 years, are we talking $5 billion-plus, or was it more than that?

Yeah, probably. But overall for Reality Labs, for a while, a lot of people thought all of that budget was going toward virtual and mixed reality. I actually think we’ve said publicly that our glasses programs are a bigger budget than our virtual and mixed reality programs, but that goes across all of them. So that’s the full AR, that’s the display-less glasses, all the work we’re going to do on Ray-Ban, and we just announced the expanded partnership with EssilorLuxottica. They’re a great company. We’ve had a great experience working with them. They’ve designed so many great glasses, and working with them to do even more is going to be really exciting. There’s a lot more to do there on all of these things.

How does this partnership work, and this renewal that you just did with them, how is it structured? What does this deal look like?

I think it was a commitment from both companies that we’re feeling pretty good about how this is going and that we’re going to build a lot more glasses together. Rather than doing one generation and then designing the next generation, a longer-term partnership allows the teams to not just worry about one thing at a time — “Okay, is this one going to be good? And then how do we build on that for the next one?”

Now, we can start a multiyear roadmap of many different devices, knowing that we’re going to be working together for a long time. I’m optimistic about that. That’s sort of how we work internally. Sometimes, when you’re early on, you definitely want to learn from each device launch, but when there are things that you’re committed to, I don’t think you want the team to feel like, “Okay, if we don’t get the short-term milestone, then we’re going to cancel the whole thing.”

Are you buying a stake in EssilorLuxottica?

Yeah, I think we’ve talked about investing in them. It’s not going to be a major thing. I’d say it’s more of a symbolic thing. We want this to be a long-term partnership, and as part of that, I thought this would be a nice gesture. I fundamentally believe in them a lot. I think that they’re going to go from being the premier glasses company in the world to one of the major technology companies in the world. My vision for them is like Samsung in Korea: Samsung made it so that Korea became one of the main hubs of building phones in the world. I think this is probably one of the best shots for Europe, and Italy in particular, to become a major hub for manufacturing, building, and designing the next major category of computing platforms overall.

They’re kind of all in on that now, and it’s been this interesting question because they have such a good business and such deep competence in the areas. I’ve gotten more of an appreciation of how strong of a technology company they are in their own way: designing lenses, designing the materials that you need to make fashionable glasses that can be light enough but also feel good. They bring a huge amount that people in our world, the tech world, probably don’t necessarily see, but I think that they’re really well set up for the future. So I believe in the partnership. I’m really excited about the work that we’re doing together, and fundamentally, I think that that’s just going to be a massively successful company in the future.

Is it set up in a way where they control the designs and you provide the tech stack, or do you collaborate on the design? 

I think we collaborate on everything. Part of working together is that you build a joint culture over time, and there are a lot of really sharp people over there. I think it took maybe a couple of versions for us to gain an appreciation for how each of us approaches things. They really think about things from this “fashion, manufacturing, lenses, selling optical devices” perspective. And we obviously come at it from a consumer electronics, AI, and software perspective. But I think, over time, we’ve just come to appreciate each other’s perspectives on things a lot more.

I’m constantly talking to them to get their ideas on different things. You know partnerships are working well when you reach out to them to get their opinion on things that are not actually currently in the scope of what you’re working on together. I do that frequently with Rocco [Basilico], who runs their wearables, and Francesco [Milleri], who’s their CEO, and our team does that with a large part of the working group over there. It’s a good crew. They share good values. They’re really sharp. And like I said, I believe in them, and I think it’s going to be a very successful partnership and company.

How many Ray-Ban Metas have you sold so far?

I don’t know if we’ve given a number on that.

I know. That’s why I’m asking.

It’s going very well. One of the things that I think is interesting is we underestimated demand. One thing that is very different in the world of consumer electronics than software is that there are fewer supply constraints in software. There are some. I mean, like some of the stuff that we’re rolling out, like the voice on Meta AI, we need to meter it as we’re rolling it out because we need to make sure we have enough inference capacity to handle it, but fundamentally, we’ll resolve that in weeks.

But for manufacturing, you make these concrete decisions like, “Okay, are we setting up four manufacturing lines or six?” And each one is a big upfront [capital expenditure] investment, and you’re basically deciding upfront the velocity at which you’re going to be able to generate supply before you know what the demand is. On this one, we thought that Ray-Ban Meta was probably going to sell three to five times more than the first version did. And we just dramatically underestimated it.

Now, we’re in this position where it’s actually been somewhat hard for us to gauge what the real demand is because they’re sold out. You can’t get them. So, if you can’t get them, how do you know where the actual curve is? We’re basically getting to the point where that’s resolved. Now, we kind of adjusted, and we made the decision to build more manufacturing lines. It took some time to do it. They’re online now. It’s not just about being able to make them; you need to get them into all the stores and get the distribution right. We feel like that’s in a pretty good place now.

Over the rest of this year, we’re going to start getting a real sense of the demand, but while that’s going on, the glasses keep getting better because of over-the-air AI updates. So, even though we keep shipping new frames and adding more transition lenses because people want to wear them indoors, the hardware doesn’t necessarily change. And that’s an interesting thing because sunglasses are a little more discretionary, so I think a lot more people early on were thinking, “Hey, I’ll experiment with this with sunglasses. I’m not going to make these my primary glasses.” Now, we’re seeing a lot more people say, “Hey, this is actually really useful. I want to be able to wear them inside. I want them to be my primary glasses.”

So, whether that’s working with them through the optical channel or the transitions, that’s an important part, but the AI part of this also just keeps getting better. We talked about it at Connect: the ability to have, over the next few months when we roll this out, real-time translations. You’re traveling abroad, someone’s speaking Spanish to you, you just get it translated into English in your ear. It will roll out to more and more languages over time. I think we’re starting with a few languages, and we’ll hit more over time.

I tried that. Well, actually, I didn’t try real-time translation, but I tried looking at a menu in French, and it translated it into English. And then, at the end, I was like, “What is the euro [price] in USD?” And it did that, too. I’m also starting to see the continuum of this to Orion in the sense of the utility aspects. You could say, “Look at this and remind me about it at 8PM tonight,” and then it syncs with the companion app. 

Yeah, Reminders are a new thing.

It’s not replacing the phone, but it’s augmenting what I would do with my phone. And I’m wondering if the [AI] app is a place for more of that kind of interaction as well. How are these glasses going to be more deeply tied to Meta AI over time? It seems like they’re getting closer and closer all the time.

Well, I think Meta AI is becoming a more and more prominent feature of the glasses, and there’s more stuff that you can do. You just mentioned Reminders, which is another example. Now, that is just going to work, and now your glasses can remind you of things.

Or you can look at a phone number and say, “Call this phone number,” and then it calls on the phone.

Yeah, we’ll add more capabilities over time, and some of those are model updates. Okay, now it has Llama 3.2, but some of it is software development around it. Reminders you don’t get for free just because we updated the model. We have this big software development effort, and we’re adding features continuously and developing the ecosystem, so you get more apps like Spotify, and all these different things can work more natively.

So the glasses just get more and more useful, which I think is also going to increase demand over time. And how does it interact with phones? Like you said, I don’t think people are getting rid of phones anytime soon. The way I think about this is that when phones became the primary computing platform, we didn’t get rid of computers. We just kind of shifted. I don’t know if you had this experience, but at some point in the early 2010s, I noticed that I’d be sitting at my desk in front of my computer, and I’d just pull out my phone to do things.

It’s not like we’re going to throw away our phones, but I think what’s going to happen is that, slowly, we’re just going to start doing more things with our glasses and leaving our phones in our pockets more. It’s not like we’re done with our computers, and I don’t think we’re going to be done with our phones for a while, but there’s a pretty clear path where you’re just going to use your glasses for more and more things. Over time, I think the glasses are also going to be able to be powered by wrist-based wearables or other wearables.

So, you’re going to wake up one day 10 years from now, and you’re not even going to need to bring your phone with you. Now, you’re still going to have a phone, but I think more of the time, people are going to leave it in their pocket or leave it in their bag, or eventually, some of the time, leave it at home. I think there will be this gradual shift to glasses becoming the main way we do computing.

It’s interesting that we’re talking about this right now, because I feel like phones are becoming kind of boring and stale. I was just looking at the new iPhone, and it’s basically the same as the year before. People are doing foldables, but it feels like people have run out of ideas on phones and that they’re kind of at their natural end state. When you see something like the Ray-Bans and how people have gravitated to them in a way that’s surprised you, and I think surprised all of us, I wonder if it’s also just that people want to interact with technology in different ways now.

Like you said at the beginning, the way that AI has intersected with this is kind of an “aha” thing for people that, honestly, I didn’t expect to click as quickly as it did. But when I got whitelisted for the AI, I was walking around in my backyard and using it, and I was like, “Oh, it’s obvious now where this is going.” It feels like things are finally in a place where you can see where it’s going, whereas before, it was a lot of R&D and talking about it. The Ray-Bans are kind of a signifier of that, and I’m wondering if you agree.

For what it’s worth, I also think that all the AI work is going to make phones a lot more exciting. The most exciting thing that has happened to our family of apps roadmap in a long time is all the different AI things that we’re building. If I were at any of the other companies trying to design what the next few versions of iPhone or Google’s phones should be, I think that there’s a long and interesting roadmap of things that they can do with AI that, as an app developer, we can’t. That’s a pretty exciting and interesting thing for them to do, which I assume they will.

On the AI social media piece, one of the wilder things that your team told me you’re going to start doing is showing people AI-generated imagery personalized to them, in feed. I think it’s starting as an experiment, but if you’re a photographer, you would see Meta AI generating content that’s personalized for you, alongside content from the people you follow.

It’s this idea that I’ve been thinking about, of AI invading social media, so to speak — maybe you don’t like the word “invading,” but you know what I mean — and what that does to how we relate to each other as humans. In your view, how much AI stuff and AI-generated stuff is going to be filling feeds in the near future?

Here’s how I come at this: in the history of running the company — and we’ve been building these apps for 20 years — every three to five years, there’s some new major format that comes along that is typically additive to the experience. So, initially, people updated their profiles; then they were able to post statuses that were texts; then links; then you got photos early on; then you added videos; then mobile. Basically Snap invented stories, the first version of that, and that became a pretty widely used format. The whole category of short-form video, I think, is still an ascendant format.

Given that set of assumptions, we’re trying to understand what things are most useful to people within that. There’s one vein of this, which is helping people and creators make better content using AI. So that is going to be pretty clear. Just make it super easy for aspiring creators or advanced creators to make much better stuff than they would be able to otherwise. That can take the format of like, “All right, my daughter is writing a book and she wants it illustrated, and we sit down together and work with Meta AI and Imagine to help her come up with images to illustrate it.” That’s a thing that’s like, she didn’t have the capability to do that before. She’s not a graphic designer, but now she has that ability. I think that that’s going to be pretty cool.

Then there’s a version where you have this great diversity of AI agents that are part of this system. And this, I think, is a big difference between our vision of AI and most of the other companies’. Yeah, we’re building Meta AI as the main assistant that you can use. That’s sort of equivalent to the singular assistant that Google or OpenAI or different folks are building, but it’s not really the main thing that we’re doing. Our main vision is that we think there are going to be a lot of these. Every business, all the hundreds of millions of small businesses, just like they have a website and an email address and a social media account today, I think they’re all going to have an AI that helps them interact with their customers in the future, that does some combination of sales and customer support and all of that.

I think that they’re going to have their own profiles. They’re going to be creating content. People will be able to follow them if they want. You’ll be able to comment on their stuff. They may be able to comment on your stuff if you’re connected with them, and there will obviously be different logic and rules, but that’s one way that there’s going to be a lot more AI participants in the broader social construct. Then you get to the test that you mentioned, which is maybe the most abstract, which is just having the central Meta AI system directly generate content for you based on what we think is going to be interesting to you and putting that in your feed.

On that, I think there’s been this trend over time where the feeds started off as primarily and exclusively content for people you followed, your friends. I guess it was friends early on, then it kind of broadened out to, “Okay, you followed a set of friends and creators.” And then it got to a point where the algorithm was good enough where we’re actually showing you a lot of stuff that you’re not following directly because, in some ways, that’s a better way to show you more interesting stuff than only constraining it to things that you’ve chosen to follow.

I think the next logical jump on that is like, “Okay, we’re showing you content from your friends and creators that you’re following and creators that you’re not following that are generating interesting things.” And you just add on to that a layer of, “Okay, and we’re also going to show you content that’s generated by an AI system that might be something that you’re interested in.” Now, how big do any of these segments get? I think it’s really hard to know until you build them out over time, but it feels like it is a category in the world that’s going to exist, and how big it gets is kind of dependent on the execution and how good it is.

But in a lot of ways, the big change already happened, which is people getting content that they weren’t following. And the definition of feeds and social interaction has changed very fundamentally in the last 10 years. Now, in social systems, most of the direct interaction is happening in more private forums, in messaging or groups.

This is one of the reasons we were initially late with Reels in competing with TikTok: we hadn’t made this mental shift where we kind of felt like, “No, the feed is where you interact with people.” Actually, increasingly, the feed is becoming a place where you discover content that you then take to your private forums and interact with people there. It’s like, I’ll still have the thing where a friend will post something and I’ll comment on it and engage directly in feed. Again, this is additive. You’re adding more over time. But the main way that you engage with Reels isn’t necessarily that you go into the Reels comments and comment and talk to people you don’t know. It’s like you see something funny and you send it to friends in a group chat.

I think that paradigm will absolutely continue with AI and all kinds of interesting content. So it is facilitating connections with people, but already, we’re in this mode where our connections through social media are shifting to more private places, and the role of the feed in the ecosystem is more of what I’d call a discovery engine of content: icebreakers or interesting topic starters for the conversations that you’re having across this broader spectrum of places where you’re interacting.

The sociology that I’ve seen on this is that most people have way fewer friends physically than they would like to have. People cherish the human connections that they have, and the more we can do to make that feel more real and give you more reasons to connect, whether it’s through something funny that shows up so you can message someone or a pair of glasses that lets your sister show up as a hologram in your living room when she lives across the country and you wouldn’t be able to see her otherwise, that’s always going to be our bread and butter in what we’re doing.

But in addition to that, the average person, maybe they’d like to have 10 friends, and there’s the stat that — it’s sort of sad — the average American feels like they have fewer than three real close friends. So does this take away from that? My guess is no. I think that what’s going to happen is it’s going to help give people more of the support that they need and give people more reasons and the ability to connect with either a broader range of people or more deeply with the people they care about.

How are you feeling about how Threads is doing these days?

Threads is on fire. It’s great. There’s only so quickly that something can get to 1 billion people, so we’ll keep pushing on it.

I’ve heard it’s still using Instagram a lot for growth. I’m wondering, when do you see it getting to a standalone growth driver on its own?

I think that these things all connect to each other. Threads helps Instagram, and Instagram helps Threads. I don’t know that we have some strategic goal, which is to make it so that Threads is completely disconnected from Instagram or Facebook. I actually think we’re going in the other direction. It started off just connected to Instagram, and now we’ve also connected it so that the content can show up [elsewhere].

I’m not even sure what X is anymore, but I think what it used to be, what Twitter used to be, was a place where you went when news was happening. I know you and the company seem to be distancing yourselves from recommending news. But with Threads, it feels like that’s what people want and what people thought Threads might be, but it seems like you are intentionally saying, “We don’t want Threads to be that.”

There are different ways to look at this. I always looked at Twitter not as primarily about real-time news but as a shortform, primarily text discussion-oriented app. To me, the fundamental defining aspect of that format is that when you make a post, the comments aren’t subordinate to the post. The comments are kind of at a peer level.

That is a very different architecture than every other type of social network that’s out there. And it’s a subtle difference, but within these systems, these subtle differences lead to very different emerging behaviors. Because of that, people can take and fork discussions, and it makes it a very good discussion-oriented platform. News is one thing that people like discussing, but it’s not the only thing.

I always looked at Twitter, and I was like, “Hey, this is such a wasted opportunity. This is clearly a billion-person app.” Maybe in the modern day, when you have many billions of people using social apps, it should be multiple billions of people. There were a lot of things that have been complicated about Twitter and the corporate structure and all of that, but for whatever reason, they just weren’t quite getting there. Eventually, I thought, “Hey, I think we can do this. I think we can get this, build out the discussion platform in a way that can get to a billion people and be more of a ubiquitous social platform that I think achieves its full potential.” But our version of this is that we want it to be a kinder place. We don’t want it to start with the direct head-to-head combat of news, and especially politics.

I think we’ll see. We’ll run the experiment.

That needs to exist in the world. Because I feel like with X’s seeming implosion, it doesn’t really exist anymore. Maybe I’m biased as someone in the media, but I do think when something big happens in the world, people want an app that they can go to and see everyone that they follow talking about it immediately. There’s not an immediacy [on Threads].

Well, we’re not the only company. There are a ton of different competitors and different companies doing things. I think that there’s a talented team over at X, so I wouldn’t write them off. And then obviously, there are all these other folks, and there are a lot of startups that are doing stuff. So I don’t feel like we have to go at that first. I think that maybe we get there over time, or maybe we decide that it’s enough of a zero-sum trade, or maybe even a negative-sum trade, where that use case should exist somewhere but maybe that use case prevents a lot more usage and a lot more value in other places because it makes it a somewhat less friendly place. I don’t think we know the answer to that yet. But I do think the last 8–10 years of our experience have shown that the political discourse is tricky.

On the one hand, it’s obviously a very important thing in society. On the other hand, I don’t think it leaves people feeling good. I’m torn between these two values. I think people should be able to have this kind of open discourse, and that’s good. But I don’t want to design a product that makes people angry. There’s an informational lens for looking at this, and then there’s “you’re designing a product, and what’s the feel of the product?” I think anyone who’s designing a product cares a lot about how the thing feels.

But you recognize the importance of that discussion happening. 

And culture changes over time. Maybe the stuff will be a little bit less polarized and anger-inducing at some point, and maybe it’ll be possible to have more of that while also, at the same time, having a product where we’re proud of how it feels. Until then, I think we want to design a product where people can get the things that they want, but fundamentally, I care a lot about how people feel coming away from the product.

Do you see this decision to downrank political content from people users aren’t following in feed as a political decision? Because you’re also, at the same time, not really saying much about the US presidential election this year. You’re not donating. You’ve said you want to stay out of it now.

And I see the way the company’s acting, and it reflects your personal way you’re operating right now. I’m wondering how much more of it is also what you and the company have gone through and the political environment, and not necessarily just what users are telling you.

Sure.

Is there a throughline there?

I’m sure it’s all connected. In this case, it wasn’t a tradeoff between those two things because this actually was what our community was telling us. And people were saying, “Generally, we don’t want so much politics. We don’t feel good. We want more stuff from our friends and family. We want more stuff from our interests.” That was kind of the primary driver. But it’s definitely the case that our corporate experience on this shaped this.

I think there’s a big difference between something being political and being partisan. And the main thing that I care about is making sure that we can be seen as nonpartisan and be a trusted institution by as many people as possible, as much as something can be in the world in 2024. I think that the partisan politics is so tough in the world right now that I’ve made the decision that, for me and for the company, the best thing to do is to try to be as nonpartisan and neutral as possible in all of this and distance ourselves from it as much as possible. It’s not just the substance. I also think perception matters. Maybe it doesn’t matter on our platforms, whether I endorse a candidate or not, but I don’t want to go anywhere near that.

Sure, you could say that’s a political strategy, but for where we are in the world today, it’s very hard. Almost every institution has become partisan in some way, and we are just trying to resist that. And maybe I’m too naive, and maybe that’s impossible, but we’re going to try to do that.

On the Acquired podcast recently, you said that the political miscalculation was a 20-year mistake.

Yeah, from a brand perspective.

And you said it was going to take another 10 years or so for you to fully work through that cycle. What makes you think it’s such a lasting thing? Because you look at how you personally have evolved over the last couple of years, and I think perception of the company has evolved. I’m wondering what you meant by saying it’s going to take another 10 years.

I’m just talking about where our brand and our reputation are compared to where I think they would’ve been. Sure, maybe things have improved somewhat over the last few years. You can feel the trend, but it’s still significantly worse than it was in 2016. The internet industry overall, and I think our company, in particular, were seen way more positively.

Look, there were real issues. I think it’s always very difficult to talk about this stuff in a nuanced way because, to some degree, before 2016, everyone was sort of too rosy about the internet overall and didn’t talk enough about the issues. Then the pendulum swung and people only talked about the issues and didn’t talk about the stuff that was positive, and it was all there the whole time. When I talk about this, I don’t mean to come across as simplistic or—

Or that you guys didn’t do anything wrong or anything.

Or that there weren’t issues with the internet or things like that. Obviously, every year, whether it’s politics or other things, there are always things that you look back on and you’re like, “Hey, if I were playing this perfectly, I would’ve done these things differently.” But I do think it’s the case that I didn’t really know how to react to something as big of a shift in the world as what happened, and it took me a while to find my footing. I do think that it’s tricky when you’re caught up in these big debates and you’re not experienced or sophisticated in engaging with that. I think you can make some big missteps. I do think that some of the things that we were accused of over time, it’s been pretty clear at this point, now that all the investigations have been done, that they weren’t true.

Source: https://www.theverge.com/24253481/meta-ceo-mark-zuckerberg-ar-glasses-orion-ray-bans-ai-decoder-interview