Mark Zuckerberg: Meta, Facebook, Instagram, and the Metaverse #267

Transcript

00:00:00 Let’s talk about free speech and censorship.

00:00:02 You don’t build a company like this

00:00:04 unless you believe that people expressing themselves

00:00:06 is a good thing.

00:00:07 Let me ask you as a father,

00:00:08 there’s a weight heavy on you

00:00:09 that people get bullied on social networks.

00:00:13 I care a lot about how people feel

00:00:14 when they use our products

00:00:15 and I don’t want to build products that make people angry.

00:00:19 Why do you think so many people dislike you?

00:00:23 Some even hate you.

00:00:25 And how do you regain their trust and support?

00:00:30 The following is a conversation with Mark Zuckerberg,

00:00:32 CEO of Facebook, now called Meta.

00:00:36 Please allow me to say a few words

00:00:38 about this conversation with Mark Zuckerberg,

00:00:41 about social media,

00:00:42 and about what troubles me in the world today,

00:00:45 and what gives me hope.

00:00:47 If this is not interesting to you,

00:00:49 I understand, please skip.

00:00:51 I believe that at its best,

00:00:54 social media puts a mirror to humanity

00:00:57 and reveals the full complexity of our world,

00:01:01 shining a light on the dark aspects of human nature

00:01:04 and giving us hope, a way out,

00:01:06 through the compassionate but tense chaos of conversation

00:01:09 that eventually can turn into understanding,

00:01:12 friendship, and even love.

00:01:15 But this is not simple.

00:01:17 Our world is not simple.

00:01:19 It is full of human suffering.

00:01:22 I think about the hundreds of millions of people

00:01:24 who are starving and who live in extreme poverty,

00:01:28 the one million people who take their own life every year,

00:01:31 the 20 million people that attempt it,

00:01:33 and the many, many more millions who suffer quietly

00:01:37 in ways that numbers can never know.

00:01:40 I’m troubled by the cruelty and pain of war.

00:01:44 Today, my heart goes out to the people of Ukraine.

00:01:48 My grandfather spilled his blood on this land,

00:01:52 held the line as a machine gunner

00:01:54 against the Nazi invasion, surviving impossible odds.

00:01:59 I am nothing without him.

00:02:01 His blood runs in my blood.

00:02:04 My words are useless here.

00:02:07 I send my love.

00:02:09 It’s all I have.

00:02:11 I hope to travel to Russia and Ukraine soon.

00:02:14 I will speak to citizens and leaders,

00:02:16 including Vladimir Putin.

00:02:19 As I’ve said in the past, I don’t care about access,

00:02:22 fame, money, or power, and I’m afraid of nothing.

00:02:27 But I am who I am, and my goal in conversation

00:02:31 is to understand the human being before me,

00:02:33 no matter who they are, no matter their position.

00:02:36 And I do believe the line between good and evil

00:02:40 runs through the heart of every man.

00:02:43 So this is it.

00:02:45 This is our world.

00:02:47 It is full of hate, violence, and destruction.

00:02:51 But it is also full of love, beauty,

00:02:55 and the insatiable desire to help each other.

00:02:59 The people who run the social networks

00:03:01 that show this world, that show us to ourselves,

00:03:05 have the greatest of responsibilities.

00:03:08 In a time of war, pandemic, atrocity,

00:03:11 we turn to social networks to share real human insights

00:03:14 and experiences, to organize protests and celebrations,

00:03:18 to learn and to challenge our understanding of the world,

00:03:21 of our history and of our future,

00:03:24 and above all, to be reminded of our common humanity.

00:03:28 When the social networks fail,

00:03:30 they have the power to cause immense suffering.

00:03:33 And when they succeed,

00:03:35 they have the power to lessen that suffering.

00:03:37 This is hard.

00:03:39 It’s a responsibility, perhaps,

00:03:41 almost unlike any other in history.

00:03:44 This podcast conversation attempts to understand the man

00:03:47 and the company who take this responsibility on,

00:03:50 where they fail and where they hope to succeed.

00:03:54 Mark Zuckerberg’s feet are often held to the fire,

00:03:57 as they should be, and this actually gives me hope.

00:04:01 The power of innovation and engineering,

00:04:03 coupled with the freedom of speech

00:04:05 in the form of its highest ideal,

00:04:07 I believe can solve any problem in the world.

00:04:11 But that’s just it, both are necessary,

00:04:14 the engineer and the critic.

00:04:17 I believe that criticism is essential, but cynicism is not.

00:04:23 And I worry that in our public discourse,

00:04:25 cynicism too easily masquerades as wisdom, as truth,

00:04:30 becomes viral and takes over,

00:04:32 and worse, suffocates the dreams of young minds

00:04:35 who want to build solutions to the problems of the world.

00:04:39 We need to inspire those young minds.

00:04:41 At least for me, they give me hope.

00:04:44 And one small way I’m trying to contribute

00:04:47 is to have honest conversations like these

00:04:49 that don’t just ride the viral wave of cynicism,

00:04:53 but seek to understand the failures

00:04:54 and successes of the past, the problems before us,

00:04:57 and the possible solutions

00:04:59 in this very complicated world of ours.

00:05:02 I’m sure I will fail often,

00:05:05 and I count on the critic to point it out when I do.

00:05:10 But I ask for one thing,

00:05:12 and that is to fuel the fire of optimism,

00:05:15 especially in those who dream to build solutions,

00:05:18 because without that, we don’t have a chance

00:05:21 on this too fragile, tiny planet of ours.

00:05:25 This is the Lex Fridman Podcast.

00:05:28 To support it, please check out our sponsors

00:05:30 in the description.

00:05:31 And now, dear friends, here’s Mark Zuckerberg.

00:05:40 Can you circle all the traffic lights, please?

00:05:53 You actually did it.

00:05:54 That is very impressive performance.

00:05:56 Okay, now we can initiate the interview procedure.

00:06:00 Is it possible that this conversation is happening

00:06:02 inside a metaverse created by you,

00:06:05 by Meta many years from now,

00:06:07 and we’re doing a memory replay experience?

00:06:10 I don’t know the answer to that.

00:06:11 Then I’d be some computer construct

00:06:15 and not the person who created that Meta company.

00:06:18 But that would truly be Meta.

00:06:21 Right, so this could be somebody else

00:06:23 using the Mark Zuckerberg avatar

00:06:26 who can do the Mark and the Lex conversation replay

00:06:29 from four decades ago when Meta was first sort of...

00:06:33 I mean, it’s not gonna be four decades

00:06:34 before we have photorealistic avatars like this.

00:06:38 So I think we’re much closer to that.

00:06:40 Well, that’s something you talk about

00:06:41 is how passionate you are about the idea

00:06:43 of the avatar representing who you are in the metaverse.

00:06:46 So I do these podcasts in person.

00:06:51 You know, I’m a stickler for that,

00:06:52 because there’s a magic to the in person conversation.

00:06:55 How long do you think it’ll be before

00:06:58 you can have the same kind of magic in the metaverse,

00:07:00 the same kind of intimacy in the chemistry,

00:07:02 whatever the heck is there when we’re talking in person?

00:07:06 How difficult is it?

00:07:07 How long before we have it in the metaverse?

00:07:10 Well, I think this is like the key question, right?

00:07:12 Because the thing that’s different about virtual

00:07:17 and hopefully augmented reality

00:07:19 compared to all other forms of digital platforms before

00:07:22 is this feeling of presence, right?

00:07:24 The feeling that you’re right,

00:07:25 that you’re in an experience

00:07:27 and that you’re there with other people or in another place.

00:07:29 And that’s just different from all of the other screens

00:07:32 that we have today, right?

00:07:33 Phones, TVs, all the stuff.

00:07:36 They’re trying to, in some cases, deliver experiences

00:07:39 that feel high fidelity,

00:07:43 but at no point do you actually feel like you’re in it, right?

00:07:46 At some level, the content is trying to sort of convince you

00:07:49 that this is a realistic thing that’s happening,

00:07:51 but all of the kind of subtle signals are telling you,

00:07:54 no, you’re looking at a screen.

00:07:56 So the question about how you develop these systems is like,

00:08:00 what are all of the things that make the physical world

00:08:03 all the different cues?

00:08:04 So I think on visual presence and spatial audio,

00:08:13 we’re making reasonable progress.

00:08:15 Spatial audio makes a huge difference.

00:08:16 I don’t know if you’ve tried this experience,

00:08:19 workrooms that we launched where you have meetings.

00:08:21 And I basically made a rule for all of the top,

00:08:26 you know, management folks at the company

00:08:27 that they need to be doing standing meetings

00:08:29 in workrooms already, right?

00:08:31 I feel like we got to dog food this,

00:08:33 you know, this is how people are gonna work in the future.

00:08:35 So we have to adopt this now.

00:08:38 And there were already a lot of things

00:08:40 that I think feel significantly better

00:08:42 than like typical Zoom meetings,

00:08:44 even though the avatars are a lot lower fidelity.

00:08:48 You know, the idea that you have spatial audio,

00:08:50 you’re around a table in VR with people.

00:08:53 If someone’s talking from over there,

00:08:55 it sounds like it’s talking from over there.

00:08:56 You can see, you know, the arm gestures

00:08:59 and stuff feel more natural.

00:09:01 You can have side conversations,

00:09:03 which is something that you can’t really do in Zoom.

00:09:04 I mean, I guess you can text someone over,

00:09:06 like out of band,

00:09:08 but if you’re actually sitting around a table with people,

00:09:12 you know, you can lean over

00:09:14 and whisper to the person next to you

00:09:15 and like have a conversation that you can’t,

00:09:17 you know, that you can’t really do

00:09:19 in just video communication.

00:09:23 So I think it’s interesting in what ways

00:09:27 some of these things already feel more real

00:09:29 than a lot of the technology that we have,

00:09:32 even when the visual fidelity isn’t quite there,

00:09:35 but I think it’ll get there over the next few years.
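
As a rough aside on the mechanics: the core of spatial audio is rendering a voice with cues that match where its source sits relative to your head. Below is a minimal Python sketch assuming only constant-power panning and inverse-distance falloff; real systems such as Workrooms presumably use HRTFs and room acoustics, and every name and constant here is illustrative.

```python
import math

def stereo_gains(listener_pos, listener_yaw, source_pos):
    """Toy spatial audio: pan a mono source left/right based on where it
    sits relative to the listener's facing direction, and attenuate with
    distance. Positions are (x, z) pairs in meters; yaw is in radians."""
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    distance = max(math.hypot(dx, dz), 0.1)      # avoid divide-by-zero up close
    azimuth = math.atan2(dx, dz) - listener_yaw  # angle to source vs. facing
    pan = max(-1.0, min(1.0, math.sin(azimuth))) # -1 = hard left, 1 = hard right
    left = math.cos((pan + 1) * math.pi / 4)     # constant-power pan law
    right = math.sin((pan + 1) * math.pi / 4)
    attenuation = 1.0 / distance                 # simple inverse-distance falloff
    return left * attenuation, right * attenuation

# A voice a couple of meters away, off to the listener's right:
print(stereo_gains((0.0, 0.0), 0.0, (2.0, 2.0)))  # right gain > left gain
```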

00:09:37 Now, I mean, you were asking about comparing that

00:09:38 to the true physical world,

00:09:40 not Zoom or something like that.

00:09:42 And there, I mean, I think you have feelings

00:09:45 of like temperature, you know, olfactory,

00:09:50 obviously touch, right, we’re working on haptic gloves,

00:09:54 you know, the sense that you wanna be able to,

00:09:56 you know, put your hands down

00:09:57 and feel some pressure from the table.

00:09:59 You know, all of these things

00:10:00 I think are gonna be really critical

00:10:01 to be able to keep up this illusion

00:10:04 that you’re in a world

00:10:06 and that you’re fully present in this world.

00:10:08 But I don’t know,

00:10:09 I think we’re gonna have a lot of these building blocks

00:10:11 within, you know, the next 10 years or so.

00:10:14 And even before that, I think it’s amazing

00:10:15 how much you’re just gonna be able to build with software

00:10:18 that sort of masks some of these things.

00:10:21 I realize I’m going long,

00:10:22 but I was told we have a few hours here.

00:10:25 So it’s a…

00:10:26 We’re here for five to six hours.

00:10:27 Yeah, so I mean, it’s, look,

00:10:28 I mean, that’s on the shorter end

00:10:30 of the congressional testimonies I’ve done.

00:10:32 But it’s, but, you know, one of the things

00:10:36 that we found with hand presence, right?

00:10:39 So the earliest VR, you just have the headset

00:10:42 and then, and that was cool, you could look around,

00:10:44 you feel like you’re in a place,

00:10:45 but you don’t feel like you’re really able to interact with it

00:10:47 until you have hands.

00:10:48 And then there was this big question

00:10:49 where once you got hands,

00:10:51 what’s the right way to represent them?

00:10:53 And initially, all of our assumptions were, okay,

00:10:58 when I look down and see my hands in the physical world,

00:11:00 I see an arm and it’s gonna be super weird

00:11:02 if you see, you know, just your hand.

00:11:06 But it turned out to not be the case

00:11:08 because there’s this issue with your arms,

00:11:09 which is like, what’s your elbow angle?

00:11:11 And if the elbow angle that we’re kind of interpolating

00:11:14 based on where your hand is and where your headset is

00:11:18 actually isn’t accurate,

00:11:19 it creates this very uncomfortable feeling

00:11:21 where it’s like, oh, like my arm is actually out like this,

00:11:24 but it’s like showing it in here.

00:11:25 And that actually broke the feeling of presence a lot more.

00:11:29 Whereas it turns out that if you just show the hands

00:11:31 and you don’t show the arms,

00:11:34 it actually is fine for people.

00:11:36 So I think that there’s a bunch

00:11:38 of these interesting psychological cues

00:11:41 where it’ll be more about getting the right details right.

00:11:44 And I think a lot of that will be possible

00:11:46 even over a few year period or a five year period.

00:11:49 And we won’t need like every single thing to be solved

00:11:52 to deliver this like full sense of presence.

00:11:54 Yeah, it’s a fascinating psychology question

00:11:56 of what is the essence

00:11:59 that makes in person conversation special?

00:12:04 It’s like emojis are able to convey emotion really well,

00:12:08 even though they’re obviously not photorealistic.

00:12:10 And so in that same way, just as you’re saying,

00:12:12 just showing the hands is able

00:12:14 to create a comfortable expression with your hands.

00:12:18 So I wonder what that is.

00:12:19 People in the world wars used to write letters

00:12:21 and you can fall in love with just writing letters.

00:12:24 You don’t need to see each other in person.

00:12:26 You can convey emotion.

00:12:27 You can convey depth of experience with just words.

00:12:32 So that’s, I think, a fascinating place

00:12:35 to explore psychology of like,

00:12:37 how do you find that intimacy?

00:12:39 Yeah, and the way that I come to all of this stuff is,

00:12:42 I basically studied psychology and computer science.

00:12:45 So all of the work that I do

00:12:47 is sort of at the intersection of those things.

00:12:49 I think most of the other big tech companies

00:12:52 are building technology for you to interact with.

00:12:55 What I care about is building technology

00:12:56 to help people interact with each other.

00:12:58 So I think it’s a somewhat different approach

00:12:59 than most of the other tech entrepreneurs

00:13:02 and big companies come at this from.

00:13:04 And a lot of the lessons

00:13:08 in terms of how I think about designing products

00:13:10 come from some just basic elements of psychology, right?

00:13:15 In terms of our brains,

00:13:19 you can compare it to the brains of other animals.

00:13:22 We’re very wired to specific things, facial expressions.

00:13:25 I mean, we’re very visual, right?

00:13:28 So compared to other animals,

00:13:29 I mean, that’s clearly the main sense

00:13:31 that most people have.

00:13:32 But there’s a whole part of your brain

00:13:35 that’s just kind of focused on reading facial cues.

00:13:38 So when we’re designing the next version of Quest

00:13:42 or the VR headset, a big focus for us is face tracking

00:13:45 and basically eye tracking so you can make eye contact,

00:13:48 which again, isn’t really something

00:13:50 that you can do over a video conference.

00:13:51 It’s sort of amazing how far video conferencing

00:13:55 has gotten without the ability to make eye contact, right?

00:13:58 It’s sort of a bizarre thing if you think about it.

00:14:00 You’re looking at someone’s face,

00:14:03 sometimes for an hour when you’re in a meeting

00:14:05 and you looking at their eyes, to them,

00:14:08 doesn’t look like you’re looking at their eyes.

00:14:11 You’re always looking past each other, I guess.

00:14:15 I guess you’re right.

00:14:15 You’re not sending that signal.

00:14:16 Well, you’re trying to.

00:14:17 Right, you’re trying to.

00:14:18 A lot of times, or at least I find myself,

00:14:19 I’m trying to look into the other person’s eyes.

00:14:21 But they don’t feel like you’re looking to their eyes.

00:14:23 So then the question is,

00:14:23 all right, am I supposed to look at the camera

00:14:25 so that way you can have a sensation

00:14:27 that I’m looking at you?

00:14:28 I think that that’s an interesting question.

00:14:30 And then with VR today,

00:14:33 even without eye tracking

00:14:35 and knowing what your eyes are actually looking at,

00:14:37 you can fake it reasonably well, right?

00:14:39 So you can look at where the head pose is.

00:14:42 And if it looks like I’m kind of looking

00:14:43 in your general direction,

00:14:44 then you can sort of assume

00:14:46 that maybe there’s some eye contact intended

00:14:48 and you can do it in a way where it’s like,

00:14:50 okay, maybe it’s not a fixated stare,

00:14:54 but it’s somewhat natural.

00:14:56 But once you have actual eye tracking,

00:14:58 you can do it for real.

00:15:00 And I think that that’s really important stuff.
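
A minimal sketch of the head-pose heuristic being described: without eye tracking, treat it as intended eye contact when the other person's head falls within a small cone around your head's forward direction. The function name, cone width, and vector format are assumptions for illustration; real eye tracking replaces the guess with measured gaze.

```python
import math

def looks_at(head_pos, head_forward, target_pos, cone_deg=15.0):
    """Infer 'eye contact' from head pose alone: is the target within a
    cone_deg half-angle cone around the head's forward direction?"""
    to_target = [t - h for t, h in zip(target_pos, head_pos)]
    norm = math.sqrt(sum(c * c for c in to_target)) or 1.0
    to_target = [c / norm for c in to_target]                 # unit vector
    cos_angle = sum(f * t for f, t in zip(head_forward, to_target))
    return cos_angle >= math.cos(math.radians(cone_deg))

# Headset at eye height facing down +z; other avatar slightly off-center:
print(looks_at((0, 1.6, 0), (0, 0, 1), (0.2, 1.6, 2.0)))  # True
```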

00:15:02 So when I think about Meta’s contribution to this field,

00:15:05 I have to say it’s not clear to me

00:15:06 that any of the other companies

00:15:08 that are focused on the Metaverse

00:15:11 or on virtual and augmented reality

00:15:13 are gonna prioritize putting these features in the hardware

00:15:15 because like everything, they’re trade offs, right?

00:15:18 I mean, it adds some weight to the device.

00:15:21 Maybe it adds some thickness.

00:15:22 You could totally see another company taking the approach

00:15:24 of let’s just make the lightest and thinnest thing possible.

00:15:27 But I want us to design the most human thing possible

00:15:31 that creates the richest sense of presence

00:15:33 because so much of human emotion and expression

00:15:37 comes from these like micro movements.

00:15:39 If I like move my eyebrow a millimeter,

00:15:41 you will notice and that like means something.

00:15:44 So the fact that we’re losing these signals

00:15:46 and a lot of communication I think is a loss.

00:15:49 So it’s not like, okay, there’s one feature

00:15:51 and you add this, then it all of a sudden

00:15:53 is gonna feel like we have real presence.

00:15:55 You can sort of look at how the human brain works

00:15:57 and how we express and kind of read emotions

00:16:01 and you can just build a roadmap of that,

00:16:04 of just what are the most important things

00:16:06 to try to unlock over a five to 10 year period

00:16:08 and just try to make the experience

00:16:10 more and more human and social.

00:16:12 When do you think would be a moment,

00:16:16 like a singularity moment for the Metaverse

00:16:19 where there’s a lot of ways to ask this question,

00:16:22 but people will have many or most

00:16:26 of their meaningful experiences

00:16:28 in the Metaverse versus the real world.

00:16:31 And actually it’s interesting to think about

00:16:33 the fact that a lot of people are having

00:16:35 the most important moments of their life

00:16:37 happen in the digital sphere,

00:16:39 especially now during COVID,

00:16:41 like even falling in love or meeting friends

00:16:45 or getting excited about stuff

00:16:46 that is happening on the 2D digital plane.

00:16:49 When do you think the Metaverse

00:16:50 will provide those experiences for a large number,

00:16:54 like a majority of the population?

00:16:54 Yeah, I think it’s a really good question.

00:16:57 There was someone, I read this piece

00:17:00 that framed this as a lot of people think

00:17:03 that the Metaverse is about a place,

00:17:06 but one definition of this is it’s about a time

00:17:10 when basically immersive digital worlds

00:17:12 become the primary way that we live our lives

00:17:17 and spend our time.

00:17:18 I think that that’s a reasonable construct.

00:17:20 And from that perspective,

00:17:21 I think you also just wanna look at this as a continuation

00:17:25 because it’s not like, okay,

00:17:27 we are building digital worlds,

00:17:28 but we don’t have that today.

00:17:29 I think you and I probably already live

00:17:32 a very large part of our life in digital worlds.

00:17:34 They’re just not 3D immersive virtual reality,

00:17:37 but I do a lot of meetings over video

00:17:39 or I spend a lot of time writing things over email

00:17:42 or WhatsApp or whatever.

00:17:44 So what is it gonna take to get there

00:17:46 for kind of the immersive presence version of this,

00:17:48 which I think is what you’re asking.

00:17:51 And for that, I think that there’s just a bunch

00:17:52 of different use cases.

00:17:55 And I think when you’re building technology,

00:18:00 I think a lot of it is just you’re managing this duality

00:18:05 where on the one hand,

00:18:06 you wanna build these elegant things that can scale

00:18:10 and have billions of people use them

00:18:12 and get value from them.

00:18:13 And then on the other hand,

00:18:14 you’re fighting this kind of ground game

00:18:17 where there are just a lot of different use cases

00:18:19 and people do different things

00:18:20 and you wanna be able to unlock them.

00:18:22 So the first ones that we basically went after

00:18:25 were gaming with Quest and social experiences.

00:18:30 And it goes back to when we started working

00:18:32 on virtual reality.

00:18:33 My theory at the time was basically

00:18:37 people thought about it as gaming,

00:18:39 but if you look at all computing platforms up to that point,

00:18:44 gaming is a huge part, it was a huge part of PCs,

00:18:47 it was a huge part of mobile,

00:18:49 but it was also very decentralized.

00:18:51 There wasn’t, for the most part,

00:18:54 one or two gaming companies.

00:18:55 There were a lot of gaming companies

00:18:57 and gaming is somewhat hits based.

00:18:58 I mean, we’re getting some games that have more longevity,

00:19:01 but in general, there were a lot of different games

00:19:05 out there.

00:19:06 But on PC and on mobile,

00:19:10 the companies that focused on communication

00:19:13 and social interaction,

00:19:15 there tended to be a smaller number of those

00:19:17 and that ended up being just as important of a thing

00:19:19 as all of the games that you did combined.

00:19:21 I think productivity is another area.

00:19:23 That’s obviously something

00:19:23 that we’ve historically been less focused on,

00:19:26 but I think it’s gonna be really important for us.

00:19:27 With Workrooms, do you mean productivity

00:19:29 in the collaborative aspect?

00:19:30 Yeah, I think that there’s a Workrooms aspect of this,

00:19:34 like a meeting aspect,

00:19:35 and then I think that there’s like a Word, Excel,

00:19:39 productivity, either you’re working or coding

00:19:42 or knowledge work as opposed to just meetings.

00:19:46 So you can kind of go through all these different use cases.

00:19:49 Gaming, I think we’re well on our way.

00:19:51 Social, I think we’re just the kind of preeminent company

00:19:56 that focuses on this.

00:19:57 And I think that that’s already on Quest becoming the,

00:20:00 if you look at the list of what are the top apps,

00:20:03 social apps are already number one, two, three.

00:20:06 So that’s kind of becoming a critical thing, but I don’t know.

00:20:10 I would imagine for someone like you,

00:20:12 it’ll be until we get a lot of the work things dialed in.

00:20:17 When this is just like much more adopted

00:20:20 and clearly better than Zoom for VC,

00:20:24 when if you’re doing your coding or your writing

00:20:27 or whatever it is in VR,

00:20:29 which it’s not that far off to imagine that

00:20:31 because pretty soon you’re just gonna be able

00:20:32 to have a screen that’s bigger than,

00:20:34 it’ll be your ideal setup and you can bring it with you

00:20:36 and put it on anywhere

00:20:37 and have your kind of ideal workstation.

00:20:39 So I think that there are a few things to work out on that,

00:20:42 but I don’t think that that’s more than five years off.

00:20:46 And then you’ll get a bunch of other things

00:20:48 that like aren’t even possible

00:20:50 or you don’t even think about using a phone

00:20:52 or PC for today, like fitness, right?

00:20:54 So, I mean, I know you’re, we were talking before

00:20:57 about how you’re into running

00:20:58 and like I’m really into a lot of things

00:21:00 around fitness as well,

00:21:02 different things in different places.

00:21:04 I got really into hydrofoiling recently

00:21:06 and surfing and I used to fence competitively.

00:21:12 I like run.

00:21:13 So, and you were saying that you were thinking

00:21:14 about trying different martial arts

00:21:16 and I tried to trick you and convince you

00:21:18 into doing Brazilian Jiu Jitsu.

00:21:19 Or you actually mentioned that that was one

00:21:21 you’re curious about and I don’t know.

00:21:23 Is that a trick?

00:21:24 Yeah, I don’t know.

00:21:26 We’re in the metaverse now.

00:21:27 Yeah, no, I took that seriously.

00:21:29 I thought that that was a real suggestion.

00:21:34 That would be an amazing chance

00:21:36 if we ever step on the mat together

00:21:37 and just like roll around.

00:21:39 I’ll show you some moves.

00:21:40 Well, give me a year to train and then we can do it.

00:21:43 You know, you’ve seen Rocky IV

00:21:44 where the Russian faces off the American.

00:21:46 I’m the Russian in this picture.

00:21:47 And then you’re the Rocky, the underdog

00:21:49 that gets to win in the end.

00:21:51 The idea of me as Rocky and like fighting is…

00:21:56 If he dies, he dies.

00:21:58 Sorry, I just had to.

00:22:00 I mean.

00:22:01 Anyway, yeah.

00:22:02 But I mean, a lot of aspects of fitness.

00:22:05 You know, I don’t know if you’ve tried supernatural

00:22:08 on Quest or…

00:22:10 So first of all, can I just comment on the fact

00:22:12 every time I played around with Quest 2,

00:22:15 I just, I get giddy every time I step into virtual reality.

00:22:18 So you mentioned productivity and all those kinds of things.

00:22:20 That’s definitely something I’m excited about,

00:22:23 but really I just love the possibilities

00:22:26 of stepping into that world.

00:22:28 Maybe it’s the introvert in me,

00:22:30 but it just feels like the most convenient way

00:22:34 to travel into worlds,

00:22:37 into worlds that are similar to the real world

00:22:40 or totally different.

00:22:41 It’s like Alice in Wonderland.

00:22:42 Just try out crazy stuff.

00:22:44 The possibilities are endless.

00:22:45 And I just, I personally am just love,

00:22:50 get excited for stepping in those virtual worlds.

00:22:53 So I’m a huge fan.

00:22:55 In terms of the productivity as a programmer,

00:22:58 I spend most of my day programming.

00:23:00 That’s really interesting also,

00:23:01 but then you have to develop the right IDEs.

00:23:04 You have to develop, like there has to be a threshold

00:23:07 where a large amount of the programming community

00:23:09 moves there, but the collaborative aspects

00:23:11 that are possible in terms of meetings,

00:23:14 in terms of when two coders are working together,

00:23:18 I mean, the possibilities there are super, super exciting.

00:23:21 I think that in building this, we sort of need to balance.

00:23:27 There are gonna be some new things

00:23:28 that you just couldn’t do before.

00:23:29 And those are gonna be the amazing experiences.

00:23:31 So teleporting to any place, right?

00:23:33 Whether it’s a real place or something that people made.

00:23:38 And I mean, some of the experiences

00:23:40 around how we can build stuff in new ways,

00:23:42 where a lot of the stuff that,

00:23:44 when I’m coding stuff, it’s like, all right,

00:23:46 you code it and then you build it

00:23:47 and then you see it afterwards.

00:23:48 But increasingly it’s gonna be possible to,

00:23:50 you’re in a world and you’re building the world

00:23:52 as you are in it and kind of manipulating it.

00:23:55 One of the things that we showed at our Inside the Lab

00:23:59 for recent artificial intelligence progress

00:24:02 is this Builder Bot program,

00:24:03 where now you can just talk to it and say,

00:24:07 hey, okay, I’m in this world,

00:24:08 like put some trees over there and it’ll do that.

00:24:10 And like, all right, put some bottles of water

00:24:13 on our picnic blanket and it’ll do that

00:24:17 and you’re in the world.

00:24:17 And I think there are gonna be new paradigms for coding.
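
A toy sketch of the interaction loop behind an assistant like Builder Bot: a spoken command becomes a structured edit to the scene you are standing in. The real system presumably relies on large speech and language models; this keyword-matching version, with entirely hypothetical names, only shows the shape of the loop.

```python
# Hypothetical command-to-scene-edit loop; all names are invented.
scene = []

def handle_command(text):
    text = text.lower()
    if "tree" in text:
        scene.append({"type": "tree", "position": "over there"})
    elif "water" in text:
        scene.append({"type": "water_bottle", "position": "picnic blanket"})
    else:
        print(f"Unrecognized command: {text!r}")

handle_command("Put some trees over there")
handle_command("Put some bottles of water on our picnic blanket")
print(scene)  # two objects now exist in the world you're standing in
```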

00:24:19 So yeah, there are gonna be some things

00:24:22 that I think are just pretty amazing,

00:24:24 especially the first few times that you do them,

00:24:26 but that you’re like, whoa,

00:24:28 like I’ve never had an experience like this.

00:24:30 But most of your life, I would imagine,

00:24:34 is not doing things that are amazing for the first time.

00:24:38 A lot of this in terms of,

00:24:39 I mean, just answering your question from before around,

00:24:42 what is it gonna take

00:24:42 before you’re spending most of your time in this?

00:24:45 Well, first of all, let me just say it as an aside,

00:24:48 the goal isn’t to have people spend a lot more time

00:24:50 in computing.

00:24:51 It’s to make it so that... I’m asking for myself.

00:24:52 Yeah, it’s to make it... When will I spend all my time in it?

00:24:54 Yeah, it’s to make computing more natural.

00:24:57 But I think you will spend most of your computing time

00:25:02 in this when it does the things

00:25:04 that you use computing for somewhat better.

00:25:07 So maybe having your perfect workstation

00:25:10 is a 5% improvement on your coding productivity.

00:25:15 Maybe it’s not like a completely new thing.

00:25:19 But I mean, look, if I could increase the productivity

00:25:21 of every engineer at Meta by 5%,

00:25:25 we’d buy those devices for everyone.

00:25:27 And I imagine a lot of other companies would too.

00:25:30 And that’s how you start getting to the scale

00:25:31 that I think makes this rival

00:25:34 some of the bigger computing platforms that exist today.
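
The 5% argument is easy to sanity-check with back-of-the-envelope arithmetic. Both numbers below are assumptions for illustration, not figures from the conversation:

```python
fully_loaded_cost = 300_000  # assumed annual cost of one engineer, USD
headset_cost = 500           # assumed cost of one device, USD

value_of_gain = 0.05 * fully_loaded_cost  # value of a 5% productivity gain
print(value_of_gain)                      # 15000.0 per engineer per year
print(value_of_gain / headset_cost)       # ~30x the device cost
```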

00:25:37 Let me ask you about identity.

00:25:38 We talked about the avatar.

00:25:40 How do you see identity in the Metaverse?

00:25:42 Should the avatar be tied to your identity

00:25:46 or can I be anything in the Metaverse?

00:25:49 Like, can I be whatever the heck I want?

00:25:52 Can I even be a troll?

00:25:53 So there’s exciting freeing possibilities

00:25:57 and there’s the darker possibilities too.

00:26:00 Yeah, I mean, I think that there’s gonna be a range, right?

00:26:03 So we’re working on, for expression and avatars,

00:26:10 on one end of the spectrum are kind of expressive

00:26:13 and cartoonish avatars.

00:26:14 And then on the other end of the spectrum

00:26:16 are photorealistic avatars.

00:26:18 And I just think the reality is

00:26:20 that there are gonna be different use cases

00:26:22 for different things.

00:26:23 And I guess there’s another axis.

00:26:25 So if you’re going from photorealistic to expressive,

00:26:28 there’s also like representing you directly

00:26:31 versus like some fantasy identity.

00:26:33 And I think that there are gonna be things

00:26:35 on all ends of that spectrum too, right?

00:26:37 So you’ll want photo, like in some experience,

00:26:41 you might wanna be like a photorealistic dragon, right?

00:26:44 Or if I’m playing Onward,

00:26:46 or just this military simulator game,

00:26:50 I think getting to be more photorealistic as a soldier

00:26:53 in that could enhance the experience.

00:26:57 There are times when I’m hanging out with friends

00:26:59 where I want them to know it’s me.

00:27:02 So a kind of cartoonish or expressive version of me is good.

00:27:06 But there are also experiences like,

00:27:09 VRChat does this well today,

00:27:11 where a lot of the experience is kind of dressing up

00:27:14 and wearing a fantastical avatar

00:27:17 that’s almost like a meme or is humorous.

00:27:19 So you come into an experience

00:27:21 and it’s almost like you have like a built in icebreaker

00:27:24 because like you see people and you’re just like,

00:27:27 all right, I’m cracking up at what you’re wearing

00:27:29 because that’s funny.

00:27:30 And it’s just like, where’d you get that?

00:27:31 Or, oh, you made that?

00:27:32 That’s, it’s awesome.

00:27:35 Whereas, okay, if you’re going into a work meeting,

00:27:38 maybe a photorealistic version of your real self

00:27:41 is gonna be the most appropriate thing for that.

00:27:43 So I think the reality is there aren’t going to be,

00:27:47 it’s not just gonna be one thing.

00:27:50 You know, my own sense of kind of how you wanna

00:27:54 express identity online has sort of evolved over time.

00:27:56 And that, you know, early days in Facebook,

00:27:58 I thought, okay, people are gonna have one identity.

00:28:00 And now I think that’s clearly not gonna be the case.

00:28:02 I think you’re gonna have all these different things

00:28:04 and there’s utility in being able to do different things.

00:28:07 So some of the technical challenges

00:28:10 that I’m really interested in around it

00:28:12 are how do you build the software

00:28:14 to allow people to seamlessly go between them?

00:28:17 So say, so you could view them

00:28:19 as just completely discrete points on a spectrum,

00:28:25 but let’s talk about the metaverse economy for a second.

00:28:28 Let’s say I buy a digital shirt

00:28:31 for my photorealistic avatar, which by the way,

00:28:34 I think by the time we’re spending a lot of time

00:28:36 in the metaverse doing a lot of our work meetings

00:28:38 in the metaverse and et cetera,

00:28:40 I would imagine that the economy around virtual clothing

00:28:42 as an example is going to be quite big.

00:28:44 Why wouldn’t I spend almost as much money

00:28:47 in investing in my appearance or expression

00:28:49 for my photorealistic avatar for meetings

00:28:52 as I would for whatever I’m gonna wear in my video chat.

00:28:55 But the question is, okay, so you,

00:28:56 let’s say you buy some shirt

00:28:57 for your photorealistic avatar.

00:28:59 Wouldn’t it be cool if there was a way

00:29:02 to basically translate that into a more expressive thing

00:29:07 for your kind of cartoonish or expressive avatar?

00:29:11 And there are multiple ways to do that.

00:29:12 You can view them as two discrete points and okay,

00:29:14 maybe if a designer sells one thing,

00:29:18 then it actually comes in a pack and there’s two

00:29:19 and you can use either one on that,

00:29:22 but I actually think this stuff might exist more

00:29:24 as a spectrum in the future.

00:29:26 And that’s where I do think the direction

00:29:29 of some of the AI advances that are happening

00:29:33 will help, especially stuff around like style transfer,

00:29:35 being able to take a piece of art or express something

00:29:39 and say, okay, paint me this photo in the style of Gauguin

00:29:44 or whoever it is that you’re interested in.

00:29:49 Take this shirt and put it in the style

00:29:51 of what I’ve designed for my expressive avatar.

00:29:55 I think that’s gonna be pretty compelling.
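
For readers curious about the mechanics: in the classic formulation of neural style transfer (Gatys et al., 2015), "style" is summarized by Gram matrices of CNN feature maps, and generation minimizes the mismatch with a style target. A minimal NumPy sketch, with random arrays standing in for real network activations:

```python
import numpy as np

def gram_matrix(features):
    """Gram matrix of a (channels, height, width) activation tensor;
    it captures which feature channels co-occur, i.e. texture/style."""
    c, h, w = features.shape
    flat = features.reshape(c, h * w)
    return flat @ flat.T / (c * h * w)

def style_loss(generated, style):
    # Match the generated image's Gram matrix to the style target's.
    return float(np.mean((gram_matrix(generated) - gram_matrix(style)) ** 2))

rng = np.random.default_rng(0)
generated = rng.standard_normal((64, 32, 32))  # stand-in for CNN features
style_ref = rng.standard_normal((64, 32, 32))
print(style_loss(generated, style_ref))
```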

00:29:56 And so the fashion, you might be buying like a generator,

00:30:00 like a closet that generates a style.

00:30:03 And then like with the GANs,

00:30:05 you’ll be able to infinitely generate outfits

00:30:08 thereby making it, so the reason I wear the same thing

00:30:10 all the time is I don’t like choice.

00:30:12 You’ve talked about the same thing,

00:30:15 but now you don’t even have to choose.

00:30:16 Your closet generates your outfit for you every time.

00:30:19 So you have to live with the outfit it generates.

00:30:23 I mean, you could do that, although,

00:30:25 no, I think that that’s, I think some people will,

00:30:27 but I think like, I think there’s going to be a huge aspect

00:30:31 of just people doing creative commerce here.

00:30:35 So I think that there is going to be a big market

00:30:37 around people designing digital clothing.

00:30:41 But the question is, if you’re designing digital clothing,

00:30:43 do you need to design, if you’re the designer,

00:30:44 do you need to make it for each kind of specific discrete

00:30:48 point along a spectrum, or are you just designing it

00:30:51 for kind of a photo realistic case or an expressive case,

00:30:54 or can you design one

00:30:55 and have it translate across these things?

00:30:57 If I buy a style from a designer who I care about,

00:31:01 and now I’m a dragon, is there a way to morph that

00:31:04 so it goes on the dragon in a way that makes sense?

00:31:07 And that I think is an interesting AI problem

00:31:09 because you’re probably not going to make it

00:31:10 so that designers have to go design for all those things.

00:31:14 But the more useful the digital content is that you buy

00:31:17 in a lot of uses, in a lot of use cases,

00:31:21 the more that economy will just explode.

00:31:23 And that’s a lot of what all of the,

00:31:28 we were joking about NFTs before,

00:31:29 but I think a lot of the promise here is that

00:31:32 if the digital goods that you buy are not just tied

00:31:35 to one platform or one use case,

00:31:37 they end up being more valuable,

00:31:38 which means that people are more willing

00:31:39 and more likely to invest in them,

00:31:41 and that just spurs the whole economy.

00:31:44 But the question is, that’s a fascinating positive aspect,

00:31:47 but the potential negative aspect is that

00:31:50 you can have people concealing their identity

00:31:52 in order to troll or even not people, bots.

00:31:57 So how do you know in the metaverse

00:31:58 that you’re talking to a real human or an AI

00:32:02 or a well intentioned human?

00:32:03 Is that something you think about,

00:32:04 something you’re concerned about?

00:32:06 Well, let’s break that down into a few different cases.

00:32:10 I mean, because knowing that you’re talking to someone

00:32:11 who has good intentions is something that I think

00:32:13 is not even solved pretty much anywhere.

00:32:17 But I mean, if you’re talking to someone who’s a dragon,

00:32:20 I think it’s pretty clear that they’re not representing

00:32:22 themselves as a person.

00:32:23 I think probably the most pernicious thing

00:32:25 that you want to solve for is,

00:32:30 I think probably one of the scariest ones is

00:32:32 how do you make sure that someone isn’t impersonating you?

00:32:35 So, okay, you’re in a future version of this conversation,

00:32:39 and we have photorealistic avatars,

00:32:41 and we’re doing this in work rooms

00:32:43 or whatever the future version of that is,

00:32:44 and someone walks in who looks like me.

00:32:48 How do you know that that’s me?

00:32:50 And one of the things that we’re thinking about

00:32:54 is it’s still a pretty big AI project

00:32:57 to be able to generate photorealistic avatars

00:32:59 that basically can like,

00:33:00 they work like these codecs of you, right?

00:33:03 So you kind of have a map from your headset

00:33:06 and whatever sensors of what your body’s actually doing,

00:33:08 and it takes the model and it kind of displays it in VR.

00:33:11 But there’s a question, which is,

00:33:12 should there be some sort of biometric security

00:33:15 so that when I put on my VR headset

00:33:18 or I’m going to go use that avatar,

00:33:20 I need to first prove that I am that?

00:33:24 And I think you probably are gonna want something like that.

00:33:26 So as we’re developing these technologies,

00:33:31 we’re also thinking about the security for things like that

00:33:34 because people aren’t gonna wanna be impersonated.

00:33:37 That’s a huge security issue.
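
A hypothetical sketch of what such a biometric gate could look like: before a photorealistic avatar can be driven, compare a live biometric embedding against the one enrolled by the avatar's owner. The embedding source, names, and threshold are all invented for illustration.

```python
import numpy as np

def may_use_avatar(live_embedding, enrolled_embedding, threshold=0.85):
    """Gate avatar use on cosine similarity between a live biometric
    embedding and the owner's enrolled template. All details assumed."""
    a = live_embedding / np.linalg.norm(live_embedding)
    b = enrolled_embedding / np.linalg.norm(enrolled_embedding)
    return float(a @ b) >= threshold

rng = np.random.default_rng(1)
enrolled = rng.standard_normal(128)
noisy_rescan = enrolled + 0.1 * rng.standard_normal(128)   # same person
print(may_use_avatar(noisy_rescan, enrolled))              # True
print(may_use_avatar(rng.standard_normal(128), enrolled))  # False: impostor
```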

00:33:41 Then you just get the question

00:33:42 of people hiding behind fake accounts

00:33:46 to do malicious things,

00:33:48 which is not gonna be unique to the metaverse,

00:33:51 although certainly in an environment

00:33:56 where it’s more immersive

00:33:57 and you have more of a sense of presence,

00:33:58 it could be more painful.

00:34:01 But this is obviously something

00:34:03 that we’ve just dealt with for years

00:34:06 in social media and the internet more broadly.

00:34:08 And there, I think there have been a bunch of tactics

00:34:13 that I think we’ve just evolved to,

00:34:17 we’ve built up these different AI systems

00:34:20 to basically get a sense of,

00:34:21 is this account behaving in the way that a person would?

00:34:26 And it turns out,

00:34:28 so in all of the work that we’ve done around,

00:34:31 we call it community integrity

00:34:33 and it’s basically like policing harmful content

00:34:36 and trying to figure out where to draw the line.

00:34:38 And there are all these like really hard

00:34:39 and philosophical questions around like,

00:34:41 where do you draw the line on some of this stuff?

00:34:42 And the thing that I’ve kind of found the most effective

00:34:47 is as much as possible trying to figure out

00:34:51 who are the inauthentic accounts

00:34:53 or where are the accounts that are behaving

00:34:55 in an overall harmful way at the account level,

00:34:58 rather than trying to get into like policing

00:35:00 what they’re saying, right?

00:35:01 Which I think the metaverse is gonna be even harder

00:35:03 because the metaverse I think will have more properties of,

00:35:07 it’s almost more like a phone call, right?

00:35:09 Or it’s not like I post a piece of content

00:35:12 and is that piece of content good or bad?

00:35:14 So I think more of this stuff will have to be done

00:35:16 at the level of the account.

00:35:19 But this is the area where,

00:35:21 between the kind of counter intelligence teams

00:35:27 that we built up inside the company

00:35:28 and like years of building just different AI systems

00:35:33 to basically detect what is a real account and what isn’t.

00:35:36 I’m not saying we’re perfect,

00:35:37 but like this is an area where I just think

00:35:39 we are like years ahead of basically anyone else

00:35:43 in the industry in terms of having built those capabilities.

00:35:48 And I think that that just is gonna be incredibly important

00:35:50 for this next wave of things.
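
To make the account-level idea concrete, here is a toy scoring function over behavioral features. The features, scales, and weights are invented for illustration; production integrity systems combine far more signals with learned models.

```python
import math

def inauthenticity_score(posts_per_hour, contacts_per_day, account_age_days):
    """Score how bot-like an account *behaves*, independent of what it says."""
    features = [
        (posts_per_hour, 40.0),                 # bursty, inhuman posting rates
        (contacts_per_day, 200.0),              # mass outreach to strangers
        (1.0 / max(account_age_days, 1), 0.5),  # brand-new accounts are riskier
    ]
    z = sum(value / scale for value, scale in features)
    return 1.0 / (1.0 + math.exp(-4 * (z - 1.0)))  # squash to (0, 1)

print(inauthenticity_score(0.5, 10, 900))  # low: behaves like a person
print(inauthenticity_score(120, 800, 2))   # high: behaves like a bot farm
```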

00:35:51 And like you said, on a technical level,

00:35:53 on a philosophical level,

00:35:54 it’s an incredibly difficult problem to solve.

00:35:59 By the way, I would probably like to open source my avatar

00:36:03 so there could be like millions of Lexes walking around

00:36:05 just like an army.

00:36:07 Like Agent Smith?

00:36:08 Agent Smith, yeah, exactly.

00:36:10 So the Unity ML folks built a copy of me

00:36:16 and they sent it to me.

00:36:18 So there’s a person running around

00:36:20 and I’ve just been doing reinforcement learning on it.

00:36:22 I was gonna release it

00:36:25 because just to have sort of like thousands of Lexes

00:36:29 doing reinforcement.

00:36:31 So they fall over naturally,

00:36:32 they have to learn how to like walk around and stuff.

00:36:34 So I love that idea,

00:36:36 this tension between biometric security,

00:36:39 you want to have one identity,

00:36:40 but then certain avatars, you might have to have many.

00:36:43 I don’t know which is better security,

00:36:45 sort of flooding the world with Lexes

00:36:48 and thereby achieving security

00:36:49 or really being protective of your identity.

00:36:51 I have to ask you a security question actually.

00:36:53 Well, how does flooding the world with Lexes help me know

00:36:56 in our conversation that I’m talking to the real Lex?

00:36:59 I completely destroy the trust

00:37:01 in all my relationships then, right?

00:37:03 If I flood,

00:37:04 cause then it’s, yeah, that.

00:37:07 I think that one’s not gonna work that well for you.

00:37:09 It’s not gonna work that well for the original copy.

00:37:11 It probably fits some things.

00:37:13 Like if you’re a public figure

00:37:14 and you’re trying to have a bunch of,

00:37:18 if you’re trying to show up

00:37:19 in a bunch of different places in the future,

00:37:21 you’ll be able to do that in the metaverse.

00:37:23 So that kind of replication I think will be useful.

00:37:26 But I do think that you’re gonna want a notion of like,

00:37:29 I am talking to the real one.

00:37:31 Yeah.

00:37:32 Yeah, especially if the fake ones start outperforming you

00:37:35 and all your private relationships

00:37:37 and then you’re left behind.

00:37:38 I mean, that’s a serious concern I have with clones.

00:37:41 Again, the things I think about.

00:37:43 Okay, so I recently got, I use QNAP NAS storage.

00:37:48 So just storage for video and stuff.

00:37:50 And I recently got hacked.

00:37:51 This is the first time for me with ransomware.

00:37:53 It’s not me personally, it’s all QNAP devices.

00:37:58 So the question that people have

00:38:00 is about security in general.

00:38:03 Because I was doing a lot of the right things

00:38:05 in terms of security and nevertheless,

00:38:06 ransomware basically disabled my device.

00:38:10 Is that something you think about?

00:38:12 What are the different steps you could take

00:38:13 to protect people’s data on the security front?

00:38:16 I think that there’s different solutions for,

00:38:21 and strategies where it makes sense to have stuff

00:38:23 kind of put behind a fortress, right?

00:38:25 So the centralized model versus the decentralized one.

00:38:30 Then I think both have strengths and weaknesses.

00:38:32 So I think anyone who says, okay,

00:38:33 just decentralize everything, that’ll make it more secure.

00:38:36 I think that that’s tough because,

00:38:38 I mean, the advantage of something like encryption

00:38:42 is that we run the largest encrypted service

00:38:46 in the world with WhatsApp.

00:38:47 And we’re one of the first to roll out

00:38:49 a multi platform encryption service.

00:38:52 And that’s something that I think was a big advance

00:38:55 for the industry.

00:38:57 And one of the promises that we can basically make

00:38:59 because of that, our company doesn’t see

00:39:02 when you’re sending an end-to-end

00:39:04 encrypted message,

00:39:05 what the content is of what you’re sharing.

00:39:07 So that way, if someone hacks Meta servers,

00:39:11 they’re not gonna be able to access the WhatsApp message

00:39:14 that you’re sending to your friend.

00:39:16 And that I think matters a lot to people

00:39:19 because obviously if someone is able to compromise

00:39:21 a company’s servers and that company has hundreds

00:39:23 of millions or billions of people,

00:39:25 then that ends up being a very big deal.
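
A minimal sketch of the end-to-end property being described, using the PyNaCl library (`pip install pynacl`). The point is only that the relay server never holds a key that can open the ciphertext; this is a concept illustration, not WhatsApp's actual protocol, which is based on the Signal protocol with key ratcheting.

```python
from nacl.public import Box, PrivateKey

alice_key = PrivateKey.generate()  # private key never leaves Alice's phone
bob_key = PrivateKey.generate()    # private key never leaves Bob's phone

# Each side needs only the *public* half of the other's key pair.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"see you at 8")
# ...the server stores and forwards `ciphertext` but cannot read it...
print(Box(bob_key, alice_key.public_key).decrypt(ciphertext))  # b'see you at 8'
```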

00:39:27 The flip side of that is, okay,

00:39:29 all the content is on your phone.

00:39:32 Are you following security best practices on your phone?

00:39:35 If you lose your phone, all your content is gone.

00:39:38 So that’s an issue.

00:39:39 Maybe you go back up your content from WhatsApp

00:39:42 or some other service in an iCloud or something,

00:39:45 but then you’re just at Apple’s whims about,

00:39:47 are they gonna go turn over the data to some government

00:39:51 or are they gonna get hacked?

00:39:53 So a lot of the time it is useful to have data

00:39:57 in a centralized place too because then you can train

00:40:00 systems that can just do much better personalization.

00:40:04 I think that in a lot of cases, centralized systems

00:40:08 can offer, especially if you’re a serious company,

00:40:13 you’re running the state of the art stuff

00:40:16 and you have red teams attacking your own stuff

00:40:19 and you’re putting out bounty programs

00:40:24 and trying to attract some of the best hackers in the world

00:40:26 to go break into your stuff all the time.

00:40:27 So any system is gonna have security issues,

00:40:30 but I think the best way forward is to basically try

00:40:34 to be as aggressive and open about hardening

00:40:36 the systems as possible, not trying to kind of hide

00:40:39 and pretend that there aren’t gonna be issues,

00:40:40 which I think is over time why a lot of open source systems

00:40:43 have gotten relatively more secure is because they’re open

00:40:46 and, rather than pretending that there aren’t

00:40:48 gonna be issues, just people surface them quicker.

00:40:50 So I think you want to adopt that approach as a company

00:40:53 and just constantly be hardening yourself.

00:40:56 Trying to stay one step ahead of the attackers.

00:41:01 It’s an inherently adversarial space.

00:41:03 I think security is interesting

00:41:07 because of the different kinds of threats

00:41:09 that we’ve managed over the last five years,

00:41:11 there are ones where basically the adversaries

00:41:15 keep on getting better and better.

00:41:16 So trying to kind of interfere with security

00:41:21 is certainly one area of this.

00:41:23 If you have nation states that are trying

00:41:24 to interfere in elections or something,

00:41:27 they’re kind of evolving their tactics.

00:41:29 Whereas on the other hand, I don’t want to be too simplistic

00:41:32 about it, but if someone is saying something hateful,

00:41:36 people usually aren’t getting smarter and smarter

00:41:38 about how they say hateful things.

00:41:40 So maybe there’s some element of that,

00:41:42 but it’s a very small dynamic compared

00:41:44 to how advanced attackers in some of these other places

00:41:48 get over time.

00:41:49 I believe most people are good,

00:41:51 so they actually get better over time

00:41:53 in being less hateful

00:41:55 because they realize it’s not fun being hateful.

00:42:00 That’s at least the belief I have.

00:42:01 But first, bathroom break.

00:42:04 Sure, okay.

00:42:06 So we’ll come back to AI,

00:42:08 but let me ask some difficult questions now.

00:42:11 Social Dilemma is a popular documentary

00:42:13 that raised concerns about the effects

00:42:15 of social media on society.

00:42:17 You responded with a point by point rebuttal titled,

00:42:20 What the Social Dilemma Gets Wrong.

00:42:23 People should read that.

00:42:24 I would say the key point they make

00:42:26 is because social media is funded by ads,

00:42:29 algorithms want to maximize attention and engagement

00:42:33 and an effective way to do so is to get people angry

00:42:38 at each other, increase division and so on.

00:42:40 Can you steel man their criticisms and arguments

00:42:44 that they make in the documentary

00:42:46 as a way to understand the concern

00:42:48 and as a way to respond to it?

00:42:53 Well, yeah, I think that’s a good conversation to have.

00:42:56 I don’t happen to agree with the conclusions

00:43:00 and I think that they make a few assumptions

00:43:02 that are just very big jumps

00:43:06 that I don’t think are reasonable to make.

00:43:08 But I understand overall why people would be concerned

00:43:13 that our business model and ads in general,

00:43:19 we do make more money

00:43:20 as people use the service more in general, right?

00:43:23 So as a kind of basic assumption, okay,

00:43:26 do we have an incentive for people to build a service

00:43:29 that people use more?

00:43:31 Yes, on a lot of levels.

00:43:32 I mean, we think what we’re doing is good.

00:43:34 So we think that if people are finding it useful,

00:43:37 they’ll use it more.

00:43:38 Or if you just look at it as this sort of,

00:43:41 if the only thing we cared about is money,

00:43:43 which it’s not, for anyone who knows me,

00:43:46 but okay, we’re a company.

00:43:47 So let’s say you just kind of simplified it down to that,

00:43:51 then would we want people to use the services more?

00:43:53 Yes, and then you get to the second question,

00:43:57 which is does kind of getting people agitated

00:44:02 make them more likely to use the services more?

00:44:07 And I think from looking at other media in the world,

00:44:12 especially TV, and there’s the old news adage,

00:44:17 if it bleeds, it leads.

00:44:18 Like I think that this is,

00:44:20 there are a bunch of reasons why someone might think

00:44:25 that that kind of provocative content

00:44:30 would be the most engaging.

00:44:32 Now, what I’ve always found is two things.

00:44:35 One is that what grabs someone’s attention in the near term

00:44:39 is not necessarily something

00:44:40 that they’re going to appreciate having seen

00:44:43 or going to be the best over the long term.

00:44:45 So I think what a lot of people get wrong

00:44:47 is that I’m not building this company

00:44:50 to make the most money or get people to spend the most time

00:44:53 on this in the next quarter or the next year.

00:44:55 I’ve been doing this for 17 years at this point,

00:44:58 and I’m still relatively young,

00:45:00 and I have a lot more that I wanna do

00:45:02 over the coming decades.

00:45:03 So I think that it’s too simplistic to say,

00:45:08 hey, this might increase time in the near term,

00:45:11 therefore, it’s what you’re gonna do.

00:45:13 Because I actually think a deeper look

00:45:15 at kind of what my incentives are,

00:45:17 the incentives of a company

00:45:18 that are focused on the long term,

00:45:20 is to basically do what people

00:45:22 are gonna find valuable over time,

00:45:24 not what is gonna draw people’s attention today.

00:45:26 The other thing that I’d say is that,

00:45:29 I think a lot of times people look at this

00:45:31 from the perspective of media

00:45:34 or kind of information or civic discourse,

00:45:37 but one other way of looking at this is just that,

00:45:40 okay, I’m a product designer, right?

00:45:42 Our company, we build products,

00:45:45 and a big part of building a product

00:45:47 is not just the function and utility

00:45:49 of what you’re delivering,

00:45:50 but the feeling of how it feels, right?

00:45:52 And we spend a lot of time talking about virtual reality

00:45:55 and how the kind of key aspect of that experience

00:45:58 is the feeling of presence, which it’s a visceral thing.

00:46:01 It’s not just about the utility that you’re delivering,

00:46:03 it’s about like the sensation.

00:46:05 And similarly, I care a lot about how people feel

00:46:10 when they use our products,

00:46:11 and I don’t want to build products that make people angry.

00:46:15 I mean, that’s like not, I think,

00:46:17 what we’re here on this earth to do,

00:46:18 to build something that people spend a bunch of time doing

00:46:22 and it just kind of makes them angrier at other people.

00:46:24 I mean, I think that that’s not good.

00:46:26 That’s not what I think would be

00:46:30 sort of a good use of our time

00:46:32 or a good contribution to the world.

00:46:33 So, okay, it’s like people, they tell us

00:46:36 on a per content basis, does this thing,

00:46:39 do I like it?

00:46:40 Do I love it?

00:46:40 Does it make me angry?

00:46:41 Does it make me sad?

00:46:42 And based on that, we choose to basically show content

00:46:47 that makes people angry less,

00:46:49 because of course, if you’re designing a product

00:46:52 and you want people to be able to connect

00:46:56 and feel good over a long period of time,

00:46:59 then that’s naturally what you’re gonna do.
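
A toy version of the ranking idea described here: blend predicted or surveyed reactions into a score that down-weights content likely to make people angry. The weights are made up for illustration; real feed ranking blends many more predicted outcomes.

```python
def ranking_score(p_like, p_love, p_angry, p_sad):
    """Combine per-content reaction probabilities into one ranking score,
    penalizing anger more heavily than a plain like is rewarded."""
    return 1.0 * p_like + 1.5 * p_love - 2.0 * p_angry - 1.0 * p_sad

print(ranking_score(p_like=0.30, p_love=0.10, p_angry=0.02, p_sad=0.01))  # ~0.40
print(ranking_score(p_like=0.25, p_love=0.05, p_angry=0.40, p_sad=0.10))  # ~-0.58
```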

00:47:02 So, I don’t know, I think overall,

00:47:06 I understand at a high level,

00:47:10 if you’re not thinking too deeply about it,

00:47:13 why that argument might be appealing.

00:47:16 But I just think if you actually look

00:47:19 at what our real incentives are,

00:47:20 not just like if we were trying to optimize

00:47:25 for the next week,

00:47:26 but like as people working on this,

00:47:28 like why are we here?

00:47:30 And I think it’s pretty clear

00:47:32 that that’s not actually how you would wanna

00:47:34 design the system.

00:47:35 I guess one other thing that I’d say is that,

00:47:37 while we’re focused on the ads business model,

00:47:40 I do think it’s important to note that a lot

00:47:43 of these issues are not unique to ads.

00:47:45 I mean, so take like a subscription news business model,

00:47:47 for example, I think that has just as many

00:47:50 potential pitfalls.

00:47:53 Maybe if someone’s paying for a subscription,

00:47:55 you don’t get paid per piece of content that they look at,

00:47:57 but say, for example, I think a bunch

00:48:02 of the partisanship that we see could potentially

00:48:06 be made worse by these kinds of partisan

00:48:12 news organizations that basically sell subscriptions

00:48:15 and are only gonna get people on one side

00:48:17 to subscribe to them.

00:48:19 So their incentive is not to print content

00:48:22 or produce content that’s kind of centrist

00:48:26 or down the line either.

00:48:27 I bet that what a lot of them find is that

00:48:30 if they produce stuff that’s kind of more polarizing

00:48:32 or more partisan, then that is what gets

00:48:35 them more subscribers.

00:48:36 So I think that this stuff is all,

00:48:40 there’s no perfect business model.

00:48:41 Everything has pitfalls.

00:48:44 The thing that I think is great about advertising

00:48:46 is it makes it so the consumer service is free,

00:48:48 which if you believe that everyone should have a voice

00:48:50 and everyone should be able to connect,

00:48:52 then that’s a great thing, as opposed to building

00:48:55 a luxury service that not everyone can afford.

00:48:57 But look, every business model, you have to be careful

00:49:00 about how you’re implementing what you’re doing.

00:49:02 You responded to a few things there.

00:49:04 You spoke to the fact that there is a narrative

00:49:07 of malevolence, like you're leaning into

00:49:12 making people angry just because it makes more money

00:49:15 in the short term, that kind of thing.

00:49:16 So you responded to that.

00:49:17 But there’s also kind of reality of human nature.

00:49:22 Just like you spoke about, there’s fights,

00:49:25 arguments we get in and we don’t like ourselves afterwards,

00:49:28 but we got into them anyway.

00:49:30 So our long-term growth, I believe for most of us,

00:49:34 has to do with learning, challenging yourself,

00:49:38 improving, being kind to each other,

00:49:40 finding a community of people that you connect with

00:49:47 on a real human level, all that kind of stuff.

00:49:50 But it does seem when you look at social media

00:49:54 that a lot of fights break out,

00:49:56 a lot of arguments break out,

00:49:58 a lot of viral content ends up being sort of outrage

00:50:03 in one direction or the other.

00:50:04 And so it’s easy from that to infer the narrative

00:50:08 that social media companies are letting

00:50:11 this outrage become viral.

00:50:13 And so they’re increasing the division in the world.

00:50:16 I mean, perhaps you can comment on that

00:50:18 or further, how can you be,

00:50:21 how can you push back on this narrative?

00:50:25 How can you be transparent about this battle?

00:50:28 Because I think it’s not just motivation or financials,

00:50:33 it’s a technical problem too,

00:50:36 which is how do you improve long-term wellbeing

00:50:41 of human beings?

00:50:43 I think that going through some of the design decisions

00:50:47 would be a good conversation.

00:50:49 But first, I actually think

00:50:51 you acknowledged

00:50:54 that that narrative is somewhat anecdotal.

00:50:56 And I think it’s worth grounding this conversation

00:50:59 in the actual research that has been done on this,

00:51:02 which by and large finds that social media

00:51:07 is not a large driver of polarization, right?

00:51:10 And, I mean, there’s been a number of economists

00:51:14 and social scientists and folks who have studied this.

00:51:18 And a lot of polarization varies around the world.

00:51:21 Social media is basically in every country;

00:51:23 Facebook's in pretty much every country

00:51:24 except for China and maybe North Korea.

00:51:27 And you see different trends in different places

00:51:32 where in a lot of countries polarization is declining,

00:51:37 in some it’s flat, in the US it’s risen sharply.

00:51:41 So the question is, what are the unique phenomena

00:51:44 in the different places?

00:51:45 And I think for the people who are trying to say,

00:51:47 hey, social media is the thing that’s doing this.

00:51:50 I think that that clearly doesn’t hold up

00:51:52 because social media is a phenomenon

00:51:54 that is pretty much equivalent

00:51:56 in all of these different countries.

00:51:57 And you have researchers like this economist at Stanford,

00:52:00 Matthew Gentzkow, who has just written at length about this.

00:52:05 And there's a bunch of books by political scientists

00:52:10 and folks like Ezra Klein; Why We're Polarized

00:52:13 basically goes through this decades-long analysis in the US.

00:52:17 Before I was born, basically talking about

00:52:19 some of the forces in kind of partisan politics

00:52:22 and Fox News and different things

00:52:25 that predate the internet in a lot of ways

00:52:27 that I think are likely larger contributors.

00:52:30 So to the contrary on this,

00:52:32 not only is it pretty clear that social media

00:52:35 is not a major contributor,

00:52:37 but most of the academic studies that I’ve seen

00:52:40 actually show that social media use

00:52:42 is correlated with lower polarization.

00:52:45 And Gentzkow, the same person who just did the study

00:52:48 that I cited about longitudinal polarization

00:52:51 across different countries,

00:52:54 also did a study that basically showed

00:52:57 that if you looked after the 2016 election in the US,

00:53:02 the voters who were the most polarized

00:53:05 were actually the ones who were not on the internet.

00:53:07 So, and there have been recent other studies,

00:53:10 I think in Europe and around the world,

00:53:12 basically showing that as people stop using social media,

00:53:16 they tend to get more polarized.

00:53:19 Then there’s a deeper analysis around,

00:53:21 okay, well, polarization actually isn’t even one thing.

00:53:24 'Cause, you know, having different opinions on something

00:53:26 isn't by itself bad, I don't think.

00:53:28 What people who study this say is most problematic

00:53:33 is what they call affective polarization,

00:53:35 which is basically:

00:53:37 do you have negative feelings towards people

00:53:40 of another group?

00:53:41 And the way that a lot of scholars study this

00:53:43 is they basically ask a group,

00:53:46 would you let your kids marry someone of group X?

00:53:50 Whatever the groups are that you’re worried

00:53:53 that someone might have negative feelings towards.

00:53:55 And in general, use of social media

00:53:58 has corresponded to decreases

00:53:59 in that kind of affective polarization.
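
As a worked example of that survey measure, here is a small Python sketch; the responses are made up, and real studies use much larger samples and more careful instruments:

    # True means the respondent is comfortable with their child marrying
    # someone from the other group; False is the negative-feelings signal.
    responses = [True, False, True, True, False, False, True, True]

    def affective_polarization(comfortable):
        """Share of respondents uncomfortable with out-group marriage,
        a common survey proxy for affective polarization."""
        return 1 - sum(comfortable) / len(comfortable)

    print(affective_polarization(responses))  # 0.375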

00:54:01 So I just wanna, I think we should talk

00:54:04 through the design decisions and how we handle

00:54:07 the kind of specific pieces of content,

00:54:10 but overall, I think it’s just worth grounding

00:54:13 that discussion in the research that’s existed

00:54:15 that I think overwhelmingly shows

00:54:17 that the mainstream narrative around this

00:54:19 is just not right.

00:54:21 But the narrative does take hold

00:54:24 and it’s compelling to a lot of people.

00:54:27 There’s another question I’d like to ask you on this.

00:54:31 I was looking at various polls and saw that you’re

00:54:34 one of the most disliked tech leaders today,

00:54:38 54% unfavorable rating.

00:54:41 Elon Musk is 23%.

00:54:43 It's basically every tech leader

00:54:46 that has a very high unfavorable rating.

00:54:48 Maybe you can help me understand that.

00:54:50 Why do you think so many people dislike you?

00:54:54 Some even hate you.

00:54:56 And how do you regain their trust and support?

00:54:59 Given everything you just said,

00:55:02 why are you losing the battle

00:55:05 in explaining to people what actual impact

00:55:09 social media has on society?

00:55:12 Well, I’m curious if that’s a US survey or world.

00:55:16 It is US, yeah.

00:55:17 So I think that there’s a few dynamics.

00:55:19 One is that our brand

00:55:24 has been somewhat uniquely challenged in the US

00:55:27 compared to other places.

00:55:29 It's not that there aren't issues elsewhere.

00:55:29 I mean, in other countries, we have issues too,

00:55:32 but I think in the US, there was this dynamic where

00:55:36 if you look at like the net sentiment

00:55:38 of kind of coverage or attitude towards us,

00:55:42 before 2016, I think that there were probably

00:55:44 very few months, if any, where it was negative.

00:55:47 And since 2016, I think there have probably

00:55:49 been very few months, if any, when it's been positive.

00:55:51 Politics.

00:55:53 But I think it’s a specific thing.

00:55:55 And this is very different from other places.

00:55:56 So I think in a lot of other countries in the world,

00:55:59 the sentiment towards meta and our services

00:56:02 is extremely positive.

00:56:04 In the US, we have more challenges.

00:56:06 And I think compared to other companies,

00:56:09 you can look at certain industries,

00:56:12 I think if you look at it from like a partisan perspective,

00:56:16 not from like a political perspective,

00:56:18 but just kind of culturally,

00:56:19 it’s like there are people who are probably

00:56:20 more left of center and there are people

00:56:21 who are more right of center,

00:56:22 and there’s kind of blue America and red America.

00:56:25 There are certain industries that I think

00:56:27 maybe one half of the country has a more positive view

00:56:30 towards than another.

00:56:32 And I think one of the positions

00:56:36 that we're in that I think

00:56:38 is really challenging is that because of a lot

00:56:41 of the content decisions that we’ve basically

00:56:44 had to arbitrate, and because we’re not a partisan company,

00:56:49 we’re not a Democrat company or a Republican company,

00:56:52 we’re trying to make the best decisions we can

00:56:55 to help people connect and help people have as much voice

00:56:58 as they can while having some rules

00:57:01 because we’re running a community.

00:57:04 The net effect of that is that we’re kind of constantly

00:57:07 making decisions that piss off people in both camps.

00:57:12 And the effect that I’ve sort of seen is that

00:57:17 when we make a decision

00:57:21 that's a controversial one that's gonna upset,

00:57:24 say, about half the country,

00:57:27 those decisions are all negative sum,

00:57:30 from a brand perspective, because it’s not like,

00:57:33 if we make that decision in one way

00:57:35 and say half the country is happy

00:57:37 about that particular decision that we make,

00:57:40 they tend to not say, oh, sweet, meta got that one right.

00:57:43 They’re just like, ah, you didn’t mess that one up.

00:57:46 But their opinion doesn’t tend to go up by that much.

00:57:48 Whereas the people who kind of are on the other side of it

00:57:52 are like, God, how could you mess that up?

00:57:55 How could you possibly think that that piece of content

00:57:57 is okay and should be up and should not be censored?

00:58:00 And so whether you leave it up

00:58:04 or you take it down,

00:58:09 the people who agreed with that call are just,

00:58:11 you know, it's like, all right, fine, great.

00:58:12 You didn't mess that one up.

00:58:14 So our internal assessment,

00:58:16 and the kind of analytics on our brand,

00:58:17 are basically that anytime one of these big controversial things

00:58:20 comes up in society,

00:58:23 our brand goes down with half of the country.

00:58:26 And then,

00:58:27 if you just kind of extrapolate that out,

00:58:29 it’s just been very challenging for us to try to navigate

00:58:33 what is a polarizing country in a principled way,

00:58:36 where we’re not trying to kind of hew to one side

00:58:38 or the other, we’re trying to do

00:58:39 what we think is the right thing.

00:58:41 But that’s what I think is the right thing

00:58:43 for us to do though.

00:58:44 So, I mean, that’s what we’ll try to keep doing.

00:58:47 Just as a human being, how does it feel though,

00:58:50 when you’re giving so much of your day to day life

00:58:53 to try to heal division, to try to do good in the world,

00:58:58 as we’ve talked about, that so many people in the US,

00:59:02 the place you call home have a negative view

00:59:06 of you as a leader, as a human being

00:59:09 and the company you love?

00:59:14 Well, I mean, it’s not great,

00:59:15 but I mean, look, if I wanted people to think positively

00:59:21 about me as a person,

00:59:25 I don't know, I'm not sure you'd go build a company.

00:59:27 I mean, it's like.

00:59:28 Or a social media company.

00:59:30 It seems exceptionally difficult to do

00:59:32 with a social media company.

00:59:32 Yeah, so, I mean, I don’t know,

00:59:34 there is a dynamic where a lot of the other people

00:59:39 running these companies, internet companies,

00:59:42 have sort of stepped back and they just do things

00:59:45 that are sort of, I don’t know, less controversial.

00:59:49 And some of it may be that they just get tired over time.

00:59:52 But, you know, it’s, so I don’t know.

00:59:55 I think that, you know, running a company is hard,

00:59:58 building something at scale is hard.

00:59:59 You only really do it for a long period of time

01:00:01 if you really care about what you’re doing.

01:00:04 And yeah, so, I mean, it’s not great, but like,

01:00:08 but look, I think that at some level,

01:00:11 whether 25% of people dislike you

01:00:14 or 75% of people dislike you,

01:00:18 your experience as a public figure is gonna be

01:00:21 that there’s a lot of people who dislike you, right?

01:00:23 So, I actually am not sure how different it is.

01:00:28 You know, certainly, you know,

01:00:31 the country’s gotten more polarized

01:00:32 and we in particular have gotten, you know,

01:00:35 more controversial over the last five years or so.

01:00:39 But, I don’t know, I kind of think like as a public figure

01:00:45 and leader of one of these enterprises.

01:00:48 Comes with the job.

01:00:49 Yeah, part of what you do is like,

01:00:51 and look, the answer can’t just be ignore it, right?

01:00:54 Because like a huge part of the job

01:00:56 is like you need to be getting feedback

01:00:58 and internalizing feedback on how you can do better.

01:01:00 But I think increasingly what you need to do

01:01:02 is be able to figure out, you know,

01:01:04 who are the kind of good faith critics

01:01:08 who are criticizing you because

01:01:10 they’re trying to help you do a better job

01:01:12 rather than tear you down.

01:01:13 And those are the people I just think you have to cherish

01:01:16 and like, and listen very closely

01:01:19 to the things that they’re saying,

01:01:20 because, you know, I think it’s just as dangerous

01:01:23 to tune out everyone who says anything negative

01:01:26 and just listen to the people who are kind of positive

01:01:29 and support you, you know,

01:01:31 as it would be psychologically to spend your energy

01:01:33 trying to make people who are never gonna like you, like you.

01:01:36 So I think that that’s just kind of a dance

01:01:38 that people have to do.

01:01:40 But I mean, I, you know,

01:01:41 so you kind of develop more of a feel for like,

01:01:44 who actually is trying to accomplish

01:01:46 the same types of things in the world

01:01:48 and who has different ideas about how to do that

01:01:51 and how can I learn from those people?

01:01:52 And like, yeah, we get stuff wrong.

01:01:54 And when the people whose opinions I respect

01:01:57 call me out on getting stuff wrong,

01:01:59 that hurts and makes me wanna do better.

01:02:02 But I think at this point, I'm pretty attuned to it:

01:02:04 all right, if I know someone's

01:02:06 kind of operating in bad faith

01:02:08 and they're not really trying to help,

01:02:10 then, you know, I don't know,

01:02:13 I think over time,

01:02:13 it just doesn't bother you that much.

01:02:15 But you are surrounded by people that believe in the mission

01:02:18 that love you.

01:02:21 Are there friends or colleagues in your inner circle

01:02:23 you trust that call you out on your bullshit

01:02:26 whenever your thinking may be misguided

01:02:28 as it is for leaders at times?

01:02:30 I think we have a famously open company culture

01:02:34 where we sort of encourage that kind of dissent internally,

01:02:39 which is, you know, why there’s so much material

01:02:41 internally that can leak out

01:02:43 with people sort of disagreeing

01:02:44 is because that’s sort of the culture.

01:02:47 You know, our management team, I think it’s a lot of people,

01:02:50 you know, there are some newer folks who come in,

01:02:52 there are some folks who’ve kind of been there for a while,

01:02:54 but there’s a very high level of trust.

01:02:56 And I would say it is a relatively confrontational

01:02:59 group of people.

01:03:01 And my friends and family, I think, will push me on this.

01:03:04 But look, it's not just that,

01:03:06 but I think you need some diversity, right?

01:03:09 It can’t just be, you know,

01:03:12 people who are your friends and family.

01:03:13 It’s also, you know, I mean, there are journalists

01:03:16 or analysts or, you know,

01:03:19 peer executives at other companies

01:03:23 or, you know, other people who sort of are insightful

01:03:27 about thinking about the world,

01:03:28 you know, certain politicians

01:03:30 or people kind of in that sphere

01:03:32 who I just think have like very insightful perspectives

01:03:36 who come at the world

01:03:39 from a different perspective,

01:03:41 which is sort of what makes the perspective so valuable.

01:03:44 But, you know, I think fundamentally

01:03:46 we’re trying to get to the same place

01:03:47 in terms of, you know, helping people connect more,

01:03:50 helping the whole world function better,

01:03:53 not just, you know, one place or another.

01:03:57 And I don’t know, I mean,

01:03:58 those are the people whose opinions really matter to me.

01:04:02 And I just, it’s, you know,

01:04:04 that’s how I learn on a day to day basis.

01:04:05 People are constantly sending me comments on stuff

01:04:07 or links to things they found interesting.

01:04:10 And I don’t know, it’s kind of constantly evolving

01:04:13 this model of the world

01:04:14 and kind of what we should be aspiring to be.

01:04:16 You’ve talked about, you have a famously open culture

01:04:20 which comes with the criticism

01:04:25 and the painful experiences.

01:04:27 So let me ask you another difficult question.

01:04:30 Frances Haugen, the Facebook whistleblower,

01:04:33 leaked the internal Instagram research

01:04:35 into teenagers and wellbeing.

01:04:38 Her claim is that Instagram is choosing profit

01:04:41 over wellbeing of teenage girls.

01:04:43 So Instagram is quote, toxic for them.

01:04:46 Your response titled,

01:04:48 what our research really says about teen wellbeing

01:04:52 and Instagram, says, no: the Instagram research shows

01:04:55 that on 11 of 12 wellbeing issues,

01:04:58 teenage girls who said they struggled

01:05:02 with those difficult issues also said

01:05:04 that Instagram made them better rather than worse.

01:05:07 Again, can you steel man and defend the point

01:05:11 of Frances Haugen's characterization of the study

01:05:14 and then help me understand the positive

01:05:17 and negative effects of Instagram

01:05:19 and Facebook on young people?

01:05:20 So there are certainly questions around teen mental health

01:05:25 that are really important.

01:05:26 As a parent, it's like hard to imagine

01:05:29 any set of questions that are sort of more important.

01:05:32 I mean, I guess maybe other aspects of physical health

01:05:34 or wellbeing probably come to that level,

01:05:37 but like, these are really important questions, right?

01:05:40 Which is why we dedicate teams to studying them.

01:05:45 I don’t think the internet or social media are unique

01:05:48 in having these questions.

01:05:50 I mean, there've been sort of magazines

01:05:53 promoting certain body types for women

01:05:56 and kids for decades,

01:05:58 but we really care about this stuff.

01:06:01 So we wanted to study it.

01:06:02 And of course, we didn’t expect

01:06:05 that everything was gonna be positive all the time.

01:06:07 So, I mean, the reason why you study this stuff

01:06:08 is to try to improve and get better.

01:06:10 So, I mean, look, the place where I disagree

01:06:13 with the characterization first,

01:06:15 I thought some of the reporting and coverage of it

01:06:18 just took the whole thing out of proportion

01:06:20 and that it focused on, as you said,

01:06:22 I think there were like 20 metrics in there

01:06:24 and on 18 or 19, the effect of using Instagram

01:06:27 was neutral or positive on teens' wellbeing.

01:06:30 And there was one area where I think it showed

01:06:34 that we needed to improve

01:06:35 and we took some steps to try to do that

01:06:37 after doing the research.

01:06:38 But I think having the coverage just focus on that one

01:06:41 without focusing on the,

01:06:43 I mean, I think an accurate characterization

01:06:45 would have been that for kids using Instagram,

01:06:47 or not kids, teens, it's generally positive

01:06:52 for their mental health.

01:06:53 But of course, that was not the narrative that came out.

01:06:55 So I think it's hard to,

01:06:56 that's not a kind of logical thing to straw man

01:06:59 or steel man,

01:07:01 because I sort of disagree with that overall characterization.

01:07:04 I think anyone sort of looking at this objectively would,

01:07:09 but then, I mean, there is this sort of intent critique

01:07:15 that I think you were getting at before,

01:07:16 which says, it assumes some sort of malevolence, right?

01:07:19 It's like, it's really hard for me

01:07:23 to really wrap my head around this,

01:07:26 because as far as I know,

01:07:29 it’s not clear that any of the other tech companies

01:07:31 are doing this kind of research.

01:07:33 So why should the narrative form that, when we did research

01:07:37 because we were studying an issue

01:07:38 that we wanted to understand in order to improve,

01:07:40 and took steps after that to try to improve it,

01:07:43 the interpretation of that would be

01:07:46 that we did the research

01:07:47 and tried to sweep it under the rug?

01:07:49 It just, it sort of is like, I don’t know,

01:07:53 it’s beyond credibility to me

01:07:55 that like that’s the accurate description of the actions

01:07:59 that we’ve taken compared to the others in the industry.

01:08:01 So I don’t know, that’s kind of, that’s my view on it.

01:08:05 These are really important issues

01:08:06 and there’s a lot of stuff

01:08:07 that I think we’re gonna be working on

01:08:09 related to teen mental health for a long time,

01:08:11 including trying to understand this better.

01:08:14 And I would encourage everyone else

01:08:15 in the industry to do this too.

01:08:18 Yeah, I would love there to be open conversations

01:08:21 and a lot of great research being released internally

01:08:25 and then also externally.

01:08:27 It doesn’t make me feel good

01:08:31 to see press obviously get way more clicks

01:08:35 when they say negative things about social media.

01:08:39 Objectively speaking, I can just tell

01:08:42 that there’s hunger to say negative things

01:08:44 about social media.

01:08:46 And I don’t understand how that’s supposed to lead

01:08:50 to an open conversation about the positives

01:08:53 and the negatives, the concerns about social media,

01:08:56 especially when you’re doing that kind of research.

01:08:59 I mean, I don’t know what to do with that,

01:09:01 but let me ask you as a father,

01:09:05 there’s a weight heavy on you

01:09:06 that people get bullied on social networks.

01:09:10 So people get bullied in their private life.

01:09:13 But now because so much of our life is in the digital world,

01:09:17 the bullying moves from the physical world

01:09:19 to the digital world.

01:09:21 So you’re now creating a platform

01:09:24 on which bullying happens.

01:09:26 And some of that bullying can lead to damage

01:09:30 to mental health.

01:09:31 And some of that bullying can lead to depression,

01:09:35 even suicide.

01:09:37 There’s a weight heavy on you

01:09:38 that people have committed suicide

01:09:43 or will commit suicide based on the bullying

01:09:46 that happens on social media.

01:09:48 Yeah, I mean, there’s a set of harms

01:09:51 that we basically track and build systems to fight against.

01:09:55 And bullying and self harm are,

01:10:01 these are some of the biggest things

01:10:03 that we are most focused on.

01:10:10 For bullying, like you say,

01:10:16 this predates the internet,

01:10:18 and it's probably impossible to get rid of all of it.

01:10:22 You wanna give people tools to fight it

01:10:24 and you wanna fight it yourself.

01:10:27 And you also wanna make sure that people have the tools

01:10:28 to get help when they need it.

01:10:30 So I think this isn’t like a question of,

01:10:33 can you get rid of all bullying?

01:10:34 I mean, it’s like, all right, I mean, I have two daughters

01:10:39 and they fight and push each other around and stuff too.

01:10:43 And the question is just,

01:10:44 how do you handle that situation?

01:10:47 And there’s a handful of things that I think you can do.

01:10:51 We talked a little bit before around some of the AI tools

01:10:55 that you can build to identify

01:10:56 when something harmful is happening.

01:10:59 It's actually very hard with bullying

01:11:00 because a lot of bullying is very context-specific.

01:11:02 It's not like you're trying to fit a formula.

01:11:06 Looking at the different harms,

01:11:09 someone promoting a terrorist group is

01:11:12 probably one of the simpler things to generally find,

01:11:14 because things promoting that group are gonna look

01:11:17 a certain way or feel a certain way.

01:11:19 Bullying could just be, you know,

01:11:21 someone making some subtle comment about someone’s appearance

01:11:24 that’s idiosyncratic to them.

01:11:26 And it could look just like humor.

01:11:28 So humor to one person can be destructive

01:11:31 to another human being, yeah.

01:11:32 So with bullying, I think there are certain things

01:11:36 that you can find through AI systems,

01:11:40 but I think it is increasingly important

01:11:42 to just give people more agency themselves.

01:11:44 So we’ve done things like making it

01:11:46 so people can turn off comments

01:11:47 or take a break from hearing from a specific person

01:11:52 without having to signal at all

01:11:54 that they’re gonna stop following them

01:11:55 or kind of make some stand that,

01:11:58 okay, I’m not friends with you anymore.

01:11:59 I’m not following you.

01:12:00 I just like, I just don’t wanna hear about this,

01:12:01 but I also don't wanna signal at all, publicly

01:12:05 or to them, that there's been an issue.
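
A minimal sketch of that kind of tool, assuming a simple data model where each user keeps a private muted set that filters their own feed without changing the public follow graph (all names here are hypothetical, not Meta's actual schema):

    from dataclasses import dataclass, field

    @dataclass
    class User:
        name: str
        following: set = field(default_factory=set)
        muted: set = field(default_factory=set)  # private, never shown publicly

    def visible_feed(viewer, posts):
        """Posts from followed, non-muted authors; muted authors can't tell."""
        return [text for author, text in posts
                if author in viewer.following and author not in viewer.muted]

    alice = User("alice", following={"bob", "carol"}, muted={"bob"})
    posts = [("bob", "hi"), ("carol", "hello")]
    print(visible_feed(alice, posts))  # ['hello'], and bob sees no change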

01:12:10 And then you get to some of the more extreme cases

01:12:14 like you’re talking about

01:12:14 where someone is thinking about self harm or suicide.

01:12:19 And there we’ve found that that is a place

01:12:24 where AI can identify a lot

01:12:26 as well as people flagging things.

01:12:28 If people are expressing something

01:12:31 that suggests they're potentially thinking of hurting themselves,

01:12:35 those are cues that you can build systems,

01:12:37 in hundreds of languages around the world,

01:12:39 to be able to identify.

01:12:41 And one of the things that I’m actually quite proud of

01:12:45 is we’ve built these systems

01:12:47 that I think are clearly leading at this point

01:12:50 that not only identify that,

01:12:53 but then connect with local first responders

01:12:57 and have been able, I think at this point

01:12:59 in thousands of cases,

01:13:01 to get first responders to people

01:13:04 through these systems who really need them,

01:13:07 because of specific plumbing that we've done

01:13:09 between the AI work and being able to communicate

01:13:11 with local first responder organizations.

01:13:13 We’re rolling that out in more places around the world.

01:13:15 And I think the team that worked on that

01:13:18 just did awesome stuff.
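
A heavily simplified sketch of the pipeline being described, with a toy stand-in for the multilingual classifier and a hypothetical responder registry; the real systems are learned models over many signals, not keyword lists:

    RISK_THRESHOLD = 0.5
    RESPONDERS = {"US": "us-crisis-partner", "IN": "in-crisis-partner"}  # hypothetical

    def risk_score(text, language):
        # Toy stand-in: real systems use trained classifiers over text and
        # behavioral cues, built out in hundreds of languages.
        cues = {"en": ["hurt myself", "end it all"], "es": ["hacerme daño"]}
        return float(any(c in text.lower() for c in cues.get(language, [])))

    def handle_post(text, language, country):
        """Escalate high-risk posts to a local first-responder contact."""
        if risk_score(text, language) >= RISK_THRESHOLD:
            return RESPONDERS.get(country)
        return None  # no escalation

    print(handle_post("I want to end it all", "en", "US"))  # us-crisis-partner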

01:13:19 So I think that that’s a long way of saying,

01:13:22 yeah, I mean, this is a heavy topic

01:13:25 and you want to attack it in a bunch of different ways

01:13:30 and also kind of understand that it's in some part human nature

01:13:33 for people to do this to each other,

01:13:36 which is unfortunate,

01:13:37 but you can give people tools and build things that help.

01:13:40 It’s still one hell of a burden though.

01:13:43 A platform that allows people

01:13:46 to fall in love with each other

01:13:48 is also by nature going to be a platform

01:13:51 that allows people to hurt each other.

01:13:52 And when you’re managing such a platform, it’s difficult.

01:13:57 And I think you spoke to it,

01:13:58 but the psychology of that, of being a leader in that space,

01:14:01 of creating technology that’s playing in this space,

01:14:05 like you mentioned, psychology is really damn difficult.

01:14:10 And I mean, the burden of that is just great.

01:14:13 I just wanted to hear you speak to that point.

01:14:18 I have to ask about the thing you’ve brought up a few times,

01:14:23 which is making controversial decisions.

01:14:26 Let’s talk about free speech and censorship.

01:14:29 So there are two groups of people pressuring Meta on this.

01:14:33 One group is upset that Facebook, the social network,

01:14:37 allows misinformation in quotes to be spread on the platform.

01:14:41 The other group are concerned that Facebook censors speech

01:14:44 by calling it misinformation.

01:14:46 So you’re getting it from both sides.

01:14:48 In October 2019, at Georgetown University, you

01:14:54 eloquently defended the importance of free speech,

01:14:58 but then COVID came and the 2020 election came.

01:15:04 Do you worry that outside pressures

01:15:06 from advertisers, politicians, the public,

01:15:08 have forced Meta to damage the ideal of free speech

01:15:11 that you spoke highly of?

01:15:14 Just to say some obvious things upfront,

01:15:16 I don’t think pressure from advertisers

01:15:18 or politicians directly in any way

01:15:21 affects how we think about this.

01:15:22 I think these are just hard topics.

01:15:25 So let me just take you through our evolution

01:15:26 from kind of the beginning of the company

01:15:28 to where we are now.

01:15:30 You don’t build a company like this

01:15:31 unless you believe that people expressing themselves

01:15:34 is a good thing, right?

01:15:35 So that’s sort of the foundational thing.

01:15:38 You can kind of think about our company as a formula

01:15:41 where we think giving people voice

01:15:44 and helping people connect creates opportunity, right?

01:15:47 So those are the two things that we’re always focused on

01:15:49 are sort of helping people connect.

01:15:50 We talked about that a lot,

01:15:52 but also giving people voice

01:15:53 and ability to express themselves.

01:15:55 Then by the way, most of the time

01:15:56 when people express themselves,

01:15:58 that’s not like politically controversial content.

01:16:00 It’s like expressing something about their identity

01:16:04 that’s more related to the avatar conversation

01:16:06 we had earlier in terms of expressing some facet,

01:16:08 but that’s what’s important to people on a day to day basis.

01:16:11 And sometimes when people feel strongly enough

01:16:13 about something, it kind of becomes a political topic.

01:16:16 That’s sort of always been a thing that we’ve focused on.

01:16:19 There’s always been the question of safety in this,

01:16:22 which if you’re building a community,

01:16:24 I think you have to focus on safety.

01:16:26 We’ve had these community standards from early on,

01:16:28 and there are about 20 different kinds of harm

01:16:32 that we track and try to fight actively.

01:16:34 We’ve talked about some of them already.

01:16:36 So it includes things like bullying and harassment.

01:16:40 It includes things like terrorism or promoting terrorism,

01:16:46 inciting violence, intellectual property theft.

01:16:49 And in general, for, call it, about 18 out of 20 of those,

01:16:53 there's not really a particularly polarized definition

01:16:57 of them.

01:16:59 I think you’re not really gonna find many people

01:17:01 in the country or in the world

01:17:03 who are trying to say we should be

01:17:07 fighting terrorist content less.

01:17:09 There are a couple of areas

01:17:12 where I think that this has gotten more controversial

01:17:14 recently, which I'll talk about.

01:17:16 And you're right, misinformation is basically up there.

01:17:20 And I think sometimes the definition of hate speech

01:17:21 is up there too.

01:17:22 But I think in general, most of the content

01:17:25 that I think we’re working on for safety

01:17:29 is not actually, people don’t kind of have these questions.

01:17:32 So it’s sort of this subset.

01:17:35 But if you go back to the beginning of the company,

01:17:37 this was sort of the pre-deep-learning days.

01:17:42 And back then, it was me, and my roommate Dustin joined me.

01:17:47 And if someone posted something bad,

01:17:54 the AI technology did not exist yet

01:17:57 to be able to basically go look at all the content.

01:18:02 And we were a small enough outfit

01:18:06 that no one would expect that we could review it all.

01:18:08 Even if someone reported it to us,

01:18:10 we basically did our best, right?

01:18:11 It’s like someone would report it

01:18:12 and we try to look at stuff and deal with stuff.

01:18:16 And for call it the first seven or eight years

01:18:22 of the company, we weren’t that big of a company.

01:18:26 For a lot of that period, we weren’t even really profitable.

01:18:28 The AI didn’t really exist to be able to do

01:18:30 the kind of moderation that we do today.

01:18:32 And then at some point in kind of the middle

01:18:35 of the last decade, that started to flip.

01:18:38 And we got to the point where we were sort of a larger

01:18:44 and more profitable company.

01:18:45 And the AI was starting to come online

01:18:48 to be able to proactively detect

01:18:50 some of the simpler forms of this.

01:18:52 So things like pornography,

01:18:54 you could train an image classifier

01:18:57 to identify what a nipple was,

01:18:59 or you can fight against terrorist content.

01:19:01 You still could.

01:19:02 There’s actually papers on this, it’s great.

01:19:03 Oh, of course there are.

01:19:04 Technical papers.

01:19:05 Of course there are.

01:19:06 Those are relatively easier things to train AI to do

01:19:09 than for example, understand the nuances

01:19:12 of what is inciting violence

01:19:14 in a hundred languages around the world

01:19:15 and not have the false positives of like,

01:19:20 okay, are you posting about this thing

01:19:22 that might be inciting violence

01:19:24 because you’re actually trying to denounce it?

01:19:26 In which case we probably shouldn’t take that down.

01:19:28 Where if you’re trying to denounce something

01:19:29 that’s inciting violence in some kind of dialect

01:19:33 in a corner of India, as opposed to,

01:19:37 okay, actually you’re posting this thing

01:19:38 because you’re trying to incite violence.

01:19:39 Okay, building an AI that can basically get

01:19:42 to that level of nuance in all the languages

01:19:44 that we serve is something that I think

01:19:47 is only really becoming possible now,

01:19:49 not towards the middle of the last decade.
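
A tiny Python illustration of why this is hard: a naive phrase matcher cannot separate inciting from denouncing, because both posts contain the same phrase (the phrase list and posts are hypothetical):

    INCITING_PHRASES = ["attack them at dawn"]  # hypothetical example phrase

    def naive_flag(text):
        return any(p in text.lower() for p in INCITING_PHRASES)

    incite = "Attack them at dawn. Bring everyone."
    denounce = "Horrified to see posts saying 'attack them at dawn'. Report them."

    print(naive_flag(incite), naive_flag(denounce))  # True True: a false positive
    # Getting this right takes models that judge the post's stance toward
    # the phrase in context, in every language served, not just its presence.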

01:19:51 But there’s been this evolution,

01:19:54 and I think what happened,

01:19:57 people sort of woke up after 2016

01:20:00 and a lot of people are like,

01:20:02 okay, the country is a lot more polarized

01:20:05 and there’s a lot more stuff here than we realized.

01:20:08 Why weren’t these internet companies on top of this?

01:20:11 And I think at that point it was reasonable feedback

01:20:18 that some of this technology had started becoming possible.

01:20:22 And at that point, I really did feel like

01:20:25 we needed to make a substantially larger investment.

01:20:27 We’d already worked on this stuff a lot,

01:20:29 on AI and on these integrity problems,

01:20:32 but that we should basically invest,

01:20:35 have a thousand or more engineers

01:20:37 basically work on building these AI systems

01:20:39 to be able to go and proactively identify the stuff

01:20:41 across all these different areas.

01:20:43 Okay, so we went and did that.

01:20:45 Now we’ve built the tools to be able to do that.

01:20:48 And now I think it's actually a much more complicated

01:20:50 set of philosophical rather than technical questions,

01:20:53 which is: what exactly should the policies be?

01:20:56 Now, the way that we basically hold ourselves accountable

01:21:01 is we issue these transparency reports every quarter

01:21:04 and the metric that we track is for each of these

01:21:06 20 types of harmful content.

01:21:10 How much of that content are we taking down

01:21:12 before someone even has to report it to us?

01:21:14 So how effective is our AI at doing this?
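
That metric is simple to state in code. A sketch, with hypothetical quarterly numbers for one harm category:

    def proactive_rate(proactive, user_reported):
        """Share of actioned content caught before any user report."""
        total = proactive + user_reported
        return proactive / total if total else 0.0

    # Hypothetical: 9,700 pieces found proactively, 300 via user reports.
    print(f"{proactive_rate(9_700, 300):.1%}")  # 97.0%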

01:21:17 But that basically creates this big question,

01:21:19 which is okay, now we need to really be careful

01:21:22 about how proactive we set the AI

01:21:25 and where the exact policy lines are

01:21:28 around what we’re taking down.

01:21:30 It's certainly at a point now where, I felt like

01:21:35 at the beginning of that journey

01:21:37 of building those AI systems, there was a lot of push,

01:21:43 people saying, okay, you've got to do more.

01:21:44 There’s clearly a lot more bad content

01:21:46 that people aren’t reporting or that you’re not getting to

01:21:49 and you need to get more effective at that.

01:21:51 And I was pretty sympathetic to that.

01:21:52 But then I think at some point along the way,

01:21:54 there started to be almost equal issues on both sides

01:21:58 of, okay, actually you’re kind of taking down

01:22:00 too much stuff, right?

01:22:02 Or some of the stuff is borderline

01:22:05 and it wasn’t really bothering anyone

01:22:07 and they didn’t report it.

01:22:09 So is that really an issue that you need to take down?

01:22:13 Whereas we still have the critique on the other side too

01:22:15 where a lot of people think we’re not doing enough.

01:22:18 So as we've built the technical capacity,

01:22:21 I think it becomes almost more philosophically interesting

01:22:25 where you wanna be on the line.

01:22:27 And I just think you don’t want one person

01:22:31 making those decisions.

01:22:32 So we’ve also tried to innovate

01:22:33 in terms of building out this independent oversight board,

01:22:36 which has people who are dedicated to free expression

01:22:39 but from around the world who people can appeal cases to.

01:22:43 So a lot of the most controversial cases basically go to them

01:22:46 and they make the final binding decision

01:22:47 on how we should handle that.

01:22:49 And then of course, their decisions,

01:22:50 we then try to figure out what the principles are

01:22:53 behind those and encode them into the algorithms.

01:22:55 And how are those people chosen? Because, you know,

01:22:58 you're outsourcing a difficult decision.

01:23:00 Yeah, the initial people,

01:23:02 we chose a handful of chairs for the group

01:23:09 and we basically chose the people

01:23:12 for a commitment to free expression

01:23:16 and like a broad understanding of human rights

01:23:19 and the trade offs around free expression.

01:23:21 So they're fundamentally people

01:23:22 who are gonna lean towards free expression.

01:23:24 Towards freedom of speech.

01:23:26 Okay, so there’s also this idea of fact checkers.

01:23:28 So jumping around to the misinformation questions,

01:23:31 especially during COVID,

01:23:33 which was exceptionally polarizing, speaking of polarization.

01:23:36 Can I speak to the COVID thing?

01:23:38 I mean, I think this was one of the hardest sets of questions

01:23:40 around free expression.

01:23:41 Because you asked about Georgetown:

01:23:42 has my stance fundamentally changed?

01:23:43 And the answer to that is no, my stance has not changed.

01:23:48 It is fundamentally the same as when I was talking

01:23:52 at Georgetown from a philosophical perspective.

01:23:56 The challenge with free speech is that everyone agrees

01:24:01 that there is a line where if you’re actually

01:24:05 about to do physical harm to people

01:24:08 that there should be restrictions.

01:24:10 So, I mean, there’s the famous Supreme Court

01:24:13 historical example of like,

01:24:15 you can’t yell fire in a crowded theater.

01:24:18 The thing that everyone disagrees on

01:24:20 is what is the definition of real harm?

01:24:22 Where I think some people think,

01:24:24 okay, this should only be a very literal,

01:24:27 I mean, take it back to the bullying conversation

01:24:29 we were just having, where is it just harm

01:24:32 if the person is about to hurt themselves

01:24:34 because they’ve been bullied so hard?

01:24:36 Or is it actually harm like as they’re being bullied?

01:24:39 And kind of at what point in the spectrum is that?

01:24:42 And that’s the part that there’s not agreement on.

01:24:44 But I think what people agree on pretty broadly

01:24:47 is that when there is an acute threat

01:24:49 that it does make sense from a societal perspective

01:24:52 to tolerate less speech

01:24:57 that could be potentially harmful in that acute situation.

01:24:59 So I think where COVID got very difficult is,

01:25:02 I don’t think anyone expected this to be going on for years.

01:25:06 But if you'd kind of asked, a priori,

01:25:10 would a global pandemic where a lot of people are dying

01:25:14 and catching this, is that an emergency

01:25:19 where you'd kind of consider it

01:25:21 problematic to basically yell fire

01:25:25 in a crowded theater?

01:25:26 I think that that probably passes that test.

01:25:29 So I think that it’s a very tricky situation,

01:25:32 but I think the fundamental commitment

01:25:35 to free expression is there.

01:25:38 And that’s what I believe.

01:25:39 And again, I don’t think you start this company

01:25:41 unless you care about people being able

01:25:42 to express themselves as much as possible.

01:25:44 But I think that that’s the question,

01:25:48 is how do you define what the harm is

01:25:50 and how acute that is?

01:25:52 And what are the institutions that define that harm?

01:25:55 A lot of the criticism is that the CDC, the WHO,

01:25:59 the institutions we’ve come to trust as a civilization

01:26:03 to give the line of what is and isn’t harm

01:26:07 in terms of health policy have failed in many ways,

01:26:11 in small ways and in big ways, depending on who you ask.

01:26:14 And then the perspective of meta and Facebook is like,

01:26:17 well, where the hell do I get the information

01:26:20 of what is and isn’t misinformation?

01:26:22 So it’s a really difficult place to be in,

01:26:25 but it’s great to hear that you’re leaning

01:26:26 towards freedom of speech on this aspect.

01:26:30 And again, I think this actually calls to the fact

01:26:33 that we need to reform institutions

01:26:35 that help keep an open mind

01:26:36 of what is and isn’t misinformation.

01:26:39 And misinformation has been used to bully on the internet.

01:26:44 I mean, I'm friends with Joe Rogan,

01:26:46 and he's been called that.

01:26:49 I remember hanging out with him in Vegas

01:26:51 and somebody yelled, stop spreading misinformation.

01:26:54 I mean, and there’s a lot of people that follow him

01:26:57 that believe he’s not spreading misinformation.

01:26:59 Like you can’t just not acknowledge the fact

01:27:02 that there’s a large number of people

01:27:05 that have a different definition of misinformation.

01:27:08 And that’s such a tough place to be.

01:27:10 Like who do you listen to?

01:27:11 Do you listen to quote unquote experts?

01:27:15 As a person who has a PhD, I gotta say,

01:27:17 I mean, I'm not sure I know what defines an expert,

01:27:21 especially in a new,

01:27:24 in a totally new pandemic or a new catastrophic event,

01:27:29 especially when politics is involved

01:27:31 and especially when the news,

01:27:33 the media, are involved, which can propagate

01:27:37 sort of outrageous narratives

01:27:39 and thereby make a lot of money.

01:27:40 Like what the hell?

01:27:41 Where’s the source of truth?

01:27:43 And then everybody turns to Facebook.

01:27:45 It’s like, please tell me what the source of truth is.

01:27:49 Well, I mean, well, how would you handle this

01:27:50 if you were in my position?

01:27:52 It's very, very, very, very difficult.

01:27:55 I would say,

01:27:59 I would more speak about how difficult the choices are

01:28:02 and be transparent about like,

01:28:04 what the hell do you do with this?

01:28:05 Like, here, you could

01:28:07 ask the exact question you just asked me,

01:28:08 but to the broader public, like, okay, yeah,

01:28:10 you guys tell me what to do.

01:28:12 So like crowdsource it.

01:28:14 And then the other aspect is when you spoke really eloquently

01:28:19 about the fact that there’s this going back and forth

01:28:23 and now there’s a feeling like you’re censoring

01:28:25 a little bit too much.

01:28:26 So I would lean, I would try to be ahead of that feeling.

01:28:30 I would now lean towards freedom of speech and say,

01:28:33 we’re not the ones that are going to define misinformation.

01:28:36 Let it be a public debate, let the idea stand.

01:28:40 And with this idea of misinformation,

01:28:44 I actually place the responsibility

01:28:46 on the poor communication skills of scientists.

01:28:50 They should be in the battlefield of ideas

01:28:52 and everybody who is spreading information

01:28:57 against the vaccine, they should not be censored.

01:29:00 They should be talked with and you should show the data,

01:29:03 you should have open discussion

01:29:04 as opposed to rolling your eyes and saying,

01:29:07 I’m the expert, I know what I’m talking about.

01:29:09 No, you need to convince people, it’s a battle of ideas.

01:29:13 So that’s the whole point of freedom of speech.

01:29:15 It’s the way to defeat bad ideas

01:29:17 is with good ideas, with speech.

01:29:20 So like the responsibility here falls

01:29:22 on the poor communication skills of scientists.

01:29:26 Scientists are not great communicators, but thanks to social media,

01:29:32 they have the power to communicate.

01:29:34 Some of the best stuff I’ve seen about COVID

01:29:36 from doctors is on social media.

01:29:38 It’s a way to learn to respond really quickly,

01:29:41 to go faster than the peer review process.

01:29:43 And so they just need to get way better

01:29:45 at that communication.

01:29:46 And also by better, I don’t mean just convincing,

01:29:50 I also mean speak with humility,

01:29:51 don’t talk down to people, all those kinds of things.

01:29:54 And as a platform, I would say,

01:29:56 I would step back a little bit.

01:29:59 Not all the way, of course,

01:30:00 because there’s a lot of stuff that can cause real harm

01:30:03 as we’ve talked about,

01:30:04 but you lean more towards freedom of speech

01:30:06 because then people from a brand perspective

01:30:09 wouldn’t be blaming you for the other ills of society,

01:30:13 which there are many.

01:30:14 The institutions have flaws, the political divide,

01:30:19 obviously politicians have flaws, and the news,

01:30:23 the media, has flaws that they're all trying to work through.

01:30:28 And because of the central place of Facebook in the world,

01:30:31 all of those flaws somehow kind of propagate to Facebook.

01:30:34 And you're sitting there as Plato, the philosopher,

01:30:38 having to answer some of the most difficult questions

01:30:40 being asked of human civilization.

01:30:43 So I don’t know, maybe this is an American answer though,

01:30:47 to lean towards freedom of speech.

01:30:48 I don’t know if that applies globally.

01:30:51 So yeah, I don’t know.

01:30:52 But transparency and saying, I think as a technologist,

01:30:57 one of the things I sense about Facebook and meta

01:30:59 when people talk about this company

01:31:02 is they don’t necessarily understand

01:31:04 fully how difficult the problem is.

01:31:06 You talked about how AI has to catch

01:31:08 a bunch of harmful stuff really quickly.

01:31:11 Just the sea of data you have to deal with.

01:31:14 It’s a really difficult problem.

01:31:16 So like any of the critics,

01:31:18 if you just hand them the helm for a week,

01:31:22 let’s see how well you can do.

01:31:25 Like that, to me, that’s definitely something

01:31:28 that would wake people up to how difficult this problem is

01:31:31 if there’s more transparency

01:31:32 of saying how difficult this problem is.

01:31:35 Let me ask you about, on the AI front,

01:31:37 just because you mentioned language and my ineloquence.

01:31:41 Translation is something I wanted to ask you about.

01:31:44 And first, just to give a shout out to the supercomputer.

01:31:47 You’ve recently announced the AI research supercluster, RSC.

01:31:51 Obviously, I’m somebody who loves the GPUs.

01:31:54 It currently has 6,000 GPUs.

01:31:57 It's built from NVIDIA DGX A100 systems that have,

01:32:02 in total, 6,000 GPUs.

01:32:04 And eventually, maybe this year,

01:32:06 maybe soon, it will have 16,000 GPUs.

01:32:10 So it can do a bunch of different kinds

01:32:11 of machine learning applications.

01:32:15 There’s a cool thing on the distributed storage aspect

01:32:18 and all that kind of stuff.

01:32:19 So one of the applications that I think is super exciting

01:32:23 is translation, real time translation.

01:32:26 I mentioned to you that having a conversation,

01:32:29 I speak Russian fluently,

01:32:30 I speak English somewhat fluently,

01:32:32 and having a conversation with Vladimir Putin,

01:32:34 say, as a use case.

01:32:36 Me, as a user, coming to you as a use case.

01:32:38 We both speak each other’s language.

01:32:42 I speak Russian, he speaks English.

01:32:45 How can we have that communication go well

01:32:48 with the help of AI?

01:32:49 I think it’s such a beautiful and a powerful application

01:32:52 of AI to connect the world,

01:32:54 that bridge the gap, not necessarily between me and Putin,

01:32:57 but people that don’t have that shared language.

01:33:01 Can you just speak about your vision with translation?

01:33:04 Because I think that’s a really exciting application.

01:33:06 If you’re trying to help people connect

01:33:08 all around the world,

01:33:09 a lot of content is produced in one language

01:33:11 and people in all these other places are interested in it.

01:33:14 So being able to translate that

01:33:17 just unlocks a lot of value on a day to day basis.

01:33:20 I mean, so the kind of AI around translation is interesting

01:33:24 because it’s gone through a bunch of iterations.

01:33:27 But the basic state of the art

01:33:29 is that you don’t wanna go through

01:33:33 different kind of intermediate symbolic

01:33:38 representations of language or something like that.

01:33:42 You basically wanna be able to map the concepts

01:33:46 and basically go directly from one language to another.

01:33:49 And you just can train bigger and bigger models

01:33:53 in order to be able to do that.

01:33:54 And that’s where the research supercluster comes in

01:33:58 is basically a lot of the trend in machine learning

01:34:01 is just you’re building bigger and bigger models

01:34:03 and you just need a lot of computation to train them.

01:34:05 So it's not like the translation itself would run

01:34:08 on the supercomputer; it's the training of the model,

01:34:12 which could involve billions or trillions of examples,

01:34:15 that basically runs there.

01:34:19 You’re training models on this supercluster

01:34:22 in days or weeks that might take a much longer period of time

01:34:27 on a smaller cluster.

01:34:28 So it just wouldn’t be practical for most teams to do.

01:34:30 But the translation work,

01:34:34 we’re basically getting from being able to go

01:34:38 between about a hundred languages seamlessly today

01:34:42 to being able to go to about 300 languages in the near term.

01:34:46 So from any language to any other language.

01:34:48 Yeah.
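
For a concrete, runnable illustration of that direct many-to-many approach, here is a sketch using M2M-100, the publicly released 100-language model from Meta AI, translating Russian to English without pivoting through a third language (assumes the transformers, sentencepiece, and torch packages are installed):

    from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

    model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
    tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

    tokenizer.src_lang = "ru"  # source language: Russian
    encoded = tokenizer("Привет, мир!", return_tensors="pt")
    generated = model.generate(
        **encoded, forced_bos_token_id=tokenizer.get_lang_id("en"))
    print(tokenizer.batch_decode(generated, skip_special_tokens=True))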

01:34:49 And part of the issue when you get closer to more languages

01:34:53 is some of these get to be pretty,

01:34:59 not very popular languages, right?

01:35:01 Where there isn’t that much content in them.

01:35:04 So you end up having less data

01:35:07 and you need to kind of use a model that you’ve built up

01:35:10 around other examples.

01:35:12 And this is one of the big questions around AI

01:35:14 is like how generalizable can things be?

01:35:16 And that I think is one of the things

01:35:18 that’s just kind of exciting here

01:35:19 from a technical perspective.

01:35:21 But capturing, we talked about this with the metaverse,

01:35:23 capturing the magic of human to human interaction.

01:35:26 So me and Putin, okay.

01:35:29 Again, this is therapy session.

01:35:30 I mean, it’s a tough example

01:35:31 because you actually both speak Russian and English.

01:35:33 No, but that’s.

01:35:34 But in the future.

01:35:35 I see it as a Turing test of a kind

01:35:37 because we would both like to have an AI that improves

01:35:40 because I don’t speak Russian that well.

01:35:42 He doesn’t speak English that well.

01:35:44 Yeah.

01:35:45 It would be nice to outperform our abilities

01:35:48 and it sets a really nice bar

01:35:50 because I think AI can really help in translation

01:35:53 for people that don’t speak the language at all,

01:35:55 but to actually capture the magic of the chemistry

01:36:00 in the translation, that would make the metaverse

01:36:03 super immersive.

01:36:04 I mean, that’s exciting.

01:36:05 You remove the barrier of language, period.

01:36:08 Yeah, so when people think about translation,

01:36:11 I think a lot of that is they’re thinking about text to text,

01:36:14 but speech to speech, I think is a whole nother thing.

01:36:17 And I mean, one of the big lessons on that,

01:36:19 which I was referring to before is I think early models,

01:36:22 it’s like, all right, they take speech,

01:36:23 they translate it to text,

01:36:25 translate the text to another language

01:36:26 and then kind of output that as speech in that language.

01:36:29 And you don’t wanna do that.

01:36:30 You just wanna be able to go directly from speech

01:36:32 in one language to speech in another language

01:36:34 and build up the models to do that.
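
A sketch of the two architectures being contrasted here, with hypothetical stand-in functions rather than real models; the point is the shape of the pipeline, where the cascaded version chains three models and compounds their errors:

    def asr(audio, lang): ...           # speech -> text (hypothetical model)
    def translate(text, src, tgt): ...  # text -> text (hypothetical model)
    def tts(text, lang): ...            # text -> speech (hypothetical model)

    def cascaded_s2s(audio, src, tgt):
        """The older approach: three separate hops through text."""
        return tts(translate(asr(audio, src), src, tgt), tgt)

    def direct_s2s(audio, src, tgt):
        """The approach described above: one model trained end to end,
        mapping source speech to target speech with no text bottleneck."""
        raise NotImplementedError("placeholder for an end-to-end model")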

01:36:36 And I mean, I think,

01:36:39 when you look at

01:36:40 the progress in machine learning,

01:36:42 there have been big advances in the techniques,

01:36:47 some of the advances in self supervised learning,

01:36:51 which I know you talked to Yann about

01:36:52 and he’s like one of the leading thinkers in this area.

01:36:55 I just think that that stuff is really exciting,

01:36:57 but then you couple that with the ability

01:36:59 to just throw larger and larger amounts of compute

01:37:02 at training these models.

01:37:04 And you can just do a lot of things

01:37:05 that were harder to do before.
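
(An aside: the core self-supervised recipe is to hide part of the input and train the model to reconstruct it, so raw unlabeled data becomes its own training signal. Here is a toy masked-prediction sketch in PyTorch, with random tokens standing in for real text; real systems apply the same idea at web scale.)

```python
# Toy masked-prediction objective: corrupt 15% of tokens, then train the
# model to recover them from context. No labels needed; the data itself
# is the supervision. Sizes are tiny for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM, MASK_ID = 1000, 64, 0
embed = nn.Embedding(VOCAB, DIM)
encoder = nn.TransformerEncoder(
    nn.TransformerEncoderLayer(DIM, nhead=4, batch_first=True),
    num_layers=2,
)
head = nn.Linear(DIM, VOCAB)
params = [*embed.parameters(), *encoder.parameters(), *head.parameters()]
opt = torch.optim.Adam(params, lr=3e-4)

for step in range(100):
    tokens = torch.randint(1, VOCAB, (8, 32))  # stand-in for real text
    mask = torch.rand(tokens.shape) < 0.15     # choose positions to hide
    corrupted = tokens.masked_fill(mask, MASK_ID)
    logits = head(encoder(embed(corrupted)))   # predict every position
    loss = F.cross_entropy(logits[mask], tokens[mask])  # score hidden ones
    opt.zero_grad()
    loss.backward()
    opt.step()
```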

01:37:09 But we’re asking more of our systems too, right?

01:37:12 So if you think about the applications

01:37:14 that we’re gonna need for the metaverse,

01:37:18 or think about it, okay,

01:37:19 so let’s talk about AR here for a second.

01:37:21 You’re gonna have these glasses,

01:37:23 they’re gonna look hopefully

01:37:24 like a normal-ish looking pair of glasses,

01:37:28 but they’re gonna be able to put holograms in the world

01:37:31 and intermix virtual and physical objects in your scene.

01:37:35 And one of the things that’s gonna be unique about this

01:37:39 compared to every other computing device

01:37:41 that you’ve had before,

01:37:42 is that this is gonna be the first computing device

01:37:45 that has all the same signals

01:37:47 about what’s going on around you that you have.

01:37:49 Right, so your phone,

01:37:50 you can have it take a photo or a video,

01:37:54 but I mean, these glasses are gonna,

01:37:56 whenever you activate them,

01:37:57 they’re gonna be able to see what you see

01:37:59 from your perspective,

01:38:00 they’re gonna be able to hear what you hear

01:38:01 because the microphones and all that

01:38:03 are gonna be right around where your ears are.

01:38:05 So you’re gonna want an AI assistant,

01:38:08 that’s a new kind of AI assistant

01:38:10 that can basically help you process the world

01:38:13 from this first person perspective

01:38:17 or from the perspective that you have.

01:38:18 And the utility of that is gonna be huge,

01:38:21 but the kinds of AI models that we’re gonna need

01:38:25 are going to be just,

01:38:28 I don’t know, there’s a lot that we’re gonna need

01:38:30 to basically make advances in.

01:38:31 But I mean, but that’s why I think these concepts

01:38:33 of the metaverse and the advances in AI

01:38:36 are so fundamentally interlinked

01:38:40 that I mean, they’re kind of enabling each other.

01:38:42 Yeah, like the world builder is a really cool idea.

01:38:45 Like you can be like a Bob Ross,

01:38:47 like I’m gonna put a little tree right here.

01:38:49 Yeah.

01:38:49 I need a little tree, it’s missing a little tree.

01:38:51 And then, but at scale,

01:38:52 like enriching your experience in all kinds of ways.

01:38:55 You mentioned the assistant too,

01:38:56 that’s really interesting how you can have AI assistants

01:39:00 helping you out on different levels

01:39:01 of sort of intimacy of communication.

01:39:04 It could be just like scheduling

01:39:05 or it could be like almost like therapy.

01:39:08 Clearly I need some.

01:39:09 So let me ask you,

01:39:11 you’re one of the most successful people ever.

01:39:14 You’ve built an incredible company

01:39:16 that has a lot of impact.

01:39:18 What advice do you have for young people today?

01:39:23 How to live a life they can be proud of?

01:39:25 How to build something that can have a big positive impact

01:39:30 on the world?

01:39:31 Well, let’s break that down.

01:39:37 Cause I think ‘proud of,’ ‘big positive impact,’

01:39:41 Well, you’re actually listening.

01:39:42 And how to live your life

01:39:43 are actually three different things that I think,

01:39:47 I mean, they could line up,

01:39:48 but, and also like what age of people are you talking to?

01:39:52 Cause I mean, I can like.

01:39:53 High school and college.

01:39:54 So you don’t really know what you’re doing,

01:39:56 but you dream big.

01:39:58 And you really have a chance to do something unprecedented.

01:40:02 Yeah.

01:40:04 So I guess just to.

01:40:05 Also for people my age.

01:40:06 Okay, so let’s maybe start with the kind of most

01:40:09 philosophical and abstract version of this.

01:40:12 Every night when I put my daughters to bed,

01:40:16 we go through this thing and like,

01:40:20 they call it the good night things.

01:40:21 Cause that’s basically what we talk about at night.

01:40:25 And I just, I go through them.

01:40:29 Sounds like a good show.

01:40:31 The good night things.

01:40:32 Yeah.

01:40:33 Priscilla’s always asking, she’s like,

01:40:34 can I get good night things?

01:40:35 Like, I don’t know.

01:40:36 You go to bed too early.

01:40:37 But it’s,

01:40:41 but I basically go through with Max and Augie,

01:40:46 what are the things that are most important in life?

01:40:48 Right.

01:40:49 That I just, it’s like, what do I want them to remember

01:40:51 and just have like really ingrained in them as they grow up?

01:40:53 And it’s health, right?

01:40:56 Making sure that you take care of yourself

01:40:58 and keep yourself in good shape,

01:41:00 loving friends and family, right?

01:41:02 Because having the relationships,

01:41:05 the family and making time for friends,

01:41:08 I think is perhaps one of the most important things.

01:41:13 And then the third is maybe a little more amorphous,

01:41:16 but it is something that you’re excited about for the future.

01:41:19 And when I’m talking to a four year old,

01:41:21 often I’ll ask her what she’s excited about

01:41:23 for tomorrow or the week ahead.

01:41:25 But I think for most people, it’s really hard.

01:41:29 I mean, the world is a heavy place.

01:41:31 And I think like the way that we navigate it

01:41:34 is that we have things that we’re looking forward to.

01:41:37 So whether it is building AR glasses for the future

01:41:41 or being able to celebrate my 10 year wedding anniversary

01:41:45 with my wife that’s coming up,

01:41:47 it’s like, I think people,

01:41:48 you know, you have things that you’re looking forward to.

01:41:51 Or for the girls, it’s often I want to see mom

01:41:53 in the morning, right?

01:41:54 It’s just, but it’s like that’s a really critical thing.

01:41:57 And then the last thing is I ask them every day,

01:42:00 what did you do today to help someone?

01:42:04 Because I just think that that’s a really critical thing

01:42:07 is like, it’s easy to kind of get caught up in yourself

01:42:10 and kind of stuff that’s really far down the road,

01:42:14 but like, did you do something just concrete today

01:42:17 to help someone?

01:42:18 And, you know, it can just be as simple as, okay, yeah,

01:42:21 I helped set the table for lunch, right?

01:42:23 Or, you know, this other kid in our school

01:42:26 was having a hard time with something

01:42:27 and I like helped explain it to him.

01:42:29 But those are, that’s sort of like,

01:42:32 if you were to boil down my overall life philosophy

01:42:36 into what I try to impart to my kids,

01:42:40 those are the things that I think are really important.

01:42:43 So, okay, so let’s say college.

01:42:44 So if you’re in or just graduating college,

01:42:45 probably more practical advice: I’m always very focused

01:42:52 on people.

01:42:53 And I think the most important decision

01:42:57 you’re probably gonna make if you’re in college

01:42:59 is who you surround yourself with,

01:43:01 because you become like the people

01:43:02 you surround yourself with.

01:43:04 And I sort of have this hiring heuristic at Meta,

01:43:09 which is that I will only hire someone to work for me

01:43:13 if I could see myself working for them.

01:43:17 Not necessarily that I want them to run the company

01:43:19 because I like my job, but in an alternate universe,

01:43:22 if it was their company and I was looking

01:43:23 to go work somewhere, would I be happy to work for them?

01:43:27 And I think that that’s a helpful heuristic

01:43:31 to help balance, you know,

01:43:33 when you’re building something like this,

01:43:33 there’s a lot of pressure to, you know,

01:43:36 you wanna build out your team,

01:43:37 because there’s a lot of stuff that you need to get done.

01:43:39 And everyone always says, don’t compromise on quality,

01:43:41 but there’s this question of, okay,

01:43:42 well, how do you know that someone is good enough?

01:43:44 And I think my answer is, I would want someone

01:43:46 to be on my team if I would work for them.

01:43:50 But I think it’s actually a pretty similar answer

01:43:53 to like, if you were choosing friends or a partner

01:43:58 or something like that.

01:43:59 So when you’re kind of in college,

01:44:01 trying to figure out what your circle is gonna be,

01:44:03 trying to figure out, you know,

01:44:07 you’re evaluating different job opportunities.

01:44:09 Who are the people, even if they’re gonna be peers

01:44:12 in what you’re doing,

01:44:14 who are the people who, in an alternate universe,

01:44:17 you would wanna work for them,

01:44:18 because you think you’re gonna learn a lot from them,

01:44:20 because they are kind of values aligned

01:44:24 on the things that you care about,

01:44:25 and they’re gonna push you,

01:44:28 but also they know different things

01:44:29 and have different experiences

01:44:30 that are kind of more of what you wanna become like

01:44:32 over time.

01:44:33 But I don’t know, I think probably people are too,

01:44:37 in general, objective focused,

01:44:39 and maybe not focused enough on the connections

01:44:42 and the people who they’re basically building relationships

01:44:46 with.

01:44:47 I don’t know what it says about me,

01:44:48 but my place in Austin now has seven legged robots.

01:44:53 So I’ve surrounded myself with robots,

01:44:55 which is probably something I should look into.

01:44:59 What kind of world would you like to see your daughters

01:45:02 grow up in, even after you’re gone?

01:45:09 Well, I think one of the promises of all the stuff

01:45:11 that is getting built now is that it can be a world

01:45:15 where more people can just live out their imagination.

01:45:21 One of my favorite quotes,

01:45:23 I think it was attributed to Picasso,

01:45:25 it’s that all children are artists,

01:45:26 and the challenge is how do you remain one

01:45:28 when you grow up?

01:45:29 And if you have kids, this is pretty clear,

01:45:33 I mean, they just have wonderful imaginations.

01:45:36 And part of what I think is gonna be great

01:45:38 about the creator economy and the metaverse

01:45:41 and all this stuff is this notion

01:45:44 that a lot more people in the future

01:45:46 are gonna get to work doing creative stuff

01:45:49 than what I think today we would just consider

01:45:51 traditional labor or service.

01:45:53 And I think that that’s awesome.

01:45:56 And that’s a lot of what people are here to do

01:46:00 is collaborate together, work together,

01:46:03 think of things that you wanna build and go do it.

01:46:06 And I don’t know, one of the things

01:46:08 that I just think is striking,

01:46:09 so I teach my daughters some basic coding with Scratch.

01:46:13 I mean, they’re still obviously really young,

01:46:15 but I think of coding as building,

01:46:18 where it’s like when I’m coding,

01:46:19 I’m building something that I want to exist.

01:46:22 But my youngest daughter, she’s very musical

01:46:27 and pretty artistic and she thinks about coding as art.

01:46:32 She calls it code art, not the code,

01:46:35 but the output of what she is making.

01:46:37 It’s like, she’s just very interested visually

01:46:39 in what she can kind of output and how it can move around.

01:46:42 And do we need to fix that?

01:46:45 Are we good?

01:46:45 What happened?

01:46:47 Do we have to clap, Alexa?

01:46:49 Yeah, so I was just talking about Augie and her code art,

01:46:53 but I mean, to me, this is like a beautiful thing, right?

01:46:56 The notion that like for me,

01:46:58 coding was this functional thing and I enjoyed it.

01:47:01 And it like helped build something utilitarian,

01:47:04 but that for the next generation of people,

01:47:06 it will be even more an expression

01:47:10 of their kind of imagination and artistic sense

01:47:14 for what they want to exist.

01:47:15 So I don’t know if that happens,

01:47:17 if we can help bring about this world

01:47:20 where a lot more people can,

01:47:23 that that’s like their existence going forward

01:47:25 is being able to basically create

01:47:28 and live out all these different kinds of art.

01:47:32 I just think that that’s like a beautiful

01:47:34 and wonderful thing and will be very freeing for humanity

01:47:37 to spend more of our time on the things that matter to us.

01:47:40 Yeah, allow more and more people to express their art

01:47:43 in the full meaning of that word.

01:47:45 That’s a beautiful vision.

01:47:46 We mentioned that you are mortal.

01:47:50 Are you afraid of death?

01:47:51 Do you think about your mortality?

01:47:56 And are you afraid of it?

01:48:01 You didn’t sign up for this on a podcast, did you?

01:48:03 No, I mean, it’s an interesting question.

01:48:07 I mean, I’m definitely aware of it.

01:48:08 I do a fair amount of like extreme sport type stuff.

01:48:13 So like, so I’m definitely aware of it.

01:48:19 And you’re flirting with it a bit.

01:48:22 I train hard.

01:48:23 I mean, so it’s like, if I’m gonna go out

01:48:25 in like a 15-foot wave.

01:48:27 Go out big.

01:48:28 Well, then it’s like, all right,

01:48:29 I’ll make sure we have the right safety gear

01:48:31 and like make sure that I’m like used to that spot

01:48:34 and all that stuff.

01:48:35 But like, but you know, I mean, you.

01:48:37 The risk is still there.

01:48:38 You take some head blows along the way.

01:48:40 Yes, but definitely aware of it.

01:48:45 Definitely would like to stay safe.

01:48:48 I have a lot of stuff that I want to build and want to.

01:48:52 Does it freak you out that it’s finite though?

01:48:55 That there’s a deadline when it’s all over

01:48:59 and that there’ll be a time when your daughters are around

01:49:01 and you’re gone?

01:49:03 I don’t know.

01:49:03 That doesn’t freak me out.

01:49:04 I think, I don’t know.

01:49:09 Constraints are helpful.

01:49:16 Yeah.

01:49:17 Yeah, the finiteness makes ice cream

01:49:20 taste more delicious somehow.

01:49:21 The fact that it’s gonna be over.

01:49:23 There’s something about that with the metaverse too.

01:49:25 You want, we talked about this identity earlier,

01:49:28 like having just one, like NFTs.

01:49:30 There’s something powerful about the constraint

01:49:34 of finiteness or uniqueness.

01:49:36 That this moment is singular in history.

01:49:39 But I mean, a lot of,

01:49:41 as you go through different waves of technology,

01:49:42 I think a lot of what is interesting is

01:49:44 what becomes in practice infinite

01:49:48 or kind of there can be many, many of a thing

01:49:51 and then what ends up still being constrained.

01:49:53 So the metaverse should hopefully allow

01:50:00 a very large number or maybe in practice,

01:50:04 hopefully close to an infinite amount of expression

01:50:06 and worlds, but we’ll still only have

01:50:09 a finite amount of time.

01:50:11 Yes.

01:50:12 I think living longer is good.

01:50:18 And obviously all of my, our philanthropic work is,

01:50:21 it’s not focused on longevity,

01:50:23 but it is focused on trying to achieve

01:50:25 what I think is a possible goal in this century,

01:50:29 which is to be able to cure, prevent

01:50:31 or manage all diseases.

01:50:33 So I certainly think people kind of getting sick

01:50:36 and dying is a bad thing,

01:50:37 and I’m dedicating almost all of my capital

01:50:40 towards advancing research in that area to push on that,

01:50:44 which I mean, we could do a whole,

01:50:45 another one of these podcasts about that

01:50:46 because that’s a fascinating topic.

01:50:49 I mean, this is with your wife Priscilla Chan,

01:50:51 you formed the Chan Zuckerberg Initiative,

01:50:54 gave away 99% or pledged to give away 99%

01:50:57 of your Facebook, now Meta, shares.

01:50:59 I mean, like you said, we could talk forever

01:51:01 about all the exciting things you’re working on there,

01:51:06 including the sort of moonshot of eradicating disease

01:51:11 by the mid century marker.

01:51:13 I don’t actually know if you’re gonna ever eradicate it,

01:51:15 but I think you can get to a point where you

01:51:17 can either cure things that happened, right?

01:51:20 So people get diseases, but you can cure them.

01:51:22 Prevent is probably closest to eradication

01:51:25 or just be able to manage as sort of like ongoing things

01:51:28 that are not gonna ruin your life.

01:51:33 And I think that that’s possible.

01:51:34 I think saying that there’s gonna be no disease at all

01:51:37 probably is not possible within the next several decades.

01:51:41 Basic thing is increase the quality of life

01:51:44 and maybe keep the finiteness

01:51:46 because it makes everything taste more delicious.

01:51:50 Maybe that’s just being a romantic 20th century human.

01:51:54 Maybe, but I mean, but it was an intentional decision

01:51:57 to not focus our philanthropy, like, explicitly

01:52:01 on longevity or living forever.

01:52:03 Yes.

01:52:06 If at the moment of your death, and by the way,

01:52:09 I like that the lights went out

01:52:11 when we started talking about death.

01:52:13 You get to meet God.

01:52:14 It does make it a lot more dramatic.

01:52:15 It does.

01:52:17 I should get closer to the mic.

01:52:19 At the moment of your death, you get to meet God

01:52:23 and you get to ask one question.

01:52:26 What question would you like to ask?

01:52:29 Or maybe a whole conversation.

01:52:31 I don’t know.

01:52:32 It’s up to you.

01:52:32 It’s more dramatic when it’s just one question.

01:52:37 Well, if it’s only one question and I died,

01:52:42 I would just wanna know that Priscilla and my family,

01:52:48 like if they were gonna be okay.

01:52:50 That might depend on the circumstances of my death.

01:52:54 But I think that in most circumstances that I can think of,

01:52:58 that’s probably the main thing that I would care about.

01:53:01 Yeah, I think God will hear that question and be like,

01:53:02 all right, fine, you get in.

01:53:04 That’s the right question to ask.

01:53:06 Is it?

01:53:07 I don’t know.

01:53:08 The humility and selfishness.

01:53:09 All right, you’re in.

01:53:10 I mean, but well, maybe.

01:53:14 They’re gonna be fine.

01:53:15 Don’t worry, you’re in.

01:53:16 Okay, but I mean, one of the things that I think

01:53:18 I struggle with at least is on the one hand,

01:53:22 that’s probably the thing that’s closest to me

01:53:25 and maybe the most common human experience.

01:53:29 But I don’t know, one of the things that I just struggle with

01:53:32 in terms of running this large enterprise is like,

01:53:38 should the thing that I care more about

01:53:41 be that responsibility?

01:53:44 And I think it’s shifted over time.

01:53:49 I mean, like before I really had a family

01:53:52 that was like the only thing I cared about.

01:53:53 And at this point, I mean, I care deeply about it,

01:53:59 but yeah, I think that that’s not as obvious of a question.

01:54:06 Yeah, we humans are weird.

01:54:07 You get this ability to impact millions of lives,

01:54:12 even billions of lives,

01:54:15 and it’s something you care about,

01:54:16 but the weird humans that are closest to us,

01:54:21 those are the ones that mean the most.

01:54:23 And I suppose that’s the dream of the metaverse

01:54:26 is to connect, form small groups like that

01:54:29 where you can have those intimate relationships.

01:54:31 Let me ask you the big, ridiculous.

01:54:33 Well, and to be able to be close,

01:54:36 not just based on who you happen to be next to.

01:54:39 I think that’s what the internet is already doing

01:54:41 is allowing you to spend more of your time

01:54:44 not physically proximate.

01:54:46 I mean, I always think when you think about the metaverse,

01:54:49 people ask this question about the real world.

01:54:52 It’s like the virtual world versus the real world.

01:54:54 And it’s like, no, the real world is a combination

01:54:58 of the virtual world and the physical world.

01:55:00 But I think over time, as we get more technology,

01:55:04 the physical world is becoming less of a percent

01:55:06 of the real world.

01:55:08 And I think that that opens up a lot of opportunities

01:55:10 for people, because you can work in different places.

01:55:13 You can stay closer to people

01:55:17 who are in different places.

01:55:18 So I think that’s good.

01:55:19 Removing barriers of geography

01:55:21 and then barriers of language.

01:55:23 That’s a beautiful vision.

01:55:25 Big, ridiculous question.

01:55:27 What do you think is the meaning of life?

01:55:44 I think that, well, there are probably a couple

01:55:46 of different ways that I would go at this.

01:55:52 But I think it gets back to this last question

01:55:53 that we talked about, about the duality

01:55:55 between you have the people around you

01:55:58 who you care the most about,

01:56:00 and then there’s like this bigger thing

01:56:03 that maybe you’re building.

01:56:05 And I think that in my own life, I mean,

01:56:07 I sort of think about this tension,

01:56:09 but I mean, it’s like, I started this whole company

01:56:11 and my life’s work is around human connection.

01:56:15 So I think it’s intellectually probably the thing

01:56:22 that I go to first is just that human connection

01:56:27 is the meaning.

01:56:29 And I mean, I think that it’s a thing

01:56:31 that our society probably systematically undervalues.

01:56:36 I mean, I just remember when I was growing up

01:56:39 and in school, it’s like, do your homework

01:56:43 and then go play with your friends after.

01:56:45 And it’s like, no, well, what if playing

01:56:47 with your friends is the point?

01:56:50 That sounds like an argument your daughter would make.

01:56:52 Well, I mean, I don’t know, I just think it’s interesting.

01:56:54 Homework doesn’t even matter, man.

01:56:56 Well, I think it’s interesting because it’s,

01:56:58 and people, I think people tend to think

01:57:02 about that stuff as wasting time,

01:57:05 or that’s like what you do in the free time that you have.

01:57:08 But like, what if that’s actually the point?

01:57:11 So that’s one.

01:57:12 But here’s maybe a different way of coming at this,

01:57:14 which is maybe more like religious in nature.

01:57:17 I mean, I always like,

01:57:22 there’s a rabbi who I’ve studied with

01:57:25 who kind of gave me this,

01:57:27 we were talking through Genesis and the Bible and the Torah

01:57:31 and they’re basically walking through,

01:57:36 it’s like, okay, you go through the seven days of creation

01:57:40 and it’s basically, it’s like,

01:57:45 why does the Bible start there?

01:57:48 Right, it’s like it could have started anywhere,

01:57:49 right, in terms of like how to live.

01:57:52 But basically it starts with talking about

01:57:54 how God created people in his, her image.

01:58:00 But the Bible starts by talking about

01:58:02 how God created everything.

01:58:04 So I actually think that there’s like a compelling argument

01:58:11 that I think I’ve always just found meaningful

01:58:12 and inspiring that a lot of the point

01:58:18 of what sort of religion has been telling us

01:58:22 that we should do is to create and build things.

01:58:30 So these things are not necessarily at odds.

01:58:32 I mean, I think like, I mean, that’s,

01:58:34 and I think probably to some degree

01:58:36 you’d expect me to say something like this

01:58:37 because I’ve dedicated my life to creating things

01:58:39 that help people connect.

01:58:40 So, I mean, that’s sort of the fusion of,

01:58:43 I mean, getting back to what we talked about earlier,

01:58:45 it’s, I mean, what I studied in school

01:58:46 was psychology and computer science, right?

01:58:48 So it’s, I mean, these are like the two themes

01:58:50 that I care about, but I don’t know for me,

01:58:54 that’s kind of what I think about, that’s what matters.

01:58:57 To create and to love, which is the ultimate form

01:59:02 of connection.

01:59:03 I think this is one hell of an amazing replay experience

01:59:07 in the metaverse.

01:59:08 So whoever is using our avatars years from now,

01:59:11 I hope you had fun and thank you for talking today.

01:59:14 Thank you.

01:59:16 Thanks for listening to this conversation

01:59:18 with Mark Zuckerberg.

01:59:19 To support this podcast, please check out our sponsors

01:59:22 in the description.

01:59:23 And now, let me leave you with the end of the poem, If,

01:59:27 by Rudyard Kipling.

01:59:30 If you can talk with crowds and keep your virtue,

01:59:34 or walk with kings, nor lose the common touch,

01:59:37 if neither foes nor loving friends can hurt you,

01:59:41 if all men count with you, but none too much.

01:59:47 If you can fill the unforgiving minute

01:59:49 with sixty seconds’ worth of distance run,

01:59:52 yours is the earth and everything that’s in it.

01:59:56 And which is more, you’ll be a man, my son.

01:59:59 Thank you for listening and hope to see you next time.