Cristos Goodrow: YouTube Algorithm #68

Transcript

00:00:00 The following is a conversation with Cristos Goodrow,

00:00:03 Vice President of Engineering at Google and Head of Search and Discovery at YouTube,

00:00:08 also known as the YouTube Algorithm.

00:00:11 YouTube has approximately 1.9 billion users,

00:00:15 and every day people watch over 1 billion hours of YouTube video.

00:00:20 It is the second most popular search engine behind Google itself.

00:00:24 For many people, it is not only a source of entertainment,

00:00:27 but also how we learn new ideas from math and physics videos to podcasts to debates, opinions,

00:00:33 ideas from out of the box thinkers and activists on some of the most tense,

00:00:38 challenging, and impactful topics in the world today.

00:00:42 YouTube and other content platforms receive criticism from both viewers and creators,

00:00:48 as they should, because the engineering task before them is hard, and they don’t always

00:00:53 succeed, and the impact of their work is truly world changing.

00:00:58 To me, YouTube has been an incredible wellspring of knowledge.

00:01:02 I’ve watched hundreds, if not thousands, of lectures that changed the way I see

00:01:06 many fundamental ideas in math, science, engineering, and philosophy.

00:01:12 But it does put a mirror to ourselves, and keeps the responsibility of the steps we take

00:01:17 in each of our online educational journeys into the hands of each of us.

00:01:21 The YouTube algorithm has an important role in that journey of helping us find new,

00:01:26 exciting ideas to learn about.

00:01:28 That’s a difficult and an exciting problem for an artificial intelligence system.

00:01:33 As I’ve said in lectures and other forums, recommendation systems will be one of the

00:01:37 most impactful areas of AI in the 21st century, and YouTube is one of the biggest

00:01:43 recommendation systems in the world.

00:01:46 This is the Artificial Intelligence Podcast.

00:01:49 If you enjoy it, subscribe on YouTube, give it five stars on Apple Podcast, follow on

00:01:54 Spotify, support it on Patreon, or simply connect with me on Twitter, at Lex Fridman,

00:01:59 spelled F R I D M A N.

00:02:02 I recently started doing ads at the end of the introduction.

00:02:05 I’ll do one or two minutes after introducing the episode, and never any ads in the middle

00:02:10 that can break the flow of the conversation.

00:02:12 I hope that works for you and doesn’t hurt the listening experience.

00:02:16 This show is presented by Cash App, the number one finance app in the App Store.

00:02:20 I personally use Cash App to send money to friends, but you can also use it to buy,

00:02:25 sell, and deposit Bitcoin in just seconds.

00:02:27 Cash App also has a new investing feature.

00:02:30 You can buy fractions of a stock, say, $1 worth, no matter what the stock price is.

00:02:35 Brokerage services are provided by Cash App Investing, a subsidiary of Square, and member

00:02:40 SIPC.

00:02:41 I’m excited to be working with Cash App to support one of my favorite organizations

00:02:45 called FIRST, best known for their FIRST Robotics and Lego competitions.

00:02:50 They educate and inspire hundreds of thousands of students in over 110 countries and have

00:02:56 a perfect rating on Charity Navigator, which means that donated money is used to maximum

00:03:00 effectiveness.

00:03:02 When you get Cash App from the App Store or Google Play and use code LEXPODCAST, you’ll

00:03:08 get $10, and Cash App will also donate $10 to FIRST, which again is an organization that

00:03:14 I’ve personally seen inspire girls and boys to dream of engineering a better world.

00:03:19 And now, here’s my conversation with Cristos Goodrow.

00:03:24 YouTube is the world’s second most popular search engine, behind Google, of course.

00:03:29 We watch more than 1 billion hours of YouTube videos a day, more than Netflix and Facebook

00:03:34 video combined.

00:03:35 YouTube creators upload over 500,000 hours of video every day.

00:03:41 Average lifespan of a human being, just for comparison, is about 700,000 hours.

00:03:47 So, what’s uploaded every single day is just enough for a human to watch in a lifetime.

00:03:53 So, let me ask an absurd philosophical question.

00:03:56 If from birth, when I was born, and there’s many people born today with the internet,

00:04:00 I watched YouTube videos nonstop, do you think there are trajectories through YouTube video

00:04:06 space that can maximize my average happiness, or maybe education, or my growth as a human

00:04:14 being?

00:04:15 I think there are some great trajectories through YouTube videos, but I wouldn’t recommend

00:04:21 that anyone spend all of their waking hours or all of their hours watching YouTube.

00:04:26 I mean, I think about the fact that YouTube has been really great for my kids, for instance.

00:04:32 My oldest daughter, she’s been watching YouTube for several years.

00:04:37 She watches Tyler Oakley and the Vlogbrothers, and I know that it’s had a very profound and

00:04:44 positive impact on her character.

00:04:46 And my younger daughter, she’s a ballerina, and her teachers tell her that YouTube is

00:04:52 a huge advantage for her because she can practice a routine and watch professional dancers do

00:04:58 that same routine and stop it and back it up and rewind and all that stuff, right?

00:05:03 So, it’s been really good for them.

00:05:06 And then even my son is a sophomore in college.

00:05:08 He got through his linear algebra class because of a channel called 3Blue1Brown,

00:05:15 which helps you understand linear algebra, but in a way that would be very hard for anyone

00:05:22 to do on a whiteboard or a chalkboard.

00:05:25 And so, I think that those experiences, from my point of view, were very good.

00:05:30 And so, I can imagine really good trajectories through YouTube, yes.

00:05:34 Have you looked at, do you think broadly about that trajectory over a period?

00:05:38 Because YouTube has grown up now.

00:05:41 So, over a period of years, you just kind of gave a few anecdotal examples, but I used

00:05:48 to watch certain shows on YouTube.

00:05:49 I don’t anymore.

00:05:50 I’ve moved on to other shows.

00:05:52 Ultimately, you want people to, from YouTube’s perspective, to stay on YouTube, to grow as

00:05:57 human beings on YouTube.

00:06:00 So, you have to think not just about what makes them engage today or this month,

00:06:07 but also about what makes them engage over a period of years.

00:06:12 Absolutely.

00:06:13 That’s right.

00:06:13 I mean, if YouTube is going to continue to enrich people’s lives, then it has to grow

00:06:20 with them, and people’s interests change over time.

00:06:25 And so, I think we’ve been working on this problem, and I’ll just say it broadly as

00:06:31 like how to introduce diversity and introduce people who are watching one thing to something

00:06:38 else they might like.

00:06:40 We’ve been working on that problem all the eight years I’ve been at YouTube.

00:06:45 It’s a hard problem because, I mean, of course, it’s trivial to introduce diversity

00:06:51 that doesn’t help.

00:06:52 Yeah, just add a random video.

00:06:54 I could just randomly select a video from the billions that we have.

00:06:58 It’s likely not to even be in your language.

00:07:01 So, the likelihood that you would watch it and develop a new interest is very, very low.

00:07:08 And so, what you want to do when you’re trying to increase diversity is find something that

00:07:14 is not too similar to the things that you’ve watched, but also something that you might

00:07:21 be likely to watch.

00:07:23 And that balance, finding that spot between those two things is quite challenging.

00:07:28 So, the diversity of content, diversity of ideas, it’s a really difficult, it’s a thing

00:07:36 like that’s almost impossible to define, right?

00:07:39 Like, what’s different?

00:07:41 So, how do you think about that?

00:07:43 So, two examples. I’m a huge fan of 3Blue1Brown, say. And then, as one kind of diversity,

00:07:51 I wasn’t even aware of a channel called Veritasium, which is a great science, physics, whatever

00:07:57 channel.

00:07:57 So, one version of diversity is showing me Derek’s Veritasium channel, which I was really

00:08:03 excited to discover.

00:08:04 I actually now watch a lot of his videos.

00:08:06 Okay, so you’re a person who’s watching some math channels and you might be interested

00:08:12 in some other science or math channels.

00:08:14 So, like you mentioned, the first kind of diversity is just show you some things from

00:08:20 other channels that are related, but not just, you know, not all the 3Blue1Brown

00:08:27 channel, throw in a couple others.

00:08:29 So, that’s maybe the first kind of diversity that we started with many, many years ago.

00:08:36 Taking a bigger leap is about, I mean, the mechanisms we use for that is we basically

00:08:44 cluster videos and channels together, mostly videos.

00:08:48 We do almost everything at the video level.

00:08:50 And so, we’ll make some kind of a cluster via some embedding process and then measure

00:08:58 what is the likelihood that users who watch one cluster might also watch another cluster

00:09:05 that’s very distinct.

00:09:06 So, we may come to find that people who watch science videos also like jazz.

00:09:15 This is possible, right?

00:09:16 And so, because of that relationship that we’ve identified through the embeddings and

00:09:25 then the measurement of the people who watch both, we might recommend a jazz video once

00:09:30 in a while.

00:09:31 So, there’s this cluster in the embedding space of jazz videos and science videos.

00:09:36 And so, you kind of try to look at aggregate statistics where if a lot of people that jump

00:09:42 from science cluster to the jazz cluster tend to remain as engaged or become more engaged,

00:09:51 then that means those two, they should hop back and forth and they’ll be happy.

00:09:57 Right.

00:09:57 There’s a higher likelihood that a person who’s watching science would like jazz than

00:10:03 the person watching science would like, I don’t know, backyard railroads or something

00:10:08 else, right?

00:10:08 And so, we can try to measure these likelihoods and use that to make the best recommendation

00:10:15 we can.
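
To make the aggregate-statistics idea concrete, here is a minimal sketch of estimating how likely viewers of one cluster are to also watch another. The clusters, users, and histories are invented for illustration; this is not YouTube’s pipeline, just the shape of the measurement being described.

```python
# Hypothetical data: each user's watch history, already mapped to cluster labels.
watch_histories = {
    "user_a": ["science", "science", "jazz"],
    "user_b": ["science", "jazz", "jazz"],
    "user_c": ["science", "backyard_railroads"],
    "user_d": ["science", "science"],
}

def cross_cluster_likelihood(histories, source, target):
    """Fraction of users who watched the source cluster that also watched the target."""
    watched_source = [h for h in histories.values() if source in h]
    watched_both = [h for h in watched_source if target in h]
    return len(watched_both) / len(watched_source) if watched_source else 0.0

# Science watchers here are more likely to also watch jazz than backyard railroads,
# which is the kind of statistic that could justify an occasional jazz recommendation.
print(cross_cluster_likelihood(watch_histories, "science", "jazz"))                # 0.5
print(cross_cluster_likelihood(watch_histories, "science", "backyard_railroads"))  # 0.25
```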

00:10:16 So, okay.

00:10:16 So, we’ll talk about the machine learning of that, but I have to linger on things that

00:10:21 neither you nor anyone has an answer to.

00:10:24 There’s gray areas of truth, which is, for example, now I can’t believe I’m going there,

00:10:31 but politics.

00:10:32 It happens so that certain people believe certain things and they’re very certain about

00:10:36 them.

00:10:38 Let’s move outside the red versus blue politics of today’s world, but there’s different ideologies.

00:10:44 For example, in college, I read and studied quite a lot of Ayn Rand, and that’s a particular

00:10:49 philosophical ideology I found interesting to explore.

00:10:53 Okay.

00:10:53 So, that was that kind of space.

00:10:55 I’ve kind of moved on from that cluster intellectually, but it nevertheless is an interesting cluster.

00:11:00 I was born in the Soviet Union.

00:11:02 Socialism, communism is a certain kind of political ideology that’s really interesting

00:11:06 to explore.

00:11:07 Again, objectively, there’s a set of beliefs about how the economy should work and so on.

00:11:12 And so, it’s hard to know what’s true or not in terms of people within those communities

00:11:18 are often advocating that this is how we achieve utopia in this world, and they’re pretty

00:11:24 certain about it.

00:11:25 So, how do you try to manage politics in this chaotic, divisive world,

00:11:33 not just politics but any kind of ideas, in terms of filtering what people should watch next

00:11:38 and in terms of also not letting certain things be on YouTube?

00:11:44 This is an exceptionally difficult responsibility.

00:11:47 Well, the responsibility to get this right is our top priority.

00:11:52 And the first comes down to making sure that we have good, clear rules of the road, right?

00:11:58 Like, just because we have freedom of speech doesn’t mean that you can literally say anything,

00:12:03 right?

00:12:03 Like, we as a society have accepted certain restrictions on our freedom of speech.

00:12:10 There are things like libel laws and things like that.

00:12:13 And so, where we can draw a clear line, we do,

00:12:20 and we continue to evolve that line over time.

00:12:27 However, as you pointed out, wherever you draw the line, there’s going to be a border

00:12:32 line.

00:12:33 And in that border line area, we are going to maybe not remove videos, but we will try

00:12:40 to reduce the recommendations of them or the proliferation of them by demoting them.

00:12:47 Alternatively, in those situations, we try to raise what we would call authoritative or

00:12:53 credible sources of information.
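
A crude sketch of the “demote borderline, raise authoritative” adjustment being described, applied on top of a base ranking score. The thresholds, multipliers, and function name are invented for illustration, not the production logic.

```python
def adjusted_score(base_score, borderline_prob, authoritativeness,
                   demotion=0.3, boost=1.2, borderline_threshold=0.8):
    """Demote likely borderline videos and modestly boost authoritative sources.
    All thresholds and multipliers here are made up for illustration."""
    score = base_score
    if borderline_prob >= borderline_threshold:
        score *= demotion        # reduce recommendations of borderline content
    if authoritativeness >= 0.9:
        score *= boost           # raise credible, authoritative sources
    return score

# A likely-borderline video loses most of its score; a credible source gains a little.
print(adjusted_score(1.0, borderline_prob=0.95, authoritativeness=0.2))   # 0.3
print(adjusted_score(1.0, borderline_prob=0.05, authoritativeness=0.95))  # 1.2
```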

00:12:55 So, we’re not trying to, I mean, you mentioned Ayn Rand and communism.

00:13:03 Those are two valid points of view that people are going to debate and discuss.

00:13:07 And of course, people who believe in one or the other of those things are going to try

00:13:13 to persuade other people to their point of view.

00:13:15 And so, we’re not trying to settle that or choose a side or anything like that.

00:13:21 What we’re trying to do is make sure that the people who are expressing those points

00:13:26 of view and offering those positions are authoritative and credible.

00:13:33 So, let me ask a question about people I don’t like personally.

00:13:38 You heard me.

00:13:39 I don’t care if you leave comments on this.

00:13:41 But sometimes, they’re brilliantly funny, which is trolls.

00:13:45 So, people who kind of mock, I mean, the internet, Reddit, is full of mock-style

00:13:53 comedy where people just kind of make fun of things, point out that the emperor has no clothes.

00:13:59 And there’s brilliant comedy in that, but sometimes it can get cruel and mean.

00:14:03 So, on that, on the mean point, and sorry to keep bringing up the comments, but I’m going to,

00:14:10 and sorry to linger on these things that have no good answers.

00:14:13 But actually, I totally hear you that this is really important that you’re trying to

00:14:19 solve it.

00:14:19 But how do you reduce the meanness of people on YouTube?

00:14:26 I understand that anyone who uploads YouTube videos has to become resilient to a certain

00:14:33 amount of meanness.

00:14:35 Like I’ve heard that from many creators.

00:14:37 And we are trying in various ways, comment ranking, allowing certain features to block

00:14:47 people, to reduce or make that meanness or that trolling behavior less effective on YouTube.

00:14:55 Yeah.

00:14:56 And so, I mean, it’s very important, but it’s something that we’re going to keep having

00:15:05 to work on and as we improve it, like maybe we’ll get to a point where people don’t have

00:15:12 to suffer this sort of meanness when they upload YouTube videos.

00:15:16 I hope we do, but it just does seem to be something that you have to be able to deal

00:15:23 with as a YouTube creator nowadays.

00:15:25 Do you have a hope that, so you mentioned two things that I kind of agree with.

00:15:29 So there’s like a machine learning approach of ranking comments based on whatever, based

00:15:37 on how much they contribute to the healthy conversation.

00:15:40 Let’s put it that way.

00:15:41 Then the other is almost an interface question of how do you, how does the creator filter?

00:15:48 So block or how does, how do humans themselves, the users of YouTube manage their own conversation?

00:15:56 Do you have hope that these two tools will create a better society without limiting freedom

00:16:02 of speech too much, without sort of attacking, even like saying that people, what do you

00:16:07 mean limiting, sort of curating speech?

00:16:12 I mean, I think that that overall is our whole project here at YouTube.

00:16:16 Right.

00:16:17 Like we fundamentally believe and I personally believe very much that YouTube can be great.

00:16:24 It’s been great for my kids.

00:16:26 I think it can be great for society.

00:16:29 But it’s absolutely critical that we get this responsibility part right.

00:16:34 And that’s why it’s our top priority.

00:16:37 Susan Wojcicki, who’s the CEO of YouTube, she says something that I personally find

00:16:42 very inspiring, which is that we want to do our jobs today in a manner so that people

00:16:49 20 and 30 years from now will look back and say, YouTube, they really figured this out.

00:16:54 They really found a way to strike the right balance between the openness and the value

00:17:00 that the openness has and also making sure that we are meeting our responsibility to

00:17:06 users in society.

00:17:09 So the burden on YouTube actually is quite incredible.

00:17:12 And I think people don’t give enough credit to the seriousness and the magnitude

00:17:18 of the problem.

00:17:19 So I personally hope that you do solve it because a lot is in your hand, a lot is riding

00:17:26 on your success or failure.

00:17:28 So it’s besides, of course, running a successful company, you’re also curating the content

00:17:34 of the internet and the conversation on the internet.

00:17:36 That’s a powerful thing.

00:17:40 So one thing that people wonder about is how much of it can be solved with pure machine

00:17:48 learning.

00:17:49 So looking at the data, studying the data and creating algorithms that curate the comments,

00:17:55 curate the content, and how much of it needs human intervention, meaning people here at

00:18:02 YouTube in a room sitting and thinking about what is the nature of truth, what are the

00:18:11 ideals that we should be promoting, that kind of thing.

00:18:14 So algorithm versus human input, what’s your sense?

00:18:18 I mean, my own experience has demonstrated that you need both of those things.

00:18:25 Algorithms, I mean, you’re familiar with machine learning algorithms and the thing

00:18:29 they need most is data and the data is generated by humans.

00:18:34 And so, for instance, when we’re building a system to try to figure out which are the

00:18:42 videos that are misinformation or borderline policy violations, well, the first thing we

00:18:49 need to do is get human beings to make decisions about which of those videos are in which category.

00:18:57 And then we use that data and basically take that information that’s determined and governed

00:19:04 by humans and extrapolate it or apply it to the entire set of billions of YouTube videos.

00:19:12 And we couldn’t get to all the videos on YouTube well without the humans, and we couldn’t use

00:19:20 the humans to get to all the videos of YouTube.

00:19:23 So there’s no world in which you have only one or the other of these things.

00:19:28 And just as you said, a lot of it comes down to people at YouTube spending a lot of time

00:19:37 trying to figure out what are the right policies, what are the outcomes based on those policies,

00:19:43 are they the kinds of things we want to see?

00:19:46 And then once we kind of get an agreement or build some consensus around what the policies

00:19:53 are, well, then we’ve got to find a way to implement those policies across all of YouTube.

00:19:59 And that’s where both the human beings, we call them evaluators or reviewers, come into

00:20:05 play to help us with that.

00:20:07 And then once we get a lot of training data from them, then we apply the machine learning

00:20:12 techniques to take it even further.
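
As a rough illustration of the loop just described, label a small sample by hand, then extrapolate with machine learning to the rest of the corpus, here is a toy sketch. The two-number feature vectors and labels are made up; the point is only the train-on-reviewed, apply-to-unreviewed pattern.

```python
from sklearn.linear_model import LogisticRegression

# Hypothetical features for a handful of human-reviewed videos (e.g., scores from
# upstream signals), with the reviewers' decisions as labels:
# 1 = borderline / policy-violating, 0 = fine.
reviewed_features = [[0.9, 0.1], [0.8, 0.2], [0.2, 0.7], [0.1, 0.9], [0.3, 0.8], [0.85, 0.15]]
reviewer_labels = [1, 1, 0, 0, 0, 1]

# Train on the human-labeled sample...
model = LogisticRegression().fit(reviewed_features, reviewer_labels)

# ...then apply the model to videos no human has looked at.
unreviewed_features = [[0.95, 0.05], [0.15, 0.85]]
print(model.predict_proba(unreviewed_features)[:, 1])  # estimated probability of "borderline"
```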

00:20:14 Do you have a sense that these human beings have a bias in some kind of direction?

00:20:20 I mean, that’s an interesting question.

00:20:22 We do sort of in autonomous vehicles and computer vision in general, a lot of annotation, and

00:20:28 we rarely ask what bias do the annotators have.

00:20:35 Even in the sense that they’re better at annotating certain things than others.

00:20:42 For example, when annotating segmentation, people are much better

00:20:48 at segmenting cars in a scene versus segmenting

00:20:56 bushes or trees.

00:20:59 There’s specific mechanical reasons for that, but also because it’s semantic gray area.

00:21:04 And just for a lot of reasons, people are just terrible at annotating trees.

00:21:09 Okay, so in the same kind of sense, do you think of, in terms of people reviewing videos

00:21:15 or annotating the content of videos, is there some kind of bias that you’re aware of or

00:21:21 seek out in that human input?

00:21:24 Well, we take steps to try to overcome these kinds of biases or biases that we think would

00:21:30 be problematic.

00:21:32 So for instance, like we ask people to have a bias towards scientific consensus.

00:21:38 That’s something that we instruct them to do.

00:21:41 We ask them to have a bias towards demonstration of expertise or credibility or authoritativeness.

00:21:48 But there are other biases that we want to make sure to try to remove.

00:21:53 And there’s many techniques for doing this.

00:21:55 One of them is you send the same thing to be reviewed to many people.

00:22:01 And so, that’s one technique.

00:22:04 Another is that you make sure that the people that are doing these sorts of tasks

00:22:09 are from different backgrounds and different areas of the United States or

00:22:15 of the world.

00:22:17 But then, even with all of that, it’s possible for certain kinds of what we would call unfair

00:22:25 biases to creep into machine learning systems, primarily, as you said, because maybe the

00:22:31 training data itself comes in in a biased way.

00:22:34 So, we also have worked very hard on improving the machine learning systems to remove and

00:22:41 reduce unfair biases when it goes against or involves some protected class, for instance.
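
One way to picture the “send the same thing to many reviewers” technique is simple aggregation, for example taking the majority decision so that no single reviewer’s bias settles the outcome. A toy sketch, with invented labels:

```python
from collections import Counter

def majority_label(ratings):
    """Aggregate several reviewers' decisions about one video into a single label."""
    return Counter(ratings).most_common(1)[0][0]

# Hypothetical: five reviewers from different backgrounds rate the same video.
ratings_for_video = ["ok", "borderline", "ok", "ok", "borderline"]
print(majority_label(ratings_for_video))  # "ok"
```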

00:22:51 Thank you for exploring with me some of the more challenging things.

00:22:55 I’m sure there’s a few more that we’ll jump back to.

00:22:57 But let me jump into the fun part, which is maybe the basics of the quote, unquote, YouTube

00:23:05 algorithm.

00:23:06 What does the YouTube algorithm look at to make recommendation for what to watch next?

00:23:11 And it’s from a machine learning perspective.

00:23:14 Or when you search for a particular term, how does it know what to show you next?

00:23:20 Because it seems to, at least for me, do an incredible job of both.

00:23:25 Well, that’s kind of you to say.

00:23:26 It didn’t use to do a very good job, but it’s gotten better over the years.

00:23:31 Even I observed that it’s improved quite a bit.

00:23:35 Those are two different situations.

00:23:36 Like when you search for something, YouTube uses the best technology we can get from Google

00:23:45 to make sure that the YouTube search system finds what someone’s looking for.

00:23:50 And of course, the very first thing that one thinks about is, okay, well, does the

00:23:55 word occur in the title, for instance?

00:24:00 But there are much more sophisticated things where we’re mostly trying to do some syntactic

00:24:07 match or maybe a semantic match based on words that we can add to the document itself.

00:24:15 For instance, maybe is this video watched a lot after this query?

00:24:21 That’s something that we can observe and then as a result, make sure that that document

00:24:30 would be retrieved for that query.
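
A very rough sketch of combining the two retrieval signals mentioned here, a title match plus a “watched a lot after this query” statistic, into one score. The weights, titles, and rates are invented for illustration.

```python
def search_score(query, title, watch_after_query_rate,
                 title_weight=1.0, behavior_weight=2.0):
    """Toy retrieval score: how much of the query appears in the title, plus how
    often the video is watched after this query is issued (a behavioral signal)."""
    query_terms = query.lower().split()
    title_terms = set(title.lower().split())
    title_match = sum(t in title_terms for t in query_terms) / len(query_terms)
    return title_weight * title_match + behavior_weight * watch_after_query_rate

# Hypothetical candidates for the query "linear algebra": one has the words in the
# title, the other doesn't but gets watched a lot right after that query.
print(search_score("linear algebra", "Essence of linear algebra chapter 1", 0.10))  # 1.2
print(search_score("linear algebra", "Vectors, what even are they?", 0.45))         # 0.9
```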

00:24:33 Now, when you talk about what kind of videos would be recommended to watch next, that’s

00:24:40 something, again, we’ve been working on for many years and probably the first real attempt

00:24:50 to do that well was to use collaborative filtering.

00:24:55 Can you describe what collaborative filtering is?

00:24:57 Sure.

00:24:58 It’s just basically what we do is we observe which videos get watched close together by

00:25:06 the same person.

00:25:08 And if you observe that and if you can imagine creating a graph where the videos that get

00:25:15 watched close together by the most people are very close to one another in this graph

00:25:20 and videos that don’t frequently get watched close together by the same person or the same

00:25:26 people are far apart, then you end up with this graph that we call the related graph

00:25:33 that basically represents videos that are very similar or related in some way.
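
A minimal sketch of how a co-watch “related graph” like this could be assembled from watch sessions. The sessions and video names are invented, and the real system obviously operates at a vastly larger scale with embeddings rather than raw counts.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical watch sessions: videos each person watched close together.
sessions = [
    ["linear_algebra_ep1", "linear_algebra_ep2", "physics_intro"],
    ["linear_algebra_ep1", "physics_intro"],
    ["jazz_standards", "jazz_piano_solo"],
]

# Count how often each pair of videos is watched close together by the same person.
co_watch_counts = defaultdict(int)
for session in sessions:
    for a, b in combinations(sorted(set(session)), 2):
        co_watch_counts[(a, b)] += 1

# Pairs with higher counts are "closer" in the related graph; unrelated videos
# (the jazz ones versus the math ones) never get an edge at all.
for (a, b), count in sorted(co_watch_counts.items(), key=lambda kv: -kv[1]):
    print(a, "--", b, ":", count)
```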

00:25:38 And what’s amazing about that is that it puts all the videos that are in the same

00:25:45 language together, for instance, and we didn’t even have to think about language.

00:25:51 It just does it, right?

00:25:52 And it puts all the videos that are about sports together and it puts most of the music

00:25:56 videos together and it puts all of these sorts of videos together just because that’s sort

00:26:02 of the way the people using YouTube behave.

00:26:05 So that already cleans up a lot of the problem.

00:26:10 It takes care of the lowest hanging fruit, which happens to be a huge one of just managing

00:26:16 these millions of videos.

00:26:18 That’s right.

00:26:19 I remember a few years ago I was talking to someone who was trying to propose that we

00:26:27 do a research project concerning people who are bilingual, and this person was making

00:26:37 this proposal based on the idea that YouTube could not possibly be good at recommending

00:26:44 videos well to people who are bilingual.

00:26:48 And so she was telling me about this and I said, well, can you give me an example of

00:26:54 what problem do you think we have on YouTube with the recommendations?

00:26:57 And so she said, well, I’m a researcher in the US and when I’m looking for academic

00:27:04 topics, I want to see them in English.

00:27:07 And so she searched for one, found a video, and then looked at the watch next suggestions

00:27:12 and they were all in English.

00:27:14 And so she said, oh, I see.

00:27:16 YouTube must think that I speak only English.

00:27:18 And so she said, now I’m actually originally from Turkey and sometimes when I’m cooking,

00:27:23 let’s say I want to make some baklava, I really like to watch videos that are in Turkish.

00:27:27 And so she searched for a video about making the baklava and then selected it and it was

00:27:33 in Turkish and the watch next recommendations were in Turkish.

00:27:35 And she just couldn’t believe how this was possible and how is it that you know that

00:27:41 I speak both these two languages and put all the videos together?

00:27:44 And it’s just as a sort of an outcome of this related graph that’s created through

00:27:49 collaborative filtering.

00:27:51 So for me, one of my huge interests is just human psychology, right?

00:27:54 And that’s such a powerful platform on which to utilize human psychology to discover what

00:28:02 people, individual people want to watch next.

00:28:04 But it’s also just fascinating to me.

00:28:06 You know, Google search has the ability to look at your own history, and I’ve done

00:28:13 that before, just looked at what I’ve searched over many, many years.

00:28:17 And it’s a fascinating picture of who I am, actually.

00:28:21 And I don’t think anyone’s ever summarized it.

00:28:24 I personally would love that.

00:28:26 A summary of who I am as a person on the internet,

00:28:32 because I think it reveals, I think it

00:28:38 puts a mirror to me or to others.

00:28:41 You know, that’s actually quite revealing and interesting. You know, maybe,

00:28:47 it’s a joke but not really, the number of cat videos I’ve watched or

00:28:53 videos of people falling, you know, stuff that’s absurd, that kind of stuff.

00:28:59 It’s really interesting.

00:29:00 And of course it’s really good for the machine learning aspect, to figure out

00:29:06 what to show next.

00:29:06 But it’s interesting.

00:29:09 Have you just as a tangent played around with the idea of giving a map to people sort of,

00:29:16 as opposed to just using this information to show what’s next, showing them here are

00:29:22 the clusters you’ve loved over the years kind of thing?

00:29:25 Well, we do provide the history of all the videos that you’ve watched.

00:29:29 Yes.

00:29:29 So you can definitely search through that and look through it and search through it

00:29:32 to see what it is that you’ve been watching on YouTube.

00:29:35 We have actually in various times experimented with this sort of cluster idea, finding ways

00:29:44 to demonstrate or show people what topics they’ve been interested in or what clusters

00:29:51 they’ve watched from.

00:29:51 It’s interesting that you bring this up because in some sense, the way the recommendation

00:29:58 system of YouTube sees a user is exactly as the history of all the videos they’ve

00:30:04 watched on YouTube.

00:30:06 And so you can think of yourself or any user on YouTube as kind of like a DNA strand of

00:30:17 all your videos, right?

00:30:18 That sort of represents you, you can also think of it as maybe a vector in the space

00:30:23 of all the videos on YouTube.

00:30:26 And so now once you think of it as a vector in the space of all the videos on YouTube,

00:30:31 then you can start to say, okay, well, which other vectors are close to me and to my vector?

00:30:39 And that’s one of the ways that we generate some diverse recommendations is because you’re

00:30:44 like, okay, well, these people seem to be close with respect to the videos they’ve

00:30:50 watched on YouTube, but here’s a topic or a video that one of them has watched and

00:30:55 enjoyed, but the other one hasn’t, that could be an opportunity to make a good recommendation.
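
To picture the “user as a vector in the space of all videos” idea, here is a toy sketch using raw watch counts and cosine similarity to find a nearby user and borrow a video they enjoyed. In practice these vectors would come from learned embeddings; the names and numbers below are invented.

```python
import numpy as np

# Toy video vocabulary; each user is a vector of how much they watched each video.
videos = ["science_1", "science_2", "jazz_1", "cooking_1"]
users = {
    "you":   np.array([5.0, 3.0, 0.0, 0.0]),
    "alice": np.array([4.0, 4.0, 2.0, 0.0]),
    "bob":   np.array([0.0, 0.0, 0.0, 6.0]),
}

def cosine(u, v):
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

# Find the user whose watch vector is closest to yours...
others = [(cosine(users["you"], vec), name) for name, vec in users.items() if name != "you"]
closest = max(others)[1]

# ...and surface videos they watched that you haven't, as diversity candidates.
candidates = [v for v, yours, theirs in zip(videos, users["you"], users[closest])
              if yours == 0 and theirs > 0]
print(closest, candidates)  # alice ['jazz_1']
```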

00:31:01 I got to tell you, I mean, I know I’m going to ask for things that are impossible, but

00:31:04 I would love to cluster the human beings.

00:31:07 I would love to know who has similar trajectories as me, because you probably would want to

00:31:12 hang out, right?

00:31:14 There’s a social aspect there, like actually finding some of the most fascinating people

00:31:18 I find on YouTube, but have like no followers and I start following them and they create

00:31:23 incredible content and on that topic, I just love to ask, there’s some videos that just

00:31:29 blow my mind in terms of quality and depth and just in every regard are amazing videos

00:31:37 and they have like 57 views, okay?

00:31:40 How do you get videos of quality to be seen by many eyes?

00:31:46 So the measure of quality, is it just something, yeah, how do you know that something is good?

00:31:53 Well, I mean, I think it depends initially on what sort of video we’re talking about.

00:31:58 So in the realm of, let’s say you mentioned politics and news, in that realm, you know,

00:32:08 quality news or quality journalism relies on having a journalism department, right?

00:32:16 Like you have to have actual journalists and fact checkers and people like that and so

00:32:22 in that situation and in others, maybe science or in medicine, quality has a lot to do with

00:32:30 the authoritativeness and the credibility and the expertise of the people who make the

00:32:34 video.

00:32:36 Now, if you think about the other end of the spectrum, you know, what is the highest quality

00:32:42 prank video or what is the highest quality Minecraft video, right?

00:32:49 That might be the one that people enjoy watching the most and watch to the end or it might

00:32:54 be the one that when we ask people the next day after they watched it, were they satisfied

00:33:03 with it?

00:33:04 And so we in, especially in the realm of entertainment, have been trying to get at better and better

00:33:11 measures of quality or satisfaction or enrichment since I came to YouTube.

00:33:19 And we started with, well, you know, the first approximation is the one that gets more views.

00:33:27 But you know, we both know that things can get a lot of views and not really be that

00:33:32 high quality, especially if people are clicking on something and then immediately realizing

00:33:37 that it’s not that great and abandoning it.

00:33:41 And that’s why we moved from views to thinking about the amount of time people spend watching

00:33:46 it with the premise that like, you know, in some sense, the time that someone spends watching

00:33:52 a video is related to the value that they get from that video.

00:33:57 It may not be perfectly related, but it has something to say about how much value they

00:34:02 get.

00:34:04 But even that’s not good enough, right?

00:34:05 Because I myself have spent time clicking through channels on television late at night

00:34:11 and ended up watching Under Siege 2 for some reason I don’t know.

00:34:16 And if you were to ask me the next day, are you glad that you watched that show on TV

00:34:21 last night?

00:34:22 I’d say, yeah, I wish I would have gone to bed or read a book or almost anything else,

00:34:27 really.

00:34:29 And so that’s why some people got the idea a few years ago to try to survey users afterwards.

00:34:35 And so we get feedback data from those surveys and then use that in the machine learning

00:34:43 system to try to not just predict what you’re going to click on right now, what you might

00:34:47 watch for a while, but what, when we ask you tomorrow, you’ll give four or five stars to.
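
To make the shift from “predict the click” to “predict tomorrow’s survey answer” concrete, here is a toy sketch where aggregate behavioral signals are the features and the next-day survey rating is the training label. The numbers are invented and this is not the actual model, only the idea of optimizing for the survey outcome.

```python
from sklearn.linear_model import LinearRegression

# Hypothetical per-video features built from the signals discussed (fraction of the
# video watched on average, like ratio, share rate), and the label we actually care
# about: the average next-day survey rating, one to five stars.
features = [
    [0.95, 0.90, 0.20],
    [0.85, 0.70, 0.10],
    [0.15, 0.05, 0.00],   # gets clicks, but people abandon it almost immediately
    [0.60, 0.40, 0.05],
    [0.90, 0.85, 0.25],
]
avg_survey_stars = [4.8, 4.1, 1.5, 3.2, 4.7]

model = LinearRegression().fit(features, avg_survey_stars)

# Rank candidate videos by predicted satisfaction rather than by clicks alone.
candidates = [[0.20, 0.10, 0.00], [0.80, 0.75, 0.15]]
print(model.predict(candidates))
```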

00:34:54 So just to summarize, what are the signals from a machine learning perspective that a

00:34:59 user can provide?

00:35:00 So you mentioned just clicking on the video, the views, the time watched, maybe the relative

00:35:05 time watched, clicking like and dislike on the video, maybe commenting on the video.

00:35:12 All of those things.

00:35:14 All of those things.

00:35:15 And then the one I wasn’t actually quite aware of, even though I might have engaged in it

00:35:20 is a survey afterwards, which is a brilliant idea.

00:35:24 Is there other signals?

00:35:26 I mean, that’s already a really rich space of signals to learn from.

00:35:30 Is there something else?

00:35:31 Well, you mentioned commenting, also sharing the video.

00:35:35 If you think it’s worthy to be shared with someone else you know.

00:35:39 Within YouTube or outside of YouTube as well?

00:35:41 Either.

00:35:42 Let’s see, you mentioned like, dislike.

00:35:44 Like and dislike.

00:35:45 How important is that?

00:35:47 It’s very important, right?

00:35:48 We want, it’s predictive of satisfaction.

00:35:52 But it’s not perfectly predictive.

00:35:56 Subscribe.

00:35:57 If you subscribe to the channel of the person who made the video, then that also is a piece

00:36:03 of information and it signals satisfaction.

00:36:07 Although over the years, we’ve learned that people have a wide range of attitudes about

00:36:13 what it means to subscribe.

00:36:17 We would ask some users who didn’t subscribe very much, but they watched a lot from a few

00:36:24 channels.

00:36:25 We’d say, well, why didn’t you subscribe?

00:36:26 And they would say, well, I can’t afford to pay for anything.

00:36:32 We tried to let them understand like, actually it doesn’t cost anything.

00:36:35 It’s free.

00:36:36 It just helps us know that you are very interested in this creator.

00:36:41 But then we’ve asked other people who subscribe to many things and don’t really watch any

00:36:47 of the videos from those channels.

00:36:49 And we say, well, why did you subscribe to this if you weren’t really interested in any

00:36:54 more videos from that channel?

00:36:56 And they might tell us, well, I just, you know, I thought the person did a great job

00:37:00 and I just want to kind of give them a high five.

00:37:03 And so.

00:37:04 Yeah.

00:37:05 That’s where I sit.

00:37:06 I go to channels where I just, this person is amazing.

00:37:11 I like this person.

00:37:13 But then I like this person and I really want to support them.

00:37:18 That’s how I click subscribe.

00:37:19 Even though, I mean, I may never actually want to click on their videos when they’re releasing

00:37:23 them.

00:37:24 I just love what they’re doing.

00:37:25 And it’s maybe outside of my interest area and so on, which is probably the wrong way

00:37:30 to use the subscribe button.

00:37:31 But I just want to say congrats.

00:37:32 This is great work.

00:37:34 Well, so you have to deal with the whole space of people, who all see the subscribe button

00:37:39 totally differently.

00:37:40 That’s right.

00:37:41 And so, you know, we can’t just close our eyes and say, sorry, you’re using it wrong.

00:37:46 You know, we’re not going to pay attention to what you’ve done.

00:37:50 We need to embrace all the ways in which all the different people in the world use the

00:37:53 subscribe button or the like and the dislike button.

00:37:57 So in terms of signals the machine learning is using for the search and for the recommendation,

00:38:05 you’ve mentioned title.

00:38:06 So like metadata, like text data that people provide description and title and maybe keywords.

00:38:13 Maybe you can speak to the value of those things in search and also this incredible

00:38:19 fascinating area of the content itself.

00:38:22 So the video content itself, trying to understand what’s happening in the video.

00:38:26 So YouTube released a data set that, you know, in the machine learning computer vision world,

00:38:30 this is just an exciting space.

00:38:33 How much is that currently?

00:38:35 How much are you playing with that currently?

00:38:37 How much is your hope for the future of being able to analyze the content of the video itself?

00:38:42 Well, we have been working on that also since I came to YouTube.

00:38:46 Analyzing the content.

00:38:47 Analyzing the content of the video, right?

00:38:50 And what I can tell you is that our ability to do it well is still somewhat crude.

00:39:00 We can tell if it’s a music video, we can tell if it’s a sports video, we can probably

00:39:05 tell you that people are playing soccer.

00:39:09 We probably can’t tell whether it’s Manchester United or my daughter’s soccer team.

00:39:15 So these things are kind of difficult and using them, we can use them in some ways.

00:39:21 So for instance, we use that kind of information to understand and inform these clusters that

00:39:27 I talked about.

00:39:30 And also maybe to add some words like soccer, for instance, to the video, if it doesn’t

00:39:34 occur in the title or the description, which is remarkable that often it doesn’t.
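
Conceptually, the “add some words like soccer to the video” step could look like the sketch below, where an inferred topic is appended to the searchable text only when it is not already in the title or description. The classifier output is faked; the function name is illustrative.

```python
def enrich_metadata(video, inferred_topics):
    """Append topic words inferred from the video content (e.g., "soccer") to the
    searchable text when they don't already appear in the title or description."""
    searchable = (video["title"] + " " + video["description"]).lower()
    video["extra_terms"] = [t for t in inferred_topics if t.lower() not in searchable]
    return video

video = {"title": "Saturday game highlights", "description": "Under-10 league, final match."}
# Pretend a content classifier produced these topics from the frames and audio.
print(enrich_metadata(video, ["soccer", "sports"]))
```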

00:39:40 One of the things that I ask creators to do is please help us out with the title and the

00:39:47 description.

00:39:48 For instance, we were a few years ago having a live stream of some competition for World

00:39:56 of Warcraft on YouTube.

00:39:59 And it was a very important competition, but if you typed World of Warcraft in search,

00:40:04 you wouldn’t find it.

00:40:05 World of Warcraft wasn’t in the title?

00:40:07 World of Warcraft wasn’t in the title.

00:40:09 It was match 478, you know, A team versus B team and World of Warcraft wasn’t in the

00:40:14 title.

00:40:15 I’m just like, come on, give me.

00:40:17 Being literal on the internet is actually very uncool, which is the problem.

00:40:22 Oh, is that right?

00:40:23 Well, I mean, in some sense, well, some of the greatest videos, I mean, there’s a humor

00:40:28 to just being indirect, being witty and so on.

00:40:31 And actually being, you know, machine learning algorithms want you to be, you know, literal,

00:40:37 right?

00:40:38 You just want to say what’s in the thing, be very, very simple.

00:40:42 And in some sense that gets away from wit and humor.

00:40:46 So you have to play with both, right?

00:40:48 But you’re saying that for now, sort of the content of the title, the content of the description,

00:40:54 the actual text is one of the best ways for the algorithm to find your video and put them

00:41:01 in the right cluster.

00:41:03 That’s right.

00:41:04 And I would go further and say that if you want people, human beings to select your video

00:41:10 in search, then it helps to have, let’s say World of Warcraft in the title.

00:41:14 Because why would a person, you know, if they’re looking at a bunch, they type World of Warcraft

00:41:20 and they have a bunch of videos, all of whom say World of Warcraft, except the one that

00:41:23 you uploaded.

00:41:24 Well, even the person is going to think, well, maybe somehow search made a mistake.

00:41:29 This isn’t really about World of Warcraft.

00:41:31 So it’s important not just for the machine learning systems, but also for the people

00:41:36 who might be looking for this sort of thing.

00:41:38 They get a clue that it’s what they’re looking for by seeing that same thing prominently

00:41:44 in the title of the video.

00:41:45 Okay.

00:41:46 Let me push back on that.

00:41:47 So I think from the algorithm perspective, yes, but if they typed in World of Warcraft

00:41:52 and saw a video with the title simply Winning, and the thumbnail has like a sad orc

00:42:02 or something, I don’t know, right?

00:42:04 Like I think that’s much, it gets your curiosity up.

00:42:11 And then if they could trust that the algorithm was smart enough to figure out somehow that

00:42:15 this is indeed a World of Warcraft video, that would have created the most beautiful

00:42:20 experience.

00:42:21 I think in terms of just the wit and the humor and the curiosity that we human beings naturally

00:42:25 have.

00:42:26 But you’re saying, I mean, realistically speaking, it’s really hard for the algorithm

00:42:30 to figure out that the content of that video will be a World of Warcraft video.

00:42:34 And you have to accept that some people are going to skip it.

00:42:37 Yeah.

00:42:38 Right?

00:42:39 I mean, and so you’re right.

00:42:41 The people who don’t skip it and select it are going to be delighted, but other people

00:42:47 might say, yeah, this is not what I was looking for.

00:42:50 And making stuff discoverable, I think is what you’re really working on and hoping.

00:42:56 So yeah.

00:42:57 So from your perspective, put stuff in the title description.

00:43:00 And remember the collaborative filtering part of the system starts by the same user watching

00:43:07 videos together, right?

00:43:09 So the way that they’re probably going to do that is by searching for them.

00:43:14 That’s a fascinating aspect of it.

00:43:15 It’s like ant colonies.

00:43:16 That’s how they find stuff.

00:43:19 So I mean, to what degree, for collaborative filtering in general, is one curious ant, one curious

00:43:27 user, essential?

00:43:28 So just a person who is more willing to click on random videos and sort of explore these

00:43:33 cluster spaces.

00:43:35 In your sense, how many people are just like watching the same thing over and over and

00:43:39 over and over?

00:43:40 And how many are just like the explorers and just kind of like click on stuff and then

00:43:44 help the other ant in the ant’s colony discover the cool stuff?

00:43:49 Do you have a sense of that at all?

00:43:51 I really don’t think I have a sense for the relative sizes of those groups.

00:43:56 But I would say that people come to YouTube with some certain amount of intent.

00:44:01 And as long as they, to the extent to which they try to satisfy that intent, that certainly

00:44:08 helps our systems, right?

00:44:09 Because our systems rely on kind of a faithful amount of behavior, right?

00:44:17 And there are people who try to trick us, right?

00:44:19 There are people and machines that try to associate videos together that really don’t

00:44:25 belong together, but they’re trying to get that association made because it’s profitable

00:44:30 for them.

00:44:31 And so we have to always be resilient to that sort of attempt at gaming the systems.

00:44:37 So speaking to that, there’s a lot of people that in a positive way, perhaps, I don’t know,

00:44:42 I don’t like it, but like to want to try to game the system to get more attention.

00:44:47 Every creator, in a positive sense, wants to get attention, right?

00:44:51 So how do you work in this space when people create more and more sort of click baity titles

00:45:01 and thumbnails?

00:45:02 Sort of, Veritasium’s Derek has made a video where he basically describes that it seems

00:45:08 what works is to create a high quality video, really good video, where people would want

00:45:12 to watch it once they click on it, but have click baity titles and thumbnails to get them

00:45:18 to click on it in the first place.

00:45:19 And he’s saying, I’m embracing this fact, I’m just going to keep doing it.

00:45:23 And I hope you forgive me for doing it and you will enjoy my videos once you click on

00:45:28 them.

00:45:29 So in what sense do you see this kind of click bait style attempt to manipulate, to get people

00:45:38 in the door to manipulate the algorithm or play with the algorithm or game the algorithm?

00:45:43 I think that you can look at it as an attempt to game the algorithm.

00:45:47 But even if you were to take the algorithm out of it and just say, okay, well, all these

00:45:52 videos happen to be lined up, which the algorithm didn’t make any decision about which one to

00:45:57 put at the top or the bottom, but they’re all lined up there, which one are the people

00:46:02 going to choose?

00:46:04 And I’ll tell you the same thing that I told Derek: I have a bookshelf and it has

00:46:09 two kinds of books on it. Science books:

00:46:13 I have my math books from when I was a student and they all look identical except for the

00:46:19 titles on the covers.

00:46:21 They’re all yellow, they’re all from Springer, and on every single one of them

00:46:24 the cover is totally the same.

00:46:27 Yes.

00:46:28 Right?

00:46:29 Yeah.

00:46:30 On the other hand, I have other more pop science type books and they all have very interesting

00:46:34 covers and they have provocative titles and things like that.

00:46:40 I wouldn’t say that they’re click baity because they are indeed good books.

00:46:45 And I don’t think that they cross any line, but that’s just a decision you have to make.

00:46:52 Like the person who wrote Classical Recursion Theory, Piergiorgio Odifreddi, was fine with

00:46:58 the yellow cover with just the title and nothing more.

00:47:02 Whereas I think other people who wrote a more popular type book understand that they need

00:47:10 to have a compelling cover and a compelling title.

00:47:15 And I don’t think there’s anything really wrong with that.

00:47:19 We do take steps to make sure that there is a line that you don’t cross.

00:47:24 And if you go too far, maybe your thumbnail is especially racy or it’s all caps with too

00:47:32 many exclamation points, we observe that users are sometimes offended by that.

00:47:41 And so for the users who are offended by that, we will then depress or suppress those videos.

00:47:51 And which reminds me, there’s also another signal where users can say, I don’t know if

00:47:55 it was recently added, but I really enjoy it.

00:47:58 Just saying, something like, I don’t want to see this video anymore or something like,

00:48:04 like this is a, like there’s certain videos that just cut me the wrong way.

00:48:09 Like just, just jump out at me, it’s like, I don’t want to, I don’t want this.

00:48:12 And it feels really good to clean that up, to be like, I don’t, that’s not, that’s not

00:48:17 for me.

00:48:18 I don’t know.

00:48:19 I think that might’ve been recently added, but that’s also a really strong signal.

00:48:22 Yes, absolutely.

00:48:23 Right.

00:48:24 We don’t want to make a recommendation that people are unhappy with.

00:48:29 And that makes me, that particular one makes me feel good as a user in general and as a

00:48:34 machine learning person.

00:48:35 Cause I feel like I’m helping the algorithm.

00:48:37 My interactions on YouTube don’t always feel like I’m helping the algorithm.

00:48:41 Like I’m not reminded of that fact.

00:48:43 Like for example, Tesla and Autopilot and Elon Musk create a feeling for their customers,

00:48:50 for people that own Teslas, that they’re helping the algorithm of Tesla vehicles.

00:48:54 Like they’re all, like, really proud they’re helping the fleet learn.

00:48:57 I think YouTube doesn’t always remind people that you’re helping the algorithm get smarter.

00:49:02 And for me, I love that idea.

00:49:04 Like we’re all collaboratively, like Wikipedia gives that sense that we’re all together creating

00:49:09 a beautiful thing.

00:49:12 YouTube doesn’t always remind me of that.

00:49:14 This conversation is reminding me of that, but.

00:49:18 Well that’s a good tip.

00:49:19 We should keep that fact in mind when we design these features.

00:49:22 I’m not sure I really thought about it that way, but that’s a very interesting perspective.

00:49:28 It’s an interesting question of personalization that I feel like when I click like on a video,

00:49:35 I’m just improving my experience.

00:49:39 It would be great.

00:49:40 It would make me personally, people are different, but make me feel great if I was helping also

00:49:45 the YouTube algorithm broadly say something.

00:49:47 You know what I’m saying?

00:49:48 Like there’s a, that I don’t know if that’s human nature, but you want the products you

00:49:53 love, and I certainly love YouTube, like you want to help it get smarter, smarter, smarter

00:49:58 because there’s some kind of coupling between our lives together being better.

00:50:04 If YouTube is better than I will, my life will be better.

00:50:07 And there’s that kind of reasoning.

00:50:08 I’m not sure what that is and I’m not sure how many people share that feeling.

00:50:12 That could be just a machine learning feeling.

00:50:14 But on that point, how much personalization is there in terms of next video recommendations?

00:50:22 So is it kind of all really boiling down to clustering?

00:50:28 Like if I’m the nearest clusters to me and so on and that kind of thing, or how much

00:50:33 is personalized to me, the individual completely?

00:50:36 It’s very, very personalized.

00:50:38 So your experience will be quite a bit different from anybody else’s who’s watching that same

00:50:45 video, at least when they’re logged in.

00:50:48 And the reason is that we found that users often want two different kinds of things when

00:50:56 they’re watching a video.

00:50:58 Sometimes they want to keep watching more on that topic or more in that genre.

00:51:05 And other times they just are done and they’re ready to move on to something else.

00:51:09 And so the question is, well, what is the something else?

00:51:13 And one of the first things one can imagine is, well, maybe something else is the latest

00:51:19 video from some channel to which you’ve subscribed.

00:51:22 And that’s going to be very different for you than it is for me.

00:51:27 And even if it’s not something that you subscribe to, it’s something that you watch a lot.

00:51:31 And again, that’ll be very different on a person by person basis.

00:51:34 And so even the Watch Next, as well as the homepage, of course, is quite personalized.

00:51:43 So what, we mentioned some of the signals, but what does success look like?

00:51:47 What does success look like in terms of the algorithm creating a great long term experience

00:51:52 for a user?

00:51:53 Or to put another way, if you look at the videos I’ve watched this month, how do you

00:52:00 know the algorithm succeeded for me?

00:52:03 I think, first of all, if you come back and watch more YouTube, then that’s one indication

00:52:09 that you found some value from it.

00:52:10 So just the number of hours is a powerful indicator.

00:52:13 Well, I mean, not the hours themselves, but the fact that you return on another day.

00:52:22 So that’s probably the most simple indicator.

00:52:26 People don’t come back to things that they don’t find value in, right?

00:52:29 There’s a lot of other things that they could do.

00:52:32 But like I said, ideally, we would like everybody to feel that YouTube enriches their lives

00:52:38 and that every video they watched is the best one they’ve ever watched since they’ve started

00:52:43 watching YouTube.

00:52:44 And so that’s why we survey them and ask them, is this one to five stars?

00:52:52 And so our version of success is every time someone takes that survey, they say it’s five

00:52:58 stars.

00:53:00 And if we ask them, is this the best video you’ve ever seen on YouTube?

00:53:03 They say, yes, every single time.

00:53:05 So it’s hard to imagine that we would actually achieve that.

00:53:09 Maybe asymptotically we would get there, but that would be what we think success is.
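
As a back-of-the-envelope version of those two success measures, here is a sketch that computes whether users came back on a later day and what fraction of survey responses were five stars. The event log is invented; real measurement would be far more involved.

```python
from collections import defaultdict

# Hypothetical log of (user, day visited) plus survey responses in stars.
visits = [("u1", 1), ("u1", 2), ("u2", 1), ("u3", 1), ("u3", 3)]
survey_stars = [5, 4, 5, 5, 3]

days_by_user = defaultdict(set)
for user, day in visits:
    days_by_user[user].add(day)

# Did people come back on another day, and how often did surveys say five stars?
return_rate = sum(len(d) > 1 for d in days_by_user.values()) / len(days_by_user)
five_star_rate = sum(s == 5 for s in survey_stars) / len(survey_stars)

print(f"return rate: {return_rate:.2f}, five-star rate: {five_star_rate:.2f}")
```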

00:53:16 It’s funny.

00:53:17 I’ve recently said somewhere, I don’t know, maybe tweeted, but that Ray Dalio has this

00:53:23 video on the economic machine, I forget what it’s called, but it’s a 30 minute video.

00:53:29 And I said it’s the greatest video I’ve ever watched on YouTube.

00:53:32 It’s like I watched the whole thing and my mind was blown as a very crisp, clean description

00:53:38 of how the, at least the American economic system works.

00:53:41 It’s a beautiful video.

00:53:43 And I was just, I wanted to click on something to say this is the best thing.

00:53:47 This is the best thing ever.

00:53:48 Please let me, I can’t believe I discovered it.

00:53:51 I mean, the views and the likes reflect its quality, but I was almost upset that I haven’t

00:53:57 found it earlier and wanted to find other things like it.

00:54:01 I don’t think I’ve ever felt that this is the best video I’ve ever watched.

00:54:05 That was that.

00:54:06 And to me, the ultimate utopia, the best experience, would be one where, with every single video,

00:54:10 I don’t see any videos I regret and every single video I watch is one that

00:54:15 actually helps me grow, helps me enjoy life, be happy and so on.

00:54:25 So that’s a heck of a, that’s one of the most beautiful and ambitious, I think, machine

00:54:31 learning tasks.

00:54:32 So when you look at a society as opposed to the individual user, do you think of how YouTube

00:54:37 is changing society when you have these millions of people watching videos, growing, learning,

00:54:44 changing, having debates?

00:54:45 Do you have a sense of, yeah, what the big impact on society is?

00:54:51 I think it’s huge, but do you have a sense of what direction we’re taking this world?

00:54:55 Well, I mean, I think openness has had an impact on society already.

00:55:02 There’s a lot of…

00:55:03 What do you mean by openness?

00:55:05 Well, the fact that unlike other mediums, there’s not someone sitting at YouTube who

00:55:14 decides before you can upload your video, whether it’s worth having you upload it or

00:55:20 worth anybody seeing it really, right?

00:55:23 And so there are some creators who say, like, I wouldn’t have this opportunity to reach

00:55:32 an audience.

00:55:33 Tyler Oakley often said that he wouldn’t have had this opportunity to reach this audience

00:55:39 if it weren’t for YouTube.

00:55:44 And so I think that’s one way in which YouTube has changed society.

00:55:50 I know that there are people that I work with from outside the United States, especially

00:55:56 from places where literacy is low, and they think that YouTube can help in those places

00:56:03 because you don’t need to be able to read and write in order to learn something important

00:56:09 for your life, maybe how to do some job or how to fix something.

00:56:15 And so that’s another way in which I think YouTube is possibly changing society.

00:56:21 So I’ve worked at YouTube for eight, almost nine years now.

00:56:25 And it’s fun because I meet people and you tell them where you work, you say you work

00:56:32 on YouTube and they immediately say, I love YouTube, right?

00:56:36 Which is great, makes me feel great.

00:56:39 But then of course, when I ask them, well, what is it that you love about YouTube?

00:56:43 Not one time ever has anybody said that the search works outstanding or that the recommendations

00:56:50 are great.

00:56:52 What they always say when I ask them, what do you love about YouTube is they immediately

00:56:57 start talking about some channel or some creator or some topic or some community that they

00:57:03 found on YouTube and that they just love.

00:57:07 And so that has made me realize that YouTube is really about the video and connecting the

00:57:16 people with the videos.

00:57:19 And then everything else kind of gets out of the way.

00:57:22 So beyond the video, it’s an interesting, because you kind of mentioned creator.

00:57:28 What about the connection with just the individual creators as opposed to just individual video?

00:57:35 So like I gave the example of Ray Dalio video that the video itself is incredible, but there’s

00:57:42 some people who are just creators that I love.

00:57:47 One of the cool things about people who call themselves YouTubers or whatever is they have

00:57:52 a journey.

00:57:53 They usually, almost all of them, they suck horribly in the beginning and then they kind

00:57:57 of grow and then there’s that genuineness in their growth.

00:58:01 So YouTube clearly wants to help creators connect with their audience in this kind of

00:58:07 way.

00:58:08 So how do you think about that process of helping creators grow, helping them connect

00:58:12 with their audience, develop not just individual videos, but the entirety of a creator’s life

00:58:17 on YouTube?

00:58:18 Well, I mean, we’re trying to help creators find the biggest audience that they can find.

00:58:24 And the reason why, you brought up creator versus video, the reason why the creator or

00:58:30 channel is so important is because if we have a hope of people coming back to YouTube, well,

00:58:41 they have to have in their minds some sense of what they’re going to find when they come

00:58:46 back to YouTube.

00:58:48 If YouTube were just the next viral video and I have no concept of what the next viral

00:58:54 video could be, one time it’s a cat playing a piano and the next day it’s some children

00:59:00 interrupting a reporter and the next day it’s some other thing happening, then it’s hard

00:59:06 for me to, when I’m not watching YouTube, say, gosh, I really would like to see something

00:59:14 from someone or about something, right?

00:59:17 And so that’s why I think this connection between fans and creators is so important

00:59:24 for both, because it’s a way of sort of fostering a relationship that can play out into the

00:59:31 future.

00:59:32 Let me talk about kind of a dark and interesting question in general, and again, a topic that

00:59:40 neither you nor anybody else has an answer to.

00:59:42 But social media gives us highs and it gives us lows, in the sense

00:59:50 that sort of creators often speak about having sort of burnout and having psychological ups

00:59:58 and downs and challenges mentally in terms of continuing the creation process.

01:00:02 There’s a momentum, there’s a huge excited audience that makes creators feel great.

01:00:08 And I think it’s more than just financial.

01:00:11 I think it’s literally just, they love that sense of community.

01:00:16 It’s part of the reason I upload to YouTube.

01:00:18 I don’t care about money, never will.

01:00:20 What I care about is the community, but some people feel this momentum even when

01:00:26 there are times in their life when they, you know, for some reason don’t feel

01:00:31 like creating.

01:00:32 So how do you think about burnout, this mental exhaustion that some YouTube creators go through?

01:00:38 Is that something we have an answer for?

01:00:40 Is that something, how do we even think about that?

01:00:42 Well, the first thing is we want to make sure that the YouTube systems are not contributing

01:00:47 to this sense, right?

01:00:49 And so we’ve done a fair amount of research to demonstrate that you can absolutely take

01:00:56 a break.

01:00:57 If you are a creator and you’ve been uploading a lot, we have just as many examples of people

01:01:03 who took a break and came back more popular than they were before as we have examples

01:01:08 of going the other way.

01:01:09 Yeah.

01:01:10 Can we pause on that for a second?

01:01:11 So the feeling that people have, I think, is if I take a break, everybody, the party

01:01:17 will leave, right?

01:01:19 So if you could just linger on that.

01:01:21 So in your sense that taking a break is okay.

01:01:24 Yes, taking a break is absolutely okay.

01:01:27 And the reason I say that is because we can observe many examples of

01:01:35 creators coming back very strong, and even stronger, after they have taken some sort of

01:01:40 break.

01:01:41 And so I just want to dispel the myth that this somehow necessarily means that your channel

01:01:50 is going to go down or lose views.

01:01:53 That is not the case.

01:01:55 We know for sure that this is not a necessary outcome.

01:01:59 And so we want to encourage people to make sure that they take care of themselves.

01:02:04 That is job one, right?

01:02:06 You have to look after yourself and your mental health.

01:02:10 And I think that it probably, in some of these cases, contributes to better videos once they

01:02:19 come back, right?

01:02:20 Because a lot of people, I mean, I know myself, if I burn out on something, then I’m probably

01:02:24 not doing my best work, even though I can keep working until I pass out.

01:02:30 And so I think that the taking a break may even improve the creative ideas that someone

01:02:38 has.

01:02:39 Okay.

01:02:40 I think that’s a really important thing to sort of dispel.

01:02:42 I think that applies to all of social media, like literally I’ve taken a break for a day

01:02:47 every once in a while.

01:02:49 Sorry if that sounds like a short time.

01:02:50 But even email, just taking a break

01:02:57 from email, or only checking email once a day, especially when you’re going through

01:03:02 something psychologically in your personal life or so on, or really not sleeping much

01:03:06 because of work deadlines, it can refresh you in a way that’s profound.

01:03:10 And so the same applies.

01:03:11 It was there when you came back, right?

01:03:13 It’s there.

01:03:14 And it looks different, actually, when you come back.

01:03:17 You’re sort of brighter eyed with some coffee, everything, the world looks better.

01:03:22 So it’s important to take a break when you need it.

01:03:26 So you’ve mentioned kind of the YouTube algorithm that isn’t E equals MC squared, it’s not the

01:03:33 single equation, it’s potentially sort of more than a million lines of code.

01:03:41 Is it more akin to what successful autonomous vehicles today are, which is they’re just

01:03:47 basically patches on top of patches of heuristics and human experts really tuning the algorithm

01:03:55 and have some machine learning modules?

01:03:58 Or is it becoming more and more a giant machine learning system with humans just doing a little

01:04:04 bit of tweaking here and there?

01:04:06 What’s your sense?

01:04:07 First of all, do you even have a sense of what is the YouTube algorithm at this point?

01:04:11 And however much you do have a sense, what does it look like?

01:04:15 Well, we don’t usually think about it as the algorithm because it’s a bunch of systems

01:04:21 that work on different services.

01:04:24 The other thing that I think people don’t understand is that what you might refer to

01:04:29 as the YouTube algorithm from outside of YouTube is actually a bunch of code and machine learning

01:04:37 systems and heuristics, but that’s married with the behavior of all the people who come

01:04:43 to YouTube every day.

01:04:44 So the people are part of the code, essentially.

01:04:46 Exactly.

01:04:47 If there were no people who came to YouTube tomorrow, then the algorithm wouldn’t work

01:04:51 anymore.

01:04:52 Right.

01:04:53 That’s the whole part of the algorithm.

01:04:55 And so when people talk about, well, the algorithm does this, the algorithm does that, it’s sometimes

01:05:00 hard to understand, well, it could be the viewers are doing that.

01:05:04 And the algorithm is mostly just keeping track of what the viewers do and then reacting to

01:05:10 those things in sort of more fine-grained situations.

01:05:16 And I think that this is the way that the recommendation system and the search system

01:05:21 and probably many machine learning systems evolve is you start trying to solve a problem

01:05:28 and the first way to solve a problem is often with a simple heuristic.

01:05:34 And you want to say, what are the videos we’re going to recommend?

01:05:36 Well, how about the most popular ones?

01:05:39 That’s where you start.

01:05:43 And over time, you collect some data and you refine your situation so that you’re making

01:05:48 less heuristics and you’re building a system that can actually learn what to do in different

01:05:54 situations based on some observations of those situations in the past.

01:06:00 And you keep chipping away at these heuristics over time.

01:06:03 And so I think that just like with diversity, I think the first diversity measure we took

01:06:10 was, okay, not more than three videos in a row from the same channel.

01:06:15 It’s a pretty simple heuristic to encourage diversity, but it worked, right?

01:06:20 Who needs to see four, five, six videos in a row from the same channel?

01:06:25 And over time, we try to chip away at that and make it more fine-grained and basically

01:06:31 have it remove the heuristics in favor of something that can react to individuals and

01:06:39 individual situations.
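To make that kind of heuristic concrete, here is a minimal sketch in Python of a re-ranker that never lets more than three consecutive results come from the same channel. The Video type, field names, and scoring are invented for illustration; this is the shape of a simple heuristic like the one described, not YouTube's actual code.

```python
from collections import namedtuple

# Hypothetical candidate representation; field names are illustrative only.
Video = namedtuple("Video", ["video_id", "channel_id", "score"])

def tail_run_length(results, channel_id):
    """How many videos at the end of `results` come from `channel_id`."""
    run = 0
    for video in reversed(results):
        if video.channel_id != channel_id:
            break
        run += 1
    return run

def enforce_channel_diversity(ranked, max_run=3):
    """Keep the score order, but never allow more than `max_run`
    consecutive recommendations from the same channel."""
    results, demoted = [], []
    for video in ranked:
        if tail_run_length(results, video.channel_id) >= max_run:
            demoted.append(video)   # would extend the run; push it back
        else:
            results.append(video)
    # Leftovers go to the end; a real system would re-merge them more carefully.
    return results + demoted
```

Replacing a blanket rule like this with something that reacts to individual viewers is exactly the chipping away at heuristics he describes.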

01:06:41 So, you mentioned that we know something worked.

01:06:46 How do you get a sense, when decisions are A/B tested, that this idea was a

01:06:51 good one and this one was not so good?

01:06:55 How do you measure that and across which time scale, across how many users, that kind of

01:07:00 thing?

01:07:01 Well, you mentioned the A/B experiments.

01:07:04 And so just about every single change we make to YouTube, we do it only after we’ve run

01:07:11 an A/B experiment.

01:07:13 And so in those experiments, which run from one week to months, we measure hundreds, literally

01:07:24 hundreds of different variables and measure changes with confidence intervals in all of

01:07:30 them, because we really are trying to get a sense for ultimately, does this improve

01:07:36 the experience for viewers?

01:07:38 That’s the question we’re trying to answer.

01:07:40 And an experiment is one way because we can see certain things go up and down.

01:07:45 So for instance, if we noticed in the experiment, people are dismissing videos less frequently,

01:07:52 or they’re saying that they’re more satisfied, they’re giving more videos five stars after

01:07:58 they watch them, then those would be indications that the experiment is successful, that it’s

01:08:04 improving the situation for viewers.

01:08:08 But we can also look at other things, like we might do user studies, where we invite

01:08:12 some people in and ask them, like, what do you think about this?

01:08:16 What do you think about that?

01:08:17 How do you feel about this?

01:08:19 And other various kinds of user research.

01:08:22 But ultimately, before we launch something, we’re going to want to run an experiment.

01:08:26 So we get a sense for what the impact is going to be, not just to the viewers, but also to

01:08:31 the different channels and all of that.
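The statistics behind measuring changes with confidence intervals are standard. As a hedged illustration only, with made-up numbers and a binary per-user metric such as "viewer dismissed a recommendation", the arithmetic looks roughly like this:

```python
import math

def two_proportion_ci(successes_a, n_a, successes_b, n_b, z=1.96):
    """Approximate 95% confidence interval for the difference in rates
    (treatment B minus control A) of a binary per-user metric."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    diff = p_b - p_a
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return diff - z * se, diff + z * se

# Illustrative numbers: dismissal rate 4.0% in control, 3.7% in treatment.
low, high = two_proportion_ci(4_000, 100_000, 3_700, 100_000)
print(f"change in dismissal rate: [{low:+.4f}, {high:+.4f}]")
# If the whole interval sits below zero, the drop is unlikely to be noise;
# an experiment platform repeats this kind of check for hundreds of metrics at once.
```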

01:08:36 An absurd question.

01:08:38 Nobody knows.

01:08:39 Well, actually, it’s interesting.

01:08:40 Maybe there’s an answer.

01:08:41 But if I want to make a viral video, how do I do it?

01:08:45 I don’t know how you make a viral video.

01:08:48 I know that we have in the past tried to figure out if we could detect when a video was going

01:08:55 to go viral.

01:08:57 And for those, you’d take the first and second derivatives of the view count and maybe use

01:09:03 that to do some prediction.

01:09:07 But I can’t say we ever got very good at that.
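The derivative idea is easy to sketch: difference an hourly view-count series once for velocity and again for acceleration, and flag videos whose acceleration keeps growing. The data and threshold below are invented; this only illustrates the idea he describes, not the system YouTube ran.

```python
def view_derivatives(cumulative_views):
    """First and second discrete derivatives of an hourly cumulative view count."""
    velocity = [b - a for a, b in zip(cumulative_views, cumulative_views[1:])]
    acceleration = [b - a for a, b in zip(velocity, velocity[1:])]
    return velocity, acceleration

def might_go_viral(cumulative_views, window=6):
    """Crude signal: views have been accelerating for the last `window` hours."""
    _, acceleration = view_derivatives(cumulative_views)
    recent = acceleration[-window:]
    return len(recent) == window and all(a > 0 for a in recent)

# Hypothetical upload where every hour adds more views than the hour before.
views = [10, 40, 120, 400, 1_200, 3_500, 9_000, 22_000, 52_000]
print(might_go_viral(views))  # True
```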

01:09:10 Oftentimes we look at where the traffic was coming from.

01:09:14 If a lot of the viewership is coming from something like Twitter, then maybe it has

01:09:20 a higher chance of becoming viral than if it were coming from search or something.

01:09:26 But that was just trying to detect a video that might be viral.

01:09:30 How to make one, I have no idea.

01:09:33 You get your kids to interrupt you while you’re on the news or something.

01:09:38 Absolutely.

01:09:39 But after the fact, on one individual video, sort of ahead of time predicting is a really

01:09:44 hard task.

01:09:45 But after the video went viral, in analysis, can you sometimes understand why it went viral?

01:09:53 From the perspective of YouTube broadly, first of all, is it even interesting for YouTube

01:09:58 that a particular video is viral or does that not matter for the individual, for the experience

01:10:04 of people?

01:10:05 Well, I think people expect that if a video is going viral and it’s something they would

01:10:11 be interested in, then I think they would expect YouTube to recommend it to them.

01:10:16 Right.

01:10:17 So if something’s going viral, it’s good to just let people ride the wave

01:10:21 of its virality.

01:10:22 Well, I mean, we want to meet people’s expectations in that way, of course.

01:10:27 So like I mentioned, I hung out with Derek Mueller a while ago, a couple of months back.

01:10:34 He’s actually the person who suggested I talk to you on this podcast.

01:10:37 All right.

01:10:38 Well, thank you, Derek.

01:10:40 At that time, he just recently posted an awesome science video titled, why are 96 million black

01:10:48 balls on this reservoir?

01:10:50 And in a matter of, I don’t know how long, but like a few days, he got 38 million views

01:10:55 and it’s still growing.

01:10:57 Is this something you can analyze and understand why it happened, for this video or a

01:11:03 particular video like it?

01:11:06 I mean, we can surely see where it was recommended, where it was found, who watched it and those

01:11:13 sorts of things.

01:11:14 So it’s actually, sorry to interrupt, it is the video which helped me discover who Derek

01:11:20 is.

01:11:21 I didn’t know who he was before.

01:11:22 So I remember, you know, usually I just have all of these technical, boring MIT Stanford

01:11:28 talks in my recommendations because that’s how I watch.

01:11:30 And then all of a sudden there’s this black balls and reservoir video with like an excited

01:11:35 nerd, and I’m like, why is this being recommended to me?

01:11:40 So I clicked on it and watched the whole thing and it was awesome.

01:11:44 And then a lot of people had that experience, like why was I recommended this?

01:11:48 But they all of course watched it and enjoyed it. So what’s your sense of this

01:11:52 wave of recommendation that comes with a viral video that ultimately people enjoy

01:11:58 after they click on it?

01:11:59 Well, I think it’s the system, you know, basically doing what anybody who’s recommending something

01:12:05 would do, which is you show it to some people and if they like it, you say, okay, well,

01:12:09 can I find some more people who are a little bit like them?

01:12:12 Okay, I’m going to try it with them.

01:12:14 Oh, they like it too.

01:12:15 Let me expand the circle some more, find some more people.

01:12:17 Oh, it turns out they like it too.

01:12:19 And you just keep going until you get some feedback that says that, no, now you’ve gone

01:12:23 too far.

01:12:24 These people don’t like it anymore.

01:12:25 And so I think that’s basically what happened.
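The loop he is describing (show the video to a small circle, check the feedback, widen the circle, stop when satisfaction drops) can be sketched as follows. The similar_users and satisfaction_rate functions are hypothetical stand-ins for the real audience-lookup and feedback machinery:

```python
def expand_audience(video, seed_users, similar_users, satisfaction_rate,
                    min_satisfaction=0.7, max_rounds=10):
    """Show `video` to ever-wider circles of users while feedback stays good."""
    circle = list(seed_users)          # people shown the video this round
    everyone_shown = list(seed_users)
    for _ in range(max_rounds):
        if satisfaction_rate(video, circle) < min_satisfaction:
            break                      # "these people don't like it anymore"
        # Widen the circle: find new users who resemble those already shown it.
        circle = similar_users(everyone_shown, n=2 * len(everyone_shown))
        everyone_shown.extend(circle)
    return everyone_shown
```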

01:12:28 And you asked me about how to make a video go viral or make a viral video.

01:12:35 I don’t think that if you or I decided to make a video about 96 million balls that it

01:12:41 would also go viral.

01:12:42 It’s possible that Derek made like the canonical video about those black balls in the lake.

01:12:51 He did actually.

01:12:52 Right.

01:12:53 And I don’t know whether or not just following along is the secret.

01:12:59 Yeah.

01:13:00 But it’s fascinating.

01:13:01 I mean, just like you said, the algorithm sort of expanding that circle and then figuring

01:13:04 out that more and more people did enjoy it and that sort of phase shift of just a huge

01:13:09 number of people enjoying it and the algorithm quickly, automatically, I assume, figuring

01:13:15 that out.

01:13:16 I don’t know, the dynamics and psychology of that is a beautiful thing.

01:13:20 So what do you think about the idea of clipping?

01:13:25 Too many people annoyed me into doing it; they were requesting it.

01:13:29 They said it would be very beneficial to add clips of like the coolest points and actually

01:13:36 have explicit videos,

01:13:37 like re-uploading a short clip, which is what the podcasts are doing,

01:13:44 as opposed to the timestamps I also add for the topics. Do you want the

01:13:49 clip?

01:13:50 Do you see YouTube somehow helping creators with that process or helping connect clips

01:13:54 to the original videos or is that just on a long list of amazing features to work towards?

01:14:00 Yeah.

01:14:01 I mean, it’s not something that I think we’ve done yet, but I can tell you that I think

01:14:08 clipping is great and I think it’s actually great for you as a creator.

01:14:12 And here’s the reason.

01:14:15 If you think about, I mean, let’s say the NBA is uploading videos of its games.

01:14:23 Well, people might search for Warriors versus Rockets or they might search for Steph Curry.

01:14:31 And so a highlight from the game in which Steph Curry makes an amazing shot is an opportunity

01:14:37 for someone to find a portion of that video.

01:14:41 And so I think that you never know how people are going to search for something that you’ve

01:14:48 created.

01:14:49 And so you want to, I would say you want to make clips and add titles and things like

01:14:54 that so that they can find it as easily as possible.

01:14:58 Do you have a dream of a future, perhaps a distant future when the YouTube algorithm

01:15:03 figures that out?

01:15:05 Sort of automatically detects the parts of the video that are really interesting, exciting,

01:15:12 potentially exciting for people, and sort of clips them out in this incredibly rich space.

01:15:17 Because if you take even just this conversation, we probably covered

01:15:21 30, 40 little topics and there’s a huge space of users that would find, you know, 30% of

01:15:29 those topics really interesting.

01:15:30 And that space is very different.

01:15:33 It’s something that’s beyond my ability to clip out, right?

01:15:37 But the algorithm might be able to figure all that out, sort of expand into clips.

01:15:43 Do you think about this kind of thing?

01:15:46 Do you have a hope or dream that one day the algorithm will be able to do that kind of

01:15:49 deep content analysis?

01:15:50 Well, we’ve actually had projects that attempt to achieve this, but it really does depend

01:15:57 on understanding the video well and our understanding of the video right now is quite crude.

01:16:03 And so I think it would be especially hard to do it with a conversation like this.

01:16:11 One might be able to do it with, let’s say a soccer match more easily, right?

01:16:18 You could probably find out where the goals were scored.

01:16:20 And then of course you need to figure out who it was that scored the goal, and

01:16:25 that might require a human to do some annotation.

01:16:28 But I think that trying to identify coherent topics in a transcript, like the one

01:16:35 of our conversation, is not something that we’re going to be very good at right away.

01:16:42 And I was speaking more to the general problem, actually, of being able to do both a soccer

01:16:46 match and our conversation without explicit annotation. My hope was that there

01:16:52 exists an algorithm that’s able to find exciting things in video.

01:17:00 So Google now on Google search will help you find the segment of the video that you’re

01:17:06 interested in.

01:17:07 So if you search for something like how to change the filter in my dishwasher, then if

01:17:13 there’s a long video about your dishwasher and this is the part where the person shows

01:17:17 you how to change the filter, then it will highlight that area

01:17:22 and provide a link directly to it.

01:17:24 And from your recollection, do you know if the thumbnail reflects it? Like,

01:17:29 what’s the difference between showing the full video and the shorter clip?

01:17:32 Do you know how it’s presented in search results?

01:17:34 I don’t remember how it’s presented.

01:17:36 And the other thing I would say is that right now it’s based on creator annotations.

01:17:41 Ah, got it.

01:17:43 So it’s not the thing we’re talking about.

01:17:45 But folks are working on the more automatic version.
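Those creator annotations are essentially the timestamped chapter lines many creators (this podcast included) put in video descriptions. Here is a small sketch of turning such lines into candidate key moments; the line format and output are assumptions for illustration, not YouTube's internal representation:

```python
import re

# Matches "MM:SS Topic" or "HH:MM:SS Topic" at the start of a line.
CHAPTER_LINE = re.compile(r"^(?:(\d+):)?(\d{1,2}):(\d{2})\s+(.+)$")

def parse_chapters(description):
    """Extract (start_seconds, title) pairs from a video description."""
    chapters = []
    for line in description.splitlines():
        match = CHAPTER_LINE.match(line.strip())
        if not match:
            continue
        hours, minutes, seconds, title = match.groups()
        start = int(hours or 0) * 3600 + int(minutes) * 60 + int(seconds)
        chapters.append((start, title.strip()))
    return chapters

description = """
00:00 Introduction
12:45 How to change the filter
27:30 Reassembly and testing
"""
for start, title in parse_chapters(description):
    print(f"{start:>5}s  {title}")
```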

01:17:50 It’s interesting, people might not imagine this, but a lot of our systems start by using

01:17:56 almost entirely the audience behavior.

01:18:00 And then as they get better, the refinement comes from using the content.

01:18:07 And I know there are privacy concerns, but I wish YouTube explored the space, which

01:18:15 is sort of putting a camera on the users, if they allowed it, right, to study their reactions.

01:18:21 I did a lot of emotion recognition work and so on, so to study an actual, richer signal.

01:18:27 One of the cool things is when you upload 360, like VR, video to YouTube, and I’ve done this

01:18:32 a few times, so I’ve uploaded myself, it’s a horrible idea.

01:18:37 Some people enjoyed it, but whatever.

01:18:39 The video of me giving a lecture in 360 with a 360 camera, and it’s cool because YouTube

01:18:44 allows you to then watch where people looked.

01:18:47 There’s a heat map of where, you know, the center of the VR experience was.

01:18:53 And it’s interesting because that reveals to you, like, what people looked at.

01:18:57 It’s not always what you were expecting.

01:19:00 In the case of the lecture, it’s pretty boring, it is what we were expecting, but we did a

01:19:05 few funny videos where there’s a bunch of people doing things, and everybody tracks

01:19:09 those people.

01:19:10 You know, in the beginning, they all look at the main person and they start spreading

01:19:13 around and looking at the other people.

01:19:15 It’s fascinating.

01:19:16 So that’s a really strong signal of what people found exciting in the video.
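That heat map is conceptually just a 2D histogram over viewing direction. A sketch, assuming you had per-viewer samples of (yaw, pitch) in degrees over the course of the video; the data format here is invented:

```python
import numpy as np

def gaze_heatmap(samples, yaw_bins=36, pitch_bins=18):
    """Bin (yaw, pitch) view-direction samples into a coarse heat map.

    `samples` is an iterable of (yaw_degrees, pitch_degrees) pairs with
    yaw in [-180, 180) and pitch in [-90, 90).
    """
    yaws = np.array([s[0] for s in samples])
    pitches = np.array([s[1] for s in samples])
    counts, _, _ = np.histogram2d(
        pitches, yaws,
        bins=[pitch_bins, yaw_bins],
        range=[[-90, 90], [-180, 180]],
    )
    return counts / max(counts.sum(), 1)  # normalise to a distribution

# Simulated viewers mostly looking straight ahead (yaw ~ 0, pitch ~ 0).
rng = np.random.default_rng(0)
samples = np.column_stack([rng.normal(0, 30, 5_000), rng.normal(0, 10, 5_000)])
heatmap = gaze_heatmap(samples)
print(heatmap.max())  # the hottest bin is near the center of the frame
```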

01:19:21 I don’t know how you get that from people just watching, except they tuned out at this

01:19:26 point.

01:19:27 Like, it’s hard to measure this moment was super exciting for people.

01:19:32 I don’t know how you get that signal.

01:19:34 Maybe comments? Is there a way to get that signal of, this is when their

01:19:38 eyes opened up? Like for me with the Ray Dalio video, right?

01:19:42 Like at first I was like, okay, this is another one of these like dumb it down for you videos.

01:19:48 And then you like start watching, and it’s like, okay, there’s a really crisp, clean, deep explanation

01:19:52 of how the economy works.

01:19:54 That’s where I like sat up and started watching, right?

01:19:56 That moment, is there a way to detect that moment?

01:19:59 The only way I can think of is by asking people to label it.

01:20:05 You mentioned that we’re quite far away in terms of doing video analysis, deep video

01:20:09 analysis.

01:20:11 Of course, Google, YouTube, you know, we’re quite far away from solving autonomous driving

01:20:18 problem too.

01:20:19 So it’s a…

01:20:20 I don’t know.

01:20:21 I think we’re closer to that.

01:20:22 Well, the, you know, you never know.

01:20:25 And the Wright brothers thought they were not going to fly for 50 years, just three

01:20:29 years before they flew.

01:20:30 So what are the biggest challenges would you say?

01:20:34 Is it the broad challenge of understanding video, understanding natural language, understanding

01:20:40 the challenge before the entire machine learning community or just being able to understand

01:20:45 data?

01:20:46 Is there something specific to video that’s even more challenging than natural

01:20:51 language understanding?

01:20:53 What’s your sense of what the biggest challenge is?

01:20:54 Video is just so much information.

01:20:56 And so precision becomes a real problem.

01:21:01 It’s like, you know, you’re trying to classify something and you’ve got a million classes

01:21:08 and the distinctions among them, at least from a machine learning perspective are often

01:21:17 pretty small, right?

01:21:19 Like, you know, you need to see this person’s number in order to know which player it is.

01:21:28 And there’s a lot of players or you need to see, you know, the logo on their chest in

01:21:35 order to know like which team they play for.

01:21:38 And that’s just figuring out who’s who, right?

01:21:41 And then you go further and saying, okay, well, you know, was that a goal?

01:21:45 Was it not a goal?

01:21:46 Like, is that an interesting moment as you said, or is that not an interesting moment?

01:21:51 These things can be pretty hard.

01:21:53 So okay.

01:21:54 So Yann LeCun, I’m not sure if you’re familiar sort of with his current thinking and work.

01:21:59 So he believes that what he’s referring to as self supervised learning will be the

01:22:05 solution sort of to achieving this kind of greater level of intelligence.

01:22:09 In fact, the thing he’s focusing on is watching video and predicting the next frame.

01:22:14 So predicting the future of video, right?

01:22:18 So for now we’re very far from that, but his thought is because it’s unsupervised or as

01:22:24 he refers to it, self supervised, you know, if you watch enough video, essentially if

01:22:29 you watch YouTube, you’ll be able to learn about the nature of reality, the physics,

01:22:34 the common sense reasoning required by just teaching a system to predict the next frame.

01:22:40 So he’s confident this is the way to go.

01:22:42 So for you, from the perspective of just working with this video, how do you think an algorithm

01:22:50 that just watches all of YouTube, stays up all day and night watching YouTube would be

01:22:55 able to understand enough of the physics of the world about the way this world works,

01:23:02 be able to do common sense reasoning and so on?

01:23:05 Well, I mean, we have systems that already watch all the videos on YouTube, right?

01:23:10 But they’re just looking for very specific things, right?

01:23:13 They’re supervised learning systems that are trying to identify something or classify something.

01:23:22 And I don’t know if predicting the next frame is really going to get there.

01:23:25 I’m not an expert on compression algorithms, but I understand that that’s kind of what

01:23:32 video compression algorithms do: they basically try to predict the next

01:23:37 frame and then fix up the places where they got it wrong.

01:23:41 And that leads to higher compression than if you actually put all the bits for the next

01:23:46 frame there.

01:23:48 So I don’t know if I believe that just being able to predict the next frame is going to

01:23:53 be enough because there’s so many frames and even a tiny bit of error on a per frame basis

01:24:00 can lead to wildly different videos.
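His compression point can be made concrete with the simplest possible predictor: assume the next frame equals the previous one and store only the pixels that prediction got wrong. A toy sketch (nothing like a real codec) of why a good predictor shrinks what has to be stored, and why every pixel it gets wrong still has to be paid for:

```python
import numpy as np

def pixels_to_store(frames):
    """Compare storing every pixel of every frame with storing the first
    frame plus, for each later frame, only the pixels the
    'previous frame' predictor got wrong."""
    height, width = frames[0].shape
    store_everything = len(frames) * height * width
    store_residuals = height * width            # first frame in full
    for prev, curr in zip(frames, frames[1:]):
        store_residuals += int(np.count_nonzero(curr != prev))
    return store_everything, store_residuals

# A mostly static 64x64 scene where a 4x4 bright patch drifts one pixel per frame.
frames = []
for t in range(10):
    frame = np.zeros((64, 64), dtype=np.uint8)
    frame[t:t + 4, t:t + 4] = 255
    frames.append(frame)

raw, predicted = pixels_to_store(frames)
print(raw, predicted)  # 40960 vs. about 4200: the predictor carries most of the load
```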

01:24:02 So the thing is, one way to do compression is to describe through

01:24:08 text what’s contained in the video.

01:24:10 That’s the ultimate high level of compression.

01:24:12 So the idea is traditionally when you think of video image compression, you’re trying

01:24:16 to maintain the same visual quality while reducing the size.

01:24:22 But if you think of deep learning from a bigger perspective of what compression is, is you’re

01:24:27 trying to summarize the video.

01:24:29 And the idea there is if you have a big enough neural network, just by

01:24:35 trying to predict the next frame, you’ll be able to form a compression of actually understanding

01:24:40 what’s going on in the scene.

01:24:42 If there’s two people talking, you can just reduce that entire video into the fact that

01:24:47 two people are talking and maybe the content of what they’re saying and so on.

01:24:51 That’s kind of the open ended dream.

01:24:55 So I just wanted to sort of express that because it’s interesting, compelling notion, but it

01:25:01 is nevertheless true that video, our world, is a lot more complicated than we give it credit

01:25:07 for.

01:25:08 I mean, in terms of search and discovery, we have been working on trying to summarize

01:25:12 videos in text or with some kind of labels for eight years at least.

01:25:20 And, you know, we’re kind of so-so.

01:25:25 So if you were to say the problem is a hundred percent solved and eight years ago was zero

01:25:31 percent solved, where are we on that timeline would you say?

01:25:37 Yeah.

01:25:38 To summarize a video well, maybe less than a quarter of the way.

01:25:44 So on that topic, what does YouTube look like 10, 20, 30 years from now?

01:25:50 I mean, I think that YouTube is evolving to take the place of TV.

01:25:58 I grew up as a kid in the seventies and I watched a tremendous amount of television

01:26:03 and I feel sorry for my poor mom because people told her at the time that it was going to

01:26:09 rot my brain and that she should kill her television.

01:26:14 But anyway, I mean, I think that YouTube is at least for my family, a better version of

01:26:21 television, right?

01:26:22 It’s one that is on demand.

01:26:24 It’s more tailored to the things that my kids want to watch.

01:26:28 And also they can find things that they would never have found on television.

01:26:34 And so I think that at least from just observing my own family, that’s where we’re headed is

01:26:40 that people watch YouTube kind of in the same way that I watched television when I was younger.

01:26:46 So from a search and discovery perspective, what do you, what are you excited about in

01:26:51 the five, 10, 20, 30 years?

01:26:54 Like what kind of things?

01:26:55 It’s already really good.

01:26:56 I think it’s achieved a lot, though of course we don’t know what’s possible.

01:27:01 So it’s the task of search of typing in the text or discovering new videos by the next

01:27:08 recommendation.

01:27:09 So I personally am really happy with the experience.

01:27:12 I rarely watch a video that’s not awesome, from my own perspective. But

01:27:18 what else is possible?

01:27:19 What are you excited about?

01:27:21 Well, I think introducing people to more of what’s available on YouTube is not only very

01:27:28 important to YouTube and to creators, but I think it will help enrich people’s lives

01:27:34 because there’s a lot that I’m still finding out is available on YouTube that I didn’t

01:27:38 even know.

01:27:39 I’ve been working at YouTube for eight years and it wasn’t until last year that I learned

01:27:46 that I could watch USC football games from the 1970s.

01:27:51 Like I didn’t even know that was possible until last year and I’ve been working here

01:27:55 quite some time.

01:27:56 So, you know, what was broken about that?

01:27:58 That it took me seven years to learn that this stuff was already on YouTube even when

01:28:03 I got here.

01:28:04 So I think there’s a big opportunity there.

01:28:07 And then as I said before, you know, we want to make sure that YouTube finds a way to ensure

01:28:16 that it’s acting responsibly with respect to society and enriching people’s lives.

01:28:23 So we want to take all of the great things that it does and make sure that we are eliminating

01:28:28 the negative consequences that might happen.

01:28:31 And then lastly, if we could get to a point where all the videos people watch are the

01:28:37 best ones they’ve ever watched, that’d be outstanding too.

01:28:40 Do you see it, in many senses, becoming a window into the world for people?

01:28:45 Especially with live video, you get to watch events.

01:28:49 I mean, the way you experience a lot of the world that’s out there is better

01:28:54 than TV in many, many ways.

01:28:56 So do you see it becoming more than just video?

01:29:00 Do you see creators creating visual experiences and virtual worlds? I’m talking

01:29:06 crazy now, but sort of virtual reality, and entering that space, or is that at least for

01:29:11 now totally outside what YouTube is thinking about?

01:29:14 I mean, I think Google is thinking about virtual reality.

01:29:18 I don’t think about virtual reality too much.

01:29:22 I know that we would want to make sure that YouTube is there when virtual reality becomes

01:29:28 something or if virtual reality becomes something that a lot of people are interested in.

01:29:34 But I haven’t seen it really take off yet.

01:29:38 Take off.

01:29:39 Well, the future is wide open.

01:29:41 Cristos, I’ve been really looking forward to this conversation.

01:29:43 It’s been a huge honor.

01:29:45 Thank you for answering some of the more difficult questions I’ve asked.

01:29:48 I’m really excited about what YouTube has in store for us.

01:29:52 It’s one of the greatest products I’ve ever used, and it continues to be.

01:29:54 So thank you so much for talking to me.

01:29:56 It’s my pleasure.

01:29:57 Thanks for asking me.

01:29:58 Thanks for listening to this conversation.

01:30:01 And thank you to our presenting sponsor, Cash App.

01:30:04 Download it.

01:30:05 Use code LexPodcast.

01:30:07 You’ll get $10 and $10 will go to FIRST, a STEM education nonprofit that inspires hundreds

01:30:12 of thousands of young minds to become future leaders and innovators.

01:30:17 If you enjoy this podcast, subscribe on YouTube, give it five stars on Apple Podcast, follow

01:30:22 on Spotify, support on Patreon, or simply connect with me on Twitter.

01:30:27 And now, let me leave you with some words of wisdom from Marcel Proust.

01:30:32 The real voyage of discovery consists not in seeking new landscapes, but in having new

01:30:37 eyes.

01:30:40 Thank you for listening and hope to see you next time.