Transcript
00:00:00 The following is a conversation with Kevin Systrom,
00:00:02 co-founder and longtime CEO of Instagram,
00:00:06 including for six years after Facebook’s acquisition
00:00:09 of Instagram.
00:00:10 This is the Lex Fridman podcast.
00:00:13 To support it, please check out our sponsors
00:00:15 in the description.
00:00:16 And now, here’s my conversation with Kevin Systrom.
00:00:21 At the risk of asking the Rolling Stones
00:00:24 to play Satisfaction, let me ask you about
00:00:26 the origin story of Instagram.
00:00:28 Sure. So maybe some context.
00:00:30 You, like we were talking about offline,
00:00:32 grew up in Massachusetts, learned computer programming there,
00:00:36 liked to play Doom II, worked at a vinyl record store.
00:00:40 Then you went to Stanford, turned down Mr. Mark Zuckerberg
00:00:46 and Facebook, went to Florence to study photography.
00:00:49 Those are just some random, beautiful,
00:00:51 impossibly brief glimpses into a life.
00:00:54 So let me ask again, can you take me through
00:00:56 the origin story of Instagram, given that context?
00:00:59 You basically set it up.
00:01:01 All right, so we have a fair amount of time,
00:01:04 so I’ll go into some detail.
00:01:05 But basically what I’ll say is,
00:01:09 Instagram started out of a company actually called Burbn,
00:01:13 pronounced like bourbon, but spelled B-U-R-B-N.
00:01:16 And a couple of things were happening at the time.
00:01:19 So if we zoom back to 2010, not a lot of people remember
00:01:22 what was happening in the dot com world then,
00:01:25 but check in apps were all the rage.
00:01:29 So.
00:01:30 What’s a check in app?
00:01:31 Gowalla, Foursquare, Hot Potato.
00:01:34 So I’m at a place, I’m gonna tell the world
00:01:36 that I’m at this place.
00:01:37 That’s right.
00:01:38 What’s the idea behind this kind of app, by the way?
00:01:41 You know what, I’m gonna answer that,
00:01:42 but through what Instagram became
00:01:45 and why I believe Instagram replaced them.
00:01:48 So the whole idea was to share with the world
00:01:49 what you were doing, specifically with your friends, right?
00:01:53 But they were all the rage,
00:01:54 and Foursquare was getting all the press.
00:01:56 And I remember sitting around saying,
00:01:57 hey, I wanna build something,
00:01:58 but I don’t know what I wanna build.
00:02:00 What if I built a better version of Foursquare?
00:02:04 And I asked myself, well, why don’t I like Foursquare
00:02:07 or how could it be improved?
00:02:10 And basically I sat down and I said,
00:02:13 I think that if you have a few extra features,
00:02:16 it might be enough.
00:02:17 One of which happened to be posting a photo
00:02:19 of where you were.
00:02:20 There were some others.
00:02:21 It turns out that wasn’t enough.
00:02:23 My co-founder joined, we were going to attack Foursquare
00:02:27 and the likes and try to build something interesting.
00:02:30 And no one used it.
00:02:31 No one cared because it wasn’t enough.
00:02:32 It wasn’t different enough, right?
00:02:35 So one day we were sitting down and we asked ourselves,
00:02:38 okay, it's a come-to-Jesus moment.
00:02:40 Are we gonna do this startup?
00:02:43 And if we’re going to,
00:02:44 we can’t do what we’re currently doing.
00:02:46 We have to switch it up.
00:02:47 So what do people love the most?
00:02:48 So we sat down and we wrote out three things
00:02:51 that we thought people uniquely loved about our product
00:02:54 that weren’t in other products.
00:02:57 Photos happened to be the top one.
00:02:59 So sharing a photo of what you were doing,
00:03:01 where you were at the moment
00:03:03 was not something products let you do really.
00:03:06 Facebook was like, post an album of your vacation
00:03:09 from two weeks ago, right?
00:03:11 Twitter allowed you to post a photo,
00:03:13 but their feed was primarily text
00:03:15 and they didn’t show the photo in line
00:03:17 or at least I don’t think they did at the time.
00:03:19 So even though it seems totally stupid
00:03:23 and obvious to us now, at the moment then,
00:03:27 posting a photo of what you were doing at the moment
00:03:29 was like not a thing.
00:03:32 So we decided to go after that
00:03:34 because we noticed that people who used our service,
00:03:36 the one thing they happened to like the most
00:03:38 was posting a photo.
00:03:40 So that was the beginning of Instagram.
00:03:41 And yes, like we went through and we added filters
00:03:44 and there’s a bunch of stories around that.
00:03:46 But the origin of this was that
00:03:48 we were trying to be a check in app,
00:03:49 realized that no one wanted another check in app.
00:03:52 It became a photo sharing app,
00:03:54 but one that was much more about what you’re doing
00:03:56 and where you are.
00:03:57 And that’s why when I say,
00:03:58 I think we’ve replaced check in apps,
00:04:01 it became a check in via a photo
00:04:03 rather than saying your location
00:04:06 and then optionally adding a photo.
00:04:08 When you were thinking about what people like,
00:04:11 from where did you get a sense
00:04:13 that this is what people like?
00:04:14 You said, we sat down, we wrote some stuff down on paper.
00:04:18 Where is that intuition that seems fundamental
00:04:21 to the success of an app like Instagram?
00:04:26 Where does that idea,
00:04:27 where does that list of three things come from exactly?
00:04:31 Only after having studied machine learning now
00:04:33 for a couple of years, I like, I have a…
00:04:36 You have understood yourself?
00:04:39 I’ve started to make connections,
00:04:40 like we can go into this later,
00:04:43 but obviously the connections between machine learning
00:04:50 and the human brain, I think are stretched sometimes.
00:04:53 At the same time, being able to back prop
00:04:57 and being able to look at the world, try something,
00:05:01 figure out how you’re wrong, how wrong you are,
00:05:04 and then nudge your company in the right direction
00:05:08 based on how wrong you are,
00:05:10 is a fascinating concept.
00:05:12 And I don’t, we didn’t know we were doing it at the time,
00:05:14 but that’s basically what we were doing, right?
00:05:17 We put it out to, call it a hundred people,
00:05:20 and you would look at their data.
00:05:22 You would say, what are they sharing?
00:05:25 Like what resonates, what doesn’t resonate?
00:05:26 We think they’re gonna resonate with X,
00:05:29 but it turns out they resonate with Y.
00:05:31 Okay, shift the company towards Y.
00:05:33 And it turns out if you do that enough quickly enough,
00:05:36 you can get to a solution that has product market fit.
00:05:39 Most companies fail because they sit there
00:05:42 and they don't, either their learning rate's too slow,
00:05:45 they sit there and they’re just,
00:05:46 they’re adamant that they’re right,
00:05:47 even though the data is telling them they’re not right,
00:05:50 or their learning rate's too high
00:05:53 and they wildly chase different ideas
00:05:55 and they never actually settle on one
00:05:57 and find a groove, right?
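As an aside, the learning-rate analogy maps directly onto a few lines of gradient descent. A minimal illustrative sketch in Python, with a made-up quadratic objective and made-up step sizes (nothing here comes from the conversation itself):

```python
# Toy gradient descent on f(x) = x^2, showing the two failure modes described:
# a step size so small you barely move, and one so large you never settle.
def descend(lr, steps=20, x=10.0):
    for _ in range(steps):
        grad = 2 * x          # gradient of f(x) = x^2
        x = x - lr * grad     # nudge in the direction the data points
    return x

print(descend(lr=0.01))   # ~6.7: crawling, still far from the minimum (too stubborn)
print(descend(lr=0.3))    # ~0.0: converges, settles into a groove
print(descend(lr=1.1))    # ~383: overshoots and diverges (wildly chasing ideas)
```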
00:05:59 And I think when we sat down
00:06:00 and we wrote out those three ideas,
00:06:01 what we were saying is, what are the three possible,
00:06:05 whether they’re local or global maxima in our world, right?
00:06:10 That users are telling us they like
00:06:12 because they’re using the product that way.
00:06:14 It was clear people liked the photos
00:06:16 because that was the thing they were doing.
00:06:18 And we just said, okay, like,
00:06:20 what if we just cut out most of the other stuff
00:06:22 and focus on that thing?
00:06:25 And then it happened to be a multi billion dollar business
00:06:27 and it’s that easy by the way.
00:06:30 Yeah, I guess so.
00:06:32 Well, nobody ever writes about neural networks
00:06:34 that miserably failed.
00:06:36 So this particular neural network succeeded.
00:06:38 Oh, they fail all the time, right?
00:06:40 Yeah.
00:06:41 But nobody writes about it.
00:06:41 The default state is failing.
00:06:42 Yes.
00:06:44 When you said the way people are using the app,
00:06:48 is that the loss function for this neural network
00:06:51 or is it also self report?
00:06:52 Like, do you ever ask people what they like
00:06:54 or do you have to track exactly what they’re doing,
00:06:58 not what they’re saying?
00:07:00 I once made a Thanksgiving dinner, okay?
00:07:03 And it was for relatives and I like to cook a lot.
00:07:08 Okay.
00:07:09 And I worked really hard on picking the specific dishes
00:07:13 and I was really proud because I had planned it out
00:07:17 using a Gantt chart and like it was ready on time
00:07:19 and everything was hot.
00:07:21 Nice.
00:07:22 Like, I don’t know if you’re a big Thanksgiving guy,
00:07:23 but like the worst thing about Thanksgiving
00:07:25 is when the turkey is cold and some things are hot
00:07:28 and some things, anyway.
00:07:29 You had a Gantt chart.
00:07:30 Did you actually have a chart?
00:07:31 Oh yeah.
00:07:32 OmniPlan, fairly expensive, like Gantt chart thing
00:07:36 that I think maybe 10 people have purchased in the world,
00:07:39 but I’m one of them and I use it for recipe planning
00:07:42 only around big holidays.
00:07:44 That’s brilliant, by the way.
00:07:45 Do people do this kind of…
00:07:47 Overengineering?
00:07:48 It’s not overengineering, it’s just engineering.
00:07:50 It’s planning.
00:07:51 Thanksgiving is a complicated set of events
00:07:54 with some uncertainty with a lot of things going on.
00:07:57 You should be able, you should be planning it in this way.
00:07:59 There should be a chart.
00:08:00 It’s not overengineering.
00:08:01 I mean, so what’s funny is, brief aside.
00:08:04 Yes, it’s brilliant.
00:08:06 I love cooking, I love food, I love coffee,
00:08:08 and I’ve spent some time with some chefs
00:08:10 who like know their stuff.
00:08:12 And they always just take out a piece of paper
00:08:15 and just work backwards in rough order.
00:08:17 Like it’s never perfect, but rough order.
00:08:20 It’s just like, oh, that makes sense.
00:08:21 Why not just work backwards from the end goal, right?
00:08:24 And put in some buffer time.
00:08:26 And so I probably over specified it a bit using a Gantt chart,
00:08:29 but the fact that you can do it,
00:08:32 it’s what professional kitchens roughly do.
00:08:35 They just don’t call it a Gantt chart,
00:08:36 or at least I don’t think they do.
00:08:38 Anyway, I was telling a story about Thanksgiving.
00:08:40 So here’s the thing.
00:08:42 I’m sitting down, we have the meal,
00:08:44 and then I got to know Ray Dalio fairly well
00:08:49 over maybe the last year of Instagram.
00:08:53 And one thing that he kept saying was like,
00:08:55 feedback is really hard to get honestly from people.
00:08:59 And I sat down after dinner, I said,
00:09:02 guys, I want feedback.
00:09:03 What was good and what was bad?
00:09:05 Yes.
00:09:06 And what’s funny is like,
00:09:07 literally everyone just said everything was great.
00:09:11 And I like personally knew I had screwed up
00:09:13 a handful of things, but no one would say it.
00:09:17 And can you imagine now not something as high stakes
00:09:20 as Thanksgiving dinner, okay?
00:09:22 Thanksgiving dinner, it’s not that high stakes.
00:09:25 But you’re trying to build a product
00:09:26 and everyone knows you left your job for it
00:09:28 and you’re trying to build it out
00:09:29 and you’re trying to make something wonderful
00:09:32 and it’s yours, right?
00:09:33 You designed it.
00:09:35 Now try asking for feedback
00:09:37 and know that you’re giving this to your friends
00:09:39 and your family.
00:09:42 People have trouble giving hard feedback.
00:09:45 People have trouble saying, I don’t like this
00:09:48 or this isn’t great, or this is how it’s failed me.
00:09:51 In fact, you usually have two classes of people.
00:09:56 People who just won’t say bad things,
00:09:58 you can literally say to them,
00:10:00 please tell me what you hate most about this
00:10:02 and they won’t do it.
00:10:03 They’ll try, but they won’t.
00:10:05 And then the other class of people
00:10:06 are just negative period about everything
00:10:09 and it’s hard to parse out like what is true and what isn’t.
00:10:14 So my rule of thumb with this is
00:10:17 you should always ask people,
00:10:20 but at the end of the day, it’s amazing what data
00:10:22 will tell you.
00:10:23 And that’s why with whatever project I work on, even now,
00:10:27 collecting data from the beginning on usage patterns,
00:10:30 so engagement, how many days of the week do they use it?
00:10:34 How many, I don’t know if we were to go back to Instagram,
00:10:36 how many impressions per day, right?
00:10:39 Is that growing?
00:10:40 Is that shrinking?
00:10:41 And don’t be like overly scientific about it, right?
00:10:44 Cause maybe you have 50 beta users or something.
00:10:48 But what’s fascinating is that data doesn’t lie.
00:10:52 People are very defensive about their time.
00:10:57 They’ll say, oh, I’m so busy,
00:10:59 I’m sorry I didn’t get to use the app.
00:11:00 Like I’m just, you know,
00:11:03 but I don’t know you were posting on Instagram
00:11:05 the whole time.
00:11:07 So I don’t know at the end of the day,
00:11:09 like at Facebook there was, you know,
00:11:11 before time spent became kind of this loaded term there.
00:11:15 The idea that people’s currency in their lives is time.
00:11:21 And they only have a certain amount of time to give things,
00:11:23 whether it’s friends or family or apps or TV shows
00:11:26 or whatever, there’s no way of inventing more of it,
00:11:28 at least not that I know of.
00:11:32 If they don’t use it, it’s because it’s not great.
00:11:35 So the moral of the story is you can ask all you want,
00:11:39 but you just have to look at the data.
00:11:41 And data doesn’t lie, right?
00:11:45 I mean, there’s metrics, there’s data can obscure
00:11:51 the key insight if you’re not careful.
00:11:54 So time spent in the app, that’s one.
00:11:57 There’s so many metrics you can put at this
00:11:59 and they will give you totally different insights,
00:12:02 especially when you’re trying to create something
00:12:04 that doesn’t obviously exist yet.
00:12:07 So, you know, measuring maybe why you left the app
00:12:12 or measuring special moments of happiness
00:12:17 that will make sure you return to the app
00:12:20 or moments of happiness that are long lasting
00:12:22 versus like dopamine short term, all of those things.
00:12:26 But I think I suppose in the beginning,
00:12:29 you can just get away with just asking the question,
00:12:33 which features are used a lot?
00:12:35 Let’s do more of that.
00:12:37 And how hard was the decision?
00:12:40 And I mean, maybe you can tell me what Instagram
00:12:43 looked like in the beginning,
00:12:44 but how hard was it to make pictures
00:12:47 the first-class citizen?
00:12:48 That’s a revolutionary idea.
00:12:50 Like at whatever point Instagram became this feed of photos,
00:12:56 that’s quite brilliant.
00:12:58 Plus, I also don’t know when this happened,
00:13:02 but they’re all shaped the same.
00:13:05 It’s like a…
00:13:06 I have to tell you why, that’s the interesting part.
00:13:09 Why is that?
00:13:10 So a couple of things.
00:13:11 One is data, like you’re right.
00:13:16 You can overinterpret data.
00:13:18 Like imagine trying to fly a plane by staring at,
00:13:22 I don’t know, a single metric like airspeed.
00:13:25 You don’t know if you’re going up or down.
00:13:27 I mean, it correlates with up or down,
00:13:28 but you don’t actually know.
00:13:29 It will never help you land the plane.
00:13:32 So don’t stare at one metric.
00:13:33 Like it turns out you have to synthesize a bunch of metrics
00:13:36 to know where to go.
00:13:37 But it doesn’t lie.
00:13:40 Like if your airspeed is zero,
00:13:41 unless it’s not working, right?
00:13:43 If it’s zero, you’re probably gonna fall out of the sky.
00:13:46 So generally you look around and you have the scan going.
00:13:50 Yes.
00:13:51 And you’re just asking yourself,
00:13:52 is this working or is this not working?
00:13:56 But people have trouble explaining how they actually feel.
00:14:02 So just, it’s about synthesizing both of them.
00:14:05 So then Instagram, right?
00:14:07 We were talking about revolutionary moment
00:14:10 where the feed became square photos basically.
00:14:14 And photos first and then square photos.
00:14:16 Yeah.
00:14:19 It was clear to me that the biggest,
00:14:21 so I believe the biggest companies are founded
00:14:26 when enormous technical shifts happen.
00:14:30 And the biggest technical shift that happened
00:14:32 right before Instagram was founded
00:14:34 was the advent of a phone that didn’t suck.
00:14:38 The iPhone, right?
00:14:38 Like in retrospect, we’re like, oh my God,
00:14:40 the first iPhone that almost had,
00:14:43 like it wasn’t that good.
00:14:44 But compared to everything else at the time,
00:14:46 it was amazing.
00:14:48 And by the way,
00:14:50 the first phone that had an incredible camera
00:14:54 that could like do as well as the point and shoot
00:14:57 you might carry around was the iPhone 4.
00:15:01 And that was right when Instagram launched.
00:15:02 And we looked around and we said,
00:15:04 what will change?
00:15:05 Because everyone has a camera in their pocket.
00:15:08 And it was so clear to me that the world
00:15:11 of social networks before it was based in the desktop
00:15:16 and sitting there and having a link you could share, right?
00:15:20 And that wasn’t gonna be the case.
00:15:21 So the question is what would you share
00:15:23 if you were out and about in the world?
00:15:26 If not only did you have a camera that fit in your pocket,
00:15:29 but by the way, that camera had a network attached to it
00:15:31 that allowed you to share instantly.
00:15:34 That seemed revolutionary.
00:15:36 And a bunch of people saw it at the same time.
00:15:37 It wasn’t just Instagram.
00:15:38 There were a bunch of competitors.
00:15:41 The thing we did, I think was not only,
00:15:44 well, we focused on two things.
00:15:45 So we wrote down those things, we circled photos
00:15:47 and we said, I think we should invest in this.
00:15:50 But then we said, what sucks about photos?
00:15:52 One, they look like crap, right?
00:15:55 They just, at least back then.
00:15:57 Now my phone takes pretty great photos, right?
00:16:01 Back then they were blurry, not so great, compressed, right?
00:16:06 Two, it was really slow, like really slow to upload a photo.
00:16:12 And I’ll tell a fun story about that
00:16:13 and explain to you why they’re all the same size
00:16:15 and square as well.
00:16:18 And three, man, if you wanted to share a photo
00:16:21 on different networks, you had to go to each
00:16:24 of the individual apps and select all of them
00:16:26 and upload individually.
00:16:27 And so we were like, all right, those are the pain points.
00:16:30 We’re gonna focus on that.
00:16:31 So one, instead of, because they weren’t beautiful,
00:16:36 we were like, why don’t we lean into the fact
00:16:37 that they’re not beautiful?
00:16:38 And I remember studying in Florence,
00:16:40 my photography teacher gave me this Holga camera
00:16:42 and I’m not sure everyone knows what a Holga camera is,
00:16:44 but they’re these old school plastic cameras.
00:16:48 I think they’re produced in China at the time.
00:16:52 And I wanna say the original ones were like
00:16:54 from the 70s or the 80s or something.
00:16:55 They're supposed to be like $3 cameras for the everyday person.
00:16:58 They took nice medium format films, large negatives,
00:17:05 but they kind of blurred the light
00:17:07 and they kind of like light leaked into the side.
00:17:10 And there was this whole resurgence where people looked
00:17:13 at that and said, oh my God, this is a style, right?
00:17:16 And I remember using that in Florence and just saying,
00:17:19 well, why don’t we just like lean into the fact
00:17:20 that these photos suck and make them suck more,
00:17:24 but in an artistic way.
00:17:26 And it turns out that had product market fit.
00:17:28 People really liked that.
00:17:29 They were willing to share their not so great photos
00:17:31 if they looked not so great on purpose, okay?
00:17:35 The second part.
00:17:37 That’s where the filters come into picture.
00:17:40 So computational modification of photos
00:17:42 to make them look extra crappy to where it becomes art.
00:17:46 Yeah, yeah.
00:17:47 And I mean, add light leaks, add like an overlay filter,
00:17:51 make them more contrasty than they should be.
00:17:54 The first filter we ever produced was called X-Pro II.
00:17:58 And I designed it while I was
00:17:59 in this small little bed and breakfast room
00:18:02 in Todos Santos, Mexico.
00:18:03 I was trying to take a break from the Burbn days.
00:18:06 And I remember saying to my co-founder,
00:18:08 I just need like a week to reset.
00:18:11 And it was on that trip that I worked on the first filter
00:18:15 because I said, you know, I think I can do this.
00:18:17 And I literally iterated one by one over the RGB values
00:18:22 in the array that was the photo and just slightly shifted.
00:18:26 Basically there was a function of R, function of G,
00:18:29 function of B that just shifted them slightly.
00:18:32 It wasn't rocket science.
00:18:34 And it turns out that actually
00:18:36 made your photo look pretty cool.
00:18:38 It just mapped from one color space to another color space.
00:18:42 It was simple, but it was really slow.
00:18:44 I mean, if you applied a filter,
00:18:47 I think it used to take two or three seconds to render.
00:18:50 Only eventually would I figure out how to do it on the GPU.
00:18:53 And I’m not even sure it was a GPU, but it was using OpenGL.
00:18:56 But anyway, I would eventually figure that out
00:18:59 and then it would be instant, but it used to be really slow.
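For readers who want to picture what "iterate one by one over the RGB values" looks like, here is a minimal sketch in Python with Pillow. The curve numbers are invented for illustration and are not the actual X-Pro II mapping, and the file names are placeholders.

```python
from PIL import Image

def toy_filter(path_in, path_out):
    # Naive per-pixel version, in the spirit of "a function of R, a function of G,
    # a function of B that just shifted them slightly." Slow, like the original.
    img = Image.open(path_in).convert("RGB")
    px = img.load()
    w, h = img.size
    for y in range(h):
        for x in range(w):
            r, g, b = px[x, y]
            r = min(255, int(r * 1.15 + 10))   # warm the reds a bit
            g = min(255, int(g * 1.05))        # leave greens mostly alone
            b = min(255, int(b * 0.85))        # pull the blues down for a contrasty feel
            px[x, y] = (r, g, b)
    img.save(path_out)

toy_filter("photo.jpg", "photo_filtered.jpg")
```

The fast path for the same idea is three 256-entry lookup tables, or the equivalent shader on the GPU, which is essentially what moving the work to OpenGL buys you.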
00:19:02 By the way, anyone who’s watching or listening,
00:19:06 it’s amazing what you can get away with in a startup
00:19:10 as long as the product outcome is right for the user.
00:19:14 Like you can be slow.
00:19:15 You can be terrible.
00:19:16 You can be, as long as you have product market fit,
00:19:21 people will put up with a lot.
00:19:22 And then the question is just about compressing,
00:19:25 making it more performant over time
00:19:27 so that they get that product market fit instantly.
00:19:30 So fascinating because there’s some things
00:19:32 where those three seconds would make or break the app,
00:19:37 but some things you’re saying not.
00:19:39 It’s hard to know when, it’s the problem Spotify solved
00:19:43 for making streaming like work.
00:19:48 And like delays in listening to music is a huge negative,
00:19:53 even like slight delays.
00:19:55 But here you’re saying, I mean,
00:19:57 how do you know when those three seconds are okay
00:20:00 or are you just gonna have to try it out?
00:20:03 Because to me, my intuition would be
00:20:07 those three seconds would kill the app.
00:20:09 Like I would try to do the OpenGL thing.
00:20:12 Right, so I wish I were that smart at the time.
00:20:17 I wasn’t, I just knew how to do what I knew how to do.
00:20:21 And I decided, okay, like,
00:20:23 why don’t I just iterate over the values and change them?
00:20:27 And what’s interesting is that
00:20:31 compared to the alternatives, no one else used OpenGL.
00:20:35 So everyone else was doing it the dumb way.
00:20:37 And in fact, they were doing it at a high resolution.
00:20:40 Here's where the small resolution comes in,
00:20:42 which we'll talk about in a second.
00:20:45 By choosing 512 pixels by 512 pixels,
00:20:48 which I believe it was at the time,
00:20:51 we iterated over a lot fewer pixels than our competitors
00:20:54 who were trying to do these enormous output like images.
00:20:58 So instead of taking 20 seconds,
00:21:00 I mean, three seconds feels pretty good, right?
00:21:03 So on a relative basis, we were winning like a lot.
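Rough back-of-the-envelope for why that choice mattered: 512 × 512 is 262,144 pixels, while a full-resolution iPhone 4 photo (2592 × 1936) is about 5 million, so the small canvas meant touching roughly 19 times fewer values per filter pass.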
00:21:06 Okay, so that’s answer number one.
00:21:08 Answer number two is we actually focused
00:21:11 on latency in the right places.
00:21:13 So we did this really wonderful thing when you uploaded.
00:21:17 So the way it would work is you’d take your phone,
00:21:20 you’d take the photo and then you’d go to the,
00:21:25 you’d go to the edit screen where you would caption it.
00:21:28 And on that caption screen, you’d start typing
00:21:31 and you’d think, okay, like what’s a clever caption?
00:21:34 And I said to Mike, hey, when I worked on the Gmail team,
00:21:36 you know what they did?
00:21:37 When you typed in your username or your email address,
00:21:40 even before you’ve entered in your password,
00:21:43 like the probability once you enter in your username
00:21:47 that you’re going to actually sign in is extremely high.
00:21:51 So why not just start loading your account
00:21:52 in the background?
00:21:53 Not like sending it down to the desktop,
00:21:55 that would be a security issue,
00:21:59 but like loaded into memory on the server,
00:22:01 like get it ready, prepare it.
00:22:03 I always thought that was so fascinating and unintuitive.
00:22:06 And I was like, Mike, why don’t we just do that?
00:22:08 But like, we’ll just upload the photo
00:22:10 and like assume you’re going to upload the photo.
00:22:14 And if you don’t, forget about it, we’ll delete it, right?
00:22:17 So what ended up happening was people would caption
00:22:20 their photo, they’d press done or upload
00:22:24 and you’d see this little progress bar just go foop.
00:22:27 It was lightning fast, okay?
00:22:30 We were no faster than anyone else at the time,
00:22:32 but by choosing 512 by 512 and doing it in the background,
00:22:37 it almost guaranteed that it was done
00:22:39 by the time you captioned.
00:22:41 And everyone when they used it was like,
00:22:43 how the hell is this thing so fast?
00:22:47 But we were slow, we just hid the slowness.
00:22:49 It wasn’t like, these things are just like,
00:22:51 it’s a shell game, you’re just hiding the latency.
00:22:55 That mattered to people like a lot.
00:22:58 And I think that, so you were willing to put up
00:23:01 with a slow filter if it meant
00:23:03 you could share it immediately.
00:23:04 And of course we added sharing options
00:23:06 which let you distribute it really quickly,
00:23:08 that was the third part.
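A minimal sketch of that latency-hiding pattern, in Python with threads. The function names are hypothetical and the upload is simulated with a sleep; the point is purely the ordering: start the upload when the caption screen opens, so the "share" tap only has to wait for whatever is left.

```python
import threading
import time

def upload_photo(image_bytes):
    # Stand-in for the real network call; simulate a slow upload.
    time.sleep(2.0)

def share_flow(image_bytes):
    # Kick off the upload the moment the caption screen opens...
    upload = threading.Thread(target=upload_photo, args=(image_bytes,))
    upload.start()

    # ...while the user spends a few seconds composing a clever caption.
    time.sleep(3.0)
    caption = "sunset in Todos Santos"   # hypothetical user input

    # By the time they tap "share", the upload has usually finished already,
    # so this join returns immediately and the progress bar just goes "foop".
    upload.join()
    print("posted:", caption)

share_flow(b"fake-jpeg-bytes")
```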
00:23:11 So latency matters, but relative to what?
00:23:14 And then there’s some like tricks,
00:23:17 you get around to just hiding the latency.
00:23:20 Like I don’t know if Spotify starts downloading
00:23:23 the next song eagerly, I’m assuming they do.
00:23:25 There are a bunch of ideas here that are not rocket science
00:23:29 that really help.
00:23:31 And all of that was stuff you were explicitly
00:23:35 having a discussion about, like those designs
00:23:37 and you were having like arguments, discussions.
00:23:41 I’m not sure it was arguments, I mean,
00:23:43 I'm not sure if you've met my co-founder Mike,
00:23:45 but he’s a pretty nice guy and he’s very reasonable.
00:23:48 And we both just saw eye to eye and we’re like,
00:23:51 yeah, it’s like, make this fast or at least seem fast,
00:23:56 it’ll be great.
00:23:57 Honestly, I think the most contentious thing
00:23:59 and he would say this too initially,
00:24:02 was I was on an iPhone 3G, so like the not so fast one.
00:24:08 And he had a brand new iPhone 4, that was cheap.
00:24:10 Nice.
00:24:12 And his feed loaded super smoothly,
00:24:15 like when he would scroll from photo to photo,
00:24:18 buttery smooth, right?
00:24:20 But on my phone, every time you got to a new photo,
00:24:22 it was like, kachunk, kachunk, allocate memory,
00:24:25 like all this stuff, right?
00:24:27 I was like, Mike, that’s unacceptable.
00:24:29 He’s like, oh, come on, man, just like upgrade your phone.
00:24:31 Basically, he didn’t actually say that,
00:24:33 he was nicer than that.
00:24:35 But I could tell he wished like,
00:24:36 I would just stop being cheap and just get a new phone.
00:24:39 But what’s funny is we actually sat there working
00:24:41 on that little detail for a few days before launch.
00:24:45 And that polished experience,
00:24:48 plus the fact that uploading seemed fast
00:24:51 for all these people who didn’t have nice phones,
00:24:54 I think meant a lot because far too often,
00:24:57 you see teams focus not on performance,
00:25:00 they focus on what’s the cool computer science problem
00:25:03 they can solve, right?
00:25:05 Can we scale this thing to a billion users
00:25:08 and they’ve got like a hundred, right?
00:25:11 Yeah.
00:25:12 You talked about loss function,
00:25:14 so I want to come back to that.
00:25:16 Like the loss function is like,
00:25:17 do you provide a great, happy, magical,
00:25:20 whatever experience for the consumer?
00:25:23 And listen, if it happens to involve something complex
00:25:25 and technical, then great.
00:25:27 But it turns out, I think most of the time,
00:25:32 those experiences are just sitting there waiting to be built
00:25:34 with like not that complex solutions.
00:25:38 But everyone is just like so stuck in their own head
00:25:40 that they have to overengineer everything
00:25:42 and then they forget about the easy stuff.
00:25:45 I mean, also, maybe to flip the loss function there is,
00:25:48 you’re trying to minimize the number of times
00:25:50 there’s unpleasant experience, right?
00:25:54 Like the one you mentioned where when you go
00:25:56 to the next photo, it freezes for a little bit.
00:25:58 So it’s almost, as opposed to maximizing pleasure,
00:26:00 it’s probably easier to minimize the number of like,
00:26:04 the friction.
00:26:05 Yeah.
00:26:06 And as we all know, you just make the pleasure negative
00:26:10 and then minimize everything, so.
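Written out, the joke is just the standard identity that maximizing a reward and minimizing its negation are the same problem: define the loss as $L(\theta) = -R(\theta)$, and $\arg\max_\theta R(\theta) = \arg\min_\theta L(\theta)$.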
00:26:13 We’re mapping this all back to neural networks.
00:26:14 But actually, can I say one thing on that,
00:26:16 which is I don’t know a lot about machine learning,
00:26:19 but I feel like I’ve tried studying a bunch.
00:26:23 That whole idea of reinforcement learning
00:26:26 and planning out more than the greedy single experience,
00:26:30 I think is the closest you can get
00:26:34 to like ideal product design thinking,
00:26:38 where you’re not saying,
00:26:39 hey, like, can we have a great experience just this one time?
00:26:43 But like, what is the right way to onboard someone?
00:26:46 What series of experiences correlate most with them
00:26:50 hanging on long term, right?
00:26:52 So not just saying, oh, did the photo load slowly
00:26:55 a couple of times, or did they get a great photo
00:26:57 at the top of their feed?
00:27:00 But like, what are the things that are gonna make
00:27:01 this person come back over the next week,
00:27:04 over the next month?
00:27:06 And as a product designer asking yourself,
00:27:08 okay, I wanna optimize, not just minimize bad experiences
00:27:11 in the short run, but like,
00:27:13 how do I get someone to engage over the next month?
00:27:17 And I’m not gonna claim at all that I thought that way
00:27:20 at all at the time, because I certainly didn’t.
00:27:23 But if I were going back and giving myself any advice,
00:27:25 it would be thinking, what are those second order effects
00:27:28 that you can create?
00:27:30 And it turns out having your friends on the service
00:27:33 is an enormous win.
00:27:34 So starting with a very small group of people
00:27:37 that produce content that you wanted to see, which we did,
00:27:40 we seeded the community very well, I think.
00:27:43 Ended up mattering, and so.
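In reinforcement-learning terms, the shift being described is from greedily optimizing one session to optimizing an expected discounted return over future sessions, roughly

$$V \;=\; \mathbb{E}\Big[\sum_{t=0}^{T} \gamma^{\,t}\, r_t\Big], \qquad 0 < \gamma < 1,$$

where $r_t$ might stand for "came back and had a good experience on day $t$" rather than "liked the first photo in this session." The reward definition here is an illustrative assumption, not anything Instagram actually used.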
00:27:46 Yeah, you said that community is one
00:27:48 of the most important things.
00:27:49 So it’s from a metrics perspective,
00:27:52 from maybe a philosophy perspective,
00:27:55 building a certain kind of community within the app.
00:27:57 See, I wasn’t sure what exactly you meant by that
00:28:00 when I first heard you say that.
00:28:01 Maybe you can elaborate, but as I understand now,
00:28:04 it can literally mean get your friends onto the app.
00:28:09 Yeah, think of it this way.
00:28:12 You can build an amazing restaurant or bar or whatever,
00:28:16 right, but if you show up and you’re the only one there,
00:28:20 is it like, does it matter how good the food is?
00:28:24 The drinks, whatever?
00:28:25 No, these are inherently social experiences
00:28:28 that we were working on.
00:28:30 So the idea of having people there,
00:28:35 like you needed to have that,
00:28:36 otherwise it was just a filter out.
00:28:39 But by the way, part of the genius,
00:28:41 I’m gonna say genius, even though it wasn’t really genius,
00:28:43 was that starting out masquerading as a filter app was awesome.
00:28:50 The fact that you could,
00:28:51 so we talk about single player mode a lot,
00:28:53 which is like, can you play the game alone?
00:28:56 And Instagram, you could totally play alone.
00:28:57 You could filter your photos,
00:28:59 and a lot of people would tell me,
00:29:00 I didn’t even realize that this thing was a social network
00:29:04 until my friend showed up.
00:29:06 It totally worked as a single player game.
00:29:09 And then when your friends showed up,
00:29:10 all of a sudden it was like,
00:29:11 oh, not only was this great alone,
00:29:14 but now I actually have this trove of photos
00:29:16 that people can look at and start liking,
00:29:19 and then I can like theirs.
00:29:20 And so it was this bootstrap method
00:29:23 of how do you make the thing not suck
00:29:25 when the restaurant is empty?
00:29:27 Yeah, but the thing is, when you say friends,
00:29:29 we’re not necessarily referring to friends
00:29:31 in the physical space.
00:29:32 So you’re not bringing your physical friends with you.
00:29:35 You’re also making new friends.
00:29:36 So you’re finding new community.
00:29:38 So it’s not immediately obvious to me
00:29:40 that it’s almost like building any kind of community.
00:29:45 It was both.
00:29:46 And what we learned very early on
00:29:48 was what made Instagram special
00:29:50 and the reason why you would sign up for it
00:29:51 versus say, just sit on Facebook
00:29:53 and look at your friends photos.
00:29:55 Of course we were live,
00:29:56 and of course it was interesting
00:29:58 to see what your friends were doing now.
00:30:00 But the fact that you could connect with people
00:30:02 who like took really beautiful photos in a certain style
00:30:05 all around the world, whether they were travelers,
00:30:07 it was the beginning of the influencer economy.
00:30:11 It was these people
00:30:12 who became professional Instagrammers way back when.
00:30:16 But they took these amazing photos
00:30:18 and some of them were photographers professionally.
00:30:24 And all of a sudden you had this moment in the day
00:30:25 when you could open up this app
00:30:27 and sure you could see what your friends were doing,
00:30:28 but also it was like, oh my God,
00:30:30 that’s a beautiful waterfall or oh my God,
00:30:33 I didn’t realize there was that corner of England
00:30:35 or like really cool stuff.
00:30:38 And the beauty about Instagram early on
00:30:40 was that it was international by default.
00:30:43 You didn’t have to speak English to use it, right?
00:30:46 You could just look at the photos, works great.
00:30:49 We did translate, we had some pretty bad translations,
00:30:52 but we did translate the app.
00:30:54 And even if our translations were pretty poor,
00:30:57 the idea that you could just connect with other people
00:31:00 through their images was pretty powerful.
00:31:02 So how much technical difficulties
00:31:05 there with the programming?
00:31:07 Like what programming language you were talking about?
00:31:09 What was?
00:31:09 Zero, like maybe it was hard for us,
00:31:12 but I mean, there was nothing.
00:31:16 The only thing that was complex about Instagram
00:31:18 at the beginning technically was making it scale.
00:31:23 We were just plain old Objective C for the client.
00:31:27 So it was iPhone only at first?
00:31:29 iPhone only, yep.
00:31:30 As an Android person, I’m deeply offended, but go ahead.
00:31:34 This was 2010.
00:31:35 Oh, sure, sure.
00:31:36 Sorry, sorry.
00:31:37 Android’s getting a lot better, right?
00:31:39 So.
00:31:40 I take it back, you’re right.
00:31:41 If I were to do something today,
00:31:43 I think it would be very different
00:31:44 in terms of launch strategy, right?
00:31:46 Android’s enormous too.
00:31:49 But anyway, back to that moment,
00:31:51 it was Objective C and then we were Python based,
00:31:56 which is just like, this is before Python was really cool.
00:32:00 Like now it’s cool
00:32:01 because it’s all these machine learning libraries
00:32:03 like support Python and right.
00:32:05 Now it’s super, now it’s like cool to be Python.
00:32:08 Back then it was like, oh, Google uses Python.
00:32:10 Like maybe you should use Python.
00:32:12 Facebook was PHP.
00:32:14 Like I had worked at a small startup
00:32:16 of some ex Googlers that used Python.
00:32:19 So we used it and we used a framework called Django,
00:32:22 which still exists and people still use, basically for the backend.
00:32:27 And then you threw a couple interesting things in there.
00:32:30 I mean, we used Postgres, which was kind of fun.
00:32:32 It was a little bit like Hipster database
00:32:34 at the time, right?
00:32:34 Versus MySQL.
00:32:36 MySQL, like everyone used MySQL.
00:32:37 So like using Postgres was like an interesting decision.
00:32:40 Right?
00:32:42 But we used it because it had a bunch of geo features
00:32:45 built in because we thought we were gonna be
00:32:46 a check in app, remember?
00:32:47 It’s also super cool now.
00:32:49 So you were into Python before it was cool
00:32:51 and you were into Postgres before it was cool.
00:32:53 Yeah, we were basically like,
00:32:55 not only Hipster photo company,
00:32:57 Hipster tech company, right?
00:33:00 We also adopted Redis early and like loved it.
00:33:04 I mean, it solves so many problems for us
00:33:06 and turns out that’s still pretty cool.
00:33:09 But the programming was very easy.
00:33:11 It was like, sign up a user, have a feed.
00:33:13 There was nothing, no machine learning at all, zero.
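For flavor, a minimal sketch of what a "sign up a user, have a feed" backend looks like on that era's stack: a small Django model plus Redis as a per-user feed cache. The model fields and key names are invented for illustration, not Instagram's actual schema.

```python
# models.py -- deliberately tiny Django model; field names are illustrative only.
from django.db import models
from django.contrib.auth.models import User

class Photo(models.Model):
    owner = models.ForeignKey(User, on_delete=models.CASCADE)
    image = models.ImageField(upload_to="photos/")
    caption = models.CharField(max_length=500, blank=True)
    lat = models.FloatField(null=True)   # geo fields left over from the check in days
    lng = models.FloatField(null=True)
    created_at = models.DateTimeField(auto_now_add=True)

# feed.py -- Redis holding each user's feed as a list of recent photo ids.
import redis

r = redis.Redis()

def fan_out(photo_id, follower_ids):
    # Push the new photo onto each follower's feed, newest first, capped at 1000.
    for uid in follower_ids:
        r.lpush(f"feed:{uid}", photo_id)
        r.ltrim(f"feed:{uid}", 0, 999)

def get_feed(user_id, count=30):
    return r.lrange(f"feed:{user_id}", 0, count - 1)
```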
00:33:17 Can you get some context?
00:33:18 How many users at each of these stages?
00:33:20 Are we talking about a hundred users, a thousand users?
00:33:24 So the stage I just described,
00:33:25 I mean that technical stack lasted
00:33:27 through probably 50 million users.
00:33:32 I mean, seriously, like you can get away with a lot
00:33:36 with a pretty basic stack.
00:33:38 Like I think a lot of startups try to overengineer
00:33:41 their solutions from the beginning to like really scale
00:33:43 and you can get away with a lot.
00:33:45 That being said, most of the first two years of Instagram
00:33:48 was literally just trying to make that stack scale.
00:33:51 And it wasn’t, it was not a Python problem.
00:33:54 It was like literally just like, where do we put the data?
00:33:58 Like it’s all coming in too fast.
00:33:59 Like how do we store it?
00:34:01 How do we make sure to be up?
00:34:02 How do we like, how do we make sure we’re
00:34:05 on the right size boxes that they have enough memory?
00:34:09 Those were the issues, but.
00:34:11 Can you speak to the choices you make at that stage
00:34:14 when you’re growing so quickly?
00:34:16 Do you use something like somebody else’s
00:34:18 computer infrastructure or do you build in house?
00:34:22 I’m only laughing because we, when we launched
00:34:25 we had a single computer that we had rented
00:34:30 in some colo space in LA.
00:34:32 I don’t even remember what it was called.
00:34:34 Cause I thought that’s what you did.
00:34:35 When I worked at a company called Odio that became Twitter.
00:34:38 I remember visiting our space in San Francisco.
00:34:41 You walked in, you had to wear the ear things.
00:34:43 It was cold and fans everywhere, right?
00:34:46 And we had to, you know, plug one out, replace one.
00:34:49 And I was the intern, so I just like held things.
00:34:52 But I thought to myself, oh, this is how it goes.
00:34:54 And then I remember being in a VC’s office.
00:34:58 I think it was a benchmarks office.
00:35:00 And I think we ran into another entrepreneur
00:35:02 and they were like, oh, how are things going?
00:35:03 We’re like, ah, you know, try to scale this thing.
00:35:06 And they were like, well, I mean
00:35:08 can’t you just add more instances?
00:35:09 And I was like, what do you mean?
00:35:11 And they’re like instances on Amazon.
00:35:13 I was like, what are those?
00:35:15 And it was this moment where we realized how deep
00:35:18 in it we were because we had no idea that AWS existed
00:35:22 nor should we be using it.
00:35:24 Anyway, that night we went back to the office
00:35:27 and we got on AWS, but we did this really dumb thing.
00:35:29 We’re so sorry to people listening.
00:35:32 But we brought up an instance, which was our database.
00:35:38 It’s gonna be a replacement for our database.
00:35:40 But we had it talking over the public internet
00:35:43 to our little box in LA that was our app server.
00:35:46 Very nice. Yeah.
00:35:48 That’s how sophisticated we were.
00:35:49 And obviously that was very, very slow.
00:35:52 Didn’t work at all.
00:35:53 I mean, it worked, but didn’t work.
00:35:55 Only like later that night did we realize
00:35:58 we had to have it all together.
00:36:00 But at least like if you’re listening right now
00:36:02 and you’re thinking, you know, I have no chance.
00:36:04 I'm gonna start a startup, but I have no chance.
00:36:06 I don’t know.
00:36:07 We did it and we made a bunch
00:36:08 of really dumb mistakes initially.
00:36:10 I think the question is how quickly do you learn
00:36:12 that you’re making a mistake?
00:36:13 And do you do the right thing immediately right after?
00:36:16 So you didn’t pay for those mistakes by failure.
00:36:20 So yeah, how quickly did you fix it?
00:36:23 I guess there’s a lot of ways to sneak up to this question
00:36:26 of how the hell do you scale the thing?
00:36:28 Other startups, if you have an idea,
00:36:30 how do you scale the thing?
00:36:31 Is it just AWS and you try to write the kind of code
00:36:37 that’s easy to spread across a large number of instances,
00:36:41 and then the rest is just put money into it?
00:36:45 Basically, I would say a couple of things.
00:36:48 First off, don’t even ask the question,
00:36:51 just find product market fit, duct tape it together, right?
00:36:55 Like if you have to.
00:36:56 I think there’s a big caveat here, which I want to get to,
00:37:00 but generally all that matters is product market fit.
00:37:03 That’s all that matters.
00:37:04 If people like your product,
00:37:06 do not worry about what happens when 50,000 people use your product
00:37:10 because you will be happy that you have that problem
00:37:12 when you get there.
00:37:13 I actually can’t name many startups
00:37:18 where they go from nothing to something overnight
00:37:22 and they can’t figure out how to scale it.
00:37:24 There are some, but I think nowadays,
00:37:27 it’s a, when I say a solved problem,
00:37:29 like there are ways of solving it.
00:37:33 The base case is typically that startups
00:37:35 worry way too much about scaling way too early
00:37:38 and forget that they actually have to make something
00:37:39 that people like.
00:37:40 That’s the default mistake case.
00:37:43 But what I’ll say is once you start scaling,
00:37:48 I mean, hiring quickly,
00:37:49 people who have seen the game before
00:37:51 and just know how to do it,
00:37:52 it becomes a bit of like, yeah,
00:37:56 just throw instances at the problem, right?
00:37:59 But the last thing I’ll say on this
00:38:00 that I think did save us,
00:38:03 we were pretty rigorous about writing tests
00:38:06 from the beginning.
00:38:08 That helped us move very, very quickly
00:38:11 when we wanted to rewrite parts of the product
00:38:14 and know that we weren’t breaking something else.
00:38:17 Tests are one of those things where it’s like,
00:38:18 you go slow to go fast.
00:38:20 And they suck when you have to write them
00:38:22 because you have to figure it out.
00:38:24 And there are always those ones that break
00:38:27 when you don’t want them to break and they’re annoying
00:38:29 and it feels like you spent all this time.
00:38:30 But looking back, I think that like longterm optimal,
00:38:34 even with a team of four,
00:38:36 it allowed us to move very, very quickly
00:38:38 because anyone could touch any part of the product
00:38:41 and know that they weren’t gonna bring down the site,
00:38:44 or at least in general.
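A minimal example of the "go slow to go fast" idea, as a plain pytest-style test around a hypothetical build_feed helper; the value is that anyone on a four-person team can rewrite the internals and know right away whether they broke the feed.

```python
# test_feed.py -- small, boring test; build_feed is a hypothetical helper.
def build_feed(photos, viewer_id):
    # Newest photos first, excluding the viewer's own posts.
    return sorted(
        (p for p in photos if p["owner"] != viewer_id),
        key=lambda p: p["created_at"],
        reverse=True,
    )

def test_feed_is_newest_first_and_excludes_own_posts():
    photos = [
        {"id": 1, "owner": "kevin", "created_at": 100},
        {"id": 2, "owner": "mike", "created_at": 200},
        {"id": 3, "owner": "mike", "created_at": 150},
    ]
    feed = build_feed(photos, viewer_id="kevin")
    assert [p["id"] for p in feed] == [2, 3]
```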
00:38:45 At which point do you know product market fit?
00:38:48 How many users would you say?
00:38:50 Is it all it takes is like 10 people?
00:38:52 Or is it a thousand?
00:38:53 Is it 50,000?
00:38:55 I don’t think it is generally
00:38:58 a question of absolute numbers.
00:38:59 I think it’s a question of cohorts
00:39:01 and I think it’s a question of trends.
00:39:03 So, you know, it depends how big your business is trying
00:39:08 to be, right?
00:39:09 But if I were signing up a thousand people a week
00:39:12 and they all retain,
00:39:14 like the retention curves for those cohorts looked good,
00:39:16 healthy, and even like,
00:39:19 as you started getting more people on the service,
00:39:21 maybe those earlier cohorts started curving up again
00:39:24 because now there are network effects
00:39:26 and their friends are on the service
00:39:27 or totally depends what type of business you’re in,
00:39:29 but I’m talking purely social, right?
00:39:34 I don’t think it’s an absolute number.
00:39:36 I think it is,
00:39:37 I guess you could call it a marginal number.
00:39:39 So I spent a lot of time when I work with startups
00:39:42 asking them like, okay,
00:39:44 have you looked at that cohort versus this cohort,
00:39:47 whether it’s your clients
00:39:48 or whether it’s people signing up for the service?
00:39:52 But a lot of people think you just have to hit some mark,
00:39:55 like 10,000 people or 50,000 people,
00:39:57 but really seven ish billion people in the world.
00:40:01 Most people forever will not know about your product.
00:40:05 There are always more people out there to sign up.
00:40:07 It’s just a question of how you turn on the spigot, so.
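A rough sketch of the cohort view he's describing, in Python with pandas: group users by signup week and ask what fraction are still active N weeks later. The column names and input shape are assumptions for illustration.

```python
import pandas as pd

def retention_by_cohort(events: pd.DataFrame) -> pd.DataFrame:
    # events: one row per (user_id, signup_week, active_week), assumed schema.
    events = events.copy()
    events["weeks_since_signup"] = events["active_week"] - events["signup_week"]
    cohort_sizes = events.groupby("signup_week")["user_id"].nunique()
    active = (
        events.groupby(["signup_week", "weeks_since_signup"])["user_id"]
        .nunique()
        .unstack(fill_value=0)
    )
    # Each row is a signup cohort; each column is the share still active N weeks in.
    return active.div(cohort_sizes, axis=0)
```

Healthy product market fit in this picture is retention curves that flatten out well above zero, with later cohorts, or even earlier ones once their friends join, sitting higher than the ones before them.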
00:40:11 At that stage, early stage yourself,
00:40:14 but also by way of advice,
00:40:16 should you worry about money at all?
00:40:18 How this thing’s gonna make money?
00:40:20 Or do you just try to find product market fit
00:40:24 and get a lot of users to enjoy using your thing?
00:40:28 I think it totally depends.
00:40:30 And that’s an unsatisfying answer.
00:40:32 I was talking with a friend today who,
00:40:36 he was one of our earlier investors and he was saying,
00:40:39 hey, like, have you been doing any angel investing lately?
00:40:41 I said, not really.
00:40:42 I’m just like focused on what I wanna do next.
00:40:44 And he said, the number of financings has just gone bonkers.
00:40:49 Like people are throwing money everywhere right now.
00:40:56 And I think the question is,
00:41:00 do you have an inkling of how you’re gonna make money?
00:41:04 Or are you really just like waving your hands?
00:41:07 I would not like to be an entrepreneur in the position of,
00:41:11 well, I have no idea how this will eventually make money.
00:41:14 That’s not fun.
00:41:16 If you are in an area,
00:41:18 like let’s say you wanted to start a social network, right?
00:41:22 Not saying this is a good idea, but if you did,
00:41:25 there are only a handful of ways they’ve made money
00:41:27 and really only one way they’ve made money in the past
00:41:29 and that’s ads.
00:41:31 So, if you have a service that’s amenable to that
00:41:37 and then I wouldn’t worry too much about that
00:41:39 because if you get to the scale,
00:41:40 you can hire some smart people and figure that out.
00:41:44 I do think that is really healthy for a lot of startups
00:41:48 these days, especially the ones doing
00:41:50 like enterprise software, slacks of the world, et cetera,
00:41:54 to be worried about money from the beginning,
00:41:56 but mostly as a way of winning over clients
00:42:00 and having stickiness.
00:42:05 Of course you need to be worried about money,
00:42:06 but I’m gonna also say this again,
00:42:08 which is it’s like longterm profitability.
00:42:12 If you have a roadmap to that, then that’s great.
00:42:15 But if you’re just like, I don’t know, maybe never,
00:42:18 like we’re working on this metaverse thing,
00:42:19 I think maybe someday, I don’t know.
00:42:22 Like that seems harder to me.
00:42:24 So you have to be as big as Facebook
00:42:26 to like finance that bet, right?
00:42:29 Do you think it’s possible, you said,
00:42:31 you’re not saying it’s necessarily a good idea
00:42:33 to launch a social network.
00:42:35 Do you think it’s possible today,
00:42:38 maybe you can put yourself in those shoes,
00:42:41 to launch a social network that achieves the scale
00:42:45 of a Facebook or a Twitter or an Instagram,
00:42:49 and maybe even greater scale?
00:42:51 Absolutely.
00:42:53 How do you do it?
00:42:55 Asking for a friend.
00:42:56 Yeah, if I knew, I’d probably be doing it right now
00:42:59 and not sitting here.
00:43:00 So, I mean, there’s a lot of ways to ask this question.
00:43:03 One is create a totally new product market fit,
00:43:07 create a new market, create something like Instagram did,
00:43:10 which is like create something kind of new,
00:43:13 or literally out-compete Facebook at its own thing,
00:43:17 or out-compete Twitter at its own thing.
00:43:19 The only way to compete now,
00:43:21 if you wanna build a large social network
00:43:23 is to look for the cracks, look for the openings.
00:43:28 No one competed, I mean,
00:43:31 no one competed with the core business of Google.
00:43:33 No one competed with the core business of Microsoft.
00:43:36 You don’t go at the big guys
00:43:39 doing exactly what they’re doing.
00:43:41 Instagram didn’t win, quote unquote,
00:43:43 because it tried to be a visual Twitter.
00:43:46 Like we spotted things that either Twitter
00:43:49 wasn't going to do or refused to do,
00:43:52 images in the feed for the longest time, right?
00:43:55 Or that Facebook wasn’t doing or not paying attention to
00:43:58 because they were mostly desktop at the time
00:44:00 and we were purely mobile, purely visual.
00:44:05 Often there are opportunities sitting there.
00:44:07 You have to figure out like,
00:44:12 I think like there’s a strategy book,
00:44:13 I can’t remember the name,
00:44:14 but it talks about moats, and just like the best place to play
00:44:19 is where your competitor literally can’t pivot
00:44:22 because structurally they’re set up not to be there.
00:44:26 And that’s where you win.
00:44:28 And what’s fascinating is like,
00:44:30 do you know how many people were like,
00:44:31 images, Facebook does that, Twitter does that.
00:44:35 I mean, how wrong were they, really wrong?
00:44:37 And these are some of the smartest people
00:44:38 in Silicon Valley, right?
00:44:40 But now Instagram exists for a while.
00:44:42 How is it that Snapchat could then exist?
00:44:45 It makes no sense.
00:44:47 Like plenty of people would say,
00:44:48 well, there’s Facebook, no images.
00:44:50 Okay, okay, Instagram, I’ll give you that one.
00:44:52 But wait, now another image based social network
00:44:55 is gonna get really big.
00:44:57 And then TikTok comes along.
00:44:59 Like the prior, so you asked me, is it possible?
00:45:03 The only reason I’m answering yes
00:45:05 is because my prior is that it’s happened once every,
00:45:09 I don’t know, three, four or five years consistently.
00:45:13 And I can’t imagine there’s anything structurally
00:45:15 that would change that.
00:45:17 So that’s why I answer that way.
00:45:18 Not because I know how, I just,
00:45:21 when you see a pattern, you see a pattern
00:45:22 and there’s no reason to believe that’s gonna stop.
00:45:25 And it’s subtle too, because like you said,
00:45:27 Snapchat and TikTok,
00:45:29 they’re all doing the same space of things,
00:45:32 but there’s something fundamentally different
00:45:34 about like a three second video and a five second video
00:45:38 and a 15 second video and a one minute video
00:45:41 and a one hour video, like fundamentally different.
00:45:43 Fundamentally different.
00:45:45 I mean, I think one of the reasons Snapchat exists
00:45:48 is because Instagram was so focused on posting great,
00:45:51 beautiful manicured versions of yourself throughout time.
00:45:57 And there was this enormous demand of like,
00:45:58 hey, I really like this behavior.
00:46:00 I love using Instagram, but man,
00:46:03 I just like wish I could share something going on in my day.
00:46:07 Do I really have to put it on my profile?
00:46:10 Do I really have to make it last forever?
00:46:11 Do I really, and that opened up a door,
00:46:14 it created a market, right?
00:46:16 And then what’s fascinating is Instagram had an explore page
00:46:20 for the longest time and it was image driven, right?
00:46:23 But there’s absolutely a behavior where you open up Instagram
00:46:26 and you sit on the explore page all day.
00:46:28 That is effectively TikTok,
00:46:29 but obviously focused on videos.
00:46:32 And it’s not like you could just put the explore page
00:46:35 in TikTok form and it works.
00:46:37 It had to be video, it had to have music.
00:46:39 These are the hard parts about product development
00:46:42 that are very hard to predict,
00:46:44 but they’re all versions of the same thing
00:46:47 with varying, like if you line them up
00:46:50 in a bunch of dimensions, they’re just like kind of on,
00:46:55 they’re different values of the same dimensions,
00:46:56 which is like, I guess, easy to say in retrospect.
00:46:59 But like, if I were an entrepreneur going after that area,
00:47:02 I’d ask myself like, where’s the opening?
00:47:05 What needs to exist because TikTok exists now?
00:47:08 So I wonder how much things that don’t yet exist
00:47:13 and can exist is in the space of algorithms,
00:47:15 in the space of recommender systems.
00:47:18 So in the space of how the feed is generated.
00:47:21 So we kind of talk about the actual elements
00:47:24 of the content, that’s what we’ve been talking,
00:47:27 the difference between photos,
00:47:29 between short videos, longer videos.
00:47:32 I wonder how much disruption is possible
00:47:35 in the way the algorithms work.
00:47:37 Because a lot of the criticism towards social media
00:47:39 is in the way the algorithms work currently.
00:47:41 And it feels like, first of all,
00:47:44 talking about product market fit,
00:47:47 there’s certainly a hunger for social media algorithms
00:47:54 that do something different.
00:47:56 Everyone is kind of complaining,
00:47:59 this is hurting me and this is hurting society,
00:48:04 but I keep doing it because I’m addicted to it.
00:48:07 And they say, we want something different,
00:48:09 but we don’t know what.
00:48:11 It feels like just different.
00:48:15 It feels like there’s a hunger for that,
00:48:17 but that’s in the space of algorithms.
00:48:19 I wonder if it’s possible to disrupt in that space.
00:48:22 Absolutely, I have this thesis that the worst part
00:48:27 about social networks is the people.
00:48:33 It’s a line that sounds funny, right?
00:48:36 Because like, that’s why you call it a social network.
00:48:39 But what does social networks actually do for you?
00:48:42 Like just think, like imagine you were an alien
00:48:45 and you landed and someone says,
00:48:47 hey, there’s this site, it’s a social network.
00:48:49 We’re not gonna tell you what it is,
00:48:50 but just what does it do?
00:48:51 And you have to explain it to them.
00:48:53 It does two things.
00:48:54 One is that people you know and have social ties with
00:49:00 distribute updates through whether it’s photos or videos
00:49:05 about their lives so that you don’t have to physically
00:49:07 be with them, but you can keep in touch with them.
00:49:09 That’s one, that’s like a big part of Instagram.
00:49:12 That’s a big part of Snap.
00:49:14 It is not part of TikTok at all.
00:49:16 So there’s another big part, which is there’s all this
00:49:19 content out in the world that’s entertaining,
00:49:22 whether you wanna watch it or you wanna read it.
00:49:26 And matchmaking between content that exists in the world
00:49:30 and people that want that content
00:49:33 turns out to be like a really big business, right?
00:49:35 Search and discovery, you mean?
00:49:37 Search and discovery, but my point is it could be video,
00:49:39 it could be text, it could be websites, it could be,
00:49:42 I mean, think back to like Digg, right?
00:49:46 Or StumbleUpon, right?
00:49:49 Nice, yeah.
00:49:50 But like, what did those do?
00:49:51 Like they basically distributed interesting content
00:49:54 to you, right?
00:49:57 I think the most interesting part or the future
00:50:00 of social networks is going to be making them less social
00:50:03 because I think people are part of the root cause
00:50:06 of the problem.
00:50:07 So for instance, often in recommender systems,
00:50:10 we talk about two stages.
00:50:11 There’s a candidate generation step, which is just like
00:50:14 of our vast trove of stuff that you might wanna see,
00:50:18 what small subset should we pick for you, okay?
00:50:24 Typically that is grabbed from things
00:50:25 your friends have shared, right?
00:50:28 Then there’s a ranking step which says, okay,
00:50:30 now given these 100, 200 things depends on the network,
00:50:33 right?
00:50:34 Let’s like be really good about ranking them
00:50:36 and generally rank the things up higher
00:50:38 that get the most engagement, right?
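To make the two-stage setup Systrom describes concrete, here is a minimal Python sketch of a feed recommender: a candidate-generation step restricted to items shared by accounts you follow, then a ranking step ordered by a predicted-engagement score. All function names, toy data, and scores are hypothetical illustrations, not a description of Instagram's or any real system.

```python
# Hypothetical two-stage feed recommender:
# stage 1 pulls a small candidate pool (items shared by people you follow),
# stage 2 ranks that pool by a predicted-engagement score.

from typing import Dict, List


def generate_candidates(follows: List[str],
                        items_by_author: Dict[str, List[dict]],
                        limit: int = 200) -> List[dict]:
    """Stage 1: restrict the vast trove of items to things your
    friends or followed accounts chose to share."""
    candidates: List[dict] = []
    for author in follows:
        candidates.extend(items_by_author.get(author, []))
    return candidates[:limit]


def rank(candidates: List[dict]) -> List[dict]:
    """Stage 2: order candidates by predicted engagement
    (a stand-in number here; real systems use learned models)."""
    return sorted(candidates,
                  key=lambda item: item["predicted_engagement"],
                  reverse=True)


# Toy usage
items_by_author = {
    "alice": [{"id": 1, "predicted_engagement": 0.7}],
    "bob":   [{"id": 2, "predicted_engagement": 0.9}],
}
feed = rank(generate_candidates(["alice", "bob"], items_by_author))
print([item["id"] for item in feed])  # highest predicted engagement first
```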
00:50:40 So what’s the problem with that?
00:50:42 Step one is we’ve limited everything you could possibly see
00:50:46 to things that your friends have chosen to share
00:50:49 or maybe not friends, but influencers.
00:50:52 What things do people generally want to share?
00:50:54 They wanna share things that are gonna get likes,
00:50:56 that are gonna show up broadly.
00:50:58 So they tend to be more emotionally driven.
00:51:00 They tend to be more risque or whatever.
00:51:03 So why do we have this problem?
00:51:05 It’s because we show people things people have decided
00:51:08 to share and those things self select to being the things
00:51:11 that are most divisive.
00:51:14 So how do you fix that?
00:51:15 Well, just imagine for a second:
00:51:20 why do you have to grab things
00:51:21 from things your friends have shared?
00:51:23 Why not just like grab things?
00:51:25 That’s really fascinating to me.
00:51:27 And that’s something I’ve been thinking a lot about.
00:51:29 And just like, why is it that when you log onto Twitter,
00:51:34 you’re just sitting there looking at things from accounts
00:51:38 that you’ve followed for whatever reason?
00:51:42 And TikTok I think has done a wonderful job here,
00:51:44 which is like, you can literally be anyone.
00:51:47 And if you produce something fascinating, it’ll go viral.
00:51:51 But like, you don’t have to be someone that anyone knows.
00:51:54 You don’t have to have built up a giant following.
00:51:57 You don’t have to have paid for followers.
00:52:00 You don’t have to try to maintain those followers.
00:52:01 You literally just have to produce something interesting.
00:52:04 That is I think the future of social networking.
00:52:07 That’s the direction things will head.
00:52:10 And I think what you’ll find is it’s far less
00:52:12 about people manipulating distribution
00:52:15 and far more about what is like, is this content good?
00:52:19 And good is obviously a vague definition
00:52:22 that we could spend hours on.
00:52:23 But different networks I think will decide
00:52:27 different value functions to decide what is good
00:52:29 and what isn’t good.
00:52:30 And I think that’s a fascinating direction.
00:52:32 So that’s almost like creating an internet.
00:52:33 I mean, that’s what Google did for web pages
00:52:36 that did page rank search.
00:52:39 So it’s discovery, you don’t follow anybody on Google
00:52:42 when you use a search engine.
00:52:44 You just discover web pages.
00:52:46 And so what TikTok does is saying,
00:52:49 let’s start from scratch.
00:52:51 Let’s like start a new internet
00:52:54 and have people discover stuff on that new internet
00:52:56 within a particular kind of pool of people.
00:52:59 But what’s so fascinating about this
00:53:01 is like the field of information retrieval.
00:53:05 Like, as I was studying this stuff,
00:53:08 it always used the words query and document.
00:53:10 So I was like, why are they saying query and documents?
00:53:12 Like, literally imagine,
00:53:14 if you just stop thinking about a query
00:53:17 as literally a search query,
00:53:18 a query could be a person.
00:53:20 I mean, a lot of the way,
00:53:22 I’m not gonna claim to know how Instagram or Facebook
00:53:24 machine learning works today,
00:53:26 but if you want to find a match for a query,
00:53:30 the query is actually the attributes of the person,
00:53:33 their age, their gender, where they’re from,
00:53:36 maybe some kind of summarization of their interests.
00:53:39 And that’s a query, right?
00:53:41 And that matches against documents.
00:53:43 And by the way, documents don’t have to be texts.
00:53:45 They can be videos, however long.
00:53:48 I don’t know what the limit is on TikTok these days.
00:53:50 They keep changing it.
00:53:51 My point is just, you’ve got a query,
00:53:53 which is someone in search of something
00:53:55 that they want to match and you’ve got the document
00:53:58 and it doesn’t have to be text.
00:53:59 It could be anything.
00:54:00 And how do you match make?
00:54:02 And that’s one of these like,
00:54:03 I mean, I’ve spent a lot of time thinking about this
00:54:06 and I don’t claim to have mastered it at all,
00:54:08 but I think it’s so fascinating about where that will go
00:54:11 with new social networks.
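As a toy illustration of the query/document framing, the sketch below treats the "query" as a vector summarizing a person's attributes and interests, and each "document" (a video, a post, a page) as a vector in the same space; matchmaking is then just similarity scoring. The embeddings, item names, and numbers are invented for illustration, not how Instagram, Facebook, or TikTok actually work.

```python
import numpy as np

# Hypothetical: a "query" built from a person's attributes and interests,
# matched against "documents" (videos, posts, pages) embedded in the same
# vector space. Real systems learn these embeddings; these are made up.

user_query = np.array([0.9, 0.1, 0.4])  # e.g. a summary of interests

documents = {
    "cooking_video": np.array([0.8, 0.0, 0.5]),
    "news_article":  np.array([0.1, 0.9, 0.2]),
    "travel_photo":  np.array([0.7, 0.2, 0.6]),
}


def cosine(a: np.ndarray, b: np.ndarray) -> float:
    """Similarity between a query vector and a document vector."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


# Matchmaking: rank every document by similarity to the query.
ranked = sorted(documents,
                key=lambda name: cosine(user_query, documents[name]),
                reverse=True)
print(ranked)  # best matches for this "query" (person) come first
```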
00:54:13 See, what I’m also fascinated by is metrics
00:54:16 that are different than engagement.
00:54:18 So the other thing from an alien perspective,
00:54:20 what social networks are doing is they,
00:54:25 they in the short term,
00:54:26 bring out different aspects of each human being.
00:54:30 So first, let me say that an algorithm or a social network
00:54:38 for each individual can bring out the best of that person
00:54:41 or the worst of that person,
00:54:43 or there’s a bunch of different parts to us,
00:55:45 parts we’re proud of,
00:55:48 parts we’re not so proud of.
00:54:50 When we look at the big picture of our lives,
00:54:53 when we look back 30 days from now,
00:54:55 am I proud that I said those things or not?
00:54:57 Am I proud that I felt those things?
00:54:59 Am I proud that I experienced or read those things
00:55:02 or thought about those things?
00:55:04 Just in that kind of self reflected kind of way.
00:55:08 And so coupled with that,
00:55:10 I wonder if it’s possible to have different metrics
00:55:12 that are not just about engagement,
00:55:14 but are about long term happiness,
00:55:18 growth of a human being,
00:55:20 where they look back and say,
00:55:22 I am a better human being
00:55:23 for having spent 100 hours on that app.
00:55:26 And that feels like it’s actually strongly correlated
00:55:30 with engagement in the long term.
00:55:33 In the short term, it may not be,
00:55:34 but in long term, it’s like the same kind of thing
00:55:37 where you really fall in love with the product.
00:55:40 You fall in love with an iPhone,
00:55:41 you fall in love with a car.
00:55:43 That’s what makes you fall in love
00:55:45 is like really being proud
00:55:49 and just in a self reflected way,
00:55:51 understanding that you’re a better human being
00:55:53 for having used the thing.
00:55:55 And that’s what great relationships are made from.
00:55:58 It’s not just like you’re hot
00:56:01 and we like being together or something like that.
00:56:04 It’s more like I’m a better human being
00:56:06 because I’m with you.
00:56:07 And that feels like a metric
00:56:09 that could be optimized for by the algorithms.
00:56:13 But anytime I kind of talk about this with anybody,
00:56:17 they seem to say, yeah, okay,
00:56:18 that’s going to get out competed immediately
00:56:21 by the engagement if it’s ad driven especially.
00:56:24 I just don’t think so.
00:56:26 I don’t, I mean, a lot of it’s just implementation.
00:56:30 I’ll say a couple of things.
00:56:31 One is to pull back the curtain on daily meetings
00:56:36 inside of these large social media companies.
00:56:40 A lot of what management,
00:56:42 or at least the people that are tweaking these algorithms
00:56:45 spend their time on are trade offs.
00:56:47 And there’s these things called value functions,
00:56:50 which are like, okay,
00:56:51 we can predict the probability that you’ll click
00:56:54 on this thing or the probability that you’ll share it,
00:56:58 or the probability that you will leave a comment on it
00:57:01 or the probability you’ll dwell on it.
00:57:04 Individual actions, right?
00:57:07 And you’ve got this neural network
00:57:09 that basically has a bunch of heads at the end
00:57:11 and all of them are between zero and one and great.
00:57:14 They all have values, right?
00:57:15 Or they all have probabilities.
00:57:19 And then in these meetings, what they will do is say,
00:57:21 well, how much do we value a comment versus a click
00:57:26 versus a share versus a,
00:57:29 and then maybe even some downstream thing, right?
00:57:31 That has nothing to do with the item there,
00:57:34 but like driving follows or something.
00:57:37 And what typically happens is they will say,
00:57:40 well, what are our goals for this quarter at the company?
00:57:42 Oh, we wanna drive sharing up, okay.
00:57:44 Well, let’s turn down these metrics
00:57:46 and turn up these metrics.
00:57:48 And they blend them right into a single scalar
00:57:52 which they’re trying to optimize.
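The trade-off meeting described here amounts to collapsing a multi-head model's per-action probabilities into one scalar using hand-chosen weights. A hedged sketch, with invented event names and weights:

```python
# Hypothetical value function: a model's heads each predict the probability
# of one action (values in [0, 1]); the company blends them with hand-chosen
# weights into a single scalar used to rank items in the feed.

predicted = {          # outputs of the model's heads for one item
    "click":   0.30,
    "comment": 0.05,
    "share":   0.02,
    "dwell":   0.60,
    "follow":  0.01,   # a downstream action unrelated to the item itself
}

# "We want to drive sharing up this quarter" -> turn up the share weight.
weights = {"click": 0.5, "comment": 2.0, "share": 4.0, "dwell": 1.0, "follow": 3.0}

value = sum(weights[event] * p for event, p in predicted.items())
print(round(value, 3))  # the single scalar the ranker tries to optimize
```

Turning "let's drive sharing up this quarter" into practice is, in this toy picture, just raising the share weight before recomputing that scalar.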
00:57:55 That is really hard because invariably
00:57:58 you think you’re solving for, I don’t know,
00:57:59 something called meaningful interactions, right?
00:58:02 This was the big Facebook pivot.
00:58:04 And I don’t actually have any internal knowledge.
00:58:06 Like I wasn’t in those meetings,
00:58:09 but at least from what we’ve seen over the last month or so,
00:58:12 it seems by actually trying to optimize
00:58:16 for meaningful interactions,
00:58:18 it had all these side effects of optimizing
00:58:20 for these other things.
00:58:22 And I don’t claim to fully understand them,
00:58:24 but what I will say is that trade offs abound.
00:58:28 And as much as you’d like to solve for one thing,
00:58:31 if you have a network of over a billion people,
00:58:34 you’re gonna have unintended consequences either way.
00:58:36 And it gets really hard.
00:58:38 So what you’re describing is effectively a value model
00:58:41 that says like, can we capture,
00:58:43 this is the thing that I spent a lot of time thinking about,
00:58:45 like, can you capture utility
00:58:49 in a way that like actually measures someone’s happiness
00:58:53 that isn’t just a, what do they call it?
00:58:56 A surrogate problem where you say, well,
00:58:59 kind of think like the more you use the product,
00:59:01 the happier you are.
00:59:03 That was always the argument at Facebook, by the way.
00:59:04 It was like, well, people use it more,
00:59:07 so they must be more happy.
00:59:09 Turns out there are like a lot of things you use more
00:59:11 that make you less happy in the world.
00:59:12 Not talking about Facebook,
00:59:14 just let’s think about whether it’s gambling or whatever,
00:59:17 like that you can do more of,
00:59:19 but doesn’t necessarily make you happier.
00:59:20 So the idea that time equals happiness,
00:59:22 obviously you can’t map utility and time together easily.
00:59:27 There are a lot of edge cases.
00:59:28 So when you look around the world and you say,
00:59:30 well, what are all the ways we can model utility?
00:59:32 That is like one of the,
00:59:34 please, if you know someone smart doing this,
00:59:36 introduce me because I’m fascinated by it.
00:59:38 And it seems really tough.
00:59:40 But the idea that reinforcement learning,
00:59:42 like everyone interesting I know in machine learning,
00:59:46 like I was really interested in recommender systems
00:59:48 and supervised learning.
00:59:49 And the more I dug into it, I was like,
00:59:52 oh, literally everyone smart
00:59:54 is working on reinforcement learning.
00:59:56 Like literally everyone.
00:59:57 You just made people at OpenAI and DeepMind very happy, yes.
01:00:00 But I mean, but what’s interesting is like,
01:00:02 it’s one thing to train a game and like,
01:00:06 I mean that paper where they just took Atari
01:00:09 and they used a ConvNet to basically just like
01:00:12 train simple actions, mind blowing, right?
01:00:15 Absolutely mind blowing, but it’s a game, great.
01:00:18 So now what if you’re constructing a feed for a person,
01:00:23 right?
01:00:24 Like how can you construct that feed in such a way
01:00:29 that optimizes for a diversity of experience,
01:00:32 for long term happiness, right?
01:00:36 But that reward function,
01:00:38 it turns out in reinforcement learning again,
01:00:41 as I’ve learned, like reward design is really hard.
01:00:45 And I don’t know, like how do you design a scalar reward
01:00:49 for someone’s happiness over time?
01:00:51 I mean, do you have to measure dopamine levels?
01:00:53 Like, do you have to?
01:00:54 Well, you have to have a lot more signals
01:00:58 from the human being.
01:00:59 Currently it feels like there’s not enough signals
01:01:01 coming from the human beings using this algorithm.
01:01:06 So for reinforcement learning to work well,
01:01:08 you need to have a lot more data.
01:01:10 Needs to have a lot of data.
01:01:11 And that actually is a challenge for anyone
01:01:13 who wants to start something,
01:01:14 which is you don’t have a lot of data.
01:01:16 So how do you compete?
01:01:17 But I do think back to your original point,
01:01:20 rethinking the algorithm, rethinking reward functions,
01:01:23 rethinking utility, that’s fascinating.
01:01:28 That’s cool.
01:01:28 And I think that’s an open opportunity
01:01:31 for a company that figures it out.
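Since reward design comes up here, one purely illustrative way to write down the kind of reward being discussed is to blend short-term engagement with longer-horizon signals such as content diversity and periodic self-reported well-being. The signal names and coefficients below are assumptions, not a description of any real product:

```python
# Illustrative reward shaping: blend short-term engagement with
# longer-horizon signals. All signal names and coefficients are
# assumptions for the sake of the sketch.

def session_reward(engagement_minutes: float,
                   topic_diversity: float,          # 0..1, variety of content seen
                   self_reported_wellbeing: float,  # 0..1, e.g. a periodic survey
                   w_engagement: float = 0.2,
                   w_diversity: float = 0.3,
                   w_wellbeing: float = 0.5) -> float:
    """Scalar reward an RL policy could optimize per session."""
    capped_engagement = min(engagement_minutes / 60.0, 1.0)  # diminishing credit for raw time
    return (w_engagement * capped_engagement
            + w_diversity * topic_diversity
            + w_wellbeing * self_reported_wellbeing)


print(session_reward(engagement_minutes=90,
                     topic_diversity=0.4,
                     self_reported_wellbeing=0.8))
```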
01:01:34 I have to ask about April, 2012,
01:01:37 when Instagram, along with its massive employee base
01:01:43 of 13 people was sold to Facebook for $1 billion.
01:01:48 What was the process like on a business level,
01:01:50 engineering level, human level?
01:01:52 What was that process of selling to Facebook like?
01:01:54 What did it feel like?
01:01:56 So I want to provide some context,
01:01:58 which is I worked in corporate development at Google,
01:02:00 which not a lot of people know,
01:02:02 but corporate development is effectively the group
01:02:04 that buys companies, right?
01:02:06 You sit there and you acquire companies.
01:02:08 And I had sat through so many of these meetings
01:02:10 with entrepreneurs.
01:02:12 We actually, fun fact, we never acquired a single company
01:02:14 when I worked in corporate development.
01:02:15 So I can’t claim that I had like a lot of experience,
01:02:21 but I had enough experience to understand,
01:02:23 okay, like what prices are people getting
01:02:25 and what’s the process?
01:02:27 And as we started to grow,
01:02:32 we were trying to keep this thing running
01:02:33 and we were exhausted and we were 13 people.
01:02:36 And I mean, thinking back,
01:02:38 I was probably 27, I’m 37 now,
01:02:43 so young on a relative basis, right?
01:02:48 And we’re trying to keep the thing running.
01:02:50 And then we go out to raise money
01:02:53 and we’re kind of like the hot startup at the time.
01:02:57 And I remember going into a specific VC and saying,
01:03:00 our terms we’re looking for are,
01:03:02 we’re looking for a $500 million valuation.
01:03:05 And I’ve never seen so many jaws drop all in unison, right?
01:03:10 And I was like, thanked and walked out the door
01:03:12 very kindly after.
01:03:14 And then I got a call the next day
01:03:16 from someone who was connected to them.
01:03:18 And they said, we just wanna let you know
01:03:21 that like it was pretty offensive
01:03:22 that you asked for a $500 million valuation.
01:03:25 And I can’t tell if that was like just negotiating or what,
01:03:30 but it’s true, like no one offered us more, right?
01:03:32 So we were…
01:03:33 So can you clarify the number again?
01:03:35 You said how many million?
01:03:37 500.
01:03:37 500 million.
01:03:38 500 million, yeah, half a billion.
01:03:42 So in my mind, I’m anchored like, okay,
01:03:44 well, literally no one’s biting at 500 million.
01:03:46 And eventually we would get Sequoia and Greylock
01:03:50 and others together at 500 million, basically, post.
01:03:54 It was 450 pre, I think we raised $50 million.
01:03:56 But just like no one was used to seeing
01:03:59 a $500 million company then.
01:04:01 Like, I don’t know if it was because we were just coming
01:04:03 out of the hangover of 2008
01:04:06 and things were still on recovery mode.
01:04:10 But then along comes Facebook.
01:04:13 And after some negotiation, we 2x’d the number
01:04:18 from a half a billion to a billion.
01:04:21 Yeah, it seems pretty good.
01:04:22 And I think Mark and I really saw eye to eye
01:04:27 that this thing could be big.
01:04:29 We thought we could…
01:04:30 Their resources would help us scale it.
01:04:32 And in a lot of ways it de-risks things.
01:04:34 I mean, it de-risked a lot of the employees’ lives
01:04:37 for the rest of their lives,
01:04:38 including me, including Mike, right?
01:04:41 I think I might’ve had like 10 grand
01:04:42 in my bank account at the time, right?
01:04:44 Like we’re working hard, we had nothing.
01:04:48 So on a relative basis, it seems very high.
01:04:51 And then I think the last company to exit
01:04:53 for anywhere close to a billion was YouTube
01:04:55 that I could think of.
01:04:57 And thus began the giant long bull run of 2012
01:05:02 to all the way to where we are now,
01:05:04 where I saw some stat yesterday
01:05:06 about like how many unicorns exist and it’s absurd.
01:05:11 But then again, never underestimate technology
01:05:13 and like the value it can provide.
01:05:15 And man, costs have dropped and man scale has increased.
01:05:19 And you can make businesses make a lot of money now.
01:05:23 But on a fundamental level, I don’t know,
01:05:26 like how do you describe the decision
01:05:29 to sell a company with 13 people for a billion dollars?
01:05:32 So first of all, like did it take a lot of guts
01:05:36 to sit at a table and say 500 million
01:05:38 or 1 billion with Mark Zuckerberg?
01:05:41 It seems like a very large number with 13.
01:05:44 Like, especially…
01:05:45 It doesn’t seem, it is.
01:05:47 It is.
01:05:48 They’re all large numbers.
01:05:49 Especially like you said before the unicorn parade.
01:05:55 I like that, I’m gonna use that.
01:05:56 The unicorn parade?
01:05:57 Yeah.
01:05:59 You were at the head of the unicorn parade.
01:06:01 It’s the, yeah, it’s a massive unicorn parade.
01:06:04 Okay, so no, I mean, we knew we were worth
01:06:10 quote unquote a lot, but we didn’t,
01:06:12 I mean, there was no market for Instagram.
01:06:14 I mean, it’s not, you couldn’t mark to market this thing
01:06:17 in the public markets.
01:06:18 You didn’t quite understand what it would be worth
01:06:20 or was worth at the time.
01:06:23 So in a market, an illiquid market
01:06:24 where you have one buyer and one seller
01:06:26 and you’re going back and forth,
01:06:27 and well, I guess there were like VC firms
01:06:31 who were willing to invest at a certain valuation.
01:06:34 So I don’t know, you just go with your gut.
01:06:38 And at the end of the day, I would say
01:06:41 the hardest part of it was not realizing,
01:06:50 like when we sold, it was tough
01:06:52 because like literally everywhere I go,
01:06:54 restaurants, whatever, like for a good six months after,
01:06:59 there was a lot of attention on the deal,
01:07:01 a lot of attention on the product,
01:07:03 a lot of attention, it was kind of miserable, right?
01:07:06 And you’re like, wait, like I made a lot of money,
01:07:08 but like, why is this not great?
01:07:10 And it’s because it turns out,
01:07:14 I don’t know, like I don’t really keep in touch with Mark,
01:07:16 but I’ve got to assume his job right now
01:07:17 is not exactly the most happy job in the world.
01:07:19 It’s really tough when you’re on top
01:07:21 and it’s really tough when you’re in the limelight.
01:07:24 So the decision itself was like, oh, cool, this is great.
01:07:27 How lucky are we, right?
01:07:29 So, okay, there’s a million question I want to ask.
01:07:30 Yeah, go, go, go.
01:07:32 First of all, why is it hard to be on top?
01:07:37 Why did you not feel good?
01:07:39 Like, can you dig into that?
01:07:40 It always, I’ve heard like Olympic athletes say
01:07:45 after they win gold, they get depressed.
01:07:49 Is it something like that where it feels like
01:07:53 it was kind of like a thing you were working towards?
01:07:56 Yeah, sure.
01:07:57 Some loose definition of success.
01:07:58 And this sure as heck feels like
01:08:01 at least according to other startups,
01:08:03 this is what success looks like.
01:08:04 And now why don’t I feel any better?
01:08:08 I’m still human, I still have all the same problems.
01:08:10 Is that the nature?
01:08:11 Or is it just like negative attention of some kind?
01:08:14 I think it’s all of the above.
01:08:16 But to be clear, there was a lot of happiness
01:08:18 in terms of like, oh my God, this is great.
01:08:20 Like we won the Super Bowl of startups, right?
01:08:25 Anyone who can get to a liquidity event
01:08:28 of anything meaningful feels like,
01:08:31 wow, this is what we started out to do.
01:08:32 Of course we want to create great things that people love,
01:08:35 but like we won in a big way.
01:08:38 But yeah, there’s this big like,
01:08:39 oh, if we won, what’s next?
01:08:43 So they call it the we have arrived syndrome,
01:08:49 which I need to go back and look where I can quote that from.
01:08:52 But I remember reading about it at the time.
01:08:54 I was like, oh yeah, that’s that.
01:08:56 And I remember we had a product manager leave very early on
01:08:59 when we got to Facebook.
01:09:00 And he said to me,
01:09:01 I just don’t believe I can learn anything
01:09:03 at this company anymore.
01:09:05 It’s like, it’s hit its apex.
01:09:08 We sold it great.
01:09:09 I just don’t have anything else to learn.
01:09:12 So from 2012 all the way to the day I left in 2018,
01:09:15 like the amount I learned and the humility
01:09:18 with which I realized, oh, we thought we won.
01:09:22 Billion dollars is cool,
01:09:23 but like there are a hundred billion dollar companies.
01:09:26 And by the way, on top of that, we had no revenue.
01:09:29 We had, I mean, we had a cool product,
01:09:31 but we didn’t scale it yet.
01:09:32 And there’s so much to learn.
01:09:34 And then competitors and how fun was it to fight Snapchat?
01:09:38 Oh my God.
01:09:39 Like it was, it’s like Yankees Red Sox.
01:09:42 It’s great.
01:09:42 Like that’s what you live for.
01:09:46 You know, you win some, you lose some,
01:09:47 but the amount you can learn through that process,
01:09:52 what I’ve realized in life is that there is no,
01:09:56 and there’s always someone who has more,
01:09:58 there’s always more challenge, just at different scales.
01:10:02 And it sounds like a little Buddhist,
01:10:04 but everything is super challenging,
01:10:09 whether you’re like a small business
01:10:11 or an enormous business.
01:10:13 I say like choose the game you like to play, right?
01:10:17 You’ve got to imagine that
01:10:18 if you’re an amazing basketball player,
01:10:19 you enjoy to some extent practicing basketball.
01:10:22 It’s gotta be something you love.
01:10:24 It’s gonna suck.
01:10:24 It’s gonna be hard.
01:10:26 You’re gonna have injuries, right?
01:10:27 But you gotta love it.
01:10:28 And the same thing with Instagram,
01:10:30 which is we might’ve sold, but it was like, great.
01:10:35 There’s one Super Bowl title.
01:10:37 Can we win five?
01:10:38 What else can we do?
01:10:40 Now I imagine you didn’t ask this, but okay, so I left.
01:10:43 There’s a little bit of like, what do you do next, right?
01:10:46 Like, how do you top that thing?
01:10:49 It’s the wrong question.
01:10:50 The question is like, when you wake up every day,
01:10:53 what is the hardest, most interesting thing
01:10:54 you can go work on?
01:10:56 Because like at the end of the day,
01:10:58 we all turn into dirt, it doesn’t matter, right?
01:11:00 But what does matter is like,
01:11:02 can we really enjoy this life?
01:11:05 Not in a hedonistic way, because that’s like
01:11:08 the reinforcement learning thing,
01:11:09 short term versus long term objectives.
01:11:13 Can you wake up every day and truly enjoy what you’re doing
01:11:19 knowing that it’s gonna be painful?
01:11:22 Knowing that like, no matter what you choose,
01:11:24 it’s gonna be painful.
01:11:25 Whether you sit on a beach
01:11:26 or whether you manage a thousand people or 10,000,
01:11:29 it’s gonna be painful.
01:11:31 So choose something that’s fun to have pain.
01:11:35 But yes, there was a lot of, we have arrived
01:11:38 and it’s a maturation process you just have to go through.
01:11:44 So no matter how much success there is,
01:11:47 how much money you make,
01:11:48 you have to wake up the next day and choose the hard life,
01:11:51 whatever that means next, that’s fun.
01:11:53 The fun slash hard life, hard life that’s fun.
01:11:56 I guess what I’m trying to say is slightly different,
01:12:00 which is just that no one realizes
01:12:02 everything’s gonna be hard.
01:12:05 Even chilling out is hard.
01:12:07 And then you just start worrying about stupid stuff.
01:12:09 Like, I don’t know, like did so and so
01:12:12 forget to paint the house today
01:12:14 or like did the gardener come or whatever?
01:12:16 Like, or, oh, I’m so angry
01:12:18 and my shipment of wine didn’t show up
01:12:20 and I’m sitting here on the beach without my wine.
01:12:22 I don’t know, I’m making shit up now, but like.
01:12:24 It turns out that even chilling, AKA meditation,
01:12:27 is hard work.
01:12:27 Yeah, and at least meditation is like productive chilling
01:12:31 where you’re like actually training yourself
01:12:32 to calm down and be, but backing up for a moment,
01:12:36 everything’s hard.
01:12:37 You might as well be like playing the game you love to play.
01:12:41 I just like playing and winning,
01:12:47 and I’m still on, I think, the first half of life,
01:12:49 knock on wood, and I’ve got a lot of years
01:12:53 and what am I gonna do, sit around?
01:12:54 And the other way of looking at this, by the way,
01:12:59 imagine you made one movie and it was great.
01:13:02 Would you just like stop making movies?
01:13:05 No, generally you’re like, wow,
01:13:06 I really like making movies, let’s make another one.
01:13:09 A lot of times, by the way,
01:13:10 the second one or the third one, not that great,
01:13:11 but the fourth one, awesome.
01:13:13 And no one forgets the second,
01:13:15 or everyone forgets the second and the third one.
01:13:17 So there’s just this constant process of like,
01:13:20 can I produce and is that fun?
01:13:23 Is that exciting?
01:13:24 What else can I learn?
01:13:25 So this machine learning stuff for me
01:13:26 has been this awesome new chapter of being like,
01:13:31 man, that’s something I didn’t understand at all.
01:13:33 And now I feel like I’m one 10th of the way there.
01:13:37 And that feels like a big mountain to climb.
01:13:40 So I distracted us from the original question.
01:13:42 No, and we’ll return to the machine learning
01:13:44 cause I’d love to explore your interest there.
01:13:46 But I mean, speaking of sort of challenges and hard things,
01:13:50 is there a possible world
01:13:52 where sitting in a room with Mark Zuckerberg
01:13:56 with a $1 billion deal, you turn it down?
01:14:00 Yeah, of course.
01:14:01 What does that world look like?
01:14:03 Why would you turn it down?
01:14:04 Why did you take it?
01:14:05 What was the calculation that you were making?
01:14:08 Thus enters the world of counterfactuals
01:14:11 and not really knowing.
01:14:13 And if only we could run that experiment.
01:14:15 Well, the universe exists,
01:14:16 it’s just running in parallel to our own.
01:14:18 Yeah, it’s so fascinating, right?
01:14:23 I mean, we’re talking a lot about money,
01:14:24 but the real question was,
01:14:28 I’m not sure you’ll believe me when I say this,
01:14:30 but could we strap our little company
01:14:33 onto the side of a rocket ship
01:14:36 and like get out to a lot of people really, really quickly
01:14:39 with the support, with the talent of a place like Facebook?
01:14:45 I mean, people often ask me what I would do differently
01:14:48 at Instagram today.
01:14:49 And I say, well, I’d probably hire more carefully
01:14:51 because we showed up just like before I knew it,
01:14:53 we had like a hundred people on the team
01:14:56 and 200, then 300.
01:14:57 I don’t know where all these people were coming from.
01:14:59 I never had to recruit them.
01:15:00 I never had to screen them.
01:15:02 They were just like internal transfers, right?
01:15:05 So it’s like relying on the Facebook hiring machine,
01:15:07 which is quite sort of, I mean, it’s an elaborate machine.
01:15:11 It’s great, by the way.
01:15:12 They have really talented people there.
01:15:15 But my point is the choice was like, take this thing,
01:15:21 put it on the side of a rocket ship
01:15:22 that you know is growing very quickly.
01:15:25 Like I had seen what had happened
01:15:27 when Ev sold Blogger to Google and then Google went public.
01:15:30 Remember we sold before Facebook went public.
01:15:34 There was a moment at which the stock price was $17,
01:15:37 by the way, Facebook stock price was $17.
01:15:40 I remember thinking, what the fuck did I just do, right?
01:15:43 Now at 320 ish, I don’t know where we are today.
01:15:48 But like, okay, like the best thing by the way
01:15:52 is like when the stock is down, everyone calls you a dope.
01:15:56 And then when it’s up, they also call you a dope,
01:15:58 but just for a different reason, right?
01:16:01 Like you can’t win.
01:16:02 Lesson in there somewhere.
01:16:03 Yeah.
01:16:04 So, but you know, the choice is to strap yourself
01:16:06 to a rocket ship or to build your own.
01:16:08 You know, Mr. Elon built his own
01:16:11 literally with a rocket ship.
01:16:13 That’s a difficult choice because there’s a world.
01:16:18 Actually, I would say something different,
01:16:19 which is Elon and others decided to sell PayPal
01:16:23 for not that much.
01:16:25 I mean, how much was it about a billion dollars?
01:16:26 I can’t remember.
01:16:27 Something like that, yeah.
01:16:28 I mean, it was early and,
01:16:30 but it’s worth a lot more now.
01:16:31 To then build a new rocket ship.
01:16:33 So this is the cool part, right?
01:16:34 If you are an entrepreneur
01:16:37 and you own a controlling stake in the company,
01:16:40 not only is it really hard to do something else
01:16:42 with your life because all of the, you know,
01:16:45 value is tied up in you as a personality
01:16:48 attached to this company, right?
01:16:50 But if you sell it and you get yourself enough capital
01:16:53 and you like have enough energy,
01:16:55 you can do another thing or 10 other things
01:16:58 or in Elon’s case, like a bunch of other things.
01:17:00 I don’t know, like I lost count at this point.
01:17:03 And it might’ve seemed silly at the time.
01:17:05 And sure, like if you look back,
01:17:07 man, PayPal is worth a lot now, right?
01:17:10 But I don’t know.
01:17:11 Like, do you think Elon like cares about like,
01:17:13 are we gonna buy Pinterest or not?
01:17:15 Like, I just, he is,
01:17:17 he created a massive capital
01:17:20 that allowed him to do what he wants to do.
01:17:23 And that’s awesome.
01:17:24 That’s more freeing than anything
01:17:25 because when you are an entrepreneur attached to a company,
01:17:28 you gotta stay at that company for a really long time.
01:17:30 It’s really hard to remove yourself.
01:17:32 But I’m not sure how much he loved
01:17:34 PayPal versus SpaceX and Tesla.
01:17:37 I have a sense that you love Instagram.
01:17:41 Yeah, I loved it enough to like work
01:17:43 for six years beyond the deal.
01:17:44 Which is rare, which is very rare.
01:17:46 You chose.
01:17:47 But can I tell you why?
01:17:48 Sure.
01:17:49 It’s, please.
01:17:51 There are not a lot of companies that you can be part of
01:17:55 where the Pope’s like,
01:17:57 I would like to sign up for your product.
01:17:59 Like I’m not a religious person at all.
01:18:01 I’m really not.
01:18:02 Yeah.
01:18:03 You go to the Vatican and you’re like walking
01:18:05 among giant columns and you’re hearing the music
01:18:07 and everything and like the Pope walks in
01:18:10 and he wants to press the signup button on your product.
01:18:12 Like it’s a moment in life, okay?
01:18:15 No matter what your persuasion, okay?
01:18:19 The number of doors and experiences that that opened up
01:18:22 was, it was incredible.
01:18:23 I mean, the people I got to meet, the places I got to go,
01:18:27 I assume maybe like a payments company
01:18:30 is slightly different, right?
01:18:32 But that’s why, like it was so fun.
01:18:34 And plus I truly believed we were building
01:18:37 such a great product and I loved, loved the game.
01:18:40 It wasn’t about money.
01:18:41 It was about the game.
01:18:43 Do you think you had the guts to say no?
01:18:46 Is that, so here’s, I often think about this,
01:18:49 like how hard is it for an entrepreneur to say no?
01:18:52 Because the peer pressure.
01:18:54 So every, like basically the sea of entrepreneurs
01:18:57 in Silicon Valley are gonna tell you,
01:18:59 I mean, this is their dream.
01:19:01 The thing you were sitting before is a dream.
01:19:05 To walk away from that is really,
01:19:07 it seems like nearly impossible.
01:19:11 Because Instagram could in 10 years be,
01:19:16 you could talk about Google.
01:19:17 You could be making self driving cars
01:19:20 and building rockets that go to Mars
01:19:21 and compete with SpaceX.
01:19:23 Totally.
01:19:24 And so that’s an interesting decision to say,
01:19:28 am I willing to risk it?
01:19:31 And the reason I also say it’s an interesting decision
01:19:34 because it feels like per our previous discussion,
01:19:37 if you’re launching a social network company,
01:19:42 there’s going to be that meeting, whatever that number is.
01:19:45 If you’re successful, if you’re on this rocket ship
01:19:48 of success, there’s going to be a meeting
01:19:50 with one of the social media, social network companies
01:19:53 that wanna buy you, whether it’s Facebook or Twitter,
01:19:58 but it could also very well be Google
01:20:00 who seems to have like a graveyard
01:20:04 of failed social networks.
01:20:06 And it’s, I mean, I don’t know.
01:20:08 I think about that, how difficult it is
01:20:11 for an entrepreneur to make that decision.
01:20:13 How many have successfully made that decision?
01:20:15 I guess, this is a big question.
01:20:18 It’s sad to me, to be honest,
01:20:20 that too many make that decision,
01:20:21 perhaps for the wrong reason.
01:20:23 Sorry, when you say make the decision,
01:20:25 you mean to the affirmative.
01:20:26 To the affirmative, yeah.
01:20:27 Got it, yeah.
01:20:28 There are also companies that don’t sell, right?
01:20:31 And take the path and say, we’re gonna be independent.
01:20:34 And then you’ve never heard of them again.
01:20:37 Like I remember Path, right?
01:20:40 Was one of our competitors early on.
01:20:43 There was a big moment when they had,
01:20:45 I can’t remember what it was,
01:20:46 like $110 million offer from Google or something.
01:20:50 It might’ve been larger, I don’t know.
01:20:54 And I remember there was like this big TechCrunch article
01:20:56 that was like, they turned it down after talking deeply
01:20:59 about their values and everything.
01:21:01 And I don’t know the inner workings of Foursquare,
01:21:05 but I’m certain there were many conversations over time
01:21:09 where there were companies that wanted Foursquare as well.
01:21:11 Recently, I mean, what other companies?
01:21:15 There’s Clubhouse, right?
01:21:16 Like, I don’t know.
01:21:17 Maybe people were really interested in them too.
01:21:20 Like there are plenty of moments where people say no
01:21:25 and we just forget that those things happen.
01:21:28 We only focus on the ones where like they said yes
01:21:33 and like, wow, like what if they had stayed independent?
01:21:36 So I don’t know.
01:21:37 I used to think a lot about this and now I just don’t
01:21:39 because I’m like, whatever.
01:21:42 Things have gone pretty well.
01:21:44 I’m ready for the next game.
01:21:45 I mean, think about an athlete where, I don’t know,
01:21:50 maybe they do something wrong in the World Series
01:21:53 or whatever.
01:21:54 If you let it haunt you for the rest of your career,
01:21:57 like why not just be like, I don’t know, it was a game.
01:22:00 Next game, next shot, right?
01:22:02 And if you just move to that world,
01:22:05 like at least I have a next shot, right?
01:22:07 No, that’s beautiful, but I mean, just insights
01:22:10 and it’s funny you brought up Clubhouse.
01:22:12 It is very true.
01:22:14 It seems like Clubhouse is on the downward path
01:22:19 and it’s very possible to see a billion plus dollar deal
01:22:23 at some stage, maybe like a year ago or half a year ago
01:22:27 from Facebook, from Google.
01:22:28 I think Facebook was flirting with that idea too.
01:22:30 And I think a lot of companies probably were.
01:22:34 I wish it was more public.
01:22:37 You know what?
01:22:37 There’s not like a bad public story
01:22:41 about them making the decision to walk away.
01:22:43 We just don’t hear about it.
01:22:45 And then we get to see the results of that success
01:22:47 or the failure, more often failure.
01:22:49 So a couple of things, one is,
01:22:52 I would not assume Clubhouse is down for the count at all.
01:22:54 They’re young, they have plenty of money,
01:22:56 they’re run by really smart people.
01:22:59 I’d give them like a very fighting chance to figure it out.
01:23:02 There are a lot of times when people call Twitter
01:23:04 down for the count and they figure it out
01:23:05 and they seem to be doing well, right?
01:23:08 So just backing up like,
01:23:09 and not knowing anything about their internals,
01:23:11 like there’s a strong chance they will figure it out
01:23:15 and that people are just down
01:23:16 because they like being down about companies.
01:23:18 They like assuming that they’re gonna fail.
01:23:20 So who knows, right?
01:23:21 But let’s take the ones in the past
01:23:22 where like we know how it played out.
01:23:24 There are plenty of examples
01:23:25 where people have turned down big offers
01:23:28 and then you’ve just never heard from them again,
01:23:30 but we never focus on the companies
01:23:32 because you just forget that those were big.
01:23:34 But inside your psyche,
01:23:38 I think it’s easy for someone with enough money
01:23:42 to say money doesn’t matter, which I think is like,
01:23:45 it’s bullshit.
01:23:45 Of course, money matters to people, but at the moment,
01:23:50 you just can’t even grasp like the number of zeros
01:23:53 that you’re talking about.
01:23:53 It just doesn’t make sense, right?
01:23:56 So to think rationally in that moment
01:23:58 is not something many people are equipped to do,
01:24:01 especially not people
01:24:03 where I think we had founded the company a year earlier,
01:24:05 maybe two years earlier, like a year and a half,
01:24:07 we were 13 people, but I will say,
01:24:11 I still don’t know if it was the right decision
01:24:14 because I don’t have that counterfactual.
01:24:16 I don’t know that other world.
01:24:18 I’m just thankful that by and large,
01:24:21 most people love Instagram, still do.
01:24:23 By and large, people are very happy
01:24:25 with like the time we had there
01:24:28 and I’m proud of what we built.
01:24:29 So like, I’m cool.
01:24:31 Like now it’s next shot, right?
01:24:35 Well, if we could just linger on this Yankees versus Red Sox,
01:24:40 the fun of it, the competition over,
01:24:42 I would say over the space of features.
01:24:45 So there are a bunch of features,
01:24:49 like there’s photos, there’s one minute videos on Instagram,
01:24:54 there’s IGTV, there’s stories, there’s reels, there’s live.
01:24:58 So that sounds like it’s like a long list
01:25:01 of too much stuff, but it’s not
01:25:04 because it feels like they’re close together,
01:25:07 but they’re somehow, like what we’re saying,
01:25:09 fundamentally distinct,
01:25:10 like each of the things I mentioned.
01:25:13 Maybe can you describe the philosophies,
01:25:15 the design philosophies behind some of these,
01:25:17 how you were thinking about it
01:25:19 during the historic war between Snapchat and Instagram
01:25:24 or just in general,
01:25:25 like this space of features that was discovered.
01:25:30 There’s this great book by Clay Christensen
01:25:34 called, Competing Against Luck.
01:25:36 It’s like a terrible title,
01:25:39 but within it, there’s effectively an expression
01:25:43 of this thing called jobs to be done theory.
01:25:46 And it’s unclear if like he came up with it
01:25:48 or some of his colleagues,
01:25:50 but there are a bunch of places you can find
01:25:52 with people claiming to have come up
01:25:53 with this jobs to be done theory.
01:25:55 But the idea is if you like zoom out
01:25:59 and you look at your product,
01:26:00 you ask yourself, why are people hiring your product?
01:26:04 Like imagine every product in your life
01:26:07 is effectively an employee, you know,
01:26:10 you’re CEO of your life
01:26:11 and you hire products to be employees effectively.
01:26:13 They all have roles and jobs, right?
01:26:16 Why are you hiring a product?
01:26:18 Why do you want that product
01:26:19 to perform something in your life?
01:26:21 And like, what are the hidden reasons
01:26:22 why you’re in love with this product?
01:26:26 Instagram was about sharing your life
01:26:28 with others visually, period, right?
01:26:31 Why? Because you feel connected with them.
01:26:34 You get to show off.
01:26:35 You get to feel good and cared about, right?
01:26:38 With likes and it turns out that that will,
01:26:43 I think forever define Instagram
01:26:46 and any product that serves that job
01:26:49 is going to do very well, okay?
01:26:52 Stories let’s take as an example
01:26:55 is very much serving that job.
01:26:58 In fact, it serves it better than the original product
01:27:00 because when you’re large and have an enormous audience,
01:27:04 you’re worried about people seeing your stuff
01:27:08 or you’re worried about it being permanent
01:27:09 so that a college admissions person
01:27:11 is going to see your photo of you doing something.
01:27:13 And so it turns out that that is a more efficient way
01:27:16 of performing that job than the original product was.
01:27:19 The original product still has its value,
01:27:22 but at scale, these two things together
01:27:24 work really, really well.
01:27:26 Now, I will claim that other parts of the product
01:27:29 over time didn’t perform that job as well.
01:27:32 I think IGTV probably didn’t, right?
01:27:35 Shopping is like completely unrelated
01:27:38 to what I just described, but it might work.
01:27:40 I don’t know, right?
01:27:43 Products I think, products that succeed
01:27:46 are products that all share this parent node
01:27:49 of like this job to be done that is in common.
01:27:52 And then they’re just like different ways of doing it, right?
01:27:55 Apple, I think does a great job with this, right?
01:27:57 It’s like managing your digital life
01:27:59 and all the products just work together.
01:28:01 They sync, they’re like, it’s beautiful, right?
01:28:06 Even if they require like silly specific cords to work,
01:28:10 but they’re all part of a system.
01:28:12 It’s when you leave that system
01:28:14 and you start doing something weird
01:28:15 that people start scratching their head
01:28:17 and I think you are less successful.
01:28:19 So I think one of the challenges
01:28:20 Facebook has had throughout its life
01:28:22 is that it has never fully, I think,
01:28:24 appreciated the job to be done of the main product.
01:28:28 And what it’s done is said,
01:28:29 ooh, there’s a shiny object over there.
01:28:31 That startup’s getting some traction.
01:28:32 Let’s go copy that thing.
01:28:34 And then they’re confused why it doesn’t work.
01:28:36 Like why doesn’t it work?
01:28:37 It’s because the people who show up for this
01:28:40 don’t want that, it’s different.
01:28:42 What’s the purpose of Facebook?
01:28:44 So I remember I was a very early Facebook user.
01:28:47 The reason I was personally excited about Facebook
01:28:50 is because you can, first of all, use your real name.
01:28:55 Like I can exist in this world.
01:28:58 It can be like formally exist.
01:29:00 I like anonymity for certain things, Reddit and so on,
01:29:04 but I want it to also exist not anonymously
01:29:09 so that I can connect with other friends of mine
01:29:12 nonanonymously and there’s a reliable way to know
01:29:16 that I’m real and they’re real and that we’re connecting.
01:29:19 And it’s kind of like, I liked it for the reasons
01:29:24 that people like LinkedIn, I guess.
01:29:27 But like without the formality, like not everybody is dressed up
01:29:30 and being super polite, like more like with friends.
01:29:34 But then it became something much bigger than that,
01:29:37 I suppose, there’s a feed.
01:29:39 It became this, I mean, it became a place
01:29:44 to discover content, to share content
01:29:49 that’s not just about connecting directly with friends.
01:29:53 I mean, it became something else.
01:29:54 I don’t even know what it is really.
01:29:56 So you said Instagram is a place
01:29:58 where you visually share your life.
01:30:02 What is Facebook?
01:30:03 Well, let’s go back to the founding of Facebook
01:30:06 and why it worked really well initially at Harvard
01:30:09 and then Dartmouth and Stanford and I can’t remember,
01:30:12 probably MIT, there were like a handful of schools
01:30:14 in that first tranche, right?
01:30:17 It worked because there are communities
01:30:19 that exist in the world that want to transact.
01:30:23 And when I say transact, I don’t mean commercially,
01:30:25 I just mean they want to share, they want to coordinate,
01:30:28 they want to communicate, they want a space for themselves.
01:30:32 And Facebook at its best, I think is that.
01:30:36 And if actually you look at the most popular products
01:30:39 that Facebook has built over time,
01:30:42 if you look at things like groups, marketplace,
01:30:45 groups is enormous.
01:30:47 And groups is effectively like everyone can found
01:30:50 their own little Stanford or Dartmouth or MIT, right?
01:30:53 And find each other and share and communicate
01:30:57 about something that matters deeply to them.
01:31:00 That is the core of what Facebook was built around.
01:31:03 And I think today is where it stands most strongly.
01:31:09 Yeah, it’s brilliant.
01:31:09 The groups, I wish groups were done better.
01:31:13 It feels like it’s not a first class citizen.
01:31:16 I know I may be saying something without much knowledge,
01:31:19 but it feels like it’s kind of bolted on
01:31:24 while being used a lot.
01:31:26 It feels like there needs to be a little bit more structure
01:31:28 in terms of discovery, in terms of like.
01:31:32 I mean, look at Reddit.
01:31:33 Like Reddit is basically groups of public and open
01:31:36 and a little bit crazy, right?
01:31:38 In a good way.
01:31:40 But there’s clear product market fit
01:31:42 for that specific use case.
01:31:44 And it doesn’t have to be a college, it can be anything.
01:31:47 It can be a small group, a big group,
01:31:48 it can be group messaging.
01:31:50 Facebook shines, I think, when it leans into that.
01:31:53 I think when there are other companies
01:31:56 that just seem exciting and now all of a sudden
01:32:00 the product shifts in some fundamental way
01:32:02 to go try to compete with that other thing,
01:32:05 that’s when I think consumers get confused.
01:32:09 Even if you can be successful,
01:32:10 like even if you can compete with that other company,
01:32:13 even if you can figure out how to bolt it on,
01:32:16 eventually you come back and you look at the app
01:32:17 and you’re like, I just don’t know why I opened this app.
01:32:20 Like why, like too many things going on.
01:32:23 And that was always a worry.
01:32:24 I mean, you listed all the things at Instagram
01:32:26 and it almost gave me a heart attack,
01:32:27 like way too many things.
01:32:30 But I don’t know, entrepreneurs get bored.
01:32:31 They want to add things.
01:32:32 They want to like, right?
01:32:35 I don’t have a good answer for it,
01:32:37 except for that I think being true to your original use case
01:32:41 and not even original use case,
01:32:43 but sorry, actually not use case, original job.
01:32:46 There are many use cases under that job.
01:32:49 Being true to that and like being really good at it
01:32:52 over time and morphing as needs change,
01:32:57 I think that’s how to make a company last forever.
01:32:59 And I mean, honestly, like my main thesis
01:33:03 about why Facebook is in the position it is today
01:33:07 is that if they had had a series of product launches
01:33:12 that delighted people over time,
01:33:16 I think they’d be in a totally different world.
01:33:17 So just like imagine for a moment,
01:33:20 and by the way, Apple’s entering this,
01:33:21 but like Apple for so long,
01:33:23 just like product after product,
01:33:25 you couldn’t wait for it.
01:33:26 You stood in line for it.
01:33:27 You talked about it.
01:33:28 You got excited.
01:33:29 Amazon makes your life so easy.
01:33:31 It’s like, wow, I needed this thing
01:33:34 and it showed up at my door two days later.
01:33:36 And like both of these companies, by the way,
01:33:39 Amazon, Apple have issues, right?
01:33:41 There are labor issues,
01:33:43 whether it’s here in the US or in China,
01:33:45 there are environmental issues.
01:33:48 But like when’s the last time
01:33:49 you heard like a large chorus being like,
01:33:52 these companies better pay
01:33:54 for what they’re doing on these things, right?
01:33:56 I think Facebook’s main issue today is like,
01:33:59 you need to produce a hit.
01:34:01 If you don’t produce hits,
01:34:03 it’s really hard to keep consumers on your side.
01:34:06 Then people just start picking on you
01:34:08 for a variety of reasons, whether it’s right or wrong.
01:34:11 I’m not even going to place a judgment
01:34:12 right here and right now.
01:34:13 I’m just going to say that it is way better
01:34:17 to be in a world where you are producing hits
01:34:19 and consumers love what you’re doing
01:34:21 because then they’re on your side.
01:34:23 And I think that’s, it’s the past 10 years
01:34:26 for Facebook has been fairly hard on this dimension.
01:34:29 So, and by hits, it doesn’t necessarily mean financial hits.
01:34:33 It feels like to me, what you’re saying
01:34:34 is something that brings joy.
01:34:37 A product that brings joy
01:34:38 to some fraction of the population.
01:34:40 Yeah, I mean, TikTok isn’t just literally an algorithm.
01:34:45 In some ways, TikTok’s content and algorithm
01:34:49 have more sway now over the American psyche
01:34:53 than Facebook’s algorithm, right?
01:34:54 It’s visual, it’s video.
01:34:57 By the way, it’s not defined by who you follow.
01:34:59 It’s defined by some magical thing that,
01:35:01 by the way, if someone wanted to tweak
01:35:02 to show you a certain type of content for some reason,
01:35:05 they could, right, but people love it.
01:35:10 So as a CEO, let me ask you a question
01:35:13 because leadership matters.
01:35:17 This is a complicated question.
01:35:19 Why is Mark Zuckerberg distrusted, disliked
01:35:23 and sometimes even hated by many people in public?
01:35:28 Right, that is a complicated question.
01:35:30 Well, the premise, I’m not sure I agree with the premise.
01:35:34 And I can expand that to include
01:35:38 even a more mysterious question for me, Bill Gates.
01:35:42 Hmm.
01:35:44 What is the Bill Gates version of the question?
01:35:47 Do you think people hate Bill Gates?
01:35:49 No, distrust.
01:35:51 Ah.
01:35:52 So takeaway one, it’s a checklist.
01:35:56 There is, I think Mark Zuckerberg’s distrust
01:36:00 is the primary one, but there’s also like a dislike,
01:36:04 maybe hate is too strong a word,
01:36:05 but it’s just if you look at like the articles
01:36:09 that are being written and so on, there’s a dislike.
01:36:13 And it makes, it’s confusing to me
01:36:15 because it’s like the public picks certain individuals
01:36:18 and they attach certain kinds of emotions
01:36:21 to those individuals.
01:36:22 Yeah, so someone once just recently said,
01:36:27 there’s a strong case that founder led companies
01:36:30 have this problem and that a lot of Mark’s issues
01:36:33 today come from the fact that he is a visible founder
01:36:38 with this story that people have watched in both a movie
01:36:42 and they followed along and he’s this boy wonder kid
01:36:45 who became one of the world’s richest people
01:36:47 and he’s no longer Mark the person,
01:36:50 he’s Mark this image of a person
01:36:53 with enormous wealth and power.
01:36:55 And in today’s world, we have issues
01:36:59 with enormous wealth and power for a variety of reasons.
01:37:02 One of which is we’ve been stuck inside
01:37:05 for a year and a half, two years.
01:37:07 One of which is a lot of people were really unhappy
01:37:10 about not the last election, but the last, last election.
01:37:13 And where do you take out that anger?
01:37:15 Who do you blame but the people in charge?
01:37:18 That’s one example or one reason why I think
01:37:22 a lot of people express anger or resentment
01:37:26 or unhappiness with Mark.
01:37:29 At the same time, I don’t know,
01:37:31 I pointed out to that person, I was like, well,
01:37:34 I don’t know, I think a lot of people really like Elon.
01:37:38 Elon arguably, he kept his factory open here
01:37:41 throughout COVID protocols, which arguably
01:37:44 a lot of people would be against.
01:37:47 While saying a bunch of crazy offensive things
01:37:50 on the internet, they still.
01:37:52 And basically gives the middle finger to the SEC
01:37:56 on Twitter and I don’t know, I’m like,
01:37:59 well, there’s a founder and like people kind of like him.
01:38:02 So I do think that the founder and slash CEO
01:38:09 of a company that’s a social network company
01:38:11 is like an extra level of difficulty.
01:38:13 If life is a video game,
01:38:15 you just chose the harder video game.
01:38:17 So, I mean, that’s why it’s interesting to ask you
01:38:20 because you were the founder and CEO of a social network.
01:38:24 I challenge it because.
01:38:25 Exactly, but you’re one of the rare examples.
01:38:30 Even Jack Dorsey’s disliked, not to the degree,
01:38:34 but it just seems harder
01:38:35 when you’re running a social media company.
01:38:38 It’s interesting.
01:38:39 I never thought of Jack as just like,
01:38:41 I think generally he’s well respected.
01:38:45 Yeah, I think so.
01:38:45 I think you’re right, but he’s not loved.
01:38:50 Yeah.
01:38:51 And I feel like you, I mean, to me,
01:38:53 Twitter is an incredible thing.
01:38:54 Yeah, again, can I just come back to this point,
01:38:57 which seems over simplistic,
01:38:59 but I really do think how a product makes someone feel,
01:39:04 they ascribe that feeling to the founder.
01:39:07 Yep.
01:39:09 So make people feel good.
01:39:11 So think about it.
01:39:13 Let’s just go with this thesis for a second.
01:39:14 Sure, I like it though.
01:39:17 Amazon’s pretty utilitarian, right?
01:39:19 It delivers brown boxes to your front door.
01:39:21 Sure, you can have Alexa
01:39:23 and you can have all these things, right?
01:39:25 But in general, it delivers stuff quickly to you
01:39:28 at a reasonable price, right?
01:39:31 I think Jeff Bezos is wonderfully wealthy,
01:39:34 thoughtful, smart guy, right?
01:39:36 But people kind of feel that way about him.
01:39:38 They’re like, wow, this is really big.
01:39:40 We’re impressed that this is really big.
01:39:42 But he’s doing the same space stuff Elon’s doing,
01:39:45 but they don’t necessarily ascribe
01:39:47 the same sense of wonder, right?
01:39:50 Now let’s take Elon.
01:39:51 And again, this is pet theory.
01:39:52 I don’t have much proof other than my own intuition.
01:39:56 He is literally about living the future.
01:39:59 Mars, it’s about wonder.
01:40:01 It’s about going back to that feeling as a kid
01:40:04 when you looked up to the stars and asked,
01:40:05 is there life out there?
01:40:08 People get behind that because it’s a sense of hope
01:40:11 and excitement and innovation.
01:40:13 And like, you can say whatever you want,
01:40:15 but we ascribe that emotion to that person.
01:40:18 Now, let’s say you’re on a social network
01:40:21 and people make you kind of angry
01:40:23 because they disagree with you
01:40:24 or they say something ridiculous
01:40:25 or they’re living a FOMO type life where you’re like,
01:40:28 wow, I wish I was doing that thing.
01:40:31 I think Instagram, if I were to think back,
01:40:33 by and large, when I was there,
01:40:35 was not about FOMO, was not about this influencer economy,
01:40:39 although it certainly became that way closer to the end.
01:40:43 It was about the sense of wonder and happiness
01:40:45 and beautiful things in the world.
01:40:47 And I don’t know, I mean, like,
01:40:49 I don’t want to have a blind spot,
01:40:50 but I don’t think anyone had a strong opinion
01:40:52 about me one way or the other.
01:40:53 For the longest time, the way people explained it to me,
01:40:55 I mean, if you want to go for toxicity,
01:40:57 you go to Facebook or Twitter.
01:40:59 If you want to go to feel good about life,
01:41:01 you go to Instagram to enjoy, celebrate life.
01:41:04 And my experience, talking to people,
01:41:05 is they gave me the benefit of the doubt because of that.
01:41:08 But if your experience of the product
01:41:10 is kind of makes you angry, it’s where you argue.
01:41:13 I mean, a big part of Jack might be
01:41:15 that he wasn’t actually the CEO for a very long time
01:41:17 and only became recently.
01:41:19 So I’m not sure how much of the connection got made.
01:41:23 But in general, I mean, if you hate,
01:41:28 I’m just thinking about other companies
01:41:29 that aren’t tech companies.
01:41:30 If you hate like what a company is doing
01:41:32 or it makes you not feel happy.
01:41:36 I don’t know, like people are really angry
01:41:37 about Comcast or whatever.
01:41:39 Are they even called Comcast anymore?
01:41:40 It’s like Xfinity or something, right?
01:41:42 They had to rebrand.
01:41:43 They became Meta, right?
01:41:44 And it’s like, but my point is if it makes you angry.
01:41:48 That’s beautiful, yeah.
01:41:50 But the thing is, this is me saying this.
01:41:54 I think your thesis is very strong and correct,
01:41:58 has elements of correctness,
01:42:00 but I still personally put some blame on individuals.
01:42:05 Of course.
01:42:06 I think you said, Elon, looking up,
01:42:10 there’s something about childlike wonder to him,
01:42:13 like to his personality, his character,
01:42:16 something about, I think more so than others
01:42:19 where people can trust them.
01:42:21 And there’s, I don’t know,
01:42:23 Sundar Pichai is an example of somebody who’s like,
01:42:26 there’s some kind, it’s hard to put into words,
01:42:29 but there’s something about the human being
01:42:32 where he’s trustworthy.
01:42:34 He’s human in a way that connects to us.
01:42:38 And the same with Satya Nadella.
01:42:40 I mean, some of these folks, something about us
01:42:46 is drawn to them, even when they’re flawed.
01:42:48 Even like, so your thesis really holds up for Steve Jobs
01:42:53 because I think people didn’t like Steve Jobs,
01:42:55 but he delivered products
01:42:57 and then they fell in love every time.
01:43:01 I guess you could say that the CEO,
01:43:03 the leader is also a product.
01:43:05 And if they keep delivering a product that people like
01:43:08 by being in public and saying things that people like,
01:43:11 that’s also a way to make people happy.
01:43:14 But from a social network perspective,
01:43:16 it makes me wonder how difficult it is
01:43:19 to explain to people why certain things happen,
01:43:22 like to explain machine learning,
01:43:24 to explain why something like
01:43:30 the woke mob effect happens
01:43:32 or certain kinds of bullying happen,
01:43:36 which is human nature combined with the algorithm.
01:43:40 And it’s very difficult to control for
01:43:42 how the spread of quote unquote misinformation happens.
01:43:45 It’s very difficult to control for that.
01:43:47 And so you try to decelerate certain parts
01:43:50 and you create more problems than you solve.
01:43:53 And anything that looks at all like censorship
01:43:55 can create huge amounts of problems as a slippery slope.
01:43:58 And then you have to inject humans
01:44:01 to oversee the machine learning algorithms.
01:44:03 And anytime you inject humans into the system,
01:44:05 it’s gonna create a huge number of problems.
01:44:07 And I feel like it’s up to the leader
01:44:08 to communicate that effectively, to be transparent.
01:44:12 First of all, design products
01:44:13 that don’t have those problems.
01:44:15 And second of all, when they have those problems,
01:44:17 to be able to communicate with them.
01:44:19 I guess that’s all going to,
01:44:21 when you run a social network company, your job is hard.
01:44:25 Yeah, I will say the one element that you haven’t named
01:44:28 that I think you’re getting at is just bedside manner,
01:44:31 which Steve Jobs, I never worked for him.
01:44:35 I never met him in person.
01:44:38 Had an uncanny ability in public to have bedside manner.
01:44:43 I mean, some of the best clips of Steve Jobs
01:44:46 from like, I would say maybe the 80s
01:44:49 when he’s on the stage and getting questions
01:44:51 from the audience about life.
01:44:54 And he’ll take this question that is like,
01:44:56 how are you gonna compete with blah?
01:44:58 And it’s super boring.
01:44:59 And I don’t even know the name of the company.
01:45:01 And his answer is as if you just asked your grandfather,
01:45:05 the meaning of life.
01:45:07 And you sit there and you’re just like, what?
01:45:10 And there’s that bedside manner.
01:45:13 And if you lack that, or if that’s just not intuitive to you,
01:45:16 I think that it can be a lot harder
01:45:20 to gain the trust of people.
01:45:21 And then add on top of that, missteps of companies.
01:45:27 I don’t know if you have any friends from the past
01:45:29 where maybe they crossed you once,
01:45:31 or maybe you get back together and you’re friends again,
01:45:34 but you just never really forget that thing.
01:45:36 It’s human nature not to forget.
01:45:38 I’m Russian, you crossed me once.
01:45:41 We solved the problem.
01:45:43 So my point is, humans don’t forget.
01:45:48 And if there are times in the past
01:45:50 where they feel like they don’t trust the company
01:45:52 or the company hasn’t had their back,
01:45:55 that is really hard to earn back,
01:45:57 especially if you don’t have that bedside manner.
01:46:00 And again, I’m not attributing this specifically to Mark
01:46:02 because I think a lot of the companies have this issue
01:46:06 where one, you have to be trustworthy as a company
01:46:10 and live by it and live by those actions.
01:46:12 And then two, I think you need to be able
01:46:14 to be really relatable in a way that’s very difficult
01:46:18 if you’re worth what these people are.
01:46:20 It’s really hard.
01:46:22 Yeah.
01:46:23 Jack does a pretty good job of this by being a monk.
01:46:27 But also, Jack eschews attention.
01:46:29 He’s not out there almost on purpose.
01:46:32 He’s just working hard, doing square, right?
01:46:35 I literally shared a desk like this with him at Odeo.
01:46:38 I mean, this normal guy who likes painting,
01:46:41 I remember he would leave early on Wednesdays or something
01:46:45 to go to a painting class.
01:46:47 And he’s creative, he’s thoughtful.
01:46:49 I mean, money makes people more creative and more thoughtful,
01:46:53 extreme versions of themselves, right?
01:46:55 And this was a long, long time ago.
01:46:58 You mentioned that he asked you
01:47:00 to do some kind of JavaScript thing.
01:47:02 We were working on some JavaScript together.
01:47:05 That’s hilarious, like pre Twitter, early Twitter days,
01:47:08 you and Jack Dorsey are in a room together
01:47:11 talking about JavaScript,
01:47:12 solving some kind of menial problem.
01:47:14 Terrible problems, yeah.
01:47:15 I mean, not terrible, just like boring widget problem.
01:47:18 I think it was the Odeo widget
01:47:19 we were working on at the time.
01:47:21 I’m surprised anyone paid me to be in the room as an intern
01:47:24 because I didn’t really provide any value.
01:47:26 I’m very thankful to anyone who included me back in the day.
01:47:30 It was very helpful.
01:47:32 So thank you for listening.
01:47:33 I mean, is there Odeo that’s a precursor to Twitter?
01:47:38 First of all, did you have any anticipation
01:47:40 that this Jack Dorsey guy could be also
01:47:43 a head of a major social network?
01:47:46 And second, did you learn anything from the guy that,
01:47:49 like, do you think it’s a coincidence
01:47:52 that you two were in the room together?
01:47:55 And it’s the coincidence meaning like,
01:47:58 why does the world play its game in a certain way
01:48:01 where these two founders of social networks?
01:48:04 I don’t know.
01:48:04 It’s so weird, right?
01:48:05 Like, I mean, it’s also weird that Mark showed up
01:48:11 in our fraternity my sophomore year
01:48:13 and we got to know each other then,
01:48:16 like long before Instagram.
01:48:18 It’s a small world,
01:48:20 but let me tell a fun story about Jack.
01:48:25 We’re at Odeo and I don’t know,
01:48:27 I think Ev was feeling like people
01:48:28 weren’t working hard enough or something.
01:48:31 Nice.
01:48:32 And I can’t remember exactly what he,
01:48:35 he created this thing where every Friday,
01:48:39 I don’t know if it was every Friday,
01:48:40 I only remember this happening once,
01:48:43 but he had this statuette, like of Mary.
01:48:47 And in the bottom, it’s hollow, right?
01:48:50 And I remember on a Friday,
01:48:54 Ev decided he was going to let everyone vote for
01:48:57 who had worked the hardest that week.
01:48:59 We all voted, closed ballot, right?
01:49:01 We all put it in a bucket and he tallied the votes.
01:49:04 And then whoever got the most votes, as I recall,
01:49:07 got the statuette.
01:49:09 And in the statuette was a thousand bucks,
01:49:12 or I recall there was a thousand bucks in it.
01:49:14 It might’ve been a hundred bucks,
01:49:15 but let’s call it a thousand.
01:49:16 It’s more exciting that way.
01:49:17 It felt like a thousand, yeah.
01:49:18 It did to me for sure.
01:49:19 I actually got two votes.
01:49:20 I was very happy.
01:49:21 We were a small company, but as the intern,
01:49:23 I got at least two votes.
01:49:24 So everybody knew how many votes they got individually?
01:49:26 Yeah, yeah.
01:49:27 And I think it was one of these self accountability things.
01:49:29 Anyway, I remember Jack just getting
01:49:31 like the vast majority of votes from everyone.
01:49:35 And I remember just thinking like,
01:49:37 like I couldn’t imagine he would become what he’d become
01:49:39 and do what he would do,
01:49:41 but I had a profound respect that the new guy
01:49:45 who I really liked worked that hard.
01:49:48 And you could see his dedication even then
01:49:51 and that people respected him.
01:49:53 That’s the one story that I remember of him
01:49:55 like working with him specifically from that summer.
01:49:58 Can I take a small tangent on that?
01:49:59 Of course.
01:50:00 There’s kind of a pushback in Silicon Valley
01:50:02 a little bit against hard work.
01:50:04 Can you speak to the sort of the thing you admire
01:50:08 to see the new guy working so hard, that thing?
01:50:11 What is the value of that thing in a company?
01:50:13 See, this is like, just to be very frank,
01:50:16 it drives me nuts.
01:50:17 Like I saw this really funny video on TikTok.
01:50:21 Was it on TikTok?
01:50:22 It was like, I’m taking a break from my mental health
01:50:24 to work on my career.
01:50:26 I thought that was funny.
01:50:27 Um, so I was like, oh, it’s usually kind of phrased
01:50:31 the opposite way, right?
01:50:32 Yeah.
01:50:33 Okay, so a couple of things.
01:50:35 I have worked so hard
01:50:42 to do the things that I did.
01:50:43 Like Mike and I lost years off of our lives,
01:50:47 staying up late, figuring things out,
01:50:50 the stress that comes with the job.
01:50:51 I have a lot more gray hair now than I did back then.
01:50:54 It requires an enormous amount of work
01:50:57 and most people aren’t successful, right?
01:50:59 But even the ones that are successful don’t skate by.
01:51:04 I am okay if people choose not to work hard
01:51:07 because I don’t actually think there’s anything
01:51:09 in this world that says you have to work hard.
01:51:13 But I do think that great things require a lot of hard work.
01:51:16 So there’s no way you can expect to change the world
01:51:18 without working really hard.
01:51:19 By the way, even changing the world,
01:51:22 you know, the folks that I respect the most
01:51:24 have nudged the world in like a slight direction,
01:51:27 slight, very, very slight.
01:51:30 Like even if Elon accomplishes all the things
01:51:33 he wants to accomplish,
01:51:34 he will have nudged the world in a slight direction,
01:51:38 but it requires an enormous amount of work.
01:51:40 There was an interview with him where he was just like,
01:51:43 he was interviewed, I think, at the Tesla factory
01:51:45 and he was like, work is really hard.
01:51:47 This is actually unhealthy.
01:51:49 And I can’t recall the exact,
01:51:51 but he was like visibly shaken
01:51:52 about how hard he had been working.
01:51:54 And he was like, this is bad.
01:51:55 And unfortunately, I think to have great outcomes,
01:51:57 you actually do need to work
01:51:58 at like three standard deviations above the mean,
01:52:01 but there’s nothing saying that people have to go for that.
01:52:03 See, the thing is, but what I would argue,
01:52:06 this is my personal opinion,
01:52:08 is nobody has to do anything, first of all.
01:52:10 They certainly don’t have to work hard.
01:52:12 But I think hard work in a company should be admired.
01:52:17 And you should not feel like,
01:52:22 you shouldn’t feel good about yourself for not working hard.
01:52:26 Like, so for example, I don’t have to work out.
01:52:30 I don’t have to run.
01:52:31 I hate running, but like, I certainly don’t feel good
01:52:35 if I don’t run because I know for my health,
01:52:37 like there’s certain values,
01:52:39 I guess is what I’m trying to get at.
01:52:40 There’s certain values that you have in life.
01:52:42 It feels like there are certain values
01:52:43 that companies should have,
01:52:45 and hard work is one of the things
01:52:47 I think should be admired.
01:52:51 I often ask this kind of silly question,
01:52:54 just to get a sense of people,
01:52:56 like if I’m hiring and so on.
01:52:58 I just ask if they think it’s better
01:53:00 to work hard or work smart.
01:53:03 It was helpful for me to get a sense of people from that.
01:53:07 Because you think like the right.
01:53:08 The answer’s both.
01:53:09 What’s that?
01:53:10 The answer’s both.
01:53:11 The answer’s both.
01:53:12 I usually try not to give them that,
01:53:13 but sometimes I’ll say both if that’s an option.
01:53:16 But a lot of people kind of,
01:53:19 a surprising number will say work smart.
01:53:22 And they’re usually people
01:53:23 who don’t know how to work smart.
01:53:26 And they’re literally just lazy.
01:53:29 Not just, there’s two effects behind that.
01:53:32 One is laziness and the other is ego.
01:53:37 When you’re younger and you say it’s better to work smart,
01:53:40 it means you think you know what it means
01:53:44 to work smart at this early stage.
01:53:46 To me, people that say work hard or both,
01:53:49 they have the humility to understand like,
01:53:52 I’m going to have to work my ass off
01:53:54 because I’m too dumb to know how to work smart.
01:53:56 And people who are self critical in this way,
01:53:59 in some small amount, you have to have some confidence.
01:54:01 But if you have humility,
01:54:03 that means you’re going to actually eventually figure out
01:54:06 what it means to work smart.
01:54:07 And then to actually be successful, you should do both.
01:54:11 So I have a very particular take on this,
01:54:14 which is that no one’s forcing you to do anything.
01:54:19 All choices have consequences.
01:54:22 So if you major in, I don’t know, theoretical literature,
01:54:27 I don’t even know if that’s a major.
01:54:28 I’m just making something up.
01:54:30 That’s as opposed to regular literature, applied literature.
01:54:33 Yeah, think about like theoretical Spanish lit
01:54:38 from the 14th century.
01:54:39 Like just make up your esoteric thing.
01:54:41 And then the number of people I went to Stanford with
01:54:44 who get out in the world and they’re like,
01:54:45 wait, what, I can’t find a job?
01:54:47 Like no one wants a theoretical,
01:54:49 like there are plenty of counter examples
01:54:51 of people who have majored in esoteric things
01:54:53 and gone on to be very successful.
01:54:54 So I just want to be clear, it’s not about the major.
01:54:56 But every choice you make, whether it’s to have kids,
01:55:00 like I love my children.
01:55:02 It’s so awesome to have two kids.
01:55:04 And it is so hard to work really hard and also have kids.
01:55:08 It’s really hard.
01:55:09 And there’s a reason why certain very successful people
01:55:12 like don’t have, or not successful,
01:55:14 but people who run very, very large companies or startups
01:55:17 have chosen not to have kids for a while
01:55:19 or chosen not to like prioritize them.
01:55:21 Everything’s a choice.
01:55:22 And like I choose to prioritize my children
01:55:24 because like I want to do that, right?
01:55:27 So everything’s a choice.
01:55:29 Now, once you’ve made that choice,
01:55:33 I think it’s important that the contract is clear,
01:55:36 which is to say,
01:55:37 let’s imagine you were joining a new startup.
01:55:40 It’s important that that startup communicate
01:55:43 that like the expectation is like,
01:55:44 we’re all working really, really hard right now.
01:55:46 You don’t have to join the startup.
01:55:48 But like, if you do just know like you’re,
01:55:51 it’s almost as if you join, I don’t know,
01:55:53 pick your like sports team.
01:55:57 Like let’s go back to the Yankees for a second.
01:56:00 You want to join the Yankees,
01:56:01 but you don’t really want to work that hard.
01:56:03 You don’t really want to do batting practice
01:56:05 or pitching practice or whatever for your position, right?
01:56:09 That to me is wacko.
01:56:11 And that’s actually the world that it feels like we live in
01:56:13 in tech sometimes,
01:56:15 where people both want to work for the Yankees
01:56:17 because it pays a lot,
01:56:18 but like don’t actually want to work that hard.
01:56:21 That I don’t fully understand.
01:56:23 Because if you sign up for some of these things,
01:56:26 just sign up for it.
01:56:26 But it’s okay if you don’t want to sign up for it.
01:56:29 There’s so many wonderful careers in this world
01:56:31 that don’t require 80 hours a week.
01:56:33 But when I read about companies going to like
01:56:35 four day work weeks and stuff,
01:56:36 I just like, I chuckle because I can’t get enough done
01:56:39 with a seven day week.
01:56:41 I don’t know how.
01:56:42 And people will say, oh, you’re just not working smart.
01:56:44 And it’s like, no, I work pretty smart,
01:56:46 I think in general.
01:56:47 Like I wouldn’t have gotten to this point
01:56:49 if I hadn’t like some amount of working smart.
01:56:52 And there is balance though.
01:56:53 So I used to be like a pretty big cyclist.
01:56:55 I don’t do it much anymore just because of kids
01:56:58 and like prioritizing other things, right?
01:57:01 But one of the most important things to learn
01:57:03 as a cyclist is to take a rest day.
01:57:06 But to me and to cyclists,
01:57:07 like resting is a function of optimizing for the long run.
01:57:12 It’s not like a thing that you do for its own merits.
01:57:16 It’s actually like, if you don’t rest,
01:57:17 your muscles don’t recover.
01:57:18 And then you’re just not as,
01:57:19 like you’re not training as efficiently.
01:57:21 You should probably, the successful people I’ve known
01:57:24 in terms of athletes, they hate rest days,
01:57:27 but they know they have to do it for the long term.
01:57:29 They think their opposition is getting stronger and stronger
01:57:32 and that’s the feeling,
01:57:34 but you know it’s the right thing
01:57:36 and usually you need a coach to help you.
01:57:38 Yeah, totally.
01:57:39 So, I mean, I use this thing called TrainingPeaks
01:57:41 and it’s interesting
01:57:42 because it actually mathematically shows
01:57:44 like where you are on the curve and all this stuff,
01:57:46 but you have to have that rest,
01:57:49 but it’s a function of going harder for longer.
01:57:52 Again, it’s this reinforcement learning,
01:57:53 like planning in the aggregate and the long run,
01:57:56 but a lot of people will hide behind laziness
01:57:58 by saying that they’re trying to optimize for the long run
01:58:00 and they’re not, they’re just not working very hard.
01:58:03 But again, you don’t have to sign up for it.
01:58:05 It’s totally cool.
01:58:05 Like, I don’t think less of people
01:58:07 for like not working super hard.
01:58:08 It’s just like, don’t sign up for things
01:58:10 that require working super hard.
01:58:11 And some of that requires for the leadership
01:58:13 to have the guts, the boldness to communicate effectively
01:58:17 at the very beginning.
01:58:17 I mean, sometimes I think most of the problems arise
01:58:20 in the fact that the leadership is kind of hesitant
01:58:24 to communicate the socially difficult truth
01:58:30 of what it takes to be at this company.
01:58:33 So they kind of say, hey, come with us.
01:58:36 There’s, we have snacks, you know, but.
01:58:39 Unlimited vacation and.
01:58:40 Yeah.
01:58:41 You know, Ray at Bridgewater is always fascinating
01:58:44 because, you know, people,
01:58:45 it’s been called like a cult on the outside
01:58:47 or cultish and,
01:58:49 but what’s fascinating is like,
01:58:51 they just don’t give on their principles.
01:58:52 They’re like, listen, this is what it’s like to work here.
01:58:55 We record every meeting.
01:58:57 We’re like brutally honest
01:58:59 and that’s not going to feel right to everyone.
01:59:00 And if it doesn’t feel right to you, totally cool.
01:59:03 Just go work somewhere else.
01:59:04 But if you work here, you are signing up for this.
01:59:08 And that’s, that’s been fascinating to me
01:59:10 because it’s honesty upfront.
01:59:12 It’s a system in which you operate.
01:59:15 And if it’s not for you,
01:59:16 like no one’s forcing you to work there, right?
01:59:19 I actually did.
01:59:20 So I did a conversation with him
01:59:22 and kind of got stuck in a funny moment,
01:59:25 which is at the end I asked him, you know,
01:59:28 to give me honest feedback of how I did on the interview.
01:59:31 And I was.
01:59:32 Did he?
01:59:33 I don’t think so.
01:59:34 He was super nice.
01:59:36 He asked me, he’s like, well,
01:59:38 tell me, did you accomplish
01:59:40 what you were hoping to accomplish?
01:59:42 I was like, that’s not, that’s not.
01:59:45 I’m asking you as an objective observer
01:59:47 of two people talking, how do we do today?
01:59:52 And then he’s like, well,
01:59:53 he gave me this politician answer.
01:59:55 Well, I feel like we’ve accomplished
01:59:57 successful communication of like ideas,
02:00:00 which is I’d love to spread some of the ideas
02:00:03 in that, like in principles and so on.
02:00:06 I was like.
02:00:07 Back to my original point,
02:00:08 it’s really hard to get.
02:00:10 Even for Ray Dalio.
02:00:11 It’s really hard to give feedback.
02:00:13 And one of the other things I learned from him
02:00:15 and just people in that world is like,
02:00:19 man, humans really like to pretend
02:00:24 that they’ve come to some kind of meeting of the minds.
02:00:27 Like if there’s conflict, if you and I have conflict,
02:00:30 it’s always better to meet face to face, right?
02:00:32 We’re on the phone.
02:00:33 Slack is not great, right?
02:00:35 Email is not great, but face to face.
02:00:37 What’s crazy is you and I get together
02:00:38 and we actively try to,
02:00:40 even if we’re not actually solving the conflict,
02:00:43 we actively try to paper over the conflict.
02:00:45 Oh yeah, it didn’t really bother me that much.
02:00:48 Oh yeah, I’m sure you didn’t mean it.
02:00:50 But like, no, in our minds we’re still there.
02:00:54 So this is one of the things that as a leader,
02:00:56 you always have to be digging,
02:00:58 especially as you ascend.
02:00:59 Like straight to the conflict.
02:01:00 Yeah, but as you ascend,
02:01:02 no one wants to tell you you’re crazy.
02:01:03 No one wants to tell you your idea is bad.
02:01:06 And you can, you’re like, oh,
02:01:08 oh, I’m going to be a leader.
02:01:09 And the idea is, well, I’m just going to ask people.
02:01:12 No one tells you.
02:01:13 So like you have to look for the markers,
02:01:16 knowing that literally just people
02:01:18 aren’t going to tell you along the way and be paranoid.
02:01:21 I mean, you asked about selling the company.
02:01:24 I think one of the biggest differences between me
02:01:25 and a lot of other entrepreneurs is like,
02:01:28 I wasn’t completely confident we could do it.
02:01:30 Like we could be alone and actually be great.
02:01:35 And if any entrepreneur is honest with you,
02:01:37 they also feel that way.
02:01:39 But a lot of people are like,
02:01:40 well, I have to be cocky and just say,
02:01:42 I can do this on my own.
02:01:43 We’re going to be fine.
02:01:44 We’re going to crush everyone.
02:01:46 Some people do say that and then it’s not right.
02:01:49 And they fail.
02:01:50 But being honest in that moment with yourself,
02:01:55 with those close to you.
02:01:57 And also you talked about the personality of leaders
02:02:00 and who resonates and who doesn’t.
02:02:04 It’s rare that I see leaders be vulnerable, rare.
02:02:09 And one thing I tried to do at Instagram,
02:02:12 at least internally was like, say when I screwed up
02:02:16 and like point out how I was wrong about things
02:02:19 and point out where my judgment was off.
02:02:22 Everyone thinks they have to bat a thousand, right?
02:02:25 Like that’s crazy.
02:02:27 The best quant hedge funds in the world,
02:02:29 bat 50.001%.
02:02:31 They just take a lot of bets, right?
02:02:34 Renaissance, they might, they might bat 51%, right?
02:02:37 But holy hell, like the question isn’t,
02:02:41 are you right every single time
02:02:43 and you have to seem invincible.
02:02:46 The question is how many at bats do you get?
02:02:48 And on average, are you better on average, right?
02:02:53 With enough bets and enough at bats
02:02:55 that your aggregate can be very high.
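A rough worked example of that arithmetic: a strategy that wins just 51% of evenly sized, independent bets compounds into a large aggregate edge once the number of at bats is big enough. The numbers and the little simulation below are purely illustrative, not any real fund’s statistics.

```python
import random

def simulate(win_rate=0.51, n_bets=100_000, stake=1.0, seed=0):
    """Simulate many independent, evenly sized bets with a small edge."""
    rng = random.Random(seed)
    pnl = 0.0
    for _ in range(n_bets):
        pnl += stake if rng.random() < win_rate else -stake
    return pnl

# Expected profit is (2 * win_rate - 1) * n_bets: a 1% edge over
# 100,000 bets is roughly 2,000 stakes of profit, even though
# nearly half of the individual bets lose.
print(simulate())
```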
02:02:58 I mean, Steve Jobs was wrong at a lot of stuff.
02:03:01 The Newton too early, right?
02:03:03 Next, not quite right.
02:03:05 There was even a time where he said like,
02:03:08 no one will ever wanna watch a video on the iPod.
02:03:13 Totally wrong.
02:03:14 But who cares if you come around
02:03:16 and realize your mistake and fix it.
02:03:18 It becomes just like you said, harder and harder
02:03:20 when your ego grows and the number of people around you
02:03:23 that say positive things towards you grows.
02:03:26 I actually think it’s really valuable that,
02:03:29 like let’s imagine a counterfactual
02:03:31 where Instagram became worth like $300 billion
02:03:34 or something crazy, right?
02:03:37 I kind of like that my life is relatively normal now.
02:03:40 When I say relatively, you get what I mean.
02:03:41 I’m not making a claim that I live a normal life,
02:03:44 but like I certainly don’t live in a world
02:03:46 where there are like 15 Sherpas following me,
02:03:49 like fetching me water, like that’s not how it works.
02:03:52 I actually like that I have a sense of humility of like,
02:03:56 I may not found another thing that’s nearly as big
02:03:59 so I have to work twice as hard
02:04:01 or I have to like learn twice as much.
02:04:04 I have to, we haven’t talked about machine learning yet,
02:04:07 but my favorite thing is all these like famous,
02:04:11 you know, tech guys who have worked in the industry,
02:04:14 pontificating about the future of machine learning
02:04:17 and how it’s gonna kill us all, right?
02:04:19 And like, I’m pretty sure they’ve never tried
02:04:22 to build anything with machine learning themselves.
02:04:24 Yes, so there’s a nice line between people
02:04:27 that actually build stuff with machine,
02:04:29 like actually program something
02:04:31 or at least understand some of those fundamentals
02:04:33 and the people that are just saying philosophical stuff
02:04:36 or journalists and so on.
02:04:38 It’s an interesting line to walk
02:04:40 because the people who program are often not philosophers.
02:04:44 Or don’t have the attention,
02:04:45 they can’t write an op ed for the Wall Street Journal,
02:04:48 like it doesn’t work.
02:04:49 So like, it’s nice to be both a little bit,
02:04:51 like to have elements of both.
02:04:52 My point is the fact that I have to learn stuff from scratch
02:04:56 or that I choose to are like.
02:04:58 It’s humbling.
02:04:59 Yeah, I mean, again, I have a lot of advantages.
02:05:02 I like, but my point is it’s awesome
02:05:06 to be back in a game where you have to fight.
02:05:11 That is, that’s fun.
02:05:13 So being humble, being vulnerable,
02:05:16 it’s an important aspect of a leader
02:05:18 and I hope it serves me well,
02:05:19 but like, I can’t fast forward 10 years to now.
02:05:22 I’ve just, that’s my game plan.
02:05:24 Before I forget, I have to ask you one last thing
02:05:26 on Instagram.
02:05:28 What do you think about the whistleblower,
02:05:30 Frances Haugen, recently coming out
02:05:33 and saying that Facebook is aware of Instagram’s
02:05:36 harmful effect on teenage girls
02:05:39 as per their own internal research studies on the matter?
02:05:43 What do you think about this baby of yours,
02:05:46 Instagram being under fire now,
02:05:48 as we’ve been talking about under the leadership of Facebook?
02:05:54 You know, I often question, where does the blame lie?
02:05:59 Is the blame at the people that originated the network, me?
02:06:06 Is the blame at like the decision to combine the network
02:06:10 with another network with a certain set of values?
02:06:15 Is the blame at how it gets run after I left?
02:06:20 Like, is it the driver or is it the car, right?
02:06:25 Is it that someone enabled these devices in the first place?
02:06:29 If you go to an extreme, right?
02:06:31 Or is it the users themselves, just human nature?
02:06:35 Is it just the way of human nature?
02:06:37 Sure, and like the idea that we’re gonna find
02:06:39 a mutually exclusive answer here is crazy.
02:06:42 There’s not one place that’s a combination
02:06:43 of a lot of these things.
02:06:45 And then the question is like, is it true at all, right?
02:06:48 Like I’m not actually saying that’s not true
02:06:50 or that it’s true, but there’s always more nuance here.
02:06:55 Do I believe that social media has an effect
02:06:59 on young people?
02:07:00 Well, it’s got to, they use it a lot.
02:07:02 And I bet you there are a lot of positive effects
02:07:04 and I bet you there are negative effects,
02:07:05 just like any technology.
02:07:07 And where I’ve come to in my thinking on this
02:07:10 is that I think any technology has negative side effects.
02:07:13 The question is, as a leader, what do you do about them?
02:07:16 And are you actively working on them
02:07:18 or do you just like not really believe in them?
02:07:20 If you’re a leader that sits there and says,
02:07:22 well, we’re gonna put an enormous amount
02:07:24 of resources against this.
02:07:26 We’re gonna acknowledge when there are true criticisms,
02:07:29 we’re gonna be vulnerable and that we’re not perfect
02:07:32 and we’re gonna go fix them
02:07:33 and we’re gonna be held accountable along the way.
02:07:37 I think that people generally really respect that.
02:07:41 But I think that where Facebook I think has had issues
02:07:44 in the past is where they say things like,
02:07:46 can’t remember what Mark said about misinformation
02:07:49 during the election.
02:07:50 There was that like famous quote where he was like,
02:07:52 it’s pretty crazy to think that Facebook had anything
02:07:54 to do with this election.
02:07:55 Like that was something like that quote.
02:07:57 And I don’t remember what stage he was on and yeah.
02:08:00 But ooh, that did not age well, right?
02:08:02 Like you have to be willing to say,
02:08:05 well, maybe there’s something there and wow,
02:08:09 like I wanna go look into it
02:08:11 and truly believe it in your gut.
02:08:12 But if people look at you and how you act
02:08:14 and what you say and don’t believe you truly feel that way.
02:08:18 It’s not just the words you say, but how you say them
02:08:20 and that people believe they actually feel the pain
02:08:23 of having caused any suffering in the world.
02:08:25 So to me, it’s much more about your actions
02:08:29 and your posture post event
02:08:31 than it is about debugging the why.
02:08:33 Cause I don’t know, is it like, I don’t know this research.
02:08:36 It was written well after I left, right?
02:08:38 Like, is it the algorithm?
02:08:41 Is it the explore page?
02:08:42 Is it the people you might know unit connecting you
02:08:45 to ideas that are dangerous?
02:08:47 Like I really don’t know.
02:08:51 So we’d have to have a much deeper,
02:08:53 I think dive to understand where the blame lies.
02:08:56 What’s very unpleasant to me to consider,
02:08:58 now, I don’t know if this is true,
02:09:00 but to consider the very fact that there might be
02:09:03 some complicated games being played here.
02:09:07 For example, as somebody, I really love psychology
02:09:10 and I love it enough to know that the field
02:09:13 is pretty broken in the following way.
02:09:15 It’s very difficult to study human beings well at scale
02:09:19 because the questions you ask affect the results.
02:09:22 You can basically get any results you want.
02:09:25 And so you have an internal Facebook study
02:09:27 that asks some question of which we don’t know
02:09:29 the full details and there’s some kind of analysis,
02:09:32 but that’s just the one little tiny slice
02:09:34 into some much bigger picture.
02:09:37 And so you can have thousands of employees at Facebook.
02:09:40 One of them comes out and picks whatever narrative,
02:09:44 knowing that they become famous,
02:09:46 coupled with the other really uncomfortable thing
02:09:49 I see in the world, which is journalists seem to understand
02:09:53 they get a lot of clickbait attention
02:09:55 from saying something negative about social networks.
02:09:58 Certain companies, like they even get some clickbait stuff
02:10:03 about Tesla or about, especially when it’s like,
02:10:07 when there’s a public famous CEO type of person,
02:10:11 they get a lot of views on the negative, not the positive.
02:10:14 I mean,
02:10:16 it actually goes to the thing you were saying before,
02:10:18 if there’s a hot, sexy new product,
02:10:20 that’s great to look forward to, they get positive on that,
02:10:23 but absent a product, it’s nice to have like the CEO
02:10:28 messing up in some kind of way.
02:10:30 And so couple that with the whistleblower
02:10:33 and with this whole dynamic of journalism and so on,
02:10:38 you know, with Social Dilemma being really popular,
02:10:41 documentary, it’s like, all right,
02:10:44 my concern is there’s deep flaws in human nature here
02:10:48 in terms of things we need to deal with,
02:10:51 like the nature of hate, of bullying,
02:10:54 all those kinds of things.
02:10:56 And then there’s people who are trying to use that
02:11:00 potentially to become famous and make money
02:11:02 off of blaming others for causing more of the problem
02:11:06 as opposed to helping solve the problem.
02:11:08 So I don’t know what to think.
02:11:10 I’m not saying this is like, I’m just uncomfortable
02:11:12 with, I guess, not knowing what to think about any of this
02:11:16 because a bunch of folks I know that work at Facebook
02:11:19 on the machine learning side, so Yann LeCun,
02:11:21 I mean, they’re quite upset about what’s happening
02:11:25 because there’s a lot of really brilliant,
02:11:26 good people inside Facebook that are trying to do good.
02:11:30 And so like all of this press, Yann is one of them,
02:11:33 and he has an amazing team
02:11:34 of machine learning researchers.
02:11:35 Like he’s really upset with the fact
02:11:38 that people don’t seem to understand
02:11:40 that the portrayal does not represent
02:11:43 the full nature of efforts that’s going on at Facebook.
02:11:46 So I don’t know what to think about that.
02:11:48 Well, you just, I think, very helpfully explained
02:11:52 the nuance of the situation
02:11:54 and why it’s so hard to understand.
02:11:56 But a couple of things.
02:11:57 One is I think I have been surprised
02:12:02 at the scale with which some product manager
02:12:11 can do an enormous amount of harm
02:12:14 to a very, very large company
02:12:17 by releasing a trove of documents.
02:12:19 Like I think I read a couple of them when they got published
02:12:21 and I haven’t even spent any time going deep.
02:12:24 Part of it’s like, I don’t really feel like reliving
02:12:26 a previous life, but wow.
02:12:30 Like talk about challenging the idea of open culture
02:12:34 and like what that does to Facebook internally.
02:12:37 If Facebook was built, like I remember like my office,
02:12:43 we had this like no visitors rule around my office
02:12:45 because we always had like confidential stuff up
02:12:47 on the walls and everyone was super angry
02:12:50 because they’re like, that goes against our culture
02:12:52 of transparency and like Mark’s in the fish cube
02:12:54 or whatever they call it, the aquarium,
02:12:56 I think they called it, where like literally anyone could see
02:12:59 what he was doing at any point and I don’t know.
02:13:03 I mean, other companies like Apple have been quiet
02:13:06 slash locked down, Snapchat’s the same way for a reason.
02:13:10 And I don’t know what this does to transparency
02:13:14 on the inside of startups that value that.
02:13:16 I think that it’s a seminal moment and you can say,
02:13:20 well, you should have nothing to hide, right?
02:13:22 But to your point, you can pick out documents
02:13:24 that show anything, right?
02:13:27 But I don’t know.
02:13:28 What happens to transparency inside of startups
02:13:31 and the culture that startups or companies
02:13:35 in the future will grow, like the startup of the future
02:13:37 that becomes the next Facebook will be locked down
02:13:40 and what does that do, right?
02:13:42 So that’s part one.
02:13:44 Part two, like I don’t think that you could design
02:13:49 a more well orchestrated handful of events,
02:13:54 from the 60 Minutes interview to releasing the documents
02:13:59 in the way that they were released at the right time.
02:14:02 That takes a lot of planning and partnership.
02:14:05 And it seems like she has a partner at some firm, right?
02:14:09 That probably helped a lot with this, but man,
02:14:12 at a personal level, if you’re her,
02:14:15 you’d have to really believe in what you are doing,
02:14:19 really believe in it because you are personally
02:14:22 putting your ass on the line, right?
02:14:25 Like you’ve got a very large company
02:14:29 that doesn’t like enemies, right?
02:14:33 It takes a lot of guts and I don’t love
02:14:37 these conspiracy theories about like,
02:14:39 oh, she’s being financed from some person or people.
02:14:42 Like I don’t love them because that’s
02:14:43 like the easy thing to say.
02:14:45 I think the Occam’s razor here is like someone thought
02:14:49 they were doing something wrong
02:14:51 and was like very, very courageous.
02:14:54 And I don’t know if courageous is the word,
02:14:57 but like, so without getting into like,
02:15:00 is she a martyr?
02:15:01 Is she courageous?
02:15:02 Is she right?
02:15:03 Like, let’s put that aside for a second.
02:15:05 Then there are the documents themselves.
02:15:07 They say what they say.
02:15:09 To your point, a lot of the things that people
02:15:12 have been worried about are already in the documents
02:15:15 or have already been said externally.
02:15:17 And I don’t know, I’m just like, I’m thankful
02:15:22 that I am focused on new things with my life.
02:15:25 Well, let me just say, I just think it’s a really
02:15:27 hard problem that probably Facebook and Twitter
02:15:30 are trying to solve.
02:15:32 I’m actually just fascinated by how hard this problem is.
02:15:35 There are fundamental issues at Facebook in tone
02:15:38 and in the approach to how product gets built
02:15:41 and the objective functions.
02:15:43 And organizations are not people.
02:15:48 So Yann and FAIR, right?
02:15:50 Like there are a lot of really great people
02:15:51 who like literally just want to push
02:15:53 reinforcement learning forward.
02:15:55 They literally just want to teach a robot
02:15:56 to touch, feel, lift, right?
02:15:59 Like they’re not thinking about political misinformation,
02:16:02 right?
02:16:03 But there’s a strong connection between what funds
02:16:07 that research and an enormously profitable machine
02:16:10 that has trade offs.
02:16:13 And one cannot separate the two.
02:16:18 You are not completely separate from the system.
02:16:21 So I agree, it can feel really frustrating
02:16:24 to feel if you’re internal there,
02:16:27 that you’re working on something completely unrelated
02:16:29 and you feel like your group’s good.
02:16:31 I can understand that.
02:16:32 But there’s some responsibility still.
02:16:34 You have to acknowledge, it’s like the Ray Dalio thing.
02:16:36 You have to look in the mirror and see if there’s problems
02:16:38 and you have to fix those problems.
02:16:40 Yeah.
02:16:43 You mentioned machine learning reinforcement quite a bit.
02:16:46 I mean, to me, social networks is one of the exciting places,
02:16:50 recommender systems where machine learning is applied.
02:16:52 Where else in the world, in the space of possibilities
02:16:56 over the next five, 10, 20 years,
02:16:58 do you think we’re going to see impact of machine learning
02:17:02 when you try, on the philosophical level,
02:17:05 on a technical level, what do you think?
02:17:07 Or within social networks themselves?
02:17:11 Well, I think the obvious answers are climate change.
02:17:17 Think about how much fuel
02:17:19 or just waste there is in energy consumption today
02:17:24 because we don’t plan accordingly,
02:17:27 because we take the least efficient route or…
02:17:30 The logistics and stuff, the supply chain,
02:17:32 all that kind of stuff.
02:17:33 Yeah, I mean, listen, if we’re gonna fight climate change,
02:17:36 like one really way, one awesome way to do it
02:17:39 is figure out how to optimize how we operate as a species
02:17:43 and minimize the amount of energy we consume
02:17:47 to maximize whatever economic impact we wanna have.
02:17:51 Because right now those two are very much tied together.
02:17:53 And I don’t believe that that has to be the case.
02:17:56 There’s this really interesting, you’ve read it.
02:18:00 For people who are listening,
02:18:00 there’s this really interesting paper
02:18:02 on reinforcement learning
02:18:04 and energy consumption inside buildings.
02:18:06 It’s like one of the seminal ones, right?
02:18:08 But imagine that at massive scale.
02:18:10 That’s super interesting.
02:18:11 I mean, they’ve done resource planning for servers
02:18:15 for peak load using reinforcement learning.
02:18:17 I don’t know if that was at Google or somewhere else,
02:18:19 but like, okay, great, you do it for servers,
02:18:21 but what if you could do it for just capacity
02:18:24 and general energy capacity for cities
02:18:26 and planning for traffic?
02:18:27 And of course there’s all the self driving cars
02:18:30 and I don’t know, like I’m not gonna pontificate
02:18:34 on like crazy ideas using reinforcement learning
02:18:39 or machine learning.
02:18:40 It’s just so clear to me
02:18:41 that humans don’t think quickly enough.
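As a toy sketch in the spirit of the building energy work mentioned a moment ago: tabular Q-learning against a crude, one-room thermostat simulator. Every constant, the dynamics, and the reward here are invented for illustration; the actual papers use far richer building models.

```python
import random

ACTIONS = [0.0, 0.5, 1.0]  # hypothetical cooling power levels

def step(temp, cooling, outside=30.0):
    # Crude room dynamics: drift toward the outside temperature, minus cooling.
    new_temp = temp + 0.1 * (outside - temp) - 0.8 * cooling
    reward = -abs(new_temp - 22.0) - 0.3 * cooling  # comfort minus energy cost
    return new_temp, reward

def bucket(temp):
    # Discretize temperature into integer states between 18 and 30 degrees.
    return max(18, min(30, int(round(temp))))

def q_learn(episodes=2000, horizon=50, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(18, 31) for a in range(len(ACTIONS))}
    for _ in range(episodes):
        temp = rng.uniform(20.0, 28.0)
        for _ in range(horizon):
            s = bucket(temp)
            if rng.random() < eps:
                a = rng.randrange(len(ACTIONS))  # explore
            else:
                a = max(range(len(ACTIONS)), key=lambda i: q[(s, i)])  # exploit
            temp, r = step(temp, ACTIONS[a])
            best_next = max(q[(bucket(temp), i)] for i in range(len(ACTIONS)))
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
    return q

q = q_learn()
# Greedy cooling level per temperature bucket after training.
print({s: ACTIONS[max(range(len(ACTIONS)), key=lambda i: q[(s, i)])]
       for s in range(18, 31)})
```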
02:18:43 So it’s interesting to think about machine learning
02:18:46 helping a little bit at scale.
02:18:49 So a little bit to a large number of people
02:18:52 that has a huge impact.
02:18:53 So if you optimize, say Google Maps, something like that,
02:18:57 trajectory planning, or what was it, MapQuest at first.
02:19:01 Getting here, I looked and it was like,
02:19:02 here’s the most energy efficient route.
02:19:04 And I was like, I’m gonna be late.
02:19:05 I need to take the fastest route.
02:19:07 As opposed to unrolling the map.
02:19:09 Yeah, yeah.
02:19:10 Like, and then that’s going to be very inefficient
02:19:12 no matter what.
02:19:13 I was definitely the other day,
02:19:14 like part of the epsilon in epsilon greedy
02:19:17 with Waze where like I was sent on like a weird route
02:19:21 that I could tell they’re like,
02:19:22 we just need to collect data at this road.
02:19:24 Like we just, Kevin’s.
02:19:26 You were the ant they sent out for exploration.
02:19:28 Kevin’s definitely gonna be the guinea pig.
02:19:30 And great, now we have.
02:19:32 Did you at least feel pride?
02:19:34 Oh, going through it, I was like, oh, this is fun.
02:19:36 Like now they get data about this weird shortcut.
02:19:38 And actually I hit all the green lights.
02:19:40 I’m like, this is a problem.
02:19:41 Bad data.
02:19:42 Bad data, they’re just gonna imagine.
02:19:44 I could see you slowing down and stopping at a green light
02:19:47 just to give them the right kind of data.
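A minimal sketch of the epsilon-greedy idea being joked about here: most of the time the router exploits its best current estimate, and with a small probability epsilon it explores an alternative purely to gather fresh data. The route and travel time inputs are hypothetical placeholders, not anything Waze actually exposes.

```python
import random

def pick_route(routes, estimated_times, epsilon=0.05, rng=random):
    """Epsilon-greedy route choice over hypothetical candidate routes."""
    if rng.random() < epsilon:
        return rng.choice(routes)  # explore: the occasional "weird route"
    best = min(range(len(routes)), key=lambda i: estimated_times[i])
    return routes[best]  # exploit: the fastest route by current estimates

# Example: almost always the highway, occasionally the back streets.
print(pick_route(["highway", "back_streets"], [18.0, 25.0]))
```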
02:19:50 But to answer your question,
02:19:51 like I feel like that was fairly unsatisfying
02:19:53 and it’s easy to say climate change.
02:19:55 But what I would say is at Instagram,
02:19:58 everything we applied machine learning to
02:20:02 got better for users and it got better for the company.
02:20:04 I saw the power.
02:20:06 I didn’t fully understand it as an executive.
02:20:08 And I think that’s actually one of the issues
02:20:10 that, and when I say understand,
02:20:12 I mean the mathematics of it.
02:20:14 Like I understand what it does.
02:20:15 I understand that it helps.
02:20:17 But there are a lot of executives now
02:20:20 that talk about it in the way
02:20:22 that they talk about the internet
02:20:23 or they talked about the internet like 10 years ago.
02:20:25 They’re like, we’re gonna build mobile.
02:20:27 And you’re like, what does that mean?
02:20:27 They’re like, we’re just gonna do mobile.
02:20:29 And you’re like, okay.
02:20:31 So my sense is the next generation of leaders
02:20:33 will have grown up having had classes
02:20:37 in reinforcement learning, supervised learning, whatever.
02:20:40 And they will be able to thoughtfully apply it
02:20:42 to their companies and the places that it is needed most.
02:20:46 And that’s really cool.
02:20:47 Cause I mean, talk about efficiency gains.
02:20:53 That’s what excites me the most about it.
02:20:54 Yeah, so there’s, it’s interesting just to get a fundamental
02:20:58 first principles understanding
02:20:59 of certain concepts of machine learning.
02:21:01 So supervised learning, from an executive perspective,
02:21:04 means you have to have a lot of humans
02:21:07 label a lot of data.
02:21:08 So the question there is, okay,
02:21:09 can we gather a large amount of data
02:21:12 that can be labeled well?
02:21:14 And that’s the question Tesla asked,
02:21:16 like can we create a data engine
02:21:17 that keeps sending an imperfect machine learning system
02:21:22 out there, whenever it fails, it gives us data back.
02:21:25 We label it by hand and we send it back,
02:21:27 and it goes back and forth this way.
02:21:28 Then there’s Yann LeCun’s excited
02:21:30 about the self supervised learning
02:21:32 where you do much less human labeling
02:21:36 and there’s some kind of mechanism for the system
02:21:38 to learn it by itself on the human generated data.
02:21:42 And then there’s reinforcement learning,
02:21:44 which is basically applying
02:21:47 the AlphaZero technology
02:21:52 that, through self play, learned how to solve
02:21:56 the game of Go and achieve incredible levels
02:21:59 at the game of chess.
02:22:02 Can you formulate the problem you’re trying to solve
02:22:05 in a way that’s amenable to reinforcement learning?
02:22:07 And can you get the right kind of signal at scale?
02:22:09 Cause you need a lot of signal.
02:22:11 And that’s kind of fascinating to see which part
02:22:15 of a social network can you convert
02:22:17 into a reinforcement learning problem.
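A schematic sketch of the labeling loop described above (deploy an imperfect model, collect the cases it fails on, have humans label them, retrain, repeat). The function names are hypothetical placeholders, not any company’s actual pipeline.

```python
def data_engine_loop(model, deploy, human_label, retrain, rounds=5):
    """Iterate a supervised-learning data engine.

    `deploy`, `human_label`, and `retrain` are stand-in callables supplied
    by the caller; this only illustrates the shape of the loop.
    """
    for _ in range(rounds):
        failures = deploy(model)            # cases the deployed model got wrong
        new_labels = human_label(failures)  # humans annotate the hard cases
        model = retrain(model, new_labels)  # fold them back into training
    return model
```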
02:22:19 The fascinating thing about reinforcement learning,
02:22:22 I think, is that we now have learned
02:22:25 to apply neural networks to guess the Q function,
02:22:34 basically the value for any state and action.
02:22:37 And that is fascinating cause we used to just like,
02:22:40 I don’t know, have like a linear regression,
02:22:42 like hope it worked and that was the fanciest version of it.
02:22:45 But now you look at it, I’m like trying to learn this stuff
02:22:47 and I look at it and I’m like,
02:22:48 there are like 17 different acronyms of different ways
02:22:51 you can try to apply this.
02:22:52 No one quite agrees, like what’s the best.
02:22:56 Generally, if you’re trying to like build a neural network,
02:22:58 there are pretty well trodden ways of doing that.
02:23:02 You use Adam, you use ReLU,
02:23:04 like there’s just like general good ideas.
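A minimal sketch, in PyTorch, of the "neural network guessing the Q function" idea with those well trodden defaults (ReLU activations, the Adam optimizer): a small network outputs one Q value per action and is nudged toward a one-step temporal-difference target. The state size, action count, and environment are assumptions made up for illustration.

```python
import torch
import torch.nn as nn

state_dim, n_actions = 8, 4  # hypothetical sizes
q_net = nn.Sequential(
    nn.Linear(state_dim, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, n_actions),  # one Q value per action
)
optimizer = torch.optim.Adam(q_net.parameters(), lr=1e-3)

def td_update(state, action, reward, next_state, done, gamma=0.99):
    """One update toward r + gamma * max_a' Q(s', a'); state tensors are 1-D."""
    q_pred = q_net(state)[action]
    with torch.no_grad():
        target = reward + gamma * q_net(next_state).max() * (1.0 - done)
    loss = (q_pred - target) ** 2
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```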
02:23:07 And in reinforcement learning,
02:23:08 I feel like the consensus is like, it totally depends.
02:23:13 And by the way, it’s really hard to get it to converge
02:23:16 and it’s noisy and it like,
02:23:18 so there are all these really interesting ideas
02:23:20 around building simulators.
02:23:23 You know, like for instance, in self driving, right?
02:23:25 Like you don’t want to like actually have someone
02:23:29 getting in an accident to learn that an accident is bad.
02:23:31 So you start simulating accidents,
02:23:33 simulating aggressive drivers,
02:23:35 just simulating crazy dogs that run into the street and,
02:23:39 wow, fascinating, right?
02:23:41 Like my mind starts racing and then the question is,
02:23:43 okay, forget about self driving cars.
02:23:45 Let’s talk about social networks.
02:23:49 How can you produce a better, more thoughtful experience
02:23:52 using these types of algorithms?
02:23:55 And honestly, in talking to some of the people
02:23:58 that work at Facebook and old Instagrammers,
02:24:01 most people are like, yeah, we tried a lot of things,
02:24:04 didn’t quite ever make it work.
02:24:05 I mean, for the longest time,
02:24:06 Facebook ads was effectively a logistic regression, okay?
02:24:10 I don’t know what it is now,
02:24:11 but like if you look at this paper
02:24:13 that they published back in the day,
02:24:14 it was literally just a logistic regression.
02:24:16 Made a lot of money.
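For flavor, a toy version of click prediction as a logistic regression. The features and data below are invented for illustration; they are not the model or features from the paper being referenced.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))  # made-up features, e.g. ad quality, affinity, recency
p_click = 1 / (1 + np.exp(-(1.5 * X[:, 0] + 0.8 * X[:, 1] - 0.3)))
y = rng.random(1000) < p_click  # sampled click / no-click labels

model = LogisticRegression().fit(X, y)
print(model.predict_proba(X[:5])[:, 1])  # predicted click probabilities
```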
02:24:18 So even at these like extremely large scales,
02:24:21 if we are not yet touching
02:24:23 what reinforcement learning can truly do,
02:24:25 imagine what the next 10 years looks like.
02:24:27 How cool is that?
02:24:28 It’s amazing.
02:24:29 So I really liked the use of reinforcement learning
02:24:32 as part of the simulation, for example,
02:24:34 like with self driving cars, it’s modeling pedestrians.
02:24:37 So the nice thing about reinforcement learning,
02:24:40 it can be used to learn agents within the world.
02:24:45 So they can learn to behave properly.
02:24:47 Like you can teach pedestrians to,
02:24:49 like you don’t hard code the way they behave,
02:24:51 they learn how to behave.
02:24:53 In that same way, I do have a hope,
02:24:55 was it Jack Dorsey talks about healthy conversations.
02:24:58 You talked about meaningful interactions, I believe.
02:25:03 Like simulating interactions.
02:25:06 So you can learn how to manage that, it’s fascinating.
02:25:09 So where most of your algorithm development happens
02:25:13 in virtual worlds, and then you can really learn
02:25:16 how to design the interface,
02:25:18 how you design a bunch of aspects of the experience
02:25:21 in terms of how you select what’s shown in the feed,
02:25:24 all those kinds of things.
02:25:26 It feels like if you can connect reinforcement learning
02:25:28 to that, that’s super exciting.
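A toy sketch of that "learn against a simulator" idea: an epsilon-greedy policy learns which kind of item to surface by interacting with a made-up simulated user model rather than real people. The item types, the preferences, and the rewards are all invented for illustration.

```python
import random

ITEM_TYPES = ["friend_post", "news", "meme"]

def simulated_user(item_type, rng):
    # Hypothetical simulated user: rewards friend posts most often.
    prefs = {"friend_post": 0.7, "news": 0.4, "meme": 0.5}
    return 1.0 if rng.random() < prefs[item_type] else 0.0

def train_policy(steps=10_000, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    value = {t: 0.0 for t in ITEM_TYPES}  # running reward estimate per item type
    count = {t: 0 for t in ITEM_TYPES}
    for _ in range(steps):
        if rng.random() < epsilon:
            choice = rng.choice(ITEM_TYPES)  # explore
        else:
            choice = max(ITEM_TYPES, key=value.get)  # exploit
        reward = simulated_user(choice, rng)
        count[choice] += 1
        value[choice] += (reward - value[choice]) / count[choice]
    return value

print(train_policy())  # learns the simulated user's preferences before touching real users
```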
02:25:30 Yep, and I think if you have a company and leadership
02:25:35 that believe in doing the right things
02:25:36 and can apply this technology in the right way,
02:25:38 some really special stuff can happen.
02:25:41 It is most likely going to be a group of people
02:25:44 we’ve never heard about, starting up from scratch, right?
02:25:49 And you asked if like new social networks could be built,
02:25:52 I’ve got to imagine they will be.
02:25:54 And whoever starts it, it might be some kids in a garage
02:25:58 that took these classes from these people, you, right?
02:26:02 And they’re building all of these things
02:26:04 with this tech at the core.
02:26:06 So I’m trying not to be someone who just like throws
02:26:08 around reinforcement learning as a buzzword.
02:26:11 I truly believe that it is the most cutting edge
02:26:16 in what can happen in social networks.
02:26:18 And I also believe it’s super hard.
02:26:20 Like it’s super hard to make it work.
02:26:22 It’s super hard to do it at scale.
02:26:24 It’s super hard to find people that truly understand it.
02:26:26 So I’m not gonna say that like,
02:26:30 I think it’ll be applied in social networks
02:26:31 before we have true self driving.
02:26:33 Yeah, we could argue about this for a long time,
02:26:36 but yes, I agree with you.
02:26:38 I think self driving is way harder than people realize.
02:26:40 Oh, absolutely.
02:26:41 Let me ask you in terms of that kid in the garage
02:26:44 or those couple of kids in the garage,
02:26:45 what advice would you give to them
02:26:47 if they wanna start a new social network or a business?
02:26:50 What advice would you give to somebody
02:26:52 with a big dream and a young startup?
02:26:56 To me, you have to choose to do something
02:26:59 that even if it fails, like it was so fun, right?
02:27:03 Like we never started Instagram knowing
02:27:06 it was going to be big.
02:27:07 We started Instagram because we loved photography.
02:27:10 We loved social networks.
02:27:12 I had seen what other social networks had done
02:27:14 and I thought, hmm, maybe we did a spin on this,
02:27:17 but like nowhere was our fate predestined.
02:27:20 Like it wasn’t like, it wasn’t written out anywhere
02:27:23 that everything was gonna go great.
02:27:25 And I often think the counterfactual,
02:27:27 like what if it had not gone well?
02:27:28 I would have been like, I don’t know, that was fun.
02:27:30 We raised some money, we learned some stuff
02:27:32 and does it position you well for the next experience?
02:27:37 That’s the advice that I would give
02:27:39 to anyone wanting to start something today,
02:27:41 which is like, does this meet with your ultimate goals?
02:27:45 Not wealth, not fame, none of that,
02:27:47 because all of that, by the way, is bullshit.
02:27:48 Like you can get super famous and super wealthy.
02:27:52 And I think generally those are not things that,
02:27:57 again, it’s easy to say with like a lot of money
02:27:59 that somehow like it’s not good to have a lot of money.
02:28:01 It’s just, I think that complicates life enormously
02:28:04 in a way that people don’t fully comprehend.
02:28:06 So I think it is way more interesting to shoot for,
02:28:09 can I make something that people love,
02:28:11 that provides value in the world,
02:28:13 that I love building, that I love working on, right?
02:28:17 That’s what I would do if I were starting from scratch.
02:28:21 And by the way, like in some ways
02:28:22 that’s what I will do personally,
02:28:25 which is like choose the thing
02:28:26 that you get up every morning and you’re like,
02:28:27 I love this, even when it’s painful.
02:28:33 Even when it’s painful.
02:28:34 What about a social network specifically?
02:28:36 If you were to imagine, put yourself in the mind of some.
02:28:40 I can’t compete against myself.
02:28:42 I can’t give out ideas.
02:28:43 Okay, I got you.
02:28:44 No, but it’s like high level.
02:28:45 You can focus on community.
02:28:47 Yeah.
02:28:50 I said that as a half joke.
02:28:54 In all honesty, I think these things are so hard to build
02:28:56 that like ideas are a dime a dozen, but.
02:28:59 You have talked about keeping it simple.
02:29:02 Can I tell you?
02:29:03 Which is a liberating idea.
02:29:04 My model is it’s three circles and they overlap.
02:29:08 One circle is what do I have experience at?
02:29:11 Slash, what am I good at?
02:29:12 I don’t like saying what am I good at
02:29:14 because it just like seems like,
02:29:16 what do I have experience in, right?
02:29:18 What can I bring to the table?
02:29:19 What am I excited about is the other circle.
02:29:22 What gets me excited?
02:29:22 What’s just super cool, right?
02:29:24 That I want to work on because even when this is hard,
02:29:29 I think it’s so cool.
02:29:30 I want to stick with it.
02:29:31 And the last circle is like, what does the world need?
02:29:34 And if that circle ain’t there,
02:29:36 it doesn’t matter what you work on.
02:29:37 Cause there are a lot of startups that exist
02:29:39 that just no one needs or very small markets need.
02:29:43 But if you want to be successful,
02:29:44 I think if you’re like, if you’re good at it,
02:29:47 you’re passionate about it,
02:29:49 and the world needs it.
02:29:51 I mean, this sounds simple,
02:29:53 not enough people sit down and just think
02:29:54 about those circles and think, do these things overlap?
02:29:58 And then can I get that middle section?
02:29:59 It’s small, but can I get that middle section?
02:30:02 I think a lot about that personally.
02:30:05 And then you have to be really honest about the circle
02:30:09 that you’re good at and really honest about the circle
02:30:14 that the world needs.
02:30:17 And also really honest about the passion,
02:30:20 like what do you actually love,
02:30:22 as opposed to like some kind of dream of making money,
02:30:24 all those kinds of things, what you literally love doing.
02:30:26 I had a former engineer who decided to start a startup
02:30:29 and I was like, are you sure you want to start a company
02:30:32 versus like join something else?
02:30:34 Because being a coach of an NBA team and playing basketball
02:30:39 are two very, very different things.
02:30:42 And like not everyone fully understands the difference.
02:30:45 I think you can kind of do both.
02:30:50 And I don’t know, jury’s out on that one
02:30:51 because like they’re in the middle of it now.
02:30:54 But it’s really important to figure out
02:30:57 what you’re good at, not be full of yourself,
02:30:59 like truly look at your track record.
02:31:03 What’s the saying like, it ain’t bragging if you can do it.
02:31:09 But too many people are delusional
02:31:12 and like think they’re better at things
02:31:14 than they actually are,
02:31:15 or think there’s a bigger market than there actually is.
02:31:18 When you confuse your passion for things with a big market,
02:31:21 that’s really scary, right?
02:31:23 Like just because you think it’s cool
02:31:25 doesn’t mean that it’s a big business opportunity.
02:31:27 So like, what evidence do you have?
02:31:28 Again, I’m a fairly like, I’m a strict rationalist on this.
02:31:32 And like sometimes people don’t like working with me
02:31:35 because I’m pretty pragmatic about things.
02:31:37 Like I’m not Elon, like I don’t sit
02:31:40 and make bold proclamations about visiting Mars.
02:31:44 Like that’s just not how I work.
02:31:46 I’m like, okay, I want to build this really cool thing
02:31:48 that’s fairly practical and I think we could do it.
02:31:50 And it’s in this way.
02:31:52 And what’s cool though is like, that’s just my sweet spot.
02:31:55 I’m not like, I just, I can’t with a straight face
02:31:58 talk about the metaverse.
02:31:59 I can’t, I just, it’s not me.
02:32:01 What do you think about the Facebook renaming itself to?
02:32:05 I didn’t mean that as a dig.
02:32:06 I just literally mean like, I’m fairly,
02:32:09 I like to live in the next five years.
02:32:11 And like, what things can I get out in a year
02:32:13 that people will use at scale?
02:32:15 And so it’s just, again, those circles I think are different
02:32:20 for different people, but it’s important to realize
02:32:22 that like market matters, you being good at it matters
02:32:26 and having passion for it matters.
02:32:27 Your question, sorry.
02:32:29 Well, on that last, on this topic in terms of funding,
02:32:33 is there, by way of advice,
02:32:39 was funding in your own journey helpful, unhelpful?
02:32:44 Like is there a right time to get funding, venture funding
02:32:48 or anything, borrow some money from your parents?
02:32:51 I don’t know.
02:32:51 Like is money getting in the way?
02:32:54 Does it help?
02:32:56 Is the timing important?
02:32:57 Is there some kind of wisdom you can give there
02:33:00 because you were exceptionally successful very quickly?
02:33:06 Funding helps as long as it’s from the right people.
02:33:09 That includes yourself.
02:33:10 And I’ll talk about myself funding myself in a second,
02:33:13 which is like, because I can fund myself
02:33:15 doing whatever projects I can do,
02:33:18 I don’t really have another person putting pressure on me
02:33:20 except for myself and that creates strange dynamics, right?
02:33:24 But let’s like talk about people getting funding
02:33:27 from a venture capitalist initially.
02:33:30 We raised money from Matt Cohler at Benchmark.
02:33:32 He’s brilliant, amazing guy, very thoughtful.
02:33:36 And he was very helpful early on.
02:33:39 But I have stories from entrepreneurs
02:33:41 where they raised money from the wrong person
02:33:42 or the wrong firm where incentives weren’t aligned.
02:33:46 They didn’t think in the same way
02:33:48 and bad things happened because of that.
02:33:51 The boardroom was always noisy.
02:33:53 There were fights, like we just never had that.
02:33:55 Matt was great.
02:33:57 I think like capital these days
02:33:59 is kind of a dime a dozen, right?
02:34:01 Like as long as you’re fundable,
02:34:03 like it seems like there’s money out there
02:34:05 is what I’m hearing.
02:34:08 If capital is plentiful, it’s really important
02:34:10 that you are aligned and that you think of raising money
02:34:12 as hiring someone for your team
02:34:15 rather than just taking money, right?
02:34:18 It provides a certain amount of pressure
02:34:20 to do the right thing that I think is healthy
02:34:23 for any startup.
02:34:24 And it keeps you real and honest
02:34:25 because they don’t wanna lose their money.
02:34:27 They’re paid to not lose their money.
02:34:29 The problem, maybe I could depersonalize it,
02:34:32 but like I remember having lunch with Elon.
02:34:35 It’s only happened once.
02:34:37 And I asked him, like I was trying to figure out
02:34:39 what I was doing after Instagram, right?
02:34:42 And I asked him something about like angel investing.
02:34:44 And he looked at me with a straight face.
02:34:46 He was like, why the F would I do that?
02:34:48 Like, why?
02:34:48 I was like, I don’t know.
02:34:50 Like you’re connected.
02:34:51 Like seems like he’s like, I only invest in myself.
02:34:54 I was like, Ooh, okay.
02:34:56 You know, like, note the confidence.
02:34:59 I was just like, what a novel idea.
02:35:00 It’s like, yeah, if you have money,
02:35:03 like why not just put it against your bag
02:35:06 and like enable your visiting Mars or something, right?
02:35:11 Like that’s awesome, great.
02:35:12 But I had never really thought of it that way.
02:35:14 But also with that comes an interesting dynamic
02:35:17 where you don’t actually have people
02:35:22 who are gonna lose that money telling you,
02:35:23 hey, don’t do this or, hey, you need to face this reality.
02:35:27 So you need to create other versions of that truth teller.
02:35:32 And whatever I do next,
02:35:34 that’s gonna be one of the interesting challenges
02:35:36 is how do you create that truth telling situation?
02:35:40 And that’s part of why, by the way,
02:35:41 I think someone like Jack, when you start Square,
02:35:43 you have money, but you still, you bring on partners
02:35:46 because I think it creates
02:35:48 a truth telling type environment.
02:35:51 I’m still trying to figure this out.
02:35:52 Like it’s an interesting dynamic.
02:35:56 So you’re thinking of perhaps launching some kind of venture
02:35:59 where you’re investing in yourself?
02:36:01 I mean, I’m 37 going on 38 next month.
02:36:06 I have a long life to live.
02:36:07 I’m definitely not gonna sit on the beach, right?
02:36:10 So I’m gonna do something at some point
02:36:13 and I gotta imagine I will like help fund it, right?
02:36:18 So the other way of thinking about this
02:36:20 is you can park your money in the S&P,
02:36:21 and that’s not bad
02:36:22 because the S&P has done wonderfully well last year, right?
02:36:26 Or you can invest in yourself.
02:36:27 And if you’re not gonna invest in yourself,
02:36:30 you probably shouldn’t do a startup.
02:36:32 It’s kind of the way of thinking about it.
02:36:35 And you can invest in yourself in the way Elon does,
02:36:37 which is basically go all in on this investment.
02:36:41 Maybe that’s one way to achieve accountability
02:36:43 is like you’re kind of screwed if you fail.
02:36:46 Yeah, that’s, yeah.
02:36:49 I personally like that.
02:36:50 I like burning bridges behind me
02:36:52 so that I’m fucked if it fails.
02:36:57 It’s really important though.
02:37:00 One of the things I think Mark said to me early on
02:37:03 that sticks with me that I think is true.
02:37:06 We were talking about people
02:37:07 who had left like operating roles
02:37:10 and started doing venture or something.
02:37:11 He was like, a lot of people convince themselves
02:37:13 they work really hard.
02:37:14 Like they think they work really hard
02:37:15 and they put on the show
02:37:17 and in their minds they work really hard,
02:37:19 but they don’t work very hard.
02:37:22 There is something about lighting a fire underneath you
02:37:24 and burning bridges such that you can’t turn back.
02:37:28 That I think, we didn’t talk about this specifically,
02:37:31 but I think you’re right.
02:37:32 There is, you need to have that
02:37:34 because there’s the self-delusion at a certain scale.
02:37:39 Oh, I have so many board calls.
02:37:40 Oh, like we have all these things to figure out.
02:37:43 It’s like, this is one of the hard parts
02:37:45 about it being an operator.
02:37:47 It’s like, there are so many people
02:37:50 that have made a lot of money not operating,
02:37:52 but operating is just one of the hardest things on earth.
02:37:55 It is just so effing hard.
02:37:58 It is stressful.
02:37:59 It is, you’re dealing with real humans,
02:38:01 not just like throwing capital in and hoping it grows.
02:38:04 I’m not undermining the VC mindset.
02:38:06 I think it’s a wonderful thing and needed
02:38:08 and so many wonderful VCs I’ve worked with.
02:38:11 But yeah, like when your ass is on the line
02:38:14 and it’s your money, it’s…
02:38:18 Talk to me in 10 years, we’ll see how it goes.
02:38:21 Yeah, but like you were saying, that is a source.
02:38:23 When you wake up in the morning
02:38:24 and you look forward to the day full of challenges,
02:38:28 that’s also where you can find happiness.
02:38:31 Let me ask you about love and friendship.
02:38:32 Sure.
02:38:33 What’s the role in this heck of a difficult journey
02:38:36 you have been on of love, of friendship?
02:38:41 What’s the role of love in the human condition?
02:38:45 Well, first things first,
02:38:47 the woman I married, my wife, Nicole,
02:38:49 there’s no way I could do what I do if we weren’t together.
02:38:53 She had the filter idea.
02:38:55 Yeah, yeah, exactly.
02:38:56 We didn’t go over that story.
02:38:59 Everything is a partnership, right?
02:39:01 And to achieve great things,
02:39:03 it’s not about like someone pulling their weight in places.
02:39:06 Like it’s not like someone’s supporting you
02:39:08 so that you could do this other thing.
02:39:11 It’s literally like,
02:39:14 Mike and I and our partnership as cofounders is fascinating
02:39:18 because I don’t think Instagram would have happened
02:39:20 without that partnership.
02:39:21 Like either him or me alone, no way.
02:39:25 We pushed and pulled each other in a way
02:39:28 that allowed us to build a better thing because of it.
02:39:32 Nicole, she pushed me to work on the filters early on.
02:39:35 And yes, that’s exciting.
02:39:36 It’s a fun story, right?
02:39:38 But the truth of it is being able to like level
02:39:41 with someone about how hard the process is
02:39:44 and have someone see you for who you are before Instagram
02:39:49 and know that there’s a constant you throughout all of this
02:39:53 and be able to call you when you’re drifting from that,
02:39:55 but also support you when you’re trying to stick with that.
02:39:58 That’s, I mean, that’s true friendship slash love,
02:40:02 whatever you want to call it.
02:40:04 But it also helped for someone not to care.
02:40:07 I remember Nicole saying,
02:40:08 hey, like I know you’re going to do this Instagram thing.
02:40:10 You should, I guess it was bourbon at the time.
02:40:12 You should do it because, you know,
02:40:15 even if it doesn’t work,
02:40:16 we can move to like a smaller apartment and it’ll be fine.
02:40:20 Like we’ll make it work.
02:40:22 How beautiful is that, right?
02:40:24 That’s almost like a superpower
02:40:25 that gives you permission to fail.
02:40:27 And somehow that actually leads to success.
02:40:29 But also she’s like the least impressed
02:40:31 about Instagram of anyone.
02:40:33 She’s like, yeah, it’s great.
02:40:34 But like, I love you for you.
02:40:36 Like, I like that you’re like a decent cook.
02:40:38 That’s beautiful.
02:40:39 That’s beautiful with the Gantt chart and Thanksgiving,
02:40:42 which I still think is a brilliant effing idea.
02:40:44 Thank you.
02:40:46 Big, ridiculous question.
02:40:48 Have you, you’re old and wise at this stage.
02:40:53 So have you discovered meaning to this whole thing?
02:40:55 Why the hell are we descendants of apes here on earth?
02:40:59 What’s the meaning of it?
02:41:00 What’s the meaning of life?
02:41:01 I haven’t.
02:41:02 And I am, so the crazy,
02:41:06 so the best learning for me has been like,
02:41:09 no matter what level of success you achieve,
02:41:12 you’re still worried about similar things,
02:41:14 just maybe on a slightly different scale.
02:41:16 You’re still concerned about the same thing.
02:41:18 You’re still self conscious about the same things.
02:41:21 Just like, and actually that moment going through that
02:41:26 is what makes you believe there’s gotta be like
02:41:29 more machinery to life or purpose to life.
02:41:31 And that we’re all chasing these materialistic things,
02:41:35 but you start realizing like,
02:41:38 it’s almost like, you know, the Truman Show
02:41:39 when he gets to the edge and he like knocks against it.
02:41:42 He’s like, what?
02:41:43 Like there’s this awakening that happens
02:41:45 when you get to that edge that you realize,
02:41:47 oh, like sure, it’s great.
02:41:49 It’s great that we all chase money and fame and success.
02:41:53 But you hit the edge and I’m not even claiming
02:41:56 I hit an edge like Elon’s hit an edge.
02:41:58 Like there’s clearly larger scales.
02:42:00 But what’s cool is you learn that,
02:42:02 like it doesn’t actually matter
02:42:03 and that there are all these other things that truly matter.
02:42:07 That’s not a case for working less hard.
02:42:09 That’s not a case for taking it easy.
02:42:11 That’s not a case for the four day work week.
02:42:13 What that is a case for is designing your life
02:42:16 exactly the way you want to design it.
02:42:18 Cause I don’t know, I think we go around the earth,
02:42:22 you know, the sun a certain number of times
02:42:25 and then we die and then that’s it.
02:42:27 That’s me.
02:42:28 Are you afraid of that moment?
02:42:30 No, not at all.
02:42:31 In fact, or at least not yet.
02:42:36 Listen, I’m like a pilot, like I do crazy things
02:42:39 and I like, no, I like, if anything, I’m like,
02:42:43 oh, I got to choose mindfully and purposefully
02:42:49 the thing I am doing right now and not just fall into it
02:42:54 because you’re going to wake up one day and ask yourself
02:42:55 why the hell you spent the last 10 years doing X, Y or Z?
02:42:58 Yeah.
02:42:59 So I guess my like shorter answer to this is
02:43:03 doing things on purpose because you choose to do them.
02:43:08 So important in life and not just like floating
02:43:11 down the river of life, hitting branches along the way
02:43:14 cause you will hit branches, right?
02:43:17 But rather like literally plotting a course
02:43:19 and not having a 10 year plan,
02:43:21 but just choosing every day to opt in.
02:43:23 That I think has been more like,
02:43:28 I haven’t figured out the meaning of life
02:43:29 by any stretch of the imagination,
02:43:31 but it certainly isn’t money and it certainly isn’t fame
02:43:33 and it certainly isn’t travel.
02:43:35 And it’s like, and it’s way more of like opting
02:43:37 into the game you love playing.
02:43:40 Every day opting in.
02:43:41 Just opting in and like, don’t let it happen.
02:43:44 You opt in.
02:43:46 Kevin, it’s great to end on love and the meaning of life.
02:43:51 This was an amazing conversation.
02:43:53 It was a lot of fun, thank you.
02:43:54 You gave me like a light into some fascinating aspects
02:43:57 of this technical world.
02:44:00 And I honestly can’t wait to see what you do next.
02:44:04 Thank you so much.
02:44:05 Thanks for having me.
02:44:07 Thanks for listening to this conversation
02:44:09 with Kevin Systrom.
02:44:10 To support this podcast, please check out our sponsors
02:44:13 in the description.
02:44:14 And now let me leave you with some words
02:44:16 from Kevin Systrom himself.
02:44:19 Focusing on one thing and doing it really, really well
02:44:24 can get you very far.
02:44:25 Thank you for listening and hope to see you next time.