Transcript
00:00:00 If you read a half hour a night,
00:00:01 the calculation I came to is that you can read
00:00:03 a thousand books in 50 years.
00:00:05 All of the components are there
00:00:07 to engineer intimate experiences.
00:00:09 Extraterrestrial life is a true mystery,
00:00:11 the most tantalizing mystery of all.
00:00:13 How many humans need to disappear
00:00:15 for us to be completely lost?
00:00:19 The following is a conversation with Tim Urban,
00:00:22 author and illustrator of the amazing blog
00:00:25 called Wait But Why.
00:00:28 This is the Lex Fridman Podcast.
00:00:30 To support it, please check out our sponsors
00:00:32 in the description.
00:00:33 And now, dear friends, here’s Tim Urban.
00:00:38 You wrote a Wait, But Why blog post
00:00:41 about the big and the small,
00:00:42 from the observable universe to the atom.
00:00:45 What world do you find most mysterious or beautiful,
00:00:48 the very big or the very small?
00:00:50 The very small seems a lot more mysterious.
00:00:54 And I mean, the very big I feel like we kind of understand.
00:00:56 I mean, not the very, very big.
00:00:58 Not the multiverse, if there is a multiverse,
00:01:01 not anything outside of the observable universe.
00:01:05 But the very small,
00:01:07 I think we really have no idea what’s going on,
00:01:09 or very much less idea.
00:01:11 But I find that,
00:01:12 so I think the small is more mysterious,
00:01:13 but I think the big is sexier.
00:01:17 I just cannot get enough of the bigness of space
00:01:23 and the farness of stars.
00:01:26 And it just continually blows my mind.
00:01:29 I mean, we still,
00:01:30 the vastness of the observable universe
00:01:33 has the mystery that we don’t know what’s out there.
00:01:35 We know how it works, perhaps.
00:01:38 Like, general relativity can tell us
00:01:40 how the movement of bodies works,
00:01:42 how they’re born, all that kind of things.
00:01:45 But like, how many civilizations are out there?
00:01:49 How many, like, what are the weird things that are out there?
00:01:51 Oh yeah, life.
00:01:52 Well, extraterrestrial life is a true mystery.
00:01:54 The most tantalizing mystery of all.
00:01:58 But that’s like our size.
00:02:00 So that’s maybe it’s that the actual,
00:02:02 the big and the small are really cool,
00:02:04 but it’s actually the things that are potentially our size
00:02:06 that are the most tantalizing.
00:02:07 Potentially our size is probably the key word.
00:02:10 Yeah, I mean,
00:02:11 I wonder how small intelligent life could get.
00:02:13 Probably not that small.
00:02:15 And I assume that there’s a limit that you’re not gonna,
00:02:18 I mean, you might have like a whale,
00:02:19 blue whale size intelligent being,
00:02:21 that would be kind of cool.
00:02:23 But I feel like we’re in the range of order of magnitude
00:02:27 smaller and bigger than us for life.
00:02:28 But maybe not.
00:02:29 Maybe you could have some giant life form.
00:02:32 Just seems like, I don’t know,
00:02:34 there’s gotta be some reason that anything intelligent
00:02:36 between kind of like a little tiny rodent
00:02:39 or finger monkey up to a blue whale on this planet.
00:02:43 I don’t know.
00:02:44 Maybe when you change the gravity and other things.
00:02:47 Well, you could think of life
00:02:49 as a thing of self assembling organisms
00:02:52 and they just get bigger and bigger and bigger.
00:02:54 Like there’s no such thing as a human being.
00:02:55 A human being is made up of a bunch of tiny organisms
00:02:58 working together.
00:03:00 And we somehow envision that as one entity
00:03:03 because it has consciousness.
00:03:04 But maybe it’s just organisms on top of organisms.
00:03:08 Organisms all the way down, turtles all the way down.
00:03:10 So like earth can be seen as an organism
00:03:13 for people, for alien species that’s very different.
00:03:18 Like why is the human the fundamental entity
00:03:21 that is living and then everything else
00:03:24 is just either a collection of humans
00:03:26 or components of humans?
00:03:28 I think it kind of is. If you think about it,
00:03:30 I think of like an emergence elevator.
00:03:32 And so you’ve got an ant is on one floor
00:03:36 and then the ant colony is a floor above.
00:03:39 Or maybe there’s even units within the colony
00:03:41 that’s one floor above and the full colony
00:03:42 is two floors above.
00:03:44 And to me, I think that it’s the colony
00:03:46 that is closest to being the animal.
00:03:49 It’s like the individual thing that competes with others
00:03:53 while the individual ants are like cells
00:03:57 in the animal’s body.
00:03:59 We are more like a colony in that regard.
00:04:01 But the humans are weird because we kind of,
00:04:04 I think of it, if emergence happens in an emergence tower,
00:04:08 where you’ve got kind of, as I said,
00:04:09 cells and then humans and communities and societies.
00:04:13 Ants are very specific.
00:04:15 The individual ants are always cooperating
00:04:16 with each other for the sake of the colony.
00:04:19 So the colony is this unit that is the competitive unit.
00:04:22 Humans can kind of go,
00:04:24 we take the elevator up and down
00:04:25 emergence tower psychologically.
00:04:27 Sometimes we are individuals
00:04:29 that are competing with other individuals
00:04:31 and that’s where our mindset is.
00:04:33 And then other times we get in this crazy zone,
00:04:35 you know, a protest or a sporting event
00:04:37 and you’re just chanting and screaming
00:04:40 and doing the same hand motions
00:04:41 with all these other people and you feel like one.
00:04:43 You feel like one, you know, and you’d sacrifice yourself.
00:04:45 And now that’s what, you know, soldiers.
00:04:46 And so our brains can kind of psychologically
00:04:50 go up and down this elevator in an interesting way.
00:04:53 Yeah, I wonder how much of that
00:04:55 is just the narrative we tell ourselves.
00:04:57 Maybe we are just like an ant colony.
00:04:59 We’re just collaborating always,
00:05:00 even in our stories of individualism,
00:05:04 of like the freedom of the individual,
00:05:05 like this kind of isolation,
00:05:09 lone man on an island kind of thing.
00:05:11 We’re actually all part of this giant network
00:05:13 of maybe one of the things that makes humans who we are
00:05:17 is probably deeply social,
00:05:21 the ability to maintain
00:05:22 not just the single human intelligence,
00:05:24 but like a collective intelligence.
00:05:26 And so this feeling like individual
00:05:29 is just because we woke up at this level of the hierarchy.
00:05:33 So we make it special,
00:05:34 but we very well could be just part of the ant colony.
00:05:39 This whole conversation,
00:05:40 I’m either going to be doing a Shakespearean analysis
00:05:44 of your Twitter, your writing,
00:05:46 or very specific statements that you’ve made.
00:05:49 So you’ve written answers to a mailbag of questions.
00:05:54 The questions were amazing, the ones you’ve chosen,
00:05:57 and your answers were amazing.
00:05:58 So on this topic of the big and the small,
00:06:00 somebody asked, are we bigger than we are small?
00:06:03 Or smaller than we are big?
00:06:05 Who’s asking these questions?
00:06:06 This is really good.
00:06:08 You have amazing fans.
00:06:09 Okay, so where do we sit at this level
00:06:15 of the very small to the very big?
00:06:18 So are we bigger or are we small?
00:06:20 Are we bigger than we are small?
00:06:21 I think it depends on what we’re asking here.
00:06:23 So if we’re talking about the biggest thing
00:06:26 that we kind of can talk about without just imagining
00:06:31 is the observable universe, the Hubble sphere.
00:06:34 And that’s about 10 to the 26th meters in diameter.
00:06:41 The smallest thing we talk about is a Planck length,
00:06:44 but you could argue that that’s kind of an imaginary thing.
00:06:47 But that’s 10 to the negative 35.
00:06:49 Now we’re about, conveniently, about 10 to the one.
00:06:52 Not quite, 10 to the zero.
00:06:53 We’re about 10 to the zero meters long.
00:06:56 So it’s easy because you can just look and say,
00:06:59 okay, well, for example, atoms are like 10
00:07:03 to the negative 15th or 10 to the negative 16th meters
00:07:07 across, right?
00:07:08 If you go up 10 to the 15th or 10 to the 16th
00:07:10 in the other direction,
00:07:12 so an atom is to us as we are to this thing,
00:07:13 you get to like nebulas, smaller than a galaxy
00:07:16 and bigger than the biggest star.
00:07:18 So we’re right in between nebula and an atom.
00:07:22 Now, if you wanna go down to quark level,
00:07:24 you might be able to get up to galaxy level.
00:07:27 When you go up to the observable universe,
00:07:29 you’re getting down on the small side
00:07:31 to things that we, I think, are mostly theoretically
00:07:35 imagining are there and hypothesizing are there.
00:07:39 So I think as far as real world objects
00:07:43 that we really know a lot about,
00:07:45 I would say we are smaller than we are big.
00:07:48 But if you wanna go down to the Planck length,
00:07:50 we’re very quickly, we’re bigger than we are small.
00:07:52 If you think about strings.
00:07:54 Yeah, strings, exactly, string theory and so on.
00:07:57 That’s interesting.
00:07:58 But I think like you answered,
00:08:00 no matter what, we’re kind of middleish.
00:08:02 Yeah, I mean, here’s something cool.
00:08:04 If a human is a neutrino, and again, neutrino,
00:08:06 the size doesn’t really make sense.
00:08:08 It’s not really a size.
00:08:09 But when we talk about some of these neutrinos,
00:08:11 I mean, if a neutrino is a human, a proton is the sun.
00:08:15 So that’s like, I mean, a proton is real small,
00:08:18 like really small.
00:08:20 And so, yeah, the small gets like crazy small very quickly.
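Tim's back-of-the-envelope comparison here is easy to check with a few lines of arithmetic. This is a minimal sketch using round order-of-magnitude sizes; the figures (including the ~10^-15 m atom used in the conversation) are rough assumptions, not precise measurements:

```python
import math

# Rough order-of-magnitude sizes in meters (assumptions, not precise values)
observable_universe = 1e26   # diameter of the Hubble sphere, ~10^26 m
planck_length = 1e-35        # ~10^-35 m
human = 1e0                  # ~10^0 m, as in the conversation
atom = 1e-15                 # the figure used in the conversation

# On a log scale, "bigger than we are small" means more orders of
# magnitude below us than above us.
orders_above = math.log10(observable_universe / human)  # 26
orders_below = math.log10(human / planck_length)        # 35

print(f"orders of magnitude above us: {orders_above:.0f}")
print(f"orders of magnitude below us: {orders_below:.0f}")

# The "an atom is to us as we are to X" scaling: multiply our size
# by the same factor that separates us from an atom.
x = human * (human / atom)   # ~10^15 m, roughly nebula-scale
print(f"us-to-atom ratio scaled up from us: {x:.0e} m")
```

So down to the Planck length we are bigger than we are small (35 orders below us versus 26 above), but restricted to well-understood physical objects the balance flips, which is the distinction being drawn here.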
00:08:25 Let’s talk about aliens.
00:08:30 We already mentioned it.
00:08:32 Let’s start just by with the basic,
00:08:34 what’s your intuition as of today?
00:08:36 This is a thing that could change day by day.
00:08:38 But how many alien civilizations out there?
00:08:40 Is it zero?
00:08:42 Is it a handful?
00:08:45 Is it almost endless, like the observable universe
00:08:50 or the universe is teeming with life?
00:08:52 If I had a gun to my head and I had to take a guess,
00:08:54 I would say it’s teeming with life.
00:08:56 I would say there is.
00:08:57 I think running a Monte Carlo simulation,
00:09:00 this paper by Anders Sandberg and Drexler
00:09:03 and a few others a couple of years ago,
00:09:05 I think you probably know about it.
00:09:07 I think the mean,
00:09:12 running through randomized
00:09:16 Drake equation multiplications,
00:09:20 you end up with 27 million as the mean number
00:09:23 of intelligent civilizations in the galaxy,
00:09:26 in the Milky Way alone.
00:09:29 And so then if you go outside the Milky Way,
00:09:30 that would turn into trillions.
00:09:32 That’s the mean.
00:09:34 Now, what’s interesting is that there’s a long tail
00:09:37 because of what they believe about some of these
00:09:40 multipliers in the Drake equation.
00:09:40 So for example, the probability that life starts
00:09:45 in the first place,
00:09:46 they think that the kind of range that we use
00:09:51 for that variable is way too small.
00:09:54 And that’s constraining our possibilities.
00:09:57 And if you actually extend it to some crazy number
00:10:00 of orders of magnitude, like 200,
00:10:03 which is what they think that variable should span,
00:10:06 you get this long tail where,
00:10:08 I forget the exact number,
00:10:09 but it’s like a third or a quarter
00:10:11 of the total outcomes have us alone.
00:10:14 I think it’s a sizable percentage has us
00:10:18 as the only intelligent life in the galaxy,
00:10:22 but you can keep going.
00:10:23 And I think there’s like a non zero,
00:10:25 like legitimate amount of outcomes there
00:10:28 that have us as the only life
00:10:30 in the observable universe at all is on earth.
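The Monte Carlo approach described here can be sketched in a few lines. This is a toy version, not the actual Sandberg/Drexler/Ord model: every parameter range below is an illustrative assumption, chosen only to show how a huge log-uniform range on the life-formation probability produces both a large mean and a sizable fraction of "we're alone" outcomes:

```python
import math
import random

def sample_drake(rng: random.Random) -> float:
    """Draw one randomized Drake-equation product: an estimate of the
    number of communicating civilizations in the galaxy. Ranges are
    illustrative assumptions, not the paper's fitted distributions."""
    def log_uniform(lo: float, hi: float) -> float:
        # Sample uniformly in log space, across many orders of magnitude.
        return 10 ** rng.uniform(math.log10(lo), math.log10(hi))

    r_star = rng.uniform(1, 10)      # star formation rate per year
    f_p = rng.uniform(0.1, 1)        # fraction of stars with planets
    n_e = rng.uniform(0.1, 5)        # habitable planets per such star
    f_l = log_uniform(1e-30, 1)      # probability life starts (huge range)
    f_i = log_uniform(1e-3, 1)       # life -> intelligence
    f_c = rng.uniform(0.1, 1)        # intelligence -> detectable signals
    lifetime = log_uniform(1e2, 1e8) # years a civilization keeps signaling
    return r_star * f_p * n_e * f_l * f_i * f_c * lifetime

rng = random.Random(0)
samples = [sample_drake(rng) for _ in range(100_000)]

mean = sum(samples) / len(samples)
frac_alone = sum(s < 1 for s in samples) / len(samples)
print(f"mean civilizations per galaxy: {mean:.3g}")
print(f"fraction of draws with N < 1 (plausibly alone): {frac_alone:.2f}")
</antml>```

The point the paper makes falls out of the shape of the distribution: the mean is dominated by a long high tail, while a large share of individual draws still lands below one civilization per galaxy.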
00:10:32 I mean, it seems incredibly counterintuitive.
00:10:35 It seems like, you mentioned that people think
00:10:37 you must be an idiot because if you picked up one grain
00:10:41 of sand on a beach and examined it
00:10:42 and you found all these little things on it,
00:10:44 it’s like saying, well, maybe this is the only one
00:10:46 that has that.
00:10:47 And it’s like, probably not.
00:10:48 They’re probably most of the sand probably
00:10:50 or a lot of the sand, right?
00:10:52 So, and then on the other hand, we don't see anything.
00:10:54 We don’t see any evidence, which of course,
00:10:56 people would say that the people who scoff
00:10:59 at the concept that we’re potentially alone,
00:11:03 they say, well, of course, there’s lots of reasons
00:11:05 we wouldn’t have seen anything.
00:11:06 And they can go list them and they’re very compelling,
00:11:09 but we don’t know.
00:11:11 And the truth is,
00:11:14 if this were a completely freak thing that happened here,
00:11:17 whether it’s life at all or just getting
00:11:19 to this level of intelligence,
00:11:23 that species, whoever it was, would think
00:11:25 there must be lots of us out there and they’d be wrong.
00:11:28 So just being, again, using the same intuition
00:11:30 that most people would use, I’d say there’s probably lots
00:11:32 of other things out there.
00:11:34 Yeah, and you wrote a great blog post about it.
00:11:36 But to me, the two interesting reasons
00:11:40 that we haven’t been in contact, I too have an intuition
00:11:44 that the universe is teeming with life.
00:11:47 So one interesting is around the great filter.
00:11:50 So we either, the great filter’s either behind us
00:11:54 or in front of us.
00:11:55 So the reason that’s interesting is you get to think
00:11:58 about what kind of things
00:12:01 ensure the survival of an intelligent civilization
00:12:05 or lead to the destruction of intelligent civilization.
00:12:08 That’s a very pragmatic, very important question
00:12:10 to always be asking.
00:12:11 And we’ll talk about some of those.
00:12:13 And then the other one is I’m saddened by the possibility
00:12:18 that there could be aliens communicating
00:12:20 with us all the time.
00:12:21 In fact, they may have visited.
00:12:25 And we’re just too dumb to hear it, to see it.
00:12:29 Like the idea that the kind of life that can evolve
00:12:34 is just the range of life that can evolve is so large
00:12:38 that our narrow view of what is life
00:12:41 and what is intelligent life is preventing us
00:12:43 from having communication with them.
00:12:45 But then they don’t seem very smart
00:12:47 because if they were trying to communicate with us,
00:12:50 they would surely, if they were super intelligent,
00:12:53 they would be very, I’m sure if there’s lots of life,
00:12:56 we’re not that rare, we’re not some crazy weird species
00:12:59 that hears and has different kinds of ways
00:13:01 of perceiving signals.
00:13:04 So they would probably be able to,
00:13:06 if you really wanted to communicate
00:13:08 with an earth like species, with a human like species,
00:13:12 you would send out all kinds of things.
00:13:14 You’d send out radio waves and you send out gravity waves
00:13:18 and lots of things.
00:13:20 So if they’re communicating in a way,
00:13:22 they’re trying to communicate with us
00:13:23 and it’s just we’re too dumb to perceive the signals,
00:13:25 it’s like, well, they’re not doing a great job
00:13:28 of considering the primitive species we might be.
00:13:32 So I don’t know, I think if a super intelligent species
00:13:35 wanted to get in touch with us and had the capability of,
00:13:40 I think probably they would.
00:13:43 Well, they may be getting in touch with us,
00:13:48 they’re just getting in touch with the thing
00:13:50 that we humans are not understanding
00:13:52 that they’re getting in touch with us with.
00:13:53 I guess that’s what I was trying to say is
00:13:55 there could be something about earth
00:13:57 that’s much more special than us humans.
00:14:00 Like the nature of the intelligence that’s on earth
00:14:02 or the thing that’s of value and that’s curious
00:14:07 and that’s complicated and fascinating and beautiful
00:14:10 might be something that’s not just like tweets, okay?
00:14:15 Like English language that’s interpretable
00:14:17 or any kind of language or any kind of signal,
00:14:19 whether it’s gravity or radio signal
00:14:22 that humans seem to appreciate.
00:14:24 Why not the actual, it could be the process
00:14:28 of evolution itself.
00:14:29 There could be something about the way
00:14:31 that earth is breathing essentially
00:14:33 through the creation of life
00:14:34 and this complex growth of life.
00:14:37 There’s like, it’s a whole different way
00:14:40 to view organisms and view life
00:14:42 that could be getting communicated with.
00:14:44 And we humans are just a tiny fingertip
00:14:47 on top of that intelligence.
00:14:49 And the communication is happening
00:14:50 with the main mothership of earth
00:14:54 versus us humans that seem to treat ourselves
00:14:56 as super important and we’re missing the big picture.
00:15:00 I mean, it sounds crazy, but our understanding
00:15:03 of what is intelligent, of what is life,
00:15:05 what is consciousness is very limited.
00:15:08 And it seems to be, and just being very suspicious,
00:15:11 it seems to be awfully human centric.
00:15:14 Like this story, it seems like the progress of science
00:15:17 is constantly moving humans down
00:15:22 in the ranking of cosmic importance,
00:15:28 of how big we are, how important we are.
00:15:32 That seems to be the more we discover
00:15:33 that’s what’s happening.
00:15:34 And I think science is very young.
00:15:37 And so I think eventually we might figure out
00:15:39 that there’s something much, much bigger going on,
00:15:41 that humans are just a curious little side effect
00:15:45 of the much bigger thing.
00:15:46 That’s what, I mean, that, as I’m saying,
00:15:49 it just sounds insane, but.
00:15:50 Well, it just, it sounds a little like religious.
00:15:54 It sounds like a spiritual, it gets to that realm
00:15:59 where there’s something that more than meets the eye.
00:16:02 Well, yeah, but not, so religious and spiritual,
00:16:07 often have this kind of woo-woo characteristic,
00:16:11 like when people write books about them,
00:16:12 then go to wars over whatever the heck
00:16:14 is written in those books.
00:16:16 I mean, more like it’s possible that collective intelligence
00:16:19 is more important than individual intelligence, right?
00:16:22 It’s the ant colony.
00:16:23 What’s the primal organism?
00:16:24 Is it the ant colony or is it the ant?
00:16:26 Yeah, I mean, humans, just like any individual ant
00:16:29 can’t do shit, but the colony can do,
00:16:32 make this incredible structures and has this intelligence.
00:16:35 And we’re exactly the same.
00:16:36 I mean, you know the famous thing that no one,
00:16:40 no human knows how to make a pencil.
00:16:43 Have you heard this?
00:16:44 No. Basically, I mean.
00:16:45 This is great.
00:16:47 There’s not, a single human out there
00:16:49 has absolutely no idea how to make a pencil.
00:16:51 So you have to think about, you have to get the wood,
00:16:54 the paint, the different chemicals
00:16:56 that make up the yellow paint.
00:16:57 The eraser is a whole other thing.
00:16:59 The metal has to be mined from somewhere
00:17:01 and then the graphite, whatever that is.
00:17:04 And there’s not one person on earth
00:17:06 who knows how to kind of collect all those materials
00:17:10 and create a pencil, but together,
00:17:12 that’s one of the, that’s child’s play.
00:17:14 It’s just one of the easiest things.
00:17:15 So, you know, the other thing I like to think about,
00:17:18 I actually put this as a question on the blog once.
00:17:22 There’s a thought experiment
00:17:24 and I actually wanna hear what you think.
00:17:26 So if a witch, kind of a dickish witch comes around
00:17:30 and she says, I’m gonna cast a spell on all of humanity
00:17:35 and all material things that you’ve invented
00:17:39 are gonna disappear all at once.
00:17:41 So suddenly we’re all standing there naked.
00:17:42 There’s no buildings.
00:17:44 There’s no cars and boats and ships and no mines,
00:17:48 nothing, right?
00:17:49 It’s just the stone age earth and a bunch of naked humans,
00:17:52 but we’re all the same, we have the same brain.
00:17:54 So we’re all know what’s going on.
00:17:55 And we all got a note from her, so we understand the deal.
00:17:57 And she says, she communicated to every human,
00:18:00 here’s the deal, you lost all your stuff.
00:18:04 You guys need to make one working iPhone 13.
00:18:07 If you make one working iPhone 13
00:18:09 that could pass in the Apple Store today,
00:18:11 you know, in your previous world,
00:18:13 then I will restore everything.
00:18:16 How long do you think?
00:18:17 And so everyone knows this is the mission.
00:18:19 We’re all aware of the mission, everyone, all humans.
00:18:22 How long would it take us?
00:18:23 That’s a really interesting question.
00:18:25 So obviously if you do a random selection
00:18:28 of 100 or a thousand humans within the population,
00:18:31 I think you’re screwed to make that iPhone.
00:18:34 I tend to believe that there’s fascinating specialization
00:18:39 among the human civilization.
00:18:40 Like there’s a few hackers out there
00:18:43 that can like solo build an iPhone.
00:18:46 But with what materials?
00:18:48 So no materials whatsoever.
00:18:51 It has to, I mean, it’s virtually, I mean, okay.
00:18:54 You have to build factories.
00:18:56 I mean, to fabricate.
00:19:00 Okay.
00:19:02 And how are you gonna mine them?
00:19:03 You know, you gotta mine the materials
00:19:04 where you don’t have any cranes.
00:19:05 You don’t have any, you know.
00:19:06 Okay, you 100% have to have the, everybody’s naked.
00:19:10 Everyone’s naked and everyone’s where they are.
00:19:12 So you and I would currently be naked.
00:19:14 It’s on the ground in what used to be Manhattan.
00:19:16 So no building.
00:19:17 No, grassy island.
00:19:18 Yeah.
00:19:19 So you need a naked Elon Musk type character
00:19:22 to then start building a company.
00:19:23 You have to have a large company then.
00:19:25 Right.
00:19:26 He doesn’t even know where he, you know,
00:19:27 where is everyone?
00:19:28 You know, oh shit, how am I gonna find
00:19:29 other people I need to talk to?
00:19:30 But we have all the knowledge of.
00:19:32 Yeah, everyone has the knowledge
00:19:33 that’s in their current brains.
00:19:34 Yeah.
00:19:34 I’ve met some legit engineers.
00:19:38 Crazy polymath people.
00:19:39 Yeah, but the actual labor of,
00:19:42 cause you said, cause like the original Mac,
00:19:47 like the Apple II, that can be built.
00:19:50 But.
00:19:51 Even that, you know.
00:19:52 Even that’s gonna be tough.
00:19:53 Well, I think part of it is a communication problem.
00:19:55 If you could suddenly have, you know, someone,
00:19:56 if everyone had a walkie talkie
00:19:58 and there was, you know, a couple, you know,
00:19:59 10 really smart people were designated the leaders,
00:20:01 they could say, okay, I want, you know,
00:20:02 everyone who can do this to walk west, you know,
00:20:05 until you get to this little hub and everyone else,
00:20:08 you know, and they could actually coordinate,
00:20:10 but we don’t have that.
00:20:11 So it’s like people just, you know,
00:20:12 and then what I think about is,
00:20:14 so you’ve got some people that are like trying to organize
00:20:15 and you’ll have a little community
00:20:17 where a couple hundred people have come together
00:20:18 and maybe a couple thousand have organized
00:20:20 and they designated one person, you know, as the leader
00:20:22 and then they have sub leaders and okay,
00:20:24 we have a start here.
00:20:25 We have some organization.
00:20:26 You’re also gonna have some people that say, good,
00:20:28 humans were scourged upon the earth and this is good.
00:20:31 And they’re gonna try to sabotage.
00:20:32 They’re gonna try to murder the people with the,
00:20:33 and who know what they’re talking about.
00:20:35 Yeah.
00:20:37 The elite that possessed the knowledge.
00:20:40 Well, and so everyone, maybe everyone’s hopeful
00:20:42 for the, you know, we’re all civilized and hopeful
00:20:44 for the first 30 days or something.
00:20:45 And then things start to fall off.
00:20:47 They, you know, people get, start to lose hope
00:20:48 and there’s new kinds of, you know,
00:20:50 new kinds of governments popping up, you know,
00:20:52 new kinds of societies and they’re, you know,
00:20:55 and they don’t play nicely with the other ones.
00:20:57 And I think very quickly,
00:20:59 I think a lot of people would just give up and say,
00:21:00 you know what, this is it.
00:21:01 We’re back in the stone age.
00:21:02 Let’s just create, you know, agrarian.
00:21:03 We don’t also don’t know how to farm.
00:21:05 No one knows how to farm.
00:21:06 There’s like, even the farmers, you know,
00:21:08 a lot of them are relying on their machines.
00:21:11 And so we also, you know, mass starvation.
00:21:14 And that, you know, when you’re trying to organize,
00:21:16 a lot of people are, you know, coming in with, you know,
00:21:18 spears they’ve fashioned and trying to murder everyone
00:21:21 who has food.
00:21:22 That’s an interesting question.
00:21:23 Given today’s society, how much violence would that be?
00:21:25 We’ve gotten softer, less violent.
00:21:28 And we don’t have weapons.
00:21:29 So that’s something. We don’t have weapons.
00:21:29 We have really primitive weapons now.
00:21:31 But we have, and also we have a kind of ethics
00:21:33 where murder is bad.
00:21:35 We used to be less, like human life was less valued
00:21:38 in the past.
00:21:39 So murder was more okay, like ethically.
00:21:41 But in the past, they also were really good
00:21:44 at figuring out how to have sustenance.
00:21:46 They knew how to get food and water.
00:21:48 We have no idea.
00:21:49 Like the ancient hunter gatherer societies would laugh
00:21:51 at what’s going on here.
00:21:52 They’d say, you guys, you don’t know what you’re,
00:21:54 none of you know what you’re doing.
00:21:55 Yeah.
00:21:56 And also the amount of people feeding this amount of people
00:21:59 in a very, in a stone age, you know, civilization,
00:22:02 that’s not gonna happen.
00:22:03 So New York and San Francisco are screwed.
00:22:05 Well, whoever’s not near water is really screwed.
00:22:07 So that’s, you’re near a river or freshwater river.
00:22:10 And you know, anyway, it’s a very interesting question.
00:22:12 And what it does, this and the pencil,
00:22:14 it makes me feel so grateful and like excited about like,
00:22:19 man, our civilization is so cool.
00:22:21 And this is, talk about collective intelligence.
00:22:24 Humans did not build any of this.
00:22:28 It’s collective human super,
00:22:30 collective humans is a super intelligent,
00:22:32 you know, being that is, that can do absolutely,
00:22:37 especially over a long period of time,
00:22:38 can do such magical things.
00:22:39 And we just get to be born, when I go out,
00:22:41 when I’m working and I’m hungry,
00:22:42 I just go click, click, click, and like a salad’s coming.
00:22:46 The salad arrives.
00:22:47 If you think about the incredible infrastructure
00:22:50 that’s in place for that, for that quickly,
00:22:52 or just the internet, you know, the electricity,
00:22:54 first of all, that's just powering the things,
00:22:55 the amount of structures
00:22:57 that have to be created for that electricity to be there.
00:23:00 And then you've got, of course, the internet,
00:23:02 and then you have this system where delivery drivers
00:23:05 who are riding bikes
00:23:06 that were made by someone else.
00:23:07 And they’re going to get the salad
00:23:09 and all those ingredients came from all over the place.
00:23:11 I mean, it’s just, so I think it’s like,
00:23:13 I like thinking about these things
00:23:15 because it makes me feel like just so grateful.
00:23:19 I’m like, man, it would be so awful if we didn’t have this.
00:23:21 And people who didn’t have it would think
00:23:22 this was such magic we live in and we do.
00:23:25 And like, cool, that’s fun.
00:23:27 Yeah, one of the most amazing things when I showed up,
00:23:29 I came here at 13 from the Soviet Union
00:23:32 and the supermarket was, people don’t really realize that,
00:23:36 but the abundance of food, it’s not even,
00:23:41 so bananas was the thing I was obsessed about.
00:23:43 I just ate bananas every day for many, many months
00:23:45 because they haven’t had bananas in Russia.
00:23:47 And the fact that you can have as many bananas as you want,
00:23:49 plus they were like somewhat inexpensive
00:23:53 relative to the other food.
00:23:56 The fact that you can somehow have a system
00:23:58 that brings bananas to you without having to wait
00:24:00 in a long line, all of those things,
00:24:02 it’s magic.
00:24:03 I mean, also imagine, so first of all,
00:24:06 the ancient hunter gatherers,
00:24:07 you picture them gathering and eating
00:24:08 all this fresh food, no.
00:24:10 So do you know what an avocado used to look like?
00:24:11 It was a little like a sphere and the fruit of it,
00:24:15 the actual avocado part was like a little tiny layer
00:24:18 around this big pit that took up almost the whole volume.
00:24:21 We’ve made crazy robot avocados today
00:24:25 that have nothing to do with like what they,
00:24:28 so same with bananas, these big, sweet, you know,
00:24:33 and not infested with bugs and, you know,
00:24:35 they used to eat the shittiest food
00:24:38 and they’re eating uncooked meat
00:24:40 or maybe they cook it and they’re just, it’s gross
00:24:42 and it’s things rot.
00:24:44 So you go to the supermarket and it’s just,
00:24:47 A, it’s like crazy super engineered cartoon food,
00:24:49 fruit and food.
00:24:51 And then it’s all this processed food,
00:24:52 which, you know, we complain about in our setting.
00:24:54 Oh, you know, we complain about, you know,
00:24:55 we need too much process.
00:24:56 That’s a, this is a good problem.
00:24:58 I mean, if you imagine what they would think,
00:25:00 oh my God, a cracker.
00:25:01 You know how delicious a cracker would taste to them?
00:25:04 You know, candy, you know, pasta and spaghetti.
00:25:08 They never had anything like this.
00:25:09 And then you have from all over the world,
00:25:11 I mean, things that are grown all over the place,
00:25:13 all here in nice little racks organized
00:25:16 and on a middle class salary,
00:25:17 you can afford anything you want.
00:25:20 I mean, it’s again, just like incredible gratitude.
00:25:23 Like, ah, yeah.
00:25:25 And the question is how resilient is this whole thing?
00:25:27 I mean, this is another darker version of your question is
00:25:31 if we keep all the material possessions we have,
00:25:35 but we start knocking out some percent of the population,
00:25:39 how resilient is the system that we built up
00:25:42 where we rely on other humans
00:25:43 and the knowledge that built up on the past,
00:25:45 the distributed nature of knowledge,
00:25:50 how much does it take?
00:25:52 How many humans need to disappear
00:25:55 for us to be completely lost?
00:25:57 Well, I’m trying to go off one thing,
00:25:59 which is Elon Musk says that he has this number,
00:26:02 a million in mind as the order of magnitude of people
00:26:05 you need to be on Mars to truly be multi planetary.
00:26:11 Multi planetary doesn’t mean, you know,
00:26:15 like when Neil Armstrong, you know, goes to the moon,
00:26:19 they call it a great leap for mankind.
00:26:21 It’s not a great leap for anything.
00:26:23 It is a great achievement for mankind.
00:26:26 And I always like think about if the first fish
00:26:29 to kind of go on land just kind of went up
00:26:31 and gave the shore a high five
00:26:33 and goes back into the water,
00:26:34 that’s not a great leap for life.
00:26:36 That’s a great achievement for that fish.
00:26:37 And there should be a little statue of that fish
00:26:38 and it’s, you know, in the water
00:26:39 and everyone should celebrate the fish.
00:26:41 But it’s, but when we talk about a great leap for life,
00:26:44 it’s permanent.
00:26:45 It’s something that now from now on, this is how things are.
00:26:48 So this is part of why I get so excited about Mars,
00:26:51 by the way, is because you can count on one hand,
00:26:53 like the number of great leaps that we’ve had,
00:26:57 you know, like no life to life and single cell
00:27:00 or simple cell to complex cell
00:27:02 and single cell organisms to animals to come,
00:27:06 you know, multi cell animals and then ocean to land
00:27:09 and then one planet to two planets, anyway, diversion.
00:27:12 But the point is that we are officially that leap
00:27:17 for all of life, you know, has happened
00:27:19 once the ships could stop coming from Earth
00:27:22 because there’s some horrible catastrophic World War III
00:27:24 and everyone dies on Earth and they’re fine
00:27:26 and they can turn that certain X number of people
00:27:29 into 7 billion, you know, population
00:27:31 that’s thriving just like Earth.
00:27:33 They can build ships, they can come back
00:27:34 and recolonize Earth
00:27:35 because now we are officially multi planetary
00:27:37 where it’s a self sustaining.
00:27:38 He says a million people is about what he thinks.
00:27:40 Now that might be a specialized group.
00:27:42 That’s a very specifically, you know,
00:27:44 selected million that has very, very skilled million people,
00:27:49 not just maybe the average million on Earth,
00:27:51 but I think it depends what you’re talking about.
00:27:53 But I don’t think, you know, so one million is one 7,000th,
00:27:56 one 8,000th of the current population.
00:27:58 I think you need a very, very, very small fraction
00:28:02 of humans on Earth to get by.
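The "one 7,000th, one 8,000th" fraction quoted here can be sanity-checked with quick arithmetic; a sketch, where the world-population figure is an assumed round number and not from the conversation:

```python
# Sanity-check the "one million is one 7,000th to one 8,000th of the
# current population" figure mentioned in the conversation.
WORLD_POPULATION = 7.9e9   # assumed approximate world population (~2021)
MARS_COLONY = 1_000_000    # Elon Musk's order-of-magnitude Mars number

fraction = WORLD_POPULATION / MARS_COLONY
print(f"One million people is about 1/{fraction:,.0f} of the population")
```

With a population between 7 and 8 billion, the result lands between 1/7,000 and 1/8,000, matching the range given.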
00:28:04 Obviously you’re not gonna have
00:28:05 the same thriving civilization
00:28:06 if you get to a too small a number,
00:28:08 but it depends who you’re killing off, I guess,
00:28:10 is part of the question.
00:28:11 Yeah, if you killed off half of the people
00:28:14 just randomly right now, I think we’d be fine.
00:28:15 It would be obviously a great awful tragedy.
00:28:18 I think if you killed off three quarters
00:28:19 of all people randomly,
00:28:20 just three out of every four people drops dead.
00:28:22 I think we’d have, obviously the stock market would crash.
00:28:24 We’d have a rough patch,
00:28:27 but I almost can assure you that the species would be fine.
00:28:30 Well, cause the million number,
00:28:31 like you said, it is specialized.
00:28:33 So I think, cause you have to do this,
00:28:39 you have to basically do the iPhone experiment.
00:28:41 Like literally you have to be able to manufacture computers.
00:28:46 Yeah, everything.
00:28:47 If you’re gonna have the self sustaining means
00:28:48 you can, any major important skill,
00:28:50 any important piece of kind of infrastructure on earth
00:28:53 can be built there just as well.
00:28:57 It’d be interesting to list out
00:29:00 what are the important things,
00:29:01 what are the important skills?
00:29:03 Yeah, I mean, if you have to feed everyone,
00:29:05 so mass farming, things like that,
00:29:08 you have to, you have to, you have mining,
00:29:12 these questions, it’s like the materials might be,
00:29:15 I don’t know, five mile, two miles underground,
00:29:17 I don’t know what the actual, but like,
00:29:20 it’s amazing to me just that these things
00:29:22 got built in the first place.
00:29:23 And they never got, no one built the first,
00:29:25 the mine that we’re getting stuff for the iPhone for
00:29:28 probably wasn’t built for the iPhone.
00:29:30 Or in general, early mining was for,
00:29:33 I think obviously I assume the industrial revolution
00:29:35 when we realized, oh, fossil fuels,
00:29:37 we wanna extract this magical energy source.
00:29:40 I assume that like mining took a huge leap
00:29:42 without knowing very much about this.
00:29:43 I think you’re gonna need mining,
00:29:45 you’re gonna need like a lot of electrical engineers.
00:29:48 If you’re gonna have a civilization like ours,
00:29:50 and of course you could have oil and lanterns,
00:29:51 we could go way back,
00:29:53 but if you’re trying to build our today thing,
00:29:55 you’re gonna need energy and electricity
00:29:58 and then mines that can bring materials,
00:30:00 and then you’re gonna need a ton of plumbing
00:30:03 and everything that entails.
00:30:05 And like you said, food, but also the manufacturer,
00:30:07 so like turning raw materials into something useful,
00:30:10 that whole thing, like factories,
00:30:13 some supply chain, transportation.
00:30:15 Right, I mean, you think about,
00:30:17 when we talk about like world hunger,
00:30:18 one of the major problems is,
00:30:20 there’s plenty of food and by the time it arrives,
00:30:22 most of it’s gone bad in the truck,
00:30:24 in a kind of an impoverished place.
00:30:26 So it’s like, again, we take it so for granted,
00:30:29 all the food in the supermarket is fresh, it’s all there.
00:30:33 And which always stresses me,
00:30:35 if I were running a supermarket,
00:30:36 I would always be so like miserable
00:30:37 about like things going bad on the shelves,
00:30:41 or if you don’t have enough, that’s not good,
00:30:43 but if you have too much, it goes bad anyway.
00:30:44 Of course, there’ll be entertainers too.
00:30:46 Like somebody would have a YouTube channel
00:30:48 that’s running on Mars.
00:30:50 There is something different about a civilization on Mars
00:30:55 and Earth existing versus like a civilization
00:30:59 in the United States versus Russia and China.
00:31:01 Like that’s a different,
00:31:03 fundamentally different distance, like philosophically.
00:31:07 Will it be like fuzzy?
00:31:08 We know there’ll be like a reality show on Mars
00:31:09 that everyone on Earth is obsessed with.
00:31:11 And I think if people are going back and forth enough,
00:31:15 then it becomes fuzzy.
00:31:16 It becomes like, oh, our friends on Mars.
00:31:17 And there’s like this Mars versus Earth,
00:31:19 and it become like fun tribalism.
00:31:22 I think if people rarely go back and forth
00:31:25 and they're really there for good,
00:31:26 I think you get kind of like, oh, we hate them,
00:31:28 a lot of like us versus them stuff going on.
00:31:30 There could be also war in space for territory.
00:31:33 As a first colony happens, China, Russia,
00:31:38 or whoever the European, different European nations,
00:31:41 Switzerland finally gets their act together
00:31:43 and starts wars as opposed to staying out of all of them.
00:31:46 Yeah, there’s all kinds of crazy geopolitical things
00:31:49 that like we have not even,
00:31:51 no one’s really even thought about too much yet
00:31:52 that like, it could get weird.
00:31:54 Think about the 1500s when it was suddenly like a race
00:31:57 to like colonize or capture land
00:32:00 or discover new land that hasn’t been,
00:32:02 so it was like this new frontiers.
00:32:04 There’s not really, the land is not,
00:32:06 the thing about Crimea was like this huge thing
00:32:09 because this tiny peninsula switched.
00:32:11 That’s how like optimized everything has become.
00:32:13 Everything is just like really stuck.
00:32:16 Mars is a whole new world of like,
00:32:18 territory, naming things and you know,
00:32:22 and it’s a chance for new kind of governments maybe,
00:32:24 or maybe it’s just the colonies of these governments
00:32:27 so we don’t get that opportunity.
00:32:28 I think it’d be cool if there’s new countries
00:32:30 being totally new experiments.
00:32:32 And that’s fascinating because Elon talks exactly
00:32:34 about that and I believe that very much.
00:32:36 Like that should be, like from the start,
00:32:40 they should determine their own sovereignty.
00:32:43 Like they should determine their own thing.
00:32:46 There was one modern democracy in late 1700s, the US.
00:32:50 I mean, it was the only modern democracy.
00:32:53 And now of course, there’s hundreds or dozen, many dozens.
00:32:57 But I think part of the reason that was able to start,
00:32:59 I mean, it’s not that people didn’t have the idea,
00:33:01 people had the idea, it was that they had a clean slate,
00:33:04 new place, and they suddenly were,
00:33:06 so I think it would be a great opportunity to have,
00:33:10 there’s a lot of people who have done that,
00:33:11 oh, if I had my own government on an island,
00:33:13 my own country, what would I do?
00:33:15 And the US founders actually had the opportunity,
00:33:20 that fantasy, they were like, we can do it.
00:33:21 Let’s make, okay, what’s the perfect country?
00:33:23 And they tried to make something.
00:33:25 Sometimes progress is, it’s not held up by our imagination.
00:33:29 It’s held up by just, there’s no blank canvas
00:33:33 to try something on.
00:33:35 Yeah, it’s an opportunity for a fresh start.
00:33:37 The funny thing about the conversation we’re having
00:33:39 is not often had, I mean, even by Elon,
00:33:42 he’s so focused on Starship
00:33:43 and actually putting the first human on Mars.
00:33:46 I think thinking about this kind of stuff is inspiring.
00:33:51 It makes us dream, it makes us hope for the future.
00:33:55 And it makes us somehow thinking about civilization on Mars
00:33:59 is helping us think about the civilization here on Earth.
00:34:03 Yeah, totally. And how we should run it.
00:34:05 Well, what do you think are, like in our lifetime,
00:34:07 are we gonna, I think any effort that goes to Mars,
00:34:11 the goal is in this decade.
00:34:13 Do you think that’s actually gonna be achieved?
00:34:15 I have a big bet, $10,000 with a friend
00:34:18 when I was drunk in an argument.
00:34:21 This is great.
00:34:22 That the Neil Armstrong of Mars,
00:34:23 whoever he or she may be, will set foot by the end of 2030.
00:34:28 Now, this was probably in 2018 when I had this argument.
00:34:30 So, like what if it?
00:34:31 So, a human has to touch Mars by the end of 2030.
00:34:35 Oh, by the year '30.
00:34:37 Yeah, by January 1st, 2031.
00:34:39 Yeah.
00:34:41 Did you agree on the time zone or what?
00:34:44 No, no.
00:34:44 If it’s coming on that exact day,
00:34:46 that’s gonna be really stressful.
00:34:48 But anyway, I think that there will be.
00:34:52 That was 2018.
00:34:53 I was more confident then.
00:34:54 I think it’s gonna be around this time.
00:34:56 I mean, I still won the general bet
00:34:58 because his point was, you are crazy.
00:34:59 This is not gonna happen in our lifetimes.
00:35:01 I’ve been offered many, many decades.
00:35:02 And I said, you’re wrong.
00:35:03 You don’t know what’s going on in SpaceX.
00:35:05 I think if the world depended on it,
00:35:08 I think probably SpaceX could probably.
00:35:10 I mean, I don’t know this,
00:35:11 but I think the tech is almost there.
00:35:14 Like, I don’t think, of course,
00:35:16 it’s delayed many years by safety.
00:35:18 So, they first wanna send a ship around Mars
00:35:20 and they wanna land a cargo ship on Mars.
00:35:22 And there’s the moon on the way to.
00:35:23 Yeah, yeah, there’s a lot.
00:35:24 But I think the moon, a decade before,
00:35:27 seemed like magical tech that humans didn’t have.
00:35:30 This is like, no, we can,
00:35:33 it’s totally conceivable that this,
00:35:35 you’ve seen Starship,
00:35:36 like it is an interplanetary colonial
00:35:41 or interplanetary transport system.
00:35:43 That’s what they used to call it.
00:35:45 The SpaceX, the way they do it is,
00:35:47 every time they do a launch,
00:35:49 something fails usually, when they’re testing
00:35:52 and they learn a thousand things.
00:35:54 The amount of data they get and they improve so,
00:35:57 each one has, it’s like they’ve moved up
00:36:00 like eight generations in each one.
00:36:01 Anyway, so it’s not inconceivable that pretty soon
00:36:04 they could send a Starship to Mars and land it.
00:36:07 There’s just no good reason
00:36:08 I don’t think that they couldn’t do that.
00:36:09 And so, if they could do that,
00:36:10 they could in theory send a person to Mars pretty soon.
00:36:13 Now, taking off from Mars and coming back,
00:37:15 again, I don’t think anyone would want
00:36:17 to be on that voyage today because there’s just,
00:36:20 you know, they’re still in,
00:36:21 it’s still amateur hour here and getting that perfect.
00:36:24 I don’t think we’re too far away now.
00:36:25 The question is, so every 26 months,
00:36:30 Earth laps Mars, right?
00:36:31 It’s the synodic period
00:36:34 or whatever it’s called, 26 months.
00:36:37 So right now it’s like in the evens,
00:36:39 like 2022 is gonna have one of these, then late 2024.
00:36:43 So people could, this was the earliest estimate I heard.
00:36:45 Elon said, maybe we can send people to Mars in 2024,
00:36:48 you know, to land in early 2025.
00:36:51 That is not gonna happen because that included 2022
00:36:54 sending a cargo ship to Mars, maybe even a one in 2020.
00:36:58 And so I think they’re not quite on that schedule,
00:37:00 but to win my bet, 2027, I have a chance
00:37:03 and 2029, I have another chance.
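The 26-month cadence of launch windows mentioned here falls out of the synodic period of Earth and Mars, which satisfies 1/T_syn = 1/T_Earth − 1/T_Mars. A quick sketch using standard orbital periods (the numbers are textbook values, not from the conversation):

```python
# The ~26-month Mars launch-window cadence comes from the synodic period:
# how often Earth "laps" Mars in its orbit around the Sun.
#   1/T_syn = 1/T_earth - 1/T_mars
T_EARTH = 365.25   # days, Earth's orbital period
T_MARS = 686.98    # days, Mars's orbital period

t_syn_days = 1 / (1 / T_EARTH - 1 / T_MARS)
t_syn_months = t_syn_days / 30.44  # average month length in days

print(f"Synodic period: {t_syn_days:.0f} days ≈ {t_syn_months:.1f} months")
```

This comes out to roughly 780 days, or about 26 months, which is why favorable transfer windows (2022, late 2024, 2026–27, 2029) recur on that schedule.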
00:37:05 We’re not very good at like backing up
00:37:06 and seeing the big picture.
00:37:07 We’re very distracted by what’s going on today
00:37:09 and what we can believe
00:37:10 because it’s happening in front of our face.
00:37:12 There’s no way that humans are gonna be landing on Mars
00:37:15 and it’s not gonna be the only thing
00:37:17 everyone is talking about, right?
00:37:18 I mean, it’s gonna be the moon landing,
00:37:20 but even bigger deal, going to another planet, right?
00:37:23 And for it to start a colony,
00:37:24 not just to, again, high five and come back.
00:37:27 So this is like the 2020s, maybe the 2030s
00:37:31 is gonna be the new 1960s.
00:37:33 We’re gonna have a space decade.
00:37:34 I’m so excited about it.
00:37:35 And again, it’s one of the great leaps for all of life
00:37:38 happening in our lifetimes, like that’s wild.
00:37:40 To paint a slightly cynical possibility,
00:37:43 which I don’t see happening,
00:37:46 but I just wanna put sort of value into leadership.
00:37:49 I think it wasn’t obvious that the moon landing
00:37:52 would be so exciting for all of human civilization.
00:37:55 Some of that had to do with the right speeches,
00:37:56 with the space race.
00:37:58 Like space, depending on how it’s presented,
00:38:01 can be boring.
00:38:03 I don’t think it’s been that so far, but I’ve actually.
00:38:07 I think space is quite boring right now.
00:38:10 No, SpaceX is super, but like 10 years ago, space.
00:38:13 Some writer, I forget who wrote,
00:38:15 it’s like the best magic trick in the show
00:38:17 happened at the beginning.
00:38:18 And now they’re starting to do this like easy hazard.
00:38:20 It’s like, you can’t go in that direction.
00:38:21 And the line that this writer said is like,
00:38:23 watching astronauts go up to the space station
00:38:28 after watching the moon is like
00:38:29 watching Columbus sail to Ibiza.
00:38:31 It’s just like, everything is so practical.
00:38:34 You’re going up to the space station, not to explore,
00:38:36 but to do science experiments in microgravity.
00:38:38 And you’re sending rockets up,
00:38:41 mostly here and there there’s a probe,
00:38:43 but mostly you’re sending them up to put satellites
00:38:45 for DirecTV or whatever it is.
00:38:48 It’s kind of like lame earth industry usage.
00:38:51 So I agree with you, space is boring there.
00:38:55 The first human setting foot on Mars,
00:38:59 that’s gotta be a crazy global event.
00:39:01 I can’t imagine it not being.
00:39:02 Maybe you’re right.
00:39:03 Maybe I’m taking for granted the speeches
00:39:05 and the space race and that.
00:39:06 I think the value of, I guess what I’m pushing
00:39:09 is the value of people like Elon Musk
00:39:12 and potentially other leaders that hopefully step up
00:39:14 is extremely important here.
00:39:16 Like I would argue without the publicity of SpaceX,
00:39:19 it’s not just the ingenuity of SpaceX,
00:39:21 but like what they’ve done publicly
00:39:23 by having a figure that tweets
00:39:25 and all that kind of stuff like that,
00:39:27 that’s a source of inspiration.
00:39:29 Totally.
00:39:30 NASA wasn’t able to quite pull off with a shuttle.
00:39:32 That’s one of his two reasons for doing this.
00:39:35 SpaceX exists for two reasons.
00:39:37 One, life insurance for the species.
00:39:41 I always think about this way.
00:39:42 If you’re an alien on some far away planet
00:39:44 and you’re rooting against humanity
00:39:46 and you win the bet if humanity goes extinct,
00:39:49 you do not like SpaceX.
00:39:51 You do not want them to have their eggs
00:39:53 in two baskets now.
00:39:54 Yeah.
00:39:56 Sure, it’s like obviously you could have something
00:39:59 that kills everyone on both planets,
00:40:00 some AI war or something.
00:40:02 But the point is obviously it’s good for our chances,
00:40:06 our longterm chances to be having
00:40:08 two self sustaining civilizations going on.
00:40:11 The second reason, he values this I think just as high
00:40:15 is it’s the greatest adventure in history
00:40:17 going multi planetary and that people need some reason
00:40:21 to wake up in the morning
00:40:21 and it’ll just be this hopefully great uniting event too.
00:40:26 I mean, today’s nasty, awful political environment,
00:40:29 which is like a whirlpool that sucks everything into it.
00:40:34 So you name a thing and it’s become a nasty political topic.
00:40:38 So I hope that space can,
00:40:42 Mars can just bring everyone together,
00:40:44 but it could become this hideous thing
00:40:46 where it’s a billionaire,
00:40:48 some annoying storyline gets built.
00:40:50 So half the people think that anyone who’s excited about Mars
00:40:53 is an evil something.
00:40:55 Anyway, I hope it is super exciting.
00:40:57 So far space has been a uniting, inspiring thing.
00:41:03 And in fact, especially during this time of a pandemic
00:41:06 has been just a commercial entity
00:41:08 putting out humans into space for the first time
00:41:12 was just one of the only big sources of hope.
00:41:15 Totally in awe, just like watching this huge skyscraper
00:41:19 go up in the air, flip over, go back down and land.
00:41:21 I mean, it just makes everyone just wanna sit back
00:41:23 and clap and kinda like,
00:41:25 the way I look at something like SpaceX
00:41:26 is it makes me proud to be a human.
00:41:30 And I think it makes a lot of people feel that way.
00:41:31 It’s like good for our self esteem.
00:41:33 It’s like, you know what?
00:41:34 We’re pretty, we have a lot of problems,
00:41:35 but like we’re kind of awesome.
00:41:36 And if we can put people on Mars,
00:41:39 sticking up an earth flag on Mars,
00:41:41 like damn, we should be so proud
00:41:43 of our like little family here.
00:41:45 Like we did something cool.
00:41:46 And by the way, I’ve made it clear to SpaceX people,
00:41:51 including Elon, many times,
00:41:52 and it’s like once a year reminder
00:41:54 that if they want to make this more exciting,
00:41:56 they send the writer to Mars on the thing.
00:42:00 And I’ll blog about it.
00:42:02 So I’m just continuing to throw this out there.
00:42:04 On which trip?
00:42:05 I’m trying to get them to send me to Mars.
00:42:07 No, I understand that.
00:42:08 So I just wanna clarify
00:42:10 on which trip does the writer wanna go?
00:42:12 I think my dream one, to be honest,
00:42:14 would be like the Apollo eight,
00:42:16 where they just looped around the moon and came back.
00:42:19 Cause landing on Mars.
00:42:22 Give you a lot of good content to write about.
00:42:24 Great content, right?
00:42:26 I mean, the amount of kind of high minded,
00:42:28 and so I would go into the thing
00:42:30 and I would blog about it and I’d be in microgravity.
00:42:34 So I’d be bouncing around my little space.
00:42:35 I get a little, they can just send me in a dragon.
00:42:37 They don’t need to do a whole starship.
00:42:38 And I would bounce around and I would get to,
00:42:41 I’ve always had a dream of going
00:42:43 to like one of those nice jails for a year.
00:42:47 Because I just have nothing to do besides like read books
00:42:49 and no responsibilities and no social plans.
00:42:51 So this is the ultimate version of that.
00:42:53 Anyway, it’s a side topic, but I think it would be.
00:42:55 But also if you, I mean, to be honest,
00:42:57 if you land on Mars, it’s epic.
00:43:00 And then if you die there of like finishing your writing,
00:43:04 it will be just even that much more powerful
00:43:06 for the impact.
00:43:08 Yeah, but then I’m gone.
00:43:09 And I don’t even get to like experience the publication
00:43:12 of it, which is the whole point of some of the greatest
00:43:14 writers in history didn’t get a chance to experience
00:43:16 the publication of their.
00:43:18 I know, I don’t really think that I think like,
00:43:19 I think back to Jesus and I’m like, oh man,
00:43:21 that guy really like crushed it, you know?
00:43:23 But then if you think about it,
00:43:26 it doesn’t like you could literally die today
00:43:28 and then become the next Jesus, like 2000 years from now
00:43:31 in this civilization that’s like, there are, you know,
00:43:33 they’re like in magical in the clouds
00:43:35 and they’re worshiping you, they’re worshiping Lex.
00:43:38 And like, that sounds like your ego probably would be like,
00:43:40 wow, that’s pretty cool, except irrelevant to you
00:43:43 because you never even knew it happened.
00:43:44 This feels like a Rick and Morty episode.
00:43:46 It does, it does.
00:43:47 Okay, you’ve talked to Elon quite a bit,
00:43:51 you’ve written about him quite a bit.
00:43:53 Just, it’d be cool to hear you talk about
00:43:58 what are your ideas of what, you know,
00:44:00 the magic sauces you’ve written about with Elon.
00:44:02 What makes him so successful?
00:44:06 His style of thinking, his ambition, his dreams,
00:44:09 his, the people he connects with,
00:44:11 the kind of problems he tackles.
00:44:13 Is there a kind of comments you can make
00:44:14 about what makes him special?
00:44:16 I think that obviously there’s a lot of things
00:44:18 that he’s very good at.
00:44:19 He has, he’s obviously super intelligent.
00:44:23 His heart is very much in like, I think the right place.
00:44:25 Like, you know, I really, really believe that like,
00:44:28 and I think people can sense that, you know,
00:44:30 he just doesn’t seem like a grifter of any kind.
00:44:32 He’s truly trying to do these big things
00:44:34 for the right reasons.
00:44:36 And he’s obviously crazy ambitious and hardworking, right?
00:44:38 Not everyone is.
00:44:39 Some people are as talented and have cool visions,
00:44:40 but they just don’t wanna spend their life that way.
00:44:43 So, but that’s, none of those alone
00:44:46 is what makes Elon, Elon.
00:44:48 I mean, if it were, there’d be more of him
00:44:49 because there’s a lot of people that are very smart
00:44:51 and smart enough to accumulate a lot of money and influence
00:44:53 and they have great ambition and they have, you know,
00:44:56 their hearts in the right place.
00:44:58 To me, it is the very unusual quality he has
00:45:01 is that he’s sane in a way
00:45:04 that almost every human is crazy.
00:45:06 What I mean by that is we are programmed
00:45:10 to trust conventional wisdom over our own reasoning
00:45:17 for good reason.
00:45:19 If you go back 50,000 years and conventional wisdom says,
00:45:22 you know, don’t eat that berry, you know,
00:45:26 or this is the way you tie a spearhead to a spear.
00:45:29 And you’re thinking, I’m smarter than that.
00:45:31 Like, you’re not.
00:45:32 You know, that comes from the accumulation
00:45:35 of life experience, accumulation of observation
00:45:37 and experience over many generations.
00:45:39 And that’s a little mini version
00:45:41 of the collective super intelligence.
00:45:43 It’s like, you know, it’s equivalent
00:45:45 of like making a pencil today.
00:45:46 Like people back then, like the conventional wisdom,
00:45:51 like had this super, this knowledge
00:45:54 that no human could ever accumulate.
00:45:56 So we’re very wired to trust it.
00:45:57 Plus the secondary thing is that the people who, you know,
00:46:00 just say that they believe the mountain is,
00:46:02 they worship the mountain as their God, right?
00:46:04 And the mountain determines their fate.
00:46:06 That’s not true, right?
00:46:07 And the conventional wisdom is wrong there,
00:46:08 but believing it was helpful to survival
00:46:12 because you were part of the crowd
00:46:15 and you stayed in the tribe.
00:46:16 And if you started to, you know, insult the mountain God
00:46:20 and say, that’s just a mountain, it’s not, you know,
00:46:22 you didn’t fare very well, right?
00:46:23 So for a lot of reasons, it was a great survival trait
00:46:25 to just trust what other people said and believe it.
00:46:29 And truly, you know, obviously, you know,
00:46:30 the more you really believed it, the better.
00:46:32 Today, conventional wisdom in a rapidly changing world
00:46:39 and a huge giant society,
00:46:42 our brains are not built to understand that.
00:46:44 They have a few settings, you know,
00:46:45 and none of them is, you know,
00:46:47 300 million person society.
00:46:49 So your brain is basically,
00:46:54 is treating a lot of things like a small tribe,
00:46:57 even though they’re not,
00:46:58 and they’re treating conventional wisdom as, you know,
00:47:01 very wise in a way that it’s not.
00:47:03 If you think about it this way, it’s like, picture a,
00:47:05 like a bucket that’s not moving very much,
00:47:08 moving like a millimeter a year.
00:47:09 And so it has time to collect a lot of water in it.
00:47:11 That’s like conventional wisdom in the old days
00:47:12 when very few things changed.
00:47:13 Like your 10, you know, great, great, great grandmother
00:47:15 probably lived a similar life to you,
00:47:17 maybe on the same piece of land.
00:47:19 And so old people really knew what they were talking about.
00:47:21 Today, the bucket’s moving really quickly.
00:47:23 And so, you know, the wisdom doesn’t accumulate,
00:47:25 but we think it does because our brain settings
00:47:27 doesn’t have the, oh, move, you know,
00:47:29 quickly moving bucket setting on it.
00:47:31 So my grandmother gives me advice all the time,
00:47:35 and I have to decide, is this,
00:47:38 so there are certain things that are not changing,
00:47:40 like relationships and love and loyalty
00:47:42 and things like this.
00:47:44 Her advice on those things, I’ll listen to it all day.
00:47:45 She’s one of the people who said,
00:47:46 you’ve got to live near your people you love,
00:47:48 live near your family, right?
00:47:50 I think that is like tremendous wisdom, right?
00:47:52 That is wisdom, because that’s happens to be something
00:47:54 that hasn’t, doesn’t change from generation to generation.
00:47:56 For now.
00:47:57 Right, she all, right, for now.
00:47:59 She’s also telling, right, so I’ll be the idiot
00:48:01 telling my grandkids, who’ll actually be in the
00:48:03 metaverse, like being like, it doesn’t matter.
00:48:05 And I’m like, you have to, it’s not the same
00:48:07 when you’re not in person.
00:48:08 They’re gonna say, it’s exactly the same, grandpa.
00:48:10 And they’ll also be thinking to me with their Neuralink,
00:48:12 and I’m gonna be like, slow down.
00:48:13 I don’t understand what you’re saying.
00:48:15 You just talk like a normal person.
00:48:16 Anyway, so my grandmother then, but then she says,
00:48:19 you know, you’re, I don’t know about this writing
00:48:21 you’re doing, you should go to law school,
00:48:23 and you know, you want to be secure.
00:48:25 And that’s not good advice for me.
00:48:26 You know, given the world I’m in,
00:48:28 and what I like to do, and what I’m good at,
00:48:30 that’s not the right advice.
00:48:31 But because the world is totally,
00:48:33 she’s in a different world.
00:48:34 So she became wise for a world that’s no longer here, right?
00:48:37 Now, if you think about that,
00:48:38 so then when we think about conventional wisdom,
00:48:40 it’s a little like my grandmother,
00:48:41 and it’s not, you know,
00:48:45 60 years outdated like her software.
00:48:47 It’s maybe 10 years outdated conventional wisdom,
00:48:50 sometimes 20.
00:48:51 So anyway, I think that we all continually
00:48:55 don’t have the confidence in our own reasoning
00:48:58 when it conflicts with what everyone else thinks,
00:49:00 with what seems right.
00:49:02 We don’t have the guts to act on that reasoning
00:49:06 for that reason, right?
00:49:07 You know, we, and so there’s so many Elon examples.
00:49:11 I mean, just from the beginning,
00:49:12 building Zip2 was his first company.
00:49:15 And it was internet advertising at the time
00:49:19 when people said, you know, this internet was brand new,
00:49:22 like kind of thinking of like the metaverse,
00:49:24 VR metaverse today.
00:49:24 And people would be like, oh, we, you know,
00:49:27 we facilitate internet advertising.
00:49:29 People were saying, yeah, people are gonna advertise
00:49:30 on the internet, yeah, right.
00:49:32 Actually, it wasn’t that he’s magical and saw the future,
00:49:34 it’s that he looked at the present,
00:49:36 looked at what the internet was,
00:49:38 thought about, you know, the obvious
00:49:40 like advertising opportunity this was gonna be.
00:49:43 It wasn’t rocket science.
00:49:45 It wasn’t genius, I don’t believe.
00:49:46 I think it was just seeing the truth.
00:49:48 And when everyone else is laughing,
00:49:50 saying, well, you’re wrong.
00:49:51 I mean, I did the math and here it is, right?
00:49:54 Next company, you know, x.com,
00:49:56 which became eventually PayPal.
00:49:59 People said, oh yeah, people are gonna put
00:50:00 their financial information on the internet.
00:50:02 No way.
00:50:05 To us, it seems so obvious.
00:50:06 If you went back then, you would probably feel the same.
00:50:08 You’d think this is, that is a fake company that no,
00:50:11 it’s just obviously not a good idea.
00:50:12 He looked around and said, you know, I see where this is.
00:50:14 And so again, he could see where it was going
00:50:15 because he could see what it was that day
00:50:17 and not what it, you know, not people, conventional wisdom
00:50:19 was still a bunch of years earlier.
00:50:21 SpaceX is the ultimate example.
00:51:23 A friend of his apparently
00:51:25 compiled a video montage
00:50:28 of rockets blowing up to show him this is not a good idea.
00:50:32 And if, but just even the bigger picture,
00:50:34 the amount of billionaires who have like thought
00:50:36 this was, I’m gonna start launching rockets
00:50:38 and you know, the amount that failed.
00:51:40 I mean, conventional wisdom said
00:51:43 this is a bad endeavor.
00:50:44 He was putting all of his money into it.
00:50:45 Yeah, landing rockets was another thing.
00:50:49 You know,
00:50:50 here’s the classic kind of way we reason,
00:50:52 which is, if this could be done,
00:50:55 NASA would have done it a long time ago
00:50:57 because of the money it would save.
00:50:58 If this could be done, the Soviet Union
00:50:59 would have done it back in the sixties.
00:51:01 It’s obviously something that can’t be done.
00:51:03 And his back-of-the-envelope math said,
00:51:06 well, I think it can be done.
00:51:07 And so he just did it.
00:51:08 So in each of these cases, I think in some ways
00:51:10 Elon gets too much credit.
00:51:12 People think it’s that
00:51:13 he has Einstein intelligence,
00:51:15 or he can see the future,
00:51:16 or he has incredible guts,
00:51:19 he’s so courageous.
00:51:20 I think if you’re actually looking at reality
00:51:24 and you’re just assessing probabilities
00:51:26 and ignoring all the noise, which is so often wrong,
00:51:29 then you just have to be
00:51:30 pretty smart and pretty courageous.
00:51:33 And you have to have this seemingly magical ability
00:51:36 to trust your reasoning over conventional wisdom.
00:51:39 Part of it is that we know we can’t build a pencil,
00:51:42 we can’t build
00:51:44 this civilization on our own, right?
00:51:46 So we kind of defer to the collective,
00:51:48 for good reasons.
00:51:56 But this is different when it comes to
00:51:57 what’s possible.
00:51:59 The Beatles were doing their kind of Motowny chord patterns
00:52:02 in the early sixties.
00:52:03 They were doing what was normal,
00:52:05 what was clearly the kind of sound that’s a hit.
00:52:08 Then they started getting weird
00:52:10 because they were so popular.
00:52:11 They had this confidence to say,
00:52:12 let’s just, we’re gonna start just experimenting.
00:52:15 And it turns out that
00:52:17 all these people are in this
00:52:18 one groove together doing music,
00:52:20 and there’s a lot of land over there.
00:52:22 And I’m sure the managers
00:52:24 and all the record execs would say,
00:52:25 no, you have to be here.
00:52:28 This is what sells.
00:52:29 And it’s just not true.
00:52:31 So that’s why the term for this,
00:52:33 which Elon actually likes to use,
00:52:35 is reasoning from first principles,
00:52:36 the physics term.
00:52:39 First principles are your axioms.
00:52:41 And physicists don’t say,
00:52:43 well, what do people think?
00:52:44 No, they say, what are the axioms?
00:52:46 Those are the puzzle pieces.
00:52:47 Let’s use those to build a conclusion.
00:52:49 That’s our hypothesis.
00:52:50 Now let’s test it, right?
00:52:51 And they come up with all kinds of new things
00:52:53 constantly by doing that.
00:52:55 If Einstein had assumed conventional wisdom was right,
00:52:57 he never would have even tried to create something
00:52:59 that really disproved Newton’s laws.
00:53:02 And the other way to reason is reasoning by analogy,
00:53:05 which is a great shortcut.
00:53:08 It’s when we look at other people’s reasoning
00:53:10 and we kind of photocopy it into our head, we steal it.
00:53:13 So reasoning by analogy, we do all the time.
00:53:15 And it’s usually a good thing.
00:53:17 I mean, it takes a lot of mental energy and time
00:53:19 to reason from first principles.
00:53:20 You don’t wanna reinvent the wheel every time, right?
00:53:23 You want to copy other people’s reasoning
00:53:26 most of the time.
00:53:27 And most of us do it most of the time,
00:53:28 and that’s good, but there are certain moments.
00:53:30 Forget for a second
00:53:31 succeeding in a world like Elon’s.
00:53:34 Just who you’re gonna marry,
00:53:35 where you’re gonna settle down,
00:53:36 how you’re gonna raise your kids,
00:53:38 how you’re gonna educate your kids,
00:53:40 how you should educate yourself,
00:53:42 what career path to take.
00:53:43 These are the moments you look back on
00:53:45 on your deathbed,
00:53:47 the small number of choices
00:53:49 that really define your life.
00:53:50 Those should not be reasoned by analogy.
00:53:52 You should absolutely try to reason from first principles.
00:53:55 And Elon does this, by the way, not just in his work,
00:53:58 but in his personal life.
00:53:59 I mean, if you just look at the way he is on Twitter,
00:54:01 it’s not how you’re supposed to be
00:54:03 when you’re a super famous industry titan.
00:54:07 You’re not supposed to just be silly on Twitter
00:54:08 and do memes and get into little quibbles.
00:54:11 He just does things his own way,
00:54:14 regardless of what you’re supposed to do,
00:54:15 which sometimes serves him and sometimes doesn’t,
00:54:17 but I think it has taken him where it has taken him.
00:54:21 Yeah, I mean, I probably wouldn’t describe
00:54:23 his approach to Twitter as first principles,
00:54:26 but I guess it has the same element.
00:54:26 I think it is.
00:54:28 Well, first of all, I will say that with a lot of his tweets,
00:54:30 people think, oh, he’s gonna be done after that.
00:54:32 He’s fine, he just won Time Man of the Year.
00:54:35 It’s not sinking him.
00:54:38 And it’s not that I think
00:54:40 this is super reasoned out,
00:54:41 I think Twitter is his silly side.
00:54:43 But I think that
00:54:48 his reasoning did not feel like there was a giant risk
00:54:50 in just being his silly self on Twitter,
00:54:52 when a lot of billionaires would say,
00:54:53 well, no one else is doing that,
00:54:55 so there must be a good reason, right?
00:54:58 Well, I gotta say that he inspires me,
00:55:01 that it’s okay to be silly.
00:55:02 Totally. On Twitter.
00:55:04 And, but yeah, you’re right.
00:55:06 The big inspiration is the willingness to do that
00:55:09 when nobody else is doing it.
00:55:11 Yeah, and I think about all the great artists,
00:55:13 you know, all the great inventors and entrepreneurs,
00:55:16 almost all of them,
00:55:17 they had a moment when they trusted their reasoning.
00:55:19 I mean, Airbnb was turned down by over 60 VCs.
00:55:24 A lot of people would say,
00:55:25 obviously they know something we don’t, right?
00:55:28 But they didn’t, they said, eh, I think they’re all wrong.
00:55:30 I mean, that takes some kind of different wiring
00:55:32 in your brain.
00:55:34 And then that’s both for big picture
00:55:36 and detailed like engineering problems.
00:55:39 It’s fun to talk to him.
00:55:40 It’s fun to talk to Jim Keller,
00:55:42 who’s a good example of this kind of thinking
00:55:44 about like manufacturing, how to get costs down.
00:55:47 They talk about SpaceX rockets this way,
00:55:49 they talk about manufacturing this way,
00:55:52 like cost per pound or per ton to get to orbit
00:55:53 or something like that.
00:56:01 That’s all the reasoning they need
00:56:02 to get the cost down.
00:56:04 It’s a very kind of raw-materials...
00:56:07 Yeah.
00:56:07 Like just very basic way of thinking.
00:56:10 First principles.
00:56:11 It’s really, yeah.
00:56:12 And the first principles of a rocket
00:56:14 are things like the price of raw materials,
00:56:16 and gravity, and wind, and fuel.
00:56:19 I mean, these are your first principles.
00:56:24 Henry Ford, you know, what made Henry Ford blow up
00:56:29 as an entrepreneur?
00:56:30 The assembly line, right?
00:56:32 I mean, he thought for a second and said,
00:56:35 this isn’t how manufacturing is normally done,
00:56:38 but I think this is a different kind of product.
00:56:40 And that’s what changed it.
00:56:42 And then what happened is when someone reasons
00:56:43 from first principles, they often fail.
00:56:45 You know, you’re going out into the fog
00:56:47 with no conventional wisdom to guide you.
00:56:49 But when you succeed, what you notice is that
00:56:51 everyone else turns and says, wait, what, what?
00:56:52 What are they doing?
00:56:53 And they all flock over.
00:56:55 Look at the iPhone.
00:56:56 iPhone, you know, Steve Jobs was famously good
00:56:58 at reasoning from first principles
00:57:00 because that guy had crazy self-confidence.
00:57:02 He just said, if I think this is right,
00:57:04 that’s enough.
00:57:06 I mean, I don’t know how he did that.
00:57:07 And I don’t think Apple can do that anymore.
00:57:09 I mean, they lost that.
00:57:10 That one brain, his ability to do that,
00:57:13 made it a totally different company,
00:57:14 even though there were tens of thousands of people there.
00:57:17 And I’m giving a lot of credit
00:57:20 to Steve Jobs, but of course it was a team at Apple.
00:57:23 They didn’t look at the flip phones and say,
00:57:26 okay, let’s make a keyboard
00:57:28 that’s clicky, a really cool Apple
00:57:30 keyboard. They said, what should a mobile device be?
00:57:32 What are the axioms here?
00:57:35 And none of them necessarily involved a keyboard.
00:57:37 And by the time they pieced it together, there was no keyboard.
00:57:39 It didn’t make sense.
00:57:40 Everyone suddenly is going, wait, what are they doing?
00:57:42 And now every phone looks like the iPhone.
00:57:43 I mean, that’s how it goes.
00:57:46 You tweeted, what’s something you’ve changed your mind about?
00:57:51 That’s the question you’ve tweeted.
00:57:53 Elon replied, brain transplants,
00:57:55 Sam Harris responded, nuclear power.
00:57:58 There’s a bunch of people with cool responses there.
00:58:01 In general, what are your thoughts
00:58:02 about some of the responses
00:58:03 and what have you changed your mind about big or small,
00:58:07 perhaps in doing the research for some of your writing?
00:58:10 So I’m right now just finishing a book
00:58:13 on kind of why our society is such a shit place at the moment,
00:58:19 just so polarized.
00:58:20 And, you know, we have all these gifts
00:58:21 like we were talking about, just the supermarket,
00:58:23 exploding technology,
00:58:25 fewer and fewer people in poverty.
00:58:28 Louis CK likes to say
00:58:30 everything’s amazing and nobody’s happy, right?
00:58:32 But it’s a really extreme moment right now
00:58:35 where it’s like, hate is on the rise,
00:58:37 like crazy things, right?
00:58:38 And.
00:58:39 If I could interrupt briefly,
00:58:41 you did tweet that you just wrote the last word.
00:58:43 I sure did.
00:58:44 And then there’s some hilarious asshole who said,
00:58:46 now you just have to work on all the ones in the middle.
00:58:49 Yeah, I’ve heard that.
00:58:50 I mean, when you’ve earned a reputation
00:58:53 as a tried and true procrastinator,
00:58:55 you’re just gonna get shit forever.
00:58:57 And that’s fine.
00:58:58 I accept my fate there.
00:58:59 So do you mind sharing a little bit more
00:59:02 about the details of what you’re writing?
00:59:03 So how do you approach this question
00:59:07 about the state of society?
00:59:09 I wanted to figure out what was going on
00:59:11 because what I noticed was a bad trend.
00:59:15 It’s not that, you know, things are bad.
00:59:16 It’s that things are getting worse in certain ways.
00:59:19 Not in every way.
00:59:20 Look at Max Roser’s stuff, you know,
00:59:23 he comes up with all these amazing graphs.
00:59:24 What’s weird is that things are getting better
00:59:27 in almost every important metric you can think of,
00:59:31 except the amount of people who hate other people
00:59:35 in their own country.
00:59:36 The number of Americans
00:59:40 that hate America is on the rise,
00:59:42 the number of Americans
00:59:44 that hate other Americans is on the rise,
00:59:47 the number of Americans that hate the president
00:59:49 is on the rise, all these things,
00:59:50 on a very steep rise.
00:59:52 So what the hell?
00:59:53 What’s going on?
00:59:54 Like there’s something causing that.
00:59:56 It’s not that, you know, a bunch of new people were born
00:59:58 who were just dicks.
00:59:59 It’s that something is going on.
01:00:01 So I think of it as a very simple,
01:00:04 oversimplified equation.
01:00:07 Human behavior is the output,
01:00:09 and the two inputs are human nature
01:00:12 and environment, right?
01:00:13 And this is basic,
01:00:14 super kindergarten-level animal behavior.
01:00:18 But I think it’s worth thinking about.
01:00:20 You’ve got human nature,
01:00:21 which is not changing very much, right?
01:00:24 And then you throw that nature
01:00:27 into a certain environment
01:00:29 and it reacts to the environment, right?
01:00:30 It’s shaped by the environment.
01:00:32 And then eventually what comes out is behavior, right?
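A toy sketch of the oversimplified equation described above: behavior as a function of a fixed human nature and a changing environment. The numbers, the 0-to-1 scales, and the averaging rule are all invented purely for illustration; the only point it demonstrates is that with nature held constant, changing the environment changes the behavior output.

```python
# Toy sketch of the model: behavior is the output, human nature is a
# near-constant input, environment is the input that actually changes.
# The scales and the averaging rule are made up purely for illustration.

def behavior(nature: float, environment: float) -> float:
    """Crude 'behavior quality' score in [0, 1].

    nature: fixed disposition, 0 (awful) to 1 (saintly).
    environment: how well the surroundings reward good conduct, 0 to 1.
    """
    return (nature + environment) / 2

HUMAN_NATURE = 0.5  # the constant in the equation: everyone is a 0.5

# Same nature, different environments -> different behavior.
print(round(behavior(HUMAN_NATURE, environment=0.9), 2))  # 0.7, well-calibrated incentives
print(round(behavior(HUMAN_NATURE, environment=0.2), 2))  # 0.35, e.g. a hostile media environment
```

With nature pinned at 0.5, every change in the output has to come from the environment term, which is the hopeful part of the argument: the variable that is bad is also the one that can be changed.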
01:00:36 Human nature is not changing very much,
01:00:38 but suddenly we’re behaving differently, right?
01:00:40 We are, again, you know, look at the polls.
01:00:42 Like it used to be that the president, you know,
01:00:45 was liked by, I don’t remember the exact numbers,
01:00:47 but, you know, 80% or 70% of their own party
01:00:51 and, you know, 50% of the other party.
01:00:53 And now it’s like 40% of their own party
01:00:54 and 10% of the other party, you know?
01:00:56 And it’s not that the presidents are getting worse.
01:00:59 Maybe some people would argue that they are,
01:01:00 and there have been a lot of
01:01:04 idiot presidents throughout history.
01:01:05 What’s going on is something in the environment is changing,
01:01:08 and what you’re seeing is a change in behavior.
01:01:11 An easy example here is that,
01:01:12 by a lot of metrics,
01:01:14 racism is becoming less and less of a problem.
01:01:19 You know, it’s hard to measure, but there’s metrics like,
01:01:21 you know, how upset would you be
01:01:23 if your kid married someone of another race?
01:01:26 And that number is plummeting,
01:01:28 but racial grievance is skyrocketing, right?
01:01:30 There’s a lot of examples like this.
01:01:32 So I wanted to look around and figure out what’s going on.
01:01:33 And the reason I took it on,
01:01:34 the reason I don’t think this is just an unfortunate,
01:01:36 unpleasant trend that hopefully we come out of,
01:01:38 is all this other stuff I like to write about,
01:01:40 all this future stuff, right?
01:01:42 I always think of this,
01:01:43 I’m very optimistic in a lot of ways,
01:01:45 and I think that our world would be a utopia,
01:01:48 would seem like actual heaven,
01:01:50 like whatever Thomas Jefferson was picturing as heaven.
01:01:53 Other than maybe the eternal life aspect,
01:01:55 I think that if he came to 2021 US, it would be better.
01:01:59 We live in a place that’s cooler than 1700s heaven,
01:02:00 again, other than the fact that we still die.
01:02:05 Now, I think the future world
01:02:07 actually probably would have, quote, eternal life.
01:02:09 I don’t think anyone actually wants eternal life,
01:02:12 even if people think they do.
01:02:13 Eternal is a long time.
01:02:14 But I think the choice to die when you want,
01:02:16 maybe we’re uploaded, maybe we can refresh our bodies.
01:02:19 I don’t know what it is.
01:02:20 But the point is, I think about that utopia.
01:02:23 And I do believe that like, if we don’t botch this,
01:02:25 we’d be heading towards somewhere
01:02:27 that would seem like heaven, maybe in our lifetimes.
01:02:30 Of course, if things go wrong,
01:02:32 now think about the trends here.
01:02:34 Just like the 20th century would seem like
01:02:36 some magical utopia to someone from the 16th century.
01:02:39 But the bad things in the 20th century
01:02:43 were kind of the worst things ever
01:02:46 in terms of just absolute magnitude.
01:02:48 World War II, the biggest genocides ever.
01:02:52 You’ve got maybe climate change,
01:02:55 if it is the existential threat that many people think it is.
01:02:58 We never had an existential threat on that level before.
01:03:01 So the good is getting better and the bad is getting worse.
01:03:04 And so what I think about the future,
01:03:05 I think of our species as some kind of big, long canoe,
01:03:10 a 5 million mile long canoe, each of us sitting in a row.
01:03:14 And we each have one oar,
01:03:16 and we can paddle on the left side or the right side.
01:03:18 And what we know is there’s a fork up there somewhere.
01:03:21 And the river forks, and there’s a utopia on one side
01:03:26 and a dystopia on the other side.
01:03:27 And I really believe
01:03:28 we’re probably not headed for just an okay future.
01:03:30 Just the way tech is exploding,
01:03:32 it’s probably gonna be really good or really bad.
01:03:34 The question is which side should we be rowing on?
01:03:35 We can’t see up there, right?
01:03:37 But it really matters.
01:03:38 So I’m writing about all this future stuff
01:03:39 and I’m saying none of this matters
01:03:40 if we’re squabbling our way into kind of like
01:03:43 a civil war right now.
01:03:44 So what’s going on?
01:03:45 So it’s a really important problem to solve.
01:03:48 What are your sources of hope in this?
01:03:51 So like how do you steer the canoe?
01:03:54 One of my big sources of hope,
01:03:56 and this is I think my answer to what I changed my mind on,
01:03:59 is I think I always knew this, but it’s easy to forget it.
01:04:03 Our primitive brain does not remember this fact,
01:04:05 which is that I don’t think there are very many bad people.
01:04:11 Now, you say bad, are there selfish people?
01:04:13 Sure, most of us.
01:04:18 But think of digital language, ones and zeros.
01:04:21 Our primitive brain very quickly can get into the land
01:04:24 where everyone’s a one or a zero.
01:04:25 Our tribe, we’re all ones, we’re perfect,
01:04:26 I’m perfect, my family’s perfect.
01:04:28 It’s that other family, that other tribe,
01:04:29 they’re zeros, and then you dehumanize them, right?
01:04:31 These people are awful.
01:04:33 So zero is not a human place.
01:04:36 No one’s a zero and no one’s a one.
01:04:37 You’re dehumanizing yourself.
01:04:39 So when we get into this land,
01:04:41 I call it political Disney world,
01:04:42 because the Disney movies have good guys.
01:04:44 Scar is totally bad and Mufasa is totally good, right?
01:04:48 You don’t see Mufasa’s character flaws.
01:04:50 You don’t see Scar’s upbringing that made him like that,
01:04:52 that humanizes him.
01:04:53 No, lionizes him, whatever.
01:04:55 You are.
01:04:56 Well done.
01:04:57 Mufasa’s a one and Scar’s a zero, very simple.
01:05:01 So political Disney world is a place,
01:05:03 it’s a psychological place that all of us have been in.
01:05:06 And it can be religious Disney world,
01:05:08 it can be national Disney world and the war, whatever it is,
01:05:10 but it’s a place where we fall into this delusion
01:05:13 that there are protagonists and antagonists
01:05:14 and that’s it, right?
01:05:15 That is not true.
01:05:16 We are all 0.5s, or maybe 0.4s to 0.6s.
01:05:20 On one hand,
01:05:22 I don’t think there are that many really great people,
01:05:24 frankly.
01:05:26 If you get really into
01:05:27 our most shameful memories,
01:05:29 the worst things we’ve done,
01:05:31 the most shameful thoughts,
01:05:32 the deep selfishness that some of us have
01:05:33 in areas we wouldn’t want to admit, right?
01:05:35 Most of us have a lot of unadmirable stuff.
01:05:40 On the other hand, if you really got into
01:05:42 someone else’s brain
01:05:44 and you looked at their upbringing,
01:05:45 the trauma that they’ve experienced,
01:05:47 the insecurities they have, it would humanize them.
01:05:50 And if you assembled the highlight reel of your worst moments,
01:05:53 the meanest things you’ve ever done,
01:05:55 the most selfish,
01:05:56 the times you stole something, whatever,
01:05:58 people would say,
01:05:59 wow, Lex is an awful person.
01:06:01 If you did a montage of your best moments,
01:06:03 people would say, oh, he’s a god, right?
01:06:05 But of course, we all have both of those.
01:06:06 So, I’ve started to really try to remind myself
01:06:11 that everyone’s a 0.5, right?
01:06:13 And 0.5s are all worthy of criticism
01:06:15 and we’re all worthy of compassion.
01:06:17 And the thing that makes me hopeful
01:06:19 is that I really think that there’s a bunch of 0.5s
01:06:22 and 0.5s are good enough
01:06:24 that we should be able to create a good society together.
01:06:26 There’s a lot of love in every human.
01:06:28 And I think there’s more love in humans than hate.
01:06:32 I always remember this moment,
01:06:34 this is a weird anecdote,
01:06:35 but I’m a Red Sox fan, Boston Red Sox baseball,
01:06:38 and Derek Jeter is who we hate the most.
01:06:40 He’s on the Yankees.
01:06:41 Yes.
01:06:42 And hate, right?
01:06:44 Jeter, right?
01:06:45 It was his last game in Fenway, he was retiring.
01:06:47 And he got this rousing standing ovation
01:06:49 and I almost cried.
01:06:50 And it was like, what is going on?
01:06:52 We hate this guy,
01:06:53 but actually there’s so much love in all humans.
01:06:56 It felt so good to just give a huge cheer
01:06:58 to this guy we hate because it’s like this moment
01:07:00 of like a little fist pound being like,
01:07:02 of course we all actually love each other.
01:07:04 And I think there’s so much of that.
01:07:06 And so the thing that I think I’ve come around on
01:07:08 is I think that we are in an environment
01:07:11 that’s bringing out really bad stuff.
01:07:13 I don’t think it’s the people.
01:07:15 If I thought it was the people,
01:07:16 if I thought it was human nature,
01:07:17 I’d be more upset.
01:07:19 Of the two variables here,
01:07:23 there’s a fixed variable,
01:07:24 a constant, which is human nature,
01:07:25 and there’s the independent variable, environment,
01:07:27 and behavior is the dependent variable.
01:07:30 The thing that I think is bad
01:07:32 is the independent variable, the environment,
01:07:34 which means
01:07:35 the environment can get better.
01:07:36 And there’s a lot of things I can go into
01:07:38 about why the environment I think is bad,
01:07:39 but I have hope because I think the thing that’s bad for us
01:07:42 is something that can change.
01:07:44 The first principles idea here is that most people
01:07:47 have the capacity to be a 0.7 to 0.9.
01:07:51 If the environment is properly calibrated
01:07:56 with the right incentives.
01:07:57 Well, I think that maybe,
01:08:00 yeah, if we’re all 0.5s,
01:08:01 environments can bring out our good side.
01:08:05 Yeah, so maybe we’re all on some kind of distribution
01:08:08 and the right environment can, yes,
01:08:11 can bring out our higher sides.
01:08:12 And in a lot of ways you could say it has.
01:08:15 I mean, the U.S. environment,
01:08:17 we take for granted the liberal laws
01:08:20 and liberal environment that we live in.
01:08:23 I mean, like in New York City, right?
01:08:26 If you walk down the street and you like assault someone,
01:08:29 hey, if anyone sees you, they’re probably gonna yell at you.
01:08:31 You might get your ass kicked by someone for doing that.
01:08:32 You also might end up in jail,
01:08:35 there’s security cameras, and there’s just norms,
01:08:38 we’re all trained:
01:08:38 that’s what awful people do, right?
01:08:40 So it’s not that human nature
01:08:42 doesn’t have it in it to be like that.
01:08:44 It’s that this environment we’re in has made that a much,
01:08:47 much, much smaller experience for people.
01:08:50 There’s so many examples like that where it’s like,
01:08:51 man, you don’t realize how much of the worst human nature
01:08:54 is contained by our environment.
01:08:56 But, you know, a rapidly changing environment
01:09:00 is what we have right now. Social media starts,
01:09:02 I mean, what a seismic change to the environment.
01:09:04 There’s a lot of examples like that.
01:09:05 A rapidly changing environment
01:09:06 can create rapidly changing behavior,
01:09:09 and wisdom sometimes can’t keep up.
01:09:11 And so we, you know, we can really kind of lose our grip
01:09:15 on some of the good behavior.
01:09:17 Were you surprised by Elon’s answer about brain transplants
01:09:21 or Sam’s about nuclear power or anything else?
01:09:24 Just…
01:09:25 Sam’s, I think, is a good one.
01:09:27 I have a friend,
01:09:29 Isabel Boumeke, who’s a nuclear power influencer.
01:09:33 I’ve become very convinced
01:09:34 and I’ve not done my deep dive on this.
01:09:37 But here’s, in this case,
01:09:39 this is reasoning by analogy.
01:09:42 The number of really smart people I respect,
01:09:45 who seem to have dug in,
01:09:46 who all say nuclear power is clearly a good option.
01:09:49 It’s obviously emission-free, and
01:09:51 the concerns about meltdowns and waste,
01:09:54 they see as completely overblown.
01:09:56 So judging from those people, secondary knowledge here,
01:10:00 I will say I’m a strong advocate.
01:10:02 I haven’t done my own deep dive yet,
01:10:04 but it does seem a little bit odd
01:10:06 that you’ve got people
01:10:08 who are so concerned about climate change
01:10:11 who are somehow anti nuclear power.
01:10:15 It seems like it’s kind of an ideology
01:10:17 where nuclear power doesn’t fit,
01:10:20 rather than a rational fear of climate change.
01:10:22 It just, yeah.
01:10:23 I personally am uncomfortably reasoning by analogy
01:10:26 with climate change.
01:10:27 I actually have not done a deep dive myself.
01:10:29 Me neither, because it’s so,
01:10:31 man, it seems like a deep dive.
01:10:33 And my reasoning by analogy there
01:10:37 currently has me thinking it’s a truly existential thing,
01:10:40 but feeling hopeful.
01:10:41 So let me, this is me speaking
01:10:44 and this is speaking from a person
01:10:45 who’s not done the deep dive.
01:10:48 I’m a little suspicious
01:10:50 of the amount of fear mongering going on.
01:10:52 Especially over the past couple of years,
01:10:54 I’ve gotten uncomfortable with fear mongering
01:10:56 in all walks of life.
01:10:59 There’s way too many people interested
01:11:01 in manipulating the populace with fear.
01:11:04 And so I don’t like it.
01:11:05 I should probably do a deep dive,
01:11:06 because the big problem
01:11:11 with the fear mongering around climate change
01:11:13 is that it also grows
01:11:15 the skepticism in science broadly.
01:11:20 So I need to make sure I do that deep dive.
01:11:23 I have listened to a few folks
01:11:25 who kind of criticize the fear mongering
01:11:27 and all those kinds of things,
01:11:29 but they’re few and far between.
01:11:30 And so it’s like, all right, what is the truth here?
01:11:33 And it feels lazy, but it also feels like
01:11:36 it’s hard to get to the truth.
01:11:38 There’s a lot of activists talking about ideas,
01:11:43 versus sources of objective,
01:11:48 calm, first-principles-type reasoning.
01:11:52 Like one of the things,
01:11:54 I know it’s supposed to be a very big problem,
01:11:57 but when people talk about catastrophic effects
01:12:00 of climate change,
01:12:01 I haven’t been able to like see really great deep analysis
01:12:06 of what that looks like in 10, 20, 30 years,
01:12:09 rising sea levels.
01:12:12 What are the models of how that changes human behavior,
01:12:17 society, what are the things that happen?
01:12:19 There’s going to be constraints on the resources
01:12:21 and people are gonna have to move around.
01:12:23 This is happening gradually.
01:12:24 Are we gonna be able to respond to this?
01:12:26 How would we respond to this?
01:12:27 What are the best,
01:12:28 like, what are the best models
01:12:30 for how everything goes wrong?
01:12:32 Again, this is a question I keep starting
01:12:36 to ask myself without doing any research,
01:12:38 motivating myself to get to this deep dive
01:12:43 that I feel is a deep one,
01:12:45 just watching people not do a great job
01:12:47 with that kind of modeling with the pandemic,
01:12:49 being caught off guard and wondering,
01:12:52 okay, if we’re not good with this pandemic,
01:12:55 how are we going to respond to other kinds of tragedies?
01:12:57 Well, this is part of why I wrote the book.
01:12:59 ’Cause I said, we’re going to have more and more of these
01:13:04 big collective what-should-we-do-here situations.
01:13:08 Whether it’s, how about when,
01:13:09 we’re probably not that far away
01:13:10 from people being able to decide the IQ of their kid,
01:13:14 or make a bunch of embryos
01:13:15 and actually pick the highest IQ.
01:13:18 What can possibly go wrong?
01:13:19 Yeah.
01:13:20 And also, imagine the political sides of that,
01:13:22 something that only wealthy people can afford
01:13:25 at first, just the nightmare, right?
01:13:27 We need to be able to have our wits about us as a species
01:13:29 where we can actually get into a topic like that
01:13:32 and come up with where the collective brain can be smart.
01:13:36 I think that there are certain topics
01:13:39 where I think of this, and this is again
01:13:41 another simplistic model, but I think it works:
01:13:43 there’s a higher mind and a primitive mind, right?
01:13:45 They’re in your head,
01:13:47 and these team up with others.
01:13:48 The higher mind
01:13:50 is more rational and puts out ideas
01:13:52 that it’s not attached to.
01:13:54 And so it can change its mind easily,
01:13:56 ’cause it’s just an idea, and the higher mind
01:13:58 can get criticized,
01:14:00 its ideas can get criticized, and it’s no big deal.
01:14:02 And so when the higher minds team up,
01:14:05 it’s like all these people in a room
01:14:06 throwing out ideas and kicking them around.
01:14:07 One idea goes out and everyone criticizes it,
01:14:10 which is like shooting arrows at it.
01:14:11 The arrows bounce off the true ideas,
01:14:13 so they rise up,
01:14:16 and the other ones get shot down.
01:14:18 So it’s this incredible system.
01:14:19 This is what
01:14:20 a good scientific institution is:
01:14:22 someone puts out a thing,
01:14:24 criticism arrows come at it,
01:14:26 most ideas fall, and the needles in the haystack
01:14:29 end up rising up, right?
01:14:30 Incredible mechanism.
01:14:30 So what’s happening is that a bunch of
01:14:32 flawed, medium-quality scientists
01:14:34 are creating superintelligence.
01:14:38 Then there’s the primitive mind,
01:14:40 which, you know, is the more limbic system part of our brain.
01:14:44 It’s the part of us that is very much not living in 2021.
01:14:48 It’s living many tens of thousands of years ago.
01:14:51 And it does not treat ideas like this separate thing.
01:14:54 It identifies with its ideas.
01:14:56 It only gets involved when it finds an idea sacred.
01:14:59 It starts holding an idea sacred and identifying with it.
01:15:02 So what happens is they team up too.
01:15:04 And so when you have a topic
01:15:08 that really rouses a bunch of primitive minds,
01:15:13 the primitive minds quickly team up
01:15:14 and they create an echo chamber
01:15:16 where suddenly no one can criticize this.
01:15:18 And in fact, if it’s powerful enough,
01:15:20 even people outside the community,
01:15:22 no one can criticize it.
01:15:22 We will get your paper retracted.
01:15:23 We will get you fired, right?
01:15:25 That’s not higher mind behavior.
01:15:27 That is crazy primitive mind.
01:15:28 And so now what happens is the collective
01:15:30 becomes dumber than an individual,
01:15:32 dumber than a single reasoning individual.
01:15:34 This collective is suddenly attached
01:15:37 to this sacred scripture, this idea,
01:15:40 and they will not change their mind
01:15:43 and they get dumber and dumber.
01:15:44 And so climate change, what’s worrisome
01:15:47 is that climate change has in many ways
01:15:49 become a sacred topic,
01:15:50 where if you come up with a nuanced thing,
01:15:52 you might get branded a denier.
01:15:54 So there goes the super intelligence,
01:15:57 no arrows can be fired.
01:15:59 But if you get called a denier,
01:16:01 that’s a social penalty for firing an arrow
01:16:03 at a certain orthodoxy, right?
01:16:05 And so what’s happening is the big brain
01:16:06 gets like frozen, right?
01:16:08 And it becomes very stupid.
01:16:08 Now, you can also say that
01:16:10 about a lot of other topics right now.
01:16:14 You just mentioned another one, I forget what it was,
01:16:16 but that’s also kind of like this.
01:16:19 The world of vaccines.
01:16:20 Yeah, yeah, COVID, okay.
01:16:21 And here’s my point earlier is that
01:16:23 what I see is that the political divide
01:16:25 has like a whirlpool that’s pulling everything into it.
01:16:28 And in that whirlpool, thinking is done
01:16:32 with the primitive mind tribes.
01:16:34 And so I get, okay, obviously something like race,
01:16:38 that makes sense, that also right now,
01:16:40 the topic of race, for example, or gender,
01:16:42 these things are in the whirlpool.
01:16:43 But that at least is like, okay,
01:16:45 that’s something that the primitive mind
01:16:47 would always get really worked up about.
01:16:49 It taps into like our deepest kind of like primal selves.
01:16:54 COVID, maybe it’s COVID in a way too,
01:16:56 but climate change, that should just be something
01:16:57 that our rational brains are like,
01:16:58 let’s solve this complex problem.
01:17:00 But the problem is that it’s all gotten sucked
01:17:02 into the red versus blue whirlpool.
01:17:04 And once that happens,
01:17:05 it’s in the hands of the primitive minds.
01:17:06 And we’re losing our ability to be wise together,
01:17:09 to make decisions.
01:17:10 It’s like the big species brain is like,
01:17:13 or the big American brain is like,
01:17:15 drunk at the wheel right now.
01:17:16 And we’re about to go into our future
01:17:18 with more and more big technologies,
01:17:21 scary things, we have to make big right decisions.
01:17:23 And instead, we’re getting dumber as a collective.
01:17:26 And that’s part of this environmental problem.
01:17:28 So within the space of technologists
01:17:30 and the space of scientists, we should allow the arrows.
01:17:33 That’s one of the saddest things to me,
01:17:35 is like with the scientists, I’ve seen arrogance.
01:17:39 There’s a lot of mechanisms that maintain the tribe.
01:17:42 It’s the arrogance, it’s how you built up this mechanism
01:17:46 that defends, this wall that defends against the arrows.
01:17:50 It’s arrogance, credentialism,
01:17:54 like just ego, really.
01:17:58 And then just, it protects you
01:17:59 from actually challenging your own ideas,
01:18:01 this ideal of science that makes science beautiful.
01:18:05 In a time of fear, and in a time of division
01:18:09 created by perhaps politicians that leverage the fear,
01:18:12 it, like you said, makes the whole system dumber.
01:18:16 The science system dumber,
01:18:19 the tech developer system dumber,
01:18:22 if they don’t allow the challenging of ideas.
01:18:24 What’s really bad is that like,
01:18:27 in a normal environment,
01:18:28 you’re always gonna have echo chambers.
01:18:30 So what’s the opposite of an echo chamber?
01:18:32 I created a term for it, because I think we need it,
01:18:34 which is called an idea lab, right?
01:18:36 It’s like people act like scientists,
01:18:38 even if they’re not doing science,
01:18:40 they just treat their ideas like science experiments
01:18:43 and they toss them out there and everyone disagrees.
01:18:45 And disagreement is like the game.
01:18:46 Everyone likes to disagree.
01:18:48 There are certain text threads where everyone is just saying,
01:18:50 it’s almost like someone throws something out
01:18:52 and just it’s an impulse for the rest of the group to say,
01:18:54 I think you’re being like overly general there.
01:18:56 Or I think like, aren’t you kind of being,
01:18:58 I think that’s like your bias showing.
01:19:00 And it’s like, no one’s getting offended
01:19:01 because it’s like, we’re all just messing,
01:19:02 we all of course respect each other, obviously.
01:19:04 We’re just, you know, trashing each other’s ideas
01:19:08 and then the whole group becomes smarter.
01:19:10 You’re always gonna have idea labs and echo chambers,
01:19:12 right, in different communities.
01:19:13 And most of us participate in both of them.
01:19:15 You know, maybe your marriage is a great idea lab,
01:19:18 you love to disagree with your spouse, but maybe in
01:19:20 this group of friends or with your family at home,
01:19:23 you know, in front of that sister,
01:19:24 you do not bring up politics
01:19:25 because when that happens,
01:19:27 her bullying forces the whole room
01:19:30 to be an echo chamber to appease her.
01:19:32 Now, what scares me is that usually have these things
01:19:35 existing kind of in bubbles.
01:19:36 And usually they each
01:19:39 have their natural defenses against each other.
01:19:40 So an echo chamber person stays in their echo chamber.
01:19:44 They don’t like, they will cut you out.
01:19:45 They don’t like to be friends with people
01:19:47 who disagree with them.
01:19:48 You notice that they will cut you out.
01:19:49 They’ll cut out their parents
01:19:50 if they voted for Trump or whatever, right?
01:19:51 So that’s how they do it.
01:19:54 They will say, I’m going to stay inside
01:19:56 of an echo chamber safely.
01:19:57 So my ideas, which I identify with
01:20:00 because my primitive mind is doing the thinking
01:20:02 are not going to ever have to get challenged
01:20:04 because it feels so scary and awful for that to happen.
01:20:07 But if they leave and they go into an idea lab environment,
01:20:09 they’re going to, people are going to say, what?
01:20:10 No, they’re going to disagree.
01:20:11 And the person’s going to try
01:20:12 to bully them.
01:20:13 They’re going to say, that’s really offensive.
01:20:15 And people are going to say, no, it’s not.
01:20:16 And they’re going to immediately say
01:20:17 these people are assholes, right?
01:20:19 So the echo chamber person,
01:20:21 doesn’t have much power once they leave the echo chamber.
01:20:24 Likewise, the idea lab person,
01:20:26 they have this great environment,
01:20:27 but if they go into an echo chamber
01:20:28 where everyone else is, and they act the same way,
01:20:29 they will get kicked out of the group.
01:20:30 They will get branded as something,
01:20:32 a denier, a racist, a right winger, a radical,
01:20:36 these nasty words.
01:20:39 The thing that I don’t like right now
01:20:41 is that the echo chambers have found ways
01:20:44 to forcefully expand into places
01:20:48 that normally have a pretty good immune system
01:20:51 against echo chambers, like universities,
01:20:53 like science journals,
01:20:54 places where usually it’s like,
01:20:56 there’s a strong idea lab culture there. Veritas.
01:20:59 You know, that’s an idea lab slogan.
01:21:02 What you have is that
01:21:06 a lot of people have found a way
01:21:07 to actually go outside of their community
01:21:09 and keep their echo chamber
01:21:10 by making sure that everyone is scared
01:21:11 because they can punish anyone,
01:21:13 whether you’re in their community or not.
01:21:15 So that’s all brilliantly put.
01:21:19 When’s the book coming out?
01:21:20 Any idea?
01:21:21 June, July, we’re not quite sure yet.
01:21:24 Okay, I can’t wait.
01:21:25 Thanks. It’s awesome.
01:21:26 Do you have a title yet or you can’t talk about that?
01:21:28 Still working on it.
01:21:29 Okay.
01:21:30 If it’s okay, just a couple of questions from Mailbag.
01:21:32 I just love these.
01:21:34 I would love to hear you riff on these.
01:21:37 So one is about film and music.
01:21:39 Why do we prefer to watch, the question goes,
01:21:41 why do we prefer to watch a film
01:21:43 we haven’t watched before,
01:21:45 but we want to listen to songs
01:21:47 that we have heard hundreds of times?
01:21:50 This question and your answer
01:21:51 really started to make me think like,
01:21:53 yeah, that’s true.
01:21:54 That’s really interesting.
01:21:55 Like we draw that line somehow.
01:21:59 So what’s the answer?
01:22:00 So I think, let’s use these two minds again.
01:22:02 I think that when your higher mind
01:22:04 is the one who’s taking something in
01:22:06 and they’re really interested in,
01:22:07 what are the lyrics or I’m gonna learn something
01:22:08 or reading a book or whatever.
01:22:11 And the higher mind is trying to get information.
01:22:15 And once it has it,
01:22:16 there’s no point in listening to it again.
01:22:17 It has the information.
01:22:20 Your rational brain is like, I got it.
01:22:23 But when you eat a good meal or have sex or whatever,
01:22:27 that’s something you can do again and again,
01:22:28 because it actually, your primitive brain loves it.
01:22:32 And it never gets bored of things that it loves.
01:22:35 So I think music is a very primal thing.
01:22:38 I think music goes right into our primitive brain.
01:22:40 I mean, of course, it’s a collaboration.
01:22:43 Your rational brain is absorbing the actual message.
01:22:47 But I think it’s all about emotions
01:22:48 and even more than emotions,
01:22:50 it literally like the music taps into like some very,
01:22:53 very deep, primal part of us.
01:22:59 And so when you hear a song once,
01:23:02 even some of your favorite songs,
01:23:03 the first time you heard it,
01:23:04 you were like, I guess that’s kind of catchy.
01:23:05 Yeah.
01:23:07 And then you end up loving it on the 10th listen.
01:23:10 But sometimes you even don’t even like a song.
01:23:11 You’re like, oh, this song sucks.
01:23:12 But suddenly you find yourself on the 40th time
01:23:15 because it’s on the radio all the time,
01:23:16 just kind of being like, oh, I love this song.
01:23:17 And you’re like, wait, I hated the song.
01:23:19 And what’s happening is that the sound is actually,
01:23:23 the music’s actually carving a pathway in your brain
01:23:27 and it’s a dance.
01:23:29 And when your brain knows what’s coming,
01:23:30 it can dance, it knows the steps.
01:23:33 So your brain is your internal kind of,
01:23:35 your brain is actually dancing with the music
01:23:37 and it knows the steps and it can anticipate.
01:23:40 And so there’s something about knowing,
01:23:45 having memorized the song
01:23:46 that makes it incredibly enjoyable to us.
01:23:48 But when we hear it for the first time,
01:23:49 we don’t know where it’s gonna go.
01:23:50 We’re like an awkward dancer.
01:23:51 We don’t know the steps and your primitive brain
01:23:53 can’t really have that much fun yet.
01:23:55 That’s how I feel.
01:23:56 And in the movies, that’s less primitive.
01:23:59 That’s the story you’re taking in.
01:24:03 But a really good movie that we really love,
01:24:05 often we will watch it like 12 times.
01:24:07 Still, like, not that many,
01:24:09 but versus if you’re watching a talk show, right?
01:24:11 If you’re listening to one of your podcasts,
01:24:14 as a perfect example,
01:24:15 there’s not many people that will listen
01:24:16 to one of your podcasts,
01:24:17 no matter how good it is, 12 times.
01:24:19 Because once you’ve got it, you’ve got it.
01:24:21 It’s a form of information that’s very higher mind focused.
01:24:25 That’s how I read it.
01:24:26 Well, the funny thing is there is people
01:24:28 that listen to a podcast episode many, many times.
01:24:32 And often I think the reason for that
01:24:33 is not because of the information, is the chemistry,
01:24:36 is the music of the conversation.
01:24:38 So it’s not the actual.
01:24:38 It’s the art of it they like.
01:24:40 Yeah, they’ll fall in love with some kind of person,
01:24:42 some weird personality, and they’ll just be listening to,
01:24:45 they’ll be captivated by the beat of that kind of person.
01:24:47 Or like a standup comic.
01:24:48 I’ve watched like certain things,
01:24:49 like episodes like 20 times, even though I, you know.
01:24:53 I have to ask you about the wizard hat.
01:24:56 You had a blog about Neuralink.
01:24:58 I got a chance to visit Neuralink a couple of times,
01:25:00 hanging out with those folks, that was one of the pieces
01:25:05 of writing you did that like changes culture
01:25:09 and changes the way people think about a thing.
01:25:13 The ridiculousness of your stick figure drawings
01:25:15 is somehow, it’s like calling the origin
01:25:21 of the universe the Big Bang.
01:25:23 It’s a silly title, but it somehow sticks
01:25:25 to be the representative of that.
01:25:28 And in the same way, the wizard hat for Neuralink
01:25:31 was somehow a really powerful way to explain that.
01:25:35 You actually proposed that the man of the year
01:25:37 cover of Time should be.
01:25:39 One of my drawings.
01:25:40 One of your drawings.
01:25:41 Yes, yes.
01:25:42 It’s an outrage that it wasn’t.
01:25:43 It wasn’t.
01:25:44 Okay, so what are your thoughts about like all those years
01:25:49 later about Neuralink?
01:25:50 Do you find this idea, like what excites you about it?
01:25:54 Is it the big long term philosophical things?
01:25:56 Is it the practical things?
01:25:58 Do you think it’s super difficult to do
01:26:00 on the neurosurgery side
01:26:02 and the material engineering, the robotics side?
01:26:05 Or do you think the machine learning side
01:26:08 for the brain computer interfaces
01:26:09 where they get to learn about each other,
01:26:11 all that kind of stuff.
01:26:12 I would just love to get your thoughts
01:26:13 because you’re one of the people
01:26:14 that really considered this problem,
01:26:17 really studied it, brain computer interfaces.
01:26:19 I mean, I’m super excited about it.
01:26:21 It’s a, I really think it’s actually
01:26:24 Elon’s most ambitious thing.
01:26:27 More than colonizing Mars
01:26:28 because that’s just a bunch of people going somewhere,
01:26:31 even though it’s somewhere far.
01:26:33 Neuralink is changing what a person is eventually.
01:26:37 Now, I think that Neuralink engineers and Elon himself
01:26:42 would all be the first to admit that it is a maybe
01:26:45 whether they can achieve their goals here.
01:26:46 I mean, it is so crazy ambitious to try to,
01:26:50 even their eventual goals are,
01:26:52 of course in the interim,
01:26:53 they have a higher probability
01:26:55 of accomplishing smaller things,
01:26:56 which are still huge, like basically solving paralysis,
01:27:01 strokes, Parkinson’s, things like that.
01:27:02 I mean, it can be unbelievable.
01:27:03 And anyone who doesn’t have one of these things,
01:27:06 like we might one day, everyone should be very happy
01:27:08 about this kind of help with different disabilities.
01:27:15 But the thing that is like,
01:27:17 so the grand goal is this augmentation
01:27:21 where you take someone who’s totally healthy
01:27:23 and you put a brain machine interface in them
01:27:26 to give them superpowers.
01:27:29 It’s the possibilities if they can do this,
01:27:32 if they can really,
01:27:33 so they’ve already shown that they are for real,
01:27:36 they’ve created this robot.
01:27:38 Elon talks about like, it should be like LASIK,
01:27:40 where it’s not,
01:27:42 it shouldn’t be something that needs a surgeon.
01:27:44 This shouldn’t just be for rich people
01:27:46 who have waited in line for six months.
01:27:48 It should be for anyone who can afford LASIK
01:27:50 and eventually, hopefully something that is covered by
01:27:53 insurance, something that anyone can do.
01:27:56 Something this big a deal should be something
01:27:57 that anyone can afford eventually.
01:28:00 And when we have this, again,
01:28:03 I’m talking about a very advanced phase down the road.
01:28:05 So maybe a less advanced phase,
01:28:07 just maybe right now,
01:28:11 if you think about when you listen to a song,
01:28:13 what’s happening, do you actually hear the sound?
01:28:16 Well, not really.
01:28:18 It’s that the sound is coming out of the speaker.
01:28:22 The speaker is vibrating.
01:28:23 It’s vibrating air molecules.
01:28:25 Those air molecules, you know,
01:28:26 get vibrated all the way to your head as a pressure wave.
01:28:31 And then it vibrates your eardrum.
01:28:34 Your eardrum is really the speaker now in your head
01:28:37 that then vibrates bones and fluid,
01:28:40 which then stimulates neurons in your auditory cortex,
01:28:44 which give you the perception that you’re hearing sound.
01:28:49 Now, if you think about that,
01:28:52 do we really need to have a speaker to do that?
01:28:55 You could just somehow,
01:28:56 if you had a little tiny thing
01:28:56 that could vibrate eardrums,
01:28:57 you could do it that way.
01:28:59 That seems very hard.
01:29:00 But really what you need,
01:29:02 if you go to the very end
01:29:03 to the thing that really needs to happen,
01:29:05 is your auditory cortex neurons
01:29:07 need to be stimulated in a certain way.
01:29:09 If you have a ton of Neuralink things in there,
01:29:11 Neuralink electrodes,
01:29:13 and they get really good at stimulating things,
01:29:15 you could play a song in your head
01:29:17 that you hear that is not playing anywhere.
01:29:20 There’s no sound in the room,
01:29:21 but you hear it and no one else can.
01:29:23 It’s not like they can get close to your head and hear it.
01:29:25 There’s no sound.
01:29:26 They could not hear anything,
01:29:27 but you hear sound.
01:29:28 You can turn it up.
01:29:29 So you open your phone,
01:29:30 you have the Neuralink app.
01:29:31 You open the Neuralink app,
01:29:32 and or just Neuralink.
01:29:34 So basically you can open your Spotify
01:29:35 and you can play to your speaker,
01:29:38 you can play to your computer,
01:29:39 you can play right out of your phone to your headphones,
01:29:41 or you can have a new one.
01:29:43 You can play into your brain.
01:29:44 And this is one of the earlier things.
01:29:47 This is something that seems like really doable.
01:29:50 So no more headphones.
01:29:52 I always think it’s so annoying
01:29:53 because I can leave the house with just my phone
01:29:56 and nothing else,
01:29:57 or even just an Apple watch.
01:29:58 But there’s always this one thing,
01:29:59 I’m like, and headphones.
01:30:00 You do need your headphones, right?
01:30:01 So I feel like that’ll be the end of that.
01:30:03 But there’s so many things that you,
01:30:04 and you keep going,
01:30:06 the ability to think together.
01:30:08 You can talk about like super brains.
01:30:10 I mean, one of the examples Elon uses
01:30:12 is that the low bandwidth of speech.
01:30:16 If I go to a movie and I come out of a scary movie
01:30:19 and you say, how was it?
01:30:20 I said, oh, it was terrifying.
01:30:21 Well, what did I just do?
01:30:22 I just gave you,
01:30:24 I had five buckets I could have given you.
01:30:26 One was horrifying, terrifying, scary, eerie, creepy,
01:30:29 whatever.
01:30:30 That’s about it.
01:30:31 And I had a much more nuanced experience than that.
01:30:36 And all I have is these words, right?
01:30:39 And so instead I just hand you the bucket.
01:30:41 I put the stuff in the bucket and give it to you,
01:30:44 but all you have is the bucket.
01:30:45 You just have to guess what I put into that bucket.
01:30:47 All you can do is look at the label of the bucket and say,
01:30:50 when I say terrifying, here’s what I mean.
01:30:52 So the point is it’s very lossy.
01:30:55 I had all this nuanced information
01:30:57 of what I thought of the movie.
01:30:58 And I’m sending you a very low res package
01:31:00 that you’re gonna now guess
01:31:02 what the high res thing looked like.
01:31:04 That’s language in general.
01:31:05 Our thoughts are much more nuanced.
01:31:07 We can think to each other.
01:31:08 We can do amazing things.
01:31:09 We could, A, have a brainstorm where
01:31:11 we’re not talking in each other’s heads.
01:31:13 Not just that I hear your voice.
01:31:14 No, no, no, we are just thinking.
01:31:15 No words are being said internally or externally.
01:31:19 The two brains are literally collaborating.
01:31:21 It’s something, it’s a skill.
01:31:22 I’m sure we’d have to get good at it.
01:31:23 I’m sure young kids will be great at it
01:31:25 and old people will be bad.
01:31:26 But you think together and together you’re like,
01:31:28 oh, we had the joint epiphany.
01:31:29 And now how about eight people in a room doing it, right?
01:31:31 So it gets, you know, there’s other examples.
01:31:34 How about when you’re a dress designer or a bridge designer
01:31:37 and you want to show people what your dress looks like.
01:31:40 Well, right now you gotta sketch it for a long time.
01:31:42 Here, just beam it onto the screen from your head.
01:31:44 So you can picture it.
01:31:45 If, you know, if you can picture a tree in your head,
01:31:47 well, you can just suddenly,
01:31:48 whatever’s in your head, you can be pictured.
01:31:50 So we’ll have to get very good at it, right?
01:31:52 It’ll take a skill, right?
01:31:52 You know, you’re gonna have to,
01:31:54 but the possibilities, my God.
01:31:56 Talk about like, I feel like if that works,
01:31:58 if we really do have that as something,
01:32:01 I think it’ll almost be like a new AD/BC line.
01:32:04 It’s such a big change that the idea of like anyone living
01:32:07 before everyone had brain machine interfaces
01:32:09 is living in like before the common era.
01:32:12 It’s that level of like big change if it can work.
01:32:16 Yeah, and like a replay of memories,
01:32:18 just replaying stuff in your head.
01:32:19 Oh my God, yeah.
01:32:20 And copying, you know, you can hopefully copy memories
01:32:23 onto other things and you don’t have to just rely on your,
01:32:26 you know, your wet circuitry.
01:32:27 It does make me sad because you’re right.
01:32:29 The brain is incredibly neuroplastic
01:32:32 and so it can adjust, it can learn how to do this.
01:32:34 I think it’ll be a skill.
01:32:36 Although probably you and I will be too old to truly learn.
01:32:38 Well, maybe we can get, there’ll be great trainings.
01:32:40 You know, I’m spending the next three months
01:32:42 in like a, you know, one of the Neuralink trainings.
01:32:44 But it’ll still be a bit of like grandpa, I can’t.
01:32:47 Definitely.
01:32:48 This is, you know, I was thinking,
01:32:49 how am I gonna be old?
01:32:50 I’m like, no, I’m gonna be great at the new phones.
01:32:51 But it’s not gonna be the phones.
01:32:53 It’s gonna be that, you know,
01:32:54 the kid’s gonna be thinking to me.
01:32:55 I’m gonna be like, I just, can you just talk please?
01:32:58 And they’re gonna be like, okay, I’ll just talk.
01:32:59 And they’re gonna, so that’ll be the equivalent of,
01:33:02 you know, yelling to your grandparents today.
01:33:04 I really suspect, I don’t know what your thoughts are,
01:33:06 but I grew up in a time when physical contact interaction
01:33:11 was valuable.
01:33:12 I just feel like that’s going to go the way
01:33:15 that’s gonna disappear.
01:33:17 What, why?
01:33:17 I mean, is there anything more intimate
01:33:19 than thinking with each other?
01:33:20 I mean, that’s, you talk about, you know,
01:33:21 once we’re all doing that, it might feel like, man,
01:33:23 everyone was so isolated from each other so far.
01:33:24 Yeah, sorry.
01:33:25 So I didn’t say that intimacy disappears.
01:33:27 I just meant physical, having to be in the same,
01:33:29 having to touch each other.
01:33:31 If people like that, if it is important,
01:33:34 won’t there be whole waves of people start to say,
01:33:36 you know, there’s all these articles that come out
01:33:38 about how, you know, in our metaverse,
01:33:39 we’ve lost something important and then now there’s a huge,
01:33:42 all first the hippies start doing it
01:33:43 and then eventually it becomes this big wave
01:33:44 and now everyone, won’t, you know,
01:33:46 if something truly is lost, won’t we recover it?
01:33:49 Well, I think from first principles,
01:33:51 all of the components are there to engineer
01:33:54 intimate experiences in the metaverse or in the cyberspace.
01:34:00 And so to me, I don’t see anything profoundly unique
01:34:06 to the physical experience.
01:34:08 Like I don’t understand.
01:34:09 But then why are you saying there’s a loss there?
01:34:12 No, I’m just sad because I won’t,
01:34:13 oh, it’s a loss for me personally, because the world.
01:34:16 So then you do think there’s something unique
01:34:17 in the physical experience.
01:34:18 For me, because I was raised with it.
01:34:20 Oh.
01:34:21 Yeah, yeah, yeah.
01:34:21 So whatever, so anything you’re raised with,
01:34:24 you fall in love with.
01:34:25 Like people in this country came up with baseball.
01:34:27 I was raised in the Soviet Union.
01:34:29 I don’t understand baseball.
01:34:30 I get, I like it, but I don’t love it
01:34:32 the way Americans love it.
01:34:36 Because a lot of times they went to baseball games
01:34:38 with their father and then there’s that family connection.
01:34:41 There’s a young kid dreaming about, I don’t know,
01:34:47 becoming an MLB player himself.
01:34:49 I don’t know, something like that.
01:34:50 But that’s what you’re raised with,
01:34:52 obviously is really important.
01:34:53 But I mean, fundamentally to the human experience,
01:34:57 listen, we’re doing this podcast in person.
01:34:58 So clearly I still value it, but.
01:35:01 But it’s true.
01:35:02 If this were through a screen, obviously,
01:35:04 we all agree that’s not the same.
01:35:05 Yeah, it’s not the same.
01:35:06 But if this were, say, we had contact lenses on,
01:35:08 and maybe Neuralink, maybe, again, forget
01:35:11 the devices, all the devices,
01:35:13 even something as cool as a contact lens,
01:35:15 that’s all old school.
01:35:16 Once you have the brain machine interface,
01:35:18 it’ll just be projection of,
01:35:20 it’ll take over my visual cortex.
01:35:22 My visual cortex will get put into a virtual room
01:35:25 and so will yours.
01:35:26 So we will see, we will hear, really hear and see
01:35:29 as if we’re together, no VR mask needed.
01:35:32 And at that point, it really will feel like you’ll forget.
01:35:35 You’ll say, were we together physically or not?
01:35:37 You won’t even,
01:35:38 it’d be so unimportant you won’t even remember.
01:35:40 And you’re right.
01:35:41 This is one of those shifts in society
01:35:45 that changes everything.
01:35:46 Romantically, people still need to be together.
01:35:50 There’s a whole set of like physical things
01:35:52 with a relationship that are needed.
01:35:55 You know, like.
01:35:56 Like what?
01:35:57 Like sex?
01:35:58 Sex, but also just like there’s pheromones.
01:36:00 Like there’s the physical touch is such a,
01:36:03 that’s like music.
01:36:04 It goes to such a deeply primitive part of us
01:36:06 that what physical touch with a romantic partner does,
01:36:09 that I think that,
01:36:11 so I’m sure there’ll be a whole wave of people who,
01:36:13 their new thing is that, you know,
01:36:14 you’re romantically involved with people
01:36:16 you never actually are in person with,
01:36:17 but, and I’m sure there’ll be things
01:36:18 where you can actually smell what’s in the room
01:36:20 and you can.
01:36:21 Yeah, and touch.
01:36:22 Yeah, but I think that’ll be one of the last things to go.
01:36:24 I think there’ll be, there’s something,
01:36:26 that to me seems like something that’ll be a while
01:36:29 before people feel like there’s nothing lost
01:36:32 by not being in the same.
01:36:33 It’s very difficult to replicate the human interaction.
01:36:35 Although sex also, again,
01:36:37 not to get too, like, weird,
01:36:39 but you could have a thing where you,
01:36:41 you’re basically, you know, or, you know,
01:36:44 let’s just do a massage because it’s less like awkward,
01:36:46 but like someone, you know,
01:36:48 everyone is still imagining sex.
01:36:49 So go on.
01:36:50 A masseuse could massage a fake body
01:36:54 and you could feel whatever’s happening, right?
01:36:57 So you’re lying down in your apartment alone,
01:36:58 but you’re feeling a full massage.
01:37:00 There’ll be the new like YouTube or like streaming
01:37:02 where it’s one masseuse massaging one body,
01:37:05 but like a thousand people are experiencing.
01:37:06 Exactly.
01:37:07 Think about it right now.
01:37:08 You know what, Taylor Swift doesn’t play for one person
01:37:10 and have to go around and play for
01:37:12 every one of her fans, or a book, right?
01:37:14 You do it and it goes everywhere.
01:37:15 So it’ll be the same idea.
01:37:18 You’ve written and thought a lot about AI.
01:37:21 So AI safety specifically,
01:37:25 you’ve mentioned you’re actually starting a podcast,
01:37:27 which is awesome.
01:37:28 You’re so good at talking, so good at thinking,
01:37:30 so good at being weird in the most beautiful of ways,
01:37:33 but you’ve been thinking about this AI safety question.
01:37:38 Where today does your concern lie
01:37:41 for the near future, for the long-term future?
01:37:43 Like quite a bit of stuff happened,
01:37:46 including with Elon’s work with Tesla Autopilot.
01:37:48 There’s a bunch of amazing robots with Boston Dynamics
01:37:52 and everyone’s favorite vacuum robot, iRobot’s Roomba.
01:37:57 And then there’s obviously the applications
01:37:59 of machine learning for recommender systems
01:38:01 in Twitter, Facebook, and so on.
01:38:04 And face recognition for surveillance,
01:38:07 all these kinds of things are happening.
01:38:09 Just a lot of incredible use, not of face recognition,
01:38:12 but the incredible use of deep learning,
01:38:15 machine learning to capture information about people
01:38:19 and try to recommend to them what they wanna consume next.
01:38:24 Some of that can be abused,
01:38:25 some of that can be used for good,
01:38:27 like for Netflix or something like that.
01:38:29 What are your thoughts about all this?
01:38:30 Yeah, I mean, I really don’t think humans are very smart,
01:38:35 all things considered, I think we’re like limited.
01:38:38 And we’re dumb enough that we’re very easily manipulable.
01:38:43 Not just like, oh, like our emotions,
01:38:45 people can, our emotions can be pulled like puppet strings.
01:38:50 I mean, again, I look at like,
01:38:51 I do look at what’s going on in political polarization now
01:38:53 and I see a lot of puppet string emotions happening.
01:38:56 So yeah, there’s a lot to be scared of for sure,
01:38:58 like very scared of.
01:39:00 I get excited about a lot of very specific things.
01:39:03 Like one of the things I get excited about is I like,
01:39:06 so the future of wearables, right?
01:39:08 Again, I think that it would be like,
01:39:09 oh, the wrist, the Fitbit around my wrist
01:39:11 is gonna seem, you know, the whoop
01:39:13 is gonna seem really hilariously old school in 20 years.
01:39:17 Back with Neuralink.
01:39:18 Like a big bracelet, right?
01:39:21 It’s gonna turn into little sensors in our blood probably,
01:39:24 or, you know, even, you know, infrared,
01:39:26 we’re, you know, just things that are gonna be,
01:39:28 it’s gonna be collecting a hundred times more data
01:39:31 than it collects now, more nuanced data,
01:39:32 more specific to our body.
01:39:34 And it’s going to be, you know, super reliable,
01:39:36 but that’s the hardware side.
01:39:38 And then the software is gonna be,
01:39:40 this is, I’ve not done my deep dive.
01:39:41 This is all speculation,
01:39:42 but the software is gonna get really good.
01:39:45 And this is the AI component.
01:39:46 And so I get excited about specific things like that.
01:39:49 Like think about if hardware were able to collect,
01:39:54 first of all, the hardware knows your whole genome
01:39:56 and we know a lot more about what a genome sequence means.
01:40:00 Cause you can collect your genome now
01:40:03 and we just don’t know much.
01:40:04 We, okay, we don’t have much to do with that information.
01:40:07 As AI gets, so now you have your genome,
01:40:09 you’ve got what’s in your blood at any given moment,
01:40:11 all the levels of everything, right?
01:40:13 You have the exact width of your heart arteries
01:40:16 at any given moment, you’ve got.
01:40:18 All the, all the virions,
01:40:20 all the viruses that ever visited your body
01:40:22 cause there’s a trace of it.
01:40:24 So you have all the pathogens,
01:40:25 all the things that like,
01:40:27 you should be concerned about health wise
01:40:29 and might have threatened you,
01:40:30 you might be immune from all of that kind of stuff.
01:40:32 They also, of course it knows
01:40:34 how fast your heart is beating
01:40:35 and it knows how much you know,
01:40:37 exactly the amount of exercise,
01:40:39 knows your muscle mass and your weight and all that,
01:40:41 but it also maybe can even know your emotions.
01:40:42 I mean, emotions, you know, what are they,
01:40:44 you know, where do they come from?
01:40:45 Probably pretty obvious chemicals once we get in there.
01:40:48 So again, Neuralink can be involved here maybe
01:40:50 in collecting information, you know,
01:40:53 cause right now you have to do the thing,
01:40:54 what’s your mood right now?
01:40:55 And it’s hard to even assess, you know,
01:40:56 and you’re in a bad mood, it’s hard to even, but.
01:40:58 By the way, just as a shout out,
01:41:00 Lisa Feldman Barrett, who’s a neuroscientist
01:41:03 at Northeastern just wrote a,
01:41:05 I mean, not just, like a few years ago,
01:41:07 wrote a whole book saying our expression of emotions
01:41:09 has nothing to do with the experience of emotions.
01:41:12 So you really actually want to be measuring.
01:41:15 That, that’s exactly.
01:41:16 I, you can tell because one of these apps pops up
01:41:18 and says, you know, what, how do you feel right now?
01:41:20 Good, bad?
01:41:21 I’m like, I don’t know.
01:41:22 Like I feel bad right now because the thing popping up
01:41:24 reminded me that I’m procrastinating.
01:41:26 So I was on my phone, I should have been more,
01:41:27 you know, I’m like, that’s not my, you know.
01:41:29 So I think it will probably be able to very,
01:41:33 get all this info, right?
01:41:34 Now the AI can go to town.
01:41:37 Think about when the AI gets really good at this
01:41:39 and it knows your genome and it knows it can just,
01:41:42 I want the AI to just tell me what to do when it turns up.
01:41:46 Okay, so how about this?
01:41:47 Now imagine attaching that to a meal service, right?
01:41:50 And the meal service has everything, you know,
01:41:51 all the million ingredients and supplements and vitamins
01:41:54 and everything.
01:41:55 And I give the, I tell the AI my broad goals.
01:41:59 I want to gain muscle or I want to, you know,
01:42:02 maintain my weight, but I want to have more energy
01:42:04 or whatever, I just want, or I want to, you know,
01:42:05 I just want to be very healthy and I want to,
01:42:06 obviously everyone wants the same, like 10 basic things.
01:42:09 Like you want to avoid cancer, you want to, you know,
01:42:11 various things, you want to age slower.
01:42:14 So now the AI has my goals and a drone comes at,
01:42:19 you know, a little thing pops up and it says like,
01:42:22 you know, beep, beep, like, you know, 15 minutes,
01:42:24 you’re going to eat because it knows that’s a great,
01:42:25 that’s the right time for my body to eat.
01:42:27 15 minutes later, a little slot opens in my wall
01:42:30 where a drone has come from the factory,
01:42:32 the eating, the food factory and dropped the perfect meal
01:42:35 for my, that moment for me, for my mood, for my genome,
01:42:38 for my blood contents.
01:42:40 And it’s, it’s because it knows my goals.
01:42:42 So, you know, it knows I want to feel energy at this time
01:42:44 and then I want to wind down here.
01:42:45 So those things you have to tell it.
01:42:47 Well, plus the pleasure thing, like it knows what kind
01:42:50 of components of a meal you’ve enjoyed in the past
01:42:52 so you can assemble the perfect meal.
01:42:53 Exactly, it knows you way better than you know yourself,
01:42:56 better than any human could ever know you.
01:42:58 And a little thing pops up,
01:42:59 you still have some choice, right?
01:43:01 Still, it pops up and it says like, you know, coffee,
01:43:05 because it knows, you know, my cutoff.
01:43:07 It says, you know, I can have coffee
01:43:08 for the next 15 minutes only because at that point
01:43:11 it knows how long it stays in my system.
01:43:12 It knows what my sleep is like when I have it too late.
01:43:14 It knows I have to wake up at this time tomorrow
01:43:16 because that was my calendar.
01:43:17 And so I think a lot of people’s, this is,
01:43:19 I think something that humans are wrong about
01:43:22 is that most people will hear this and be like,
01:43:23 that sounds awful, that sounds dystopian.
01:43:26 No, it doesn’t, it sounds incredible.
01:43:27 And if we all had this, we would not look back and be like,
01:43:29 I wish I was like making awful choices every day
01:43:32 like I was in the past.
01:43:33 And then this isn’t, these aren’t important decisions.
01:43:36 Your important decision making energy,
01:43:38 your important focus and your attention can go
01:43:41 onto your kids and on your work and on, you know,
01:43:44 helping other people and things that matter.
01:43:46 And so I think AI, when I think about like personal
01:43:49 lifestyle and stuff like that, I really love,
01:43:53 like I love thinking about that.
01:43:55 I think it’s gonna be very, and I think we’ll all be
01:43:57 so much healthier that when we look back today,
01:44:00 one of the things that’s gonna look so primitive
01:44:02 is the one size fits all thing,
01:44:03 getting like reading advice about keto.
01:44:07 Each genome is gonna have very specific,
01:44:10 one, you know, unique advice coming from AI.
01:44:12 And so, yeah.
01:44:14 Yeah, the customization that’s enabled by collection
01:44:16 of data and the use of AI, a lot of people think
01:44:19 what’s the, like they think of the worst case scenario
01:44:22 that data being used by authoritarian governments
01:44:25 to control you, all that kind of stuff.
01:44:26 They don’t think about most likely,
01:44:28 especially in a capitalist society,
01:44:30 it’s most likely going to be used as part of a competition
01:44:34 to get you the most delicious and healthy meal possible
01:44:36 as fast as possible.
01:44:38 Yeah, so the world will definitely be much better
01:44:40 with the integration of data.
01:44:42 But of course, you wanna be able to be transparent
01:44:46 and honest about how that data is misused.
01:44:48 And that’s why it’s important to have free speech
01:44:50 and people to speak out, like when some bullshit
01:44:52 is being done by companies.
01:44:53 That we need to have our wits about us as a society.
01:44:55 Like this is free speech is the mechanism
01:45:00 by which the big brain can think, can think for itself,
01:45:03 can think straight, can see straight.
01:45:05 When you take away free speech, when you start saying
01:45:08 that in every topic, when any topic’s political,
01:45:10 it becomes treacherous to talk about.
01:45:12 So forget the government taking away free speech.
01:45:15 If the culture penalizes nuanced conversation
01:45:18 about any topic that’s political
01:45:21 and the politics is so all consuming
01:45:24 and it’s such an incredible market to polarize people,
01:45:30 for media to polarize people and to bring any topic it can
01:45:32 into that and get people hooked on it as a political topic,
01:45:36 we become a very dumb society.
01:45:38 So free speech goes away as far as it matters.
01:45:39 People say, oh, people like to say,
01:45:42 you don’t even know what free speech is.
01:45:43 Free speech is, your free speech is not being violated.
01:45:46 It’s like, no, you’re right.
01:45:48 My first amendment rights are not being violated.
01:45:50 But the culture of free speech,
01:45:52 which is the second ingredient of two,
01:45:53 you need the first amendment
01:45:55 and you need the culture of free speech.
01:45:57 And now you have free speech
01:45:58 and the culture is much more specific.
01:46:00 You obviously can have a culture that believes people
01:46:03 right now take any topic again, that has to do with like,
01:46:06 some very sensitive topics, police shootings,
01:46:10 or what’s going on in K through 12 schools
01:46:13 or even climate change, take any of these.
01:46:16 And the first amendment’s still there.
01:46:19 You’re not gonna get arrested no matter what you say.
01:46:22 The culture of free speech is gone
01:46:25 because you will be destroyed.
01:46:27 Your life can be over as far as it matters
01:46:30 if you say the wrong thing.
01:46:31 But a really vigorous culture of free speech,
01:46:35 you get no penalty at all
01:46:36 for even saying something super dumb.
01:46:38 People will say, like, people will laugh and be like,
01:46:40 well, that was like kind of hilariously offensive
01:46:42 and like, not at all correct.
01:46:43 Like, you know, you’re wrong and here’s why.
01:46:45 But no one’s like mad at you.
01:46:47 Now the brain is thinking at its best.
01:46:49 The IQ of the big brain is like,
01:46:51 as high as it can be in that culture.
01:46:53 And in the culture where you say something wrong
01:46:55 and people say, oh, wow, you’ve changed.
01:46:56 Oh, wow, like, look, these are his true colors.
01:46:58 Now the big brain is dumb.
01:47:01 You still have mutual respect for each other.
01:47:03 So like, you don’t think lesser of others
01:47:06 when they say a bunch of dumb things.
01:47:08 You know it’s just the play of ideas.
01:47:10 But you still have respect, you still have love for them.
01:47:12 Because I think the worst case is
01:47:15 when you have a complete free like anarchy of ideas
01:47:18 where it’s like everybody lost hope
01:47:22 that something like a truth can even be converged towards.
01:47:25 Like, everybody has their own truth.
01:47:27 Then it’s just chaos.
01:47:29 Like, if you have mutual respect
01:47:31 and a mutual goal of arriving at the truth
01:47:33 and the humility that you want to listen
01:47:35 to other people’s ideas,
01:47:37 and a forgiveness that other people’s ideas
01:47:38 might be dumb as hell,
01:47:39 that doesn’t mean they’re lesser beings,
01:47:41 all that kind of stuff.
01:47:42 But that’s like a weird balance to strike.
01:47:44 Right now people are being trained, little kids,
01:47:47 college students, being trained
01:47:49 to think the exact opposite way.
01:47:51 To think that there’s no such thing as objective truth,
01:47:53 which is, you know, the objective truth
01:47:54 is the N on the compass for every thinker.
01:47:59 Doesn’t mean we’re necessarily on our way or we’re finding it,
01:48:02 but we’re all aiming in the same direction.
01:48:03 We all believe that there’s a place
01:48:05 we can eventually get closer to.
01:48:07 Instead of objective truth, you know,
01:48:08 teaching them that disagreement is bad, that it’s violence.
01:48:12 You know, it’s, you know,
01:48:16 it’s like, you know, you quickly sound like
01:48:17 you’re just going on like a political rant with this topic,
01:48:20 but like, it’s really bad.
01:48:22 It’s like genuinely the worst.
01:48:24 If I had my own country,
01:48:26 I mean, it’s like I would teach kids
01:48:30 some very specific things
01:48:31 that this is doing the exact opposite of,
01:48:36 and it sucks, it sucks.
01:48:39 Speaking of a way to escape this,
01:48:40 you’ve tweeted 30 minutes of reading a day equals,
01:48:43 yeah, this whole video,
01:48:44 and it’s cool to think about reading,
01:48:46 like as a habit and something that accumulates.
01:48:50 You said 30 minutes of reading a day
01:48:51 equals 1000 books in 50 years.
01:48:54 I love like thinking about this,
01:48:56 like chipping away at the mountain.
01:48:59 Can you expand on that sort of the habit of reading?
01:49:02 How do you recommend people read?
01:49:04 Yeah, yeah, I mean, it’s incredible.
01:49:06 If you do something, a little of something every day,
01:49:10 it compiles, it compiles.
01:49:12 You know, I always think about like the people
01:49:14 who achieve these incredible things in life,
01:49:16 these great, like famous, legendary people,
01:49:19 they have the same number of days that you do,
01:49:21 and it’s not like they were doing magical days.
01:49:23 They just, they got a little done every day,
01:49:26 and that adds up to a monument,
01:49:32 they’re putting one brick in a day,
01:49:33 eventually they have this building,
01:49:35 this legendary building.
01:49:36 So you can take writing,
01:49:38 someone who, you know, there’s two aspiring writers,
01:49:40 and one doesn’t ever write,
01:49:42 you know, manages to write
01:49:44 zero pages a day,
01:49:45 and the other one manages to do two pages a week, right?
01:49:49 Not very much.
01:49:50 One does zero pages a week,
01:49:51 the other two pages a week, and 98% of their time is the same.
01:49:55 The other person, just 2%, they’re doing one other thing.
01:49:58 One year later, they have written,
01:50:00 they write two books a year.
01:50:01 This prolific person, you know, in 20 years,
01:50:04 they’ve written 40 books,
01:50:05 they’re one of the most prolific writers of all time.
01:50:07 They write two pages a week.
01:50:09 Sorry, that’s not true.
01:50:10 That was two pages a day.
01:50:12 Okay, two pages a week,
01:50:13 you’re still writing about a book every two years.
01:50:15 So in 20 years, you’ve still written 10 books,
01:50:17 also prolific writer, right?
01:50:19 Huge, massive writing career.
01:50:21 You write two pages every Sunday morning.
01:50:23 The other person has the same exact week,
01:50:25 and they don’t do that Sunday morning thing.
01:50:26 They are a wannabe writer.
01:50:27 They always said they could write.
01:50:29 They talk about how they used to be,
01:50:30 and nothing happens, right?
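The back-of-the-envelope arithmetic in this comparison can be checked in a few lines; the average book length used below is an assumption of mine, not a figure from the conversation:

```python
# Two pages a week, checked against the claims above.
# pages_per_book is an assumed average; the transcript doesn't specify one.
pages_per_week = 2
pages_per_year = pages_per_week * 52        # 104 pages a year
pages_per_book = 200                        # assumed average book length
years_per_book = pages_per_book / pages_per_year
books_in_20_years = 20 / years_per_book

print(round(years_per_book, 1))    # ~1.9 years per book
print(round(books_in_20_years))    # ~10 books in 20 years
```

With those assumptions, two pages every Sunday really does come out to roughly a book every two years and ten books in twenty.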
01:50:31 So it’s inspiring, I think,
01:50:34 for a lot of people who feel frustrated
01:50:35 and they’re not doing anything.
01:50:36 So reading is another example
01:50:38 where someone who reads very, you know, doesn’t read,
01:50:43 and someone who’s a prolific reader,
01:50:45 you know, I always think about like the Tyler Cowen types.
01:50:47 I’m like, how the hell do you read so much?
01:50:49 It’s infuriating, you know?
01:50:51 Or like James Clear puts out his like,
01:50:53 his 10 favorite books of the year,
01:50:56 his 20 favorite books of the year.
01:50:57 I’m like, your 20 favorites?
01:51:00 Like I’m trying to just read 20 books,
01:51:02 like that would be an amazing year.
01:51:03 So, but the thing is,
01:51:06 they’re not doing something crazy and magical.
01:51:07 They’re just reading a half hour a night, you know?
01:51:09 If you read a half hour a night,
01:51:11 the calculation I came to is that
01:51:12 you can read a thousand books in 50 years.
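That calculation checks out under ordinary assumptions; the reading speed and book length below are illustrative figures I've supplied, not numbers from the conversation:

```python
# Rough check of "a half hour a night ≈ 1,000 books in 50 years".
# words_per_minute and words_per_book are assumed figures.
words_per_minute = 200        # assumed unhurried reading pace
minutes_per_night = 30
words_per_book = 110_000      # assumed average book length

words_per_year = words_per_minute * minutes_per_night * 365
books_per_year = words_per_year / words_per_book
books_in_50_years = books_per_year * 50

print(round(books_per_year))      # ~20 books a year
print(round(books_in_50_years))   # ~1,000 books in 50 years
```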
01:51:15 So as someone who’s 80 and they’ve read a thousand books,
01:51:18 you know, between 30 and 80,
01:51:20 they are extremely well read.
01:51:21 They can delve deep into many nonfiction areas.
01:51:24 They can be, you know, an amazing fiction reader,
01:51:27 avid fiction reader.
01:51:29 And again, that’s a half hour a day.
01:51:30 Some people can do an hour,
01:51:31 a half hour in the morning audio book,
01:51:33 half hour at night in bed.
01:51:34 Now they’ve read 2000 books.
01:51:35 So I think it’s motivating.
01:51:40 And you realize that a lot of times you think
01:51:43 that the people who are doing amazing things
01:51:45 and you’re not, you think that there’s a bigger gap
01:51:48 between you and them than there really is.
01:51:51 I, on the reading front, I’m a very slow reader,
01:51:53 which is just a very frustrating fact about me,
01:51:58 but I’m faster with audio books.
01:52:00 And also I just, you know, I’ll just,
01:52:02 it’s just hard to get myself to read,
01:52:04 but I’ve started doing audio books
01:52:05 and I’ll wake up, throw it on, do it in the shower,
01:52:09 brushing my teeth, you know, making breakfast,
01:52:10 dealing with the dogs, things like that, whatever,
01:52:12 until I sit down.
01:52:14 And that’s, I can read, I can read a book a week,
01:52:17 a book every 10 days at that clip.
01:52:20 And suddenly I’m this big reader
01:52:21 because I’m just, while doing my morning stuff,
01:52:24 I have it on and also it’s this fun,
01:52:25 it makes the morning so fun.
01:52:26 I’m like having a great time the whole morning.
01:52:27 So I’m like, oh, I’m so into this book.
01:52:29 So I think that, you know, audio books
01:52:31 is another amazing gift to people
01:52:32 who have a hard time reading.
01:52:34 I find that that’s actually an interesting skill.
01:52:36 I do audio books quite a bit.
01:52:38 Like it’s a skill to maintain, at least for me,
01:52:41 probably the kind of books I read,
01:52:43 which is often like history or like,
01:52:45 there’s a lot of content.
01:52:47 And if you miss parts of it, you miss out on stuff.
01:52:51 And so it’s a skill to maintain focus.
01:52:55 At least for me.
01:52:56 Well, the 10 second back button is very valuable.
01:52:58 Oh, interesting.
01:52:59 So I just, if I get lost,
01:53:01 sometimes the book is so good
01:53:02 that I’m thinking about what the person just said.
01:53:03 And I just get, the skill for me is just remembering to pause.
01:53:06 And if I don’t, no problem, just back, back, back, back.
01:53:09 Just three quick backs.
01:53:10 So of course it’s not that efficient,
01:53:12 but it’s, I do the same thing when I’m reading.
01:53:14 I’ll read a whole paragraph
01:53:14 and realize I was tuning out.
01:53:15 Yeah, you know?
01:53:17 You know, I haven’t actually even considered to try that.
01:53:19 I’ve been so hard on myself maintaining focus
01:53:22 because you do get lost in thought.
01:53:24 Maybe I should try that.
01:53:25 Yeah, and when you get lost in thought, by the way,
01:53:26 you’re processing the book.
01:53:27 That’s not wasted time.
01:53:29 That’s your brain really categorizing
01:53:31 and cataloging what you just read and like.
01:53:33 Well, there’s several kinds of thoughts, right?
01:53:36 There’s thoughts related to the book
01:53:37 and there’s a thought that it could take you elsewhere.
01:53:40 Well, I find that if I am continually thinking
01:53:43 about something else, I just say, I’m not,
01:53:45 I just pause the book.
01:53:47 Yeah, especially in the shower or something
01:53:49 when like, that’s sometimes
01:53:51 when really great thoughts come up.
01:53:52 If I’m having all these thoughts about other stuff,
01:53:53 I’m saying clearly my mind wants to work on something else.
01:53:55 So I’ll just pause it.
01:53:56 Yeah, quiet, Dan Carlin.
01:53:57 I’m thinking about something else right now.
01:53:59 Exactly, exactly.
01:54:00 Also you can, things like you have to head out to the store.
01:54:04 Like I’m gonna read 20 pages on that trip.
01:54:06 Just walking back and forth, going to the airport.
01:54:08 I mean, flights, you know, the Uber,
01:54:10 and then you’re walking to the,
01:54:12 walking through the airport,
01:54:13 you’re standing in the security line.
01:54:13 I’m reading the whole time, like,
01:54:15 I know this is not groundbreaking.
01:54:17 People know what audio books are,
01:54:18 but I think that more people
01:54:19 should probably get into them than do.
01:54:21 Cause I know a lot of people,
01:54:22 they have this stubborn kind of things.
01:54:23 I don’t like, I like to have the paper book and sure,
01:54:25 but like, it’s pretty fun to be able to read.
01:54:27 I still, I listen to a huge number of audio books
01:54:31 and podcasts, but I still,
01:54:32 the most impactful experiences for me are still reading.
01:54:36 And I read very, very slow.
01:54:38 And it’s very frustrating when like,
01:54:41 you go to these websites that estimate
01:54:43 how long a book takes to read on average,
01:54:45 those are always annoying.
01:54:46 They assume like a page a minute, when I read at best
01:54:50 a page every two minutes, at best.
01:54:51 At best, when you’re like really like,
01:54:54 actually not pausing.
01:54:55 I just, my ADD, it’s like, I just,
01:54:58 it’s hard to keep focusing.
01:54:59 And I also like to really absorb.
01:55:01 So on the other side of things,
01:55:03 when I finish a book, 10 years later, I’ll be like,
01:55:05 you know that scene when this happens
01:55:06 and another friend read it,
01:55:07 I’ll be like, what?
01:55:08 I don’t remember any like details.
01:55:09 I’m like, oh, I can tell you like the entire.
01:55:10 So I absorbed the shit out of it,
01:55:12 but I don’t know if it’s worth it to have to read
01:55:14 so much less in my life.
01:55:16 I actually, so in terms of going to the airport,
01:55:18 you know, in these like filler moments of life,
01:55:22 I do a lot of, it’s an app called Anki.
01:55:24 I don’t know if you know about it.
01:55:26 It’s a spaced repetition app.
01:55:29 So there’s all of these facts I have when I read,
01:55:31 I write it down if I want to remember it.
01:55:33 And it’s these, you review it.
01:55:36 And the things you remember well,
01:55:39 it takes longer and longer before they come back up.
01:55:41 It’s like flashcards, but a digital app.
01:55:43 It’s called Anki.
01:55:44 I recommend it to a lot of people.
01:55:45 There’s a huge community of people
01:55:46 that are just like obsessed with it.
01:55:48 Anki?
01:55:49 A N K I.
01:55:51 So this is extremely well known app and idea,
01:55:55 like among students who are like medical students,
01:55:59 like people that really have to study.
01:56:01 Like this is not like fun stuff.
01:56:03 They really have to memorize a lot of things.
01:56:06 They have to remember them well.
01:56:07 They have to be able to integrate them
01:56:08 with a bunch of ideas.
01:56:10 And I find it to be really useful for like,
01:56:13 when you read history,
01:56:14 if you find a particular factoid,
01:56:17 it’d probably be extremely useful for you.
01:56:19 That’d be an interesting thought, actually.
01:56:23 Cause you’re doing, you talked about like opening up
01:56:25 a trillion tabs and reading things.
01:56:29 You know, you probably want to remember some facts
01:56:32 you read along the way.
01:56:33 Like you might remember, okay,
01:56:34 this thing I can’t directly put into the writing,
01:56:37 but it’s a cool little factoid.
01:56:38 I want to.
01:56:39 Oh, all the time.
01:56:40 Store that in there.
01:56:41 And that’s why I go Anki, drop it in.
01:56:43 Oh, you can just drop it in.
01:56:45 Yeah.
01:56:45 You drop it in a line of a podcast or like a video?
01:56:48 Well, no.
01:56:49 I guess I can type it though.
01:56:50 So yes.
01:56:51 So Anki, there’s a bunch of,
01:56:53 it’s called spaced repetition.
01:56:54 There’s a bunch of apps that are much nicer than Anki.
01:56:57 Anki is the ghetto, like Craigslist version,
01:57:00 but it has a giant community because people are like,
01:57:02 we don’t want features.
01:57:04 We want a text box.
01:57:06 Like it’s very basic, very stripped down.
01:57:08 So you can drop in stuff.
01:57:09 You can drop it.
01:57:10 That sounds really,
01:57:11 I can’t believe I have not come across this.
01:57:13 You actually, once you look into it,
01:57:15 you’ll realize, how have I not come across this?
01:57:18 you are the person,
01:57:20 I guarantee you’ll probably write a blog about it.
01:57:22 I can’t believe you actually have it.
01:57:23 Well, it’s also just like.
01:57:24 It’s your people too.
01:57:25 And my people say, what do you write about?
01:57:28 Literally anything I find interesting.
01:57:30 And so for me, once you start a blog,
01:57:33 like your entire worldview becomes,
01:57:36 would this be a good blog post?
01:57:37 Would this be, I mean, that’s the lens
01:57:39 I see everything through,
01:57:40 but I constantly coming across something
01:57:42 or just a tweet, something that I’m like,
01:57:46 ooh, I need to like share this with my readers.
01:57:47 My readers to me are like my friends who I’m like,
01:57:52 I’m gonna, oh, I need to show,
01:57:53 I need to tell them about this.
01:57:55 And so I feel like just a place to,
01:57:56 I mean, I collect things in a document right now,
01:57:58 if it’s like really good,
01:57:58 but it’s the little factoids and stuff like that.
01:58:01 I think, especially if I’m learning something,
01:58:02 if I’m like.
01:58:03 So the problem is, when you save stuff,
01:58:05 like a tweet and all that kind of stuff,
01:58:07 you also need to couple that with a system for review.
01:58:11 Cause what Anki does is like literally,
01:58:14 it determines for me, I don’t have to do anything.
01:58:16 There’s this giant pile of things I’ve saved
01:58:18 and it brings up to me, okay, here’s, I don’t know.
01:58:25 When Churchill did something, right?
01:58:27 I’m reading about World War II a lot now,
01:58:29 like a particular event, here’s that.
01:58:31 Do you remember when, what year that happened?
01:58:33 And you say yes or no, or like you get to pick,
01:58:38 you get to see the answer and you get to self evaluate
01:58:42 how well you remember that fact.
01:58:43 And if you remember well,
01:58:44 it’ll be another month before you see it again.
01:58:46 If you don’t remember, it’ll bring it up again.
01:58:48 That’s a way to review tweets, to review concepts.
01:58:52 And it offloads the kind of,
01:58:54 the process of selecting which parts
01:58:56 you’re supposed to review or not.
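The scheduling idea being described, longer gaps after each successful recall and a reset when you forget, can be sketched minimally. Anki's real scheduler (derived from SM-2) uses graded answers and per-card ease factors; this toy version, with names I've made up, only captures the shape of it:

```python
# Minimal sketch of spaced-repetition scheduling as described above.
# Real Anki (SM-2 based) grades answers and tracks per-card ease factors;
# this toy version only doubles the interval or resets it.
def next_interval(current_days: int, remembered: bool) -> int:
    if not remembered:
        return 1                   # forgot: show it again tomorrow
    return current_days * 2        # remembered: wait twice as long

interval = 1
history = [True, True, True, False, True]   # self-evaluations after each review
for remembered in history:
    interval = next_interval(interval, remembered)
    print(interval)                # 2, 4, 8, 1, 2
```

The point is the offloading: the user only self-grades each card, and the scheduler decides what comes back and when.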
01:58:57 And you can grow that library.
01:58:58 I mean, obviously medical students use it
01:59:00 for like tens of thousands of facts and it’s.
01:59:04 It just gamifies it too.
01:59:05 It’s like you can passively sit back and just,
01:59:07 and the thing will like make sure
01:59:09 you eventually learn it all.
01:59:10 This is, you know, you don’t have to be the executive
01:59:13 calling the shots, like the program,
01:59:15 the memorization program, someone else is handling it.
01:59:17 I would love to hear about like you trying it out
01:59:22 or spaced repetition as an idea.
01:59:22 There’s a few other apps, but Anki’s the big master.
01:59:24 I totally want to try.
01:59:26 You’ve written and spoken quite a bit about procrastination.
01:59:29 Like many other people, you suffer
01:59:32 from procrastination, suffer in quotes.
01:59:37 How do we avoid procrastination?
01:59:39 I don’t think the suffer is in quotes.
01:59:43 I think that’s a huge part of the problem
01:59:45 is that it’s treated like a silly problem.
01:59:50 People don’t take it seriously as a dire problem,
01:59:56 but it can be.
01:59:57 It can ruin your life.
02:00:03 There’s like, we talked about the compiling concept
02:00:09 with, you know, if you read a little,
02:00:11 you know, if you write, if you write two pages a week,
02:00:15 you write a book every two years,
02:00:16 you’re a prolific writer, right?
02:00:18 And the difference between, you know,
02:00:20 again, it’s not that that person’s working so hard.
02:00:22 It’s that they have the ability to,
02:00:24 when they commit to something like on Sunday mornings,
02:00:26 I’m gonna write two pages.
02:00:27 That’s it.
02:00:29 They have enough respect.
02:00:33 The part of them that made that decision
02:00:36 is a respected character in their brain.
02:00:39 And they say, well, that’s, I decided it,
02:00:41 so I’m gonna do it.
02:00:43 The procrastinator won’t do those two pages.
02:00:46 That’s just exactly the kind of thing
02:00:48 the procrastinator will keep on their list
02:00:50 and they will not do.
02:00:51 But that doesn’t mean they’re any less talented
02:00:53 than the writer who does the two pages.
02:00:54 Doesn’t mean they want it any less.
02:00:57 Maybe they want it even more.
02:00:58 And it doesn’t mean that they wouldn’t be just as happy
02:01:01 having done it as the writer who does it.
02:01:03 So what they’re missing out on,
02:01:05 picture a writer who writes 10 books, you know,
02:01:07 bestsellers, and they go on these book tours
02:01:11 and, you know, they, and they just are so gratified
02:01:15 with their career and, you know,
02:01:16 and they think about what the other person is missing
02:01:20 who does none of that, right?
02:01:23 So that is a massive loss, a massive loss.
02:01:27 And it’s because the internal mechanism in their brain
02:01:32 is not doing what the other person’s is.
02:01:33 So they don’t have the respect for the part of them
02:01:36 that made the choice.
02:01:37 They feel like it’s someone they can disregard.
02:01:40 And so to me, it’s in the same boat
02:01:42 as someone who is obese because their eating habits
02:01:47 make them obese over time or their exercise habits.
02:01:50 So that, you know, that’s a huge loss for that person.
02:01:54 That person is, you know, the health problems
02:01:56 and it’s just probably making them miserable.
02:01:59 And it’s self inflicted, right?
02:02:01 It’s self defeating, but that doesn’t make an easy problem
02:02:05 to fix just because you’re doing it to yourself.
02:02:07 So to me, procrastination is another one of these
02:02:09 where you are the only person in your own way.
02:02:11 You are, you know, you are failing at something
02:02:15 or not doing something that you really want to do.
02:02:17 You know, it doesn’t have to be work.
02:02:18 Maybe you’re, you want to get out of that marriage
02:02:21 that you know, you realize it hits you.
02:02:23 You shouldn’t be in this marriage.
02:02:24 You should get divorced and you wait 20 extra years
02:02:26 before you do it or you don’t do it at all.
02:02:30 That is, you know, you’re not living the life
02:02:33 that you know you should be living, right?
02:02:35 And so I think it’s fascinating.
02:02:38 Now, the problem is it’s also a funny problem
02:02:39 because there’s short term procrastination,
02:02:43 which I talk about as, you know,
02:02:44 the kind that has a deadline.
02:02:45 Now, some people, you know,
02:02:48 this is when I bring in, there’s different characters.
02:02:50 There’s the panic monster comes in the room
02:02:52 and that’s when you actually, you know,
02:02:54 the procrastinator can, there’s different levels.
02:02:57 There’s the kind that even when there’s a deadline,
02:03:02 they stop panicking.
02:03:03 They just, they’ve given up
02:03:04 and they really have a problem.
02:03:06 Then there’s the kind that when there’s a deadline,
02:03:08 they’ll do it, but they’ll wait till the last second.
02:03:10 Both of those people, I think have a huge problem
02:03:13 once there’s no deadline because,
02:03:15 and most of the important things in life,
02:03:17 there’s no deadline, which is, you know,
02:03:18 changing your career, you know,
02:03:21 becoming a writer when you never have been before,
02:03:23 getting out of your relationship,
02:03:27 you know, doing whatever you need to,
02:03:28 the changes you need to make
02:03:29 in order to get into a relationship.
02:03:32 Launching a startup.
02:03:33 Launching a startup, right?
02:03:35 Or once you’ve launched a startup,
02:03:37 firing someone that needs to be fired, right?
02:03:40 I mean, going out for fundraising
02:03:41 instead of just trying to, you know,
02:03:43 there’s so many moments when the big change
02:03:46 that you know you should be making
02:03:48 that would completely change your life
02:03:49 if you just did it has no deadline.
02:03:53 It just has to be coming from yourself.
02:03:55 And I think that a ton of people have a problem
02:04:01 where they have this delusion
02:04:04 that, you know, I’m gonna do that.
02:04:05 I’m definitely gonna do that, you know,
02:04:07 but not this week, not this month,
02:04:09 not today, because whatever.
02:04:10 And they make this excuse again and again,
02:04:12 and it just sits there on their list collecting dust.
02:04:15 And so, yeah, to me, it is a very real suffering.
02:04:20 And the fix isn’t fixing the habits.
02:04:23 Just like not working on the fix, first of all.
02:04:27 So, okay,
02:04:32 say you have a boat that sucks
02:04:34 and it’s leaking and it’s gonna sink.
02:04:37 You can fix it with duct tape for a couple of,
02:04:40 you know, for one ride or whatever.
02:04:42 That’s not really fixing the boat,
02:04:43 but you can get you by.
02:04:44 So there’s duct tape solutions.
02:04:47 To me, so the panic monster is the character
02:04:49 that rushes into the room once the deadline gets too close
02:04:52 or once there’s some scary external pressure,
02:04:54 not just from yourself.
02:04:55 And that’s a huge aid to a lot of procrastinators.
02:04:59 Again, there’s a lot of people who won’t,
02:05:00 you know, do that thing,
02:05:02 write that book they wanted to write,
02:05:03 but there’s way fewer people
02:05:05 who will not show up to the exam.
02:05:07 You know, most people show up to the exam.
02:05:08 So that’s because the panic monster
02:05:12 is gonna freak out if they don’t.
02:05:14 So you can create a panic monster.
02:05:16 If you wanna, you know, you really wanna write music,
02:05:19 you really wanna become a singer, songwriter,
02:05:20 well, book a venue, tell 40 people about it
02:05:25 and say, hey, on, you know, this day, two months from now,
02:05:28 come and see, I’m gonna play you some of my songs.
02:05:30 You now have a panic monster, you’re gonna write songs.
02:05:31 You’re gonna have to, right?
02:05:33 So there’s duct tape things.
02:05:37 You know, you can do things, you know, people do,
02:05:39 I’ve done a lot of this with a friend and I say,
02:05:41 if I don’t get X done by a week from now,
02:05:44 I have to donate a lot of money
02:05:47 somewhere I don’t wanna donate.
02:05:48 And that’s, you would put that
02:05:49 in the category of duct tape solutions.
02:05:51 Yeah, because it’s not, why do I need that, right?
02:05:55 If I really had solved this,
02:05:56 this is something I want to do for me, it’s selfish.
02:05:58 This is, I just literally just want to be selfish here
02:06:01 and do the work I need to do
02:06:02 to get the goals I wanna get, right?
02:06:04 All the incentives
02:06:07 should be in the right place.
02:06:08 And yet, if I don’t say that,
02:06:10 it’ll be a week from now and I won’t have done it.
02:06:12 Something weird is going on, there’s some resistance,
02:06:14 there’s some force that is preventing me,
02:06:16 that is in my own way, right?
02:06:17 And so doing something where I have to pay all this money,
02:06:21 okay, now I’ll panic and I’ll do it.
02:06:22 So that’s duct tape.
02:06:23 Fixing the boat is something where I don’t have to do that.
02:06:27 I just will do the things that I,
02:06:29 again, it’s not, I’m talking about super crazy work ethic.
02:06:33 Just like, for example, okay, I have a lot of examples
02:06:36 because I have a serious problem that I’ve been working on.
02:06:40 And in some ways I’ve gotten really successful
02:06:41 at solving it and in other ways I’m still floundering.
02:06:44 So.
02:06:45 Yeah, the world’s greatest duct taper.
02:06:47 Yes, well, I’m pretty good at duct taping.
02:06:49 I probably could be even better and I’m like, and I’m.
02:06:52 You’re procrastinating and becoming a better duct taper.
02:06:54 Literally, like, yes, there’s nothing, I won’t.
02:06:57 So here’s what I know I should do as a writer, right?
02:07:00 It’s very obvious to me is that I should wake up.
02:07:03 Doesn’t have to be crazy,
02:07:04 I don’t wanna do 6 a.m. or anything insane,
02:07:06 and I’m not gonna be one of those crazy people doing 5:30 jogs.
02:07:10 I’m gonna wake up at whatever, you know, 7:30, 8, 8:30
02:07:14 and I should have a block, like, just say nine to noon
02:07:18 where I get up and I just really quick make some coffee
02:07:22 and write.
02:07:23 It’s obvious because all the great writers in history
02:07:26 did exactly that, some.
02:07:28 Some of them have done that, that’s common.
02:07:30 There’s some that I like these writers,
02:07:31 they do the late night sessions,
02:07:32 but most of them they do wake up.
02:07:33 But there’s a session, but there’s a session that’s.
02:07:36 Most writers write in the morning and there’s a reason.
02:07:38 I don’t think I’m different than those people.
02:07:41 It’s a great time to write, you’re fresh, right?
02:07:43 Your ideas from dreaming have kind of collected,
02:07:46 you have all, you know, new answers
02:07:48 that you didn’t have yesterday and you can just go.
02:07:51 But more importantly, if I just had a routine
02:07:53 where I wrote from nine to noon, weekdays.
02:07:58 Every week would have a minimum of 15 focused hours
02:08:02 of writing, which doesn’t sound like a lot, but it’s a lot.
02:08:04 A 15, 15, no, this is no joke.
02:08:06 This is, you know, you’re not, your phone’s away,
02:08:09 you’re not talking to anyone, you’re not opening your email,
02:08:11 you are focused writing for three hours, five days.
02:08:14 That’s a big week for most writers, right?
02:08:16 So now what’s happening is that every weekday
02:08:18 is a minimum of a B, I’ll give myself.
02:08:21 You know, an A might be, you know, wow,
02:08:23 I really just got into a flow and wrote for six hours
02:08:24 and had, you know, great.
02:08:26 But it’s a minimum of a B, I can keep going if I want.
02:08:28 And every week is a minimum of a B with those 15 hours.
02:08:31 Right, and if I just had, talk about compounding,
02:08:32 if I, this is the two pages a week.
02:08:34 If I just did that every week,
02:08:37 I’d achieve all my writing goals in my life.
02:08:39 And yet I wake up and most days I just,
02:08:42 either I’ll revenge procrastination late at night
02:08:44 and go to bed way too late and then wake up later
02:08:46 and get on a bad schedule and I just fall
02:08:47 into these bad schedules.
02:08:48 Or I’ll wake up and there’s just, you know,
02:08:50 I’ll say I was gonna do a few emails and I’ll open it up
02:08:52 and I’m suddenly on text and I’m texting.
02:08:54 Or I’ll just go and, you know, I’ll make a phone call
02:08:56 and I’ll be on phone calls for three hours.
02:08:58 It’s always something.
02:08:59 Yeah, yeah.
02:09:00 Or I’ll start writing and then I hit a little bit of a wall,
02:09:02 but because there’s no sacred,
02:09:03 this is a sacred writing block,
02:09:05 I’ll just hit the wall and say, well, this is icky
02:09:07 and I’ll go do something else.
02:09:08 So duct tape, what I’ve done is,
02:09:12 Wait But Why has one employee, Alicia.
02:09:14 She’s the manager of lots of things.
02:09:16 That’s her role.
02:09:17 She truly does lots of things.
02:09:20 And one of the things we started doing is,
02:09:24 either she comes over and sits next to me
02:09:25 where she can see my screen from nine to noon.
02:09:29 That’s all it takes.
02:09:30 The thing about procrastination is there’s usually,
02:09:31 they’re not kicking and screaming.
02:09:32 I don’t want to do this.
02:09:33 It’s the feeling of, you know, in the old days
02:09:35 when you had to go to class,
02:09:36 you know, your lunch block is over and it’s like,
02:09:38 oh, shit, I have class in five minutes.
02:09:40 Or it’s Monday morning, you go, oh.
02:09:42 Yeah.
02:09:43 But you said, you know what?
02:09:44 But you go, you say, okay.
02:09:45 And then you get to class and it’s not that bad
02:09:46 once you’re there, right?
02:09:48 You know, you have a trainer and he says, okay, next set.
02:09:49 And you go, oh, okay.
02:09:50 And you do it.
02:09:51 That’s all it is.
02:09:52 It’s someone, some external thing being like,
02:09:55 okay, I have to do this.
02:09:56 And then you have that moment of like, it sucks,
02:09:58 but I guess I’ll do it.
02:09:59 If no one’s there though, the problem with the procrastinator
02:10:03 is they don’t have that person in their head.
02:10:04 Other people I think were raised with a sense of shame
02:10:05 if they don’t do stuff.
02:10:06 And that stick in their head is hugely helpful.
02:10:09 I don’t really have that.
02:10:11 And so anyway, Alicia is sitting there next to me.
02:10:15 So she’s doing her own work, but she can see my screen
02:10:17 and she of all people knows exactly what I should be doing,
02:10:19 what I shouldn’t be doing.
02:10:21 That’s all it takes.
02:10:22 The shame of just having her see me
02:10:24 while she’s sitting there not working
02:10:26 would just be too, it’s too weird and too embarrassing.
02:10:27 So I get it done and it’s amazing.
02:10:29 It’s a game changer for me.
02:10:31 So duct tape can solve, sometimes duct tape is enough,
02:10:34 but I’m curious, I’m still trying to figure out: what is going on?
02:10:39 Yeah.
02:10:40 I think part of it is that we are actually wired.
02:10:43 I think I’m being a very sane human actually
02:10:47 is what’s happening.
02:10:48 Or, sane is not the right word.
02:10:48 I’m being, like, a natural human
02:10:52 that we are not programmed to sit there and do homework
02:10:55 of a certain kind that we get the results like
02:10:58 six months later.
02:10:59 Like that is not, so we’re supposed to conserve energy
02:11:03 and fulfill our needs as we need them
02:11:05 and do immediate things.
02:11:06 And we’re overriding our natural ways
02:11:10 when we wake up and get to it.
02:11:12 And I think sometimes it’s because the pain,
02:11:14 I think a lot of times we’re just avoiding suffering
02:11:15 and for a lot of people, the pain of not doing it
02:11:18 is actually worse because they feel shame.
02:11:20 So if they don’t get up and take a jog
02:11:22 and get up early and get to work,
02:11:23 I’ll feel like a bad person.
02:11:25 And that is worse than doing those things.
02:11:28 And then it becomes a habit eventually
02:11:29 and it becomes just easy automatic.
02:11:31 It becomes I do it because that’s what I do.
02:11:33 But I think that if you don’t have a lot of shame
02:11:35 necessarily, the pain of doing those things
02:11:38 is worse in the immediate moment than not doing it.
02:11:42 But I think that there’s this feeling
02:11:44 that you capture with your body language
02:11:46 and so on, like I don’t want to do another set.
02:11:50 That feeling, the people I’ve seen that are good
02:11:53 at not procrastinating are the ones
02:11:55 that have trained themselves to like the moment
02:11:58 they would be having that feeling.
02:12:00 They just, it’s like Zen, like Sam Harris style Zen.
02:12:03 You don’t experience that feeling.
02:12:05 You just march forward.
02:12:06 Like I talked to Elon about this a lot actually offline.
02:12:09 It’s like, he doesn’t have this.
02:12:11 No, clearly not.
02:12:12 It’s the way I think, at least he talks about it
02:12:16 and the way I think about it is it’s like
02:12:18 you just pretend you’re like a machine running an algorithm.
02:12:21 Like you know this, you should be doing this.
02:12:24 Not because somebody told you so.
02:12:26 This is probably the thing you want to do.
02:12:28 Like look at the big picture of your life
02:12:30 and just run the algorithm.
02:12:31 Like ignore your feelings, just run as if.
02:12:33 But it’s framing, frame it differently.
02:12:35 Yeah.
02:12:36 You know, yeah, you can frame it as like,
02:12:38 it can feel like homework or it can feel like
02:12:40 you’re like, you’re living your best life
02:12:42 or something when you’re doing your work.
02:12:43 Yeah, and maybe reframe it.
02:12:46 But I think ultimately, whatever reframing
02:12:48 you need to do, you just need to do it for a few weeks.
02:12:52 And that’s how the habit is formed and you stick with it.
02:12:56 Like I’m now on a kick where I exercise every day.
02:13:04 It doesn’t matter what that exercise is.
02:13:06 It’s not serious.
02:13:07 It could be 200 pushups, but it’s the thing that like
02:13:11 I make sure I exercise every day
02:13:12 and it’s become way, way easier because of the habit.
02:13:15 And I just, and I don’t like, at least with exercise
02:13:19 because it’s easier to replicate that feeling.
02:13:22 I don’t allow myself to go like,
02:13:24 I don’t feel like doing this.
02:13:25 Right.
02:13:26 Well, I think about that, even just like little things,
02:13:27 like I brush my teeth before I go to bed
02:13:29 and it’s just a habit.
02:13:30 Yeah.
02:13:31 And it is effort.
02:13:31 Like if it were something else, I would be like,
02:13:34 oh, I’m gonna go to the bathroom and go do that.
02:13:35 And I just want to like, I’m just gonna lie down right now.
02:13:37 But it doesn’t even cross my mind.
02:13:38 It’s just like that I just robotically go and do it.
02:13:41 Yeah.
02:13:42 And it almost has become like a nice routine.
02:13:43 It’s like, oh, this part of the night, you know,
02:13:44 it’s like a morning routine for me stuff is like,
02:13:47 you know, that stuff is kind of just like automated.
02:13:50 Yeah, it’s funny because you don’t like go,
02:13:52 like I don’t think I’ve skipped many days.
02:13:54 I don’t think I skipped any days brushing my teeth.
02:13:56 Right.
02:13:57 Like unless I didn’t have a toothbrush,
02:13:59 like I was in the woods or something.
02:14:00 And what is that?
02:14:01 Because it’s annoying.
02:14:02 Well, so to me, there is,
02:14:04 so the character that makes me procrastinate
02:14:06 is the instant gratification monkey.
02:14:08 That’s what I’ve labeled him, right?
02:14:09 And there’s the rational decision maker
02:14:11 and the instant gratification monkey
02:14:12 and these battle with each other.
02:14:14 But for procrastinator, the monkey wins.
02:14:18 Yeah.
02:14:18 I don’t think the monkey is, you know,
02:14:20 you read about this kind of stuff.
02:14:21 I think that this kind of more primitive brain
02:14:25 is always winning.
02:14:26 And in the non procrastinators,
02:14:28 that primitive brain is on board for some reason
02:14:31 and isn’t resisting.
02:14:32 So, but when I think about brushing my teeth,
02:14:35 it’s like the monkey doesn’t even think
02:14:36 there’s an option to not do it.
02:14:38 So it doesn’t even like get, there’s no hope.
02:14:40 Monkey has no hope there.
02:14:41 So it doesn’t even like get involved.
02:14:43 And it’s just like, yeah, you know, we have to,
02:14:44 just like kind of like robotically, just like, you know,
02:14:46 it was kind of like Stockholm syndrome,
02:14:48 just like, oh no, no, I have to do this.
02:14:50 It doesn’t even like wake up.
02:14:51 It’s like, yeah, we’re doing this now.
02:14:53 For other things, the monkey’s like, ooh, no, no, no.
02:14:54 Most days I can win this one.
02:14:56 And so the monkey puts up that like fierce resistance
02:15:01 and it’s like, it’s a lot of it’s like
02:15:03 the initial transition.
02:15:05 So I think of it as like jumping in a cold pool
02:15:09 where it’s like, I will spend the whole day
02:15:12 pacing around the side of the pool in my bathing suit,
02:15:15 just being like, I don’t want to have that one second
02:15:17 when you first jump in and it sucks.
02:15:18 And then once you’re, once I’m in,
02:15:20 once I jump in, I’m usually, you know,
02:15:21 once I start writing, suddenly I’m like,
02:15:23 oh, this isn’t so bad.
02:15:24 Okay, I’m kind of into it.
02:15:25 And then I, then sometimes you can’t tear me away.
02:15:27 You know, then I suddenly I’m like, I get into a flow.
02:15:29 So it’s like, once I get into cold water, I don’t mind it,
02:15:31 but I will spend hours standing around the side of the pool.
02:15:34 And by the way, I do this in a more literal sense.
02:15:36 When I go to the gym with a trainer in 45 minutes,
02:15:40 I do a full-ass workout.
02:15:41 And it’s not because I’m having a good time,
02:15:44 but it’s because it’s that,
02:15:47 oh, okay, I have to go to class feeling, right?
02:15:49 But when I go to the gym alone,
02:15:50 I will literally do a set and then dick around on my phone
02:15:55 for 10 minutes before the next set.
02:15:57 And I’ll spend over an hour there and do way less.
02:15:59 So it is the transition.
02:16:01 Once I’m actually doing the set,
02:16:03 I’m never like, I don’t want to stop in the middle.
02:16:04 Now it’s just like, I’m going to do this.
02:16:05 And I felt happy, I just did it.
02:16:07 So it’s something,
02:16:07 there’s something about transitions that is very,
02:16:09 that’s why procrastinators are late to a lot of places.
02:16:12 It’s, I will procrastinate getting ready
02:16:14 to go to the airport,
02:16:16 even though I know I should leave at three,
02:16:18 so I can not be stressed.
02:16:19 I’ll leave at 3:36 and I’ll be super stressed.
02:16:22 Once I’m on the way to the airport,
02:16:24 immediately I’m like, why didn’t I do this earlier?
02:16:26 Now I’m back on my phone doing what I was doing.
02:16:28 I just had to get in the damn car or whatever.
02:16:31 So yeah, there’s some very, very odd irrational.
02:16:36 Yeah, like I was waiting for you to come
02:16:38 and you said that you’re running a few minutes late.
02:16:40 And I was like, I was like, I’ll go get you a coffee
02:16:45 because I can’t possibly be the one who’s early.
02:16:49 I can’t, I don’t understand.
02:16:50 I’m always late to stuff.
02:16:51 And I know it’s disrespectful
02:16:54 in the eyes of a lot of people.
02:16:55 I can’t help it. You know what I’m doing ahead of it?
02:16:58 It’s not like I don’t care about the people.
02:17:00 I’m often like, you know, for like this conversation,
02:17:03 I’d be preparing more.
02:17:05 It’s like, I obviously care about the person,
02:17:08 but for some.
02:17:09 Yeah, it’s misinterpreted as like,
02:17:10 there are some people that like show up late
02:17:13 because they kind of like that quality in themselves.
02:17:15 That’s a dick, right?
02:17:16 There’s a lot of those people,
02:17:17 but more often it’s someone who shows up frazzled
02:17:20 and they feel awful and they’re furious at themselves.
02:17:22 They’re so regretful.
02:17:23 Exactly.
02:17:24 I mean, that’s me.
02:17:25 And I mean, all you have to do is look at those people alone
02:17:27 running through the airport, right?
02:17:29 They’re not being disrespectful to anyone there.
02:17:31 They just inflicted this on themselves.
02:17:33 Like.
02:17:33 This is hilarious.
02:17:34 Yeah.
02:17:35 You tweeted a quote by James Baldwin saying, quote,
02:17:38 I imagine one of the reasons people cling to their hates
02:17:41 so stubbornly is because they sense once hate is gone,
02:17:47 they will be forced to deal with the pain.
02:17:51 What has been a painful but formative experience
02:17:53 in your life?
02:17:55 Or what’s the flavor, the shape of your pain
02:17:58 that fuels you?
02:17:59 I mean, honestly, the first thing that jumped to mind
02:18:02 is my own like battles against myself to get my work done
02:18:06 because it affects everything.
02:18:07 When I, I just took five years on this book,
02:18:09 and granted, it’s a beast.
02:18:11 Like I probably would have taken two or three years,
02:18:13 but it didn’t need to take five.
02:18:14 And that was a lot of, not just, you know,
02:18:16 not just that I’m not working,
02:18:17 it’s that I’m over researching.
02:18:19 I’m making it, I’m adding in things I shouldn’t
02:18:22 because I’m a perfectionist, you know,
02:18:23 being a perfectionist about like, oh, well I learned that.
02:18:25 Now I want to get it in there.
02:18:26 I know I’m going to end up cutting it later.
02:18:28 Just, you know, or I over outline, you know,
02:18:30 something until, you know, trying to get it perfect
02:18:32 when I know that’s not possible.
02:18:33 So making a lot of immature kind of,
02:18:36 like, I’m not actually that much of a writing amateur.
02:18:38 I’ve written, including my old blog,
02:18:40 I’ve been a writer for 15 years.
02:18:42 I know what I’m doing.
02:18:43 I could advise other writers really well.
02:18:45 And yet I do a bunch of amateur things
02:18:47 that I know, while I’m doing them, are amateur.
02:18:50 I know I’m being an amateur.
02:18:51 So, A, it hurts the actual product.
02:18:55 B, it wastes your precious time.
02:18:59 C, when you’re mad at yourself,
02:19:01 when you’re in a negative, you know,
02:19:04 self-defeating spiral, it almost inevitably means
02:19:08 you’ll be less good to others.
02:19:09 Like, you know, I used to,
02:19:12 early on in my now marriage,
02:19:15 one of the things we always used to do
02:19:17 is I used to plan mystery dates.
02:19:18 You know, New York city, great, great place for this.
02:19:20 I’d find some weird little adventure for us.
02:19:22 You know, it could be anything.
02:19:24 And I wouldn’t tell her what it was.
02:19:25 I’d say, I’m reserving you for Thursday night,
02:19:27 you know, at seven, okay?
02:19:29 And it was such a fun part of our relationship.
02:19:31 Started writing this book and got into a really bad,
02:19:33 you know, personal space where I was like, in my head,
02:19:36 I was like, I can’t do anything until this is done.
02:19:38 You know, like, no.
02:19:39 And I just stopped like ever valuing,
02:19:41 like, like joy of any kind.
02:19:44 Like I was like, no, any, no, that’s when I’m done.
02:19:47 And that’s a trap or very quickly, you know,
02:19:49 cause I always think, you know,
02:19:50 I think it’s going to be six months away,
02:19:51 but actually five years later, I’m like, wow,
02:19:53 I really wasn’t living fully.
02:19:55 And five years is not nothing; we don’t live very long.
02:19:58 Like talking about your prime decades,
02:19:59 like that’s like a sixth of my prime years.
02:20:01 Like, wow, like that’s a huge loss.
02:20:04 So to me, that was excruciating.
02:20:06 And, you know, and it was a bad pattern,
02:20:10 a very unproductive, unhelpful pattern for me,
02:20:12 which is I’d wake up in the morning in this great mood,
02:20:15 great mood every morning, wake up thrilled to be awake.
02:20:18 I have the whole day ahead of me.
02:20:19 I’m going to get so much work done today.
02:20:21 And, but you know, first I’m going to do all these other
02:20:23 things and it’s all going to be great.
02:20:24 And then I ended up kind of failing for the day
02:20:27 with those goals, sometimes miserably,
02:20:30 sometimes only partially.
02:20:31 And then I get in bed probably a couple hours later
02:20:35 than I want to.
02:20:36 And that’s when all of the reality hits me.
02:20:39 Suddenly so much regret, so much anxiety,
02:20:42 furious at myself, wishing I could take a time machine back
02:20:44 three months, six months, a year,
02:20:46 or just even to the beginning of that day.
02:20:48 And just tossing and turning now.
02:20:51 I mean, this is a very bad place.
02:20:53 This is what I said, suffering,
02:20:54 procrastinators suffer in a very serious way.
02:20:56 So look, I, you know,
02:20:58 I know this probably sounds like a lot of like first world
02:21:01 problems and it is, but it’s real suffering as well.
02:21:03 Like it’s, so to me, it’s like,
02:21:06 it’s painful because you’re not being,
02:21:11 you’re not being as good a friend or a spouse or whatever,
02:21:13 as you could be.
02:21:14 You’re also not treating yourself very well.
02:21:16 You’re usually not being very healthy in these moments.
02:21:17 You know, you’re often, and you’re not being,
02:21:20 I’m not being good to my readers.
02:21:21 So it’s just a lot of this.
02:21:22 And it’s like, it feels like it’s one small tweak away.
02:21:27 Sometimes it’s like, that’s what I said.
02:21:28 It’s like, you just suddenly are just doing that nine to 12
02:21:31 and you get in that habit.
02:21:32 Everything else falls into place.
02:21:33 All of this reverses.
02:21:35 So I feel hopeful, but it’s like, it is a,
02:21:39 I have not figured, I haven’t fixed the boat yet.
02:21:41 I have some good duct tape though.
02:21:43 And you also don’t want to romanticize it
02:21:45 because it is true that some of the greats in history,
02:21:48 especially writers suffer from all the same stuff.
02:21:51 Like they weren’t quite able.
02:21:52 I mean, you might only write for two or three hours a day,
02:21:56 but the rest of the day is often spent kind of tortured.
02:22:00 Well, right.
02:22:01 This is the irrational thing is if I,
02:22:03 and this goes for a lot of people’s jobs,
02:22:05 people especially who work for themselves,
02:22:07 you’d be shocked how much you could wake up at nine
02:22:10 or eight or seven or whatever.
02:22:11 Get to work and stop at one
02:22:13 or two, but be really focused in those hours,
02:22:15 and do 25 really focused hours of
02:22:20 productive stuff a week, and then there’s 112 waking hours
02:22:23 in the week, right?
02:22:24 So we’re talking about 80 something hours of free time.
02:22:27 You can live, you know, if you’re just really focused
02:22:29 in your, you know, yin and yang of your time,
02:22:31 and that’s my goal: black and white time.
02:22:34 Really focused time and then totally, like,
02:22:37 clean-conscience free time.
02:22:39 Right now I have neither, it’s a lot of gray.
02:22:40 It’s a lot of, I should be working, but I’m not.
02:22:42 Oh, I’m wasting this time.
02:22:43 This is bad.
02:22:44 And that’s just as massive.
02:22:45 So if you can just get really good at the black
02:22:49 and the white, so you just wake up
02:22:50 and it’s just like full work.
02:22:51 And then I think a lot of people could have
02:22:53 like all this free time,
02:22:54 but instead I’ll do those same three hours.
02:22:55 It’s like you said, I’ll do them really late at night
02:22:57 or whatever after having tortured myself the whole day
02:23:00 and not had any fun.
02:23:01 It’s not like I’m having fun.
02:23:03 I call it the dark playground, by the way,
02:23:04 which is where you are when you know you should be working,
02:23:06 but you’re doing something else.
02:23:08 You’re doing something fun on paper,
02:23:10 but it’s never, it feels awful.
02:23:12 And so, yeah, I spend a lot of time in the dark playground.
02:23:14 And you know you shouldn’t be doing it
02:23:16 and you still do it and yeah.
02:23:18 It’s not clean conscience fun.
02:23:19 It’s bad, it’s toxic.
02:23:21 And I think that it’s, there’s something about,
02:23:23 you know, you’re draining yourself all the time.
02:23:25 And if you just did your focused hours
02:23:26 and then if you actually have good, clean fun,
02:23:28 fun can be anything.
02:23:29 Reading a book can be hanging out with someone
02:23:31 who can be really fun.
02:23:32 You can go and do something cool in the city.
02:23:33 You know, that is critical.
02:23:37 You’re recharging some part of your psyche there.
02:23:39 And I think it makes it easier
02:23:39 to actually work the next day.
02:23:40 And I say this from the experiences
02:23:42 when I have had good stretches.
02:23:44 It’s like, you know what it is?
02:23:46 It’s like, you feel like you’re fist pounding.
02:23:48 One part of your brain is fist pounding the other part.
02:23:50 Like, you’re like, we got this.
02:23:52 Like, we treat ourselves well.
02:23:54 Like, this is how you internally feel.
02:23:56 Like, I treat myself.
02:23:56 And it’s like, yeah, no, it’s work time.
02:23:58 And then later you’re like, now it’s play time.
02:23:59 And it’s like, okay, back to work.
02:24:00 And you’re in this very healthy,
02:24:02 like parent child relationship in your head
02:24:04 versus like this constant conflict.
02:24:06 And like the kid doesn’t respect the parent
02:24:08 and the parent hates the kid.
02:24:09 And like, yeah.
02:24:10 And you’re right.
02:24:11 It always feels like it’s like one fix away.
02:24:14 So that there’s hope.
02:24:15 I mean, I guess, I mean,
02:24:18 so much of what you said just rings so true.
02:24:21 I guess I have the same kind of hope.
02:24:24 But you know, this podcast is very regular.
02:24:27 I mean, I’m impressed.
02:24:29 Like, and I think partially what,
02:24:32 there is a bit of a duct tape solution here,
02:24:34 which is you just, the,
02:24:36 cause it’s always easy to schedule stuff
02:24:37 for the future for myself, right?
02:24:39 Because that’s future Tim
02:24:40 and future Tim is not my problem.
02:24:42 So I’ll schedule all kinds of shit for future Tim.
02:24:44 And I will then not do it.
02:24:48 But in this case,
02:24:49 you can schedule podcasts and you have to show up.
02:24:52 Yeah, you have to show up.
02:24:53 Right, it seems like a good medium for procrastinating.
02:24:55 This is not my, this is what I do for fun.
02:24:57 I know, but at least this is the kind of thing,
02:25:00 especially if it’s not your main thing.
02:25:02 Especially if it’s not your main thing,
02:25:03 it’s the kind of thing that you would dream of doing
02:25:05 and want to do and never do.
02:25:06 And I feel like your regular production here
02:25:10 is a sign that something is working,
02:25:12 at least in this regard.
02:25:13 Yeah, in this regard.
02:25:14 I’m just, I’m sure you have this same kind of thing
02:25:16 with the podcast.
02:25:17 In fact, because you’re going to be doing the podcast,
02:25:18 as possible, the podcast becomes what the podcast is for me.
02:25:22 This is you procrastinating.
02:25:24 If you think about being 80
02:25:25 and if you can get into that person’s head
02:25:27 and look back and be like, and just deep regret,
02:25:29 you just, you know, yearning,
02:25:31 you could do anything to just go back
02:25:33 and have done this differently.
02:25:34 That is desperation.
02:25:36 It’s just, you don’t feel it yet.
02:25:37 It’s not in you yet.
02:25:38 The other thing you could do is if you have a partner,
02:25:40 if you want to partner with someone,
02:25:41 now you could say, we meet
02:25:43 these 15 hours every week.
02:25:44 And at that point you're going to get it done.
02:25:46 So working with someone can help.
02:25:48 Yeah, that's why they say like a co-founder
02:25:51 is really powerful for many reasons,
02:25:52 but that’s kind of one of them.
02:25:54 Because for the startup case,
02:25:57 unlike writing perhaps,
02:26:00 it's really like a hundred-hour-plus thing.
02:26:03 Like once you really launch, you go all in.
02:26:06 Like everything else just disappears.
02:26:09 Like you can’t even have a hope of a balanced life
02:26:11 for a little bit.
02:26:12 So there a co-founder really helps.
02:26:15 That’s the idea.
02:26:18 You're one of the most interesting people
02:26:20 on the internet.
02:26:21 So as a writer, you look out into the future.
02:26:27 Do you dream about certain things you want to still create?
02:26:30 Are there projects that you want to write?
02:26:34 Are there movies you want to write or direct?
02:26:37 Endless.
02:26:38 So it's just an endless sea of ideas.
02:26:40 No, there's a specific list of things that really excite me,
02:26:44 but it’s a big list that I know
02:26:46 I’ll never get through them all.
02:28:47 And that's part of why, in the last five years,
02:26:51 when I feel like I’m not moving as quickly as I could,
02:26:54 it bothers me because I have so much genuine excitement
02:26:57 to try so many different things.
02:26:58 And I get so much joy from finishing things.
02:27:00 I don’t like doing things, but a lot of writers are like that.
02:27:03 Publishing something is hugely joyful
02:27:06 and makes it all worth it.
02:27:08 Or just finishing something you’re proud of,
02:27:09 putting it out there and having people appreciate it.
02:27:11 It’s like the best thing in the world, right?
02:27:13 Every kid makes some little bargain with themselves,
02:27:15 has a little dream or something.
02:27:19 And I feel like when I make something,
02:27:22 and for me, it's been mostly writing,
02:27:25 and I feel proud of it and put it out there,
02:27:27 I feel like, again,
02:27:29 I'm like fist-pounding my seven-year-old self.
02:27:30 Like there’s a little, like I owe it to myself
02:27:34 to do certain things.
02:27:35 And I just did one of the things I owe.
02:27:36 I just paid off some debt to myself.
02:27:38 I owed it and I paid it and it feels great.
02:27:40 It feels like, you just feel
02:27:42 a lot of inner peace when you do it.
02:27:43 So the more things I can do,
02:27:46 and I just have fun doing it, right?
02:27:47 And so I just, for me that includes a lot more writing.
02:27:50 I just, short blog posts, I write very long blog posts,
02:27:54 but basically short writing
02:27:55 in the form of long blog posts is a great,
02:27:57 I love that medium.
02:27:59 I wanna do a lot more of that.
02:28:00 Books yet to be seen, I’m gonna do this
02:28:02 and I’m gonna have another book I’m gonna do right after.
02:28:04 And we’ll see if I like those two.
02:28:05 And if I do, I’ll do more, otherwise I won’t.
02:28:07 But I also wanna try other mediums.
02:28:08 I wanna make more videos.
02:28:10 I want to, I did a little travel series once.
02:28:14 I love doing that.
02:28:15 I wanna do more of that.
02:28:16 Almost like a vlog.
02:28:18 No, it was, I let readers in a survey
02:28:22 pick five countries they wanted me to go to.
02:28:24 That’s awesome.
02:28:25 And they picked, they sent me to weird places.
02:28:27 They sent me, I went to Siberia, I went to Japan.
02:28:32 I went from there to, this is all in a row,
02:28:34 to Nigeria, from there to Iraq.
02:28:37 And from there to Greenland.
02:28:39 And then I went back to New York,
02:28:41 like two weeks in each place.
02:28:43 And in each one I got to have some weird experiences.
02:28:46 I tried to like really dig in
02:28:48 and have like some interesting experiences.
02:28:51 And then I wrote about it.
02:28:52 And I taught readers a little bit
02:28:53 about the history of these places.
02:28:55 And it was just, I love doing that.
02:28:56 I love, so, and I’m like, oh man,
02:28:58 like I haven’t done one of those in so long.
02:29:01 And then I have a big like desire to do fictional stuff.
02:29:05 Like I want to write a sci-fi at some point.
02:29:07 And I would love to write a musical.
02:29:09 That’s actually what I was doing before Wait But Why.
02:29:11 I was with a partner, Ryan Langer.
02:29:14 We were halfway through a musical
02:29:15 and he got tied up with his other musical
02:29:19 and Wait But Why started taking off.
02:29:21 And we just haven’t gotten back to it.
02:29:22 But it’s such a fun medium.
02:29:24 It's such a silly medium, but it's so fun.
02:29:26 So you think about all of these mediums
02:29:28 on which you can be creative and create something
02:29:30 and you like the variety of it.
02:29:32 Yeah, it’s just that if there’s a chance on a new medium,
02:29:37 I could do something good.
02:29:38 I want to do it.
02:29:39 I want to try it.
02:29:40 It sounds like so gratifying and so fun.
02:29:42 I think it’s fun to just watch you actually sample these.
02:29:45 So I can’t wait for your podcast.
02:29:47 I’ll be listening to all of them.
02:29:49 I mean, that’s a cool medium to see like where it goes.
02:29:52 The cool thing about podcasting and making videos,
02:29:54 especially with a super creative mind like yours,
02:29:56 you don’t really know what you’re going to make of it
02:29:59 until you try it.
02:30:01 Yeah, podcasts I’m really excited about,
02:30:03 but I’m like, I like going on other people’s podcasts.
02:30:06 And I've never tried to have my own.
02:30:08 So with every medium,
02:30:10 there’s the challenges of how the sausage is made.
02:30:13 So like the challenges of execution.
02:30:15 Yeah, but it's also, I like to,
02:30:16 I'll go on, as you know, long-ass monologues
02:30:19 and you can’t do that.
02:30:20 If you’re the interviewer,
02:30:21 like you’re not supposed to do that as much.
02:30:23 So I have to like rein it in and that might be hard,
02:30:27 but we’ll see.
02:30:28 You could also do solo type stuff.
02:30:29 Yeah, maybe I’ll do a little of each.
02:30:31 You know what’s funny?
02:31:32 I mean, some of my favorites are more like solo,
02:30:34 but there’s like a sidekick.
02:30:36 So you’re having a conversation, but you’re like friends,
02:30:40 but it’s really you ranting,
02:30:43 which I think you’d be extremely good at.
02:30:46 That’s funny, yeah.
02:30:46 Or even if it's 50-50, that's fine.
02:30:48 Like if it’s just a friend who I want to like really riff
02:30:50 with, I just don’t like interviewing someone,
02:30:55 which I won’t, that’s not what the podcast will be,
02:30:57 but I can’t help.
02:30:58 I’ve tried moderating panels before
02:30:59 and I cannot help myself.
02:31:01 I have to get involved and no one likes a moderator
02:31:03 who’s too involved.
02:31:04 It’s very unappealing.
02:31:05 So, you know, interviewing someone,
02:31:07 I'm like, I can't, it's just not my thing.
02:31:09 I can grill someone, but that's different.
02:31:11 That’s my curiosity being like, wait, how about this?
02:31:13 And I interrupt them and I’m trying to.
02:31:14 I see the way your brain works.
02:31:16 It’s hilarious.
02:31:17 It’s awesome.
02:31:18 It’s like lights up with fire and excitement.
02:31:19 Yeah, I actually, I love listening.
02:31:21 I like watching people, I like listening to people.
02:31:24 So this is like me right now,
02:31:25 just listening to a podcast.
02:31:27 This is me listening to your podcast right now.
02:31:29 I love listening to a podcast
02:31:30 because then it’s not even like,
02:31:31 but once I’m in the room, I suddenly can’t help myself.
02:31:33 I jump in again.
02:31:35 Okay.
02:31:36 Big last ridiculous question.
02:31:38 What is the meaning of life?
02:31:40 The meaning of like an individual life?
02:31:42 Your existence here on earth,
02:31:44 or maybe broadly this whole thing we’ve got going on,
02:31:47 descendants of apes,
02:31:49 busily creating.
02:31:50 Yeah, well there’s, yeah.
02:31:51 For me, I feel like I want to be around as long as I can.
02:31:55 If I can, if I can do some kind of crazy life extension
02:31:57 or upload myself, I’m gonna,
02:31:59 because who doesn’t want to see how cool
02:32:01 the year 3000 is?
02:32:03 Imagine.
02:32:04 You did say mortality was not appealing.
02:32:06 No, it’s not appealing at all to me.
02:32:08 Now, it’s ultimately appealing.
02:32:10 As I said, no one wants eternal life, I believe,
02:32:12 if they understood what eternity really was.
02:32:14 And I did Graham’s number as a post,
02:32:15 and I was like, okay, no one wants to live that many years.
02:32:18 But I’d like to choose.
02:32:19 I’d like to say, you know what?
02:32:20 I'm truly over it now.
02:32:21 And you know, at that point,
02:32:22 our whole society would have, like,
02:32:24 we'd have a ceremony.
02:32:25 We’d have a whole process of someone signing off,
02:32:28 and you know, it would be beautiful,
02:32:29 and it wouldn’t be sad.
02:32:30 Well, I think you’d be super depressed by that point.
02:32:33 Like, who’s gonna sign off when they’re doing pretty good?
02:32:35 Maybe, maybe, yes.
02:32:36 Okay, maybe it’s dark.
02:32:37 But at least, but the point is,
02:32:38 if I'm happy, I can stay around, you know.
02:32:40 I'm thinking 50 centuries sounds great.
02:32:42 Like, I don't know if I want more than that.
02:32:44 50 sounds like the right number.
02:32:45 And so if you would sign up for 50
02:32:47 if you had a choice, and one is what I get, that is bullshit.
02:32:50 Like, if you're somebody who wants 50,
02:32:52 one is a hideous number, right?
02:32:55 You know, anyway.
02:32:56 So for me personally, I want to be around as long as I can.
02:33:01 And then honestly, the reason I love writing,
02:33:02 the thing that I love most is like,
02:33:05 is like a warm fuzzy connection with other people, right?
02:33:08 And that can be my friends, and it can be readers.
02:33:11 And that’s why I would never want to be like a journalist
02:33:13 whose personality is, like, hidden behind the writing,
02:33:17 or like even a biographer, you know.
02:33:19 There’s a lot of people who are great writers,
02:33:21 but it’s, I like to personally connect.
02:33:23 And if I can take something that’s in my head
02:33:25 and other people can say, oh my God, I think that too,
02:33:27 and this made me feel so much better,
02:33:28 it made me feel seen, like that feels amazing.
02:33:30 And I just feel like we’re all having
02:33:33 such a weird common experience on this one little rock,
02:33:35 in this one little moment of time,
02:33:37 we're these weird four-limbed beings,
02:33:40 and we’re all the same.
02:33:41 And it's like, we all share the human experience,
02:33:43 so I feel like so many of us suffer in the same ways,
02:33:46 and we’re all going through a lot of the same things.
02:33:48 And to me, if I was on my deathbed
02:33:50 and I felt like I had
02:33:52 a ton of human connection,
02:33:54 and shared a lot of common experience,
02:33:56 and made a lot of other people feel
02:34:01 like not alone.
02:34:01 Do you feel that as a writer?
02:34:02 Do you like hear and feel like the inspiration,
02:34:08 like all the people that you make smile,
02:34:10 and all the people you inspire?
02:34:11 Honestly, not always. Sometimes, you know,
02:34:13 when we did an in-person event
02:34:14 and I, you know, meet a bunch of people,
02:34:16 it's incredibly gratifying,
02:34:18 or, you know, you get emails.
02:34:19 But I think it is easy to forget how many people there are,
02:34:22 when sometimes you're just sitting there alone,
02:34:24 typing, dealing with your procrastination.
02:34:26 But that’s why publishing is so gratifying,
02:34:28 because that’s the moment when all this connection happens.
02:34:30 And especially, if I had to put my finger on it,
02:34:32 it's having a bunch of people who feel lonely,
02:34:35 and their, like, existence is all realized,
02:34:37 like it all, you know, connects, right?
02:34:38 So, if I do a lot of that,
02:34:40 and that includes, of course, me actually spending,
02:34:43 you know, a lot of really high-quality time
02:34:44 with friends and family,
02:34:47 and making the whole thing, as heartbreaking
02:34:50 as mortality and life can be,
02:34:51 making the whole thing fun,
02:34:53 so at least we can laugh at ourselves together
02:34:55 while going through it.
02:34:56 And that to me is the, yeah.
02:34:58 And then your last blog post will be written from Mars
02:35:02 as you get the bad news that you’re not able to return
02:35:05 because of the malfunction in the rocket.
02:35:07 Yeah, I would like to go to Mars,
02:35:09 and like go there for a week,
02:35:10 and be like, yeah, here we are, and then come back.
02:35:11 No, I know that’s what you want.
02:35:13 Staying there, yeah.
02:35:14 And that’s fine, by the way.
02:35:15 If I, yeah, if, so you think,
02:35:17 you’re picturing me alone on Mars as the first person there,
02:35:20 and then it malfunctions.
02:35:21 Right, no, you were supposed to return,
02:35:23 but it malfunctions, and then there’s this,
02:35:26 so it’s both the hope, the awe that you experience,
02:35:30 which is how the blog starts,
02:35:32 and then it’s the overwhelming,
02:35:34 like feeling of existential dread,
02:35:38 but then it returns to like the love of humanity.
02:35:41 Well, that’s the thing, is if I could be writing,
02:35:43 and actually like writing something
02:35:44 that people would read back on Earth,
02:35:46 it would make it feel so much better.
02:35:47 Yeah, yeah.
02:35:48 You know, if I were just alone,
02:35:48 and no one was gonna realize what happened.
02:35:50 No, no, no, you get to write.
02:35:51 Yeah, no, no, it’s perfectly safe.
02:35:53 Also, that would bring out great writing.
02:35:54 Yeah, I think so.
02:35:55 If you don’t have your deathbed on Mars alone.
02:35:56 I think so.
02:35:57 Yeah.
02:35:58 Well, that’s exactly the future I hope for you, Tim.
02:36:02 All right, this was an incredible conversation.
02:36:04 You’re a really special human being, Tim.
02:36:06 Thank you so much for spending
02:36:08 your really valuable time with me.
02:36:10 I can’t wait to hear your podcast.
02:36:11 I can't wait to read your next blog post,
02:36:15 which you said in a Twitter reply
02:36:17 we'll get more of.
02:36:19 Yeah, soon enough, I’ll be back.
02:36:20 After the book. Add that to the long list of ideas
02:36:24 to procrastinate about.
02:36:26 Tim, thanks so much for talking to me, man.
02:36:28 Thank you.
02:36:30 Thanks for listening to this conversation with Tim Urban.
02:36:33 To support this podcast,
02:36:34 please check out our sponsors in the description.
02:36:37 And now, let me leave you with some words
02:36:39 from Tim Urban himself.
02:36:42 Be humbler about what you know,
02:36:44 more confident about what’s possible,
02:36:47 and less afraid of things that don’t matter.
02:36:51 Thanks for listening, and hope to see you next time.