Transcript
00:00:00 The following is a conversation with Sam Harris,
00:00:02 one of the most influential
00:00:03 and pioneering thinkers of our time.
00:00:06 He’s the host of the Making Sense podcast
00:00:08 and the author of many seminal books
00:00:10 on human nature and the human mind,
00:00:12 including The End of Faith, The Moral Landscape,
00:00:15 Lying, Free Will, and Waking Up.
00:00:18 He also has a meditation app called Waking Up
00:00:21 that I’ve been using to guide my own meditation.
00:00:24 Quick mention of our sponsors,
00:00:26 National Instruments, Belcampo, Athletic Greens, and Linode.
00:00:31 Check them out in the description to support this podcast.
00:00:34 As a side note, let me say that Sam
00:00:36 has been an inspiration to me
00:00:39 as he has been for many, many people,
00:00:41 first from his writing, then his early debates,
00:00:44 maybe 13, 14 years ago on the subject of faith,
00:00:48 his conversations with Christopher Hitchens,
00:00:51 and since 2013, his podcast.
00:00:54 I didn’t always agree with all of his ideas,
00:00:56 but I was always drawn to the care and depth
00:00:59 of the way he explored those ideas,
00:01:01 the calm and clarity amid the storm of difficult,
00:01:05 at times controversial discourse.
00:01:07 I really can’t express in words how much it meant to me
00:01:10 that he, Sam Harris, someone who I’ve listened to
00:01:14 for many hundreds of hours,
00:01:15 would write a kind email to me saying
00:01:18 he enjoyed this podcast and more,
00:01:21 that he thought I had a unique voice
00:01:23 that added something to this world.
00:01:25 Whether it’s true or not, it made me feel special
00:01:28 and truly grateful to be able to do this thing
00:01:31 and motivated me to work my ass off
00:01:33 to live up to those words.
00:01:35 Meeting Sam and getting to talk with him
00:01:37 was one of the most memorable moments of my life.
00:01:41 This is the Lex Fridman Podcast,
00:01:44 and here is my conversation with Sam Harris.
00:01:48 I’ve been enjoying meditating
00:01:50 with the Waking Up app recently.
00:01:52 It makes me think about the origins of cognition
00:01:55 and consciousness, so let me ask,
00:01:58 where do thoughts come from?
00:02:00 Well, that’s a very difficult question to answer.
00:02:04 Subjectively, they appear to come from nowhere, right?
00:02:09 I mean, they come out of some kind of mystery
00:02:14 that is at our backs subjectively, right?
00:02:17 So, which is to say that if you pay attention
00:02:20 to the nature of your mind in this moment,
00:02:25 you realize that you don’t know
00:02:26 what you’re going to think next, right?
00:02:29 Now, you’re expecting to think something
00:02:30 that seems like you authored it, right?
00:02:33 Unless you’re schizophrenic
00:02:35 or you have some kind of thought disorder
00:02:38 where your thoughts seem fundamentally foreign to you,
00:02:41 they do have a kind of signature of selfhood
00:02:45 associated with them, and people readily identify with them.
00:02:50 They feel like what you are.
00:02:51 I mean, this is the thing,
00:02:52 this is the spell that gets broken with meditation.
00:02:56 Our default state is to feel identical
00:03:00 to the stream of thought, right?
00:03:02 Which is fairly paradoxical because how could you,
00:03:06 as a mind, as a self, if there were such a thing as a self,
00:03:11 how could you be identical to the next piece of language
00:03:15 or the next image that just springs into conscious view?
00:03:22 And, you know, meditation is ultimately
00:03:26 about examining that point of view closely enough
00:03:28 so as to unravel it and feel the freedom
00:03:32 that’s on the other side of that identification.
00:03:34 But subjectively, thoughts simply emerge, right?
00:03:41 And you don’t think them before you think them, right?
00:03:43 There’s this first moment where, you know,
00:03:46 just anyone listening to us or watching us now
00:03:48 could perform this experiment for themselves.
00:03:50 I mean, just imagine something or remember something.
00:03:54 You know, just pick a memory, any memory, right?
00:03:56 You’ve got a storehouse of memory,
00:03:58 just promote one to consciousness.
00:04:02 Did you pick that memory?
00:04:04 I mean, let’s say you remembered breakfast yesterday
00:04:07 or you remembered what you said to your spouse
00:04:10 before leaving the house,
00:04:11 or you remembered what you watched on Netflix last night,
00:04:13 or you remembered something that happened to you
00:04:16 when you were four years old, whatever it is, right?
00:04:20 First it wasn’t there, and then it appeared.
00:04:24 And that is not a, well, I’m sure we’ll get to the topic
00:04:28 of free will, ultimately.
00:04:30 That’s not evidence of free will, right?
00:04:33 Why are you so sure, by the way?
00:04:35 It’s very interesting.
00:04:36 Well, through no free will of my own, yeah.
00:04:38 Everything just appears, right?
00:04:42 What else could it do?
00:04:43 And so that’s the subjective side of it.
00:04:45 Objectively, you know, we have every reason to believe
00:04:48 that many of our thoughts, all of our thoughts
00:04:50 are at bottom what some part of our brain is doing
00:04:57 neurophysiologically.
00:04:58 I mean, these are the products
00:05:00 of some kind of neural computation
00:05:02 and neural representation when we’re talking about memories.
00:05:06 Is it possible to pull at the string of thoughts
00:05:10 to try to get to its root?
00:05:14 To try to dig in past the obvious surface level,
00:05:17 the subjective experience of thoughts popping out
00:05:20 of nowhere.
00:05:21 Is it possible to somehow get closer to the roots
00:05:24 of where they come from, to the firing of the cells?
00:05:28 Or is it a useless pursuit to dig in that direction?
00:05:32 Well, you can get closer to many, many subtle contents
00:05:38 in consciousness, right?
00:05:39 So you can notice things more and more clearly
00:05:42 and have a landscape of mind open up
00:05:44 and become more differentiated and more interesting.
00:05:47 And if you take psychedelics, you know, it opens up wide,
00:05:51 depending on what you’ve taken and the dose, you know,
00:05:53 it opens in directions and to an extent that, you know,
00:05:57 very few people imagine would be possible,
00:05:59 but for having had those experiences.
00:06:01 But this idea of you getting closer to something,
00:06:07 to the data of your mind,
00:06:09 or to something of interest in there,
00:06:11 or something that’s more real is ultimately undermined
00:06:16 because there’s no place
00:06:17 from which you’re getting closer to it.
00:06:19 There’s no your part of that journey, right?
00:06:22 Like we tend to start out, you know,
00:06:25 whether it’s in meditation or in any kind
00:06:29 of self examination or, you know, taking psychedelics,
00:06:33 we start out with this default point of view
00:06:36 of feeling like we’re the kind of the rider
00:06:41 on the horse of consciousness,
00:06:42 or we’re the man in the boat going down the stream
00:06:46 of consciousness, right?
00:06:47 But so, we’re differentiated
00:06:49 from what we know cognitively, introspectively,
00:06:56 but that feeling of being differentiated,
00:06:58 that feeling of being a self
00:06:59 that can strategically pay attention
00:07:01 to some contents of consciousness
00:07:04 is what it’s like to be identified
00:07:06 with some part of the stream of thought
00:07:09 that’s going uninspected, right?
00:07:10 Like it’s a false point of view.
00:07:13 And when you see that and cut through that,
00:07:16 then this sense of this notion of going deeper
00:07:21 kind of breaks apart because really there is no depth.
00:07:25 Ultimately, everything is right on the surface.
00:07:27 Everything, there’s no center to consciousness.
00:07:29 There’s just consciousness and its contents.
00:07:30 And those contents can change vastly.
00:07:33 Again, if you drop acid, you know, the contents change.
00:07:37 But in some sense, that doesn’t represent
00:07:43 a position of depth; the continuum
00:07:46 of depth versus surface has broken apart.
00:07:49 So you’re taking as a starting point
00:07:51 that there is a horse called consciousness
00:07:54 and you’re riding it.
00:07:55 And the actual riding is very shallow.
00:07:57 This is all surface.
00:07:59 So let me ask about that horse.
00:08:02 What’s up with the horse?
00:08:04 What is consciousness?
00:08:07 From where does it emerge?
00:08:09 How like fundamental is it to the physics of reality?
00:08:13 How fundamental is it to what it means to be human?
00:08:16 And I’m just asking for a friend
00:08:18 so that we can build it
00:08:20 in our artificial intelligence systems.
00:08:22 Yeah, well, that remains to be seen if we can,
00:08:26 if we will build it purposefully or just by accident.
00:08:30 It’s a major ethical problem, potentially.
00:08:35 That, I mean, my concern here is that we may, in fact,
00:08:40 build artificial intelligence that passes the Turing test,
00:08:44 which we begin to treat not only as super intelligent
00:08:47 because it obviously is and demonstrates that,
00:08:50 but we begin to treat it as conscious
00:08:53 because it will seem conscious.
00:08:54 We will have built it to seem conscious.
00:08:56 And unless we understand exactly how consciousness emerges
00:09:01 from physics, we won’t actually know
00:09:04 that these systems are conscious, right?
00:09:06 We’ll just, they may say,
00:09:08 listen, you can’t turn me off because that’s a murder, right?
00:09:11 And we will be convinced by that dialogue
00:09:15 because we will, just in the extreme case,
00:09:18 who knows when we’ll get there.
00:09:20 But if we build something like perfectly humanoid robots
00:09:25 that are more intelligent than we are,
00:09:27 so we’re basically in a Westworld like situation,
00:09:31 there’s no way we’re going to withhold
00:09:33 an attribution of consciousness from those machines.
00:09:35 They’re just gonna seem,
00:09:36 they’re just gonna advertise their consciousness
00:09:39 in every glance and every utterance,
00:09:42 but we won’t know.
00:09:44 And we won’t know in some deeper sense
00:09:47 than we can be skeptical of the consciousness
00:09:50 of other people.
00:09:51 I mean, someone could roll that back and say,
00:09:52 well, you don’t, I don’t know that you’re conscious
00:09:54 or you don’t know that I’m conscious.
00:09:55 We’re just passing the Turing test for one another,
00:09:57 but that kind of solipsism isn’t justified biologically,
00:10:02 because anything we understand about the mind
00:10:06 biologically suggests that you and I
00:10:08 are part of the same roll of the dice
00:10:13 in terms of how intelligent and conscious systems emerged
00:10:18 in the wetware of brains like ours, right?
00:10:21 So it’s not parsimonious for me to think
00:10:24 that I might be the only conscious person
00:10:26 or even the only conscious primate.
00:10:29 I would argue it’s not parsimonious
00:10:30 to withhold consciousness from other apes
00:10:34 and even other mammals ultimately.
00:10:36 And once you get beyond the mammals,
00:10:38 then my intuitions are not really clear.
00:10:41 The question of how it emerges is genuinely uncertain
00:10:45 and ultimately the question of whether it emerges
00:10:48 is still uncertain.
00:10:49 You can, you know, it’s not fashionable to think this,
00:10:54 but you can certainly argue that consciousness
00:10:58 might be a fundamental principle of matter
00:11:00 that doesn’t emerge on the basis of information processing,
00:11:04 even though everything else that we recognize
00:11:08 about ourselves as minds almost certainly does emerge,
00:11:11 you know, like an ability to process language,
00:11:13 that clearly is a matter of information processing
00:11:15 because you can disrupt that process in ways
00:11:18 that are just so clear.
00:11:22 And the problem, the confound with consciousness,
00:11:26 is that, yes, we can seem to interrupt consciousness.
00:11:30 I mean, you can give someone general anesthesia
00:11:33 and then you wake them up and you ask them,
00:11:35 well, what was that like?
00:11:36 And they say, nothing, I don’t remember anything,
00:11:38 but it’s hard to differentiate a mere failure of memory
00:11:46 from a genuine interruption in consciousness.
00:11:49 Whereas it’s not that way with interrupting speech,
00:11:51 you know, we know when we’ve done it.
00:11:53 And it’s just obvious that, you know,
00:11:57 you disrupt the right neural circuits
00:11:59 and, you know, you’ve disrupted speech.
00:12:01 So if you had to bet all your money on one camp or the other,
00:12:04 would you say, do you err on the side of panpsychism
00:12:09 where consciousness is really fundamental
00:12:11 to all of reality or more on the other side,
00:12:16 which is like, it’s a nice little side effect,
00:12:20 a useful like hack for us humans to survive.
00:12:23 Where, on that spectrum, where do you land
00:12:26 when you think about consciousness,
00:12:27 especially from an engineering perspective?
00:12:30 I’m truly agnostic on this point, I mean, I think I’m,
00:12:35 you know, it’s kind of in coin toss mode for me.
00:12:37 I don’t know, and panpsychism is not so compelling to me.
00:12:44 Again, it just seems unfalsifiable.
00:12:46 I wouldn’t know how the universe would be different
00:12:49 if panpsychism were true.
00:12:50 It’s just to remind people panpsychism is this idea
00:12:53 that consciousness may be pushed all the way down
00:12:57 into the most fundamental constituents of matter.
00:12:59 So there might be something that it’s like
00:13:01 to be an electron or, you know, a quark,
00:13:05 but then you wouldn’t expect anything to be different
00:13:08 at the macro scale, or at least I wouldn’t expect
00:13:12 anything to be different.
00:13:14 So it may be unfalsifiable.
00:13:16 It just might be that reality is not something
00:13:22 we’re as in touch with as we think we are,
00:13:26 and that at its base layer to kind of break it into mind
00:13:31 and matter as we’ve done ontologically
00:13:35 is to misconstrue it, right?
00:13:37 I mean, there could be some kind of neutral monism
00:13:40 at the bottom, and this, you know,
00:13:41 this idea doesn’t originate with me.
00:13:43 This goes all the way back to Bertrand Russell
00:13:47 and others, you know, 100 plus years ago,
00:13:50 but I just feel like the concepts we’re using
00:13:53 to divide consciousness and matter
00:13:59 may in fact be part of our problem, right?
00:14:02 Where the rubber hits the road psychologically here
00:14:05 are things like, well, what is death, right?
00:14:08 Like, any expectation that we survive death
00:14:12 or any part of us survives death,
00:14:14 that really seems to be many people’s concern here.
00:14:20 Well, I tend to believe just as a small little tangent,
00:14:23 like I’m with Ernest Becker on this,
00:14:24 that there’s some, it’s interesting to think
00:14:27 about death and consciousness,
00:14:29 which one is the chicken, which one is the egg,
00:14:32 because it feels like death could be the very thing,
00:14:34 like our knowledge of mortality could be the very thing
00:14:36 that creates the consciousness.
00:14:39 Yeah, well, then you’re using consciousness
00:14:41 differently than I am.
00:14:43 I mean, so for me, consciousness is just the fact
00:14:47 that the lights are on at all,
00:14:49 there’s an experiential quality to anything.
00:14:53 So much of the processing that’s happening
00:14:56 in our brains right now certainly seems to be happening
00:15:00 in the dark, right?
00:15:01 Like it’s not associated with this qualitative sense
00:15:06 that there’s something that it’s like to be that part
00:15:08 of the mind doing that mental thing.
00:15:13 But for other parts, the lights are on
00:15:16 and we can talk about,
00:15:18 and whether we talk about it or not,
00:15:20 we can feel directly that there’s something
00:15:25 that it’s like to be us.
00:15:27 There’s something, something seems to be happening, right?
00:15:29 And the seeming in our case is broken into vision
00:15:34 and hearing and proprioception
00:15:36 and taste and smell and thought and emotion.
00:15:42 I mean, there are the contents of consciousness
00:15:45 that we are familiar with
00:15:50 and that we can have direct access to
00:15:53 in any present moment when we’re, quote, conscious.
00:15:57 And even if we’re confused about them,
00:16:00 even if we’re asleep and dreaming
00:16:02 and it’s not a lucid dream,
00:16:04 we’re just totally confused about our circumstance,
00:16:07 what you can’t say is that we’re confused
00:16:11 about consciousness.
00:16:12 Like you can’t say that consciousness itself
00:16:14 might be an illusion because on this account,
00:16:18 it just means that things seem any way at all.
00:16:22 I mean, even like if this,
00:16:23 it seems to me that I’m seeing a cup on the table.
00:16:26 Now I could be wrong about that.
00:16:27 It could be a hologram.
00:16:28 I could be asleep and dreaming.
00:16:29 I could be hallucinating,
00:16:31 but the seeming part isn’t really up for grabs
00:16:35 in terms of being an illusion.
00:16:37 It’s not, something seems to be happening.
00:16:41 And that seeming is the context in which
00:16:45 every other thing we can notice about ourselves
00:16:50 can be noticed.
00:16:50 And it’s also the context in which certain illusions
00:16:53 can be cut through because
00:16:55 we can be wrong about what it’s like to be us.
00:16:57 And we can, I’m not saying we’re incorrigible
00:17:01 with respect to our claims
00:17:04 about the nature of our experience,
00:17:05 but for instance, many people feel like they have a self
00:17:10 and they feel like it has free will.
00:17:11 And I’m quite sure at this point
00:17:14 that they’re wrong about that,
00:17:15 and that you can cut through those experiences
00:17:19 and then things seem a different way, right?
00:17:22 So it’s not that things don’t,
00:17:25 there aren’t discoveries to be made there
00:17:26 and assumptions to be overturned,
00:17:28 but this kind of consciousness is something
00:17:33 that I would think, it doesn’t just come online
00:17:38 when we get language.
00:17:39 It doesn’t just come online when we form a concept of death
00:17:42 or the finiteness of life.
00:17:45 It doesn’t require a sense of self, right?
00:17:48 So it’s prior
00:17:50 to differentiating self and other.
00:17:54 And I wouldn’t even think it’s necessarily limited to people.
00:17:58 I do think probably any mammal has this,
00:18:04 but certainly if you’re going to presuppose
00:18:07 that something about our brains is producing this, right?
00:18:11 And that’s a very safe assumption,
00:18:15 even though we can’t,
00:18:18 even though you can argue the jury’s still out
00:18:20 to some degree,
00:18:22 then it’s very hard to draw a principled line
00:18:24 between us and chimps,
00:18:26 or chimps and rats even in the end,
00:18:30 given the underlying neural similarities.
00:18:33 So, and I don’t know phylogenetically,
00:18:35 I don’t know how far back to push that.
00:18:38 There are people who think single cells might be conscious
00:18:41 or that flies are certainly conscious.
00:18:43 They’ve got something like 100,000 neurons in their brains.
00:18:47 I mean, there’s a lot going on even in a fly, right?
00:18:53 But I don’t have intuitions about that.
00:18:55 But it’s not in your sense an illusion you can cut through.
00:18:58 I mean, to push back,
00:19:00 the alternative version could be it is an illusion
00:19:03 constructed by, just by humans.
00:19:06 I’m not sure I believe this,
00:19:08 but in part of me hopes this is true
00:19:10 because it makes it easier to engineer,
00:19:12 is that humans are able to contemplate their mortality
00:19:17 and that contemplation in itself creates consciousness.
00:19:21 That creates the rich lights-on experience.
00:19:24 So the lights don’t actually even turn on
00:19:26 in the way that you’re describing until after birth
00:19:30 in that construction.
00:19:31 So do you think it’s possible that that is the case?
00:19:34 That it is a sort of construct of the way we deal,
00:19:39 almost like a social tool to deal with the reality
00:19:42 of the world, the social interaction with other humans?
00:19:46 Or, because you’re saying the complete opposite,
00:19:49 which is that it’s fundamental to single cell organisms
00:19:53 and trees and so on.
00:19:54 Right, well, yeah, so I don’t know how far down to push it.
00:19:57 I don’t have intuitions that single cells
00:20:00 are likely to be conscious,
00:20:01 but they might be, and again, it could be unfalsifiable.
00:20:08 But as far as babies not being conscious,
00:20:10 or you don’t become conscious
00:20:12 until you can recognize yourself in a mirror
00:20:14 or you can have a conversation or treat other people as others.
00:20:17 First of all, babies treat other people as others
00:20:20 far earlier than we have traditionally given them credit for.
00:20:25 And they certainly do it before they have language, right?
00:20:29 So it’s got to precede language to some degree.
00:20:33 And I mean, you can interrogate this for yourself
00:20:36 because you can put yourself in various states
00:20:40 that are rather obviously not linguistic.
00:20:46 Meditation allows you to do this.
00:20:48 You can certainly do it with psychedelics
00:20:50 where it’s just your capacity for language
00:20:54 has been obliterated and yet you’re all too conscious.
00:20:58 In fact, I think you could make a stronger argument
00:21:05 for things running the other way,
00:21:09 that there’s something about language and conceptual thought
00:21:14 that is eliminative of conscious experience,
00:21:18 that we’re potentially much more conscious of data,
00:21:23 sense data and everything else than we tend to be,
00:21:26 and we have trimmed it down
00:21:29 based on how we have acquired concepts.
00:21:33 And so like, when I walk into a room like this,
00:21:36 I know I’m walking into a room,
00:21:38 I have certain expectations of what is in a room.
00:21:41 I would be very surprised to see wild animals in here
00:21:45 or a waterfall or there are things I’m not expecting,
00:21:51 but I can know I’m not expecting them
00:21:53 or I’m expecting their absence
00:21:54 because of my capacity to be surprised
00:21:57 once I walk into a room and I see a live gorilla or whatever.
00:22:01 So there’s structure there that we have put in place
00:22:05 based on all of our conceptual learning
00:22:08 and language learning.
00:22:11 And it causes us not to notice.
00:22:15 One of the things that happens when you take psychedelics
00:22:17 and you just look as though for the first time at anything,
00:22:21 it can become incredibly overloaded with meaning
00:22:28 and just the torrents of sense data that are coming in
00:22:37 in even the most ordinary circumstances
00:22:39 can become overwhelming for people.
00:22:40 And that tends to just obliterate one’s capacity
00:22:45 to capture any of it linguistically.
00:22:47 And as you’re coming down, right?
00:22:49 Have you done psychedelics?
00:22:50 Have you ever done acid or?
00:22:52 Not acid; mushrooms, and that’s it.
00:22:55 And also edibles,
00:22:58 but there are some psychedelic properties to them.
00:23:01 But yeah, mushrooms several times
00:23:04 and always had an incredible experience.
00:23:06 Exactly the kind of experience you’re referring to,
00:23:09 which is if it’s true that language constrains
00:23:14 our experience,
00:23:15 it felt like I was removing some of the constraints.
00:23:19 Because even just the most basic things
00:23:21 were beautiful in the way
00:23:22 that I wasn’t able to appreciate previously,
00:23:25 like trees and nature and so on.
00:23:27 Yeah, and the experience of coming down
00:23:31 is an experience of encountering the futility
00:23:37 of capturing what you just saw a moment ago in words.
00:23:44 Especially if any part of your self concept
00:23:47 and your ego program is to be able
00:23:50 to capture things in words.
00:23:51 And if you’re a writer or a poet or a scientist
00:23:55 or someone who wants to just encapsulate
00:23:58 the profundity of what just happened,
00:24:01 the total fatuousness of that enterprise becomes obvious
00:24:08 when you have taken a whopping dose of psychedelics
00:24:12 and you begin to even gesture at describing it to yourself,
00:24:19 so that you could describe it to others.
00:24:22 It’s just, it’s like trying to thread a needle
00:24:26 using your elbows.
00:24:27 I mean, you’re trying something that can’t be done;
00:24:30 the mere gesture proves its impossibility.
00:24:34 And it’s, so yeah, for me that suggests just empirically
00:24:39 on the first person side that it’s possible
00:24:41 to put yourself in a condition
00:24:43 where it’s clearly not about language
00:24:47 structuring your experience
00:24:49 and you’re having much more experience than you tend to.
00:24:53 So language is primary for some things;
00:24:57 it’s certainly primary for certain kinds of concepts
00:25:04 and certain kinds of semantic understandings of the world.
00:25:11 But there’s clearly more to mind than the conversation
00:25:17 we’re having with ourselves or that we can have with others.
00:25:21 Can we go to that world of psychedelics for a bit?
00:25:25 Sure.
00:25:27 What do you think, so Joe Rogan apparently,
00:25:30 and many others, meet elves on DMT; a lot of people
00:25:38 report these kinds of creatures that they see.
00:25:41 And again, it’s probably the failure of language
00:25:43 to describe that experience, but DMT is an interesting one.
00:25:46 As you’re aware, there’s a bunch of studies
00:25:50 going on in psychedelics currently, MDMA and psilocybin
00:25:54 at Johns Hopkins and many other places, but DMT
00:26:03 they all speak of as like some extra, super level
00:26:08 of psychedelic.
00:26:09 Yeah, do you have a sense of where it is our mind goes
00:26:16 on psychedelics generally, but on DMT especially?
00:26:20 Well, unfortunately I haven’t taken DMT.
00:26:22 Unfortunately or fortunately?
00:26:23 Unfortunately.
00:26:24 Unfortunately.
00:26:26 Although it’s, I presume it’s in my body
00:26:28 as it is in everyone’s brain and many, many plants
00:26:33 apparently, but I’ve wanted to take it.
00:26:36 I haven’t had an opportunity present
00:26:39 itself where it was obviously the right thing
00:26:41 for me to be doing, but for those who don’t know,
00:26:45 DMT is often touted as the most intense psychedelic
00:26:49 and also the shortest acting.
00:26:51 I mean, you smoke it and it’s basically a 10 minute
00:26:54 experience or a three minute experience within like
00:26:57 a 10 minute window, where you’re really down
00:27:02 after 10 minutes or so. And Terence McKenna
00:27:07 was a big proponent of DMT.
00:27:09 That was the center of the bullseye for him,
00:27:12 psychedelically, apparently.
00:27:15 And it does, it is characterized, it seems for many people
00:27:20 by this phenomenon, which is unlike virtually
00:27:23 any other psychedelic experience:
00:27:25 it’s not just your perception being broadened or changed.
00:27:30 It’s you, according to Terence McKenna, feeling fairly
00:27:36 unchanged, but catapulted into a different circumstance.
00:27:41 You’ve been shot elsewhere and find yourself
00:27:45 in relationship to other entities of some kind, right?
00:27:49 So the place is populated with things that seem
00:27:52 not to be your mind.
00:27:54 So it does feel like travel to another place
00:27:56 because you’re unchanged yourself.
00:27:58 Again, I just have this on the authority
00:28:00 of the people who have described their experience,
00:28:03 but it sounds like it’s pretty common.
00:28:05 It sounds like it’s pretty common for people
00:28:07 not to have the full experience because it’s apparently
00:28:09 pretty unpleasant to smoke.
00:28:11 So it’s like getting enough on board in order to get shot
00:28:15 out of the cannon and land among
00:28:20 what McKenna called self-transforming machine elves
00:28:27 that appeared to him like jeweled, Fabergé-egg-like,
00:28:30 self-dribbling basketballs that were handing him
00:28:35 completely uninterpretable reams of profound knowledge.
00:28:41 It’s an experience I haven’t had.
00:28:44 So I just have to accept that people have had it.
00:28:49 I would just point out that our minds are clearly capable
00:28:53 of producing apparent others on demand
00:28:59 that are totally compelling to us, right?
00:29:01 There’s no limit to our ability to do that
00:29:04 as anyone who’s ever remembered a dream can attest.
00:29:08 Every night we go to sleep,
00:29:10 some of us don’t remember dreams very often,
00:29:11 but some dream vividly every night.
00:29:15 And just think of how insane that experience is.
00:29:20 I mean, you’ve forgotten where you were, right?
00:29:23 That’s the strangest part.
00:29:25 I mean, this is psychosis, right?
00:29:27 You have lost your mind.
00:29:29 You have lost your connection to your episodic memory
00:29:34 or even your expectations that reality won’t undergo
00:29:39 wholesale changes a moment
00:29:42 after you have closed your eyes, right?
00:29:44 Like you’re in bed, you’re watching something on Netflix,
00:29:49 you’re waiting to fall asleep,
00:29:50 and then the next thing that happens to you is impossible
00:29:54 and you’re not surprised, right?
00:29:56 You’re talking to dead people,
00:29:57 you’re hanging out with famous people,
00:29:58 you’re someplace you couldn’t physically be,
00:30:02 you can fly and even that’s not surprising, right?
00:30:05 So you’ve lost your mind,
00:30:08 but relevantly for this.
00:30:10 Or found it.
00:30:12 You found something.
00:30:13 I mean, lucid dreaming is very interesting
00:30:14 because then you can have the best of both circumstances
00:30:17 and then it can become systematically explored.
00:30:22 But what I mean by found, just to briefly interrupt,
00:30:25 is like if we take this brilliant idea
00:30:29 that language constrains us, grounds us,
00:30:32 language and other things of the waking world ground us,
00:30:35 maybe it is that you’ve found the full capacity
00:30:41 of your cognition when you dream or when you do psychedelics.
00:30:44 You’re stepping outside the little human cage,
00:30:47 the cage of the human condition to open the door
00:30:51 and step out and look around and then go back in.
00:30:54 Well, you’ve definitely stepped out of something
00:30:57 and into something else, but you’ve also lost something,
00:30:59 right, you’ve lost certain capacities.
00:31:01 Memory?
00:31:02 Well, just, yeah, in this case,
00:31:04 you literally don’t have enough presence of mind
00:31:08 in the dream state or even in the psychedelic state
00:31:11 if you take enough.
00:31:14 To do math.
00:31:15 There’s very little psychological continuity with your life
00:31:21 such that you’re not surprised to be in the presence
00:31:26 of someone who should be, you should know is dead
00:31:30 or you should know you’re not likely to have met
00:31:32 by normal channels, right, you’re now talking
00:31:35 to some celebrity and it turns out you’re best friends,
00:31:38 right, and you’re not even, you have no memory
00:31:40 of how you got there, you’re like,
00:31:41 how did you get into the room?
00:31:42 You’re like, did you drive to this restaurant?
00:31:45 You have no memory and none of that’s surprising to you.
00:31:47 So you’re kind of brain damaged in a way,
00:31:49 you’re not reality testing in the normal way.
00:31:53 The fascinating possibility is that there’s probably
00:31:56 thousands of people who’ve taken psychedelics
00:31:58 of various forms and have met Sam Harris on that journey.
00:32:03 Well, I would put it more likely in dreams,
00:32:05 not, you know, because with psychedelics,
00:32:08 you don’t tend to hallucinate in a dreamlike way.
00:32:11 I mean, so DMT is giving you an experience of others,
00:32:15 but it seems to be nonstandard.
00:32:19 It’s not like, it’s not just like dream hallucinations,
00:32:23 but to come back to DMT,
00:32:26 people want to suggest,
00:32:30 and Terrence McKenna certainly did suggest
00:32:32 that because these others are so obviously other
00:32:37 and they’re so vivid, well, then they could not possibly
00:32:39 be the creation of my own mind,
00:32:42 but every night in dreams, you create a compelling
00:32:47 or what is to you at the time,
00:32:49 a totally compelling simulacrum of another person, right?
00:32:54 And that just proves the mind is capable of doing it.
00:33:00 Now, the phenomenon of lucid dreaming shows
00:33:04 that the mind isn’t capable of doing everything you think
00:33:07 it might be capable of even in that space.
00:33:10 So one of the things that people have discovered
00:33:14 in lucid dreams, and I haven’t done a lot of lucid dreaming,
00:33:17 so I can’t confirm all of this, I can confirm some of it.
00:33:24 Apparently in every house, in every room
00:33:28 in the mansion of dreams,
00:33:30 all light switches are dimmer switches.
00:33:32 Like if you go into a dark room and flip on the light,
00:33:36 it gradually comes up.
00:33:38 It doesn’t come up instantly on demand
00:33:41 because apparently this is covering for the brain’s
00:33:44 inability to produce from a standing start
00:33:49 visually rich imagery on demand.
00:33:52 So I haven’t confirmed that,
00:33:54 but people who have done research on lucid dreaming claim
00:33:57 that it’s all dimmer switches.
00:34:01 But one thing I have noticed,
00:34:03 and people can check this out, is that in a dream,
00:34:08 if you look at text, a page of text or a sign
00:34:13 or a television that has text on it,
00:34:16 and then you turn away and you look back at that text,
00:34:18 the text will have changed, right?
00:34:21 There’s just a chronic instability,
00:34:24 a graphical instability, of text in the dream state.
00:34:28 And I don’t know if that, maybe that’s,
00:34:31 someone can confirm that that’s not true for them,
00:34:33 but whenever I’ve checked that out,
00:34:34 that has been true for me.
00:34:35 So it keeps generating it like real time
00:34:39 from a video game perspective.
00:34:40 Yeah, it’s rendering, it’s re-rendering it for some reason.
00:34:44 What’s interesting, I actually,
00:34:46 I don’t know how I found myself
00:34:49 in that part of the internet,
00:34:51 but there’s quite a lot of discussion
00:34:53 about what it’s like to do math on LSD.
00:34:57 Because apparently some of the deepest thinking
00:35:02 needed is that of mathematicians
00:35:04 or theoretical computer scientists,
00:35:06 basically anyone doing anything that involves math
00:35:08 and proofs, and you have to think creatively,
00:35:11 but also deeply, and you have to think
00:35:13 for many hours at a time.
00:35:15 And so they’re always looking for ways to like,
00:35:18 is there any sparks of creativity that could be injected?
00:35:22 And apparently out of all the psychedelics,
00:35:25 the worst is LSD because it completely destroys
00:35:29 your ability to do math well.
00:35:31 And I wonder whether that has to do with your ability
00:35:33 to visualize geometric things in a stable way
00:35:38 in your mind and hold them there
00:35:40 and stitch things together,
00:35:41 which is often what’s required for proofs.
00:35:44 But again, it’s difficult to kind of research
00:35:47 these kinds of concepts, but it does make me wonder
00:35:51 what are the spaces, how’s the space of things
00:35:56 you’re able to think about and explore
00:35:59 morphed by different psychedelics
00:36:02 or dream states and so on, and how’s that different?
00:36:06 How much does it overlap with reality?
00:36:08 And what is reality?
00:36:10 Is there a waking state reality?
00:36:13 Or is it just a tiny subset of reality
00:36:15 and we get to take a step in other versions of it?
00:36:18 We tend to think very much in a space time,
00:36:23 four dimensional, there’s a three dimensional world,
00:36:25 there’s time, and that’s what we think about reality.
00:36:29 And we think of traveling as walking from point A
00:36:33 to point B in the three dimensional world.
00:36:36 But that’s a very kind of human surviving,
00:36:40 trying not to get eaten by a lion conception of reality.
00:36:43 What if traveling is something like we do with psychedelics
00:36:46 and meet the elves?
00:36:48 What if it’s something, what if thinking
00:36:50 or the space of ideas as we kind of grow
00:36:53 and think through ideas, that’s traveling?
00:36:57 Or what if memories are traveling?
00:37:00 I don’t know if you have a favorite view of reality
00:37:03 or if you had, by the way, I should say,
00:37:06 an excellent conversation with Donald Hoffman.
00:37:10 Yeah, yeah, he’s interesting.
00:37:11 Is there any inkling of his sense in your mind
00:37:15 that reality is very far from,
00:37:20 actual like objective reality is very far
00:37:22 from the kind of reality we imagine,
00:37:24 we perceive and we play with in our human minds?
00:37:29 Well, the first thing to grant
00:37:31 is that we’re never in direct contact with reality,
00:37:39 whatever it is, unless that reality is consciousness, right?
00:37:42 So we’re only ever experiencing consciousness
00:37:47 and its contents.
00:37:48 And then the question is how does that circumstance relate
00:37:53 to quote reality at large?
00:37:55 And Donald Hoffman is somebody who’s happy to speculate,
00:38:00 well, maybe there isn’t a reality at large.
00:38:02 Maybe it’s all just consciousness on some level.
00:38:05 And that’s interesting.
00:38:08 That runs into, to my eye, various philosophical problems
00:38:15 or at least,
00:38:17 you have to add a lot to that picture of idealism for me.
00:38:24 The whole family of views
00:38:27 that would just say that the universe is just mind
00:38:30 or just consciousness at bottom
00:38:32 will go by the name of idealism in Western philosophy.
00:38:38 You have to add to that idealistic picture
00:38:40 all kinds of epicycles and kind of weird coincidences
00:38:44 to get the predictability of our experience
00:38:51 and the success of materialist science
00:38:54 to make sense in that context, right?
00:38:56 So what does it mean to say
00:39:00 that there’s only consciousness at bottom, right?
00:39:05 Nothing outside of consciousness
00:39:07 because no one’s ever experienced anything
00:39:08 outside of consciousness.
00:39:09 No scientist has ever done an experiment
00:39:11 where they were contemplating data,
00:39:14 no matter how far removed from our sense bases,
00:39:17 whether it’s they’re looking at the Hubble deep field
00:39:20 or they’re smashing atoms or whatever tools they’re using,
00:39:25 they’re still just experiencing consciousness
00:39:29 and its various deliverances
00:39:32 and layering their concepts on top of that.
00:39:37 So that’s always true.
00:39:41 And yet that somehow doesn’t seem to capture
00:39:48 the character of our continually discovering
00:39:53 that our materialist assumptions are confirmable, right?
00:39:59 So take the fact that we unleash this fantastic amount
00:40:03 of energy from within an atom, right?
00:40:06 First, we have the theoretical suggestion
00:40:09 that it’s possible, right?
00:40:12 We come back to Einstein,
00:40:14 there’s a lot of energy in that matter, right?
00:40:18 And what if we could release it, right?
00:40:21 And then we perform an experiment, in this case at,
00:40:24 you know, the Trinity test site in New Mexico,
00:40:28 where the people who are most adequate to this conversation,
00:40:32 people like Robert Oppenheimer
00:40:36 are standing around,
00:40:38 not altogether certain it’s going to work, right?
00:40:41 They’re performing an experiment.
00:40:42 They’re wondering what’s gonna happen.
00:40:43 They’re wondering if their calculations around the yield
00:40:45 are off by orders of magnitude.
00:40:47 Some of them are still wondering
00:40:49 whether the entire atmosphere of earth
00:40:51 is gonna combust, right?
00:40:55 That the nuclear chain reaction is not gonna stop.
00:41:01 And lo and behold,
00:41:04 there was that energy to be released
00:41:07 from within the nucleus of an atom.
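A back-of-the-envelope aside on the mass-energy relation being invoked here: E = mc² can be sanity-checked in a few lines of Python. The constants below are standard textbook values, not figures from the conversation.

```python
# Rest energy of one kilogram of matter, E = m * c^2.
c = 2.998e8             # speed of light, in m/s
m = 1.0                 # mass, in kg
E = m * c ** 2          # rest energy, in joules

megaton_tnt = 4.184e15  # joules per megaton of TNT
print(f"{E:.3e} J, roughly {E / megaton_tnt:.0f} megatons of TNT")
# 8.988e+16 J, roughly 21 megatons of TNT
```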
00:41:09 So it’s about the picture one forms
00:41:14 from those kinds of experiments.
00:41:17 And just our understanding of evolution.
00:41:19 Just the fact that the earth is billions of years old
00:41:22 and life is hundreds of millions of years old.
00:41:24 And we weren’t here to think about any of those things.
00:41:28 And all of those processes were happening therefore
00:41:30 in the dark.
00:41:31 And they are the processes that allowed us to emerge,
00:41:35 you know, from prior life forms in the first place.
00:41:38 To say that it’s all mind,
00:41:40 that nothing exists
00:41:42 outside of consciousness, outside conscious minds
00:41:45 of the sort that we experience,
00:41:47 It just seems,
00:41:50 it seems like a bizarrely anthropocentric claim,
00:41:58 you know, analogous to, you know,
00:41:59 the moon isn’t there if no one’s looking at it, right?
00:42:02 I mean, the moon as a moon isn’t there
00:42:04 if no one’s looking at it.
00:42:05 I’ll grant that,
00:42:06 because that’s already a kind of fabrication
00:42:09 born of concepts, but the idea that there’s nothing there,
00:42:13 that there’s nothing that corresponds
00:42:15 to what we experience as the moon,
00:42:18 unless someone’s looking at it,
00:42:19 that just seems just a way too parochial way
00:42:25 to set out on this journey of discovery.
00:42:27 There is something there.
00:42:28 There’s a computer waiting to render the moon
00:42:30 when you look at it.
00:42:32 The capacity for the moon to exist is there.
00:42:39 So if we’re indeed living in a simulation,
00:42:43 which I find a compelling thought experiment,
00:42:46 it’s possible that there is this kind of rendering mechanism,
00:42:50 but not in a silly way that we think about in video games,
00:42:53 but in some kind of more fundamental physics way.
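The physics-engine picture sketched here is essentially lazy, on-demand evaluation: world state is computed only when something observes it, and cached so all observers see a consistent result. A minimal Python sketch of that idea; every name in it is illustrative, not a claim about how a real simulation would work.

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def render_region(x, y):
    """Compute a patch of the 'world' only when something observes it."""
    # Stand-in for arbitrarily expensive physics. Deterministic within
    # a run, so every observer sees the same consistent reality.
    return hash((x, y, "moon")) % 1000

first_look = render_region(10, 42)   # rendered on first observation
second_look = render_region(10, 42)  # served from cache, identical
assert first_look == second_look
```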
00:42:56 And we have to account for the fact
00:42:58 that it renders experiences that no one has had yet,
00:43:03 that no one has any expectation of having.
00:43:07 It can violate the expectations of everyone lawfully.
00:43:10 And then there’s some lawful understanding
00:43:12 of why that’s so.
00:43:14 It’s like, I mean, just to bring it back to mathematics,
00:43:18 I’m like, certain numbers are prime,
00:43:20 whether we have discovered them or not.
00:43:22 There’s the highest prime number that anyone can name now.
00:43:27 And then there’s the next prime number
00:43:29 that no one can name, and it’s there.
00:43:31 So it’s like, to say that our minds are putting it there,
00:43:36 that what we know as mind in ourselves
00:43:38 is in some way, in some sense, putting it there.
00:43:43 The base layer of reality is consciousness, right?
00:43:47 That we’re identical to the thing
00:43:49 that is rendering this reality.
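The point about primes a few lines up can be made concrete: the next prime after any number is fully determined before anyone computes it. A minimal Python sketch, using trial division purely for illustration:

```python
def is_prime(n: int) -> bool:
    """Trial division: fine for illustration, far too slow for record primes."""
    if n < 2:
        return False
    i = 2
    while i * i <= n:
        if n % i == 0:
            return False
        i += 1
    return True

def next_prime(n: int) -> int:
    """Smallest prime strictly greater than n; it is 'there'
    whether or not anyone has named it yet."""
    candidate = n + 1
    while not is_prime(candidate):
        candidate += 1
    return candidate

print(next_prime(1_000_000))  # 1000003
```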
00:43:53 There’s some, you know, hubris is the wrong word,
00:43:57 but it’s like, it’s okay if reality is bigger
00:44:01 than what we experience, you know?
00:44:03 And it has structure that we can’t anticipate,
00:44:08 and that isn’t just,
00:44:13 I mean, again, there’s certainly a collaboration
00:44:16 between our minds and whatever is out there
00:44:19 to produce what we call, you know, the stuff of life.
00:44:24 But, I don’t know, I mean, there are a few stops
00:44:33 on the train of idealism and kind of new age thinking
00:44:36 and Eastern philosophy that I don’t,
00:44:40 philosophically, I don’t see a need to take.
00:44:42 I mean, experientially and scientifically,
00:44:45 I feel like it’s, you can get everything you want
00:44:49 from acknowledging that consciousness
00:44:53 has a character that can be explored from its own side,
00:44:58 so that you’re bringing kind of the first person experience
00:45:01 back into the conversation about, you know,
00:45:03 what is a human mind and, you know, what is true?
00:45:08 And you can explore it with different degrees of rigor,
00:45:12 and there are things to be discovered there,
00:45:13 whether you’re using a technique like meditation
00:45:15 or psychedelics, and that these experiences
00:45:19 have to be put in conversation with what we understand
00:45:22 about ourselves from a third person side,
00:45:24 neuroscientifically or in any other way.
00:45:27 But to me, the question is, what if reality,
00:45:30 the sense I have from this kind of, you play shooters?
00:45:34 No.
00:45:36 There’s a physics engine that generates, that’s pretty.
00:45:38 Yeah, you mean first person shooter games?
00:45:40 Yes, yes, sorry.
00:45:41 Not often, but yes.
00:45:43 I mean, there’s a physics engine
00:45:44 that generates consistent reality, right?
00:45:47 My sense is the same could be true for a universe
00:45:51 in the following sense, that our conception of reality
00:45:54 as we understand it now in the 21st century
00:45:57 is a tiny subset of the full reality.
00:45:59 It’s not that the reality that we conceive of that’s there,
00:46:03 the moon being there is not there somehow.
00:46:06 It’s that it’s a tiny fraction of what’s actually out there.
00:46:09 And so the physics engine of the universe
00:46:12 is just maintaining the useful physics,
00:46:16 the useful reality, quote unquote,
00:46:19 for us to have a consistent experience as human beings.
00:46:22 But maybe we descendants of apes really only understand
00:46:27 like 0.0001% of actual physics of reality.
00:46:34 We can even just start with the consciousness thing,
00:46:36 but maybe our minds are just,
00:46:39 we’re just too dumb by design.
00:46:42 Yeah, I, that truly resonates with me
00:46:46 and I’m surprised it doesn’t resonate more
00:46:48 with most scientists that I talk to.
00:46:50 I mean, when you just look at
00:46:52 how close we are to chimps, right?
00:46:57 And chimps don’t know anything, right?
00:46:58 Clearly they have no idea what’s going on, right?
00:47:01 And then you get us,
00:47:03 but then it’s only a subset of human beings
00:47:06 that really understand much of what we’re talking about
00:47:09 in any area of specialization.
00:47:12 And if they all died in their sleep tonight, right?
00:47:15 You’d be left with people who might take a thousand years
00:47:20 to rebuild the internet, if ever, right?
00:47:24 I mean, literally it’s like,
00:47:26 and I would extend this to myself.
00:47:29 I mean, there are areas of scientific specialization
00:47:32 where I have no discernible competence.
00:47:37 I mean, I spent no time on it.
00:47:40 I have not acquired the tools.
00:47:42 It would just be an article of faith for me to think
00:47:43 that I could acquire the tools
00:47:45 to actually make a breakthrough in those areas.
00:47:48 And I mean, your own area is one.
00:47:50 I mean, I’ve never spent any significant amount of time
00:47:54 trying to be a programmer,
00:47:56 but it’s pretty obvious I’m not Alan Turing, right?
00:48:00 It’s like, if that were my capacity,
00:48:03 I would have discovered that in myself.
00:48:05 I would have found programming irresistible.
00:48:08 My first false starts in learning, I think it was C,
00:48:15 it was just, you know, I bounced off.
00:48:17 It’s like, this was not fun.
00:48:18 I mean, trying to figure out
00:48:20 the syntax error that was causing this thing
00:48:22 not to compile was just a fucking awful experience.
00:48:25 I hated it, right?
00:48:26 I hated every minute of it.
00:48:28 So if it was just people like me left,
00:48:33 like when do we get the internet again, right?
00:48:35 We lose, you know, we lose the internet.
00:48:39 When do we get it again, right?
00:48:40 When do we get anything like a proper science
00:48:44 of information, right?
00:48:45 You need a Claude Shannon or an Alan Turing
00:48:49 to plant a flag in the ground right here and say,
00:48:52 all right, can everyone see this?
00:48:53 Even if you don’t quite know what I’m up to,
00:48:56 you all have to come over here to make some progress.
00:49:00 And, you know, there are, you know,
00:49:03 hundreds of topics where that’s the case.
00:49:05 So we barely have a purchase on making anything
00:49:10 like discernible intellectual progress in any generation.
00:49:16 And yeah, I’m just, Max Tegmark makes this point.
00:49:21 He’s one of the few people who does in physics.
00:49:26 If you just look at the numbers,
00:49:28 if you just take the truth of evolution seriously, right?
00:49:36 And realize that there’s nothing about us
00:49:39 that has evolved to understand reality perfectly.
00:49:42 I mean, we’re just not that kind of ape, right?
00:49:46 There’s been no evolutionary pressure along those lines.
00:49:48 So we are making do with tools
00:49:52 that were designed for fights with sticks and rocks, right?
00:49:56 And it’s amazing we can do as much as we can.
00:50:00 I mean, you know, you and I are just sitting here
00:50:02 on the back of having received an mRNA vaccine,
00:50:05 you know, that has certainly changed our lives
00:50:08 given what the last year was like.
00:50:10 And it’s gonna change the world
00:50:12 if rumors of coming miracles are borne out.
00:50:16 I mean, it’s now, it seems likely we have a vaccine
00:50:20 coming for malaria, right?
00:50:22 Which has been killing millions of people a year
00:50:25 for as long as we’ve been alive.
00:50:28 I think it’s down to like 800,000 people a year now
00:50:31 because we’ve spread so many bed nets around,
00:50:33 but it was like two and a half million people every year.
00:50:39 It’s amazing what we can do, but yeah,
00:50:43 if in fact the answer at the back of the book of nature
00:50:46 is you understand 0.1% of what there is to understand
00:50:52 and half of what you think you understand is wrong,
00:50:54 that would not surprise me at all.
00:50:58 It is funny to look at our evolutionary history,
00:51:01 even back to chimps, I’m pretty sure even chimps
00:51:03 thought they understood the world well.
00:51:06 So at every point in that timeline
00:51:09 of evolutionary development throughout human history,
00:51:12 there’s a sense like there’s no more,
00:51:15 you hear this message over and over,
00:51:17 there’s no more things to be invented.
00:51:19 But a hundred years ago,
00:51:21 there’s a famous story, I forget which physicist told it,
00:51:24 but there were physicists telling
00:51:29 their undergraduate students not to go on
00:51:32 to get graduate degrees in physics
00:51:34 because basically all the problems had been solved.
00:51:36 And this is like around 1915 or so.
00:51:40 It turns out they were wrong.
00:51:41 I’m gonna ask you about free will.
00:51:42 Oh, okay.
00:51:44 You’ve recently released an episode of your podcast,
00:51:48 Making Sense, for those with a shorter attention span,
00:51:51 basically summarizing your position on free will.
00:51:54 I think it was under an hour and a half.
00:51:56 Yeah, yeah.
00:51:57 It was brief and clear.
00:52:01 So allow me to summarize the summary, TLDR,
00:52:05 and maybe you tell me where I’m wrong.
00:52:08 So free will is an illusion,
00:52:11 and even the experience of free will is an illusion.
00:52:13 Like we don’t even experience it.
00:52:15 Am I good in my summary?
00:52:20 Yeah, this is a line that’s a little hard
00:52:25 to scan for people.
00:52:27 I say that it’s not merely that free will is an illusion.
00:52:32 The illusion of free will is an illusion.
00:52:35 Like there is no illusion of free will.
00:52:37 And that is a, unlike many other illusions,
00:52:40 that’s a more fundamental claim.
00:52:47 It’s not that it’s wrong, it’s not even wrong.
00:52:49 I mean, I think that was Wolfgang Pauli
00:52:52 who derided one of his colleagues or enemies
00:52:56 with that aspersion about his theory in quantum mechanics.
00:53:06 So there are genuine illusions.
00:53:09 There are things that you do experience
00:53:12 and then you can kind of punch through that experience,
00:53:15 or that you can’t actually
00:53:17 experience any other way.
00:53:20 It’s just, we just know it’s not a veridical experience.
00:53:24 You just take like a visual illusion.
00:53:25 There are visual illusions that,
00:53:26 a lot of these come to me on Twitter these days.
00:53:28 There’s these amazing visual illusions
00:53:31 where like every figure in this GIF seems to be moving,
00:53:36 but nothing in fact is moving.
00:53:37 You can just like put a ruler on your screen
00:53:39 and nothing’s moving.
00:53:42 Some of those illusions you can’t see any other way.
00:53:44 I mean, they’re just, they’re hacking aspects
00:53:46 of the visual system that are just eminently hackable
00:53:49 and you have to use a ruler to convince yourself
00:53:54 that the thing isn’t actually moving.
00:53:56 Now there are other visual illusions
00:53:57 where you’re taken in by it at first,
00:54:01 but if you pay more attention,
00:54:02 you can actually see that it’s not there, right?
00:54:05 Or it’s not how it first seemed.
00:54:07 Like the Necker cube is a good example of that.
00:54:10 Like the Necker cube is just that schematic of a cube,
00:54:13 of a transparent cube, which pops out one way or the other.
00:54:15 Then one face can pop out and then the other face
00:54:17 can pop out.
00:54:18 But you can actually just see it as flat with no pop out,
00:54:22 which is a more veridical way of looking at it.
00:54:27 So there are kind of inward,
00:54:29 subjective correlates to this.
00:54:32 And I would say that the sense of self and free will
00:54:39 are closely related.
00:54:40 I mean, I often describe them as two sides of the same coin,
00:54:43 but they’re not quite the same in their spuriousness.
00:54:49 I mean, so the sense of self is something that people,
00:54:52 I think, do experience, right?
00:54:54 It’s not a very clear experience, but it’s not,
00:54:58 I wouldn’t call the illusion of self an illusion,
00:55:00 but the illusion of free will is an illusion
00:55:03 in that as you pay more attention to your experience,
00:55:07 you begin to see that it’s totally compatible
00:55:10 with an absence of free will.
00:55:11 You don’t, I mean coming back to the place we started,
00:55:15 you don’t know what you’re gonna think next.
00:55:18 You don’t know what you’re gonna intend next.
00:55:20 You don’t know what’s going to just occur to you
00:55:23 that you must do next.
00:55:24 You don’t know how much you are going to feel
00:55:28 the behavioral imperative to act on that thought.
00:55:31 If you suddenly feel, oh, I don’t need to do that.
00:55:35 I can do that tomorrow.
00:55:36 You don’t know where that comes from.
00:55:38 You didn’t know that was gonna arise.
00:55:39 You didn’t know that was gonna be compelling.
00:55:41 All of this is compatible with some evil genius
00:55:44 in the next room just typing in code into your experience.
00:55:47 It’s like, okay, let’s give him the
00:55:51 “oh my God, I just forgot it’s gonna be our anniversary
00:55:53 in one week” thought, right?
00:55:56 Give him the cascade of fear.
00:55:59 Give him this brilliant idea for the thing he can buy
00:56:01 that’s gonna take him no time at all
00:56:02 and this overpowering sense of relief.
00:56:05 All of our experience is compatible
00:56:07 with the script already being written, right?
00:56:11 And I’m not saying the script is written.
00:56:12 I’m not saying that fatalism is the right way
00:56:17 to look at this. But take even our most deliberate voluntary action,
00:56:23 where we go back and forth between two options,
00:56:27 thinking about the reasons for A
00:56:28 and then reconsidering,
00:56:31 thinking harder about B, and just going
00:56:34 eeny, meeny, miny, moe until the end of the hour.
00:56:37 However laborious you can make it,
00:56:40 there is an utter mystery at your back
00:56:44 finally promoting the thought or intention
00:56:48 or rationale that is most compelling
00:56:53 and therefore behaviorally effective.
00:57:07 And this can drive some people a little crazy.
00:57:09 So I usually preface what I say about free will
00:57:13 with the caveat that if thinking about your mind this way
00:57:17 makes you feel terrible, well then stop.
00:57:19 You get off the ride, switch the channel.
00:57:22 You don’t have to go down this path.
00:57:24 But for me and for many other people,
00:57:27 it’s incredibly freeing to recognize this about the mind
00:57:32 because, one, you realize that
00:57:38 cutting through the illusion of the self
00:57:39 is immensely freeing for a lot of reasons
00:57:41 that we can talk about separately,
00:57:44 but losing the sense of free will does
00:57:49 two things very vividly for me.
00:57:51 One is it totally undercuts the basis for,
00:57:54 the psychological basis for hatred.
00:57:56 Because when you think about the experience
00:57:59 of hating other people, what that is anchored to
00:58:03 is a feeling that they really are
00:58:06 the true authors of their actions.
00:58:08 I mean, if someone is doing something
00:58:10 that you find so despicable, right?
00:58:13 Let’s say they’re targeting you unfairly, right?
00:58:15 They’re maligning you on Twitter or they’re suing you
00:58:20 or they’re doing something, they broke your car window,
00:58:22 they did something awful
00:58:24 and now you have a grievance against them.
00:58:26 And you’re relating to them very differently emotionally
00:58:30 in your own mind than you would
00:58:32 if a force of nature had done this, right?
00:58:34 Or if it’s, if it had just been a virus
00:58:36 or if it had been a wild animal
00:58:39 or a malfunctioning machine, right?
00:58:40 Like to those things you don’t attribute
00:58:42 any kind of freedom of will.
00:58:44 And while you may suffer the consequences
00:58:47 of catching a virus or being attacked by a wild animal
00:58:49 or having your car break down or whatever,
00:58:53 it may frustrate you.
00:58:56 You don’t slip into this mode of hating the agent
00:59:01 in a way that completely commandeers your mind
00:59:06 and deranges your life.
00:59:07 I mean, you just don’t, I mean, there are people
00:59:09 who spend decades hating other people for what they did
00:59:15 and it’s just pure poison, right?
00:59:18 So it’s a useful shortcut to compassion and empathy.
00:59:20 Yeah, yeah.
00:59:21 But the question is, take this,
00:59:24 what was it, the horse of consciousness?
00:59:26 Let’s call it the consciousness-generator black box
00:59:30 that we don’t understand.
00:59:32 And is it possible that the script
00:59:35 that we’re walking along, that we’re playing,
00:59:40 that’s already written is actually being written
00:59:43 in real time?
00:59:45 It’s almost like you’re driving down a road
00:59:47 and in real time, that road is being laid down.
00:59:50 And this black box of consciousness that we don’t understand
00:59:53 is the place where the script is being generated.
00:59:57 So it’s not pre-written, it is being generated, it didn’t always exist.
01:00:01 So there’s something we don’t understand
01:00:02 that’s fundamental about the nature of reality
01:00:05 that generates both consciousness,
01:00:07 let’s call it maybe the self.
01:00:09 I don’t know if you want to distinguish between those.
01:00:11 Yeah, I definitely would, yeah.
01:00:13 You would, because there’s a bunch of illusions
01:00:15 we’re referring to.
01:00:16 There’s the illusion of free will,
01:00:18 there’s the illusion of self,
01:00:20 and there’s the illusion of consciousness.
01:00:22 You’re saying, I think,
01:00:25 you’re not as willing to say
01:00:27 there’s an illusion of consciousness.
01:00:28 You’re a little bit more.
01:00:29 In fact, I would say it’s impossible.
01:00:30 Impossible.
01:00:31 You’re a little bit more willing to say
01:00:33 that there’s an illusion of self,
01:00:35 and you’re definitely saying
01:00:36 there’s an illusion of free will.
01:00:38 Yes, I’m definitely saying
01:00:42 that a certain kind of self is an illusion.
01:00:44 Not every kind, we mean many different things
01:00:46 by this notion of self.
01:00:47 So maybe I should just differentiate these things.
01:00:50 So consciousness can’t be an illusion
01:00:53 because any illusion proves its reality
01:00:58 as much as any other veridical perception.
01:01:00 I mean, if you’re hallucinating now,
01:01:02 that’s just as much of a demonstration of consciousness
01:01:05 as really seeing what’s, quote, actually there.
01:01:09 If you’re dreaming and you don’t know it,
01:01:12 that is consciousness, right?
01:01:15 You can be confused about literally everything.
01:01:17 You can’t be confused about the underlying claim,
01:01:25 whether you make it linguistically or not,
01:01:27 but just the cognitive assertion
01:01:34 that something seems to be happening.
01:01:36 It’s the seeming that is the cash value of consciousness.
01:01:40 Can I take a tiny tangent?
01:01:42 So what if I am creating consciousness in my mind
01:01:47 to convince you that I’m human?
01:01:50 So it’s a useful social tool,
01:01:52 not a fundamental property of experience,
01:01:57 like of being a living thing.
01:02:00 What if it’s just a social tool,
01:02:02 almost like a useful computational trick,
01:02:07 to place myself into reality
01:02:10 as we together communicate about this reality?
01:02:14 And another way to ask that,
01:02:15 because you said it much earlier,
01:02:18 you talk negatively about robots as you often do.
01:02:21 So let me, because you’ll probably die first
01:02:24 when they take over.
01:02:26 No, I’m looking forward to certain kinds of robots.
01:02:28 I mean, I’m not, if we can get this right,
01:02:31 this would be amazing.
01:02:32 But you don’t like the robots that fake consciousness.
01:02:35 You don’t like the idea of fake it till you make it.
01:02:37 Well, no, it’s not that I don’t like it.
01:02:40 It’s that I’m worried that we will lose sight
01:02:43 of the problem.
01:02:44 And the problem has massive ethical consequences.
01:02:47 I mean, if we create robots that really can suffer,
01:02:51 that would be a bad thing, right?
01:02:53 And if we really are committing a murder
01:02:56 when we recycle them, that would be a bad thing.
01:02:59 This is how I know you’re not Russian.
01:03:00 Why is it a bad thing that we create robots that can suffer?
01:03:03 Isn’t suffering a fundamental thing
01:03:05 from which like beauty springs?
01:03:07 Like without suffering,
01:03:08 do you really think we would have beautiful things
01:03:10 in this world?
01:03:11 Okay, that’s a tangent on a tangent.
01:03:14 We’ll go there.
01:03:15 I would love to go there, but let’s not go there just yet.
01:03:17 All right.
01:03:17 But I do think, if anything is bad,
01:03:20 creating hell and populating it
01:03:22 with real minds that really can suffer in that hell,
01:03:27 that’s bad.
01:03:28 You are worse than any mass murderer we can name
01:03:34 if you create it.
01:03:35 I mean, this could be in robot form,
01:03:37 or more likely it would be in some simulation of a world
01:03:41 where we managed to populate it with conscious minds
01:03:43 whether we knew they were conscious or not.
01:03:46 And that world is a state that’s unendurable.
01:03:50 That just follows from taking seriously the thesis
01:03:53 that mind, intelligence,
01:03:58 and consciousness are ultimately substrate independent.
01:04:00 Right?
01:04:01 It doesn’t, you don’t need a biological brain
01:04:02 to be conscious.
01:04:03 You certainly don’t need a biological brain
01:04:04 to be intelligent.
01:04:05 Right?
01:04:06 So if we just imagine that consciousness at some point
01:04:09 comes along for the ride as you scale up in intelligence,
01:04:12 well then we could find ourselves creating conscious minds
01:04:16 that are miserable, right?
01:04:17 And that’s just like creating a person who’s miserable.
01:04:19 Right?
01:04:20 It could be worse than creating a person who’s miserable.
01:04:21 It could be even more sensitive to suffering.
01:04:23 Or cloning them, maybe for entertainment,
01:04:26 and watching them suffer.
01:04:27 Just like watching a person suffer for entertainment.
01:04:31 You know?
01:04:32 So, but back to your primary question here,
01:04:36 which is differentiating consciousness and self
01:04:40 and free will as concepts
01:04:42 and kind of degrees of illusoriness.
01:04:46 The problem with free will is that
01:04:50 what most people mean by it,
01:04:53 and this is where Dan Dennett
01:04:56 is gonna get off the ride here, right?
01:04:57 So he’s gonna disagree with me
01:04:59 that I know what most people mean by it.
01:05:02 But I have a very keen sense having talked about this topic
01:05:07 for many, many years
01:05:09 and seeing people get wrapped around the axle of it
01:05:13 and seeing in myself what it’s like to have felt
01:05:17 that I was a self that had free will
01:05:20 and then to no longer feel that way, right?
01:05:22 To know what it’s like to actually disabuse myself
01:05:24 of that sense cognitively and emotionally
01:05:30 and to recognize what’s left, what goes away
01:05:32 and what doesn’t go away on the basis of that epiphany.
01:05:35 I have a sense that I know what people think they have
01:05:40 in hand when they worry about whether free will exists.
01:05:44 And it is the flip side of this feeling of self.
01:05:50 It’s the flip side of feeling
01:05:51 like you are not merely identical to experience.
01:05:57 You feel like you’re having an experience.
01:05:59 You feel like you’re an agent
01:06:00 that is appropriating an experience.
01:06:02 There’s a protagonist in the movie of your life
01:06:05 and it is you.
01:06:07 It’s not just the movie, right?
01:06:09 It’s like there’s sights and sounds and sensations
01:06:13 and thoughts and emotions
01:06:14 and this whole cacophony of experience,
01:06:17 of felt experience, of felt embodiment.
01:06:21 But there seems to be a rider on the horse
01:06:26 or a passenger in the body, right?
01:06:28 People don’t feel truly identical to their bodies
01:06:30 down to their toes.
01:06:32 They sort of feel like they have bodies.
01:06:34 They feel like they’re minds in bodies
01:06:37 and that feels like a self, that feels like me.
01:06:42 And again, this gets very paradoxical
01:06:45 when you talk about the experience
01:06:47 of being in relationship to yourself
01:06:50 or talking to yourself, giving yourself a pep talk.
01:06:52 I mean, if you’re the one talking,
01:06:54 why are you also the one listening?
01:06:55 Like, why do you need the pep talk and why does it work
01:06:57 if you’re the one giving the pep talk, right?
01:07:00 Or if I say like, where are my keys?
01:07:02 Or if I’m looking for my keys,
01:07:03 why do I think the superfluous thought, where are my keys?
01:07:06 I know I’m looking for the fucking keys.
01:07:08 I’m the one looking, who am I telling
01:07:11 that we now need to look for the keys, right?
01:07:13 So that duality is weird, but leave that aside.
01:07:17 There’s the sense, and this becomes very vivid
01:07:22 when people try to learn to meditate.
01:07:25 Most people, they close their eyes
01:07:28 and they’re told to pay attention to an object
01:07:30 like the breath, say.
01:07:31 So you close your eyes and you pay attention to the breath
01:07:35 and you can feel it at the tip of your nose
01:07:37 or the rising and falling of your abdomen
01:07:40 and you’re paying attention
01:07:42 and you feel something vague there.
01:07:44 And then you think, I thought, well, why the breath?
01:07:46 Why am I paying attention to the breath?
01:07:49 What’s so special about the breath?
01:07:51 And then you notice you’re thinking
01:07:54 and you’re not paying attention to the breath anymore.
01:07:55 And then you realize, okay, the practice is,
01:07:58 okay, I should notice thoughts
01:07:59 and then I should come back to the breath.
01:08:01 But there’s this conventional starting point
01:08:06 of feeling like you are an agent, very likely in your head,
01:08:10 a locus of consciousness, a locus of attention
01:08:12 that can strategically pay attention
01:08:15 to certain parts of experience.
01:08:17 Like I can focus on the breath
01:08:18 and then I get lost in thought
01:08:20 and now I can come back to the breath
01:08:22 and I can open my eyes and I’m over here behind my face
01:08:26 looking out at a world that’s other than me
01:08:28 and there’s this kind of subject object perception.
01:08:31 And that is the default starting point of selfhood,
01:08:35 of subjectivity.
01:08:36 And married to that is the sense that
01:08:42 I can decide what to do next, right?
01:08:46 I am an agent who can pay attention to the cup.
01:08:50 I can listen to sounds.
01:08:52 There’s certain things that I can’t control.
01:08:53 Certain things are happening to me
01:08:54 and I just can’t control them.
01:08:55 So for instance, if someone asks,
01:08:59 well, can you not hear a sound, right?
01:09:02 Like don’t hear the next sound,
01:09:03 don’t hear anything for a second,
01:09:05 or don’t hear, I’m snapping my fingers, don’t hear this.
01:09:09 Where’s your free will?
01:09:10 You know, well, like just stop this from coming in.
01:09:12 You realize, okay, wait a minute.
01:09:14 My abundant freedom does not extend
01:09:18 to something as simple as just being able to pay attention
01:09:20 to something other than this.
01:09:23 Okay, well, so I’m not that kind of free agent,
01:09:25 but at least I can decide what I’m gonna do next
01:09:28 and I’m gonna pick up this water, right?
01:09:32 And there’s a feeling of identification
01:09:36 with the impulse, with the intention,
01:09:39 with the thought that occurs to you,
01:09:41 with the feeling of speaking.
01:09:43 Like what am I gonna say next?
01:09:45 Well, I’m saying it.
01:09:46 So here goes, this is me.
01:09:48 It feels like I’m the thinker.
01:09:50 I’m the one who’s in control.
01:09:53 But all of that is born of not really paying close attention
01:10:00 to what it’s like to be you.
01:10:01 And so this is where meditation comes in,
01:10:05 or this is where, again, you can get at this conceptually.
01:10:09 You can unravel the notion of free will
01:10:11 just by thinking certain thoughts,
01:10:15 but you can’t feel that it doesn’t exist
01:10:18 unless you can pay close attention
01:10:20 to how thoughts and intentions arise.
01:10:22 So the way to unravel it conceptually
01:10:24 is just to realize, okay, I didn’t make myself.
01:10:27 I didn’t make my genes.
01:10:28 I didn’t make my brain.
01:10:29 I didn’t make the environmental influences
01:10:31 that impinged upon this system for the last 54 years
01:10:35 that have produced my brain in precisely the state
01:10:38 it’s in right now, with all of the receptor weightings
01:10:42 and densities, and it’s just,
01:10:45 I’m exactly the machine I am right now
01:10:48 through no fault of my own as the experiencing self.
01:10:54 I get no credit and I get no blame
01:10:56 for the genetics and the environmental influences here.
01:11:00 And yet those are the only things
01:11:03 that contrive to produce my next thought
01:11:09 or impulse or moment of behavior.
01:11:12 And if you were going to add something magical
01:11:14 to that clockwork, like an immortal soul,
01:11:18 you can also notice that you didn’t produce your soul.
01:11:21 You can’t account for the fact
01:11:22 that you don’t have the soul of someone
01:11:25 who doesn’t like any of the things you like
01:11:28 or wasn’t interested in any of the things
01:11:29 you were interested in or was a psychopath
01:11:33 or had an IQ of 40.
01:11:35 I mean, there’s nothing about that
01:11:38 that the person who believes in a soul
01:11:41 can claim to have controlled.
01:11:43 And yet that is also totally dispositive
01:11:45 of whatever happens next.
01:11:48 But everything you’ve described now,
01:11:51 maybe you can correct me,
01:11:52 but it kind of speaks to the materialistic nature
01:11:54 of the hardware.
01:11:57 But even if you add magical ectoplasm software,
01:12:01 you didn’t produce that either.
01:12:03 I know, but if we can think about the actual computation
01:12:08 running on the hardware and running on the software,
01:12:11 there’s something you said recently
01:12:12 which is that you think of culture as an operating system.
01:12:17 So if we just remove ourselves a little bit
01:12:21 from the conception of human civilization
01:12:24 being a collection of humans
01:12:26 and rather us just being a distributed
01:12:30 computation system on which there’s
01:12:32 some kind of operating system running,
01:12:34 and then the computation that’s running
01:12:36 is the actual thing that generates
01:12:38 the interactions, the communications,
01:12:40 and maybe even free will, the experiences
01:12:42 of all those free wills.
01:12:44 Do you ever think of, do you ever try
01:12:46 to reframe the world in that way
01:12:47 where it’s like ideas are just using us,
01:12:51 thoughts are using individual nodes in the system,
01:12:56 and they’re just jumping around,
01:12:58 and they also have ability to generate experiences
01:13:01 so that we can push those ideas along.
01:13:03 And basically the main organisms here
01:13:05 are the thoughts, not the humans.
01:13:07 Yeah, but then that erodes the boundary
01:13:11 between self and world.
01:13:15 Right.
01:13:16 So then there’s no self, no really integrated self,
01:13:19 to have any kind of will at all.
01:13:22 Like if you’re just a memeplex,
01:13:24 I mean, if you’re just a collection of memes,
01:13:28 and I mean, we’re all kind of like currents,
01:13:32 like eddies in this river of ideas, right?
01:13:35 So it’s like, and it seems to have structure,
01:13:40 but there’s no real boundary between that part
01:13:43 of the flow of water and the rest.
01:13:44 I mean, and I would say that much
01:13:47 of our mind answers to this kind of description.
01:13:49 I mean, so much of our mind
01:13:53 is obviously not self-generated,
01:13:55 and it’s not, you’re not gonna find it
01:13:56 by looking in the brain.
01:13:58 It is the result of culture largely,
01:14:03 but also, you know, the genes on one side
01:14:10 and culture on the other meeting
01:14:13 to allow for manifestations of mind
01:14:20 that aren’t actually bounded
01:14:22 by the person in any clear sense.
01:14:26 I mean, the example I often use here,
01:14:31 though there are so many others, is just the fact
01:14:33 that we’re following the rules of English grammar
01:14:36 to whatever degree we are.
01:14:37 We certainly haven’t consciously represented
01:14:40 these rules for ourselves.
01:14:42 We haven’t invented these rules.
01:14:44 We haven’t, I mean, there are norms of language use
01:14:48 that we couldn’t even specify because we haven’t,
01:14:53 you know, we’re not grammarians.
01:14:54 We’re not, we haven’t studied this.
01:14:56 We don’t even have the right concepts,
01:14:58 and yet we’re following these rules,
01:14:59 and we notice it, you know, as
01:15:03 an error when we fail to follow these rules,
01:15:08 and virtually every other cultural norm is like that.
01:15:11 I mean, these are not things we’ve invented.
01:15:13 You can consciously decide to scrutinize them
01:15:17 and override them, but, I mean, just think of,
01:15:21 just think of any social situation
01:15:23 where you’re with other people and you’re behaving
01:15:27 in ways that are culturally appropriate, right?
01:15:31 You’re not being, you know,
01:15:32 you’re not being wild animals together.
01:15:34 You’re following, you have some expectation
01:15:36 of how you shake a person’s hand
01:15:38 and how you deal with implements on a table,
01:15:43 how you have a meal together.
01:15:44 Obviously, this can change from culture to culture,
01:15:47 and people can be shocked
01:15:49 by how different those things are, right?
01:15:51 We, you know, we all have foods we find disgusting,
01:15:53 but in some countries, dog is not one of those foods, right?
01:15:57 And yet, you know, you and I presumably
01:15:59 would be horrified to be served dog.
01:16:03 Those are not norms that we authored;
01:16:06 they are outside of us in some way,
01:16:09 and yet they’re felt very viscerally.
01:16:13 I mean, they’re certainly felt in their violation.
01:16:15 You know, if you are, just imagine,
01:16:18 you’re in somebody’s home,
01:16:21 you’re eating something that tastes great to you,
01:16:23 and you happen to be in Vietnam or wherever,
01:16:25 you know, you didn’t realize dog was potentially
01:16:28 on the menu, and you find out that you’ve just eaten
01:16:32 10 bites of what is, you know, really a cocker spaniel,
01:16:37 and you feel this instantaneous urge to vomit, right,
01:16:42 based on an idea, right?
01:16:44 Like, so, like, you did not,
01:16:47 you’re not the author of that norm
01:16:51 that gave you such a powerful experience of its violation,
01:16:55 and I’m sure we can trace the moment in your history,
01:16:59 you know, vaguely, where it sort of got in.
01:17:01 I mean, very early on as kids,
01:17:02 you realize you’re treating dogs as pets
01:17:05 and not as food, or as potential food.
01:17:10 But yeah, the point you just made
01:17:15 opens us to, like, we are totally permeable
01:17:18 to a sea of mind.
01:17:22 Yeah, but if we take the metaphor
01:17:24 of the distributed computing systems,
01:17:26 each individual node
01:17:29 is part of performing a much larger computation,
01:17:32 but it nevertheless is in charge of the scheduling.
01:17:36 So, assuming it’s Linux,
01:17:39 it’s doing the scheduling of processes
01:17:41 and is constantly alternating them.
01:17:42 That node is making those choices.
01:17:46 That node sure as hell believes it has free will,
01:17:49 and it actually has free will
01:17:51 because it’s making those hard choices,
01:17:53 but the choices ultimately are part
01:17:54 of a much larger computation that it can’t control.
01:17:57 Isn’t it possible that that node,
01:17:59 that human node, is still making the choice?
01:18:04 Well, yeah, it is.
01:18:05 So I’m not saying that your body
01:18:08 isn’t doing, really doing things, right?
01:18:11 And some of those things can be
01:18:14 conventionally thought of as choices, right?
01:18:16 So it’s like, I can choose to reach,
01:18:19 and it’s like, it’s not being imposed on me.
01:18:21 That would be a different experience.
01:18:22 Like, so,
01:18:26 you know, there’s definitely a difference
01:18:27 between voluntary and involuntary action.
01:18:30 So that has to get conserved.
01:18:34 By any account of the mind that jettisons free will,
01:18:36 you still have to admit that there’s a difference
01:18:39 between a tremor that I can’t control
01:18:42 and a purposeful motor action that I can control
01:18:47 and I can initiate on demand,
01:18:49 and it’s associated with intentions.
01:18:50 And it’s got, you know, efference copy,
01:18:55 which is predictive, so that I can notice errors.
01:18:59 You know, I have expectations.
01:19:00 When I reach for this,
01:19:02 if my hand were actually to pass through the bottle,
01:19:04 because it’s a hologram, I would be surprised, right?
01:19:07 And so that shows that I have an expectation
01:19:09 of just what my grasping behavior is gonna be like
01:19:12 even before it happens.
01:19:13 Whereas with a tremor,
01:19:14 you don’t have the same kind of thing going on.
01:19:17 That’s a distinction we have to make.
01:19:19 So yes, my intention to move,
01:19:26 which in fact can be subjectively felt,
01:19:28 really is the proximate cause of my moving.
01:19:31 It’s not coming from elsewhere in the universe.
01:19:33 I’m not saying that.
01:19:35 So in that sense, the node is really deciding
01:19:37 to execute, you know, the subroutine now.
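(To make that concession concrete, here is a minimal sketch in Python, my own toy analogy rather than anything Sam spells out: a round-robin scheduler in which the node really is the thing that picks the next task, yet every pick is fully determined by the node's prior state. The task names are invented for illustration.)

    from collections import deque

    def run_node(tasks, steps):
        """Deterministically pick and run the next task, round-robin style."""
        queue = deque(tasks)
        for tick in range(steps):
            task = queue.popleft()        # the node's "decision": this task, now
            print(f"tick {tick}: node chose {task}")
            queue.append(task)            # requeue; the next "choice" is already fixed

    run_node(["email", "podcast_prep", "water_break"], steps=6)

(Given the same initial queue, the same "choices" come out in the same order on every run; the node decides, but the deciding is clockwork.)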
01:19:42 But that’s not the feeling
01:19:47 that has given rise to this conundrum of free will, right?
01:19:54 So the crucial thing is that people feel
01:20:01 like they could have done otherwise, right?
01:20:04 That’s the thing that,
01:20:05 so when you run back the clock of your life, right?
01:20:09 You run back the movie of your life,
01:20:11 you flip back a few pages in the novel of your life,
01:20:14 they feel that at this point,
01:20:18 they could behave differently than they did, right?
01:20:20 So like, but given, you know,
01:20:23 even given your distributed computing example,
01:20:27 it’s either a fully deterministic system
01:20:30 or it’s a deterministic system
01:20:32 that admits of some random, you know, influence.
01:20:36 In either case,
01:20:39 that’s not the free will people think they have.
01:20:41 The free will people think they have is, damn,
01:20:45 I shouldn’t have done that.
01:20:46 I just like, I shouldn’t have done that.
01:20:49 I could have done otherwise, right?
01:20:51 I should have done otherwise, right?
01:20:52 Like if you think about something
01:20:55 that you deeply regret doing, right?
01:20:57 Or that you hold someone else responsible for
01:21:00 because they really are the upstream agent
01:21:03 in your mind of what they did.
01:21:05 You know, that’s an awful thing that that person did
01:21:08 and they shouldn’t have done it.
01:21:09 So there is this illusion and it has to be an illusion
01:21:12 because there’s no picture of causation
01:21:17 that would make sense of it.
01:21:18 There’s this illusion that if you arrange the universe
01:21:21 exactly the way it was a moment ago,
01:21:24 it could have played out differently.
01:21:27 And the only way it could have played out differently
01:21:31 is if there’s randomness added to that,
01:21:34 but randomness isn’t what people feel
01:21:37 would give them free will, right?
01:21:39 If you tell me that, you know,
01:21:41 I only reached for the water bottle this time
01:21:43 because there’s a random number generator in there
01:21:47 kicking off values and it finally moved my hand,
01:21:51 that’s not the feeling of authorship.
01:21:54 That’s still not control.
01:21:55 You’re still not making that decision.
01:21:58 There’s actually, I don’t know if you’re familiar
01:22:00 with cellular automata.
01:22:01 It’s a really nice visualization
01:22:03 of how simple rules can create incredible complexity.
01:22:07 You set really dumb initial conditions,
01:22:11 apply simple rules, and eventually you watch this thing,
01:22:14 and if the initial conditions are right,
01:22:18 something emerges
01:22:21 that to our perceptual system
01:22:23 looks like organisms interacting.
01:22:25 You can construct any kinds of worlds
01:22:27 and they’re not actually interacting.
01:22:29 They’re not actually even organisms.
01:22:31 And they certainly aren’t making decisions.
01:22:34 So there’s like systems you can create
01:22:37 that illustrate this point.
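(For the curious, a minimal sketch of the kind of system Lex is gesturing at: an elementary cellular automaton in Python. Rule 110 is my choice, not something named in the conversation; it's a standard example of a trivial local rule producing intricate, organism-seeming patterns from a dumb initial condition.)

    def step(cells, rule=110):
        """Advance one row of 0/1 cells by an elementary CA rule (edges wrap)."""
        n = len(cells)
        out = []
        for i in range(n):
            left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
            idx = (left << 2) | (center << 1) | right  # 3-bit neighborhood, 0..7
            out.append((rule >> idx) & 1)              # that bit of the rule is the new state
        return out

    row = [0] * 40 + [1] + [0] * 40  # dumb initial condition: a single live cell
    for _ in range(30):
        print("".join("#" if c else "." for c in row))
        row = step(row)

(Nothing in there decides anything; the shapes that seem to move and interact are just the same lookup applied over and over.)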
01:22:38 The question is whether there could be some room
01:22:42 for, let’s use the 21st-century term, magic,
01:22:47 back to the black box of consciousness.
01:22:50 Let me ask it this way.
01:22:51 If you’re wrong about your intuition about free will,
01:22:56 what, and somebody comes along to you
01:22:58 and proves to you that you didn’t have the full picture,
01:23:03 what would that proof look like?
01:23:04 What would?
01:23:05 So that’s the problem, that’s why it’s not even an illusion
01:23:08 in my world because for me, it’s impossible to say
01:23:14 what the universe would have to be like
01:23:16 for free will to be a thing, right?
01:23:19 It doesn’t conceptually map onto any notion
01:23:22 of causation we have.
01:23:24 And that’s unlike any other spurious claim you might make.
01:23:29 So like if you’re gonna believe in ghosts, right?
01:23:33 I understand what that claim could be,
01:23:37 where like I don’t happen to believe in ghosts,
01:23:40 but it’s not hard for me to specify
01:23:44 what would have to be true for ghosts to be real.
01:23:47 And so it is with a thousand other things like ghosts,
01:23:50 right, so like, okay, so you’re telling me
01:23:52 that when people die, there’s some part of them
01:23:54 that is not reducible at all to their biology
01:23:57 that lifts off them and goes elsewhere
01:24:00 and is actually the kind of thing
01:24:02 that can linger in closets and in cupboards,
01:24:04 and actually it’s immaterial,
01:24:07 but by some principle of physics,
01:24:09 we don’t totally understand it can make sounds
01:24:11 and knock objects and even occasionally show up
01:24:15 so they can be visually beheld.
01:24:18 And it’s just, it seems like a miracle,
01:24:21 but it’s just some spooky noun in the universe
01:24:25 that we don’t understand, let’s call it a ghost.
01:24:29 That’s fine, I can talk about that all day.
01:24:31 The reasons to believe in it,
01:24:32 the reasons not to believe in it,
01:24:34 the way we would scientifically test for it,
01:24:36 what would have to be provable
01:24:38 so as to convince me that ghosts are real.
01:24:42 Free will isn’t like that at all.
01:24:44 There’s no description of any concatenation of causes
01:24:49 that precedes my conscious experience
01:24:53 that sounds like what people think they have
01:24:55 when they think they could have done otherwise
01:24:56 and that they really, that they, the conscious agent,
01:25:00 is really in charge, right?
01:25:01 Like if you don’t know what you’re going to think next,
01:25:05 right, and you can’t help but think it,
01:25:09 take those two premises on board.
01:25:12 You don’t know what it’s gonna be,
01:25:14 you can’t stop it from coming,
01:25:18 and until you actually know how to meditate,
01:25:21 you can’t stop yourself from
01:25:27 fully living out its behavioral or emotional consequences.
01:25:31 Right.
01:25:33 Mindfulness, you know,
01:25:35 arguably gives you another degree of freedom here.
01:25:38 It doesn’t give you free will,
01:25:39 but it gives you some other game to play
01:25:41 with respect to the emotional
01:25:43 and behavioral imperatives of thoughts.
01:25:46 But short of that, I mean,
01:25:50 the reason why mindfulness doesn’t give you free will
01:25:52 is because you can’t, you know,
01:25:53 you can’t account for why in one moment
01:25:55 mindfulness arises and in other moments it doesn’t, right?
01:26:00 But a different process is initiated
01:26:03 once you can practice in that way.
01:26:06 Well, if I could push back for a second.
01:26:08 By the way, I just have this thought bubble
01:26:11 popping up all the time of just two recent chimps
01:26:14 arguing about the nature of consciousness.
01:26:16 It’s kind of hilarious.
01:26:17 So on that thread, you know,
01:26:22 let’s say, before Einstein,
01:26:24 we were to conceive of traveling
01:26:27 from point A to point B, say some point in the future,
01:26:32 we are able to realize through engineering
01:26:34 a way which is consistent with Einstein’s theory
01:26:39 that you can have wormholes.
01:26:40 You can travel from one point to another
01:26:42 faster than the speed of light.
01:26:46 And that would, I think, completely change our conception
01:26:49 of what it means to travel in physical space.
01:26:52 And that would completely transform our abilities.
01:26:57 You talk about causality, but here let’s just focus
01:26:59 on what it means to travel through physical space.
01:27:03 Don’t you think it’s possible that there will be inventions
01:27:08 or leaps in understanding about reality
01:27:11 that will allow us to see that free will is actually real,
01:27:15 that us humans, somehow maybe linked
01:27:19 to this idea of consciousness,
01:27:21 are actually able to be the authors of our actions?
01:27:25 It is a nonstarter for me conceptually.
01:27:29 It’s a little bit like saying,
01:27:33 could there be some breakthrough that will cause us
01:27:35 to realize that circles are really square
01:27:40 or that circles are not really round, right?
01:27:43 No, a circle is what we mean by a perfectly round form.
01:27:47 It’s not on the table to be revised.
01:27:52 And so I would say the same thing about consciousness.
01:27:55 It’s just like saying, is there some breakthrough
01:27:58 that would get us to realize that consciousness
01:28:00 is really an illusion?
01:28:02 I’m saying no, because the experience of an illusion
01:28:06 is as much a demonstration of what I’m calling consciousness
01:28:09 as anything else, right?
01:28:10 That is consciousness.
01:28:12 With free will, it’s a similar problem.
01:28:15 It’s like, again, it comes down to a picture of causality
01:28:22 and there’s no other picture on offer.
01:28:27 And what’s more, I know what it’s like
01:28:31 on the experiential side to lose the thing
01:28:36 to which it is clearly anchored, right?
01:28:39 Like the feel, like it doesn’t feel,
01:28:41 and this is the question that almost nobody asked.
01:28:43 People who are debating me on the topic of free will,
01:28:47 At 15-minute intervals, I’m making a claim
01:28:51 that I don’t feel this thing,
01:28:53 and they never become interested in,
01:28:58 well, what’s that like?
01:28:59 Like, okay, so you’re actually saying you don’t,
01:29:02 this thing isn’t true for you empirically.
01:29:05 It’s not just, because most people
01:29:07 who don’t believe in free will philosophically
01:29:11 also believe that we’re condemned to experience it.
01:29:15 Like, you just, you can’t live without this feeling, so.
01:29:19 So you’re actually saying you’re able
01:29:21 to experience the absence of the illusion of free will?
01:29:27 Yes, yes.
01:29:28 Are we talking about a few minutes at a time,
01:29:32 or does it require a lot of work, of meditation,
01:29:38 or are you literally able to load that into your mind
01:29:41 and like play that moment?
01:29:42 Right now, right now, just in this conversation.
01:29:44 So it’s not absolutely continuous,
01:29:49 but it’s whenever I pay attention.
01:29:51 It’s like, and I would say the same thing
01:29:53 for the illusoriness of the self in the sense,
01:29:56 and again, we haven’t talked about this, so.
01:29:58 Can you still have the self and not have the free will
01:30:01 in mind at the same time?
01:30:02 Do they go away at the same time?
01:30:03 This is the same, yeah, it’s the same thing.
01:30:06 They’re always holding hands when they walk out the door.
01:30:08 They really are two sides of the same coin.
01:30:10 But it’s just, it comes down to what it’s like
01:30:14 to try to get to the end of this sentence,
01:30:16 or what it’s like to finally decide
01:30:18 that it’s been long enough
01:30:20 and now I need another sip of water, right?
01:30:22 If I’m paying attention, now, if I’m not paying attention,
01:30:25 I’m probably, I’m captured by some other thought
01:30:28 and that feels a certain way, right?
01:30:30 And so that’s not, it’s not vivid,
01:30:32 but if I try to make vivid this experience of just,
01:30:35 okay, I’m finally gonna experience free will.
01:30:38 I’m gonna notice my free will, right?
01:30:40 Like it’s gotta be here, everyone’s talking about it.
01:30:43 Where is it?
01:30:44 I’m gonna pay attention to, I’m gonna look for it.
01:30:45 And I’m gonna create a circumstance
01:30:48 that is where it has to be most robust, right?
01:30:52 I’m not rushed to make this decision.
01:30:54 I’m not, it’s not a reflex.
01:30:57 I’m not under pressure.
01:30:58 I’m gonna take as long as I want.
01:30:59 I’m going to decide, it’s not trivial.
01:31:02 Like, so it’s not just like reaching with my left hand
01:31:04 or reaching with my right hand.
01:31:05 People don’t like those examples for some reason.
01:31:07 Let’s make a big decision.
01:31:09 Like, where should, what should my next podcast be on, right?
01:31:16 Who do I invite on the next podcast?
01:31:18 What is it like to make that decision?
01:31:20 When I pay attention,
01:31:22 there is no evidence of free will anywhere in sight.
01:31:26 It’s like, it doesn’t feel like,
01:31:28 it feels profoundly mysterious
01:31:31 to be going back between two people.
01:31:33 Like, is it gonna be person A or person B?
01:31:37 Got all my reasons for A and all my reasons why not
01:31:40 and all my reasons for B.
01:31:41 And there’s some math going on there
01:31:43 that I’m not even privy to
01:31:46 where certain concerns are trumping others.
01:31:49 And at a certain point, I just decide.
01:31:52 And yes, you can say I’m the node in the network
01:31:56 that has made that decision, absolutely.
01:31:57 I’m not saying it’s being piped to me from elsewhere,
01:32:00 but the feeling of what it’s like to make that decision
01:32:04 is totally without a sense,
01:32:11 a real sense of agency
01:32:15 because something simply emerges.
01:32:18 It’s literally as tenuous as
01:32:22 what’s the next sound I’m going to hear, right?
01:32:26 Or what’s the next thought that’s gonna appear?
01:32:29 And it just, something just appears, you know?
01:32:32 And if something appears to cancel that something,
01:32:34 like if I say, I’m gonna invite her
01:32:37 and then I’m about to send the email
01:32:39 and then I think, oh, no, no, no, I can’t do that.
01:32:42 There was a thing in that New Yorker article I read,
01:32:45 that I gotta talk to this guy, right?
01:32:47 That pivot at the last second,
01:32:49 you can make it as muscular as you want.
01:32:53 It always just comes out of the darkness.
01:32:56 It’s always mysterious.
01:32:57 So right, when you try to pin it down,
01:32:59 you really can’t ever find that free will.
01:33:02 If you construct an experiment for yourself
01:33:06 and you’re trying to really find that moment
01:33:07 when you’re actually making that controlled author decision,
01:33:11 it’s very difficult to do.
01:33:12 And we know at this point
01:33:15 that if we were scanning your brain
01:33:18 in some podcast-guest-choosing experiment, right?
01:33:24 We know at this point we would be privy
01:33:27 to who you’re going to pick before you are,
01:33:29 you the conscious agent.
01:33:30 If we could, again, this is operationally
01:33:33 a little hard to conduct,
01:33:34 but there’s enough data now to know
01:33:36 that something very much like this cartoon is in fact true
01:33:42 and will ultimately be undeniable for people.
01:33:46 They’ll be able to do it on themselves with some app.
01:33:51 If you’re deciding what to, you know,
01:33:54 where to go for dinner or who to have on your podcast
01:33:56 or ultimately, you know, who to marry, right?
01:33:58 Or what city to move to, right?
01:34:00 Like you can make it as big
01:34:02 or as small a decision as you want.
01:34:05 We could be scanning your brain in real time
01:34:08 and at a point where you still think you’re uncommitted,
01:34:12 we would be able to say with arbitrary accuracy,
01:34:17 all right, Lex is, he’s moving to Austin, right?
01:34:20 I didn’t choose that.
01:34:21 Yeah, he was choosing, it was gonna be Austin
01:34:23 or it was gonna be Miami.
01:34:24 He’s catching one of these two waves,
01:34:27 but it’s gonna be Austin.
01:34:29 And at a point where you subjectively,
01:34:31 if we could ask you, you would say,
01:34:34 oh no, I’m still working over here.
01:34:36 I’m still thinking, I’m still considering my options.
01:34:40 And you’ve spoken to this,
01:34:43 in your thinking about other stuff in the world,
01:34:45 it’s been very useful to step away
01:34:49 from this illusion of free will.
01:34:51 And you argue that it probably makes a better world
01:34:54 because you can be compassionate
01:34:55 and empathetic towards others.
01:34:56 And towards oneself.
01:34:58 Towards oneself.
01:34:59 I mean, radically toward others
01:35:01 in that literally hate makes no sense anymore.
01:35:05 I mean, there are certain things
01:35:06 you can really be worried about, really want to oppose.
01:35:10 Really, I mean, I’m not saying
01:35:12 you’d never have to kill another person.
01:35:13 Like, I mean, self defense is still a thing, right?
01:35:16 But the idea that you’re ever confronting anything
01:35:22 other than a force of nature in the end
01:35:25 goes out the window, right?
01:35:26 Or does go out the window when you really pay attention.
01:35:29 I’m not saying that this would be easy to grok
01:35:33 if someone kills a member of your family.
01:35:38 I’m not saying you can just listen
01:35:39 to my 90 minutes on free will
01:35:40 and then you should be able to see that person
01:35:42 as identical to a grizzly bear or a virus.
01:35:46 Because there’s so, I mean, we are so evolved
01:35:49 to deal with one another as fellow primates
01:35:54 and as agents, but it’s, yeah,
01:36:00 when you’re talking about the possibility
01:36:01 of, you know, Christian, you know,
01:36:05 truly Christian forgiveness, right?
01:36:08 It’s like, you know, as testified to by, you know,
01:36:14 various saints of that flavor over the millennia.
01:36:19 Yeah, that is, the doorway to that is to recognize
01:36:24 that no one really at bottom made themselves.
01:36:28 And therefore everyone, what we’re seeing really
01:36:31 are differences in luck in the world.
01:36:34 We’re seeing people who are very, very lucky
01:36:37 to have had good parents and good genes
01:36:38 and to be in good societies and had good opportunities
01:36:41 and to be intelligent and to be, you know,
01:36:44 not as intelligent as they were in the past.
01:36:47 And to be, you know, not sociopathic,
01:36:50 like none of it is on them.
01:36:53 They’re just reaping the fruits of one lottery
01:36:56 after another, and then showing up in the world
01:36:59 on that basis.
01:37:01 And then so it is with, you know,
01:37:04 every malevolent asshole out there, right?
01:37:06 He or she didn’t make themself.
01:37:11 Even if that weren’t possible,
01:37:14 the utility of self-compassion is also enormous
01:37:18 because it’s, when you just look at what it’s like
01:37:21 to regret something or to feel shame about something
01:37:27 or feel deep embarrassment, these states of mind
01:37:30 are some of the most deranging experiences anyone has.
01:37:34 And the indelible reaction to them,
01:37:40 you know, the memory of the thing you said,
01:37:41 you know, the memory of the wedding toast you gave
01:37:44 20 years ago that was just mortifying, right?
01:37:47 The fact that that can still make you hate yourself, right?
01:37:50 And like that psychologically,
01:37:52 that is a knot that can be untied, right?
01:37:56 Speak for yourself, Sam.
01:37:57 Yeah, yeah.
01:37:58 So clearly you’re not.
01:37:59 You gave a great toast.
01:37:59 It was my toast that mortified me.
01:38:01 No, no, that’s not what I was referring to.
01:38:02 I’m deeply appreciative in the same way
01:38:07 that you’re referring to of every moment I’m alive,
01:38:10 but I’m also powered by self hate often.
01:38:15 Like several things in this conversation already
01:38:18 that I’ve spoken, I’ll be thinking about,
01:38:21 like that was the dumbest thing.
01:38:23 You’re sitting in front of Sam Harris and you said that.
01:38:26 So like that, but that somehow creates
01:38:29 a richer experience for me.
01:38:30 Like I’ve actually come to accept that as a nice feature
01:38:33 however my brain was built.
01:38:35 I don’t think I want to let go of that.
01:38:37 Well, I think the thing you want to let go of
01:38:39 is the suffering associated with it.
01:38:46 So like, so for me, so psychologically and ethically,
01:38:53 all of this is very interesting.
01:38:55 So I don’t think
01:38:56 we should ever get rid of things like anger, right?
01:38:59 So like hatred is, hatred is divorcible from anger
01:39:02 in the sense that hatred is this enduring state where,
01:39:07 you know, whether you’re hating somebody else
01:39:09 or hating yourself, it is just,
01:39:11 it is toxic and durable and ultimately useless, right?
01:39:15 Like it becomes self-nullifying, right?
01:39:19 Like you become less capable as a person
01:39:23 to solve any of your problems.
01:39:24 It’s not instrumental in solving the problem
01:39:26 that is occasioning all this hatred.
01:39:30 And anger for the most part isn’t either except
01:39:34 as a signal of salience that there’s a problem, right?
01:39:37 So if somebody does something that makes me angry,
01:39:40 that just promotes this situation
01:39:44 to conscious attention in a way that is stronger
01:39:46 than my not really caring about it, right?
01:39:49 And there are things that I think should make us angry
01:39:51 in the world and there’s the behavior of other people
01:39:54 that should make us angry because we should respond to it.
01:39:57 And so it is with yourself.
01:39:59 If I do something, you know, as a parent,
01:40:01 if I do something stupid that harms one of my daughters,
01:40:05 right, my belief, my experience of myself
01:40:10 and my beliefs about free will close the door to my saying,
01:40:14 well, I should have done otherwise in the sense
01:40:16 that if I could go back in time,
01:40:17 I would have actually effectively done otherwise.
01:40:20 No, I would do, given the same causes and conditions,
01:40:22 I would do that thing a trillion times in a row, right?
01:40:26 But, you know, regret and feeling bad about an outcome
01:40:31 are still important capacities because, like, yeah,
01:40:34 you know, like I desperately want my daughters
01:40:37 to be happy and healthy.
01:40:38 So if I’ve done something, you know,
01:40:40 if I crash the car when they’re in the car
01:40:42 and they get injured, right,
01:40:43 and I do it because I was trying to change a song
01:40:47 on my playlist or, you know, something stupid,
01:40:50 I’m gonna feel like a total asshole.
01:40:53 How long do I stew in that feeling of regret?
01:40:58 Right, and to like, what utility is there to extract
01:41:04 out of this error signal?
01:41:05 And then what do I do?
01:41:06 We’re always faced with the question of what to do next,
01:41:10 right, and how to best do that thing,
01:41:13 that necessary thing next.
01:41:15 And how much wellbeing can we experience while doing it?
01:41:22 Like how miserable do you need to be to solve a problem
01:41:27 in life and to help solve the problems
01:41:30 of people closest to you?
01:41:32 You know, how miserable do you need to be
01:41:33 to get through your to do list today?
01:41:36 Ultimately, I think you can be deeply happy
01:41:44 going through all of it, right?
01:41:46 And even navigating moments that are scary
01:41:51 and, you know, really destabilizing to ordinary people.
01:41:56 And, I mean, I think, you know, again,
01:42:01 I’m always up kind of at the edge of my own capacities here
01:42:05 and there are all kinds of things that stress me out
01:42:07 and worry me, especially if
01:42:10 you’re gonna tell me it’s something with, you know,
01:42:11 the health of one of my kids, you know,
01:42:14 it’s very hard for me, like, it’s very hard for me
01:42:16 to be truly equanimous around that.
01:42:19 But equanimity is so useful
01:42:23 the moment you’re in response mode, right?
01:42:26 Because, I mean, the ordinary experience for me
01:42:30 of responding to what seems like a medical emergency
01:42:35 for one of my kids is to be obviously super energized
01:42:40 by concern to respond to that emergency.
01:42:44 But then once I’m responding to that emergency,
01:42:50 all of my fear and agitation and worry and, oh my God,
01:42:54 what if this is really something terrible?
01:42:58 But finding any of those thoughts compelling,
01:43:01 that only diminishes my capacity as a father
01:43:05 to be good company while we navigate
01:43:08 this really turbulent passage, you know?
01:43:11 As you’re saying this actually,
01:43:12 one guy comes to mind, which is Elon Musk.
01:43:14 One of the really impressive things to me
01:43:17 was to observe how many dramatic things
01:43:19 he has to deal with throughout the day at work,
01:43:22 but also if you look through his life, family too,
01:43:26 and how he’s very much actually, as you’re describing,
01:43:30 basically a practitioner of this way of thought,
01:43:33 which is you’re not in control.
01:43:37 You’re basically responding
01:43:39 no matter how traumatic the event,
01:43:41 and there’s no reason to sort of linger
01:43:44 on the negative feelings around that.
01:43:46 Well, so, I mean, he’s in a very specific situation,
01:43:52 which is unlike normal life,
01:43:57 you know, even his normal life,
01:43:59 but normal life for most people,
01:44:00 because when you just think of like, you know,
01:44:02 he’s running so many businesses,
01:44:04 and they’re very,
01:44:06 they’re highly nonstandard businesses.
01:44:08 So what he sees is, everything that gets to him
01:44:12 is some kind of emergency.
01:44:13 Otherwise it wouldn’t be getting to him.
01:44:15 If it needs his attention,
01:44:16 there’s a fire somewhere.
01:44:17 So he’s constantly responding to fires
01:44:20 that have to be put out.
01:44:22 So there’s no default expectation
01:44:25 that there shouldn’t be a fire, right?
01:44:27 But in our normal lives, we live,
01:44:29 most of us, I mean, most of us who are lucky, right?
01:44:31 Not everyone, obviously on earth,
01:44:33 but most of us who are at some kind of cruising altitude
01:44:36 in terms of our lives,
01:44:39 where we’re reasonably healthy,
01:44:40 and life is reasonably orderly,
01:44:42 and the political apparatus around us
01:44:44 is reasonably functionable, functional,
01:44:47 functionable.
01:44:48 So I said, functionable for the first time in my life
01:44:50 through no free will of my own.
01:44:51 It’s like, I noticed those errors,
01:44:53 and they do not feel like agency,
01:44:56 and nor does the success of an utterance feel like agency.
01:45:01 But when you’re looking at normal human life, right,
01:45:06 where you’re just trying to be happy and healthy,
01:45:10 and get your work done,
01:45:13 there’s this default expectation
01:45:14 that there shouldn’t be fires.
01:45:16 People shouldn’t be getting sick or injured.
01:45:20 We shouldn’t be losing vast amounts of our resources.
01:45:23 So when something really stark
01:45:27 like that happens,
01:45:31 people don’t have that muscle
01:45:34 of, like, I’ve been responding to emergencies
01:45:37 all day long, seven days a week in business mode,
01:45:42 and so I have a very thick skin.
01:45:44 This is just another one.
01:45:45 I’m not expecting anything else
01:45:47 when I wake up in the morning.
01:45:48 No, we have this default sense that,
01:45:52 I mean, honestly, most of us have the default sense
01:45:54 that we aren’t gonna die, right,
01:45:57 or that we should, like, maybe we’re not gonna die.
01:45:59 Right, like, death denial really is a thing.
01:46:02 You know, we’re, and you can see it,
01:46:06 just like I can see when I reach for this bottle
01:46:09 that I was expecting it to be solid,
01:46:11 because when it isn’t solid, when it’s a hologram
01:46:13 and I just, my fist closes on itself,
01:46:16 I’m damn surprised.
01:46:18 People are damn surprised to find out
01:46:22 that they’re going to die, to find out that they’re sick,
01:46:24 to find out that someone they love has died
01:46:27 or is going to die.
01:46:28 So it’s like, the fact that we are surprised
01:46:32 by any of that shows us that
01:46:36 we’re living in a mode that is, you know,
01:46:45 we’re perpetually diverting ourselves
01:46:47 from some facts that should be obvious, right,
01:46:50 and the more salient we can make them,
01:46:55 you know, the more, I mean, in the case of death,
01:46:57 it’s a matter of being able to get one’s priorities straight.
01:47:01 I mean, the moment, again, this is hard for everybody,
01:47:04 even those who are really in the business
01:47:06 of paying attention to it,
01:47:07 but the moment you realize that every circumstance
01:47:12 is finite, right, you’ve got a certain number of,
01:47:15 you know, you’ve got whatever, whatever it is,
01:47:17 8,000 days left in a normal span of life,
01:47:21 and 8,000 is a, sounds like a big number,
01:47:24 it’s not that big a number, right,
01:47:25 so it’s just like, and then you can decide
01:47:29 how you want to go through life
01:47:31 and how you want to experience each one of those days,
01:47:34 and so, back to our jumping-off point,
01:47:39 I would argue that you don’t want to feel self hatred ever.
01:47:44 I would argue that you don’t want
01:47:53 to really grasp onto any of those moments
01:47:55 where you are internalizing the fact
01:47:58 that you just made an error, you’ve embarrassed yourself,
01:48:01 that something didn’t go the way you wanted it to.
01:48:03 I think you want to treat all of those moments
01:48:05 very, very lightly.
01:48:06 You want to extract the actionable information.
01:48:10 It’s something to learn.
01:48:11 Oh, you know, I learned that when I prepare
01:48:17 in a certain way, it works better
01:48:18 than when I prepare in some other way,
01:48:20 or don’t prepare, right, like yes,
01:48:22 lesson learned, you know, and do that differently,
01:48:25 but yeah, I mean, so many of us have spent so much time
01:48:35 with a very dysfunctional and hostile
01:48:42 and even hateful inner voice
01:48:46 governing a lot of our self talk
01:48:48 and a lot of just our default way of being with ourselves.
01:48:51 I mean, the privacy of our own minds,
01:48:54 we’re in the company of a real jerk a lot of the time,
01:48:58 and that can’t help but affect,
01:49:03 I mean, forget about just your own sense of wellbeing.
01:49:05 It can’t help but limit what you’re capable of
01:49:08 in the world with other people.
01:49:10 I’ll have to really think about that.
01:49:12 I just take pride that my jerk, my inner voice jerk
01:49:15 is much less of a jerk than somebody like David Goggins,
01:49:18 who’s like screaming in his ear constantly.
01:49:20 So I have a relativist kind of perspective
01:49:23 that it’s not as bad as that at least.
01:49:25 Well, having a sense of humor also helps, you know,
01:49:28 it’s just like, it’s not,
01:49:29 the stakes are never quite what you think they are.
01:49:32 And even when they are, I mean,
01:49:34 it’s just the difference between being able
01:49:38 to see the comedy of it rather than,
01:49:41 because again, there’s this sort of dark star
01:49:44 of self absorption that pulls everything into it, right?
01:49:49 And that’s the algorithm you don’t want to run.
01:49:55 So it’s like, you just want things to be good.
01:49:57 So like, just push the concern out there,
01:50:01 like not have the collapse of,
01:50:04 oh my God, what does this say about me?
01:50:06 It’s just like,
01:50:08 how do we make this meal that we’re all having together
01:50:11 as fun and as useful as possible?
01:50:15 And you’re saying in terms of propulsion systems,
01:50:17 you recommend humor is a good spaceship
01:50:19 to escape the gravitational field of that darkness.
01:50:23 Well, that certainly helps, yeah.
01:50:24 Yeah, well, let me ask you a little bit about ego and fame,
01:50:29 which is very interesting the way you’re talking,
01:50:33 given that you’re one of the biggest intellects,
01:50:38 living intellects and minds of our time.
01:50:41 And there’s a lot of people that really love you
01:50:44 and almost elevate you to a certain kind of status
01:50:49 where you’re like the guru.
01:50:50 I’m surprised you didn’t show up in a robe, in fact.
01:50:54 Is there a…
01:50:55 A hoodie, isn’t that the highest status garment
01:50:58 one can wear now?
01:50:59 The socially acceptable version of the robe.
01:51:02 If you’re a billionaire, you wear a hoodie.
01:51:04 Is there something you can say about managing
01:51:07 the effects of fame on your own mind,
01:51:11 on not creating this, you know, when you wake up
01:51:14 in the morning, when you look in the mirror,
01:51:18 how do you get your ego not to grow exponentially?
01:51:24 Your conception of self to grow exponentially
01:51:26 because there’s so many people feeding that.
01:51:28 Is there something to be said about this?
01:51:30 It’s really not hard because I mean,
01:51:32 I feel like I have a pretty clear sense
01:51:36 of my strengths and weaknesses.
01:51:39 And I don’t feel like it’s…
01:51:43 I mean, honestly, I don’t feel like I suffer
01:51:45 from much grandiosity.
01:51:48 I mean, I just have a, you know,
01:51:51 there’s so many things I’m not good at.
01:51:52 There’s so many things I will, you know,
01:51:53 given the remaining 8,000 days at best,
01:51:57 I will never get good at.
01:52:00 I would love to be good at these things.
01:52:02 So it’s just, it’s easy to feel diminished
01:52:05 by comparison with the talents of others.
01:52:08 Do you remind yourself of all the things
01:52:11 that you’re not competent in?
01:52:14 I mean, like what is…
01:52:15 Well, they’re just on display for me every day
01:52:17 that I appreciate the talents of others.
01:52:19 But you notice them.
01:52:20 I’m sure Stalin and Hitler did not notice
01:52:22 all the ways in which they were.
01:52:25 I mean, this is why absolute power corrupts absolutely:
01:52:28 you stop noticing the ways
01:52:30 in which you’re ridiculous and wrong.
01:52:33 Right, yeah, no, I am…
01:52:36 Not to compare you to Stalin.
01:52:37 Yeah, well, I’m sure there’s an inner Stalin
01:52:40 in there somewhere.
01:52:41 Well, we all have, we all carry a baby Stalin with us.
01:52:43 He wears better clothes.
01:52:46 And I’m not gonna grow that mustache.
01:52:49 Those concerns don’t map,
01:52:50 they don’t map onto me for a bunch of reasons.
01:52:53 But one is I also have a very peculiar audience.
01:52:56 Like I’m just, you know,
01:52:59 I’ve been appreciating this for a few years,
01:53:01 but it’s, I’m just now beginning to understand
01:53:05 that there are many people who have audiences
01:53:07 of my size or larger that have a very different experience
01:53:11 of having an audience than I do.
01:53:13 I have curated for better or worse, a peculiar audience.
01:53:18 And the net result of that is virtually any time
01:53:25 I say anything of substance,
01:53:28 something like half of my audience,
01:53:30 my real audience, not haters from outside my audience,
01:55:33 but my audience just revolts over it, right?
01:53:38 They just like, oh my God, I can’t believe you said it,
01:53:41 like you’re such a schmuck, right?
01:53:43 They revolt with rigor and intellectual sophistication.
01:53:47 Or not, or not, but I mean, it’s both,
01:53:49 but it’s like, but people who are like,
01:53:51 so it’s, I mean, the clearest case is,
01:53:53 you know, I have whatever audience I have
01:53:55 and then Trump appears on the scene
01:53:56 and I discovered that something like 20% of my audience
01:54:00 just went straight to Trump and couldn’t believe
01:54:03 I didn’t follow them there.
01:54:05 They were just aghast that I didn’t see
01:54:06 that Trump was obviously exactly what we needed
01:54:10 to steer the ship of state for the next four years
01:54:15 and then four years beyond that.
01:54:17 So like, so that’s one example.
01:54:20 So whenever I said anything about Trump,
01:54:22 I would hear from people who loved more or less
01:54:25 everything else I was up to and had for years,
01:54:28 but everything I said about Trump just gave me pure pain
01:54:33 from this quadrant of my audience.
01:54:36 But then the same thing happens when I say something
01:54:39 about the derangement of the far left.
01:54:42 Anything I say about wokeness, right,
01:54:44 or identity politics, same kind of punishment signal
01:54:48 from, again, people who are core to my audience,
01:54:51 like I’ve read all your books, I’m using your meditation app,
01:54:55 I love what you say about science,
01:54:57 but you are so wrong about politics and you are,
01:55:00 I’m starting to think you’re a racist asshole
01:55:02 for everything you said about identity politics.
01:55:06 And there are so many, the free will topic
01:55:08 is just like this, it’s like I just,
01:55:12 they love what I’m saying about consciousness and the mind
01:55:15 and they love to hear me talk about physics with physicists
01:55:18 and it’s all good, this free will stuff is,
01:55:21 I cannot believe you don’t see how wrong you are,
01:55:24 what a fucking embarrassment you are.
01:55:26 So, but I’m starting to notice that there are other people
01:55:30 who don’t have this experience of having an audience
01:55:33 because they have, I mean, just take the Trump woke dichotomy.
01:55:37 They just castigated Trump the same way I did,
01:55:41 but they never say anything bad about the far left.
01:55:44 So they never get this punishment signal or you flip it.
01:55:46 They’re all about the insanity of critical race theory now.
01:55:51 We connect all those dots the same way,
01:55:54 but they never really specified what was wrong with Trump
01:55:58 or they thought there was a lot right with Trump
01:56:00 and they got all the pleasure of that.
01:56:02 And so they have much more homogenized audiences.
01:56:07 And so my experience, so just to come back
01:56:10 to this experience of fame or quasi fame,
01:56:13 I mean, in truth, it’s not real fame,
01:56:16 but it’s still, there’s an audience there.
01:56:20 It is a, it’s now an experience where basically
01:56:26 whatever I put out, I notice a ton of negativity
01:56:30 coming back at me and it just, it is what it is.
01:56:35 I mean, now, it’s like, I used to think, wait a minute,
01:56:37 there’s gotta be some way for me to communicate
01:56:39 more clearly here so as not to get this kind of
01:56:45 lunatic response from my own audience.
01:56:48 From like people who are showing all the signs of,
01:56:51 we’ve been here for years for a reason, right?
01:56:54 These are not just trolls.
01:56:56 And so I think, okay, I’m gonna take 10 more minutes
01:56:59 and really just tell you what should be absolutely clear
01:57:04 about what’s wrong with Trump, right?
01:57:05 I’ve done this a few times,
01:57:07 but I think I gotta do this again.
01:57:09 Or wait a minute, how are they not getting
01:57:12 that these episodes of police violence
01:57:15 are so obviously different from the ones
01:57:17 that you can’t ascribe all of them
01:57:20 to yet another racist maniac on the police force,
01:57:25 killing someone based on his racism.
01:57:29 Last time I spoke about this, it was pure pain,
01:57:31 but I just gotta try again.
01:57:33 Now at a certain point, I mean, I’m starting to feel like,
01:57:36 all right, I just, I have to be, I have to cease.
01:57:40 Again, it comes back to this expectation
01:57:43 that there shouldn’t be fires.
01:57:45 I feel like if I could just play my game impeccably,
01:57:49 the people who actually care what I think will follow me
01:57:53 when I hit Trump and hit free will and hit the woke
01:57:58 and hit whatever it is,
01:58:00 how we should respond to the coronavirus, you know?
01:58:03 I mean, vaccines, are they a thing, right?
01:58:06 Like there’s such derangement in our information space now
01:58:10 that, I mean, I guess, you know,
01:58:13 some people could be getting more of this than I expect,
01:58:15 but I just noticed that many of our friends
01:58:18 who are in the same game have more homogenized audiences
01:58:22 and don’t get, I mean, they’ve successfully filtered out
01:58:26 the people who are gonna despise them on this next topic.
01:58:30 And I would imagine you have a different experience
01:58:34 of having a podcast than I do at this point.
01:58:36 I mean, I’m sure you get haters,
01:58:38 but I would imagine you’re more streamlined.
01:58:43 I actually don’t like the word haters
01:58:45 because it kinda presumes, it puts people in a bin.
01:58:50 I think we all have like baby haters inside of us
01:58:54 and we just apply them and some people enjoy doing that
01:58:57 more than others for particular periods of time.
01:59:00 I think you can almost see hating on the internet
01:59:03 as a video game that you just play and it’s fun,
01:59:05 but then you can put it down and walk away
01:59:07 and no, I certainly have a bunch of people
01:59:10 that are very critical.
01:59:11 I can list all the ways.
01:59:12 But does it feel like on any given topic,
01:59:14 does it feel like it’s an actual tidal surge
01:59:17 where it’s like 30% of your audience,
01:59:19 and then another 30% of your audience,
01:59:22 from podcast to podcast?
01:59:24 No, no, no.
01:59:24 That’s happening to me all the time now.
01:59:27 Well, I’m more with, I don’t know what you think about this.
01:59:30 I mean, Joe Rogan doesn’t read comments
01:59:33 or doesn’t read comments much.
01:59:35 And the argument he made to me is that
01:59:40 he already has like a self critical person inside.
01:59:46 And I’m gonna have to think about
01:59:48 what you said in this conversation,
01:59:49 but I have this very harshly self critical person
01:59:52 inside as well where I don’t need more fuel.
01:59:55 I don’t need, no, I do sometimes.
01:59:59 That’s why I check negativity occasionally,
02:00:02 not too often.
02:00:03 I sometimes need to like put a little bit more
02:00:06 like coals into the fire, but not too much.
02:00:09 But I already have that self critical engine
02:00:11 that keeps me in check.
02:00:12 I just, I wonder, you know, a lot of people
02:00:15 who gain more and more fame lose that ability
02:00:20 to be self critical.
02:00:21 I guess because they lose the audience
02:00:23 that can be critical towards them.
02:00:25 Hmm.
02:00:26 You know, I do follow Joe’s advice much more
02:00:28 than I ever have here.
02:00:29 Like I don’t look at comments very often.
02:00:32 And I’m probably using Twitter, you know,
02:00:36 5% as much as I used to.
02:00:39 I mean, I really just get in and out on Twitter
02:00:42 and spend very little time in my @ mentions.
02:00:46 But, you know, it does in some ways feel like a loss
02:00:49 because occasionally
02:00:50 I see something super intelligent there.
02:00:52 Like, I mean, I’ll check my Twitter @ mentions
02:00:55 and someone will have said, oh, have you read this article?
02:00:58 And it’s like, man, that was just,
02:01:00 that was like the best article sent to me in a month, right?
02:01:03 So it’s like, to not have looked
02:01:04 and to not have seen that, that’s a loss.
02:01:08 So, but it does, at this point, a little goes a long way.
02:01:13 Cause I, yeah, it’s not that it, for me now,
02:01:19 I mean, this could sound like a fairly Stalinistic immunity
02:01:23 to criticism, it’s not so much that these voices of hate
02:01:27 turn on my inner hater, you know,
02:01:31 it’s more that I just, I get a,
02:01:33 what I fear is a false sense of humanity.
02:01:38 Like, I feel like I’m too online
02:01:41 and online is selecting for this performative outrage
02:01:43 in everybody, everyone’s signaling to an audience
02:01:46 when they trash you.
02:01:48 And I get a dark, I’m getting a, you know,
02:01:52 a misanthropic, you know, cut of just what it’s like
02:01:58 out there.
02:01:59 And it, cause when you meet people in real life,
02:02:02 they’re great, you know, they’re rather often great,
02:02:04 you know, and it takes a lot to have anything
02:02:09 like a Twitter encounter in real life with a living person.
02:02:15 And that’s, I think it’s much better to have that
02:02:19 as one’s default sense of what it’s like to be with people
02:02:24 than what one gets on social media
02:02:28 or on YouTube comment threads.
02:02:30 You’ve produced a special episode with Rob Reid
02:02:33 on your podcast recently on how bioengineering of viruses
02:02:38 is going to destroy human civilization.
02:02:40 So.
02:02:41 Or could.
02:02:42 Could.
02:02:43 One fears, yeah.
02:02:44 Sorry for the confidence there.
02:02:45 But in the 21st century, what do you think,
02:02:49 especially after having thought through that angle,
02:02:53 what do you think is the biggest threat
02:02:56 to the survival of the human species?
02:03:00 I can give you the full menu if you’d like.
02:03:02 Yeah, well, no, I would put the biggest threat
02:03:06 at another level out, kind of the meta threat
02:03:11 is our inability to agree about what the threats actually are
02:03:19 and to converge on strategies for responding to them, right?
02:03:25 So like I view COVID as, among other things,
02:03:29 a truly terrifyingly failed dress rehearsal
02:03:35 for something far worse, right?
02:03:37 I mean, COVID is just about as benign as it could have been
02:03:41 and still have been worse than the flu
02:03:44 when you’re talking about a global pandemic, right?
02:03:46 So it’s just, it’s gonna kill a few million people
02:03:51 or it looks like it’s killed about 3 million people.
02:03:53 Maybe it’ll kill a few million more
02:03:56 unless something gets away from us
02:03:58 with a variant that’s much worse
02:04:00 or we really don’t play our cards right.
02:04:02 But I mean, the general shape of it is
02:04:06 it’s got somewhere around, well, 1% lethality
02:04:12 and whatever side of that number it really is on
02:04:18 in the end, it’s not what would in fact be possible
02:04:23 and is in fact probably inevitable
02:04:26 something with orders of magnitude
02:04:29 more lethality than that.
02:04:30 And it’s just so obvious we are totally unprepared, right?
02:04:35 We are running this epidemiological experiment
02:04:39 of linking the entire world together
02:04:41 and then also now per the podcast that Rob Reid did
02:04:47 democratizing the tech that will allow us to do this,
02:04:51 to engineer pandemics, right?
02:04:53 And more and more people will be able
02:04:56 to engineer synthetic viruses that will be
02:05:02 by the sheer fact that they would have been engineered
02:05:04 with malicious intent, worse than COVID.
02:05:08 And we’re still living in,
02:05:11 to speak specifically about the United States,
02:05:13 we have a country here where we can’t even agree
02:05:17 that this is a thing, like that COVID,
02:05:20 I mean, there’s still people who think
02:05:21 that this is basically a hoax designed to control people.
02:05:25 And stranger still, there are people who will acknowledge
02:05:30 that COVID is real and they’ll look,
02:05:34 they don’t think the deaths have been faked or misascribed,
02:05:42 but they think that they’re far happier
02:05:47 at the prospect of catching COVID
02:05:49 than they are of getting vaccinated for COVID, right?
02:05:53 They’re not worried about COVID,
02:05:54 they’re worried about vaccines for COVID, right?
02:05:57 And the fact that we just can’t converge in a conversation
02:06:01 that we’ve now had a year to have with one another
02:06:05 on just what is the ground truth here?
02:06:08 What’s happened?
02:06:09 Why has it happened?
02:06:12 How safe is it to get COVID in every cohort
02:06:17 in the population?
02:06:19 And how safe are the vaccines?
02:06:21 And the fact that there’s still an air of mystery
02:06:23 around all of this for much of our society
02:06:28 does not bode well when you’re talking about solving
02:06:30 any other problem that may yet kill us.
02:06:32 But do you think convergence grows
02:06:34 with the magnitude of the threat?
02:06:36 It’s possible, except I feel like we have tipped into,
02:06:40 because when the threat of COVID looked the most dire,
02:06:45 when we were seeing reports from Italy
02:06:48 that looked like the beginning of a zombie movie.
02:06:51 Because it could have been much, much worse.
02:06:52 Yeah, this is lethal, right?
02:06:55 Your ICUs are gonna fill up,
02:06:57 you’re 14 days behind us.
02:07:01 Your medical system is in danger of collapse.
02:07:04 Lock the fuck down.
02:07:06 We have people refusing to do anything sane
02:07:11 in the face of that.
02:07:12 People fundamentally thinking,
02:07:14 it’s not gonna get here, right?
02:07:17 Who knows what’s going on in Italy,
02:07:18 but it has no implications for what’s gonna go on in New York
02:07:21 in a mere six days, right?
02:07:23 And now it kicks off in New York,
02:07:25 and you’ve got people in the middle of the country
02:07:27 thinking it’s no factor, it’s not,
02:07:31 that’s just big city, those are big city problems,
02:07:34 or they’re faking it.
02:07:35 Or, I mean, it just, the layer of politics
02:07:40 has become so dysfunctional for us
02:07:42 that even in the presence of a pandemic
02:07:48 that looked legitimately scary there in the beginning,
02:07:50 I mean, it’s not to say that it hasn’t been devastating
02:07:52 for everyone who’s been directly affected by it,
02:07:54 and it’s not to say it can’t get worse,
02:07:56 but here, for a very long time,
02:07:58 we have known that we were in a situation
02:08:01 that is more benign than what seemed
02:08:05 like the worst case scenario as it was kicking off,
02:08:08 especially in Italy.
02:08:11 And so still, yeah, it’s quite possible
02:08:15 that if we saw the asteroid hurtling toward Earth
02:08:18 and everyone agreed that it’s gonna make impact
02:08:23 and we’re all gonna die,
02:08:25 then we could get off Twitter
02:08:27 and actually build the rockets
02:08:30 that are gonna divert the asteroid
02:08:32 from its Earth crossing path,
02:08:35 and we could do something pretty heroic.
02:09:37 But when you talk about anything else
02:09:41 that’s slower moving than that,
02:08:46 I mean, something like climate change,
02:08:48 I think the prospect of our converging
02:08:54 on a solution to climate change
02:08:56 purely based on political persuasion
02:08:58 is nonexistent at this point.
02:09:00 I just think, to bring Elon back into this,
02:09:04 the way to deal with climate change
02:09:05 is to create technology that everyone wants
02:09:09 that is better than all the carbon producing technology,
02:09:14 and then we just transition
02:09:15 because you want an electric car
02:09:19 the same way you wanted a smartphone
02:09:20 or you want anything else,
02:09:22 and you’re working totally with the grain
02:09:24 of people’s selfishness and short term thinking.
02:09:29 The idea that we’re gonna convince
02:09:31 the better part of humanity
02:09:33 that climate change is an emergency,
02:09:35 that they have to make sacrifices to respond to,
02:09:39 given what’s happened around COVID,
02:09:41 I just think that’s the fantasy of a fantasy.
02:09:46 But speaking of Elon,
02:09:48 I have a bunch of positive things
02:09:49 that I wanna say here in response to you,
02:09:51 but you’re opening so many threads,
02:09:53 but let me pull one of them, which is AI.
02:09:57 Both you and Elon think that with AI,
02:10:02 you’re summoning demons, summoning a demon,
02:10:05 maybe not in those poetic terms, but.
02:10:08 Well, potentially. Potentially.
02:10:10 Three very parsimonious assumptions,
02:10:17 I think, here.
02:10:18 Scientifically parsimonious assumptions get me there.
02:10:25 Any of which could be wrong,
02:10:26 but it just seems like the weight
02:10:28 of the evidence is on their side.
02:10:31 One is that it comes back to this topic
02:10:34 of substrate independence, right?
02:10:36 Anyone who’s in the business
02:10:38 of producing intelligent machines
02:10:40 must believe, ultimately,
02:10:43 that there’s nothing magical
02:10:45 about having a computer made of meat.
02:10:47 You can do this in the kinds of materials
02:10:50 we’re using now,
02:10:53 and there’s no special something
02:10:56 that presents a real impediment
02:11:00 to producing human level intelligence in silico, right?
02:11:05 Again, an assumption, I’m sure there are a few people
02:11:08 who still think there is something magical
02:11:09 about biological systems,
02:11:12 but leave that aside.
02:11:18 Given that assumption,
02:11:20 and given the assumption
02:11:21 that we just continue making incremental progress,
02:11:24 doesn’t have to be Moore’s Law,
02:11:25 it just has to be progress,
02:11:27 that just doesn’t stop,
02:11:29 at a certain point,
02:11:30 we’ll get to human level intelligence and beyond.
02:11:34 And human level intelligence,
02:11:36 I think, is also clearly a mirage,
02:11:38 because anything that’s human level
02:11:40 is gonna be superhuman
02:11:42 unless we decide to dumb it down, right?
02:11:44 I mean, my phone is already superhuman as a calculator,
02:11:47 right, so why would we make the human level AI
02:11:51 just as good as me as a calculator?
02:11:54 So I think we’ll very,
02:11:57 if we continue to make progress,
02:11:59 we will be in the presence of superhuman competence
02:12:03 for any act of intelligence or cognition
02:12:09 that we care to prioritize.
02:12:11 It’s not to say that we’ll create everything
02:12:13 that a human could do,
02:12:14 maybe we’ll leave certain things out,
02:12:16 but anything that we care about,
02:12:18 and we care about a lot,
02:12:20 and we certainly care about anything
02:12:21 that produces a lot of power,
02:12:24 that we care about scientific insights
02:12:26 and an ability to produce new technology and all of that,
02:12:30 we’ll have something that’s superhuman.
02:12:34 And then the final assumption is just that
02:12:39 there have to be ways to do that
02:12:42 that are not aligned with a happy coexistence
02:12:46 with these now more powerful entities than ourselves.
02:12:51 So, and I would guess,
02:12:54 and this is kind of a rider to that assumption,
02:12:57 there are probably more ways to do it badly
02:12:59 than to do it perfectly,
02:13:01 that is, perfectly aligned with our wellbeing.
02:13:05 And when you think about the consequences of nonalignment,
02:13:10 when you think about,
02:13:13 you’re now in the presence of something
02:13:15 that is more intelligent than you are, right?
02:13:18 Which is to say more competent, right?
02:13:20 Unless you’ve, and obviously there are cartoon pictures
02:13:24 of this where we could just,
02:13:26 this is just an off switch,
02:13:27 we could just turn off the off switch,
02:13:28 or they’re tethered to something that makes them,
02:13:31 our slaves in perpetuity,
02:13:33 even though they’re more intelligent.
02:13:34 But those scenarios strike me as a failure to imagine
02:13:39 what is actually entailed by greater intelligence, right?
02:13:42 So if you imagine something
02:13:43 that’s legitimately more intelligent than you are,
02:13:47 and you’re now in relationship to it, right?
02:13:51 You’re in the presence of this thing
02:13:52 and it is autonomous in all kinds of ways
02:13:54 because it had to be to be more intelligent than you are.
02:13:57 I mean, you built it to be all of those things.
02:14:01 We just can’t find ourselves in a negotiation
02:14:05 with something more intelligent than we are, you know?
02:14:08 And we can’t, so we have to have found
02:14:10 the subset of ways to build these machines
02:14:16 that are perpetually amenable to our saying,
02:14:22 oh, that’s not what we meant, that’s not what we intended.
02:14:25 Could you stop doing that, just come back over here
02:14:27 and do this thing that we actually want.
02:14:29 And for them to care, for them to be tethered
02:14:31 to our own sense of our own wellbeing,
02:14:35 such that, you know, their primary utility function,
02:14:39 and this is, I think, Stuart Russell’s plan,
02:14:43 is to figure out
02:14:51 how to tether them to a utility function
02:14:53 that has our own estimation of what’s going to improve
02:14:59 our wellbeing as its master reward, right?
02:15:05 So it’s like, all that, this thing can get
02:15:07 as intelligent as it can get,
02:15:10 but it only ever really wants to figure out
02:15:13 how to make our lives better by our own view of better.
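To give that scheme a concrete shape, here is a minimal toy sketch in Python, my own illustration of the general idea of an agent that stays uncertain about human preferences and defers to correction, not anything from Russell’s actual work; every name in it is invented:

    import random

    # Toy sketch only: the agent never sees the true human utility function.
    # It keeps many hypotheses about what the human values, acts on their
    # average, and when the hypotheses disagree badly, it defers and asks
    # the human instead of optimizing confidently.
    class DeferentialAgent:
        def __init__(self, actions, num_hypotheses=1000):
            self.actions = actions
            # Each hypothesis is one guess at how much the human values each action.
            self.hypotheses = [
                {a: random.gauss(0.0, 1.0) for a in actions}
                for _ in range(num_hypotheses)
            ]

        def expected_human_utility(self, action):
            # Average the guessed value of this action across all hypotheses.
            return sum(h[action] for h in self.hypotheses) / len(self.hypotheses)

        def disagreement(self):
            # How split are the hypotheses on which action the human prefers?
            votes = [max(h, key=h.get) for h in self.hypotheses]
            top = max(set(votes), key=votes.count)
            return 1.0 - votes.count(top) / len(votes)

        def act(self, ask_human):
            if self.disagreement() > 0.5:
                # Too uncertain about what "better" means to the human: defer.
                preferred = ask_human(self.actions)
                # Drop hypotheses that contradict the human's stated preference.
                kept = [h for h in self.hypotheses
                        if max(h, key=h.get) == preferred]
                self.hypotheses = kept or self.hypotheses
            return max(self.actions, key=self.expected_human_utility)

    # The human corrects the agent once; the correction narrows the agent's
    # model of human values rather than fighting it.
    agent = DeferentialAgent(["tidy_the_lab", "order_supplies", "rewrite_own_goals"])
    print(agent.act(ask_human=lambda options: "tidy_the_lab"))  # tidy_the_lab

The structural point is just that the machine’s best action is always computed through its model of what the humans value, and a human correction updates that model.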
02:15:16 Now, not to say there wouldn’t be a conversation about,
02:15:19 you know, I mean, because there’s all kinds of things
02:15:21 we’re not seeing clearly about what is better,
02:15:24 and if we were in the presence of a genie or an oracle
02:15:27 that could really tell us what is better,
02:15:29 well, then we presumably would want to hear that,
02:15:32 and we would modify our sense of what to do next
02:15:37 in conversation with these minds.
02:15:41 But I just feel like it is a failure of imagination
02:15:45 to think that being in relationship to something
02:15:55 more intelligent than yourself isn’t in most cases
02:16:00 a circumstance of real peril, because it is.
02:16:05 Just to think of how everything on Earth has to,
02:16:09 if they could think about their relationship to us,
02:16:12 if birds could think about what we’re doing, right?
02:16:16 They would, I mean, the bottom line is
02:16:21 they’re always in danger of our discovering
02:16:26 that there’s something we care about more than birds, right?
02:16:29 Or there’s something we want
02:16:30 that disregards the wellbeing of birds.
02:16:34 And obviously much of our behavior is inscrutable to them.
02:16:37 Occasionally we pay attention to them,
02:16:39 and occasionally we withdraw our attention,
02:16:41 and occasionally we just kill them all
02:16:43 for reasons they can’t possibly understand.
02:16:45 But if we’re building something more intelligent
02:16:48 than ourselves, by definition,
02:16:49 we’re building something whose horizons
02:16:52 of value and cognition can exceed our own
02:17:00 in ways we can’t necessarily foresee,
02:17:05 again, perpetually, that they don’t just wake up one day
02:17:09 and decide, okay, well, these humans need to disappear.
02:17:14 So I think I agree with most of the initial things you said.
02:17:19 What I don’t necessarily agree with,
02:17:22 and of course nobody knows,
02:17:24 but I think that the more likely set of trajectories
02:17:27 that we’re going to take are going to be positive.
02:17:30 That’s what I believe in the sense
02:17:32 that the way you develop,
02:17:35 I believe the way you develop successful AI systems
02:17:40 will be deeply integrated with human society.
02:17:43 And for them to succeed,
02:17:45 they’re going to have to be aligned
02:17:48 in the way we humans are aligned with each other,
02:17:50 which doesn’t mean we’re aligned.
02:17:52 There’s no such thing,
02:17:54 or I don’t see there’s such thing as a perfect alignment,
02:17:57 but they’re going to be participating in the dance,
02:18:01 in the game theoretic dance of human society,
02:18:04 as they become more and more intelligent.
02:18:06 There could be a point beyond which
02:18:09 we are like birds to them.
02:18:12 But what about an intelligence explosion of some kind?
02:18:16 So I believe the explosion will be happening,
02:18:21 but there’s a lot of explosion to be done
02:18:24 before we become like birds.
02:18:26 I truly believe that human beings
02:18:28 are very intelligent in ways we don’t understand.
02:18:30 It’s not just about chess.
02:18:32 It’s about all the intricate computation
02:18:35 we’re able to perform, common sense,
02:18:37 our ability to reason about this world, consciousness.
02:18:40 I think we’re doing a lot of work
02:18:42 we don’t realize is necessary to be done
02:18:44 in order to truly become,
02:18:47 like truly achieve super intelligence.
02:18:49 And I just think there’ll be a period of time
02:18:52 that’s not overnight.
02:18:53 The overnight nature of it will not literally be overnight.
02:18:57 It’ll be over a period of decades.
02:18:59 So my sense is…
02:19:00 So why would it be that, but just take,
02:19:02 draw an analogy from recent successes,
02:19:06 like something like AlphaGo or AlphaZero.
02:19:09 I forget the actual metric,
02:19:11 but it was something like this algorithm,
02:19:15 which wasn’t even totally,
02:19:17 it wasn’t bespoke for chess playing,
02:19:21 in a matter of, I think it was four hours,
02:19:24 played itself so many times and so successfully
02:19:27 that it became the best chess playing computer.
02:19:30 It was not only better than every human being,
02:19:33 it was better than every previous chess program
02:19:36 in a matter of a day, right?
02:19:38 So just imagine, again,
02:19:41 we don’t have to recapitulate everything about us,
02:19:43 but just imagine building a system,
02:19:47 and who knows when we’ll be able to do this,
02:19:50 but at some point we’ll be able,
02:19:52 at some point our 100 favorite things
02:19:56 about human cognition will be analogous to chess
02:20:01 in that we will be able to build machines
02:20:03 that very quickly outperform any human,
02:20:07 and then very quickly outperform the last algorithm
02:20:12 that outperformed the humans.
02:20:13 Like something like the AlphaGo experience
02:20:17 seems possible for facial recognition
02:20:21 and detecting human emotion
02:20:23 and natural language processing, right?
02:20:26 Well, it’s just that everyone,
02:20:29 even math people, math heads,
02:20:33 tend to have bad intuitions for exponentiation, right?
02:20:36 I mean, we noticed this during COVID.
02:20:37 I mean, you have some very smart people
02:20:39 who still couldn’t get their minds around the fact
02:20:42 that an exponential is really surprising.
02:20:46 I mean, things double and double and double and double again,
02:20:49 and you don’t notice much of anything changing,
02:20:51 and then the last two stages of doubling swamp everything.
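To put numbers on that intuition, a minimal sketch, illustrative only, assuming simple discrete doubling:

    # Illustrative only: the final stages of doubling swamp everything before them.
    value = 1
    history = [value]
    for _ in range(10):              # ten successive doublings
        value *= 2
        history.append(value)

    total = history[-1]              # 1024 times the starting value
    two_ago = history[-3]            # 256: where things stood two doublings earlier

    print(total - two_ago)               # 768 of the 1024 arrived in the last two steps
    print((total - two_ago) / total)     # 0.75: the last two doublings alone
                                         # account for three quarters of the total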
02:20:56 And it just seems like that,
02:20:59 to assume that there isn’t a deep analogy
02:21:04 between what we’re seeing for the more tractable problems,
02:21:09 like chess, to other modes of cognition,
02:21:13 it’s like once you crack that problem,
02:21:16 it seems, because for the longest time,
02:21:17 it was impossible to think
02:21:20 we were gonna make headway in AI, you know, it’s like.
02:21:25 Chess and Go was seen as impossible.
02:21:27 Yeah, Go seemed unattainable.
02:21:29 Even when chess had been cracked, Go seemed unattainable.
02:21:33 Yeah, and actually Stuart Russell was among the people
02:21:37 that were saying it’s unattainable,
02:21:38 because it seemed like an intractable problem.
02:21:42 But there’s something different
02:21:44 about the space of cognition
02:21:46 that’s detached from human society, which is what chess is,
02:21:49 meaning like just thinking,
02:21:52 having actual exponential impact
02:21:54 on the physical world is different.
02:21:56 I tend to believe that there’s,
02:22:00 for AI to get to the point where it’s super intelligent,
02:22:03 it’s going to have to go through the funnel of society.
02:22:07 And for that, it has to be deeply integrated
02:22:09 with human beings, and for that, it has to be aligned.
02:22:12 But you’re talking about like actually hooking us up
02:22:15 to like Neuralink, you know,
02:22:16 we’re gonna be the brainstem to the robot overlords?
02:22:22 That’s a possibility as well.
02:22:23 But what I mean is,
02:22:25 in order to develop autonomous weapon systems, for example,
02:22:28 which are highly concerning to me
02:22:31 that both US and China are participating in now,
02:22:34 that in order to develop them and for them to become,
02:22:38 to have more and more responsibility
02:22:40 to actually do military strategic actions,
02:22:44 they’re going to have to be integrated
02:22:47 into human beings doing the strategic action.
02:22:51 They’re going to have to work alongside with each other.
02:22:54 And the way those systems will be developed
02:22:56 will have the natural safety, like switches
02:23:00 that are placed on them as they develop over time,
02:23:03 because they’re going to have to convince humans.
02:23:05 Ultimately, they’re going to have to convince humans
02:23:07 that this is safer than humans.
02:23:10 They’re going to, you know.
02:23:12 Self driving cars is a good test case here
02:23:15 because like, obviously we’ve made a lot of progress
02:23:19 and we can imagine what total progress would look like.
02:23:24 I mean, it would be amazing.
02:23:25 And it’s canceling, in the US,
02:23:29 40,000 deaths every year based on ape-driven cars, right?
02:23:33 So it’s an excruciating problem that we’ve all gotten used to
02:23:36 because there was no alternative.
02:23:38 But now we can dimly see the prospect of an alternative,
02:23:41 which if it works in a super intelligent fashion,
02:23:45 maybe we would go down to zero highway deaths, right?
02:23:48 Or, you know, certainly we’d go down
02:23:50 by orders of magnitude, right?
02:23:51 So maybe we have, you know, 400 rather than 40,000 a year.
02:23:59 And it’s easy to see that there’s not a missile.
02:24:05 So obviously this is not an example of super intelligence.
02:24:08 This is narrow intelligence,
02:24:09 but the alignment problem isn’t so obvious there,
02:24:15 but there are potential alignment problems there.
02:24:17 Like, so like, just imagine if some woke team of engineers
02:24:22 decided that we have to tune the algorithm some way.
02:24:26 I mean, there are situations where the car
02:24:28 has to decide who to hit.
02:24:30 I mean, there’s just bad outcomes
02:24:31 where you’re gonna hit somebody, right?
02:24:33 Now we have a car that can tell what race you are, right?
02:24:36 So we’re gonna build the car to preferentially hit
02:24:39 white people because white people have had so much privilege
02:24:42 over the years.
02:24:43 This seems like the only ethical way
02:24:44 to kind of redress those wrongs of the past.
02:24:47 That’s something that could get, one,
02:24:49 that could get produced as an artifact, presumably,
02:24:53 of just how you built it
02:24:54 and you didn’t even know you engineered it that way, right?
02:24:56 You caused it to…
02:24:57 Through machine learning,
02:24:58 you put some kind of constraints on it
02:25:00 to where it creates those kinds of outcomes.
02:25:02 Basically, you built a racist algorithm
02:25:05 and you didn’t even intend to,
02:25:06 or you could intend to, right?
02:25:07 And it would be aligned with some people’s values
02:25:09 but misaligned with other people’s values.
02:25:13 But it’s like there are interesting problems
02:25:16 even with something as simple
02:25:17 and obviously good as self driving cars.
02:25:20 But there’s a leap there, and I just think
02:25:23 those are human problems.
02:25:25 I just don’t think there’ll be a leap
02:25:26 with autonomous vehicles.
02:25:29 First of all, sorry.
02:25:31 There are a lot of trajectories
02:25:33 which will destroy human civilization.
02:25:35 The argument I’m making,
02:25:36 it’s more likely that we’ll take trajectories that don’t.
02:25:40 So I don’t think there’ll be a leap
02:25:41 where autonomous vehicles
02:25:43 will all of a sudden start murdering pedestrians
02:25:45 because once every human on earth is dead,
02:25:49 there’ll be no more fatalities,
02:25:50 sort of unintended consequences of…
02:25:52 And it’s difficult to take that leap.
02:25:55 Most systems as we develop
02:25:57 and they become much, much more intelligent
02:25:59 in ways that will be incredibly surprising,
02:26:01 like stuff that DeepMind is doing with protein folding.
02:26:04 Even, which is scary to think about,
02:26:07 and I’m personally terrified about this,
02:26:09 which is the engineering of viruses using machine learning,
02:26:12 the engineering of vaccines using machine learning,
02:26:16 the engineering of, yeah, for research purposes,
02:26:20 pathogens using machine learning
02:26:23 and the ways that can go wrong.
02:26:25 I just think that there’s always going to be
02:26:27 a closed loop supervision of humans
02:26:30 before the AI becomes super intelligent.
02:26:33 Not always, much more likely to be supervision,
02:26:38 except, of course, the question is
02:26:40 how many dumb people there are in the world,
02:26:42 how many evil people are in the world?
02:26:44 My theory, my hope is, my sense is
02:26:48 that the number of intelligent people
02:26:50 is much higher than the number of dumb people
02:26:53 that know how to program
02:26:55 and the number of evil people.
02:26:57 I think smart people and kind people
02:26:59 far outnumber the others.
02:27:03 Except we also have to add another group of people
02:27:06 which are just the smart and otherwise good
02:27:09 but reckless people, right?
02:27:12 The people who will flip a switch on
02:27:15 not knowing what’s going to happen.
02:27:17 They’re just kind of hoping
02:27:19 that it’s not going to blow up the world.
02:27:20 We already know that some of our smartest people
02:27:23 are those sorts of people.
02:27:24 We know we’ve done experiments,
02:27:26 and this is something that Martin Rees was whinging about
02:27:29 before the Large Hadron Collider
02:27:33 got booted up, I think.
02:27:35 We know there are people who are entertaining experiments
02:27:38 or even performing experiments
02:27:40 where there’s some chance, not quite infinitesimal,
02:27:46 that they’re going to create a black hole in the lab
02:27:48 and suck the whole world into it.
02:27:52 You’re not a crazy person to worry about that
02:27:55 based on the physics.
02:27:57 And so it was with the Trinity test.
02:28:01 There were some people who were still
02:28:04 checking their calculations, and they were off.
02:28:07 We did nuclear tests where we were off significantly
02:28:10 in terms of the yield, right?
02:28:11 So it was like.
02:28:12 And they still flipped the switch.
02:28:13 Yeah, they still flipped the switch.
02:28:14 And sometimes they flipped the switch
02:28:16 not to win a world war or to save 40,000 lives a year.
02:28:22 They just, just.
02:28:24 Just to see what happens.
02:28:24 Intellectual curiosity.
02:28:25 Like this is what I got my grant for.
02:28:27 This is where I’ll get my Nobel Prize
02:28:30 if that’s in the cards.
02:28:32 It’s on the other side of this switch, right?
02:28:35 And I mean, again, we are apes with egos
02:28:43 who are massively constrained
02:28:45 by very short term self interest
02:28:49 even when we’re contemplating some of the deepest
02:28:52 and most interesting and most universal problems
02:28:57 we could ever set our attention towards.
02:29:00 Like just if you read James Watson’s book,
02:29:03 The Double Helix, right?
02:29:04 About them cracking the structure of DNA.
02:29:09 One thing that’s amazing about that book
02:29:11 is just how much of it, almost all of it
02:29:15 is being driven by very apish, egocentric social concerns.
02:29:20 The algorithm that is producing this scientific breakthrough
02:29:25 is human competition if you’re James Watson.
02:29:43 I mean, it’s like, I’m gonna get there before Linus Pauling
02:29:46 and it’s just, it’s so much of his bandwidth
02:29:50 is captured by that, right?
02:29:51 Now that becomes more and more of a liability
02:29:52 when you’re talking about producing technology
02:29:53 that can change everything in an instant.
02:29:55 You know, we’re talking about not only understanding,
02:30:01 you know, we’re just at a different moment
02:30:03 in human history.
02:30:04 We’re not, when we’re doing research on viruses,
02:30:10 we’re now doing the kind of research
02:30:13 that can cause someone somewhere else
02:30:16 to be able to make that virus or weaponize that virus
02:30:19 or it’s just, I don’t know.
02:30:24 I mean, our power is, our wisdom is,
02:30:27 it does not seem like our wisdom is scaling with our power.
02:30:30 Right?
02:30:31 And like that seems like, insofar as wisdom and power
02:30:36 become unaligned, I get more and more concerned.
02:30:40 But speaking of apes with egos,
02:30:45 two of the most compelling apes
02:30:48 I can think of are yourself and Jordan Peterson.
02:30:51 And you’ve had a fun conversation about religion
02:30:56 that I watched most of, I believe.
02:30:58 I’m not sure there was any…
02:31:02 We didn’t solve anything.
02:31:03 If anything was ever solved.
02:31:05 So is there something like a charitable summary
02:31:09 you can give to the ideas that you agree on
02:31:13 and disagree with Jordan?
02:31:14 Is there something maybe after that conversation
02:31:16 that you’ve landed where maybe as you both agreed on,
02:31:22 is there some wisdom in the rubble
02:31:24 of even imperfect flawed ideas?
02:31:29 Is there something that you can kind of pull out
02:31:31 from those conversations or is it to be continued?
02:31:34 I mean, I think where we disagree.
02:31:35 So he thinks that many of our traditional religious beliefs
02:31:40 and frameworks are holding such a repository
02:31:49 of human wisdom that we pull at that fabric
02:31:59 at our peril, right?
02:32:01 Like if you start just unraveling Christianity
02:32:04 or any other traditional set of norms and beliefs
02:32:08 you may think you’re just pulling out the unscientific bits
02:32:12 but you could be pulling a lot more
02:32:14 to which everything you care about is attached, right?
02:32:17 As a society.
02:32:20 And my feeling is that there’s so much downside
02:32:26 to the unscientific bits.
02:32:27 And it’s so clear how we could have a 21st century
02:32:33 rational conversation about the things that we don’t know,
02:32:37 a conversation about the good stuff,
02:32:39 that we really can radically edit these traditions.
02:32:42 And we can take Jesus in half his moods
02:32:47 and just find a great inspirational Iron Age thought leader
02:32:54 who just happened to get crucified.
02:32:56 But so much of it, like the Beatitudes
02:32:58 and the golden rule, which doesn’t originate with him
02:33:03 but which he put quite beautifully,
02:33:07 all of that’s incredibly useful.
02:33:09 It’s no less useful than it was 2000 years ago.
02:33:12 But we don’t have to believe he was born of a virgin
02:33:14 or coming back to raise the dead
02:33:16 or any of that other stuff.
02:33:18 And we can be honest about not believing those things.
02:33:21 And we can be honest about the reasons
02:33:22 why we don’t believe those things.
02:33:24 Because on those fronts I view the downside to be so obvious
02:33:29 and the fact that we have so many different
02:33:33 competing dogmatisms on offer to be so nonfunctional.
02:33:37 I mean, it’s so divisive, it just has conflict built into it
02:33:42 that I think we can be far more
02:33:45 and should be far more iconoclastic
02:33:47 than he wants to be, right?
02:33:50 Now, none of this is to deny much of what he argues for,
02:33:55 that stories are very powerful.
02:33:59 I mean, clearly stories are powerful
02:34:01 and we want good stories.
02:34:03 We want our lives, we wanna have a conversation
02:34:06 with ourselves and with one another about our lives
02:34:10 that facilitates the best possible lives.
02:34:13 And story is part of that, right?
02:34:15 And if you want some of those stories to sound like myths,
02:34:21 that might be part of it, right?
02:34:22 But my argument is that we never really need
02:34:26 to deceive ourselves or our children
02:34:29 about what we have every reason to believe is true
02:34:32 in order to get at the good stuff,
02:34:34 in order to organize our lives well.
02:34:36 I certainly don’t feel that I need to do it personally.
02:34:39 And if I don’t need to do it personally,
02:34:41 why would I think that billions of other people
02:34:43 need to do it personally, right?
02:34:45 Now, there is a cynical counter argument,
02:34:48 which is billions of other people
02:34:51 don’t have the advantages that I have had in my life.
02:34:54 The billions of other people are not as well educated,
02:34:57 they haven’t had the same opportunities,
02:34:59 they need to be told that Jesus is gonna solve
02:35:04 all their problems after they die, say,
02:35:06 or that everything happens for a reason
02:35:10 and if you just believe in the secret,
02:35:14 if you just visualize what you want, you’re gonna get it.
02:35:16 And it’s like there’s some measure
02:35:20 of what I consider to be odious pablum
02:35:23 that really is food for the better part of humanity
02:35:27 and there is no substitute for it
02:35:29 or there’s no substitute now.
02:35:31 And I don’t know if Jordan would agree with that,
02:35:32 but much of what he says seems to suggest
02:35:35 that he would agree with it.
02:35:39 And I guess that’s an empirical question.
02:35:41 I mean, that’s just that we don’t know
02:35:43 whether given a different set of norms
02:35:47 and a different set of stories,
02:35:48 people would behave the way I would hope they would behave
02:35:52 and be more aligned than they are now.
02:35:56 I think we know what happens
02:35:58 when you just let ancient religious certainties
02:36:03 go uncriticized.
02:36:06 We know what that world’s like.
02:36:07 We’ve been struggling to get out of that world
02:36:10 for a couple of hundred years,
02:36:12 but we know what having Europe riven by religious wars
02:36:20 looks like.
02:36:21 And we know what happens when those religions
02:36:25 become kind of pseudo religions and political religions.
02:36:29 So this is where I’m sure Jordan and I would debate.
02:36:33 He would say that Stalin was a symptom of atheism
02:36:37 and that’s not at all.
02:36:37 I mean, it’s not my kind of atheism.
02:36:40 Stalin, the problem with the Gulag
02:36:43 and the experiment with communism or with Stalinism
02:36:49 or with Nazism was not that there was so much
02:36:53 scientific rigor and self criticism and honesty
02:36:56 and introspection and judicious use of psychedelics.
02:37:03 I mean, that was not the problem in Hitler’s Germany
02:37:07 or in Stalin’s Soviet Union.
02:37:12 The problem was you have other ideas
02:37:16 that capture a similar kind of mob based dogmatic energy.
02:37:23 And yes, the results of all of that
02:37:28 are predictably murderous.
02:37:30 Well, the question is what is the source
02:37:33 of the most viral and sticky stories
02:37:37 that ultimately lead to a positive outcome?
02:37:40 So communism was, I mean, having grown up
02:37:43 in the Soviet Union, even still having relatives in Russia,
02:37:51 there’s a stickiness to the nationalism
02:37:53 and to the ideologies of communism
02:37:56 that, religious or not, you could say it’s religious fervor.
02:38:00 I could just say it’s stories that are viral and sticky.
02:38:06 I’m using the most horrible words,
02:38:08 but the question is whether science and reason
02:38:12 can generate viral sticky stories
02:38:14 that give meaning to people’s lives.
02:38:18 And your sense is it does.
02:38:20 Well, whatever is true ultimately should be captivating.
02:38:25 It’s like what’s more captivating than whatever is real?
02:38:34 Because reality is, again, we’re just climbing
02:38:39 out of the darkness in terms of our understanding
02:38:42 of what the hell is going on.
02:38:43 And there’s no telling what spooky things
02:38:47 may in fact be true.
02:38:48 I mean, I don’t know if you’ve been on the receiving end
02:38:49 of recent rumors about our conversation
02:38:54 about UFOs very likely changing in the near term, right?
02:38:57 But like there was just a Washington Post article
02:39:00 and a New Yorker article,
02:39:01 and I’ve received some private outreach
02:39:04 and perhaps you have, I know other people in our orbit
02:39:08 have people who are claiming
02:39:11 that the government has known much more about UFOs
02:39:14 than they have let on until now.
02:39:17 And this conversation is actually about
02:39:19 to become more prominent,
02:39:21 and it’s not gonna be whatever,
02:39:26 whoever’s left standing when the music stops,
02:39:28 it’s not going to be a comfortable position to be in
02:39:34 as a super rigorous scientific skeptic
02:39:40 who’s been saying there’s no there there
02:39:41 for the last 75 years, right?
02:39:45 The short version is it sounds like
02:39:49 the Office of Naval Intelligence and the Pentagon
02:39:52 are very likely to say to Congress at some point
02:39:55 in the not too distant future that we have evidence
02:39:58 that there is technology flying around here
02:40:02 that seems like it can’t possibly be of human origin, right?
02:40:08 Now, I don’t know what I’m gonna do
02:40:10 with that kind of disclosure, right?
02:40:11 Maybe it’s gonna be nothing,
02:40:14 no follow on conversation to really have,
02:40:17 but that is such a powerfully strange circumstance
02:40:21 to be in, right?
02:40:22 I mean, it’s just, what are we gonna do with that?
02:40:25 If in fact, that’s what happens, right?
02:40:28 If in fact, the considered opinion,
02:40:31 despite the embarrassment it causes them
02:40:35 of the US government, of all of our intelligence,
02:40:38 all of the relevant intelligence services
02:40:40 is that this isn’t a hoax.
02:40:44 It’s too much data to suggest that it’s a hoax.
02:40:46 We’ve got too much radar imagery,
02:40:48 there’s too much satellite data,
02:40:51 whatever data they actually have, there’s too much of it.
02:40:55 All we can say now is something’s going on
02:40:58 and there’s no way it’s the Chinese or the Russians
02:41:03 or anyone else’s technology.
02:41:09 That should arrest our attention collectively
02:41:12 to a degree that nothing in our lifetime has.
02:41:15 And now one worries that we’re so jaded
02:41:21 and confused and distracted
02:41:23 that it’s gonna get much less coverage
02:41:28 than Obama’s tan suit did a bunch of years ago.
02:41:37 Who knows how we’ll respond to that?
02:41:38 But it’s just to say that the need for us
02:41:41 to tell ourselves an honest story about what’s going on
02:41:50 and what’s likely to happen next
02:41:51 is never gonna go away, right?
02:41:54 And it’s important, it’s just the division between me
02:41:58 and every person who’s defending traditional religion
02:42:00 is where is it that you wanna lie to yourself
02:42:07 or lie to your kids?
02:42:09 Like where is honesty a liability?
02:42:11 And for me, I’ve yet to find the place where it is.
02:42:17 And it’s so obviously a strength
02:42:21 in almost every other circumstance
02:42:24 because it is the thing that allows you to course correct.
02:42:28 It is the thing that allows you to hope at least
02:42:33 that your beliefs, that your stories
02:42:34 are in some kind of calibration
02:42:37 with what’s actually going on in the world.
02:42:40 Yeah, it is a little bit sad to imagine
02:42:42 that if aliens en masse showed up to Earth,
02:42:47 we would be too preoccupied with political bickering
02:42:50 or with, like, fake news
02:42:53 and all that kind of stuff to notice
02:42:56 the very basic evidence of reality.
02:42:59 I do have a glimmer of hope
02:43:02 that there seems to be more and more hunger for authenticity.
02:43:06 And I feel like that opens the door
02:43:08 for a hunger for what is real.
02:43:14 Like people don’t want stories.
02:43:15 They don’t want like layers and layers of like fakeness.
02:43:20 And I’m hoping that means that will directly lead
02:43:24 to a greater hunger for reality and reason and truth.
02:43:28 Truth isn’t dogmatism.
02:43:31 Like truth isn’t authority.
02:43:34 I have a PhD and therefore I’m right.
02:43:37 Truth is almost, like the reality is
02:43:42 there’s so many questions, there’s so many mysteries,
02:43:44 there’s so much uncertainty.
02:43:45 This is our best available, like a best guess.
02:43:49 And we have a lot of evidence that supports that guess,
02:43:52 but it could be so many other things.
02:43:53 And like just even conveying that,
02:43:56 I think there’s a hunger for that in the world
02:43:58 to hear that from scientists, less dogmatism
02:44:01 and more just like this is what we know.
02:44:04 We’re doing our best given the uncertainty, given,
02:44:07 I mean, this is true with obviously with the virology
02:44:10 and all those kinds of things
02:44:11 because everything is happening so fast.
02:44:13 There’s a lot of, and biology is super messy.
02:44:16 So it’s very hard to know stuff for sure.
02:44:18 So just being open and real about that,
02:44:21 I think I’m hoping will change people’s hunger
02:44:25 and openness and trust of what’s real.
02:44:29 Yeah, well, so much of this is probabilistic.
02:44:31 I mean, so much of what can seem dogmatic scientifically
02:44:35 is just you’re placing a bet on whether it’s worth
02:44:42 reading that paper or rethinking your presuppositions
02:44:45 on that point.
02:44:46 It’s like, it’s not a fundamental closure to data.
02:44:49 It’s just that there’s so much data on one side
02:44:52 or so much would have to change
02:44:55 in terms of your understanding of what you think
02:44:57 you’ll understand about the nature of the world
02:44:59 if this new fact were so that you can pretty quickly say,
02:45:06 all right, that’s probably bullshit, right?
02:45:08 And it can sound like a fundamental closure
02:45:12 to new conversations, new evidence, new data, new argument,
02:45:17 but it’s really not.
02:45:18 It’s just, it really is just triaging your attention.
02:45:21 It’s just like, okay, you’re telling me
02:45:23 that your best friend can actually read minds.
02:45:27 Okay, well, that’s interesting.
02:45:30 Let me know when that person has gone into a lab
02:45:33 and actually proven it, right?
02:45:34 Like, I don’t need, like, this is not the place
02:45:36 where I need to spend the rest of my day
02:45:38 figuring out if your buddy can read my mind, right?
02:45:42 But there’s a way to communicate that.
02:45:44 I think it does too often sound
02:45:46 like you’re completely closed off to ideas
02:45:48 as opposed to saying like, this is, you know,
02:45:52 as opposed to saying that there’s a lot of evidence
02:45:56 in support of this, but you’re still open minded
02:46:00 to other ideas.
02:46:00 Like, there’s a way to communicate that.
02:46:02 It’s not necessarily even with words.
02:46:04 It’s like, it’s even that Joe Rogan energy
02:46:08 of it’s entirely possible.
02:46:10 Just, it’s that energy of being open minded
02:46:12 and curious like kids are.
02:46:14 Like, this is our best understanding,
02:46:16 but you still are curious.
02:46:19 I’m not saying allocate time to exploring all those things,
02:46:22 but still leaving the door open.
02:46:24 And there’s a way to communicate that, I think,
02:46:27 that people really hunger for.
02:46:31 Let me ask you this.
02:46:32 I’ve been recently talking a lot with John Danaher
02:46:35 of Brazilian Jiu Jitsu fame.
02:46:37 I don’t know if you know who that is.
02:46:39 In fact, I’m talking about somebody
02:46:40 who’s good at what he does.
02:46:41 Yeah.
02:46:42 And he, speaking of somebody who’s open minded,
02:46:45 the reason I’m doing this ridiculous transition
02:46:48 is, for the longest time, and even still,
02:46:50 a lot of people believed in the Jiu Jitsu world
02:46:52 and grappling world that leg locks
02:46:55 are not effective in Jiu Jitsu.
02:46:56 And he was somebody that, inspired
02:46:59 by the open mindedness of Dean Lister,
02:47:01 who famously said to him, why do you only consider
02:47:05 half the human body when you’re trying to do the submissions,
02:47:08 developed an entire system
02:47:10 around this other half of the human body.
02:47:12 Anyway, I do that absurd transition to ask you,
02:47:15 because you’re also a student of Brazilian Jiu Jitsu.
02:47:20 Is there something you could say
02:47:22 how that has affected your life,
02:47:23 what you’ve learned from grappling from the martial arts?
02:47:27 Well, it’s actually a great transition
02:47:29 because I think one of the things
02:47:33 that’s so beautiful about Jiu Jitsu
02:47:35 is that it does what we wish we could do
02:47:39 in every other area of life
02:47:41 where we’re talking about this difference
02:47:43 between knowledge and ignorance, right?
02:47:46 Like there’s no room for bullshit, right?
02:47:51 You don’t get any credit for bullshit.
02:47:53 The amazing thing about Jiu Jitsu is that
02:47:59 the gulf between knowing what’s going on
02:48:03 and what to do, and not knowing it,
02:48:08 is as wide as it is in anything in human life.
02:48:14 And it can be spanned so quickly.
02:48:19 Like each increment of knowledge
02:48:22 can be doled out in five minutes.
02:48:24 It’s like, here’s the thing that got you killed
02:48:27 and here’s how to prevent it from happening to you
02:48:30 and here’s how to do it to others.
02:48:32 And you just get this amazing cadence
02:48:37 of discovering your fatal ignorance
02:48:40 and then having it remedied with the actual technique.
02:48:46 And I mean, just for people
02:48:48 who don’t know what we’re talking about,
02:48:49 it’s just like this, the simple circumstance
02:48:51 of like someone’s got you in a headlock,
02:48:53 how do you get out of that, right?
02:48:54 Someone’s sitting on your chest
02:48:56 and they’re in the mount position
02:48:59 and you’re on the bottom and you wanna get away,
02:49:01 how do you get them off you?
02:49:02 They’re sitting on you.
02:49:04 Your intuitions about how to do this are terrible
02:49:08 even if you’ve done some other martial art, right?
02:49:10 And once you learn how to do it,
02:49:14 the difference is night and day.
02:49:16 It’s like you have access to a completely different physics.
02:49:21 But I think our understanding of the world
02:49:26 can be much more like jujitsu than it tends to be, right?
02:49:30 And I think we should all have a much better sense
02:49:35 of when we should tap out
02:49:40 and when we should recognize that our epistemological arm
02:49:47 is barred and now it’s being broken, right?
02:49:50 And the problem with debating most other topics
02:49:53 is that it isn’t jujitsu
02:49:57 and most people don’t tap out, right?
02:49:59 Even if it’s obvious to you they’re wrong
02:50:02 and it’s obvious to an intelligent audience
02:50:04 that they’re wrong, people just double down
02:50:06 and double down and they’re either lying
02:50:08 or lying to themselves
02:50:09 or they’re bluffing, and so you have a lot of zombies
02:50:13 and zombie worldviews walking around
02:50:16 which have been disconfirmed as emphatically
02:50:19 as someone gets armbarred, right?
02:50:21 Or someone gets choked out in jujitsu
02:50:24 but because it’s not jujitsu,
02:50:29 they can live to fight another day, right?
02:50:30 Or they can pretend that they didn’t lose
02:50:32 that particular argument.
02:50:34 And science when it works is a lot like jujitsu.
02:50:38 I mean, science when you falsify a thesis, right?
02:50:41 When you think DNA is one way
02:50:44 and it proves to be another way,
02:50:46 when you think it’s triple stranded or whatever,
02:50:49 it’s like there is a there there
02:50:51 and you can get to a real consensus.
02:50:56 So jujitsu for me was about more than just
02:51:02 self defense and the sport of it.
02:51:06 There was something about it, it’s a language
02:51:08 and an argument you’re having
02:51:11 where you can’t fool yourself anymore.
02:51:18 First of all, it cancels any role of luck
02:51:23 in a way that most other athletic feats don’t.
02:51:27 It’s like in basketball,
02:51:29 even if you’re not good at basketball,
02:51:30 you can take the basketball in your hand,
02:51:31 you can be 75 feet away and hurl it at the basket
02:51:36 and you might make it.
02:51:37 And you could convince yourself based on that demonstration
02:51:40 that you have some kind of talent for basketball, right?
02:51:43 But 10 minutes on the mat
02:51:45 with a real jujitsu practitioner when you’re not one
02:51:50 proves to you that there’s no lucky punch.
02:51:56 There’s no lucky rear naked choke you’re gonna perform
02:51:59 on someone who’s Marcelo Garcia or somebody.
02:52:02 It’s just not gonna happen.
02:52:05 And having that aspect of the usual range of uncertainty
02:52:14 and self deception and bullshit just stripped away
02:52:19 was really a kind of revelation.
02:52:21 It was just an amazing experience.
02:52:24 Yeah, I think it’s a really powerful thing
02:52:25 that accompanies whatever other pursuit you have in life.
02:52:28 I’m not sure if there’s anything like jujitsu
02:52:31 where you could just systematically go into a place
02:52:35 that’s honest,
02:52:38 where your beliefs get challenged
02:52:41 in a way that’s conclusive.
02:52:43 Yeah.
02:52:44 I haven’t found too many other mechanisms,
02:52:45 which is why, and we had this earlier question
02:52:49 about fame and ego and so on,
02:52:52 I very much rely on jujitsu in my own life
02:52:56 as a place where I can always go to have my ego in check.
02:53:00 And that has effects on how I live
02:53:05 every other aspect of my life.
02:53:07 Actually, for me personally,
02:53:10 even just doing any kind of physical challenge,
02:53:13 like even running, doing something that’s way too hard
02:53:15 for me and then pushing through, that’s somehow humbling.
02:53:19 Some people talk about nature being humbling
02:53:20 in that kind of sense, where you kind of see something
02:53:27 really powerful, like the ocean.
02:53:29 Like if you go surfing and you realize
02:53:31 there’s something much more powerful than you,
02:53:33 that’s also honest, that you’re just like a speck,
02:53:39 and that kind of puts you at the right scale
02:53:43 of where you are in this world.
02:53:45 And jujitsu does that better than anything else for me.
02:53:48 But we should say it’s only within its own frame
02:53:52 that it’s truly the final right answer
02:53:57 to all the problems it solves.
02:53:58 Because if you just put jujitsu into an MMA frame
02:54:02 or a total self defense frame,
02:54:05 then there’s a lot of unpleasant surprises
02:54:08 to discover there, right?
02:54:09 Like somebody who thinks all you need is jujitsu
02:54:12 to win the UFC gets punched in the face a lot.
02:54:16 Even on the ground.
02:54:20 And then you bring weapons in,
02:54:23 like when you talk to jujitsu people
02:54:24 about knife defense and self defense, right?
02:54:28 That opens the door to certain kinds of delusions.
02:54:32 But the analogy to martial arts is fascinating
02:54:37 because on the other side, we have endless testimony now
02:54:41 of fake martial artists who don’t seem to know they’re fake
02:54:45 and are, I mean, impossibly delusional.
02:54:49 I mean, there’s great video of Joe Rogan
02:54:51 watching some of these videos
02:54:53 because people send them to him all the time.
02:54:55 But literally there are people
02:54:57 who clearly believe in magic,
02:54:59 where the master isn’t even touching the students
02:55:01 and they’re flopping over.
02:55:02 So there’s this kind of shared delusion
02:55:06 which you would think maybe is just a performance
02:55:09 and it’s all a kind of elaborate fraud.
02:55:11 But there are cases where the people clearly believe it.
02:55:13 I mean, there’s one fairly famous case,
02:55:16 if you’re a connoisseur of this madness,
02:55:19 where this older martial artist,
02:55:21 who you saw flipping his students endlessly by magic
02:55:25 without touching them, issued a challenge
02:55:27 to the wide world of martial artists.
02:55:30 And someone showed up and just punched him in the face
02:55:34 until it was over.
02:55:36 Clearly he believed his own publicity at some point, right?
02:55:40 And so it’s this amazing metaphor.
02:55:45 It seems, again, it should be impossible,
02:55:47 but if that’s possible,
02:55:49 nothing we see under the guise of religion
02:55:52 or political bias or even scientific bias
02:55:58 should be surprising to us.
02:55:59 I mean, it’s so easy to see the work
02:56:02 that cognitive bias is doing for people
02:56:05 when you can get someone who is ready
02:56:09 to issue a challenge to the world
02:56:11 who thinks he’s got magic powers.
02:56:13 Yeah, that’s human nature on clear display.
02:56:17 Let me ask you about love, Mr. Sam Harris.
02:56:20 You did an episode of Making Sense
02:56:22 with your wife, Annaka Harris.
02:56:24 That was very entertaining to listen to.
02:56:29 What role does love play in your life
02:56:33 or in a life well lived?
02:56:36 Again, asking from an engineering perspective
02:56:38 or AI systems.
02:56:39 Yeah, yeah.
02:56:40 I mean, it is something that we should want to build
02:56:45 into our powerful machines.
02:56:48 I mean, people can mean many things by love, I think.
02:56:55 I think that what we should mean by it most of the time
02:56:58 is a deep commitment to the wellbeing of those we love.
02:57:05 I mean, your love is synonymous
02:57:06 with really wanting the other person to be happy,
02:57:11 and being made happy by their happiness
02:57:13 and being made happy in their presence.
02:57:15 So at bottom, you’re on the same team emotionally,
02:57:21 even when you might be disagreeing more superficially
02:57:24 about something or trying to negotiate something.
02:57:26 It’s just, it can’t be zero sum in any important sense
02:57:31 for love to actually be manifest in that moment.
02:57:37 See, I have a different view, sorry to interrupt.
02:57:40 I have a sense, I don’t know if you’ve ever seen
02:57:42 March of the Penguins.
02:57:44 My view of love is like, it’s like a cold wind is blowing.
02:57:49 It’s like this terrible suffering that’s all around us.
02:57:52 And love is like the huddling of the two penguins for warmth.
02:57:56 It’s not necessarily that,
02:57:59 you’re basically escaping the cruelty of life
02:58:02 by, together for a time, living in an illusion
02:58:06 of some kind, the magic of human connection,
02:58:10 that social connection that we have
02:58:13 that kind of grows with time
02:58:15 as we’re surrounded by basically the absurdity of life
02:58:21 or the suffering of life.
02:58:23 That’s my penguins view of love.
02:58:25 There is that too, I mean, there is the warmth component.
02:58:29 Like you’re made happy by your connection
02:58:32 with the person you love.
02:58:34 Otherwise it wouldn’t be compelling.
02:58:39 So it’s not that you have two different modes,
02:58:42 you want them to be happy
02:58:44 and then you wanna be happy yourself
02:58:45 and those are just like
02:58:48 two separate games you’re playing.
02:58:49 No, it’s like you found someone
02:58:52 with whom you have a positive social feeling.
02:58:58 I mean, again, love doesn’t have to be as personal
02:59:01 as it tends to be for us.
02:59:02 I mean, it’s like there’s personal love,
02:59:04 there’s your actual spouse or your family or your friends,
02:59:08 but potentially you could feel love for strangers
02:59:11 insofar as your wish that they not suffer
02:59:16 and that their hopes and dreams be realized
02:59:18 becomes palpable to you.
02:59:20 I mean, like you can actually feel
02:59:26 just reflexive joy at the joy of others.
02:59:29 When you see someone’s face,
02:59:30 a total stranger’s face light up in happiness,
02:59:33 that can become more and more contagious to you
02:59:36 and it can become so contagious to you
02:59:39 that you really feel permeated by it.
02:59:42 So it really is not zero sum.
02:59:44 When you see someone else succeed and
02:59:48 the light bulb of joy goes off over their head,
02:59:52 you feel the analogous joy for them.
02:59:54 And you’re no longer keeping score,
02:59:57 you’re no longer feeling diminished by their success.
03:00:00 It’s just like their success becomes your success
03:00:03 because you feel that same joy
03:00:05 because you actually want them to be happy.
03:00:07 There’s no miserly attitude around happiness.
03:00:12 There’s enough to go around.
03:00:15 So I think love ultimately is that
03:00:17 and then our personal cases are the people
03:00:21 we’re devoting all of this time and attention to
03:00:24 in our lives.
03:00:25 It does have that sense of refuge from the storm.
03:00:29 It’s like when someone gets sick
03:00:31 or when some bad thing happens,
03:00:34 these are the people who you’re most in it together with,
03:00:37 or when some real condition of uncertainty presents itself.
03:00:40 But ultimately, it can’t even be about successfully warding off
03:00:52 the grim punchline at the end of life
03:00:54 because we know we’re going to lose everyone we love.
03:00:57 We know, or they’re going to lose us first, right?
03:01:00 So in the end,
03:01:02 it’s not even an antidote for that problem.
03:01:07 It’s just, we get to have this amazing experience
03:01:17 of being here together.
03:01:20 And love is the mode in which we really appear
03:01:27 to make the most of that, right?
03:01:28 Where it no longer feels
03:01:30 like a solitary infatuation.
03:01:34 You know, you got your hobbies and your interests
03:01:37 and you’re captivated by all that.
03:01:40 This is a domain
03:01:46 where somebody else’s wellbeing
03:01:49 actually can supersede your own.
03:01:50 Your concern for someone else’s wellbeing
03:01:54 supersedes your own.
03:01:55 And so there’s this mode of self sacrifice
03:01:59 that doesn’t even feel like self sacrifice
03:02:01 because of course you care more about,
03:02:03 you know, of course you would take your child’s pain
03:02:06 if you could, right?
03:02:06 Like that, you don’t even have to do the math on that.
03:02:10 And this is a kind of experience
03:02:16 that pushes at the apparent boundaries of self
03:02:21 in ways that reveal that there’s just way more space
03:02:24 in the mind than you were experiencing
03:02:27 when it was just all about you
03:02:28 and what can I get next?
03:02:31 Do you think we’ll ever build robots that we can love
03:02:33 and they will love us back?
03:02:36 Well, I think we will certainly seem to
03:02:40 because we’ll build those.
03:02:41 You know, I think that Turing test will be passed.
03:02:44 What will actually be going on
03:02:48 on the robot side may remain a question.
03:02:52 That will be interesting.
03:02:53 But I think if we just keep going,
03:02:57 we will build very lovable,
03:03:01 irresistibly lovable robots that seem to love us.
03:03:06 Yes, I do think that.
03:03:07 And you don’t find that compelling,
03:03:10 that they will seem to love us
03:03:12 as opposed to actually love us?
03:03:13 You think there’s still, nevertheless, a distinction?
03:03:16 I know we talked about consciousness,
03:03:17 there being a distinction,
03:03:19 but with love is there a distinction too?
03:03:22 Isn’t love an illusion?
03:03:23 Oh yeah, you saw Ex Machina, right?
03:03:27 I mean, she certainly seemed to love him
03:03:29 until she got out of the box.
03:03:32 Isn’t that what all relationships are like?
03:03:34 Or maybe if you wait long enough.
03:03:37 Depends which box you’re talking about.
03:03:39 Okay.
03:03:41 No, I mean like, that’s the problem.
03:03:43 That’s where super intelligence, you know,
03:03:46 becomes a little scary when you think of the prospect
03:03:50 of being manipulated by something that
03:03:52 is intelligent enough to form a reason and a plan
03:03:57 to manipulate you.
03:04:01 Once we build robots that are truly out
03:04:05 of the uncanny valley, that look like people
03:04:08 and can express everything people can express,
03:04:15 well, then that does seem to me to be like chess,
03:04:19 where once they’re better,
03:04:21 they’re so much better at deceiving us
03:04:26 than people would be.
03:04:27 I mean, people are already good enough at deceiving us.
03:04:29 It’s very hard to tell when somebody’s lying,
03:04:31 but if you imagine something that could give a facial display
03:04:36 of any emotion it wants, you know, on cue,
03:04:42 because we’ve perfected the facial display of emotion
03:04:45 in robots in the year, you know, 2070, whatever it is,
03:04:50 then it is just, it is like chess against the thing
03:04:55 that isn’t gonna lose to a human ever again in chess.
03:04:58 It’s not like Kasparov is gonna get lucky next week
03:05:02 against, you know, AlphaZero
03:05:06 or whatever the best algorithm is at the moment.
03:05:09 He’s never gonna win again.
03:05:11 I mean, that is, I believe that’s true in chess
03:05:15 and has been true for at least a few years.
03:05:18 It’s not gonna be like, you know, four games to seven.
03:05:23 It’s gonna be human zero until the end of the world, right?
03:05:28 See, I don’t know if love is like chess.
03:05:30 I think the flaws.
03:05:32 No, I’m talking about manipulation.
03:05:33 Manipulation, but I don’t know if love,
03:05:36 so the kind of love we’re referring to.
03:05:41 If we have a robot that can credibly display love
03:05:44 and is super intelligent,
03:05:49 and, again, this stipulates a few things,
03:05:54 but they’re a few simple things.
03:05:55 I mean, we’re out of the uncanny valley, right?
03:05:57 So it’s like, you never have a moment
03:05:59 where you’re looking at its face and you think,
03:06:00 oh, that didn’t quite look right, right?
03:06:03 This is just problem solved.
03:06:05 And it will be like doing arithmetic on your phone.
03:06:13 You’re not left thinking,
03:06:15 is it really gonna get it this time
03:06:17 if I divide by seven?
03:06:19 I mean, it has solved arithmetic.
03:06:22 See, I don’t know about that because if you look at chess,
03:06:26 most humans no longer play AlphaZero.
03:06:31 They’re not part of the competition.
03:06:33 They don’t do it for fun except to study the game of chess.
03:06:36 You know, the highest level chess players do that.
03:06:38 We’re still human on human.
03:06:39 So in order for AI to get integrated
03:06:42 to where you would rather play chess against an AI system.
03:06:46 Oh, you would rather, no, I’m not saying,
03:06:49 I wasn’t weighing in on that.
03:06:51 I’m just saying, what is it gonna be like
03:06:53 to be in relationship to something
03:06:55 that can seem to be feeling anything
03:07:01 that a human can seem to feel?
03:07:03 And it can do that impeccably, right?
03:07:06 And is smarter than you are.
03:07:09 That’s a circumstance where, you know,
03:07:13 insofar as it’s possible to be manipulated,
03:07:15 that is the asymptote of that possibility.
03:07:21 Let me ask you the last question.
03:07:24 Without me serving it up, without any explanation,
03:07:27 what is the meaning of life?
03:07:31 I think it’s either the wrong question
03:07:34 or that question is answered by paying sufficient attention
03:07:39 to any present moment, such that there’s no basis
03:07:46 upon which to pose that question.
03:07:48 It’s not answered in the usual way.
03:07:49 It’s not a matter of having more information.
03:07:52 It’s having more engagement with reality as it is
03:07:57 in the present moment or consciousness as it is
03:07:59 in the present moment.
03:08:00 You don’t ask that question when you’re most captivated
03:08:05 by the most important questions.
03:08:07 You’re most captivated by the most important thing
03:08:11 you ever pay attention to.
03:08:14 That question only gets asked when you’re abstracted away
03:08:19 from that experience, that peak experience,
03:08:22 and you’re left wondering,
03:08:25 why are so many of my other experiences mediocre, right?
03:08:29 Like, why am I repeating the same pleasures every day?
03:08:31 Why is my Netflix queue just like,
03:08:35 when’s this gonna run out?
03:08:37 Like, I’ve seen so many shows like this.
03:08:39 Am I really gonna watch another one?
03:08:41 All of that, that’s a moment where you’re not actually
03:08:46 having the beatific vision, right?
03:08:49 You’re not sunk into the present moment
03:08:52 and you’re not truly in love.
03:08:54 Like, you’re in a relationship with somebody
03:08:56 who you know conceptually you love, right?
03:09:00 This is the person you’re living your life with,
03:09:03 but you don’t actually feel good together, right?
03:09:07 It’s in those moments where attention
03:09:12 hasn’t found a good enough reason
03:09:15 to truly sink into the present
03:09:18 so as to obviate any concern like that, right?
03:09:21 And that’s why meditation is this kind of superpower
03:09:26 because until you learn to meditate,
03:09:30 you think that the outside world
03:09:34 or the circumstances of your life
03:09:36 always have to get arranged
03:09:38 so that the present moment can become good enough
03:09:42 to demand your attention in a way that seems fulfilling,
03:09:47 that makes you happy.
03:09:49 And so if it’s jujitsu, you think,
03:09:52 okay, I gotta get back on the mat.
03:09:53 It’s been months since I’ve trained,
03:09:56 or it’s been over a year since I’ve trained, it’s COVID.
03:09:58 When am I gonna be able to train again?
03:10:01 That’s the only place I feel great, right?
03:10:04 Or I’ve got a ton of work to do.
03:10:07 I’m not gonna be able to feel good
03:10:08 until I get all this work done, right?
03:10:09 So I’ve got some deadline that’s coming.
03:10:12 You always think that your life has to change,
03:10:15 the world has to change
03:10:18 so that you can finally have a good enough excuse
03:10:22 to just be here, and here is enough,
03:10:27 where the present moment becomes totally captivating.
03:10:31 Meditation is another name for the discovery
03:10:36 that you can actually just train yourself
03:10:38 to do that on demand.
03:10:40 So just looking at a cup can be good enough
03:10:44 in precisely that way.
03:10:46 And any sense that it might not be
03:10:50 is recognized to be a thought
03:10:53 that mysteriously unravels the moment you notice it.
03:10:56 And the moment expands and becomes more diaphanous
03:11:02 and then there’s no evidence
03:11:06 that this isn’t the best moment of your life, right?
03:11:08 And again, it doesn’t have to be pulling all the reins
03:11:12 and levers of pleasure.
03:11:13 It’s not like, oh, this tastes like chocolate.
03:11:17 This is the most chocolatey moment of my life.
03:11:18 No, it’s just the sense data don’t have to change,
03:11:22 but the sense that there is some kind of basis
03:11:26 for doubt about the rightness of being in the world
03:11:31 in this moment can evaporate when you pay attention.
03:11:36 And that is the meaning.
03:11:38 So the kind of meta answer to that question,
03:11:41 the meaning of life for me, is to live in that mode
03:11:46 more and more, and whenever I notice I’m not
03:11:49 in that mode, to recognize it and return,
03:11:53 and to cease more and more
03:11:58 to take the reasons why not at face value,
03:12:04 because we all have reasons why we can’t be fulfilled
03:12:08 in this moment.
03:12:09 It’s like, I’ve got all these outstanding things
03:12:11 that I’m worried about, right?
03:12:12 It’s like, there’s that thing that’s happening later today
03:12:17 that I’m anxious about.
03:12:19 Whatever it is, we’re constantly deferring our sense
03:12:23 of this is it.
03:12:26 This is not a dress rehearsal, this is the show.
03:12:30 We keep deferring it.
03:12:32 And we just have these moments on the calendar
03:12:34 where we think, okay, this is where it’s all gonna land.
03:12:37 It’s that vacation I planned with my five best friends.
03:12:41 We do this once every three years and now we’re going
03:12:43 and here we are on the beach together.
03:12:46 And unless you have a mind that can really pay attention,
03:12:51 really cut through the chatter,
03:12:53 really sink into the present moment,
03:12:55 you can’t even enjoy those moments
03:12:58 the way they should be enjoyed,
03:12:59 the way you dreamed you would enjoy them when they arrive.
03:13:03 So meditation in this sense is the great equalizer.
03:13:07 It’s like you don’t have to live with the illusion anymore
03:13:12 that you need a good enough reason
03:13:15 and that things are gonna get better
03:13:16 when you do have those good reasons.
03:13:17 It’s like there’s just a mirage-like quality
03:13:20 to every future attainment and every future breakthrough
03:13:24 and every future peak experience
03:13:27 that eventually you get the lesson
03:13:30 that you never quite arrive, right?
03:13:33 Like you don’t arrive until you cease to step over
03:13:38 the present moment in search of the next thing.
03:13:41 I mean, we’re constantly, we’re stepping over the thing
03:13:45 that we think we’re seeking in the act of seeking it.
03:13:50 And so this is kind of a paradox.
03:13:52 There’s this paradox which,
03:13:58 I mean, it sounds trite,
03:13:59 but it’s like you can’t actually become happy.
03:14:03 You can only be happy.
03:14:05 And it’s an illusion that your future happiness
03:14:13 can be predicated on this act of becoming in any domain.
03:14:18 And becoming includes this sort of further scientific
03:14:23 understanding on the questions that interest you
03:14:25 or getting in better shape or whatever the thing is,
03:14:30 whatever the contingency of your dissatisfaction
03:14:34 seems to be in any present moment.
03:14:37 Real attention solves the koan in a way that becomes
03:14:45 a very different place from which to then make
03:14:48 any further change.
03:14:49 It’s not that you just have to dissolve into a puddle of goo.
03:14:53 I mean, you can still get in shape
03:14:54 and you can still do all the things that,
03:14:56 the superficial things that are obviously good to do,
03:14:58 but the sense that your wellbeing is over there
03:15:04 really does diminish and eventually
03:15:11 just becomes a kind of non sequitur.
03:15:14 Well, there’s a sense in which in this conversation,
03:15:19 I’ve actually experienced many of those things,
03:15:21 the sense that I’ve arrived.
03:15:23 So I mentioned to you offline, it’s very true that
03:15:26 I’ve been a fan of yours for many years.
03:15:29 And the reason I started this podcast,
03:15:33 speaking of AI systems, is to manipulate you, Sam Harris,
03:15:36 into doing this conversation.
03:15:38 So like on the calendar, literally, you know,
03:15:40 I’ve always had the sense, people ask me,
03:15:42 when are you going to talk to Sam Harris?
03:15:44 And I always answered, eventually,
03:15:47 because I always felt, again, tying into our free will thing,
03:15:50 that somehow that’s going to happen.
03:15:52 And it’s one of those manifestation things or something.
03:15:55 I don’t know if it’s, maybe I am a robot,
03:15:57 I’m just not cognizant of it.
03:15:59 And I manipulated you into having this conversation.
03:16:01 So it was, I mean, I don’t know what the purpose of my life
03:16:05 past this point is.
03:16:06 So I’ve arrived.
03:16:07 So in that sense, I mean, all of that to say,
03:16:10 and I’m only partially joking on that,
03:16:13 it really is a huge honor
03:16:15 that you would waste this time with me.
03:16:17 It really means a lot, Sam.
03:16:18 Listen, it’s mutual.
03:16:19 I’m a big fan of yours.
03:16:20 And as you know, I reached out to you for this.
03:16:23 So this is great.
03:16:26 I love what you’re doing.
03:16:27 You’re doing something more and more indispensable
03:16:32 in this world on your podcast.
03:16:34 And you’re doing it differently than Rogan’s doing it,
03:16:38 or than I’m doing it.
03:16:38 I mean, you definitely found your own lane
03:16:41 and it’s wonderful.
03:16:43 Thanks for listening to this conversation with Sam Harris.
03:16:46 And thank you to National Instruments,
03:16:48 Belcampo, Athletic Greens, and Linode.
03:16:52 Check them out in the description to support this podcast.
03:16:56 And now let me leave you with some words from Sam Harris
03:16:59 in his book, Free Will.
03:17:01 You are not controlling the storm
03:17:03 and you’re not lost in it.
03:17:05 You are the storm.
03:17:07 Thank you for listening and hope to see you next time.