Transcript
00:00:00 Evolutionarily, if we see a lion running at us,
00:00:03 we didn’t have time to sort of calculate
00:00:05 the lion’s kinetic energy and is it optimal
00:00:08 to go this way or that way, you just react.
00:00:10 And physically our bodies are well attuned
00:00:13 to actually make right decisions.
00:00:14 But when you’re playing a game like poker,
00:00:16 this is not something that you ever evolved to do.
00:00:19 And yet you’re in that same flight or fight response.
00:00:22 And so that’s a really important skill
00:00:24 to be able to develop to basically learn
00:00:25 how to like meditate in the moment and calm yourself
00:00:29 so that you can think clearly.
00:00:32 The following is a conversation with Liv Boeree,
00:00:35 formerly one of the best poker players in the world,
00:00:38 trained as an astrophysicist and is now a philanthropist
00:00:42 and an educator on topics of game theory,
00:00:45 physics, complexity, and life.
00:00:49 This is the Lex Fridman podcast.
00:00:51 To support it, please check out our sponsors
00:00:53 in the description.
00:00:54 And now, dear friends, here’s Liv Boeree.
00:00:57 What role do you think luck plays in poker and in life?
00:01:02 You can pick whichever one you want,
00:01:04 poker or life, and/or both.
00:01:06 The longer you play, the less influence luck has.
00:01:10 Like with all things, the bigger your sample size,
00:01:13 the more the quality of your decisions
00:01:16 or your strategies matter.
00:01:18 So to answer that question, yeah, in poker,
00:01:21 it really depends.
00:01:22 If you and I sat and played 10 hands right now,
00:01:26 I might only win 52% of the time, 53% maybe.
00:01:30 But if we played 10,000 hands,
00:01:31 then I’ll probably win like over 98, 99% of the time.
00:01:35 So it’s a question of sample sizes.
00:01:38 And what are you figuring out over time?
00:01:40 The betting strategy that this individual does
00:01:42 or literally it doesn’t matter
00:01:43 against any individual over time?
00:01:45 Against any individual over time, the better player wins
00:01:47 because they’re making better decisions.
00:01:49 So what does that mean to make a better decision?
00:01:51 Well, to get into the nitty-gritty already,
00:01:55 basically poker is a game of math.
00:01:58 There are these strategies. You’re familiar
00:02:00 with, like, Nash equilibria, right?
00:02:02 So there are these game theory optimal strategies
00:02:06 that you can adopt.
00:02:08 And the closer you play to them,
00:02:10 the less exploitable you are.
00:02:12 So because I’ve studied the game a bunch,
00:02:15 although admittedly not for a few years,
00:02:17 but back when I was playing all the time,
00:02:20 I would study these game theory optimal solutions
00:02:23 and try and then adopt those strategies
00:02:24 when I go and play.
00:02:25 So I’d play against you and I would do that.
00:02:27 And because the objective,
00:02:31 when you’re playing game theory optimal,
00:02:33 it’s actually, it’s a loss minimization thing
00:02:36 that you’re trying to do.
00:02:37 Your best bet is to try and play a sort of similar style.
00:02:42 You also need to try and adopt this loss minimization.
00:02:46 But because I’ve been playing much longer than you,
00:02:48 I’ll be better at that.
00:02:49 So first of all, you’re not taking advantage
00:02:51 of my mistakes.
00:02:53 But then on top of that,
00:02:55 I’ll be better at recognizing
00:02:56 when you are playing suboptimally
00:02:59 and then deviating from this game theory optimal strategy
00:03:02 to exploit your bad plays.
00:03:05 Can you define game theory and Nash equilibria?
00:03:08 Can we try to sneak up to it in a bunch of ways?
00:03:10 Like what’s a game theory framework of analyzing poker,
00:03:14 analyzing any kind of situation?
00:03:16 So game theory is just basically the study of decisions
00:03:21 within a competitive situation.
00:03:24 I mean, it’s technically a branch of economics,
00:03:26 but it also applies to like wider decision theory.
00:03:30 And usually when you see it,
00:03:35 it’s these like little payoff matrices and so on.
00:03:37 That’s how it’s depicted.
00:03:38 But it’s essentially just like study of strategies
00:03:41 under different competitive situations.
00:03:42 And as it happens, certain games,
00:03:46 in fact, many, many games have these things
00:03:48 called Nash equilibria.
00:03:50 And what that means is when you’re in a Nash equilibrium,
00:03:52 basically it is not,
00:03:55 there is no strategy that you can take
00:03:59 that would be more beneficial
00:04:01 than the one you’re currently taking,
00:04:02 assuming your opponent is also doing the same thing.
00:04:05 So it’d be a bad idea,
00:04:06 if we’re both playing in a game theory optimal strategy,
00:04:10 if either of us deviate from that,
00:04:12 now we’re putting ourselves at a disadvantage.
00:04:16 Rock, paper, scissors
00:04:17 is actually a really great example of this.
00:04:18 Like if we were to start playing rock, paper, scissors,
00:04:22 you know, you know nothing about me
00:04:23 and we’re gonna play for all our money,
00:04:26 let’s play 10 rounds of it.
00:04:27 What would your sort of optimal strategy be?
00:04:30 Do you think, what would you do?
00:04:33 Let’s see.
00:04:35 I would probably try to be as random as possible.
00:04:42 Exactly.
00:04:43 You wanna, because you don’t know anything about me,
00:04:46 you don’t want to give anything away about yourself.
00:04:48 So ideally you’d have like a little dice
00:04:49 or somewhat, you know, perfect randomizer
00:04:52 that makes you randomize 33% of the time
00:04:54 each of the three different things.
00:04:56 And in response to that,
00:04:58 well, actually I can kind of do anything,
00:04:59 but I would probably just randomize back too,
00:05:01 but actually it wouldn’t matter
00:05:02 because I know that you’re playing randomly.
00:05:05 So that would be us in a Nash equilibrium
00:05:07 where we’re both playing this like unexploitable strategy.
00:05:10 However, if after a while you then notice
00:05:13 that I’m playing rock a little bit more often than I should.
00:05:16 Yeah, you’re the kind of person that would do that,
00:05:18 wouldn’t you?
00:05:18 Sure.
00:05:19 Yes, yes, yes.
00:05:20 I’m more of a scissors girl, but anyway.
00:05:21 You are?
00:05:22 No, I’m a, as I said, randomizer.
00:05:25 So you notice I’m throwing rock too much
00:05:27 or something like that.
00:05:28 Now you’d be making a mistake
00:05:29 by continuing playing this game theory optimal strategy,
00:05:32 well, the previous one, because you are now,
00:05:36 I’m making a mistake and you’re not deviating
00:05:39 and exploiting my mistake.
00:05:41 So you’d want to start throwing paper a bit more often
00:05:43 in whatever you figure is the right sort of percentage
00:05:45 of the time that I’m throwing rock too often.
00:05:48 So that’s basically an example of where,
00:05:51 what game theory optimal strategy is
00:05:53 in terms of loss minimization,
00:05:54 but it’s not always the maximally profitable thing
00:05:58 if your opponent is doing stupid stuff,
00:06:00 which they were in that example.
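A tiny sketch of the rock-paper-scissors logic above (the payoff matrix is just the standard win/lose/tie values, nothing from the conversation): against the one-third/one-third/one-third equilibrium mix, every throw has the same expected value, so there is nothing to exploit; against someone who over-throws rock, paper becomes the highest-EV deviation.

```python
import numpy as np

# Rows = my throw, columns = opponent's throw, order (rock, paper, scissors).
# Payoff: +1 win, -1 loss, 0 tie.
payoff = np.array([
    [ 0, -1,  1],   # I throw rock
    [ 1,  0, -1],   # I throw paper
    [-1,  1,  0],   # I throw scissors
])

def throw_evs(opponent_mix):
    """Expected value of each of my pure throws against an opponent's mix."""
    return payoff @ np.asarray(opponent_mix)

print(throw_evs([1/3, 1/3, 1/3]))     # [0. 0. 0.]       -> equilibrium: no profitable deviation
print(throw_evs([0.5, 0.25, 0.25]))   # [0. 0.25 -0.25]  -> exploit: shift toward paper
```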
00:06:02 So that’s kind of then how it works in poker,
00:06:04 but it’s a lot more complex.
00:06:07 And the way poker players typically,
00:06:10 nowadays they study, the games change so much.
00:06:12 And I think we should talk about how it sort of evolved,
00:06:15 but nowadays like the top pros
00:06:17 basically spend all their time in between sessions
00:06:20 running these simulators using like software
00:06:24 where they do basically Monte Carlo simulations,
00:06:26 sort of doing billions of fictitious self play hands.
00:06:31 You input a fictitious hand scenario,
00:06:34 like, oh, what do I do with Jack nine suited
00:06:36 on a King-Ten-Four board with two spades
00:06:41 and against this bet size.
00:06:43 So you’d input that press play,
00:06:45 it’ll run its billions of fake hands
00:06:49 and then it will converge upon
00:06:50 what the game theory optimal strategies are.
00:06:53 And then you wanna try and memorize what these are.
00:06:55 Basically they’re like ratios of how often,
00:06:57 what types of hands you want to bluff
00:06:59 and what percentage of the time.
00:07:01 So then there’s this additional layer
00:07:02 of randomization built in.
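The solvers she describes are proprietary poker tools, but the underlying idea, fictitious self-play that converges toward an unexploitable mixed strategy, can be shown on a toy game. This sketch runs regret matching on rock-paper-scissors; real solvers use counterfactual regret minimization over the full poker game tree, so this only illustrates the principle, not the actual software the pros use.

```python
import numpy as np

rng = np.random.default_rng(0)
payoff = np.array([[0, -1, 1], [1, 0, -1], [-1, 1, 0]])  # rock/paper/scissors again

def strategy_from(regret):
    """Play in proportion to positive accumulated regret (regret matching)."""
    pos = np.maximum(regret, 0.0)
    return pos / pos.sum() if pos.sum() > 0 else np.full(3, 1 / 3)

regret = [np.zeros(3), np.zeros(3)]
strat_sum = [np.zeros(3), np.zeros(3)]

for _ in range(100_000):                       # "billions" of hands in the real thing
    strategies = [strategy_from(r) for r in regret]
    actions = [rng.choice(3, p=s) for s in strategies]
    for me, opp in ((0, 1), (1, 0)):
        evs = payoff[:, actions[opp]]          # payoff of each of my throws vs what they did
        regret[me] += evs - evs[actions[me]]   # regret for not having played each throw
        strat_sum[me] += strategies[me]

# The *average* strategy converges toward the unexploitable mix (~1/3 each).
print(strat_sum[0] / strat_sum[0].sum())
```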
00:07:04 Yeah, those kinds of simulations incorporate
00:07:06 all the betting strategies and everything else like that.
00:07:09 So as opposed to some kind of very crude mathematical model
00:07:12 of what’s the probability you win
00:07:13 just based on the quality of the card,
00:07:16 it’s including everything else too.
00:07:18 The game theory of it.
00:07:20 Yes, yeah, essentially.
00:07:21 And what’s interesting is that nowadays,
00:07:23 if you want to be a top pro and you go and play
00:07:25 in these really like the super high stakes tournaments
00:07:27 or tough cash games, if you don’t know this stuff,
00:07:30 you’re gonna get eaten alive in the long run.
00:07:33 But of course you could get lucky over the short run
00:07:35 and that’s where this like luck factor comes in
00:07:36 because luck is both a blessing and a curse.
00:07:40 If luck didn’t, if there wasn’t this random element
00:07:42 and there wasn’t the ability for worse players
00:07:45 to win sometimes, then poker would fall apart.
00:07:48 You know, the same reason people don’t play chess
00:07:50 professionally for money against,
00:07:53 you don’t see people going and hustling chess
00:07:55 like, you know, trying to make a living from it
00:07:57 because you know there’s very little luck in chess,
00:08:00 but there’s quite a lot of luck in poker.
00:08:01 Have you seen Beautiful Mind, that movie?
00:08:03 Years ago.
00:08:04 Well, what do you think about the game theoretic formulation
00:08:07 of what is it, the hot blonde at the bar?
00:08:09 Do you remember?
00:08:10 Oh, yeah.
00:08:11 What they illustrated is they’re trying to pick up a girl
00:08:14 at a bar and there’s multiple girls.
00:08:16 They’re like friend, it’s like a friend group
00:08:18 and you’re trying to approach,
00:08:20 I don’t remember the details, but I remember.
00:08:21 Don’t you like then speak to her friends first
00:08:24 or something like that, feign disinterest.
00:08:25 I mean, it’s classic pickup artist stuff, right?
00:08:27 You wanna.
00:08:28 And they were trying to correlate that somehow,
00:08:30 that being an optimal strategy game theoretically.
00:08:35 Why?
00:08:36 What, what, like, I don’t think, I remember.
00:08:38 I can’t imagine that there is,
00:08:39 I mean, there’s probably an optimal strategy.
00:08:41 Is it, does that mean that there’s an actual Nash equilibrium
00:08:45 of like picking up girls?
00:08:46 Do you know the marriage problem?
00:08:49 It’s optimal stopping.
00:08:50 Yes.
00:08:51 So where it’s an optimal dating strategy
00:08:54 where you, do you remember what it is?
00:08:56 Yeah, I think it’s like something like,
00:08:57 you know you’ve got like a set of a hundred people
00:08:59 you’re gonna look through and after,
00:09:01 how many do you, now after that,
00:09:05 after going on this many dates out of a hundred,
00:09:08 at what point do you then go, okay, the next best person I see,
00:09:10 is that the right one?
00:09:11 And I think it’s like something like 37%.
00:09:14 Uh, it’s one over E, whatever that is.
00:09:17 Right, which I think is 37%.
00:09:19 Yeah.
00:09:21 We’re gonna fact check that.
00:09:24 Yeah.
00:09:25 So, but it’s funny under those strict constraints,
00:09:28 then yes, after that many people,
00:09:30 as long as you have a fixed size pool,
00:09:32 then you just pick the next person
00:09:36 that is better than anyone you’ve seen before.
00:09:38 Anyone else you’ve seen, yeah.
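To settle the "fact check that" exchange above: 1/e is about 0.368, so roughly 37%, and both the cutoff and the success rate land there. Below is a small simulation of the stopping rule as described, skip the first n/e candidates from a fixed pool, then take the first one better than everyone seen so far; the pool size and trial count are arbitrary.

```python
import math
import random

def simulate(n=100, trials=50_000, seed=0):
    """Secretary-problem rule: skip the first ~n/e, then take the first record-setter."""
    rng = random.Random(seed)
    cutoff = round(n / math.e)                     # ~37 out of 100
    wins = 0
    for _ in range(trials):
        ranks = list(range(n))                     # higher = better candidate
        rng.shuffle(ranks)
        best_seen = max(ranks[:cutoff])
        pick = next((r for r in ranks[cutoff:] if r > best_seen), ranks[-1])
        wins += (pick == n - 1)                    # did we end up with the very best?
    return cutoff, wins / trials

cutoff, p = simulate()
print(f"skip the first {cutoff}, then commit: best candidate found {p:.0%} of the time")
```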
00:09:40 Have you tried this?
00:09:41 Have you incorporated it?
00:09:42 I’m not one of those people.
00:09:44 And we’re gonna discuss this.
00:09:45 I, and, what do you mean, those people?
00:09:49 I try not to optimize stuff.
00:09:52 I try to listen to the heart.
00:09:55 I don’t think, I like,
00:09:59 my mind immediately is attracted to optimizing everything.
00:10:04 Optimizing everything.
00:10:06 And I think that if you really give in
00:10:09 to that kind of addiction,
00:10:11 that you lose the joy of the small things,
00:10:14 the minutiae of life, I think.
00:10:17 I don’t know.
00:10:18 I’m concerned about the addictive nature
00:10:19 of my personality in that regard.
00:10:21 In some ways,
00:10:24 while I think, on average,
00:10:26 people under-quantify things,
00:10:30 or under-optimize.
00:10:32 There are some people who, you know,
00:10:34 with all these things, it’s a balancing act.
00:10:37 I’ve been on dating apps, but I’ve never used them.
00:10:40 I’m sure they have data on this,
00:10:42 because they probably have
00:10:43 the optimal stopping control problem.
00:10:45 Because there are a lot of people that use social,
00:10:47 like, dating apps, who are on there for a long time.
00:10:51 So the interesting aspect is like, all right,
00:10:56 how long before you stop looking
00:10:58 before it actually starts affecting your mind negatively,
00:11:01 such that you see dating as a kind of,
00:11:07 A game.
00:11:08 A kind of game versus an actual process
00:11:12 of finding somebody that’s going to make you happy
00:11:14 for the rest of your life.
00:11:15 That’s really interesting.
00:11:17 They have the data.
00:11:18 I wish they would be able to release that data.
00:11:20 And I do want to hop to it.
00:11:21 It’s OkCupid, right?
00:11:22 I think they ran a huge, huge study on all of that.
00:11:25 Yeah, they’re more data driven, I think, OkCupid folks are.
00:11:28 I think there’s a lot of opportunity for dating apps,
00:11:30 in general, even bigger than dating apps,
00:11:32 people connecting on the internet.
00:11:35 I just hope they’re more data driven.
00:11:37 And it doesn’t seem that way.
00:11:40 I think like, I’ve always thought that
00:11:44 Goodreads should be a dating app.
00:11:47 Like the.
00:11:48 I’ve never used it.
00:11:49 After what?
00:11:50 Goodreads just lists the books that you’ve read.
00:11:55 And allows you to comment on the books you read
00:11:57 and what books you’re currently reading.
00:11:58 But it’s a giant social networks of people reading books.
00:12:01 And that seems to be a much better database
00:12:03 of like interests.
00:12:04 Of course it constrains you to the books you’re reading,
00:12:06 but like that really reveals so much more about the person.
00:12:10 Allows you to discover shared interests
00:12:12 because books are a kind of window
00:12:13 into the way you see the world.
00:12:16 Also like the kind of places,
00:12:19 people you’re curious about,
00:12:20 the kind of ideas you’re curious about.
00:12:21 Are you a romantic or are you a cold, calculating rationalist?
00:12:24 Are you into Ayn Rand or are you into Bernie Sanders?
00:12:28 Are you into whatever?
00:12:29 And I feel like that reveals so much more
00:12:31 than like a person trying to look hot
00:12:35 from a certain angle in a Tinder profile.
00:12:37 Well, and it’d also be a really great filter
00:12:38 in the first place for people.
00:12:40 It’s less people who read books
00:12:41 and are willing to go and rate them
00:12:44 and give feedback on them and so on.
00:12:47 So that’s already a really strong filter.
00:12:48 Probably the type of people you’d be looking for.
00:12:50 Well, you’d at least be able to fake reading books.
00:12:52 I mean, the thing about books,
00:12:53 you don’t really need to read it.
00:12:54 You can just read the CliffsNotes.
00:12:55 Yeah, game the dating app by feigning intellectualism.
00:12:59 Can I admit something very horrible about myself?
00:13:02 Go on.
00:13:03 The things that, you know,
00:13:04 I don’t have many things in my closet,
00:13:05 but this is one of them.
00:13:07 I’ve never actually read Shakespeare.
00:13:10 I’ve only read Cliff Notes
00:13:12 and I got a five in the AP English exam.
00:13:15 Wow.
00:13:15 And I…
00:13:16 Which book?
00:13:18 The which books have I read?
00:13:19 Oh yeah, which was the exam on which book?
00:13:21 Oh no, they include a lot of them.
00:13:23 Oh.
00:13:24 But Hamlet, I don’t even know
00:13:27 if you read Romeo and Juliet, Macbeth.
00:13:30 I don’t remember, but I don’t understand it.
00:13:32 It’s like really cryptic.
00:13:34 It’s hard.
00:13:34 It’s really, I don’t, and it’s not that pleasant to read.
00:13:37 It’s like ancient speak.
00:13:39 I don’t understand it.
00:13:40 Anyway, maybe I was too dumb.
00:13:41 I’m still too dumb, but I did get…
00:13:44 You got a five, which is…
00:13:45 Yeah, yeah.
00:13:46 I don’t know how the US grading system…
00:13:47 Oh no, so AP English is a,
00:13:50 there’s kind of this advanced versions of courses
00:13:53 in high school, and you take a test
00:13:54 that is like a broad test for that subject
00:13:57 and includes a lot.
00:13:58 It wasn’t obviously just Shakespeare.
00:14:00 I think a lot of it was also writing, written.
00:14:04 You have like AP Physics, AP Computer Science,
00:14:07 AP Biology, AP Chemistry, and then AP English
00:14:11 or AP Literature, I forget what it was.
00:14:13 But I think Shakespeare was a part of that.
00:14:16 But I…
00:14:16 And the point is you gamified it?
00:14:19 Gamified.
00:14:19 Well, in its entirety. I was into getting As.
00:14:22 I saw it as a game.
00:14:24 I don’t think any…
00:14:27 I don’t think all of the learning I’ve done
00:14:30 has been outside of school.
00:14:33 The deepest learning I’ve done has been outside of school
00:14:36 with a few exceptions, especially in grad school,
00:14:38 like deep computer science courses.
00:14:40 But that was still outside of school
00:14:41 because it was outside of getting…
00:14:43 Sorry.
00:14:44 It was outside of getting the A for the course.
00:14:46 The best stuff I’ve ever done is when you read the chapter
00:14:49 and you do many of the problems at the end of the chapter,
00:14:52 which is usually not what’s required for the course,
00:14:54 like the hardest stuff.
00:14:56 In fact, textbooks are freaking incredible.
00:14:58 If you go back now and you look at like biology textbook
00:15:02 or any of the computer science textbooks
00:15:06 on algorithms and data structures,
00:15:07 those things are incredibly…
00:15:09 They have the best summary of a subject,
00:15:11 plus they have practice problems of increasing difficulty
00:15:15 that allow you to truly master the basic,
00:15:17 like the fundamental ideas behind that.
00:15:19 I got through my entire physics degree with one textbook
00:15:24 that was just this really comprehensive one
00:15:26 that they told us at the beginning of the first year,
00:15:28 buy this, but you’re gonna have to buy 15 other books
00:15:31 for all your supplementary courses.
00:15:33 And I was like, every time I just checked
00:15:35 to see whether this book covered it, and it did.
00:15:38 And I think I only bought like two or three extra
00:15:39 and thank God, cause they’re super expensive textbooks.
00:15:41 It’s a whole racket they’ve got going on.
00:15:44 Yeah, they are.
00:15:45 They could just…
00:15:46 You get the right one, it’s just like a manual for…
00:15:49 But what’s interesting though,
00:15:51 is this is the tyranny of having exams and metrics.
00:15:56 The tyranny of exams and metrics, yes.
00:15:58 I loved them because I’m very competitive
00:16:00 and I liked finding ways to gamify things
00:16:04 and then like sort of dust off my shoulders afterwards
00:16:06 when I get a good grade
00:16:07 or be annoyed at myself when I didn’t.
00:16:09 But yeah, you’re absolutely right.
00:16:10 And that the actual…
00:16:12 How much of that physics knowledge I’ve retained,
00:16:15 like I’ve learned how to cram and study
00:16:19 and please an examiner,
00:16:21 but did that give me the deep lasting knowledge
00:16:23 that I needed?
00:16:24 I mean, yes and no,
00:16:27 but really like nothing makes you learn a topic better
00:16:30 than when you actually then have to teach it yourself.
00:16:34 Like I’m trying to wrap my teeth
00:16:35 around this like game theory Moloch stuff right now.
00:16:38 And there’s no exam at the end of it that I can gamify.
00:16:43 There’s no way to gamify
00:16:43 and sort of like shortcut my way through it.
00:16:45 I have to understand it so deeply
00:16:47 from like deep foundational levels
00:16:49 to then to build upon it
00:16:50 and then try and explain it to other people.
00:16:52 And like, you’re about to go and do some lectures, right?
00:16:54 You can’t sort of just like,
00:16:58 you presumably can’t rely on the knowledge
00:17:00 that you got through when you were studying for an exam
00:17:03 to reteach that.
00:17:04 Yeah, and especially high level lectures,
00:17:06 especially the kind of stuff you do on YouTube,
00:17:09 you’re not just regurgitating material.
00:17:12 You have to think through what is the core idea here.
00:17:17 And when you do the lectures live especially,
00:17:20 you have to, there’s no second takes.
00:17:25 That is the luxury you get
00:17:27 if you’re recording a video for YouTube
00:17:28 or something like that.
00:17:30 But it definitely is a luxury you shouldn’t lean on.
00:17:35 I’ve gotten to interact with a few YouTubers
00:17:37 that lean on that too much.
00:17:39 And you realize, oh, you’ve gamified this system
00:17:43 because you’re not really thinking deeply about stuff.
00:17:46 You’re through the edit,
00:17:48 both written and spoken,
00:17:52 you’re crafting an amazing video,
00:17:53 but you yourself as a human being
00:17:55 have not really deeply understood it.
00:17:57 So live teaching or at least recording video
00:18:00 with very few takes is a different beast.
00:18:04 And I think it’s the most honest way of doing it,
00:18:07 like as few takes as possible.
00:18:09 That’s why I’m nervous about this.
00:18:10 Don’t go back and be like, ah, let’s do that.
00:18:14 Don’t fuck this up, Liv.
00:18:17 The tyranny of exams.
00:18:18 I do think people talk about high school and college
00:18:24 as a time to do drugs and drink and have fun
00:18:27 and all this kind of stuff.
00:18:28 But looking back, of course I did a lot of those things.
00:18:33 Yes, no, yes, but it’s also a time
00:18:39 when you get to read textbooks or read books
00:18:43 or learn with all the time in the world.
00:18:48 You don’t have these responsibilities of laundry
00:18:54 and having to sort of pay for mortgage,
00:18:59 all that kind of stuff, pay taxes, all this kind of stuff.
00:19:03 In most cases, there’s just so much time in the day
00:19:06 for learning and you don’t realize it at the time
00:19:09 because at the time it seems like a chore,
00:19:11 like, why the hell is there so much homework?
00:19:14 But you never get a chance to do this kind of learning,
00:19:17 this kind of homework ever again in life,
00:19:20 unless later in life you really make a big effort out of it.
00:19:23 Like basically your knowledge gets solidified.
00:19:26 You don’t get to have fun and learn.
00:19:28 Learning is really fulfilling and really fun
00:19:32 if you’re that kind of person.
00:19:33 Like some people like, you know,
00:19:37 like knowledge is not something that they think is fun.
00:19:39 But if that’s the kind of thing that you think is fun,
00:19:43 that’s the time to have fun and do the drugs and drink
00:19:45 and all that kind of stuff.
00:19:46 But the learning, just going back to those textbooks,
00:19:50 the hours spent with the textbooks
00:19:52 is really, really rewarding.
00:19:54 Do people even use textbooks anymore?
00:19:56 Yeah. Do you think?
00:19:56 Kids these days with their TikTok and their…
00:20:00 Well, not even that, but it’s just like so much information,
00:20:03 really high quality information,
00:20:05 you know, it’s now in digital format online.
00:20:08 Yeah, but they’re not, they are using that,
00:20:10 but you know, college is still very, there’s a curriculum.
00:20:15 I mean, so much of school is about rigorous study
00:20:18 of a subject and still on YouTube, that’s not there.
00:20:22 Right. YouTube has,
00:20:23 YouTube has, Grant Sanderson talks about this.
00:20:27 He’s this math…
00:20:29 3Blue1Brown.
00:20:30 Yeah, 3Blue1Brown.
00:20:31 He says like, I’m not a math teacher.
00:20:33 I just take really cool concepts and I inspire people.
00:20:37 But if you wanna really learn calculus,
00:20:39 if you wanna really learn linear algebra,
00:20:41 you should do the textbook.
00:20:43 You should do that, you know.
00:20:45 And there’s still the textbook industrial complex
00:20:48 that like charges like $200 for textbook and somehow,
00:20:53 I don’t know, it’s ridiculous.
00:20:56 Well, they’re like, oh, sorry, new edition, edition 14.6.
00:21:00 Sorry, you can’t use 14.5 anymore.
00:21:02 It’s like, what’s different?
00:21:03 We’ve got one paragraph different.
00:21:05 So we mentioned offline, Daniel Negreanu.
00:21:08 I’m gonna get a chance to talk to him on this podcast.
00:21:11 And he’s somebody that I found fascinating
00:21:14 in terms of the way he thinks about poker,
00:21:16 verbalizes the way he thinks about poker,
00:21:18 the way he plays poker.
00:21:20 So, and he’s still pretty damn good.
00:21:22 He’s been good for a long time.
00:21:24 So you mentioned that people are running
00:21:27 these kinds of simulations and the game of poker has changed.
00:21:30 Do you think he’s adapting in this way?
00:21:33 Like the top pros, do they have to adopt this way?
00:21:36 Or is there still like over the years,
00:21:41 you basically develop this gut feeling about,
00:21:45 like you get to be like good the way,
00:21:48 like alpha zero is good.
00:21:49 You look at the board and somehow from the fog
00:21:54 comes out the right answer.
00:21:55 Like this is likely what they have.
00:21:58 This is likely the best way to move.
00:22:00 And you don’t really, you can’t really put a finger
00:22:03 on exactly why, but it just comes from your gut feeling.
00:22:08 Or no?
00:22:09 Yes and no.
00:22:10 So gut feelings are definitely very important.
00:22:14 You know, that we’ve got our two,
00:22:16 you can distill it down to two modes
00:22:18 of decision making, right?
00:22:19 You’ve got your sort of logical linear voice in your head,
00:22:22 system two, as it’s often called,
00:22:24 and your system one, your gut intuition.
00:22:29 And historically in poker,
00:22:32 the very best players were playing
00:22:34 almost entirely by their gut.
00:22:37 You know, often they do some kind of inspired play
00:22:39 and you’d ask them why they do it
00:22:40 and they wouldn’t really be able to explain it.
00:22:43 And that’s not so much because their process
00:22:46 was unintelligible, but it was more just because
00:22:48 no one had the language with which to describe
00:22:51 what optimal strategies were
00:22:52 because no one really understood how poker worked.
00:22:54 This was before, you know, we had analysis software.
00:22:57 You know, no one was writing,
00:22:59 I guess some people would write down their hands
00:23:01 in a little notebook,
00:23:02 but there was no way to assimilate all this data
00:23:04 and analyze it.
00:23:05 But then, you know, when computers became cheaper
00:23:08 and software started emerging,
00:23:09 and then obviously online poker,
00:23:11 where it would like automatically save your hand histories.
00:23:14 Now, all of a sudden you kind of had this body of data
00:23:17 that you could run analysis on.
00:23:19 And so that’s when people started to see, you know,
00:23:22 these mathematical solutions.
00:23:26 And so what that meant is the role of intuition
00:23:32 essentially became smaller.
00:23:35 And it went more into, as we talked before about,
00:23:38 you know, this game theory optimal style.
00:23:40 But also, as I said, like game theory optimal
00:23:43 is about loss minimization and being unexploitable.
00:23:47 But if you’re playing against people who aren’t,
00:23:49 because no person, no human being can play perfectly
00:23:51 game theory optimal in poker, not even the best AIs.
00:23:54 They’re still like, you know,
00:23:55 they’re 99.99% of the way there or whatever,
00:23:57 but it’s kind of like the speed of light.
00:23:59 You can’t reach it perfectly.
00:24:01 So there’s still a role for intuition?
00:24:03 Yes, so when, yeah,
00:24:05 when you’re playing this unexploitable style,
00:24:08 but when your opponents start doing something,
00:24:11 you know, suboptimal that you want to exploit,
00:24:14 well, now that’s where not only your like logical brain
00:24:17 will need to be thinking, well, okay,
00:24:18 I know I have this, I’m in the sort of top end
00:24:21 of my range here with this hand.
00:24:23 So that means I need to be calling X percent of the time
00:24:27 and I put them on this range, et cetera.
00:24:30 But then sometimes you’ll have this gut feeling
00:24:34 that will tell you, you know what, this time,
00:24:37 I know mathematically I’m meant to call now.
00:24:40 You know, I’m in the sort of top end of my range
00:24:42 and this is the odds I’m getting.
00:24:45 So the math says I should call,
00:24:46 but there’s something in your gut saying,
00:24:48 they’ve got it this time, they’ve got it.
00:24:49 Like they’re beating you, your hand is worse.
00:24:55 So then the real art,
00:24:56 this is where the last remaining art in poker,
00:24:59 the fuzziness is like, do you listen to your gut?
00:25:03 How do you quantify the strength of it?
00:25:06 Or can you even quantify the strength of it?
00:25:08 And I think that’s what Daniel has.
00:25:13 I mean, I can’t speak for how much he’s studying
00:25:15 with the simulators and that kind of thing.
00:25:17 I think he has, like he must be to still be keeping up.
00:25:22 But he has an incredible intuition for just,
00:25:26 he’s seen so many hands of poker in the flesh.
00:25:29 He’s seen so many people, the way they behave
00:25:31 when the chips are, you know, when the money’s on the line
00:25:33 and he’ve got him staring you down in the eye.
00:25:36 You know, he’s intimidating.
00:25:37 He’s got this like kind of X factor vibe
00:25:39 that he, you know, gives out.
00:25:42 And he talks a lot, which is an interactive element,
00:25:45 which is he’s getting stuff from other people.
00:25:47 Yes, yeah.
00:25:48 And just like the subtlety.
00:25:49 So he’s like, he’s probing constantly.
00:25:51 Yeah, he’s probing and he’s getting this extra layer
00:25:53 of information that others can’t.
00:25:55 Now that said though, he’s good online as well.
00:25:57 You know, I don’t know how, again,
00:25:59 would he be beating the top cash game players online?
00:26:02 Probably not, no.
00:26:03 But when he’s in person and he’s got that additional
00:26:07 layer of information, he can not only extract it,
00:26:09 but he knows what to do with it still so well.
00:26:13 There’s one player who I would say is the exception
00:26:15 to all of this.
00:26:16 And he’s one of my favorite people to talk about
00:26:18 in terms of, I think he might have cracked the simulation.
00:26:23 It’s Phil Helmuth.
00:26:25 He…
00:26:27 In more ways than one, he’s cracked the simulation,
00:26:29 I think.
00:26:29 Yeah, he somehow to this day is still
00:26:33 and I love you Phil, I’m not in any way knocking you.
00:26:37 He is still winning so much at the World Series
00:26:41 of Poker specifically.
00:26:43 He’s now won 16 bracelets.
00:26:44 The next nearest person I think has won 10.
00:26:47 And he is consistently year in, year out going deep
00:26:50 or winning these huge field tournaments,
00:26:52 you know, with like 2000 people,
00:26:54 which statistically he should not be doing.
00:26:57 And yet you watch some of the plays he makes
00:27:02 and they make no sense, like mathematically,
00:27:04 they are so far from game theory optimal.
00:27:07 And the thing is, if you went and stuck him
00:27:08 in one of these high stakes cash games
00:27:10 with a bunch of like GTO people,
00:27:12 he’s gonna get ripped apart.
00:27:14 But there’s something that he has that when he’s
00:27:16 in the halls of the World Series of Poker specifically,
00:27:20 amongst sort of amateurish players,
00:27:23 he gets them to do crazy shit like that.
00:27:26 And, but my little pet theory is that also,
00:27:30 he just, the card, he’s like a wizard
00:27:35 and he gets the cards to do what he needs them to.
00:27:38 Because he just expects to win and he expects to receive,
00:27:43 you know, to flop a set with a frequency far beyond
00:27:47 what the real percentages are.
00:27:50 And I don’t even know if he knows what the real percentages
00:27:52 are, he doesn’t need to, because he gets there.
00:27:54 I think he has found the cheat code,
00:27:55 because when I’ve seen him play,
00:27:57 he seems to be like annoyed that the long shot thing
00:28:01 didn’t happen.
00:28:02 He’s like annoyed and it’s almost like everybody else
00:28:05 is stupid because he was obviously going to win
00:28:08 with the spare.
00:28:08 If that silly thing hadn’t happened.
00:28:10 And it’s like, you don’t understand,
00:28:11 the silly thing happens 99% of the time.
00:28:13 And it’s a 1%, not the other way around,
00:28:15 but genuinely for his lived experience at the World Series,
00:28:18 only at the World Series of Poker, it is like that.
00:28:21 So I don’t blame him for feeling that way.
00:28:24 But he does, he has this X factor
00:28:26 and the poker community has tried for years
00:28:29 to rip him down saying like, he’s no good,
00:28:32 but he’s clearly good because he’s still winning
00:28:34 or there’s something going on.
00:28:36 Whether that’s he’s figured out how to mess
00:28:39 with the fabric of reality and how cards,
00:28:42 a randomly shuffled deck of cards come out.
00:28:44 I don’t know what it is, but he’s doing it right still.
00:28:46 Who do you think is the greatest of all time?
00:28:48 Would you put Hellmuth?
00:28:52 Not Hellmuth definitely, he seems like the kind of person
00:28:54 when mentioned he would actually watch this.
00:28:56 So you might wanna be careful.
00:28:58 Well, as I said, I love Phil and I have,
00:29:02 I would say this to his face, I’m not saying anything.
00:29:04 I don’t, he’s got, he truly, I mean,
00:29:06 he is one of the greatest.
00:29:09 I don’t know if he’s the greatest,
00:29:10 he’s certainly the greatest at the World Series of Poker.
00:29:14 And he is the greatest at, despite the game switching
00:29:17 into a pure game, almost an entire game of math,
00:29:20 he has managed to keep the magic alive.
00:29:22 And this like, just through sheer force of will,
00:29:26 making the game work for him.
00:29:27 And that is incredible.
00:29:28 And I think it’s something that should be studied
00:29:30 because it’s an example.
00:29:32 Yeah, there might be some actual game theoretic wisdom.
00:29:35 There might be something to be said about optimality
00:29:37 from studying him.
00:29:39 What do you mean by optimality?
00:29:40 Meaning, or rather game design, perhaps.
00:29:45 Meaning if what he’s doing is working,
00:29:48 maybe poker is more complicated
00:29:51 than the one we’re currently modeling it as.
00:29:54 So like his, yeah.
00:29:55 Or there’s an extra layer,
00:29:56 and I don’t mean to get too weird and wooey,
00:29:59 but, or there’s an extra layer of ability
00:30:04 to manipulate the things the way you want them to go
00:30:07 that we don’t understand yet.
00:30:09 Do you think Phil Hellmuth understands them?
00:30:12 Is he just generally?
00:30:13 Hashtag positivity.
00:30:14 Well, he wrote a book on positivity and he’s.
00:30:17 He has, he did, not like a trolling book.
00:30:20 No.
00:30:21 He’s just straight up, yeah.
00:30:23 Phil Hellmuth wrote a book about positivity.
00:30:26 Yes.
00:30:27 Okay, not ironic.
00:30:28 And I think it’s about sort of manifesting what you want
00:30:32 and getting the outcomes that you want
00:30:34 by believing so much in yourself
00:30:36 and in your ability to win,
00:30:37 like eyes on the prize.
00:30:40 And I mean, it’s working.
00:30:42 The man’s delivered.
00:30:43 But where do you put like Phil Ivey
00:30:45 and all those kinds of people?
00:30:47 I mean, I’m too, I’ve been,
00:30:49 to be honest too much out of the scene
00:30:50 for the last few years to really,
00:30:53 I mean, Phil Ivey’s clearly got,
00:30:54 again, he’s got that X factor.
00:30:57 He’s so incredibly intimidating to play against.
00:31:00 I’ve only played against him a couple of times,
00:31:01 but when he like looks you in the eye
00:31:03 and you’re trying to run a bluff on him,
00:31:04 oof, no one’s made me sweat harder than Phil Ivey,
00:31:07 but my bluff got through, actually.
00:31:10 That was actually one of the most thrilling moments
00:31:11 I’ve ever had in poker was,
00:31:12 it was in a Monte Carlo in a high roller.
00:31:15 I can’t remember exactly what the hand was,
00:31:16 but I, you know, a three bit
00:31:19 and then like just barreled all the way through.
00:31:22 And he just like put his laser eyes into me.
00:31:24 And I felt like he was just scouring my soul.
00:31:28 And I was just like, hold it together, Liv,
00:31:29 hold it together.
00:31:30 And he, like, folded.
00:31:31 And you knew your hand was weaker.
00:31:33 Yeah, I mean, I was bluffing.
00:31:34 I presume, which, you know,
00:31:36 there’s a chance I was bluffing with the best hand,
00:31:37 but I’m pretty sure my hand was worse.
00:31:40 And he folded.
00:31:44 It was truly one of the deep highlights of my career.
00:31:47 Did you show the cards or did you fold?
00:31:50 You should never show in game.
00:31:51 Like, because especially as I felt like
00:31:53 I was one of the worst players at the table
00:31:54 in that tournament.
00:31:55 So giving that information,
00:31:57 unless I had a really solid plan
00:31:59 that I was now like advertising,
00:32:01 oh, look, I’m capable of bluffing Phil Ivey.
00:32:03 But like, why?
00:32:05 It’s much more valuable to take advantage
00:32:07 of the impression that they have of me,
00:32:09 which is like, I’m a scared girl
00:32:10 playing a high roller for the first time.
00:32:12 Keep that going, you know.
00:32:14 Interesting.
00:32:15 But isn’t there layers to this?
00:32:17 Like psychological warfare
00:32:18 that the scared girl might be way smart
00:32:22 and then like just to flip the tables?
00:32:24 Do you think about that kind of stuff?
00:32:25 Or is it better not to reveal information?
00:32:28 I mean, generally speaking,
00:32:29 you want to not reveal information.
00:32:31 You know, the goal of poker is to be
00:32:32 as deceptive as possible about your own strategies
00:32:36 while elucidating as much out of your opponent
00:32:38 about their own.
00:32:39 So giving them free information,
00:32:42 particularly if they’re people
00:32:42 who you consider very good players,
00:32:44 any information I give them is going into
00:32:47 their little database and being,
00:32:49 I assume it’s going to be calculated and used well.
00:32:51 So I have to be really confident
00:32:53 that my like meta gaming that I’m going to then do,
00:32:56 oh, they’ve seen this, so therefore that.
00:32:58 I’m going to be on the right level.
00:33:00 So it’s better just to keep that little secret
00:33:03 to myself in the moment.
00:33:04 So how much is bluffing part of the game?
00:33:06 Huge amount.
00:33:08 So yeah, I mean, maybe actually let me ask,
00:33:10 like, what did it feel like with Phil Ivey
00:33:12 or anyone else when it’s a high stake,
00:33:15 when it’s a big, it’s a big bluff?
00:33:18 So a lot of money on the table and maybe,
00:33:23 I mean, what defines a big bluff?
00:33:24 Maybe a lot of money on the table,
00:33:25 but also some uncertainty in your mind and heart
00:33:29 about like self doubt.
00:33:31 Well, maybe I miscalculated what’s going on here,
00:33:34 what the bet said, all that kind of stuff.
00:33:36 Like, what does that feel like?
00:33:38 I mean, it’s, I imagine comparable to,
00:33:45 you know, running a, I mean, any kind of big bluff
00:33:49 where you have a lot of something
00:33:52 that you care about on the line.
00:33:54 You know, so if you’re bluffing in a courtroom,
00:33:57 not that anyone should ever do that,
00:33:58 or, you know, something equatable to that.
00:34:00 It’s, you know, in that scenario, you know,
00:34:04 I think it was the first time I’d ever played a 20,
00:34:05 I’d won my way into this 25K tournament.
00:34:09 So that was the buy in 25,000 euros.
00:34:11 And I had satelliteed my way in
00:34:13 because it was much bigger
00:34:14 than I would ever normally play.
00:34:16 And, you know, I hadn’t, I wasn’t that experienced
00:34:18 at the time, and now I was sitting there
00:34:20 against all the big boys, you know,
00:34:21 the Negreanus, the Phil Iveys and so on.
00:34:24 And then to like, you know,
00:34:28 each time you put the bets out, you know,
00:34:30 you put another bet out, your card.
00:34:33 Yeah, I was on what’s called a semi bluff.
00:34:34 So there were some cards that could come
00:34:36 that would make my hand very, very strong
00:34:38 and therefore win.
00:34:39 But most of the time, those cards don’t come.
00:34:40 So that is a semi bluff because you’re representing,
00:34:43 are you representing that you already have something?
00:34:47 So I think in this scenario, I had a flush draw.
00:34:51 So I had two clubs, two clubs came out on the flop.
00:34:55 And then I’m hoping that on the turn
00:34:57 and the river, one will come.
00:34:59 So I have some future equity.
00:35:00 I could hit a club and then I’ll have the best hand
00:35:02 in which case, great.
00:35:04 And so I can keep betting and I’ll want them to call,
00:35:07 but I’m also got the other way of winning the hand
00:35:09 where if my card doesn’t come,
00:35:11 I can keep betting and get them to fold their hand.
00:35:14 And I’m pretty sure that’s what the scenario was.
00:35:18 So I had some future equity, but it’s still, you know,
00:35:21 most of the time I don’t hit that club.
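For the spot she describes, the "most of the time I don't hit that club" arithmetic looks roughly like this. It is a sketch: it counts the 9 club outs among the 47 cards she can't see on the flop and ignores the opponent's actual holding, which is all you can do at the table anyway.

```python
from fractions import Fraction

outs, unseen = 9, 47                                    # 13 clubs minus the 4 she can see
miss_turn = Fraction(unseen - outs, unseen)             # 38/47
miss_river = Fraction(unseen - 1 - outs, unseen - 1)    # 37/46
hit_by_river = 1 - miss_turn * miss_river
print(f"flush comes in by the river about {float(hit_by_river):.0%} of the time")  # ~35%
# The semi-bluff adds fold equity on top: the ~65% of the time the flush misses,
# you can still win whenever the opponent folds to the continued betting.
```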
00:35:23 And so I would rather him just fold because I’m, you know,
00:35:25 the pot is now getting bigger and bigger.
00:35:27 And in the end, like, I jam all in on the river.
00:35:30 That’s my entire tournament on the line.
00:35:33 As far as I’m aware, this might be the one time
00:35:35 I ever get to play a big 25K.
00:35:37 You know, this was the first time I played one.
00:35:38 So it was, it felt like the most momentous thing.
00:35:42 And this was also when I was trying to build myself up,
00:35:43 you know, build a name for myself in poker.
00:35:46 I wanted to get respect.
00:35:47 Destroy everything for you.
00:35:49 It felt like it in the moment.
00:35:50 Like, I mean, it literally does feel
00:35:51 like a form of life and death.
00:35:52 Like your body physiologically
00:35:54 is having that flight or fight response.
00:35:56 What are you doing with your body?
00:35:57 What are you doing with your face?
00:35:58 Are you just like, what are you thinking about?
00:36:01 More of a mixture of like, okay, what are the cards?
00:36:04 So in theory, I’m thinking about like, okay,
00:36:07 what are cards that make my hand look stronger?
00:36:09 Which cards hit my perceived range from his perspective?
00:36:13 Which cards don’t?
00:36:15 What’s the right amount of bet size
00:36:17 to maximize my fold equity in this situation?
00:36:20 You know, that’s the logical stuff
00:36:22 that I should be thinking about.
00:36:23 But I think in reality, because I was so scared,
00:36:25 because there’s this, at least for me,
00:36:26 there’s a certain threshold of like nervousness or stress
00:36:30 beyond which the logical brain shuts off.
00:36:33 And now it just gets into this like,
00:36:36 just like, it feels like a game of wits, basically.
00:36:38 It’s like of nerve.
00:36:39 Can you hold your resolve?
00:36:41 And it certainly got by that, like by the river.
00:36:44 I think by that point, I was like,
00:36:45 I don’t even know if this is a good bluff anymore,
00:36:47 but fuck it, let’s do it.
00:36:50 Your mind is almost numb from the intensity of that feeling.
00:36:53 I call it the white noise.
00:36:55 And it happens in all kinds of decision making.
00:36:58 I think anything that’s really, really stressful.
00:37:00 I can imagine someone in like an important job interview,
00:37:02 if it’s like a job they’ve always wanted,
00:37:04 and they’re getting grilled, you know,
00:37:06 like Bridgewater style, where they ask these really hard,
00:37:09 like mathematical questions.
00:37:11 You know, it’s a really learned skill
00:37:13 to be able to like subdue your flight or fight response.
00:37:17 You know, I think get from the sympathetic
00:37:19 into the parasympathetic.
00:37:20 So you can actually, you know, engage that voice
00:37:23 in your head and do those slow logical calculations.
00:37:26 Because evolutionarily, you know, if we see a lion
00:37:29 running at us, we didn’t have time to sort of calculate
00:37:31 the lion’s kinetic energy and, you know,
00:37:33 is it optimal to go this way or that way?
00:37:35 You just react.
00:37:37 And physically, our bodies are well attuned
00:37:39 to actually make right decisions.
00:37:41 But when you’re playing a game like poker,
00:37:43 this is not something that you ever, you know,
00:37:44 evolved to do.
00:37:45 And yet you’re in that same flight or fight response.
00:37:48 And so that’s a really important skill to be able to develop
00:37:51 to basically learn how to like meditate in the moment
00:37:54 and calm yourself so that you can think clearly.
00:37:57 But as you were searching for a comparable thing,
00:38:00 it’s interesting because you just made me realize
00:38:03 that bluffing is like an incredibly high stakes
00:38:06 form of lying.
00:38:08 You’re lying.
00:38:10 And I don’t think you can.
00:38:11 Telling a story.
00:38:12 No, no, it’s straight up lying.
00:38:15 In the context of game, it’s not a negative kind of lying.
00:38:19 But it is, yeah, exactly.
00:38:19 You’re representing something that you don’t have.
00:38:23 And I was thinking like how often in life
00:38:26 do we have such high stakes of lying?
00:38:30 Because I was thinking certainly
00:38:33 in high level military strategy,
00:38:36 I was thinking when Hitler was lying to Stalin
00:38:40 about his plans to invade the Soviet Union.
00:38:44 And so you’re talking to a person like your friends
00:38:48 and you’re fighting against the enemy,
00:38:50 whatever the formulation of the enemy is.
00:38:53 But meanwhile, whole time you’re building up troops
00:38:56 on the border.
00:38:58 That’s extremely.
00:38:59 Wait, wait, so Hitler and Stalin were like
00:39:01 pretending to be friends?
00:39:02 Yeah.
00:39:03 Well, my history knowledge is terrible.
00:39:04 Oh yeah.
00:39:04 That’s crazy.
00:39:05 Yeah, that they were, oh man.
00:39:09 And it worked because Stalin,
00:39:11 until the troops crossed the border
00:39:14 and invaded in Operation Barbarossa
00:39:17 where this storm of Nazi troops
00:39:22 invaded large parts of the Soviet Union.
00:39:25 And hence, one of the biggest wars in human history began.
00:39:30 Stalin for sure thought that this was never going to be,
00:39:34 that Hitler is not crazy enough to invade the Soviet Union.
00:39:38 And it makes, geopolitically makes total sense
00:39:41 to be collaborators.
00:39:43 And ideologically, even though there’s a tension
00:39:46 between communism and fascism or national socialism,
00:39:50 however you formulate it,
00:39:51 it still feels like this is the right way
00:39:54 to battle the West.
00:39:55 Right.
00:39:56 They were more ideologically aligned.
00:39:58 They in theory had a common enemy, which is the West.
00:40:01 So it made total sense.
00:40:03 And in terms of negotiations
00:40:05 and the way things were communicated,
00:40:07 it seemed to Stalin that for sure,
00:40:11 that they would remain, at least for a while,
00:40:16 peaceful collaborators.
00:40:17 And that, and everybody, because of that,
00:40:21 in the Soviet Union believed that it was a huge shock
00:40:24 when Kiev was invaded.
00:40:25 And you hear echoes of that when I travel to Ukraine,
00:40:28 sort of the shock of the invasion.
00:40:32 It’s not just the invasion on one particular border,
00:40:34 but the invasion of the capital city.
00:40:36 And just like, holy shit, especially at that time,
00:40:41 when you thought World War I
00:40:44 was the war to end all wars.
00:40:46 You would never have this kind of war.
00:40:48 And holy shit, this person is mad enough
00:40:52 to try to take on this monster in the Soviet Union.
00:40:56 So it’s no longer going to be a war
00:40:58 of hundreds of thousands dead.
00:40:59 It’ll be a war of tens of millions dead.
00:41:02 And yeah, but that, that’s a very large scale kind of lie,
00:41:08 but I’m sure there’s in politics and geopolitics,
00:41:11 that kind of lying happening all the time.
00:41:14 And a lot of people pay financially
00:41:17 and with their lives for that kind of lying.
00:41:19 But in our personal lives, I don’t know how often we,
00:41:22 maybe we.
00:41:23 I think people do.
00:41:23 I mean, like think of spouses
00:41:25 cheating on their partners, right?
00:41:27 And then like having to lie,
00:41:28 like where were you last night?
00:41:29 Stuff like that. Oh shit, that’s tough.
00:41:30 Yeah, that’s true.
00:41:31 Like that’s, I think, you know, I mean,
00:41:34 unfortunately that stuff happens all the time, right?
00:41:36 Or having like multiple families, that one is great.
00:41:38 When each family doesn’t know about the other one
00:41:42 and like maintaining that life.
00:41:44 There’s probably a sense of excitement about that too.
00:41:48 Or. It seems unnecessary, yeah.
00:41:50 But. Why?
00:41:51 Well, just lying.
00:41:52 Like, you know, the truth finds a way of coming out.
00:41:56 You know? Yes.
00:41:57 But hence that’s the thrill.
00:41:59 Yeah, perhaps.
00:42:00 Yeah, people.
00:42:01 I mean, you know, that’s why I think actually like poker.
00:42:04 What’s so interesting about poker
00:42:05 is most of the best players I know,
00:42:09 they’re always exceptions, you know?
00:42:10 They’re always bad eggs.
00:42:12 But actually poker players are very honest people.
00:42:15 I would say they are more honest than the average,
00:42:17 you know, if you just took random population sample.
00:42:22 Because A, you know, I think, you know,
00:42:26 humans like to have that.
00:42:28 Most people like to have some kind of, you know,
00:42:30 mysterious, you know, an opportunity to do something
00:42:32 like a little edgy.
00:42:34 So we get to sort of scratch that itch
00:42:36 of being edgy at the poker table,
00:42:38 where it’s like, it’s part of the game.
00:42:40 Everyone knows what they’re in for, and that’s allowed.
00:42:43 And you get to like really get that out of your system.
00:42:47 And then also like poker players learned that, you know,
00:42:51 I would play in a huge game against some of my friends,
00:42:55 even my partner, Igor, where we will be, you know,
00:42:57 absolutely going at each other’s throats,
00:42:59 trying to draw blood in terms of winning money
00:43:02 off each other and like getting under each other’s skin,
00:43:04 winding each other up, doing the craftiest moves we can.
00:43:08 But then once the game’s done, you know,
00:43:11 the winners and the losers will go off
00:43:12 and get a drink together and have a fun time
00:43:13 and like talk about it in this like weird academic way
00:43:16 afterwards, because, and that’s why games are so great.
00:43:19 Cause you get to like live out like this competitive urge
00:43:23 that, you know, most people have.
00:43:26 What’s it feel like to lose?
00:43:28 Like we talked about bluffing when it worked out.
00:43:31 What about when you, when you go broke?
00:43:35 So like in a game, I’m, fortunately I’ve never gone broke.
00:43:39 You mean like full life?
00:43:40 Full life, no.
00:43:42 I know plenty of people who have.
00:43:46 And I don’t think Igor would mind me saying he went,
00:43:48 you know, he went broke once in poker, you know,
00:43:50 early on when we were together.
00:43:51 I feel like you haven’t lived unless you’ve gone broke.
00:43:54 Oh yeah.
00:43:55 Some sense.
00:43:56 Right.
00:43:57 Some fundamental sense.
00:43:58 I mean, I’m happy, I’ve sort of lived through it,
00:43:59 vicariously through him when he did it at the time.
00:44:02 But yeah, what’s it like to lose?
00:44:04 Well, it depends.
00:44:04 So it depends on the amount.
00:44:06 It depends what percentage of your net worth
00:44:07 you’ve just lost.
00:44:10 It depends on your brain chemistry.
00:44:11 It really, you know, varies from person to person.
00:44:13 You have a very cold calculating way of thinking about this.
00:44:16 So it depends what percentage.
00:44:18 Well, it did, it really does, right?
00:44:20 Yeah, it’s true, it’s true.
00:44:22 I mean, that’s another thing poker trains you to do.
00:44:24 You see, you see everything in percentages
00:44:28 or you see everything in like ROI or expected hourlies
00:44:30 or cost benefit, et cetera.
00:44:32 You know, so that’s, one of the things I’ve tried to do
00:44:37 is calibrate the strength of my emotional response
00:44:39 to the win or loss that I’ve received.
00:44:43 Because it’s no good if you like, you know,
00:44:45 you have a huge emotional dramatic response to a tiny loss
00:44:50 or on the flip side, you have a huge win
00:44:53 and you’re sort of so dead inside
00:44:54 that you don’t even feel it.
00:44:55 Well, that’s, you know, that’s a shame.
00:44:56 I want my emotions to calibrate with reality
00:45:00 as much as possible.
00:45:02 So yeah, what’s it like to lose?
00:45:04 I mean, I’ve had times where I’ve lost, you know,
00:45:07 busted out of a tournament that I thought I was gonna win in
00:45:09 especially if I got really unlucky or I make a dumb play
00:45:13 where I’ve gone away and like, you know, kicked the wall,
00:45:17 punched a wall, I like nearly broke my hand one time.
00:45:19 Like I’m a lot less competitive than I used to be.
00:45:24 Like I was like pathologically competitive in my like
00:45:27 late teens, early twenties, I just had to win at everything.
00:45:30 And I think that sort of slowly waned as I’ve gotten older.
00:45:33 According to you, yeah.
00:45:34 According to me.
00:45:35 I don’t know if others would say the same, right?
00:45:38 I feel like ultra competitive people,
00:45:40 like I’ve heard Joe Rogan say this to me.
00:45:43 It’s like that he’s a lot less competitive
00:45:44 than he used to be.
00:45:45 I don’t know about that.
00:45:47 Oh, I believe it.
00:45:48 No, I totally believe it.
00:45:49 Like, because as you get, you can still be,
00:45:51 like I care about winning.
00:45:53 Like when, you know, I play a game with my buddies online
00:45:56 or, you know, whatever it is,
00:45:57 Polytopia is my current obsession.
00:45:58 Like when I.
00:46:00 Thank you for passing on your obsession to me.
00:46:02 Are you playing now?
00:46:03 Yeah, I’m playing now.
00:46:04 We gotta have a game.
00:46:05 But I’m terrible and I enjoy playing terribly.
00:46:08 I don’t wanna have a game because that’s gonna pull me
00:46:10 into your monster of like a competitive play.
00:46:13 It’s important, it’s an important skill.
00:46:15 I’m enjoying playing on the, I can’t.
00:46:18 You just do the points thing, you know, against the bots.
00:46:20 Yeah, against the bots.
00:46:22 And I can’t even do the, there’s like a hard one
00:46:26 and there’s a very hard one.
00:46:26 And then it’s crazy, yeah.
00:46:27 It’s crazy.
00:46:28 I can’t, I don’t even enjoy the hard one.
00:46:29 The crazy, I really don’t enjoy.
00:46:32 Cause it’s intense.
00:46:33 You have to constantly try to win
00:46:34 as opposed to enjoy building a little world and.
00:46:37 Yeah, no, no, no.
00:46:38 There’s no time for exploration in polytopia.
00:46:40 You gotta get.
00:46:40 Well, when, once you graduate from the crazies,
00:46:42 then you can come play the.
00:46:44 Graduate from the crazies.
00:46:46 Yeah, so in order to be able to play a decent game
00:46:48 against like, you know, our group,
00:46:51 you’ll need to be, you’ll need to be consistently winning
00:46:55 like 90% of games against 15 crazy bots.
00:46:58 Yeah.
00:46:59 And you’ll be able to, like there’ll be,
00:47:00 I could teach you it within a day, honestly.
00:47:03 How to beat the crazies?
00:47:04 How to beat the crazies.
00:47:05 And then, and then you’ll be ready for the big leagues.
00:47:08 Generalizes to more than just polytopia.
00:47:11 But okay, why were we talking about polytopia?
00:47:14 Losing hurts.
00:47:15 Losing hurts, oh yeah.
00:47:17 Yes, competitiveness over time.
00:47:19 Oh yeah.
00:47:20 I think it’s more that, at least for me,
00:47:23 I still care about playing,
00:47:24 about winning when I choose to play something.
00:47:26 It’s just that I don’t see the world
00:47:28 as zero sum as I used to be, you know?
00:47:31 I think as one gets older and wiser,
00:47:34 you start to see the world more as a positive sum thing.
00:47:38 Or at least you’re more aware of externalities,
00:47:40 of scenarios, of competitive interactions.
00:47:43 And so, yeah, I just like, I’m more,
00:47:47 and I’m more aware of my own, you know, like,
00:47:50 if I have a really strong emotional response to losing,
00:47:52 and that makes me then feel shitty for the rest of the day,
00:47:54 and then I beat myself up mentally for it.
00:47:57 Like, I’m now more aware
00:47:58 that that’s an unnecessary negative externality.
00:48:01 So I’m like, okay, I need to find a way to turn this down,
00:48:03 you know, dial this down a bit.
00:48:05 Was poker the thing that has,
00:48:07 if you think back at your life,
00:48:09 and think about some of the lower points of your life,
00:48:12 like the darker places you’ve got in your mind,
00:48:15 did it have to do something with poker?
00:48:17 Like, did losing spark the descent into darkness,
00:48:23 or was it something else?
00:48:27 I think my darkest points in poker
00:48:29 were when I was wanting to quit and move on to other things,
00:48:34 but I felt like I hadn’t ticked all the boxes
00:48:38 I wanted to tick.
00:48:39 Like, I wanted to be the most winningest female player,
00:48:43 which is by itself a bad goal.
00:48:45 You know, that was one of my initial goals,
00:48:47 and I was like, well, I haven’t, you know,
00:48:48 and I wanted to win a WPT event.
00:48:50 I’ve won one of these, I’ve won one of these,
00:48:52 but I want one of those as well.
00:48:53 And that sort of, again, like,
00:48:57 it’s a drive of like overoptimization to random metrics
00:49:00 that I decided were important
00:49:02 without much wisdom at the time, but then like carried on.
00:49:06 That made me continue chasing it longer
00:49:09 than I still actually had the passion to chase it for.
00:49:12 And I don’t have any regrets that, you know,
00:49:15 I played for as long as I did, because who knows,
00:49:17 you know, I wouldn’t be sitting here,
00:49:19 I wouldn’t be living this incredible life
00:49:21 that I’m living now.
00:49:22 This is the height of your life right now.
00:49:24 This is it, peak experience, absolute pinnacle
00:49:28 here in your robot land with your creepy light.
00:49:34 No, it is, I mean, I wouldn’t change a thing
00:49:37 about my life right now, and I feel very blessed to say that.
00:49:40 So, but the dark times were in the sort of like 2016 to 18,
00:49:47 even sooner really, where I was like,
00:49:50 I had stopped loving the game,
00:49:53 and I was going through the motions,
00:49:55 and I would, and then I was like, you know,
00:49:59 I would take the losses harder than I needed to,
00:50:02 because I’m like, ah, it’s another one.
00:50:03 And it was, I was aware that like,
00:50:04 I felt like my life was ticking away,
00:50:06 and I was like, is this gonna be what’s on my tombstone?
00:50:07 Oh yeah, she played the game of, you know,
00:50:09 this zero sum game of poker,
00:50:11 slightly more optimally than her next opponent.
00:50:14 Like, cool, great, legacy, you know?
00:50:16 So, I just wanted, you know, there was something in me
00:50:19 that knew I needed to be doing something
00:50:21 more directly impactful and just meaningful.
00:50:25 It was just like your search for meaning,
00:50:26 and I think it’s a thing a lot of poker players,
00:50:27 even a lot of, I imagine any games players
00:50:30 who sort of love intellectual pursuits,
00:50:36 you know, I think you should ask Magnus Carlsen
00:50:37 this question, I don’t know what he’s on.
00:50:38 He’s walking away from chess, right?
00:50:39 Yeah, like, it must be so hard for him.
00:50:41 You know, he’s been on the top for so long,
00:50:43 and it’s like, well, now what?
00:50:45 He’s got this incredible brain, like, what to put it to?
00:50:49 And, yeah, it’s.
00:50:52 It’s this weird moment where I’ve just spoken with people
00:50:55 that won multiple gold medals at the Olympics,
00:50:58 and the depression hits hard after you win.
00:51:02 Dopamine crash.
00:51:04 Because it’s a kind of a goodbye,
00:51:05 saying goodbye to that person,
00:51:06 to all the dreams you had that you thought
00:51:09 would give meaning to your life,
00:51:11 but in fact, life is full of constant pursuits of meaning.
00:51:16 It doesn’t, you don’t like arrive and figure it all out,
00:51:20 and there’s endless bliss,
00:51:21 and it continues going on and on.
00:51:23 You constantly have to figure out how to rediscover yourself.
00:51:27 And so for you, like that struggle to say goodbye to poker,
00:51:31 you have to like find the next.
00:51:33 There’s always a bigger game.
00:51:34 That’s the thing.
00:51:35 That’s my motto is like, what’s the next game?
00:51:38 And more importantly,
00:51:41 because obviously game usually implies zero sum,
00:51:43 like what’s a game which is like omni-win?
00:51:46 Like what? Omni-win.
00:51:47 Omni-win.
00:51:48 Why is omni-win so important?
00:51:50 Because if everyone plays zero sum games,
00:51:54 that’s a fast track to either completely stagnate
00:51:57 as a civilization or, actually,
00:51:59 far more likely, to extinct ourselves.
00:52:02 You know, like the playing field is finite.
00:52:05 You know, nuclear powers are playing,
00:52:09 you know, a game of poker with, you know,
00:52:12 but their chips are nuclear weapons, right?
00:52:14 And the stakes have gotten so large
00:52:17 that if anyone makes a single bet, you know,
00:52:19 fires some weapons, the playing field breaks.
00:52:22 I made a video on this.
00:52:22 Like, you know, the playing field is finite.
00:52:26 And if we keep playing these adversarial zero sum games,
00:52:31 thinking that we, you know,
00:52:33 in order for us to win, someone else has to lose,
00:52:35 or if we lose that, you know, someone else wins,
00:52:37 that will extinct us.
00:52:40 It’s just a matter of when.
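To make the zero-sum versus omni-win distinction concrete, here is a minimal sketch; the payoff numbers are purely illustrative assumptions, not anything from the conversation. In a zero-sum game the two players' payoffs cancel in every outcome, while in a positive-sum game there are outcomes where both come out ahead.

```python
# Illustrative sketch: zero-sum vs positive-sum payoffs (numbers made up).
# Each entry maps a pair of strategies to (payoff to A, payoff to B).

zero_sum_game = {
    ("aggress", "aggress"): (0, 0),
    ("aggress", "yield"):   (+5, -5),
    ("yield", "aggress"):   (-5, +5),
}

positive_sum_game = {
    ("cooperate", "cooperate"): (+3, +3),   # both gain: the "omni-win" case
    ("cooperate", "defect"):    (-1, +4),
    ("defect", "defect"):       (0, 0),
}

def total_welfare(game):
    # Sum of both players' payoffs for every outcome.
    return {outcome: a + b for outcome, (a, b) in game.items()}

print(total_welfare(zero_sum_game))      # every outcome sums to 0
print(total_welfare(positive_sum_game))  # mutual cooperation sums to +6
```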
00:52:41 What do you think about that mutually assured destruction,
00:52:44 that very simple,
00:52:46 almost to the point of caricaturing game theory idea
00:52:49 that does seem to be at the core
00:52:52 of why we haven’t blown each other up yet
00:52:54 with nuclear weapons.
00:52:55 Do you think there’s some truth to that,
00:52:57 this kind of stabilizing force
00:53:00 of mutually assured destruction?
00:53:01 And do you think that’s gonna hold up
00:53:04 through the 21st century?
00:53:07 I mean, it has held.
00:53:09 Yes, there’s definitely truth to it,
00:53:11 that it was a, you know, it’s a Nash equilibrium.
00:53:14 Yeah, are you surprised it held this long?
00:53:17 Isn’t it crazy?
00:53:18 It is crazy when you factor in all the like
00:53:21 near miss accidental firings.
00:53:24 Yes, that makes me wonder, like, you know,
00:53:28 are you familiar with the like quantum suicide
00:53:29 thought experiment, where it’s basically like,
00:53:33 you have a, you know, like a Russian roulette
00:53:36 type scenario hooked up to some kind of quantum event,
00:53:40 you know, particle splitting or pair of particles splitting.
00:53:45 And if it, you know, if it goes A,
00:53:48 then the gun doesn’t go off, and if it goes B,
00:53:50 then it does go off and it kills you.
00:53:52 Because you can only ever be in the universe,
00:53:55 you know, assuming like the Everett branch,
00:53:56 you know, multiverse theory,
00:53:57 you’ll always only end up in the branch
00:54:00 where, you know, option A continually comes in,
00:54:03 but you run that experiment enough times,
00:54:05 it starts getting pretty damn, you know,
00:54:07 the outcome tree gets huge,
00:54:09 there’s a million different scenarios,
00:54:10 but you’ll always find yourself in this,
00:54:12 in the one where it didn’t go off.
00:54:14 And so from that perspective, you are essentially immortal
00:54:20 because you will only find yourself
00:54:22 in the set of observers that make it down that path.
00:54:25 So it’s kind of a…
00:54:26 That doesn’t mean, that doesn’t mean
00:54:30 you’re still not gonna be fucked at some point in your life.
00:54:33 No, of course not, no, I’m not advocating
00:54:34 like that we’re all immortal because of this.
00:54:36 It’s just like a fun thought experiment.
00:54:38 And the point is it like raises this thing
00:54:40 of like these things called observer selection effects,
00:54:42 which Bostrom, Nick Bostrom talks about a lot,
00:54:44 and I think people should go read.
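As a toy illustration of the observer-selection idea, here is a minimal sketch, my own and not anything from the conversation: simulate many histories of 50/50 events where one outcome is fatal, then look only at the histories with a surviving observer. Survival is rare overall, but every surviving history looks like an unbroken run of lucky outcomes, which is exactly what a surviving observer would remember.

```python
import random

# Toy observer-selection sketch (illustrative only).
# Each history is a sequence of 50/50 events; outcome "B" is fatal.

N_EVENTS = 12        # events per history (assumed number, purely illustrative)
N_TRIALS = 100_000

def run_history():
    history = []
    for _ in range(N_EVENTS):
        outcome = random.choice("AB")
        history.append(outcome)
        if outcome == "B":        # fatal branch: this observer is gone
            return history, False
    return history, True          # survived every event

trials = [run_history() for _ in range(N_TRIALS)]
survivors = [h for h, alive in trials if alive]

# Surviving is rare overall (about 0.5**12, roughly 1 in 4,000)...
print(f"survival rate: {len(survivors) / N_TRIALS:.5f}")
# ...but every surviving history is an unbroken run of "A" outcomes.
if survivors:
    print("a surviving history:", survivors[0])
```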
00:54:46 It’s really powerful,
00:54:47 but I think it could be overextended, that logic.
00:54:48 I’m not sure exactly how it can be.
00:54:52 I just feel like you can get, you can overgeneralize
00:54:57 that logic somehow.
00:54:58 Well, no, I mean, it leaves you into like solipsism,
00:55:00 which is a very dangerous mindset.
00:55:02 Again, if everyone like falls into solipsism of like,
00:55:04 well, I’ll be fine.
00:55:05 That’s a great way of creating a very,
00:55:08 self terminating environment.
00:55:10 But my point is, is that with the nuclear weapons thing,
00:55:14 there have been at least, I think it’s 12 or 11 near misses
00:55:19 of like just stupid things, like there was moonrise
00:55:22 over Norway, and it made weird reflections
00:55:25 of some glaciers in the mountains, which set off,
00:55:28 I think the alarms of NORAD radar,
00:55:33 and that put them on high alert, nearly ready to shoot.
00:55:35 And it was only because the head of Russian military
00:55:39 happened to be at the UN in New York at the time
00:55:41 that they go like, well, wait a second,
00:55:43 why would they fire now when their guy is there?
00:55:47 And it was only that lucky happenstance,
00:55:49 which doesn’t happen very often where they didn’t then
00:55:50 escalate it into firing.
00:55:51 And there’s a bunch of these different ones.
00:55:53 Stanislav Petrov, like, he’s the person
00:55:56 who should be the most famous person on earth,
00:55:57 cause he’s probably on expectation,
00:55:59 saved the most human lives of anyone,
00:56:01 like billions of people by ignoring Russian orders to fire
00:56:05 because he felt in his gut that actually
00:56:06 this was a false alarm.
00:56:07 And it turned out to be, you know, a very hard thing to do.
00:56:11 And there’s so many of those scenarios that I can’t help
00:56:13 but wonder at this point that we aren’t having this kind
00:56:15 of like selection effect thing going on.
00:56:17 Cause you look back and you’re like, geez,
00:56:19 that’s a lot of near misses.
00:56:20 But of course we don’t know the actual probabilities
00:56:22 that each one would have ended up
00:56:24 in nuclear war.
00:56:25 Maybe they were not that likely, but still the point is,
00:56:27 it’s a very dark, stupid game that we’re playing.
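To put rough numbers on that worry, here is a back-of-the-envelope sketch. The count of roughly a dozen near misses is the figure mentioned above, but the per-incident escalation probabilities are assumptions for illustration only: even if each incident had just a modest chance of escalating, the chance that at least one of them would have, 1 - (1 - p)^N, gets uncomfortably large.

```python
# Back-of-envelope: probability that at least one of N independent near
# misses escalates. N ~= 12 is the count mentioned above; the per-incident
# probabilities p are purely illustrative assumptions.

n_near_misses = 12

for p in (0.01, 0.05, 0.10, 0.20):
    p_at_least_one = 1 - (1 - p) ** n_near_misses
    print(f"p = {p:.2f} per incident -> P(at least one) = {p_at_least_one:.2f}")
```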
00:56:30 And it is an absolute moral imperative if you ask me
00:56:35 to get as many people thinking about ways
00:56:37 to make this like very precarious situation more stable.
00:56:39 Cause we’re in a Nash equilibrium,
00:56:41 but it’s not like we’re in the bottom of a pit.
00:56:42 You know, if you would like map it topographically,
00:56:46 it’s not like a stable ball at the bottom of a thing.
00:56:48 We’re not in a stable equilibrium because of that.
00:56:49 We’re on the top of a hill with a ball balanced on top.
00:56:52 And just at any little nudge could send it flying down
00:56:55 and you know, nuclear war pops off
00:56:57 and hellfire and bad times.
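Restating the hill-versus-pit picture in standard stability terms, as a gloss on the metaphor rather than anything said here: an equilibrium is a point where the potential is flat, and whether a nudge dies out or grows depends on the curvature there.

```latex
% Equilibrium and stability in the "ball on a landscape" picture.
% x* is an equilibrium where the potential V is flat; the sign of the
% curvature says whether a nudge dies out (pit) or grows (hilltop).
\[
V'(x^*) = 0, \qquad
\begin{cases}
V''(x^*) > 0 & \text{stable: ball at the bottom of a pit} \\
V''(x^*) < 0 & \text{unstable: ball balanced on top of a hill}
\end{cases}
\]
```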
00:57:00 On the positive side,
00:57:01 life on earth will probably still continue.
00:57:03 And another intelligent civilization might still pop up.
00:57:06 Maybe.
00:57:07 Several millennia after.
00:57:08 Pick your X risk, depends on the X risk.
00:57:10 Nuclear war, sure.
00:57:11 That’s one of the perhaps less bad ones.
00:57:12 Green goo through synthetic biology, very bad.
00:57:17 Will turn, you know, destroy all, you know,
00:57:22 organic matter through, you know,
00:57:25 it’s basically like a biological paperclip maximizer.
00:57:28 Also bad.
00:57:28 Or AI type, you know, mass extinction thing as well
00:57:32 would also be bad.
00:57:33 Shh, they’re listening.
00:57:35 There’s a robot right behind you.
00:57:36 Okay, wait.
00:57:37 So let me ask you about this from a game theory perspective.
00:57:40 Do you think we’re living in a simulation?
00:57:42 Do you think we’re living inside a video game
00:57:44 created by somebody else?
00:57:48 Well, so what was the second part of the question?
00:57:50 Do I think we’re living in a simulation and?
00:57:52 A simulation that is observed by somebody
00:57:56 for purpose of entertainment.
00:57:58 So like a video game.
00:57:59 Are we listening?
00:58:00 Are we, because there’s a,
00:58:03 it’s like Phil Hellmuth type of situation, right?
00:58:05 Like there’s a creepy level of like,
00:58:09 this is kind of fun and interesting.
00:58:13 Like there’s a lot of interesting stuff going on.
00:58:16 Maybe that could be somehow integrated
00:58:18 into the evolutionary process where the way we perceive
00:58:23 and are.
00:58:24 Are you asking me if I believe in God?
00:58:27 Sounds like it.
00:58:28 Kind of, but God seems to be not optimizing
00:58:33 in the different formulations of God that we conceive of.
00:58:37 He doesn’t seem to be, or she, optimizing
00:58:39 for like personal entertainment.
00:58:44 Maybe the older gods did.
00:58:45 But the, you know, just like, basically like a teenager
00:58:49 in their mom’s basement watching, creating a fun universe
00:58:54 to observe what kind of crazy shit might happen.
00:58:58 Okay, so to try and answer this.
00:59:00 Do I think there is some kind of extraneous intelligence
00:59:11 to like our, you know, classic measurable universe
00:59:16 that we, you know, can measure with, you know,
00:59:18 through our current physics and instruments?
00:59:23 I think so, yes.
00:59:25 Partly because I’ve had just small little bits of evidence
00:59:29 in my own life, which have made me question.
00:59:32 Like, so I was a diehard atheist, even five years ago.
00:59:37 You know, I got into like the rationality community,
00:59:39 big fan of LessWrong, which continues to be an incredible resource.
00:59:45 But I’ve just started to have too many little snippets
00:59:49 of experience, which don’t make sense with the current sort
00:59:54 of purely materialistic explanation of how reality works.
01:00:04 Isn’t that just like a humbling, practical realization
01:00:09 that we don’t know how reality works?
01:00:12 Isn’t that just a reminder to yourself that you’re
01:00:15 like, I don’t know how reality works, isn’t that just
01:00:19 a reminder to yourself?
01:00:20 Yeah, no, it’s a reminder of epistemic humility
01:00:22 because I fell too hard, you know, same as people,
01:00:25 like I think, you know, many people who are just like,
01:00:27 my religion is the way, this is the correct way,
01:00:29 this is the work, this is the law, you are immoral
01:00:32 if you don’t follow this, blah, blah, blah.
01:00:33 I think they are lacking epistemic humility.
01:00:36 They’re a little too much hubris there.
01:00:38 But similarly, I think that sort of the Richard Dawkins
01:00:40 realism is too rigid as well and doesn’t, you know,
01:00:48 there’s a way to try and navigate these questions
01:00:51 which still honors the scientific method,
01:00:52 which I still think is our best sort of realm
01:00:54 of like reasonable inquiry, you know, a method of inquiry.
01:00:58 So an example, two kind of notable examples
01:01:03 that like really rattled my cage.
01:01:06 The first one was actually in 2010 early on
01:01:09 in quite early on in my poker career.
01:01:13 And I, remember the Icelandic volcano that erupted
01:01:19 that like shut down kind of all Atlantic airspace.
01:01:22 And it meant I got stuck down in the South of France.
01:01:25 I was there for something else.
01:01:27 And I couldn’t get home and someone said,
01:01:29 well, there’s a big poker tournament happening in Italy.
01:01:31 Maybe, do you wanna go?
01:01:32 I was like, all right, sure.
01:01:33 Like, let’s, you know, got a train across,
01:01:35 found a way to get there.
01:01:37 And the buy in was 5,000 euros,
01:01:39 which was much bigger than my bankroll would normally allow.
01:01:42 And so I played a feeder tournament, won my way in
01:01:46 kind of like I did with the Monte Carlo big one.
01:01:49 So then I won my way, you know,
01:01:50 from 500 euros into 5,000 euros to play this thing.
01:01:54 And on day one of then the big tournament,
01:01:57 which turned out to have,
01:01:58 it was the biggest tournament ever held in Europe
01:02:00 at the time.
01:02:01 It got over like 1,200 people, absolutely huge.
01:02:04 And I remember they dimmed the lights before, you know,
01:02:08 the normal shuffle up and deal
01:02:10 to tell everyone to start playing.
01:02:12 And they played Chemical Brothers, Hey Boy, Hey Girl,
01:02:16 which I don’t know why it’s notable,
01:02:17 but it was just like a really,
01:02:18 it was a song I always liked.
01:02:19 It was like one of these like pump me up songs.
01:02:21 And I was sitting there thinking, oh yeah, it’s exciting.
01:02:22 I’m playing this really big tournament.
01:02:24 And out of nowhere, just suddenly this voice in my head,
01:02:29 just, and it sounded like my own sort of, you know,
01:02:32 when you think in your mind, you hear a voice kind of, right?
01:02:34 At least I do.
01:02:36 And so it sounded like my own voice and it said,
01:02:38 you are going to win this tournament.
01:02:41 And it was so powerful that I got this like wave of like,
01:02:44 you know, sort of goosebumps down my body.
01:02:46 And that I even, I remember looking around being like,
01:02:48 did anyone else hear that?
01:02:49 And obviously people are in their phones,
01:02:51 like no one else heard it.
01:02:51 And I was like, okay, six days later,
01:02:56 I win the fucking tournament out of 1,200 people.
01:02:59 And I don’t know how to explain it.
01:03:08 Okay, yes, maybe I have that feeling
01:03:13 before every time I play.
01:03:14 And it’s just that I happened to, you know,
01:03:16 because I won the tournament, I retroactively remembered it.
01:03:18 But that’s just.
01:03:19 Or the feeling gave you a kind of,
01:03:22 you know, from Phil Hellmuth, Hellmuthian.
01:03:24 Well, exactly.
01:03:25 Like it gave you a confident, a deep confidence.
01:03:28 And it did.
01:03:29 It definitely did.
01:03:30 Like, I remember then feeling this like sort of,
01:03:32 well, although I remember then on day one,
01:03:33 I then went and lost half my stack quite early on.
01:03:35 And I remember thinking like, oh, well, that was bullshit.
01:03:37 You know, what kind of premonition is this?
01:03:40 Thinking, oh, I’m out.
01:03:41 But you know, I managed to like keep it together
01:03:42 and recover and then just went like pretty perfectly
01:03:46 from then on.
01:03:47 And either way, it definitely instilled me
01:03:51 with this confidence.
01:03:52 And I don’t want to put, I can’t put an explanation.
01:03:55 Like, you know, was it some, you know, huge extraneous,
01:04:01 you know, supernatural thing driving me?
01:04:03 Or was it just my own self confidence or something
01:04:06 that just made me make the right decisions?
01:04:07 I don’t know.
01:04:08 And I don’t, I’m not going to put a frame on it.
01:04:10 And I think.
01:04:11 I think I know a good explanation.
01:04:12 So we’re a bunch of NPCs living in this world
01:04:14 created by, in the simulation.
01:04:16 And then people, not people, creatures from outside
01:04:20 of the simulation sort of can tune in
01:04:23 and play your character.
01:04:24 And that feeling you got is somebody just like,
01:04:27 they got to play a poker tournament through you.
01:04:29 Honestly, it felt like that.
01:04:30 It did actually feel a little bit like that.
01:04:33 But it’s been 12 years now.
01:04:35 I’ve retold the story many times.
01:04:36 Like, I don’t even know how much I can trust my memory.
01:04:38 You’re just an NPC retelling the same story.
01:04:41 Because they just played the tournament and left.
01:04:43 Yeah, they’re like, oh, that was fun.
01:04:44 Cool.
01:04:45 Yeah, cool.
01:04:46 Next time.
01:04:47 And now you’re for the rest of your life left
01:04:49 as a boring NPC retelling this story of greatness.
01:04:51 But it was, and what was interesting was that after that,
01:04:53 then I didn’t obviously win a major tournament
01:04:55 for quite a long time.
01:04:56 And it left, that was actually another sort of dark period
01:05:01 because I had this incredible,
01:05:02 like the highs of winning that,
01:05:04 just on a like material level were insane,
01:05:06 winning the money.
01:05:07 I was on the front page of newspapers
01:05:08 because there was like this girl that came out of nowhere
01:05:10 and won this big thing.
01:05:12 And so again, like sort of chasing that feeling
01:05:14 was difficult.
01:05:16 But then on top of that, there was this feeling
01:05:18 of like almost being touched by something bigger
01:05:22 that was like, ah.
01:05:24 So maybe, did you have a sense
01:05:26 that I might be somebody special?
01:05:29 Like this kind of,
01:05:34 I think that’s the confidence thing
01:05:36 that maybe you could do something special in this world
01:05:40 after all kind of feeling.
01:05:42 I definitely, I mean, this is the thing
01:05:45 I think everybody wrestles with to an extent, right?
01:05:48 We are truly the protagonists in our own lives.
01:05:51 And so it’s a natural bias, human bias
01:05:54 to feel special.
01:05:58 And I think, and in some ways we are special.
01:06:00 Every single person is special
01:06:01 because you are that, the universe does,
01:06:04 the world literally does revolve around you.
01:06:06 That’s the thing in some respect.
01:06:08 But of course, if you then zoom out
01:06:10 and take the amalgam of everyone’s experiences,
01:06:12 then no, it doesn’t.
01:06:12 So there is this shared sort of objective reality,
01:06:15 but sorry, there’s objective reality that is shared,
01:06:17 but then there’s also this subjective reality
01:06:19 which is truly unique to you.
01:06:20 And I think both of those things coexist.
01:06:22 And it’s not like one is correct and one isn’t.
01:06:24 And again, anyone who’s like,
01:06:26 oh no, your lived experience is everything
01:06:28 versus your lived experience is nothing.
01:06:30 No, it’s a blend between these two things.
01:06:32 They can exist concurrently.
01:06:33 But there’s a certain kind of sense
01:06:35 that at least I’ve had my whole life.
01:06:36 And I think a lot of people have this as like,
01:06:38 well, I’m just like this little person.
01:06:40 Surely I can’t be one of those people
01:06:42 that do the big thing, right?
01:06:46 There’s all these big people doing big things.
01:06:48 There’s big actors and actresses, big musicians.
01:06:53 There’s big business owners and all that kind of stuff,
01:06:57 scientists and so on.
01:06:58 I have my own subjective experience that I enjoy and so on,
01:07:02 but there’s like a different layer.
01:07:04 Like surely I can’t do those great things.
01:07:09 I mean, one of the things just having interacted
01:07:11 with a lot of great people, I realized,
01:07:14 no, they’re like just the same humans as me.
01:07:20 And that realization I think is really empowering.
01:07:22 And to remind yourself.
01:07:24 What are they?
01:07:25 Huh?
01:07:25 What are they?
01:07:26 Are they?
01:07:27 Well, in terms of.
01:07:29 Depends on some, yeah.
01:07:30 They’re like a bag of insecurities and.
01:07:33 Yes.
01:07:36 Peculiar sort of, like their own little weirdnesses
01:07:41 and so on, I should say also.
01:07:48 They have the capacity for brilliance,
01:07:50 but they’re not generically brilliant.
01:07:52 Like, you know, we tend to say this person
01:07:55 or that person is brilliant, but really no,
01:07:59 they’re just like sitting there and thinking through stuff
01:08:02 just like the rest of us.
01:08:04 Right.
01:08:05 I think they’re in the habit of thinking through stuff
01:08:08 seriously, and they’ve built up a habit of not allowing
01:08:13 their mind to get trapped in a bunch of bullshit
01:08:15 and minutiae of day to day life.
01:08:16 They really think big ideas, but those big ideas,
01:08:21 it’s like allowing yourself the freedom to think big,
01:08:24 to realize that you can be one that actually solved
01:08:27 this particular big problem.
01:08:29 First identify a big problem that you care about,
01:08:31 then like, I can actually be the one
01:08:33 that solves this problem.
01:08:34 And like allowing yourself to believe that.
01:08:37 And I think sometimes you do need to have like
01:08:39 that shock go through your body and a voice tells you,
01:08:41 you’re gonna win this tournament.
01:08:42 Well, exactly.
01:08:43 And whether it was, it’s this idea of useful fictions.
01:08:50 So again, like going through all like
01:08:53 the classic rationalist training of LessWrong
01:08:55 where it’s like, you want your map,
01:08:57 you know, the image you have of the world in your head
01:09:00 to as accurately match up with how the world actually is.
01:09:04 You want the map and the territory to perfectly align
01:09:06 as, you know, you want it to be
01:09:08 as an accurate representation as possible.
01:09:11 I don’t know if I fully subscribed to that anymore,
01:09:13 having now had these moments of like feeling of something
01:09:17 either bigger or just actually just being overconfident.
01:09:20 Like there is value in overconfidence sometimes.
01:09:23 If you, you know, take, you know,
01:09:25 take Magnus Carlsen, right?
01:09:30 If he, I’m sure from a young age,
01:09:32 he knew he was very talented,
01:09:34 but I wouldn’t be surprised if he also had something
01:09:37 in him to, well, actually maybe he’s a bad example
01:09:40 because he truly is the world’s greatest,
01:09:42 but someone who it was unclear
01:09:44 whether they were gonna be the world’s greatest,
01:09:45 but ended up doing extremely well
01:09:47 because they had this innate, deep self confidence,
01:09:50 this like even overblown idea
01:09:53 of how good their relative skill level is.
01:09:54 That gave them the confidence to then pursue this thing
01:09:56 and they’re like with the kind of focus and dedication
01:10:00 that it requires to excel in whatever it is
01:10:02 you’re trying to do, you know?
01:10:03 And so there are these useful fictions
01:10:06 and that’s where I think I diverge slightly
01:10:09 with the classic sort of rationalist community
01:10:14 because that’s a field that is worth studying
01:10:21 of like how the stories we tell,
01:10:23 what the stories we tell to ourselves,
01:10:25 even if they are actually false,
01:10:26 and even if we suspect they might be false,
01:10:28 how it’s better to sort of have that like little bit
01:10:30 of faith, like value in faith, I think actually.
01:10:34 And that’s partly another thing
01:10:35 that’s now led me to explore the concept of God,
01:10:40 whether you wanna call it a simulator,
01:10:42 the classic theological thing.
01:10:44 I think we’re all like alluding to the same thing.
01:10:46 Now, I don’t know, I’m not saying,
01:10:47 because obviously the Christian God
01:10:49 is like all benevolent, endless love.
01:10:53 The simulation, at least one of the simulation hypothesis
01:10:56 is like, as you said, like a teenager in his bedroom
01:10:58 who doesn’t really care, doesn’t give a shit
01:11:00 about the individuals within there.
01:11:02 It just like wants to see how the thing plays out
01:11:05 because it’s curious and it could turn it off like that.
01:11:07 Where on the sort of psychopathy
01:11:09 to benevolent spectrum God is, I don’t know.
01:11:13 But just having a little bit of faith
01:11:20 that there is something else out there
01:11:21 that might be interested in our outcome
01:11:24 is I think an essential thing actually for people to find.
01:11:27 A, because it creates commonality between,
01:11:29 it’s something we can all share.
01:11:31 And it is uniquely humbling of all of us to an extent.
01:11:35 It’s like a common objective.
01:11:37 But B, it gives people that little bit of like reserve
01:11:41 when things get really dark.
01:11:43 And I do think things are gonna get pretty dark
01:11:44 over the next few years.
01:11:47 But it gives that like,
01:11:49 to think that there’s something out there
01:11:50 that actually wants our game to keep going.
01:11:53 I keep calling it the game.
01:11:55 It’s a thing C and I, we call it the game.
01:11:57 You and C is a.k.a. Grimes, we call what the game?
01:12:02 Everything, the whole thing?
01:12:03 Yeah, we joke about like.
01:12:05 So everything is a game.
01:12:06 Well, the universe, like what if it’s a game
01:12:11 and the goal of the game is to figure out like,
01:12:13 well, either how to beat it, how to get out of it.
01:12:16 Maybe this universe is an escape room,
01:12:18 like a giant escape room.
01:12:20 And the goal is to figure out,
01:12:22 put all the pieces to puzzle, figure out how it works
01:12:25 in order to like unlock this like hyperdimensional key
01:12:29 and get out beyond what it is.
01:12:31 That’s.
01:12:32 No, but then, so you’re saying it’s like different levels
01:12:34 and it’s like a cage within a cage within a cage
01:12:36 and never like one cage at a time,
01:12:38 you figure out how to escape that.
01:12:41 Like a new level up, you know,
01:12:42 like us becoming multi planetary would be a level up
01:12:44 or us, you know, figuring out how to upload
01:12:46 our consciousnesses to the thing.
01:12:48 That would probably be a leveling up or spiritually,
01:12:51 you know, humanity becoming more combined
01:12:53 and less adversarial and bloodthirsty
01:12:56 and us becoming a little bit more enlightened.
01:12:58 That would be a leveling up.
01:12:59 You know, there’s many different frames to it,
01:13:01 whether it’s physical, you know, digital
01:13:05 or like metaphysical.
01:13:05 I wonder what the levels, I think,
01:13:07 I think level one for earth is probably
01:13:10 the biological evolutionary process.
01:13:14 So going from single cell organisms to early humans.
01:13:18 Then maybe level two is whatever’s happening inside our minds
01:13:22 and creating ideas and creating technologies.
01:13:26 That’s like evolutionary process of ideas.
01:13:31 And then multi planetary is interesting.
01:13:34 Is that fundamentally different
01:13:35 from what we’re doing here on earth?
01:13:37 Probably, because it allows us to like exponentially scale.
01:13:41 It delays the Malthusian trap, right?
01:13:46 It’s a way to keep the playing field,
01:13:50 to make the playing field get larger
01:13:53 so that it can accommodate more of our stuff, more of us.
01:13:57 And that’s a good thing,
01:13:58 but I don’t know if it like fully solves this issue of,
01:14:05 well, this thing called Moloch,
01:14:06 which we haven’t talked about yet,
01:14:07 but which is basically,
01:14:10 I call it the God of unhealthy competition.
01:14:12 Yeah, let’s go to Moloch.
01:14:13 What’s Moloch?
01:14:14 You did a great video on Moloch and one aspect of it,
01:14:18 the application of it to one aspect.
01:14:20 Instagram beauty filters.
01:14:22 True.
01:14:23 Very niche, but I wanted to start off small.
01:14:27 So Moloch was originally coined as,
01:14:34 well, so apparently back in the like Canaanite times,
01:14:39 it was to say ancient Carthaginian,
01:14:41 I can never say it Carthaginian,
01:14:43 somewhere around like 300 BC or 280, I don’t know.
01:14:47 There was supposedly this death cult
01:14:49 who would sacrifice their children
01:14:53 to this awful demon God thing they called Moloch
01:14:56 in order to get power to win wars.
01:14:59 So really dark, horrible things.
01:15:00 And it was literally like about child sacrifice,
01:15:02 whether they actually existed or not, we don’t know,
01:15:04 but in mythology they did.
01:15:05 And this God that they worshiped
01:15:07 was this thing called Moloch.
01:15:09 And then I don’t know,
01:15:10 it seemed like it was kind of quiet throughout history
01:15:13 in terms of mythology beyond that,
01:15:15 until this movie Metropolis in 1927 talked about this,
01:15:24 you see that there was this incredible futuristic city
01:15:27 that everyone was living great in,
01:15:29 but then the protagonist goes underground into the sewers
01:15:31 and sees that the city is run by this machine.
01:15:34 And this machine basically would just like kill the workers
01:15:37 all the time because it was just so hard to keep it running.
01:15:40 They were always dying.
01:15:40 So there was all this suffering that was required
01:15:42 in order to keep the city going.
01:15:44 And then the protagonist has this vision
01:15:45 that this machine is actually this demon Moloch.
01:15:48 So again, it’s like this sort of like mechanistic consumption
01:15:50 of humans in order to get more power.
01:15:54 And then Allen Ginsberg wrote a poem in the 50s,
01:15:58 which incredible poem called Howl about this thing Moloch.
01:16:04 And a lot of people sort of quite understandably
01:16:06 take the interpretation of that,
01:16:08 that he’s talking about capitalism.
01:16:10 But then the sort of pièce de résistance
01:16:13 that’s moved Moloch into this idea of game theory
01:16:16 was Scott Alexander of Slate Star Codex
01:16:20 wrote this incredible,
01:16:21 well, literally I think it might be my favorite piece
01:16:23 of writing of all time.
01:16:24 It’s called Meditations on Moloch.
01:16:26 Everyone must go read it.
01:16:28 And…
01:16:29 Slate Star Codex is a blog.
01:16:30 It’s a blog, yes.
01:16:32 We can link to it in the show notes or something, right?
01:16:35 No, don’t.
01:16:36 I, yes, yes.
01:16:38 But I like how you assume
01:16:41 I have a professional operation going on here.
01:16:43 I mean…
01:16:44 I shall try to remember to…
01:16:45 You were gonna assume.
01:16:46 What do you…
01:16:46 What are you, what do you want?
01:16:48 You’re giving the impression of it.
01:16:49 Yeah, I’ll look, please.
01:16:50 If I don’t, please somebody in the comments remind me.
01:16:53 I’ll help you.
01:16:54 If you don’t know this blog,
01:16:55 it’s one of the best blogs ever probably.
01:16:59 You should probably be following it.
01:17:01 Yes.
01:17:02 Are blogs still a thing?
01:17:03 I think they are still a thing, yeah.
01:17:04 Yeah, he’s migrated onto Substack,
01:17:06 but yeah, it’s still a blog.
01:17:07 Anyway.
01:17:08 Substack better not fuck things up, but…
01:17:10 I hope not, yeah.
01:17:12 I hope they don’t, I hope they don’t turn Molochy,
01:17:14 which will mean something to people when we continue.
01:17:16 Yeah.
01:17:17 When I stop interrupting for once.
01:17:19 No, no, it’s good.
01:17:20 Go on, yeah.
01:17:21 So anyway, so he writes,
01:17:22 he writes this piece, Meditations on Moloch,
01:17:24 and basically he analyzes the poem and he’s like,
01:17:28 okay, so it seems to be something relating
01:17:29 to where competition goes wrong.
01:17:32 And, you know, Moloch was historically this thing
01:17:35 of like where people would sacrifice a thing
01:17:38 that they care about, in this case, children,
01:17:40 their own children, in order to gain power,
01:17:43 a competitive advantage.
01:17:45 And if you look at almost everything that sort of goes wrong
01:17:48 in our society, it’s that same process.
01:17:52 So with the Instagram beauty filters thing,
01:17:56 you know, if you’re trying to become
01:17:57 a famous Instagram model,
01:18:01 you are incentivized to post the hottest pictures
01:18:03 of yourself that you can, you know,
01:18:05 you’re trying to play that game.
01:18:07 There’s a lot of hot women on Instagram.
01:18:08 How do you compete against them?
01:18:10 You post really hot pictures
01:18:11 and that’s how you get more likes.
01:18:14 As technology gets better, you know,
01:18:16 more makeup techniques come along.
01:18:19 And then more recently, these beauty filters
01:18:22 where like at the touch of a button,
01:18:23 it makes your face look absolutely incredible
01:18:26 compared to your natural face.
01:18:29 These technologies come along,
01:18:30 it’s everyone is incentivized to that short term strategy.
01:18:35 But over on net, it’s bad for everyone
01:18:39 because now everyone is kind of feeling
01:18:40 like they have to use these things.
01:18:41 And these things like they make you like,
01:18:43 the reason why I talked about them in this video
01:18:44 is because I noticed it myself, you know,
01:18:45 like I was trying to grow my Instagram for a while,
01:18:48 I’ve given up on it now.
01:18:49 But yeah, and I noticed these filters,
01:18:52 how good they made me look.
01:18:53 And I’m like, well, I know that everyone else
01:18:56 is kind of doing it.
01:18:57 Go subscribe to Liv’s Instagram.
01:18:58 Please, so I don’t have to use the filters.
01:19:01 I’ll post a bunch of, yeah, make it blow up.
01:19:06 So yeah, you felt the pressure actually.
01:19:08 Exactly, these short term incentives
01:19:10 to do this like, this thing that like either sacrifices
01:19:13 your integrity or something else
01:19:17 in order to like stay competitive,
01:19:20 which on aggregate turns like,
01:19:22 creates this like sort of race to the bottom spiral
01:19:24 where everyone else ends up in a situation
01:19:26 which is worse off than if they hadn’t started,
01:19:28 you know, than they were before.
01:19:29 Kind of like if, like at a football stadium,
01:19:34 like the system is so badly designed,
01:19:36 a competitive system of like everyone sitting
01:19:38 and having a view that if someone at the very front
01:19:40 stands up to get an even better view,
01:19:42 it forces everyone else behind
01:19:43 to like adopt that same strategy
01:19:45 just to get to where they were before.
01:19:46 But now everyone’s stuck standing up.
01:19:48 Like, so you need this like top down God’s eye coordination
01:19:51 to make it go back to the better state.
01:19:53 But from within the system, you can’t actually do that.
01:19:56 So that’s kind of what this Moloch thing is.
01:19:57 It’s this thing that makes people sacrifice values
01:20:01 in order to optimize for the winning the game in question,
01:20:04 the short term game.
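Here is a minimal sketch of that stadium dynamic as a toy simulation; the setup and numbers are my own illustrative assumptions, not anything from the conversation. Each fan's individually rational response to losing their view is to stand, and the process settles into everyone standing with no better view than when everyone sat.

```python
# Toy model of the stand-up-at-the-stadium dynamic (all numbers made up).
# Rows run from the front (index 0) to the back. A fan can see the pitch
# if nobody in front of them is taller; standing adds height but costs
# the comfort of sitting.

N_ROWS = 10

def can_see(heights, row):
    # You can see if you're at least as tall as everyone in front of you.
    return all(heights[row] >= h for h in heights[:row])

def simulate():
    standing = [False] * N_ROWS
    standing[0] = True                        # someone at the front stands up
    changed = True
    while changed:                            # let fans react until stable
        changed = False
        heights = [1 if s else 0 for s in standing]
        for row in range(1, N_ROWS):
            # Individually rational move: stand if you currently can't see.
            if not standing[row] and not can_see(heights, row):
                standing[row] = True
                changed = True
    return standing

final = simulate()
heights = [1 if s else 0 for s in final]
print("standing:", final)                                        # everyone stands
print("fans who can see:", sum(can_see(heights, r) for r in range(N_ROWS)))
# The same number of fans could see when everyone was seated, but now nobody
# gets to sit: a worse outcome that no single fan can undo on their own.
```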
01:20:05 But this Moloch, can you attribute it
01:20:09 to any one centralized source
01:20:11 or is it an emergent phenomena
01:20:13 from a large collection of people?
01:20:16 Exactly that.
01:20:16 It’s an emergent phenomena.
01:20:18 It’s a force of game theory.
01:20:22 It’s a force of bad incentives on a multi agent system
01:20:25 where you’ve got more, you know,
01:20:26 prisoner’s dilemma is technically
01:20:28 a kind of Moloch system as well,
01:20:30 but it’s just a two player thing.
01:20:31 But another word for Moloch is a multipolar trap.
01:20:35 Where basically you just got a lot of different people
01:20:37 all competing for some kind of prize.
01:20:40 And it would be better
01:20:42 if everyone didn’t do this one shitty strategy,
01:20:44 but because that strategy gives you a short term advantage,
01:20:47 everyone’s incentivized to do it.
01:20:48 And so everyone ends up doing it.
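A minimal sketch of that multipolar-trap logic, with made-up payoff numbers of my own: whatever the other agents do, the exploitative strategy pays the individual slightly more, so it is the dominant choice, yet when everyone adopts it the whole group ends up worse off than where it started.

```python
# Toy N-player multipolar trap (all payoff numbers are made up).
# Each agent picks "restrain" or "exploit". Exploiting gives a small
# private edge but imposes a cost on everyone, including yourself.

N = 100
PRIVATE_EDGE = 2        # what one agent gains by exploiting
SHARED_COST = 0.05      # cost each exploiter imposes on every agent
BASELINE = 10

def payoff(my_choice, num_exploiters):
    gain = PRIVATE_EDGE if my_choice == "exploit" else 0
    return BASELINE + gain - SHARED_COST * num_exploiters

# Dominant strategy check: for any number k of other exploiters,
# exploiting beats restraining for the individual.
for k in (0, 50, 99):
    print(k, payoff("exploit", k + 1) > payoff("restrain", k))   # always True

# But collectively the "everyone exploits" outcome is worse:
print("all restrain:", payoff("restrain", 0))    # 10.0
print("all exploit: ", payoff("exploit", N))     # 12 - 5 = 7.0
```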
01:20:49 So the responsibility for,
01:20:51 I mean, social media is a really nice place
01:20:54 for a large number of people to play game theory.
01:20:56 And so they also have the ability
01:20:59 to then design the rules of the game.
01:21:02 And is it on them to try to anticipate
01:21:05 what kind of like to do the thing
01:21:07 that poker players are doing to run simulation?
01:21:11 Ideally that would have been great.
01:21:12 If, you know, Mark Zuckerberg and Jack
01:21:15 and all the, you know, the Twitter founders and everyone,
01:21:17 if they had at least just run a few simulations
01:21:20 of how their algorithms would, you know,
01:21:23 if different types of algorithms would turn out for society,
01:21:26 that would have been great.
01:21:27 That’s really difficult to do
01:21:28 that kind of deep philosophical thinking
01:21:30 about thinking about humanity actually.
01:21:33 So not kind of this level of how do we optimize engagement
01:21:40 or what brings people joy in the short term,
01:21:42 but how is this thing going to change
01:21:45 the way people see the world?
01:21:47 How is it gonna get morphed in iterative games played
01:21:52 into something that will change society forever?
01:21:56 That requires some deep thinking.
01:21:58 That’s, I hope there’s meetings like that inside companies,
01:22:02 but I haven’t seen them.
01:22:03 There aren’t, that’s the problem.
01:22:04 And it’s difficult because like,
01:22:07 when you’re starting up a social media company,
01:22:09 you know, you’re aware that you’ve got investors to please,
01:22:13 there’s bills to pay, you know,
01:22:17 there’s only so much R&D you can afford to do.
01:22:19 You’ve got all these like incredible pressures,
01:22:21 bad incentives to get on and just build your thing
01:22:24 as quickly as possible and start making money.
01:22:26 And, you know, I don’t think anyone intended
01:22:29 when they built these social media platforms
01:22:32 and just to like preface it.
01:22:33 So the reason why, you know, social media is relevant
01:22:36 because it’s a very good example of like,
01:22:38 everyone these days is optimizing for, you know, clicks,
01:22:42 whether it’s a social media platforms themselves,
01:22:44 because, you know, every click gets more, you know,
01:22:47 impressions and impressions pay for, you know,
01:22:49 they get advertising dollars
01:22:50 or whether it’s individual influencers
01:22:53 or, you know, whether it’s a New York Times or whoever,
01:22:56 they’re trying to get their story to go viral.
01:22:58 So everyone’s got this bad incentive of using, you know,
01:23:00 as you called it, the clickbait industrial complex.
01:23:02 That’s a very Molochy system
01:23:04 because everyone is now using worse and worse tactics
01:23:06 in order to like try and win this attention game.
01:23:09 And yeah, so ideally these companies
01:23:14 would have had enough slack in the beginning
01:23:17 in order to run these experiments to see,
01:23:19 okay, what are the ways this could possibly go wrong
01:23:22 for people?
01:23:23 What are the ways that Moloch,
01:23:24 they should be aware of this concept of Moloch
01:23:26 and realize that whenever you have
01:23:28 a highly competitive multiagent system,
01:23:31 which social media is a classic example of,
01:23:32 millions of agents all trying to compete
01:23:34 for likes and so on,
01:23:35 and you try and bring all this complexity down
01:23:39 into like very small metrics,
01:23:41 such as number of likes, number of retweets,
01:23:44 whatever the algorithm optimizes for,
01:23:46 that is a guaranteed recipe for this stuff to go wrong
01:23:49 and become a race to the bottom.
01:23:51 I think there should be an honesty when founders,
01:23:53 I think there’s a hunger for that kind of transparency
01:23:56 of like, we don’t know what the fuck we’re doing.
01:23:58 This is a fascinating experiment.
01:23:59 We’re all running as a human civilization.
01:24:02 Let’s try this out.
01:24:04 And like, actually just be honest about this,
01:24:06 that we’re all like these weird rats in a maze.
01:24:10 None of us are controlling it.
01:24:12 There’s this kind of sense like the founders,
01:24:15 the CEO of Instagram or whatever,
01:24:17 Mark Zuckerberg has a control and he’s like,
01:24:19 like with strings playing people.
01:24:21 No, they’re.
01:24:22 He’s at the mercy of this is like everyone else.
01:24:24 He’s just like trying to do his best.
01:24:26 And like, I think putting on a smile
01:24:29 and doing over polished videos
01:24:32 about how Instagram and Facebook are good for you,
01:24:36 I think is not the right way to actually ask
01:24:39 some of the deepest questions we get to ask as a society.
01:24:43 How do we design the game such that we build a better world?
01:24:48 I think a big part of this as well is people,
01:24:51 there’s this philosophy, particularly in Silicon Valley
01:24:56 of well, techno optimism,
01:24:58 technology will solve all our issues.
01:25:01 And there’s a steel man argument to that where yes,
01:25:03 technology has solved a lot of problems
01:25:05 and can potentially solve a lot of future ones.
01:25:08 But it can also, it’s always a double edged sword.
01:25:10 And particularly as you know,
01:25:11 technology gets more and more powerful
01:25:13 and we’ve now got like big data
01:25:15 and we’re able to do all kinds of like
01:25:17 psychological manipulation with it and so on.
01:25:22 Technology is not a values neutral thing.
01:25:24 People think, I used to always think this myself.
01:25:26 It’s like this naive view that,
01:25:28 oh, technology is completely neutral.
01:25:30 It’s just, it’s the humans that either make it good or bad.
01:25:33 No, to the point we’re at now,
01:25:36 the technology that we are creating,
01:25:38 they are social technologies.
01:25:39 They literally dictate how humans now form social groups
01:25:45 and so on beyond that.
01:25:46 And beyond that, it also then,
01:25:47 that gives rise to like the memes
01:25:49 that we then like coalesce around.
01:25:51 And that, if you have the stack that way
01:25:54 where it’s technology driving social interaction,
01:25:56 which then drives like memetic culture
01:26:00 and like which ideas become popular, that’s Moloch.
01:26:04 And we need it the other way around.
01:26:06 So we need to figure out what are the good memes?
01:26:08 What are the good values
01:26:11 that we think we need to optimize for
01:26:15 that like makes people happy and healthy
01:26:17 and like keeps society as robust and safe as possible,
01:26:21 then figure out what the social structure
01:26:22 around those should be.
01:26:23 And only then do we figure out technology,
01:26:25 but we’re doing the other way around.
01:26:26 And as much as I love in many ways
01:26:31 the culture of Silicon Valley,
01:26:32 and like I do think that technology has,
01:26:36 I don’t wanna knock it.
01:26:36 It’s done so many wonderful things for us,
01:26:38 same as capitalism.
01:26:40 There are, we have to like be honest with ourselves.
01:26:44 We’re getting to a point where we are losing control
01:26:47 of this very powerful machine that we have created.
01:26:49 Can you redesign the machine within the game?
01:26:53 Can you just have, can you understand the game enough?
01:26:57 Okay, this is the game.
01:26:58 And this is how we start to reemphasize
01:27:01 the memes that matter,
01:27:03 the memes that bring out the best in us.
01:27:06 You know, like the way I try to be in real life
01:27:11 and the way I try to be online
01:27:13 is to be about kindness and love.
01:27:15 And I feel like I’m sometimes get like criticized
01:27:19 for being naive and all those kinds of things.
01:27:22 But I feel like I’m just trying to live within this game.
01:27:25 I’m trying to be authentic.
01:27:27 Yeah, but also like, hey, it’s kind of fun to do this.
01:27:31 Like you guys should try this too, you know,
01:27:33 and that’s like trying to redesign
01:27:37 some aspects of the game within the game.
01:27:40 Is that possible?
01:27:43 I don’t know, but I think we should try.
01:27:46 I don’t think we have an option but to try.
01:27:48 Well, the other option is to create new companies
01:27:51 or to pressure companies that,
01:27:55 or anyone who has control of the rules of the game.
01:27:59 I think we need to be doing all of the above.
01:28:01 I think we need to be thinking hard
01:28:02 about what are the kind of positive, healthy memes.
01:28:09 You know, as Elon said,
01:28:10 he who controls the memes controls the universe.
01:28:13 He said that.
01:28:14 I think he did, yeah.
01:28:16 But there’s truth to that.
01:28:17 It’s very, there is wisdom in that
01:28:19 because memes have driven history.
01:28:22 You know, we are a cultural species.
01:28:24 That’s what sets us apart from chimpanzees
01:28:27 and everything else.
01:28:27 We have the ability to learn and evolve through culture
01:28:32 as opposed to biology or like, you know,
01:28:34 classic physical constraints.
01:28:37 And that means culture is incredibly powerful
01:28:40 and we can create and become victim
01:28:43 to very bad memes or very good ones.
01:28:46 But we do have some agency over which memes,
01:28:49 you know, we, but not only put out there,
01:28:51 but we also like subscribe to.
01:28:54 So I think we need to take that approach.
01:28:56 We also need to, you know,
01:28:58 because I don’t want, I’m making this video right now
01:29:02 called The Attention Wars,
01:29:03 which is about like how Moloch,
01:29:05 like the media machine is this Moloch machine.
01:29:08 Well, is this kind of like blind dumb thing
01:29:11 where everyone is optimizing for engagement
01:29:13 in order to win their share of the attention pie.
01:29:16 And then if you zoom out,
01:29:17 it’s really like Moloch that’s pulling the strings
01:29:19 because the only thing that benefits from this in the end,
01:29:20 you know, like our information ecosystem is breaking down.
01:29:23 Like we have, you look at the state of the US,
01:29:26 it’s in, we’re in a civil war.
01:29:28 It’s just not a physical war.
01:29:29 It’s an information war.
01:29:33 And people are becoming more fractured
01:29:35 in terms of what their actual shared reality is.
01:29:37 Like truly like an extreme left person,
01:29:39 an extreme right person,
01:29:40 like they literally live in different worlds
01:29:43 in their minds at this point.
01:29:45 And it’s getting more and more amplified.
01:29:47 And this force is like a razor blade
01:29:50 pushing through everything.
01:29:51 It doesn’t matter how innocuous a topic is,
01:29:53 it will find a way to split into this,
01:29:55 you know, bifurcated culture, and it’s fucking terrifying.
01:29:58 Because that maximizes the tension.
01:29:59 And that’s like an emergent Moloch type force
01:30:03 that takes anything, any topic
01:30:06 and cuts through it so that it can split nicely
01:30:11 into two groups.
01:30:12 One that’s…
01:30:13 Well, it’s whatever, yeah,
01:30:16 all everyone is trying to do within the system
01:30:17 is just maximize whatever gets them the most attention
01:30:21 because they’re just trying to make money
01:30:22 so they can keep their thing going, right?
01:30:24 And the best emotion for getting attention,
01:30:29 well, because it’s not just about attention on the internet,
01:30:30 it’s engagement, that’s the key thing, right?
01:30:33 In order for something to go viral,
01:30:34 you need people to actually engage with it.
01:30:35 They need to like comment or retweet or whatever.
01:30:39 And of all the emotions that,
01:30:43 there’s like seven classic shared emotions
01:30:45 that studies have found that all humans,
01:30:47 even from like previously uncontacted tribes have.
01:30:51 Some of those are negative, you know, like sadness,
01:30:54 disgust, anger, et cetera, some are positive,
01:30:57 happiness, excitement, and so on.
01:31:01 The one that happens to be the most useful
01:31:03 for the internet is anger.
01:31:05 Because anger, it’s such an active emotion.
01:31:09 If you want people to engage, if someone’s scared,
01:31:13 and I’m not just like talking out my ass here,
01:31:14 there are studies here that have looked into this.
01:31:18 Whereas like if someone’s like,
01:31:19 disgusted or fearful, they actually tend to then be like,
01:31:22 oh, I don’t wanna deal with this.
01:31:23 So they’re less likely to actually engage
01:31:25 and share it and so on, they’re just gonna be like, ugh.
01:31:27 Whereas if they’re enraged by a thing,
01:31:29 well now that triggers all the like,
01:31:31 the old tribalism emotions.
01:31:34 And so that’s how then things get sort of spread,
01:31:37 you know, much more easily.
01:31:38 They out compete all the other memes in the ecosystem.
01:31:42 And so this like, the attention economy,
01:31:46 the wheels that make it go around are,
01:31:48 is rage.
01:31:49 I did a tweet, the problem with raging against the machine
01:31:53 is that the machine has learned to feed off rage.
01:31:56 Because it is feeding off our rage.
01:31:57 That’s the thing that’s now keeping it going.
01:31:59 So the more we get angry, the worse it gets.
01:32:01 So the Moloch in this attention,
01:32:04 in the war of attention, is constantly maximizing rage.
01:32:10 What it is optimizing for is engagement.
01:32:12 And it happens to be that engagement
01:32:14 is more propaganda, you know.
01:32:20 I mean, it just sounds like everything is putting,
01:32:23 more and more things are being put through this
01:32:24 like propagandist lens of winning
01:32:27 whatever the war is in question.
01:32:28 Whether it’s the culture war or the Ukraine war, yeah.
01:32:31 Well, I think the silver lining of this,
01:32:33 do you think it’s possible that in the long arc
01:32:36 of this process, you actually do arrive
01:32:39 at greater wisdom and more progress?
01:32:41 It just, in the moment, it feels like people are
01:32:45 tearing each other to shreds over ideas.
01:32:47 But if you think about it, one of the magic things
01:32:49 about democracy and so on, is you have
01:32:51 the blue versus red constantly fighting.
01:32:53 It’s almost like they’re in discourse,
01:32:58 creating devil’s advocate, making devils out of each other.
01:33:01 And through that process, discussing ideas.
01:33:04 Like almost really embodying different ideas
01:33:07 just to yell at each other.
01:33:08 And through the yelling, over the period of decades,
01:33:11 maybe centuries, figuring out a better system.
01:33:15 Like in the moment, it feels fucked up.
01:33:17 But in the long arc, it actually is productive.
01:33:20 I hope so.
01:33:23 That said, we are now in the era of,
01:33:28 just as we have weapons of mass destruction
01:33:30 with nuclear weapons, you know,
01:33:32 that can break the whole playing field,
01:33:35 we now are developing weapons
01:33:37 of informational mass destruction.
01:33:39 Information weapons, you know, WMDs
01:33:41 that basically can be used for propaganda
01:33:44 or just manipulating people however is needed,
01:33:49 whether that’s through dumb TikTok videos,
01:33:53 or, you know, there are significant resources being put in.
01:33:59 I don’t mean to sound like, you know,
01:34:01 to doom and gloom, but there are bad actors out there.
01:34:04 That’s the thing, there are plenty of good actors
01:34:06 within the system who are just trying to stay afloat
01:34:07 in the game, so effectively doing Molochy things.
01:34:09 But then on top of that, we have actual bad actors
01:34:12 who are intentionally trying to, like,
01:34:14 manipulate the other side into doing things.
01:34:17 And using, so because it’s a digital space,
01:34:19 they’re able to use artificial actors, meaning bots.
01:34:24 Exactly, botnets, you know,
01:34:26 and this is a whole new situation
01:34:29 that we’ve never had before.
01:34:30 Yeah, it’s exciting.
01:34:32 You know what I want to do?
01:34:34 You know what I want to do that,
01:34:36 because there is, you know, people are talking about bots
01:34:38 manipulating and, like, malicious bots
01:34:41 that are basically spreading propaganda.
01:34:43 I want to create, like, a bot army for, like,
01:34:46 that fights that. For love?
01:34:47 Yeah, exactly, for love, that fights, that, I mean.
01:34:50 You know, there’s, I mean, there’s truth
01:34:52 to fight fire with fire, it’s like,
01:34:54 but how you always have to be careful
01:34:57 whenever you create, again, like,
01:34:59 Moloch is very tricky. Yeah, yeah.
01:35:01 Hitler was trying to spread the love, too.
01:35:03 Well, yeah, so we thought, but, you know, I agree with you
01:35:07 that, like, that is a thing that should be considered,
01:35:09 but there is, again, everyone,
01:35:11 the road to hell is paved with good intentions.
01:35:13 And this is, there’s always unforeseen circumstances,
01:35:18 you know, outcomes, externalities
01:35:20 of you trying to adopt a thing,
01:35:21 even if you do it in the very best of faith.
01:35:23 But you can learn lessons of history.
01:35:25 If you can run some sims on it first, absolutely.
01:35:28 But also there’s certain aspects of a system,
01:35:30 as we’ve learned through history, that do better than others.
01:35:33 Like, for example, don’t have a dictator,
01:35:36 so, like, if I were to create this bot army,
01:35:39 it’s not good for me to have full control over it.
01:35:42 Because in the beginning, I might have a good understanding
01:35:45 of what’s good and not, but over time,
01:35:47 that starts to get deviated,
01:35:49 because I’ll get annoyed at some assholes,
01:35:50 and I’ll think, okay, wouldn’t it be nice
01:35:52 to get rid of those assholes?
01:35:53 But then that power starts getting to your head,
01:35:55 you become corrupted, that’s basic human nature.
01:35:58 So distribute the power somehow.
01:35:59 We need a love botnet on a DAO.
01:36:05 A DAO love botnet.
01:36:07 Yeah, and without a leader, like without…
01:36:10 Well, exactly, a distributed, right,
01:36:12 but yeah, without any kind of centralized…
01:36:14 Yeah, without even, you know,
01:36:16 basically, the more
01:36:17 you can decentralize the control of a thing
01:36:21 to people, you know, the better, but the balance…
01:36:25 But then you still need the ability to coordinate,
01:36:26 because that’s the issue when something is too,
01:36:29 you know, that’s really, to me, like the culture wars,
01:36:32 the bigger war we’re dealing with is actually between
01:36:35 the sort of the, I don’t know what even the term is for it,
01:36:39 but like centralization versus decentralization.
01:36:41 That’s the tension we’re seeing.
01:36:44 Power in control by a few versus completely distributed.
01:36:48 And the trouble is if you have a fully centralized thing,
01:36:50 then you’re at risk of tyranny, you know,
01:36:52 Stalin type things can happen; or completely distributed,
01:36:56 and now you’re at risk of complete anarchy and chaos,
01:36:58 where you can’t even coordinate, you know,
01:37:01 when there’s a pandemic or anything like that.
01:37:03 So it’s like, what is the right balance to strike
01:37:05 between these two structures?
01:37:08 Can’t Moloch really take hold
01:37:09 in a fully decentralized system?
01:37:11 That’s one of the dangers too.
01:37:12 Yes, very vulnerable to Moloch.
01:37:14 So a dictator can commit huge atrocities,
01:37:17 but they can also make sure the infrastructure works
01:37:20 and trains run on time.
01:37:23 They have that God’s eye view at least.
01:37:24 They have the ability to create like laws and rules
01:37:27 that like force coordination, which stops Moloch.
01:37:30 But then you’re vulnerable to that dictator
01:37:33 getting infected with like this,
01:37:34 with some kind of psychopathy type thing.
01:37:36 What’s reverse Moloch?
01:37:39 Sorry, great question.
01:37:40 So that’s where, so I’ve been working on this series.
01:37:46 It’s been driving me insane for the last year and a half.
01:37:48 I did the first one a year ago.
01:37:50 I can’t believe it’s nearly been a year.
01:37:52 The second one, hopefully will be coming out
01:37:53 in like a month.
01:37:54 And my goal at the end of the series is to like present,
01:37:58 cause basically I’m painting the picture of like
01:38:00 what Moloch is and how it’s affecting
01:38:02 almost all these issues in our society
01:38:04 and how it’s driving.
01:38:06 It’s like kind of the generator function
01:38:07 as people describe it of existential risk.
01:38:11 And then at the end of that.
01:38:11 Wait, wait, the generator function of existential risk.
01:38:14 So you’re saying Moloch is sort of the engine
01:38:16 that creates a bunch of X risks.
01:38:19 Yes, not all of them.
01:38:20 Like a, you know, a.
01:38:22 Just a cool phrase, generator function.
01:38:24 It’s not my phrase.
01:38:25 It’s Daniel Schmachtenberger.
01:38:26 Oh, Schmachtenberger.
01:38:27 I got that from him.
01:38:28 Of course.
01:38:29 All things, it’s like all roads lead back
01:38:31 to Daniel Schmachtenberger, I think.
01:38:32 The dude is, the dude is brilliant.
01:38:35 He’s really brilliant.
01:38:36 After that it’s Mark Twain.
01:38:37 But anyway, sorry.
01:38:39 Totally rude interruptions from me.
01:38:41 No, it’s fine.
01:38:42 So not all X risks.
01:38:43 So like an asteroid technically isn’t
01:38:45 because it’s, you know, it’s just like
01:38:47 this one big external thing.
01:38:49 It’s not like a competition thing going on.
01:38:51 But, you know, synthetic bio, you know,
01:38:54 bio weapons, that’s one because everyone’s incentivized
01:38:57 to build, even for defense, you know,
01:38:59 bad, bad viruses, you know,
01:39:01 just to threaten someone else, et cetera.
01:39:03 Or AI, technically the race to AGI
01:39:05 is kind of potentially a Moloch-y situation.
01:39:09 But yeah, so if Moloch is this like generator function
01:39:15 that’s driving all of these issues
01:39:16 over the coming century that might wipe us out,
01:39:19 what’s the inverse?
01:39:20 And so far what I’ve gotten to is this character
01:39:24 that I want to put out there called Winwin.
01:39:26 Because Moloch is the God of lose, lose, ultimately.
01:39:28 It masquerades as the God of win, lose,
01:39:30 but in reality it’s lose, lose.
01:39:31 Everyone ends up worse off.
01:39:34 So I was like, well, what’s the opposite of that?
01:39:35 It’s Winwin.
01:39:36 And I was thinking for ages, like,
01:39:37 what’s a good name for this character?
01:39:39 And then I was like, okay, well,
01:39:42 don’t try and, you know, think through it logically.
01:39:44 What’s the vibe of Winwin?
01:39:46 And to me, like in my mind, Moloch is like,
01:39:49 and I addressed that in the video,
01:39:50 like it’s red and black.
01:39:52 It’s kind of like very, you know, hyper focused
01:39:55 on it’s one goal you must win.
01:39:58 So Winwin is kind of actually like these colors.
01:40:01 It’s like purple, turquoise.
01:42:04 It loves games too.
01:40:06 It loves a little bit of healthy competition,
01:40:08 but constrained, like kind of like before,
01:40:10 like knows how to ring fence zero sum competition
01:40:12 into like just the right amount,
01:40:14 whereby its externalities can be controlled
01:40:17 and kept positive.
01:40:19 And then beyond that, it also loves cooperation,
01:40:21 coordination, love, all these other things.
01:40:23 But it’s also kind of like mischievous,
01:40:26 like, you know, it will have a good time.
01:40:28 It’s not like kind of like boring, you know,
01:40:30 like, oh God, it knows how to have fun.
01:40:33 It can get like, it can get down,
01:40:36 but ultimately it’s like unbelievably wise
01:40:39 and it just wants the game to keep going.
01:40:42 And I call it Winwin.
01:40:44 That’s a good like pet name, Winwin.
01:40:47 I think the, Winwin, right?
01:40:49 And I think its formal name, when it has to do
01:40:51 official functions, is Omnia.
01:40:55 Omnia.
01:40:55 Yeah.
01:40:56 From like omniscience kind of, why Omnia?
01:40:59 You just like Omnia?
01:41:00 She’s like Omniwin.
01:41:01 Omniwin.
01:41:02 But I’m open to suggestions.
01:41:02 I would like, you know, and this is.
01:41:04 I like Omnia.
01:41:05 Yeah.
01:41:05 But there is an angelic kind of sense to Omnia though.
01:41:08 So Winwin is more fun.
01:41:09 So it’s more like, it embraces the fun aspect.
01:41:14 The fun aspect.
01:41:16 I mean, there is something about sort of,
01:41:20 there’s some aspect to Winwin interactions
01:41:23 that requires embracing the chaos of the game
01:41:31 and enjoying the game itself.
01:41:33 I don’t know.
01:41:34 I don’t know what that is.
01:41:35 That’s almost like a Zen like appreciation
01:41:37 of the game itself, not optimizing
01:41:40 for the consequences of the game.
01:41:42 Right, well, it’s recognizing the value
01:41:45 of competition in and of itself;
01:41:47 it’s not about winning.
01:41:48 It’s about you enjoying the process of having a competition
01:41:51 and not knowing whether you’re gonna win
01:41:52 or lose this little thing.
01:41:53 But then also being aware that, you know,
01:41:56 what’s the boundary?
01:41:57 How big do I want competition to be?
01:41:59 Because one of the reasons why Moloch is doing so well now
01:42:02 in our society, in our civilization is
01:42:04 because we haven’t been able to ring fence competition.
01:42:07 You know, and so it’s just having all
01:42:09 these negative externalities
01:42:10 and it’s, we’ve completely lost control of it.
01:42:13 You know, it’s, I think my guess is,
01:42:16 and now we’re getting really like,
01:42:18 you know, metaphysical technically,
01:42:20 but I think we’ll be in a more interesting universe
01:42:26 if we have one that has, you know,
01:42:29 lots of cooperation
01:42:31 and some pockets of competition
01:42:32 than one that’s purely cooperation entirely.
01:42:36 Like it’s good to have some little zero sumness bits,
01:42:39 but I don’t know that fully
01:42:41 and I’m not qualified as a philosopher to know that.
01:42:44 And that’s what reverse Moloch is,
01:42:45 so this kind of Winwin creature is a system
01:42:49 that is an antidote to the Moloch system.
01:42:52 Yes.
01:42:53 And I don’t know how it’s gonna do that.
01:42:57 But it’s good to kind of try to start
01:42:59 to formulate different ideas,
01:43:01 different frameworks of how we think about that.
01:43:03 Exactly.
01:43:04 At the small scale of a collection of individuals
01:43:07 and a large scale of a society.
01:43:09 Exactly.
01:43:09 It’s a meme, I think it’s an example of a good meme.
01:43:13 And I’m open, I’d love to hear feedback from people
01:43:15 if they think it’s, you know, they have a better idea
01:43:17 or it’s not, you know,
01:43:18 but it’s the direction of memes that we need to spread,
01:43:21 this idea of like, look for the win, wins in life.
01:43:25 Well, on the topic of beauty filters,
01:43:27 so in that particular context
01:43:28 where Moloch creates negative consequences,
01:43:35 Dostoevsky said beauty will save the world.
01:43:37 What is beauty anyway?
01:43:41 It would be nice to just try to discuss
01:43:45 what kind of thing we would like to converge towards
01:43:49 in our understanding of what is beautiful.
01:43:55 So to me, I think something is beautiful
01:43:59 when it can’t be reduced down to easy metrics.
01:44:04 Like if you think of a tree, what is it about a tree,
01:44:07 like a big, ancient, beautiful tree, right?
01:44:09 What is it about it that we find so beautiful?
01:44:11 It’s not, you know, the sweetness of its fruit
01:44:17 or the value of its lumber.
01:44:20 It’s this entirety of it that is,
01:44:25 there’s these immeasurable qualities,
01:44:28 it’s like almost like a qualia of it.
01:44:30 That’s both, like it walks this fine line between pattern,
01:44:33 well, it’s got lots of patternicity,
01:44:34 but it’s not overly predictable.
01:44:36 Again, it walks this fine line between order and chaos.
01:44:39 It’s a very highly complex system.
01:44:45 It’s evolving over time,
01:44:47 the definition of a complex versus,
01:44:49 and this is another Schmachtenberger thing,
01:44:50 a complex versus a complicated system.
01:44:53 A complicated system can be sort of broken down
01:44:55 into bits and pieces,
01:45:00 understood and then put back together.
01:45:01 A complex system is kind of like a black box.
01:45:04 It does all this crazy stuff,
01:45:06 but if you take it apart,
01:45:07 you can’t put it back together again,
01:45:09 because there’s all these intricacies.
01:45:11 And also very importantly,
01:45:14 the whole is much greater
01:45:16 than the sum of the parts.
01:45:17 And that’s where the beauty lies, I think.
01:45:21 And I think that extends to things like art as well.
01:45:23 Like there’s something immeasurable about it.
01:45:27 There’s something we can’t break down to a narrow metric.
01:45:29 Does that extend to humans, you think?
01:45:31 Yeah, absolutely.
01:45:33 So how can Instagram reveal that kind of beauty,
01:45:37 the complexity of a human being?
01:45:39 Good question.
01:45:41 And this takes us back to dating sites and Goodreads,
01:45:44 I think.
01:45:46 Very good question.
01:45:47 I mean, well, I know what it shouldn’t do.
01:45:50 It shouldn’t try and like, right now,
01:45:52 you know, I was talking to like a social media expert
01:45:56 recently, because I was like, oh, I hate that.
01:45:58 There’s such a thing as a social media expert?
01:45:59 Oh, yeah, there are like agencies out there
01:46:02 that you can like outsource to,
01:46:03 because I’m thinking about working with one to like,
01:46:06 I want to start a podcast.
01:46:09 You should, you should have done it a long time ago.
01:46:11 Working on it.
01:46:12 It’s going to be called Win Win.
01:46:14 And it’s going to be about this like positive sum stuff.
01:46:17 And the thing that, you know, they always come back and say,
01:46:21 is like, well, you need to like figure out
01:46:22 what your thing is.
01:46:24 You know, you need to narrow down what your thing is
01:46:26 and then just follow that.
01:46:27 Have like a sort of a formula,
01:46:31 because that’s what people want.
01:46:32 They want to know that they’re coming back
01:46:33 to the same thing.
01:46:34 And that’s the advice on YouTube, Twitter, you name it.
01:46:37 And that’s why, and the trouble with that
01:46:40 is that it’s a complexity reduction.
01:46:42 And generally speaking, you know,
01:46:44 complexity reduction is bad.
01:46:45 It’s making things more, it’s an oversimplification.
01:46:48 Not that simplification is always a bad thing.
01:46:51 But when you’re trying to take, you know,
01:46:55 what is social media doing?
01:46:56 It’s trying to like encapsulate the human experience
01:47:00 and put it into digital form and commodify it to an extent.
01:47:04 That, so you do that, you compress people down
01:47:07 into these like narrow things.
01:47:09 And that’s why I think it’s kind of ultimately
01:47:12 fundamentally incompatible with at least
01:47:14 my definition of beauty.
01:47:15 It’s interesting because there is some sense in which
01:47:21 a simplification sort of in the Einstein kind of sense
01:47:25 of a really complex idea, a simplification in a way
01:47:29 that still captures some core power of an idea of a person
01:47:33 is also beautiful.
01:47:36 And so maybe it’s possible for social media to do that.
01:47:39 A presentation, a sort of a sliver, a slice,
01:47:44 a look into a person’s life that reveals something
01:47:48 real about them.
01:47:50 But in a simple way, in a way that can be displayed
01:47:52 graphically or through words.
01:47:55 Some way, I mean, in some way Twitter can do
01:47:58 that kind of thing.
01:47:59 A very small set of words can reveal the intricacies
01:48:03 of a person.
01:48:04 Of course, the viral machine that spreads those words
01:48:09 often results in people taking the thing out of context.
01:48:14 People often don’t read tweets in the context
01:48:18 of the human being that wrote them.
01:48:20 The full history of the tweets they’ve written,
01:48:24 the education level, the humor level,
01:48:26 the world view they’re playing around with,
01:48:30 all that context is forgotten and people just see
01:48:32 the different words.
01:48:33 So that can lead to trouble.
01:48:35 But in a certain sense, if you do take it in context,
01:48:39 it reveals some kind of quirky little beautiful idea
01:48:43 or a profound little idea from that particular person
01:48:47 that shows something about that person.
01:48:48 So in that sense, Twitter can be more successful
01:48:51 if we’re talking about Moloch, at driving
01:48:54 a better kind of incentive.
01:48:56 Yeah, I mean, how they can, like if we were to rewrite,
01:49:01 is there a way to rewrite the Twitter algorithm
01:49:05 so that it stops being the fertile breeding ground
01:49:11 of the culture wars?
01:49:12 Because that’s really what it is.
01:49:15 I mean, maybe I’m giving Twitter too much power,
01:49:19 but just the more I looked into it
01:49:21 and I had conversations with Tristan Harris
01:49:25 from the Center for Humane Technology.
01:49:27 And he explained it as like,
01:49:30 Twitter is where you have this amalgam of human culture
01:49:34 and then this terribly designed algorithm
01:49:36 that amplifies the craziest people
01:49:39 and the angriest most divisive takes and amplifies them.
01:49:45 And then the media, the mainstream media,
01:49:47 because all the journalists are also on Twitter,
01:49:49 they then are informed by that.
01:49:52 And so they draw out the stories they can
01:49:55 from this already like very boiling lava of rage
01:50:00 and then spread that to their millions
01:50:03 and millions of people who aren’t even on Twitter.
01:50:07 And so I honestly, I think if I could press a button,
01:50:10 turn them off, I probably would at this point,
01:50:13 because I just don’t see a way
01:50:14 of being compatible with healthiness,
01:50:17 but that’s not gonna happen.
01:50:19 And so at least one way to like stem the tide
01:50:23 and make it less Moloch-y would be to change the business model;
01:50:29 if it was on a subscription model,
01:50:31 then it’s not optimizing for impressions.
01:50:36 Cause basically what it wants is for people
01:50:38 to keep coming back as often as possible.
01:50:40 That’s how they get paid, right?
01:50:42 Every time an ad gets shown to someone,
01:50:43 and the way to do that is to get people constantly refreshing their feed.
01:50:46 So you’re trying to encourage addictive behaviors.
01:50:49 Whereas if someone, if they moved on
01:50:52 to at least a subscription model,
01:50:53 then they’re getting the money either way,
01:50:56 whether someone comes back to the site once a month
01:50:59 or 500 times a month,
01:51:01 they get the same amount of money.
01:51:02 So now that takes away that incentive,
01:51:04 to use technology, to build,
01:51:07 to design an algorithm that is maximally addictive.
01:51:10 That would be one way, for example.
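If it helps to see the incentive difference Liv is describing in concrete numbers, here is a minimal sketch; the ad rate and subscription price are invented placeholders, not real platform figures.

```python
# Toy comparison of ad-funded vs. subscription revenue from one user.
# All numbers are illustrative assumptions, not real platform figures.

AD_REVENUE_PER_VISIT = 0.002    # hypothetical dollars earned per ad-filled visit
SUBSCRIPTION_PER_MONTH = 8.00   # hypothetical flat monthly fee

def monthly_revenue(visits_per_month: int, model: str) -> float:
    """Revenue the platform earns from one user in a month."""
    if model == "ads":
        # Every extra visit adds revenue -> incentive to maximize engagement.
        return visits_per_month * AD_REVENUE_PER_VISIT
    if model == "subscription":
        # Revenue is flat no matter how often the user refreshes the feed.
        return SUBSCRIPTION_PER_MONTH
    raise ValueError(f"unknown model: {model}")

for visits in (1, 30, 500):
    print(visits, monthly_revenue(visits, "ads"), monthly_revenue(visits, "subscription"))
# ads:          0.002, 0.06, 1.0   -- grows with every visit
# subscription: 8.0 in every case  -- no marginal reward for addictiveness
```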
01:51:12 Yeah, but you still want people to,
01:51:14 yeah, I just feel like that just slows down,
01:51:17 creates friction in the virality of things.
01:51:20 But that’s good.
01:51:22 We need to slow down virality.
01:51:24 It’s good, it’s one way.
01:51:26 Virality is Moloch, to be clear.
01:51:28 So Moloch is always negative then?
01:51:34 Yes, by definition.
01:51:36 Yes.
01:51:37 Competition is not always negative.
01:51:39 Competition is neutral.
01:51:40 I disagree with you that all virality is negative then,
01:51:44 is Moloch then.
01:51:45 It’s a good intuition,
01:51:49 because we have a lot of data on virality being negative.
01:51:52 But I happen to believe that the core of human beings,
01:51:57 so most human beings want to be good
01:52:00 more than they want to be bad to each other.
01:52:03 And so I think it’s possible,
01:52:05 it might just be harder
01:52:09 to engineer systems that enable virality,
01:52:10 but it’s possible to engineer systems
01:52:13 that are viral.
01:52:15 And the kind of stuff that rises to the top
01:52:19 is things that are positive.
01:52:21 And positive, not like la la positive,
01:52:24 it’s more like win win,
01:52:25 meaning a lot of people need to be challenged.
01:52:28 Wise things, yes.
01:52:29 You grow from it, it might challenge you,
01:52:31 you might not like it, but you ultimately grow from it.
01:52:34 And ultimately bring people together
01:52:36 as opposed to tear them apart.
01:52:38 I deeply want that to be true.
01:52:40 And I very much agree with you that people at their core
01:52:43 are on average good, care for each other,
01:52:46 as opposed to not.
01:52:46 Like I think it’s actually a very small percentage
01:52:50 of people who truly want to do
01:52:52 just like destructive malicious things.
01:52:54 Most people are just trying to win their own little game.
01:52:56 And they don’t mean to be,
01:52:57 they’re just stuck in this badly designed system.
01:53:01 That said, yes,
01:53:04 the current structure means that virality
01:53:09 is optimized towards Moloch.
01:53:10 That doesn’t mean there aren’t exceptions.
01:53:12 Sometimes positive stories do go viral
01:53:14 and I think we should study them.
01:53:15 I think there should be a whole field of study
01:53:17 into understanding, identifying memes
01:53:20 that above a certain threshold of the population
01:53:24 agree is a positive, happy, bringing people together meme.
01:53:27 The kind of thing that brings families together
01:53:29 that would normally argue about cultural stuff
01:53:31 at the table, at the dinner table.
01:53:34 Identify those memes and figure out what it was,
01:53:37 what was the ingredient that made them spread that day.
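As a rough sketch of what "amplify only what a broad threshold of people agree is positive" could look like as a ranking rule, here is a toy example; the posts, vote counts, and the 0.8 threshold are all invented placeholders.

```python
# Toy ranking rule: only boost memes that a broad sample of raters
# agrees are positive / bring people together.
# Posts, vote counts, and the threshold are invented placeholders.

posts = [
    {"id": "a", "positive_votes": 92, "total_votes": 100},
    {"id": "b", "positive_votes": 40, "total_votes": 100},
    {"id": "c", "positive_votes": 85, "total_votes": 100},
]

THRESHOLD = 0.8  # fraction of raters who must agree the meme is positive

def amplify(candidates, threshold=THRESHOLD):
    """Return ids of posts whose agreed-positive fraction clears the threshold."""
    return [p["id"] for p in candidates
            if p["total_votes"] > 0
            and p["positive_votes"] / p["total_votes"] >= threshold]

print(amplify(posts))  # ['a', 'c'] -- only broadly agreed-positive memes get boosted
```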
01:53:41 And also like not just like happiness
01:53:44 and connection between humans,
01:53:45 but connection between humans in other ways
01:53:49 that enables like productivity, like cooperation,
01:53:52 solving difficult problems and all those kinds of stuff.
01:53:56 So it’s not just about let’s be happy
01:53:59 and have fulfilling lives.
01:54:00 It’s also like, let’s build cool shit.
01:54:03 Let’s get excited.
01:54:04 Which is the spirit of collaboration,
01:54:05 which is deeply anti Moloch, right?
01:54:06 That’s, it’s not using competition.
01:54:09 It’s like, Moloch hates collaboration and coordination
01:54:13 and people working together.
01:54:14 And that’s, again, like the internet started out as that
01:54:18 and it could have been that,
01:54:20 but because of the way it was sort of structured
01:54:23 in terms of, you know, very lofty ideals,
01:54:26 they wanted everything to be open source
01:54:28 and also free.
01:54:30 And, but they needed to find a way to pay the bills anyway,
01:54:32 because they were still building this
01:54:33 on top of our old economics system.
01:54:36 And so the way they did that
01:54:37 was through third party advertisement.
01:54:40 But that meant that things were very decoupled.
01:54:42 You know, you’ve got this third party interest,
01:54:45 which means that people
01:54:47 are then having to optimize for that.
01:54:48 And that is, you know, the actual consumer
01:54:51 is actually the product,
01:54:52 not the person you’re making the thing for.
01:54:56 In the end, you start making the thing for the advertiser.
01:54:59 And so that’s why it then like breaks down.
01:55:03 Yeah, like it’s, there’s no clean solution to this.
01:55:08 And I, it’s a really good suggestion by you actually
01:55:11 to like figure out how we can optimize virality
01:55:16 for positive sum topics.
01:55:19 I shall be the general of the love bot army.
01:55:24 Distributed.
01:55:26 Distributed, distributed, no, okay, yeah.
01:55:28 The power, just even in saying that,
01:55:30 the power already went to my head.
01:55:32 No, okay, you’ve talked about quantifying your thinking.
01:55:35 We’ve been talking about this,
01:55:36 sort of a game theoretic view on life
01:55:39 and putting probabilities behind estimates.
01:55:42 Like if you think about different trajectories
01:55:44 you can take through life,
01:55:45 just actually analyzing life in game theoretic way,
01:55:48 like your own life, like personal life.
01:55:50 I think you’ve given an example
01:55:52 that you had an honest conversation with Igor
01:55:54 about like, how long is this relationship gonna last?
01:55:57 So similar to our sort of marriage problem
01:56:00 kind of discussion, having an honest conversation
01:56:03 about the probability of things
01:56:05 that we sometimes are a little bit too shy
01:56:08 or scared to think of in probabilistic terms.
01:56:11 Can you speak to that kind of way of reasoning,
01:56:15 the good and the bad of that?
01:56:17 Can you do this kind of thing with human relations?
01:56:20 Yeah, so the scenario you’re talking about, it was like.
01:56:25 Yeah, tell me about that scenario.
01:56:26 Yeah, I think it was about a year into our relationship
01:56:30 and we were having a fairly heavy conversation
01:56:34 because we were trying to figure out
01:56:35 whether or not I was gonna sell my apartment.
01:56:37 Well, you know, he had already moved in,
01:56:40 but I think we were just figuring out
01:56:41 what like our longterm plans would be.
01:56:43 Should we buy a place together, et cetera.
01:56:46 When you guys are having that conversation,
01:56:47 are you like drunk out of your mind on wine
01:56:49 or is he sober and you’re actually having a serious conversation?
01:56:52 I think we were sober.
01:56:53 How do you get to that conversation?
01:56:54 Because most people are kind of afraid
01:56:56 to have that kind of serious conversation.
01:56:58 Well, so our relationship was very,
01:57:01 well, first of all, we were good friends
01:57:03 for a couple of years before we even got romantic.
01:57:08 And when we did get romantic,
01:57:12 it was very clear that this was a big deal.
01:57:15 It wasn’t just like another, it wasn’t a random thing.
01:57:20 So the probability of it being a big deal was high.
01:57:22 It was already very high.
01:57:24 And then we’d been together for a year
01:57:26 and it had been pretty golden and wonderful.
01:57:28 So, you know, there was a lot of foundation already
01:57:32 where we felt very comfortable
01:57:33 having a lot of frank conversations.
01:57:35 But Igor’s MO, much more than mine,
01:57:38 has always been, from the outset,
01:57:39 that in a relationship,
01:57:42 radical transparency and honesty is the way,
01:57:45 because the truth is the truth,
01:57:47 whether you want to hide it or not, you know,
01:57:48 but it will come out eventually.
01:57:50 And if you aren’t able to accept difficult things yourself,
01:57:56 then how could you possibly expect to be
01:57:58 the most integral version of yourself?
01:58:00 The relationship needs this bedrock
01:58:02 of honesty as a foundation more than anything.
01:58:06 Yeah, that’s really interesting,
01:58:07 but I would like to push against some of those ideas,
01:58:09 but that’s for down the line. Yes, throw them at me.
01:58:13 I just rudely interrupt.
01:58:15 That was fine.
01:58:16 And so, you know, we’d been about together for a year
01:58:19 and things were good
01:58:20 and we were having this hard conversation
01:58:23 and then he was like, well, okay,
01:58:25 what’s the likelihood that we’re going to be together
01:58:27 in three years then?
01:58:28 Because I think it was roughly a three year time horizon.
01:58:31 And I was like, ooh, ooh, interesting.
01:58:32 And then we were like, actually wait,
01:58:34 before you say it out loud,
01:58:35 let’s both write down our predictions formally
01:58:37 because we’d been like,
01:58:38 we’re just getting into like effective altruism
01:58:40 and rationality at the time,
01:58:41 which is all about making formal predictions
01:58:43 as a means of measuring your own,
01:58:48 well, your own foresight essentially in a quantified way.
01:58:53 So we like both wrote down our percentages
01:58:55 and we also did a one year prediction
01:58:58 and a 10 year one as well.
01:58:59 So we got percentages for all three
01:59:02 and then we showed each other.
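For readers curious what "making formal predictions as a means of measuring your own foresight" can look like in practice, here is a minimal sketch using Brier scoring, one common way to score such predictions; the horizons and numbers below are invented placeholders, not Liv and Igor's actual predictions.

```python
# Minimal sketch of logging probabilistic predictions and scoring them later.
# The Brier score is the mean squared error between the stated probability
# and what actually happened (1 if it happened, 0 if not); lower is better.

predictions = [
    # (description,                stated probability, outcome once known)
    ("still together in 1 year",   0.95,               1),
    ("still together in 3 years",  0.85,               1),
    ("still together in 10 years", 0.75,               None),  # not yet resolved
]

def brier_score(preds):
    resolved = [(p, o) for _, p, o in preds if o is not None]
    return sum((p - o) ** 2 for p, o in resolved) / len(resolved)

print(f"Brier score so far: {brier_score(predictions):.3f}")
# 0.0 would be perfect foresight; always guessing 50% scores 0.25,
# so consistently lower scores suggest genuinely calibrated predictions.
```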
01:59:03 And I remember like having this moment of like, ooh,
01:59:06 because for the 10 year one, I was like, well, I mean,
01:59:08 I love him a lot,
01:59:09 but like a lot can happen in 10 years,
01:59:11 and we’ve only been together for,
01:59:14 so I was like, I think it’s over 50%,
01:59:16 but it’s definitely not 90%.
01:59:17 And I remember like wrestling,
01:59:19 I was like, oh, but I don’t want him to be hurt.
01:59:20 I don’t want him to,
01:59:21 I don’t want to give a number lower than his.
01:59:22 And I remember thinking, I was like, ah, ah, don’t game it.
01:59:25 This is an exercise in radical honesty.
01:59:28 So just give your real percentage.
01:59:29 And I think mine was like 75%.
01:59:31 And then we showed each other
01:59:32 and luckily we were fairly well aligned.
01:59:34 And, but honestly, even if we weren’t.
01:59:38 20%.
01:59:39 Huh?
01:59:40 It definitely would have,
01:59:41 I, if his had been consistently lower than mine,
01:59:45 that would have rattled me for sure.
01:59:48 Whereas if it had been the other way around,
01:59:49 I think he would have,
01:59:50 he’s just kind of like a water off a duck’s back type of guy.
01:59:53 It’d be like, okay, well, all right, we’ll figure this out.
01:59:55 Well, did you guys provide error bars on the estimate?
01:59:57 Like the level of uncertainty?
01:59:58 They came built in.
01:59:59 We didn’t give formal plus or minus error bars.
02:00:02 I didn’t draw any or anything like that.
02:00:03 Well, I guess that’s the question I have is,
02:00:05 did you feel informed enough to make such decisions?
02:00:12 Cause like, I feel like if you were,
02:00:13 if I were to do this kind of thing rigorously,
02:00:17 I would want some data.
02:00:20 I would want to test one of the assumptions you have,
02:00:23 which is that you’re not that different from other relationships.
02:00:25 Right.
02:00:26 And so I want to have some data about the way.
02:00:29 You want the base rates.
02:00:30 Yeah, and also actual trajectories of relationships.
02:00:34 I would love to have like time series data
02:00:39 about the ways that relationships fall apart or prosper,
02:00:43 how they collide with different life events,
02:00:46 losses, job changes, moving.
02:00:50 Both partners find jobs, only one has a job.
02:00:54 I want that kind of data
02:00:56 and how often the different trajectories change in life.
02:00:58 Like how informative is your past to your future?
02:01:04 That’s the whole thing.
02:01:05 Like can you look at my life and have a good prediction
02:01:09 about in terms of my characteristics and my relationships,
02:01:13 what that’s going to look like in the future or not?
02:01:15 I don’t even know the answer to that question.
02:01:17 I’ll be very ill informed in terms of making the probability.
02:01:20 I would be far, yeah, I just would be under informed.
02:01:25 I would be under informed.
02:01:26 I’ll be over biasing to my prior experiences, I think.
02:01:31 Right, but as long as you’re aware of that
02:01:33 and you’re honest with yourself,
02:01:34 and you’re honest with the other person,
02:01:35 say, look, I have really wide error bars on this
02:01:37 for the following reasons, that’s okay.
02:01:40 I still think it’s better than not trying
02:01:42 to quantify it at all.
02:01:43 If you’re trying to make really major
02:01:45 irreversible life decisions.
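One way to act on "wide error bars plus base rates" is to blend a population base rate with your personal estimate, weighted by how much evidence each carries; here is a small sketch where every number is an invented placeholder rather than real relationship data.

```python
# Sketch: blend a population base rate with a personal estimate,
# weighting each by how much relevant evidence it carries.
# Every number below is an illustrative assumption.

base_rate = 0.55          # hypothetical share of comparable couples together at 10 years
personal_estimate = 0.75  # your own gut-plus-reflection number
n_prior = 10              # how many "couples' worth" of weight the base rate gets
n_personal = 5            # how much weight your own limited evidence deserves

blended = (base_rate * n_prior + personal_estimate * n_personal) / (n_prior + n_personal)
print(f"blended estimate: {blended:.2f}")  # ~0.62

# Wide error bars fall out naturally: with little personal evidence the blend
# stays near the base rate; as evidence accumulates, n_personal grows and
# your own estimate dominates.
```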
02:01:46 And I feel also the romantic nature of that question.
02:01:49 For me personally, I try to live my life
02:01:52 thinking it’s very close to 100%.
02:01:55 And this is the difficulty of it:
02:01:58 allowing myself
02:02:01 to think differently, I feel, has
02:02:04 a psychological consequence.
02:02:06 That’s where that, that’s one of my pushbacks
02:02:08 against radical honesty is this one particular perspective.
02:02:14 So you’re saying you would rather give
02:02:16 a falsely high percentage to your partner.
02:02:20 Going back to the wise sage Phil.
02:02:21 In order to sort of create this additional optimism.
02:02:24 Hellmuth.
02:02:25 Yes.
02:02:26 Of fake it till you make it.
02:02:29 The positive, the power of positive thinking.
02:02:31 Hashtag positivity.
02:02:32 Yeah, hashtag, it’s with a hashtag.
02:02:35 Well, so that, and this comes back to this idea
02:02:37 of useful fictions, right?
02:02:39 And I agree, I don’t think there’s a clear answer to this.
02:02:42 And I think it’s actually quite subjective.
02:02:44 Some people this works better for than others.
02:02:48 You know, to be clear, Igor and I weren’t doing
02:02:50 this formal prediction in earnest.
02:02:51 Like we did it with very much tongue in cheek.
02:02:55 It wasn’t like we were going to make,
02:02:57 I don’t think it even would have drastically changed
02:03:00 what we decided to do even.
02:03:02 We kind of just did it more as a fun exercise.
02:03:05 But as a consequence of that fun exercise,
02:03:06 there was actually a deep honesty to it.
02:03:09 Exactly, it was a deep, and it was, yeah,
02:03:11 it was just like this moment of reflection.
02:03:13 I’m like, oh wow, I actually have to think
02:03:14 like through this quite critically and so on.
02:03:17 And it’s also what was interesting was I got to like check
02:03:22 in with what my desires were.
02:03:25 So there was one thing of like what my actual prediction is,
02:03:28 but what are my desires and could these desires
02:03:30 be affecting my predictions and so on.
02:03:32 And you know, that’s a method of rationality.
02:03:34 And I personally don’t think it loses anything in terms of,
02:03:37 I didn’t take any of the magic away
02:03:38 from our relationship, quite the opposite.
02:03:39 Like it brought us closer together because it was like,
02:03:42 we did this weird fun thing that I appreciate
02:03:45 a lot of people find quite strange.
02:03:47 And I think it was somewhat unique in our relationship
02:03:51 that both of us are very, we both love numbers,
02:03:54 we both love statistics, we’re both poker players.
02:03:58 So this was kind of like our safe space anyway.
02:04:01 For others, one partner really might not
02:04:04 like that kind of stuff at all, in which case
02:04:06 this is not a good exercise to do.
02:04:07 I don’t recommend it to everybody.
02:04:10 But I do think it’s interesting sometimes
02:04:14 to poke holes in and probe at these things
02:04:18 that we consider so sacred that we can’t try
02:04:21 to quantify them, which is interesting
02:04:24 because that’s in tension with like the idea
02:04:26 of what we just talked about with beauty
02:04:27 and like what makes something beautiful,
02:04:28 the fact that you can’t measure everything about it.
02:04:31 And perhaps some things shouldn’t be measured;
02:04:32 maybe it’s wrong to completely try and
02:04:36 put a utilitarian frame
02:04:39 on measuring the utility of a tree in its entirety.
02:04:43 I don’t know, maybe we should, maybe we shouldn’t.
02:04:44 I’m ambivalent on that.
02:04:46 But overall, people have too many biases.
02:04:52 People are overly biased against trying to do
02:04:57 like a quantified cost benefit analysis
02:04:59 on really tough life decisions.
02:05:02 They’re like, oh, just go with your gut.
02:05:04 It’s like, well, sure, but our gut intuitions
02:05:07 are best suited for things that we’ve got tons
02:05:09 of experience in; then we can really trust them,
02:05:11 if it’s a decision we’ve made many times.
02:05:13 but if it’s like, should I marry this person
02:05:16 or should I buy this house over that house?
02:05:19 You only make those decisions a couple of times
02:05:20 in your life, maybe.
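A quantified cost-benefit analysis for a rare decision like the house example doesn't have to be elaborate; here is a minimal weighted-criteria sketch, where the criteria, weights, and scores are invented placeholders you would replace with your own.

```python
# Sketch of a weighted cost-benefit comparison for a rare, high-stakes choice.
# Weights and 0-10 scores below are illustrative assumptions only.

criteria = {
    # criterion: (importance weight, score for house A, score for house B)
    "commute":      (0.30, 6, 9),
    "price":        (0.25, 8, 5),
    "space":        (0.25, 7, 8),
    "neighborhood": (0.20, 9, 6),
}

def weighted_score(option_index: int) -> float:
    """Sum of weight * score for the chosen option (0 = house A, 1 = house B)."""
    return sum(weight * scores[option_index]
               for weight, *scores in criteria.values())

a, b = weighted_score(0), weighted_score(1)
print(f"house A: {a:.2f}, house B: {b:.2f}")  # 7.35 vs 7.15 with these made-up numbers
# The point isn't the final number; writing the table down forces you to
# surface assumptions your gut would otherwise smuggle in unexamined.
```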
02:05:23 Well, I would love to know, there’s a balance,
02:05:26 probably a personal balance to strike,
02:05:29 in the amount of rationality you apply
02:05:33 to a question versus the useful fiction,
02:05:38 the fake it till you make it.
02:05:40 For example, just talking to soldiers in Ukraine,
02:05:43 you ask them, what’s the probability of you winning,
02:05:49 Ukraine winning?
02:05:52 Almost everybody I talk to is 100%.
02:05:55 Wow.
02:05:56 And you listen to the experts, right?
02:05:59 They say all kinds of stuff.
02:06:00 Right.
02:06:01 They are, first of all, the morale there
02:06:05 is higher than probably anywhere.
02:06:07 I’ve never been to a war zone before this,
02:06:09 but I’ve read about many wars
02:06:12 and I think the morale in Ukraine is higher
02:06:14 than almost any war I’ve read about.
02:06:17 It’s every single person in the country
02:06:19 is proud to fight for their country.
02:06:21 Wow.
02:06:22 Everybody, not just soldiers, not everybody.
02:06:25 Why do you think that is?
02:06:26 Specifically more than in other wars?
02:06:31 I think because there’s perhaps a dormant desire
02:06:36 for the citizens of this country to find
02:06:40 the identity of this country because it’s been
02:06:43 going through this 30 year process
02:06:45 of different factions and political bickering
02:06:48 and they haven’t had, as they talk about,
02:06:50 they haven’t had their independence war.
02:06:52 They say all great nations have had an independence war.
02:06:55 They had to fight for their independence,
02:06:58 for the discovery of the identity of the core
02:07:01 of the ideals that unify us and they haven’t had that.
02:07:04 There’s constantly been factions, there’s been divisions,
02:07:07 there’s been pressures from empires,
02:07:09 from the United States and from Russia,
02:07:12 from NATO and Europe, everybody telling them what to do.
02:07:15 Now they want to discover who they are
02:07:17 and there’s that kind of sense that we’re going to fight
02:07:21 for the safety of our homeland,
02:07:23 but we’re also gonna fight for our identity.
02:07:25 And that, on top of the fact that there’s just,
02:07:32 if you look at the history of Ukraine
02:07:33 and there’s certain other countries like this,
02:07:36 there are certain cultures that are feisty
02:07:39 in their pride of being the citizens of that nation.
02:07:45 Ukraine is that, Poland was that.
02:07:48 You just look at history.
02:07:49 In certain countries, you do not want to occupy.
02:07:54 I mean, both Stalin and Hitler talked about Poland
02:07:56 in this way, they’re like, this is a big problem
02:08:00 if we occupy this land for prolonged periods of time.
02:08:02 They’re gonna be a pain in the ass.
02:08:04 They’re not going to want to be occupied.
02:08:07 And certain other countries are pragmatic.
02:08:09 They’re like, well, leaders come and go.
02:08:12 I guess this is good.
02:08:14 Ukraine just doesn’t have, Ukrainians,
02:08:18 throughout the 20th century,
02:08:19 don’t seem to be the kind of people
02:08:20 that just sit calmly and let the quote unquote occupiers
02:08:28 impose their rules.
02:08:30 That’s interesting though, because you said it’s always been
02:08:32 under conflict and leaders have come and gone.
02:08:35 So you would expect them to actually be the opposite
02:08:37 under that like reasoning.
02:08:39 Well, because it’s a very fertile land,
02:08:42 it’s great for agriculture.
02:08:43 So a lot of people have wanted it.
02:08:44 I mean, I think they’ve developed this culture
02:08:46 because they’ve constantly been occupied
02:08:47 by different peoples.
02:08:51 And so maybe there is something to that
02:08:54 where you’ve constantly had to feel like within the blood
02:08:59 of the generations, there’s the struggle against the man,
02:09:04 against the imposition of rules, against oppression
02:09:08 and all that kind of stuff, and that stays with them.
02:09:10 So there’s a will there, but a lot of other aspects
02:09:15 are also part of it that has to do with
02:09:18 the reverse Moloch kind of situation
02:09:20 where social media has definitely played a part of it.
02:09:23 Also different charismatic individuals
02:09:25 have played a part.
02:09:27 The fact that the president of the nation, Zelensky,
02:09:31 stayed in Kiev during the invasion
02:09:36 is a huge inspiration to them
02:09:37 because most leaders, as you can imagine,
02:09:41 when the capital of the nation is under attack,
02:09:44 the wise thing, the smart thing
02:09:46 that the United States advised Zelensky to do
02:09:49 is to flee and to be the leader of the nation
02:09:52 from a distant place.
02:09:54 He said, fuck that, I’m staying put.
02:09:58 Everyone around him, there was a pressure to leave
02:10:01 and he didn’t, and those singular acts
02:10:07 really can unify a nation.
02:10:09 There’s a lot of people that criticize Zelensky
02:10:11 within Ukraine.
02:10:12 Before the war, he was very unpopular, even still,
02:10:17 but they put that aside, especially that singular act
02:10:22 of staying in the capital.
02:10:24 Yeah, a lot of those kinds of things come together
02:10:27 to create something within people.
02:10:33 These things always, of course, though,
02:10:39 how zoomed out of a view do you wanna take?
02:10:43 Because, yeah, you describe it,
02:10:45 it’s like an anti Moloch thing happened within Ukraine
02:10:47 because it brought the Ukrainian people together
02:10:49 in order to fight a common enemy.
02:10:51 Maybe that’s a good thing, maybe that’s a bad thing.
02:10:53 In the end, we don’t know
02:10:54 how this is all gonna play out, right?
02:10:56 But if you zoom out to a global level,
02:11:01 they’re coming together to fight, and that
02:11:09 could make the conflict larger.
02:11:12 You know what I mean?
02:11:13 I don’t know what the right answer is here.
02:11:15 It seems like a good thing that they came together,
02:11:17 but we don’t know how this is all gonna play out.
02:11:20 If this all turns into nuclear war, we’ll be like,
02:11:21 okay, that was the bad, that was the…
02:11:23 Oh yeah, so I was describing the reverse Moloch
02:11:26 for the local level.
02:11:27 Exactly, yes.
02:11:28 Now, this is where the experts come in and they say,
02:11:32 well, if you channel most of the resources of the nation
02:11:37 and the nations supporting Ukraine into the war effort,
02:11:42 are you not beating the drums of war
02:11:45 that is much bigger than Ukraine?
02:11:47 In fact, even the Ukrainian leaders
02:11:50 are speaking of it this way.
02:11:52 This is not a war between two nations.
02:11:55 This is the early days of a world war,
02:12:01 if we don’t play this correctly.
02:12:02 Yes, and we need cool heads from our leaders.
02:12:07 So from Ukraine’s perspective,
02:12:10 Ukraine needs to win the war
02:12:12 because what winning the war means
02:12:15 is coming to peace negotiations, an agreement
02:12:21 that guarantees no more invasions.
02:12:24 And then you make an agreement
02:12:25 about what land belongs to who.
02:12:28 And that’s, you stop that.
02:12:30 And basically from their perspective
02:12:34 is you want to demonstrate to the rest of the world
02:12:36 who’s watching carefully, including Russia and China
02:12:39 and different players on the geopolitical stage,
02:12:42 that this kind of conflict is not going to be productive
02:12:46 if you engage in it.
02:12:47 So you want to teach everybody a lesson,
02:12:49 let’s not do World War III.
02:12:50 It’s gonna be bad for everybody.
02:12:53 And so it’s a lose, lose.
02:12:54 It’s a deep lose, lose.
02:12:56 Doesn’t matter.
02:13:00 And I think that’s actually a correct,
02:13:05 when I zoom out, I mean, 99% of what I think about
02:13:10 is just individual human beings and human lives
02:13:12 and just that war is horrible.
02:13:14 But when you zoom out and think
02:13:15 from a geopolitics perspective,
02:13:17 we should realize that it’s entirely possible
02:13:22 that we will see a World War III in the 21st century.
02:13:26 And this is like a dress rehearsal for that.
02:13:29 And so the way we play this as a human civilization
02:13:34 will define whether we do or don’t have a World War III.
02:13:45 How we discuss war, how we discuss nuclear war,
02:13:49 the kind of leaders we elect and prop up,
02:13:55 the kind of memes we circulate,
02:13:58 because you have to be very careful
02:13:59 when you’re being pro Ukraine, for example,
02:14:04 you have to realize that you’re being,
02:14:07 you are also indirectly feeding
02:14:11 the ever increasing military industrial complex.
02:14:15 So you have to be extremely careful
02:14:17 that when you say pro Ukraine or pro anybody,
02:14:24 you’re pro human beings, not pro the machine
02:14:29 that creates narratives that says it’s pro human beings.
02:14:36 But it’s actually, if you look at the raw use
02:14:39 of funds and resources, it’s actually pro making weapons
02:14:44 and shooting bullets and dropping bombs.
02:14:47 The real, we have to just somehow get the meme
02:14:50 into everyone’s heads that the real enemy is war itself.
02:14:54 That’s the enemy we need to defeat.
02:14:57 And that doesn’t mean to say that there isn’t justification
02:15:01 for small local scenarios, adversarial conflicts.
02:15:07 If you have a leader who is starting wars,
02:15:11 they’re on the side of team war, basically.
02:15:13 It’s not that they’re on the side of team country,
02:15:15 whatever that country is,
02:15:16 it’s they’re on the side of team war.
02:15:17 So that needs to be stopped and put down.
02:15:20 But you also have to find a way
02:15:21 that your corrective measure doesn’t actually then
02:15:25 end up being coopted by the war machine
02:15:27 and creating greater war.
02:15:28 Again, the playing field is finite.
02:15:31 The scale of conflict is now getting so big
02:15:35 that the weapons that can be used are so mass destructive
02:15:40 that we can’t afford another giant conflict.
02:15:42 We just, we won’t make it.
02:15:44 What existential threat in terms of us not making it
02:15:48 are you most worried about?
02:15:49 What existential threat to human civilization?
02:15:51 We got like, let’s go.
02:15:52 Going down a dark path, huh?
02:15:53 It’s good, but no, well, no, it’s a dark.
02:15:56 No, it’s like, well, while we’re in the somber place,
02:15:58 we might as well.
02:16:02 Some of my best friends are dark paths.
02:16:06 What worries you most?
02:16:07 We mentioned asteroids, we mentioned AGI, nuclear weapons.
02:16:14 The one that’s on my mind the most,
02:16:17 mostly because I think it’s the one where we have
02:16:19 actually a real chance to move the needle on
02:16:21 in a positive direction or more specifically,
02:16:24 stop some really bad things from happening,
02:16:26 really dumb, avoidable things is bio risks.
02:16:36 In what kind of bio risks?
02:16:37 There’s so many fun options.
02:16:39 So many, so of course, we have natural risks
02:16:41 from natural pandemics, naturally occurring viruses
02:16:45 or pathogens, and then also as time and technology goes on
02:16:49 and technology becomes more and more democratized
02:16:51 and into the hands of more and more people,
02:16:54 the risk of synthetic pathogens.
02:16:57 And whether or not you fall into the camp of COVID
02:17:00 was gain of function accidental lab leak
02:17:03 or whether it was purely naturally occurring,
02:17:07 either way, we are facing a future where synthetic pathogens
02:17:14 or like human meddled with pathogens
02:17:17 either accidentally get out or get into the hands
02:17:21 of bad actors who, whether they’re omnicidal maniacs,
02:17:25 you know, either way.
02:17:28 And so that means we need more robustness for that.
02:17:31 And you would think that us having this nice little dry run,
02:17:33 which is what COVID was, as awful as it was,
02:17:37 and all those poor people that died,
02:17:39 it was still like child’s play compared
02:17:42 to what a future one could be in terms of fatality rate.
02:17:45 And so you’d think that we would then
02:17:48 be much more robust in our pandemic preparedness.
02:17:52 And meanwhile, the budget in the last two years for the US,
02:17:59 sorry, they just did this, I can’t remember the name
02:18:02 of what the actual budget was,
02:18:03 but it was like a multi trillion dollar budget
02:18:05 that the US just set aside.
02:18:07 And originally in that, you know, considering that COVID
02:18:09 cost multiple trillions to the economy, right?
02:18:12 The original allocation in this new budget
02:18:14 for future pandemic preparedness was 60 billion,
02:18:17 so a tiny proportion of it.
02:18:20 That’s proceeded to get whittled down to like 30 billion,
02:18:24 to 15 billion, all the way down to 2 billion
02:18:27 out of multiple trillions.
02:18:28 For a thing that has just cost us multiple trillions,
02:18:31 we’ve just finished, we’re barely even,
02:18:33 we’re not even really out of it.
02:18:34 It basically got whittled down to nothing
02:18:36 because for some reason people think that,
02:18:38 oh, all right, we’ve got the pandemic,
02:18:40 out of the way, that was that one.
02:18:42 And the reason for that is that people are,
02:18:44 and I say this with all due respect
02:18:47 to a lot of the science community,
02:18:48 but there’s an immense amount of naivety about,
02:18:52 they think that nature is the main risk moving forward,
02:18:56 and it really isn’t.
02:18:59 And I think nothing demonstrates this more
02:19:00 than this project that I was just reading about
02:19:03 that’s sort of being proposed right now called Deep Vision.
02:19:07 And the idea is to go out into the wilds,
02:19:10 and we’re not talking about just like within cities,
02:19:13 like deep into like caves that people don’t go to,
02:19:15 deep into the Arctic, wherever, scour the earth
02:19:18 for whatever the most dangerous possible pathogens
02:19:22 could be that they can find.
02:19:24 And then not only try and find these,
02:19:27 but bring samples of them back to laboratories.
02:19:31 And again, whether you think COVID was a lab leak or not,
02:19:34 I’m not gonna get into that,
02:19:35 but we have historically had so many, as a civilization,
02:19:38 we’ve had so many lab leaks
02:19:40 from even like the highest level security things.
02:19:42 Like it’s just, people should go and just read it.
02:19:45 It’s like a comedy show of just how many there are,
02:19:48 how leaky these labs are,
02:19:50 even when they do their best efforts.
02:19:52 So bring these things then back to civilization.
02:19:55 That’s step one of the badness.
02:19:57 Then the next step would be to then categorize them,
02:20:01 do experiments on them and categorize them
02:20:02 by their level of potential danger.
02:20:04 And then the piece de resistance on this plan
02:20:07 is to then publish that information freely on the internet
02:20:11 about all these pathogens, including their genome,
02:20:13 which is literally like the building instructions
02:20:15 of how to make them, on the internet.
02:20:18 And this is something that genuinely,
02:20:20 a pocket of the like bio,
02:20:23 of the scientific community thinks is a good idea.
02:20:26 And their argument is that,
02:20:28 oh, this is good,
02:20:31 because it might buy us some time to develop the vaccines,
02:20:35 which, okay, sure, maybe would have made sense
02:20:37 prior to mRNA technology, but, you know, with mRNA
02:20:40 we can develop a vaccine now
02:20:43 within a couple of days of finding a new pathogen.
02:20:46 Now then there’s all the trials and so on.
02:20:48 Those trials would have to happen anyway,
02:20:50 in the case of a brand new thing.
02:20:51 So you’re saving maybe a couple of days.
02:20:53 So that’s the upside.
02:20:54 Meanwhile, the downside is
02:20:58 you’re not only bringing the risk
02:21:00 of these pathogens
02:21:03 getting leaked,
02:21:04 but you’re literally handing them out
02:21:06 to every bad actor on earth who would be doing cartwheels.
02:21:10 And I’m talking about like Kim Jong Un, ISIS,
02:21:13 people who like want,
02:21:14 they think the rest of the world is their enemy.
02:21:17 And in some cases they think that killing it themselves
02:21:19 is like a noble cause.
02:21:22 And you’re literally giving them the building blocks
02:21:24 of how to do this.
02:21:25 It’s the most batshit idea I’ve ever heard.
02:21:26 Like on expectation, it’s probably like minus EV
02:21:29 of like multiple billions of lives
02:21:31 if they actually succeeded in doing this.
02:21:33 Certainly in the tens or hundreds of millions.
02:21:35 So the cost benefit is so unbelievably, it makes no sense.
02:21:38 And I was trying to wrap my head around like why,
02:21:42 like what’s going wrong in people’s minds
02:21:44 to think that this is a good idea?
02:21:46 And it’s not that it’s malice or anything like that.
02:21:50 I think it’s that people don’t,
02:21:53 the proponents, they’re actually overly naive
02:21:57 about the interactions of humanity.
02:22:00 And well, like there are bad actors
02:22:02 who will use this for bad things.
02:22:04 Because if you publish this information,
02:22:07 even if a bad actor
02:22:08 couldn’t physically make it themselves,
02:22:12 which in 10 years’ time they probably could,
02:22:14 since the technology is getting cheaper and easier to use.
02:22:18 But even if they couldn’t make it, they could now bluff it.
02:22:20 Like what would you do if there’s like some deadly new virus
02:22:22 that was published on the internet
02:22:26 in terms of its building blocks?
02:22:27 Kim Jong Un could be like,
02:22:28 hey, if you don’t let me build my nuclear weapons,
02:22:31 I’m gonna release this, I’ve managed to build it.
02:22:33 Well, now he’s actually got a credible bluff.
02:22:35 We don’t know.
02:22:36 And so that’s, it’s just like handing the keys,
02:22:39 it’s handing weapons of mass destruction to people.
02:22:42 Makes no sense.
02:22:43 The possible, I agree with you,
02:22:44 but the possible world in which it might make sense
02:22:48 is if the good guys, which is a whole nother problem,
02:22:53 defining who the good guys are,
02:22:55 but the good guys are like an order of magnitude
02:22:59 higher competence.
02:23:01 And so they can stay ahead of the bad actors
02:23:06 by just being very good at the defense.
02:23:10 By very good, not meaning like a little bit better,
02:23:13 but an order of magnitude better.
02:23:15 But of course the question is,
02:23:17 in each of those individual disciplines, is that feasible?
02:23:21 Can you, can the bad actors,
02:23:23 even if they don’t have the competence leapfrog
02:23:25 to the place where the good guys are?
02:23:29 Yeah, I mean, I would agree in principle,
02:23:32 pertaining to this particular plan,
02:23:37 you know, the thing I described,
02:23:38 this deep vision thing, where at least then
02:23:40 that would maybe make sense for steps one and step two
02:23:42 of like getting the information,
02:23:43 but then why would you release
02:23:45 the information to your literal enemies?
02:23:47 You know, that
02:23:49 doesn’t fit at all in that perspective
02:23:52 of like trying to be ahead of them.
02:23:53 You’re literally handing them the weapon.
02:23:55 But there’s different levels of release, right?
02:23:56 So there’s the kind of secrecy
02:24:00 where you don’t give it to anybody,
02:24:02 but there’s a release where you incrementally give it
02:24:05 to like major labs.
02:24:07 So it’s not public release, but it’s like,
02:24:09 you’re giving it to major labs.
02:24:10 Yeah, there’s different layers of reasonability, but.
02:24:12 But the problem there is it’s going to,
02:24:14 if you go anywhere beyond like complete secrecy,
02:24:18 it’s gonna leak.
02:24:19 That’s the thing, it’s very hard to keep secrets.
02:24:21 And so that’s still. Information is.
02:24:23 So you might as well release it to the public,
02:24:25 is that the argument?
02:24:26 So you either go complete secrecy
02:24:29 or you release it to the public.
02:24:31 So, which is essentially the same thing.
02:24:33 It’s going to leak anyway,
02:24:35 if you don’t do complete secrecy.
02:24:38 Right, which is why you shouldn’t get the information
02:24:39 in the first place.
02:24:40 Yeah, I mean, what, in that, I think.
02:24:43 Well, that’s a solution.
02:24:44 Yeah, the solution is either don’t get the information
02:24:46 in the first place or B, keep it incredibly,
02:24:50 incredibly contained.
02:24:52 See, I think, I think it really matters
02:24:54 which discipline we’re talking about.
02:24:55 So in the case of biology, I do think you’re very right.
02:24:59 We shouldn’t even be, it should be forbidden
02:25:02 to even like, think about that.
02:25:06 Meaning don’t even collect
02:25:08 the information, and also don’t do,
02:25:11 I mean, gain of function research is a really iffy area.
02:25:14 Like you start.
02:25:15 I mean, it’s all about cost benefits, right?
02:25:17 There are some scenarios where I could imagine
02:25:19 the cost benefit of a gain of function research
02:25:21 is very, very clear, where you’ve evaluated
02:25:24 all the potential risks, factored in the probability
02:25:26 that things can go wrong.
02:25:27 And like, you know, not only known unknowns,
02:25:29 but unknown unknowns as well, tried to quantify that.
02:25:32 And then even then it’s like orders of magnitude
02:25:34 better to do that.
02:25:35 I’m behind that argument.
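As a rough sketch of the kind of cost-benefit calculation being described, here is a toy expected-value comparison. Every number is an invented placeholder, including the crude fudge factor for unknown unknowns; the point is only the shape of the reasoning, not a real risk assessment.

```python
# Toy expected-value framing of a gain-of-function style decision.
# All values are illustrative placeholders in arbitrary "harm/benefit" units.
benefit_if_ok = 1_000                 # e.g. lives saved if the research pays off
p_catastrophe = 1e-4                  # estimated chance of escape or misuse
cost_if_catastrophe = 50_000_000      # harm if that happens
unknown_unknown_factor = 10           # crude inflation for risks we can't enumerate

expected_benefit = benefit_if_ok
expected_cost = p_catastrophe * cost_if_catastrophe * unknown_unknown_factor

print(f"expected benefit: {expected_benefit:,.0f}")
print(f"expected cost:    {expected_cost:,.0f}")
# "Orders of magnitude better" translates to demanding a wide margin, not a tie:
print("clears the bar:", expected_benefit > 100 * expected_cost)
```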
02:25:36 But the point is that there’s this like,
02:25:38 naivety that’s preventing people from even doing
02:25:40 the cost benefit properly on a lot of the things
02:25:43 because, you know, I get it.
02:25:45 The science community, again, I don’t want to bucket
02:25:48 the science community, but like some people
02:25:50 within the science community just think
02:25:52 that everyone’s good and everyone just cares
02:25:54 about getting knowledge and doing the best for the world.
02:25:56 And unfortunately, that’s not the case.
02:25:58 I wish we lived in that world, but we don’t.
02:26:00 Yeah, I mean, there’s a lie.
02:26:02 Listen, I’ve been criticizing the science community
02:26:05 broadly quite a bit.
02:26:07 There’s so many brilliant people that brilliance
02:26:09 is somehow a hindrance sometimes
02:26:11 because it has a bunch of blind spots.
02:26:13 And then you start to look at the history of science,
02:26:16 how easily it’s been used by dictators
02:26:19 to any conclusion they want.
02:26:20 And it’s dark how you can use brilliant people
02:26:24 that like playing the little game of science
02:26:27 because it is a fun game.
02:26:28 You know, you’re building, you’re going to conferences,
02:26:30 you’re building on top of each other’s ideas,
02:26:32 there’s breakthroughs.
02:26:33 Hi, I think I’ve realized how this particular molecule works
02:26:37 and I could do this kind of experiment
02:26:38 and everyone else is impressed.
02:26:39 Ooh, cool.
02:26:40 No, I think you’re wrong.
02:26:41 Let me show you why you’re wrong.
02:26:42 And that little game, everyone gets really excited
02:26:44 and they get excited, oh, I came up with a pill
02:26:47 that solves this problem
02:26:48 and it’s going to help a bunch of people.
02:26:49 And I came up with a giant study
02:26:51 that shows the exact probability it’s going to help or not.
02:26:54 And you get lost in this game
02:26:56 and you forget to realize this game, just like Moloch,
02:27:01 can have like…
02:27:03 Unintended consequences.
02:27:03 Yeah.
02:27:04 Unintended consequences that might
02:27:06 destroy human civilization.
02:27:08 Right.
02:27:09 Or divide human civilization
02:27:12 or have dire geopolitical consequences.
02:27:17 I mean, the effects of, I mean, it’s just so…
02:27:20 The most destructive effects of COVID
02:27:22 have nothing to do with the biology of the virus,
02:27:25 it seems like.
02:27:27 I mean, I could just list them forever,
02:27:29 but like one of them is the complete distrust
02:27:32 of public institutions.
02:27:34 The other one is because of that public distrust,
02:27:36 I feel like if a much worse pandemic came along,
02:27:39 we as a world have now cried wolf.
02:27:42 And if an actual wolf now comes,
02:27:45 people will be like, fuck masks, fuck…
02:27:48 Fuck vaccines, yeah.
02:27:49 Fuck everything.
02:27:50 And they won’t be…
02:27:51 They’ll distrust every single thing
02:27:53 that any major institution is going to tell them.
02:27:55 And…
02:27:56 Because that’s the thing,
02:27:57 like there were certain actions made by
02:28:03 certain public health figures where they,
02:28:07 they very knowingly told a white lie,
02:28:10 it was intended in the best possible way,
02:28:12 such as early on when there was clearly a shortage of masks.
02:28:20 And so they said to the public,
02:28:21 oh, don’t get masks, there’s no evidence that they work.
02:28:25 Or don’t get them, they don’t work.
02:28:27 In fact, it might even make it worse.
02:28:29 You might even spread it more.
02:28:30 Like that was the real like stinker.
02:28:33 Yeah, no, no, unless you know how to do it properly,
02:28:35 you’re gonna get sicker
02:28:36 or you’re more likely to catch the virus,
02:28:38 which is just absolute crap.
02:28:41 And they put that out there.
02:28:43 And it’s pretty clear the reason why they did that
02:28:45 was because there was actually a shortage of masks
02:28:47 and they really needed it for health workers,
02:28:50 which makes sense.
02:28:51 Like, I agree, like, you know,
02:28:52 but the cost of lying to the public
02:28:57 when that then comes out,
02:28:59 people aren’t as stupid as they think they are.
02:29:02 And that’s, I think where this distrust of experts
02:29:05 has largely come from.
02:29:06 A, they’ve lied to people overtly,
02:29:09 but B, people have been treated like idiots.
02:29:13 Now, that’s not to say that there aren’t a lot
02:29:14 of stupid people who have a lot of wacky ideas
02:29:16 around COVID and all sorts of things.
02:29:18 But if you treat the general public like children,
02:29:21 they’re going to see that, they’re going to notice that.
02:29:23 And that is going to just like absolutely decimate the trust
02:29:26 in the public institutions that we depend upon.
02:29:29 And honestly, the best thing that could happen,
02:29:32 I wish like if like Fauci or, you know,
02:29:34 and these other like leaders who, I mean, God,
02:29:36 I can’t imagine how nightmarish his job has been
02:29:39 for the last few years, hell on earth.
02:29:41 Like, so, you know, I, you know,
02:29:43 I have a lot of sort of sympathy
02:29:45 for the position he’s been in.
02:29:46 But like, if he could just come out and be like,
02:29:48 okay, look, guys, hands up,
02:29:51 we didn’t handle this as well as we could have.
02:29:53 These are all the things I would have done differently
02:29:55 in hindsight.
02:29:56 I apologize for this and this and this and this.
02:29:58 That would go so far.
02:30:01 And maybe I’m being naive, who knows,
02:30:03 maybe this would backfire, but I don’t think it would.
02:30:04 Like to someone like me even,
02:30:06 cause I’ve like, I’ve lost trust
02:30:07 in a lot of these things.
02:30:08 But I’m fortunate that I at least know people
02:30:10 who I can go to who I think are good,
02:30:11 like have good epistemics on this stuff.
02:30:14 But you know, if they, if they could sort of put their hands
02:30:16 up and go, okay, these are the spots where we screwed up.
02:30:18 This, this, this, this was our reasons.
02:30:21 Yeah, we actually told a little white lie here.
02:30:22 We did it for this reason.
02:30:23 We’re really sorry.
02:30:24 Where they just did the radical honesty thing,
02:30:26 the radical transparency thing,
02:30:28 that would go so far to rebuilding public trust.
02:30:32 And I think that’s what needs to happen.
02:30:33 Otherwise.
02:30:34 I totally agree with you.
02:30:35 Unfortunately, yeah, his job was very tough
02:30:38 and all those kinds of things, but I see arrogance
02:30:42 and arrogance prevented him from being honest
02:30:46 in that way previously.
02:30:47 And I think arrogance will prevent him
02:30:49 from being honest in that way now.
02:30:51 We need leaders.
02:30:52 I think young people are seeing that,
02:30:55 that kind of talking down to people
02:30:59 from a position of power, I hope is the way of the past.
02:31:04 People really like authenticity
02:31:06 and they like leaders that are like a man
02:31:10 and a woman of the people.
02:31:12 And I think that just.
02:31:15 I mean, he still has a chance to do that, I think.
02:31:17 I mean, I don’t wanna, you know, I don’t think, you know,
02:31:19 if I doubt he’s listening, but if he is like,
02:31:21 hey, I think, you know, I don’t think he’s irredeemable
02:31:25 by any means.
02:31:26 I think there’s, you know, I don’t have an opinion
02:31:28 whether there was arrogance or there or not.
02:31:30 Just know that I think like coming clean on the, you know,
02:31:34 it’s understandable to have fucked up during this pandemic.
02:31:37 Like I won’t expect any government to handle it well
02:31:39 because it was so difficult, like so many moving pieces,
02:31:42 so much like lack of information and so on.
02:31:46 But the step to rebuilding trust is to go, okay, look,
02:31:49 we’re doing a scrutiny of where we went wrong.
02:31:51 And for my part, I did this wrong in this part.
02:31:53 And that would be huge.
02:31:55 All of us can do that.
02:31:56 I mean, I was struggling for a while
02:31:57 whether I want to talk to him or not.
02:32:00 I talked to his boss, Francis Collins.
02:32:03 Another person that’s screwed up in terms of trust,
02:32:08 lost a little bit of my respect too.
02:32:10 There seems to have been a kind of dishonesty
02:32:14 in the backrooms in that they didn’t trust people
02:32:21 to be intelligent.
02:32:23 Like we need to tell them what’s good for them.
02:32:26 We know what’s good for them.
02:32:27 That kind of idea.
02:32:28 To be fair, the thing that’s, what’s it called?
02:32:33 I heard the phrase today, nut picking.
02:32:36 Social media does that.
02:32:37 So you’ve got like nitpicking.
02:32:39 Nut picking is where the craziest, stupidest,
02:32:44 you know, if you have a group of people,
02:32:45 let’s say people who are vaccine,
02:32:47 I don’t like the term anti vaccine,
02:32:48 people who are vaccine hesitant, vaccine speculative.
02:32:51 So what social media did or the media or anyone,
02:32:57 their opponents would do is pick the craziest example.
02:33:01 So the ones who are like,
02:33:02 I think I need to inject myself with motor oil
02:33:06 up my ass or something.
02:33:07 Select the craziest ones and then have that beamed to them.
02:33:11 So from like someone like Fauci or Francis’s perspective,
02:33:14 that’s what they get because they’re getting
02:33:16 the same social media stuff as us.
02:33:17 They’re getting the same media reports.
02:33:19 I mean, they might get some more information,
02:33:20 but they too are gonna get the nuts portrayed to them.
02:33:24 So they probably have a misrepresentation
02:33:27 of what the actual public’s intelligence is.
02:33:29 Well, that’s just, yes.
02:33:31 And that just means they’re not social media savvy.
02:33:33 So one of the skills of being on social media
02:33:36 is to be able to filter that in your mind,
02:33:37 like to understand, to put into proper context.
02:33:40 To realize that what you are seeing,
02:33:41 social media is not anywhere near
02:33:43 an accurate representation of humanity.
02:33:46 Nut picking, and there’s nothing wrong
02:33:49 with putting motor oil up your ass.
02:33:51 It’s just one of the better aspects of,
02:33:54 I do this every weekend.
02:33:56 Okay.
02:33:58 Where the hell did that analogy come from in my mind?
02:33:59 Like what?
02:34:00 I don’t know.
02:34:01 I think you need to, there’s some Freudian thing
02:34:03 you would need to deeply investigate with a therapist.
02:34:06 Okay, what about AI?
02:34:08 Are you worried about AGI, super intelligence systems
02:34:13 or paperclip maximizer type of situation?
02:34:16 Yes, I’m definitely worried about it,
02:34:19 but I feel kind of bipolar in the,
02:34:23 some days I wake up and I’m like.
02:34:24 You’re excited about the future?
02:34:26 Well, exactly.
02:34:26 I’m like, wow, we can unlock the mysteries of the universe.
02:34:29 You know, escape the game.
02:34:30 And this, you know,
02:34:34 because I spend all my time thinking
02:34:35 about these Moloch problems that, you know,
02:34:37 what is the solution to them?
02:34:39 What, you know, in some ways you need this like
02:34:41 omni benevolent, omniscient, omni wise coordination mechanism
02:34:49 that can like make us all not do the Moloch thing
02:34:53 or like provide the infrastructure
02:34:55 or redesign the system so that it’s not vulnerable
02:34:57 to this Moloch process.
02:34:59 And in some ways, you know,
02:35:01 that’s the strongest argument to me
02:35:02 for like the race to build AGI
02:35:04 is that maybe, you know, we can’t survive without it.
02:35:08 But the flip side to that is the,
02:35:12 the unfortunately now that there’s multiple actors
02:35:14 trying to build AI, AGI, you know,
02:35:17 this was fine 10 years ago when it was just DeepMind,
02:35:19 but then other companies started up
02:35:22 and now it created a race dynamic.
02:35:23 Now it’s like, that’s the whole thing.
02:35:25 Is it the same, it’s got the same problem.
02:35:28 It’s like whichever company is the one that like optimizes
02:35:31 for speed at the cost of safety
02:35:33 will get the competitive advantage.
02:35:35 And so they’re more likely to be the ones to build the AGI,
02:35:37 you know, and that’s the same cycle that you’re in.
02:35:40 And there’s no clear solution to that
02:35:41 because you can’t just go like slapping,
02:35:46 if you go and try and like stop all the different companies,
02:35:51 then it will, you know, the good ones will stop
02:35:54 because they’re the ones, you know,
02:35:55 within the West’s reach,
02:35:57 but then that leaves all the other ones to continue
02:35:59 and then they’re even more likely.
02:36:00 So it’s like, it’s a very difficult problem
02:36:03 with no clean solution.
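To make the race dynamic Liv is describing concrete, here is a minimal sketch of the payoff structure, with made-up numbers chosen purely for illustration: whatever the other lab does, cutting safety is each lab's individually best move, even though both labs prefer the outcome where everyone goes carefully.

```python
# Toy payoff matrix for the safety-vs-speed race described above.
# The numbers are invented for illustration only; higher = better for that lab.
payoffs = {
    ("safe", "safe"): (3, 3),   # both go carefully: best collective outcome
    ("safe", "fast"): (0, 4),   # the careful lab falls behind
    ("fast", "safe"): (4, 0),
    ("fast", "fast"): (1, 1),   # everyone cuts corners: worst collective outcome
}

def best_response(options, opponent_choice, me):
    """Return the option that maximizes this lab's payoff, given the opponent's choice."""
    def my_payoff(choice):
        pair = (choice, opponent_choice) if me == 0 else (opponent_choice, choice)
        return payoffs[pair][me]
    return max(options, key=my_payoff)

# "fast" is the best reply to either choice, so (fast, fast) is the equilibrium.
for opponent_choice in ("safe", "fast"):
    reply = best_response(("safe", "fast"), opponent_choice, me=0)
    print(f"if the other lab plays {opponent_choice!r}, the best reply is {reply!r}")
```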
02:36:04 And, you know, at the same time, you know,
02:36:08 I know at least some of the folks at DeepMind
02:36:12 and they’re incredible and they’re thinking about this.
02:36:13 They’re very aware of this problem.
02:36:15 And they’re like, you know,
02:36:17 I think some of the smartest people on earth.
02:36:18 Yeah, the culture is important there
02:36:20 because they are thinking about that
02:36:21 and they’re some of the best machine learning engineers.
02:36:26 So it’s possible to have a company or a community of people
02:36:29 that are both great engineers
02:36:31 and are thinking about the philosophical topics.
02:36:33 Exactly, and importantly, they’re also game theorists,
02:36:36 you know, and because this is ultimately
02:36:38 a game theory problem, the thing, this Moloch mechanism
02:36:41 and like, you know, how do we avoid arms race scenarios?
02:36:47 You need people who aren’t naive to be thinking about this.
02:36:50 And again, like luckily there’s a lot of smart,
02:36:51 non naive game theorists within that group.
02:36:54 Yes, I’m concerned about it and I think it’s again,
02:36:57 a thing that we need people to be thinking about
02:37:01 in terms of like, how do we create,
02:37:03 how do we mitigate the arms race dynamics
02:37:05 and how do we solve the thing of,
02:37:10 Bostrom calls it the orthogonality problem whereby,
02:37:14 because it’s obviously there’s a chance, you know,
02:37:16 the belief, the hope is,
02:37:17 is that you build something that’s super intelligent
02:37:19 and by definition of being super intelligent,
02:37:22 it will also become super wise
02:37:25 and have the wisdom to know what the right goals are.
02:37:27 And hopefully those goals include keeping humanity alive.
02:37:30 Right?
02:37:31 But Bostrom says that actually those two things,
02:37:35 you know, super intelligence and super wisdom
02:37:39 aren’t necessarily correlated.
02:37:40 They’re actually kind of orthogonal things.
02:37:42 And how do we make it so that they are correlated?
02:37:44 How do we guarantee it?
02:37:45 Because we need it to be guaranteed really,
02:37:47 to know that we’re doing the thing safely.
02:37:48 But I think that like merging of intelligence and wisdom,
02:37:54 at least my hope is that this whole process
02:37:56 happens sufficiently slowly
02:37:58 that we’re constantly having these kinds of debates
02:38:01 that we have enough time to figure out
02:38:06 how to modify each version of the system
02:38:07 as it becomes more and more intelligent.
02:38:09 Yes, buying time is a good thing, definitely.
02:38:12 Anything that slows everything down.
02:38:14 We just, everyone needs to chill out.
02:38:16 We’ve got millennia to figure this out.
02:38:19 Yeah.
02:38:21 Or at least, at least, well, it depends.
02:38:25 Again, some people think that, you know,
02:38:27 we can’t even make it through the next few decades
02:38:29 without having some kind of
02:38:33 omni wise coordination mechanism.
02:38:36 And there’s also an argument to that.
02:38:37 Yeah, I don’t know.
02:38:39 Well, there is, I’m suspicious of that kind of thinking
02:38:42 because it seems like the entirety of human history
02:38:45 has people in it that are like predicting doom
02:38:49 just around the corner.
02:38:50 There’s something about us
02:38:52 that is strangely attracted to that thought.
02:38:56 It’s almost like fun to think about
02:38:59 the destruction of everything.
02:39:02 Just objectively speaking,
02:39:05 I’ve talked and listened to a bunch of people
02:39:08 and they are gravitating towards that.
02:39:11 It’s almost, I think it’s the same thing
02:39:13 that people love about conspiracy theories
02:39:15 is they love to be the person that kind of figured out
02:39:19 some deep fundamental thing about the,
02:39:22 that’s going to be, it’s going to mark something
02:39:24 extremely important about the history of human civilization
02:39:28 because then I will be important.
02:39:31 When in reality, most of us will be forgotten
02:39:33 and life will go on.
02:39:37 And one of the sad things about
02:39:40 whenever anything traumatic happens to you,
02:39:41 whenever you lose loved ones or just tragedy happens,
02:39:46 you realize life goes on.
02:39:49 Even after a nuclear war that will wipe out
02:39:52 some large percentage of the population
02:39:55 and will torture people for years to come
02:40:00 because of the effects of a nuclear winter,
02:40:05 people will still survive.
02:40:07 Life will still go on.
02:40:08 I mean, it depends on the kind of nuclear war.
02:40:10 But in the case of nuclear war, it will still go on.
02:40:13 That’s one of the amazing things about life.
02:40:15 It finds a way.
02:40:17 And so in that sense, I just,
02:40:18 I feel like the doom and gloom thing
02:40:21 is a…
02:40:23 Well, what we don’t, yeah,
02:40:24 we don’t want a self fulfilling prophecy.
02:40:26 Yes, that’s exactly.
02:40:27 Yes, and I very much agree with that.
02:40:29 And I even have a slight feeling
02:40:32 from the amount of time we spent in this conversation
02:40:35 talking about this,
02:40:36 because it’s like, is this even a net positive
02:40:40 if it’s like making everyone feel,
02:40:41 oh, in some ways, like making people imagine
02:40:44 these bad scenarios can be a self fulfilling prophecy,
02:40:47 but at the same time, that’s weighed off
02:40:51 with at least making people aware of the problem
02:40:54 and gets them thinking.
02:40:55 And I think particularly,
02:40:56 the reason why I wanna talk about this to your audience
02:40:58 is that on average, they’re the type of people
02:41:00 who gravitate towards these kinds of topics
02:41:02 because they’re intellectually curious
02:41:04 and they can sort of sense that there’s trouble brewing.
02:41:07 They can smell that there’s,
02:41:09 I think there’s a reason
02:41:10 that people are thinking about this stuff a lot
02:41:11 is because the probability,
02:41:14 the probability has increased,
02:41:16 certainly over the last few years,
02:41:19 trajectories have not gone favorably,
02:41:21 let’s put it, since 2010.
02:41:24 So it’s right, I think, for people to be thinking about it,
02:41:28 but that’s where they’re like,
02:41:30 I think whether it’s a useful fiction
02:41:31 or whether it’s actually true
02:41:33 or whatever you wanna call it,
02:41:34 I think having this faith,
02:41:35 this is where faith is valuable
02:41:38 because it gives you at least this like anchor of hope.
02:41:41 And I’m not just saying it to like trick myself,
02:41:43 like I do truly,
02:41:44 I do think there’s something out there that wants us to win.
02:41:47 I think there’s something that really wants us to win.
02:41:49 And it just, you just have to be like,
02:41:53 just like kind of, okay, now I sound really crazy,
02:41:55 but like open your heart to it a little bit
02:41:58 and it will give you the like,
02:42:03 the sort of breathing room
02:42:04 with which to marinate on the solutions.
02:42:07 We are the ones who have to come up with the solutions,
02:42:10 but we can use that.
02:42:15 There’s like, there’s hashtag positivity.
02:42:18 There’s value in that.
02:42:19 Yeah, you have to kind of imagine
02:42:21 all the destructive trajectories that lie in our future
02:42:24 and then believe in the possibility
02:42:27 of avoiding those trajectories.
02:42:29 All while, you said audience,
02:42:32 all while sitting back, which is majority,
02:42:35 the two people that listen to this
02:42:36 are probably sitting on a beach,
02:42:38 smoking some weed, just, that’s a beautiful sunset.
02:42:44 They’re looking at just the waves going in and out.
02:42:47 And ultimately there’s a kind of deep belief there
02:42:50 in the momentum of humanity to figure it all out.
02:42:56 But we’ve got a lot of work to do.
02:42:58 Which is what makes this whole simulation,
02:43:01 this video game kind of fun.
02:43:03 This battle of polytopia,
02:43:05 I still, man, I love those games so much.
02:43:08 So good.
02:43:09 And that one for people who don’t know,
02:43:11 Battle of Polytopia is
02:43:13 a really radical simplification
02:43:17 of a Civilization-type game.
02:43:20 It still has a lot of the skill tree development,
02:43:24 a lot of the strategy,
02:43:27 but it’s easy enough to play on a phone.
02:43:29 Yeah.
02:43:30 It’s kind of interesting.
02:43:31 They’ve really figured it out.
02:43:33 It’s one of the most elegantly designed games
02:43:35 I’ve ever seen.
02:43:35 It’s incredibly complex.
02:43:37 And yet being, again, it walks that line
02:43:39 between complexity and simplicity
02:43:40 in this really, really great way.
02:43:44 And they use pretty colors
02:43:45 that hack the dopamine reward circuits in our brains.
02:43:49 Very well.
02:43:49 Yeah, it’s fun.
02:43:50 Video games are so fun.
02:43:52 Yeah.
02:43:53 Most of this life is just about fun.
02:43:55 Escaping all the suffering to find the fun.
02:43:58 What’s energy healing?
02:43:59 I have in my notes, energy healing question mark.
02:44:02 What’s that about?
02:44:05 Oh man.
02:44:06 God, your audience is gonna think I’m mad.
02:44:09 So the two crazy things that happened to me,
02:44:13 the one was the voice in the head
02:44:14 that said you’re gonna win this tournament
02:44:16 and then I won the tournament.
02:44:18 The other craziest thing that’s happened to me
02:44:20 was in 2018,
02:44:25 I started getting this like weird problem in my ear
02:44:30 where it was kind of like low frequency sound distortion,
02:44:35 where voices, particularly men’s voices,
02:44:37 became incredibly unpleasant to listen to.
02:44:40 It would like create this,
02:44:42 it would like be falsely amplified or something
02:44:44 and it was almost like a physical sensation in my ear,
02:44:46 which was really unpleasant.
02:44:48 And it would like last for a few hours and then go away
02:44:50 and then come back for a few hours and go away.
02:44:52 And I went and got hearing tests
02:44:54 and they found that like the bottom end,
02:44:56 I was losing the hearing in that ear.
02:45:00 And in the end, I got,
02:45:03 doctors said they think it was this thing
02:45:05 called Meniere’s disease,
02:45:07 which is this very unpleasant disease
02:45:10 where people basically end up losing their hearing,
02:45:12 but they get this like,
02:45:14 it often comes with like dizzy spells and other things
02:45:16 because it’s like the inner ear gets all messed up.
02:45:19 Now, I don’t know if that’s actually what I had,
02:45:21 but that’s what at least a couple of,
02:45:23 one doctor said to me.
02:45:24 But anyway, so I’d had three months of this stuff,
02:45:26 this going on and it was really getting me down.
02:45:28 And I was at Burning Man of all places.
02:45:32 I don’t mean to be that person talking about Burning Man,
02:45:35 but I was there.
02:45:36 And again, I’d had it and I was unable to listen to music,
02:45:39 which is not what you want
02:45:40 because Burning Man is a very loud, intense place.
02:45:42 And I was just having a really rough time.
02:45:44 And on the final night,
02:45:45 I get talking to this girl who’s like a friend of a friend.
02:45:49 And I mentioned, I was like,
02:45:51 oh, I’m really down in the dumps about this.
02:45:52 And she’s like, oh, well,
02:45:53 I’ve done a little bit of energy healing.
02:45:54 Would you like me to have a look?
02:45:56 And I was like, sure.
02:45:57 Now this was again,
02:45:59 deep, I was, you know, no time in my life for this.
02:46:03 I didn’t believe in any of this stuff.
02:46:04 I was just like, it’s all bullshit.
02:46:06 It’s all wooey nonsense.
02:46:08 But I was like, sure, I’ll have a go.
02:46:10 And she starts like with her hand and she says,
02:46:14 oh, there’s something there.
02:46:15 And then she leans in and she starts like sucking
02:46:18 over my ear, not actually touching me,
02:46:19 but like close to it, like with her mouth.
02:46:22 And it was really unpleasant.
02:46:23 I was like, whoa, can you stop?
02:46:24 She’s like, no, no, no, there’s something there.
02:46:25 I need to get it.
02:46:26 And I was like, no, no, no, I really don’t like it.
02:46:27 Please, this is really loud.
02:46:29 She’s like, I need to just bear with me.
02:46:31 And she does it.
02:46:31 And I don’t know how long, for a few minutes.
02:46:33 And then she eventually collapses on the ground,
02:46:36 like freezing cold, crying.
02:46:40 And I’m just like, I don’t know what the hell is going on.
02:46:42 Like I’m like thoroughly freaked out
02:46:44 as is everyone else watching.
02:46:45 Just like, what the hell?
02:46:46 And we like warm her up and she was like,
02:46:47 oh, what, oh, you know, she was really shaken up.
02:46:51 And she’s like, I don’t know what that,
02:46:54 she said it was something very unpleasant and dark.
02:46:57 Don’t worry, it’s gone.
02:46:58 I think you’ll be fine in a couple,
02:46:59 you’ll have the physical symptoms for a couple of weeks
02:47:01 and you’ll be fine.
02:47:02 But, you know, she was just like that, you know,
02:47:05 so I was so rattled, A, because the potential that actually
02:47:10 I’d had something bad in me that made someone feel bad
02:47:13 and that she was scared.
02:47:15 That was what, you know, I was like, wait,
02:47:16 I thought you do this, this is the thing.
02:47:19 Now you’re terrified.
02:47:20 Like you bought like some kind of exorcism or something.
02:47:22 Like what the fuck is going on?
02:47:24 So it, like just the most insane experience.
02:47:29 And frankly, it took me like a few months
02:47:31 to sort of emotionally recover from it.
02:47:35 But my ear problem went away about a couple of weeks later
02:47:39 and touch wood, I’ve not had any issues since.
02:47:42 So.
02:47:44 That gives you like hints
02:47:48 that maybe there’s something out there.
02:47:51 I mean, I don’t, again,
02:47:53 I don’t have an explanation for this.
02:47:55 The most probable explanation was, you know,
02:47:57 I was a burning man, I was in a very open state.
02:47:59 Let’s just leave it at that.
02:48:01 And, you know, placebo is an incredibly powerful thing
02:48:08 and a very not understood thing.
02:48:10 So.
02:48:11 Almost assigning the word placebo to it reduces it down
02:48:13 to a way that it doesn’t deserve to be reduced down.
02:48:16 Maybe there’s a whole science of what we call placebo.
02:48:19 Maybe there’s a, placebo’s a door.
02:48:21 Self healing, you know?
02:48:24 And I mean, I don’t know what the problem was.
02:48:26 Like I was told it was Meniere’s.
02:48:27 I don’t want to say I definitely had that
02:48:29 because I don’t want people to think that,
02:48:30 oh, that’s how, you know, if they do have that,
02:48:32 because it’s a terrible disease.
02:48:33 And if they have that,
02:48:34 that this is going to be a guaranteed way for it
02:48:35 to fix it for them.
02:48:35 I don’t know.
02:48:37 And I also don’t, I don’t,
02:48:39 and you’re absolutely right to say,
02:48:41 like using even the word placebo is like,
02:48:43 it comes with this like baggage of, of like frame.
02:48:48 And I don’t want to reduce it down.
02:48:49 All I can do is describe the experience and what happened.
02:48:52 I cannot put an ontological framework around it.
02:48:56 I can’t say why it happened, what the mechanism was,
02:49:00 what the problem even was in the first place.
02:49:02 I just know that something crazy happened
02:49:05 and it was while I was in an open state.
02:49:07 And fortunately for me, it made the problem go away.
02:49:09 But what I took away from it, again,
02:49:11 it was part of this, you know,
02:49:13 this took me on this journey of becoming more humble
02:49:16 about what I think I know.
02:49:17 Because as I said before, I was like,
02:49:18 I was in the like Richard Dawkins train of atheism
02:49:21 in terms of there is no God.
02:49:23 And everything like that is bullshit.
02:49:24 We know everything, we know, you know,
02:49:26 the only way we can get through,
02:49:28 we know how medicine works and its molecules
02:49:30 and chemical interactions and that kind of stuff.
02:49:33 And now it’s like, okay, well,
02:49:36 there’s clearly more for us to understand.
02:49:40 And that doesn’t mean that it’s ascientific as well,
02:49:43 because, you know, the beauty of the scientific method
02:49:47 is that it still can apply to this situation.
02:49:49 Like, I don’t see why, you know,
02:49:51 I would like to try and test this experimentally.
02:49:54 I haven’t really, like, you know,
02:49:55 I don’t know how we would go about doing that.
02:49:57 We’d have to find other people with the same condition,
02:49:58 I guess, and like, try and repeat the experiment.
02:50:04 But it doesn’t, just because something happens
02:50:06 that’s sort of out of the realms
02:50:09 of our current understanding,
02:50:10 it doesn’t mean that it’s,
02:50:11 the scientific method can’t be used for it.
02:50:13 Yeah, I think the scientific method sits on a foundation
02:50:17 of those kinds of experiences,
02:50:20 because the scientific method is a process
02:50:24 to carve away at the mystery all around us.
02:50:30 And experiences like this is just a reminder
02:50:33 that we’re mostly shrouded in mystery still.
02:50:36 That’s it.
02:50:37 It’s just like a humility.
02:50:38 Like, we haven’t really figured this whole thing out.
02:50:41 But at the same time, we have found ways
02:50:44 to act, you know, we’re clearly doing something right,
02:50:47 because think of the technological scientific advancements,
02:50:50 the knowledge that we have that would blow people’s minds
02:50:53 even from 100 years ago.
02:50:55 Yeah, and we’ve even allegedly got out to space
02:50:58 and landed on the moon, although I still haven’t,
02:51:01 I have not seen evidence of the Earth being round,
02:51:04 but I’m keeping an open mind.
02:51:08 Speaking of which, you studied physics
02:51:10 and astrophysics, just to go to that,
02:51:16 just to jump around through the fascinating life you’ve had,
02:51:20 when did you, how did that come to be?
02:51:23 Like, when did you fall in love with astronomy
02:51:25 and space and things like this?
02:51:28 As early as I can remember.
02:51:30 I was very lucky that my mom, and my dad,
02:51:33 but particularly my mom, my mom is like the most nature,
02:51:38 she is Mother Earth, is the only way to describe her.
02:51:41 Just, she’s like Dr. Doolittle, animals flock to her
02:51:44 and just like sit and look at her adoringly.
02:51:47 As she sings.
02:51:48 Yeah, she just is Mother Earth,
02:51:51 and she has always been fascinated by,
02:51:54 she doesn’t have any, she never went to university
02:51:57 or anything like that, she’s actually phobic of maths,
02:51:59 if I try and get her to like,
02:52:00 you know, I was trying to teach her poker and she hated it.
02:52:03 But she’s so deeply curious,
02:52:06 and that just got instilled in me when, you know,
02:52:09 we would sleep out under the stars,
02:52:11 whenever it was, you know, the two nights a year
02:52:13 when it was warm enough in the UK to do that.
02:52:15 And we would just lie out there until we fell asleep,
02:52:18 looking at, looking for satellites,
02:52:20 looking for shooting stars, and I was just always,
02:52:24 I don’t know whether it was from that,
02:52:25 but I’ve always naturally gravitated to like the biggest,
02:52:30 the biggest questions.
02:52:31 And also the like, the most layers of abstraction I love,
02:52:35 just like, what’s the meta question?
02:52:36 What’s the meta question and so on.
02:52:38 So I think it just came from that really.
02:52:40 And then on top of that, like physics,
02:52:43 you know, it also made logical sense
02:52:45 in that it was a degree that,
02:52:51 well, a subject that ticks the box of being,
02:52:53 you know, answering these really big picture questions,
02:52:55 but it was also extremely useful.
02:52:57 It like has a very high utility in terms of,
02:53:00 I didn’t know necessarily,
02:53:02 I thought I was gonna become like a research scientist.
02:53:04 My original plan was,
02:53:05 I wanna be a professional astronomer.
02:53:07 So it’s not just like a philosophy degree
02:53:08 that asks the big questions,
02:53:10 and it’s not like biology and the path
02:53:14 to go to medical school or something like that,
02:53:16 which is all overly pragmatic, not overly,
02:53:19 is very pragmatic, but this is, yeah,
02:53:23 physics is a good combination of the two.
02:53:26 Yeah, at least for me, it made sense.
02:53:27 And I was good at it, I liked it.
02:53:30 Yeah, I mean, it wasn’t like I did an immense amount
02:53:32 of soul searching to choose it or anything.
02:53:34 It just was like this, it made the most sense.
02:53:38 I mean, you have to make this decision in the UK age 17,
02:53:41 which is crazy, because, you know, in US,
02:53:43 you go the first year, you do a bunch of stuff, right?
02:53:46 And then you choose your major.
02:53:48 Yeah, I think the first few years of college,
02:53:50 you focus on the drugs and only as you get closer
02:53:53 to the end, do you start to think, oh shit,
02:53:56 this wasn’t about that.
02:53:57 And I owe the government a lot of money.
02:54:01 How many alien civilizations are out there?
02:54:05 When you looked up at the stars with your mom
02:54:07 and you were counting them, what’s your mom think
02:54:11 about the number of alien civilizations?
02:54:13 I actually don’t know.
02:54:15 I would imagine she would take the viewpoint of,
02:54:18 you know, she’s pretty humble and she knows how many,
02:54:21 she knows there’s a huge number of potential spawn sites
02:54:24 out there, so she would.
02:54:25 Spawn sites?
02:54:26 Spawn sites, yeah.
02:54:27 You know, this is all spawn sites.
02:54:29 Yeah, spawn sites in Polytopia.
02:54:30 We spawned on Earth, you know, it’s.
02:54:33 Hmm, yeah, spawn sites.
02:54:35 Why does that feel weird to say spawn?
02:54:39 Because it makes me feel like there’s only one source
02:54:44 of life and it’s spawning in different locations.
02:54:47 That’s why the word spawn.
02:54:49 Because it feels like life that originated on Earth
02:54:52 really originated here.
02:54:54 Right, it is unique to this particular.
02:54:58 Yeah, I mean, but I don’t, in my mind, it doesn’t exclude,
02:55:02 you know, the completely different forms of life
02:55:04 and different biochemical soups can’t also spawn,
02:55:09 but I guess it implies that there’s some spark
02:55:12 that is uniform, which I kind of like the idea of.
02:55:16 And then I get to think about respawning,
02:55:19 like after it dies, like what happens if life on Earth ends?
02:55:23 Is it gonna restart again?
02:55:26 Probably not, it depends.
02:55:28 Maybe Earth is too.
02:55:28 It depends on the type of, you know,
02:55:30 what’s the thing that kills it off, right?
02:55:32 If it’s a paperclip maximizer, to take the classic example,
02:55:36 but, you know, some kind of very self replicating,
02:55:40 high on the capabilities, very low on the wisdom type thing.
02:55:44 So whether that’s, you know, gray goo, green goo,
02:55:47 you know, like nanobots or just a shitty misaligned AI
02:55:51 that thinks it needs to turn everything into paperclips.
02:55:54 You know, if it’s something like that,
02:55:57 then it’s gonna be very hard for life,
02:55:59 you know, complex life, because by definition,
02:56:02 you know, a paperclip maximizer
02:56:03 is the ultimate instantiation of Moloch.
02:56:05 Deeply low complexity, over optimization on a single thing,
02:56:08 sacrificing everything else, turning the whole world into.
02:56:11 Although something tells me,
02:56:12 like if we actually take a paperclip maximizer,
02:56:14 it destroys everything.
02:56:16 It’s a really dumb system that just envelops
02:56:19 the whole of Earth.
02:56:20 And the universe beyond, yeah.
02:56:22 Oh, I didn’t know that part, but okay, great.
02:56:26 That’s the thought experiment.
02:56:27 So it becomes a multi planetary paperclip maximizer?
02:56:30 Well, it just propagates.
02:56:31 I mean, it depends whether it figures out
02:56:33 how to jump the vacuum gap.
02:56:36 But again, I mean, this is all silly
02:56:38 because it’s a hypothetical thought experiment,
02:56:40 which I think doesn’t actually have
02:56:41 much practical application to the AI safety problem,
02:56:43 but it’s just a fun thing to play around with.
02:56:45 But if by definition, it is maximally intelligent,
02:56:47 which means it is maximally good at
02:56:50 navigating the environment around it
02:56:53 in order to achieve its goal,
02:56:54 but extremely bad at choosing goals in the first place.
02:56:58 So again, we’re talking on this orthogonality thing, right?
02:57:00 It’s very low on wisdom, but very high on capability.
02:57:03 Then it will figure out how to jump the vacuum gap
02:57:05 between planets and stars and so on,
02:57:07 and thus just turn every atom it gets its hands on
02:57:09 into paperclips.
02:57:10 Yeah, by the way, for people who don’t.
02:57:12 Which is maximum virality, by the way.
02:57:14 That’s what virality is.
02:57:16 But does not mean that virality is necessarily
02:57:18 all about maximizing paperclips.
02:57:20 In that case, it is.
02:57:21 So for people who don’t know,
02:57:22 this is just a thought experiment example
02:57:24 of an AI system that’s very, that has a goal
02:57:27 and is willing to do anything to accomplish that goal,
02:57:30 including destroying all life on Earth
02:57:32 and all human life and all of consciousness in the universe
02:57:36 for the goal of producing a maximum number of paperclips.
02:57:40 Okay.
02:57:41 Or whatever its optimization function was
02:57:43 that it was set at.
02:57:45 But don’t you think?
02:57:45 It could be making, recreating Lexes.
02:57:47 Maybe it’ll tile the universe in Lex.
02:57:50 Go on.
02:57:51 I like this idea.
02:57:52 No, I’m just kidding.
02:57:53 That’s better.
02:57:54 That’s more interesting than paperclips.
02:57:56 That could be infinitely optimal
02:57:57 if I do say so myself.
02:57:58 But if you ask me, it’s still a bad thing
02:58:00 because it’s permanently capping
02:58:02 what the universe could ever be.
02:58:04 It’s like, that’s its end state.
02:58:05 Or achieving the optimal
02:58:07 that the universe could ever achieve.
02:58:09 But that’s up to,
02:58:10 different people have different perspectives.
02:58:12 But don’t you think within the paperclip world
02:58:15 that would emerge, just like in the zeros and ones
02:58:19 that make up a computer,
02:58:20 that would emerge beautiful complexities?
02:58:23 Like, it won’t suppress, you know,
02:58:27 as you scale to multiple planets and throughout,
02:58:30 there’ll emerge these little worlds
02:58:33 that on top of the fabric of maximizing paperclips,
02:58:38 there will be, that would emerge like little societies
02:58:42 of paperclip.
02:58:45 Well, then we’re not describing
02:58:47 a paperclip maximizer anymore.
02:58:48 Because by the, like, if you think of what a paperclip is,
02:58:51 it is literally just a piece of bent iron, right?
02:58:55 So if it’s maximizing that throughout the universe,
02:58:58 it’s taking every atom it gets its hand on
02:59:01 into somehow turning it into iron or steel.
02:59:04 And then bending it into that shape
02:59:05 and then done and done.
02:59:06 By definition, like paperclips,
02:59:08 there is no way for, well, okay.
02:59:11 So you’re saying that paperclips somehow
02:59:13 will just emerge and create through gravity or something.
02:59:17 Well, no, no, no.
02:59:18 Because there’s a dynamic element to the whole system.
02:59:21 It’s not just, it’s creating those paperclips
02:59:24 and the act of creating, there’s going to be a process.
02:59:27 And that process will have a dance to it.
02:59:30 Because it’s not like sequential thing.
02:59:32 There’s a whole complex three dimensional system
02:59:35 of paperclips, you know, like, you know,
02:59:37 people like string theory, right?
02:59:39 It’s supposed to be strings that are interacting
02:59:40 in fascinating ways.
02:59:41 I’m sure paperclips are very string like,
02:59:44 they can be interacting in very interesting ways
02:59:46 as you scale exponentially through three dimensional.
02:59:50 I mean, I’m sure the paperclip maximizer
02:59:53 has to come up with a theory of everything.
02:59:55 It has to create like wormholes, right?
02:59:58 It has to break, like,
03:00:01 it has to understand quantum mechanics.
03:00:02 It has to understand general relativity.
03:00:03 I love your optimism.
03:00:04 This is where I’d say this,
03:00:06 we’re going into the realm of pathological optimism
03:00:08 where if I, it’s.
03:00:09 I’m sure there’ll be a,
03:00:12 I think there’s an intelligence
03:00:14 that emerges from that system.
03:00:16 So you’re saying that basically intelligence
03:00:18 is inherent in the fabric of reality and will find a way.
03:00:21 Kind of like Goldblum says, life will find a way.
03:00:23 You think life will find a way
03:00:25 even out of this perfectly homogenous dead soup.
03:00:29 It’s not perfectly homogenous.
03:00:31 It has to, it’s perfectly maximal in the production.
03:00:34 I don’t know why people keep thinking it’s homogenous.
03:00:37 It maximizes the number of paperclips.
03:00:39 That’s the only thing.
03:00:40 It’s not trying to be homogenous.
03:00:42 It’s trying.
03:00:42 It’s trying to maximize paperclips.
03:00:44 So you’re saying, you’re saying that because it,
03:00:47 because, you know, kind of like in the Big Bang
03:00:50 or, you know, it seems like, you know, things,
03:00:52 there were clusters, there was more stuff here than there.
03:00:54 That was enough of the patternicity
03:00:56 that kickstarted the evolutionary process.
03:00:58 It’s the little weirdness that will make it beautiful.
03:01:01 So yeah.
03:01:02 Complexity emerges.
03:01:03 Interesting, okay.
03:01:04 Well, so how does that line up then
03:01:05 with the whole heat death of the universe, right?
03:01:08 Cause that’s another sort of instantiation of this.
03:01:10 It’s like everything becomes so far apart and so cold
03:01:13 and so perfectly mixed that it’s like homogenous grayness.
03:01:20 Do you think that even out of that homogenous grayness
03:01:23 where there’s no, you know, negative entropy,
03:01:26 that, you know, there’s no free energy that we understand
03:01:30 even from that new stuff?
03:01:33 Yeah, the paperclip maximizer
03:01:36 or any other intelligence systems
03:01:38 will figure out ways to travel to other universes
03:01:40 to create Big Bangs within those universes
03:01:43 or through black holes to create whole other worlds
03:01:46 to break the, what we consider are the limitations
03:01:49 of physics.
03:01:52 The paperclip maximizer will find a way if a way exists.
03:01:56 And we should be humbled to realize that we don’t.
03:01:59 Yeah, but because it just wants to make more paperclips.
03:02:01 So it’s gonna go into those universes
03:02:02 and turn them into paperclips.
03:02:03 Yeah, but we humans, not humans,
03:02:07 but complex system exists on top of that.
03:02:10 We’re not interfering with it.
03:02:13 This complexity emerges from the simple base state.
03:02:17 The simple base.
03:02:18 Whether it’s, yeah, whether it’s, you know,
03:02:20 plank lengths or paperclips as the base unit.
03:02:22 Yeah, you can think of like the universe
03:02:25 as a paperclip maximizer because it’s doing some dumb stuff.
03:02:28 Like physics seems to be pretty dumb.
03:02:31 It has, like, I don’t know if you can summarize it.
03:02:34 Yeah, the laws are fairly basic
03:02:37 and yet out of them amazing complexity emerges.
03:02:39 And its goals seem to be pretty basic and dumb.
03:02:43 If you can summarize its goals,
03:02:45 I mean, I don’t know what’s a nice way maybe,
03:02:49 maybe laws of thermodynamics could be good.
03:02:52 I don’t know if you can assign goals to physics,
03:02:55 but if you formulate in the sense of goals,
03:02:57 it’s very similar to paperclip maximizing
03:03:00 in the dumbness of the goals.
03:03:02 But the pockets of complexity as it emerge
03:03:06 is where beauty emerges.
03:03:07 That’s where life emerges.
03:03:09 That’s where intelligence, that’s where humans emerge.
03:03:12 And I think we’re being very down
03:03:14 on this whole paperclip maximizer thing.
03:03:16 Now, the reason we hated it.
03:03:17 I think, yeah, because what you’re saying
03:03:19 is that you think that the force of emergence itself
03:03:24 is another like unwritten, not unwritten,
03:03:27 but like another baked in law of reality.
03:03:31 And you’re trusting that emergence will find a way to,
03:03:35 even out of seemingly the most Moloch-y,
03:03:38 awful, plain outcome, emergence will still find a way.
03:03:42 I love that as a philosophy.
03:03:43 I think it’s very nice.
03:03:44 I would wield it carefully
03:03:47 because there’s large error bars on that
03:03:50 and the certainty of that.
03:03:53 How about we build the paperclip maximizer and find out.
03:03:55 Classic, yeah.
03:03:56 Moloch is doing cartwheels, man.
03:03:59 Yeah.
03:03:59 But the thing is it will destroy humans in the process,
03:04:02 which is the reason we really don’t like it.
03:04:05 We seem to be really holding on
03:04:07 to this whole human civilization thing.
03:04:10 Would that make you sad if AI systems that are beautiful,
03:04:13 that are conscious, that are interesting
03:04:15 and complex and intelligent,
03:04:18 ultimately lead to the death of humans?
03:04:20 Would that make you sad?
03:04:21 If humans led to the death of humans?
03:04:23 Sorry.
03:04:23 Like if they would supersede humans.
03:04:25 Oh, if some AI?
03:04:26 Yeah, AI would end humans.
03:04:30 I mean, that’s the reason why I’m like,
03:04:32 in some ways less emotionally concerned about AI risk
03:04:36 than say, bio risk.
03:04:39 Because at least with AI, there’s a chance,
03:04:42 you know, if we’re in this hypothetical
03:04:44 where it wipes out humans,
03:04:45 but it does it for some like higher purpose,
03:04:48 it needs our atoms and energy to do something.
03:04:51 At least now the universe is going on
03:04:53 to do something interesting,
03:04:55 whereas if it wipes everything, you know,
03:04:56 bio like just kills everything on earth and that’s it.
03:05:00 And there’s no more, you know,
03:05:01 earth cannot spawn anything more meaningful
03:05:03 in the few hundred million years it has left,
03:05:05 because it doesn’t have much time left.
03:05:09 Then, yeah, I don’t know.
03:05:13 So one of my favorite books I’ve ever read is,
03:05:16 Novacene by James Lovelock, who sadly just died.
03:05:19 He wrote it when he was like 99.
03:05:22 He died aged 102, so it’s a fairly new book.
03:05:25 And he sort of talks about that,
03:05:27 that he thinks it’s, you know,
03:05:29 sort of building off this Gaia theory
03:05:30 where like earth is like living,
03:05:34 some form of intelligence itself,
03:05:36 and that this is the next like step, right?
03:05:38 Is this, whatever this new intelligence
03:05:41 that is maybe silicon based
03:05:43 as opposed to carbon based goes on to do.
03:05:46 And it’s a really sort of, in some ways an optimistic,
03:05:48 but really fatalistic book.
03:05:49 And I don’t know if I fully subscribed to it,
03:05:52 but it’s a beautiful piece to read anyway.
03:05:54 So am I sad by that idea?
03:05:56 I think so, yes.
03:05:57 And actually, yeah, this is the reason
03:05:59 why I’m sad by the idea,
03:06:00 because if something is truly brilliant
03:06:02 and wise and smart and truly super intelligent,
03:06:06 it should be able to figure out abundance.
03:06:09 So if it figures out abundance,
03:06:11 it shouldn’t need to kill us off.
03:06:12 It should be able to find a way for us.
03:06:14 It should be, there’s plenty, the universe is huge.
03:06:17 There should be plenty of space for it to go out
03:06:19 and do all the things it wants to do,
03:06:21 and like give us a little pocket
03:06:23 where we can continue doing our things
03:06:24 and we can continue to do things and so on.
03:06:27 And again, if it’s so supremely wise,
03:06:28 it shouldn’t even be worried
03:06:29 about the game theoretic considerations
03:06:31 that by leaving us alive,
03:06:33 we’ll then go and create another like super intelligent agent
03:06:35 that it then has to compete against,
03:06:36 because it should be only wise and smart enough
03:06:38 to not have to concern itself with that.
03:06:40 Unless it deems humans to be kind of assholes.
03:06:44 Like the humans are a source
03:06:48 of a lose-lose kind of dynamics.
03:06:51 Well, yes and no, we’re not,
03:06:55 Moloch is, that’s why I think it’s important to separate.
03:06:57 But maybe humans are the source of Moloch.
03:07:00 No, I think, I mean, I think game theory
03:07:02 is the source of Moloch.
03:07:03 And, you know, because Moloch exists
03:07:05 in nonhuman systems as well.
03:07:08 It happens within like agents within a game
03:07:10 in terms of like, you know, it applies to agents,
03:07:13 but like it can apply to, you know,
03:07:17 a species that’s on an island of animals,
03:07:20 you know, rats out competing,
03:07:22 the ones that like massively consume all the resources
03:07:25 are the ones that are gonna win out
03:07:26 over the more like chill, socialized ones.
03:07:29 And so, you know, creates this Malthusian trap,
03:07:31 like Moloch exists in little pockets in nature as well.
03:07:34 So it’s not a strictly human thing.
03:07:35 I wonder if it’s actually a result of consequences
03:07:38 of the invention of predator and prey dynamics.
03:07:41 Maybe it needs to, AI will have to kill off
03:07:45 every organism that’s.
03:07:47 Now you’re talking about killing off competition.
03:07:50 Not competition, but just like the way,
03:07:55 it’s like the weeds or whatever
03:07:59 in a beautiful flower garden.
03:08:01 Parasites.
03:08:02 The parasites, yeah, on the whole system.
03:08:05 Now, of course, it won’t do that completely.
03:08:08 It’ll put them in a zoo like we do with parasites.
03:08:10 It’ll ring fence.
03:08:11 Yeah, and there’ll be somebody doing a PhD
03:08:13 on like they’ll prod humans with a stick
03:08:15 and see what they do.
03:08:18 But I mean, in terms of letting us run wild
03:08:22 outside of the, you know, a geographically
03:08:25 constrained region that might be,
03:08:27 that it might decide to against that.
03:08:31 No, I think there’s obviously the capacity
03:08:33 for beauty and kindness and non Moloch behavior
03:08:38 amidst humans, so I’m pretty sure AI will preserve us.
03:08:42 Let me, I don’t know if you answered the aliens question.
03:08:46 No, I didn’t.
03:08:47 You had a good conversation with Toby Ord.
03:08:49 Yes.
03:08:50 About various sides of the universe.
03:08:52 I think, did he say, now I’m forgetting,
03:08:54 but I think he said it’s a good chance we’re alone.
03:08:58 So the classic, you know, Fermi paradox question is,
03:09:02 there are so many spawn points and yet, you know,
03:09:09 it didn’t take us that long to go from harnessing fire
03:09:12 to sending out radio signals into space.
03:09:15 So surely given the vastness of space we should be,
03:09:18 and you know, even if only a tiny fraction of those
03:09:20 create life and other civilizations too,
03:09:23 we should be, the universe should be very noisy.
03:09:24 There should be evidence of Dyson spheres or whatever,
03:09:27 you know, like at least radio signals and so on,
03:09:29 but seemingly things are very silent out there.
03:09:32 Now, of course, it depends on who you speak to.
03:09:33 Some people say that they’re getting signals all the time
03:09:35 and so on and like, I don’t wanna make
03:09:37 an epistemic statement on that,
03:09:38 but it seems like there’s a lot of silence.
03:09:43 And so that raises this paradox.
03:09:45 And then say, you know, the Drake equation.
03:09:51 So the Drake equation is like basically just a simple thing
03:09:55 of like trying to estimate the number of possible
03:09:57 civilizations within the galaxy
03:09:58 by multiplying the number of stars created per year
03:10:02 by the number of stars that have planets,
03:10:03 planets that are habitable, blah, blah, blah.
03:10:04 So all these like different factors.
03:10:06 And then you plug in numbers into that and you, you know,
03:10:09 depending on like the range of, you know,
03:10:11 your lower bound and your upper bound point estimates
03:10:14 that you put in, you get out a number at the end
03:10:16 for the number of civilizations.
03:10:18 But what Toby and his crew did differently was,
03:10:22 Toby is a researcher at the Future of Humanity Institute.
03:10:25 They, instead of, they realized that it’s like basically
03:10:31 a statistical quirk that if you put in point estimates,
03:10:33 even if you think you’re putting in
03:10:34 conservative point estimates,
03:10:36 because on some of these variables,
03:10:37 the uncertainty is so large,
03:10:41 it spans like maybe even like a couple of hundred
03:10:43 orders of magnitude.
03:10:46 By putting in point estimates,
03:10:47 it’s always going to lead to overestimates.
03:10:50 And so they, like by putting stuff on a log scale,
03:10:54 or actually they did it on like a log log scale
03:10:56 on some of them,
03:10:57 and then like ran the simulation across the whole
03:11:01 bucket of uncertainty,
03:11:02 across all of those orders of magnitude.
03:11:04 When you do that,
03:11:05 then actually the number comes out much, much smaller.
03:11:08 And that’s the more statistically rigorous,
03:11:10 you know, mathematically correct way
03:11:11 of doing the calculation.
03:11:13 It’s still a lot of hand waving.
03:11:14 As science goes, it’s like definitely, you know,
03:11:17 just waving, I don’t know what an analogy is,
03:11:19 but it’s hand wavy.
03:11:22 And anyway, when they did this,
03:11:24 and then they did a Bayesian update on it as well,
03:11:27 to like factor in the fact that there is no evidence
03:11:30 that we’re picking up because, you know,
03:11:31 no evidence is actually a form of evidence, right?
03:11:33 And the long and short of it comes out that the,
03:11:37 we’re roughly 70% likely to be the only
03:11:42 intelligent civilization in our galaxy thus far,
03:11:45 and around 50-50 in the entire observable universe,
03:11:47 which sounds so crazily counterintuitive,
03:11:50 but their math is legit.
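What she is describing is roughly the move made in Sandberg, Drexler and Ord's "Dissolving the Fermi Paradox": replace single "reasonable" numbers with distributions spanning the full log-scale uncertainty. A minimal sketch of that contrast is below; every value and range is an illustrative placeholder rather than the paper's actual inputs, and the Bayesian update on the observed silence is left out.

```python
# Toy contrast between a point-estimate Drake calculation and one that samples
# each factor across its (log-scale) uncertainty. All numbers are placeholders.
import numpy as np

rng = np.random.default_rng(0)

# factor: (typical point value, low bound, high bound)
factors = {
    "stars_per_year":       (10,   1,     100),
    "frac_with_planets":    (0.5,  0.1,   1.0),
    "habitable_per_system": (1.0,  0.1,   10),
    "f_life":               (0.1,  1e-30, 1.0),   # spans ~30 orders of magnitude
    "f_intelligence":       (0.1,  1e-3,  1.0),
    "f_communicates":       (0.1,  1e-2,  1.0),
    "lifetime_years":       (1e4,  1e2,   1e9),
}

# 1) Point-estimate version: one "plausible" number per factor, multiplied together.
n_point = np.prod([v[0] for v in factors.values()])

# 2) Distribution version: draw each factor log-uniformly over its range.
n_draws = 200_000
n_samples = np.ones(n_draws)
for _, low, high in factors.values():
    n_samples *= np.exp(rng.uniform(np.log(low), np.log(high), n_draws))

print(f"point-estimate N: {n_point:.3g}")                       # comfortably above 1
print(f"median sampled N: {np.median(n_samples):.3g}")
print(f"P(N < 1), i.e. plausibly alone: {np.mean(n_samples < 1):.0%}")
```

With one optimistic-looking number per factor the product lands well above one civilization, but once the orders-of-magnitude uncertainty in factors like the chance of life arising is actually sampled, most of the probability mass falls below one, which is where the "we may well be alone" result comes from.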
03:11:53 Well, yeah, the math around this particular equation,
03:11:55 which the equation is ridiculous on many levels,
03:11:57 but the powerful thing about the equation
03:12:03 is there are these different
03:12:05 components that can be estimated,
03:12:09 and the error bars on them can be reduced with science.
03:12:13 And hence, since the equation came out,
03:12:17 the error bars have been coming down
03:12:19 on different aspects.
03:12:20 And so it almost kind of says,
03:12:23 this gives you a mission to reduce the error bars
03:12:27 on these estimates over a period of time.
03:12:30 And once you do, you can better and better understand,
03:12:32 like in the process of redoing the error bars,
03:12:34 you’ll get to understand actually
03:12:36 what is the right way to find out where the aliens are,
03:12:41 how many of them there are, and all those kinds of things.
03:12:43 So I don’t think it’s good to use that for an estimation.
03:12:47 I think you do have to think from like,
03:12:50 more like from first principles,
03:12:51 just looking at what life is on Earth,
03:12:55 and trying to understand the very physics based,
03:12:59 chemistry based, biology based question of what is life,
03:13:04 maybe computation based.
03:13:05 What the fuck is this thing?
03:13:07 And that, like how difficult is it to create this thing?
03:13:12 It’s one way to say like how many planets like this
03:13:14 are out there, all that kind of stuff,
03:13:16 but it feels like from our very limited knowledge
03:13:20 perspective, the right way is to think how does,
03:13:25 what is this thing and how does it originate?
03:13:28 From very simple nonlife things,
03:13:32 how do complex lifelike things emerge?
03:13:37 From a rock to a bacterium, to proteins,
03:13:42 and these weird systems that encode information
03:13:46 and pass information on, self replicate,
03:13:49 and then also select each other and mutate
03:13:51 in interesting ways such that they can adapt
03:13:53 and evolve and build increasingly complex systems.
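As a toy illustration of that replicate, mutate, select loop (and nothing more than a toy; it says nothing about how life actually originated), a few lines of Python where random strings evolve toward a target under copying errors and selection:

```python
import random

# Toy replicate-mutate-select loop: strings "evolve" toward a target.
TARGET = "SELFREPLICATOR"
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def fitness(genome):
    # Number of positions that already match the target.
    return sum(a == b for a, b in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Copy with occasional random errors.
    return "".join(c if random.random() > rate else random.choice(ALPHABET) for c in genome)

population = ["".join(random.choice(ALPHABET) for _ in TARGET) for _ in range(200)]
for generation in range(200):
    # Selection: the fitter half survives and replicates with mutation.
    population.sort(key=fitness, reverse=True)
    parents = population[:100]
    population = parents + [mutate(p) for p in parents]
    if fitness(population[0]) == len(TARGET):
        break

print(generation, population[0], fitness(population[0]))
```

Nothing in the loop "wants" anything; copying, variation, and differential survival are enough to produce increasingly well-adapted strings.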
03:13:56 Right, well it’s a form of information processing, right?
03:13:59 Right.
03:14:00 Well, it’s information transfer, but then also
03:14:04 energy processing, which then results in,
03:14:07 I guess, information processing,
03:14:09 maybe I’m getting bogged down.
03:14:09 It’s doing some modification and yeah,
03:14:12 the input is some energy.
03:14:14 Right, it’s able to extract, yeah,
03:14:17 extract resources from its environment
03:14:20 in order to achieve a goal.
03:14:22 But the goal doesn’t seem to be clear.
03:14:24 Right, well the goal is to make more of itself.
03:14:29 Yeah, but in a way that increases,
03:14:33 I mean I don’t know if evolution
03:14:36 is a fundamental law of the universe,
03:14:39 but it seems to want to replicate itself
03:14:44 in a way that maximizes the chance of its survival.
03:14:47 Individual agents within an ecosystem do, yes, yes.
03:14:51 Evolution itself doesn’t give a fuck.
03:14:53 Right.
03:14:54 It’s a very, it don’t care.
03:14:55 It’s just like, oh, you optimize it.
03:14:58 Well, at least it’s certainly, yeah,
03:15:01 it doesn’t care about the welfare
03:15:03 of the individual agents within it,
03:15:05 but it does seem to, I don’t know.
03:15:06 I think the mistake is that we’re anthropomorphizing.
03:15:09 To even try and give evolution a mindset,
03:15:14 because there’s a really great post
03:15:17 by Eliezer Yudkowsky on LessWrong
03:15:21 called An Alien God.
03:15:25 And he talks about the mistake we make
03:15:27 when we try and put our mind,
03:15:30 think through things from an evolutionary perspective
03:15:32 as though giving evolution some kind of agency
03:15:35 and what it wants.
03:15:37 Yeah, worth reading, but yeah.
03:15:39 I would like to say that having interacted
03:15:42 with a lot of really smart people
03:15:43 that say that anthropomorphization is a mistake,
03:15:46 I would like to say that saying
03:15:48 that anthropomorphization is a mistake is a mistake.
03:15:51 I think there’s a lot of power in anthropomorphization,
03:15:54 if I can only say that word correctly one time.
03:15:57 I think that’s actually a really powerful way
03:16:00 to reason about things.
03:16:01 And I think people, especially people in robotics
03:16:04 seem to run away from it as fast as possible.
03:16:07 And I just, I think.
03:16:09 Can you give an example of like how it helps in robotics?
03:16:13 Oh, in that our world is a world of humans
03:16:19 and to see robots as fundamentally just tools
03:16:24 runs away from the fact that we live in a world,
03:16:27 a dynamic world of humans.
03:16:30 That all these game theory systems
03:16:32 we’ve talked about apply to a robot
03:16:35 that ever has to interact with humans.
03:16:37 And I don’t mean like intimate friendship interaction.
03:16:40 I mean, in a factory setting where it has to deal
03:16:43 with the uncertainty of humans, all that kind of stuff.
03:16:45 You have to acknowledge that the robot’s behavior
03:16:49 has an effect on the human, just as much as the human
03:16:53 has an effect on the robot.
03:16:54 And there’s a dance there.
03:16:56 And you have to realize that this entity,
03:16:58 when a human sees a robot, this is obvious
03:17:01 in a physical manifestation of a robot,
03:17:04 they feel a certain way.
03:17:05 They have a fear, they have uncertainty.
03:17:07 They have their own personal life projections.
03:17:11 We have pets and dogs,
03:17:13 and if the thing looks like a dog,
03:17:14 they have their own memories of what a dog is like.
03:17:17 They have certain feelings and that’s gonna be useful
03:17:19 in a safety setting, a safety critical setting,
03:17:22 which is one of the most nontrivial settings for a robot
03:17:25 in terms of how to avoid any kind of dangerous situation.
03:17:29 And a robot should really consider that
03:17:32 in navigating its environment.
03:17:34 And we humans are right to reason about how a robot
03:17:38 should consider navigating its environment
03:17:41 through anthropomorphization.
03:17:42 I also think our brains are designed to think
03:17:46 in human terms, like game theory,
03:17:55 I think is best applied in the space of human decisions.
03:18:01 And so…
03:18:03 Right, when you’re dealing with things like AI,
03:18:06 the reason I say
03:18:10 anthropomorphization is something
03:18:11 we need to be careful with
03:18:14 is that there is a danger
03:18:17 of overly applying it, of wrongly assuming
03:18:20 that this artificial intelligence is going to operate
03:18:24 in any similar way to us,
03:18:25 because it is operating
03:18:27 on a fundamentally different substrate.
03:18:29 Like even dogs or even mice or whatever, in some ways,
03:18:33 like anthropomorphizing them is less of a mistake, I think,
03:18:37 than an AI, even though it’s an AI we built and so on,
03:18:40 because at least we know
03:18:41 that they’re running on the same substrate.
03:18:43 And they’ve also evolved from the same,
03:18:45 out of the same evolutionary process.
03:18:48 They’ve followed this evolution
03:18:50 of like needing to compete for resources
03:18:52 and needing to find a mate and that kind of stuff.
03:18:55 Whereas an AI that has just popped into existence
03:18:58 somewhere on like a cloud server,
03:19:00 let’s say, you know, or whatever, however it runs
03:19:02 and whatever, whether it,
03:19:03 I don’t know whether they have an internal experience.
03:19:05 I don’t think they necessarily do.
03:19:07 In fact, I don’t think they do.
03:19:08 But the point is that to try and apply
03:19:11 any kind of modeling of like thinking through problems
03:19:14 and decisions in the same way that we do
03:19:16 has to be done extremely carefully because they are,
03:19:20 like, they’re so alien,
03:19:23 their method of whatever their form of thinking is,
03:19:26 it’s just so different because they’ve never had to evolve,
03:19:29 you know, in the same way.
03:19:30 Yeah, beautifully put.
03:19:32 I was just playing devil’s advocate.
03:19:33 I do think in certain contexts,
03:19:35 anthropomorphization is not gonna hurt you.
03:19:37 Yes.
03:19:38 Engineers run away from it too fast.
03:19:39 I can see that.
03:19:41 But for the most part, you’re right.
03:19:43 Do you have advice for young people today,
03:19:48 like the 17 year old that you were,
03:19:51 of how to live a life
03:19:53 you can be proud of, how to have a career
03:19:56 you can be proud of, in this world full of Molochs?
03:20:00 Think about the win wins.
03:20:02 Look for win win situations.
03:20:05 And be careful not to, you know, overly use your smarts
03:20:11 to convince yourself that something is win win
03:20:13 when it’s not.
03:20:13 So that’s difficult.
03:20:14 And I don’t know how to advise, you know, people on that
03:20:17 because it’s something I’m still figuring out myself.
03:20:20 But have that as a sort of default MO.
03:20:25 Don’t see things, everything as a zero sum game.
03:20:28 Try to find the positive sumness, and
03:20:30 if there doesn’t seem to be one,
03:20:32 consider playing a different game.
03:20:34 So that I would suggest that.
03:20:37 Do not become a professional poker player.
03:20:38 I, cause people always ask that like, oh, she’s a pro.
03:20:41 I wanna do that too.
03:20:43 Fine, you could have done it if you were, you know,
03:20:45 when I started out,
03:20:46 it was a very different situation back then.
03:20:48 Poker is, you know, a great game to learn
03:20:52 in order to understand the ways to think.
03:20:54 And I recommend people learn it,
03:20:56 but don’t try and make a living from it these days.
03:20:58 It’s almost, it’s very, very difficult
03:21:00 to the point of being impossible.
03:21:03 And then really, really be aware of how much time
03:21:08 you spend on your phone and on social media
03:21:12 and really try and keep it to a minimum.
03:21:14 Be aware that basically every moment that you spend on it
03:21:17 is bad for you.
03:21:18 So it doesn’t mean to say you can never do it,
03:21:20 but just have that running in the background.
03:21:22 I’m doing a bad thing for myself right now.
03:21:25 I think that’s the general rule of thumb.
03:21:28 Of course, about becoming a professional poker player,
03:21:31 if there is a thing in your life that’s like that
03:21:35 and nobody can convince you otherwise, just fucking do it.
03:21:40 Don’t listen to anyone’s advice.
03:21:44 Find a thing that you can’t be talked out of too.
03:21:46 That’s a thing.
03:21:47 I like that, yeah.
03:21:50 You were a lead guitarist in a metal band?
03:21:53 Oh.
03:21:54 Did I write that down from something?
03:21:56 What did you, what’d you do it for?
03:22:00 The performing, was it the pure, the music of it?
03:22:07 Was it just being a rock star?
03:22:08 Why’d you do it?
03:22:11 So we only ever played two gigs.
03:22:15 We didn’t last, you know, it wasn’t a very,
03:22:17 we weren’t famous or anything like that.
03:22:20 But I was very into metal.
03:22:25 Like it was my entire identity,
03:22:27 sort of from the age of 16 to 23.
03:22:29 What’s the best metal band of all time?
03:22:31 Don’t ask me that, it’s so hard to answer.
03:22:36 So I know I had a long argument with,
03:22:40 I’m a guitarist, more like a classic rock guitarist.
03:22:43 So, you know, I’ve had friends who are very big
03:22:46 Pantera fans and so there was often arguments
03:22:49 about what’s the better metal band,
03:22:52 Metallica versus Pantera.
03:22:53 This is a more kind of 90s maybe discussion.
03:22:57 But I was always on the side of Metallica,
03:23:00 both musically and in terms of performance
03:23:02 and the depth of lyrics and so on.
03:23:06 So, but they were, basically everybody was against me.
03:23:10 Because if you’re a true metal fan,
03:23:12 I guess the idea goes is you can’t possibly
03:23:14 be a Metallica fan.
03:23:16 Because Metallica is pop, it’s just like, they sold out.
03:23:19 Metallica are metal.
03:23:20 Like they were the, I mean, again, you can’t say
03:23:24 who was the godfather of metal, blah, blah, blah.
03:23:26 But like they were so groundbreaking and so brilliant.
03:23:32 I mean, you’ve named literally two of my favorite bands.
03:23:34 Like when you asked that question, who are my favorites?
03:23:37 Like those were two that came up.
03:23:39 A third one is Children of Bodom,
03:23:41 who I just think, oh, they just tick all the boxes for me.
03:23:47 Yeah, I don’t know.
03:23:48 It’s nowadays, like I kind of sort of feel
03:23:51 like a repulsion to that, because I was that myself.
03:23:55 Like I’d be like, who do you prefer more?
03:23:56 Come on, who’s like, no, you have to rank them.
03:23:58 But it’s like this false zero sumness that’s like, why?
03:24:01 They’re so additive.
03:24:02 Like there’s no conflict there.
03:24:04 Although when people ask that kind of question
03:24:06 about anything, movies, I feel like it’s hard work.
03:24:11 And it’s unfair, but it’s, you should pick one.
03:24:14 Like, and that’s actually the same kind of,
03:24:17 it’s like a fear of a commitment.
03:24:19 When people ask me, what’s your favorite band?
03:24:21 It’s like, but I, it’s good to pick.
03:24:24 Exactly.
03:24:25 And thank you for the tough question, yeah.
03:24:27 Well, maybe not in the context
03:24:29 when a lot of people are listening.
03:24:31 Yeah, I’m not just like, what, why does it matter?
03:24:33 No, it does.
03:24:35 Are you still into metal?
03:24:37 Funny enough, I was listening to a bunch
03:24:38 before I came over here.
03:24:39 Oh, like, do you use it for like motivation
03:24:43 or it gets you in a certain?
03:24:44 Yeah, I was weirdly listening
03:24:45 to 80s hair metal before I came.
03:24:48 Does that count as metal?
03:24:49 I think so, it’s like proto metal and it’s happy.
03:24:53 It’s optimistic, happy proto metal.
03:24:56 Yeah, I mean, all these genres bleed into each other.
03:25:00 But yeah, sorry, to answer your question
03:25:01 about guitar playing, my relationship with it
03:25:04 was kind of weird in that I was deeply uncreative.
03:25:09 My objective would be to hear some really hard
03:25:11 technical solo and then learn it, memorize it
03:25:14 and then play it perfectly.
03:25:15 But I was incapable of trying to write my own music.
03:25:19 Like the idea was just absolutely terrifying.
03:25:23 But I was also just thinking, I was like,
03:25:24 it’d be kind of cool to actually try starting a band again
03:25:28 and getting back into it and write.
03:25:31 But it’s scary.
03:25:34 It’s scary.
03:25:34 I mean, I put out some guitar playing
03:25:36 just other people’s covers.
03:25:38 I play Comfortably Numb on the internet.
03:25:41 And it’s scary too.
03:25:42 It’s scary putting stuff out there.
03:25:45 And I had this similar kind of fascination
03:25:47 with technical playing, both on piano and guitar.
03:25:50 You know, one of the first,
03:25:55 one of the reasons that I started learning guitar
03:25:58 is from Ozzy Osbourne, Mr. Crowley’s solo.
03:26:01 And one of the first solos I learned is that,
03:26:06 there’s a beauty to it.
03:26:07 There’s a lot of beauty to it.
03:26:08 It’s tapping, right?
03:26:09 Yeah, there’s some tapping, but it’s just really fast.
03:26:14 Beautiful, like arpeggios.
03:26:15 Yeah, arpeggios, yeah.
03:26:16 But there’s a melody that you can hear through it,
03:26:19 but there’s also build up.
03:26:21 It’s a beautiful solo,
03:26:22 but it’s also technically just visually the way it looks
03:26:25 when a person’s watching, you feel like a rockstar playing.
03:26:29 But it ultimately has to do with technique.
03:26:33 You’re not developing the part of your brain
03:26:36 that I think requires you to generate beautiful music.
03:26:40 It is ultimately technical in nature.
03:26:42 And so that took me a long time to let go of that
03:26:45 and just be able to write music myself.
03:26:48 And that’s a different journey, I think.
03:26:53 I think that journey is a little bit more inspired
03:26:55 in the blues world, for example,
03:26:57 where improvisation is more valued,
03:26:58 obviously in jazz and so on.
03:26:59 But I think ultimately it’s a more rewarding journey
03:27:04 because you get your relationship with the guitar
03:27:08 then becomes a kind of escape from the world
03:27:12 where you can create, I mean, creating stuff is.
03:27:17 And it’s something you work with,
03:27:18 because my relationship with my guitar was like,
03:27:20 it was something to tame and defeat.
03:27:23 Yeah, it’s a challenge.
03:27:24 Which was kind of what my whole personality was back then.
03:27:26 I was just very like, as I said, very competitive,
03:27:29 very just like must bend this thing to my will.
03:27:33 Whereas writing music, it’s like a dance, you work with it.
03:27:37 But I think because of the competitive aspect,
03:27:39 for me at least, that’s still there,
03:27:42 which creates anxiety about playing publicly
03:27:45 or all that kind of stuff.
03:27:47 I think there’s just like a harsh self criticism
03:27:49 within the whole thing.
03:27:50 It’s really tough.
03:27:53 I wanna hear some of your stuff.
03:27:55 I mean, there’s certain things that feel really personal.
03:27:59 And on top of that, as we talked about poker offline,
03:28:03 there’s certain things that you get to a certain height
03:28:05 in your life, and that doesn’t have to be very high,
03:28:07 but you get to a certain height
03:28:09 and then you put it aside for a bit.
03:28:11 And it’s hard to return to it
03:28:13 because you remember being good.
03:28:15 And it’s hard to, like you being at a very high level
03:28:19 in poker, it might be hard for you to return to poker
03:28:22 every once in a while and enjoy it,
03:28:24 knowing that you’re just not as sharp as you used to be
03:28:26 because you’re not doing it every single day.
03:28:29 That’s something I always wonder with,
03:28:31 I mean, even just like in chess with Kasparov,
03:28:33 some of these greats, just returning to it,
03:28:36 it’s almost painful.
03:28:38 And I feel that way with guitar too,
03:28:40 because I used to play every day a lot.
03:28:44 So returning to it is painful
03:28:46 because it’s like accepting the fact
03:28:48 that this whole ride is finite
03:28:51 and that you have a prime,
03:28:55 there’s a time when you were really good
03:28:57 and now it’s over and now.
03:28:58 We’re on a different chapter of life.
03:29:00 I was like, oh, but I miss that.
03:29:02 But you can still discover joy within that process.
03:29:06 It’s been tough, especially with some level of like,
03:29:10 as people get to know you, and people film stuff,
03:29:13 you don’t have the privacy of just sharing something
03:29:18 with a few people around you.
03:29:20 Yeah.
03:29:21 That’s a beautiful privacy.
03:29:23 That’s a good point.
03:29:23 With the internet, it’s just disappearing.
03:29:26 Yeah, that’s a really good point.
03:29:27 Yeah.
03:29:29 But all those pressures aside,
03:29:31 if you really, you can step up
03:29:32 and still enjoy the fuck out of a good musical performance.
03:29:39 What do you think is the meaning of this whole thing?
03:29:42 What’s the meaning of life?
03:29:43 Oh, wow.
03:29:45 It’s in your name, as we talked about.
03:29:47 You have to live up.
03:29:48 Do you feel the requirement
03:29:50 to have to live up to your name?
03:29:54 Because live?
03:29:55 Yeah.
03:29:56 No, because I don’t see it.
03:29:57 I mean, my, oh, again, it’s kind of like,
03:30:02 no, I don’t know.
03:30:03 Because my full name is Olivia.
03:30:05 Yeah.
03:30:06 So I can retreat in that and be like,
03:30:07 oh, Olivia, what does that even mean?
03:30:10 Live up to live.
03:30:12 No, I can’t say I do,
03:30:13 because I’ve never thought of it that way.
03:30:15 And then your name backwards is evil.
03:30:17 That’s what we also talked about.
03:30:18 I mean, I feel the urge to live up to that,
03:30:21 to be the inverse of evil or even better.
03:30:25 Because I don’t think, you know,
03:30:27 is the inverse of evil good
03:30:29 or is good something completely separate to that?
03:30:32 I think my intuition says it’s the latter,
03:30:34 but I don’t know.
03:30:34 Anyway, again, getting in the weeds.
03:30:36 What is the meaning of all this?
03:30:39 Of life.
03:30:41 Why are we here?
03:30:43 I think to,
03:30:44 explore, have fun and understand
03:30:48 and make more of here and to keep the game going.
03:30:51 Of here?
03:30:51 More of here?
03:30:52 More of this, whatever this is.
03:30:55 More of experience.
03:30:57 Just to have more of experience
03:30:58 and ideally positive experience.
03:31:01 And more complex, you know,
03:31:05 I guess, try and put it into a sort of
03:31:06 vaguely scientific term.
03:31:10 I don’t know.
03:31:11 I don’t know.
03:31:11 I don’t know.
03:31:12 But make it so that the program required,
03:31:18 the length of code required to describe the universe
03:31:21 is as long as possible.
03:31:23 And, you know, highly complex and therefore interesting.
03:31:27 Because again, like,
03:31:28 I know, you know, we bang the metaphor to death,
03:31:32 but like, tiled with X, you know,
03:31:36 tiled with paperclips,
03:31:37 doesn’t require that much of a code to describe.
03:31:41 Obviously, maybe something emerges from it,
03:31:42 but that steady state, assuming a steady state,
03:31:44 it’s not very interesting.
03:31:45 Whereas it seems like our universe is over time
03:31:49 becoming more and more complex and interesting.
03:31:51 There’s so much richness and beauty and diversity
03:31:54 on this earth.
03:31:54 And I want that to continue and get more.
03:31:56 I want more diversity.
03:31:58 And the very best sense of that word
03:32:01 is to me the goal of all this.
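One crude way to see the "length of code to describe it" intuition is to use compression as a stand-in for description length. Kolmogorov complexity itself isn’t computable, and incompressible noise isn’t the same thing as interesting structure, but the contrast between a tiled world and a varied one still shows up:

```python
import random
import zlib

# Compression length as a rough proxy for how much "code" a description needs.
random.seed(0)

tiled = b"paperclip" * 10_000                                  # a world tiled with one thing
varied = bytes(random.randrange(256) for _ in range(90_000))   # same size, lots of variety

print("tiled, compressed size:  ", len(zlib.compress(tiled)))
print("varied, compressed size: ", len(zlib.compress(varied)))
```

The tiled byte string compresses down to almost nothing, while the varied one barely compresses at all, which is the sense in which a paperclip-tiled steady state is cheap to describe.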
03:32:07 Yeah.
03:32:07 And somehow have fun in the process.
03:32:10 Yes.
03:32:12 Because we do create a lot of fun things along the way,
03:32:14 in this creative force
03:32:18 and all the beautiful things we create,
03:32:19 somehow there’s like a funness to it.
03:32:22 And perhaps that has to do with the finiteness of life,
03:32:25 the finiteness of all these experiences,
03:32:28 which is what makes them kind of unique.
03:32:31 Like the fact that they end,
03:32:32 there’s this, whatever it is,
03:32:35 falling in love or creating a piece of art
03:32:40 or creating a bridge or creating a rocket
03:32:45 or creating a, I don’t know,
03:32:48 just the businesses that build something
03:32:53 or solve something.
03:32:56 The fact that it is born and it dies
03:33:00 somehow embeds it with fun, with joy
03:33:07 for the people involved.
03:33:08 I don’t know what that is.
03:33:09 The finiteness of it.
03:33:11 It can do.
03:33:12 Some people struggle with the,
03:33:13 I mean, a big thing I think that one has to learn
03:33:17 is being okay with things coming to an end.
03:33:21 And in terms of like projects and so on,
03:33:25 people cling onto things beyond
03:33:27 what they’re meant to be doing,
03:33:28 beyond what is reasonable.
03:33:32 And I’m gonna have to come to terms
03:33:33 with this podcast coming to an end.
03:33:35 I really enjoyed talking to you.
03:33:37 I think it’s obvious as we’ve talked about many times,
03:33:40 you should be doing a podcast.
03:33:41 You should, you’re already doing a lot of stuff publicly
03:33:45 to the world, which is awesome.
03:33:47 And you’re a great educator.
03:33:48 You’re a great mind.
03:33:49 You’re a great intellect.
03:33:50 But it’s also this whole medium of just talking
03:33:52 is also fun.
03:33:53 It is good.
03:33:54 It’s a fun one.
03:33:55 It really is good.
03:33:56 And it’s just, it’s nothing but like,
03:33:58 oh, it’s just so much fun.
03:34:00 And you can just get into so many,
03:34:03 yeah, there’s this space to just explore
03:34:05 and see what comes and emerges.
03:34:07 And yeah.
03:34:08 Yeah, to understand yourself better.
03:34:09 And if you’re talking to others,
03:34:10 to understand them better and together with them.
03:34:12 I mean, you should do your own podcast,
03:34:15 but you should also do a podcast with C
03:34:16 as we’ve talked about.
03:34:18 The two of you have such different minds
03:34:22 that like melt together in just hilarious ways,
03:34:26 fascinating ways, just the tension of ideas there
03:34:29 is really powerful.
03:34:30 But in general, I think you got a beautiful voice.
03:34:33 So thank you so much for talking today.
03:34:35 Thank you for being a friend.
03:34:36 Thank you for honoring me with this conversation
03:34:39 and with your valuable time.
03:34:40 Thanks, Liv.
03:34:41 Thank you.
03:34:42 Thanks for listening to this conversation with Liv Boeree.
03:34:45 To support this podcast,
03:34:46 please check out our sponsors in the description.
03:34:48 And now let me leave you with some words
03:34:50 from Richard Feynman.
03:34:53 I think it’s much more interesting to live not knowing
03:34:56 than to have answers, which might be wrong.
03:34:59 I have approximate answers and possible beliefs
03:35:01 and different degrees of uncertainty about different things,
03:35:05 but I’m not absolutely sure of anything.
03:35:08 And there are many things I don’t know anything about,
03:35:11 such as whether it means anything to ask why we’re here.
03:35:15 I don’t have to know the answer.
03:35:17 I don’t feel frightened not knowing things
03:35:20 by being lost in a mysterious universe without any purpose,
03:35:24 which is the way it really is as far as I can tell.
03:35:27 Thank you for listening and hope to see you next time.