Eric Weinstein: Revolutionary Ideas in Science, Math, and Society #16

Transcript

00:00:00 The following is a conversation with Eric Weinstein.

00:00:03 He’s a mathematician, economist, physicist, and the managing director of Thiel Capital.

00:00:08 He coined the term, and you can say, is the founder of the intellectual dark web, which

00:00:14 is a loosely assembled group of public intellectuals that includes Sam Harris, Jordan Peterson,

00:00:19 Steven Pinker, Joe Rogan, Michael Shermer, and a few others.

00:00:24 This conversation is part of the Artificial Intelligence Podcast at MIT and beyond.

00:00:30 If you enjoy it, subscribe on YouTube, iTunes, or simply connect with me on Twitter at Lex

00:00:36 Fridman, spelled F R I D.

00:00:39 And now, here’s my conversation with Eric Weinstein.

00:00:44 Are you nervous about this?

00:00:46 Scared shitless.

00:00:47 Okay.

00:00:48 You mentioned Kung Fu Panda as one of your favorite movies.

00:00:54 It has the usual profound master student dynamic going on.

00:00:58 So who was, who has been a teacher that significantly influenced the direction of your thinking

00:01:04 and life’s work?

00:01:05 So if you’re the Kung Fu Panda, who was your Shifu?

00:01:08 Oh, well, it’s interesting because I didn’t see Shifu as being the teacher.

00:01:12 Who was the teacher?

00:01:13 Oogway, Master Oogway.

00:01:15 The turtle.

00:01:16 Oh, the turtle.

00:01:17 Right.

00:01:18 They only meet twice in the entire film.

00:01:21 And the first conversation sort of doesn’t count.

00:01:25 So the magic of the film, in fact, its point is that the teaching that really matters is

00:01:33 transferred during a single conversation and it’s very brief.

00:01:40 And so who played that role in my life?

00:01:42 I would say, uh, either, uh, my grandfather, uh, Harry Rubin and his wife, Sophie Rubin,

00:01:51 my grandmother, or Tom Lehrer.

00:01:54 Tom Lehrer?

00:01:56 Yeah.

00:01:57 In which way?

00:01:58 If you give a child Tom Lehrer records, what you do is you destroy their ability to be

00:02:05 taken over by later malware.

00:02:08 And it’s so irreverent, so witty, so clever, so obscene that it destroys the ability to

00:02:17 lead a normal life for many people.

00:02:19 So if I meet somebody who’s unusually shifted from any kind of neurotypical presentation,

00:02:27 I’ll often ask them, uh, are you a Tom Lehrer fan?

00:02:31 And the odds that they will respond are quite high.

00:02:34 Tom Lehrer is, uh, Poisoning Pigeons in the Park Tom Lehrer.

00:02:38 That’s very interesting.

00:02:39 There are a small number of Tom Lehrer songs that broke into the general population: Poisoning

00:02:44 Pigeons in the Park, The Elements, and perhaps The Vatican Rag.

00:02:48 Uh, so when you meet somebody who knows those songs but doesn’t know, oh, you’re judging

00:02:53 me right now, aren’t you?

00:02:55 Harshly.

00:02:56 Uh, no, but you’re Russian, so no doubt you know the, you know, Nikolai Ivanovich Lobachevsky,

00:03:00 that song.

00:03:01 Yeah.

00:03:02 Um, so that was a song about plagiarism that was in fact plagiarized, which most people

00:03:06 don’t know, from Danny Kaye, uh, where Danny Kaye did a song called Stanislavsky of the

00:03:11 Moscow Arts.

00:03:13 And so Tom Lehrer did this brilliant job of plagiarizing a song and making it about

00:03:19 plagiarism, and then making it about this mathematician who worked in non-Euclidean geometry.

00:03:24 That was like, uh, giving heroin to a child.

00:03:27 It was extremely addictive and eventually led me to a lot of different places, one of

00:03:33 which may have been a PhD in mathematics.

00:03:36 And he was also at least a lecturer in mathematics, I believe at Harvard, something like that.

00:03:41 Yeah.

00:03:42 I just had dinner with him.

00:03:43 In fact, uh, when my son turned 13, we didn’t tell him, but, um, his bar mitzvah present

00:03:51 was dinner with his hero, Tom Lehrer.

00:03:54 And Tom Lehrer was 88 years old, sharp as a tack, irreverent and funny as hell.

00:04:01 And just, you know, there are very few people in this world that you have to meet while

00:04:05 they’re still here.

00:04:07 And that was definitely one for our family.

00:04:09 So that wit is a reflection of intelligence in some kind of deep way, like that

00:04:16 would be a good test of intelligence, whether you’re a Tom Lehrer fan.

00:04:20 So what do you think that is about wit, about that kind of humor, ability to see the absurdity

00:04:27 in existence?

00:04:28 Well, do you think that’s connected to intelligence or are we just two Jews on a mic that appreciate

00:04:33 that kind of humor?

00:04:34 No, I think that it’s absolutely connected to intelligence.

00:04:37 So you can, you can see it.

00:04:40 There’s a place where Tom Lehrer decides that he’s going to lampoon Gilbert of Gilbert and

00:04:45 Sullivan and he’s going to outdo Gilbert with clever, meaningless wordplay.

00:04:49 And he has, forget the, well, let’s see, he’s doing Clementine as if Gilbert and Sullivan

00:04:55 wrote it.

00:04:56 That I missed her depressed her young sister named Esther. This Esther she tried pestering

00:05:00 sisters, a festering blister, you’d best to resist her, say I. The sister persisted, the mister resisted,

00:05:04 I kissed her, all loyalty slipped. When she said I could have her, her sister’s

00:05:07 cadaver must surely have turned in its crypt.

00:05:10 That’s so dense.

00:05:11 It’s so insane that that’s clearly intelligence because it’s hard to construct something like

00:05:19 that.

00:05:20 If I look at my favorite Tom Lehrer, Tom Lehrer lyric, you know, there’s a perfectly absurd

00:05:26 one, which is once all the Germans were warlike and mean, but that couldn’t happen again.

00:05:30 We taught them a lesson in 1918 and they’ve hardly bothered us since then, right?

00:05:34 That is a different kind of intelligence.

00:05:36 You know, you’re taking something that is so horrific and you’re, you’re sort of making

00:05:41 it palatable and funny and demonstrating also, um, just your humanity.

00:05:47 I mean, I think the thing that came through as, as Tom Lehrer wrote all of these terrible,

00:05:53 horrible lines was just what a sensitive and beautiful soul he was, who was channeling

00:05:59 pain through humor and through grace.

00:06:02 I’ve seen throughout Europe, throughout Russia, that same kind of humor emerge from the generation

00:06:07 of World War II.

00:06:09 It seemed like that humor is required to somehow deal with the pain and the suffering that

00:06:14 that war created.

00:06:16 You do need the environment to create the broad Slavic soul.

00:06:19 I don’t think that many Americans really appreciate, um, Russian humor, how you had to joke during

00:06:30 the time of, let’s say, Article 58 under Stalin. You had to be very, very careful.

00:06:35 You know, the concept of a Russian satirical magazine like Krokodil, uh, doesn’t make

00:06:40 sense.

00:06:41 So you have this cross cultural problem that there are certain areas of human experience

00:06:48 that it would be better to know nothing about.

00:06:51 And quite unfortunately, Eastern Europe knows a great deal about them, which makes the,

00:06:56 you know, the songs of Vladimir Vysotsky so potent, the, uh, you know, the prose of Pushkin,

00:07:02 whatever it is, uh, you have to appreciate the depth of the Eastern European experience.

00:07:09 And I would think that perhaps Americans knew something like this around the time of the

00:07:14 Civil War, or maybe, um, you know, under slavery and Jim Crow, or even the, uh, harsh

00:07:22 tyranny of, uh, the coal and steel employers during the labor wars.

00:07:28 Um, but in general, I would say it’s hard for us to understand and imagine the collective

00:07:34 culture unless we have the system of selective pressures that, for example, uh, Russians

00:07:40 were subjected to.

00:07:41 Yeah, so if there’s one good thing that comes out of war, it’s literature, art, and humor

00:07:49 and music.

00:07:50 Oh, I don’t think so.

00:07:52 I think almost everything is good about war except for death and destruction.

00:07:57 Right.

00:07:59 Without the death, it would bring, uh, the romance of it.

00:08:02 The whole thing is nice.

00:08:03 Well, this is why we’re always caught up in war and we have this very ambiguous relationship

00:08:07 to it is that it makes life real and pressing and meaningful and at an unacceptable price

00:08:15 and the price has never been higher.

00:08:17 So just jump in, uh, into AI a little bit.

00:08:22 You, uh, in one of the conversations you had or one of the videos, you described that one

00:08:28 of the things AI systems can’t do and biological systems can is self replicate in the physical

00:08:34 world.

00:08:35 Oh no, no.

00:08:37 In the physical world.

00:08:38 Well, yes, the physical robots can’t self replicate, but, but you see, this is a

00:08:45 very tricky point, which is that the only thing that we’ve been able to create that’s

00:08:50 really complex that has an analog of our reproductive system is software.

00:08:57 But nevertheless, software replicates itself.

00:09:01 Uh, if we’re speaking strictly for the replication in this kind of digital space.

00:09:05 So let me just to begin, let me ask a question.

00:09:08 Do you see a protective barrier or a gap between the physical world and the digital world?

00:09:15 Let’s not call it digital.

00:09:16 Let’s call it the logical world versus the physical world.

00:09:20 Why logical?

00:09:21 Well, because even though we had, let’s say Einstein’s brain preserved, uh, it was meaningless

00:09:28 to us as a physical object because we couldn’t do anything with what was stored in it at

00:09:33 a logical level.

00:09:35 And so the idea that something may be stored logically and that it may be stored physically,

00:09:40 uh, are not necessarily, uh, we don’t always benefit from synonymizing.

00:09:45 I’m not suggesting that there isn’t a material basis to the logical world, but that it does

00:09:51 warrant identification with a separate layer that need not invoke logic gates and zeros

00:09:58 and ones.

00:09:59 And uh, so connecting those two worlds, the logical world and the physical world, or maybe

00:10:03 just connecting to the logical world inside our brain, Einstein’s brain.

00:10:09 You mentioned the idea of out, outelligence.

00:10:14 Artificial outelligence.

00:10:15 Artificial outelligence.

00:10:16 Yes.

00:10:17 This is the only essay that John Brockman ever invited me to write that he refused to

00:10:22 publish in Edge.

00:10:24 Why?

00:10:25 Well, maybe it wasn’t, it wasn’t well written, um, but I don’t know.

00:10:30 The idea is quite compelling, it’s quite unique and new, at least from my standpoint.

00:10:36 Maybe you can explain it?

00:10:38 Sure.

00:10:39 What I was thinking about is why it is that we’re waiting to be terrified by artificial

00:10:45 general intelligence when in fact, artificial life, uh, is terrifying in and of itself and

00:10:53 it’s already here.

00:10:54 So in order to have a system of selective pressures, you need three distinct elements.

00:11:00 You need variation within a population.

00:11:04 You need heritability and you need differential success.

00:11:08 So what’s really unique and I’ve made this point, I think elsewhere about software is

00:11:16 that if you think about what humans know how to build, that’s impressive.

00:11:19 So I always take a car and I say, does it have an analog of each of the physiological

00:11:25 systems?

00:11:26 Does it have a skeletal structure?

00:11:27 That’s its frame.

00:11:28 Does it have a neurological structure?

00:11:30 It has an on-board computer, it has a digestive system.

00:11:35 The one thing it doesn’t have is a reproductive system.

00:11:38 But if you can call spawn on a process, effectively you do have a reproductive system and that

00:11:47 means that you can create something with variation, heritability and differential success.
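
As an aside for the reader, here is a minimal sketch, in Python, of those three ingredients operating together; the fitness target, mutation rate, and population size are illustrative assumptions, not anything specified in the conversation.

```python
import random

ALPHABET = "abcdefghijklmnopqrstuvwxyz "
TARGET = "dupe the host"  # hypothetical fitness target, purely illustrative

def fitness(genome):
    # Differential success: score a genome by how well it matches the target.
    return sum(a == b for a, b in zip(genome, TARGET))

def spawn(parent, rate=0.1):
    # Heritability with variation: a child is a mutated copy of its parent.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

# Variation within a population: start from random genomes.
population = ["".join(random.choices(ALPHABET, k=len(TARGET))) for _ in range(50)]

for generation in range(300):
    population.sort(key=fitness, reverse=True)   # rank by success
    survivors = population[:25]                  # the duds fall off
    population = survivors + [spawn(random.choice(survivors)) for _ in range(25)]

print(max(population, key=fitness))  # drifts toward TARGET with no intelligence anywhere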

00:11:53 Now the next step in the chain of thinking was where do we see inanimate, non intelligent

00:12:01 life outwitting intelligent life?

00:12:05 And um, I have two favorite systems and I try to stay on them so that we don’t get distracted.

00:12:11 One of which is the Ophrys orchid, um, subspecies or subclade.

00:12:16 I don’t know what to call it.

00:12:17 There’s a type of flower.

00:12:18 Yeah, it’s a type of flower that mimics the female of a pollinator species in order to

00:12:23 dupe the males into, uh, engaging in what

00:12:27 is called pseudo-copulation with the fake female, which is usually represented by the

00:12:31 lowest petal.

00:12:34 And there’s also a pheromone component to fool the males into thinking they have a mating

00:12:37 opportunity.

00:12:38 But the flower doesn’t have to give up energy in the form of nectar as a lure because

00:12:42 it’s tricking the males.

00:12:45 The other system is a particular species, uh, of mussel, Lampsilis, in the clear streams

00:12:53 of Missouri, and it fools bass into biting a fleshy lip that contains its young.

00:13:02 And when the bass see this fleshy lip, which looks exactly like a species of fish that

00:13:07 the bass like to eat, the, uh, the young explode and clamp onto the gills and parasitize the

00:13:13 bass, and also use the bass to redistribute them as they eventually release. In both of these systems,

00:13:20 you have a highly intelligent dupe being fooled by a lower life form, and what is sculpting

00:13:31 these, these convincing lures?

00:13:34 It’s the intelligence of previously duped targets for these strategies.

00:13:41 So when the target is smart enough to avoid the strategy, uh, those weaker mimics, uh,

00:13:48 fall off.

00:13:49 They have terminal lineages, and only the better ones survive.

00:13:52 So it’s an arms race between the target species, uh, that is being parasitized, getting smarter

00:14:00 and this other less intelligent or non-intelligent object getting as-if smarter.

00:14:09 And so what you see is, is that artificial intelligence, artificial general intelligence

00:14:13 is not needed to parasitize us.

00:14:17 It’s simply sufficient for us to outwit ourselves.

00:14:22 So you could have a program, let’s say, you know, one of these Nigerian scams, um, that

00:14:27 writes letters and uses whoever sends it Bitcoin, uh, to figure out which aspects of the program

00:14:36 should be kept, which should be varied and thrown away.

00:14:38 And you don’t need it to be in any way intelligent in order to have a really nightmarish scenario

00:14:43 of being parasitized by something that has no idea what it’s doing.

00:14:46 So you, you, you phrased a few concepts really eloquently.

00:14:49 So let me try to, uh, trace a few directions this goes.

00:14:53 So one first, first of all, in the way we write software today, it’s not common that

00:14:58 we allow it to self modify.

00:15:01 But we do have that ability.

00:15:02 Now we have the ability, it’s just not common.

00:15:05 It’s just not common.

00:15:06 So, so your, your thought is that that is a serious worry.

00:15:13 If there becomes, uh,

00:15:15 Self modifying code is, is available now.

00:15:18 So there’s, there’s different types of self modification, right?

00:15:21 There’s a personalization, you know, your email app, your Gmail is, uh, self modifying

00:15:28 to you after you log in, or whatever, you can think of it that way.
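
For the reader, a toy illustration of self-modification in the most literal sense, a sketch rather than anything either speaker describes: a Python script that rewrites its own source on disk, so each run executes slightly different code.

```python
import pathlib

COUNTER = 0  # this literal is rewritten in the source file on every run

def bump_own_source():
    # Self-modification in the most literal sense: the program edits its own
    # file, so the next invocation starts from different source code.
    path = pathlib.Path(__file__)
    src = path.read_text()
    path.write_text(src.replace(f"COUNTER = {COUNTER}",
                                f"COUNTER = {COUNTER + 1}", 1))

if __name__ == "__main__":
    print("run number", COUNTER)
    bump_own_source()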

00:15:32 But ultimately it’s central, all the information is centralized, but you’re thinking of ideas

00:15:39 where you’re completely, so this is a unique entity, uh, operating under selective pressures

00:15:45 and it changes.

00:15:46 Well, you just, if you think about the fact that our immune systems, uh, don’t know what’s

00:15:52 coming at them next, but they have a small set of spanning components and if it’s, if

00:15:58 it’s a sufficiently expressive system in that any shape, uh, or binding region can be approximated,

00:16:06 uh, with, with the Lego that is present, um, then you can have confidence that you don’t

00:16:13 need to know what’s coming at you because the combinatorics, um, are sufficient to reach

00:16:20 any configuration needed.
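
A minimal sketch of that spanning-set point, with short strings standing in for binding regions, an assumption made purely for illustration: a small fixed repertoire of components whose combinations grow fast enough to cover targets never seen in advance.

```python
from itertools import product

components = ["A", "B", "C", "D"]  # a small "Lego" set, assumed to be 4 pieces

# Combinatorics: with k slots the repertoire grows as 4**k, so a short
# combination length already spans an enormous space of configurations.
for k in range(1, 8):
    print(f"{k} slots -> {len(components) ** k} reachable configurations")

# Any target over this alphabet is reachable exactly, without knowing
# in advance which one the environment will throw at you.
target = tuple("BADCAB")
assert target in product(components, repeat=len(target))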

00:16:22 Uh, so that’s a beautiful thing.

00:16:25 Well, terrifying thing to worry about because it’s so within our reach.

00:16:30 Whenever I suggest these things, I do always have a concern as to whether or not I will

00:16:34 bring them into being by talking about them.

00:16:37 So, uh, there’s this thing from OpenAI, uh, next, next week I talk to the founder of

00:16:43 OpenAI, uh, this idea that, uh, their text generation, the new, uh, the new stuff they

00:16:50 have for generating text is, they didn’t want to bring it, they didn’t want to release it

00:16:54 because they’re worried about the…

00:16:57 I’m delighted to hear that, but they’re going to end up releasing it.

00:17:00 Yes.

00:17:01 So that’s the thing is I think talking about it, um, well, at least from my end, I’m more

00:17:06 a proponent of technology preventing tech, uh, so further innovation preventing the

00:17:14 detrimental effects of innovation.

00:17:16 Well, we’re at a, we’re sort of tumbling down a hill at accelerating speed.

00:17:22 So whether or not we’re proponents or it doesn’t, it doesn’t really, it may not matter, but

00:17:27 I, well, I do feel that there are people who’ve held things back and, uh, you know, died poorer

00:17:33 than they might’ve otherwise been.

00:17:35 We don’t even know their names.

00:17:37 I don’t think that we should discount the idea that having the smartest people showing

00:17:43 off how smart they are by what they’ve developed may be a terminal process.

00:17:50 I’m very mindful in particular of a beautiful letter that Edward Teller of all people wrote

00:17:57 to Leo Szilard, where Szilard was trying to figure out how to control the use of atomic

00:18:02 weaponry at the end of World War II, and Teller, rather strangely, because many of us view him

00:18:08 as a monster, um, showed some very advanced moral thinking, talking about the slim chance

00:18:15 we have for survival and that the only hope is to make war unthinkable.

00:18:19 I do think that not enough of us feel in our gut what it is we are playing with when we

00:18:24 are working on technical problems.

00:18:26 And I would recommend to anyone who hasn’t seen it a movie called The Bridge

00:18:31 on the River Kwai, about, I believe, captured British POWs who, just in a desire

00:18:38 to do a bridge well, end up over collaborating with their Japanese captors.

00:18:43 Well, now you’re making me question the unrestricted open discussion of ideas in AI.

00:18:51 I’m not saying I know the answer, I’m just saying that I could make a decent case for

00:18:55 either our need to talk about this and to become technologically focused on containing

00:19:00 it or need to stop talking about this and try to hope that the relatively small number

00:19:06 of highly adept individuals who are looking at these problems is small enough that we

00:19:11 should in fact be talking about how to contain them.

00:19:14 Well the way ideas, the way innovation happens, what new ideas develop, Newton with calculus,

00:19:20 whether, if he was silent, the idea would, would emerge elsewhere; in the case of Newton,

00:19:26 of course, it would.

00:19:27 But in the case of AI, how small is the set of individuals out of which such ideas would

00:19:34 arise?

00:19:35 Well the idea is that the researchers we know and those that we don’t know who may live

00:19:42 in countries that don’t wish us to know what level they’re currently at are very disciplined

00:19:47 in keeping these things to themselves.

00:19:50 Of course I will point out that there’s a religious school in Kerala that developed

00:19:57 something very close to the calculus, certainly in terms of infinite series in I guess religious

00:20:05 prayer and rhyme and prose.

00:20:10 So it’s not that Newton had any ability to hold that back and I don’t really believe

00:20:16 that we have an ability to hold it back.

00:20:17 I do think that we could change the proportion of the time we spend worrying about the effects

00:20:23 of what if we are successful rather than simply trying to succeed and hope that we’ll be able

00:20:26 to contain things later.

00:20:27 Beautifully put.

00:20:29 So on the idea of intelligence, what form, treading cautiously as we’ve agreed as we

00:20:35 tumbled down the hill, what form…

00:20:38 We can’t stop ourselves, can we?

00:20:39 We cannot.

00:20:42 What form do you see it taking?

00:20:43 So one example, Facebook, Google, do want to, I don’t know a better word, they want to

00:20:52 influence users to behave a certain way, and so that’s one kind of example of how intelligent

00:20:59 systems perhaps modify the behavior of these intelligent human beings in order

00:21:05 to sell more product of different kinds.

00:21:08 But do you see other examples of this actually emerging in…

00:21:13 Just take any parasitic system, make sure that there’s some way in which there’s

00:21:18 differential success, heritability, and variation.

00:21:25 And those are the magic ingredients and if you really wanted to build a nightmare machine,

00:21:29 make sure that the system that expresses the variability has a spanning set so that it

00:21:36 can learn to arbitrary levels by making it sufficiently expressive.

00:21:41 That’s your nightmare.

00:21:43 So it’s your nightmare, but it could also be, it’s a really powerful mechanism by which

00:21:49 to create, well, powerful systems.

00:21:52 So are you more worried about the negative direction that might go versus the positive?

00:21:59 So you said parasitic, but that doesn’t necessarily need to be what the system converges towards.

00:22:05 It could be, what is it?

00:22:07 And the dividing line between parasitism and symbiosis is not so clear.

00:22:13 That’s what they tell me about marriage.

00:22:15 I’m still single, so I don’t know.

00:22:17 Well yeah, we could go into that too, but no, I think we have to appreciate, are you

00:22:27 infected by your own mitochondria?

00:22:30 Right.

00:22:31 Right?

00:22:32 Yeah.

00:22:33 So in marriage, you fear the loss of independence, but even though the American therapeutic community

00:22:42 may be very concerned about codependence, what’s to say that codependence isn’t what’s

00:22:47 necessary to have a stable relationship in which to raise children who are maximally

00:22:52 K-selected and require incredible amounts of care, because you have to wait 13 years

00:22:57 before there’s any reproductive payout and most of us don’t want our 13 year olds having

00:23:00 kids.

00:23:01 That’s a very tricky situation to analyze and I would say that predators and parasites

00:23:08 drive much of our evolution and I don’t know whether to be angry at them or thank them.

00:23:14 Well ultimately, I mean nobody knows the meaning of life or what even happiness is, but there

00:23:19 are some metrics.

00:23:20 They didn’t tell you?

00:23:21 They didn’t.

00:23:22 They didn’t.

00:23:23 That’s why all the poetry and books are about, you know, there’s some metrics under which

00:23:29 you can kind of measure how good it is that these AI systems are roaming about.

00:23:35 So you’re more nervous about software than you are optimistic about ideas of, yeah, self

00:23:44 replicating largely.

00:23:45 I don’t think we’ve really felt where we are.

00:23:50 You know, occasionally we get a wake-up call. 9/11 was so anomalous compared to everything

00:23:57 else we’ve experienced on American soil that it came to us as a complete shock that that

00:24:03 was even a possibility.

00:24:05 What it really was was a highly creative and determined R&D team deep in the bowels

00:24:11 of Afghanistan showing us that we had certain exploits that we were open to that nobody

00:24:18 had chosen to express.

00:24:19 I can think of several of these things that I don’t talk about publicly that just seem

00:24:23 to have to do with, um, how relatively unimaginative those who wish to cause havoc and destruction

00:24:32 have been up until now.

00:24:33 But the great mystery of our time of this particular little era is how remarkably stable

00:24:43 we’ve been since 1945, when we demonstrated the ability to use nuclear weapons in anger.

00:24:50 And we don’t know why things like that haven’t happened since then.

00:25:03 We’ve had several close calls, we’ve had mistakes, we’ve had brinksmanship.

00:25:03 And what’s now happened is that we’ve settled into a sense that, Oh, it’s, it’ll always

00:25:09 be nothing.

00:25:10 It’s been so long since something was at that level of danger that we’ve got a wrong idea

00:25:20 in our head.

00:25:21 And that’s why when I went on the Ben Shapiro show, I talked about the need to resume above

00:25:25 ground testing of nuclear devices because we have people whose developmental experience

00:25:30 suggests that when let’s say Donald Trump and North Korea engage on Twitter, Oh, it’s

00:25:37 nothing.

00:25:38 It’s just posturing.

00:25:39 Everybody’s just in it for money.

00:25:41 There’s, there’s a sense that people are in a video game mode, which has been the right

00:25:46 call since 1945.

00:25:49 We’ve been mostly in video game mode.

00:25:51 It’s amazing.

00:25:52 So you’re worried about a generation which has not seen any existential…

00:25:57 We’ve lived under it.

00:25:58 You see, you’re younger.

00:26:00 I don’t know if, if, and again, you came from, from Moscow.

00:26:05 There was a TV show called The Day After that had a huge effect on a generation growing

00:26:13 up in the US, and it talked about what life would be like after a nuclear exchange.

00:26:21 We have not gone through an embodied experience collectively where we’ve thought about this.

00:26:27 And I think it’s one of the most irresponsible things that the elders among us have done,

00:26:32 which is to provide this beautiful garden in which the thorns are cut off of the, of

00:26:41 the rose bushes and all of the edges are rounded and sanded.

00:26:47 And so people have developed this, this totally unreal idea, which is everything’s going to

00:26:52 be just fine.

00:26:54 And do I think that my leading concern is AGI or my leading concern is a thermonuclear

00:27:01 exchange or gene drives or any one of these things?

00:27:04 I don’t know, but I know that our time here in this very long experiment here is finite

00:27:11 because the toys that we’ve built are so impressive and the wisdom to accompany them has not materialized.

00:27:19 And I think it’s, we actually got a wisdom uptick since 1945.

00:27:24 We had a lot of dangerous skilled players on the world stage who nevertheless, no matter

00:27:29 how bad they were, managed to not embroil us in something that we couldn’t come back from.

00:27:38 The cold war.

00:27:39 Yeah, and the distance from the cold war. You know, I’m very mindful of, uh, there was

00:27:46 a Russian tradition, actually, of, on your wedding day, going to visit a memorial to those who

00:27:54 gave their lives.

00:27:55 Can you imagine this where you, on the happiest day of your life, you go and you pay homage

00:28:02 to the people who fought and died in the battle of Stalingrad.

00:28:08 I’m not a huge fan of communism, I got to say, but there were a couple of things that

00:28:13 the Russians did that were really positive in the Soviet era.

00:28:18 And I think trying to let people know how serious life actually is, is the Russian model

00:28:25 of seriousness is better than the American model.

00:28:28 And maybe like you mentioned, there was a small echo of that after 9/11.

00:28:34 But we wouldn’t let it form.

00:28:36 We talk about 9/11, but it’s 9/12 that really moved the needle, when we were all just there

00:28:42 and nobody wanted to speak.

00:28:46 We witnessed something super serious and we didn’t want to run to our computers and blast

00:28:55 out our deep thoughts and our feelings.

00:28:59 And it was profound because we woke up briefly, you know, I talk about the gated institutional

00:29:05 narrative that sort of programs our lives.

00:29:08 I’ve seen it break three times in my life, one of which was the election of Donald Trump.

00:29:15 Another time was the fall of Lehman Brothers when everybody who knew that Bear Stearns

00:29:21 wasn’t that important knew that Lehman Brothers meant AIG was next.

00:29:27 And the other one was 9/11.

00:29:29 And so if I’m 53 years old and I only remember three times that the global narrative was

00:29:35 really interrupted, that tells you how much we’ve been on top of developing events.

00:29:42 You know, I mean, we had the Murrah Federal Building explosion, but it didn’t cause the

00:29:46 narrative to break.

00:29:47 It wasn’t profound enough.

00:29:49 Around 9/12 we started to wake up out of our slumber, and the powers that be did not want

00:29:58 us coming together.

00:29:59 They, you know, the admonition was go shopping.

00:30:03 And the powers that be, what is that force, as opposed to blaming individuals?

00:30:07 We don’t know.

00:30:08 So whatever that, whatever that force is, there’s a component of it that’s emergent

00:30:13 and there’s a component of it that’s deliberate.

00:30:15 So give yourself a portfolio with two components.

00:30:18 Some amount of it is emergent, but some amount of it is also an understanding that if people

00:30:23 come together, they become an incredible force.

00:30:27 And what you’re seeing right now I think is there are forces that are trying to come together

00:30:34 and there are forces that are trying to push things apart.

00:30:39 And you know, one of them is the globalist narrative versus the national narrative, where,

00:30:43 to the global, uh, globalist perspective, uh, the nations are bad things in essence,

00:30:50 that they’re temporary, they’re nationalistic, they’re jingoistic, it’s all negative. To people

00:30:56 in the national, more in the national idiom, they’re saying, look, this is where I pay

00:31:00 my taxes.

00:31:00 This is where I do my army service.

00:31:02 This is where I have a vote.

00:31:04 This is where I have a passport.

00:31:06 Who the hell are you to tell me that, because you’ve moved into some place where you can make

00:31:10 money globally, you’ve chosen to abandon other people to whom you have a special and

00:31:15 elevated duty?

00:31:16 And I think that these competing narratives have been pushing towards the global perspective,

00:31:22 uh, from the elite and a larger and larger number of disenfranchised people are saying,

00:31:27 hey, I actually live in a, in a place and I have laws and I speak a language, I have

00:31:32 a culture.

00:31:33 And who are you to tell me that because you can profit in some far away land that my obligations

00:31:40 to my fellow countrymen are so, so much diminished.

00:31:43 So these tensions between nations and so on, ultimately you see being proud of your country

00:31:47 and so on, which creates potentially the kind of things that led to wars and so on.

00:31:53 They, they ultimately, it is human nature, and it is good for us to have wake-up calls of

00:31:58 different kinds.

00:31:59 Well, I think that these are tensions and my point isn’t, I mean, nationalism run amok

00:32:05 is a nightmare and internationalism run amok is a nightmare.

00:32:09 And the problem is we’re trying to push these pendulums, uh, to some place where they’re

00:32:16 somewhat balanced, where we, we have a higher duty of care to those, uh, who share our,

00:32:24 our laws and our citizenship, but we don’t forget our duties of care to the global system.

00:32:30 I would think this is elementary, but the problem that we’re facing concerns the ability

00:32:37 for some to profit by abandoning their obligations, uh, to others within their system.

00:32:45 And that’s what we’ve had for decades.

00:32:48 You mentioned nuclear weapons.

00:32:50 I was hoping to get answers from you, since one of the many things you’ve done is economics,

00:32:55 and maybe you can understand human behavior of why the heck we haven’t blown each other

00:33:00 up yet.

00:33:01 But okay.

00:33:02 So, uh, we’ll get back.

00:33:03 I don’t know the answer.

00:33:04 Yes.

00:33:05 It’s a, it’s a fascinating…

00:33:06 It’s really important to say that we really don’t know.

00:33:07 A mild uptick in wisdom.

00:33:09 A mild uptick in wisdom.

00:33:10 Well, Steven Pinker, who I’ve talked with has a lot of really good ideas about why,

00:33:17 but I don’t trust his optimism.

00:33:20 Listen, I’m Russian, so I never trust a guy who was that optimistic.

00:33:24 No, no, no.

00:33:25 It’s just that you’re talking about a guy who’s looking at a system in which more and

00:33:31 more of the kinetic energy like war has been turned into potential energy, like unused

00:33:37 nuclear weapons.

00:33:38 Beautifully put.

00:33:39 You know, now I’m looking at that system and I’m saying, okay, well, if you don’t have

00:33:42 a potential energy term, then everything’s just getting better and better.

00:33:46 Yeah.

00:33:47 Wow.

00:33:48 That’s beautifully put.

00:33:49 Only a physicist could.

00:33:50 Okay.

00:33:51 I’m not a physicist.

00:33:52 Is that a dirty word?

00:33:55 No, no.

00:33:56 I wish I were a physicist.

00:33:57 Me too.

00:33:58 My dad’s a physicist.

00:33:59 I’m trying to live up to that, probably for the rest of my life.

00:34:02 He’s probably gonna listen to this too.

00:34:04 So.

00:34:05 He did.

00:34:06 Yeah.

00:34:07 So your friend, Sam Harris, uh, worries a lot about the existential threat of AI.

00:34:13 Not in the way that you’ve described, but in the more, well, he hangs out with Elon.

00:34:18 I don’t know Elon.

00:34:20 So are you worried about that kind of, uh, you know, about the, um, about either robotic

00:34:30 systems or, you know, traditionally defined AI systems essentially becoming super intelligent,

00:34:35 much more intelligent than human beings and, uh, getting… Well, they already are, and they’re

00:34:40 not. When seen as a, um, a collective, you mean? Well, I mean, I, I can mean all

00:34:47 sorts of things, but certainly many of the things that we thought were peculiar to general

00:34:54 intelligence do not require general intelligence.

00:34:57 So that’s been one of the big awakenings that you can write a pretty convincing sports story

00:35:04 from stats alone, uh, without needing to have watched the game.

00:35:10 So you know, is it possible to write lively prose about politics?

00:35:13 Yeah, no, not yet.

00:35:17 So we were sort of all over the map.

00:35:20 One of the, one of the things about chess that you’ll, there’s a question I once asked

00:35:24 on Quora that didn’t get a lot of response, which was what is the greatest brilliancy

00:35:29 ever produced by a computer in a chess game, which was different than the question of what

00:35:32 is the greatest game ever played.

00:35:35 So if you think about brilliancies, it’s what really animates many of us to think of chess

00:35:39 as an art form.

00:35:42 Those are those moves and combinations that just show such flair, panache and, and, and

00:35:47 soul. Um, computers weren’t really great at that.

00:35:50 They were great positional monsters and you know, recently we, we’ve started seeing brilliancies

00:35:56 and so.

00:35:57 The grandmasters have identified, uh, with AlphaZero, that things were quite brilliant.

00:36:03 Yeah.

00:36:04 So that’s, that’s, that’s a, you know, that’s an example of something we don’t think that

00:36:06 that’s AGI, but in a very restricted set, a set of rules like chess, you’re starting

00:36:12 to see poetry, uh, of a high order.

00:36:15 And, and so I’m not, I don’t like the idea that we’re waiting for AGI, AGI is sort of

00:36:22 slowly infiltrating our lives in the same way that I don’t think a worm should be, you

00:36:29 know, C. elegans shouldn’t be treated as non-conscious because it only has 300 neurons.

00:36:34 Maybe it just has a very low level of consciousness because we don’t understand what these things

00:36:39 mean as they scale up.

00:36:40 So am I worried about this general phenomena?

00:36:43 Sure.

00:36:44 But I think that one of the things that’s happening is that a lot of us are fretting

00:36:48 about this, uh, in part because of human needs.

00:36:53 We’ve always been worried about the Golem, right?

00:36:56 Well, the Golem is the artificially created life, you know, it’s like Frankenstein.

00:37:01 Yeah, sure.

00:37:02 It’s a Jewish version and, um, Frankenberg, Frankenstein, yeah, that makes sense, right?

00:37:10 So the, uh, but we’ve always been worried about creating something like this and it’s

00:37:16 getting closer and closer and there are ways in which

00:37:23 we have to realize that the whole thing is kind of, the whole thing that we’ve experienced,

00:37:27 the context of our lives, is almost certainly coming to an end.

00:37:32 And I don’t mean to suggest that, uh, we won’t survive.

00:37:37 I don’t know.

00:37:39 And I don’t mean to suggest that it’s coming tomorrow and it could be 300, 500 years, but

00:37:43 there’s no plan that I’m aware of; we have three rocks that we could possibly inhabit

00:37:49 that are, uh, sensible within current technological dreams: the Earth, the Moon, and Mars.

00:37:58 And we have a very competitive civilization that is still forced into violence to sort

00:38:04 out disputes that cannot be arbitrated.

00:38:06 It is not clear to me that we have a long-term future until we get to the next stage, which

00:38:13 is to figure out whether or not the Einsteinian speed limit can be broken.

00:38:18 And that requires our source code.

00:38:21 Our source code, the stuff in our brains, to figure out… What do you mean by our source

00:38:25 code?

00:38:26 The source code of the context, whatever it is that produces the quarks, the electrons,

00:38:30 the neutrinos.

00:38:31 Oh, our source code.

00:38:33 I got it.

00:38:34 So this is,

00:38:35 You’re talking about stuff that’s written in a higher level language.

00:38:38 Yeah.

00:38:39 Yeah.

00:38:40 That’s right.

00:38:41 You’re talking about the low level, the bits.

00:38:42 Yeah.

00:38:43 That’s what is currently keeping us here.

00:38:46 We can’t even imagine, you know, we have harebrained schemes for staying within the Einsteinian

00:38:53 speed limit.

00:38:54 Uh, you know, maybe if we could just drug ourselves and go into a suspended state or

00:38:58 we could have multiple generations of that, I think all that stuff is pretty silly, but

00:39:03 I think it’s also pretty silly to imagine that our wisdom is going to increase to the

00:39:07 point that we can have the toys we have and, uh, we’re not going to use them for 500 years.

00:39:14 Speaking of Einstein, I had a profound breakthrough when I realized you’re just one letter away

00:39:18 from the guy.

00:39:19 Yeah, but I’m also one letter away from Feinstein.

00:39:22 It’s, well, you get to pick.

00:39:25 Okay.

00:39:26 So unified theory, you know, you’ve worked, uh, you, you enjoy the beauty of geometry.

00:39:32 I don’t actually know if you enjoy it.

00:39:34 You certainly are quite good at it.

00:39:35 I tremble before it.

00:39:36 If you’re religious, that is one of the, I don’t have to be religious.

00:39:42 It’s just so beautiful.

00:39:43 You will tremble anyway.

00:39:44 I mean, I just read Einstein’s biography and one of the ways, uh, one of the things you’ve

00:39:50 done is try to explore a unified theory, uh, talking about a 14-dimensional observerse that

00:39:59 has the 4D spacetime continuum embedded in it.

00:40:02 I, I’m just curious how you think and how philosophically at a high level about something

00:40:09 more than four dimensions, uh, how do you try to, what, what does it make you feel?

00:40:17 Talking in the mathematical world about dimensions that are greater than the ones we can perceive.

00:40:23 Is there something that you take away that’s more than just the math?

00:40:27 Well, first of all, stick out your tongue at me.

00:40:32 Okay.

00:40:33 Now on the front of that tongue, yeah, there was a sweet receptor, and next to that were

00:40:41 salt receptors on two different sides, a little bit farther back.

00:40:45 There were sour receptors and you wouldn’t show me the back of your tongue where your

00:40:48 bitter receptor was.

00:40:50 Show the good side always.

00:40:51 Okay.

00:40:52 So you had four dimensions of taste receptors, but you also had pain receptors on that tongue

00:40:58 and probably heat receptors on that tongue.

00:41:01 So let’s assume that you had one of each, that would be six dimensions.

00:41:05 So when you eat something, you eat a slice of pizza and it’s got some, some, uh, some

00:41:12 hot pepper on it, maybe some jalapeño, you’re having a six-dimensional experience, dude.
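
As a concrete aside, the claim is just that a sensory state is a point in a space with one coordinate per receptor type; the numbers below are invented intensities, assumed purely for illustration.

```python
# One bite of jalapeño pizza as a point in a six-dimensional sensory space:
# four taste axes plus pain and heat, each on an invented 0-to-1 scale.
bite = {
    "sweet": 0.3,
    "salt": 0.7,
    "sour": 0.2,
    "bitter": 0.1,
    "pain": 0.8,  # capsaicin
    "heat": 0.6,  # thermal warmth
}
point = tuple(bite.values())  # a single point in R^6
print(len(point), "dimensions:", point)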

00:41:17 Do you think we overemphasize the value of time as one of the dimensions or space?

00:41:23 Well, we certainly overemphasize the value of time cause we like things to start and

00:41:27 end or we really don’t like things to end, but they seem to.

00:41:30 Well, what if you flipped one of the spatial dimensions into being a temporal dimension?

00:41:37 And you and I were to meet in New York City and say, well, where, where and when should

00:41:41 we meet?

00:41:42 What about, I’ll meet you on, uh, 36th and Lexington at two in the afternoon and, uh, 11 o’clock

00:41:50 in the morning.

00:41:53 That would be very confusing.

00:41:55 Well, so it’s convenient for us to think about time, you mean.

00:41:59 We happen to be in a delicious situation in which we have three dimensions of space and

00:42:03 one of time and they’re woven together in this sort of strange fabric where we can trade

00:42:07 off a little space for a little time, but we still only have one dimension that is picked

00:42:11 out relative to the other three.

00:42:13 It’s very much Gladys Knight and the Pips.
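
For the reader, the textbook way this one-picked-out-of-four structure is written, standard special relativity rather than anything novel from the conversation:

```latex
% Minkowski line element: one temporal direction picked out of four,
% signature (1, 3) in the (-, +, +, +) convention.
ds^2 = -c^2\,dt^2 + dx^2 + dy^2 + dz^2
% A hypothetical two-time, two-space sector, signature (2, 2), would read:
ds^2 = -c^2\,dt_1^2 - c^2\,dt_2^2 + dx^2 + dy^2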

00:42:15 So which one developed for whom?

00:42:17 Did we develop for these dimensions, or did the dimensions, or were they always there and

00:42:22 it doesn’t matter?

00:42:23 Well, do you imagine that there isn’t a place where there are four temporal dimensions or

00:42:27 two and two of space and time or three of time and one of space and then would time

00:42:31 not be playing the role of space?

00:42:33 Why do you imagine that the sector that you’re in is all that there is?

00:42:37 I certainly do not, but I can’t imagine otherwise.

00:42:40 I mean, I haven’t done ayahuasca or any of those drugs, but hope to one day. But instead

00:42:46 of doing ayahuasca, you could just head over to Building 2.

00:42:49 That’s where the mathematicians are?

00:42:50 Yeah, that’s where they hang.

00:42:52 Just to look at some geometry.

00:42:53 Well, just ask about pseudo-Riemannian geometry.

00:42:55 That’s what you’re interested in.

00:42:57 Okay.

00:42:58 Or you could talk to a shaman and end up in Peru.

00:43:01 And then it’s, uh, extra money for that trip.

00:43:03 Yeah, but you won’t be able to do any calculations if that’s how you choose to go about it.

00:43:06 Well, a different kind of calculation, so to speak.

00:43:09 Yeah.

00:43:10 One of my favorite people, Edward Frenkel, Berkeley professor, author of Love and Math,

00:43:14 great title for a book, said that you are quite a remarkable intellect to come up with

00:43:20 such beautiful original ideas in terms of unified theory and so on, but you’re working

00:43:25 outside academia.

00:43:28 So one question in developing ideas that are truly original, truly interesting, what’s

00:43:33 the difference between inside academia and outside academia when it comes to developing

00:43:39 such ideas?

00:43:40 Oh, it’s a terrible choice.

00:43:41 Terrible choice.

00:43:43 So if you do it inside of academics, you are forced to constantly show great loyalty to

00:43:55 the consensus and you distinguish yourself with small, almost microscopic heresies to

00:44:03 make your reputation in general.

00:44:07 And you have very competent people and brilliant people who are working together, who form

00:44:14 very deep social networks and have a very high level of behavior, at least within mathematics

00:44:22 and at least technically within physics, theoretical physics.

00:44:27 When you go outside, you meet lunatics and crazy people, madmen.

00:44:35 And these are people who do not usually subscribe to the consensus position and almost always

00:44:41 lose their way.

00:44:44 And the key question is, will progress likely come from someone who has miraculously managed

00:44:52 to stay within the system and is able to take on a larger amount of heresy that is sort

00:44:59 of unthinkable?

00:45:01 In which case, that will be fascinating, or is it more likely that somebody will maintain

00:45:07 a level of discipline from outside of academics and be able to make use of the freedom that

00:45:15 comes from not having to constantly affirm your loyalty to the consensus of your field?

00:45:21 So you’ve characterized, in various ways, that academia in this particular sense is declining.

00:45:28 You posted a plot, the older population of the faculty is getting larger, the younger

00:45:34 is getting smaller and so on.

00:45:37 So which direction of the two are you more hopeful about?

00:45:40 Well, the baby boomers can’t hang on forever.

00:45:42 First of all, in general, true, and second of all, in academia.

00:45:46 But that’s really what this time is about.

00:45:51 We’re used to financial bubbles that last a few years in length and then pop.

00:45:57 The baby boomer bubble is this really long lived thing, and all of the ideology, all

00:46:04 of the behavior patterns, the norms.

00:46:07 For example, string theory is an almost entirely baby boomer phenomenon.

00:46:11 It was something that baby boomers were able to do because it required a very high level

00:46:16 of mathematical ability.

00:46:20 You don’t think of string theory as an original idea?

00:46:24 Oh, I mean, it was original to Veneziano, who probably is older than the baby boomers.

00:46:29 And there are people who are younger than the baby boomers who are still doing string

00:46:32 theory.

00:46:33 And I’m not saying that nothing discovered within the large string theoretic complex

00:46:37 is wrong.

00:46:38 Quite the contrary.

00:46:39 A lot of brilliant mathematics and a lot of the structure of physics was elucidated by

00:46:44 string theorists.

00:46:46 What do I think of the deliverable nature of this product that will not ship called

00:46:51 string theory?

00:46:52 I think that it is largely an affirmative action program for highly mathematically and

00:46:57 geometrically talented baby boomer physicists so that they can say that they’re

00:47:03 working on something within the constraints of what they will say is quantum gravity.

00:47:10 Now there are other schemes, you know, there’s like asymptotic safety, there are other things

00:47:15 that you could imagine doing.

00:47:17 I don’t think much of any of the major programs, but to have inflicted this level of loyalty

00:47:26 through a shibboleth.

00:47:27 Well, surely you don’t question X.

00:47:29 Well, I question almost everything in the string program.

00:47:32 And that’s why I got out of physics.

00:47:34 When you called me a physicist, it was a great honor, but the reason I didn’t become a physicist

00:47:39 wasn’t that I fell in love with mathematics.

00:47:41 I said, wow, in 1984, 1983, I saw the field going mad and I saw that mathematics, which

00:47:49 has all sorts of problems, was not going insane.

00:47:52 And so instead of studying things within physics, I thought it was much safer to study the same

00:47:57 objects within mathematics.

00:47:59 There’s a huge price to pay for that.

00:48:01 You lose physical intuition.

00:48:03 But the point is, is that it wasn’t a North Korean reeducation camp either.

00:48:08 Are you hopeful about cracking open the Einstein unified theory in a way that has been really,

00:48:15 really understanding whether this, uh, uniting everything together with quantum theory and

00:48:21 so on?

00:48:22 I mean, I’m trying to play this role myself to do it to the extent of handing it over

00:48:28 to the more responsible, more professional, more competent community.

00:48:34 Um, so I think that they’re wrong about a great number of their belief structures, but

00:48:40 I do believe, I mean, I have a really profound love, hate relationship with this group of

00:48:45 people.

00:48:46 I think the physics side, cause the mathematicians actually seem to be much more open minded

00:48:51 and uh, well they are and they aren’t, they’re open minded about anything that looks like

00:48:55 great math.

00:48:56 Right.

00:48:57 Right.

00:48:58 They’ll study something that isn’t very important physics, but if it’s beautiful mathematics,

00:49:01 then they’ll have a, they have great intuition about these things. As good as the mathematicians

00:49:07 are,

00:49:08 and I might even, intellectually at some horsepower level, give them the edge,

00:49:12 the theoretical, theoretical physics community is, bar none,

00:49:16 the most profound intellectual community that we have ever created.

00:49:22 It is the number one.

00:49:23 There’s nobody in second place as far as I’m concerned. Like, in their spare time, in their

00:49:28 spare time, they invented molecular biology.

00:49:30 What, what was the origin of molecular biology?

00:49:33 You’re saying something like Francis Crick.

00:49:34 I mean, a lot of, a lot of the early molecular biologists were physicists.

00:49:39 Yeah.

00:49:40 I mean, you know, Schrödinger wrote What Is Life? and that was highly inspirational.

00:49:44 I mean, you have to appreciate that there is no community like the basic research community

00:49:53 in theoretical physics, and it’s not something… I’m highly critical of these guys.

00:49:59 I think that they have just wasted decades of time with a near religious devotion to

00:50:08 their misconception of where the problems were in physics.

00:50:13 But this has been the greatest intellectual collapse ever witnessed within academics.

00:50:20 You see it as a collapse or just a lull?

00:50:22 Oh, I’m terrified that we’re about to lose the vitality.

00:50:26 We can’t afford to pay these people.

00:50:29 We can’t afford to give them an accelerator just to play with in case they find something

00:50:33 at the next energy level.

00:50:35 These people created our economy.

00:50:38 They gave us the rad lab and radar.

00:50:41 They gave us two atomic devices to end World War II.

00:50:45 They created the semiconductor and the transistor to power our economy through Moore’s law.

00:50:51 As a positive externality of particle accelerators, they created the worldwide web and we have

00:50:57 the insolence to say, why should we fund you with our taxpayer dollars?

00:51:02 No, the question is, are you enjoying your physics dollars?

00:51:08 These guys signed the world’s worst licensing agreement and if they simply charged for every

00:51:15 time you used a transistor or a URL or enjoyed the peace that they have provided during this

00:51:23 period of time through the terrible weapons that they developed or your communications

00:51:29 devices, all of the things that power our economy, I really think came out of physics,

00:51:34 even to the extent the chemistry came out of physics and molecular biology came out

00:51:37 of physics.

00:51:38 So, first of all, you have to know that I’m very critical of this community.

00:51:42 Second of all, it is our most important community.

00:51:45 We have neglected it.

00:51:46 We’ve abused it.

00:51:47 We don’t take it seriously.

00:51:49 We don’t even care to get them to rehab after a couple of generations of failure, right?

00:51:55 No one, I think the youngest person to have really contributed to the standard model at a

00:52:01 theoretical level was born in 1951, right?

00:52:05 Frank Wilczek, and almost nothing has happened in theoretical physics after 1973, ’74

00:52:15 that sent somebody to Stockholm for theoretical development that predicted experiment.

00:52:21 So we have to understand that we are doing this to ourselves.

00:52:24 Now, with that said, these guys have behaved abysmally in my opinion because they haven’t

00:52:31 owned up to where they actually are, what problems they’re really facing, how definite

00:52:35 they can actually be.

00:52:37 They haven’t shared some of their most brilliant discoveries, which are desperately needed

00:52:40 in other fields like gauge theory, which at least the mathematicians can, can share, which

00:52:45 is an upgrade of the differential calculus of Newton and Leibniz.

00:52:49 And they haven’t shared the importance of renormalization theory.

00:52:53 Even though this should be standard operating procedure for people across the sciences dealing

00:52:57 with different layers and different levels of phenomena.

00:53:01 And by shared, you mean communicated in such a way that it disseminates throughout the

00:53:06 different sciences.

00:53:07 These guys are sitting, both theoretical physicists and mathematicians are sitting on top of a

00:53:12 giant stockpile of intellectual gold, right?

00:53:16 They have so many things that have not been manifested anywhere.

00:53:19 I was just on Twitter, I think I mentioned the Hoberman switch pitch that shows the

00:53:25 self duality of the tetrahedron realized as a linkage mechanism.

00:53:28 Now this is like a triviality and it makes an amazing toy that’s, you know, built a market,

00:53:36 hopefully a fortune, for Chuck Hoberman.

00:53:38 Well, you have no idea how much great stuff that these priests have in their monastery.

00:53:44 So it’s truly a love and hate relationship for you.

00:53:47 Yeah.

00:53:48 Well, it sounds like it’s more on the love side.

00:53:49 This building that we’re in right here is the building in which I really put together

00:53:54 the conspiracy between the National Academy of Sciences and the National Science Foundation

00:54:00 through the Government-University-Industry Research Roundtable, to destroy the bargaining

00:54:04 power of American academics, uh, using foreign labor; it’s, uh, on microfiche in the basement.

00:54:10 Oh yeah.

00:54:11 That was done here in this building.

00:54:13 Isn’t that weird?

00:54:14 And I’m, I’m truly speaking with a revolutionary and a radical, uh…

00:54:19 No, no, no, no, no, no.

00:54:20 At an intellectual level, I am absolutely garden variety.

00:54:25 I’m just straight down the middle.

00:54:27 The system that we are in this, this university is functionally insane.

00:54:33 Yeah.

00:54:34 Harvard is functionally insane and we don’t understand that when we get these things wrong,

00:54:41 the financial crisis made this very clear.

00:54:43 There was a long period where every grownup, everybody with a tie, uh, who spoke in a,

00:54:49 you know, in baritone tones, uh, with, with the right degree at the end of

00:54:54 their name.

00:54:55 Yeah.

00:54:56 Uh, we’re talking about how we banished volatility.

00:54:59 We were in the Great Moderation.

00:55:01 Okay.

00:55:02 They were all crazy.

00:55:04 And who was, who was right?

00:55:05 It was like Nassim Taleb, Nouriel Roubini.

00:55:08 Now what happens is, is that they claimed the market went crazy, but the market didn’t

00:55:13 go crazy.

00:55:14 The market had been crazy and what happened is, is that it suddenly went sane.

00:55:18 Well, that’s where we are with academics.

00:55:21 Academics right now is mad as a hatter and it’s, it’s absolutely evident.

00:55:25 I can show you graph after graph.

00:55:27 I can show you the internal discussions.

00:55:28 I can show you the conspiracies.

00:55:30 Harvard’s dealing with one right now over, uh, its admissions policies for people, uh,

00:55:35 of color, uh, who happen to come from Asia.

00:55:38 All of this madness is necessary to keep the game going.

00:55:41 What we’re talking about just on, well, we’re on the topic of revolutionaries is we’re talking

00:55:46 about the danger of an outbreak of sanity.

00:55:50 Yeah.

00:55:51 You’re, you’re the guy pointing out the elephant in the room here and the elephant has no clothes.

00:55:57 Is that how that goes?

00:55:59 I was going to talk a little bit to, uh, Joe Rogan about this, ran out of time, but I think

00:56:07 you’re, you have some, you, just listening to you, you could probably speak really eloquently

00:56:13 to academia on the difference between the different fields.

00:56:16 So you think there’s a difference between science, engineering, and then the humanities

00:56:21 in academia, in terms of the radical ideas that they’re willing to tolerate?

00:56:25 So from my perspective, I thought computer science and maybe engineering is more tolerant

00:56:33 to radical ideas, but that’s perhaps innocent of me, in that, you know, all the

00:56:38 battles going on now are a little bit more on the humanities side, in gender studies and

00:56:42 so on.

00:56:43 Have you seen the, uh, American Mathematical Society’s publication of an essay called Get

00:56:48 Out the Way?

00:56:49 I have not.

00:56:50 What’s, what’s the idea? It’s that white men who hold, uh, positions

00:56:55 Yeah.

00:56:56 within universities and mathematics should vacate their positions so that young black

00:57:01 women can take over, something like this.

00:57:04 That’s in terms of diversity, which I also want to ask you about, but in terms of diversity

00:57:07 of strictly ideas, do you think, cause you’re basically saying physics as a community has

00:57:14 become a little bit intolerant to some degree to new radical ideas, or at least you, uh,

00:57:21 you said it’s changed a little bit recently, which is that even string theory is now admitting,

00:57:26 okay, we don’t look very promising in the short term, right?

00:57:32 So the question is what compiles, if you want to take the computer science metaphor: what

00:57:39 will get you into a journal?

00:57:42 Will you spend your life trying to push some paper into a journal or will it be accepted

00:57:47 easily?

00:57:48 What about the characteristics of the submitter and what gets taken up and what does not?

00:57:56 All of these fields are experiencing pressure because no field is performing so brilliantly

00:58:01 well, um, that it’s revolutionizing our way of speaking and thinking in the ways in which

00:58:11 we’ve become accustomed.

00:58:12 But don’t you think even in theoretical physics, a lot of times, even with theories like string

00:58:19 theory, you could speak to this, it does eventually lead to what are the ways that this theory

00:58:24 would be testable?

00:58:25 Yeah, ultimately, although look, there’s this thing about Popper and the scientific method

00:58:31 that’s a cancer and a disease in the minds of very smart people.

00:58:36 That’s not really how most of the stuff gets worked out.

00:58:39 It’s how it gets checked.

00:58:41 All right, so, and there is a dialogue between theory and experiment, but everybody should

00:58:47 read Paul Dirac’s 1963 Scientific American article where he, he, you know, it’s

00:58:56 very interesting.

00:58:57 He talks about it as if it was about the Schrödinger equation and Schrödinger’s failure to advance

00:59:03 his own work because of his failure to account for some phenomenon.

00:59:06 The key point is that if your theory is a slight bit off, it won’t agree with experiment,

00:59:10 but it doesn’t mean that the theory is actually wrong.

00:59:12 Um, but Dirac could as easily have been talking about his own equation, in which he predicted

00:59:18 that the electron should have an antiparticle.

00:59:22 And since the only positively charged particle that was known at the time was the proton,

00:59:26 Heisenberg pointed out, well, shouldn’t your antiparticle, the proton, have the same mass

00:59:30 as the electron, and doesn’t that invalidate your theory?

00:59:33 So I think Dirac was actually being potentially quite sneaky, um, and, uh, talking

00:59:38 about the fact that he had been pushed off of his own theory to some extent by Heisenberg.

00:59:42 Um, but look, we’ve fetishized the scientific method and Popper and falsification, um, because

00:59:51 it protects us from crazy ideas entering the field.

00:59:55 So you know, it’s a question of balancing Type I and Type II error, letting nonsense in versus shutting real breakthroughs out.

00:59:58 And we’re pretty, we were pretty maxed out in one direction.

01:00:01 The opposite of that, let me say what comforts me, is sort of biology or engineering, where, uh, at

01:00:08 the end of the day: does the thing work?

01:00:10 Yeah.

01:00:11 You can test the crazies away and the crazy…

01:00:15 Well see now you’re saying, but some ideas are truly crazy and some are, are actually

01:00:19 correct.

01:00:20 So, well, there’s pre-correct, currently crazy.

01:00:24 Yeah.

01:00:25 Right.

01:00:26 And so you don’t want to get rid of everybody who’s pre-correct and currently crazy.

01:00:30 Um, the problem is, is that we don’t have standards in general for trying to determine

01:00:37 who has to be put to the sword in terms of their career and who has to be protected,

01:00:42 uh, as some sort of giant time suck pain in the ass, uh, who may change everything.

01:00:47 Do you think that’s possible?

01:00:48 Uh, creating a mechanism to select for those?

01:00:51 Well, you’re not going to like the answer, but here it comes.

01:00:53 Oh boy.

01:00:54 It has to do with very human elements.

01:00:59 We’re trying to do this at the level of like rules and fairness.

01:01:02 That’s not going to work ’cause the only thing that really understands this… You read The

01:01:10 Double Helix?

01:01:11 It’s a book.

01:01:12 Oh, you have to read this book.

01:01:16 Not only did Jim Watson, uh, half-discover the three-dimensional structure of DNA, he’s

01:01:22 also one hell of a writer, before he became an ass, uh, who, no, he’s tried to destroy

01:01:28 his own reputation.

01:01:29 I knew about the ass, I didn’t know about the good writer.

01:01:33 Jim Watson is one of the most important people now living.

01:01:36 And uh, as I’ve said before, Jim Watson is too important a legacy to be left to Jim

01:01:42 Watson.

01:01:43 Um, yeah, that book tells you more about what actually moves the dial, right?

01:01:49 There’s another story about him, which I don’t, don’t agree with, which is that he stole everything

01:01:53 from Rosalind Franklin.

01:01:54 I mean the, the problems that he had with Rosalind Franklin are real, but we should

01:01:58 actually honor that tension in our history by delving into it rather than having a simple

01:02:04 solution.

01:02:05 Jim Watson talks about Francis Crick being a pain in the ass that everybody secretly

01:02:10 knew was super brilliant.

01:02:12 And there’s an encounter between, uh, Chargaff, uh, who came up with the equimolar relations

01:02:19 between the nucleotides, who should have gotten the structure of DNA, and Watson and Crick.

01:02:25 And you know, he talks about missing a shiver in the heartbeat of biology, and the stuff is so

01:02:30 gorgeous.

01:02:31 It just makes you tremble even thinking about it.

01:02:35 Um, look, we know very often who is to be feared and we need to fund the people that

01:02:42 we fear.

01:02:45 The people who are wasting our time need to be excluded from the conversation.

01:02:49 You see, and you know, maybe we’ll make some errors in both directions.

01:02:54 But we know our own people.

01:02:58 We know the pains in the asses that might work out and we know the people who are really

01:03:02 just blowhards who really have very little to contribute most of the time.

01:03:06 It’s not a hundred percent, but you’re not going to get there with rules.

01:03:10 Right.

01:03:11 It’s, uh, using some kind of instinct.

01:03:12 I mean, I, to be honest, I’m going to make you roll your eyes for a second, but, uh,

01:03:18 the first time I heard that there is a large community of people who believe the earth

01:03:21 is flat, it actually made me pause and ask myself the question: why would there be such a community?

01:03:27 Yeah.

01:03:28 Is it possible the earth is flat?

01:03:30 So I had to like, wait a minute.

01:03:33 I mean, then you go through a thinking process that I think is really healthy.

01:03:37 It ultimately ends up being a geometry thing.

01:03:39 I think, uh, it’s an interesting, it’s an interesting thought experiment at the very

01:03:43 least.

01:03:44 Well, I don’t, I do a different version of it.

01:03:46 I say, why is this community stable?

01:03:48 Yeah.

01:03:49 That’s a good, uh, way to analyze it.

01:03:51 Well, it’s interesting that whatever we’ve done has not erased the community.

01:03:56 So you know, they’re taking a long shot bet that won’t pan out, you know, maybe we just

01:04:00 haven’t thought enough about the rationality of the square root of two and somebody brilliant

01:04:04 will figure it out.

01:04:05 Maybe we will eventually land one day on the surface of Jupiter and explore it, right?

01:04:11 These are crazy things that will never happen.

01:04:14 So much of social media operates by AI algorithms.

01:04:17 You talked about this a little bit, uh, recommending the content you see.

01:04:21 So on this idea of radical thought, how much should AI show you things you disagree with,

01:04:28 on Twitter and so on, in a Twitter world? Is this question…

01:04:34 Yeah.

01:04:35 Yeah.

01:04:36 Cause you don’t know the answer?

01:04:37 No, no, no, no.

01:04:38 Look, we’ve been, they’ve pushed out this cognitive Lego to us that will just lead to

01:04:44 madness.

01:04:45 “It’s good to be challenged with things that you disagree with.”

01:04:49 The answer is no, it’s good to be challenged with interesting things with which you currently

01:04:53 disagree, but that might be true.

01:04:56 So I don’t really care about whether or not I disagree with something or don’t disagree.

01:05:00 I need to know why that particular disagreeable thing is being pushed out.

01:05:05 Is it because it’s likely to be true?

01:05:07 Is it because, is there some reason?

01:05:09 Because I can write, I can write a computer generator, uh, to come up with an infinite

01:05:14 number of disagreeable statements that nobody needs to look at.

01:05:17 So please, before you push things at me that are disagreeable, tell me why.
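
A minimal sketch of that point, assuming Python and invented statement templates (nothing here comes from an actual recommender system):

```python
import itertools

def disagreeable_statements():
    """Yield an endless stream of statements most readers will reject.

    The point of the sketch: "disagreeable" is a trivially satisfiable
    property, so disagreement alone cannot be the reason content is shown.
    """
    subjects = ["The Earth", "Mathematics", "Every expert", "Your favorite theory"]
    claims = ["is a hoax", "was invented last Tuesday", "is secretly illegal"]
    for n in itertools.count():  # infinite outer loop over "variants"
        for subject, claim in itertools.product(subjects, claims):
            yield f"{subject} {claim} (variant #{n})."

gen = disagreeable_statements()
for _ in range(5):  # sample five of the infinitely many
    print(next(gen))
```

Since the supply of such statements is unbounded, “you will disagree with it” carries no information about whether an item is worth seeing; some further reason is needed.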

01:05:22 There is an aspect in which that question is quite dumb, especially because it’s being

01:05:26 used, uh, almost, um, uh, very generically by these different networks to say, well,

01:05:34 we’re trying to work this out.

01:05:35 But you know, basically, uh, how much do you see the value of seeing things, uh, you don’t

01:05:43 like, not that you disagree with, because it’s very difficult to know, exactly as you articulated,

01:05:47 which is, uh, the stuff that’s important for you to consider that you disagree with.

01:05:53 That’s really hard to figure out.

01:05:55 The bottom line is, it’s stuff you don’t like.

01:05:57 If you’re a, uh, uh, Hillary Clinton supporter, you may not want to, it might not make you

01:06:03 feel good to see anything about Donald Trump.

01:06:05 That’s the only thing algorithms can really optimize for currently.

01:06:08 They really can’t.

01:06:09 Now they can do better.

01:06:10 This is where we’re…

01:06:11 You think so?

01:06:12 No, we’re engaged in some moronic back and forth where I have no idea why people who

01:06:21 are capable of building Google, Facebook, Twitter are engaging us in these incredibly

01:06:27 low-level discussions.

01:06:28 Do they not know any smart people?

01:06:31 Do they not have the phone numbers of people who can elevate these discussions?

01:06:36 They do, but this, they’re optimizing for a different thing and they’re pushing those

01:06:41 people out of those rooms.

01:06:42 They’re, they’re optimizing for things we can’t see.

01:06:47 And yes, profit is there.

01:06:48 Nobody, nobody’s questioning that, but they’re also optimizing for things like political

01:06:54 control or the fact that they’re doing business in Pakistan.

01:06:57 And so they don’t want to talk about all the things that they’re going to be bending to

01:07:01 in Pakistan.

01:07:03 So we’re involved in a fake discussion.

01:07:06 You think so?

01:07:07 You think these conversations at that depth are happening inside Google?

01:07:11 You don’t think they have some basic metrics of user engagement?

01:07:15 You’re having a fake conversation with us guys.

01:07:18 We know you’re having a fake conversation.

01:07:19 I do not wish to be part of your fake conversation.

01:07:23 You know how to cool, you know, these units, you know, high availability, like nobody’s

01:07:28 business.

01:07:29 My Gmail never goes down.

01:07:31 Almost.

01:07:32 So you think just because they can do incredible work on the software side with infrastructure,

01:07:38 they can also deal with some of these difficult questions about human behavior, human understanding.

01:07:46 You’re not.

01:07:47 I mean, I’ve seen the, I’ve seen the developers’ screens that people take shots of inside of

01:07:53 Google.

01:07:55 And I’ve heard stories inside of Facebook and Apple.

01:07:58 We’re not, we’re engaged.

01:08:00 They’re engaging us in the wrong conversations.

01:08:04 We are not at this low level.

01:08:06 Here’s one of my favorite questions.

01:08:08 Why is every piece of hardware that I purchase in tech space equipped as a listening device?

01:08:17 Where’s my physical shutter to cover my lens?

01:08:19 We had this in the 1970s, cameras that had lens caps, you know. How much would it cost

01:08:26 to have a security model? Pay five extra bucks?

01:08:29 Why is my indicator light software-controlled?

01:08:33 Why, when my camera is on, do I not see that the light is on, by making it something

01:08:37 that cannot be bypassed?

01:08:39 Why have you set up all of my devices, at some difficulty to yourselves, as listening

01:08:45 devices, and we don’t even talk about this?

01:08:47 This is, this thing is total fucking bullshit.

01:08:50 Well, I hope these discussions are happening about privacy.

01:08:55 Is it a more difficult thing than you’re giving them credit for?

01:08:57 It’s not just privacy.

01:08:59 It’s about social control.

01:09:01 We’re talking about social control.

01:09:03 Why do I not have controls over my own levers?

01:09:07 Just have a really cute UI where I can switch, I can dial things or I can at least see what

01:09:11 the algorithms are.

01:09:13 You think that there are some deliberate choices being made here?

01:09:17 There’s emergence and there is intention.

01:09:21 There are two dimensions.

01:09:23 The vector does not collapse onto either axis, but anybody who suggests that

01:09:29 intention is completely absent is a child.

01:09:34 That’s really beautifully put and, uh, like many things you’ve said, is going to make me…

01:09:38 Can I turn this around slightly?

01:09:40 Yeah.

01:09:41 I sit down with you and you say that you’re obsessed with my feed.

01:09:45 I don’t even know what my feed is.

01:09:47 What are you seeing that I’m not?

01:09:49 I was obsessively looking through your feed on Twitter because it was really enjoyable,

01:09:54 because there’s the Tom Lehrer element, the humor in it.

01:09:58 By the way, that feed is Eric R Weinstein on Twitter, at @EricRWeinstein.

01:10:03 No, but seriously, why?

01:10:06 Why did I find it enjoyable or what was I seeing?

01:10:10 What are you looking for?

01:10:11 Why are we doing this?

01:10:13 What is this podcast about?

01:10:14 I know you’ve got all these interesting people.

01:10:16 I’m just some guy who’s sort of a podcast guest.

01:10:18 Sort of a podcast.

01:10:21 You’re not even wearing a tie.

01:10:22 I mean, it’s not even a serious interview.

01:10:25 I’m searching for meaning, for happiness, for a dopamine rush, so short term, long term.

01:10:34 And how are you finding your way to me?

01:10:37 I don’t honestly know what I’m doing to reach you.

01:10:41 You’re representing ideas which feel common sense to me, and not many people are speaking them.

01:10:48 So it’s kind of like the intellectual dark web folks, right?

01:10:54 These folks from Sam Harris to Jordan Peterson to yourself are saying things where you’re

01:11:00 like saying, look, there’s an elephant and he’s not wearing any clothes.

01:11:05 And I say, yeah, yeah, let’s have more of that conversation.

01:11:09 That’s how I’m finding you.

01:11:10 I’m desperate to try to change the conversation we’re having.

01:11:14 I’m very worried we’ve got an election in 2020.

01:11:17 I don’t think we can afford four more years of a misinterpreted message, which is what

01:11:23 Donald Trump was.

01:11:25 And I don’t want the destruction of our institutions.

01:11:28 They all seem hell bent on destroying themselves.

01:11:30 So I’m trying to save theoretical physics, trying to save the New York Times, trying

01:11:34 to save our various processes.

01:11:38 And I think it feels delusional to me that this is falling to a tiny group of people

01:11:44 who are willing to speak out without getting so freaked out that everything they say will

01:11:49 be misinterpreted and that their lives will be ruined through the process.

01:11:52 I mean, I think we’re in an absolutely bananas period of time and I don’t believe it should

01:11:57 fall to such a tiny number of shoulders to shoulder this way.

01:12:02 So I have to ask you on the capitalism side, you mentioned that technology is killing capitalism

01:12:08 or it has effects that are unintended, well, not unintended, but not what economists would

01:12:15 predict or speak of capitalism creating.

01:12:18 I just want to talk to you about in general, the effect of even then artificial intelligence

01:12:23 or technology automation taking away jobs and these kinds of things and what you think

01:12:28 is the way to alleviate that, whether it’s presidential candidate Andrew Yang’s universal

01:12:34 basic income, UBI, what are your thoughts there?

01:12:38 How do we fight off the negative effects of technology that…

01:12:41 All right, you’re a software guy, right?

01:12:43 Yep.

01:12:44 “A human being is a worker” is an old idea; “a human being has a worker” is a different

01:12:52 object, right?

01:12:53 Yeah.

01:12:54 So if you think about object-oriented programming as a paradigm, a human being has a worker

01:12:59 and a human being has a soul.

01:13:01 We’re talking about the fact that for a period of time, the worker that a human being has

01:13:07 was in a position to feed the soul that a human being has.

01:13:11 However, we have two separate claims on the value in society.

01:13:18 One is as a worker and the other is as a soul and the soul needs sustenance, it needs dignity,

01:13:23 it needs meaning, it needs purpose.
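
A minimal sketch of the is-a versus has-a distinction being borrowed from object-oriented programming, assuming Python; the class and field names are invented for illustration, not anything specified in the conversation:

```python
from dataclasses import dataclass

class Worker:
    def produce(self) -> float:
        return 1.0  # stand-in for the marginal product of labor

# The old idea: a human being IS a worker (inheritance).
# The person and the economic role are a single object.
class OldHumanBeing(Worker):
    pass

@dataclass
class Soul:
    dignity: float = 1.0  # sustenance, dignity, meaning, purpose

# The alternative: a human being HAS a worker and HAS a soul
# (composition): two separate claims on value in society.
@dataclass
class HumanBeing:
    worker: Worker
    soul: Soul

    def soul_is_fed(self) -> bool:
        # For a period of time, the worker's output could feed the soul.
        return self.worker.produce() >= self.soul.dignity

person = HumanBeing(worker=Worker(), soul=Soul())
print(person.soul_is_fed())  # True only while production covers dignity
```

On the composition model, automating the worker away does not delete the soul’s claim, which is the tension the rest of the passage develops.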

01:13:27 As long as your means of support is not highly repetitive, I think you have a while to go

01:13:34 before you need to start worrying.

01:13:37 And if what you do is highly repetitive and it’s not terribly generative, you are in the

01:13:41 cross hairs of for loops and while loops.

01:13:46 And that’s what computers excel at, repetitive behavior and when I say repetitive, I may

01:13:51 mean things that have never happened through combinatorial possibilities, but as long as

01:13:55 it has a looped characteristic to it, you’re in trouble.
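
A small illustration of what “repetitive” means here, assuming Python and an invented paperwork task: each input combination below may never have occurred before, yet the treatment is identical, so the whole job has the looped characteristic that for loops and while loops automate:

```python
import itertools

def process_filing(form: str, year: int, region: str) -> str:
    # The same fixed procedure, regardless of which combination arrives.
    return f"Filed {form} for {year} in {region}"

forms = ["W-2", "1099", "K-1"]
years = range(2020, 2023)
regions = ["US", "EU", "APAC"]

# Combinatorially "new" inputs, identical looped handling:
# 3 forms x 3 years x 3 regions = 27 cases no human needs to touch.
for form, year, region in itertools.product(forms, years, regions):
    print(process_filing(form, year, region))
```

Generative, non-looped work is what resists this pattern; anything reducible to “for each case, apply the procedure” does not.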

01:13:59 We are seeing a massive push towards socialism because capitalists are slow to address the

01:14:07 fact that a worker may not be able to make claims; a relatively undistinguished median

01:14:13 member of our society still has needs: to reproduce, to have dignity.

01:14:20 And when capitalism abandons the median individual or the bottom 10th or whatever it’s going

01:14:28 to do, it’s flirting with revolution.

01:14:32 And what concerns me is that the capitalists aren’t sufficiently capitalistic to understand

01:14:37 this.

01:14:39 You really want to court authoritarian control in our society because you can’t see that

01:14:45 people may not be able to defend themselves in the marketplace because the marginal product

01:14:50 of their labor was too low to feed their dignity as a soul.

01:14:55 So my great concern is that our free society has to do with the fact that we are self organized.

01:15:02 I remember looking down from my office in Manhattan when Lehman Brothers collapsed and

01:15:06 thinking, who’s going to tell all these people that they need to show up at work when they

01:15:12 don’t have a financial system to incentivize them to show up at work.

01:15:17 So my complaint is first of all, not with the socialists, but with the capitalists,

01:15:22 which is you guys are being idiots.

01:15:24 You’re courting revolution by continuing to harp on the same old ideas that, well, you

01:15:30 know, try, try harder, bootstrap yourself.

01:15:32 Yeah, to an extent that works, to an extent, but we are clearly headed to a place where

01:15:38 there’s nothing that ties together our need to contribute and our need to consume.

01:15:45 And that may not be provided by capitalism because it may have been a temporary phenomenon.

01:15:49 So check out my article on Anthropic Capitalism and the New Gimmick Economy.

01:15:55 I think people are late getting the wake up call and we would be doing a better job saving

01:16:00 capitalism from itself because I don’t want this done under authoritarian control.

01:16:05 And the more we insist that everybody who’s not thriving in our society during their reproductive

01:16:11 years, in order to have a family, is failing at a personal level…

01:16:15 I mean, what a disgusting thing that we’re saying.

01:16:18 What a horrible message. Who, who the hell have we become, that we’ve so bought into the Chicago

01:16:23 model that we can’t see the humanity that we’re destroying in that process?

01:16:28 And it’s, I hate, I hate the thought of communism.

01:16:31 I really do.

01:16:32 My family has flirted with it in decades past.

01:16:34 It’s a wrong, bad idea, but we are going to need to figure out how to make sure that those

01:16:39 souls are nourished and respected and capitalism better have an answer.

01:16:45 And I’m betting on capitalism, but I’ve got to tell you, I’m pretty disappointed with

01:16:48 my team.

01:16:49 So you’re still on the capitalism team.

01:16:52 You just, uh, there’s a theme here.

01:16:55 Radical capitalism.

01:16:56 Hyper capitalism.

01:16:57 Yeah.

01:16:58 I want, I think hyper capitalism is going to have to be coupled to hyper socialism.

01:17:02 You need to allow the most productive people to create wonders and you’ve got to stop bogging

01:17:07 them down with all of these extra nice requirements.

01:17:11 You know, nice is dead.

01:17:13 Good has a future.

01:17:14 Nice doesn’t have a future because nice ends up with, with gulags.

01:17:18 Damn, that’s a good line.

01:17:21 Okay.

01:17:22 Last question.

01:17:23 You tweeted today a simple, quite insightful equation saying, uh, imagine that for every unit

01:17:30 F of fame, you picked up S stalkers and H haters.

01:17:35 So I imagine S and H are dependent on your path to fame perhaps a little bit.

01:17:39 Well, it’s not that simple.

01:17:40 I mean, people always take these things literally when you have like 280 characters to explain

01:17:44 yourself.

01:17:45 So you mean that that’s not a mathematical, uh… No, there’s no law.

01:17:50 Oh, okay.

01:17:51 All right.

01:17:52 So I put the word imagine because I still have a mathematician’s desire for precision.

01:17:56 Imagine that this were true. But it was a beautiful way to imagine that there is a law

01:18:00 that has those variables in it.

01:18:03 And uh, you’ve become quite famous these days.

01:18:06 So how do you yourself optimize that equation with the peculiar kind of fame that you have

01:18:12 gathered along the way?

01:18:13 I want to be kinder.

01:18:14 I want to be kinder to myself.

01:18:16 I want to be kinder to others.

01:18:17 I want to be able to have heart, compassion, uh, these things are really important.

01:18:24 And uh, I have a pretty spectrumy kind of approach to analysis.

01:18:28 I’m quite literal.

01:18:29 I can go full Rain Man on you at any given moment.

01:18:32 No, I can’t.

01:18:33 I can’t.

01:18:34 Uh, it’s facultative autism if you like, and people are gonna get angry because they want

01:18:37 autism to be respected.

01:18:39 So when you see me coding or you see me doing mathematics, I’m, you know, I speak with speech

01:18:47 apnea: uh, “be right down to dinner.” You know, we have to try to integrate ourselves, and

01:18:54 those tensions, you know, it’s sort of back to us as a worker and us as a soul.

01:19:00 Many of us are optimizing one at the expense of the other.

01:19:06 And I struggle with social media and I struggle with people making threats against our families

01:19:11 and I struggle with, um, just how much pain people are in.

01:19:15 And if there’s one message I would like to push out there, um, it’s: you’re responsible, everybody,

01:19:21 all of us, myself included, for struggling. Struggle, struggle mightily, because it’s

01:19:27 nobody else’s job to do your struggle for you.

01:19:30 Now with that said, if you’re struggling and you’re trying and you’re trying to figure

01:19:34 out how to better yourself and where you failed and where you’ve let down your family, your

01:19:38 friends, your workers, all this kind of stuff, give yourself a break.

01:19:43 You know, if, if, if it’s not working out, I have a lifelong relationship with failure

01:19:48 and success.

01:19:49 There’s been no period of my life where both haven’t been present in one form or another.

01:19:56 And I do wish to say that a lot of times people think this is glamorous.

01:20:00 I’m about to go, you know, do a show with Sam Harris.

01:20:04 People are going to listen in on two guys having a conversation on stage.

01:20:07 It’s completely crazy, and I’m always trying to figure out how to make sure that those

01:20:10 people get maximum value.

01:20:12 And uh, that’s why I’m doing this podcast, you know, just give yourself a break.

01:20:18 You owe us, you owe us your struggle.

01:20:20 You don’t owe your family or your coworkers or your lovers or your family members success.

01:20:26 Um, as long as you’re in there and you’re picking yourself up, recognize that this,

01:20:31 this new situation with the economy that doesn’t have the juice to sustain our institutions

01:20:37 has caused the people who’ve risen to the top of those institutions to get quite brutal

01:20:41 and cruel.

01:20:43 Everybody is lying at the moment.

01:20:45 Nobody’s really a truth teller.

01:20:47 Um, try to keep your humanity about you.

01:20:50 Try to recognize that if you’re failing, if things aren’t where you want them to be and

01:20:55 you’re struggling and you’re trying to figure out what you’re doing wrong, which you could

01:20:57 do, it’s not necessarily all your fault.

01:21:01 We are in a global situation.

01:21:02 I have not met the people who are honest, kind, good, successful.

01:21:08 Nobody that I’ve met is checking all the boxes.

01:21:13 Nobody’s getting all tens.

01:21:14 So I just think that’s an important message that doesn’t get pushed out enough.

01:21:18 Either people want to hold society responsible for their failures, which is not reasonable.

01:21:24 You have to struggle, you have to try, or they want to say you’re a hundred percent

01:21:27 responsible for your failures, which is total nonsense.

01:21:31 Beautifully put.

01:21:32 Eric, thank you so much for talking today.

01:21:33 Thanks for having me, buddy.