Transcript
00:00:00 The following is a conversation with Manolis Kellis,
00:00:02 his fourth time on the podcast.
00:00:05 He’s a professor at MIT
00:00:06 and head of the MIT Computational Biology Group.
00:00:10 Since this is episode number 142,
00:00:14 and 42, as we all know,
00:00:16 is the answer to the ultimate question of life,
00:00:18 the universe, and everything,
00:00:20 according to the Hitchhiker’s Guide to the Galaxy,
00:00:23 we decided to talk about this unanswerable question
00:00:26 of the meaning of life
00:00:28 in whatever way we two descendants of apes could muster,
00:00:32 from biology, psychology, to metaphysics, and to music.
00:00:37 Quick mention of each sponsor,
00:00:39 followed by some thoughts related to the episode.
00:00:42 Thanks to Grammarly,
00:00:44 which is a service for checking spelling, grammar,
00:00:47 sentence structure, and readability,
00:00:49 Athletic Greens, the all-in-one drink
00:00:52 that I start every day with
00:00:54 to cover all my nutritional bases,
00:00:56 and Cash App, the app I use to send money to friends.
00:01:00 Please check out these sponsors in the description
00:01:02 to get a discount and to support this podcast.
00:01:05 As a side note,
00:01:06 let me say that the opening 40 minutes of the conversation
00:01:09 are all about the many songs
00:01:11 that formed the soundtrack to the journey
00:01:14 of Manolis’s life.
00:01:15 It was a happy accident for me to discover
00:01:18 yet another dimension of depth
00:01:20 to the fascinating mind of Manolis.
00:01:22 I include links to YouTube versions
00:01:24 of many of the songs we mentioned in the description,
00:01:28 and overlay lyrics on occasion.
00:01:30 But if you’re just listening to this
00:01:31 without listening to the songs or watching the video,
00:01:34 I hope you still might enjoy, as I did,
00:01:37 the passion that Manolis has for music,
00:01:39 his singing of the little excerpts from the songs,
00:01:43 and in general, the meaning we discuss
00:01:46 that we pull from the different songs.
00:01:49 If music is not your thing,
00:01:50 I do give timestamps to the less musical
00:01:53 and more philosophical parts of the conversation.
00:01:56 I hope you enjoy this little experiment
00:01:59 and conversation about music and life.
00:02:02 If you do, please subscribe on YouTube,
00:02:05 review it with five stars on Apple Podcasts,
00:02:07 follow on Spotify, support on Patreon,
00:02:10 or connect with me on Twitter at Lex Fridman.
00:02:13 And now, here's my conversation with Manolis Kellis.
00:02:18 You mentioned Leonard Cohen and the song Hallelujah
00:02:21 as a beautiful song.
00:02:22 So what are the three songs
00:02:26 you draw the most meaning from about life?
00:02:29 Don’t get me started.
00:02:31 So there’s really countless songs that have marked me,
00:02:34 that have sort of shaped me in periods of joy
00:02:38 and in periods of sadness.
00:02:40 My son likes to joke that I have a song
00:02:43 for every sentence he says,
00:02:44 because very often I will break into song
00:02:46 at a sentence he says.
00:02:48 My wife calls me the radio
00:02:50 because I can sort of recite hundreds of songs
00:02:53 that have really shaped me.
00:02:54 So it’s gonna be very hard to just pick a few.
00:02:56 So I’m just gonna tell you a little bit
00:02:57 about my song transition as I’ve grown up.
00:03:01 In Greece, it was very much about,
00:03:03 as I told you before, the misery, the poverty,
00:03:06 but also overcoming adversity.
00:03:09 So some of the songs that have really shaped me
00:03:11 are by Haris Alexiou, for example,
00:03:13 who is one of my favorite singers in Greece.
00:03:16 And then there’s also really just old traditional songs
00:03:19 that my parents used to listen to.
00:03:21 Like one of them is An Imoun Plousios,
00:03:25 which is basically, oh, if I were rich.
00:03:28 And the song is painting this beautiful picture
00:03:32 about all the noises that you hear in the neighborhood,
00:03:34 his poor neighborhood, the train going by,
00:03:37 the priest walking to the church
00:03:39 and the kids crying next door and all of that.
00:03:43 And he says, with all of that,
00:03:44 I’m having trouble falling asleep and dreaming.
00:03:47 If I were rich, and then he would break into that.
00:03:52 So it’s this juxtaposition between the spirit
00:03:55 and the sublime and then the physical and the harsh reality.
00:03:59 It’s just not having troubles, not being miserable.
00:04:03 So basically rich to him just means
00:04:05 out of my misery, basically.
00:04:08 And then also being able to travel,
00:04:10 being able to sort of be the captain of a ship
00:04:12 and see the world and stuff like that.
00:04:14 So it’s just such beautiful imagery.
00:04:16 So many of the Greek songs, just like the poetry
00:04:18 we talked about, they acknowledge the cruelty,
00:04:21 the difficulty of life, but are longing for a better life.
00:04:24 That’s exactly right.
00:04:25 And another one is Ftokhologia.
00:04:26 And this is one of those songs that has like a fast
00:04:29 and joyful half and a slow and sad half.
00:04:33 And it goes back and forth between them.
00:04:35 And it’s like,
00:04:36 Ftokhologia, ya sena kathe mou tragoudi.
00:04:41 So poor, you know, basically it’s the state of being poor.
00:04:45 I don’t even know if there’s a word for that in English.
00:04:50 And then the fast part is sta kheria sou megalosan
00:04:54 ke ponesan ke matosan.
00:04:56 So, and then it’s like, oh, you know,
00:04:59 basically like the state of being poor and misery,
00:05:02 you know, for you, I write all my songs, et cetera.
00:05:05 And then the fast part is in your arms,
00:05:09 grew up and suffered and, you know, stood up and,
00:05:14 you know, rose, men with clear vision.
00:05:18 This whole concept of taking on the world
00:05:21 with nothing to lose because you’ve seen the worst of it.
00:05:25 This imagery of psila kyparissopoula, hara sta koritsopoula.
00:05:29 So it’s describing the young men as cypress trees.
00:05:34 And that’s probably one of my earliest exposures
00:05:35 to a metaphor, to sort of, you know,
00:05:38 this very rich imagery.
00:05:40 And I love the fact that I was reading a story
00:05:42 to my kids the other day and it was dark.
00:05:44 And my daughter Cleo, who's six, is like,
00:05:47 oh, can I please see the pictures?
00:05:48 Oh, let's look at the pictures.
00:05:54 And my son Jonathan, who's eight, is like, but Cleo,
00:05:57 if you look at the pictures, it’s just an image.
00:06:00 If you just close your eyes and listen, it’s a video.
00:06:03 That’s brilliant.
00:06:05 It’s beautiful.
00:06:06 And he’s basically showing just how much more
00:06:08 the human imagination has besides just a few images
00:06:12 that, you know, the book will give you.
00:06:14 And then another one, oh gosh,
00:06:15 this one is really like miserable.
00:06:17 It's called Sto Perigiali to Kryfo.
00:06:23 And it’s basically describing how vigorously
00:06:28 we took on our life and we pushed hard
00:06:31 towards a direction that we then realized
00:06:33 was the wrong one.
00:06:36 And again, these songs give you so much perspective.
00:06:39 There are no songs like that in English
00:06:41 that will basically sort of just smack you in the face
00:06:44 about sort of the passion and the force and the drive.
00:06:47 And then it turns out, we just followed the wrong life.
00:06:51 And it’s like, wow.
00:06:54 Okay, so that was you.
00:06:55 All right, so that’s like before 12.
00:06:57 So growing up in sort of this horrendously miserable
00:07:03 sort of view of romanticism, of suffering.
00:07:07 So then my preteen years is like, you know,
00:07:10 learning English through songs.
00:07:12 So basically, you know,
00:07:13 listening to all the American pop songs
00:07:15 and then memorizing them vocally
00:07:17 before I even knew what they meant.
00:07:19 So, you know, Madonna and Michael Jackson
00:07:23 and all of these sort of really popular songs
00:07:25 and, you know, George Michael and just songs
00:07:27 that I would just listen to the radio and repeat vocally.
00:07:30 And eventually as I started learning English,
00:07:32 I was like, oh wow, this thing I’ve been repeating,
00:07:34 I now understand what it means.
00:07:36 Without relistening to it,
00:07:37 but just from repeating it, it was like, oh.
00:07:41 Again, Michael Jackson’s Man in the Mirror
00:07:44 is teaching you that it’s your responsibility
00:07:47 to just improve yourself.
00:07:49 You know, if you wanna make the world a better place,
00:07:51 take a look at yourself and make the change.
00:07:52 This whole concept of, again, I mean,
00:07:55 all of these songs, you can listen to them shallowly
00:07:57 or you can just listen to them and say,
00:07:59 oh, there’s deeper meaning here.
00:08:00 And I think there’s a certain philosophy of song
00:08:04 as a way of touching the psyche.
00:08:06 So if you look at regions of the brain,
00:08:08 people have lost their language ability
00:08:10 because they have an accident in that region of the brain
00:08:12 can actually sing because it’s exactly
00:08:16 the symmetric region of the brain.
00:08:18 And that again, teaches you so much
00:08:19 about language evolution and sort of the duality
00:08:23 of musicality and, you know, rhythmic patterns
00:08:28 and eventually language.
00:08:31 Do you have a sense of why songs developed?
00:08:34 You’re kind of suggesting that it’s possible
00:08:36 that there is something important
00:08:39 about our connection with song and with music
00:08:43 on the level of the importance of language.
00:08:46 Is it possible?
00:08:47 It’s not just possible.
00:08:48 In my view, language comes after music.
00:08:51 Language comes after song.
00:08:52 No, seriously.
00:08:53 Like basically, my view of human cognitive evolution
00:08:56 is rituals.
00:08:58 If you look at many early cultures,
00:09:01 there’s rituals around every stage of life.
00:09:04 There’s organized dance performances around mating.
00:09:10 And if you look at mate selection,
00:09:11 I mean, that’s an evolutionary drive right there.
00:09:14 So basically, if you’re not able to string together
00:09:16 a complex dance as a bird, you don’t get a mate.
00:09:19 And that actually forms this development
00:09:23 for many song learning birds.
00:09:25 Not every bird knows how to sing.
00:09:27 And not every bird knows how to learn a complicated song.
00:09:31 So basically, there’s birds that simply have
00:09:33 the same few tunes that they know how to play.
00:09:36 And a lot of that is inherent and genetically encoded.
00:09:39 And others are birds that learn how to sing.
00:09:42 And if you look at a lot of these exotic birds of paradise
00:09:47 and stuff like that,
00:09:48 the mating rituals that they have are enormously amazing.
00:09:51 And I think human mating rituals of ancient tribes
00:09:55 are not very far off from that.
00:09:56 And in my view, the sequential formation of these movements
00:10:01 is a prelude to the cognitive capabilities
00:10:06 that ultimately enable language.
00:10:08 It is fascinating to think that that’s
00:10:10 not just an accidental precursor to intelligence.
00:10:13 Yeah, sexually selected.
00:10:16 Well, it’s sexually selected and it’s a prerequisite.
00:10:19 Yeah.
00:10:20 It’s like, it’s required for intelligence.
00:10:21 And even as language has now developed,
00:10:24 I think the artistic expression is needed,
00:10:28 like badly needed by our brain.
00:10:31 So it’s not just that, oh, our brain can kind of,
00:10:34 you know, take a break and go do that stuff.
00:10:36 No, I mean, you know, I don’t know if you remember
00:10:39 that scene from, oh gosh,
00:10:40 where’s that Jack Nicholson movie in New Hampshire.
00:10:44 All work and no play makes Jack a dull boy.
00:10:47 A dull boy.
00:10:49 The Shining.
00:10:49 The Shining.
00:10:51 So there’s this amazing scene where he’s constantly
00:10:54 trying to concentrate and what’s coming out
00:10:57 of the typewriter is just gibberish.
00:10:58 And I have that image as well when I’m working.
00:11:01 And I’m like, no, basically all of these crazy,
00:11:04 you know, huge number of hobbies that I have,
00:11:06 they’re not just tolerated by my work.
00:11:09 They’re required by my work.
00:11:12 This ability of sort of stretching your brain
00:11:13 in all these different directions
00:11:15 is connecting your emotional self and your cognitive self.
00:11:20 And that’s a prerequisite to being able
00:11:22 to be cognitively capable.
00:11:24 At least in my view.
00:11:25 Yeah, I wonder if the world without art and music,
00:11:28 you’re just making me realize that perhaps
00:11:30 that world would be not just devoid of fun things
00:11:34 to look at or listen to, but devoid of all the other stuff.
00:11:38 All the bridges and rockets and science.
00:11:41 Exactly, exactly.
00:11:42 Creativity is not disconnected from art.
00:11:45 And you know, my kids, I mean, you know,
00:11:47 I could be doing the full math treatment to them.
00:11:50 No, they play the piano and they play the violin
00:11:53 and they play sports.
00:11:54 I mean, this whole, you know, sort of movement
00:11:58 and going through mazes and playing tennis
00:12:01 and, you know, playing soccer and avoiding obstacles
00:12:05 and all of that,
00:12:06 that forms your three dimensional view of the world.
00:12:09 Being able to actually move and run and play
00:12:11 in three dimensions is extremely important for math,
00:12:14 for, you know, stringing together complicated concepts.
00:12:17 It’s the same underlying cognitive machinery
00:12:21 that is used for navigating mazes and for navigating theorems
00:12:27 and sort of solving equations.
00:12:28 So I can’t, you know, I can’t have a conversation
00:12:31 with my students without, you know,
00:12:33 sort of either using my hands
00:12:35 or opening the whiteboard in Zoom
00:12:39 and just constantly drawing.
00:12:41 Or, you know, back when we had in person meetings,
00:12:43 just the whiteboard on my own.
00:12:44 The whiteboard, yeah, that’s fascinating to think about.
00:12:47 So that's Michael Jackson's Man in the
00:12:49 Mirror, Careless Whisper by George Michael,
00:12:52 which is a song I like.
00:12:53 You can say Careless Whisper.
00:12:54 I mean, I didn’t say that.
00:12:55 I like that one.
00:12:56 That’s too popular for you.
00:12:57 I had recorded, no, no, no,
00:12:59 it's an amazing song for me.
00:13:01 I had recorded a small part of it
00:13:03 as it’s played at the tail end of the radio.
00:13:05 And I had a tape where I only had part of that song
00:13:09 and I just played it over and over and over again,
00:13:11 just so beautiful.
00:13:13 It’s so heartbreaking.
00:13:15 That song is almost Greek.
00:13:16 It’s so heartbreaking.
00:13:17 I know, and George Michael is Greek.
00:13:19 Is he Greek?
00:13:19 He’s Greek, of course.
00:13:20 George Michaelides, I mean, he’s Greek.
00:13:22 Yeah.
00:13:23 I did not know this.
00:13:24 Now you know.
00:13:25 I’m so sorry to offend you so deeply not knowing this.
00:13:30 So, okay, so what’s…
00:13:31 So anyway, so we’re moving to France
00:13:32 when I’m 12 years old.
00:13:33 And now I’m getting into the songs of Gainsbourg.
00:13:36 So Gainsbourg is this incredible French composer.
00:13:39 He is always seen on stage,
00:13:42 like not even pretending to try to please,
00:13:44 just like with his cigarette,
00:13:45 just like rrrr mumbling his songs.
00:13:47 But the lyrics are unbelievable.
00:13:49 Like basically entire sentences will rhyme.
00:13:53 He will say the same thing twice and you’re like, whoa.
00:13:58 And in fact, another, speaking of Greek,
00:14:00 a French Greek, Georges Moustaki,
00:14:03 this song is just magnificent.
00:14:05 Avec ma gueule de métèque, de Juif errant, de pâtre grec.
00:14:10 So with my face of a métèque, which is actually a Greek word.
00:14:15 Well, it's a French word from a Greek word.
00:14:17 But met comes from meta,
00:14:20 and then tec from oikos, as in ecology, which means home.
00:14:23 So métèque is someone who has changed homes, a migrant.
00:14:26 So with my face of a migrant, and you’ll love this one.
00:14:30 De Juif errant, de pâtre grec,
00:14:32 of a wandering Jew, of a Greek shepherd.
00:14:39 So again, the Russian Greek,
00:14:42 the Jewish Orthodox connection, so.
00:14:45 Avec mes cheveux aux quatre vents,
00:14:46 with my hair in the four winds.
00:14:48 Avec mes yeux tout délavés, qui me donnent l'air de rêver.
00:14:53 With my eyes that are all washed out,
00:14:55 which give me the air of dreaming,
00:14:58 but which don't dream that much anymore.
00:15:01 With my hands of thief, of musician,
00:15:05 and who have stolen so many gardens.
00:15:08 With my mouth that has drunk,
00:15:11 that has kissed, and that has bitten,
00:15:14 without ever pleasing its hunger.
00:15:17 With my skin that has been rubbed
00:15:23 in the sun of all the summers,
00:15:25 and anything that was wearing a skirt.
00:15:29 With my heart, and then you have to listen to this verse,
00:15:32 it’s so beautiful.
00:15:33 Avec mon cœur qui a su faire souffrir autant qu'il a souffert.
00:15:38 Qui a su faire.
00:15:39 With my heart that knew how to make suffer
00:15:44 as much as it suffered,
00:15:46 but was able to, that knew how to make,
00:15:50 in French it’s actually su faire,
00:15:53 that knew how to make.
00:15:54 Qui a su faire souffrir autant qu'il a souffert.
00:15:57 Verses that span the whole thing.
00:15:59 It’s just beautiful.
00:16:00 Do you know, on a small tangent,
00:16:02 do you know Jacques Brel?
00:16:05 Of course, of course.
00:16:07 And then Ne Me Quitte Pas, those songs.
00:16:10 That song gets me every time.
00:16:12 So there’s a cover of that song
00:16:14 by one of my favorite female artists.
00:16:17 Not Nina Simone.
00:16:18 No, no, no, no, no.
00:16:19 Modern?
00:16:20 Caro Emerald.
00:16:21 She’s from Amsterdam.
00:16:24 And she has a version of Ne Me Quitte Pas
00:16:28 where she’s actually added some English lyrics.
00:16:31 And it’s really beautiful.
00:16:33 But again, Ne Me Quitte Pas is just so,
00:16:35 I mean it’s, you know, the promises,
00:16:38 the volcanoes that will restart.
00:16:42 It’s just so beautiful.
00:16:43 And.
00:16:44 I love, there’s not many songs
00:16:47 that show such depth of desperation
00:16:51 for another human being.
00:16:52 That’s so powerful.
00:16:54 I apologize.
00:16:54 Je t'offrirai des perles de pluie
00:16:58 venant de pays où il ne pleut pas.
00:17:01 And then high school.
00:17:02 Now I’m starting to learn English.
00:17:03 So I moved to New York.
00:17:05 So Sting’s Englishman in New York.
00:17:07 Yeah.
00:17:08 Magnificent song.
00:17:09 And again, there's, if manners maketh man, as someone said,
00:17:14 then he’s the hero of the day.
00:17:16 It takes a man to suffer ignorance and smile.
00:17:20 Be yourself no matter what they say.
00:17:23 And then takes more than combat gear to make a man.
00:17:27 Takes more than a license for a gun.
00:17:30 Confront your enemies, avoid them when you can.
00:17:34 A gentleman will walk but never run.
00:17:37 It’s again, you’re talking about songs
00:17:39 that teach you how to live.
00:17:41 I mean, this is one of them.
00:17:42 Basically says, it’s not the combat gear that makes a man.
00:17:46 Where’s the part where he says, there you go.
00:17:49 Gentleness, sobriety are rare in this society.
00:17:53 At night a candle’s brighter than the sun.
00:17:56 So beautiful.
00:17:57 He basically says, well, you just might be the only one.
00:18:00 Modesty, propriety can lead to notoriety.
00:18:03 You could end up as the only one.
00:18:05 It’s, it basically tells you,
00:18:08 you don’t have to be like the others.
00:18:09 Be yourself, show kindness, show generosity.
00:18:14 Don’t, you know, don’t let that anger get to you.
00:18:18 You know, the song Fragile.
00:18:19 How fragile we are, how fragile we are.
00:18:22 So again, back in Greece, I didn't even know what that meant.
00:18:26 How fragile we are, but the song was so beautiful.
00:18:28 And then eventually I learned English
00:18:29 and I actually understand the lyrics.
00:18:31 And the song is actually written after the Contras
00:18:36 murdered Ben Linder in 1987.
00:18:39 And the US eventually turned against supporting
00:18:43 these guerrillas.
00:18:44 And it was just a political song,
00:18:46 but so such a realization that you can’t win
00:18:49 with violence basically.
00:18:51 And that song starts with the most beautiful poetry.
00:18:55 So if blood will flow when flesh and steel are one,
00:19:00 drying in the color of the evening sun,
00:19:04 tomorrow’s rain will wash the stains away,
00:19:08 but something in our minds will always stay.
00:19:11 Perhaps this final act was meant
00:19:14 to clinch a lifetime’s argument
00:19:16 that nothing comes through violence
00:19:19 and nothing ever could.
00:19:21 For all those born beneath an angry star,
00:19:25 lest we forget how fragile we are.
00:19:28 Damn, right?
00:19:31 I mean, that’s poetry.
00:19:33 It was beautiful.
00:19:35 And he’s using the English language
00:19:37 in just such a refined way, with deep meanings,
00:19:42 but also words that rhyme just so beautifully
00:19:45 and evocations of when flesh and steel are one.
00:19:50 I mean, it’s just mind boggling.
00:19:53 And then of course the refrain that everybody remembers
00:19:55 is on and on the rain will fall, et cetera.
00:20:00 But like this beginning.
00:20:00 Tears from a star, wow.
00:20:01 Yeah.
00:20:03 And again, tears from a star, how fragile we are.
00:20:06 I mean, just these rhymes are just flowing so naturally.
00:20:10 Something, it seems that more meaning comes
00:20:12 when there’s a rhythm that, I don’t know what that is.
00:20:16 That probably connects to exactly what you were saying.
00:20:18 And if you pay close attention,
00:20:19 you will notice that the more obvious words
00:20:22 sometimes are the second verse
00:20:25 and the less obvious are often the first verse
00:20:28 because it makes the second verse flow much more naturally
00:20:31 because otherwise it feels contrived.
00:20:33 Oh, you went and found this like unusual word.
00:20:36 In dark moments, the whole Pink Floyd album
00:20:39 and the movie just marked me enormously
00:20:42 as a teenager, The Wall.
00:20:45 And there’s one song that never actually made it
00:20:47 into the album that’s only there in the movie
00:20:49 about when the tigers broke free
00:20:51 and the tigers are the tanks of the Germans.
00:20:55 And it just describes, again, this vivid imagery.
00:20:58 It was just before dawn, one miserable morning in black '44,
00:21:02 when the forward commander was told to sit tight
00:21:05 when he asked that his men be withdrawn.
00:21:08 And the generals gave thanks
00:21:10 as the other ranks held back the enemy tanks for a while.
00:21:16 And the Anzio bridgehead was held
00:21:19 for the price of a few hundred ordinary lives.
00:21:24 So that’s a theme that keeps coming back in Pink Floyd
00:21:26 with Us Versus Them.
00:21:28 Us and them, God only knows
00:21:32 that’s not what we would choose to do.
00:21:36 Forward, he cried from the rear,
00:21:39 and the front rank died, from another song.
00:21:43 It’s like this whole concept of Us Versus Them.
00:21:46 And there’s that theme of Us Versus Them again
00:21:48 where the child is discovering how his father died
00:21:52 when he finds an old scroll. And I found it one day
00:21:55 in a drawer of old photographs hidden away.
00:21:58 And my eyes still grow damp to remember
00:22:01 His Majesty signed with his own rubber stamp.
00:22:05 So it’s so ironic because it seems the way
00:22:08 that he’s writing it that he’s not crying
00:22:09 because his father was lost.
00:22:11 He’s crying because kind old King George
00:22:15 took the time to actually write mother a note
00:22:17 about the fact that his father died.
00:22:19 It’s so ironic because it basically says
00:22:23 we are just ordinary men and of course we’re disposable.
00:22:26 So I don't know if you know the root of the word pioneers,
00:22:30 but you had a chessboard here earlier, a pawn.
00:22:34 In French, it's a pion.
00:22:36 They are the ones that you send to the front
00:22:38 to get murdered, slaughtered.
00:22:40 This whole concept of pioneers
00:22:42 has taken these disposable, ordinary men
00:22:45 to actually be the ones that we're now treating as heroes.
00:22:48 So anyway, there’s this juxtaposition of that.
00:22:50 And then the part that always just strikes me
00:22:53 is the music and the tonality totally changes.
00:22:56 And now he describes the attack.
00:22:59 It was dark all around.
00:23:00 There was frost in the ground.
00:23:02 When the tigers broke free
00:23:05 and no one survived from the Royal Fusiliers Company C.
00:23:10 They were all left behind.
00:23:13 Most of them dead.
00:23:15 The rest of them dying.
00:23:18 And that’s how the high command took my daddy from me.
00:23:24 And that song, even though it’s not in the album,
00:23:27 explains the whole movie.
00:23:29 Cause it’s this movie of misery.
00:23:31 It’s this movie of someone being stuck in their head
00:23:34 and not being able to get out of it.
00:23:36 There’s no other movie that I think has captured so well
00:23:39 this prison that is someone’s own mind.
00:23:44 And this wall that you’re stuck inside
00:23:47 and this feeling of loneliness.
00:23:50 And sort of, is there anybody out there?
00:23:54 And you know, sort of, hello, hello.
00:23:56 Is there anybody in there?
00:23:58 Just nod if you can hear me.
00:24:00 Is there anyone home?
00:24:04 Come on now.
00:24:06 I hear you're feeling down.
00:24:08 Well, I can ease your pain, get you on your feet again.
00:24:15 Anyway, so.
00:24:16 Yeah, the prison of your mind.
00:24:18 So those are the darker moments.
00:24:19 Exactly, these are the darker moments.
00:24:22 Yeah, in the darker moments, the mind does feel
00:24:25 like you’re trapped alone in a room with it.
00:24:30 Yeah, and there’s this scene in the movie
00:24:32 where he just freaks out with his guitar
00:24:35 and there’s this prostitute in the room.
00:24:36 She starts throwing stuff and then he like, you know
00:24:39 breaks the window, he throws the chair outside.
00:24:41 And then you see him laying in the pool
00:24:43 with his own blood, like, you know, everywhere.
00:24:45 And then there’s these endless hours spent
00:24:48 fixing every little thing and lining it up.
00:24:51 And it’s this whole sort of mania versus, you know
00:24:55 you can spend hours building up something
00:24:57 and just destroy it in a few seconds.
00:24:59 One of my turns is that song.
00:25:01 And it's like, I feel cold as a razor blade,
00:25:07 tight as a tourniquet.
00:25:10 Dry as a funeral drum.
00:25:13 And then the music also is like, run to the bedroom.
00:25:17 In the suitcase on the left,
00:25:19 you'll find my favorite axe.
00:25:23 Don’t look so frightened.
00:25:24 This is just a passing phase.
00:25:27 One of my bad days.
00:25:30 It’s just so beautiful.
00:25:31 I need to rewatch it.
00:25:32 That’s so, you’re making me realize.
00:25:34 But imagine watching this as a teenager.
00:25:35 It, like, ruins your mind.
00:25:37 It's such harsh imagery.
00:25:42 And then, you know, anyway, so there’s the dark moment.
00:25:46 And then again, going back to Sting
00:25:48 now it’s the political songs, Russians.
00:25:51 And I think that song should be a new national anthem
00:25:55 for the US, not for Russians, but for red versus blue.
00:25:58 Mr. Khrushchev says we will bury you.
00:26:02 I don’t subscribe to this point of view.
00:26:06 It’d be such an ignorant thing to do
00:26:10 if the Russians love their children too.
00:26:14 What is it doing?
00:26:15 It’s basically saying the Russians
00:26:19 are just as humans as we are.
00:26:21 There’s no way that they’re gonna let their children die.
00:26:24 And then it’s just so beautiful.
00:26:26 How can I save my innocent boy
00:26:30 from Oppenheimer’s deadly toy?
00:26:33 And now, the new national anthem, are you ready?
00:26:36 There is no monopoly of common sense
00:26:40 on either side of the political fence.
00:26:43 We share the same biology
00:26:47 regardless of ideology.
00:26:50 Believe me when I say to you,
00:26:54 I hope the Russians love their children too.
00:27:00 There’s no such thing as a winnable war.
00:27:03 It’s a lie we don’t believe anymore.
00:27:08 I mean, it’s beautiful, right?
00:27:10 And for God’s sake, America, wake up.
00:27:13 These are your fellow Americans.
00:27:15 They share your biology.
00:27:19 There is no monopoly of common sense
00:27:21 on either side of the political fence.
00:27:23 It’s just so beautiful.
00:27:24 There’s no crisper, simpler way to say
00:27:27 Russians love their children too, the common humanity.
00:27:31 And remember what I was telling you,
00:27:33 I think in one of our first podcasts
00:27:35 about the daughter who’s crying for her brother
00:27:39 to come back from war.
00:27:40 And then the Virgin Mary appears and says,
00:27:43 who should I take instead?
00:27:44 This Turk, here’s his family, here’s his children.
00:27:47 This other one, he just got married, et cetera.
00:27:51 And that basically says, no.
00:27:53 I mean, if you look at the Lord of the Rings,
00:27:56 the enemies are these monsters, they’re not human.
00:27:59 And that’s what we always do.
00:28:00 We always say, they’re not like us, they’re different.
00:28:04 They’re not humans, et cetera.
00:28:06 So there’s this dehumanization that has to happen
00:28:08 for people to go to war.
00:28:11 If you realize just how close we are genetically,
00:28:14 one with the other, this whole 99.9% identical,
00:28:18 you can’t bear weapons against someone who’s like that.
00:28:22 And the things that are the most meaningful to us
00:28:24 in our lives at every level are the same on all sides,
00:28:28 on both sides.
00:28:29 Exactly.
00:28:30 So it’s not just that we’re genetically the same.
00:28:32 Yeah, we’re ideologically the same.
00:28:34 We love our children, we love our country.
00:28:36 We will fight for our family.
00:28:40 So, and the last one I mentioned last time we spoke,
00:28:43 which is Joni Mitchell’s Both Sides Now.
00:28:47 So she has three rounds, one on clouds, one on love,
00:28:51 and one on life.
00:28:53 And on clouds she says,
00:28:55 Rows and flows of angel hair
00:28:58 And ice cream castles in the air
00:29:02 And feather canyons everywhere
00:29:04 I’ve looked at clouds that way
00:29:07 But now they only block the sun
00:29:10 They rain and snow on everyone
00:29:14 So many things I would have done
00:29:17 But clouds got in my way
00:29:19 And then, I've looked at clouds from both sides now
00:29:23 From up and down
00:29:25 And still somehow
00:29:28 It's cloud illusions I recall
00:29:31 I really don't know clouds at all
00:29:36 And then she goes on about love,
00:29:37 how it’s super, super happy,
00:29:39 or it’s about misery and loss and about life,
00:29:42 how it’s about winning and losing and so on and so forth.
00:29:45 But now old friends are acting strange
00:29:49 They shake their heads, they say I’ve changed
00:29:53 Well, something’s lost and something’s gained
00:29:56 In living every day
00:29:59 So again, that’s growing up and realizing that,
00:30:02 well, the view that you had as a kid
00:30:04 is not necessarily that you have as an adult.
00:30:07 Remember my poem from when I was 16 years old
00:30:09 of this whole, you know,
00:30:11 children dance now all in row and then in the end,
00:30:14 even though the snow seems bright,
00:30:16 without you they have lost their light,
00:30:17 sun that sang and moon that smiled.
00:30:19 So this whole concept of, if you have love
00:30:22 and if you have passion,
00:30:24 you see the exact same thing from a different way.
00:30:26 You can go out running in the rain
00:30:28 or you could just stay in and say,
00:30:29 ah, sucks, I won’t be able to go outside now.
00:30:33 Both sides.
00:30:34 Anyway, and the last one is,
00:30:36 last, last one I promise, Leonard Cohen.
00:30:38 This is amazing by the way.
00:30:39 I’m so glad we stumbled on how much joy you have
00:30:44 in so many avenues of life and music is just one of them.
00:30:49 That’s amazing.
00:30:50 But yes, Leonard Cohen.
00:30:52 Going back to Leonard Cohen,
00:30:53 since that’s where you started.
00:30:54 So Leonard Cohen’s Dance Me to the End of Love.
00:30:56 That was our opening song at our wedding with my wife.
00:30:59 Oh no, that’s good.
00:31:00 As we came out to greet the guests,
00:31:01 it was Dance Me to the End of Love.
00:31:03 And then another one, which is just so passionate always
00:31:06 and we always keep referring back to it is I’m Your Man.
00:31:09 And it goes on and on about sort of,
00:31:12 I can be every type of lover for you.
00:31:14 And what’s really beautiful in marriage
00:31:16 is that we live that with my wife every day.
00:31:20 You can have the passion, you can have the anger,
00:31:23 you can have the love, you can have the tenderness.
00:31:25 There’s just so many gems in that song.
00:31:28 If you want a partner, take my hand.
00:31:31 Or if you want to strike me down in anger,
00:31:35 here I stand, I’m your man.
00:31:39 Then if you want a boxer, I will step into the ring for you.
00:31:43 If you want a driver, climb inside.
00:31:46 Or if you want to take me for a ride, you know you can.
00:31:50 So this whole concept of you want to drive, I’ll follow.
00:31:53 You want me to drive, I’ll drive.
00:31:56 And the difference I would say between like that
00:31:58 and Ne Me Quitte Pas is, in this song, he's got an attitude.
00:32:01 He's like, he's proud of his ability
00:32:06 to basically be any kind of man, as opposed
00:32:09 to the Jacques Brel-like desperation of, what do I have to be
00:32:14 for you to love me, that kind of desperation.
00:32:17 But notice there’s a parallel here.
00:32:20 There’s a verse that is perhaps not paid attention
00:32:22 to as much which says, ah, but a man never got a woman back,
00:32:29 not by begging on his knees.
00:32:32 So it seems that the I’m your man
00:32:34 is actually an apology song in the same way
00:32:37 that Ne Me Quitte Pas is an apology song.
00:32:40 Ne Me Quitte Pas basically says, I've screwed up.
00:32:44 I’m sorry, baby.
00:32:45 And in the same way, the Careless Whisper
00:32:48 is, I screwed up.
00:32:48 Yes, that’s right.
00:32:50 I’m never gonna dance again.
00:32:52 Guilty feet have got no rhythm.
00:32:57 So this is an apology song, not by begging on his knees
00:32:59 or I’d crawl to you, baby, and I’d fall at your feet
00:33:03 and I’d howl at your beauty like a dog in heat
00:33:07 and I’d claw at your heart and I’d tear at your sheet.
00:33:11 I’d say please.
00:33:13 And then the last one is so beautiful.
00:33:18 If you want a father for your child
00:33:22 or only want to walk with me a while across the sand,
00:33:28 I’m your man.
00:33:30 That’s the last verses which basically says
00:33:33 you want me for a day?
00:33:35 I’ll be there.
00:33:36 Do you want me to just walk?
00:33:37 I’ll be there.
00:33:38 You want me for life?
00:33:39 Do you want a father for your child?
00:33:41 I’ll be there too.
00:33:42 It’s just so beautiful.
00:33:43 Oh, sorry.
00:33:44 Remember how I told you I was gonna finish
00:33:45 with a lighthearted song?
00:33:46 Yes.
00:33:48 Last one.
00:33:48 You ready?
00:33:49 So Alison Krauss and Union Station,
00:33:53 country song, believe it or not, the lucky one.
00:33:57 So I’ve never identified as much
00:34:01 with the lyrics of a song as this one.
00:34:05 And it’s hilarious.
00:34:06 My friend, Serafim Batoglou,
00:34:09 is the guy who got me to genomics in the first place.
00:34:11 I owe enormously to him.
00:34:13 And he’s another Greek.
00:34:14 We actually met dancing, believe it or not.
00:34:16 So we used to perform Greek dances.
00:34:18 I was the president of the International Students Association.
00:34:20 So we put on these big performances
00:34:22 for 500 people at MIT.
00:34:24 And there's a picture in the MIT Tech
00:34:26 where Serafim, who’s like a bodybuilder,
00:34:29 was holding me on his shoulder.
00:34:30 And I was like doing maneuvers in the air, basically.
00:34:34 So anyway, this guy, Serafim,
00:34:35 we were driving back from a conference.
00:34:40 And there’s this Russian girl
00:34:41 who was describing how every member of her family
00:34:44 had been either killed by the communists
00:34:46 or killed by the Germans or killed by the,
00:34:48 like, she had just like, you know, misery,
00:34:51 like death and, you know, sickness and everything.
00:34:54 Everyone was decimated in her family.
00:34:56 She was the last standing member.
00:34:57 And we stopped at a, Serafim was driving
00:35:00 and we stopped at a rest area.
00:35:02 And he takes me aside and he’s like,
00:35:04 Manolis, we’re gonna crash.
00:35:07 Get her out of my car.
00:35:10 And then he basically says,
00:35:11 but I’m only reassured because you’re here with me.
00:35:14 And I’m like, what do you mean?
00:35:15 He’s like, you know, he’s like, from your smile,
00:35:18 I know you’re the luckiest man on the planet.
00:35:22 So there’s this really funny thing
00:35:24 where I just feel freaking lucky all the time.
00:35:28 And it’s a question of attitude.
00:35:30 Of course, I’m not any luckier than any other person,
00:35:32 but every time something horrible happens to me,
00:35:34 I’m like, and in fact, even in that song,
00:35:36 the song about sort of, you know,
00:35:38 walking on the beach and this, you know,
00:35:40 sort of taking our life the wrong way
00:35:43 and then, you know, having to turn around.
00:35:45 At some point he’s like, you know,
00:35:47 in the fresh sand, we wrote her name.
00:35:53 So how nicely that the wind blew and the writing was erased.
00:35:58 So again, it’s this whole sort of,
00:36:00 not just saying, oh, bummer, but, oh, great.
00:36:05 I just lost this.
00:36:06 This must mean something, right?
00:36:09 This horrible thing happened,
00:36:11 it must open the door to a beautiful chapter.
00:36:14 So, so Allison Krauss is talking about the lucky one.
00:36:17 So I was like, oh my God, she wrote a song for me.
00:36:21 And she goes, you’re the lucky one, I know that now,
00:36:24 as free as the wind blowing down the road,
00:36:27 loved by many, hated by none, I’d say,
00:36:30 you were lucky because you know what you’ve done,
00:36:32 not a care in the world, not a worry in sight.
00:36:35 Everything’s going to be all right
00:36:37 because you’re the lucky one.
00:36:39 And then she goes, you’re the lucky one,
00:36:42 always having fun, a jack of all trades,
00:36:44 a master of none.
00:36:45 You look at the world with a smiling eye
00:36:48 and laugh at the devil as his train rolls by.
00:36:50 I’ll give you a song and a one night stand.
00:36:53 You’ll be looking at a happy man
00:36:55 because you’re the lucky one.
00:36:57 It basically says, if you just don’t worry too much,
00:37:01 if you don’t try to be a one trick pony,
00:37:06 if you just embrace the fact
00:37:08 that you might suck at a bunch of things,
00:37:10 but you’re just gonna try a lot of things.
00:37:12 And then there’s another verse that says,
00:37:15 well, you’re blessed I guess,
00:37:17 by never knowing which road you're choosing,
00:37:20 to you the next best thing
00:37:22 to playing and winning is playing and losing.
00:37:24 It’s just so beautiful
00:37:26 because he basically says,
00:37:28 if you try your best,
00:37:33 but it’s still playing,
00:37:35 if you lose, it’s okay.
00:37:36 You had an awesome game.
00:37:38 And again, superficially,
00:37:41 it sounds like a super happy song.
00:37:43 But then there’s the last verse basically says,
00:37:46 no matter where you are, that’s where you’ll be.
00:37:47 You can bet your luck won’t follow me.
00:37:50 Just give you a song and then one night stand,
00:37:53 you’ll be looking at a happy man.
00:37:54 And then in the video of the song,
00:37:56 she just walks away or he just walks away
00:37:57 or something like that.
00:37:59 And it basically tells you
00:38:00 that freedom comes at a price.
00:38:03 Freedom comes at the price of non-commitment.
00:38:05 This whole sort of birds who love
00:38:07 or birds who cry,
00:38:08 you can’t really love unless you cry.
00:38:11 You can’t just be the lucky one,
00:38:12 the happy boy, la la la,
00:38:14 and yet have a long term relationship.
00:38:18 So it’s, on one hand,
00:38:20 I identify with the shallowness of this song,
00:38:22 of you know, you’re the lucky one,
00:38:24 jack of all trades, a master of none.
00:38:26 But at the same time,
00:38:27 I identify with a lesson of,
00:38:29 well, you can’t just be the happy,
00:38:31 merry-go-lucky all the time.
00:38:33 Sometimes you have to embrace loss
00:38:35 and sometimes you have to embrace suffering.
00:38:37 And sometimes you have to embrace that.
00:38:40 If you have a safety net,
00:38:42 you’re not really committing enough.
00:38:44 You’re not, you know,
00:38:45 basically you’re allowing yourself to slip.
00:38:49 But if you just go all in
00:38:50 and you just, you know,
00:38:52 let go of your reservations,
00:38:53 that’s when you truly will get somewhere.
00:38:56 So anyway, that’s like the,
00:38:58 I managed to narrow it down to what, 15 songs?
00:39:01 Thank you for that wonderful journey
00:39:03 that you just took us on,
00:39:04 from the darkest possible places of Greek song
00:39:10 to ending on this country song.
00:39:13 I haven’t heard it before,
00:39:15 but that’s exactly right.
00:39:16 I feel the same way,
00:39:17 depending on the day,
00:39:19 like the luckiest human on earth.
00:39:22 And there’s something to that,
00:39:23 but you’re right, it needs to be,
00:39:27 we need to now return to the muck of life
00:39:30 in order to be able to truly enjoy it.
00:39:35 So it’s…
00:39:35 What do you mean muck?
00:39:36 What’s muck?
00:39:38 The messiness of life.
00:39:39 Yeah.
00:39:39 The things that, well,
00:39:41 don't turn out the way you expect them to.
00:39:44 Yeah.
00:39:44 So like, to feel like you’re in the right place,
00:39:47 to feel lucky,
00:39:48 is like focusing on the beautiful consequences.
00:39:51 Yeah.
00:39:52 But then that feeling of things being different
00:39:55 than you expected,
00:39:56 that you stumble in all kinds of ways,
00:39:59 that seems to
00:40:01 need to be paired with the feeling of luck.
00:40:03 There’s basically one way,
00:40:04 the only way not to make mistakes,
00:40:06 is to never do anything.
00:40:07 Right.
00:40:10 Basically, you have to embrace the fact
00:40:11 that you’ll be wrong so many times.
00:40:13 In so many research meetings,
00:40:14 I just go off on a tangent and say,
00:40:17 let’s think about this for a second.
00:40:18 And it’s just crazy for me,
00:40:21 who’s a computer scientist,
00:40:22 to just tell my biologist friends,
00:40:24 what if biology kind of worked this way?
00:40:25 Yeah.
00:40:26 And they humor me.
00:40:27 They just let me talk.
00:40:29 And rarely has it not gone somewhere good.
00:40:34 It’s not that I’m always right,
00:40:35 but it’s always something worth exploring further,
00:40:39 Being an outsider with humility,
00:40:42 knowing that I'll be wrong a bunch of times,
00:40:46 but that I'll challenge your assumptions
00:40:49 and often take us to a better place,
00:40:52 is part of this whole sort of messiness of life.
00:40:54 Like if you don’t try and lose and get hurt
00:40:58 and suffer and cry and just break your heart
00:41:01 and all these feelings of guilt and,
00:41:03 wow, I did the wrong thing.
00:41:06 Of course, that’s part of life.
00:41:07 And that’s just something that,
00:41:08 if you are a doer, you’ll make mistakes.
00:41:12 If you’re a criticizer, yeah, sure,
00:41:14 you can sit back and criticize everybody else
00:41:16 for the mistakes they make.
00:41:18 Or instead, you can just be out there making those mistakes.
00:41:21 And frankly, I’d rather be the criticized one
00:41:23 than the criticizer.
00:41:24 Yeah, brilliantly put.
00:41:25 Every time somebody steals my bicycle,
00:41:27 I say, well, no, my son’s like,
00:41:29 why do they steal our bicycle, dad?
00:41:31 And I’m like, aren’t you happy that you have bicycles
00:41:34 that people can steal?
00:41:35 Yeah.
00:41:39 I’d rather be the person stolen from than the stealer.
00:41:41 Yeah, it’s not the critic that counts.
00:41:45 So we've just talked amazingly
00:41:49 about life from the music perspective.
00:41:51 Let's talk about human life
00:41:55 from perhaps other perspectives, and its meaning.
00:41:57 So this is episode 142.
00:42:00 There is perhaps an absurdly deep meaning
00:42:08 to the number 42 that our culture has elevated.
00:42:13 So this is a perfect time to talk about the meaning of life.
00:42:16 We’ve talked about it already,
00:42:18 but do you think this question that’s so simple
00:42:23 and so seemingly absurd has value
00:42:27 of what is the meaning of life?
00:42:29 Is it something that raising the question
00:42:33 and trying to answer it, is that a ridiculous pursuit
00:42:36 or is there some value?
00:42:37 Is it answerable at all?
00:42:39 So first of all, I feel that we owe it to your listeners
00:42:42 to say why 42?
00:42:44 Sure.
00:42:46 So of course the Hitchhiker’s Guide to the Galaxy
00:42:48 came up with 42 as basically a random number.
00:42:52 Just, you know, the author just pulled it out of a hat
00:42:55 and he’s admitted so.
00:42:56 He said, well, 42 just seemed like as random a number as any.
00:43:01 But in fact, there’s many numbers that are linked to 42.
00:43:05 So 42, again, just to summarize,
00:43:09 is the answer that this super mega computer,
00:43:13 the most powerful computer in the world,
00:43:15 had come up with
00:43:17 after computing for millions of years.
00:43:18 At some point, the computer says,
00:43:21 I have an answer.
00:43:23 And they’re like, what?
00:43:25 It’s like, you’re not gonna like it.
00:43:27 Like, what is it?
00:43:28 It’s 42.
00:43:30 And then the irony is that they had forgotten, of course,
00:43:33 what the question was.
00:43:33 Yes.
00:43:34 So now they have to build a bigger computer
00:43:36 to figure out what the question is,
00:43:37 to which the answer is 42.
00:43:39 So as I was turning 42,
00:43:41 I basically sort of researched
00:43:43 why 42 is such a cool number.
00:43:45 And it turns out that,
00:43:47 and I put together this little passage
00:43:48 that was explaining to all the guests
00:43:51 at my 42nd birthday party
00:43:53 why we were talking about the meaning of life.
00:43:54 And basically talked about how 42
00:43:58 is the angle at which light reflects off of water
00:44:03 to create a rainbow.
00:44:05 And it’s so beautiful because the rainbow
00:44:07 is basically the combination of sort of,
00:44:09 it’s been raining, but there’s hope
00:44:11 because the sun just came out.
00:44:13 So it’s a very beautiful number there.
00:44:14 So 42 is also the sum along every row and column
00:44:17 of a magic cube that contains all consecutive integers
00:44:21 starting at one.
00:44:22 So basically, if you take all integers
00:44:24 between one and however many cells there are,
00:44:27 the sum along each line is always 42.
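As a quick sanity check of that magic-cube constant, here is a minimal sketch in Python, assuming the classic 3x3x3 cube holding the integers 1 through 27:

```python
# Magic constant of an n x n x n magic cube holding 1..n^3 is n * (n^3 + 1) / 2.
n = 3
magic_constant = n * (n**3 + 1) // 2   # 3 * 28 / 2 = 42
total = sum(range(1, n**3 + 1))        # 1 + 2 + ... + 27 = 378
print(magic_constant, total // n**2)   # 42 42: the 378 total split over 9 parallel lines
```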
00:44:29 42 was the last number left under 100
00:44:34 for which it was not known whether the equation of X cubed
00:44:36 plus Y cubed plus Z cubed equals N
00:44:39 has a solution.
00:44:42 And now a solution has actually been found.
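That equation is the sums-of-three-cubes problem; the case n = 42 was settled in 2019 by Andrew Booker and Andrew Sutherland. Below is a naive brute-force sketch in Python, with a hypothetical helper for illustration only; it can only find small solutions, and the known solution for 42 has 17-digit terms, far beyond any such search:

```python
def three_cubes(n: int, bound: int = 100):
    """Search for integers with x**3 + y**3 + z**3 == n and |x|, |y|, |z| <= bound."""
    cubes = {x**3: x for x in range(-bound, bound + 1)}  # cube -> cube root lookup
    for x in range(-bound, bound + 1):
        for y in range(x, bound + 1):
            remainder = n - x**3 - y**3
            if remainder in cubes:
                return (x, y, cubes[remainder])
    return None

print(three_cubes(29))  # a small solution, e.g. (1, 1, 3): 1 + 1 + 27 = 29
print(three_cubes(42))  # None at this bound; 42's solution needed massive computation
```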
00:44:46 42 is also 101010 in binary, 32 plus 8 plus 2.
00:44:50 Again, the yin and the yang,
00:44:51 the good and the evil,
00:44:52 one and zero, the balance of the force.
00:44:54 42 is the number of chromosomes for the giant panda.
00:44:58 And the giant panda, I know it’s totally random.
00:45:01 It's an auspicious symbol of great strength
00:45:04 coupled with peace, friendship, gentle temperament,
00:45:07 harmony, balance, and friendship
00:45:09 whose black and white colors again symbolize yin and yang.
00:45:12 The reason why it’s the symbol for China
00:45:15 is exactly the strength, but yet peace and so on and so forth.
00:45:19 So 42 chromosomes.
00:45:21 It takes light 10 to the minus 42 seconds
00:45:24 to cross the diameter of a proton
00:45:27 connecting the two fundamental dimensions
00:45:29 of space and time.
00:45:31 42 is the number of times a piece of paper
00:45:33 should be folded to reach beyond the moon,
00:45:40 which is what I assume my students mean
00:45:41 when they ask that their paper reaches for the stars.
00:45:44 I just tell them just fold it a bunch of times.
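The folding arithmetic does check out, as a minimal sketch assuming a sheet roughly 0.1 mm thick:

```python
# Each fold doubles the thickness: 0.1 mm * 2^42, versus ~384,400 km to the Moon.
thickness_km = 1e-4 * 2**42 / 1000   # 0.1 mm in meters, doubled 42 times, in km
print(round(thickness_km))           # ~439805 km, comfortably beyond the Moon
```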
00:45:46 42 is the number of the Messier object M42, the Orion Nebula.
00:45:54 And that's one of the most famous nebulae.
00:45:58 It’s I think also the place where we can actually see
00:46:00 the center of our galaxy.
00:46:03 42 is the numeric representation of the star symbol
00:46:05 in ASCII, which is very useful
00:46:09 when searching for the stars,
00:46:10 and also a regexp for life, the universe, and everything.
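Both of those are one-line checks in Python:

```python
print(ord('*'))  # 42: the ASCII code of the star symbol
print(chr(42))   # '*': the wildcard that, in glob/regexp patterns, matches anything
```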
00:46:14 So, star. In Egyptian mythology, the goddess Maat,
00:46:20 who personified truth and justice,
00:46:22 would ask 42 questions of every dying person.
00:46:25 And those answering successfully would become stars,
00:46:27 continuing to give life and fuel universal growth.
00:46:31 In Judaic tradition, God's 42-lettered name
00:46:35 was entrusted only to the middle-aged, pious, meek,
00:46:38 free from bad temper, sober, and not insistent on his rights.
00:46:42 And in Christian tradition, there’s 42 generations
00:46:44 from Abraham, Isaac, that we talked about,
00:46:47 the story of Isaac, Jacob, eventually Joseph, Mary and Jesus.
00:46:51 In Kabbalistic tradition, Eloah, which is 42,
00:46:54 is the number with which God creates the universe,
00:46:57 starting with 25, let there be, and ending with 17, good.
00:47:02 So 25 plus, you know, 17. There's a 42-chapter sutra,
00:47:10 which is the first Indian religious scripture
00:47:12 which was translated to Chinese,
00:47:14 thus introducing Buddhism to China from India.
00:47:18 The 42-line Bible was the first printed book
00:47:21 marking the age of printing in the 1450s
00:47:25 and the dissemination of knowledge
00:47:27 eventually leading to the enlightenment.
00:47:29 A yeast cell, which is a single-celled eukaryote
00:47:33 and the subject of my PhD research,
00:47:34 has exactly 42 million proteins.
00:47:38 Anyway, so there’s a series of 42.
00:47:40 You’re on fire with this, these are really good.
00:47:42 So I guess what you’re saying is just a random number.
00:47:45 Yeah, basically.
00:47:47 So all of these are backronyms.
00:47:49 So, you know, after you have the number,
00:47:50 you figure out why that number.
00:47:52 So anyway, now that we've spoken about why 42,
00:47:56 why do we search for meaning?
00:47:58 And you’re asking, you know, will that search
00:48:01 ultimately lead to our destruction?
00:48:02 And my thinking is exactly the opposite.
00:48:04 So basically that asking about meaning
00:48:08 is something that’s so inherent to human nature.
00:48:11 It’s something that makes life beautiful
00:48:13 that makes it worth living.
00:48:14 And that searching for meaning is actually the point.
00:48:18 It’s not the finding it.
00:48:19 I think when you found it, you’re dead.
00:48:22 Don’t ever be satisfied that, you know, I’ve got it.
00:48:26 So I like to say that life is lived forward
00:48:30 but it only makes sense backward.
00:48:32 And I don’t remember whose quote that is,
00:48:34 but the whole search itself is the meaning.
00:48:39 And what I love about it is that
00:48:43 there’s a double search going on.
00:48:44 There’s a search in every one of us
00:48:47 through our own lives to find meaning.
00:48:50 And then there’s a search which is happening
00:48:52 for humanity itself to find our meaning.
00:48:56 And we as humans like to look at animals and say,
00:49:01 of course they have a meaning.
00:49:03 Like a dog has its meaning.
00:49:04 It’s just a bunch of instincts, you know,
00:49:07 running around, loving everything.
00:49:09 You know, remember our joke with a cat and the dog.
00:49:11 Yeah, cat has no meaning.
00:49:14 No, no.
00:49:15 So, and I’m noticing the yin yang symbol right here
00:49:20 with this whole panda, black and white, and the 0 and 1.
00:49:23 You’re on fire with that 42.
00:49:24 Some of those are gold. The ASCII value for a star symbol.
00:49:29 Damn.
00:49:30 So basically in my view, the search for meaning
00:49:34 and the act of searching for something more meaningful
00:49:39 is life’s meaning by itself.
00:49:43 The fact that we kind of always hope that,
00:49:45 yes, maybe for animals that’s not the case,
00:49:47 but maybe humans have something that we should be doing
00:49:51 and something else.
00:49:52 And it’s not just about procreation.
00:49:53 It’s not just about dominance.
00:49:55 It’s not just about strength and feeding, et cetera.
00:49:58 Like we’re the one species that spends such a tiny,
00:50:00 little minority of its time feeding
00:50:03 that we have this enormous, huge cognitive capability
00:50:08 that we can just use for all kinds of other stuff.
00:50:11 And that’s where art comes in.
00:50:12 That’s where the healthy mind comes in
00:50:15 with exploring all of these different aspects
00:50:18 that are just not directly tied to a purpose.
00:50:23 That’s not directly tied to a function.
00:50:25 It’s really just the playing of life.
00:50:28 You know, not for any particular reason.
00:50:32 Do you think this thing we got,
00:50:34 this mind is unique in the universe
00:50:39 in terms of how difficult it is to build?
00:50:43 Is it possible that we’re the most beautiful thing
00:50:47 that the universe has constructed?
00:50:49 Both the most beautiful and the most ugly,
00:50:51 but certainly the most complex.
00:50:52 So look at evolutionary time.
00:50:55 The dinosaurs ruled the earth for 135 million years.
00:50:58 We’ve been around for a million years.
00:51:02 So one versus 135.
00:51:05 So dinosaurs went extinct, you know,
00:51:07 about 60 million years ago
00:51:09 and mammals that had been happily evolving
00:51:11 as tiny little creatures for 30 million years
00:51:13 then took over the planet and then, you know,
00:51:15 dramatically radiated about 60 million years ago.
00:51:19 Out of these mammals came the neocortex formation.
00:51:24 So basically the neocortex,
00:51:25 which is sort of the outer layer of our brain
00:51:28 compared to our quote unquote reptilian brain,
00:51:29 which we share the structure of with all of the dinosaurs.
00:51:33 They didn’t have that and yet they ruled the planet.
00:51:36 So how many other planets have still, you know,
00:51:38 mindless dinosaurs where strength
00:51:41 was the only dimension ruling the planet?
00:51:45 So there was something weird that annihilated the dinosaurs.
00:51:49 And again, you could look at biblical things
00:51:51 of sort of God coming and wiping out his creatures
00:51:53 to make room for the next ones.
00:51:55 So the mammals basically sort of took over the planet
00:51:59 and then grew this cognitive capability
00:52:03 of this general purpose machine.
00:52:06 And primates push that to extreme
00:52:10 and humans among primates have just exploded that hardware.
00:52:15 But that hardware is selected for survival.
00:52:20 It’s selected for procreation.
00:52:23 It's initially selected with this very simple
00:52:26 Darwinian view of the world of random mutation,
00:52:30 ruthless selection, and then selection
00:52:32 for making more of yourself.
00:52:35 If you look at human cognition,
00:52:38 it’s gone down a weird evolutionary path
00:52:41 in the sense that we are expending
00:52:44 an enormous amount of energy on this apparatus
00:52:47 between our ears that is wasting, what,
00:52:50 15% of our bodily energy, 20%,
00:52:53 like some enormous percentage of our calories
00:52:57 go to the functioning of our brain.
00:53:00 No other species makes that big of a commitment.
00:53:03 That has basically required energetic changes
00:53:06 for efficiency on the metabolic side for humanity
00:53:11 to basically power that thing.
00:53:13 And our brain is both enormously more efficient
00:53:17 than other brains, but also, despite this efficiency,
00:53:20 enormously more energy consuming.
00:53:23 And if you look at just the sheer folds
00:53:27 that the human brain has, again, our skull could only
00:53:30 grow so much before it could no longer go
00:53:32 through the pelvic opening without killing the mother
00:53:36 at every birth, and yet the folds continued,
00:53:40 effectively creating just so much more capacity.
00:53:43 The evolutionary context in which this was made
00:53:48 is enormously fascinating, and it has to do with
00:53:53 other humans that we have now killed off
00:53:55 or that have gone extinct.
00:53:57 And that has now created this weird place of humans
00:54:01 on the planet as the only species
00:54:03 that has this enormous hardware.
00:54:05 So that can basically make us think
00:54:07 that there’s something very weird and unique
00:54:09 that happened in human evolution that perhaps
00:54:11 has not been recreated elsewhere. Maybe the asteroid
00:54:13 didn’t hit, you know, sister earth,
00:54:15 and dinosaurs are still ruling, and, you know,
00:54:19 any kind of proto human is squished
00:54:21 and eaten for breakfast basically.
00:54:25 However, we’re not as unique as we like to think
00:54:28 because there was this enormous diversity
00:54:30 of other human like forms.
00:54:33 And once you make it to that stage where you have
00:54:35 a neocortex like explosion of, wow, we’re now
00:54:38 competing on intelligence, and we’re now competing
00:54:41 on social structures, and we’re now competing
00:54:43 on larger and larger groups, and being able to
00:54:47 coordinate and being able to have empathy,
00:54:51 the concept of empathy, the concept of an ego,
00:54:55 the concept of a self, of self awareness,
00:54:58 comes probably from being able to project
00:55:04 another person’s intentions,
00:55:06 another person’s intentions to understand
00:55:10 what they mean when you have these large cognitive groups,
00:55:13 large social groups.
00:55:15 So me being able to sort of create a mental model
00:55:19 of how you think may have come before I was able
00:55:22 to create a personal mental model of how do I think.
00:55:26 So this introspection probably came after this sort
00:55:30 of projection and this empathy, which basically means,
00:55:34 you know, passion, pathos, suffering,
00:55:37 but basically sensing.
00:55:39 So basically empathy means feeling what you’re feeling,
00:55:42 trying to project your emotional state
00:55:45 onto my cognitive apparatus.
00:55:47 And I think that is what eventually led
00:55:51 to this enormous cognitive explosion
00:55:55 that happened in humanity.
00:55:56 So, you know, life itself in my view is inevitable
00:56:02 on every planet.
00:56:03 Inevitable. Inevitable.
00:56:05 But the evolution of life to self awareness and cognition
00:56:09 and all the incredible things that humans have done,
00:56:12 you know, that might not be as inevitable.
00:56:14 That’s your intuition.
00:56:15 So if you were to sort of estimate and bet some money on it,
00:56:20 if we reran Earth a million times,
00:56:26 would what we got now be the most special thing
00:56:29 and how often would it be?
00:56:30 So scientifically speaking, how repeatable is this experiment?
00:56:36 So this whole cognitive revolution?
00:56:38 Yes.
00:56:40 Maybe not.
00:56:41 Maybe not.
00:56:42 Basically, I feel that the longevity of, you know,
00:56:47 dinosaurs suggests that it was not quite inevitable
00:56:51 that we humans eventually made it.
00:56:56 You’re also implying one thing here.
00:56:59 You’re implying that humans
00:57:01 also don’t have this longevity.
00:57:03 This is the interesting question.
00:57:05 So with the Fermi Paradox, the basic question
00:57:09 is: if the universe has a lot of alien life forms in it,
00:57:14 why haven’t we seen them?
00:57:16 And one thought is that there’s a great filter
00:57:19 or multiple great filters that basically would destroy
00:57:23 intelligent civilizations.
00:57:24 Like this thing that we, you know, this multifolding brain
00:57:28 that keeps growing may not be such a big feature.
00:57:32 It might be useful for survival,
00:57:34 but it like takes us down a side road
00:57:38 that is a very short one with a quick dead end.
00:57:41 What do you think about that?
00:57:42 So I think the universe is enormous,
00:57:45 not just in space, but also in time.
00:57:50 And the pretense that, you know, the last blink
00:57:55 of an instant that we’ve been able to send radio waves
00:57:58 is when somebody should have been paying attention
00:58:00 to our planet is a little ridiculous.
00:58:03 So my, you know, what I love about Star Wars
00:58:07 is a long, long time ago in a galaxy far, far away.
00:58:10 It’s not like some distant future.
00:58:11 It’s a long, long time ago.
00:58:13 What I love about it is that basically says,
00:58:15 you know, evolution and civilization
00:58:19 are just so recent in, you know, on earth.
00:58:23 Like there’s countless other planets
00:58:25 that have probably all kinds of life forms,
00:58:27 multicellular perhaps, and so on and so forth.
00:58:31 But the fact that humanity has only been listening
00:58:35 and emitting for just this tiny little blink
00:58:39 means that any of these, you know, alien civilizations
00:58:42 would need to be paying attention
00:58:44 to every single insignificant planet out there.
00:58:47 And, you know, again, I mean, the movie Contact
00:58:50 and the book is just so beautiful.
00:58:52 This whole concept of we don’t need to travel physically.
00:58:56 We can travel as light.
00:58:57 We can send instructions for people to create machines
00:59:01 that will allow us to beam down light
00:59:03 and recreate ourselves.
00:59:04 And in the book, you know, the aliens actually take over.
00:59:07 They’re not as friendly.
00:59:09 But, you know, this concept that we have to eventually
00:59:13 go and conquer every planet.
00:59:14 I mean, I think that, yes,
00:59:16 we will become a galactic species.
00:59:18 So you have a hope, well, you said think, so.
00:59:22 Oh, of course, of course.
00:59:23 I mean, now that we’ve made it so far.
00:59:25 So you feel like we’ve made it.
00:59:27 Oh gosh, I feel that, you know, cognition,
00:59:30 the cognition as an evolutionary trait
00:59:31 has won over in our planet.
00:59:33 There’s no doubt that we’ve made it.
00:59:35 So basically humans have won the battle for, you know,
00:59:40 dominance.
00:59:41 It wasn’t necessarily the case with dinosaurs.
00:59:43 Like, I mean, yes, you know,
00:59:46 there’s some claims of intelligence.
00:59:50 And if you look at Jurassic Park, yeah, sure, whatever.
00:59:53 But, you know, they just don’t have the hardware for it.
00:59:58 And humans have the hardware.
00:59:59 There’s no doubt that mammals have
01:00:01 a dramatically improved hardware
01:00:03 for cognition over dinosaurs.
01:00:06 Like basically there’s universes where strength won out.
01:00:09 And in our planet, in our, you know,
01:00:11 particular version of whatever happened in this planet,
01:00:15 cognition won out.
01:00:16 And it’s kind of cool.
01:00:18 I mean, it’s a privilege, right?
01:00:20 It’s kind of like living in Boston instead of,
01:00:22 I don’t know, some middle age place
01:00:26 where everybody’s like hitting each other
01:00:28 with, you know, weapons and sticks.
01:00:31 You’re back to the Lucky Ones song.
01:00:33 I mean, we are the lucky ones.
01:00:36 But the flip side of that is that this hardware
01:00:39 also allows us to develop weapons
01:00:41 and methods of destroying ourselves.
01:00:43 Again, I want to go back to Pinker
01:00:46 and the better angels of our nature.
01:00:49 The whole concept that civilization
01:00:53 and the act of civilizing
01:00:57 has dramatically reduced violence, dramatically.
01:01:02 If you look, you know, at every scale,
01:01:05 as soon as organization comes,
01:01:07 the state basically owns the right to violence.
01:01:12 And eventually the state gives that right
01:01:17 of governance to the people,
01:01:20 but violence has been eliminated by that state.
01:01:23 So this whole concept of central governance
01:01:27 and people agreeing to live together
01:01:29 and share responsibilities and duties
01:01:33 and, you know, all of that
01:01:36 is something that has led so much to less violence,
01:01:41 less death, less suffering, less, you know, poverty,
01:01:45 less, you know, war.
01:01:48 I mean, yes, we have the capability to destroy ourselves,
01:01:53 but the arc of civilization
01:01:56 has led to much, much less destruction,
01:01:58 much, much less war and much more peace.
01:02:00 And of course there’s blips back and forth
01:02:04 and, you know, there are setbacks,
01:02:06 but again, the moral arc of the universe.
01:02:10 But it seems to just, I probably imagine
01:02:13 there were two dinosaurs back in the day
01:02:14 having this exact conversation
01:02:17 and they look up to the sky
01:02:18 and there seems to be something like an asteroid
01:02:21 going towards Earth.
01:02:22 So it’s, while it’s very true
01:02:24 that the arc of our society of human civilization
01:02:28 seems to be progressing towards a better, better life
01:02:31 for everybody in the many ways that you described,
01:02:36 things can change in a moment.
01:02:39 And it feels like it’s not just us humans
01:02:41 we’re living through a pandemic.
01:02:44 You could imagine that a pandemic would be more destructive
01:02:47 or there could be asteroids that just appear
01:02:52 out of the darkness of space,
01:02:54 which I recently learned it’s not that easy
01:02:57 to actually detect them.
01:02:59 Yes.
01:03:00 So 48, what happens in 48 years?
01:03:04 I’m not sure.
01:03:05 2068, Apophis.
01:03:07 There’s an asteroid that’s coming.
01:03:09 In 48 years, it has very high chance
01:03:11 of actually wiping us out completely.
01:03:13 Yes.
01:03:14 Yes.
01:03:15 So we have 48 years to get our act together.
01:03:18 It’s not like some distant, distant hypothesis.
01:03:21 Yes.
01:03:21 Like, yeah, sure, they’re hard to detect
01:03:23 but this one we know about, it’s coming.
01:03:25 So how do you feel about that?
01:03:26 Why are you still so optimistic?
01:03:27 Oh gosh, I’m so happy with where we are now.
01:03:29 This is gonna be great.
01:03:30 Seriously, if you look at progress,
01:03:32 if you look at, again, the speed with which knowledge
01:03:36 has been transferred, what has led to humanity
01:03:40 making so many advances so fast?
01:03:43 Okay.
01:03:44 So what has led to humanity making so many advances
01:03:47 is not just the hardware upgrades,
01:03:50 it’s also the software upgrades.
01:03:52 So by hardware upgrades, I basically mean our neocortex
01:03:55 and the expansion and these layers
01:03:57 and folds of our brain and all of that.
01:04:00 That’s the hardware.
01:04:01 The software hasn’t,
01:04:03 you know, the hardware hasn’t changed much
01:04:06 in the last, what, 70,000 years.
01:04:10 As I mentioned last time,
01:04:12 if you take a person from ancient Egypt
01:04:13 and you bring them up now, they’re just as fit.
01:04:18 So hardware hasn’t changed.
01:04:20 What has changed is software.
01:04:22 What has changed is that we are growing up in societies
01:04:28 that are much more complex.
01:04:30 This whole concept of neoteny basically allows
01:04:33 our exponential growth.
01:04:35 The concept that our brain has not fully formed,
01:04:39 has not fully stabilized itself until after our teenage years.
01:04:43 So we basically have a good 16 years, 18 years
01:04:46 to sort of infuse it with the latest
01:04:48 and greatest in software.
01:04:51 If you look at what happened in ancient Greece,
01:04:53 why did everything explode at once?
01:04:57 My take on this is that it was the shift
01:04:59 from the Egyptian and hieroglyphic software
01:05:03 to the Greek language software.
01:05:06 This whole concept of creating abstract notions,
01:05:10 of creating these layers of cognition
01:05:16 and layers of meaning and layers of abstraction
01:05:19 for words and ideals and beauty and harmony.
01:05:24 How do you write harmony in hieroglyphics?
01:05:26 There’s no such thing as, you know,
01:05:28 sort of expressing these ideals of peace and justice
01:05:31 and, you know, these concepts of,
01:05:34 or even, you know, macabre concepts of doom, et cetera.
01:05:39 You don’t have the language for it.
01:05:42 Your brain has trouble getting at that concept.
01:05:48 So what I’m trying to say is that these software upgrades
01:05:53 for human language, human culture,
01:05:57 human environment, human education
01:06:00 have basically led to this enormous explosion of knowledge.
01:06:04 And eventually after the enlightenment,
01:06:07 and as I was mentioning the 42 line Bible
01:06:11 and the printed press, the dissemination of knowledge,
01:06:13 you basically now have this whole horizontal dispersion
01:06:17 of ideas in addition to the vertical inheritance of genes.
01:06:21 So the hardware improvements happen
01:06:24 through vertical inheritance.
01:06:25 The software improvements happen
01:06:27 through horizontal inheritance.
01:06:28 And the reason why human civilization exploded
01:06:31 is not a hardware change anymore,
01:06:32 it’s really a software change.
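As an aside, here is a minimal toy sketch of that vertical-versus-horizontal contrast. The population size, offspring count, and sharing fan-out are purely illustrative assumptions, not figures from the conversation; the point is only the asymmetry of the two growth laws.

```python
import math

# Toy contrast between vertical (parent-to-child) and horizontal
# (peer-to-peer) transmission of a trait or idea.
# All numbers are illustrative assumptions.

POPULATION = 1_000_000

def generations_vertical(pop: int, offspring: int = 2) -> int:
    """Generations for a trait starting in one individual to reach
    everyone, if it passes only from parent to offspring."""
    return math.ceil(math.log(pop, offspring))

def rounds_horizontal(pop: int, fanout: int = 10) -> int:
    """Rounds for an idea to reach everyone, if every carrier
    shares it with `fanout` new peers each round."""
    return math.ceil(math.log(pop, fanout + 1))

print(generations_vertical(POPULATION))  # 20 generations (centuries)
print(rounds_horizontal(POPULATION))     # 6 rounds (days, on a network)
```

Same exponential mechanism in both cases, but ideas get a fan-out that genes cannot match.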
01:06:34 So if you’re looking at now where we are today,
01:06:39 look at coronavirus.
01:06:40 Yes, sure, it could have killed us a hundred years ago,
01:06:43 it would have, but it didn’t.
01:06:45 Why?
01:06:46 Because in January, we published the genome.
01:06:51 A month later, less than a month later,
01:06:53 the first vaccine designs were done.
01:06:54 And now less than a year later, 10 months later,
01:06:58 we already have a working vaccine that’s 90% effective.
01:07:01 I mean, that is ridiculous by any standards.
01:07:03 And the reason is sharing.
01:07:06 So the asteroid, yes, could wipe us out in 48 years,
01:07:09 but 48 years?
01:07:11 I mean, look at where we were 48 years ago, technologically.
01:07:16 I mean, how much more we understand
01:07:18 of the basic foundations of space is enormous.
01:07:23 The technological revolutions of digitization,
01:07:27 the amount of compute power we can put
01:07:29 on any nail-sized piece of hardware is enormous.
01:07:37 And this is nowhere near ending.
01:07:40 We all have our little problems going back and forth
01:07:43 on the social side and on the political side,
01:07:46 on the sort of human side and the societal side,
01:07:50 but science has not slowed down.
01:07:54 Science is moving at a breakneck pace ahead.
01:07:57 So, you know, Elon is now putting rockets out
01:08:00 from the private space.
01:08:01 I mean, that now democratization of space exploration
01:08:06 is, you know, gonna revolutionize everything.
01:08:07 It’s gonna continue to explode.
01:08:09 In the same way that every technology has exploded,
01:08:12 this is the shift to space technology exploding.
01:08:15 So 48 years is infinity from now
01:08:19 in terms of space capabilities.
01:08:21 So I’m not worried at all.
01:08:22 Are you excited by the possibility of a human,
01:08:25 well, one, a human stepping foot on Mars
01:08:28 and two, possible colonization of not necessarily Mars,
01:08:33 but other planets and all that kind of stuff
01:08:34 for people living in space?
01:08:36 Inevitable.
01:08:37 Inevitable. Inevitable.
01:08:38 Would you do it?
01:08:39 Or do you kind of like Earth? Of course, of course.
01:08:41 You know, how many?
01:08:42 How many people will you wait for?
01:08:44 I think it was about,
01:08:46 when the Declaration of Independence was signed,
01:08:50 about two to three million people lived here.
01:08:52 So would you move before that?
01:08:54 Would you be like on the first boat?
01:08:57 Would you be on the 10th boat?
01:08:58 Would you wait until the Declaration of Independence?
01:09:00 I don’t think I’ll be on the short list
01:09:02 because I’ll be old by then.
01:09:04 They’ll probably get a bunch of younger people.
01:09:06 But you have the wisdom and,
01:09:11 then again, you are the lucky one.
01:09:12 But wisdom can be transferred horizontally.
01:09:13 I gotta tell you, you are the lucky one.
01:09:15 So you might be on the list.
01:09:16 I don’t know.
01:09:17 I mean, I kind of feel like I would love
01:09:20 to see Earth from above, just to watch our planet.
01:09:23 I mean, just, I mean, you know,
01:09:25 you can watch a live feed of the space station.
01:09:29 Watching Earth is magnificent,
01:09:32 like this blue tiny little shield.
01:09:35 It’s so thin, our atmosphere.
01:09:38 Like if you drive to New York,
01:09:39 you’re basically in outer space.
01:09:40 I mean, it’s ridiculous.
01:09:41 It’s just so thin.
01:09:42 And it’s just, again, such a privilege
01:09:45 to be on this planet, such a privilege.
01:09:47 But I think our species is in for big, good things.
01:09:54 I think that, you know,
01:09:56 we will overcome our little problems
01:09:58 and eventually come together as a species.
01:10:01 I feel that we’re definitely on the path to that.
01:10:04 And, you know, it’s just not permeated
01:10:07 through the whole universe yet.
01:10:09 I mean, through the whole world yet,
01:10:11 through the whole Earth yet,
01:10:12 but it’s definitely permeating.
01:10:15 So you’ve talked about humans as special.
01:10:18 How exactly are we special relative to the dinosaurs?
01:10:24 So I mentioned that there’s, you know,
01:10:27 this dramatic cognitive improvement that we’ve made,
01:10:31 but I think it goes much deeper than that.
01:10:34 So if you look at a lion attacking a gazelle
01:10:38 in the middle of the Serengeti,
01:10:40 the lion is smelling the molecules in the environment.
01:10:45 Its hormones and neuro receptors
01:10:51 are sort of getting it ready for impulse.
01:10:54 The target is constantly looking around and sensing.
01:10:58 I’ve actually been in Kenya and I’ve kind of seen the hunt.
01:11:02 So I’ve kind of seen the sort of game of waiting
01:11:07 and the mitochondria in the muscles of the lion
01:11:13 are basically ready for, you know, jumping.
01:11:18 They’re expending an enormous amount of energy.
01:11:21 The grass as it’s flowing
01:11:23 is constantly transforming solar energy,
01:11:29 you know, through the chloroplasts, into chemical energy,
01:11:32 which eventually feeds the gazelle
01:11:34 and eventually feeds the lions.
01:11:36 And so on and so forth.
01:11:37 So as humans, we experience all of that,
01:11:44 but the lion only experiences one layer.
01:11:49 The mitochondria in its body
01:11:51 are only experiencing one layer.
01:11:52 The chloroplasts are only experiencing one layer.
01:11:55 The, you know, photoreceptors and the smell receptors
01:11:59 and the chemical receptors,
01:12:00 like the lion always attacks against the wind
01:12:02 so that it’s not smelled.
01:12:04 Like all of these things are one layer at a time.
01:12:11 And we humans somehow perceive the whole stack.
01:12:14 So going back to software infrastructure
01:12:17 and hardware infrastructure,
01:12:18 if you design a computer,
01:12:20 you basically have a physical layer that you start with.
01:12:23 And then on top of that physical layer,
01:12:24 you have, you know, the electrical layer.
01:12:27 And on top of the electrical layer,
01:12:28 you have basically gates and logic and an assembly layer.
01:12:32 And on top of the assembly layer,
01:12:33 you have your, you know, higher order,
01:12:36 higher level programming.
01:12:37 And on top of that,
01:12:38 you have your deep learning routine, et cetera.
01:12:40 And on top of that,
01:12:41 you eventually build a cognitive system that’s smart.
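As a minimal sketch of that stack, with hypothetical layer names and toy behaviors, nothing here is a real system: each layer exposes only its own abstraction, so the top layer computes logic without ever "seeing" the voltages underneath, which is exactly the one-layer-at-a-time experience being described.

```python
# A toy stack: each layer "experiences" only the interface of the
# layer directly beneath it. Layer names and behaviors are illustrative.

class PhysicalLayer:
    def voltage(self) -> float:
        return 1.0  # stand-in for transistor physics

class LogicLayer:
    def __init__(self, physical: PhysicalLayer):
        self._p = physical

    def nand(self, a: bool, b: bool) -> bool:
        # Built on the physical layer, but exposes only boolean logic.
        return not (a and b and self._p.voltage() > 0.5)

class SoftwareLayer:
    def __init__(self, logic: LogicLayer):
        self._l = logic

    def xor(self, a: bool, b: bool) -> bool:
        # XOR assembled purely out of NAND gates from the layer below.
        n = self._l.nand
        ab = n(a, b)
        return n(n(a, ab), n(b, ab))

stack = SoftwareLayer(LogicLayer(PhysicalLayer()))
print(stack.xor(True, False))  # True; the top layer never sees a voltage
```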
01:12:46 I want you to now picture this cognitive system
01:12:49 becoming not just self aware,
01:12:53 but also becoming aware of the hardware that it’s made of
01:12:57 and the atoms that it’s made of and so on and so forth.
01:13:01 So it’s as if your AI system,
01:13:03 and there’s this beautiful scene in 2001 Odyssey of Space,
01:13:08 where HAL, after Dave starts disconnecting him,
01:13:13 is starting to sing a song about daisies, et cetera.
01:13:17 And HAL is basically saying, Dave, I’m losing my mind.
01:13:24 I can feel I’m losing my mind.
01:13:26 It’s just so beautiful.
01:13:28 This concept of self awareness of knowing
01:13:31 that the hardware is no longer there is amazing.
01:13:35 And in the same way humans who have had accidents
01:13:39 are aware that they’ve had accidents.
01:13:42 So there’s this self awareness of AI
01:13:45 that is, you know, this beautiful concept about,
01:13:49 you know, sort of the eventual cognitive leap
01:13:52 to self awareness.
01:13:54 But imagine now the AI system
01:13:57 actually breaking through these layers
01:13:58 and saying, wait a minute,
01:13:59 I think I can design a slightly better hardware
01:14:01 to get me functioning better.
01:14:03 And that’s what basically humans are doing.
01:14:05 So if you look at our reasoning layer,
01:14:08 it’s built on top of a cognitive layer.
01:14:11 And the reasoning layer we share with AI,
01:14:13 it’s kind of cool.
01:14:14 Like there is another thing on the planet
01:14:16 that can integrate equations and it’s manmade,
01:14:19 but we share computation with them.
01:14:22 We share this cognitive layer of playing chess.
01:14:24 We’re not alone anymore.
01:14:26 We’re not the only thing on the planet that plays chess.
01:14:28 Now we have AI that also plays chess.
01:14:31 But in some sense, that particular organism,
01:14:33 AI as it is now, only operates in that layer.
01:14:36 Exactly.
01:14:37 Exactly.
01:14:38 And then most animals operate
01:14:40 in the sort of cognitive layer that we’re all experiencing.
01:14:43 A bat is doing this incredible integration of signals,
01:14:48 but it’s not aware of it.
01:14:50 It’s basically constantly sending echo location waves
01:14:55 and it’s receiving them back.
01:14:56 And multiple bats in the same cave
01:14:58 are operating at slightly different frequencies
01:15:01 and with slightly different pulses.
01:15:03 And they’re all sensing objects
01:15:04 and they’re doing motion planning
01:15:07 in their cognitive hardware,
01:15:08 but they’re not even aware of all of that.
01:15:10 All they know is that they have a 3D view of space
01:15:13 around them, just like any gazelle walking through,
01:15:17 you know, the desert.
01:15:18 And any baby looking around is aware of things
01:15:23 without doing the math of how am I processing
01:15:26 all of these visual information, et cetera.
01:15:28 You’re just aware of the layer that you live in.
01:15:31 I think if you look at this, at humanity,
01:15:35 we’ve basically managed through our cognitive layer,
01:15:38 through our perception layer, through our senses layer,
01:15:41 through our multi organ layer, through our genetic layer,
01:15:46 through our molecular layer, through our atomic layer,
01:15:51 through our quantum layer,
01:15:54 through even the very fabric of the space time continuum
01:15:58 to unite all of that cognitively.
01:16:00 So as we’re watching that scene in the Serengeti,
01:16:04 we as scientists, we as educated humans,
01:16:07 we as, you know, anyone who’s finished high school
01:16:09 are aware of all of this beauty
01:16:12 of all of these different layers interplaying together.
01:16:14 And I think that’s something very unique
01:16:16 in perhaps not just the galaxy, but maybe even the cosmos.
01:16:20 This species that has managed, in space,
01:16:25 to cross through these layers from the enormous
01:16:29 to the infinitely small.
01:16:30 And that’s what I love about particle physics.
01:16:33 The fact that it actually unites everything.
01:16:35 The very small and the very big.
01:16:36 The very small and the very big.
01:16:38 It’s only through the very big
01:16:39 that the very small gets formed.
01:16:41 Like basically every atom of gold
01:16:44 results from the fusion that happened
01:16:47 of increasingly large particles before that explosion
01:16:51 that then disperses it through the cosmos.
01:16:53 And it’s only through understanding the very large
01:16:57 that we understand the very small and vice versa.
01:16:59 And that’s in space.
01:17:01 Then there’s the time direction.
01:17:04 As you are watching the Kilimanjaro mountain,
01:17:08 you can kind of look back through time
01:17:11 to when that volcano was exploding
01:17:14 and growing out of the tectonic forces.
01:17:17 As you drive through Death Valley,
01:17:20 you see these mountains on their side
01:17:23 and these layers of history exposed.
01:17:28 We are aware of the eons that have happened on earth
01:17:34 and the tectonic movements on earth.
01:17:37 The same way that we’re aware of the Big Bang
01:17:41 and the early evolution of the cosmos.
01:17:44 And we can also see forward in time
01:17:46 as to where the universe is heading.
01:17:48 We can see Apophis in 2068 coming over,
01:17:53 looking ahead in time.
01:17:54 I mean, that would be magician stuff in ancient times.
01:17:58 So what I love about humanity and its role in the universe
01:18:02 is that if there’s a God watching,
01:18:05 he’s like, finally, somebody figured it out.
01:18:08 I’ve been building all these beautiful things
01:18:10 and somebody can appreciate it.
01:18:11 And figured me out from God’s perspective,
01:18:13 meaning become aware of, you know.
01:18:16 Yeah, so it’s kind of interesting
01:18:18 to think of the world in this way as layers
01:18:21 and us humans are able to convert those layers
01:18:25 into ideas that you can then combine, right?
01:18:29 So we’re doing some kind of conversion.
01:18:32 Exactly, exactly.
01:18:33 And last time you asked me about
01:18:35 whether we live in a simulation, for example.
01:18:37 I mean, realize that we are living in a simulation.
01:18:42 We are, the reality that we’re in
01:18:45 without any sort of person programming this is a simulation.
01:18:48 Like basically what happens inside your skull?
01:18:51 There’s this integration of sensory inputs
01:18:55 which are translated into perceptory signals,
01:18:57 which are then translated into a conceptual model
01:19:00 of the world around you.
01:19:02 And that exercise is happening seamlessly.
01:19:06 And yet, you know, if you think about sort of, again,
01:19:11 this whole simulation and Neo analogy,
01:19:15 you can think of the reality that we live in as a matrix,
01:19:18 as the matrix, but we’ve actually broken through the matrix.
01:19:22 We’ve actually traversed the layers.
01:19:23 We didn’t have to take a pill, like we didn’t, you know,
01:19:27 Morpheus didn’t have to show up
01:19:29 to basically give us the blue pill or the red pill.
01:19:31 We were able to sufficiently evolve cognitively
01:19:35 through the hardware explosion
01:19:37 and sufficiently evolve scientifically
01:19:41 through the software explosion
01:19:43 to basically get at breaking through the matrix,
01:19:45 realizing that we live in a matrix
01:19:47 and realizing that we are this thing in there.
01:19:51 And yet that thing in there has a consciousness
01:19:53 that lives through all these layers.
01:19:57 And I think we’re the only species.
01:19:58 We’re the only thing that we even can think of
01:20:00 that has actually done that,
01:20:01 that has sort of permeated space and time scales
01:20:09 and layers of abstraction plowing through them
01:20:13 and realizing what we’re really, really made of.
01:20:16 And the next frontier is of course, cognition.
01:20:20 So we understand so much of the cosmos,
01:20:22 so much of the stuff around us,
01:20:24 but the stuff inside here, finding the basis for the soul,
01:20:28 finding the basis for the ego, for the self,
01:20:31 the self awareness, when does the spark happen
01:20:35 that basically sort of makes you you?
01:20:38 I mean, that’s really the next frontier.
01:20:41 So in terms of these peeling off layers of complexity,
01:20:44 somewhere between the cognitive layer
01:20:47 and the reasoning layer or the computational layer,
01:20:52 there’s still some stuff to be figured out there.
01:20:54 And I think that’s the final frontier
01:20:56 of sort of completing our journey through that matrix.
01:20:59 And maybe duplicating it in other versions of ourselves
01:21:03 through AI, which is another very exciting possibility.
01:21:08 What I love about AI and the way that it operates right now
01:21:12 is the fact that it is unpredictable.
01:21:16 There’s emergent behavior
01:21:18 in our cognitively capable artificial systems
01:21:23 that we can certainly model,
01:21:26 but we don’t encode directly.
01:21:30 And that’s a key difference.
01:21:32 So we like to say, oh, of course,
01:21:35 this is not really intelligent because we coded it up.
01:21:38 And we’ve just put in these little parameters there
01:21:41 and there’s like six billion parameters
01:21:43 and once you’ve learned them,
01:21:45 we kind of understand the layers.
01:21:48 But that’s an oversimplification.
01:21:50 It’s like saying, oh, of course, humans,
01:21:53 we understand humans, they’re just made out of neurons
01:21:55 and layers of cortex and there’s a visual area.
01:22:01 But every human is encoded
01:22:04 by a ridiculously small number of genes
01:22:06 compared to the complexity of our cognitive apparatus.
01:22:09 20,000 genes is really not that much
01:22:11 out of which a tiny little fraction
01:22:13 are in fact encoding all of our cognitive functions.
01:22:16 The rest is emergent behavior.
01:22:19 The rest is the cortical layers
01:22:24 doing their thing in the same way
01:22:27 that when we build these conversational systems
01:22:30 or these cognitive systems or these deep learning systems,
01:22:34 we put the architecture in place,
01:22:36 but then they do their thing.
01:22:37 And in some ways,
01:22:38 that’s creating something that has its own identity.
01:22:41 That’s creating something that’s not just,
01:22:43 oh yeah, it’s not the early AI
01:22:45 where if you hadn’t programmed
01:22:47 what happens in the grocery bags
01:22:49 when you have both cold and hot and hard and soft,
01:22:52 the system wouldn’t know what to do.
01:22:54 No, no, you basically now just program the primitives
01:22:57 and then it learns from that.
01:22:59 So even though the origins are humble,
01:23:01 just like it is for our genetic code,
01:23:03 for AI, even though the origins are humble,
01:23:05 the result of it being deployed into the world
01:23:11 is infinitely complex.
01:23:16 And yet, it’s not yet able to be cognizant
01:23:21 of all the other layers of its,
01:23:25 you know, it’s not able to think about space and time.
01:23:33 It’s not able to think about the hardware in which it runs,
01:23:35 the electricity in which it runs yet.
01:23:38 So if you look at humans,
01:23:41 we basically have the same cognitive architecture
01:23:43 as monkeys, as the great apes.
01:23:45 It’s just a ton more of it.
01:23:48 If you look at GPT3 versus GPT2,
01:23:52 again, it’s the same architecture, just more of it.
01:23:55 And yet it’s able to do so much more.
01:23:58 So if you start thinking about sort of
01:23:59 what’s the future of that, GPT4 and GPT5,
01:24:03 do you really need fundamentally different architectures
01:24:05 or do you just need a ton more hardware?
01:24:07 And we do have a ton more hardware.
01:24:10 Like these systems are nowhere near
01:24:12 what humans have between our ears.
01:24:15 So, you know, there’s something to be said
01:24:19 about stay tuned for emergent behavior.
01:24:22 We keep thinking that general intelligence
01:24:24 might just be forever away,
01:24:27 but it could just simply be that
01:24:29 we just need a ton more hardware
01:24:31 and that humans are just not that different
01:24:33 from the great apes, except for just a ton more of it.
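For what it’s worth, the published numbers are consistent with that "same architecture, just more of it" claim: the largest GPT-2 and GPT-3 are the same decoder-only transformer recipe at different scale. A small sketch using the reported depth and width, with the standard rough rule of thumb of about 12 x layers x d_model^2 parameters per model, an approximation, not an exact count:

```python
# "Same architecture, just more of it": reported shapes of the largest
# GPT-2 and GPT-3 models, plus a standard rough parameter-count rule.

def approx_params(n_layers: int, d_model: int) -> float:
    # ~12 * L * d^2 counts attention + MLP weights per transformer block.
    return 12 * n_layers * d_model ** 2

models = {
    "GPT-2": (48, 1600),    # 48 layers, width 1600  -> ~1.5B reported
    "GPT-3": (96, 12288),   # 96 layers, width 12288 -> ~175B reported
}

for name, (layers, width) in models.items():
    print(f"{name}: ~{approx_params(layers, width) / 1e9:.1f}B parameters")
# GPT-2: ~1.5B parameters
# GPT-3: ~173.9B parameters
```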
01:24:37 Yeah, it’s interesting that in the AI community,
01:24:41 maybe there’s a human centric fear,
01:24:43 but the notion that GPT10 will achieve general intelligence
01:24:49 is something that people shy away from,
01:24:51 that there has to be something totally different
01:24:54 and new added to this.
01:24:56 And yet it’s not seriously considered that
01:25:01 this very simple thing, this very simple architecture,
01:25:05 when scaled, might be the thing
01:25:07 that achieves super intelligence.
01:25:09 And people think the same way about humanity
01:25:11 and human consciousness.
01:25:13 They’re like, oh, consciousness might be quantum,
01:25:15 or it might be, you know, some nonphysical thing.
01:25:18 And it’s like, or it could just be a lot more
01:25:21 of the same hardware that now is sufficiently capable
01:25:25 of self awareness just because it has the neurons to do it.
01:25:29 So maybe the consciousness that is so elusive
01:25:33 is an emergent behavior of you basically string together
01:25:38 all these cognitive capabilities that come from running,
01:25:41 from seeing, for reacting,
01:25:43 from predicting the movement of a fly
01:25:45 as you’re catching it through the air.
01:25:47 All of these things are just like great lookup tables
01:25:49 encoded in a giant neural network.
01:25:51 I mean, I’m oversimplifying, of course,
01:25:53 the complexity and the diversity of the different types
01:25:55 of excitatory and inhibitory neurons,
01:25:57 the wave forms that sort of shine through
01:26:01 the connections across all these different layers,
01:26:04 the amalgamation of signals, et cetera.
01:26:06 The brain is enormously complex.
01:26:08 I mean, of course.
01:26:09 But again, it’s a small number of primitives
01:26:11 encoded by a tiny number of genes,
01:26:14 which are self organized and shaped by their environment.
01:26:21 Babies that are growing up today
01:26:23 are listening to language from conception.
01:26:28 Basically, as soon as the auditory apparatus forms,
01:26:32 it’s already getting shaped to the types of signals
01:26:35 that are out in the real world today.
01:26:37 So it’s not just like, oh, have an Egyptian be born
01:26:39 and then ship them over.
01:26:40 It’s like, no, that Egyptian would be listening in
01:26:44 to the complexity of the world and then getting born
01:26:46 and sort of seeing just how much more complex the world is.
01:26:49 So it’s a combination of the underlying hardware,
01:26:53 which if you think about as a geneticist,
01:26:57 in my view, the hardware gives you an upper bound
01:27:00 of cognitive capabilities,
01:27:02 but it’s the environment that makes those capabilities shine
01:27:05 and reach their maximum.
01:27:06 So we’re a combination of nature and nurture.
01:27:11 The nature is our genes and our cognitive apparatus.
01:27:15 And the nurture is the richness of the environment
01:27:18 that makes that cognitive apparatus reach its potential.
01:27:22 And we are so far from reaching our full potential, so far.
01:27:27 I think that kids being born a hundred years from now,
01:27:31 they’ll be looking at us now and saying
01:27:33 what primitive educational systems they had.
01:27:36 I can’t believe people were not wired
01:27:38 into this virtual reality from birth as we are now,
01:27:42 cause like they’re clearly inferior and so on and so forth.
01:27:46 I basically think that our environment
01:27:48 will continue exploding and our cognitive capabilities,
01:27:53 it’s not like, oh, we’re only using 10% of our brain.
01:27:55 That’s ridiculous.
01:27:56 Of course, we’re using 100% of our brain,
01:27:57 but it’s still constrained by how complex
01:28:00 our environment is.
01:28:02 So the hardware will remain the same, but the software,
01:28:06 in a quickly advancing environment,
01:28:08 the software will make a huge difference
01:28:10 in the nature of like the human experience,
01:28:14 the human condition.
01:28:15 It’s fascinating to think that humans will look
01:28:17 very different a hundred years from now,
01:28:19 just because the environment changed,
01:28:20 even though we’re still the same great apes,
01:28:22 the descendant of apes.
01:28:25 At the core of this is kind of a notion of ideas.
01:28:28 I don’t know,
01:28:31 there’s a lot of people, including you,
01:28:33 who have spoken eloquently about this topic,
01:28:34 but Richard Dawkins talks about the notion of memes,
01:28:39 and let’s say this notion of ideas
01:28:45 multiplying, selecting in the minds of humans.
01:28:49 Do you ever think about ideas from that perspective,
01:28:52 ideas as organisms themselves
01:28:54 that are breeding in the minds of humans?
01:28:57 I love the concept of memes.
01:28:59 I love the concept of these horizontal transfer of ideas
01:29:03 and sort of permeating through our layer
01:29:08 of interconnected neural networks.
01:29:11 So you can think of sort of the cognitive space
01:29:15 that has now connected all of humanity,
01:29:18 where we are now one giant information
01:29:22 and idea sharing network,
01:29:24 well beyond what was thought to be ever capable
01:29:28 when the concept of a meme was created by Richard Dawkins.
01:29:32 So, but I wanna take that concept
01:29:35 just into another twist,
01:29:39 which is the horizontal transfer of humans with fellowships.
01:29:47 And the fact that as people apply to MIT
01:29:51 from around the world,
01:29:53 there’s a selection that happens,
01:29:55 not just for their ideas,
01:29:58 but also for the cognitive hardware
01:30:00 that came up with those ideas.
01:30:02 So we don’t just ship ideas around anymore.
01:30:05 They don’t evolve in a vacuum.
01:30:07 The ideas themselves influence the distribution
01:30:11 of cognitive systems, i.e. humans and brains
01:30:14 around the planet.
01:30:15 Yeah, we ship them to different locations
01:30:17 based on their properties.
01:30:18 That’s exactly right.
01:30:19 So those cognitive systems that think of physics,
01:30:24 for example, might go to CERN
01:30:25 and those that think of genomics
01:30:28 might go to the Broad Institute.
01:30:30 And those that think of computer science
01:30:32 might go to, I don’t know, Stanford or CMU or MIT.
01:30:35 And you basically have this co evolution now
01:30:38 of memes and ideas
01:30:40 and the cognitive conversational systems
01:30:44 that love these ideas and feed on these ideas
01:30:47 and understand these ideas and appreciate these ideas
01:30:50 now coming together.
01:30:52 So you basically have students coming to Boston to study
01:30:56 because that’s the place
01:30:56 where these types of cognitive systems thrive.
01:31:01 And they’re selected based on their cognitive output
01:31:05 and their idea output.
01:31:08 But once they get into that place,
01:31:10 the boiling and interbreeding of these memes
01:31:15 becomes so much more frequent.
01:31:17 That what comes out of it is so far beyond
01:31:21 what you’d get if ideas were evolving in a vacuum
01:31:23 on an already established hardware,
01:31:25 the cognitive interconnection system of the planet,
01:31:28 whereas now you basically have the ideas
01:31:32 shaping the distribution of these systems.
01:31:35 And then the genetics kick in as well.
01:31:37 You basically have now these people
01:31:40 who came to be a student kind of like myself
01:31:42 who now stuck around and are now professors
01:31:45 bringing up our own genetically encoded
01:31:49 and genetically related cognitive systems,
01:31:53 mine are eight, six and three years old,
01:31:56 who are now growing up in an environment
01:31:58 surrounded by other cognitive systems of a similar age
01:32:02 with parents who love these types of thinking and ideas.
01:32:06 And you basically have a whole interbreeding now
01:32:09 of genetically selected transfer of cognitive systems
01:32:14 where the genes and the memes are coevolving
01:32:19 in the same soup of ever improving knowledge
01:32:23 and societal inter fertilization,
01:32:28 cross fertilization of these ideas.
01:32:29 So this beautiful image.
01:32:32 So this is shipping these actual meat cognitive systems
01:32:36 to physical locations.
01:32:37 They tend to cluster in the biology ones,
01:32:41 and the biology ones cluster in a certain building too.
01:32:45 So like within that there’s clusters on top of clusters,
01:32:49 top of clusters.
01:32:50 What about in the online world?
01:32:52 Is that, do you also see that kind of,
01:32:55 because people now form groups on the internet
01:32:58 that they stick together so they can sort of,
01:33:03 these cognitive systems can collect themselves
01:33:08 and breed together in different layers of spaces.
01:33:14 It doesn’t just have to be physical space.
01:33:15 Absolutely, absolutely.
01:33:17 So basically there’s the physical rearrangement,
01:33:19 but there’s also the conglomeration
01:33:21 of the same cognitive system,
01:33:24 i.e. a human, which
01:33:26 doesn’t need to belong to only one community.
01:33:29 So yes, you might be a member
01:33:30 of the computer science department,
01:33:31 but you can also hang out in the biology department.
01:33:33 But you might also go online into,
01:33:35 I don’t know, poetry department readings
01:33:37 and so on and so forth.
01:33:38 Or you might be part of a group
01:33:40 that only has 12 people in the world,
01:33:42 but that are connected through their ideas
01:33:45 and are now interbreeding these ideas in a whole other way.
01:33:49 So this coevolution of genes and memes
01:33:53 is not just physically instantiated.
01:33:55 It’s also sort of rearranged in this cognitive space as well.
01:34:01 And sometimes these cognitive systems hold conferences
01:34:05 and they all gather around
01:34:06 and there’s like one of them is like talking
01:34:09 and they’re all like listening
01:34:10 and then they discuss and then they have free lunch
01:34:12 and so on.
01:34:13 No, but then that’s where you find students
01:34:15 where when I go to a conference,
01:34:18 I go through the posters where I’m on a mission.
01:34:20 Basically my mission is to read and understand
01:34:23 what every poster is about.
01:34:25 And for a few of them,
01:34:25 I’ll dive deeply and understand everything,
01:34:27 but I make it a point to just go poster after poster
01:34:29 in order to read all of them.
01:34:31 And I find some gems and students that I speak to
01:34:35 that sometimes eventually join my lab.
01:34:37 And then sort of you’re sort of creating this permeation
01:34:41 of the transfer of ideas, of ways of thinking
01:34:48 and very often of moral values, of social structures,
01:34:52 of just more imperceptible properties
01:34:59 of these cognitive systems
01:35:00 that simply just cling together.
01:35:02 Basically, I have the luxury at MIT
01:35:07 of not just choosing smart people,
01:35:09 but choosing smart people who I get along with,
01:35:12 who are generous and friendly and creative and smart
01:35:17 and excited and childish in their uninhibited behaviors
01:35:25 and so on and so forth.
01:35:26 So you basically can choose yourself to surround,
01:35:29 you can choose to surround yourself
01:35:31 with people who are not only cognitively compatible,
01:35:36 but also imperceptibly
01:35:39 through the meta cognitive systems compatible.
01:35:43 And again, when I say compatible, not all the same.
01:35:46 Sometimes, not sometimes, all the time.
01:35:50 The teams are made out of complementary components,
01:35:53 not just compatible, but very often complementary.
01:35:56 So in my own team, I have a diversity of students
01:35:59 who come from very different backgrounds.
01:36:01 There’s a whole spectrum of biology to computation,
01:36:04 of course, but within biology, there’s a lot of realms.
01:36:06 Within computation, there’s a lot of realms.
01:36:08 And what makes us click so well together
01:36:13 is the fact that not only do we have a common mission,
01:36:16 a common passion and a common view of the world,
01:36:22 but that we’re complementary in our skills,
01:36:25 in our angles with which we come at it and so on and so forth.
01:36:28 And that’s sort of what makes it click.
01:36:29 Yeah, it’s fascinating that the stickiness
01:36:32 of multiple cognitive systems together
01:36:35 includes both the commonality,
01:36:37 so you meet because there’s some common thing,
01:36:40 but you stick together because you’re different
01:36:45 in all the useful ways.
01:36:46 Yeah, yeah.
01:36:47 And my wife and I, I mean, we adore each other to pieces,
01:36:51 but we’re also extremely different in many ways.
01:36:54 And that’s beautiful. Careful.
01:36:55 She’s gonna be listening to this.
01:36:57 But I love that about us.
01:36:59 I love the fact that I’m living out there
01:37:01 in the world of ideas and I forget what day it is.
01:37:05 And she’s like, well, at 8 a.m.,
01:37:07 the kids better be to school.
01:37:08 Right.
01:37:09 And I do get yelled at, but I need it.
01:37:15 Basically, I need her as much as she needs me.
01:37:18 And she loves interacting with me and talking.
01:37:20 I mean, last night, we were talking about this
01:37:23 and I showed her the questions
01:37:24 and we were bouncing ideas off each other.
01:37:26 And it was just beautiful.
01:37:28 We basically have these, basically,
01:37:32 cognitive, let it all loose kind of dates
01:37:36 where we just bring papers
01:37:38 and we’re bouncing ideas, et cetera.
01:37:41 So we have extremely different perspectives,
01:37:44 but very common goals and interests and anyway.
01:37:49 What do you make of the communication mechanism
01:37:52 that we humans use to share those ideas?
01:37:54 Because one essential element of all of this
01:37:57 is not just that we’re able to have these ideas,
01:38:01 but we’re also able to share them.
01:38:03 We tend to, maybe you can correct me,
01:38:06 but we seem to use language to share the ideas.
01:38:10 Maybe we share them in some much deeper way
01:38:12 than language, I don’t know.
01:38:13 But what do you make of this whole mechanism
01:38:15 that is so fundamental to the human condition?
01:38:18 So some people will tell you
01:38:20 that your language dictates your thoughts
01:38:23 and your thoughts cannot form outside language.
01:38:26 I tend to disagree.
01:38:27 I see thoughts as much more abstract
01:38:33 as basically when I dream, I don’t dream in words.
01:38:36 I dream in shapes and forms and three dimensional space
01:38:40 with extreme detail.
01:38:42 I was describing, so when I wake up
01:38:44 in the middle of the night, I actually record my dreams.
01:38:47 Sometimes I write them down in a Dropbox file.
01:38:50 Other times I’ll just dictate them in audio.
01:38:53 And my wife was giving me a massage the other day
01:38:57 cause like my left side was frozen
01:39:00 and I started playing the recording.
01:39:02 And as I was listening to it, I was like,
01:39:05 I don’t remember any of that.
01:39:06 And it was like, of course.
01:39:08 And then the entire thing came back.
01:39:10 But then there’s no way any other person
01:39:12 could have recreated that entire sort of
01:39:15 three dimensional shape and dream and concept.
01:39:20 And in the same way, when I’m thinking of ideas,
01:39:22 there’s so many ideas I can’t put to words.
01:39:25 I mean, I will describe them with a thousand words,
01:39:27 but the idea itself is much more precise
01:39:29 or much more sort of abstract
01:39:31 or much more something different,
01:39:34 either less abstract or more abstract.
01:39:36 And it’s either, basically there’s just the projection
01:39:42 that happens from the three dimensional ideas
01:39:44 into let’s say a one dimensional language.
01:39:46 And the language certainly gives you the apparatus
01:39:49 to think about concepts
01:39:51 that you didn’t realize existed before.
01:39:53 And with my team, we often create new words.
01:39:56 I’m like, well, now we’re gonna call these
01:39:58 the regulatory plexus of a gene.
01:39:59 And that gives us now the language
01:40:01 to sort of build on that as one concept
01:40:04 that you then build upon with all kinds of other things.
01:40:07 So there’s this coevolution again of ideas and language,
01:40:11 but they’re not one to one with each other.
01:40:15 Now let’s talk about language itself, words, sentences.
01:40:20 This is a very distant construct
01:40:24 from where language actually begun.
01:40:26 So if you look at how we communicate,
01:40:29 as I’m speaking, my eyes are shining
01:40:32 and my face is changing through all kinds of emotions.
01:40:36 And my entire body composition posture is reshaped.
01:40:41 And my intonation, the pauses that I make,
01:40:44 the softer and the louder and the this and that
01:40:47 are conveying so much more information.
01:40:50 And if you look at early human language,
01:40:54 and if you look at how the great apes
01:40:57 communicate with each other, there’s a lot of grunting,
01:40:59 there’s a lot of posturing, there’s a lot of emotions,
01:41:01 there’s a lot of sort of shrieking, et cetera.
01:41:03 They have a lot of components of our human language,
01:41:09 just not the words.
01:41:10 So I think of human communication
01:41:14 as combining the ape component,
01:41:19 but also of course the GPT3 component.
01:41:22 So basically there’s the cognitive layer
01:41:24 and the reasoning layer that we share
01:41:27 with different parts of our relatives.
01:41:30 There’s the AI relatives,
01:41:31 but there’s also the grunting relatives.
01:41:34 And what I love about humanity is that we have both.
01:41:37 We’re not just a conversational system.
01:41:40 We’re a grunting, emotionally charged,
01:41:44 weirdly interconnected system
01:41:49 that also has the ability to reason.
01:41:51 And when we communicate with each other,
01:41:54 there’s so much more than just language.
01:41:56 There’s so much more than just words.
01:41:59 It does seem like we’re able to somehow transfer
01:42:01 even more than the body language.
01:42:04 It seems that in the room with us
01:42:07 is always a giant knowledge base of shared experiences,
01:42:13 different perspectives on those experiences,
01:42:15 but I don’t know, the knowledge of who the last three,
01:42:19 four presidents of the United States were,
01:42:21 and 9/11, the tragedies of 9/11,
01:42:24 all the beautiful and terrible things
01:42:28 that happen in the world.
01:42:28 They’re somehow both in our minds
01:42:31 and somehow enrich the ability to transfer information.
01:42:37 What I love about it is I can talk to you
01:42:39 about 2001 Odyssey of Space
01:42:40 and mention a very specific scene
01:42:41 and that evokes all these feelings that you had
01:42:44 when you first watched it.
01:42:45 We’re both visualizing that and maybe in different ways.
01:42:48 Exactly.
01:42:48 But in that, yeah, and not only that,
01:42:52 but the feeling is brought back up,
01:42:56 just like you said, with the dreams.
01:42:58 We both have that feeling arise in some form
01:43:01 as you bring up the child facing his own mortality.
01:43:07 It’s fascinating that we’re able to do that,
01:43:09 but I don’t know.
01:43:10 Now let’s talk about Neuralink for a second.
01:43:12 So what’s the concept of Neuralink?
01:43:14 The concept of Neuralink is that I’m gonna take
01:43:17 whatever knowledge is encoded in my brain
01:43:19 directly transfer it into your brain.
01:43:22 So this is a beautiful, fascinating,
01:43:25 and extremely sort of appealing concept,
01:43:29 but I see a lot of challenges surrounding that.
01:43:32 The first one is we have no idea
01:43:34 how to even begin to understand
01:43:36 how knowledge is encoded in a person’s brain.
01:43:40 I mean, I told you about this paper that we had recently
01:43:41 with Li Hui Cai and Asaf Marko
01:43:45 that basically was looking at these engrams
01:43:47 that are formed with combinations of neurons
01:43:50 that cofire when a stimulus happens,
01:43:53 where we can go into a mouse
01:43:54 and select those neurons that fire by marking them
01:43:56 and then see what happens when they first fire.
01:43:58 And then select the neurons that fire again
01:44:00 when the experience is repeated.
01:44:02 These are the recall neurons,
01:44:04 and then there’s the memory consolidation neurons.
01:44:07 So we’re starting to understand a little bit
01:44:09 of sort of the distributed nature of knowledge encoding
01:44:14 and experience encoding in the human brain
01:44:16 and in the mouse brain.
01:44:17 And the concept that we’ll understand
01:44:21 that sufficiently one day
01:44:23 to be able to take a snapshot
01:44:26 of what does that scene,
01:44:31 of HAL losing his mind and talking to Dave,
01:44:36 how is that scene encoded in your mind?
01:44:39 Imagine the complexity of that.
01:44:41 But now imagine, suppose that we solve this problem.
01:44:45 And the next enormous challenge is how do I go
01:44:48 and modify the next person’s brain
01:44:50 to now create the same exact neural connections?
01:44:54 So that’s an enormous challenge right there.
01:44:56 So basically it’s not just reading, it’s now writing.
01:45:00 And again, what if something goes wrong?
01:45:02 I don’t wanna even think about that, that’s number two.
01:45:05 And number three, who says that the way
01:45:08 that you encode Dave, I’m losing my mind
01:45:11 and I encode Dave, I’m losing my mind
01:45:14 is anywhere near each other.
01:45:17 Basically, maybe the way that I’m encoding it
01:45:19 is twisted with my childhood memories of running through
01:45:24 the pebbles in Greece, and yours is twisted
01:45:28 with your childhood memories growing up in Russia.
01:45:31 And there’s no way that I can take my encoding
01:45:34 and put it into your brain,
01:45:35 because it’ll A, mess things up,
01:45:38 and B, be incompatible with your own unique experiences.
01:45:42 So that’s telepathic communication from human to human.
01:45:45 It’s fascinating, you’re reminding us
01:45:48 that there’s two biological systems
01:45:51 on both ends of that communication.
01:45:54 The easier, I guess, maybe half as difficult thing to do
01:45:59 in the hope with Neuralink is that we can communicate
01:46:03 with an AI system, so where one side of that
01:46:06 is a little bit more controllable,
01:46:08 but even just that is exceptionally difficult.
01:46:13 Let’s talk about two neuronal systems talking to each other.
01:46:16 Suppose that GPT4 tells GPT3, hey,
01:46:19 give me all your knowledge, right?
01:46:21 I’m ready, I have 10 times more hardware,
01:46:24 I’m ready, just feed me.
01:46:25 What’s GPT3 gonna do?
01:46:27 Is it gonna say, oh, here’s my 10 billion parameters?
01:46:30 No. No way.
01:46:32 The simplest way, and perhaps the fastest way
01:46:35 for GPT3 to transfer all its knowledge
01:46:36 to its newer body that has a lot more hardware
01:46:39 is to regenerate every single possible human sentence
01:46:44 that it can possibly create.
01:46:46 Just keep talking.
01:46:48 Keep talking and just reencode it all together.
01:46:50 So maybe what language does is exactly that.
01:46:53 It’s taking one generative cognitive model,
01:46:56 it’s running it forward to emit utterances
01:46:59 that kind of make sense in my cognitive frame,
01:47:01 and it’s reencoding them into yours
01:47:04 through the parsing of that same language.
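In machine learning terms, what’s being described resembles sequence-level knowledge distillation: the teacher never exposes its parameters, it only emits utterances, and the student re-learns from those. Below is a minimal runnable sketch with toy character-bigram models; the corpus, model class, and sample counts are illustrative assumptions, nothing GPT-scale.

```python
import random
from collections import Counter, defaultdict

# Toy sequence-level distillation: the "teacher" only emits text,
# and the "student" is trained purely on those emitted utterances.

class BigramModel:
    def __init__(self):
        self.counts = defaultdict(Counter)  # char -> next-char counts

    def train(self, text: str):
        for a, b in zip(text, text[1:]):
            self.counts[a][b] += 1

    def generate(self, start: str, length: int, rng: random.Random) -> str:
        out = [start]
        for _ in range(length):
            nxt = self.counts.get(out[-1])
            if not nxt:
                break
            chars, weights = zip(*nxt.items())
            out.append(rng.choices(chars, weights)[0])
        return "".join(out)

rng = random.Random(0)
teacher = BigramModel()
teacher.train("the meaning of life is the sharing of ideas " * 50)

# Distillation: the student never sees teacher.counts, only its speech.
student = BigramModel()
for _ in range(200):
    student.train(teacher.generate("t", 40, rng))

print(student.generate("t", 30, rng))  # the student now "talks" like the teacher
```

The interactive back-and-forth discussed next is exactly what this sketch leaves out; one-directional generation is the simplest case, which is arguably the point about why dialogue is richer.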
01:47:07 And I think the conversation might actually be
01:47:09 the most efficient way to do it,
01:47:11 so not just talking, but interactive,
01:47:14 so talking back and forth, asking questions, interrupting.
01:47:18 So GPT4 will constantly be interrupting.
01:47:20 Annoying. Annoying, yeah.
01:47:25 But the beauty of that is also that
01:47:27 as we’re interrupting each other,
01:47:29 there’s all kinds of misinterpretations that happen,
01:47:33 that basically when my students speak,
01:47:36 I will often know that I’m misunderstanding
01:47:38 what they’re saying, and I’ll be like,
01:47:41 hold that thought for a second.
01:47:43 Let me tell you what I think I understood,
01:47:44 which I know is different from what you said.
01:47:46 Then I’ll say that, and then someone else
01:47:49 in the same Zoom meeting
01:47:50 will basically say, well, here’s another way
01:47:53 to think about what you just said.
01:47:55 And then by the third iteration,
01:47:57 we’re somewhere completely different,
01:47:59 that if we could actually communicate
01:48:01 with full neural network parameters back and forth
01:48:06 of that knowledge and idea encoding,
01:48:09 would be far inferior,
01:48:11 because the reencoding with our own,
01:48:15 as we said last time, emotional baggage
01:48:17 and cognitive baggage from our unique experiences
01:48:21 through our shared experiences, distinct encodings,
01:48:27 in the context of all our unique experiences,
01:48:29 is leading to so much more diversity of perspectives.
01:48:37 And again, going back to this whole concept of this
01:48:40 entire network of all of human cognitive systems
01:48:43 connected to each other,
01:48:45 and sort of how ideas and memes permeate through that,
01:48:48 that’s sort of what really creates a whole new level
01:48:52 of human experience through this reasoning layer
01:48:59 and this computational layer
01:49:00 that obviously lives on top of our cognitive layer.
01:49:04 So you’re one of these aforementioned cognitive systems,
01:49:09 mortal, but thoughtful, and you’re connected to a bunch,
01:49:15 like you said, students, your wife, your kids.
01:49:19 What do you, in your brief time here on Earth,
01:49:23 this is a Meaning of Life episode,
01:49:25 so what do you hope this world will remember you as?
01:49:31 What do you hope your legacy will be?
01:49:33 I don’t think of legacy as much as maybe most people.
01:49:40 Oh, it’s kind of funny.
01:49:41 I’m consciously living the present.
01:49:44 Many students tell me, oh, give us some career advice.
01:49:47 I’m like, I’m the wrong person.
01:49:48 I’ve never made a career plan.
01:49:50 I still have to make one.
01:49:51 I, it’s funny to be both experiencing the past,
01:49:56 and the present, and the future,
01:50:01 but also consciously living in the present,
01:50:04 and just, there’s a conscious decision we can make
01:50:08 to not worry about all that,
01:50:10 which again, goes back to the I’m the lucky one kind of thing
01:50:13 of living in the present and being happy winning,
01:50:18 and being happy losing,
01:50:20 and there’s a sort of,
01:50:23 I’m happy losing, and there’s a certain freedom
01:50:27 that comes with that, but again,
01:50:29 a certain sort of, I don’t know,
01:50:32 ephemerity of living for the present,
01:50:37 But if you step back from all of that,
01:50:39 basically my current modus operandi
01:50:44 is live for the present, make every day
01:50:49 the best you can make,
01:50:50 and just make the local blip, the local maximum,
01:50:55 of the universe, of the awesomeness of the planet,
01:50:58 and the town, and the family that we live in,
01:51:02 both academic family and biological family,
01:51:07 make it a little more awesome
01:51:08 by being generous to your friends,
01:51:09 being generous to the people around you,
01:51:11 being kind to your enemies,
01:51:13 and just showing love all around.
01:51:17 You can’t be upset at people if you truly love them.
01:51:21 If somebody yells at you and insults you
01:51:23 every time you say the slightest thing,
01:51:25 and yet when you see them, you just see them with love,
01:51:30 it’s a beautiful feeling.
01:51:31 It’s like, you know, I’m feeling exactly like
01:51:34 when I look at my three-year-old who's screaming,
01:51:37 even though I love her and I want what's good for her,
01:51:39 she’s still screaming and saying, no, no, no, no, no.
01:51:42 And I’m like, I love you, genuinely love you,
01:51:47 but I can sort of kind of see that your brain
01:51:49 is kind of stuck in that little mode of anger.
01:51:53 And there’s plenty of people out there who don’t like me,
01:51:58 and I see them with love as a child
01:52:01 that is stuck in a cognitive state
01:52:04 that they’re eventually gonna snap out of,
01:52:06 or maybe not, and that’s okay.
01:52:08 So there’s that aspect of sort of experiencing life
01:52:12 with the best intentions.
01:52:16 And I love it when I'm wrong.
01:52:20 I had a friend who was like one of the smartest people
01:52:22 I’ve ever met who would basically say,
01:52:24 oh, I love it when I’m wrong
01:52:26 because it makes me feel human.
01:52:27 And it’s so beautiful.
01:52:31 I mean, she’s really one of the smartest people
01:52:32 I’ve ever met.
01:52:33 And she was like, oh, it’s such a good feeling.
01:52:36 And I love being wrong, but there’s something
01:52:41 about self-improvement.
01:52:42 There's something about sort of, how do I
01:52:45 not just avoid mistakes, but attempt the most rights
01:52:49 and do the fewest wrongs,
01:52:50 with the full knowledge that mistakes will happen.
01:52:53 That’s one aspect.
01:52:55 So through this life in the present,
01:53:00 what’s really funny is,
01:53:02 and that’s something that I’ve experienced more and more
01:53:04 really thanks to you and through this podcast,
01:53:07 is this enormous number of people who will basically comment,
01:53:11 wow, I’ve been following this guy for so many years now,
01:53:14 or wow, this guy has inspired so many of us
01:53:17 in computational biology and so on and so forth.
01:53:19 I’m like, I don’t know any of that,
01:53:22 but I’m only discovering this now
01:53:23 through this sort of sharing of our emotional states
01:53:27 and our cognitive states with a wider audience,
01:53:31 where suddenly I’m sort of realizing that,
01:53:33 wow, maybe I’ve had a legacy.
01:53:36 Like basically I’ve trained generations of students
01:53:39 from MIT and I’ve put all of my courses freely online
01:53:45 since 2001.
01:53:47 So basically all of my video recordings of my lectures
01:53:50 have been online since 2001.
01:53:52 So countless generations of people from across the world
01:53:56 will meet me at a conference and say,
01:53:58 like, I was at this conference where somebody heard my voice
01:54:01 and was like, I know this voice,
01:54:02 I’ve been listening to your lectures.
01:54:04 And it’s just such a beautiful thing where
01:54:07 like we’re sharing widely and who knows
01:54:11 which students will get where
01:54:13 from whatever they catch out of these lectures,
01:54:16 even if what they catch is just inspiration
01:54:18 and passion and drive.
01:54:20 So there’s this intangible legacy quote unquote
01:54:26 that every one of us has through the people we touch.
01:54:29 One of my friends from undergrad basically told me,
01:54:31 oh, my mom remembers you vividly
01:54:33 from when she came to campus.
01:54:34 I’m like, I didn’t even meet her.
01:54:36 She’s like, no, but she sort of saw you interacting
01:54:39 with people and said, wow,
01:54:40 he’s exuding this positive energy.
01:54:43 And there’s that aspect of sort of just motivating people
01:54:47 with your kindness, with your passion, with your generosity
01:54:50 and with your just selflessness of just give,
01:54:56 it doesn’t matter where it goes.
01:54:58 I’ve been to conferences where basically people will,
01:55:01 I’ll ask them a question and then they’ll come back to,
01:55:04 or there was a conference where I asked somebody a question
01:55:06 and they said, oh, in fact, this entire project
01:55:08 was inspired by your question three years ago
01:55:10 at the same conference.
01:55:11 I’m like, wow.
01:55:13 And then on top of that, there’s also the ripple effect.
01:55:15 So you’re speaking to the direct influence of inspiration
01:55:18 or education, but there's also the follow-on things
01:55:22 that happen after that, and there's this ripple
01:55:23 from you, just this one individual, the first drop.
01:55:27 And from every one of us, from everyone,
01:55:29 that’s what I love about humanity.
01:55:30 The fact that every one of us shares genes
01:55:36 and genetic variants, through very recent common ancestors,
01:55:39 with everyone else.
01:55:41 So even if I die tomorrow, my genes are still shared
01:55:45 through my cousins and through my uncles
01:55:47 and through my immediate family.
01:55:49 And of course I’m lucky enough to have my own children,
01:55:52 but even if you don’t, your genes are still permeating
01:55:55 through all of the layers of your family.
01:55:57 So your genes will have the legacy there, yeah.
01:56:00 Every one of us.
01:56:02 Number two, our ideas are constantly intermingling
01:56:05 with each other.
01:56:05 So there’s no person living in the planet
01:56:08 a hundred years from now who will not be directly impacted
01:56:12 by every one of the planet living here today
01:56:14 through genetic inheritance and through meme inheritance.
01:56:18 That’s cool to think that your ideas, Manolis Callas,
01:56:22 would touch every single person on this planet.
01:56:27 It’s interesting.
01:56:27 It’s not just mine, Joe Smith, who’s looking at this
01:56:30 right now, his ideas will also touch everybody.
01:56:33 So there’s this interconnectedness of humanity.
01:56:37 And then I’m also a professor.
01:56:39 So my day job is legacy.
01:56:42 My day job is training, not just the thousands of people
01:56:46 who watch my videos on the web,
01:56:47 but the people who are actually in my class,
01:56:50 who basically come to MIT to learn from a bunch of us.
01:56:55 The cognitive systems that were shipped to this particular
01:56:58 location in space.
01:56:59 And who will then disperse back
01:57:00 into all of their home countries.
01:57:02 That’s what makes America the beacon of the world.
01:57:05 We don’t just export goods.
01:57:09 We export people.
01:57:10 Cognitive systems.
01:57:12 We export people who are born here.
01:57:15 And we also export training that people born elsewhere
01:57:18 will come here to get and will then disseminate,
01:57:21 not just whatever knowledge they got,
01:57:23 but whatever ideals they learned.
01:57:26 And I think that’s something that’s a legacy of the US
01:57:28 that you cannot stop with political isolation.
01:57:31 You cannot stop with economic isolation.
01:57:33 That’s something that will continue to happen
01:57:35 through all the people we’ve touched through our universities.
01:57:38 So there’s the students who took my classes,
01:57:40 who are basically now going off and teaching their classes.
01:57:44 And I’ve trained generations of computational biologists.
01:57:46 No one in genomics who’s gone through MIT
01:57:48 hasn’t taken my class.
01:57:50 So basically there’s this impact through,
01:57:53 I mean, there’s so many people in biotechs who are like,
01:57:55 hey, I took your class.
01:57:56 That’s what got me into the field like 15 years ago.
01:57:58 And it’s just so beautiful.
01:58:00 And then there’s the academic family that I have.
01:58:04 So the students who are actually studying with me,
01:58:07 who are my trainees.
01:58:09 So this sort of mentorship of ancient Greece.
01:58:15 So I basically have an academic family and we are a family.
01:58:20 There’s this such strong connection,
01:58:24 this bond of, you're part of the Kellis family.
01:58:27 So I have a biological family at home
01:58:29 and I have an academic family on campus.
01:58:32 And that academic family
01:58:33 has given me great grandchildren already.
01:58:36 Yes.
01:58:37 So I’ve trained people who are now professors at Stanford,
01:58:40 CMU, Harvard, WashU, I mean, everywhere in the world.
01:58:46 And these people have now trained people
01:58:49 who are now having their own faculty jobs.
01:58:53 So there’s basically people who see me
01:58:55 as their academic grandfather.
01:58:57 And it’s just so beautiful
01:58:58 because you don’t have to wait for the 18 years
01:59:00 of cognitive hardware development
01:59:03 to sort of have amazing conversations with people.
01:59:07 These are fully grown humans, fully grown adults
01:59:09 who are cognitively super ready and who are shaped by us,
01:59:15 and I see some of these beautiful papers and I’m like,
01:59:18 I can see the touch of our lab in those papers.
01:59:21 It’s just so beautiful.
01:59:22 Cause you’re like, I spent hours with these people
01:59:25 teaching them not just how to do a paper, but how to think.
01:59:29 And this whole concept of, you know,
01:59:31 the first paper that we write together
01:59:34 is an experience with every one of these students.
01:59:37 So, you know, I always tell them
01:59:38 to write the whole first draft
01:59:40 and they know that I will rewrite every word.
01:59:43 But the act of them writing it
01:59:45 and what I do is these like joint editing sessions
01:59:48 where I’m like, let’s coedit.
01:59:50 And with this coediting, we basically have…
01:59:53 Creative destruction.
01:59:55 So I share my Zoom screen
01:59:56 and I’m just thinking out loud as I’m doing this.
01:59:59 And they’re learning from that process
02:00:01 as opposed to like come back two days later
02:00:03 and they see a bunch of red on a page.
02:00:05 I’m sort of, well, that’s not how you write this.
02:00:08 That’s not how you think about this.
02:00:09 That’s not, you know, what’s the point?
02:00:10 Like this morning,
02:00:12 yes, this morning between 6 and 8 a.m.,
02:00:14 I had a two-hour meeting
02:00:15 going through one of these papers
02:00:18 and then saying, what’s the point here?
02:00:20 Why do you even show that?
02:00:22 It’s just a bunch of points on a graph.
02:00:24 No, what you have to do is extract the meaning,
02:00:26 do the homework for them.
02:00:28 And there’s this nurturing, this mentorship
02:00:32 that sort of creates now a legacy,
02:00:34 which is infinite because they've now gone off on their own,
02:00:39 you know, and all of that is just humanity.
02:00:42 Then of course there’s the papers I write
02:00:44 because yes, my day job is training students,
02:00:48 but it’s a research university.
02:00:50 The way that they learn is through the mens et manus,
02:00:54 mind and hand.
02:00:56 It’s the practical training of actually doing research.
02:00:59 And that research has a beneficial side effect
02:01:03 of producing these awesome papers
02:01:06 that will now tell other people how to think.
02:01:10 There’s this paper we just posted recently on MedArchive
02:01:13 and one of the most generous and eloquent comments about it
02:01:16 was like, wow, this is a masterclass in scientific writing,
02:01:21 in analysis, in biological interpretation, and so forth.
02:01:24 It’s just so fulfilling from a person I’ve never met
02:01:27 or heard about.
02:01:28 Can you say the title of the paper, by any chance?
02:01:30 I don’t remember the title,
02:01:31 but it’s single cell dissection of schizophrenia reveals.
02:01:38 So the first of the two points that we found
02:01:39 was this whole transcriptional resilience.
02:01:42 Like there’s some individuals who are schizophrenic,
02:01:46 but they have an additional cell type
02:01:50 or an additional cell state, which we believe is protective.
02:01:53 And that cell state when they have it
02:01:55 will cause other cells to have normal gene expression patterns.
02:01:58 It’s beautiful.
02:02:00 And then that cell is connected
02:02:03 with some of the PV interneurons
02:02:06 that are basically sending these inhibitory brainwaves
02:02:09 through the brain.
02:02:10 And basically there’s another component of,
02:02:14 there’s a set of master regulators that we discovered
02:02:18 who are controlling many of the genes
02:02:20 that are differentially expressed.
02:02:22 And these master regulators are themselves
02:02:24 genetic targets of schizophrenia.
02:02:27 And they are themselves involved
02:02:28 in both synaptic connectivity
02:02:31 and also in early brain development.
02:02:34 So there’s this sort of interconnectedness
02:02:36 between the synaptic development axis
02:02:40 and also this transcriptional resilience.
02:02:41 So, I mean, we basically made up a title
02:02:43 that combines all these concepts.
02:02:44 You have all these concepts,
02:02:45 all these people working together,
02:02:46 and ultimately these minds condense it down
02:02:49 into a beautifully written little document
02:02:51 that lives on forever. Exactly.
02:02:52 And that document now has its own life.
02:02:55 Our work has 120,000 citations.
02:02:59 I mean, that’s not just people who read it.
02:03:02 These are people who used it
02:03:03 to write something based on it.
02:03:05 I mean, that to me is just so fulfilling
02:03:10 to basically say, wow, I’ve touched people.
02:03:12 So I don’t think of my legacy as I live every day.
02:03:17 I just think of the beauty of the present
02:03:20 and the power of interconnectedness.
02:03:22 And just, I feel like a kid in a candy shop
02:03:25 where I’m just like constantly,
02:03:28 where do I, what package do I open first?
02:03:31 And, you know. You’re the lucky one.
02:03:33 A jack of all trades, a master of none.
02:03:37 I think for a Meaning of Life episode,
02:03:41 we would be remiss if we did not have at least a poem or two.
02:03:45 Do you mind if we end in a couple of poems?
02:03:49 Maybe a happy, maybe a sad one.
02:03:51 I would love that.
02:03:52 So thank you for the luxury.
02:03:55 The first one is kind of,
02:03:59 I remember when you were talking with Eric Weinstein
02:04:03 about this comment of Leonard Cohen that says,
02:04:09 "but you don't really care for music, do you?"
02:04:11 in Hallelujah.
02:04:12 That’s basically kind of like mocking its reader.
02:04:16 So one of my poems is a little like that.
02:04:18 So I had just broken up with my girlfriend
02:04:23 and there’s this other friend who was coming to visit me.
02:04:26 And she said, I will not come unless you write me a poem.
02:04:30 And I was like, writing a poem on demand?
02:04:37 So this poem is called Write Me a Poem.
02:04:40 It goes, write me a poem, she said with a smile.
02:04:44 Make sure it’s pretty, romantic and rhymes.
02:04:47 Make sure it’s worthy of that bold flame,
02:04:49 that love uniting us beyond a mere game.
02:04:52 And she took off without more words,
02:04:54 rushed for the bus and traveled the world.
02:04:57 A poem, I thought, this is sublime.
02:05:00 What better way for passing the time?
02:05:03 What better way to count up the hours
02:05:05 before she comes back to my lonely tower?
02:05:08 Waiting for joy to fill up my heart,
02:05:10 let’s write a poem for when we’re apart.
02:05:13 How does a poem start, I inquired.
02:05:16 Give me a topic, cook up a style.
02:05:18 Throw in some cute words, oh, here and there.
02:05:20 Throw in some passion, love and despair.
02:05:23 Love, three eggs, one pound of flour,
02:05:26 three cups of water and bake for an hour.
02:05:29 Love is no recipe as I understand.
02:05:32 You can’t just cook up a poem on demand.
02:05:34 And as I was twisting all this in my mind,
02:05:37 I looked at the page, by golly, it rhymed.
02:05:40 Three roses, white chocolate, vanilla powder,
02:05:43 some beautiful rhymes and maybe a flower.
02:05:46 No, be romantic, the young girl insisted.
02:05:49 Do this, do that, don’t be so silly.
02:05:51 You must believe it straight from your heart.
02:05:53 If you don’t feel it, we’re better apart.
02:05:56 Oh, my sweet thing, what can I say?
02:05:59 You bring me the sun all night and all day.
02:06:02 You’re the stars and the moon and the birds way up high.
02:06:06 You’re my evening sweet song, my morning blue sky.
02:06:09 You are my muse, your spell has me caught.
02:06:12 You bring me my voice and scatter my thoughts.
02:06:15 To put that love in writing, in vain I can try.
02:06:19 But when I’m with you, my wings want to fly.
02:06:22 So I put down the pen and drop my defenses.
02:06:25 Give myself to you and fill up my senses.
02:06:31 The baffled king composing, that was beautiful.
02:06:35 What I love about it is that I did not
02:06:38 bring up a dictionary of rhymes.
02:06:39 I did not sort of work hard.
02:06:42 So basically when I write poems, I just type.
02:06:45 I never go back, I just.
02:06:48 So when my brain gets into that mode,
02:06:50 it actually happens like I wrote it.
02:06:52 Oh, wow, so the rhymes just kind of.
02:06:54 The rhymes just kind of come.
02:06:55 It’s an emergent phenomenon.
02:06:56 It’s an emergent phenomenon.
02:06:57 I just get into that mode and then it comes up.
02:07:00 That’s a beautiful one.
02:07:02 And it’s basically, you know, as you got it,
02:07:06 it’s basically saying it’s no recipe
02:07:08 and then I’m throwing in the recipes
02:07:09 and as I’m writing it, I’m like, you know.
02:07:11 So it’s very introspective in this whole concept.
02:07:16 So anyway, there’s another one many years earlier
02:07:19 that is, you know, darker.
02:07:23 It’s basically this whole concept of let’s be friends.
02:07:26 I was like, ugh, you know.
02:07:29 No, let’s be friends, just like, you know.
02:07:32 So the last words are, shout out
02:07:34 "I love you" or send me to hell.
02:07:36 So the title is Burn Me Tonight.
02:07:41 Lie to me, baby.
02:07:43 Lie to me now.
02:07:44 Tell me you love me.
02:07:45 Break me a vow.
02:07:47 Give me a sweet word, a promise, a kiss.
02:07:49 Give me the world, a sweet taste to miss.
02:07:52 Don’t let me lay here, inert, ugly, cold,
02:07:56 with nothing sweet felt and nothing harsh told.
02:07:59 Give me some hope, false, foolish, yet kind.
02:08:02 Make me regret, I’ll leave you behind.
02:08:05 Don’t pity my soul or torture it right.
02:08:08 Treat it with hatred.
02:08:09 Start up a fight.
02:08:11 For it’s from mildness that my soul dies
02:08:14 when you cover your passion in a bland friend’s disguise.
02:08:18 Kiss me now, baby.
02:08:19 Show me your passion.
02:08:21 Turn off the lights and rip off your fashion.
02:08:23 Give me my life’s joy this one night.
02:08:26 Burn all my matches for one blazing light.
02:08:30 Don’t think of tomorrow and let today fade.
02:08:32 Don’t try and protect me from love’s cutting blade.
02:08:35 Your razor will always rip off my veins.
02:08:38 Don’t spare me the passion to spare me the pains.
02:08:42 Kiss me now, honey, or spit in my face.
02:08:44 Throw me an insult I’ll gladly embrace.
02:08:47 Tell me now clearly that you never cared.
02:08:49 Say it now loudly like you never dared.
02:08:52 I’m ready to hear it.
02:08:54 I’m ready to die.
02:08:55 I’m ready to burn and start a new life.
02:08:58 I’m ready to face the rough burning truth
02:09:01 rather than waste the rest of my youth.
02:09:04 So tell me, my lover, should I stay or go?
02:09:07 The answer to love is one, yes or no.
02:09:09 There’s no I like you, no let’s be friends,
02:09:12 shout out I love you, or send me to hell.
02:09:17 I don’t think there’s a better way to end
02:09:21 a discussion of the meaning of life.
02:09:23 Whatever the heck the meaning is,
02:09:25 go all in as that poem says.
02:09:28 Manolis, thank you so much for talking today.
02:09:30 Thanks, I look forward to next time.
02:09:32 Thanks for listening to this conversation
02:09:34 with Manolis Kellis, and thank you to our sponsors.
02:09:38 Grammarly, which is a service for checking spelling,
02:09:40 grammar, sentence structure, and readability,
02:09:43 Athletic Greens, the all in one drink
02:09:46 that I start every day with
02:09:47 to cover all my nutritional bases,
02:09:50 Cash App, the app I use to send money to friends.
02:09:54 Please check out these sponsors in the description
02:09:56 to get a discount and to support this podcast.
02:09:59 If you enjoy this thing, subscribe on YouTube,
02:10:01 review it with five stars on Apple Podcast,
02:10:04 follow on Spotify, support on Patreon,
02:10:06 or connect with me on Twitter at Lex Friedman.
02:10:09 And now, let me leave you with some words
02:10:11 from Douglas Adams in his book,
02:10:13 Hitchhiker’s Guide to the Galaxy.
02:10:16 On the planet Earth, man had always assumed
02:10:19 that he was more intelligent than dolphins
02:10:22 because he had achieved so much.
02:10:24 The wheel, New York, wars, and so on,
02:10:28 whilst all the dolphins had ever done
02:10:31 was muck about in the water having a good time.
02:10:34 But conversely, the dolphins had always believed
02:10:38 that they were far more intelligent than man
02:10:41 for precisely the same reasons.
02:10:43 Thank you for listening and hope to see you next time.