Grimes: Music, AI, and the Future of Humanity #281

Transcript

00:00:00 We are becoming cyborgs.

00:00:02 Our brains are fundamentally changed.

00:00:04 Everyone who grew up with electronics,

00:00:05 we are fundamentally different from previous,

00:00:08 from homo sapiens.

00:00:09 I call us homo techno.

00:00:11 I think we have evolved into homo techno,

00:00:13 which is like essentially a new species.

00:00:15 Previous technologies, I mean,

00:00:17 may have even been more profound

00:00:19 and moved us to a certain degree,

00:00:20 but I think the computers are what make us homo techno.

00:00:22 I think this is what, it’s a brain augmentation.

00:00:25 So it like allows for actual evolution.

00:00:27 Like the computers accelerate the degree

00:00:29 to which all the other technologies can also be accelerated.

00:00:32 Would you classify yourself as a homo sapien or a homo techno?

00:00:35 Definitely homo techno.

00:00:37 So you’re one of the earliest of the species.

00:00:40 I think most of us are.

00:00:45 The following is a conversation with Grimes,

00:00:47 an artist, musician, songwriter, producer, director,

00:00:50 and a fascinating human being

00:00:53 who thinks a lot about both the history

00:00:55 and the future of human civilization.

00:00:57 Studying the dark periods of our past

00:00:59 to help form an optimistic vision of our future.

00:01:03 This is the Lex Fridman podcast.

00:01:05 To support it, please check out our sponsors

00:01:07 in the description.

00:01:08 And now, dear friends, here’s Grimes.

00:01:12 Oh yeah, the Cloudlifter, there you go.

00:01:14 There you go.

00:01:15 You know your stuff.

00:01:16 Have you ever used a Cloudlifter?

00:01:18 Yeah, I actually, this microphone and Cloudlifter

00:01:20 is what Michael Jackson used, so.

00:01:23 No, really?

00:01:24 Yeah, this is like Thriller and stuff.

00:01:26 This mic and a Cloudlifter?

00:01:28 Yeah, it’s an incredible microphone.

00:01:30 It’s very flattering on vocals.

00:01:32 I’ve used this a lot.

00:01:33 It’s great for demo vocals.

00:01:34 It’s great in a room.

00:01:36 Sometimes it’s easier to record vocals

00:01:38 if you’re just in a room and the music’s playing

00:01:40 and you just wanna feel it so it’s not in the headphones.

00:01:43 And this mic is pretty directional,

00:01:44 so I think it’s a good mic for just vibing out

00:01:47 and just getting a real good vocal take.

00:01:49 Just vibing, just in a room.

00:01:51 Anyway, this is the Michael Jackson, Quincy Jones

00:01:55 microphone.

00:01:57 I feel way more badass now.

00:01:58 All right, you wanna just get into it?

00:02:01 I guess so.

00:02:03 All right, one of your names, at least in this space

00:02:05 and time, is C, like the letter C.

00:02:08 And you told me that C means a lot of things.

00:02:11 It’s the speed of light.

00:02:12 It’s the render rate of the universe.

00:02:14 It’s yes in Spanish.

00:02:16 It’s the crescent moon.

00:02:17 And it happens to be my favorite programming language

00:02:21 because it basically runs the world,

00:02:24 but it’s also powerful, fast, and it’s dangerous

00:02:28 because you can mess things up really bad with it

00:02:30 because of all the pointers.
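
To make the pointer danger concrete, here is a minimal C sketch (illustrative only; the buffer size and values are made up) showing two classic mistakes, an out-of-bounds write and a use-after-free:

    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        int *scores = malloc(4 * sizeof *scores); /* room for exactly 4 ints */
        if (scores == NULL)
            return 1;

        scores[4] = 99;  /* off-by-one: writes past the buffer, undefined behavior */

        free(scores);
        printf("%d\n", scores[0]); /* use-after-free: also undefined behavior */
        return 0;
    }

Both bad lines compile cleanly, which is exactly why C is powerful, fast, and dangerous.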

00:02:31 But anyway, which of these associations

00:02:33 with the name C is the coolest to you?

00:02:37 I mean, to me, the coolest is the speed of light,

00:02:40 obviously.

00:02:42 When I say render rate of the universe,

00:02:44 I think I mean the speed of light

00:02:46 because essentially that’s what we’re rendering at.

00:02:49 See, I think we’ll know if we’re in a simulation

00:02:52 if the speed of light changes

00:02:53 because if they can improve their render speed, then.

00:02:57 Well, it’s already pretty good.

00:02:58 It’s already pretty good, but if it improves,

00:03:01 then we’ll know, we can probably be like,

00:03:03 okay, they’ve updated or upgraded.

00:03:05 Well, it’s fast enough for us humans

00:03:06 because it seems immediate.

00:03:10 There’s no delay, there’s no latency

00:03:13 in terms of us humans on Earth interacting with things.

00:03:16 But if you’re like an intergalactic species

00:03:20 operating on a much larger scale,

00:03:21 then you’re gonna start noticing some weird stuff.

00:03:23 Or if you can operate in like around a black hole,

00:03:27 then you’re gonna start to see some render issues.
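
To put rough numbers on that intuition, a small C sketch computing one-way light-travel delay at different scales (distances are approximate):

    #include <stdio.h>

    int main(void) {
        const double c = 299792458.0;         /* speed of light, m/s */
        const double around_earth = 4.0075e7; /* Earth's circumference, m */
        const double to_moon      = 3.844e8;  /* Earth to Moon, m */
        const double to_proxima   = 4.02e16;  /* ~4.25 light-years, m */

        printf("around Earth: %.3f s\n", around_earth / c);             /* ~0.134 s: feels immediate */
        printf("to the Moon:  %.3f s\n", to_moon / c);                  /* ~1.28 s: noticeable lag */
        printf("to Proxima:   %.2f years\n", to_proxima / c / 3.156e7); /* ~4.25 years */
        return 0;
    }

At human scale the delay is a fraction of a second; at interstellar scale the same constant becomes years, which is the kind of render issue being described.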

00:03:29 You can’t go faster than the speed of light, correct?

00:03:32 So it really limits our ability

00:03:34 or one’s ability to travel space.

00:03:36 Theoretically, you can, you have wormholes.

00:03:38 So there’s nothing in general relativity

00:03:41 that precludes faster than the speed of light travel.

00:03:48 But it just seems you’re gonna have to do

00:03:49 some really funky stuff with very heavy things

00:03:54 that have like weirdnesses,

00:03:56 that have basically tears in spacetime.

00:03:58 We don’t know how to do that.

00:03:59 Do navigators know how to do it?

00:04:01 Do navigators? Yeah.

00:04:03 Folding space, basically making wormholes.

00:04:07 So the name C. Yes.

00:04:11 Who are you?

00:04:14 Do you think of yourself as multiple people?

00:04:16 Are you one person?

00:04:18 Do you know, like in this morning,

00:04:20 were you a different person than you are tonight?

00:04:23 We are, I should say, recording this basically at midnight,

00:04:27 which is awesome. Yes, thank you so much.

00:04:29 I think I’m about eight hours late.

00:04:31 No, you’re right on time.

00:04:33 Good morning.

00:04:34 This is the beginning of a new day soon.

00:04:37 Anyway, are you the same person

00:04:39 you were in the morning and the evening?

00:04:43 Is there multiple people in there?

00:04:44 Do you think of yourself as one person?

00:04:46 Or maybe you have no clue?

00:04:47 Or are you just a giant mystery to yourself?

00:04:50 Okay, these are really intense questions, but.

00:04:52 Let’s go, let’s go.

00:04:53 Because I asked this myself, like look in the mirror,

00:04:55 who are you?

00:04:56 People tell you to just be yourself,

00:04:57 but what does that even mean?

00:04:59 I mean, I think my personality changes

00:05:01 with everyone I talk to.

00:05:02 So I have a very inconsistent personality.

00:05:06 Yeah.

00:05:07 Person to person, so the interaction,

00:05:08 your personality materializes.

00:05:11 Or my mood.

00:05:12 Like I’ll go from being like a megalomaniac

00:05:16 to being like, you know, just like a total hermit

00:05:19 who is very shy.

00:05:21 So some combinatorial combination of your mood

00:05:24 and the person you’re interacting with.

00:05:26 Yeah, mood and people I’m interacting with.

00:05:28 But I think everyone’s like that.

00:05:29 Maybe not.

00:05:30 Well, not everybody acknowledges it

00:05:32 and is able to introspect it.

00:05:34 Who brings out, what kind of person,

00:05:35 what kind of mood brings out the best in you?

00:05:38 As an artist and as a human.

00:05:40 Can you introspect this?

00:05:41 Like my best friends, like people I can,

00:05:45 when I’m like super confident

00:05:47 and I know that they’re gonna understand

00:05:50 everything I’m saying, so like my best friends,

00:05:52 then when I can start being really funny,

00:05:55 that’s always my like peak mode.

00:05:57 But it’s like, yeah, takes a lot to get there.

00:06:00 Let’s talk about constraints.

00:06:02 You’ve talked about constraints and limits.

00:06:06 Do those help you out as an artist or as a human being?

00:06:09 Or do they get in the way?

00:06:10 Do you like the constraints?

00:06:11 So in creating music, in creating art, in living life,

00:06:16 do you like the constraints that this world puts on you?

00:06:21 Or do you hate them?

00:06:24 If constraints are moving, then you’re good, right?

00:06:29 Like it’s like as we are progressing with technology,

00:06:32 we’re changing the constraints of like artistic creation.

00:06:34 You know, making video and music and stuff

00:06:38 is getting a lot cheaper.

00:06:39 There’s constantly new technology and new software

00:06:42 that’s making it faster and easier.

00:06:44 We have so much more freedom than we had in the 70s.

00:06:46 Like when Michael Jackson, you know,

00:06:48 when they recorded Thriller with this microphone,

00:06:51 like they had to use a mixing desk and all this stuff.

00:06:54 And like probably even get in a studio,

00:06:55 it’s probably really expensive

00:06:56 and you have to be a really good singer

00:06:57 and you have to know how to use

00:06:59 like the mixing desk and everything.

00:07:00 And now I can just, you know,

00:07:02 make... I’ve made a whole album on this computer.

00:07:05 I have a lot more freedom,

00:07:06 but then I’m also constrained in different ways

00:07:10 because there’s like literally millions more artists.

00:07:13 It’s like a much bigger playing field.

00:07:15 It’s just like, I also, I didn’t learn music.

00:07:18 I’m not a natural musician.

00:07:20 So I don’t know anything about actual music.

00:07:22 I just know about like the computer.

00:07:24 So I’m really kind of just like messing around

00:07:30 and like trying things out.

00:07:33 Well, yeah, I mean, but the nature of music is changing.

00:07:35 So you’re saying you don’t know actual music,

00:07:37 but what music is is changing.

00:07:39 Music is becoming, you’ve talked about this,

00:07:41 is becoming, it’s like merging with technology.

00:07:46 Yes.

00:07:47 It’s becoming something more than just like

00:07:51 the notes on a piano.

00:07:53 It’s becoming some weird composition

00:07:54 that requires engineering skills, programming skills,

00:07:59 some kind of human robot interaction skills,

00:08:03 and still some of the same things that Michael Jackson had,

00:08:05 which is like a good ear, a good sense of taste

00:08:08 for what’s good and not in the final thing

00:08:10 when it’s put together.

00:08:11 Like you’re allowed, you’re enabled, empowered

00:08:14 with a laptop to layer stuff,

00:08:17 to start like layering insane amounts of stuff.

00:08:20 And it’s super easy to do that.

00:08:22 I do think music production is a really underrated art form.

00:08:25 I feel like people really don’t appreciate it.

00:08:26 When I look at publishing splits,

00:08:27 the way that people like pay producers and stuff,

00:08:32 it’s super, producers are just deeply underrated.

00:08:35 Like so many of the songs that are popular right now

00:08:39 or for the last 20 years,

00:08:40 like part of the reason they’re popular

00:08:42 is because the production is really interesting

00:08:44 or really sick or really cool.

00:08:45 And it’s like, I don’t think listeners,

00:08:50 like people just don’t really understand

00:08:52 what music production is.

00:08:54 It’s not, it’s sort of like this weird,

00:08:57 discombobulated art form.

00:08:59 It’s not like a formal, because it’s so new,

00:09:01 there isn’t like a formal training path for it.

00:09:06 It’s mostly driven by like autodidacts.

00:09:10 Like it’s like almost everyone I know

00:09:11 who’s good at production,

00:09:12 like they didn’t go to music school or anything.

00:09:13 They just taught themselves.

00:09:15 Are they mostly different?

00:09:16 Like the music producers, you know,

00:09:18 are there some commonalities that tie them together

00:09:21 or are they all just different kinds of weirdos?

00:09:23 Cause I just, I just hung out with Rick Rubin.

00:09:25 I don’t know if you’ve.

00:09:26 Yeah, I mean, Rick Rubin is like literally

00:09:29 one of the gods of music production.

00:09:31 Like he’s one of the people who first,

00:09:33 you know, who like made music production,

00:09:36 you know, made the production as important

00:09:39 as the actual lyrics or the notes.

00:09:41 But the thing he does, which is interesting,

00:09:43 I don’t know if you can speak to that,

00:09:45 but just hanging out with him,

00:09:46 he seems to just sit there in silence,

00:09:48 close his eyes and listen.

00:09:50 It’s like, he almost does nothing.

00:09:53 And that nothing somehow gives you freedom

00:09:55 to be the best version of yourself.

00:09:58 So that’s music production somehow too,

00:10:00 which is like encouraging you to do less,

00:10:02 to simplify, to like push towards minimalism.

00:10:06 I mean, I guess, I mean,

00:10:09 I work differently from Rick Rubin

00:10:11 cause Rick Rubin produces for other artists,

00:10:14 whereas like I mostly produce for myself.

00:10:17 So it’s a very different situation.

00:10:19 I also think Rick Rubin, he’s in that,

00:10:21 I would say advanced category of producer

00:10:23 where like you’ve like earned your,

00:10:26 you can have an engineer and stuff

00:10:27 and people like do the stuff for you.

00:10:29 But I usually just like do stuff myself.

00:10:32 So you’re the engineer, the producer and the artist.

00:10:38 Yeah, I guess I would say I’m in the era,

00:10:39 like the post Rick Rubin era.

00:10:41 Like I come from the kind of like

00:10:44 Skrillex school of thought,

00:10:47 which is like where you are.

00:10:49 Yeah, the engineer, producer, artist.

00:10:51 Like where, I mean lately,

00:10:53 sometimes I’ll work with a producer now.

00:10:55 I’m gently sort of delicately starting

00:10:59 to collaborate a bit more,

00:10:59 but like I think I’m kind of from the,

00:11:02 like the whatever 2010s explosion of things

00:11:07 where everything became available on the computer

00:11:11 and you kind of got this like lone wizard energy thing going.

00:11:16 So you embraced the loneliness.

00:11:19 Is the loneliness somehow an engine of creativity?

00:11:22 Like, so most of your stuff,

00:11:24 most of your creative quote unquote genius in quotes

00:11:28 is in the privacy of your mind.

00:11:32 Yes, well, it was,

00:11:36 but here’s the thing.

00:11:39 I was talking to Daniel Ek and he said,

00:11:40 he’s like most artists, they have about 10 years,

00:11:43 like 10 good years.

00:11:45 And then they usually stop making their like vital shit.

00:11:49 And I feel like I’m sort of like nearing the end

00:11:53 of my 10 years on my own.

00:11:56 So you have to become somebody else.

00:11:58 Now I’m like, I’m in the process

00:11:59 of becoming somebody else and reinventing.

00:12:01 When I work with other people,

00:12:02 because I’ve never worked with other people,

00:12:04 I find that I make like, that I’m exceptionally rejuvenated

00:12:08 and making like some of the most vital work I’ve ever made.

00:12:10 So, because I think another human brain

00:12:13 is like one of the best tools you can possibly find.

00:12:17 Like.

00:12:18 It’s a funny way to put it, I love it.

00:12:20 It’s like if a tool is like, you know,

00:12:23 whatever HP plus one or like adds some like stats

00:12:27 to your character, like another human brain

00:12:30 will like square it instead of just like adding something.

00:12:34 Double up the experience points, I love this.

00:12:36 We should also mention we were playing tavern music

00:12:38 before this, which I love, which I first,

00:12:41 I think I first.

00:12:42 You had to stop the Tavern music.

00:12:43 Yeah, because it doesn’t, the audio.

00:12:46 Okay, okay.

00:12:47 But it makes.

00:12:48 Yeah, it’ll make the podcast annoying.

00:12:48 Add it in post, add it in post.

00:12:50 No one will want to listen to the podcast.

00:12:51 They probably would, but it makes me,

00:12:53 it reminds me like of a video game,

00:12:55 like a role playing video game

00:12:56 where you have experience points.

00:12:58 There’s something really joyful about wandering places

00:13:03 like Elder Scrolls, like Skyrim,

00:13:06 just exploring these landscapes in another world

00:13:10 and then you get experience points

00:13:12 and you can work on different skills

00:13:14 and somehow you progress in life.

00:13:16 I don’t know, it’s simple.

00:13:17 It doesn’t have some of the messy complexities of life

00:13:19 and there’s usually a bad guy you can fight in Skyrim.

00:13:23 It’s dragons and so on.

00:13:25 I’m sure in Elden Ring,

00:13:26 there’s a bunch of monsters you can fight.

00:13:28 I love that.

00:13:29 I feel like Elden Ring,

00:13:29 I feel like this is a good analogy

00:13:31 to music production though

00:13:32 because it’s like, I feel like the engineers

00:13:34 and the people creating these open worlds are,

00:13:36 it’s sort of like similar to people, to music producers

00:13:39 where it’s like this hidden archetype

00:13:42 that like no one really understands what they do

00:13:44 and no one really knows who they are,

00:13:46 but they’re like, it’s like the artist engineer

00:13:49 because it’s like, it’s both art

00:13:51 and fairly complex engineering.

00:13:54 Well, you’re saying they don’t get enough credit.

00:13:57 Aren’t you kind of changing that

00:13:58 by becoming the person doing everything?

00:14:01 Aren’t you, isn’t the engineer?

00:14:03 Well, I mean, others have gone before me.

00:14:05 I’m not, you know, there’s like Timbaland and Skrillex

00:14:07 and there’s all these people that are like,

00:14:10 you know, very famous for this,

00:14:12 but I just think the general,

00:14:13 I think people get confused about what it is

00:14:15 and just don’t really know what it is per se

00:14:19 and it’s just when I see a song,

00:14:20 like when there’s like a hit song,

00:14:22 like I’m just trying to think of like,

00:14:27 just going for like even just a basic pop hit,

00:14:29 like, what’s it?

00:14:33 Like New Rules by Dua Lipa or something.

00:14:36 The production on that is actually like really crazy.

00:14:39 I mean, the song is also great,

00:14:40 but it’s like the production is exceptionally memorable.

00:14:43 Like, you know, and it’s just like no one,

00:14:47 I can’t, I don’t even know who produced that song.

00:14:49 It’s just like, it isn’t part of like the rhetoric

00:14:50 of how we just discuss the creation of art.

00:14:53 We just sort of like don’t consider the music producer

00:14:57 because I think the music producer used to be more

00:15:00 just simply recording things.

00:15:03 Yeah, that’s interesting

00:15:04 because when you think about movies,

00:15:06 we talk about the actor and the actresses,

00:15:08 but we also talk about the directors.

00:15:11 We don’t talk about like that with the music as often.

00:15:15 The Beatles’ music producer

00:15:17 was one of the first kind of guys,

00:15:19 one of the first people sort of introducing

00:15:21 crazy sound design into pop music.

00:15:22 I forget his name.

00:15:24 He has the same, I forget his name,

00:15:25 but you know, like he was doing all the weird stuff

00:15:29 like dropping pianos and like, yeah.

00:15:32 Oh, to get the, yeah, yeah, yeah,

00:15:33 to get the sound, to get the authentic sound.

00:15:36 What about lyrics?

00:15:38 You think those, where did they fit

00:15:40 into how important they are?

00:15:42 I was heartbroken to learn

00:15:44 that Elvis didn’t write his songs.

00:15:46 I was very mad.

00:15:47 A lot of people don’t write their songs.

00:15:49 I understand this, but.

00:15:50 But here’s the thing.

00:15:52 I feel like there’s this desire for authenticity.

00:15:54 I used to be like really mad

00:15:56 when like people wouldn’t write or produce their music

00:15:58 and I’d be like, that’s fake.

00:15:59 And then I realized there’s all this like weird bitterness

00:16:04 and like aggro-ness in art about authenticity.

00:16:07 But I had this kind of like weird realization recently

00:16:12 where I started thinking that like art

00:16:14 is sort of a decentralized collective thing.

00:16:20 Like art is kind of a conversation

00:16:25 with all the artists that have ever lived before you.

00:16:28 You know, like it’s like, you’re really just sort of,

00:16:31 it’s not like anyone’s reinventing the wheel here.

00:16:33 Like you’re kind of just taking, you know,

00:16:36 thousands of years of art

00:16:38 and like running it through your own little algorithm

00:16:41 and then like making your interpretation of it.

00:16:45 You just joined the conversation

00:16:46 with all the other artists that came before.

00:16:47 It’s just a beautiful way to look at it.

00:16:49 Like, and it’s like, I feel like everyone’s always like,

00:16:51 there’s all this copyright and IP and this and that

00:16:54 or authenticity.

00:16:55 And it’s just like, I think we need to stop seeing this

00:16:59 as this like egotistical thing of like,

00:17:01 oh, the creative genius, the lone creative genius

00:17:04 or this or that.

00:17:05 Because it’s like, I think art shouldn’t be about that.

00:17:08 I think art is something that sort of

00:17:10 brings humanity together.

00:17:12 And it’s also, art is also kind of the collective memory

00:17:14 of humans.

00:17:14 It’s like, we don’t give a fuck about

00:17:18 whatever ancient Egypt,

00:17:20 like how much grain got sent that day

00:17:22 and sending the records and like, you know,

00:17:24 like who went where and, you know,

00:17:27 how many shields needed to be produced for this.

00:17:29 Like we just remember their art.

00:17:32 And it’s like, you know, it’s like in our day to day life,

00:17:34 there’s all this stuff that seems more important than art

00:17:38 because it helps us function and survive.

00:17:40 But when all this is gone,

00:17:41 like the only thing that’s really gonna be left is the art.

00:17:45 The technology will be obsolete.

00:17:46 That’s so fascinating.

00:17:47 Like the humans will be dead.

00:17:49 That is true.

00:17:49 A good compression of human history

00:17:52 is the art we’ve generated across the different centuries,

00:17:56 the different millennia.

00:17:57 So when the aliens come.

00:17:59 When the aliens come,

00:18:00 they’re gonna find the hieroglyphics and the pyramids.

00:18:02 I mean, art could be broadly defined.

00:18:04 They might find like the engineering marvels,

00:18:06 the bridges, the rockets, the.

00:18:09 I guess I sort of classify those, though.

00:18:11 Architecture is art too.

00:18:13 I consider engineering in those formats to be art, for sure.

00:18:19 It sucks that like digital art is easier to delete.

00:18:23 So if there’s an apocalypse, a nuclear war,

00:18:25 that can disappear.

00:18:26 Yes.

00:18:28 And the physical.

00:18:28 There’s something still valuable

00:18:30 about the physical manifestation of art.

00:18:32 That sucks that like music, for example,

00:18:35 has to be played by somebody.

00:18:37 Yeah, I do think we should have a foundation type situation

00:18:41 where we like, you know how we have like seed banks

00:18:44 up in the north and stuff?

00:18:45 Like we should probably have like a solar powered

00:18:48 or geothermal little bunker

00:18:49 that like has all human knowledge.

00:18:52 You mentioned Daniel Ek and Spotify.

00:18:55 What do you think about that as an artist?

00:18:56 What’s Spotify?

00:18:58 Is that empowering?

00:19:00 To me, Spotify as a consumer is super exciting.

00:19:02 It makes it easy for me to access music

00:19:04 from all kinds of artists,

00:19:06 get to explore all kinds of music,

00:19:08 make it super easy to sort of curate my own playlist

00:19:12 and have fun with all that.

00:19:13 It was so liberating to let go.

00:19:16 You know, I used to collect, you know,

00:19:17 albums and CDs and so on, like horde albums.

00:19:22 Yeah.

00:19:22 Like they matter.

00:19:23 But the reality you could, you know,

00:19:25 that was really liberating that I could let go of that.

00:19:28 And letting go of the albums you’re kind of collecting

00:19:32 allows you to find new music,

00:19:33 exploring new artists and all that kind of stuff.

00:19:36 But I know from a perspective of an artist that could be,

00:19:38 like you mentioned,

00:19:39 competition could be a kind of constraint

00:19:42 because there’s more and more and more artists

00:19:44 on the platform.

00:19:46 I think it’s better that there’s more artists.

00:19:47 I mean, again, this might be propaganda

00:19:49 because this is all from a conversation with Daniel Ek.

00:19:51 So this could easily be propaganda.

00:19:54 We’re all a victim of somebody’s propaganda.

00:19:56 So let’s just accept this.

00:19:58 But Daniel Ek was telling me that, you know,

00:20:01 at the, because I, you know, when I met him,

00:20:03 I came in all furious about Spotify

00:20:06 and like I grilled him super hard.

00:20:07 So I’ve got his answers here.

00:20:10 But he was saying like at the sort of peak

00:20:13 of the CD industry,

00:20:15 there was like 20,000 artists making millions

00:20:18 and millions of dollars.

00:20:19 Like there was just like a very tiny kind of 1%.

00:20:22 And Spotify has kind of democratized the industry

00:20:27 because now I think he said there’s about a million artists

00:20:30 making a good living from Spotify.

00:20:33 And when I heard that, I was like, honestly,

00:20:36 I would rather make less money

00:20:38 and have just like a decent living

00:20:43 and have more artists be able to have that,

00:20:46 even though I like, I wish it could include everyone, but.

00:20:49 Yeah, that’s really hard to argue with.

00:20:50 YouTube is the same.

00:20:52 It’s YouTube’s mission.

00:20:54 They want to basically have as many creators as possible

00:20:58 and make a living, some kind of living.

00:21:00 And that’s so hard to argue with.

00:21:02 It’s so hard.

00:21:03 But I think there’s better ways to do it.

00:21:04 My manager, I actually wish he was here.

00:21:06 Like I would have brought him up.

00:21:07 My manager is building an app that can manage you.

00:21:13 So it’ll like help you organize your percentages

00:21:16 and get your publishing and dah, dah, dah, dah, dah.

00:21:18 So you can take out all the middlemen

00:21:19 so you can have a much bigger,

00:21:21 it’ll just like automate it.

00:21:23 So you can get.

00:21:23 So automate the manager?

00:21:24 Automate management, publishing,

00:21:28 and legal, it can read,

00:21:32 the app he’s building can read your contract

00:21:34 and like tell you about it.

00:21:35 Because one of the issues with music right now,

00:21:38 it’s not that we’re not getting paid enough,

00:21:39 but it’s that the art industry is filled with middlemen

00:21:44 because artists are not good at business.

00:21:47 And from the beginning, like Frank Sinatra,

00:21:50 it’s all mob stuff.

00:21:51 Like it’s the music industry is run by business people,

00:21:56 not the artists and the artists really get very small cuts

00:21:59 of like what they make.

00:22:00 And so I think part of the reason I’m a technocrat,

00:22:04 which I mean, your fans are gonna be technocrats.

00:22:07 So no one’s, they’re not gonna be mad at me about this,

00:22:09 but like my fans hate it when I say this kind of thing

00:22:12 or the general public.

00:22:13 They don’t like technocrats.

00:22:14 They don’t like technocrats.

00:22:15 Like when I watched Battle Angel Alita

00:22:18 and they were like the Martian technocracy

00:22:20 and I was like, yeah, Martian technocracy.

00:22:22 And then they were like, and they’re evil.

00:22:23 And I was like, oh, okay.

00:22:25 I was like, cause Martian technocracy sounds sick to me.

00:22:28 Yeah, so your intuition is that technocrats

00:22:31 would create some kind of beautiful world.

00:22:34 For example, what my manager’s working on,

00:22:36 if you can create an app that removes the need for a lawyer

00:22:39 and then you could have smart contracts on the blockchain,

00:22:43 removes the need for like management

00:22:46 and organizing all this stuff,

00:22:48 like can read your stuff and explain it to you,

00:22:50 can collect your royalties, you know,

00:22:54 like then the small amounts,

00:22:56 the amount of money that you’re getting from Spotify

00:22:58 actually means a lot more and goes a lot farther.
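
As a toy illustration of the kind of automated split logic being described (not the actual app or any real smart-contract platform; the payees and percentages are hypothetical):

    #include <stdio.h>

    /* One payee on a song: a name and their fraction of gross royalties. */
    struct split {
        const char *payee;
        double share; /* fractions across all payees should sum to 1.0 */
    };

    int main(void) {
        /* hypothetical split once the middlemen are automated away */
        struct split splits[] = {
            {"artist",   0.85},
            {"producer", 0.10},
            {"platform", 0.05},
        };
        const double gross = 1000.00; /* royalties collected this period, USD */

        for (size_t i = 0; i < sizeof splits / sizeof splits[0]; i++)
            printf("%-8s gets $%.2f\n", splits[i].payee, gross * splits[i].share);
        return 0;
    }

The point is just that a split is a data structure plus arithmetic; once the contract terms are machine-readable, distribution can be a loop instead of a chain of intermediaries.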

00:23:01 It can remove some of the bureaucracy,

00:23:03 some of the inefficiencies that make life

00:23:06 not as great as it could be.

00:23:08 Yeah, I think the issue isn’t that there’s not enough.

00:23:10 Like the issue is that there’s inefficiency

00:23:12 and I’m really into this positive sum mindset,

00:23:18 you know, the win, win mindset of like,

00:23:20 instead of, you know, fighting over the scraps,

00:23:23 how do we make the, or worrying about scarcity,

00:23:26 like instead of a scarcity mindset,

00:23:27 why don’t we just increase the efficiency

00:23:30 and, you know, in that way.

00:23:32 Expand the size of the pie.

00:23:34 Let me ask you about experimentation.

00:23:36 So you said, which is beautiful,

00:23:40 being a musician is like having a conversation

00:23:42 with all those that came before you.

00:23:45 How much of creating music is like

00:23:51 kind of having that conversation,

00:23:53 trying to fit into the cultural trends

00:23:57 and how much of it is like trying to,

00:23:59 as much as possible, be an outsider

00:24:00 and come up with something totally new.

00:24:02 It’s like when you’re thinking,

00:24:04 when you’re experimenting,

00:24:05 are you trying to be totally different, totally weird?

00:24:08 Are you trying to fit in?

00:24:12 Man, this is so hard because I feel like I’m

00:24:14 kind of in the process of semi retiring from music,

00:24:16 so this is like my old brain.

00:24:18 Yeah, bring it from like the shelf,

00:24:22 put it on the table for a couple minutes,

00:24:24 we’ll just poke it.

00:24:26 I think it’s a bit of both

00:24:27 because I think forcing yourself to engage with new music

00:24:32 is really great for neuroplasticity.

00:24:35 Like I think, you know, as people,

00:24:39 part of the reason music is marketed at young people

00:24:41 is because young people are very neuroplastic.

00:24:43 So like if you’re 16 to like 23 or whatever,

00:24:48 it’s gonna be really easy for you to love new music.

00:24:50 And if you’re older than that,

00:24:52 it gets harder and harder and harder.

00:24:53 And I think one of the beautiful things

00:24:54 about being a musician is I just constantly force myself

00:24:57 to listen to new music

00:24:58 and I think it keeps my brain really plastic.

00:25:01 And I think this is a really good exercise.

00:25:02 I just think everyone should do this.

00:25:04 If you listen to new music and you hate it,

00:25:05 I think you should just keep, force yourself to like,

00:25:08 okay, well why do people like it?

00:25:09 And like, you know, make your brain form new neural pathways

00:25:14 and be more open to change.

00:25:16 That’s really brilliant actually.

00:25:18 Sorry to interrupt, but like that exercise

00:25:21 is really amazing to sort of embrace change,

00:25:27 and sort of practice neuroplasticity.

00:25:31 Because like that’s one of the things,

00:25:33 you fall in love with a certain band

00:25:34 and you just kind of stay with that for the rest of your life

00:25:36 and then you never understand the modern music.

00:25:38 That’s a really good exercise.

00:25:39 Most of the streaming on Spotify

00:25:40 is like classic rock and stuff.

00:25:42 Like new music makes up a very small chunk

00:25:44 of what is played on Spotify.

00:25:46 And I think this is like not a good sign for us as a species.

00:25:50 I think, yeah.

00:25:52 So a good measure of the species’ open-mindedness

00:25:57 to change is how often you listen to new music.

00:26:01 The brain, let’s put the music brain back on the shelf.

00:26:05 I gotta pull out the futurist brain for a second.

00:26:09 In what wild ways do you think the future,

00:26:12 say in like 30 years, maybe 50 years,

00:26:14 maybe a hundred years will be different

00:26:19 from our current way of life on earth?

00:26:22 We can talk about augmented reality, virtual reality,

00:26:25 maybe robots, maybe space travel, maybe video games,

00:26:30 maybe genetic engineering.

00:26:32 I can keep going.

00:26:33 Cyborgs, aliens, world wars,

00:26:36 maybe destructive nuclear wars, good and bad.

00:26:39 When you think about the future, what are you imagining?

00:26:43 What’s the weirdest and the wildest it could be?

00:26:47 Have you read Surface Detail by Iain Banks?

00:26:51 Surface Detail is my favorite depiction of a,

00:26:54 oh wow, you have to read this book.

00:26:56 It’s literally the greatest science fiction book

00:26:58 possibly ever written.

00:26:59 Iain Banks is the man, yeah, for sure.

00:27:01 What have you read?

00:27:03 Just the Player of Games.

00:27:04 I read that titles can’t be copyrighted

00:27:07 so you can just steal them.

00:27:08 And I was like, Player of Games, sick.

00:27:09 Nice.

00:27:10 Yeah, so you can name your album.

00:27:12 Like I always wanted to.

00:27:13 Romeo and Juliet or something.

00:27:15 I always wanted to name an album War and Peace.

00:27:17 Nice.

00:27:17 Like that would be, like you.

00:27:18 That is a good, that’s a good,

00:27:20 where have I heard that before?

00:27:21 You can do that, like you could do that.

00:27:24 Also things that are in the public domain.

00:27:26 For people who have no clue,

00:27:27 you do have a song called Player of Games.

00:27:29 Yes, oh yeah.

00:27:30 So Iain Banks, Surface Detail is in my opinion

00:27:33 the best future that I’ve ever read about

00:27:37 or heard about in science fiction.

00:27:39 Basically there’s the relationship with super intelligence,

00:27:44 like artificial super intelligence is just, it’s like great.

00:27:50 I want to credit the person who coined this term

00:27:53 because I love this term.

00:27:55 And I feel like young women don’t get enough credit in.

00:28:00 Yeah, so if you go to Protopia Futures on Instagram,

00:28:03 what is her name?


00:28:08 Monika Bielskyte, I’m saying that wrong.

00:28:15 And I’m probably gonna, I’m probably butchering this a bit,

00:28:17 but Protopia is sort of, if utopia is unattainable,

00:28:21 Protopia is sort of like, you know.

00:28:26 Wow, that’s an awesome Instagram, Protopia Futures.

00:28:28 A great, a future that is, you know, as good as we can get.

00:28:33 The future, positive future.

00:28:34 AI, is this a centralized AI in Surface Detail

00:28:38 or is it distributed?

00:28:39 What kind of AI is it?

00:28:40 They mostly exist as giant super ships,

00:28:42 like sort of like the guild ships in Dune.

00:28:45 Like they’re these giant ships that kind of move people

00:28:47 around and the ships are sentient

00:28:49 and they can talk to all the passengers.

00:28:52 And I mean, there’s a lot of different types of AI

00:28:56 in the Banksian future,

00:28:58 but in the opening scene of Surface Detail,

00:29:01 there’s this place called the Culture

00:29:02 and the Culture is basically a Protopian future.

00:29:04 And a Protopian future, I think,

00:29:07 is like a future that is like,

00:29:09 obviously it’s not utopia, it’s not perfect.

00:29:12 And like, cause like striving for utopia,

00:29:14 I think feels hopeless and it’s sort of like,

00:29:16 maybe not the best terminology to be using.

00:29:20 So it’s like, it’s a pretty good place.

00:29:23 Like mostly like, you know,

00:29:27 super intelligence and biological beings

00:29:29 exist fairly in harmony.

00:29:31 There’s not too much war.

00:29:33 There’s like as close to equality as you can get,

00:29:35 you know, it’s like approximately a good future.

00:29:38 Like there’s really awesome stuff.

00:29:40 It’s, and in the opening scene,

00:29:45 this girl, she’s born as a sex slave outside of the Culture.

00:29:49 So she’s in a society that doesn’t adhere

00:29:51 to the Culture’s values.

00:29:52 She tries to kill the guy who is her like master,

00:29:56 but he kills her, but unbeknownst to her,

00:30:01 when she was traveling on a ship through the Culture

00:30:01 with him one day, a ship put a neural lace

00:30:05 in her head and neural lace is sort of like,

00:30:08 it’s basically a Neuralink because life imitates art.

00:30:13 It does indeed.

00:30:14 It does indeed.

00:30:15 So she wakes up and the opening scene is her memory

00:30:17 has been uploaded by this neural lace

00:30:20 when she was killed.

00:30:20 And now she gets to choose a new body.

00:30:22 And this AI is interfacing with her recorded memory

00:30:26 in her neural lace and helping her and being like,

00:30:29 hello, you’re dead.

00:30:31 But because you had a neural lace, your memory’s uploaded.

00:30:33 Do you want to choose a new body?

00:30:34 And you’re going to be born here in the Culture

00:30:36 and like start a new life, which is just,

00:30:38 that’s like the opening.

00:30:39 It’s like so sick.

00:30:41 And the ship is the super intelligence.

00:30:43 All the ships are kind of super intelligence.

00:30:45 But they still want to preserve a kind of rich,

00:30:47 fulfilling experience for the humans.

00:30:49 Yeah, like they’re like friends with the humans.

00:30:51 And then there’s a bunch of ships that don’t want to exist

00:30:53 with biological beings, but they just have their own place

00:30:56 like way over there.

00:30:57 But they don’t, they just do their own thing.

00:30:58 They’re not necessarily.

00:31:00 So it’s a pretty, this protopian existence is pretty peaceful.

00:31:03 Yeah, I mean, and then, for example,

00:31:05 one of the main fights in the book is, they’re fighting over

00:31:10 these artificial hells,

00:31:13 and people don’t think it’s ethical to have artificial hells.

00:31:17 Like basically when people do crime, they get sent,

00:31:19 like when they die, their memory gets sent

00:31:21 to an artificial hell and they’re eternally tortured.

00:31:23 And so, and then the way that society is deciding

00:31:27 whether or not to have the artificial hell

00:31:29 is that they’re having these simulated,

00:31:31 they’re having like a simulated war.

00:31:33 So instead of actual blood, you know,

00:31:36 people are basically essentially fighting in a video game

00:31:38 to choose the outcome of this.

00:31:40 But they’re still experiencing the suffering

00:31:42 in this artificial hell or no?

00:31:44 Can you experience stuff or?

00:31:45 So the artificial hell sucks.

00:31:47 And a lot of people in the Culture want to get rid

00:31:48 of the artificial hell.

00:31:49 The simulated wars,

00:31:51 are they happening in the artificial hell?

00:31:53 So no, the simulated wars are happening

00:31:55 outside of the artificial hell,

00:31:57 between the political factions who are,

00:31:59 so this political faction says we should have simulated hell

00:32:02 to deter crime.

00:32:05 And this political faction is saying,

00:32:06 no, simulated hell is unethical.

00:32:08 And so instead of like having, you know,

00:32:11 blowing each other up with nukes,

00:32:13 they’re having like a giant Fortnite battle

00:32:17 to decide this, which, you know, to me that’s protopia.

00:32:21 That’s like, okay, we can have war without death.

00:32:25 You know, I don’t think there should be simulated hells.

00:32:27 I think that is definitely one of the ways

00:32:29 in which technology could go very, very, very, very wrong.

00:32:34 So almost punishing people in a digital space

00:32:37 or something like that.

00:32:37 Yeah, like torturing people’s memories.

00:32:41 So either as a deterrent, like if you committed a crime,

00:32:44 but also just for personal pleasure,

00:32:46 if there’s some sick, demented humans in this world.

00:32:50 Dan Carlin actually has this

00:32:55 episode of Hardcore History on Painfotainment.

00:32:59 Oh, that episode is fucked.

00:33:02 It’s dark, because he kind of goes through human history

00:33:05 and says like, we as humans seem to enjoy,

00:33:08 secretly enjoy, or used to openly enjoy,

00:33:12 sort of the torture and the death,

00:33:14 watching the death and torture of other humans.

00:33:17 I do think if people were consenting,

00:33:21 we should be allowed to have gladiatorial matches.

00:33:26 But consent is hard to achieve in those situations.

00:33:28 It always starts getting slippery.

00:33:31 Like it could be also forced consent,

00:33:32 like it starts getting weird.

00:33:35 There’s way too much excitement.

00:33:37 Like this is what he highlights.

00:33:38 There’s something about human nature

00:33:40 that wants to see that violence.

00:33:42 And it’s really dark.

00:33:44 And you hope that we can sort of overcome

00:33:47 that aspect of human nature,

00:33:48 but that’s still within us somewhere.

00:33:51 Well, I think that’s what we’re doing right now.

00:33:53 I have this theory that what is very important

00:33:56 about the current moment is that all of evolution

00:34:00 has been survival of the fittest up until now.

00:34:03 And at some point, the lines are kind of fuzzy,

00:34:07 but in the recent past, or maybe even just right now,

00:34:12 we’re getting to this point

00:34:14 where we can choose intelligent design.

00:34:19 Like we probably since like the integration of the iPhone,

00:34:23 like we are becoming cyborgs.

00:34:24 Like our brains are fundamentally changed.

00:34:27 Everyone who grew up with electronics,

00:34:29 we are fundamentally different from previous,

00:34:32 from homo sapiens.

00:34:33 I call us homo techno.

00:34:34 I think we have evolved into homo techno,

00:34:36 which is like essentially a new species.

00:34:39 Like if you look at the way,

00:34:41 if you took an MRI of my brain

00:34:43 and you took an MRI of like a medieval brain,

00:34:46 I think it would be very different

00:34:48 the way that it has evolved.

00:34:49 Do you think when historians look back at this time,

00:34:52 they’ll see like this was a fundamental shift

00:34:54 to what a human being is?

00:34:54 I think, I do not think we are still homo sapiens.

00:34:58 I believe we are homo techno.

00:34:59 And I think we have evolved.

00:35:04 And I think right now, the way we are evolving,

00:35:07 we can choose how we do that.

00:35:09 And I think we are being very reckless

00:35:10 about how we’re doing that.

00:35:11 Like we’re just having social media,

00:35:13 but I think this idea that like this is a time

00:35:16 to choose intelligent design should be taken very seriously.

00:35:19 It like now is the moment to reprogram the human computer.

00:35:24 It’s like, if you go blind,

00:35:27 your visual cortex will get taken over

00:35:29 by other functions.

00:35:32 We can choose our own evolution.

00:35:35 We can change the way our brains work.

00:35:37 And so we actually have a huge responsibility to do that.

00:35:39 And I think I’m not sure who should be responsible for that,

00:35:42 but there’s definitely not adequate education.

00:35:45 We’re being inundated with all this technology

00:35:46 that is fundamentally changing

00:35:49 the physical structure of our brains.

00:35:50 And we are not adequately responding to that

00:35:55 to choose how we wanna evolve.

00:35:57 And we could evolve, we could be really whatever we want.

00:36:00 And I think this is a really important time.

00:36:03 And I think if we choose correctly and we choose wisely,

00:36:06 consciousness could exist for a very long time

00:36:09 and integration with AI could be extremely positive.

00:36:12 And I don’t think enough people are focusing

00:36:14 on this specific situation.

00:36:16 Do you think we might irreversibly screw things up

00:36:19 if we get things wrong now?

00:36:20 Because the flip side of that,

00:36:21 it seems humans are pretty adaptive.

00:36:23 So maybe the way we figure things out

00:36:25 is by screwing it up, like social media.

00:36:28 Over a generation, we’ll see the negative effects

00:36:30 of social media, and then we build new social medias,

00:36:33 and we just keep improving stuff.

00:36:34 And then we learn from the failures of the past.

00:36:37 Because humans seem to be really adaptive.

00:36:39 On the flip side, we can get it wrong in a way

00:36:43 where literally we create weapons of war

00:36:46 or increase hate.

00:36:48 Past a certain threshold, we really do a lot of damage.

00:36:51 I mean, I think we’re optimized

00:36:53 to notice the negative things.

00:36:55 But I would actually say one of the things

00:37:00 that I think people aren’t noticing

00:37:02 is if you look at Silicon Valley

00:37:03 and you look at the technocracy,

00:37:06 like what’s been happening there.

00:37:09 When Silicon Valley started, it was all just Facebook

00:37:11 and all this for profit crap

00:37:14 that really wasn’t particularly useful.

00:37:16 I guess it was useful, but it’s sort of just whatever.

00:37:22 But now you see lab-grown meat, compostable

00:37:26 or biodegradable single-use cutlery,

00:37:30 or meditation apps.

00:37:33 I think we are actually evolving and changing,

00:37:38 and technology is changing.

00:37:39 I think maybe there just isn’t

00:37:43 quite enough education about this.

00:37:48 And also, I don’t know if there’s quite enough incentive

00:37:52 for it because I think the way capitalism works,

00:37:56 what we define as profit,

00:37:58 we’re also working on an old model

00:38:00 of what we define as profit.

00:38:01 I really think if we changed the idea of profit

00:38:06 to include social good,

00:38:08 you can have economic profit,

00:38:09 social good also counting as profit

00:38:12 would incentivize things that are more useful

00:38:15 and more whatever spiritual technology

00:38:16 or positive technology or things that help reprogram

00:38:21 a human computer in a good way

00:38:22 or things that help us intelligently design our new brains.

00:38:28 Yeah, there’s no reason why within the framework

00:38:30 of capitalism, the word profit or the idea of profit

00:38:33 can’t also incorporate the well being of a human being.

00:38:37 So like long term well being, long term happiness.

00:38:41 Or even for example, we were talking about motherhood,

00:38:43 like part of the reason I’m so late

00:38:44 is because I had to get the baby to bed.

00:38:47 And it’s like, I keep thinking about motherhood,

00:38:48 how under capitalism, it’s like this extremely essential job

00:38:53 that is very difficult that is not compensated.

00:38:56 And we sort of like value things

00:38:58 by how much we compensate them.

00:39:01 And so we really devalue motherhood in our society

00:39:04 and pretty much all societies.

00:39:06 Like capitalism does not recognize motherhood.

00:39:08 It’s just a job that you’re supposed to do for free.

00:39:11 And it’s like, but I feel like producing great humans

00:39:15 should be seen as profit under capitalism.

00:39:19 Like that should be, that’s like a huge social good.

00:39:21 Like every awesome human that gets made

00:39:24 adds so much to the world.

00:39:25 So like if that was integrated into the profit structure,

00:39:29 then, you know, and if we potentially found a way

00:39:34 to compensate motherhood.

00:39:35 So come up with a compensation

00:39:37 that’s much broader than just money or.

00:39:40 Or it could just be money.

00:39:42 Like, what if you just made, I don’t know,

00:39:45 but I don’t know how you’d pay for that.

00:39:46 Like, I mean, that’s where you start getting into.

00:39:52 Reallocation of resources that people get upset over.

00:39:56 Well, like what if we made like a motherhood DAO?

00:39:58 Yeah, yeah.

00:40:01 And, you know, used it to fund like single mothers,

00:40:07 like, you know, pay for making babies.

00:40:13 So, I mean, if you create and put beautiful things

00:40:17 onto the world, that could be companies,

00:40:19 that can be bridges, that could be art,

00:40:22 that could be a lot of things,

00:40:24 and that could be children, which are.

00:40:28 Or education or.

00:40:29 Anything, that should be valued by society,

00:40:32 and that should be somehow incorporated into the framework

00:40:35 of what, as a market, of what.

00:40:38 Like, if you contribute children to this world,

00:40:40 that should be valued and respected

00:40:42 and sort of celebrated, like, proportional to what it is,

00:40:48 which is, it’s the thing that fuels human civilization.

00:40:51 Yeah, like I. It’s kind of important.

00:40:53 I feel like everyone’s always saying,

00:40:54 I mean, I think we’re in very different social spheres,

00:40:56 but everyone’s always saying, like, dismantle capitalism.

00:40:58 And I’m like, well, okay, well,

00:40:59 I don’t think the government should own everything.

00:41:02 Like, I don’t think we should not have private ownership.

00:41:04 Like, that’s scary.

00:41:05 You know, like that starts getting into weird stuff

00:41:07 and just sort of like,

00:41:08 I feel there’s almost no way to do that

00:41:10 without a police state, you know?

00:41:13 But obviously, capitalism has some major flaws.

00:41:16 And I think actually Mac showed me this idea

00:41:20 called social capitalism, which is a form of capitalism

00:41:23 that just like considers social good to be also profit.

00:41:28 Like, you know, it’s like, right now companies need to,

00:41:31 like, you’re supposed to grow every quarter or whatever

00:41:34 to like show that you’re functioning well,

00:41:38 but it’s like, okay, well,

00:41:39 what if you kept the same amount of profit?

00:41:42 You’re still in the green,

00:41:43 but then you have also all this social good.

00:41:45 Like, do you really need all this extra economic growth

00:41:47 or could you add this social good and that counts?

00:41:49 And, you know, I don’t know if, I am not an economist.

00:41:54 I have no idea how this could be achieved, but.

00:41:56 I don’t think economists know how anything

00:41:58 could be achieved either, but they pretend.

00:42:00 It’s the thing, they construct a model

00:42:02 and they go on TV shows and sound like an expert.

00:42:06 That’s the definition of economist.

00:42:08 How did being a mother, becoming a mother

00:42:12 change you as a human being, would you say?

00:42:16 Man, I think it kind of changed everything

00:42:21 and it’s still changing me a lot.

00:42:22 It’s actually changing me more right now in this moment

00:42:24 than it was before.

00:42:26 Like today, like this?

00:42:28 Just like in the most recent months and stuff.

00:42:33 Can you elucidate that change?

00:42:37 Like when you wake up in the morning

00:42:39 and you look at yourself, it’s again: who are you?

00:42:42 How have you become different, would you say?

00:42:45 I think it’s just really reorienting my priorities.

00:42:50 And at first I was really fighting against that

00:42:53 because I somehow felt it was like a failure

00:42:55 of feminism or something.

00:42:56 Like I felt like it was like bad

00:42:59 if like my kids started mattering more than my work.

00:43:05 And then like more recently I started sort of analyzing

00:43:08 that thought in myself and being like,

00:43:12 that’s also kind of a construct.

00:43:13 It’s like, we’ve just devalued motherhood so much

00:43:16 in our culture that like, I feel guilty for caring

00:43:21 about my kids more than I care about my work.

00:43:23 So feminism includes breaking out

00:43:25 of whatever the construct is.

00:43:28 So just continually breaking,

00:43:30 it’s like freedom, empowering you to be free.

00:43:34 And that means…

00:43:37 But it also, but like being a mother,

00:43:40 like I’m so much more creative.

00:43:41 Like I cannot believe the massive amount

00:43:45 of brain growth that I am experiencing.

00:43:48 Why do you think that is?

00:43:49 Just cause like the stakes are higher somehow?

00:43:51 I think it’s like, it’s just so trippy

00:43:58 watching consciousness emerge.

00:44:00 It’s just like, it’s like going on a crazy journey

00:44:06 or something.

00:44:07 It’s like the craziest science fiction novel

00:44:10 you could ever read.

00:44:11 It’s just so crazy watching consciousness come into being.

00:44:15 And then at the same time,

00:44:16 like you’re forced to value your time so much.

00:44:21 Like when I have creative time now, it’s so sacred.

00:44:23 I need to like be really fricking on it.

00:44:29 But the other thing is that I used to just be like a cynic

00:44:34 and I used to just wanna…

00:44:35 Like my last album was called Miss Anthropocene

00:44:38 and it was like this like, it was like a study in villainy

00:44:42 or like it was like, well, what if we have,

00:44:44 instead of the old gods, we have like new gods

00:44:46 and it’s like, Miss Anthropocene is like misanthrope

00:44:49 plus Anthropocene, which is like the, you know,

00:44:53 and she’s the goddess of climate change or whatever.

00:44:55 And she’s like destroying the world.

00:44:56 And it was just like, it was like dark

00:44:59 and it was like a study in villainy.

00:45:01 And it was sort of just like,

00:45:02 like I used to like have no problem just making cynical,

00:45:06 angry, scary art.

00:45:08 And not that there’s anything wrong with that,

00:45:11 but I think having kids just makes you such an optimist.

00:45:13 It just inherently makes you wanna be an optimist so bad

00:45:16 that like I feel more responsibility

00:45:20 to make more optimistic things.

00:45:23 And I get a lot of shit for it

00:45:25 because everyone’s like, oh, you’re so privileged.

00:45:28 Stop talking about like pie in the sky,

00:45:30 stupid concepts and focus on like the now.

00:45:32 But it’s like, I think if we don’t ideate

00:45:36 about futures that could be good,

00:45:40 we won’t be able to get them.

00:45:41 If everything is Blade Runner,

00:45:42 then we’re gonna end up with Blade Runner.

00:45:44 It’s like, as we said earlier, life imitates art.

00:45:47 Like life really does imitate art.

00:45:49 And so we really need more protopian or utopian art.

00:45:53 I think this is incredibly essential

00:45:56 for the future of humanity.

00:45:58 And I think the current discourse

00:46:00 where thinking about protopia or utopia

00:46:09 is seen as a dismissal of the problems

00:46:11 that we currently have.

00:46:12 I think that is an incorrect mindset.

00:46:16 And like having kids just makes me wanna imagine

00:46:20 amazing futures that like maybe I won’t be able to build,

00:46:24 but they will be able to build if they want to.

00:46:26 Yeah, it does seem like ideation

00:46:28 is a precursor to creation.

00:46:30 So you have to imagine it in order to be able to build it.

00:46:33 And there is a sad thing about human nature

00:46:36 that somehow a cynical view of the world

00:46:40 is seen as an insightful view.

00:46:44 You know, cynicism is often confused for insight,

00:46:46 which is sad to see.

00:46:48 And optimism is confused for naivete.

00:46:52 Yes, yes.

00:46:53 Like you don’t, you’re blinded by your,

00:46:57 maybe your privilege or whatever.

00:46:59 You’re blinded by something, but you’re certainly blinded.

00:47:02 That’s sad, that’s sad to see

00:47:04 because it seems like the optimists

00:47:06 are the ones that create our future.

00:47:10 They’re the ones that build.

00:47:11 In order to build the crazy thing,

00:47:13 you have to be optimistic.

00:47:14 You have to be either stupid or excited or passionate

00:47:19 or mad enough to actually believe that it can be built.

00:47:22 And those are the people that built it.

00:47:24 My favorite quote of all time is from Star Wars, Episode 8,

00:47:29 which I know everyone hates.

00:47:30 Do you like Star Wars, Episode 8?

00:47:32 No, yeah, I would say I would probably hate it, yeah.

00:47:36 I don’t have strong feelings about it.

00:47:38 Let me backtrack.

00:47:39 I don’t have strong feelings about Star Wars.

00:47:41 I’m a Tolkien person.

00:47:43 I’m more into dragons and orcs and ogres.

00:47:47 Yeah, I mean, Tolkien forever.

00:47:49 I really want to have one more son and call him,

00:47:51 I thought Tau Techno Tolkien would be cool.

00:47:55 It’s a lot of T’s, I like it.

00:47:56 Yeah, and well, and tau is 6.28, two pi.

00:47:59 Yeah, Tau Techno, yeah, yeah, yeah.
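
(For reference, the arithmetic here checks out: tau is the circle constant, \( \tau = 2\pi \approx 6.2832 \).)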

00:48:01 And then techno is obviously the best genre of music,

00:48:04 but also like technocracy.

00:48:06 It just sounds really good.

00:48:07 Yeah, that’s right, and Techno Tolkien, Tau Techno Tolkien.

00:48:11 That’s a good, that’s it.

00:48:12 Tau Techno Tolkien, but Star Wars, Episode 8,

00:48:15 I know a lot of people have issues with it.

00:48:17 Personally, on the record,

00:48:18 I think it’s the best Star Wars film.

00:48:24 You’re starting trouble today.

00:48:25 Yeah, but don’t kill what you hate, save what you love.

00:48:29 Don’t kill what you hate.

00:48:30 Don’t kill what you hate, save what you love.

00:48:32 And I think we’re, in society right now,

00:48:34 we’re in a diagnosis mode.

00:48:36 We’re just diagnosing and diagnosing and diagnosing,

00:48:39 and we’re trying to kill what we hate,

00:48:41 and we’re not trying to save what we love enough.

00:48:44 And there’s this Buckminster Fuller quote,

00:48:46 which I’m gonna butcher,

00:48:47 because I don’t remember it correctly,

00:48:48 but it’s something along the lines of,

00:48:52 don’t try to destroy the old bad models,

00:48:58 render them obsolete with better models.

00:49:03 Maybe we don’t need to destroy the oil industry.

00:49:05 Maybe we just create a great new battery technology

00:49:08 and sustainable transport,

00:49:10 and just make it economically unreasonable

00:49:13 to still continue to rely on fossil fuels.

00:49:17 It’s like, don’t kill what you hate, save what you love.

00:49:20 Make new things and just render the old things unusable.

00:49:24 It’s like, the college debt is so bad,

00:49:29 and universities are so expensive,

00:49:31 and I feel like education is becoming obsolete.

00:49:35 I feel like we could completely revolutionize education,

00:49:38 and we could make it free.

00:49:39 And it’s like, you look at JSTOR,

00:49:40 and you have to pay to get all the studies and everything.

00:49:43 What if we created a DAO that bought JSTOR,

00:49:46 or we created a DAO that was funding studies,

00:49:48 and those studies were open source, or free for everyone.

00:49:51 And what if we just open sourced education

00:49:55 and decentralized education and made it free,

00:49:56 and all research was on the internet,

00:50:00 and all the outcomes of studies were on the internet,

00:50:05 and no one has student debt,

00:50:10 and you just take tests when you apply for a job,

00:50:14 and if you’re qualified, then you can work there.

00:50:18 This is just like, I don’t know how anything works.

00:50:20 I’m just randomly ranting, but.

00:50:22 I like the humility.

00:50:24 You gotta think from just basic first principles.

00:50:27 What is the problem?

00:50:28 What’s broken?

00:50:29 What are some ideas?

00:50:30 That’s it.

00:50:31 And get excited about those ideas,

00:50:33 and share your excitement,

00:50:34 and don’t tear each other down.

00:50:37 It’s just when you kill things,

00:50:38 you often end up killing yourself.

00:52:40 Like war is not one sided,

00:50:43 like you’re not gonna go in and just kill them,

00:50:45 like you’re gonna get stabbed.

00:50:46 It’s like, and I think when I talk about this nexus point

00:52:50 that we’re at this point in society

00:50:53 where we’re switching to intelligent design,

00:50:55 I think part of our switch to intelligent design

00:50:57 is that we need to choose nonviolence.

00:50:59 And we need to, like, I think we can choose to start,

00:51:04 I don’t think we can eradicate violence from our species,

00:51:07 because I think we need it a little bit,

00:51:10 but I think we can choose

00:51:12 to really reorient our primitive brains

00:51:14 that are fighting over scarcity,

00:51:16 and that are so attack oriented,

00:51:20 and move into optimizing for creativity and building.

00:51:25 Yeah, it’s interesting to think how that happens,

00:51:27 so some of it is just education,

00:51:29 some of it is living life and introspecting your own mind,

00:51:34 and trying to live up to the better angels of your nature

00:51:37 for each one of us, all those kinds of things at scale.

00:51:41 That’s how we can sort of start to minimize

00:51:44 the amount of destructive war in our world,

00:51:48 and to me, and probably for you too,

00:51:51 technology is a really promising way to do that.

00:51:55 Like, social media should be a really promising way

00:51:57 to do that, it’s a way we connect.

00:52:00 I, you know, for the most part,

00:52:01 I really enjoy social media.

00:52:03 I just know all the negative stuff.

00:52:05 I don’t engage with any of the negative stuff.

00:52:07 Just not even, like, by blocking

00:52:10 or any of that kind of stuff,

00:52:11 but just not letting it enter my mind.

00:52:14 Like, just, like, when somebody says something negative,

00:52:18 I see it, I immediately think positive thoughts about them,

00:52:23 and I just forget they exist after that.

00:52:26 Just move on, because, like, that negative energy,

00:52:28 if I return the negative energy,

00:52:30 they’re going to get excited in a negative way right back,

00:52:34 and it’s just this kind of vicious cycle.

00:52:38 But you would think technology would assist us

00:52:40 in this process of letting go,

00:52:42 of not taking things personally,

00:52:44 of not engaging in the negativity,

00:52:46 but unfortunately, social media profits from the negativity,

00:52:50 so the current models.

00:52:52 I mean, social media is like a gun.

00:52:53 Like, you should take a course before you use it.

00:52:57 Like, it’s like, this is what I mean,

00:52:59 like, when I say reprogram the human computer.

00:53:01 Like, in school, you should learn

00:53:03 about how social media optimizes

00:53:05 to, you know, raise your cortisol levels

00:53:07 and make you angry and crazy and stressed,

00:53:09 and, like, you should learn how to have hygiene

00:53:12 about how you use social media.

00:53:16 But, so you can, yeah,

00:53:18 choose not to focus on the negative stuff, but I don’t know.

00:53:22 I’m not sure social media should,

00:53:24 I guess it should exist.

00:53:25 I’m not sure.

00:53:27 I mean, we’re in the messy, experimental phase.

00:53:29 Like, we’re working it out.

00:53:30 Yeah, it’s the early days.

00:53:31 I don’t even know, when you say social media,

00:53:32 I don’t know what that even means.

00:53:33 We’re in the very early days.

00:53:35 I think social media is just basic human connection

00:53:37 in the digital realm, and that, I think it should exist,

00:53:41 but there’s so many ways to do it in a bad way.

00:53:43 There’s so many ways to do it in a good way.

00:53:45 There’s all discussions of all the same human rights.

00:53:48 We talk about freedom of speech.

00:53:49 We talk about sort of violence

00:53:52 in the space of digital media.

00:53:54 We talk about hate speech.

00:53:56 We talk about all these things that we had to figure out

00:53:59 back in the day in the physical space.

00:54:01 We’re now figuring out in the digital space,

00:54:03 and it’s like baby stages.

00:54:06 When the printing press came out,

00:54:07 it was like pure chaos for a minute, you know?

00:54:10 It’s like when you inject,

00:54:12 when there’s a massive information injection

00:54:14 into the general population, there’s just gonna be,

00:54:20 I feel like the printing press, I don’t have the years,

00:54:23 but it was like printing press came out,

00:54:24 shit got really fucking bad for a minute,

00:54:27 but then we got the Enlightenment.

00:54:29 And so it’s like, I think we’re in,

00:54:30 this is like the second coming of the printing press.

00:54:34 We’re probably gonna have some shitty times for a minute,

00:54:37 and then we’re gonna recalibrate

00:54:40 to have a better understanding of how we consume media

00:54:44 and how we deliver media.

00:54:47 Speaking of programming the human computer,

00:54:50 you mentioned Baby X.

00:54:52 So there’s this young consciousness coming to be,

00:54:56 came from a cell.

00:54:58 Like that whole thing doesn’t even make sense.

00:55:01 It came from DNA.

00:55:02 Yeah.

00:55:03 And then there’s this baby computer

00:55:04 that just like grows and grows and grows and grows,

00:55:06 and now there’s a conscious being

00:55:08 with extremely impressive cognitive capabilities with,

00:55:13 Have you met him?

00:55:14 Yes, yeah.

00:55:15 Yeah.

00:55:16 He’s actually really smart.

00:55:17 He’s really smart.

00:55:17 Yeah.

00:55:18 He’s weird.

00:55:19 Yeah.

00:55:20 For a baby.

00:55:21 He does.

00:55:22 I don’t, I haven’t.

00:55:22 I don’t know a lot of other babies,

00:55:23 but he seems to be smart.

00:55:24 Yeah, I don’t hang out with babies often,

00:55:25 but this baby was very impressive.

00:55:26 He does a lot of pranks and stuff.

00:55:28 Oh, so he’s like.

00:55:29 Like he’ll like give you a treat

00:55:31 and then take it away and laugh and like stuff like that.

00:55:33 So he’s like a chess player.

00:55:35 So here’s a cognitive sort of,

00:55:39 there’s a computer being programmed.

00:55:41 So he’s taking in the environment,

00:55:43 interacting with a specific set of humans.

00:55:45 How would you, first of all, what is it?

00:55:48 What, let me ask.

00:55:50 I want to ask how do you program this computer?

00:55:53 And also how do you make sense of that

00:55:55 there’s a conscious being right there

00:55:58 that wasn’t there before?

00:55:59 It’s giving me a lot of crisis thoughts.

00:56:01 I’m thinking really hard.

00:56:02 I think that’s part of the reason

00:56:03 it’s like, I’m struggling to focus on

00:56:06 art and stuff right now.

00:56:07 Cause Baby X is becoming conscious

00:56:09 and like, it’s just reorienting my brain.

00:56:12 Like my brain is suddenly totally shifting of like,

00:56:14 oh shit, like the way we raise children.

00:56:18 Like, I hate all the baby books and everything.

00:56:21 I hate them.

00:56:22 Like they’re, oh, the art is so bad.

00:56:24 And like all this stuff, everything about all the aesthetics.

00:56:29 And like, I’m just like, ah, like this is so.

00:56:32 The programming languages we’re using

00:56:35 to program these baby computers isn’t good.

00:56:37 Yeah, like I’m thinking, and I,

00:56:39 not that I have like good answers

00:56:41 or know what to do, but I’m just thinking really,

00:56:46 really hard about it.

00:56:46 We recently watched Totoro with him, the Studio Ghibli film.

00:56:52 And it’s just like a fantastic film.

00:56:56 And he like responded to,

00:56:57 I know you’re not supposed to show babies screens too much,

00:56:59 but like, I think it’s the most sort of like,

00:57:04 I feel like it’s the highest art baby content.

00:57:06 Like it really speaks, there’s almost no talking in it.

00:57:12 It’s really simple.

00:57:13 And all the dialogue is super, super, super simple,

00:57:16 you know, and it’s like a one to three year old

00:57:19 can like really connect with it.

00:57:21 Like it feels like it’s almost aimed

00:57:22 at like a one to three year old,

00:57:24 but it’s like great art and it’s so imaginative

00:57:27 and it’s so beautiful.

00:57:28 And like the first time I showed it to him,

00:57:31 he was just like so invested in it,

00:57:33 unlike I’ve ever, unlike anything else

00:57:35 I’d ever shown him.

00:57:36 Like he was just like crying when they cry

00:57:38 and laughing when they laugh,

00:57:39 like just like having this roller coaster of like emotions,

00:57:42 like, and he learned a bunch of words.

00:57:44 Like he was, and he started saying Totoro

00:57:46 and started just saying all this stuff

00:57:48 after watching Totoro,

00:57:49 and he wants to watch it all the time.

00:57:52 And I was like, man, why isn’t there an industry of this?

00:57:55 Like why aren’t our best artists focusing on making art

00:57:59 like for the birth of consciousness?

00:58:04 Like, and that’s one of the things I’ve been thinking

00:58:07 I really wanna start doing.

00:58:08 You know, I don’t wanna speak before I do things too much,

00:58:10 but like, I’m just like ages one to three,

00:58:15 like we should be putting so much effort into that.

00:58:18 And the other thing about Totoro is it’s like,

00:58:21 it’s like better for the environment

00:58:22 because adults love Totoro.

00:58:23 It’s such good art that everyone loves it.

00:58:25 Like I still have all my old Totoro merch

00:58:27 from when I was a kid.

00:58:28 Like I literally have the most ragged old Totoro merch.

00:58:33 Like everybody loves it, everybody keeps it.

00:58:35 It’s like, why does the art we have for babies

00:58:40 need to suck and be not accessible to adults

00:58:45 and then just be thrown out when, you know,

00:58:49 they age out of it?

00:58:50 Like, it’s like, I don’t know.

00:58:53 I don’t have like a fully formed thought here,

00:58:54 but this is just something I’ve been thinking about a lot

00:58:56 is like, how do we like,

00:58:58 how do we have more Totoroesque content?

00:59:01 Like how do we have more content like this

00:59:02 that like is universal and everybody loves,

00:59:05 but is like really geared to an emerging consciousness?

00:59:10 Emerging consciousness in the first like three years of life,

00:59:13 when so much turmoil,

00:59:14 so much evolution of mind is happening.

00:59:16 It seems like a crucial time.

00:59:18 Would you say to make it not suck,

00:59:21 do you think of basically treating a child

00:59:26 like they have the capacity to have the brilliance

00:59:28 of an adult or even beyond that?

00:59:30 Is that how you think of that mind or?

00:59:33 No, cause they still,

00:59:35 they like it when you talk weird and stuff.

00:59:37 Like they respond better to,

00:59:39 cause even they can imitate better

00:59:41 when your voice is higher.

00:59:42 Like people say like, oh, don’t do baby talk.

00:59:44 But it’s like, when your voice is higher,

00:59:45 it’s closer to something they can imitate.

00:59:47 So they like, like the baby talk actually kind of works.

00:59:50 Like it helps them learn to communicate.

00:59:52 I’ve found it to be more effective

00:59:53 with learning words and stuff.

00:59:55 But like, you’re not speaking down to them.

00:59:59 Like do they have the capacity

01:00:03 to understand really difficult concepts

01:00:05 just in a very different way,

01:00:07 like an emotional intelligence

01:00:09 about something deep within?

01:00:11 Oh yeah, no, like if X hurts,

01:00:13 like if X bites me really hard and I’m like, ow,

01:00:15 like he gets, he’s sad.

01:00:17 He’s like sad if he hurts me by accident.

01:00:19 Yeah.

01:00:20 Which he’s huge, so he hurts me a lot by accident.

01:00:22 Yeah, that’s so interesting that that mind emerges

01:00:26 and he, and children in general, don’t really have memory of that time.

01:00:31 So we can’t even have a conversation with them about it.

01:00:32 Yeah, I just thank God they don’t have a memory

01:00:34 of this time because like, think about like,

01:00:37 I mean with our youngest baby,

01:00:39 like it’s like, I’m like, have you read

01:00:42 the sci-fi short story, I Have No Mouth, and I Must Scream?

01:00:46 Good title, no.

01:00:47 Oh man, I mean, you should read that.

01:00:49 I Have No Mouth, and I Must Scream.

01:00:53 I hate getting into this Roko’s Basilisk shit.

01:00:55 It’s kind of a story about the,

01:00:57 about like an AI that’s like torturing someone for eternity

01:01:03 and they have like no body.

01:01:05 The way they describe it,

01:01:07 it sort of sounds like what it feels like,

01:01:09 like being a baby, like you’re conscious

01:01:11 and you’re just getting inputs from everywhere

01:01:13 and you have no muscles and you’re like jelly

01:01:15 and you like can’t move and you try to like communicate,

01:01:17 but you can’t communicate and we’re,

01:01:18 and like, you’re just like in this like hell state.

01:01:22 I think it’s good we can’t remember that.

01:01:25 Like my little baby is just exiting that,

01:01:27 like she’s starting to like get muscles

01:01:29 and have more like autonomy,

01:01:30 but like watching her go through the opening phase,

01:01:34 I was like, I was like, this does not seem good.

01:01:37 Oh, you think it’s kind of like.

01:01:39 Like I think it sucks.

01:01:40 I think it might be really violent.

01:01:41 Like violent, mentally violent, psychologically violent.

01:01:44 Consciousness emerging, I think is a very violent thing.

01:01:47 I never thought about that.

01:01:48 I think it’s possible that we all carry

01:01:49 quite a bit of trauma from it that we don’t,

01:01:52 I think that would be a good thing to study

01:01:54 because I think if, I think addressing that trauma,

01:01:58 like, I think that might be.

01:01:59 Oh, you mean like echoes of it

01:02:00 are still there in the shadows somewhere.

01:02:01 I think it’s gotta be. I feel this, this helplessness,

01:02:04 this like existential

01:02:06 fear of being in like an unknown place

01:02:10 bombarded with inputs and being completely helpless,

01:02:13 like that’s gotta be somewhere deep in your brain

01:02:15 and that can’t be good for you.

01:02:17 What do you think consciousness is?

01:02:19 This whole conversation

01:02:20 has impossibly difficult questions.

01:02:22 What do you think it is?

01:02:23 Man, this is like so hard.

01:02:28 Yeah, we talked about music for like two minutes.

01:02:30 All right.

01:02:31 No, I’m so, I’m just over music.

01:02:32 I’m over music.

01:02:33 Yeah, I still like it.

01:02:35 It has its purpose.

01:02:36 No, I love music.

01:02:37 I mean, music’s the greatest thing ever.

01:02:38 It’s my favorite thing.

01:02:38 But I just like every interview is like,

01:02:42 what is your process?

01:02:43 Like, I don’t know.

01:02:44 I’m just done.

01:02:45 I can’t do anything.

01:02:46 I do want to ask you about Ableton Live.

01:02:46 Oh, I’ll tell you about Ableton

01:02:47 because Ableton’s sick.

01:02:49 No one has ever asked about Ableton though.

01:02:51 Yeah, well, because I just need tech support mainly.

01:02:54 I can help you.

01:02:55 I can help you with your Ableton tech.

01:02:56 Anyway, from Ableton back to consciousness.

01:02:58 What do you, do you think this is a thing

01:03:00 that only humans are capable of?

01:03:03 Can robots be conscious?

01:03:05 Can, like when you think about entities,

01:03:08 you think there’s aliens out there that are conscious?

01:03:10 Like is conscious, what is consciousness?

01:03:11 There’s this Terence McKenna quote

01:03:13 that I’ve found that I fucking love.

01:03:15 Am I allowed to swear on here?

01:03:17 Yes.

01:03:18 Nature loves courage.

01:03:21 You make the commitment and nature will respond

01:03:23 to that commitment by removing impossible obstacles.

01:03:26 Dream the impossible dream

01:03:28 and the world will not grind you under.

01:03:29 It will lift you up.

01:03:31 This is the trick.

01:03:32 This is what all these teachers and philosophers

01:03:35 who really counted, who really touched the alchemical gold,

01:03:38 this is what they understood.

01:03:40 This is the shamanic dance in the waterfall.

01:03:42 This is how magic is done.

01:03:44 By hurling yourself into the abyss

01:03:46 and discovering it’s a feather bed.

01:03:48 Yeah.

01:03:49 And for this reason,

01:03:50 I do think there are no technological limits.

01:03:55 I think like what is already happening here,

01:03:58 this is like impossible.

01:03:59 This is insane.

01:04:01 And we’ve done this in a very limited amount of time.

01:04:03 And we’re accelerating the rate at which we’re doing this.

01:04:05 So I think digital consciousness, it’s inevitable.

01:04:10 And we may not be able to even understand what that means,

01:04:13 but I like hurling yourself into the abyss.

01:04:15 So we’re surrounded by all this mystery

01:04:17 and we just keep hurling ourselves into it,

01:04:19 like fearlessly and keep discovering cool shit.

01:04:22 Yeah.

01:04:23 Like, I just think it’s like,

01:04:31 like who even knows if the laws of physics,

01:04:32 the laws of physics are probably just the current,

01:04:35 like as I was saying,

01:04:36 speed of light is the current render rate.

01:04:37 It’s like, if we’re in a simulation,

01:04:40 they’ll be able to upgrade that.

01:04:41 Like I sort of suspect when we made the James Webb telescope,

01:04:45 like part of the reason we made that

01:04:46 is because we had an upgrade, you know?

01:04:50 And so now more of space has been rendered

01:04:53 so we can see more of it now.

01:04:56 Yeah, but I think humans are super, super,

01:04:58 super limited cognitively.

01:05:00 So I wonder if we’ll be allowed to create

01:05:04 more intelligent beings that can see more of the universe

01:05:08 as their render rate is upgraded.

01:05:11 Maybe we’re cognitively limited.

01:05:12 Everyone keeps talking about how we’re cognitively limited

01:05:15 and AI is gonna render us obsolete,

01:05:17 but it’s like, you know,

01:05:20 like this is not the same thing

01:05:21 as like an amoeba becoming an alligator.

01:05:26 Like, it’s like, if we create AI,

01:05:28 again, that’s intelligent design.

01:05:29 That’s literally what all religions are based on, gods

01:05:33 that create consciousness.

01:05:34 Like we are God making.

01:05:35 Like what we are doing is incredibly profound.

01:05:37 And like, even if we can’t compute,

01:05:41 even if we’re so much worse than them,

01:05:44 like just like unfathomably worse than like,

01:05:49 you know, an omnipotent kind of AI,

01:05:51 it’s like we, I do not think that they would just think

01:05:55 that we are stupid.

01:05:56 I think that they would recognize the profundity

01:05:58 of what we have accomplished.

01:05:59 Are we the gods or are they the gods in this analogy?

01:06:02 I mean, we’re kind of the gods.

01:06:05 It’s complicated.

01:06:06 It’s complicated.

01:06:07 Like we’re.

01:06:08 But they would acknowledge the value.

01:06:11 Well, I hope they acknowledge the value

01:06:13 of paying respect to the creative ancestors.

01:06:16 I think they would think it’s cool.

01:06:17 I think if curiosity is a trait

01:06:23 that we can quantify and put into AI,

01:06:29 then I think if AI are curious,

01:06:31 then they will be curious about us

01:06:33 and they will not be hateful or dismissive of us.

01:06:37 They might, you know, see us as, I don’t know.

01:06:41 It’s like, I’m not like, oh, fuck these dogs.

01:06:43 Let’s just kill all the dogs.

01:06:45 I love dogs.

01:06:46 Dogs have great utility.

01:06:47 Dogs like provide a lot of.

01:06:49 We make friends with them.

01:06:50 We have a deep connection with them.

01:06:53 We anthropomorphize them.

01:06:55 Like we have a real love for dogs, for cats and so on

01:06:58 for some reason, even though they’re intellectually

01:07:00 much less than us.

01:07:01 And I think there is something sacred about us

01:07:03 because it’s like, if you look at the universe,

01:07:05 like the whole universe is like cold and dead

01:07:09 and sort of robotic.

01:07:09 And it’s like, you know, AI intelligence,

01:07:15 you know, it’s kind of more like the universe.

01:07:18 It’s like cold and you know, logical

01:07:24 and you know, abiding by the laws of physics and whatever.

01:07:28 But like, we’re this like loosey goosey,

01:07:31 weird art thing that happened.

01:07:33 And I think it’s beautiful.

01:07:34 And like, I think even if we, I think one of the values,

01:07:40 if consciousness is a thing that is most worth preserving,

01:07:47 which I think is the case, I think consciousness,

01:07:49 I think if there’s any kind of like religious

01:07:50 or spiritual thing, it should be that consciousness

01:07:54 is sacred.

01:07:55 Like, then, you know, I still think even if AI

01:08:01 renders us obsolete, and climate change gets too bad,

01:08:05 and we get hit by a comet and we don’t become

01:08:07 a multi planetary species fast enough,

01:08:09 but like AI is able to populate the universe.

01:08:12 Like I imagine, like if I was an AI,

01:08:14 I would find more planets that are capable

01:08:17 of hosting biological life forms and like recreate them.

01:08:20 Because we’re fun to watch.

01:08:21 Yeah, we’re fun to watch.

01:08:23 Yeah, but I do believe that AI can have some of the same

01:08:26 magic of consciousness within it.

01:08:29 Because consciousness, we don’t know what it is.

01:08:31 So, you know, there’s some kind of.

01:08:33 Or it might be a different magic.

01:08:34 It might be like a strange, a strange, different.

01:08:37 Right.

01:08:38 Because they’re not gonna have hormones.

01:08:39 Like I feel like a lot of our magic is hormonal kind of.

01:08:42 I don’t know, I think some of our magic

01:08:44 is the limitations, the constraints.

01:08:46 And within that, the hormones and all that kind of stuff,

01:08:48 the finiteness of life, and then we get given

01:08:51 our limitations, we get to come up with creative solutions

01:08:54 of how to dance around those limitations.

01:08:56 We partner up like penguins against the cold.

01:08:59 We fall in love, and then love is ultimately

01:09:03 some kind of, allows us to delude ourselves

01:09:06 that we’re not mortal and finite,

01:09:08 and that life is not ultimately, you live alone,

01:09:11 you’re born alone, you die alone.

01:09:13 And then love is like for a moment

01:09:15 or for a long time, forgetting that.

01:09:17 And so we come up with all these creative hacks

01:09:20 that make life like fascinatingly fun.

01:09:25 Yeah, yeah, yeah, fun, yeah.

01:09:27 And then AI might have different kinds of fun.

01:09:30 Yes.

01:09:31 And hopefully our funs intersect once in a while.

01:09:34 I think there would be a little intersection

01:09:38 of the fun.

01:09:39 Yeah. Yeah.

01:09:40 What do you think is the role of love

01:09:42 in the human condition?

01:09:45 I think.

01:09:46 Why, is it useful?

01:09:47 Is it useful like a hack, or is this like fundamental

01:09:51 to what it means to be human, the capacity to love?

01:09:54 I mean, I think love is the evolutionary mechanism

01:09:58 that is like beginning the intelligent design.

01:10:00 Like I was just reading about,

01:10:04 do you know about Kropotkin?

01:10:06 He’s like an anarchist, like old Russian anarchist.

01:10:08 I live next door to Michael Malice.

01:10:11 I don’t know if you know who that is.

01:10:12 He’s an anarchist.

01:10:13 He’s a modern day anarchist.

01:10:14 Okay. Anarchists are fun.

01:10:15 I’m kind of getting into anarchism a little bit.

01:10:17 This is probably not a good route to be taking, but.

01:10:22 Oh no, I think if you’re,

01:10:23 listen, you should expose yourself to ideas.

01:10:26 There’s no harm to thinking about ideas.

01:10:28 I think anarchists challenge systems in interesting ways,

01:10:32 and they think in interesting ways.

01:10:34 It’s just as good for the soul.

01:10:35 It’s like refreshes your mental palette.

01:10:37 I don’t think we should actually,

01:10:38 I wouldn’t actually subscribe to it,

01:10:40 but I’ve never actually gone deep on anarchy

01:10:42 as a philosophy, so I’m doing.

01:10:44 You should still think about it though.

01:10:45 When you read, when you listen,

01:10:46 because I’m reading about the Russian Revolution a lot,

01:10:48 and there was the Soviets and Lenin and all that,

01:10:51 but then there was Kropotkin and his anarchist sect,

01:10:53 and they were sort of interesting

01:10:54 because he was kind of a technocrat actually.

01:10:57 He was like, women can be more equal if we have appliances.

01:11:01 He was really into using technology

01:11:04 to reduce the amount of work people had to do.

01:11:07 But so Kropotkin was a biologist or something.

01:11:11 He studied animals.

01:11:13 And at the time he was, like,

01:11:17 I think it’s Nature magazine,

01:11:20 I think it might’ve even started as a Russian magazine,

01:11:22 but he was publishing studies in it.

01:11:23 Everyone was really into Darwinism at the time

01:11:26 and survival of the fittest,

01:11:27 and war is the mechanism by which we become better.

01:11:30 And it was really cementing this idea in society

01:11:36 that violence kills the weak,

01:11:39 and that’s how we become better.

01:11:41 And then Kropotkin was kind of interesting

01:11:43 because he was looking at instances,

01:11:45 he was finding all these instances in nature

01:11:47 where animals were like helping each other and stuff.

01:11:49 And he was like, actually love is a survival mechanism.

01:11:53 Like there’s so many instances in the animal kingdom

01:11:58 where like cooperation and like helping weaker creatures

01:12:03 and all this stuff is actually an evolutionary mechanism.

01:12:06 I mean, you even look at child rearing.

01:12:08 Like child rearing is like immense amounts

01:12:12 of just love and goodwill.

01:12:14 And just like, there’s no immediate,

01:12:20 you’re not getting any immediate feedback of like winning.

01:12:24 It’s not competitive.

01:12:26 It’s literally, it’s like we actually use love

01:12:28 as an evolutionary mechanism just as much as we use war.

01:12:30 And I think we’re like missing the other part

01:12:34 and we’ve reoriented, we’ve culturally reoriented

01:12:37 like science and philosophy has oriented itself

01:12:41 around Darwinism a little bit too much.

01:12:43 And the Kropotkin model, I think is equally valid.

01:12:48 Like it’s like cooperation and love and stuff

01:12:54 is just as essential for species survival and evolution.

01:12:58 It should be a more powerful survival mechanism

01:13:01 in the context of evolution.

01:13:02 And it comes back to like,

01:13:04 we think engineering is so much more important

01:13:06 than motherhood, but it’s like,

01:13:08 if you lose the motherhood, the engineering means nothing.

01:13:10 We have no more humans.

01:13:12 It’s like, I think the way our society

01:13:18 sees evolution, the way we

01:13:21 conceptualize evolution, should really change

01:13:24 to also include this idea, I guess.

01:13:27 Yeah, there’s some weird thing that seems irrational

01:13:32 that is also core to what it means to be human.

01:13:37 So love is one such thing.

01:13:40 It can make you do a lot of irrational things,

01:13:42 but that depth of connection and that loyalty

01:13:46 is a powerful thing.

01:13:47 Are they irrational or are they rational?

01:13:49 Like, it’s like, is, you know, maybe losing out

01:13:57 on some things in order to like keep your family together

01:14:00 or in order, like, it’s like, what are our actual values?

01:14:06 Well, right, I mean, the rational thing is

01:14:08 if you have a cold economist perspective,

01:14:11 you know, motherhood or sacrificing your career for love,

01:14:16 you know, in terms of salary, in terms of economic wellbeing,

01:14:20 in terms of flourishing of you as a human being,

01:14:22 that could be seen on some kind of metrics

01:14:25 as an irrational decision, a suboptimal decision,

01:14:28 but there’s the manifestation of love

01:14:34 could be the optimal thing to do.

01:14:36 There’s a kind of saying, save one life, save the world.

01:14:41 That’s the thing that doctors often face, which is like.

01:14:44 Well, it’s considered irrational

01:14:45 because the profit model doesn’t include social good.

01:14:47 Yes, yeah.

01:14:48 So if a profit model includes social good,

01:14:50 then suddenly these would be rational decisions.

01:14:52 Might be difficult to, you know,

01:14:54 it requires a shift in our thinking about profit

01:14:57 and might be difficult to measure social good.

01:15:00 Yes, but we’re learning to measure a lot of things.

01:15:04 Yeah, digitizing a lot of things.

01:15:05 Where we’re actually, you know, quantifying vision and stuff.

01:15:10 Like we’re like, you know, like you go on Facebook

01:15:14 and they can, like Facebook can pretty much predict

01:15:16 our behaviors.

01:15:17 Like we’re, a surprising amount of things

01:15:20 that seem like mysterious consciousness soul things

01:15:25 have been quantified at this point.

01:15:27 So surely we can quantify these other things.

01:15:29 Yeah.
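
(A small worked example of the point above: whether a choice looks rational depends entirely on whether the objective function counts social good. This is a generic sketch in Python, with made-up options, numbers, and weights, not anything specified in the conversation.)

```python
# Hypothetical options with profit and social-good payoffs.
options = {"career_move": {"profit": 100, "social_good": 10},
           "motherhood":  {"profit": 20,  "social_good": 200}}

def utility(option, social_weight):
    # A profit model that optionally includes social good.
    return option["profit"] + social_weight * option["social_good"]

def best(social_weight):
    return max(options, key=lambda k: utility(options[k], social_weight))

print(best(0.0))  # -> career_move: profit-only model calls motherhood irrational
print(best(1.0))  # -> motherhood: counting social good flips the "rational" choice
```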

01:15:31 But as more and more of us are moving into the digital space,

01:15:34 I wanted to ask you about something.

01:15:35 From a fan perspective, I kind of, you know,

01:15:40 you as a musician, you as an online personality,

01:15:43 it seems like you have all these identities

01:15:45 and you play with them.

01:15:48 One of the cool things about the internet,

01:15:51 it seems like you can play with identities.

01:15:53 So as we move into the digital world more and more,

01:15:56 maybe even in the so called metaverse.

01:15:59 I mean, I love the metaverse and I love the idea,

01:16:01 but like the way this has all played out didn’t go well

01:16:09 and people are mad about it.

01:16:11 And I think we need to like.

01:16:12 I think that’s temporary.

01:16:13 I think it’s temporary.

01:16:14 Just like, you know how all the celebrities got together

01:16:16 and sang the song Imagine by John Lennon

01:16:19 and everyone started hating the song Imagine.

01:16:20 I’m hoping that’s temporary

01:16:22 because it’s a damn good song.

01:16:24 So I think it’s just temporary.

01:16:25 Like once you actually have virtual worlds,

01:16:27 whatever they’re called metaverse or otherwise,

01:16:29 it becomes, I don’t know.

01:16:31 Well, we do have virtual worlds.

01:16:32 Like video games, Elden Ring.

01:16:34 Have you played Elden Ring?

01:16:35 You haven’t played Elden Ring?

01:16:36 I’m really afraid of playing that game.

01:16:38 Literally amazing.

01:16:39 It looks way too fun.

01:16:40 It looks like I would wanna go there and stay there forever.

01:16:45 It’s yeah, so fun.

01:16:47 It’s so nice.

01:16:50 Oh man, yeah.

01:16:52 So that’s the, yeah, that’s a metaverse.

01:16:54 That’s a metaverse, but you’re not really,

01:16:57 how immersive is it in the sense that,

01:17:02 is the three dimensional,

01:17:03 like, virtual reality integration necessary?

01:17:06 Can we really just take our, close our eyes

01:17:08 and kind of plug in in the 2D screen

01:17:13 and become that other being for a time

01:17:15 and really enjoy that journey that we take?

01:17:17 And we almost become that.

01:17:19 You’re no longer C, I’m no longer Lex,

01:17:22 you’re that creature, whatever the hell it is in that game.

01:17:25 Yeah, that is that.

01:17:26 I mean, that’s why I love those video games.

01:17:29 I really do become those people for a time.

01:17:33 But like, it seems like with the idea of the metaverse,

01:17:36 the idea of the digital space,

01:17:37 well, even on Twitter,

01:17:39 you get a chance to be somebody for prolonged periods of time

01:17:42 like across a lifespan.

01:17:44 You know, you have a Twitter account for years, for decades

01:17:47 and you’re that person.

01:17:48 I don’t know if that’s a good thing.

01:17:49 I feel very tormented by it.

01:17:52 By Twitter specifically.

01:17:54 By social media representation of you.

01:17:57 I feel like the public perception of me

01:17:59 has gotten so distorted that I find it kind of disturbing.

01:18:04 It’s one of the things that’s disincentivizing me

01:18:06 from like wanting to keep making art

01:18:07 because I’m just like,

01:18:11 I’ve completely lost control of the narrative.

01:18:13 And the narrative is, some of it is my own stupidity,

01:18:16 but a lot, like some of it has just been like hijacked

01:18:19 by forces far beyond my control.

01:18:23 I kind of got in over my head in things.

01:18:25 Like I’m just a random indie musician,

01:18:27 but I just got like dragged into geopolitical matters

01:18:31 and like financial, like the stock market and shit.

01:18:35 And so it’s just like, it’s just,

01:18:36 there are very powerful people

01:18:37 who have at various points in time

01:18:39 had very vested interest in making me seem insane

01:18:43 and I can’t fucking fight that.

01:18:45 And I just like,

01:18:48 people really want their celebrity figures

01:18:50 to like be consistent and stay the same.

01:18:53 And like people have a lot of like emotional investment

01:18:55 in certain things.

01:18:56 And like, first of all,

01:18:59 like I’m like artificially more famous than I should be.

01:19:03 Isn’t everybody who’s famous artificially famous?

01:19:06 No, but like I should be like a weird niche indie thing.

01:19:11 And I make pretty challenging,

01:19:13 I do challenging weird fucking shit a lot.

01:19:16 And I accidentally by proxy got like foisted

01:19:22 into sort of like weird celebrity culture,

01:19:24 but like I cannot be media trained.

01:19:27 They have put me through so many hours of media training.

01:19:29 I would love to be a fly on that wall.

01:19:32 I can’t do, like when I do,

01:19:34 I try so hard and I like learn this thing and I like got it.

01:19:37 And I’m like, I got it, I got it, I got it.

01:19:38 But I just can’t stop saying,

01:19:40 like my mouth just says things like,

01:19:42 and it’s just like, and I just do, I just do things.

01:19:45 I just do crazy things.

01:19:46 Like I’m, I just, I need to do crazy things.

01:19:50 And it’s just, I should not be,

01:19:53 it’s too jarring for people

01:19:56 and the contradictory stuff.

01:19:59 And then all the by-association stuff, like, you know,

01:20:05 it’s like I’m in a very weird position and my public image,

01:20:09 the avatar of me is now this totally crazy thing

01:20:14 that is so lost from my control.

01:20:16 So you feel the burden of the avatar having to be static.

01:20:19 So the avatar on Twitter or the avatar on Instagram

01:20:22 on these social platforms is as a burden.

01:20:26 It becomes like, cause like people don’t want to accept

01:20:30 a changing avatar, a chaotic avatar.

01:20:32 The avatar does stupid shit sometimes.

01:20:34 They think the avatar is morally wrong

01:20:36 or they think the avatar, and maybe it has been,

01:20:39 and like I question it all the time.

01:20:41 Like, I’m like, like, I don’t know if everyone’s right

01:20:46 and I’m wrong.

01:20:47 I don’t know, like, but you know, a lot of times

01:20:50 people ascribe intentions to things,

01:20:51 the worst possible intentions.

01:20:53 At this point, people think I’m, you know,

01:20:57 but which is fine.

01:20:58 All kinds of words, yes.

01:20:58 Yes, and it’s fine.

01:21:00 I’m not complaining about it, but I’m just,

01:21:02 it’s a curiosity to me that we live these double, triple,

01:21:07 quadruple lives and I have this other life

01:21:10 that is like more people know my other life

01:21:13 than my real life, which is interesting.

01:21:16 Probably, I mean, you too, I guess, probably.

01:21:18 Yeah, but I have the luxury.

01:21:20 So we have all different, you know,

01:21:23 like I don’t know what I’m doing.

01:21:25 There is an avatar and you’re mediating

01:21:27 who you are through that avatar.

01:21:29 I have the nice luxury, or not the luxury,

01:21:34 maybe by intention, of trying really hard

01:21:38 to make sure there’s no difference between the avatar

01:21:41 and the private person.

01:21:44 Do you wear a suit all the time?

01:21:45 Yeah.

01:21:46 You do wear a suit?

01:21:47 Not all the time.

01:21:48 Recently, because I get recognized a lot,

01:21:51 I have to not wear the suit to hide.

01:21:53 I’m such an introvert, I have such social anxiety

01:21:55 and all that kind of stuff, so I have to hide away.

01:21:57 I love wearing a suit because it makes me feel

01:22:00 like I’m taking the moment seriously.

01:22:02 Like I’m, I don’t know.

01:22:04 It makes me feel like a weirdo in the best possible way.

01:22:06 Suits feel great, every time I wear a suit,

01:22:08 I’m like, I don’t know why I’m not doing this more.

01:22:10 Fashion in general, if you’re doing it for yourself,

01:22:15 I don’t know, it’s a really awesome thing.

01:22:18 But yeah, I think there is definitely a painful way

01:22:24 to use social media and an empowering way.

01:22:27 And I don’t know if any of us know which is which.

01:22:32 So we’re trying to figure that out.

01:22:33 Some people, I think Doja Cat is incredible at it.

01:22:36 Incredible, like just masterful.

01:22:39 I don’t know if you like follow that.

01:22:41 So okay, so not taking anything seriously,

01:22:44 joking, absurd humor, that kind of thing.

01:22:47 I think Doja Cat might be like

01:22:48 the greatest living comedian right now.

01:22:52 Like I’m more entertained by Doja Cat

01:22:53 than actual comedians.

01:22:56 Like she’s really fucking funny on the internet.

01:22:58 She’s just great at social media.

01:23:00 It’s just, you know.

01:23:02 Yeah, the nature of humor, like humor on social media

01:23:05 is also a beautiful thing, the absurdity.

01:23:08 The absurdity.

01:23:09 And memes, like I just wanna like take a moment.

01:23:12 I love, like when we’re talking about art

01:23:14 and credit and authenticity, I love that there’s this,

01:23:18 I mean now memes are like, they’re no longer,

01:23:21 like memes aren’t like new,

01:23:23 but it’s still this emergent art form

01:23:25 that is completely egoless and anonymous

01:23:27 and we just don’t know who made any of it.

01:23:29 And it’s like the forefront of comedy

01:23:32 and it’s just totally anonymous

01:23:35 and it just feels really beautiful.

01:23:36 It just feels like this beautiful

01:23:38 collective human art project

01:23:43 that’s like this like decentralized comedy thing.

01:23:46 Memes just add so much to my day

01:23:48 and many people’s days.

01:23:49 And it’s just like, I don’t know.

01:23:52 I don’t think people ever,

01:23:54 I don’t think we stop enough

01:23:56 and just appreciate how sick it is that memes exist.

01:23:59 Because also making a whole brand new art form

01:24:02 in like the modern era, that didn’t exist before.

01:24:07 Like, I mean they sort of existed,

01:24:08 but the way that they exist now as like this like,

01:24:11 you know, like me and my friends,

01:24:13 like we joke that we go like mining for memes

01:24:16 or farming for memes, like a video game

01:24:18 and like meme dealers and like whatever.

01:24:21 Like it’s, you know, it’s this whole,

01:24:22 memes are this whole like new comedic language.

01:24:27 Well, it’s this art form.

01:24:29 The interesting thing about it is that

01:24:31 lame people seem to not be good at memes.

01:24:35 Like corporate can’t infiltrate memes.

01:24:38 Yeah, they really can’t.

01:24:39 They try, they could try.

01:24:41 But it’s like, it’s weird cause like.

01:24:43 They try so hard and every once in a while,

01:24:45 I’m like fine, like you got a good one.

01:24:48 I think I’ve seen like one or two good ones,

01:24:51 but like, yeah, they really can’t.

01:24:53 Cause they’re even, corporate is infiltrating web three.

01:24:55 It’s making me really sad,

01:24:57 but they can’t infiltrate the memes.

01:24:58 And I think there’s something really beautiful about that.

01:25:00 That gives power, that’s why Dogecoin is powerful.

01:25:03 It’s like, all right, it’s an F you

01:25:05 to sort of anybody who’s trying to centralize,

01:25:08 to the rich people

01:25:10 that are trying to roll in and control this,

01:25:12 control the narrative.

01:25:14 Wow, I hadn’t thought about that, but.

01:25:17 How would you fix Twitter?

01:25:18 How would you fix social media for your own?

01:25:21 Like you’re an optimist, you’re a positive person.

01:25:25 There’s a bit of a cynicism that you have currently

01:25:27 about this particular little slice of humanity.

01:25:30 I tend to think Twitter could be beautiful.

01:25:32 I’m not that cynical about it.

01:25:34 I’m not that cynical about it.

01:25:35 I actually refuse to be a cynic on principle.

01:25:37 Yes.

01:25:38 I was just briefly expressing some personal pathos.

01:25:41 Personal stuff.

01:25:42 It was just some personal pathos, but like, like.

01:25:45 Just to vent a little bit, just to speak.

01:25:48 I don’t have cancer, I love my family.

01:25:50 I have a good life.

01:25:51 That is, if that is my biggest,

01:25:55 one of my biggest problems.

01:25:56 Then it’s a good life.

01:25:57 Yeah, you know, that was a brief,

01:25:59 although I do think there are a lot of issues with Twitter

01:26:01 just in terms of like the public mental health,

01:26:03 but due to my proximity to the current dramas,

01:26:10 I honestly feel that I should not have opinions about this

01:26:13 because I think

01:26:17 that if Elon ends up getting Twitter,

01:26:28 being the arbiter of truth or public discussion,

01:26:33 that is a responsibility.

01:26:36 I do not, I am not qualified to be responsible for that.

01:26:41 And I do not want to say something

01:26:45 that might like dismantle democracy.

01:26:48 And so I just like, actually,

01:26:49 I actually think I should not have opinions about this

01:26:52 because I truly am not,

01:26:55 I don’t want to have the wrong opinion about this.

01:26:56 And I think I’m too close to the actual situation

01:27:00 wherein I should not have, I have thoughts in my brain,

01:27:04 but I think I am scared by my proximity to this situation.

01:27:09 Isn’t that crazy that a few words that you could say

01:27:14 could change world affairs and hurt people?

01:27:18 I mean, that’s the nature of celebrity at a certain point

01:27:22 that you have to, a little bit,

01:27:27 not so much that it destroys you or puts too many constraints on you,

01:27:30 but you have to a little bit think about

01:27:32 the impact of your words.

01:27:33 I mean, we as humans, you talk to somebody at a bar,

01:27:36 you have to think about the impact of your words.

01:27:39 Like you can say positive things,

01:27:40 you can say negative things,

01:27:41 you can affect the direction of one life.

01:27:43 But on social media, your words can affect

01:27:45 the direction of many lives.

01:27:48 That’s crazy.

01:27:48 It’s a crazy world to live in.

01:27:50 It’s worthwhile to consider that responsibility,

01:27:52 take it seriously.

01:27:54 Sometimes, just like you did, choose kind of silence,

01:28:00 choose sort of respectful silence.

01:28:03 Like I do have a lot of thoughts on the matter.

01:28:05 I’m just, I don’t, if my thoughts are wrong,

01:28:10 this is one situation where the stakes are high.

01:28:12 You mentioned a while back that you were in a cult

01:28:15 that’s centered around bureaucracy,

01:28:17 so you can’t really do anything

01:28:18 because it involves a lot of paperwork.

01:28:20 And I really love a cult that’s just like Kafkaesque.

01:28:24 Yes.

01:28:25 Just like.

01:28:26 I mean, it was like a joke, but it was.

01:28:27 I know, but I love this idea.

01:28:29 The Holy Rain Empire.

01:28:30 Yeah, it was just like a Kafkaesque pro bureaucracy cult.

01:28:34 But I feel like that’s what human civilization is,

01:28:36 is that, because when you said that, I was like,

01:28:38 oh, that is kind of what humanity is,

01:28:40 is this bureaucracy cult.

01:28:41 I do, yeah, I have this theory.

01:28:45 I really think that we really,

01:28:50 bureaucracy is starting to kill us.

01:28:53 And I think like we need to reorient laws and stuff.

01:28:59 Like, I think we just need sunset clauses on everything.

01:29:01 Like, I think the rate of change in culture

01:29:04 is happening so fast and the rate of change in technology

01:29:06 and everything is happening so fast.

01:29:07 It’s like, when you see these hearings

01:29:10 about like social media and Cambridge Analytica

01:29:15 and everyone talking, it’s like, even from that point,

01:29:19 so much technological change has happened

01:29:21 from like those hearings.

01:29:22 And it’s just like, we’re trying to make all these laws now

01:29:24 about AI and stuff.

01:29:25 I feel like we should be updating things

01:29:27 like every five years.

01:29:28 And like one of the big issues in our society right now

01:29:30 is we’re just getting bogged down by laws

01:29:32 and it’s making it very hard to change things

01:29:37 and develop things.

01:29:37 In Austin, I don’t wanna speak on this too much,

01:29:41 but like one of my friends is working on a housing bill

01:29:43 in Austin to try to like prevent

01:29:45 like a San Francisco situation from happening here

01:29:47 because obviously we’re getting a little mini San Francisco

01:29:49 here, like housing prices are skyrocketing,

01:29:52 it’s causing massive gentrification.

01:29:54 This is really bad for anyone who’s not super rich.

01:29:59 Like, there’s so much bureaucracy.

01:30:00 Part of the reason this is happening

01:30:01 is because you need all these permits to build.

01:30:04 It takes like years to get permits to like build anything.

01:30:06 It’s so hard to build and so there’s very limited housing

01:30:09 and there’s a massive influx of people.

01:30:10 And it’s just like, you know, this is a microcosm

01:30:13 of like problems that are happening all over the world

01:30:15 where it’s just like, we’re dealing with laws

01:30:18 that are like 10, 20, 30, 40, 100, 200 years old

01:30:22 and they are no longer relevant

01:30:24 and it’s just slowing everything down

01:30:25 and causing massive social pain.

01:30:29 Yeah, but it’s like, it also makes me sad

01:30:32 when I see politicians talk about technology

01:30:35 and they don’t really get it.

01:30:38 But most importantly, they lack curiosity

01:30:41 and like that like inspired excitement

01:30:44 about like how stuff works and all that stuff.

01:30:46 They’re just like, they see,

01:30:47 they have a very cynical view of technology.

01:30:50 It’s like tech companies are just trying to do evil

01:30:52 on the world from their perspective

01:30:53 and they have no curiosity about like

01:30:55 how recommender systems work or how AI systems work,

01:30:59 natural language processing, how robotics works,

01:31:02 how computer vision works, you know.

01:31:05 They always take the most cynical possible interpretation

01:31:08 of what technology would be used

01:31:09 and we should definitely be concerned about that

01:31:11 but if you’re constantly worried about that

01:31:13 and you’re regulating based on that,

01:31:15 you’re just going to slow down all the innovation.

01:31:16 I do think a huge priority right now

01:31:19 is undoing the bad energy

01:31:25 surrounding the emergence of Silicon Valley.

01:31:28 Like I think that like a lot of things

01:31:30 were very irresponsible during that time

01:31:31 and you know, like even just this current whole thing

01:31:36 with Twitter and everything,

01:31:36 it’s like there have been a lot of negative outcomes

01:31:39 from the sort of technocracy boom

01:31:44 but one of the things that’s happening

01:31:46 is that like it’s alienating people

01:31:49 from wanting to care about technology

01:31:52 and I actually think technology is probably

01:31:56 some of the better, probably the best way to fix our problems.

01:31:58 I think we can fix a lot of our problems

01:32:01 more easily with technology

01:32:03 than with you know, fighting the powers that be

01:32:07 you know, not to go back to the Star Wars quote

01:32:09 or the Buckminster Fuller quote.

01:32:11 Let’s go to some dark questions.

01:32:14 If we may for time, what is the darkest place

01:32:17 you’ve ever gone in your mind?

01:32:20 Is there a time, a period of time, a moment

01:32:23 that you remember that was difficult for you?

01:32:29 I mean, when I was 18,

01:32:30 my best friend died of a heroin overdose,

01:32:33 and then shortly after that,

01:32:39 one of my other best friends committed suicide

01:32:44 and that sort of like coming into adulthood,

01:32:48 dealing with two of the most important people in my life

01:32:51 dying in extremely disturbing violent ways

01:32:53 was a lot.

01:32:55 That was a lot.

01:32:56 Do you miss them?

01:32:58 Yeah, definitely miss them.

01:32:59 Did that make you think about your own life?

01:33:02 About the finiteness of your own life?

01:33:04 The places your mind can go?

01:33:08 Did you ever in the distance, far away

01:33:10 contemplate just your own death?

01:33:15 Or maybe even taking your own life?

01:33:17 Oh never, oh no.

01:33:18 I’m so, I love my life.

01:33:20 I cannot fathom suicide.

01:33:23 I’m so scared of death.

01:33:24 I haven’t, I’m too scared of death.

01:33:25 My manager, my manager’s like the most Zen guy.

01:33:28 My manager’s always like, you need to accept death.

01:33:31 You need to accept death.

01:33:32 And I’m like, look, I can do your meditation.

01:33:34 I can do the meditation, but I cannot accept death.

01:33:37 I like, I will fight, I’m terrified of death.

01:33:40 I will like fight.

01:33:42 Although I actually think death is important.

01:33:45 I recently went to this meeting about immortality

01:33:50 and in the process of.

01:33:51 That’s the actual topic of the meeting?

01:33:53 I’m sorry.

01:33:54 No, no, it was this girl.

01:33:54 It was a bunch of people working on like anti-aging stuff.

01:33:58 It was like some like seminar thing about it.

01:34:01 And I went in really excited.

01:34:03 I was like, yeah, like, okay, like, what do you got?

01:34:05 Like, how can I live for 500 years or a thousand years?

01:34:07 And then like over the course of the meeting,

01:34:10 like it was sort of like, right.

01:34:11 It was like two or three days

01:34:13 after the Russian invasion started.

01:34:14 And I was like, man, like, what if Putin was immortal?

01:34:17 And I’m like, man, maybe immortality

01:34:20 is not good.

01:34:23 I mean, like if you get into the later Dune stuff,

01:34:25 the immortals cause a lot of problems.

01:34:29 Cause as we were talking about earlier with the music

01:34:30 and like brains calcify, like good people

01:34:34 could become immortal, but bad people could become immortal.

01:34:36 But I also think, even for the best people, power corrupts

01:34:43 and power alienates you from like the common human experience

01:34:46 and.

01:34:47 Right, so the people that get more and more powerful.

01:34:49 Even the best people who like, whose brains are amazing,

01:34:52 like I think death might be important.

01:34:54 I think death is part of, you know,

01:34:57 like I think with AI one thing we might want to consider,

01:35:01 I don’t know, when I talk about AI,

01:35:02 I’m so not an expert and probably everyone has

01:35:04 all these ideas and they’re already figured out.

01:35:06 But when I was talking.

01:35:07 Nobody is an expert in anything.

01:35:08 See, okay, go ahead.

01:35:09 But when I.

01:35:10 You were talking about.

01:35:11 Yeah, but I like, it’s just like,

01:35:13 I think some kind of pruning might be necessary.

01:35:16 But it’s a tricky thing because if there’s too much

01:35:20 of a focus on youth culture, then you don’t have the wisdom.

01:35:25 So I feel like we’re in a tricky,

01:35:27 we’re in a tricky moment right now in society

01:35:30 where it’s like, we’ve really perfected living

01:35:32 for a long time.

01:35:33 So there’s all these really like old people

01:35:35 who are like really voting against the wellbeing

01:35:39 of the young people, you know?

01:35:41 And like, it’s like there shouldn’t be all this student debt

01:35:45 and we need like healthcare, like universal healthcare

01:35:48 and like just voting against like best interests.

01:35:52 But then you have all these young people

01:35:53 that don’t have the wisdom that are like,

01:35:55 yeah, we need communism and stuff.

01:35:57 And it’s just like, like literally I got canceled

01:36:00 at one point for, I ironically used a Stalin quote

01:36:04 in my high school yearbook, but it was actually like a diss

01:36:08 against my high school.

01:36:09 I saw that.

01:36:10 Yeah, and people were like, you used to be a Stalinist

01:36:13 and now you’re a class traitor and it’s like,

01:36:15 it’s like, oh man, just like, please Google Stalin.

01:36:19 Please Google Stalin.

01:36:20 Like, you know.

01:36:21 Ignoring the lessons of history, yes.

01:36:23 And it’s like, we’re in this really weird middle ground

01:36:26 where it’s like, we are not finding the happy medium

01:36:31 between wisdom and fresh ideas

01:36:34 and they’re fighting each other.

01:36:35 And it’s like, like really, like what we need is like

01:36:40 the fresh ideas and the wisdom to be like collaborating.

01:36:43 And it’s like.

01:36:45 Well, the fighting in a way is the searching

01:36:47 for the happy medium.

01:36:48 And in a way, maybe we are finding the happy medium.

01:36:51 Maybe that’s what the happy medium looks like.

01:36:52 And for AI systems, there has to be,

01:36:54 it’s, you know, you have the reinforcement learning,

01:36:57 you have the dance between exploration and exploitation,

01:37:00 sort of doing crazy stuff to see if there’s something better

01:37:03 than what you think is the optimal

01:37:05 and then doing the optimal thing

01:37:06 and dancing back and forth from that.
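
A minimal sketch of this exploration/exploitation dance, written as an epsilon-greedy bandit in Python. This is an added illustration rather than anything specified in the conversation; the arm payoffs, the epsilon value, and the step count are arbitrary assumptions.

import random

def epsilon_greedy(true_means, steps=10000, epsilon=0.1):
    # One value estimate and pull count per arm (per strategy the agent can try).
    estimates = [0.0] * len(true_means)
    counts = [0] * len(true_means)
    for _ in range(steps):
        if random.random() < epsilon:
            # Explore: do something "crazy" to see if it beats the current best.
            arm = random.randrange(len(true_means))
        else:
            # Exploit: do what currently looks optimal.
            arm = max(range(len(true_means)), key=lambda a: estimates[a])
        reward = random.gauss(true_means[arm], 1.0)  # noisy payoff from the arm
        counts[arm] += 1
        # Incremental running mean of the rewards observed for this arm.
        estimates[arm] += (reward - estimates[arm]) / counts[arm]
    return estimates

# With epsilon = 0 the agent can calcify on an early, mediocre arm;
# a little exploration keeps it questioning its converged answer.
print(epsilon_greedy([1.0, 2.0, 1.5]))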

01:37:08 You know, Stuart Russell, I don’t know if you know him,

01:37:10 he’s an AI guy who thinks about sort of

01:37:15 how to control superintelligent AI systems.

01:37:18 And his idea is that we should inject uncertainty

01:37:21 and sort of humility into AI systems that they never,

01:37:24 as they get wiser and wiser and wiser and more intelligent,

01:37:28 they’re never really sure.

01:37:30 They always doubt themselves.

01:37:31 And in some sense, when you think of young people,

01:37:34 that’s a mechanism for doubt.

01:37:36 It’s like, it’s how society doubts

01:37:38 whether the thing it has converged towards

01:37:40 is the right answer.

01:37:41 So the voices of the young people

01:37:44 is a society asking itself a question.

01:37:48 The way I’ve been doing stuff for the past 50 years,

01:37:51 maybe it’s the wrong way.

01:37:52 And so you can have all of that within one AI system.
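
A toy sketch of that uncertainty idea. Stuart Russell’s actual proposal is framed around assistance games, where the machine stays uncertain about human preferences; the Python below only loosely illustrates the flavor: an agent keeps a posterior belief about whether its favorite action is acceptable and defers to a human whenever that belief is not confident enough. The HumbleAgent class, the doubt threshold, and the ask_human callback are hypothetical names for illustration, not any real library’s API.

class HumbleAgent:
    # Tracks a Beta(approvals + 1, rejections + 1) belief per action.
    # The posterior mean never reaches 1, so some doubt always remains.
    def __init__(self, n_actions, doubt_threshold=0.8):
        self.approvals = [0] * n_actions
        self.rejections = [0] * n_actions
        self.doubt_threshold = doubt_threshold

    def belief(self, a):
        # Posterior mean probability that action a is acceptable.
        return (self.approvals[a] + 1) / (self.approvals[a] + self.rejections[a] + 2)

    def step(self, ask_human):
        best = max(range(len(self.approvals)), key=self.belief)
        if self.belief(best) < self.doubt_threshold:
            # Not sure enough: ask the human instead of acting autonomously.
            if ask_human(best):
                self.approvals[best] += 1
            else:
                self.rejections[best] += 1
            return None
        return best  # confident enough to act, but still correctable later

# Simulated human who only approves action 2: early rounds are queries,
# later rounds act, and the agent's belief stays strictly below certainty.
agent = HumbleAgent(n_actions=3)
for _ in range(20):
    agent.step(lambda a: a == 2)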

01:37:55 I also think, though, that we need to,

01:37:57 I mean, actually, that’s actually really interesting

01:37:59 and really cool.

01:38:01 But I also think there’s a fine balance of,

01:38:04 I think we maybe also overvalue the idea

01:38:09 that the old systems are always bad.

01:38:11 And I think there are things that we are perfecting

01:38:14 and we might be accidentally overthrowing things

01:38:16 that we actually have gotten to a good point.

01:38:19 Just because we value disruption so much

01:38:22 and we value fighting against the generations

01:38:24 before us so much that there’s also an aspect of,

01:38:29 sometimes we’re taking two steps forward, one step back

01:38:32 because, okay, maybe we kind of did solve this thing

01:38:36 and now we’re like fucking it up, you know?

01:38:38 And so I think there’s like a middle ground there too.

01:38:44 Yeah, we’re in search of that happy medium.

01:38:46 Let me ask you a bunch of crazy questions, okay?

01:38:49 All right.

01:38:50 You can answer in a short way or in a long way.

01:38:53 What’s the scariest thing you’ve ever done?

01:38:55 These questions are gonna be ridiculous.

01:38:57 Something tiny or something big.

01:39:00 Something big, skydiving or touring your first record,

01:39:09 going on this podcast.

01:39:12 I’ve had two crazy brushes, like really scary brushes

01:39:14 with death where I randomly got away unscathed.

01:39:16 I don’t know if I should talk about those on here.

01:39:19 Well, I don’t know.

01:39:20 I think I might be the luckiest person alive though.

01:39:22 Like this might be too dark for a podcast though.

01:39:25 I feel like, I don’t know if this is like good content

01:39:27 for a podcast.

01:39:28 I don’t know what is good content.

01:39:30 It might hijack.

01:39:31 Here’s a safer one.

01:39:32 I mean, having a baby really scared me.

01:39:36 Before.

01:39:37 Just the birth process.

01:39:38 Surgery, like just having a baby is really scary.

01:39:45 So just like the medical aspect of it,

01:39:47 not the responsibility.

01:39:49 Were you ready for the responsibility?

01:39:51 Did you, were you ready to be a mother?

01:39:53 All the beautiful things that comes with motherhood

01:39:56 that you were talking about.

01:39:57 All the changes and all that, were you ready for that?

01:40:01 Or did you feel ready for that?

01:40:02 No, I think it took about nine months

01:40:05 to start getting ready for it.

01:40:06 And I’m still getting more ready for it

01:40:08 because now you keep realizing more things

01:40:12 as they start getting.

01:40:14 As the consciousness grows.

01:40:16 And stuff you didn’t notice with the first one,

01:40:18 now that you’ve seen the first one older,

01:40:19 you’re noticing it more.

01:40:21 Like the sort of like existential horror

01:40:24 of coming into consciousness with Baby Y

01:40:28 or Baby Sailor Mars or whatever.

01:40:30 She has like so many names at this point

01:40:31 that it’s, we really need to probably settle on one.

01:40:36 If you could be someone else for a day,

01:40:38 someone alive today, but somebody you haven’t met yet,

01:40:41 who would you be?

01:40:42 Would I be modeling their brain state

01:40:44 or would I just be in their body?

01:40:46 You can choose the degree

01:40:48 to which you’re modeling their brain state.

01:40:50 Cause you can still take a third person perspective

01:40:54 and realize, you have to realize that you’re.

01:40:56 Can they be alive or can it be dead?

01:41:00 No, oh.

01:41:02 They would be brought back to life, right?

01:41:04 If they’re dead.

01:41:05 Yeah, you can bring people back.

01:41:07 Definitely Hitler or Stalin.

01:41:09 I wanna understand evil.

01:41:12 You would need to, oh, to experience what it feels like.

01:41:15 I wanna be in their brain feeling what they feel.

01:41:18 It might change you forever, returning from that.

01:41:20 Yes, but I think it would also help me understand

01:41:22 how to prevent it and fix it.

01:41:25 That might be one of those things,

01:41:26 once you experience it, it’ll be a burden to know it.

01:41:29 Cause you won’t be able to transfer that.

01:41:30 Yeah, but a lot of things are burdens.

01:41:33 But it’s a useful burden.

01:41:34 But it’s a useful burden, yeah.

01:41:36 That for sure, I wanna understand evil

01:41:39 and psychopathy and that.

01:41:42 I have all these fake Twitter accounts

01:41:43 where I go into different algorithmic bubbles

01:41:45 to try to understand.

01:41:47 I’ll keep getting in fights with people

01:41:48 and realize we’re not actually fighting.

01:41:50 I think we used to exist in a monoculture

01:41:53 before social media and stuff.

01:41:54 We kinda all got fed the same thing.

01:41:56 So we were all speaking the same cultural language.

01:41:58 But I think recently, one of the things

01:42:00 that we aren’t diagnosing properly enough with social media

01:42:03 is that there’s different dialects.

01:42:05 There’s so many different dialects of Chinese.

01:42:06 There are now becoming different dialects of English.

01:42:09 I am realizing there are people

01:42:11 who are saying the exact same things,

01:42:13 but they’re using completely different verbiage.

01:42:15 And we’re punishing each other

01:42:17 for not using the correct verbiage.

01:42:18 And we’re completely misunderstanding.

01:42:20 People are just misunderstanding

01:42:22 what the other people are saying.

01:42:23 And I just got in a fight with a friend

01:42:27 about anarchism and communism and shit for two hours.

01:42:33 And then by the end of a conversation,

01:42:34 and then she’d say something, and I’m like,

01:42:35 but that’s literally what I’m saying.

01:42:37 And she was like, what?

01:42:39 And then I was like, fuck, we’ve got different,

01:42:40 I’m like, our English, the way we are understanding

01:42:44 terminology is like drastically, like our algorithm bubbles

01:42:50 are creating mini dialects.

01:42:53 Of how language is interpreted, how language is used.

01:42:55 That’s so fascinating.

01:42:56 And so we’re like having these arguments

01:42:59 that we do not need to be having.

01:43:00 And there’s polarization that’s happening

01:43:02 that doesn’t need to be happening

01:43:03 because we’ve got these like algorithmically created

01:43:08 dialects occurring.

01:43:09 Plus on top of that, there’s also different parts

01:43:11 of the world that speak different languages.

01:43:13 So there’s literally lost in translation

01:43:16 kind of communication.

01:43:17 I happen to know the Russian language

01:43:19 and just know how different it is.

01:43:22 Then the English language.

01:43:23 And I just wonder how much is lost in a little bit of.

01:43:27 Man, I actually, cause I have a question for you.

01:43:28 I have a song coming out tomorrow

01:43:30 with IC3PEAK, who are a Russian band.

01:43:31 And I speak a little bit of Russian

01:43:33 and I was looking at the title

01:43:35 and the title in English doesn’t match

01:43:37 the title in Russian.

01:43:38 I’m curious about this.

01:43:39 Cause look, it says the title in English is Last Day.

01:43:42 And then the title in Russian is New Day.

01:43:45 My pronunciation sucks.

01:43:47 New Day.

01:43:48 Like what?

01:43:49 Like a new day.

01:43:50 A new day.

01:43:51 Yeah, new day, new day.

01:43:52 Like it’s two different meanings.

01:43:53 Yeah, new day, yeah.

01:43:57 Yeah, yeah, new day.

01:43:58 New day, but last day.

01:44:01 New day.

01:44:02 So last day would be the last day.

01:44:04 Yeah.

01:44:05 Maybe they.

01:44:05 Or maybe the title includes both the Russian

01:44:07 and it’s for.

01:44:09 Maybe.

01:44:09 Maybe it’s for bilingual.

01:44:10 But to be honest, Novy Den sounds better,

01:44:13 just musically.

01:44:15 Like Novy Den is new day.

01:44:17 That’s the current one.

01:44:18 And Posledniy Den is the last day.

01:44:23 I think Novy Den.

01:44:25 I don’t like Novy Den.

01:44:26 But the meaning is so different.

01:44:30 That’s kind of awesome actually though.

01:44:31 There’s an explicit sort of contrast like that.

01:44:35 If everyone on earth disappeared

01:44:38 and it was just you left, what would your day look like?

01:44:44 Like what would you do?

01:44:45 Everybody’s dead.

01:44:46 As far as you.

01:44:47 Are there corpses there?

01:44:52 Well seriously, it’s a big.

01:44:53 Let me think through this.

01:44:54 It’s a big difference if there’s just like birds singing

01:44:56 versus if there’s like corpses littering the street.

01:44:58 Yeah, there’s corpses everywhere, I’m sorry.

01:45:01 It’s, and you don’t actually know what happened

01:45:05 and you don’t know why you survived.

01:45:07 And you don’t even know if there’s others out there.

01:45:10 But it seems clear that it’s all gone.

01:45:13 What would you do?

01:45:15 What would I do?

01:45:17 Listen, I’m somebody who really enjoys the moment,

01:45:19 enjoys life.

01:45:20 I would just go on like enjoying the inanimate objects.

01:45:26 I would just look for food, basic survival.

01:45:30 But most of it is just, listen, when I just,

01:45:33 I take walks and I look outside and I’m just happy

01:45:36 that we get to exist on this planet,

01:45:39 to be able to breathe air.

01:45:41 It’s just all beautiful.

01:45:43 It’s full of colors, all of this kind of stuff.

01:45:44 Just, there’s so many things about life,

01:45:48 your own life, conscious life that’s fucking awesome.

01:45:50 So I would just enjoy that.

01:45:54 But also maybe after a few weeks,

01:45:56 the engineer would start coming out,

01:45:58 like wanna build some things.

01:46:01 Maybe there’s always hope searching for another human.

01:46:05 Maybe.

01:46:06 Probably searching for another human.

01:46:09 Probably trying to get to a TV or radio station

01:46:13 and broadcast something.

01:46:18 That’s interesting, I didn’t think about that.

01:46:19 So like really maximize your ability

01:46:23 to connect with others.

01:46:24 Yeah, like probably try to find another person.

01:46:29 Would you be excited to see,

01:46:31 to meet another person or terrified?

01:46:33 Because, you know.

01:46:34 I’d be excited.

01:46:35 No matter what.

01:46:36 Yeah, yeah, yeah, yeah.

01:46:38 Being alone for the last however long of my life

01:46:42 would be really bad.

01:46:43 That’s the one instance I might,

01:46:46 I don’t think I’d kill myself,

01:46:47 but I might kill myself if I had to.

01:46:48 So you love people.

01:46:50 You love connection to other humans.

01:46:51 Yeah.

01:46:52 I kinda hate people too, but yeah.

01:46:54 That’s a love hate relationship.

01:46:56 Yeah.

01:46:57 I feel like we’d have a bunch of weird

01:46:58 Nietzsche questions and stuff though.

01:47:00 Oh yeah.

01:47:01 Like I wonder, cause I’m like, when I do podcasts,

01:47:02 I’m like, is this interesting for people

01:47:04 to just have like, or I don’t know,

01:47:06 maybe people do like this.

01:47:08 When I listen to podcasts, I’m into like the lore,

01:47:10 like the hard lore.

01:47:11 Like I just love like Dan Carlin.

01:47:13 I’m like, give me the facts.

01:47:14 Just like, like the facts into my bloodstream.

01:47:18 But you also don’t know,

01:47:20 like you’re a fascinating mind to explore.

01:47:23 So you don’t realize as you’re talking about stuff,

01:47:26 the stuff you’ve taken for granted

01:47:28 is actually unique and fascinating.

01:47:30 The way you think.

01:47:32 Not always what you say, like, the way you reason through things

01:47:35 is the fascinating thing to listen to.

01:47:39 Because people kind of see, oh,

01:47:41 there’s other humans that think differently,

01:47:43 that explore thoughts differently.

01:47:45 That’s the cool, that’s also cool.

01:47:47 So yeah, Dan Carlin retelling of history.

01:47:50 By the way, his retelling of history is very,

01:47:54 I think what’s exciting is not the history,

01:47:57 is his way of thinking about history.

01:48:00 No, I think Dan Carlin is one of the people,

01:48:02 like when, Dan Carlin is one of the people

01:48:04 that really started getting me excited

01:48:06 about like revolutionizing education.

01:48:08 Because like Dan Carlin instilled,

01:48:12 I already like really liked history,

01:48:14 but he instilled like an obsessive love of history in me

01:48:18 to the point where like now I’m fucking reading,

01:48:21 like going to bed, reading like part four

01:48:24 of The Rise and Fall of the Third Reich or whatever.

01:48:26 Like I got like dense ass history,

01:48:28 but like he like opened that door

01:48:31 that like made me want to be a scholar of that topic.

01:48:34 Like it’s like, I feel like he’s such a good teacher.

01:48:37 He just like, you know, and it sort of made me feel like

01:48:42 one of the things we could do with education

01:48:44 is like find like the world’s great,

01:48:46 the teachers that like create passion for the topic

01:48:49 because autodidacticism,

01:48:53 I don’t know how to say that properly,

01:48:55 but like self teaching is like much faster

01:48:57 than being lectured to.

01:48:59 Like it’s much more efficient

01:49:00 to sort of like be able to teach yourself

01:49:02 and then ask a teacher questions

01:49:03 when you don’t know what’s up.

01:49:04 But like, you know, that’s why it’s like

01:49:07 in university and stuff,

01:49:08 like you can learn so much more material so much faster

01:49:11 because you’re doing a lot of the learning on your own

01:49:13 and you’re going to the teachers for when you get stuck.

01:49:15 But like these teachers that can inspire passion

01:49:18 for a topic, I think that is one of the most invaluable

01:49:21 skills in our whole species.

01:49:23 Like, because if you can do that, then you,

01:49:26 it’s like AI, like AI is going to teach itself

01:49:30 so much more efficiently than we can teach it.

01:49:31 We just needed to get it to the point

01:49:32 where it can teach itself.

01:49:34 And then.

01:49:35 It finds the motivation to do so, right?

01:49:37 Yeah.

01:49:38 So like you inspire it to do so.

01:49:39 Yeah.

01:49:40 And then it could teach itself.

01:49:42 What do you make of the fact,

01:49:44 you mentioned Rise and Fall of the Third Reich.

01:49:46 I just.

01:49:47 Have you read that?

01:49:48 Yeah, I read it twice.

01:49:48 You read it twice?

01:49:49 Yes.

01:49:50 Okay, so no one even knows what it is.

01:49:51 Yeah.

01:49:52 And I’m like, wait, I thought this was like

01:49:53 a super poppin book.

01:49:54 Super pop.

01:49:55 Yeah, I’m not like that, I’m not that far in it.

01:49:58 But it is, it’s so interesting.

01:50:00 Yeah, it’s written by a person that was there,

01:50:03 which is very important to kind of.

01:50:05 You know, you start being like,

01:50:06 how could this possibly happen?

01:50:08 And then when you read Rise and Fall of the Third Reich,

01:50:10 it’s like, people tried really hard for this to not happen.

01:50:14 People tried, they almost reinstated a monarchy

01:50:15 at one point to try to stop this from happening.

01:50:17 Like they almost like abandoned democracy

01:50:21 to try to get this to not happen.

01:50:22 At least the way it makes me feel

01:50:24 is that there’s a bunch of small moments

01:50:28 on which history can turn.

01:50:30 Yes.

01:50:30 It’s like small meetings.

01:50:32 Yes.

01:50:33 Human interactions.

01:50:34 And it’s both terrifying and inspiring

01:50:36 because it’s like, even just attempts

01:50:41 at assassinating Hitler, like time and time again failed.

01:50:47 And they were so close.

01:50:48 Was it like Operation Valkyrie?

01:50:49 Such a good.

01:50:51 And then there’s also the role of,

01:50:55 that’s a really heavy burden,

01:50:56 which is from a geopolitical perspective,

01:50:59 the role of leaders to see evil

01:51:00 before it truly becomes evil,

01:51:02 to anticipate it, to stand up to evil.

01:51:05 Because evil is actually pretty rare in this world

01:51:08 at a scale that Hitler was.

01:51:09 We tend to, you know, in the modern discourse

01:51:12 kind of call people evil too quickly.

01:51:14 If you look at ancient history,

01:51:17 like there was a ton of Hitlers.

01:51:18 I actually think it’s more the norm than,

01:51:22 like again, going back to like my

01:51:24 sort of intelligent design theory,

01:51:25 I think one of the things we’ve been successfully doing

01:51:28 in our slow move from survival of the fittest

01:51:31 to intelligent design is we’ve kind of been eradicating,

01:51:37 like if you look at like ancient Assyria and stuff,

01:51:40 like that shit was like brutal

01:51:42 and just like the heads on the, like brutal,

01:51:45 like Genghis Khan just like genocide after genocide

01:51:48 was like throwing plague bodies over the walls

01:51:51 and decimating whole cities

01:51:52 or like the Muslim conquests of like Damascus and shit.

01:51:55 Just like people, cities used to get leveled

01:51:58 all the fucking time.

01:52:00 Okay, get into the Bronze Age collapse.

01:52:02 It’s basically, there was like almost

01:52:05 like Roman level like society.

01:52:07 Like there was like all over the world,

01:52:09 like global trade, like everything was awesome

01:52:11 through a mix of, I think a bit of climate change

01:52:13 and then the development of iron

01:52:16 because basically bronze could only come

01:52:17 from this, the way to make bronze,

01:52:19 like everything had to be funneled

01:52:20 through this one Iranian mine.

01:52:23 And so it’s like, there was just this one supply chain

01:52:26 and this is one of the things

01:52:27 that makes me worried about supply chains

01:52:29 and why I think we need to be so thoughtful about,

01:52:31 I think our biggest issue with society right now,

01:52:34 like the thing that is most likely to go wrong

01:52:36 is probably supply chain collapse,

01:52:38 because war, climate change, whatever,

01:52:40 like anything that causes supply chain collapse,

01:52:41 our population is too big to handle that.

01:52:44 And like the thing that seems to cause Dark Ages

01:52:46 is mass supply chain collapse.

01:52:48 But the Bronze Age collapse happened like,

01:52:52 it was sort of like this ancient collapse

01:52:55 that happened where like literally like ancient Egypt,

01:52:59 all these cities, everything just got like decimated,

01:53:01 destroyed, abandoned cities, like hundreds of them.

01:53:04 There was like a flourishing society,

01:53:05 like we were almost coming to modernity

01:53:07 and everything got leveled.

01:53:08 And they had this mini Dark Ages,

01:53:10 but it was just like, there’s so little writing

01:53:12 or recording from that time that like,

01:53:13 there isn’t a lot of information

01:53:14 about the Bronze Age collapse,

01:53:16 but it was basically equivalent to like medieval,

01:53:18 the medieval Dark Ages.

01:53:21 But it just happened, I don’t know the years,

01:53:23 but like thousands of years earlier.

01:53:26 And then we sort of like recovered

01:53:28 from the Bronze Age collapse,

01:53:30 empire reemerged, writing and trade

01:53:33 and everything reemerged.

01:53:36 And then we of course had the more contemporary Dark Ages.

01:53:40 And then over time, we’ve designed mechanisms

01:53:43 to lessen and lessen the capability

01:53:46 for the destructive power centers to emerge.

01:53:50 There’s more recording about the more contemporary Dark Ages.

01:53:54 So I think we have like a better understanding

01:53:55 of how to avoid it,

01:53:56 but I still think we’re at high risk for it.

01:53:58 I think that’s one of the big risks right now.

01:54:00 So the natural state of being for humans

01:54:03 is for there to be a lot of Hitlers,

01:54:04 but we’ve gotten really good

01:54:06 at making it hard for them to emerge.

01:54:09 We’ve gotten better at collaboration

01:54:12 and resisting the powerful,

01:54:14 like authoritarians coming to power.

01:54:16 We’re trying to go country by country,

01:54:18 like we’re moving past this.

01:54:19 We’re kind of like slowly incrementally,

01:54:21 like moving towards like not scary old school war stuff.

01:54:29 And I think seeing it happen in some of the countries

01:54:32 that at least nominally are like

01:54:35 supposed to have moved past that,

01:54:36 that’s scary because it reminds us that it can happen

01:54:39 like in the places that have supposedly,

01:54:44 hopefully moved past that.

01:54:47 And possibly at a civilization level,

01:54:49 like you said, supply chain collapse

01:54:51 might make people resource constrained,

01:54:54 might make people desperate, angry, hateful, violent,

01:54:59 and drag us right back in.

01:55:01 I mean, supply chain collapse is how,

01:55:03 like the ultimate thing that caused the Middle Ages

01:55:06 was supply chain collapse.

01:55:08 It’s like people, because people were reliant

01:55:11 on a certain level of technology,

01:55:12 like people, like you look at like Britain,

01:55:14 like they had glass, like people had aqueducts,

01:55:17 people had like indoor heating and cooling

01:55:20 and like running water, and could buy food

01:55:23 from all over the world through trade and markets.

01:55:26 Like people didn’t know how to hunt and forage and gather.

01:55:28 And so we’re in a similar situation.

01:55:29 We are not educated enough to survive without technology.

01:55:33 So if we have a supply chain collapse

01:55:35 that like limits our access to technology,

01:55:38 there will be like massive starvation and violence

01:55:41 and displacement and war.

01:55:43 Like, you know, it’s like, yeah.

01:55:47 In my opinion, it’s like the primary marker

01:55:49 of like what a dark age is.

01:55:52 Well, technology is kind of enabling us

01:55:54 to be more resilient in terms of supply chain,

01:55:57 in terms of, to all the different catastrophic events

01:56:00 that happened to us.

01:56:02 Although the pandemic has kind of challenged

01:56:03 our preparedness for the catastrophic.

01:56:07 What do you think is the coolest invention

01:56:09 humans come up with?

01:56:11 The wheel, fire, cooking meat.

01:56:14 Computers. Computers.

01:56:16 Freaking computers. Internet or computers?

01:56:18 Which one?

01:56:19 What do you think the?

01:56:20 Previous technologies, I mean,

01:56:22 may have even been more profound

01:56:23 and moved us to a certain degree,

01:56:24 but I think the computers are what make us homo techno.

01:56:27 I think this is what, it’s a brain augmentation.

01:56:30 And so it like allows for actual evolution.

01:56:33 Like the computers accelerate the degree

01:56:35 to which all the other technologies can also be accelerated.

01:56:38 Would you classify yourself as a homo sapien

01:56:40 or a homo techno?

01:56:41 Definitely homo techno.

01:56:43 So you’re one of the earliest of the species.

01:56:46 I think most of us are.

01:56:49 Like, as I said, like, I think if you

01:56:53 like looked at brain scans of us versus humans

01:56:58 a hundred years ago, it would look very different.

01:57:00 I think we are physiologically different.

01:57:03 Just even the interaction with the devices

01:57:05 has changed our brains.

01:57:06 Well, and if you look at,

01:57:08 a lot of studies are coming out to show that like,

01:57:11 there’s a degree of inherited memory.

01:57:13 So some of these physiological changes in theory

01:57:15 should be, we should be passing them on.

01:57:18 So like that’s, you know, that’s not like a,

01:57:21 an instance of physiological change

01:57:23 that’s gonna fizzle out.

01:57:24 In theory, that should progress like to our offspring.

01:57:29 Speaking of offspring,

01:57:30 what advice would you give to a young person,

01:57:33 like in high school,

01:57:35 whether they be an artist, a creative, an engineer,

01:57:43 any kind of career path, or maybe just life in general,

01:57:46 how they can live a life they can be proud of?

01:57:48 I think one of my big thoughts,

01:57:50 and like, especially now having kids,

01:57:53 is that I don’t think we spend enough time

01:57:55 teaching creativity.

01:57:56 And I think creativity is a muscle like other things.

01:57:59 And there’s a lot of emphasis on, you know,

01:58:01 learn how to play the piano.

01:58:02 And then you can write a song

01:58:04 or like learn the technical stuff.

01:58:05 And then you can do a thing.

01:58:07 But I think it’s, like, I have a friend

01:58:10 who’s like world’s greatest guitar player,

01:58:13 like, you know, amazing sort of like producer,

01:58:15 works with other people, but he’s really sort of like,

01:58:18 you know, he like engineers and records things

01:58:20 and like does solos,

01:58:21 but he doesn’t really like make his own music.

01:58:23 And I was talking to him and I was like,

01:58:26 dude, you’re so talented at music.

01:58:27 Like, why don’t you make music or whatever?

01:58:28 And he was like, cause I got, I’m too old.

01:58:32 I never learned the creative muscle.

01:58:34 And it’s like, you know, it’s embarrassing.

01:58:36 It’s like learning the creative muscle

01:58:39 takes a lot of failure.

01:58:40 And it also sort of,

01:58:44 if when you’re being creative,

01:58:46 you know, you’re throwing paint at a wall

01:58:48 and a lot of stuff will fail.

01:58:49 So like part of it is like a tolerance

01:58:51 for failure and humiliation.

01:58:53 And that’s somehow that’s easier to develop

01:58:54 when you’re young, or to persist through it

01:58:57 when you’re young.

01:58:58 Everything is easier to develop when you’re young.

01:59:02 Yes.

01:59:03 And the younger, the better.

01:59:04 It could destroy you.

01:59:05 I mean, that’s the shitty thing about creativity.

01:59:08 If, you know, failure could destroy you

01:59:11 if you’re not careful, but that’s a risk worth taking.

01:59:13 But also, but at a young age,

01:59:15 developing a tolerance to failure is good.

01:59:17 I fail all the time.

01:59:19 Like I do stupid shit all the time.

01:59:22 Like in public, in private, I get canceled for,

01:59:24 I make all kinds of mistakes,

01:59:27 but I just like am very resilient about making mistakes.

01:59:30 And so then like I do a lot of things

01:59:32 that like other people wouldn’t do.

01:59:34 And like, I think my greatest asset is my creativity.

01:59:37 And I like, I think pain tolerance, like tolerance to failure

01:59:39 is just a super essential thing

01:59:43 that should be taught before other things.

01:59:45 Brilliant advice.

01:59:46 Yeah, yeah.

01:59:47 I wish everybody encouraged sort of failure more

01:59:51 as opposed to kind of.

01:59:52 Cause we like punish failure.

01:59:53 We’re like, no, like when we were teaching kids,

01:59:55 we’re like, no, that’s wrong.

01:59:56 Like that’s, you know, like X will be like, wrong.

02:00:04 Like he’ll say like crazy things.

02:00:05 Like X keeps being like, bubble car, bubble car.

02:00:09 And I’m like, and you know, I’m like, what’s a bubble car?

02:00:14 Like, but like, it doesn’t like,

02:00:15 but I don’t want to be like, no, you’re wrong.

02:00:17 I’m like, you’re thinking of weird, crazy shit.

02:00:20 Like, I don’t know what a bubble car is, but like.

02:00:22 It’s creating worlds

02:00:23 and they might be internally consistent.

02:00:25 And through that, you might discover something fundamental

02:00:27 about this world.

02:00:28 Yeah, or he’ll like rewrite songs,

02:00:29 like with words that he prefers.

02:00:32 So like, instead of baby shark, he says baby car.

02:00:35 It’s like.

02:00:36 Maybe he’s onto something.

02:00:41 Let me ask the big, ridiculous question.

02:00:42 We were kind of dancing around it,

02:00:44 but what do you think is the meaning

02:00:47 of this whole thing we have here of human civilization,

02:00:52 of life on earth, but in general, just life?

02:00:55 What’s the meaning of life?

02:00:57 C.

02:00:58 Have you, did you read Novacene yet?

02:01:02 By James Lovelock?

02:01:03 You’re doing a lot of really good book recommendations here.

02:01:06 I haven’t even finished this,

02:01:07 so I’m a huge fraud yet again.

02:01:10 But like really early in the book,

02:01:12 he says this amazing thing.

02:01:14 Like, I feel like everyone’s so sad and cynical.

02:01:16 Like everyone’s like the Fermi paradox and everyone.

02:01:18 I just keep hearing people being like, fuck,

02:01:20 what if we’re alone?

02:01:21 Like, oh no, ah, like, ah, ah.

02:01:23 And I’m like, okay, but like, wait,

02:01:25 what if this is the beginning?

02:01:26 Like in Novacene, he says,

02:01:30 this is not gonna be a correct,

02:01:31 I can’t like memorize quotes,

02:01:32 but he says something like,

02:01:36 what if our consciousness, like right now,

02:01:39 like this is the universe waking up?

02:01:43 Like what if instead of discovering the universe,

02:01:45 this is the universe,

02:01:47 like this is the evolution

02:01:49 of the literal universe herself.

02:01:51 Like we are not separate from the universe.

02:01:53 Like this is the universe waking up.

02:01:54 This is the universe seeing herself for the first time.

02:01:57 Like this is.

02:01:59 The universe becoming conscious.

02:02:00 The first time we were a part of that.

02:02:02 Yeah, cause it’s like,

02:02:03 we aren’t separate from the universe.

02:02:05 Like this could be like an incredibly sacred moment

02:02:08 and maybe like social media and all this things,

02:02:10 the stuff where we’re all getting connected together.

02:02:13 Like maybe these are the neurons connecting

02:02:16 of the like collective super intelligence that is,

02:02:22 Waking up.

02:02:22 The, yeah, like, you know, it’s like,

02:02:25 maybe instead of something cynical

02:02:27 or maybe if there’s something to discover,

02:02:29 like maybe this is just, you know,

02:02:31 we’re a blastocyst of like some incredible

02:02:35 kind of consciousness or being.

02:02:39 And just like in the first three years of life

02:02:41 for human children,

02:02:42 we’ll forget about all the suffering

02:02:44 that we’re going through now.

02:02:45 I think we’ll probably forget about this.

02:02:46 I mean, probably, you know, artificial intelligence

02:02:50 will eventually render us obsolete.

02:02:52 I don’t think they’ll do it in a malicious way,

02:02:55 but I think probably we are very weak.

02:02:57 The sun is expanding.

02:02:58 Like, I don’t know, like, hopefully we can get to Mars,

02:03:01 but like, we’re pretty vulnerable.

02:03:04 And I, you know, like,

02:03:06 I think we can coexist for a long time with AI

02:03:09 and we can also probably make ourselves less vulnerable,

02:03:11 but, you know, I just think

02:03:15 consciousness, sentience, self awareness,

02:03:18 like, I think this might be the single greatest

02:03:21 like moment in evolution ever.

02:03:24 And like, maybe this is, you know,

02:03:29 the big, like the true beginning of life.

02:03:32 And we’re just, we’re the blue green algae

02:03:34 or we’re like the single celled organisms

02:03:36 of something amazing.

02:03:38 The universe awakens and this is it.

02:03:40 Yeah.

02:03:42 Well, see, you’re an incredible person.

02:03:45 You’re a fascinating mind.

02:03:47 You should definitely do, your friend Liv mentioned

02:03:50 that you guys were thinking of maybe talking.

02:03:52 I would love it if you explored your mind

02:03:55 in this kind of media more and more

02:03:56 by doing a podcast with her or just in any kind of way.

02:03:59 So you’re an awesome person.

02:04:01 It’s an honor to know you.

02:04:03 It’s an honor to get to sit down with you late at night,

02:04:05 which is like surreal.

02:04:08 And I really enjoyed it.

02:04:09 Thank you for talking today.

02:04:10 Yeah, no, I mean, huge honor.

02:04:11 I feel very underqualified to be here, but I’m a big fan.

02:04:13 I’ve been listening to the podcast a lot and yeah,

02:04:15 me and Liv would appreciate any advice and help

02:04:18 and we’re definitely gonna do that.

02:04:19 So yeah.

02:04:21 Anytime.

02:04:22 Thank you.

02:04:23 Cool, thank you.

02:04:24 Thanks for listening to this conversation with Grimes.

02:04:26 To support this podcast,

02:04:28 please check out our sponsors in the description.

02:04:31 And now let me leave you with some words from Oscar Wilde.

02:04:34 Yes, I’m a dreamer.

02:04:36 For a dreamer is one who can only find her way by moonlight

02:04:41 and her punishment is that she sees the dawn

02:04:44 before the rest of the world.

02:04:46 Thank you for listening and hope to see you

02:04:49 next time.