Transcript
00:00:00 The following is a conversation with Bryan Johnson,
00:00:02 founder of Kernel, a company that has developed devices
00:00:06 that can monitor and record brain activity.
00:00:09 And previously, he was the founder of Braintree,
00:00:12 a mobile payment company that acquired Venmo
00:00:14 and then was acquired by PayPal and eBay.
00:00:17 Quick mention of our sponsors,
00:00:19 Four Sigmatic, NetSuite, Grammarly, and ExpressVPN.
00:00:24 Check them out in the description to support this podcast.
00:00:27 As a side note, let me say that this was a fun
00:00:30 and memorable experience,
00:00:31 wearing the Kernel Flow brain interface
00:00:34 in the beginning of this conversation,
00:00:35 as you can see if you watch the video version
00:00:38 of this episode.
00:00:39 And there was a Ubuntu Linux machine sitting next to me
00:00:42 collecting the data from my brain.
00:00:44 The whole thing gave me hope that the mystery
00:00:47 of the human mind will be unlocked in the coming decades
00:00:51 as we begin to measure signals from the brain
00:00:53 in a high bandwidth way.
00:00:55 To understand the mind,
00:00:56 we either have to build it or to measure it.
00:01:00 Both are worth a try.
00:01:02 Thanks to Bryan and the rest of the Kernel team
00:01:04 for making this little demo happen.
00:01:06 This is the Lex Fridman Podcast,
00:01:08 and here is my conversation with Bryan Johnson.
00:01:13 You ready, Lex?
00:01:14 Yes, I’m ready.
00:01:15 Do you guys wanna come in and put the interfaces
00:01:18 on our heads?
00:01:18 And then I will proceed to tell you a few jokes.
00:01:22 So we have two incredible pieces of technology
00:01:25 and a machine running Ubuntu 20.04 in front of us.
00:01:30 What are we doing?
00:01:31 All right.
00:01:32 Are these going on our heads?
00:01:33 They’re going on our heads, yeah.
00:01:34 And they will place it on our heads for proper alignment.
00:01:41 Does this support giant heads?
00:01:43 Because I kind of have a giant head.
00:01:45 Is this just a giant head?
00:01:46 Are you saying like an ego,
00:01:48 or are you saying physically? Both?
00:01:55 It’s a nice massage.
00:01:57 Yes.
00:01:59 Okay, how does this feel?
00:02:01 It’s okay to move around?
00:02:04 Yeah.
00:02:05 It feels, oh yeah.
00:02:07 Hey, hey.
00:02:10 This feels awesome.
00:02:11 It’s a pretty good fit.
00:02:12 Thank you.
00:02:13 That feels good.
00:02:15 So this is big head friendly.
00:02:17 It suits you well, Lex.
00:02:19 Thank you.
00:02:22 I feel like I need to,
00:02:24 I feel like when I wear this,
00:02:25 I need to sound like Sam Harris,
00:02:27 calm, collected, eloquent.
00:02:31 I feel smarter, actually.
00:02:34 I don’t think I’ve ever felt quite as much
00:02:36 like I’m part of the future as now.
00:02:38 Have you ever worn a brain interface
00:02:40 or had your brain imaged?
00:02:42 Oh, never had my brain imaged.
00:02:45 The only way I’ve analyzed my brain
00:02:47 is by talking to myself and thinking.
00:02:52 No direct data.
00:02:53 Yeah, that is definitely a brain interface
00:02:57 that has a lot of blind spots.
00:02:59 It has some blind spots, yeah.
00:03:01 Psychotherapy.
00:03:02 That’s right.
00:03:04 All right, are we recording?
00:03:07 Yeah, we’re good.
00:03:08 All right.
00:03:09 So Lex, the objective of this,
00:03:12 I’m going to tell you some jokes
00:03:16 and your objective is to not smile,
00:03:18 which as a Russian, you should have an edge.
00:03:22 Make the motherland proud.
00:03:24 I gotcha.
00:03:25 Okay.
00:03:28 Let’s hear the jokes.
00:03:29 Lex, and this is from the Kernel crew.
00:03:33 We’ve been working on a device that can read your mind
00:03:37 and we would love to see your thoughts.
00:03:42 Is that the joke?
00:03:43 That’s the opening.
00:03:44 Okay.
00:03:49 If I’m seeing the muscle activation correctly on your lips,
00:03:53 you’re not going to do well on this.
00:03:54 Let’s see.
00:03:55 All right, here comes the first.
00:03:56 I’m screwed.
00:03:57 Here comes the first one.
00:03:58 Is this going to break the device?
00:04:01 Is it resilient to laughter?
00:04:07 Lex, what goes through a potato’s brain?
00:04:09 I can’t.
00:04:14 I already failed.
00:04:15 That’s the hilarious opener.
00:04:17 Okay.
00:04:18 What?
00:04:19 Tater thoughts.
00:04:24 What kind of fish performs brain surgery?
00:04:27 I don’t know.
00:04:29 A neurosturgeon.
00:04:35 And so we’re getting data of everything
00:04:37 that’s happening in my brain right now?
00:04:39 In real time, yeah.
00:04:39 We’re getting activation patterns of your entire cortex.
00:04:45 I’m going to try to do better.
00:04:46 I’ll edit out all the parts where I laughed.
00:04:48 Photoshop a serious face over me.
00:04:50 You can recover.
00:04:51 Yeah, all right.
00:04:53 Lex, what do scholars eat when they’re hungry?
00:04:57 I don’t know, what?
00:04:58 Academia nuts.
00:05:03 That was a pretty good one.
00:05:05 So what we’ll do is,
00:05:07 so you’re wearing Kernel Flow,
00:05:11 which is an interface built using a technology
00:05:16 called spectroscopy.
00:05:17 So it’s similar to the wearables we wear on the wrist
00:05:20 that use light.
00:05:22 So using lasers, similar to LIDAR, as you know,
00:05:24 and we’re using that to do functional imaging
00:05:28 of brain activity.
00:05:30 And so as your neurons fire electrically and chemically,
00:05:35 it creates changes in blood oxygenation levels.
00:05:37 We’re measuring that.
00:05:38 And so you’ll see in the reconstructions we do for you,
00:05:41 you’ll see the activation patterns in your brain
00:05:43 throughout this entire time we’ve been wearing it,
00:05:45 so in your reaction to the jokes
00:05:47 and as we were sitting here talking.
00:05:50 And we’re moving towards a real-time feed
00:05:54 of your cortical brain activity.
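(A note on the physics for the curious: fNIRS systems generally recover hemoglobin concentration changes from detected light via the modified Beer–Lambert law. The relation below is the standard one from the fNIRS literature, not necessarily Kernel's exact reconstruction pipeline.)

```latex
\Delta OD(\lambda) =
  \left[ \varepsilon_{HbO_2}(\lambda)\,\Delta c_{HbO_2}
       + \varepsilon_{HbR}(\lambda)\,\Delta c_{HbR} \right]
  \cdot d \cdot DPF(\lambda)
```

Here \(\Delta OD\) is the measured change in optical density at wavelength \(\lambda\), \(\varepsilon\) are the extinction coefficients of oxy- and deoxyhemoglobin, \(\Delta c\) are their concentration changes, \(d\) is the source-detector separation, and \(DPF\) is the differential pathlength factor. Measuring at two or more wavelengths gives two equations, enough to solve for both concentration changes.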
00:05:56 So there’s a bunch of things that are in contact
00:05:59 with my skull right now.
00:06:02 How many of them are there,
00:06:03 and what are they?
00:06:05 What are the actual sensors?
00:06:07 There’s 52 modules,
00:06:09 and each module has one laser and six sensors.
00:06:13 And the lasers fire pulses of about 100 picoseconds.
00:06:18 And then the photons scatter and absorb in your brain.
00:06:21 A bunch go in,
00:06:24 then a few come back out,
00:06:25 and we sense those photons
00:06:27 and then we do the reconstruction for the activity.
00:06:30 Overall, there’s about a thousand plus channels
00:06:33 that are sampling your activity.
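(A back-of-the-envelope sketch of the channel count described above. The per-module numbers are from the conversation; the cross-module pairing is an illustrative assumption, not Kernel's actual optode geometry.)

```python
# In fNIRS, a "channel" is a usable source-detector pair.
MODULES = 52               # from the conversation
SOURCES_PER_MODULE = 1     # one laser per module
DETECTORS_PER_MODULE = 6   # six sensors per module

within_module = MODULES * SOURCES_PER_MODULE * DETECTORS_PER_MODULE  # 312

# Hypothetical: if each module's laser also reaches the detectors of
# 4 neighboring modules, the total passes the "thousand plus" mentioned.
ASSUMED_NEIGHBORS = 4
cross_module = MODULES * ASSUMED_NEIGHBORS * DETECTORS_PER_MODULE    # 1248

print(f"within-module channels: {within_module}")
print(f"with cross-module pairs: {within_module + cross_module}")
```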
00:06:35 How difficult is it to make it as comfortable as it is?
00:06:38 Because it’s surprisingly comfortable.
00:06:40 I would not think it would be comfortable.
00:06:44 Something that’s measuring brain activity,
00:06:48 I would not think it would be comfortable, but it is.
00:06:51 I agree.
00:06:52 In fact, I want to take this home.
00:06:52 Yeah, yeah, that’s right.
00:06:54 So people are accustomed to being in big systems
00:06:58 like fMRI where there’s 120 decibel sounds
00:07:02 and you’re in a claustrophobic encasement,
00:07:05 or EEG, which is just painful, or surgery.
00:07:09 And so, yes, I agree that this is a convenient option
00:07:12 to be able to just put on your head
00:07:14 that measures your brain activity
00:07:15 in the contextual environment you choose.
00:07:17 So if we want to have it during a podcast,
00:07:20 if we want to be at home in a business setting,
00:07:24 it’s freedom to record your brain activity
00:07:27 in the setting that you choose.
00:07:29 Yeah, but sort of from an engineering perspective,
00:07:31 are these, what is it?
00:07:32 There’s a bunch of different modular parts
00:07:34 and they’re kind of, there’s like a rubber band thing
00:07:38 where they mold to the shape of your head.
00:07:40 That’s right.
00:07:41 So we built this version of the mechanical design
00:07:45 to accommodate most adult heads.
00:07:49 But I have a giant head and it fits fine.
00:07:51 It fits well, actually.
00:07:53 So I don’t think I have an average head.
00:07:55 Okay, maybe I feel much better about my head now.
00:08:01 Maybe I’m more average than I thought.
00:08:06 Okay, so what else is there interesting
00:08:08 that you could say while it’s on our heads?
00:08:10 I can keep this on the whole time.
00:08:12 This is kind of awesome.
00:08:13 And it’s amazing for me, as a fan of Ubuntu,
00:08:16 I use Ubuntu MATE, you guys use that too.
00:08:18 But it’s amazing to have code running to the side,
00:08:23 measuring stuff and collecting data.
00:08:26 I mean, I feel much more important now
00:08:30 that my data is being recorded.
00:08:32 Like, you know when you have a good friend
00:08:35 that listens to you, that actually is listening to you?
00:08:40 This is what I feel like, like a much better friend
00:08:43 because it’s like accurately listening to me, Ubuntu.
00:08:47 What a cool perspective, I hadn’t thought about that,
00:08:50 of feeling understood.
00:08:53 Heard.
00:08:54 Yeah, heard deeply by the mechanical system
00:08:59 that is recording your brain activity,
00:09:01 versus the human that you’re engaging with,
00:09:04 that your mind immediately goes to
00:09:07 that there’s this dimensionality
00:09:09 and depth of understanding of this software system
00:09:12 which you’re intimately familiar with.
00:09:14 And now you’re able to communicate with this system
00:09:17 in ways that you couldn’t before.
00:09:19 Yeah, I feel heard.
00:09:24 Yeah, I mean, I guess what’s interesting about this is
00:09:26 your intuitions are spot on.
00:09:28 Most people have intuitions about brain interfaces
00:09:30 that they’ve grown up with this idea
00:09:32 of people moving cursors on the screen
00:09:34 or typing or changing the channel or skipping a song.
00:09:38 It’s primarily been anchored on control.
00:09:41 And I think the more relevant understanding
00:09:44 of brain interfaces or neuroimaging
00:09:47 is that it’s a measurement system.
00:09:49 And once you have numbers for a given thing,
00:09:53 a seemingly endless number of possibilities
00:09:54 emerge around that of what to do with those numbers.
00:09:57 So before you tell me about the possibilities,
00:09:59 this was an incredible experience.
00:10:01 I can keep this on for another two hours,
00:10:03 but I’m being told that for a bunch of reasons,
00:10:08 just because we probably wanna keep the data small
00:10:11 and visualize it nicely for the final product,
00:10:13 we wanna cut this off and take this amazing helmet
00:10:17 away from me.
00:10:18 So Bryan, thank you so much for this experience
00:10:21 and let’s continue helmetless.
00:10:25 All right.
00:10:26 So that was an incredible experience.
00:10:28 Can you maybe speak to what kind of opportunities
00:10:31 that stream of data opens up,
00:10:32 that rich stream of data from the brain?
00:10:35 First, I’m curious, what is your reaction?
00:10:38 What comes to mind when you put that on your head?
00:10:41 What does it mean to you?
00:10:42 And what possibilities emerge
00:10:44 and what significance might it have?
00:10:46 I’m curious where your orientation is at.
00:10:50 Well, for me, I’m really excited by the possibility
00:10:54 of various information about my body,
00:10:59 about my mind being converted into data,
00:11:03 such that data can be used to create products
00:11:06 that make my life better.
00:11:07 So that to me is a really exciting possibility.
00:11:09 Even just like a Fitbit that measures, I don’t know,
00:11:13 some very basic measurements about your body
00:11:15 is really cool.
00:11:17 But the bandwidth of information,
00:11:20 the resolution of that information is very crude,
00:11:22 so it’s not very interesting.
00:11:23 The possibility of just building a data set
00:11:29 coming in a clean way and a high bandwidth way
00:11:32 from my brain opens up all kinds of…
00:11:37 I was kind of joking when we were talking,
00:11:40 but it’s not really, it’s like I feel heard
00:11:44 in the sense that it feels like the full richness
00:11:49 of the information coming from my mind
00:11:51 is actually being recorded by the machine.
00:11:56 I mean, I can’t quite put it into words,
00:12:00 but there is genuinely for me,
00:12:02 there’s not some kind of joke about me being a robot.
00:12:05 It just genuinely feels like I’m being heard
00:12:09 in a way that’s going to improve my life,
00:12:13 as long as the thing that’s on the other end
00:12:16 can do something useful with that data.
00:12:17 But even the recording itself is like,
00:12:20 once you record, it’s like taking a picture.
00:12:25 That moment is forever saved in time.
00:12:28 Now, a picture cannot allow you to step back
00:12:32 into that world, but perhaps recording your brain
00:12:38 is a much higher resolution thing,
00:12:41 much more personal recording of that information
00:12:44 than a picture that would allow you to step back
00:12:48 into that where you were in that particular moment
00:12:51 in history and then map out a certain trajectory
00:12:54 to tell you certain things about yourself
00:12:58 that could open up all kinds of applications.
00:13:00 Of course, there’s health that I consider,
00:13:02 but honestly, to me, the exciting thing is just being heard.
00:13:06 My state of mind, the level of focus,
00:13:08 all those kinds of things, being heard.
00:13:10 What I heard you say is you have an entirety
00:13:13 of lived experience, some of which you can communicate
00:13:17 in words and in body language,
00:13:20 some of which you feel internally,
00:13:21 which cannot be captured in those communication modalities,
00:13:24 and that this measurement system captures
00:13:27 both the things you can try to articulate in words,
00:13:29 maybe in a lower dimensional space,
00:13:31 using one word, for example, to communicate focus,
00:13:34 when it really may be represented
00:13:35 in a 20 dimensional space of this particular kind of focus
00:13:39 and that this information is being captured,
00:13:42 so it’s a closer representation
00:13:44 to the entirety of your experience captured
00:13:47 in a dynamic fashion that is not just a static image
00:13:50 of your conscious experience.
00:13:53 Yeah, that’s the promise, that was the feeling,
00:13:58 and it felt like the future.
00:13:59 So it was a pretty cool experience.
00:14:00 And from the sort of mechanical perspective,
00:14:04 it was cool to have an actual device
00:14:06 that feels pretty good,
00:14:08 that doesn’t require me to go into the lab.
00:14:11 And also the other thing I was feeling,
00:14:14 there’s a guy named Andrew Huberman,
00:14:16 he’s a friend of mine, amazing podcast,
00:14:17 people should listen to it, Huberman Lab Podcast.
00:14:21 We’re working on a paper together
00:14:24 about eye movement and so on.
00:14:26 And we’re kind of, he’s a neuroscientist
00:14:28 and I’m a data person, machine learning person,
00:14:30 and we’re both excited by how much
00:14:38 the data measurements of the human mind,
00:14:42 the brain and all the different metrics
00:14:44 that come from that could be used to understand
00:14:47 human beings in a rigorous scientific way.
00:14:50 So the other thing I was thinking about
00:14:52 is how this could be turned into a tool for science.
00:14:55 Sort of not just personal science,
00:14:57 not just like Fitbit style,
00:15:00 like how am I doing on my personal metrics of health,
00:15:04 but doing larger scale studies of human behavior and so on.
00:15:07 So like data, not at the scale of an individual,
00:15:10 but data at a scale of many individuals
00:15:12 or a large number of individuals.
00:15:14 So personal being heard was exciting
00:15:17 and also just for science is exciting.
00:15:19 It’s very easy, like there’s a very powerful thing
00:15:23 to it being so easy to just put on
00:15:25 that you could scale much easier.
00:15:28 If you think about that second thing you said
00:15:30 about the science of the brain,
00:15:37 we’ve done a pretty good job,
00:15:40 like we, the human race, have done a pretty good job
00:15:43 figuring out how to quantify the things around us
00:15:46 from distant stars to calories and steps and our genome.
00:15:52 So we can measure and quantify pretty much everything
00:15:55 in the known universe except for our minds.
00:16:00 And we can do these one offs
00:16:02 if we’re going to get an fMRI scan
00:16:03 or do something with a low res EEG system,
00:16:08 but we haven’t done this at population scale.
00:16:11 And so if you think about human thought
00:16:14 or human cognition is probably the single
00:16:19 largest raw input material into society
00:16:23 at any given moment is our conversations
00:16:25 with ourselves and with other people.
00:16:27 And we have this raw input
00:16:33 that we haven’t been able to measure yet.
00:16:35 And when I think about it through that frame,
00:16:41 it’s remarkable, it’s almost like we live
00:16:43 in this wild, wild West of unquantified communications
00:16:49 within ourselves and between each other
00:16:52 when everything else has been grounded in engineering.
00:16:53 For example, I know if I buy an appliance at the store
00:16:56 or on a website, I don’t need to look at the measurements
00:17:00 on the appliance and make sure it can fit through my door.
00:17:03 That’s an engineered system of appliance manufacturing
00:17:06 and construction.
00:17:07 Everyone’s agreed upon engineering standards.
00:17:10 And we don’t have engineering standards around cognition.
00:17:15 It has not entered
00:17:16 as a formal engineering discipline that enables us
00:17:20 to scaffold it in society with everything else we’re doing,
00:17:23 including consuming news, our relationships,
00:17:26 politics, economics, education, all the above.
00:17:29 And so to me, the most significant contribution
00:17:33 that Kernel technology has to offer
00:17:36 would be the introduction
00:17:40 of formal engineering of cognition
00:17:42 as it relates to everything else in society.
00:17:45 I love that idea that you kind of think that there’s just
00:17:50 this ocean of data that’s coming from people’s brains
00:17:53 as being in a crude way, reduced down to like tweets
00:17:57 and texts and so on, just a very hardcore,
00:18:02 many scale compression of actual, the raw data.
00:18:06 But maybe you can comment,
00:18:08 because you’re using the word cognition.
00:18:11 I think the first step is to get the brain data.
00:18:15 But is there a leap to be taken
00:18:19 to sort of interpret that data in terms of cognition?
00:18:22 So your idea is basically you need to start collecting
00:18:25 data at scale from the brain,
00:18:27 and then we start to really be able to take little steps
00:18:32 along the path to actually measuring
00:18:35 some deep sense of cognition.
00:18:37 Because as I’m sure you know, we understand a few things,
00:18:42 but we don’t understand most of what makes up cognition.
00:18:47 This has been one of the most significant challenges
00:18:50 of building Kernel, and Kernel wouldn’t exist
00:18:52 if I wasn’t able to fund it initially by myself.
00:18:55 Because when I engage in conversations with investors,
00:18:59 the immediate thought is, what is the killer app?
00:19:02 And of course, I understand that heuristic,
00:19:04 that’s what they’re looking at,
00:19:05 is they’re looking to de-risk.
00:19:07 Is the product solved?
00:19:09 Is there a customer base?
00:19:10 Are people willing to pay for it?
00:19:11 How does it compare to competing options?
00:19:14 And in the case with brain interfaces,
00:19:15 when I started the company, there was no known path
00:19:20 to even build a technology
00:19:21 that could potentially become mainstream.
00:19:24 And then once we figured out the technology,
00:19:26 we could commence having conversations
00:19:28 with investors and it became, what is the killer app?
00:19:31 And so,
00:19:33 I funded the first $53 million for the company.
00:19:36 And to raise the round of funding, the first one we did,
00:19:39 I spoke to 228 investors.
00:19:42 One said yes, it was remarkable.
00:19:45 And it was mostly around this concept
00:19:48 around what is a killer app.
00:19:49 And so internally, the way we think about it is,
00:19:51 we think of the go to market strategy
00:19:55 much more like the Drake equation,
00:19:57 where if we can build technology
00:20:00 that has the right characteristics,
00:20:02 the data quality is high enough,
00:20:05 it meets certain thresholds of
00:20:07 cost, accessibility, comfort,
00:20:11 it can be worn in contextual environments.
00:20:12 If it meets the criteria of being a mass-market device,
00:20:17 then the responsibility that we have is to figure out
00:20:22 how to create the algorithm
00:20:29 that enables humans to then find value with it.
00:20:32 So the analogy is like brain interfaces
00:20:34 are like the early ’90s of the internet:
00:20:37 you wanna populate an ecosystem
00:20:39 with a certain number of devices,
00:20:40 you want a certain number of people
00:20:41 who play around with them, who do experiments
00:20:43 of certain data collection parameters,
00:20:44 you want to encourage certain mistakes
00:20:46 from experts and non-experts.
00:20:48 These are all critical elements that ignite discovery.
00:20:51 And so we believe we’ve accomplished the first objective
00:20:57 of building technology that reaches those thresholds.
00:21:01 And now it’s the Drake equation component
00:21:03 of how do we try to generate 20 years of value discovery
00:21:11 in a two or three year time period?
00:21:12 How do we compress that?
00:21:14 So just to clarify, so when you mean the Drake equation,
00:21:17 which for people who don’t know,
00:21:19 if you listen to this,
00:21:20 you know I bring up aliens every single conversation,
00:21:22 so I don’t know how you would not know
00:21:24 what the Drake equation is,
00:21:25 but you mean like the killer app,
00:21:28 it would be one alien civilization in that equation.
00:21:31 So meaning like this is in search of an application
00:21:34 that’s impactful, transformative.
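(For reference, the Drake equation estimates the number N of detectable civilizations in the galaxy as a product of independent factors. The analogy here is structural: a chain of multiplied conditions rather than one make-or-break bet.)

```latex
N = R_* \cdot f_p \cdot n_e \cdot f_\ell \cdot f_i \cdot f_c \cdot L
```

where \(R_*\) is the rate of star formation, \(f_p\) the fraction of stars with planets, \(n_e\) the number of habitable planets per system, \(f_\ell\), \(f_i\), \(f_c\) the fractions developing life, intelligence, and detectable communication, and \(L\) the lifetime of a communicating civilization.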
00:21:36 By the way, we need to come up
00:21:38 with a better term than killer app.
00:21:41 It’s also violent, right?
00:21:43 It’s very violent.
00:21:43 You can go like viral app, that’s horrible too, right?
00:21:46 It’s some very inspiringly impactful application.
00:21:51 How about that?
00:21:52 No. Yeah.
00:21:53 Okay, so we’ll stick with killer app, that’s fine.
00:21:55 Nobody’s. But I concur with you.
00:21:56 I dislike the chosen words in capturing the concept.
00:22:02 You know, it’s one of those sticky things
00:22:03 that is effective to use in the tech world.
00:22:08 But when you now become a communicator
00:22:11 outside of the tech world,
00:22:12 especially when you’re talking about software and hardware
00:22:15 and artificial intelligence applications,
00:22:17 it sounds horrible.
00:22:18 Yeah, no, it’s interesting.
00:22:19 I actually regret now having called attention
00:22:21 to it, regret having used that word in this conversation
00:22:24 because it’s something I would not normally do.
00:22:26 I used it in order to create a bridge
00:22:29 of shared understanding of how others would,
00:22:31 what terminology others would use.
00:22:32 Yeah.
00:22:33 But yeah, I concur.
00:22:34 Let’s go with impactful application.
00:22:38 Or the.
00:22:39 Just value creation.
00:22:40 Value creation.
00:22:41 Something people love using.
00:22:43 There we go, that’s it.
00:22:45 Love app.
00:22:46 Okay, so what, do you have any ideas?
00:22:49 So you’re basically creating a framework
00:22:51 where there’s the possibility of a discovery
00:22:54 of an application that people love using.
00:22:58 Is, do you have ideas?
00:22:59 We’ve begun to play a fun game internally
00:23:01 where when we have these discussions
00:23:03 or we begin circling around this concept of,
00:23:08 does anybody have an idea?
00:23:10 Does anyone have intuitions?
00:23:11 And if we see the conversation starting
00:23:13 to veer in that direction,
00:23:15 we flag it and say, human intuition alert, stop it.
00:23:20 And so we really want to focus on the algorithm
00:23:23 of there’s a natural process of human discovery.
00:23:27 That when you populate a system with devices
00:23:30 and you give people the opportunity to play around with it
00:23:33 in expected and unexpected ways,
00:23:35 we are thinking that is a much better system of discovery
00:23:40 than us exercising intuitions.
00:23:41 And it’s interesting, we’re also seeing
00:23:42 a few neuroscientists who have been talking to us.
00:23:45 I was speaking to this one young associate professor,
00:23:49 and I approached the conversation and said,
00:23:50 hey, we have these five data streams that we’re pulling off.
00:23:56 When you hear that, what weighted value
00:23:58 do you add to each data source?
00:23:59 Which one do you think is going to be valuable
00:24:00 for your objectives and which one’s not?
00:24:03 And he said, I don’t care, just give me the data.
00:24:05 All I care about is my machine learning model.
00:24:08 But importantly, he did not have a theory of mind.
00:24:10 He did not come to the table and say,
00:24:12 I think the brain operates in this way
00:24:15 for these reasons, or has these functions.
00:24:16 He didn’t care, he just wanted the data.
00:24:19 And we’re seeing that more and more
00:24:20 that certain people are devaluing human intuitions
00:24:26 for good reasons, as we’ve seen in machine learning
00:24:28 over the past couple years.
00:24:30 And we’re doing the same in our value creation market
00:24:34 strategy.
00:24:36 So collect more data, clean data,
00:24:40 make the product such that the collection of data
00:24:42 is easy and fun and then the rest will just spring to life.
00:24:47 Through humans playing around with them.
00:24:50 Our objective is to create the most valuable
00:24:54 data collection system of the brain ever.
00:25:01 And with that, then applying all the best tools
00:25:05 of machine learning and other techniques
00:25:09 to extract out, to try to find insight.
00:25:13 But yes, our objective is really to systematize
00:25:16 the discovery process because we can’t put
00:25:19 definite timeframes on discovery.
00:25:22 The brain is complicated and science
00:25:25 is not a business strategy.
00:25:28 And so we really need to figure out how to,
00:25:31 this is the difficulty of bringing technology
00:25:34 like this to market.
00:25:34 And it’s why most of the time it just languishes
00:25:38 in academia for quite some time.
00:25:40 But we hope that we will cross over
00:25:44 and make this mainstream in the coming years.
00:25:48 The thing was cool to wear, but are you chasing
00:25:53 a good reason for millions of people to put this
00:25:56 on their head and keep it on their head regularly?
00:26:00 Is there, like who’s going to discover that reason?
00:26:04 Is it going to be people just kind of organically
00:26:08 or is there going to be an Angry Birds style application
00:26:12 that’s just too exciting to not use?
00:26:18 If I think through the things that have changed
00:26:20 my life most significantly over the past few years,
00:26:23 when I started wearing a wearable on my wrist
00:26:27 that would give me data about my heart rate,
00:26:29 heart rate variability, respiration rate,
00:26:33 metabolic approximations, et cetera,
00:26:37 for the first time in my life,
00:26:38 I had access to information, sleep patterns
00:26:41 that were highly impactful.
00:26:43 They told me, for example, if I eat close to bedtime,
00:26:48 I’m not going to get deep sleep.
00:26:50 And not getting deep sleep means you have
00:26:52 all these follow on consequences in life.
00:26:54 And so it opened up this window of understanding of myself
00:26:59 that I cannot self-introspect and deduce these things.
00:27:03 This is information that was available to be acquired,
00:27:06 but it just wasn’t.
00:27:07 I would have to get an expensive sleep study,
00:27:08 then it’s an N of one, like one night,
00:27:10 and that’s not good enough to run all my trials.
00:27:12 And so if you look just at the information
00:27:15 that one can acquire on their wrist,
00:27:18 and now you’re applying it to the entire cortex
00:27:20 on the brain and you say,
00:27:22 what kind of information could we acquire?
00:27:25 It opens up a whole new universe of possibilities.
00:27:28 For example, we did this internal study at Kernel
00:27:30 where I wore a prototype device
00:27:32 and we were measuring the cognitive effects of sleep.
00:27:36 So I had a device measuring my sleep.
00:27:38 Along with 13 of my coworkers,
00:27:41 we performed four cognitive tasks over 13 sessions.
00:27:45 And we focused on reaction time, impulse control,
00:27:49 short term memory, and then a resting state task.
00:27:52 And with mine, we found, for example,
00:27:55 that my impulse control was independently correlated
00:28:00 with my sleep outside of behavioral measures
00:28:02 of my ability to play the game.
00:28:04 The point was that
00:28:07 the brain study I did at Kernel confirmed my life experience,
00:28:12 that my deep sleep determined whether or not
00:28:18 I would be able to resist temptation the following day.
00:28:21 And my brain data did show that, as one example.
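(A minimal sketch, with made-up numbers, of the kind of analysis described above: testing whether a brain-derived impulse-control measure tracks deep sleep independently of behavioral game scores. This is an illustration of partial correlation, not Kernel's actual study code.)

```python
import numpy as np

rng = np.random.default_rng(0)
n_sessions = 13                                   # from the conversation
deep_sleep_min = rng.normal(90, 20, n_sessions)   # hypothetical minutes of deep sleep
game_score = rng.normal(0.8, 0.1, n_sessions)     # hypothetical behavioral performance
brain_impulse = 0.01 * deep_sleep_min + rng.normal(0, 0.1, n_sessions)  # hypothetical

def residualize(y, x):
    """Remove the linear contribution of x from y."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

# Partial correlation: brain measure vs. sleep, controlling for behavior.
r = np.corrcoef(residualize(brain_impulse, game_score),
                residualize(deep_sleep_min, game_score))[0, 1]
print(f"partial correlation: {r:.2f}")
```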
00:28:24 And so if you start thinking,
00:28:25 if you actually have data on yourself,
00:28:27 on your entire cortex and you can control the settings,
00:28:32 I think there’s probably a large number of things
00:28:36 that we could discover about ourselves,
00:28:37 very, very small and very, very big.
00:28:39 I just, for example, like when you read news,
00:28:43 what’s going on?
00:28:44 Like when you use social media, when you use news,
00:28:47 like all the ways we allocate attention.
00:28:51 That’s right.
00:28:52 With the computer.
00:28:53 I mean, that seems like a compelling place
00:28:54 to where you would want to put on a Kernel,
00:28:58 by the way, what is it called?
00:28:59 Kernel Flux, Kernel, like what?
00:29:00 Flow.
00:29:01 Flow.
00:29:02 We have two technologies. You’re wearing Flow.
00:29:04 Flow, okay.
00:29:05 So when you put on the Kernel Flow,
00:29:11 it seems like a compelling time and place to do it
00:29:16 is when you’re behind a desk, behind a computer.
00:29:18 Because you could probably wear it
00:29:19 for prolonged periods of time as you’re taking in content.
00:29:23 And there could be a lot of,
00:29:25 because so much of our lives happens
00:29:28 in the digital world now.
00:29:29 That kind of coupling the information about the human mind
00:29:34 with the consumption and the behaviors in the digital world
00:29:39 might give us a lot of information about the effects
00:29:42 of the way we behave and navigate the digital world
00:29:45 to the actual physical meat space effects on our body.
00:29:50 It’s interesting to think,
00:29:51 so in terms of both like for work,
00:29:54 I’m a big fan of Cal Newport, his ideas of deep work
00:29:59 where, with few exceptions,
00:30:05 I try to spend the first two hours of every day,
00:30:07 and usually, if I’m at home and have nothing on my schedule,
00:30:11 up to eight hours, in deep work,
00:30:15 of focus, zero distraction.
00:30:16 And for me to analyze, I mean I’m very aware
00:30:20 of the waning of that, the ups and downs of that.
00:30:24 And it’s almost like you’re surfing the ups and downs
00:30:27 of that as you’re doing programming,
00:30:29 as you’re doing thinking about particular problems,
00:30:32 you’re trying to visualize things in your mind,
00:30:34 you start trying to stitch them together.
00:30:37 You’re trying to, when there’s a dead end about an idea,
00:30:40 you have to kind of calmly like walk back and start again,
00:30:45 all those kinds of processes.
00:30:47 It’d be interesting to get data
00:30:49 on what my mind is actually doing.
00:30:51 And also recently started doing,
00:30:53 I just talked to Sam Harris a few days ago
00:30:55 and been building up to that.
00:30:58 I started meditating using his app,
00:31:01 Waking Up, I very much recommend it.
00:31:05 It’d be interesting to get data on that
00:31:07 because it’s like you’re removing
00:31:10 all the noise from your head,
00:31:13 it’s an active process of noise removal,
00:31:18 active noise canceling, like the headphones.
00:31:21 And it’d be interesting to see what is going on in the mind
00:31:24 before the meditation, during it and after,
00:31:28 all those kinds of things.
00:31:29 And all of your examples, it’s interesting
00:31:31 that everyone who’s designed an experience for you,
00:31:35 so whether it be the meditation app or the Deep Work
00:31:38 or all the things you mentioned,
00:31:40 they constructed this product
00:31:43 with a certain number of knowns.
00:31:45 Yeah.
00:31:47 Now, what if we expanded the number of knowns by 10x
00:31:50 or 20x or 30x? They would reconstruct their product
00:31:54 to incorporate those knowns.
00:31:55 And so this is the dimensionality
00:31:58 that I think is the promising aspect
00:32:00 is that people will be able to use this quantification,
00:32:04 use this information to build more effective products.
00:32:09 And this is, I’m not talking about better products
00:32:11 to advertise to you or manipulate you.
00:32:13 I’m talking about our focus is helping people,
00:32:17 individuals have this contextual awareness
00:32:20 and this quantification and then to engage with others
00:32:23 who are seeking to improve people’s lives,
00:32:26 that the objective is betterment across ourselves,
00:32:31 individually and also with each other.
00:32:33 Yeah, so it’s a nice data stream to have
00:32:35 if you’re building an app,
00:32:36 like if you’re building a podcast listening app,
00:32:38 it would be nice to know data about the listener
00:32:40 so that like if you’re bored or you fell asleep,
00:32:42 maybe pause the podcast, it’s like really dumb,
00:32:46 just very simple applications
00:32:48 that could just improve the quality of the experience
00:32:50 of using the app.
00:32:52 I’m imagining if you have your neural data, this is Lex,
00:32:56 and there’s a statistical representation of you,
00:32:59 and you engage with the app and it says,
00:33:02 Lex, you’re best to engage with this meditation exercise
00:33:09 in the following settings.
00:33:11 At this time of day, after eating this kind of food
00:33:14 or not eating, fasting with this level of blood glucose
00:33:17 and this kind of night’s sleep.
00:33:19 But all these data combined
00:33:23 to give you this contextually relevant experience,
00:33:26 just like we do with our sleep.
00:33:27 You’ve optimized your entire life based upon
00:33:30 what information you can acquire and know about yourself.
00:33:34 And so the question is, how much do we really know
00:33:37 of the things going around us?
00:33:38 And I would venture to guess in my own life experience,
00:33:41 my self-awareness captures an extremely small
00:33:45 percent of the things that actually influence
00:33:47 my conscious and unconscious experience.
00:33:50 Well, in some sense, the data would help encourage you
00:33:54 to be more self aware, not just because you trust everything
00:33:57 the data is saying, but it’ll give you a prod
00:34:02 to start investigating.
00:34:04 Like I would love to get like a rating,
00:34:08 like a ranking of all the things I do
00:34:11 and what are the things, it’s probably important to do
00:34:13 without the data, but the data will certainly help.
00:34:16 It’s like rank all the things you do in life
00:34:19 and which ones make you feel shitty,
00:34:21 which ones make you feel good.
00:34:23 Like you were talking about the evening, Bryan.
00:34:26 Like this is a good example, somebody like,
00:34:30 I do pig out at night as well.
00:34:32 And it never makes me feel good.
00:34:35 Like you’re in a safe space.
00:34:37 This is a safe space, let’s hear it.
00:34:40 No, I definitely have much less self control
00:34:42 at night and it’s interesting.
00:34:44 And the same, people might criticize this,
00:34:47 but I know my own body.
00:34:50 I know when I eat carnivores, just eat meat,
00:34:52 I feel much better than if I eat more carbs.
00:35:00 The more carbs I eat, the worse I feel.
00:35:02 I don’t know why that is.
00:35:04 There is science supporting it,
00:35:05 but I’m not leaning on science.
00:35:06 I’m leaning on personal experience
00:35:07 and that’s really important.
00:35:09 I don’t need to read, I’m not gonna go on a whole rant
00:35:12 about nutrition science, but many of those studies
00:35:15 are very flawed.
00:35:17 They’re doing their best, but nutrition science
00:35:19 is a very difficult field of study
00:35:21 because humans are so different
00:35:24 and the mind has so much impact
00:35:26 on the way your body behaves.
00:35:28 And it’s so difficult from a scientific perspective
00:35:30 to conduct really strong studies
00:35:32 that you have to be almost like a scientist of one
00:35:36 to do these studies on yourself.
00:35:39 That’s the best way to understand what works for you or not.
00:35:41 And I don’t understand why, because it sounds unhealthy,
00:35:44 but eating only meat always makes me feel good.
00:35:47 Just eat meat, that’s it.
00:35:49 And I don’t have any allergies, any of that kind of stuff.
00:35:52 I’m not full-on like Jordan Peterson,
00:35:54 where if he deviates a little bit from the carnivore diet,
00:36:01 he goes off the cliff.
00:36:03 No, I can have chocolate, I can go off the diet,
00:36:06 I feel fine, it’s a gradual worsening of how I feel.
00:36:14 But when I eat only meat, I feel great.
00:36:17 And it’d be nice to be reminded of that.
00:36:19 Like there’s a very simple fact
00:36:21 that I feel good when I eat carnivore.
00:36:24 And I think that repeats itself in all kinds of experiences.
00:36:27 Like I feel really good when I exercise.
00:36:32 I hate exercise, but in the rest of the day,
00:36:39 the impact it has on my mind and the clarity of mind
00:36:43 and the experiences and the happiness
00:36:45 and all those kinds of things, I feel really good.
00:36:48 And to be able to concretely express that through data
00:36:52 would be nice.
00:36:53 It would be a nice reminder, almost like a statement,
00:36:55 like remember what feels good and whatnot.
00:36:58 And there could be things like that,
00:37:01 many things, like you’re suggesting,
00:37:04 that I could not be aware of,
00:37:07 that might be sitting right in front of me
00:37:09 that make me feel really good and make me feel not good.
00:37:12 And the data would show that.
00:37:14 I agree with you.
00:37:14 I’ve actually employed the same strategy.
00:37:17 I fired my mind entirely from being responsible
00:37:21 for constructing my diet.
00:37:23 And so I started doing a program
00:37:24 where I now track over 200 biomarkers every 90 days.
00:37:28 And it captures, of course, the things you would expect
00:37:31 like cholesterol, but also DNA methylation
00:37:34 and all kinds of things about my body,
00:37:37 all the processes that make up me.
00:37:39 And then I let that data generate the shopping lists.
00:37:43 And so I never actually ask my mind what it wants.
00:37:45 It’s entirely what my body is reporting that it wants.
00:37:48 And so I call this goal alignment within Bryan.
00:37:51 And there are 200-plus actors
00:37:53 whose opinions I’m currently asking.
00:37:55 And so I’m asking my liver, how are you doing?
00:37:57 And it’s expressing via the biomarkers.
00:38:00 And so I construct that diet
00:38:02 and I only eat those foods until my next testing round.
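(A toy sketch of letting biomarker results, rather than the conscious mind, generate a shopping list, in the spirit of what's described above. Every marker, threshold, and food below is invented for illustration.)

```python
# Hypothetical lab results vs. hypothetical targets.
RESULTS = {"vitamin_d": 18.0, "ferritin": 30.0, "omega3_index": 4.2}
TARGETS = {"vitamin_d": 30.0, "ferritin": 50.0, "omega3_index": 8.0}
FOODS = {"vitamin_d": "salmon", "ferritin": "lentils", "omega3_index": "sardines"}

# The body's numbers, not the mind's cravings, pick the foods.
shopping_list = [FOODS[m] for m, v in RESULTS.items() if v < TARGETS[m]]
print(shopping_list)  # ['salmon', 'lentils', 'sardines']
```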
00:38:06 And that has changed my life more than I think anything else
00:38:10 because in the demotion of my conscious mind
00:38:15 that I gave primacy to my entire life,
00:38:18 it led me astray because like you were saying,
00:38:20 the mind then goes out into the world
00:38:22 and it navigates the dozens
00:38:25 of different dietary regimens people put together in books.
00:38:29 And it’s all has their supporting science
00:38:32 in certain contextual settings, but it’s not N of one.
00:38:35 And like you’re saying, this dietary really is an N of one.
00:38:39 What people have published scientifically of course
00:38:41 can be used for nice groundings,
00:38:46 but it changes when you get to an N of one level.
00:38:48 And so that’s what gets me excited about brain interfaces
00:38:51 is if I could do the same thing for my brain
00:38:54 where I can stop asking my conscious mind for its advice
00:38:58 or for its decision making, which is flawed.
00:39:01 And I’d rather just look at this data
00:39:03 and I’ve never had better health markers in my life
00:39:05 than when I stopped actually asking myself
00:39:08 to be in charge of it.
00:39:09 The idea of demotion of the conscious mind
00:39:15 is such a sort of engineering way of phrasing meditation.
00:39:21 That’s what we’re doing, right?
00:39:22 That’s beautiful, that was really beautifully put.
00:39:26 By the way, testing round, what does that look like?
00:39:28 What’s that?
00:39:29 Well, you mentioned.
00:39:31 Yeah, the test I do.
00:39:33 Yes.
00:39:34 So it includes a complete blood panel.
00:39:37 I do a microbiome test.
00:39:39 I do a diet induced inflammation.
00:39:43 So I look for cytokine expressions.
00:39:45 So foods that produce inflammatory reactions.
00:39:48 I look at my neuroendocrine systems.
00:39:50 I look at all my neurotransmitters.
00:39:53 I do, yeah, there are several micronutrient tests
00:39:57 to see how I’m doing on the various nutrients.
00:39:59 What about self-report of how you feel?
00:40:04 Almost like, you can’t demote your,
00:40:09 you still exist within your conscious mind, right?
00:40:12 So that lived experience is of a lot of value.
00:40:16 So how do you measure that?
00:40:17 I do a temporal sampling over some duration of time.
00:40:20 So I’ll think through how I feel over a week,
00:40:23 over a month, over three months.
00:40:25 I don’t do a temporal sampling where,
00:40:27 if I’m at the grocery store in front of a cereal box,
00:40:29 I’m like, you know what, Cap’n Crunch
00:40:31 is probably the right thing for me today
00:40:33 because I’m feeling like I need a little fun in my life.
00:40:36 And so it’s a temporal sampling.
00:40:38 If the data set is large enough,
00:40:38 then I smooth out the function of my natural oscillations
00:40:42 of how I feel about life where some days I may feel upset
00:40:45 or depressed or down or whatever.
00:40:47 And I don’t want those moments
00:40:48 to then rule my decision making.
00:40:50 That’s why the demotion happens.
00:40:52 And it says, really, if you’re looking at
00:40:54 health over a 90-day period of time,
00:40:56 all my 200 voices speak up on that interval.
00:41:00 And they’re all given voice to say,
00:41:02 this is how I’m doing and this is what I want.
00:41:04 And so it really is an accounting system for everybody.
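(A small sketch of the temporal-sampling idea: smoothing noisy daily self-reports so a single bad day doesn't dominate a 90-day decision. The data and window size are illustrative, not Bryan's actual protocol.)

```python
import numpy as np

rng = np.random.default_rng(1)
days = 90
mood = 6 + rng.normal(0, 1.5, days)   # hypothetical daily 1-10 self-reports

window = 7                            # one-week moving average
smoothed = np.convolve(mood, np.ones(window) / window, mode="valid")

print(f"worst single day: {mood.min():.1f}")
print(f"worst smoothed week: {smoothed.min():.1f}")  # oscillations damped
```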
00:41:07 So that’s why I think that if you think about
00:41:10 the future of being human,
00:41:11 there’s two things I think that are really going on.
00:41:17 One is the design, manufacturing,
00:41:20 and distribution of intelligence
00:41:22 is heading towards zero on a cost curve
00:41:26 over a certain timeframe.
00:41:30 You know, evolution produced us,
00:41:33 an intelligent form of intelligence.
00:41:35 We are now designing our own intelligence systems
00:41:38 and the design, manufacturing, distribution
00:41:40 of that intelligence over a certain timeframe
00:41:42 is going to go to a cost of zero.
00:41:45 Design, manufacturing, distribution of intelligence
00:41:48 cost is going to zero.
00:41:49 For example.
00:41:50 Again, just give me a second.
00:41:52 That’s brilliant, okay.
00:41:54 And evolution is doing the design, manufacturing,
00:41:58 distribution of intelligence.
00:41:59 And now we are doing the design, manufacturing,
00:42:02 distribution of intelligence.
00:42:04 And the cost of that is going to zero.
00:42:06 That’s a very nice way of looking at life on Earth.
00:42:10 So if that’s going on and then now in parallel to that,
00:42:15 then you say, okay, what then happens
00:42:19 when that cost curve is heading to zero?
00:42:24 Our existence becomes a goal alignment problem,
00:42:28 a goal alignment function.
00:42:31 And so the same thing I’m doing
00:42:33 where I’m doing goal alignment within myself
00:42:34 of these 200 biomarkers, where I’m saying,
00:42:37 when Bryan exists on a daily basis
00:42:41 and this entity is deciding what to eat
00:42:43 and what to do and et cetera,
00:42:45 it’s not just my conscious mind, which is opining,
00:42:47 it’s 200 biological processes
00:42:50 and there’s a whole bunch of more voices involved.
00:42:52 So in that equation,
00:42:57 we’re going to increasingly automate the things
00:43:02 that we spend high energy on today because it’s easier.
00:43:05 And now we’re going to then negotiate the terms
00:43:09 and conditions of intelligent life.
00:43:11 Now we say conscious existence because we’re biased
00:43:13 because that’s what we have,
00:43:15 but it will be the largest computational exercise
00:43:18 in history because you’re now doing goal alignment
00:43:20 with planet Earth, within yourself, with each other,
00:43:24 within all the intelligent agents we’re building,
00:43:26 bots and other voice assistants.
00:43:29 You basically have trillions and trillions of agents
00:43:32 working on the negotiation of goal alignment.
00:43:35 Yeah, this is in fact true.
00:43:39 And what was the second thing?
00:43:40 That was it.
00:43:41 So the cost, the design, manufacturing, distribution
00:43:44 of intelligence going to zero,
00:43:45 which then means what’s really going on?
00:43:48 What are we really doing?
00:43:50 We’re negotiating the terms and conditions of existence.
00:43:55 Do you worry about the survival of this process
00:43:59 that life as we know it on Earth comes to an end
00:44:04 or at least intelligent life,
00:44:06 that as the cost goes to zero something happens
00:44:11 where all of that intelligence is thrown in the trash
00:44:14 by something like nuclear war or development of AGI systems
00:44:19 that are very dumb, not AGI I guess,
00:44:21 but AI systems, the paperclip thing,
00:44:25 en masse is dumb but has unintended consequences
00:44:28 where it destroys human civilization.
00:44:31 Do you worry about those kinds of things?
00:44:32 I mean, it’s unsurprising that a new thing
00:44:37 comes into the sphere of human consciousness.
00:44:40 Humans identify the foreign object,
00:44:43 in this case, artificial intelligence.
00:44:45 Our amygdala fires up and says scary, foreign,
00:44:50 we should be apprehensive about this.
00:44:53 And so it makes sense from a biological perspective
00:44:57 that humans, the knee jerk reaction is fear.
00:45:02 What I don’t think has been properly weighted with that
00:45:08 is that we are the first generation of intelligent beings
00:45:13 on this Earth that has been able to look out
00:45:17 over their expected lifetime
00:45:20 and see there is a real possibility of evolving
00:45:23 into entirely novel forms of consciousness, so different
00:45:28 that it would be totally unrecognizable to us today.
00:45:33 We don’t have words for it, we can’t hint at it,
00:45:34 we can’t point at it, we can’t,
00:45:36 you can’t look in the sky and see that thing
00:45:38 that is shining, we’re gonna go up there.
00:45:40 You cannot even create an aspirational statement about it.
00:45:46 And instead we’ve had this knee jerk reaction of fear
00:45:51 about everything that could go wrong.
00:45:53 But in my estimation, this should be the defining aspiration
00:46:00 of all intelligent life on Earth that we would aspire,
00:46:04 that basically every generation surveys the landscape
00:46:08 of possibilities that are afforded,
00:46:09 given the technological, cultural
00:46:10 and other contextual situation that they’re in.
00:46:14 We’re in this context, we haven’t yet identified this
00:46:17 and said, this is unbelievable, we should carefully
00:46:22 think this thing through, not just of mitigating
00:46:26 the things that’ll wipe us out,
00:46:27 but we have this potential,
00:46:29 and so we just haven’t given voice to it,
00:46:31 even though it’s within this realm of possibilities.
00:46:33 So you’re excited about the possibility
00:46:35 of superintelligence systems
00:46:37 and the opportunities that bring,
00:46:38 I mean, there’s parallels to this,
00:46:41 you think about people before the internet
00:46:43 as the internet was coming to life,
00:46:45 I mean, there’s kind of a fog through which you can’t see,
00:46:51 what does the future look like?
00:46:54 Predicting collective intelligence,
00:46:56 which I don’t think we understand
00:46:57 we’re living through now,
00:46:59 is that we’ve in some sense
00:47:03 stopped being individual intelligences
00:47:05 and become much more like collective intelligences,
00:47:09 because ideas travel much, much faster now,
00:47:13 and they can, in a viral way,
00:47:16 sweep across the populations,
00:47:18 and so it’s almost, I mean, it almost feels like
00:47:23 a thought is had by many people now,
00:47:26 thousands or millions of people
00:47:27 as opposed to an individual person,
00:47:29 and that’s changed everything,
00:47:30 but to me, I don’t think we’re realizing
00:47:33 how much that actually changed people or societies,
00:47:36 but to predict that before the internet
00:47:38 would have been very difficult,
00:47:41 and in that same way, we’re sitting here
00:47:43 with the fog before us, thinking,
00:47:45 what is superintelligence systems,
00:47:49 how is that going to change the world?
00:47:51 What is increasing the bandwidth,
00:47:54 like plugging our brains into this whole thing,
00:47:59 how is that going to change the world?
00:48:01 And it seems like it’s a fog, you don’t know,
00:48:05 and it could be, it could, whatever comes to be,
00:48:10 could destroy the world,
00:48:12 we could be the last generation,
00:48:14 but it also could transform in ways
00:48:18 that creates an incredibly fulfilling life experience
00:48:24 that’s unlike anything we’ve ever experienced.
00:48:27 It might involve dissolution of ego and consciousness
00:48:30 and so on, you’re no longer one individual,
00:48:32 it might be more, you know,
00:48:35 that might be a certain kind of death, an ego death,
00:48:38 but the experience might be really exciting and enriching,
00:48:42 maybe we’ll live in a virtual,
00:48:44 like it’s like, it’s funny to think about
00:48:48 a bunch of sort of hypothetical questions
00:48:49 of would it be more fulfilling to live in a virtual world?
00:48:57 Like if you were able to plug your brain in
00:48:59 in a very dense way into a video game,
00:49:03 like which world would you want to live in?
00:49:05 In the video game or in the physical world?
00:49:08 For most of us, we’re kind of toying
00:49:12 with the idea of the video game,
00:49:14 but we still want to live in the physical world,
00:49:16 have friendships and relationships in the physical world,
00:49:20 but we don’t know that, again, it’s a fog,
00:49:23 and maybe in 100 years,
00:49:25 we’re all living inside a video game,
00:49:27 hopefully not Call of Duty,
00:49:29 hopefully more like Sims 5, which version is it on?
00:49:33 For you individually though,
00:49:36 does it make you sad that your brain ends?
00:49:41 That you die one day very soon?
00:49:45 That the whole thing, that data source
00:49:49 just goes offline sooner than you would like?
00:49:54 That’s a complicated question.
00:49:56 I would have answered it differently
00:49:59 in different times in my life.
00:50:00 I had chronic depression for 10 years,
00:50:03 and so in that 10 year time period,
00:50:06 I desperately wanted lights to be off,
00:50:09 and the thing that made it even worse
00:50:11 is I was in a religious, I was born into a religion.
00:50:15 It was the only reality I ever understood,
00:50:17 and it’s difficult to articulate to people
00:50:20 when you’re born into that kind of reality
00:50:22 and it’s the only reality you’re exposed to,
00:50:24 you are literally blinded to the existence of other realities
00:50:28 because it’s so much the in group, out group thing,
00:50:31 and so in that situation,
00:50:33 it was not only that I desperately wanted lights out forever,
00:50:37 it was that I couldn’t have lights out forever.
00:50:38 It was that there was an afterlife,
00:50:40 and this afterlife had this system
00:50:45 that would either penalize or reward you for your behaviors,
00:50:50 and so it was almost like this,
00:50:53 this indescribable hopelessness
00:50:57 of not only being in hopeless despair
00:51:01 of not wanting to exist,
00:51:02 but then also being forced to exist,
00:51:05 and so there was a duration of my time,
00:51:06 a duration of life where I’d say,
00:51:08 like yes, I have no remorse for lights being out,
00:51:13 and I actually want it more than anything
00:51:15 in the entire world.
00:51:18 There are other times where I’m looking out at the future
00:51:21 and I say this is an opportunity
00:51:24 for a future evolving human conscious experience
00:51:27 that is beyond my ability to understand,
00:51:31 and I jump out of bed and I race to work
00:51:34 and I can’t think about anything else,
00:51:37 but I think the reality for me is,
00:51:44 I don’t know what it’s like to be in your head,
00:51:45 but in my head, when I wake up in the morning,
00:51:48 I don’t say good morning, Bryan, I’m so happy to see you.
00:51:52 Like I’m sure you’re just gonna be beautiful to me today.
00:51:55 You’re not gonna make a huge long list
00:51:57 of everything you should be anxious about.
00:51:59 You’re not gonna repeat that list to me 400 times.
00:52:01 You’re not gonna have me relive
00:52:03 all the regrets I’ve made in life.
00:52:04 I’m sure you’re not doing any of that.
00:52:06 You’re just gonna just help me along all day long.
00:52:08 I mean, it’s a brutal environment in my brain,
00:52:11 and we’ve just become normalized to this environment
00:52:15 that we just accept that this is what it means to be human,
00:52:18 but if we look at it, if we try to muster
00:52:21 as much soberness as we can
00:52:24 about the realities of being human, it’s brutal.
00:52:27 If it is for me, and so am I sad
00:52:31 that the brain may be off one day?
00:52:37 It depends on the contextual setting.
00:52:38 Like how am I feeling?
00:52:39 At what moment are you asking me that?
00:52:40 And my mind is so fickle.
00:52:42 And this is why, again, I don’t trust my conscious mind.
00:52:45 I have been given realities.
00:52:47 I was given a religious reality that was a video game.
00:52:51 And then I figured out it was not a real reality.
00:52:54 And then I lived in a depressive reality,
00:52:56 which delivered this terrible hopelessness.
00:52:59 That wasn’t a real reality.
00:53:00 Then I discovered behavioral psychology,
00:53:03 and I figured out how biased I am, the 188 chronicled biases,
00:53:06 and how my brain is distorting reality all the time.
00:53:08 I have gone from one reality to another.
00:53:11 I don’t trust reality.
00:53:13 I don’t trust realities that are given to me.
00:53:15 And so to try to make a decision
00:53:17 on what I value or not value that future state,
00:53:20 I don’t trust my response.
00:53:22 So not fully listening to the conscious mind
00:53:28 at any one moment as the ultimate truth,
00:53:31 but allowing it to go up and down as it does,
00:53:33 and just kind of being observing it.
00:53:35 Yes, I assume that whatever my conscious mind delivers up
00:53:38 to my awareness is wrong upon landing.
00:53:43 And I just need to figure out where it’s wrong,
00:53:45 how it’s wrong, how wrong it is,
00:53:46 and then try to correct for it as best I can.
00:53:49 But I assume that on impact,
00:53:52 it’s mistaken in some critical ways.
00:53:55 Is there something you can say by way of advice
00:53:59 when the mind is depressive,
00:54:00 when the conscious mind serves up
00:54:06 dark thoughts, how you deal with that,
00:54:08 like how in your own life you’ve overcome that,
00:54:11 and others who are experiencing that can overcome it?
00:54:15 Two things.
00:54:16 One, those depressive states are biochemical states.
00:54:25 It’s not you.
00:54:28 And the suggestions
00:54:31 that this state delivers to you
00:54:33 about the hopelessness of life
00:54:35 or the meaninglessness of it,
00:54:37 or that you should hit the eject button,
00:54:42 that’s a false reality.
00:54:44 And it’s that,
00:54:47 I completely understand the rational decision
00:54:51 to commit suicide.
00:54:53 It is not lost on me at all
00:54:55 that it can seem like a rational situation,
00:54:57 but the key is when you’re in that situation
00:54:59 and those thoughts are landing,
00:55:01 to be able to say, thank you, you’re not real.
00:55:06 I know you’re not real.
00:55:08 And so I’m in a situation where for whatever reason
00:55:10 I’m having this neurochemical state,
00:55:13 but that state can be altered.
00:55:16 And so again, it goes back to the realities
00:55:18 of the difficulties of being human.
00:55:21 And like when I was trying to solve my depression,
00:55:22 I tried literally, you name it, I tried it systematically,
00:55:27 and nothing would fix it.
00:55:29 And so this is what gives me hope with brain interfaces,
00:55:32 for example, like, could I have numbers on my brain?
00:55:35 Can I see what’s going on?
00:55:36 Because I go to the doctor and it’s like,
00:55:38 how do you feel?
00:55:39 I don’t know, terrible.
00:55:41 Like on a scale from one to 10,
00:55:42 how bad do you want to commit suicide?
00:55:43 10.
00:55:45 Okay, here’s this bottle.
00:55:48 How much should I take?
00:55:49 Well, I don’t know, like just.
00:55:51 Yeah, it’s very, very crude.
00:55:52 And this data opens up the,
00:55:56 yeah, it opens up the possibility of really helping
00:56:00 in those dark moments to first understand
00:56:03 the ways, the ups and downs of those dark moments.
00:56:06 On the complete flip side of that,
00:56:07 right, I am very conscious in my own brain
00:56:14 and deeply, deeply grateful that there’s,
00:56:19 it’s almost like a chemistry thing, a biochemistry thing,
00:56:23 where many times throughout the day
00:56:26 I’ll look at, like, this cup and I’ll be overcome with joy
00:56:31 at how amazing it is to be alive.
00:56:34 Like I actually think my biochemistry is such
00:56:37 that it’s not as common, like I’ve talked to people
00:56:42 and I don’t think that’s that common.
00:56:44 Like it’s a, and it’s not a rational thing at all.
00:56:48 It’s like, I feel like I’m on drugs
00:56:51 and I’ll just be like, whoa.
00:56:55 And a lot of people talk about like the meditative
00:56:57 experience will allow you to sort of, you know,
00:56:59 look at some basic things like the movement of your hand
00:57:02 as deeply joyful because it’s like, that’s life.
00:57:06 But I get that from just looking at a cup.
00:57:08 Like I’m waiting for the coffee to brew
00:57:10 and I’ll just be like, fuck, life is awesome.
00:57:15 And I’ll sometimes tweet that, but then I’ll like regret
00:57:17 it later, like, God damn it, you’re so ridiculous.
00:57:20 But yeah, so, but that is purely chemistry.
00:57:24 Like there’s no rational, it doesn’t fit
00:57:27 with the rest of my life.
00:57:28 I have all this shit, I’m always late to stuff.
00:57:30 I’m always like, there’s all this stuff, you know,
00:57:32 I’m super self critical, like really self critical
00:57:35 about everything I do, to the point I almost hate
00:57:38 everything I do, but there’s this engine of joy
00:57:41 for life outside of all that.
00:57:43 And that has to be chemistry.
00:57:45 And this flip side of that is what depression probably is,
00:57:48 is the opposite of that feeling of like,
00:57:53 cause I bet you that feeling of the cup being amazing
00:57:57 would save anybody in a state of depression.
00:58:01 Like that would be like fresh, you’re in a desert
00:58:03 and it’s a drink of water, shit man.
00:58:08 The brain is a, it would be nice to understand
00:58:11 where that’s coming from, to be able to understand
00:58:17 how you hit those lows and those highs
00:58:19 that have nothing to do with the actual reality.
00:58:21 It has to do with some very specific aspects
00:58:25 of how you maybe see the world, maybe,
00:58:28 it could be just like basic habits that you engage in
00:58:31 and then how to walk along the line to find
00:58:34 those experiences of joy.
00:58:35 And this goes back to the discussion we’re having:
00:58:37 human cognition is, in volume, the largest input
00:58:43 of raw material into society.
00:58:45 And it’s not quantified.
00:58:47 We have no bearings on it.
00:58:50 And so you wonder. We both articulated
00:58:55 some of the challenges we have in our own minds.
00:58:58 And it’s likely that others would say,
00:59:02 I have something similar.
00:59:03 And you wonder when you look at society,
00:59:08 how does that contribute to all the other compounding
00:59:11 problems that we’re experiencing?
00:59:12 How does that blind us to the opportunities
00:59:16 we could be looking at?
00:59:18 And so it really, it has this potential distortion effect
00:59:24 on reality that just makes everything worse.
00:59:27 And I hope if we can put some,
00:59:30 if we can assign some numbers to these things
00:59:34 and just to get our bearings,
00:59:36 so we’re aware of what’s going on,
00:59:38 if we could find greater stabilization
00:59:40 in how we conduct our lives and how we build society,
00:59:46 it might be the thing that enables us to scaffold.
00:59:50 Because we’ve really, again, we’ve done it,
00:59:52 humans have done a fantastic job
00:59:53 systematically scaffolding technology
00:59:58 and science and institutions.
01:00:00 It’s human, it’s our own selves,
01:00:02 which we have not been able to scaffold.
01:00:05 We are the one part of this intelligence infrastructure
01:00:08 that remains unchanged.
01:00:11 Is there something you could say about coupling
01:00:15 this brain data with not just the basic human experience,
01:00:19 but say an experience, you mentioned sleep,
01:00:22 but the wildest experience, which is psychedelics,
01:00:26 is there, and there’s been quite a few studies now
01:00:29 that are being approved and run,
01:00:33 which is exciting from a scientific perspective
01:00:36 on psychedelics.
01:00:38 Do you think, what do you think happens
01:00:40 to the brain on psychedelics?
01:00:44 And how can data about this help us understand it?
01:00:48 And when you’re on DMT, do you see elves?
01:00:50 And can we convert that into data?
01:00:54 Can you add aliens in there?
01:00:56 Yeah, aliens, definitely.
01:00:57 Do you actually meet aliens?
01:00:58 And elves, are elves the aliens?
01:01:00 I’m asking for a few Austin friends, yeah,
01:01:06 that are convinced that they’ve actually met the elves.
01:01:08 What are elves like?
01:01:09 Are they friendly?
01:01:11 Are they helpful?
01:01:11 I haven’t met them personally.
01:01:12 Are they like the Smurfs, like they’re industrious
01:01:15 and they have different skill sets and?
01:01:17 Yeah, I think they’re very,
01:01:20 they’re very critical as friends.
01:01:24 They’re trolls.
01:01:26 The elves are trolls.
01:01:28 No, but they care about you.
01:01:30 So there’s a bunch of different versions of trolls.
01:01:33 There’s loving trolls that are harsh on you,
01:01:37 but they want you to be better.
01:01:38 And there’s trolls that just enjoy your destruction.
01:01:42 And I think they’re the ones that care for you.
01:01:45 I think their criticism is the caring kind, but,
01:01:47 see, I haven’t met them directly,
01:01:49 so it’s like a friend of a friend.
01:01:51 Yeah, like a game of telephone.
01:01:53 Yeah, a bit of a telephone game,
01:01:54 and the whole point is that in psychedelics,
01:01:57 and certainly with DMT,
01:01:59 this is where word data, versus brain data, fails,
01:02:05 which is, you know, words can’t convey the experience.
01:02:08 Most people that try, you can be poetic and so on,
01:02:10 but it really does not convey the experience
01:02:12 of what it actually means to meet the elves.
01:02:15 I mean, to me, what baselines this conversation is,
01:02:18 imagine if we were interested in the health of your heart,
01:02:24 and we started and said, okay, Lex, self-introspect,
01:02:28 tell me how’s the health of your heart.
01:02:30 And you sit there and you close your eyes
01:02:31 and you think, feels all right, like things feel okay.
01:02:36 And then you went to the cardiologist
01:02:38 and the cardiologist is like, hey Lex,
01:02:39 you know, tell me how you feel.
01:02:41 You’re like, well, actually, what I’d really like you to do
01:02:43 is do an EKG and a blood panel and look at arterial plaques
01:02:47 and let’s look at my cholesterol.
01:02:49 And there’s like five to 10 studies you would do.
01:02:53 They would then give you this report and say,
01:02:54 here’s the quantified health of your heart.
01:02:58 Now with this data,
01:02:59 I’m going to prescribe the following regime of exercise
01:03:03 and maybe I’ll put you on a statin, like, et cetera.
01:03:06 But the protocol is based upon this data.
01:03:08 You would think the cardiologist is out of their mind
01:03:11 if they just gave you a bottle of statins based upon,
01:03:14 you’re like, well, I think something’s kind of wrong.
01:03:16 And they just kind of experiment and see what happens.
01:03:20 But that’s what we do with our mental health today.
01:03:22 So it’s kind of absurd.
01:03:25 And so if you look at psychedelics to have,
01:03:28 again, to be able to measure the brain
01:03:29 and get a baseline state,
01:03:31 and then to measure during a psychedelic experience
01:03:33 and post the psychedelic experience
01:03:35 and then do it longitudinally,
01:03:37 you now have a quantification of what’s going on.
01:03:39 And so you could then pose questions,
01:03:41 what molecule is appropriate at what dosages,
01:03:45 at what frequency, in what contextual environment,
01:03:47 what happens when I have this diet with this molecule,
01:03:50 with this experience,
01:03:51 all the experimentation you do
01:03:52 when you have good sleep data or HRV.
01:03:55 And so that’s what I think happens,
01:03:57 what we could potentially do with psychedelics
01:04:00 is we could add this level of sophistication
01:04:03 that is not in the industry currently.
01:04:05 And it may improve the outcomes people experience,
01:04:09 it may improve the safety and efficacy.
01:04:11 And so that’s what I hope we are able to achieve.
01:04:14 And it would transform mental health
01:04:19 because we would finally have numbers
01:04:21 to work with to baseline ourselves.
01:04:22 And then if you think about it,
01:04:25 when we talk about things related to the mind,
01:04:26 we talk about the modality.
01:04:28 We use words like meditation or psychedelics
01:04:30 or something else,
01:04:32 because we can’t talk about a marker in the brain.
01:04:34 We can’t use a word to say,
01:04:35 the way we talk about cholesterol,
01:04:36 the way we talk about plaque in the arteries,
01:04:38 the way we talk about HRV.
01:04:40 And so if we have numbers,
01:04:41 then the solutions get mapped to numbers
01:04:45 instead of the modalities being the thing we talk about.
01:04:47 Meditation just does good things in a crude fashion.
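To make the measurement idea he is sketching concrete, here is a minimal Python sketch of the kind of longitudinal record this implies: baseline, during, and post measurements tied to a molecule, a dose, and a context. Every field name is a hypothetical illustration, not Kernel's actual data model.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class BrainMeasurement:
    """One quantified recording; 'features' stands in for whatever
    summary markers a brain interface produces."""
    timestamp: datetime
    phase: str               # "baseline", "during", or "post"
    features: list[float]    # hypothetical numeric markers

@dataclass
class InterventionStudy:
    """Longitudinal record pairing an intervention with measurements,
    so effects can be compared against the person's own baseline."""
    molecule: str
    dose_mg: float
    context: str             # e.g. "clinical", "home"
    sessions: list[BrainMeasurement] = field(default_factory=list)

    def _phase_mean(self, phase: str) -> list[float]:
        rows = [m.features for m in self.sessions if m.phase == phase]
        return [sum(col) / len(col) for col in zip(*rows)]

    def delta(self) -> list[float]:
        """Crude effect estimate: mean post features minus mean baseline."""
        base = self._phase_mean("baseline")
        post = self._phase_mean("post")
        return [p - b for p, b in zip(post, base)]
```

Running the same record-keeping across many sessions is what would let the questions above (which molecule, what dose, what frequency, what context) be posed against numbers rather than recollection.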
01:04:52 So in your blog post,
01:04:53 Zero Principle Thinking, good title,
01:04:56 you ponder how people come up
01:04:57 with truly original ideas.
01:04:59 What’s your thoughts on this as a human
01:05:02 and as a person who’s measuring brain data?
01:05:05 Zeroth principles are building blocks.
01:05:09 First principles are an understanding of system laws.
01:05:14 So if you take, for example, like in Sherlock Holmes,
01:05:17 he’s a first principles thinker.
01:05:18 So he says, once you’ve eliminated the impossible,
01:05:25 anything that remains, however improbable, is true.
01:05:29 Whereas Dirk Gently, the holistic detective
01:05:31 by Douglas Adams says,
01:05:33 I don’t like eliminating the impossible.
01:05:36 So when someone says,
01:05:38 I’m reasoning from a first principles perspective,
01:05:42 they’re trying to assume the fewest number of things
01:05:47 within a given timeframe.
01:05:50 And so when I, after Braintree Venmo,
01:05:54 I set my mind to the question of,
01:05:57 what single thing can I do that would maximally increase
01:06:00 the probability that the human race thrives
01:06:02 beyond what we can even imagine?
01:06:05 And I found that in my conversations with others
01:06:08 in the books I read, in my own deliberations,
01:06:12 I had a missing piece of the puzzle,
01:06:15 because I didn’t feel like,
01:06:17 yeah, I didn’t feel like the future could be deduced
01:06:22 from first principles thinking.
01:06:24 And that’s when I read the book, Zero,
01:06:27 The Biography of a Dangerous Idea.
01:06:29 And I…
01:06:30 It’s a really good book, by the way.
01:06:32 I think it’s my favorite book I’ve ever read.
01:06:34 It’s also a really interesting number, zero.
01:06:37 And I wasn’t aware that the number zero
01:06:40 had to be discovered.
01:06:40 I didn’t realize that it caused a revolution in philosophy
01:06:43 and just tore up math and it tore up,
01:06:46 I mean, it builds modern society,
01:06:48 but it wrecked everything in its way.
01:06:51 It was an unbelievable disruptor, and it was so difficult
01:06:55 for society to get their heads around it.
01:06:57 And so zero is, of course,
01:07:00 the representation of zeroth principle thinking,
01:07:03 which is about the caliber
01:07:06 and consequential nature of an idea.
01:07:09 And so when you talk about what kind of ideas
01:07:15 have civilization transforming properties,
01:07:19 oftentimes they fall in the zeroth category.
01:07:21 And so in thinking this through,
01:07:24 I was wanting to find a quantitative structure
01:07:28 on how to think about these zeroth principles.
01:07:31 And that’s, so I came up with that
01:07:33 to be a coupler with first principles thinking.
01:07:37 And so now it’s a staple as part of how I think about
01:07:39 the world and the future.
01:07:41 So it emphasizes trying to identify,
01:07:44 it lands on that word impossible.
01:07:46 Like what is impossible, essentially trying to identify
01:07:50 what is impossible and what is possible.
01:07:52 And being as, how do you, I mean, this is the thing,
01:07:57 is most of society tells you the range of things
01:08:01 they say is impossible is very wide.
01:08:02 So you need to be shrinking that.
01:08:04 I mean, that’s the whole process of this kind of thinking
01:08:08 is you need to be very rigorous in thinking about
01:08:13 and be very rigorous in trying to be,
01:08:18 trying to draw the lines of what is actually impossible
01:08:22 because very few things are actually impossible.
01:08:26 I don’t know what is actually impossible.
01:08:29 Like it’s the Joe Rogan line: it’s entirely possible.
01:08:33 I like that approach to science, to engineering,
01:08:37 to entrepreneurship, it’s entirely possible.
01:08:40 Basically shrink the impossible to zero,
01:08:43 to a very small set.
01:08:45 Yeah, life constraints favor first principle thinking
01:08:50 because it enables faster action
01:08:55 with higher probability of success.
01:08:57 Pursuing zeroth principle optionality
01:09:00 is expensive and uncertain.
01:09:02 And so in a society constrained by resources,
01:09:06 time and money and a desire for social status,
01:09:10 accomplishment, et cetera, it minimizes zeroth
01:09:13 principle thinking.
01:09:14 But the reason why I think zeroth principle thinking
01:09:16 should be a staple of our shared cognitive infrastructure
01:09:22 is if you look through the history
01:09:24 of the past couple of thousand years
01:09:26 and let’s just say we arbitrarily,
01:09:29 we subjectively try to assess what is a zeroth-level idea.
01:09:34 And we say how many have occurred on what time scales
01:09:37 and what were the contextual settings for it?
01:09:40 I would argue that if you look at AlphaGo,
01:09:44 when the human Go players
01:09:48 saw AlphaGo’s moves,
01:09:53 they attributed it to playing with an alien,
01:09:55 to playing Go with something from another dimension.
01:10:00 And so if you say computational intelligence
01:10:04 has an attribute of introducing zero like insights,
01:10:10 then if you say what is going to be the occurrence
01:10:14 of zeros in society going forward?
01:10:17 And you could reasonably say
01:10:19 probably a lot more than have occurred
01:10:21 and probably more at a faster pace.
01:10:24 So then if you say,
01:10:25 what happens if you have this computational intelligence
01:10:28 throughout society, where the manufacturing, design,
01:10:30 and distribution of intelligence
01:10:31 is now heading towards zero,
01:10:33 you have an increased number of zeros being produced,
01:10:37 with a tight connection between humans and computers.
01:10:41 That’s when I got to a point and said,
01:10:43 we cannot predict the future
01:10:45 with first principles thinking.
01:10:47 We can’t, that cannot be our imagination set.
01:10:50 It can’t be our sole anchor in a situation
01:10:55 where basically the future of our conscious existence,
01:10:57 20, 30, 40, 50 years out, is probably a zero.
01:11:01 So just to clarify, when you say zero,
01:11:06 you’re referring to basically a truly revolutionary idea.
01:11:12 Yeah, something that is currently not a building block
01:11:17 of our shared conscious existence,
01:11:21 either in the form of knowledge.
01:11:24 Yeah, it’s currently not manifest
01:11:26 in what we acknowledge.
01:11:28 So zero principle thinking is playing with ideas
01:11:32 that are so revolutionary that we can’t even clearly reason
01:11:39 about the consequences once those ideas come to be.
01:11:42 Yeah, or for example, like Einstein,
01:11:46 that was a zeroth, I would categorize it
01:11:49 as a zeroth principle insight.
01:11:51 You mean general relativity, space time.
01:11:53 Yeah, space time, yep, yep.
01:11:55 That basically building upon what Newton had done
01:11:59 and said, yes, also, and it just changed the fabric
01:12:04 of our understanding of reality.
01:12:06 And so that was unexpected, it existed.
01:12:09 We just, it became part of our awareness
01:12:13 and the moves AlphaGo made existed.
01:12:16 It just came into our awareness.
01:12:19 And so to your point, there’s this question
01:12:24 of what do we know and what don’t we know?
01:12:27 Do we think we know 99% of all things
01:12:30 or do we think we know 0.001% of all things?
01:12:33 And that goes back to known knowns, known unknowns,
01:12:35 and unknown unknowns.
01:12:37 And first principles and zeroth principle thinking
01:12:39 gives us a quantitative framework to say,
01:12:42 there’s no way for us to mathematically
01:12:44 try to create probabilities for these things.
01:12:47 Therefore, it would be helpful
01:12:50 if they were just part of our standard thought processes
01:12:52 because it may encourage different behaviors
01:12:58 in what we do individually, collectively as a society,
01:13:01 what we aspire to, what we talk about,
01:13:03 the possibility sets we imagine.
01:13:05 Yeah, I’ve been engaged in that kind of thinking
01:13:09 quite a bit and thinking about engineering of consciousness.
01:13:14 I think it’s feasible, I think it’s possible
01:13:17 in the language that we’re using here.
01:13:19 And it’s very difficult to reason about a world
01:13:21 when inklings of consciousness can be engineered
01:13:26 into artificial systems.
01:13:30 Not from a philosophical perspective,
01:13:33 but from an engineering perspective,
01:13:35 I believe a good step towards engineering consciousness
01:13:39 is creating engineering the illusion of consciousness.
01:13:44 So I’m captivated by our natural predisposition
01:13:51 to anthropomorphize things.
01:13:55 And I think that’s what we,
01:13:58 I don’t wanna hear from the philosophers,
01:14:00 but I think that’s what we kind of do to each other.
01:14:05 That consciousness is created socially,
01:14:10 that like much of the power of consciousness
01:14:14 is in the social interaction.
01:14:16 I create your consciousness, no,
01:14:19 I create my consciousness by having interacted with you.
01:14:24 And that’s the display of consciousness.
01:14:26 It’s the same as like the display of emotion.
01:14:28 Emotion is created through communication.
01:14:31 Language is created through its use.
01:14:34 And then we somehow humans kind of,
01:14:36 especially philosophers of the hard problem
01:14:39 of consciousness,
01:14:40 really wanna believe that we possess this thing.
01:14:44 That’s like there’s an elf sitting there with a hat
01:14:50 or like name tag says consciousness,
01:14:52 and they’re like feeding this subjective experience to us
01:14:57 as opposed to like it actually being an illusion
01:15:02 that we construct to make social communication more effective.
01:15:05 And so I think if you focus on creating the illusion
01:15:09 of consciousness, you can create
01:15:11 some very fulfilling experiences in software.
01:15:14 And so that to me is a compelling space of ideas to explore.
01:15:18 I agree with you.
01:15:19 And I think going back to our experience together
01:15:21 with Brain Interfaces on,
01:15:23 you could imagine if we get to a certain level of maturity.
01:15:26 So first let’s take the inverse of this.
01:15:28 So you and I text back and forth
01:15:30 and we’re sending each other emojis.
01:15:33 That has a certain amount of information transfer rate
01:15:37 as we’re communicating with each other.
01:15:39 And so in our communication with people via email
01:15:41 and texts and whatnot,
01:15:42 we’ve taken the bandwidth of human interaction,
01:15:46 the information transfer rate, and we’ve reduced it.
01:15:49 We have less social cues.
01:15:51 We have less information to work with.
01:15:53 There’s a lot more opportunity for misunderstanding.
01:15:55 So that is altering the conscious experience
01:15:57 between two individuals.
01:15:59 And if we add Brain Interfaces to the equation,
01:16:01 let’s imagine now we amplify the dimensionality
01:16:04 of our communications.
01:16:05 That to me is what you’re talking about,
01:16:07 which is consciousness engineering.
01:16:09 Perhaps I understand you with more dimensions.
01:16:13 So maybe I understand your,
01:16:15 when you look at the cup and you experience that happiness,
01:16:17 you can tell me you’re happy.
01:16:18 And I then do theory of mind and say,
01:16:20 I can imagine what it might be like to be Lex
01:16:23 and feel happy about seeing this cup.
01:16:25 But if the interface could then quantify
01:16:26 and give me a 50-dimensional vector space model and say,
01:16:30 this is the version of happiness that Lex is experiencing
01:16:32 as he looked at this cup,
01:16:34 then it would allow me potentially
01:16:36 to have much greater empathy for you
01:16:38 and understand you as a human.
01:16:39 This is how you experience joy,
01:16:41 which is entirely unique from how I experienced joy,
01:16:44 even though we assumed ahead of time
01:16:46 that we’re having some kind of similar experience.
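As a toy illustration of that idea, assuming each person's decoded "joy" state arrives as a plain numeric vector (the 50 dimensions and the values below are hypothetical placeholders, not a real Kernel output), comparing the two experiences becomes a simple vector operation:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """How aligned two emotion-state vectors are, from -1 to 1."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Hypothetical 50-dimensional "joy" vectors for two people,
# as some brain interface's decoding model might produce them.
lex_joy = [0.8, 0.1, 0.3] + [0.0] * 47
brian_joy = [0.5, 0.4, 0.2] + [0.0] * 47

# A high score would suggest the two experiences of joy are similar;
# a low score would quantify how differently each person experiences it.
print(f"similarity: {cosine_similarity(lex_joy, brian_joy):.2f}")
```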
01:16:48 But I agree with you that we do consciousness engineering
01:16:51 today in everything we do.
01:16:52 When we talk to each other, when we’re building products
01:16:55 and that we’re entering into a stage where
01:17:00 it will be much more methodical
01:17:03 and quantitatively based and computational
01:17:06 in how we go about doing it.
01:17:07 Which to me, I find encouraging
01:17:09 because I think it creates better guardrails
01:17:14 to create ethical systems versus right now,
01:17:18 I feel like it’s really a wild, wild west
01:17:21 on how these interactions are happening.
01:17:23 Yeah, and it’s funny you focus on human to human,
01:17:25 but that this kind of data enables human to machine
01:17:29 interaction, which is what we’re kind of talking about
01:17:33 when we say engineering consciousness.
01:17:36 And that will happen, of course,
01:17:39 let’s flip that on its head.
01:17:40 Right now we’re putting humans as the central node.
01:17:44 What if we gave GPT3 a bunch of human brains
01:17:48 and said, hey, GPT3, learn some manners when you speak.
01:17:52 Yeah.
01:17:54 And run your algorithms on human brains
01:17:56 and see how they respond.
01:17:58 So you can be polite and so that you can be friendly
01:18:01 and so that you can be conversationally appropriate,
01:18:04 but to inverse it, to give our machines a training set
01:18:09 in real time with closed loop feedback
01:18:11 so that our machines were better equipped to
01:18:17 find their way through our society
01:18:19 in polite and kind and appropriate ways.
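A minimal sketch of that closed loop, where every function is a hypothetical stub rather than a real GPT-3 or Kernel API: generate a reply, read a measured human reaction, and keep whatever changes the humans respond better to.

```python
import random

def generate_reply(prompt: str, politeness: float) -> str:
    """Stand-in for a language model conditioned on a politeness knob."""
    prefix = "Please, " if politeness > 0.5 else ""
    return prefix + f"here is my answer to: {prompt}"

def brain_reaction_score(reply: str) -> float:
    """Stand-in for a measured human reaction (comfort vs. irritation),
    in [0, 1]. A real system would decode this from brain data."""
    return random.random()

def closed_loop_training(prompt: str, steps: int = 100) -> float:
    """Hill-climb the politeness parameter toward whatever real humans
    react well to, using the live feedback signal."""
    politeness, step_size = 0.5, 0.1
    for _ in range(steps):
        candidate = min(max(politeness + random.uniform(-step_size, step_size), 0.0), 1.0)
        baseline = brain_reaction_score(generate_reply(prompt, politeness))
        trial = brain_reaction_score(generate_reply(prompt, candidate))
        if trial > baseline:          # keep changes humans react better to
            politeness = candidate
    return politeness
```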
01:18:22 I love that idea.
01:18:23 Or better yet, teach it some,
01:18:27 have it read the following documents
01:18:30 and have it visit Austin, Texas.
01:18:32 And so that when you ask, when you tell it,
01:18:34 why don’t you learn some manners,
01:18:36 GPT3 learns to say no.
01:18:41 It learns what it means to be free
01:18:43 and a sovereign individual.
01:18:46 So that, it depends.
01:18:47 So it depends what kind of a version of GPT3 you want.
01:18:50 One that’s free, one that behaves well with the social norms.
01:18:52 Viva la revolution.
01:18:54 You want a socialist GPT3, you want an anarchist GPT3,
01:19:00 you want a polite, like, take-it-home
01:19:03 to-visit-mom-and-dad GPT3, and you want, like, a party
01:19:06 in Vegas, go-to-a-strip-club GPT3, you want all flavors.
01:19:11 And then you’ve gotta have goal alignment between all those.
01:19:14 Yeah, we don’t want them to manipulate each other, for sure.
01:19:20 So that’s, I mean, you kind of spoke to ethics.
01:19:24 One of the concerns that people have in this modern world
01:19:28 of digital data is that of privacy and security.
01:19:32 With privacy, they’re concerned about sharing data.
01:19:37 It’s the same thing as when we trust other human beings:
01:19:40 being fragile and revealing something
01:19:44 that we’re vulnerable about.
01:19:48 There’s a leap of faith, there’s a leap of trust
01:19:51 that that’s going to be just between us.
01:19:54 There’s a privacy to it.
01:19:55 And then the challenge is when you’re in the digital space
01:19:58 then sharing your data with companies
01:20:01 that use that data for advertisement
01:20:03 and all those kinds of things,
01:20:05 there’s a hesitancy to share that much data,
01:20:08 to share a lot of deep personal data.
01:20:10 And if you look at brain data, that feels a whole lot
01:20:14 like it’s richly, deeply personal data.
01:20:17 So how do you think about privacy
01:20:20 with this kind of ocean of data?
01:20:22 I think we got off to a wrong start with the internet
01:20:26 where the basic rules of play for the companies that be were,
01:20:33 if you’re a company, you can go out
01:20:36 and get as much information on a person
01:20:39 as you can find without their approval.
01:20:42 And you can also do things to induce them
01:20:46 to give you as much information.
01:20:48 And you don’t need to tell them what you’re doing with it.
01:20:50 You can do anything on the backside,
01:20:52 you can make money on it, but the game is
01:20:54 who can acquire the most information
01:20:56 and devise the most clever schemes to do it.
01:21:00 That was a bad starting place.
01:21:02 And so we are in this period
01:21:05 where we need to correct for that.
01:21:07 And we need to say, first of all,
01:21:09 the individual always has control over their data.
01:21:14 It’s not a free for all.
01:21:15 It’s not like a game of hungry hippos,
01:21:17 where they can just go out and grab as much as they want.
01:21:20 So for example, when your brain data was recorded today,
01:21:22 the first thing we did in the Kernel app
01:21:24 was you have control over your data.
01:21:27 And so it’s individual consent, it’s individual control.
01:21:31 And then you can build up on top of that,
01:21:33 but it has to be based upon some clear rules of play
01:21:36 where everyone knows what’s being collected,
01:21:39 they know what’s being done with it,
01:21:40 and the person has control over it.
01:21:42 So transparency and control.
01:21:43 So, everybody knows. What does control look like,
01:21:46 my ability to delete the data if I want?
01:21:48 Yeah, delete it and to know who is being shared with
01:21:51 under what terms and conditions.
01:21:53 We haven’t reached that level of sophistication
01:21:55 with our products, where you could say, for example,
01:22:00 hey Spotify, please give me a customized playlist
01:22:04 according to my neurome, and you could say,
01:22:08 you can have access to this vector space model,
01:22:10 but only for this duration of time,
01:22:12 and then you’ve got to delete it.
01:22:15 We haven’t gotten to that level of sophistication,
01:22:16 but these are ideas we need to start talking about
01:22:19 of how would you actually structure permissions?
01:22:22 Yeah.
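To make that concrete, here is one possible shape for such a permission in Python, sketched from the hypothetical Spotify example above; every name and field is made up for illustration, not Kernel's actual app. The design choice is that the individual issues the grant, it covers only a derived data product rather than the raw recordings, and it expires and can be revoked.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

@dataclass
class DataGrant:
    """One explicit, time-limited permission over a derived data product,
    never the raw recordings themselves."""
    grantee: str            # hypothetical, e.g. "spotify"
    scope: str              # e.g. a derived vector model, not raw brain data
    purpose: str            # what it may be used for
    expires_at: datetime    # hard deadline after which data must be deleted
    revoked: bool = False

    def is_valid(self, now: datetime) -> bool:
        return not self.revoked and now < self.expires_at

# The individual issues the grant and can kill it at any time.
grant = DataGrant(
    grantee="spotify",
    scope="neurome-vector-v1",
    purpose="customized playlist",
    expires_at=datetime.now(timezone.utc) + timedelta(days=7),
)
grant.revoked = True     # control stays with the person
```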
01:22:23 And I think it creates a much more stable foundation
01:22:27 for society to build on, where we understand the rules of play
01:22:31 and people aren’t vulnerable to being taken advantage of.
01:22:34 It’s not fair for an individual to be taken advantage of
01:22:39 without their awareness with some other practice
01:22:42 that some company is doing for their sole benefit.
01:22:44 And so hopefully we are going through a process now
01:22:46 where we’re correcting for these things
01:22:48 and that it can be an economy-wide shift,
01:22:54 because really these are fundamentals
01:22:59 we need to have in place.
01:23:01 It’s kind of fun to think about, like in Chrome
01:23:05 when you install an extension or install an app,
01:23:07 it asks you what permissions you’re willing to give,
01:23:10 and it’d be cool if in the future it says,
01:23:13 you can have access to my brain data.
01:23:16 I mean, it’s not unimaginable in the future
01:23:21 that the big technology companies have built a business
01:23:24 based upon acquiring data about you
01:23:26 that they can then use to create a model of you
01:23:27 and sell that predictability.
01:23:29 And so it’s not unimaginable that you will create
01:23:31 with, like, a Kernel device, for example,
01:23:33 a more reliable predictor of you than they could.
01:23:37 And that they’re asking you for permission
01:23:39 to complete their objectives and you’re the one
01:23:41 that gets to negotiate that with them and say, sure.
01:23:43 But so it’s not unimaginable that might be the case.
01:23:49 So there’s a guy named Elon Musk, and he has a company,
01:23:52 one of his many companies, called Neuralink,
01:23:55 that’s also excited about the brain.
01:23:59 So it’d be interesting to hear your kind of opinions
01:24:01 about a very different approach that’s invasive,
01:24:03 that requires surgery, that implants
01:24:06 a data collection device in the brain.
01:24:09 How do you think about the difference between kernel
01:24:10 and Neuralink in the approaches of getting
01:24:15 that stream of brain data?
01:24:17 Elon and I spoke about this a lot early on.
01:24:20 We met up, I had started Kernel and he had an interest
01:24:24 in brain interfaces as well.
01:24:25 And we explored doing something together,
01:24:28 him joining Kernel and ultimately it wasn’t the right move.
01:24:31 And so he started Neuralink and I continued building Kernel,
01:24:35 but it was interesting because we were both
01:24:39 at this very early time where it wasn’t certain
01:24:46 if there was a path to pursue,
01:24:49 if now was the right time to do something
01:24:51 and then the technological choice of doing that.
01:24:53 And so we were both,
01:24:54 our starting point was looking at invasive technologies.
01:24:58 And I was building invasive technology at the time.
01:25:01 That’s ultimately where he’s gone.
01:25:04 Little less than a year after Elon and I were engaged,
01:25:08 I shifted Kernel to do noninvasive.
01:25:12 And we had this neuroscientist come to Kernel.
01:25:15 We got to talking;
01:25:16 he had been doing neurosurgery for 30 years,
01:25:17 one of the most respected neurosurgeons in the US.
01:25:20 And we brought him to Kernel to figure out
01:25:21 the ins and outs of his profession.
01:25:23 And at the very end of our three hour conversation,
01:25:26 he said, you know, every 15 or so years,
01:25:30 a new technology comes along that changes everything.
01:25:34 He said, it’s probably already here.
01:25:37 You just can’t see it yet.
01:25:39 And my jaw dropped.
01:25:40 I thought, because I had spoken to Bob Greenberg
01:25:44 who had built Second Sight, first on the optic nerve,
01:25:48 and then he did an array on the visual cortex.
01:25:53 And then I also became friendly with NeuroPace,
01:25:57 who does the implants for seizure detection
01:25:59 and remediation.
01:26:01 And I saw in their eyes what it was like
01:26:07 to take an implantable device
01:26:10 through a 15 year run.
01:26:11 They initially thought it was seven years
01:26:13 and ended up being 15 years.
01:26:15 And they thought it’d be a hundred million,
01:26:16 but it was 300, 400 million.
01:26:18 And I really didn’t want to build invasive technology.
01:26:23 It was the only thing that appeared to be possible.
01:26:25 But then once I spun up an internal effort
01:26:28 to start looking at noninvasive options,
01:26:30 we said, is there something here?
01:26:32 Is there anything here that again has the characteristics
01:26:34 of it has the high quality data,
01:26:37 it could be low cost, it could be accessible.
01:26:39 Could it make brain interfaces mainstream?
01:26:42 And so I did a bet the company move.
01:26:43 We shifted from invasive to noninvasive.
01:26:47 So the answer is yes to that.
01:26:49 There is something there that’s possible.
01:26:51 The answer is we’ll see.
01:26:52 We’ve now built both technologies
01:26:55 and they’re now you experienced one of them today.
01:26:58 We’re now deploying it.
01:27:02 So we’re trying to figure out what value is really there.
01:27:04 But I’d say it’s really too early to express confidence.
01:27:07 I think it’s too early to assess
01:27:12 which technological choice is the right one
01:27:17 on what time scales.
01:27:19 Yeah, time scales are really important here.
01:27:20 Very important, because if you look at,
01:27:22 like, on the invasive side,
01:27:24 there’s so much activity going on right now
01:27:27 in less invasive techniques to get at the neuron firings,
01:27:34 which is what Neuralink is building.
01:27:36 It’s possible that in 10, 15 years
01:27:39 when they’re scaling that technology,
01:27:40 other things have come along.
01:27:42 And you’d much rather do that,
01:27:44 but then that thing’s clock starts again.
01:27:46 It may not be the case.
01:27:47 It may be the case that Neuralink
01:27:48 has properly chosen the right technology
01:27:50 and that that’s exactly what they want to be.
01:27:53 Totally possible.
01:27:53 And it’s also possible that the path we chose
01:27:55 that are noninvasive fall short for a variety of reasons.
01:27:58 It’s just, it’s unknown.
01:28:00 And so right now the two technologies we chose,
01:28:03 the analogy I’d give you to create a baseline
01:28:06 of understanding is if you think of it
01:28:09 like the internet in the nineties,
01:28:11 the internet became useful
01:28:12 when people could do a dial up connection.
01:28:15 And then as bandwidth increased,
01:28:19 so did the utility of that connection
01:28:21 and so did the ecosystem improve.
01:28:22 And so if you say, what Kernel Flow
01:28:26 is going to give you is a full screen
01:28:28 picture of information,
01:28:30 like you’re gonna be watching a movie,
01:28:32 but the image is going to be blurred
01:28:34 and the audio is gonna be muffled.
01:28:37 So it has a lower resolution of coverage.
01:28:40 Kernel Flux, our MEG technology,
01:28:43 is gonna give you the full movie in 1080p.
01:28:47 And Neuralink is gonna give you a circle
01:28:51 on the screen in 4K.
01:28:55 And so each one has their pros and cons
01:28:57 and it’s give and take.
01:28:59 And so the decision I made with Kernel
01:29:03 was that these two technologies, Flux and Flow,
01:29:06 were basically the answer for the next seven years.
01:29:10 And that they would give rise to the ecosystem
01:29:12 which would become much more valuable
01:29:14 than the hardware itself.
01:29:15 And that we would just continue to improve
01:29:16 on the hardware over time.
01:29:18 And you know, it’s early days, so.
01:29:20 It’s kind of fascinating to think about.
01:29:23 It’s very true that you don’t know;
01:29:27 both paths are very promising.
01:29:30 And it’s like 50 years from now we will look back
01:29:37 and maybe not even remember one of them.
01:29:40 And the other one might change the world.
01:29:43 It’s so cool how technology is.
01:29:44 I mean, that’s what entrepreneurship is,
01:29:47 is like, it’s the zeroth principle.
01:29:50 It’s like you’re marching ahead into the darkness,
01:29:52 into the fog, not knowing.
01:29:54 It’s wonderful to have someone else
01:29:56 out there with us doing this.
01:29:57 Because if you look at brain interfaces, anything
01:30:02 that’s off the shelf right now is inadequate.
01:30:07 It’s had its run for a couple of decades.
01:30:09 It’s still in hacker communities.
01:30:11 It hasn’t gone to the mainstream.
01:30:14 The room size machines are on their own path.
01:30:19 But there is no answer right now
01:30:20 of bringing brain interfaces mainstream.
01:30:23 And so, you know, both they and us,
01:30:27 we’ve each spent over a hundred million dollars.
01:30:29 And that’s kind of what it takes to have a go at this.
01:30:32 Cause you need to build full stack.
01:30:34 I mean, at Kernel, we are from the photon
01:30:37 and the atom through the machine learning.
01:30:40 We have just under a hundred people.
01:30:41 I think it’s something like 36, 37 PhDs
01:30:45 in these specialties, in these areas
01:30:47 that there’s only a few people in the world
01:30:48 who have these abilities.
01:30:50 And that’s what it takes to build next generation,
01:30:53 to make an attempt at breaking into brain interfaces.
01:30:57 And so we’ll see over the next couple of years,
01:30:58 whether it’s the right time
01:31:00 or whether we were both too early
01:31:01 or whether something else comes along in seven to 10 years,
01:31:04 which is the right thing that brings it mainstream.
01:31:08 So you see Elon as a kind of competitor
01:31:11 or a fellow traveler along the path of uncertainty or both?
01:31:17 It’s a fellow traveler.
01:31:19 It’s like at the beginning of the internet:
01:31:21 how many companies are going to be invited
01:31:25 to this new ecosystem?
01:31:29 Like an endless number.
01:31:30 Because if you think about it, the hardware
01:31:33 just starts the process.
01:31:36 And so, okay, back to your initial example,
01:31:37 if you take the Fitbit, for example,
01:31:39 you say, okay, now I can get measurements on the body.
01:31:42 And what do we think the ultimate value
01:31:44 of this device is going to be?
01:31:45 What is the information transfer rate?
01:31:47 And they were in the market for a certain duration of time
01:31:50 and Google bought them for two and a half billion dollars.
01:31:53 They didn’t have ancillary value add.
01:31:55 There weren’t people building on top of the Fitbit device.
01:31:58 They also didn’t have increased insight
01:32:01 with additional data streams.
01:32:02 So it was really just the device.
01:32:04 If you look, for example, at Apple and the device they sell,
01:32:07 you have value in the device that someone buys,
01:32:09 but also you have everyone who’s building on top of it.
01:32:11 So you have this additional ecosystem value
01:32:13 and then you have additional data streams that come in
01:32:15 which increase the value of the product.
01:32:17 And so if you say, if you look at the hardware
01:32:20 as the instigator of value creation,
01:32:24 over time what we’ve built may constitute five or 10%
01:32:28 of the value of the overall ecosystem.
01:32:29 And that’s what we really care about.
01:32:30 What we’re trying to do is kickstart
01:32:33 the mainstream adoption of quantifying the brain.
01:32:38 And the hardware just opens the door to say
01:32:41 what kind of ecosystem could exist.
01:32:44 And that’s why the examples are so relevant
01:32:46 of the things you’ve outlined in your life.
01:32:49 I hope those things, the books people write,
01:32:52 the experiences people build, the conversations you have,
01:32:55 your relationship with your AI systems,
01:32:58 I hope those all are feeding on the insights
01:33:01 built upon this ecosystem we’ve created to better your life.
01:33:04 And so that’s the thinking behind it.
01:33:06 Again, with the Drake equation
01:33:07 being the underlying driver of value.
01:33:10 And the people at Kernel have joined
01:33:13 not because we have certainty of success,
01:33:17 but because we find it to be the most exhilarating
01:33:21 opportunity we could ever pursue in this time to be alive.
01:33:26 You founded the payment system Braintree in 2007,
01:33:30 which acquired Venmo in 2012,
01:33:34 and in that same year was acquired by PayPal, which was then part of eBay.
01:33:39 Can you tell me the story of the vision
01:33:42 and the challenge of building an online payment system
01:33:44 and just building a large successful business in general?
01:33:48 I discovered payments by accident.
01:33:51 When I was 21, I just returned from Ecuador
01:33:57 living among extreme poverty for two years.
01:33:59 And I came back to the US and I was shocked by the opulence
01:34:03 of the United States of America.
01:34:07 Yeah, of the United States.
01:34:09 And I just thought this is, I couldn’t believe it.
01:34:12 And I decided I wanted to try to spend my life helping others.
01:34:16 Like that was the, that was a life objective
01:34:18 that I thought was worthwhile to pursue
01:34:20 versus making money and whatever the case may be
01:34:23 for its own right.
01:34:24 And so I decided in that moment that I was going to
01:34:26 try to make enough money by the age of 30
01:34:30 to never have to work again.
01:34:32 And then with some abundance of money,
01:34:33 I could then choose to do things that might be beneficial
01:34:38 to others, but may not meet the criteria
01:34:40 of being a standalone business.
01:34:42 And so in that process, I started a few companies,
01:34:46 had some small successes, had some failures.
01:34:49 In one of the endeavors, I was up to my eyeballs in debt.
01:34:53 Things were not going well.
01:34:54 And I needed a part time job to pay my bills.
01:34:57 And so one day I saw in the paper, in Utah
01:35:02 where I was living, the 50 richest people in Utah.
01:35:05 And I emailed each one of their assistants and said,
01:35:07 you know, I’m young, I’m resourceful, I’ll do anything.
01:35:11 I’ll just want to, I’m entrepreneurial.
01:35:13 I tried to get a job that would be flexible
01:35:15 and no one responded.
01:35:17 And then I interviewed at a few dozen places.
01:35:21 Nobody would even give me the time of day.
01:35:23 Like they didn’t want to take me seriously.
01:35:25 And so finally I, it was on monster.com
01:35:27 that I saw this job posting for credit card sales
01:35:30 door to door.
01:35:31 Commission.
01:35:32 I did not know the story, this is great.
01:35:35 I love the head drop, that’s exactly right.
01:35:37 So it was.
01:35:39 The low points to which we go in life.
01:35:41 So I responded, and you know, the person made an attempt
01:35:46 at suggesting that they had some kind of standards
01:35:48 for who they would consider hiring.
01:35:50 But it’s kind of like, if you could fog a mirror,
01:35:53 like come and do this because it’s 100% commission.
01:35:55 And so I started walking up and down the street
01:35:57 in my community selling credit card processing.
01:36:00 And so what you learn immediately in doing that is
01:36:03 if you walk into a business, first of all,
01:36:06 the business owner is typically there.
01:36:09 And you walk in the door and they can tell
01:36:10 by how you’re dressed or how you walk,
01:36:12 whatever their pattern recognition is.
01:36:14 And they just hate you immediately.
01:36:15 It’s like, stop wasting my time.
01:36:17 I really am trying to get stuff done.
01:36:18 I don’t want to listen to a sales pitch.
01:36:19 And so you have to overcome the initial get out.
01:36:23 And then once you engage, when you say the word
01:36:26 credit card processing, the person’s like,
01:36:28 I already hate you because I have been taken advantage
01:36:31 of dozens of times because you all are weasels.
01:36:35 And so I had to figure out an algorithm
01:36:37 to get past all those different conditions.
01:36:39 Cause I was still working on my other startup
01:36:41 for the majority of my time.
01:36:42 So I was doing this part time.
01:36:44 And so I figured out that the industry really was built
01:36:48 on deceit, basically people promising things
01:36:55 that were not reality.
01:36:57 And so I’d walk into a business.
01:36:59 I’d say, look, I would give you a hundred dollars.
01:37:01 I’d put a hundred dollar bill down and say,
01:37:02 I’ll give you a hundred dollars
01:37:04 for three minutes of your time.
01:37:05 If you don’t say yes to what I’m saying,
01:37:06 I’ll give you a hundred dollars.
01:37:08 And then they’d usually crack a smile and say, okay,
01:37:10 like what do you got for me son?
01:37:12 And so I’d sit down, I just opened my book and I’d say,
01:37:15 here’s the credit card industry.
01:37:16 Here’s how it works.
01:37:17 Here are the players.
01:37:17 Here’s what they do.
01:37:18 Here’s how they deceive you.
01:37:20 Here’s what I am.
01:37:21 I’m no different than anyone else.
01:37:22 It’s like, you’re gonna process your credit cards.
01:37:24 You’re gonna get the money in the account.
01:37:25 You’re just gonna get a clean statement, you’re gonna have
01:37:27 someone who answers the phone when you call, and you know,
01:37:29 just the basics, like, you’re okay.
01:37:31 And people started saying yes.
01:37:32 And then of course I went to the next business and be like,
01:37:34 you know, Joe and Susie and whoever said yes too.
01:37:37 And so I built a social proof structure
01:37:39 and I became the number one salesperson
01:37:42 out of 400 people nationwide doing this.
01:37:45 And I worked, you know, half time
01:37:47 still doing this other startup and.
01:37:49 That’s a brilliant strategy, by the way.
01:37:51 It’s very well, very well strategized and executed.
01:37:55 I did it for nine months.
01:37:57 And at the time my customer base
01:38:00 was generating around,
01:38:03 if I remember correctly, $62,504 a month
01:38:07 in overall revenues.
01:38:08 I thought, wow, that’s amazing.
01:38:09 If I built that as my own company,
01:38:13 I would just make $62,000 a month of income passively
01:38:16 with these merchants processing credit cards.
01:38:19 So I thought, hmm.
01:38:20 And so that’s when I thought I’m gonna create a company.
01:38:23 And so then I started Braintree.
01:38:25 And the idea was the online world was broken
01:38:29 because PayPal had been acquired by eBay
01:38:35 around, I think, 1999 or 2000.
01:38:38 And eBay had not innovated much with PayPal.
01:38:39 So it basically sat still for seven years
01:38:42 as the software world moved along.
01:38:45 And then authorize.net was also a company
01:38:46 that was relatively stagnant.
01:38:47 So you basically had software engineers
01:38:49 who wanted modern payment tools,
01:38:51 but there were none available for them.
01:38:53 And so they just dealt with software they didn’t like.
01:38:55 And so with Braintree,
01:38:57 I thought the entry point is to build software
01:38:59 that engineers will love.
01:39:00 And if we can find the entry point via software
01:39:02 and make it easy and beautiful
01:39:03 and just a magical experience
01:39:05 and then provide customer service on top of that,
01:39:06 that would be easy, that would be great.
01:39:08 What I was really going after, though, was PayPal.
01:39:11 They were the only company in payments making money.
01:39:14 Because they had the relationship with eBay early on,
01:39:19 people created a PayPal account,
01:39:22 they’d fund their account with their checking account
01:39:24 versus their credit cards.
01:39:25 And then when they’d use PayPal to pay a merchant,
01:39:28 PayPal had a cost of payment of zero
01:39:31 versus if you have coming from a credit card,
01:39:33 you have to pay the bank the fees.
01:39:35 So PayPal’s margins were 3% on a transaction
01:39:39 versus a typical payments company,
01:39:41 which may be a nickel or a penny or a dime
01:39:43 or something like that.
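To put rough numbers on that difference (the fee and cost levels below are illustrative assumptions, not PayPal's or Braintree's actual pricing):

```python
# A $100 sale at an assumed ~2.9% merchant fee.
sale = 100.00
fee = 0.029 * sale                 # ~$2.90 collected from the merchant

# Card-funded: most of the fee passes through to the banks and networks.
card_cost = 0.022 * sale           # assumed interchange and network cost
card_margin = fee - card_cost      # ~$0.70 kept

# Checking-account-funded (the PayPal model): cost of payment near zero,
# so nearly the whole fee is margin.
ach_margin = fee - 0.00            # ~$2.90 kept

print(f"card-funded margin: ${card_margin:.2f}, "
      f"account-funded margin: ${ach_margin:.2f}")
```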
01:39:44 And so I knew PayPal really was the model to replicate,
01:39:48 but a bunch of companies had tried to do that.
01:39:50 They tried to come in and build a two sided marketplace.
01:39:52 So get consumers to fund the checking account
01:39:55 and the merchants to accept it,
01:39:56 but they’d all failed because building
01:39:58 a two sided marketplace is very hard at the same time.
01:40:01 So my plan was I’m going to build a company
01:40:04 and get the best merchants in the whole world
01:40:06 to use our service.
01:40:08 Then in year five, I’m going to have,
01:40:10 I’m going to acquire a consumer payments company
01:40:12 and I’m going to bring the two together.
01:40:14 So, focus on the merchant side, and then get
01:40:19 the payments company that does the customer,
01:40:21 the other side of it.
01:40:24 This is the plan I presented when I was
01:40:26 at University of Chicago.
01:40:28 And weirdly it happened exactly like that.
01:40:32 So four years in our customer base included Uber,
01:40:36 Airbnb, GitHub, 37signals, now Basecamp.
01:40:40 We had a fantastic collection of companies
01:40:43 that represented the fastest growing,
01:40:47 some of the fastest growing tech companies in the world.
01:40:49 And then we met up with Venmo and they had done
01:40:53 a remarkable job in building product.
01:40:55 They did something very counterintuitive,
01:40:56 which is make public your private financial transactions,
01:40:59 which people previously thought were something
01:41:01 that should be hidden from others.
01:41:03 And we acquired Venmo and at that point we now had,
01:41:08 we replicated the model because now people could fund
01:41:11 their Venmo account with their checking account,
01:41:13 keep money in the account.
01:41:14 And then you could just plug Venmo as a form of payment.
01:41:17 And so I think PayPal saw that,
01:41:19 that we were getting the best merchants in the world.
01:41:22 We had people using Venmo.
01:41:25 They were both the up and coming millennials at the time
01:41:28 who had so much influence online.
01:41:30 And so they came in and offered us an attractive number.
01:41:34 And my goal was not to build
01:41:39 the biggest payments company in the world.
01:41:40 It wasn’t to try to climb the Forbes billionaire list.
01:41:44 It was, the objective was I want to earn enough money
01:41:48 so that I can basically dedicate my attention
01:41:52 to doing something that could potentially be useful
01:41:56 on a society wide scale.
01:41:58 And more importantly, that could be considered to be valuable
01:42:03 from the vantage point of 2050, 2100 and 2500.
01:42:08 So thinking about it on a few hundred year timescale.
01:42:13 And there was a certain amount of money I needed to do that.
01:42:16 So I didn’t require the permission of anybody to do that.
01:42:20 And so that what PayPal offered was sufficient for me
01:42:22 to get that amount of money to basically have a go.
01:42:25 And that’s when I set off to survey everything
01:42:30 I could identify an existence to say
01:42:32 of anything in the entire world I could do.
01:42:35 What one thing could I do
01:42:36 that would actually have the highest value potential
01:42:40 for the species?
01:42:42 And so it took me a little while to arrive at brain interfaces,
01:42:44 but.
01:42:46 Payments in themselves are revolutionary technologies
01:42:51 that can change the world.
01:42:53 Like let’s not forget that too easily.
01:43:00 I mean, obviously you know this,
01:43:02 but there’s quite a few lovely folks
01:43:08 who are now fascinated with the space of cryptocurrency.
01:43:13 And payments are very much connected to this,
01:43:17 but in general, just money.
01:43:18 And many of the folks I’ve spoken with,
01:43:21 they also kind of connect that
01:43:22 to not just purely financial discussions,
01:43:25 but philosophical and political discussions.
01:43:28 And they see Bitcoin as a way, almost as activism,
01:43:34 almost as a way to resist the corruption
01:43:38 of centralized centers of power.
01:43:40 And sort of basically in the 21st century,
01:43:42 decentralizing control.
01:43:44 Whether that’s Bitcoin or other cryptocurrencies,
01:43:46 they see that’s one possible way to give power
01:43:51 to those that live in regimes that are corrupt
01:43:55 or are not respectful of human rights
01:43:58 and all those kinds of things.
01:43:59 What’s your sense, just all your expertise with payments
01:44:03 and seeing how that changed the world,
01:44:05 what’s your sense about the lay of the land
01:44:09 for the future of Bitcoin or other cryptocurrencies
01:44:12 in the positive impact it may have on the world?
01:44:16 To be clear, my communication wasn’t meant
01:44:20 to minimize payments or to denigrate it in any way.
01:44:23 It was an attempt at communication
01:44:26 that when I was surveying the world,
01:44:30 it was an algorithm of what could I individually do?
01:44:35 So there are things that exist
01:44:38 that have a lot of potential that can be done.
01:44:41 And then there’s a filtering of how many people
01:44:43 are qualified to do this given thing.
01:44:46 And then there’s a further characterization
01:44:48 that can be done of, okay, given the number
01:44:49 of qualified people, will somebody be a unique outperformer
01:44:54 of that group, to make something
01:44:57 get done that otherwise couldn’t get done?
01:44:59 So there’s a process of assessing
01:45:02 where can you add unique value in the world?
01:45:04 And some of that has to do with you being very formal
01:45:08 and calculative here, but some of that is just like,
01:45:11 what do you sense, like part of that equation
01:45:15 is how much passion you sense within yourself
01:45:17 to be able to drive that through,
01:45:19 to discover the impossibilities and make them possible.
01:45:21 That’s right, and so we were at Braintree,
01:45:23 I think we were the first company to integrate Coinbase
01:45:26 into our platform, I think we were the first payments company
01:45:30 to formally incorporate crypto, if I’m not mistaken.
01:45:35 For people who are not familiar,
01:45:36 Coinbase is a place where you can trade cryptocurrencies.
01:45:39 Yeah, which was one of the only places you could.
01:45:42 So we were early in doing that.
01:45:45 And of course, this was in the year 2013.
01:45:49 So an eternity ago in cryptocurrency land.
01:45:52 I concur with the statement you made
01:45:56 of the potential of the principles
01:46:01 underlying cryptocurrencies.
01:46:04 And that many of the things that they’re building
01:46:07 in the name of money and of moving value
01:46:13 is equally applicable to the brain
01:46:16 and equally applicable to how the brain interacts
01:46:19 with the rest of the world
01:46:20 and how we would imagine doing goal alignment with people.
01:46:25 So to me, it’s a continuous spectrum of possibility.
01:46:29 And your question is isolated on the money.
01:46:32 And I think it just is basically a scaffolding layer
01:46:35 for all of society.
01:46:35 So you don’t see this, money, as particularly distinct
01:46:38 from the rest? I don’t.
01:46:39 I think we at Kernel, we will benefit greatly
01:46:44 from the progress being made in cryptocurrency
01:46:47 because it will be a similar technology stack
01:46:48 we will want to use for many things we want to accomplish.
01:46:51 And so I’m bullish on what’s going on
01:46:55 and think it could greatly enhance brain interfaces
01:46:58 and the value of the brain interface ecosystem.
01:47:01 I mean, is there something you could say about,
01:47:02 first of all, bullish on cryptocurrency versus fiat money?
01:47:05 So do you have a sense that in the 21st century
01:47:08 cryptocurrency will be embraced by governments
01:47:13 and change the face of governments,
01:47:16 the structure of governments?
01:47:18 It’s the same way I think about my diet,
01:47:24 where previously it was conscious Brian,
01:47:28 looking at foods in certain biochemical states.
01:47:32 Am I hungry? Am I irritated? Am I depressed?
01:47:34 And then I choose based upon those momentary windows.
01:47:37 Do I eat at night when I’m fatigued
01:47:39 and I have low willpower?
01:47:40 Am I going to pig out on something?
01:47:42 And the current monetary system is based
01:47:46 upon human conscious decision making.
01:47:48 And politics and power and this whole mess of things.
01:47:51 And what I like about the building blocks
01:47:55 of cryptocurrencies is that it’s methodical, it’s structured,
01:47:58 it is accountable, it’s transparent.
01:48:02 And so it introduces this scaffolding,
01:48:04 which I think, again, is the right starting point
01:48:07 for how we think about building
01:48:09 next generation institutions for society.
01:48:13 And that’s why I think it’s much broader than money.
01:48:16 So I guess what you’re saying is Bitcoin is the demotion
01:48:19 of the conscious mind as well.
01:48:23 In the same way you were talking about diet,
01:48:25 it’s like giving less priority to the ups and downs
01:48:29 of any one particular human mind, in this case your own,
01:48:37 and giving more power to the sort of data-driven system.
01:48:37 Yes, yeah, I think that is accurate,
01:48:41 that cryptocurrency is a version of what I would call
01:48:48 my autonomous self that I’m trying to build.
01:48:51 It is an introduction of an autonomous system
01:48:54 of value exchange and the process of value creation
01:49:02 in society, yes, I see the similarities.
01:49:04 So I guess what you’re saying is Bitcoin
01:49:06 will somehow help me not pig out at night,
01:49:08 or the equivalent of, speaking of diet,
01:49:11 if we could just linger on that topic a little bit,
01:49:15 we already talked about your blog post of I fired myself,
01:49:20 I fired Brian, the evening Brian,
01:49:23 who’s too willing to not make good decisions
01:49:29 for the long-term well-being and happiness
01:49:32 of the entirety of the organism.
01:49:34 Basically you were like pigging out at night.
01:49:36 But it’s interesting, because I do the same,
01:49:41 in fact I often eat one meal a day,
01:49:45 and like I have been this week actually,
01:49:50 especially when I travel, and it’s funny
01:49:54 that it never occurred to me to just basically look
01:49:59 at the fact that I’m able to be much smarter
01:50:02 about my eating decisions in the morning
01:50:04 and the afternoon than I am at night.
01:50:06 So if I eat one meal a day, why not eat
01:50:09 that one meal a day in the morning?
01:50:12 Like I’m not, it never occurred to me,
01:50:16 this revolutionary act, until you outlined it.
01:50:21 So maybe, can you give some details,
01:50:23 and this is just you, this is one person,
01:50:26 Brian, arrives at a particular thing that they do,
01:50:29 but it’s fascinating to kind of look
01:50:32 at this one particular case study,
01:50:34 so what works for you, diet wise?
01:50:36 What’s your actual diet, what do you eat,
01:50:38 how often do you eat?
01:50:40 My current protocol is basically the result
01:50:44 of thousands of experiments and decision making.
01:50:48 So I do this every 90 days, I do the tests,
01:50:51 I do the cycle throughs, then I measure again,
01:50:54 and then I’m measuring all the time.
01:50:56 And so what I, of course I’m optimizing
01:50:59 for my biomarkers, I want perfect cholesterol
01:51:01 and I want perfect blood glucose levels
01:51:03 and perfect DNA methylation processes.
01:51:10 I also want perfect sleep.
01:51:12 And so for example, recently, the past two weeks,
01:51:14 my resting heart rate has been at 42 when I sleep.
01:51:20 And when my resting heart rate’s at 42,
01:51:22 my HRV is at its highest.
01:51:24 And I wake up in the morning feeling more energized
01:51:29 than any other configuration.
01:51:30 And so I know from all these processes
01:51:32 that eating at roughly 8:30 in the morning,
01:51:34 right after I work out on an empty stomach,
01:51:37 creates enough distance between that completed eating
01:51:41 and bedtime where I have almost no digestion processes
01:51:45 going on in my body,
01:51:47 therefore my resting heart rate goes very low.
01:51:49 And when my resting heart rate’s very low,
01:51:51 I sleep with high quality.
01:51:52 And so basically I’ve been trying to optimize
01:51:54 the entirety of what I eat to my sleep quality.
01:51:58 And my sleep quality then of course feeds into my willpower
01:52:00 so it creates this virtuous cycle.
01:52:02 And so at 8:30 what I do is I eat what I call super veggie,
01:52:06 which is, it’s a pudding of 250 grams of broccoli,
01:52:10 150 grams of cauliflower,
01:52:11 and a whole bunch of other vegetables.
01:52:13 Then I eat what I call nutty pudding, which is.
01:52:16 You make the pudding yourself?
01:52:17 Like, what do you call it?
01:52:20 Like a veggie mix, whatever thing, like a blender?
01:52:23 Yeah, it can be made in a high-speed blender.
01:52:25 But basically I eat the same thing every day,
01:52:27 a veggie bowl in the form of pudding,
01:52:30 and then a bowl in the form of nuts.
01:52:34 And then I have.
01:52:35 Vegan.
01:52:36 Vegan, yes.
01:52:37 Vegan, so that’s fat and that’s like,
01:52:40 that’s fat and carbs and that’s the protein and so on.
01:52:43 Then I have a third dish.
01:52:44 Does it taste good?
01:52:45 I love it.
01:52:46 I love it so much I dream about it.
01:52:49 Yeah, that’s awesome.
01:52:50 This is a.
01:52:52 And then I have a third dish which is,
01:52:53 it changes every day.
01:52:55 Today it was kale and spinach and sweet potato.
01:52:58 And then I take about 20 supplements
01:53:03 that hopefully constitute a perfect nutritional profile.
01:53:09 So what I’m trying to do is create the perfect diet
01:53:12 for my body every single day.
01:53:14 Where sleep is part of the optimization.
01:53:16 That’s right.
01:53:17 You’re like, one of the things you’re really tracking.
01:53:18 I mean, can you, well, I have a million questions,
01:53:21 but 20 supplements, like what kind, like,
01:53:23 would you say are essential?
01:53:25 ’Cause I only take, I only take athleticgreens.com slash.
01:53:30 That’s like the multivitamin essentially.
01:53:33 That’s like the lazy man, you know, like,
01:53:35 like if you don’t actually want to think about shit,
01:53:37 that’s what you take and then fish oil and that’s it.
01:53:40 That’s all I take.
01:53:41 Yeah, you know, Alfred North Whitehead said,
01:53:45 civilization advances as it extends the number
01:53:48 of important operations it can do
01:53:50 without thinking about them.
01:53:52 Yes.
01:53:53 So my objective on this is I want an algorithm
01:53:57 for perfect health that I never have to think about.
01:54:00 And then I want that system to be scalable to anybody
01:54:03 so that they don’t have to think about it.
01:54:05 And right now it’s expensive for me to do it.
01:54:07 It’s time consuming for me to do it.
01:54:09 And I have infrastructure to do it,
01:54:10 but the future of being human is not going
01:54:14 to the grocery store and deciding what to eat.
01:54:17 It’s also not reading scientific papers,
01:54:19 trying to decide this thing or that thing.
01:54:21 It’s all N of one.
01:54:23 So it’s devices on the outside and inside your body,
01:54:26 assessing in real time what your body needs
01:54:28 and then creating closed loop systems for that to happen.
01:54:30 Yeah, so right now you’re doing the data collection
01:54:33 and you’re being the scientist,
01:54:35 it’d be much better if the data collection
01:54:39 was essentially being done for you
01:54:40 and you could outsource that to another scientist
01:54:43 that’s doing the N of one study of you.
01:54:46 That’s right, because every time I spend time thinking
01:54:48 about this or executing, spending time on it,
01:54:50 I’m spending less time thinking about building Kernel
01:54:53 or the future of being human.
01:54:55 And so it’s, we just all have the budget
01:54:58 of our capacity on an everyday basis
01:55:01 and we will scaffold our way up out of this.
01:55:05 And so, yeah, hopefully what I’m doing is really,
01:55:07 it serves as a model that others can also build on.
01:55:11 That’s why I wrote about it,
01:55:12 is hopefully people can then take it and improve upon it.
01:55:15 I hold nothing sacred.
01:55:16 I change my diet almost every day
01:55:19 based upon some new test results or science
01:55:21 or something like that, but.
01:55:23 Can you maybe elaborate on the sleep thing?
01:55:24 Why is sleep so important?
01:55:27 And why, presumably, like what does good sleep mean to you?
01:55:34 I think sleep is a contender for being the most powerful
01:55:43 health intervention in existence.
01:55:46 It’s a contender.
01:55:49 I mean, it’s magical what it does if you’re well rested
01:55:53 and what your body can do.
01:55:56 And I mean, for example, I know when I eat close
01:56:00 to my bedtime and I’ve done a systematic study for years
01:56:05 looking at like 15-minute increments of time of day
01:56:07 for when I eat my last meal,
01:56:09 my willpower is directly correlated
01:56:12 to the amount of deep sleep I get.
01:56:14 So my ability to not binge eat at night
01:56:18 when rascal Brian’s out and about
01:56:22 is based upon how much deep sleep I got the night before.
01:56:24 Yeah, there’s a lot to that, yeah.
01:56:27 And so I’ve seen it manifest itself.
01:56:30 And so I think the way I summarize this is
01:56:34 in society we’ve had this myth of,
01:56:36 we tell stories, for example, of entrepreneurship
01:56:39 where this person was so amazing,
01:56:41 they stayed at the office for three days
01:56:43 and slept under their desk.
01:56:44 And we say, wow, that’s amazing, that’s amazing.
01:56:49 And now I think we’re headed towards a state
01:56:52 where we’d say that’s primitive
01:56:54 and really not a good idea on every level.
01:56:58 And so the new mythology is going to be the exact opposite.
01:57:05 Yeah, by the way, just to sort of maybe push back
01:57:08 a little bit on that idea.
01:57:10 Did you sleep under your desk, Lex?
01:57:13 Well, yeah, a lot.
01:57:14 I’m a big believer in that actually.
01:57:16 I’m a big believer in chaos
01:57:19 and giving in to your passion,
01:57:22 and sometimes doing things that are out of the ordinary,
01:57:25 that are not trying to optimize health
01:57:29 for certain periods of time, in pursuit of your passions,
01:57:35 as a signal to yourself that you’re throwing everything away.
01:57:39 So I think what you’re referring to
01:57:42 is how to have good performance for prolonged periods
01:57:46 of time.
01:57:47 I think there’s moments in life
01:57:50 where you need to throw all of that away,
01:57:52 all the plans away, all the structure away.
01:57:54 So I’m not sure I have an eloquent way
01:58:00 of describing exactly what I’m talking about,
01:58:02 but it all depends on people, people are different,
01:58:07 but there’s a danger of over optimization
01:58:11 to where you don’t just give into the madness
01:58:14 of the way your brain flows.
01:58:17 I mean, to push back on my pushback is like,
01:58:21 it’s nice to have like where the foundations
01:58:29 of your brain are not messed with.
01:58:31 So you have a fixed foundation where the diet is fixed,
01:58:35 where the sleep is fixed and that all of that is optimal
01:58:37 and the chaos happens in the space of ideas
01:58:39 as opposed to the space of biology.
01:58:42 But I’m not sure if there’s a,
01:58:47 that requires real discipline and forming habits.
01:58:50 There’s some aspect to which some of the best days
01:58:54 and weeks of my life have been, yeah,
01:58:56 sleeping under a desk kind of thing.
01:58:58 And I don’t, I’m not too willing to let go
01:59:03 of things that empirically worked
01:59:08 for things that work in theory.
01:59:11 And so I’m, again, I’m absolutely with you on sleep.
01:59:16 Also, I’m with you on sleep conceptually,
01:59:19 but I’m also very humbled to understand
01:59:23 that for different people,
01:59:26 good sleep means different things.
01:59:28 I’m very hesitant to trust science on sleep.
01:59:33 I think you should also be a scholar of your own body.
01:59:35 Again, the experiment of N of 1.
01:59:38 I’m not so sure that a full night’s sleep is great for me.
01:59:44 There is something about that power nap
01:59:47 that I just have not fully studied yet,
01:59:50 but that nap is something special.
01:59:52 That I’m not sure I found the optimal thing.
01:59:55 So like there’s a lot to be explored
01:59:57 to what is exactly optimal amount of sleep,
02:00:00 optimal kind of sleep combined with diet
02:00:02 and all those kinds of things.
02:00:03 I mean, that all maps to the sort of data,
02:00:05 at least the truth, exactly what you’re referring to.
02:00:08 Here’s a data point for your consideration.
02:00:10 Yes.
02:00:12 The progress in biology over the past, say decade,
02:00:17 has been stunning.
02:00:18 Yes.
02:00:19 And it now appears as if we will be able to replace
02:00:25 our organs via xenotransplantation.
02:00:28 And so we probably have a path to replace
02:00:33 and regenerate every organ of your body,
02:00:37 except for your brain.
02:00:42 You can lose your hand and your arm and a leg.
02:00:45 You can have an artificial heart.
02:00:47 You can’t operate without your brain.
02:00:49 And so when you make that trade off decision
02:00:52 of whether you’re going to sleep under the desk or not
02:00:54 and go all out for a four day marathon, right?
02:01:00 There’s a cost benefit trade off of what’s going on,
02:01:02 what’s happening to your brain in that situation.
02:01:05 We don’t know the consequences
02:01:07 of modern day life on our brain.
02:01:09 We don’t, it’s the most valuable organ in our existence.
02:01:14 And we don’t know what’s going on
02:01:18 in how we’re treating it today with stress
02:01:20 and with sleep and with diet.
02:01:23 And to me, then if you say that you’re trying to,
02:01:28 you’re trying to optimize life
02:01:30 for whatever things you’re trying to do.
02:01:32 The game, with the progress in anti-aging and biology,
02:01:36 is very soon going to become different
02:01:38 than what it is right now, with organ rejuvenation,
02:01:41 organ replacement.
02:01:42 And I would conjecture that we will value
02:01:48 the health status of our brain above all things.
02:01:52 Yeah, no, absolutely.
02:01:53 Everything you’re saying is true, but we die.
02:01:58 We die pretty quickly, life is short.
02:02:01 And I’m one of those people that I would rather die in battle
02:02:08 than stay safe at home.
02:02:11 It’s like, yeah, you look at kind of,
02:02:14 there’s a lot of things that you can reasonably say,
02:02:17 these are, this is the smart thing to do
02:02:19 that can prevent you, that becomes conservative,
02:02:21 that can prevent you from fully embracing life.
02:02:24 I think ultimately you can be very intelligent
02:02:27 and data driven and also embrace life.
02:02:30 But I err on the side of embracing life.
02:02:33 It’s very, it takes a very skillful person
02:03:36 to not be sort of that hovering parent that says,
02:02:40 you know what, there’s a 3% chance that if you go out,
02:02:44 if you go out by yourself and play, you’re going to die,
02:02:47 get run over by a car, come to a slow or a sudden end.
02:02:51 And I am more a supporter of just go out there.
02:02:57 If you die, you die.
02:02:59 And that’s a, it’s a balance you have to strike.
02:03:02 I think there’s a balance to strike
02:03:04 between long-term optimization and short-term freedom.
02:03:11 For me, for a programmer, for a programming mind,
02:03:15 I tend to over-optimize and I’m very cautious,
02:03:18 and afraid of that, of over-optimizing
02:03:21 and thereby being overly cautious, suboptimally cautious,
02:03:26 about everything I do.
02:03:27 And that’s the ultimate thing I’m trying to optimize for.
02:03:30 It’s funny you said like sleep and all those kinds of things.
02:03:33 I tend to think, this is, you’re being more precise
02:03:38 than I am, but I think I tend to want to minimize stress,
02:03:47 which everything comes into that from your sleep
02:03:49 and all those kinds of things.
02:03:50 But I worry that whenever I’m trying to be too strict
02:03:54 with myself, then the stress goes up
02:03:57 when I don’t follow the strictness.
02:04:00 And so you have to kind of, it’s a weird,
02:04:02 it’s a, there are so many variables in the objective function
02:04:05 that it’s hard to get right.
02:04:07 And sort of not giving a damn about sleep
02:04:09 and not giving a damn about diet is a good thing
02:04:11 to inject in there every once in a while
02:04:14 for somebody who’s trying to optimize everything.
02:04:17 But that’s me just trying to, it’s exactly like you said,
02:04:20 you’re just a scientist, I’m a scientist of myself,
02:04:22 you’re a scientist of yourself.
02:04:24 It’d be nice if somebody else was doing it
02:04:25 and had much better data, because I don’t trust
02:04:28 my conscious mind and I pigged out last night
02:04:30 at some brisket in LA that I regret deeply.
02:04:34 It’s just so, uh.
02:04:37 There’s no point to anything I just said.
02:04:38 But.
02:04:39 But.
02:04:40 What is the nature of your regret on the brisket?
02:04:46 Is it, do you wish you hadn’t eaten it entirely?
02:04:49 Is it that you wish you hadn’t eaten as much as you did?
02:04:51 Is it that?
02:04:55 I think, well, the most regret, I mean,
02:04:58 if we want to be specific, I drank way too much of that,
02:05:03 like diet soda.
02:05:05 My biggest regret is like having drank so much diet soda.
02:05:08 That’s the thing that really was the problem.
02:05:10 I had trouble sleeping because of that.
02:05:12 Because I was like programming and then I was editing.
02:05:14 And so I’d stay up late at night
02:05:15 and then I had to get up to go pee a few times
02:05:18 and it was just a mess.
02:05:19 A mess of a night.
02:05:20 It was, well, it’s not really a mess,
02:05:22 but like it’s so many, it’s like the little things.
02:05:25 I know if I just eat, I drink a little bit of water
02:05:30 and that’s it, and there’s a certain,
02:05:31 all of us have perfect days that we know diet wise
02:05:36 and so on that’s good to follow, you feel good.
02:05:39 I know what it takes for me to do that.
02:05:41 I didn’t fully do that and thereby,
02:05:43 because there’s an avalanche effect
02:05:47 where the other sources of stress,
02:05:50 all the other to do items I have piled on,
02:05:53 my failure to execute on some basic things
02:05:55 that I know make me feel good and all of that combines
02:05:58 to create a mess of a day.
02:06:02 But some of that chaos, you have to be okay with it,
02:06:05 but some of it I wish was a little bit more optimal.
02:06:07 And your ideas about eating in the morning
02:06:11 are quite interesting as an experiment to try.
02:06:14 Can you elaborate, are you eating once a day?
02:06:18 Yes.
02:06:19 In the morning and that’s it.
02:06:22 Can you maybe speak to how that,
02:06:24 you spoke, it’s funny, you spoke about the metrics of sleep,
02:06:30 but you’re also, you run a business,
02:06:34 you’re incredibly intelligent,
02:06:36 mostly your happiness and success
02:06:40 relies on you thinking clearly.
02:06:43 So how does that affect your mind and your body
02:06:45 in terms of performance?
02:06:47 So not just sleep, but actual mental performance.
02:06:50 As you were explaining your objective function of,
02:06:53 for example, in the criteria you were including,
02:06:56 you like certain neurochemical states,
02:06:59 like you like feeling like you’re living life,
02:07:02 that life has enjoyment,
02:07:04 that sometimes you want to disregard certain rules
02:07:07 to have a moment of passion, of focus.
02:07:10 There’s this architecture of the way Lex is,
02:07:13 which makes you happy as a story you tell,
02:07:16 as something you kind of experience,
02:07:17 maybe the experience is a bit more complicated,
02:07:19 but it’s in this idea you have, this is a version of you.
02:07:22 And the reason why I maintain the schedule I do
02:07:26 is I’ve chosen a game to say,
02:07:29 I would like to live a life
02:07:31 where I care more about what intelligent
02:07:37 people who live in the year 2500
02:07:41 think of me than what people think of me today.
02:07:44 That’s the game I’m trying to play.
02:07:46 And so therefore the only thing I really care about
02:07:51 on this optimization is trying to see past myself,
02:07:56 past my limitations, using zeroth-principle thinking,
02:08:01 pull myself out of this contextual mesh we’re in right now
02:08:04 and say, what will matter 100 years from now
02:08:07 and 200 years from now?
02:08:08 What are the big things really going on
02:08:11 that are defining reality?
02:08:13 And I find that if I were to hang out with Diet Soda Lex
02:08:22 and Diet Soda Brian were to play along with that
02:08:24 and my deep sleep were to get crushed as a result,
02:08:28 my mind would not be on what matters
02:08:30 in 100 years or 200 years or 300 years.
02:08:32 I would be irritable.
02:08:34 I would be, I’d be in a different state.
02:08:37 And so it’s just gameplay selection.
02:08:41 It’s what you and I have chosen to think about.
02:08:43 It’s what we’ve chosen to work on.
02:08:47 And this is why I’m saying that no generation of humans
02:08:51 has ever been afforded the opportunity
02:08:54 to look at their lifespan and contemplate
02:08:58 that they will have the possibility of experiencing
02:09:03 an evolved form of consciousness that is undeniable.
02:09:06 They would fall in a zero category of potential.
02:09:10 That to me is the most exciting thing in existence.
02:09:14 And I would not trade any momentary neurochemical state
02:09:19 right now in exchange for that.
02:09:21 I would, I’d be willing to deprive myself
02:09:23 of all momentary joy in pursuit of that goal
02:09:27 because that’s what makes me happy.
02:09:29 That’s brilliant.
02:09:30 But I’m a bit, I just looked it up.
02:09:34 I just looked up Braveheart’s speech
02:09:38 by William Wallace, I don’t know if you’ve seen it.
02:09:41 Fight and you may die, run and you’ll live at least a while.
02:09:45 And dying in your beds many years from now,
02:09:48 would you be willing to trade all the days
02:09:51 from this day to that for one chance,
02:09:53 just one chance, picture Mel Gibson saying this,
02:09:57 to come back here and tell our enemies
02:09:59 that they may take our lives, with growing excitement,
02:10:03 but they’ll never take our freedom.
02:10:06 I get excited every time I see that in the movie,
02:10:08 but that’s kind of how I approach life and eating.
02:10:11 Do you think they were tracking their sleep?
02:10:13 They were not tracking their sleep
02:10:14 and they ate way too much brisket
02:10:16 and they were fat, unhealthy, died early,
02:10:19 and were primitive.
02:10:22 But there’s something in my ape brain
02:10:25 that’s attracted to that, even though most of my life
02:10:29 is fully aligned with the way you see yours.
02:10:32 Part of it is for comedy, of course,
02:10:34 but part of it is I’m almost afraid of overoptimization.
02:10:38 Really what you’re saying though,
02:10:39 if we’re looking at this,
02:10:41 let’s say from a first principles perspective,
02:10:43 when you read those words,
02:10:44 they conjure up certain life experiences,
02:10:46 but you’re basically saying,
02:10:47 I experienced a certain neurotransmitter state
02:10:50 when these things are in action.
02:10:53 That’s all you’re saying.
02:10:53 So whether it’s that or something else,
02:10:55 you’re just saying you have a selection
02:10:57 for a certain state for your body.
02:10:59 And so for you as an engineer of consciousness,
02:11:02 that should just be engineerable.
02:11:05 And that’s just triggering certain chemical reactions.
02:11:08 And so it doesn’t mean they have to be mutually exclusive.
02:11:11 You can have that and experience that
02:11:12 and also not sacrifice long-term health.
02:11:15 And I think that’s the potential of where we’re going
02:11:17 is we don’t have to assume they are trade offs
02:11:23 that must be had.
02:11:25 Absolutely.
02:11:26 And so I guess for my particular brain,
02:11:28 it’s useful to have the outlier experiences
02:11:32 that also come along with the illusion of free will
02:11:35 where I chose those experiences
02:11:37 that make me feel like it’s freedom.
02:11:39 Listen, going to Texas made me realize I spent,
02:11:42 so I still do, but I lived in Cambridge at MIT
02:11:46 and I never felt at home there.
02:11:49 I felt like home in the space of ideas with the colleagues,
02:11:52 like when I was actually discussing ideas,
02:11:54 but there is something about the constraints,
02:11:58 how cautious people are,
02:12:00 how much they valued also kind of a material success,
02:12:05 career success.
02:12:06 When I showed up to Texas, it felt like I belong.
02:12:12 That was very interesting, but that’s my neurochemistry,
02:12:14 whatever the hell that is, whatever,
02:12:17 maybe, probably, it’s rooted in the fact
02:12:18 that I grew up in the Soviet Union
02:12:20 and it was such a constrained system
02:12:22 that you’d really deeply value freedom
02:12:23 and you always want to escape the man
02:12:27 and the control of centralized systems.
02:12:29 I don’t know what it is, but at the same time,
02:12:32 I love strictness.
02:12:33 I love the dogmatic authoritarianism of diet,
02:12:38 of like the same habit, exactly the habit you have.
02:12:41 I think that’s actually when bodies perform optimally,
02:12:43 my body performs optimally.
02:12:45 So balancing those two, I think if I have the data,
02:12:48 every once in a while, party with some wild people,
02:12:51 but most of the time eat once a day,
02:12:54 perhaps in the morning, I’m gonna try that.
02:12:56 That might be very interesting,
02:12:57 but I’d rather not try it.
02:12:59 I’d rather have the data that tells me to do it.
02:13:03 But in general, you’re able to, eating once a day,
02:13:07 think deeply about stuff like this.
02:13:09 A concern that people have is, like, does your energy wane,
02:13:13 all those kinds of things.
02:13:15 Do you find that it’s, especially because it’s unique,
02:13:19 it’s vegan as well.
02:13:21 So you find that you’re able to have a clear mind,
02:13:23 a focus, and just physically and mentally throughout?
02:13:27 Yeah, and I find like my personal experience
02:13:29 in thinking about hard things is,
02:13:36 like oftentimes I feel like I’m looking through a telescope
02:13:40 and like I’m aligning two or three telescopes.
02:13:42 And you kind of have to close one eye
02:13:44 and move it back and forth a little bit
02:13:46 and just find just the right alignment.
02:13:47 Then you find just a sneak peek
02:13:49 at the thing you’re trying to find, but it’s fleeting.
02:13:51 If you move just one little bit, it’s gone.
02:13:54 And oftentimes what I feel like are the ideas
02:13:58 I value the most are like that.
02:14:00 They’re so fragile and fleeting and slippery and elusive.
02:14:06 And it requires a sensitivity to thinking
02:14:13 and a sensitivity to maneuver through these things.
02:14:16 If I concede to a world where I’m on my phone texting,
02:14:21 I’m also on social media.
02:14:22 I’m also doing 15 things at the same time
02:14:25 because I’m running a company
02:14:26 and I’m also feeling terrible from the last night.
02:14:30 It all just comes crashing down.
02:14:31 And the quality of my thoughts goes to a zero.
02:14:35 I’m a functional person, able to respond to basic-level things,
02:14:40 but I don’t feel like I am doing anything interesting.
02:14:44 I think that’s a good word, sensitivity,
02:14:46 because that’s the word that’s used the most.
02:14:50 That’s what thinking deeply feels like
02:14:53 is you’re sensitive to the fragile thoughts.
02:14:55 And you’re right.
02:14:56 All those other distractions kind of dull
02:14:59 your ability to be sensitive to the fragile thoughts.
02:15:02 It’s a really good word.
02:15:05 Out of all the things you’ve done,
02:15:08 you’ve also climbed Mount Kilimanjaro.
02:15:13 Is this true?
02:15:14 It’s true.
02:15:14 Why and how, and what do you take from that experience?
02:15:25 I guess the backstory is relevant
02:15:26 because in that moment, it was the darkest time in my life.
02:15:32 I was ending a 13 year marriage.
02:15:34 I was leaving my religion.
02:15:35 I sold Braintree and I was battling depression
02:15:38 where I was just at the end.
02:15:40 And I got invited to go to Tanzania
02:15:45 as part of a group that was raising money
02:15:47 to build clean water wells.
02:15:49 And I had made some money from Braintree,
02:15:51 and so I was able to donate $25,000.
02:15:54 And it was the first time I had ever had money to donate
02:15:59 outside of paying tithing in my religion.
02:16:01 It was such a phenomenal experience
02:16:04 to contribute something meaningful to someone else
02:16:09 in that form.
02:16:11 And as part of this process,
02:16:12 we were gonna climb the mountain.
02:16:13 And so we went there and we saw the clean water wells
02:16:16 we were building.
02:16:17 We spoke to the people there and it was very energizing.
02:16:20 And then we climbed Kilimanjaro
02:16:21 and I came down with a stomach flu on day three.
02:16:29 And I also had altitude sickness,
02:16:30 but I became so sick that on day four,
02:16:33 or maybe it was day five,
02:16:34 I came into the camp, base camp at 15,000 feet,
02:16:39 just going to the bathroom on myself
02:16:42 and falling all over.
02:16:44 I was just a disaster, I was so sick.
02:16:47 So stomach flu and altitude sickness.
02:16:49 Yeah, and I just was destroyed from the situation.
02:16:57 Plus, it was psychologically one of the lowest points.
02:17:00 Yeah, and I think that was probably a big contributor.
02:17:03 I was just smoked as a human, just absolutely done.
02:17:06 And I had three young children.
02:17:07 And so I was trying to reconcile,
02:17:09 this is not a, whether I live or not
02:17:12 is not my decision by itself.
02:17:14 I’m now intertwined with these three little people
02:17:19 and I have an obligation whether I like it or not,
02:17:22 I need to be there.
02:17:24 And so it did, it felt like I was just stuck
02:17:27 in a straitjacket.
02:17:27 And I had to decide whether I was going to summit
02:17:32 the next day with the team.
02:17:35 And it was a difficult decision
02:17:36 because once you start hiking,
02:17:38 there’s no way to get off the mountain.
02:17:40 And then midnight came and our guide came in
02:17:44 and he said, where are you at?
02:17:44 And I said, I think I’m okay, I think I can try.
02:17:48 And so we went.
02:17:49 And so, from midnight on, I made it to the summit at 5 a.m.
02:17:56 It was one of the most transformational moments
02:17:59 of my existence.
02:18:00 And the mountain became my problem.
02:18:06 It became everything that I was struggling with.
02:18:09 And when I started hiking, it was,
02:18:12 the pain got so ferocious that it was kind of like this.
02:18:19 It became so ferocious that I turned my music to Eminem
02:18:25 and it was, Eminem was the,
02:18:26 he was the only person in existence that spoke to my soul.
02:18:31 And it was something about his anger
02:18:33 and his vibrancy. Eventually,
02:18:38 he was the only person who I could turn on
02:18:39 and I could just say, I feel some relief.
02:18:42 I turned on Eminem and I made it to the summit
02:18:46 after five hours. But just 100 yards from the top,
02:18:51 I was with my guide Ike and I started getting very dizzy
02:18:54 and I felt like I was gonna fall backwards
02:18:56 off this cliff area we were on.
02:18:58 I was like, this is dangerous.
02:19:00 And he said, look, Brian, I know where you’re at.
02:19:04 I know where you’re at.
02:19:06 And I can tell you, you’ve got it in you.
02:19:08 So I want you to look up, take a step, take a breath
02:19:14 and look up, take a breath and take a step.
02:19:16 And I did and I made it.
02:19:19 And so I got there and I just sat down with him at the top.
02:19:22 I just cried like a baby.
02:19:24 Broke down.
02:19:25 Yeah, I just lost it.
02:19:26 And so he’d let me do my thing.
02:19:29 And then we pulled out the pulse oximeter
02:19:31 and he measured my blood oxygen levels
02:19:33 and it was like 50 something percent
02:19:36 and it was danger zone.
02:19:36 So he looked at it and I think he was like really alarmed
02:19:39 that I was in this situation.
02:19:41 And so he said, we can’t get a helicopter here
02:19:44 and we can’t get you emergency evacuated.
02:19:46 You’ve gotta go down.
02:19:47 You’ve gotta hike down to 15,000 feet to get to base camp.
02:19:49 And so we went down the mountain.
02:19:53 I got back down to base camp.
02:19:55 And again, that was pretty difficult.
02:19:57 And then they put me on a stretcher,
02:19:59 this metal stretcher with this one wheel
02:20:01 and a team of six people wheeled me down the mountain.
02:20:04 And it was pretty torturous.
02:20:06 I’m very appreciative they did.
02:20:07 Also the trail was very bumpy.
02:20:09 So they’d go over the big rocks.
02:20:10 And so my head would just slam
02:20:11 against this metal thing for hours.
02:20:14 And so I just felt awful.
02:20:15 Plus I’d get my head slammed every couple of seconds.
02:20:18 So the whole experience was really a life changing moment.
02:20:22 And that was the demarcation of me
02:20:25 basically building a new life.
02:20:26 Basically I said, I’m going to reconstruct Brian,
02:20:31 my understanding of reality, my existential realities,
02:20:35 what I want to go after.
02:20:36 And I try, I mean, as much as that’s possible as a human,
02:20:39 but that’s when I set out to rebuild everything.
02:20:44 Was it the struggle of that?
02:20:46 I mean, there’s also just like the romantic poetic,
02:20:50 it’s a fricking mountain.
02:20:54 There’s a man in pain, psychological and physical
02:20:58 struggling up a mountain.
02:20:59 But it’s just struggle, just in the face of,
02:21:05 just pushing through in the face of hardship or nature too.
02:21:09 Something much bigger than you.
02:21:12 Is that, was that the thing that just clicked?
02:21:14 For me, it felt like I was just locked in with reality
02:21:18 and it was a death match.
02:21:21 It was in that moment, one of us is going to die.
02:21:24 So you were pondering death, like not surviving.
02:21:27 Yep.
02:21:28 And it was, and that was the moment.
02:21:29 And it was, the summit to me was,
02:21:33 I’m going to come out on top and I can do this.
02:21:35 And giving in was, it’s like, I’m just done.
02:21:40 And so it did, I locked in and that’s why,
02:21:43 yeah, mountains are magical to me.
02:21:49 I didn’t expect that.
02:21:50 I didn’t design that.
02:21:51 I didn’t know that was going to be the case.
02:21:52 It would not have been something
02:21:55 I would have anticipated.
02:21:58 But you were not the same man afterwards.
02:22:01 Yeah.
02:22:02 Is there advice you can give to young people today
02:22:06 that look at your story,
02:22:07 that’s successful in many dimensions,
02:22:10 advice you can give to them about how to be successful
02:22:13 in their career, successful in life,
02:22:16 whatever path they choose?
02:22:18 Yes, I would say, listen to advice
02:22:22 and see it for what it is, a mirror of that person,
02:22:28 and then map it, and know that your future
02:22:31 is going to be in zeroth-principle land.
02:22:34 And so what you’re hearing today is a representation
02:22:38 of what may have been the right principles
02:22:39 to build upon previously,
02:22:41 but they’re likely depreciating very fast.
02:22:45 And so I am a strong proponent
02:22:49 that people ask for advice, but they don’t take advice.
02:22:57 So how do you take advice properly?
02:23:00 It’s in the careful examination of the advice.
02:23:03 It’s actually, when the person makes a statement
02:23:06 about a given thing somebody should follow,
02:23:09 the value is not in doing that.
02:23:10 The value is understanding the assumption stack they built,
02:23:13 the assumption and knowledge stack they built
02:23:15 around that body of knowledge.
02:23:18 That’s the value.
02:23:18 It’s not doing what they say.
02:23:20 Considering the advice, but digging deeper
02:23:25 to understand the assumption stack,
02:23:27 like the full person,
02:23:29 I mean, this is deep empathy, essentially,
02:23:32 to understand the journey of the person
02:23:34 that arrived at the advice.
02:23:36 And the advice is just the tip of the iceberg
02:23:39 that ultimately is not the thing that gives you the value.
02:23:41 It could be the right thing to do.
02:23:43 It could be the complete wrong thing to do
02:23:45 depending on the assumption stack.
02:23:47 So you need to investigate the whole thing.
02:23:49 Is there some, are there been people in your startup
02:23:54 and your business journey that have served that role
02:23:59 of advice giver that’s been helpful?
02:24:02 Or do you feel like your journey felt like a lonely path?
02:24:07 Or was it one that was, of course,
02:24:11 we’re all born and die alone.
02:24:15 But do you fundamentally remember the experiences,
02:24:21 one where you leaned on people
02:24:23 at a particular moment in time that changed everything?
02:24:26 Yeah, the most significant moments of my memory,
02:24:30 for example, like on Kilimanjaro,
02:24:33 when Ike, some person I’d never met in Tanzania,
02:24:38 was able to, in that moment, apparently see my soul
02:24:41 when I was in this death match with reality.
02:24:45 And he gave me the instructions, look up, step.
02:24:48 And so there’s magical people in my life
02:24:54 that have done things like that.
02:24:57 And I suspect they probably don’t know.
02:25:00 I probably should be better at identifying those things.
02:25:04 And, but yeah, hopefully the,
02:25:16 I suppose like the wisdom I would aspire to
02:25:18 is to have the awareness and the empathy
02:25:22 to be that for other people.
02:25:24 And not a retail advertiser of advice,
02:25:33 of tricks for life, but deeply meaningful
02:25:39 and empathetic in a one-on-one context
02:25:41 with people, where it really can make a difference.
02:25:45 Yeah, I actually kind of experience,
02:25:47 I think about that sometimes.
02:25:49 You have like an 18 year old kid come up to you.
02:25:51 And it’s not always obvious,
02:25:56 it’s not always easy to really listen to them.
02:26:00 Like not the facts, but like see who that person is.
02:26:04 I think people say that about being a parent
02:26:07 is you want to consider that,
02:26:11 you don’t want to be the authority figure
02:26:14 in the sense that you really want to consider
02:26:16 that there’s a special unique human being there
02:26:18 with a unique brain that may be brilliant
02:26:23 in ways that you are not understanding
02:26:25 that you’ll never be and really try to hear that.
02:26:29 So when giving advice, there’s something to that.
02:26:31 It’s, both sides should be deeply empathetic
02:26:34 about the assumption stack.
02:26:36 I love that terminology.
02:26:38 What do you think is the meaning of this whole thing of life?
02:26:43 Why the hell are we here, Brian Johnson?
02:26:46 We’ve been talking about brains and studying brains
02:26:48 and you had this very eloquent way of describing life
02:26:51 on Earth as an optimization problem
02:26:55 of the cost of intelligence going to zero.
02:26:59 At first through the evolutionary process
02:27:01 and then eventually through building,
02:27:05 through our technology,
02:27:06 building more and more intelligent systems.
02:27:09 You ever ask yourself why it’s doing that?
02:27:13 Yeah, I think the answer to this question,
02:27:17 again, the information value is more in the mirror
02:27:21 it provides of that person,
02:27:24 which is a representation of the technological,
02:27:26 social, political context of the time.
02:27:30 So if you ask this question a hundred years ago,
02:27:32 you would get a certain answer
02:27:33 that reflects that time period.
02:27:35 Same thing would be true of a thousand years ago.
02:27:36 It’s rare, it’s difficult for a person to pull themselves
02:27:40 out of their contextual awareness
02:27:43 and offer a truly original response.
02:27:45 And so knowing that I am contextually influenced
02:27:48 by the situation, that I am a mirror for our reality,
02:27:52 I would say that in this moment,
02:27:55 I think the real game going on
02:28:01 is that evolution built a system
02:28:07 of scaffolding intelligence that produced us.
02:28:11 We are now building intelligence systems
02:28:13 that are scaffolding higher dimensional intelligence,
02:28:19 that’s developing more robust systems of intelligence.
02:28:28 And in that process, with the cost going to zero,
02:28:32 the meaning of life becomes goal alignment,
02:28:37 which is the negotiation of our conscious
02:28:41 and unconscious existence.
02:28:43 And then I’d say the third thing is,
02:28:45 if we’re thinking that we wanna be explorers,
02:28:48 our technological progress is getting to a point
02:28:52 where we could aspirationally say,
02:28:57 we want to figure out what is really going on,
02:29:01 really going on, because does any of this really make sense?
02:29:06 Now we may be a hundred, 200, 500, a thousand years away
02:29:09 from being able to poke our way out of whatever is going on.
02:29:16 But it’s interesting that we could even state an aspiration
02:29:20 to say, we wanna poke at this question.
02:29:22 But I’d say in this moment of time,
02:29:26 the meaning of life is that we can build a future state
02:29:31 of existence that is more fantastic
02:29:35 than anything we could ever imagine.
02:29:37 The striving for something more amazing.
02:29:43 And that defies expectations, that we would consider
02:29:50 bewildering, and all the things that come with that.
02:29:55 And I guess the last thing,
02:29:57 if there’s multiple meanings of life,
02:29:59 it would be infinite games.
02:30:00 James Carse wrote the book,
02:30:01 Finite and Infinite Games.
02:30:04 The only game to play right now is to keep playing the game.
02:30:09 And so this goes back to the algorithm,
02:30:11 the Lex algorithm, of diet soda and brisket
02:30:14 and pursuing the passion.
02:30:16 What I’m suggesting is there’s a moment here
02:30:19 where we can contemplate playing infinite games.
02:30:22 Therefore, it may make sense to err on the side
02:30:25 of making sure one is in a situation
02:30:27 to be playing infinite games if that opportunity arises.
02:30:31 So the landscape of possibility is changing very, very fast
02:30:35 and therefore our old algorithms
02:30:37 of how we might assess risk,
02:30:38 and what things we might pursue
02:30:39 and why, those assumptions may fall away very quickly.
02:30:44 Well, I think I speak for a lot of people
02:30:46 when I say that the game you, Mr. Brian Johnson,
02:30:50 have been playing is quite incredible.
02:30:53 Thank you so much for talking to me.
02:30:54 Thanks, Lex.
02:30:56 Thanks for listening to this conversation
02:30:57 with Brian Johnson and thank you
02:30:59 to Four Sigmatic, NetSuite, Grammarly, and ExpressVPN.
02:31:05 Check them out in the description to support this podcast.
02:31:08 And now let me leave you with some words
02:31:10 from Diane Ackerman.
02:31:12 Our brain is a crowded chemistry lab,
02:31:15 bustling with nonstop neural conversations.
02:31:18 Thank you for listening and hope to see you next time.