Whitney Cummings: Comedy, Robotics, Neurology, and Love #55

Transcript

00:00:00 The following is a conversation with Whitney Cummings.

00:00:03 She’s a standup comedian, actor, producer, writer, director,

00:00:07 and recently, finally, the host of her very own podcast

00:00:11 called Good For You.

00:00:12 Her most recent Netflix special called Can I Touch It?

00:00:15 features in part a robot she affectionately named

00:00:19 Bearclaw that is designed to be visually a replica of Whitney.

00:00:23 It’s exciting for me to see one of my favorite comedians

00:00:26 explore the social aspects of robotics and AI in our society.

00:00:30 She also has some fascinating ideas

00:00:32 about human behavior, psychology, and neurology,

00:00:36 some of which she explores in her book

00:00:37 called I’m Fine and Other Lies.

00:00:41 It was truly a pleasure to meet Whitney

00:00:43 and have this conversation with her

00:00:45 and even to continue it through text afterwards.

00:00:47 Every once in a while, late at night,

00:00:50 I’ll be programming over a cup of coffee

00:00:52 and will get a text from Whitney saying something hilarious

00:00:55 or weirder yet, sending a video of Brian Callen

00:00:58 saying something hilarious.

00:01:00 That’s when I know the universe has a sense of humor

00:01:03 and it gifted me with one hell of an amazing journey.

00:01:07 Then I put the phone down and go back to programming

00:01:10 with a stupid, joyful smile on my face.

00:01:13 If you enjoy this conversation,

00:01:14 listen to Whitney’s podcast, Good For You,

00:01:17 and follow her on Twitter and Instagram.

00:01:19 This is the Artificial Intelligence Podcast.

00:01:22 If you enjoy it, subscribe on YouTube,

00:01:24 give it five stars on Apple Podcasts,

00:01:26 support on Patreon, or simply connect with me on Twitter

00:01:30 at Lex Friedman, spelled F R I D M A N.

00:01:34 This show is presented by Cash App,

00:01:35 the number one finance app in the App Store.

00:01:38 They regularly support Whitney’s Good For You podcast

00:01:40 as well.

00:01:41 I personally use Cash App to send money to friends,

00:01:43 but you can also use it to buy, sell,

00:01:45 and deposit Bitcoin in just seconds.

00:01:47 Cash App also has a new investing feature.

00:01:50 You can buy fractions of a stock, say $1 worth,

00:01:53 no matter what the stock price is.

00:01:55 Brokerage services are provided by Cash App Investing,

00:01:58 a subsidiary of Square, and member SIPC.

00:02:02 I’m excited to be working with Cash App

00:02:04 to support one of my favorite organizations called FIRST,

00:02:07 best known for their FIRST Robotics and Lego competitions.

00:02:10 They educate and inspire hundreds of thousands of students

00:02:13 in over 110 countries,

00:02:16 and have a perfect rating on Charity Navigator,

00:02:18 which means the donated money

00:02:19 is used to maximum effectiveness.

00:02:22 When you get Cash App from the App Store or Google Play,

00:02:25 and use code LEXPODCAST, you’ll get $10,

00:02:28 and Cash App will also donate $10 to FIRST,

00:02:32 which again, is an organization that I’ve personally seen

00:02:35 inspire girls and boys to dream

00:02:37 of engineering a better world.

00:02:40 This podcast is supported by ZipRecruiter.

00:02:43 Hiring great people is hard,

00:02:45 and to me is the most important element

00:02:47 of a successful mission driven team.

00:02:50 I’ve been fortunate to be a part of,

00:02:52 and to lead several great engineering teams.

00:02:54 The hiring I’ve done in the past

00:02:56 was mostly through tools that we built ourselves,

00:02:59 but reinventing the wheel was painful.

00:03:02 ZipRecruiter is a tool that’s already available for you.

00:03:05 It seeks to make hiring simple, fast, and smart.

00:03:08 For example, Kodable cofounder Gretchen Huebner

00:03:11 used ZipRecruiter to find a new game artist

00:03:14 to join her education tech company.

00:03:16 By using ZipRecruiter screening questions

00:03:18 to filter candidates, Gretchen found it easier

00:03:21 to focus on the best candidates,

00:03:23 and finally hired the perfect person for the role

00:03:26 in less than two weeks from start to finish.

00:03:29 ZipRecruiter, the smartest way to hire.

00:03:32 See why ZipRecruiter is effective

00:03:34 for businesses of all sizes by signing up as I did

00:03:37 for free at ziprecruiter.com slash lexpod.

00:03:41 That’s ziprecruiter.com slash lexpod.

00:03:45 And now, here’s my conversation with Whitney Cummings.

00:03:51 I have trouble making eye contact, as you can tell.

00:03:53 Me too.

00:03:54 Did you know that I had to work on making eye contact

00:03:56 because I used to look here?

00:03:58 Do you see what I’m doing?

00:03:59 That helps, yeah, yeah, yeah.

00:04:00 Do you want me to do that?

00:04:01 Well, I’ll do this way, I’ll cheat the camera.

00:04:03 But I used to do this, and finally people,

00:04:05 like I’d be on dates and guys would be like,

00:04:07 are you looking at my hair?

00:04:08 Like they get, it would make people really insecure

00:04:10 because I didn’t really get a lot of eye contact as a kid.

00:04:13 It’s one to three years.

00:04:14 Did you not get a lot of eye contact as a kid?

00:04:16 I don’t know.

00:04:17 I haven’t done the soul searching.

00:04:19 Right.

00:04:20 So, but there’s definitely some psychological issues.

00:04:24 Makes you uncomfortable.

00:04:25 Yeah, for some reason when I connect eyes,

00:04:27 I start to think, I assume that you’re judging me.

00:04:31 Oh, well, I am.

00:04:33 That’s why you assume that.

00:04:34 Yeah.

00:04:35 We all are.

00:04:36 All right.

00:04:36 This is perfect.

00:04:37 The podcast would be me and you both

00:04:38 staring at the table the whole time.

00:04:42 Do you think robots of the future,

00:04:44 ones with human level intelligence,

00:04:45 will be female, male, genderless,

00:04:49 or another gender we have not yet created as a society?

00:04:53 You’re the expert at this.

00:04:55 Well, I’m gonna ask you.

00:04:56 You know the answer.

00:04:57 I’m gonna ask you questions

00:04:58 that maybe nobody knows the answer to.

00:05:00 Okay.

00:05:01 And then I just want you to hypothesize

00:05:04 as an imaginative author, director, comedian.

00:05:10 Can we just be very clear

00:05:12 that you know a ton about this

00:05:14 and I know nothing about this,

00:05:15 but I have thought a lot about

00:05:19 what I think robots can fix in our society.

00:05:22 And I mean, I’m a comedian.

00:05:24 It’s my job to study human nature,

00:05:27 to make jokes about human nature

00:05:28 and to sometimes play devil’s advocate.

00:05:31 And I just see such a tremendous negativity around robots

00:05:35 or at least the idea of robots that it was like,

00:05:38 oh, I’m just gonna take the opposite side for fun,

00:05:40 for jokes and then I was like,

00:05:43 oh no, I really agree with this devil’s advocate argument.

00:05:45 So please correct me when I’m wrong about this stuff.

00:05:49 So first of all, there’s no right and wrong

00:05:51 because we’re all,

00:05:53 I think most of the people working on robotics

00:05:55 are really not actually even thinking

00:05:57 about some of the big picture things

00:06:00 that you’ve been exploring.

00:06:01 In fact, your robot, what’s her name by the way?

00:06:04 Bearclaw.

00:06:05 We’ll go with Bearclaw.

00:06:06 What’s the genesis of that name by the way?

00:06:11 Bearclaw was, I got, I don’t even remember the joke

00:06:15 cause I black out after I shoot specials,

00:06:16 but I was writing something about like the pet names

00:06:19 that men call women, like cupcake, sweetie, honey,

00:06:22 you know, like we’re always named after desserts

00:06:26 or something and I was just writing a joke about,

00:06:29 if you wanna call us a dessert,

00:06:31 at least pick like a cool dessert, you know,

00:06:33 like Bearclaw, like something cool.

00:06:35 So I ended up calling her Bearclaw.

00:06:38 So do you think the future robots

00:06:42 of greater and greater intelligence

00:06:44 would like to make them female, male?

00:06:46 Would we like to assign them gender

00:06:48 or would we like to move away from gender

00:06:50 and say something more ambiguous?

00:06:54 I think it depends on their purpose, you know?

00:06:56 I feel like if it’s a sex robot,

00:06:59 people prefer certain genders, you know?

00:07:01 And I also, you know, when I went down and explored the robot

00:07:05 factory, I was asking about the type of people

00:07:07 that bought sex robots.

00:07:09 And I was very surprised at the answer

00:07:12 because of course the stereotype

00:07:14 was it’s gonna be a bunch of perverts.

00:07:15 It ended up being a lot of people that were handicapped,

00:07:18 a lot of people with erectile dysfunction

00:07:20 and a lot of people that were exploring their sexuality.

00:07:23 A lot of people that thought they were gay,

00:07:25 but weren’t sure, but didn’t wanna take the risk

00:07:28 of trying on someone that could reject them

00:07:31 and being embarrassed or they were closeted

00:07:33 or in a city where maybe that’s, you know,

00:07:36 taboo and stigmatized, you know?

00:07:37 So I think that a gendered sex robot

00:07:40 would serve an important purpose

00:07:42 for someone trying to explore their sexuality.

00:07:44 Am I into men?

00:07:45 Let me try on this thing first.

00:07:46 Am I into women?

00:07:47 Let me try on this thing first.

00:07:48 So I think gendered robots would be important for that.

00:07:51 But I think genderless robots in terms of

00:07:53 emotional support robots, babysitters,

00:07:56 I’m fine for a genderless babysitter

00:07:58 with my husband in the house.

00:08:00 You know, there are places that I think

00:08:02 that genderless makes a lot of sense,

00:08:04 but obviously not in the sex area.

00:08:07 What do you mean with your husband in the house?

00:08:09 What does that have to do with the gender of the robot?

00:08:11 Right, I mean, I don’t have a husband,

00:08:13 but hypothetically speaking,

00:08:14 I think every woman’s worst nightmare

00:08:15 is like the hot babysitter.

00:08:17 You know what I mean?

00:08:19 So I think that there is a time and place,

00:08:21 I think, for genderless, you know, teachers, doctors,

00:08:25 all that kind of, it would be very awkward

00:08:27 if the first robotic doctor was a guy

00:08:29 or the first robotic nurse was a woman.

00:08:32 You know, it’s sort of, that stuff is still loaded.

00:08:36 I think that genderless could just take

00:08:38 the unnecessary drama out of it

00:10:43 and the possibility of sexualizing them

00:10:46 or being triggered by any of that stuff.

00:08:49 So there’s two components to this, to Bearclaw.

00:08:52 So one is the voice and the talking and so on,

00:08:55 and then there’s the visual appearance.

00:08:56 So on the topic of gender and genderless,

00:08:59 in your experience, what has been the value

00:09:03 of the physical appearance?

00:09:04 So has it added much to the depth of the interaction?

00:09:08 I mean, mine’s kind of an extenuating circumstance

00:09:11 because she is supposed to look exactly like me.

00:09:13 I mean, I spent six months getting my face molded

00:09:15 and having, you know, the idea was I was exploring

00:09:19 the concept of can robots replace us?

00:09:21 Because that’s the big fear,

00:09:22 but also the big dream in a lot of ways.

00:09:24 And I wanted to dig into that area because, you know,

00:09:28 for a lot of people, it’s like,

00:09:29 they’re gonna take our jobs and they’re gonna replace us.

00:09:32 Legitimate fear, but then a lot of women I know are like,

00:09:34 I would love for a robot to replace me every now and then

00:09:36 so it can go to baby showers for me

00:09:38 and it can pick up my kids at school

00:09:40 and it can cook dinner and whatever.

00:09:42 So I just think that was an interesting place to explore.

00:09:45 So her looking like me was a big part of it.

00:09:47 Now her looking like me just adds

00:09:49 an unnecessary level of insecurity

00:09:51 because I got her a year ago

00:09:53 and she already looks younger than me.

00:09:54 So that’s a weird problem.

00:09:57 But I think that her looking human was the idea.

00:10:00 And I think that where we are now,

00:10:03 please correct me if I’m wrong,

00:10:04 a human robot resembling an actual human you know

00:10:09 is going to feel more realistic than some generic face.

00:10:13 Well, you’re saying that robots that have some familiarity

00:10:19 like look similar to somebody that you actually know

00:10:22 you’ll be able to form a deeper connection with?

00:10:24 That was the question. I think so on some level, right?

00:10:26 That’s an open question.

00:10:26 I don’t, you know, it’s an interesting.

00:10:30 Or the opposite, because then you know me

00:10:32 and you’re like, well, I know this isn’t real

00:10:33 because you’re right here.

00:10:34 So maybe it does the opposite.

00:10:36 We have a very keen eye for human faces

00:10:39 and are able to detect strangeness,

00:10:41 especially when it has to do with people

00:10:44 whose faces we’ve seen a lot of.

00:10:46 So I tend to be a bigger fan

00:10:48 of moving away completely from faces.

00:10:52 Of recognizable faces?

00:10:54 No, just human faces at all.

00:10:56 In general, because I think that’s where things get dicey.

00:10:58 And one thing I will say is

00:11:00 I think my robot is more realistic than other robots

00:11:03 not necessarily because you have seen me

00:11:05 and then you see her and you go, oh, they’re so similar

00:11:07 but also because human faces are flawed and asymmetrical.

00:11:11 And sometimes we forget when we’re making things

00:11:13 that are supposed to look human,

00:11:14 we make them too symmetrical

00:11:16 and that’s what makes them stop looking human.

00:11:17 So because they mold in my asymmetrical face,

00:11:20 she just, even if someone didn’t know who I was

00:11:22 I think she’d look more realistic than most generic ones

00:11:26 that didn’t have some kind of flaws.

00:11:28 Got it.

00:11:29 Because they start looking creepy

00:11:30 when they’re too symmetrical because human beings aren’t.

00:11:33 Yeah, the flaws is what it means to be human.

00:11:35 So visually as well.

00:11:37 But I’m just a fan of the idea

00:11:39 of letting humans use a little bit more imagination.

00:11:43 So just hearing the voice is enough for us humans

00:11:47 to then start imagining the visual appearance

00:11:50 that goes along with that voice.

00:11:52 And you don’t necessarily need to work too hard

00:11:54 on creating the actual visual appearance.

00:11:56 So there’s some value to that.

00:11:59 When you step into the space of actually building a robot

00:12:03 that looks like Bearclaw,

00:12:04 it’s such a long road of facial expressions,

00:12:07 of sort of making everything smiling, winking,

00:12:13 rolling the eyes, all that kind of stuff.

00:12:14 It gets really, really tricky.

00:12:16 It gets tricky and I think I’m, again, I’m a comedian.

00:12:19 Like I’m obsessed with what makes us human

00:12:21 and our human nature and the nasty side of human nature

00:12:25 tends to be where I’ve ended up

00:12:27 exploring over and over again.

00:12:28 And I was just mostly fascinated by people’s reaction.

00:12:32 So it’s my job to get the biggest reaction

00:12:34 from a group of strangers, the loudest possible reaction.

00:12:37 And I just had this instinct

00:12:39 just when I started building her

00:12:41 and people going, ah, ah, and people scream.

00:12:44 And I mean, I would bring her out on stage

00:12:46 and people would scream.

00:12:48 And I just, to me, that was the next level of entertainment.

00:12:51 Getting a laugh, I’ve done that, I know how to do that.

00:12:53 I think comedians were always trying to figure out

00:12:54 what the next level is and comedy’s evolving so much.

00:12:57 And Jordan Peele had just done

00:12:59 these genius comedy horror movies,

00:13:01 which feel like the next level of comedy to me.

00:13:04 And this sort of funny horror of a robot

00:13:10 was fascinating to me.

00:13:11 But I think the thing that I got the most obsessed with

00:13:15 was people being freaked out and scared of her.

00:13:18 And I started digging around with pathogen avoidance

00:13:21 and the idea that we’ve essentially evolved

00:13:24 to be repelled by anything that looks human,

00:13:27 but is off a little bit.

00:13:28 Anything that could be sick or diseased or dead,

00:13:32 essentially, is our reptilian brain’s way

00:13:33 to get us to not try to have sex with it, basically.

00:13:38 So I got really fascinated by how freaked out and scared.

00:13:41 I mean, I would see grown men get upset.

00:13:44 They’d be like, get that thing away from me,

00:13:45 look, I don’t like that, like people would get angry.

00:13:47 And it was like, you know what this is, you know?

00:13:50 But the sort of like, you know, amygdala getting activated

00:13:55 by something that to me is just a fun toy

00:13:58 said a lot about our history as a species

00:14:02 and what got us into trouble thousands of years ago.

00:14:04 So it’s that, it’s the deep down stuff

00:14:07 that’s in our genetics, but also is it just,

00:14:10 are people freaked out by the fact that there’s a robot?

00:14:13 So it’s not just the appearance,

00:14:14 but there’s an artificial human.

00:14:17 Anything people, I think, and I’m just also fascinated

00:14:21 by the blind spots humans have.

00:14:23 So the idea that you’re afraid of that,

00:14:24 I mean, how many robots have killed people?

00:14:27 How many humans have died at the hands of other humans?

00:14:29 Yeah, a few more. Millions?

00:14:31 Hundreds of millions?

00:14:32 Yet we’re scared of that?

00:14:34 And we’ll go to the grocery store

00:14:36 and be around a bunch of humans

00:14:37 who statistically the chances are much higher

00:14:39 that you’re gonna get killed by humans.

00:14:40 So I’m just fascinated by without judgment

00:14:43 how irrational we are as a species.

00:14:47 The worry is the exponential.

00:14:49 So it’s, you know, you can say the same thing

00:14:51 about nuclear weapons before we dropped them

00:14:54 on Hiroshima and Nagasaki.

00:14:55 So the worry that people have is the exponential growth.

00:14:59 So it’s like, oh, it’s fun and games right now,

00:15:03 but you know, overnight,

00:15:07 especially if a robot provides value to society,

00:15:10 we’ll put one in every home

00:15:11 and then all of a sudden lose track

00:15:13 of the actual large scale impact it has on society.

00:15:17 And then they all of a sudden gain greater and greater control

00:15:20 to where they’ll, you know,

00:15:22 affect our political system

00:15:23 and then affect our decisions.

00:15:25 Didn’t robots already ruin our political system?

00:15:27 Didn’t that just already happen?

00:15:28 Which ones? Oh, Russia hacking.

00:15:30 No offense, but hasn’t that already happened?

00:15:35 I mean, that was like an algorithm

00:15:36 of negative things being clicked on more.

00:15:39 We like to tell stories

00:15:40 and like to demonize certain people.

00:15:43 I think nobody understands our current political system

00:15:46 or discourse on Twitter, the Twitter mobs.

00:15:49 Nobody has a sense, not Twitter, not Facebook,

00:15:52 the people running it.

00:15:53 Nobody understands the impact of these algorithms.

00:15:55 They’re trying their best.

00:15:56 Despite what people think,

00:15:57 they’re not like a bunch of lefties

00:16:00 trying to make sure that Hillary Clinton gets elected.

00:16:03 It’s more that it’s an incredibly complex system

00:16:06 that we don’t, and that’s the worry.

00:16:08 It’s so complex and moves so fast

00:16:11 that nobody will be able to stop it once it happens.

00:16:15 And let me ask a question.

00:16:16 This is a very savage question.

00:16:19 Which is, is this just the next stage of evolution?

00:16:23 As humans, when people will die, yes.

00:16:26 I mean, that’s always happened, you know?

00:16:28 Is this just taking emotion out of it?

00:16:30 Is this basically the next stage of survival of the fittest?

00:16:34 Yeah, you have to think of organisms.

00:16:37 You know, what does it mean to be a living organism?

00:16:41 Like, is a smartphone part of your living organism, or?

00:16:46 We’re in relationships with our phones.

00:16:49 Yeah.

00:16:50 We have sex through them, with them.

00:16:52 What’s the difference between with them and through them?

00:16:54 But it also expands your cognitive abilities,

00:16:57 expands your memory, knowledge, and so on.

00:16:59 So you’re a much smarter person

00:17:00 because you have a smartphone in your hand.

00:17:02 But as soon as it’s out of my hand,

00:17:04 we’ve got big problems,

00:17:06 because we’ve become sort of so morphed with them.

00:17:08 Well, there’s a symbiotic relationship.

00:17:09 And that’s what, so Elon Musk, with Neuralink,

00:17:12 is working on trying to increase the bandwidth

00:17:16 of communication between computers and your brain.

00:17:19 And so further and further expand our ability

00:17:22 as human beings to sort of leverage machines.

00:17:26 And maybe that’s the future,

00:17:28 the next evolutionary step.

00:17:30 It could be also that, yes, we’ll give birth,

00:17:33 just like we give birth to human children right now,

00:17:36 we’ll give birth to AI and they’ll replace us.

00:17:38 I think it’s a really interesting possibility.

00:17:42 I’m gonna play devil’s advocate.

00:17:44 I just think that the fear of robots is wildly classist.

00:17:48 Because, I mean, Facebook,

00:17:50 like it’s easy for us to say they’re taking their data.

00:17:51 Okay, well, a lot of people

00:17:53 that get employment off of Facebook,

00:17:55 they are able to get income off of Facebook.

00:17:58 They don’t care if you take their phone numbers

00:17:59 and their emails and their data, as long as it’s free.

00:18:01 They don’t wanna have to pay $5 a month for Facebook.

00:18:03 Facebook is a wildly democratic thing.

00:18:05 Forget about the election and all that kind of stuff.

00:18:08 A lot of technology making people’s lives easier,

00:18:12 I find that most elite people are more scared

00:18:17 than lower income people.

00:18:18 So, and women for the most part.

00:18:21 So the idea of something that’s stronger than us

00:18:23 and that might eventually kill us,

00:18:25 like women are used to that.

00:18:26 Like that’s not, I see a lot of like really rich men

00:18:29 being like, the robots are gonna kill us.

00:18:31 We’re like, what’s another thing that’s gonna kill us?

00:18:33 I tend to see like, oh,

00:18:35 something can walk me to my car at night.

00:18:37 Like something can help me cook dinner or something.

00:18:39 For people in underprivileged countries

00:18:43 who can’t afford eye surgery, like, can a robot,

00:18:45 can we send a robot to underprivileged places

00:18:48 to do surgery where they can’t?

00:18:50 I work with this organization called Operation Smile

00:18:53 where they do cleft palate surgeries.

00:18:55 And there’s a lot of places

00:18:56 that can’t do a very simple surgery

00:18:59 because they can’t afford doctors and medical care.

00:19:01 And such.

00:19:01 So I just see, and this can be completely naive

00:19:04 and I could be completely wrong,

00:19:05 but I feel like a lot of people are going like,

00:19:08 the robots are gonna destroy us.

00:19:09 Humans, we’re destroying ourselves.

00:19:11 We’re self destructing.

00:19:12 Robots to me are the only hope

00:19:14 to clean up all the messes that we’ve created.

00:19:15 Even when we go try to clean up pollution in the ocean,

00:19:18 we make it worse because of the oil that the tankers use.

00:19:21 Like, it’s like, to me, robots are the only solution.

00:19:25 Firefighters are heroes, but they’re limited

00:19:27 in how many times they can run into a fire.

00:19:30 So there’s just something interesting to me.

00:19:32 I’m not hearing a lot of like,

00:19:34 lower income, more vulnerable populations

00:19:38 talking about robots.

00:19:39 Maybe you can speak to it a little bit more.

00:19:42 There’s an idea, I think you’ve expressed it.

00:19:44 I’ve heard, actually a few female writers

00:19:48 and roboticists have talked to express this idea

00:19:51 that exactly you just said, which is,

00:19:55 it just seems that being afraid of existential threats

00:20:01 of artificial intelligence is a male issue.

00:20:06 Yeah.

00:20:07 And I wonder what that is.

00:20:09 If it, because men have, in certain positions,

00:20:13 like you said, it’s also a classist issue.

00:20:15 They haven’t been humbled by life,

00:20:17 and so you always look for the biggest problems

00:20:20 to take on around you.

00:20:22 It’s a champagne problem to be afraid of robots.

00:20:24 Most people don’t have health insurance.

00:20:26 They’re afraid they’re not gonna be able

00:20:27 to feed their kids.

00:20:28 They can’t afford a tutor for their kids.

00:20:30 I mean, I just think of the way I grew up,

00:20:32 and I had a mother who worked two jobs, had kids.

00:20:36 We couldn’t afford an SAT tutor.

00:20:38 The idea of a robot coming in,

00:20:40 being able to tutor your kids,

00:20:41 being able to provide childcare for your kids,

00:20:43 being able to come in with cameras for eyes

00:20:45 and provide surveillance.

00:20:48 I’m very pro surveillance because I’ve had security problems

00:20:52 and I’ve been, we’re generally in a little more danger

00:20:55 than you guys are.

00:20:56 So I think that robots are a little less scary to us

00:20:58 because we can see them maybe as like free assistance,

00:21:01 help and protection.

00:21:03 And then there’s sort of another element for me personally,

00:21:06 which is maybe more of a female problem.

00:21:08 I don’t know.

00:21:09 I’m just gonna make a generalization, happy to be wrong.

00:21:13 But the emotional sort of component of robots

00:21:18 and what they can provide in terms of, you know,

00:21:22 I think there’s a lot of people that don’t have microphones

00:21:25 that I just recently kind of stumbled upon

00:21:28 in doing all my research on the sex robots

00:21:30 for my standup special, which just,

00:21:32 there’s a lot of very shy people that aren’t good at dating.

00:21:35 There’s a lot of people who are scared of human beings

00:21:37 who have personality disorders

00:21:40 or grow up in alcoholic homes or struggle with addiction

00:21:43 or whatever it is where a robot can solve

00:21:45 an emotional problem.

00:21:46 And so we’re largely having this conversation

00:21:49 about like rich guys that are emotionally healthy

00:21:53 and how scared of robots they are.

00:21:55 We’re forgetting about like a huge part of the population

00:21:58 who maybe isn’t as charming and effervescent

00:22:01 and solvent as, you know, people like you and Elon Musk

00:22:05 and for whom these robots could solve very real problems

00:22:09 in their lives, emotional or financial.

00:22:11 Well, that’s a, in general, a really interesting idea

00:22:13 that most people in the world don’t have a voice.

00:22:16 It’s a, you’ve talked about it,

00:22:18 sort of even the people on Twitter

00:22:19 who are driving the conversation.

00:22:22 You said comments, people who leave comments

00:22:25 represent a very tiny percent of the population

00:22:28 and they’re the ones that, you know,

00:22:30 we tend to think they speak for the population,

00:22:33 but it’s very possible on many topics they don’t at all.

00:22:37 And look, I, and I’m sure there’s gotta be

00:22:39 some kind of legal, you know, sort of structure in place

00:22:43 for when the robots happen.

00:22:45 You know way more about this than I do,

00:22:46 but you know, for me to just go, the robots are bad,

00:22:49 that’s a wild generalization that I feel like

00:22:51 is really inhumane in some way.

00:22:54 You know, just after the research I’ve done,

00:22:56 like you’re gonna tell me that a man whose wife died

00:22:59 suddenly and he feels guilty moving on with a human woman

00:23:02 or can’t get over the grief,

00:23:04 he can’t have a sex robot in his own house?

00:23:06 Why not?

00:23:07 Who cares?

00:23:08 Why do you care?

00:23:09 Well, there’s an interesting aspect of human nature.

00:23:12 So, you know, we tend as a civilization

00:23:16 to create a group that’s the other in all kinds of ways.

00:23:19 Right.

00:23:20 And so you work with animals too,

00:23:23 you’re especially sensitive to the suffering of animals.

00:23:26 Let me kind of ask, what’s your,

00:23:29 do you think we’ll abuse robots in the future?

00:23:33 Do you think some of the darker aspects

00:23:35 of human nature will come out?

00:23:37 I think some people will,

00:23:39 but if we design them properly, the people that do it,

00:23:43 we can put it on record and we can put them in jail.

00:23:46 We can find sociopaths more easily, you know, like.

00:23:49 But why is that a sociopathic thing to harm a robot?

00:23:53 I think, look, I don’t know enough about the consciousness

00:23:56 and stuff as you do.

00:23:57 I guess it would have to be when they’re conscious,

00:23:59 but it is, you know, the part of the brain

00:24:02 that is responsible for compassion,

00:24:04 the frontal lobe or whatever,

00:24:05 like people that abuse animals also abuse humans

00:24:08 and commit other kinds of crimes.

00:24:09 Like that’s, it’s all the same part of the brain.

00:24:11 No one abuses animals and then it’s like,

00:24:13 awesome to women and children

00:24:15 and awesome to underprivileged, you know, minorities.

00:24:18 Like it’s all, so, you know,

00:24:20 we’ve been working really hard to put a database together

00:24:23 of all the people that have abused animals.

00:24:24 So when they commit another crime, you go, okay, this is,

00:24:27 you know, it’s all the same stuff.

00:24:29 And I think people probably think I’m nuts

00:24:32 for a lot of the animal work I do,

00:24:34 but because when animal abuse is present,

00:24:37 another crime is always present,

00:24:38 but the animal abuse is the most socially acceptable.

00:24:40 You can kick a dog and there’s nothing people can do,

00:24:43 but then what they’re doing behind closed doors,

00:24:46 you can’t see.

00:24:47 So there’s always something else going on,

00:24:48 which is why I never feel compunction about it.

00:24:50 But I do think we’ll start seeing the same thing with robots.

00:24:54 The person that kicks the,

00:24:55 I felt compassion, when they kicked the dog robot

00:24:59 it really pissed me off.

00:25:00 I know that they’re just trying to get the stability right

00:25:04 and all that.

00:25:05 But I do think there will come a time

00:25:07 where that will be a great way to be able to figure out

00:25:10 if somebody has like, you know, antisocial behaviors.

00:25:15 You kind of mentioned surveillance.

00:25:18 It’s also a really interesting idea of yours

00:25:20 that you just said, you know,

00:25:21 a lot of people seem to be really uncomfortable

00:25:23 with surveillance.

00:25:24 Yeah.

00:25:25 And you just said that, you know what,

00:25:27 for me, you know, there’s positives for surveillance.

00:25:31 I think people behave better

00:25:32 when they know they’re being watched.

00:25:33 And I know this is a very unpopular opinion.

00:25:36 I’m talking about it on stage right now.

00:25:38 We behave better when we know we’re being watched.

00:25:40 You and I had a very different conversation

00:25:41 before we were recording.

00:25:43 We behave different, you sit up

00:25:46 and you are on your best behavior.

00:25:47 And I’m trying to sound eloquent

00:25:49 and I’m trying to not hurt anyone’s feelings.

00:25:51 And I mean, I have a camera right there.

00:25:52 I’m behaving totally different

00:25:54 than when we first started talking.

00:25:56 You know, when you know there’s a camera,

00:25:58 you behave differently.

00:25:59 I mean, there’s cameras all over LA at stoplights

00:26:02 so that people don’t run stoplights,

00:26:04 but there’s not even film in it.

00:26:05 They don’t even use them anymore, but it works.

00:26:07 It works.

00:26:08 Right?

00:26:09 And I’m, you know, working on this thing

00:26:10 in stand about surveillance.

00:26:11 It’s like, that’s why we invented Santa Claus.

00:26:14 You know, Santa Claus

00:26:15 is the first surveillance basically.

00:26:17 All we had to say to kids is he’s making a list

00:26:20 and he’s watching you and they behave better.

00:26:22 That’s brilliant.

00:26:23 You know, so I do think that there are benefits

00:26:26 to surveillance.

00:26:27 You know, I think we all do sketchy things in private

00:26:30 and we all have watched weird porn

00:26:33 or Googled weird things.

00:26:34 And we don’t want people to know about it,

00:26:37 our secret lives.

00:26:37 So I do think that obviously there’s,

00:26:40 we should be able to have a modicum of privacy,

00:26:42 but I tend to think that people

00:26:44 that are the most negative about surveillance

00:26:47 have the most secrets.

00:26:48 The most to hide.

00:26:49 Yeah.

00:26:50 Well, you should,

00:26:52 you’re saying you’re doing bits on it now?

00:26:54 Well, I’m just talking in general about,

00:26:56 you know, privacy and surveillance

00:26:58 and how paranoid we’re kind of becoming

00:27:00 and how, you know, I mean, it’s just wild to me

00:27:03 that people are like, our emails are gonna leak

00:27:05 and they’re taking our phone numbers.

00:27:07 Like there used to be a book full of phone numbers

00:27:11 and addresses that were, they just threw it at your door.

00:27:15 And we all had a book of everyone’s numbers.

00:27:18 You know, this is a very new thing.

00:27:20 And, you know, I know our amygdala is designed

00:27:22 to compound sort of threats

00:27:24 and, you know, there’s stories about,

00:27:27 and I think we all just glom on in a very, you know,

00:27:30 tribal way of like, yeah, they’re taking our data.

00:27:32 Like, we don’t even know what that means,

00:27:33 but we’re like, well, yeah, they, they, you know?

00:27:38 So I just think that someone’s like, okay, well, so what?

00:27:40 They’re gonna sell your data?

00:27:41 Who cares?

00:27:42 Why do you care?

00:27:43 First of all, that bit will kill in China.

00:27:47 So, and I say that sort of only a little bit joking

00:27:51 because a lot of people in China, including the citizens,

00:27:55 despite what people in the West think of as abuse,

00:27:59 are actually in support of the idea of surveillance.

00:28:03 Sort of, they’re not in support of the abuse of surveillance,

00:28:06 but they’re, they like, I mean,

00:28:08 the idea of surveillance is kind of like

00:28:11 the idea of government, like you said,

00:28:14 we behave differently.

00:28:15 And in a way, it’s almost like why we like sports.

00:28:18 There’s rules.

00:28:19 And within the constraints of the rules,

00:28:22 this is a more stable society.

00:28:25 And they make good arguments about success,

00:28:28 being able to build successful companies,

00:28:30 being able to build successful social lives

00:28:32 around a fabric that’s more stable.

00:28:34 When you have surveillance, it keeps the criminals away,

00:28:37 keeps animal abuse away, whatever the values of the society,

00:28:41 with surveillance, you can enforce those values better.

00:28:44 And here’s what I will say.

00:28:45 There’s a lot of unethical things happening

00:28:47 with surveillance.

00:28:48 Like I feel the need to really make that very clear.

00:28:52 I mean, the fact that Google is like collecting data on

00:28:54 whether people’s hands start moving on the mouse

00:28:55 to find out if they’re getting Parkinson’s

00:28:58 and then their insurance goes up,

00:29:00 like that is completely unethical and wrong.

00:29:02 And I think stuff like that,

00:29:03 we have to really be careful around.

00:29:05 So the idea of using our data to raise our insurance rates

00:29:08 or, you know, I heard that they’re looking,

00:29:10 they can sort of predict if you’re gonna have depression

00:29:13 based on your selfies by detecting micro muscles

00:29:16 in your face, you know, all that kind of stuff,

00:29:18 that is a nightmare, not okay.

00:29:20 But I think, you know, we have to delineate

00:29:22 what’s a real threat and what’s getting spam

00:29:25 in your email box.

00:29:26 That’s not what to spend your time and energy on.

00:29:28 Focus on the fact that every time you buy cigarettes,

00:29:31 your insurance is going up without you knowing about it.

00:29:35 On the topic of animals too,

00:29:36 can we just linger on a little bit?

00:29:38 Like, what do you think,

00:29:41 what does this say about our society

00:29:43 the society-wide abuse of animals

00:29:46 that we see in general, sort of factory farming,

00:29:48 just in general, just the way we treat animals

00:29:50 of different categories, like what do you think of that?

00:29:57 What does a better world look like?

00:29:59 What should people think about it in general?

00:30:03 I think the most interesting thing

00:30:06 I can probably say around this that’s the least emotional,

00:30:09 cause I’m actually a very non emotional animal person

00:30:11 because it’s, I think everyone’s an animal person.

00:30:14 It’s just a matter of if it’s yours

00:30:15 or if you’ve been conditioned to go numb, you know.

00:30:19 I think it’s really a testament to what as a species

00:30:22 we are able to be in denial about,

00:30:24 mass denial and mass delusion,

00:30:26 and how we’re able to dehumanize and debase groups,

00:30:31 you know, World War II,

00:30:34 in a way in order to conform

00:30:36 and find protection in the conforming.

00:30:38 So we are also a species who used to go to coliseums

00:30:43 and watch elephants and tigers fight to the death.

00:30:47 We used to watch human beings be pulled apart

00:30:50 and that wasn’t that long ago.

00:30:53 We’re also a species who had slaves

00:30:56 and it was socially acceptable by a lot of people.

00:30:59 People didn’t see anything wrong with it.

00:31:00 So we’re a species that is able to go numb

00:31:02 and that is able to dehumanize very quickly

00:31:05 and make it the norm.

00:31:08 Child labor wasn’t that long ago.

00:31:10 The idea that now we look back and go,

00:31:12 oh yeah, kids were losing fingers in factories making shoes.

00:31:17 Like someone had to come in and make that, you know.

00:31:20 So I think it just says a lot about the fact that,

00:31:23 you know, we are animals and we are self serving

00:31:25 and one of the most successful,

00:31:27 the most successful species

00:31:29 because we are able to debase and degrade

00:31:33 and essentially exploit anything that benefits us.

00:31:36 I think the pendulum is gonna swing as being late.

00:31:39 Which way?

00:31:40 Like, I think we’re Rome now, kind of.

00:31:42 I think we’re on the verge of collapse

00:31:44 because we are dopamine receptors.

00:31:47 Like we are just, I think we’re all kind of addicts

00:31:49 when it comes to this stuff.

00:31:50 Like we don’t know when to stop.

00:31:53 It’s always the buffet.

00:31:54 Like we’re, the thing that used to keep us alive,

00:31:56 which is killing animals and eating them,

00:31:58 now killing animals and eating them

00:31:59 is what’s killing us in a way.

00:32:01 So it’s like, we just can’t,

00:32:02 we don’t know when to call it and we don’t,

00:32:04 moderation is not really something

00:32:06 that humans have evolved to have yet.

00:32:10 So I think it’s really just a flaw in our wiring.

00:32:13 Do you think we’ll look back at this time

00:32:15 as our society is being deeply unethical?

00:32:19 Yeah, yeah, I think we’ll be embarrassed.

00:32:22 Which are the worst parts right now going on?

00:32:24 Is it? In terms of animal?

00:32:26 Well, I think. No, in terms of anything.

00:32:27 What’s the unethical thing?

00:32:29 If we, and it’s very hard just to take a step out of it,

00:32:32 but you just said we used to watch, you know,

00:32:37 there’s been a lot of cruelty throughout history.

00:32:40 What’s the cruelty going on now?

00:32:42 I think it’s gonna be pigs.

00:32:44 I think it’s gonna be, I mean,

00:32:45 pigs are one of the most emotionally intelligent animals

00:32:48 and they have the intelligence of like a three year old.

00:32:51 And I think we’ll look back and be really,

00:32:54 they use tools.

00:32:55 I mean, I think we have this narrative

00:32:58 that they’re pigs and they’re pigs

00:32:59 and they’re disgusting and they’re dirty

00:33:01 and their bacon is so good.

00:33:02 I think that we’ll look back one day

00:33:04 and be really embarrassed about that.

00:33:06 Is this for just the, what’s it called?

00:33:09 The factory farming?

00:33:10 So basically mass.

00:33:11 Because we don’t see it.

00:33:12 If you saw, I mean, we do have,

00:33:14 I mean, this is probably an evolutionary advantage.

00:33:17 We do have the ability to completely

00:33:20 pretend something’s not,

00:33:21 something that is so horrific that it overwhelms us

00:33:24 and we’re able to essentially deny that it’s happening.

00:33:27 I think if people were to see what goes on

00:33:29 in factory farming,

00:33:30 and also were really to take in how bad it is for us,

00:33:35 you know, we’re hurting ourselves first and foremost

00:33:37 with what we eat,

00:33:38 but that’s also a very elitist argument, you know?

00:33:41 It’s a luxury to be able to complain about meat.

00:33:44 It’s a luxury to be able to not eat meat, you know?

00:33:47 There’s very few people because of, you know,

00:33:49 how the corporations have set up meat being cheap.

00:33:53 You know, it’s $2 to buy a Big Mac,

00:33:55 it’s $10 to buy a healthy meal.

00:33:57 You know, that’s, I think a lot of people

00:34:00 don’t have the luxury to even think that way.

00:34:02 But I do think that animals in captivity,

00:34:04 I think we’re gonna look back

00:34:05 and be pretty grossed out about mammals in captivity,

00:34:07 whales, dolphins.

00:34:08 I mean, that’s already starting to dismantle, circuses,

00:34:12 we’re gonna be pretty embarrassed about.

00:34:14 But I think it’s really more a testament to,

00:34:17 you know, there’s just such an ability to go like,

00:34:22 that thing is different than me and we’re better.

00:34:25 It’s the ego, I mean, it’s just,

00:34:26 we have the species with the biggest ego ultimately.

00:34:29 Well, that’s what I think,

00:34:30 that’s my hope for robots is they’ll,

00:34:32 you mentioned consciousness before,

00:34:34 nobody knows what consciousness is,

00:34:37 but I’m hoping robots will help us empathize

00:34:42 and understand that there’s other creatures

00:34:47 besides ourselves that can suffer,

00:34:50 that can experience the world

00:34:54 and that we can torture by our actions.

00:34:57 And robots can explicitly teach us that,

00:34:59 I think better than animals can.

00:35:01 I have never seen such compassion

00:35:06 from a lot of people in my life

00:35:10 toward any human, animal, child,

00:35:13 as I have from a lot of people

00:35:15 in the way they interact with the robot.

00:35:16 Because I think there’s something of,

00:35:19 I mean, I was on the robot owner’s chat boards

00:35:23 for a good eight months.

00:35:25 And the main emotional benefit is

00:35:28 she’s never gonna cheat on you,

00:35:30 she’s never gonna hurt you,

00:35:31 she’s never gonna lie to you,

00:35:33 she doesn’t judge you.

00:35:34 I think that robots help people,

00:35:38 and this is part of the work I do with animals,

00:35:40 like I do equine therapy and train dogs and stuff,

00:35:42 because there is this safe space to be authentic.

00:35:46 With this being that doesn’t care

00:35:47 what you do for a living,

00:35:48 doesn’t care how much money you have,

00:35:50 doesn’t care who you’re dating,

00:35:51 doesn’t care what you look like,

00:35:52 doesn’t care if you have cellulite, whatever,

00:35:54 you feel safe to be able to truly be present

00:35:57 without being defensive and worrying about eye contact

00:35:59 and being triggered by needing to be perfect

00:36:02 and fear of judgment and all that.

00:36:04 And robots really can’t judge you yet,

00:36:08 but they can’t judge you,

00:36:09 and I think it really puts people at ease

00:36:13 and at their most authentic.

00:36:16 Do you think you can have a deep connection

00:36:18 with a robot that’s not judging,

00:36:21 or do you think you can really have a relationship

00:36:25 with a robot or a human being that’s a safe space?

00:36:30 Or is tension, mystery, danger

00:36:33 necessary for a deep connection?

00:36:35 I’m gonna speak for myself and say that

00:36:38 I grew up in an alcoholic home,

00:36:40 I identify as a codependent,

00:36:41 talked about this stuff before,

00:36:43 but for me it’s very hard to be in a relationship

00:36:45 with a human being without feeling like

00:36:47 I need to perform in some way or deliver in some way,

00:36:50 and I don’t know if that’s just the people

00:36:51 I’ve been in a relationship with or me or my brokenness,

00:36:56 but I do think, this is gonna sound really

00:37:01 negative and pessimistic,

00:37:04 but I do think a lot of our relationships are projection

00:37:07 and a lot of our relationships are performance,

00:37:09 and I don’t think I really understood that

00:37:12 until I worked with horses.

00:37:15 And most communication with human is nonverbal, right?

00:37:18 I can say like, I love you,

00:37:19 but you don’t think I love you, right?

00:37:22 Whereas with animals it’s very direct.

00:37:24 It’s all physical, it’s all energy.

00:37:26 I feel like that with robots too.

00:37:28 It feels very,

00:37:32 how I say something doesn’t matter.

00:37:35 My inflection doesn’t really matter.

00:37:36 And you thinking that my tone is disrespectful,

00:37:40 like you’re not filtering it through all

00:37:42 of the bad relationships you’ve been in,

00:37:43 you’re not filtering it through

00:37:44 the way your mom talked to you,

00:37:45 you’re not getting triggered.

00:37:47 I find that for the most part,

00:37:49 people don’t always receive things

00:37:51 the way that you intend them to or the way you intended,

00:37:53 and that makes relationships really murky.

00:37:56 So the relationships with animals

00:37:57 and relationships with the robots as they are now,

00:38:00 you kind of implied that that’s more healthy.

00:38:05 Can you have a healthy relationship with other humans?

00:38:08 Or not healthy, I don’t like that word,

00:38:10 but shouldn’t it be, you’ve talked about codependency,

00:38:14 maybe you can talk about what is codependency,

00:38:16 but is that, is the challenges of that,

00:38:21 the complexity of that necessary for passion,

00:38:24 for love between humans?

00:38:27 That’s right, you love passion.

00:38:29 That’s a good thing.

00:38:31 I thought this would be a safe space.

00:38:33 I got trolled by Rogan for hours on this.

00:38:39 Look, I am not anti passion.

00:38:42 I think that I’ve just maybe been around long enough

00:38:45 to know that sometimes it’s ephemeral

00:38:48 and that passion is a mixture of a lot of different things,

00:38:55 adrenaline, which turns into dopamine, cortisol,

00:38:57 it’s a lot of neurochemicals, it’s a lot of projection,

00:39:01 it’s a lot of what we’ve seen in movies,

00:39:03 it’s a lot of, you know, I identify as an addict.

00:39:06 So for me, sometimes passion is like,

00:39:08 uh oh, this could be bad.

00:39:10 And I think we’ve been so conditioned to believe

00:39:11 that passion means like your soulmates,

00:39:13 and I mean, how many times have you had

00:39:14 a passionate connection with someone

00:39:15 and then it was a total train wreck?

00:39:18 The train wreck is interesting.

00:39:19 How many times exactly?

00:39:21 Exactly.

00:39:21 What’s a train wreck?

00:39:22 You just did a lot of math in your head

00:39:24 in that little moment.

00:39:25 Counting.

00:39:26 I mean, what’s a train wreck?

00:39:28 What’s a, why is obsession,

00:39:31 so you described this codependency

00:39:33 and sort of the idea of attachment,

00:39:37 over attachment to people who don’t deserve

00:39:40 that kind of attachment as somehow a bad thing

00:39:45 and I think our society says it’s a bad thing.

00:39:47 It probably is a bad thing.

00:39:49 Like a delicious burger is a bad thing.

00:39:52 I don’t know, but.

00:39:53 Right, oh, that’s a good point.

00:39:54 I think that you’re pointing out something really fascinating

00:39:56 which is like passion, if you go into it knowing

00:39:59 this is like pizza where it’s gonna be delicious

00:40:01 for two hours and then I don’t have to have it again

00:40:03 for three, if you can have a choice in the passion,

00:40:06 I define passion as something that is relatively unmanageable

00:40:09 and something you can’t control or stop and start

00:40:12 with your own volition.

00:40:13 So maybe we’re operating under different definitions.

00:40:16 If passion is something that like, you know,

00:40:18 ruins your real marriages and screws up

00:40:22 your professional life and becomes this thing

00:40:24 that you’re not in control of and becomes addictive,

00:40:28 I think that’s the difference is,

00:40:30 is it a choice or is it not a choice?

00:40:32 And if it is a choice, then passion’s great.

00:40:35 But if it’s something that like consumes you

00:40:37 and makes you start making bad decisions

00:40:39 and clouds your frontal lobe

00:40:41 and is just all about dopamine

00:40:44 and not really about the person

00:40:46 and more about the neurochemical,

00:40:47 we call it sort of the drug, the internal drug cabinet.

00:40:50 If it’s all just, you’re on drugs, that’s different,

00:40:52 you know, cause sometimes you’re just on drugs.

00:40:54 Okay, so there’s a philosophical question here.

00:40:58 So would you rather, and it’s interesting for a comedian,

00:41:03 brilliant comedian to speak so eloquently

00:41:07 about a balanced life.

00:41:09 I kind of argue against this point.

00:41:12 There’s such an obsession of creating

00:41:13 this healthy lifestyle now, psychologically speaking.

00:41:18 You know, I’m a fan of the idea that sort of flying high

00:41:22 and crashing and dying at 27 is also a possible life.

00:41:26 And it’s not one we should judge

00:41:27 because I think there’s moments of greatness.

00:41:30 I talked to Olympic athletes

00:41:32 where some of their greatest moments

00:41:34 are achieved in their early 20s.

00:41:36 And the rest of their life is in a kind of fog,

00:41:39 almost a depression, because they can never.

00:41:41 Because they’re based on their physical prowess, right?

00:41:44 Physical prowess and they’ll never,

00:41:46 so that, so they’re watching their physical prowess fade

00:41:50 and they’ll never achieve the kind of height,

00:41:54 not just physical, of just emotion, of.

00:41:58 Well, the max number of neurochemicals.

00:42:01 And you also put your money on the wrong horse.

00:42:04 That’s where I would just go like,

00:42:06 oh yeah, if you’re doing a job where you peak at 22,

00:42:10 the rest of your life is gonna be hard.

00:42:12 That idea is considering the notion

00:42:15 that you wanna optimize some kind of,

00:42:17 but we’re all gonna die soon.

00:42:19 What?

00:42:21 Now you tell me.

00:42:23 I’ve immortalized myself, so I’m gonna be fine.

00:42:26 See, you’re almost like,

00:42:28 how many Oscar winning movies can I direct

00:42:32 by the time I’m 100?

00:42:34 How many this and that?

00:42:35 But you know, there’s a night, you know,

00:42:38 it’s all, life is short, relatively speaking.

00:42:41 I know, but it can also come in different ways.

00:42:42 You go, life is short, play hard,

00:42:45 fall in love as much as you can, run into walls.

00:42:47 I would also go, life is short,

00:42:49 don’t deplete yourself on things that aren’t sustainable

00:42:53 and that you can’t keep, you know?

00:42:56 So I think everyone gets dopamine from different places.

00:42:59 Everyone has meaning from different places.

00:43:01 I look at the fleeting passionate relationships

00:43:04 I’ve had in the past and I don’t like,

00:43:06 I don’t have pride in that.

00:43:07 I think that you have to decide what, you know,

00:43:10 helps you sleep at night.

00:43:11 For me, it’s pride and feeling like I behave

00:43:13 with grace and integrity.

00:43:14 That’s just me personally.

00:43:16 Everyone can go like, yeah,

00:43:17 I slept with all the hot chicks in Italy I could

00:43:20 and I, you know, did all the whatever,

00:43:23 like whatever you value,

00:43:25 we’re allowed to value different things.

00:43:26 Yeah, we’re talking about Brian Callen.

00:43:28 Brian Callen has lived his life to the fullest,

00:43:32 to say the least.

00:43:33 But I think that it’s just for me personally,

00:43:36 I, and this could be like my workaholism

00:43:38 or my achievementism,

00:43:41 I, if I don’t have something to show for something,

00:43:45 I feel like it’s a waste of time or some kind of loss.

00:43:50 I’m in a 12 step program and the third step would say,

00:43:52 there’s no such thing as waste of time

00:43:54 and everything happens exactly as it should

00:43:56 and whatever, that’s a way to just sort of keep us sane

00:43:59 so we don’t grieve too much and beat ourselves up

00:44:01 over past mistakes, there’s no such thing as mistakes,

00:44:04 dah, dah, dah.

00:44:05 But I think passion is, I think it’s so life affirming

00:44:10 and one of the few things that maybe people like us

00:44:13 makes us feel awake and seen

00:44:14 and we just have such a high threshold for adrenaline.

00:44:20 You know, I mean, you are a fighter, right?

00:44:22 Yeah, okay, so yeah,

00:44:24 so you have a very high tolerance for adrenaline

00:44:28 and I think that Olympic athletes,

00:44:30 the amount of adrenaline they get from performing,

00:44:33 it’s very hard to follow that.

00:44:34 It’s like when guys come back from the military

00:44:36 and they have depression.

00:44:38 It’s like, do you miss bullets flying at you?

00:44:40 Yeah, kind of because of that adrenaline

00:44:42 which turned into dopamine and the camaraderie.

00:44:45 I mean, there’s people that speak much better

00:44:46 about this than I do.

00:44:48 But I just, I’m obsessed with neurology

00:44:50 and I’m just obsessed with sort of the lies we tell ourselves

00:44:54 in order to justify getting neurochemicals.

00:44:57 You’ve done actually quite, done a lot of thinking

00:45:00 and talking about neurology

00:45:01 and just kind of look at human behavior

00:45:04 through the lens of looking at how, actually,

00:45:07 chemically, our brain works.

00:45:09 So what, first of all,

00:45:10 why did you connect with that idea and what have you,

00:45:15 how has your view of the world changed

00:45:17 by considering the brain is just a machine?

00:45:22 You know, I know it probably sounds really nihilistic

00:45:24 but for me, it’s very liberating to know a lot

00:45:27 about neurochemicals because you don’t have to,

00:45:30 it’s like the same thing with like critics,

00:45:32 like critical reviews.

00:45:33 If you believe the good,

00:45:34 you have to believe the bad kind of thing.

00:45:36 Like, you know, if you believe that your bad choices

00:45:38 were because of your moral integrity or whatever,

00:45:43 you have to believe your good ones.

00:45:44 I just think there’s something really liberating

00:45:46 and going like, oh, that was just adrenaline.

00:45:48 I just said that thing

00:45:48 because I was adrenalized and I was scared

00:45:50 and my amygdala was activated

00:45:52 and that’s why I said you’re an asshole and get out.

00:45:54 And that’s, you know, I think,

00:45:55 I just think it’s important to delineate what’s nature

00:45:57 and what’s nurture, what is your choice

00:45:59 and what is just your brain trying to keep you safe.

00:46:02 I think we forget that even though we have security systems

00:46:04 and homes and locks on our doors,

00:46:06 that our brain for the most part

00:46:07 is just trying to keep us safe all the time.

00:46:09 It’s why we hold grudges, it’s why we get angry,

00:46:11 it’s why we get road rage, it’s why we do a lot of things.

00:46:14 And it’s also, when I started learning about neurology,

00:46:17 I started having so much more compassion for other people.

00:46:19 You know, if someone yelled at me being like,

00:46:21 fuck you on the road, I’d be like,

00:46:22 okay, he’s producing adrenaline right now

00:46:24 because we’re all going 65 miles an hour

00:46:27 and our brains aren’t really designed

00:46:30 for this type of stress and he’s scared.

00:46:33 He was scared, you know, so that really helped me

00:46:35 to have more love for people in my everyday life

00:46:38 instead of being in fight or flight mode.

00:46:41 But the, I think more interesting answer to your question

00:46:44 is that I’ve had migraines my whole life.

00:46:45 Like I’ve suffered with really intense migraines,

00:46:49 ocular migraines, ones where my arm would go numb

00:46:52 and I just started having to go to so many doctors

00:46:55 to learn about it and I started, you know,

00:46:58 learning that we don’t really know that much.

00:47:00 We know a lot, but it’s wild to go into

00:47:03 one of the best neurologists in the world

00:47:04 who’s like, yeah, we don’t know.

00:47:05 We don’t know. We don’t know.

00:47:07 And that fascinated me.

00:47:08 Except it’s one of the worst pains you can probably have,

00:47:10 all that stuff, and we don’t know the source.

00:47:13 We don’t know the source

00:47:14 and there is something really fascinating

00:47:16 about when your left arm starts going numb

00:47:19 and you start not being able to see

00:47:21 out of the left side of both your eyes.

00:47:22 And I remember when the migraines get really bad,

00:47:25 it’s like a mini stroke almost

00:47:26 and I’m able to see words on a page,

00:47:29 but I can’t read them.

00:47:31 They just look like symbols to me.

00:47:33 So there’s something just really fascinating to me

00:47:35 about your brain just being able to stop functioning.

00:47:38 And I, so I just wanted to learn about it, study about it.

00:47:41 I did all these weird alternative treatments.

00:47:43 I got this piercing in here that actually works.

00:47:45 I’ve tried everything.

00:47:47 And then both of my parents had strokes.

00:47:49 So when both of my parents had strokes,

00:47:51 I became sort of the person who had to decide

00:47:54 what was gonna happen with their recovery,

00:47:56 which is just a wild thing to have to deal with.

00:47:59 I was, you know, 28 years old when it happened.

00:48:02 And I started spending basically all day, every day in ICUs

00:48:05 with neurologists learning about what happened

00:48:08 to my dad’s brain and why he can’t move his left arm,

00:48:11 but he can move his right leg,

00:48:12 but he can’t see out of the, you know.

00:48:14 And then my mom had another stroke

00:48:17 in a different part of the brain.

00:48:18 So I started having to learn

00:48:19 what parts of the brain did what,

00:48:21 and so that I wouldn’t take their behavior so personally,

00:48:23 and so that I would be able to manage my expectations

00:48:25 in terms of their recovery.

00:48:27 So my mom, because it affected a lot of her frontal lobe,

00:48:31 changed a lot as a person.

00:48:33 She was way more emotional.

00:48:34 She was way more micromanaging.

00:48:35 She was forgetting certain things.

00:48:36 So it broke my heart less when I was able to know,

00:48:40 oh yeah, well, the stroke hit this part of the brain,

00:48:42 and that’s the one that’s responsible for short term memory,

00:48:44 and that’s responsible for long term memory, da da da.

00:48:46 And then my brother just got something

00:48:48 called viral encephalitis,

00:48:50 which is an infection inside the brain.

00:48:53 So it was kind of wild that I was able to go,

00:48:56 oh, I know exactly what’s happening here,

00:48:57 and I know, you know, so.

00:48:59 So that allows you to have some more compassion

00:49:02 for the struggles that people have,

00:49:04 but does it take away some of the magic

00:49:06 from some of the

00:49:08 some of the more positive experiences of life?

00:49:10 Sometimes.

00:49:11 Sometimes, and I don’t, I’m such a control addict

00:49:15 that, you know, I think for

00:49:18 someone like me,

00:49:19 my biggest dream is to know why someone’s doing it.

00:49:21 That’s what standup is.

00:49:22 It’s just trying to figure out why,

00:49:23 or that’s what writing is.

00:49:24 That’s what acting is.

00:49:25 That’s what performing is.

00:49:25 It’s trying to figure out why someone would do something.

00:49:27 As an actor, you get a piece of, you know, material,

00:49:30 and you go, this person, why would he say that?

00:49:32 Why would he, she pick up that cup?

00:49:33 Why would she walk over here?

00:49:35 It’s really why, why, why, why.

00:49:36 So I think neurology is,

00:49:38 if you’re trying to figure out human motives

00:49:40 and why people do what they do,

00:49:41 it’d be crazy not to understand how neurochemicals motivate us.

00:49:46 I also have a lot of addiction in my family

00:49:48 and hardcore drug addiction and mental illness.

00:49:51 And in order to cope with it,

00:49:53 you really have to understand borderline personality

00:49:55 disorder, schizophrenia, and drug addiction.

00:49:58 So I have a lot of people I love

00:50:00 that suffer from drug addiction and alcoholism.

00:50:02 And the first thing they started teaching you

00:50:04 is it’s not a choice.

00:50:05 These people’s dopamine receptors

00:50:07 don’t hold dopamine the same way yours do.

00:50:09 Their frontal lobe is underdeveloped, like, you know,

00:50:13 and that really helped me to navigate dealing with,

00:50:17 and loving, people that were addicted to substances.

00:50:20 I want to be careful with this question, but how much?

00:50:24 Money do you have?

00:50:25 How much?

00:50:26 Can I borrow $10?

00:50:28 Okay, no, is how much control,

00:50:33 how much, despite the chemical imbalances

00:50:39 or the biological limitations

00:50:42 that each of our individual brains have,

00:50:44 how much mind over matter is there?

00:50:47 So, for instance, I’ve known people

00:50:51 with clinical depression,

00:50:53 and it’s always a touchy subject

00:50:55 to say how much they can really help it.

00:50:57 Very.

00:50:59 What can you, yeah, what can you,

00:51:01 because you’ve talked about codependency,

00:51:03 you talked about issues that you struggle through,

00:51:07 and nevertheless, you choose to take a journey

00:51:09 of healing and so on, so that’s your choice,

00:51:12 that’s your actions.

00:51:14 So how much can you do to help fight the limitations

00:51:17 of the neurochemicals in your brain?

00:51:20 That’s such an interesting question,

00:51:21 and I don’t think I’m at all qualified to answer,

00:51:23 but I’ll say what I do know.

00:51:25 And really quick, just the definition of codependency,

00:51:28 I think a lot of people think of codependency

00:51:29 as like two people that can’t stop hanging out, you know,

00:51:33 or like, you know, that’s not totally off,

00:51:36 but I think for the most part,

00:51:38 my favorite definition of codependency

00:51:39 is the inability to tolerate the discomfort of others.

00:51:42 You grow up in an alcoholic home,

00:51:44 you grow up around mental illness,

00:51:45 you grow up in chaos,

00:51:46 you have a parent that’s a narcissist,

00:51:48 you basically are wired to just people please,

00:51:51 worry about others, be perfect, walk on eggshells,

00:51:54 shape shift to accommodate other people.

00:51:56 So codependence is a very active wiring issue

00:52:01 that, you know, doesn’t just affect

00:52:04 your romantic relationships, it affects you being a boss,

00:52:06 it affects you in the world.

00:52:09 Online, you know, you get one negative comment

00:52:11 and it throws you for two weeks.

00:52:14 You know, it also is linked to eating disorders

00:52:16 and other kinds of addiction.

00:52:17 So it’s a very big thing,

00:52:20 and I think a lot of people sometimes only think

00:52:21 that it’s in a romantic relationship,

00:52:23 so I always feel the need to say that.

00:52:25 And also one of the reasons I love the idea of robots

00:52:28 so much because you don’t have to walk on eggshells

00:52:30 around them, you don’t have to worry

00:52:31 they’re gonna get mad at you yet,

00:52:33 but there’s no, codependents are hypersensitive

00:52:36 to the needs and moods of others,

00:52:39 and it’s very exhausting, it’s depleting.

00:52:42 Just one conversation about where we’re gonna go to dinner

00:52:45 is like, do you wanna go get Chinese food?

00:52:47 We just had Chinese food.

00:52:48 Well, wait, are you mad?

00:52:50 Well, no, I didn’t mean to,

00:52:50 and it’s just like that codependents live in this,

00:52:54 everything means something,

00:52:56 and humans can be very emotionally exhausting.

00:53:00 Why did you look at me that way?

00:53:01 What are you thinking about?

00:53:01 What was that?

00:53:02 Why’d you check your phone?

00:53:03 It’s a hypersensitivity that can be

00:53:06 incredibly time consuming,

00:53:07 which is why I love the idea of robots just subbing in.

00:53:10 Even, I’ve had a hard time running TV shows and stuff

00:53:13 because even asking someone to do something,

00:53:15 I don’t wanna come off like a bitch,

00:53:16 I’m very concerned about what other people think of me,

00:53:18 how I’m perceived, which is why I think robots

00:53:21 will be very beneficial for codependents.

00:53:23 By the way, just a real quick tangent,

00:53:25 that skill or flaw, whatever you wanna call it,

00:53:29 is actually really useful for if you ever do

00:53:32 start your own podcast for interviewing,

00:53:34 because you’re now kind of obsessed

00:53:36 about the mindset of others,

00:53:43 and it makes you a good sort of listener and person to talk with.

00:53:43 So I think, what’s her name from NPR?

00:53:48 Terry Gross.

00:53:49 Terry Gross talked about having that.

00:53:50 So.

00:53:51 I don’t feel like she has that at all.

00:53:53 What?

00:53:54 She worries about other people’s feelings?

00:53:56 Yeah, absolutely.

00:53:57 Oh, I don’t get that at all.

00:53:59 I mean, you have to put yourself in the mind

00:54:01 of the person you’re speaking with.

00:54:03 Oh, I see, just in terms of, yeah,

00:54:05 I am starting a podcast,

00:54:06 and the reason I haven’t is because I’m codependent

00:54:08 and I’m too worried it’s not gonna be perfect.

00:54:10 So a big codependent adage is perfectionism

00:54:14 leads to procrastination, which leads to paralysis.

00:54:16 So how do you, sorry to take a million tangents,

00:54:18 how do you survive on social media?

00:54:19 Is the exception the evidence?

00:54:22 To survive on social media, is the exception active?

00:54:25 But by the way, I took you on a tangent

00:54:26 and didn’t answer your last question

00:54:27 about how much we can control.

00:54:29 How much, yeah, we’ll return to it, or maybe not.

00:54:32 The answer is we can’t.

00:54:33 Now as a codependent, I’m, okay, good.

00:54:36 We can, but, but, you know,

00:54:38 one of the things that I’m fascinated by is,

00:54:39 you know, the first thing you learn

00:54:40 when you go into 12 step programs or addiction recovery

00:54:43 or any of this is, you know,

00:54:45 genetics loads the gun, environment pulls the trigger.

00:54:47 And there’s certain parts of your genetics

00:54:50 you cannot control.

00:54:51 I come from a lot of alcoholism.

00:54:54 I come from, you know, a lot of mental illness.

00:54:59 There’s certain things I cannot control

00:55:01 and a lot of things that maybe we don’t even know yet

00:55:04 what we can and can’t

00:55:04 because of how little we actually know about the brain.

00:55:06 But we also talk about the warrior spirit.

00:55:08 And there are some people that have that warrior spirit

00:55:12 and we don’t necessarily know what that engine is,

00:55:15 whether it’s you get dopamine from succeeding

00:55:18 or achieving or martyring yourself

00:55:21 or the attention you get from growing.

00:55:24 So a lot of people are like,

00:55:25 oh, this person can edify themselves and overcome,

00:55:29 but if you’re getting attention from improving yourself,

00:55:32 you’re gonna keep wanting to do that.

00:55:34 So that is something that helps a lot of,

00:55:37 in terms of changing your brain.

00:55:38 If you talk about changing your brain to people

00:55:40 and talk about what you’re doing to overcome said obstacles,

00:55:42 you’re gonna get more attention from them,

00:55:44 which is gonna fire off your reward system

00:55:46 and then you’re gonna keep doing it.

00:55:48 Yeah, so you can leverage that momentum.

00:55:50 So this is why in any 12 step program,

00:55:52 you go into a room and you talk about your progress

00:55:55 because then everyone claps for you.

00:55:57 And then you’re more motivated to keep going.

00:55:58 So that’s why we say you’re only as sick

00:56:00 as the secrets you keep,

00:56:01 because if you keep things secret,

00:56:03 there’s no one guiding you to go in a certain direction.

00:56:06 It’s based on, right?

00:56:07 We’re sort of designed to get approval from the tribe

00:56:10 or from a group of people

00:56:11 because our brain translates it to safety.

00:56:14 So, you know.

00:56:15 And in that case, the tribe is a positive one

00:56:17 that helps you go in a positive direction.

00:56:19 So that’s why it’s so important to go into a room

00:56:21 and also say, hey, I wanted to use drugs today.

00:56:25 And people go, hmm.

00:56:26 They go, me too.

00:56:27 And you feel less alone

00:56:28 and you feel less like you’re, you know,

00:56:30 have been castigated from the pack or whatever.

00:56:32 And then you say, and you get a chip

00:56:35 when you haven’t drank for 30 days or 60 days or whatever.

00:56:37 You get little rewards.

00:56:38 So talking about a pack that’s not at all healthy or good,

00:56:43 but in fact is often toxic, social media.

00:56:46 So you’re one of my favorite people

00:56:47 on Twitter and Instagram, for sort of both the comedy

00:56:52 and the insight and just fun.

00:56:54 How do you prevent social media

00:56:55 from destroying your mental health?

00:56:57 I haven’t.

00:56:59 I haven’t.

00:57:00 It’s the next big epidemic, isn’t it?

00:57:05 I don’t think I have.

00:57:06 I don’t think.

00:57:08 Is moderation the answer?

00:57:10 Maybe, but you can do a lot of damage in a moderate way.

00:57:14 I mean, I guess, again, it depends on your goals, you know?

00:57:17 And I think for me, the way that my addiction

00:57:20 to social media, I’m happy to call it an addiction.

00:57:23 I mean, and I define it as an addiction

00:57:24 because it stops being a choice.

00:57:26 There are times I just reach over and I’m like, that was.

00:57:29 Yeah, that was weird.

00:57:30 That was weird.

00:57:31 I’ll be driving sometimes and I’ll be like, oh my God,

00:57:33 my arm just went to my phone, you know?

00:57:36 I can put it down.

00:57:37 I can take time away from it, but when I do, I get antsy.

00:57:41 I get restless, irritable, and discontent.

00:57:43 I mean, that’s kind of the definition, isn’t it?

00:57:45 So I think by no means do I have a healthy relationship

00:57:49 with social media.

00:57:50 I’m sure there’s a way to,

00:57:51 but I think I’m especially a weirdo in this space

00:57:54 because it’s easy to conflate.

00:57:56 Is this work?

00:57:58 Is this not?

00:57:58 I can always say that it’s for work, you know?

00:58:01 But I mean, don’t you get the same kind of thing

00:58:04 as you get when a room full of people laughs at your jokes?

00:58:08 Because I mean, I see, especially the way you do Twitter,

00:58:11 it’s an extension of your comedy in a way.

00:58:13 So I took a big break from Twitter though,

00:58:16 a really big break.

00:58:16 I took like six months off or something for a while

00:58:19 because it was just like,

00:58:20 it seemed like it was all kind of politics

00:58:22 and it was just a little bit,

00:58:23 it wasn’t giving me dopamine

00:58:25 because there was like this weird, a lot of feedback.

00:58:28 So I had to take a break from it and then go back to it

00:58:30 because I felt like I didn’t have a healthy relationship.

00:58:33 Have you ever tried the, I don’t know if I believe him,

00:58:36 but Joe Rogan seems to not read comments.

00:58:39 Have you? And he’s one of the only people at that scale,

00:58:42 like at your level, who at least claims not to read them.

00:58:47 So like, cause you and him swim in this space

00:58:51 of tense ideas that get the toxic folks riled up.

00:58:58 I think Rogan, I don’t, I don’t know.

00:59:01 I don’t, I think he probably looks at YouTube,

00:59:05 like the likes and the, you know, I think if some things,

00:59:08 if he doesn’t know, I don’t know.

00:59:10 I’m sure he would tell the truth, you know,

00:59:13 I’m sure he’s got people that look at them

00:59:15 and it’s like disgusted, great.

00:59:17 Or I don’t, you know, like, I’m sure he gets it.

00:59:20 You know, I can’t picture him like in the weeds on.

00:59:23 No, for sure.

00:59:24 I mean, if he’s honestly actually saying that, I just,

00:59:26 it’s admirable.

00:59:28 We’re addicted to feedback.

00:59:29 Yeah, we’re addicted to feedback.

00:59:30 I mean, you know, look,

00:59:31 like I think that our brain is designed to get intel

00:59:36 on how we’re perceived so that we know where we stand,

00:59:39 right?

00:59:40 That’s our whole deal, right?

00:59:41 As humans, we want to know where we stand.

00:59:43 We walk in a room and we go,

00:59:44 who’s the most powerful person in here?

00:59:45 I got to talk to them and get in their good graces.

00:59:47 It’s just, we’re designed to rank ourselves, right?

00:59:49 And constantly know our rank, and on social media,

00:59:52 because you can’t figure out your rank

00:59:55 with 500 million people,

00:59:58 it’s not possible, you know, so our brain is like,

01:00:00 what’s my rank?

01:00:01 What’s my, and especially if we’re following people,

01:00:02 I think the big, the interesting thing

01:00:05 I may be able to say about this,

01:00:07 besides my speech impediment, is that I did start muting

01:00:12 people that rank wildly higher than me

01:00:16 because it is just stressful on the brain

01:00:19 to constantly look at people

01:00:21 that are incredibly successful.

01:00:23 So you keep feeling bad about yourself.

01:00:25 You know, I think that that is like cutting

01:00:27 to a certain extent.

01:00:28 Just like, look at me looking at all these people

01:00:30 that have so much more money than me

01:00:32 and so much more success than me.

01:00:33 It’s making me feel like a failure,

01:00:35 even though I don’t think I’m a failure,

01:00:37 but it’s easy to frame it so that I can feel that way.

01:00:41 Yeah, that’s really interesting,

01:00:43 especially if they’re close to,

01:00:45 like if they’re other comedians or something like that,

01:00:46 or whatever.

01:00:48 That’s, it’s really disappointing to me.

01:00:50 I do the same thing as well.

01:00:51 So other successful people that are really close

01:00:53 to what I do, it, I don’t know,

01:00:56 I wish I could just admire.

01:00:58 Yeah.

01:00:59 And for it not to be a distraction, but.

01:01:01 But that’s why you are where you are

01:01:02 because you don’t just admire, you’re competitive

01:01:04 and you want to win.

01:01:05 So the same thing that bums you out

01:01:07 when you look at this is also the same reason

01:01:08 you are where you are.

01:01:09 So that’s why I think it’s so important

01:01:11 to learn about neurology and addiction

01:01:12 because you’re able to go like,

01:01:14 oh, this same instinct.

01:01:15 So I’m very sensitive.

01:01:17 And I, and I sometimes don’t like that about myself,

01:01:19 but I’m like, well, that’s the reason I’m able to

01:01:21 write good standup.

01:01:22 And that’s the reason, and that’s the reason

01:01:23 I’m able to be sensitive to feedback

01:01:25 and go, that joke should have been better.

01:01:26 I can make that better.

01:01:28 So it’s the kind of thing where it’s like,

01:01:29 you have to be really sensitive in your work.

01:01:31 And the second you leave,

01:01:32 you got to be able to turn it off.

01:01:33 It’s about developing the muscle,

01:01:34 being able to know when to let it be a superpower

01:01:38 and when it’s going to hold you back and be an obstacle.

01:01:41 So I try to not be in that black and white of like,

01:01:44 you know, being competitive is bad

01:01:45 or being jealous of someone just to go like,

01:01:47 oh, there’s that thing that makes me really successful

01:01:50 in a lot of other ways,

01:01:51 but right now it’s making me feel bad.

01:01:53 Well, I’m kind of looking to you

01:01:54 because you’re basically a celebrity,

01:01:58 a famous sort of world class comedian.

01:02:01 And so I feel like you’re the right person

01:02:03 to be one of the key people to define

01:02:06 what’s the healthy path forward with social media.

01:02:08 So I, because we’re all trying to figure it out now

01:02:12 and it’s, I’m curious to see where it evolves.

01:02:16 I think you’re at the center of that.

01:02:17 So like, you know, there’s, you know,

01:02:20 trying to leave Twitter and then come back and see,

01:02:22 can I do this in a healthy way?

01:02:24 I mean, you have to keep trying, exploring.

01:02:25 You have to know because it’s being, you know,

01:02:28 I have a couple answers.

01:02:29 I think, you know, I hire a company

01:02:31 to do some of my social media for me, you know?

01:02:33 So it’s also being able to go, okay,

01:02:36 I make a certain amount of money by doing this,

01:02:38 but now let me be a good business person

01:02:40 and say, I’m gonna pay you this amount to run this for me.

01:02:42 So I’m not 24/7 in the weeds hashtagging and responding.

01:02:45 And just, it’s a lot to take on.

01:02:47 It’s a lot of energy to take on.

01:02:48 But at the same time, part of what I think

01:02:51 makes me successful on social media if I am,

01:02:53 is that people know I’m actually doing it

01:02:55 and that I am engaging and I’m responding

01:02:57 and developing a personal relationship

01:02:59 with complete strangers.

01:03:01 So I think, you know, figuring out that balance

01:03:04 and really approaching it as a business, you know,

01:03:06 that’s what I try to do.

01:03:07 It’s not dating, it’s not,

01:03:09 I try to just be really objective about,

01:03:11 okay, here’s what’s working, here’s what’s not working.

01:03:13 And in terms of taking the break from Twitter,

01:03:15 this is a really savage take,

01:03:17 but because I don’t talk about my politics publicly,

01:03:21 being on Twitter right after the last election

01:03:26 was not gonna be beneficial

01:03:27 because there was gonna be, you had to take a side.

01:03:30 You had to be political in order to get

01:03:32 any kind of retweets or likes.

01:03:34 And I just wasn’t interested in doing that

01:03:37 because you were gonna lose as many people

01:03:38 as you were gonna gain

01:03:39 and it was gonna all come clean in the wash.

01:03:40 So I was just like, the best thing I can do

01:03:42 for me business wise is to just abstain, you know?

01:03:47 And you know, the robot, I joke about her replacing me,

01:03:52 but she does do half of my social media, you know?

01:03:55 Because I don’t want people to get sick of me.

01:03:57 I don’t want to be redundant.

01:03:59 There are times when I don’t have the time or the energy

01:04:02 to make a funny video,

01:04:03 but I know she’s gonna be compelling and interesting

01:04:06 and that’s something that you can’t see every day, you know?

01:04:08 Of course, the humor comes from your,

01:04:11 I mean, the cleverness, the wit, the humor comes from you

01:04:15 when you film the robot.

01:04:16 That’s kind of the trick of it.

01:04:17 I mean, the robot is not quite there

01:04:21 to do anything funny.

01:04:23 The absurdity is revealed through the filmmaker in that case

01:04:26 or whoever is interacting,

01:04:27 not through the actual robot, you know, being who she is.

01:04:33 Let me sort of, love.

01:04:37 Okay.

01:04:37 How difficult.

01:04:39 What is it?

01:04:43 Well, first, an engineering question.

01:04:45 I know, I know, you’re not an engineer,

01:04:48 but how difficult do you think is it to build an AI system

01:04:52 that you can have a deep, fulfilling,

01:04:54 monogamous relationship with?

01:04:56 Sort of replace the human to human relationships

01:04:59 that we value?

01:05:01 I think anyone can fall in love with anything, you know?

01:05:04 Like, how often have you looked back at someone?

01:05:08 Like, I ran into someone the other day

01:05:11 that I was in love with and I was like,

01:05:12 hey, it was like, there was nothing there.

01:05:16 There was nothing there.

01:05:17 Like, do you, you know, like, where you’re able to go like,

01:05:19 oh, that was weird, oh, right, you know?

01:05:23 I were able.

01:05:25 You mean from a distant past or something like that?

01:05:27 Yeah, when you’re able to go like,

01:05:28 I can’t believe we had an incredible connection

01:05:31 and now it’s just, I do think that people will be in love

01:05:35 with robots, probably even more deeply than with humans,

01:05:39 because it’s like when people mourn their animals,

01:05:42 when their animals die, they’re always,

01:05:45 it’s sometimes harder than mourning a human

01:05:47 because you can’t go, well, he was kind of an asshole,

01:05:50 but like, he didn’t pick me up from school.

01:05:52 You know, it’s like, you’re able to get out

01:05:53 of your grief a little bit.

01:05:54 You’re able to kind of be, oh, he was kind of judgmental

01:05:57 or she was kind of, you know, with a robot,

01:06:00 there’s something so pure and innocent and impish

01:06:03 and childlike about it that I think it probably

01:06:07 will be much more conducive to a narcissistic love

01:06:11 for sure, but it’s not like, well, he cheated on me.

01:06:15 she can’t cheat, she can’t leave you, she can’t, you know?

01:06:17 Well, if Bearclaw leaves your life

01:06:21 and maybe a new version or somebody else will enter,

01:06:25 will you miss Bearclaw?

01:06:27 For guys that have these sex robots,

01:06:30 they’re building a nursing home for the bodies

01:06:34 that are now resting

01:06:36 because they don’t want to part with the bodies

01:06:37 because they have such an intense emotional connection

01:06:39 to it.

01:06:40 I mean, it’s kind of like a car club a little bit,

01:06:42 you know, like it’s, you know,

01:06:45 but I’m not saying this is right.

01:06:47 I’m not saying it’s cool, it’s weird, it’s creepy,

01:06:50 but we do anthropomorphize things with faces

01:06:53 and we do develop emotional connections to things.

01:06:56 I mean, there’s certain, have you ever tried to like throw,

01:06:59 I can’t even throw away my teddy bear

01:07:00 from when I was a kid.

01:07:01 It’s a piece of trash and it’s upstairs.

01:07:04 Like, it’s just like, why can’t I throw that away?

01:07:06 It’s bizarre, you know,

01:07:08 and there’s something kind of beautiful about that.

01:07:10 There’s something, it gives me hope in humans

01:07:13 because I see humans do such horrific things all the time

01:07:15 and maybe I’m too, I see too much of it, frankly,

01:07:18 but there’s something kind of beautiful

01:07:20 about the way we’re able to have emotional connections

01:07:24 to objects, which, you know, a lot of,

01:07:29 I mean, it’s kind of specifically, I think, Western, right?

01:07:32 That we don’t see objects as having souls,

01:07:34 like that’s kind of specifically us,

01:07:36 but I don’t think it’s so much

01:07:39 that we’re objectifying humans with these sex robots.

01:07:43 We’re kind of humanizing objects, right?

01:07:45 So there’s something kind of fascinating

01:07:47 in our ability to do that

01:07:48 because a lot of us don’t humanize humans.

01:07:50 So it’s just a weird little place to play in

01:07:52 and I think a lot of people, I mean,

01:07:54 a lot of people will be marrying these things is my guess.

01:07:57 So you’ve asked the question, let me ask it of you.

01:08:00 So what is love?

01:08:02 You have a bit of a brilliant definition of love

01:08:05 as being willing to die for someone

01:08:07 who you yourself want to kill.

01:08:10 So that’s kind of fun.

01:08:12 First of all, that’s brilliant.

01:08:14 That’s a really good definition.

01:08:16 I think it’ll stick with me for a long time.

01:08:18 This is how little of a romantic I am.

01:08:19 A plane went by when you said that

01:08:21 and my brain is like, you’re gonna need to rerecord that.

01:08:24 And I want you to get into post

01:08:26 and then not be able to use that.

01:08:31 And I’m a romantic as I…

01:08:32 Don’t mean to ruin the moment.

01:08:33 Actually, I can’t not be conscious of the fact

01:08:35 that I heard the plane and it made me feel like

01:08:38 how amazing it is that we live in a world of planes.

01:08:41 And I just went, why haven’t we fucking evolved past planes

01:08:47 and why can’t they make them quieter?

01:08:49 Yeah.

01:08:50 Well, yes.

01:08:53 My definition of love?

01:08:54 What, yeah, what’s yours, sort of on the more serious note?

01:08:57 Consistently producing dopamine for a long time.

01:09:01 Consistent output of oxytocin with the same person.

01:09:06 Dopamine is a positive thing.

01:09:08 What about the negative?

01:09:09 What about the fear and the insecurity, the longing,

01:09:14 anger, all that kind of stuff?

01:09:16 I think that’s part of love.

01:09:17 I think that love brings out the best in you,

01:09:22 but it also, if you don’t get angry and upset,

01:09:24 it’s, I don’t know, I think that that’s part of it.

01:09:26 I think we have this idea that love has to be like really

01:09:29 placid or something.

01:09:31 I only saw stormy relationships growing up,

01:09:34 so I don’t have a judgment

01:09:36 on how a relationship should look,

01:09:38 but I do think that this idea that love has to be eternal

01:09:45 is really destructive, is really destructive

01:09:48 and self defeating and a big source of stress for people.

01:09:53 I mean, I’m still figuring out love.

01:09:55 I think we all kind of are,

01:09:57 but I do kind of stand by that definition.

01:10:01 And I think that, I think for me,

01:10:03 love is like just being able to be authentic with somebody.

01:10:06 It’s very simple, I know,

01:10:07 but I think for me it’s about not feeling pressure

01:10:10 to have to perform or impress somebody,

01:10:11 just feeling truly like accepted unconditionally by someone.

01:10:16 Although I do believe love should be conditional.

01:10:19 That might be a hot take.

01:10:22 I think everything should be conditional.

01:10:24 I think if someone’s behavior,

01:10:27 I don’t think love should just be like,

01:10:28 I’m in love with you, now behave however you want forever.

01:10:30 This is unconditional.

01:10:31 I think love is a daily action.

01:10:35 It’s not something you just like get tenure on

01:10:38 and then get to behave however you want

01:10:40 because we said I love you 10 years ago.

01:10:41 It’s a daily, it’s a verb.

01:10:44 Well, there’s some things that are,

01:10:46 you see, if you explicitly make it clear

01:10:49 that it’s conditional,

01:10:50 it takes away some of the magic of it.

01:10:52 So there’s certain stories we tell ourselves

01:10:55 that we don’t want to make explicit about love.

01:10:57 I don’t know, maybe that’s the wrong way to think of it.

01:10:59 Maybe you want to be explicit in relationships.

01:11:02 I also think love is a business decision.

01:11:04 Like, I mean that in a good way.

01:11:08 Like I think that love is not just

01:11:11 when you’re across from somebody.

01:11:12 It’s when I go to work, can I focus?

01:11:15 Am I worried about you?

01:11:16 Am I stressed out about you?

01:11:18 You’re not responding to me.

01:11:19 You’re not reliable.

01:11:20 Like I think that being in a relationship,

01:11:23 the kind of love that I would want

01:11:24 is the kind of relationship where when we’re not together,

01:11:26 it’s not draining me, causing me stress, making me worry,

01:11:30 and sometimes passion, that word, we get murky about it.

01:11:36 But I think it’s also like,

01:11:37 I can be the best version of myself

01:11:38 when the person’s not around.

01:11:40 And I don’t have to feel abandoned or scared

01:11:42 or any of these kinds of other things.

01:11:43 So it’s like love, for me, I think it’s a Flaubert quote

01:11:48 and I’m going to butcher it.

01:11:49 But I think it’s like, be boring in your personal life

01:11:53 so you can be violent and take risks

01:11:54 in your professional life.

01:11:55 Is that it?

01:11:56 I got it wrong.

01:11:57 Something like that.

01:11:58 But I do think that it’s being able to align values

01:12:01 in a way to where you can also thrive

01:12:02 outside of the relationship.

01:12:04 Some of the most successful people I know

01:12:06 are those sort of happily married and have kids and so on.

01:12:10 It’s always funny.

01:12:10 It can be boring.

01:12:11 Boring’s okay.

01:12:13 Boring is serenity.

01:12:14 And it’s funny how those elements

01:12:16 actually make you much more productive.

01:12:18 I don’t understand the.

01:12:19 I don’t think relationships should drain you

01:12:21 and take away energy that you could be using

01:12:23 to create things that generate pride.

01:12:25 Okay.

01:12:26 Have you said your relationship of love yet?

01:12:28 Have you said your definition of love?

01:12:31 My definition of love?

01:12:33 No, I did not say it.

01:12:35 We’re out of time.

01:12:36 No.

01:12:39 When you have a podcast, maybe you can invite me on.

01:12:41 Oh no, I already did.

01:12:42 You’re doing it.

01:12:44 We’ve already talked about this.

01:12:46 And because I also have codependency, I have to say yes.

01:12:49 No, yeah.

01:12:50 No, I know, I’m trapping you.

01:12:52 You owe me now.

01:12:53 Actually, I wondered whether when I asked

01:12:58 if we could talk today, after sort of doing more research

01:13:01 and reading some of your book, I started to wonder,

01:13:04 did you just feel pressured to say yes?

01:13:07 Yes, of course.

01:13:09 Good.

01:13:10 But I’m a fan of yours, too.

01:13:11 Okay, awesome.

01:13:11 No, I actually, because I am codependent,

01:13:13 but I’m in recovery for codependence,

01:13:14 so I actually do, I don’t do anything I don’t wanna do.

01:13:17 You really, you go out of your way to say no.

01:13:20 What’s that?

01:13:21 I say no all the time.

01:13:22 Good.

01:13:23 I’m trying to learn that as well.

01:13:24 I moved this a couple, remember,

01:13:25 I moved it from one to two.

01:13:26 Yeah, yeah.

01:13:26 Just to, yeah, just to.

01:13:27 Yeah, just to let you know.

01:13:28 I love it.

01:13:29 How recovered I am, and I’m not codependent.

01:13:31 But I don’t do anything I don’t wanna do.

01:13:34 Yeah, you’re ahead of me on that.

01:13:35 Okay.

01:13:36 So do you.

01:13:37 You’re like, I don’t even wanna be here.

01:13:38 Do you think about your mortality?

01:13:43 Yes, it is a big part of how I was able

01:13:46 to sort of like kickstart my codependence recovery.

01:13:49 My dad passed a couple years ago,

01:13:50 and when you have someone close to you in your life die,

01:13:53 everything gets real clear,

01:13:55 in terms of how we’re a speck of dust

01:13:57 who’s only here for a certain amount of time.

01:14:00 What do you think is the meaning of it all?

01:14:02 Like what the speck of dust,

01:14:05 what’s maybe in your own life, what’s the goal,

01:14:09 the purpose of your existence?

01:14:13 Is there one?

01:14:15 Well, you’re exceptionally ambitious.

01:14:17 You’ve created some incredible things

01:14:19 in different disciplines.

01:14:21 Yeah, we’re all just managing our terror

01:14:23 because we know we’re gonna die.

01:14:24 So we create and build all these things

01:14:26 and rituals and religions and robots

01:14:29 and whatever we need to do to just distract ourselves

01:14:31 from imminent rotting, we’re rotting.

01:14:36 We’re all dying.

01:14:37 And I got very into terror management theory

01:14:42 when my dad died and it resonated, it helped me.

01:14:45 And everyone’s got their own religion

01:14:46 or sense of purpose or thing that distracts them

01:14:50 from the horrors of being human.

01:14:54 What’s the terror management theory?

01:14:56 Terror management is basically the idea

01:14:57 that since we’re the only animal

01:14:58 that knows they’re gonna die,

01:15:00 we have to basically distract ourselves

01:15:03 with awards and achievements and games and whatever,

01:15:09 just in order to distract ourselves

01:15:11 from the terror we would feel if we really processed

01:15:14 the fact that not only are we gonna die,

01:15:16 but also could die at any minute

01:15:18 because we’re only superficially

01:15:19 at the top of the food chain.

01:15:22 And technically we’re at the top of the food chain

01:15:26 if we have houses and guns and machines and stuff,

01:15:29 but if me and a lion are in the woods together,

01:15:32 most things could kill us.

01:15:33 I mean, a bee can kill some people,

01:15:35 like something this big can kill a lot of humans.

01:15:38 So it’s basically just to manage the terror

01:15:41 that we all would feel if we were able

01:15:43 to really be awake.

01:15:45 Cause we’re mostly zombies, right?

01:15:46 Job, school, religion, go to sleep, drink, football,

01:15:51 relationship, dopamine, love, you know,

01:15:54 we’re kind of just like trudging along

01:15:57 like zombies for the most part.

01:15:58 And then I think.

01:15:59 That fear of death adds some motivation.

01:16:02 Yes.

01:16:03 Well, I think I speak for a lot of people

01:16:06 in saying that I can’t wait to see

01:16:08 what your terror creates in the next few years.

01:16:13 I’m a huge fan.

01:16:14 Whitney, thank you so much for talking today.

01:16:16 Thanks.

01:16:18 Thanks for listening to this conversation

01:16:20 with Whitney Cummings.

01:16:21 And thank you to our presenting sponsor, Cash App.

01:16:24 Download it and use code LexPodcast.

01:16:27 You’ll get $10 and $10 will go to First,

01:16:30 a STEM education nonprofit that inspires hundreds

01:16:33 of thousands of young minds to learn

01:16:35 and to dream of engineering our future.

01:16:38 If you enjoy this podcast, subscribe on YouTube,

01:16:40 give it five stars on Apple Podcasts,

01:16:42 support on Patreon or connect with me on Twitter.

01:16:46 Thank you for listening and hope to see you next time.