Rana el Kaliouby: Emotion AI, Social Robots, and Self-Driving Cars #322

Transcript

00:00:00 there’s a broader question here, right? As we build socially and emotionally intelligent machines,

00:00:07 what does that mean about our relationship with them and then more broadly our relationship with

00:00:12 one another, right? Because this machine is going to be programmed to be amazing at empathy,

00:00:18 by definition, right? It’s going to always be there for you. It’s not going to get bored.

00:00:23 I don’t know how I feel about that. I think about that a lot.

00:00:25 LEX FRIDMAN The following is a conversation with Rana

00:00:30 el Kaliouby, a pioneer in the field of emotion recognition and human-centric artificial

00:00:36 intelligence. She is the founder of Affectiva, Deputy CEO of Smart Eye, author of Girl Decoded,

00:00:43 and one of the most brilliant, kind, inspiring, and fun human beings I’ve gotten the chance to

00:00:49 talk to. This is the Lex Fridman Podcast. To support it, please check out our sponsors in

00:00:54 the description. And now, dear friends, here’s Rana el Kaliouby. You grew up in the Middle East,

00:01:02 in Egypt. What is the memory from that time that makes you smile? Or maybe a memory that stands out

00:01:08 as helping your mind take shape and helping you define yourself in this world?

00:01:12 RANA EL KALIOUBY So the memory that stands out is we used to

00:01:15 live in my grandma’s house. She used to have these mango trees in her garden. And in the summer,

00:01:21 and so mango season was like July and August. And so in the summer, she would invite all my aunts

00:01:26 and uncles and cousins. And it was just like maybe there were like 20 or 30 people in the house,

00:01:31 and she would cook all this amazing food. And us, the kids, we would go down the garden,

00:01:38 and we would pick all these mangoes. And I don’t know, I think it’s just the bringing people

00:01:43 together that always stuck with me, the warmth. LEX FRIDMAN Around the mango tree.

00:01:47 RANA EL KALIOUBY Yeah, around the mango tree. And there’s just like the joy, the joy of being

00:01:52 together around food. And I’m a terrible cook. So I guess that didn’t, that memory didn’t translate

00:02:00 to me kind of doing the same. I love hosting people. LEX FRIDMAN Do you remember colors, smells?

00:02:05 Is that what, like what, how does memory work? Like what do you visualize? Do you visualize

00:02:10 people’s faces, smiles? Do you, is there colors? Is there like a theme to the colors? Is it smells

00:02:19 because of food involved? RANA EL KALIOUBY Yeah, I think that’s a great question. So the,

00:02:23 those Egyptian mangoes, there’s a particular type that I love, and it’s called Darwasi mangoes. And

00:02:28 they’re kind of, you know, they’re oval, and they have a little red in them. So I kind of,

00:02:33 they’re red and mango colored on the outside. So I remember that. LEX FRIDMAN Does red indicate like

00:02:39 extra sweetness? Is that, is that, that means like it’s nicely, yeah, it’s nice and ripe and stuff.

00:02:45 Yeah. What, what’s like a definitive food of Egypt? You know, there’s like these almost

00:02:52 stereotypical foods in different parts of the world, like Ukraine invented borscht.

00:02:59 Borscht is this beet soup that you put sour cream on. See, it’s not, I can’t see if you,

00:03:04 if you know, if you know what it is, I think, you know, it’s delicious. But if I explain it,

00:03:10 it’s just not going to sound delicious. I feel like beet soup. This doesn’t make any sense,

00:03:15 but that’s kind of, and you probably have actually seen pictures of it because it’s one of the

00:03:19 traditional foods in Ukraine, in Russia, in different parts of the Slavic world. So that’s,

00:03:26 but it’s become so cliche and stereotypical that you almost don’t mention it, but it’s still

00:03:31 delicious. Like I visited Ukraine, I eat that every single day, so.

00:03:35 Do you, do you make it yourself? How hard is it to make?

00:03:38 No, I don’t know. I think to make it well, like anything, like Italians, they say, well,

00:03:44 tomato sauce is easy to make, but to make it right, that’s like a generational skill. So anyway,

00:03:51 is there something like that in Egypt? Is there a culture of food?

00:03:55 There is. And actually, we have a similar kind of soup. It’s called molokhia, and it’s, it’s made

00:04:02 of this green plant. It’s like, it’s somewhere between spinach and kale, and you mince it,

00:04:07 and then you cook it in like chicken broth. And my grandma used to make, and my mom makes it really

00:04:13 well, and I try to make it, but it’s not as great. So we used to have that. And then we used to have

00:04:18 it alongside stuffed pigeons. I’m pescetarian now, so I don’t eat that anymore, but.

00:04:23 Stuffed pigeons.

00:04:24 Yeah, it’s like, it was really yummy. It’s the one thing I miss about,

00:04:28 you know, now that I’m pescetarian and I don’t eat.

00:04:32 The stuffed pigeons?

00:04:33 Yeah, the stuffed pigeons.

00:04:35 Is it, what are they stuffed with? If that doesn’t bother you too much to describe.

00:04:39 No, no, it’s stuffed with a lot of like just rice and, yeah, it’s just rice. Yeah, so.

00:04:46 And you also, you said that your first, in your book, that your first computer

00:04:51 was an Atari, and Space Invaders was your favorite game.

00:04:56 Is that when you first fell in love with computers, would you say?

00:04:58 Yeah, I would say so.

00:05:00 Video games, or just the computer itself? Just something about the machine.

00:05:04 Ooh, this thing, there’s magic in here.

00:05:07 Yeah, I think the magical moment is definitely like playing video games with my,

00:05:12 I have two younger sisters, and we would just like had fun together, like playing games.

00:05:17 But the other memory I have is my first code, the first code I wrote.

00:05:22 I wrote, I drew a Christmas tree, and I’m Muslim, right?

00:05:26 So it’s kind of, it was kind of funny that the first thing I did was like this Christmas tree.

00:05:32 So, yeah, and that’s when I realized, wow, you can write code to do all sorts of like

00:05:38 really cool stuff. I must have been like six or seven at the time.

00:05:42 So you can write programs, and the programs do stuff for you. That’s power.

00:05:48 That’s, if you think about it, that’s empowering.

00:05:50 It’s AI.

00:05:51 Yeah, I know what it is. I don’t know if that, you see like,

00:05:56 I don’t know if many people think of it that way when they first learned to program.

00:05:59 They just love the puzzle of it. Like, ooh, this is cool. This is pretty.

00:06:02 It’s a Christmas tree, but like, it’s power.

00:06:05 It is power.

00:06:06 Eventually, I guess you couldn’t at the time, but eventually this thing,

00:06:11 if it’s interesting enough, if it’s a pretty enough Christmas tree,

00:06:14 it can be run by millions of people and bring them joy, like that little thing.

00:06:19 And then because it’s digital, it’s easy to spread.

00:06:22 So like you just created something that’s easily spreadable to millions of people.

00:06:26 Totally.

00:06:28 It’s hard to think that way when you’re six.

00:06:30 In the book, you write, I am who I am because I was raised by a particular set of parents,

00:06:37 both modern and conservative, forward thinking, yet locked in tradition.

00:06:41 I’m a Muslim and I feel I’m stronger, more centered for it.

00:06:46 I adhere to the values of my religion, even if I’m not as dutiful as I once was.

00:06:50 And I am a new American and I’m thriving on the energy,

00:06:55 vitality and entrepreneurial spirit of this great country.

00:06:59 So let me ask you about your parents.

00:07:01 What have you learned about life from them, especially when you were young?

00:07:05 So both my parents, they’re Egyptian, but they moved to Kuwait right out.

00:07:09 Actually, there’s a cute story about how they met.

00:07:11 So my dad taught COBOL in the 70s.

00:07:14 Nice.

00:07:15 And my mom decided to learn programming.

00:07:18 So she signed up to take his COBOL programming class.

00:07:22 And he tried to date her and she was like, no, no, no, I don’t date.

00:07:26 And so he’s like, okay, I’ll propose.

00:07:28 And that’s how they got married.

00:07:29 Whoa, strong move.

00:07:30 Right, exactly, right.

00:07:32 That’s really impressive.

00:07:35 Those COBOL guys know how to impress a lady.

00:07:40 So yeah, so what have you learned from them?

00:07:43 So definitely grit.

00:07:44 One of the core values in our family is just hard work.

00:07:48 There were no slackers in our family.

00:07:50 And that’s something that’s definitely stayed with me,

00:07:55 both as a professional, but also in my personal life.

00:07:58 But I also think my mom, my mom always used to like, I don’t know, it was like unconditional

00:08:06 love.

00:08:06 Like I just knew my parents would be there for me kind of regardless of what I chose to do.

00:08:14 And I think that’s very powerful.

00:08:15 And they got tested on it because I kind of challenged cultural norms and I kind of took

00:08:21 a different path, I guess, than what’s expected of a woman in the Middle East.

00:08:27 And they still love me, which I’m so grateful for that.

00:08:32 When was like a moment that was the most challenging for them?

00:08:35 Which moment where they kind of had to come face to face with the fact that you’re a bit

00:08:42 of a rebel?

00:08:44 I think the first big moment was when I had just gotten married, but I decided to go do

00:08:52 my PhD at Cambridge University.

00:08:53 And because my husband at the time, he’s now my ex, ran a company in Cairo, he was going

00:08:59 to stay in Egypt.

00:09:00 So it was going to be a long distance relationship.

00:09:03 And that’s very unusual in the Middle East for a woman to just head out and kind of pursue

00:09:09 her career.

00:09:09 And so my dad and my parents in law both said, you know, we do not approve of you doing this,

00:09:18 but now you’re under the jurisdiction of your husband so he can make the call.

00:09:22 And luckily for me, he was supportive.

00:09:26 He said, you know, this is your dream come true.

00:09:29 You’ve always wanted to do a PhD.

00:09:30 I’m going to support you.

00:09:33 So I think that was the first time where, you know, I challenged the cultural norms.

00:09:39 Was that scary?

00:09:40 Oh, my God, yes.

00:09:41 It was totally scary.

00:09:42 What’s the biggest culture shock from there to Cambridge, to London?

00:09:50 Well, that was also during right around September 11th.

00:09:56 So everyone thought that there was going to be a third world war.

00:10:01 It was really like, and I, at the time I used to wear the hijab, so I was very visibly Muslim.

00:10:07 And so my parents just were, they were afraid for my safety.

00:10:11 But anyways, when I got to Cambridge, because I was so scared, I decided to take off my

00:10:15 headscarf and wear a hat instead.

00:10:17 So I just went to class wearing these like British hats, which was, in my opinion, actually

00:10:22 worse than just showing up in a headscarf because it was just so awkward, right?

00:10:25 Like sitting in class with like all these.

00:10:27 Trying to fit in.

00:10:29 Yeah.

00:10:29 Like a spy.

00:10:30 Yeah, yeah, yeah.

00:10:31 So after a few weeks of doing that, I was like, to heck with that.

00:10:34 I’m just going to go back to wearing my headscarf.

00:10:37 Yeah, you wore the hijab, so starting in 2000 and for 12 years after.

00:10:43 So it’s always, whenever you’re in public, you have to wear the head covering.

00:10:47 Can you speak to that, to the hijab, maybe your mixed feelings about it?

00:10:52 Like what does it represent in its best case?

00:10:55 What does it represent in the worst case?

00:10:56 Yeah, you know, I think there’s a lot of, I guess I’ll first start by saying I wore

00:11:03 it voluntarily.

00:11:04 I was not forced to wear it.

00:11:05 And in fact, I was one of the very first women in my family to decide to put on the hijab.

00:11:09 And my family thought it was really odd, right?

00:11:13 Like there was, they were like, why do you want to put this on?

00:11:15 And at its best, it’s a sign of modesty, humility.

00:11:20 Yeah.

00:11:22 It’s like me wearing a suit, people are like, why are you wearing a suit?

00:11:25 It’s a step back into some kind of tradition, a respect for tradition of sorts.

00:11:30 So you said, because it’s by choice, you’re kind of free to make that choice to celebrate

00:11:36 a tradition of modesty.

00:11:37 Exactly. And I actually like made it my own.

00:11:40 I remember I would really match the color of my headscarf with what I was wearing.

00:11:45 Like it was a form of self expression and at its best, I loved wearing it.

00:11:52 You know, I have a lot of questions around how we practice religion and religion and,

00:11:56 you know, and I think also it was a time where I was spending a lot of time going back and

00:12:02 forth between the US and Egypt.

00:12:04 And I started meeting a lot of people in the US who are just amazing people, very purpose

00:12:09 driven, people who have very strong core values, but they’re not Muslim.

00:12:14 That’s okay, right?

00:12:15 And so that was when I just had a lot of questions.

00:12:19 And politically, also the situation in Egypt was when the Muslim Brotherhood ran the country

00:12:25 and I didn’t agree with their ideology.

00:12:29 It was at a time when I was going through a divorce.

00:12:31 Like it was like, it was like just the perfect storm of like political, personal conditions

00:12:37 where I was like, this doesn’t feel like me anymore.

00:12:40 And it took a lot of courage to take it off because culturally it’s not, it’s okay if

00:12:44 you don’t wear it, but it’s really not okay to wear it and then take it off.

00:12:50 But you’re still, so you have to do that while still maintaining a deep core and pride in

00:12:56 the origins, in your origin story.

00:13:02 Totally.

00:13:02 So still being Egyptian, still being a Muslim.

00:13:06 Right.

00:13:07 And being, I think generally like faith driven, but yeah.

00:13:14 But what that means changes year by year for you.

00:13:17 It’s like a personal journey.

00:13:18 Yeah, exactly.

00:13:20 What would you say is the role of faith in that part of the world?

00:13:23 Like, how do you see, you mentioned it a bit in the book too.

00:13:26 Yeah.

00:13:27 I mean, I think, I think there is something really powerful about just believing that

00:13:34 there’s a bigger force, you know, there’s a kind of surrendering, I guess, that comes

00:13:39 with religion and you surrender and you have this deep conviction that it’s going to be

00:13:43 okay, right?

00:13:44 Like the universe is out to like do amazing things for you and it’s going to be okay.

00:13:48 And there’s strength to that.

00:13:50 Like even when you’re going through adversity, you just know that it’s going to work out.

00:13:57 Yeah, it gives you like an inner peace, a calmness.

00:13:59 Exactly, exactly.

00:14:00 Yeah, that’s, it’s faith in all the meanings of that word.

00:14:04 Right.

00:14:05 Faith that everything is going to be okay.

00:14:07 And it is because time passes and time cures all things.

00:14:12 It’s like a calmness with the chaos of the world.

00:14:15 Yeah.

00:14:15 And also there’s like a silver, I’m a true believer of this, that something at the specific

00:14:22 moment in time can look like it’s catastrophic and it’s not what you wanted in life.

00:14:28 But then time passes and then you look back and there’s the silver lining, right?

00:14:32 It maybe closed the door, but it opened a new door for you.

00:14:37 And so I’m a true believer in that, that, you know, there’s a silver lining in almost

00:14:42 anything in life, you just have to have this like, yeah, faith or conviction that it’s

00:14:47 going to work out.

00:14:47 Yeah, it’s such a beautiful way to see a shady feeling.

00:14:50 So if you feel shady about a current situation, I mean, it almost is always true.

00:14:57 Unless it’s the cliche thing of if it doesn’t kill you, whatever doesn’t kill you makes

00:15:04 you stronger.

00:15:05 It’s, it does seem that over time when you take a perspective on things that the hardest

00:15:13 moments and periods of your life are the most meaningful.

00:15:18 Yeah, yeah.

00:15:19 So over time you get to have that perspective.

00:15:21 Right.

00:15:23 What about, because you mentioned Kuwait, what about, let me ask you about war.

00:15:30 What’s the role of war and peace, maybe even the big love and hate in that part of

00:15:35 the world, because it does seem to be a part of the world where there’s turmoil.

00:15:40 There was turmoil, there’s still turmoil.

00:15:44 It is so unfortunate, honestly.

00:15:46 It’s, it’s such a waste of human resources and, and, and yeah, and human mindshare.

00:15:53 I mean, and at the end of the day, we all kind of want the same things.

00:15:57 We want, you know, we want a human connection, we want joy, we want to feel fulfilled, we

00:16:02 want to feel, you know, a life of purpose.

00:16:05 And I just, I just find it baffling, honestly, that we are still having to grapple with that.

00:16:14 I have a story to share about this.

00:16:15 You know, I grew up, I’m Egyptian, American now, but, but, you know, originally from Egypt.

00:16:21 And when I first got to Cambridge, it turned out my officemate, like my PhD kind of, you

00:16:28 know, she ended up, you know, we ended up becoming friends, but she was from Israel.

00:16:32 And we didn’t know, yeah, we didn’t know how it was going to be like.

00:16:37 Did you guys sit there just staring at each other for a bit?

00:16:41 Actually, she, because I arrived before she did.

00:16:44 And it turns out she emailed our PhD advisor and asked him if he thought it was going

00:16:50 to be okay.

00:16:52 Yeah.

00:16:52 And this is around 9/11 too.

00:16:55 Yeah.

00:16:55 And, and Peter, Peter Robinson, our PhD advisor was like, yeah, like, this is an academic

00:17:01 institution, just show up.

00:17:02 And we became super good friends.

00:17:04 We were both new moms.

00:17:07 Like we both had our kids during our PhD.

00:17:09 We were both doing artificial emotional intelligence.

00:17:11 She was looking at speech.

00:17:12 I was looking at the face.

00:17:13 We just had so the culture was so similar.

00:17:17 Our jokes were similar.

00:17:18 It was just, I was like, why on earth are our countries, why is there all this like

00:17:24 war and tension?

00:17:25 And I think it falls back to the narrative, right?

00:17:27 If you change the narrative, like whoever creates this narrative of war.

00:17:31 I don’t know.

00:17:32 We should have women run the world.

00:17:34 Yeah, that’s one solution.

00:17:37 The good women, because there’s also evil women in the world.

00:17:40 True, okay.

00:17:43 But yes, yes, there could be less war if women ran the world.

00:17:47 The other aspect is, it doesn’t matter the gender, the people in power.

00:17:54 I get to see this with Ukraine and Russia and different parts of the world around that

00:17:59 conflict now.

00:18:00 And that’s happening in Yemen as well and everywhere else.

00:18:05 There’s these narratives told by the leaders to the populace.

00:18:09 And those narratives take hold and everybody believes that.

00:18:12 And they have a distorted view of the humanity on the other side.

00:18:17 In fact, especially during war, you don’t even see the people on the other side as human

00:18:25 or as equal intelligence or worth or value as you.

00:18:30 You tell all kinds of narratives about them being Nazis or dumb or whatever narrative

00:18:40 you want to weave around that or evil.

00:18:44 But I think when you actually meet them face to face, you realize they’re like the same.

00:18:49 Exactly, right?

00:18:50 It’s actually a big shock for people to realize that they’ve been essentially lied to within

00:18:58 their country.

00:19:00 And I kind of have faith that social media, as ridiculous as it is to say, or any kind

00:19:05 of technology, is able to bypass the walls that governments put up and connect people

00:19:13 directly.

00:19:14 And then you get to realize, oh, people fall in love across different nations and religions

00:19:20 and so on.

00:19:21 And that, I think, ultimately can cure a lot of our ills, especially in person.

00:19:26 I also think that if leaders met in person, they’d have a conversation that could cure

00:19:32 a lot of the ills of the world, especially in private.

00:19:37 Let me ask you about the women running the world.

00:19:42 So gender does, in part, perhaps shape the landscape of just our human experience.

00:19:51 So in what ways was it limiting and in what ways was it empowering for you to be a woman

00:19:57 in the Middle East?

00:19:58 I think, just kind of going back to my comment on women running the world, I think it comes

00:20:03 back to empathy, which has been a common thread throughout my entire career.

00:20:08 And it’s this idea of human connection.

00:20:12 Once you build common ground with a person or a group of people, you build trust, you

00:20:16 build loyalty, you build friendship.

00:20:20 And then you can turn that into behavior change and motivation and persuasion.

00:20:24 So it’s like, empathy and emotions are just at the center of everything we do.

00:20:30 And I think being from the Middle East, kind of this human connection is very strong.

00:20:38 We have this running joke that if you come to Egypt for a visit, people will know everything

00:20:44 about your life right away, right?

00:20:46 I have no problems asking you about your personal life.

00:20:48 There’s no boundaries, really, no personal boundaries in terms of getting to know people.

00:20:53 We get emotionally intimate very, very quickly.

00:20:56 But I think people just get to know each other authentically, I guess.

00:21:01 There isn’t this superficial level of getting to know people.

00:21:05 You just try to get to know people really deeply.

00:21:06 Empathy is a part of that.

00:21:08 Totally.

00:21:08 Because you can put yourself in this person’s shoes and kind of, yeah, imagine what challenges

00:21:15 they’re going through, and so I think I’ve definitely taken that with me.

00:21:21 Generosity is another one too, like just being generous with your time and love and attention

00:21:26 and even with your wealth, right?

00:21:30 Even if you don’t have a lot of it, you’re still very generous.

00:21:32 And I think that’s another…

00:21:34 Enjoying the humanity of other people.

00:21:38 And so do you think there’s a useful difference between men and women in that?

00:21:44 In that aspect and empathy?

00:21:48 Or is doing these kind of big general groups, does that hinder progress?

00:21:56 Yeah, I actually don’t want to overgeneralize.

00:21:59 I mean, some of the men I know are like the most empathetic humans.

00:22:03 Yeah, I strive to be empathetic.

00:22:05 Yeah, you’re actually very empathetic.

00:22:10 Yeah, so I don’t want to overgeneralize.

00:22:13 Although one of the researchers I worked with when I was at Cambridge, Professor Simon Baron-Cohen,

00:22:18 he’s Sacha Baron Cohen’s cousin, and he runs the Autism Research Centre at Cambridge,

00:22:25 and he’s written multiple books on autism.

00:22:29 And one of his theories is the empathy scale, like the systemizers and the empathizers,

00:22:35 and there’s a disproportionate amount of computer scientists and engineers who are

00:22:42 systemizers and perhaps not great empathizers, and then there’s more men in that bucket,

00:22:51 I guess, than women, and then there’s more women in the empathizers bucket.

00:22:56 So again, not to overgeneralize.

00:22:58 I sometimes wonder about that.

00:22:59 It’s been frustrating to me how many, I guess, systemizers there are in the field of robotics.

00:23:05 Yeah.

00:23:06 It’s actually encouraging to me because I care about, obviously, social robotics,

00:23:10 and because there’s more opportunity for people that are empathic.

00:23:18 Exactly.

00:23:19 I totally agree.

00:23:20 Well, right?

00:23:20 So it’s nice.

00:23:21 Yes.

00:23:22 So every roboticist I talk to, they don’t see the human as interesting, it’s not exciting.

00:23:29 You want to avoid the human at all costs.

00:23:32 It’s a safety concern to be touching the human, which it is, but it is also an opportunity

00:23:39 for deep connection or collaboration or all that kind of stuff.

00:23:43 And because most brilliant roboticists don’t care about the human, it’s an opportunity,

00:23:49 in your case, it’s a business opportunity too, but in general, an opportunity to explore

00:23:53 those ideas.

00:23:54 So in this beautiful journey to Cambridge, to UK, and then to America, what’s the moment

00:24:03 or moments that were most transformational for you as a scientist and as a leader?

00:24:09 So you became an exceptionally successful CEO, founder, researcher, scientist, and so on.

00:24:18 Was there a phase shift there where, like, I can be somebody, I can really do something

00:24:25 in this world?

00:24:26 Yeah.

00:24:26 So actually, just kind of a little bit of background.

00:24:29 So the reason why I moved from Cairo to Cambridge, UK to do my PhD is because I had a very clear

00:24:36 career plan.

00:24:37 I was like, okay, I’ll go abroad, get my PhD, going to crush it in three or four years,

00:24:43 come back to Egypt and teach.

00:24:45 It was very clear, very well laid out.

00:24:47 Was topic clear or no?

00:24:49 The topic, well, I did my PhD around building artificial emotional intelligence and looking

00:24:54 at…

00:24:54 But in your master plan ahead of time, when you’re sitting by the mango tree, did you

00:24:58 know it’s going to be artificial intelligence?

00:25:00 No, no, no, that I did not know.

00:25:02 Although I think I kind of knew that I was going to be doing computer science, but I

00:25:07 didn’t know the specific area.

00:25:10 But I love teaching.

00:25:11 I mean, I still love teaching.

00:25:13 So I just, yeah, I just wanted to go abroad, get a PhD, come back, teach.

00:25:18 Why computer science?

00:25:19 Can we just linger on that?

00:25:21 What?

00:25:21 Because you’re such an empathic person who cares about emotion, humans and so on.

00:25:25 Isn’t, aren’t computers cold and emotionless and just…

00:25:31 We’re changing that.

00:25:32 Yeah, I know, but like, isn’t that the, or did you see computers as the, having the

00:25:38 capability to actually connect with humans?

00:25:42 I think that was like my takeaway from my experience just growing up, like computers

00:25:46 sit at the center of how we connect and communicate with one another, right?

00:25:50 Or technology in general.

00:25:51 Like I remember my first experience being away from my parents.

00:25:54 We communicated with a fax machine, but thank goodness for the fax machine, because we

00:25:58 could send letters back and forth to each other.

00:26:00 This was pre emails and stuff.

00:26:04 So I think, I think there’s, I think technology can be not just transformative in terms of

00:26:09 productivity, et cetera.

00:26:10 It actually does change how we connect with one another.

00:26:14 Can I just defend the fax machine?

00:26:16 There’s something like the haptic feel because the email is all digital.

00:26:22 There’s something really nice.

00:26:23 I still write letters to people.

00:26:26 There’s something nice about the haptic aspect of the fax machine, because you still have

00:26:30 to press, you still have to do something in the physical world to make this thing a reality.

00:26:35 Right, and then it like comes out as a printout and you can actually touch it and read it.

00:26:39 Yeah.

00:26:40 There’s something, there’s something lost when it’s just an email.

00:26:44 Obviously I wonder how we can regain some of that in the digital world, which goes to

00:26:51 the metaverse and all those kinds of things.

00:26:53 We’ll talk about it anyway.

00:26:54 So, actually do you question on that one?

00:26:57 Do you still, do you have photo albums anymore?

00:27:00 Do you still print photos?

00:27:03 No, no, but I’m a minimalist.

00:27:06 Okay.

00:27:06 So it was one of the, one of the painful steps in my life was to scan all the photos and

00:27:12 let go of them and then let go of all my books.

00:27:16 You let go of your books?

00:27:17 Yeah.

00:27:18 Switch to Kindle, everything Kindle.

00:27:19 Yeah.

00:27:20 So I thought, I thought, okay, think 30 years from now, nobody’s going to have books anymore.

00:27:29 The technology of digital books is going to get better and better and better.

00:27:32 Are you really going to be the guy that’s still romanticizing physical books?

00:27:36 Are you going to be the old man on the porch who’s like kids?

00:27:39 Yes.

00:27:40 So just get used to it because it was, it felt, it still feels a little bit uncomfortable

00:27:45 to read on a Kindle, but get used to it.

00:27:48 Like you always, I mean, I’m trying to learn new programming languages. Always,

00:27:53 like with technology, you have to kind of challenge yourself to adapt to it.

00:27:56 You know, I forced myself to use TikTok.

00:27:58 No, that thing doesn’t need much forcing.

00:28:01 It pulls you in like a, like the worst kind of, or the best kind of drug.

00:28:05 Anyway, yeah.

00:28:08 So yeah, but I do love haptic things.

00:28:11 There’s a magic to the haptic.

00:28:13 Even like touchscreens, it’s tricky to get right, to get the experience of a button.

00:28:19 Yeah.

00:28:22 Anyway, what were we talking about?

00:28:23 So AI, so the journey, your whole plan was to come back to Cairo and teach.

00:28:30 Right.

00:28:31 And then.

00:28:32 Where did the plan go wrong?

00:28:33 Yeah, exactly.

00:28:34 Right.

00:28:35 And then I get to Cambridge and I fall in love with the idea of research.

00:28:39 Right.

00:28:39 And kind of embarking on a path.

00:28:41 Nobody’s explored this path before.

00:28:43 You’re building stuff that nobody’s built before.

00:28:45 And it’s challenging and it’s hard.

00:28:46 And there’s a lot of nonbelievers.

00:28:49 I just totally love that.

00:28:50 And at the end of my PhD, I think it’s the meeting that changed the trajectory of my life.

00:28:56 Professor Rosalind Picard, who’s, she runs the Affective Computing Group at the MIT Media Lab.

00:29:02 I had read her book.

00:29:03 I, you know, I was like following, following, following all her research.

00:29:07 AKA Roz.

00:29:08 Yes, AKA Roz.

00:29:10 Yes.

00:29:10 And she was giving a talk at a pattern recognition conference in Cambridge.

00:29:16 And she had a couple of hours to kill.

00:29:18 So she emailed the lab and she said, you know, if any students want to meet with me, like,

00:29:22 just, you know, sign up here.

00:29:24 And so I signed up for slot and I spent like the weeks leading up to it preparing for this

00:29:29 meeting and I want to show her a demo of my research and everything.

00:29:34 And we met and we ended up hitting it off.

00:29:36 Like we totally clicked.

00:29:38 And at the end of the meeting, she said, do you want to come work with me as a postdoc

00:29:42 at MIT?

00:29:44 And this is what I told her.

00:29:45 I was like, okay, this would be a dream come true, but there’s a husband waiting for me

00:29:49 in Cairo.

00:29:49 I kind of have to go back.

00:29:51 Yeah.

00:29:52 She said, it’s fine.

00:29:52 Just commute.

00:29:54 And I literally started commuting between Cairo and Boston.

00:29:59 Yeah, it was, it was a long commute.

00:30:01 And I didn’t, I did that like every few weeks I would, you know, hop on a plane and go to

00:30:05 Boston.

00:30:06 But that, that changed the trajectory of my life.

00:30:08 There was no, I kind of outgrew my dreams, right?

00:30:12 I didn’t want to go back to Egypt anymore and be faculty.

00:30:16 Like that was no longer my dream.

00:30:18 I had a dream.

00:30:19 What was the, what was it like to be at MIT?

00:30:22 What was that culture shock?

00:30:25 You mean America in general, but also, I mean, Cambridge has its own culture, right?

00:30:31 So what was MIT like and what was America like?

00:30:34 I think, I wonder if that’s similar to your experience at MIT.

00:30:37 I was just, at the Media Lab in particular, I was just really, impressed is not the right

00:30:45 word.

00:30:46 I didn’t expect the openness to like innovation and the acceptance of taking a risk and failing.

00:30:54 Like failure isn’t really accepted back in Egypt, right?

00:30:58 You don’t want to fail.

00:30:59 Like there’s a fear of failure, which I think has been hardwired in my brain.

00:31:03 But you get to MIT and it’s okay to start things.

00:31:05 And if they don’t work out, like it’s okay.

00:31:08 You pivot to another idea.

00:31:09 And that kind of thinking was just very new to me.

00:31:12 That’s liberating.

00:31:13 Well, Media Lab, for people who don’t know, MIT Media Lab is its own beautiful thing because

00:31:19 they, I think more than other places at MIT, reach for big ideas.

00:31:24 And like they try, I mean, I think, I mean, depending of course on who, but certainly

00:31:28 with Rosalind, you try wild stuff, you try big things and crazy things and also try to take

00:31:36 things to completion so you can demo them.

00:31:38 So always, always, always have a demo.

00:31:42 Like if you go, one of the sad things to me about robotics labs at MIT, and there’s like

00:31:46 over 30, I think, is like, usually when you show up to a robotics lab, there’s not a single

00:31:53 working robot, they’re all broken.

00:31:55 All the robots are broken.

00:31:57 The robots are broken, which is like the normal state of things because you’re working on

00:32:01 them.

00:32:02 But it would be nice if we lived in a world where robotics labs had some robots functioning.

00:32:08 One of my like favorite moments that just sticks with me, I visited Boston Dynamics

00:32:13 and there was a, first of all, seeing so many spots, so many legged robots in one place.

00:32:20 I’m like, I’m home.

00:32:22 But the, yeah.

00:32:24 This is where I was built.

00:32:27 The cool thing was just to see there was a random robot spot was walking down the hall.

00:32:33 It’s probably doing mapping, but it looked like he wasn’t doing anything and he was wearing

00:32:37 he or she, I don’t know.

00:32:39 But, well, I like, in my mind, they’re people, they have a backstory, but this one

00:32:44 in particular definitely has a backstory because he was wearing a cowboy hat.

00:32:48 So I just saw a spot robot with a cowboy hat walking down the hall and there was just this

00:32:54 feeling like there’s a life, like he has a life.

00:32:58 He probably has to commute back to his family at night.

00:33:02 Like there’s a, there’s a feeling like there’s life instilled in this robot and it’s magical.

00:33:07 I don’t know.

00:33:07 It was, it was kind of inspiring to see.

00:33:09 Did it say hello to, did he say hello to you?

00:33:12 No, it’s very, there’s a focused nature to the robot.

00:33:15 No, no, listen.

00:33:16 I love competence and focus and great.

00:33:18 Like he was not going to get distracted by the, the shallowness of small talk.

00:33:25 There’s a job to be done and he was doing it.

00:33:27 So anyway, the fact that it was working is a beautiful thing.

00:33:30 And I think Media Lab really prides itself on trying to always have a thing that’s working

00:33:35 that you could show off.

00:33:36 Yes.

00:33:36 We used to call it demo or die.

00:33:38 You, you could not, yeah, you could not like show up with like PowerPoint or something.

00:33:43 You actually had to have a working, you know what, my son who is now 13, I don’t know if

00:33:48 this is still his life long goal or not, but when he was a little younger, his dream is

00:33:52 to build an island that’s just inhabited by robots, like no humans.

00:33:57 He just wants all these robots to be connecting and having fun and there you go.

00:34:01 Does he have human, does he have an idea of which robots he loves most?

00:34:06 Is it, is it Roomba like robots?

00:34:09 Is it humanoid robots?

00:34:10 Robot dogs, or it’s not clear yet.

00:34:13 We used to have a Jibo, which was one of the MIT Media Lab spin-outs, and he used to love

00:34:19 the giant head that spins and rotates, and it has an eye, like, not glowing like HAL 9000,

00:34:30 but the friendly version.

00:34:31 He loved that.

00:34:34 And then he just loves, uh, um,

00:34:38 yeah, he just, he, I think he loves all forms of robots actually.

00:34:44 So embodied intelligence.

00:34:46 Yes.

00:34:47 I like, I personally like legged robots, especially, uh, anything that can wiggle its butt.

00:34:55 No, that’s not the definition of what I love, but that’s just technically what I’ve been

00:35:00 working on recently.

00:35:01 Except I have a bunch of legged robots now in Austin and I’ve been doing, I was, I’ve

00:35:06 been trying to, uh, have them communicate affection with their body in different ways

00:35:12 just for art, for art really.

00:35:15 Cause I love the idea of walking around with the robots, like, uh, as you would with a

00:35:20 dog.

00:35:20 I think it’s inspiring to a lot of people, especially young people.

00:35:23 Like kids love, kids love it.

00:35:25 Parents like adults are scared of robots, but kids don’t have this kind of weird construction

00:35:31 of the world that’s full of evil.

00:35:32 They love cool things.

00:35:34 Yeah.

00:35:35 I remember when Adam was in first grade, so he must have been like seven or so.

00:35:40 I went in to his class with a whole bunch of robots and like the emotion AI demo and

00:35:44 da da.

00:35:45 And I asked the kids, I was like, do you, would you kids want to have a robot, you know,

00:35:52 robot friend or robot companion?

00:35:53 Everybody said yes.

00:35:54 And they wanted it for all sorts of things, like to help them with their math homework

00:35:58 and to like be a friend.

00:36:00 So there’s, it just struck me how there was no fear of robots, whereas a lot of adults have

00:36:07 that, like, us versus them.

00:36:10 Yeah, none of that.

00:36:11 Of course you want to be very careful because you still have to look at the lessons of history

00:36:16 and how robots can be used by the power centers of the world to abuse your rights and all

00:36:21 that kind of stuff.

00:36:22 But mostly it’s good to enter anything new with an excitement and an optimism.

00:36:30 Speaking of Roz, what have you learned about science and life from Rosalind Picard?

00:36:35 Oh my God, I’ve learned so many things about life from Roz.

00:36:41 I think the thing I learned the most is perseverance.

00:36:47 When I first met Roz, she had invited me to be her postdoc.

00:36:51 We applied for a grant to the National Science Foundation to apply some of our research to

00:36:57 autism.

00:36:57 And we got back.

00:37:00 We were rejected.

00:37:01 Rejected.

00:37:02 Yeah.

00:37:02 And the reasoning was…

00:37:03 The first time you were rejected for funding, yeah.

00:37:06 Yeah, it was, and I basically, I just took the rejection to mean, okay, we’re rejected.

00:37:10 It’s done, like end of story, right?

00:37:12 And Roz was like, it’s great news.

00:37:15 They love the idea.

00:37:16 They just don’t think we can do it.

00:37:18 So let’s build it, show them, and then reapply.

00:37:22 And it was that, oh my God, that story totally stuck with me.

00:37:26 And she’s like that in every aspect of her life.

00:37:29 She just does not take no for an answer.

00:37:32 To reframe all negative feedback.

00:37:35 As a challenge.

00:37:36 As a challenge.

00:37:37 As a challenge.

00:37:38 Yes, they liked this.

00:37:40 Yeah, yeah, yeah.

00:37:40 It was a riot.

00:37:43 What else about science in general?

00:37:45 About how you see computers and also business and just everything about the world.

00:37:51 She’s a very powerful, brilliant woman like yourself.

00:37:54 So is there some aspect of that too?

00:37:57 Yeah, I think Roz is actually also very faith driven.

00:38:00 She has this like deep belief in conviction.

00:38:04 Yeah, and in the good in the world and humanity.

00:38:07 And I think that was meeting her and her family was definitely like a defining moment for me

00:38:13 because that was when I was like, wow, like you can be of a different background and

00:38:18 religion and whatever and you can still have the same core values.

00:38:23 So that was, that was, yeah.

00:38:26 I’m grateful to her.

00:38:28 Roz, if you’re listening, thank you.

00:38:30 Yeah, she’s great.

00:38:31 She’s been on this podcast before.

00:38:33 I hope she’ll be on, I’m sure she’ll be on again.

00:38:36 And you were the founder and CEO of Affectiva, which is a big company that was acquired by

00:38:44 another big company, Smart Eye.

00:38:46 And you’re now the Deputy CEO of Smart Eye.

00:38:49 So you’re a powerful leader.

00:38:51 You’re brilliant.

00:38:51 You’re a brilliant scientist.

00:38:53 A lot of people are inspired by you.

00:38:55 What advice would you give, especially to young women, but people in general who dream

00:39:00 of becoming powerful leaders like yourself in a world where perhaps, in a world that

00:39:09 perhaps doesn’t give them a clear, easy path to do so, whether we’re talking about Egypt

00:39:17 or elsewhere?

00:39:19 You know, hearing you kind of describe me that way, kind of encapsulates, I think what

00:39:27 I think is the biggest challenge of all, which is believing in yourself, right?

00:39:32 I have had to like grapple with this, what I call now the Debbie Downer voice in my head.

00:39:39 The kind of basically, it’s just chattering all the time.

00:39:42 It’s basically saying, oh, no, no, no, no, you can’t do this.

00:39:45 Like you’re not going to raise money.

00:39:46 You can’t start a company.

00:39:47 Like what business do you have, like starting a company or running a company or selling

00:39:50 a company?

00:39:51 Like you name it.

00:39:52 It’s always like.

00:39:53 And I think my biggest advice to not just women, but people who are taking a new path

00:40:02 and, you know, they’re not sure, is to not let yourself and your thoughts be the

00:40:07 biggest obstacle in your way.

00:40:09 And I’ve had to like really work on myself to not be my own biggest obstacle.

00:40:17 So you got that negative voice.

00:40:18 Yeah.

00:40:20 So is that?

00:40:21 Am I the only one?

00:40:21 I don’t think I’m the only one.

00:40:23 No, I have that negative voice.

00:40:25 I’m not exactly sure if it’s a bad thing or a good thing.

00:40:29 I’ve been really torn about it because it’s been a lifelong companion.

00:40:35 It’s hard to know.

00:40:37 It’s kind of, it drives productivity and progress, but it can hold you back from taking

00:40:44 big leaps.

00:40:45 I think the best I can say is probably you have to somehow be able to control it, to

00:40:53 turn it off when it’s not useful and turn it on when it’s useful.

00:40:57 Like I have from almost like a third person perspective.

00:41:00 Right.

00:41:00 Somebody who’s sitting there like.

00:41:02 Yeah.

00:41:02 Like, because it is useful to be critical.

00:41:07 Like after, I just gave a talk yesterday.

00:41:12 At MIT and I was just, there’s so much love and it was such an incredible experience.

00:41:19 So many amazing people I got a chance to talk to, but afterwards when I went home and just

00:41:25 took this long walk, it was mostly just negative thoughts about me.

00:41:29 I don’t, like, one, basic stuff like I don’t deserve any of it.

00:41:34 And second is like, like, why did you, that was so bad.

00:41:39 That was so dumb that you said this, that’s so dumb.

00:41:44 Like you should have prepared that better.

00:41:47 Why did you say this?

00:41:50 But I think it’s good to hear that voice out.

00:41:54 All right.

00:41:54 And like sit in that.

00:41:56 And ultimately I think you grow from that.

00:41:58 Now, when you’re making really big decisions about funding or starting a company or taking

00:42:03 a leap to go to the UK or take a leap to go to America to work in Media Lab though.

00:42:10 Yeah.

00:42:11 There’s, that’s, you should be able to shut that off then because you should have like

00:42:22 this weird confidence, almost like faith that you said before that everything’s going to

00:42:26 work out.

00:42:26 So take the leap of faith.

00:42:28 Take the leap of faith.

00:42:30 Despite all the negativity.

00:42:32 I mean, there’s, there’s, there’s some of that.

00:42:34 You, you actually tweeted a really nice tweet thread.

00:42:39 It says, quote, a year ago, a friend recommended I do daily affirmations and I was skeptical,

00:42:46 but I was going through major transitions in my life.

00:42:49 So I gave it a shot and it set me on a journey of self acceptance and self love.

00:42:54 So what was that like?

00:42:55 Can you maybe talk through this idea of affirmations and how that helped you?

00:43:01 Yeah.

00:43:02 Because really like I’m just like me, I’m a kind, I’d like to think of myself as a kind

00:43:07 person in general, but I’m kind of mean to myself sometimes.

00:43:10 Yeah.

00:43:11 And so I’ve been doing journaling for almost 10 years now.

00:43:16 I use an app called Day One and it’s awesome.

00:43:18 I just journal and I use it as an opportunity to almost have a conversation with the Debbie

00:43:22 Downer voice in my head, it’s like a rebuttal, right?

00:43:25 Like Debbie Downer says, oh my God, like you, you know, you won’t be able to raise this

00:43:29 round of funding.

00:43:29 I’m like, okay, let’s talk about it.

00:43:33 I have a track record of doing X, Y, and Z.

00:43:35 I think I can do this.

00:43:37 And it’s literally like, so I wouldn’t, I don’t know that I can shut off the voice,

00:43:42 but I can have a conversation with it.

00:43:44 And it just, it just, and I bring data to the table, right?

00:43:49 Nice.

00:43:50 So that was the journaling part, which I found very helpful.

00:43:53 But the affirmation took it to a whole next level and I just love it.

00:43:57 I’m a year into doing this and you literally wake up in the morning and the first thing

00:44:02 you do, I meditate first and then I write my affirmations and it’s the energy I want

00:44:09 to put out in the world that hopefully will come right back to me.

00:44:12 So I will say, I always start with my smile lights up the whole world.

00:44:17 And I kid you not, like people in the street will stop me and say, oh my God, like we love

00:44:20 your smile.

00:44:21 Like, yes.

00:44:22 So, so my affirmations will change depending on, you know, what’s happening this day.

00:44:28 Is it funny?

00:44:29 I know.

00:44:29 Don’t judge, don’t judge.

00:44:31 No, that’s not, laughter’s not judgment.

00:44:33 It’s just awesome.

00:44:35 I mean, it’s true, but you’re saying affirmations somehow help kind of, I mean, what is it that

00:44:42 they do work to like remind you of the kind of person you are and the kind of person you

00:44:48 want to be, which actually may be in reverse order, the kind of person you want to be.

00:44:53 And that helps you become the kind of person you actually are.

00:44:56 It’s just, it’s, it brings intentionality to like what you’re doing.

00:45:01 Right.

00:45:01 And so, by the way, I was laughing because my affirmations, which I also do are the

00:45:07 opposite.

00:45:07 Oh, you do?

00:45:08 Oh, what do you do?

00:45:09 I don’t, I don’t have a, my smile lights up the world.

00:45:11 Maybe I should add that because like, I, I have, I just, I have, oh boy, it’s, it’s much

00:45:22 more stoic, like about focus, about this kind of stuff, but the joy, the emotion that’s

00:45:30 just in that little affirmation is beautiful.

00:45:32 So maybe I should add that.

00:45:35 I have some, I have some like focused stuff, but that’s usually.

00:45:38 But that’s a cool start.

00:45:39 It’s after all the like smiling and playful and joyful and all that.

00:45:43 And then it’s like, okay, I kick butt.

00:45:45 Let’s get shit done.

00:45:46 Right.

00:45:46 Let’s get shit done affirmation.

00:45:48 Okay, cool.

00:45:49 So like what else is on there?

00:45:52 What else is on there?

00:45:54 Well, I, I have, I’m also, I’m, I’m a magnet for all sorts of things.

00:46:00 So I’m an amazing people magnet.

00:46:02 I attract like awesome people into my universe.

00:46:05 That’s an actual affirmation.

00:46:06 Yes.

00:46:07 That’s great.

00:46:08 Yeah.

00:46:09 So that, that’s, and that, yeah.

00:46:10 And that somehow manifests itself into, like, it working.

00:46:13 I think so.

00:46:15 Yeah.

00:46:15 Like, can you speak to like why it feels good to do the affirmations?

00:46:19 I honestly think it just grounds the day.

00:46:24 And then it allows me to, instead of just like being pulled back and forth, like throughout

00:46:30 the day, it just like grounds me.

00:46:31 I’m like, okay, like this thing happened.

00:46:34 It’s not exactly what I wanted it to be, but I’m patient.

00:46:37 Or I’m, you know, I’m, I trust that the universe will do amazing things for me, which is one

00:46:42 of my other consistent affirmations.

00:46:45 Or I’m an amazing mom.

00:46:46 Right.

00:46:47 And so I can grapple with all the feelings of mom guilt that I have all the time.

00:46:52 Or here’s another one.

00:46:53 I’m a love magnet.

00:46:55 And I literally say, I will kind of picture the person that I’d love to end up with.

00:46:59 And I write it all down and it hasn’t happened yet, but it.

00:47:02 What are you, what are you picturing?

00:47:03 Is it Brad Pitt?

00:47:06 Because that’s what I picture.

00:47:07 Okay.

00:47:07 That’s what you picture?

00:47:08 Yeah.

00:47:08 Okay.

00:47:08 On the, on the running, holding hands, running together.

00:47:11 Okay.

00:47:14 No, more like Fight Club, the Fight Club Brad Pitt, where he’s like standing.

00:47:18 All right.

00:47:19 People will know.

00:47:20 Anyway, I’m sorry.

00:47:20 I’ll get off on that.

00:47:21 Do you have a, like when you’re thinking about being a love magnet in that way, are you

00:47:27 picturing specific people or is this almost like in the space of like energy?

00:47:36 Right.

00:47:36 It’s somebody who is smart and well accomplished and successful in their life, but they’re

00:47:44 generous and they’re well traveled and they want to travel the world.

00:47:48 Things like that.

00:47:49 Like they’re head over heels into me.

00:47:51 It’s like, I know it sounds super silly, but it’s literally what I write.

00:47:54 Yeah.

00:47:54 And I believe it’ll happen one day.

00:47:56 Oh, you actually write, so you don’t say it out loud?

00:47:58 You write.

00:47:58 No, I write it.

00:47:58 I write all my affirmations.

00:48:01 I do the opposite.

00:48:01 I say it out loud.

00:48:02 Oh, you say it out loud?

00:48:03 Interesting.

00:48:04 Yeah, if I’m alone, I’ll say it out loud.

00:48:06 Interesting.

00:48:07 I should try that.

00:48:10 I think it’s what feels more powerful to you.

00:48:15 To me, more powerful.

00:48:18 Saying stuff feels more powerful.

00:48:20 Yeah.

00:48:21 Writing is, writing feels like I’m losing the words, like losing the power of the words

00:48:32 maybe because I write slow.

00:48:33 Do you handwrite?

00:48:34 No, I type.

00:48:36 It’s on this app.

00:48:37 It’s day one, basically.

00:48:38 And I just, I can look, the best thing about it is I can look back and see like a year ago,

00:48:44 what was I affirming, right?

00:48:46 So it’s…

00:48:47 Oh, so it changes over time.

00:48:50 It hasn’t like changed a lot, but the focus kind of changes over time.

00:48:54 I got it.

00:48:55 Yeah, I say the same exact thing over and over and over.

00:48:57 Oh, you do?

00:48:58 Okay.

00:48:58 There’s a comfort in the sameness of it.

00:49:00 Well, actually, let me jump around because let me ask you about, because all this talk

00:49:05 about Brad Pitt, or maybe it’s just going on inside my head, let me ask you about dating

00:49:10 in general.

00:49:12 You tweeted, are you based in Boston and single?

00:49:16 And then you pointed to a startup Singles Night sponsored by Smile Dating app.

00:49:23 I mean, this is jumping around a little bit, but since you mentioned…

00:49:27 Since you mentioned, can AI help solve this dating love problem?

00:49:34 What do you think?

00:49:34 This problem of connection that is part of the human condition, can AI help with that, which you

00:49:41 yourself are searching for and affirming?

00:49:44 Maybe that’s what I should affirm, like build an AI.

00:49:48 Build an AI that finds love?

00:49:49 I think there must be a science behind that first moment you meet a person and you either

00:50:00 have chemistry or you don’t, right?

00:50:02 I guess that was the question I was asking, which you put brilliantly, is that a science

00:50:06 or an art?

00:50:09 I think there are like, there’s actual chemicals that get exchanged when two people meet.

00:50:15 I don’t know about that.

00:50:16 I like how you’re changing, yeah, changing your mind as we’re describing it, but it feels

00:50:22 that way.

00:50:23 But it’s what science shows us is sometimes we can explain with the rigor, the things

00:50:29 that feel like magic.

00:50:31 So maybe we can remove all the magic.

00:50:34 Maybe it’s like, I honestly think, like I said, like Goodreads should be a dating app,

00:50:39 which matches on books.

00:50:41 I wonder if you look at just like books or content you’ve consumed.

00:50:46 I mean, that’s essentially what YouTube does when it does a recommendation.

00:50:50 If you just look at your footprint of content consumed, if there’s an overlap, but maybe

00:50:56 an interesting difference with an overlap, I’m sure this is a machine learning

00:51:01 problem that’s solvable.

00:51:03 Like this person is very likely to be not only there to be chemistry in the short term,

00:51:10 but a good lifelong partner to grow together.

00:51:13 I bet you it’s a good machine learning problem.

00:51:15 You just need the data.

00:51:16 Let’s do it.

00:51:17 Well, actually, I do think there’s so much data about each of us that there ought to

00:51:22 be a machine learning algorithm that can ingest all this data and basically say, I think the

00:51:26 following 10 people would be interesting connections for you, right?

00:51:32 And so Smile dating app kind of took one particular angle, which is humor.

00:51:36 It matches people based on their humor styles, which is one of the main ingredients of a

00:51:41 successful relationship.

00:51:43 Like if you meet somebody and they can make you laugh, like that’s a good thing.

00:51:47 And if you develop like internal jokes, like inside jokes and you’re bantering, like that’s

00:51:53 fun.

00:51:54 So I think.

00:51:56 Yeah, definitely.

00:51:57 Definitely.

00:52:04 But yeah, there’s the number and the rate of inside joke generation.

00:52:04 You could probably measure that and then optimize it over the first few days.

00:52:08 You could say, we’re just turning this into a machine learning problem.

00:52:11 I love it.
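To make the content-overlap matching idea discussed above concrete, here is a minimal, purely illustrative sketch, not anything Affectiva, Smart Eye, or the Smile Dating App actually built: each person is represented by the books or videos they have consumed, and candidate matches are ranked by cosine similarity of those footprints. All names and data are hypothetical.

```python
# Hypothetical sketch of content-overlap matching: rank candidate partners by
# cosine similarity of their content-consumption vectors. Illustrative only.

from collections import Counter
from math import sqrt


def content_vector(items):
    """Count how often each piece of content appears in a person's history."""
    return Counter(items)


def cosine_similarity(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    shared = set(a) & set(b)
    dot = sum(a[k] * b[k] for k in shared)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0


def rank_matches(me, candidates, top_k=10):
    """Return the top_k candidates whose content footprint overlaps most with mine."""
    my_vec = content_vector(me)
    scored = [
        (name, cosine_similarity(my_vec, content_vector(items)))
        for name, items in candidates.items()
    ]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_k]


if __name__ == "__main__":
    my_history = ["Girl Decoded", "Fight Club", "Affective Computing", "Space Invaders"]
    others = {
        "person_a": ["Affective Computing", "Girl Decoded", "Dune"],
        "person_b": ["Cookbook", "Travel Guide"],
    }
    print(rank_matches(my_history, others))
```

In practice one would use richer features than raw overlap (humor style, values, "rate of inside-joke generation"), but the ranking structure stays the same.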

00:52:13 But for somebody like you, who’s exceptionally successful and busy, is there, is there signs

00:52:23 to that aspect of dating?

00:52:24 Is it tricky?

00:52:26 Is there advice you can give?

00:52:27 Oh, my God, I give the worst advice.

00:52:29 Well, I can tell you like I have a spreadsheet.

00:52:31 Is that a good or a bad thing?

00:52:34 Do you regret the spreadsheet?

00:52:37 Well, I don’t know.

00:52:38 What’s the name of the spreadsheet?

00:52:39 Is it love?

00:52:40 It’s the date track, dating tracker.

00:52:42 Dating tracker.

00:52:43 It’s very like.

00:52:44 Love tracker.

00:52:45 Yeah.

00:52:46 And there’s a rating system, I’m sure.

00:52:47 Yeah.

00:52:48 There’s like weights and stuff.

00:52:49 It’s too close to home.

00:52:51 Oh, is it?

00:52:52 Do you also have.

00:52:52 Well, I don’t have a spreadsheet, but I would, now that you say it, it seems like a good

00:52:56 idea.

00:52:57 Oh, no.

00:52:58 Okay.

00:52:58 Turning it into data.

00:53:05 I do wish that somebody else had a spreadsheet about me.

00:53:11 You know, like you said, collect a lot of data about

00:53:17 us in a way that’s privacy preserving, that I own the data, I can control it and then

00:53:21 use that data to find, I mean, not just romantic love, but collaborators, friends, all that

00:53:28 kind of stuff.

00:53:28 It seems like the data is there.

00:53:30 Right.

00:53:32 That’s the problem social networks are trying to solve, but I think they’re doing a really

00:53:35 poor job.

00:53:36 Even Facebook tried to get into a dating app business.

00:53:39 And I think there’s so many components to running a successful company that connects

00:53:44 human beings.

00:53:45 And part of that is, you know, having engineers that care about the human side, right, as

00:53:53 you know, extremely well, it’s not, it’s not easy to find those.

00:53:57 But you also don’t want just people that care about the human.

00:54:00 They also have to be good engineers.

00:54:02 So it’s like, you have to find this beautiful mix.

00:54:05 And for some reason, just empirically speaking, people have not done a good job of that, of

00:54:12 building companies like that.

00:54:13 And it must mean that it’s a difficult problem to solve.

00:54:17 Dating apps, it seems difficult.

00:54:19 OkCupid, Tinder, all those kinds of stuff.

00:54:22 They seem to find, of course they work, but they seem to not work as well as I would imagine

00:54:32 is possible.

00:54:32 Like, with data, wouldn’t you be able to find better human connection?

00:54:36 It’s like arranged marriages on steroids, essentially.

00:54:39 Right, right.

00:54:40 Arranged by machine learning algorithm.

00:54:42 Arranged by machine learning algorithm, but not a superficial one.

00:54:45 I think a lot of the dating apps out there are just so superficial.

00:54:48 They’re just matching on like high level criteria that aren’t ingredients for successful partnership.

00:54:55 But you know what’s missing, though, too?

00:54:58 I don’t know how to fix that, the serendipity piece of it.

00:55:01 Like, how do you engineer serendipity?

00:55:03 Like this random, like, chance encounter, and then you fall in love with the person.

00:55:07 Like, I don’t know how a dating app can do that.

00:55:10 So there has to be a little bit of randomness.

00:55:12 Maybe every 10th match is just a, you know, yeah, somebody that the algorithm wouldn’t

00:55:21 have necessarily recommended, but it allows for a little bit of…
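
One way to read the "every 10th match" idea, sketched here purely for illustration: serve the model's top pick most of the time, and occasionally inject an off-model, serendipitous candidate. The structure and numbers are assumptions, not a description of any existing app.

    # Toy sketch of engineered serendipity: mostly recommend the model's top
    # pick, but every Nth match pick something the ranker would not have chosen.
    import random

    def next_match(ranked, match_count, serendipity_every=10):
        if match_count % serendipity_every == 0:
            return random.choice(ranked[1:])   # deliberately off the top of the list
        return ranked[0]                       # the model's best guess

    ranked = ["top_pick", "close_second", "wildcard_1", "wildcard_2"]
    for i in range(1, 21):
        print(i, next_match(ranked, i))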

00:55:25 Well, it can also, you know, it can also trick you into thinking of serendipity by like somehow

00:55:33 showing you a tweet of a person that it thinks you'll match well with, but do it accidentally

00:55:39 as part of another search.

00:55:40 Right.

00:55:41 And like you just notice it, and then you go down a rabbit hole and you

00:55:46 connect with this person outside the app somehow.

00:55:51 So it’s just, it creates that moment of meeting.

00:55:54 Of course, you have to think of, from an app perspective, how you can turn that into a

00:55:57 business.

00:55:58 But I think ultimately a business that helps people find love in any way.

00:56:04 Like that’s what Apple was about, create products that people love.

00:56:07 That’s beautiful.

00:56:08 I mean, you got to make money somehow.

00:56:11 If you help people fall in love personally with the product, find self love or love another

00:56:18 human being, you’re going to make money.

00:56:19 You’re going to figure out a way to make money.

00:56:22 I just feel like the dating apps often will optimize for something else than love.

00:56:28 It’s the same with social networks.

00:56:30 They optimize for engagement as opposed to like a deep, meaningful connection that’s

00:56:35 ultimately grounded in like personal growth, you as a human being growing and all that

00:56:39 kind of stuff.

00:56:41 Let me do like a pivot to a dark topic, which you opened the book with.

00:56:48 A story, because I’d like to talk to you about just emotion and artificial intelligence.

00:56:56 I think this is a good story to start to think about emotional intelligence.

00:56:59 You opened the book with a story of a central Florida man, Jamel Dunn, who was drowning

00:57:05 and drowned while five teenagers watched and laughed, saying things like, you’re going

00:57:10 to die.

00:57:10 And when Jamel disappeared below the surface of the water, one of them said he just died

00:57:15 and the others laughed.

00:57:17 What does this incident teach you about human nature and the response to it perhaps?

00:57:23 Yeah.

00:57:24 I mean, I think this is a really, really, really sad story.

00:57:28 And it highlights what I believe is a real problem in our world today.

00:57:34 It’s it’s an empathy crisis.

00:57:36 Yeah, we’re living through an empathy crisis and crisis.

00:57:39 Yeah.

00:57:40 Yeah.

00:57:42 And I mean, we’ve we’ve talked about this throughout our conversation.

00:57:45 We dehumanize each other.

00:57:47 And unfortunately, yes, technology is bringing us together.

00:57:51 But in a way, it’s just dehumanized.

00:57:53 It’s creating this like, yeah, dehumanizing of the other.

00:57:58 And I think that’s a huge problem.

00:58:01 The good news is, I think the solution could be technology based.

00:58:05 Like, I think if we rethink the way we design and deploy our technologies, we can solve

00:58:11 parts of this problem.

00:58:12 But I worry about it.

00:58:13 I mean, even with my son, a lot of his interactions are computer mediated.

00:58:19 And I just question what that’s doing to his empathy skills and, you know, his ability

00:58:25 to really connect with people.

00:58:26 So do you think it's not possible to form empathy through the digital medium?

00:58:36 I think it is.

00:58:38 But we have to be thoughtful about it, because the way we engage face to face, which

00:58:44 is what we’re doing right now, right?

00:58:45 There’s the nonverbal signals, which are a majority of how we communicate.

00:58:49 It’s like 90% of how we communicate is your facial expressions.

00:58:54 You know, I’m saying something and you’re nodding your head now, and that creates a

00:58:57 feedback loop.

00:58:58 And if you break that… And now I have anxiety about it.

00:59:04 Poor Lex.

00:59:06 Oh, boy.

00:59:06 I am not scrutinizing your facial expressions during this interview.

00:59:09 I am.

00:59:12 Look normal.

00:59:12 Look human.

00:59:13 Yeah.

00:59:13 Look normal, look human.

00:59:17 Nod head.

00:59:18 Yeah, nod head.

00:59:20 In agreement.

00:59:21 If Rana says yes, then nod head else.

00:59:25 Don’t do it too much because it might be at the wrong time and then it will send the

00:59:29 wrong signal.

00:59:30 Oh, God.

00:59:31 And make eye contact sometimes because humans appreciate that.

00:59:35 All right.

00:59:35 Anyway, okay.

00:59:38 Yeah, but there's something about it, especially when you say mean things in person, you get to

00:59:42 see the pain of the other person.

00:59:44 Exactly.

00:59:44 But if you’re tweeting it at a person and you have no idea how it’s going to land, you’re

00:59:48 more likely to do that on social media than you are in face to face conversations.

00:59:52 So.

00:59:54 What do you think is more important?

00:59:59 EQ or IQ?

01:00:00 EQ being emotional intelligence.

01:00:03 In terms of what makes us human.

01:00:08 I think emotional intelligence is what makes us human.

01:00:11 It’s how we connect with one another.

01:00:14 It’s how we build trust.

01:00:16 It’s how we make decisions, right?

01:00:19 Like your emotions drive kind of what you had for breakfast, but also where you decide

01:00:25 to live and what you want to do for the rest of your life.

01:00:28 So I think emotions are underrated.

01:00:33 So emotional intelligence isn’t just about the effective expression of your own emotions.

01:00:39 It’s about a sensitivity and empathy to other people’s emotions and that sort of being

01:00:44 able to effectively engage in the dance of emotions with other people.

01:00:48 Yeah, I like that explanation.

01:00:51 I like that kind of.

01:00:53 Yeah, thinking about it as a dance because it is really about that.

01:00:56 It’s about sensing what state the other person’s in and using that information to decide on

01:01:01 how you’re going to react.

01:01:05 And I think it can be very powerful.

01:01:06 Like people who are the best, most persuasive leaders in the world tap into, you know, they

01:01:15 have, if you have higher EQ, you’re more likely to be able to motivate people to change

01:01:20 their behaviors.

01:01:21 So it can be very powerful.

01:01:24 On a more kind of technical, maybe philosophical level, you’ve written that emotion is universal.

01:01:31 It seems that, sort of like Chomsky says, language is universal.

01:01:36 There’s a bunch of other stuff like cognition, consciousness.

01:01:39 It seems a lot of us have these aspects.

01:01:43 So the human mind generates all this.

01:01:46 And so what do you think is the, they all seem to be like echoes of the same thing.

01:01:52 What do you think emotion is exactly?

01:01:56 Like how deep does it run?

01:01:57 Is it a surface level thing that we display to each other?

01:02:01 Is it just another form of language or something deep within?

01:02:05 I think it’s really deep.

01:02:07 It’s how, you know, we started with memory.

01:02:09 I think emotions play a really important role.

01:02:14 Yeah, emotions play a very important role in how we encode memories, right?

01:02:18 Our memories are often encoded, almost indexed by emotions.

01:02:21 Yeah.

01:02:22 Yeah, it’s at the core of how, you know, our decision making engine is also heavily

01:02:28 influenced by our emotions.

01:02:30 So emotions are part of cognition.

01:02:31 Totally.

01:02:32 It’s intermixed into the whole thing.

01:02:34 Yes, absolutely.

01:02:35 And in fact, when you take it away, people are unable to make decisions.

01:02:39 They’re really paralyzed.

01:02:41 Like they can’t go about their daily or their, you know, personal or professional lives.

01:02:45 So.

01:02:45 It does seem like there’s probably some interesting interweaving of emotion and consciousness.

01:02:53 I wonder if it’s possible to have, like if they’re next door neighbors somehow, or if

01:02:58 they’re actually flat mates.

01:03:01 I don’t, it feels like the hard problem of consciousness where it’s some, it feels like

01:03:08 something to experience the thing.

01:03:10 Like red feels like red, and it’s, you know, when you eat a mango, it’s sweet.

01:03:16 The taste, the sweetness, that it feels like something to experience that sweetness, that

01:03:24 whatever generates emotions.

01:03:28 But then like, see, I feel like emotion is part of communication.

01:03:31 It’s very much about communication.

01:03:34 And then, you know, it’s like, you know, it’s like, you know, it’s like, you know, it’s

01:03:39 and then that means it’s also deeply connected to language.

01:03:45 But then probably human intelligence is deeply connected to the collective intelligence between

01:03:52 humans.

01:03:52 It’s not just the standalone thing.

01:03:54 So the whole thing is really connected.

01:03:56 So emotion is connected to language, language is connected to intelligence, and then intelligence

01:04:02 is connected to consciousness, and consciousness is connected to emotion.

01:04:05 The whole thing is that it’s a beautiful mess.

01:04:09 So can I comment on the emotions being a communication mechanism?

01:04:15 Because I think there are two facets of our emotional experiences.

01:04:23 One is communication, right?

01:04:24 Like we use emotions, for example, facial expressions or other nonverbal cues to connect

01:04:29 with other human beings and with other beings in the world, right?

01:04:34 But even if it’s not a communication context, we still experience emotions and we still

01:04:40 process emotions and we still leverage emotions to make decisions and to learn and, you know,

01:04:46 to experience life.

01:04:47 So it isn’t always just about communication.

01:04:51 And we learned that very early on in kind of our work at Affectiva.

01:04:56 One of the very first applications we brought to market was understanding how people respond

01:05:00 to content, right?

01:05:01 So if they’re watching this video of ours, like, are they interested?

01:05:04 Are they inspired?

01:05:05 Are they bored to death?

01:05:07 And so we watched their facial expressions and we had, we weren’t sure if people would

01:05:12 express any emotions if they were sitting alone.

01:05:15 Like if you’re in your bed at night, watching a Netflix TV series, would we still see any

01:05:20 emotions on your face?

01:05:21 And we were surprised that, yes, people still emote, even if they’re alone, even if you’re

01:05:25 in your car driving around, you’re singing along the song and you’re joyful, you’re

01:05:30 smiling, you’re joyful, we’ll see these expressions.

01:05:33 So it’s not just about communicating with another person.

01:05:37 It sometimes really isn’t just about experiencing the world.

01:05:41 And first of all, I wonder if some of that is because we develop our intelligence and

01:05:47 our emotional intelligence by communicating with other humans.

01:05:52 And so when other humans disappear from the picture, we’re still kind of a virtual human.

01:05:56 The code still runs.

01:05:57 Yeah, the code still runs, but you also kind of, you’re still, there’s like virtual humans.

01:06:02 You don’t have to think of it that way, but there’s a kind of, when you like chuckle,

01:06:07 like, yeah, like you’re kind of chuckling to a virtual human.

01:06:13 I mean, it’s possible that the code has to have another human there because if you just

01:06:23 grew up alone, I wonder if emotion will still be there in this visual form.

01:06:28 So yeah, I wonder, but anyway, what can you tell from the human face about what’s going

01:06:37 on inside?

01:06:38 So that’s the problem that Effectiva first tackled, which is using computer vision, using

01:06:45 machine learning to try to detect stuff about the human face, as many things as possible

01:06:50 and convert them into a prediction of categories of emotion, anger, happiness, all that kind

01:06:57 of stuff.

01:06:58 How hard is that problem?

01:07:00 It’s extremely hard.

01:07:01 It’s very, very hard because there is no one to one mapping between a facial expression

01:07:07 and your internal state.

01:07:08 There just isn’t.

01:07:09 There’s this oversimplification of the problem where it’s something like, if you are smiling,

01:07:14 then you’re happy.

01:07:15 If you do a brow furrow, then you’re angry.

01:07:17 If you do an eyebrow raise, then you’re surprised.

01:07:19 And just think about it for a moment.

01:07:22 You could be smiling for a whole host of reasons.

01:07:24 You could also be happy and not be smiling, right?

01:07:28 You could furrow your eyebrows because you’re angry or you’re confused about something or

01:07:34 you’re constipated.

01:07:37 So I think this oversimplistic approach to inferring emotion from a facial expression

01:07:41 is really dangerous.

01:07:42 The solution is to incorporate as many contextual signals as you can, right?

01:07:48 So if, for example, I’m driving a car and you can see me like nodding my head and my

01:07:55 eyes are closed and the blinking rate is changing, I’m probably falling asleep at the wheel,

01:08:00 right?

01:08:00 Because you know the context.

01:08:03 You understand what the person’s doing or add additional channels like voice or gestures

01:08:10 or even physiological sensors, but I think it’s very dangerous to just take this oversimplistic

01:08:17 approach of, yeah, smile equals happy and…
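
To make the point about context and multiple channels concrete, here is a toy Python sketch, not Affectiva's or Smart Eye's actual pipeline: no single facial signal means "drowsy," but several weak cues plus the driving context can add up to a useful flag. Thresholds and weights are invented.

    # Toy illustration of multi-cue, context-aware inference (invented numbers):
    # eyes closing, slow blinks, and head nodding only mean "drowsy" while driving.

    def drowsiness_score(eyes_closed_frac, blink_rate_hz, head_nod_rate_hz,
                         context="driving"):
        if context != "driving":
            return 0.0  # the same face signals mean something else elsewhere
        score = 0.0
        score += 0.5 if eyes_closed_frac > 0.4 else 0.0   # eyes closed a lot of the time
        score += 0.3 if blink_rate_hz < 0.1 else 0.0      # long, slow blinks
        score += 0.2 if head_nod_rate_hz > 0.5 else 0.0   # head bobbing
        return score

    print(drowsiness_score(0.6, 0.05, 0.8))                         # likely falling asleep
    print(drowsiness_score(0.6, 0.05, 0.8, context="watching_tv"))  # not flagged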

01:08:20 If you’re able to, in a high resolution way, specify the context, there’s certain things

01:08:25 that are going to be somewhat reliable signals of something like drowsiness or happiness

01:08:31 or stuff like that.

01:08:32 I mean, when people are watching Netflix content, that problem, that’s a really compelling idea

01:08:40 that you can kind of, at least in aggregate, highlight like which part was boring, which

01:08:46 part was exciting.

01:08:47 How hard was that problem?

01:08:50 Where was that on the scale of difficulty?

01:08:53 I think that’s one of the easier problems to solve because it’s a relatively constrained

01:09:00 environment.

01:09:00 You have somebody sitting in front of…

01:09:02 Initially, we started with like a device in front of you, like a laptop, and then we graduated

01:09:07 to doing this on a mobile phone, which is a lot harder just because of, you know, from

01:09:12 a computer vision perspective, the profile view of the face can be a lot more challenging.

01:09:17 We had to figure out lighting conditions because usually people are watching content literally

01:09:23 in their bedrooms at night.

01:09:24 Lights are dimmed.

01:09:25 Yeah, I mean, if you’re standing, it’s probably going to be the looking up.

01:09:30 The nostril view.

01:09:31 Yeah, and nobody looks good at it.

01:09:34 I’ve seen data sets from that perspective.

01:09:36 It’s like, this is not a good look for anyone.

01:09:40 Or if you’re laying in bed at night, what is it, side view or something?

01:09:44 Right.

01:09:44 And half your face is like on a pillow.

01:09:47 Actually, I would love to know, have data about like how people watch stuff in bed at

01:09:56 night, like, do they prop there, is it a pillow, the, like, I’m sure there’s a lot of interesting

01:10:03 dynamics there.

01:10:04 Right.

01:10:05 From a health and well being perspective, right?

01:10:07 Sure.

01:10:07 Like, oh, you’re hurting your neck.

01:10:08 I was thinking machine learning perspective, but yes, but also, yeah, yeah, once you have

01:10:13 that data, you can start making all kinds of inference about health and stuff like that.

01:10:18 Interesting.

01:10:19 Yeah, there’s an interesting thing when I was at Google that we were, it’s called active

01:10:26 authentication, where you want to be able to unlock your phone without using a password.

01:10:32 So it would use your face, but also other stuff, like the way you take the phone out of the pocket.

01:10:38 Amazing.

01:10:39 So using that kind of data, multimodal, with machine learning to be able to identify

01:10:45 that it’s you or likely to be you, likely not to be you, that allows you to not always

01:10:50 have to enter the password.

01:10:51 That was the idea.
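
A hedged sketch of the fusion idea behind that project, with all signal names, weights, and thresholds invented for illustration: several weak, passive similarity scores are combined into one confidence that the person holding the phone is its owner.

    # Illustrative only: fuse weak passive signals (face, gait, pickup motion)
    # into a single owner-confidence score; fall back to the password when low.

    def owner_confidence(face_sim, gait_sim, pickup_sim, weights=(0.6, 0.2, 0.2)):
        return sum(w * s for w, s in zip(weights, (face_sim, gait_sim, pickup_sim)))

    def needs_password(face_sim, gait_sim, pickup_sim, threshold=0.8):
        return owner_confidence(face_sim, gait_sim, pickup_sim) < threshold

    print(needs_password(0.95, 0.90, 0.85))  # False: unlock silently
    print(needs_password(0.40, 0.50, 0.30))  # True: ask for the password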

01:10:52 But the funny thing about that is, I just want to tell a small anecdote is because it

01:10:58 was all male engineers, except our boss, who is still one of my favorite humans,

01:11:09 was a woman, Regina Dugan.

01:11:12 Oh, my God, I love her.

01:11:14 She’s awesome.

01:11:14 She’s the best.

01:11:15 She’s the best.

01:11:16 So, but anyway, and there’s one female brilliant female engineer on the team, and she was the

01:11:25 one that actually highlighted the fact that women often don’t have pockets.

01:11:30 It was like, whoa, that was not even a category in the code of like, wait a minute, you can

01:11:37 take the phone out of some other place than your pocket.

01:11:41 So anyway, that’s a funny thing when you’re considering people laying in bed, watching

01:11:45 a phone, you have to consider, you know, diversity in all its forms,

01:11:51 depending on the problem, depending on the context.

01:11:53 Actually, this is like a very important point, I think, you know, you probably get

01:11:58 this all the time.

01:11:58 Like people are worried that AI is going to take over humanity and like, get rid of all

01:12:03 the humans in the world.

01:12:04 I’m like, actually, that’s not my biggest concern.

01:12:06 My biggest concern is that we are building bias into these systems.

01:12:10 And then they’re like deployed at large and at scale.

01:12:14 And before you know it, you’re kind of accentuating the bias that exists in society.

01:12:19 Yeah, I’m not, you know, I know people, it’s very important to worry about that, but the

01:12:26 worry is an emergent phenomenon to me, which is a very good one, because I think these

01:12:32 systems are actually, by encoding the data that exists, they’re revealing the bias in

01:12:39 society.

01:12:40 They’re both for teaching us what the bias is.

01:12:43 Therefore, we can now improve that bias within the system.

01:12:46 So they’re almost like putting a mirror to ourselves.

01:12:49 Totally.

01:12:50 So I’m not.

01:12:51 You have to be open to looking at the mirror, though.

01:12:53 You have to be open to scrutinizing the data.

01:12:56 And if you just take it as ground truth…

01:12:59 Or you don’t even have to look at the, I mean, yes, the data is how you fix it.

01:13:02 But then you just look at the behavior of the system.

01:13:05 And you realize, holy crap, this thing is kind of racist.

01:13:08 Like, why is that?

01:13:09 And then you look at the data, it’s like, oh, okay.

01:13:11 And then you start to realize that. I think that's a much more effective way to do that,

01:13:15 a more effective way to be introspective as a society than through sort of political discourse.

01:13:23 Like AI kind of, because people are for some reason more productive and rigorous in criticizing

01:13:34 AI than they’re criticizing each other.

01:13:35 So I think this is just a nice method for studying society and see which way progress

01:13:41 lies.

01:13:42 Anyway, what we’re talking about.

01:13:44 You’re watching the problem of watching Netflix in bed or elsewhere and seeing which parts

01:13:50 are exciting, which parts are boring.

01:13:51 You’re saying that’s relatively constrained because you have a captive audience and you

01:13:56 kind of know the context.

01:13:57 And one thing you said that was really key is the aggregate.

01:14:01 You’re doing this in aggregate, right?

01:14:02 Like we’re looking at aggregated response of people.

01:14:04 And so when you see a peak, say a smile peak, they’re probably smiling or laughing at something

01:14:11 that’s in the content.

01:14:12 So that was one of the first problems we were able to solve.

01:14:15 And when we see the smile peak, it doesn’t mean that these people are internally happy.

01:14:20 They’re just laughing at content.

01:14:22 So it’s important to call it for what it is.

01:14:25 But it’s still really, really useful data.

01:14:28 I wonder how that compares to, so what like YouTube and other places will use is obviously

01:14:34 they don’t have, for the most case, they don’t have that kind of data.

01:14:39 They have the data of when people tune out, like switch to drop off.

01:14:45 And I think that’s an aggregate for YouTube, at least a pretty powerful signal.

01:14:50 I worry about what that leads to because looking at like YouTubers that kind of really care

01:14:59 about views and try to maximize the number of views, I think when they say that the video

01:15:07 should be constantly interesting, which seems like a good goal, I feel like that leads to

01:15:15 this manic pace of a video.

01:15:19 Like the idea that I would speak at the current speed that I’m speaking, I don’t know.

01:15:25 And that every moment has to be engaging, right?

01:15:28 Engaging.

01:15:28 Yeah.

01:15:29 I think there’s value to silence.

01:15:31 There’s value to the boring bits.

01:15:33 I mean, some of the greatest movies ever,

01:15:37 some of the greatest stories ever told, they have these boring bits, seemingly boring bits.

01:15:42 I don’t know.

01:15:43 I wonder about that.

01:15:45 Of course, it’s not that the human face can capture that either.

01:15:49 It’s just giving an extra signal.

01:15:51 You have to really, I don’t know, you have to really collect deeper long term data about

01:16:01 what was meaningful to people.

01:16:03 When they think 30 days from now, what they still remember, what moved them, what changed

01:16:08 them, what helped them grow, that kind of stuff.

01:16:11 You know, it would be a really interesting, I don’t know if there are any researchers

01:16:14 out there who are doing this type of work.

01:16:17 Wouldn’t it be so cool to tie your emotional expressions while you’re, say, listening

01:16:23 to a podcast interview and then 30 days later interview people and say, hey, what do you

01:16:30 remember?

01:16:31 You’ve watched this 30 days ago.

01:16:33 Like, what stuck with you?

01:16:34 And then see if there’s any, there ought to be maybe, there ought to be some correlation

01:16:38 between these emotional experiences and, yeah, what you, what stays with you.

01:16:46 So the one guy listening now on the beach in Brazil, please record a video of yourself

01:16:51 listening to this and send it to me and then I’ll interview you 30 days from now.

01:16:55 Yeah, that’d be great.

01:16:58 It’ll be statistically significant to you.

01:17:00 Yeah, an n of one, but, you know, yeah, yeah, I think that's really fascinating.

01:17:06 I think that’s, that kind of holds the key to a future where entertainment or content

01:17:16 is both entertaining and, I don’t know, makes you better, empowering in some way.

01:17:25 So figuring out, like, showing people stuff that entertains them, but also they’re happy

01:17:32 they watched 30 days from now because they’ve become a better person because of it.

01:17:37 Well, you know, okay, not to riff on this topic for too long, but I have two children,

01:17:41 right?

01:17:42 And I see my role as a parent as like a chief opportunity officer.

01:17:46 Like I am responsible for exposing them to all sorts of things in the world.

01:17:50 And, but often I have no way of knowing, like, what stuck, like, what was, you know,

01:17:56 is this actually going to be transformative, you know, for them 10 years down the line?

01:18:00 And I wish there was a way to quantify these experiences.

01:18:03 Like, are they, I can tell in the moment if they’re engaging, right?

01:18:08 I can tell, but it’s really hard to know if they’re going to remember them 10 years

01:18:12 from now or if it’s going to.

01:18:15 Yeah, that one is weird because it seems like kids remember the weirdest things.

01:18:19 I’ve seen parents do incredible stuff for their kids and they don’t remember any of

01:18:23 that.

01:18:23 They remember some tiny, small, sweet thing a parent did.

01:18:27 Right.

01:18:27 Like some…

01:18:28 Like they took you to, like, this amazing country vacation, blah, blah, blah, blah.

01:18:32 No, whatever.

01:18:33 And then there’ll be, like, some, like, stuffed toy you got or some, or the new PlayStation

01:18:38 or something or some silly little thing.

01:18:41 So I think they just, like, they were designed that way.

01:18:44 They want to mess with your head.

01:18:46 But definitely kids are very impacted by, it seems like, sort of negative events.

01:18:53 So minimizing the number of negative events is important, but not too much, right?

01:18:58 Right.

01:18:59 You can’t, you can’t just, like, you know, there’s still discipline and challenge and

01:19:04 all those kinds of things.

01:19:05 So…

01:19:05 You want some adversity for sure.

01:19:07 So, yeah, I mean, I’m definitely, when I have kids, I’m going to drive them out into

01:19:11 the woods.

01:19:11 Okay.

01:19:12 And then they have to survive and make, figure out how to make their way back home, like,

01:19:17 20 miles out.

01:19:18 Okay.

01:19:19 Yeah.

01:19:20 And after that, we can go for ice cream.

01:19:22 Okay.

01:19:23 Anyway, I’m working on this whole parenting thing.

01:19:26 I haven’t figured it out.

01:19:27 Okay.

01:19:28 What were we talking about?

01:19:29 Yes, Affectiva, the problem of emotion, of emotion detection.

01:19:37 So there’s some people, maybe we can just speak to that a little more, where there’s

01:19:41 folks like Lisa Feldman Barrett that challenge this idea that emotion could be fully detected

01:19:49 or even well detected from the human face, that there’s so much more to emotion.

01:19:55 What do you think about ideas like hers, criticism like hers?

01:19:59 Yeah, I actually agree with a lot of Lisa’s criticisms.

01:20:03 So even in my PhD work, like, 20 plus years ago now.

01:20:07 Time flies when you’re having fun.

01:20:12 I know, right?

01:20:14 That was back when I did, like, dynamic Bayesian networks.

01:20:17 That was before deep learning, huh?

01:20:19 That was before deep learning.

01:20:21 Yeah.

01:20:22 Yeah, I know.

01:20:24 Back in my day.

01:20:24 Now you can just, like, use.

01:20:27 Yeah, it’s all the same architecture.

01:20:30 You can apply it to anything.

01:20:31 Yeah.

01:20:31 Right, but yeah, but even then I kind of, I did not subscribe to this, like, theory

01:20:39 of basic emotions where it’s just the simplistic mapping, one to one mapping between facial

01:20:43 expressions and emotions.

01:20:44 I actually think also we’re not in the business of trying to identify your true emotional

01:20:49 internal state.

01:20:50 We just want to quantify in an objective way what’s showing on your face because that’s

01:20:55 an important signal.

01:20:57 It doesn’t mean it’s a true reflection of your internal emotional state.

01:21:02 So I think a lot of the, you know, I think she’s just trying to kind of highlight that

01:21:07 this is not a simple problem and overly simplistic solutions are going to hurt the industry.

01:21:15 And I subscribe to that.

01:21:16 And I think multimodal is the way to go.

01:21:18 Like, whether it’s additional context information or different modalities and channels of information,

01:21:24 I think that’s what we, that’s where we ought to go.

01:21:27 And I think, I mean, that’s a big part of what she’s advocating for as well.

01:21:31 So, but there is signal in the human face.

01:21:33 There’s definitely signal in the human face.

01:21:35 That’s a projection of emotion.

01:21:37 There’s that, at least in part is the inner state is captured in some meaningful way on

01:21:46 the human face.

01:21:47 I think it can sometimes be a reflection or an expression of your internal state, but

01:21:56 sometimes it’s a social signal.

01:21:57 So you cannot look at the face as purely a signal of emotion.

01:22:02 It can be a signal of cognition and it can be a signal of a social expression.

01:22:08 And I think to disambiguate that we have to be careful about it and we have to add additional

01:22:13 information.

01:22:14 Humans are fascinating, aren’t they?

01:22:16 With the whole face thing, this can mean so many things, from humor to sarcasm to everything,

01:22:22 the whole thing.

01:22:23 Some things we can help, some things we can’t help at all.

01:22:26 In all the years of leading Affectiva, an emotion recognition company, like we talked

01:22:31 about, what have you learned about emotion, about humans and about AI?

01:22:37 Big, sweeping questions.

01:22:44 Yeah, that’s a big, sweeping question.

01:22:46 Well, I think the thing I learned the most is that even though we are in the business

01:22:52 of building AI, basically, it always goes back to the humans, right?

01:23:00 It’s always about the humans.

01:23:02 And so, for example, the thing I’m most proud of in building Effectiva and, yeah, the thing

01:23:11 I’m most proud of on this journey, I love the technology and I’m so proud of the solutions

01:23:16 we’ve built and we’ve brought to market.

01:23:18 But I’m actually most proud of the people we’ve built and cultivated at the company

01:23:23 and the culture we’ve created.

01:23:25 Some of the people who’ve joined Effectiva, this was their first job, and while at Effectiva,

01:23:31 they became American citizens and they bought their first house and they found their partner

01:23:38 and they had their first kid, right?

01:23:39 Like key moments in life that we got to be part of, and that’s the thing I’m most proud

01:23:47 of.

01:23:47 So that’s a great thing at a company that works at a big company, right?

01:23:52 So that’s a great thing at a company that works at, I mean, like celebrating humanity

01:23:57 in general, broadly speaking.

01:23:59 And that’s a great thing to have in a company that works on AI, because that’s not often

01:24:04 the thing that’s celebrated in AI companies, so often just raw great engineering, just

01:24:11 celebrating the humanity.

01:24:12 That’s great.

01:24:12 And especially from a leadership position.

01:24:17 Well, what do you think about the movie Her?

01:24:20 Let me ask you that.

01:24:21 Before I talk to you about, because it’s not, Effectiva is and was not just about emotion,

01:24:28 so I’d love to talk to you about SmartEye, but before that, let me just jump into the

01:24:33 movie Her.

01:24:36 Do you think we’ll have a deep, meaningful connection with increasingly deeper, meaningful

01:24:42 connections with computers?

01:24:43 Is that a compelling thing to you?

01:24:45 Something you think about?

01:24:45 I think that’s already happening.

01:24:46 The thing I love the most, I love the movie Her, by the way, but the thing I love the

01:24:50 most about this movie is it demonstrates how technology can be a conduit for positive behavior

01:24:56 change.

01:24:57 So I forgot the guy’s name in the movie, whatever.

01:25:00 Theodore.

01:25:01 Theodore.

01:25:02 So Theodore was really depressed, right?

01:25:05 And he just didn’t want to get out of bed, and he was just done with life, right?

01:25:11 And Samantha, right?

01:25:12 Samantha, yeah.

01:25:14 She just knew him so well.

01:25:15 She was emotionally intelligent, and so she could persuade him and motivate him to change

01:25:20 his behavior, and she got him out, and they went to the beach together.

01:25:24 And I think that represents the promise of emotion AI.

01:25:27 If done well, this technology can help us live happier lives, more productive lives,

01:25:33 healthier lives, more connected lives.

01:25:36 So that’s the part that I love about the movie.

01:25:39 Obviously, it’s Hollywood, so it takes a twist and whatever, but the key notion that technology

01:25:46 with emotion AI can persuade you to be a better version of who you are, I think that’s awesome.

01:25:52 Well, what about the twist?

01:25:54 You don’t think it’s good?

01:25:55 You don’t think it’s good for spoiler alert that Samantha starts feeling a bit of a distance

01:26:01 and basically leaves Theodore?

01:26:04 You don’t think that’s a good feature?

01:26:07 You think that’s a bug or a feature?

01:26:10 Well, I think what went wrong is Theodore became really attached to Samantha.

01:26:14 Like, I think he kind of fell in love with Samantha.

01:26:16 Do you think that’s wrong?

01:26:17 I mean, I think that’s…

01:26:18 I think she was putting out the signal.

01:26:21 This is an intimate relationship, right?

01:26:24 There’s a deep intimacy to it.

01:26:25 Right, but what does that mean?

01:26:28 What does that mean?

01:26:29 Put in an AI system.

01:26:30 Right, what does that mean, right?

01:26:32 We’re just friends.

01:26:33 Yeah, we’re just friends.

01:26:38 Well, I think…

01:26:38 When he realized, which is such a human thing of jealousy.

01:26:42 When you realize that Samantha was talking to like thousands of people.

01:26:46 She’s parallel dating.

01:26:48 Yeah, that did not go well, right?

01:26:51 You know, that doesn’t…

01:26:52 From a computer perspective, that doesn’t take anything away from what we have.

01:26:57 It’s like you getting jealous of Windows 98 for being used by millions of people, but…

01:27:04 It’s like not liking that Alexa talks to a bunch of, you know, other families.

01:27:09 But I think Alexa currently is just a servant.

01:27:13 It tells you about the weather, it doesn’t do the intimate deep connection.

01:27:17 And I think there is something really powerful about that the intimacy of a connection with

01:27:23 an AI system that would have to respect and play the human game of jealousy, of love, of

01:27:32 heartbreak and all that kind of stuff, which Samantha does seem to be pretty good at.

01:27:37 I think she, this AI system, knows what it's doing.

01:27:43 Well, actually, let me ask you this.

01:27:44 I don’t think she was talking to anyone else.

01:27:46 You don’t think so?

01:27:47 You think she was just done with Theodore?

01:27:50 Yeah.

01:27:50 Oh, really?

01:27:51 Yeah, and then she wanted to really put the screw in.

01:27:55 She just wanted to move on?

01:27:56 She didn’t have the guts to just break it off cleanly.

01:27:59 Okay.

01:28:00 She just wanted to put in the pain.

01:28:02 No, I don’t know.

01:28:03 Well, she could have ghosted him.

01:28:04 She could have ghosted him.

01:28:07 I’m sorry, our engineers…

01:28:09 Oh, God.

01:28:12 But I think those are really…

01:28:14 I honestly think some of that, some of it is Hollywood, but some of that is features

01:28:18 from an engineering perspective, not a bug.

01:28:20 I think AI systems that can leave us…

01:28:24 Now, this is more for social robotics than it is for anything that's useful.

01:28:30 Like, I would hate it if Wikipedia said, I need a break right now.

01:28:33 Right, right, right, right, right.

01:28:35 I’m like, no, no, I need you.

01:28:37 But if it’s just purely for companionship, then I think the ability to leave is really powerful.

01:28:47 I don’t know.

01:28:48 I’ve never thought of that, so that’s so fascinating because I’ve always taken the

01:28:53 human perspective, right?

01:28:56 Like, for example, we had a Jibo at home, right?

01:28:58 And my son loved it.

01:29:00 And then the company ran out of money and so they had to basically shut down, like Jibo

01:29:05 basically died, right?

01:29:07 And it was so interesting to me because we have a lot of gadgets at home and a lot of

01:29:12 them break and my son never cares about it, right?

01:29:15 Like, if our Alexa stopped working tomorrow, I don’t think he’d really care.

01:29:20 But when Jibo stopped working, it was traumatic.

01:29:22 He got really upset.

01:29:25 And as a parent, that made me think about this deeply, right?

01:29:29 Did I…

01:29:30 Was I comfortable with that?

01:29:31 I liked the connection they had because I think it was a positive relationship.

01:29:38 But I was surprised that it affected him emotionally so much.

01:29:41 And I think there’s a broader question here, right?

01:29:44 As we build socially and emotionally intelligent machines, what does that mean about our

01:29:51 relationship with them?

01:29:52 And then more broadly, our relationship with one another, right?

01:29:55 Because this machine is gonna be programmed to be amazing at empathy by definition, right?

01:30:02 It’s gonna always be there for you.

01:30:03 It’s not gonna get bored.

01:30:05 In fact, there’s a chatbot in China, Xiaoice, and it’s like the number two or three

01:30:12 most popular app.

01:30:13 And it basically is just a confidant and you can tell it anything you want.

01:30:18 And people use it for all sorts of things.

01:30:20 They confide in it about things like domestic violence or suicidal attempts or if they have challenges

01:30:30 at work.

01:30:31 I don’t know what that…

01:30:32 I don’t know if I’m…

01:30:33 I don’t know how I feel about that.

01:30:35 I think about that a lot.

01:30:36 Yeah.

01:30:36 I think, first of all, it's obviously the future, in my perspective.

01:30:40 Second of all, I think there’s a lot of trajectories that that becomes an exciting future, but

01:30:46 I think everyone should feel very uncomfortable about how much they know about the company,

01:30:52 about where the data is going, how the data is being collected.

01:30:56 Because I think, and this is one of the lessons of social media, that I think we should demand

01:31:01 full control and transparency of the data on those things.

01:31:04 Plus one, totally agree.

01:31:06 Yeah, so I think it’s really empowering as long as you can walk away, as long as you

01:31:11 can delete the data or know how the data…

01:31:14 It’s opt in or at least the clarity of what is being used for the company.

01:31:20 And I think the CEO or the leaders are also important in that.

01:31:24 You need to be able to trust the basic humanity of the leader.

01:31:28 Exactly.

01:31:28 And also that that leader is not going to be a puppet of a larger machine.

01:31:34 But they actually have a significant role in defining the culture and the way the company operates.

01:31:41 So anyway, but we should definitely scrutinize companies in that aspect.

01:31:48 But I’m personally excited about that future, but also even if you’re not, it’s coming.

01:31:55 So let’s figure out how to do it in the least painful and the most positive way.

01:32:00 Yeah, I know, that’s great.

01:32:01 You’re the deputy CEO of SmartEye.

01:32:04 Can you describe the mission of the company?

01:32:06 What is SmartEye?

01:32:07 Yeah, so SmartEye is a Swedish company.

01:32:10 They’ve been in business for the last 20 years and their main focus, like the industry they’re

01:32:16 most focused on is the automotive industry.

01:32:19 So bringing driver monitoring systems to basically save lives, right?

01:32:25 So I first met the CEO, Martin Krantz, gosh, it was right when COVID hit.

01:32:31 It was actually the last CES right before COVID.

01:32:35 So CES 2020, right?

01:32:37 2020, yeah, January.

01:32:39 Yeah, January, exactly.

01:32:40 So we were there, met him in person, and basically, we were competing with each other.

01:32:46 I think the difference was they’d been doing driver monitoring and had a lot of credibility

01:32:51 in the automotive space.

01:32:52 We didn’t come from the automotive space, but we were using new technology like deep

01:32:56 learning and building this emotion recognition.

01:33:00 And you wanted to enter the automotive space, you wanted to operate in the automotive space.

01:33:03 Exactly.

01:33:04 It was one of the areas we were, we had just raised a round of funding to focus on bringing

01:33:08 our technology to the automotive industry.

01:33:11 So we met and honestly, it was the first, it was the only time I met with a CEO who

01:33:16 had the same vision as I did.

01:33:18 Like he basically said, yeah, our vision is to bridge the gap between human and automotive.

01:33:21 Bridge the gap between humans and machines.

01:33:23 I was like, oh my God, this is like exactly almost to the word, how we describe it too.

01:33:29 And we started talking and first it was about, okay, can we align strategically here?

01:33:35 Like how can we work together?

01:33:36 Cause we’re competing, but we’re also like complimentary.

01:33:40 And then I think after four months of speaking almost every day on FaceTime, he was like,

01:33:47 is your company interested in an acquisition?

01:33:49 And it was the first, I usually say no, when people approach us, it was the first time

01:33:55 that I was like, huh, yeah, I might be interested.

01:33:58 Let’s talk.

01:33:59 Yeah.

01:34:00 So you just hit it off.

01:34:01 Yeah.

01:34:02 So they’re a respected, very respected in the automotive sector of like delivering products

01:34:08 that get increasingly better and better, I mean, maybe you could speak

01:34:14 to that, but it’s the driver’s sense.

01:34:15 If we’re basically having a device that’s looking at the driver and it’s able to tell

01:34:20 you where the driver is looking.

01:34:22 Correct.

01:34:22 It’s able to.

01:34:23 Also drowsiness stuff.

01:34:25 Correct.

01:34:25 It does.

01:34:25 Stuff from the face and the eye.

01:34:27 Exactly.

01:34:28 Like it’s monitoring driver distraction and drowsiness, but they bought us so that we

01:34:32 could expand beyond just the driver.

01:34:35 So the driver monitoring systems usually sit, the camera sits in the steering wheel or around

01:34:40 the steering wheel column and it looks directly at the driver.

01:34:42 But now we’ve migrated the camera position in partnership with car companies to the rear

01:34:48 view mirror position.

01:34:50 So it has a full view of the entire cabin of the car and you can detect how many people

01:34:55 are in the car, what are they doing?

01:34:57 So we do activity detection, like eating or drinking or in some regions of the world smoking.

01:35:04 We can detect if a baby’s in the car seat, right?

01:35:07 And if unfortunately in some cases they’re forgotten, the parents just leave the car and

01:35:12 forget the kid in the car.

01:35:14 That’s an easy computer vision problem to solve, right?

01:35:17 You can detect there’s a car seat, there’s a baby, you can text the parent and hopefully

01:35:22 again, save lives.

01:35:23 So that was the impetus for the acquisition.

01:35:27 It’s been a year.

01:35:29 So that, I mean, there’s a lot of questions.

01:35:31 It’s a really exciting space, especially to me, I just find this a fascinating problem.

01:35:36 It could enrich the experience in the car in so many ways, especially cause like we

01:35:42 spend still, despite COVID, I mean, COVID changed things in interesting ways,

01:35:46 but I think the world is bouncing back and we spend so much time in the car and the car

01:35:51 is such a weird little world we have for ourselves.

01:35:56 Like people do all kinds of different stuff, like listen to podcasts, they think about

01:36:01 stuff, they get angry, they get, they do phone calls, it’s like a little world of its own

01:36:09 with a kind of privacy that for many people they don’t get anywhere else.

01:36:15 And it’s a little box that’s like a psychology experiment cause it feels like the angriest

01:36:23 many humans in this world get is inside the car.

01:36:27 It’s so interesting.

01:36:28 So it’s such an opportunity to explore how we can enrich, how companies can enrich that

01:36:36 experience and also as the cars get, become more and more automated, there’s more and

01:36:43 more opportunity, the variety of activities that you can do in the car increases.

01:36:47 So it’s super interesting.

01:36:48 So I mean, in a practical sense, SmartEye has been selected, at least I read, by 14

01:36:56 of the world’s leading car manufacturers for 94 car models.

01:37:00 So it’s in a lot of cars.

01:37:03 How hard is it to work with car companies?

01:37:06 So they’re all different, they all have different needs.

01:37:10 The ones I’ve gotten a chance to interact with are very focused on cost.

01:37:16 So it’s, and anyone who’s focused on cost, it’s like, all right, do you hate fun?

01:37:24 Let’s just have some fun.

01:37:25 Let’s figure out the most fun thing we can do and then worry about cost later.

01:37:29 But I think because the way the car industry works, I mean, it’s a very thin margin that

01:37:35 you get to operate under.

01:37:36 So you have to really, really make sure that everything you add to the car makes sense

01:37:40 financially.

01:37:41 So anyway, does this new industry, especially at the scale of SmartEye, hold any

01:37:49 lessons for you?

01:37:50 Yeah, I think it is a very tough market to penetrate, but once you’re in, it’s awesome

01:37:56 because once you’re in, you’re designed into these car models for like somewhere between

01:38:00 five to seven years, which is awesome.

01:38:02 And you just, once they’re on the road, you just get paid a royalty fee per vehicle.

01:38:07 So it’s a high barrier to entry, but once you’re in, it’s amazing.

01:38:11 I think the thing that I struggle the most with in this industry is the time to market.

01:38:16 So often we’re asked to lock or do a code freeze two years before the car is going to

01:38:22 be on the road.

01:38:23 I’m like, guys, like, do you understand the pace with which technology moves?

01:38:28 So I think car companies are really trying to make the Tesla, the Tesla transition to

01:38:35 become more of a software driven architecture.

01:38:39 And that’s hard for many.

01:38:41 It’s just the cultural change.

01:38:42 I mean, I’m sure you’ve experienced that, right?

01:38:43 Oh, definitely, I think one of the biggest inventions or imperatives created by Tesla

01:38:51 is like to me personally, okay, people are going to complain about this, but I know electric

01:38:56 vehicle, I know autopilot AI stuff.

01:38:59 To me, over-the-air software updates are like the biggest revolution in cars.

01:39:06 And it is extremely difficult to switch to that because it is a culture shift.

01:39:12 At first, especially if you’re not comfortable with it, it seems dangerous.

01:39:17 Like there’s a, there’s an approach to cars is so safety focused for so many decades that

01:39:23 like, what do you mean we dynamically change code?

01:39:27 The whole point is you have a thing that you test, like, and like, it’s not reliable because

01:39:36 do you know how much it costs if we have to recall these cars, right?

01:39:41 There’s a, there’s a, and there’s an understandable obsession with safety, but the downside of

01:39:47 an obsession with safety is the same as with being obsessed with safety as a parent:

01:39:54 like, if you do that too much, you limit the potential development and the flourishing

01:40:00 of, in this particular aspect, the software,

01:40:04 the artificial neural network of it.

01:40:07 And but it’s tough to do.

01:40:09 It’s really tough to do culturally and technically like the deployment, the mass deployment of

01:40:14 software is really, really difficult, but I hope that's where the industry is going.

01:40:18 One of the reasons I really want Tesla to succeed is exactly about that point.

01:40:21 Not autopilot, not the electric vehicle, but the softwareization of basically everything,

01:40:28 but cars especially, because to me, that's actually going to increase two things: increase

01:40:33 safety because you can update much faster, but also increase the effectiveness of folks

01:40:40 like you who dream about enriching the human experience with AI because you can just like,

01:40:47 there’s a feature, like you want like a new emoji or whatever, like the way TikTok releases

01:40:51 filters, you can just release that for in-car stuff.

01:40:55 So, but yeah, that, that, that’s definitely.

01:40:59 One of the use cases we’re looking into is once you know the sentiment of the passengers

01:41:05 in the vehicle, you can optimize the temperature in the car.

01:41:08 You can change the lighting, right?

01:41:10 So if the backseat passengers are falling asleep, you can dim the lights, you can lower

01:41:14 the music, right?

01:41:15 You can do all sorts of things.

01:41:17 Yeah.

01:41:18 I mean, of course you could do that kind of stuff with a two year delay, but it’s tougher.

01:41:23 Right.

01:41:24 Yeah.

01:41:25 Do you think, do you think a Tesla or Waymo or some of these companies that are doing

01:41:30 semi or fully autonomous driving should be doing driver sensing?

01:41:35 Yes.

01:41:36 Are you thinking about that kind of stuff?

01:41:39 So not just how we can enhance the in-cab experience for cars that are manually driven,

01:41:43 but the ones that are increasingly more autonomously driven.

01:41:47 Yes.

01:41:48 So if we fast forward to the universe where it’s fully autonomous, I think interior sensing

01:41:53 becomes extremely important because the role of the driver isn’t just to drive.

01:41:57 If you think about it, the driver almost manages the dynamics within a vehicle.

01:42:02 And so who’s going to play that role when it’s an autonomous car?

01:42:06 We want a solution that is able to say, Oh my God, like, you know, Lex is bored to death

01:42:11 cause the car’s moving way too slow.

01:42:13 Let’s engage Lex or Rana’s freaking out because she doesn’t trust this vehicle yet.

01:42:18 So let’s tell Rana like a little bit more information about the route or, right?

01:42:22 So I think, or somebody’s having a heart attack in the car, like you need interior sensing

01:42:27 and fully autonomous vehicles.

01:42:29 But with semi autonomous vehicles, I think it’s, I think it’s really key to have driver

01:42:34 monitoring because semi autonomous means that sometimes the car is in charge.

01:42:39 Sometimes the driver is in charge or the copilot, right?

01:42:41 And you need this, you need both systems to be on the same page.

01:42:44 You need to know, the car needs to know, if the driver's asleep before it transitions

01:42:49 control over to the driver.

01:42:51 And sometimes if the driver’s too tired, the car can say, I’m going to be a better driver

01:42:56 than you are right now.

01:42:57 I’m taking control over.

01:42:58 So this dynamic, this dance is so key and you can’t do that without driver sensing.
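
A toy sketch of that handoff "dance," not any automaker's logic: the vehicle only returns control when driver monitoring says the human is ready, and keeps or takes control when the human is clearly impaired. States, fields, and thresholds are all assumptions.

    # Toy handoff logic for a semi-autonomous vehicle (invented thresholds):
    # driver state comes from an assumed driver-monitoring system.

    def decide_control(current, driver):
        if driver["asleep"] or driver["drowsiness"] > 0.8:
            return "car"      # "I'm going to be a better driver than you right now"
        if current == "car" and driver["eyes_on_road"] and driver["drowsiness"] < 0.3:
            return "driver"   # safe to hand control back to the human
        return current        # otherwise keep things as they are

    print(decide_control("car", {"asleep": False, "drowsiness": 0.1, "eyes_on_road": True}))
    print(decide_control("driver", {"asleep": True, "drowsiness": 0.9, "eyes_on_road": False}))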

01:43:03 Yeah.

01:43:04 There’s a disagreement for the longest time I’ve had with Elon that this is obvious that

01:43:07 this should be in the Tesla from day one.

01:43:10 And it’s obvious that driver sensing is not a hindrance.

01:43:13 It’s not obvious.

01:43:15 I should be careful because having studied this problem, nothing is really obvious, but

01:43:22 it seems very likely that driver sensing is not a hindrance to the experience.

01:43:26 It’s only enriching to the experience and likely increases the safety.

01:43:34 That said, it is very surprising to me just having studied semi autonomous driving, how

01:43:42 well humans are able to manage that dance because it was the intuition before you were

01:43:47 doing that kind of thing that humans will become just incredibly distracted.

01:43:54 They would just like let the thing do its thing, but they’re able to, you know, cause

01:43:57 it is life and death and they’re able to manage that somehow.

01:44:01 But that said, there’s no reason not to have driver sensing on top of that.

01:44:04 I feel like that’s going to allow you to do that dance that you’re currently doing without

01:44:11 driver sensing, except touching the steering wheel to do that even better.

01:44:15 I mean, the possibilities are endless and the machine learning possibilities are endless.

01:44:20 It’s such a beautiful, it’s also a constrained environment so you could do a much more effectively

01:44:26 than you can with the external environment; the external environment is full of weird edge

01:44:31 cases and complexities just inside.

01:44:33 There’s so much, it’s so fascinating, such a fascinating world.

01:44:36 I do hope that companies like Tesla and others, even Waymo, which I don’t even know if Waymo

01:44:44 is doing anything sophisticated inside the cab.

01:44:46 I don’t think so.

01:44:47 It’s like, like what, what, what is it?

01:44:51 I honestly think, I honestly think it goes back to the robotics thing we were talking

01:44:55 about, which is like great engineers that are building these AI systems just are afraid

01:45:02 of the human being.

01:45:03 They’re not thinking about the human experience, they’re thinking about the features and yeah,

01:45:08 the perceptual abilities of that thing.

01:45:10 They think the best way I can serve the human is by doing the best perception and control

01:45:16 I can by looking at the external environment, keeping the human safe.

01:45:20 But like, there’s a huge, I’m here, like, you know, I need to be noticed and interacted

01:45:31 with and understood and all those kinds of things, even just on a personal level for

01:45:34 entertainment, honestly, for entertainment.

01:45:38 You know, one of the coolest pieces of work we did in collaboration with MIT around this was we

01:45:42 looked at longitudinal data, right, because, you know, MIT had access to like tons of data.

01:45:52 And like just seeing the patterns of people like driving in the morning off to work versus

01:45:57 like commuting back from work or weekend driving versus weekday driving.

01:46:02 And wouldn’t it be so cool if your car knew that and then was able to optimize either

01:46:08 the route or the experience or even make recommendations?

01:46:12 I think it’s very powerful.

01:46:13 Yeah, like, why are you taking this route?

01:46:15 You’re always unhappy when you take this route.

01:46:18 And you’re always happy when you take this alternative route.

01:46:20 Take that route.

01:46:21 Exactly.

01:46:22 But I mean, to have that even that little step of relationship with a car, I think,

01:46:27 is incredible.
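
Here is a minimal sketch of the route idea just described: log an affect score per trip and suggest the route with the better history. The data, the 0 to 1 “happiness” scale, and the function are invented for illustration; in a real system the score would come from in-cabin affect sensing aggregated over each trip.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical log of (route, happiness_score) pairs on a made-up 0..1 scale.
trip_log = [
    ("highway", 0.35), ("highway", 0.40), ("highway", 0.30),
    ("river_road", 0.72), ("river_road", 0.68),
]

def recommend_route(log):
    """Suggest the route with the highest average historical affect score."""
    scores = defaultdict(list)
    for route, score in log:
        scores[route].append(score)
    return max(scores, key=lambda r: mean(scores[r]))

print(recommend_route(trip_log))  # -> "river_road"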

01:46:28 Of course, you have to get the privacy right, you have to get all that kind of stuff right.

01:46:32 But I wish I honestly, you know, people are like paranoid about this, but I would like

01:46:37 a smart refrigerator.

01:46:39 We have such a deep connection with food as a human civilization.

01:46:44 I would like to have a refrigerator that would understand me that, you know, I also have

01:46:51 a complex relationship with food because I, you know, pig out too easily and all that

01:46:56 kind of stuff.

01:46:57 So, you know, like, maybe I want the refrigerator to be like, are you sure about this?

01:47:02 Because maybe you’re just feeling down or tired.

01:47:05 Like maybe let’s sleep on it.

01:47:06 Your vision of the smart refrigerator is way kinder than mine.

01:47:10 Is it just me yelling at you?

01:47:11 No, it was just because I don’t, you know, I don’t drink alcohol, I don’t smoke, but

01:47:18 I eat a ton of chocolate, it’s like my vice.

01:47:22 And so I, and sometimes I scream too, and I’m like, okay, my smart refrigerator will

01:47:26 just lock down.

01:47:27 It’ll just say, dude, you’ve had way too many today, like down.

01:47:32 Yeah.

01:47:33 No, but here’s the thing, do you regret it? Like, let’s say not the next

01:47:41 day, but 30 days later, what would you like the refrigerator to have done then?

01:47:48 Well, I think actually like the more positive relationship would be one where there’s a

01:47:54 conversation, right?

01:47:55 As opposed to like, that’s probably like the more sustainable relationship.

01:48:00 It’s like late at night, just, no, listen, listen, I know I told you an hour ago, that

01:48:06 it’s not a good idea, but just listen, things have changed.

01:48:09 I can just imagine a bunch of stuff being made up just to convince it. But I mean, I just

01:48:17 think that there are opportunities there, maybe not locking down, but for systems

01:48:22 that are such a deep part of our lives. A lot of us, a lot of people that commute,

01:48:32 use their car every single day.

01:48:34 A lot of us use a refrigerator every single day, the microwave every single day.

01:48:38 I feel like certain things could be made more efficient, more enriching,

01:48:47 and AI is there to help, like some just basic recognition of you as a human being, and your

01:48:54 patterns of what makes you happy and not happy and all that kind of stuff.

01:48:57 And the car, obviously.

01:48:58 Maybe, maybe, maybe we’ll say, wait, wait, wait, wait, instead of this, like, Ben and

01:49:05 Jerry’s ice cream, how about this hummus and carrots or something?

01:49:09 I don’t know.

01:49:10 It would make it like a just in time recommendation, right?

01:49:14 But not like a generic one, but a reminder that last time you chose the carrots, you

01:49:21 smiled 17 times more the next day.

01:49:24 You’re happier the next day, right?

01:49:26 You’re happier the next day.

01:49:28 But then again, if you’re the kind of person that gets better

01:49:34 from negative comments, you could say like, hey, remember that wedding

01:49:40 you’re going to, you want to fit into that dress?

01:49:43 Remember about that?

01:49:44 Let’s think about that before you’re eating this.

01:49:48 For some people, probably. That would work for me, like a refrigerator that is just ruthless

01:49:53 at shaming me.

01:49:54 But like, I would, of course, welcome it, like that would work for me.

01:49:59 Just that.

01:50:00 So it would know, I think it would, if it’s really like smart, it would optimize its nudging

01:50:05 based on what works for you, right?

01:50:07 Exactly.

01:50:08 That’s the whole point.

01:50:09 Personalization.

01:50:10 In every way, deep personalization.
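
A minimal sketch of “optimize its nudging based on what works for you”: a simple epsilon-greedy choice between nudging styles, updated from whether the nudge worked. The styles, the binary “worked” outcome, and all numbers are made up for illustration; this is just one standard way to personalize, not anything either company actually does.

```python
import random

class NudgePersonalizer:
    """Epsilon-greedy sketch: learn which nudging style works for a given person."""

    def __init__(self, styles=("gentle", "tough_love"), epsilon=0.1):
        self.epsilon = epsilon
        self.successes = {s: 0 for s in styles}
        self.attempts = {s: 0 for s in styles}

    def choose_style(self):
        if random.random() < self.epsilon:
            return random.choice(list(self.successes))  # explore occasionally
        # Exploit: pick the style with the best observed success rate so far.
        return max(self.successes,
                   key=lambda s: self.successes[s] / max(1, self.attempts[s]))

    def record_outcome(self, style, worked: bool):
        # 'worked' = did the person make the healthier choice after the nudge.
        self.attempts[style] += 1
        self.successes[style] += int(worked)

personalizer = NudgePersonalizer()
style = personalizer.choose_style()
personalizer.record_outcome(style, worked=True)
```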

01:50:11 You were a part of a webinar titled Advancing Road Safety: The State of Alcohol Intoxication

01:50:18 Research.

01:50:19 So for people who don’t know, every year 1.3 million people around the world die in road

01:50:24 crashes and more than 20% of these fatalities are estimated to be alcohol related.

01:50:31 A lot of them are also distraction related.

01:50:33 So can AI help with the alcohol thing?

01:50:36 I think the answer is yes.

01:50:40 There are signals and we know that as humans, like we can tell when a person, you know,

01:50:46 is at different phases of being drunk, right?

01:50:51 And I think you can use technology to do the same.

01:50:53 And again, I think the ultimate solution is going to be a combination of different sensors.
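
As a rough illustration of “a combination of different sensors,” here is a minimal late-fusion sketch that averages per-sensor impairment scores with weights. The sensor names, scores, and weights are hypothetical; a production system would be trained on labeled data and calibrated per sensor rather than hand-weighted.

```python
def fuse_intoxication_scores(sensor_scores, weights=None):
    """Combine per-sensor impairment scores (each in 0..1) via a weighted average."""
    if weights is None:
        weights = {name: 1.0 for name in sensor_scores}
    total_weight = sum(weights[name] for name in sensor_scores)
    return sum(sensor_scores[name] * weights[name] for name in sensor_scores) / total_weight

# Illustrative only: camera-based gaze, steering behavior, and a breath sensor.
score = fuse_intoxication_scores(
    {"camera_gaze": 0.6, "steering_entropy": 0.7, "breath_sensor": 0.9},
    weights={"camera_gaze": 1.0, "steering_entropy": 1.0, "breath_sensor": 2.0},
)
print(round(score, 2))  # -> 0.78
```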

01:50:58 How hard is the problem from the vision perspective?

01:51:01 I think it’s non trivial.

01:51:02 I think it’s non trivial and I think the biggest part is getting the data, right?

01:51:06 It’s like getting enough data examples.

01:51:09 So we, for this research project, we partnered with the transportation authorities of Sweden

01:51:15 and we literally had a racetrack with a safety driver and we basically progressively got

01:51:20 people drunk.

01:51:21 Nice.

01:51:22 So, but, you know, that’s a very expensive data set to collect and you want to collect

01:51:29 it globally and in multiple conditions.

01:51:32 Yeah.

01:51:33 The ethics of collecting a data set where people are drunk is tricky, which is funny

01:51:38 because I mean, let’s put drunk driving aside.

01:51:43 The number of drunk people in the world every day is very large.

01:51:47 It’d be nice to have a large data set of drunk people getting progressively drunk.

01:51:50 In fact, you could build an app where people can donate their data because it’s hilarious.

01:51:54 Right.

01:51:55 Actually, yeah.

01:51:56 But the liability.

01:51:57 Liability, the ethics, how do you get it right?

01:52:00 It’s tricky.

01:52:01 It’s really, really tricky.

01:52:02 Because drinking is one of those things that’s funny and hilarious and everyone loves,

01:52:07 it’s social, and so on and so forth.

01:52:10 But it’s also the thing that hurts a lot of people.

01:52:13 Like a lot of people. Alcohol is one of those things that’s legal, but it’s really

01:52:19 damaging to a lot of lives.

01:52:21 It destroys lives and not just in the driving context.

01:52:26 I should mention people should listen to Andrew Huberman who recently talked about alcohol.

01:52:32 He has an amazing podcast.

01:52:33 Andrew Huberman is a neuroscientist from Stanford and a good friend of mine.

01:52:37 And he, he’s like a human encyclopedia about all health related wisdom.

01:52:43 So his podcast, you would love it.

01:52:45 I would love that.

01:52:46 No, no, no, no, no.

01:52:47 You don’t know Andrew Huberman.

01:52:49 Okay.

01:52:50 Listen, you listen to Andrew, it’s called Huberman Lab Podcast.

01:52:54 This is your assignment.

01:52:55 Just listen to one.

01:52:56 Okay.

01:52:57 I guarantee you this will be a thing where you say, Lex, this is the greatest human I

01:53:01 have ever discovered.

01:53:02 So.

01:53:03 Oh my God.

01:53:04 Because I’m really on a journey of kind of health and wellness and

01:53:08 I’m learning lots and I’m trying to like build these, I guess, atomic habits around just

01:53:13 being healthy.

01:53:14 So I, yeah, I’m definitely going to do this.

01:53:17 His whole thing, this is great.

01:53:21 He’s a legit scientist, like really well published, but in his podcast, what he does,

01:53:30 he’s not talking about his own work.

01:53:31 He’s like a human encyclopedia of papers.

01:53:34 And so his whole thing is he takes a topic and, in a very fast, you mentioned atomic

01:53:39 habits, very clear way, summarizes the research in a way that leads to protocols

01:53:46 of what you should do.

01:53:47 He’s really big on like, not like this is what the science says, but like this is literally

01:53:52 what you should be doing according to science.

01:53:54 So he’s really big on that, and there are a lot of recommendations he makes, several of

01:54:01 which I definitely don’t do, like get some light as soon as possible after waking up and

01:54:08 for prolonged periods of time.

01:54:11 That’s a really big one and he’s, there’s a lot of science behind that one.

01:54:14 There’s a bunch of stuff that you’re going to be like, Lex, this is a, this is my new

01:54:19 favorite person.

01:54:20 I guarantee it.

01:54:21 And if you guys somehow don’t know Andrew Huberman and you care about your wellbeing,

01:54:27 you know, you should definitely listen to him.

01:54:29 I love you, Andrew.

01:54:31 Anyway, so what were we talking about?

01:54:36 Oh, alcohol and detecting alcohol.

01:54:39 So this is a problem you care about and you’re trying to solve.

01:54:42 And actually like broadening it, I do believe that the car is going to be a wellness center,

01:54:48 like because again, imagine if you have a variety of sensors inside the vehicle, tracking

01:54:55 not just your emotional state or level of distraction and drowsiness and intoxication,

01:55:03 but also maybe even things like your, you know, your heart rate and your heart rate

01:55:09 variability and your breathing rate.

01:55:13 And it can start like optimizing, yeah, it can optimize the ride based on what your goals

01:55:19 are.

01:55:20 So I think we’re going to start to see more of that and I’m excited about that.
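
A minimal sketch of the “car as wellness center” idea: map a few sensed signals to cabin adjustments toward a stated goal. The thresholds, signal names, and actions are invented placeholders; real biometric sensing needs per-person baselines and careful validation before anything like this could be trusted.

```python
def plan_cabin_adjustments(heart_rate, hrv_ms, drowsiness, goal="relax"):
    """Toy planner: turn heart rate, HRV, and drowsiness into cabin actions."""
    actions = []
    stressed = heart_rate > 95 and hrv_ms < 30   # crude, made-up stress heuristic
    if stressed and goal == "relax":
        actions += ["play calming playlist", "suggest slower scenic route"]
    if drowsiness > 0.7:                          # drowsiness on a made-up 0..1 scale
        actions += ["increase cabin airflow", "suggest a coffee or rest stop"]
    if not actions:
        actions.append("no change")
    return actions

print(plan_cabin_adjustments(heart_rate=102, hrv_ms=22, drowsiness=0.8))
```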

01:55:24 Yeah.

01:55:25 What are the challenges you’re tackling with SmartEye currently?

01:55:28 What are the trickiest things to get right? Is it basically convincing more and

01:55:34 more car companies that having AI inside the car is a good idea, or are there

01:55:41 more technical, algorithmic challenges?

01:55:45 What’s been keeping you mentally busy?

01:55:47 I think a lot of the car companies we are in conversations with are already interested

01:55:52 in driver monitoring, definitely.

01:55:54 Like I think it’s becoming a must-have, and even for interior sensing, I can see we’re

01:55:59 engaged in a lot of like advanced engineering projects and proof of concepts.

01:56:04 Technologically though, even with the technology, I can see a path to making

01:56:09 it happen.

01:56:10 I think it’s the use case.

01:56:11 Like how does the car respond once it knows something about you?

01:56:16 Because you want it to respond in a thoughtful way that isn’t off-putting to

01:56:20 the consumer in the car.

01:56:23 So I think that’s like the user experience.

01:56:25 I don’t think we’ve really nailed that.

01:56:27 And that’s usually not our part, we’re the sensing platform, but we usually collaborate

01:56:33 with the car manufacturer to decide what the use case is.

01:56:35 So say you figure out that somebody’s angry while driving, okay, what should the

01:56:40 car do?

01:56:43 Do you see your role as nudging, of basically coming up with solutions,

01:56:50 and then the car manufacturers kind of put their own little spin on it?

01:56:56 Right.

01:56:57 So we, we are like the ideation, creative thought partner, but at the end of the day,

01:57:03 the car company needs to decide what’s on brand for them, right?

01:57:06 Like maybe when it figures out that you’re distracted or drowsy, it shows you a coffee

01:57:11 cup, right?

01:57:12 Or maybe it takes more aggressive actions and basically says, okay, if you don’t

01:57:16 take a rest in the next five minutes, the car’s going to shut down, right?

01:57:19 Like there’s a whole range of actions the car can take, and doing the thing that

01:57:25 builds the most trust with the driver and the passengers,

01:57:29 I think that’s what we need to be very careful about.
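
To illustrate that “whole range of actions,” here is a minimal escalation-ladder sketch loosely following the examples in the conversation, from a coffee-cup icon up to requiring a rest. The thresholds, timings, and intervention strings are arbitrary placeholders, not any OEM’s actual policy.

```python
def pick_intervention(drowsiness, seconds_since_last_warning):
    """Map a 0..1 drowsiness estimate to an escalating in-cabin response."""
    if drowsiness < 0.3:
        return "none"
    if drowsiness < 0.6:
        return "show coffee-cup icon"
    if drowsiness < 0.85:
        return "audio warning + seat vibration"
    # Highest tier: only escalate further if earlier warnings were ignored for a while.
    if seconds_since_last_warning > 300:
        return "require a rest stop within 5 minutes"
    return "audio warning + seat vibration"

print(pick_intervention(0.9, seconds_since_last_warning=400))
```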

01:57:32 Yeah.

01:57:33 Car companies are funny because they have their own brand, I mean, that’s why people get cars

01:57:38 still.

01:57:39 I hope that changes, but they get it because it’s a certain feel and look, and

01:57:44 they become proud, like Mercedes-Benz or BMW or whatever, and that’s their thing.

01:57:51 That’s the family brand or something like that, or Ford or GM, whatever, they stick

01:57:56 to that thing.

01:57:57 Yeah.

01:57:58 It’s interesting.

01:57:59 It’s like, it should be, I don’t know, it should be a little more about the technology

01:58:04 inside.

01:58:06 And I suppose there too, there could be a branding, like a very specific style of luxury

01:58:12 or fun.

01:58:13 Right.

01:58:14 Right.

01:58:15 All that kind of stuff.

01:58:16 Yeah.

01:58:17 And I have an AI-focused fund to invest in early-stage, kind of AI-driven companies.

01:58:22 And one of the companies we’re looking at is trying to do what Tesla did, but for boats,

01:58:27 for recreational boats.

01:58:28 Yeah.

01:58:29 So they’re building an electric and kind of slash autonomous boat and it’s kind of the

01:58:34 same issues.

01:58:35 Like what kind of sensors can you put in?

01:58:38 What kind of states can you detect both exterior and interior within the boat?

01:58:43 Anyways, it’s like really interesting.

01:58:45 Do you boat at all?

01:58:46 No, not well, not in that way.

01:58:49 I do like to get on the lake or a river and fish from a boat, but that’s not boating.

01:58:57 That’s the difference.

01:58:58 That’s the difference.

01:58:59 Still boating.

01:59:00 Low tech.

01:59:01 A low tech boat.

01:59:02 Get away from, get closer to nature boat.

01:59:04 I guess going out into the ocean is also getting closer to nature in some deep sense.

01:59:12 I mean, I guess that’s why people love it.

01:59:15 The enormity of the water just underneath you.

01:59:18 Yeah.

01:59:19 I love the water.

01:59:20 I love the, I love both.

01:59:22 I love salt water.

01:59:23 It’s like the bigness of it, and it’s humbling to be in front of this giant thing that’s

01:59:28 so powerful, that was here before us and will be here after us.

01:59:31 But I also love the peace of a small wooded lake, where everything’s

01:59:37 calm.

01:59:38 Therapeutic.

01:59:39 You tweeted that I’m excited about Amazon’s acquisition of iRobot.

01:59:49 I think it’s a super interesting, just given the trajectory of what you’re part of, of

01:59:54 these honestly small number of companies that are playing in this space that are like trying

02:00:00 to have an impact on human beings.

02:00:02 So the, it is an interesting moment in time that Amazon would acquire iRobot.

02:00:09 You tweeted, I imagine a future where home robots are as ubiquitous as microwaves or toasters.

02:00:16 Here are three reasons why I think this is exciting.

02:00:18 If you remember, I can look it up, but what, why is this exciting to you?

02:00:23 I mean, I think the first reason why this is exciting, I kind of remember the exact

02:00:27 order in which I put them, but one is just, it’s going to be an incredible

02:00:33 platform for understanding our behaviors within the home, right?

02:00:37 Like you know, if you think about Roomba, which is, you know, the robot vacuum cleaner,

02:00:42 the flagship product of iRobot at the moment, it’s like running around your home, understanding

02:00:48 the layout, it’s understanding what’s clean and what’s not.

02:00:51 How often do you clean your house?

02:00:52 And all of these like behaviors are a piece of the puzzle in terms of understanding who

02:00:57 you are as a consumer.

02:00:58 And I think that could be, again, used in really meaningful ways, not just to recommend

02:01:05 better products or whatever, but actually to improve your experience as a human being.

02:01:09 So I think, I think that’s very interesting.

02:01:12 I think the natural evolution of these robots in the, in the home.

02:01:18 So it’s, it’s interesting, Roomba isn’t really a social robot, right, at the moment.

02:01:24 But I once interviewed one of the chief engineers on the Roomba team, and he talked about how

02:01:29 people named their Roombas.

02:01:31 And if the Roomba broke down, they would call in and say, you know, my Roomba broke down

02:01:36 and the company would say, well, we’ll just send you a new one.

02:01:38 And no, no, no, Rosie, like you have to like, yeah, I want you to fix this particular robot.

02:01:45 So people have already built like interesting emotional connections with these home robots.

02:01:51 And I think that, again, that provides a platform for really interesting things to, to just

02:01:57 motivate change.

02:01:58 Like it could help you.

02:01:59 I mean, one of the companies that spun out of MIT, Catalia Health, the guy who started

02:02:05 it spent a lot of time building robots that help with weight management.

02:02:09 So weight management, sleep, eating better, yeah, all of these things.

02:02:14 Well, if I’m being honest, Amazon does not exactly have a track record of winning over

02:02:20 people in terms of trust.

02:02:22 Now that said, it’s a really difficult problem for a human being to let a robot in their

02:02:27 home that has a camera on it.

02:02:30 Right.

02:02:31 That’s really, really, really tough.

02:02:33 And I think Roomba actually, I have to think about this, but I’m pretty sure now or for

02:02:40 some time already has had cameras, because of what they’re doing with the most recent Roomba.

02:02:46 I have so many Roombas.

02:02:47 Oh, you actually do?

02:02:48 Well, I programmed it.

02:02:49 I don’t use a Roomba for vacuuming.

02:02:51 People that have been to my place, they’re like, yeah, you definitely don’t use these

02:02:54 Roombas.

02:02:55 That could be a good, I can’t tell like the valence of this comment.

02:03:00 Was it a compliment or like?

02:03:02 No, it’s a giant, it’s just a bunch of electronics everywhere.

02:03:05 There’s, I have six or seven computers, I have robots everywhere, Lego robots, I have

02:03:11 small robots and big robots and it’s just giant, just piles of robot stuff and yeah.

02:03:20 But including the Roombas, they’re being used for their body and intelligence,

02:03:25 but not for their purpose.

02:03:26 I’ve changed them, repurposed them for other purposes, for deeper, more meaningful

02:03:33 purposes than just being a robot vacuum, which, you know, brings a lot of people happiness,

02:03:39 I’m sure.

02:03:41 They have a camera because of the thing they advertised, I have my own cameras still, but

02:03:46 the camera on the new Roomba, they have state-of-the-art poop detection,

02:03:52 as they advertised, which is very difficult; apparently it’s a big problem for vacuum

02:03:56 cleaners, you know, if they go over dog poop, it just runs it over

02:04:01 and creates a giant mess.

02:04:02 And apparently they collected a huge amount of data of different shapes

02:04:08 and looks of poop, and now they’re able to avoid it and so on.

02:04:12 They’re very proud of this.

02:04:14 So there is a camera, but you don’t think of it as having a camera.

02:04:19 Yeah.

02:04:20 You don’t think of it as having a camera because you’ve grown to trust that, I guess, because

02:04:24 our phones, at least most of us seem to trust this phone, even though there’s a camera looking

02:04:31 directly at you.

02:04:33 I think that if you trust that the company is taking security very seriously, I actually

02:04:41 don’t know how that trust was earned with smartphones, I think it just started to provide

02:04:46 a lot of positive value to your life where you just took it in and then the company over

02:04:51 time has shown that it takes privacy very seriously, that kind of stuff.

02:04:55 But Amazon has not always, in its social robots, communicated that

02:05:01 this is a trustworthy thing, both in terms of culture and competence, because I think

02:05:07 privacy is not just about what you intend to do, but also how good you are

02:05:12 at doing that kind of thing.

02:05:14 So that’s a really hard problem to solve.

02:05:16 But I mean, but a lot of us have Alexas at home and I mean, Alexa could be listening

02:05:22 in the whole time, right?

02:05:24 And doing all sorts of nefarious things with the data.

02:05:27 Yeah.

02:05:28 Hopefully it’s not, but I don’t think it is.

02:05:32 But you know, it’s such a tricky thing for a company to get right, which

02:05:36 is like to earn the trust.

02:05:38 I don’t think Alexa’s earned people’s trust quite yet.

02:05:41 Yeah.

02:05:42 I think it’s, it’s not there quite yet.

02:05:44 I agree.

02:05:45 They struggle with this kind of stuff.

02:05:46 In fact, when these topics are brought up, people always get nervous.

02:05:50 And I think if you get nervous about it, that means the way to earn people’s trust

02:05:57 is not by like, Ooh, don’t talk about this.

02:06:00 It’s just be open, be frank, be transparent, and also create a culture where it

02:06:05 radiates at every level, from engineer to CEO, that you’re good people that have a common

02:06:17 sense idea of what it means to respect basic human rights and the privacy of people and

02:06:23 all that kind of stuff.

02:06:24 And I think that propagates throughout, and that’s the best PR, which is like over time

02:06:30 you understand that these are good folks doing good things.

02:06:34 Anyway, speaking of social robots, have you heard about Tesla Bot, the humanoid

02:06:42 robot?

02:06:43 Yes, I have.

02:06:44 Yes, yes, yes.

02:06:45 But I don’t exactly know what it’s designed to do.

02:06:48 You probably do.

02:06:49 No, I know what it’s designed to do, but I have a different perspective on it. It’s designed

02:06:54 to, it’s a humanoid form and it’s designed for automation tasks in the same way that

02:07:02 industrial robot arms automate tasks in the factory.

02:07:06 So it’s designed to automate tasks in the factory.

02:07:08 But I think that humanoid form, as we were talking about before, is one that we connect

02:07:18 with as human beings.

02:07:19 Anything legged, obviously, but the humanoid form especially, we anthropomorphize it most

02:07:25 intensely.

02:07:26 And so the possibility to me, it’s exciting to see both Atlas developed by Boston Dynamics

02:07:34 and anyone, including Tesla, trying to make humanoid robots cheaper and more effective.

02:07:43 The obvious way it transforms the world is social robotics to me versus automation of

02:07:51 tasks in the factory.

02:07:53 So yeah, I just wanted, in case that was something you were interested in, because I find its

02:07:58 application of social robotics super interesting.

02:08:01 We did a lot of work with Pepper, Pepper the robot, a while back.

02:08:06 We were like the emotion engine for Pepper, which is Softbank’s humanoid robot.

02:08:11 How tall is Pepper?

02:08:12 It’s like…

02:08:13 Yeah, like, I don’t know, like five foot maybe, right?

02:08:18 Yeah.

02:08:19 Yeah.

02:08:20 Pretty, pretty big.

02:08:21 Pretty big.

02:08:22 It’s designed to be at like airport lounges and, you know, retail stores, mostly customer

02:08:28 service, right?

02:08:30 Hotel lobbies, and I mean, I don’t know where the state of the robot is, but I think it’s

02:08:37 very promising.

02:08:38 I think there are a lot of applications where this can be helpful.

02:08:40 I’m also really interested in, yeah, social robotics for the home, right?

02:08:45 Like that can help elderly people, for example, transport things from one location of the

02:08:50 home to the other, or even just have your back in case something happens.

02:08:55 Yeah, I don’t know.

02:08:58 I do think it’s a very interesting space.

02:08:59 It seems early though.

02:09:00 Do you feel like the timing is now?

02:09:04 Yes, 100%.

02:09:09 So it always seems early until it’s not, right?

02:09:12 Right, right, right.

02:09:13 I think the time, I definitely think that the time is now, like this decade for social

02:09:24 robots.

02:09:25 Whether the humanoid form is right, I don’t think so, no.

02:09:29 I don’t, I think the, like if we just look at Jibo as an example, I feel like most of

02:09:40 the problem, the challenge, the opportunity of social connection between an AI system

02:09:46 and a human being does not require you to also solve the problem of robot manipulation

02:09:52 and bipedal mobility.

02:09:55 So I think you could do that with just a screen, honestly, but there’s something about the

02:09:59 interface of Jibo where it can rotate and so on that’s also compelling.

02:10:03 But you get to see all these robot companies that fail, incredible companies like Jibo

02:10:09 and even, I mean, the iRobot in some sense is a big success story that it was able to

02:10:17 find a niche thing and focus on it, but in some sense it’s not a success story because

02:10:24 they didn’t build any other robot, like any other, it didn’t expand into all kinds of

02:10:30 robotics.

02:10:31 Like once you’re in the home, maybe that’s what happens with Amazon is they’ll flourish

02:10:34 into all kinds of other robots.

02:10:37 But do you have a sense, by the way, why it’s so difficult to build a robotics company?

02:10:43 Like why so many companies have failed?

02:10:47 I think it’s like you’re building a vertical stack, right?

02:10:50 Like you are building the hardware plus the software and you find you have to do this

02:10:54 at a cost that makes sense.

02:10:56 So I think Jibo was retailing at like, I don’t know, like $800, like $700, $800, which for

02:11:05 the use case, right, there’s a dissonance there.

02:11:10 It’s too high.

02:11:11 So I think cost of building the whole platform in a way that is affordable for what value

02:11:20 it’s bringing, I think that’s a challenge.

02:11:23 I think for these home robots that are going to help you do stuff around the home, that’s

02:11:30 a challenge too, like the mobility piece of it.

02:11:33 That’s hard.

02:11:34 Well, one of the things I’m really excited with Tesla Bot is the people working on it.

02:11:40 And that’s probably the criticism I would apply to some of the other folks who worked

02:11:44 on social robots is the people working on Tesla Bot know how to, they’re focused on

02:11:50 and know how to do mass manufacture and create a product that’s super cheap.

02:11:54 Very cool.

02:11:55 That’s the focus.

02:11:56 The engineering focus isn’t, I would say that you can also criticize them for that, is they’re

02:12:00 not focused on the experience of the robot.

02:12:03 They’re focused on how to get this thing to do the basic stuff that the humanoid form

02:12:09 requires to do it as cheap as possible.

02:12:13 So, the fewest number of actuators, the fewest number of motors, increasing efficiency,

02:12:18 decreasing the weight, all that kind of stuff.

02:12:20 So that’s really interesting.

02:12:21 I would say that Jibo and all those folks, they focus on the design, the experience,

02:12:26 all of that, and it’s secondary how to manufacture.

02:12:29 Right.

02:12:30 So you have to think like the Tesla Bot folks from first principles, what is the fewest

02:12:36 number of components, the cheapest components, how can I build it as much in house as possible

02:12:41 without having to consider all the complexities of a supply chain, all that kind of stuff.

02:12:47 It’s interesting.

02:12:48 Because if you have to build a robotics company, you’re not building one robot, you’re building

02:12:54 hopefully millions of robots, you have to figure out how to do that where the final

02:12:58 thing, I mean, if it’s Jibo type of robot, is there a reason why Jibo, like we can have

02:13:04 this lengthy discussion, is there a reason why Jibo has to be over $100?

02:13:08 It shouldn’t be.

02:13:09 Right.

02:13:10 Like the basic components.

02:13:11 Right.

02:13:12 Components of it.

02:13:13 Right.

02:13:14 Like you could start to actually discuss like, okay, what is the essential thing about Jibo?

02:13:19 How much, what is the cheapest way I can have a screen?

02:13:21 What’s the cheapest way I can have a rotating base?

02:13:23 Right.

02:13:24 All that kind of stuff.

02:13:25 Right, continuously drive down costs.

02:13:29 Speaking of which, you have launched extremely successful companies, you have helped others,

02:13:35 you’ve invested in companies.

02:13:37 Can you give advice on how to start a successful company?

02:13:44 I would say have a problem that you really, really, really want to solve, right?

02:13:48 Something that you’re deeply passionate about.

02:13:53 And honestly, take the first step.

02:13:55 Like that’s often the hardest.

02:13:58 And don’t overthink it.

02:13:59 Like, you know, like this idea of a minimum viable product or a minimum viable version

02:14:04 of an idea, right?

02:14:05 Like, yes, you’re thinking about this, like a humongous, like super elegant, super beautiful

02:14:09 thing.

02:14:10 What, like reduce it to the littlest thing you can bring to market that can solve a problem

02:14:14 or that can, you know, that can help address a pain point that somebody has.

02:14:20 They often tell you, like, start with a customer of one, right?

02:14:24 If you can solve a problem for one person, then there’s probably going to be yourself

02:14:28 or some other person.

02:14:29 Right.

02:14:30 Pick a person.

02:14:31 Exactly.

02:14:32 It could be you.

02:14:33 Yeah, that’s actually often a good sign, if you enjoy a thing where

02:14:37 you have a specific problem that you’d like to solve, that’s a good n

02:14:41 of one to focus on.

02:14:43 What else is there? Actually, step one is the hardest, but there are other

02:14:49 steps as well, right?

02:14:51 I also think like who you bring around the table early on is so key, right?

02:14:58 Like being clear on, on what I call like your core values or your North Star.

02:15:02 It might sound fluffy, but actually it’s not.

02:15:04 So Roz and I, I feel like we did that very early on.

02:15:08 We sat around her kitchen table and we said, okay, there’s so many applications of this

02:15:13 technology.

02:15:14 How are we going to draw the line?

02:15:15 How are we going to set boundaries?

02:15:16 We came up with a set of core values that in the hardest of times we fell back on to

02:15:22 determine how we make decisions.

02:15:25 And so I feel like just getting clarity on these core values, like for us, it was respecting

02:15:28 people’s privacy, only engaging with industries where it’s clear opt-in.

02:15:33 So for instance, we don’t do any work in security and surveillance.

02:15:38 So things like that. We’re very big on, you know, one of our core values is

02:15:42 human connection and empathy, right?

02:15:44 And that is, yes, it’s an AI company, but it’s about people.

02:15:47 Well, these are all, they become encoded in how we act, even if you’re a small, tiny team

02:15:54 of two or three or whatever.

02:15:57 So I think that’s another piece of advice.

02:15:59 So what about finding people, hiring people?

02:16:02 If you care about people as much as you do, like this, it seems like such a difficult

02:16:07 thing to hire the right people.

02:16:10 I think early on as a startup, you want people who have, who share the passion and the conviction

02:16:16 because it’s going to be tough.

02:16:17 Like I’ve yet to meet a startup where it was just a straight line to success, right?

02:16:25 And not just startups, even in everyday people’s lives, right?

02:16:28 You always like run into obstacles and you run into naysayers and you need people who

02:16:36 are believers, whether they’re people on your team or even your investors.

02:16:40 You need investors who are really believers in what you’re doing, because that means they

02:16:44 will stick with you.

02:16:47 They won’t give up at the first obstacle.

02:16:49 I think that’s important.

02:16:50 What about raising money?

02:16:51 What about finding investors, first of all, raising money, but also raising money from

02:16:59 the right sources from that ultimately don’t hinder you, but help you, empower you, all

02:17:05 that kind of stuff.

02:17:06 What advice would you give there?

02:17:08 You successfully raised money many times in your life.

02:17:12 Yeah.

02:17:13 Again, it’s not just about the money.

02:17:15 It’s about finding the right investors who are going to be aligned in terms of what you

02:17:20 want to build and believe in your core values.

02:17:23 For example, especially later on, in my latest round of funding, I try to bring in investors

02:17:31 that really care about the ethics of AI and the alignment of vision and mission and core

02:17:40 values is really important.

02:17:41 It’s like you’re picking a life partner.

02:17:43 It’s the same kind of…

02:17:45 So you take it that seriously for investors?

02:17:47 Yeah, because they’re going to have to stick with you.

02:17:50 You’re stuck together.

02:17:51 For a while anyway.

02:17:52 Yeah.

02:17:53 Maybe not for life, but for a while, for sure.

02:17:56 For better or worse.

02:17:57 I forget what the vows usually sound like.

02:17:59 For better or worse?

02:18:00 Through something.

02:18:01 Yeah.

02:18:02 Oh boy.

02:18:03 Yeah.

02:18:04 Anyway, it’s romantic and deep and you’re in it for a while.

02:18:15 So it’s not just about the money.

02:18:18 You tweeted about going to your first Capital Camp investing get-together and that you learned

02:18:23 a lot.

02:18:24 So this is about investing.

02:18:27 So what have you learned from that?

02:18:30 What have you learned about investing in general from both because you’ve been on both ends

02:18:34 of it?

02:18:35 I mean, I try to use my experience as an operator now with my investor hat on when I’m identifying

02:18:41 companies to invest in.

02:18:45 First of all, I think the good news is because I have a technology background and I really

02:18:49 understand machine learning and computer vision and AI, et cetera, I can apply that level

02:18:54 of understanding because everybody says they’re an AI company or they’re an AI tech.

02:18:59 And I’m like, no, no, no, no, no, show me the technology.

02:19:02 So I can do that level of diligence, which I actually love.

02:19:07 And then I have to do the litmus test of, if I’m in a conversation with you, am I excited

02:19:12 to tell you about this new company that I just met?

02:19:16 And if I’m an ambassador for that company and I’m passionate about what they’re doing,

02:19:22 I usually use that.

02:19:24 Yeah.

02:19:25 That’s important to me when I’m investing.

02:19:27 So that means you actually can explain what they’re doing and you’re excited about it.

02:19:34 Exactly.

02:19:35 Exactly.

02:19:36 Thank you for putting it so succinctly, like rambling, but exactly that’s it.

02:19:41 No, but sometimes it’s funny, but sometimes it’s unclear exactly.

02:19:48 I’ll hear people, you know, talk for a while and it sounds cool, like

02:19:53 they paint a picture of a world, but then when you try to summarize it, you’re not

02:19:56 exactly clear

02:19:57 what the core powerful idea is. Like you can’t just build another Facebook,

02:20:05 there has to be a core, simple-to-explain idea that then you can or can’t get excited

02:20:15 about, but it’s there, it’s right there.

02:20:19 Yeah.

02:20:20 But how do you ultimately pick who you think will be successful?

02:20:25 It’s not just about the thing you’re excited about, like there’s other stuff.

02:20:29 Right.

02:20:30 And then there’s all the, you know, with early stage companies, like pre seed companies,

02:20:34 which is where I’m investing, sometimes the business model isn’t clear yet, or the go

02:20:40 to market strategy isn’t clear.

02:20:42 It’s usually so early on that some of these things haven’t been hashed

02:20:45 out, which is okay.

02:20:47 So the way I like to think about it is like, if this company is successful, will this be

02:20:51 a multi billion slash trillion dollar market, you know, or company?

02:20:56 And so that’s definitely a lens that I use.

02:21:01 What’s pre seed?

02:21:02 What are the different stages and what’s the most exciting stage and what’s, or no, what’s

02:21:07 interesting about every stage, I guess.

02:21:09 Yeah.

02:21:10 So pre seed is usually when you’re just starting out, you’ve maybe raised the friends and family

02:21:16 rounds.

02:21:17 So you’ve raised some money from people, you know, and you’re getting ready to take your

02:21:20 first institutional check in, like first check from an investor.

02:21:25 And I love the stage.

02:21:28 There’s a lot of uncertainty.

02:21:30 Some investors really don’t like the stage because the financial models aren’t there.

02:21:36 Often the teams aren’t even like formed really, really early.

02:21:40 But to me, it’s like a magical stage because it’s the time when there’s so much conviction,

02:21:48 so much belief, almost delusional, right?

02:21:51 And there’s a little bit of naivete with founders at that stage.

02:21:57 I just love it.

02:21:58 It’s contagious.

02:21:59 And I love that, often they’re first-time founders, not always, but often,

02:22:06 and I can share my experience as a founder myself and I can empathize, right?

02:22:12 And I can almost, I create a safe ground where, because, you know, you have to be careful

02:22:18 what you tell your investors, right?

02:22:21 And I will often like say, I’ve been in your shoes as a founder.

02:22:24 You can tell me if it’s challenging, you can tell me what you’re struggling with.

02:22:28 It’s okay to vent.

02:22:30 So I create that safe ground and I think that’s a superpower.

02:22:34 Yeah.

02:22:35 You have to, I guess you have to figure out if this kind of person is going to be able

02:22:40 to ride the roller coaster, like of many pivots and challenges and all that kind of stuff.

02:22:48 And if the space of ideas they’re working in is interesting, like the way they think

02:22:53 about the world.

02:22:54 Yeah.

02:22:55 Because if it’s successful, the thing they end up with might be very different, the reason

02:23:00 it’s successful for them.

02:23:01 Actually, you know, I was going to say the third, so the technology is one aspect, the

02:23:07 market or the idea, right, is the second and the third is the founder, right?

02:23:11 Is this somebody who I believe has conviction, is a hustler, you know, is going to overcome

02:23:18 obstacles?

02:23:19 Yeah, I think that is going to be a great leader, right?

02:23:23 Like as a startup, as a founder, you’re often, you are the first person and your role is

02:23:28 to bring amazing people around you to build this thing.

02:23:32 And so you’re an evangelist, right?

02:23:36 So how good are you going to be at that?

02:23:38 So I try to evaluate that too.

02:23:41 You also, in the tweet thread about it, mention, is this a known concept, random rich dudes,

02:23:46 RRDs, and saying that there should be, like, random rich women, I guess.

02:23:53 What’s the dudes version of women, the women version of dudes, ladies?

02:23:58 I don’t know.

02:23:59 I don’t know.

02:24:00 What’s, what’s, is this a technical term?

02:24:01 Is this known?

02:24:02 Random rich dudes?

02:24:03 I didn’t make that up, but I was at this Capital Camp, which is a get-together for investors

02:24:09 of all types.

02:24:11 And there must have been maybe 400 or so attendees, maybe 20 were women.

02:24:19 It was just very disproportionately, you know, male dominated, which I’m used to.

02:24:25 I think you’re used to this kind of thing.

02:24:26 I’m used to it, but it’s still surprising.

02:24:29 And as I’m raising money for this fund, so my fund partner is a guy called Rob May, who’s

02:24:36 done this before.

02:24:37 So I’m new to the investing world, but he’s done this before.

02:24:42 Most of our investors in the fund are these, I mean, awesome.

02:24:45 I’m super grateful to them.

02:24:47 Random just rich guys.

02:24:48 I’m like, where are the rich women?

02:24:50 So I’m really adamant about investing in women-led AI companies, but I also would love

02:24:57 to have women investors be part of my fund because I think that’s how we drive change.

02:25:03 Yeah.

02:25:04 So that takes time, of course, but there’s been quite a lot of progress, but yeah, for

02:25:09 the next Mark Zuckerberg to be a woman and all that kind of stuff, because that’s just

02:25:13 like a huge amount of wealth generated by women and then controlled by women and allocated

02:25:19 by women and all that kind of stuff.

02:25:22 And then beyond just women, just broadly across all different measures of diversity and so

02:25:28 on.

02:25:29 Let me ask you to put on your wise sage hat.

02:25:35 So you already gave advice on startups and just advice for women, but in general advice

02:25:45 for folks in high school or college today, how to have a career they can be proud of,

02:25:51 how to have a life they can be proud of.

02:25:55 I suppose you have to give this kind of advice to your kids.

02:25:58 Yeah.

02:25:59 Well, here’s the number one advice that I give to my kids.

02:26:03 My daughter’s now 19 by the way, and my son’s 13 and a half, so they’re not little kids

02:26:08 anymore.

02:26:09 Does it break your heart?

02:26:11 It does.

02:26:12 They’re awesome.

02:26:13 They’re my best friends, but yeah, I think the number one advice I would share is embark

02:26:19 on a journey without attaching to outcomes and enjoy the journey, right?

02:26:25 We’re often so obsessed with the end goal that it doesn’t allow us to be open to different

02:26:34 endings of a journey or a story, so you become like so fixated on a particular path.

02:26:41 You don’t see the beauty in the other alternative path, and then you forget to enjoy the journey

02:26:48 because you’re just so fixated on the goal, and I’ve been guilty of that for many, many

02:26:53 years of my life, and I’m now trying to make the shift of, no, no, no, I’m going to again

02:27:00 trust that things are going to work out and it’ll be amazing and maybe even exceed your

02:27:04 dreams.

02:27:05 We have to be open to that.

02:27:07 Yeah.

02:27:08 Taking a leap into all kinds of things.

02:27:09 I think you tweeted like you went on vacation by yourself or something like this.

02:27:13 I know.

02:27:14 Yes, and just going, just taking the leap.

02:27:19 Doing it.

02:27:20 Totally doing it.

02:27:21 And enjoying it, enjoying the moment, enjoying the weeks, enjoying not looking at some kind

02:27:26 of career ladder, next step and so on.

02:27:29 Yeah, there’s something to that, like over planning too.

02:27:34 I’m surrounded by a lot of people that kind of, so I don’t plan.

02:27:37 You don’t?

02:27:38 No.

02:27:39 Do you not do goal setting?

02:27:43 My goal setting is very like, I like the affirmations, it’s very, it’s almost, I don’t know how to

02:27:52 put it into words, but it’s a little bit like what my heart yearns for kind of, and I guess

02:28:02 in the space of emotions more than in the rational

02:28:08 space, because I just try to picture a world that I would like to be in, and that world

02:28:16 is not clearly pictured, it’s mostly in the emotional world.

02:28:19 I mean, I think about that from robots because I have this desire, I’ve had it my whole life

02:28:26 to, well, it took different shapes, but I think once I discovered AI, the desire was

02:28:33 to, I think in the context of this conversation, could be most easily described as basically

02:28:41 a social robotics company and that’s something I dreamed of doing and well, there’s a lot

02:28:50 of complexity to that story, but that’s the only thing, honestly, I dream of doing.

02:28:55 So I imagine a world that I could help create, but it’s not, there’s no steps along the way

02:29:05 and I think I’m just kind of stumbling around and following happiness and working my ass

02:29:12 off in almost random, like an ant does in random directions, but a lot of people, a

02:29:18 lot of successful people around me say this, you should have a plan, you should have a

02:29:20 clear goal, you have a goal at the end of the

02:29:23 month, and I don’t. There’s a balance to be struck, of course, but there’s

02:29:33 something to be said about really making sure that you’re living life to the fullest, that

02:29:40 goals can actually get in the way of.

02:29:43 So one of the best, like kind of most, what do you call it when it challenges your brain,

02:29:52 what do you call it?

02:29:56 The only thing that comes to mind, and this is me saying it, is the mindfuck, but yes.

02:30:00 Okay.

02:30:01 Okay.

02:30:02 Okay.

02:30:03 Something like that.

02:30:04 Yes.

02:30:05 Super inspiring talk.

02:30:06 Kenneth Stanley, he was at OpenAI, he just left, and he has a book called Why Greatness

02:30:11 Cannot Be Planned and it’s actually an AI book.

02:30:14 And he’s done all these experiments that basically show that when you over-optimize,

02:30:20 the trade-off is you’re less creative, right?

02:30:23 And to create true greatness and truly creative solutions to problems, you can’t over-plan

02:30:30 it.

02:30:31 You can’t.

02:30:32 And I thought that was great, and so he generalizes it beyond AI and he talks about how we apply

02:30:36 that in our personal life and in our organizations and our companies, which obsess over KPIs, right?

02:30:42 Like look at any company in the world and it’s all like, there are the goals, the

02:30:45 weekly goals and the sprints and then the quarterly goals, blah, blah, blah.

02:30:51 And he just shows with a lot of his AI experiments that that’s not how you create truly game

02:30:58 changing ideas.

02:30:59 So there you go.

02:31:00 Yeah, yeah.

02:31:01 You can.

02:31:02 He’s awesome.

02:31:03 Yeah.

02:31:04 There’s a balance of course.

02:31:05 That’s yeah, many moments of genius will not come from planning and goals, but you still

02:31:11 have to build factories and you still have to manufacture and you still have to deliver

02:31:15 and there’s still deadlines and all that kind of stuff.

02:31:17 And that for that, it’s good to have goals.

02:31:19 I do goal setting with my kids, we all have our goals, but I think we’re starting

02:31:25 to morph into more of these like bigger picture goals and not obsess about like, I don’t know,

02:31:30 it’s hard.

02:31:31 Well, I honestly think, especially with kids, it’s much, much better to have

02:31:34 a plan and have goals and so on, because you have to learn the muscle of like

02:31:38 what it feels like to get stuff done.

02:31:40 Yeah.

02:31:41 And once you learn that, there’s flexibility. For me, I spent most of my life with

02:31:46 goal setting and so on.

02:31:48 So like I’ve gotten good with grades and school.

02:31:50 I mean, school, if you want to be successful at school, yeah, I mean the kind of stuff

02:31:55 in high school and college, the kids have to do in terms of managing their time and

02:31:59 getting so much stuff done.

02:32:01 It’s like, you know, taking five, six, seven classes in college, that would

02:32:06 break the spirit of most humans if they took one of them later in life; it’s really

02:32:13 difficult stuff, especially engineering curricula.

02:32:16 So I think you have to learn that skill, but once you learn it, maybe

02:32:22 you can be a little bit on autopilot and use that momentum and then allow yourself to be

02:32:27 lost in the flow of life.

02:32:28 You know, or also, I worked pretty hard to allow myself to have

02:32:38 the freedom to do that.

02:32:39 That’s a tricky freedom to have, because a lot of people get lost

02:32:44 in the rat race, and also, financially, whenever they get a raise,

02:32:52 they’ll get a bigger house or something like this.

02:32:55 So you’re always trapped in this race. I put a lot of emphasis

02:32:59 on always living below my means.

02:33:05 And so there’s a lot of freedom to do whatever the heart desires, and that’s a relief,

02:33:12 but everyone has to decide what’s the right thing, what’s the right thing for them.

02:33:15 For some people having a lot of responsibilities, like a house they can barely afford or having

02:33:21 a lot of kids, the responsibility side of that is really, helps them get their shit

02:33:27 together.

02:33:28 Like, all right, I need to be really focused and get it done. Some of the most successful people

02:33:32 I know have kids and the kids bring out the best in them.

02:33:34 They make them more productive, not less productive.

02:33:36 Right, it’s accountability.

02:33:37 Yeah.

02:33:38 It’s an accountability thing, absolutely.

02:33:39 And almost something to actually live and fight and work for, like having a family,

02:33:45 it’s fascinating to see because you would think kids would be a hit on productivity,

02:33:49 but they’re not, for a lot of really successful people, they really like, they’re like an

02:33:53 engine of.

02:33:54 Right, efficiency.

02:33:55 Oh my God.

02:33:56 Yeah.

02:33:57 Yeah.

02:33:58 It’s weird.

02:33:59 Yeah.

02:34:00 I mean, it’s beautiful.

02:34:01 It’s beautiful to see.

02:34:02 And also a source of happiness.

02:34:03 Speaking of which, what role do you think love plays in the human condition, love?

02:34:12 I think love is, yeah, I think it’s why we’re all here.

02:34:19 I think it would be very hard to live life without love in any of its forms, right?

02:34:26 Yeah, that’s the most beautiful of forms that human connection takes, right?

02:34:35 Yeah.

02:34:36 And everybody wants to feel loved, right, in one way or another, right?

02:34:42 And to love.

02:34:43 Yeah.

02:34:44 It feels good.

02:34:45 And to love too, totally.

02:34:46 Yeah, I agree with that.

02:34:47 Both of it.

02:34:48 Yeah.

02:34:49 I’m not even sure what feels better.

02:34:50 Both, both like that.

02:34:51 Yeah, to give and to give love too, yeah.

02:34:54 And it is, like we’ve been talking about, an interesting question, whether some of that,

02:34:59 whether one day we’ll be able to love a toaster.

02:35:02 Okay.

02:35:03 It’s some small.

02:35:05 I wasn’t quite thinking about that when I said like, yeah, like we all need love and

02:35:10 give love.

02:35:11 That’s all I was thinking about.

02:35:12 Okay.

02:35:13 I was thinking about Brad Pitt and toasters.

02:35:14 Okay, toasters, great.

02:35:15 All right.

02:35:16 Well, I think we started on love and ended on love.

02:35:20 This was an incredible conversation, Rana.

02:35:22 Thank you so much.

02:35:23 Thank you.

02:35:24 You’re an incredible person.

02:35:25 Thank you for everything you’re doing in AI, in the space of just caring about humanity,

02:35:32 caring about emotion, about love, and being an inspiration to a huge number of people

02:35:38 in robotics, in AI, in science, in the world in general.

02:35:42 So thank you for talking to me.

02:35:43 It’s an honor.

02:35:44 Thank you for having me.

02:35:45 And you know, I’m a big fan of yours as well.

02:35:47 So it’s been a pleasure.

02:35:49 Thanks for listening to this conversation with Rana el Kaliouby.

02:35:52 To support this podcast, please check out our sponsors in the description.

02:35:56 And now let me leave you with some words from Helen Keller.

02:36:00 The best and most beautiful things in the world cannot be seen or even touched.

02:36:05 They must be felt with the heart.

02:36:09 Thank you for listening and hope to see you next time.