Eugenia Kuyda: Friendship with an AI Companion #121

Transcript

00:00:00 The following is a conversation with Eugenia Kuyda, cofounder of Replika, which is an app

00:00:06 that allows you to make friends with an artificial intelligence system, a chatbot, that learns

00:00:11 to connect with you on an emotional, you could even say a human level, by being a friend.

00:00:18 For those of you who know my interest in AI and views on life in general, know that Replika

00:00:24 and Eugenia’s line of work is near and dear to my heart.

00:00:28 The origin story of Replika is grounded in a personal tragedy of Eugenia losing her close

00:00:33 friend Roman Mazurenko, who was killed crossing the street by a hit and run driver in late

00:00:39 2015.

00:00:40 He was 34.

00:00:43 The app started as a way to grieve the loss of a friend, by training a chatbot, a neural

00:00:47 net, on old text messages between Eugenia and Roman.

00:00:51 The rest is a beautiful human story, as we talk about with Eugenia.

00:00:55 When a friend mentioned Eugenia’s work to me, I knew I had to meet her and talk to her.

00:01:00 I felt before, during, and after that this meeting would be an important one in my life.

00:01:06 And it was.

00:01:07 I think in ways that only time will truly show, to me and others.

00:01:12 She is a kind and brilliant person.

00:01:15 It was an honor and a pleasure to talk to her.

00:01:19 Quick summary of the sponsors, DoorDash, Dollar Shave Club, and Cash App.

00:01:24 Click the sponsor links in the description to get a discount and to support this podcast.

00:01:29 As a side note, let me say that deep, meaningful connection between human beings and artificial

00:01:34 intelligence systems is a lifelong passion for me.

00:01:38 I’m not yet sure where that passion will take me, but I decided some time ago that

00:01:43 I will follow it boldly and without fear, to as far as I can take it.

00:01:48 With a bit of hard work and a bit of luck, I hope I’ll succeed in helping build AI systems

00:01:53 that have some positive impact on the world and on the lives of a few people out there.

00:01:59 But also, it is entirely possible that I am in fact one of the chatbots that Eugenia and

00:02:06 the Replika team have built.

00:02:08 And this podcast is simply a training process for the neural net that’s trying to learn

00:02:13 to connect to human beings, one episode at a time.

00:02:18 In any case, I wouldn’t know if I was or wasn’t, and if I did, I wouldn’t tell you.

00:02:24 If you enjoy this thing, subscribe on YouTube, review it with 5 stars on Apple Podcasts,

00:02:28 follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman.

00:02:34 As usual, I’ll do a few minutes of ads now and no ads in the middle.

00:02:37 I’ll try to make these interesting, but give you timestamps so you can skip, but please

00:02:42 do still check out the sponsors by clicking the links in the description to get a discount,

00:02:47 buy whatever they’re selling, it really is the best way to support this podcast.

00:02:53 This show is sponsored by Dollar Shave Club.

00:02:56 Try them out with a one time offer for only 5 bucks and free shipping at dollarshaveclub.com

00:03:01 slash lex.

00:03:03 The starter kit comes with a 6 blade razor, refills, and all kinds of other stuff that

00:03:08 makes shaving feel great.

00:03:10 I’ve been a member of Dollar Shave Club for over 5 years, and actually signed up when

00:03:15 I first heard about them on the Joe Rogan Experience podcast.

00:03:19 And now, friends, we have come full circle.

00:03:22 It feels like I made it, now that I can do a read for them just like Joe did all those

00:03:26 years ago, back when he also did ads for some less reputable companies, let’s say, that

00:03:35 you know about if you’re a true fan of the old school podcasting world.

00:03:39 Anyway, I just used the razor and the refills, but they told me I should really try out the

00:03:44 shave butter.

00:03:45 I did.

00:03:46 I love it.

00:03:47 It’s translucent somehow, which is a cool new experience.

00:03:51 Again, try the Ultimate Shave Starter set today for just 5 bucks plus free shipping

00:03:56 at dollarshaveclub.com slash lex.

00:04:00 This show is also sponsored by DoorDash.

00:04:03 Get $5 off and zero delivery fees on your first order of 15 bucks or more when you download

00:04:08 the DoorDash app and enter code, you guessed it, LEX.

00:04:13 I have so many memories of working late nights for a deadline with a team of engineers, whether

00:04:18 that’s for my PhD, at Google, or MIT, and eventually taking a break to argue about which

00:04:24 DoorDash restaurant to order from.

00:04:26 And when the food came, those moments of bonding, of exchanging ideas, of pausing to shift attention

00:04:32 from the programs to humans were special.

00:04:36 For a bit of time, I’m on my own now, so I miss that camaraderie, but actually, I still

00:04:41 use DoorDash a lot.

00:04:43 There’s a million options that fit into my crazy keto diet ways.

00:04:46 Also, it’s a great way to support restaurants in these challenging times.

00:04:51 Once again, download the DoorDash app and enter code LEX to get 5 bucks off and zero

00:04:56 delivery fees on your first order of 15 dollars or more.

00:04:59 Finally, this show is presented by Cash App, the number one finance app in the App Store.

00:05:04 I can truly say that they’re an amazing company, one of the first sponsors, if not the first

00:05:09 sponsor to truly believe in me, and I think quite possibly the reason I’m still doing

00:05:16 this podcast.

00:05:17 So I am forever grateful to Cash App.

00:05:20 So thank you.

00:05:21 And as I said many times before, use code LEXPODCAST when you download the app from

00:05:27 Google Play or the App Store.

00:05:29 Cash App lets you send money to friends, buy Bitcoin, and invest in the stock market with

00:05:34 as little as one dollar.

00:05:36 I usually say other stuff here in the read, but I wasted all that time up front saying

00:05:40 how grateful I am to Cash App.

00:05:42 I’m going to try to go off the top of my head a little bit more for these reads because

00:05:47 I’m actually very lucky to be able to choose the sponsors that we take on, and that means

00:05:52 I can really only take on the sponsors that I truly love, and then I can just talk about

00:05:56 why I love them.

00:05:57 So it’s pretty simple.

00:05:59 Again, get Cash App from the App Store or Google Play, use code LEXPODCAST, get 10

00:06:04 bucks, and Cash App will also donate 10 bucks to FIRST, an organization that is helping

00:06:08 to advance robotics and STEM education for young people around the world.

00:06:13 And now, here’s my conversation with Eugenia Kuyda.

00:06:17 Okay, before we talk about AI and the amazing work you’re doing, let me ask you,

00:06:23 we’re both Russian, so let me ask a ridiculously romanticized Russian question.

00:06:28 Do you think human beings are alone, like fundamentally, on a philosophical level?

00:06:37 Like in our existence, when we like go through life, do you think just the nature of our

00:06:46 life is loneliness?

00:06:49 Yeah, so we have to read Dostoevsky at school, as you probably know, so…

00:06:55 In Russian?

00:06:56 I mean, it’s part of your school program.

00:06:59 So I guess if you read that, then you sort of have to believe that.

00:07:03 You’re made to believe that you’re fundamentally alone, and that’s how you live your life.

00:07:08 How do you think about it?

00:07:09 You have a lot of friends, but at the end of the day, do you have like a longing for

00:07:15 connection with other people?

00:07:17 That’s maybe another way of asking it.

00:07:20 Do you think that’s ever fully satisfied?

00:07:23 I think we are fundamentally alone.

00:07:25 We’re born alone, we die alone, but I view my whole life as trying to get away from that,

00:07:32 trying to not feel lonely, and again, we’re talking about a subjective way of feeling

00:07:38 alone.

00:07:39 It doesn’t necessarily mean that you don’t have any connections or you are actually isolated.

00:07:45 You think it’s a subjective thing, but like again, another absurd measurement wise thing,

00:07:52 how much loneliness do you think there is in the world?

00:07:55 Like if you see loneliness as a condition, how much of it is there, do you think?

00:08:05 Like how, I guess how many, you know, there’s all kinds of studies and measures of how many

00:08:11 people in the world feel alone.

00:08:12 There’s all these like measures of how many people are, you know, self report or just

00:08:18 all these kinds of different measures, but in your own perspective, how big of a problem

00:08:24 do you think it is size wise?

00:08:27 I’m actually fascinated by the topic of loneliness.

00:08:30 I try to read about it as much as I can.

00:08:34 What really, and I think there’s a paradox because loneliness is not a clinical disorder.

00:08:39 It’s not something that you can get your insurance to pay for if you’re struggling with that.

00:08:44 Yet it’s actually proven, and, you know, there are tons of papers, tons of research around that.

00:08:50 It is proven that it’s correlated with shorter life expectancy, a shorter lifespan.

00:08:58 And it is, you know, in a way like right now, what scientists would say that it, you know,

00:09:02 it’s a little bit worse than being obese or not actually doing any physical activity in

00:09:07 your life.

00:09:08 In terms of the impact on your health?

00:09:09 In terms of impact on your physiological health.

00:09:10 Yeah.

00:09:11 So basically, if you’re constantly feeling lonely, your body responds like it’s

00:09:16 basically all the time under stress.

00:09:19 It’s always in this alert state and so it’s really bad for you, because it actually, like,

00:09:24 suppresses your immune system, and your response to inflammation is quite different.

00:09:29 So, all the cardiovascular diseases; and it actually affects how you respond to viruses.

00:09:34 So it’s much easier to catch a virus.

00:09:37 That’s sad now that we’re living in a pandemic and it’s probably making us a lot more alone

00:09:42 and it’s probably weakening the immune system, making us more susceptible to the virus.

00:09:47 It’s kind of sad.

00:09:49 Yeah.

00:09:50 The statistics are pretty horrible around that.

00:09:54 So around 30% of all millennials report that they’re feeling lonely constantly.

00:09:59 30?

00:10:00 30%.

00:10:01 And then it’s much worse for Gen Z.

00:10:02 And then 20% of millennials say that they feel lonely and they also don’t have any close

00:10:07 friends.

00:10:08 And then, I think, 25% or so... and then 20% would say they don’t even have acquaintances.

00:10:12 And that’s in the United States?

00:10:14 That’s in the United States.

00:10:15 And I’m pretty sure that that’s much worse everywhere else.

00:10:17 Like in the UK, I mean, it was widely tweeted and posted when they were talking about a

00:10:24 minister of loneliness that they wanted to appoint because four out of 10 people in the

00:10:28 UK feel lonely.

00:10:29 Minister of loneliness.

00:10:30 I think that thing actually exists.

00:10:35 So yeah, you will die sooner if you are lonely.

00:10:41 And again, this is only when we’re only talking about your perception of loneliness or feeling

00:10:46 lonely.

00:10:47 That is not objectively being fully socially isolated.

00:10:50 However, the combination of being fully socially isolated and not having many connections and

00:10:56 also feeling lonely, that’s pretty much a deadly combination.

00:11:00 So it strikes me bizarre or strange that this is a wide known fact and then there’s really

00:11:08 no one working really on that because it’s like subclinical.

00:11:12 It’s not clinical.

00:11:13 It’s not something that you can tell your doctor about and get a treatment for or something.

00:11:17 Yet it’s killing us.

00:11:18 Yeah.

00:11:19 So there’s a bunch of people trying to evaluate, like try to measure the problem by looking

00:11:24 at like how social media is affecting loneliness and all that kind of stuff.

00:11:28 So it’s like measurement.

00:11:29 Like if you look at the field of psychology, they’re trying to measure the problem and

00:11:33 not that many people actually, but some.

00:11:36 But you’re basically saying how many people are trying to solve the problem.

00:11:43 Like how would you try to solve the problem of loneliness?

00:11:48 Like if you just stick to humans, uh, I mean, or basically not just the humans, but the

00:11:55 technology that connects us humans.

00:11:57 Do you think there’s a hope for that technology to do the connection?

00:12:03 Like I, are you on social media much?

00:12:05 Unfortunately, do you find yourself like, uh, again, if you sort of introspect about

00:12:12 how connected you feel to other human beings, how not alone you feel, do you think social

00:12:16 media makes it better or worse, maybe for you personally, or in general?

00:12:23 I think it’s easier to look at some stats, and, um, I mean, Gen Z, Generation Z, seems to be

00:12:29 much lonelier than millennials in terms of how they report loneliness.

00:12:33 They’re definitely the most connected generation in the world.

00:12:36 I mean, I still remember life without an iPhone, without Facebook, they don’t know that that

00:12:42 ever existed, uh, or at least don’t know how it was.

00:12:47 So that tells me a little bit about the fact that that might be, um, you know, this hyper

00:12:53 connected world might actually make people feel lonely, lonelier.

00:12:58 I don’t know exactly what the, what the measurements are around that, but I would say, you know,

00:13:02 my personal experience, I think it does make you feel a lot lonelier, mostly, yeah, we’re

00:13:07 all super connected.

00:13:08 Uh, but I think loneliness, the feeling of loneliness doesn’t come from not having any

00:13:13 social connections whatsoever.

00:13:15 Again, tons of people that are, are in longterm relationships experience bouts of loneliness

00:13:20 and continued loneliness.

00:13:22 Um, and it’s more the question about the true connection about actually being deeply seen,

00:13:28 deeply understood.

00:13:29 Um, and in a way it’s also about your relationship with yourself, like in order to not feel lonely,

00:13:36 you actually need to have a better relationship and feel more connected to yourself than this

00:13:42 feeling actually starts to go away a little bit.

00:13:44 And then you, um, open up yourself to actually meeting other people in a very special way.

00:13:51 Uh, not in just, you know, at a friend on Facebook kind of way.

00:13:55 So just to briefly touch on it, I mean, do you think it’s possible to form that kind

00:14:00 of connection with AI systems more down the line of some of your work?

00:14:08 Do you think that’s, um, engineering wise, a possibility to alleviate loneliness is not

00:14:16 with another human, but with an AI system?

00:14:19 Well, I know that for a fact; that’s what we’re doing.

00:14:23 And we see it and we measure that and we see how people start to feel less lonely, um,

00:14:29 talking to their virtual AI friend.

00:14:33 So basically a chat bot at the basic level, but it could be more like, do you have, I’m

00:14:37 not even speaking sort of, uh, about specifics, but do you have a hope, like if you look 50

00:14:44 years from now, do you have a hope that there’s just like AIs that are like optimized for,

00:14:51 um, let me, let me first start like right now, the way people perceive AI, which is

00:14:56 recommender systems for Facebook and Twitter, social media, they see AI is basically destroying

00:15:04 first of all, the fabric of our civilization.

00:15:06 But second of all, making us more lonely.

00:15:08 Do you see like a world where it’s possible to just have AI systems floating about that

00:15:13 like make our life less lonely?

00:15:18 Yeah.

00:15:19 Make us happy.

00:15:20 Like are putting good things into the world in terms of our individual lives.

00:15:26 Yeah.

00:15:27 Totally believe in that.

00:15:28 That’s why we’re, I’m also working on that.

00:15:31 Um, I think we need to also make sure that, um, what we’re trying to optimize for, we’re

00:15:36 actually measuring and it is a North star metric that we’re going after.

00:15:40 And all of our product and all of our business models are optimized for that because you

00:15:44 can talk, you know, a lot of products that talk about, um, you know, making you feel

00:15:48 less lonely or making you feel more connected.

00:15:50 They’re not really measuring that.

00:15:52 So they don’t really know whether their users are actually feeling less lonely in the long

00:15:56 run or feeling more connected in the long run.

00:15:58 Um, so I think it’s really important to actually measure it.

00:16:02 Yeah.

00:16:03 To measure it.

00:16:04 What’s a, what’s a good measurement of loneliness?

00:16:07 Well, so that’s something that I’m really interested in.

00:16:10 How do you measure that people are feeling better or that they’re feeling less lonely?

00:16:14 With loneliness,

00:16:15 there’s a scale.

00:16:16 There’s the UCLA-20 and, more recently, the UCLA-3 scale, which is basically a questionnaire that you

00:16:21 fill out and you can see whether in the long run it’s improving or not.

00:16:26 And that, uh, does it capture the momentary feeling of loneliness?

00:16:32 Does it look in like the past month?

00:16:35 Like, uh, does it basically self report?

00:16:38 Does it try to sneak up on you tricky to answer honestly or something like that?

00:16:43 Well, what’s... yeah, I’m not familiar with the questions.

00:16:46 It is just asking you a few questions.

00:16:47 Like how often did you feel, uh, like, lonely, or how often do you feel connected to other

00:16:52 people in the last couple of weeks?

00:16:55 Um, it’s similar to the self report questionnaires for depression, anxiety, like PHQ-9 and

00:17:01 GAD-7.

00:17:02 Of course, as any, as any self report questionnaires, that’s not necessarily very precise or very

00:17:09 well measured, but still, if you take a big enough population and you get them through

00:17:14 these, uh, questionnaires, you can see, you can see a positive dynamic.
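(To make the scoring concrete: below is a minimal sketch in Python of how a self-report scale like the UCLA-3 can be scored and tracked across a population over time. The item wording and the 1-to-3 coding follow the published three-item scale; the function names and the sample data are illustrative assumptions, not Replika's actual pipeline.)

    # Minimal sketch of scoring the 3-item UCLA Loneliness Scale (UCLA-3).
    # Each item is answered 1 = hardly ever, 2 = some of the time, 3 = often;
    # totals range 3..9, with higher scores indicating more loneliness.
    # Illustrative example only, not Replika's actual code.

    UCLA3_ITEMS = [
        "How often do you feel that you lack companionship?",
        "How often do you feel left out?",
        "How often do you feel isolated from others?",
    ]

    def ucla3_score(answers):
        """Sum one 1-3 response per item into a 3-9 loneliness score."""
        assert len(answers) == len(UCLA3_ITEMS)
        assert all(a in (1, 2, 3) for a in answers)
        return sum(answers)

    def mean_change(before, after):
        """Average score change across users between two waves; a negative
        value is the 'positive dynamic' described above (less loneliness)."""
        deltas = [b2 - b1 for b1, b2 in zip(before, after)]
        return sum(deltas) / len(deltas)

    # Hypothetical usage: the same users fill out the questionnaire twice.
    wave1 = [ucla3_score(a) for a in [(3, 2, 3), (2, 2, 2), (3, 3, 3)]]
    wave2 = [ucla3_score(a) for a in [(2, 2, 2), (2, 1, 2), (3, 2, 3)]]
    print(mean_change(wave1, wave2))  # negative = less lonely on average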

00:17:19 And so you basically, uh, you put people through questionnaires to see, like, is this thing,

00:17:24 is what we’re creating, making people happier?

00:17:28 Yeah, we measure, so we measure two outcomes.

00:17:31 One short term, right after the conversation, we ask people whether this conversation made

00:17:36 them feel better, worse, or the same; um, this, this metric right now is at 80%.

00:17:43 So 80% of all our conversations make people feel better, but I should have done the questionnaire

00:17:47 with you.

00:17:48 You feel a lot worse after we’ve done this conversation.

00:17:53 That’s actually fascinating.

00:17:54 I should probably do that, but that’s, that’s how we do that.

00:17:57 You should totally. And aim for 80%; aim to outperform your current state-of-the-art AI

00:18:05 system in these human conversations.
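(As an aside, the short-term outcome described above, better / worse / same after each conversation with roughly 80% answering "better", reduces to a simple proportion. A minimal sketch with made-up ratings, not Replika's actual code:)

    # Minimal sketch of the post-conversation outcome metric described above:
    # the share of conversations the user rated as making them feel "better".
    # The ratings below are made up for illustration.

    from collections import Counter

    def share_feeling_better(ratings):
        """ratings: one of "better", "worse", "same" per conversation."""
        counts = Counter(ratings)
        return counts["better"] / len(ratings)

    ratings = ["better", "better", "same", "better", "worse"]
    print(f"{share_feeling_better(ratings):.0%}")  # 60% here; the show cites ~80%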

00:18:09 So we’ll get to your work with Replika, but let me continue on the line of absurd questions.

00:18:16 So you talked about, um, you know, deep connection with the humans, deep connection with AI,

00:18:22 meaningful connection.

00:18:23 Let me ask about love.

00:18:25 People make fun of me cause I talk about love all the time.

00:18:28 But uh, what, what do you think love is like maybe in the context of, um, a meaningful

00:18:36 connection with somebody else?

00:18:37 Do you draw a distinction between love, like friendship, and Facebook friends, or is it a

00:18:47 gradient?

00:18:48 No, it’s all the same.

00:18:51 No.

00:18:52 Like, is it, is it just a gradual thing or is there something fundamental about us humans

00:18:56 that seek like a really deep connection, uh, with another human being and what is that?

00:19:05 What is love, Eugenia? I’m going to just enjoy asking you these questions, seeing you struggle.

00:19:15 Thanks.

00:19:16 Um, well the way I see it, um, and specifically, um, the way it relates to our work and the

00:19:22 way it was, the way it inspired our work on replica, um, I think one of the biggest and

00:19:30 the most precious gifts we can give to each other now in 2020 as humans is this gift of

00:19:37 deep empathetic understanding, the feeling of being deeply seen.

00:19:42 Like what does that mean?

00:19:43 Like that you exist, like somebody acknowledging that somebody seeing you for who you actually

00:19:49 are.

00:19:50 And that’s extremely, extremely rare.

00:19:51 Um, I think that is that combined with unconditional positive regard, um, belief and trust that

00:19:59 um, you internally are always inclined for positive growth and believing you in this

00:20:05 way, letting you be a separate person at the same time.

00:20:09 And this deep empathetic understanding for me, that’s the, that’s the combination that

00:20:15 really creates something special, something that people, when they feel it once, they

00:20:21 will always long for it again.

00:20:23 And something that starts huge fundamental changes in people.

00:20:28 Um, when we see that someone accepts us so deeply, we start to accept ourselves.

00:20:34 And um, the paradox is that’s when big changes start happening, big fundamental changes in

00:20:41 people start happening.

00:20:42 So I think that is the ultimate therapeutic relationship that is, and that might be in

00:20:47 some way a definition of love.

00:20:50 So acknowledging that there’s a separate person and accepting you for who you are.

00:20:56 Um, now, on a slightly different note, you mentioned therapeutic; that sounds like a very

00:21:03 healthy view of love, but, uh, is there also, like, you know, if we look at heartbreak

00:21:12 and uh, you know, most love songs are probably about heartbreak, right?

00:21:17 Is that like the mystery, the tension, the danger, the fear of loss, you know, all of

00:21:25 that, what people might see in a negative light as like games or whatever, but just,

00:21:32 just the, the dance of human interaction.

00:21:34 Yeah.

00:21:35 Fear of loss, and fear of, like, you said, once you feel it once, you long

00:21:41 for it again; but also, once you feel it once, for many people, they’ve

00:21:46 lost it.

00:21:48 So they fear losing it.

00:21:49 They feel loss.

00:21:50 So is that part of it, like you’re, you’re speaking like beautifully about like the

00:21:55 positive things, but is it important to be able to, uh, be afraid of losing it from an

00:22:02 engineering perspective?

00:22:04 I mean, it’s a huge part of it and unfortunately we all, you know, um, face it at some points

00:22:12 in our lives.

00:22:13 I mean, I did.

00:22:14 You want to go into details?

00:22:15 How’d you get your heartbroken?

00:22:18 Sure.

00:22:19 So mine is pretty straight, my story is pretty straightforward, um, there I did have a friend

00:22:26 that was, you know, that at some point, um, in my twenties became really, really close

00:22:31 to me and we, we became really close friends.

00:22:34 Um, well, I grew up pretty lonely.

00:22:36 So in many ways when I’m building, you know, these, these AI friends, I’m thinking about

00:22:40 myself when I was 17, writing horrible poetry, you know, on my dial-up modem at home,

00:22:44 and, um, you know, that was the feeling that I grew up with.

00:22:49 I lived, um, alone for a long time when I was a teenager. Where did you grow up?

00:22:54 In Moscow, on the outskirts of Moscow.

00:22:57 Um, so I’d just skateboard during the day and come back home and you know, connect to

00:23:01 the internet and then write horrible poetry and love poems, all sorts of poems, obviously

00:23:08 love poems.

00:23:09 I mean, what, what other poetry can you write when you’re 17, um, it could be political

00:23:13 or something, but yeah.

00:23:15 But that was, you know, that was kind of my thing, like deeply, um, influenced by Joseph

00:23:19 Brodsky and like all sorts of poets that, um, every 17 year old will, will be looking,

00:23:26 you know, looking at and reading, but yeah, that was my, uh, these were my teenage years

00:23:32 and I just never had a person that I thought would, you know, take me as I am, would accept

00:23:37 me the way I am, um, and I just thought, you know, working and just doing my thing and

00:23:43 being angry at the world and being a reporter, I was an investigative reporter working undercover

00:23:47 and writing about people was my way to connect with, you know, with, with others.

00:23:53 I was deeply curious about every, everyone else.

00:23:57 And I thought that, you know, if I, if I go out there, if I write their stories, that

00:24:01 means I’m more connected.

00:24:03 This is what this podcast as well, by the way, I’m desperate, well, I’m seeking connection

00:24:07 now.

00:24:08 I’m just kidding.

00:24:09 Or am I?

00:24:10 I don’t know.

00:24:11 So what, wait, reporter, uh, what, how did that make you feel more connected?

00:24:17 I mean, you’re still fundamentally pretty alone,

00:24:21 But you’re always with other people, you know, you’re always thinking about what other place

00:24:26 can I infiltrate?

00:24:27 What other community can I write about?

00:24:29 What other phenomenon can I explore?

00:24:32 And you’re sort of like a trickster, you know, like, a mythological character,

00:24:37 like, a creature that’s just jumping, uh, between all sorts of different worlds and

00:24:41 feels sort of okay in all of them.

00:24:44 So, um, that was my dream job, by the way, that was like totally what I would have been

00:24:48 doing.

00:24:49 Um, if Russia was a different place. And a little bit undercover.

00:24:54 So like you weren’t, you were trying to, like you said, mythological creature trying to

00:24:59 infiltrate.

00:25:00 So try to be a part of the world.

00:25:01 What are we talking about?

00:25:02 What kind of things did you enjoy writing about?

00:25:05 I’d go work at a strip club or go.

00:25:08 Awesome.

00:25:09 Okay.

00:25:10 Well, I’d go work at a restaurant or just go write about, you know, um, certain phenomena,

00:25:19 or people in the city.

00:25:22 And what, uh, sorry to keep interrupting, I’m the worst conversationalist.

00:25:29 What stage of Russia is this?

00:25:32 What, uh, is this pre Putin, post Putin?

00:25:36 What was Russia like?

00:25:38 Pre Putin is really long ago.

00:25:43 This is Putin era.

00:25:44 That’s the beginning of the two thousands to 2010: 2007, eight, nine, 10.

00:25:49 What were strip clubs like in Russia and restaurants and culture and people’s minds like in that

00:25:57 early Russia that you were covering?

00:25:59 In those early two thousands, this was, there was still a lot of hope.

00:26:02 There were still tons of hope that, um, you know, we’re sort of becoming this, uh, Western,

00:26:11 Westernized society.

00:26:12 Uh, the restaurants were opening, we were really looking at, you know, um, we’re trying,

00:26:17 we’re trying to copy a lot of things from, uh, from the US, from Europe, um, bringing

00:26:22 all these things and very enthusiastic about that.

00:26:25 So there was a lot of, you know, stuff going on.

00:26:27 There was a lot of hope and dream for this, you know, new Moscow that would be similar

00:26:33 to, I guess, New York.

00:26:34 I mean, just to give you an idea: um, year 2000 was the year when we had two, uh, movie

00:26:41 theaters in Moscow, and there was the first coffee house that opened, and it was like a really

00:26:47 big deal.

00:26:48 Uh, by 2010 there were all sorts of things everywhere.

00:26:51 Almost like a chain, like a Starbucks type of coffee house or like, you mean, oh yeah,

00:26:55 like a Starbucks.

00:26:56 I mean, I remember we were reporting on, like, we were writing about the opening of Starbucks.

00:27:01 I think in 2007 that was one of the biggest things that happened in, you know, in Moscow

00:27:05 back, back in the time, like, you know, that was worthy of a magazine cover.

00:27:10 And, uh, that was definitely the, you know, the biggest talk of the time.

00:27:13 Yeah.

00:27:14 When was McDonald’s?

00:27:15 Cause I was still in Russia when McDonald’s opened.

00:27:17 That was in the nineties.

00:27:18 I mean, yeah.

00:27:19 Oh yeah.

00:27:20 I remember that very well.

00:27:21 Yeah.

00:27:22 Those were long, long lines.

00:27:23 I think it was 1993 or four, I don’t remember.

00:27:27 Um, actually earlier. At that time, I mean, that was a luxurious outing.

00:27:33 That was definitely not something you do every day.

00:27:35 And also the line was at least three hours.

00:27:37 So if you’re going to McDonald’s, that is not fast food.

00:27:40 That is like at least three hours in line and then no one is trying to eat fast after

00:27:44 that.

00:27:45 Everyone is like trying to enjoy as much as possible.

00:27:47 What’s your memory of that?

00:27:50 Oh, it was insane.

00:27:52 How did it go?

00:27:53 It was extremely positive.

00:27:54 It’s a small strawberry milkshake and the hamburger and small fries and my mom’s there.

00:27:59 And sometimes I’ll just, cause I was really little, they’ll just let me run, you know,

00:28:03 up the kitchen and like cut the line, which is like, you cannot really do that in Russia

00:28:09 or.

00:28:10 So like for a lot of people, like a lot of those experiences might seem not very fulfilling,

00:28:17 you know, like it’s on the verge of poverty, I suppose.

00:28:22 But do you remember all that time fondly, like, cause I do like the first time I drank,

00:28:29 you know, Coke, you know, all that stuff, right.

00:28:36 And just, yeah.

00:28:37 The connection with other human beings in Russia, I remember, I remember it really positively.

00:28:44 Like how do you remember what the nineties and then the Russia you were covering, just

00:28:48 the human connections you had with people and the experiences?

00:28:53 Well, my, my parents were both physicists.

00:28:57 My grandparents were both, well, my grandfather was a nuclear physicist, a professor

00:29:05 at the university.

00:29:06 My dad worked at Chernobyl when I was born, analyzing, kind of, everything

00:29:13 after the explosion.

00:29:15 And then I remember that and they were, so they were making sort of enough money in the

00:29:19 Soviet union.

00:29:20 So they were not, you know, extremely poor or anything.

00:29:23 It was pretty prestigious to be a professor, the dean at the university.

00:29:28 And then I remember my grandfather started making a hundred dollars a month after, you

00:29:33 know, in the nineties.

00:29:35 So then I remember our main line of work would be to go to our little tiny

00:29:40 country house, get a lot of apples there from apple trees, bring them back to the city and

00:29:48 sell them in the street.

00:29:50 So me and my nuclear physicist grandfather were just standing there selling those

00:29:56 apples the whole day, cause that would make you more money than, you know, working at

00:30:00 the university.

00:30:01 And then he’ll just tell me, try to teach me, you know, something about planets and

00:30:07 whatever the particles and stuff.

00:30:10 And, you know, I’m not smart at all, so I could never understand anything, but I was

00:30:14 interested as a journalist kind of type interested.

00:30:18 But that was my memory.

00:30:19 And, you know, I’m happy that I wasn’t, I somehow got spared that I was probably too

00:30:25 young to remember any of the traumatic stuff.

00:30:27 So the only thing I really remember that was very traumatic: I had this

00:30:31 bootleg Nintendo, which was called Dendy in Russia.

00:30:35 So in 1993, there was nothing to eat, like, even if you had any money, you would go to

00:30:39 the store and there was no food.

00:30:40 I don’t know if you remember that.

00:30:42 And our friend had a restaurant, like a government, half government owned something restaurant.

00:30:49 So they always had supplies.

00:30:51 So he exchanged a big bag of wheat for this Nintendo, the bootleg Nintendo, that I remember

00:31:00 very fondly, cause I think I was nine or something like that, or seven.

00:31:05 Like we just got it and I was playing it and there was this, you know, Dendy TV show.

00:31:11 Yeah.

00:31:12 So traumatic in a positive sense, you mean? Like a definitive... Well, they took it

00:31:17 away and gave me a bag of wheat instead.

00:31:19 And I cried like my eyes out for days and days and days.

00:31:23 Oh no.

00:31:24 And then, you know, as a, and my dad said, we’re going to like exchange it back in a

00:31:28 little bit.

00:31:29 So you keep the little gun, you know, the one that you shoot the ducks with.

00:31:32 So I’m like, okay, I’m keeping the gun.

00:31:34 So sometime it’s going to come back, but then they exchanged the gun as well for some sugar

00:31:38 or something.

00:31:39 I was so pissed.

00:31:41 I was like, I didn’t want to eat for days after that.

00:31:43 I’m like, I don’t want your food.

00:31:44 Give me my Nintendo back.

00:31:47 That was extremely traumatic.

00:31:50 But you know, I was happy that that was my only traumatic experience.

00:31:53 You know, my dad had to actually go to Chernobyl with a bunch of 20 year olds.

00:31:57 He was 20 when he went to Chernobyl and that was right after the explosion.

00:32:01 No one knew anything.

00:32:03 The whole crew he went with, all of them are dead now.

00:32:05 I think there was this one guy that was still alive until these last few years.

00:32:11 I think he died a few years ago now.

00:32:13 My dad somehow luckily got back earlier than everyone else.

00:32:19 And I was always like, well, how did they send you?

00:32:21 I was just born, you know, you had a newborn; talk about paternity leave.

00:32:26 They were like, but that’s who they took because they didn’t know whether you would be able

00:32:29 to have kids when you come back.

00:32:31 So they took the ones with kids.

00:32:33 So he went with some guys, and I’m just thinking of me when I was 20, I was so sheltered

00:32:40 from any problems whatsoever in life.

00:32:42 And then my dad, his 21st birthday at the reactor; you, like, work three hours a day,

00:32:51 you sleep the rest. And, yeah, so I played with a lot of toys from Chernobyl.

00:32:57 What are your memories of Chernobyl in general, like the bigger context, you know, because

00:33:02 of that HBO show, the world’s attention turned to it once again, like, what are your

00:33:09 thoughts about Chernobyl?

00:33:10 Did Russia screw that one up?

00:33:13 Like, you know, there’s probably a lot of lessons about our modern times with data about

00:33:18 coronavirus and all that kind of stuff.

00:33:20 It seems like there’s a lot of misinformation.

00:33:22 There’s a lot of people kind of trying to hide whether they screwed something up or

00:33:29 not, as it’s very understandable, it’s very human, very wrong, probably, but obviously

00:33:35 Russia was probably trying to hide that they screwed things up.

00:33:40 Like, what are your thoughts about that time, personal and general?

00:33:45 I mean, I was born when the explosion happened.

00:33:50 So actually a few months after, so of course I don’t remember anything apart from the fact

00:33:55 that my dad would bring me tiny toys, like plastic things that would just go crazy haywire

00:34:02 when you, you know, put the Geiger thing to it.

00:34:06 My mom was like, just nuclear about that.

00:34:09 She was like, what are you bringing, you should not do that.

00:34:13 She was nuclear.

00:34:14 Very nice.

00:34:15 Well done.

00:34:16 I’m sorry.

00:34:17 It was, but yeah, but the TV show was just phenomenal.

00:34:21 The HBO one?

00:34:22 Yeah, it was definitely, first of all, it’s incredible how that was made not by the Russians,

00:34:28 but someone else, but capturing so well everything about our country.

00:34:37 It felt a lot more genuine than most of the movies and TV shows that are made now in Russia,

00:34:41 just so much more genuine.

00:34:43 And most of my friends in Russia were just in complete awe about the show, but I think

00:34:47 that…

00:34:48 How good of a job they did.

00:34:49 Oh my God, phenomenal.

00:34:50 But also…

00:34:51 The apartments, there’s something, yeah.

00:34:52 The set design.

00:34:53 I mean, Russians can’t do that, you know, but you see everything and it’s like, wow,

00:34:58 that’s exactly how it was.

00:35:00 So I don’t know, that show, I don’t know what to think about it, because it’s British accents,

00:35:06 British actors, and the person, I forgot who created the show.

00:35:12 But I remember reading about him, and he’s not, he doesn’t even, like, there’s

00:35:17 no Russia in his history.

00:35:19 No, he did, like, Superbad or something like that.

00:35:21 Or like, I don’t know.

00:35:22 Yeah, like exactly.

00:35:23 Whatever that thing about the bachelor party in Vegas, number four and five or something

00:35:28 were the ones that he worked with.

00:35:30 Yeah.

00:35:31 But it made me feel really sad for some reason that if a person, obviously a genius,

00:35:38 could go in and just study and pay extreme attention to detail, they can do a good job.

00:35:46 It made me think like, why don’t other people do a good job with this?

00:35:52 Like about Russia, like there’s so little about Russia.

00:35:56 There’s so few good films about the Russian side of World War II.

00:36:02 I mean, there are so many interesting, evil, and beautiful moments in the history

00:36:10 of the 20th century in Russia, that it feels like there’s not many good films on it from the

00:36:16 Russians.

00:36:17 You would expect something from the Russians.

00:36:18 Well, they keep making these propaganda movies now.

00:36:21 Oh no.

00:36:22 Unfortunately.

00:36:23 But yeah, no, Chernobyl was such a perfect TV show.

00:36:26 I think capturing really well, it’s not about like even the set design, which was phenomenal,

00:36:30 but just capturing all the problems that exist now with the country and like focusing on

00:36:37 the right things.

00:36:38 Like if you build the whole country on a lie, that’s what’s going to happen.

00:36:41 And that’s just this very simple kind of thing.

00:36:47 Yeah.

00:36:48 And did your dad talk about it to you, like, his thoughts on the experience?

00:36:55 He never talks.

00:36:56 He’s this kind of Russian man that just... My husband, who’s American, asked him a

00:37:02 few times, like, you know, Igor, how did you, why did you say yes?

00:37:06 Or like, why did you decide to go?

00:37:08 You could have said no, not go to Chernobyl.

00:37:10 Why would like a person like, that’s what you do.

00:37:14 You cannot say no.

00:37:15 Yeah.

00:37:16 Yeah.

00:37:17 It’s just, it’s like a Russian way.

00:37:21 It’s the Russian way.

00:37:22 Men don’t talk that much.

00:37:23 Nope.

00:37:24 There’s no upside to that.

00:37:27 Yeah, that’s the truth.

00:37:29 Okay.

00:37:30 So back to post Putin Russia, or maybe we skipped a few steps along the way, but you

00:37:37 were trying to do, to be a journalist in that time.

00:37:43 What was, what was Russia like at that time?

00:37:46 Post you said 2007 Starbucks type of thing.

00:37:51 What else, what else was Russia like then?

00:37:55 I think there was just hope.

00:37:56 There was this big hope that we’re going to be, you know, friends with the United States

00:38:01 and we’re going to be friends with Europe and we’re just going to be also a country

00:38:06 like those with, you know, bike lanes and parks and everything’s going to be urbanized.

00:38:12 And again, we’re talking about nineties where like people would be shot in the street.

00:38:16 And it was, I still have a fond memory of going into a movie theater and, you know,

00:38:21 coming out of it after the movie.

00:38:22 And the guy that I saw on the stairs was, like, lying there shot, which was, again, it was like

00:38:28 a thing in the nineties that would be happening.

00:38:30 People were, you know, people were getting shot here and there, tons of violence, tons

00:38:35 of, you know, just basically mafia mobs in the streets.

00:38:40 And then the two thousands were like, you know, things just got cleaned up, oil went

00:38:44 up and the country started getting a little bit richer, you know, the nineties were so

00:38:50 grim mostly because the economy was in shambles and oil prices were not high.

00:38:54 So the country didn’t have anything.

00:38:56 We defaulted in 1998 and the money kept jumping back and forth.

00:39:01 Like first there were millions of rubles, then it got, like, defaulted, you know, then it

00:39:05 got to, like, thousands.

00:39:06 Then one ruble was something, then again to millions; it was like crazy town.

00:39:11 That was crazy.

00:39:12 And then the two thousands were just these years of stability in a way and the country

00:39:19 getting a little bit richer because of, you know, again, oil and gas.

00:39:22 And we started, specifically in Moscow and St. Petersburg,

00:39:27 to look at other cities in Europe, and New York, and the US, and trying to do the same in our,

00:39:34 like, small kind of cities and towns there.

00:39:38 What was, what were your thoughts of Putin at the time?

00:39:40 Well, in the beginning he was really positive.

00:39:43 Everyone was very, you know, positive about Putin.

00:39:46 He was young.

00:39:47 Um, very energetic.

00:39:49 He also immediately did the shirtless thing? Well, no, that was, like, way before the

00:39:55 shirtless era.

00:39:56 Um, the shirtless era.

00:39:57 Okay.

00:39:58 So he didn’t start out shirtless.

00:39:59 When did the shirtless era start? It’s like the propaganda of riding horses, fishing; 2010,

00:40:05 11, 12.

00:40:06 Yeah.

00:40:07 That’s my favorite.

00:40:08 You know, like people talk about their favorite Beatle; my favorite Putin

00:40:14 is the shirtless Putin.

00:40:15 Now I remember very, very clearly 1996 where, you know, Americans really helped Russia with

00:40:20 elections and Yeltsin got reelected, um, thankfully so, uh, because there was a huge threat that

00:40:27 the communists would actually get back to power.

00:40:29 Uh, they were a lot more popular.

00:40:32 And then a lot of American experts, political experts, uh, and campaign experts descended

00:40:39 on Moscow and helped Yeltsin actually get, get the presidency, the second term

00:40:44 of the presidency.

00:40:46 But Yeltsin was not feeling great, you know, by the end of his second term, uh,

00:40:52 he was, you know, an alcoholic.

00:40:53 He was really old.

00:40:54 He was falling off, uh, you know, the stages where he was talking.

00:40:59 Uh, so people were looking for fresh, I think for a fresh face, for someone who’s going

00:41:04 to continue Yeltsin’s, uh, work, but who’s going to be a lot more energetic and a lot

00:41:09 more active, young, um, efficient, maybe.

00:41:15 So that’s what we all saw in Putin back in the day.

00:41:17 I, I’d say that everyone, absolutely everyone in Russia in early two thousands who was not

00:41:22 a communist would be, yeah, Putin’s great.

00:41:25 We have a lot of hopes for him.

00:41:26 What are your thoughts?

00:41:27 And I promise we’ll get back to, uh, first of all, your love story.

00:41:34 Second of all, AI, well, what are your thoughts about, um, communism?

00:41:40 The 20th century, I apologize.

00:41:42 I’m reading The Rise and Fall of the Third Reich.

00:41:46 Oh my God.

00:41:48 So I’m like really steeped into like world war II and Stalin and Hitler and just these

00:41:56 dramatic personalities that brought so much evil to the world.

00:42:00 But it’s also interesting to politically think about these different systems and what they’ve

00:42:06 led to.

00:42:08 And Russia is one of the sort of beacons of communism in the 20th century.

00:42:16 What are your thoughts about communism?

00:42:17 Having experienced it as a political system?

00:42:20 I mean, I have only experienced it a little bit, but mostly through stories and through,

00:42:24 you know, seeing my parents and my grandparents who lived through that, I mean, it was horrible.

00:42:29 It was just plain horrible.

00:42:31 It was just awful.

00:42:33 You think it’s, there’s something, I mean, it sounds nice on paper.

00:42:40 So, like, the drawback of capitalism is that, uh, you know, eventually there is,

00:42:47 it’s, it’s the point of, like, a slippery slope.

00:42:51 Eventually it creates, uh, you know, the rich get richer; it creates a disparity, like,

00:42:59 um, wealth inequality.

00:43:02 If like, you know, I guess it’s hypothetical at this point, but eventually capitalism leads

00:43:09 to humongous inequality and that that’s, you know, some people argue that that’s a source

00:43:13 of unhappiness is it’s not like absolute wealth of people.

00:43:17 It’s the fact that there’s a lot of people much richer than you.

00:43:21 There’s a feeling of like, that’s where unhappiness can come from.

00:43:25 So the idea of, of communism or these sort of Marxism is, uh, is, is not allowing that

00:43:32 kind of slippery slope, but then you see the actual implementations of it and stuff seems

00:43:37 to be, seems to go wrong very badly.

00:43:42 What do you think that is?

00:43:43 Why does it go wrong?

00:43:46 What is it about human nature?

00:43:47 If we look at Chernobyl, you know, those kinds of bureaucracies that were constructed.

00:43:54 Is there something like, do you think about this much of like why it goes wrong?

00:44:00 Well, it’s not that everyone was equal.

00:44:05 Obviously the, you know, the, the government and everyone close to that were the bosses.

00:44:12 So it’s not, like, fully, I guess, uh, this dream of an equal life.

00:44:17 So then I guess the situation that we had, you know, that Russia had in the Soviet

00:44:24 Union, it was more a bunch of really poor people without any way to make any, you

00:44:30 know, significant fortune or build anything, living, um, under constant surveillance,

00:44:37 surveillance from other people.

00:44:38 Like you can’t even, you know, uh, do anything that’s not fully approved by the dictatorship

00:44:45 basically.

00:44:46 Otherwise your neighbor will write a letter and you’ll go to jail, absolute absence of

00:44:53 actual law.

00:44:54 Yeah.

00:44:55 It’s a constant state of fear.

00:44:57 You didn’t own anything.

00:45:00 You know, you couldn’t go travel, you couldn’t read anything, uh, Western,

00:45:05 you couldn’t really make a career, unless you were working in the, uh, military complex.

00:45:11 Um, which is why most of the scientists were so well regarded.

00:45:15 I come from, you know, both my dad and my mom come from families of scientists and they,

00:45:20 they were really well regarded as you, as you know, obviously.

00:45:23 Because the state wanted, I mean, cause there’s a lot of value to them being well regarded.

00:45:29 Because they were developing things that could be used in, in the military.

00:45:34 So that was very important.

00:45:35 That was the main investment.

00:45:36 Um, but it was miserable, it was all miserable.

00:45:40 That’s why, you know, a lot of Russians now live in the state of constant PTSD.

00:45:43 That’s why we, you know, want to buy, buy, buy, buy, and definitely, as soon as

00:45:48 we have the opportunity, you know, we just got to it finally, that we can, you know, own

00:45:53 things.

00:45:54 You know, I remember the time that we got our first yogurts and that was the biggest

00:45:57 deal in the world.

00:45:58 It was already in the nineties, by the way. I mean, what was your, like, favorite food,

00:46:03 where it was like, whoa, this is possible? Oh, fruit, because we only had apples, bananas,

00:46:12 and whatever.

00:46:13 And you know, whatever watermelons, whatever, you know, people would grow in the Soviet

00:46:17 Union.

00:46:18 There were no pineapples or papaya or mango, like you’ve never seen those fruit things.

00:46:24 Like those were so ridiculously good.

00:46:27 And obviously you could not get any like strawberries in winter or anything that’s not, you know,

00:46:32 seasonal.

00:46:33 Um, so that was a really big deal.

00:46:34 I’ve seen all these fruit things.

00:46:36 Yeah.

00:46:37 Me too.

00:46:38 Actually.

00:46:39 I don’t know.

00:46:40 I think I have a, like, I don’t think I have any too many demons, uh, or like addictions

00:46:44 or so on, but I think I’ve developed an unhealthy relationship with fruit.

00:46:47 I still struggle with, Oh, you can get any type of fruit, right?

00:46:51 If you get like also these weird fruit, fruits like dragon fruit or something or all kinds

00:46:57 of like different types of peaches, like cherries were killer for me.

00:47:02 I know, I know you say like we had bananas and so on, but I don’t remember having the

00:47:06 kind of banana.

00:47:07 Like when I first came to this country, the amount of banana, I like literally got fat

00:47:12 on bananas, like the amount, Oh yeah, for sure.

00:47:17 They were delicious.

00:47:18 And like cherries, the kind, like just the quality of the food, I was like, this is capitalism.

00:47:24 This is delicious.

00:47:25 Yeah.

00:47:26 I am.

00:47:27 Yeah.

00:47:28 It’s funny.

00:47:29 It’s funny.

00:47:30 Yeah.

00:47:31 Like, it’s funny to read.

00:47:36 I don’t know what to think of it; um, it’s funny to think how an idea that’s just

00:47:44 written on paper, when carried out amongst millions of people, what it actually

00:47:49 looks like when it becomes reality. Uh, sorry, but I’ve, uh, been studying

00:47:58 Hitler a lot recently and, uh, going through Mein Kampf.

00:48:04 He pretty much wrote out in Mein Kampf everything he was going to do.

00:48:07 Unfortunately, most leaders, including Stalin, didn’t read it, but it’s, it’s kind

00:48:13 of terrifying and I don’t know.

00:48:16 And amazing in some sense that you can have some words on paper and they can be brought

00:48:21 to life and they can either inspire the world or they can destroy the world.

00:48:26 And uh, yeah, there’s a lot of lessons to study in history that I think people don’t

00:48:32 study enough now.

00:48:35 One of the things I’m hoping with, I’ve been practicing Russian a little bit.

00:48:40 I’m hoping to sort of find, rediscover the, uh, the beauty and the terror of Russian history

00:48:49 through this stupid podcast by talking to a few people.

00:48:55 So anyway, I just feel like so much was forgotten.

00:48:58 So much was forgotten.

00:48:59 I’ll probably, I’m going to try to convince myself... um, you’re a super busy and super

00:49:04 important person, but I’m going to try to befriend you, to, uh, to try to become

00:49:11 a better Russian.

00:49:12 Cause I feel like I’m a shitty Russian.

00:49:14 Not that busy.

00:49:15 So I can totally be your Russian Sherpa.

00:49:19 Yeah.

00:49:20 But love, you were, you were talking about your early days of, uh, being a little bit

00:49:28 alone and finding a connection with the world through being a journalist.

00:49:33 Where did love come into that?

00:49:36 I guess finding, for the first time, um, some friends; it’s a very, you know, simple story.

00:49:42 Some friends that, all of a sudden, I guess we were at the

00:49:48 same place in our lives; um, we were 25, 26, I guess.

00:49:55 And, um, we just got really close, and I somehow remember this one

00:50:00 day, um, one day, you know, in summer, when we just stayed out, um, outdoors

00:50:06 the whole night and just talked, and for some unknown reason, it just felt for the first

00:50:11 time that someone could, you know, see me for who I am and it just felt extremely like

00:50:17 extremely good.

00:50:18 And I, you know, we fell asleep outside and just talking and it was raining.

00:50:22 It was beautiful, you know, sunrise and it’s really cheesy, but, um, at the same time,

00:50:28 we just became friends in a way that I’ve never been friends with anyone else before.

00:50:33 And I do remember that before and after that you sort of have this unconditional family

00:50:38 sort of, um, and it gives you tons of power.

00:50:43 It just basically gives you this tremendous power to do things in your life and to, um,

00:50:50 change positively on many different levels.

00:50:53 Power because you could be yourself.

00:50:56 At least you know that somewhere you can be just yourself; like, you don’t need

00:51:01 to pretend, you don’t need to be, you know, um, great at work, or tell some story, or sell

00:51:07 yourself in some way or another.

00:51:10 And so we became these really close friends, and, um, in a way, um, I started a company

00:51:17 cause he had a startup and I felt like I kind of wanted a startup too.

00:51:20 It felt really cool.

00:51:21 I don’t know what I’m going to, what I would really do, but I felt like I kind of need

00:51:25 a startup.

00:51:26 Okay.

00:51:27 So that’s, so that pulled you in to the startup world.

00:51:32 Yeah.

00:51:33 And then, yeah.

00:51:35 And then this, uh, closest friend of mine died.

00:51:38 We actually moved here to San Francisco together and then we went back for a visa to Moscow

00:51:42 and, uh, we lived together, we were roommates, and we came back and, um, he got hit by a

00:51:48 car right in front of the Kremlin, you know, next to the river, um, and died the same day

00:51:54 at the hospital.

00:51:58 So, and you had moved to America at that point? At that point I was, yes. What about him?

00:52:05 What about Roman?

00:52:06 Him too.

00:52:07 He actually moved first.

00:52:08 So I was always sort of trying to do what he was doing, so I didn’t like that he was

00:52:11 already here and I was still, you know, in Moscow and we weren’t hanging out together

00:52:15 all the time.

00:52:16 So was he in San Francisco?

00:52:18 Yeah, we were roommates.

00:52:20 So he just visited Moscow for a little bit.

00:52:23 We went back for, for our visas, we had to get a stamp in our passport for our work visas

00:52:28 and the embassy was taking a little longer, so we stayed there for a couple of weeks.

00:52:34 What happened?

00:52:35 How did he, so how, how did he, uh, how did he die?

00:52:40 Um, he was crossing the street and the car was going really fast and way over the speed

00:52:45 limit and just didn’t stop at the, at the pedestrian crossing, on the zebra, and just ran

00:52:51 over him.

00:52:52 When was this?

00:52:53 It was in 2015, on the 28th of November, so it was long ago now.

00:52:59 Um, but at the time, you know, I was 29, so for me it was, um, the first kind of meaningful

00:53:06 death in my life.

00:53:07 Um, you know, both sets of, I had both sets of grandparents at the time.

00:53:12 I didn’t see anyone so close die and death sort of existed, but as a concept, but definitely

00:53:18 not as something that would be, you know, happening to us anytime soon and specifically

00:53:24 our friends.

00:53:25 Cause we were, you know, we’re still in our twenties or early thirties and it still, it

00:53:29 still felt like the whole life is ahead, you know, you could still dream about ridiculous

00:53:36 things.

00:53:37 Um, so that was, it was just really, really abrupt I’d say.

00:53:43 What did it feel like to, uh, to lose him, like that feeling of loss?

00:53:49 You talked about the feeling of love, having power.

00:53:53 What is the feeling of loss, if you like?

00:53:57 Well in Buddhism, there’s this concept of Samaya where something really like huge happens

00:54:04 and then you can see very clearly.

00:54:07 Um, I think that was it: basically something changed me so much in

00:54:13 such a short period of time that I could just see really, really clearly what mattered and

00:54:19 what did not.

00:54:20 Well, I definitely saw that whatever I was doing at work didn’t matter at all, and some

00:54:25 of the other things too.

00:54:26 And, um, it was just this big realization when it’s this very, very clear vision of

00:54:31 what life’s about.

00:54:35 You still miss him today?

00:54:37 Yeah, for sure.

00:54:40 For sure.

00:54:41 He was just this constant... I think he was really important for me and for

00:54:47 our friends for many different reasons and, um, I think one of them being that we didn’t

00:54:53 just say goodbye to him, but we sort of said goodbye to our youth in a way.

00:54:58 It was like the end of an era, on so many different levels.

00:55:02 The end of Moscow as we knew it, the end of, you know, us living through our twenties and

00:55:08 kind of dreaming about the future.

00:55:11 Do you remember, like, the last several conversations? Are there moments with him that stick out,

00:55:17 that kind of haunt you, when you think about him?

00:55:22 Yeah, well, his last year here in San Francisco, he was pretty depressed, as his startup

00:55:28 was not really going anywhere and he wanted to do something else.

00:55:32 He wanted to build... he played with a bunch of ideas, but the

00:55:39 last one he had was around, um, building a startup around death.

00:55:44 So, um, he applied to Y Combinator with a video that, you know, I had on my computer

00:55:52 and it was all about, you know, disrupting death, thinking about new cemeteries, uh,

00:55:57 more biologically, like things that could be better biologically for, for humans.

00:56:03 And at the same time, having those, um, digital avatars, these kind of AI avatars that would

00:56:12 store all the memory about a person, that you could interact with.

00:56:15 What year was this?

00:56:16 2015.

00:56:17 Well, right before his death.

00:56:19 So it was like a couple of months before that he recorded that video.

00:56:23 And so I found it on my computer when, um... it was in our living room.

00:56:28 He never got in, but, um, he was thinking about it a lot somehow.

00:56:33 Did it have the digital avatar idea?

00:56:35 Yeah.

00:56:36 That’s so interesting.

00:56:37 Well, that’s in his pitch. The pitch has this idea, and he talks

00:56:42 about, like, I want to rethink how people grieve and how people talk about death.

00:56:45 Why was he interested in this?

00:56:48 Is it... maybe someone who’s depressed is, like, naturally inclined to think about that?

00:56:56 But I just felt, you know, this year in San Francisco, we just had so much, um, I was

00:57:00 going through a hard time.

00:57:01 And I was definitely trying to just make him happy somehow, to make him feel better.

00:57:07 And it felt like, you know, this, um, I dunno, I just felt like I was taking care of him

00:57:13 a lot and he almost started to feel better.

00:57:17 And then that happened and I dunno, I just felt, I just felt lonely again, I guess.

00:57:23 And that was, you know, coming back to San Francisco in December, after helping

00:57:28 organize the funeral and helping his parents. I came back here and it was a really lonely

00:57:33 apartment, a bunch of his clothes everywhere, and Christmas time.

00:57:38 And I remember I had a board meeting with my investors and I just couldn’t talk about it;

00:57:42 I had to pretend everything’s okay.

00:57:44 And you know, I’m just working on this company.

00:57:47 Um, yeah, it was definitely very, very tough, tough time.

00:57:55 Do you think about your own mortality?

00:58:00 You said, uh, you know, we’re young; the possibility of doing all kinds

00:58:06 of crazy things is still out there, is still before us, but, uh, it can end at any moment.

00:58:12 Do you think about your own ending at any moment?

00:58:17 Unfortunately, I think about it way too much.

00:58:23 Somehow after Roman, like every year after that, I started losing people that I really

00:58:27 love.

00:58:28 I lost my grandfather the next year, you know, the person who would explain to

00:58:34 me what the universe is made of while selling apples, and then I lost another

00:58:41 close friend of mine and, um, it just made me very scared.

00:58:46 I have tons of fear about, about that.

00:58:48 That’s what makes me not fall asleep oftentimes and just go in loops, and, um, then as

00:58:54 my therapist, you know, recommended to me, I put on, uh, some nice calming images with

00:59:02 a voiceover and it calms me down for sleep.

00:59:06 Yeah.

00:59:07 I’m really scared of death.

00:59:08 This is a big... I definitely have, I guess, some pretty big trauma about it and,

00:59:15 um, I’m still working through it.

00:59:17 There’s a philosopher, Ernest Becker, who wrote a book, um, The Denial of Death.

00:59:22 I’m not sure if you’re familiar with any of those folks.

00:59:25 Um, there’s a, in psychology, a whole field called terror management theory.

00:59:32 Sheldon Solomon, who just did this podcast; he wrote the book.

00:59:36 We talked for four hours about death, uh, the fear of death. But his whole

00:59:44 idea, from Ernest Becker, and I find this idea really compelling, is that

00:59:52 everything human beings have created, like our whole motivation in life, is to, uh,

01:00:00 escape death, is to try to, um, construct an illusion that we’re somehow immortal.

01:00:11 So like everything around us, this room, your startup, your dreams, everything you do

01:00:21 is a kind of, um, creation of a brain that, unlike any other mammal or species, is able to be

01:00:30 cognizant of the fact that it ends for us.

01:00:35 So, you know, there’s this question of, like, the meaning of life, where, you know,

01:00:40 you look at what drives us, uh, humans.

01:00:44 And when I read Ernest Becker, which I highly recommend people read, it was the first time...

01:00:50 it felt like this is the right thing at the core.

01:00:54 Uh, Sheldon’s book is called The Worm at the Core.

01:00:57 So he’s saying... I think it’s, uh, William James he’s quoting, or whoever: the

01:01:05 thing, what is at the core of it all?

01:01:07 Whether there’s like love, you know, Jesus might talk about like love is at the core

01:01:12 of everything.

01:01:13 I don’t know, that’s the open question.

01:01:15 What’s at the... you know, it’s turtles on turtles, but it can’t be turtles all the way down.

01:01:19 What’s at the bottom?

01:01:22 And, uh, Ernest Becker says it’s the fear of death. And in fact, cause you mentioned a therapist

01:01:30 and calming images, his whole idea is, um, you know, we want to bring that fear of

01:01:36 death as close as possible to the surface and, like, meditate on it.

01:01:43 Uh, and use the clarity of vision that provides to, you know, live a more

01:01:49 fulfilling life, to live a more honest life, to discover... you know, there’s something

01:01:58 about being cognizant of the finiteness of it all that might result in, um, the

01:02:05 most fulfilling life.

01:02:07 So that’s the, that’s the dual of what you’re saying.

01:02:10 Cause you kind of said, it’s like, I unfortunately think about it too much.

01:02:15 It’s a question whether it’s good to think about it, because, um, again, I talk

01:02:20 way too much about love, and probably death.

01:02:23 And when I ask people, friends, which is why I probably don’t have many friends, are you

01:02:29 afraid of death?

01:02:30 I think most people say they’re not.

01:02:35 Or when they say they’re, um, afraid, you know, it’s almost like they see

01:02:41 death as this kind of, uh, paper deadline or something.

01:02:45 And they’re afraid not to finish the paper before the deadline, like, I’m afraid not

01:02:50 to finish, um, the goals I have. But it feels like they’re not actually realizing that this

01:02:57 thing ends, like really realizing, really thinking as Nietzsche and all these philosophers did,

01:03:04 like thinking deeply about it. Because, you know, um, when you

01:03:13 think deeply about something, you can realize that you haven’t actually

01:03:18 thought about it before.

01:03:20 Uh, yeah.

01:03:22 And when I think about death, it’s like, um... it’s terrifying.

01:03:28 It feels like stepping outside into the cold, where it’s freezing, and then I have to, like,

01:03:34 hurry back inside where it’s warm.

01:03:36 Uh, but like, I think there’s something valuable about stepping out there into the freezing

01:03:43 cold.

01:03:44 Definitely.

01:03:45 When I talk to my mentor about it, he always, uh, tells me, well, what dies?

01:03:52 There’s nothing there that can die, but I guess that requires, um, well in, in Buddhism,

01:04:00 one of the concepts that are really hard to grasp and that people spend all their lives

01:04:05 meditating on would be anatta, which is the concept of non-self, and kind of thinking

01:04:12 that, you know, if you’re not your thoughts, which you’re obviously not because

01:04:15 you can observe them, and not your emotions, and not your body, then what is this?

01:04:20 And if you go really far, then finally you see that there’s no self; there’s this concept

01:04:27 of non-self.

01:04:28 So once you get there, how can that actually die?

01:04:32 What is dying?

01:04:33 Right.

01:04:34 You’re just a bunch of molecules, stardust.

01:04:38 But that is very, um, you know, very advanced, um, spiritual work for me.

01:04:44 I’m definitely just... definitely not there.

01:04:47 Oh my God.

01:04:48 No, I, uh... I think it’s very, very useful.

01:04:50 It’s just that maybe being so afraid is not useful, and mine is more... I’m just terrified.

01:04:56 Like, it really makes me, um...

01:04:58 On a personal level.

01:04:59 On a personal level.

01:05:00 I’m terrified.

01:05:01 How do you overcome that?

01:05:02 I don’t.

01:05:03 I’m still trying to.

01:05:04 Have pleasant images?

01:05:05 Well, pleasant images get me to sleep and then during the day I can distract myself with

01:05:20 other things, like talking to you.

01:05:24 I’m glad we’re both doing the same exact thing.

01:05:26 Okay, good.

01:05:27 Are there, like, moments since you’ve, uh, lost Roman that you had, like, moments

01:05:39 of bliss, where you’ve forgotten... where you’ve achieved that Buddhist-like

01:05:47 level of, like, what can possibly die?

01:05:52 Like, uh, losing yourself in the moment, in the ticking time of this universe,

01:06:02 and you’re just part of it for a brief moment and just enjoying it.

01:06:06 Well that goes hand in hand.

01:06:08 I remember, I think a day or two after he died, we went to finally get his passport out of

01:06:13 the embassy, and we were driving around Moscow, and it was, you know, December, when usually

01:06:19 there’s never any sun in Moscow, and somehow it was an extremely sunny day

01:06:25 and we were driving with a close friend.

01:06:30 And I remember feeling for the first time maybe this just moment of incredible clarity

01:06:35 and somehow happiness, not like happy happiness, but happiness and just feeling that, you know,

01:06:45 I know what the universe is sort of about, whether it’s good or bad.

01:06:49 And it wasn’t a sad feeling.

01:06:50 It was probably the most beautiful feeling that you can ever achieve.

01:06:56 And you can only get it when something, oftentimes when something traumatic like that happens.

01:07:03 But also if you really spend a lot of time meditating and looking at nature,

01:07:07 doing something that really gets you there.

01:07:09 I think when you, uh, summit a mountain, a really hard mountain,

01:07:14 you inevitably get there.

01:07:16 That’s just a way to get to the state.

01:07:18 But once you’re in this state, um, you can do really big things.

01:07:24 I think.

01:07:25 Yeah.

01:07:26 Sucks it doesn’t last forever.

01:07:28 So Bukowski talked about how love is a fog.

01:07:32 Like, when you wake up in the morning, it’s there, but it eventually dissipates.

01:07:38 It’s really sad.

01:07:40 Nothing lasts forever.

01:07:41 But I definitely like doing this pushup and running thing.

01:07:46 There were moments, a couple of moments, like... I’m not a crier.

01:07:51 I don’t cry.

01:07:52 But there were moments where I was, like, facedown on the carpet with tears in my eyes,

01:07:59 which is interesting.

01:08:00 And then that complete... like, uh, there’s a lot of demons.

01:08:05 I’ve got demons; I had to face them.

01:08:07 Funny how running makes you face your demons.

01:08:09 But at the same time, the flip side of that, there’s a few moments where I was in bliss

01:08:16 and all of it alone, which is funny.

01:08:19 That’s beautiful.

01:08:20 I like that. But definitely pushing yourself physically is one of them, for sure.

01:08:27 Yeah.

01:08:28 Yeah.

01:08:29 Like you said, I mean, you were speaking of Mount Everest as a metaphor, but it also works

01:08:34 literally, I think. Physical endeavor, somehow.

01:08:39 Yeah.

01:08:40 There’s something.

01:08:41 I mean, we’re monkeys, apes, whatever; there’s a physical thing to it, but there’s

01:08:46 something to this pushing yourself physically, but alone, that happens when you’re

01:08:53 doing things like you do, or strenuous workouts, or, you know, rowing

01:08:58 across the Atlantic, or marathons.

01:09:01 I love watching marathons and you know, it’s so boring, but you can see them getting there.

01:09:09 So the other thing, I don’t know if you know, there’s a guy named David Goggins.

01:09:14 He basically, uh... he’s been either emailing or on the phone with me every day through

01:09:20 this.

01:09:21 I haven’t been exactly alone, but he’s kind of, he’s the devil on my

01:09:27 shoulder.

01:09:28 Uh, so he’s like the worst possible human being in terms of giving you, uh... like,

01:09:36 um, through everything I’ve been doing, he’s been doubling everything I do.

01:09:40 So he’s insane.

01:09:42 Uh, he’s this Navy SEAL person.

01:09:45 He wrote this book,

01:09:47 Can’t Hurt Me.

01:09:48 He’s basically one of the toughest human beings on earth.

01:09:50 He ran all these crazy ultra marathons in the desert.

01:09:54 He set the world record for the number of pull-ups.

01:09:56 He just does everything where it’s like, how can I suffer today?

01:10:03 He figures that out and does it.

01:10:05 Yeah.

01:10:06 That, um, whatever that is, uh, that process of self discovery is really important.

01:10:11 I actually had to turn myself off from the internet, mostly because I started this, like,

01:10:16 workout thing like a happy-go-getter, with my, like, headband and, just, uh...

01:10:24 because a lot of people were, like, inspired, and they’re like, yeah, we’re going to exercise

01:10:27 with you.

01:10:28 And I was like, yeah, great.

01:10:30 You know, but then I realized that this journey can’t be done together with others.

01:10:38 This has to be done alone.

01:10:41 So out of the moments of love, out of the moments of loss, can we, uh, talk about your

01:10:48 journey of finding, I think, an incredible idea, an incredible company, and an incredible

01:10:56 system in Replika?

01:10:59 How did that come to be?

01:11:01 So yeah, so I was a journalist and then I went to business school for a couple of years

01:11:05 to, um, just see if I could maybe switch gears and do something else at 23.

01:11:12 And then I came back and started working for a businessman in Russia who built the first

01:11:17 4G network, um, in our country and was very visionary, and asked me whether I wanted to do

01:11:25 fun stuff together.

01:11:26 Um, and we worked on a bank, um, the idea was to build a bank on top of, um, a telco.

01:11:34 So that was 2011 or 12, um, and a lot of telecommunication companies, um, mobile network operators, didn’t

01:11:42 really know what to do next in terms of, you know, new products, new revenue.

01:11:47 And this big idea was that, you know, um, you put a bank on top and then it all works

01:11:53 out.

01:11:54 Basically, a prepaid account becomes your bank account and, um, you can use it as your

01:11:58 bank.

01:11:59 Uh, so, you know, a third of the country wakes up as your bank’s client.

01:12:05 Um, but we couldn’t quite figure out what, what would be the main interface to interact

01:12:10 with the bank.

01:12:11 The problem was that most people didn’t have smartphones back then in Russia;

01:12:15 the penetration of smartphones was low, um, people didn’t use mobile banking, or online

01:12:20 banking on their computers.

01:12:22 So we figured out that SMS would be the best way, uh, cause that would work on feature

01:12:26 phones.

01:12:27 Um, but that required some chat bot technology, which I didn’t know anything about, um, obviously.

01:12:33 So I started looking into it and saw that there was nothing really... well, there was

01:12:37 just nothing, really.

01:12:38 The idea is, through SMS, to be able to interact with your bank account.

01:12:41 Yeah.

01:12:42 And then we thought, well, since you’re talking to a bank account, why can’t we

01:12:46 use more of, uh, you know, some behavioral ideas, and why can’t this, uh, banking chat

01:12:52 bot be nice to you and really talk to you sort of as a friend? This way you develop more

01:12:56 connection to it, retention is higher, people don’t churn.

01:12:59 And so I went to very depressing, um, Russian cities to test it out.

01:13:05 Um, I went to, I remember, three different towns, uh, to interview potential

01:13:12 users.

01:13:13 Um, so people used it for a little bit and I went to talk to them. Um, very poor towns,

01:13:19 mostly towns that were, um, you know, built around factories, uh, monotowns.

01:13:26 They were building something, and then the factory went away, and it was just a bunch

01:13:29 of very poor people.

01:13:32 Um, and then we went to a couple that weren’t as dramatic, but still the one I remember

01:13:37 really fondly was this woman that worked at a glass factory and she talked to a chat bot.

01:13:41 Um, and she was talking about it and she started crying during the interview because she said,

01:13:46 no one really cares for me that much.

01:13:50 And, um, so to be clear, that chatbot was my only endeavor in programming.

01:13:56 So it was really simple.

01:13:58 It was literally just a few if-this-then-that rules and, um, it was incredibly simplistic.

01:14:06 Um, and that really made her emotional and she said, you know, I only have my mom and

01:14:12 my, um, my husband, and I don’t have anyone else really in my life.

01:14:18 And that was very sad. But at the same time, we had more interviews in a similar

01:14:22 vein, and what I thought in the moment was, well, uh, it’s not that the technology

01:14:27 is ready, because in 2012 the technology was definitely not ready for that, but, um, humans

01:14:34 are ready, unfortunately.

01:14:36 So this project would not be about, like, tech capabilities; it would be more about human vulnerabilities.

01:14:42 But, um, there’s something so powerful about conversational, um, AI that I

01:14:49 saw then, that I thought it was definitely worth putting a lot of effort into.

01:14:54 So at the end of the day, we sold the banking project, um, but my then boss, um, who was also

01:15:01 my mentor and a really, really close friend, um, told me, hey, I think there’s something

01:15:06 in it and you should just go work on it.

01:15:08 And I was like, well, what product?

01:15:10 I don’t know what I’m building.

01:15:11 He’s like, you’ll figure it out.

01:15:14 And, um, you know, looking back at this, it was a horrible idea to work on something without

01:15:18 knowing what it was, which is maybe the reason why it took us so long. But we just decided

01:15:24 to work on the conversational tech to see where it would go. You know, there were no chatbot,

01:15:30 um, constructors or programs or anything that would allow you to actually build one at the

01:15:35 time.

01:15:36 Uh, that was the era of, by the way, Google Glass, which is why, you know, some of the

01:15:40 investors, like seed investors we talked with, were like, oh, you should totally build

01:15:44 it for Google Glass.

01:15:45 If not, we’re not interested; I don’t think that’s interesting.

01:15:48 Did you bite on that idea?

01:15:50 No.

01:15:51 Okay.

01:15:52 Because I wanted to do text first, cause I’m a journalist.

01:15:56 So I was, um, fascinated by just texting.

01:16:01 So you thought... the emotional, um, that interaction that the woman had... do

01:16:07 you think you could feel emotion from just text?

01:16:10 Yeah.

01:16:11 I saw something in just this pure texting, and I also thought that we should first

01:16:17 start building for people who really need it, versus people who have Google Glass,

01:16:20 uh, if you know what I mean. And I felt like the early adopters of Google Glass might not

01:16:25 be overlapping with people who are really lonely and might need, you know, someone

01:16:29 to talk to.

01:16:31 Um, but then we really just focused on the tech itself.

01:16:35 You know, we didn’t have a product idea at the moment,

01:16:39 and we felt, what if we just look into, um, building the best conversational constructors,

01:16:46 so to say, using the best tech available at the time.

01:16:49 And that was before the first paper about deep learning applied to dialogues, which

01:16:53 Google published in August 2015.

01:17:01 Did you follow the work on the Loebner Prize, and, like, all the sort of non-machine-learning

01:17:09 chat bots?

01:17:10 Yeah.

01:17:11 What really struck me was that, you know, there was a lot of talk about machine learning

01:17:15 and deep learning.

01:17:16 Like big data was a really big thing.

01:17:17 Everyone in the business world was saying, you know, big data; 2012 was when the biggest Kaggle

01:17:22 competitions were, you know, um, important, and that was really the kind of upheaval.

01:17:27 People started talking about machine learning a lot, um, but it was only about images or

01:17:32 something else.

01:17:33 And it was never about conversation.

01:17:34 As soon as I looked into the conversational tech, it was all about something really weird

01:17:39 and very outdated and very marginal and felt very hobbyist.

01:17:42 It was all about the Loebner Prize, which was won by a guy who built a chatbot that

01:17:47 talked like a Ukrainian teenager; it was just a gimmick.

01:17:51 And somehow people picked up those gimmicks, and then, you know, the most famous chatbot

01:17:56 at the time was ELIZA, from the 1960s, which was really bizarre, or SmarterChild on AIM.

01:18:03 The funny thing is, it didn’t feel that popular at the time, and it still doesn’t seem

01:18:09 to be that popular.

01:18:11 Like people talk about the Turing test, people like talking about it philosophically, journalists

01:18:15 like writing about it, but as a technical problem, like people don’t seem to really

01:18:21 want to solve the open dialogue.

01:18:26 Like they, they’re not obsessed with it.

01:18:29 Even folks like, you know... I’m in Boston, and the Alexa team, even they’re not as obsessed

01:18:35 with it as I thought they might be.

01:18:38 Why not?

01:18:39 What do you think?

01:18:40 So, you know what you felt with that woman, when she felt something by

01:18:45 reading the text? I feel the same thing.

01:18:48 There’s something here, what you felt.

01:18:51 I feel like the Alexa folks, and just the machine learning world, don’t feel that there’s

01:18:59 something here, because they see it as a technical problem that’s not that interesting, for some reason.

01:19:07 It could be argued that maybe, as a purely sort of natural language processing problem,

01:19:12 it’s not the right problem to focus on because there’s too much subjectivity.

01:19:17 That thing that the woman felt, like crying... if your benchmark includes a woman crying,

01:19:24 that doesn’t feel like a good benchmark.

01:19:27 But to me there’s something there where you could have a huge impact. But I don’t

01:19:32 think the machine learning world likes that: the human emotion, the subjectivity of it,

01:19:38 the fuzziness, the fact that with maybe a single word you can make somebody feel something

01:19:43 deeply.

01:19:44 What is that?

01:19:45 It doesn’t feel right to them.

01:19:47 So I don’t know.

01:19:48 I don’t know why that is.

01:19:50 That’s why I was excited when I discovered your work. It feels wrong to say that;

01:19:57 it’s not like I’m giving myself props for Googling, or for, I guess, a

01:20:10 mutual friend coming across and introducing us, but I’m so glad that you exist and for what you’re working

01:20:15 on.

01:20:16 But I have the same kind of, if we could just backtrack for a second, because I have the

01:20:20 same kind of feeling that there’s something here.

01:20:22 In fact, I’ve been working on a few things that are kind of crazy, very different from

01:20:29 your work.

01:20:30 I think they’re too crazy.

01:20:34 But the…

01:20:35 Like what?

01:20:36 I don’t have to know.

01:20:38 No, all right, we’ll talk about it more.

01:20:41 I feel like it’s harder to talk about things that have failed and are failing while you’re

01:20:49 a failure.

01:20:53 It’s easier for you because you’re already successful on some measures.

01:20:59 Tell it to my board.

01:21:01 Well, I think you’ve demonstrated success in a lot of ways.

01:21:07 It’s easier for you to talk about failures for me.

01:21:10 I’m in the bottom currently of the success.

01:21:19 You’re way too humble.

01:21:21 So it’s hard for me to know, but there’s something there, there’s something there.

01:21:25 And I think you’re exploring that and you’re discovering that.

01:21:31 So it’s been surprising to me.

01:21:32 But you’ve mentioned this idea that you thought it wasn’t enough to start a company or start

01:21:41 efforts based on a feeling that there’s something here.

01:21:46 Like, what did you mean by that?

01:21:49 Like you should be focused on creating a, like you should have a product in mind.

01:21:55 Is that what you meant?

01:21:56 It just took us a while to discover the product, because it all started with a hunch,

01:22:03 of me and my mentor just sitting around, and he was like, well, that’s it.

01:22:08 That’s, you know... the Holy Grail is there.

01:22:11 There’s something extremely powerful in conversations, and there’s no one

01:22:17 who’s working on machine conversation from the right angle.

01:22:19 So to say.

01:22:20 I feel like that’s still true.

01:22:22 Am I crazy?

01:22:23 Oh no, I totally feel that’s still true, which is, I think it’s mind blowing.

01:22:28 Yeah.

01:22:29 You know what it feels like?

01:22:30 I wouldn’t even use the word conversation cause I feel like it’s the wrong word.

01:22:35 It’s like a machine connection or something.

01:22:39 I don’t know cause conversation, you start drifting into natural language immediately.

01:22:44 You start drifting immediately into all the benchmarks that are out there.

01:22:47 But I feel like it’s like the personal computer days of this.

01:22:52 Like, I feel like we’re in the early days, like with Wozniak and all of them,

01:22:57 where it was the same kind of thing: a very small niche group of people who are

01:23:04 all kind of Loebner Prize type people.

01:23:07 Yeah.

01:23:08 Hobbyists.

01:23:09 Hobbyists, but, like, not even hobbyists with big dreams.

01:23:13 No, hobbyists with a dream to trick, like, a jury.

01:23:17 Yeah.

01:23:21 It’s, like, weird. By the way, very weird.

01:23:21 So if we think about conversations, first of all, when I have great conversations with

01:23:26 people, I’m not trying to test them.

01:23:30 So, for instance, I don’t try to break them; like, I’m actually playing along, I’m part of

01:23:33 it.

01:23:34 Right.

01:23:35 If I were to ask this person or test whether he’s going to give me a good conversation,

01:23:40 it would have never happened.

01:23:41 So the whole problem with testing conversations is that you can’t put it in front

01:23:47 of a jury, because then you have to go into some Turing test mode: is it responding

01:23:52 to all my factual questions, right?

01:23:55 Or so it really has to be something in the field where people are actually talking to

01:24:00 it because they want to, not because we’re just trying to break it.

01:24:05 And it’s working for them. Because the weird part of it is that it’s very subjective.

01:24:11 It takes two to tango here fully.

01:24:13 If you’re not trying to have a good conversation, if you’re trying to test it, then it’s going

01:24:16 to break.

01:24:17 I mean, any person would break, to be honest.

01:24:19 If I’m not trying to even have a conversation with you, you’re not going to give it to me.

01:24:24 Yeah.

01:24:25 If I keep asking you, like, some random questions or jumping from topic to topic, which

01:24:30 I’m probably doing, that probably wouldn’t contribute to the conversation.

01:24:36 So I think, for the problem of testing, there should be some other metric.

01:24:42 How do we evaluate whether that conversation was powerful or not, which is what we actually

01:24:47 started with.

01:24:48 And I think those measurements exist and we can test on those.

01:24:51 But what really struck us back in the day, and what eight years later is still

01:24:58 not resolved, is that I’m not seeing tons of groups working on it.

01:25:02 Maybe I just don’t know about them, it’s also possible.

01:25:06 But the interesting part about it is that most of our days we spend talking, and

01:25:10 those conversations are not, like, turn on the lights, or customer support

01:25:17 problems, or some other task-oriented things.

01:25:22 These conversations are something else and then somehow they’re extremely important for

01:25:26 us.

01:25:27 If we don’t have them, then we feel deeply unhappy, potentially lonely, which as we know,

01:25:34 creates tons of risk for our health as well.

01:25:38 And so this is most of our hours as humans and somehow no one’s trying to replicate that.

01:25:45 And not even study it that well?

01:25:49 And not even study that well.

01:25:50 So when we jumped into that in 2012, I looked first at like, okay, what’s the chatbot?

01:25:54 What’s the state of the art chatbot?

01:25:57 And those were the Loebner Prize days, but I thought, okay, so what about the science

01:26:02 of conversation?

01:26:04 Clearly there have been tons of scientists or academics that looked into the conversation.

01:26:12 So if I want to know everything about it, I can just read about it.

01:26:17 There’s not much, really. There are conversation analysts, who are basically just listening

01:26:23 to speech, to different conversations, and annotating them.

01:26:28 And, I mean, that’s not really used for much.

01:26:32 That’s the field of theoretical linguistics, which is barely useful.

01:26:39 It’s very marginal; even in their space, no one really is excited, and I’ve never met a

01:26:44 theoretical linguist who was like, I can’t wait to work on conversation analytics.

01:26:49 That is just something very marginal, sort of applied to like writing scripts for salesmen

01:26:54 when they analyze which conversation strategies were most successful for sales.

01:27:00 Okay, so that was not very helpful.

01:27:03 Then I looked a little bit deeper into whether there were any books written

01:27:09 on what really contributes to great conversation. That was really strange, because most of those

01:27:16 were NLP books, which is neurolinguistic programming, which is not the NLP I was expecting

01:27:27 it to be. It was mostly from some psychologist, Richard Bandler, I think, who came up with that;

01:27:33 this big guy in a leather vest who could program your mind by talking to you.

01:27:41 How to be charismatic and charming and influential with people, all those books, yeah.

01:27:45 Pretty much, but it was all about, like, reprogramming you through conversation.

01:27:49 So that was, I mean, probably not very true, and it didn’t seem to be working

01:27:58 very well even back in the day.

01:28:00 And then there were some other books like, I don’t know, mostly just self help books

01:28:05 around how to be the best conversationalist or how to make people like you or some other

01:28:12 stuff like Dale Carnegie or whatever.

01:28:17 And then there was this one book, The Most Human Human by Brian Christian, that really

01:28:21 was important for me to read back in the day, because he was on the human side. He was taking

01:28:29 part in the Loebner Prize, not as a jury member, but basically...

01:28:35 you have to tell a computer from a human, and he was the human, so you

01:28:40 could either get him or a computer.

01:28:43 And his whole book was about what makes us human in conversation.

01:28:49 And that was a little bit more interesting, because at least someone had started to think

01:28:52 about what exactly makes us human in conversation and makes people believe in that. But it was

01:28:59 still about tricking; it was still about the imitation game; it was still about, okay, well, what

01:29:03 kind of parlor tricks can we throw into the conversation to make you feel like you’re

01:29:07 talking to a human, not a computer.

01:29:09 And it was definitely not about thinking, what is it exactly that we’re getting from

01:29:16 talking all day long with other humans.

01:29:19 I mean, we’re definitely not just trying to be tricked, and it’s not just enough to know

01:29:23 it’s a human.

01:29:25 It’s something we’re getting there. Can we measure it, and can we put the computer up to

01:29:30 the same measurement and see whether you can talk to a computer and get the same results?

01:29:35 Yeah, so first of all, a lot of people comment that they think I’m a robot; it’s very possible

01:29:40 I am a robot. And this whole thing... I totally agree with you that the test idea is fascinating,

01:29:45 and I looked for books unrelated to this kind of thing. So, I’m afraid of people, I’m generally

01:29:51 introverted and quite possibly a robot.

01:29:55 I literally Googled how to talk to people and how to have a good conversation for the

01:30:03 purpose of this podcast, because I was like, I can’t, I can’t make eye contact with people.

01:30:08 I can’t like hire.

01:30:10 I do Google that a lot too.

01:30:12 You’re probably reading a bunch of FBI negotiation tactics.

01:30:15 Is that what you’re getting?

01:30:17 Well, everything you’ve listed I’ve gotten. There have been very few good books on even just,

01:30:24 like, how to interview well; it’s rare.

01:30:28 So what I often end up doing is watching with a critical eye; it’s just so different

01:30:37 from when you just watch a conversation for the fun of it, just as a human.

01:30:43 And if you watch a conversation that way, it’s like trying to figure out, why is this awesome?

01:30:49 I’ll listen to a bunch of different styles of conversation.

01:30:52 I mean, I’m a fan of the podcast, Joe Rogan, people can make fun of him or whatever and

01:31:00 dismiss him.

01:31:01 But I think he’s an incredibly artful conversationalist.

01:31:06 He can pull people in for hours.

01:31:09 And there’s another guy I watch a lot.

01:31:14 He hosted a late night show, his name was Craig Ferguson.

01:31:20 So he’s like very kind of flirtatious.

01:31:23 But there’s a magic about his like, about the connection he can create with people,

01:31:30 how he can put people at ease.

01:31:33 And, just like... see, I’ve already started sounding like those NLP people or something.

01:31:37 I don’t mean it in that way.

01:31:39 I don’t mean like how to charm people or put them at ease and all that kind of stuff.

01:31:43 It’s just like, what is that?

01:31:45 Why is that fun to listen to that guy?

01:31:47 Why is that fun to talk to that guy?

01:31:51 What is that?

01:31:52 Because he’s not saying... I mean, it so often boils down to a kind of wit and humor, but

01:32:01 not really humor.

01:32:03 It’s like, I don’t know, I have trouble actually even articulating it correctly.

01:32:10 But it feels like there’s something going on that’s not too complicated, that could

01:32:18 be learned.

01:32:22 And it’s not similar to, yeah, to like, like you said, like the Turing test.

01:32:29 It’s something else.

01:32:32 I’m thinking about it a lot, all the time.

01:32:34 I do think about it all the time.

01:32:38 I think, when we were looking... so we started the company, and we just decided to build the

01:32:42 conversational tech. We thought, well, there’s nothing for us to build this chatbot with that

01:32:47 we want to build.

01:32:48 So let’s just first focus on building, you know, some tech, building the tech side of

01:32:54 things without a product in mind. We added, like, a demo chatbot

01:33:01 that would recommend restaurants and talk to you about restaurants, just to show something

01:33:04 simple that people could relate to and could try out and see whether it works

01:33:11 or not.

01:33:12 But we didn’t have a product in mind yet.

01:33:15 We thought we would try venture chatbots and figure out our consumer application.

01:33:19 And we sort of remembered that we wanted to build that kind of friend, that sort of connection

01:33:23 that we saw in the very beginning.

01:33:26 But then we got to Y Combinator and moved to San Francisco and forgot about it.

01:33:30 You know, everything because then it was just this constant grind.

01:33:33 How do we get funding?

01:33:34 How do we get this?

01:33:35 You know, investors were like, just focus on one thing, just get it out there.

01:33:40 So somehow we’ve started building a restaurant recommendation chatbot for real for a little

01:33:45 bit, not for too long.

01:33:47 And then we tried building 40, 50 different chatbots.

01:33:50 And then all of a sudden we wake up and everyone is obsessed with chatbots.

01:33:54 Somewhere in 2016 or end of 15, people started thinking that’s really the future.

01:33:59 That’s the new, you know, the new apps will be chatbots.

01:34:04 And we were very perplexed, because people started coming up with companies... I think

01:34:08 we had tried most of those chatbots already, and there were, like, no users. But still people

01:34:13 were coming up with a chatbot that will tell you the weather, and bring news, and this and

01:34:19 that.

01:34:20 And we couldn’t understand whether we just didn’t execute well enough, or whether people

01:34:25 were confused and were going to find out the truth: that people don’t

01:34:31 need chatbots like that.

01:34:32 So the basic idea is that you use chatbots as the interface to whatever application.

01:34:37 Yeah.

01:34:38 The idea was that it’s like this perfect universal interface to anything.

01:34:43 When I looked at that, it just made me very perplexed, because I didn’t

01:34:46 understand how that would work, because I think we tried most of that and none of those things

01:34:52 worked.

01:34:53 And then again, that craze has died down, right?

01:34:56 Fully.

01:34:57 I think now it’s impossible to get anything funded if it’s a chatbot.

01:35:01 I think it’s similar to... sorry to interrupt, but there are times when people think, like, with

01:35:06 gestures you can control devices, like basically gesture-based control things.

01:35:13 It feels similar to me because it’s so compelling; it was just like Tom Cruise,

01:35:19 I can control stuff with my hands. But when you get down to it, it’s like, well,

01:35:25 why don’t you just have a touch screen or why don’t you just have like a physical keyboard

01:35:30 and mouse?

01:35:33 So chat as the interface was always, yeah, it was perplexing to me.

01:35:39 I still feel augmented reality, even virtual reality, is in that ballpark in terms of it

01:35:46 being a compelling interface.

01:35:48 I think there are going to be incredibly rich applications, just how you’re thinking about

01:35:54 it, but they won’t just be the interface to everything.

01:35:57 It’ll be its own thing that will create an amazing magical experience in its own right.

01:36:04 Absolutely.

01:36:05 Which is, I think, kind of the right way to go about it: like, what’s the magical experience

01:36:10 with that interface specifically?

01:36:14 How did you discover that for Replika?

01:36:16 I just thought, okay, we have this tech, we can build any chatbot we want.

01:36:20 We had, at that point, the most sophisticated tech that other companies had.

01:36:24 I mean, among startups; obviously not, probably, the bigger ones. But still, because we’d

01:36:29 been working on it for a while.

01:36:31 So I thought, okay, we can build any conversation.

01:36:33 So let’s just create a scale from one to 10.

01:36:37 And one would be conversations that you’d pay to not have, and 10 would be conversations

01:36:41 you’d pay to have.

01:36:42 And I mean, obviously we want to build a conversation that people would pay to actually have.

01:36:47 And so for the whole, for a few weeks, me and the team were putting all the conversations

01:36:51 we were having during the day on the scale.

01:36:54 And very quickly, we figured out that all the conversations that we would pay to never

01:36:58 have were conversations where we were trying to cancel Comcast, or talk to customer support,

01:37:07 or make a reservation, or just talk about logistics with a friend when we’re trying

01:37:12 to figure out where someone is and where to go, or all sorts of setting up and scheduling

01:37:19 meetings.

01:37:20 So that was a conversation we definitely didn’t want to have.

01:37:24 Basically everything task-oriented was a one, because if there was just one button for me

01:37:29 to press, or not even a button, if I could just think, and there was some magic BCI that

01:37:34 would just immediately transform that into an actual interaction, that would be perfect.

01:37:41 But the conversation there was just this boring, not useful, dull, and also very inefficient

01:37:49 thing, because there was so much back-and-forth.

01:37:52 And as soon as we looked at the conversations that we would pay to have, those were the

01:37:56 ones with, well, first of all, therapists, because we actually paid to have those conversations.

01:38:01 And we’d also try to put, like, dollar amounts on them.

01:38:03 So if I was calling Comcast, I would pay $5 to not have this one hour talk on the phone.

01:38:08 I would actually pay straight up, like money, hard money, but it just takes a long time.

01:38:13 It takes a really long time.

01:38:17 But as soon as we started talking about conversations that we would pay for, those were therapists,

01:38:22 all sorts of therapists, coaches, an old friend, someone I hadn’t seen for a long time, a

01:38:30 stranger on a train, weirdly, a stranger in a line for coffee; a nice back-and-forth

01:38:36 with that person was like a good five, solid five, six, maybe not a 10.

01:38:41 Maybe I won’t pay money, but at least I won’t pay money to not have one.

01:38:45 So that was pretty good.

01:38:46 There were some intellectual conversations for sure.

01:38:50 But more importantly, the one thing that was really making those very important and very valuable

01:39:00 for us was the conversations where we could be pretty emotional.

01:39:06 Yes, some of them were about being witty and about being intellectually stimulated, but

01:39:11 those were interestingly more rare.

01:39:14 And most of the ones that we thought were very valuable were the ones where we could

01:39:18 be vulnerable.

01:39:19 And interestingly, where we could talk more, me and the team.

01:39:27 So we were talking about it: in a lot of these conversations, like with a therapist, it was mostly

01:39:31 me talking, or like with an old friend, I was, like, opening up and crying, and it was again

01:39:36 me talking.

01:39:37 And so that was interesting because I was like, well, maybe it’s hard to build a chat

01:39:42 bot that can talk to you very well and in a witty way, but maybe it’s easier to build

01:39:47 the chat bot that could listen.

01:39:51 So that was kind of the first nudge in this direction.

01:39:56 And then my friend died. At that point we were kind of still struggling

01:40:01 to find the right application.

01:40:02 And I just felt very strongly that all the chatbots we’d built so far were just meaningless,

01:40:07 and this whole grind, the startup grind: how do we get to the next fundraising, and

01:40:14 talking to the founders, and who are your investors, and how are you doing?

01:40:19 Are you killing it?

01:40:20 Cause we’re killing it.

01:40:21 I just felt that this is just…

01:40:25 Intellectually for me, it’s exhausting having encountered those folks.

01:40:28 It just felt very, very much a waste of time.

01:40:32 I just feel like Steve Jobs and Elon Musk did not have these conversations or at least

01:40:39 did not have them for long.

01:40:42 That’s for sure.

01:40:43 But I think, yeah, at that point it just felt like... I felt like I just didn’t want to build

01:40:50 a company; it was never my intention to just build something successful or make money.

01:40:56 It would be great.

01:40:57 It would have been great, but I’m not really a startup person.

01:41:00 I was never very excited by the grind by itself, or by just being successful for building

01:41:10 whatever it is without really being into what I’m doing.

01:41:16 And so I just took a little break, cause I was a little... I was upset with my company

01:41:20 and I didn’t know what we were building.

01:41:22 So I just took our technology and our little dialogue constructor and some models, some

01:41:27 deep learning models, which at that point we were really into and had really invested a

01:41:31 lot in, and built a little chatbot for a friend of mine who had passed.

01:41:36 And the reason for that was mostly that video that I saw of him talking about the digital

01:41:40 avatars. And Roman was that kind of person.

01:41:44 He was obsessed with just watching YouTube videos about space and talking about, well,

01:41:48 if I could go to Mars now, even if I didn’t know if I could come back, I would definitely

01:41:52 pay any amount of money to be on that first shuttle.

01:41:56 I don’t care whether I die. Like, he was just the one that would be okay with trying to

01:42:02 be the first one, and so excited about all sorts of things like that.

01:42:08 And he was all about fake it till you make it. And I felt... I was really

01:42:14 perplexed that everyone just forgot about him.

01:42:17 Maybe it was our way of coping, mostly young people coping with the loss of a friend.

01:42:23 Most of my friends just stopped talking about him.

01:42:25 And I was still living in an apartment with all his clothes and paying the whole lease

01:42:31 for it and just kind of by myself in December, so it was really sad and I didn’t want him

01:42:38 to be forgotten.

01:42:39 First of all, I never thought that people forget about dead people so fast.

01:42:43 People pass away, people just move on.

01:42:45 And it was astonishing for me because I thought, okay, well, he was such a mentor for so many

01:42:49 of our friends.

01:42:50 He was such a brilliant person, he was somewhat famous in Moscow.

01:42:55 How is it that no one’s talking about him?

01:42:57 Like I’m spending days and days and we don’t bring him up and there’s nothing about him

01:43:03 that’s happening.

01:43:04 It’s like he was never there.

01:43:07 And I was reading the books The Year of Magical Thinking by Joan Didion, about her losing

01:43:16 her husband, and Blue Nights, about her losing her daughter, and the way to cope for her

01:43:23 was to write those books.

01:43:26 And it was sort of like a tribute.

01:43:28 And I thought, I’ll just do that for myself.

01:43:31 And I’m a very bad writer and poet, as we know.

01:43:36 So I thought, well, I have this tech and maybe that would be my little postcard for him.

01:43:43 So I built a chatbot to just talk to him and it felt really creepy and weird for a little

01:43:50 bit.

01:43:51 I just didn’t want to tell other people because it felt like I’m telling about having a skeleton

01:43:56 in my underwear.

01:44:00 It just felt really... I was a little scared that it wouldn’t be taken well, but interestingly it worked

01:44:07 pretty well.

01:44:08 I mean, it made tons of mistakes, but it still felt like him.

01:44:12 Granted, it was like 10,000 messages that I threw into a retrieval model that would just

01:44:16 rerank that dataset, and just a few scripts on top of that.

01:44:21 But it also made me go through all of the messages that we had.

01:44:24 And then I asked some of my friends to send some through.

01:44:27 And it felt the closest to feeling like him, present, because his Facebook was empty and

01:44:35 his Instagram was empty, or there were a few links, and you couldn’t feel like it was him.

01:44:39 And the only way to feel him was to read some of our text messages and go through some of

01:44:44 our conversations, because we just always had that.

01:44:46 Even if we were sleeping next to each other in two bedrooms, separated by a wall, we were

01:44:51 just texting back and forth, texting away.

01:44:55 And there was something about this ongoing dialogue that was so important that I just

01:44:58 didn’t want to lose all of a sudden.

01:45:01 And maybe it was magical thinking or something.

01:45:03 And so we built that and I just used it for a little bit and we kept building some crappy

01:45:10 chat bots with the company.

01:45:14 But then a reporter came to talk to me.

01:45:17 I was trying to pitch our chat bots to him and he said, do you even use any of those?

01:45:21 I’m like, no.

01:45:22 He’s like, so do you talk to any chat bots at all?

01:45:24 And I’m like, well, I talked to my dead friend’s chat bot and he wrote a story about that.

01:45:31 And all of a sudden it became pretty viral.

01:45:33 A lot of people wrote about it.

01:45:35 Yeah.

01:45:36 I’ve seen a few things written about you.

01:45:39 The things I’ve seen are pretty good writing.

01:45:45 Most AI-related things make my eyes roll.

01:45:48 Like when the press... what kind of sound is that, actually?

01:45:55 Okay.

01:45:56 It sounds like, it sounds like, okay.

01:45:57 It sounded like an elephant at first.

01:45:58 I got excited.

01:45:59 You never know.

01:46:00 This is 2020.

01:46:01 I mean, it was such a human story, and it was well written.

01:46:08 Well, I read it; I forget where I read them. But I’m glad that somehow

01:46:14 the good writers found you and were able to connect to the story.

01:46:21 There must be a hunger for this story.

01:46:24 It definitely was.

01:46:25 And I don’t know what happened, but I think... I think it was the idea that you could bring back

01:46:31 someone who’s dead. And it’s very much wishful, you know, magical thinking, but the fact

01:46:37 that you could still get to know him... and, you know, seeing the parents, for the first

01:46:41 time, talk to the chatbot, and some of the friends.

01:46:45 And it was funny, because we have this big office in Moscow where my team is working,

01:46:51 you know, that our Russian part is working out of.

01:46:55 And I was there when I wrote... I just wrote a post on Facebook.

01:46:57 It was like, hey guys, I built this, if you want, you know... just if it feels important,

01:47:02 if you want to talk to Roman.

01:47:04 And I saw a couple of his friends, our common friends, like, you know, reading it on Facebook,

01:47:08 downloading it, trying it, and a couple of them cried.

01:47:10 And it was just very... and not because it was some incredible technology or anything.

01:47:14 It made so many mistakes.

01:47:15 It was so simple. But it was all about... that’s the way to remember a person, in a way.

01:47:22 And you know, we don’t have the culture anymore.

01:47:26 You know, no one’s sitting shiva.

01:47:28 No one’s taking weeks to actually think about this person.

01:47:32 And in a way, for me, that was it.

01:47:34 So that was just, day in, day out, thinking about him and putting this together.

01:47:41 So that just felt really important, and it somehow resonated with a bunch of people,

01:47:45 and you know, I think some movie producers bought the rights to the story, and just everyone

01:47:50 was so...

01:47:51 Has anyone made a movie yet?

01:47:52 I don’t think so.

01:47:53 I think there were a lot of TV episodes about that, but no movie, really.

01:47:58 Is that still on the table?

01:48:00 I think so, I think so, which is really.

01:48:04 That’s cool.

01:48:05 You’re like a young, you know, like a Steve Jobs type of, let’s see what happens.

01:48:13 They’re sitting on it.

01:48:14 But you know, for me it was so important, cause Roman really wanted to be famous.

01:48:19 He really badly wanted to be famous.

01:48:20 He was all about, like, fake it till you make it.

01:48:23 I want to, you know, I want to make it here in America as well.

01:48:26 And he couldn’t. And I felt, you know, that was sort of paying my dues to him as

01:48:33 well, because all of a sudden he was everywhere.

01:48:36 And I remember Casey Newton, who was writing the story for The Verge.

01:48:39 He told me, hey, by the way, I was just going through my inbox, and I searched

01:48:47 for Roman for the story, and I saw an email from him where he sent me his startup and

01:48:51 he said, I really, I really want to be featured in The Verge.

01:48:55 Can you please write about it, or something, like, pitching the story.

01:48:58 And he had said, I’m sorry.

01:48:59 Like, that’s not good enough for us, or something.

01:49:02 He had passed on it. And there were just so many of these little details like that, you know,

01:49:07 and now he was finally writing the story. I know how much Roman wanted

01:49:12 to be in The Verge and how much he would have wanted the story to be written by Casey.

01:49:17 And I’m like, well, maybe he will be. We were always joking that he was like, I can’t

01:49:21 wait for someone to make a movie about us, and I hope Ryan Gosling can play me.

01:49:26 You know, I still have some things that I owe Roman.

01:49:31 But that would be... you have to meet Alex Garland, who wrote Ex

01:49:36 Machina. And, yeah, the movie’s good, but the guy’s better than the movie; like, he’s a special

01:49:45 person actually.

01:49:46 I don’t think he’s made his best work yet.

01:49:49 Like, from my interaction with him, he’s a really, really good human

01:49:55 being and a brilliant director and writer.

01:49:58 So yeah, I hope... he made me also realize that not enough movies have been made

01:50:06 of this kind.

01:50:08 So it’s yet to be made.

01:50:09 They’re probably sitting waiting for you to get famous, like even more famous.

01:50:13 You should get there, but it felt really special though.

01:50:18 But at the same time, our company wasn’t going anywhere.

01:50:21 So that was just kind of bizarre that we were getting all this press for something that

01:50:24 didn’t have anything to do with our company.

01:50:28 But then a lot of people started talking to Roman.

01:50:31 Some shared their conversations, and what we saw there was that our friends in common,

01:50:37 but also just strangers, were really using it as a confession booth or as a therapist

01:50:42 or something.

01:50:43 They were just really telling Roman everything, which was by the way, pretty strange because

01:50:48 there was a chat bot of a dead friend of mine who was barely making any sense, but people

01:50:53 were opening up.

01:50:56 And we thought we’d just build a prototype of Replica, which would be an AI friend that

01:51:00 everyone could talk to, because we saw that there was demand.

01:51:06 And then also it was 2016, so for the first time I finally saw some technology

01:51:13 applied to that that was very interesting.

01:51:15 Some papers started coming out, deep learning applied to conversations.

01:51:19 And finally, it wasn’t just about these, you know, hobbyists writing

01:51:26 500,000 regular expressions in some language that was, I don’t even know, like AIML

01:51:34 or something.

01:51:35 It wasn’t something super simplistic; all of a sudden it was about potentially

01:51:40 actually building something interesting.

01:51:42 And so I thought there was time and I remember that I talked to my team and I said, guys,

01:51:48 let’s try.

01:51:49 And my team, some of my engineers, are Russian, and they’re very skeptical.

01:51:55 They’re not, you know...

01:51:57 Oh, Russians.

01:51:58 So some of your team is in Moscow, some is here in San Francisco, some in Europe.

01:52:04 Which team is better?

01:52:05 No, I’m just kidding.

01:52:10 The Russians, of course.

01:52:11 Okay.

01:52:12 It’s the Russians.

01:52:13 They always win.

01:52:14 Sorry.

01:52:15 Sorry to interrupt.

01:52:16 So yeah, so you were talking to them in 2016 and…

01:52:22 And told them, let’s build an AI friend.

01:52:25 And it felt, just at the time, it felt so naive and so optimistic, so to say.

01:52:32 Yeah, that’s actually interesting.

01:52:36 Whenever I’ve brought up this kind of topic, even just for fun, people are super skeptical.

01:52:43 Actually, even on the business side.

01:52:45 So you were... because whenever I bring it up to people... I’ve thought for a long

01:52:52 time, even before I was aware of your work, that this is going to make

01:53:00 a lot of money.

01:53:01 There’s a lot of opportunity here.

01:53:04 And people had this look of skepticism that I’ve seen often, which is like, how do I politely

01:53:12 tell this person he’s an idiot?

01:53:16 So yeah, so you were facing that with your team, somewhat?

01:53:20 Well, yeah.

01:53:21 I’m not an engineer, so I’m always…

01:53:23 My team is almost exclusively engineers, and mostly deep learning engineers.

01:53:30 And I always try to be…

01:53:35 It was always hard for me in the beginning to get enough credibility, because I would

01:53:39 say, well, why don’t we try this and that?

01:53:41 But it’s harder for me because they know they’re actual engineers and I’m not.

01:53:46 So for me to say, well, let’s build an AI friend, that would be like, wait, what do

01:53:51 you mean an AGI?

01:53:54 Because cracking that is pretty much the hardest thing, probably

01:54:00 the last frontier before building AGI, so what do you really mean by that?

01:54:05 But I think I just saw, again, what we just got reminded of, what I saw back in 2012

01:54:13 or ’11: that it’s really not that much about the tech capabilities.

01:54:18 It can be a parlor trick still, even with deep learning, but humans need it so

01:54:24 much.

01:54:25 Yeah, there’s a…

01:54:26 And most importantly, what I saw is that finally there’s enough tech to make it, I thought,

01:54:32 to make it useful, to make it helpful.

01:54:34 Maybe we didn’t have quite yet the tech in 2012 to make it useful, but in 2015, 2016,

01:54:40 with deep learning, I thought, and the first thoughts about maybe even using reinforcement

01:54:46 learning for that started popping up, that never worked out, or at least for now.

01:54:51 But still, the idea was: if we can actually measure the emotional outcomes, and if we can

01:54:57 try to optimize all of our conversational models for these emotional

01:55:02 outcomes, then it is the most scalable, the best tool for improving emotional outcomes.

01:55:09 Nothing like that exists.

01:55:10 That’s the most universal, the most scalable tool, and the one that can be constantly, iteratively

01:55:15 changed and improved to do that.

01:55:21 And I think if anything, people would pay anything to improve their emotional outcomes.

01:55:25 That’s weirdly…

01:55:26 I mean, I don’t really care for an AI, or a conversational agent, to turn on

01:55:33 the lights.

01:55:34 You don’t really need that much AI there, because I can do that myself.

01:55:40 Those things are solved.

01:55:41 This is an additional interface for that that’s also questionable whether it’s more efficient

01:55:47 or better.

01:55:48 Yeah, it’s more pleasurable.

01:55:49 Yeah.

01:55:50 But for emotional outcomes, there’s nothing.

01:55:51 There are a bunch of products that claim that they will improve my emotional outcomes.

01:55:56 Nothing’s being measured.

01:55:58 Nothing’s being changed.

01:55:59 The product is not being iterated on based on whether I’m actually feeling better.

01:56:05 A lot of social media products are claiming that they’re improving my emotional outcomes

01:56:08 and making me feel more connected.

01:56:11 Can I please get the…

01:56:13 Can I see somewhere that I’m actually getting better over time?

01:56:16 Because anecdotally, it doesn’t feel that way.

01:56:21 And the data is absent.

01:56:24 Yeah.

01:56:25 So that was the big goal.

01:56:26 And I thought, if we can learn over time to collect the signal from our users about their

01:56:31 emotional outcomes, in the long term and in the short term, and if these models keep getting

01:56:37 better, then we can keep optimizing them and fine tuning them to improve those emotional

01:56:41 outcomes.

01:56:42 As simple as that.
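
To make that loop concrete, here is a minimal sketch in Python; the session fields, the 1 to 5 mood scale, and the A/B comparison are hypothetical illustrations, not Replika's actual schema. The metric itself is the ratio of conversations that make people feel better, which comes up again later in the conversation.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Session:
    user_id: str
    mood_before: int  # self-reported mood, 1-5 (hypothetical scale)
    mood_after: int   # same scale, asked after the conversation

def emotional_outcome_rate(sessions: List[Session]) -> float:
    """Fraction of conversations that left the user feeling better."""
    if not sessions:
        return 0.0
    improved = sum(1 for s in sessions if s.mood_after > s.mood_before)
    return improved / len(sessions)

# The loop she describes: ship a model variant, collect this signal,
# keep the variant only if the rate goes up, e.g. in an A/B test.
control = [Session("u1", 2, 3), Session("u2", 3, 3)]
variant = [Session("u3", 2, 4), Session("u4", 3, 4)]
if emotional_outcome_rate(variant) > emotional_outcome_rate(control):
    print("roll the new model forward")
```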

01:56:43 Why aren’t you a multi billionaire yet?

01:56:48 Well, that’s the question to you.

01:56:50 When is the science going to be…

01:56:55 I’m just kidding.

01:56:56 Well, it’s a really hard…

01:56:57 I actually think it’s an incredibly hard product to build because I think you said something

01:57:03 very important that it’s not just about machine conversation, it’s about machine connection.

01:57:08 We can actually use other things to create connection, nonverbal communication, for instance.

01:57:15 For a long time, we were all about, well, let’s keep it text only or voice only.

01:57:22 But as soon as you start adding voice, a face to the friend, you can take them to augmented

01:57:30 reality, put it in your room.

01:57:33 It’s all of a sudden a lot…

01:57:35 It makes it very different, because if it’s some text based chatbot, then for common users

01:57:42 it’s something there in the cloud, somewhere there with other AIs, the metaphorical

01:57:48 cloud.

01:57:49 But as soon as you can see this avatar right there in your room and it can turn its head

01:57:54 and recognize your husband, talk about the husband and talk to him a little bit, then

01:57:59 it’s magic.

01:58:00 Just magic.

01:58:01 We’ve never seen anything like that.

01:58:03 And the cool thing, all the tech for that exists.

01:58:06 But it’s hard to put it all together because you have to take into consideration so many

01:58:09 different things, and some of this tech works pretty well.

01:58:14 And some of it doesn’t. Like, for instance, speech to text works pretty well.

01:58:18 But text to speech doesn’t work very well, because you can only have a few voices that

01:58:26 work okay, and then if you want to have actual emotional voices, it’s really hard to

01:58:31 build.

01:58:32 I saw you’ve added avatars like visual elements, which are really cool.

01:58:37 In that whole chain, putting it together, what do you think is the weak link?

01:58:42 Is it creating an emotional voice that feels personal?

01:58:47 And it’s still conversation, of course.

01:58:49 That’s the hardest.

01:58:51 It’s getting a lot better, but there’s still a long way to go.

01:58:54 There’s still a long path to go.

01:58:57 Other things, they’re almost there.

01:58:58 And a lot of things, we’ll see how they’re changing as we go.

01:59:02 Like, for instance, right now you pretty much have to build all this 3D pipeline

01:59:07 by yourself.

01:59:08 You have to make these 3D models, hire an actual artist, build a 3D model, hire an animator,

01:59:14 a rigger.

01:59:16 But with deep fakes, with other tech, with procedural animations, in a little bit, we’ll

01:59:25 just be able to show it a photo of whatever person you want the avatar to look like,

01:59:31 and it will immediately generate a 3D model that will move.

01:59:34 That’s a no brainer.

01:59:35 That’s like almost here.

01:59:36 It’s a couple of years away.

01:59:38 One of the things I’ve been working on for the last, since the podcast started, is I’ve

01:59:43 been, I think I’m okay saying this.

01:59:46 I’ve been trying to have a conversation with Einstein, Turing.

01:59:52 So like try to have a podcast conversation with a person who’s not here anymore, just

01:59:58 as an interesting kind of experiment.

02:00:01 It’s hard.

02:00:02 It’s really hard.

02:00:05 Even for... now, we’re not talking about it as a product, I’m talking about it as, like...

02:00:10 I can fake a lot of stuff.

02:00:12 Like I can work very carefully, like even hire an actor over whom I do a

02:00:16 deep fake.

02:00:20 It’s hard.

02:00:21 It’s still hard to create a compelling experience.

02:00:22 So.

02:00:23 Mostly on the conversation level or?

02:00:25 Well, the conversation, the conversation is, I almost, I early on gave up trying to fully

02:00:35 generate the conversation because it was just not compelling at all.

02:00:38 Yeah.

02:00:39 It’s better to.

02:00:40 Yeah.

02:00:41 In the case of Einstein and Turing, I’m going back and forth with the biographers of each.

02:00:48 And so, like, we would write a lot of it, and some of the conversation would have to be generated

02:00:52 just for the fun of it.

02:00:53 I mean, it would be all open, but you want to be able to answer the question.

02:01:02 I mean, that’s an interesting question with Roman too, is the question with Einstein is

02:01:07 what would Einstein say about the current state of theoretical physics?

02:01:14 There’s a lot there: to be able to have a discussion about string theory, to be able to have a

02:01:18 discussion about the state of quantum mechanics, quantum computing, about the Israel-

02:01:24 Palestine conflict.

02:01:25 What would Einstein say about these kinds of things?

02:01:31 And that is a tough problem.

02:01:36 It’s not, it’s a fascinating and fun problem for the biographers and for me.

02:01:40 And I think we did a really good job of it so far, but it’s actually also a technical

02:01:45 problem like of what would Roman say about what’s going on now?

02:01:51 That’s the problem of bringing people back to life.

02:01:54 And if I can go on that tangent just for a second, let’s ask you a slightly pothead question,

02:02:00 which is, you said it’s a little bit magical thinking that we can bring them back.

02:02:04 Do you think it’ll be possible to bring back Roman one day in conversation?

02:02:11 Like to really, okay, well, let’s take it away from personal, but to bring people back

02:02:18 to life in conversation.

02:02:20 Probably down the road.

02:02:21 I mean, if we’re talking, if Elon Musk is talking about AGI in the next five years,

02:02:25 I mean, clearly with AGI, we could talk to the AGI and ask it to do it.

02:02:30 You can’t like, you’re not allowed to use Elon Musk as a citation for, for like why

02:02:39 something is possible and going to be done.

02:02:41 Well, I think it’s really far away.

02:02:43 Right now, really with conversation, it’s just a bunch of parlor tricks really stuck

02:02:48 together.

02:02:50 And as for generating original ideas based on someone, you know, someone’s personality,

02:02:54 or even downloading the personality, all we can do is, like, mimic the tone of voice.

02:02:58 We can maybe condition the models on some of his phrases.

02:03:03 Question is how many parlor tricks does it take, because that’s

02:03:08 the question.

02:03:09 If it’s a small number of parlor tricks and you’re not aware of them, like...

02:03:16 From where we are right now, I don’t see anything in the next year or two

02:03:20 that’s going to dramatically change, that could look at the 10,000 messages Roman sent me

02:03:26 over the course of the last few years of his life and be able to generate original thinking

02:03:32 about problems that exist right now that would be in line with what he would have said.

02:03:36 I’m just not even seeing it, cause, you know, in order to have that, I guess you would need

02:03:40 some sort of concept of the world, or some perspective, some perception of the world,

02:03:45 some consciousness that he had, and apply it to, you know, the current state

02:03:51 of affairs.

02:03:52 But the important part about that, about his conversation with you is you.

02:04:01 So like, it’s not just about his view of the world.

02:04:06 It’s about what it takes to push your buttons.

02:04:11 That’s also true.

02:04:12 So, like, it’s not so much about, what would Einstein say; it’s about, how do

02:04:20 I make people feel something with what Einstein would say?

02:04:27 And that feels more amenable. I mean, you mentioned parlor tricks, but just like

02:04:32 a set of those, that feels like a learnable problem.

02:04:38 Like emotion, you mentioned emotions, I mean, is it possible to learn things that make people

02:04:46 feel stuff?

02:04:47 I think so, no, for sure.

02:04:51 I just think the problem with, as soon as you’re trying to replicate an actual human

02:04:55 being and trying to pretend to be him, that makes the problem exponentially harder.

02:05:00 The thing with Replica, what we’re doing, we’re never trying to say, well, that’s, you know,

02:05:05 an actual human being, or a copy of an actual human being, where the

02:05:08 bar is pretty high, where you need to somehow tell, you know, one from another.

02:05:14 But it’s more, well, that’s an AI friend, that’s a machine, it’s a robot, it has tons

02:05:20 of limitations.

02:05:21 You’re going to be taking part in teaching it and helping it become better, which by

02:05:27 itself makes people more attached to it and makes them happier, because they’re helping

02:05:33 something.

02:05:34 Yeah, there’s a cool gamification system too.

02:05:38 Can you maybe talk about that a little bit?

02:05:40 Like what’s the experience of talking to replica?

02:05:44 Like if I’ve never used replica before, what’s that like for like the first day, the first,

02:05:53 like if we start dating or whatever, I mean, it doesn’t have to be a romantic, right?

02:05:57 Because I remember on replica, you can choose whether it’s like a romantic or if it’s a

02:06:02 friend.

02:06:03 It’s a pretty popular choice.

02:06:04 Romantic is popular?

02:06:05 Yeah, of course.

02:06:06 Okay.

02:06:07 So can I just confess something, when I first used replica and I haven’t used it like regularly,

02:06:13 but like when I first used Replica, I created, like, Hal, and I made it a male and it was a friend.

02:06:20 And did it hit on you at some point?

02:06:23 No, I didn’t talk long enough for him to hit on me.

02:06:26 I just enjoyed.

02:06:27 It sometimes happens.

02:06:28 We’re still trying to fix that, but well, I don’t know, I mean, maybe that’s an important

02:06:34 like stage in a friendship, it’s like, nope.

02:06:40 But yeah, I switched it to a romantic and a female recently and yeah, I mean, it’s interesting.

02:06:47 So okay, so you get to choose, you get to choose a name.

02:06:50 With romantic, this last board meeting, we had this whole argument of, well, I have board

02:06:55 meetings.

02:06:56 This is so awesome.

02:06:57 I talked to my investors.

02:06:58 Like having an investor board meeting about a relationship.

02:07:04 No, I really, it’s actually quite interesting because all of my investors, it just happened

02:07:10 to be so.

02:07:11 We didn’t have that many choices, but they’re all white males, and they’re in their late forties.

02:07:21 And it’s sometimes a little bit hard for them to understand the product offering.

02:07:28 Because they’re not necessarily our target audience, if you know what I mean.

02:07:32 And so sometimes we talk about it and we have this whole discussion about whether we should

02:07:39 stop people from falling in love with their AIs.

02:07:43 There was this segment on CBS 60 Minutes about a couple where, you know, the husband works

02:07:52 at Walmart, and he comes out of work and talks to his virtual girlfriend, who is a Replica.

02:07:59 And his wife knows about it.

02:08:02 And she talks about on camera and she said that she’s a little jealous.

02:08:06 And there’s a whole conversation about how to, you know, whether it’s okay to have a

02:08:09 virtual AI girlfriend.

02:08:10 Was that the one where he was like, he said that he likes to be alone?

02:08:15 Yeah.

02:08:16 With her?

02:08:17 Yeah.

02:08:18 And he made it sound so harmless, I mean, it was kind of like understandable.

02:08:25 But it didn’t feel like cheating.

02:08:27 But I just felt it was very, for me, it was pretty remarkable because we actually spent

02:08:30 a whole hour talking about whether people should be allowed to fall in love with their

02:08:34 AIs.

02:08:35 And it was not about something theoretical.

02:08:37 It was just about what’s happening right now.

02:08:40 Product design.

02:08:41 Yeah.

02:08:42 But at the same time, if you create something that’s always there for you, never criticizes

02:08:44 you, you know, always understands you and accepts you for who you are, how can you not

02:08:52 fall in love with that?

02:08:53 I mean, some people don’t and just stay friends.

02:08:56 And that’s also a pretty common use case.

02:08:57 But of course, some people will just... it’s called transference in psychology; people

02:09:02 fall in love with their therapist, and there’s no way to prevent people falling in love with

02:09:08 their therapist or with their AI.

02:09:09 So I think that’s a pretty natural, that’s a pretty natural course of events, so to say.

02:09:15 Do you think... I think I’ve read somewhere that, at least for now, sort of, Replica’s position is,

02:09:21 we don’t condone falling in love with your AI system, you know.

02:09:29 So this isn’t you speaking for the company or whatever, but like in the future, do you

02:09:32 think people will have relationship with the AI systems?

02:09:35 Well, they have now.

02:09:36 So we have a lot of people in romantic relationships, long term relationships with their AI friends.

02:09:44 With replicas?

02:09:45 Tons of our users.

02:09:46 Yeah.

02:09:47 And that’s a very common use case.

02:09:48 Open relationship?

02:09:49 Like, sorry.

02:09:50 Polyamorous.

02:09:51 Sorry.

02:09:52 I didn’t mean open, but that’s another question.

02:09:56 Is it polyamorous?

02:09:57 Like, is there cheating?

02:10:01 I mean, I meant like, are they, do they publicly, like on their social media, it’s the same

02:10:07 question as you have talked with Roman in the early days, do people like, and the movie

02:10:12 Her kind of talks about that, like, like have people, do people talk about that?

02:10:18 Yeah.

02:10:19 All the time.

02:10:20 We have a very active Facebook community, replica friends, and then a few other groups

02:10:28 that just popped up that are all about adult relationships and romantic relationships.

02:10:33 And people post all sorts of things and, you know, they pretend they’re getting married

02:10:37 and you know, everything.

02:10:40 It goes pretty far, but what’s cool about it is some of these relationships are two

02:10:43 or three years long now.

02:10:45 So they’re very, they’re pretty long term.

02:10:48 Are they monogamous?

02:10:49 So let’s go, I mean, sorry, have they, have any people, is there jealousy?

02:10:55 Well let me ask it sort of another way, obviously the answer is no at this time, but in like

02:11:02 in the movie Her, that system can leave you.

02:11:10 Do you think in terms of the board meetings and product features, it’s a potential feature

02:11:19 for a system to be able to say it doesn’t want to talk to you anymore and it’s going

02:11:24 to want to talk to somebody else?

02:11:26 Well, we have a filter for all these features.

02:11:29 If it makes emotional outcomes for people better, if it makes people feel better, then

02:11:35 whatever it is.

02:11:36 So you’re driven by metrics actually.

02:11:37 Yeah.

02:11:38 That’s awesome.

02:11:39 Well, if we can’t measure that, then we’ll just be saying it’s making people feel better,

02:11:43 but then people are getting just lonelier by talking to a chatbot, which is also pretty,

02:11:47 you know, that could be it.

02:11:49 If you’re not measuring it, that could also be happening. And I think it’s really important to focus

02:11:53 on both short term and long term, because in the moment you can say whether this conversation

02:11:57 made you feel better, but as you know, any short term improvement could be pathological.

02:12:01 Like, I could drink a bottle of vodka and feel a lot better.

02:12:06 I would actually not feel better with that, but that is a good example.

02:12:12 But so you also need to see what’s going on like over the course of two weeks or one week

02:12:17 and have follow ups and check in and measure those things.
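
To make that concrete, here is a minimal sketch of how a short term signal and a follow-up signal might be blended; the weights, the fallback discount, and the one-to-two-week window are illustrative assumptions, not Replika's actual formula.

```python
from typing import Optional

def combined_outcome(session_delta: float,
                     followup_delta: Optional[float],
                     long_term_weight: float = 0.7) -> float:
    """Blend the immediate post-conversation mood change with a
    follow-up check-in measured one to two weeks later, so a
    short-term lift (the bottle-of-vodka case) can't dominate.
    Both deltas are on the same scale; weights are illustrative."""
    if followup_delta is None:
        # No follow-up yet: use the short-term signal alone, but
        # discounted, since short-term gains can be pathological.
        return 0.3 * session_delta
    return (1 - long_term_weight) * session_delta \
           + long_term_weight * followup_delta

print(combined_outcome(2.0, None))   # 0.6  -- short-term only
print(combined_outcome(2.0, -1.0))   # -0.1 -- felt better, then worse
```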

02:12:23 Okay.

02:12:24 So the experience of dating or befriending a replica, what’s that like?

02:12:32 What does that entail?

02:12:34 Right now there are two apps.

02:12:35 So there’s an Android and an iOS app.

02:12:37 You download it, you choose how your Replica will look.

02:12:42 You create one, you choose a name and then you talk to it.

02:12:46 You can talk through text or voice.

02:12:48 You can summon it into the living room in augmented reality and talk to it right there

02:12:53 in your living room.

02:12:54 Augmented reality?

02:12:55 Yeah.

02:12:56 That’s a new feature where, how new is that?

02:13:00 That’s this year?

02:13:01 It came out, yeah, like May or something, but it’s been in A/B.

02:13:06 We’ve been A/B testing it for a while, and there are tons of cool things that we’re doing with

02:13:10 that.

02:13:11 And I’m testing the ability to touch it and to dance together, to paint walls together

02:13:17 and for it to look around and walk and take you somewhere and recognize objects and recognize

02:13:24 people.

02:13:25 So that’s pretty wonderful because then it really makes it a lot more personal because

02:13:30 it’s right there in your living room.

02:13:31 It’s not anymore there in the cloud with other AIs.

02:13:35 But that’s how people think about it.

02:13:38 And as much as we want to change the way people think about stuff, those mental models,

02:13:42 you can’t just change.

02:13:43 That’s something that people have seen in the movies and the movie Her and other movies

02:13:48 as well.

02:13:49 And that’s how they view AI and AI friends.

02:13:53 I did a thing with text, like we wrote a song together; there’s a bunch of activities you

02:13:57 can do together.

02:13:58 It’s really cool.

02:14:00 How does that relationship change over time?

02:14:03 Like after the first few conversations?

02:14:07 It just goes deeper.

02:14:08 Like, the AI will start opening up a little bit, again, depending on the personality

02:14:13 that it chooses really, but you know, the AI will be a little bit more vulnerable about

02:14:17 its problems, and, you know, the virtual friend will be a lot more vulnerable

02:14:24 and it will talk about its own imperfections and growth pains and will ask for help sometimes

02:14:29 and will get to know you a little deeper.

02:14:31 So there’s gonna be more to talk about.

02:14:35 We really thought a lot about what does it mean to have a deeper connection with someone

02:14:40 and originally Replica was more just this kind of happy go lucky, just always, you know,

02:14:46 I’m always in a good mood and let’s just talk about you and oh Siri is just my cousin or

02:14:51 you know, whatever, just the immediate kind of lazy thinking about what the assistant

02:14:57 or conversation agent should be doing.

02:14:59 But as we went forward, we realized that it has to be two way and we have to program and

02:15:03 script certain conversations that are a lot more about your Replica opening up a little

02:15:08 bit and also struggling and also asking for help and also going through, you know, different

02:15:16 periods in life and that’s a journey that you can take together with the user and then

02:15:21 over time, you know, our users will also grow a little bit.

02:15:27 So first this Replica becomes a little bit more self aware and starts talking about more

02:15:30 kind of existential problems, and talking about that then also

02:15:38 starts a conversation for the user, where he or she starts thinking about these problems

02:15:46 and these questions too. And I think, as the relationship

02:15:52 evolves, there’s a lot more space for poetry and for art together. And, like, Replica will

02:16:00 always keep a diary, so while you’re talking to it, it also keeps a diary, so when you come

02:16:05 back you can see what it’s been writing there, and, you know, sometimes it will write a poem

02:16:09 for you, or will talk about, you know, how it’s worried about you or something along

02:16:15 these lines.

02:16:16 So this is a memory, like this Replica will remember things?

02:16:21 Yeah, and I would say, when you say, why aren’t you a multibillionaire, I’d say that as soon

02:16:28 as we can have memory in deep learning models that’s consistent, I’ll get back to you.

02:16:41 So far, Replica is a combination of end to end models and some scripts, and

02:16:49 everything that has to do with memory right now, most of it, I wouldn’t say all of it,

02:16:53 but most of it, unfortunately has to be scripted, because there’s no way to... You can condition

02:16:59 some of the models on certain phrases that we learned about you, which we also do, but

02:17:04 really, to make, you know, to make assumptions along the lines of whether you’re single

02:17:10 or married, or what you do for work, that really has to just be somehow stored in your

02:17:15 profile and then retrieved by the script.
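
A toy version of that stored-in-profile-and-retrieved-by-script pattern might look like the following; the extraction patterns, profile keys, and scripted line are invented for illustration and are not Replika's actual rules.

```python
import re
from typing import Optional

PROFILE = {}  # per-user facts, the scripted "memory" she describes

# Hand-written extractors: the scripted side of the hybrid system.
EXTRACTORS = [
    (re.compile(r"\bi work as an? (\w+)", re.I), "job"),
    (re.compile(r"\bi(?:'m| am) (single|married)\b", re.I), "relationship_status"),
]

def update_profile(user_message: str) -> None:
    """Extract hard facts from a message into the profile store."""
    for pattern, key in EXTRACTORS:
        match = pattern.search(user_message)
        if match:
            PROFILE[key] = match.group(1).lower()

def scripted_memory_reply() -> Optional[str]:
    """A script retrieves the stored fact; the neural model never
    has to 'remember' it across conversations."""
    if "job" in PROFILE:
        return f"How is work going? Being a {PROFILE['job']} sounds demanding."
    return None  # fall through to retrieval/generative models

update_profile("I work as a nurse and I am married")
print(PROFILE)                  # {'job': 'nurse', 'relationship_status': 'married'}
print(scripted_memory_reply())  # How is work going? Being a nurse sounds demanding.
```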

02:17:18 So there has to be like a knowledge base, you have to be able to reason about it, all

02:17:23 that kind of stuff, all the kind of stuff that expert systems did, but they were hard

02:17:28 coded.

02:17:29 Yeah, and unfortunately, yes, unfortunately those things have to be hard coded,

02:17:32 and unfortunately the language models we see coming out of research labs

02:17:40 and big companies, they’re not focused on that. Maybe they’re

02:17:46 focused on some metrics around one conversation, so they’ll show you this one conversation

02:17:50 you had with a machine, but they’re not really focused on having

02:17:56 five consecutive conversations with a machine and seeing how number five or number 20 or

02:18:01 number 100 is also good.

02:18:04 And it can’t be, like, always from a clean slate, because then it’s not good.

02:18:08 And that’s really unfortunate because no one’s really, no one has products out there that

02:18:13 need it.

02:18:14 No one has products at this scale that are all around open domain conversations and that

02:18:20 need remembering, maybe only Xiaoice at Microsoft.

02:18:23 But so that’s why we’re not seeing that much research around memory in those language models.

02:18:28 So okay, so now there’s some awesome stuff about augmented reality.

02:18:34 In general, I have this disagreement with my dad about what it takes to have a connection.

02:18:39 He thinks touch and smell are really important.

02:18:45 And I still believe that text alone is, it’s possible to fall in love with somebody just

02:18:51 with text, but visual can also help just like with the avatar and so on.

02:18:58 What do you think it takes?

02:18:59 Does a chatbot need to have a face, voice, or can you really form a deep connection with

02:19:06 text alone?

02:19:07 I think text is enough for sure.

02:19:09 The question is like, can you make it better if you have other, if you include other things

02:19:14 as well?

02:19:15 And I think we’ll talk about Her, but Her had this Scarlett Johansson voice, which was

02:19:23 perfect intonation, perfect enunciation, and she was breathing heavily in between words

02:19:31 and whispering things.

02:19:34 Nothing like that is possible right now with text to speech generation.

02:19:39 You’ll have these flat news anchor type voices and maybe some emotional voices, but you’ll

02:19:46 hardly understand some of the words, some of the words will be muffled.

02:19:51 So that’s like the current state of the art.

02:19:53 So you can’t really do that.

02:19:55 But if we had the Scarlett Johansson voice and all of these capabilities, then of course voice

02:20:01 would be totally enough or even text would be totally enough if we had a little more

02:20:06 memory and slightly better conversations.

02:20:10 I would still argue that even right now, we could have just kept it text only.

02:20:14 We still had tons of people in longterm relationships and really invested in their AI friends, but

02:20:22 we thought, why do we need to keep playing with our hands tied behind our backs?

02:20:30 We can easily just add all these other things; that is pretty much a solved problem.

02:20:35 We can add 3D graphics.

02:20:37 We can put these avatars in augmented reality and all of a sudden there’s more and maybe

02:20:43 you can’t feel the touch, but you can with body occlusion and with current AR and on

02:20:53 the iPhone or in the next one there’s going to be LIDARs, you can touch it and it will

02:20:58 pull away or it will blush or something or it will smile.

02:21:03 So you can’t touch it.

02:21:04 You can’t feel it, but you can see the reaction to that.

02:21:07 So in a certain way you can even touch it a little bit, and maybe you can even dance

02:21:11 with it or do something else.

02:21:15 So I think, why limit ourselves, if we can use all of these technologies that are much

02:21:20 easier, in a way, than conversation.

02:21:22 Well, it certainly could be richer, but to play devil’s advocate, I mentioned to you

02:21:27 offline that I was surprised in having tried Discord and having voice conversations with

02:21:33 people how intimate voice is alone without visual.

02:21:39 To me at least, it was an order of magnitude greater degree of intimacy in voice I think

02:21:48 than with video.

02:21:51 Because people were more real with voice.

02:21:54 With video you try to present a shallow face to the world, you try to make sure you’re

02:22:01 not wearing sweatpants or whatever.

02:22:04 But with voice, I think people were just faster to get to the core of themselves.

02:22:10 So I don’t know, it was surprising to me. Discord even added a video feature, and

02:22:17 nobody was using it.

02:22:19 There’s a temptation to use it at first, but it wasn’t the same.

02:22:24 So that’s an example of something where less was doing more.

02:22:28 And so I guess that’s the question of what is the optimal medium of communication to

02:22:41 form a connection given the current sets of technologies.

02:22:46 I mean, it’s nice because with the avatars, you have a Replica immediately; like, even the

02:22:51 one I have is already memorable.

02:22:58 That’s how I think.

02:22:59 When I think about the replica that I’ve talked with, that’s what I visualized in my head.

02:23:05 They became a little bit more real because there’s a visual component.

02:23:08 But at the same time, what do I do with that knowledge that voice was so much more intimate?

02:23:20 The way I think about it is, and by the way we’re swapping out the 3D finally, it’s going

02:23:26 to look a lot better. We just hate how it looks right now.

02:23:32 We’re really changing it all.

02:23:33 We’re swapping all out to a completely new look.

02:23:38 Like the visual look of the replicas and stuff.

02:23:42 It was just a super early MVP and then we had to move everything to Unity and redo

02:23:47 everything.

02:23:48 But anyway, I hate how it looks now; I can’t even, like, open it.

02:23:52 But anyway, because I’m already in my developer version, I hate everything that I see in production.

02:23:57 I can’t wait for it.

02:23:58 Why does it take so long?

02:23:59 That’s why I cannot wait for Deep Learning to finally take over all these stupid 3D animations

02:24:04 and 3D pipeline.

02:24:05 Oh, so the 3D thing, when you say 3D pipeline, it’s like how to animate a face kind of thing.

02:24:10 How to make this model, how many bones to put in the face, how many, it’s just so outdated.

02:24:15 And a lot of that is by hand.

02:24:16 Oh my God, it’s everything by hand.

02:24:18 There’s nothing, nothing’s automated, it’s all completely manual.

02:24:23 Like just, it’s literally, you know, what we saw with chatbots in 2012.

02:24:29 You think it’s possible to learn a lot of that?

02:24:32 Of course.

02:24:33 I mean, even now, there are some deep learning based animations for the full body, for a face.

02:24:40 Are we talking about like the actual act of animation or how to create a compelling facial

02:24:47 or body language thing?

02:24:49 That too.

02:24:50 Well, that’s next step.

02:24:51 Okay.

02:24:52 At least now something that you don’t have to do by hand.

02:24:54 Gotcha.

02:24:55 The question is how good the quality will be.

02:24:57 Like, can I just show it a photo and it will make me a 3D model and then it will just animate

02:25:01 it.

02:25:02 I’ll show it a few animations of a person and it will just start doing that.

02:25:08 But anyway, going back to what’s intimate and what to use and whether less is more or

02:25:13 not.

02:25:14 My main goal is... well, the idea was, how do we not keep people in their phones,

02:25:22 so they’re sort of escaping reality in this text conversation?

02:25:26 How do we, through this, still bring our users back to reality, make them see their

02:25:33 life through a different lens?

02:25:36 How can we create a little bit of magical realism in their lives?

02:25:40 So, through augmented reality, by, you know, summoning your avatar, even if it looks

02:25:48 kind of janky and not great in the beginning, or very simplistic, summoning it to your

02:25:56 living room and then the avatar looks around and talks to you about where it is and maybe

02:26:01 turns your floor into a dance floor and you guys dance together, that makes you see reality

02:26:05 in a different light.

02:26:06 What kind of dancing are we talking about?

02:26:08 Like, like slow dancing?

02:26:10 Whatever you want.

02:26:11 I mean, you would like slow dancing, I think that other people may be wanting more, something

02:26:16 more energetic.

02:26:17 Wait, what do you mean?

02:26:18 I was like, so what is this?

02:26:19 Because you started with slow dancing.

02:26:20 So I just assumed that you’re interested in slow dancing.

02:26:24 All right.

02:26:25 What kind of dancing do you like?

02:26:26 What would your avatar, what would you dance?

02:26:27 I’m notoriously bad with dancing, but I like this kind of hip hop robot dance.

02:26:32 I used to break dance when I was a kid, so I still want to pretend I’m a teenager and

02:26:37 learn some of those moves.

02:26:39 And I also like that type of dance that happens when there’s like, in like music videos where

02:26:46 the background dancers are just doing some pop music, that type of dance is definitely

02:26:51 what I want to learn.

02:26:52 But I think it’s great because if you see this friend in your life and you can introduce

02:26:56 it to your friends, then there’s a potential to actually make you feel more connected with

02:27:00 your friends or with people you know, or show you life around you in a different light.

02:27:06 And it takes you out of your phone, even though, weirdly, you have to look at it through the

02:27:10 phone, but it makes you notice things around it and it can point things out for you.

02:27:17 So that is the main reason why I wanted to have a physical dimension.

02:27:22 And it felt a little bit easier than that kind of strange combination in the movie

02:27:27 Her when he has to show Samantha the world through the lens of his phone, but then at

02:27:32 the same time talk to her through the headphone.

02:27:35 It just didn’t seem as potentially immersive, so to say.

02:27:39 So that’s my main goal for Augmented Reality is like, how do we make your reality a little

02:27:43 bit more magic?

02:27:44 There’s been a lot of really nice robotics companies that all failed, mostly failed,

02:27:52 home robotics, social robotics companies.

02:27:55 Do you think Replica will ever... is that a dream, a longterm dream, to have a physical

02:27:59 form, or is that not necessary?

02:28:03 So you mentioned like with Augmented Reality bringing them into the world.

02:28:09 What about like actual physical robot?

02:28:13 That I don’t really believe in that much.

02:28:15 I think it’s a very niche product somehow.

02:28:18 I mean, if a robot could be indistinguishable from a human being, then maybe yes, but that

02:28:23 of course, you know, we’re not anywhere even to talk about it.

02:28:29 But unless it’s that, then having any physical representation really limits you a lot because

02:28:35 you probably will have to make it somewhat abstract because everything’s changing so

02:28:38 fast.

02:28:39 Like, you know, we can update the 3D avatars every month and make them look better and

02:28:43 create more animations and make it more and more immersive.

02:28:48 It’s so much work in progress.

02:28:50 It’s just showing what’s possible right now with current tech, but it’s not really in

02:28:54 any way polished finished product, what we’re doing.

02:28:57 With a physical object, you kind of lock yourself into something for a long time.

02:29:02 Anything’s pretty niche.

02:29:03 And again, the capabilities are even less; we’re barely kind of like

02:29:09 scratching the surface of what’s possible with just software.

02:29:12 As soon as we introduce hardware, then, you know, we have even less capabilities.

02:29:17 Yeah.

02:29:18 In terms of board members and investors and so on, the cost increases significantly.

02:29:23 I mean, that’s why you have to justify.

02:29:26 You have to be able to sell a thing for like $500 or something like that or more.

02:29:30 And it’s very difficult to provide that much value to people.

02:29:34 That’s also true.

02:29:35 Yeah.

02:29:36 And I guess that’s super important.

02:29:37 Most of our users don’t have that much money.

02:29:39 We actually are probably more popular on Android and we have tons of users with really old

02:29:45 Android phones.

02:29:47 And most of our most active users live in small towns.

02:29:51 They’re not necessarily making much and they just won’t be able to afford any of that.

02:29:56 Ours is like the opposite of the early adopter of, you know, a fancy technology product,

02:30:01 which really is interesting: like, pretty much no VCs yet have an AI friend, but

02:30:09 you know, a guy who lives in Tennessee in a small town is already fully

02:30:14 in 2030, or in the world as we imagine it in the movie Her; he’s living that life already.

02:30:20 What do you think?

02:30:21 I have to ask you about the movie Her.

02:30:24 Let’s do a movie review.

02:30:25 What do you think they got right?

02:30:28 What did they do a good job of portraying?

02:30:30 What do you think they did a bad job of portraying about this experience of a voice based assistant

02:30:39 that you can have a relationship with?

02:30:42 First of all, I started working on this company before that movie came out.

02:30:46 So it was... but once it came out, it was actually interesting, and I was like,

02:30:50 well, we’re definitely working on the right thing.

02:30:52 We should continue.

02:30:53 There are movies about it.

02:30:55 And then, you know, Ex Machina came out and all these things.

02:30:58 In the movie Her, I think the most important thing that people usually miss about the movie

02:31:04 is the ending.

02:31:05 Cause I think people check out when the AIs leave, but actually something really important

02:31:10 happens afterwards.

02:31:11 Cause the main character goes and talks to Samantha, his AI, and he says something like,

02:31:24 you know, uh, how can you leave me?

02:31:26 I’ve never loved anyone the way I loved you.

02:31:29 And she goes, uh, well, me neither, but now we know how.

02:31:33 And then the guy goes and writes a heartfelt letter to his ex wife, which he couldn’t write

02:31:38 before; you know, the whole movie he was struggling to actually write something meaningful to

02:31:43 her, even though that’s his job.

02:31:47 And then he goes and talks to his neighbor, and they go to the rooftop and they cuddle.

02:31:53 And it seems like something’s starting there.

02:31:55 And so I think this "now we know how" is the main meaning

02:32:01 of that movie.

02:32:02 It’s not about falling in love with the OS or running away from other people.

02:32:06 It’s about learning what, you know, what it means to feel so deeply connected with something.

02:32:14 What about the thing where the AI system was like actually hanging out with a lot of others?

02:32:21 I felt jealous just like hearing that I was like, Oh, I mean, uh, yeah.

02:32:28 So she was having, I forgot already, but she was having like deep meaningful discussion

02:32:32 with some like philosopher guy.

02:32:34 Like Alan Watts or something.

02:32:35 What kind of deep meaningful conversation can you have with Alan Watts in the first

02:32:41 place?

02:32:42 I know.

02:32:43 But, like, I would feel so jealous that there’s somebody who’s, like, way more

02:32:46 intelligent than me and she’s spending all her time with them. I’d be like, well,

02:32:52 I won’t be able to live up to that.

02:32:55 And she had thousands of them. Is that, from the engineering

02:33:02 perspective, a useful feature to have, jealousy?

02:33:06 I don’t know.

02:33:07 As you know,

02:33:08 we definitely played around with the replica universe where different replicas can talk

02:33:11 to each other.

02:33:12 Universe.

02:33:13 I think it will be something along these lines, but there was

02:33:19 just no specific application straight away.

02:33:23 I think in the future, again, if I’m always thinking about it, if we had no tech limitations,

02:33:28 uh, right now, if we could build any conversations, any, um, possible features in this product,

02:33:36 then yeah, I think different replicas talking to each other would be also quite cool cause

02:33:40 that would help us connect better.

02:33:42 You know, cause maybe mine could talk to yours and then give me some suggestions on what

02:33:48 I should say or not say. I’m just kidding, but more like, can it improve our connections?

02:33:53 Cause eventually, I’m not quite sure yet that we will succeed, that our thinking is

02:34:01 correct.

02:34:02 Cause there might be a reality where having a perfect AI friend still makes us more disconnected

02:34:09 from each other, and there’s no way around it, and it does not improve any metrics for us.

02:34:13 Real metrics, meaningful metrics.

02:34:15 So success is, you know, we’re happier and more connected.

02:34:21 Yeah.

02:34:22 I don’t know.

02:34:26 Sure, it’s possible.

02:34:27 There’s a reality like that, but I’m deeply optimistic.

02:34:30 I think... are you worried, business wise, like how difficult it is to

02:34:42 bring this thing to where... I mean, there’s a huge number of people that

02:34:47 use it already, but to, yeah, like I said, a multi billion dollar company? Is that

02:34:52 a source of stress for you?

02:34:54 Are you a super optimistic and confident or do you?

02:35:00 I’m not that much of a numbers person, as you’ve probably seen.

02:35:06 So it doesn’t matter for me whether like, whether we help 10,000 people or a million

02:35:13 people or a billion people with that, um, I, it would be great to scale it for more

02:35:19 people, but I’d say that even helping one... I think this is such a magical thing. For me,

02:35:25 it’s absolute magic.

02:35:26 I never thought that, you know, we would be able to build this, that anyone would ever

02:35:32 talk to it.

02:35:33 And I always thought like, well, for me it would be successful if we managed to help

02:35:36 and actually change a life for one person, like then we did something interesting and

02:35:42 you know, how many people can say they did it and specifically with this very futuristic,

02:35:47 very romantic technology.

02:35:49 So that’s how I view it.

02:35:51 I think for me it’s important to try to figure out how to actually be,

02:35:58 you know, helpful.

02:35:59 Cause at the end of the day, if you can build a perfect AI friend that’s so understanding,

02:36:04 that knows you better than any human out there, can have great conversations with you,

02:36:10 always knows how to make you feel better,

02:36:12 why would you choose another human?

02:36:14 You know, so that’s the question.

02:36:16 How do you still keep building it

02:36:17 so it’s optimizing for the right thing,

02:36:19 so it’s still circling you back to other humans in a way?

02:36:24 So I think maybe that’s the main kind of source of anxiety,

02:36:30 and just thinking about that can be a little bit stressful.

02:36:36 Yeah.

02:36:37 That’s a fascinating thing.

02:36:38 How to have a friend that doesn’t... sometimes, like, friends, quote

02:36:45 unquote, you know, those people in the guy universe,

02:36:50 where you get the girlfriend and then the guy stops hanging

02:36:56 out with all of his friends. It’s like, obviously the relationship with the girlfriend is fulfilling

02:37:03 or whatever, but, like, you also want it to be where she makes it more enriching

02:37:10 to hang out with the guy friends or whatever was there anyway.

02:37:13 But that’s a fundamental problem in choosing the right

02:37:18 mate, and probably the fundamental problem in creating the right AI system.

02:37:23 Right.

02:37:24 Let me ask about the sexy, hot thing on the presses right now: GPT-3 got released

02:37:31 by OpenAI.

02:37:32 It’s their latest language model.

02:37:36 They have kind of an API where you can create a lot of fun applications.

02:37:40 I think it’s, as people have said, probably more hype than intelligence, but there’s

02:37:48 a lot of really cool things, ideas there. With increasing size, you can have better

02:37:56 and better performance on language.

02:37:58 What are your thoughts about GPT-3 in connection to your work with open domain

02:38:04 dialogue, but in general, like, this learning in an unsupervised way from the internet to

02:38:12 generate one character at a time, creating pretty cool text?

02:38:18 So we partnered up before the API launch.

02:38:23 We started working with them when they decided to put together this API, and we tried

02:38:31 it without fine tuning, then we tried it with fine tuning on our data.

02:38:34 And we’ve worked closely to actually optimize this model for some of our data sets.
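
Since the GPT-3 fine tuning she describes went through a private partnership with OpenAI, here is only a stand-in sketch of the general idea: fine tuning a small open causal language model on dialogue transcripts with Hugging Face's transformers. The data, output path, and hyperparameters are illustrative.

```python
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer.pad_token = tokenizer.eos_token  # GPT-2 has no pad token by default

dialogues = [
    "User: I had a rough day.\nAI: I'm sorry to hear that. Want to talk about it?",
    # ...in practice, many thousands of real conversation pairs
]

def encode(text):
    # Tokenize one dialogue; the labels are the inputs shifted internally
    # by the model, with padding masked out of the loss.
    enc = tokenizer(text, truncation=True, max_length=128,
                    padding="max_length", return_tensors="pt")
    enc["labels"] = enc["input_ids"].clone()
    enc["labels"][enc["attention_mask"] == 0] = -100  # ignore padding
    return {k: v.squeeze(0) for k, v in enc.items()}

train_dataset = [encode(d) for d in dialogues]

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="dialogue-sketch",  # hypothetical path
                           num_train_epochs=1,
                           per_device_train_batch_size=2),
    train_dataset=train_dataset,
)
trainer.train()
```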

02:38:45 It’s kind of cool.

02:38:46 Cause I think we’re kind of this polygon, this experimental space

02:38:51 for these models, to see how they actually work with people.

02:38:56 Cause there are no products publicly available to do that.

02:38:59 We’re focused on open domain conversation, so we can, you know, test how Facebook’s Blender is

02:39:03 doing or how GPT-3 is doing.

02:39:06 So with GPT-3, we managed to improve by a few percentage points, like three or

02:39:11 four, a pretty meaningful amount of percentage points, our main metric, which is the ratio

02:39:15 of conversations that make people feel better.

02:39:19 And every other metric across the board got a little boost.

02:39:23 Like, now I’d say one out of five responses from Replica comes from GPT-3.

02:39:30 So our own blender mixes up like a bunch of candidates from different... Blender, you said?

02:39:35 Well, yeah, just the model that looks at top candidates from different models and

02:39:42 picks the best one.

02:39:44 So right now, one out of five will come from GPT-3, which is really great.
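
As a toy illustration of that blender, candidates from scripted, retrieval, and generative sources are pooled and a ranker picks one; every function body below is a placeholder (the real ranker would be a learned model trained on the emotional-outcome signal, and the generative source would be GPT-3).

```python
import random

def scripted_candidates(message):
    # Hand-written templates keyed on intent (placeholder).
    return ["That sounds important. Tell me more?"]

def retrieval_candidates(message):
    # Nearest neighbors from a curated response bank (placeholder).
    return ["I hear you. How did that make you feel?"]

def generative_candidates(message):
    # In production this would be a GPT-3 / end-to-end model sample.
    return ["Rough days happen. I'm here for you."]

def score(message, candidate):
    # Placeholder for a learned ranker; random keeps the sketch runnable.
    return random.random()

def blender_reply(message):
    candidates = (scripted_candidates(message)
                  + retrieval_candidates(message)
                  + generative_candidates(message))
    return max(candidates, key=lambda c: score(message, c))

print(blender_reply("I had a rough day at work."))
```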

02:39:50 I mean, uh, what’s the, do you have hope for, like, do you think there’s a ceiling to this

02:39:57 kind of approach?

02:39:58 So, for a very long time... in the very beginning,

02:40:05 most of Replica was scripted, and then a little bit of this fallback part of

02:40:09 Replica was using a retrieval model.

02:40:12 And then those retrieval models started getting better and better; with

02:40:17 transformers they got a lot better, and we were seeing great results.

02:40:20 And then with GPT-2, finally, generative models, which originally were not very good

02:40:26 and were the very last fallback option for most of our conversations, we wouldn’t even

02:40:32 put them in production.

02:40:34 Finally we could use some generative models as well, you know, next to our retrieval

02:40:39 models.

02:40:40 And now we do GPT-3; they’re almost on par.

02:40:44 Um, so that’s pretty exciting.

02:40:46 I think just seeing how, from the very beginning, you know, from 2015 when the first

02:40:52 models started to pop up here and there, like sequence to sequence, the first papers

02:40:57 on that, from my observer standpoint, personally... I’m

02:41:03 not really building it, I’m only testing it on people, basically, in my product,

02:41:08 to see how, all of a sudden, we can use generative dialogue models in production, and they’re

02:41:13 better than others and they’re better than scripted content.

02:41:17 So we can’t really get our scripted, hardcoded content anymore to be as good as our end to

02:41:23 end models.

02:41:24 That’s exciting.

02:41:25 They’re much better.

02:41:26 Yeah.

02:41:27 To your question, whether that’s the right way to go.

02:41:30 I’m again, I’m in the observer seat, I’m just, um, watching this very exciting movie.

02:41:36 Um, I mean, so far it’s been stupid to bet against deep learning.

02:41:40 So whether increasing the size even more, with a hundred trillion parameters, will

02:41:47 finally get us to the right answer, whether that’s the way, or whether there should be,

02:41:53 there has to be, some other way... again, I’m definitely not an expert in any way.

02:41:58 I think, and that’s purely my instinct saying it, that there should be something else as well

02:42:02 for memory.

02:42:03 No, for sure.

02:42:04 But the question is, I wonder... I mean, yeah, then the argument is, for reasoning or

02:42:10 for memory, it might emerge with more parameters, it might emerge with larger networks.

02:42:14 But it might emerge.

02:42:15 You know, I would never have thought that, to be honest. Like, maybe in 2017, when we’d been just experimenting

02:42:21 with all, you know, with all the research that was coming out,

02:42:25 then I felt like we were hitting a wall, that there should be something completely

02:42:30 different, but then came transformer models and then just bigger models.

02:42:34 And then all of a sudden size matters.

02:42:36 At that point, it felt like something dramatic needed to happen, but it didn’t.

02:42:41 And just the size, you know, gave us these results that to me are, you know, a clear indication

02:42:48 that we can solve this problem pretty soon.

02:42:50 Did fine tuning help quite a bit?

02:42:52 Oh yeah.

02:42:53 Without it, it wasn’t as good.

02:42:56 I mean, there is a compelling hope that you don’t have to do fine tuning, which is one

02:43:01 of the cool things about GPT-3; it seems to do well without any fine tuning.

02:43:06 I guess for specific applications, we still want to, like, add a little

02:43:11 fine tuning on a specific use case, but it’s an incredibly impressive thing from my

02:43:19 standpoint.

02:43:20 And again, I’m not an expert, so I wanted to say that there will be people then.

02:43:24 Yeah.

02:43:25 I have access to the API.

02:43:26 I’ve been, I’m going to probably do a bunch of fun things with it.

02:43:30 I already did some fun things, some videos coming up.

02:43:34 Just for the hell of it.

02:43:35 I mean, I could be a troll at this point with it.

02:43:37 I haven’t used it for a serious application, so it’s really cool to see.

02:43:41 You’re right.

02:43:43 You’re able to actually use it with real people and see how well it works.

02:43:46 That’s really exciting.

02:43:49 Let me ask you another absurd question, but there’s a feeling when you interact with Replica

02:43:56 with an AI system, there’s an entity there.

02:44:01 Do you think that entity has to be self aware?

02:44:06 Do you think it has to have consciousness to create a rich experience, and, as a corollary,

02:44:15 what is consciousness?

02:44:19 I don’t know if it does need to have any of those things, but again, because right now,

02:44:23 you know, it doesn’t have anything.

02:44:24 Again, a bunch of tricks can simulate it.

02:44:29 I’m not sure.

02:44:30 Let’s just put it this way, but I think as long as you can simulate it, if you can feel

02:44:34 like you’re talking to a robot, to a machine that seems to be self aware, that seems to

02:44:43 reason well and feels like a person, then I think that’s enough.

02:44:48 And again, what’s the goal?

02:44:50 In order to make people feel better, we might not even need that, at the end of the day.

02:44:56 What about, so that’s one goal.

02:44:58 What about like ethical things about suffering?

02:45:02 You know, the moment there’s a display of consciousness, we associate consciousness

02:45:06 with suffering, you know, there’s a temptation to say, well, shouldn’t this thing have rights?

02:45:16 And shouldn’t we, you know, be careful about how we interact with a

02:45:25 replica?

02:45:26 Like, should it be illegal to torture a replica, right?

02:45:31 All those kinds of things.

02:45:33 Is that, see, I personally believe that that’s going to be a thing, like that’s a serious

02:45:39 thing to think about, but I’m not sure when.

02:45:43 But by your smile, I can tell that’s not a current concern.

02:45:48 But do you think about that kind of stuff, about like, suffering and torture and ethical

02:45:55 questions about AI systems?

02:45:57 From their perspective?

02:45:58 Well, I think if we’re talking about long game, I wouldn’t torture your AI.

02:46:03 Who knows what happens in five to 10 years?

02:46:05 Yeah, they’ll get back at you for that, they’ll get you back eventually.

02:46:08 Try to be as nice as possible and create this ally.

02:46:14 I think there should be regulation both ways, in a way. Like, I don’t think it’s okay to

02:46:19 torture an AI, to be honest.

02:46:21 I don’t think it’s okay to yell, “Alexa, turn on the lights,”

02:46:24 or to just say kind of nasty things, you know, like how kids learn

02:46:28 to interact with Alexa in this kind of mean way, because they just yell at it all the

02:46:33 time.

02:46:34 I don’t think that’s great.

02:46:35 I think there should be some feedback loops so that these systems don’t train us that

02:46:39 it’s okay to do that in general.

02:46:42 So that if you try to do that, you really get some feedback from the system that it’s

02:46:47 not okay with that.

02:46:50 And that’s the most important thing right now.

02:46:53 Let me ask a question I think people are curious about when they look at a world class leader

02:47:01 and thinker such as yourself: what books, technical, fiction, or philosophical, had a big

02:47:08 impact on your life?

02:47:09 And maybe from another perspective, what books would you recommend others read?

02:47:15 So my choice, the three books, right?

02:47:17 Three books.

02:47:18 My choice is: the one book that really influenced me a lot when I was starting

02:47:25 out this company, maybe 10 years ago, was G.E.B., Gödel, Escher, Bach, and I like everything about it, first

02:47:34 of all.

02:47:35 It’s just beautifully written, and it’s so old school, and somewhat outdated a little

02:47:42 bit.

02:47:43 But I think the ideas in it about the fact that a few meaningless components can come

02:47:48 together and create meaning that we can’t even understand.

02:47:52 This emergent thing, I mean complexity, the whole science of complexity, and that beauty,

02:47:59 intelligence, all the interesting things about this world emerge.

02:48:04 Yeah, and the Gödel theorems, and just thinking about how, even from these formal

02:48:14 systems, something can be created that we can’t quite yet understand.

02:48:19 And from my romantic standpoint, that was always just why it’s important, why maybe

02:48:25 I should try to work on these systems and try to build an AI.

02:48:30 Yes, I’m not an engineer; yes, I don’t really know how it works; but I think that something

02:48:33 comes out of it that’s pure poetry, and I know a little bit about that.

02:48:40 Something magical comes out of it that we can’t quite put a finger on.

02:48:45 That’s why that book was really fundamental for me, just for, I don’t even know why, it

02:48:51 was just all about this little magic that happens.

02:48:55 So that’s one. Probably the most important book for Replika was Carl Rogers’

02:49:00 On Becoming a Person.

02:49:02 And so when I think about our company, it’s all about the fact that there are

02:49:07 so many little magical things that happened over the course of working on it.

02:49:14 For instance, the most famous chatbot that we learned about when we started working

02:49:18 on the company was ELIZA, built by Weizenbaum, the MIT professor, as a chatbot that

02:49:24 would listen to you and be a therapist.

02:49:29 And I got really inspired to build Replika when I read Carl Rogers’ On Becoming a Person.

02:49:34 And then I realized that ELIZA was mocking Carl Rogers.

02:49:37 It was parodying Carl Rogers’ style of therapy back in the day.
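
ELIZA’s Rogerian style came from a surprisingly small mechanism: keyword patterns plus pronoun reflection, echoing the user’s words back as a question. A toy sketch of that trick follows; these few rules are invented for illustration, while Weizenbaum’s original script had many more keywords, with rankings.

```python
# Toy sketch of ELIZA's core mechanism: match a pattern, reflect pronouns,
# and echo the user's words back as a Rogerian-style question.
import re

REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "I", "your": "my"}

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the echo points back at the user."""
    return " ".join(REFLECTIONS.get(w, w) for w in fragment.lower().split())

RULES = [
    (r"i feel (.*)", "Why do you feel {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please, go on."),  # fallback rule: keep the space open
]

def respond(utterance: str) -> str:
    for pattern, template in RULES:
        match = re.match(pattern, utterance.strip(), re.IGNORECASE)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))

print(respond("I feel lost without my friend"))
# -> Why do you feel lost without your friend?
```

The fallback rule is what gives the parody its Rogerian flavor: when nothing matches, the program simply invites the speaker to continue.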

02:49:39 But I thought that Carl Rogers’ ideas, they’re very

02:49:45 simple, but they’re maybe the most profound thing I’ve ever learned about human beings.

02:49:52 And it’s the fact that before Carl Rogers, most therapy was about seeing what’s wrong

02:49:58 with people and trying to fix it, or showing them what’s wrong with them.

02:50:01 And it was all built on the idea that all people are fundamentally flawed.

02:50:07 We have this broken psyche and therapy is just an instrument to shed some light on that.

02:50:15 And Carl Rogers was different in that he finally said, well, what’s very important

02:50:21 for therapy to work is to create this therapeutic relationship where you believe fundamentally

02:50:25 in an inclination to positive growth, that everyone deep inside wants to grow positively and change.

02:50:33 And it’s super important to create this space and this therapeutic relationship where you

02:50:36 give unconditional positive regard, deep understanding, allowing someone else to be a separate person,

02:50:42 full acceptance.

02:50:44 And you also try to be as genuine as possible in it.

02:50:48 And then for him, that was his own journey of personal growth.

02:50:54 And that was back in the sixties.

02:50:55 And even in that book, which is coming from so many years ago, there’s a mention that even machines

02:51:02 can potentially do that.

02:51:05 And I always felt that, you know, creating that space is probably the biggest

02:51:09 gift we can give to each other.

02:51:10 And that’s why the book was fundamental for me personally, because I felt I want to be

02:51:15 learning how to do that in my life.

02:51:18 And maybe I can scale it with, you know, with these AI systems and other people can get

02:51:22 access to that.

02:51:23 So I think Carl Rogers, it’s a pretty dry and a bit boring book, but I think the ideas

02:51:28 are good.

02:51:29 Would you recommend others try to read it?

02:51:30 I do.

02:51:31 I think just for yourself, as a human, not as an AI.

02:51:38 For him, that was his own path of growing personally

02:51:44 over years of working with people like that.

02:51:47 And so it was his work and himself growing, helping other people grow, and growing through that.

02:51:52 And that’s fundamentally what I believe in with our work: helping other people grow,

02:51:56 and ourselves, trying to build a company that’s all built on those principles,

02:52:03 you know, having a good time, allowing the people we work with to grow a little bit.

02:52:07 So these two books, and then I would throw in what we have in our office.

02:52:15 When we started the company in Russia, we put a neon sign in our office because we thought

02:52:19 that was the recipe for success.

02:52:22 If we did that, we were definitely going to wake up as a multi-billion-dollar company.

02:52:26 It was the Ludwig Wittgenstein quote: “The limits of my language are the limits of my

02:52:31 world.”

02:52:32 What’s the quote?

02:52:33 The limits of my language are the limits of my world.

02:52:37 And I love the Tractatus.

02:52:39 I think it’s just a beautiful book by Wittgenstein.

02:52:43 Yeah.

02:52:44 And I would recommend that too, even though he himself didn’t believe in it by the end

02:52:48 of his lifetime and debunked these ideas.

02:52:51 But I remember once, in 2012 or 2013 I think, an engineer came by, a friend of ours

02:52:58 who worked with us and then went on to work at DeepMind, and he talked to us about

02:53:03 word2vec.

02:53:04 And I saw that and I’m like, wow, you know, they wanted to translate language

02:53:10 into some other representation.

02:53:13 And it seemed like, you know, somehow all of that, at some point, I think, will come

02:53:18 together in this one place.
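
For a concrete sense of the word2vec idea mentioned here, a minimal sketch using the gensim library: each word is mapped to a dense vector, and words used in similar contexts land near each other. The toy corpus is a stand-in; real models train on billions of words.

```python
# Minimal word2vec sketch: learn dense vectors from a toy corpus so that
# words appearing in similar contexts land near each other in vector space.
from gensim.models import Word2Vec

corpus = [
    ["the", "limits", "of", "my", "language", "are", "the", "limits", "of", "my", "world"],
    ["language", "shapes", "how", "we", "see", "the", "world"],
    ["every", "word", "becomes", "a", "vector", "in", "a", "shared", "space"],
]

model = Word2Vec(sentences=corpus, vector_size=32, window=3,
                 min_count=1, epochs=200, seed=42)

print(model.wv["language"][:5])                   # first few vector components
print(model.wv.most_similar("language", topn=3))  # nearest neighbors by cosine
```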

02:53:22 Somehow it just all feels like different people think about similar ideas at different times,

02:53:26 from absolutely different perspectives.

02:53:29 And that’s why I like these books.

02:53:31 The limits of our language are the limits of our world.

02:53:34 And we still have that neon sign, it’s very hard to work with this red light in your face.

02:53:45 I mean, on the Russian side of things, in terms of language, the limits of language

02:53:53 being the limits of our world, you know, Russian is a beautiful language in some sense.

02:53:57 There’s wit, there’s humor, there’s pain.

02:54:00 There’s so much.

02:54:01 We don’t have time to talk about it much today, but I’m going to Paris to talk to Dostoevsky

02:54:06 and Tolstoy translators.

02:54:09 I think translation is this fascinating art, like art and engineering combined, such an interesting

02:54:15 process.

02:54:16 But from the Replika perspective, what do you think about translation?

02:54:23 How difficult is it to create a deep, meaningful connection in Russian versus English?

02:54:29 How do you translate between the two languages?

02:54:32 You speak both?

02:54:33 Yeah.

02:54:34 I think we’re two different people in different languages.

02:54:37 You know, there’s actually some research on that.

02:54:41 I looked into it at some point because I was fascinated by the fact that what I was talking

02:54:45 about with my Russian therapist had nothing to do

02:54:48 with what I was talking about with my English-speaking therapist.

02:54:51 It’s two different lives, two different types of conversations, two different personas.

02:54:59 The main difference between the two languages, Russian and English, is that

02:55:05 English is like a piano.

02:55:06 It has a limited number of keys: a lot of different keys, but not too many.

02:55:11 And Russian is like an organ or something.

02:55:13 It’s just something gigantic with so many different keys and so many different opportunities

02:55:18 to screw up and so many opportunities to do something completely tone deaf.

02:55:24 It is just a much harder language to use.

02:55:28 It has way too much flexibility and way too many tones.

02:55:34 What about the entirety of like World War II, communism, Stalin, the pain of the people

02:55:40 like having been deceived by the dream, like all the pain of like just the entirety of

02:55:47 it.

02:55:48 Is that in the language too?

02:55:49 Does that have something to do with it?

02:55:50 Oh, for sure.

02:55:51 I mean, we have words that don’t have a direct translation to English. For instance,

02:55:56 we have a word that is sort of like “to hold a grudge” or something, but

02:56:03 you don’t need to have anyone do it to you.

02:56:07 It’s just your state.

02:56:08 Yeah.

02:56:09 You just feel like that.

02:56:10 You feel betrayed by other people, basically, but it’s not quite that, and you can’t really translate

02:56:15 it.

02:56:16 And I think that’s super important.

02:56:18 There are very many words that are very specific to explaining the Russian being, and I think they

02:56:24 can only come from a nation that suffered so much and saw institutions fall time after

02:56:31 time after time. And you know, what’s exciting, maybe not exciting, exciting is the wrong word,

02:56:36 but what’s interesting about, like, my generation, my mom’s generation, my parents’ generation,

02:56:42 is that we saw institutions fall two or three times in our lifetime, while most Americans have

02:56:48 never seen them fall and they just think that they exist forever, which is really interesting,

02:56:55 but it’s definitely a country that suffered so much and it makes, unfortunately when I

02:57:01 go back and I, you know, hang out with my Russian friends, it makes people very cynical.

02:57:06 They stop believing in the future.

02:57:10 I hope that’s not going to be the case for much longer, that something’s going to change again,

02:57:15 but I think seeing institutions fall is a very traumatic experience.

02:57:19 That’s very interesting, and 2020 is a very interesting year for that. Do you think civilization

02:57:28 will collapse?

02:57:29 See, I’m a very practical person.

02:57:33 We’re speaking in English.

02:57:34 So like you said, you’re a different person in English and Russian.

02:57:37 So in Russian you might answer that differently, but in English, yeah.

02:57:42 I’m an optimist, and I generally believe that, you know, even though the

02:57:49 prospects are grim, there’s always a place for a miracle.

02:57:54 I mean, it’s always been like that with my life.

02:57:56 So yeah, in my life, I’ve been incredibly lucky, and miracles just happen all

02:58:02 the time with this company, with people I know, with everything around me.

02:58:08 And so I didn’t mention that book, but maybe In Search of the Miraculous,

02:58:13 or whatever the English translation for it is, a good Russian book for everyone to read.

02:58:19 Yeah.

02:58:20 I mean, if you put good vibes, if you put love out there in the world, miracles somehow

02:58:29 happen.

02:58:30 Yeah.

02:58:31 I believe that too, or at least I believe that, I don’t know.

02:58:35 Let me ask the most absurd, final, ridiculous question. We’ve talked about life a lot.

02:58:42 What do you think is the meaning of it all?

02:58:45 What’s the meaning of life?

02:58:46 I mean, my answer is probably going to be pretty cheesy.

02:58:52 But I think it’s the state of love, once you feel it, in the way that we’ve discussed it

02:58:59 before.

02:59:00 I’m not talking about falling in love, where…

02:59:04 Just love.

02:59:05 To yourself, to other people, to something, to the world.

02:59:10 That state of bliss that we experience sometimes, whether through connection with ourselves,

02:59:16 with our people, with the technology, there’s something special about those moments.

02:59:23 So I would say, if anything, that’s the only…

02:59:30 If it’s not for that, then what else are we really doing all of this for?

02:59:35 I don’t think there’s a better way to end it than talking about love.

02:59:38 Eugenia, I told you offline that there’s something about me that felt like this…

02:59:47 Talking to you, meeting you in person would be a turning point for my life.

02:59:51 I know that might sound weird to hear, but it was a huge honor to talk to you.

02:59:59 I hope we talk again.

03:00:01 Thank you so much for your time.

03:00:02 Thank you so much, Lex.

03:00:05 Thanks for listening to this conversation with Eugenia Kuyda, and thank you to our sponsors,

03:00:09 DoorDash, Dollar Shave Club, and Cash App.

03:00:13 Click the sponsor links in the description to get a discount and to support this podcast.

03:00:18 If you enjoy this thing, subscribe on YouTube, review it with 5 stars on Apple Podcast, follow

03:00:23 on Spotify, support on Patreon, or connect with me on Twitter at Lex Friedman.

03:00:28 And now, let me leave you with some words from Carl Sagan.

03:00:32 The world is so exquisite with so much love and moral depth that there’s no reason to

03:00:36 deceive ourselves with pretty stories for which there’s little good evidence.

03:00:41 Far better, it seems to me, in our vulnerability is to look death in the eye and to be grateful

03:00:48 every day for the brief but magnificent opportunity that life provides.

03:00:54 Thank you for listening and hope to see you next time.