Charles Isbell: Computing, Interactive AI, and Race in America #135

Transcript

00:00:00 The following is a conversation with Charles Isbell,

00:00:03 Dean of the College of Computing at Georgia Tech,

00:00:06 a researcher and educator in the field of artificial intelligence,

00:00:10 and someone who deeply thinks about what exactly is the field of computing and how do we teach it.

00:00:18 He also has a fascinatingly varied set of interests including music,

00:00:22 books, movies, sports, and history that make him especially fun to talk with.

00:00:28 When I first saw him speak, his charisma immediately took over the room,

00:00:32 and I had a stupid excited smile on my face,

00:00:35 and I knew I had to eventually talk to him on this podcast.

00:00:39 Quick mention of each sponsor, followed by some thoughts related to the episode.

00:00:44 First is Neuro, the maker of functional sugar free gum

00:00:48 and mints that I use to give my brain a quick caffeine boost.

00:00:52 Second is Decoding Digital, a podcast on tech and entrepreneurship

00:00:56 that I listen to and enjoy.

00:00:59 Third is Masterclass, online courses that I watch from some of the most amazing humans in history.

00:01:04 And finally, Cash App, the app I use to send money to friends for food and drinks.

00:01:10 Please check out these sponsors in the description to get a discount and to support this podcast.

00:01:16 As a side note, let me say that I’m trying to make it so that the conversations with Charles,

00:01:21 Eric Weinstein, and Dan Carlin will be published before Americans vote for president on November 3rd.

00:01:28 There’s nothing explicitly political in these conversations,

00:01:31 but they do touch on something in human nature that I hope can bring context to our difficult time,

00:01:37 and maybe, for a moment, allow us to empathize with people we disagree with.

00:01:43 With Eric, we talk about the nature of evil.

00:01:45 With Charles, besides AI and music, we talk a bit about race in America,

00:01:51 and how we can bring more love and empathy to our online communication.

00:01:56 And with Dan Carlin, well, we talk about Alexander the Great,

00:02:01 Genghis Khan, Hitler, Stalin, and all the complicated parts of human history in between,

00:02:07 with a hopeful eye toward a brighter future for our humble, little civilization here on Earth.

00:02:13 The conversation with Dan will hopefully be posted tomorrow, on Monday, November 2nd.

00:02:19 If you enjoy this thing, subscribe on YouTube, review it with five stars on Apple Podcasts,

00:02:24 follow on Spotify, support on Patreon, or connect with me on Twitter at Lex Fridman.

00:02:30 And now, here’s my conversation with Charles Isbell.

00:02:35 You’ve mentioned that you love movies and TV shows.

00:02:39 Let's ask an easy question, but you have to be definitive, objective, conclusive.

00:02:44 What’s your top three movies of all time?

00:02:47 So, you’re asking me to be definitive and to be conclusive.

00:02:50 That’s a little hard. I’m going to tell you why.

00:02:51 It’s very simple. It’s because movies is too broad of a category.

00:02:56 I got to pick subgenres, but I will tell you that of those genres,

00:02:59 I’ll pick one or two from each of the genres, and I’ll get us to three, so I’m going to cheat.

00:03:03 So, my favorite comedy of all times, which is probably my favorite movie of all time,

00:03:10 is His Girl Friday, which is probably a movie that you’ve not ever heard of,

00:03:14 but it’s based on a play called The Front Page from, I don’t know, early 1900s.

00:03:20 And the movie is a fantastic film.

00:03:23 What's the story? Is it an independent film?

00:03:26 No, no, no. What are we talking about?

00:03:27 This is one of the movies that would have been very popular. It’s a screwball comedy.

00:03:31 You ever see Moonlighting, the TV show? You know what I’m talking about?

00:03:33 So, you’ve seen these shows where there’s a man and a woman, and they clearly are in love with one another,

00:03:38 and they’re constantly fighting and always talking over each other.

00:03:40 Banter, banter, banter, banter, banter.

00:03:43 This was the movie that started all that, as far as I’m concerned.

00:03:46 It’s very much of its time. So, it’s, I don’t know, must have come out sometime between 1934 and 1939.

00:03:53 I’m not sure exactly when the movie itself came out. It’s black and white.

00:03:57 It’s just a fantastic film, and it’s hilarious.

00:04:01 So, it’s mostly conversation?

00:04:03 Not entirely, but mostly, mostly. Just a lot of back and forth.

00:04:07 There’s a story there. Someone’s on death row, and they’re newspaper men, including her.

00:04:14 They’re all newspaper men. They were divorced.

00:04:17 The editor, the publisher, I guess, and the reporter, they were divorced.

00:04:22 But, you know, clearly he's trying to get back together,

00:04:25 and there’s this whole other thing that’s going on.

00:04:27 But none of that matters. The plot doesn’t matter.

00:04:28 Yeah, it’s just a little play in conversation.

00:04:31 It’s fantastic. And I just love everything about the conversation, because at the end of the day,

00:04:35 sort of narrative and conversation are the sort of things that drive me.

00:04:38 And so, I really like that movie for that reason.

00:04:41 Similarly, I’m now going to cheat, and I’m going to give you two movies as one.

00:04:45 And they’re Crouching Tiger, Hidden Dragon, and John Wick.

00:04:49 Both relatively modern. John Wick, of course.

00:04:51 One, two, or three?

00:04:52 One. I love them all for different reasons,

00:04:56 and they get increasingly more ridiculous. Kind of like loving Alien and Aliens,

00:04:59 despite the fact they're two completely different movies.

00:05:01 But the reason I put Crouching Tiger, Hidden Dragon, and John Wick together is because I

00:05:06 actually think they’re the same movie, or what I like about them, the same movie.

00:05:09 Which is both of them create a world that you’re coming in the middle of,

00:05:15 and they don’t explain it to you. But the story is done so well that you pick it up.

00:05:20 So, anyone who’s seen John Wick, you know, you have these little coins,

00:05:23 and they're handed out, and there are these rules,

00:05:25 and apparently every single person in New York City is an assassin.

00:05:28 There’s like two people who come through who aren’t, but otherwise they are.

00:05:31 But there’s this complicated world, and everyone knows each other.

00:05:34 They don’t sit down and explain it to you, but you figure it out.

00:05:35 Crouching Tiger, Hidden Dragon is a lot like that.

00:05:38 You get the feeling that this is chapter nine of a 10 part story,

00:05:41 and you’ve missed the first eight chapters, and they’re not going to explain it to you,

00:05:44 but there’s this sort of rich world behind you.

00:05:45 You get pulled in anyway, like immediately.

00:05:47 You get pulled in anyway. So, it’s just excellent storytelling in both cases,

00:05:50 and very, very different.

00:05:51 And also you like the outfit, I assume? The John Wick outfit?

00:05:54 Oh yeah, of course. Well, of course. Yes. I think the John Wick outfit is perfect.

00:05:58 And so that’s number two, and then…

00:05:59 But sorry to pause on that. Martial arts? You have a long list of hobbies.

00:06:03 Like it scrolls off the page, but I didn’t see martial arts as one of them.

00:06:07 I do not do martial arts, but I certainly watch martial arts.

00:06:10 Oh, I appreciate it very much. Oh, we could talk about every Jackie Chan movie ever made,

00:06:14 and I would be on board with that.

00:06:15 Rush Hour 2? Like that kind of cop comedy?

00:06:18 Yes, yes. By the way, my favorite Jackie Chan movie would be Drunken Master 2,

00:06:25 known in the States usually as Legend of the Drunken Master.

00:06:29 Actually, Drunken Master, the first one, is the first kung fu movie I ever saw,

00:06:33 but I did not know that.

00:06:34 First Jackie Chan movie?

00:06:36 No, first one ever that I saw and remember, but I had no idea that that’s what it was,

00:06:40 and I didn’t know that was Jackie Chan. That was like his first major movie.

00:06:43 Yeah. I was a kid. It was done in the 70s.

00:06:46 I only later rediscovered what that actually was.

00:06:49 And he creates his own martial art by drinking. Was he actually drinking, or was he just playing at drinking?

00:06:58 You mean as an actor or as a character?

00:06:59 No, I mean as an actor. It was the 70s or whatever.

00:07:04 He was definitely drinking, and in the end, he drinks industrial grade alcohol.

00:07:09 Ah, yeah.

00:07:10 Yeah, and has one of the most fantastic fights ever in that subgenre.

00:07:15 Anyway, that’s my favorite one of his movies, but I’ll tell you the last movie.

00:07:19 It's actually a movie called Nothing But a Man, which is from the 1960s,

00:07:23 starring Ivan Dixon, who you'll know from Hogan's Heroes, and Abbey Lincoln.

00:07:31 It’s just a really small little drama. It’s a beautiful story.

00:07:35 But my favorite scenes, I’m cheating, one of my favorite movies just for the ending is The

00:07:41 Godfather. I think the last scene of that is just fantastic. It’s the whole movie all summarized in

00:07:47 just eight, nine seconds.

00:07:48 Godfather Part One?

00:07:49 Part One.

00:07:50 How does it end? I don’t think you need to worry about spoilers if you haven’t seen The Godfather.

00:07:54 Spoiler alert. It ends with the wife coming to Michael, and he says,

00:08:01 just this once, I’ll let you ask me my business. And she asks him if he did this terrible thing,

00:08:06 and he looks her in the eye and he lies, and he says, no. And she says, thank you. And she

00:08:10 walks out the door, and you see her going out of the door, and all these people are coming in,

00:08:19 and they're kissing Michael's hand, calling him Godfather. And then the camera switches

00:08:24 perspective. So instead of looking at him, you’re looking at her, and the door

00:08:29 closes in her face, and that’s the end of the movie. And that’s the whole movie right there.

00:08:33 Do you see parallels between that and your position as Dean at Georgia Tech?

00:08:37 Just kidding. Trick question.

00:08:39 Sometimes, certainly. The door gets closed on me every once in a while.

00:08:44 Okay. That was a rhetorical question. You’ve also mentioned that you, I think, enjoy all kinds of

00:08:51 experiments, including on yourself. But I saw a video where you said you did an experiment where

00:08:56 you tracked all kinds of information about yourself and a few others sort of wiring up your

00:09:03 home. And this little idea that you mentioned in that video, which is kind of interesting, that

00:09:10 you thought that two days' worth of data is enough to capture the majority of the behavior of a human

00:09:16 being. First, can you describe what the heck you did to collect all the data? Because it’s

00:09:23 fascinating, just like little details of how you collect that data and also what your intuition

00:09:28 behind the two days is. So first off, it has to be the right two days. But I was thinking of a

00:09:32 very specific experiment. There’s actually a suite of them that I’ve been a part of, and other people

00:09:36 have done this, of course. I just sort of dabbled in that part of the world. But to be very clear,

00:09:40 the specific thing that I was talking about had to do with recording all the IR, the

00:09:46 infrared, going on in my house. So this is a long time ago. So this is everything's being controlled

00:09:50 by pressing buttons on remote controls, as opposed to speaking to Alexa or Siri or someone like that.

00:09:56 And I was just trying to figure out if you could get enough data on people to figure out what they

00:10:01 were going to do with their TVs or their lights. My house was completely wired up at the time.

00:10:06 But you know, whether I'm about to look at a movie, I'm about to turn on the TV or whatever, and just

00:10:10 see what I could predict from it. It was kind of surprising. It shouldn’t have been. But that’s all

00:10:16 very easy to do, by the way, just capturing all the little stuff. I mean, it's a bunch of computer

00:10:19 systems. It's really easy to capture today if you know what you're looking for. At Georgia Tech,

00:10:22 long before I got there, we had this thing called the Aware Home, where everything was wired up and

00:10:27 you captured everything that was going on. Nothing even difficult, not with video or anything like

00:10:31 that, just the way that the system was just capturing everything. So it turns out that,

00:10:37 and I did this with myself and then I had students and they worked with many other people. And it

00:10:42 turns out at the end of the day, people do the same things over and over and over again. So it

00:10:47 has to be the right two days, like a weekend. But it turns out not only can you predict what

00:10:51 someone’s going to do next at the level of what button they’re going to press next on a remote

00:10:55 control, but you can do it with something really, really simple. You don't even need a hidden Markov

00:11:01 model. It's just a Markov model, simply: I press this, this is my prediction of the next thing.

00:11:05 It turns out you can get 93% accuracy just by doing something very simple and stupid and just

00:11:11 counting statistics. But what was actually more interesting is that you could use that information.
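(As a rough sketch of the "simple and stupid" counting predictor described here, a first-order Markov model over button presses could look like the following; the event names and the log are invented for illustration.)

```python
from collections import Counter, defaultdict

def train(events):
    """Count how often each event follows each other event."""
    counts = defaultdict(Counter)
    for prev, nxt in zip(events, events[1:]):
        counts[prev][nxt] += 1
    return counts

def predict_next(counts, prev):
    """Predict the most frequently observed successor of `prev`."""
    if prev not in counts:
        return None  # never saw this button before
    return counts[prev].most_common(1)[0][0]

# Hypothetical log of button presses captured from a remote control.
log = ["power", "guide", "ch_up", "ch_up", "vol_up", "ch_up", "vol_up"]
model = train(log)
print(predict_next(model, "ch_up"))  # -> "vol_up" (it followed ch_up twice)
```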

00:11:15 This comes up again and again in my work. If you try to represent people or objects by the things

00:11:21 they do, the things you can measure about them that have to do with action in the world. So

00:11:26 distribution over actions, and you try to represent them by the distribution of actions that are done

00:11:32 on them, then you do a pretty good job of sort of understanding how people are and they cluster

00:11:38 remarkably well, in fact, irritatingly so. And so by clustering people this way,

00:11:44 you can maybe, you know, I got the 93% accuracy of what's the next button you're going to press,

00:11:49 but I can get 99% accuracy or somewhere thereabouts on the collections of things you might

00:11:54 press. And it turns out the things that you might press are all related to each other in

00:11:58 exactly the way you would expect. So for example, all the numbers on a keypad, it turns out

00:12:04 all have the same behavior with respect to you as a human being. And so you would naturally cluster

00:12:08 them together and you discover that numbers are all related to one another in some way and all

00:12:14 these other things. And here's the part that I think is important. I mean, you can see

00:12:18 this in all kinds of things. Every individual is different, but any given individual is remarkably

00:12:25 predictable because you keep doing the same things over and over again. And the two things that I’ve

00:12:29 learned in the long time that I’ve been thinking about this is people are easily predictable and

00:12:34 people hate when you tell them that they’re easily predictable, but they are. And there you go.
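(A toy version of that clustering idea, representing each button by the distribution of what gets pressed immediately after it, and clustering those distributions; the matrix below is fabricated, and the two-cluster split is only for illustration.)

```python
import numpy as np
from sklearn.cluster import KMeans

# Rows: buttons. Columns: relative frequency of the next button pressed.
buttons = ["1", "2", "3", "enter", "vol_up", "vol_down"]
next_press = np.array([
    [0.1, 0.2, 0.2, 0.5, 0.0, 0.0],  # digits tend to be followed by digits/enter
    [0.2, 0.1, 0.2, 0.5, 0.0, 0.0],
    [0.2, 0.2, 0.1, 0.5, 0.0, 0.0],
    [0.3, 0.3, 0.3, 0.0, 0.1, 0.0],
    [0.0, 0.0, 0.0, 0.0, 0.6, 0.4],  # volume keys follow each other
    [0.0, 0.0, 0.0, 0.0, 0.4, 0.6],
])
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(next_press)
for button, label in zip(buttons, labels):
    print(button, "-> cluster", label)  # digits land together; volume keys land together
```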

00:12:39 Yeah. What about, let me play devil’s advocate and philosophically speaking, is it possible to

00:12:46 say that what defines humans is the outlier. So even though many, some large percentage of our

00:12:52 behaviors, whatever the signal we measure is the same and it would cluster nicely, but maybe it’s

00:12:57 the special moments when we break out of the routine that are the definitive thing, that

00:13:03 breaking out of the routine is the definitive thing. And the way we break out of that routine

00:13:07 for each one of us might be different. It’s possible. I would say that I would say it a

00:13:11 little differently. I think I would make two points. One is, I'm going to disagree with

00:13:15 the premise, I think, but that’s fine. I think the way I would put it is there are people who

00:13:22 are very different from lots of other people, but they’re not 0%, they’re closer to 10%, right? So

00:13:28 in fact, even if you do this kind of clustering of people, that'll turn out to be a small number, and

00:13:31 they all behave like each other, even if they individually behave very differently from everyone

00:13:36 else. So I think that’s kind of important. But what you’re really asking, I think, and I think

00:13:40 this is really the question, is: what do you do when you're faced with a situation you've never seen

00:13:46 before? What do you do when you’re faced with an extraordinary situation maybe you’ve seen others

00:13:49 do and you’re actually forced to do something and you react to that very differently. And that is

00:13:53 the thing that makes you human. I would agree with that, at least at a philosophical level, that it’s

00:13:56 the times when you are faced with something difficult, a decision that you have to make

00:14:04 where the answer isn’t easy, even if you know what the right answer is, that’s sort of what defines

00:14:08 you as the individual. And I think what defines people broadly, it’s the hard problem. It’s not

00:14:12 the easy problem. It’s the thing that’s going to hurt you. It’s not even that it’s difficult. It’s

00:14:17 just that you know that the outcome is going to be highly suboptimal for you. And I do think that

00:14:22 that’s a reasonable place to start for the question of what makes us human. So before we talk about

00:14:29 sort of explore the different ideas underlying interactive artificial intelligence, which you

00:14:33 are working on, let me just go along this thread to skip to kind of our world of social media,

00:14:39 which is something that at least on the artificial intelligence side you think about.

00:14:44 There’s a popular narrative, I don’t know if it’s true, but that we have these silos in social

00:14:51 media and we have these clusterings, as you’re kind of mentioning. And the idea is that, you know,

00:14:58 along that narrative is that, you know, we want to break each other out of those silos

00:15:06 so we can be empathetic to other people: if you're a Democrat, you'd be empathetic to the

00:15:12 Republican. If you're a Republican, you'd be empathetic to the Democrat. Those are just two silly bins that we

00:15:17 seem to be very excited about, but there’s other binnings that we can think about. Is there, from

00:15:24 an artificial intelligence perspective, because you’re just saying we cluster along the data,

00:15:29 but then interactive artificial intelligence is referring to throwing agents into that mix,

00:15:35 AI systems in that mix, helping us, interacting with us humans, and maybe getting us out of that

00:15:41 mix, maybe getting us out of those silos. Is that something that you think is possible? Do you see

00:15:48 a hopeful possibility for artificial intelligence systems in these large networks of people to get

00:15:56 us outside of our habits in at least the idea space to where we can sort of be empathetic to

00:16:04 other people’s lived experiences, other people’s points of view, you know, all that kind of stuff?

00:16:11 Yes, and I actually don’t think it’s that hard. Well, it’s not hard in this sense. So imagine that

00:16:16 you can, let’s just, let’s make life simple for a minute. Let’s assume that you can do a kind of

00:16:22 partial ordering over ideas or clusterings of behavior. It doesn’t even matter what I mean here,

00:16:28 so long as there’s some way that this is a cluster, this is a cluster, there’s some edge

00:16:31 between them, right? And this is kind of, they don’t quite touch even, or maybe they come very

00:16:35 close. If you can imagine that conceptually, then the way you get from here to here is not by going

00:16:40 from here to here. The way you get from here to here is you find the edge and you move slowly

00:16:43 together, right? And I think that machines are actually very good at that sort of thing once we

00:16:47 can kind of define the problem, either in terms of behavior or ideas or words or whatever. So it’s

00:16:51 easy in the sense that if you already have the network and you know the relationships, you know,

00:16:55 the edges and sort of the strengths on them and you kind of have some semantic meaning for them,

00:17:00 the machine doesn’t have to, you do as the designer, then yeah, I think you can kind of move

00:17:04 people along and sort of expand them. But it’s harder than that. And the reason it’s harder than

00:17:08 that, or sort of coming up with the network structure itself is hard, is because I’m gonna

00:17:13 tell you a story that someone else told me, and I may get some of the details a little bit

00:17:18 wrong, but it roughly goes like this. You take two sets of people from the same

00:17:24 backgrounds and you want them to solve a problem. So you separate them up, which we do all the time,

00:17:29 right? Oh, you know, we're gonna break out into groups. You're gonna go

00:17:32 over there and you’re gonna talk about this. You’re gonna go over there and you’re gonna talk

00:17:34 about this. And then you have them sort of in this big room, but far apart from one another,

00:17:38 and you have them sort of interact with one another. When they come back to talk about what

00:17:43 they learn, you want to merge what they’ve done together. It can be extremely hard because they

00:17:47 don’t, they basically don’t speak the same language anymore. Like when you create these problems and

00:17:51 you dive into them, you create your own language. So the example this one person gave me, which I

00:17:56 found kind of interesting because we were in the middle of that at the time, was

00:17:58 they’re sitting over there and they’re talking about these rooms that you can see, but you’re

00:18:03 seeing them from different vantage points, depending on what side of the room you’re on.

00:18:06 They can see a clock very easily. And so they start referring to the room as the one with the clock.

00:18:12 This group over here, looking at the same room, they can see the clock, but it’s, you know,

00:18:16 not in their line of sight or whatever. So they end up referring to it by some other way.

00:18:22 When they get back together and they’re talking about things, they’re referring to the same room

00:18:26 and they don't even realize they're referring to the same room. And in fact, this group doesn't

00:18:29 even see that there's a clock there, and this group doesn't see whatever the other group saw. The clock on the wall is

00:18:32 the thing that stuck with me. So if you create these different silos, the problem isn't that

00:18:36 the ideologies disagree. It’s that you’re using the same words and they mean radically different

00:18:41 things. The hard part is just getting them to agree on the, well, maybe we’d say the axioms in

00:18:47 our world, right? But you know, just get them to agree on some basic definitions because right now

00:18:52 they talk, they’re talking past each other, just completely talking past each other. That’s the

00:18:56 hard part, getting them to meet, getting them to interact. That may not be that difficult. Getting

00:19:01 them to see where their language is leading them past one another. That's the hard

00:19:06 part. It’s a really interesting question to me. It could be on the layer of language, but it feels

00:19:10 like there's multiple layers to this. Like it could be worldview. It could be, I mean, it all boils

00:19:14 down to empathy, being able to put yourself in the shoes of the other person, to learn the language, to

00:19:20 learn, like, visually how they see the world. I mean, I experience this now

00:19:28 with trolls, the degree of humor in that world. For example, I talk about love a lot.

00:19:33 I’m very like, I’m really lucky to have this amazing community of loving people. But whenever I

00:19:39 encounter trolls, they always roll their eyes at the idea of love because it’s so quote unquote

00:19:45 cringe. So they show love by derision, I would say. And I think about, on the

00:19:56 human level, that's a whole other discussion. That's psychology, that's sociology, and so on. But

00:20:00 I wonder if AI systems can help somehow to bridge the gap of, what is this person's life like?

00:20:10 Encourage me to just ask that question, to put myself in their shoes, to experience the agitations,

00:20:16 the fears, the hopes they have; to even just think about what

00:20:23 their upbringing was like, like having a single-parent home or a shitty education or all those

00:20:32 kinds of things, just to put myself in that mind space. It feels like that's really important.

00:20:37 For us to bring those clusters together, to find that similar language. But it's unclear

00:20:43 how AI can help that because it seems like AI systems need to understand both parties first.

00:20:48 So, you know, the word understand there is doing a lot of work, right?

00:20:51 Yes.

00:20:52 So do you have to understand it or do you just simply have to note that there is something

00:20:57 similar as a point to touch, right? So, you know, you use the word empathy and I like that word,

00:21:04 for a lot of reasons. I think you're right in the way that you're using it, in the ways you're describing,

00:21:07 but let’s separate it from sympathy, right? So, you know, sympathy is feeling sort of for someone,

00:21:13 empathy is kind of understanding where they’re coming from and how they, how they feel, right?

00:21:16 And for most people, those things go hand in hand. But some people are very good at empathy

00:21:22 and very, very bad at sympathy. My observation would be,

00:21:28 I'm not a psychologist, but my observation would be that some people seem to be very, very,

00:21:32 very bad at sympathy, and that some people seem incapable of feeling sympathy

00:21:37 unless they feel empathy first. You can understand someone, understand where they're coming from, and

00:21:42 still think, no, I can't support that, right? Because if that

00:21:48 isn't the case, then what it requires is that the only way you can

00:21:54 understand someone means you must agree with everything that they do, which isn't right, right?

00:21:59 And if the only way I can feel for someone is to completely understand them and make them like me in some way, well,

00:22:06 then we're lost, right? Because we're not all exactly like each other. I don't have to understand

00:22:10 everything that you've gone through. It helps, clearly, but they're separable ideas, right?

00:22:14 Even though they get clearly, clearly tangled up in one another. So what I think AI could help you

00:22:18 do actually is, and you know, I'm being quite fanciful as it were, but if you

00:22:23 think of it as kind of: I understand how you interact, the words that you use, you know,

00:22:28 the actions you take. I have some way of doing this; let's not worry about what that is.

00:22:31 But I can see you as a kind of distribution of experiences and actions taken upon you,

00:22:36 things you’ve done and so on. And I can do this with someone else and I can find the places where

00:22:41 there's some kind of commonality, a mapping as it were, even if it's not total. You know,

00:22:46 if I think of these as distributions, right, then, you know, I can take the cosine of the angle

00:22:50 between you two, and if it's zero, you've got nothing in common. If it's one,

00:22:54 you’re completely the same person. Well, you know, you’re probably not one. You’re almost certainly

00:22:59 not zero. I can find the place where there’s the overlap, then I might be able to introduce you on

00:23:03 that basis, or connect you in that way, and make it easier for you to take that

00:23:09 step of empathy. It's not impossible to do.
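(A minimal sketch of that cosine idea, assuming we already have two people represented as distributions over some shared vocabulary of actions or experiences; the vectors here are invented.)

```python
import numpy as np

def cosine(u, v):
    """Cosine of the angle between two vectors: 0 = nothing in common, 1 = same direction."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Hypothetical distributions over four shared topics/activities.
alice = np.array([0.4, 0.3, 0.0, 0.3])
bob   = np.array([0.0, 0.5, 0.4, 0.1])

sim = cosine(alice, bob)           # somewhere strictly between 0 and 1
overlap = np.minimum(alice, bob)   # where the two distributions already meet
print(sim, int(overlap.argmax()))  # introduce them on the strongest shared dimension
```

Although I wonder if it requires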

00:23:17 that everyone involved is at least interested in asking the question. So maybe the hard part

00:23:22 is just getting them interested in asking the question. In fact, maybe if you can get them to

00:23:24 ask the question, how are we more alike than we are different, they’ll solve it themselves. Maybe

00:23:28 that’s the problem that AI should be working on, not telling you how you’re similar or different,

00:23:32 but just getting you to decide that it’s worthwhile asking the question. It feels like an economist’s

00:23:37 answer actually. Well, okay, first of all, people would disagree. So let me disagree

00:23:43 slightly, which is, I think everything you said is brilliant, but I tend to believe philosophically

00:23:49 speaking that people are interested underneath it all. And I would say that the possibility

00:23:57 that an AI system would show the commonality is incredible. That's a really good starting point.

00:24:02 I would say, if on social media I could discover the common things, deep or shallow, between

00:24:10 me and a person there's tension with, I think that my basic human nature would take over from

00:24:17 there. And I think I'd enjoy that commonality. And there's something sticky about that, that

00:24:25 my mind will linger on, and that person in my mind will become warmer and warmer. And I'll

00:24:31 start to feel more and more compassion towards them. I think for the majority of the

00:24:35 population, that’s true, but that might be, that’s a hypothesis. Yeah. I mean, it’s an empirical

00:24:40 question, right? You’d have to figure it out. I mean, I want to believe you’re right. And so I’m

00:24:44 going to say that I think you’re right. Of course, some people come to those things for the purpose

00:24:50 of trolling, right? And it doesn't matter; they're playing a different game. Yeah. But I

00:24:55 don't know. You know, my experience is it requires two things. In fact, maybe

00:25:00 this is really, in the end, what you're saying, and I do agree with this for sure.

00:25:06 It's hard to hold onto that kind of anger, or to hold onto just the desire to humiliate someone

00:25:14 for that long. It's just difficult to do. It takes a toll on you. But more importantly,

00:25:18 we know this both from people having done studies on it, but also from our own experiences,

00:25:23 that it is much easier to be dismissive of a person if they’re not in front of you,

00:25:27 if they’re not real, right? So much of the history of the world is about making people other, right?

00:25:35 So if you're on social media, if you're on the web, if you're doing whatever on the internet,

00:25:38 but being forced to deal with someone as a person, some equivalent to being in the same room,

00:25:45 makes a huge difference. Cause then you’re one, you’re forced to deal with their humanity because

00:25:49 it’s in front of you. The other is of course that, you know, they might punch you in the face

00:25:52 if you go too far. So, you know, both of those things kind of work together, I think,

00:25:56 to the right end. So I think bringing people together is really a kind of substitute for

00:26:03 forcing them to see the humanity in another person and to not be able to treat them as bits,

00:26:07 it’s hard to troll someone when you’re looking them in the eye. This is very difficult to do.

00:26:12 Agreed. Your broad set of research interests fall under interactive AI, as I mentioned,

00:26:18 which is a fascinating set of ideas and you have some concrete things that you’re

00:26:23 particularly interested in, but maybe could you talk about how you think about the field of

00:26:30 interactive artificial intelligence? Sure. So let me say upfront that if you look at,

00:26:34 certainly my early work, but even if you look at most of it, I’m a machine learning guy,

00:26:39 right? I do machine learning. First paper ever published, it was in NIPS. Back then it was

00:26:43 NIPS. Now it’s NeurIPS. It’s a long story there. Anyway, that’s another thing. But so,

00:26:48 so I’m a machine learning guy, right? I believe in data, I believe in statistics and all those

00:26:51 kinds of things. And the reason I’m bringing that up is even though I’m a newfangled statistical

00:26:55 machine learning guy and have been for a very long time, the problem I really care about is AI,

00:27:00 right? I care about artificial intelligence. I care about building some kind of

00:27:04 intelligent artifact. However that gets expressed, that would be at least as intelligent as humans

00:27:12 and as interesting as humans, perhaps in their own way. So that’s the deep underlying love and

00:27:18 dream is the bigger AI. Whatever the heck that is. Yeah. The machine learning in some ways is

00:27:24 a means to the end. It is not the end. And I don’t understand how one could be intelligent

00:27:29 without learning. So therefore I got to figure out how to do that, right? So that’s important.

00:27:32 But machine learning, by the way, is also a tool. I said statistical because that's how most

00:27:37 machine learning people think of themselves. That's how they think. I think Pat Langley

00:27:40 might disagree, or at least 1980s Pat Langley might disagree with what it takes to do machine

00:27:46 learning. But I care about the AI problem, which is why it’s interactive AI, not just interactive

00:27:50 ML. I think it's important to understand that there's a long-term goal here, which I will

00:27:55 probably never live to see, but I would love to have been a part of, which is building something

00:27:59 truly intelligent outside of ourselves. Can we take a tiny tangent? Or am I interrupting?

00:28:04 Which is, is there something concrete you can say about the mysterious gap between

00:28:12 the subset ML and the bigger AI? What’s missing? What do you think? I mean, obviously it’s

00:28:19 totally unknown, not totally, but in part unknown at this time. But is it something like, with Pat

00:28:25 Langley, is it knowledge, like expert-system reasoning kind of thing?

00:28:29 So AI is bigger than ML, but ML is bigger than AI. This is kind of the real problem here,

00:28:35 is that they’re really overlapping things that are really interested in slightly different problems.

00:28:39 I tend to think of ML, and there are many people out there who are going to be very upset at me

00:28:42 about this, but I tend to think of ML being much more concerned with the engineering of solving a

00:28:45 problem, and AI about the sort of more philosophical goal of true intelligence. And that’s the thing

00:28:50 that motivates me, even if I end up finding myself living in this kind of engineering ish space,

00:28:56 I’ve now made Michael Jordan upset. But to me, they just feel very different. You’re just

00:29:02 measuring them differently, your goals of where you’re trying to be are somewhat different.

00:29:07 But to me, AI is about trying to build that intelligent thing. And typically, but not always,

00:29:13 for the purpose of understanding ourselves a little bit better. Machine learning is, I think,

00:29:17 trying to solve the problem, whatever that problem is. Now, that’s my take. Others, of course,

00:29:22 would disagree. So on that note, so with the interactive AI, do you tend to, in your mind,

00:29:28 visualize AI as a singular system, or is it as a collective huge amount of systems interacting

00:29:33 with each other? Like, is the social interaction of us humans and of AI systems the fundamental

00:29:40 to intelligence? I think, well, it’s certainly fundamental to our kind of intelligence, right?

00:29:44 And I actually think it matters quite a bit. So the reason the interactive AI part matters to me

00:29:50 is because I don’t, this is going to sound simple, but I don’t care whether a tree makes a sound

00:30:00 when it falls and there’s no one around, because I don’t think it matters, right? If there’s no

00:30:04 observer in some sense. And I think what’s interesting about the way that we’re intelligent

00:30:08 is we’re intelligent with other people, right? Or other things anyway. And we go out of our way to

00:30:13 make other things intelligent. We're hardwired to find intention, even where there is no intention,

00:30:18 which is why we anthropomorphize everything. I think the interactive AI part is: being intelligent in and

00:30:25 of myself in isolation is a meaningless act in some sense. The correct answer is you have to

00:30:31 be intelligent in the way that you interact with others. It’s also efficient because it allows you

00:30:35 to learn faster because you can import from past history. It also allows you to be efficient in

00:30:41 the transmission of that. So we ask ourselves about me. Am I intelligent? Clearly, I think so.

00:30:47 But I’m also intelligent as a part of a larger species and group of people, and we’re trying to

00:30:51 move the species forward as well. And so I think that notion of being intelligent with others is

00:30:56 kind of the key thing because otherwise you come and you go, and then it doesn’t matter. And so

00:31:01 that’s why I care about that aspect of it. And it has lots of other implications. One is not just

00:31:08 building something intelligent with others, but understanding that you can’t always communicate

00:31:12 with those others. They may have been in a room where there's a clock on the wall that you haven't seen,

00:31:16 which means you have to spend an enormous amount of time communicating with one another constantly

00:31:20 in order to figure out what each other wants. So this is why people project, right? You project your

00:31:26 own intentions and your own reasons for doing things on the others as a way of understanding

00:31:30 them so that you know how to behave. But by the way, you, completely predictable person,

00:31:35 I don’t know how you’re predictable. I don’t know you well enough, but you probably eat the same five

00:31:39 things over and over again or whatever it is that you do, right? I know I do. If I’m going to a new

00:31:43 Chinese restaurant, I will get General Tso's chicken because that's the thing that's easy.

00:31:47 I will get hot and sour soup. People do the things that they do, but other people get the chicken and

00:31:52 broccoli. I can push this analogy way too far. The chicken and broccoli. I don’t know what’s

00:31:56 wrong with those people. I don’t know what’s wrong with them either. We have all had our trauma.

00:32:02 So they get their chicken and broccoli and their egg drop soup or whatever. We got to communicate and

00:32:06 it’s going to change, right? So interactive AI is not just about learning to solve a problem or a

00:32:13 task. It’s about having to adapt that over time, over a very long period of time and interacting

00:32:18 with other people who will themselves change. This is what we mean about things like adaptable

00:32:22 models, right? That you have to have a model. That model is going to change. And by the way,

00:32:25 it’s not just the case that you’re different from that person, but you’re different from the person

00:32:28 you were 15 minutes ago or certainly 15 years ago. And I have to assume that you’re at least going

00:32:33 to drift, hopefully not too many discontinuities, but you’re going to drift over time. And I have

00:32:38 to have some mechanism for adapting to that as you and an individual over time and across individuals

00:32:45 over time. On the topic of adaptive modeling, you talk about lifelong learning, which is,

00:32:51 I think, a topic that's understudied, maybe because nobody knows what to do with it. But like,

00:32:59 you know, if you look at Alexa or most of our artificial intelligence systems that are primarily

00:33:04 machine learning based systems or dialogue systems, all those kinds of things, they know very little

00:33:10 about you in the lifelong learning sense, the sense in which we learn as humans. We learn a lot about

00:33:21 each other, not in the quantity of facts, but in the temporally rich set of information that seems

00:33:31 to pick up the crumbs along the way, that somehow seems to capture a person pretty well.

00:33:36 Do you have any ideas how to do lifelong learning? Because it seems like most of the machine learning

00:33:44 community does not. No, well, by the way, not only does the machine learning community not spend a

00:33:48 lot of time on lifelong learning, I don’t think they spend a lot of time on learning period in

00:33:52 the sense that they tend to be very task focused. Everybody is overfitting to whatever problem it is

00:33:57 they happen to have. They're overengineering their solutions to the task. Even the people,

00:34:01 and I think these people too, are trying to solve a hard problem of transfer learning, right? I’m

00:34:06 going to learn on one task and learn another task. You still end up creating the task. It’s like

00:34:10 looking for your keys where the light is because that’s where the light is, right? It’s not because

00:34:13 the keys have to be there. I mean, one could argue that we tend to do this in general. As a group,

00:34:20 we tend to hill climb and get stuck in local optima. I think we do this in the small as well.

00:34:26 I think it’s very hard to do. Here’s the hard thing about AI. The hard thing about AI is it

00:34:32 keeps changing on us, right? What is AI? AI is the art and science of making computers act the way

00:34:38 they do in the movies, right? That’s what it is, right? But beyond that. They keep coming up with

00:34:44 new movies. Yes. Right, exactly. We are driven by this kind of need to capture the ineffable quality of who

00:34:52 we are, which means that the moment you understand something, it's no longer AI, right? Well, we

00:34:57 understand this. That’s just you take the derivative and you divide by two and then you average it out

00:35:01 over time in the window. Therefore, that’s no longer AI. The problem is unsolvable because

00:35:05 it keeps kind of going away. This creates a kind of illusion, which I don’t think is an entire

00:35:08 illusion, that either there are very simple task-based things you can do very well and overengineer,

00:35:13 or there's all of AI, and there's nothing in the middle. It's very hard to get from here to here,

00:35:18 and it’s very hard to see how to get from here to here. I don’t think that we’ve done

00:35:23 a very good job of it because we get stuck trying to solve the small problems in front of it,

00:35:27 myself included. I’m not going to pretend that I’m better at this than anyone else. Of course,

00:35:31 all the incentives in academia and in industry are set to make that very hard because you have

00:35:37 to get the next paper out, you have to get the next product out, you have to solve this problem,

00:35:41 and it’s very sort of naturally incremental. None of the incentives are set up to allow you to take

00:35:47 a huge risk unless you’re already so well established you can take that big risk.

00:35:53 If you’re that well established that you can take that big risk, then you’ve probably spent

00:35:57 much of your career taking these little risks, relatively speaking, and so you have got a

00:36:01 lifetime of experience telling you not to take that particular big risk. So the whole system’s

00:36:05 set up to make progress very slow. That’s fine. It’s just the way it is, but it does make this

00:36:10 gap seem really big, which is my long way of saying I don’t have a great answer to it except

00:36:14 that: stop doing n equals one. At least try to get n equals two and maybe n equals seven so that you

00:36:21 can say I’m going to, or maybe t is a better variable here. I’m going to not just solve this

00:36:25 problem and solve this problem and another problem. I’m not going to learn just on you.

00:36:28 I’m going to keep living out there in the world and just seeing what happens and that we’ll learn

00:36:32 something as designers and our machine learning algorithms and our AI algorithms can learn as

00:36:36 well. But unless you’re willing to build a system which you’re going to have live for months at a

00:36:41 time in an environment that is messy and chaotic, that you cannot control, then you're never going to

00:36:47 make progress in that direction. So I guess my answer to you is yes. My idea is that you should,

00:36:51 it’s not no, it’s yes. You should be deploying these things and making them live for a month

00:36:57 at a time and be okay with the fact that it’s going to take you five years to do this. Not

00:37:02 rerunning the same experiment over and over again and refining the machine so it’s slightly better

00:37:06 at whatever, but actually having it out there and living in the chaos of the world and seeing what

00:37:12 its learning algorithms can learn, what data structures it can build, and how it can go from

00:37:16 there. Without that, you’re going to be stuck all the time. What do you think about the possibility

00:37:22 of N equals one growing, it's probably a crude approximation, but growing, like if you look at

00:37:28 language models like GPT-3, if you just make it big enough, it'll swallow the world. Meaning like

00:37:35 it'll solve all your T to infinity by just growing in size, taking the small overengineered

00:37:43 solution and just pumping it full of steroids in terms of compute, in terms of size of training

00:37:49 data, and the Yann LeCun-style self-supervised, or OpenAI self-supervised. Just throw all of YouTube

00:37:56 at it and it will learn how to reason, how to paint, how to create music, how to love all that

00:38:04 by watching YouTube videos. I mean, I can’t think of a more terrifying world to live in than a world

00:38:08 that is based on YouTube videos, but yeah, I just kind of don't think that'll

00:38:14 quite work, it won't work that easily. You will get somewhere and you will learn something, which

00:38:20 means it’s probably worth it, but you won’t get there. You won’t solve the problem. Here’s the

00:38:25 thing, we build these things and we say we want them to learn, but what actually happens, and

00:38:31 let’s say they do learn, I mean, certainly every paper I’ve gotten published, the things learn,

00:38:35 I don’t know about anyone else, but they actually change us, right? We react to it differently,

00:38:40 right? So we keep redefining what it means to be successful, both in the negative in the case,

00:38:45 but also in the positive in that, oh, well, this is an accomplishment. I’ll give you an example,

00:38:51 which is like the one you just described with YouTube. Let’s get completely out of machine

00:38:55 learning. Well, not completely, but mostly out of machine learning. Think about Google.

00:38:59 People were trying to solve information retrieval, the ad hoc information retrieval

00:39:03 problem forever. I mean, first major book I ever read about it was what, 71, I think it was when

00:39:09 it came out. Anyway, we’ll treat everything as a vector and we’ll do these vector space models

00:39:14 and whatever. And that was all great. And we made very little progress. I mean, we made some progress

00:39:20 and then Google comes and makes the ad hoc problem seem pretty easy. I mean, it’s not,

00:39:24 there’s lots of computers and databases involved, and there’s some brilliant algorithmic stuff

00:39:29 behind it too, and some systems building. But the problem changed, right?

00:39:37 If you’ve got a world that’s that connected so that you have, you know, there are 10 million

00:39:42 answers quite literally to the question that you’re asking, then the problem wasn’t give me

00:39:48 the things that are relevant. The problem is don’t give me anything that’s irrelevant, at least in

00:39:52 the first page, because nothing else matters. So Google is not solving the information retrieval

00:39:59 problem, at least not on this webpage. Google is minimizing false positives, which is not the same

00:40:06 thing as getting an answer. It turns out it’s good enough for what it is we want to use Google for,

00:40:11 but it also changes what the problem was we thought we were trying to solve in the first place.
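(A toy illustration of that reframing, in terms of precision at the top of the ranking, i.e., few false positives on page one, versus recall over everything relevant; the documents and relevance labels are invented.)

```python
def precision_at_k(ranked, relevant, k):
    """Fraction of the top-k results that are relevant."""
    return sum(1 for d in ranked[:k] if d in relevant) / k

def recall_at_k(ranked, relevant, k):
    """Fraction of all relevant documents that appear in the top k."""
    return sum(1 for d in ranked[:k] if d in relevant) / len(relevant)

ranked = ["d3", "d7", "d1", "d9", "d2", "d8"]    # the engine's ordering
relevant = {"d3", "d7", "d1", "d4", "d5", "d6"}  # imagine 10 million of these

print(precision_at_k(ranked, relevant, 3))  # 1.0: page one looks perfect
print(recall_at_k(ranked, relevant, 3))     # 0.5: half the relevant documents
                                            # are missed, and nobody notices
```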

00:40:15 You thought you were trying to find an answer, but you’re not, or you’re trying to find the answer,

00:40:18 but it turns out you’re just trying to find an answer. Now, yes, it is true. It’s also very good

00:40:22 at finding you exactly that webpage. Of course, you trained yourself to figure out what the keywords

00:40:26 were to get you that webpage. But in the end, by having that much data, you’ve just changed the

00:40:32 problem into something else. You haven’t actually learned what you set out to learn. Now, the

00:40:35 counter to that would be maybe we’re not doing that either. We just think we are because, you know,

00:40:40 we’re in our own heads. Maybe we’re learning the wrong problem in the first place, but I don’t

00:40:44 think that matters. I think the point is that Google has not solved information retrieval.

00:40:49 Google has done amazing service. I have nothing bad to say about what they’ve done. Lord knows

00:40:53 my entire life is better because Google exists. Without Google Maps, I don't think I would've ever found

00:40:57 this place. Where is this? I see 110 and I see, where did 95 go? So I'm very grateful for Google,

00:41:07 but they just have to make certain the first five things are right,

00:41:11 and everything after that can be wrong. Look, we're going off on a totally different topic here, but

00:41:17 think about the way we hire faculty. It’s exactly the same thing.

00:41:21 Getting controversial. It's exactly the same problem, right? It's

00:41:27 minimizing false positives. We say things like we want to find the best person to be an assistant

00:41:34 professor at MIT in the new College of Computing, which I will point out was founded 30 years after

00:41:42 the College of Computing I'm a part of, at my alma mater. I'm just saying. I appreciate all

00:41:49 that they did and all that they're doing. Anyway, so we're going to try to hire the best professor.

00:41:57 That’s what we say, the best person for this job, but that’s not what we do at all, right?

00:42:02 Do you know what percentage of faculty in the top four earn their PhDs from the top four,

00:42:08 say in 2017, which is the most recent year for which I have data?

00:42:12 Maybe a large percentage.

00:42:14 About 60%.

00:42:15 60.

00:42:15 60% of the faculty in the top four earn their PhDs in the top four. This is computer science,

00:42:19 for which there is no top five. There’s only a top four, right? Because they’re all tied for one.

00:42:23 For people who don't know, by the way, that would be MIT, Stanford, Berkeley, CMU.

00:42:27 Yep.

00:42:29 Georgia Tech is number eight.

00:42:31 Number eight. You’re keeping track.

00:42:34 Oh yes. It's a large part of my job. Number five is Illinois. Number six is a tie between

00:42:39 UW and Cornell. Princeton and Georgia Tech are tied for eight, and UT Austin is number 10.

00:42:43 Michigan is number 11, by the way. So if you look at the top 10, you know what percentage of

00:42:50 faculty in the top 10 earn their PhDs from the top 10? 65, roughly. 65%.

00:42:56 If you look at the top 55 ranked departments, 50% of the faculty earn their PhDs from the top 10.

00:43:04 There is no universe in which all the best faculty, even just for R1 universities,

00:43:11 the majority of them come from 10 places. There’s no way that’s true, especially when you consider

00:43:16 how small some of those universities are in terms of the number of PhDs they produce.

00:43:20 Yeah.

00:43:20 Now that’s not a negative. I mean, it is a negative. It also has a habit of entrenching

00:43:24 certain historical inequities and accidents. But what it tells you is, well, ask yourself the

00:43:32 question, why is it like that? Well, because it’s easier. If we go all the way back to the 1980s,

00:43:40 you know, there was a saying that nobody ever lost his job buying a computer from IBM,

00:43:45 and it was true. And nobody ever lost their job hiring a PhD from MIT, right? If the person turned

00:43:52 out to be terrible, well, you know, they came from MIT, what did you expect me to do?

00:43:55 However, that same person coming from pick whichever is your least favorite place that

00:44:01 produces PhDs in, say, computer science, well, you took a risk, right? So all the incentives,

00:44:07 particularly because you’re only going to hire one this year, well, now we’re hiring 10,

00:44:10 but you know, you’re only going to have one or two or three this year. And by the way,

00:44:13 when they come in, you’re stuck with them for at least seven years at most places,

00:44:16 because that’s before you know whether they’re getting tenure or not. And if they get tenure,

00:44:19 stuck with them for a good 30 years, unless they decide to leave. That means the pressure to get

00:44:22 this right is very high. So what are you going to do? You’re going to minimize false positives.

00:44:27 You don’t care about saying no inappropriately. You only care about saying yes inappropriately.

00:44:33 So all the pressure drives you into that particular direction. Google,

00:44:36 not to put too fine a point on it, was in exactly the same situation with their search. It turns out

00:44:41 you just don’t want to give people the wrong page in the first three or four pages. And if there’s

00:44:46 10 million right answers and 100 bazillion wrong answers, just make certain the wrong answers

00:44:50 don't get up there. And who cares if the right answer was actually on the 13th page? A right

00:44:55 answer, a satisficing answer, is number one, two, three, or four. So who cares?

00:44:59 Or an answer that will make you discover something beautiful, profound to your question.

00:45:04 Well, that’s a different problem, right?

00:45:06 But isn't that the problem? Can we linger on this topic while, sort of, walking with grace?

00:45:15 For hiring faculty, how do we get that 13th page with a truly special person?

00:45:25 I mean, it depends on the department. Computer science probably has those,

00:45:29 those kinds of people. Like you have the Russian guy, Grigori Perelman, like just these awkward,

00:45:38 strange minds that don’t know how to play the little game of etiquette that faculty have all

00:45:46 somehow agreed on, like converged on over the decades, how to play with each other. And also is not,

00:45:51 you know, on top of that, is not from the top four, or top whatever number, of schools. And maybe

00:45:58 actually just says a fuck you every once in a while to the traditions of old within the computer science

00:46:05 community. Maybe talks trash about machine learning, that it's a total waste of time. And that's

00:46:11 there on their resume. So like how do you allow the system to give those folks a chance?

00:46:19 Well, you have to be willing to take a certain kind of risk. Without taking a particular position

00:46:22 on any particular person, you have to be willing to take risk, right? A small

00:46:26 amount of it. I mean, if we were treating this as a, well, as a machine learning problem, right?

00:46:31 There’s a search problem, which is what it is. It’s a search problem. If we were treating it that

00:46:34 way, you would say, oh, well, the main thing is you want, you know, you’ve got a prior,

00:46:37 you want some data because I’m Bayesian. If you don’t want to do it that way,

00:46:41 we’ll just inject some randomness in and it’ll be okay. The problem is that feels very,

00:46:46 very hard to do with people. All the incentives are wrong there. But it turns out, and let’s say,

00:46:53 let’s say that’s the right answer. Let’s just give for the sake of argument that, you know,

00:46:57 injecting randomness in the system at that level for who you hire is just not worth doing because

00:47:02 the price is too high or the cost is too high. If we had infinite resources, sure, but we don’t.
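For what it's worth, the "inject some randomness" move he grants for the sake of argument is essentially epsilon-greedy exploration; a minimal sketch in Python, with made-up candidates:

    import random

    # Epsilon-greedy selection: mostly pick the top-scored candidate (exploit),
    # but with small probability epsilon pick uniformly at random (explore),
    # which gives the "13th page" candidate a nonzero chance of being seen.

    def epsilon_greedy_pick(scored, epsilon=0.1, rng=random):
        if rng.random() < epsilon:
            return rng.choice(scored)           # explore
        return max(scored, key=lambda c: c[1])  # exploit

    candidates = [("conventional", 0.9), ("solid", 0.8), ("unusual", 0.3)]
    picks = [epsilon_greedy_pick(candidates)[0] for _ in range(1000)]
    print({name: picks.count(name) for name, _ in candidates})

The cost he's pointing at is exactly the exploration term: with people rather than bandit arms, every exploratory draw is a seven-year commitment.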

00:47:05 And also you’ve got to teach people. So, you know, you’re ruining other people’s lives if you get it

00:47:09 too wrong. But we’ve taken that principle, even if I grant it, and pushed it all the way back, right?

00:47:17 So, we could have a better pool than we have of people we look at and give an opportunity to.

00:47:25 If we do that, then we have a better chance of finding that. Of course, that just pushes the

00:47:29 problem back another level. But let me tell you something else. You know, I did a sort of study,

00:47:34 I call it a study. I called up eight of my friends and asked them for all of their data for

00:47:37 graduate admissions. But then someone else followed up and did an actual study. And it

00:47:41 turns out that I can tell you how everybody gets into grad school more or less, more or less.

00:47:47 You basically admit everyone from places higher ranked than you. You admit most people from

00:47:51 places ranked around you. And you admit almost no one from places ranked below you, with the

00:47:54 exception of the small liberal arts colleges that aren’t ranked at all, like Harvey Mudd,

00:47:58 because they don’t have a PhD, so they aren’t ranked. This is all CS. Which means the decision

00:48:04 of whether you become a professor at Cornell was determined when you were 17, by what you knew

00:48:13 about where to go to undergrad, or whatever. So if we can push these things back a little bit and just make the

00:48:18 pool a little bit bigger, at least you raise the probability that you will be able to see someone

00:48:22 interesting and take the risk. The other answer to that question, by the way, which you could argue

00:48:29 is the same thing, since adjusting the pool so the probabilities go up is a way of injecting a

00:48:34 little bit of uniform noise in the system, as it were, is that you change your loss function.

00:48:40 You just let yourself be measured by something other than whatever it is that we’re measuring

00:48:44 ourselves by now. I mean, US News and World Report, every time they change their formula

00:48:50 for determining rankings, move entire universities to behave differently, because rankings matter.
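A sketch of the change-your-loss-function point, with made-up numbers: the same raw data about two schools, scored under two different weightings, yields two different orderings, which is why a formula change moves entire universities.

    # Same underlying data, two different "loss functions" (ranking formulas):
    # changing the weights reorders the institutions, so behavior follows the formula.

    schools = {
        "X": {"reputation": 4.8, "outcomes": 3.0},
        "Y": {"reputation": 4.0, "outcomes": 4.9},
    }

    def rank(weights):
        score = lambda s: sum(weights[k] * v for k, v in schools[s].items())
        return sorted(schools, key=score, reverse=True)

    print(rank({"reputation": 1.0, "outcomes": 0.0}))  # ['X', 'Y']
    print(rank({"reputation": 0.4, "outcomes": 0.6}))  # ['Y', 'X']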

00:48:55 Can you talk trash about those rankings for a second? No, I’m joking about talking trash.

00:49:01 I actually, it’s so funny how from my perspective, from a very shallow perspective,

00:49:07 how dogmatic, like how much I trust those rankings. They’re almost ingrained in my head.

00:49:13 I mean, at MIT, everybody kind of, it’s a propagated, mutually agreed upon

00:49:21 idea that those rankings matter. And I don’t think anyone knows what they’re,

00:49:26 like, most people don’t know what they’re based on. And what are they exactly based on? And what

00:49:33 are the flaws in that? Well, so it depends on which rankings you’re talking about. Do you

00:49:38 want to talk about computer science or talk about universities? Computer science, US News,

00:49:42 isn’t that the main one? Yeah, the only one that matters is US News. Nothing else matters.

00:49:46 Sorry, CSRankings.org, but nothing else matters but US News. So US News has a formula that it uses

00:49:52 for many things, but not for computer science because computer science is considered a science,

00:49:57 which is absurd. So the ranking for computer science is 100% reputation. So two people

00:50:06 at each department, it’s not really department, whatever, at each department,

00:50:11 basically rank everybody. Slightly more complicated than that, but whatever, they rank everyone. And

00:50:16 then those things are put together somehow. So that means, how do you improve reputation?

00:50:22 How do you move up and down the space of reputation? Yes, that’s exactly the

00:50:26 question. Twitter? It can help. I can tell you how Georgia Tech did it,

00:50:31 or at least how I think Georgia Tech did it, because Georgia Tech is actually the case to

00:50:35 look at. Not just because I’m at Georgia Tech, but because Georgia Tech is the only computing unit

00:50:40 that was not in the top 20 that has made it into the top 10. It’s also the only one in the last

00:50:45 two decades, I think, that moved up in the top 10, as opposed to having someone else move down.

00:50:53 So we used to be number 10, and then we became number nine because UT Austin went down slightly,

00:50:58 and then we were tied for ninth because that’s how rankings work. And we moved from nine to eight

00:51:02 because our raw score moved up a point. So something about Georgia Tech, computer science,

00:51:10 or computing anyway. I think it’s because we have shown leadership at every crisis level, right? So

00:51:17 we created a college, the first public university to do it, the second university to do it

00:51:21 after CMU, which is number one. I also think it’s no accident that CMU is the largest and we’re,

00:51:26 depending upon how you count and depending on exactly where MIT ends up with its final college

00:51:30 of computing, second or third largest. I don’t think that’s an accident. We’ve been doing this

00:51:33 for a long time. But in the 2000s when there was a crisis about undergraduate education,

00:51:39 Georgia Tech took a big risk and succeeded at rethinking undergrad education in computing.

00:51:45 I think we created these schools at a time when most public universities anyway were afraid to

00:51:49 do it. We did the online master’s, and that mattered because people were trying to figure out what to

00:51:56 do with MOOCs and so on. I think it’s about being observed by your peers and having an impact. So,

00:52:02 I mean, that is what reputation is, right? So the way you move up in the reputation rankings is by

00:52:07 doing something that makes people turn and look at you and say, that’s good. They’re better than I

00:52:12 thought. Beyond that, it’s just inertia and there’s huge hysteresis in the system, right? I mean,

00:52:17 there was this, I can’t remember, this may be apocryphal, but there’s a major or department

00:52:23 that MIT was ranked number one in, and they didn’t have it. It’s just about what you… I don’t know

00:52:28 if that’s true, but someone said that to me anyway. But it’s a thing, right? It’s all about

00:52:33 reputation. Of course, MIT is great because MIT is great. It’s always been great. By the way,

00:52:37 because MIT is great, the best students come, which keeps it being great. I mean,

00:52:42 it’s just a positive feedback loop. It’s not surprising. I don’t think it’s wrong.

00:52:46 Yeah. But it’s almost like a narrative. It doesn’t actually have to be backed by reality. Not to

00:52:53 say anything about MIT, but it does feel like we’re playing in the space of narratives, not the

00:53:01 space of something grounded. One of the surprising things when I showed up at MIT and just all the

00:53:06 students I’ve worked with and all the research I’ve done is they’re the same people as I’ve met

00:53:14 at other places. I mean, what MIT has going for it… Well, MIT has many things going for it. One

00:53:21 of the things MIT has going for it is… Nice logo?

00:53:23 Is nice logo. It’s a lot better than it was when I was here. Nice colors too. Terrible,

00:53:30 terrible name for a mascot. But the thing that MIT has going for it is it really does get the

00:53:35 best students. It just doesn’t get all of the best students. There are many more best students out

00:53:40 there. And the best students want to be here because it’s the best place to be or one of the

00:53:45 best places to be. And it’s a sort of positive feedback. But you said something earlier,

00:53:50 which I think is worth examining for a moment. I forget the word you used. You said,

00:53:56 we’re living in the space of narrative as opposed to something objective.

00:54:00 Narrative is objective. I mean, one could argue that the only thing that we do as humans is

00:54:04 narrative. We just build stories to explain why we do what we do. Someone once asked me,

00:54:08 but wait, then there’s nothing objective. No, it’s a completely objective measure.

00:54:12 It’s an objective measure of the opinions of everybody else. Now, is that physics? I don’t

00:54:19 know. Tell me something you think is actually objective and measurable in a way that makes

00:54:25 sense. Cameras, they don’t… You’re getting me off on something here, but do you know that

00:54:31 cameras, which are just reflecting light and putting it on film, did not work for dark-skinned

00:54:39 people until the 1970s? You know why? Because you were building cameras for the people who were

00:54:45 going to buy cameras, who all, at least in the United States and Western Europe, were relatively

00:54:51 light-skinned. Turns out it took terrible pictures of people who look like me. That got fixed with

00:54:56 better film and whole processes. Do you know why? Because furniture manufacturers wanted to be able

00:55:03 to take pictures of mahogany furniture, right? Because candy manufacturers wanted to be able

00:55:10 to take pictures of chocolate. Now, the reason I bring that up is because you might think that

00:55:16 cameras are objective. They’re just capturing light. No, they’re doing the things that they’re

00:55:23 doing based upon decisions by real human beings to privilege, if I may use that word, some physics

00:55:29 over others, because it’s an engineering problem. There are tradeoffs, right? So I can either worry

00:55:33 about this part of the spectrum or this part of the spectrum. This costs more. That costs less.

00:55:37 This costs the same, but I have more people paying money over here, right? And it turns out that

00:55:41 if a giant conglomerate demands that you do something different and it’s going to involve

00:55:46 all kinds of money for you, suddenly the tradeoffs change, right? And so there you go. I actually

00:55:51 don’t know how I ended up there. Oh, it’s because of this notion of objectiveness, right? So even

00:55:55 the objective isn’t objective because at the end you’ve got to tell a story. You’ve got to make

00:55:58 decisions. You’ve got to make tradeoffs. What else is engineering other than that? So I think that

00:56:03 the rankings capture something. They just don’t necessarily capture what people assume they

00:56:09 capture. You know, just to linger on this idea, why are there not more people who just, like, play

00:56:19 with whatever that narrative is, have fun with it, like, excite the world, whether it’s in

00:56:24 the Carl Sagan style of, like, that calm, sexy voice explaining the stars and all the romantic stuff,

00:56:31 or the Elon Musk, dare I even say Donald Trump, where you’re, like, trolling and shaking up the

00:56:37 system and just saying controversial things. I talked to Lisa Feldman Barrett, who’s a

00:56:43 neuroscientist who just enjoys playing with controversy, who, like, finds the counter-

00:56:50 intuitive ideas in a particular science and throws them out there and sees how they play in

00:56:56 the public discourse. Like why don’t we see more of that? And why doesn’t academia attract an Elon

00:57:02 Musk type? Well, tenure is a powerful thing that allows you to do whatever you want, but getting

00:57:08 tenure typically requires you to be relatively narrow, right? Because people are judging you.

00:57:14 Well, I think the answer is we have told ourselves a story, a narrative, that what

00:57:22 you just described is vulgar. It’s certainly unscientific, right? And it is easy to convince

00:57:30 yourself that in some ways you’re the mathematician, right? The fewer there are in your major,

00:57:37 the more that proves your purity, right? So once you tell yourself that story, then it is

00:57:46 beneath you to do that kind of thing, right? I think that’s wrong. I think that, and by the way,

00:57:51 everyone doesn’t have to do this. Everyone’s not good at it. And not everyone, even if they would be

00:57:54 good at it, would enjoy it. So it’s fine. But I do think you need some diversity in the way that

00:58:00 people choose to relate to the world as academics, because I think the great universities are ones

00:58:09 that engage with the rest of the world. It is a home for public intellectuals. And in 2020,

00:58:15 being a public intellectual probably means being on Twitter. Whereas of course that wasn’t true

00:58:21 20 years ago, because Twitter wasn’t around 20 years ago. And if it was, it wasn’t around in a

00:58:25 meaningful way. I don’t actually know how long Twitter has been around. As I get older, I find

00:58:28 that my notion of time has gotten worse and worse. Like Google really has been around that long?

00:58:33 Anyway, the point is that I think that we sometimes forget that a part of our job is to

00:58:43 impact the people who aren’t in the world that we’re in, and that that’s the point of being at

00:58:47 a great place and being a great person, frankly. There’s an interesting force in terms of public

00:58:51 intellectuals. Forget Twitter, we could look at just online courses that are public facing in

00:58:57 some part. There is a kind of force that pulls you back. Let me just call it out because I don’t

00:59:06 give a damn at this point. There’s a little bit of, all of us have this, but certainly faculty

00:59:12 have this, which is jealousy of whoever’s popular, of being a good communicator, exciting the world

00:59:20 with their science. And of course, when you excite the world with the science, it’s not

00:59:27 peer-reviewed, clean. It all sounds like bullshit. It’s like a TED Talk, and people roll their eyes

00:59:36 and they hate that a TED Talk gets millions of views or something like that. And then everybody

00:59:41 pulls each other back. There’s this force that just kind of, it’s hard to stand out unless you

00:59:46 win a Nobel prize or whatever. It’s only when you get senior enough where you just stop giving a

00:59:53 damn. But just like you said, even when you get tenure, that was always the surprising thing to

00:59:58 me. I have many colleagues and friends who have gotten tenure, but there’s not a switch.

01:00:08 There’s not an F-you-money switch where you’re like, you know what? Now I’m going to be more

01:00:13 bold. It doesn’t, I don’t see it. Well, there’s a reason for that. Tenure isn’t a test. It’s a

01:00:18 training process. It teaches you to behave in a certain way, to think in a certain way, to accept

01:00:24 certain values and to react accordingly. And the better you are at that, the more likely you are to

01:00:28 earn tenure. And by the way, this is not a bad thing. Most things are like that. And I think most

01:00:34 of my colleagues are interested in doing great work and they’re just having impact in the way that

01:00:38 they want to have impact. I do think that as a field, not just as a field, as a profession,

01:00:46 we have a habit of belittling those who are popular, as it were, as if the word itself is a

01:00:56 kind of Scarlet A, right? I think it’s easy to convince yourself, and no one is immune to this,

01:01:06 no one is immune to this, that the people who are better known are better known for bad reasons.

01:01:11 The people who are out there dumbing it down are not being pure to whatever the values and ethos

01:01:19 is for your field. And it’s just very easy to do. Now, having said that, I think that ultimately,

01:01:26 people who are able to be popular and out there and are touching the world and making a difference,

01:01:31 you know, our colleagues do, in fact, appreciate that in the long run. It’s just, you know,

01:01:36 you have to be very good at it or you have to be very interested in pursuing it. And once you get

01:01:40 past a certain level, I think people accept it for what it is. I mean, I don’t know. I’d be really

01:01:45 interested in how Rod Brooks felt about how people were interacting with him when he did Fast,

01:01:50 Cheap, and Out of Control way, way, way back when.

01:01:53 What’s Fast, Cheap, and Out of Control?

01:01:55 It was a documentary that involved four people. I remember nothing about it other than Rod Brooks

01:02:00 was in it and something about naked mole rats. I can’t remember what the other two things were.

01:02:05 It was robots, naked mole rats, and then two others.

01:02:07 By the way, Rod Brooks used to be the head of the artificial intelligence laboratory at MIT

01:02:12 and then launched, I think, iRobot and then Think Robotics, Rethink Robotics.

01:02:18 Yes.

01:02:18 Think is in the word. And he also has a little bit of a rock star personality in the AI world,

01:02:27 very opinionated, very intelligent. Anyway, sorry, mole rats and naked.

01:02:32 Naked mole rats. Also, he was one of my two advisors for my PhD.

01:02:36 This explains a lot.

01:02:37 I don’t know how to explain it. I love Rod. But I also love my other advisor, Paul. Paul,

01:02:43 if you’re listening, I love you too. Both very, very different people.

01:02:46 Paul Viola.

01:02:46 Paul Viola. Both very interesting people, very different in many ways. But I don’t know what

01:02:51 Rod would say to you about what the reaction was. I know that for the students at the time,

01:02:56 because I was a student at the time, it was amazing. This guy was in a movie, being very

01:03:03 much himself. Actually, the movie version of him is a little bit more Rod than Rod. I think they

01:03:10 edited it appropriately for him. But it was very much Rod. And he did all this while doing great

01:03:15 work. Was he running the AI Lab at that point or not? I don’t know. But anyway, he was running

01:03:18 the AI Lab, or would be soon. He was a giant in the field. He did amazing things, made a lot of his

01:03:23 bones by doing the kind of counterintuitive science. And saying, no, you’re doing this all

01:03:28 wrong. Representation is crazy. The world is its own representation. You just react to it. I mean,

01:03:32 these are amazing things. And continues to do those sorts of things as he’s moved on.

01:03:37 I think he might tell you, I don’t know if he would tell you it was good or bad, but I know that

01:03:43 for everyone else out there in the world, it was a good thing. And certainly,

01:03:46 he continued to be respected. So it’s not as if it destroyed his career by being popular.

01:04:21 P stands for probabilistic. And what does

01:04:26 FUNC stand for? So a lot of my life is about making acronyms. So if I have one quirk,

01:04:31 it’s that people will say words and I see if they make acronyms. And if they do, then I’m happy. And

01:04:36 then if they don’t, I try to change it so that they make acronyms. It’s just a thing that I do.

01:04:41 So PFUNC is an acronym. It has three or four different meanings. But finally, I decided that

01:04:46 the P stands for probabilistic because at the end of the day, it’s machine learning and it’s randomness and

01:05:13 it’s fun.

01:05:40 So there’s a sense to it, which is not an acronym, like literally FUNC. You have a dark, mysterious past.


01:06:43 It’s a whole set of things of which rap is a part. So tagging is a part of hip hop. I don’t know why

01:06:48 that’s true, but people tell me it’s true and I’m willing to go along with it because they get very

01:06:51 angry about it. But hip hop is like graffiti. Tagging is like graffiti. And there’s all these,

01:06:56 including the popping and the locking and all the dancing and all those things. That’s all a part of

01:06:59 hip hop. It’s a way of life, which I think is true. And then there’s rap, which is this particular.

01:07:04 It’s the music part.

01:07:05 Yes. A music part. I mean, you wouldn’t call the stuff that DJs do the scratching. That’s not rap,

01:07:12 right? But it’s a part of hip hop, right? So given that we understand that hip hop is this whole

01:07:16 thing, what are the rap albums that best touch that for me? Well, if I were going to educate you,

01:07:21 I would try to figure out what you liked and then I would work you there.

01:07:26 Oh my God. I would probably start with, there’s a fascinating exercise one can do by watching old

01:07:38 episodes of I Love the ’70s, I Love the ’80s, I Love the ’90s with a bunch of

01:07:43 friends and just see where people come in and out of pop culture. So if you’re talking about

01:07:50 those people, then I would actually start you with where I would hope to start you with anyway,

01:07:54 which is Public Enemy. Particularly It Takes a Nation of Millions to Hold Us Back, which is

01:08:00 clearly the best album ever produced. And certainly the best hip hop album ever produced

01:08:05 in part because it was so much of what was great about the time. Fantastic lyrics. For me, it’s all

01:08:11 about the lyrics. Amazing music that was coming from Rick Rubin, who was the producer of that. And he

01:08:16 did a lot of very kind of heavy-metal-ish stuff, at least in the eighties sense, at the time. And it was

01:08:21 focused on politics in the 1980s, which was what made hip hop so great. So I would start you

01:08:28 there. Then I would move you up through things that have been happening more recently. I’d probably

01:08:33 get you to someone like a Mos Def. I would give you a history lesson, basically. Mos Def.

01:08:37 That’s amazing. He hosted a poetry jam thing on HBO or something like that. Probably. I don’t

01:08:41 think I’ve seen it, but I wouldn’t be surprised. Yeah. Spoken poetry. That’s amazing. He’s amazing.

01:08:46 And then, after I got you there, I’d work you back to EPMD.

01:08:51 And eventually I would take you back to The Last Poets, and particularly their first album,

01:08:57 The Last Poets, which was 1970, to give you a sense of history and that it actually has been building

01:09:02 up over a very, very long time. So we would start there because that’s where your music aligns. And

01:09:08 then we would cycle out and I’d move you to the present. And then I’d take you back to the past.

01:09:12 Because I think a large part of people who are kind of confused about any kind of music,

01:09:16 you know, the truth is this is the same thing we’ve always been talking about, right? It’s

01:09:19 about narrative and being a part of something and being immersed in something. So you understand it,

01:09:23 right? Jazz, which I also like is one of the things that’s cool about jazz is that you come

01:09:30 and you meet someone who’s talking to you about jazz and you have no idea what they’re talking

01:09:32 about. And then one day it all clicks and you’ve been so immersed in it. You go, oh yeah, that’s a

01:09:37 Charlie Parker. You start using words that nobody else understands, right? And it becomes part of you.

01:09:41 Hip hop’s the same way. Everything’s the same way. They’re all cultural artifacts, but I would help

01:09:45 you to see that there’s a history of it and how it connects to other genres of music that you might

01:09:49 like to bring you in. So that you could kind of see how it connects to what you already like,

01:09:54 including some of the good work that’s been done with fusions of hip hop and bluegrass.

01:10:02 Oh no.

01:10:03 Yes. Some of it’s even good. Not all of it, but some of it is good,

01:10:09 but I’d start you with It Takes a Nation of Millions to Hold Us Back.

01:10:12 There’s an interesting tradition in like more modern hip hop of integrating almost like classic

01:10:18 rock songs or whatever, like integrating them into their music, into the beat, into the whatever.

01:10:25 It’s kind of interesting. It gives a whole new, not just classic rock, but what is it?

01:10:32 Kanye with Gold Digger.

01:10:33 Old R&B.

01:10:35 It’s taking and pulling old R&B, right?

01:10:38 Well, that’s been true since the beginning. I mean, in fact, that’s in some ways,

01:10:41 that’s why the DJ used to get top billing because it was the DJ that brought all the records together

01:10:47 and made it work so that people could dance. You go back to those days, mostly in New York,

01:10:52 though not exclusively, but mostly in New York where it sort of came out of,

01:10:56 the DJ that brought all the music together and the beats and showed that basically music is

01:11:00 itself an instrument, very meta, and you can bring it together and then you sort of rap over it and

01:11:04 so on. And it moved that way. So that’s going way, way back. Now, in the period of time where I grew

01:11:10 up, when I became really into it, which was mostly the 80s, it was more that funk was the backing for a lot

01:11:17 of the stuff, Public Enemy at that time, notwithstanding. And so, which is very nice

01:11:22 because it tied into what my parents listened to and what I vaguely remember listening to when I

01:11:26 was very small. And by the way, complete revival of George Clinton and Parliament and Funkadelic

01:11:33 and all of those things to bring it sort of back into the 80s and into the 90s. And as we go on,

01:11:38 you’re going to see the last decade and the decade before that being brought in.

01:11:42 And when you don’t think that you’re hearing something you’ve heard, it’s probably because

01:11:45 it’s being sampled by someone who’s referring to something they remembered when they were young,

01:11:53 perhaps from somewhere else altogether. And you just didn’t realize what it was because it wasn’t

01:11:59 a popular song where you happened to grow up. So this stuff has been going on for a long time.

01:12:02 It’s one of the things that I think is beautiful. Run DMC, Jam Master Jay used to play, he played

01:12:08 piano. He would record himself playing piano and then sample that to make it a part of what was

01:12:13 going on rather than play the piano. That’s how his mind thinks. Well, it’s pieces. You’re

01:12:17 putting pieces together, you’re putting pieces of music together to create new music, right?

01:12:20 Now that doesn’t mean that The Roots, I mean, The Roots are doing their own thing.

01:12:23 Yeah. Right. Those, those are, that’s a whole other thing. Yeah. But still it’s the right attitude that,

01:12:30 you know, and what else is jazz, right? Jazz is about putting pieces together and then putting

01:12:33 your own spin on it. It’s all the same. It’s all the same thing. It’s all the same thing.

01:12:36 Yeah. Cause you mentioned lyrics. It does make me sad. Again, this is me talking trash about

01:12:41 modern hip hop. I haven’t, you know, investigated it. I’m sure people will correct me, that there’s a lot of

01:12:46 great artists. That’s part of the reason I’m saying it, so they’ll leave it in the comments that you

01:12:50 should listen to this person. But the lyrics went away from talking about maybe not just politics,

01:12:59 but life and so on. Like, you know, the kind of like protest songs, even if you look at like a

01:13:04 Bob Marley, or, you said, Public Enemy, or Rage Against the Machine, more on the rock side,

01:13:09 that’s the place where we go for those lyrics. Like, classic rock is all about, like,

01:13:16 my woman left me, or I’m really happy that she’s still with me, the flip side. It’s like

01:13:23 love songs of different kinds. It’s all love, but it’s less political, like less interesting. I

01:13:27 would say in terms of like deep, profound knowledge. And it seems like rap is the place where you

01:13:34 would find that. And it’s sad that, for the most part, what I see, like if you look at mumble rap

01:13:40 or whatever, they’re moving away from lyrics and more towards the beat and the musicality

01:13:45 of it. I’ve always been a fan of the lyrics. In fact, if you go back and you read my reviews,

01:13:49 which I recently was rereading, man, I wrote my last review the month I graduated. I got my PhD,

01:13:57 which says something about something, I’m not sure what though. I always, well, I often would

01:14:01 start with: it’s all about the lyrics. For me, it’s all about the lyrics.

01:14:06 Someone has already written in the comments before I’ve even finished having this conversation

01:14:10 that, you know, neither of us knows what we’re talking about and it’s all in the underground

01:14:13 hip hop and here’s who you should go listen to. And that is true. Every time I despair for popular

01:14:19 rap, someone points me to, or I discover, some underground hip hop song, and I am made

01:14:25 happy and whole again. So I know it’s out there. I don’t listen to as much as I used to because

01:14:30 I’m listening to podcasts and old music from the 1980s. There’s kind of no beat at all, but you know,

01:14:37 there’s a little bit of sampling here and there. I’m sure. By the way, James Brown is funk or no?

01:14:42 Yes. And so is Junior Wells, by the way. Who’s that? Oh, Junior Wells, Chicago blues. He was

01:14:48 James Brown before James Brown was. It’s hard to imagine somebody being James Brown. Go look up

01:14:53 Hoodoo Man Blues, Junior Wells, and just listen to Snatch It Back and Hold It, and you’ll see it.

01:15:01 And they were contemporaries. Where do you put, like, Little Richard or all that kind of stuff?

01:15:06 Like Ray Charles, like when they get like, Hit the Road Jack, and don’t you come back. Isn’t that

01:15:12 like, there’s a funkiness in it. Oh, there’s definitely a funkiness in it. I mean, it’s all,

01:15:16 I mean, it’s all a line. There’s a line that carries it all

01:15:20 together. You know, I guess I would answer your question depending on whether I’m thinking

01:15:24 about it in 2020 or I’m thinking about it in 1960. Um, I’d probably give a different answer.

01:15:28 I’m just thinking in terms of, you know, that was rock, but when you look back on it, it’s,

01:15:33 it was funky, but we didn’t use those words, or maybe we did, I wasn’t around. But you know,

01:15:38 I don’t think we used the word funk in 1960, certainly not the way we used it in the seventies

01:15:42 and the eighties. Do you reject disco? I do not reject disco. I appreciate all the mistakes that

01:15:47 we have made. Actually, some of the disco is actually really, really good. John Travolta.

01:15:52 Oh boy. He regrets it probably. Maybe not. Well, like it’s the mistakes thing.

01:15:56 Yeah. And it got him to where he’s going, where he is.

01:16:00 Oh, well, thank you for taking that detour. You’ve, you’ve talked about computing. We’ve

01:16:05 already talked about computing a little bit, but can you try to describe how you think about the

01:16:11 world of computing and how it fits into the set of different disciplines? We mentioned College of

01:16:16 Computing. How should people think about computing, especially from an

01:16:22 educational perspective of like, what is the perfect curriculum that defines for a young mind,

01:16:30 uh, what computing is? So I don’t know about a perfect curriculum, although that’s an important

01:16:34 question because at the end of the day, without the curriculum, you don’t get anywhere. Curriculum

01:16:39 to me is the fundamental data structure. It’s not even the classroom. I mean, the world is,

01:16:44 right? I, I, I, so I think the curriculum is where I like to play. Uh, so I spend a lot of time

01:16:49 thinking about this, but I will tell you, I’ll answer your question by answering a slightly

01:16:52 different question first and then getting back to this, which is, you know, you talked about

01:16:55 disciplines and what does it mean to be a discipline? The truth is what we really educate

01:17:01 people in from the beginning, but certainly through college (and you’ve sort of failed if you

01:17:05 don’t think about it this way, I think), is the world. People often think about tools and tool

01:17:12 sets, and when you’re really trying to be good, you think about skills and skill sets,

01:17:16 but disciplines are about mindsets, right? They’re about fundamental ways of thinking, not just the,

01:17:22 the, the hammer that you pick up, whatever that is to hit the nail, um, not just the,

01:17:27 the skill of learning how to hammer well or whatever. It’s the mindset of like,

01:17:30 what’s the fundamental way to think about the world, right? And disciplines,

01:17:37 different disciplines give you different mindsets to give you different ways of sort of thinking

01:17:40 through. So with that in mind, I think that for computing, to even ask the question of whether

01:17:45 it’s a discipline, you have to decide: does it have a mindset? Does it have a way of thinking

01:17:48 about the world that is different from, you know, the scientist who is doing a discovery and using

01:17:53 the scientific method as a way of doing it, or the mathematician who builds abstractions and tries

01:17:57 to find sort of steady state truth about the abstractions that may be artificial, but whatever,

01:18:03 or is it the engineer who’s all about, you know, building demonstrably superior technology with

01:18:08 respect to some notion of tradeoffs, whatever that means, right? That’s sort of the world

01:18:12 that you live in. What is computing? You know, how is computing different? So I’ve

01:18:16 thought about this for a long time and I’ve come to a view about what computing actually is, what

01:18:22 the mindset is. And it’s, you know, a little abstract, but that would be appropriate

01:18:26 for computing. I think that what distinguishes the computationalist from others is that he or

01:18:34 she understands that models, languages and machines are equivalent. They’re the same thing. And

01:18:41 because it’s not just a model, but it’s a machine that is an executable thing that can be described

01:18:47 as a language. That means that it’s dynamic. So it is mathematical in some sense,

01:18:54 in the kind of sense of abstraction, but it is fundamentally dynamic and executable. The

01:18:58 mathematician is not necessarily worried about the dynamic part. In fact, whenever I tried

01:19:03 to write something for mathematicians, they invariably demand that I make it static. And

01:19:08 that’s not a bad thing. It’s just, it’s a way of viewing the world that truth is a thing, right?

01:19:12 It’s not a process that continually runs, right? So that dynamic thing matters. That self reflection

01:19:19 of the system itself matters. And that is what computing, that is what computing brought us.

01:19:23 So it is a science, because the models fundamentally represent truths in the world.

01:19:27 Information is a scientific thing to discover, right? Not just a mathematical conceit that

01:19:33 gets created. But of course it’s engineering, because you’re actually dealing with constraints

01:19:37 in the world and trying to execute machines that actually run. But it’s also a math because

01:19:43 you’re actually worrying about these languages that describe what’s happening. But the fact that

01:19:51 regular expressions and finite state automata, one of which feels like a machine or at least

01:19:56 an abstract machine and the other like a language, are actually equivalent, I mean,

01:19:59 that is not a small thing, and it permeates everything that we do, even when we’re just

01:20:03 trying to figure out how to do debugging. So that idea I think is fundamental, and we would

01:20:09 do better if we made that more explicit. How my life has changed and my thinking about this

01:20:15 in the 10 or 15 years it’s been since I tried to put that to paper with some colleagues is

01:20:21 the realization, which comes to a question you actually asked me earlier, which has to do with

01:20:28 trees falling down and whether it matters, is this sort of triangle of equivalence.

01:20:34 It only matters because there’s a person inside the triangle, right? That what’s changed about

01:20:40 computing, computer science or whatever you want to call it, is we now have so much data

01:20:46 and so much computational power. We’re able to do really, really interesting, promising things.

01:20:53 But the interesting and the promising kind of only matters with respect to human beings and

01:20:56 their relationship to it. So the triangle exists, that is fundamentally computing.

01:21:01 What makes it worthwhile and interesting and potentially world- and species-changing is that there

01:21:08 are human beings inside of it and intelligence that has to interact with it to change the data,

01:21:12 the information that makes sense and gives meaning to the models, the languages and the machines.

01:21:19 So if the curriculum can convey that while conveying the tools and the skills that you need

01:21:25 in order to succeed, then it is a big win. That’s what I think you have to do.
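To make that models-languages-machines equivalence concrete, here is a minimal sketch in Python: the regular expression ab* (a description of a language) and a small hand-built automaton (a machine) accept exactly the same strings, which is Kleene's classical equivalence.

    import re

    # The language ab* described two equivalent ways: as a regular expression
    # and as a deterministic finite automaton over states {0 (start), 1 (accept)}.

    PATTERN = re.compile(r"ab*")
    TRANSITIONS = {(0, "a"): 1, (1, "b"): 1}  # anything else is a dead end

    def dfa_accepts(s):
        state = 0
        for ch in s:
            if (state, ch) not in TRANSITIONS:
                return False
            state = TRANSITIONS[(state, ch)]
        return state == 1

    # Both descriptions agree on every test string.
    for s in ["a", "abbb", "b", "ab", "abc", ""]:
        assert bool(PATTERN.fullmatch(s)) == dfa_accepts(s)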

01:21:30 Do you pull psychology, like these human things into that, into the idea,

01:21:36 into this framework of computing? Do you pull in psychology and neuroscience,

01:21:41 like parts of psychology, parts of neuroscience, parts of sociology?

01:21:44 What about philosophy, like studies of human nature from different perspectives?

01:21:49 Absolutely. And by the way, it works both ways. So let’s take biology for a moment.

01:21:53 It turns out a cell is basically a bunch of if-then statements, if you look at it the right way,

01:21:57 which is nice because I understand if-then statements. I never really enjoyed biology,

01:22:01 but I do understand if-then statements. And if you tell the biologists that and they begin to

01:22:05 understand that, it actually helps them to think about a bunch of really cool things.
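A toy version of that cell-as-if-then-statements framing, a cartoon loosely inspired by the lac operon rather than real biology:

    # Cartoon of gene regulation as if-then statements: the "program" a cell
    # runs is a set of conditionals over its chemical environment.

    def cell_step(lactose_present, glucose_present):
        if glucose_present:
            return "metabolize glucose"
        if lactose_present:
            return "express lactose-digesting enzymes"
        return "idle"

    print(cell_step(lactose_present=True, glucose_present=False))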

01:22:09 There’ll still be biology involved, but whatever. On the other hand, the fact of biology, that,

01:22:15 in fact, the cell is a bunch of if-then statements or whatever, allows the computationalist to think

01:22:19 differently about the language and the way that we, well, certainly the way we would do AI machine

01:22:23 learning, but there’s just even the way that we think about computation. So the important thing

01:22:28 to me is, as my engineering colleagues who are not in computer science worry about computer science

01:22:33 eating up engineering, or about colleges where computer science is trapped, it’s not a worry. You shouldn’t

01:22:40 worry about that at all. Computer science, computing, it’s central, but it’s not the

01:22:45 most important thing in the world. It’s not more important. It is just key to helping others do

01:22:50 other cool things they’re going to do. You’re not going to be a historian in 2030. You’re not going

01:22:54 to get your PhD in history without understanding some data science and computing, because the way

01:22:58 you’re going to get history done in part, and I say done, the way you’re going to get it done

01:23:02 is you’re going to look at data and you’re going to let, you’re going to have the system that’s

01:23:06 going to help you to analyze things to help you to think about a better way to describe history

01:23:09 and to understand what’s going on and what it tells us about where we might be going. The same

01:23:13 is true for psychology. Same is true for all of these things. The reason I brought that up is

01:23:17 because the philosopher has a lot to say about computing. The psychologist has a lot to say

01:23:21 about the way humans interact with computing, right? And certainly a lot about intelligence, which

01:23:27 at least for me, ultimately is kind of the goal of building these computational devices is to build

01:23:32 something intelligent. Did you think computing will eat everything in some certain sense or almost

01:23:37 like disappear because it’s part of everything? It’s so funny you say this. I want to say it’s

01:23:41 going to metastasize, but there’s kind of two ways that fields destroy themselves. One is they become

01:23:47 super narrow. And I think we can think of fields that might be that way. They become pure. And we

01:23:55 have that instinct. We have that impulse. I’m sure you can think of several people who want computer

01:23:59 science to be this pure thing. The other way is you become everywhere and you become everything

01:24:05 and nothing. And so everyone says, you know, I’m going to teach Fortran for engineers or whatever.

01:24:10 I’m going to do this. And then you lose the thing that makes it worth studying in and of itself.

01:24:15 The thing about computing, and this is not unique to computing, though at this point in time,

01:24:19 it is distinctive about computing where we happen to be in 2020 is we are both a thriving major.

01:24:26 In fact, the thriving major, almost every place. And we’re a service unit because people need to

01:24:34 know the things we need to know. And our job, much as the mathematician’s job is to help,

01:24:39 you know, this person over here to think like a mathematician, much the way the point

01:24:43 of taking chemistry as a freshman is not to learn chemistry. It’s to learn to think

01:24:47 like a scientist, right? Our job is to help them to think like a computationalist. And we

01:24:52 have to take both of those things very seriously. And I’m not sure that as a field, we have

01:24:56 historically taken the second thing seriously, that our job is to help them to think a certain way,

01:25:01 the people who aren’t going to be our major. I don’t think we’ve taken that very seriously at all.

01:25:04 I don’t know if you know who Dan Carlin is. He has this podcast called Hardcore History.

01:25:09 Yes.

01:25:10 I just did an amazing four-hour conversation with him, mostly about Hitler. But I bring him

01:25:18 up because he talks about this idea that it’s possible that history as a field will become,

01:25:24 like currently, most people study history a little bit, kind of are aware of it. We have a

01:25:31 conversation about it, different parts of it. I mean, there’s a lot of criticism to say that some

01:25:35 parts of history are being ignored, blah, blah, blah, so on. But most people are able to have a

01:25:41 curiosity and able to learn it. His thought is it’s possible given the way social media works,

01:25:49 the current way we communicate, that history becomes a niche field where literally most people

01:25:55 just ignore because everything is happening so fast that the history starts losing its meaning.

01:26:01 And then it starts being a thing that, you know, like the theoretical computer science

01:26:07 part of computer science, becomes a niche thing held only by the rare keepers of the knowledge of the

01:26:13 World Wars and, you know, all the history, the founding of the United States, all those kinds of

01:26:19 things, the Civil War. And it’s a kind of profound thing to think about how these,

01:26:27 how we can lose track, how we can lose these fields when it’s best, like in the case of

01:26:33 history, for them to be a pervasive thing that everybody learns and thinks about and so on.

01:26:40 And I would say computing is quite obviously similar to history in the sense that it seems

01:26:46 like it should be a part of everybody’s life to some degree, especially like as we move into the

01:26:53 later parts of the 21st century. And it’s not obvious that that’s the way it’ll go. It might

01:26:59 be in the hands of the few still. Like depending if it’s machine learning, you know, it’s unclear

01:27:06 that computing will win out. It’s currently very successful, but it’s not, I would say that’s

01:27:12 something, I mean, you’re at the leadership level of this. You’re defining the future. So it’s in

01:27:16 your hands. No pressure. But like, it feels like there’s multiple ways this can go. And there’s

01:27:23 this kind of conversation of everybody should learn to code, right? The changing nature of jobs

01:27:29 and so on. Do you have a sense of what your role in education of computing is here? Like what’s

01:27:41 the hopeful path forward? There’s a lot there. I will say that, well, first off, it would be an

01:27:46 absolute shame if no one studied history. On the other hand, as t approaches infinity, the amount

01:27:51 of history is presumably also growing at least linearly. And so you have to forget more and more

01:27:59 of history, but history needs to always be there. I mean, I can imagine a world where,

01:28:02 you know, if you think of your brains as being outside of your head, that you can kind of learn

01:28:07 the history you need to know when you need to know it. That seems fanciful, but it’s a kind of way of,

01:28:12 you know, is there a sufficient statistic of history? No, there certainly isn’t, but there may

01:28:17 be for the particular thing you have to care about. But, you know, those who do not remember…

01:28:20 It’s for our objective camera discussion, right?

01:28:23 Yeah. Right. And, you know, we’ve already lost lots of history. And of course you have your

01:28:26 own history that some of which will be, it’s even lost to you, right? You don’t even remember

01:28:31 whatever it was you were doing 17 years ago.

01:28:33 All the ex girlfriends.

01:28:34 Yeah.

01:28:34 Gone.

01:28:35 Exactly. So, you know, history is being lost anyway, but the big lessons of history shouldn’t

01:28:41 be. And I think, you know, to take it to the question of computing and sort of education,

01:28:46 the point is you have to get across those lessons. You have to get across the way of thinking.

01:28:50 And you have to be able to go back and, you know, you don’t want to lose the data,

01:28:54 even if, you know, you don’t necessarily have the information at your fingertips.

01:28:57 With computing, I think it’s somewhat different. Everyone doesn’t have to learn how to code,

01:29:02 but everyone needs to learn how to think in the way that you can be precise. And I mean,

01:29:07 precise in the sense of repeatable, not, you know, in the sense of resolution, in the sense

01:29:13 of getting the right number of bits, in saying what it is you want the machine to do and being

01:29:19 able to describe a problem in such a way that it is executable, which we are not human beings are

01:29:26 not very good at that. In fact, I think we spend much of our time talking back and forth just to

01:29:30 kind of vaguely understand what the other person means and hope we get it good enough that

01:29:34 we can act accordingly. You can’t do that with machines, at least not yet. And so,

01:29:39 you know, having to think that precisely about things is quite important. And that’s somewhat

01:29:45 different from coding. Coding is a crude means to an end. On the other hand, the idea of coding,

01:29:53 what that means that it’s a programming language and it has these sort of things that you fiddle

01:29:57 with in these ways that you express. That is an incredibly important point. In fact, I would argue

01:30:01 that one of the big holes in machine learning right now in an AI is that we forget that we are

01:30:07 basically doing software engineering. We forget that we are doing, um, we’re using programming,

01:30:13 like we’re using languages to express what we’re doing. We get just so all caught up in the deep

01:30:16 network or we get all caught up in whatever that we forget that, you know, we’re making decisions

01:30:22 based upon a set of parameters that we made up. And if we did slightly different parameters,

01:30:26 we’d have completely different outcomes. And so the lesson of computing,

01:30:30 computer science education is to be able to think like that and to be aware of it when you’re doing

01:30:36 it. Basically, at the end of the day, it’s a way of surfacing your assumptions.

01:30:41 I mean, we call them parameters or, you know, we call them if-then statements or whatever,

01:30:45 but you’re forced to surface those assumptions. That’s the key thing that

01:30:50 you should get out of a computing education: that, and the fact that models, languages, and

01:30:53 machines are equivalent. And it actually follows from that that you have to be explicit

01:30:58 about what it is you’re trying to do, because the model you’re building is something you will one

01:31:02 day run. So you better get it right, or at least understand it and be able to express roughly

01:31:08 what you want to express. So I think it is key that we figure out how to educate everyone to

01:31:17 think that way, because in the end, it would not only make them better at whatever it is that they

01:31:23 are doing, and I emphasize doing, it’ll also make them better citizens. It’ll help them to understand

01:31:30 what others are doing to them so that they can react accordingly. Cause you’re not going to

01:31:35 solve the problem of social media in so far as you think of social media as a problem

01:31:40 by just making slightly better code, right? It only works if people react to it

01:31:46 appropriately and know what’s happening and therefore take control over what they’re doing.

01:31:52 I mean, that’s, that’s my take on it.
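A sketch of that parameters-you-made-up point: two runs of the same toy nearest-neighbor classifier, differing only in the arbitrary parameter k, reach opposite conclusions about the same point, which is why surfacing those assumptions matters.

    # Same data, same algorithm, one made-up parameter changed:
    # k = 1 and k = 3 disagree about the label of x = 0.4.

    def knn_predict(train, x, k):
        nearest = sorted(train, key=lambda p: abs(p[0] - x))[:k]
        labels = [label for _, label in nearest]
        return max(set(labels), key=labels.count)

    train = [(0.0, "A"), (1.0, "B"), (2.0, "B")]
    print(knn_predict(train, 0.4, k=1))  # 'A' -- the single nearest neighbor wins
    print(knn_predict(train, 0.4, k=3))  # 'B' -- the majority of all three wins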

01:31:53 Okay. Let me try to proceed awkwardly into the topic of race.

01:32:00 Okay.

01:32:00 One is because it’s a fascinating part of your story and you’re just eloquent and fun about it.

01:32:05 And then the second is because we’re living through a pretty tense time in terms of race,

01:32:12 tensions and discussions and ideas in America. You grew up in Atlanta,

01:32:20 not born in Atlanta. It’s some Southern state, somewhere in Tennessee, something like that?

01:32:24 Tennessee.

01:32:25 Nice. Okay. But early on you moved, you’re basically, you identify as an Atlanta native.

01:32:34 Mm hmm. Yeah. And you’ve mentioned that you grew up in a predominantly black neighborhood,

01:32:42 by the way, black African American person of color.

01:32:44 I prefer black.

01:32:46 Black.

01:32:46 With a capital B.

01:32:47 With a capital B. The other letters are…

01:32:50 The rest of them don’t matter.

01:32:54 Okay. So, the predominantly black neighborhood. And so you almost didn’t see race. Maybe you

01:32:59 can correct me on that. And then just in the video you talked about when you showed up to

01:33:05 Georgia Tech for your undergrad, you’re one of the only black folks there. And that was like,

01:33:12 oh, that was a new experience. So can you take me from just a human perspective,

01:33:19 but also from a race perspective, your journey growing up in Atlanta

01:33:23 and then showing up at Georgia Tech?

01:33:24 Okay. That’s easy. And by the way, that story continues through MIT as well.

01:33:28 Yeah. In fact, it was quite a bit more stark at MIT and Boston.

01:33:32 So maybe just a quick pause, Georgia Tech was undergrad, MIT was graduate school.

01:33:37 Mm hmm. And I went directly to grad school from undergrad. So I had no

01:33:42 distractions in between my bachelor’s and my master’s and PhD.

01:33:45 You didn’t go on a backpacking trip in Europe?

01:33:47 Didn’t do any of that. In fact, I literally went to IBM for three months, got in a car,

01:33:52 and drove straight to Boston with my mother, or Cambridge.

01:33:55 Yeah.

01:33:55 I moved into an apartment I’d never seen over the Royal East. Anyway, that’s another story.

01:34:02 So let me tell you a little bit about it.

01:34:03 You miss MIT?

01:34:04 Oh, I loved MIT. I don’t miss Boston at all, but I loved MIT.

01:34:09 That was fighting words.

01:34:11 So let’s back up to this. So as you said, I was born in Chattanooga, Tennessee.

01:34:14 My earliest memory is arriving in Atlanta in a moving truck at the age of three and a half.

01:34:18 So I think of myself as being from Atlanta, very distinct memory of that. So I grew up in Atlanta.

01:34:22 It’s the only place I ever knew as a kid. I loved it. Like much of the country, and certainly

01:34:28 much of Atlanta in the 70s and 80s, it was deeply highly segregated, though not in a way that I

01:34:34 think was obvious to you unless you were looking at it or were old enough to have noticed it.

01:34:39 But you could divide up Atlanta, and Atlanta is hardly unique in this way, by highway,

01:34:43 and you could get race and class that way. So I grew up not only in a predominantly

01:34:47 black area, to say the very least, I grew up on the poor side of that. But I was very much aware

01:34:55 of race for a bunch of reasons, one that people made certain that I was, my family did, but also

01:35:01 that it would come up. So in first grade, I had a girlfriend. I say I had a girlfriend. I didn’t

01:35:08 have a girlfriend. I wasn’t even entirely sure what girls were in the first grade. But I do remember

01:35:13 she decided I was her boyfriend, a little white girl named Heather. And we had a long discussion

01:35:17 about how it was okay for us to be boyfriend and girlfriend, despite the fact that she was white

01:35:21 and I was black. Between the two of you? Did your parents know about this?

01:35:26 Yes. But being a girlfriend and boyfriend in first grade just basically meant that you spent

01:35:31 slightly more time together during recess. I think we Eskimo kissed once. It didn’t mean anything.

01:35:38 It was. At the time, it felt very scandalous because everyone was watching. I was like,

01:35:41 ah, my life has now changed in first grade. No one told me elementary school would be

01:35:45 like this. Did you write poetry or not in first grade? That would come later. That would come

01:35:50 during puberty when I wrote lots and lots of poetry. Anyway, so I was aware of it. I didn’t

01:35:56 think too much about it, but I was aware of it. But I was surrounded. It wasn’t that I wasn’t

01:36:01 aware of race. It’s that I wasn’t aware that I was a minority. It’s different. And it’s because I

01:36:07 wasn’t as far as my world was concerned. I mean, I’m six years old, five years old in first grade.

01:36:12 The world is the seven people I see every day. So it didn’t feel that way at all.

01:36:17 And by the way, this being Atlanta, home of the civil rights movement and all the rest,

01:36:21 it meant that when I looked at TV, which back then one did because there were only three,

01:36:25 four or five channels. And I saw the news, which my mother might make me watch. Monica Kaufman was

01:36:33 on TV telling me the news, and they were all black, and the mayor was black and had always been

01:36:37 black. And so it just never occurred to me. When I went to Georgia Tech, I remember the first day

01:36:43 walking across campus from West campus to East campus and realizing along the way that of the

01:36:49 hundreds and hundreds and hundreds and hundreds of students that I was seeing, I was the only black

01:36:53 one. That was enlightening and very off-putting, because it occurred to me. And then of course,

01:36:59 it continued that way for, well, for the rest of my, for much of the rest of my career at Georgia

01:37:05 Tech. Of course, I found lots of other students and I met people cause in Atlanta, you’re either

01:37:09 black or you’re white. There was nothing else. So I began to meet students of Asian descent and I

01:37:14 met students who we would call Hispanic and so on and so forth. And you know, so my world,

01:37:18 this is what college is supposed to do, right? It’s supposed to open you up to people. And it

01:37:22 did, but it was a very strange thing to be in the minority. When I came to Boston, I will tell you

01:37:31 a story. I applied to one place as an undergrad, Georgia Tech, because I was stupid. I didn’t know

01:37:38 any better. I just didn’t know any better, right? No one told me. When I went to grad school,

01:37:43 I applied to three places, Georgia Tech, because that’s where I was, MIT and CMU.

01:37:49 When I got into MIT and into CMU, I had a friend who went to CMU. And so I asked him what

01:37:58 he thought about it. He spent his time explaining to me about Pittsburgh, much less about CMU,

01:38:03 but more about Pittsburgh, which I developed a strong opinion based upon his strong opinion,

01:38:07 something about the sun coming out two days out of the year. And I didn’t get a chance to go there

01:38:12 because the timing was wrong. I think it was because the timing was wrong. At MIT, I asked

01:38:19 20 people I knew, either when I visited or I had already known for a variety of reasons,

01:38:24 whether they liked Boston. And 10 of them loved it, and 10 of them hated it. The 10 who loved it

01:38:30 were all white. The 10 who hated it were all black. And they explained to me very much why

01:38:35 that was the case. Both sets told me why. And the stories were remarkably the same for the

01:38:41 two clusters. And I came up here, and I could see it immediately, why people would love it

01:38:46 and why people would not. And why people tell you about the nice coffee shops.

01:38:50 Well, it wasn’t coffee shops. It was used CD places. But yeah, it was that kind of a thing.

01:38:55 Nice shops. Oh, there’s all these students here. Harvard Square is beautiful. You can do all these

01:39:00 things, and you can walk. And something about the outdoors, which I wasn’t the slightest bit

01:39:02 interested in. The outdoors is for the bugs. It’s not for humans.

01:39:08 That should be a t-shirt.

01:39:09 Yeah, that’s the way I feel about it. And the black folk told me completely different stories

01:39:14 about which part of town you did not want to be caught in after dark. But that was nothing new.

01:39:22 So I decided that MIT was a great place to be as a university. And I believed it then,

01:39:27 I believe it now. And that whatever it is I wanted to do, I thought I knew what I wanted to do,

01:39:32 but what if I was wrong? Someone there would know how to do it. Of course, then I would pick the

01:39:36 one topic that nobody was working on at the time, but that’s okay. It was great. And so I thought

01:39:42 that I would be fine. And I’d only be there for like four or five years. I told myself,

01:39:46 which turned out not to be true at all. But I enjoyed my time. I enjoyed my time there.

01:39:50 But I did see a lot of… I ran across a lot of things that were driven by what I look like

01:39:58 while I was here. I got asked a lot of questions. I ran into a lot of cops. I saw a lot about the

01:40:05 city. But at the time, I mean, I haven’t been here in a long time. These are the things that I

01:40:09 remember. So this is 1990. There was not a single black radio station. Now this is 1990. I don’t

01:40:17 know if there are any radio stations anymore. I’m sure there are, but I don’t listen to the radio

01:40:21 anymore and almost no one does, at least if you’re under a certain age. But the idea that you could be

01:40:27 in a major metropolitan area and there wasn’t a single black radio station, by which I mean

01:40:30 a radio station that played what we would call black music then, was absurd, but somehow captured kind

01:40:37 of everything about the city. I grew up in Atlanta and you’ve heard me tell you about Atlanta.

01:40:44 Boston had no economically viable or socially cohesive black middle class.

01:40:51 Insofar as it existed, it was uniformly distributed throughout large parts, not all parts,

01:40:56 but large parts of the city. And where you had concentrations of black Bostonians,

01:41:02 they tended to be poor. It was very different from where I grew up. I grew up on the poor side of

01:41:07 town, sure. But then in high school, well, in ninth grade, we didn’t have middle school. I went

01:41:13 to an eighth grade school where there was a lot of, let’s just say, we had a riot the year that

01:41:17 I was there. There was at least one major fight every week. It was an amazing experience. But

01:41:24 when I went to ninth grade, I went to an academy: Math and Science Academy, Mays High. It was a

01:41:30 public school. It was a magnet school. That’s why I was able to go there. It was the first high school,

01:41:36 I think, in the state of Georgia to sweep the state math and science fairs. It was great. It had

01:41:44 385 students, all but four of whom were black. I went to school with the daughter of the

01:41:51 former mayor of Atlanta, Michael Jackson’s cousin. I mean, it was an upper middle class.

01:41:56 Dropping names.

01:41:57 You know, I just drop names occasionally.

01:41:59 You know, drop the mic, drop some names. Just to let you know, I used to hang out with Michael

01:42:03 Jackson’s cousin, 12th cousin, nine times removed. I don’t know. The point is, they had money. We

01:42:07 had a parking problem because the kids had cars. I did not come from a place where you had cars.

01:42:12 I had my first car when I came to MIT, actually. So, it was just a very different experience for

01:42:21 me. But I’d been to places where whether you were rich or whether you were poor, you know,

01:42:26 you could be black and rich or black and poor. And it was there and there were places and they

01:42:29 were segregated by class as well as by race. But that existed. Here, at least when I was here,

01:42:35 didn’t feel that way at all. And it felt like a really interesting contradiction.

01:42:41 It felt like it was the interracial dating capital of the country. It really felt that way.

01:42:49 But it also felt like the most racist place I ever spent any time. You know, you couldn’t go

01:42:55 up the Orange Line at that time. I mean, again, that was 30 years ago. I don’t know what it’s

01:42:59 like now. But there were places you couldn’t go. And you knew it. Everybody knew it. And there were

01:43:05 places you couldn’t live. And everybody knew that. And that was just the greater Boston area in 1992.

01:43:12 Subtle racism or explicit racism?

01:43:14 Both.

01:43:16 In terms of within the institutions, did you feel…

01:43:19 Were there levels in which you were empowered to be the first or one of the first black people in a

01:43:26 particular discipline in some of these great institutions that you were a part of? You know,

01:43:31 Georgia Tech or MIT? And was there a part where it felt limiting?

01:43:37 I always felt empowered. Some of that was my own delusion, I think. But it worked out. So I never

01:43:45 felt… In fact, quite the opposite. Not only did I not feel that anyone was trying to stop me,

01:43:52 I had the distinct impression that people wanted me to succeed. By people, I meant the people in

01:43:57 power. Not my fellow students. Not that they didn’t want me to succeed. But I felt supported,

01:44:04 or at least that people were happy to see me succeed at least as much as anyone else. But,

01:44:10 you know, 1990, you’re dealing with a different set of problems. You’re very early, at least in

01:44:15 computer science, you’re very early in the Jackie Robinson period. There’s this thing called the

01:44:20 Jackie Robinson syndrome, which is that the first one has to be perfect or has to be sure to

01:44:27 succeed because if that person fails, no one else comes after for a long time. So it was kind of in

01:44:32 everyone’s best interest. But I think it came from a sincere place. I’m completely sure that people

01:44:37 went out of their way to try to make certain that the environment would be good. Not just for me,

01:44:43 but for the other people who, of course, were around. And I was the only black person in the AI Lab,

01:44:47 but I wasn’t the only person at MIT by a long shot. On the other hand, we’re what?

01:44:53 At that point, we would have been, what, less than 20 years away from the first black PhD to

01:44:57 graduate from MIT, right? Shirley Jackson, right? 1971, something like that? Somewhere around then.

01:45:03 So we weren’t that far away from the first first, and we were still another eight years away from

01:45:09 the first black PhD in computer science, right? So it was a sort of interesting time. But I did

01:45:16 not feel as if the institutions of the university were against any of that. And furthermore, I felt

01:45:25 as if there was enough of a critical mass across the institute from students and probably faculty

01:45:30 that I didn’t know them, who wanted to make certain that the right thing happened. It was very

01:45:35 different from the institutions of the rest of the city, which I think were designed in such a way

01:45:41 that they felt no need to be supportive.

01:45:44 Let me ask a touchy question on that. So you kind of said that you didn’t feel,

01:45:52 you felt empowered. Is there some lesson, advice, in the sense that no matter what,

01:46:00 you should feel empowered? You said, you used the word, I think, illusion or delusion.

01:46:05 Is there a sense from the individual perspective where you should always kind of ignore, you know,

01:46:14 the, ignore your own eyes, ignore the little forces that you are able to observe around you,

01:46:27 that are like trying to mess with you of whether it’s jealousy, whether it’s hatred in its pure

01:46:33 form, whether it’s just hatred in its like deluded form, all that kind of stuff?

01:46:38 And just kind of see yourself as empowered and confident and all those kinds of things.

01:46:44 I mean, it certainly helps, but there’s a trade-off, right? You have to be deluded enough

01:46:47 to think that you can succeed. I mean, you can’t get a PhD unless you’re crazy enough to think you

01:46:51 can invent something that no one else has come up with. It takes that kind of massive delusion:

01:46:56 you have to be deluded enough to believe that you can succeed despite whatever odds you see

01:46:59 in front of you, but you can’t be so deluded that you don’t think that you need to step out of

01:47:03 the way of the oncoming train, right? So it’s all a trade-off, right? You have to kind of believe in

01:47:11 yourself. It helps to have a support group around you in some way or another. I was able to find

01:47:16 that, I’ve been able to find that wherever I’ve gone, even if it wasn’t necessarily on the floor

01:47:21 that I was in, I had lots of friends when I was here. Many of them still live here. And I’ve kept

01:47:26 up with many of them. So I felt supported. And certainly I had my mother and my family and those

01:47:30 people back home that I could always lean back on, even if it were a long-distance call that cost

01:47:37 money, which the kids today won’t even know what I’m talking about. But

01:47:41 back then it mattered, calling my mom was an expensive proposition. But you have that and

01:47:45 it’s fine. I think it helps. But you cannot be so deluded that you miss the obvious because it makes

01:47:51 things slower and it makes you think you’re doing better than you are and it will hurt you in the

01:47:55 long run. You mentioned cops. You tell a story of being pulled over. Perhaps it happened more than

01:48:04 once. More than once, for sure. One, could you tell that story? And in general, can you give me

01:48:11 a sense of what the world looks like when the law doesn’t always look at you with a blank slate?

01:48:22 With a blank slate with objective eyes? I don’t know how to say it more poetically.

01:48:33 Well, I guess the, I don’t either. I guess the answer is it looks exactly the way it looks now

01:48:39 because this is the world that we happen to live in, right? It’s people clustering and doing the

01:48:44 things that they do and making decisions based on one or two bits of information they find

01:48:50 relevant, which, by the way, are all positive feedback loops, which makes it easier for you

01:48:56 to believe what you believed before because you behave in a certain way that makes it true and

01:48:59 it goes on in circles, and then cycles and cycles. So it’s just about being on edge.

01:49:06 I do not, despite having made it over 50 now.

01:49:11 Congratulations, brother.

01:49:13 God, I have a few gray hairs here and there.

01:49:16 You did pretty good.

01:49:16 I think, I don’t imagine I will ever see a police officer and not get very, very tense.

01:49:25 Now, everyone gets a little tense because it probably means you’re being pulled over for

01:49:30 speeding or something, or you’re going to get a ticket or whatever, right? I mean,

01:49:34 the interesting thing about the law in general is that most human beings’ experience of it is

01:49:39 fundamentally negative, right? You’re only dealing with a lawyer if you’re in trouble,

01:49:43 except in a few very small circumstances, right? So that’s an underlying reality.

01:49:49 Now, imagine that that’s also at the hands of the police officer. I remember the time when I got

01:49:55 pulled over that time, halfway between Boston and Wellesley, actually. I remember thinking

01:50:04 when he pulled his gun on me that if he shot me right now, he’d get away with it. That was the

01:50:11 worst thing that I felt about that particular moment, that if he shoots me now,

01:50:16 he will get away with it. It would be years later when I realized actually much worse than that

01:50:24 is that he’d get away with it. And if it became a thing that other people knew about,

01:50:30 odds would be, of course, that it wouldn’t. But if it became a thing that other people knew about,

01:50:34 if I was living in today’s world as opposed to the world 30 years ago, that not only would he

01:50:39 get away with it, but that I would be painted a villain. I was probably big and scary, and I

01:50:45 probably moved too fast, and if only I’d done what he said, and da, da, da, da, da, da, da,

01:50:49 which is somehow worse, right? You know, that hurts not just you, you’re dead, but your family,

01:50:55 and the way people look at you, and look at your legacy or your history, that’s terrible.

01:51:00 And it would work. I absolutely believe it would have worked had he done it. Now, he didn’t. I

01:51:05 don’t think he wanted to shoot me. I don’t think he felt like killing anybody. He did not go out

01:51:08 that night expecting to do that or planning on doing it, and I wouldn’t be surprised if he never,

01:51:12 ever did that or ever even pulled his gun again. I don’t know the man’s name. I don’t remember

01:51:16 anything about him. I do remember the gun. Guns are very big when they’re in your face. I can tell

01:51:20 you this much. They’re much larger than they seem. But… And you’re basically like speeding or

01:51:24 something like that? He said I ran a light, I think. You ran a light. I don’t think I ran a

01:51:28 light, but you know, in fact, I may not have even gotten a ticket. I may have just gotten a warning.

01:51:33 I think he was a little… But he pulled a gun. Yeah. Apparently I moved too fast or something.

01:51:37 Rolled my window down before I should have. It’s unclear. I think he thought I was going to do

01:51:42 something, or at least that’s how he behaved. So how, if we can take a little walk around your

01:51:48 brain, how do you feel about that guy and how do you feel about cops after that experience?

01:51:58 Well, I don’t remember that guy, but my view on police officers is the same view I have about

01:52:03 lots of things. Fire is an important and necessary thing in the world, but you must respect fire

01:52:12 because it will burn you. Fire is a necessary evil in the sense that it can burn you. Necessary

01:52:19 in the sense that, you know, heat and all the other things that we use fire for. So when I see

01:52:25 a cop, I see a giant ball of flame and I just try to avoid it. And then some people might see

01:52:33 a nice place, a nice thing to roast marshmallows over with a family.

01:52:37 Which is fine, but I don’t roast marshmallows.

01:52:40 Okay. So let me go a little dark and I apologize. Just talked to Dan Carlin about

01:52:44 Hitler for four hours. So sorry if I go dark here a little bit, but

01:52:50 is it easy for this experience of just being careful with the fire and avoiding it to turn

01:52:57 to hatred? Yeah, of course. And one might even argue that it is a logical conclusion, right?

01:53:05 On the other hand, you’ve got to live in the world and I don’t think it’s helpful. Hate is something

01:53:12 one should, I mean, hate is something that takes a lot of energy. So one should reserve it for

01:53:20 when it is useful and not carry it around with you all the time. Again, there’s a big difference

01:53:25 between the happy delusion that convinces you that you can actually get out of bed and

01:53:30 make it to work today without getting hit by a car and the sad delusion that means you can

01:53:36 not worry about this car that is barreling towards you, right? So we all have to be a

01:53:40 little deluded because otherwise we’re paralyzed, right? But one should not be ridiculous.

01:53:46 If we go all the way back to something you said earlier about empathy,

01:53:49 I think what I would ask other people to get out of this one of many, many, many stories

01:53:57 is to recognize that it is real. People would ask me to empathize with the police officer.

01:54:04 I would quote back statistics saying that being a police officer isn’t even in the top 10 most

01:54:11 dangerous jobs in the United States, you’re much more likely to get killed in a taxicab.

01:54:14 Half of police officers are actually killed by suicide, but that means something’s

01:54:22 going on there with them, and I would be more than happy to be empathetic about what it is

01:54:28 they go through and how they see the world. I think though that if we step back from what I feel,

01:54:34 if we step back from what an individual police officer feels, you step up a level and all this,

01:54:39 because all things tie back into interactive AI. The real problem here is that we’ve built a

01:54:44 narrative. We built a big structure that has made it easy for people to put themselves into different

01:54:50 pots in the different clusters and to basically forget that the people in the other clusters are

01:54:57 ultimately like them. It is a useful exercise to ask yourself sometimes, I think, that if I had grown

01:55:03 up in a completely different house and a completely different household as a completely different

01:55:07 person, if I had been a woman, would I see the world differently? Would I believe what that crazy

01:55:12 person over there believes? And the answer is probably yes, because after all, they believe it.

01:55:19 And fundamentally, they’re the same as you. So then what can you possibly do to fix it? How do

01:55:25 you fix Twitter? If you think Twitter needs to be broken or Facebook, if you think Facebook is

01:55:29 broken, how do you fix racism? How do you fix any of these things? That’s all structural.

01:55:35 I mean, individual conversations matter a lot, but you have to create structures that allow people

01:55:42 to have those individual conversations all the time in a way that is relatively safe and that

01:55:47 allows them to understand that other people have had different experiences, but that ultimately

01:55:51 we’re the same, which sounds very, I don’t even know what the right word is. I’m trying to avoid

01:55:57 a word like saccharine, but it feels very optimistic.

01:56:01 But I think that’s okay. I think that’s a part of the delusion, is you want to be a little

01:56:06 optimistic and then recognize that the hard problem is actually setting up the structures

01:56:10 in the first place, because it’s in almost no one’s interest to change the infrastructure.

01:56:16 Right. I tend to believe that leaders have a big role in that, of selling that optimistic

01:56:22 delusion to everybody, and that eventually leads to the building of the structures. But that

01:56:27 requires a leader that unites, sort of unites everybody on a vision as opposed to divides

01:56:33 on a vision, which is, this particular moment in history feels like there’s a nonzero probability,

01:56:43 if we go to the P, of something akin to a violent or a nonviolent civil war. This is one of the

01:56:51 most divisive periods of American history in recent memory. You can speak to this from perhaps

01:56:58 a more knowledgeable and deeper perspective than me, but from my naive perspective, this seems like

01:57:03 a very strange time. There’s a lot of anger, and it has to do with people, I mean, for many reasons.

01:57:11 One, the thing that’s not spoken about, I think, as much as the conflict of opinion,

01:57:18 is the quiet economic pain of millions that’s growing because of COVID, because of closed

01:57:29 businesses, because of like lost dreams. So that’s building, whatever that tension is building.

01:57:35 The other is, there seems to be an elevated level of emotion. I’m not sure if you can psychoanalyze

01:57:41 where that’s coming from, but this sort of, from which the protests and so on percolated. It’s like,

01:57:47 why now? Why this particular moment in history? Oh, because time, enough time has passed, right?

01:57:52 I mean, you know, the very first race riots were in Boston, not to draw anything from that.

01:57:56 Really? When? Oh, this is before like… Going way, I mean, like the 1700s or whatever,

01:58:01 right? I mean, there was a massive one in New York. I mean, I’m talking way, way, way back when.

01:58:05 So Boston used to be the hotbed of riots. It’s just what Boston was all about,

01:58:09 or so I’m told from history class. There’s an interesting one in New York. I remember when

01:58:15 that was. Anyway, the point is, you know, basically you got to get another generation,

01:58:22 old enough to be angry, but not so old as to remember what happened the last time, right?

01:58:28 And that’s sort of what happens. But, you know, you said like two completely, you said two things

01:58:33 there that I think are worth unpacking. One has to do with this sort of moment in time.

01:58:38 And, you know, why? Why has this sort of built up? And the other has to do with a kind of, you know,

01:58:43 sort of the economic reality of COVID. So I’m actually, I want to separate those things because,

01:58:47 for example, you know, this happened before COVID happened, right? So let’s separate these two

01:58:54 things for a moment. Now, let me preface all this by saying that although I am interested in history,

01:59:01 one of my three minors as an undergrad was history, specifically the history of the 1960s. Interesting. The

01:59:07 other was Spanish. And, okay, that’s a mistake. Oh, I loved that. And history of Spanish and Spanish

01:59:14 history, actually, but Spanish and the other was what we would now call cognitive science. But at

01:59:17 the time, that’s fascinating. Interesting. I minored in Cogsci here for grad school. That was

01:59:24 really, that was really fascinating. It was a very

01:59:29 different experience from all the computer science classes

01:59:33 I’d been taking, even the CogSci classes I was taking as an undergrad. Anyway, I’m interested

01:59:42 in history, but I’m hardly a historian, right? So, you know, forgive my, I will ask the audience to

01:59:48 forgive my simplification. But I think the question that’s always worth asking, as opposed, it’s the

01:59:58 same question, but a little different. Not why now, but why not before? Right? So why the 1950s,

02:00:08 60s civil rights movement as opposed to the 1930s, 1940s? Well, first off, there was a civil

02:00:12 rights movement in the 30s and 40s. It just wasn’t of the same character or quite as well known. Post

02:00:17 World War II, lots of interesting things were happening. It’s not as if a switch was turned on

02:00:22 with Brown versus the Board of Education or the Montgomery bus boycott, and that’s when it

02:00:27 happened. These things had been building up forever, and they go all the way back and all the way back and

02:00:30 all the way back. And, you know, Harriet Tubman was not born in 1950, right? So, you know, we can

02:00:35 take these things. It could have easily happened right after World War II. Yes. I think,

02:00:42 and again, I’m not a scholar. I think that the big difference was TV. These things are visible.

02:00:50 People can see them. It’s hard to avoid, right? Why not James Farmer? Why Martin Luther King? Because

02:00:59 one was born 20 years after the other, whatever. I think it turns out that, you know what King’s

02:01:06 biggest failure was in the early days? It was in Georgia. They were doing the usual thing,

02:01:13 trying to integrate. And I forget the guy’s name, but you can look this up. But he, a cop,

02:01:21 he was a sheriff, made a deal with the whole state of Georgia: we’re going to take people and we are

02:01:26 going to nonviolently put them in trucks. And then we’re going to take them and put them in jails

02:01:30 very far away from here. And we’re going to do that. And we’re not going to, there’ll be no

02:01:35 reason for the press to hang around. And they did that and it worked. And the press left and

02:01:41 nothing changed. So next they went to Birmingham, Alabama, and Bull Connor. And you got to see on

02:01:48 TV, little boys and girls being hit with fire hoses and being knocked down. And there was

02:01:53 outrage and things changed, right? Part of the delusion is pretending that nothing bad is

02:01:59 happening that might force you to do something big you don’t want to do. But sometimes it gets

02:02:03 put in your face and then you kind of can’t ignore it. And a large part, in my view, of what happened,

02:02:08 right, was that it was too public to ignore. Now we created other ways of ignoring it.

02:02:14 Lots of change happened in the South, but part of that delusion was that it wasn’t going to affect

02:02:17 the West or the Northeast. And of course it did. And that caused its own set of problems, which

02:02:21 went into the late sixties into the seventies. And, you know, in some ways we’re living with

02:02:25 that legacy now and so on. So why now, what’s happening now? Why didn’t it happen 10 years ago?

02:02:32 I think it’s people have more voices. There’s not just more TV, there’s social media. It’s very easy

02:02:38 for these things to kind of build on themselves and things are just quite visible. And there’s

02:02:44 demographic change. I mean, the world is changing rapidly, right? And so it’s very difficult.

02:02:49 You’re now seeing people you could have avoided seeing most of your life growing up in a particular

02:02:52 time. And it’s happening, it’s dispersing at a speed that is fast enough to cause

02:02:58 concern for some people, but not so fast as to cause massive negative reaction. So that’s that.

02:03:06 On the other hand, and again, that’s a massive oversimplification, but I think there’s something

02:03:10 there anyway, at least something worth exploring. I’m happy to be yelled at by a real historian.

02:03:13 Oh yeah. I mean, there’s just the obvious thing. I mean, I guess you’re implying, but not

02:03:19 saying this. I mean, it seemed to have percolated the most with just a single video, for example,

02:03:24 the George Floyd video. It’s fascinating to think that whatever the mechanisms that put injustice

02:03:34 in front of our face, not like directly in front of our face, those mechanisms are the mechanisms

02:03:42 of change. Yeah. On the other hand, Rodney King. So no one remembers this. I seem to be the only

02:03:46 person who remembers this, but sometime before the Rodney King incident, there was a guy who

02:03:51 was a police officer who was saying that things were really bad in Southern California. And he

02:03:58 was going to prove it by having some news, some camera people follow him around. And he says,

02:04:03 I’m going to go into these towns and just follow me for a week. And you will see that I’ll get

02:04:06 harassed. And like the first night he goes out there and he crosses into the city, some cops

02:04:11 pull him over and he’s a police officer. Remember, they don’t know that. Of course they like shove

02:04:15 his face through a glass window. This was on the news. Like, I distinctly remember watching this as

02:04:20 a kid. Actually, I guess I wasn’t a kid. I was in college, I was in grad school at the time.

02:04:25 So that’s not enough. Well, it disappeared like a day later. It didn’t go viral.

02:04:30 Yeah. Whatever that is, whatever that magic thing is.

02:04:33 And whatever it was in 92, it was harder to go viral in 92, right? Or 91,

02:04:38 actually it must’ve been 90 or 91, but that happened. And like two days later,

02:04:42 it’s like it never happened. Again, nobody remembers this, but I’m like the only person.

02:04:45 Sometimes I think I must’ve dreamed it. Anyway, Rodney King happens. It goes viral

02:04:50 or the moral equivalent thereof at the time. And eventually we get April 29th. And I don’t know

02:04:57 what the difference was between the two things, other than one thing caught on and one thing

02:05:00 didn’t. Maybe what’s happening now is two things are feeding into one another. One is more people

02:05:07 are willing to believe. And the other is there’s easier and easier ways to give evidence. Cameras,

02:05:14 body cams or whatever, but we’re still finding ourselves telling the same story. It’s the same

02:05:17 thing over and over again. I would invite you to go back and read the op-eds, what people were

02:05:22 saying about how violence is not the right answer, after Rodney King. And then go back to 1980 and

02:05:28 the big riots that were happening around then and read the same op-ed. It’s the same words over and

02:05:33 over and over again. I mean, there’s your remembering history right there. I mean,

02:05:37 it’s like literally the same words. Like it could have just been copied. I’m surprised no one got

02:05:40 flagged for plagiarism. It’s interesting if you have an opinion on the question of violence

02:05:46 and the popular perhaps caricature of Malcolm X versus Martin Luther King.

02:05:53 You know, Malcolm X was older than Martin Luther King. People kind of have it in their head that

02:05:57 he’s younger. Well, he died sooner, but only by a few years. People think of MLK as the older

02:06:05 statesman and they think of Malcolm X as the young, angry, whatever, but that’s more of a

02:06:10 narrative device. It’s not true at all. I don’t, I just, I reject the choice as I think it’s a

02:06:18 false choice. I think they’re just things that happen. You just do, as I said, hatred is not,

02:06:23 it takes a lot of energy, but you know, every once in a while you have to fight.

02:06:28 One thing I will say without taking a moral position, which I will not take on this matter,

02:06:34 violence has worked.

02:06:38 Yeah, that’s the annoying thing.

02:06:41 That’s the annoying thing.

02:06:43 It seems like over-the-top anger works. Outrage works. So you can say like being calm and rational

02:06:53 and just talking it out is going to lead to progress. But it seems like if you just look

02:06:59 through history being irrationally upset is the way you make progress.

02:07:06 Well, it’s certainly the way that you get someone to notice you.

02:07:09 Yeah.

02:07:10 And if they don’t notice you, I mean, what’s the difference between that and what did you,

02:07:13 again, without taking a moral position on this, I’m just trying to observe history here.

02:07:17 If you, maybe if television didn’t exist, the civil rights movement doesn’t happen

02:07:22 or it takes longer or it takes a very different form. Maybe if social media doesn’t exist,

02:07:27 a whole host of things, positive and negative don’t happen. And what do any of those things

02:07:33 do other than expose things to people? Violence is a way of shouting. I mean,

02:07:40 many people far more talented and thoughtful than I have said this in one form or another,

02:07:45 right? That violence is the voice of the unheard. It’s a thing that people do when they feel as if

02:07:54 they have no other option. And sometimes we agree and sometimes we disagree. Sometimes we think

02:08:00 they’re justified. Sometimes we think they are not, but regardless, it is a way of shouting.

02:08:06 And when you shout, people tend to hear you, even if they don’t necessarily hear the words

02:08:10 that you’re saying, they hear that you were shouting. I see no way. So another way of putting

02:08:15 it, which I think is less, let us just say provocative, but I think is true is that all

02:08:26 change, particularly change that impacts power, requires struggle. The struggle doesn’t have to

02:08:32 be violent, you know, but it’s a struggle nonetheless. The powerful don’t give up power

02:08:38 easily. I mean, why should they? But even so, it still has to be a struggle. And by the way,

02:08:45 this isn’t just about, you know, violent political, whatever, nonviolent political

02:08:49 change, right? This is true for understanding calculus, right? I mean, everything requires

02:08:53 a struggle. We’re back to talking about faculty hiring. At the end of the day,

02:08:56 it all comes down to faculty hiring. It’s all a metaphor. Faculty

02:09:01 hiring is a metaphor for all of life. Let me ask a strange question. Do you think everything is

02:09:10 going to be okay in the next year? Do you have a hope that we’re going to be okay?

02:09:16 I tend to think that everything’s going to be okay because I just tend to think that everything’s

02:09:20 going to be okay. My mother says something to me a lot and always has, and I find it quite

02:09:26 comforting, which is this too shall pass and this too shall pass. Now, this too shall pass is not

02:09:32 just this bad thing is going away. Everything passes. I mean, I have a 16-year-old daughter

02:09:38 who’s going to go to college in probably about 15 minutes, given how fast she seems to be growing

02:09:43 up. And you know, I get to hang out with her now, but one day I won’t. She’ll ignore me just as much

02:09:48 as I ignored my parents when I was in college and went to grad school. This too shall pass.

02:09:52 But I think that one day, if we’re all lucky, you live long enough to look back on something that

02:09:57 happened a while ago, even if it was painful and mostly it’s a memory. So yes, I think it’ll be okay.

02:10:06 What about humans? Do you think we’ll live into the 22nd century?

02:10:11 I certainly hope so.

02:10:12 Are you worried that we might destroy ourselves with nuclear weapons, with AGI, with engineering?

02:10:19 I’m not worried about AGI doing it, but I am worried. I mean, at any given moment, right? Also,

02:10:24 you know, at any given moment, a comet could, I mean, you know, whatever. I tend to think that

02:10:28 outside of things completely beyond our control, we have a better chance than not of making it.

02:10:36 You know, I talked to Alex Filippenko from Berkeley. He was talking about comets and

02:10:41 that they can come out of nowhere. And that was a realization to me. Wow. We’re just watching

02:10:49 this darkness and they can just enter. And then we have less than a month.

02:10:53 And yet you make it from day to day.

02:10:57 That one shall not pass. Well, maybe for Earth they’ll pass, but not for humans.

02:11:02 But I’m just choosing to believe that it’s going to be okay. And we’re not going to get hit by

02:11:08 an asteroid, at least not while I’m around. And if we are, well, there’s very little I can do about

02:11:13 it. So I might as well assume it’s not going to happen. It makes food taste better.

02:11:17 It makes food taste better.

02:11:19 So you, out of the millions of things you’ve done in your life,

02:11:24 you also began the This Week in Black History calendar of facts.

02:11:30 There’s like a million questions I could ask here. You said you’re not a historian,

02:11:35 but is there, let’s start at the big history question of, is there somebody in history,

02:11:44 in black history that you draw a lot of philosophical or personal inspiration from,

02:11:50 or you just find interesting or a moment in history you find interesting?

02:11:55 Well, I find the entirety of the 40s to the 60s and the civil rights movement that didn’t happen

02:12:01 and did happen at the same time, quite inspirational. I mean, I’ve read quite a bit about that

02:12:07 time period, at least I did in my younger days when I had more time to read as many things as I

02:12:12 wanted to. What was quirky about This Week in Black History when I started in the 80s was how

02:12:22 focused it was. It was because of the sources I was stealing from. And I was very much stealing

02:12:25 from sort of like, I’d take calendars, anything I could find, Google didn’t exist, right? And I

02:12:29 just pulled as much as I could and just put it together in one place for other people.

02:12:32 What ended up being quirky about it, and I started getting people sending me information on it,

02:12:36 was the inventors. People who, you know, Garrett Morgan to Benjamin Banneker, right? People who

02:12:43 were inventing things. At a time when, how in the world did they manage to invent anything?

02:12:54 Like, all these other things were happening, mother necessity, right? All these other things

02:12:57 were happening. And, you know, there were so many terrible things happening around them. And, you

02:13:00 know, if they went to the wrong state at the wrong time, they may never come back, but they

02:13:04 were inventing things we use, right? And it was always inspiring to me that people would still

02:13:10 create even under those circumstances. I got a lot out of that. I also learned a few lessons. I

02:13:16 think, you know, the Charles Richard Drews of the world, you know, you create things that impact

02:13:23 people. You don’t necessarily get credit for them. And that’s not right, but it’s also okay.

02:13:29 You okay with that?

02:13:31 Up to a point, yeah. I mean, look, in our world,

02:13:36 all we really have is credit.

02:13:38 I was always bothered by how much value is given to credit.

02:13:43 That’s the only thing you got. I mean, if you’re an academic in some sense,

02:13:46 well, it isn’t the only thing you’ve got, but it feels that way sometimes.

02:13:49 But you got the actual, we’re all going to be dead soon. You got the joy of having created

02:13:56 the, you know, the credit with Yann. I’ve talked to Jürgen Schmidhuber, right? The Turing Award

02:14:05 given to three people for deep learning. And you could say that a lot of other people should be on

02:14:10 that list. It’s the Nobel Prize question. Yeah, it’s sad. It’s sad. And people like talking about

02:14:16 it. But I feel like in the long arc of history, the only people who will be remembered are Einstein,

02:14:22 Hitler, maybe Elon Musk. And the rest of us are just like…

02:14:27 Well, you know, someone asked me about immortality once and I said,

02:14:31 and I stole this from somebody else. I don’t remember who, but it was,

02:14:34 you know, I asked them, what’s your great grandfather’s name? Any of them? Of course,

02:14:39 they don’t know. Most of us do not know. I mean, I’m not entirely sure. I know my grandparents,

02:14:44 all my grandparents’ names. I know what I called them, right? I don’t know their middle names,

02:14:48 for example. It’s within living memory, so I could find out. Actually, my grandfather

02:14:54 didn’t know when he was born. I had no idea how old he was, right? But I definitely don’t know

02:15:00 who any of my great grandparents are. So in some sense, immortality is doing something preferably

02:15:06 positive so that your great grandchildren know who you are, right? And that’s kind of what you

02:15:11 can hope for, which is very depressing in some ways. I could turn it into something uplifting

02:15:16 if you need me to, but it’s simple, right? It doesn’t matter. I don’t have to know who my great

02:15:23 grandfather was to know that I wouldn’t be here without him. And I don’t know who my great

02:15:28 grandchildren are, certainly not who my great great grandchildren are, and I’ll probably never meet

02:15:32 them. Although I would very much like to, but hopefully I’ll set the world in motion in such

02:15:38 a way that their lives will be better than they would have been if I hadn’t done that. Well,

02:15:41 certainly they wouldn’t have existed if I hadn’t done the things that I did.

02:15:44 So I think that’s a good positive thing: you live on through other people.

02:15:49 Are you afraid of death?

02:15:51 I don’t know if I’m afraid of death, but I don’t like it.

02:15:54 That’s another t-shirt. I mean, do you ponder it? Do you think about the

02:16:02 inevitability of oblivion? I do occasionally. This feels like a very Russian conversation.

02:16:07 I will tell you a story, something that happened to me recently. If you look very carefully,

02:16:14 you will see I have a scar, which by the way, is an interesting story of its own about why people

02:16:20 have half of their thyroid taken out. Some people get scars and some don’t. But anyway, I had half

02:16:26 my thyroid taken out. The way I got there, by the way, is its own interesting story, but I won’t go

02:16:30 into it. Just suffice it to say, I did what I keep telling people you should never do, which is never

02:16:33 go to the doctor unless you have to, because there’s nothing good that’s ever going to come

02:16:36 out of a doctor’s visit. So I went to the doctor to look at one thing. It’s a little bump I had on

02:16:41 the side that I thought might be something bad because my mother made me. And I went there and

02:16:45 he’s like, oh, it’s nothing. But by the way, your thyroid is huge. Can you breathe? Yes,

02:16:49 I can breathe. Are you sure? Because it’s pushing on your windpipe. You should be dead.

02:16:52 So I ended up going there to look at my thyroid. It was growing. I had what’s called a

02:16:59 goiter. And he said, we’re going to have to take it out at some point. When? Sometime before you’re

02:17:03 85, probably. But if you wait till you’re 85, that’ll be really bad because you don’t want to

02:17:09 have surgery when you’re 85 years old, if you can help it. Certainly not the kind of surgery it

02:17:14 takes to take out your thyroid. So I went there and I decided I would put it off until

02:17:21 December 19th because my birthday is December 18th, and I wanted to be able to say I made it to

02:17:26 49 or whatever. So I said, I’ll wait till after my birthday. In the first six months of that,

02:17:32 nothing changed. Apparently in the next three months, it had grown. I hadn’t noticed this at

02:17:39 all. I went and had surgery. They took out half of it. The other half is still there and working

02:17:44 fine, by the way. I don’t have to take a pill or anything like that. It’s great. I’m in the

02:17:49 hospital room and the doctor comes in. I’ve got these things in my arm. They’re going to do

02:17:56 whatever. They’re talking to me. And the anesthesiologist says, huh, your blood

02:18:00 pressure is through the roof. Do you have high blood pressure? I said, no, but I’m terrified if

02:18:04 that helps you at all. And the anesthetist, who’s the nurse who supports the anesthesiologist,

02:18:11 if I got that right, said, oh, don’t worry about it. I’ve just put some stuff in your IV. You’re

02:18:15 going to be feeling pretty good in a couple of minutes. And I remember turning and saying,

02:18:19 well, I’m going to feel pretty good in a couple of minutes. Next thing I know, there’s this guy

02:18:23 and he’s moving my bed. And he’s talking to me and I have this distinct impression that I’ve met

02:18:29 this guy and I should know what he’s talking about, but I kind of just don’t remember what

02:18:35 just happened. And I look up and I see the tiles going by and I’m like, oh, it’s just like in the

02:18:41 movies where you see the tiles go by. And then I have this brief thought that I’m in an infinitely

02:18:47 long warehouse and there’s someone sitting next to me. And I remember thinking, oh, she’s not

02:18:53 talking to me. And then I’m back in the hospital bed. And in between the time where the tiles were

02:19:00 going by and I got in the hospital bed, something like five hours had passed. Apparently it had

02:19:05 grown so much that it was a four-and-a-half-hour procedure instead of an hour-long procedure. I

02:19:09 lost a neck size and a half. It was pretty big. Apparently it was as big as my heart.

02:19:16 Why am I telling you this? I’m telling you this because…

02:19:19 It’s a hell of a story already. Between tiles going by and me waking up in

02:19:25 my hospital bed, no time passed. There was no sensation of time passing.

02:19:31 When I go to sleep and I wake up in the morning, I have this feeling that time has passed. This

02:19:36 feeling that something has physically changed about me. Nothing happened between the time they

02:19:42 put the magic juice in me and the time that I woke up. Nothing. By the way, my wife was there

02:19:47 with me talking. Apparently I was also talking. I don’t remember any of this, but luckily I didn’t

02:19:53 say anything I wouldn’t normally say. My memory of it is I would talk to her and she would teleport

02:19:58 around the room. And then I accused her of witchcraft and that was the end of that.

02:20:03 Her point of view is I would start talking and then I would fall asleep and then I would wake

02:20:07 up and pick up where I left off before. I had no notion of any time passing.

02:20:10 I kind of imagine that that’s death, is the lack of sensation of time passing. And on the one hand,

02:20:20 I am, I don’t know, soothed by the idea that I won’t notice. On the other hand, I’m very unhappy

02:20:28 at the idea that I won’t notice. So I don’t know if I’m afraid of death, but I’m completely sure

02:20:35 that I don’t like it and that I particularly would prefer to discover on my own whether immortality

02:20:41 sucks and be able to make a decision about it. That’s what I would prefer. You like to have a

02:20:47 choice in the matter. I would like to have a choice in the matter. Well, again, on the Russian thing,

02:20:51 I think the finiteness of it is the thing that gives it a little flavor, a little spice. Well,

02:20:57 in reinforcement learning, we believe that. That’s why we have discount factors. Otherwise,

02:21:00 it doesn’t matter what you do. Amen. Well, let me, one last question sticking on the Russian theme.
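
[Editor’s note: a minimal sketch of the discounting idea mentioned just above, added for readers unfamiliar with reinforcement learning. The notation below is standard textbook notation, not something from the conversation. The discounted return from time t is

G_t = \sum_{k=0}^{\infty} \gamma^k \, r_{t+k+1}, \qquad 0 \le \gamma < 1.

With \gamma < 1, an infinite stream of bounded rewards sums to at most r_max / (1 - \gamma), so the return stays finite and near-term rewards count for more. With \gamma = 1 over an unbounded horizon, most reward streams diverge and policies become impossible to compare, which is the sense in which, without discounting, "it doesn’t matter what you do."]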

02:21:09 You talked about not remembering your great grandparents’ names. What do you think is the,

02:21:18 in this kind of Markov chain that is life, what do you think is the meaning of it all?

02:21:26 What’s the meaning of life? Well, in a world where eventually you won’t know who your great

02:21:32 grandchildren are, I’m reminded of something I heard once or I read once that I really like,

02:21:42 which is, it is well worth remembering that the entire universe, save for one trifling exception,

02:21:51 is composed entirely of others. And I think that’s the meaning of life.

02:22:01 Charles, this is one of the best conversations I’ve ever had. And I get to see you tomorrow

02:22:06 again to hang out with someone who looks to be one of the most, how should I say, interesting personalities

02:22:15 that I’ll ever get to meet, Michael Littman. So I can’t wait. I’m excited to have had this

02:22:20 opportunity. Thank you for traveling all the way here. It was amazing. I’m excited. I always love

02:22:25 Georgia Tech. I’m excited to see with you being involved there what the future holds. So thank you

02:22:30 for talking to me. Thank you for having me. I enjoyed every minute of it. Thanks for listening

02:22:34 to this conversation with Charles Isbell and thank you to our sponsors, Neuro, the maker of

02:22:40 functional sugar free gum and mints that I use to give my brain a quick caffeine boost, Decoding

02:22:46 Digital, a podcast on tech and entrepreneurship that I listen to and enjoy, Masterclass, online

02:22:53 courses that I watch from some of the most amazing humans in history, and Cash App, the app I use to

02:23:00 send money to friends for food and drinks. Please check out these sponsors in the description to get

02:23:06 a discount and to support this podcast. If you enjoy this thing, subscribe on YouTube, review it

02:23:11 with five stars on Apple Podcasts, follow on Spotify, support on Patreon, or connect with me

02:23:17 on Twitter at Lex Friedman. And now let me leave you with some poetic words from Martin Luther

02:23:23 King Jr. There comes a time when people get tired of being pushed out of the glittering sunlight

02:23:30 of life’s July and left standing amid the piercing chill of an alpine November.

02:23:37 Thank you for listening and hope to see you next time.