Transcript
00:00:00 The following is a conversation with Charles Isbell
00:00:02 and Michael Littman.
00:00:03 Charles is the Dean of the College of Computing
00:00:06 at Georgia Tech and Michael is a computer science professor
00:00:09 at Brown University.
00:00:10 I’ve spoken with each of them individually on this podcast
00:00:14 and since they are good friends in real life,
00:00:17 we all thought it would be fun
00:00:19 to have a conversation together.
00:00:21 Quick mention of each sponsor,
00:00:23 followed by some thoughts related to the episode.
00:00:26 Thank you to Athletic Greens,
00:00:28 the all in one drink that I start every day with
00:00:30 to cover all my nutritional bases.
00:00:33 Eight Sleep, a mattress that cools itself
00:00:35 and gives me yet another reason to enjoy sleep.
00:00:38 Masterclass, online courses
00:00:40 from some of the most amazing humans in history
00:00:43 and Cash App, the app I use to send money to friends.
00:00:47 Please check out the sponsors in the description
00:00:49 to get a discount and to support this podcast.
00:00:53 As a side note, let me say that having two guests
00:00:55 on the podcast is an experiment
00:00:57 that I’ve been meaning to do for a while.
00:01:00 In particular, because down the road,
00:01:02 I would like to occasionally be a kind of moderator
00:01:05 for debates between people that may disagree
00:01:09 in some interesting ways.
00:01:10 If you have suggestions for who you would like to see debate
00:01:13 on this podcast, let me know.
00:01:16 As with all experiments of this kind,
00:01:18 it is a learning process.
00:01:20 Both the video and the audio might need improvement.
00:01:23 I realized I think I should probably do
00:01:25 three or more cameras next time
00:01:27 as opposed to just two.
00:01:28 And also try different ways to mount the microphone
00:01:31 for the third person.
00:01:34 Also, after recording this intro,
00:01:36 I’m going to have to go figure out the thumbnail
00:01:40 for the video version of the podcast
00:01:42 since I usually put the guest’s head on the thumbnail.
00:01:45 And now there’s two heads and two names
00:01:49 to try to fit into the thumbnail.
00:01:52 It’s a kind of a bin packing problem
00:01:55 which in theoretical computer science
00:01:58 happens to be an NP-hard problem.
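(For the curious: bin packing is the problem of fitting items of assorted sizes into as few fixed-capacity bins as possible, and the general problem is NP-hard, so in practice people rely on heuristics. Below is a minimal sketch of the classic first-fit-decreasing heuristic in Python; the item sizes and capacity are made-up numbers, purely for illustration.)

```python
# First-fit-decreasing heuristic for one-dimensional bin packing.
# Exact bin packing is NP-hard; this greedy heuristic is a standard,
# reasonably good approximation.

def first_fit_decreasing(sizes, capacity):
    """Pack items of the given sizes into bins of the given capacity."""
    bins = []       # each bin is a list of item sizes
    remaining = []  # remaining capacity of each bin
    for size in sorted(sizes, reverse=True):  # place the largest items first
        for i, space in enumerate(remaining):
            if size <= space:                 # fits in an existing bin
                bins[i].append(size)
                remaining[i] -= size
                break
        else:                                 # no existing bin fits: open a new one
            bins.append([size])
            remaining.append(capacity - size)
    return bins

# Hypothetical example: items packed into bins of capacity 10.
print(first_fit_decreasing([7, 5, 4, 3, 2, 2, 1], capacity=10))
# -> [[7, 3], [5, 4, 1], [2, 2]]
```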
00:02:02 Whatever I come up with, if you have better ideas
00:02:04 for the thumbnail, let me know as well.
00:02:06 And in general, I always welcome ideas
00:02:08 how this thing can be improved.
00:02:10 If you enjoy it, subscribe on YouTube,
00:02:12 review it with five stars on Apple Podcasts,
00:02:15 follow on Spotify, support on Patreon,
00:02:17 or connect with me on Twitter at Lex Fridman.
00:02:21 And now here’s my conversation with Charles Isbell
00:02:24 and Michael Littman.
00:02:27 You’ll probably disagree about this question,
00:02:30 but what is your biggest, would you say, disagreement
00:02:33 about either something profound and very important
00:02:37 or something completely not important at all?
00:02:39 I don’t think we have any disagreements at all.
00:02:42 I’m not sure that’s true.
00:02:44 We walked into that one, didn’t we?
00:02:45 So one thing that you sometimes mention is that,
00:02:48 and we did this one on air too, as it were,
00:02:51 whether or not machine learning
00:02:52 is computational statistics.
00:02:55 It’s not.
00:02:57 But it is.
00:02:57 Well, it’s not.
00:02:58 And in particular, and more importantly,
00:03:00 it is not just computational statistics.
00:03:02 So what’s missing in the picture?
00:03:04 All the rest of it.
00:03:05 What’s missing?
00:03:07 That which is missing.
00:03:08 Oh, yes, well, you can’t be wrong now.
00:03:10 Well, it’s not just the statistics.
00:03:11 He doesn’t even believe this.
00:03:12 We’ve had this conversation before.
00:03:14 If it were just the statistics,
00:03:15 then we would be happy with where we are.
00:03:18 But it’s not just the statistics.
00:03:19 That’s why it’s computational statistics.
00:03:21 Or if it were just the computational.
00:03:22 I agree that machine learning is not just statistics.
00:03:24 It is not just statistics.
00:03:25 We can agree on that.
00:03:26 Nor is it just computational statistics.
00:03:28 It’s computational statistics.
00:03:29 It is computational.
00:03:30 What is the computational in computational statistics?
00:03:33 Does this take us into the realm of computing?
00:03:35 It does, but I think perhaps the way I can get him
00:03:37 to admit that he’s wrong is that it’s about rules.
00:03:43 It’s about rules.
00:03:44 It’s about symbols.
00:03:45 It’s about all these other things.
00:03:45 But statistics is not about rules?
00:03:47 I’m gonna say statistics is about rules.
00:03:48 But it’s not just the statistics, right?
00:03:50 It’s not just a random variable that you choose
00:03:51 and you have a probability.
00:03:52 I think you have a narrow view of statistics.
00:03:54 Okay, well then what would be the broad view of statistics
00:03:56 that would still allow it to be statistics
00:03:58 and not say history that would make
00:04:01 computational statistics okay?
00:04:03 Well, okay, so I had my first sort of research mentor,
00:04:07 a guy named Tom Landauer,
00:04:09 taught me to do some statistics, right?
00:04:12 And I was annoyed all the time
00:04:14 because the statistics would say
00:04:16 that what I was doing was not statistically significant.
00:04:19 And I was like, but, but, and basically what he said to me
00:04:23 is statistics is how you’re gonna keep
00:04:25 from lying to yourself, which I thought was really deep.
00:04:29 It is a way to keep yourself honest in a particular way.
00:04:33 I agree with that.
00:04:34 Yeah, and so you’re trying to find rules.
00:04:36 I’m just gonna bring it back to rules.
00:04:38 Wait, wait, wait, could you possibly try to define rules?
00:04:44 Even regular statisticians, noncomputational statisticians,
00:04:47 do spend some of their time evaluating rules, right?
00:04:51 Applying statistics to try to understand
00:04:52 does this rule capture this?
00:04:54 Does this not capture that?
00:04:55 You mean like hypothesis testing kind of thing?
00:04:57 Or like confidence intervals?
00:04:59 I think more like hypothesis.
00:05:01 Like I feel like the word statistic
00:05:03 literally means like a summary,
00:05:04 like a number that summarizes other numbers.
00:05:06 But I think the field of statistics
00:05:08 actually applies that idea to things like rules,
00:05:11 to understand whether or not a rule is valid.
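(To make the hypothesis-testing idea concrete, here is a minimal sketch of the kind of check being described: asking whether a rule’s apparent advantage over a baseline could plausibly be chance. The numbers are fabricated for illustration and the sketch assumes NumPy and SciPy are available; it is not code from either guest.)

```python
# Using statistics to "keep from lying to yourself": test whether a
# candidate rule really beats a baseline, or whether the apparent
# difference could just be noise. All data here is simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-example outcomes (1 = correct, 0 = wrong) for two rules
# evaluated on the same 200 examples.
baseline_rule = rng.binomial(1, 0.70, size=200)
candidate_rule = rng.binomial(1, 0.74, size=200)

# Paired t-test on the per-example differences: is the improvement
# distinguishable from zero?
result = stats.ttest_rel(candidate_rule, baseline_rule)
print(f"mean improvement: {np.mean(candidate_rule - baseline_rule):+.3f}")
print(f"p-value: {result.pvalue:.3f} (large p-value: could easily be chance)")
```

A small p-value only says the measured difference is unlikely under chance, which is exactly the honesty check described above; it says nothing about whether the improvement is large enough to matter.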
00:05:15 Is software engineering statistics?
00:05:18 No.
00:05:19 Programming languages statistics?
00:05:20 No.
00:05:21 Because I think there’s a very,
00:05:22 it’s useful to think about a lot of what AI
00:05:24 and machine learning is or certainly should be
00:05:26 as software engineering, as programming languages.
00:05:29 Just to put it in language that you might understand,
00:05:33 the hyperparameters beyond the problem itself.
00:05:35 The hyperparameters is too many syllables
00:05:37 for me to understand.
00:05:37 The hyperparameters.
00:05:39 That’s better.
00:05:40 That goes around it, right?
00:05:41 It’s the decisions you choose to make.
00:05:42 It’s the metrics you choose to use.
00:05:44 It’s the loss function.
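(As a rough illustration of the decisions being listed, and assuming a reasonably recent scikit-learn, here is a minimal sketch in which the same learner is run under different choices of loss function, hyperparameter, and reported metric; the dataset and settings are arbitrary.)

```python
# The "practice" decisions that surround a learner: which loss you optimize,
# which hyperparameters you set, and which metric you choose to report.
# Minimal sketch; the synthetic dataset and grids are arbitrary.
from sklearn.datasets import make_classification
from sklearn.linear_model import SGDClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

for loss in ["hinge", "log_loss"]:            # choice of loss function
    for alpha in [1e-4, 1e-2]:                # choice of hyperparameter
        for metric in ["accuracy", "f1"]:     # choice of reported metric
            model = SGDClassifier(loss=loss, alpha=alpha, random_state=0)
            score = cross_val_score(model, X, y, scoring=metric, cv=5).mean()
            print(f"loss={loss:<8} alpha={alpha:g} metric={metric:<8} {score:.3f}")
```

None of those choices are dictated by the statistics of the data alone; they are engineering decisions, which is the distinction being drawn here.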
00:05:45 You wanna say the practice of machine learning
00:05:48 is different than the practice of statistics.
00:05:50 Like the things you have to worry about
00:05:51 and how you worry about them are different,
00:05:53 therefore they’re different.
00:05:54 Right.
00:05:55 At a very little, I mean, at the very least.
00:05:57 It’s that much is true.
00:05:59 It doesn’t mean that statistics,
00:06:00 computational or otherwise aren’t important.
00:06:02 I think they are.
00:06:03 I mean, I do a lot of that, for example.
00:06:05 But I think it goes beyond that.
00:06:06 I think that we could think about game theory
00:06:09 in terms of statistics,
00:06:10 but I don’t think it’s as useful to do.
00:06:12 I mean, the way I would think about it
00:06:14 or a way I would think about it is this way.
00:06:17 Chemistry is just physics.
00:06:19 But I don’t think it’s as useful to think about chemistry
00:06:22 as being just physics.
00:06:23 It’s useful to think about it as chemistry.
00:06:25 The level of abstraction really matters here.
00:06:27 So I think it is,
00:06:28 there are contexts in which it is useful.
00:06:30 Yes.
00:06:31 I think of it that way, right?
00:06:32 So finding that connection is actually helpful.
00:06:33 And I think that’s when I emphasize
00:06:34 the computational statistics thing.
00:06:36 I think I want to befriend statistics and not absorb them.
00:06:41 Here’s a way to think about it
00:06:43 beyond what I just said, right?
00:06:44 So what would you say,
00:06:47 and I want you to think back to a conversation
00:06:48 we had a very long time ago.
00:06:49 What would you say is the difference between,
00:06:52 say, the early 2000s, ICML
00:06:54 and what we used to call NIPS, NeurIPS?
00:06:57 Is there a difference?
00:06:58 A lot of, particularly on the machine learning
00:06:59 that was done there?
00:07:00 ICML was around that long?
00:07:02 Oh, yeah.
00:07:03 So ICLR is the new conference, newish.
00:07:06 Yeah, I guess so.
00:07:07 And ICML was around in 2000.
00:07:10 So ICML predates that.
00:07:12 I think my most cited ICML paper is from 94.
00:07:15 Michael knows this better than me
00:07:16 because, of course, he’s significantly older than I am.
00:07:18 But the point is, what is the difference
00:07:20 between ICML and NeurIPS in the late 90s, early 2000s?
00:07:24 I don’t know what everyone else’s perspective would be,
00:07:26 but I had a particular perspective at that time,
00:07:28 which is I felt like ICML was more
00:07:31 of a computer science place
00:07:33 and that NIPS, NeurIPS was more of an engineering place,
00:07:37 like the kind of math that happened at the two places.
00:07:40 As a computer scientist,
00:07:41 I felt more comfortable with the ICML math.
00:07:44 And the NeurIPS people would say
00:07:46 that that’s because I’m dumb.
00:07:48 And that’s such an engineering thing to say, so.
00:07:51 I agree with that part of it,
00:07:52 but I do it a little differently.
00:07:53 We actually had a nice conversation
00:07:54 with Tom Dietrich about this in public.
00:07:57 On Twitter just a couple of days ago.
00:07:58 I put it a little differently,
00:07:59 which is that ICML was machine learning done
00:08:02 by a computer scientist.
00:08:04 And NeurIPS was machine learning done
00:08:07 by a computer scientist trying to impress statisticians.
00:08:12 Which was weird because it was the same people,
00:08:15 at least by the time I started paying attention.
00:08:17 But it just felt very, very different.
00:08:18 And I think that that perspective
00:08:20 of whether you’re trying to impress the statisticians
00:08:22 or you’re trying to impress the programmers
00:08:24 is actually very different and has real impact
00:08:26 on what you choose to worry about
00:08:29 and what kind of outcomes you come to.
00:08:31 So I think it really matters.
00:08:32 I think computational statistics is a means to an end.
00:08:34 It is not an end in some sense.
00:08:36 And I think that really matters here
00:08:39 in the same way that I don’t think computer science
00:08:40 is just engineering or just science
00:08:42 or just math or whatever.
00:08:43 Okay, so I’d have to now agree
00:08:44 that now we agree on everything.
00:08:46 Yes, yes.
00:08:47 The important thing here is that
00:08:50 my opinions may have changed,
00:08:51 but not the fact that I’m right,
00:08:53 I think is what we just came to.
00:08:54 Right, and my opinions may have changed
00:08:55 and not the fact that I’m wrong.
00:08:57 That’s right.
00:08:59 You lost me.
00:08:59 I’m not even.
00:09:00 I think I lost myself there too.
00:09:01 But anyway, we’re back.
00:09:04 This happens to us sometimes.
00:09:05 We’re sorry.
00:09:06 How does neural networks change this,
00:09:08 just to even linger on this topic,
00:09:11 change this idea of statistics,
00:09:15 how big of a pie statistics is
00:09:17 within the machine learning thing?
00:09:19 Like, because it sounds like hyperparameters
00:09:22 and also just the role of data.
00:09:24 You know, people are starting to use
00:09:25 this terminology of software 2.0,
00:09:28 which is like the act of programming
00:09:31 as a, like you’re a designer
00:09:35 in the hyperparameter space of neural networks,
00:09:38 and you’re also the collector and the organizer
00:09:40 and the cleaner of the data,
00:09:44 and that’s part of the programming.
00:09:47 So how did, on the NeurIPS versus ICML topic,
00:09:52 what’s the role of neural networks
00:09:54 in redefining the size and the role of machine learning?
00:09:57 I can’t wait to hear what Michael thinks about this,
00:10:00 but I would add one.
00:10:01 But you will.
00:10:02 That’s true, I will, I’ll force myself to.
00:10:04 I think there’s one other thing
00:10:06 I would add to your description,
00:10:07 which is the kind of software engineering part
00:10:09 of what does it mean to debug, for example.
00:10:10 But this is a difference between
00:10:13 the kind of computational statistics view
00:10:14 of machine learning and the computational view
00:10:16 of machine learning, which is, I think,
00:10:18 one is worried about the equation, as it were.
00:10:20 And by the way, this is not a value judgment.
00:10:23 I just think it’s about perspective.
00:10:24 But the kind of questions you would ask
00:10:26 when you start asking yourself,
00:10:27 well, what does it mean to program
00:10:28 and develop and build the system,
00:10:29 is a very computer sciencey view of the problem.
00:10:33 I mean, if you get on data science Twitter
00:10:35 and econ Twitter, you actually hear this a lot
00:10:39 with the economist and the data scientist
00:10:43 complaining about the machine learning people.
00:10:44 Well, it’s just statistics,
00:10:46 and I don’t know why they don’t see this.
00:10:47 But they’re not even asking the same questions.
00:10:49 They’re not thinking about it
00:10:50 as a kind of programming problem.
00:10:53 And I think that that really matters,
00:10:54 just asking this question.
00:10:55 I actually think it’s a little different
00:10:57 from programming in hyperparameter space
00:11:00 and sort of collecting the data.
00:11:03 But I do think that that immersion really matters.
00:11:06 So I’ll give you a quick example
00:11:07 of the way I think about this.
00:11:08 So I teach machine learning.
00:11:09 Michael and I have co taught a machine learning class,
00:11:12 which has now reached, I don’t know, 10,000 people at least
00:11:14 over the last several years, or somewhere thereabouts.
00:11:17 And my machine learning assignments are of this form.
00:11:21 So the first one is something like,
00:11:23 implement these five algorithms,
00:11:25 KNN and SVMs and boosting and decision trees
00:11:29 and neural networks, and maybe that’s it, I can’t remember.
00:11:32 And when I say implement, I mean steal the code.
00:11:34 I am completely uninterested.
00:11:36 You get zero points for getting the thing to work.
00:11:38 I don’t want you spending your time worrying about
00:11:41 getting the corner case right of what happens
00:11:44 when you are trying to normalize distances
00:11:46 and the points on the thing.
00:11:47 And so you divide by zero.
00:11:48 I’m not interested in that, right?
00:11:50 Steal the code.
00:11:51 However, you’re going to run those algorithms
00:11:54 on two data sets.
00:11:55 The data sets have to be interesting.
00:11:57 What does it mean to be interesting?
00:11:58 Well, a data set’s interesting if it reveals differences
00:12:01 between algorithms, which presumably are all the same
00:12:03 because they can represent whatever they can represent.
00:12:06 And two data sets are interesting together
00:12:07 if they show different differences, as it were.
00:12:10 And you have to analyze them.
00:12:11 You have to justify their interestingness
00:12:13 and you have to analyze them in a whole bunch of ways.
00:12:15 But all I care about is the data in your analysis,
00:12:17 not the programming.
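(As a rough sketch of what an assignment in that spirit looks like in practice, and not the actual course code, the learners below are off-the-shelf scikit-learn implementations and the two datasets are arbitrary built-in ones; the interesting part is where the datasets separate the algorithms, and the analysis of why is the deliverable.)

```python
# "Steal the code, analyze the data": run several off-the-shelf learners on
# two different datasets and look at where their performance diverges.
# The datasets and settings are illustrative, not the course's actual ones.
from sklearn.datasets import load_breast_cancer, load_digits
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

algorithms = {
    "k-NN": KNeighborsClassifier(),
    "SVM": SVC(),
    "Boosting": AdaBoostClassifier(),
    "Decision tree": DecisionTreeClassifier(random_state=0),
    "Neural net": MLPClassifier(max_iter=1000, random_state=0),
}
datasets = {
    "breast_cancer": load_breast_cancer(return_X_y=True),
    "digits": load_digits(return_X_y=True),
}

# A pair of datasets is "interesting" if the ranking of algorithms changes
# between them; explaining why is the point of the exercise.
for data_name, (X, y) in datasets.items():
    print(f"\n{data_name}")
    for algo_name, model in algorithms.items():
        pipeline = make_pipeline(StandardScaler(), model)  # scale, then fit
        score = cross_val_score(pipeline, X, y, cv=5).mean()
        print(f"  {algo_name:<14} mean accuracy {score:.3f}")
```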
00:12:18 And I occasionally end up in these long discussions
00:12:20 with students, well, I don’t really,
00:12:22 I copy and paste the things that I’ve said
00:12:24 the other 15,000 times it’s come up,
00:12:26 which is they go, but the only way to learn,
00:12:29 really understand is to code them up,
00:12:31 which is a very programmer,
00:12:33 software engineering view of the world.
00:12:35 If you don’t program it, you don’t understand it,
00:12:37 which is, by the way, I think is wrong
00:12:39 in a very specific way.
00:12:40 But it is a way that you come to understand
00:12:42 because then you have to wrestle with the algorithm.
00:12:44 But the thing about machine learning
00:12:45 is it’s not just sorting numbers
00:12:47 where in some sense the data doesn’t matter.
00:12:49 What matters is, well, does the algorithm work
00:12:50 on these abstract things, one less than the other?
00:12:53 In machine learning, the data matters.
00:12:54 It matters more than almost anything.
00:12:57 And not everything, but almost anything.
00:12:59 And so as a result, you have to live with the data
00:13:02 and don’t get distracted by the algorithm per se.
00:13:04 And I think that that focus on the data
00:13:07 and what it can tell you
00:13:09 and what question it’s actually answering for you
00:13:11 as opposed to the question you thought you were asking
00:13:14 is a key and important thing about machine learning
00:13:16 and is a way that computationalists
00:13:18 as opposed to statisticians bring a particular view
00:13:21 about how to think about the process.
00:13:23 The statisticians, by contrast, bring,
00:13:25 I think I’d be willing to say,
00:13:27 a better view about the kind of formal math that’s behind it
00:13:31 and what an actual number ultimately is saying
00:13:35 about the data.
00:13:35 And those are both important, but they’re also different.
00:13:38 I didn’t really think of it this way
00:13:40 is to build intuition about the role of data,
00:13:44 the different characteristics of data
00:13:45 by having two data sets that are different
00:13:48 and they reveal the differences in the differences.
00:13:50 That’s a really fascinating,
00:13:52 that’s a really interesting educational approach.
00:13:55 The students love it, but not right away.
00:13:57 No, they love it at the end.
00:13:58 They love it later.
00:13:59 They love it at the end.
00:14:00 Not at the beginning.
00:14:02 Not even immediately after.
00:14:04 I feel like there’s a deep profound lesson
00:14:06 about education there.
00:14:07 Yeah.
00:14:08 That you can’t listen to students
00:14:10 about whether what you’re doing is the right
00:14:14 or the wrong thing.
00:14:15 Yeah, well, as a wise Michael Littman once said to me
00:14:19 about children, which I think applies to teaching,
00:14:22 is you have to give them what they need
00:14:24 without bending to their will.
00:14:27 And students are like that.
00:14:28 You have to figure out what they need.
00:14:29 You’re a curator.
00:14:29 Your whole job is to curate and to present
00:14:32 because on their own,
00:14:33 they’re not gonna necessarily know where to search.
00:14:35 So you’re providing pushes in some direction
00:14:37 in learning space and you have to give them what they need
00:14:42 in a way that keeps them engaged enough
00:14:44 so that they eventually discover what they want
00:14:46 and they get the tools they need to go
00:14:48 and learn other things off of.
00:14:50 What’s your view?
00:14:52 Let me put on my Russian hat,
00:14:54 which believes that life is suffering.
00:14:55 I like Russian hats, by the way.
00:14:56 If you have one, I would like this.
00:14:58 Those are ridiculous, yes.
00:14:59 But in a delightful way.
00:15:01 But sure.
00:15:04 What do you think is the role of,
00:15:06 we talked about balance a little bit.
00:15:08 What do you think is the role of hardship in education?
00:15:11 Like I think the biggest things I’ve learned,
00:15:16 like what made me fall in love with math, for example,
00:15:20 is by being bad at it until I got good at it.
00:15:24 So like struggling with a problem,
00:15:28 which increased the level of joy I felt
00:15:31 when I finally figured it out.
00:15:33 And it always felt with me, with teachers,
00:15:37 especially modern discussions of education,
00:15:39 how can we make education more fun,
00:15:42 more engaging, more all those things?
00:15:44 Or from my perspective, it’s like,
00:15:46 you’re maybe missing the point
00:15:49 that education, that life is suffering.
00:15:52 Education is supposed to be hard
00:15:54 and that actually what increases the joy you feel
00:15:57 when you actually learn something.
00:15:59 Is that ridiculous?
00:16:02 Do you like to see your students suffer?
00:16:04 Okay, so this may be a point where we differ.
00:16:07 I suspect not.
00:16:08 I’m gonna do go on.
00:16:10 Well, what would your answer be?
00:16:11 I wanna hear you first.
00:16:12 Okay, well, I was gonna not answer the question.
00:16:14 You don’t want the students to know you enjoy them suffering?
00:16:18 No, no, no, no, no, no.
00:16:19 I was gonna say that there’s,
00:16:21 I think there’s a distinction that you can make
00:16:23 in the kind of suffering, right?
00:16:25 So I think you can be in a mode
00:16:27 where you’re suffering in a hopeless way
00:16:30 versus you’re suffering in a hopeful way, right?
00:16:33 Where you’re like, you can see that if you,
00:16:37 that you still have,
00:16:39 you can still imagine getting to the end, right?
00:16:41 And as long as people are in that mindset
00:16:43 where they’re struggling,
00:16:44 but it’s not a hopeless kind of struggling,
00:16:47 that’s productive.
00:16:49 I think that’s really helpful.
00:16:50 But it’s struggling, like if you break their will,
00:16:53 if you leave them hopeless.
00:16:56 No, that don’t, sure, some people are gonna,
00:16:58 whatever, lift themselves up by their bootstraps,
00:17:00 but like mostly you give up
00:17:01 and certainly it takes the joy out of it.
00:17:03 And you’re not gonna spend a lot of time
00:17:05 on something that brings you no joy.
00:17:07 So it is a bit of a delicate balance, right?
00:17:10 You have to thwart people in a way
00:17:12 that they still believe that there’s a way through.
00:17:17 Right, so that’s a, we strongly agree actually.
00:17:20 So I think, well, first off,
00:17:21 struggling and suffering aren’t the same thing, right?
00:17:24 Yeah, just being poetic.
00:17:25 Oh, no, no, I actually appreciate the poetry.
00:17:27 And one of the reasons I appreciate it
00:17:29 is that they are often the same thing
00:17:31 and often quite different, right?
00:17:32 So you can struggle without suffering,
00:17:34 you can certainly suffer pretty easily.
00:17:37 You don’t necessarily have to struggle to suffer.
00:17:38 So I think that you want people to struggle,
00:17:41 but that hope matters.
00:17:42 You have to, they have to understand
00:17:44 that they’re gonna get through it on the other side.
00:17:46 And it’s very easy to confuse the two.
00:17:50 I actually think Brown University has a very,
00:17:52 just philosophically has a very different take
00:17:55 on the relationship with their students,
00:17:56 particularly undergrads from say a place like Georgia Tech,
00:17:59 which is.
00:18:00 Which university is better?
00:18:01 Well, I have my opinions on that.
00:18:03 I mean, remember, Charles said,
00:18:05 it doesn’t matter what the facts are, I’m always right.
00:18:07 The correct answer is that it doesn’t matter,
00:18:09 they’re different.
00:18:10 But clearly, clearly the answer is different.
00:18:14 He went to a school like the school
00:18:16 where he is as an undergrad.
00:18:18 I went to a school, specifically the same school,
00:18:21 though it was changed a bit in the intervening years.
00:18:23 Brown or Georgia Tech?
00:18:24 No, I was talking about Georgia Tech.
00:18:25 And I went to an undergrad place
00:18:28 that’s a lot like the place where I work now.
00:18:29 And so it does seem like we’re more familiar
00:18:32 with these models.
00:18:33 So there’s a similarity between Brown and Yale?
00:18:35 Yeah, I think they’re quite similar, yeah.
00:18:38 And Duke.
00:18:39 Duke has some similarities too,
00:18:40 but it’s got a little Southern draw.
00:18:42 You’ve kind of worked your,
00:18:43 you’ve sort of worked at universities
00:18:45 that are like the places where you learned.
00:18:50 And the same would be true for me.
00:18:52 Are you uncomfortable venturing outside the box?
00:18:56 Is that what you’re saying?
00:18:57 Journeying out?
00:18:58 That’s not what I’m saying.
00:18:58 Yeah, Charles is definitely.
00:19:00 He only goes to places
00:19:01 that have institute in the name, right?
00:19:02 It has worked out that way.
00:19:04 Well, academic places anyway.
00:19:06 Well, no, I was a visiting scientist at UPenn
00:19:08 or visiting something at UPenn.
00:19:11 Oh, wow, I just understood your joke.
00:19:14 Which one?
00:19:14 Five minutes later.
00:19:18 I like to set the sort of time bomb.
00:19:20 The institute is in the,
00:19:22 that Charles only goes to places
00:19:23 that have institute in the name.
00:19:25 So I guess Georgia,
00:19:27 I forget that Georgia Tech
00:19:28 is Georgia Institute of Technology.
00:19:30 The number of people who refer to it
00:19:32 as Georgia Tech University is large
00:19:34 and incredibly irritating.
00:19:35 It’s one of the few things
00:19:37 that genuinely gets under my skin.
00:19:39 But like schools like Georgia Tech and MIT
00:19:41 have as part of the ethos,
00:19:42 like there is,
00:19:43 I wanna say there’s an abbreviation
00:19:45 that someone taught me,
00:19:47 like IHTFP, something like that.
00:19:49 Like there’s an expression
00:19:51 which is basically I hate being here,
00:19:53 which they say so proudly.
00:19:55 And that is definitely not the ethos at Brown.
00:19:57 Like Brown is,
00:19:58 there’s a little more pampering
00:20:01 and empowerment and stuff.
00:20:02 And it’s not like we’re gonna crush you
00:20:03 and you’re gonna love it.
00:20:04 So yeah, I think there’s a,
00:20:06 I think the ethoses are different.
00:20:09 That’s interesting, yeah.
00:20:10 We had drownproofing.
00:20:12 What’s that?
00:20:12 In order to graduate from Georgia Tech,
00:20:14 this is a true thing.
00:20:15 Feel free to look it up.
00:20:16 If you.
00:20:17 A lot of schools have this by the way.
00:20:19 No, actually Georgia Tech was barely the first.
00:20:20 Brandeis has it.
00:20:21 Had it.
00:20:23 I feel like Georgia Tech was the first
00:20:25 in a lot of things.
00:20:27 It was the first in a lot of things.
00:20:28 Had the first master’s degree.
00:20:29 First Bumblebee mascot.
00:20:30 Stop that.
00:20:32 First master’s in computer science actually.
00:20:34 Right, online master’s.
00:20:35 Well that too, but way back in the 60s.
00:20:37 NSF grant.
00:20:38 Yeah, yeah.
00:20:39 It was the first information
00:20:40 and computer science master’s degree in the country.
00:20:42 But the Georgia Tech,
00:20:45 it used to be the case
00:20:46 that in order to graduate from Georgia Tech,
00:20:48 you had to take a drownproofing class.
00:20:49 Where effectively, they threw you in the water
00:20:52 and tied you up.
00:20:53 If you didn’t drown, you got to graduate.
00:20:54 Tied you up?
00:20:55 I believe so.
00:20:56 No.
00:20:57 There were certainly versions of it,
00:20:58 but I mean luckily they ended it
00:21:00 just before I had to graduate
00:21:01 because otherwise I would have never graduated.
00:21:03 It wasn’t going to happen.
00:21:04 I want to say 84, 83,
00:21:06 somewhere around then they ended it.
00:21:08 But yeah, you used to have to prove
00:21:10 you could tread water for some ridiculous amount of time
00:21:13 or you couldn’t graduate.
00:21:14 Two minutes.
00:21:14 No, it was more than two minutes.
00:21:15 I bet it was two minutes.
00:21:16 Okay, well we’ll look at it.
00:21:17 And it was in a bathtub.
00:21:18 Yeah, right.
00:21:19 You could just stare.
00:21:20 It was in a pool.
00:21:20 But it was a real thing.
00:21:21 But that idea that, you know, push you.
00:21:23 Fully clothed.
00:21:24 Yeah, fully clothed.
00:21:25 I bet it was that and not tied up.
00:21:27 Because who needs to learn how to swim when you’re tied?
00:21:30 Nobody.
00:21:31 But who needs to learn to swim
00:21:32 when you’re actually falling into the water dressed?
00:21:34 That’s a real thing.
00:21:35 I think your facts are getting in the way
00:21:36 with a good story.
00:21:37 Oh, that’s fair.
00:21:38 That’s fair.
00:21:39 I didn’t mean to.
00:21:40 All right, so they tie you up.
00:21:40 Sometimes the narrative matters.
00:21:41 But whatever it was, you had to,
00:21:43 it was called drownproofing for a reason.
00:21:44 The point of the story, Michael, is that it’s,
00:21:49 well, no, but that’s good.
00:21:50 It doesn’t bring it back to struggle.
00:21:52 That’s a part of what Georgia Tech has always been.
00:21:54 And we struggle with that, by the way,
00:21:56 about what we want to be, particularly as things go.
00:21:59 But you sort of,
00:22:02 how much can you be pushed without breaking?
00:22:06 And you come out of the other end stronger, right?
00:22:08 There’s a saying we used to have
00:22:09 when I was an undergrad there.
00:22:10 It was just Georgia Tech,
00:22:11 building tomorrow the night before.
00:22:13 Right?
00:22:14 And it was just kind of idea that,
00:22:17 give me something impossible to do
00:22:19 and I’ll do it in a couple of days
00:22:20 because that’s what I just spent
00:22:21 the last four or five or six years with.
00:22:24 That ethos definitely stuck to you.
00:22:26 Having now done a number of projects with you,
00:22:28 you definitely will do it the night before.
00:22:30 That’s not entirely true.
00:22:31 There’s nothing wrong with waiting until the last minute.
00:22:33 The secret is knowing when the last minute is.
00:22:35 Right, that’s brilliantly put.
00:22:38 Yeah, that is a definite Charles statement
00:22:41 that I am trying not to embrace.
00:22:44 And I appreciate that
00:22:45 because you helped move my last minute up.
00:22:47 That’s the social construct
00:22:49 the way you converge together
00:22:50 what the definition of last minute is.
00:22:53 We figure that all out together.
00:22:54 In fact, MIT, I’m sure a lot of universities have this,
00:22:58 but MIT has like MIT time
00:23:00 that everyone has always agreed together
00:23:03 that there is such a concept
00:23:05 and everyone just keeps showing up like 10 to 15 to 20 minutes,
00:23:08 depending on the department, late to everything.
00:23:11 So there’s like a weird drift that happens.
00:23:13 It’s kind of fascinating.
00:23:14 Yeah, we’re five minutes.
00:23:15 We’re five minutes.
00:23:16 In fact, the classes will say,
00:23:18 well, this is no longer true actually,
00:23:20 but it used to be a class that started at eight,
00:23:22 but actually it started at eight oh five,
00:23:24 it ends at nine, actually it ends at eight fifty five.
00:23:26 Everything’s five minutes off
00:23:27 and nobody expects anything to start until five minutes
00:23:29 after the half hour, whatever it is.
00:23:31 It still exists.
00:23:32 It hurts my head.
00:23:33 Well, let’s rewind the clock back to the fifties and sixties
00:23:37 when you guys met, how did you,
00:23:39 I’m just kidding, I don’t know.
00:23:40 But what, can you tell the story of how you met?
00:23:43 So you’ve, like the internet and the world
00:23:45 kind of knows you as connected in some ways
00:23:50 in terms of education of teaching the world.
00:23:53 That’s like the public facing thing,
00:23:54 but how did you as human beings
00:23:56 and as collaborators meet?
00:24:00 I think there’s two stories.
00:24:01 One is how we met and the other is how we
00:24:05 got to know each other.
00:24:06 I’m not gonna say fell in love.
00:24:08 I’m gonna say that we came to understand that we
00:24:11 Had some common something.
00:24:13 Yeah, it’s funny.
00:24:14 Cause on the surface, I think we’re different
00:24:16 in a lot of ways, but there’s something
00:24:18 Yeah, I mean, now we complete each other’s
00:24:21 There you go.
00:24:22 Afternoon.
00:24:23 So I will tell the story of how we met
00:24:25 and I’ll let Michael tell the story of how we met.
00:24:27 Okay, all right.
00:24:28 Okay, so here’s how we met.
00:24:30 I was already at that point, it was AT&T labs.
00:24:32 There’s a long, interesting story there.
00:24:34 But anyway, I was there and Michael was coming to interview.
00:24:38 He was a professor at Duke at the time,
00:24:40 but decided for reasons that he wanted to be in New Jersey.
00:24:45 And so that would mean Bell Labs slash AT&T labs.
00:24:48 And we were doing the interview.
00:24:49 Interviews are very much like academic interviews.
00:24:51 And so I had to be there.
00:24:53 We all had to meet with him afterwards
00:24:54 and so on, one on one.
00:24:56 But it was obvious to me that he was going to be hired.
00:24:59 Like no matter what, because everyone loved him.
00:25:01 They were just talking about all the great stuff he did.
00:25:03 Oh, he did this great thing.
00:25:04 And you had just won something at AAAI, I think.
00:25:06 Or maybe you got 18 papers in AAAI that year.
00:25:08 I got the best paper award at AAAI
00:25:10 for the crossword stuff.
00:25:11 Right, exactly.
00:25:12 So that had all happened and everyone was going on
00:25:14 and on and on about it.
00:25:14 Actually, so Satinder was saying incredibly nice things
00:25:16 about you.
00:25:17 Really?
00:25:18 Yes.
00:25:19 He can be very grumpy.
00:25:19 Yes.
00:25:20 That’s nice to hear.
00:25:21 He was grumpily saying very nice things.
00:25:22 Oh, that makes sense.
00:25:23 And that does make sense.
00:25:24 So, you know, it was going to come.
00:25:25 So why was I meeting him?
00:25:28 I had something else I had to do.
00:25:29 I can’t remember what it was.
00:25:29 It probably involved comedy.
00:25:31 So he remembers meeting me
00:25:32 as inconveniencing his afternoon.
00:25:34 So he came.
00:25:35 So I eventually came to my office.
00:25:36 I was in the middle of trying to do something.
00:25:37 I can’t remember what.
00:25:37 And he came and he sat down.
00:25:38 And for reasons that are purely accidental,
00:25:41 despite what Michael thinks,
00:25:42 my desk at the time was set up
00:25:44 in such a way that it had sort of an L shape.
00:25:46 And the chair on the outside was always lower
00:25:48 than the chair that I was in.
00:25:50 And, you know, the kind of point was to…
00:25:52 The only reason I think that it was on purpose
00:25:54 is because you told me it was on purpose.
00:25:56 I don’t remember that.
00:25:57 Anyway, the thing is, is that, you know, it kind of gives…
00:25:59 His guest chair was really low
00:26:00 so that he could look down at everybody.
00:26:02 The idea was just to simply create a nice environment
00:26:04 that you were asking for a mortgage
00:26:06 and I was going to say no.
00:26:07 That was the point.
00:26:07 It was a very simple idea here.
00:26:09 Anyway, so we sat there
00:26:10 and we just talked for a little while.
00:26:12 And I think he got the impression that I didn’t like him.
00:26:14 Which wasn’t true.
00:26:15 I strongly got that impression.
00:26:16 The talk was really good.
00:26:16 The talk, by the way, was terrible.
00:26:18 And right after the talk,
00:26:20 I said to my host, Michael Kearns,
00:26:21 who ultimately was my boss.
00:26:23 I’m a huge fan.
00:26:24 I’m a friend and a huge fan of Michael, yeah.
00:26:25 Yeah, he is a remarkable person.
00:26:29 After my talk, I went into the…
00:26:30 He went into basketball.
00:26:32 I went…
00:26:33 Racquetball, he’s good at everything.
00:26:34 No, basketball.
00:26:34 No, but basketball and racquetball too.
00:26:36 Squash.
00:26:36 Squash, squash, squash, not racquetball.
00:26:38 Yes, squash, which is not…
00:26:39 Racquetball, yes.
00:26:41 Squash, no.
00:26:42 And I hope you hear that, Michael.
00:26:43 Oh, Michael Kearns.
00:26:45 As a game, not his skill level,
00:26:47 because I’m pretty sure he’s…
00:26:50 All right, there’s some competitiveness there,
00:26:51 but the point is that it was like the middle of the day,
00:26:54 I had full day of interviews.
00:26:55 I got met with people,
00:26:56 but then in the middle of the day, I gave a job talk.
00:26:58 And then there was gonna be more interviews,
00:27:01 but I pulled Michael aside and I said,
00:27:04 I think it’s in both of our best interest
00:27:07 if I just leave now, because that was so bad
00:27:11 that it’d just be embarrassing
00:27:12 if I have to talk to any more people.
00:27:14 You look bad for having invited me.
00:27:16 It’s just, let’s just forget this ever happened.
00:27:19 So I don’t think the talk went well.
00:27:21 That’s one of the most Michael Littman set of sentences
00:27:23 I think I’ve ever heard.
00:27:24 He did great, or at least everyone knew he was great,
00:27:27 so maybe it didn’t matter.
00:27:28 I was there, I remember the talk,
00:27:29 and I remember him being very much the way
00:27:31 I remember him now, on any given week.
00:27:33 So it was good.
00:27:34 And we met and we talked about stuff.
00:27:36 He thinks I didn’t like him, but…
00:27:37 Because he was so grumpy.
00:27:39 Must’ve been the chair thing.
00:27:40 The chair thing and the low voice, I think.
00:27:42 But like, he obviously…
00:27:43 And that slight skeptical look.
00:27:47 Yes.
00:27:48 I have no idea what you’re talking about.
00:27:50 Well, I probably didn’t have any idea
00:27:51 what you were talking about.
00:27:53 Anyway, I liked him.
00:27:54 He asked me questions, I answered questions.
00:27:56 I felt bad about myself.
00:27:57 It was a normal day.
00:27:58 It was a normal day.
00:28:00 And then he left.
00:28:01 And then he left, and that’s how you met.
00:28:03 Can we take a…
00:28:03 And then I got hired and I was in the group.
00:28:05 Can we take a slight tangent on this topic of,
00:28:09 it sounds like, maybe you could speak
00:28:11 to the bigger picture.
00:28:12 It sounds like you’re quite self critical.
00:28:15 Who, Charles?
00:28:15 No, you.
00:28:16 Oh.
00:28:17 I can do better.
00:28:18 I can do better.
00:28:19 Try me again.
00:28:20 I’ll do better.
00:28:21 I’ll be so self critical.
00:28:21 I won’t.
00:28:22 I won’t.
00:28:23 I won’t.
00:28:24 Yeah, that was like a three out of 10 response.
00:28:26 So let’s try to work it up to five and six.
00:28:30 Yeah, I remember Marvin Minsky said on a video interview,
00:28:35 something that the key to success in academic research
00:28:38 is to hate everything you do.
00:28:43 For some reason…
00:28:44 I think I followed that because I hate everything he’s done.
00:28:46 That’s a good line.
00:28:49 That’s a six out of 10.
00:28:52 Maybe that’s a keeper.
00:28:53 But do you find that resonates with you at all
00:28:57 in how you think about talks and so on?
00:28:59 I would say it differently.
00:29:00 It’s not that.
00:29:01 No, not really.
00:29:02 That’s such an MIT view of the world though.
00:29:04 So I remember talking about this when, as a student,
00:29:08 you were basically told I will clean it up
00:29:10 for the purpose of the podcast.
00:29:13 My work is crap.
00:29:14 My work is crap.
00:29:15 My work is crap.
00:29:16 Then you go to a conference or something.
00:29:17 You’re like, everybody else’s work is crap.
00:29:18 Everybody else’s work is crap.
00:29:19 And you feel better and better about it, relatively speaking.
00:29:23 And then you sort of keep working on it.
00:29:25 I don’t hate my work.
00:29:26 That resonates with me.
00:29:27 Yes, I’ve never hated my work,
00:29:28 but I have been dissatisfied with it.
00:29:33 And I think being dissatisfied,
00:29:35 being okay with the fact that you’ve taken a positive step,
00:29:38 the derivative’s positive,
00:29:40 maybe even the second derivative’s positive,
00:29:42 that’s important because that’s a part of the hope, right?
00:29:45 But you have to, but I haven’t gotten there yet.
00:29:47 If that’s not there, that I haven’t gotten there yet,
00:29:49 then it’s hard to move forward, I think.
00:29:53 So I buy that, which is a little different
00:29:55 from hating everything that you do.
00:29:56 Yeah, I mean, there’s things that I’ve done
00:29:59 that I like better than I like myself.
00:30:01 So it’s separating me from the work, essentially.
00:30:04 So I think I am very critical of myself,
00:30:06 but sometimes the work I’m really excited about.
00:30:08 And sometimes I think it’s kind of good.
00:30:10 Does that happen right away?
00:30:11 So I found the work that I’ve liked, that I’ve done,
00:30:15 most of it, I liked it in retrospect
00:30:18 more when I was far away from it in time.
00:30:21 I have to be fairly excited about it to get done.
00:30:24 No, excited at the time, but then happy with the result.
00:30:26 But years later, or even I might go back,
00:30:28 you know what, that actually turned out to matter.
00:30:31 That turned out to matter.
00:30:32 Or, oh gosh, it turns out I’ve been thinking about that.
00:30:34 It’s actually influenced all the work that I’ve done since
00:30:36 without realizing it.
00:30:37 Boy, that guy was smart.
00:30:39 Yeah, that guy had a future.
00:30:41 Yeah, I think there’s something to it.
00:30:47 I think there’s something to the idea
00:30:48 you’ve got to hate what you do, but it’s not quite hate.
00:30:50 It’s just being unsatisfied.
00:30:52 And different people motivate themselves differently.
00:30:54 I don’t happen to motivate myself with self loathing.
00:30:56 I happen to motivate myself with something else.
00:30:58 So you’re able to sit back and be proud of,
00:31:02 in retrospect, of the work you’ve done.
00:31:04 Well, and it’s easier when you can connect it
00:31:06 with other people, because then you can be proud of them.
00:31:08 Proud of the people, yeah.
00:31:10 And then the question is.
00:31:11 You can still safely hate yourself.
00:31:12 Yeah, that’s right.
00:31:13 It’s win, win, Michael.
00:31:15 Or at least win, lose, which is what you’re looking for.
00:31:18 Oh, wow, there’s so many brilliant minds in this.
00:31:22 There’s levels.
00:31:23 So how did you actually meet me?
00:31:26 Yeah, Michael.
00:31:26 So the way I think about it is,
00:31:28 because we didn’t do much research together at AT&T,
00:31:32 but then we all got laid off.
00:31:34 So that sucked.
00:31:36 By the way, sorry to interrupt,
00:31:37 but that was one of the most magical places
00:31:40 historically speaking.
00:31:42 They did not appreciate what they had.
00:31:45 And how do we,
00:31:47 I feel like there’s a profound lesson in there too.
00:31:50 How do we get it, like what was, why was it so magical?
00:31:53 Is it just a coincidence of history?
00:31:54 Or is there something special about?
00:31:56 There were some really good managers
00:31:57 and people who really believed in machine learning
00:32:00 as this is gonna be important.
00:32:03 Let’s get the people who are thinking about this
00:32:05 in creative and insightful ways
00:32:08 and put them in one place and stir.
00:32:10 Yeah, but even beyond that, right?
00:32:11 It was Bell Labs at its heyday.
00:32:15 And even when we were there, which I think was past its heyday.
00:32:17 And to be clear, he’s gotten to be at Bell Labs.
00:32:19 I never got to be at Bell Labs.
00:32:21 I joined after that.
00:32:22 Yeah, I showed up in 91 as a grad student.
00:32:24 So I was there for a long time, every summer, except for two.
00:32:28 So twice I worked for companies
00:32:29 that had just stopped being Bell Labs.
00:32:31 Bellcore and then AT&T Labs.
00:32:33 So Bell Labs was several locations or for the research
00:32:37 or is it one?
00:32:38 I don’t know if Jersey’s involved somehow.
00:32:41 They’re all in Jersey.
00:32:41 Yeah, they’re all over the place.
00:32:42 But they were in a couple of places in Jersey.
00:32:44 Murray Hill was the Bell Labs place.
00:32:47 So you had an office in Murray Hill
00:32:49 at one point in your career.
00:32:51 Yeah, and I played Ultimate Frisbee
00:32:53 on the cricket pitch at Bell Labs at Murray Hill.
00:32:56 And then it became AT&T Labs when it split off
00:32:58 with Lucent during what we called Trivestiture.
00:33:00 Are you better than Michael Kearns at Ultimate Frisbee?
00:33:03 Yeah. Oh, yeah.
00:33:04 Okay.
00:33:05 But I think that one’s not boasting.
00:33:06 I think Charles plays a lot of Ultimate
00:33:08 and I don’t think Michael does.
00:33:10 Yes, but that wasn’t the point.
00:33:12 The point is yes.
00:33:12 I’m finally better.
00:33:13 Sorry.
00:33:14 Okay, I have played on a championship winning
00:33:17 Ultimate Frisbee team or whatever,
00:33:19 Ultimate team with Charles.
00:33:20 So I know how good he is.
00:33:22 He’s really good.
00:33:23 How good I was anyway, when I was younger.
00:33:24 But the thing is.
00:33:25 I know how young he was when he was younger.
00:33:26 That’s true.
00:33:27 So much younger than now.
00:33:28 He’s older now.
00:33:29 Yeah, I’m older.
00:33:30 Michael was a much better basketball player than I was.
00:33:33 Michael Kearns.
00:33:34 Yes, no, not Michael.
00:33:36 Let’s be very clear about that.
00:33:37 To be clear, I’ve not played basketball with you.
00:33:38 So you don’t know how terrible I am,
00:33:40 but you have a probably pretty good guess.
00:33:42 And that you’re not as good as Michael Kearns.
00:33:44 He’s tall and athletic.
00:33:45 And he cared about it.
00:33:46 He’s very athletic.
00:33:47 He’s very good.
00:33:48 And probably competitive.
00:33:49 I love hanging out with Michael.
00:33:50 Anyway, but we were talking about something else,
00:33:51 although I no longer remember what it was.
00:33:52 What were we talking about?
00:33:53 Oh, Bell Labs.
00:33:54 Oh, Bell Labs.
00:33:55 But also Labs.
00:33:56 So this was kind of cool about what was magical about it.
00:34:00 The first thing you have to know
00:34:01 is that Bell Labs was an arm of the government, right?
00:34:03 Because AT&T was an arm of the government.
00:34:05 It was a monopoly.
00:34:07 And every month you paid a little thing on your phone bill,
00:34:10 which turned out was a tax
00:34:12 for all the research that Bell Labs was doing.
00:34:14 And they invented transistors and the laser
00:34:16 and whatever else it is that they did.
00:34:17 The Big Bang or whatever, the cosmic background radiation.
00:34:20 Yeah, they did all that stuff.
00:34:21 They had some amazing stuff with directional microphones,
00:34:23 by the way.
00:34:24 I got to go in this room
00:34:25 where they had all these panels and everything.
00:34:27 And we would talk to one another,
00:34:29 and he’d move some panels around.
00:34:30 And then he would have me step two steps to the left.
00:34:33 And I couldn’t hear a thing he was saying
00:34:35 because nothing was bouncing off the walls.
00:34:37 And then he would shut it all down
00:34:38 and you could hear your heartbeat,
00:34:40 which is deeply disturbing to hear your heartbeat.
00:34:43 You can feel it.
00:34:44 I mean, you can feel it now.
00:34:44 There’s just so much all this sort of noise around.
00:34:46 Anyway, Bell Labs was about pure research.
00:34:48 It was a university, in some sense,
00:34:50 the purest sense of a university, but without students.
00:34:53 So it was all the faculty working with one another
00:34:56 and students would come in to learn.
00:34:57 They would come in for three or four months
00:34:59 during the summer and they would go away.
00:35:00 But it was just this kind of wonderful experience.
00:35:02 I could walk out my door.
00:35:04 In fact, I would often have to walk out my door
00:35:06 and deal with Rich Sutton and Michael Kearns
00:35:08 yelling at each other about whatever it is
00:35:10 they were yelling about the proper way
00:35:13 to prove something or another.
00:35:14 And I could just do that.
00:35:15 And Dave McAllester and Peter Stone
00:35:17 and all of these other people,
00:35:19 including Satinder, and then eventually Michael.
00:35:22 And it was just a place where you could think thoughts.
00:35:25 And it was okay because so long as once every 25 years or so
00:35:29 somebody invented a transistor, it paid for everything else.
00:35:31 You could afford to take the risk.
00:35:34 And then when that all went away,
00:35:36 it became harder and harder and harder to justify it
00:35:39 as far as the folks who were very far away were concerned.
00:35:41 And there was such a fast turnaround
00:35:43 among management on the AT&T side
00:35:46 that you never had a chance to really build a relationship.
00:35:48 At least people like us didn’t have a chance
00:35:49 to build a relationship.
00:35:51 So when the diaspora happened, it was amazing, right?
00:35:55 Everybody left and I think everybody ended up
00:35:57 at a great place and made a huge,
00:36:00 continued to do really good work with machine learning.
00:36:02 But it was a wonderful place.
00:36:03 And people will ask me, what’s the best job you’ve ever had?
00:36:07 And as a professor, anyway, the answer that I would give is
00:36:11 well, probably Bell Labs in some very real sense.
00:36:16 And I will never have a job like that again
00:36:17 because Bell Labs doesn’t exist anymore.
00:36:19 And Microsoft research is great and Google does good stuff.
00:36:22 And you can pick IBM, you can tell if you want to,
00:36:24 but Bell Labs was magical.
00:36:25 It was around for, it was an important time
00:36:28 and it represents a high watermark
00:36:30 in basic research in the US.
00:36:32 Is there something you could say about the physical proximity
00:36:35 and the chance collisions?
00:36:36 Like we live in this time of the pandemic
00:36:39 where everyone is maybe trying to see the silver lining
00:36:43 and accepting the remote nature of things.
00:36:46 One of the things that people, like faculty
00:36:50 that I talk to, miss is the procrastination.
00:36:57 Like the chance to make everything is about meetings
00:36:59 that are supposed to be,
00:37:00 there’s not a chance to just talk about comic book
00:37:04 or whatever, like go into discussion that’s totally pointless.
00:37:07 So it’s funny you say this
00:37:08 because that’s how we met, met, it was exactly that.
00:37:11 So I’ll let Michael say that, but I’ll just add one thing
00:37:12 which is just that research is a social process
00:37:16 and it helps to have random social interactions
00:37:20 even if they don’t feel social at the time,
00:37:21 that’s how you get things done.
00:37:22 One of the great things about the AI Lab when I was there,
00:37:25 I don’t quite know what it looks like now
00:37:27 once they moved buildings,
00:37:28 but we had entire walls that were whiteboards
00:37:30 and people would just get up there
00:37:31 and they would just write, and people would walk up
00:37:33 and you’d have arguments
00:37:34 and you’d explain things to one another
00:37:36 and you got so much out of the freedom to do that.
00:37:39 You had to be okay with people challenging
00:37:42 every fricking word you said,
00:37:44 which I would sometimes find deeply irritating,
00:37:47 but most of the time it was quite useful.
00:37:49 But the sort of pointlessness and the interaction
00:37:52 was in some sense the point, at least for me.
00:37:54 Yeah, I think offline yesterday I mentioned
00:37:57 Josh Tenenbaum and he’s very much, he’s a man,
00:38:01 he’s such an inspiration in the childlike way
00:38:06 that he pulls you in on any topic.
00:38:07 It doesn’t even have to be about machine learning
00:38:10 or the brain, he’ll just pull you in
00:38:12 to a closest writable surface,
00:38:15 which is still, you can find whiteboards
00:38:18 at MIT everywhere, and just like basically cancel
00:38:23 all meetings and talk for a couple hours
00:38:25 about some aimless thing and it feels like
00:38:28 the whole world, the time space continuum kind of warps
00:38:30 and that becomes the most important thing.
00:38:32 And then it’s just, it’s definitely something
00:38:36 worth missing in this world where everything’s remote.
00:38:40 There’s some magic to the physical presence.
00:38:42 Whenever I wonder myself whether MIT really is
00:38:44 as great as I remember it, I just go talk to Josh.
00:38:48 Yeah, you know, that’s funny.
00:38:49 There’s a few people in this world that carry
00:38:52 the best of what particular institutions stand for, right?
00:38:56 And it’s.
00:38:57 It’s Josh.
00:38:58 I mean, I don’t, my guess is he’s unaware of this.
00:39:00 That’s the point.
00:39:02 Yeah.
00:39:02 That the masters are not aware of their mastery.
00:39:06 So.
00:39:07 How did we meet?
00:39:09 Yes, but first a tangent, no.
00:39:13 How did you meet me?
00:39:14 So I’m not sure what you were thinking,
00:39:16 but when it started to dawn on me
00:39:19 that maybe we had a longer term bond
00:39:21 was after we all got laid off.
00:39:23 And you had decided at that point
00:39:26 that we were still paid.
00:39:28 We were given an opportunity to like do a job search
00:39:30 and kind of make a transition,
00:39:32 but it was clear that we were done.
00:39:35 And I would go to my office to work
00:39:38 and you would go to my office to keep me from working.
00:39:41 That was my recollection of it.
00:39:43 You had decided that there was no,
00:39:44 really no point in working for the company
00:39:46 because our relationship with the company was done.
00:39:49 Yeah, but remember I felt that way beforehand.
00:39:51 It wasn’t about the company.
00:39:52 It was about the set of people there
00:39:53 doing really cool things.
00:39:54 And it always, always been that way.
00:39:55 But we were working on something together.
00:39:57 Oh yeah, yeah, yeah.
00:39:58 That’s right.
00:39:59 So at the very end, we all got laid off,
00:40:00 but then our boss came to, our boss’s boss came to us
00:40:04 because our boss was Michael Kearns
00:40:05 and he had jumped ship brilliantly, like perfect timing.
00:40:08 Like things like right before the ship was about to sink,
00:40:12 he was like, gotta go and landed perfectly
00:40:16 because Michael Kearns.
00:40:18 Because Michael Kearns.
00:40:19 And leaving the rest of us to go like, this is fine.
00:40:23 And then it was clear that it wasn’t fine
00:40:25 and we were all toast.
00:40:27 So we had this sort of long period of time.
00:40:29 But then our boss figured out, okay, wait,
00:40:30 maybe we can save a couple of these people
00:40:33 if we can have them do something really useful.
00:40:37 And the useful thing was we were gonna make
00:40:40 basically an automated assistant
00:40:42 that could help you with your calendar.
00:40:43 You could like tell it things
00:40:45 and it would respond appropriately.
00:40:47 It would just kind of integrate across
00:40:49 all sorts of your personal information.
00:40:53 And so me and Charles and Peter Stone
00:40:56 were set up as the crack team
00:40:58 to actually solve this problem.
00:41:00 Other people maybe were too theoretical that they thought,
00:41:04 but we could actually get something done.
00:41:05 So we sat down to get something done
00:41:07 and there wasn’t time and it wouldn’t have saved us anyway.
00:41:10 And so it all kind of went downhill.
00:41:12 But the interesting, I think, coda to that
00:41:15 is that our boss’s boss is a guy named Ron Brachman.
00:41:18 And when he left AT&T,
00:41:22 cause we were all laid off,
00:41:23 he went to DARPA, started up a program there
00:41:27 that became CALO,
00:41:28 which is the program from which Siri sprung,
00:41:32 which is a digital assistant
00:41:34 that helps you with your calendar
00:41:35 and a bunch of other things.
00:41:37 It really, in some ways got its start
00:41:40 with me and Charles and Peter trying to implement this vision
00:41:44 that Ron Brachman had,
00:41:45 that he ultimately got implemented
00:41:47 through his role at DARPA.
00:41:49 So when I’m trying to feel less bad
00:41:51 about having been laid off
00:41:52 from what is possibly the greatest job of all time,
00:41:56 I think about, well, we kind of helped birth Siri.
00:42:00 That’s something.
00:42:01 And then he did other things too.
00:42:03 But we got to spend a lot of time in his office
00:42:06 and talk about lots of things.
00:42:07 We got to spend a lot of time in my office, yeah.
00:42:10 Yeah, yeah.
00:42:11 And so then we went on our merry way.
00:42:13 Everyone went to different places.
00:42:15 Charles landed at Georgia Tech,
00:42:16 which was what he always dreamed he would do.
00:42:20 And so that worked out well.
00:42:23 I came up with a saying at the time,
00:42:25 which is luck favors the Charles.
00:42:27 It’s kind of like luck favors the prepared,
00:42:30 but Charles, like he wished something
00:42:32 and then it would basically happen just the way he wanted.
00:42:35 It was inspirational to see things go that way.
00:42:38 Things worked out.
00:42:39 And we stayed in touch.
00:42:40 And then I think it really helped
00:42:43 when you were working on,
00:42:46 I mean, you’d kept me in the loop for things like Threads
00:42:48 and the work that you were doing at Georgia Tech.
00:42:49 But then when they were starting
00:42:50 their online master’s program,
00:42:52 he knew that I was really excited about MOOCs
00:42:55 and online teaching.
00:42:56 And he’s like, I have a plan.
00:42:57 And I’m like, tell me your plan.
00:42:58 He’s like, I can’t tell you the plan yet.
00:43:00 Cause they were deep in negotiations
00:43:02 between Georgia Tech and Udacity to make this happen.
00:43:05 And they didn’t want it to leak.
00:43:07 So Charles kept teasing me about it,
00:43:09 but wouldn’t tell me what was actually going on.
00:43:10 And eventually it was announced and he said,
00:43:13 I would like you to teach the machine learning course
00:43:15 with me.
00:43:15 I’m like, that can’t possibly work.
00:43:18 But it was a great idea.
00:43:19 And it was super fun.
00:43:20 It was a lot of work to put together,
00:43:22 but it was really great.
00:43:23 Was that the first time you thought about,
00:43:26 first of all, was it the first time
00:43:27 you got seriously into teaching?
00:43:30 I mean, I was a professor.
00:43:32 This was already after you jumped to,
00:43:35 so like there’s a little bit of jumping around in time.
00:43:38 Yeah, sorry about that.
00:43:39 There’s a pretty big jump in time.
00:43:40 So like the MOOCs thing.
00:43:42 So Charles got to Georgia Tech and he,
00:43:44 I mean, maybe Charles, maybe this is a Charles story.
00:43:46 I got to Georgia Tech in 2002.
00:43:47 He got to Georgia Tech in 2002.
00:43:49 And worked on things like revamping the curriculum,
00:43:52 the undergraduate curriculum,
00:43:53 so that it had some kind of semblance of modular structure
00:43:57 because computer science was at the time
00:44:00 moving from a fairly narrow specific set of topics
00:44:03 to touching a lot of other parts of intellectual life.
00:44:08 And the curriculum was supposed to reflect that.
00:44:10 And so Charles played a big role in kind of redesigning that.
00:44:15 And then the.
00:44:16 And for my labors, I ended up as associate dean.
00:44:20 Right, he got to become associate dean
00:44:22 in charge of educational stuff.
00:44:24 Yeah, I was under.
00:44:25 This should be a valuable lesson.
00:44:26 If you’re good at something,
00:44:30 they will give you responsibility to do more of that thing.
00:44:33 Well.
00:44:34 Until you.
00:44:35 Don’t show competence.
00:44:36 Don’t show competence if you.
00:44:37 Don’t want responsibility.
00:44:38 Here’s what they say.
00:44:40 The reward for good work is more work.
00:44:43 The reward for bad work is less work.
00:44:47 Which, I don’t know.
00:44:48 Depending on what you’re trying to do that week,
00:44:50 one of those is better than the other.
00:44:51 Well, one of the problems with the word work,
00:44:52 sorry to interrupt, is that in this particular language
00:44:57 it seems to be treated as an antonym,
00:44:59 as the opposite of happiness.
00:45:01 But it seems like they’re.
00:45:02 That’s one of, you know, we talked about balance.
00:45:07 It’s always like work life balance.
00:45:09 It always rubbed me the wrong way as a terminology.
00:45:12 I know it’s just words.
00:45:13 Right, the opposite of work is play.
00:45:15 But ideally, work is play.
00:45:17 Oh, I can’t tell you how much time I’d spend.
00:45:20 Certainly, when I was at Bell Labs,
00:45:21 except for a few very key moments,
00:45:23 as a professor, I would do this too.
00:45:25 I would just say, I cannot believe
00:45:26 they’re paying me to do this.
00:45:28 Because it’s fun.
00:45:29 It’s something that I would do for a hobby
00:45:32 if I could anyway.
00:45:34 So that sort of worked out.
00:45:35 Are you sure you want to be saying that
00:45:37 when this is being recorded?
00:45:38 As a dean, that is not true at all.
00:45:40 I need a raise.
00:45:42 But I think here with this,
00:45:43 even though a lot of time passed,
00:45:45 Mike and I talked almost every, well, we texted,
00:45:47 almost every day during the period.
00:45:49 Charles, at one point, took me,
00:45:53 the ICML conference, the machine learning conference
00:45:55 was in Atlanta.
00:45:57 I was the chair, the general chair of the conference.
00:46:00 Charles was my publicity chair or something like that,
00:46:03 or fundraising chair.
00:46:05 Yeah, but he decided it’d be really funny
00:46:08 if he didn’t actually show up for the conference
00:46:09 in his own home city.
00:46:11 So he didn’t, but he did at one point
00:46:13 pick me up at the conference in his Tesla
00:46:16 and drove me to the Atlanta mall
00:46:19 and forced me to buy an iPhone
00:46:22 because he didn’t like how it was to text with me
00:46:25 and thought it would be better for him
00:46:27 if I had an iPhone, the text would be somehow smoother.
00:46:30 And it was.
00:46:31 And it was.
00:46:32 And it is, and his life is better.
00:46:32 And my life is better.
00:46:33 And so, yeah, but it was, yeah,
00:46:36 Charles forced me to get an iPhone
00:46:38 so that he could text me more efficiently.
00:46:40 I thought that was an interesting moment.
00:46:42 It works for me.
00:46:42 Anyway, so we kept talking the whole time
00:46:44 and then eventually we did the teaching thing
00:46:46 and it was great.
00:46:47 And there’s a couple of reasons for that, by the way.
00:46:48 One is I really wanted to do something different.
00:46:51 Like you’ve got this medium here,
00:46:53 people claim it can change things.
00:46:54 What’s a thing that you could do in this medium
00:46:56 that you could not do otherwise besides edit, right?
00:47:00 I mean, what could you do?
00:47:01 And being able to do something with another person
00:47:03 was that kind of thing.
00:47:04 It’s very hard.
00:47:05 I mean, you can take turns,
00:47:06 but teaching together, having conversations is very hard.
00:47:09 So that was a cool thing.
00:47:10 The second thing, give me an excuse
00:47:11 to do more stuff with him.
00:47:12 Yeah, I always thought, he makes it sound brilliant.
00:47:15 And it is, I guess.
00:47:17 But at the time it really felt like
00:47:20 I’ve got a lot to do, Charles is saying,
00:47:22 and it would be great if Michael could teach the course
00:47:25 and I could just hang out.
00:47:27 Yeah, just kind of coast on that.
00:47:29 Well, the second class was more like that.
00:47:31 Because the second class was explicitly like that.
00:47:33 But the first class, it was at least half.
00:47:36 Yeah, but I do all the stuff.
00:47:37 So the structure that we came up with.
00:47:37 I think you’re once again letting the facts
00:47:39 get in the way of a good story.
00:47:42 I should just let Charles talk to us.
00:47:44 But that’s the facts that he saw.
00:47:46 So that was kind of true for 7642.
00:47:48 Yeah, that was sort of true for 7642,
00:47:50 which is the reinforcement learning class,
00:47:51 because that was really his class.
00:47:52 You started with reinforcement learning or machine learning?
00:47:55 Intro machine learning, 7641,
00:47:57 which is supervised learning, unsupervised learning,
00:48:00 and reinforcement learning and decision making,
00:48:02 cram all that in there,
00:48:03 the kind of assignments that we talked about earlier.
00:48:04 And then eventually, about a year later,
00:48:06 we did a follow on 7642,
00:48:08 which is reinforcement learning and decision making.
00:48:10 The first class was based on something
00:48:12 I’d been teaching at that point for well over a decade.
00:48:14 And the second class was based on something
00:48:15 Michael had been teaching.
00:48:17 Actually, I learned quite a bit
00:48:18 teaching that class with him, but he drove most of that.
00:48:21 But the first one I drove most, it was all my material.
00:48:23 Although I had stolen that material originally
00:48:26 from slides I found online from Michael,
00:48:28 who had originally stolen that material
00:48:30 from, I guess, slides he found online,
00:48:32 probably from Andrew Moore,
00:48:33 because the jokes were the same anyway.
00:48:34 At least some of them were, at least when I found the slides,
00:48:36 some of that stuff was still in there.
00:48:37 Is that true?
00:48:38 Yes, every machine learning class taught in the early 2000s
00:48:40 stole from Andrew Moore.
00:48:41 A particular joke or two?
00:48:43 At least the structure.
00:48:44 Now, I did, and he did, actually,
00:48:46 a lot more with reinforcement learning and such,
00:48:48 and game theory and those kinds of things.
00:48:50 But, you know, we all sort of built on it.
00:48:51 You mean in the research world?
00:48:52 No, no, no, in that class.
00:48:54 No, I mean in teaching that class.
00:48:54 The coverage was different from where we started.
00:48:57 Most people were just doing supervised learning
00:48:58 and maybe a little bit of clustering and whatnot,
00:49:01 but we took it all the way through reinforcement learning.
00:49:03 A lot of it just comes from Tom Mitchell’s book.
00:49:04 Oh, no, yeah, except, well,
00:49:06 half of it comes from Tom Mitchell’s book, right?
00:49:07 I mean, the other half doesn’t.
00:49:10 This is why it’s all readings, right?
00:49:12 Because certain things weren’t invented
00:49:13 when Tom wrote that stuff.
00:49:14 Yeah, okay, that’s true.
00:49:15 All right, but it was quite good.
00:49:17 But there’s a reason for that besides, you know,
00:49:19 just, I wanted to do it.
00:49:21 I wanted to do something new,
00:49:21 and I wanted to do something with him,
00:49:23 which is a realization,
00:49:24 which is despite what you might believe,
00:49:27 he’s an introvert and I’m an introvert,
00:49:29 or I’m on the edge of being an introvert anyway.
00:49:32 But both of us, I think, enjoy the energy of the crowd,
00:49:36 right?
00:49:37 There’s something about talking to people
00:49:39 and bringing them into whatever we find interesting
00:49:41 that is empowering, energizing, or whatever.
00:49:45 And I found the idea of staring alone at a computer screen
00:49:50 and then talking off of materials
00:49:52 less inspiring than I wanted it to be.
00:49:55 And I had in fact done a MOOC for Udacity on algorithms.
00:49:59 And it was a week in a dark room talking at the screen,
00:50:05 writing on the little pad.
00:50:07 And I didn’t know this was happening,
00:50:09 but they had watched,
00:50:10 the crew had watched some of the videos
00:50:12 while, you know, like in the middle of this,
00:50:13 and they’re like, something’s wrong.
00:50:15 You’re sort of shutting down.
00:50:19 And I think a lot of it was I’ll make jokes
00:50:22 and no one would laugh.
00:50:24 And I felt like the crowd hated me.
00:50:26 Now, of course, there was no crowd.
00:50:27 So like, it wasn’t rational.
00:50:29 But each time I tried it and I got no reaction,
00:50:32 it just was taking the energy out of my performance,
00:50:37 out of my presentation.
00:50:38 Such a fantastic metaphor for grad school.
00:50:40 Anyway, by working together,
00:50:42 we could play off each other and have a good time.
00:50:44 And keep the energy up,
00:50:45 because you can’t let your guard down for a moment
00:50:48 with Charles, he’ll just overpower you.
00:50:51 I have no idea what you’re talking about.
00:50:52 But we would work really well together, I thought,
00:50:54 and we knew each other,
00:50:54 so I knew that we could sort of make it work.
00:50:56 Plus, I was the associate dean,
00:50:57 so they had to do what I told them to do.
00:51:00 We had to make it work.
00:51:01 And so it worked out very well, I thought,
00:51:03 well enough that we.
00:51:04 With great power comes great power.
00:51:06 That’s right.
00:51:07 And we became smooth and curly.
00:51:09 And that’s when we did the overfitting thriller video.
00:51:15 Yeah, that’s a thing.
00:51:17 So can we just, like, smooth and curly,
00:51:20 where did that come from?
00:51:21 Okay, so it happened.
00:51:23 It was completely spontaneous.
00:51:24 These are nicknames you go by.
00:51:25 Yeah, so it’s what the students call us.
00:51:28 He was lecturing.
00:51:30 So the way that we structured the lectures
00:51:32 is one of us is the lecturer
00:51:33 and one of us is basically the student.
00:51:35 And so he was lecturing on.
00:51:37 The lecturer prepares all the materials,
00:51:39 comes up with the quizzes,
00:51:40 and then the student comes in not knowing anything.
00:51:43 So it was just like being on campus.
00:51:45 And I was doing game theory in particular,
00:51:48 the Prisoner’s Dilemma.
00:51:48 Prisoner’s Dilemma.
00:51:49 And so he needed to set up a little Prisoner’s Dilemma grid.
00:51:52 So he drew it and I could see what he was drawing.
00:51:54 And the Prisoner’s Dilemma consists of two players,
00:51:57 two parties.
00:51:58 So he decided he would make little cartoons
00:52:00 of the two of us.
00:52:01 And so there was two criminals, right,
00:52:04 that were deciding whether or not to rat each other out.
00:52:07 One of them he drew as a circle with a smiley face
00:52:11 and a kind of goatee thing, smooth head.
00:52:14 And the other one with all sorts of curly hair.
00:52:16 And he said, this is Smoov and Curly.
00:52:18 I said, Smooth and Curly?
00:52:19 He said, no, no, Smoov, with a V.
00:52:21 It’s very important that it have a V.
00:52:23 And then the students really took to that.
00:52:27 Like they found that relatable.
00:52:29 He started singing Smooth Criminal by Michael Jackson.
00:52:31 Yeah, yeah, yeah.
00:52:32 And those names stuck.
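For readers who want that grid made concrete, here is a minimal sketch in Python of a standard two-player Prisoner’s Dilemma payoff table like the one being drawn; the move names and the specific payoffs (negative years in prison) are illustrative assumptions, not material from the actual lecture.

    # A minimal sketch of the Prisoner's Dilemma grid described above.
    # Payoffs are negative years in prison; the numbers are illustrative.
    PAYOFFS = {
        ("quiet", "quiet"): (-1, -1),  # both stay quiet: short sentences for each
        ("quiet", "rat"):   (-9,  0),  # the lone cooperator takes the worst outcome
        ("rat",   "quiet"): ( 0, -9),
        ("rat",   "rat"):   (-6, -6),  # both rat each other out
    }

    def payoff(smoov_move, curly_move):
        """Return (Smoov's payoff, Curly's payoff) for one round."""
        return PAYOFFS[(smoov_move, curly_move)]

    # Whatever the other player does, ratting out yields the higher individual payoff,
    # even though both staying quiet beats both ratting out.
    print(payoff("rat", "rat"))  # -> (-6, -6)

With these payoffs, defecting strictly dominates cooperating for each player, which is exactly the tension the grid on the board is meant to illustrate.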
00:52:33 So we now have a video series,
00:52:36 an episode, our kind of first actual episode
00:52:38 should be coming out today,
00:52:39 Smooth and Curly on video,
00:52:43 where the two of us discuss episodes of Westworld.
00:52:47 We watch Westworld and we’re like, huh,
00:52:49 what does this say about computer science and AI?
00:52:51 And we’d never, we did not watch it before.
00:52:53 I mean, it’s on season three or whatever now.
00:52:55 As of this recording, it’s on season three.
00:52:57 We’ve watched now two episodes total.
00:52:59 Yeah, I think I watched three.
00:53:01 What do you think about Westworld?
00:53:02 Two episodes in.
00:53:03 So I can tell you so far,
00:53:05 I’m just guessing what’s gonna happen next.
00:53:08 It seems like bad things are gonna happen
00:53:10 with the robots uprising.
00:53:11 It’s a lot of.
00:53:12 Spoiler alert.
00:53:12 So I have not, I have not,
00:53:13 I mean, you know, I vaguely remember a movie existing.
00:53:16 So I assume it’s related to that, but.
00:53:18 That was more my time than your time, Charles.
00:53:20 That’s right, cause you’re much older than I am.
00:53:21 I think the important thing here is that
00:53:24 it’s narrative, right?
00:53:25 It’s all about telling a story.
00:53:26 That’s the whole driving thing.
00:53:27 But the idea that they would give these reveries,
00:53:29 that they would make people,
00:53:31 they would make them.
00:53:32 Let them remember.
00:53:33 Remember the awful things that happened.
00:53:33 The terrible things that happened.
00:53:35 Who could possibly think that was gonna,
00:53:36 I gotta, I mean, I don’t know.
00:53:38 I’ve only seen the first two episodes
00:53:39 or maybe the third one.
00:53:40 I think I’ve only seen the first one.
00:53:41 You know what it was?
00:53:42 You know what the problem is?
00:53:43 That the robots were actually designed by Hannibal Lecter.
00:53:45 That’s true.
00:53:46 They weren’t.
00:53:47 So like, what do you think is gonna happen?
00:53:49 Bad things.
00:53:50 It’s clear that things are happening
00:53:51 and characters are being introduced
00:53:52 and we don’t yet know anything,
00:53:54 but still I was just struck by how
00:53:57 it’s all driven by narrative and story.
00:53:58 And there’s all these implied things like programming,
00:54:01 the programming interface is talking to them
00:54:03 about what’s going on in their heads,
00:54:05 which is both, I mean, artistically,
00:54:08 it’s probably useful to film it that way.
00:54:10 But think about how it would work in real life.
00:54:11 That just seems very great.
00:54:12 But there was, we saw in the second episode,
00:54:14 there’s a screen.
00:54:15 You could see things.
00:54:15 They were wearing like Kubrick’s glasses.
00:54:16 In the world.
00:54:17 It was quite interesting to just kind of ask this question
00:54:20 so far.
00:54:21 I mean, I assume it veers off into Never Never Land
00:54:22 at some point.
00:54:23 So we don’t know.
00:54:24 We can’t answer that question.
00:54:25 I’m also a fan of a guy named Alex Garland.
00:54:28 He’s a director of Ex Machina.
00:54:30 Mm hmm.
00:54:31 And he is the first,
00:54:33 I wonder if Kubrick was like this actually,
00:54:36 who, like, studies
00:54:39 what it would take to program an AI system.
00:54:41 Like he’s curious enough to go into that direction.
00:54:44 On the Westworld side,
00:54:46 I felt there was more emphasis on the narratives
00:54:49 than like actually asking like computer science questions.
00:54:52 Yeah.
00:54:53 Like, how would you build this?
00:54:54 How would you, and.
00:54:56 How would you debug it?
00:54:57 I still think, to me, that’s the key issue.
00:55:00 They were terrible debuggers.
00:55:02 Yeah.
00:55:03 Well, they said specifically,
00:55:04 so we make a change and we put it out in the world
00:55:05 and that’s bad because something terrible could happen.
00:55:07 Like if you’re putting things out in the world
00:55:09 and you’re not sure whether something terrible
00:55:11 is going to happen, your process is probably.
00:55:13 I just feel like there should have been someone
00:55:14 whose sole job it was to walk around and poke his head in
00:55:17 and say, what could possibly go wrong?
00:55:19 Just over and over again.
00:55:20 I would have loved if there was an,
00:55:22 and I did watch a lot more and I’m not giving anything away.
00:55:24 I would have loved it if there was like an episode
00:55:27 where like the new intern is like debugging
00:55:29 a new model or something and like it just keeps failing
00:55:32 and they’re like, all right.
00:55:34 And then it’s more turns into like a episode
00:55:36 of Silicon Valley or something like that.
00:55:38 Yes.
00:55:39 Versus like these ominous AI systems
00:55:41 that are constantly threatening the fabric
00:55:45 of this world that’s been created.
00:55:47 Yeah.
00:55:48 Yeah, and you know the other,
00:55:49 this reminds me of something that,
00:55:51 so I agree that that should be very cool,
00:55:52 at least for the small percentage of people
00:55:54 who care about debugging systems.
00:55:56 But the other thing is.
00:55:57 Right, Debugging: The Series.
00:55:59 Yeah, it falls into, think of the sequels,
00:56:01 Fear of the Debugger.
00:56:02 Oh my gosh.
00:56:03 Anyway, so.
00:56:04 It’s a nightmare show, it’s a horror movie.
00:56:07 I think that’s where we lose people, by the way,
00:56:08 early on is the people who either decide,
00:56:10 either figure out debugging or think debugging is terrible.
00:56:12 This is where we lose people in computer science.
00:56:14 This is a part of the struggle versus suffering, right?
00:56:17 You get through it and you kind of get the skills of it,
00:56:19 or you’re just like, this is dumb,
00:56:20 and this is a dumb way to do anything.
00:56:22 And I think that’s when we lose people.
00:56:23 But, well, I’ll leave it at that.
00:56:26 But I think that there’s something really, really neat
00:56:33 about framing it that way.
00:56:34 But what I don’t like about all of these things,
00:56:37 and I love Ex Machina, by the way,
00:56:39 although the ending was very depressing.
00:56:42 One of the things I have to talk to Alex about,
00:56:46 he says that the thing that nobody noticed he put in
00:56:49 is at the end, spoiler alert,
00:56:53 the robot turns and looks at the camera and smiles, briefly.
00:57:00 And to him, he thought that his definition
00:57:04 of passing the general version of the Turing test,
00:57:08 or the consciousness test, is smiling for no one.
00:57:17 It’s like the Chinese room kind of experiment.
00:57:20 It’s not always trying to act for others,
00:57:22 but just on your own, being able to have a relationship
00:57:26 with the actual experience and just take it in.
00:57:29 I don’t know, he said nobody noticed the magic of it.
00:57:32 I have this vague feeling that I remember the smile,
00:57:35 but now you’ve just put the memory in my head,
00:57:37 so probably not.
00:57:38 But I do think that that’s interesting.
00:57:40 Although, by looking at the camera,
00:57:41 you are smiling for the audience, right?
00:57:43 You’re breaking the fourth wall.
00:57:44 It seems, I mean, well, that’s a limitation of the medium.
00:57:48 But I like that idea.
00:57:49 But here’s the problem I have with all of those movies,
00:57:51 all of them, is that, but I know why it’s this way,
00:57:54 and I enjoy those movies, and Westworld,
00:57:57 is it sets up the problem of AI as succeeding
00:58:02 and then having something we cannot control.
00:58:05 But it’s not the bad part of AI.
00:58:08 The bad part of AI is the stuff
00:58:10 we’re living through now, right?
00:58:11 It’s using the data to make decisions that are terrible.
00:58:13 It’s not the intelligence that’s gonna go out there
00:58:15 and surpass us and take over the world
00:58:17 or lock us into a room to starve to death slowly
00:58:21 over multiple days.
00:58:22 It’s instead the tools that we’re building
00:58:26 that are allowing us to make the terrible decisions
00:58:30 we would have less efficiently made before, right?
00:58:32 Computers are very good at making us more efficient,
00:58:35 including being more efficient at doing terrible things.
00:58:38 And that’s the part of the AI we have to worry about.
00:58:40 It’s not the true intelligence that we’re gonna build
00:58:44 sometime in the future, probably long after we’re around.
00:58:48 But I think that whole framing of it
00:58:52 sort of misses the point, even though it is inspiring.
00:58:55 And I was inspired by those ideas, right?
00:58:57 I got into this in part
00:58:59 because I wanted to build something like that.
00:59:00 Philosophical questions are interesting to me,
00:59:02 but that’s not where the terror comes from.
00:59:04 The terror comes from the everyday.
00:59:06 And you can construct situations
00:59:08 in the subtlety of the interaction between AI and the human,
00:59:11 like with social networks,
00:59:14 all the stuff you’re doing
00:59:15 with interactive artificial intelligence.
00:59:17 But I feel like HAL 9000 came a little bit closer to that
00:59:22 in 2001: A Space Odyssey,
00:59:24 because it felt like a personal assistant.
00:59:29 It felt like closer to the AI systems we have today.
00:59:31 And the real things we might actually encounter,
00:59:35 which is over relying in some fundamental way
00:59:40 on our dumb assistants or on social networks,
00:59:44 like offloading too much of ourselves
00:59:47 onto things that require internet and power and so on
00:59:55 and thereby becoming powerless as a standalone entity.
00:59:59 And then when that thing starts to misbehave
01:00:02 in some subtle way, it creates a lot of problems.
01:00:05 And those problems are dramatized when you’re in space,
01:00:08 because you don’t have a way to walk away.
01:00:11 Well, as the man said,
01:00:12 once we started making the decisions for you,
01:00:15 it stopped being your world, right?
01:00:17 That’s the matrix, Michael, in case you don’t remember.
01:00:20 But on the other hand, I could say no,
01:00:23 because isn’t that what we do with people anyway?
01:00:25 You know, just kind of the shared intelligence
01:00:27 that is humanity is relying on other people constantly.
01:00:30 I mean, we hyper specialize, right?
01:00:32 As individuals, we’re still generally intelligent.
01:00:34 We make our own decisions in a lot of ways,
01:00:36 but we leave most of this up to other people.
01:00:37 And that’s perfectly fine.
01:00:39 And by the way, everyone doesn’t necessarily share our goals.
01:00:43 Sometimes they seem to be quite against us.
01:00:45 Sometimes we make decisions that others would see
01:00:47 as against our own interests.
01:00:49 And yet we somehow manage it, manage to survive.
01:00:51 I’m not entirely sure why an AI
01:00:54 would actually make that worse or even different, really.
01:01:00 You mentioned the matrix.
01:01:02 Do you think we’re living in a simulation?
01:01:04 It does feel like a thought game
01:01:08 more than a real scientific question.
01:01:10 Well, I’ll tell you why I think
01:01:12 it’s an interesting thought experiment.
01:01:13 Let’s see what you think.
01:01:14 From a computer science perspective,
01:01:16 it’s a good experiment of how difficult would it be
01:01:20 to create a sufficiently realistic world
01:01:22 that us humans would enjoy being in.
01:01:26 That’s almost like a competition.
01:01:27 If we’re living in a simulation,
01:01:29 then I don’t believe that we were put in the simulation.
01:01:31 I believe that it’s just physics playing out
01:01:34 and we came out of that.
01:01:36 Like, I don’t think.
01:01:39 So you think you have to build the universe
01:01:40 and have all the fun in the world?
01:01:41 I think that the universe itself,
01:01:42 we can think of that as a simulation.
01:01:43 And in fact, sometimes I try to think about,
01:01:46 to understand what it’s like for a computer
01:01:49 to start to think about the world.
01:01:52 I try to think about the world.
01:01:55 Things like quantum mechanics,
01:01:56 where it doesn’t feel very natural to me at all.
01:01:59 And it really strikes me as,
01:02:02 I don’t understand this thing that we’re living in.
01:02:05 It has, there’s weird things happening in it
01:02:07 that don’t feel natural to me at all.
01:02:09 Now, if you want to call that as the result of a simulator,
01:02:13 okay, I’m fine with that.
01:02:14 But like, I don’t.
01:02:15 There’s the bugs in the simulation.
01:02:16 There’s the bugs.
01:02:17 I mean, the interesting thing about the simulation
01:02:19 is that it might have bugs.
01:02:21 I mean, that’s the thing that I,
01:02:23 But there would be bugs for the people in the simulation.
01:02:25 That’s just reality.
01:02:27 Unless you were aware enough to know that there was a bug.
01:02:29 But I think.
01:02:30 Back to the matrix.
01:02:31 Yeah, the way you put the question though.
01:02:32 I don’t think that we live in a simulation created for us.
01:02:35 Okay, I would say that.
01:02:36 I think that’s interesting.
01:02:37 I’ve actually never thought about it that way.
01:02:38 I mean, the way you asked the question though,
01:02:40 could you create a world that is enough for us humans?
01:02:43 It’s an interestingly sort of self referential question
01:02:45 because the beings that created the simulation
01:02:49 probably have not created the simulation
01:02:51 that’s realistic for them.
01:02:53 But we’re in the simulation and so it’s realistic for us.
01:02:56 So we could create a simulation
01:02:58 that is fine for the people in the simulation, as it were.
01:03:02 That would not necessarily be fine for us
01:03:03 as the creators of the simulation.
01:03:05 But, well, you can forget.
01:03:07 I mean, if you play video games in virtual reality,
01:03:11 you can, if some suspension of disbelief or whatever.
01:03:16 It becomes a world.
01:03:17 It becomes a world.
01:03:18 Even like in brief moments,
01:03:20 you forget that another world exists.
01:03:22 I mean, that’s what like good stories do.
01:03:24 They pull you in.
01:03:25 And the question is, is it possible to pull,
01:03:28 our brains are limited.
01:03:29 Is it possible to pull the brain in
01:03:31 to where we actually stay in that world
01:03:32 longer and longer and longer and longer?
01:03:34 And like, not only that, but we don’t wanna leave.
01:03:39 And so, especially this is the key thing
01:03:41 about the developing brain,
01:03:43 is if we journey into that world early on in life, often.
01:03:48 How would you even know, yeah.
01:03:49 Yeah, so I, but like from a video game design perspective,
01:03:53 from a Westworld perspective,
01:03:54 it’s, I think it’s an important thing
01:03:57 for even computer scientists to think about
01:04:00 because it’s clear that video games are getting much better.
01:04:04 And virtual reality,
01:04:06 although it’s been ups and downs
01:04:08 just like artificial intelligence,
01:04:09 it feels like virtual reality will be here
01:04:14 in a very impressive form
01:04:16 if we were to fast forward 100 years into the future
01:04:19 in a way that might change society fundamentally.
01:04:22 Like if I were to,
01:04:23 I’m very limited in predicting the future as all of us are,
01:04:26 but if I were to try to predict,
01:04:28 like in which way I’d be surprised
01:04:32 to see the world 100 years from now,
01:04:35 it’d be that, or impressed,
01:04:39 it’d be that we’re all no longer living
01:04:42 in this physical world,
01:04:43 that we’re all living in a virtual world.
01:04:45 You really need to read Calculating God by Sawyer.
01:04:51 It’s a, he’ll read it in a night.
01:04:53 It’s a very easy read,
01:04:54 but it’s, assuming you’re that kind of reader,
01:04:56 but it’s a good story.
01:04:58 And it’s kind of about this,
01:04:59 but not in a way that it appears.
01:05:01 And I really enjoyed the thought experiment.
01:05:07 And I’m pretty sure it’s Robert Sawyer.
01:05:08 But anyway, he’s apparently
01:05:10 Canada’s top science fiction writer,
01:05:12 which is why the story mostly takes place in Toronto.
01:05:14 But it’s a very good sort of story
01:05:18 that sort of imagines this.
01:05:21 Very different kind of simulation hypothesis sort of thing
01:05:25 from say, The Egg, for example.
01:05:28 You know, I’m talking about the short story.
01:05:32 By the guy who did The Martian.
01:05:34 Who wrote The Martian?
01:05:36 You know what I’m talking about.
01:05:37 The Martian. Matt Damon.
01:05:38 The book.
01:05:39 So we had this whole discussion
01:05:41 that Michael doesn’t partake in this exercise of reading.
01:05:45 He doesn’t seem to like it,
01:05:46 which seems very strange to me,
01:05:48 considering how much he has to read.
01:05:50 I read all the time.
01:05:50 I used to read 10 books every week
01:05:53 when I was in sixth grade or whatever.
01:05:55 I was, a lot of it’s science fiction,
01:05:57 a lot of it’s history, but I love to read.
01:05:59 But anyway, you should read Calculating God.
01:06:01 I think you’ll, it’s very easy to read, like I said,
01:06:04 and I think you’ll enjoy sort of the ideas that it presents.
01:06:08 Yeah, I think the thought experiment is quite interesting.
01:06:12 One thing I’ve noticed about people growing up now,
01:06:15 I mean, we talk about social media,
01:06:17 but video games is a much bigger,
01:06:19 bigger and bigger and bigger part of their lives.
01:06:21 And the video games have become much more realistic.
01:06:24 I think it’s possible that the three of us are not,
01:06:31 maybe the two of you are not familiar exactly
01:06:33 with the numbers we’re talking about here.
01:06:36 The number of people.
01:06:37 It’s bigger than movies, right?
01:06:38 It’s huge.
01:06:39 I used to do a lot of the computational narrative stuff.
01:06:42 I understand that economists can actually see
01:06:45 the impact of video games on the labor market.
01:06:48 That there’s fewer young men of a certain age
01:06:54 participating in like paying jobs than you’d expect.
01:06:59 And that they trace it back to video games.
01:07:01 I mean, the problem with Star Trek
01:07:02 was not warp drive or teleportation.
01:07:06 It was the holodeck.
01:07:07 Like if you have the holodeck, that’s it.
01:07:12 That’s it, you go in the holodeck, you never come out.
01:07:13 I mean, it just never made, once I saw that,
01:07:16 I thought, okay, well, so this is the end of humanity
01:07:19 as we know it, right?
01:07:20 They’ve invented the holodeck.
01:07:21 Because that feels like the singularity,
01:07:23 not some AGI or whatever.
01:07:25 It’s some possibility to go into another world
01:07:28 that can be artificially made better than this one.
01:07:32 And slowing it down so you live forever.
01:07:34 Or speeding it up so you appear to live forever.
01:07:35 Or making the decision of when to die.
01:07:39 And then most of us will just be old people on the porch
01:07:42 yelling at the kids these days in their virtual reality.
01:07:47 But they won’t hear us because they’ve got headphones on.
01:07:49 So, I mean, rewinding back to MOOCs,
01:07:53 is there lessons that you’ve, speaking to kids these days?
01:07:58 That was a transition.
01:07:59 That was fantastic.
01:08:01 I’ll fix it in post.
01:08:04 That’s Charles’s favorite phrase.
01:08:06 Fix it in post?
01:08:07 Fix it in post.
01:08:08 Fix it in post.
01:08:08 When we were recording all the time,
01:08:10 whenever the editor didn’t like something or whatever,
01:08:12 I would say, we’ll fix it in post.
01:08:14 He hated that.
01:08:15 He hated that more than anything.
01:08:16 Because it’s Charles’s way of saying,
01:08:17 I’m not gonna do it again.
01:08:20 You’re on your own for this one.
01:08:22 But it always got fixed in post.
01:08:24 Exactly right.
01:08:24 So is there something you’ve learned about,
01:08:28 I mean, it’s interesting to talk about MOOCs.
01:08:29 Is there something you’ve learned
01:08:30 about the process of education,
01:08:32 about thinking about the present?
01:08:35 I think there’s two lines of conversation to be had here.
01:08:38 There’s the future of education in general
01:08:41 that you’ve learned about.
01:08:42 And more passionately is the education
01:08:49 in the times of COVID.
01:08:50 Yeah.
01:08:51 The second thing in some ways matters more than the first,
01:08:54 for at least in my head,
01:08:55 not just because it’s happening now,
01:08:57 but because I think it’s reminded us of a lot of things.
01:09:00 Coincidentally, today, there’s an article out
01:09:02 by a good friend of mine,
01:09:04 who’s also a professor at Georgia Tech,
01:09:06 but more importantly, a writer and editor
01:09:07 at the Atlantic, a guy named Ian Bogost.
01:09:10 And the title is something like,
01:09:13 Americans Will Sacrifice Anything
01:09:15 for the College Experience.
01:09:17 And it’s about why we went back to college
01:09:20 and why people wanted us to go back to college.
01:09:22 And it’s not greedy presidents
01:09:24 trying to get the last dollar from someone.
01:09:26 It’s because they want to go to college.
01:09:28 And what they’re paying for is not the classes.
01:09:29 What they’re paying for is the college experience.
01:09:32 It’s not the education that’s being there.
01:09:33 I’ve believed this for a long time,
01:09:35 that we continually make this mistake of,
01:09:39 people want to go back to college
01:09:40 as being people want to go back to class.
01:09:42 They don’t.
01:09:43 They want to go back to campus.
01:09:44 They want to move away from home.
01:09:44 They want to do all those things that people experience.
01:09:47 It’s a rite of passage.
01:09:48 It’s an identity, if I can steal some of Ian’s words here.
01:09:53 And I think that’s right.
01:09:54 And I think what we’ve learned through COVID
01:09:57 is, it has made it clear
01:09:59 that the disaggregation was not the disaggregation
01:10:02 of the education from the place, the university place,
01:10:05 the idea that you can get the best anywhere you want to.
01:10:07 Turns out there’s lots of reasons
01:10:08 why that is not necessarily true.
01:10:10 The disaggregation is having it shoved in our faces
01:10:13 that the reason to go, again,
01:10:14 that the reason to go to college
01:10:16 is not necessarily to learn.
01:10:18 It’s to have the college experience.
01:10:20 And that’s very difficult for us to accept,
01:10:21 even though we behave that way,
01:10:23 most of us, when we were undergrads.
01:10:26 A lot of us didn’t go to every single class.
01:10:28 We learned and we got it and we look back on it
01:10:30 and we’re happy we had the learning experience as well,
01:10:32 obviously, particularly us,
01:10:33 because this is the kind of thing that we do.
01:10:35 And my guess is that’s true
01:10:36 of the vast majority of your audience.
01:10:39 But that doesn’t mean the
01:10:41 “I’m standing in front of you telling you this”
01:10:43 is the thing that people are excited about.
01:10:47 And that’s why they want to be there,
01:10:49 primarily why they want to be there.
01:10:50 So to me, that’s what COVID has forced us to deal with,
01:10:54 even though I think we’re still all in deep denial about it
01:10:57 and hoping that it’ll go back to that.
01:10:59 And I think about 85% of it will.
01:11:01 We’ll be able to pretend
01:11:02 that that’s really the way it is, again,
01:11:03 and we’ll forget the lessons of this.
01:11:05 But technically what’ll come out of it,
01:11:07 or technologically what’ll come out of it
01:11:09 is a way of providing a more dispersed experience
01:11:12 through online education
01:11:13 and these kinds of remote things that we’ve learned.
01:11:16 And we’ll have to come up with new ways to engage them
01:11:19 in the experience of college,
01:11:20 which includes not just the parties
01:11:22 or the whatever kids do,
01:11:23 but the learning part of it
01:11:25 so that they actually come out four or five
01:11:27 or six years later with having actually learned something.
01:11:30 So I think the world
01:11:32 will be radically different afterwards.
01:11:34 And I think technology will matter for that,
01:11:36 just not in the way that the people
01:11:38 who were building the technology originally
01:11:40 imagined it would be.
01:11:42 And I think this would have been true even without COVID,
01:11:45 but COVID has accelerated that reality.
01:11:47 So it’s happening in two or three years or five years,
01:11:50 as opposed to 10 or 15.
01:11:52 That was an amazing answer that I did not understand.
01:11:56 It was passionate and meaningful.
01:11:58 Shots fired.
01:11:59 But I don’t, no, I just didn’t,
01:12:00 no, I’m not trying to criticize it.
01:12:01 I just think, I don’t think I’m getting it.
01:12:03 So you mentioned disaggregation.
01:12:05 So what’s that?
01:12:06 Well, so the power of technology
01:12:09 that if you go on the West Coast and hang out long enough,
01:12:11 is all about, we’re gonna disaggregate these things that used to go together.
01:12:13 The books from the bookstore, that kind of a thing.
01:12:15 And then suddenly Amazon controls the universe, right?
01:12:17 And technology is a disruptor, right?
01:12:19 And people have been predicting that
01:12:20 for higher education for a long time,
01:12:22 but certainly in the age of MOOCs.
01:12:23 So is this the sort of idea like
01:12:26 students can aggregate on a campus someplace
01:12:30 and then take classes over the network anywhere?
01:12:33 Yeah, this is what people thought was gonna happen,
01:12:34 or at least people claimed it was gonna happen, right?
01:12:37 Because my daughter is essentially doing that now.
01:12:38 She’s on one campus, but learning in a different campus.
01:12:41 Sure, and COVID makes that possible, right?
01:12:43 COVID makes that legal, all but unavoidable, right?
01:12:47 But the idea originally was that,
01:12:49 you and I were gonna create this machine learning class
01:12:51 and it was gonna be great,
01:12:52 and then no one else would,
01:12:52 there’d be the machine learning class everyone takes, right?
01:12:54 That was never gonna happen, but something like that,
01:12:57 you can see happening. But I feel like
01:12:58 you didn’t address that.
01:12:58 Why, why, why is it that, why, why?
01:13:02 I don’t think that will be the thing that happens.
01:13:04 So the college experience,
01:13:05 maybe I missed what the college experience was.
01:13:07 I thought it was peers, like people hanging around.
01:13:10 A large part of it is peers.
01:13:11 Well, it’s peers and independence.
01:13:13 Yeah, but none of that,
01:13:15 you can do classes online for all of that.
01:13:17 No, no, no, no, because we’re social people, right?
01:13:20 So you wanna be in the same room.
01:13:21 So when we take the classes,
01:13:22 that also has to be part of an experience.
01:13:25 It’s in a context, and the context is the university.
01:13:27 And by the way, it actually matters
01:13:29 that Georgia Tech really is different from Brown.
01:13:33 I see, because then students can choose
01:13:36 the kind of experience they think
01:13:37 is gonna be best for them.
01:13:38 Okay, I think we’re giving too much agency to the students
01:13:41 in making an informed decision.
01:13:42 Okay. But the truth,
01:13:43 but yes, they will make choices
01:13:45 and they will have different experiences.
01:13:46 And some of those choices will be made for them.
01:13:48 Some of them will be choices they’re making
01:13:49 because they think it’s this, that, or the other.
01:13:51 I just don’t want to say,
01:13:52 I don’t want to give the idea.
01:13:53 It’s not homogenous.
01:13:55 Yes, it’s certainly not homogenous, right?
01:13:56 I mean, Georgia Tech is different from Brown.
01:13:59 Brown is different from pick your favorite state school
01:14:03 in Iowa, Iowa State, okay?
01:14:05 Which I guess is my favorite state school in Iowa.
01:14:07 But these are all different.
01:14:09 They have different contexts.
01:14:10 And a lot of those contexts are,
01:14:12 they’re about history, yes,
01:14:13 but they’re also about the location of where you are.
01:14:15 They’re about the larger group of people who are around you,
01:14:18 whether you’re in Athens, Georgia,
01:14:20 and you’re basically the only thing that’s there
01:14:23 as a university, you’re responsible for all the jobs,
01:14:25 or whether you’re at Georgia State University,
01:14:27 which is an urban campus,
01:14:28 where you’re surrounded by six million people
01:14:31 and where your campus ends and the city begins,
01:14:33 we don’t know.
01:14:35 It actually matters whether you’re a small campus
01:14:37 or a large campus.
01:14:38 I mean, these things matter.
01:14:38 Why is it that if you go to Georgia Tech,
01:14:41 you’re forever proud of that,
01:14:44 and you say that to people at dinners,
01:14:47 like bars and whatever,
01:14:49 and if you get a degree at an online university somewhere,
01:14:56 that’s not a thing that comes up at a bar.
01:14:58 Well, it’s funny you say that.
01:14:59 So the students who take our online masters
01:15:03 by several measures are more loyal
01:15:06 than the students who come on campus,
01:15:07 certainly for the master’s degree.
01:15:09 The reason for that, I think,
01:15:10 and you’d have to ask them,
01:15:11 but based on my conversations with them,
01:15:13 I feel comfortable saying this,
01:15:15 is because this didn’t exist before.
01:15:18 I mean, we talk about this online masters
01:15:19 and that it’s reaching 11,000 students,
01:15:22 and that’s an amazing thing,
01:15:22 and we’re admitting everyone who we believe can succeed.
01:15:25 We got a 60% acceptance rate.
01:15:26 It’s amazing, right?
01:15:27 It’s also a $6,600 degree.
01:15:29 The entire degree costs $6,600 or $7,000,
01:15:32 depending on how long you take.
01:15:33 A $7,000 degree, as opposed to the $46,000
01:15:35 it would cost you to come on campus.
01:15:37 So that feels, and I can do it while I’m working full time,
01:15:40 and I’ve got a family and a mortgage
01:15:42 and all these other things.
01:15:43 So it’s an opportunity to do something you wanted to do,
01:15:46 but you didn’t think was possible
01:15:47 without giving up two years of your life,
01:15:50 as well as all the money
01:15:51 and everything else in the life that you had built.
01:15:53 So I think we created something that’s had an impact,
01:15:56 but importantly, we gave a set of people opportunities
01:15:59 they otherwise didn’t feel they had.
01:16:00 So I think people feel very loyal about that.
01:16:02 And my biggest piece of evidence for that,
01:16:04 besides the surveys,
01:16:05 is that we have somewhere north of 80 students,
01:16:08 might be 100 at this point,
01:16:09 who graduated, but come back and TA for this class,
01:16:15 for basically minimum wage,
01:16:16 even though they’re working full time,
01:16:17 because they believe in sort of having that opportunity
01:16:21 and they wanna be a part of something.
01:16:23 Now, will generation three feel this way?
01:16:25 15 years from now, will people have that same sense?
01:16:28 I don’t know, but right now they kind of do.
01:16:31 And so it’s not the online,
01:16:32 it’s a matter of feeling as if you’re a part of something.
01:16:36 Right, we’re all very tribal, right?
01:16:39 And I think there’s something very tribal
01:16:42 about being a part of something like that.
01:16:44 Being on campus makes that easier,
01:16:45 going through a shared experience makes that easier.
01:16:48 It’s harder to have that shared experience
01:16:49 if you’re alone looking at a computer screen.
01:16:52 We can create ways to make that true.
01:16:53 But is it possible?
01:16:54 It is possible.
01:16:55 The question is, it still is the intuition to me,
01:16:58 and it was at the beginning when I saw something
01:17:01 like the online master’s program,
01:17:04 is that this is gonna replace universities.
01:17:07 No, it won’t replace universities.
01:17:09 But like why?
01:17:11 Because it’s living
01:17:11 in a different part of the ecosystem, right?
01:17:13 The people who are taking it are already adults,
01:17:15 they’ve gone through their undergrad experience.
01:17:18 I think their goals have shifted from when they were 17.
01:17:21 They have other things that are going on.
01:17:23 But it does do something really important,
01:17:25 something very social and very important, right?
01:17:28 You know this whole thing about,
01:17:30 don’t build the sidewalks, just leave the grass
01:17:32 and the students or the people will walk
01:17:33 and you put the sidewalks where they create paths,
01:17:35 this kind of thing.
01:17:36 That’s interesting, yeah.
01:17:37 Their architects apparently believe
01:17:39 that’s the right way to do things.
01:17:40 The metaphor here is that we created this environment,
01:17:45 we didn’t quite know how to think about the social aspect,
01:17:48 but we didn’t have time to solve all,
01:17:51 do all the social engineering, right?
01:17:53 The students did it themselves,
01:17:54 they created these groups, like on Google Plus,
01:17:58 there were like 30 something groups created
01:18:00 in the first year because somebody had used Google Plus.
01:18:04 And they created these groups
01:18:05 and they divided up in ways that made sense.
01:18:07 We live in the same state or we’re working
01:18:08 on the same things or we have the same background
01:18:10 or whatever and they created these social things.
01:18:12 We sent them T-shirts and they wear them,
01:18:14 we have all these great pictures of students
01:18:16 putting on their T-shirts as they travel around the world.
01:18:18 I climbed this mountain top, I’m putting this T-shirt on,
01:18:20 I’m a part of this. They were a part of it.
01:18:22 They created the social environment
01:18:24 on top of the social network and the social media
01:18:26 that existed to create this sense of belonging
01:18:29 and being a part of something.
01:18:30 They found a way to do it, right?
01:18:32 And I think they had other,
01:18:36 it scratched an itch that they had,
01:18:38 but they had scratched some of that itch
01:18:40 that might’ve required they’d be physically
01:18:41 in the same place long before, right?
01:18:44 So I think, yes, it’s possible
01:18:47 and it’s more than possible, it’s necessary.
01:18:49 But I don’t think it’s going to replace the university
01:18:54 as we know it.
01:18:55 The university as we know it will change.
01:18:57 But there’s just a lot of power
01:18:59 in the kind of rite of passage
01:19:00 kind of going off to yourself.
01:19:01 Now, maybe there’ll be some other rite of passage
01:19:03 that’ll happen.
01:19:03 That’ll drive us somewhere else, it’s possible.
01:19:06 So the university is such a fascinating mess of things.
01:19:11 So just even the faculty position is a fascinating mess.
01:19:14 Like it doesn’t make any sense.
01:19:15 It’s stabilized itself,
01:19:18 but like why are the world class researchers
01:19:22 spending a huge amount of their time teaching
01:19:26 and doing service?
01:19:27 Like you’re doing like three jobs.
01:19:29 And I mean, it turns out it’s maybe an accident of history
01:19:34 or human evolution, I don’t know.
01:19:36 It seems like the people who are really good at teaching
01:19:38 are often really good at research.
01:19:40 There seems to be a parallel there,
01:19:42 but like it doesn’t make any sense
01:19:44 that you should be doing that.
01:19:45 At the same time, it also doesn’t seem to make sense
01:19:48 that your place where you party
01:19:53 is the same place where you go to learn calculus
01:19:56 or whatever.
01:19:57 But it’s a safe space.
01:19:59 Safe space for everything.
01:20:00 Yeah, relatively speaking, it’s a safe space.
01:20:02 Now, by the way, I feel the need very strongly
01:20:05 to point out that we are living
01:20:07 in a very particular weird bubble, right?
01:20:09 Most people don’t go to college.
01:20:10 And by the way, the ones who do go to college,
01:20:12 they’re not 18 years old, right?
01:20:14 They’re like 25 or something.
01:20:15 I forget the numbers.
01:20:17 The places where we’ve been, where we are,
01:20:20 they look like whatever we think
01:20:22 the traditional movie version of universities are.
01:20:25 But for most people, it’s not that way at all.
01:20:27 By the way, most people who drop out of college,
01:20:28 it’s entirely for financial reasons, right?
01:20:32 So we were talking about a particular experience.
01:20:36 And so for that set of people,
01:20:38 which is very small, but larger than it was a decade
01:20:42 or two or three or four, certainly, ago,
01:20:45 I don’t think that will change.
01:20:47 My concern, which I think is kind of implicit
01:20:50 in some of these questions,
01:20:51 is that somehow we will divide the world up further
01:20:55 into the people who get to have this experience
01:20:57 and get to have the network
01:20:57 and they sort of benefit from it,
01:20:59 and everyone else, while increasingly requiring
01:21:01 that they have more and more credentials
01:21:03 in order to get a job as a barista, right?
01:21:05 You gotta have a master’s degree
01:21:07 in order to work at Starbucks.
01:21:08 I mean, we’re gonna force people to do these things,
01:21:10 but they’re not gonna get to have that experience,
01:21:12 and there’ll be a small group of people who do
01:21:13 who will continue to, you know, get the positive feedback
01:21:15 loop, et cetera, et cetera, et cetera.
01:21:16 I worry a lot about that, which is why, for me,
01:21:21 and by the way, here’s an answer
01:21:21 to your question about faculty,
01:21:22 which is why, to me, that you have to focus
01:21:24 on access and the mission.
01:21:26 I think the reason, whether it’s good, bad, or strange,
01:21:28 I mean, I agree, it’s strange,
01:21:29 but I think it’s useful to have the faculty member,
01:21:32 particularly at large R1 universities
01:21:33 where we’ve all had experiences,
01:21:36 that you tie what they get to do
01:21:41 to the fundamental mission of the university
01:21:43 and let the mission drive.
01:21:45 What I hear when I talk to faculty is,
01:21:47 they love their PhD students
01:21:48 because they’re reproducing, basically, right?
01:21:51 And it lets them do their research and multiply.
01:21:53 But they understand that the mission is the undergrads,
01:21:57 and so they will do it without complaint, mostly,
01:22:00 because it’s a part of the mission and why they’re here,
01:22:02 and they have experiences with it themselves,
01:22:04 and it was important to get them
01:22:06 where they were going.
01:22:07 The people who tend to get squeezed in that, by the way,
01:22:09 are the master’s students, right,
01:22:10 who are neither the PhDs who are like us
01:22:12 nor the undergrads, who we have already bought into the idea
01:22:14 that we have to teach.
01:22:16 Though that’s increasingly changing.
01:22:18 Anyway, I think tying that mission in really matters,
01:22:21 and it gives you a way to unify people
01:22:23 around making it an actual higher calling.
01:22:26 Education feels like more of a higher calling to me
01:22:28 than even research,
01:22:30 because education, you cannot treat it as a hobby
01:22:33 if you’re going to do it well.
01:22:34 But that’s the pushback on this whole system
01:22:38 is that education should be a full time job, right?
01:22:44 And it’s almost like research is a distraction from that.
01:22:49 Yes, although I think most of our colleagues,
01:22:51 many of our colleagues would say that research is the job
01:22:53 and education is the distraction.
01:22:55 Right, but that’s the beautiful dance.
01:22:56 It seems to be that tension in itself seems to work,
01:23:01 seems to bring out the best in the faculty.
01:23:07 But I will point out two things.
01:23:08 One thing I’m going to point out,
01:23:09 and the other thing I want Michael to point out,
01:23:10 because I think Michael is much closer
01:23:11 to sort of the ideal professor in some sense than I am.
01:23:17 Well, he is a dean.
01:23:18 You’re the platonic sense of a professor.
01:23:19 I don’t know what he meant by that,
01:23:20 but he is a dean, so he has a different experience.
01:23:23 I’m giving him time to think of the profound thing
01:23:26 he’s going to say.
01:23:27 That was good.
01:23:27 But let me point this out,
01:23:28 which is that we have lecturers
01:23:31 in the College of Computing where I am.
01:23:33 There’s 10 or 12 of them, depending on how you count,
01:23:35 as opposed to the 90 or so tenure track faculty.
01:23:39 Those 10 lecturers who only teach,
01:23:41 well, they don’t only teach, they also do service.
01:23:42 Some of them do research as well, but primarily they teach.
01:23:46 They teach 50%, over 50% of our credit hours,
01:23:49 and we teach everybody, right?
01:23:51 So per person, they’re doing
01:23:54 more than eight times the work
01:23:56 of the tenure track faculty,
01:23:59 really closer to nine or 10.
01:24:01 And that’s including our grad courses, right?
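As a rough sanity check on that ratio, assuming roughly 10 lecturers and 90 tenure-track faculty, with each group covering about half of the credit hours as described above, the per-person teaching share works out to about nine to one:

    (0.5 / 10) / (0.5 / 90) = 90 / 10 = 9

The headcounts and the 50% split are the approximate figures from the conversation, not exact numbers.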
01:24:03 So they’re doing this, they’re teaching more,
01:24:05 they’re touching more than anyone,
01:24:07 and they’re beloved for it.
01:24:08 I mean, so we recently had a survey.
01:24:11 Everyone does these alumni surveys.
01:24:12 You hire someone from the outside to do whatever,
01:24:14 and I was really struck by something.
01:24:15 You saw all these really cool numbers.
01:24:17 I’m not going to talk about it
01:24:18 because it’s all internal, confidential stuff.
01:24:19 But one thing I will talk about
01:24:21 is there was a single question we asked our alum,
01:24:23 and these are people who graduated,
01:24:24 born in the 30s and 40s,
01:24:25 all the way up to people who graduated last week, right?
01:24:29 Well, last semester.
01:24:30 Okay, good.
01:24:32 Time flies.
01:24:33 Yeah, time flies.
01:24:34 And it was the question,
01:24:36 name a single person who had a strong positive impact on you,
01:24:40 something like that.
01:24:42 I think it was special impact?
01:24:44 Yeah, special impact on you.
01:24:45 And then, so they got all the answers from people,
01:24:47 and they created a word cloud.
01:24:49 It was clearly a word cloud created by people
01:24:50 who don’t do word clouds for a living
01:24:52 because they had one person whose name appeared
01:24:54 like nine different times,
01:24:56 like Philip, Phil, Dr. Phil, you know, but whatever.
01:24:59 But they got all this.
01:25:00 And I looked at it, and I noticed something really cool.
01:25:02 The five people from the College of Computing,
01:25:06 I recognized, were in that cloud.
01:25:09 And four of them were lecturers,
01:25:13 the people who teach.
01:25:15 Two of them, relatively modern,
01:25:17 both were chairs of our division of computing instruction.
01:25:19 One just retired, one is going to retire soon.
01:25:22 And the other two were lecturers,
01:25:23 I remembered, from the 1980s.
01:26:26 Two of those four actually have...
01:25:28 By the way, the fifth person was Charles.
01:25:29 That’s not important.
01:25:30 The thing is, I don’t tell people that.
01:26:32 But two of those people,
01:26:34 our teaching awards are named after them.
01:25:36 Thank you, Michael.
01:26:36 Two of those four, our teaching awards are named after them, right?
01:25:39 So when you ask students, alumni,
01:25:41 people who are now 60, 70 years old even,
01:25:44 you know, who touched them?
01:25:45 They say the Dean of Students.
01:25:46 They say the big teachers who taught
01:25:48 the big introductory classes that got me into it.
01:25:50 There’s a guy named Richard Park who’s on there,
01:25:52 who’s, you know, who’s known as a great teacher.
01:25:55 The Phil Adler guy who,
01:25:58 I probably just said his last name wrong,
01:26:00 but I know the first name’s Phil
01:26:01 because he kept showing up over and over again.
01:26:03 Famous.
01:26:03 Adler is what it said.
01:26:04 Okay, good.
01:26:05 But different people spelled it differently.
01:26:06 So he appeared multiple times.
01:26:07 Right.
01:26:08 So he was a, clearly,
01:26:10 he was a professor in the business school.
01:26:14 But when you read about him,
01:26:15 I went to read about him because I was curious who he was.
01:26:17 You know, it’s all about his teaching
01:26:18 and the students that he touched, right?
01:26:20 So whatever it is that we’re doing
01:26:22 and we think we’re doing that’s important
01:26:23 or why we think the universities function,
01:26:25 the people who go through it,
01:26:27 they remember the people who were kind to them,
01:26:29 the people who taught them something,
01:26:31 and they do remember it.
01:26:32 They remember it later.
01:26:33 I think that’s important.
01:26:35 That’s why the mission matters.
01:26:37 Yeah.
01:26:38 Not to completely lose track of the fundamental problem
01:26:41 of how we replace the party aspect of universities,
01:26:46 before we get to what makes the platonic professor.
01:26:51 Do you think, like, what in your sense is the role of MOOCs
01:26:57 in this whole picture during COVID?
01:27:00 Like, should we desperately be clamoring
01:27:04 to get back on campus?
01:27:05 Or is this a stable place to be for a little while?
01:27:08 I don’t know.
01:27:09 I know that the online teaching experience
01:27:12 and learning experience has been really rough.
01:27:15 I think that people find it to be a struggle
01:27:18 in a way that’s not a happy, positive struggle,
01:27:21 that when you got through it,
01:27:23 you just feel like glad that it’s over
01:27:24 as opposed to I’ve achieved something.
01:27:27 So, you know, I worry about that.
01:27:29 But, you know, I worry about just even before this happened,
01:27:33 I worry about lecture teaching,
01:27:35 how well is that actually really working
01:27:38 as far as a way to do education,
01:27:40 as a way to inspire people.
01:27:43 I mean, all the data that I’m aware of seems to indicate,
01:27:47 and this kind of fits, I think, with Charles’s story,
01:27:49 is that people respond to connection, right?
01:27:54 They actually feel, if they feel connected
01:27:57 to the person teaching the class,
01:27:59 they’re more likely to go along with it.
01:28:00 They’re more able to retain information.
01:28:02 They’re more motivated to be involved
01:28:05 in the class in some way.
01:28:06 And that really matters.
01:28:09 People…
01:28:10 You mean to the human themselves.
01:28:12 Yeah.
01:28:13 Okay, can’t you do that actually
01:28:14 perhaps more effectively online?
01:28:18 Like you mentioned, science communication.
01:28:20 So I literally, I think, learned linear algebra
01:28:24 from Gilbert Strang by watching MIT OpenCourseWare
01:28:28 when I was in track.
01:28:29 And he was a personality.
01:28:33 In this tiny little world of math,
01:28:35 he’s a bit of a rockstar, right?
01:28:36 So you kind of look up to that person.
01:28:40 Can’t that replace the in person education?
01:28:44 It can help.
01:28:45 I will point out something, I can’t share the numbers,
01:28:47 but we have surveyed our students,
01:28:50 and even though they have feelings
01:28:51 about what I would interpret as connection,
01:28:54 I like that word, in the different modes of classrooms,
01:28:58 there’s no difference between how well
01:29:00 they think they’re learning.
01:29:02 For them, the thing that makes them unhappy
01:29:05 is the situation they’re in.
01:29:06 And I think the lack of connection,
01:29:08 it’s not whether they’re learning anything.
01:29:10 They seem to think they’re learning something anyway, right?
01:29:13 In fact, they seem to think
01:29:14 they’re learning it equally well,
01:29:16 presumably because the faculty are putting in,
01:29:20 or the instructors, more generally speaking,
01:29:22 are putting in the energy and effort
01:29:25 to try to make certain that what they’ve curated
01:29:28 can be expressed to them in a useful way.
01:29:30 But the connection is missing.
01:29:31 And so there’s huge differences in what they prefer.
01:29:34 And as far as I can tell,
01:29:35 what they prefer is more connection, not less.
01:29:37 That connection just doesn’t have to be physically
01:29:39 in a classroom.
01:29:40 I mean, look, I used to teach 348 students
01:29:43 in my machine learning class on campus.
01:29:44 Do you know why?
01:29:45 That was the biggest classroom on campus.
01:29:48 They’re sitting in theater seats.
01:29:50 I’m literally on a stage looking down on them
01:29:54 and talking to them, right?
01:29:56 There’s no, I mean, we’re not sitting down,
01:29:59 having a one on one conversation,
01:30:01 reading each other’s body language,
01:30:02 trying to communicate and going,
01:30:04 we’re not doing any of that.
01:30:05 So if you’re past the third row,
01:30:07 it might as well be online anyway
01:30:08 is the kind of thing that people have said.
01:30:10 Daphne has actually said some version of this
01:30:12 that online starts on the third row or something like that.
01:30:15 And I think that’s not, yeah, I like it.
01:30:18 I think it captures something important.
01:30:20 But people still came, by the way.
01:30:22 Even the people who had access to our material
01:30:23 would still come to class.
01:30:25 I mean, there’s a certain element
01:30:26 about looking to the person next to you.
01:30:28 It’s just like their presence there, their boredom.
01:30:32 And like when the parts are boring
01:30:34 and their excitement when the parts are exciting,
01:30:37 like in sharing in that,
01:30:39 like unspoken kind of, yeah, communication.
01:30:43 In part, the connection is with the other people
01:30:45 in the room.
01:30:46 Yeah, watching the circus on TV alone is not really the same.
01:30:52 Ever been to a movie theater
01:30:53 and been the only one there at a comedy?
01:30:55 It’s not as funny as when you’re in a room
01:30:58 full of people all laughing.
01:31:00 Well, you need, maybe you need just another person.
01:31:02 It’s like, as opposed to many.
01:31:04 Maybe there’s some kind of.
01:31:06 Well, there’s different kinds of connection, right?
01:31:07 And there’s different kinds of comedy.
01:31:11 Well, in the sense that.
01:31:12 As we’re learning today.
01:31:15 I wasn’t sure if that was gonna land.
01:31:16 But just the idea that different jokes,
01:31:21 I’ve now done a little bit of standup.
01:31:23 And so different jokes work in different size crowds too.
01:31:26 No, it’s true.
01:31:27 Where sometimes if it’s a big enough crowd,
01:31:30 then even a really subtle joke can take root someplace
01:31:33 and then that cues other people.
01:31:34 And it kind of,
01:31:36 there’s a whole statistics of.
01:31:38 I did this terrible thing to my brother.
01:31:40 So when I was really young,
01:31:41 I decided that my brother was only laughing
01:31:44 when I laughed.
01:31:46 Like he was taking cues from me.
01:31:48 So I like purposely didn’t laugh
01:31:50 just to see if I was right.
01:31:50 And did you laugh at non funny things?
01:31:52 Yes.
01:31:53 You really wanna do both sides.
01:31:54 I did both sides.
01:31:54 And at the end of it, I told him what I did.
01:31:58 He was very upset about this.
01:32:00 And from that day on.
01:32:01 He lost his sense of humor.
01:32:03 No, no, no, no.
01:32:03 Well, yes.
01:32:04 But from that day on, he laughed on his own.
01:32:07 He stopped taking cues from me.
01:32:08 I see.
01:32:09 So I wanna say that it was a good thing that I did.
01:32:11 Yes, yes.
01:32:12 You saved that man’s life.
01:32:14 Yes, but it was mostly mean.
01:32:15 But it’s true though.
01:32:15 It’s true, right?
01:32:16 That people, I think you’re right.
01:32:19 But okay, so where does that get us?
01:32:20 That gets us the idea that,
01:32:23 I mean, certainly movie theaters are a thing, right?
01:32:26 Where people like to be watching together,
01:32:28 even though the people on the screen
01:32:30 aren’t really co present with the people in the audience.
01:32:33 The audience is co present with themselves.
01:32:35 By the way, on that point,
01:32:36 it’s an open question that’s being raised by this,
01:32:38 whether movies will no longer be a thing
01:32:40 because Netflix’s audience is growing.
01:32:43 So that’s, it’s a very parallel question for education.
01:32:47 Will movie theaters still be a thing in 2021?
01:32:50 No, but I think the argument is
01:32:52 that there is a feeling of being in the crowd
01:32:54 that isn’t replicated by being at home watching it
01:32:57 and that there’s value in that.
01:32:59 And then I think just.
01:33:00 But, but.
01:33:02 It scales better online.
01:33:03 But I feel like we’re having a conversation
01:33:06 about whether concerts will still exist
01:33:09 after the invention of the record or the CD
01:33:13 or wherever it is, right?
01:33:13 They won’t.
01:33:14 You’re right, concerts are dead.
01:33:16 Well, okay, I think the joke is only funny
01:33:19 if you say it before now.
01:33:21 Right, yeah, that’s true.
01:33:23 Like three years ago.
01:33:24 It’s like, well, no, obviously concerts are still a big thing.
01:33:25 I’ll wait to publish this until we have a vaccine.
01:33:27 No, you know, we’ll fix it in post.
01:33:30 But I think the important thing is.
01:33:33 Fix the virus in post.
01:33:34 Concerts changed, right?
01:33:36 Concerts changed.
01:33:37 First of all, movie theaters weren’t this way, right?
01:33:39 In like the 60s and 70s, they weren’t like this.
01:33:41 Like blockbusters were basically what?
01:33:44 Well, Jaws and Star Wars created blockbusters, right?
01:33:47 Before then, there weren’t.
01:33:47 Like the whole shared summer experience
01:33:49 didn’t exist in our lifetimes, right?
01:33:52 Certainly you were well into adulthood
01:33:53 by the time this was true, right?
01:33:54 So it’s just a very different.
01:33:56 It’s very different.
01:33:57 So what we’ve been experiencing in the last 10 years
01:33:59 is not like the majority of human history,
01:34:01 but more importantly, concerts, right?
01:34:03 Concerts mean something different.
01:34:04 Most people don’t go to concerts anymore.
01:34:07 Like there’s an age where you care about it.
01:34:09 You sort of stop doing it,
01:34:10 but you keep listening to music or whatever
01:34:12 and da, da, da, da, da, da, da.
01:34:13 So I think that’s a painful way of saying that
01:34:21 it will change.
01:34:22 That's not the same thing as it going away.
01:34:23 Replace is too strong of a word, but it will change.
01:34:27 It has to.
01:34:27 Actually, like to push back, I wonder,
01:34:29 because I think you’re probably just throwing
01:34:31 that your intuition now.
01:34:33 Oh, I wasn’t.
01:34:34 And it’s possible that concerts,
01:34:37 more people go to concerts now,
01:34:39 but obviously much more people listen to,
01:34:42 well, that’s dumb, than before there was records.
01:34:46 It’s possible to argue that if you look at the data,
01:34:51 that it just expanded the pie of what music listening means.
01:34:55 So it’s possible that universities grow in the parallel
01:34:59 or the theaters grow,
01:35:00 but also more people get to watch movies,
01:35:02 more people get to be educated.
01:35:05 Yeah, I hope that is true.
01:35:07 Yeah, and to the extent that we can grow the pie
01:35:09 and have education be not just something you do
01:35:11 for four years when you’re done with your other education,
01:35:16 but it be a more lifelong thing,
01:35:19 that would have tremendous benefits,
01:35:20 especially as the economy and the world change rapidly.
01:35:24 People need opportunities to stay abreast of these changes.
01:35:28 And so, I don’t know,
01:35:31 that’s all part of the ecosystem.
01:35:33 It’s all to the good.
01:35:34 I mean, I’m not gonna have an argument
01:35:36 about whether we lost fidelity
01:35:38 when we went from Laserdisc to DVDs
01:35:40 or record players to CDs.
01:35:43 I mean, I’m willing to grant that that is true,
01:35:45 but convenience matters, and so does the ability
01:35:50 to do something that you couldn't do otherwise
01:35:51 because of that convenience.
01:35:53 And you can tell me I’m only getting 90% of the experience,
01:35:56 but I’m getting the experience.
01:35:57 I wasn’t getting it before or it wasn’t lasting as long
01:36:00 or it wasn’t as easy.
01:36:00 I mean, this just seems straightforward to me.
01:36:03 It’s gonna, it’s going to change.
01:36:05 It is for the good that more people get access
01:36:08 and it is our job to do two separate things.
01:36:10 One, to educate them and make access available.
01:36:13 That’s our mission.
01:36:14 But also for very simple selfish reasons,
01:36:17 we need to figure out how to do it better
01:36:18 so that we individually stay in business.
01:36:20 We can do both of those things at the same time.
01:36:21 They may be in tension,
01:36:24 but they are not mutually exclusive.
01:36:28 So you’ve educated some scary number of people.
01:36:34 So you’ve seen a lot of people succeed,
01:36:37 find their path through life.
01:36:39 Is there advice that you can give to a young person today
01:36:45 about computer science education,
01:36:48 about education in general, about life,
01:36:53 about whatever the journey that one takes in there,
01:36:59 maybe in their teens, in their early 20s,
01:37:02 sort of in those undergrad years
01:37:05 as you try to go through the essential process of partying
01:37:09 and not going to classes
01:37:10 and yet somehow trying to get a degree?
01:37:12 If you get to the point where you’re far enough up
01:37:16 in the hierarchy of needs that you can actually
01:37:20 make decisions like this,
01:37:21 then find the thing that you’re passionate about
01:37:24 and pursue it.
01:37:25 And sometimes it’s the thing that drives your life
01:37:27 and sometimes it’s secondary.
01:37:29 And you’ll do other things because you’ve got to eat, right?
01:37:31 You’ve got a family, you’ve got to feed,
01:37:32 you’ve got people you have to help or whatever.
01:37:34 And I understand that and it’s not easy for everyone,
01:37:36 but always take a moment or two
01:37:39 to pursue the things that you love,
01:37:42 the things that bring passion and happiness to your life.
01:37:45 And if you don’t, I know that sounds corny,
01:37:46 but I genuinely believe it.
01:37:47 And if you don’t have such a thing,
01:37:49 then you’re lying to yourself.
01:37:51 You have such a thing.
01:37:52 You just have to find it.
01:37:53 And it’s okay if it takes you a long time to get there.
01:37:56 Rodney Dangerfield became a comedian in his 50s, I think.
01:38:00 Certainly wasn’t his 20s.
01:38:01 And lots of people failed for a very long time
01:38:03 before getting to where they were going.
01:38:06 I try to have hope and it wasn’t obvious.
01:38:09 I mean, you and I talked about the experience that I had
01:38:13 a long time ago with a particular police officer.
01:38:17 It wasn't my first one and it wasn't my last one.
01:38:20 But in my view, I wasn’t supposed to be here after that
01:38:24 and I’m here.
01:38:25 So it’s all gravy.
01:38:25 So you might as well go ahead and grab life as you can
01:38:29 because of that.
01:38:29 That’s sort of how I see it.
01:38:31 While recognizing, again, the delusion matters, right?
01:38:34 Allow yourself to be deluded.
01:38:35 Allow yourself to believe that it’s all gonna work out.
01:38:38 Just don’t be so deluded that you miss the obvious.
01:38:41 And you’re gonna be fine.
01:38:43 It’s gonna be there.
01:38:44 It’s gonna be there.
01:38:45 It’s gonna work out.
01:38:46 What do you think?
01:38:47 I like to say choose your parents wisely
01:38:51 because that has a big impact on your life.
01:38:53 It’s different.
01:38:54 Yeah, I mean, there’s a whole lot of things
01:38:57 that you don’t get to pick.
01:38:58 And whether you get to have one kind of life
01:39:02 or a different kind of life can depend a lot
01:39:05 on things out of your control.
01:39:06 But I really do believe in the passion, excitement thing.
01:39:09 My, I was talking to my mom on the phone the other day
01:39:11 and essentially what came out is that computer science
01:39:19 is really popular right now.
01:39:22 And I get to be a professor teaching something
01:39:25 that’s very attractive to people.
01:39:28 And she was like trying to give me some appreciation
01:39:33 for how foresightful I was for choosing this line of work
01:39:37 as if somehow I knew that this is what was gonna happen
01:39:40 in 2020, but that’s not how it went for me at all.
01:39:44 Like I studied computer science
01:39:45 because I was just interested.
01:39:47 It was just so interesting to me.
01:39:49 I didn’t think it would be particularly lucrative.
01:39:54 And I’ve done everything I’ve can to keep it
01:39:56 as unlucrative as possible.
01:39:59 Some of my friends and colleagues have not done that.
01:40:03 And I pride myself on my ability to remain unrich.
01:40:07 But I do believe that, like I’m glad.
01:40:13 I mean, I’m glad that it worked out for me.
01:40:15 It could have been like, oh, what I was really fascinated by
01:40:17 is this particular kind of engraving
01:40:19 that nobody cares about.
01:40:20 But so I got lucky and the thing that I cared about
01:40:22 happened to be a thing that other people
01:40:24 eventually cared about.
01:40:26 But I don’t think I would have had a fun time
01:40:28 choosing anything else.
01:40:29 Like this was the thing that kept me interested and engaged.
01:40:32 Well, one thing that people tell me,
01:40:34 especially around the early undergraduate,
01:40:38 and the internet is part of the problem here,
01:40:41 is they say they’re passionate about so many things.
01:40:44 How do I choose a thing?
01:40:46 Which is a harder thing for me to know what to do with.
01:40:50 Is there any?
01:40:51 I mean, don’t you know which, I mean, you know, look.
01:40:55 A long time ago, I walked down a hallway
01:40:57 and I took a left turn.
01:40:59 Yeah.
01:40:59 I could have taken a right turn.
01:41:01 And my world could be better or it could be worse.
01:41:03 I have no idea.
01:41:04 I have no way of knowing.
01:41:05 Is there anything about this particular hallway
01:41:07 that’s relevant or you’re just in general choices?
01:41:09 Yeah, you were on the left.
01:41:09 It sounds like you regret not taking the right turn.
01:41:11 Oh no, not at all.
01:41:12 You brought it up.
01:41:13 Well, because there was a turn there.
01:41:16 On the left was Michael Littman's office, right?
01:41:18 I mean, these sorts of things happen, right?
01:41:20 But here’s the thing.
01:41:20 On the right, by the way, there was just a blank wall.
01:41:22 It wasn’t a huge choice.
01:41:24 It would have really hurt.
01:41:25 He tried first.
01:41:26 No, but it’s true, right?
01:41:27 You know, I think about Ron Brachman, right?
01:41:29 I went, I took a trip I wasn’t supposed to take
01:41:33 and I ended up talking to Ron about this
01:41:38 and I ended up going down this entire path
01:41:40 that allowed me to, I think, get tenure.
01:41:42 But by the way, I decided to say yes to something
01:41:45 that didn’t make any sense
01:41:46 and I went down this educational path.
01:41:48 But it would have been, you know, who knows, right?
01:41:50 Maybe if I hadn’t done that,
01:41:52 I would be a billionaire right now.
01:41:54 I’d be Elon Musk.
01:41:55 My life could be so much better.
01:41:57 My life could also be so much worse.
01:41:59 You know, you just gotta feel that sometimes
01:42:01 you have decisions you’re gonna make.
01:42:03 You cannot know what’s gonna do.
01:42:04 You should think about it, right?
01:42:05 Some things are clearly smarter than other things.
01:42:07 You gotta play the odds a little bit.
01:42:09 But in the end, if you’ve got multiple choices,
01:42:11 there are lots of things you think you might love.
01:42:12 Go with the thing that you actually love,
01:42:14 the thing that jumps out at you
01:42:15 and sort of pursue it for a little while.
01:42:17 The worst thing that’ll happen is you took a left turn
01:42:18 instead of a right turn and you ended up merely happy.
01:42:22 Beautiful.
01:42:23 So, taking the step
01:42:26 and just accepting that,
01:42:28 not questioning the choice.
01:42:31 Life is long and there's time to actually pursue it.
01:42:36 Every once in a while, you have to put on a leather suit
01:42:41 and make a thriller video.
01:42:43 Every once in a while.
01:42:44 If I ever get the chance again, I’m doing it.
01:42:47 Yeah.
01:42:49 I was told that you actually dance,
01:42:50 but that part was edited out.
01:42:53 I don’t dance.
01:42:55 There was a thing where we did do the zombie thing.
01:42:59 We did do the zombie thing.
01:43:00 That wasn’t edited out.
01:43:01 It just wasn’t put into the final thing.
01:43:05 I’m quite happy.
01:43:06 There was a reason for that too, right?
01:43:07 Like I wasn’t wearing something right.
01:43:09 There was a reason for that.
01:43:10 I can’t remember what it was.
01:43:11 No leather suit.
01:43:12 Is that what it was?
01:43:13 I can’t remember.
01:43:14 Anyway, the right thing happened.
01:43:16 Exactly.
01:43:16 You took the left turn and ended up being the right thing.
01:43:19 So a lot of people that are a little bit
01:43:23 tangential to the programming and the computing world
01:43:26 ask me, and they're interested in learning programming,
01:43:28 like all kinds of disciplines that are outside
01:43:30 of the particular discipline of computer science.
01:43:33 What advice do you have for people
01:43:36 that want to learn how to program
01:43:38 or want to either taste this little skill set
01:43:43 or discipline or try to see if it can be used somehow
01:43:47 in their own life?
01:43:48 What stage of life are they in?
01:43:53 One of the magic things about the internet is that,
01:43:55 with the people that write me, I don't know.
01:43:58 Because my answer's different for different people. My daughter
01:44:00 is taking AP computer science right now.
01:44:02 Hi, Joni.
01:44:03 She’s amazing and doing amazing things
01:44:06 and my son’s beginning to get interested
01:44:08 and I’ll be really curious where he takes it.
01:44:10 I think his mind actually works very well
01:44:12 for this sort of thing and she’s doing great.
01:44:14 But one of the things I have to tell her all the time,
01:44:17 she'll say, well, I want to make a rhythm game.
01:44:19 So I want to go for two weeks and then build a rhythm game.
01:44:23 Show me how to build a rhythm game.
01:44:25 Start small, learn the building blocks
01:44:27 and take the time.
01:44:28 Have patience, eventually you’ll build a rhythm game.
01:44:31 I was in grad school when I suddenly woke up one day
01:44:34 over the Royal East and I thought, wait a minute,
01:44:37 I’m a computer scientist.
01:44:38 I should be able to write Pac Man in an afternoon.
01:44:39 And I did, not with great graphics.
01:44:42 It was actually a very cool game.
01:44:43 I had to figure out how the ghost moved and everything
01:44:45 and I did it in an afternoon in Pascal
01:44:47 on an old Apple IIGS.
01:44:49 But if I had started out trying to build Pac Man,
01:44:52 I think it probably would have ended very poorly for me.
01:44:55 Luckily back then, there weren’t
01:44:57 these magical devices we call phones
01:44:58 and software everywhere to give me this illusion
01:45:01 that I could create something by myself
01:45:03 from the basics inside of a weekend like that.
01:45:05 I mean, that was a culmination of years and years and years
01:45:09 right before I decided, oh, I should be able to write this
01:45:11 and I could.
01:45:12 So my advice if you’re early on is you’ve got the internet.
01:45:16 There are lots of people there to give you the information.
01:45:18 Find someone who cares about this.
01:45:20 Remember, they’ve been doing it for a very long time.
01:45:22 Take it slow, learn the little pieces, get excited about it
01:45:25 and then keep the big project you want to build in mind.
01:45:28 You’ll get there soon enough.
01:45:29 Because as a wise man once said, life is long.
01:45:32 Sometimes it doesn’t seem that long, but it is long
01:45:35 and you’ll have enough time to build it all out.
01:45:39 All the information is out there, but start small.
01:45:43 Generate Fibonacci numbers.
01:45:44 That’s not exciting, but it’ll get you the language.
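As a concrete version of that starter exercise, here is a minimal sketch in Python, the language recommended a few lines below; the function name and the iterative approach are one reasonable choice among many, not anything from the conversation itself:

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers as a list."""
    numbers = []
    a, b = 0, 1
    for _ in range(n):
        numbers.append(a)   # record the current number
        a, b = b, a + b     # step to the next pair in the sequence
    return numbers

print(fibonacci(10))  # [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```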
01:45:48 Well, there’s only one programming language, it’s Lisp.
01:45:50 But if you have to pick a programming language,
01:45:53 I guess in today’s day, what would I do?
01:45:55 I guess I’d do.
01:45:56 Python is basically Lisp, but with better syntax.
01:46:00 Blasphemy.
01:46:01 Yeah, with C syntax, how about that?
01:46:03 So you’re gonna argue that C syntax
01:46:05 is better than anything?
01:46:07 Anyway, also I’m gonna answer Python despite what he said.
01:46:10 Tell your story about somebody’s dissertation
01:46:12 that had a Lisp program in it.
01:46:14 It was so funny.
01:46:15 This is Dave’s, Dave’s dissertation was like,
01:46:17 Dave McAllester, who was a professor at MIT for a while
01:46:20 and then he came to Bell Labs, and now he's at the
01:46:24 Toyota Technological Institute at Chicago.
01:46:26 A brilliant guy.
01:46:27 Such an interesting guy.
01:46:28 Anyway, his thesis, it was a theorem prover
01:46:33 and he decided to have as an appendix his actual code,
01:46:38 which of course was all written in Lisp
01:46:39 because of course it was.
01:46:40 And like the last 20 pages are just right parentheses.
01:46:43 It’s just wonderful.
01:46:47 That’s programming right there.
01:46:48 Pages upon pages of right parentheses.
01:46:51 Anyway, Lisp is the only real language,
01:46:52 but I understand that that’s not necessarily
01:46:54 the place where you start.
01:46:56 Python is just fine.
01:46:57 Python is good.
01:46:59 If you’re, you know, of a certain age,
01:47:00 if you’re really young and trying to figure it out,
01:47:02 graphical languages that let you kind of see
01:47:04 how the thing works and that’s fine too.
01:47:05 They’re all fine.
01:47:06 It almost doesn’t matter.
01:47:07 But there are people who spend a lot of time
01:47:09 thinking about how to build languages that get people in.
01:47:13 The question is, are you trying to get in
01:47:14 and figure out what it is?
01:47:15 Or do you already know what you want?
01:47:18 And that’s why I asked you what stage of life people are in
01:47:19 because if you're at different stages of life,
01:47:21 you would attack it differently.
01:47:23 The answer to that question of which language
01:47:25 keeps changing, I mean, there’s some value
01:47:27 to exploring, a lot of people write to me about Julia.
01:47:33 There’s these like more modern languages
01:47:35 that keep being invented, Rust and Kotlin.
01:47:39 There's stuff for people who love
01:47:42 functional languages like Lisp,
01:47:44 where apparently there are echoes of that,
01:47:46 but done much better, in the modern languages.
01:47:49 And it’s worthwhile to,
01:47:51 especially when you’re learning languages,
01:47:53 it feels like it’s okay to try one
01:47:55 that’s not like the popular one.
01:47:57 Oh yeah, but you want something simple.
01:47:59 And I think you get that way of thinking
01:48:02 almost no matter what language.
01:48:04 And if you push far enough,
01:48:06 like it can be assembly language,
01:48:08 but you need to push pretty far
01:48:09 before you start to hit the really deep concepts
01:48:11 that you would get sooner in other languages.
01:48:13 But like, I don't know, computation is kind of computation,
01:48:16 it's all kind of Turing equivalent.
01:48:19 And so it matters how you express things,
01:48:22 but you have to build out that mental structure
01:48:24 in your mind.
01:48:25 And I don't think it super matters which language.
01:48:28 I mean, it matters a little,
01:48:29 because some things are just
01:48:31 at the wrong level of abstraction.
01:48:32 I think assembly is at the wrong level of abstraction
01:48:33 for someone coming in new.
01:48:35 I think that if you start.
01:48:37 For someone coming in new.
01:48:38 Yes, and big frameworks are quite a bit too much.
01:48:42 You know, you’ve got to get to the point
01:48:43 where I want to learn a new language,
01:48:44 means I just pick up a reference book
01:48:46 and I think of a project and I go through it in a weekend.
01:48:49 Right, you got to get there.
01:48:50 You’re right though, the languages that are designed
01:48:52 for that are, it almost doesn’t matter.
01:48:54 Pick the ones that people have built tutorials
01:48:57 and infrastructure around to help you
01:48:59 kind of ease into it.
01:49:00 Because it’s hard.
01:49:01 I mean, I did this little experiment once.
01:49:05 I was teaching intro to CS in the summer as a favor.
01:49:11 Which is, anyway.
01:49:12 I was teaching intro to CS as a favor.
01:49:15 And it was very funny because I’d go in every single time
01:49:17 and I would think to myself,
01:49:18 how am I possibly going to fill up an hour and a half
01:49:21 talking about for loops, right?
01:49:23 And there wasn’t enough time.
01:49:25 Took me a while to realize this, right?
01:49:26 There are only three things, right?
01:49:27 There’s reading from a variable,
01:49:29 writing to a variable and conditional branching.
01:49:31 Everything else is syntactic sugar, right?
01:49:34 The syntactic sugar matters, but that’s it.
01:49:36 And when I say that’s it, I don’t mean it’s simple.
01:49:38 I mean, it’s hard.
01:49:40 Like conditional branching, loops, variable.
01:49:43 Those are really hard concepts.
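To make that claim concrete, here is a small sketch in Python of the same computation written twice: once with the usual loop syntax, and once stripped down to nothing but variable reads, variable writes, and a conditional branch. The example itself is an editorial illustration, not something from the course being described:

```python
# With the syntactic sugar: sum the numbers 1 through 5.
total = 0
for i in range(1, 6):
    total = total + i

# Desugared into the three primitives: read a variable,
# write a variable, branch on a condition.
total = 0              # write
i = 1                  # write
while True:
    if i > 5:          # read i, conditional branch
        break
    total = total + i  # read total and i, write total
    i = i + 1          # read i, write i

print(total)  # 15, either way
```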
01:49:45 So you shouldn’t be discouraged by this.
01:49:47 Here’s a simple experiment.
01:49:48 I’m gonna ask you a question now.
01:49:49 You ready?
01:49:50 X equals three.
01:49:51 Okay.
01:49:53 Y equals four.
01:49:54 Okay.
01:49:55 What is X?
01:49:57 Three.
01:49:57 What is Y?
01:49:59 Four.
01:49:59 Y equals X.
01:50:00 I’m gonna mess this up.
01:50:01 No, it’s easy.
01:50:02 Y equals X.
01:50:04 Y equals X.
01:50:04 What is Y?
01:50:07 Three.
01:50:08 That’s right.
01:50:09 X equals seven.
01:50:11 What is Y?
01:50:12 That’s one of the trickiest things to get for programmers,
01:50:15 that there’s a memory and the variables are pointing
01:50:19 to a particular thing in memory,
01:50:21 and sometimes the languages hide that from you
01:50:23 and they bring it closer
01:50:24 to the way you think mathematics works.
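Here is that on-air quiz as runnable Python, plus one extra case, an editorial addition rather than something from the conversation, where the hidden "name pointing at a thing in memory" model becomes visible:

```python
# The quiz: with plain integers, rebinding x later doesn't touch y.
x = 3
y = 4
y = x       # y now refers to the value 3
x = 7       # rebinding x does not change y
print(y)    # 3

# With a mutable object, two names can point at the same thing in memory.
xs = [3]
ys = xs     # ys and xs are two names for the same list object
xs[0] = 7   # mutate the shared object through xs
print(ys[0])  # 7: the change is visible through ys as well
```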
01:50:26 Right, so in fact, Mark Guzdial,
01:50:28 who worries about these sorts of things,
01:50:30 or used to worry about these sorts of things anyway,
01:50:32 had this kind of belief that actually,
01:50:35 people when they see these statements,
01:50:36 X equals something, Y equals something, Y equals X,
01:50:39 that you have now made a mathematical statement
01:50:42 that Y and X are the same.
01:50:45 Which you can if you just put like an ampersand in front of it.
01:50:48 Yes, but people, that’s not what you’re doing, right?
01:50:51 I thought, and I kind of asked the question,
01:50:54 and I think I had some evidence for this,
01:50:55 it’s hardly a study,
01:50:56 is that most of the people who didn’t know the answer,
01:50:59 weren’t sure about the answer, they had used spreadsheets.
01:51:02 Ah, interesting.
01:51:03 And so it’s, you know,
01:51:06 it’s by reference, or by name really, right?
01:51:10 And so depending upon what you think they are,
01:51:13 you get completely different answers.
01:51:14 The fact that I could go, or one could go,
01:51:17 two thirds of the way through a semester,
01:51:20 and people still hadn’t figured out in their heads,
01:51:22 when you say Y equals X, what that meant,
01:51:25 tells you it’s actually hard.
01:51:27 Because all those answers are possible,
01:51:29 and in fact, when you said,
01:51:30 oh, if you just put an ampersand in front of it,
01:51:31 I mean, that doesn’t make any sense for an intro class,
01:51:33 and of course a lot of languages
01:51:34 don’t even give you the ability
01:51:35 to think about it in terms of ampersand.
01:51:37 Do we want to have a 45 minute discussion
01:51:38 about the difference between EQ and EQUAL in Lisp?
01:51:42 Yeah.
01:51:43 I know you do.
01:51:44 No.
01:51:44 But you know, you could do that.
01:51:47 This is actually really hard stuff.
01:51:49 So you shouldn’t be, it’s not too hard, we all do it,
01:51:52 but you shouldn’t be discouraged.
01:51:53 It’s why you should start small,
01:51:55 so that you can figure out these things,
01:51:56 so you have the right model in your head,
01:51:58 so that when you write the language,
01:51:59 you can execute it, and build the machine
01:52:02 that you want to build, right?
01:52:03 Yeah, the funny thing about programming,
01:52:05 and those very basic things,
01:52:06 is the very basics are not often made explicit,
01:52:11 which is actually what drives everybody away
01:52:13 from basically any discipline,
01:52:15 but programming is just another one.
01:52:17 Like even a simpler version of the equal sign
01:52:19 that I kind of forget, is in mathematics,
01:52:23 equals is not assignment.
01:52:25 Yeah.
01:52:26 Like, I think in basically every single programming language,
01:52:30 with just a handful of exceptions,
01:52:33 equals is assignment.
01:52:35 And you have some other operator for equality.
01:52:38 And even that, like everyone kind of knows it,
01:52:42 once you started doing it,
01:52:45 but like you need to say that explicitly,
01:52:47 or you just realize it, like yourself.
01:52:51 Otherwise you might be stuck for,
01:52:53 you said like half a semester,
01:52:54 you could be stuck for quite a long time.
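To spell out that distinction, a tiny Python sketch (the variable names are arbitrary): = binds a name to a value, while == asks whether two values are equal.

```python
x = 3          # assignment: make the name x refer to 3
print(x == 3)  # equality test: True
print(x == 4)  # False

# Unlike the math reading of "x = 4", this line doesn't assert anything
# about x; it just rebinds the name to a new value.
x = 4
print(x == 4)  # True

# Python rejects the classic mix-up outright: "if x = 4:" is a
# syntax error, so the condition has to be written with ==.
if x == 4:
    print("x is four")
```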
01:52:57 And I think also part of the programming
01:53:00 is being okay in that state of confusion for a while.
01:53:04 It’s to the debugging point.
01:53:06 It’s like, I just wrote two lines of code,
01:53:09 why doesn’t this work?
01:53:10 And staring at that for like hours,
01:53:14 and trying to figure out.
01:53:15 And then every once in a while,
01:53:16 you just have to restart your computer
01:53:18 and everything works again.
01:53:19 And then you just kind of stare into the void
01:53:26 with a tear slowly rolling down your cheek.
01:53:26 By the way, the fact that they didn’t get this
01:53:28 actually had no impact on,
01:53:30 I mean, they were still able to do their assignments.
01:53:32 Because it turns out their misunderstanding
01:53:35 wasn’t being revealed to them
01:53:37 by the problem sets we were giving them.
01:53:39 It’s pretty profound actually, yeah.
01:53:41 I wrote a program a long time ago,
01:53:44 actually for my master’s thesis,
01:53:46 and in C++ I think, or C, I guess it was C.
01:53:49 And it was all memory management and terrible.
01:53:52 And it wouldn’t work for a while.
01:53:56 And it was some kind of,
01:53:57 it was clear to me that it was overriding memory.
01:53:59 And I just couldn’t, I was like,
01:54:01 look, I got to pay for this time for this.
01:54:03 So I basically declared a variable
01:54:06 at the front in the main that was like 400K,
01:54:10 just an array, and it worked.
01:54:12 Because wherever I was scribbling over memory,
01:54:14 it would scribble into that space and it didn’t matter.
01:54:17 And so I never figured out what the bug was.
01:54:19 But I did create something to sort of deal with it.
01:54:21 To work around it.
01:54:22 And it, you know, that’s crazy, that’s crazy.
01:54:25 It was okay, because that’s what I wanted.
01:54:27 But I knew enough about memory management to go,
01:54:29 you know,
01:54:30 I’m just going to create an empty array here
01:54:32 and hope that that deals with this scribbling memory problem.
01:54:34 And it did.
01:54:35 That takes a long time to figure out.
01:54:36 And by the way, the language you first learn
01:54:38 probably just has garbage collection anyway,
01:54:39 so you're not even going to
01:54:41 come across that problem.
01:54:43 So we talked about the Minsky idea
01:54:46 of hating everything you do and hating yourself.
01:54:49 So let’s end on a question
01:54:52 that’s going to make both of you very uncomfortable.
01:54:54 Okay.
01:54:55 Which is, what is your, Charles,
01:54:58 what’s your favorite thing that you’re grateful for
01:55:01 about Michael?
01:55:04 And Michael, what is your favorite thing
01:55:06 that you’re grateful for about Charles?
01:55:09 Well, that answer is actually quite easy.
01:55:12 His friendship.
01:55:14 He stole the easy answer.
01:55:15 I did.
01:55:16 Yeah, I can tell you what I hate about Charles,
01:55:17 he steals my good answers.
01:55:19 The thing I like most about Charles,
01:55:21 he sees the world in a similar enough,
01:55:24 but different way that I,
01:55:25 it’s sort of like having another life.
01:55:28 It’s sort of like I get to experience things
01:55:31 that I wouldn’t otherwise get to experience
01:55:32 because I would not naturally gravitate to them that way.
01:55:36 And so he just, he just shows me a whole other world.
01:55:39 It’s awesome.
01:55:39 Yeah, the inner product is not zero for sure.
01:55:44 It’s not quite one, 0.7 maybe.
01:55:47 Just enough that you can learn.
01:55:50 Just enough that you can learn.
01:55:53 That’s the definition of friendship.
01:55:54 The inner product is 0.7.
01:55:55 Yeah, I think so.
01:55:56 That’s the answer to life really.
01:55:58 Charles sometimes believes in me
01:55:59 when I have not believed in me.
01:56:01 He also sometimes projects an outward confidence,
01:56:04 he has so much, so much confidence and self,
01:56:08 I don't know, comfortableness.
01:56:11 Okay, let’s go with that.
01:56:13 That I feel better a little bit.
01:56:16 If he thinks I’m okay,
01:56:17 then maybe I’m not as bad as I think I am.
01:56:20 At the end of the day, luck favors the Charles.
01:56:24 It’s a huge honor to talk with you.
01:56:26 Thank you so much for taking this time,
01:56:29 wasting your time with me.
01:56:30 It was an awesome conversation.
01:56:32 You guys are an inspiration to a huge number of people
01:56:35 and to me, so really enjoyed this.
01:56:37 Thanks for talking to me.
01:56:38 I enjoyed it as well.
01:56:38 Thank you so much.
01:56:39 And by the way, if luck favors the Charles,
01:56:40 then it’s certainly the case
01:56:41 that I’ve been very lucky to know you.
01:56:43 I’m gonna edit that part out.
01:56:47 Thanks for listening to this conversation
01:56:49 with Charles Isbell and Michael Littman.
01:56:51 And thank you to our sponsors,
01:56:53 Athletic Greens, the super nutritional drink,
01:56:57 Eight Sleep, the self cooling mattress,
01:57:00 Masterclass, online courses
01:57:02 from some of the most amazing humans in history,
01:57:05 and Cash App, the app I use to send money to friends.
01:57:09 Please check out the sponsors in the description
01:57:12 to get a discount and to support this podcast.
01:57:16 If you enjoy this thing, subscribe on YouTube,
01:57:18 review it with five stars on Apple Podcast,
01:57:20 follow on Spotify, support it on Patreon,
01:57:23 or connect with me on Twitter at Lex Friedman.
01:57:26 And now, let me leave you with some words from Desmond Tutu.
01:57:30 Don’t raise your voice, improve your argument.
01:57:34 Thank you for listening and hope to see you next time.