Stephen Schwarzman: Going Big in Business, Investing, and AI #96

Transcript

00:00:00 The following is a conversation with Stephen Schwarzman,

00:00:03 CEO and cofounder of Blackstone,

00:00:05 one of the world’s leading investment firms

00:00:08 with over $530 billion of assets under management.

00:00:12 He’s one of the most successful business leaders in history.

00:00:17 I recommend his recent book called What It Takes

00:00:20 that tells stories and lessons from his personal journey.

00:00:24 Stephen is a philanthropist

00:00:26 and one of the wealthiest people in the world,

00:00:28 recently signing the Giving Pledge,

00:00:31 thereby committing to give the majority of his wealth

00:00:33 to philanthropic causes.

00:00:36 As an example, in 2018, he donated $350 million to MIT

00:00:41 to help establish his new College of Computing,

00:00:45 the mission of which promotes interdisciplinary, big,

00:00:48 bold research in artificial intelligence.

00:00:51 For those of you who know me,

00:00:53 know that MIT is near and dear to my heart

00:00:55 and always will be.

00:00:57 It was and is a place where I believe big, bold,

00:01:01 revolutionary ideas have a home,

00:01:03 and that is what is needed

00:01:05 in artificial intelligence research in the coming decades.

00:01:08 Yes, there are institutional challenges,

00:01:11 but also there’s power

00:01:13 in the passion of individual researchers,

00:01:15 from undergrad to PhD,

00:01:17 from young scientists to senior faculty.

00:01:20 I believe the dream to build intelligence systems

00:01:23 burns brighter than ever in the halls of MIT.

00:01:26 This conversation was recorded recently,

00:01:28 but before the outbreak of the pandemic.

00:01:31 For everyone feeling the burden of this crisis,

00:01:33 I’m sending love your way.

00:01:35 Stay strong, we’re in this together.

00:01:38 This is the Artificial Intelligence Podcast.

00:01:41 If you enjoy it, subscribe on YouTube,

00:01:43 review it with five stars on Apple Podcast,

00:01:45 support it on Patreon,

00:01:46 or simply connect with me on Twitter at Lex Fridman,

00:01:50 spelled F R I D M A N.

00:01:52 As usual, I’ll do a few minutes of ads now,

00:01:55 and never any ads in the middle

00:01:56 that can break the flow of the conversation.

00:01:58 I hope that works for you,

00:02:00 and doesn’t hurt the listening experience.

00:02:02 Quick summary of the ads.

00:02:03 Two sponsors, Masterclass and ExpressVPN.

00:02:07 Please consider supporting the podcast

00:02:08 by signing up to Masterclass at masterclass.com slash lex,

00:02:13 and getting ExpressVPN at expressvpn.com slash lexpod.

00:02:19 This show is sponsored by Masterclass.

00:02:22 Sign up at masterclass.com slash lex

00:02:25 to get a discount and support this podcast.

00:02:28 When I first heard about Masterclass,

00:02:30 I thought it was too good to be true.

00:02:32 For $180 a year, you get an all access pass

00:02:35 to watch courses from, to list some of my favorites,

00:02:38 Chris Hadfield on Space Exploration,

00:02:41 Neil deGrasse Tyson on Scientific Thinking and Communication,

00:02:44 Will Wright, creator of SimCity and The Sims, on game design,

00:02:48 Carlos Santana on guitar, Garry Kasparov on chess,

00:02:52 Daniel Negreanu on poker, and many, many more.

00:02:56 Chris Hadfield explaining how rockets work,

00:02:58 and the experience of being launched into space alone

00:03:01 is worth the money.

00:03:03 By the way, you can watch it on basically any device.

00:03:06 Once again, sign up at masterclass.com slash lex

00:03:10 to get a discount and to support this podcast.

00:03:13 This show is sponsored by ExpressVPN.

00:03:16 Get it at expressvpn.com slash lexpod

00:03:20 to get a discount and to support this podcast.

00:03:23 I’ve been using ExpressVPN for many years.

00:03:25 I love it.

00:03:26 It’s easy to use, press the big power on button,

00:03:29 and your privacy is protected.

00:03:31 And, if you like, you can make it look

00:03:33 like your location is anywhere else in the world.

00:03:36 I might be in Boston now, but I can make it look like

00:03:39 I’m in New York, London, Paris,

00:03:42 or anywhere else in the world.

00:03:44 This has a large number of obvious benefits.

00:03:46 Certainly, it allows you to access international versions

00:03:49 of streaming websites like the Japanese Netflix

00:03:52 or the UK Hulu.

00:03:54 ExpressVPN works on any device you can imagine.

00:03:57 I use it on Linux, shout out to Ubuntu 20.04,

00:04:01 Windows, Android, but it’s available everywhere else too.

00:04:05 Once again, get it at expressvpn.com slash lexpod

00:04:09 to get a discount and to support this podcast.

00:04:13 And now, here’s my conversation with Stephen Schwarzman.

00:04:17 Let’s start with a tough question.

00:04:19 What idea do you believe,

00:04:21 whether grounded in data or in intuition,

00:04:24 that many people you respect disagree with you on?

00:04:28 Well, there isn’t all that much anymore

00:04:32 since the world’s so transparent.

00:04:34 But one of the things I believe in, and put in the book,

00:04:39 the book What It Takes, is if you’re gonna do something,

00:04:43 do something very consequential.

00:04:46 Do something that’s quite large, if you can, that’s unique.

00:04:51 Because if you operate in that kind of space,

00:04:54 when you’re successful, it’s a huge impact.

00:04:57 The prospect of success enables you to recruit people

00:05:02 who wanna be part of that.

00:05:04 And those type of large opportunities

00:05:06 are pretty easily described.

00:05:09 And so, not everybody likes to operate at scale.

00:05:14 Some people like to do small things

00:05:16 because it is meaningful for them emotionally.

00:05:21 And so, occasionally, you get a disagreement on that.

00:05:25 But those are life choices rather than commercial choices.

00:05:30 That’s interesting.

00:05:31 What good and bad comes with going big?

00:05:34 We often, in America, think big is good.

00:05:41 What’s the benefit, what’s the cost

00:05:44 in terms of just bigger than business,

00:05:45 but life, happiness, the pursuit of happiness?

00:05:49 Well, you do things that make you happy.

00:05:51 It’s not mandated.

00:05:53 And everybody’s different.

00:05:56 And some people, if they have talent,

00:06:00 like playing pro football,

00:06:02 other people just like throwing the ball around,

00:06:07 not even being on a team.

00:06:09 What’s better?

00:06:10 Depends what your objectives are.

00:06:12 Depends what your talent is.

00:06:15 Depends what gives you joy.

00:06:19 So, in terms of going big,

00:06:21 is it both for impact on the world

00:06:24 and because it personally gives you joy?

00:06:27 Well, it makes it easier to succeed, actually.

00:06:31 Because if you catch something, for example,

00:06:35 that’s cyclical, that’s a huge opportunity,

00:06:39 then you usually can find some place

00:06:42 within that huge opportunity where you can make it work.

00:06:46 If you’re prosecuting a really small thing

00:06:51 and you’re wrong, you don’t have many places to go.

00:06:56 So, I’ve always found that the easier place to be

00:07:00 is where you can concentrate human resources,

00:07:07 get people excited about doing really impactful big things,

00:07:13 and you can afford to pay them, actually.

00:07:16 Because the bigger thing can generate much more

00:07:20 in the way of financial resources.

00:07:24 So, that brings people of talent out to help you.

00:07:28 And so, all together, it’s a virtuous circle, I think.

00:07:34 How do you know an opportunity when you see one

00:07:37 in terms of the one you wanna go big on?

00:07:40 Is it intuition, is it facts?

00:07:43 Is it back and forth deliberation with people you trust?

00:07:48 What’s the process?

00:07:50 Is it art, is it science?

00:07:52 Well, it’s pattern recognition.

00:07:55 And how do you get to pattern recognition?

00:07:57 First, you need to understand the patterns

00:08:00 and the changes that are happening.

00:08:02 And that’s observational on some level.

00:08:08 You can call it data or you can just call it listening

00:08:14 to unusual things that people are saying

00:08:18 that they haven’t said before.

00:08:19 And I’ve always tried to describe this.

00:08:24 It’s like seeing a piece of white lint on a black dress.

00:08:29 But most people disregard that piece of lint.

00:08:33 They just see the dress.

00:08:34 I always see the lint.

00:08:37 And I’m fascinated by how did something get someplace

00:08:41 it’s not supposed to be?

00:08:43 So, it doesn’t even need to be a big discrepancy.

00:08:47 But if something shouldn’t be someplace

00:08:49 in a constellation of facts that sort of made sense

00:08:56 in a traditional way, I’ve learned that if you focus

00:09:01 on why one discordant note is there,

00:09:05 that’s usually a key to something important.

00:09:09 And if you can find two of those discordant notes,

00:09:14 that’s usually a straight line to someplace.

00:09:17 And that someplace is not where you’ve been.

00:09:20 And usually you figure out that things are changing

00:09:24 or have changed, and you describe them,

00:09:27 which you have to be able to do,

00:09:29 because it’s not some odd intuition.

00:09:33 It’s just focusing on facts.

00:09:35 It’s almost like a scientific discovery, if you will.

00:09:39 When you describe it to other people in the real world,

00:09:42 they tend to do absolutely nothing about it.

00:09:46 And that’s because humans are comfortable

00:09:51 in their own reality.

00:09:53 And if there’s no particular reason at that moment

00:09:57 to shake them out of their reality,

00:10:00 they’ll stay in it even if they’re ultimately

00:10:03 completely wrong.

00:10:05 And I’ve always been stunned that when I explain

00:10:09 where we’re going, what we’re doing and why,

00:10:13 almost everyone just says, that’s interesting.

00:10:18 And they continue doing what they’re doing.

00:10:20 And so I think it’s pretty easy to do that.

00:10:26 But what you need is a huge data set.

00:10:29 So before AI and people’s focus on data,

00:10:33 I’ve sort of been doing this mostly my whole life.

00:10:36 I’m not a scientist, let alone a computer scientist.

00:10:40 And you can just hear what people are saying

00:10:43 when somebody says something or you observe something

00:10:46 that simply doesn’t make sense.

00:10:48 That’s when you really go to work.

00:10:50 The rest of it’s just processing.

00:10:52 You know, on a quick tangent,

00:10:55 pattern recognition is a term often used

00:10:57 throughout the history of AI.

00:10:58 The goal of artificial intelligence

00:11:01 is pattern recognition, right?

00:11:03 But there’s, I would say, various flavors of that.

00:11:08 So usually pattern recognition refers to the process

00:11:12 of, as we said, the dress and the lint on the dress.

00:11:17 Pattern recognition is very good at identifying the dress

00:11:21 by looking at the pattern that’s always there,

00:11:24 that’s very common and so on.

00:11:27 You almost refer to a pattern that’s more like

00:11:28 what’s called outlier detection in computer science,

00:11:33 right, the rare thing, the small thing.

00:11:38 Now, AI is not often good at that.
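
The outlier-detection idea raised here, separating the rare "lint" from the common "dress," can be sketched with a minimal example. This is an illustrative sketch only, not something discussed in the conversation; the median-absolute-deviation method and the 3.5 threshold are conventional choices assumed for demonstration.

```python
# Minimal sketch of outlier detection: flag values that deviate
# strongly from the dominant pattern (the "lint on the dress").
# Uses the median absolute deviation (MAD), which, unlike the mean
# and standard deviation, is not dragged toward the outlier itself.
import statistics

def mad_outliers(values, threshold=3.5):
    """Return values whose modified z-score exceeds the threshold."""
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        # All typical values are identical; anything else is an outlier.
        return [v for v in values if v != med]
    # 0.6745 scales the MAD to be comparable to a standard deviation.
    return [v for v in values if 0.6745 * abs(v - med) / mad > threshold]

# Mostly "dress" (values near 10), one piece of "lint" (1000).
data = [10, 11, 9, 10, 12, 10, 9, 11, 1000]
print(mad_outliers(data))  # prints [1000]
```

The median-based statistics matter here: a mean and standard deviation would themselves be pulled toward the outlier, which can mask exactly the discordant note one is trying to find.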

00:11:41 Just almost philosophically,

00:11:46 the kind of decisions you’ve made in your life,

00:11:48 based almost scientifically on data,

00:11:52 do you think AI in the future will be able to make them?

00:11:55 Is it something that could be put down into code

00:11:59 or is it still deeply human?

00:12:01 It’s tough for me to say since I don’t have domain knowledge

00:12:09 in AI to know everything that could or might occur.

00:12:14 I know, sort of in my own case,

00:12:19 that most people don’t see any of that.

00:12:22 I just assumed it was motivational, you know,

00:12:27 but it’s also sort of, it’s hardwiring.

00:12:32 What are you wired or programmed to be finding or looking for?

00:12:38 It’s not what happens every day.

00:12:41 That’s not interesting, frankly.

00:12:44 I mean, that’s what people mostly do.

00:12:47 I do a bunch of that too because, you know,

00:12:49 that’s what you do in normal life.

00:12:52 But I’ve always been completely fascinated

00:12:57 by the stuff that doesn’t fit.

00:13:00 Or the other way of thinking about it,

00:13:02 it’s determining what people want

00:13:07 without them saying it.

00:13:11 That’s a different kind of pattern.

00:13:14 You can see everything they’re doing.

00:13:17 There’s a missing piece.

00:13:18 They don’t know it’s missing.

00:13:20 You think it’s missing given the other facts.

00:13:23 You know about them and you deliver that

00:13:27 and then that becomes, you know,

00:13:28 sort of very easy to sell to them.

00:13:33 To linger on this point a little bit,

00:13:35 you’ve mentioned that in your family,

00:13:37 when you were growing up,

00:13:38 nobody raised their voice in anger or otherwise.

00:13:41 And you said that this allows you to learn to listen

00:13:44 and hear some interesting things.

00:13:47 Can you elaborate as you have been on that idea,

00:13:50 what do you hear about the world if you listen?

00:13:54 Well, you have to listen really intensely

00:13:57 to understand what people are saying

00:14:01 as well as what people are intending

00:14:03 because it’s not necessarily the same thing.

00:14:07 And people mostly give themselves away

00:14:14 no matter how clever they think they are.

00:14:16 Particularly if you have the full array of inputs.

00:14:22 In other words, if you look at their face,

00:14:24 you look at their eyes, which are the window on the soul,

00:14:28 it’s very difficult to conceal what you’re thinking.

00:14:33 You look at facial expressions and posture.

00:14:36 You listen to their voice, which changes.

00:14:41 You know, when you’re talking about something

00:14:44 you’re comfortable with or not,

00:14:46 are you speaking faster?

00:14:48 Is the amplitude of what you’re saying higher?

00:14:51 Most people just give away what’s really on their mind.

00:14:56 You know, they’re not that clever.

00:14:58 They’re busy spending their time thinking about

00:15:00 what they’re in the process of saying.

00:15:03 And so if you just observe that, not in a hostile way,

00:15:07 but just in an evocative way

00:15:10 and just let them talk for a while,

00:15:12 they’ll more or less tell you almost completely

00:15:16 what they’re thinking,

00:15:17 even the stuff they don’t want you to know.

00:15:21 And once you know that, of course,

00:15:24 it’s sort of easy to play that kind of game

00:15:29 because they’ve already told you

00:15:31 everything you need to know.

00:15:32 And so it’s easy to get to a conclusion

00:15:37 if there’s meant to be one, an area of common interest,

00:15:40 since you know almost exactly what’s on their mind.

00:15:44 And so that’s an enormous advantage

00:15:48 as opposed to just walking in someplace

00:15:51 and somebody telling you something

00:15:54 and you believing what they’re saying.

00:15:56 There are so many different levels of communication.

00:16:01 So a powerful approach to life you discuss in the book

00:16:05 on the topic of listening and really hearing people

00:16:08 is figuring out what the biggest problem

00:16:10 bothering a particular individual or group is,

00:16:12 and coming up with a solution to that problem

00:16:15 and presenting them with a solution, right?

00:16:20 In fact, you brilliantly describe a lot of simple things

00:16:24 that most people just don’t do.

00:16:26 It’s kind of obvious,

00:16:28 find the problem that’s bothering somebody deeply.

00:16:31 And as you said, I think you’ve implied

00:16:33 that they will usually tell you what the problem is,

00:16:36 but can you talk about this process

00:16:39 of seeing what the biggest problem for a person is,

00:16:43 trying to solve it,

00:16:44 and maybe a particularly memorable example?

00:16:47 Sure. There are two types of situations:

00:16:52 the first is chance meetings,

00:16:55 and the second is you know you’re gonna meet somebody.

00:16:59 So let’s take the easiest one,

00:17:01 which is you know you’re gonna meet somebody.

00:17:04 And you start trying to make pretend you’re them.

00:17:09 It’s really easy.

00:17:11 What’s on their mind?

00:17:13 What are they thinking about in their daily life?

00:17:16 What are the big problems they’re facing?

00:17:19 So if they’re, you know, to make it a really easy example,

00:17:24 you know, make pretend, you know,

00:17:26 they’re like president of the United States.

00:17:28 Doesn’t have to be this president, could be any president.

00:17:31 So you sort of know what’s more or less on their mind

00:17:34 because the press keeps reporting it.

00:17:37 And you see it on television, you hear it.

00:17:40 People discuss it.

00:17:42 So you know if you’re gonna be running into somebody

00:17:45 in that kind of position.

00:17:47 You sort of know what they look like already.

00:17:50 You know what they sound like.

00:17:52 You know what their voice is like.

00:17:56 And you know what they’re focused on.

00:17:58 And so if you’re gonna meet somebody like that,

00:18:01 what you should do is take the biggest unresolved issue

00:18:05 that they’re facing and come up with

00:18:09 a few interesting solutions

00:18:11 that basically haven’t been out there.

00:18:16 Or that you haven’t heard anybody else

00:18:19 thinking about.

00:18:20 So just to give you an example,

00:18:21 it was sort of in the early 1990s

00:18:24 and I was invited to something at the White House

00:18:26 which was a big deal for me because I was like,

00:18:29 you know, a person from no place.

00:18:30 And you know, I had met the president once before.

00:18:35 It was President Bush,

00:18:37 whose son was in my dormitory.

00:18:40 So I had met him at Parents Day.

00:18:43 I mean it’s just like the oddity of things.

00:18:45 So I knew I was gonna see him

00:18:47 because that’s where the invitation came from.

00:18:51 And so there was something going on

00:18:54 and I just thought about two or three ways

00:18:57 to approach that issue.

00:19:00 And you know, at that point I was separated

00:19:04 and so I had brought a date to the White House

00:19:08 and so I saw the president

00:19:12 and we sort of went over in a corner for about 10 minutes

00:19:16 and discussed whatever this issue was.

00:19:18 And I later went back to my date.

00:19:22 It was a little rude

00:19:22 but it was meant to be a confidential conversation

00:19:25 and I barely knew her.

00:19:27 And you know, she said,

00:19:29 what were you talking about all that time?

00:19:31 I said, well, you know,

00:19:33 there’s something going on in the world

00:19:35 and I’ve thought about different ways

00:19:37 of perhaps approaching that and he was interested.

00:19:41 And the answer is of course he was interested.

00:19:44 Why wouldn’t he be interested?

00:19:45 There didn’t seem to be an easy outcome.

00:19:47 And so, you know, conversations of that type,

00:19:51 once somebody knows you’re really thinking

00:19:53 about what’s good for them and good for the situation,

00:19:58 it has nothing to do with me.

00:20:01 I mean, it’s really about being in service,

00:20:05 you know, to the situation.

00:20:08 Then people trust you and they’ll tell you other things

00:20:12 because they know your motives are basically very pure.

00:20:17 You’re just trying to resolve a difficult situation

00:20:20 or help somebody do it.

00:20:21 So these types of things, you know,

00:20:24 that’s a planned situation, that’s easy.

00:20:27 Sometimes you just come upon somebody

00:20:29 and they start talking and you know,

00:20:31 that requires, you know, like different skills.

00:20:34 You know, you can ask them,

00:20:38 what have you been working on lately?

00:20:39 What are you thinking about?

00:20:41 You can ask them, you know,

00:20:43 has anything been particularly difficult?

00:20:45 And you know, you can ask most people

00:20:48 if they trust you for some reason, they’ll tell you.

00:20:55 And then you have to instantly go to work on it.

00:20:58 And you know, that’s not as good

00:21:02 as having some advanced planning,

00:21:03 but you know, almost everything going on is like out there.

00:21:10 And people who are involved with interesting situations,

00:21:15 they’re playing in the same ecosystem.

00:21:20 They just have different roles in the ecosystem.

00:21:25 And you know, you could do that

00:21:29 with somebody who owns a pro football team

00:21:32 that loses all the time.

00:21:34 We specialize in those in New York.

00:21:37 And you know, you already have analyzed

00:21:41 why they’re losing, right?

00:21:43 Inevitably, it’s because they don’t have a great quarterback,

00:21:48 they don’t have a great coach,

00:21:50 and they don’t have a great general manager

00:21:52 who knows how to hire the best talent.

00:21:55 Those are the three reasons why a team fails, right?

00:21:59 Because there are salary caps,

00:22:01 so every team pays a certain amount of money

00:22:03 for all their players.

00:22:04 So it’s gotta be those three positions.

00:22:07 So if you’re talking with somebody like that,

00:22:09 inevitably, even though it’s not structured,

00:22:13 you’ll know how their team’s doing

00:22:16 and you’ll know pretty much why.

00:22:19 And if you start asking questions about that,

00:22:22 they’re typically very happy to talk about it

00:22:24 because they haven’t solved that problem.

00:22:27 In some cases, they don’t even know that’s the problem.

00:22:29 It’s pretty easy to see it.

00:22:31 So, you know, I do stuff like that,

00:22:33 which I find is intuitive as a process,

00:22:38 but, you know, leads to really good results.

00:22:43 Well, the funny thing is,

00:22:48 for smart people, it’s hard to escape their own ego

00:22:51 and the space of their own problems,

00:22:53 which is what’s required

00:22:55 to think about other people’s problems.

00:22:58 It requires you to let go of the idea

00:23:01 that your own problems are all-important.

00:23:05 I think while it seems obvious

00:23:09 and I think quite brilliant,

00:23:11 it’s just a difficult leap for many people,

00:23:14 especially smart people,

00:23:16 to truly empathize with the problems

00:23:19 of others.

00:23:21 Well, I have a competitive advantage,

00:23:23 which is, I don’t think I’m so smart.

00:23:29 So, you know, it’s not a problem for me.

00:23:31 Well, the truly smartest people I know

00:23:33 say that exact same thing.

00:23:34 Yeah, being humble is really useful,

00:23:39 competitive advantage, as you said.

00:23:42 How do you stay humble?

00:23:44 Well, I haven’t changed much.

00:23:46 Since?

00:23:47 Since I was in my mid teens.

00:23:51 You know, I was raised partly in the city

00:23:54 and partly in the suburbs.

00:23:58 And, you know, whatever the values I had at that time,

00:24:03 those are still my values.

00:24:05 I call them like middle class values,

00:24:08 that’s how I was raised.

00:24:10 And I’ve never changed, why would I?

00:24:14 That’s who I am.

00:24:16 And so the accoutrements of, you know,

00:24:21 the rest of your life has gotta be put on the same,

00:24:25 you know, like solid foundation of who you are.

00:24:28 Because if you start losing who you really are,

00:24:31 who are you?

00:24:32 So I’ve never had the desire to be somebody else.

00:24:37 I just do other things now that I wouldn’t do

00:24:40 as a, you know, sort of as a middle class kid

00:24:43 from Philadelphia.

00:24:45 I mean, my life has morphed on a certain level.

00:24:48 But part of the strength of having integrity

00:24:52 of personality is that you can remain in touch

00:24:57 with everybody who comes from that kind of background.

00:25:04 And, you know, even though I do some things

00:25:07 that aren’t like that, you know,

00:25:09 in terms of people I meet or situations I’m in,

00:25:12 I always look at it through the same lens.

00:25:15 And that’s very psychologically comfortable

00:25:18 and doesn’t require me to make any real adjustments

00:25:22 in my life and I just keep plowing ahead.

00:25:25 There’s a lot of activity in progress in recent years

00:25:29 around effective altruism.

00:25:32 I wanted to bring this topic with you

00:25:34 because it’s an interesting one from your perspective.

00:25:38 You can put it in any kind of terms,

00:25:39 but it’s philanthropy that focuses on maximizing impact.

00:25:44 How do you see the goal of philanthropy,

00:25:47 both from a personal motivation perspective

00:25:50 and the societal big picture impact perspective?

00:25:53 Yeah, I don’t think about philanthropy

00:25:55 the way you would expect me to, okay?

00:25:58 I look at, you know, sort of solving big issues,

00:26:04 addressing big issues, starting new organizations to do it,

00:26:09 much like we do in our business.

00:26:12 You know, we keep growing our business

00:26:14 not by taking the original thing and making it larger,

00:26:17 but continually seeing new things and building those.

00:26:22 And, you know, sort of marshaling financial resources,

00:26:26 human resources, and in our case,

00:26:30 because we’re in the investment business,

00:26:32 we find something new that looks like

00:26:33 it’s gonna be terrific and we do that

00:26:36 and it works out really well.

00:26:38 All I do in what you would call philanthropy

00:26:42 is look at other opportunities to help society.

00:26:47 And I end up starting something new,

00:26:50 marshaling people, marshaling a lot of money,

00:26:53 and then at the end of that kind of creative process,

00:26:56 somebody typically asks me to write a check.

00:26:59 I don’t wake up and say,

00:27:01 how can I give large amounts of money away?

00:27:05 I look at issues that are important for people.

00:27:10 In some cases, I do smaller things.

00:27:13 Because it’s important to a person, and, you know,

00:27:18 I can relate to that person.

00:27:21 There’s some unfairness that’s happened to them.

00:27:24 And so in situations like that,

00:27:26 I’d give money anonymously and help them out.

00:27:29 And, you know, it’s like a miniature version

00:27:35 of addressing something really big.

00:27:37 So, you know, at MIT, I’ve done a big thing,

00:27:46 you know, helping to start this new school of computing.

00:27:49 And I did that because, you know,

00:27:51 I saw that, you know, there’s sort of like a global race on

00:27:57 in AI, quantum, and other major technologies.

00:28:00 And I thought that the US could use more enhancement

00:28:06 from a competitive perspective.

00:28:09 And I also, because I get to China a lot

00:28:12 and I travel around a lot compared to a regular person,

00:28:17 you know, I can see the need to have control

00:28:21 of these types of technologies.

00:28:23 So when they’re introduced, we don’t create a mess

00:28:26 like we did with the internet and with social media.

00:28:30 Unintended consequences, you know,

00:28:33 that are creating all kinds of issues in freedom of speech

00:28:36 and the functioning of liberal democracies.

00:28:39 So with AI, it was pretty clear

00:28:41 that there was enormous difference of views

00:28:44 around the world by the relatively few practitioners

00:28:48 in the world who really knew what was going on.

00:28:51 And by accident, I knew a bunch of these people,

00:28:55 you know, who were like big famous people.

00:28:59 And I could talk to them and say,

00:29:01 why do you think this is a force for bad?

00:29:05 And someone else, why do you feel this is a force for good?

00:29:08 And how do we move forward with the technology

00:29:13 and at the same time make sure that whatever is potentially,

00:29:19 you know, sort of on the bad side of this technology

00:29:22 with, you know, for example, disruption of workforces

00:29:26 and things like that, that could happen much faster

00:29:29 than the industrial revolution.

00:29:32 What do we do about that?

00:29:33 And how do we keep that under control

00:29:35 so that the really good things about these technologies,

00:29:39 which will be great things,

00:29:41 not just good things, are allowed to happen?

00:29:44 So to me, you know, this was one of the great issues

00:29:52 facing society.

00:29:53 The number of people who were aware of it were very small.

00:29:57 I just accidentally got sucked into it.

00:30:00 And as soon as I saw it, I went, oh my God, this is mega,

00:30:06 both on a competitive basis globally,

00:30:09 but also in terms of protecting society

00:30:13 and benefiting society.

00:30:15 So that’s how I got involved.

00:30:17 And at the end, you know, sort of the right thing

00:30:20 that we figured out was, you know,

00:30:22 sort of double MIT’s computer science faculty

00:30:26 and basically create the first AI-enabled university

00:30:30 in the world.

00:30:32 And, you know, in effect, be an example,

00:30:35 a beacon to the rest of the research community

00:30:38 around the world academically,

00:30:40 and create, you know, a much more robust

00:30:49 U.S. competitive situation among the universities.

00:30:52 Because if MIT was going to raise a lot of money

00:30:55 and double its faculty, well, you could bet that,

00:30:59 you know, a number of other universities

00:31:02 were going to do the same thing.

00:31:03 At the end of it, it would be great for knowledge creation,

00:31:08 you know, great for the United States, great for the world.

00:31:12 And so I like to do things that I think are really positive,

00:31:19 things that other people aren’t acting on,

00:31:23 that I see for whatever the reason.

00:31:25 First, it’s just people I meet and what they say,

00:31:29 and I can recognize when something really profound

00:31:32 is about to happen or needs to.

00:31:35 And I do it, and at the end of the situation,

00:31:39 somebody says, can you write a check to help us?

00:31:43 And then the answer is sure.

00:31:44 I mean, because if I don’t, the vision won’t happen.

00:31:48 But it’s the vision of whatever I do

00:31:52 that is compelling.

00:31:54 And essentially, I love that idea of whether it’s small

00:31:59 at the individual level or really big,

00:32:01 like the gift to MIT to launch the College of Computing.

00:32:06 It starts with a vision, and you see philanthropy this way:

00:32:14 the biggest impact you can have is by launching something new,

00:32:18 especially on an issue that others aren’t really addressing.

00:32:22 And I also love the notion, and you’re absolutely right,

00:32:25 that there are other universities, Stanford, CMU,

00:32:30 I’m looking at you, that would essentially follow.

00:32:33 The seed will have a ripple effect

00:32:39 that potentially might help the US be a leader

00:32:42 or continue to be a leader in AI.

00:32:44 It’s potentially a very transformative research

00:32:49 direction.

00:32:50 Just to linger on that point a little bit,

00:32:52 what is your hope long term for the impact

00:32:55 the college here at MIT might have in the next five, 10,

00:33:00 even 20, or let’s get crazy, 30, 50 years?

00:33:03 Well, it’s very difficult to predict the future

00:33:06 when you’re dealing with knowledge production

00:33:08 and creativity.

00:33:11 MIT has, obviously, some unique aspects.

00:33:16 Globally, there are four big academic surveys.

00:33:22 I forget whether it was QS, there’s

00:33:25 the Times in London, the US News, and whatever.

00:33:31 And in one of these recently, MIT was ranked number one

00:33:34 in the world.

00:33:37 So leave aside whether you’re number three somewhere else,

00:33:41 in the great sweep of humanity, this is pretty amazing.

00:33:47 So you have a really remarkable aggregation of human talent

00:33:53 here.

00:33:55 And where it goes, it’s hard to tell.

00:33:58 You have to be a scientist to have the right feel.

00:34:03 But what’s important is you have a critical mass of people.

00:34:08 And I think it breaks into two buckets.

00:34:12 One is scientific advancement.

00:34:15 And if the new college can help either

00:34:21 serve as a convening force within the university

00:34:25 or help coordination and communication among people,

00:34:33 that’s a good thing, absolute good thing.

00:34:36 The second thing is in the AI ethics area,

00:34:41 which is, in a way, equally important.

00:34:48 Because if the science side creates blowback

00:34:55 so that science is a bit crippled in terms

00:35:03 of going forward because society’s reaction to knowledge

00:35:09 advancement in this field becomes really hostile,

00:35:13 then you’ve sort of lost the game

00:35:15 in terms of scientific progress and innovation.

00:35:18 And so the AI ethics piece is super important

00:35:22 because in a perfect world, MIT would

00:35:29 serve as a global convener.

00:35:32 Because what you need is you need the research universities.

00:35:38 You need the companies that are driving AI and quantum work.

00:35:46 You need governments who will ultimately

00:35:49 be regulating certain elements of this.

00:35:53 And you also need the media to be knowledgeable and trained

00:35:59 so we don’t get overreactions to one situation, which then goes

00:36:09 viral and it ends up shutting down

00:36:12 avenues that are perfectly fine to be walking down

00:36:18 or running down.

00:36:20 But if enough discordant information,

00:36:25 not even correct necessarily, sort of gets

00:36:34 pushed around society, then you can end up

00:36:36 with a really hostile regulatory environment and other things.

00:36:40 So you have four drivers that have

00:36:44 to be sort of integrated.

00:36:50 And so if the new school of computing

00:36:55 can be really helpful in that regard,

00:36:58 then that’s a real service to science.

00:37:02 And it’s a service to MIT.

00:37:05 So that’s why I wanted to get involved for both areas.

00:37:10 And the hope is for me, for others,

00:37:12 for everyone, for the world, is for this particular college

00:37:17 of computing to be a beacon and a connector for these ideas.

00:37:22 Yeah, that’s right.

00:37:23 I mean, I think MIT is perfectly positioned to do that.

00:37:31 So you’ve mentioned the media, social media, the internet

00:37:35 as this complex network of communication with flaws,

00:37:41 perhaps you can speak to them.

00:37:44 But I personally think that science and technology

00:37:50 has its flaws, but ultimately is, one, sexy, exciting.

00:37:58 It’s the way for us to explore and understand

00:38:01 the mysteries of our world.

00:38:02 And two, perhaps more importantly for some people,

00:38:06 it’s a huge way to, a really powerful way

00:38:09 to grow the economy, to improve the quality of life

00:38:12 for everyone.

00:38:13 So how do you see the media, social media,

00:38:19 the internet, as a society, having a healthy discourse

00:38:26 about science, first of all, one that’s factual,

00:38:30 and two, one that finds science exciting,

00:38:33 that invests in science, that pushes it forward,

00:38:36 especially in this science fiction, fear-filled field

00:38:41 of artificial intelligence?

00:38:43 Well, I think that’s a little above my pay grade

00:38:45 because trying to control social media

00:38:50 to make it do what you want to do

00:38:52 appears to be beyond almost anybody’s control.

00:38:56 And the technology is being used to create

00:39:00 what I call the tyranny of the minorities.

00:39:04 A minority is defined as two or three people

00:39:07 on a street corner.

00:39:09 Doesn’t matter what they look like.

00:39:11 Doesn’t matter where they came from.

00:39:13 They’re united by that one issue that they care about.

00:39:19 And their job is to enforce their views on the world.

00:39:26 And in the political world, people just

00:39:30 are manufacturing truth.

00:39:34 And they throw it all over.

00:39:36 And it affects all of us.

00:39:38 And sometimes people are just hired to do that.

00:39:45 It’s amazing.

00:39:46 And you think it’s one person.

00:39:48 It’s really just sort of a front for a particular point of view.

00:39:55 And this has become exceptionally disruptive

00:39:59 for society.

00:40:01 And it’s dangerous.

00:40:03 And it’s undercutting the ability of liberal democracies

00:40:07 to function.

00:40:09 And I don’t know how to get a grip on this.

00:40:11 And I was really surprised when I was up here

00:40:16 for the announcement last spring of the College of Computing.

00:40:24 And they had all these famous scientists, some of whom

00:40:27 were involved with the invention of the internet.

00:40:32 And almost every one of them got up and said,

00:40:36 I think I made a mistake.

00:40:39 And as a non-scientist, I never thought

00:40:41 I’d hear anyone say that.

00:40:44 And what they said is, more or less, to make it simple,

00:40:48 we thought this would be really cool inventing the internet.

00:40:52 We could connect everyone in the world.

00:40:55 We can move knowledge around.

00:40:56 It was instantaneous.

00:40:58 It’s a really amazing thing.

00:41:01 He said, I don’t know that there was anyone

00:41:04 who ever thought about social media coming out of that

00:41:07 and the actual consequences for people’s lives.

00:41:12 There’s always some younger person.

00:41:17 I just saw one of these yesterday.

00:41:19 It was reported on the national news,

00:41:21 someone who killed himself when people used social media

00:41:25 to basically sort of ridicule him or something of that type.

00:41:30 This is bad.

00:41:33 This is dangerous.

00:41:35 And so I don’t have a solution for that, other

00:41:43 than to say that, going forward, you can end up

00:41:46 with this type of outcome using AI.

00:41:50 To make this kind of mistake twice is unforgivable.

00:41:56 So interestingly, at least in the West and parts of China,

00:42:02 people are quite sympathetic to the whole concept of AI ethics

00:42:08 and what gets introduced when and cooperation

00:42:13 within your own country, within your own industry,

00:42:17 as well as globally to make sure

00:42:20 that the technology is a force for good.

00:42:24 And that’s a really interesting topic.

00:42:25 Since 2007, you’ve had a relationship

00:42:28 with senior leadership with a lot of people in China

00:42:32 and an interest in understanding modern China,

00:42:36 their culture, their world, much like with Russia.

00:42:39 I’m from Russia originally.

00:42:42 Americans are told a very narrow, one sided story

00:42:44 about China that I’m sure misses a lot

00:42:48 of fascinating complexity, both positive and negative.

00:42:53 What lessons about Chinese culture, its ideas as a nation,

00:42:57 its future do you think Americans should know about,

00:43:00 deliberate on, think about?

00:43:02 Well, it’s sort of a wide question

00:43:06 that you’re asking about.

00:43:09 China is a pretty unusual place.

00:43:11 First, it’s huge.

00:43:15 It’s physically huge.

00:43:17 It’s got a billion three people.

00:43:19 And the character of the people isn’t as well understood

00:43:24 in the United States.

00:43:27 Chinese people are amazingly energetic.

00:43:34 If you’re one of a billion three people,

00:43:37 one of the things you’ve got to be focused on

00:43:40 is how do you make your way through a crowd

00:43:44 of a billion 2.99999 other people.

00:43:50 No, the word for that is competitive.

00:43:52 Yes, they are individually highly energetic,

00:43:57 highly focused, always looking for some opportunity

00:44:02 for themselves because they need to,

00:44:07 because there’s an enormous amount of just literally people

00:44:11 around.

00:44:12 And so what I’ve found is they’ll

00:44:17 try and find a way to win for themselves.

00:44:21 And their country is complicated because it basically

00:44:25 doesn’t have the same kind of functional laws

00:44:29 that we do in the United States and the West.

00:44:34 And the country is controlled really

00:44:39 through a web of relationships you have with other people

00:44:44 and the relationships that those other people have

00:44:47 with other people.

00:44:48 So it’s an incredibly dynamic culture

00:44:53 where if somebody knocks off somebody

00:44:56 at the top who’s three levels above you

00:44:59 and is, in effect, protecting you,

00:45:01 then you’re like a floating molecule there

00:45:08 without tethering except the one or two layers above you.

00:45:12 But that’s going to get affected.

00:45:14 So it’s a very dynamic system.

00:45:15 And getting people to change is not that easy

00:45:19 because if there aren’t really functioning laws,

00:45:23 it’s only the relationships that everybody has.

00:45:27 And so when you decide to make a major change

00:45:30 and you sign up for it, something

00:45:33 is changing in your life.

00:45:36 There won’t necessarily be all the same people on your team.

00:45:40 And that’s a very high risk enterprise.

00:45:43 So when you’re dealing with China,

00:45:46 it’s important to know almost what everybody’s relationship

00:45:50 is with somebody.

00:45:52 So when you suggest doing something differently,

00:45:55 you line up these forces.

00:45:59 In the West, it’s usually you talk to a person

00:46:02 and they figure out what’s good for them.

00:46:04 It’s a lot easier.

00:46:06 And in that sense, in a funny way,

00:46:08 it’s easier to make change in the West,

00:46:11 just the opposite of what people think.

00:46:14 But once the Chinese system adjusts

00:46:17 to something that’s new, everybody’s on the team.

00:46:22 It’s hard to change them.

00:46:23 But once they’re changed, they are incredibly focused in a way

00:46:28 that it’s hard for the West to do

00:46:31 in a more individualistic culture.

00:46:35 So there are all kinds of fascinating things.

00:46:42 One thing that might interest the people who are listening

00:46:46 who are more technologically based than some other group.

00:46:50 I was with one of the top people in the government

00:46:56 a few weeks ago, and he was telling me that every school

00:47:00 child in China is going to be taught computer science.

00:47:08 Now, imagine 100% of these children.

00:47:14 This is such a large number of human beings.

00:47:20 Now, that doesn’t mean that every one of them

00:47:22 will be good at computer science.

00:47:25 But if it’s sort of like in the West,

00:47:28 if it’s like math or English, everybody’s going to take it.

00:47:33 Not everybody’s great at English.

00:47:35 They don’t write books.

00:47:36 They don’t write poetry.

00:47:38 And not everybody’s good at math.

00:47:41 Somebody like myself, I sort of evolved to the third grade,

00:47:44 and I’m still doing flashcards.

00:47:47 I didn’t make it further in math.

00:47:50 But imagine everybody in their society

00:47:54 is going to be involved with computer science.

00:47:58 I’d just even pause on that.

00:48:01 I think computer science involves,

00:48:05 at the basic beginner level, programming.

00:48:07 And the idea that everybody in the society

00:48:11 would have some ability to program a computer is incredible.

00:48:18 For me, it’s incredibly exciting,

00:48:20 and I think that should give the United States pause

00:48:24 and consider what…

00:48:28 Talking about sort of philanthropy and launching things,

00:48:31 there’s nothing like launching,

00:48:33 sort of investing in youth, the education system,

00:48:38 because that’s where everything launches.

00:48:40 Yes.

00:48:41 Well, we’ve got a complicated system

00:48:42 because we have over 3,000 school districts

00:48:45 around the country.

00:48:47 China doesn’t worry about that as a concept.

00:48:50 They make a decision at the very top of the government

00:48:55 that that’s what they want to have happen,

00:48:57 and that is what will happen.

00:48:59 And we’re really handicapped by this distributed power

00:49:07 in the education area,

00:49:08 although some people involved with that area

00:49:10 will think it’s great.

00:49:13 But you would know better than I do

00:49:17 what percent of American children

00:49:18 have computer science exposure.

00:49:23 My guess, with no knowledge, would be 5% or less.

00:49:29 And if we’re going to be going into a world

00:49:33 where the other major economic power,

00:49:37 sort of like ourselves, has got like 100% and we got 5%,

00:49:42 and the whole computer science area is the future,

00:49:49 then we’re purposely or accidentally actually

00:49:52 handicapping ourselves,

00:49:54 and our system doesn’t allow us to adjust quickly to that.

00:50:00 So, you know, issues like this I find fascinating.

00:50:06 And, you know, if you’re lucky enough

00:50:08 to go to other countries, which I do,

00:50:12 and you learn what they’re thinking,

00:50:14 then it informs what we ought to be doing in the United States.

00:50:22 So the current administration, Donald Trump,

00:50:25 has released an executive order on artificial intelligence.

00:50:29 Not sure if you’re familiar with it.

00:50:31 In 2019, looking several years ahead,

00:50:36 how does America sort of,

00:50:38 we’ve mentioned in terms of the big impact,

00:50:41 we hope your investment in MIT will have a ripple effect,

00:50:47 but from a federal perspective, from a government perspective,

00:50:51 how does America establish, with respect to China,

00:50:55 leadership in the world at the top

00:50:57 for research and development in AI?

00:51:00 I think that you have to get the federal government

00:51:04 in the game in a big way,

00:51:07 and that this leap forward technologically,

00:51:13 which is going to happen with or without us,

00:51:17 you know, really should be with us,

00:51:19 and it’s an opportunity, in effect,

00:51:23 for another moonshot kind of mobilization

00:51:28 by the United States.

00:51:30 I think the appetite actually is there to do that.

00:51:38 At the moment, what’s getting in the way

00:51:41 is the kind of poisonous politics we have,

00:51:45 but if you go below the lack of cooperation,

00:51:52 which is almost the defining element of American democracy

00:51:59 right now in the Congress,

00:52:00 if you talk to individual members, they get it,

00:52:04 and they would like to do something.

00:52:08 Another part of the issue is we’re running huge deficits.

00:52:11 We’re running trillion dollar plus deficits.

00:52:13 So how much money do you need for this initiative?

00:52:19 Where does it come from?

00:52:21 Who’s prepared to stand up for it?

00:52:24 Because if it involves taking away resources

00:52:27 from another area, our political system is not real flexible to do that.

00:52:32 If you’re creating this kind of initiative,

00:52:41 which we need, where does the money come from?

00:52:45 And trying to get money

00:52:47 when you’ve got trillion dollar deficits,

00:52:49 in a way, could be easy.

00:52:50 What’s the difference of a trillion

00:52:52 and a trillion and a little more?

00:52:54 But, you know, it’s hard with the mechanisms of Congress.

00:52:58 But what’s really important is this is not an issue

00:53:06 that is unknown, and it’s viewed as a very important issue.

00:53:12 And there’s almost no one in the Congress

00:53:15 when you sit down and explain what’s going on

00:53:18 who doesn’t say, we’ve got to do something.

00:53:22 Let me ask the impossible question.

00:53:26 You didn’t endorse Donald Trump, but after he was elected,

00:53:30 you have given him advice, which seems to me a great thing

00:53:38 to do, no matter who the president is,

00:53:42 to positively contribute to this nation by giving advice.

00:53:47 And yet, you’ve received a lot of criticism for this.

00:53:50 So on the previous topic of science and technology

00:53:54 and government, how do we have a healthy discourse,

00:53:59 give advice, get excited conversation with the government

00:54:05 about science and technology without it becoming politicized?

00:54:09 Well, it’s very interesting.

00:54:12 So when I was young, before there was a moonshot,

00:54:17 we had a president named John F. Kennedy from Massachusetts

00:54:22 here.

00:54:23 And in his inaugural address as president,

00:54:27 he said, ask not what your country can do for you,

00:54:31 ask what you can do for your country.

00:54:34 We had a generation of people my age, basically people,

00:54:40 who grew up with that credo.

00:54:45 And sometimes you don’t need to innovate.

00:54:49 You can go back to basic principles.

00:54:52 And that’s good basic principle.

00:54:55 What can we do?

00:54:58 Americans have GDP per capita of around $60,000.

00:55:04 It’s not equally distributed, but it’s big.

00:55:08 And people have, I think, an obligation to help

00:55:15 their country.

00:55:17 And I do that.

00:55:19 And apparently, I take some grief from some people who

00:55:28 project on me things I don’t even vaguely believe.

00:55:33 But I’m quite simple.

00:55:36 I tried to help the previous president, President Obama.

00:55:41 He was a good guy.

00:55:42 And he was a different party.

00:55:44 And I tried to help President Bush.

00:55:46 And he’s a different party.

00:55:48 And I sort of don’t care that much about what the parties are.

00:55:57 Even though I’m a big donor for the Republicans,

00:56:01 what motivates me is, what are the problems we’re facing?

00:56:08 Can I help people get to a good outcome that

00:56:13 will stand any test?

00:56:16 But we live in a world now where the filters and the hostility

00:56:24 are so unbelievable.

00:56:30 In the 1960s, when I went to school and university,

00:56:34 I went to Yale, we had so much stuff going on.

00:56:40 We had a war called the Vietnam War.

00:56:43 We had sort of black power starting.

00:56:47 And we had a sexual revolution with the birth control pill.

00:56:54 And there was one other major thing going on,

00:57:02 the drug revolution.

00:57:05 There hasn’t been a generation that

00:57:08 had more stuff going on in a four year period than my era.

00:57:17 Yet, there wasn’t this kind of instant hostility

00:57:23 if you believed something different.

00:57:26 Everybody lived together and respected the other person.

00:57:32 And I think that this type of change needs to happen.

00:57:37 And it’s got to happen from the leadership

00:57:41 of our major institutions.

00:57:44 And I don’t think that leaders can

00:57:48 be bullied by people who are against sort

00:57:53 of the classical version of free speech

00:57:56 and open expression and inquiry.

00:58:00 That’s what universities are for, among other things,

00:58:05 the Socratic method.

00:58:06 And so, in the midst of this onslaught of oddness,

00:58:18 I still believe in the basic principles.

00:58:22 And we’re going to have to find a way to get back to that.

00:58:26 And that doesn’t start with the people sort of in the middle

00:58:31 to the bottom who are using these kinds of screens

00:58:36 to shout people down and create an uncooperative environment.

00:58:41 It’s got to be done at the top with core principles that

00:58:46 are articulated.

00:58:48 And ironically, if people don’t sign on

00:58:53 to these kind of core principles where people are equal

00:58:56 and speech can be heard and you don’t have these enormous

00:59:02 shout down biases subtly or out loud,

00:59:06 then they don’t belong at those institutions.

00:59:09 They’re violating the core principles.

00:59:12 And that’s how you end up making change.

00:59:18 But you have to have courageous people who

00:59:21 are willing to lay that out for the benefit of not just

00:59:26 their institutions, but for society as a whole.

00:59:31 So I believe that will happen.

00:59:35 But it needs the commitment of senior people

00:59:41 to make it happen.

00:59:42 Courage.

00:59:43 And I think for such great leaders, great universities,

00:59:47 there’s a huge hunger for it.

00:59:49 So I, too, am very optimistic that it will come.

00:59:53 I’m now personally taking a step into building a startup

00:59:56 for the first time, hoping to change the world, of course.

01:00:00 There are thousands, maybe more, maybe millions

01:00:03 of other first time entrepreneurs like me.

01:00:06 What advice?

01:00:08 You’ve gone through this process.

01:00:09 You’ve talked about the suffering, the emotional turmoil

01:00:14 it all might entail.

01:00:15 What advice do you have for those people taking that step?

01:00:20 I’d say it’s a rough ride.

01:00:23 And you have to be psychologically prepared

01:00:28 for things going wrong with frequency.

01:00:33 You have to be prepared to be put in situations where you’re

01:00:38 being asked to solve problems you didn’t even

01:00:40 know existed.

01:00:43 For example, renting space, it’s not really a problem

01:00:47 unless you’ve never done it.

01:00:48 You have no idea what a lease looks like.

01:00:52 You don’t even know the relevant rent in a market.

01:00:56 So everything is new.

01:00:58 Everything has to be learned.

01:01:00 What you realize is that it’s good to have other people

01:01:04 with you who’ve had some experience in areas

01:01:07 where you don’t know what you’re doing.

01:01:09 Unfortunately, an entrepreneur starting

01:01:13 doesn’t know much of anything.

01:01:14 So everything is something new.

01:01:18 And I think it’s important not to be alone,

01:01:24 because it’s sort of overwhelming.

01:01:28 And you need somebody to talk to other than a spouse or a loved

01:01:33 one, because even they get bored with your problems.

01:01:38 And so getting a group, if you look at Alibaba,

01:01:43 Jack Ma was telling me they basically

01:01:47 were like at financial death’s door at least twice.

01:01:52 And the fact is that it wasn’t just Jack.

01:01:55 I mean, people think it is, because he

01:01:57 became the sort of public face and the driver.

01:02:02 But a group of people who can give advice,

01:02:08 share situations to talk about, that’s really important.

01:02:13 And that’s not just referring to the small details

01:02:15 like renting space.

01:02:17 No.

01:02:17 It’s also the psychological burden.

01:02:20 Yeah, and because most entrepreneurs at some point

01:02:24 question what they’re doing, because it’s not going so well.

01:02:27 Or they’re screwing it up, and they

01:02:29 don’t know how to unscrew it up, because we’re all learning.

01:02:34 And it’s hard to be learning when there are like 25 variables

01:02:38 going on.

01:02:39 If you’re missing four big ones, you can really make a mess.

01:02:43 And so the ability to, in effect, have either an outsider

01:02:50 who’s really smart that you can rely on

01:02:53 for certain type of things, or other people who are working

01:02:57 with you on a daily basis, most people

01:03:03 who haven’t had experience believe

01:03:06 in the myth of the one person, one great person

01:03:10 who creates outcomes that are positive.

01:03:15 Most of us, it’s not like that.

01:03:18 If you look back over a lot of the big successful tech

01:03:22 companies, it’s not typically one person.

01:03:28 And you will know these stories better than I do,

01:03:30 because it’s your world, not mine.

01:03:32 But even I know that almost every one of them

01:03:35 had two people.

01:03:36 If you look at Google, that’s what they had.

01:03:40 And that was the same at Microsoft at the beginning.

01:03:43 And it was the same at Apple.

01:03:48 People have different skills.

01:03:50 And they need to play off of other people.

01:03:53 So the advice that I would give you

01:03:59 is make sure you understand that so you don’t head off

01:04:03 in some direction as a lone wolf and find that either you

01:04:08 can’t invent all the solutions or you make bad decisions

01:04:13 on certain types of things.

01:04:15 This is a team sport.

01:04:18 Entrepreneur means you’re alone, in effect.

01:04:22 And that’s the myth.

01:04:24 But it’s mostly a myth.

01:04:27 Yeah, I think, and you talk about this in your book,

01:04:29 and I could talk to you about it forever,

01:04:31 the harshly self critical aspect to your personality

01:04:36 and to mine as well in the face of failure.

01:04:39 It’s a powerful tool, but it’s also

01:04:41 a burden that’s very interesting to walk that line.

01:04:49 But let me ask in terms of people around you,

01:04:53 in terms of friends, in the bigger

01:04:56 picture of your own life, where do you

01:04:57 put the value of love, family, friendship

01:05:02 in the big picture journey of your life?

01:05:06 Well, ultimately, all journeys are alone.

01:05:12 It’s great to have support.

01:05:16 And when you go forward and say your job is

01:05:23 to make something work, and that’s your number one

01:05:26 priority, and you’re going to work at it to make it work,

01:05:31 it’s like superhuman effort.

01:05:33 People don’t become successful as part time workers.

01:05:38 It doesn’t work that way.

01:05:40 And if you’re prepared to make that 100% to 120% effort,

01:05:48 you’re going to need support, and you’re

01:05:51 going to have to have people involved

01:05:53 with your life who understand that that’s really

01:05:56 part of your life.

01:05:59 Sometimes you’re involved with somebody,

01:06:01 and they don’t really understand that.

01:06:04 And that’s a source of conflict and difficulty.

01:06:09 But if you’re involved with the right people,

01:06:13 whether it’s a dating relationship or a spousal

01:06:20 relationship, you have to involve them in your life,

01:06:28 but not burden them with every minor triumph or mistake.

01:06:36 They actually get bored with it after a while.

01:06:41 And so you have to set up different types of ecosystems.

01:06:45 You have your home life.

01:06:48 You have your love life.

01:06:50 You have children.

01:06:51 And that’s the enduring part of what you do.

01:06:55 And then on the other side, you’ve got the unpredictable

01:07:00 nature of this type of work.

01:07:07 What I say to people at my firm who are younger, usually,

01:07:12 well, everybody’s younger, but people

01:07:16 who are of an age where they’re just

01:07:19 having their first child, or maybe they have two children,

01:07:24 that it’s important to make sure they go away

01:07:31 with their spouse at least once every two months to just

01:07:37 some lovely place where there are no children, no issues,

01:07:42 sometimes once a month if they’re

01:07:45 sort of energetic and clever.

01:07:48 And that…

01:07:49 Escape the craziness of it all.

01:07:51 Yeah, and reaffirm your values as a couple.

01:07:58 And you have to have fun.

01:08:00 If you don’t have fun with the person you’re with,

01:08:04 and all you’re doing is dealing with issues,

01:08:07 then that gets pretty old.

01:08:09 And so you have to protect the fun element of your life

01:08:14 together.

01:08:15 And the way to do that isn’t by hanging around the house

01:08:18 and dealing with sort of more problems.

01:08:22 You have to get away and reinforce and reinvigorate

01:08:27 your relationship.

01:08:28 And whenever I tell one of our younger people about that,

01:08:31 they sort of look at me, and it’s

01:08:33 like the scales are falling off of their eyes.

01:08:35 And they’re saying, jeez, I hadn’t thought about that.

01:08:39 I’m so enmeshed in all these things.

01:08:41 But that’s a great idea.

01:08:42 And that’s something, as an entrepreneur,

01:08:45 you also have to do.

01:08:47 You just can’t let relationships slip

01:08:50 because you’re half overwhelmed.

01:08:53 Beautifully put.

01:08:54 And I think there’s no better place to end it.

01:08:57 Steve, thank you so much.

01:08:58 I really appreciate it.

01:08:59 It was an honor to talk to you.

01:09:01 My pleasure.

01:09:02 Thanks for listening to this conversation

01:09:04 with Stephen Schwarzman.

01:09:05 And thank you to our sponsors, ExpressVPN and MasterClass.

01:09:09 Please consider supporting the podcast

01:09:11 by signing up to MasterClass at masterclass.com slash lex

01:09:16 and getting ExpressVPN at expressvpn.com slash lexpod.

01:09:21 If you enjoy this podcast, subscribe on YouTube,

01:09:23 review it with five stars on Apple Podcast,

01:09:26 support it on Patreon, or simply connect with me

01:09:28 on Twitter at lexfriedman.

01:09:31 And now, let me leave you with some words

01:09:33 from Stephen Schwarzman’s book, What It Takes.

01:09:38 It’s as hard to start and run a small business

01:09:41 as it is to start a big one.

01:09:43 You will suffer the same toll financially and psychologically

01:09:46 as you bludgeon it into existence.

01:09:50 It’s hard to raise the money and to find the right people.

01:09:53 So if you’re going to dedicate your life to a business,

01:09:56 which is the only way it will ever work,

01:09:59 you should choose one with the potential to be huge.

01:10:02 Thank you for listening and hope to see you next time.