Transcript
00:00:00 The following is a conversation with Kevin Scott,
00:00:03 the CTO of Microsoft.
00:00:06 Before that, he was the senior vice president
00:00:08 of engineering and operations at LinkedIn.
00:00:11 And before that, he oversaw mobile ads engineering
00:00:14 at Google.
00:00:15 He also has a podcast called Behind the Tech
00:00:18 with Kevin Scott, which I’m a fan of.
00:00:21 This was a fun and wide ranging conversation
00:00:24 that covered many aspects of computing.
00:00:26 It happened over a month ago,
00:00:28 before the announcement of Microsoft’s investment
00:00:30 in OpenAI that a few people have asked me about.
00:00:34 I’m sure there’ll be one or two people in the future
00:00:38 that’ll talk with me about the impact of that investment.
00:00:42 This is the Artificial Intelligence Podcast.
00:00:45 If you enjoy it, subscribe on YouTube,
00:00:47 give it five stars on iTunes,
00:00:49 support it on Patreon,
00:00:50 or simply connect with me on Twitter at Lex Fridman,
00:00:54 spelled F R I D M A N.
00:00:57 And I’d like to give a special thank you
00:00:59 to Tom and Nelante Bighousen
00:01:01 for their support of the podcast on Patreon.
00:01:04 Thanks Tom and Nelante.
00:01:06 Hope I didn’t mess up your last name too bad.
00:01:08 Your support means a lot
00:01:10 and inspires me to keep this series going.
00:01:13 And now, here’s my conversation with Kevin Scott.
00:01:18 You’ve described yourself as a kid in a candy store
00:01:20 at Microsoft because of all the interesting projects
00:01:22 that are going on.
00:01:24 Can you try to do the impossible task
00:01:27 and give a brief whirlwind view
00:01:31 of all the spaces that Microsoft is working in?
00:01:34 Both research and product?
00:01:37 If you include research,
00:01:38 it becomes even more difficult.
00:01:46 I think broadly speaking,
00:01:48 Microsoft’s product portfolio includes everything
00:01:53 from a big cloud business,
00:01:56 like a big set of SaaS services.
00:01:59 We have sort of the original,
00:02:01 or like some of what are among the original
00:02:05 productivity software products that everybody uses.
00:02:09 We have an operating system business.
00:02:11 We have a hardware business where we make everything
00:02:14 from computer mice and headphones
00:02:18 to high end personal computers and laptops.
00:02:23 We have a fairly broad ranging research group
00:02:27 where we have people doing everything
00:02:29 from economics research.
00:02:31 So there’s this really, really smart young economist,
00:02:35 Glen Weyl, who my group works with a lot,
00:02:39 who’s doing this research on these things
00:02:42 called radical markets.
00:02:45 He’s written an entire technical book
00:02:48 about this whole notion of radical markets.
00:02:51 So like the research group sort of spans from that
00:02:53 to human computer interaction to artificial intelligence.
00:02:56 And we have GitHub, we have LinkedIn,
00:03:01 we have a search advertising and news business
00:03:05 and like probably a bunch of stuff
00:03:07 that I’m embarrassingly not recounting in this list.
00:03:11 Gaming too, Xbox and so on, right?
00:03:12 Yeah, gaming for sure.
00:03:14 Like I was having a super fun conversation this morning
00:03:17 with Phil Spencer.
00:03:19 So when I was in college,
00:03:21 there was this game that LucasArts made
00:03:25 called Day of the Tentacle
00:03:27 that my friends and I played forever.
00:03:30 And like we’re doing some interesting collaboration now
00:03:33 with the folks who made Day of the Tentacle.
00:03:37 And I was like completely nerding out with Tim Schafer,
00:03:40 like the guy who wrote Day of the Tentacle, this morning,
00:03:43 just a complete fan boy,
00:03:45 which sort of it like happens a lot.
00:03:49 Like Microsoft has been doing so much stuff
00:03:53 at such breadth for such a long period of time
00:03:56 that like being CTO like most of the time,
00:04:00 my job is very, very serious.
00:04:02 And sometimes like I get caught up
00:04:05 in like how amazing it is to be able to have
00:04:10 the conversations that I have with the people
00:04:12 I get to have them with.
00:04:14 Yeah, to reach back into the sentimental.
00:04:17 And what’s the radical markets and the economics?
00:04:21 So the idea with radical markets is like,
00:04:24 can you come up with new market based mechanisms to,
00:04:32 you know, I think we have this,
00:04:33 we’re having this debate right now,
00:04:35 like does capitalism work like free markets work?
00:04:40 Can the incentive structures
00:04:43 that are built into these systems produce outcomes
00:04:46 that are creating sort of equitably distributed benefits
00:04:51 for every member of society?
00:04:55 You know, and I think it’s a reasonable,
00:04:56 reasonable set of questions to be asking.
00:04:59 And so what Glen, and so like, you know,
00:05:02 one mode of thought there,
00:05:03 like if you have doubts that the markets
00:05:05 are actually working, you can sort of like tip towards
00:05:08 like, okay, let’s become more socialist
00:05:10 and, you know, like have central planning and, you know,
00:05:13 governments or some other central organization
00:05:15 is like making a bunch of decisions
00:05:18 about how, you know, sort of work gets done
00:05:22 and, you know, like where the, you know,
00:05:24 where the investments and where the outputs
00:05:26 of those investments get distributed.
00:05:28 Glen’s notion is like, lean more
00:05:32 into like the market based mechanism.
00:05:35 So like, for instance, you know,
00:05:37 this is one of the more radical ideas,
00:05:39 like suppose that you had a radical pricing mechanism
00:05:45 for assets like real estate where you were,
00:05:50 you could be bid out of your position
00:05:53 in your home, you know, for instance.
00:05:58 So like if somebody came along and said,
00:06:01 you know, like I can find higher economic utility
00:06:04 for this piece of real estate
00:06:05 that you’re running your business in,
00:06:08 like then like you either have to, you know,
00:06:13 sort of bid to sort of stay
00:06:16 or like the thing that’s got the higher economic utility,
00:06:19 you know, sort of takes over the asset
00:06:21 which would make it very difficult
00:06:23 to have the same sort of rent seeking behaviors
00:06:27 that you’ve got right now
00:06:29 because like if you did speculative bidding,
00:06:34 like you would very quickly like lose a whole lot of money.
00:06:40 And so like the prices of the assets
00:06:42 would be sort of like very closely indexed
00:06:45 to like the value that they could produce.
00:06:49 And like, because like you’d have this sort
00:06:52 of real time mechanism that would force you
00:06:53 to sort of mark the value of the asset to the market,
00:06:56 then it could be taxed appropriately.
00:06:58 Like you couldn’t sort of sit on this thing and say,
00:07:00 oh, like this house is only worth 10,000 bucks
00:07:03 when like everything around it is worth 10 million.
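As a concrete aside: the pricing scheme described above is essentially a Harberger tax, the core mechanism in Weyl’s radical markets work. Here is a toy Python sketch of the idea, purely illustrative and not Weyl’s actual formulation; the names, tax rate, and prices are made up.
```python
# Toy "common ownership self-assessed tax" (Harberger tax) sketch.
TAX_RATE = 0.07  # annual tax as a fraction of the self-assessed price

class Asset:
    def __init__(self, owner, self_assessed_price):
        self.owner = owner
        self.price = self_assessed_price  # the owner declares this freely

    def annual_tax(self):
        # Taxing the declared price punishes under-valuing the asset...
        return self.price * TAX_RATE

    def buy(self, buyer):
        # ...while the standing obligation to sell at the declared price
        # punishes over-valuing it: anyone can take it at that price.
        seller, paid = self.owner, self.price
        self.owner = buyer
        return seller, paid

house = Asset(owner="alice", self_assessed_price=10_000)
print(house.annual_tax())      # 700.0: cheap to hold if you lowball...
print(house.buy(buyer="bob"))  # ...but then Bob buys you out for 10,000
```
The two forces together push the declared price toward the asset’s real value, which is the “mark to market” property described above.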
00:07:06 That’s really, so it’s an incentive structure
00:07:08 where the prices match the value much better.
00:07:13 Yeah, and Glen does a much better job than I do
00:07:16 at selling it, and I probably picked the world’s worst example,
00:07:18 you know, and it’s intentionally provocative,
00:07:24 so like this whole notion,
00:07:25 like I’m not sure whether I like this notion
00:07:28 that like we can have a set of market mechanisms
00:07:31 where I could get bid out of my property, you know,
00:07:35 but you know, like if you’re thinking about something
00:07:37 like Elizabeth Warren’s wealth tax, for instance,
00:07:42 like you would have, I mean, it’d be really interesting
00:07:45 in like how you would actually set the price on the assets
00:07:50 and like you might have to have a mechanism like that
00:07:52 if you put a tax like that in place.
00:07:54 It’s really interesting that that kind of research,
00:07:56 at least tangentially, is touching Microsoft Research.
00:08:00 That you’re really thinking broadly.
00:08:02 Maybe you can speak to, this connects to AI,
00:08:08 so we have a candidate, Andrew Yang,
00:08:10 who kind of talks about artificial intelligence
00:08:13 and the concern that people have about, you know,
00:08:16 automation’s impact on society and arguably,
00:08:19 Microsoft is at the cutting edge of innovation
00:08:23 in all these kinds of ways and so it’s pushing AI forward.
00:08:27 How do you think about combining all our conversations
00:08:30 together here with radical markets and socialism
00:08:32 and innovation in AI that Microsoft is doing
00:08:37 and then Andrew Yang’s worry that that will result
00:08:44 in job loss for the lower skilled and so on.
00:08:46 How do you think about that?
00:08:47 I think it’s sort of one of the most important questions
00:08:51 in technology like maybe even in society right now
00:08:54 about how is AI going to develop
00:08:59 over the course of the next several decades
00:09:01 and what’s it going to be used for
00:09:03 and what benefits will it produce
00:09:06 and what negative impacts will it produce
00:09:08 and who gets to steer this whole thing.
00:09:13 I’ll say at the highest level,
00:09:17 one of the real joys of getting to do what I do at Microsoft
00:09:22 is Microsoft has this heritage as a platform company
00:09:27 and so Bill has this thing that he said a bunch of years ago
00:09:32 where the measure of a successful platform
00:09:36 is that it produces far more economic value
00:09:39 for the people who build on top of the platform
00:09:41 than is created for the platform owner or builder
00:09:47 and I think we have to think about AI that way.
00:09:51 As a platform.
00:09:52 Yeah, it has to be a platform that other people can use
00:09:56 to build businesses, to fulfill their creative objectives,
00:10:01 to be entrepreneurs, to solve problems that they have
00:10:04 in their work and in their lives.
00:10:07 It can’t be a thing where there are a handful of companies
00:10:11 sitting in a very small handful of cities geographically
00:10:16 who are making all the decisions about what goes into the AI
00:10:21 and then on top of all this infrastructure,
00:10:26 then build all of the commercially valuable uses for it.
00:10:30 So I think that’s bad from a sort of economics
00:10:36 and sort of equitable distribution of value perspective,
00:10:40 sort of back to this whole notion of did the markets work?
00:10:44 But I think it’s also bad from an innovation perspective
00:10:47 because I have infinite amounts of faith
00:10:51 in human beings that if you give folks powerful tools,
00:10:55 they will go do interesting things
00:10:58 and it’s more than just a few tens of thousands of people
00:11:02 with the interesting tools,
00:11:03 it should be millions of people with the tools.
00:11:05 So it’s sort of like you think about the steam engine
00:11:10 in the late 18th century, like it was maybe the first
00:11:14 large scale substitute for human labor
00:11:16 that we’ve built like a machine
00:11:19 and in the beginning when these things are getting deployed,
00:11:23 the folks who got most of the value from the steam engines
00:11:28 were the folks who had capital
00:11:30 so they could afford to build them
00:11:31 and like they built factories around them and businesses
00:11:34 and the experts who knew how to build and maintain them.
00:11:38 But access to that technology democratized over time.
00:11:42 Like now, like an engine, it’s not like a differentiated
00:11:47 thing, like there isn’t one engine company
00:11:50 that builds all the engines
00:11:51 and all of the things that use engines
00:11:53 are made by this company
00:11:54 and like they get all the economics from all of that.
00:11:57 Like it’s fully democratized now.
00:12:00 We’re sitting here in this room,
00:12:02 and like there are probably things
00:12:05 like the MEMS gyroscope that are in both of our phones,
00:12:09 like there’s like little engines sort of everywhere.
00:12:13 They’re just a component in how we build the modern world.
00:12:16 Like AI needs to get there.
00:12:17 Yeah, so that’s a really powerful way to think.
00:12:20 If we think of AI as a platform
00:12:22 versus a tool that Microsoft owns,
00:12:26 as a platform that enables creation on top of it,
00:12:30 that’s the way to democratize it.
00:12:31 That’s really interesting actually.
00:12:34 And Microsoft throughout its history
00:12:36 has been positioned well to do that.
00:12:38 And the tie back to this radical markets thing,
00:12:41 like so my team has been working with Glen on this,
00:12:49 and Jaron Lanier actually.
00:12:50 So Jaron is the sort of father of virtual reality.
00:12:56 Like he’s one of the most interesting human beings on the planet,
00:13:00 like a sweet, sweet guy.
00:13:02 And so Jaron and Glen and folks in my team have been working
00:13:07 on this notion of data as labor
00:13:10 or like they call it data dignity as well.
00:13:13 And so the idea is that if you,
00:13:16 again going back to this sort of industrial analogy,
00:13:20 if you think about data as the raw material that is
00:13:24 consumed by the machine of AI in order to do useful things,
00:13:30 then like we’re not doing a really great job right now in having
00:13:34 transparent marketplaces for valuing those data contributions.
00:13:39 So and we all make them explicitly like you go to LinkedIn,
00:13:43 you sort of set up your profile on LinkedIn,
00:13:46 like that’s an explicit contribution.
00:13:47 Like you know exactly the information
00:13:49 that you’re putting into the system.
00:13:50 And like you put it there because you have
00:13:52 some nominal notion of what value you’re going to get in return.
00:13:56 But it’s like only nominal,
00:13:57 like you don’t know exactly what value you’re getting in return.
00:14:00 Like the service is free,
00:14:01 like it’s a low amount of perceived debt.
00:14:04 And then you’ve got all this indirect contribution that you’re
00:14:06 making just by virtue of interacting with all of
00:14:09 the technology that’s in your daily life.
00:14:13 And so like what Glen and
00:14:15 Jaron and this data dignity team are trying to do is like,
00:14:19 can we figure out a set of mechanisms that let us value
00:14:23 those data contributions so that you could create
00:14:27 an economy and like a set of controls and incentives that
00:14:31 would allow people to like maybe even in the limit,
00:14:36 like earn part of their living
00:14:38 through the data that they’re creating.
00:14:41 And like you can sort of see it in explicit ways.
00:14:42 There are these companies like Scale AI,
00:14:46 and like there are a whole bunch of them in China
00:14:49 right now that are basically data labeling companies.
00:14:52 So like you’re doing supervised machine learning,
00:14:54 you need lots and lots of label training data.
00:14:57 And like those people who work for
00:15:01 those companies are getting compensated
00:15:03 for their data contributions into the system.
00:15:06 And so.
00:15:07 That’s easier to put a number on
00:15:09 their contribution because they’re explicitly labeling data.
00:15:11 Correct.
00:15:12 But you’re saying that we’re all
00:15:13 contributing data in different kinds of ways.
00:15:15 And it’s fascinating to start to
00:15:18 explicitly try to put a number on it.
00:15:20 Do you think that’s possible?
00:15:22 I don’t know. It’s hard. It really is.
00:15:24 Because we don’t have
00:15:29 as much transparency as I think
00:15:33 we need in like how the data is getting used.
00:15:37 And it’s super complicated.
00:15:38 Like we, I think as
00:15:41 technologists sort of appreciate
00:15:42 like some of the subtlety there.
00:15:44 It’s like the data gets created and then it gets,
00:15:48 it’s not valuable.
00:15:50 Like the data exhaust that you give off,
00:15:55 or the explicit data that I am putting into
00:16:00 the system isn’t super valuable atomically.
00:16:05 Like it’s only valuable when you sort of
00:16:07 aggregate it together into sort of large numbers.
00:16:10 This is true even for these like folks who are
00:16:12 getting compensated for like labeling things.
00:16:14 Like for supervised machine learning now,
00:16:16 like you need lots of labels to
00:16:18 train a model that performs well.
00:16:21 And so I think that’s one of the challenges.
00:16:24 It’s like how do you sort of figure
00:16:27 out like because this data is getting combined in
00:16:29 so many ways like through
00:16:32 these combinations like how the value is flowing.
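As an aside: one way the research literature approaches this valuation problem is marginal-contribution attribution, for example leave-one-out or Shapley-value methods. Below is a minimal leave-one-out sketch on synthetic data; it is an illustration of the general idea, not the data dignity team’s actual method.
```python
# Value each "contributor's" data by how much test accuracy drops
# when their examples are removed. Illustrative only: real schemes
# (e.g. data Shapley) average over many subsets, because a data
# point's value depends on what it gets combined with.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

def test_accuracy(X, y):
    return LogisticRegression(max_iter=1000).fit(X, y).score(X_te, y_te)

baseline = test_accuracy(X_tr, y_tr)
for i in range(3):  # marginal value of the first three data points
    keep = [j for j in range(len(X_tr)) if j != i]
    print(f"contributor {i}: {baseline - test_accuracy(X_tr[keep], y_tr[keep]):+.4f}")
```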
00:16:35 Yeah, that’s fascinating.
00:16:37 Yeah. And it’s fascinating that you’re thinking about this.
00:16:41 And I wasn’t even going into this conversation expecting
00:16:44 the breadth of research really
00:16:48 that Microsoft broadly is thinking about,
00:16:50 you’re thinking about at Microsoft.
00:16:52 So if we go back to 89 when Microsoft released Office,
00:16:57 or 1990 when they released Windows 3.0.
00:17:02 In your view, I know you weren’t there through its history,
00:17:07 but how has the company changed in
00:17:09 the 30 years since as you look at it now?
00:17:12 The good thing is it’s started off as a platform company.
00:17:16 Like it’s still a platform company,
00:17:20 like the parts of the business that are thriving and
00:17:22 most successful are those that are building platforms.
00:17:26 Like the mission of the company now is,
00:17:28 the mission’s changed.
00:17:30 It’s like changed in a very interesting way.
00:17:32 So back in 89,
00:17:35 90 like they were still on the original mission,
00:17:39 which was like put a PC on every desk and in every home.
00:17:43 And it was basically about democratizing access to
00:17:47 this new personal computing technology,
00:17:50 which when Bill started the company,
00:17:52 integrated circuit microprocessors were a brand new thing.
00:17:57 And people were building homebrew computers from kits,
00:18:03 like the way people build ham radios right now.
00:18:08 I think this is the interesting thing
00:18:10 for folks who build platforms in general.
00:18:12 Bill saw the opportunity there and
00:18:17 what personal computers could do.
00:18:18 And it was like, it was sort of a reach.
00:18:20 Like you just sort of imagine like where things
00:18:22 were when they started the company
00:18:24 versus where things are now.
00:18:26 Like in success,
00:18:27 when you’ve democratized a platform,
00:18:29 it just sort of vanishes into the platform.
00:18:31 You don’t pay attention to it anymore.
00:18:32 Like operating systems aren’t a thing anymore.
00:18:35 Like they’re super important,
00:18:36 like completely critical.
00:18:38 And like when you see one fail,
00:18:41 like you just sort of understand.
00:18:43 But like it’s not a thing where you’re not like
00:18:46 waiting for the next operating system thing
00:18:50 in the same way that you were in 1995, right?
00:18:52 Like in 1995, like we had
00:18:54 Rolling Stones on the stage with the Windows 95 rollout.
00:18:57 Like it was like the biggest thing in the world.
00:18:59 Everybody lined up for it the way
00:19:01 that people used to line up for iPhone.
00:19:03 But like, you know, eventually,
00:19:04 and like this isn’t necessarily a bad thing.
00:19:07 Like it just sort of, you know,
00:19:08 the success is that it’s sort of, it becomes ubiquitous.
00:19:12 It’s like everywhere, like human beings,
00:19:14 when their technology becomes ubiquitous,
00:19:16 they just sort of start taking it for granted.
00:19:18 So the mission now that Satya
00:19:22 rearticulated five plus years ago now,
00:19:25 when he took over as CEO of the company.
00:19:28 Our mission is to empower every individual and
00:19:33 every organization in the world to be more successful.
00:19:38 And so, you know, again,
00:19:40 like that’s a platform mission.
00:19:43 And like the way that we do it now is, is different.
00:19:46 It’s like we have a hyperscale cloud that
00:19:48 people are building their applications on top of.
00:19:51 Like we have a bunch of AI infrastructure that
00:19:53 people are building their AI applications on top of.
00:19:56 We have, you know,
00:19:58 we have a productivity suite of software,
00:20:02 like Microsoft Dynamics, which, you know,
00:20:05 some people might not think is the sexiest thing in the world,
00:20:07 but it’s like helping people figure out how to automate
00:20:10 all of their business processes and workflows
00:20:13 and to help those businesses using it to grow and be more successful.
00:20:19 So it’s a much broader vision
00:20:23 in a way now than it was back then.
00:20:25 Like it was sort of very particular thing.
00:20:27 And like now, like we live in this world where
00:20:29 technology is so powerful and it’s like
00:20:32 such a basic fact of life that it both exists
00:20:39 and is going to get better and better over time
00:20:42 or at least more and more powerful over time.
00:20:45 So like, you know, what you have to do as a platform player
00:20:48 is just much bigger.
00:20:49 Right. There’s so many directions in which you can transform.
00:20:52 You didn’t mention mixed reality, too.
00:20:55 You know, that’s probably early days
00:20:59 or it depends how you think of it.
00:21:00 But if we think on a scale of centuries,
00:21:02 it’s the early days of mixed reality.
00:21:04 Oh, for sure.
00:21:04 And so with HoloLens,
00:21:08 Microsoft is doing some really interesting work there.
00:21:10 Do you touch that part of the effort?
00:21:13 What’s the thinking?
00:21:14 Do you think of mixed reality as a platform, too?
00:21:17 Oh, sure.
00:21:18 When we look at what the platforms of the future could be,
00:21:21 it’s like fairly obvious that like AI is one.
00:21:23 Like you don’t have to, I mean, like that’s,
00:21:26 you know, you sort of say it to like someone
00:21:29 and you know, like they get it.
00:21:31 But like we also think of the like mixed reality
00:21:36 and quantum as like these two interesting,
00:21:39 you know, potentially.
00:21:40 Quantum computing?
00:21:41 Yeah.
00:21:42 Okay. So let’s get crazy then.
00:21:44 So you’re talking about some futuristic things here.
00:21:48 Well, the mixed reality, Microsoft is really,
00:21:50 it’s not even futuristic, it’s here.
00:21:52 It is.
00:21:53 It’s incredible stuff.
00:21:54 And look, and it’s having an impact right now.
00:21:56 Like one of the more interesting things
00:21:58 that’s happened with mixed reality
00:21:59 over the past couple of years that I didn’t clearly see
00:22:04 is that it’s become the computing device
00:22:08 for folks who, for doing their work,
00:22:13 who haven’t used any computing device at all
00:22:16 to do their work before.
00:22:16 So technicians and service folks and people
00:22:21 who are doing like machine maintenance on factory floors.
00:22:25 So like they, you know, because they’re mobile
00:22:28 and like they’re out in the world
00:22:30 and they’re working with their hands
00:22:32 and, you know, sort of servicing these like
00:22:34 very complicated things, they’re,
00:22:37 they don’t use their mobile phone
00:22:39 and like they don’t carry a laptop with them
00:22:41 and, you know, they’re not tethered to a desk.
00:22:43 And so mixed reality, like where it’s getting traction
00:22:47 right now, where HoloLens is selling a lot of units
00:22:50 is for these sorts of applications for these workers.
00:22:54 And it’s become like, I mean, like the people love it.
00:22:58 They’re like, oh my God, like this is like for them,
00:23:01 like the same sort of productivity boosts that,
00:23:03 you know, like an office worker had
00:23:05 when they got their first personal computer.
00:23:08 Yeah, but you did mention it’s certainly obvious AI
00:23:12 as a platform, but can we dig into it a little bit?
00:23:15 How does AI begin to infuse some of the products
00:23:18 in Microsoft?
00:23:19 So currently providing training of,
00:23:24 for example, neural networks in the cloud
00:23:26 or providing pre-trained models or just even providing
00:23:34 computing resources and whatever different inference
00:23:37 that you wanna do using neural networks.
00:23:39 How do you think of AI infusing as a platform
00:23:44 that Microsoft can provide?
00:23:45 Yeah, I mean, I think it’s super interesting.
00:23:48 It’s like everywhere.
00:23:49 And like we run these review meetings now
00:23:54 where it’s me and Satya and like members
00:24:00 of Satya’s leadership team and like a cross functional
00:24:04 group of folks across the entire company
00:24:06 who are working on like either AI infrastructure
00:24:11 or like have some substantial part of their product work
00:24:18 using AI in some significant way.
00:24:22 Now, the important thing to understand
00:24:23 is like when you think about like how the AI
00:24:26 is gonna manifest in like an experience
00:24:29 for something that’s gonna make it better,
00:24:31 like I think you don’t want the AIness
00:24:36 to be the first order thing.
00:24:38 It’s like whatever the product is
00:24:40 and like the thing that is trying to help you do,
00:24:43 like the AI just sort of makes it better.
00:24:45 And this is a gross exaggeration,
00:24:47 but like people get super excited about like
00:24:51 where the AI is showing up in products and I’m like,
00:24:54 do you get that excited about like
00:24:55 where you’re using a hash table like in your code?
00:24:59 Like it’s just another.
00:25:01 It’s just a tool.
00:25:01 It’s a very interesting programming tool,
00:25:03 but it’s sort of like it’s an engineering tool.
00:25:07 And so like it shows up everywhere.
00:25:09 So like we’ve got dozens and dozens of features
00:25:12 now in Office that are powered by
00:25:15 like fairly sophisticated machine learning,
00:25:18 our search engine wouldn’t work at all
00:25:21 if you took the machine learning out of it.
00:25:24 And like, increasingly, things like content moderation
00:25:30 on our Xbox and xCloud platform.
00:25:36 When you mean moderation,
00:25:37 you mean like the recommender is like showing
00:25:39 what you wanna look at next.
00:25:41 No, no, no, it’s like anti-bullying stuff.
00:25:43 So the usual social network stuff
00:25:45 that you have to deal with.
00:25:46 Yeah, correct.
00:25:47 But it’s like really it’s targeted,
00:25:49 it’s targeted towards a gaming audience.
00:25:52 So it’s like a very particular type of thing
00:25:54 where the line between playful banter
00:25:59 and like legitimate bullying is like a subtle one.
00:26:02 And like you have to like, it’s sort of tough.
00:26:05 Like I have.
00:26:07 I’d love it if we could dig into it
00:26:08 because you’re also,
00:26:10 you led the engineering efforts of LinkedIn.
00:26:12 And if we look at LinkedIn as a social network,
00:26:17 and if we look at the Xbox gaming as the social components,
00:26:21 the very different kinds of I imagine communication
00:26:24 going on on the two platforms, right?
00:26:26 And the line in terms of bullying and so on
00:26:29 is different on the platforms.
00:26:31 So how do you,
00:26:33 I mean, it’s such a fascinating philosophical discussion
00:26:36 of where that line is.
00:26:37 I don’t think anyone knows the right answer.
00:26:39 Twitter folks are under fire now, Jack at Twitter
00:26:43 for trying to find that line.
00:26:45 Nobody knows what that line is.
00:26:46 But how do you try to find the line
00:26:50 for trying to prevent abusive behavior
00:26:57 and at the same time, let people be playful
00:27:00 and joke around and that kind of thing?
00:27:02 I think in a certain way,
00:27:03 like if you have what I would call vertical social networks,
00:27:10 it gets to be a little bit easier.
00:27:12 So like if you have a clear notion
00:27:14 of like what your social network should be used for,
00:27:17 or like what you are designing a community around,
00:27:22 then you don’t have as many dimensions
00:27:25 to your sort of content safety problem
00:27:28 as you do in a general purpose platform.
00:27:33 I mean, so like on LinkedIn,
00:27:37 like the whole social network
00:27:38 is about connecting people with opportunity,
00:27:41 whether it’s helping them find a job
00:27:43 or to sort of find mentors
00:27:46 or to sort of help them like find their next sales lead
00:27:52 or to just sort of allow them to broadcast
00:27:56 their sort of professional identity
00:27:59 to their network of peers and collaborators
00:28:06 and sort of professional community.
00:28:08 Like that is, I mean, like in some ways,
00:28:09 like that’s very, very broad,
00:28:11 but in other ways it’s sort of, it’s narrow.
00:28:15 And so like you can build AIs, like machine learning systems,
00:28:20 that are capable with those boundaries
00:28:25 of making better automated decisions
00:28:28 about like what is sort of an inappropriate
00:28:30 or offensive comment, or a dangerous comment,
00:28:32 or illegal content, when you have some constraints.
00:28:37 You know, same thing with like the gaming social network.
00:28:43 So for instance, like it’s about playing games
00:28:45 and having fun.
00:28:47 And like the thing that you don’t want to have happen
00:28:49 on the platform is bullying, which is why it’s such an important thing.
00:28:52 Like bullying is not fun.
00:28:53 So you want to do everything in your power
00:28:56 to encourage that not to happen.
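As an aside: at its core, the constrained moderation problem described here is supervised text classification. Below is a toy sketch, nothing like a production moderation system; the labeled examples are hypothetical.
```python
# Tiny banter-vs-bullying classifier for a narrow "vertical" community.
# Real systems use far richer models, conversational context, and
# human review; this just shows the supervised-learning shape.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = ["gg well played", "nice shot!", "rematch?",
         "you are worthless", "quit the game, loser", "nobody wants you here"]
labels = [0, 0, 0, 1, 1, 1]  # 0 = playful banter, 1 = bullying

clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
clf.fit(texts, labels)

# Probability the new message is bullying:
print(clf.predict_proba(["you played terribly, loser"])[0, 1])
```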
00:28:59 And yeah, but I think it’s sort of a tough problem
00:29:03 in general and it’s one where I think, you know,
00:29:05 eventually we’re going to have to have some sort
00:29:09 of clarification from our policymakers about what it is
00:29:15 that we should be doing, like where the lines are,
00:29:18 because it’s tough.
00:29:20 Like you don’t, like in democracy, right?
00:29:23 Like you don’t want,
00:29:25 you want some sort of democratic involvement.
00:29:28 Like people should have a say
00:29:30 in like where the lines are drawn.
00:29:34 Like you don’t want a bunch of people making
00:29:37 like unilateral decisions.
00:29:39 And like we are in a state right now
00:29:43 for some of these platforms
00:29:44 where you actually do have to make unilateral decisions
00:29:46 where the policymaking isn’t going to happen fast enough
00:29:48 in order to like prevent very bad things from happening.
00:29:52 But like we need the policymaking side of that to catch up,
00:29:56 I think, as quickly as possible
00:29:58 because you want that whole process to be a democratic thing,
00:30:01 not a, you know, not some sort of weird thing
00:30:05 where you’ve got a non representative group
00:30:08 of people making decisions that have, you know,
00:30:10 like national and global impact.
00:30:12 And it’s fascinating because the digital space is different
00:30:15 than the physical space in which nations
00:30:18 and governments were established.
00:30:19 And so what policy looks like globally,
00:30:23 what bullying looks like globally,
00:30:25 what’s healthy communication looks like globally
00:30:28 is an open question and we’re all figuring it out together,
00:30:31 which is fascinating.
00:30:33 Yeah, I mean with, you know, sort of fake news, for instance.
00:30:37 And…
00:30:38 Deep fakes and fake news generated by humans?
00:30:42 Yeah, so we can talk about deep fakes,
00:30:44 like I think that is another like, you know,
00:30:46 sort of very interesting level of complexity.
00:30:48 But like if you think about just the written word, right?
00:30:51 Like we have, you know, we invented papyrus,
00:30:54 what, 3,000 years ago where we, you know,
00:30:56 you could sort of put word on paper.
00:31:01 And then 500 years ago, like we get the printing press,
00:31:07 like where the word gets a little bit more ubiquitous.
00:31:11 And then like you really, really didn’t get ubiquitous
00:31:14 printed word until the end of the 19th century
00:31:18 when the offset press was invented.
00:31:20 And then, you know, just sort of explodes
00:31:22 and like, you know, the cross product of that
00:31:25 and the Industrial Revolution’s need
00:31:28 for educated citizens resulted in like
00:31:32 this rapid expansion of literacy
00:31:34 and the rapid expansion of the word.
00:31:36 But like we had 3,000 years up to that point
00:31:39 to figure out like how to, you know,
00:31:43 like what’s journalism, what’s editorial integrity,
00:31:46 like what’s, you know, what’s scientific peer review.
00:31:50 And so like you built all of this mechanism
00:31:52 to like try to filter through all of the noise
00:31:57 that the technology made possible
00:31:59 to like, you know, sort of getting to something
00:32:01 that society could cope with.
00:32:03 And like, if you think about just the PC,
00:32:06 the PC didn’t exist 50 years ago.
00:32:09 And so in like this span of, you know,
00:32:11 like half a century, like we’ve gone from no digital,
00:32:16 you know, no ubiquitous digital technology
00:32:18 to like having a device that sits in your pocket
00:32:21 where you can sort of say whatever is on your mind
00:32:23 to, like, what did Mary Meeker have in her deck?
00:32:27 Mary Meeker just released her new slide deck last week.
00:32:32 You know, we’ve got 50% penetration of the internet
00:32:37 to the global population.
00:32:38 Like there are like three and a half billion people
00:32:40 who are connected now.
00:32:41 So it’s like, it’s crazy, crazy, like inconceivable,
00:32:44 like how fast all of this happened.
00:32:46 So, you know, it’s not surprising
00:32:48 that we haven’t figured out what to do yet,
00:32:50 but like we gotta really like lean into this set of problems
00:32:55 because like we basically have three millennia worth of work
00:33:00 to do about how to deal with all of this
00:33:02 and like probably what, you know,
00:33:04 amounts to the next decade worth of time.
00:33:07 So since we’re on the topic of tough, you know,
00:33:09 tough challenging problems,
00:33:11 let’s look more at the tooling side: one piece of AI
00:33:15 that Microsoft is looking at is face recognition software.
00:33:18 So there’s a lot of powerful positive use cases
00:33:21 for face recognition, but there’s some negative ones
00:33:24 and we’re seeing those in different governments
00:33:27 in the world.
00:33:28 So how do you, how does Microsoft think about the use
00:33:30 of face recognition software as a platform
00:33:35 in governments and companies?
00:33:39 How do we strike an ethical balance here?
00:33:42 Yeah, I think we’ve articulated a clear point of view.
00:33:47 So Brad Smith wrote a blog post last fall,
00:33:51 I believe that sort of like outlined
00:33:54 like very specifically what, you know,
00:33:57 what our point of view is there.
00:33:59 And, you know, I think we believe
00:34:01 that there are certain uses
00:34:02 to which face recognition should not be put.
00:34:04 And we believe again,
00:34:06 that there’s a need for regulation there.
00:34:09 Like the government should like really come in
00:34:11 and say that, you know, this is where the lines are.
00:34:15 And like, we very much want the figuring out of
00:34:18 where the lines are to be a democratic process.
00:34:20 But in the short term, like we’ve drawn some lines
00:34:22 where, you know, we push back against uses
00:34:26 of face recognition technology, you know,
00:34:29 like the city of San Francisco, for instance,
00:34:32 I think has completely outlawed any government agency
00:34:36 from using face recognition tech.
00:34:39 And like that may prove to be a little bit overly broad.
00:34:44 But for like certain law enforcement things,
00:34:48 like you really, I would personally rather be overly
00:34:54 sort of cautious in terms of restricting use of it
00:34:57 until like we have, you know,
00:34:58 sort of defined a reasonable, you know,
00:35:02 democratically determined regulatory framework
00:35:04 for like where we could and should use it.
00:35:08 And, you know, the other thing there is like,
00:35:12 we’ve got a bunch of research that we’re doing
00:35:13 and a bunch of progress that we’ve made on bias there.
00:35:18 And like, there are all sorts of like weird biases
00:35:20 that these models can have,
00:35:22 like all the way from like the most noteworthy one
00:35:25 where, you know, you may have underrepresented minorities
00:35:31 who are like underrepresented in the training data
00:35:34 and then you start learning like strange things.
00:35:39 But like there are even, you know, other weird things.
00:35:42 Like we’ve, I think we’ve seen in the public research,
00:35:46 like models can learn strange things,
00:35:49 like all doctors are men, for instance, just, yeah.
00:35:54 I mean, and so like, it really is a thing
00:35:58 where it’s very important for everybody
00:36:03 who is working on these things, before they push publish,
00:36:08 launch the experiment, you know, push the code
00:36:12 online, or even publish the paper,
00:36:17 that they are at least starting to think about
00:36:20 what some of the potential negative consequences are,
00:36:25 some of this stuff.
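As an aside: one concrete pre-launch check of the kind described here is to break a model’s accuracy out by demographic group. A minimal sketch follows; `model`, `X_test`, `y_test`, and `groups` are assumed to already exist, and real fairness audits go much further.
```python
# Per-group accuracy audit: a large gap between groups is a red flag
# that some group is underrepresented in the training data.
from collections import defaultdict

def accuracy_by_group(model, X_test, y_test, groups):
    hits, totals = defaultdict(int), defaultdict(int)
    for pred, truth, group in zip(model.predict(X_test), y_test, groups):
        totals[group] += 1
        hits[group] += int(pred == truth)
    return {g: hits[g] / totals[g] for g in totals}

# e.g. {'group_a': 0.96, 'group_b': 0.71} would demand investigation
# before pushing publish.
```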
00:36:26 I mean, this is where, you know, like the deep fake stuff
00:36:29 I find very worrisome just because
00:36:32 there are going to be some very good beneficial uses
00:36:39 of like GAN generated imagery.
00:36:46 And funny enough, like one of the places
00:36:48 where it’s actually useful is we’re using the technology
00:36:52 right now to generate synthetic visual data
00:36:58 for training some of the face recognition models
00:37:01 to get rid of the bias.
00:37:03 So like, that’s one like super good use of the tech,
00:37:05 but like, you know, it’s getting good enough now
00:37:09 where, you know, it’s going to sort of challenge
00:37:12 a normal human being’s ability to discern what’s real.
00:37:14 Like right now, you can sort of say
00:37:15 it’s very expensive for someone
00:37:19 to fabricate a photorealistic fake video.
00:37:24 And like GANs are going to make it fantastically cheap
00:37:26 to fabricate a photorealistic fake video.
00:37:30 And so like what you assume you can sort of trust is true
00:37:34 versus like be skeptical about is about to change.
00:37:38 And like, we’re not ready for it, I don’t think.
00:37:40 The nature of truth, right.
00:37:41 That’s, it’s also exciting because I think both you and I
00:37:46 probably would agree that the way to solve,
00:37:49 to take on that challenge is with technology, right?
00:37:52 There’s probably going to be ideas of ways to verify
00:37:56 which kind of video is legitimate, which kind is not.
00:38:00 So to me, that’s an exciting possibility,
00:38:03 most likely for just the comedic genius
00:38:07 that the internet usually creates with these kinds of videos
00:38:10 and hopefully will not result in any serious harm.
00:38:13 Yeah, and it could be, you know,
00:38:17 like I think we will have technology to,
00:38:21 that may be able to detect whether or not
00:38:23 something’s fake or real.
00:38:24 Although the fakes are pretty convincing,
00:38:30 even like when you subject them to machine scrutiny.
00:38:34 But, you know, we also have these increasingly
00:38:37 interesting social networks, you know,
00:38:40 that are under fire right now
00:38:43 for some of the bad things that they do.
00:38:46 Like one of the things you could choose to do
00:38:47 with a social network is like you could,
00:38:51 you could use crypto and the networks
00:38:55 to like have content signed
00:38:57 where you could have a like full chain of custody
00:39:01 that accompanied every piece of content.
00:39:03 So like when you’re viewing something
00:39:06 and like you want to ask yourself,
00:39:08 like how much can I trust this?
00:39:11 Like you can click something
00:39:12 and like have a verified chain of custody
00:39:14 that shows like, oh, this is coming from this source.
00:39:19 And it’s like signed by like someone
00:39:21 whose identity I trust.
00:39:24 Yeah, I think having that, you know,
00:39:25 having that chain of custody,
00:39:26 like being able to like say, oh, here’s this video.
00:39:29 Like it may or may not have been produced
00:39:31 using some of this deepfake technology,
00:39:33 but if you’ve got a verified chain of custody
00:39:35 where you can sort of trace it all the way back
00:39:37 to an identity and you can decide whether or not
00:39:39 like I trust this identity.
00:39:41 Like, oh no, this is really from the White House
00:39:43 or like this is really from the, you know,
00:39:45 the office of this particular presidential candidate
00:39:48 or it’s really from, you know, Jeff Weiner, CEO of LinkedIn,
00:39:53 or Satya Nadella, CEO of Microsoft.
00:39:55 Like that might be like one way
00:39:58 that you can solve some of the problems.
00:39:59 So like that’s not super high tech.
00:40:01 Like we’ve had all of this technology forever.
00:40:04 And, but I think you’re right.
00:40:06 Like it has to be some sort of technological thing
00:40:11 because the underlying tech that is used to create this
00:40:15 is not going to do anything but get better over time
00:40:18 and the genie is sort of out of the bottle.
00:40:21 There’s no stuffing it back in.
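As an aside: the signing scheme sketched in this exchange is standard public-key cryptography. Below is a minimal example using the Python `cryptography` package; a real chain of custody would also need trusted identity infrastructure and a signature for each step in the content’s history.
```python
# A publisher signs content with a private key; any viewer can verify
# provenance with the matching public key.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

private_key = Ed25519PrivateKey.generate()  # held only by the publisher
public_key = private_key.public_key()       # distributed to viewers

video_bytes = b"...contents of the video file..."
signature = private_key.sign(video_bytes)

# Verification raises InvalidSignature if the bytes were tampered with
# or were never signed by this publisher's key.
public_key.verify(signature, video_bytes)
print("verified: this content came from the holder of the key")
```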
00:40:22 And there’s a social component,
00:40:24 which I think is really healthy for a democracy
00:40:26 where people will be skeptical
00:40:28 about the thing they watch in general.
00:40:32 So, you know, which is good.
00:40:34 Skepticism in general is good for content.
00:40:37 So deepfakes in that sense are creating a global skepticism
00:40:41 about whether people can trust what they read.
00:40:44 It encourages further research.
00:40:46 I come from the Soviet Union
00:40:49 where basically nobody trusted the media
00:40:53 because you knew it was propaganda.
00:40:55 And that kind of skepticism encouraged further research
00:40:59 about ideas as opposed to just trusting any one source.
00:41:02 Well, look, I think it’s one of the reasons why
00:41:04 the scientific method and our apparatus
00:41:09 of modern science is so good.
00:41:11 Like, because you don’t have to trust anything.
00:41:15 Like, the whole notion of modern science
00:41:20 beyond the fact that this is a hypothesis
00:41:22 and this is an experiment to test the hypothesis
00:41:24 and this is a peer review process
00:41:27 for scrutinizing published results.
00:41:30 But stuff’s also supposed to be reproducible.
00:41:33 So you know it’s been vetted by this process,
00:41:35 but you also are expected to publish enough detail
00:41:38 where if you are sufficiently skeptical of the thing,
00:41:42 you can go try to reproduce it yourself.
00:41:44 And like, I don’t know what it is.
00:41:47 Like, I think a lot of engineers are like this
00:41:49 where like, you know, sort of this,
00:41:51 like your brain is sort of wired for skepticism.
00:41:55 Like, you don’t just first order trust everything
00:41:58 that you see and encounter.
00:42:00 And like, you’re sort of curious to understand,
00:42:02 you know, the next thing.
00:42:04 But like, I think it’s an entirely healthy thing.
00:42:09 And like, we need a little bit more of that right now.
00:42:12 So I’m not a large business owner.
00:42:16 So I’m just a huge fan of many of Microsoft’s products.
00:42:23 I mean, I still, actually in terms of,
00:42:25 I generate a lot of graphics and images
00:42:27 and I still use PowerPoint to do that.
00:42:28 It beats Illustrator for me.
00:42:30 Even professional sort of, it’s fascinating.
00:42:34 So I wonder, what is the future of,
00:42:38 let’s say Windows and Office look like?
00:42:42 Is, do you see it?
00:42:43 I mean, I remember looking forward to XP.
00:42:45 Was it exciting when XP was released?
00:42:48 Just like you said, I don’t remember when 95 was released.
00:42:51 But XP for me was a big celebration.
00:42:53 And when 10 came out, I was like, oh, okay.
00:42:56 Well, it’s nice.
00:42:57 It’s a nice improvement.
00:42:59 So what do you see the future of these products?
00:43:03 I think there’s a bunch of exciting stuff.
00:43:04 I mean, on the Office front,
00:43:07 there’s gonna be this like increasing productivity wins
00:43:13 that are coming out of some of these AI powered features
00:43:17 that are coming.
00:43:18 Like the products will sort of get smarter and smarter
00:43:20 in like a very subtle way.
00:43:21 Like there’s not gonna be this big bang moment
00:43:24 where like Clippy is gonna reemerge and it’s gonna be.
00:43:28 Wait a minute.
00:43:28 Okay, we’ll have to wait, wait, wait.
00:43:30 Is Clippy coming back?
00:43:32 But quite seriously, so injection of AI.
00:43:37 There’s not much, or at least I’m not familiar with,
00:43:39 sort of assistive type of stuff going on
00:43:41 inside the Office products.
00:43:43 Like a Clippy style assistant, personal assistant.
00:43:47 Do you think that there’s a possibility
00:43:50 of that in the future?
00:43:52 So I think there are a bunch of like very small ways
00:43:54 in which like machine learning powered assistive things
00:43:58 are in the product right now.
00:44:00 So there are a bunch of interesting things.
00:44:04 Like the auto response stuff’s getting better and better.
00:44:09 And it’s like getting to the point
00:44:11 where it can auto respond with like,
00:44:14 okay, this person’s clearly trying to schedule a meeting.
00:44:19 So it looks at your calendar and it automatically
00:44:21 like tries to find like a time and a space
00:44:24 that’s mutually interesting.
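As an aside: the scheduling piece of that feature reduces to intersecting free time across calendars. Below is a bare-bones sketch with hours as integers; it is illustrative and nothing like the real product logic.
```python
# Find the first hour-long slot that is free on every calendar.
# Each calendar is a list of busy (start_hour, end_hour) pairs.
def first_free_slot(calendars, day_start=9, day_end=17):
    for hour in range(day_start, day_end):
        if all(not any(s <= hour < e for s, e in busy) for busy in calendars):
            return hour
    return None  # no mutually free hour today

alice_busy = [(9, 11), (13, 14)]
bob_busy = [(10, 12), (15, 16)]
print(first_free_slot([alice_busy, bob_busy]))  # -> 12
```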
00:44:27 Like we have this notion of Microsoft Search,
00:44:32 where it’s like not just web search,
00:44:34 but it’s like search across like all of your information
00:44:38 that’s sitting inside of like your Office 365 tenant
00:44:43 and like potentially in other products.
00:44:46 And like we have this thing called the Microsoft Graph
00:44:49 that is basically an API federator that sort of like
00:44:53 gets you hooked up across the entire breadth
00:44:57 of like all of the, like what were information silos
00:45:01 before they got woven together with the graph.
00:45:05 Like that is getting,
00:45:07 with increasing effectiveness,
00:45:09 sort of plumbed into some of these auto response things
00:45:13 where you’re gonna be able to see the system
00:45:15 like automatically retrieve information for you.
00:45:18 Like if, you know, like I frequently send out,
00:45:21 you know, emails to folks where like I can’t find a paper
00:45:24 or a document or whatnot.
00:45:25 There’s no reason why the system
00:45:26 won’t be able to do that for you.
00:45:27 And like, I think the, it’s building towards
00:45:31 like having things that look more like,
00:45:34 like a fully integrated, you know, assistant,
00:45:37 but like you’ll have a bunch of steps
00:45:40 that you will see before you,
00:45:42 like it will not be this like big bang thing
00:45:45 where like Clippy comes back and you’ve got this like,
00:45:47 you know, manifestation of, you know,
00:45:49 like a fully, fully powered assistant.
00:45:53 So I think that’s, that’s definitely coming in,
00:45:56 like all of the, you know, collaboration,
00:45:58 coauthoring stuff’s getting better.
00:46:00 You know, it’s like really interesting.
00:46:02 Like if you look at how we use
00:46:06 the Office product portfolio at Microsoft,
00:46:09 like more and more of it is happening inside of
00:46:12 like Teams as a canvas.
00:46:14 And like, it’s this thing where, you know,
00:46:17 you’ve got collaboration is like at the center
00:46:20 of the product and like we built some like really cool stuff
00:46:25 that’s some of, which is about to be open source
00:46:28 that are sort of framework level things
00:46:30 for doing, for doing coauthoring.
00:46:34 That’s awesome.
00:46:35 So in, is there a cloud component to that?
00:46:37 So on the web, or is it,
00:46:40 and forgive me if I don’t already know this,
00:46:42 but with Office 365, we still,
00:46:45 the collaboration we do if we’re doing Word,
00:46:47 we still send the file around.
00:46:49 No, no.
00:46:50 So this is.
00:46:51 We’re already a little bit better than that.
00:46:54 A little bit better than that and like, you know,
00:46:55 so like the fact that you’re unaware of it means
00:46:57 we’ve got a better job to do,
00:46:59 like helping you discover this stuff.
00:47:02 But yeah, I mean, it’s already like got a huge,
00:47:06 huge cloud component.
00:47:07 And like part of, you know, part of this framework stuff,
00:47:09 I think we’re calling it, like I,
00:47:12 like we’ve been working on it for a couple of years.
00:47:14 So like, I know the internal code name for it,
00:47:17 but I think when we launched it at Build,
00:47:18 it’s called the Fluid Framework.
00:47:20 And, but like what Fluid lets you do is like,
00:47:25 you can go into a conversation that you’re having in Teams
00:47:27 and like reference like part of a spreadsheet
00:47:30 that you’re working on where somebody’s like sitting
00:47:33 in the Excel canvas,
00:47:35 like working on the spreadsheet with a, you know,
00:47:37 chart or whatnot,
00:47:38 and like you can sort of embed like part of the spreadsheet
00:47:41 in the Teams conversation where like you can dynamically
00:47:45 update it and like all of the changes that you’re making
00:47:48 to this object are like, you know,
00:47:51 coordinated, and everything is sort of updating in real time.
00:47:54 So like you can be in whatever canvas is most convenient
00:47:57 for you to get your work done.
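As an aside: conceptually, what is being described is a shared object that any canvas can embed and subscribe to. Below is a toy observer-pattern sketch of that idea; the actual Fluid Framework is TypeScript-based and handles merging of concurrent edits, which this ignores.
```python
# Toy shared object: every embedded view subscribes, and an edit made
# in any canvas is pushed to all the others.
class SharedCell:
    def __init__(self, value=None):
        self.value = value
        self._subscribers = []

    def subscribe(self, callback):
        self._subscribers.append(callback)

    def set(self, value, source):
        self.value = value
        for notify in self._subscribers:
            notify(value, source)

cell = SharedCell(value=42)
cell.subscribe(lambda v, src: print(f"Teams view sees {v} (edited in {src})"))
cell.subscribe(lambda v, src: print(f"Excel view sees {v} (edited in {src})"))
cell.set(100, source="Excel")  # both canvases update immediately
```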
00:48:00 So I, out of my own sort of curiosity as an engineer,
00:48:03 I know what it’s like to sort of lead a team
00:48:06 of 10, 15 engineers.
00:48:08 Microsoft has, I don’t know what the numbers are,
00:48:11 maybe 50, maybe 60,000 engineers, maybe 40.
00:48:14 I don’t know exactly what the number is, it’s a lot.
00:48:17 It’s tens of thousands.
00:48:18 Right, so it’s more than 10 or 15.
00:48:20 What, I mean, you’ve led different sizes,
00:48:28 mostly large groups of engineers.
00:48:30 What does it take to lead such a large group
00:48:33 into continued innovation,
00:48:37 continuing to be highly productive
00:48:40 and yet develop all kinds of new ideas and yet maintain,
00:48:44 like what does it take to lead such a large group
00:48:47 of brilliant people?
00:48:48 I think the thing that you learn
00:48:52 as you manage larger and larger scale
00:48:55 is that there are three things
00:48:57 that are like very, very important
00:49:00 for big engineering teams.
00:49:02 Like one is like having some sort of forethought
00:49:06 about what it is that you’re gonna be building
00:49:09 over large periods of time.
00:49:11 Like not exactly, like you don’t need to know
00:49:13 that like, you know, I’m putting all my chips
00:49:15 on this one product and like this is gonna be the thing,
00:49:17 but like it’s useful to know like what sort of capabilities
00:49:21 you think you’re going to need to have
00:49:23 to build the products of the future.
00:49:24 And then like invest in that infrastructure,
00:49:28 like whether, and like I’m not just talking
00:49:30 about storage systems or cloud APIs,
00:49:32 it’s also like what does your development process look like?
00:49:35 What tools do you want?
00:49:36 Like what culture do you want to build around?
00:49:40 Like how you’re, you know, sort of collaborating together
00:49:42 to like make complicated technical things.
00:49:45 And so like having an opinion and investing in that
00:49:48 is like, it just gets more and more important.
00:49:50 And like the sooner you can get a concrete set of opinions,
00:49:54 like the better you’re going to be.
00:49:57 Like you can wing it for a while at small scales,
00:50:01 like, you know, when you start a company,
00:50:03 like you don’t have to be like super specific about it,
00:50:06 but like the biggest miseries that I’ve ever seen
00:50:09 as an engineering leader are in places
00:50:12 where you didn’t have a clear enough opinion
00:50:14 about those things soon enough.
00:50:16 And then you just sort of go create a bunch
00:50:18 of technical debt and like culture debt
00:50:21 that is excruciatingly painful to clean up.
00:50:25 So like, that’s one bundle of things.
00:50:28 Like the other, you know, another bundle of things
00:50:33 is like, it’s just really, really important
00:50:36 to like have a clear mission
00:50:41 that’s not just some cute crap you say
00:50:46 because like you think you should have a mission,
00:50:48 but like something that clarifies for people
00:50:52 like where it is that you’re headed together.
00:50:57 Like, I know it’s like probably like a little bit
00:50:59 too popular right now,
00:51:00 but Yuval Harari’s book, Sapiens,
00:51:05 one of the central ideas in his book is that
00:51:10 like storytelling is like the quintessential thing
00:51:15 for coordinating the activities of large groups of people.
00:51:18 Like once you get past Dunbar’s number,
00:51:21 and like I’ve really, really seen that
00:51:23 just managing engineering teams.
00:51:25 Like you can just brute force things
00:51:30 when you’re less than 120, 150 folks
00:51:33 where you can sort of know and trust
00:51:35 and understand what the dynamics are
00:51:38 between all the people, but like past that,
00:51:40 like things just sort of start to catastrophically fail
00:51:43 if you don’t have some sort of set of shared goals
00:51:47 that you’re marching towards.
00:51:48 And so like, even though it sounds touchy feely
00:51:51 and you know, like a bunch of technical people
00:51:54 will sort of balk at the idea that like,
00:51:56 you need to like have a clear mission. Like the mission is
00:52:00 very, very, very important.
00:52:02 You’re always right, right?
00:52:04 Stories, that’s how our society,
00:52:06 that’s the fabric that connects us,
00:52:08 all of us is these powerful stories.
00:52:10 And that works for companies too, right?
00:52:12 It works for everything.
00:52:14 Like, I mean, even down to like, you know,
00:52:16 you sort of really think about it,
00:52:18 like our currency, for instance, is a story.
00:52:20 Our constitution is a story.
00:52:22 Our laws are stories.
00:52:24 I mean, like we believe very, very, very strongly in them.
00:52:27 And thank God we do.
00:52:29 But like they are,
00:52:31 they’re just abstract things.
00:52:33 Like they’re just words.
00:52:34 Like if we don’t believe in them, they’re nothing.
00:52:36 And in some sense, those stories are platforms
00:52:39 and the kinds, some of which Microsoft is creating, right?
00:52:43 They have platforms on which we define the future.
00:52:46 So last question, what do you,
00:52:48 let’s get philosophical maybe,
00:52:50 bigger than even Microsoft,
00:52:51 what do you think the next 20, 30 plus years
00:52:56 looks like for computing, for technology, for devices?
00:53:00 Do you have crazy ideas about the future of the world?
00:53:04 Yeah, look, I think we, you know,
00:53:06 we’re entering this time where we’ve got,
00:53:10 we have technology that is progressing
00:53:13 at the fastest rate that it ever has.
00:53:15 And you’ve got,
00:53:18 you’ve got some really big social problems,
00:53:21 like society scale problems that we have to tackle.
00:53:26 And so, you know, I think we’re going to rise to the challenge
00:53:28 and like figure out how to intersect
00:53:30 like all of the power of this technology
00:53:32 with all of the big challenges that are facing us,
00:53:35 whether it’s, you know, global warming,
00:53:37 whether it’s like the biggest remainder of the population boom
00:53:41 is in Africa for the next 50 years or so.
00:53:46 And like global warming is going to make it increasingly difficult
00:53:49 to feed the global population in particular,
00:53:52 like in this place where you’re going to have
00:53:54 like the biggest population boom.
00:53:57 I think we, you know, like AI is going to,
00:54:01 like if we push it in the right direction,
00:54:03 like it can do like incredible things to empower all of us
00:54:07 to achieve our full potential and to, you know,
00:54:12 like live better lives.
00:54:15 But like that also means focus on like
00:54:20 some super important things.
00:54:21 Like how can you apply it to healthcare to make sure that,
00:54:26 you know, like our quality and cost
00:54:29 and sort of ubiquity of health coverage is better
00:54:33 and better over time.
00:54:35 Like that’s more and more important every day. Like
00:54:38 in the United States and like the rest of the industrialized world,
00:54:43 so Western Europe, China, Japan, Korea,
00:54:45 like you’ve got this population bubble of like aging,
00:54:50 working, you know, working age folks who are,
00:54:54 you know, at some point over the next 20, 30 years,
00:54:56 they’re going to be largely retired.
00:54:58 And like you’re going to have more retired people
00:55:00 than working age people.
00:55:01 And then like you’ve got, you know,
00:55:02 sort of natural questions about who’s going to take care of
00:55:05 all the old folks and who’s going to do all the work.
00:55:07 And the answers to like all of these sorts of questions,
00:55:11 like where you’re sort of running into, you know,
00:55:13 like constraints of the, you know,
00:55:16 the world and of society has always been like
00:55:20 what tech is going to like help us get around this?
00:55:23 Like when I was a kid in the 70s and 80s,
00:55:26 like we talked all the time about like population boom,
00:55:29 population boom, like we’re going to,
00:55:31 like we’re not going to be able to like feed the planet.
00:55:34 And like we were like right in the middle of the Green Revolution
00:55:38 where there was like this massive technology driven increase
00:55:44 in crop productivity, like, worldwide.
00:55:47 And like some of that was like taking some of the things
00:55:49 that we knew in the West and like getting them distributed
00:55:52 to the, you know, to the developing world.
00:55:55 And like part of it were things like, you know,
00:55:59 just smarter biology like helping us increase yields.
00:56:03 And like we don’t talk about like overpopulation anymore
00:56:08 because like we can more or less,
00:56:10 we sort of figured out how to feed the world.
00:56:12 Like that’s a technology story.
00:56:14 And so like I’m super, super hopeful about the future
00:56:19 and in the ways where we will be able to apply technology
00:56:24 to solve some of these super challenging problems.
00:56:28 Like one of the things that I’m trying to spend
00:56:33 my time doing right now is trying to get everybody else
00:56:36 to be hopeful as well because, you know, back to Harari,
00:56:39 like we are the stories that we tell.
00:56:41 Like if we, you know, if we get overly pessimistic right now
00:56:44 about like the potential future of technology,
00:56:48 like we, you know, like we may fail to get all of the things
00:56:53 in place that we need to like have our best possible future.
00:56:56 And that kind of hopeful optimism, I’m glad that you have it
00:57:00 because you’re leading large groups of engineers
00:57:03 that are actually defining, that are writing that story,
00:57:06 that are helping build that future, which is super exciting.
00:57:09 And I agree with everything you said except I do hope
00:57:13 Clippy comes back.
00:57:15 We miss him. I speak for the people.
00:57:19 So, Kevin, thank you so much for talking to me.
00:57:21 Thank you so much for having me. It was a pleasure.