Transcript
00:00:00 The following is a conversation with Jack Dorsey,
00:00:02 co-founder and CEO of Twitter
00:00:05 and founder and CEO of Square.
00:00:08 Given the happenings at the time related to Twitter leadership
00:00:12 and the very limited time we had,
00:00:13 we decided to focus this conversation on Square
00:00:16 and some broader philosophical topics
00:00:18 and to save an in-depth conversation
00:00:20 on engineering and AI at Twitter
00:00:23 for a second appearance in this podcast.
00:00:25 This conversation was recorded
00:00:27 before the outbreak of the pandemic.
00:00:29 For everyone feeling the medical, psychological
00:00:31 and financial burden of this crisis,
00:00:33 I’m sending love your way.
00:00:35 Stay strong.
00:00:36 We’re in this together.
00:00:37 We’ll beat this thing.
00:00:39 As an aside, let me mention
00:00:41 that Jack moved $1 billion of Square equity,
00:00:45 which is 28% of his wealth
00:00:47 to form an organization that funds COVID-19 relief.
00:00:51 First, as Andrew Yang tweeted,
00:00:53 this is a spectacular commitment.
00:00:56 And second, it is amazing that it operates transparently
00:00:59 by posting all its donations to a single Google doc.
00:01:03 To me, true transparency is simple.
00:01:06 And this is as simple as it gets.
00:01:09 This is the Artificial Intelligence Podcast.
00:01:11 If you enjoy it, subscribe on YouTube,
00:01:13 review it with five stars on Apple Podcasts,
00:01:15 support it on Patreon
00:01:17 or simply connect with me on Twitter
00:01:18 at Lex Fridman spelled F R I D M A N.
00:01:22 As usual, I’ll do a few minutes of ads now
00:01:24 and never any ads in the middle
00:01:26 that can break the flow of the conversation.
00:01:28 I hope that works for you
00:01:29 and doesn’t hurt the listening experience.
00:01:32 This show is presented by Masterclass.
00:01:34 Sign up on masterclass.com slash Lex
00:01:37 to get a discount and to support this podcast.
00:01:40 When I first heard about Masterclass,
00:01:42 I thought it was too good to be true.
00:01:44 For $180 a year,
00:01:46 you get an all-access pass to watch courses from,
00:01:49 to list some of my favorites,
00:01:51 Chris Hadfield on space exploration,
00:01:53 Neil deGrasse Tyson on scientific thinking
00:01:55 and communication,
00:01:57 Will Wright, creator of SimCity and The Sims,
00:02:00 both among my favorite games, on game design,
00:02:03 Jane Goodall on conservation,
00:02:05 Carlos Santana on guitar,
00:02:07 one of my favorite guitar players,
00:02:08 Garry Kasparov on chess,
00:02:10 Daniel Negreanu on poker and many, many more.
00:02:13 Chris Hadfield explaining how rockets work
00:02:17 and the experience of being launched into space alone
00:02:19 is worth the money.
00:02:20 For me, the key is to not be overwhelmed
00:02:22 by the abundance of choice.
00:02:24 Pick three courses you want to complete,
00:02:26 watch each all the way through.
00:02:28 It’s not that long,
00:02:29 but it’s an experience that will stick with you
00:02:31 for a long time.
00:02:32 It’s easily worth the money.
00:02:34 You can watch it on basically any device.
00:02:37 Once again, sign up on masterclass.com slash Lex
00:02:41 to get a discount and to support this podcast.
00:02:44 And now, here’s my conversation with Jack Dorsey.
00:02:48 You’ve been on several podcasts,
00:02:50 Joe Rogan, Sam Harris, Rich Roll, others,
00:02:53 excellent conversations,
00:02:55 but I think there’s several topics
00:02:57 that you didn’t talk about that I think are fascinating
00:03:00 that I’d love to talk to you about,
00:03:01 sort of machine learning, artificial intelligence,
00:03:04 both the narrow kind and the general kind
00:03:06 and engineering at scale.
00:03:08 So there’s a lot of incredible engineering going on
00:03:11 that you’re a part of,
00:03:12 crypto, cryptocurrency, blockchain, UBI,
00:03:16 all kinds of philosophical questions maybe we’ll get to
00:03:18 about life and death and meaning and beauty.
00:03:21 So you’re involved in building some of
00:03:25 the biggest network systems in the world,
00:03:27 sort of trillions of interactions a day.
00:03:30 The cool thing about that is the infrastructure,
00:03:33 the engineering at scale.
00:03:35 You started as a programmer with C building.
00:03:38 Yeah, so.
00:03:39 I’m a hacker, I’m not really an engineer.
00:03:41 Not a legit software engineer,
00:03:43 you’re a hacker at heart.
00:03:44 But to achieve scale, you have to do some,
00:03:47 unfortunately, legit large scale engineering.
00:03:49 So how do you make that magic happen?
00:03:52 Hire people that I can learn from, number one.
00:03:57 I mean, I’m a hacker in the sense that I,
00:04:00 my approach has always been do whatever it takes
00:04:02 to make it work.
00:04:04 So that I can see and feel the thing
00:04:07 and then learn what needs to come next.
00:04:09 And oftentimes what needs to come next is
00:04:13 a matter of being able to bring it to more people,
00:04:16 which is scale.
00:04:17 And there’s a lot of great people out there
00:04:21 that either have experience or are extremely fast learners
00:04:27 that we’ve been lucky enough to find
00:04:30 and work with for years.
00:04:33 But I think a lot of it,
00:04:35 we benefit a ton from the open source community
00:04:39 and just all the learnings there
00:04:41 that are laid bare in the open.
00:04:44 All the mistakes, all the success,
00:04:46 all the problems.
00:04:48 It’s a very slow moving process usually open source,
00:04:53 but it’s very deliberate.
00:04:54 And you get to see because of the pace,
00:04:58 you get to see what it takes
00:05:00 to really build something meaningful.
00:05:02 So I learned most of everything I learned about hacking
00:05:06 and programming and engineering has been due to open source
00:05:11 and the generosity of people
00:05:19 willing to give up their time, sacrifice their time
00:05:21 without any expectation in return,
00:05:24 other than being a part of something
00:05:27 much larger than themselves, which I think is great.
00:05:29 Open source movement is amazing.
00:05:31 But if you just look at the scale,
00:05:33 like Square has to take care of,
00:05:35 is this fundamentally a software problem
00:05:38 or a hardware problem?
00:05:39 You mentioned hiring a bunch of people,
00:05:41 but it’s not, maybe from my perspective,
00:05:45 not often talked about how incredible that is
00:05:48 to sort of have a system that doesn’t go down often,
00:05:52 that is secure, is able to take care
00:05:54 of all these transactions.
00:05:55 Like maybe I’m also a hacker at heart
00:05:58 and it’s incredible to me that that kind of scale
00:06:01 could be achieved.
00:06:02 Is there some insight, some lessons,
00:06:06 some interesting tidbits that you can say
00:06:10 how to make that scale happen?
00:06:12 Is it the hardware fundamentally challenge?
00:06:14 Is it a software challenge?
00:06:19 Is it a social challenge of building large teams
00:06:23 of engineers that work together, that kind of thing?
00:06:25 Like what’s the interesting challenges there?
00:06:28 By the way, you’re the best dressed hacker I’ve met.
00:06:31 Thank you.
00:06:34 Of the enumeration you just went through,
00:06:36 I don’t think there’s one.
00:06:37 You have to kind of focus on all
00:06:39 and the ability to focus on all that
00:06:43 really comes down to how you face problems
00:06:47 and whether you can break them down into parts
00:06:51 that you can focus on.
00:06:53 Because I think the biggest mistake is trying to solve
00:06:58 or address too many at once
00:07:02 or not going deep enough with the questions
00:07:05 or not being critical of the answers you find
00:07:08 or not taking the time to form credible hypotheses
00:07:15 that you can actually test and you can see the results of.
00:07:19 So all of those fall in the face of ultimately
00:07:25 critical thinking skills, problem solving skills.
00:07:27 And if there’s one skill I want to improve every day,
00:07:30 it’s that that’s what contributes to the learning
00:07:34 and the only way we can evolve any of these things
00:07:38 is learning what it’s currently doing
00:07:41 and how to take it to the next step.
00:07:44 And questioning assumptions,
00:07:45 the first principles kind of thinking,
00:07:47 seems like a fundamental to this whole process.
00:07:50 Yeah, but if you get too overextended into,
00:07:53 well, this is a hardware issue,
00:07:54 you miss all the software solutions.
00:07:56 And vice versa, if you focus too much on the software,
00:08:01 there are hardware solutions that can 10X the thing.
00:08:06 So I try to resist the categories of thinking
00:08:13 and look for the underlying systems
00:08:16 that make all these things work.
00:08:18 But those only emerge when you have a skill
00:08:22 around creative thinking, problem solving,
00:08:27 and being able to ask critical questions
00:08:33 and having the patience to go deep.
00:08:36 So one of the amazing things,
00:08:38 if we look at the mission of Square,
00:08:40 is to increase people’s access to the economy.
00:08:45 Maybe you can correct me if I’m wrong,
00:08:46 that’s from my perspective.
00:08:47 So from the perspective of merchants,
00:08:49 peer to peer payments, even crypto, cryptocurrency,
00:08:52 digital cryptocurrency, what do you see as the major ways
00:08:56 that our society can increase participation in the economy?
00:08:59 So if we look at today and the next 10 years,
00:09:01 next 20 years, you go into Africa maybe,
00:09:04 and all kinds of other places outside of North America.
00:09:09 If there was one word that I think represents
00:09:13 what we’re trying to do at Square, it is that word access.
00:09:19 One of the things we found is that
00:09:21 we weren’t expecting this at all.
00:09:23 When we started, we thought we were just building
00:09:25 a piece of hardware to enable people
00:09:29 to plug it into their phone and swipe a credit card.
00:09:32 And then as we talked with people
00:09:34 who actually tried to accept credit cards in the past,
00:09:37 we found a consistent theme, which many of them
00:09:40 weren’t even enabled, not enabled,
00:09:44 but allowed to process credit cards.
00:09:46 And we dug a little bit deeper, again, asking that question.
00:09:50 And we found that a lot of them would go to banks
00:09:54 or these merchant acquirers.
00:09:57 And waiting for them was a credit check
00:10:01 and looking at a FICO score.
00:10:03 And many of the businesses that we talked to
00:10:07 and many small businesses,
00:10:09 they don’t have good credit or a credit history.
00:10:15 They’re entrepreneurs who are just getting started,
00:10:17 taking a lot of personal risk, financial risk.
00:10:21 And it just felt ridiculous to us
00:10:24 that for the job of being able to accept money from people,
00:10:31 you had to get your credit checked.
00:10:33 And as we dug deeper, we realized that
00:10:36 that wasn’t the intention of the financial industry,
00:10:38 but it’s the only tool they had available to them
00:10:42 to understand authenticity, intent,
00:10:46 predictor of future behavior.
00:10:49 So that’s the first thing we actually looked at.
00:10:50 And that’s where the, you know, we built the hardware,
00:10:52 but the software really came in terms of risk modeling.
00:10:57 And that’s when we started down the path
00:11:00 that eventually leads to AI.
00:11:03 We started with a very strong data science discipline
00:11:08 because we knew that our business
00:11:10 was not necessarily about making hardware.
00:11:13 It was more about enabling more people
00:11:17 to come into the system.
00:11:18 So the fundamental challenge there is,
00:11:21 so to enable more people to come into the system,
00:11:23 you have to lower the barrier of checking
00:11:26 that that person will be a legitimate vendor.
00:11:30 Is that the fundamental problem?
00:11:31 Yeah, and a different mindset.
00:11:33 I think a lot of the financial industry had a mindset
00:11:36 of kind of distrust and just constantly looking
00:11:41 for opportunities to prove why people shouldn’t get
00:11:46 into the system, whereas we took on a mindset of trust
00:11:50 and then verify, verify, verify, verify, verify.
00:11:52 Yes.
00:11:53 So when we entered the space,
00:12:00 only about 30 to 40% of the people who applied
00:12:03 to accept credit cards would actually get through the system.
00:12:13 We took that number to 99%.
00:12:15 And that’s because we reframed the problem,
00:12:19 we built credible models, and we had this mindset of,
00:12:25 we’re going to watch not at the merchant level,
00:12:28 but we’re gonna watch at the transaction level.
00:12:30 So come in, perform some transactions,
00:12:35 and as long as you’re doing things
00:12:36 with integrity, credible, and don’t look suspicious,
00:12:40 we’ll continue to serve you.
00:12:43 If we see any interestingness in how you use our system,
00:12:47 that will be bubbled up to people to review,
00:12:50 to figure out if there’s something nefarious going on,
00:12:53 and that’s when we might ask you to leave.
00:12:56 So the change in the mindset led to the technology
00:13:01 that we needed to enable more people to get through,
00:13:06 and to enable more people to access the system.
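The "trust, then verify at the transaction level" approach described above can be sketched in code. This is purely illustrative: the transaction fields, the anomaly score, and the review threshold are all hypothetical stand-ins, not Square's actual risk system. The key design choice it mirrors is that an unusual transaction is bubbled up to a human reviewer rather than triggering an automatic ban.

```python
from dataclasses import dataclass, field

@dataclass
class Transaction:
    merchant_id: str
    amount: float

@dataclass
class RiskMonitor:
    # Transactions scoring above this are "interesting" enough to
    # bubble up to a human reviewer (hypothetical threshold).
    review_threshold: float = 0.8
    review_queue: list = field(default_factory=list)

    def score(self, txn: Transaction, history: list) -> float:
        """Toy anomaly score: how far this amount deviates from the
        merchant's own running average, clamped to [0, 1]."""
        if not history:
            return 0.0
        avg = sum(t.amount for t in history) / len(history)
        return min(1.0, abs(txn.amount - avg) / (avg + 1e-9))

    def process(self, txn: Transaction, history: list) -> str:
        """Serve by default; flag for human review, never auto-ban."""
        if self.score(txn, history) > self.review_threshold:
            self.review_queue.append(txn)
            return "review"
        return "serve"
```

For example, a merchant whose history is a string of $10 sales gets a $12 sale served immediately, while a sudden $500 charge lands in the review queue for a person to judge.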
00:13:08 What role does machine learning play into that,
00:13:11 in that context of, you said,
00:13:15 first of all, it’s a beautiful shift.
00:13:17 Anytime you shift your viewpoint into seeing
00:13:20 that people are fundamentally good,
00:13:24 and then you just have to verify
00:13:25 and catch the ones who are not,
00:13:27 as opposed to assuming everybody’s bad,
00:13:30 this is a beautiful thing.
00:13:31 So what role does the, to you,
00:13:35 throughout the history of the company,
00:13:37 has machine learning played in doing that verification?
00:13:40 It was immediate.
00:13:41 I mean, we weren’t calling it machine learning,
00:13:43 but it was data science.
00:13:45 And then as the industry evolved,
00:13:47 machine learning became more of the nomenclature,
00:13:50 and as that evolved, it became more sophisticated
00:13:54 with deep learning, and as that continues to evolve,
00:13:58 it’ll be another thing.
00:13:59 But they’re all in the same vein.
00:14:02 But we built that discipline up
00:14:04 within the first year of the company,
00:14:06 because we also had, we had to partner with a bank,
00:14:11 we had to partner with Visa and MasterCard,
00:14:14 and we had to show that,
00:14:16 by bringing more people into the system,
00:14:19 that we could do so in a responsible way,
00:14:21 that would not compromise their systems,
00:14:23 and that they would trust us.
00:14:25 How do you convince this upstart company
00:14:27 with some cool machine learning tricks
00:14:30 is able to deliver on this trustworthy set of merchants?
00:14:35 We staged it out in tiers.
00:14:37 We had a bucket of 500 people using it,
00:14:41 and then we showed results,
00:14:43 and then 1,000, and then 10,000, then 50,000,
00:14:45 and then the constraint was lifted.
00:14:49 So again, it’s kind of getting something tangible out there.
00:14:54 I want to show what we can do rather than talk about it.
00:14:58 And that put a lot of pressure on us
00:15:00 to do the right things.
00:15:02 And it also created a culture of accountability,
00:15:07 of a little bit more transparency,
00:15:10 and I think incentivized all of our early folks
00:15:15 and the company in the right way.
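The staged rollout Jack describes can be sketched as a simple gating loop: expand bucket by bucket, and only move to the next tier once the results at the current one look healthy. The tier sizes mirror the ones he mentions; the success metric and the required bar are hypothetical illustrations.

```python
# Bucket sizes from the conversation: 500, then 1,000, 10,000, 50,000.
TIERS = [500, 1_000, 10_000, 50_000]

def staged_rollout(success_rate_at, required_rate=0.99):
    """success_rate_at(size) -> observed success rate for a bucket of
    that size (a hypothetical measurement callback).

    Returns the list of tiers that were approved. Clearing every tier
    corresponds to the partner lifting the constraint entirely."""
    approved = []
    for size in TIERS:
        # Show results at this tier before expanding further.
        if success_rate_at(size) < required_rate:
            break
        approved.append(size)
    return approved
```

A rollout that stays above the bar at every tier clears all four buckets; one that degrades at scale stops at the last healthy tier, which is the accountability pressure described above.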
00:15:18 So what does the future look like
00:15:19 in terms of increasing people’s access?
00:15:21 Or if you look at IoT, Internet of Things,
00:15:25 there’s more and more intelligent devices.
00:15:27 You can see there’s some people even talking
00:15:29 about our personal data as a thing
00:15:32 that we could monetize more explicitly versus implicitly.
00:15:35 Sort of everything can become part of the economy.
00:15:38 Do you see, so what does the future of Square look like
00:15:41 in sort of giving people access in all kinds of ways
00:15:45 to being part of the economy as merchants and as consumers?
00:15:49 I believe that the currency we use
00:15:52 is a huge part of the answer.
00:15:55 And I believe that the internet deserves
00:15:58 and requires a native currency.
00:16:01 And that’s why I’m such a huge believer in Bitcoin
00:16:07 because it just,
00:16:11 our biggest problem as a company right now
00:16:13 is we cannot act like an internet company.
00:16:16 Open a new market,
00:16:17 we have to have a partnership with a local bank.
00:16:20 We have to pay attention
00:16:21 to different regulatory onboarding environments.
00:16:25 And a digital currency like Bitcoin
00:16:29 takes a bunch of that away
00:16:31 where we can potentially launch a product
00:16:35 in every single market around the world
00:16:38 because they’re all using the same currency.
00:16:41 And we have consistent understanding of regulation
00:16:46 and onboarding and what that means.
00:16:49 So I think the internet continuing to be accessible
00:16:54 to people is number one.
00:16:57 And then I think currency is number two.
00:17:01 And it will just allow for a lot more innovation,
00:17:04 a lot more speed in terms of what we can build
00:17:07 and others can build.
00:17:09 And it’s just really exciting.
00:17:11 So, I mean, I wanna be able to see that
00:17:13 and feel that in my lifetime.
00:17:16 So in this aspect and in other aspects,
00:17:19 you have a deep interest in cryptocurrency
00:17:22 and distributed ledger tech in general.
00:17:24 I talked to Vitalik Buterin yesterday on this podcast.
00:17:27 He says hi, by the way.
00:17:29 Hey.
00:17:29 He’s a brilliant, brilliant person.
00:17:33 Talked a lot about Bitcoin and Ethereum, of course.
00:17:36 So can you maybe linger on this point?
00:17:38 What do you find appealing about Bitcoin,
00:17:42 about digital currency?
00:17:43 Where do you see it going in the next 10, 20 years?
00:17:46 And what are some of the challenges with respect to Square
00:17:50 but also just bigger for our globally, for our world,
00:17:55 for the way we think about money?
00:17:59 I think the most beautiful thing about it
00:18:01 is there’s no one person setting the direction.
00:18:05 And there’s no one person on the other side
00:18:07 that can stop it.
00:18:08 So we have something that is pretty organic in nature
00:18:15 and very principled in its original design.
00:18:19 And I think the Bitcoin white paper
00:18:22 is one of the most seminal works of computer science
00:18:24 in the last 20, 30 years.
00:18:28 It’s poetry.
00:18:30 I mean, it really is.
00:18:30 Yeah, it’s a pretty cool technology.
00:18:32 That’s not often talked about.
00:18:33 There’s so much hype around digital currency
00:18:36 about the financial impacts of it.
00:18:38 But the actual technology is quite beautiful
00:18:40 from a computer science perspective.
00:18:42 Yeah, and the underlying principles behind it
00:18:44 that went into it, even to the point
00:18:46 of releasing it under a pseudonym.
00:18:48 I think that’s a very, very powerful statement.
00:18:51 The timing of when it was released is powerful.
00:18:54 It was a total activist move.
00:18:58 I mean, it’s moving the world forward
00:19:00 in a way that I think is extremely noble and honorable
00:19:05 and enables everyone to be part of the story,
00:19:08 which is also really cool.
00:19:10 So you asked a question around 10 years and 20 years.
00:19:13 I mean, I think the amazing thing is no one knows.
00:19:17 And it can emerge.
00:19:19 And every person that comes into the ecosystem,
00:19:22 whether they be a developer or someone who uses it,
00:19:27 can change its direction in small and large ways.
00:19:31 And that’s what I think it should be,
00:19:33 because that’s what the internet has shown is possible.
00:19:36 Now, there’s complications with that, of course.
00:19:38 And there’s certainly companies that own large parts
00:19:42 of the internet and can direct it more than others.
00:19:44 And there’s not equal access
00:19:47 to every single person in the world just yet.
00:19:50 But all those problems are visible enough
00:19:53 to speak about them.
00:19:54 And to me, that gives confidence that they’re solvable
00:19:57 in a relatively short timeframe.
00:20:00 I think the world should be able to do that.
00:20:02 I think the world changes a lot as we get these satellites
00:20:08 projecting the internet down to earth,
00:20:11 because it just removes a bunch of the former constraints
00:20:15 and really levels the playing field.
00:20:18 But a global currency,
00:20:20 which a native currency for the internet is a proxy for,
00:20:24 is a very powerful concept.
00:20:27 And I don’t think any one person on this planet
00:20:29 truly understands the ramifications of that.
00:20:31 I think there’s a lot of positives to it.
00:20:34 There’s some negatives as well.
00:20:35 But…
00:20:36 Do you think it’s possible, sorry to interrupt,
00:20:37 do you think it’s possible that this kind of digital currency
00:20:40 would redefine the nature of money,
00:20:43 so become the main currency of the world,
00:20:46 as opposed to being tied to fiat currency
00:20:49 of different nations and sort of really push
00:20:51 the decentralization of control of money?
00:20:54 Definitely, but I think the bigger ramification
00:20:58 is how it affects how society works.
00:21:02 And I think there are many positive ramifications
00:21:06 outside of just money.
00:21:07 Outside of just money.
00:21:08 Money is a foundational layer that enables so much more.
00:21:12 I was meeting with an entrepreneur in Ethiopia,
00:21:14 and payments is probably the number one problem to solve
00:21:19 across the continent,
00:21:21 both in terms of moving money across borders
00:21:24 between nations on the continent,
00:21:26 or the amount of corruption within the current system.
00:21:33 But the lack of easy ways to pay people
00:21:39 makes starting anything really difficult.
00:21:42 I met an entrepreneur who started the Lyft slash Uber
00:21:47 of Ethiopia, and one of the biggest problems she has
00:21:49 is that it’s not easy for her riders to pay the company,
00:21:54 it’s not easy for her to pay the drivers.
00:21:57 And that definitely has stunted her growth
00:22:00 and made everything more challenging.
00:22:02 So the fact that she even has to think about payments
00:22:07 instead of thinking about the best rider experience
00:22:10 and the best driver experience is pretty telling.
00:22:15 So I think as we get a more durable, resilient
00:22:20 and global standard, we see a lot more innovation everywhere.
00:22:26 And I think there’s no better case study for this
00:22:29 than the various countries within Africa
00:22:32 and their entrepreneurs who are trying to start things
00:22:35 within health or sustainability or transportation
00:22:38 or a lot of the companies that we’ve seen here.
00:22:42 So the majority of companies I met in November
00:22:47 when I spent a month on the continent were payments oriented.
00:22:52 You mentioned, and this is a small tangent,
00:22:54 you mentioned the anonymous launch of Bitcoin
00:22:58 is a sort of profound philosophical statement.
00:23:00 Pseudonymous.
00:23:02 What’s that even mean?
00:23:03 There’s a pseudonym.
00:23:04 First of all, let me ask.
00:23:05 There’s an identity tied to it.
00:23:06 It’s not just anonymous, it’s Nakamoto.
00:23:10 So Nakamoto might represent one person or multiple people.
00:23:13 But let me ask, are you Satoshi Nakamoto?
00:23:15 Just checking, catch you off guard.
00:23:17 And if I were, would I tell you?
00:23:18 Yeah, that’s true.
00:23:19 Maybe you slip.
00:23:21 A pseudonym is constructed identity.
00:23:25 Anonymity is just kind of this random,
00:23:28 like drop something off and leave.
00:23:32 There’s no intention to build an identity around it.
00:23:34 And while the identity being built was a short time window,
00:23:39 it was meant to stick around, I think, and to be known.
00:23:44 And it’s being honored in how the community
00:23:50 thinks about building it,
00:23:50 like the concept of satoshis, for instance,
00:23:55 is one such example.
00:23:56 But I think it was smart not to do it anonymous,
00:24:01 not to do it as a real identity,
00:24:03 but to do it as pseudonym,
00:24:05 because I think it builds tangibility
00:24:07 and a little bit of empathy that this was a human
00:24:13 or a set of humans behind it.
00:24:14 And there’s this natural identity that I can imagine.
00:24:19 But there is also a sacrifice of ego.
00:24:22 That’s a pretty powerful thing
00:24:23 from your perspective. Yeah, which is beautiful.
00:24:25 Would you do, sort of philosophically,
00:24:28 to ask you the question,
00:24:29 would you do all the same things you’re doing now
00:24:32 if your name wasn’t attached to it?
00:24:35 Sort of, if you had to sacrifice the ego,
00:24:39 put another way, is your ego deeply tied
00:24:41 in the decisions you’ve been making?
00:24:44 I hope not.
00:24:45 I mean, I believe I would certainly attempt
00:24:49 to do the things without my name having
00:24:51 to be attached with it.
00:24:53 But it’s hard to do that in a corporation, legally.
00:25:01 That’s the issue.
00:25:02 If I were to do more open source things,
00:25:05 then absolutely, I don’t need my particular identity,
00:25:10 my real identity associated with it.
00:25:12 But I think the appreciation that comes
00:25:17 from doing something good and being able to see it
00:25:21 and see people use it is pretty overwhelming and powerful,
00:25:26 more so than maybe seeing your name in the headlines.
00:25:29 Let’s talk about artificial intelligence a little bit,
00:25:33 if we could.
00:25:34 70 years ago, Alan Turing formulated the Turing test.
00:25:38 To me, natural language is one of the most interesting
00:25:41 spaces of problems that are tackled
00:25:44 by artificial intelligence.
00:25:45 It’s the canonical problem of what it means
00:25:47 to be intelligent.
00:25:48 He formulated it as the Turing test.
00:25:50 Let me ask sort of the broad question,
00:25:53 how hard do you think is it to pass the Turing test
00:25:56 in the space of language?
00:25:58 Just from a very practical standpoint,
00:26:00 I think where we are now and for at least years out
00:26:07 is one where the artificial intelligence,
00:26:11 machine learning, the deep learning models
00:26:13 can bubble up interestingness very, very quickly
00:26:17 and pair that with human discretion around severity,
00:26:22 around depth, around nuance and meaning.
00:26:27 I think for me, the chasm across for general intelligence
00:26:33 is to be able to explain why and the meaning
00:26:38 behind something.
00:26:40 Behind a decision.
00:26:42 Behind a decision or a set of data.
00:26:45 So the explainability part is kind of essential
00:26:48 to be able to explain the meaning behind something.
00:26:52 To explain using natural language
00:26:54 why the decisions were made, that kind of thing.
00:26:56 Yeah, I mean I think that’s one of our biggest risks
00:26:58 in artificial intelligence going forward
00:27:01 is we are building a lot of black boxes
00:27:03 that can’t necessarily explain why they made a decision
00:27:06 or what criteria they used to make the decision.
00:27:09 And we’re trusting them more and more
00:27:11 from lending decisions to content recommendation
00:27:14 to driving to health.
00:27:18 Like a lot of us have watches that tell us to stand up,
00:27:20 and we don’t understand how they’re deciding that.
00:27:23 I mean that one’s pretty simple.
00:27:25 But you can imagine how complex they get.
00:27:28 And being able to explain the reasoning behind
00:27:32 some of those recommendations seems to be an essential part.
00:27:34 Although it’s hard.
00:27:35 Which is a very hard problem because sometimes
00:27:37 even we can’t explain why we make decisions.
00:27:40 That’s what I was, I think we’re being sometimes
00:27:42 a little bit unfair to artificial intelligence systems
00:27:45 because we’re not very good at some of these things.
00:27:48 So do you think, apologize for the ridiculous
00:27:52 romanticized question, but on that line of thought,
00:27:55 do you think we’ll ever be able to build a system
00:28:00 like in the movie Her that you could fall in love with?
00:28:03 So have that kind of deep connection with.
00:28:06 Hasn’t that already happened?
00:28:07 Hasn’t someone in Japan fallen in love with his AI?
00:28:13 There’s always going to be somebody
00:28:14 that does that kind of thing.
00:28:15 I mean at a much larger scale of actually building
00:28:19 relationships, of being deeper connections.
00:28:21 It doesn’t have to be love, but it’s just deeper connections
00:28:24 with artificial intelligence systems.
00:28:26 So you mentioned explainability.
00:28:27 That’s less a function of the artificial intelligence
00:28:29 and more a function of the individual
00:28:32 and how they find meaning and where they find meaning.
00:28:34 Do you think we humans can find meaning in technology
00:28:37 in this kind of way?
00:28:38 Yeah, yeah, yeah, 100%, 100%.
00:28:40 And I don’t necessarily think it’s a negative.
00:28:43 But it’s constantly going to evolve.
00:28:50 So I don’t know, but meaning is something
00:28:54 that’s entirely subjective.
00:28:56 And I don’t think it’s going to be a function
00:28:59 of finding the magic algorithm
00:29:02 that enables everyone to love it.
00:29:07 But maybe, I don’t know.
00:29:09 That question really gets at the difference
00:29:10 between human and machine.
00:29:12 So you had a little bit of an exchange with Elon Musk.
00:29:17 Basically, I mean it’s a trivial version of that,
00:29:20 but I think there’s a more fundamental question
00:29:22 of is it possible to tell the difference
00:29:24 between a bot and a human?
00:29:27 And do you think it’s, if we look into the future,
00:29:31 10, 20 years out, do you think it would be possible
00:29:34 or is it even necessary to tell the difference
00:29:36 in the digital space between a human and a robot?
00:29:40 Can we have fulfilling relationships with each
00:29:42 or do we need to tell the difference between them?
00:29:46 I think it’s certainly useful in certain problem domains
00:29:49 to be able to tell the difference.
00:29:52 I think in others it might not be as useful.
00:29:56 Do you think it’s possible for us today
00:29:58 to tell that difference?
00:30:00 Is the reverse the meta of the Turing test?
00:30:02 Well, what’s interesting is I think the technology
00:30:07 to create is moving much faster
00:30:10 than the technology to detect, generally.
00:30:13 You think so?
00:30:14 So if you look at adversarial machine learning,
00:30:17 there’s a lot of systems that try
00:30:18 to fool machine learning systems.
00:30:21 And at least for me, the hope is that the technology
00:30:23 to defend will always be right there, at least.
00:30:28 Your sense is that…
00:30:30 I don’t know if they’ll be right there.
00:30:31 I mean, it’s a race, right?
00:30:34 So the detection technologies have to be two
00:30:38 or 10 steps ahead of the creation technologies.
00:30:42 This is a problem that I think the financial industry
00:30:44 will face more and more because a lot of our risk models,
00:30:48 for instance, are built around identity.
00:30:50 Payments ultimately comes down to identity.
00:30:53 And you can imagine a world where all this conversation
00:30:57 around deep fakes goes towards the direction
00:31:00 of a driver’s license or passports or state identities.
00:31:06 And people construct identities in order
00:31:09 to get through a system such as ours
00:31:11 to start accepting credit cards or into the Cash App.
00:31:15 And those technologies seem to be moving very, very quickly.
00:31:19 Our ability to detect them, I think,
00:31:22 is probably lagging at this point,
00:31:25 but certainly with more focus, we can get ahead of it.
00:31:29 But this is gonna touch everything.
00:31:33 So I think it’s like security.
00:31:38 We’re never going to be able
00:31:39 to build a perfect detection system.
00:31:42 We’re only going to be able to…
00:31:45 What we should be focused on is the speed of evolving it
00:31:50 and being able to take signals that show correctness
00:31:55 or errors as quickly as possible
00:31:58 and move, and be able to build that
00:32:01 into our newer models or the self-learning models.
00:32:04 Do you have other worries?
00:32:06 Like some people, like Elon and others,
00:32:07 have worries of existential threats
00:32:10 of artificial intelligence,
00:32:11 of artificial general intelligence?
00:32:13 Or if you think more narrowly about threats
00:32:17 and concerns about more narrow artificial intelligence,
00:32:20 like what are your thoughts in this domain?
00:32:23 Do you have concerns or are you more optimistic?
00:32:26 I think Yuval Noah Harari, in his book,
00:32:29 21 Lessons for the 21st Century,
00:32:31 his last chapter is around meditation.
00:32:34 And you look at the title of the chapter
00:32:37 and you’re like, oh, it’s all meditation.
00:32:39 But what was interesting about that chapter
00:32:42 is he believes that kids being born today,
00:32:48 growing up today, Google has a stronger sense
00:32:53 of their preferences than they do,
00:32:57 which you can easily imagine.
00:32:59 I can easily imagine today that Google probably knows
00:33:04 my preferences more than my mother does.
00:33:08 Maybe not me per se, but for someone growing up
00:33:12 only knowing the internet,
00:33:13 only knowing what Google is capable of,
00:33:16 or Facebook or Twitter or Square or any of these things,
00:33:20 the self awareness is being offloaded to other systems
00:33:25 and particularly these algorithms.
00:33:28 And his concern is that we lose that self awareness
00:33:32 because the self awareness is now outside of us
00:33:35 and it’s doing such a better job
00:33:37 at helping us direct our decisions around,
00:33:41 should I stand, should I walk today?
00:33:43 What doctor should I choose?
00:33:45 Who should I date?
00:33:46 All these things we’re now seeing play out very quickly.
00:33:50 So he sees meditation as a tool to build that self awareness
00:33:54 and to bring the focus back on,
00:33:56 why do I make these decisions?
00:33:58 Why do I react in this way?
00:34:00 Why did I have this thought?
00:34:02 Where did that come from?
00:34:04 That’s a way to regain control.
00:34:07 Or awareness, maybe not control, but awareness
00:34:10 so that you can be aware that, yes,
00:34:15 I am offloading this decision to this algorithm
00:34:19 that I don’t fully understand
00:34:21 and can’t tell me why it’s doing the things it’s doing
00:34:24 because it’s so complex.
00:34:26 That’s not to say that the algorithm can’t be a good thing.
00:34:29 And to me recommender systems,
00:34:31 the best of what they can do is to help guide you
00:34:34 on a journey of learning new ideas, of learning, period.
00:34:39 It can be a great thing, but do you know you’re doing that?
00:34:41 Are you aware that you’re inviting it to do that to you?
00:34:45 I think that’s the risk he identifies, right?
00:34:50 That’s perfectly okay.
00:34:51 But are you aware that you have that invitation
00:34:55 and it’s being acted upon?
00:34:58 And so that’s a concern you’re kind of highlighting
00:35:02 that without awareness,
00:35:04 you can just be like floating at sea.
00:35:06 So awareness is key in the future
00:35:08 of these artificial intelligence systems.
00:35:10 Yeah, the movie WALL-E.
00:35:12 WALL-E.
00:35:13 Which I think is one of Pixar’s best movies
00:35:15 besides Ratatouille.
00:35:19 Ratatouille was incredible.
00:35:20 You had me until Ratatouille, okay.
00:35:22 Ratatouille was incredible.
00:35:26 All right, we’ve come to the first point
00:35:28 where we disagree, okay.
00:35:29 It’s the entrepreneurial story in the form of a rat.
00:35:35 I just remember the soundtrack was really good, so.
00:35:38 Excellent.
00:35:41 What are your thoughts, sticking on artificial intelligence
00:35:43 a little bit, about the displacement of jobs?
00:35:45 That’s another perspective that candidates
00:35:48 like Andrew Yang talk about.
00:35:50 Yang gang forever.
00:35:53 Yang gang.
00:35:54 So he unfortunately, speaking of Yang gang,
00:35:56 has recently dropped out.
00:35:57 I know, it was very disappointing and depressing.
00:36:00 Yeah, but on the positive side,
00:36:02 he’s I think launching a podcast, so.
00:36:05 Really, cool.
00:36:06 Yeah, he just announced that.
00:36:07 I’m sure he’ll try to talk you into trying
00:36:09 to come on to the podcast.
00:36:11 I will talk to him.
00:36:12 So. About Ratatouille.
00:36:14 Yeah, maybe he’ll be more welcoming
00:36:16 of the Ratatouille argument.
00:36:18 What are your thoughts on his concerns
00:36:20 of the displacement of jobs, of automation?
00:36:22 Of course there’s positive impacts
00:36:24 that could come from automation and AI,
00:36:26 but there could also be negative impacts.
00:36:29 And within that framework, what are your thoughts
00:36:31 about universal basic income?
00:36:33 So these interesting new ideas
00:36:36 of how we can empower people in the economy.
00:36:40 I think he was 100% right on almost every dimension.
00:36:46 We see this in Square’s business.
00:36:48 I mean, he identified truck drivers.
00:36:52 I’m from Missouri.
00:36:54 And he certainly pointed to the concern
00:37:00 and the issue that people from where I’m from
00:37:04 feel every single day that is often invisible
00:37:07 and not talked about enough.
00:37:09 You know, the next big one is cashiers.
00:37:12 This is where it pertains to Square’s business.
00:37:15 We are seeing more and more of the point of sale
00:37:19 move to the individual customer’s hand
00:37:22 in the form of their phone and apps
00:37:24 and preorder and order ahead.
00:37:27 We’re seeing more kiosks.
00:37:29 We’re seeing more things like Amazon Go.
00:37:32 And the number of workers employed as cashiers in retail is immense.
00:37:40 And, you know, there’s no real answers
00:37:43 on how they transform their skills
00:37:47 and work into something else.
00:37:51 And I think that does lead to a lot
00:37:53 of really negative ramifications.
00:37:56 And the important point that he brought up
00:37:59 around universal basic income
00:38:01 is given that the shift is going to come
00:38:04 and given it is going to take time
00:38:07 to set people up with new skills and new careers,
00:38:14 they need to have a floor to be able to survive.
00:38:17 And this $1,000 a month is such a floor.
00:38:22 It’s not going to incentivize you to quit your job
00:38:25 because it’s not enough,
00:38:26 but it will enable you to not have to worry
00:38:30 as much about just getting on day to day
00:38:35 so that you can focus on what am I going to do now
00:38:39 and what skills do I need to acquire?
00:38:44 And I think, you know, a lot of people point
00:38:48 to the fact that, you know, during the industrial age,
00:38:53 we had the same concerns around automation,
00:38:55 factory lines and everything worked out okay.
00:38:59 But the biggest change is just the velocity
00:39:04 and the centralization of a lot of the things
00:39:08 that make this work, which is the data
00:39:11 and the algorithms that work on this data.
00:39:14 I think the second scariest thing
00:39:18 around AI is just who actually owns the data
00:39:24 and who can operate on it.
00:39:26 And are we able to share the insights from the data
00:39:32 so that we can also build algorithms that help our needs
00:39:36 or help our business or whatnot?
00:39:39 So that’s where I think regulation could play
00:39:43 a strong and positive part.
00:39:46 First, looking at the primitives of AI
00:39:50 and the tools we use to build these services
00:39:52 that will ultimately touch every single aspect
00:39:54 of the human experience.
00:39:56 And then where data is owned and how it’s shared.
00:40:05 So those are the questions that, as a society, as a world,
00:40:10 we need to have better answers around,
00:40:12 which we currently don’t.
00:40:13 They’re just way too centralized
00:40:15 into a few very, very large companies.
00:40:19 But I think he was spot on in identifying the problem
00:40:23 and proposing solutions that would actually work,
00:40:26 at least ones that we could learn from, expand,
00:40:29 or evolve. But I mean, I think UBI is well past its due.
00:40:38 I mean, it was certainly trumpeted by Martin Luther King
00:40:41 and even before him as well.
00:40:44 And like you said, the exact $1,000 mark
00:40:48 might not be the correct one,
00:40:50 but you should take the steps to try to implement
00:40:54 these solutions and see what works.
00:40:56 100%.
00:40:57 So I think you and I eat similar diets,
00:40:59 and at least I was.
00:41:01 The first time I’ve heard this.
00:41:04 Yeah, so I was doing it before.
00:41:05 First time anyone has said that to me, in this case anyway.
00:41:08 Yeah, but it’s becoming more and more cool.
00:41:12 But I was doing it before it was cool.
00:41:13 So intermittent fasting and fasting in general,
00:41:16 I really enjoy, I love food,
00:41:18 but I enjoy the, I also love suffering because I’m Russian.
00:41:23 So fasting kind of makes you appreciate
00:41:29 what it is to be human somehow.
00:41:33 But I have, outside the philosophical stuff,
00:41:36 I have a more specific question.
00:41:37 It also helps me as a programmer and a deep thinker,
00:41:41 like from the scientific perspective,
00:41:43 to sit there for many hours and focus deeply.
00:41:46 Maybe you were a hacker before you were CEO.
00:41:50 What have you learned about diet, lifestyle,
00:41:55 mindset that helps you maximize mental performance,
00:41:57 to be able to focus for,
00:42:00 to think deeply in this world of distractions?
00:42:03 I think I just took it for granted for too long.
00:42:08 Which aspect?
00:42:09 Just the social structure of we eat three meals a day
00:42:13 and there’s snacks in between.
00:42:15 And I just never really asked the question, why?
00:42:18 Oh, by the way, in case people don’t know,
00:42:20 I think a lot of people know,
00:42:22 but you at least, you famously eat once a day.
00:42:26 You still eat once a day?
00:42:27 Yep, I eat dinner.
00:42:29 By the way, what made you decide to eat once a day?
00:42:32 Like, cause to me that was a huge revolution
00:42:33 that you don’t have to eat breakfast.
00:42:35 That was like, I felt like I was a rebel.
00:42:37 Like I abandoned my parents or something
00:42:39 and became an anarchist.
00:42:41 When you first, like the first week you start doing it,
00:42:43 it feels that you kind of like have a superpower.
00:42:45 Then you realize it’s not really a superpower.
00:42:47 But I think you realize,
00:42:50 at least I realized,
00:42:53 just how much our mind dictates what we’re capable of.
00:42:59 And sometimes we have structures around us
00:43:02 that incentivize like, this three meal a day thing,
00:43:05 which was purely a social structure
00:43:09 versus necessity for our health and for our bodies.
00:43:14 And I did it just, I started doing it
00:43:17 because I played a lot with my diet when I was a kid
00:43:21 and I was vegan for two years
00:43:23 and just went all over the place just because I,
00:43:28 you know, health is the most precious thing we have
00:43:31 and none of us really understand it.
00:43:33 So being able to ask the question through experiments
00:43:37 that I can perform on myself
00:43:39 and learn about is compelling to me.
00:43:44 And I heard this one guy on a podcast, Wim Hof,
00:43:47 who’s famous for doing ice baths and holding his breath
00:43:50 and all these things.
00:43:54 He said he only eats one meal a day.
00:43:56 I’m like, wow, that sounds super challenging
00:43:59 and uncomfortable.
00:44:00 I’m gonna do it.
00:44:02 So I just, I learn the most when I make myself,
00:44:06 I wouldn’t say suffer,
00:44:07 but when I make myself feel uncomfortable
00:44:10 because everything comes to bear in those moments
00:44:14 and you really learn what you’re about or what you’re not.
00:44:21 So I’ve been doing that my whole life.
00:44:23 Like when I was a kid,
00:44:25 I could not speak.
00:44:27 Like I had to go to a speech therapist
00:44:29 and it made me extremely shy.
00:44:31 And then one day I realized I can’t keep doing this
00:44:34 and I signed up for the speech club.
00:44:39 And it was the most uncomfortable thing
00:44:45 I could imagine doing, getting a topic on a note card,
00:44:49 having five minutes to write a speech
00:44:51 about whatever that topic is,
00:44:53 not being able to use the note card while speaking
00:44:56 and speaking for five minutes about that topic.
00:44:59 But it just,
00:45:03 it gave me so much perspective
00:45:06 around the power of communication,
00:45:08 around my own deficiencies
00:45:10 and around if I set my mind to do something, I’ll do it.
00:45:14 So it gave me a lot more confidence.
00:45:16 So I see fasting in the same light.
00:45:18 This is something that was interesting,
00:45:21 challenging, uncomfortable,
00:45:23 and has given me so much learning and benefit as a result.
00:45:30 And it will lead to other things that I’ll experiment with
00:45:32 and play with, but yeah,
00:45:35 it does feel a little bit like a superpower sometimes.
00:45:39 The most boring superpower one can imagine.
00:45:42 Now it’s quite incredible.
00:45:44 The clarity of mind is pretty interesting.
00:45:47 Speaking of suffering,
00:45:49 you kind of talk about facing difficult ideas.
00:45:53 You meditate, you think about the broad context of life,
00:45:58 of our societies.
00:46:00 Let me ask, sort of apologize again
00:46:02 for the romanticized question,
00:46:03 but do you ponder your own mortality?
00:46:06 Do you think about death,
00:46:09 about the finiteness of human existence
00:46:13 when you meditate, when you think about it?
00:46:15 And if you do,
00:46:18 how do you make sense of it, that this thing ends?
00:46:22 Well, I don’t try to make sense of it.
00:46:23 I do think about it every day.
00:46:25 I mean, it’s a daily, multiple times a day.
00:46:29 Are you afraid of death?
00:46:30 No, I’m not afraid of it.
00:46:32 I think it’s a transformation, I don’t know to what,
00:46:36 but it’s also a tool
00:46:39 to feel the importance of every moment.
00:46:44 So I just use it as a reminder, like I have an hour.
00:46:48 Is this really what I’m going to spend the hour doing?
00:46:52 Like I only have so many more sunsets and sunrises to watch.
00:46:55 Am I not going to get up for it?
00:46:58 Am I not going to make sure that I try to see it?
00:47:02 So it just puts a lot into perspective
00:47:06 and it helps me prioritize.
00:47:09 I don’t see it as something
00:47:13 that I dread or is dreadful.
00:47:15 It’s a tool that is available
00:47:18 to every single person to use every day
00:47:19 because it shows how precious life is.
00:47:21 And there’s reminders every single day,
00:47:24 whether it be your own health or a friend or a coworker
00:47:27 or something you see in the news.
00:47:30 So to me it’s just a question
00:47:32 of what we do with our daily reminder.
00:47:34 And for me, it’s am I really focused on what matters?
00:47:40 And sometimes that might be work,
00:47:42 sometimes that might be friendships or family
00:47:45 or relationships or whatnot,
00:47:47 but it’s the ultimate clarifier in that sense.
00:47:51 So on the question of what matters,
00:47:53 another ridiculously big question of
00:47:57 once you try to make sense of it,
00:47:58 what do you think is the meaning of it all,
00:48:00 the meaning of life?
00:48:02 What gives you purpose, happiness, meaning?
00:48:07 A lot does.
00:48:08 I mean, just being able to be aware
00:48:14 of the fact that I’m alive is pretty meaningful.
00:48:20 The connections I feel with individuals,
00:48:23 whether they’re people I just meet
00:48:25 or long lasting friendships or my family is meaningful.
00:48:30 Seeing people use something that I helped build
00:48:33 is really meaningful and powerful to me.
00:48:38 But that sense of, I mean,
00:48:40 I think ultimately it comes down to a sense of connection
00:48:43 and just feeling like
00:48:47 I am part of something that’s bigger than myself
00:48:49 and like I can feel it directly
00:48:52 in small ways or large ways,
00:48:54 however it manifests is probably it.
00:48:59 Last question.
00:49:00 Do you think we’re living in a simulation?
00:49:05 I don’t know.
00:49:06 It’s a pretty fun one if we are,
00:49:09 but also crazy and random and fraught with tons of problems.
00:49:15 But yeah.
00:49:17 Would you have it any other way?
00:49:19 Yeah.
00:49:20 I mean, I just think it’s taken us way too long
00:49:24 as a planet to realize we’re all in this together
00:49:27 and we all are connected in very significant ways.
00:49:34 I think we hide our connectivity very well through ego,
00:49:38 through whatever it is of the day.
00:49:42 But that is the one thing I would wanna work
00:49:46 towards changing and that’s how I would have it another way.
00:49:51 Cause if we can’t do that,
00:49:52 then how are we gonna connect to all the other simulations?
00:49:55 Cause that’s the next step is like
00:49:57 what’s happening in the other simulation.
00:49:58 Escaping this one and yeah.
00:50:03 Spanning across the multiple simulations
00:50:05 and sharing in on the fun.
00:50:07 I don’t think there’s a better way to end it.
00:50:09 Jack, thank you so much for all the work you do.
00:50:12 There’s probably other ways that we’ve ended this
00:50:13 and other simulations that may have been better.
00:50:16 We’ll have to wait and see.
00:50:18 Thanks so much for talking today.
00:50:19 Thank you.
00:50:21 Thanks for listening to this conversation with Jack Dorsey
00:50:24 and thank you to our sponsor, Masterclass.
00:50:26 Please consider supporting this podcast
00:50:29 by signing up to Masterclass at masterclass.com slash Lex.
00:50:34 If you enjoy this podcast, subscribe on YouTube,
00:50:37 review it with five stars on Apple Podcast,
00:50:39 support on Patreon or simply connect with me on Twitter
00:50:42 at Lex Fridman.
00:50:45 And now let me leave you with some words
00:50:47 about Bitcoin from Paul Graham.
00:50:50 I’m very intrigued by Bitcoin.
00:50:52 It has all the signs of a paradigm shift.
00:50:55 Hackers love it, yet it is described as a toy,
00:50:58 just like microcomputers.
00:51:01 Thank you for listening and hope to see you next time.