Transcript
00:00:00 If one site is hacked, you can just unleash all hell.
00:00:03 We have stumbled into this new era
00:00:06 of mutually assured digital destruction.
00:00:08 How far are people willing to go?
00:00:11 You can capture their location,
00:00:13 you can capture their contacts
00:00:16 you can record their telephone calls, record their camera
00:00:19 without them knowing about it.
00:00:20 Basically, you can put an invisible ankle bracelet
00:00:24 on someone without them knowing.
00:00:26 You could sell that to a zero day broker for $2 million.
00:00:34 The following is a conversation with Nicole Perlroth,
00:00:37 cybersecurity journalist and author
00:00:40 of This Is How They Tell Me The World Ends,
00:00:42 The Cyberweapons Arms Race.
00:00:44 This is the Lex Fridman Podcast.
00:00:46 To support it, please check out our sponsors
00:00:49 in the description.
00:00:50 And now, dear friends, here’s Nicole Perlroth.
00:00:54 You’ve interviewed hundreds of cybersecurity hackers,
00:00:58 activists, dissidents, computer scientists,
00:01:01 government officials, forensic investigators,
00:01:03 and mercenaries.
00:01:05 So let’s talk about cybersecurity and cyber war.
00:01:09 Start with the basics.
00:01:10 What is a zero day vulnerability?
00:01:13 And then a zero day exploit or attack?
00:01:18 So at the most basic level, let’s say I’m a hacker
00:01:22 and I find a bug in your iPhone iOS software
00:01:28 that no one else knows about, especially Apple.
00:01:31 That’s called a zero day because the minute it’s discovered,
00:01:34 engineers have had zero days to fix it.
00:01:37 If I can study that zero day,
00:01:40 I could potentially write a program to exploit it.
00:01:44 And that program would be called a zero day exploit.
00:01:48 And for iOS, the dream is that you craft a zero day exploit
00:01:54 that can remotely exploit someone else’s iPhone
00:01:57 without them ever knowing about it.
00:01:59 And you can capture their location,
00:02:01 you can capture their contacts
00:02:04 you can record their telephone calls,
00:02:06 record their camera without them knowing about it.
00:02:09 Basically, you can put an invisible ankle bracelet
00:02:13 on someone without them knowing.
00:02:15 And you can see why that capability,
00:02:17 that zero day exploit would have immense value
00:02:20 for a spy agency or a government
00:02:23 that wants to monitor its critics or dissidents.
00:02:27 And so there’s a very lucrative market now
00:02:30 for zero day exploits.
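The distinction she draws here, the bug itself versus the program that weaponizes it, can be sketched with a hypothetical toy (nothing here is from the conversation; the injection flaw and payload are purely illustrative):

```python
# Toy illustration: the *vulnerability* is the flaw, the *exploit* is
# code crafted to abuse it. This is a deliberately trivial stand-in.
def parse_command(user_input):
    # BUG: evaluating untrusted input is an injection vulnerability.
    return eval(user_input)

# Intended, benign use of the buggy code path:
assert parse_command("1 + 1") == 2

# An "exploit" is input crafted to abuse that same path. This payload
# merely reads the working directory; a real one would run attacker code.
payload = "__import__('os').getcwd()"
stolen = parse_command(payload)
assert isinstance(stolen, str)
```

A real zero day exploit does the same thing at a lower level (memory corruption rather than `eval`), but the structure is identical: an unintended code path, plus input that steers it.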
00:02:32 So you said a few things there.
00:02:33 One is iOS, why iOS, which operating system,
00:02:37 which one is the sexier thing to try to get to
00:02:40 or the most impactful thing?
00:02:42 And the other thing you mentioned is remote
00:02:45 versus like having to actually come
00:02:47 in physical contact with it.
00:02:49 Is that the distinction?
00:02:50 So iPhone exploits have just been
00:02:54 a government’s number one priority.
00:02:58 Recently, actually the price
00:03:00 of an Android remote zero day exploit,
00:03:03 something that can get you into Android phones
00:03:06 is actually higher.
00:03:08 The value of that is now higher on this underground market
00:03:10 for zero day exploits than an iPhone iOS exploit.
00:03:15 So things are changing.
00:03:16 So there’s probably more Android devices,
00:03:20 so that’s why it’s better.
00:03:21 But then the iPhone side,
00:03:24 so I’m an Android person,
00:03:26 because I’m a man of the people.
00:03:28 But it seems like all the elites use iPhone,
00:03:31 all the people at nice dinner parties.
00:03:33 So is that the reason that the more powerful people
00:03:37 use iPhones, is that why?
00:03:38 I don’t think so.
00:03:39 I actually, so it was about two years ago
00:03:42 that the prices flipped.
00:03:43 It used to be that if you could craft
00:03:46 a remote zero click exploit for iOS,
00:03:53 then that was about as good as it gets.
00:03:55 You could sell that to a zero day broker for $2 million.
00:04:01 The caveat is you can never tell anyone about it,
00:04:04 because the minute you tell someone about it,
00:04:07 Apple learns about it,
00:04:08 they patch it, and that $2.5 million investment
00:04:12 that that zero day broker just made goes to dust.
00:04:16 So a couple of years ago,
00:04:18 and don’t quote me on the prices,
00:04:20 but an Android zero click remote exploit
00:04:25 for the first time topped the iOS.
00:04:29 And actually a lot of people’s read on that
00:04:32 was that it might be a sign
00:04:35 that Apple security was falling,
00:04:40 and that it might actually be easier
00:04:43 to find an iOS zero day exploit
00:04:46 than find an Android zero day exploit.
00:04:48 The other thing is market share.
00:04:51 There are just more people around the world that use Android.
00:04:54 And a lot of governments that are paying top dollar
00:04:58 for zero day exploits these days
00:05:01 are deep pocketed governments in the Gulf
00:05:05 that wanna use these exploits
00:05:06 to monitor their own citizens, monitor their critics.
00:05:10 And so it’s not necessarily
00:05:12 that they’re trying to find elites,
00:05:14 it’s that they wanna find out who these people are
00:05:17 that are criticizing them
00:05:18 or perhaps planning the next Arab Spring.
00:05:21 So in your experience,
00:05:23 are most of these attacks targeted
00:05:24 to cover a large population,
00:05:26 or is there attacks that are targeted
00:05:29 towards specific individuals?
00:05:31 So I think it’s both.
00:05:32 Some of the zero day exploits that have fetched top dollar
00:05:36 that I’ve heard of in my reporting in the United States
00:05:39 were highly targeted.
00:05:41 There was a potential terrorist attack.
00:05:43 They wanted to get into this person’s phone.
00:05:45 It had to be done in the next 24 hours.
00:05:48 They approached hackers and said, we'll pay you
00:05:50 X millions of dollars if you can do this.
00:05:53 But then you look at,
00:05:55 when we’ve discovered iOS zero day exploits in the wild,
00:06:00 some of them have been targeting large populations
00:06:03 like Uyghurs.
00:06:05 So a couple of years ago,
00:06:07 there was a watering hole attack.
00:06:10 Okay, what’s a watering hole attack?
00:06:12 There’s a website,
00:06:13 it actually had information aimed at Uyghurs
00:06:17 and you could access it all over the world.
00:06:20 And if you visited this website,
00:06:24 it would drop an iOS zero day exploit onto your phone.
00:06:29 And so anyone that visited this website
00:06:32 that was about Uyghurs anywhere,
00:06:34 I mean, Uyghurs, Uyghurs living abroad,
00:06:37 basically the Uyghur diaspora would have gotten infected
00:06:42 with this zero day exploit.
00:06:43 So in that case, they were targeting huge swaths
00:06:49 of this one population or people interested
00:06:51 in this one population, basically in real time.
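The watering-hole pattern she describes can be modeled as a toy sketch (illustrative names only): rather than attacking targets one by one, the attacker compromises a site its targets already visit, so every visitor is served the exploit along with the normal page.

```python
# Toy model of a watering-hole attack, not real attack code: a marker
# string stands in for what was, in the real case, a drive-by iOS
# zero day exploit dropped onto each visitor's phone.
def serve_page(visitor: str, compromised: bool = True) -> dict:
    response = {"to": visitor, "html": "<h1>Community news</h1>"}
    if compromised:
        response["payload"] = "exploit-dropper"  # stand-in marker only
    return response

visitors = ["reader-1", "reader-2", "reader-3"]
infected = [v for v in visitors if "payload" in serve_page(v)]
# Indiscriminate by design: everyone who visits is hit.
assert infected == visitors
```

This is why a watering hole reaches "huge swaths" of a population: the targeting is done by the site's audience, not by the exploit.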
00:06:54 So who are these attackers?
00:06:59 From the individual level to the group level,
00:07:02 psychologically speaking, what’s their motivation?
00:07:05 Is it purely money?
00:07:07 Is it the challenge?
00:07:09 Are they malevolent?
00:07:10 Is it power?
00:07:12 These are big philosophical human questions, I guess.
00:07:15 So these are the questions I set out to answer for my book.
00:07:20 I wanted to know, are these people that are just after money?
00:07:26 If they’re just after money, how do they sleep at night?
00:07:29 Not knowing whether that zero day exploit
00:07:31 they just sold to a broker is being used
00:07:34 to basically make someone’s life a living hell.
00:07:38 And what I found was there's kind of this long, sordid history
00:07:41 to this question.
00:07:43 It started out in the 80s and 90s
00:07:46 when hackers were just finding holes and bugs in software
00:07:51 for curiosity’s sake, really as a hobby.
00:07:54 And some of them would go to the tech companies
00:07:56 like Microsoft or Sun Microsystems at the time or Oracle.
00:08:01 And they’d say, hey, I just found this zero day
00:08:04 in your software and I can use it to break into NASA.
00:08:08 And the general response at the time wasn’t,
00:08:11 thank you so much for pointing out this flaw
00:08:13 in our software, we'll get it fixed as soon as possible.
00:08:17 It was, don’t ever poke around our software ever again
00:08:21 or we’ll stick our general counsel on you.
00:08:24 And that was really sort of the common thread for years.
00:08:30 And so hackers who set out to do the right thing
00:08:34 were basically told to shut up
00:08:37 and stop doing what you’re doing.
00:08:40 And what happened next was they basically started trading
00:08:44 this information online.
00:08:46 Now, when you go back and interview people
00:08:48 from those early days, they all tell a very similar story,
00:08:53 which is they’re curious, they’re tinkerers.
00:08:57 They remind me of like the kid down the block
00:08:59 that was constantly poking around the hood of his dad’s car.
00:09:03 They just couldn’t help themselves.
00:09:06 They wanted to figure out how a system is designed
00:09:09 and how they could potentially exploit it
00:09:11 for some other purpose.
00:09:13 It doesn’t have to be good or bad.
00:09:15 But they were basically kind of beat down for so long
00:09:20 by these big tech companies
00:09:22 that they started just silently trading them
00:09:26 with other hackers.
00:09:28 And that’s how you got these really heated debates
00:09:32 in the 90s about disclosure.
00:09:35 Should you just dump these things online
00:09:38 because any script kiddie can pick them up
00:09:40 and use it for all kinds of mischief?
00:09:43 But don’t you wanna just stick a middle finger
00:09:46 to all these companies
00:09:47 that are basically threatening you all the time?
00:09:50 So there was this really interesting dynamic at play.
00:09:53 And what I learned in the course of doing my book
00:09:57 was that government agencies and their contractors
00:10:01 sort of tapped into that frustration and that resentment.
00:10:06 And they started quietly reaching out to hackers
00:10:09 on these forums.
00:10:11 And they said, hey, you know that zero day
00:10:13 you just dropped online,
00:10:14 could you come up with something custom for me?
00:10:17 And I’ll pay you six figures for it
00:10:20 so long as you shut up and never tell anyone
00:10:22 that I paid you for this.
00:10:24 And that’s what happened.
00:10:27 So throughout the 90s,
00:10:28 there was a bunch of boutique contractors
00:10:31 that started reaching out to hackers on these forums
00:10:34 and saying, hey, I’ll pay you six figures
00:10:37 for that bug you were trying to get Microsoft
00:10:39 to fix for free.
00:10:41 And sort of so began or so catalyzed this market
00:10:45 where governments and their intermediaries
00:10:48 started reaching out to these hackers
00:10:50 and buying their bugs.
00:10:53 And in those early days,
00:10:54 I think a lot of it was just for quiet counterintelligence,
00:10:57 traditional espionage.
00:11:00 But as we started baking the software,
00:11:04 Windows software, Schneider Electric,
00:11:07 Siemens industrial software into our nuclear plants
00:11:11 and our factories and our power grid
00:11:14 and our petrochemical facilities and our pipelines,
00:11:18 those same zero days came to be just as valuable
00:11:22 for sabotage and war planning.
00:11:25 Does the fact that the market sprung up
00:11:27 and you can now make a lot of money
00:11:28 change the nature of the attackers that came to the table
00:11:31 or grow the number of attackers?
00:11:34 I mean, what is, I guess,
00:11:35 you told the psychology of the hackers in the 90s,
00:11:40 what is the culture today and where is it heading?
00:11:43 So I think there are people who will tell you
00:11:47 they would never sell a zero day
00:11:49 to a zero day broker or a government.
00:11:52 One, because they don’t know how it’s gonna get used
00:11:54 when they throw it over the fence.
00:11:56 Most of these get rolled into classified programs
00:11:58 and you don’t know how they get used.
00:12:01 If you sell it to a zero day broker,
00:12:02 you don’t even know which nation state might use it
00:12:06 or potentially which criminal group might use it
00:12:09 if you sell it on the dark web.
00:12:11 The other thing that they say is that
00:12:15 they wanna be able to sleep at night.
00:12:17 And they lose a lot of sleep
00:12:19 if they found out their zero day was being used
00:12:22 to make a dissident’s life living hell.
00:12:25 But there are a lot of people, good people,
00:12:28 who also say, no, this is not my problem.
00:12:32 This is the technology company’s problem.
00:12:35 If they weren’t writing new bugs
00:12:37 into their software every day,
00:12:39 then there wouldn’t be a market.
00:12:41 Then there wouldn’t be a problem.
00:12:42 But they continue to write bugs
00:12:44 into their software all the time
00:12:46 and they continue to profit off that software.
00:12:48 So why shouldn’t I profit off my labor too?
00:12:53 And one of the things that has happened,
00:12:55 which is I think a positive development
00:12:57 over the last 10 years, are bug bounty programs.
00:13:02 Companies like Google and Facebook
00:13:05 and then Microsoft and finally Apple,
00:13:07 which resisted it for a really long time,
00:13:10 have said, okay, we are gonna shift our perspective
00:13:14 about hackers.
00:13:15 We’re no longer going to treat them as the enemy here.
00:13:18 We’re going to start paying them
00:13:20 for what is essentially free quality assurance.
00:13:23 And we’re gonna pay them good money in some cases,
00:13:26 six figures in some cases.
00:13:28 We’re never gonna be able to bid against a zero day broker
00:13:32 who sells to government agencies.
00:13:34 But we can reward them and hopefully get to that bug earlier
00:13:38 where we can neutralize it
00:13:40 so that they don’t have to spend another year
00:13:43 developing the zero day exploit.
00:13:44 And in that way, we can keep our software more secure.
00:13:48 But every week I get messages from some hacker that says,
00:13:53 you know, you see this zero day exploit
00:13:55 that was just found in the wild,
00:13:58 being used by this nation state?
00:14:00 I tried to tell Microsoft about this two years ago
00:14:04 and they were gonna pay me peanuts so it never got fixed.
00:14:08 There are all sorts of those stories that can continue on.
00:14:12 And I think just generally,
00:14:16 hackers are not very good at diplomacy.
00:14:19 They tend to be pretty snipey, technical crowd.
00:14:24 And very philosophical in my experience.
00:14:28 But diplomacy is not their strong suit.
00:14:31 Oh, there almost has to be a broker
00:14:33 between companies and hackers.
00:14:35 who can translate effectively,
00:14:37 just like you have a zero day broker
00:14:39 between governments and hackers.
00:14:41 You have to speak their language.
00:14:43 Yeah, and there have been some of those companies
00:14:45 who’ve risen up to meet that demand.
00:14:47 And HackerOne is one of them.
00:14:50 Bugcrowd is another.
00:14:52 Synack has an interesting model, so that’s a company
00:14:55 that you pay for a private bug bounty program essentially.
00:14:59 So you pay this company, they tap hackers all over the world
00:15:04 to come hack your software, hack your system.
00:15:07 And then they’ll quietly tell you what they found.
00:15:10 And I think that’s a really positive development.
00:15:13 And actually, the Department of Defense
00:15:16 hired all three of those companies I just mentioned
00:15:20 to help secure their systems.
00:15:22 Now I think they’re still a little timid
00:15:24 in terms of letting those hackers
00:15:25 into the really sensitive, high side classified stuff.
00:15:30 But you know, baby steps.
00:15:33 Just to understand what you were saying,
00:15:34 you think it’s impossible for companies
00:15:37 to financially compete with the zero day brokers,
00:15:40 with governments.
00:15:42 So like the defense can’t outpay the hackers?
00:15:47 It’s interesting, they shouldn’t outpay them.
00:15:51 Because what would happen
00:15:53 if they started offering $2.5 million at Apple
00:15:59 for any zero day exploit
00:16:02 that governments would pay that much for,
00:16:04 is their own engineers would say,
00:16:06 why the hell am I working for less than that
00:16:10 and doing my nine to five every day?
00:16:12 So you would create a perverse incentive.
00:16:14 And I didn’t think about that until I started this research
00:16:18 and I realized, okay, yeah, that makes sense.
00:16:20 You don’t want to incentivize offense so much
00:16:25 that it’s to your own detriment.
00:16:27 And so I think what they have though,
00:16:29 what the companies have on government agencies,
00:16:32 is if they pay you, you get to talk about it.
00:16:36 You know, you get the street cred.
00:16:38 You get to brag about the fact you just found
00:16:41 that $2.5 million, you know, iOS zero day
00:16:45 that no one else did.
00:16:47 And if you sell it to a broker,
00:16:48 you never get to talk about it.
00:16:50 And I think that really does eat at people.
00:16:53 Can I ask you a big philosophical question
00:16:55 about human nature here?
00:16:57 So if you have, I mean, what you’ve seen,
00:17:00 if a human being has a zero day,
00:17:03 they found a zero day vulnerability that can hack into,
00:17:09 I don’t know, what’s the worst thing you can hack into?
00:17:11 Something that could launch nuclear weapons.
00:17:14 Which percentage of the people in the world
00:17:16 that have the skill would not share that with anyone,
00:17:20 with any bad party?
00:17:23 I guess how many people are completely devoid
00:17:27 of ethical concerns in your sense?
00:17:31 So my belief is all the ultra competent people
00:17:36 or very, very high percentage of ultra competent people
00:17:39 are also ethical people.
00:17:41 That’s been my experience.
00:17:42 But then again, my experience is narrow.
00:17:45 What’s your experience been like?
00:17:48 So this was another question I wanted to answer.
00:17:53 Who are these people who would sell a zero day exploit
00:17:57 that would neutralize a Schneider Electric safety lock
00:18:01 at a petrochemical plant?
00:18:03 Basically the last thing you would need to neutralize
00:18:05 before you trigger some kind of explosion.
00:18:07 Who would sell that?
00:18:11 And I got my answer,
00:18:14 well, the answer was different.
00:18:16 A lot of people said, I would never even look there
00:18:19 because I don’t even wanna know.
00:18:21 I don’t even wanna have that capability.
00:18:22 I don’t even wanna have to make that decision
00:18:26 about whether I’m gonna profit off of that knowledge.
00:18:29 I went down to Argentina
00:18:31 and this whole kind of moral calculus I had in my head
00:18:36 was completely flipped around.
00:18:39 So just to back up for a moment.
00:18:41 So Argentina actually is a real hacker’s paradise.
00:18:47 People grew up in Argentina and I went down there,
00:18:50 I guess I was there around 2015, 2016,
00:18:54 but you still couldn’t get an iPhone.
00:18:57 They didn’t have Amazon Prime.
00:18:58 You couldn’t get access to any of the apps
00:19:00 we all take for granted.
00:19:02 To get those things in Argentina as a kid,
00:19:05 you have to find a way to hack them.
00:19:07 And the whole culture is really like a hacker culture.
00:19:12 They say it’s really like a MacGyver culture.
00:19:15 You have to figure out how to break into something
00:19:17 with wire and tape.
00:19:19 And that means that there are a lot of really good hackers
00:19:24 in Argentina who specialize in developing zero day exploits.
00:19:30 And I went down to this Argentine conference
00:19:33 called Ekoparty.
00:19:35 And I asked the organizer, okay, can you introduce me
00:19:39 to someone who’s selling zero to exploits to governments?
00:19:43 And he was like, just throw a stone.
00:19:46 Throw a stone anywhere and you’re gonna hit someone.
00:19:48 And all over this conference, you saw these guys
00:19:52 who were clearly from these Gulf States
00:19:54 who only spoke Arabic.
00:19:55 What are they doing at a young hacking conference
00:19:59 in Buenos Aires?
00:20:01 And so I went out to lunch with kind of this godfather
00:20:05 of the hacking scene there.
00:20:07 And I asked this really dumb question
00:20:10 and I’m still embarrassed about how I phrased it.
00:20:13 But I said, so will these guys only sell
00:20:16 these zero day exploits to good Western governments?
00:20:20 And he said, Nicole, last time I checked,
00:20:22 the United States wasn’t a good Western government.
00:20:25 The last country that bombed another country
00:20:28 into oblivion wasn’t China or Iran,
00:20:31 it was the United States.
00:20:33 So if we’re gonna go by your whole moral calculus,
00:20:36 just know that we have a very different calculus down here
00:20:39 and we’d actually rather sell to Iran or Russia
00:20:44 or China maybe than the United States.
00:20:46 And that just blew me away.
00:20:48 Like, wow, he’s like, we’ll just sell
00:20:51 to whoever brings us the biggest bag of cash.
00:20:53 Have you checked into our inflation situation recently?
00:20:57 So I had some of those like reality checks along the way.
00:21:02 We tend to think of things as is this moral,
00:21:05 is this ethical, especially as journalists.
00:21:08 And we kind of sit on our high horse sometimes
00:21:11 and write about a lot of things
00:21:13 that seem to push the moral bounds.
00:21:16 But in this market, which is essentially
00:21:18 an underground market where the one rule is like fight club:
00:21:22 no one talks about fight club.
00:21:24 First rule of the zero day market,
00:21:25 nobody talks about the zero day market on both sides
00:21:29 because the hacker doesn’t wanna lose
00:21:30 their $2.5 million bounty.
00:21:33 And governments roll these into classified programs
00:21:36 and they don’t want anyone to know what they have.
00:21:39 So no one talks about this thing.
00:21:41 And when you’re operating in the dark like that,
00:21:43 it’s really easy to put aside your morals sometimes.
00:21:48 Can I, as a small tangent, ask you, by way of advice,
00:21:52 you must have done some incredible interviews.
00:21:55 And you’ve also spoken about how serious
00:21:58 you take protecting your sources.
00:22:01 If you were to give me advice for interviewing
00:22:04 when you’re recording on mic with a video camera,
00:22:10 how is it possible to get into this world?
00:22:13 Like is it basically impossible?
00:22:16 So you’ve spoken with a few people,
00:22:19 what is it like the godfather of cyber war, cyber security?
00:22:23 So people that are already out.
00:22:25 And they still have to be pretty brave to speak publicly.
00:22:29 But is it virtually impossible to really talk to anybody
00:22:32 who is a current hacker?
00:22:34 Are you always like 10, 20 years behind?
00:22:37 It’s a good question.
00:22:38 And this is why I’m a print journalist.
00:22:41 But when I’ve seen people do it,
00:22:45 it’s always the guy who’s behind the shadows,
00:22:49 whose voice has been altered.
00:22:51 When they’ve gotten someone on camera,
00:22:53 that’s usually how they do it.
00:22:56 Very, very few people talk in this space.
00:22:58 And there’s actually a pretty well known case study
00:23:02 in why you don’t talk publicly in this space
00:23:04 and you don’t get photographed.
00:23:05 And that’s the Grugq.
00:23:07 So the Grugq is, or was, this zero day broker,
00:23:12 South African guy, lives in Thailand.
00:23:15 And right when I was starting on this subject
00:23:18 at the New York Times, he’d given an interview to Forbes.
00:23:22 And he talked about being a zero day broker.
00:23:25 And he even posed next to this giant duffel bag
00:23:29 filled with cash, ostensibly.
00:23:31 And later he would say he was speaking off the record.
00:23:35 He didn’t understand the rules of the game.
00:23:38 But what I heard from people who did business with him
00:23:41 was that the minute that that story came out,
00:23:43 he became PNG’d, persona non grata.
00:23:45 No one did business with him.
00:23:47 His business plummeted by at least half.
00:23:50 No one wants to do business with anyone
00:23:52 who’s going to get on camera and talk
00:23:54 about how they’re selling zero days to governments.
00:23:58 It puts you at danger.
00:23:59 And I did hear that he got some visits
00:24:01 from some security folks.
00:24:04 And that’s another thing for these people to consider.
00:24:06 If they have those zero day exploits at their disposal,
00:24:12 they become a huge target for nation states
00:24:16 all over the world.
00:24:18 Talk about having perfect opsec.
00:24:20 You better have some perfect opsec
00:24:23 if people know that you have access to those zero day
00:24:26 exploits.
00:24:27 Which sucks because, I mean, transparency here
00:24:33 would be really powerful for educating the world
00:24:36 and also inspiring other engineers to do good.
00:24:40 It just feels like when you operate in the shadows,
00:24:43 it doesn’t help us move in the positive direction in terms
00:24:46 of getting more people on the defense side
00:24:48 versus on the attack side.
00:24:50 But of course, what can you do?
00:24:52 I mean, the best you can possibly do
00:24:53 is have great journalists, just like you did,
00:24:57 interview and write books about it,
00:24:58 and integrate the information you get
00:25:01 while hiding the sources.
00:25:02 Yeah, and I think what HackerOne has told me was, OK,
00:25:07 let’s just put away the people that
00:25:09 are finding and developing zero day exploits all day long.
00:25:13 Let’s put that aside.
00:25:15 What about however many millions of programmers
00:25:19 all over the world who’ve never even heard of a zero day
00:25:22 exploit?
00:25:23 Why not tap into them and say, hey, we’ll
00:25:26 start paying you if you can find a bug in United Airlines
00:25:31 software or in Schneider Electric or in Ford or Tesla?
00:25:36 And I think that is a really smart approach.
00:25:39 Let’s go find this untapped army of programmers
00:25:43 to neutralize these bugs before the people who will continue
00:25:47 to sell these to governments can find them and exploit them.
00:25:50 OK, I have to ask you about this.
00:25:53 From a personal side, it’s funny enough,
00:25:55 after we agreed to talk, I’ve gotten,
00:25:59 for the first time in my life, was a victim of a cyber attack.
00:26:06 So this is ransomware.
00:26:07 It’s called Deadbolt.
00:26:08 People can look it up.
00:26:10 I have a QNAP device for basically kind
00:26:14 of coldish storage.
00:26:15 So it’s about 60 terabytes with 50 terabytes of data on it
00:26:20 in RAID 5.
00:26:21 And apparently, about 4,000 to 5,000 QNAP devices
00:26:27 were hacked and taken over with this ransomware.
00:26:30 And what ransomware does there is it goes file by file,
00:26:35 almost all the files on the QNAP storage device,
00:26:39 and encrypts them.
00:26:40 And then there’s this very eloquently and politely
00:26:43 written page that pops up, describes what happened.
00:26:48 All your files have been encrypted.
00:26:50 This includes but is not limited to photos, documents,
00:26:52 and spreadsheets.
00:26:53 Why me?
00:26:56 This is a lot of people commented
00:26:57 about how friendly and eloquent this is.
00:27:00 And I have to commend them.
00:27:01 It is, and it’s pretty user friendly.
00:27:05 Why me?
00:27:06 This is not a personal attack.
00:27:08 You have been targeted because of the inadequate security
00:27:10 provided by your vendor, QNAP.
00:27:15 What now?
00:27:16 You can make a payment of exactly 0.03 Bitcoin,
00:27:19 which is about $1,000, to the following address.
00:27:23 Once the payment has been made, we’ll
00:27:25 follow up with a transaction to the same address,
00:27:27 blah, blah, blah.
00:27:28 They give you instructions of what happens next,
00:27:31 and they’ll give you a decryption key
00:27:32 that you can then use.
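The file-by-file encryption he describes can be modeled with a minimal sketch. This is NOT how Deadbolt actually works internally, just a toy keyed stream cipher showing the pattern: every file becomes ciphertext, and only the attacker-held key restores it. Real ransomware uses hybrid public-key crypto so the key never touches the victim's machine in usable form.

```python
# Toy model of file-by-file ransomware encryption (illustrative only):
# a keystream derived from SHA-256 of key + counter, XORed over each file.
import hashlib

def keystream(key: bytes, n: int) -> bytes:
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:n]

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

key = b"attacker-held-secret"
files = {"photo.jpg": b"\xff\xd8 raw bytes", "notes.txt": b"meeting notes"}
encrypted = {name: xor_crypt(data, key) for name, data in files.items()}

# Without the key the victim holds only ciphertext; with it, recovery is exact.
restored = {name: xor_crypt(data, key) for name, data in encrypted.items()}
assert restored == files
```

The decryption key the note promises is exactly this: the one missing input that makes the transformation reversible.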
00:27:34 And then there’s another message for QNAP that says,
00:27:38 all your affected customers have been targeted using
00:27:41 a zero day vulnerability in your product.
00:27:43 We offer you two options to mitigate this and future damage.
00:27:48 One, make a Bitcoin payment of 5 Bitcoin
00:27:51 to the following address, and that
00:27:54 will reveal to QNAP, I’m summarizing things here,
00:27:58 what the actual vulnerability is.
00:28:00 Or you can make a Bitcoin payment of 50 Bitcoin
00:28:03 to get a master decryption key for all your customers.
00:28:06 50 Bitcoin is about $1.8 million.
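The figures quoted check out roughly. The implied Bitcoin price below is backed out of the transcript's own numbers, not a market quote:

```python
# Sanity check of the ransom figures quoted above.
victim_btc, victim_usd = 0.03, 1_000
implied_usd_per_btc = victim_usd / victim_btc        # ~ $33,000 per BTC

master_key_btc = 50
master_key_usd = master_key_btc * implied_usd_per_btc
# ~ $1.67M at the implied rate, consistent with the "~$1.8 million"
# figure, which evidently used a slightly higher BTC price.
assert 1_500_000 < master_key_usd < 2_000_000
```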
00:28:10 OK.
00:28:11 So first of all, on a personal level, this one hurt for me.
00:28:18 There’s, I mean, I learned a lot because I wasn’t,
00:28:22 for the most part, backing up much of that data
00:28:26 because I thought I can afford to lose that data.
00:28:30 It’s not horrible.
00:28:32 I mean, I think you’ve spoken about the crown jewels,
00:28:35 like making sure there’s things you really protect.
00:28:38 And I have, you know, I’m very conscious,
00:28:42 security wise, on the crown jewels.
00:28:45 But there’s a bunch of stuff, like, you know,
00:28:48 personal videos that are not, like,
00:28:49 I don’t have anything creepy, but just, like,
00:28:51 fun things I did that because they’re very large or 4K
00:28:55 or something like that, I kept them on there,
00:28:57 thinking RAID 5 will protect it.
00:28:59 You know, just I lost a bunch of stuff, including raw footage
00:29:05 from interviews and all that kind of stuff.
00:29:08 So it’s painful.
00:29:09 And I’m sure there’s a lot of painful stuff
00:29:11 like that for the 4,000 to 5,000 people that use QNAP.
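The "thinking RAID 5 will protect it" point is worth unpacking: RAID 5 guards against a failed disk via XOR parity, but it does nothing against ransomware, which overwrites the data on every disk at once. A minimal single-stripe sketch of the parity math:

```python
# Single-stripe RAID 5 parity sketch: XOR of the data blocks is stored
# on a parity disk; any one lost block can be rebuilt from the rest.
def parity(blocks):
    p = bytes(len(blocks[0]))
    for b in blocks:
        p = bytes(x ^ y for x, y in zip(p, b))
    return p

data = [b"AAAA", b"BBBB", b"CCCC"]   # three data disks, one stripe each
p = parity(data)                     # the parity disk

# Disk 1 dies: rebuild its stripe from the survivors plus parity.
rebuilt = parity([data[0], data[2], p])
assert rebuilt == data[1]
```

If ransomware encrypts the files, the array dutifully computes parity over the ciphertext; redundancy is not a backup.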
00:29:15 And there’s a lot of interesting ethical questions here.
00:29:18 Do you pay them?
00:29:20 Does QNAP pay them?
00:29:23 Do the individuals pay them, especially when
00:29:26 you don’t know if it’s going to work or not?
00:29:29 Do you wait?
00:29:30 So QNAP said, please don’t pay them.
00:29:35 We’re working very hard day and night to solve this.
00:29:41 It’s so philosophically interesting to me
00:29:44 because I also project onto them thinking,
00:29:46 what is their motivation?
00:29:48 Because the way they phrased it, on purpose, perhaps,
00:29:51 but I’m not sure if that actually reflects their real motivation,
00:29:54 is maybe they’re trying to help themselves sleep at night,
00:29:59 basically saying, this is not about you.
00:30:01 This is about the company with the vulnerabilities.
00:30:04 Just like you mentioned, this is the justification they have.
00:30:07 But they’re hurting real people.
00:30:09 They hurt me.
00:30:10 But I’m sure there’s a few others that are really hurt.
00:30:14 And the zero day factor is a big one.
00:30:18 So QNAP right now is trying to figure out
00:30:22 what the hell is wrong with their system that would let this in.
00:30:25 And even if they pay, if they still don’t know where the zero
00:30:30 day is, what’s to say that they won’t just hit them again
00:30:32 and hit you again?
00:30:34 So that really complicates things.
00:30:36 And that is a huge advancement for ransomware.
00:30:40 It’s really only been, I think, in the last 18 months
00:30:44 that we’ve ever really seen ransomware exploit zero days
00:30:48 to pull these off.
00:30:49 Usually, 80% of them, I think the data shows 80% of them
00:30:54 come down to a lack of two factor authentication.
00:30:58 So when someone gets hit by a ransomware attack,
00:31:01 they don’t have two factor authentication on.
00:31:04 Their employees were using stupid passwords.
00:31:07 You can mitigate that in the future.
00:31:09 This one, they don’t know.
00:31:10 They probably don’t know.
00:31:11 Yeah.
00:31:12 And I guess it’s zero click because I
00:31:14 didn’t have to do anything.
00:31:16 The only thing, well, here’s the thing.
00:31:21 I did the basics: I put it behind a firewall.
00:31:26 I followed instructions.
00:31:27 But I didn’t really pay attention.
00:31:30 So maybe there’s a misconfiguration of some sort
00:31:34 that’s easy to make.
00:31:36 It’s difficult. We have a personal NAS.
00:31:40 So I’m not willing to say that I did
00:31:43 everything I possibly could.
00:31:47 But I did a lot of reasonable stuff.
00:31:49 And they still hit it with zero clicks.
00:31:51 I didn’t have to do anything.
00:31:52 Yeah, well, it’s like a zero day.
00:31:54 And it’s a supply chain attack.
00:31:56 You’re getting hit from your supplier.
00:31:59 You’re getting hit because of your vendor.
00:32:01 And it’s also a new thing for ransomware groups
00:32:04 to go to the individuals to pressure them to pay.
00:32:07 There was this really interesting case.
00:32:09 I think it was in Norway where there was a mental health
00:32:13 clinic that got hit.
00:32:16 And the cybercriminals were going to the patients
00:32:18 themselves to say, pay this, or we’re
00:32:22 going to release your psychiatric records.
00:32:25 I mean, talk about hell.
00:32:28 In terms of whether to pay, that is on the cheaper
00:32:31 end of the spectrum.
00:32:33 From the individual or from the company?
00:32:35 Both.
00:32:36 We’ve seen, for instance, there was an Apple supplier in Taiwan.
00:32:42 They got hit.
00:32:43 And the ransom demand was $50 million.
00:32:47 I’m surprised it’s only $1.8 million.
00:32:49 I’m sure it’s going to go up.
00:32:52 And it’s hard.
00:32:53 There’s obviously governments, and maybe in this case,
00:32:57 the company are going to tell you,
00:32:58 we recommend you don’t pay or please don’t pay.
00:33:02 But the reality on the ground is that some businesses
00:33:06 can’t operate.
00:33:07 Some countries can’t function.
00:33:09 I mean, the underreported storyline of Colonial Pipeline
00:33:15 was after the company got hit and took
00:33:19 the preemptive step of shutting down the pipeline
00:33:22 because their billing systems were frozen,
00:33:24 they couldn’t charge customers downstream.
00:33:27 My colleague David Sanger and I got our hands
00:33:30 on a classified assessment that said that as a country,
00:33:35 we could have only afforded two to three more days
00:33:38 of Colonial Pipeline being down.
00:33:40 And it was really interesting.
00:33:42 I thought it was the gas and the jet fuel, but it wasn’t.
00:33:45 We were sort of prepared for that.
00:33:47 It was the diesel.
00:33:48 Without the diesel, the refineries couldn’t function,
00:33:52 and it would have totally screwed up the economy.
00:33:54 And so there was almost this national security
00:33:59 economic impetus for them to pay this ransom.
00:34:04 And the other one I always think about is Baltimore.
00:34:07 When the city of Baltimore got hit,
00:34:09 I think the initial ransom demand
00:34:11 was something around $76,000.
00:34:13 It may have even started smaller than that.
00:34:16 And Baltimore stood its ground and didn’t pay.
00:34:20 But ultimately, the cost to remediate was $18 million.
00:34:25 That’s a lot for the city of Baltimore.
00:34:26 That’s money that could have gone to public school education
00:34:29 and roads and public health.
00:34:32 And instead, it just went to rebuilding these systems
00:34:35 from scratch.
00:34:36 And so a lot of residents in Baltimore
00:34:38 were like, why the hell didn’t you pay the $76,000?
00:34:43 So it’s not obvious.
00:34:46 It’s easy to say, don’t pay.
00:34:48 Because why?
00:34:48 You’re funding their R&D for the next go round.
00:34:52 But too often, it’s too complicated.
00:34:56 So on the individual level, just like the way
00:35:00 I feel personally from this attack,
00:35:03 have you talked to people that were kind of victims
00:35:05 in the same way I was, but maybe more dramatic ways or so on,
00:35:09 in the same way that violence hurts people?
00:35:13 How much does this hurt people in your sense
00:35:15 and the way you researched it?
00:35:16 The worst ransomware attack I’ve covered on a personal level
00:35:23 was an attack on a hospital in Vermont.
00:35:28 And you think of this as like, OK,
00:35:30 it’s hitting their IT networks.
00:35:31 They should still be able to treat patients.
00:35:34 But it turns out that cancer patients
00:35:37 couldn’t get their chemo anymore.
00:35:39 Because the protocol of who gets what is very complicated.
00:35:43 And without it, nurses and doctors couldn’t access it.
00:35:47 So they were turning chemo patients away,
00:35:50 cancer patients away.
00:35:52 One nurse told us, I don’t know why people
00:35:55 aren’t screaming about this, that the only thing I’ve
00:35:58 seen that even compares to what we’re
00:36:00 seeing at this hospital right now
00:36:02 was when I worked in the burn unit
00:36:04 after the Boston Marathon bombing.
00:36:06 They really put it in these super dramatic terms.
00:36:10 And last year there was a report in the Wall Street Journal
00:36:15 where they attributed an infant death to a ransomware attack
00:36:20 because a mom came in and whatever device
00:36:25 they were using to monitor the fetus
00:36:28 wasn’t working because of the ransomware attack.
00:36:30 And so they attributed this infant death
00:36:33 to the ransomware attack.
00:36:34 Now on a bigger scale but less personal,
00:36:39 when there was the NotPetya attack.
00:36:41 So this was an attack by Russia on Ukraine
00:36:46 that came at them through a supplier, a tax software
00:36:51 company in that case, that didn’t just
00:36:53 hit any government agency or business in Ukraine
00:36:57 that used this tax software.
00:36:59 It actually hit any business all over the world that
00:37:02 had even a single employee working remotely in Ukraine.
00:37:07 So it hit Maersk, the shipping company, hit Pfizer,
00:37:10 hit FedEx, but the one I will never forget is Merck.
00:37:14 It paralyzed Merck’s factories.
00:37:17 I mean, it really created an existential crisis
00:37:20 for the company.
00:37:21 Merck had to tap into the CDC’s emergency supplies
00:37:25 of the Gardasil vaccine that year
00:37:27 because their whole vaccine production line had been
00:37:30 paralyzed in that attack.
00:37:32 Imagine if that was going to happen right now
00:37:36 to Pfizer or Moderna or Johnson and Johnson.
00:37:39 Imagine.
00:37:41 I mean, that would really create a global cyber terrorist
00:37:46 attack, essentially.
00:37:47 And that’s almost unintentional.
00:37:49 I thought for a long time, I always
00:37:51 labeled it as collateral damage.
00:37:54 But actually, just today, there was a really impressive threat
00:37:59 researcher at Cisco, which has this threat intelligence
00:38:04 division called Talos, who said, stop calling it
00:38:07 collateral damage.
00:38:08 They could see who was going to get hit before they
00:38:12 deployed that malware.
00:38:15 It wasn’t collateral damage.
00:38:17 It was intentional.
00:38:19 They meant to hit any business that did business with Ukraine.
00:38:23 It was to send a message to them, too.
00:38:26 So I don’t know if that’s accurate.
00:38:28 I always thought of it as sort of the sloppy collateral
00:38:31 damage, but it definitely made me think.
00:38:34 So how much of this between states
00:38:37 is going to be a part of war, these kinds of attacks
00:38:42 on Ukraine between Russia and US, Russia and China,
00:38:48 China and US?
00:38:51 Let’s look at China and US.
00:38:53 Do you think China and US are going
00:38:56 to escalate something that would be called a war purely
00:39:01 in the space of cyber?
00:39:04 I believe any geopolitical conflict from now on
00:39:12 is guaranteed to have some cyber element to it.
00:39:17 The Department of Justice recently
00:39:19 declassified a report that said China has been hacking
00:39:21 into our pipelines, and it’s not for intellectual property
00:39:24 theft.
00:39:25 It’s to get a foothold so that if things escalate in Taiwan,
00:39:29 for example, they are where they need
00:39:31 to be to shut our pipelines down.
00:39:33 And we just got a little glimpse of what
00:39:35 that looked like with Colonial Pipeline and the panic buying
00:39:39 and the jet fuel shortages and that assessment I just
00:39:42 mentioned about the diesel.
00:39:44 So they’re there.
00:39:47 They’ve gotten there.
00:39:49 Anytime I read a report about new aggression from fighter
00:39:54 jets, Chinese fighter jets in Taiwan,
00:39:57 or what’s happening right now with Russia’s buildup
00:40:00 on the Ukraine border, or India, Pakistan,
00:40:04 I’m always looking at it through a cyber lens.
00:40:07 And it really bothers me that other people aren’t,
00:40:11 because there is no way that these governments
00:40:15 and these nation states are not going
00:40:17 to use their access to gain some advantage in those conflicts.
00:40:23 And I’m now in a position where I’m
00:40:27 an advisor to the Cybersecurity and Infrastructure
00:40:32 Security Agency (CISA) at DHS.
00:40:33 So I’m not saying anything classified here.
00:40:37 But I just think that it’s really important
00:40:41 to understand just generally what the collateral damage
00:40:45 could be for American businesses and critical infrastructure
00:40:49 in any of these escalated conflicts around the world.
00:40:54 Because just generally, our adversaries
00:40:57 have learned that they might never
00:41:01 be able to match us in terms of our traditional military
00:41:04 spending on traditional weapons and fighter jets.
00:41:08 But we have a very soft underbelly
00:41:10 when it comes to cyber.
00:41:12 80% or more of America’s critical infrastructure,
00:41:17 so pipelines, power grid, nuclear plants, water systems,
00:41:23 is owned and operated by the private sector.
00:41:26 And for the most part, there is nothing out there legislating
00:41:31 that those companies share the fact they’ve been breached.
00:41:35 They don’t even have to tell the government they’ve been hit.
00:41:38 There’s nothing mandating that they even
00:41:40 meet a bare minimum standard of cybersecurity.
00:41:44 And that’s it.
00:41:46 So even when there are these attacks, most of the time,
00:41:49 we don’t even know about it.
00:41:51 So that is, if you were going to design a system
00:41:54 to be as blind and vulnerable as possible,
00:41:57 that’s pretty good.
00:42:00 That’s what it looks like, and that’s what we have here
00:42:02 in the United States.
00:42:04 And everyone here is just operating like,
00:42:08 let’s just keep hooking up everything for convenience.
00:42:12 Software eats the world.
00:42:14 Let’s just keep going for cost, for convenience sake,
00:42:18 just because we can.
00:42:20 And when you study these issues and you study these attacks
00:42:24 and you study the advancement and the uptick in frequency
00:42:29 and the lower barrier to entry that we see every single year,
00:42:34 you realize just how dumb “software eats the world” is.
00:42:39 And no one has ever stopped to pause and think,
00:42:43 should we be hooking up these systems to the internet?
00:42:47 They’ve just been saying, can we?
00:42:49 Let’s do it.
00:42:51 And that’s a real problem.
00:42:52 And just in the last year, we’ve seen a record number
00:42:55 of zero day attacks.
00:42:56 I think there were 80 last year, which
00:42:59 is probably more than double what it was in 2019.
00:43:03 A lot of those were nation states.
00:43:06 We live in a world with a lot of geopolitical hot points
00:43:10 right now.
00:43:11 And where those geopolitical hot points are
00:43:15 are places where countries have been investing heavily
00:43:19 in offensive cyber tools.
00:43:21 If you’re a nation state, the goal
00:43:25 would be to maximize the footprint of zero day,
00:43:29 like super secret zero day that nobody is aware of.
00:43:33 And whenever war is initiated, the huge negative effects
00:43:37 of shutting down infrastructure or any kind of zero day
00:43:39 is the chaos it creates.
00:43:41 So if you just, there’s a certain threshold
00:43:43 when you create the chaos.
00:43:45 The market’s plummeted.
00:43:46 Just everything goes to hell.
00:43:51 I mean, it’s not just zero days.
00:43:52 We make it so easy for threat actors.
00:43:56 I mean, we’re not using two factor authentication.
00:44:00 We’re not patching.
00:44:02 There was the Shellshock vulnerability
00:44:04 that was discovered a couple of years ago.
00:44:08 It’s still being exploited because so many people
00:44:11 haven’t fixed it.
00:44:13 So the zero days are really the sexy stuff.
00:44:17 And what really drew me to the zero day market
00:44:19 was the moral calculus we talked about, particularly
00:44:24 from the US government’s point of view.
00:44:26 How do they justify leaving these systems so vulnerable
00:44:31 when we use them here and we’re baking
00:44:34 more of our critical infrastructure
00:44:36 with this vulnerable software?
00:44:38 It’s not like we’re using one set of technology
00:44:41 and Russia is using another and China is using this.
00:44:43 We’re all using the same technology.
00:44:45 So when you find a zero day in Windows,
00:44:49 you’re not just leaving it open so you can spy on Russia
00:44:52 or implant yourself in the Russian grid.
00:44:54 You’re leaving Americans vulnerable too.
00:44:58 But zero days are like, that is the secret sauce.
00:45:02 That’s the superpower.
00:45:04 And I always say every country now,
00:45:07 with the exception of Antarctica,
00:45:09 someone added the Vatican to my list,
00:45:11 is trying to find offensive hacking tools and zero days
00:45:16 to make them work.
00:45:17 And those that don’t have the skills
00:45:20 now have this market that they can tap into,
00:45:23 where $2.5 million, that’s chump change
00:45:26 for a lot of these nation states.
00:45:27 It’s a hell of a lot less than trying
00:45:29 to build the next fighter jet.
00:45:32 But yeah, the goal is chaos.
00:45:34 I mean, why did Russia turn off the lights twice in Ukraine?
00:45:39 I think part of it is chaos.
00:45:42 I think part of it is to sow the seeds of doubt
00:45:46 in their current government.
00:45:47 Your government can’t even keep your lights on.
00:45:50 Why are you sticking with them?
00:45:52 Come over here and we’ll keep your lights on at least.
00:45:56 There’s like a little bit of that.
00:45:58 Nuclear weapons seem to have helped prevent nuclear war.
00:46:04 Is it possible that we have so many vulnerabilities
00:46:08 and so many attack vectors on each other
00:46:11 that you will kind of achieve the same kind of equilibrium
00:46:15 like mutually assured destruction?
00:46:17 Yeah.
00:46:18 That’s one hopeful solution to this.
00:46:20 Do you have any hope for this particular solution?
00:46:23 You know, nuclear analogies always tend to fall apart
00:46:26 when it comes to cyber,
00:46:27 mainly because you don’t need fissile material.
00:46:30 You know, you just need a laptop and the skills
00:46:33 and you’re in the game.
00:46:34 So it’s a really low barrier to entry.
00:46:38 The other thing is attribution is harder.
00:46:40 And we’ve seen countries muck around with attribution.
00:46:44 We’ve seen, you know, nation states piggyback
00:46:47 on other countries spy operations and just sit there
00:46:50 and siphon out whatever they’re getting.
00:46:53 We learned some of that from the Snowden documents.
00:46:56 We’ve seen Russia hack into Iran’s command
00:46:58 and control attack servers.
00:47:01 We’ve seen them hit a Saudi petrochemical plant
00:47:05 where they did neutralize the safety locks at the plant
00:47:08 and everyone assumed that it was Iran,
00:47:10 given Iran had been targeting Saudi oil companies forever.
00:47:13 But nope, it turned out that it was
00:47:15 a graduate research institute outside Moscow.
00:47:17 So you see countries kind of playing around
00:47:20 with attribution.
00:47:21 Why?
00:47:22 I think because they think, okay, if I do this,
00:47:25 like how am I gonna cover up that it came from me
00:47:27 because I don’t wanna risk the response.
00:47:30 So people are sort of dancing around this.
00:47:33 It’s just in a very different way.
00:47:34 And, you know, at the times I’d covered the Chinese hacks
00:47:39 of infrastructure companies like pipelines.
00:47:42 I’d covered the Russian probes of nuclear plants.
00:47:46 I’d covered the Russian attacks on the Ukraine grid.
00:47:50 And then in 2018, my colleague David Sanger and I
00:47:53 covered the fact that US Cyber Command
00:47:57 had been hacking into the Russian grid
00:47:59 and making a pretty loud show of it.
00:48:02 And when we went to the National Security Council,
00:48:05 because that’s what journalists do
00:48:06 before they publish a story,
00:48:08 they give the other side a chance to respond,
00:48:11 I assumed we would be in for that really awkward,
00:48:14 painful conversation where they would say,
00:48:17 you will have blood on your hands if you publish this story.
00:48:20 And instead they gave us the opposite answer.
00:48:22 They said, we have no problem
00:48:25 with you publishing this story.
00:48:27 Why?
00:48:28 Well, they didn’t say it out loud,
00:48:29 but it was pretty obvious they wanted Russia to know
00:48:33 that we’re hacking into their power grid too,
00:48:35 and they better think twice before they do to us
00:48:38 what they had done to Ukraine.
00:48:40 So yeah, you know, we have stumbled into this new era
00:48:44 of mutually assured digital destruction.
00:48:47 I think another sort of quasi norm we’ve stumbled into
00:48:54 is proportional responses.
00:48:57 There’s this idea that if you get hit,
00:49:00 you’re allowed to respond proportionally
00:49:03 at a time and place of your choosing.
00:49:05 That is how the language always goes.
00:49:08 That’s what Obama said after North Korea hit Sony.
00:49:12 We will respond at a time and place of our choosing.
00:49:15 But no one really knows like what that response looks like.
00:49:21 And so what you see a lot of the time
00:49:22 are just these like, just short of war attacks.
00:49:27 You know, Russia turned off the power in Ukraine,
00:49:29 but it wasn’t like it stayed off for a week.
00:49:31 You know, it stayed off for a number of hours.
00:49:34 You know, NotPetya hit those companies pretty hard,
00:49:39 but no one died, you know?
00:49:41 And the question is, what’s gonna happen when someone dies?
00:49:44 And can a nation state masquerade as a cyber criminal group,
00:49:49 as a ransomware group?
00:49:51 And that’s what really complicates
00:49:53 coming to some sort of digital Geneva convention.
00:49:57 Like there’s been a push from Brad Smith at Microsoft.
00:50:01 We need a digital Geneva convention.
00:50:03 And on its face, it sounds like a no brainer.
00:50:06 Yeah, why wouldn’t we all agree to stop hacking
00:50:08 into each other’s civilian hospital systems,
00:50:11 elections, power grid, pipelines?
00:50:15 But when you talk to people in the West,
00:50:19 officials in the West, they’ll say, we would never,
00:50:22 we’d love to agree to it, but we’d never do it
00:50:25 when you’re dealing with Xi or Putin or Kim Jong Un.
00:50:30 Because a lot of times, they outsource these operations
00:50:35 to cyber criminals.
00:50:37 In China, we see a lot of these attacks
00:50:39 come from this loose satellite network of private citizens
00:50:43 that work at the behest of the Ministry of State Security.
00:50:46 So how do you come to some sort of state to state agreement
00:50:51 when you’re dealing with transnational actors
00:50:55 and cyber criminals, where it’s really hard to pin down
00:50:59 whether that person was acting alone
00:51:01 or whether they were acting at the behest of the MSS
00:51:05 or the FSB.
00:51:06 And a couple of years ago, I remember,
00:51:09 can’t remember if it was before or after NotPetya,
00:51:11 but Putin said, hackers are like artists
00:51:14 who wake up in the morning in a good mood and start painting.
00:51:18 In other words, I have no say over what they do or don’t do.
00:51:21 So how do you come to some kind of norm
00:51:24 when that’s how he’s talking about these issues
00:51:26 and he’s just decimated Merck and Pfizer
00:51:30 and another however many thousand companies?
00:51:34 That is the fundamental difference between nuclear weapons
00:51:37 and cyber attacks is the attribution
00:51:40 or one of the fundamental differences.
00:51:42 If you can fix one thing in the world
00:51:45 in terms of cybersecurity
00:51:47 that would make the world a better place,
00:51:48 what would you fix?
00:51:51 So you’re not allowed to fix authoritarian regimes,
00:51:54 and you can’t.
00:51:55 You have to keep
00:51:57 human nature as it is.
00:52:00 In terms of on the security side, technologically speaking,
00:52:05 you mentioned there’s no regulation
00:52:06 on companies in United States.
00:52:10 What if you could just fix with the snap of a finger,
00:52:14 what would you fix?
00:52:15 Two factor authentication, multifactor authentication.
00:52:19 It’s ridiculous how many of these attacks come in
00:52:24 because someone didn’t turn on multifactor authentication.
00:52:27 I mean, Colonial Pipeline, okay?
00:52:30 They took down the biggest conduit
00:52:34 for gas, jet fuel and diesel
00:52:35 to the East Coast of the United States of America, how?
00:52:39 Because they forgot to deactivate an old employee account
00:52:42 whose password had been traded on the dark web
00:52:44 and they’d never turned on two factor authentication.
00:52:48 This water treatment facility in Florida
00:52:50 was hacked last year.
00:52:51 How did it happen?
00:52:53 They were using Windows XP from like a decade ago
00:52:56 that can’t even get patches if you want it to
00:52:59 and they didn’t have two factor authentication.
00:53:01 Time and time again,
00:53:02 if they just switched on two factor authentication,
00:53:06 some of these attacks wouldn’t have been possible.
00:53:08 Now, if I could snap my fingers,
00:53:10 that’s the thing I would do right now.
00:53:11 But of course, this is a cat and mouse game
00:53:15 and then the attackers onto the next thing.
00:53:17 But I think right now that is like bar none.
00:53:21 That is just, that is the easiest, simplest way
00:53:24 to deflect the most attacks.
00:53:25 And the name of the game right now isn’t perfect security.
00:53:29 Perfect security is impossible.
00:53:32 They will always find a way in.
00:53:34 The name of the game right now
00:53:35 is make yourself a little bit harder to attack
00:53:39 than your competitor than anyone else out there
00:53:41 so that they just give up and move along.
00:53:44 And maybe if you are a target
00:53:46 for an advanced nation state or the SVR,
00:53:51 you’re gonna get hacked no matter what.
00:53:53 But against cyber criminal groups, you can deadbolt the door.
00:53:57 You can make their jobs a lot harder
00:54:00 simply by doing the bare basics.
00:54:03 And the other thing is stop reusing your passwords.
00:54:05 But if I only get one, then two factor authentication.
00:54:08 So what is two factor authentication?
00:54:10 Factor one is what, logging in with a password.
00:54:13 And factor two is like have another device
00:54:15 or another channel through which you can confirm,
00:54:18 yeah, that’s me.
00:54:19 Yes, usually this happens through some kind of text.
00:54:23 You get your one time code from Bank of America
00:54:26 or from Google.
00:54:28 The better way to do it is spend $20
00:54:31 buying yourself a FIDO key on Amazon.
00:54:34 That’s a hardware device.
00:54:36 And if you don’t have that hardware device with you,
00:54:39 then you’re not gonna get in.
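[Editor's note: the one-time code flow described here, the text from Bank of America or Google, is typically a time-based one-time password, which both the server and the phone derive from a shared secret. A minimal standard-library Python sketch of that scheme (RFC 6238); the secret here is illustrative, not anything a real service uses:]

```python
import base64
import hashlib
import hmac
import struct

def totp(secret_b32: str, at_time: int, step: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238), HMAC-SHA1 variant."""
    key = base64.b32decode(secret_b32)
    counter = at_time // step                        # current 30-second window
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Server and phone share the secret; both derive the same short-lived
# code, so a stolen password alone is not enough to log in.
secret = base64.b32encode(b"not-a-real-secret!!!").decode()
code_now = totp(secret, at_time=1_700_000_000)
```

[Because the code changes every 30 seconds and never crosses the network ahead of time, a leaked password database is useless without the second factor.]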
00:54:41 And the whole goal is, I mean, basically,
00:54:43 my first half of my decade at The Times
00:54:46 was spent covering the same breach story over and over.
00:54:49 It was like Home Depot got breached,
00:54:51 News at 11, Target, Neiman Marcus,
00:54:54 like who wasn’t hacked over the course of those five years?
00:54:58 And a lot of those companies that got hacked,
00:55:01 what did hackers take?
00:55:02 They took the credentials, they took the passwords.
00:55:05 They can make a pretty penny selling them on the dark web
00:55:08 and people reuse their passwords.
00:55:11 So you get one from God knows who, I don’t know,
00:55:15 LastPass, worst case example, actually LastPass.
00:55:19 But you get one and then you go test it
00:55:21 on their email account.
00:55:23 And you go test it on their brokerage account
00:55:25 and you test it on their cold storage account.
00:55:28 That’s how it works.
00:55:29 But if you have multi factor authentication,
00:55:32 then they can’t get in
00:55:34 because they might have your password,
00:55:36 but they don’t have your phone,
00:55:38 they don’t have your FIDO key.
00:55:41 So you keep them out.
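[Editor's note: the credential-stuffing flow described above can be sketched as a toy model. One leaked, reused password is replayed against several services; every service name and credential below is invented for illustration:]

```python
# One reused password leaks from a breach, as in the Home Depot /
# Target era Nicole describes, and is replayed everywhere.
leaked_password = "hunter2"

accounts = {
    "webmail":   {"password": "hunter2", "mfa_enabled": False},  # reused, no MFA
    "brokerage": {"password": "hunter2", "mfa_enabled": True},   # reused, MFA on
    "storage":   {"password": "unique-long-passphrase", "mfa_enabled": False},
}

def login(service, password, second_factor=None):
    acct = accounts[service]
    if password != acct["password"]:
        return False            # unique password: the replay fails outright
    if acct["mfa_enabled"] and second_factor is None:
        return False            # correct password, but no second factor
    return True

# Test the stolen credential against every service, attacker-style.
results = {svc: login(svc, leaked_password) for svc in accounts}
# Only the account with neither a unique password nor MFA falls.
```

[Either defense alone, a unique password or a second factor, breaks the chain; the point of the passage is that MFA saves you even when the password is already compromised.]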
00:55:42 And I get a lot of alerts that tell me
00:55:46 someone is trying to get into your Instagram account
00:55:49 or your Twitter account or your email account.
00:55:52 And I don’t worry because I use multi factor authentication.
00:55:55 They can try all day.
00:55:58 Okay, I worry a little bit, but it’s the simplest thing to do
00:56:03 and we don’t even do it.
00:56:05 Well, there’s an interface aspect to it
00:56:06 because it’s pretty annoying if it’s implemented poorly.
00:56:11 So actually bad implementation
00:56:13 of two factor authentication, not just bad,
00:56:16 but just something that adds friction
00:56:19 is a security vulnerability, I guess,
00:56:21 because it’s really annoying.
00:56:23 Like I think MIT for a while had two factor authentication.
00:56:27 It was really annoying.
00:56:28 I just, like the number of times it pings you,
00:56:33 like it asks to reauthenticate across multiple subdomains.
00:56:39 Like it just feels like a pain.
00:56:42 I don’t know what the right balance there.
00:56:44 Yeah, it feels like friction in our frictionless society.
00:56:48 It feels like friction, it’s annoying.
00:56:51 That’s security’s biggest problem, it’s annoying.
00:56:54 We need the Steve Jobs of security to come along
00:56:57 and we need to make it painless.
00:56:59 And actually on that point,
00:57:02 Apple has probably done more for security than anyone else
00:57:07 simply by introducing biometric authentication,
00:57:10 first with the fingerprint and then with face ID.
00:57:13 And it’s not perfect, but if you think just eight years ago,
00:57:17 everyone was running around with either no passcode
00:57:20 and optional passcode or four digit passcode on their phone
00:57:23 that anyone, think of what you can get
00:57:26 when you get someone’s iPhone, if you steal someone’s iPhone
00:57:29 and props to them for introducing the fingerprint
00:57:32 and face ID.
00:57:33 And again, it wasn’t perfect, but it was a huge step forward.
00:57:36 Now it’s time to make another huge step forward.
00:57:41 I wanna see the password die.
00:57:42 I mean, it’s gotten us as far as it was ever gonna get us.
00:57:46 And I hope whatever we come up with next
00:57:49 is not gonna be annoying, is gonna be seamless.
00:57:52 When I was at Google, that’s what we worked on is,
00:57:55 and there’s a lot of ways to call it
00:57:57 active authentication, passive authentication.
00:57:59 So basically you use biometric data,
00:58:02 not just like a fingerprint, but everything from your body
00:58:05 to identify who you are, like movement patterns.
00:58:09 So you basically create a lot of layers of protection
00:58:12 where it’s very difficult to fake,
00:58:15 including like face unlock, checking that it’s your actual
00:58:20 face, like the liveness tests.
00:58:23 So like from video, so unlocking it with video,
00:58:26 voice, the way you move the phone,
00:58:31 the way you take it out of the pocket, that kind of thing.
00:58:33 All of those factors.
00:58:34 It’s a really hard problem though.
00:58:37 And ultimately, it’s very difficult to beat the password
00:58:42 in terms of security.
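[Editor's note: one way to picture the layered passive-authentication idea Lex describes is as a weighted score fused from several weak behavioral signals, with the device unlocked only while the score clears a threshold. The signal names, weights, and threshold below are purely illustrative, not how any real system is tuned:]

```python
# Invented signals and weights for illustration: each signal reports a
# match confidence in [0, 1], and no single one is decisive on its own.
WEIGHTS = {"face": 0.5, "gait": 0.2, "typing_rhythm": 0.2, "location": 0.1}
THRESHOLD = 0.8

def auth_confidence(signals):
    """Weighted sum of per-signal match confidences."""
    return sum(WEIGHTS[name] * signals.get(name, 0.0) for name in WEIGHTS)

owner = {"face": 0.95, "gait": 0.90, "typing_rhythm": 0.85, "location": 1.0}
stranger = {"face": 0.10, "gait": 0.30, "typing_rhythm": 0.20, "location": 1.0}

# The owner stays logged in; the stranger falls below the threshold and
# gets challenged for an explicit factor (passcode, FIDO key) instead.
```

[The design point is the one made in the conversation: any single signal is fakeable, but faking face, movement, typing rhythm, and context simultaneously is much harder.]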
00:58:43 Well, there’s a company that I actually will call out
00:58:46 and that’s Abnormal Security.
00:58:48 So they work on email attacks.
00:58:51 And it was started by a couple of guys who were doing,
00:58:56 I think, ad tech at Twitter.
00:58:59 So ad technology now, like it’s a joke
00:59:02 how much they know about us.
00:59:03 You always hear the conspiracy theories that
00:59:06 you saw someone’s shoes and next thing you know,
00:59:08 it’s on your phone.
00:59:10 It’s amazing what they know about you.
00:59:13 And they’re basically taking that
00:59:16 and they’re applying it to attacks.
00:59:19 So they’re saying, okay, if you’re,
00:59:22 this is what your email patterns are.
00:59:24 It might be different for you and me
00:59:26 because we’re emailing strangers all the time.
00:59:29 But for most people,
00:59:30 their email patterns are pretty predictable.
00:59:33 And if something strays from that pattern, that’s abnormal
00:59:38 and they’ll block it, they’ll investigate it.
00:59:41 And that’s great.
00:59:43 Let’s start using that kind of targeted ad technology
00:59:48 to protect people.
00:59:50 And yeah, I mean, it’s not gonna get us away
00:59:52 from the password and using multifactor authentication,
00:59:56 but the technology is out there
00:59:59 and we just have to figure out how to use it
01:00:02 in a really seamless way because it doesn’t matter
01:00:05 if you have the perfect security solution
01:00:07 if no one uses it.
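[Editor's note: the pattern-deviation idea behind the email defense described above can be sketched crudely: build a baseline of who a user normally corresponds with, then hold anything that strays from it. Addresses and thresholds are invented; a real product like Abnormal's uses far richer signals than sender counts:]

```python
from collections import Counter

# Baseline of past correspondents for one user (hypothetical addresses).
history = [
    "alice@corp.example", "alice@corp.example", "bob@corp.example",
    "alice@corp.example", "payroll@corp.example", "bob@corp.example",
]
baseline = Counter(history)

def is_abnormal(sender, min_seen=2):
    # A sender we have rarely or never seen falls outside the user's
    # normal pattern, so the message is held for inspection rather
    # than delivered outright.
    return baseline[sender] < min_seen

# A familiar correspondent passes; a lookalike phishing domain is held.
```

[This is the ad-tech insight turned defensive: the same profiling that predicts what you will click can predict which messages you would never normally receive.]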
01:00:08 I mean, when I started at The Times,
01:00:10 when I was trying to be really good
01:00:12 about protecting sources,
01:00:14 I was trying to use PGP encryption
01:00:17 and it’s like, it didn’t work.
01:00:19 The number of mistakes I would probably make
01:00:22 just trying to email someone with PGP just wasn’t worth it.
01:00:27 And then Signal came along, and Signal and Wickr
01:00:32 made it a lot easier
01:00:34 to send someone an encrypted text message.
01:00:37 So we have to start investing in creative minds,
01:00:43 in good security design.
01:00:45 I really think that’s the hack that’s gonna get us
01:00:48 out of where we are today.
01:00:50 What about social engineering?
01:00:52 Do you worry about this sort of hacking people?
01:00:57 Yes, I mean, this is the worst nightmare
01:01:00 of every chief information security officer out there.
01:01:04 Social engineering, we work from home now.
01:01:10 I saw this woman posted online about how her husband,
01:01:15 it went viral today,
01:01:16 but it was her husband had this problem at work.
01:01:20 They hired a guy named John
01:01:22 and now the guy that shows up for work every day
01:01:26 doesn’t act like John.
01:01:29 I mean, think about that.
01:01:31 Like think about the potential for social engineering
01:01:34 in that context.
01:01:35 You apply for a job and you put on a pretty face,
01:01:38 you hire an actor or something,
01:01:40 and then you just get inside the organization
01:01:42 and get access to all that organization’s data.
01:01:45 A couple of years ago,
01:01:47 Saudi Arabia planted spies inside Twitter.
01:01:51 Why?
01:01:52 Probably because they were trying to figure out
01:01:54 who these people were
01:01:55 who were criticizing the regime on Twitter.
01:01:58 They couldn’t do it with a hack from the outside,
01:02:00 so why not plant people on the inside?
01:02:02 And that’s like the worst nightmare.
01:02:04 And it also, unfortunately, creates all kinds of xenophobia
01:02:09 at a lot of these organizations.
01:02:11 I mean, if you’re gonna have to take that into consideration,
01:02:14 then organizations are gonna start looking
01:02:16 really skeptically and suspiciously
01:02:19 at someone who applies for that job from China.
01:02:23 And we’ve seen that go really badly
01:02:25 at places like the Department of Commerce,
01:02:28 where they basically accuse people of being spies
01:02:31 that aren’t spies.
01:02:32 So it is the hardest problem to solve,
01:02:35 and it’s never been harder to solve
01:02:37 than right at this very moment
01:02:39 when there’s so much pressure for companies
01:02:41 to let people work remotely.
01:02:43 That’s actually why I’m single.
01:02:45 I’m suspicious that China and Russia,
01:02:48 every time I meet somebody,
01:02:49 are trying to plant and get insider information,
01:02:52 so I’m very, very suspicious.
01:02:54 I keep putting the Turing test in front, no.
01:02:57 No, I have a friend who worked inside NSA
01:03:02 and was one of their top hackers,
01:03:04 and he’s like, every time I go to Russia,
01:03:08 I get hit on by these 10s.
01:03:10 And I come home, my friends are like,
01:03:12 I’m sorry, you’re not a 10.
01:03:13 Like, it’s a common story.
01:03:17 I mean, it’s difficult to trust humans
01:03:20 in this day and age online.
01:03:23 So we’re working remotely, that’s one thing,
01:03:27 but just interacting with people on the internet,
01:03:31 sounds ridiculous, but because of this podcast in part,
01:03:35 I’ve gotten to meet some incredible people,
01:03:37 but it makes you nervous to trust folks,
01:03:43 and I don’t know how to solve that problem.
01:03:48 So I’m talking with Mark Zuckerberg,
01:03:51 who dreams about creating the metaverse.
01:03:55 What do you do about that world
01:03:56 where more and more of our lives are in the digital sphere?
01:04:01 Like, one way to phrase it is,
01:04:05 most of our meaningful experiences at some point
01:04:10 will be online, like falling in love, getting a job,
01:04:15 or experiencing a moment of happiness with a friend,
01:04:19 with a new friend made online, all of those things.
01:04:23 Like, more and more, the fun we do,
01:04:25 the things that make us love life will happen online,
01:04:28 and if those things have an avatar that’s digital,
01:04:32 that’s like a way to hack into people’s minds,
01:04:35 whether it’s with AI or kind of troll farms
01:04:39 or something like that.
01:04:40 I don’t know if there’s a way to protect against that.
01:04:43 That might fundamentally rely on our faith
01:04:49 in how good human nature is.
01:04:51 So if most people are good, we’re going to be okay,
01:04:54 but if people will tend towards manipulation
01:04:59 and malevolent behavior in search of power,
01:05:03 then we’re screwed.
01:05:05 So I don’t know if you can comment
01:05:07 on how to keep the metaverse secure.
01:05:10 Yeah, I mean, all I thought about
01:05:13 when you were talking just now was my three year old son.
01:05:16 Yeah.
01:05:19 He asked me the other day, what’s the internet, mom?
01:05:22 And I just almost wanted to cry.
01:05:25 You know, I don’t want that for him.
01:05:29 I don’t want all of his most meaningful experiences
01:05:32 to be online.
01:05:33 You know, by the time that happens,
01:05:36 how do you know that person’s human,
01:05:39 that avatar’s human?
01:05:41 You know, I believe in free speech.
01:05:42 I don’t believe in free speech for robots and bots.
01:05:46 And like, look what just happened over the last six years.
01:05:51 You know, we had bots pretending
01:05:53 to be Black Lives Matter activists
01:05:56 just to sow some division,
01:05:59 or, you know, Texas secessionists,
01:06:01 or, you know, organizing anti-Hillary protests,
01:06:06 or just to sow more division,
01:06:08 to tie us up in our own politics
01:06:12 so that we’re so paralyzed we can’t get anything done.
01:06:15 We can’t make any progress
01:06:17 and we definitely can’t handle our adversaries
01:06:19 and their long-term thinking.
01:06:22 It really scares me.
01:06:25 And here’s where I just come back to.
01:06:28 Just because we can create the metaverse,
01:06:32 you know, just because it sounds like the next logical step
01:06:36 in our digital revolution,
01:06:39 do I really want my child’s most significant moments
01:06:43 to be online?
01:06:45 They weren’t for me, you know?
01:06:47 So maybe I’m just stuck in that old school thinking,
01:06:51 or maybe I’ve seen too much.
01:06:54 And I’m really sick of being
01:06:58 the guinea pig parent generation for these things.
01:07:01 I mean, it’s hard enough with screen time.
01:07:04 Like thinking about how to manage the metaverse as a parent
01:07:10 to a young boy, like I can’t even let my head go there.
01:07:13 That’s so terrifying for me.
01:07:16 But we’ve never stopped any new technology
01:07:21 just because it introduces risks.
01:07:23 We’ve always said, okay, the promise of this technology
01:07:27 means we should keep going, keep pressing ahead.
01:07:31 We just need to figure out new ways to manage that risk.
01:07:35 And you know, that’s the blockchain right now.
01:07:39 Like when I was covering all of these ransomware attacks,
01:07:44 I thought, okay, this is gonna be it for cryptocurrency.
01:07:48 You know, governments are gonna put the kibosh down.
01:07:51 They’re gonna put the hammer down and say enough is enough.
01:07:54 Like we have to put this genie back in the bottle
01:07:56 because it’s enabled ransomware.
01:07:58 I mean, five years ago, they would hijack your PC
01:08:02 and they’d say, go to the local pharmacy,
01:08:05 get an eGift card and tell us what the pin is.
01:08:08 And then we’ll get your $200.
01:08:10 Now it’s pay us, you know, five Bitcoin.
01:08:13 And so there’s no doubt cryptocurrencies
01:08:16 enabled ransomware attacks,
01:08:17 but after the Colonial Pipeline ransom was seized,
01:08:22 because if you remember, the FBI was actually able to go in
01:08:25 and claw some of it back from DarkSide,
01:08:28 which was the ransomware group that hit it.
01:08:31 And I spoke to these guys at TRM Labs.
01:08:34 So they’re one of these blockchain intelligence companies.
01:08:37 And a lot of people that work there
01:08:38 used to work at the treasury.
01:08:40 And what they said to me was,
01:08:42 yeah, cryptocurrency has enabled ransomware,
01:08:46 but to track down that ransom payment would have taken,
01:08:52 you know, if we were dealing with fiat currency,
01:08:54 would have taken us years to get to that one bank account
01:08:58 or belonging to that one front company in the Seychelles.
01:09:01 And now thanks to the blockchain,
01:09:04 we can track the movement of those funds in real time.
01:09:08 And you know what?
01:09:09 You know, these payments are not as anonymous
01:09:11 as people think.
01:09:13 Like we still can use our old hacking ways and zero days
01:09:16 and, you know, old school intelligence methods
01:09:19 to find out who owns that private wallet
01:09:21 and how to get to it.
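The tracing she describes, following funds across a public ledger, can be sketched as a walk over a transaction graph. A minimal toy illustration: the addresses, amounts, and ledger below are invented for the example, and real chain-analysis tools work over actual blockchain data rather than a list like this.

```python
# Toy illustration of blockchain tracing: every transfer is public,
# so investigators can walk the transaction graph outward from a
# known ransom address. All addresses/amounts here are hypothetical.
from collections import deque

# Public ledger entries: (sender, receiver, amount)
ledger = [
    ("victim", "ransom_wallet", 75.0),
    ("ransom_wallet", "mixer_1", 40.0),
    ("ransom_wallet", "affiliate", 35.0),
    ("mixer_1", "exchange_acct", 40.0),
]

def trace(start):
    """Breadth-first walk of outgoing transfers from `start`."""
    seen, queue, hops = set(), deque([start]), []
    while queue:
        addr = queue.popleft()
        for sender, receiver, amt in ledger:
            if sender == addr and receiver not in seen:
                seen.add(receiver)
                hops.append((sender, receiver, amt))
                queue.append(receiver)
    return hops

print(trace("ransom_wallet"))
```

Because the ledger is append-only and public, this walk can be rerun in real time as new transfers appear, which is the advantage over subpoenaing one bank account at a time.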
01:09:23 So it’s a curse in some ways and that it’s an enabler,
01:09:27 but it’s also a blessing.
01:09:29 And they said that same thing to me
01:09:31 that I just said to you.
01:09:32 They said, we’ve never shut down a promising new technology
01:09:37 because it introduced risk.
01:09:39 We just figured out how to manage that risk.
01:09:42 And I think that’s where the conversation
01:09:44 unfortunately has to go,
01:09:45 is how do we in the metaverse use technology to fix things?
01:09:53 So maybe we’ll finally be able to, not finally,
01:09:56 but figure out a way to solve the identity problem
01:10:00 on the internet, meaning like a blue check mark
01:10:03 for actual human and connect it to identity
01:10:06 or like a fingerprint so you can prove you're you.
01:10:11 And yet do it in a way that doesn’t involve the company
01:10:15 having all your data.
01:10:17 So giving you, allowing you to maintain control
01:10:20 over your data, or if you don’t,
01:10:23 then there’s a complete transparency
01:10:25 of how that data is being used, all those kinds of things.
01:10:28 And maybe as you educate more and more people,
01:10:32 they would demand in a capitalist society
01:10:36 that the companies that they give their data to
01:10:38 will respect that data.
01:10:40 Yeah, I mean, there is this company,
01:10:43 and I hope they succeed, their name's Piiano, like piano.
01:10:48 And they wanna create a vault for your personal information
01:10:52 inside every organization.
01:10:54 And ultimately, if I’m gonna call Delta Airlines
01:10:57 to book a flight,
01:10:59 they don’t need to know my social security number.
01:11:02 They don’t need to know my birth date.
01:11:05 They’re just gonna send me a one time token to my phone.
01:11:08 My phone’s gonna say, or my Fido key is gonna say,
01:11:11 yep, it’s her.
01:11:13 And then we’re gonna talk about my identity like a token,
01:11:16 some random token.
01:11:17 They don’t need to know exactly who I am.
01:11:20 They just need to know the system trusts that I am
01:11:23 who I say I am, but they don't get access to my PII data.
01:11:27 They don’t get access to my social security number,
01:11:30 my location, or the fact I’m a Times journalist.
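The pattern she's describing can be sketched as a PII vault that hands out opaque tokens plus single-use codes, so the business only ever learns "the system vouches for this token." The class and method names below are hypothetical, not Piiano's actual API.

```python
# Sketch of a token-based identity vault: the retailer stores a
# random handle instead of PII, and verification is a one-time code
# delivered out of band (phone, FIDO key). Hypothetical design.
import secrets

class IdentityVault:
    """Holds PII; hands out opaque tokens instead of raw data."""
    def __init__(self):
        self._pii = {}        # token -> PII, never exposed to callers
        self._pending = {}    # token -> outstanding one-time code

    def enroll(self, pii):
        token = secrets.token_hex(8)   # opaque handle, e.g. "X567..."
        self._pii[token] = pii
        return token

    def challenge(self, token):
        code = secrets.token_hex(4)    # one-time code sent to the phone
        self._pending[token] = code
        return code

    def verify(self, token, code):
        # Constant-time compare; pop makes the code single-use
        return secrets.compare_digest(self._pending.pop(token, ""), code)

vault = IdentityVault()
token = vault.enroll({"name": "...", "ssn": "..."})
code = vault.challenge(token)      # delivered out of band in practice
print(vault.verify(token, code))   # the airline learns only: "it's her"
print(vault.verify(token, code))   # replaying the same code fails
```

The point of the design is that a breach of the retailer leaks only tokens, while the PII stays inside one hardened vault.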
01:11:34 I think that’s the way the world’s gonna go.
01:11:37 We have to say enough is enough on sort of
01:11:40 losing our personal information everywhere,
01:11:44 letting data marketing companies track our every move.
01:11:48 They don’t need to know who I am.
01:11:51 Okay, I get it.
01:11:52 We’re stuck in this world where the internet runs on ads.
01:11:57 So ads are not gonna go away,
01:11:59 but they don’t need to know I’m Nicole Perlora.
01:12:03 They can know that I am token number, you know,
01:12:06 X567.
01:12:08 And they can let you know what they know
01:12:11 and give you control about removing the things they know.
01:12:14 Yeah, right to be forgotten.
01:12:15 To me, you should be able to walk away
01:12:17 with a single press of a button.
01:12:20 And I also believe that most people,
01:12:22 given the choice to walk away, won’t walk away.
01:12:25 They’ll just feel better about having the option
01:12:28 to walk away when they understand the trade offs.
01:12:30 If you walk away, you’re not gonna get
01:12:32 some of the personalized experiences
01:12:34 that you would otherwise get,
01:12:35 like a personalized feed and all those kinds of things.
01:12:38 But the freedom to walk away is,
01:12:43 I think, really powerful.
01:12:44 And obviously, what you’re saying,
01:12:45 it’s definitely, there’s all of these HTML forms
01:12:48 where you have to enter your phone number and email
01:12:51 and private information from Delta, every single airline.
01:12:55 New York Times.
01:12:58 I have so many opinions on this.
01:13:00 Just the friction and the sign up
01:13:03 and all of those kinds of things.
01:13:04 I should be able to, this has to do with everything.
01:13:07 This has to do with payment, too.
01:13:09 Payment should be trivial.
01:13:11 It should be one click,
01:13:13 and one click to unsubscribe and subscribe,
01:13:16 and one click to provide all of your information
01:13:19 that’s necessary for the subscription service,
01:13:21 for the transaction service, whatever that is,
01:13:24 getting a ticket, as opposed to,
01:13:25 I have all of these fake phone numbers and emails
01:13:28 that I use in all the sign-ups,
01:13:29 because you never know if one site is hacked,
01:13:34 then it’s just going to propagate to everything else.
01:13:37 Yeah.
01:13:38 And there’s low hanging fruit,
01:13:41 and I hope Congress does something.
01:13:44 And frankly, I think it’s negligent they haven’t
01:13:46 on the fact that elderly people are getting spammed to death
01:13:51 on their phones these days with fake car warranty scams.
01:13:56 And I mean, my dad was in the hospital last year,
01:13:59 and I was in the hospital room, and his phone kept buzzing,
01:14:02 and I look at it, and it’s just spam attack after spam attack,
01:14:08 people nonstop calling about his freaking car warranty,
01:14:13 why they’re trying to get his social security number,
01:14:15 they’re trying to get his PII,
01:14:17 they’re trying to get this information.
01:14:19 We need to figure out how to put those people
01:14:24 in jail for life, and we need to figure out
01:14:28 why in the hell we are being required
01:14:32 or asked to hand over our social security number
01:14:36 and our home address and our passport,
01:14:39 all of that information to every retailer who asks.
01:14:43 I mean, that’s insanity.
01:14:46 And there’s no question they’re not protecting it
01:14:49 because it keeps showing up in spam or identity theft
01:14:55 or credit card theft or worse.
01:14:57 Well, spam is getting better, and maybe I need to,
01:15:00 as a side note, make a public announcement.
01:15:02 Please clip this out, which is if you get an email
01:15:07 or a message from Lex Friedman saying how much
01:15:12 I, Lex, appreciate you and love you and so on,
01:15:16 and please connect with me on my WhatsApp number
01:15:19 and I will give you Bitcoin or something like that,
01:15:23 please do not click.
01:15:25 And I’m aware that there’s a lot of this going on,
01:15:29 a very large amount.
01:15:30 I can’t do anything about it.
01:15:32 This is on every single platform.
01:15:33 It’s happening more and more and more,
01:15:36 which I’ve been recently informed that they’re now emailing.
01:15:40 So it’s cross platform.
01:15:42 They’re taking people’s, they’re somehow,
01:15:46 this is fascinating to me because they are taking people
01:15:50 who comment on various social platforms
01:15:53 and they somehow reverse engineer.
01:15:56 They figure out what their email is
01:15:57 and they send an email to that person saying,
01:16:00 from Lex Friedman, and it’s like a heartfelt email
01:16:04 with links.
01:16:05 It’s fascinating because it’s cross platform now.
01:16:07 It’s not just a spam bot that’s messaging
01:16:11 and a comment that’s in a reply.
01:16:13 They are saying, okay, this person cares
01:16:16 about this other person on social media.
01:16:18 So I’m going to find another channel,
01:16:20 which in their mind probably increases
01:16:22 and it does the likelihood that they’ll get the people
01:16:26 to click and they do.
01:16:28 I don’t know what to do about that.
01:16:30 It makes me really, really sad,
01:16:32 especially with podcasting.
01:16:33 There’s an intimacy that people feel connected
01:16:36 and they get really excited.
01:16:37 Okay, cool, I wanna talk to Lex.
01:16:40 And they click.
01:16:45 And I get angry at the people that do this.
01:16:50 I mean, it’s like the John that gets hired,
01:16:55 the fake employee.
01:16:57 I mean, I don’t know what to do about that.
01:16:58 I mean, I suppose the solution is education.
01:17:02 It’s telling people to be skeptical
01:17:04 on the stuff they click.
01:17:07 That balance with the technology solution
01:17:09 of creating maybe like two-factor authentication
01:17:14 and maybe helping identify things
01:17:17 that are likely to be spam, I don’t know.
01:17:20 But then the machine learning there is tricky
01:17:21 because you don’t wanna add a lot of extra friction
01:17:25 that just annoys people because they’ll turn it off.
01:17:28 Because you have the accept cookies thing, right?
01:17:30 That everybody has to click on now,
01:17:32 so now they completely ignore the accept cookies.
01:17:34 It’s very difficult to find that frictionless security.
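On the two-factor idea mentioned above: the standard low-friction second factor is TOTP (RFC 6238), which derives a short-lived code from a shared secret and the clock, so nothing extra travels over the network. A minimal sketch, using the secret from the RFC's published test vectors:

```python
# Minimal TOTP (RFC 6238): HMAC-SHA1 over the current 30-second
# time-step counter, then "dynamic truncation" down to 6 digits.
import hmac, hashlib, struct

def totp(secret: bytes, t: float, step: int = 30, digits: int = 6) -> str:
    counter = int(t // step)
    msg = struct.pack(">Q", counter)          # 8-byte big-endian counter
    mac = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                   # dynamic truncation offset
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

secret = b"12345678901234567890"              # RFC 6238 test secret
print(totp(secret, 59))                       # prints "287082"
```

In real use you'd call `totp(secret, time.time())` on both the phone and the server; the codes match only within the same 30-second window, which is what makes a phished password alone useless.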
01:17:42 You mentioned Snowden.
01:17:43 You’ve talked about looking through the NSA documents
01:17:48 he leaked and doing the hard work of that.
01:17:51 What do you make of Edward Snowden?
01:17:54 What have you learned from those documents?
01:17:56 What do you think of him?
01:18:00 In the long arc of history,
01:18:02 is Edward Snowden a hero or a villain?
01:18:05 I think he’s neither.
01:18:07 I have really complicated feelings about Edward Snowden.
01:18:12 On the one hand, I’m a journalist at heart
01:18:15 and more transparency is good.
01:18:19 And I’m grateful for the conversations
01:18:22 that we had in the post Snowden era
01:18:26 about the limits to surveillance
01:18:29 and how critical privacy is.
01:18:33 And when you have no transparency
01:18:35 and you don’t really know in that case
01:18:38 what our secret courts were doing,
01:18:41 how can you truly believe that our country
01:18:45 is taking our civil liberties seriously?
01:18:48 So on the one hand, I’m grateful
01:18:51 that he cracked open these debates.
01:18:54 On the other hand, when I walked into the storage closet
01:19:02 of classified NSA secrets,
01:19:05 I had just spent two years
01:19:09 covering Chinese cyber espionage almost every day.
01:19:14 And the sort of advancement of Russian attacks
01:19:19 that were just getting worse and worse and more destructive.
01:19:23 And there were no limits to Chinese cyber espionage
01:19:27 and Chinese surveillance of its own citizens.
01:19:30 And there seemed to be no limit
01:19:32 to what Russia was willing to do in terms of cyber attacks
01:19:37 and also in some cases assassinating journalists.
01:19:41 So when I walked into that room,
01:19:43 there was a part of me quite honestly
01:19:46 that was relieved to know that the NSA
01:19:50 was as good as I hoped they were.
01:19:53 And we weren’t using that knowledge to,
01:19:58 as far as I know, assassinate journalists.
01:20:03 We weren’t using our access
01:20:06 to take out pharmaceutical companies.
01:20:11 For the most part, we were using it for traditional espionage.
01:20:15 Now, that set of documents also set me
01:20:18 on the journey of my book because to me,
01:20:22 the American people’s reaction to the Snowden documents
01:20:26 was a little bit misplaced.
01:20:28 They were upset
01:20:29 about the phone call metadata collection program.
01:20:33 Angela Merkel, I think rightfully was upset
01:20:36 that we were hacking her cell phone.
01:20:39 But in sort of the spy eat spy world,
01:20:42 hacking world leaders cell phones
01:20:44 is pretty much what most spy agencies do.
01:20:47 And there wasn’t a lot that I saw in those documents
01:20:51 that was beyond what I thought a spy agency does.
01:20:56 And I think if there was another 9/11 tomorrow,
01:21:01 God forbid, we would all say, how did the NSA miss this?
01:21:05 Why weren’t they spying on those terrorists?
01:21:07 Why weren’t they spying on those world leaders?
01:21:10 And there’s some of that too.
01:21:13 But I think that there was great damage done
01:21:17 to the US’s reputation.
01:21:22 I think we really lost our halo
01:21:26 in terms of a protector of civil liberties.
01:21:30 And I think a lot of what was reported
01:21:33 was unfortunately reported in a vacuum.
01:21:36 That was my biggest gripe that we were always reporting,
01:21:41 the NSA has this program and here’s what it does.
01:21:45 And the NSA is in Angela Merkel’s cell phone
01:21:48 and the NSA can do this.
01:21:50 And no one was saying, and by the way,
01:21:57 China has been hacking into our pipelines
01:22:00 and they’ve been making off
01:22:01 with all of our intellectual property.
01:22:04 And Russia has been hacking into our energy infrastructure
01:22:07 and they’ve been using the same methods to spy on, track,
01:22:11 and in many cases, kill their own journalists.
01:22:13 And the Saudis have been doing this
01:22:15 to their own critics and dissidents.
01:22:17 And so you can’t talk about any of these countries
01:22:21 in isolation.
01:22:22 It is really spy eat spy out there.
01:22:25 And so I just have complicated feelings.
01:22:29 And the other thing is, and I’m sorry,
01:22:30 this is a little bit of a tangent,
01:22:32 but the amount of documents that we had,
01:22:37 like thousands of documents,
01:22:39 most of which were just crap,
01:22:41 but had people’s names on them.
01:22:46 Part of me wishes that those documents
01:22:48 had been released in a much more targeted, limited way.
01:22:53 It’s just a lot of it just felt like a PowerPoint
01:22:56 that was taken out of context.
01:23:00 And you just sort of wish
01:23:03 that there had been a little bit more thought
01:23:05 into what was released.
01:23:07 Because I think a lot of the impact from Snowden
01:23:10 was just the volume of the reporting.
01:23:13 But I think based on what I saw personally,
01:23:18 there was a lot of stuff that I just,
01:23:20 I don’t know why that particular thing got released.
01:23:24 As a whistleblower, what’s a better way to do it?
01:23:26 Because I mean, there’s fear,
01:23:28 it takes a lot of effort to do a more targeted release.
01:23:33 If there’s proper channels,
01:23:35 you’re afraid that those channels will be manipulated
01:23:38 by who do you trust.
01:23:41 What’s a better way to do this, do you think?
01:23:43 As a journalist, this is almost like a journalistic question.
01:23:46 Reveal some fundamental flaw in the system
01:23:49 without destroying the system.
01:23:50 I bring up, again, Mark Zuckerberg and Meta,
01:23:54 there was a whistleblower
01:23:57 that came out about Instagram internal studies.
01:24:02 And I’m also torn about how to feel about that whistleblower.
01:24:06 Because from a company perspective, that’s an open culture.
01:24:11 How can you operate successfully
01:24:13 if you have an open culture
01:24:14 where any one whistleblower can come out,
01:24:17 take a study out of context,
01:24:19 whether it represents the larger context or not,
01:24:22 and the press eats it up.
01:24:25 And then that creates a narrative
01:24:27 that is just like with the NSA,
01:24:30 you said it’s out of context, very targeted,
01:24:33 to where, well, Facebook is evil, clearly,
01:24:36 because of this one leak.
01:24:38 It’s really hard to know what to do there,
01:24:40 because we’re now in a society
01:24:42 that deeply distrusts institutions.
01:24:44 And so narratives by whistleblowers make that whistleblower
01:24:49 and their forthcoming book very popular.
01:24:52 And so there’s a huge incentive
01:24:54 to take stuff out of context and to tell stories
01:24:56 that don’t represent the full context, the full truth.
01:25:01 It’s hard to know what to do with that,
01:25:03 because then that forces Facebook and Meta and governments
01:25:06 to be much more conservative, much more secretive.
01:25:10 It’s like a race to the bottom, I don’t know.
01:25:14 I don’t know if you can comment on any of that,
01:25:16 how to be a whistleblower ethically and properly.
01:25:20 I don’t know, I mean, these are hard questions.
01:25:23 And even for myself, in some ways,
01:25:27 I think of my book as sort of blowing the whistle
01:25:31 on the underground zero day market.
01:25:33 But it’s not like I was in the market myself.
01:25:38 It’s not like I had access to classified data
01:25:41 when I was reporting out that book.
01:25:44 As I say in the book, listen,
01:25:46 I’m just trying to scrape the surface here,
01:25:49 so we can have these conversations before it’s too late.
01:25:53 And I’m sure there’s plenty in there
01:25:57 that someone who’s the US intelligence agencies’
01:26:01 preeminent zero day broker probably
01:26:04 has some voodoo doll of me out there.
01:26:05 And you’re never gonna get it 100%.
01:26:11 But I really applaud whistleblowers
01:26:14 like the whistleblower who blew the whistle
01:26:19 on the Trump call with Zelensky.
01:26:22 I mean, people needed to know about that,
01:26:25 that we were basically, in some ways,
01:26:27 blackmailing an ally to try to influence an election.
01:26:33 I mean, they went through the proper channels.
01:26:37 They weren’t trying to profit off of it, right?
01:26:39 There was no book that came out afterwards
01:26:42 from that whistleblower.
01:26:44 That whistleblower’s not like,
01:26:46 they went through the channels.
01:26:47 They’re not living in Moscow, let’s put it that way.
01:26:51 Can I ask you a question, you mentioned NSA,
01:26:53 one of the things that showed
01:26:56 is they’re pretty good at what they do.
01:27:00 Again, this is a touchy subject, I suppose,
01:27:03 but there’s a lot of conspiracy theories
01:27:06 about intelligence agencies.
01:27:08 From your understanding of intelligence agencies,
01:27:11 the CIA, NSA, and the equivalent of in other countries,
01:27:16 are they, one question, this could be a dangerous question,
01:27:20 are they competent, are they good at what they do?
01:27:24 And two, are they malevolent in any way?
01:27:30 Sort of, I recently had a conversation
01:27:32 about tobacco companies.
01:27:35 They kind of see their customers as dupes,
01:27:39 like they can just play games with people.
01:27:43 Conspiracy theories tell that similar story
01:27:46 about intelligence agencies,
01:27:48 that they’re interested in manipulating the populace
01:27:51 for whatever ends the powerful have in mind,
01:27:54 in dark, cigarette smoke and cigar smoke filled rooms.
01:28:03 What’s your sense?
01:28:04 Do these conspiracy theories have any truth to them?
01:28:11 Or are intelligence agencies, for the most part,
01:28:14 good for society?
01:28:15 Okay, well, that’s an easy one.
01:28:18 Is it?
01:28:19 No, I think it depends which intelligence agency.
01:28:23 Think about the Mossad.
01:28:25 They’ve killed every Iranian nuclear scientist they can
01:28:32 over the years, but have they delayed the time horizon
01:28:38 before Iran gets the bomb?
01:28:40 Yeah.
01:28:41 Have they probably staved off terror attacks
01:28:45 on their own citizens?
01:28:46 Yeah.
01:28:48 You know, none of these, intelligence is intelligence.
01:28:53 You know, you can’t just say like they’re malevolent
01:28:56 or they’re heroes.
01:28:58 You know, everyone I have met in this space
01:29:03 is not like the pound your chest patriot
01:29:07 that you see on the beach on the 4th of July.
01:29:11 A lot of them have complicated feelings
01:29:15 about their former employers.
01:29:17 Well, people at least at the NSA reminded me,
01:29:20 to do what we were accused of doing after Snowden,
01:29:25 to spy on Americans,
01:29:28 you have no idea the amount of red tape and paperwork
01:29:33 and bureaucracy it would have taken to do
01:29:38 what everyone thinks that we were supposedly doing.
01:29:42 But then, you know, we find out in the course
01:29:45 of the Snowden reporting about a program called LOVEINT,
01:29:49 where a couple of the NSA analysts were using their access
01:29:53 to spy on their ex-girlfriends.
01:29:55 So, you know, there’s an exception to every case.
01:29:59 Generally, I will probably get, you know,
01:30:05 accused of my Western bias here again,
01:30:07 but I think you can barely compare
01:30:15 some of these Western intelligence agencies
01:30:17 to China, for instance,
01:30:19 and the surveillance that they’re deploying on the Uyghurs,
01:30:26 the level they’re deploying it at.
01:30:28 And the surveillance they’re starting to export abroad
01:30:32 with some of the programs,
01:30:33 like the watering hole attack I mentioned earlier,
01:30:35 where it’s not just hitting the Uyghurs inside China,
01:30:38 it’s hitting anyone interested
01:30:40 in the Uyghur plight outside China.
01:30:42 I mean, it could be an American high school student
01:30:44 writing a paper on the Uyghurs.
01:30:46 They wanna spy on that person too.
01:30:49 You know, there’s no rules in China
01:30:51 really limiting the extent of that surveillance.
01:30:55 And we all better pay attention to what’s happening
01:30:59 with the Uyghurs because just as Ukraine has been to Russia
01:31:04 in terms of a test kitchen for its cyber attacks,
01:31:08 the Uyghurs are China’s test kitchen for surveillance.
01:31:12 And there’s no doubt in my mind
01:31:15 that they’re testing them on the Uyghurs.
01:31:17 Uyghurs are their Petri dish,
01:31:19 and eventually they will export
01:31:21 that level of surveillance overseas.
01:31:23 I mean, in 2015,
01:31:27 Obama and Xi Jinping reached a deal
01:31:31 where basically the White House said,
01:31:34 you better cut it out on intellectual property theft.
01:31:38 And so they made this agreement
01:31:40 that they would not hack each other for commercial benefit.
01:31:43 And for a period of about 18 months,
01:31:45 we saw this huge drop off in Chinese cyber attacks
01:31:49 on American companies.
01:31:50 But some of them continued.
01:31:53 Where did they continue?
01:31:54 They continued on aviation companies,
01:31:58 on hospitality companies like Marriott.
01:32:02 Why?
01:32:02 Because that was still considered fair game to China.
01:32:05 It wasn’t IP theft they were after.
01:32:07 They wanted to know who was staying in this city
01:32:11 at this time when Chinese citizens were staying there
01:32:15 so they could cross match for counterintelligence
01:32:17 who might be a likely Chinese spy.
01:32:20 I’m sure we’re doing some of that too.
01:32:22 Counterintelligence is counterintelligence.
01:32:24 It’s considered fair game.
01:32:27 But where I think it gets evil
01:32:30 is when you use it for censorship,
01:32:34 to suppress any dissent,
01:32:37 to do what I’ve seen the UAE do to its citizens
01:32:41 where people who’ve gone on Twitter
01:32:44 just to advocate for better voting rights,
01:32:47 more enfranchisement,
01:32:49 suddenly find their passports confiscated.
01:32:53 You know, I talked to one critic, Ahmed Mansoor,
01:32:57 and he told me,
01:32:58 you know, you might find yourself
01:33:01 labeled a terrorist one day,
01:33:02 when you don’t even know how to operate a gun.
01:33:04 I mean, he had been beaten up
01:33:06 every time he tried to go somewhere.
01:33:07 His passport had been confiscated.
01:33:09 By that point, it turned out
01:33:10 they’d already hacked into his phone
01:33:12 so they were listening to us talking.
01:33:14 They’d hacked into his baby monitor
01:33:16 so they’re spying on his child.
01:33:18 And they stole his car.
01:33:22 And then they created a new law
01:33:24 that you couldn’t criticize the ruling family
01:33:27 or the ruling party on Twitter.
01:33:29 And he’s been in solitary confinement every day since
01:33:32 on hunger strike.
01:33:34 So that’s evil, you know, that’s evil.
01:33:37 And we still, we don’t do that here.
01:33:40 You know, we have rules here.
01:33:42 We don’t cross that line.
01:33:44 So yeah, in some cases, like I won’t go to Dubai.
01:33:48 You know, I won’t go to Abu Dhabi.
01:33:49 If I ever want to go to the Maldives,
01:33:51 like too bad, like most of the flights go through Dubai.
01:33:54 So there’s some lines we’re not willing to cross.
01:33:57 But then again, just like you said,
01:33:58 there’s individuals within NSA, within CIA,
01:34:02 and they may have power.
01:34:05 And to me, there’s levels of evil.
01:34:07 To me personally, this is the stuff of conspiracy theories,
01:34:11 is the things you’ve mentioned as evil
01:34:13 are more direct attacks.
01:34:16 But there’s also psychological warfare.
01:34:19 So blackmail.
01:34:20 So what does spying allow you to do?
01:34:25 Allow you to collect information
01:34:27 if you have something that’s embarrassing.
01:34:30 Or if you have like Jeffrey Epstein conspiracy theories,
01:34:33 active, what is it, manufacture of embarrassing things.
01:34:38 And then use blackmail to manipulate the population
01:34:41 or all the powerful people involved.
01:34:42 It troubles me deeply that MIT allowed somebody
01:34:45 like Jeffrey Epstein in their midst,
01:34:48 especially some of the scientists I admire
01:34:51 that they would hang out with that person at all.
01:34:54 And so I’ll talk about it sometimes.
01:34:59 And then a lot of people tell me,
01:35:00 well, obviously Jeffrey Epstein is a front for intelligence.
01:35:04 And I just, I struggle to see that level of competence
01:35:09 and malevolence.
01:35:10 But, you know, who the hell am I?
01:35:17 And I guess I was trying to get to that point.
01:35:21 You said that there’s bureaucracy and so on,
01:35:23 which makes some of these things very difficult.
01:35:25 I wonder how much malevolence,
01:35:27 how much competence there is in these institutions.
01:35:31 Like how far, this takes us back to the hacking question.
01:35:34 How far are people willing to go if they have the power?
01:35:39 This has to do with social engineering.
01:35:41 This has to do with hacking.
01:35:42 This has to do with manipulating people,
01:35:45 attacking people, doing evil onto people,
01:35:47 psychological warfare and stuff like that.
01:35:50 I don’t know.
01:35:51 I believe that most people are good.
01:35:54 And I don’t think that kind of thing is possible in a free society.
01:35:59 There’s something that happens
01:36:00 when you have a centralized government
01:36:02 where power corrupts over time
01:36:05 and you start surveillance programs
01:36:08 kind of, it’s like a slippery slope
01:36:12 that over time starts to both use fear
01:36:16 and direct manipulation to control the populace.
01:36:20 But in a free society, I just,
01:36:23 it’s difficult for me to imagine
01:36:25 that you can have somebody like a Jeffrey Epstein
01:36:27 as a front for intelligence.
01:36:29 I don’t know what I’m asking you, but I’m just,
01:36:34 I have a hope that for the most part,
01:36:36 intelligence agencies are trying to do good
01:36:39 and are actually doing good for the world
01:36:43 when you view it in the full context
01:36:45 of the complexities of the world.
01:36:51 But then again, if they’re not, would we know?
01:36:55 That’s why Edward Snowden might be a good thing.
01:36:58 Let me ask you on a personal question.
01:37:00 You have investigated some of the most powerful
01:37:02 organizations and people in the world
01:37:04 of cyber warfare, cyber security.
01:37:07 Are you ever afraid for your own life,
01:37:09 your own wellbeing, digital or physical?
01:37:13 I mean, I’ve had my moments.
01:37:15 You know, I’ve had our security team at the Times
01:37:20 call me at one point and say,
01:37:21 someone’s on the dark web offering good money
01:37:25 to anyone who can hack your phone or your laptop.
01:37:30 I describe in my book how when I was at that
01:37:33 hacking conference in Argentina and I came back
01:37:35 and I brought a burner laptop with me,
01:37:38 but I’d kept it in the safe anyway
01:37:40 and it didn’t have anything on it,
01:37:42 but someone had broken in and it was moved.
01:37:46 You know, I’ve had all sorts of sort of scary moments.
01:37:52 And then I’ve had moments where I think I went
01:37:55 just way too far into the paranoid side.
01:37:58 I mean, I remember writing about the Times hack by China
01:38:04 and I just covered a number of Chinese cyber attacks
01:38:07 where they’d gotten into the thermostat
01:38:10 at someone’s corporate apartment
01:38:11 and they’d gotten into all sorts of stuff.
01:38:15 And I was living by myself.
01:38:17 I was single in San Francisco and my cable box
01:38:23 on my television started making some weird noises
01:38:25 in the middle of the night.
01:38:26 And I got up and I ripped it out of the wall
01:38:29 and I think I said something like embarrassing,
01:38:32 like, fuck you China, you know.
01:38:33 And then I went back to bed and I woke up
01:38:39 and it’s like beautiful morning light.
01:38:41 I mean, I’ll never forget it.
01:38:42 Like this is like glimmering morning light
01:38:44 is shining on my cable box, which has now been ripped out
01:38:48 and is sitting on my floor and like the morning light.
01:38:50 And I was just like, no, no, no,
01:38:53 like I’m not going down that road.
01:38:56 Like you basically, I came to a fork in the road
01:39:03 where I could either go full tinfoil hat,
01:39:06 go live off the grid, never have a car with navigation,
01:39:10 never use Google maps, never own an iPhone,
01:39:12 never order diapers off Amazon, you know, create an alias
01:39:17 or I could just do the best I can
01:39:22 and live in this new digital world we’re living in.
01:39:26 And what does that look like for me?
01:39:28 I mean, what are my crown jewels?
01:39:30 This is what I tell people, what are your crown jewels?
01:39:32 Cause just focus on that.
01:39:34 You can’t protect everything,
01:39:35 but you can protect your crown jewels.
01:39:37 For me, for the longest time,
01:39:39 my crown jewels were my sources.
01:39:42 I was nothing without my sources.
01:39:44 So I had some sources, I would meet at the same dim sum place
01:39:49 or maybe it was a different restaurant on the same date,
01:39:53 you know, every quarter and we would never drive there.
01:39:59 We would never Uber there.
01:40:00 We wouldn’t bring any devices.
01:40:02 I could bring a pencil and a notepad.
01:40:05 And if someone wasn’t in town,
01:40:07 like there were a couple of times where I’d show up
01:40:09 and the source never came,
01:40:11 but we never communicated digitally.
01:40:14 And those were the lengths I was willing to go
01:40:16 to protect that source, but you can’t do it for everyone.
01:40:19 So for everyone else, you know, it was signal,
01:40:22 using two factor authentication,
01:40:24 you know, keeping my devices up to date,
01:40:26 not clicking on phishing emails, using a password manager,
01:40:30 all the things that we know we’re supposed to do.
01:40:34 And that’s what I tell everyone, like don’t go crazy
01:40:37 because then that’s like the ultimate hack.
01:40:39 Then they’ve hacked your mind, whoever they is for you.
01:40:43 But just do the best you can.
01:40:45 Now, my whole risk model changed when I had a kid.
01:40:50 You know, now it’s, oh God, you know,
01:40:54 if anyone threatened my family, God help them.
01:40:59 But it changes you.
01:41:07 And, you know, unfortunately there are some things,
01:41:12 like I was really scared to go deep on,
01:41:15 like Russian cyber crime, you know, like Putin himself,
01:41:19 you know, and it’s interesting.
01:41:21 Like I have a mentor who’s an incredible person
01:41:24 who was the Times Moscow Bureau Chief during the Cold War.
01:41:29 And after I wrote a series of stories
01:41:32 about Chinese cyber espionage, he took me out to lunch.
01:41:35 And he told me that when he was living in Moscow,
01:41:37 he would drop his kids off at preschool
01:41:40 when they were my son’s age now.
01:41:42 And the KGB would follow him
01:41:44 and they would make a really like loud show of it.
01:41:48 You know, they’d tail him, they’d, you know, honk,
01:41:51 they’d just be, make a ruckus.
01:41:55 And he said, you know what, they never actually did anything
01:41:57 but they wanted me to know that they were following me
01:42:00 and I operated accordingly.
01:42:03 And he says, that’s how you should operate
01:42:05 in the digital world.
01:42:08 Know that there are probably people following you.
01:42:11 Sometimes they’ll make a little bit of noise.
01:42:14 But one thing you need to know is that
01:42:17 while you’re at the New York Times,
01:42:18 you have a little bit of an invisible shield on you.
01:42:21 You know, if something were to happen to you,
01:42:23 that would be a really big deal.
01:42:25 That would be an international incident.
01:42:27 So I kind of carried that invisible shield with me
01:42:29 for years.
01:42:31 And then Jamal Khashoggi happened.
01:42:34 And that destroyed my vision of my invisible shield.
01:42:38 You know, sure, you know, he was a Saudi
01:42:41 but he was a Washington Post columnist.
01:42:44 You know, for the most part,
01:42:46 he was living in the United States.
01:42:47 He was a journalist.
01:42:49 And for them to do what they did to him,
01:42:52 pretty much in the open and get away with it,
01:42:57 and for the United States to let them get away with it
01:43:01 because we wanted to preserve diplomatic relations
01:43:04 with the Saudis,
01:43:06 that really threw my worldview upside down.
01:43:10 And, you know, I think that sent a message
01:43:13 to a lot of countries
01:43:15 that it was sort of open season on journalists.
01:43:19 And to me, that was one of the most destructive things
01:43:22 that happened under the previous administration.
01:43:27 And, you know, I don’t really know
01:43:30 what to think of my invisible shield anymore.
01:43:32 Like you said, that really worries me
01:43:33 on the journalism side that people would be afraid
01:43:36 to dig deep on fascinating topics.
01:43:41 And, you know, I have my own,
01:43:47 part of the reason, like I would love to have kids,
01:43:50 I would love to have a family.
01:43:52 Part of the reason I’m a little bit afraid,
01:43:56 there’s many ways to phrase this,
01:43:57 but the loss of freedom in the way of doing
01:44:02 all the crazy shit that I naturally do,
01:44:04 which I would say the ethic of journalism
01:44:07 is, kind of, doing crazy shit
01:44:09 without really thinking about it.
01:44:11 This is letting your curiosity
01:44:14 really allow you to be free and explore.
01:44:18 It’s, I mean, whether it’s stupidity or fearlessness,
01:44:22 whatever it is, that’s what great journalism is.
01:44:25 And all the concerns about security risks
01:44:30 have made me like become a better person.
01:44:32 The way I approach it is just make sure
01:44:35 you don’t have anything to hide.
01:44:37 I know this is not a thing.
01:44:38 This is not an approach to security.
01:44:41 I’m just, this is like a motivational speech or something.
01:44:44 It’s just like, you can lose,
01:44:47 you can be hacked at any moment.
01:44:49 Just don’t be a douchebag secretly.
01:44:52 Just be like a good person.
01:44:54 Because then, I see this actually
01:44:56 with social media in general.
01:45:00 Just present yourself in the most authentic way possible,
01:45:03 meaning be the same person online as you are privately.
01:45:06 Have nothing to hide.
01:45:08 That’s one, not the only, but one of the ways
01:45:11 to achieve security.
01:45:14 Maybe I’m totally wrong on this,
01:45:15 but don’t be secretly weird.
01:45:19 If you’re weird, be publicly weird
01:45:21 so it’s impossible to blackmail you.
01:45:24 That’s my approach to security.
01:45:25 Yeah, well, they call it
01:45:26 the New York Times front page phenomenon.
01:45:29 Don’t put anything in email or I guess social media
01:45:32 these days that you wouldn’t want to read
01:45:35 on the front page of the New York Times.
01:46:37 And that works, but sometimes,
01:46:41 I mean, I have not as many followers as you,
01:46:45 but a lot of followers,
01:46:47 and sometimes even I get carried away.
01:45:49 Just be emotional and stuff and say something.
01:45:51 Yeah, I mean, just the cortisol response on Twitter.
01:45:57 Twitter is basically designed to elicit those responses.
01:46:01 I mean, every day I turn on my computer,
01:46:04 I look at my phone, I look at what’s trending on Twitter,
01:46:07 and it’s like, what are the topics
01:46:10 that are gonna make people the most angry today?
01:46:13 You know?
01:46:14 And you know, it’s easy to get carried away,
01:46:19 but it’s also just, that sucks too,
01:46:22 that you have to be constantly censoring yourself.
01:46:25 And maybe it’s for the better.
01:46:26 Maybe you can’t be a secret asshole,
01:46:29 and we can put that in the good bucket.
01:46:31 But at the same time, you know,
01:46:33 there is a danger to that other voice,
01:46:39 to creativity, you know, to being weird.
01:46:43 There’s a danger to that little whispered voice
01:46:45 that’s like, well, how would people read that?
01:46:48 You know, how could that be manipulated?
01:46:51 How could that be used against you?
01:46:53 And that stifles creativity and innovation and free thought.
01:47:00 And you know, that is on a very micro level.
01:47:06 And that’s something I think about a lot.
01:47:08 And that’s actually something that Tim Cook
01:47:11 has talked about a lot,
01:47:13 and why he has said he goes full force on privacy
01:47:17 is it’s just that little voice
01:47:20 that is at some level censoring you.
01:47:24 And what is sort of the long term impact
01:47:28 of that little voice over time?
01:47:31 I think there’s a way. I think that self censorship
01:47:35 is an attack vector that there’s solutions to.
01:47:37 The way I’m really inspired by Elon Musk,
01:47:40 the solution to that is just be privately
01:47:43 and publicly the same person and be ridiculous.
01:47:46 Embrace the full weirdness and show it more and more.
01:47:49 So, you know, that’s memes that has like ridiculous humor.
01:47:54 And I think, if there is something
01:47:57 you really wanna hide, deeply consider
01:48:00 whether you wanna be that.
01:48:03 Like, why are you hiding it?
01:48:05 What exactly are you afraid of?
01:48:07 Because I think my hopeful vision for the internet
01:48:10 is the internet loves authenticity.
01:48:13 They wanna see you weird, so be that and like live that fully
01:48:18 because I think that gray area
01:48:20 where you’re kind of censoring yourself,
01:48:22 that’s where the destruction is.
01:48:25 You have to go all the way, step over, be weird.
01:48:28 And then it feels, it can be painful
01:48:31 because people can attack you and so on, but just ride it.
01:48:33 I mean, that’s just like a skill
01:48:35 on the social psychological level
01:48:38 that ends up being an approach to security,
01:48:42 which is like remove the attack vector
01:48:45 of having private information
01:48:46 by being your full weird self publicly.
01:48:51 What advice would you give to young folks today,
01:48:55 you know, operating in this complicated space
01:49:00 about how to have a successful life,
01:49:02 a life they can be proud of,
01:49:03 a career they can be proud of?
01:49:07 Maybe somebody in high school and college
01:49:09 thinking about what they’re going to do.
01:49:11 Be a hacker, you know, if you have any interest,
01:49:15 become a hacker and apply yourself to defense, you know.
01:49:19 Every time, like we do have
01:49:21 these amazing scholarship programs, for instance,
01:49:24 where, you know, they find you early,
01:49:26 they’ll pay your college as long as you commit
01:49:30 to some kind of federal commitment
01:49:32 to sort of help federal agencies with cybersecurity.
01:49:35 And where does everyone wanna go every year
01:49:37 from the scholarship program?
01:49:39 They wanna go work at the NSA or Cyber Command, you know.
01:49:42 They wanna go work on offense.
01:49:44 They wanna go do the sexy stuff.
01:49:46 It’s really hard to get people to work on defense.
01:49:49 It’s just, it’s always been more fun
01:49:51 to be a pirate than be in the Coast Guard, you know.
01:49:54 And so we have a huge deficit
01:49:59 when it comes to filling those roles.
01:50:01 There’s 3.5 million unfilled cybersecurity positions
01:50:06 around the world.
01:50:08 I mean, talk about job security,
01:50:09 like be a hacker and work on cybersecurity.
01:50:12 You will always have a job.
01:50:15 And we’re actually at a huge deficit
01:50:18 and disadvantage as a free market economy
01:50:22 because we can’t match cybersecurity salaries
01:50:26 at Palantir or Facebook or Google or Microsoft.
01:50:30 And so it’s really hard for the United States
01:50:32 to fill those roles.
01:50:33 And, you know, other countries have had this work around
01:50:38 where they basically have forced conscription on some level.
01:50:41 You know, China tells people,
01:50:43 like you do whatever you’re gonna do during the day,
01:50:46 work at Alibaba.
01:50:48 You know, if you need to do some ransomware, okay.
01:50:51 But the minute we tap you on the shoulder
01:50:53 and ask you to come do this sensitive operation for us,
01:50:57 the answer is yes.
01:50:59 You know, same with Russia.
01:51:00 You know, a couple of years ago when Yahoo was hacked
01:51:03 and they laid it all out in an indictment,
01:51:05 it came down to two cyber criminals
01:51:07 and two guys from the FSB.
01:51:09 Cyber criminals were allowed to have their fun,
01:51:12 but the minute they came across the username and password
01:51:14 for someone’s personal Yahoo account
01:51:16 that worked at the White House or the State Department
01:51:19 or military, they were expected to pass that over to the FSB.
01:51:23 So we don’t do that here.
01:51:24 And it’s even worse on defense.
01:51:27 We really can’t fill these positions.
01:51:29 So, you know, if you are a hacker,
01:51:33 if you’re interested in code,
01:51:34 if you’re a tinkerer, you know, learn how to hack.
01:51:39 There are all sorts of amazing hacking competitions
01:51:42 you can do through the SANS org, for example, S A N S.
01:51:48 And then use those skills for good.
01:51:50 You know, neuter the bugs in that code
01:51:53 that get used by autocratic regimes
01:51:56 to make people’s lives, you know, a living prison.
01:52:00 You know, plug those holes.
01:52:01 You know, defend industrial systems,
01:52:03 defend our water treatment facilities
01:52:06 from hacks where people are trying to come in
01:52:07 and poison the water.
01:52:09 You know, that I think is just an amazing,
01:52:14 it’s an amazing job on so many levels.
01:52:16 It’s intellectually stimulating.
01:52:19 You can tell yourself you’re serving your country.
01:52:22 You can tell yourself you’re saving lives
01:52:24 and keeping people safe.
01:52:26 And you’ll always have amazing job security.
01:52:28 And if you need to go get that job that pays you,
01:52:31 you know, 2 million bucks a year, you can do that too.
01:52:33 And you can have a public profile,
01:52:34 more so of a public profile, you can be a public rockstar.
01:52:38 I mean, it’s the same thing as sort of the military.
01:52:42 There’s a lot of,
01:52:46 there’s a lot of well known sort of people
01:52:49 commenting on the fact that veterans
01:52:51 are not treated as well as they should be.
01:52:54 But it’s still the fact that soldiers
01:52:56 are deeply respected for defending the country,
01:53:00 the freedoms, the ideals that we stand for.
01:53:02 And in the same way, I mean, in some ways,
01:53:05 the cybersecurity defense are the soldiers of the future.
01:53:09 Yeah, and you know what’s interesting,
01:53:10 I mean, in cybersecurity, the difference is,
01:53:14 oftentimes you see the more interesting threats
01:53:17 in the private sector, because that’s where the attacks come.
01:53:20 You know, when cyber criminals
01:53:22 and nation state adversaries come for the United States,
01:53:25 they don’t go directly for Cyber Command or the NSA.
01:53:29 You know, they go for banks, they go for Google,
01:53:32 they go for Microsoft, they go for critical infrastructure.
01:53:36 And so those companies, those private sector companies
01:53:39 get to see some of the most advanced,
01:53:41 sophisticated attacks out there.
01:53:45 And you know, if you’re working at FireEye
01:53:48 and you’re calling out the SolarWinds attack, for instance,
01:53:51 I mean, you just saved God knows how many systems
01:53:56 from, you know, that compromise turning into something
01:53:59 that more closely resembles sabotage.
01:54:03 So, you know, go be a hacker, or go be a journalist.
01:54:08 So you wrote the book,
01:54:13 This Is How They Tell Me The World Ends,
01:54:15 as we’ve been talking about,
01:54:17 of course, referring to cyber war, cybersecurity.
01:54:21 What gives you hope about the future of our world
01:54:25 if it doesn’t end?
01:54:26 How will it not end?
01:54:31 That’s a good question.
01:54:32 I mean, I have to have hope, right?
01:54:34 Because I have a kid and I have another on the way,
01:54:37 and if I didn’t have hope, I wouldn’t be having kids.
01:54:42 But it’s a scary time to be having kids.
01:54:46 And you know, it’s like pandemic, climate change,
01:54:50 disinformation, increasingly advanced, perhaps deadly
01:54:55 cyber attacks.
01:54:57 What gives me hope is that I share your worldview
01:55:01 that I think people are fundamentally good.
01:55:04 And sometimes, and this is why the metaverse
01:55:07 scares me to death, but when I’m reminded of that
01:55:10 is not online.
01:55:13 Like online, I get the opposite.
01:55:15 You know, you start to lose hope and humanity
01:55:17 when you’re on Twitter half your day.
01:55:19 It’s like when I go to the grocery store
01:55:22 or I go on a hike or like someone smiles at me
01:55:27 or you know, or someone just says something nice.
01:55:31 You know, people are fundamentally good.
01:55:33 We just don’t hear from those people enough.
01:55:37 And my hope is, I just think our current political climate,
01:55:42 like we’ve hit rock bottom.
01:55:44 This is as bad as it gets.
01:55:46 We can’t do anything.
01:55:47 Don’t jinx it.
01:55:49 But I think it’s a generational thing.
01:55:51 You know, I think baby boomers, like it’s time to move along.
01:55:57 I think it’s time for a new generation to come in.
01:56:01 And I actually have a lot of hope when I look at you.
01:56:06 I’m sort of like this, I guess they call me
01:56:08 a geriatric millennial or a young Gen Xer.
01:56:12 But like we have this unique responsibility
01:56:14 because I grew up without the internet
01:56:17 and without social media, but I’m native to it.
01:56:20 So I know the good and I know the bad.
01:56:25 And that’s true on so many different things.
01:56:28 You know, I grew up without climate change anxiety
01:56:32 and now I’m feeling it and I know it’s not a given.
01:56:34 We don’t have to just resign ourselves to climate change.
01:56:39 You know, same with disinformation.
01:56:41 And I think a lot of the problems we face today
01:56:44 have just exposed the sort of inertia
01:56:47 that there has been on so many of these issues.
01:56:49 And I really think it’s a generational shift
01:56:52 that has to happen.
01:56:54 And I think this next generation is gonna come in
01:56:57 and say like, we’re not doing business
01:56:59 like you guys did it anymore.
01:57:00 You know, we’re not just gonna like rape
01:57:02 and pillage the earth and try and turn everyone
01:57:05 against each other and play dirty tricks
01:57:07 and let lobbyists dictate what we do
01:57:11 or don’t do as a country anymore.
01:57:14 And that’s really where I see the hope.
01:57:16 It feels like there’s a lot of low hanging fruit
01:57:19 for young minds to step up and create solutions and lead.
01:57:23 So whenever like politicians or leaders that are older,
01:57:30 like you said, are acting shitty, I see that as a positive.
01:57:34 They’re inspiring a large number of young people
01:57:38 to replace them.
01:57:39 And so I think you’re right, there’s going to be,
01:57:42 it’s almost like you need people to act shitty
01:57:45 to remind them, oh, wow, we need good leaders.
01:57:47 We need great creators and builders and entrepreneurs
01:57:51 and scientists and engineers and journalists.
01:57:54 You know, all the discussions about how the journalism
01:57:56 is quote unquote broken and so on,
01:57:58 that’s just an inspiration for new institutions to rise up
01:58:02 that do journalism better,
01:58:03 new journalists to step up and do journalism better.
01:58:06 So I, and I’ve been constantly,
01:58:08 when I talk to young people, I’m constantly impressed
01:58:11 by the ones that dream to build solutions.
01:58:16 And so that’s ultimately why I put the hope.
01:58:21 But the world is a messy place,
01:58:23 like we’ve been talking about, it’s a scary place.
01:58:27 Yeah, and I think you hit something,
01:58:29 hit on something earlier, which is authenticity.
01:58:33 Like no one who is plastic is going to rise above anymore.
01:58:40 You know, people are craving authenticity.
01:58:43 You know, the benefit of the internet is it’s really hard
01:58:46 to hide who you are on every single platform.
01:58:49 You know, on some level it’s gonna come out
01:58:51 who you really are.
01:58:53 And so you hope that, you know,
01:58:57 by the time my kids are grown,
01:58:59 like no one’s gonna care if they made one mistake online,
01:59:04 so long as they’re authentic, you know?
01:59:07 And I used to worry about this.
01:59:09 My nephew was born the day I graduated from college.
01:59:13 And I just always, you know, he’s like born into Facebook.
01:59:17 And I just think like, how is a kid like that
01:59:21 ever gonna be president of the United States of America?
01:59:24 Because if Facebook had been around when I was in college,
01:59:27 you know, like Jesus, you know,
01:59:31 how are those kids ever gonna be president?
01:59:34 There’s gonna be some photo of them at some point
01:59:37 making some mistake, and that’s gonna be all over for them.
01:59:41 And now I take that back.
01:59:43 Now it’s like, no, everyone’s gonna make mistakes.
01:59:46 There’s gonna be a picture for everyone.
01:59:49 And we’re all gonna have to come around and grow up
01:59:53 to the view that as humans, we’re gonna make huge mistakes.
01:59:56 And hopefully they’re not so big
01:59:58 that they’re gonna ruin the rest of your life.
02:00:00 But we’re gonna have to come around to this view
02:00:02 that we’re all human.
02:00:04 And we’re gonna have to be a little bit more forgiving
02:00:07 and a little bit more tolerant when people mess up.
02:00:10 And we’re gonna have to be a little bit more humble
02:00:12 when we do, and like keep moving forward.
02:00:15 Otherwise you can’t like cancel everyone, you know?
02:00:18 Nicole, this is an incredible, hopeful conversation.
02:00:21 Also, one that reveals that in the shadows
02:00:28 there’s a lot of challenges to be solved.
02:00:30 So I really appreciate that you took on
02:00:32 this really difficult subject with your book.
02:00:34 That’s journalism at its best.
02:00:35 So I’m really grateful that you took the risk
02:00:38 that you took that on,
02:00:40 and that you plugged the cable box back in.
02:00:42 That means you have hope.
02:00:43 And thank you so much for spending
02:00:46 your valuable time with me today.
02:00:47 Thank you, thanks for having me.
02:00:49 Thanks for listening to this conversation
02:00:52 with Nicole Perlroth.
02:00:53 To support this podcast,
02:00:54 please check out our sponsors in the description.
02:00:57 And now let me leave you with some words
02:01:00 from Nicole herself.
02:01:01 Here we are, entrusting our entire digital lives,
02:01:05 passwords, texts, love letters, banking records,
02:01:09 health records, credit cards, sources,
02:01:10 and deepest thoughts to this mystery box
02:01:14 whose inner circuitry most of us would never vet.
02:01:17 Run by code written in a language most of us
02:01:20 will never fully understand.
02:01:22 Thank you for listening and hope to see you next time.