Transcript
00:00:00 The jury found Pfizer guilty of fraud
00:00:02 and racketeering violations.
00:00:04 How does Big Pharma affect your mind?
00:00:06 Everyone’s allowed their own opinion.
00:00:08 I don’t think everyone’s allowed their own scientific facts.
00:00:11 Does Pfizer play by the rules?
00:00:13 Pfizer isn’t battling the FDA.
00:00:16 Pfizer has joined the FDA.
00:00:21 The following is a conversation with John Abramson,
00:00:24 faculty at Harvard Medical School,
00:00:26 a family physician for over two decades,
00:00:29 and author of the new book, Sickening,
00:00:32 about how Big Pharma broke American healthcare
00:00:35 and how we can fix it.
00:00:37 This conversation with John Abramson
00:00:40 is a critical exploration of the pharmaceutical industry.
00:00:43 I wanted to talk to John
00:00:45 in order to provide a countervailing perspective
00:00:48 to the one expressed in my podcast episode
00:00:50 with the CEO of Pfizer, Albert Bourla.
00:00:55 And here, please allow me to say a few additional words
00:00:58 about this episode with the Pfizer CEO,
00:01:01 and in general, about why I do these conversations
00:01:04 and how I approach them.
00:01:06 If this is not interesting to you, please skip ahead.
00:01:10 What do I hope to do with this podcast?
00:01:13 I want to understand human nature,
00:01:15 the best and the worst of it.
00:01:18 I want to understand how power, money,
00:01:19 and fame changes people.
00:01:21 I want to understand why atrocities are committed
00:01:24 by crowds that believe they’re doing good.
00:01:27 All this, ultimately, because I want to understand
00:01:30 how we can build a better world together,
00:01:33 to find hope for the future,
00:01:35 and to rediscover each time,
00:01:38 through the exploration of ideas,
00:01:40 just how beautiful this life is.
00:01:43 This, our human civilization,
00:01:45 in all of its full complexity,
00:01:47 the forces of good and evil,
00:01:49 of war and peace, of hate and love.
00:01:53 I don’t think I can do this with a heart and mind
00:01:55 that is not open, fragile, and willing to empathize
00:01:59 with all human beings,
00:02:00 even those in the darkest corners of our world.
00:02:04 To attack is easy.
00:02:06 To understand is hard.
00:02:09 And I choose the hard path.
00:02:11 I have learned over the past few months
00:02:13 that this path involves me getting more and more attacked
00:02:17 from all sides.
00:02:19 I will get attacked when I host people
00:02:21 like Jay Bhattacharya or Francis Collins,
00:02:24 Jamie Metzl or Vincent Racaniello,
00:02:28 when I stand for my friend, Joe Rogan,
00:02:31 when I host tech leaders like Mark Zuckerberg,
00:02:34 Elon Musk, and others,
00:02:36 when I eventually talk to Vladimir Putin, Barack Obama,
00:02:40 and other figures that have turned the tides of history.
00:02:44 I have and I will get called stupid, naive, weak,
00:02:50 and I will take these words
00:02:52 with respect, humility, and love, and I will get better.
00:02:57 I will listen, think, learn, and improve.
00:03:00 One thing I can promise is there’s no amount of money
00:03:04 or fame that can buy my opinion
00:03:06 or make me go against my principles.
00:03:09 There’s no amount of pressure that can break my integrity.
00:03:13 There’s nothing in this world I need
00:03:16 that I don’t already have.
00:03:18 Life itself is the fundamental gift.
00:03:21 Everything else is just the bonus.
00:03:24 That is freedom.
00:03:26 That is happiness.
00:03:28 If I die today, I will die a happy man.
00:03:33 Now, a few comments about my approach
00:03:35 and lessons learned from the Albert Bourla conversation.
00:03:39 The goal was to reveal as much as I could
00:03:41 about the human being before me
00:03:43 and to give him the opportunity to contemplate in long form
00:03:48 the complexities of his role,
00:03:49 including the tension between making money
00:03:53 and helping people, the corruption
00:03:55 that so often permeates human institutions,
00:03:58 the crafting of narratives through advertisements,
00:04:00 and so on.
00:04:02 I only had one hour,
00:04:04 and so this wasn’t the time to address these issues deeply
00:04:07 but to show if Albert struggled with them
00:04:10 in the privacy of his own mind,
00:04:12 and if he would let down the veil of political speak
00:04:16 for a time to let me connect with a man
00:04:19 who decades ago chose to become a veterinarian,
00:04:22 who wanted to help lessen the amount of suffering
00:04:24 in the world.
00:04:26 I had no pressure placed on me.
00:04:28 There were no rules.
00:04:29 The questions I was asking were all mine
00:04:32 and not seen by Pfizer folks.
00:04:34 I had no care whether I ever talked to another CEO again.
00:04:38 None of this was part of the calculation
00:04:41 in my limited brain computer.
00:04:44 I didn’t want to grill him
00:04:45 the way politicians grill CEOs in Congress.
00:04:48 I thought that this approach is easy,
00:04:51 self-serving, dehumanizing, and it reveals nothing.
00:04:56 I wanted to reveal the genuine intellectual struggle,
00:04:59 vision, and motivation of a human being,
00:05:01 and if that fails, I trusted the listener
00:05:04 to draw their own conclusion and insights from the result,
00:05:08 whether it’s the words spoken
00:05:10 or the words left unspoken or simply the silence.
00:05:14 And that’s just it.
00:05:15 I fundamentally trust the intelligence of the listener, you.
00:05:21 In fact, if I criticize the person too hard
00:05:24 or celebrate the person too much,
00:05:26 I feel I fail to give the listener
00:05:29 a picture of the human being that is uncontaminated
00:05:32 by my opinion or the opinion of the crowd.
00:05:36 I trust that you have the fortitude and the courage
00:05:39 to use your own mind, to empathize, and to think.
00:05:43 Two practical lessons I took away.
00:05:46 First, I will more strongly push
00:05:48 for longer conversations of three, four, or more hours
00:05:52 versus just one hour.
00:05:53 60 minutes is too short for the guest to relax
00:05:56 and to think slowly and deeply,
00:05:58 and for me to ask many follow up questions
00:06:00 or follow interesting tangents.
00:06:03 Ultimately, I think it’s in the interest of everyone,
00:06:05 including the guest, that we talk in true long form
00:06:09 for many hours.
00:06:11 Second, these conversations with leaders
00:06:13 can be aided by further conversations
00:06:16 with people who wrote books about those leaders
00:06:18 or their industries.
00:06:20 Those that can steel man each perspective
00:06:22 and attempt to give an objective analysis.
00:06:25 I think of Teddy Roosevelt’s speech
00:06:26 about the man in the arena.
00:06:28 I want to talk to both the men and women in the arena
00:06:32 and the critics and the supporters in the stands.
00:06:36 For the former, I lean toward wanting to understand
00:06:38 one human being’s struggle with the ideas.
00:06:43 For the latter, I lean towards understanding
00:06:45 the ideas themselves.
00:06:48 That’s why I wanted to have this conversation
00:06:50 with John Abramson, who is an outspoken critic
00:06:53 of the pharmaceutical industry.
00:06:55 I hope it helps add context and depth
00:06:58 to the conversation I had with the Pfizer CEO.
00:07:02 In the end, I may do worse than I could have or should have.
00:07:06 Always, I will listen to the criticisms without ego
00:07:10 and I promise I will work hard to improve.
00:07:14 But let me say finally that cynicism is easy.
00:07:20 Optimism, true optimism is hard.
00:07:24 It is the belief that we can and we will
00:07:28 build a better world and that we can only do it together.
00:07:32 This is the fight worth fighting.
00:07:35 So here we go.
00:07:36 Once more into the breach, dear friends.
00:07:39 I love you all.
00:07:41 This is the Lex Fridman podcast.
00:07:43 To support it, please check out our sponsors
00:07:46 in the description.
00:07:47 And now, here’s my conversation with John Abramson.
00:07:52 You’re faculty at Harvard Medical School,
00:07:55 you’re a family physician for over two decades,
00:07:57 rated one of the best family physicians in Massachusetts,
00:08:01 you wrote the book Overdosed America,
00:08:03 and the new book coming out now called Sickening
00:08:07 about how Big Pharma broke American healthcare,
00:08:10 including science and research, and how we can fix it.
00:08:14 First question, what is the biggest problem with Big Pharma
00:08:18 that if fixed would be the most impactful?
00:08:21 So if you can snap your fingers and fix one thing,
00:08:24 what would be the most impactful, you think?
00:08:26 The biggest problem is the way they
00:08:30 determine the content, the accuracy,
00:08:35 and the completeness of what doctors believe
00:08:39 to be the full range of knowledge
00:08:42 that they need to best take care of their patients.
00:08:46 So that with the knowledge having been taken over
00:08:51 by the commercial interests, primarily
00:08:53 the pharmaceutical industry, the purpose of that knowledge
00:08:57 is to maximize the profits that get returned
00:09:00 to investors and shareholders, and not to optimize
00:09:04 the health of the American people.
00:09:07 So rebalancing that equation would be the most important
00:09:11 thing to do to get our healthcare back aimed
00:09:15 in the right direction.
00:09:16 Okay, so there’s a tension between helping people
00:09:20 and making money, so if we look at particularly
00:09:24 the task of helping people in medicine, in healthcare,
00:09:29 is it possible if money is the primary sort of mechanism
00:09:35 by which you achieve that as a motivator,
00:09:38 is it possible to get that right?
00:09:39 I think it is, Lex, but I think it is not possible
00:09:43 without guardrails that maintain the integrity
00:09:46 and the balance of the knowledge.
00:09:48 Without those guardrails, it’s like trying to play
00:09:52 a professional basketball game without referees
00:09:54 and having players call their own fouls.
00:09:57 But the players are paid to win, and you can’t count
00:10:01 on them to call their own fouls, so we have referees
00:10:03 who are in charge.
00:10:05 We don’t have those referees in American healthcare.
00:10:08 That’s the biggest way that American healthcare
00:10:13 is distinguished from healthcare in other wealthy nations.
00:10:17 So okay, you mentioned Milton Friedman,
00:10:19 and you mentioned his book called Capitalism and Freedom.
00:10:24 He writes that there are only three legitimate functions
00:10:27 of government to preserve law and order,
00:10:30 to enforce private contracts, and to ensure
00:10:33 that private markets work.
00:10:36 You said that that was a radical idea at the time,
00:10:40 but we’re failing on all three.
00:10:41 How are we failing?
00:10:43 And also maybe the bigger picture is what are the strengths
00:10:47 and weaknesses of capitalism when it comes to medicine
00:10:50 and healthcare?
00:10:51 Can we separate those out?
00:10:53 Because those are two huge questions.
00:10:55 So how we’re failing on all three,
00:10:58 and these are the minimal functions that our guru
00:11:03 of free market capitalism said the government
00:11:06 should perform, so this is the absolute baseline.
00:11:11 On preserving law and order, the drug companies
00:11:14 routinely violate the law in terms of their marketing,
00:11:20 and in terms of their presentation
00:11:26 of the results of their trials.
00:11:29 I know this because I was an expert in litigation
00:11:32 for about 10 years.
00:11:35 I presented some of what I learned in civil litigation
00:11:40 to the FBI and the Department of Justice,
00:11:42 and that case led to the biggest criminal fine
00:11:46 in US history as of 2009.
00:11:49 And I testified in a federal trial in 2010,
00:11:56 and the jury found Pfizer guilty of fraud
00:12:00 and racketeering violations.
00:12:02 In terms of violating the law, it’s a routine occurrence.
00:12:07 The drug companies have paid $38 billion worth of fines
00:12:10 from I think 1991 to 2017.
00:12:15 It’s never been enough to stop the misrepresentation
00:12:21 of their data, and rarely are the fines greater
00:12:25 than the profits that were made.
00:12:29 Executives have not gone to jail for misrepresenting data
00:12:34 that have involved even tens of thousands of deaths
00:12:38 in the case of Vioxx, OxyContin as well.
00:12:42 And when companies plead guilty to felonies,
00:12:45 which is not an unusual occurrence,
00:12:48 the government usually allows the companies,
00:12:51 the parent companies, to have subsidiaries take the plea
00:12:56 so that they are not one step closer
00:12:58 to being debarred from Medicare,
00:13:01 that is, not being able to participate in Medicare.
00:13:03 So in that sense, there is a mechanism
00:13:11 that is appearing to impose law and order
00:13:15 on drug company behavior, but it’s clearly not enough.
00:13:18 It’s not working.
00:13:19 Can you actually speak to human nature here?
00:13:24 Are people corrupt?
00:13:26 Are people malevolent?
00:13:28 Are people ignorant that work at the low level
00:13:32 and at the high level at Pfizer, for example,
00:13:36 at big pharma companies, how is this possible?
00:13:40 So I believe, just on a small tangent,
00:13:43 that most people are good.
00:13:45 And I actually believe if you join big pharma,
00:13:48 so a company like Pfizer, your life trajectory
00:13:52 often involves dreaming and wanting
00:13:56 and enjoying helping people.
00:13:58 Yes.
00:13:59 And so, and then we look at the outcomes
00:14:03 that you’re describing, and it looks,
00:14:07 and that’s why the narrative takes hold
00:14:09 that Pfizer CEO Albert Bourla, who I talked to, is malevolent.
00:14:15 The sense is these companies are evil.
00:14:19 So if the different parts, the people, are good
00:14:24 and they want to do good, how are we getting these outcomes?
00:14:27 Yeah, I think it has to do with the cultural milieu
00:14:33 that this is unfolding in.
00:14:35 And we need to look at sociology to understand this,
00:14:41 that when the cultural milieu is set up
00:14:49 to maximize the returns on investment
00:14:52 for shareholders and other venture capitalists
00:14:55 and hedge funds and so forth,
00:14:57 when that defines the culture
00:15:00 and the higher up you are in the corporation,
00:15:04 the more you’re in on the game of getting rewarded
00:15:10 for maximizing the profits of the investors,
00:15:13 that’s the culture they live in.
00:15:15 And it becomes normative behavior
00:15:19 to do things with science that look normal
00:15:25 in that environment and are shared values
00:15:28 within that environment by good people
00:15:31 whose self evaluation becomes modified
00:15:34 by the goals that are shared by the people around them.
00:15:39 And within that milieu, you have one set of standards,
00:15:44 and then the rest of good American people
00:15:48 have the expectation that the drug companies
00:15:50 are trying to make money, but that they’re playing
00:15:53 by rules that aren’t part of the insider milieu.
00:15:59 That’s fascinating, the game they’re playing
00:16:02 modifies the culture inside the meetings,
00:16:07 inside the rooms, day to day,
00:16:10 that there’s a bubble that forms.
00:16:12 Like we’re all in bubbles of different sizes.
00:16:15 And that bubble allows you to drift in terms
00:16:18 of what you see as ethical and unethical.
00:16:24 Because you see the game as just part of the game.
00:16:28 So marketing is just part of the game.
00:16:32 Paying the fines is just part of the game of science.
00:16:36 And without guardrails, it becomes
00:16:40 even more part of the game.
00:16:42 You keep moving in that direction.
00:16:44 If you’re not bumping up against guardrails.
00:16:48 And I think that’s how we’ve gotten
00:16:49 to the extreme situation we’re in now.
00:16:53 So, like I mentioned, I spoke with Pfizer CEO,
00:16:57 Albert Bourla, and I’d like to raise with you
00:17:00 some of the concerns I raised with him.
00:17:03 So one, you already mentioned, I raised the concern
00:17:07 that Pfizer’s engaged in aggressive advertising campaigns.
00:17:11 As you can imagine, he said no.
00:17:15 What do you think?
00:17:18 I think you’re both right.
00:17:21 I think that the, I agree with you,
00:17:23 that the aggressive advertising campaigns
00:17:27 do not add value to society.
00:17:30 And I agree with him that they’re, for the most part, legal.
00:17:34 And it’s the way the game is played.
00:17:37 Right, so, sorry to interrupt,
00:17:38 but oftentimes his responses are,
00:17:44 especially now, he’s been CEO for only like two years,
00:17:47 three years, he says Pfizer was a different company,
00:17:50 we’ve made mistakes, right, in the past.
00:17:53 We don’t make mistakes anymore.
00:17:56 That there’s rules, and we play by the rules.
00:18:00 So like, with every concern raised,
00:18:02 there’s very, very strict rules, as he says.
00:18:06 In fact, he says sometimes way too strict.
00:18:08 And we play by them.
00:18:10 And so in that sense, advertisement,
00:18:12 it doesn’t seem like it’s too aggressive,
00:18:14 because it’s playing by the rules.
00:18:17 And relative to the other, again, it’s the game.
00:18:19 Relative to the other companies,
00:18:22 it’s actually not that aggressive.
00:18:24 Relative to the other big pharma companies.
00:18:26 Yes, yes, I hope we can quickly get back
00:18:29 to whether or not they’re playing by the rules,
00:18:31 but in general.
00:18:32 But let’s just look at the question
00:18:34 of advertising specifically.
00:18:36 I think that’s a good example of what it looks like
00:18:39 from within that culture, and from outside that culture.
00:18:44 He’s saying that we follow the law on our advertising.
00:18:49 We state the side effects,
00:18:51 and we state the FDA approved indications,
00:18:53 and we do what the law says we have to do for advertising.
00:18:57 And I have not, I’ve not been an expert in litigation
00:19:01 for a few years, and I don’t know what’s going on currently,
00:19:04 but let’s take him at his word.
00:19:07 It could be true, it might not be, but it could be.
00:19:09 But if that’s true, in his world, in his culture,
00:19:15 that’s ethical business behavior.
00:19:18 From a common sense person’s point of view,
00:19:22 a drug company paying highly skilled media folks
00:19:27 to take the information about the drug
00:19:30 and create the illusion, the emotional impact,
00:19:34 and the takeaway message for viewers of advertisements
00:19:38 that grossly exaggerate the benefit of the drug
00:19:41 and minimize the harms, it’s sociopathic behavior
00:19:45 to have viewers of ads leave the ad
00:19:49 with an unrealistic impression
00:19:52 of the benefits and harms of the drug.
00:19:56 And yet he’s playing by the rules,
00:19:58 he’s doing his job as CEO
00:20:01 to maximize the effect of his advertising,
00:20:04 and if he doesn’t do it, this is a key point,
00:20:07 if he doesn’t do it, he’ll get fired and the next guy will.
00:20:11 So the people that survive in the company,
00:20:13 the people that get raises in the company,
00:20:16 move up in the company are the ones that play by the rules,
00:20:19 and that’s how the game solidifies itself.
00:20:21 But the game is within the bounds of the law.
00:20:24 Sometimes, most of the time, not always.
00:20:26 We’ll return to that question.
00:20:29 I’m actually more concerned
00:20:31 about the effect of advertisement
00:20:34 on a kind of much larger scale,
00:20:39 on the people that are getting funded
00:20:43 by the advertising, in the form of self-censorship,
00:20:46 just a more subtle, more passive pressure
00:20:50 to not say anything negative.
00:20:52 Because I’ve seen this, and I’ve been saddened by it,
00:20:57 that people sacrifice integrity in small ways
00:21:03 when they’re being funded by a particular company.
00:21:06 They don’t see themselves as doing so,
00:21:09 but you can just clearly see that the space of opinions
00:21:12 that they’re willing to engage in,
00:21:15 or a space of ideas they’re willing to play with,
00:21:18 is one that doesn’t include negative,
00:21:22 anything that could possibly be negative about the company.
00:21:25 They just choose not to.
00:21:27 Because, you know, why?
00:21:28 And that’s really sad to me,
00:21:30 that if you give me a hundred bucks,
00:21:33 I’m less likely to say something negative about you.
00:21:38 That makes me sad.
00:21:39 Because the reason I wouldn’t say something negative
00:21:42 about you, I prefer, is the pressure of friendship
00:21:45 and human connection, those kinds of things.
00:21:48 So I understand that.
00:21:50 That’s also a problem, by the way,
00:21:52 sort of having dinners and shaking hands,
00:21:54 and oh, aren’t we friends?
00:21:56 But the fact that money has that effect
00:21:58 is really sad to me.
00:22:00 On the news media, on the journalists, on scientists,
00:22:05 that’s scary to me.
00:22:06 But of course, the direct advertisement to consumers,
00:22:09 like you said, is a potentially very negative effect.
00:22:11 I wanted to ask if what you think
00:22:14 is the most negative impact of advertisement,
00:22:17 is it that direct to consumer on television?
00:22:20 Is it advertisement of the doctors?
00:33:22 Which I’m surprised to learn,
00:33:24 from what I was vaguely looking at,
00:33:26 is that more is spent on advertising to doctors than to consumers.
00:22:32 That’s really confusing to me.
00:22:34 It’s fascinating, actually.
00:22:35 And then also, obviously, the law side of things
00:22:38 is the lobbying dollars,
00:22:40 which I think is less than all of those.
00:22:42 But anyway, it’s in the ballpark.
00:22:44 What concerns you most?
00:22:46 Well, it’s the whole nexus of influence.
00:22:49 There’s not one thing, and they don’t invest all their,
00:22:53 they don’t put all their eggs in one basket.
00:22:55 It’s a whole surround sound program here.
00:23:01 But in terms of advertisements,
00:23:04 let’s take the advertisement.
00:23:06 Trulicity is a diabetes drug,
00:23:09 for type two diabetes, an injectable drug.
00:23:12 And it lowers blood sugar just about as well
00:23:15 as Metformin does.
00:23:18 Metformin costs about $4 a month.
00:23:21 Trulicity costs, I think, $6,200 a year.
00:23:25 So $48 a year versus $6,200.
00:23:29 Trulicity has distinguished itself
00:23:31 because the manufacturer did a study
00:23:35 that showed that it significantly reduces
00:23:37 the risk of cardiovascular disease in diabetics.
00:23:41 And they got approval on the basis of that study,
00:23:44 that very large study being statistically significant.
00:23:47 So the ads obviously extol the virtues of Trulicity
00:23:53 because it reduces the risk of heart disease and stroke,
00:23:56 and that’s one of the major morbidities,
00:23:59 risks of type two diabetes.
00:24:01 What the ad doesn’t say is that you have to treat
00:24:03 323 people to prevent one nonfatal event
00:24:08 at a cost of $2.7 million.
00:24:11 And even more importantly than that,
00:24:13 what the ad doesn’t say is that the evidence shows
00:24:17 that engaging in an active, healthy lifestyle program
00:24:22 reduces the risk of heart disease and strokes
00:24:24 far more than Trulicity does.
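As a rough back-of-the-envelope check on the numbers quoted above, here is a minimal sketch in Python; it uses only the figures as quoted in this conversation, and the treatment-duration calculation at the end is an assumption meant to show where a total like $2.7 million could come from, not a verified figure.

```python
# Back-of-the-envelope NNT arithmetic using the figures quoted in this conversation;
# none of these numbers are independently verified here.
nnt = 323                  # people treated to prevent one nonfatal cardiovascular event (as quoted)
annual_cost = 6200         # approximate Trulicity cost per patient per year (as quoted)
quoted_total_cost = 2.7e6  # cost per event prevented (as quoted)

# NNT is the reciprocal of the absolute risk reduction observed in the trial.
absolute_risk_reduction = 1 / nnt
print(f"Implied absolute risk reduction: {absolute_risk_reduction:.2%}")

# Cost of treating NNT people for a single year:
print(f"Treating {nnt} people for one year: ${nnt * annual_cost:,.0f}")

# Hypothetical check: how long would those 323 people need to stay on the drug
# for the total to reach the quoted $2.7 million?
years = quoted_total_cost / (nnt * annual_cost)
print(f"Implied treatment duration behind the $2.7M figure: about {years:.1f} years")
```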
00:24:28 Now, to be fair to the company, the sponsor,
00:24:32 there’s never been a study that compared Trulicity
00:24:37 to lifestyle changes.
00:24:39 But that’s part of the problem of our advertising.
00:24:42 You would think in a rational society
00:24:45 that was way out on a limb as a lone country
00:24:50 besides New Zealand that allows
00:24:52 direct-to-consumer advertising,
00:24:54 that part of allowing direct-to-consumer advertising
00:24:59 would be to mandate that the companies establish
00:25:03 whether their drug is better than,
00:25:05 say, healthy lifestyle adoption
00:25:07 to prevent the problems that they claim to be preventing.
00:25:11 But we don’t require that.
00:25:13 So the companies can afford to do very large studies
00:25:17 so that very small differences
00:25:19 become statistically significant.
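To make that point concrete, here is a minimal sketch of a two-proportion test with made-up event rates (not data from any actual Trulicity trial), showing how the same small absolute difference goes from non-significant to significant purely by enlarging the trial.

```python
# Why very large trials make very small differences "statistically significant".
# The event rates below are illustrative assumptions, not results from any real trial.
from math import sqrt, erf

def two_proportion_p_value(p1, p2, n_per_arm):
    """Two-sided p-value for a difference in proportions with equal arms (normal approximation)."""
    pooled = (p1 + p2) / 2
    se = sqrt(pooled * (1 - pooled) * 2 / n_per_arm)
    z = abs(p1 - p2) / se
    return 2 * (1 - 0.5 * (1 + erf(z / sqrt(2))))  # 2 * (1 - Phi(|z|))

control_rate, treated_rate = 0.012, 0.009  # hypothetical 1.2% vs 0.9% event rates

for n in (1_000, 5_000, 50_000):
    p = two_proportion_p_value(control_rate, treated_rate, n)
    print(f"n per arm = {n:6d}:  p = {p:.2g}")

# The absolute difference (0.3 percentage points, NNT around 333) never changes,
# but it crosses the p < 0.05 threshold once the trial is big enough.
```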
00:25:21 And their studies are asking the question,
00:25:23 how can we sell more drug?
00:25:25 They’re not asking the question,
00:25:27 how can we prevent cardiovascular disease
00:25:30 in people with type 2 diabetes?
00:25:32 And that’s how we get off in this,
00:25:34 we’re now in the extreme arm of this distortion
00:25:38 of our medical knowledge of studying
00:25:41 how to sell more drugs than how to make people more healthy.
00:25:45 That’s a really great thing to compare to,
00:25:48 is lifestyle changes.
00:25:51 Because that should be the bar.
00:25:53 If you do some basic diet, exercise,
00:25:56 all those kinds of things,
00:25:58 how does this drug compare to that?
00:26:00 Right, right.
00:26:01 And that study was done, actually, in the 90s.
00:26:04 It’s called the Diabetes Prevention Program.
00:26:06 It was federally funded by the NIH
00:26:09 so that there wasn’t this drug company imperative
00:26:13 to just try to prove your drug was better than nothing.
00:26:16 And it was a very well designed study,
00:26:19 randomized controlled trial
00:26:22 in people who were at high risk of diabetes,
00:26:25 so-called pre-diabetics.
00:26:26 And they were randomized to three different groups,
00:26:30 a placebo group, a group that got treated with metformin,
00:26:34 and a group that got treated
00:26:36 with intensive lifestyle counseling.
00:26:38 So this study really tested
00:26:42 whether you can get people in a randomized controlled trial
00:26:46 assigned to intensive lifestyle changes,
00:26:49 whether that works.
00:26:50 Now the common wisdom amongst physicians,
00:26:54 and I think in general,
00:26:56 is that you can’t get people to change.
00:26:57 You know, you can do whatever you want,
00:26:59 you can stand on your head,
00:27:00 you can beg and plead, people won’t change.
00:27:02 So give it up and let’s just move on with the drugs
00:27:05 and not waste any time.
00:27:06 Except this study that was published
00:27:08 in the New England Journal, I think in 2002,
00:27:11 shows that’s wrong.
00:27:12 That the people who were in the intensive lifestyle group
00:27:16 ended up losing 10 pounds,
00:27:18 exercising five times a week, maintaining it,
00:27:21 and reduced their risk of getting diabetes by 58%,
00:27:26 compared to the metformin group,
00:27:27 which reduced its risk of getting diabetes by 31%.
00:27:32 So that exact study was done
00:27:34 and it showed that lifestyle intervention is the winner.
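As a small, rough illustration of what those relative risk reductions mean in absolute terms, here is a sketch; only the 58% and 31% figures come from the conversation, and the baseline risk is a hypothetical placeholder, not the trial’s actual placebo-group incidence.

```python
# Relative vs. absolute risk reduction, using the percentages quoted above.
# The baseline risk is an assumed placeholder, not the trial's actual placebo incidence.
baseline_risk = 0.30    # hypothetical cumulative diabetes risk without intervention
rrr_lifestyle = 0.58    # quoted relative risk reduction with intensive lifestyle change
rrr_metformin = 0.31    # quoted relative risk reduction with metformin

for name, rrr in [("lifestyle", rrr_lifestyle), ("metformin", rrr_metformin)]:
    arr = baseline_risk * rrr   # absolute risk reduction
    nnt = 1 / arr               # number needed to treat
    print(f"{name:>9}: absolute risk reduction {arr:.1%}, NNT ~ {nnt:.0f}")
```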
00:27:38 Who, as a small tangent, is the leader,
00:27:44 who is supposed to fight for the side of lifestyle changes?
00:27:49 Where’s the big pharma version of lifestyle changes?
00:27:54 Who’s supposed to have the big bully pulpit,
00:27:57 the big money behind lifestyle changes?
00:28:00 In your sense, because that seems to be missing
00:28:03 in a lot of our discussions about health policy.
00:28:06 Right, that’s exactly right.
00:28:08 And the answer is that we assume
00:28:12 that the market has to solve all of these problems.
00:28:15 And the market can’t solve all of these problems.
00:28:18 There needs to be some way of protecting the public interest
00:28:23 for things that aren’t financially driven.
00:28:26 So that the overriding question has to be
00:28:28 how best to improve Americans health,
00:28:31 not companies funding studies to try and prove
00:28:36 that their new inexpensive drug is better
00:28:39 and should be used.
00:28:40 Well, some of that is also people sort of like yourself.
00:28:45 I mean, it’s funny, you spoke with Joe Rogan.
00:28:48 He constantly espouses lifestyle changes.
00:28:50 So some of it is almost like understanding the problems
00:28:55 that big pharma is creating in society
00:28:58 and then sort of these influential voices
00:29:02 speaking up against it.
00:29:03 So whether they’re scientists or just regular communicators.
00:29:08 Yeah, I think you gotta tip your hat to Joe
00:29:11 for getting that message out.
00:29:13 And he clearly believes it and does his best.
00:29:17 But it’s not coming out in the legitimate avenues,
00:29:21 in the legitimate channels that are evidence based medicine
00:29:26 and from the sources that the docs are trained to listen to
00:29:32 and modify their patient care on.
00:29:34 Now, it’s not 100%.
00:29:36 I mean, there are articles in the big journals
00:29:40 about the benefits of lifestyle,
00:29:42 but they don’t carry the same gravitas
00:29:45 as the randomized controlled trials
00:29:48 that test this drug against placebo
00:29:50 or this drug against another drug.
00:29:52 So the Joe Rogans of the world keep going.
00:29:55 I tip my hat.
00:29:57 But it’s not gonna carry the day for most of the people
00:30:00 until it has the legitimacy of the medical establishment.
00:30:04 Yeah, like something that the doctors
00:30:05 really pay attention to.
00:30:07 Well, there’s an entire mechanism established
00:30:09 for testing drugs.
00:30:11 There’s not an entire mechanism established
00:30:14 in terms of scientific rigor of testing lifestyle changes.
00:30:17 I mean, it’s more difficult.
00:30:20 I mean, everything’s difficult in science.
00:30:23 That science that involves humans, especially.
00:30:27 But it’s just, these studies are very expensive.
00:30:30 They’re difficult.
00:30:31 It’s difficult to find conclusions
00:30:33 and to control all the variables.
00:30:35 And so it’s very easy to dismiss them
00:30:37 unless you really do a huge study that’s very well funded.
00:30:40 And so maybe the doctors just lean
00:30:42 towards the simpler studies over and over,
00:30:45 which is what the drug companies fund.
00:30:48 They can control more variables.
00:30:50 See, but the control there is sometimes
00:30:56 by hiding things too, right?
00:31:00 So sometimes you can just say
00:31:03 that this is a well controlled study
00:31:06 by pretending there’s a bunch of other stuff.
00:31:09 It’s just ignoring the stuff that could be correlated.
00:31:13 It could be the real cause of the effects you’re seeing,
00:31:15 all that kind of stuff.
00:31:17 So money can buy ignorance, I suppose, in science.
00:31:21 It buys the kind of blinders that are on
00:31:24 that don’t look outside the reductionist model.
00:31:28 And that’s another issue is that we kind of,
00:31:31 nobody says to doctors in training,
00:31:34 only listen to reductionist studies and conclusions
00:31:39 and methods of promoting health.
00:31:42 Nobody says that explicitly.
00:31:43 But the respectable science
00:31:47 has to do with controlling the factors.
00:31:49 And I mean, it just doesn’t make sense to me.
00:31:54 I’m gonna pick on Trulicity
00:31:55 because it’s such an obvious example,
00:31:57 but it’s not more egregious than many others.
00:32:01 It doesn’t make sense to me to allow a drug
00:32:03 to be advertised as preventing cardiovascular disease
00:32:06 when you haven’t included lifestyle changes
00:32:09 as an arm in the study.
00:32:11 It’s just so crystal clear that the purpose of that study
00:32:15 is to sell Trulicity.
00:32:17 It’s not to prevent cardiovascular disease.
00:32:21 If we were in charge, I would try to convince you
00:32:24 that anywhere that study, the results of that study
00:32:27 were presented to physicians,
00:32:31 it would be stamped in big red letters,
00:32:33 this study did not compare Trulicity to lifestyle changes.
00:32:37 They need to know that.
00:32:38 And the docs are kind of trained,
00:32:40 these blinders get put on,
00:32:42 and they’re trained to kind of forget that that’s not there.
00:32:46 Do you think, so first of all,
00:32:48 that’s a small or big change to advertisement
00:32:51 that seems obvious to say,
00:32:54 like in force that it should be compared
00:32:56 to lifestyle changes.
00:32:59 Do you think advertisements, period,
00:33:01 in the United States for pharmaceutical drugs
00:33:04 should be banned?
00:33:05 I think they can’t be banned.
00:33:07 So it doesn’t matter what I think.
00:33:09 Okay, let’s say you were a dictator,
00:33:13 and two, why can’t they be banned?
00:33:15 Okay.
00:33:16 Answer either one.
00:33:18 I believe, I’ve been told by lawyers who I trust,
00:33:22 that the freedom of speech in the U.S. Constitution
00:33:27 is such that you can’t ban them,
00:33:29 that you could ban cigarettes and alcohol,
00:33:33 which have no therapeutic use,
00:33:35 but drugs have a therapeutic use,
00:33:37 and advertisements about them can’t be banned.
00:33:41 Let’s assume that they can’t be,
00:33:43 because we know they won’t be anyway,
00:33:46 but let’s assume they can’t be,
00:33:49 and especially our Supreme Court now
00:33:51 would be unlikely to take that seriously.
00:33:55 But that’s not the issue.
00:33:57 The issue is that if the drug companies
00:34:00 want to spend their money advertising,
00:34:02 they should have to have independent analysis
00:34:06 of the message that the viewers are left with
00:34:10 about the drug, so that it’s realistic.
00:34:13 What’s the chance the drug will help them?
00:34:15 Well, in true city, it’s one out of 323.
00:34:19 322 people aren’t gonna benefit
00:34:21 from the cardiovascular reduction, risk reduction.
00:34:25 What’s the true cost?
00:34:26 When drugs advertise that you may be able to get this
00:34:30 for a $25 copay or something,
00:34:33 tens of thousands of dollars a year drug,
00:34:35 for a $25 copay, what an enormous disservice that is
00:34:40 to misrepresent the cost to society.
00:34:42 That should not be allowed.
00:34:44 So you should have to make it clear to the viewers
00:34:48 how many people are gonna benefit,
00:34:49 what’s your chance of benefiting?
00:34:51 How does it compare to lifestyle changes
00:34:53 or less expensive therapies?
00:34:55 What do you give up if you use a less expensive therapy
00:34:58 or gain, perhaps?
00:34:59 And how much it costs.
00:35:01 How much it costs.
00:35:02 Now, that can go either way,
00:35:03 because if you say Humira costs $72,000
00:35:06 and it’s no more effective as a first line drug
00:35:08 than methotrexate, which costs $480,
00:35:12 people might say, I want the expensive drug
00:35:15 because I can get it for a $25 copay.
00:35:17 So you’d have to temper that a little bit.
00:35:21 Oh, you mean people are so, they don’t care.
00:35:25 They don’t care.
00:35:26 Their insurance is gonna cover it and it’s a $25 copay,
00:35:29 but we could figure out how to deal with that.
00:35:31 The main point is that if we assume
00:35:35 that advertisements are gonna keep going, and they are,
00:35:38 we could require that there be outside evaluation
00:35:45 of the message that reasonable, unbiased viewers
00:35:48 take away from the ads,
00:35:50 and the ads would have to tell the truth about the drug.
00:35:55 And the truth should have sub truth guardrails,
00:36:00 meaning like the cost that we talked about,
00:36:03 the effects compared to things that actually,
00:36:07 lifestyle changes, just these details,
00:36:11 very strict guardrails of what actually has to be specified.
00:36:16 And I would make it against the law
00:36:19 to have family picnics or dogs catching Frisbees in the ads.
00:36:23 So, you mean 95% of the ads, yes.
00:36:28 I mean, there’s something dark and inauthentic
00:36:32 about those advertisements, but they seem,
00:36:34 I mean, I’m sure they’re being done
00:36:36 because they work for the target audience.
00:36:43 And then the doctors too.
00:36:46 Can you really buy a doctor’s opinion?
00:36:48 Why does it have such an effect on doctors?
00:36:52 Advertisement to doctors, like you as a physician,
00:36:55 again, like from everything I’ve seen, people love you.
00:36:58 And I’ve just, people should definitely look you up from,
00:37:04 there’s a bunch of videos of you giving talks on YouTube,
00:37:09 and it’s just, it’s so refreshing to hear
00:37:14 just the clarity of thought about health policy,
00:37:17 about healthcare, just the way you think
00:37:19 throughout the years.
00:37:20 Thank you.
00:37:21 So like, it’s easy to think about like,
00:37:23 maybe you’re criticizing Big Pharma,
00:37:25 that’s one part of the message that you’re talking about,
00:37:28 but that’s not like, your brilliance actually shines
00:37:33 in the positive, in the solutions and how to do it.
00:37:35 So as a doctor, what affects your mind?
00:37:40 And how does Big Pharma affect your mind?
00:37:43 Number one, the information that comes through
00:37:46 legitimate sources that doctors have been taught
00:37:50 to rely on, evidence based medicine,
00:37:52 the articles in peer reviewed journals,
00:37:55 the guidelines that are issued.
00:37:57 Now, those are problematic,
00:37:59 because when an article is peer reviewed
00:38:03 and published in a respected journal,
00:38:06 people and doctors obviously assume
00:38:10 that the peer reviewers have had access to the data
00:38:15 and they’ve independently analyzed the data,
00:38:18 and they corroborate the findings in the manuscript
00:38:21 that was submitted, or they give feedback to the authors
00:38:25 and say, we disagree with you on this point,
00:38:28 and would you please check our analysis
00:38:30 and if you agree with us, make it.
00:38:32 That’s what they assume the peer review process is,
00:38:35 but it’s not.
00:38:36 The peer reviewers don’t have the data.
00:38:39 The peer reviewers have the manuscript
00:38:41 that’s been submitted by the,
00:38:44 usually in conjunction with or by the drug company
00:38:49 that manufactures the drug.
00:38:51 So peer reviewers are unable to perform the job
00:38:57 that doctors think they’re performing
00:38:59 to vet the data to assure that it’s accurate
00:39:03 and reasonably complete.
00:39:05 They can’t do it.
00:39:07 And then we have the clinical practice guidelines,
00:39:09 which are increasingly more important
00:39:11 as the information, the flow of information
00:39:15 keeps getting brisker and brisker,
00:39:18 and docs need to get to the bottom line quickly.
00:39:22 Clinical practice guidelines become much more important.
00:39:25 And we assume that the authors
00:39:28 of those clinical practice guidelines
00:39:30 have independently analyzed the data
00:39:32 from the clinical trials and make their recommendations
00:39:35 that set the standards of care based on their analysis.
00:39:39 That’s not what happens.
00:39:40 The experts who write the clinical practice guidelines
00:39:44 rely almost entirely on the publications
00:39:49 presenting the results of the clinical trials,
00:39:51 which are peer reviewed,
00:39:52 but the peer reviewers haven’t had access to the data.
00:39:56 So we’ve got a system of the highest level of evidence
00:40:01 that doctors have been trained over and over again
00:40:03 to rely on to practice evidence based medicine
00:40:06 to be good doctors that has not been verified.
00:40:10 Do you think that data that’s coming
00:40:14 from the pharma companies,
00:40:17 do you think there,
00:40:19 what level of manipulation is going on with that data?
00:40:22 Is it at the study design level?
00:40:25 Is it at literally there’s some data
00:40:28 that you just keep off, keep out of the charts,
00:40:33 keep out of the aggregate analysis that you then publish?
00:40:38 Or is it the worst case,
00:40:41 which is just change some of the numbers?
00:40:44 It happened.
00:40:45 All three happened.
00:40:46 I can’t, I don’t know what the denominator is,
00:40:48 but I spent about 10 years in litigation.
00:40:51 And for example, in Vioxx,
00:40:54 which was withdrawn from the market in 2004
00:40:57 in the biggest drug recall in American history,
00:41:02 the problem was that it got recalled
00:41:06 when a study that Merck sponsored
00:41:08 showed that Vioxx doubled the risk,
00:41:10 more than doubled the risk of heart attacks,
00:41:12 strokes, and blood clots, serious blood clots.
00:41:16 It got pulled then.
00:41:18 But there was a study, a bigger study
00:41:20 that had been published in 2000
00:41:22 in the New England Journal of Medicine
00:41:24 that showed that Vioxx was a better drug
00:41:28 for arthritis and pain,
00:41:32 not because it was more effective.
00:41:34 It’s no more effective than Aleve or Advil,
00:41:37 but because it was less likely
00:41:40 to cause serious GI complications,
00:41:43 bleeds and perforations in the gut.
00:41:46 Now, in that study that was published
00:41:48 in the New England Journal that was never corrected,
00:41:51 it was a little bit modified 15 months
00:41:55 after the drug was taken off the market,
00:41:57 but never corrected, Merck left out three heart attacks.
00:42:01 And the FDA knew that Merck left out three heart attacks,
00:42:05 and the FDA’s analysis of the data from that study
00:42:10 said that the FDA wasn’t gonna do the analysis
00:42:14 without the three heart attacks in it.
00:42:16 And the important part of this story
00:42:19 is that there were 12 authors listed on that study
00:42:23 in the New England Journal.
00:42:24 Two were Merck employees.
00:42:26 They knew about the three heart attacks
00:42:27 that had been omitted.
00:42:29 The other 10 authors, the academic authors,
00:42:34 didn’t know about it.
00:42:35 They hadn’t seen that data.
00:42:38 So Merck just, they had an excuse.
00:42:41 It’s complicated, and the FDA didn’t accept it,
00:42:44 so there’s no reason to go into it.
00:42:46 But Merck just left out the three heart attacks.
00:42:48 And the three heart attacks,
00:42:50 it may seem three heart attacks in a 10,000 person study
00:42:52 may seem like nothing,
00:42:54 except they completely changed the statistics
00:42:57 so that had the three heart attacks been included,
00:43:00 the only conclusion that Merck could have made
00:43:02 was that Vioxx significantly increased
00:43:04 the risk of heart attack.
00:43:06 And they abbreviated their endpoint
00:43:09 from heart attack, strokes, and blood clots
00:43:12 to just heart attacks.
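To see how just a few omitted events can flip a trial’s statistical conclusion, here is a small illustration; the counts below are hypothetical, chosen only to make the point, and are not the actual Vioxx trial numbers. It uses scipy’s Fisher’s exact test.

```python
# Hypothetical illustration (not the actual Vioxx counts) of how omitting a few
# events from one arm of a large trial can change the statistical conclusion.
from scipy.stats import fisher_exact

n_per_arm = 4000  # assumed participants per arm

# 2x2 tables of [events, non-events] for the drug arm vs. the comparator arm
reported     = [[11, n_per_arm - 11], [4, n_per_arm - 4]]  # as (hypothetically) reported
with_omitted = [[14, n_per_arm - 14], [4, n_per_arm - 4]]  # same trial with 3 more drug-arm events

for label, table in [("as reported", reported), ("with 3 omitted events added", with_omitted)]:
    _, p = fisher_exact(table)
    verdict = "significant" if p < 0.05 else "not significant"
    print(f"{label:>28}: p = {p:.3f}  ({verdict} at the 0.05 level)")
```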
00:43:13 Yeah.
00:43:14 So those are, maybe in their mind,
00:43:17 they’re also playing by the rules
00:43:18 because of some technical excuse that you mentioned
00:43:20 that was rejected.
00:43:22 How can this, because this is crossing the line.
00:43:24 No, no, let me interrupt.
00:43:25 No, that’s not true.
00:43:28 The study was completed.
00:43:30 The blind was broken, meaning they looked at the data.
00:43:34 In March of 2000, the article was published
00:43:37 in the New England Journal in November of 2000.
00:43:40 In March of 2000, there was an email by the head scientist
00:43:45 that was published in the Wall Street Journal
00:43:49 that said the day that the data were unblinded,
00:43:53 that it’s a shame that the cardiovascular events are there,
00:43:58 but the drug will do well and we will do well.
00:44:08 But removing the three heart attacks,
00:44:10 how does that happen?
00:44:12 Like who has to convince themselves?
00:44:16 Is this pure malevolence?
00:44:19 You have to be the judge of that,
00:44:21 but the person who was in charge of the Data Safety
00:44:24 Monitoring Board issued a letter that said
00:44:28 they’ll stop counting cardiovascular events
00:44:32 a month before the trial is over
00:44:35 and they’ll continue counting GI events.
00:44:38 And that person got a contract to consult with Merck
00:44:43 for $5,000 a day, I think for 12 days a year,
00:44:47 for one or two years, and that contract
00:44:53 was signed within two weeks of the decision
00:44:58 to stop counting heart attacks.
00:45:00 I wanna understand that man or woman.
00:45:04 I wanna, I want, it’s the, I’ve been reading a lot
00:45:08 about Nazi Germany and thinking a lot
00:45:10 about the good Germans because I want to understand
00:45:15 so that we can each encourage each other
00:45:20 to take the small heroic actions that prevent that.
00:45:23 Because it feels to me, removing malevolence
00:45:27 from the table where it’s just a pure psychopathic person,
00:45:31 that there’s just a momentum created
00:45:34 by the game like you mentioned.
00:45:35 And so it takes reversing the momentum within the company,
00:45:40 I think requires many small acts of heroism.
00:45:46 Not gigantic, I’m going to leave and become a whistleblower
00:45:50 and publish a book about it.
00:45:52 But small, quiet acts of pressuring against this.
00:45:57 Like, what are we doing here?
00:45:59 We’re trying to help people.
00:46:00 Is this the right thing to do?
00:46:01 Looking in the mirror constantly asking,
00:46:03 is this the right thing to do?
00:46:05 I mean, that’s how, that’s what integrity is.
00:46:07 Acknowledging the pressures you’re under
00:46:11 and then still be able to zoom out
00:46:13 and think what is the right thing to do here.
00:46:16 But the data, hiding the data makes it too easy
00:46:21 to live in ignorance.
00:46:22 So like within those, inside those companies.
00:46:29 So your idea is that the reviewers should see the data.
00:46:34 That’s one step.
00:46:36 So to even push back on that idea is,
00:46:40 I assume you mean the data remains private
00:46:43 except to the peer reviewers.
00:46:47 The problem with, of course, as you probably know
00:46:49 is the peer review process is not perfect.
00:46:53 You know, it’s individuals.
00:46:55 It feels like there should be a lot more eyes on the data
00:46:58 than just the peer reviewers.
00:47:00 Yes, this is not a hard problem to solve.
00:47:03 When a study is completed,
00:47:06 a clinical study report is made.
00:47:10 And it’s usually several thousand pages.
00:47:12 And what it does is it takes the raw patient data
00:47:15 and it tabulates it, supposedly and usually,
00:47:22 in the ways that the company has pre-specified.
00:47:25 So that you then end up with a searchable,
00:47:28 let’s say 3000 page document.
00:47:30 As I became more experienced as an expert in litigation,
00:47:36 I could go through those documents pretty quickly.
00:47:39 Quickly may mean 20 hours or 40 hours,
00:47:42 but it doesn’t mean three months of my work.
00:47:45 And see if the companies,
00:47:49 if the way the company has analyzed the data
00:47:51 is consistent
00:47:53 with their statistical analysis plan
00:47:55 and their pre-specified outcome measures.
00:48:00 It’s not hard.
00:48:01 And I think you’re right.
00:48:02 Peer reviewers, I don’t peer review clinical trials,
00:48:06 but I peer review other kinds of articles.
00:48:09 I have to do one on the airplane on the way home.
00:48:11 And it’s hard.
00:48:12 I mean, we’re just ordinary mortal people volunteering to.
00:48:15 Unpaid, the motivation is not clear.
00:48:19 The motivation is to keep,
00:48:23 to be a good citizen in the medical community
00:48:27 and to be on friendly terms with the journals
00:48:31 so that if you wanna get published,
00:48:33 there’s sort of an unspoken incentive.
00:48:37 As somebody who enjoys game theory,
00:48:39 I feel like that motivation is good,
00:48:42 but it could be a lot better.
00:48:44 Yes, you should get more recognition
00:48:46 or in some way academic credit for it.
00:48:50 It should go to your career advancement.
00:48:53 If it’s an important paper
00:48:54 and you recognize it’s an important paper
00:48:56 as a great peer reviewer,
00:48:58 that this is not in that area
00:49:01 where it’s like clearly a piece of crap paper
00:49:05 or clearly an awesome paper
00:49:08 that doesn’t have controversial aspects to it
00:49:10 and it’s just a beautiful piece of work.
00:49:13 Okay, those are easy.
00:49:14 And then there is like the very difficult gray area,
00:49:17 which may require many, many days of work
00:49:20 on your part as a peer reviewer.
00:49:21 So it’s not just a couple hours,
00:49:24 but really seriously reading.
00:49:27 Like some papers can take months to really understand.
00:49:30 So if you really wanna struggle,
00:49:33 there has to be an incentive for that struggle.
00:49:35 Yes, and billions of dollars ride on some of these studies.
00:49:41 And lives, right, not to mention.
00:49:44 Right, but it would be easy to have full time statisticians
00:49:49 hired by the journals or shared by the journals
00:49:55 who were independent of any other financial incentive
00:50:00 to go over these kind of methodological issues
00:50:04 and take responsibility for certifying the analyses
00:50:08 that are done and then pass it on
00:50:11 to the volunteer peer reviewers.
00:50:14 See, I believe even in this,
00:50:15 in the sort of capitalism or even social capital,
00:50:19 after watching Twitter in the time of COVID
00:50:23 and just looking at people that investigate themselves,
00:50:27 I believe in the citizenry.
00:50:30 People, if you give them access to the data,
00:50:32 like these like citizen scientists arise.
00:50:35 A lot of them on the, it’s kind of funny,
00:50:39 a lot of people that are just really used
00:50:40 to working with data,
00:50:43 they don’t know anything about medicine
00:50:44 and they don’t have actually the biases
00:50:46 that a lot of doctors and medical
00:50:48 and a lot of the people that read these papers,
00:50:51 they’ll just go raw into the data
00:50:53 and look at it with like they’re bored almost
00:50:56 and they do incredible analysis.
00:50:58 So I, you know, there’s some argument to be made
00:51:01 for a lot of this data to become public,
00:51:04 like deanonymized, no, sorry, anonymized,
00:51:08 all that kind of stuff, but for a lot of it to be public,
00:51:11 especially when you’re talking about things
00:51:14 as impactful as some of these drugs.
00:51:16 I agree 100%, so let’s turn the micro,
00:51:19 let’s get a little bit more granular.
00:51:22 On the peer review issue,
00:51:24 we’re talking about pre-publication transparency
00:51:27 and that is critically important.
00:51:29 Once a paper is published, the horses are out of the barn
00:51:33 and docs are gonna read it,
00:51:34 take it as evidence based medicine.
00:51:36 The economists call what then happens stickiness,
00:51:41 that the docs hold on to their beliefs
00:51:43 and my own voice inside says,
00:51:47 once doctors start doing things to their patients bodies,
00:51:52 they’re really not too enthusiastic
00:51:53 about hearing it was wrong.
00:51:55 Yeah, that’s the stickiness of human nature.
00:51:57 Wow, so that bar, once it’s published,
00:52:01 the doctors, that’s when the stickiness emerges, wow.
00:52:05 Yeah, it’s hard to put that toothpaste back in the tube.
00:52:08 Now, that’s pre-publication transparency,
00:52:11 which is essential and you could have,
00:52:14 whoever saw that data pre publication
00:52:17 could sign confidentiality agreements
00:52:19 so that the drug companies couldn’t argue
00:52:22 that we’re just opening the spigots of our data
00:52:24 and people can copy it and blah, all the excuses they make.
00:52:28 You could argue that you didn’t have to
00:52:30 but let’s just let them do it.
00:52:32 Let the peer reviewers sign confidentiality agreements
00:52:35 and they won’t leak the data
00:52:36 but then you have to go to post-publication transparency,
00:52:39 which is what you were just getting at
00:52:41 to let the data free and let citizens
00:52:47 and citizen scientists and other doctors
00:52:50 who are interested have at it.
00:52:53 Kind of like Wiki, Wikipedia, have at it.
00:52:57 Let it out and let people criticize each other.
00:53:01 Okay, so speaking of the data,
00:53:03 the FDA asked for 55 years to release Pfizer vaccine data.
00:53:08 This is also something I raised with Albert Bourla.
00:53:11 What did he say?
00:53:13 There’s several things I didn’t like about what he said.
00:53:16 So some things are expected
00:53:17 and some of it is just revealing the human being,
00:53:20 which is what I’m interested in doing.
00:53:23 But he said he wasn’t aware of the 75 and the 55.
00:53:27 I’m sorry, wait a minute.
00:53:29 He wasn’t aware of?
00:53:30 The how long, so here I’ll explain what he.
00:53:33 Do you know that since you spoke to him,
00:53:36 Pfizer has petitioned the judge to join the suit
00:53:42 in behalf of the FDA’s request
00:53:45 to release that data over 55 or 75 years?
00:53:50 Pfizer’s fully aware of what’s going on.
00:53:52 He’s aware.
00:53:53 I’m sure he’s aware in some formulation.
00:53:56 The exact years he might not have been aware of.
00:53:59 But the point is that there is,
00:54:02 that is the FDA, the relationship of Pfizer and the FDA
00:54:06 in terms of me being able to read human beings
00:54:11 was the thing he was most uncomfortable with,
00:54:14 that he didn’t wanna talk about the FDA.
00:54:17 And that really, it was clear
00:54:20 that there was a relationship there
00:54:22 where the words you use may do a lot of harm,
00:54:26 potentially because like you’re saying,
00:54:28 there might be lawsuits going on, there’s litigation,
00:54:31 there’s legal stuff, all that kind of stuff.
00:54:33 And then there’s a lot of games being played in this space.
00:54:36 So I don’t know how to interpret it
00:54:40 if he’s actually aware or not,
00:54:41 but the deeper truth is that he’s deeply uncomfortable
00:54:49 bringing light to this part of the game.
00:54:53 Yes, and I’m gonna read between the lines
00:54:56 and Albert Bourla certainly didn’t ask me to speak for him.
00:54:59 But I think, but when did you speak to him?
00:55:02 How long ago?
00:55:03 Wow, time flies when you’re having fun.
00:55:05 Two months ago.
00:55:06 Two months ago.
00:55:07 So that was just recently it’s come out,
00:55:12 just in the past week it’s come out
00:55:14 that Pfizer isn’t battling the FDA.
00:55:18 Pfizer has joined the FDA in the opposition to the request
00:55:24 to release these documents in the same amount of time
00:55:29 that the FDA took to evaluate them.
00:55:33 Yeah.
00:55:34 So Pfizer is offering to help the FDA
00:55:43 to petition the judge to not enforce the timeline
00:55:51 that he seems to be moving towards.
00:55:54 So for people who are not familiar,
00:55:55 we’re talking about the Freedom of Information Act request
00:55:59 to release the Pfizer vaccine data, study data
00:56:05 to release as much of the data as possible,
00:56:07 like the raw data, the details,
00:56:08 or actually not even the raw data,
00:56:10 it’s data, doesn’t matter, there’s details to it.
00:56:14 And I think the response from the FDA is that of course,
00:56:20 yes, of course, but we can only publish
00:56:25 like some X number of pages a day.
00:56:29 500 pages.
00:56:31 500 pages of data.
00:56:32 It’s not a day though, it’s a week I think.
00:56:36 The point is whatever they’re able to publish is ridiculous.
00:56:39 It’s like my printer can only print three pages a day
00:56:45 and we cannot afford a second printer.
00:56:48 So it’s some kind of bureaucratic language for,
00:56:52 there’s a process to this, and now you’re saying
00:56:56 that Pfizer is obviously more engaged
00:57:00 in helping this kind of bureaucratic process prosper
00:57:04 in its full absurdity, Kafkaesque absurdity.
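For a sense of where figures like 55 or 75 years come from, a quick back-of-the-envelope calculation; the page total and release rate below are assumptions for illustration, not the official numbers filed in the case.

```python
# Rough arithmetic behind a multi-decade release schedule.
# Both numbers are illustrative assumptions, not the figures filed in the lawsuit.
total_pages = 450_000    # assumed size of the document set
pages_per_month = 500    # assumed release rate

years = total_pages / pages_per_month / 12
print(f"{total_pages:,} pages at {pages_per_month} pages/month: about {years:.0f} years")
```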
00:57:08 So what is this?
00:57:11 This really bothered people.
00:57:13 This really.
00:57:14 This is really troublesome.
00:57:15 And just to put it in just plain English terms,
00:57:19 Pfizer’s making the case that it can’t,
00:57:24 the FDA and Pfizer together are making the case
00:57:27 that they can’t go through the documents.
00:57:29 It’s gonna take them some number of
00:57:33 hundreds of times more time to go through the documents
00:57:37 than the FDA required to go through the documents
00:57:39 to approve the vaccines,
00:57:42 to give the vaccines full FDA approval.
00:57:44 And the FDA’s argument, talk about Kafkaesque,
00:57:48 is that to do it more rapidly
00:57:51 would cost them $3 million.
00:57:54 $3 million equals one hour of vaccine sales over two years.
00:58:01 One hour of sales.
00:58:04 And they can’t come up with the money.
00:58:05 And now Pfizer has joined the suit
00:58:08 to help the FDA fight off this judge, this mean judge,
00:58:11 who thinks they ought to release the data.
00:58:12 But evidently Pfizer isn’t offering
00:58:15 to come up with the $3 million either.
00:58:17 So, but for $3 million, I mean, maybe,
00:58:21 maybe the FDA should do a GoFundMe campaign.
00:58:25 Well, obviously the money thing,
00:58:28 I mean, I’m sure if Elon Musk comes along and says,
00:58:31 I’ll give you $100 million, publish it now,
00:58:35 I think they’ll come up with another.
00:58:37 So, I mean, it’s clear that there’s cautiousness.
00:58:43 I don’t know the source of it from the FDA.
00:58:47 There’s only one explanation that I can think of,
00:58:50 which is that the FDA and Pfizer
00:58:53 don’t wanna release the data.
00:58:55 They don’t wanna release the three
00:58:57 or 500,000 pages of documents.
00:59:03 And I don’t know what’s in there.
00:59:05 I wanna say one thing very clearly.
00:59:08 I am not an anti vaxxer.
00:59:10 I believe the vaccines work.
00:59:11 I believe everybody should get vaccinated.
00:59:15 The evidence is clear that if you’re vaccinated,
00:59:17 you reduce your risk of dying of COVID by 20 fold.
00:59:20 And we’ve got new sub variants coming along.
00:59:23 And I just wanna be very clear about this.
00:59:26 That said, there’s something I would give you 10 to one odds
00:59:32 on a bet that there’s something in that data
00:59:35 that is gonna be embarrassing to either FDA or Pfizer
00:59:40 or both.
00:59:41 So there’s two options.
00:59:42 I agree with you 100%.
00:59:43 One is they know of embarrassing things.
00:59:46 That’s option one.
00:59:48 And option two, they haven’t invested enough
00:59:51 to truly understand the data.
00:59:54 Like, I mean, it’s a lot of data
00:59:56 that they have a sense
00:59:58 that might be something embarrassing in there.
01:00:00 And if we release it,
01:00:02 surely the world will discover the embarrassing things.
01:00:04 And to sort of steelman their argument,
01:00:08 the press,
01:00:11 the people will take the small embarrassing things
01:00:14 and blow them up into big things.
01:00:16 Yes, and support the anti vax campaign.
01:00:20 I think that’s all possible.
01:00:22 Nonetheless, the data are about the original clinical trial.
01:00:27 And the emergency use authorization was based
01:00:33 on the first few months of the data from that trial.
01:00:36 And it was a two year trial.
01:00:37 The rest of that data has not been opened up
01:00:40 and there was not an advisory committee meeting
01:00:43 to look at that data
01:00:44 when the FDA granted full authorization.
01:00:47 Again, I am pro vaccine.
01:00:49 I am not making an anti vax argument here.
01:00:52 But I suspect that there’s something pretty serious
01:00:56 in that data.
01:00:57 And the reason why I’m not an anti vaxxer,
01:01:00 having not been able to see the data
01:01:03 that the FDA and Pfizer seem willing
01:01:06 not just to put effort into preventing the release of,
01:01:09 but to invest quite a bit of energy
01:01:12 in not releasing,
01:01:16 The reason why that doesn’t tip me over
01:01:18 into the anti vaxxer side
01:01:20 is because that’s clinical trial data,
01:01:22 early clinical trial data
01:01:23 that involved several thousand people.
01:01:25 We now have millions of data points
01:01:28 from people who have had the vaccine.
01:01:31 This is real world data,
01:01:32 showing the efficacy of the vaccines.
01:01:35 And so far, knock on wood,
01:01:38 there aren’t side effects
01:01:41 that overcome the benefits of vaccine.
01:01:45 So I’m with you.
01:01:46 I’m now, I guess, three shots of the vaccine.
01:01:53 But there’s a lot of people that are kind of saying,
01:01:55 well, even the data on the real world use large scale data
01:02:03 is messy.
01:02:05 The way it’s being reported,
01:02:06 the way it’s being interpreted.
01:02:08 Well, one thing is clear to me
01:02:11 that it is being politicized.
01:02:13 I mean, if you just look objectively,
01:02:17 don’t have to go to at the shallow surface level.
01:02:21 It seems like there’s two groups
01:02:25 that I can’t even put a term to it
01:02:29 because it’s not really pro vaccine versus anti vaccine
01:02:32 because it’s pro vaccine, triple mask, Democrat, liberal,
01:02:41 and then anti mandate, whatever those groups are.
01:02:44 I can’t quite, cause they’re changing.
01:02:46 Anti mask, but not really, but kind of.
01:02:50 So those two groups that feel political in nature,
01:02:53 not scientific in nature, they’re bickering.
01:02:56 And then it’s clear that this data is being interpreted
01:03:01 by the different groups differently.
01:03:04 And it’s very difficult for me as a human being
01:03:07 to understand where the truth lies,
01:03:11 especially given how much money is flying around
01:03:14 on all sides.
01:03:15 So the anti vaxxers can make a lot of money too.
01:03:19 Let’s not forget this.
01:03:20 From the individual perspective,
01:03:22 you can become famous being an anti vaxxer.
01:03:25 And so there’s a lot of incentives on all sides here.
01:03:28 And there’s real human emotion and fear
01:03:33 and also credibility.
01:03:37 Scientists don’t wanna ruin their reputation
01:03:41 if they speak out in whatever, like speak their opinion
01:03:45 or they look at some slice of the data
01:03:49 and begin to interpret it in some kind of way.
01:03:51 They’re very, it’s clear that fear is dominating
01:03:53 the discourse here, especially in the scientific community.
01:03:57 So I don’t know what to make of that.
01:04:01 And the only happy party here is Pfizer.
01:04:06 It's just plowing ahead.
01:04:08 I mean, with every single variant,
01:04:13 there’s very, I would say, outside of arguably
01:04:20 a very flawed system, there’s a lot of incredible
01:04:23 scientific and engineering work being done
01:04:25 in constantly developing new, like antiviral drugs,
01:04:29 new vaccines to deal with the variants.
01:04:33 So they’re happily being a capitalist machine.
01:04:37 And it’s very difficult to know what to do with that.
01:04:43 And let’s just put this in perspective for folks.
01:04:46 The best selling drug in the world has been Humira
01:04:49 for a number of years.
01:04:51 It’s approved for the treatment of rheumatoid arthritis
01:04:55 and eight other indications.
01:04:57 And it’s sold about $20 billion globally
01:05:02 over the past few years.
01:05:03 It peaked at that level.
01:05:07 Pfizer expects to sell $65 billion of vaccine
01:05:12 in the first two years of the pandemic.
01:05:16 So this is by far the biggest selling
01:05:19 and most profitable drug that’s ever come along.
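A rough back-of-the-envelope check on the earlier "one hour of sales" claim, a minimal sketch assuming the roughly $65 billion two-year revenue figure just cited and evenly spread sales; the underlying numbers are the conversation's, not independently verified here.

```python
# Rough sanity check: one hour of vaccine sales,
# assuming ~$65 billion spread evenly over two years (figure from the conversation)
total_sales_usd = 65e9
hours_in_two_years = 2 * 365 * 24
sales_per_hour = total_sales_usd / hours_in_two_years
print(f"~${sales_per_hour / 1e6:.1f} million per hour")  # ~= $3.7 million,
# in the same ballpark as the "$3 million equals one hour of sales" claim above
```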
01:05:22 I can ask you a difficult question here.
01:05:28 In the fog that we’re operating in here,
01:05:34 on the Pfizer BioNTech vaccine,
01:05:40 what was done well and what was done badly
01:05:43 that you can see now, it seems like we’ll know
01:05:47 more, decades from now.
01:05:50 Yes.
01:05:51 But now in the fog of today with the $65 billion
01:05:57 flying around, where do you land?
01:06:03 So we’re gonna get to what I think is one of the key problems
01:06:08 with the pharmaceutical industry model in the United States
01:06:12 about being profit driven.
01:06:15 So in 2016, the NIH did the key infrastructure work
01:06:22 to make mRNA vaccines.
01:06:26 That gets left out of the discussion a lot.
01:06:29 And Pfizer BioNTech actually paid royalties voluntarily
01:06:35 to the NIH.
01:06:36 I don’t know how much it was.
01:06:36 I don’t think it was a whole lot of money,
01:06:38 but I think they wanted to avoid the litigation
01:06:41 that Moderna got itself into by just taking that 2016
01:06:45 knowledge and having that be the foundation
01:06:48 of their product.
01:06:50 So Pfizer took that and they did their R&D,
01:06:54 they paid for their R&D having received that technology.
01:06:59 And when they got the genetic code from China
01:07:03 about the virus, they very quickly made a vaccine
01:07:09 and the vaccine works.
01:07:10 And President Trump to his credit launched
01:07:14 Operation Warp Speed and just threw money at the problem.
01:07:18 They just said, pay them whatever they want.
01:07:22 We spent five times more per person than the EU early on.
01:07:26 Let's just get this going.
01:07:28 And Americans were vaccinated more quickly.
01:07:32 We paid a lot of money.
01:07:34 The one mistake that I think the federal government made
01:07:37 was they were paying these guaranteed fortunes
01:07:41 and they didn’t require that the companies participate
01:07:45 in a program to do global vaccinations.
01:07:50 So the companies doing their business model
01:07:53 distributed the vaccines where they would make
01:07:56 the most money.
01:07:57 And obviously they would make the most money
01:07:59 in the first world.
01:08:00 And almost I think 85% of the vaccines early on
01:08:04 went to the first world and very, very few vaccinations
01:08:08 went to the third world.
01:08:10 So what happened is there was such a low vaccination rate
01:08:16 in May of 2021 that there was an all hands on deck cry for help.
01:08:23 The World Trade Organization,
01:08:26 the World Health Organization, the IMF and the World Bank
01:08:31 made a plea for $50 billion so that we could get
01:08:36 to 40% vaccination rate in the third world
01:08:40 by the end of 2021.
01:08:44 And it was unrequited, nobody answered.
01:08:48 And now Africa has about an 8.9% vaccination rate.
01:08:54 India is coming up, but it’s been very low.
01:08:57 The problem with all this is I believe those mRNA vaccines
01:09:02 are excellent vaccines.
01:09:04 But if we leave the third world unvaccinated,
01:09:07 we’re gonna have a constant supply of variants of COVID
01:09:12 that are gonna come back into the United States
01:09:15 and harm Americans exactly like Delta and Omicron have.
01:09:20 So we’ve made a great drug, it reduces the risk of mortality
01:09:25 in Americans who get it by a lot.
01:09:28 But we’re not doing what we need to do
01:09:31 to protect Americans from Omicron.
01:09:33 You don’t have to be an idealist
01:09:34 and worry about global vaccine equity.
01:09:36 If you’re just ordinary selfish people like most of us are,
01:09:41 and you’re worried about the health of Americans,
01:09:43 you would ensure global vaccine distribution.
01:09:47 Let me just make one more point.
01:09:49 That $50 billion that was requested
01:09:51 by the four organizations back in May of 2021,
01:09:55 32 billionaires made $50 billion
01:09:59 from the vaccines at that point,
01:10:01 took it into their private wealth.
01:10:03 So what had been taken,
01:10:05 this enormous amount of money that had been taken
01:10:06 into private wealth was enough to do
01:10:10 what those organizations said needed to be done
01:10:12 to prevent the sub variants from coming back
01:10:15 and doing what they’re doing.
01:10:16 So the money was there, but how does the motivation,
01:10:19 the money driven motivation of Big Pharma lead to that,
01:10:22 that kind of allocation of vaccines?
01:10:28 Because they can make more money in the United States.
01:10:31 They’re gonna distribute their vaccines
01:10:33 where they can make the most money.
01:10:34 Right, is there a malevolent aspect to this
01:10:40 where, boy, I don’t like saying this,
01:10:44 but that they don’t see it as a huge problem
01:10:49 that variants will come back to the United States?
01:10:53 I think it’s the issue we were talking about earlier on
01:10:56 where they’re in a different culture
01:10:58 and their culture is that their moral obligation,
01:11:02 as Milton Friedman would say,
01:11:04 is to maximize the profits
01:11:06 that they return to shareholders.
01:11:07 And don’t think about the bigger picture.
01:11:10 The collateral damage, don’t think about the collateral.
01:11:12 And also kind of believe, convince yourself
01:11:16 that if we give into this capitalist machine
01:11:20 in this very narrow sense of capitalism,
01:11:23 that in the end, they’ll do the most good.
01:11:25 This kind of belief that like,
01:11:28 if we just maximize profits, we’ll do the most good.
01:11:32 Yeah, that’s an orthodoxy of several decades ago.
01:11:36 And I don’t think people can really say that in good faith.
01:11:40 When you’re talking about vaccinating the third world
01:11:43 so we don’t get hurt,
01:11:44 it’s a little bit hard to make the argument
01:11:47 that the world’s a better place
01:11:48 because the profits of the investors went up.
01:11:51 Yeah, but at the same time,
01:11:54 I think that’s a belief you can hold.
01:11:58 I mean, I’ve interacted with a bunch of folks that kinda,
01:12:01 it’s the, I don’t wanna mischaracterize Ayn Rand, okay?
01:12:05 I respect a lot of people,
01:12:07 but there’s a belief that can take hold.
01:12:10 If I just focus on this particular maximization,
01:12:13 it will do the most good for the world.
01:12:16 The problem is when you choose what to maximize
01:12:19 and you put blinders on,
01:12:20 it’s too easy to start making gigantic mistakes
01:12:24 that have a big negative impact on society.
01:12:28 So it’s really matters what you’re maximizing.
01:12:30 Right, and if we had a true democracy
01:12:33 and everybody had one vote,
01:12:36 everybody got decent information and had one vote,
01:12:39 Ayn Rand’s position would get some votes, but not many,
01:12:44 and it would be way outvoted by the common people.
01:12:48 Let me ask you about this very difficult topic.
01:12:53 I’m talking to Mark Zuckerberg of Metta,
01:13:00 the topic of censorship.
01:13:03 I don’t know if you’ve heard,
01:13:04 but there’s a guy named Robert Malone and Peter McCullough
01:13:08 that were removed from many platforms
01:13:10 for speaking about the COVID vaccine as being risky.
01:13:14 They were both on Joe Rogan’s program.
01:13:17 What do you think about censorship in this space?
01:13:23 In this difficult space where so much is controlled by,
01:13:29 not controlled, but influenced by advertisements
01:13:31 from Big Pharma,
01:13:32 and science can even be influenced by Big Pharma.
01:13:39 Where do you lean on this?
01:13:41 Should we lean towards freedom
01:13:46 and just allow all the voices,
01:13:50 even those that go against the scientific consensus?
01:13:54 Is that one way to fight the science
01:13:59 that is funded by Big Pharma,
01:14:02 or does that do more harm than good,
01:14:05 having too many voices that are contending here?
01:14:08 Should the ultimate battle be fought
01:14:10 in the space of scientific publications?
01:14:15 And particularly in the era of COVID,
01:14:19 where there are large public health ramifications
01:14:22 to this public discourse, the ante is way up.
01:14:27 So I don’t have a simple answer to that.
01:14:31 I think everyone’s allowed their own opinion.
01:14:34 I don’t think everyone’s allowed their own scientific facts.
01:14:38 And how we develop a mechanism
01:14:43 that’s other than an open internet
01:14:45 where whoever is shouting the loudest gets the most clicks
01:14:49 and rage creates value on the internet,
01:14:54 I think that’s not a good mechanism for working this out.
01:14:58 And I don’t think we have one.
01:14:59 I don’t have a solution to this.
01:15:01 I mean, ideally, if we had a philosopher king,
01:15:05 we could have a panel of people
01:15:08 who were not conflicted by rigid opinions
01:15:13 decide on what the boundaries of public discourse might be.
01:15:19 I don’t think it should be fully open.
01:15:21 I don’t think people who are making,
01:15:25 who are committed to an anti vaccine position
01:15:28 and will tailor their interpretation
01:15:31 of complex scientific data to support their opinion,
01:15:34 I think that can be harmful.
01:15:36 Constraining their speech can be harmful as well.
01:15:39 So I don’t have an answer here.
01:15:41 But yeah.
01:15:42 I tend to believe that it’s more dangerous
01:15:45 to censor anti vax messages.
01:15:49 The way to defeat anti vax messages
01:15:53 is by being great communicators,
01:15:56 by being great scientific communicators.
01:15:58 So it’s not that we need to censor
01:16:02 the things we don’t like.
01:16:04 We need to be better at communicating
01:16:06 the things we do like,
01:16:08 or the things that we do believe represent
01:16:10 the deep scientific truth.
01:16:13 Because I think if you censor,
01:16:18 you get worse at doing science
01:16:22 and you give the wrong people power.
01:16:27 So I tend to believe that you should give power
01:16:30 to the individual scientists
01:16:33 and also give them the responsibility
01:16:35 of being better educators, communicators,
01:16:38 expressers of scientific ideas,
01:16:41 put pressure on them to release data,
01:16:43 to release that data in a way that’s easily consumable,
01:16:46 not just like very difficult to understand,
01:16:49 but in a way that can be understood
01:16:50 by a large number of people.
01:16:52 So the battle should be fought
01:16:54 in the open space of ideas
01:16:57 versus in the quiet space of journals.
01:17:02 I think we no longer have that comfort,
01:17:05 especially at the highest of stakes.
01:17:08 So this kind of idea that a couple of peer reviewers
01:17:11 decide the fate of billions
01:17:14 doesn’t seem to be sustainable,
01:17:18 especially given a very real observation now
01:17:24 that the reason Robert Malone has a large following
01:17:30 is there’s a deep distrust of institutions,
01:17:33 deep distrust of scientists,
01:17:34 of science as an institution,
01:17:37 of power centers, of companies, of everything,
01:17:41 and perhaps rightfully so.
01:17:43 But the way to defend against that
01:17:45 is not for the powerful to build a bigger wall.
01:17:49 It’s for the powerful to be authentic
01:17:53 and maybe a lot of them to get fired,
01:17:55 and for new minds, for new fresh scientists,
01:17:59 ones who are more authentic, more real,
01:18:01 better communicators to step up.
01:18:03 So I fear censorship
01:18:06 because it feels like censorship
01:18:09 is an even harder job to do it well
01:18:13 than being good communicators.
01:18:16 And it seems like it’s always the C students
01:18:19 that end up doing the censorship.
01:18:21 It’s always the incompetent people,
01:18:24 and not just the incompetent, but the biggest whiners.
01:18:28 So what happens is the people
01:18:32 that get the most emotional and the most outraged
01:18:36 will drive the censorship.
01:18:39 And it doesn’t seem like reason drives the censorship.
01:18:42 That’s just objectively observing
01:18:44 how censorship seems to work in this current moment.
01:18:47 So there’s so many forms of censorship.
01:18:50 You look at the Soviet Union
01:18:51 or the propaganda or Nazi Germany,
01:18:54 it’s a very different level of censorship.
01:18:56 People tend to conflate all of these things together.
01:18:59 Social media trying desperately to have trillions
01:19:03 or hundreds of billions of exchanges a day,
01:19:07 and try to make sure that their platform
01:19:10 has some semblance of, quote, healthy conversations.
01:19:16 People just don’t go insane.
01:19:18 They actually like using the platform,
01:19:20 and they censor based on that.
01:19:23 That’s a different level of censorship.
01:19:24 But even there, you can really run afoul
01:19:28 of letting the whiny C students
01:19:32 control too much of the censorship.
01:19:34 I believe you should actually put the responsibility
01:19:39 on the self proclaimed holders of truth,
01:19:42 AKA scientists, at being better communicators.
01:19:46 I agree with that.
01:19:47 I’m not advocating for any kind of censorship.
01:19:51 But Marshall McLuhan was very influential
01:19:55 when I was in college.
01:19:57 And that meme of his, the medium is the message.
01:20:03 It’s a little bit hard to understand
01:20:04 when you’re comparing radio to TV
01:20:06 and saying radio’s hotter or TV’s hotter or something.
01:20:09 But we now have the medium as the message
01:20:12 in a way that we’ve never seen,
01:20:14 we’ve never imagined before,
01:20:16 where rage and anger and polarization
01:20:22 are what drives the traffic on the internet.
01:20:28 And we don’t, it’s a question of building the commons.
01:20:34 Ideally, I don’t know how to get there,
01:20:36 so I’m not pretending to have a solution.
01:20:38 But the commons of discourse about this particular issue,
01:20:42 about vaccines, has been largely destroyed by the edges,
01:20:47 by the drug companies and the advocates on the one side
01:20:50 and the people who just criticize and think
01:20:54 that even though the data are flawed
01:20:57 that there’s no way vaccines can be beneficial.
01:21:00 And to have those people screaming at each other
01:21:04 does nothing to improve the health
01:21:07 of the 95% of the people in the middle
01:21:10 who want to know what the rational way to go forward is
01:21:16 and protect their families from COVID
01:21:18 and live a good life
01:21:20 and be able to participate in the economy.
01:21:22 And that’s the problem.
01:21:25 I don’t have a solution.
01:21:26 Well, there’s a difficult problem for Spotify and YouTube.
01:21:29 I don’t know if you heard,
01:21:30 this is a thing that Joe Rogan is currently going through.
01:21:33 As a platform, whether to censor the conversation
01:21:36 that, for example, Joe’s having.
01:21:39 So I don’t know if you heard,
01:21:40 but Neil Young and other musicians have kind of spoke out
01:21:43 and saying they’re going to leave the platform
01:21:45 because Joe Rogan is allowed to be on this platform
01:21:49 having these kinds of conversations
01:21:51 with the likes of Robert Malone.
01:21:54 And it’s clear to me that Spotify and YouTube
01:21:57 are being significantly influenced
01:21:59 by these extreme voices, like you mentioned, on each side.
01:22:03 And it’s also clear to me that Facebook is the same
01:22:05 and it was going back and forth.
01:22:07 In fact, that’s why Facebook has been oscillating
01:22:10 on the censorship is like one group gets louder
01:22:12 than the other, depending on whether it’s an election year.
01:22:19 There’s several things to say here.
01:22:21 So one, it does seem, I think you put it really well,
01:22:24 it would be amazing if these platforms
01:22:26 could find mechanisms to listen to the center,
01:22:29 to the big center that’s actually going to be affected
01:22:34 by the results of our pursuit of scientific truth.
01:22:40 And listen to those voices.
01:22:42 I also believe that most people are intelligent enough
01:22:45 to process information and to make up their own minds.
01:22:49 Like they’re not, in terms of,
01:22:54 it’s complicated, of course,
01:22:55 because we’ve just been talking about advertisement
01:22:57 and how people can be influenced.
01:22:58 But I feel like if you have raw, long form podcasts
01:23:05 or programs where people express their mind
01:23:08 and express their argument in full,
01:23:12 I think people can hear it to make up their own mind.
01:23:15 And if those arguments have a platform on which
01:23:18 they can live, then other people could provide
01:23:21 better arguments if they disagree with it.
01:23:23 And now we as human beings, as rational,
01:23:26 as intelligent human beings, can look at both
01:23:29 and make up our own minds.
01:23:30 And that’s where social media can be very good
01:23:33 at this collective intelligence.
01:23:35 We together listen to all of these voices
01:23:39 and make up our own mind.
01:23:40 Humble ourselves, actually, often.
01:23:42 You think, you know, like you’re an expert,
01:23:46 say you have a PhD in a certain thing,
01:23:48 so there’s this confidence that comes with that.
01:23:50 And the collective intelligence, uncensored,
01:23:54 allows you to humble yourself eventually.
01:23:56 Like as you discover, all it takes is a few times,
01:24:01 you know, looking back five years later,
01:24:05 realizing I was wrong.
01:24:07 And that’s really healthy for a scientist.
01:24:09 That’s really healthy for anybody to go through.
01:24:11 And only through having that open discourse
01:24:13 can you really have that.
01:24:15 That said, Spotify also, just like Pfizer is a company,
01:24:20 which is why this podcast,
01:24:26 I don’t know if you know what RSS feeds are,
01:24:29 but podcasts can’t be censored.
01:24:31 So Joe’s in the unfortunate position
01:24:33 he only lives on Spotify.
01:24:35 So Spotify has been actually very good
01:24:37 at saying we’re staying out of it for now.
01:24:41 But RSS, this is pirate radio.
01:24:44 Nobody can censor it, it’s the internet.
01:24:47 So financially, in terms of platforms,
01:24:51 this cannot be censored,
01:24:53 which is why podcasts are really beautiful.
01:24:56 And so if Spotify or YouTube wants to be
01:25:01 the host of podcasts,
01:25:04 I think where they flourish is free expression,
01:25:10 no matter how crazy.
01:25:12 Yes, but I do wanna push back a little bit on what you’re saying.
01:25:18 I have anti vax friends who I love.
01:25:23 They’re dear, cherished friends.
01:25:26 And they’ll send me stuff.
01:25:28 And it’ll take me an hour to go through what they sent
01:25:34 to see if it is credible.
01:25:37 And usually it’s not.
01:25:40 It’s not a random sample of the anti fax argument.
01:25:42 I’m not saying I can disprove the anti fax argument.
01:25:46 But I am saying that it’s almost like we were talking about
01:25:50 how medical science clinical trials,
01:25:54 the presentation of clinical trials to physicians
01:25:56 could be improved.
01:25:57 And the first thing we came up with
01:26:00 is to have pre publication transparency
01:26:04 in the peer review process.
01:26:06 So bad information, biased information doesn’t get out
01:26:10 as if it’s legitimate, and you can’t put it back,
01:26:13 recapture it once it gets out.
01:26:16 I think there’s an element of that
01:26:18 in the arguments that are going on about vaccines.
01:26:21 And they’re on both sides.
01:26:23 But I think the anti vax side puts out more units
01:26:28 of information claiming to show that the vaccines don’t work.
01:26:33 And I guess in an ideal situation,
01:26:36 there would be real time fact checking by independent people,
01:26:41 not to censor it, but to just say that study was set up
01:26:45 to do this, and this is what the conclusions were.
01:26:47 So the way it was stated is on one side of this argument.
01:26:52 But that’s what I’m arguing.
01:26:53 I agree with you.
01:26:55 What I’m arguing is that this big network of humans
01:26:58 that we have, that is the collective intelligence,
01:27:00 can’t do that real time if you allow it to,
01:27:04 if you encourage people to do it.
01:27:05 And the scientists, as opposed to, listen,
01:27:08 I interact with a lot of colleagues,
01:27:10 a lot of friends that are scientists,
01:27:12 they roll their eyes.
01:27:14 Their response is like, ugh.
01:27:16 Like they don’t want to interact with this.
01:27:18 But that’s just not the right response.
01:27:22 When a huge number of people believe this,
01:27:26 it is your job as communicators to defend your ideas.
01:27:30 It is no longer the case that you go to a conference
01:27:33 and defend your ideas to two other nerds
01:27:36 that have been working on the same problem forever.
01:27:38 I mean, sure, you can do that,
01:27:40 but then you’re rejecting the responsibility
01:27:44 you have explicitly or implicitly accepted
01:27:48 when you go into this field,
01:27:49 that you will defend the ideas of truth.
01:27:52 And the way to defend them is in the open battlefield
01:27:55 of ideas, and become a better communicator.
01:27:58 And I believe that when you have a lot,
01:28:00 you said you invested one or two hours
01:28:02 in this particular case, but that's little ants interacting
01:28:06 at scale, I think that allows us to progress towards truth.
01:28:12 At least, you know, at least I hope so.
01:28:14 I think you’re an optimist.
01:28:15 I want to work with you a little bit on this.
01:28:17 Let’s say a person like Joe Rogan,
01:28:22 who, by the way, had me on his podcast and let me.
01:28:26 It’s an amazing conversation, I really enjoyed it.
01:28:28 Well, thank you.
01:28:29 I did too.
01:28:29 And I didn’t know Joe.
01:28:31 I didn’t know much about his podcast.
01:28:32 He pushed back on Joe a bunch, which is great.
01:28:35 And he was a gentleman, and we had it out.
01:28:38 In fact, he put one clip, at one point,
01:28:41 he said something that was a little bit wrong,
01:28:43 and I corrected him.
01:28:44 And he had the guy who.
01:28:46 Jamie.
01:28:47 Jamie, he had Jamie check it,
01:28:48 and was very forthright in saying,
01:28:51 yeah, you know, John’s got a right here.
01:28:53 We gotta modify this.
01:28:54 In any event, in any event.
01:28:56 You got him.
01:28:58 Well, I wasn’t trying to get him,
01:28:59 I was just trying to. No, no, no, no.
01:29:01 Totally, it was a beautiful exchange.
01:29:03 There was so much respect in the room,
01:29:04 pushing back and forth, it was great.
01:29:06 Yeah, so I respect him.
01:29:08 And I think when he has somebody on
01:29:13 who’s a dyed in the wool anti faxer,
01:29:16 the question is, how can you balance,
01:29:21 if it needs balance, in real time?
01:29:24 I’m not talking about afterwards.
01:29:26 I’m talking in real time.
01:29:27 Maybe you record, well, he does record it, obviously.
01:29:30 But maybe when there’s a statement made
01:29:33 that is made as if it’s fact based,
01:29:38 maybe that statement should be checked by
01:29:41 some folks who,
01:29:45 imaginary folks who are trustworthy.
01:29:48 And in real time, as that discussion
01:29:51 is being played on the podcast,
01:29:54 to show what independent experts say about that claim.
01:29:59 That’s a really interesting idea.
01:30:00 By the way, for some reason,
01:30:01 this idea popped into my head now.
01:30:03 I think real time is very difficult,
01:30:05 and it’s not difficult,
01:30:07 but it kind of ruins the conversation
01:30:09 because you want the idea to breathe.
01:30:11 I think what’s very possible is before it’s published,
01:30:15 it’s the pre publication, before it’s published,
01:30:18 you let a bunch of people review it,
01:30:20 and they can add their voices in post.
01:30:23 Before it’s published, they can add arguments,
01:30:29 arguments against certain parts.
01:30:31 That’s very interesting to sort of,
01:30:32 as one podcast, publish addendums.
01:30:37 Publish the peer review together with the publication.
01:30:40 That’s very interesting.
01:30:43 I might actually do that.
01:30:44 That’s really interesting.
01:30:45 Because I’ve been doing more debates
01:30:47 where at the same time have multiple people,
01:30:51 which has a different dynamic
01:30:53 because both people, I mean,
01:30:56 it’s really nice to have the time to pause
01:30:58 just by yourself to fact check,
01:31:02 to look at the study that was mentioned,
01:31:04 to understand what’s going on.
01:31:05 So the peer review process, to have a little bit of time.
01:31:09 That’s really interesting.
01:31:10 I actually would, I’d like to try that.
01:31:14 To agree with you on some point in terms of anti vax,
01:31:17 I’ve been fascinated by listening to arguments
01:31:20 from this community of folks that’s been quite large
01:31:23 called the flat earthers,
01:31:25 the people that believe the earth is flat.
01:31:28 And I don’t know if you’ve ever listened to them
01:31:30 or read their arguments,
01:31:33 but it’s fascinating how consistent
01:31:36 and convincing it all sounds
01:31:37 when you just kind of take it in.
01:31:39 Just like, just take it in like listening normally.
01:31:43 It’s all very logical.
01:31:46 Like if you don’t think very,
01:31:49 well, no, so the thing is,
01:31:53 the reality is at the very basic human level
01:31:57 with our limited cognitive capabilities,
01:32:00 the earth is pretty flat when you go outside
01:32:03 and you look around.
01:32:04 So like when you use common sense reasoning,
01:32:08 it’s very easy to play to that,
01:32:09 to convince you that the earth is flat.
01:32:12 Plus there’s powerful organizations
01:32:13 that want to manipulate you and so on.
01:32:16 But then there’s the whole progress of science
01:32:20 and physics of the past,
01:32:22 but that’s difficult to integrate into your thought process.
01:32:26 So it’s very true that the people
01:32:29 should listen to flat earthers
01:32:30 because it was very revealing to me
01:32:33 how easy it is to be convinced of basically anything
01:32:39 by charismatic arguments.
01:32:42 And if we’re arguing about whether the earth is flat or not,
01:32:46 as long as we’re not navigating airplanes
01:32:48 and doing other kinds of things,
01:32:49 trying to get satellites to do transmission,
01:32:53 it’s not that important what I believe.
01:32:56 But if we’re arguing about how we approach
01:32:59 the worst public health crisis in,
01:33:02 I don’t know how long,
01:33:03 I think we’re getting worse than the Spanish flu now.
01:33:06 I don’t know what the total global deaths
01:33:07 with Spanish flu were, but in the United States,
01:33:10 we certainly have more deaths than we had from Spanish flu.
01:33:12 Plus the economic pain and suffering.
01:33:14 Yes, yes, and the damage to the kids in school and so forth.
01:33:19 We got a problem and it’s not going away, unfortunately.
01:33:23 So when we get a problem like that,
01:33:25 it’s not just an interesting bar room conversation
01:33:28 about whether the earth is flat.
01:33:30 There are millions of lives involved.
01:33:34 Let me ask you yet another question,
01:33:36 an issue I raised with Pfizer CEO, Albert Borla.
01:33:42 It’s the question of revolving doors.
01:33:45 That there seems to be a revolving door
01:33:47 between Pfizer, FDA, and CDC.
01:33:51 People that have worked at the FDA,
01:33:53 now work at Pfizer, and vice versa,
01:33:56 including the CDC and so on.
01:34:00 What do you think about that?
01:34:01 So first of all, his response, once again,
01:34:03 is there’s rules, there’s very strict rules,
01:34:06 and we follow them.
01:34:08 Do you think that’s a problem?
01:34:11 Hoo ha.
01:34:12 And also, maybe this is a good time to talk about
01:34:16 this idea that Pfizer plays by the rules.
01:34:19 One at a time?
01:34:20 One at a time.
01:34:21 Okay, and this isn’t even about Pfizer,
01:34:22 but it’s an answer to the question.
01:34:24 Yes.
01:34:25 So there’s this drug, Ajihelm,
01:34:27 that was approved by the FDA maybe six months ago.
01:34:31 It’s a drug to prevent the progression
01:34:34 of low grade Alzheimer’s disease.
01:34:38 The target for drug development for Alzheimer’s disease
01:34:43 has been reducing the amyloid plaques in the brain,
01:34:47 which correlate with the progression of Alzheimer’s.
01:34:52 And Biogen showed that its drug, Aduhelm,
01:34:57 reduces amyloid plaques in the brain.
01:35:00 They did two clinical trials
01:35:03 to determine the clinical efficacy,
01:35:05 and they found that neither trial showed a meaningful benefit.
01:35:09 And in those two trials,
01:35:12 33% more people in the Aduhelm group
01:35:15 developed symptomatic brain swelling and bleeding
01:35:19 than people in the placebo group.
01:35:22 There was an advisory committee convened
01:35:27 to debate and determine how they felt
01:35:30 about the approvability of Aduhelm, given those facts.
01:35:35 And those facts aren’t in dispute.
01:35:37 They’re in Biogen slides, as well as FDA documents.
01:35:41 The advisory committee voted 10 against approval
01:35:47 with one abstention.
01:35:49 So that's an essentially universal,
01:35:52 unanimous vote against approving Aduhelm.
01:35:56 Now, the advisory committees have been pretty much cleansed
01:36:00 of financial conflicts of interest.
01:36:03 So this advisory committee votes 10 no, one abstention,
01:36:09 and the FDA overrules the unanimous opinion
01:36:13 of its advisory committee and approves the drug.
01:36:17 Three of the members of the advisory committee resign.
01:36:21 They say, we’re not gonna be part,
01:36:22 if the FDA is not gonna listen to a unanimous vote
01:36:24 against approving this drug,
01:36:26 which shows more harm than benefit, undisputed,
01:36:31 we’re not gonna participate in this.
01:36:33 And the argument against approval
01:36:36 is that the surrogate endpoint,
01:36:38 the reduction of amyloid plaques,
01:36:43 is known by the FDA not to be a valid clinical indicator.
01:36:48 It doesn’t correlate, 27 studies have shown,
01:36:50 it doesn’t correlate with clinical progression,
01:36:53 interrupting the amyloid plaques
01:36:54 doesn’t mean that your Alzheimer’s doesn’t get worse.
01:37:02 So it seems like it’s a slam dunk
01:37:05 and the FDA made a mistake and they should do whatever
01:37:09 they do to protect their bureaucratic reputation.
01:37:12 So the head of the FDA bureau,
01:37:15 the Center for Drug Evaluation and Research
01:37:17 that approves new drugs, who had spent 16 years
01:37:21 as an executive in the pharmaceutical industry,
01:37:25 issued a statement and said,
01:37:28 “‘What we should do in this situation
01:37:30 “‘is to loosen the prohibition of financial ties of interest
01:37:36 “‘with the drug companies,
01:37:38 “‘so we get less emotional responses.’”
01:37:43 Said this, it’s in print.
01:37:49 People are just too emotional about this.
01:37:51 People were just too emotional.
01:37:52 The 10 people who voted against it
01:37:55 and the zero people who voted for it,
01:37:56 it’s all too emotional.
01:37:58 So this gets back,
01:38:00 this is a long answer to your short question.
01:38:02 I think this is a wonderful window
01:38:04 into the thinking of the FDA
01:38:08 that financial conflicts of interest don’t matter
01:38:11 in a situation when I think it’s obvious
01:38:13 that they would matter.
01:38:15 But there’s not a direct financial conflict of interest.
01:38:18 It’s kinda, like it’s not, like Albert said, there’s rules.
01:38:26 I mean, you’re not allowed
01:38:27 to have direct financial conflicts of interest.
01:38:29 It’s indirect.
01:38:32 Right, but what I’m saying is,
01:38:34 I’m not denying what he said is true,
01:38:37 but the FDA, a high official in the FDA,
01:38:42 is saying that we need to allow conflicts of interest
01:38:45 in our advisory committee meetings.
01:38:48 Wow.
01:38:49 And that, she wants to change the rules.
01:38:53 Right.
01:38:54 So Albert Borla would still be playing by the rules,
01:38:58 but it just shows what one side of the thinking here is.
01:39:03 But you think that’s influenced by the fact
01:39:05 that there were pharmaceutical executives
01:39:07 working at the FDA and vice versa?
01:39:09 And they think that’s a great idea.
01:39:13 Who gets to fix this?
01:39:14 Do you think it should be just banned?
01:39:16 Like if you worked.
01:39:17 I don’t know, two separate questions.
01:39:19 One is should the officials at the FDA come from pharma
01:39:23 and vice versa?
01:39:24 Yes.
01:39:25 That’s one question.
01:39:26 And the other question is should advisory committee members
01:39:28 be allowed to have financial conflicts of interest?
01:39:31 Yes.
01:39:33 I think, in my opinion, and people might say I’m biased,
01:39:38 I think advisory committee people
01:39:40 should not have conflicts of interest.
01:39:42 I think their only interest ought to be the public interest.
01:39:44 And that was true from my understanding of the situation.
01:39:49 It’s the afterword in my book.
01:39:51 I spent some time studying Aduhelm.
01:39:54 I think it’s a slam dunk that there ought to be
01:39:56 no conflicts of interest.
01:39:57 Now the head of CDER, the Center for Drug Evaluation and Research,
01:40:01 thinks that that’s gonna give you a biased result
01:40:04 because we don’t have company influence.
01:40:07 And that, I think, shows how biased their thinking is.
01:40:14 That not having company influence is a bias.
01:40:19 Let me try to load that in.
01:40:21 I’m trying to empathize with the belief
01:40:23 that companies should have a voice at the table.
01:40:28 I mean, yeah, it’s part of the game.
01:40:30 They’ve convinced themselves
01:40:31 that this is how it should be played.
01:40:34 But they have a voice at the table.
01:40:36 They’ve designed the studies.
01:40:37 Right.
01:40:38 That’s their voice.
01:40:39 That’s the whole point.
01:40:40 They analyze the data.
01:40:41 I mean, what bigger voice do you deserve?
01:40:43 But I do also think, on the more challenging question,
01:40:47 I do think that there should be a ban.
01:40:50 If you work at a pharmaceutical company,
01:40:53 you should not be allowed to work
01:40:55 at any regulatory agency.
01:41:00 Yes.
01:41:01 You should not.
01:41:02 I mean, that, going back and forth,
01:41:03 it just, even if it’s 30 years later.
01:41:06 Yeah, I agree.
01:41:07 And I have another nomination for a ban.
01:41:11 We’re in this crazy situation
01:41:12 where Medicare is not allowed to negotiate
01:41:15 the price of drugs with the drug companies.
01:41:17 So the drug companies get a patent on a new drug.
01:41:20 Unlike every other developed country,
01:41:22 they can charge whatever they want
01:41:24 so they have a monopoly on a utility
01:41:27 because no one else can make the drug.
01:41:29 Charge whatever they want and Medicare has to pay for it.
01:41:31 And you say, how did we get in this crazy situation?
01:41:36 So how we got here is that in 2003,
01:41:39 when Medicare Part D was passed,
01:41:42 Billy Tauzin was head of the Ways and Means Committee
01:41:45 in the House, and played a key role in ushering this through
01:41:48 with its nonnegotiation clause.
01:41:52 And after it was passed,
01:41:53 Billy Tauzin did not finish out his term in Congress.
01:41:57 He went to PhRMA for a $2 million a year job.
01:42:02 This is incredible.
01:42:05 You might think that a ban on that would be a good idea.
01:42:09 I spoke with Francis Collins, head of the NIH,
01:42:12 on this podcast.
01:42:13 He and NIH have a lot of power over funding in science.
01:42:22 What are they doing right, what are they doing wrong
01:42:24 in this interplay with big pharma?
01:42:28 How connected are they?
01:42:32 Again, returning to the question,
01:42:33 what are they doing right,
01:42:35 what are they doing wrong in your view?
01:42:37 So my knowledge of the NIH is not as granular
01:42:41 as my knowledge of pharma.
01:42:44 That said, in broad brushstrokes,
01:42:47 the NIH is doing the infrastructure work
01:42:51 for all drug development.
01:42:53 I think they’ve participated in 100% of the drugs
01:42:56 that have been approved by the FDA
01:42:58 over the past 10 years or so.
01:43:01 They’ve done infrastructure work.
01:43:03 And what they do is not work on particular drugs,
01:43:08 but they do work on drug targets,
01:43:12 on targets in the human body that can be affected by drugs
01:43:16 and might be beneficial to turn on or off.
01:43:21 And then the drug companies, when they find a target
01:43:24 that is mutable and potentially beneficial,
01:43:29 then the drug companies can take the research
01:43:32 and choose to invest in the development
01:43:34 of a specific drug.
01:43:36 That’s our model.
01:43:38 Now, 96% of the research that’s done in clinical trials
01:43:44 in the United States is about drugs and devices.
01:43:47 And only a fraction of the 4% that’s left over
01:43:49 is about preventive medicine
01:43:51 and how to make Americans healthier.
01:43:54 I think, again, from the satellite view,
01:43:58 the NIH is investing more in science
01:44:04 that can lead to commercial development
01:44:07 rather than, as you said at the beginning of the podcast,
01:44:10 there’s no big fitness and lifestyle industry
01:44:13 that can counter pharma.
01:44:15 So I think at the NIH level, that countering can be done.
01:44:19 And the diabetes prevention program study
01:44:22 that we talked about before where lifestyle
01:44:24 was part of a randomized trial
01:44:26 and was shown to be more effective than metformin
01:44:28 at preventing the development of diabetes,
01:44:31 that is absolute proof positive
01:44:34 that investing in that kind of science
01:44:36 can produce good results.
01:44:37 So I think that we’re aimed at drug development
01:44:43 and what we ought to be aimed at
01:44:44 is an epidemiological approach
01:44:47 to improving the health of all Americans.
01:44:49 We rank 68th in the world in healthy life expectancy
01:44:55 despite spending an extra trillion and a half dollars a year.
01:44:59 And I believe strongly
01:45:02 that the reason why we’ve gotten in this crazy position
01:45:06 is because the knowledge that we’re producing
01:45:10 is about new drugs and devices
01:45:12 and it’s not about improving population health.
01:45:15 In this problem, the NIH is the perfect institution
01:45:19 to play a role in rebalancing our research agenda.
01:45:23 And some of that is on the leadership side
01:45:24 with Francis Collins and Anthony Fauci,
01:45:27 not just speaking about basically everything
01:45:32 that just leads to drug development, vaccine development,
01:45:34 but also speaking about healthy lifestyles
01:45:36 and speaking about health, not just sickness.
01:45:40 Yes, and investing, investing in health.
01:45:43 I mean, it’s like one feeds the other.
01:45:49 One, you have to communicate to the public
01:45:51 the importance of investing in health
01:45:53 and that leads to you getting props for investing in health
01:45:57 and then you can invest in health more and more
01:45:59 and that communicates, I mean,
01:46:01 everything that Anthony Fauci says or Francis Collins says
01:46:05 has an impact on scientists.
01:46:07 I mean, it sets the priorities.
01:46:12 I don’t think they, it’s the sad thing about leaders,
01:46:18 forgive me for saying the word, but mediocre leaders
01:46:22 is they don’t see themselves as part of a game.
01:46:26 They don’t see the momentum.
01:46:29 It’s like a fish in the water.
01:46:31 They don’t see the water.
01:46:32 Great leaders stand up and reverse the direction
01:46:36 of how things are going.
01:46:37 And I actually put a lot of responsibility,
01:46:39 some people say too much, but whatever.
01:46:43 I think leaders carry the responsibility.
01:46:46 I put a lot of responsibility on Anthony Fauci
01:46:48 and Francis Collins for not actually speaking
01:46:51 a lot more about health, and, even bigger,
01:46:55 not inspiring people with the power
01:47:01 and the trustworthiness of science.
01:47:05 You know, that’s on the shoulders of Anthony Fauci.
01:47:12 I’m gonna abstain from that
01:47:13 because I’m not expert enough, but.
01:47:15 Neither am I, but I’m opinionated.
01:47:18 I am too, but not on camera.
01:47:21 Yes.
01:47:22 No, but seriously, the problem is pretty simple,
01:47:27 that we’re investing 96% of our funding
01:47:31 of clinical research in drugs and devices
01:47:33 and 80% of our health is determined
01:47:36 by how we live our lives.
01:47:38 Yes.
01:47:39 And this is ridiculous.
01:47:42 The United States is going further and further
01:47:45 behind the other wealthy countries in terms of our health.
01:47:49 We ranked 38th in healthy life expectancy in 2000
01:47:53 and now we’re spending a trillion and a half dollars extra
01:47:56 and we rank 68th.
01:47:58 We’ve gone down.
01:47:59 You have this excellent, there’s a few charts
01:48:02 that I’ll overlay that tell this story
01:48:06 in really powerful ways.
01:48:09 So one is healthcare spending as a percentage of GDP,
01:48:13 where the X axis is years and the Y axis is percentage,
01:48:17 and the United States as compared to other countries
01:48:20 on average has been much larger and growing.
01:48:26 Right, we are now spending 7% more of our GDP,
01:48:30 17.7% versus 10.7% on healthcare.
01:48:35 7% and I think GDP is the fairest way
01:48:38 to compare healthcare spending.
01:48:40 Where per person in dollars we’re spending even,
01:48:43 the difference is even greater
01:48:45 but other costs vary with GDP.
01:48:48 So let’s stick with the conservative way to do it.
01:48:50 17.7, call it 18% of GDP spent on healthcare,
01:49:00 7% higher than the comparable country average.
01:49:04 Right.
01:49:05 17.7% versus 10.7, 7% higher.
01:49:09 Right and 7% of $23 trillion GDP
01:49:15 is more than $1.5 trillion a year in excess.
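A quick sketch of that excess-spending arithmetic, taking the 17.7% versus 10.7% shares and the roughly $23 trillion GDP figure from the conversation as given:

```python
# Excess US healthcare spending implied by the GDP shares discussed above
us_share, peer_share = 0.177, 0.107   # share of GDP spent on healthcare
gdp_usd = 23e12                       # approximate US GDP, figure from the conversation
excess = (us_share - peer_share) * gdp_usd
print(f"~${excess / 1e12:.2f} trillion per year in excess")  # ~= $1.6 trillion
```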
01:49:19 And then you have another chart that shows
01:49:21 healthcare system performance compared to spending.
01:49:25 And there’s a cloud, a point cloud of different countries.
01:49:29 The X axis being healthcare spending
01:49:33 is a percentage of GDP which we just talked about.
01:49:36 That US is 7% higher than everyone, the average.
01:49:40 And then on the Y axis is performance.
01:49:44 So X axis spending, Y axis performance.
01:49:48 And there’s a point cloud, we’ll overlay this
01:49:50 if you’re watching on YouTube,
01:49:52 of a bunch of countries that have high performance
01:49:58 for what they’re spending and then US
01:50:02 is all alone on the right bottom side of the chart
01:50:07 where it’s low performance and high spending.
01:50:10 Correct.
01:50:12 So this is a system whose spending
01:50:17 is directed by the most profitable ways
01:50:21 to deliver healthcare.
01:50:22 So you put that in the hands of big pharma.
01:50:25 As you maximize for profit, you’re going to decrease
01:50:28 performance and increase spending.
01:50:31 Yes, but I wanna qualify that and say
01:50:34 it’s not all big pharma’s fault.
01:50:37 They’re not responsible for all the problems
01:50:39 in our healthcare system.
01:50:41 They’re not responsible for the administrative costs
01:50:43 for example.
01:50:44 But they are the largest component of the rising,
01:50:49 our rising healthcare costs.
01:50:51 And it has to do with this knowledge issue.
01:50:54 Controlling the knowledge that doctors have
01:50:57 makes it so that doctors can live with this situation
01:51:01 believing that it’s optimal when it’s a wreck.
01:51:04 Yeah.
01:51:06 Let me ask you the big, so as a physician,
01:51:10 so everything you’ve seen, we’ve talked about 80%
01:51:13 of the impact on health is lifestyle.
01:51:18 How do we live longer?
01:51:20 What advice would you give to general people?
01:51:22 What space of ideas result in living longer
01:51:29 and higher quality lives?
01:51:30 Right, this is a very simple question to answer.
01:51:34 Exercise for at least a half hour
01:51:37 at least five times a week.
01:51:41 Number one.
01:51:42 Number two, don’t smoke.
01:51:45 Number three, maintain a reasonably healthy body weight.
01:51:49 Some people argue that being lower than a BMI of 25
01:51:53 is healthy.
01:51:54 I think that may be true,
01:51:56 but I think getting above 30 is unhealthy
01:52:00 and that ought to be avoided.
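For reference, BMI is just weight divided by height squared; here is a minimal sketch of the 25 and 30 thresholds being discussed (the 1.75 meter example height is illustrative, not from the conversation):

```python
# Body mass index: weight in kilograms divided by height in meters, squared
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

# For a hypothetical 1.75 m person, ~76.6 kg sits at the BMI-25 line, ~91.9 kg at BMI 30
print(round(bmi(76.6, 1.75), 1))  # ~= 25.0
print(round(bmi(91.9, 1.75), 1))  # ~= 30.0
```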
01:52:01 Now that’s largely impacted by socioeconomic status
01:52:07 and we don’t wanna blame the victims here.
01:52:09 So we gotta understand that when we talk about
01:52:12 all of these things, not cigarettes,
01:52:14 but exercise and a good diet
01:52:18 and maintaining a healthy body weight,
01:52:23 we have to include in doing those things
01:52:27 the impediments to people of lower socioeconomic status
01:52:32 being able to make those changes.
01:52:34 We’ve got to understand that personal responsibility
01:52:38 accounts for some of this,
01:52:39 but also social circumstances accounts for some of it.
01:52:44 And back to your fish bowl analogy,
01:52:47 if you’re swimming in a fish bowl,
01:52:50 if you live in a fish tank
01:52:51 that’s not being properly maintained,
01:52:53 the approach wouldn’t be to treat individual sick fish,
01:52:58 it would be to fix your fish tank
01:53:01 to get the bacteria out of it
01:53:03 and whatever bad stuff is in there
01:53:05 and make your fish tank healthier.
01:53:08 Well, we invest far less than the other wealthy countries do.
01:53:12 We’re flipped, we have the mirror image
01:53:15 in the spending on social determinants of health
01:53:19 and medical determinants of health.
01:53:20 We have exactly the wrong order.
01:53:23 And not only does that choke off
01:53:25 social determinants of health, which are very important,
01:53:28 but actually just the ratio,
01:53:30 even if you were spending,
01:53:32 if we raise the social spending
01:53:35 and raise our medical spending in proportion,
01:53:38 it’s the ratio of social spending to medical spending
01:53:41 that’s the problem.
01:53:42 So, and why do we do that?
01:53:44 Well, the answer is perfectly obvious
01:53:46 that the way to transfer money
01:53:48 from working Americans to investors
01:53:51 is through the biomedical model,
01:53:54 not through the social health model.
01:53:57 And that’s the problem for,
01:53:59 and I’d like to discuss this
01:54:02 because the market isn’t gonna get us
01:54:06 to a reasonable allocation.
01:54:08 All the other wealthy countries
01:54:09 that are so much healthier than we are
01:54:11 and spending so much less than we are
01:54:14 have some form of government intervention
01:54:17 in the quality of the health data that’s available,
01:54:20 in the budgeting of health and social factors.
01:54:25 And we don’t, we’re kind of the Wild West
01:54:28 and we let the market determine those allocations.
01:54:31 And it’s an awful failure.
01:54:34 It’s a horrendous failure.
01:54:36 So one argument against government,
01:54:39 or sorry, an alternative to the government intervention
01:54:44 is the market can work better
01:54:48 if the citizenry has better information.
01:54:51 So one argument is that
01:54:53 communicators like podcasts and so on,
01:54:58 but other channels of communication
01:55:01 will be the way to fight big pharma.
01:55:03 Your book is the way to,
01:55:05 by providing information.
01:55:07 The alternative to the government intervention
01:55:10 on every aspect of this,
01:55:11 including communication with the doctors
01:55:13 is to provide them other information
01:55:15 and allow the market to provide that information
01:55:18 by basically making it exciting
01:55:22 to buy books, to make better and better communicators
01:55:27 on Twitter, through books, through op eds,
01:55:30 through podcasts, and so on.
01:55:32 So basically, cause there’s a lot of incentive
01:55:35 to communicate against the messages of big pharma.
01:55:40 There’s incentive because people want to understand
01:55:43 what’s good for their lives
01:55:44 and they’re willing to listen to charismatic people
01:55:46 that are able to clearly explain what is good for them.
01:55:50 And they do, and more than 80% of people
01:55:54 think that drugs cost too much
01:55:55 and the drug industry is too interested in profits.
01:56:00 But they still get influenced.
01:56:02 They can’t, you can’t get the vote through Congress.
01:56:05 You know, Democrats and Republicans alike
01:56:08 are taking money from pharma
01:56:10 and somehow it just doesn’t work out
01:56:13 that these even small changes.
01:56:17 I mean, the pared down part of Medicare,
01:56:21 the plan for increasing Medicare negotiation of drug costs
01:56:27 in Build Back Better,
01:56:29 it’s literally gonna reduce the number of new drugs
01:56:32 that are beneficial, uniquely beneficial
01:56:37 by about one new drug or two new drugs over 30 years.
01:56:42 It will have a virtually indiscernible impact.
01:56:48 And yet pharma is talking about the impact on innovation.
01:56:53 And if you vote for this,
01:56:55 if you let your Congressman vote for this,
01:56:58 you’re gonna severely slow down drug innovation
01:57:04 and that’s gonna affect the quality of your life.
01:57:07 Let me ask you about over medication
01:57:17 that we’ve been talking about from different angles.
01:57:19 But one difficult question for me,
01:57:22 I’ll just, I’ll pick one of the difficult topics,
01:57:25 depression.
01:57:26 So depression is a serious, painful condition
01:57:31 that leads to a lot of people suffering in the world.
01:57:37 And yet it is likely we're over prescribing
01:57:40 antidepressants.
01:57:42 So as a doctor, as a patient, as a healthcare system,
01:57:47 as a society, what do we do with that fact
01:57:50 that people suffer?
01:57:53 There’s a lot of people suffering from depression
01:57:57 and there’s also people suffering
01:57:59 from over prescribing of antidepressants.
01:58:01 Right.
01:58:02 So a paper in the New England Journal by Eric Turner
01:58:06 showed that the data,
01:58:09 if you put all the data together from antidepressants,
01:58:12 you find out that antidepressants are not effective
01:58:17 for people who are depressed
01:58:19 but don’t have a major depression.
01:58:22 Major depression is a serious problem.
01:58:25 People can’t function normally.
01:58:27 They have a hard time getting out,
01:58:31 performing their normal social roles.
01:58:35 But what’s happened is that the publicity,
01:58:39 I mean, Prozac Nation was a good example
01:58:43 of making the argument: why should people
01:58:45 settle for normal happiness
01:58:47 when they can have better than normal happiness?
01:58:49 And if you’re not having normal happiness,
01:58:52 you should take a drug.
01:58:53 Well, that concept that serotonin metabolism
01:59:00 is the root cause of depression
01:59:03 is really a destructive one.
01:59:05 We have drugs that change serotonin metabolism
01:59:08 but we don’t know if that’s why antidepressants
01:59:12 work on major depression.
01:59:14 And they certainly don’t work on everybody
01:59:16 with major depression.
01:59:16 I forget what the number needed to treat is.
01:59:18 I think it’s around four,
01:59:20 one out of four people have significant improvement.
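For reference, the number needed to treat (NNT) is conventionally computed from the absolute risk reduction (ARR) between the treated and control groups. A minimal statement of the definition, with illustrative numbers that are not taken from the Turner paper:

\mathrm{NNT} = \frac{1}{\mathrm{ARR}} = \frac{1}{p_t - p_c}

For example, if half of treated patients respond ($p_t = 0.50$) and a quarter of control patients respond ($p_c = 0.25$), then $\mathrm{ARR} = 0.25$ and $\mathrm{NNT} = 4$: roughly one out of four treated patients improves beyond what would have happened without the drug.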
01:59:23 But the people without major depression don’t get better.
01:59:28 And the vast majority of these drugs
01:59:30 are used for people without major depression.
01:59:33 So what’s happened is that the feelings
01:59:37 of life satisfaction, of happiness and sadness,
01:59:42 have been medicalized.
01:59:43 The normal range of feelings has been medicalized.
01:59:47 And that’s not to say that they shouldn’t be attended to.
01:59:51 But the evidence shows that attending to them
01:59:54 by giving somebody a medicine doesn’t help
01:59:57 except that they feel like somebody cares about them
01:59:59 and believes that they’re suffering.
02:00:01 But there are problems in living
02:00:04 that give rise to much of this symptomatology
02:00:07 of less than major depression.
02:00:10 And let’s call it what it is
02:00:12 and figure out a way to help people
02:00:14 with individual therapy, group therapy.
02:00:17 Maybe lifestyle modification would work.
02:00:19 We gotta try that.
02:00:21 But let’s call it what it is instead of saying,
02:00:24 oh, you’re in this vast basket of people who are depressed
02:00:29 so we’ll give you an antidepressant
02:00:31 even though the evidence shows
02:00:33 that people who are suffering from your level of depression
02:00:36 don’t get better.
02:00:38 And that’s a consequence of not focusing
02:00:42 on preventative medicine, the lifestyle changes,
02:00:46 all that kind of stuff.
02:00:47 Well, yes, but it’s really a consequence
02:00:49 of the drug companies creating the impression
02:00:53 that if you’re sad, take a pill.
02:00:56 If you have non-major depression,
02:01:01 how do you overcome it?
02:01:03 Well, you have to talk about what the problem is.
02:01:06 So talk therapy, lifestyle changes.
02:01:09 Well, no, I’m not jumping to that.
02:01:12 I’m saying that,
02:01:15 A, the way you feel must be respected.
02:01:19 Yeah, acknowledge that you’re suffering.
02:01:21 Acknowledge that you’re suffering
02:01:22 and deal with healthcare providers
02:01:24 who acknowledge that you’re suffering.
02:01:27 So let’s take that first step.
02:01:30 And then. Big first step also.
02:01:32 Big first step, yeah.
02:01:33 Family docs are pretty good at that.
02:01:36 That’s kind of the arena
02:01:38 that caused me to go into family medicine.
02:01:41 The subjective experience of the patient.
02:01:44 Okay, so you’re a person
02:01:46 who is not getting the enjoyment out of their life
02:01:49 that they feel they ought to be getting.
02:01:52 Now let’s figure out why
02:01:54 and whether that means some time with a social worker,
02:01:57 some time with a psychiatrist,
02:01:59 some time with a psychiatric nurse.
02:02:02 I’m not sure how you’d best do that
02:02:04 most effectively and efficiently,
02:02:05 but that’s what you need to do.
02:02:07 And it may be that there’s a marital problem
02:02:11 and there’s something going on
02:02:13 and one of the spouses can’t find satisfaction
02:02:18 in the life they have to live within their relationship.
02:02:21 Maybe there’s a past history of trauma or abuse
02:02:24 that somebody is projecting onto their current situation.
02:02:28 Maybe there’s socioeconomic circumstances
02:02:31 where they can’t find a job
02:02:33 that gives them self respect and enough money to live.
02:02:36 All, you know, an infinite range of things.
02:02:39 But let’s figure out, make a diagnosis first.
02:02:42 The diagnosis isn’t that the person feels sadder
02:02:45 than they want to feel.
02:02:48 The diagnosis is why does the person feel sadder
02:02:51 than they want to feel?
02:02:54 You mentioned this is what made you want
02:02:56 to get into family medicine.
02:03:00 As a doctor, what do you think about the saying,
02:03:03 save one life, save the world?
02:03:05 This was always moving to me about doctors
02:03:13 because you have like this human in front of you
02:03:17 and your time is worth money.
02:03:22 What you prescribe and your efforts
02:03:26 after the visit are worth money.
02:03:28 And it seems like the task of the doctor
02:03:31 is to not think about any of that.
02:03:34 Or not the task, but it seems like a great doctor,
02:03:42 despite all that, just forgets it all
02:03:45 and just cares about the one human.
02:03:47 And somehow that feels like the love and effort
02:03:51 you put into helping one person
02:03:53 is the thing that will save the world.
02:03:55 It’s not like some economic argument
02:03:58 or some political argument or financial argument.
02:04:03 It’s a very human drive that ultimately
02:04:09 is behind all of this that will do good for the world.
02:04:13 Yes, I think that’s true.
02:04:15 And at the same time, I think it’s equally true
02:04:19 that all physicians need to have a sense of responsibility
02:04:23 about how the common resources are allocated
02:04:28 to serve the whole population’s interest best.
02:04:34 That’s a tension that you have as a physician.
02:04:36 Let’s take the extreme example.
02:04:38 Let’s say you had a patient in front of you
02:04:41 who if you gave one $10 billion pill to,
02:04:46 you would save their life.
02:04:49 I would just be tortured by that as a physician
02:04:52 because I know that $10 billion spent properly
02:04:56 in an epidemiologically guided way
02:05:00 is gonna save a whole lot more lives than one life.
02:05:03 So it’s also your responsibility as a physician
02:05:06 to walk away from that patient.
02:05:08 I wouldn’t say that.
02:05:10 I think it’s your responsibility
02:05:12 to be tortured by it.
02:05:14 That’s exactly right.
02:05:17 The human condition.
02:05:21 That’s a tough job, but yeah, yeah.
02:05:24 To maintain your humanity through it all.
02:05:27 Yeah, but you’ve been asking at different points
02:05:30 in this conversation, why are doctors so complacent
02:05:35 about the tremendous amount of money we’re spending?
02:05:38 Why do they accept knowledge from different sources
02:05:41 that may not pan out when they really know the truth?
02:05:45 And the answer is that they’re trying to do their best
02:05:48 for their patients.
02:05:49 And there’s this, it’s the same kind of torture
02:05:56 to figure out what the hell is going on with the data.
02:06:00 And that’s a sort of future project.
02:06:03 And maybe people will read my book
02:06:06 and maybe they’ll get a little more excited about it,
02:06:08 and it’ll become more legitimate in practice.
02:06:10 I would feel like my life was worthwhile if that happened.
02:06:13 But at the same time, they’ve got to do something
02:06:17 with the patient in front of them.
02:06:18 They’ve got to make a decision.
02:06:21 And there are not many weirdos like me
02:06:24 who invest their life in figuring out
02:06:27 what’s behind the data.
02:06:28 They’re trying to get through the day
02:06:29 and do the right thing for their patient.
02:06:31 So they’re tortured by that decision too.
02:06:34 And so if you’re not careful,
02:06:38 big pharma can manipulate that drive
02:06:43 to try to help the patient,
02:06:44 that humanity of dealing with the uncertainty of it all.
02:06:49 Like what is the best thing to do?
02:06:51 Big pharma can step in and use money
02:06:53 to manipulate that humanity.
02:06:55 Yeah, I would state it quite differently.
02:06:57 It’s sort of an opt-out rather than an opt-in.
02:07:00 Big pharma will do that.
02:07:02 And you need to opt out of it.
02:07:07 What advice would you give to a young person today
02:07:11 in high school or college
02:07:13 stepping into this complicated world
02:07:17 full of advertisements, of big powerful institutions,
02:07:22 of big rich companies,
02:07:24 how to have a positive impact in the world,
02:07:27 how to live a life they can be proud of?
02:07:30 I would say, should that person
02:07:34 who has only good motives go into medicine?
02:07:38 They have an inclination to go into medicine
02:07:39 and they’ve asked me what I think about that
02:07:42 given what I know about the undermining
02:07:45 of American healthcare at this point.
02:07:47 And my answer is if you’ve got the calling,
02:07:50 you should do it.
02:07:52 You should do it because nobody’s gonna do it
02:07:54 better than you.
02:07:56 And if you don’t have the calling
02:07:58 and you’re in it for the money,
02:08:01 you’re not gonna be proud of yourself.
02:08:03 How do you prevent yourself
02:08:07 from letting the system change you over years and years,
02:08:12 from letting the game of pharmaceutical influence affect you?
02:08:20 It’s a very hard question
02:08:22 because the sociologic norms are to be affected
02:08:28 and to trust the sources of information
02:08:32 that are largely controlled by the drug industry.
02:08:36 And that’s why I wrote Sickening:
02:08:38 to try and help those people in the medical profession
02:08:46 to understand that what’s going on right now looks normal
02:08:50 but it’s not.
02:08:52 The health of Americans is going downhill.
02:08:55 Our society’s getting ruined by the money
02:08:57 that’s getting pulled out of other socially beneficial uses
02:09:02 to pay for health care that is not helping us.
02:09:08 So fundamentally, question the thing that is normal,
02:09:12 don’t just accept the normal.
02:09:17 If you conform, conform hesitantly.
02:09:21 Well, you have to conform.
02:09:23 You can’t become a doctor without conforming.
02:09:26 I just made it through.
02:09:30 But there aren’t many and it’s hard work.
02:09:35 But you have to conform.
02:09:38 And even with my colleagues in my own practice,
02:09:40 I couldn’t convince them that some of the beliefs they had
02:09:44 about how best to practice weren’t accurate.
02:09:47 There’s one scene: a younger physician,
02:09:51 this is back in 2000, 2001,
02:09:53 had prescribed hormone replacement therapy
02:09:56 for one of my patients
02:10:00 who happened to be a really good personal friend.
02:10:03 And I saw that patient covering for my colleague at one point
02:10:08 and I saw that her hormone replacement therapy had been renewed.
02:10:13 And I said, are you having hot flashes or any problem?
02:10:15 No, no, no, no.
02:10:16 But Dr. So and So said it’s better for my health.
02:10:21 And I said, no, it’s not.
02:10:23 The research is showing that it’s not, it’s harmful for your health
02:10:26 and I think you should stop it.
02:10:27 So my colleague approached me when she saw the chart and said,
02:10:32 wait a minute, that’s my patient.
02:10:34 She may be your friend, but she’s my patient.
02:10:37 And she said, I went to a conference at my alma mater, the medical school,
02:10:43 and they said that healthy people should be given hormone replacement.
02:10:47 And I said,
02:10:51 there’s got to be drug companies involved in this.
02:10:55 And she said, no, no, no, it was at my university.
02:10:57 It was not a drug company thing.
02:10:59 We didn’t go to a Caribbean island.
02:11:02 I said, do you have the syllabus?
02:11:03 She said, yeah.
02:11:05 And she went and got the syllabus and sure enough,
02:11:07 it was sponsored by a drug company.
02:11:10 They’re everywhere.
02:11:11 They’re everywhere.
02:11:12 And it’s back to Kuhn that groups of experts
02:11:16 share unspoken assumptions, and in order to be included
02:11:21 in that group of experts, you have
02:11:23 to share those unspoken assumptions.
02:11:25 And what I’m hoping to do with my book, Sickening,
02:11:27 and being here having this wonderful conversation with you
02:11:31 is to create an alternative to this normal
02:11:36 that people can pursue and practice better medicine
02:11:45 and also prevent burnout.
02:11:47 I mean, about half the doctors complain that they’re burned
02:11:49 out and they’ve had it.
02:11:51 And I think that this is subjective.
02:11:54 I don’t have data on this.
02:11:55 This is just my opinion.
02:11:57 But I think that a lot of that burnout
02:11:59 is so-called moral injury from practicing in a way
02:12:04 that the docs know isn’t working.
02:12:08 So it’s not just providing an alternative to the normal,
02:12:12 it’s expanding the normal, shifting the normal,
02:12:13 just like with Kuhn.
02:12:15 You’re basically looking to shift
02:12:19 the way medicine is done back to the original intent,
02:12:24 so that it represents the ideal of medicine,
02:12:29 of health care.
02:12:30 Yeah, in Kuhnian terms, to have a revolution.
02:12:33 And that revolution would be to practice medicine
02:12:36 in a way that will be epidemiologically most
02:12:40 effective, not most profitable for the people
02:12:43 who are providing you with what’s called knowledge.
02:12:47 You helped a lot of people, as a doctor, as an educator,
02:12:53 live better lives, live longer.
02:12:56 But you yourself are a mortal being.
02:12:59 Do you think about your own mortality?
02:13:02 Do you think about your death?
02:13:03 Are you afraid of death?
02:13:06 I’m not.
02:13:06 I’ve faced it, been close.
02:13:12 Yourself?
02:13:12 Yeah, yeah.
02:13:14 How do you think about it?
02:13:16 What wisdom do you gain from having come close to death,
02:13:19 the fact that the whole thing ends?
02:13:23 It’s liberating.
02:13:25 It’s very liberating.
02:13:26 I’m serious.
02:13:27 I was close, and not too long ago.
02:13:34 And it was a sense of, this may be the way it ends.
02:13:41 And I’ve done my best.
02:13:45 It’s not been perfect.
02:13:48 And if it ends here, it ends here.
02:13:51 The people around me are trying to do their best.
02:13:54 And in fact, I got pulled out of it.
02:13:57 But it didn’t look like I was going to get pulled out of it.
02:14:01 Are you ultimately grateful for the ride, even though it ends?
02:14:07 Well, it’s a little odd.
02:14:11 I think so.
02:14:13 You can’t take the ride if you know it’s going to end well.
02:14:18 It’s not the real ride.
02:14:19 It’s just a ride.
02:14:22 But having gone through the whole thing,
02:14:25 it definitely freed me of a sense of anxiety about death.
02:14:31 And it said to me, do your best every day,
02:14:35 because it’s going to end sometime.
02:14:38 I apologize for the ridiculously big question.
02:14:40 But what do you think is the meaning of life,
02:14:45 of our human existence?
02:14:52 I think it’s to care about something and do your best with it.
02:14:56 Whether it’s being a doctor and trying
02:14:59 to make sure that the greatest number of people
02:15:03 get the best health care.
02:15:06 Or it’s a gardener who wants to have the most beautiful plants.
02:15:09 Or it’s a grandparent who wants to have a good relationship
02:15:12 with their grandchildren.
02:15:13 But whatever it is that gives you a sense of meaning,
02:15:19 as long as it doesn’t hurt other people,
02:15:21 to really commit yourself to it.
02:15:24 That commitment, being in that commitment for me
02:15:27 is the meaning of life.
02:15:29 Put your whole heart and soul into the thing.
02:15:34 What is it, the Bukowski poem? Go all the way.
02:15:38 John, you’re an incredible human being, incredible educator.
02:15:42 Like I said, I recommend people listen to your lectures.
02:15:45 It’s so refreshing to see that clarity
02:15:47 of thought and brilliance.
02:15:49 And obviously, your criticism of Big Pharma
02:15:51 or your illumination of the mechanisms of Big Pharma
02:15:56 is really important at this time.
02:15:58 So I really hope people read your book, Sickening,
02:16:02 that’s out today, or depending on when this comes out.
02:16:07 Thank you so much for spending your extremely valuable time
02:16:11 with me today.
02:16:12 It was amazing.
02:16:13 Well, Lex, right back at you.
02:16:15 Thanks for engaging in this conversation,
02:16:18 for creating the space to have it,
02:16:21 and creating a listenership that is
02:16:24 interested in understanding serious ideas.
02:16:27 And I really appreciate the conversation.
02:16:29 And I should mention that offline,
02:16:30 you told me you listened to the Gilbert Strang episode.
02:16:34 So for anyone who doesn’t know Gilbert Strang,
02:16:35 another epic human being that you should check out.
02:16:39 If you don’t know anything about mathematics
02:16:41 or linear algebra, go look him up.
02:16:43 He’s one of the great mathematics educators of all time.
02:16:46 So of all the people you mentioned to me,
02:16:49 I appreciate that you mentioned him,
02:16:50 because he is a rockstar of mathematics.
02:16:54 John, thank you so much for talking to us, it was awesome.
02:16:56 Great, thank you.
02:16:57 Thanks for listening to this conversation with John Abramson.
02:17:00 To support this podcast,
02:17:02 please check out our sponsors in the description.
02:17:04 And now, let me leave you with some words from Marcus Aurelius.
02:17:09 “Waste no time arguing about what a good man should be.
02:17:14 Be one.”
02:17:15 Thank you for listening and hope to see you next time.