Richard Haier: IQ Tests, Human Intelligence, and Group Differences #302

Transcript

00:00:00 Let me ask you this question,

00:00:02 whether it’s The Bell Curve or any research

00:00:04 on race differences,

00:00:09 can that be used to increase the amount of racism

00:00:12 in the world, can that be used to increase

00:00:14 the amount of hate in the world?

00:00:16 My sense is there are such enormous reservoirs

00:00:24 of hate and racism that have nothing to do

00:00:27 with scientific knowledge of the data

00:00:31 that speak against that,

00:00:34 that no, I don’t want to give racist groups

00:00:39 a veto power over what scientists study.

00:00:44 The following is a conversation with Richard Haier

00:00:46 on the science of human intelligence.

00:00:49 This is a highly controversial topic,

00:00:51 but a critically important one

00:00:52 for understanding the human mind.

00:00:54 I hope you will join me in not shying away

00:00:57 from difficult topics like this,

00:00:59 and instead, let us try to navigate it

00:01:03 with empathy, rigor, and grace.

00:01:06 If you’re watching this on video now,

00:01:08 I should mention that I’m recording this introduction

00:01:11 in an undisclosed location somewhere in the world.

00:01:14 I’m safe and happy and life is beautiful.

00:01:18 This is the Lex Fridman Podcast.

00:01:20 To support it, please check out our sponsors

00:01:22 in the description, and now, dear friends,

00:01:25 here’s Richard Haier.

00:01:27 What are the measures of human intelligence,

00:01:29 and how do we measure it?

00:01:31 Everybody has an idea of what they mean by intelligence.

00:01:35 In the vernacular, what I mean by intelligence

00:01:40 is just being smart, how well you reason,

00:01:42 how well you figure things out,

00:01:45 what you do when you don’t know what to do.

00:01:48 Those are just kind of everyday common sense definitions

00:01:53 of how people use the word intelligence.

00:01:56 If you wanna do research on intelligence,

00:01:59 measuring something that you can study scientifically

00:02:03 is a little trickier, and what almost all researchers

00:02:09 who study intelligence use is the concept

00:02:12 called the G factor, general intelligence,

00:02:16 and that is what is common, that is a mental ability

00:02:21 that is common to virtually all tests of mental abilities.

00:02:26 What’s the origin of the term G factor,

00:02:28 by the way, such a funny word

00:02:29 for such a fundamental human thing?

00:02:32 The general factor, it really started with Charles Spearman,

00:02:36 and he noticed, this is like, boy,

00:02:39 more than 100 years ago, he noticed that

00:02:44 when you tested people with different tests,

00:02:47 all the tests were correlated positively,

00:02:53 and so he was looking at student exams and things,

00:02:57 and he invented the correlation coefficient, essentially,

00:03:01 and when he used it to look at student performance

00:03:06 on various topics, he found all the scores

00:03:09 were correlated with each other,

00:03:11 and they were all positive correlations,

00:03:14 so he inferred from this that there must be

00:03:17 some common factor that was irrespective

00:03:21 of the content of the test.

00:03:23 And positive correlation means if you do well

00:03:27 on the first test, you’re likely to do well

00:03:29 on the second test, and presumably,

00:03:31 that holds for tests across even disciplines,

00:03:35 so not within subject, but across subjects,

00:03:39 so that’s where the general comes in,

00:03:43 something about general intelligence.

00:03:45 So when you were talking about measuring intelligence

00:03:46 and trying to figure out something difficult

00:03:50 about this world and how to solve the puzzles

00:03:52 of this world, that means, generally speaking,

00:03:54 not some specific test, but across all tests.

00:03:58 Absolutely right, and people get hung up on this

00:04:02 because they say, well, what about the ability

00:04:04 to do X, isn’t that independent?

00:04:08 And they said, I know somebody who’s very good at this

00:04:12 but not so good at this, this other thing.

00:04:15 And so there are a lot of examples like that,

00:04:17 but it’s a general tendency, so exceptions

00:04:21 really don’t disprove, your everyday experience

00:04:26 is not the same as what the data actually show.

00:04:30 And your everyday experience, when you say,

00:04:32 oh, I know someone who’s good at X, but not so good at Y,

00:04:36 that doesn’t contradict the statement, because

00:04:39 he’s not so good, but he’s not the opposite.

00:04:43 He’s not, it’s not a negative correlation.

00:04:46 Okay, so we’re not, our anecdotal data,

00:04:49 I know a guy who’s really good at solving

00:04:53 some kind of visual thing, that’s not sufficient

00:04:58 for us to understand actually the depths

00:05:00 of that person’s intelligence.

00:05:01 So how, this idea of G factor,

00:05:07 how much evidence is there, how strong,

00:05:11 you know, given across the decades that this idea

00:05:13 has been around, how much has it been held up

00:05:16 that there is a universal sort of horsepower

00:05:21 of intelligence that’s underneath all of it,

00:05:24 all the different tests we do to try to get to this thing

00:05:28 in the depths of the human mind that’s a universal,

00:05:32 stable measure of a person’s intelligence.

00:05:34 You used a couple of words in there, stable and.

00:05:38 We have to be precise with words?

00:05:40 I was hoping we can get away with being poetic.

00:05:42 We can, there’s a lot about research in general,

00:05:46 not just intelligence research that is poetic.

00:05:49 Science has a poetic aspect to it.

00:05:52 Good scientists are very intuitive.

00:05:55 They’re not just, hey, these are the numbers.

00:05:59 You have to kind of step back and see the big picture.

00:06:02 When it comes to intelligence research,

00:06:05 you asked how well has this general concept held up?

00:06:09 And I think I can say without fear

00:06:13 of being empirically contradicted,

00:06:16 that it is the most replicated finding in all of psychology.

00:06:21 Now, some cynics may say, well, big deal,

00:06:22 psychology, we all know there’s a replication crisis

00:06:25 in psychology and a lot of this stuff doesn’t replicate.

00:06:28 That’s all true.

00:06:29 There is no replication crisis when it comes to studying

00:06:33 the existence of this general factor.

00:06:36 Let me tell you some things about it.

00:06:38 It looks like it’s universal

00:06:42 that you find it in all cultures.

00:06:44 The way you find it, step back one step,

00:06:47 the way you find it is to give a battery of mental tests.

00:06:51 What battery?

00:06:52 You choose.

00:06:53 Take a battery of any mental tests you want,

00:06:57 give it to a large number of diverse people,

00:07:01 and you will be able to extract statistically

00:07:05 the commonality among all those tests.

00:07:09 It’s done by a technique called factor analysis.

00:07:12 People think that this may be a statistical artifact

00:07:17 of some kind, it is not a statistical artifact.

00:07:21 What is factor analysis?

00:07:22 Factor analysis is a way of looking at a big set of data

00:07:26 and look at the correlation among the different test scores

00:07:29 and then find empirically the clusters of scores

00:07:33 that go together.

00:07:35 And there are different factors.

00:07:37 So if you have a bunch of mental tests,

00:07:39 there may be a verbal factor,

00:07:41 there may be a numerical factor,

00:07:43 there may be a visual spatial factor,

00:07:45 but those factors have variance in common with each other.

00:07:50 And that is the common,

00:07:53 that’s what’s common among all the tests

00:07:55 and that’s what gets labeled the G factor.

00:07:58 So if you give a diverse battery of mental tests

00:08:01 and you extract a G factor from it,

00:08:04 that factor usually accounts for around half of the variance.

00:08:08 It’s the single biggest factor, but it’s not the only factor,

00:08:12 but it is the most reliable, it is the most stable,

00:08:17 and it seems to be very much influenced by genetics.

00:08:23 It’s very hard to change the G factor with training

00:08:28 or drugs or anything else.

00:08:32 We don’t know how to increase the G factor.
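The extraction step described here — take a battery, correlate the tests, pull out the common factor, find it accounts for around half the variance — can be sketched in miniature. Real factor analysis involves more machinery; this toy uses the first principal component of the correlation matrix, found by power iteration, on simulated rather than real test scores:

```python
import random
import statistics

# Illustrative sketch of extracting a g factor: simulate a battery of
# 6 tests that all share a latent ability, build their correlation
# matrix, and find the first (general) factor by power iteration.
random.seed(1)
n, k = 1000, 6
g = [random.gauss(0, 1) for _ in range(n)]
scores = [[0.7 * x + random.gauss(0, 0.7) for x in g] for _ in range(k)]

R = [[statistics.correlation(scores[i], scores[j]) if i != j else 1.0
      for j in range(k)] for i in range(k)]

# Power iteration: the dominant eigenvalue of R, divided by k, is the
# share of total variance the first factor accounts for.
v = [1.0] * k
for _ in range(200):
    w = [sum(R[i][j] * v[j] for j in range(k)) for i in range(k)]
    norm = sum(x * x for x in w) ** 0.5
    v = [x / norm for x in w]
eigval = sum(v[i] * sum(R[i][j] * v[j] for j in range(k)) for i in range(k))
print(f"first factor explains {eigval / k:.0%} of total variance")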

00:08:34 Okay, you said a lot of really interesting things there.

00:08:36 So first, I mean, just to get people used to it

00:08:40 in case they’re not familiar with this idea,

00:08:43 G factor is what we mean.

00:08:45 So often there’s this term used IQ,

00:08:50 which is the way IQ is used,

00:08:53 they really mean G factor in regular conversation.

00:08:58 Because what we mean by IQ, we mean intelligence

00:09:02 and what we mean by intelligence,

00:09:04 we mean general intelligence and general intelligence

00:09:07 in the human mind from a psychology,

00:09:09 from a serious rigorous scientific perspective

00:09:12 actually means G factor.

00:09:13 So G factor equals intelligence,

00:09:15 just in this conversation to define terms.

00:09:18 Okay, so there’s this stable thing called G factor.

00:09:22 You said, now factor, you said factor many times,

00:09:27 means a measure that potentially could be reduced

00:09:33 to a single number across the different factors

00:09:35 you mentioned.

00:09:37 And what you said, it accounts for half, halfish.

00:09:45 Accounts for halfish of what?

00:09:46 Of variance across the different set of tests.

00:09:51 Set of tests, so if you do for some reason

00:09:56 well on some set of tests, what does that mean?

00:10:00 So that means there’s some unique capabilities

00:10:03 outside of the G factor that might account for that.

00:10:05 And what are those?

00:10:07 What else is there besides the raw horsepower,

00:10:10 the engine inside your mind that generates intelligence?

00:10:13 There are test taking skills.

00:10:16 There are specific abilities.

00:10:20 Someone might be particularly good at mathematical things,

00:10:28 mathematical concepts, even simple arithmetic.

00:10:32 Some people are much better than others.

00:10:34 You might know people who can memorize,

00:10:36 and short term memory is another component of this.

00:10:42 Short term memory is one of the cognitive processes

00:10:46 that’s most highly correlated with the G factor.

00:10:51 So all those things like memory,

00:10:56 test taking skills account for variability

00:10:59 across the test performances.

00:11:02 But so you can run, but you can’t hide

00:11:06 from the thing that God gave you.

00:11:08 The genetics, so that G factor,

00:11:12 science says that G factor’s there.

00:11:15 Each one of us have.

00:11:16 Each one of us has a G factor.

00:11:19 Oh boy.

00:11:20 Some have more than others.

00:11:21 I’m getting uncomfortable already.

00:11:22 Well, IQ is a score, and IQ, an IQ score

00:11:28 is a very good estimate of the G factor.

00:11:32 You can’t measure G directly, there’s no direct measure.

00:11:36 You estimate it from these statistical techniques.

00:11:39 But an IQ score is a good estimate, why?

00:11:43 Because a standard IQ test is a battery

00:11:46 of different mental abilities.

00:11:48 You combine it into one score,

00:11:51 and that score is highly correlated with the G factor,

00:11:55 even if you get better scores on some subtests than others.

00:12:00 Because again, it’s what’s common

00:12:02 to all these mental abilities.

00:12:04 So a good IQ test, and I’ll ask you about that,

00:12:08 but a good IQ test tries to compress down that battery

00:12:13 of tests, like tries to get a nice battery,

00:12:16 the nice selection of variable tests into one test.

00:12:21 And so in that way, it sneaks up to this G factor.

00:12:24 And that’s another interesting thing about G factor.

00:12:28 Now you give, first of all, you have a great book

00:12:32 on the neuroscience of intelligence.

00:12:34 You have a great course, which is when I first learned,

00:12:38 you’re a great teacher, let me just say.

00:12:39 Thank you.

00:12:42 Your course at the teaching company,

00:12:44 I hope I’m saying that correctly.

00:12:45 The Intelligent Brain.

00:12:47 The Intelligent Brain is when I first heard

00:12:50 about this G factor, this mysterious thing

00:12:53 that lurks in the darkness that we cannot quite shine

00:12:56 a light on, we’re trying to sneak up on.

00:12:59 So the fact that there’s this measure,

00:13:00 a stable measure of intelligence, we can’t measure directly.

00:13:04 But we can come up with a battery test

00:13:07 or one test that includes a battery

00:13:10 of variable type of questions that can reliably

00:13:17 or attempt to estimate in a stable way that G factor.

00:13:21 That’s a fascinating idea.

00:13:23 So for me as an AI person, it’s fascinating.

00:13:25 It’s fascinating there’s something stable like that

00:13:27 about the human mind, especially if it’s grounded in genetics.

00:13:32 It’s both fascinating that as a researcher

00:13:37 of the human mind and all the human psychological,

00:13:43 sociological, ethical questions that start arising,

00:13:46 it makes me uncomfortable.

00:13:48 But truth can be uncomfortable.

00:13:51 I get that a lot about being uncomfortable

00:13:54 talking about this.

00:13:56 Let me go back and just say one more empirical thing.

00:14:02 It doesn’t matter which battery of tests you use.

00:14:08 So there are countless tests.

00:14:10 You can take any 12 of them at random,

00:14:13 extract a G factor and another 12 at random

00:14:17 and extract a G factor and those G factors

00:14:19 will be highly correlated like over 0.9 with each other.

00:14:23 That’s very, so it is a ubiquitous.

00:14:26 It doesn’t depend on the content of the test

00:14:28 is what I’m trying to say.

00:14:30 It is general among all those tests of mental ability.

00:14:34 And tests of mental, mental abilities include things like,

00:14:37 geez, playing poker.

00:14:41 Your skill at poker is not unrelated to G.

00:14:46 Your skill at anything that requires reasoning

00:14:49 and thinking, anything, spelling, arithmetic,

00:14:54 more complex things, this concept is ubiquitous.

00:15:00 And when you do batteries of tests in different cultures,

00:15:03 you get the same thing.

00:15:05 So this says something interesting about the human mind

00:15:08 that as a computer is designed to be general.

00:15:12 So that means you can, so it’s not easily made specialized.

00:15:17 Meaning if you’re going to be good at one thing,

00:15:21 Miyamoto Musashi has this quote, he’s an ancient warrior,

00:15:26 famous for the Book of Five Rings in the martial arts world.

00:15:30 And the quote goes, if you know the way broadly,

00:15:34 you will see it in everything.

00:15:36 Meaning if you do one thing is going to generalize

00:15:42 to everything.

00:15:44 And that’s an interesting quote.

00:15:46 And that’s an interesting thing about the human mind.

00:15:50 So that’s what the G factor reveals.

00:15:54 Okay, so what’s the difference,

00:15:57 if you can elaborate a little bit further

00:15:58 between IQ and G factor?

00:16:00 Just because it’s a source of confusion for people.

00:16:03 And IQ is a score.

00:16:05 People use the word IQ to mean intelligence.

00:16:08 But IQ has a more technical meaning

00:16:11 for people who work in the field.

00:16:12 And it’s an IQ score, a score on a test

00:16:16 that estimates the G factor.

00:16:20 And the G factor is what’s common

00:16:22 among all these tests of mental ability.

00:16:24 So if you think about, it’s not a Venn diagram,

00:16:27 but I guess you could make a Venn diagram out of it,

00:16:30 but the G factor would be really at the core,

00:16:33 what’s common to everything.

00:16:37 And what IQ scores do is they allow a rank order

00:16:42 of people on the score.

00:16:44 And this is what makes people uncomfortable.

00:16:46 This is where there’s a lot of controversy

00:16:48 about whether IQ tests are biased

00:16:51 toward any one group or another.

00:16:54 And a lot of the answers to these questions are very clear,

00:16:59 but they also have a technical aspect of it

00:17:02 that’s not so easy to explain.

00:17:04 Well, we’ll talk about the fascinating

00:17:06 and the difficult things about all of this.

00:17:10 So by the way, when you say rank order,

00:17:12 that means you get a number and that means one person,

00:17:15 you can now compare.

00:17:17 Like you could say that this other person

00:17:20 is more intelligent than me.

00:17:23 Well, what you can say is IQ scores

00:17:25 are interpreted really as percentiles.

00:17:29 So that if you have an IQ of 140

00:17:33 and somebody else has 70,

00:17:35 the metric is such that you cannot say

00:17:38 the person with an IQ of 140 is twice as smart

00:17:42 as a person with an IQ of 70.

00:17:45 That would require a ratio scale with an absolute zero.

00:17:49 Now you may think you know people with zero intelligence,

00:17:53 but in fact, there is no absolute zero on an IQ scale.

00:17:57 It’s relative to other people.

00:18:01 So relative to other people,

00:18:03 somebody with an IQ score of 140

00:18:06 is in the upper less than 1%,

00:18:09 whereas somebody with an IQ of 70

00:18:12 is two standard deviations below the mean.

00:18:15 That’s a different percentile.

00:18:18 So it’s similar to like in chess,

00:18:20 you have an ELO rating that’s designed to rank order people.

00:18:27 So you can’t say it’s twice one person.

00:18:30 If your ELO rating is twice another person,

00:18:33 I don’t think you’re twice as good at chess.

00:18:35 It’s not stable in that way,

00:18:37 but because it’s very difficult

00:18:39 to do these kinds of comparisons.

00:18:41 But so what can we say about the number itself?

00:18:47 Is that stable across tests and so on, or no?

00:18:50 There are a number of statistical properties of any test.

00:18:54 They’re called psychometric properties.

00:18:56 You have validity, you have reliability,

00:18:59 reliability, there are many different kinds of reliability.

00:19:02 They all essentially measure stability.

00:19:05 And IQ tests are stable within an individual.

00:19:09 There are some longitudinal studies

00:19:11 where children were measured at age 11.

00:19:15 And again, when they were 70 years old

00:19:18 and the two IQ scores are highly correlated with each other.

00:19:22 This comes from a fascinating study from Scotland.

00:19:26 In the 1930s, some researchers decided to get an IQ test

00:19:31 on every single child age 11 in the whole country.

00:19:35 And they did.

00:19:37 And those records were discovered in an old storeroom

00:19:42 at the University of Edinburgh by a friend of mine,

00:19:47 Ian Deary, who found the records, digitized them,

00:19:52 and has done a lot of research

00:19:53 on the people who are still alive today

00:19:57 from that original study,

00:19:58 including brain imaging research, by the way.

00:20:00 It really, it’s a fascinating group of people

00:20:04 who are studied.

00:20:08 Not to get ahead of the story,

00:20:09 but one of the most interesting things they found

00:20:12 is a very strong relationship

00:20:14 between IQ measured at age 11 and mortality.

00:20:21 So that, you know,

00:20:24 in the 70 years later, they looked at the survival rates

00:20:30 and they could get death records from everybody.

00:20:33 And Scotland has universal healthcare for everybody.

00:20:37 And it turned out if you divide the people

00:20:40 by their age 11 IQ score into quartiles

00:20:44 and then look at how many people are alive 70 years later,

00:20:49 the, I know this is in the book,

00:20:52 I have the graph in the book,

00:20:54 but there are essentially twice as many people alive

00:20:57 in the highest IQ quartile than in the lowest IQ quartile.

00:21:01 It’s true in men and women.

00:21:05 Interesting.

00:21:06 So it makes a big difference.

00:21:08 Now, why this is the case is not so clear

00:21:12 since everyone had access to healthcare.

00:21:15 Well, there’s a lot, and we’ll talk about it, you know,

00:21:18 just the sentences you used now

00:21:22 could be explained by nature or nurture.

00:21:25 We don’t know.

00:21:26 Now, there’s a lot of science that starts to then dig in

00:21:29 and investigate that question.

00:21:31 But let me linger on the IQ test.

00:21:33 How are the test design, IQ test design, how do they work?

00:21:37 Maybe some examples for people who are not aware.

00:21:39 What makes a good IQ test question

00:21:44 that sneaks up on this G factor measure?

00:21:48 Well, your question is interesting

00:21:49 because you want me to give examples of items

00:21:53 that make good items.

00:21:55 And what makes a good item is not so much its content,

00:21:59 but its empirical relationship to the total score

00:22:03 that turns out to be valid by other means.

00:22:07 So for example, let me give you an odd example

00:22:12 from personality testing.

00:22:14 Nice.

00:22:15 So there’s a personality test

00:22:18 called the Minnesota Multiphasic Personality Inventory, MMPI.

00:22:22 Been around for decades.

00:22:24 I’ve heard about this test recently

00:22:25 because of the Johnny Depp and Amber Heard trial.

00:22:29 I don’t know if you’ve been paying attention to that.

00:22:31 But they had psychologists.

00:22:32 I have not been paying attention to it.

00:22:33 They had psychologists on the stand,

00:22:35 and they were talking, apparently those psychologists did,

00:22:39 again, I’m learning so much from this trial.

00:22:42 They did different battery of tests

00:22:45 to diagnose personality disorders.

00:22:50 Apparently there’s that systematic way of doing so,

00:22:53 and the Minnesota one is one of the ones

00:22:55 that there’s the most science on.

00:22:59 There’s a lot of great papers,

00:23:00 which were all continuously cited on the stand,

00:23:03 which is fascinating to watch.

00:23:05 Sorry, a little bit of attention.

00:23:06 It’s okay.

00:23:07 I mean, this is interesting because you’re right.

00:23:08 It’s been around for decades.

00:23:09 There’s a lot of scientific research

00:23:11 on the psychometric properties of the test,

00:23:14 including what it predicts with respect

00:23:18 to different categories of personality disorder.

00:23:22 But what I wanna mention is the content

00:23:24 of the items on that test.

00:23:26 All of the items are essentially true false items.

00:23:32 True or false, I prefer a shower to a bath.

00:23:36 True or false, I think Lincoln

00:23:39 was a better president than Washington.

00:23:42 But what of all these, what does that have to do?

00:23:47 And the point is the content of these items,

00:23:49 nobody knows why these items in aggregate predict anything,

00:23:55 but empirically they do.

00:23:57 It’s a technique of choosing items for a test

00:24:01 that is called dust bowl empiricism.

00:24:05 That the content doesn’t matter,

00:24:07 but for some reason when you get a criterion group

00:24:10 of people with this disorder and you compare them

00:24:13 to people without that disorder,

00:24:16 these are the items that distinguish,

00:24:19 irrespective of content.

00:24:20 It’s a hard concept to grasp.

00:24:22 Well, first of all, it’s fascinating.

00:24:25 But from, because I consider myself part psychologist

00:24:33 because I love human robot interaction,

00:24:35 and that’s a problem.

00:24:36 Half of that problem is a psychology problem

00:24:39 because there’s a human.

00:24:41 So designing these tests to get at the questions

00:24:45 is the fascinating part.

00:24:46 Like how do you get to,

00:24:50 like what does dust bowl empiricism refer to?

00:24:52 Does it refer to the final result?

00:24:57 Yeah, so it’s the test is dust bowl empiricism.

00:25:01 But how do you arrive at the battery of questions?

00:25:04 I presume one of the things,

00:25:07 now again, I’m going to the excellent testimony

00:25:10 in that trial, they explain it,

00:25:12 because they also, they explain the tests.

00:25:16 That a bunch of the questions are kind of

00:25:20 make you forget that you’re taking a test.

00:25:24 Like it makes it very difficult for you

00:25:26 to somehow figure out what you’re supposed to answer.

00:25:31 Yes, it’s called social desirability.

00:25:34 But we’re getting a little far afield

00:25:35 because I only wanted to give that example

00:25:37 of dust bowl empiricism.

00:25:40 When we talk about the items on an IQ test,

00:25:44 many of those items in the dust bowl empiricism method

00:25:50 have no face validity.

00:25:52 In other words, they don’t look like they measure anything.

00:25:56 Yes.

00:25:57 Whereas most intelligence tests,

00:25:59 the items actually look like they’re measuring

00:26:02 some mental ability.

00:26:03 So here’s one of the.

00:26:05 So you were bringing that up as an example

00:26:07 as what it is not.

00:26:08 Yes.

00:26:09 Got it.

00:26:09 Okay.

00:26:10 So I don’t want to go too far afield on it.

00:26:12 Too far afield is actually one of the names of this podcast.

00:26:16 So I should mention that.

00:26:19 Far afield.

00:26:20 Far afield.

00:26:21 Yeah, so anyway, sorry.

00:26:22 So they feel the questions look like

00:26:25 they pass the face validity test.

00:26:28 And some more than others.

00:26:29 So for example, let me give you a couple of things here.

00:26:32 If I, one of the subtests on a standard IQ test

00:26:37 is general information.

00:26:40 Let me just think a little bit

00:26:41 because I don’t want to give you the actual item.

00:26:44 But if I said, how far is it between Washington DC

00:26:49 and Miami, Florida?

00:26:52 Within 500 miles plus or minus.

00:26:55 Well, you know, it’s not a fact most people memorize,

00:27:00 but you know something about geography.

00:27:02 You say, well, I flew there once.

00:27:04 I know planes fly for 500 miles.

00:27:06 You know, you can kind of make an estimate.

00:27:10 But it’s also seems like it would be very cultural,

00:27:15 you know, so there’s that kind of general information.

00:27:20 Then there’s vocabulary test.

00:27:22 What does regatta mean?

00:27:27 And I choose that word because that word was removed

00:27:31 from the IQ test because people complained

00:27:33 that disadvantaged people would not know that word

00:27:38 just from their everyday life.

00:27:41 Okay, here’s another example

00:27:43 from a different kind of subtest on.

00:27:46 What’s regatta, by the way?

00:27:48 Regatta is a.

00:27:50 I think I’m disadvantaged.

00:27:51 A sailing competition, a competition with boats.

00:27:54 Not necessarily sailing, but a competition with boats.

00:27:58 Yep, yep, I’m probably disadvantaged in that way.

00:28:02 Okay, excellent, so that was removed anyway you were saying.

00:28:04 Okay, so here’s another subtest.

00:28:07 I’m gonna repeat a string of numbers,

00:28:09 and when I’m done, I want you to repeat them back to me.

00:28:12 Ready?

00:28:13 Okay, seven, four, two, eight, one, six.

00:28:21 That’s way too many.

00:28:22 Seven, four, two, eight, one, six.

00:28:25 Okay, you get the idea.

00:28:26 Now the actual test starts with a smaller number,

00:28:30 like two numbers, and then as people get it right,

00:28:33 you keep going, adding to the string of numbers

00:28:36 until they can’t do it anymore.

00:28:38 Okay, but now try this.

00:28:40 I’m gonna say some numbers, and when I’m done,

00:28:43 I want you to repeat them to me backwards.

00:28:46 I quit.

00:28:47 Okay, now, so I gave you some examples

00:28:51 of the kind of items on an IQ test.

00:28:53 General information, I can’t even remember all,

00:28:58 general information, vocabulary, digit span forward

00:29:03 and digit span backward.

00:29:06 Well, you said I can’t even remember them.

00:29:08 That’s a good question for me.

00:29:11 What does memory have to do with GFactor?

00:29:13 Okay, well, let’s hold on.

00:29:15 Okay, all right.

00:29:16 Let’s just talk about these examples.

00:29:19 Now, some of those items seem very cultural,

00:29:26 and others seem less cultural.

00:29:31 Which ones do you think, scores on which subtest

00:29:35 are most highly correlated with the GFactor?

00:29:39 Well, the intuitive answer is less cultural.

00:29:42 Well, it turns out vocabulary is highly correlated,

00:29:49 and it turns out that digit span backwards

00:29:54 is highly correlated.

00:29:55 How do you figure?

00:29:58 Now you have decades of research to answer the question,

00:30:03 how do you figure?

00:30:04 Right, so now there’s good research that gives you

00:30:08 intuition about what kind of questions get at it,

00:30:11 just like there’s something I’ve done,

00:30:18 I’ve actually used for research in semi autonomous vehicle,

00:30:21 like whether humans are paying attention,

00:30:24 there’s a body of literature that does end back test,

00:30:28 for example, we have to put workload on the brain

00:30:35 to do recall, memory recall, and that helps you

00:30:38 kind of put some work onto the brain

00:30:42 while the person is doing some other task,

00:30:44 and does some interesting research with that.

00:30:47 But that’s loading the memory,

00:30:48 so there’s like research around stably

00:30:52 what that means about the human mind,

00:30:54 and here you’re saying recall backwards

00:30:58 is a good protector.

00:31:00 It’s a transformation.

00:31:01 Yeah, so you have to do some,

00:31:05 like you have to load that into your brain,

00:31:07 and not just remember it, but do something with it.

00:31:11 Right, here’s another example of a different kind of test

00:31:14 called the Hick paradigm, and it’s not verbal at all.

00:31:18 It’s a little box, and there are a series of lights

00:31:23 arranged in a semi circle at the top of the box,

00:31:27 and then there’s a home button that you press,

00:31:31 and when one of the lights goes on,

00:31:34 there’s a button next to each of those lights,

00:31:37 you take your finger off the home button,

00:31:39 and you just press the button

00:31:41 next to the light that goes on,

00:31:44 and so it’s a very simple reaction time.

00:31:46 Light goes on, as quick as you can, you press the button,

00:31:49 and you get a reaction time

00:31:50 from the moment you lift your finger off the button

00:31:53 to when you press the button where the light is.

00:31:58 That reaction time doesn’t really correlate

00:32:02 with IQ very much, but if you change the instructions,

00:32:07 and you say three lights are gonna come on simultaneously,

00:32:13 I want you to press the button next to the light

00:32:15 that’s furthest from the other two.

00:32:19 So maybe lights one and two go on,

00:32:21 and light six goes on simultaneously.

00:32:24 You take your finger off,

00:32:25 and you would press the button by light six.

00:32:28 That’s that reaction time to a more complex task.

00:32:34 It’s not really hard.

00:32:36 Almost everybody gets it all right,

00:32:38 but your reaction time to that

00:32:41 is highly correlated with the G factor.

00:32:43 This is fascinating.

00:32:45 So reaction time, so there’s a temporal aspect to this.

00:32:48 So what role does time?

00:32:50 Speed of processing.

00:32:50 It’s the speed of processing.

00:32:53 Is this also true for ones that take longer,

00:32:55 like five, 10, 30 seconds?

00:32:58 Is time part of the measure with some of these things?

00:33:01 Yes, and that is why some of the best IQ tests

00:33:05 have a time limit, because if you have no time limit,

00:33:10 people can do better,

00:33:12 but it doesn’t distinguish among people that well.

00:33:17 So that adding the time element is important.

00:33:21 So speed of information processing,

00:33:25 and reaction time is a measure

00:33:26 of speed of information processing,

00:33:29 turns out to be related to the G factor.

00:33:31 But the G factor only accounts for maybe half

00:33:35 or some portion of the test performance.
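As a numerical aside on "accounts for maybe half" (an illustrative calculation, not figures from the conversation): variance explained is the square of a correlation, so a measure correlating around 0.7 with test performance would account for roughly half of the variance.

```python
# Variance explained is the square of the correlation coefficient r.
# These r values are illustrative, not estimates from the literature.
for r in (0.3, 0.5, 0.7, 0.9):
    print(f"r = {r}: explains {r * r:.0%} of the variance")
```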

00:33:37 For example, I get pretty bad test anxiety.

00:33:42 Like I was never, I mean,

00:33:46 I just don’t enjoy tests.

00:33:47 I enjoy going back into my cave and working.

00:33:51 Like I’ve always enjoyed homework way more than tests,

00:33:56 no matter how hard the homework is,

00:33:57 because I can go back to the cave

00:33:59 and hide away and think deeply.

00:34:00 There’s something about being watched

00:34:02 and having a time limit that really makes me anxious,

00:34:06 and I can just see the mind not operating optimally at all.

00:34:10 But you’re saying underneath there,

00:34:11 there’s still a G factor, there’s still.

00:34:13 No question, there’s no question.

00:34:16 Boy.

00:34:17 And if you get anxious taking the test,

00:34:19 many people say, oh, I didn’t do well,

00:34:20 because I’m anxious.

00:34:23 I hear that a lot.

00:34:24 Say, well, fine, if you’re really anxious during the test,

00:34:28 the score will be a bad estimate of your G factor.

00:34:32 It doesn’t mean the G factor isn’t there.

00:34:34 That’s right.

00:34:34 And by the way, standardized tests like the SAT,

00:34:40 they’re essentially intelligence tests.

00:34:43 They are highly G loaded.

00:34:45 Now, the people who make the SAT don’t wanna mention that.

00:34:50 They have enough trouble justifying standardized testing,

00:34:54 but to call it an intelligence test

00:34:56 is really beyond the pale.

00:34:58 But in fact, it’s so highly correlated,

00:35:00 because it’s a reasoning test.

00:35:03 SAT is a reasoning test,

00:35:04 a verbal reasoning, mathematical reasoning.

00:35:08 And if it’s a reasoning test, it has to be related to G.

00:35:14 But if people go in and take a standardized test,

00:35:17 whether it’s an IQ test or the SAT,

00:35:20 and they happen to be sick that day with 102 fever,

00:35:24 the score is not going to be a good estimate of their G.

00:35:29 If they retake the test when they’re not anxious

00:35:33 or less anxious or don’t have a fever,

00:35:36 the score will go up, and that will be a better estimate.

00:35:39 But you can’t say their G factor increased

00:35:43 between the two tests.

00:35:45 Well, it’s interesting.

00:35:46 So the question is how wide of a battery of tests

00:35:50 is required to estimate the G factor well?

00:35:53 Because I’ll give you as my personal example,

00:35:55 I took the SAT, and I think it was called the ACT,

00:35:58 there were two, also. I took the SAT many times.

00:36:02 Every single time, I got a perfect score on math.

00:36:05 And verbal, the time limit on the verbal

00:36:08 made me very anxious.

00:36:10 I did not, I mean, part of it,

00:36:12 I didn’t speak English very well.

00:36:14 But honestly, it was like you’re supposed to remember stuff,

00:36:17 and I was so anxious.

00:36:18 And as I’m reading, I’m sweating, I can’t,

00:36:21 you know that feeling you have when you’re reading a book

00:36:26 and you just read a page and you know nothing

00:36:30 about what you’ve read because you zoned out.

00:36:32 That’s the same feeling of like, I can’t, I have to,

00:36:36 you’re like, nope, read and understand.

00:36:39 And that anxiety is like, and you start seeing

00:36:43 like the typography versus the content of the words.

00:36:47 Like that was, I don’t, it’s interesting

00:36:50 because I know that what they’re measuring,

00:36:55 I could see being correlated with something.

00:36:58 But that anxiety or some aspect of the performance

00:37:04 sure plays a factor.

00:37:07 And I wonder how you sneak up in a stable way.

00:37:10 I mean, this is a broader discussion

00:37:11 about like standardized testing, how you sneak up,

00:37:16 how you get at the fact that I’m super anxious

00:37:19 and still nevertheless measure some aspect

00:37:22 of my ontology.

00:37:23 I wonder, I don’t know.

00:37:24 I don’t know if you can say to that,

00:37:26 that time limit sure is a pain.

00:37:28 Well, let me say this.

00:37:30 There are two ways to approach the very real problem

00:37:34 that you say that some people just get anxious

00:37:36 or are not good test takers.

00:37:38 By the way, part of testing is you know the answer,

00:37:45 you can figure out the answer or you can’t.

00:37:49 If you don’t know the answer, there are many reasons

00:37:52 you don’t know the answer at that particular moment.

00:37:55 You may have learned it once and forgotten it.

00:37:58 It may be on the tip of your tongue

00:38:00 and you just can’t get it

00:38:01 because you’re anxious about the time limit.

00:38:03 You may never have learned it.

00:38:05 You may have been exposed to it,

00:38:08 but it was too complicated and you couldn’t learn it.

00:38:11 I mean, there are all kinds of reasons here.

00:38:13 But for an individual to interpret your scores

00:38:18 as an individual, whoever is interpreting the score

00:38:23 has to take into account various things

00:38:26 that would affect your individual score.

00:38:29 And that’s why decisions about college admission

00:38:32 or anything else where tests are used

00:38:35 are hardly ever the only criterion to make a decision.

00:38:42 And I think college admissions are

00:38:45 letting go of that very much.

00:38:46 Oh yes, yeah.

00:38:48 But what does that even mean?

00:38:51 Because is it possible to design standardized tests

00:38:55 that are useful to college admissions?

00:38:58 Well, they already exist.

00:38:59 The SAT is highly correlated with many aspects

00:39:03 of success at college.

00:39:05 Here’s the problem.

00:39:06 So maybe you could speak to this.

00:39:09 The correlation across the population versus individuals.

00:39:13 So our criminal justice system is designed to make sure,

00:39:23 wow, it’s still, there’s tragic cases

00:39:27 where innocent people go to jail,

00:39:29 but you try to avoid that.

00:39:31 And the same way with testing,

00:39:34 it just, it would suck for the SAT to miss a genius.

00:39:38 Yes, and it’s possible, but it’s statistically unlikely.

00:39:43 So it really comes down to which piece of information

00:39:51 maximizes your decision making ability.

00:39:58 So if you just use high school grades, it’s okay.

00:40:05 But you will miss some people

00:40:07 who just don’t do well in high school,

00:40:09 but who are actually pretty smart,

00:40:11 smart enough to be bored silly in high school,

00:40:13 and they don’t care,

00:40:14 and their high school GPA isn’t that good.

00:40:17 So you will miss them in the same sense

00:40:21 that somebody who could be very able and ready for college

00:40:25 just doesn’t do well on their SAT.

00:40:28 This is why you make decisions

00:40:31 with taking in a variety of information.

00:40:36 The other thing I wanted to say,

00:40:38 I talked about when you make a decision for an individual,

00:40:43 statistically for groups,

00:40:46 there are many people who have a disparity

00:40:50 between their math score and their verbal score.

00:40:53 That disparity, or the other way around,

00:40:55 that disparity is called tilt.

00:40:58 The score is tilted one way or the other.

00:41:01 And that tilt has been studied empirically

00:41:05 to see what that predicts.

00:41:07 And in fact, you can’t make predictions

00:41:09 about college success based on tilt.

00:41:14 And mathematics is a good example.

00:41:16 There are many people,

00:41:18 especially non native speakers of English

00:41:20 who come to this country,

00:41:22 take the SATs, do very well on the math

00:41:24 and not so well on the verbal.

00:41:26 Well, if they’re applying to a math program,

00:41:31 the professors there who are making the decision

00:41:33 or the admissions officers

00:41:35 don’t weight the verbal score so much,

00:41:39 especially if it’s a non native speaker.

00:41:42 Well, so yeah, you have to try to,

00:41:44 in the admission process, bring in the context.

00:41:47 But non native isn’t really the problem.

00:41:50 I mean, that was part of the problem for me.

00:41:53 But it’s the anxiety was, which it’s interesting.

00:41:57 It’s interesting.

00:41:59 Oh boy, reducing yourself down to numbers.

00:42:06 But it’s still true.

00:42:07 It’s still the truth.

00:42:09 It’s a painful truth.

00:42:10 That same anxiety that led me

00:42:16 to struggle with the SAT verbal tests

00:42:20 is still within me in all ways of life.

00:42:24 So maybe that’s not anxiety.

00:42:26 Maybe that’s something, like personality

00:42:30 is also pretty stable.

00:42:32 Personality is stable.

00:42:34 Personality does impact the way you navigate life.

00:42:41 There’s no question.

00:42:42 Yeah, and we should say that the G factor in intelligence

00:42:45 is not just about some kind of number on a paper.

00:42:50 It also has to do with how you navigate life.

00:42:53 How easy life is for you in this very complicated world.

00:43:00 So personality’s all tied into that

00:43:02 in some deep fundamental way.

00:43:05 But now you’ve hit the key point

00:43:07 about why we even want to study intelligence.

00:43:11 And personality, I think, to a lesser extent.

00:43:13 But that’s my interest, is more on intelligence.

00:43:17 I went to graduate school and wanted to study personality,

00:43:20 but that’s kind of another story

00:43:22 how I got kind of shifted from personality research

00:43:25 over to intelligence research.

00:43:27 Because it’s not just a number.

00:43:30 Intelligence is not just an IQ score.

00:43:32 It’s not just an SAT score.

00:43:34 It’s what those numbers reflect about your ability

00:43:39 to navigate everyday life.

00:43:43 It has been said that life is one long intelligence test.

00:43:48 And who can’t relate to that?

00:43:55 And if you doubt, see, another problem here

00:43:58 is a lot of critics of intelligence research,

00:44:00 intelligence testing, tend to be academics

00:44:04 who, by and large, are pretty smart people.

00:44:07 And pretty smart people, by and large,

00:44:10 have enormous difficulty understanding

00:44:12 what the world is like for people with IQs of 80 or 75.

00:44:18 It is a completely different everyday experience.

00:44:23 Even IQ scores of 85, 90.

00:44:27 You know, there’s a popular television program, Judge Judy.

00:44:32 Judge Judy deals with everyday people

00:44:35 with everyday problems, and you can see the full range

00:44:39 of problem solving ability demonstrated there.

00:44:43 And sometimes she does it for laughs,

00:44:45 but it really isn’t funny because people who are,

00:44:52 there are people who are very limited

00:44:54 in their life navigation, let alone success,

00:45:00 by not having good reasoning skills, which cannot be taught.

00:45:05 We know this, by the way, because there are many efforts.

00:45:07 You know, the United States military,

00:45:09 which excels at training people,

00:45:12 I mean, I don’t know that there’s a better organization

00:45:14 in the world for training diverse people,

00:45:18 and they won’t take people with IQs under,

00:45:20 I think, 83 is the cutoff, because they have found

00:45:24 they are unable to train people with lower IQs

00:45:30 to do jobs in the military.

00:45:32 So one of the things the G factor has to do with is learning.

00:45:35 Absolutely, some people learn faster than others.

00:45:40 Some people learn more than others.

00:45:43 Now, faster, by the way, is not necessarily better,

00:45:47 as long as you get to the same place eventually.

00:45:51 But, you know, there are professional schools

00:45:54 that want students who can learn the fastest

00:45:57 because they can learn more or learn better.

00:46:01 Or learn deeper, or all kinds of ideas

00:46:06 about why you select people with the highest scores.

00:46:09 And there’s nothing funnier, by the way,

00:46:12 than to listen to a bunch of academics

00:46:15 complain about the concept of intelligence

00:46:17 and intelligence testing, and then you go

00:46:20 to a faculty meeting where they’re discussing

00:46:22 who to hire among the applicants.

00:46:24 And all they talk about is how smart the person is.

00:46:28 We’ll get to that, we’ll sneak up to that in different ways,

00:46:31 but there’s something about reducing a person

00:46:33 to a number that in part is grounded

00:46:35 to the person’s genetics that makes people very uncomfortable.

00:46:38 But nobody does that.

00:46:40 Nobody in the field actually does that.

00:46:43 That is a worry, like,

00:46:54 well, I don’t wanna call it a conspiracy theory.

00:46:55 I mean, it’s a legitimate worry,

00:46:58 but it just doesn’t happen.

00:47:01 Now, I had a professor in graduate school

00:47:03 who was the only person I ever knew

00:47:05 who considered the students only by their test scores.

00:47:12 And later in his life, he kind of backed off that.

00:47:19 Let me ask you this, so we’ll jump around,

00:47:21 I’ll come back to it, but I tend to,

00:47:26 I’ve had like political discussions with people

00:47:29 and actually my friend Michael Malice, he’s an anarchist.

00:47:36 I disagree with him on basically everything

00:47:39 except the fact that love is a beautiful thing in this world.

00:47:47 And he has this test about left versus right,

00:47:50 whatever, it doesn’t matter what the test is,

00:47:52 but he believes, the question is,

00:47:54 do you believe that some people are better than others?

00:48:00 Question is ambiguous.

00:48:03 Do you believe some people are better than others?

00:48:06 And to me, sort of the immediate answer is no.

00:48:11 It’s a poetic question, it’s an ambiguous question, right?

00:48:15 Like people want, maybe there’s the temptation

00:48:19 to ask, better at what? Better at sports and so on.

00:48:23 No, to me, I stand with the sort of defining documents

00:48:28 of this country, which is all men are created equal.

00:48:32 There’s a basic humanity.

00:48:34 And there’s something about tests of intelligence.

00:48:39 Just knowing that some people are different,

00:48:43 like the science of intelligence that shows

00:48:45 that some people are genetically

00:48:49 in some stable way across a lifetime,

00:48:52 have a greater intelligence than others,

00:48:56 makes people feel like some people are better than others.

00:49:01 And that makes them very uncomfortable.

00:49:03 And I, maybe you can speak to that.

00:49:06 The fact that some people are more intelligent than others

00:49:09 in a way that’s, cannot be compensated

00:49:14 through education, through anything you do in life.

00:49:22 What do we do with that?

00:49:24 Okay, there’s a lot there.

00:49:26 We haven’t really talked about the genetics of it yet.

00:49:29 But you are correct in that it is my interpretation

00:49:35 of the data that genetics has a very important influence

00:49:39 on the G factor.

00:49:41 And this is controversial, and we can talk about it,

00:49:44 but if you think that genetics,

00:49:47 that genes are deterministic, are always deterministic,

00:49:50 that leads to kind of the worry that you expressed.

00:49:55 But we know now in the 21st century

00:49:58 that many genes are not deterministic,

00:50:00 but probabilistic,

00:50:02 meaning their gene expression can be influenced.

00:50:09 Now, whether they’re influenced only

00:50:11 by other biological variables or other genetic variables

00:50:16 or environmental or cultural variables,

00:50:19 that’s where the controversy comes in.

00:50:23 And we can discuss that in more detail if you like.

00:50:27 But to go to the question about better, are people better?

00:50:31 There’s zero evidence that smart people are better

00:50:36 with respect to important aspects of life,

00:50:43 like honesty, even likability.

00:50:47 I’m sure you know many very intelligent people

00:50:50 who are not terribly likable or terribly kind

00:50:53 or terribly honest.

00:50:55 Is there something to be said?

00:50:56 So one of the things I’ve recently reread

00:50:59 for the second time, I guess that’s what the word reread

00:51:03 means, The Rise and Fall of the Third Reich,

00:51:08 which is, I think, the best telling

00:51:12 of the rise and fall of Hitler.

00:51:14 And one of the interesting things about the people that,

00:51:20 how should I say it?

00:51:27 Justified or maybe propped up the ideas

00:51:32 that Hitler put forward is the fact

00:51:35 that they were extremely intelligent.

00:51:38 They were the intellectual class.

00:51:41 They were like, it was obvious that they thought

00:51:46 very deeply and rationally about the world.

00:51:49 So what I would like to say is one of the things

00:51:52 that shows me is that some of the worst atrocities

00:51:56 in the history of humanity have been committed

00:51:58 by very intelligent people.

00:52:00 So that means that intelligence

00:52:04 doesn’t make you a good person.

00:52:06 I wonder if there’s a G factor for intelligence.

00:52:12 I wonder if there’s a G factor for goodness.

00:52:16 The Nietzschean good and evil,

00:52:19 of course that’s probably harder to measure

00:52:21 because it’s such a subjective thing

00:52:23 what it means to be good.

00:52:25 And even the idea of evil is a deeply uncomfortable thing

00:52:29 because how do we know?

00:52:31 But it’s independent, whatever it is,

00:52:33 it’s independent of intelligence.

00:52:35 So I agree with you about that.

00:52:37 But let me say this.

00:52:39 I have also asserted my belief

00:52:44 that more intelligence is better than less.

00:52:49 That doesn’t mean more intelligent people are better people

00:52:54 but all things being equal,

00:52:55 would you like to be smarter or less smart?

00:52:58 So if I had a pill, I have two pills.

00:53:01 I said, this one will make you smarter,

00:53:02 this one will make you dumber.

00:53:04 Which one would you like?

00:53:06 Are there any circumstances

00:53:07 under which you would choose to be dumber?

00:53:09 Well, let me ask you this.

00:53:11 That’s a very nuanced and interesting question.

00:53:16 There’s been books written about this, right?

00:53:19 Now we’ll return to the hard questions,

00:53:21 the interesting questions,

00:53:22 but let me ask about human happiness.

00:53:25 Does intelligence lead to happiness?

00:53:29 No.

00:53:31 So, okay, so back to the pill then.

00:53:34 So when would you take the pill?

00:53:38 So you said IQ 80, 90, 100, 110,

00:53:44 you start going through the quartiles

00:53:46 and is it obvious?

00:53:50 Isn’t there diminishing returns

00:53:54 and then it starts becoming negative?

00:53:57 This is an empirical question.

00:54:00 And so I have advocated in many forums

00:54:06 more research on enhancing the G factor.

00:54:11 Right now there have been many claims

00:54:14 about enhancing intelligence with,

00:54:17 you mentioned the n-back training,

00:54:19 it was a big deal a few years ago, it doesn’t work.

00:54:22 Data is very clear, it does not work.

00:54:25 Or doing like memory tests, like training and so on.

00:54:28 Yeah, it may give you a better memory in the short run,

00:54:32 but it doesn’t impact your G factor.

00:54:38 It was very popular a couple of decades ago

00:54:40 the idea that listening to Mozart

00:54:44 could make you more intelligent.

00:54:46 There was a paper published on this

00:54:48 by somebody I knew,

00:54:50 and intelligence researchers never believed it for a second.

00:54:54 There have been hundreds of studies, all the meta-analyses,

00:54:57 all the summaries and so on,

00:54:59 show that there’s nothing to it, nothing to it at all.

00:55:05 But wouldn’t it be something,

00:55:08 wouldn’t it be world shaking

00:55:12 if you could take the normal distribution of intelligence,

00:55:15 which we haven’t really talked about yet,

00:55:17 but IQ scores and the G factor

00:55:20 is thought to be a normal distribution,

00:55:23 and shift it to the right so that everybody is smarter?

00:55:30 Even a half a standard deviation would be world shaking,

00:55:35 because there are many social problems,

00:55:38 many, many social problems that are exacerbated

00:55:43 by people with lower ability to reason stuff out

00:55:48 and navigate everyday life.
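To put rough numbers on "even a half a standard deviation" (a back-of-the-envelope sketch: IQ is conventionally scaled to mean 100 and standard deviation 15, and 83 is the military cutoff mentioned earlier):

```python
import math

def frac_below(cutoff, mean=100.0, sd=15.0):
    """Fraction of a normal(mean, sd) population scoring below cutoff,
    using the closed-form normal CDF."""
    z = (cutoff - mean) / sd
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# Share of the population below an IQ of 83, before and after shifting
# the whole distribution up by half a standard deviation (7.5 points):
print(f"mean 100.0: {frac_below(83):.1%} below 83")
print(f"mean 107.5: {frac_below(83, mean=107.5):.1%} below 83")
```

Under these assumptions, the share below the cutoff drops from roughly 13% to roughly 5%, which is one concrete sense in which a half-standard-deviation shift would be world shaking.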

00:55:51 So.

00:55:52 I wonder if there’s a threshold.

00:55:53 So maybe I would push back and say universal shifting

00:55:59 of the normal distribution

00:56:02 may not be the optimal way of shifting.

00:56:05 Maybe it’s better to,

00:56:07 do some kind of asymmetric shift,

00:56:10 really pushing the lower end up

00:56:13 versus trying to make the people

00:56:17 at the average more intelligent.

00:56:19 So you’re saying that if in fact

00:56:21 there was some way to increase G,

00:56:23 let’s just call it metaphorically a pill, an IQ pill,

00:56:27 we should only give it to people at the lower end.

00:56:30 No, it’s just intuitively I can see

00:56:34 that life becomes easier at the lower end if it’s increased.

00:56:39 It becomes less and less,

00:56:41 it is an empirical scientific question,

00:56:43 but it becomes less and less obvious to me

00:56:46 that more intelligence is better.

00:56:50 At the high end, not because it would make life easier,

00:56:56 but it would make whatever problems you’re working on

00:57:00 more solvable.

00:57:02 And if you are working on artificial intelligence,

00:57:06 there’s a tremendous potential for that to improve society.

00:57:13 I understand.

00:57:14 So at whatever problems you’re working on, yes.

00:57:19 But there’s also the problem of the human condition.

00:57:21 There’s love, there’s fear,

00:57:24 and all of those beautiful things

00:57:26 that sometimes if you’re good at solving problems,

00:57:29 you’re going to create more problems for yourself.

00:57:32 It’s, I’m not exactly sure.

00:57:34 So ignorance is bliss is a thing.

00:57:37 So there might be a place,

00:57:38 there might be a sweet spot of intelligence

00:57:40 given your environment, given your personality,

00:57:43 all of those kinds of things.

00:57:45 And that becomes less beautifully complicated

00:57:48 the more and more intelligent you become.

00:57:50 But that’s a question for literature,

00:57:53 not for science perhaps.

00:57:54 Well, imagine this.

00:57:56 Imagine there was an IQ pill

00:57:58 and it was developed by a private company

00:58:01 and they are willing to sell it to you.

00:58:05 And whatever price they put on it,

00:58:07 you are willing to pay it

00:58:09 because you would like to be smarter.

00:58:11 But just before they give you a pill,

00:58:14 they give you a disclaimer form to sign.

00:58:18 Yes.

00:58:20 Don’t hold us,

00:58:22 you understand that this pill has no guarantee

00:58:25 that your life is going to be better

00:58:26 and in fact it could be worse.

00:58:28 Well, yes, that’s how lawyers work.

00:58:32 But I would love for science to answer the question

00:58:35 to try to predict if your life

00:58:36 is going to be better or worse

00:58:38 when you become more or less intelligent.

00:58:41 It’s a fascinating question

00:58:43 about what is the sweet spot for the human condition.

00:58:47 Some of the things we see as bugs

00:58:49 might actually be features,

00:58:51 may be crucial to our overall happiness

00:58:55 as our limitations might lead to more happiness than less.

00:58:59 But again, more intelligence is better at the lower end.

00:59:02 That’s something that’s less arguable,

00:59:06 and fascinating if it’s possible to increase.

00:59:10 But you know, there’s virtually no research

00:59:12 that’s based on a neuroscience approach

00:59:15 to solving that problem.

00:59:17 All the solutions that have been proposed

00:59:20 to solve that problem or to ameliorate that problem

00:59:25 are essentially based on the blank slate assumption

00:59:29 that enriching the environment, removing barriers,

00:59:34 all good things by the way,

00:59:36 I’m not against any of those things.

00:59:38 But there’s no empirical evidence

00:59:39 that they’re going to improve the general reasoning ability

00:59:45 or make people more employable.

00:59:47 Have you read Flowers for Algernon? Yes.

00:59:51 That’s to the question of intelligence and happiness.

00:59:56 There are many profound aspects of that story.

00:59:59 It was a film that was very good.

01:00:03 The film was called Charly

01:00:04 for the younger people who are listening to this.

01:00:08 You might be able to stream it on Netflix or something,

01:00:11 but it was a story about a person

01:00:16 with very low IQ who underwent a surgical procedure

01:00:20 in the brain and he slowly became a genius.

01:00:24 And the tragedy of the story is the effect was temporary.

01:00:31 It’s a fascinating story really.

01:00:33 That goes in contrast to the basic human experience

01:00:36 that each of us individually have,

01:00:38 but it raises the question of the full range of people

01:00:43 you might be able to be, given different levels

01:00:47 of intelligence.

01:00:48 You’ve mentioned the normal distribution.

01:00:52 So let’s talk about it.

01:00:54 There’s a book called The Bell Curve, published in 1994,

01:00:58 written by psychologist Richard Herrnstein

01:01:01 and political scientist Charles Murray.

01:01:04 Why was this book so controversial?

01:01:08 This is a fascinating book.

01:01:10 I know Charles Murray.

01:01:12 I’ve had many conversations with him.

01:01:15 Yeah, what is the book about?

01:01:16 The book is about the importance of intelligence

01:01:22 in everyday life.

01:01:25 That’s what the book is about.

01:01:27 It’s an empirical book.

01:01:29 It has statistical analyses of very large databases

01:01:34 that show that essentially IQ scores or their equivalent

01:01:39 are correlated to all kinds of social problems

01:01:44 and social benefits.

01:01:46 And that in itself is not where the controversy

01:01:51 about that book came.

01:01:53 The controversy was about one chapter in that book.

01:01:57 And that is a chapter about the average difference

01:02:02 in mean scores between black Americans and white Americans.

01:02:06 And these are the terms that were used in the book

01:02:08 at the time and are still used to some extent.

01:02:14 And historically, or really for decades,

01:02:21 it has been observed that disadvantaged groups

01:02:28 score on average lower than Caucasians

01:02:35 on academic tests, tests of mental ability,

01:02:38 and especially on IQ tests.

01:02:40 And the difference is about a standard deviation,

01:02:43 which is about 15 points, which is a substantial difference.

01:02:48 In the book, Herrnstein and Murray in this one chapter

01:02:54 assert clearly and unambiguously

01:02:59 that whether this average difference

01:03:01 is due to genetics or not, they are agnostic.

01:03:06 They don’t know.

01:03:08 Moreover, they assert they don’t care

01:03:11 because you wouldn’t treat anybody differently

01:03:13 knowing if there was a genetic component or not

01:03:17 because that’s a group average finding.

01:03:20 Every individual has to be treated as an individual.

01:03:23 You can’t make any assumption

01:03:26 about what that person’s intellectual ability might be

01:03:30 from the fact of an average group difference.

01:03:33 They’re very clear about this.

01:03:34 Nonetheless, people took away,

01:03:41 I’m gonna choose my words carefully

01:03:43 because I have a feeling that many critics

01:03:44 didn’t actually read the book.

01:03:49 They took away that Herrnstein and Murray were saying

01:03:51 that blacks are genetically inferior.

01:03:54 That was the take home message.

01:03:56 And if they weren’t saying it, they were implying it

01:03:59 because they had a chapter that discussed

01:04:02 this empirical observation of a difference.

01:04:07 And isn’t this horrible?

01:04:10 And so the reaction to that book was incendiary.

01:04:18 What do we know about from that book

01:04:22 and the research beyond about race differences

01:04:28 and intelligence?

01:04:30 It’s still the most incendiary topic in psychology.

01:04:33 Nothing has changed that.

01:04:35 Anybody who even discusses it is easily called a racist

01:04:41 just for discussing it.

01:04:43 It’s become fashionable to find racism

01:04:45 in any discussion like this.

01:04:49 It’s unfortunate.

01:04:53 The short answer to your question is

01:04:57 there’s been very little actual research

01:05:00 on this topic since 19…

01:05:03 Since The Bell Curve.

01:05:05 Since The Bell Curve, even before.

01:05:07 This really became incendiary in 1969

01:05:12 with an article published by an educational psychologist

01:05:15 named Arthur Jensen.

01:05:17 Let’s just take a minute and go back to that

01:05:20 to see The Bell Curve in a little bit more

01:05:22 historical perspective.

01:05:25 Arthur Jensen was an educational psychologist

01:05:28 at UC Berkeley.

01:05:29 I knew him as well.

01:05:31 And in 1969 or 68, the Harvard Educational Review

01:05:37 asked him to do a review article

01:05:42 on the early childhood education programs

01:05:47 that were designed to raise the IQs of minority students.

01:05:54 This was before the federally funded Head Start program.

01:05:58 Head Start had not really gotten underway

01:06:01 at the time Jensen undertook his review

01:06:04 of what were a number of demonstration programs.

01:06:08 And these demonstration programs were for young children

01:06:13 who were around kindergarten age.

01:06:15 And they were specially designed to be

01:06:18 cognitively stimulating, to provide lunches,

01:06:23 do all the things that people thought would

01:06:27 minimize this average gap on intelligence tests.

01:06:31 There was a strong belief among virtually all psychologists

01:06:37 that the cause of the gap was unequal opportunity

01:06:40 due to racism, due to all negative things in the society.

01:06:45 And if you could compensate for this, the gap would go away.

01:06:51 So early childhood education back then was called

01:06:54 literally compensatory education.

01:06:58 Jensen looked at these programs.

01:07:00 He was an empirical guy.

01:07:02 He understood psychometrics.

01:07:04 And he wrote a, it was over a hundred page article

01:07:08 detailing these programs

01:07:12 and the flaws in their research design.

01:07:15 Some of the programs reported IQ gains

01:07:17 of on average five points,

01:07:20 but a few reported 10, 20 and even 30 point gains.

01:07:24 One was called the miracle in Milwaukee.

01:07:28 That investigator went to jail ultimately

01:07:30 for fabricating data.

01:07:33 But the point is that Jensen wrote an article that said,

01:07:36 look, the opening sentence of his article is classic.

01:07:40 The opening sentence is, I may not quote it exactly right,

01:07:43 but it’s, we have tried compensatory education

01:07:47 and it has failed.

01:07:48 And he showed that these gains were essentially nothing.

01:07:54 You couldn’t really document empirically any gains at all

01:07:58 from these really earnest efforts to increase IQ.

01:08:03 But he went a step further, a fateful step further.

01:08:08 He said, not only have these efforts failed,

01:08:11 but because they have had essentially no impact,

01:08:15 we have to reexamine our assumption

01:08:17 that these differences are caused by environmental things

01:08:22 that we can address with education.

01:08:24 We need to consider a genetic influence,

01:08:28 whether there’s a genetic influence

01:08:30 on this group difference.

01:08:32 So you said that this is one of the more controversial works

01:08:36 ever in science. I think it’s the most infamous paper

01:08:37 in all of psychology, I would go so far as to say.

01:08:41 Because in 1969, the genetic data was very skimpy

01:08:46 on this question, skimpy and controversial.

01:08:49 It’s always been controversial,

01:08:50 but it was even skimpy and controversial.

01:08:53 It’s kind of a long story that I go into a little bit

01:08:56 in more detail in the book, Neuroscience of Intelligence.

01:09:02 But to say he was vilified is an understatement.

01:09:06 I mean, he couldn’t talk at the American

01:09:08 Psychological Association without bomb threats

01:09:13 clearing the lecture hall.

01:09:15 Campus security watched him all the time.

01:09:18 They opened his mail.

01:09:20 He had to retreat to a different address.

01:09:24 This was one of the earliest kinds,

01:09:28 this is before the internet

01:09:30 and kind of internet social media mobs.

01:09:35 But it was that intense.

01:09:38 And I have written that overnight,

01:09:42 after the publication of this article,

01:09:45 all intelligence research became radioactive.

01:09:49 Nobody wanted to talk about it.

01:09:56 Nobody was doing more research.

01:09:58 And then the bell curve came along.

01:10:02 And the Jensen controversy was dying down.

01:10:05 I have stories that Jensen told me about his interaction

01:10:08 with the Nixon White House on this issue.

01:10:10 I mean, this was like a really big deal.

01:10:14 There were some unbelievable stories,

01:10:16 but he told me this, so I kind of believe these stories.

01:10:20 Nonetheless.

01:10:21 25 years later.

01:10:22 25 years later.

01:10:24 All this silence, basically, saying,

01:10:30 nobody wants to do this kind of research.

01:10:32 There’s so much pressure, so much attack

01:10:34 against this kind of research.

01:10:36 And here’s sort of a bold, stupid, crazy people

01:10:41 that decide to dive right back in.

01:10:44 I wonder how much discussion there was.

01:10:46 Do we include this chapter or not?

01:10:48 Murray has said they discussed it,

01:10:51 and they felt they should include it.

01:10:55 And they were very careful in the way they wrote it,

01:10:59 which did them no good.

01:11:01 So, as a matter of fact, when the bell curve came out,

01:11:06 it was so controversial.

01:11:08 I got a call from a television show called Nightline.

01:11:13 It was with a broadcaster called Ted Koppel.

01:11:16 It was this evening show, I think it was on late at night.

01:11:20 Talked about news.

01:11:21 It was a straight up news thing.

01:11:24 And a producer called and asked if I would be on it

01:11:28 to talk about the bell curve.

01:11:31 And I said, she asked me what I thought

01:11:35 about the bell curve as a book.

01:11:36 And I said, look, it’s a very good book.

01:11:38 It talks about the role of intelligence in society.

01:11:43 And she said, no, no, what do you think

01:11:44 about the chapter on race?

01:11:47 That’s what we want you to talk about.

01:11:49 I remember this conversation.

01:11:52 And she said, what would you say

01:11:56 if you were on TV?

01:11:58 And I said, well, what I would say is that

01:12:02 it’s not at all clear if there’s any genetic component

01:12:07 to intelligence, any differences.

01:12:13 But if there were a strong genetic component,

01:12:17 that would be a good thing.

01:12:21 And complete silence on the other end of the phone.

01:12:25 And she said, well, what do you mean?

01:12:28 And I said, well, the more genetic

01:12:31 any difference is, the more it’s biological.

01:12:35 And if it’s biological, we can figure out how to fix it.

01:12:39 I see, that’s interesting.

01:12:41 She said, would you say that on television?

01:12:43 Yes.

01:12:44 And she said, no.

01:12:45 And so that was the end of that.

01:12:47 So that’s more like, biology is within the reach

01:12:56 of science, and the environment is a matter of public policy,

01:13:02 of social programs, and all those kinds of things.

01:13:05 From your perspective, whichever one you think

01:13:09 is more amenable to solutions in the short term

01:13:11 is the one that excites you.

01:13:13 But you’re saying that is good, while the truth of genetic differences

01:13:22 between groups, no matter what, is a painful, harmful,

01:13:32 potentially dangerous thing.

01:13:35 So let me ask you this question,

01:13:38 whether it’s bell curve or any research

01:13:40 on race differences, can that be used to increase

01:13:47 the amount of racism in the world?

01:13:49 Can that be used to increase the amount of hate

01:13:51 in the world?

01:13:52 Do you think about this kind of stuff?

01:13:54 I’ve thought about this a lot, not as a scientist,

01:13:57 but as a person.

01:14:00 And my sense is there is such enormous reservoirs

01:14:05 of hate and racism that have nothing to do

01:14:13 with scientific knowledge of the data,

01:14:16 that speak against that.

01:14:19 That no, I don’t wanna give racist groups a veto power

01:14:25 over what scientists study.

01:14:27 If you think that the differences, and by the way,

01:14:31 virtually no one disagrees that there are differences

01:14:34 in scores, it’s all about what causes them

01:14:37 and how to fix it.

01:14:39 So if you think this is a cultural problem,

01:14:42 then you must ask the question,

01:14:44 do you want to change anything about the culture?

01:14:49 Or are you okay with the culture?

01:14:51 Cause you don’t feel it’s appropriate

01:14:53 to change a person’s culture.

01:14:55 So are you okay with that?

01:14:57 And the fact that that may lead to disadvantages

01:14:59 in school achievement.

01:15:01 It’s a question.

01:15:02 If you think it’s environmental,

01:15:05 what are the environmental parameters that can be fixed?

01:15:10 I’ll tell you one, lead from gasoline in the atmosphere.

01:15:15 Lead in paint, lead in water.

01:15:18 That’s an environmental toxin that society

01:15:22 has the means to eliminate, and they should.

01:15:25 Yeah, just to sort of try and find some insights

01:15:30 and conclusions to this very difficult topic.

01:15:33 Has there been research on environment versus genetics,

01:15:38 nature versus nurture, on this question

01:15:40 of race differences?

01:15:43 There is not, no one wants to do this research.

01:15:46 First of all, it’s hard research to do.

01:15:48 Second of all, it’s a minefield.

01:15:50 No one wants to spend their career on it.

01:15:52 Tenured people don’t want to do it, let alone students.

01:15:56 The way I talk about it,

01:16:00 well, before I tell you the way I talk about it,

01:16:02 I want to say one more thing about Jensen.

01:16:05 He was once asked by a journalist straight out,

01:16:08 are you a racist?

01:16:10 His answer was very interesting.

01:16:12 His answer was, I’ve thought about that a lot,

01:16:16 and I’ve concluded it doesn’t matter.

01:16:22 Now, I know what he meant by this.

01:16:23 The guts to say that, wow.

01:16:25 He was a very unusual person.

01:16:27 I think he had a touch of Asperger’s syndrome,

01:16:29 to tell you the truth,

01:16:30 because I saw him in many circumstances.

01:16:34 He would be canceled on Twitter immediately

01:16:36 with that sentence.

01:16:37 But what he meant was he had a hypothesis,

01:16:42 and with respect to group differences,

01:16:44 he called it the default hypothesis.

01:16:47 He said, whatever factors affect individual intelligence

01:16:51 are likely the same factors that affect group differences.

01:16:54 It was the default.

01:16:55 But it was a hypothesis.

01:16:58 It should be tested, and if it turned out

01:17:01 empirical tests didn’t support the hypothesis,

01:17:03 he was happy to move on to something else.

01:17:06 He was absolutely committed to that scientific ideal,

01:17:12 that it’s an empirical question,

01:17:16 we should look at it, and let’s see what happens.

01:17:18 The scientific method cannot be racist,

01:17:22 from his perspective.

01:17:23 It doesn’t matter what the scientists,

01:17:26 if they follow the scientific method,

01:17:30 it doesn’t matter what they believe.

01:17:32 And if they are biased, and they consciously

01:17:35 or unconsciously bias the data,

01:17:39 other people will come along to replicate it,

01:17:42 they will fail, and the process over time will work.

01:17:48 So let me push back on this idea.

01:17:50 Because psychology to me is full of gray areas.

01:17:57 And what I’ve observed about psychology,

01:18:01 even replication crisis aside,

01:18:04 is that something about the media,

01:18:06 something about journalism,

01:18:08 something about the virality of ideas in the public sphere,

01:18:13 they misinterpret, they take up things from studies,

01:18:18 willfully or from ignorance, misinterpret findings,

01:18:23 and tell narratives around that.

01:18:27 I personally believe, for me,

01:18:29 I’m not saying that broadly about science,

01:18:31 but for me, it’s my responsibility to anticipate

01:18:35 the ways in which findings will be misinterpreted.

01:18:40 So I thought about this a lot,

01:18:42 because I published papers on semi autonomous vehicles,

01:18:47 and those cars, people die in cars.

01:18:52 There’s people that have written me letters, well, emails,

01:18:57 nobody writes letters, I wish they did,

01:19:00 saying that I have blood on my hands,

01:19:01 because of things that I would say positive or negative,

01:19:04 there’s consequences.

01:19:06 In the same way, when you’re a researcher of intelligence,

01:19:09 I’m sure you might get emails,

01:19:12 or at least people might believe

01:19:14 that a finding of your study is going to be used

01:19:17 by a large number of people

01:19:19 to increase the amount of hate in the world.

01:19:22 I think there’s some responsibility on scientists,

01:19:26 but for me, I think there’s a great responsibility

01:19:30 to anticipate the ways things will be misinterpreted,

01:19:35 and there, you have to, first of all,

01:19:37 decide whether you want to say a thing at all,

01:19:40 do the study at all, publish the study at all,

01:19:43 and two, the words with which you explain it.

01:19:49 I find this on Twitter a lot, actually,

01:19:50 which is, when I write a tweet,

01:19:53 and I’m usually just doing it so innocently,

01:19:58 I’ll write it, it takes me five seconds to write it,

01:20:02 or whatever, 30 seconds to write it,

01:20:04 and then I’ll think, all right, I’ll close my eyes

01:20:08 and try to see how the world will interpret this,

01:20:11 what are the ways in which this will be misinterpreted,

01:20:14 and I’ll sometimes adjust that tweet to see,

01:20:18 yeah, so in my mind, it’s clear,

01:20:20 but that’s because it’s my mind from which this tweet came,

01:20:24 but you have to think, in a fresh mind that sees this,

01:20:28 and it’s spread across a large number of other minds,

01:20:32 how will the interpretation morph?

01:20:36 I mean, for a tweet, it’s a silly thing, it doesn’t matter,

01:20:38 but for a scientific paper and study and finding,

01:20:45 I think it matters.

01:20:47 So I don’t know what your thoughts are on that,

01:20:49 because maybe for Jensen, the data’s there,

01:20:54 what do you want me to do?

01:20:55 This is a scientific process that’s been carried out,

01:20:59 if you think the data was polluted by bias,

01:21:02 do other studies that reveal the bias,

01:21:05 but the data’s there.

01:21:07 And I’m not a poet, I’m not a literary writer,

01:21:14 what do you want me to do?

01:21:15 I’m just presenting you the data.

01:21:17 What do you think on that spectrum?

01:21:19 What’s the role of a scientist?

01:21:21 The reason I do podcasts,

01:21:23 the reason I write books for the public

01:21:27 is to explain what I think the data mean

01:21:30 and what I think the data don’t mean.

01:21:32 I don’t do very much on Twitter other than to retweet

01:21:38 references to papers.

01:21:39 I don’t think it’s my role to explain these,

01:21:42 because they’re complicated, they’re nuanced.

01:21:46 But when you decide not to do a scientific study

01:21:51 or not to publish a result

01:21:54 because you’re afraid the result could be harmful

01:21:59 or insensitive, that’s not an unreasonable thought.

01:22:05 And people will make different conclusions

01:22:09 and decisions about that.

01:22:11 I wrote about this, I’m the editor

01:22:14 of a journal called Intelligence,

01:22:17 which publishes scientific papers.

01:22:20 Sometimes we publish papers on group differences.

01:22:24 Those papers sometimes are controversial.

01:22:27 These papers are written for a scientific audience.

01:22:29 They’re not written for the Twitter audience.

01:22:32 I don’t promote them very much on Twitter.

01:22:37 But in a scientific paper,

01:22:41 you have to now choose your words carefully also,

01:22:44 because those papers are picked up by non scientists,

01:22:49 by writers of various kinds,

01:22:52 and you have to be available to discuss what you’re saying

01:22:56 and what you’re not saying.

01:22:58 Sometimes you are successful at having a good conversation

01:23:04 like we are today, that doesn’t start out pejorative.

01:23:09 Other times I’ve been asked to participate in debates

01:23:12 where my role would be to justify race science.

01:23:16 Well, you can see how that starts out.

01:23:21 That was a BBC request that I received.

01:23:25 I have so much, it’s a love hate relationship,

01:23:28 mostly hate with these shallow journalism organizations.

01:23:33 So they would want to use you

01:23:36 in a kind of debate setting to communicate

01:23:39 that, like, there are race differences between groups,

01:23:42 and make that into a debate and put you in the role of…

01:23:47 Justifying racism.

01:23:49 Justifying racism.

01:23:50 That’s what they’re asking me to do.

01:23:51 As opposed to, like, educating about this field

01:23:54 of the science of intelligence, yeah.

01:23:56 I wanna say one more thing

01:23:57 before we get off the normal distribution.

01:24:01 You also asked me what is the science after the bell curve?

01:24:06 And the short answer is there’s not much new work,

01:24:09 but whatever work there is supports the idea

01:24:13 that there still are group differences.

01:24:16 It’s arguable whether those differences

01:24:18 have diminished at all or not.

01:24:20 And there is still a major problem

01:24:24 in underperformance for school achievement

01:24:29 for many disadvantaged and minority students.

01:24:33 And there so far is no way to fix it.

01:24:37 What do we do with this information?

01:24:39 Is this now a task?

01:24:42 Now we’ll talk about the future

01:24:45 on the neuroscience and the biology side,

01:24:47 but in terms of this information as a society

01:24:51 in the public policy, in the political space,

01:24:53 in the social space, what do we do with this information?

01:24:56 I’ve thought a lot about this.

01:24:57 The first step is to have people interested in policy

01:25:03 understand what the data actually show

01:25:06 to pay attention to intelligence data.

01:25:09 You can read policy papers about education

01:25:13 and using your word processor,

01:25:15 you can search for the word intelligence.

01:25:17 You can search a 20,000 word document in a second

01:25:22 and find out the word intelligence does not appear anywhere.

01:25:26 In most discussions about what to do about achievement gaps,

01:25:32 I’m not talking about test gaps,

01:25:33 I’m talking about actual achievement gaps in schools,

01:25:37 which everyone agrees is a problem,

01:25:40 the word intelligence doesn’t appear among educators.

01:25:43 That’s fascinating.

01:25:44 As a matter of fact, in California,

01:25:47 there has been tremendous controversy

01:25:50 about recent attempts to revise the curriculum

01:25:53 for math in high schools.

01:25:56 And we had a Stanford professor of education

01:25:59 who was running this review assert

01:26:02 there’s no such thing as talent, mathematical talent.

01:26:07 And she wanted to get rid of the advanced classes in math

01:26:12 because not everyone could do that.

01:26:15 Now, of course, this has been very controversial,

01:26:17 they’ve retreated somewhat,

01:26:19 but the idea that a university professor

01:26:21 was in charge of this who believes

01:26:26 that there’s no talent, that it doesn’t exist,

01:26:31 this is rather shocking,

01:26:33 let alone the complete absence of intelligence data.

01:26:37 By the way, let me tell you something

01:26:38 about what the intelligence data show.

01:26:41 Let’s take race out of it.

01:26:45 Even though the origins of these studies

01:26:48 were a long time ago,

01:26:52 I’m blocking on the name of the report,

01:26:53 the Coleman report was a famous report about education.

01:26:57 And they measured all kinds of variables about schools,

01:27:01 about teachers,

01:27:03 and they looked at academic achievement as an outcome.

01:27:08 And they found the most predictive variables

01:27:12 of education outcome were the variables

01:27:15 the student brought with him or her into the school,

01:27:20 essentially their ability.

01:27:23 And that when you combine the school

01:27:26 and the teacher variables together,

01:27:29 the quality of the school, the funding of the school,

01:27:31 the quality of the teachers, their education,

01:27:34 you put all the teacher and school variables together,

01:27:37 it barely accounted for 10% of the variance.

01:27:41 And this has been replicated now.

01:27:45 So the best research we have shows that school variables

01:27:51 and teacher variables together account

01:27:54 for about 10% of student academic achievement.

01:27:59 Now, you wanna have some policy

01:28:02 on improving academic achievement,

01:28:04 how much money do you wanna put into teacher education?

01:28:08 How much money do you wanna put into the quality

01:28:11 of the school administration?

01:28:14 You know who you can ask?

01:28:15 You can ask the Gates Foundation,

01:28:18 because they spent a tremendous amount of money doing that.

01:28:21 And at the end of it, because they’re measurement people,

01:28:25 they wanna know the data,

01:28:27 they found it had no impact at all.

01:28:29 And they’ve kind of pulled out of that kind of program.

01:28:33 So, oh boy.

01:28:36 Let me ask you, this is me talking, but there’s…

01:28:41 Just the two of us.

01:28:42 Just the two of us, but I’m gonna say

01:28:44 some funny and ridiculous things,

01:28:46 so you’re surely not approving of it.

01:28:51 But there’s a movie called Clerks.

01:28:53 You probably…

01:28:54 I’ve seen it, I’ve seen it, yeah.

01:28:56 There’s a funny scene in there where a lovely couple

01:28:59 are talking about the number

01:29:01 of previous sexual partners they had.

01:29:03 And the woman says that,

01:29:07 I believe she just had a handful,

01:29:09 like two or three or something like that sexual partners,

01:29:12 but then she also mentioned that she…

01:29:17 What’s that called?

01:29:19 Fellatio, what’s the scientific term?

01:29:20 But she went, you know, gave a blow job

01:29:23 to 37 guys, I believe it is.

01:29:26 And so that has to do with the truth.

01:29:29 So sometimes, knowing the truth

01:29:35 can get in the way of a successful relationship

01:29:39 of love of some of the human flourishing.

01:29:43 And that seems to me that’s at the core here,

01:29:46 that facing some kind of truth

01:29:49 that’s not able to be changed

01:29:53 makes it difficult to sort of…

01:29:56 Is limiting as opposed to empowering.

01:29:59 That’s the concern.

01:30:01 If you sort of test for intelligence

01:30:03 and lay the data out,

01:30:06 it feels like you will give up on certain people.

01:30:09 You will sort of start binning people,

01:30:12 it’s like, well, this person is like,

01:30:15 let’s focus on the average people

01:30:18 or let’s focus on the very intelligent people.

01:30:20 That’s the concern.

01:30:21 And there’s a kind of intuition

01:30:26 that if we just don’t measure

01:30:29 and we don’t use that data,

01:30:31 that we will treat everybody equal

01:30:33 and give everybody equal opportunity.

01:30:37 If we have the data in front of us,

01:30:39 we’re likely to misdistribute

01:30:43 the amount of sort of attention we allocate,

01:30:46 resources we allocate to people.

01:30:49 That’s probably the concern.

01:30:52 It’s a realistic concern,

01:30:55 but I think it’s a misplaced concern

01:30:57 if you wanna fix the problem.

01:31:00 If you wanna fix the problem,

01:31:02 you have to know what the problem is.

01:31:03 Yep.

01:31:05 Now, let me tell you this,

01:31:06 let’s go back to the bell curve,

01:31:08 not the bell curve, but the normal distribution.

01:31:11 Yes, 16% of the population on average has an IQ under 85,

01:31:20 which means life is very hard for them.

01:31:22 If you have an IQ under 85,

01:31:24 it’s very hard to find gainful employment

01:31:26 at a salary that sustains you

01:31:31 at least minimally in modern life, okay?

01:31:35 Not impossible, but it’s very difficult.

01:31:37 16% of the population of the United States

01:31:41 is about 51 or 52 million people with IQs under 85.

01:31:47 This is not a small issue.

01:31:51 14 million children have IQs under 85.
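[For reference, the 16% figure is the lower tail of the normal distribution IQ tests are conventionally normed to, mean 100 and standard deviation 15, so 85 is exactly one SD below the mean. A small stdlib sketch of that arithmetic; the ~330 million US population is an assumed round number for the head count:]

```python
import math

MEAN, SD = 100.0, 15.0  # conventional IQ norming

def normal_cdf(x, mean=MEAN, sd=SD):
    """P(score <= x) under a normal distribution, via the error function."""
    return 0.5 * (1.0 + math.erf((x - mean) / (sd * math.sqrt(2.0))))

share_under_85 = normal_cdf(85)  # one standard deviation below the mean
print(f"share under 85: {share_under_85:.1%}")  # 15.9%, the ~16% cited

# Head count, assuming a US population of roughly 330 million:
print(f"about {share_under_85 * 330e6 / 1e6:.0f} million people")
```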

01:31:57 Is this something we wanna ignore?

01:31:59 Does this have any, what is the Venn diagram between,

01:32:03 you know, when you have people with IQs under 85,

01:32:07 and you have achievement in school or achievement in life?

01:32:13 There’s a lot of overlap there.

01:32:16 This is why, to go back to the IQ pill,

01:32:18 if there were a way to shift that curve toward the higher end,

01:32:27 that would have a big impact.

01:32:30 If I could maybe, before we talk about the impact on life

01:32:34 and so on, some of the criticisms of the bell curve.

01:32:38 So Steven Jay Gould wrote that the bell curve

01:32:41 rests on four incorrect assumptions.

01:32:44 It would be just interesting to get your thoughts

01:32:46 on the four assumptions, which are,

01:32:48 intelligence must be reducible to a single number,

01:32:51 intelligence must be capable of rank ordering people

01:32:54 in a linear order,

01:32:56 intelligence must be primarily genetically based,

01:32:59 and intelligence must be essentially immutable.

01:33:03 Maybe not as criticisms, but as thoughts about intelligence.

01:33:09 Oh yeah, we could spend a lot of time on him.

01:33:13 On Steven Jay Gould?

01:33:14 Yes.

01:33:15 He wrote that in what, about 1985, 1984?

01:33:20 His views were overtly political, not scientific.

01:33:25 He was a scientist,

01:33:27 but his views on this were overtly political,

01:33:30 and I would encourage people listening to this,

01:33:33 if they really want to understand his criticisms,

01:33:38 they should just Google what he had to say,

01:33:44 and Google the scientific reviews of his book,

01:33:49 The Mismeasure of Man,

01:33:51 and they will take these statements apart.

01:33:54 They were wrong, not only were they wrong,

01:33:57 but when he asserted in the first edition

01:34:00 that there was no biological basis essentially to IQ,

01:34:05 by the time the second edition came around,

01:34:08 there were studies of MRIs showing that brain size,

01:34:13 brain volume were correlated to IQ scores,

01:34:16 which he declined to put in his book.

01:34:19 So I’m learning a lot today.

01:34:21 I didn’t know actually the extent of his work.

01:34:25 I was just using a few little snippets of criticism.

01:34:28 That’s interesting.

01:34:29 There was a battle here.

01:34:30 He wrote a book, Mismeasure of Man,

01:34:32 that’s missing a lot of the scientific grounding.

01:34:36 His book is highly popular in colleges today.

01:34:39 You can find it in any college bookstore

01:34:41 under assigned reading.

01:34:42 It’s highly popular.

01:34:44 The Mismeasure of Man?

01:34:45 Yes, highly influential.

01:34:46 Can you speak to the Mismeasure of Man?

01:34:48 I’m undereducated about this.

01:34:50 So is this the book basically criticizing the ideas in the bell curve?

01:34:54 Yeah, where those four things came from.

01:34:57 And it is a book that was taken apart point by point

01:35:04 by a number of people who actually understood the data.

01:35:08 And he didn’t care.

01:35:09 Yeah.

01:35:10 He didn’t care.

01:35:11 He didn’t modify anything.

01:35:12 Listen, because this is such a sensitive topic,

01:35:16 like I said, I believe the impact of the work,

01:35:24 because it is misinterpreted, has to be considered.

01:35:28 Because it’s not just going to be scientific discourse,

01:35:31 it’s going to be political discourse,

01:35:32 there’s going to be debates,

01:35:34 there’s going to be politically motivated people

01:35:39 that will use messages in each direction,

01:35:42 make something like the book into the enemy

01:35:45 or into support for one’s racist beliefs.

01:35:52 And so I think you have to consider that.

01:35:55 But it’s difficult because Nietzsche was used by Hitler

01:35:59 to justify a lot of his beliefs.

01:36:02 And it’s not exactly on Nietzsche to anticipate Hitler

01:36:09 or how his ideas will be misinterpreted and used for evil.

01:36:12 But there is a balance there.

01:36:14 So I understand.

01:36:15 This is really interesting.

01:36:16 I didn’t know.

01:36:17 Is there any criticism of the book you find compelling

01:36:20 or interesting or challenging to you from a scientific perspective?

01:36:23 There were factual criticisms about the nature of the statistics

01:36:29 that were used, the statistical analyses.

01:36:32 These are more technical criticisms.

01:36:34 And they were addressed by Murray in a couple of articles

01:36:38 where he took all the criticisms and spoke to them.

01:36:41 And people listening to this podcast

01:36:44 can certainly find all those online.

01:36:47 And it’s very interesting.

01:36:48 Murray went on to write some additional books,

01:36:52 two in the last couple of years, one about human diversity

01:36:57 where he goes through the data refuting the idea that race

01:37:02 is only a social construct with no biological meaning.

01:37:07 He discusses the data.

01:37:09 It’s a very good discussion.

01:37:11 You don’t have to agree with it.

01:37:12 But he presents data in a cogent way.

01:37:16 And he talks about the critics of that.

01:37:19 And he talks about their data in a cogent, nonpersonal way.

01:37:23 It’s a very informative discussion.

01:37:26 The book is called Human Diversity.

01:37:28 He talks about race.

01:37:29 And he talks about gender, same thing, about sex differences.

01:37:34 And more recently, he’s written what

01:37:37 might be his final say on this, a book called Facing Reality

01:37:43 where he talks about this again.

01:37:46 So he can certainly defend himself.

01:37:49 He doesn’t need me to do that.

01:37:52 But I would urge people who have heard

01:37:55 about him and the bell curve and who

01:37:58 think they know what’s in it, you are likely incorrect.

01:38:03 And you need to read it for yourself.

01:38:06 But it is, scientifically, it’s a serious subject.

01:38:12 It’s a difficult subject.

01:38:13 Ethically, it’s a difficult subject.

01:38:15 Everything you said here, calmly and thoughtfully,

01:38:19 is difficult. It’s difficult for me

01:38:21 to even consider that G factor exists.

01:38:26 I don’t mean from like that somehow G factor is inherently

01:38:29 racist or sexist or whatever.

01:38:32 It’s just it’s difficult in the way

01:38:35 that considering the fact that we die one day is difficult.

01:38:38 That we are limited by our biology is difficult.

01:38:42 And at least from an American perspective,

01:38:47 you would like to believe that everything

01:38:49 is possible in this world.

01:38:51 Well, that leads us to what I think

01:38:56 we should do with this information.

01:38:59 And what I think we should do with this information

01:39:03 is unusual.

01:39:07 Because I think what we need to do

01:39:09 is fund more neuroscience research on the molecular

01:39:13 biology of learning and memory.

01:39:16 Because one definition of intelligence

01:39:22 is based on how much you can learn

01:39:24 and how much you can remember.

01:39:27 And if you accept that definition of intelligence,

01:39:30 then there are molecular studies going on now,

01:39:35 and Nobel Prizes being won on molecular biology

01:39:40 or molecular neurobiology of learning and memory.

01:39:45 Now, the step those researchers, those scientists

01:39:49 need to take when it comes to intelligence

01:39:53 is to focus on the concept of individual differences.

01:39:58 Intelligence research has individual differences

01:40:03 as its heart because it assumes that people

01:40:08 differ on this variable.

01:40:10 And those differences are meaningful

01:40:13 and need understanding.

01:40:15 Cognitive psychologists who have morphed

01:40:19 into molecular biologists studying learning and memory

01:40:23 hate the concept of individual differences historically.

01:40:27 Some now are coming around to it.

01:40:30 I once sat next to a Nobel Prize winner

01:40:34 for his work on memory.

01:40:37 And I asked him about individual differences.

01:40:41 And he said, don’t go there.

01:40:42 It’ll set us back 50 years.

01:40:46 But I said, don’t you think they’re

01:40:48 the key, though, to understand?

01:40:50 Why can some people remember more than others?

01:40:53 He said, you don’t want to go there.

01:40:55 I think the 21st century will be remembered

01:40:58 by the technology and the science that

01:41:02 goes to individual differences.

01:41:04 Because we have now data.

01:41:05 We have now the tools to much, much better

01:41:07 to start to measure, start to estimate,

01:41:10 not just through tests and IQ-test-type

01:41:13 of things, sort of outside-the-body kinds of measures,

01:41:18 but measuring all kinds of stuff about the body.

01:41:20 So yeah, truly go into the molecular biology,

01:41:23 to the neurobiology, to the neuroscience.

01:41:27 Let me ask you about life.

01:41:31 How does intelligence correlate with or lead to

01:41:36 or has anything to do with career success?

01:41:39 You’ve mentioned these kinds of things.

01:41:40 And is there any data?

01:41:43 You had an excellent conversation

01:41:44 with Jordan Peterson, for example.

01:41:46 Is there any data on what intelligent

01:41:49 means for success in life?

01:41:53 Success in life.

01:41:54 There is a tremendous amount of validity data

01:42:00 that looked at intelligence test scores and various measures

01:42:08 of life success.

01:42:11 Now, of course, life success is a pretty broad topic.

01:42:17 And not everybody agrees on what success means.

01:42:22 But there’s general agreement on certain aspects of success

01:42:28 that can be measured.

01:42:33 Including life expectancy, like you said.

01:42:35 Life expectancy.

01:42:36 Now, there’s life success.

01:42:42 Life expectancy, I mean, that is such an interesting finding.

01:42:47 But IQ scores are also correlated to things like income.

01:42:54 Now, OK, so who thinks income means you’re successful?

01:42:59 That’s not the point.

01:43:01 The point is that income is one empirical measure

01:43:06 in this culture that says something

01:43:09 about your level of success.

01:43:11 You can define success in ways that

01:43:13 have nothing to do with income.

01:43:15 You can define success based on your evolutionary natural

01:43:21 selection success.

01:43:25 And even that, by the way,

01:43:29 is correlated to IQ in some studies.

01:43:33 So however you want to define success, IQ is important.

01:43:42 It’s not the only determinant.

01:43:44 People get hung up on, well, what about personality?

01:43:46 What about so called emotional intelligence?

01:43:49 Yes, all those things matter.

01:43:51 The thing that matters empirically,

01:43:54 the single thing that matters the most

01:43:56 is your general ability, your general mental intellectual

01:44:01 ability, your reasoning ability.

01:44:03 And the more complex your vocation,

01:44:07 the more complex your job, the more G matters.

01:44:11 G doesn’t matter in a lot of occupations that

01:44:14 don’t require complex thinking.

01:44:17 And there are occupations like that, and G doesn’t matter.

01:44:20 Within an occupation, the G might not matter so much.

01:44:28 So that if you look at all the professors at MIT

01:44:36 and had a way to rank order them,

01:44:40 there’s a ceiling effect is what I’m saying.

01:44:43 That, you know.

01:44:45 Also, when you get past a certain threshold,

01:44:47 then there’s impact on wealth, for example,

01:44:49 or career success, however that’s

01:44:52 defined in each individual discipline.

01:44:54 But after a certain point, it doesn’t matter.

01:44:56 Actually, it does matter in certain things.

01:44:59 So for example, there is a very classic study

01:45:04 that was started at Johns Hopkins when

01:45:07 I was a graduate student there.

01:45:08 And I actually worked on this study at the very beginning.

01:45:11 It’s the study of mathematically and scientifically

01:45:13 precocious youth.

01:45:15 And they gave junior high school students

01:45:20 age 11 and 12 the standard SAT math exam.

01:45:27 And they found a very large number of students

01:45:31 scored very high on this exam.

01:45:33 Not a large number.

01:45:35 I mean, they found many students when

01:45:37 they cast the net to all of Baltimore.

01:45:40 They found a number of students who

01:45:42 scored as high on the SAT math when

01:45:45 they were 12 years old as incoming Hopkins freshmen.

01:45:50 And they said, gee, now this is interesting.

01:45:53 What shall we do now?

01:45:56 And on a case by case basis, they

01:46:00 got some of those kids into their local community college

01:46:03 math programs.

01:46:06 Many of those kids went on to be very successful.

01:46:10 And now there’s a 50 year follow up of those kids.

01:46:14 And it turns out these kids were in the top 1%.

01:46:21 So everybody in this study is in the top 1%.

01:46:24 If you take that group, that rarefied group,

01:46:28 and divide them into quartiles so that you have the top 25%

01:46:33 of the top 1% and the bottom 25% of the top 1%,

01:46:39 you can find differences on measurable variables of success.

01:46:48 The top quartile does better than the bottom quartile

01:46:51 in the top 1%.

01:46:53 They have more patents.

01:46:54 They have more publications.

01:46:56 They have more tenure at universities.

01:46:59 And this is based on, you’re dividing them

01:47:03 based on their score at age 12.
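The quartile logic here can be sketched numerically. This is purely an illustration on the conventional IQ scale (mean 100, SD 15) — the study itself selected on SAT-math scores at age 12, so these cutoffs are not the study's own numbers:

```python
from statistics import NormalDist

# Illustrative sketch only: the study selected on SAT-math at age 12,
# not adult IQ, but the quartile logic can be shown on the standard
# IQ scale (mean 100, SD 15).
iq = NormalDist(mu=100, sigma=15)

# Percentile boundaries that split the top 1% into quartiles:
# the 99th, 99.25th, 99.5th, and 99.75th population percentiles.
quartile_cutoffs = [iq.inv_cdf(0.99 + 0.0025 * k) for k in range(4)]

for k, cutoff in enumerate(quartile_cutoffs, start=1):
    print(f"quartile {k} of the top 1% starts at about IQ {cutoff:.1f}")
```

On this scale the whole top 1% starts near 135, while the "top 25% of the top 1%" begins around 142 — a reminder of how thin the slices being compared are.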

01:47:05 I wonder how much interesting data

01:47:10 is in the variability in the differences.

01:47:12 So but that’s really, oh, boy.

01:47:16 That’s very interesting.

01:47:17 But it’s also, I don’t know, somehow painful.

01:47:21 I don’t know why it’s so painful that the G

01:47:25 factor is so determinant, even within that rarefied top percent.

01:47:31 This is interesting that you find that painful.

01:47:32 Do you find it painful that people with charisma

01:47:37 are very successful, can be very successful in life,

01:47:40 even though they have no other attributes, other than that they’re

01:47:43 famous and people like them?

01:47:45 Do you find that painful?

01:47:47 Yes, if that charisma is untrainable.

01:47:51 So one of the things, again, this

01:47:53 is like I learned psychology from the Johnny Depp trial.

01:47:56 But one of the things the psychologist, the personality

01:48:01 psychologist, you can maybe speak to this

01:48:03 because you had an interest in this for a time,

01:48:07 is she was saying that personality, technically

01:48:12 speaking, is the thing that doesn’t change over a lifetime.

01:48:16 It’s the thing you’re, I don’t know if she was actually

01:48:20 implying that you’re born with it.

01:48:21 Well, it’s a trait.

01:48:22 It’s a trait, not a state.

01:48:24 It’s a trait that’s relatively stable over time.

01:48:27 I think that’s generally correct.

01:48:28 So to the degree your personality

01:48:31 is stable over time, yes, that too is painful.

01:48:36 Because what’s not painful is the thing,

01:48:40 if I’m fat and out of shape, I can

01:48:42 exercise and become healthier in that way.

01:48:47 If my diet is a giant mess and that’s

01:48:51 resulting in some kind of conditions

01:48:53 that my body is experiencing, I can fix that

01:48:55 by having a better diet.

01:48:58 That’s where my actions, my willed actions,

01:49:02 can make a change.

01:49:03 If charisma is part of the personality,

01:49:07 the part of charisma that

01:49:10 is stable, yeah, yeah, that’s painful too.

01:49:15 Because it’s like, oh shit, I’m stuck with this.

01:49:18 I’m stuck with this.

01:49:19 Well, and this pretty much generalizes

01:49:22 to every aspect of your being.

01:49:24 This is who you are.

01:49:26 You’ve got to deal with it.

01:49:27 And a realistic appreciation for this,

01:49:29 of course,

01:49:32 undermines the fairly recent idea prevalent in this country

01:49:40 that if you work hard, you can be anything you want to be,

01:49:44 which has morphed from the original idea

01:49:47 that if you work hard, you can be successful.

01:49:50 Those are two different things.

01:49:53 And now we have if you work hard,

01:49:57 you can be anything you want to be.

01:50:00 This is completely unrealistic.

01:50:03 Sorry.

01:50:04 It just is.

01:50:05 Now, you can work hard and be successful.

01:50:06 There’s no question.

01:50:08 But you know what?

01:50:09 I could work very hard, and I am not

01:50:12 going to be a successful theoretical physicist.

01:50:16 I’m just not.

01:50:18 That said, I mean, we should, because we

01:50:21 had this conversation already, but it’s good to repeat.

01:50:26 The fact that you’re not going to be

01:50:27 a theoretical physicist is not judgment

01:50:31 on your basic humanity.

01:50:32 We’re returning again to the “all men, which means

01:50:36 men and women, are created equal.”

01:50:39 So again, some of the differences

01:50:40 we’re talking about in quote unquote success, wealth,

01:50:47 whether you win a Nobel Prize or not,

01:50:50 that doesn’t put a measure on your basic humanity

01:50:55 and basic value and even goodness of you

01:51:00 as a human being.

01:51:02 Because your basic role and value in society

01:51:06 is largely within your control.

01:51:11 Unlike some of these measures that we’re talking about.

01:51:16 It’s good to remember this.

01:51:19 One question about the Flynn effect.

01:51:22 What is it?

01:51:23 Are humans getting smarter over the years, over the decades,

01:51:27 over the centuries?

01:51:28 The Flynn effect is named for James Flynn, who passed away about a year

01:51:33 ago. He published a set of analyses going back

01:51:42 a couple of decades, to when he first noticed this,

01:51:46 that IQ scores, when you looked over the years,

01:51:51 seemed to be drifting up.

01:51:54 Now, this was not unknown to the people who make the test

01:51:58 because they renorm the test periodically

01:52:02 and they have to renorm the test periodically

01:52:05 because what 10 items correct meant

01:52:09 relative to other people 50 years ago

01:52:13 is not the same as what 10 items correct means relative to other people today.

01:52:18 People are getting more things correct.

01:52:21 Now, the scores have been drifting up about three points.

01:52:24 IQ scores have been drifting up about three points per decade.

01:52:30 This is not a personal effect.

01:52:31 This is a cohort effect.

01:52:34 Well, it’s not for an individual, but

01:52:37 for the world. How do you explain that?

01:52:38 So what is that?

01:52:39 And this has presented intelligence researchers

01:52:42 with a great mystery.

01:52:44 Two questions.

01:52:46 First, is it an effect on the 50% of the variance that’s

01:52:51 the G factor or on the other 50%?

01:52:55 And there’s evidence that it is a G factor effect.

01:52:59 And second, what on earth causes this?

01:53:02 And doesn’t this mean intelligence and G factor

01:53:05 cannot be genetic because the scale of natural selection

01:53:10 is much, much longer than a couple of decades?

01:53:15 And so it’s been used to try to undermine the idea

01:53:19 that there can be a genetic influence on intelligence.

01:53:24 But certainly, the Flynn effect

01:53:28 can affect the nongenetic aspects of intelligence

01:53:32 because genes account for maybe 50% of the variance.

01:53:37 Maybe higher, could be as high as 80% for adults,

01:53:40 but let’s just say 50% for discussion.

01:53:46 So the Flynn effect, it’s still a mystery.

01:53:50 It’s still a mystery.

01:53:51 That’s interesting.

01:53:51 It’s still a mystery, although the evidence is coming out.

01:53:54 I told you before I edited a journal on intelligence,

01:53:56 and we’re doing a special issue in honor of James Flynn.

01:54:00 So I’m starting to see papers now on really

01:54:03 the latest research on this.

01:54:06 I think most people who specialize

01:54:08 in this area of trying to understand the Flynn effect

01:54:12 are coming to the view based on data

01:54:16 that it has to do with advances in nutrition and health care.

01:54:20 And there’s also evidence that the effect is slowing down

01:54:25 and possibly reversing.

01:54:27 Oh, boy.

01:54:28 So how would nutrition and health,

01:54:30 so nutrition would still be connected to the G factor.

01:54:36 So nutrition as it relates to the G factor,

01:54:38 so the biology that leads to the intelligence.

01:54:42 Yes.

01:54:42 That would be the claim.

01:54:43 Like the hypothesis being tested by the research.

01:54:50 Yes.

01:54:50 And there’s some evidence from infants

01:54:54 that nutrition has made a difference.

01:54:58 So it’s not an unreasonable connection.

01:55:02 But does it negate the idea that there’s a genetic influence?

01:55:05 Not logically at all.

01:55:07 But it is very interesting.

01:55:09 So that if you take an IQ test today but you take the score

01:55:17 and use the tables that were available in 1940,

01:55:22 you’re going to wind up with a much higher IQ number.

01:55:27 So are we really smarter than a couple of generations ago?

01:55:32 No, but we might be able to solve problems a little better.

01:55:38 And make use of our G because of things like Sesame Street

01:55:43 and other curricula in school.

01:55:45 More people are going to school.

01:55:49 So there are a lot of factors here to disentangle.
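The renorming arithmetic above can be sketched directly. The roughly-3-points-per-decade drift is the figure from the conversation; the function, the assumption of a steady linear drift, and the specific years are a hypothetical illustration:

```python
# Hedged sketch: assumes a steady Flynn drift of ~3 IQ points per
# decade (the figure mentioned above) and nothing else.
DRIFT_PER_DECADE = 3.0

def iq_against_old_norms(iq_today: float, norm_year: int, test_year: int) -> float:
    """Re-express a present-day IQ score against an older norming table."""
    decades = (test_year - norm_year) / 10
    return iq_today + DRIFT_PER_DECADE * decades

# An average scorer today (IQ 100) read against 1940 tables:
print(iq_against_old_norms(100, 1940, 2020))  # -> 124.0
```

Which is why the same raw performance "winds up with a much higher IQ number" on old tables, even though nobody's G has changed.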

01:55:53 It’s fascinating though.

01:55:55 It’s fascinating that there are no clear answers yet.

01:55:58 That as a population, we’re getting smarter.

01:56:02 When you just zoom out, that’s what it looks like.

01:56:04 As a population, we’re getting smarter.

01:56:06 And it’s interesting to see what the effects of that are.

01:56:08 I mean, this raises the question.

01:56:10 We’ve mentioned it many times but haven’t clearly addressed it,

01:56:14 which is nature versus nurture question.

01:56:17 So how much of intelligence is nature?

01:56:20 How much of it is nurture?

01:56:21 How much of it is determined by genetics versus environment?

01:56:25 All of it.

01:56:26 All of it is genetics.

01:56:28 No, all of it is nature and nurture.

01:56:32 So yes.

01:56:33 Yes.

01:56:34 Okay.

01:56:36 But how much of the variance can you apportion to either?

01:56:44 Most of the people who work in this field say that the framing of that, if the question

01:56:51 is framed that way, it can’t be answered because nature and nurture are not two independent

01:56:57 influences.

01:56:59 They interact with each other.

01:57:01 And understanding those interactions is so complex that many behavioral geneticists say

01:57:10 it is today impossible and always will be impossible to disentangle that, no matter

01:57:17 what kind of advances there are in DNA technology and genomic informatics.

01:57:24 But there’s still, to push back on that, that same intuition from behavioral geneticists

01:57:31 would lead me to believe that there cannot possibly be a stable G factor because it’s

01:57:37 super complex.

01:57:39 Many of them would assert that as a logical outcome.

01:57:45 But because I believe there is a stable G factor from lots of sources of data, not just

01:57:51 one study, but lots of sources of data over decades, I am more amenable to the idea that

01:58:01 whatever interactions between genes and environment exist, they can be explicated, they can be

01:58:09 studied, and that information can be used as a basis for molecular biology of intelligence.

01:58:19 Yeah, and let me ask this exact question, because doesn’t the stability of the G factor give

01:58:27 you at least a hint that there is a biological basis for intelligence?

01:58:33 Yes, I think it’s clear that the fact that an IQ score is correlated to things like thickness

01:58:42 of your cortex, that it’s correlated to glucose metabolic rate in your brain, that identical

01:58:55 twins reared apart are highly similar in their IQ scores.

01:59:02 These are all important observations that indicate, not just suggest, but indicate that

01:59:12 there’s a biological basis.

01:59:13 And does anyone believe intelligence has nothing to do with the brain?

01:59:18 I mean, it’s so obvious.

01:59:21 Well, indirectly it definitely has to do with it, but the question is, is it the environment interacting

01:59:26 with the brain or is it the actual raw hardware of the brain?

01:59:34 Well some would say that the raw hardware of the brain as it develops from conception

01:59:45 through adulthood, or at least through the childhood, that that so called hardware that

01:59:52 you are assuming is mostly genetic, in fact, is not as deterministic as you might think,

02:00:01 but it is probabilistic and what affects the probabilities are things like the intrauterine environment

02:00:09 and other factors like that, including chance.

02:00:15 That chance affects the way the neurons are connecting during gestation.

02:00:22 It’s not, hey, it’s preprogrammed.

02:00:26 So there is push back on the concept that genes provide a blueprint, that it’s a lot

02:00:35 more fluid.

02:00:36 Well, but also, yeah, so there’s a lot, a lot, a lot happens in the first few months

02:00:44 of development.

02:00:47 So in nine months inside the mother’s body and in the few months afterwards, there’s

02:00:57 a lot of fascinating stuff, like including chance and luck, like you said, how things

02:01:02 connect up.

02:01:04 The question is, afterwards, in the plasticity of the brain, how much adjustment there is

02:01:08 relative to the environment, how much that affects the G factor. But that’s where the

02:01:14 whole conclusion of the studies that we’ve been talking about is that it seems to have less

02:01:19 and less of an effect, pretty quickly.

02:01:23 As yes, and I do think there is more of a genetic, by my view, and I’m not an expert

02:01:30 on this, I mean, genetics is a highly technical and complex subject.

02:01:34 I am not a geneticist, not a behavioral geneticist, but my reading of this, my interpretation

02:01:41 of this is that there is a genetic blueprint, more or less, and that has a profound influence

02:01:50 on your subsequent intellectual development, including the G factor.

02:01:56 And that’s not to say things can’t happen. I mean, if you think of it as genes providing

02:02:03 a potential, fine, and then various variables impact that potential, and every parent of

02:02:12 a newborn, implicitly or explicitly, wants to maximize that potential.

02:02:19 This is why you buy educational toys.

02:02:21 This is why you pay attention to organic baby food.

02:02:25 This is why you do all these things, because you want your baby to be as healthy and as

02:02:31 smart as possible, and every parent will say that.

02:02:36 Is there a case to be made, can you steel man the case, that genetics is a very tiny

02:02:45 component of all of this, and the environment is essential?

02:02:49 I don’t think the data supports that genetics is a tiny component.

02:02:53 I think the data support the idea that genetics is a very important, and I don’t

02:02:58 say component, I say influence, a very important influence, and the environment is a lot less

02:03:05 than people believe.

02:03:07 Most people believe environment plays a big role.

02:03:10 I’m not so sure.

02:03:11 I guess what I’m asking you is, can you see where what you just said might be wrong?

02:03:19 Can you imagine a world, and what kind of evidence would you need to see to say, you

02:03:27 know what, the intuition, the studies so far, they were pointing in the wrong direction.

02:03:31 So one of the cool things we have now more and more is we’re getting more and more data,

02:03:36 and the rate of the data is escalating because of the digital world.

02:03:41 So when you start to look at a very large scale of data, both on the biology side and

02:03:48 the social side, we might be discovering some very counterintuitive things about society.

02:03:53 We might see the edge cases that reveal that if we actually scale those edge cases and

02:04:00 they become like the norm, that we’ll have a complete shift in our view, like you’ll see the G

02:04:08 factor be able to be modified throughout life, in the teens and in later life.

02:04:15 So is it any case you can make or for where your current intuitions are wrong?

02:04:20 Yes, and it’s a good question because I think everyone should always be asked what evidence

02:04:25 would change your mind.

02:04:28 It’s certainly not only a fair question, it is really the key question for anybody working

02:04:32 on any aspect of science.

02:04:36 I think that if environment was very important, we would have seen it clearly by now.

02:04:45 It would have been obvious that school interventions, compensatory education, early childhood education,

02:04:53 all these things that have been earnestly tried in well funded, well designed studies

02:04:59 would show some effect, and they don’t.

02:05:02 What if the school, the way we’ve tried school, compensatory school sucks and we need to do

02:05:08 better?

02:05:09 That’s what everybody said at the beginning.

02:05:10 That’s what everybody said to Jensen.

02:05:11 They said, well, maybe we need to start earlier.

02:05:15 Maybe we need not do prekindergarten, but pre, prekindergarten.

02:05:20 It’s always an infinite regress of, well, maybe we didn’t get it right.

02:05:24 But after decades of trying, 50 years, 50 or 60 years of trying, surely something would

02:05:33 have worked to the point where you could actually see a result and not need a probability level

02:05:39 at 0.05 on some comparison of means.

02:05:42 So that’s why I, that’s the kind of evidence that would change my mind.

02:05:49 Population-level interventions like schooling, where you would see that this actually has

02:05:56 an effect.

02:05:57 Yes.

02:05:58 And when you take adopted kids and they grow up in another family and you find out when

02:06:04 those adopted kids are adults, their IQ scores don’t correlate with the IQ scores of their

02:06:09 adoptive parents, but they do correlate with the IQ scores of their biological parents

02:06:15 whom they’ve never met.

02:06:18 I mean, these are important, these are powerful observations.

02:06:22 And it would be convincing to you if the reverse was true.

02:06:26 Yes.

02:06:27 That would be more.

02:06:28 And there is some data on adoption that indicates that the adopted children are moving a little

02:06:35 bit more toward their adoptive parents.

02:06:40 But to me it’s overwhelming. I have this concept called the weight of evidence where

02:06:47 I don’t interpret any one study too much.

02:06:50 The weight of evidence tells me genes are important.

02:06:53 But what does that mean?

02:06:54 What does it mean that genes are important?

02:06:57 Knowing that gene expression, genes don’t express themselves in a vacuum, they express

02:07:03 themselves in an environment.

02:07:05 So the environment has to have something to do with it, especially if the best genetic

02:07:10 estimates of the amount of variance are around 50%, or even if it’s as high as 80%, it still

02:07:17 leaves 20% nongenetic.

02:07:21 Now maybe that is all luck.

02:07:24 Maybe that’s all chance.

02:07:25 I could believe that, I could easily believe that.
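The variance bookkeeping in this exchange is simple arithmetic. The 50% and 80% heritability figures are the ones mentioned above; the SD of 15 is how IQ scales are normed by construction:

```python
# Sketch of the variance split being discussed: heritability h^2 is
# the share of total IQ variance attributed to genes; the remainder
# covers environment, interactions, and chance.
IQ_SD = 15.0  # IQ scales are normed to SD 15

def variance_split(heritability: float) -> dict:
    total_var = IQ_SD ** 2  # 225 squared IQ points
    genetic = heritability * total_var
    return {"genetic": genetic, "nongenetic": total_var - genetic}

print(variance_split(0.50))
print(variance_split(0.80))
```

At 80% heritability, the nongenetic share is 45 of 225 squared points — the "20%" being discussed, whether that turns out to be environment, chance, or both.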

02:07:31 But I do think after 50 years of trying various interventions and nothing works, including

02:07:39 memory training, including listening to Mozart, including playing computer games, none of

02:07:44 that has shown any impact on intelligence test scores.

02:07:49 Is there data on the intelligence, the IQ of parents as it relates to the children?

02:07:57 Yes, and there is some genetic evidence of an interaction between the parents’ IQ and

02:08:07 the environment.

02:08:08 High IQ parents provide an enriched environment, which then can impact the child in addition

02:08:17 to the genes, it’s that environment.

02:08:20 So there are all these interactions that, think about the number of books in a household.

02:08:29 This was a variable that’s correlated with IQ and, well, why?

02:08:37 Especially if the kid never reads any of the books, it’s because more intelligent people

02:08:42 have more books in their house.

02:08:45 And if you’re more intelligent and there’s a genetic component to that, the child will

02:08:52 get those genes or some of those genes as well as the environment.

02:08:57 But it’s not the number of books in the house that actually directly impacts the child.

02:09:04 So the two scenarios on this are, you find, and this was used to get rid of the

02:09:11 SAT test, oh, the SAT score is highly correlated with the socioeconomic status of the parents.

02:09:18 So all you’re really measuring is how rich the parents are.

02:09:21 Okay, well, why are the parents rich?

02:09:27 And so the opposite kind of syllogism is that people who are very bright make more money,

02:09:37 they can afford homes in better neighborhoods so their kids get better schools.

02:09:44 Now the kids grow up bright.

02:09:47 Where in that chain of events does that come from?

02:09:51 Well, unless you have a genetically informative research design where you look at siblings

02:09:58 that have the same biological parents and so on, you can’t really disentangle all that.

02:10:05 Most studies of socioeconomic status and intelligence do not have a genetically informed

02:10:12 design.

02:10:13 So any conclusions they make about the causality of the socioeconomic status being the cause

02:10:20 of the IQ is a stretch.

02:10:25 And where you do find genetically informative designs, you find most of the variance in

02:10:33 your outcome measures is due to the genetic component.

02:10:38 And sometimes the SES adds a little, but the weight of evidence is it doesn’t add very

02:10:46 much variance to predict what’s going on beyond the genetic variance.

02:10:52 So when you actually look at it, and there aren’t that many studies that have genetically

02:10:58 informed designs, but when you do see those, the genes seem to have the advantage.

02:11:05 Sorry for the strange questions, but is there a connection between fertility or the number

02:11:13 of kids that you have and G factor?

02:11:16 So you know, the kind of conventional wisdom is people of maybe higher economic status

02:11:25 or something like that are having fewer children.

02:11:28 I just loosely hear these kinds of things.

02:11:30 Is there data that you’re aware of in one direction or another on this?

02:11:36 Strange questions always get strange answers.

02:11:39 Yes.

02:11:40 All right.

02:11:41 Do you have a strange answer for that strange question?

02:11:44 The answer is there were some studies that indicated that in families with more children, the

02:11:54 firstborn children would be more intelligent than the fourth or fifth or sixth.

02:12:00 It’s not clear that those studies hold up over time.

02:12:05 And of course what you see also is that families where there are multiple children, four, five,

02:12:14 six, seven, you know, really big families, the socioeconomic status of those families

02:12:23 usually in the modern age is not that high.

02:12:28 Maybe it used to be that the aristocracy had a lot of kids, I’m not sure exactly.

02:12:33 But there have been reports of correlations between IQ and fertility, but I’m not sure

02:12:44 that the data are very strong that the firstborn child is always the smartest.

02:12:50 It seems like there’s some data to that, but I’m not current on that.

02:12:54 How would that be explained?

02:12:55 That would be a nurture thing.

02:12:58 Well, it could be nurture, it could be the intrauterine environment, I mean, and this is why

02:13:08 this, you know, like many areas of science, you said earlier that there are a lot of gray

02:13:14 areas and no definitive answers.

02:13:21 This is not uncommon in science that the closer you look at a problem, the more questions

02:13:28 you get, not the fewer questions, because the universe is complicated.

02:13:35 And the idea that we have people on this planet who can study the first nanoseconds of the

02:13:42 Big Bang, that’s pretty amazing.

02:13:48 And I’ve always said that if they can study the first nanoseconds of the Big Bang, we

02:13:53 can certainly figure out something about intelligence that allows that.

02:13:58 I’m not sure what’s more complicated, the human mind or the physics of the universe.

02:14:06 It’s unclear to me.

02:14:08 I think we overemphasize.

02:14:09 Well, that’s a very humbling statement.

02:14:13 Maybe it’s a very human centric, egotistical statement that our mind is somehow super complicated,

02:14:18 but biology is a tricky one to unravel.

02:14:22 Consciousness, what is that?

02:14:27 I’ve always believed that consciousness and intelligence are the two real fundamental

02:14:34 problems of the human brain, and therefore I think they must be related.

02:14:41 Yeah, the hard problems, like, walk together, holding hands kind of idea.

02:14:49 You may not know this, but I did some of the early research on anesthetic drugs with brain

02:14:54 imaging trying to answer the question, what part of the brain is the last to turn off

02:14:58 when someone loses consciousness?

02:15:01 And is that the first part of the brain to turn on when consciousness is regained?

02:15:07 And I was working with an anesthesiologist named Mike Alkire, who was really brilliant

02:15:11 at this.

02:15:12 These were really the first studies of brain imaging using positron emission tomography

02:15:18 long before fMRI.

02:15:21 And you would inject a radioactive sugar that labeled the brain, and the harder the brain

02:15:28 was working, the more sugar it would take up, and then you could make a picture of glucose

02:15:33 use in the brain.

02:15:36 And he was amazing.

02:15:38 He managed to do this in normal volunteers he brought in and anesthetized as if they

02:15:44 were going into surgery.

02:15:49 He managed all the human subjects requirements on this research, and he was brilliant at

02:15:55 this.

02:15:57 And what we did is we had these normal volunteers come in on three occasions.

02:16:05 On one occasion, he gave them enough anesthetic drug so they were a little drowsy.

02:16:14 And on another occasion, they came in and he fully anesthetized them.

02:16:20 And he would say, Mike, can you hear me, and the person would say, uh, yeah.

02:16:31 And then we would scan people under no anesthetic condition.

02:16:37 So same person.

02:16:39 And we were looking to see if we could see the part of the brain turn off.

02:16:46 He subsequently tried to do this with fMRI, which has a faster time resolution, and you

02:16:51 could do it in real time as the person went under and then regain consciousness where

02:16:56 you couldn’t do that with PET.

02:16:57 You had to have three different occasions.

02:17:00 And the results were absolutely fascinating.

02:17:03 We did this with different anesthetic drugs, and different drugs impacted different parts

02:17:08 of the brain.

02:17:09 So we were naturally looking for the common one, and it seemed to have something to do

02:17:15 with the thalamus.

02:17:18 And consciousness, this was actual data on consciousness, actual consciousness.

02:17:25 What part of the brain turns on?

02:17:28 What part of the brain turns off?

02:17:30 It’s not so clear.

02:17:33 But it maybe has something to do with the thalamus.

02:17:35 The sequence of events seemed to have the thalamus in it.

02:17:41 Now here’s the question.

02:17:42 Are some people more conscious than others?

02:17:45 Are there individual differences in consciousness?

02:17:49 And I don’t mean it in the psychedelic sense.

02:17:53 I don’t mean it in the political consciousness sense.

02:17:55 I just mean it in everyday life.

02:17:57 Do some people go through everyday life more conscious than others?

02:18:01 And are those the people we might actually label more intelligent?

02:18:06 Now the other thing I was looking for is whether the parts of the brain we were seeing in the

02:18:11 anesthesia studies were the same parts of the brain we were seeing in the intelligence

02:18:16 studies.

02:18:17 Now, this was very complicated, expensive research.

02:18:22 We didn’t really have funding to do this.

02:18:24 We were trying to do it on the fly.

02:18:26 I’m not sure anybody has pursued this.

02:18:29 I’m retired now.

02:18:31 He’s gone on to other things.

02:18:34 But I think it’s an area of research that would be fascinating to see. There are a lot

02:18:41 more imaging studies now of consciousness.

02:18:43 I’m just not up on them.

02:18:45 But basically the question is, with newer imaging studies, to see, in a high-resolution

02:18:52 spatial and temporal way, which parts of the brain light up when you’re doing intelligence

02:18:59 tasks and which parts of the brain light up when you’re doing consciousness tasks, and

02:19:03 see the interplay between them, try to infer, that’s the challenge of neuroscience, without

02:19:09 understanding deeply, looking from the outside, try to infer something about how the whole

02:19:18 thing works.

02:19:19 Well, imagine this.

02:19:21 Here’s a simple question.

02:19:23 Does it take more anesthetic drug to have a person lose consciousness if their IQ is

02:19:33 140 than a person with an IQ of 70?

02:19:39 That’s an interesting way to study it.

02:19:40 Yeah.

02:19:41 I mean, if the answer to that is a stable yes, that’s very interesting.

02:19:48 So I tried to find out, and I went to some anesthesiology textbooks to see how you dose,

02:19:55 and they dose by weight.

02:19:59 And what I also learned, this is a little bit off subject, anesthesiologists are never

02:20:07 sure how deep you are.

02:20:10 And they usually tell by poking you with a needle and if you don’t jump, they tell the

02:20:14 surgeon to go ahead.

02:20:17 I’m not sure that’s literally true, but it’s…

02:20:20 Well, it might be very difficult to know precisely how deep you are.

02:20:26 It has to do with the same kind of measurements that you were doing with the consciousness.

02:20:31 It’s difficult to know.

02:20:34 So, before I lose my train of thought.

02:20:35 I couldn’t find in the textbooks anything about dosing by intelligence.

02:20:40 I asked my friend, the anesthesiologist, he said, no, he doesn’t know.

02:20:45 I said, can we do a chart review and look at people using their years of education as

02:20:52 a proxy for IQ?

02:20:54 Because if someone’s gone to graduate school, that tells you something.

02:20:58 You can make some inference as opposed to someone who didn’t graduate high school.

02:21:02 Can we do a chart review?

02:21:03 And he says, no, they never really put down the exact dose.

02:21:08 And no, he said, no.

02:21:10 So to this day, the simple question remains: does it take more anesthetic drug to put someone

02:21:18 under if they have a high IQ, or less?

02:21:23 It could go either way.

02:21:24 Because by the way, our early PET scan studies of intelligence found the unexpected result

02:21:33 of an inverse correlation between glucose metabolic rate and intelligence.

02:21:38 It wasn’t how much a brain area lit up.

02:21:43 How much it lit up was negatively correlated to how well they did on the test, which led

02:21:48 to the brain efficiency hypothesis, which is still being studied today.

02:21:54 And there’s more and more evidence that the efficiency of brain information processing

02:22:00 is more related to intelligence than just more activity.
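
As an aside, what an inverse correlation like the one just described looks like numerically can be sketched in a few lines of Python. The numbers below are invented purely for illustration, not data from the PET studies, and `pearson_r` is just a plain-Python helper defined here:

```python
# Toy illustration of the brain efficiency hypothesis: subjects with HIGHER
# test scores show LOWER activation (glucose metabolic rate), giving a
# negative Pearson correlation. All numbers are invented for demonstration.

def pearson_r(xs, ys):
    """Pearson correlation coefficient, plain Python."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

metabolic_rate = [9.1, 8.4, 7.9, 7.1, 6.3, 5.8]  # hypothetical, arbitrary units
test_score = [95, 102, 108, 115, 124, 131]       # hypothetical test scores

r = pearson_r(metabolic_rate, test_score)
print(f"r = {r:.2f}")  # strongly negative: more activity, worse performance
```

A negative `r` here is the "efficiency" signature: the brains that light up less do better on the test.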

02:22:08 Yeah, and it’ll be interesting. Again, this is totally a hypothesis, but in the relationship

02:22:14 between intelligence and consciousness, it’s not obvious that those two are linked, and if there’s a correlation,

02:22:22 they could be inversely correlated.

02:22:23 Wouldn’t that be funny?

02:22:26 What if the consciousness factor, the C factor, plus the G factor equals one?

02:22:38 It would be a nice trade off: how deeply you experience the world versus

02:22:43 how deeply you’re able to reason through the world.

02:22:48 What a great hypothesis.

02:22:51 Certainly somebody listening to this can do this study.

02:22:54 Even if it’s the aliens analyzing humans a few centuries from now, let me ask you from

02:22:59 an AI perspective, I don’t know how much you’ve thought about machines, but there’s the famous

02:23:08 Turing test, test of intelligence for machines, which is a beautiful, almost like a cute formulation

02:23:17 of intelligence that Alan Turing proposed.

02:23:24 Basically, with conversation, if you can fool a human into thinking that a machine is a human,

02:23:33 that passes the test. I suppose you could do a similar thing for humans.

02:23:40 If I can fool you that I’m intelligent, then that’s a good test of intelligence.

02:23:48 You’re talking to two people, and the test is saying who has a higher IQ.

02:24:02 It’s an interesting test, because maybe charisma can be very useful there, and you’re only

02:24:07 allowed to use conversation, which is the formulation of the Turing test.

02:24:11 Anyway, all that to say is what are good tests of intelligence for machines?

02:24:18 What do you think it takes to achieve human level intelligence for machines?

02:24:23 I have thought a little bit about this, but every time I think about these things, I rapidly

02:24:30 reach the limits of my knowledge and imagination.

02:24:37 When Alexa first came out, and I think there was a competing one, well, there was Siri

02:24:47 with Apple, and Google had Alexa.

02:24:50 No, no, Amazon had Alexa.

02:24:52 Amazon had Alexa.

02:24:53 Google has Google Home.

02:24:54 Google has something.

02:24:55 I proposed to one of my colleagues that he buy one of each, and then ask

02:25:04 them questions from an IQ test.

02:25:09 But it became apparent that they all searched the internet, so they all can find answers

02:25:15 to questions like how far is it between Washington and Miami, and repeat after me.

02:25:22 Now, if you said to Alexa, I’m going to say some numbers, repeat them backwards to

02:25:29 me,

02:25:30 I don’t know what would happen.

02:25:31 I’ve never done it.

02:25:33 So one answer to your question is you’re going to try it right now.

02:25:38 Let’s try it.

02:25:39 No.

02:25:40 Let’s try it.

02:25:41 No, no, no.

02:25:42 Yes, Siri.

02:25:43 So it would actually probably go to a Google search, and it would return all kinds of

02:25:47 confusing stuff.

02:25:49 It would fail.

02:25:50 Well, then I guess there is a test that it would fail.

02:25:53 Well, but that has to do more with the language of communication versus

02:26:02 the content.

02:26:03 So if you gave an IQ test to a person who doesn’t speak English, and the test was administered

02:26:09 in English, that’s not really a test of…

02:26:11 Well, let’s think about the computers that beat the Jeopardy champions.

02:26:15 Yeah, because I happen to know how those are programmed, those are very hard

02:26:21 coded, and there’s definitely a lack of intelligence there.

02:26:25 There is something like an IQ test for machines. There’s an artificial intelligence researcher,

02:26:36 Francois Chollet, he’s at Google, one of the seminal people in machine learning.

02:26:40 He also, as a fun aside thing, developed an IQ test for machines.

02:26:45 Oh, I haven’t heard that.

02:26:47 I’d just like to know about that.

02:26:49 I’ll actually email you this, because it’d be very interesting for you.

02:26:53 It doesn’t get much attention, because people don’t know what to do with it, but it deserves

02:26:59 a lot of attention. It’s basically a pattern type of test, where

02:27:06 one standard form is, you’re given three things, and you have to produce the fourth,

02:27:12 that kind of thing, so you have to understand the pattern here.

02:27:17 The interesting thing is, he’s not trying

02:27:28 to achieve high IQ, he’s aiming for a pretty low bar for IQ:

02:27:35 things that are kind of trivial for humans, and actually really tough for machines.

02:27:42 It’s playing with these concepts of symmetry, of counting: if I give you

02:27:49 one object, two objects, three objects, you’ll know the next one is four objects. You can

02:27:54 count them, you can cluster objects together, both visually and conceptually. We can

02:28:01 do all these things with our minds, things we take for granted, like the objectness of things.

02:28:07 You can figure out what spatially is an object and what isn’t, and we can play with

02:28:14 those ideas, and machines really struggle with that. So he really cleanly formulated

02:28:21 these IQ tests. I wonder what that would equate to in IQ for humans, probably

02:28:27 a very low IQ, but that’s exactly the kind of formulation: okay, we want to be able

02:28:33 to solve this, how do we solve this? He runs it as a challenge, and nobody’s been

02:28:38 able to do well on it. It’s similar to the Alexa Prize, where Amazon is hosting a conversational

02:28:44 challenge. Those kinds

02:28:51 of tests are interesting, because we take for granted all the ability of the human mind

02:28:58 to play with concepts, and to formulate concepts out of novel things, things we’ve

02:29:08 never seen before. I’ve talked to a few people

02:29:14 who design IQ tests online, they write IQ tests, and I was trying to get some

02:29:20 questions from them, and they spoke to the fact that they can’t really share questions

02:29:25 with you. First of all, it’s really hard work to come up with

02:29:31 questions, really, really hard work, it takes a lot of research. But it’s also

02:29:37 novelty generating, you’re constantly coming up with really new things, and part

02:29:46 of the point is that the questions are not supposed to be public, they’re supposed to be new

02:29:51 to you when you look at them. It’s interesting that the novelty is fundamental to the hardness

02:29:56 of the problem, at least a part of what makes the problem hard is that you’ve never seen

02:30:02 it before.

02:30:03 Right, that’s called fluid intelligence, as opposed to what’s called crystallized intelligence,

02:30:08 which is your knowledge of facts. You know things, but can you use those things to solve

02:30:16 a problem? Those are two different things.
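
A miniature sketch of the kind of pattern test described above, loosely in the spirit of Chollet's Abstraction and Reasoning Corpus (ARC): from a few input→output examples, pick the transformation that explains them all, then apply it to a new input. The rules and grids here are invented for illustration; the real benchmark is far harder.

```python
# Tiny ARC-flavored example: infer the transformation from training pairs,
# then apply it to a held-out input. All tasks and rules here are made up.

def flip_h(grid):   # mirror each row left-right
    return [row[::-1] for row in grid]

def flip_v(grid):   # mirror the rows top-bottom
    return grid[::-1]

def transpose(grid):
    return [list(col) for col in zip(*grid)]

CANDIDATE_RULES = {"flip_h": flip_h, "flip_v": flip_v, "transpose": transpose}

def infer_rule(examples):
    """Return the name of the first rule consistent with every example pair."""
    for name, fn in CANDIDATE_RULES.items():
        if all(fn(inp) == out for inp, out in examples):
            return name
    return None

# Training pairs: each output is the horizontal mirror of its input.
examples = [
    ([[1, 0], [0, 0]], [[0, 1], [0, 0]]),
    ([[1, 2], [3, 4]], [[2, 1], [4, 3]]),
]

rule = infer_rule(examples)
print(rule)
test_input = [[5, 6], [7, 8]]
print(CANDIDATE_RULES[rule](test_input))  # the inferred rule applied to a new grid
```

The point of the real benchmark is that the space of possible "rules" is open-ended and novel per task, which is exactly what makes it easy for humans and hard for machines.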

02:30:19 I don’t want to miss the opportunity to

02:30:25 talk about this, since we spoke about the neurobiology, the molecular biology of intelligence.

02:30:30 Do you think one day we’ll be able to modify the biology, or the genetics, of a person

02:30:39 to increase their intelligence? We started this conversation

02:30:45 by talking about a pill you could take. Do you think such a pill will exist?

02:30:49 Metaphorically, I do, and I am supremely confident that it’s possible because I am supremely

02:30:57 ignorant of the complexities of neurobiology, and so I have written that the nightmares

02:31:07 of neurobiologists, understanding the complexities, this cascade of events that happens at the

02:31:15 synaptic level, that these nightmares are what fuel some people to try to solve them.

02:31:25 So you have to be undaunted. I mean, yeah, this is not easy. Look, we’re

02:31:31 still trying to figure out cancer, it was only recently that they figured out why aspirin

02:31:38 works. These are not easy problems. But I also have the perspective that the history

02:31:47 of science is the history of solving problems that are extraordinarily complex.

02:31:57 And seem impossible at the time.

02:31:58 And seem impossible at the time.

02:32:01 And so when you look at companies like Neuralink, you have brain computer interfaces,

02:32:08 you start to delve into the human mind and start to talk about machines measuring but

02:32:12 also sending signals to the human mind, and you start to wonder what impact

02:32:19 that has on the G factor.

02:32:23 Modifying in small ways or in large ways the functioning, the mechanical, electrical, chemical

02:32:32 functioning of the brain.

02:32:34 I look at everything about the brain, there are different levels of explanation.

02:32:39 On one hand you have a behavioral level, but then you have brain circuitry, and then you

02:32:46 have neurons, and then you have dendrites, and then you have synapses, and then you have

02:32:57 the neurotransmitters, and the presynaptic and the postsynaptic terminals, and then you

02:33:06 have all the things that influence neurotransmitters, and then you have the individual differences

02:33:13 among people.

02:33:15 Yeah, it’s complicated, but 51 million people in the United States have IQs under 85 and

02:33:27 struggle with everyday life.

02:33:32 Shouldn’t that motivate people to take a look at this?

02:33:37 Yeah, but I just want to linger one more time that you have to remember that the science

02:33:46 of intelligence, the measure of intelligence is only a part of the human condition.

02:33:54 The thing that makes life beautiful and the creation of beautiful things in this world

02:33:59 is perhaps loosely correlated, but is not dependent entirely on intelligence.

02:34:08 Absolutely, I certainly agree with that.

02:34:12 So for anyone listening, I’m still not convinced that more intelligence is always

02:34:22 better if you want to create beauty in this world.

02:34:26 I don’t know.

02:34:27 Well, I didn’t say more intelligence is always better if you want to create beauty.

02:34:31 I just said all things being equal, more is better than less.

02:34:36 That’s all I mean.

02:34:37 Yeah, I just want to say that because, to me, one of

02:34:42 the things that makes life great is the opportunity to create beautiful things, and so I just

02:34:50 want to empower people to do that no matter what some IQ test says.

02:34:56 At the population level, we do need to look at IQ tests to help people and to also inspire

02:35:02 us to take on some of these extremely difficult scientific questions.

02:35:07 Do you have advice for young people in high school, in college, whether they’re thinking

02:35:16 about career or they’re thinking about a life they can be proud of?

02:35:20 Is there advice you can give whether they want to pursue psychology or biology or engineering

02:35:29 or they want to be artists and musicians and poets?

02:35:33 I can’t advise anybody on that level of what their passion is, but I can say if you’re

02:35:43 interested in psychology or if you’re interested in science and the science around the big

02:35:52 questions of consciousness and intelligence and psychiatric illness, we haven’t really

02:36:00 talked about brain illnesses and what we might learn from them.

02:36:07 If you are trying to develop a drug to treat Alzheimer’s disease, you are trying to develop

02:36:12 a drug to impact learning and memory, which are core to intelligence.

02:36:20 So it could well be that the so called IQ pill will come from a pharmaceutical company

02:36:26 trying to develop a drug for Alzheimer’s disease.

02:36:29 Because that’s exactly what you’re trying to do, right, yeah, just like you said.

02:36:33 What will that drug do in a college student that doesn’t have Alzheimer’s disease?

02:36:38 So I would encourage people who are interested in psychology, who are interested in science

02:36:47 to pursue a scientific career and address the big questions.

02:36:54 And the most important thing I can tell you if you’re going to be in kind of a research

02:37:03 environment is you’ve got to follow the data where the data take you.

02:37:07 You can’t decide in advance where you want the data to go.

02:37:10 And if the data take you to places that you don’t have the technical expertise to follow,

02:37:16 like you know, I would like to understand more about molecular biology, but I’m not

02:37:21 going to become a molecular biologist now.

02:37:24 But I know people who are, and my job is to get them interested to take their expertise

02:37:31 into this direction.

02:37:33 And that’s not so easy.

02:37:36 And if the data takes you to a place that’s controversial, that’s counterintuitive in

02:37:41 this world, I would say it’s probably a good idea to still push forward boldly,

02:37:52 but to communicate the interpretation of the results with skill, with compassion, with

02:38:01 a greater breadth of understanding of humanity, not just the science, and of the impact of the

02:38:07 results.

02:38:08 One famous psychologist wrote about this issue that somehow a balance has to be found between

02:38:16 pursuing the science and communicating it with respect to people’s sensitivities, the

02:38:22 legitimate sensitivities, somehow.

02:38:26 He didn’t say how.

02:38:27 Somehow.

02:38:28 Somehow.

02:38:29 And this is…

02:38:30 This sense of somehow, and of balance, is left up to the interpretation of the reader.

02:38:37 Let me ask you, you said big questions, the biggest, or one of the biggest, we already

02:38:44 talked about consciousness and intelligence, one of the most fascinating, one of the biggest

02:38:48 questions.

02:38:49 But let’s talk about the why.

02:38:51 Why are we here?

02:38:53 What’s the meaning of life?

02:38:54 I’m not going to tell you.

02:38:55 You know you’re not going to tell me?

02:38:56 This is very…

02:39:00 I’m going to have to wait for your next book.

02:39:03 The meaning of life.

02:39:07 We do the best we can to get through the day.

02:39:12 And then there’s just a finite number of days.

02:39:16 Are you afraid of the finiteness of it?

02:39:17 I think about it more and more as I get older.

02:39:21 Yeah, I do.

02:39:23 And it’s one of these human things, that it is finite, we all know it.

02:39:30 Most of us deny it and don’t want to think about it.

02:39:35 Sometimes you think about it in terms of estate planning, you try to do the rational thing.

02:39:42 Sometimes it makes you work harder because you know your time is more and more limited

02:39:46 and you want to get things done.

02:39:50 I don’t know where I am on that.

02:39:53 It is just one of those things that’s always in the back of my mind.

02:40:00 And I don’t think that’s uncommon.

02:40:02 Well, it’s just like the G factor and intelligence, it’s a hard truth that’s there.

02:40:09 And sometimes you kind of walk past it and you don’t want to look at it, but it’s still

02:40:15 there.

02:40:16 Yeah.

02:40:17 Yes, you can’t escape it.

02:40:20 And the thing about the G factor and intelligence is everybody knows this is true on a personal

02:40:28 daily basis.

02:40:31 Even if you think back to when you were in school, you know who the smart kids were.

02:40:39 When you are on the phone talking to a customer service representative, that in response to

02:40:44 your detailed question is reading a script back to you and you get furious at this.

02:40:52 Have you ever called this person a moron or wanted to call this person a moron?

02:40:56 You’re not listening to me.

02:40:58 Everybody has had the experience of dealing with people who they think are not at their

02:41:03 level.

02:41:05 It’s just common because that’s the way human beings are.

02:41:09 That’s the way life is.

02:41:11 But we also have a poor estimation of our own intelligence.

02:41:16 We’re not always great judges: our judgment of the character of other people

02:41:22 is not as good as a battery of tests.

02:41:29 That’s where bias comes in.

02:41:31 That’s where our history, our emotions, all of that comes in.

02:41:35 So, you know, people on the internet, you know, there’s such a thing as the internet

02:41:39 and people on the internet will call each other dumb all the time.

02:41:45 You know, the worry here is that we give up on people.

02:41:53 We put them in a bin just because of one interaction or some small number of interactions as if

02:42:00 that’s it.

02:42:01 They’re hopeless.

02:42:02 That’s just in their genetics.

02:42:03 But I think no matter what the science here says, once again, that does not mean we should

02:42:11 not have compassion for our fellow man.

02:42:15 That’s exactly what the science does say.

02:42:17 It’s not opposite of what the science says.

02:42:22 Everything I know about psychology, everything I’ve learned about intelligence, everything

02:42:30 points to the inexorable conclusion that you have to treat people as individuals respectfully

02:42:38 and with compassion.

02:42:40 Because through no fault of their own, some people are not as capable as others.

02:42:46 And if you want to turn a blind eye to it, if you want to come up with theories about why that

02:42:52 might be true, fine.

02:42:54 I would like to fix some of it as best I can.

02:42:58 And everybody is deserving of love.

02:43:01 Richard, this is a good way to end it, I think.

02:43:05 I’m just getting warmed up here.

02:43:07 I know.

02:43:08 I know you can go for another many hours, but to respect your extremely valuable time,

02:43:15 this is an amazing conversation.

02:43:16 Thank you for the Teaching Company lectures you’ve given, and for The Neuroscience of

02:43:23 Intelligence.

02:43:24 Thank you for everything you’re doing. It’s a difficult topic, a topic that’s controversial

02:43:29 and sensitive to people, and you push forward boldly and in a nuanced way. Just thank

02:43:35 you for everything you do.

02:43:37 And thank you for asking the big questions of intelligence, of consciousness.

02:43:42 Well thank you for asking me.

02:43:43 I mean, there’s nothing like good conversation on these topics.

02:43:47 Thanks for listening to this conversation with Richard Haier.

02:43:50 To support this podcast, please check out our sponsors in the description.

02:43:54 And now, let me leave you with some words from Albert Einstein.

02:43:57 It is not that I’m so smart, but I stay with the questions much longer.

02:44:04 Thank you for listening and hope to see you next time.