I wrote here about Epistemic Learned Helplessness, a super-useful idea based on an essay by Scott Alexander.
Well, the link is broken. And so are all the other links in all the other articles that point there.
Wayback Machine to the rescue. The page is on the Internet Archive, here.
But just in case it gets lost, I’ve copied it here. I don’t think Scott will mind.
[Epistemic Status | Probably I’m just coming at the bog-standard idea of compartmentalization from a different angle here. I don’t know if anyone else has noted before that compartmentalization can be a good thing, but I bet they have.]
A friend in business recently complained about his hiring pool, saying that he couldn’t find people with the basic skill of believing arguments. That is, if you have a valid argument for something, then you should accept the conclusion. Even if the conclusion is unpopular, or inconvenient, or you don’t like it. He told me a good portion of the point of CfAR was to either find or create people who would believe something after it had been proven to them.
And I nodded my head, because it sounded reasonable enough, and it wasn’t until a few hours later that I thought about it again and went “Wait, no, that would be the worst idea ever.”
I don’t think I’m overselling myself too much to expect that I could argue circles around the average high school dropout. Like I mean that on almost any topic, given almost any position, I could totally demolish her and make her look like an idiot. Reduce her to some form of “Look, everything you say fits together and I can’t explain why you’re wrong, I just know you are!” Or, more plausibly, “Shut up I don’t want to talk about this!”
And there are people who can argue circles around me. Not on any topic, maybe, but on topics where they are experts and have spent their whole lives honing their arguments. When I was young I used to read pseudohistory books; Immanuel Velikovsky’s Ages in Chaos is a good example of the best this genre has to offer. I read it and it seemed so obviously correct, so perfect, that I could barely bring myself to bother to search out rebuttals.
And then I read the rebuttals, and they were so obviously correct, so devastating, that I couldn’t believe I had ever been so dumb as to believe Velikovsky.
And then I read the rebuttals to the rebuttals, and they were so obviously correct that I felt silly for ever doubting.
And so on for several more iterations, until the labyrinth of doubt seemed inescapable. What finally broke me out wasn’t so much the lucidity of the consensus view as starting to sample different crackpots. Some were almost as bright and rhetorically gifted as Velikovsky, all presented insurmountable evidence for their theories, and all had mutually exclusive ideas. After all, Noah’s Flood couldn’t have been a cultural memory both of the fall of Atlantis and of a change in the Earth’s orbit, let alone of a lost Ice Age civilization or of megatsunamis from a meteor strike. So given that at least some of those arguments are wrong and all seemed practically proven, I am obviously just gullible in the field of ancient history. Given a total lack of independent intellectual steering power and no desire to spend thirty years building an independent knowledge base of Near Eastern history, I choose to just accept the ideas of the prestigious people with professorships in Archaeology rather than the universally reviled crackpots who write books about Venus being a comet.
I guess you could consider this a form of epistemic learned helplessness, where I know any attempt to evaluate the arguments is just going to be a bad idea, so I don’t even try. If you have a good argument that the Early Bronze Age worked completely differently from the way mainstream historians believe, I just don’t want to hear about it. If you insist on telling me anyway, I will nod, say that your argument makes complete sense, and then totally refuse to change my mind or admit even the slightest possibility that you might be right.
(This is the correct Bayesian action, by the way. If I know that a false argument sounds just as convincing as a true argument, argument convincingness provides no evidence either way, and I should ignore it and stick with my prior.)
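(An aside from me, not part of Scott’s essay: that Bayesian point is easy to check numerically. Below is a minimal sketch in Python; the function, the prior, and the likelihood numbers are all invented for illustration.)

```python
# Bayes' rule sketch: if false arguments sound exactly as convincing as
# true ones, a convincing argument is zero evidence, and the posterior
# stays at the prior. (All numbers here are invented for illustration.)

def posterior(prior, p_convincing_if_true, p_convincing_if_false):
    """P(claim is true | the argument for it sounded convincing)."""
    numerator = p_convincing_if_true * prior
    denominator = numerator + p_convincing_if_false * (1 - prior)
    return numerator / denominator

prior = 0.05  # initial credence in, say, a fringe ancient-history claim

# Equal convincingness either way: the argument moves nothing.
print(posterior(prior, 0.9, 0.9))   # 0.05, unchanged from the prior

# If convincing arguments were rarer for false claims, updating would work.
print(posterior(prior, 0.9, 0.3))   # ~0.14, now convincingness is evidence
```

The first call shows the essay’s point: when the two likelihoods are equal they cancel out of Bayes’ rule, so sticking with the prior is exactly the right move.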
I consider myself lucky in that my epistemic learned helplessness is circumscribed; there are still cases where I will trust the evidence of my own reason. In fact, I trust it in most cases other than very carefully constructed arguments known for their deceptiveness in fields I know little about. But I think the average high school dropout both doesn’t and shouldn’t. Anyone anywhere - politicians, scammy businessmen, smooth-talking romantic partners - would be able to argue her into anything. And so she takes the obvious and correct defensive maneuver - she will never let anyone convince her of any belief that sounds “weird” (note that, if you grow up in the right circles, beliefs along the lines of astrology not working sound “weird”).
This is starting to sound a lot like ideas I’ve already heard centered on compartmentalization and taking ideas seriously. The only difference between their presentation and mine is that I’m saying that for 99% of people, 99% of the time, taking ideas seriously is a terrible idea. Or, at the very least, it should be the last skill you learn, after you’ve learned every other skill that allows you to know which ideas are or are not correct.
The people I know who are best at taking ideas seriously are those who are smartest and most rational. I think people are working off a model where these co-occur because you need to be very clever to fight your natural and detrimental tendency not to take ideas seriously. I think it’s at least possible they co-occur because you have to be really smart in order for taking ideas seriously to be even not-immediately-disastrous. You have to be really smart not to have been talked into enough terrible arguments to develop epistemic learned helplessness.
Even the smartest people I know have a commendable tendency not to take certain ideas seriously. Bostrom’s simulation argument, the anthropic doomsday argument, Pascal’s Mugging - I’ve never heard anyone give a coherent argument against any of these, but I’ve also never met anyone who fully accepts them and lives life according to their implications.
A friend tells me of a guy who once accepted fundamentalist religion because of Pascal’s Wager. I will provisionally admit that this person takes ideas seriously. Everyone else loses.
Which isn’t to say that some people don’t do better than others. Terrorists seem pretty good in this respect. People used to talk about how terrorists must be very poor and uneducated to fall for militant Islam, and then someone did a study and found that they were disproportionately well-off, college-educated people (many were engineers). I’ve heard a few good arguments in this direction before, things like how engineering trains you to have a very black-and-white, right-or-wrong view of the world based on a few simple formulae, and this meshes better with fundamentalism than with subtle liberal religious messages.
But to these I would add that a sufficiently smart engineer has never been burned by arguments above his skill level before, has never had any reason to develop epistemic learned helplessness. If Osama comes up to him with a really good argument for terrorism, he thinks “Oh, there’s a good argument for terrorism. I guess I should become a terrorist,” as opposed to “Arguments? You can prove anything with arguments. I’ll just stay right here and not do something that will get me ostracized and probably killed.”
Responsible doctors are at the other end of the spectrum from terrorists in this regard. I once heard someone rail against how doctors totally ignored all the latest and most exciting medical studies. The same person, practically in the same breath, then railed against how 50% to 90% of medical studies are wrong. These two observations are not unrelated. Not only are there so many terrible studies, but pseudomedicine (not the stupid homeopathy type, but the type that links everything to some obscure chemical on an out-of-the-way metabolic pathway) has, for me, proven much like pseudohistory: unless I am an expert in that particular field of medicine (biochemistry has a disproportionate share of these people and is also an area where I’m weak), it’s hard not to take them seriously, even when they’re super-wrong.
I have developed a healthy dose of epistemic learned helplessness, and the medical establishment offers a shiny tempting solution - first, a total unwillingness to trust anything, no matter how plausible it sounds, until it’s gone through an endless cycle of studies and meta-analyses, and second, a bunch of Institutes and Collaborations dedicated to filtering through all these studies and analyses and telling you what lessons you should draw from them. Part of the reason Good Calories, Bad Calories was so terrifying is that it made a strong case that this establishment can be very very wrong, and I don’t have good standards by which to decide whether to dismiss it as another Velikovsky, or whether to just accept that the establishment is totally untrustworthy and, as doctors sometimes put it, AMYOYO. And if the latter, how much establishment do I have to jettison and how much can be saved? Do I have to actually go through all those papers purporting to prove homeopathy with an open mind?
I am glad that some people never develop epistemic learned helplessness, or develop only a limited amount of it, or only in certain domains. It seems to me that although these people are more likely to become terrorists or Velikovskians or homeopaths, they’re also the only people who can figure out if something basic and unquestionable is wrong, and make this possibility well-known enough that normal people start becoming willing to consider it.
But I’m also glad epistemic learned helplessness exists. It seems like a pretty useful social safety valve most of the time.