Jul 10, 2018

Russell conjugation and Eric Weinstein

Daniel turned me on to Eric Weinstein and I’ve been mainlining his ideas for the past dozen hours. The guy is brilliant. See note at bottom of the post.
Discursions out of the way, here's what I came here to write about. It's from Eric Weinstein's answer to the Edge question for 2017, "What scientific term or concept ought to be more widely known?"
His answer is "Russell Conjugation." What? I'd never heard of it, either. But I'd seen the concept in passing, and Wikipedia has heard of it too. Of course.
The example I’ve come across:
I am firm; you are obstinate; he is a pig-headed fool.
Here's a passage from his answer with so much packed into it that I could write a long essay about every phrase. Assuming I can get myself to write anything at all.
In an era in which anyone can publish anything, the quest to control information has largely been lost by institutions, with a race on to weaponize empathy by understanding its basis in linguistics and tweaking the social media algorithms which now present our world to us accordingly. As the theory goes, it is not that we don’t have our own opinions so much as that we have too many contradictory ones, and it is generally our emotional state alone which determines on which ones we will predicate action or inaction.
What leaps out is the idea that people are working to weaponize empathy. And it works because we don't have our own opinions so much as too many contradictory ones.
Behind that is a bug in the human mental machinery. We think that we take in facts and then evaluate them, but research tells us otherwise. There are a couple of mechanisms at work. One is demonstrated in Daniel Gilbert's work, which I wrote about here. His research concludes that we start by believing what we read (or hear) before we analytically decide that something is wrong. His paper is "You Can't Not Believe Everything You Read."
But if the idea gets activated and the annotation "this is a false idea" is not activated, then we will act as though we believe what we have decided we don't believe.
To compound the problem, we can be convinced that false things are true if we're exposed to powerful enough arguments. In this post I wrote about "Epistemic Learned Helplessness," a post in which Scott Alexander argues that we're not necessarily right to accept even the best rational arguments, because anyone who is not an expert in an area can be convinced by someone who is. "Convinced," in this case, is a proxy for "being overwhelmed by verifiable facts that support an argument that is, in fact, flawed, but you don't know enough to see the flaws."
So, taking these together: every conclusion we're exposed to is stored in our minds as true, along with its supporting arguments, also labeled true, even if our System 2 has concluded (as a result of other arguments and facts labeled "true," whether or not they are) that they are not true. "Weaponizing empathy" exploits yet another bug in our cognitive apparatus, one that overrides the System 2 evaluation ("this is wrong") with the emotional overlay "this feels right."
There's more to his Edge answer, and maybe I'll write more later. And there's much more to the whole idea of gaming other people's mental systems.
Also, for future reference: in looking for a link to his homepage or something, I found a brilliant idea on international migration summarized on his site, with the paper published in full here. In case I die or ADD strikes before I write my review of the article, those are the links.
