Some time back, I saw somebody express an opinion that I disagreed with. My mind quickly came up with emotional motives the other person might have for holding such an opinion, which would let me safely justify dismissing it.

Now, it’s certainly conceivable that they did have such a reason for holding the opinion. People do often have all kinds of psychological, non-truth-tracking reasons for believing in something. So I don’t know whether this guess was correct or not.

But then I recalled something that has stayed with me: a slide from a presentation that Stuart Armstrong gave several years back, showing how we tend to think of our own opinions as being based on evidence, reasoning, and so on. At the same time, we don’t see any of the evidence that caused other people to form their opinions, so we think of the opinions of others as being based only on rationalizations and biases.

Yes, it was conceivable that this person I was disagreeing with held their opinion because of some bias. But given how quickly I was tempted to dismiss their view, it was even more plausible that I had some similar emotional bias making me want to hold on to my own opinion.

And being able to imagine a plausible bias that could explain another person’s position is a Fully General Counterargument: you can dismiss any position that way.

So I asked myself: okay, I have invented a plausible bias that would explain the person’s commitment to this view. Can I invent some plausible bias that would explain my own commitment to my view?

I could think of several, right there on the spot. And almost as soon as I did, I felt my dismissive attitude towards the other person’s view dissolve, letting me consider their arguments on their own merits.

So, I’ll have to remember this. New cognitive trigger-action plan: if I notice myself inventing a bias that would explain someone else’s view, spend a moment to invent a bias that would explain *my* opposing view, in order to consider both more objectively.

Originally published at Kaj Sotala. You can comment here or there.

Date: 2017-09-04 02:30 am (UTC)
From: [personal profile] siderea
> My mind quickly came up with emotional motives the other person might have for holding such an opinion, which would let me safely justify dismissing it.

Er, it would seem that a more basic problem is that the emotional motives for holding a belief are orthogonal to whether that belief is correct.

A lot of folks have apparently gotten caught up in the idea that because one can have erroneous beliefs due to emotionally motivated reasoning, only emotionally unmotivated reasoning can be correct. But of course, the correctness of a conclusion exists independently of how we feel about it – or how we get to it.

Saying, "They're just saying that because [feeling]" can be both correct and immaterial. I mean, 94% of everything said in quarrels between spouses falls under that category.

Date: 2017-09-04 06:34 am (UTC)
From: [personal profile] siderea
> For something like beliefs in the domain of the natural sciences, which was the context I had in mind when writing this, the causes of emotional reactions seem to be mostly uncorrelated with the kinds of questions we're asking and so likely to lead us astray. If I'm emotionally invested in a particular hypothesis about what the speed of light is, because the person who proposed that particular hypothesis happens to be a relative, then the fact of who I'm related to seems uncorrelated with the fact of what the speed of light is.

Right, that's what I'm saying. It's uncorrelated. Not negatively correlated.

In this scenario, how likely you are to be right about the speed of light is 100% dependent on how right the relative (I see what you did there!) is about the speed of light. But choosing to entirely cede the effort to be correct to one's relative out of partiality does not make the relative more or less right about their hypothesis as to the speed of light.

In fact, if one's Uncle Einstein is better at, say, physics, ceding to them the choice of which hypothesis to prefer is rather sensible. In that case the partiality of the decision – entirely accidentally – leads one to a better result than one might get trying to evaluate the evidence oneself.

Dismissing a proposition with "Oh, they're just saying that because of feelings" is a cop out: it avoids having to examine the merits of the contention. But people believe true and correct things for crappy, biased reasons all the time.

Date: 2017-09-04 07:14 am (UTC)
From: [personal profile] siderea
This is a very weird conversation!

> That if we disagree with someone, we may be tempted to assume that their position is only based on some emotional bias (which is uncorrelated with the truth), but in fact their position is likely to be based on some combination of bias and actual evidence.

Maybe! Or maybe they have no evidence at all. They might still be right.

My point is that it's immaterial to the correctness or incorrectness of their contention whether they have any actual evidence. A stopped clock is right twice a day.

I mean, sure, it would be nice if they had evidence they could share that would be useful to you. But they don't have to be able to convince you, or to better inform you, to be right. And even if you know, somehow, with 100% certainty that they have adopted a position because of terrible, emotional reasons, that doesn't mean the position is incorrect.

Date: 2017-09-05 06:37 am (UTC)
From: [personal profile] siderea
Yes! Though you sort of put it in a way that makes it sound like some sort of long shot. People get right answers for wrong reasons all the time. One's uncle doesn't have to be Einstein; one's uncle just has to have heard of Einstein. If one is partial to one's uncle's favorite theory because he is one's uncle and the only one in the family, oneself included, with an interest in physics, adopting his favorite hypothesis might get one the right answer, just because one's uncle has managed to get the right answer.

If the stupidest, most biased person you know believes E=mc^2 is true because they read it on Infowars, that doesn't make it false.
