Now, it’s certainly conceivable that they did have such a reason for holding the opinion. People do often have all kinds of psychological, non-truth-tracking reasons for believing in something. So I don’t know whether this guess was correct or not.
But then I recalled something that has stayed with me: a slide from a presentation that Stuart Armstrong gave several years back, showing how we tend to think of our own opinions as being based on evidence, reasoning, and so on. At the same time, we don’t see any of the evidence that caused other people to form their opinions, so instead we think of the opinions of others as being based only on rationalizations and biases.
Yes, it was conceivable that this person I was disagreeing with held their opinion because of some bias. But given how quickly I was tempted to dismiss their view, it was even more conceivable that I had some similar emotional bias making me want to hold on to my own opinion.
And being able to imagine a plausible bias that could explain another person’s position is a Fully General Counterargument: you can dismiss any position that way.
So I asked myself: okay, I have invented a plausible bias that would explain the person’s commitment to this view. Can I invent some plausible bias that would explain my own commitment to my view?
I could think of several, right there on the spot. And almost as soon as I did, I felt my dismissive attitude towards the other person’s view dissolve, letting me consider their arguments on their own merits.
So, I’ll have to remember this. New cognitive trigger-action plan: if I notice myself inventing a bias that would explain someone else’s view, spend a moment to invent a bias that would explain *my* opposing view, in order to consider both more objectively.
Originally published at Kaj Sotala. You can comment here or there.
no subject
Date: 2017-09-04 02:30 am (UTC)

Er, it would seem that a more basic problem is that the emotional motives for holding a belief are orthogonal to whether that belief is correct.
A lot of folks have apparently gotten caught up in the idea that because one can have erroneous beliefs due to emotionally motivated reasoning, only emotionally unmotivated reasoning can be correct. But of course, the correctness of a conclusion exists independently of how we feel about it – or how we get to it.
Saying, "They're just saying that because [feeling]" can be both correct and immaterial. I mean, 94% of everything said in quarrels between spouses falls under that category.
no subject
Date: 2017-09-04 06:09 am (UTC)

Hmm. Wouldn't it be more accurate to say that it's not orthogonal, but depends on the type of belief and exact question that we're talking about?
I.e., how likely an emotionally-motivated belief is to be correct is proportional to how much the cause of that emotional reaction is entangled with the thing that we're interested in.
For something like beliefs in the domain of the natural sciences, which was the context I had in mind when writing this, the causes of emotional reactions seem to be mostly uncorrelated with the kinds of questions we're asking and so likely to lead us astray. If I'm emotionally invested in a particular hypothesis about what the speed of light is, because the person who proposed that particular hypothesis happens to be a relative, then the fact of who I'm related to seems uncorrelated with the fact of what the speed of light is.
On the other hand, if we're talking about something like understanding my own needs (in the sense used in e.g. Non-Violent Communication), then I would put substantially more weight on information gained from my emotional reactions than from most other sources, since there's a direct causal link between my needs and my emotional reactions.
no subject
Date: 2017-09-04 06:34 am (UTC)

Right, that's what I'm saying. It's uncorrelated. Not negatively correlated.
In this scenario, how likely you are to be right about the speed of light is 100% dependent on how right the relative (I see what you did there!) is about the speed of light. But choosing to entirely cede the effort to be correct to one's relative out of partiality does not make the relative more or less right about their hypothesis as to the speed of light.
In fact, if one's Uncle Einstein is better at, say, physics, ceding to them the choice of which hypothesis to prefer is rather sensible. In that case the partiality of the decision – entirely accidentally – leads one to a better result than one might get by trying to evaluate the evidence oneself.
Dismissing a proposition with "Oh, they're just saying that because of feelings" is a cop out: it avoids having to examine the merits of the contention. But people believe true and correct things for crappy, biased reasons all the time.
no subject
Date: 2017-09-04 06:58 am (UTC)

Yeah, sure, agreed. That (assuming that I'm interpreting you correctly) was actually what I was trying to express in the post. That if we disagree with someone, we may be tempted to assume that their position is only based on some emotional bias (which is uncorrelated with the truth), but in fact their position is likely to be based on some combination of bias and actual evidence. And even if we correctly notice a bias, that still doesn't say anything about whether the rest of their reasons for believing in the thing are correct or incorrect. (After all, we likely have some emotional motive influencing our beliefs as well, but that doesn't mean we would ignore our own other evidence for the belief, so dismissing another's belief because they had an emotional motive would be a double standard.)
no subject
Date: 2017-09-04 07:14 am (UTC)

*That if we disagree with someone, we may be tempted to assume that their position is only based on some emotional bias (which is uncorrelated with the truth), but in fact their position is likely to be based on some combination of bias and actual evidence.*
Maybe! Or maybe they have no evidence at all. They might still be right.
My point is that it's immaterial to the correctness or incorrectness of their contention whether they have any actual evidence. A stopped clock is right twice a day.
I mean, sure, it would be nice if they had evidence they could share which would be useful to you. But they don't have to be able to convince you or better inform you in order to be right. And even if you know, somehow, with 100% certainty that they adopted a position because of terrible, emotional reasons, that doesn't mean the position is incorrect.
no subject
Date: 2017-09-04 07:35 am (UTC)

This is a very weird conversation!
You liked surrealism, didn't you? Happy to oblige! :D
So... your point is that even if all of their reasons for believing in a thing are totally uncorrelated with the truth, and they had no reliable evidence for it, they might still be right just by coincidence? Maybe they are totally biased and irrational in their reasons for believing Uncle Einstein, but it just happens that their uncle is Einstein, so the totally biased and irrational motive happened to get it right just by luck? Is that what you mean?
no subject
Date: 2017-09-05 06:37 am (UTC)

If the stupidest, most biased person you know believes E=mc^2 is true because they read it on Infowars, that doesn't make it false.
no subject
Date: 2017-09-05 08:38 am (UTC)

The way I'd put this is: if a completely irrational person makes a claim, then you're correct that this doesn't really provide negative evidence for the claim. At worst, it provides zero evidence.
At the same time, most possible claims are false. If I didn't know what the speed of light was and had to make a random guess, my guess would almost certainly be wrong. Even a lot of serious scientific hypotheses whose inventors have put a lot of thought into them, considered the evidence, and so on, still turn out to be wrong. So the prior probability of a lot of claims being wrong is pretty high.
So if a totally irrational person makes a claim, them being totally irrational doesn't make the claim more likely to be false. But it also doesn't shift the prior probability of the claim being true, and since the prior probability is low, the claim is still really unlikely to be true.
... at least, if we're talking about science-type claims. Even a very irrational person might still be correct about where the nearest grocery store is, and if I'm in an urban area then it's not super-unlikely that the nearest grocery store really is two blocks away to the north. A lot of irrational people still manage to be pretty rational in most domains; "irrational" in common use doesn't mean "incapable of independent living", after all.
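To make this prior-probability point concrete, here is a minimal numerical sketch of a Bayesian update; the 1% prior and the likelihood ratios are my own illustrative assumptions, not numbers from the discussion above.

```python
# Toy Bayesian update: testimony from a completely unreliable source has
# a likelihood ratio of 1, so it leaves the prior probability unchanged,
# while testimony from a more reliable source shifts it upward.
# (Illustrative numbers only.)

def posterior(prior: float, likelihood_ratio: float) -> float:
    """Posterior probability after one piece of evidence, via the odds form of Bayes' rule."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

prior = 0.01  # a specific science-type claim starts out unlikely

# An irrational person asserting the claim: their say-so is no more likely
# when the claim is true than when it is false, so the ratio is 1.
print(posterior(prior, likelihood_ratio=1.0))   # 0.01 -- unchanged, still unlikely

# Compare: a source whose assertion is 20x more likely when the claim is true.
print(posterior(prior, likelihood_ratio=20.0))  # ~0.168 -- noticeably shifted
```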
(And to add additional qualifiers: even if someone is clearly wrong in their opinion, them having that opinion is still evidence of something. My favorite example of this is an anecdote about a tribe which believed that when you got sufficiently old, you would start seeing into the supernatural realm... which any science-minded person would be likely to dismiss as superstitious nonsense. But it turns out that they were discussing the imagery produced in Charles Bonnet syndrome. Their claim was still pointing to something that was true in the world, even though their interpretation of what was going on was off. A sufficiently curious person could have stopped to consider possible explanations for why that claim was made, rather than just dismissing it entirely.)