Lately, the excellent blog Overcoming Bias has had discussion about the rationality and psychology of disagreement. I admit that I don't entirely understand everything that's discussed there - apparently, in 1976 Robert Aumann (who would later win a Nobel prize) published a paper which says, roughly, that in theory, people who have the same information and who talk to each other for long enough cannot agree to disagree. This has led to a large number of subsequent papers, some of which discuss the issue in rather abstract terms and with long mathematical proofs, mostly considering perfectly rational agents, which leaves their exact relevance to human thought a bit unclear. Nevertheless, the bits that I've gleaned from some of the blog posts discussing this subject have been most interesting.

Let's discuss the issue in plain English, without invoking any math or formal logic. The principle is simple, almost obvious when you think about it. Let's assume that we have two people who share a goal but disagree about how best to reach it. Since they disagree, they must have different information - the information person A has says that approach X is better, while the information person B has says that approach Y is better. Now, when they sit down to discuss the issue, they start sharing their information with each other, until finally both know exactly the same things. Since they now both know the same things, logically they should also draw the same conclusions. Thus, assuming they're perfectly rational and have enough time to discuss the issue, in the end they cannot agree to disagree about it. They may have reasons to interpret the same information differently, but then those reasons are themselves information they haven't yet shared with each other.
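
To make the intuition concrete, here is a minimal toy sketch in Python (my own illustration, not anything from the actual papers, and with far stronger assumptions than they make - a shared prior, independent evidence, and complete honesty): two people who pool all of their evidence end up computing the same posterior probability.

```python
def posterior(prior, evidence):
    """Bayes' rule for a binary hypothesis H; evidence is a list of
    (P(e|H), P(e|not H)) likelihood pairs, assumed independent."""
    p_h, p_not_h = prior, 1 - prior
    for like_h, like_not_h in evidence:
        p_h *= like_h
        p_not_h *= like_not_h
    return p_h / (p_h + p_not_h)

shared_prior = 0.5
evidence_a = [(0.8, 0.3)]              # what person A has seen
evidence_b = [(0.2, 0.6), (0.4, 0.9)]  # what person B has seen

print(posterior(shared_prior, evidence_a))           # A alone: ~0.73, favors X
print(posterior(shared_prior, evidence_b))           # B alone: ~0.13, favors Y
print(posterior(shared_prior, evidence_a + evidence_b))  # after sharing everything: ~0.28
# Once all the evidence is pooled, A and B compute the same number,
# so they can no longer agree to disagree - under these toy assumptions.
```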

Now, of course we all know that humans aren't perfectly rational creatures. Still, this is a subject that I've thought about every now and then - in just about every field of human behavior, huge disagreements persist about things that have been debated back and forth for ages, with plenty of experimental evidence to go around (consider, for one, the divisions between the political right and the left). Even though I know that people don't really think rationally about most things, this still strikes me as somewhat strange - typically both sides have plenty of really smart people arguing their cases, and often people devote practically their entire lives to the study of these things. There are no doubt plenty of folks who are just biased beyond belief, but nevertheless there should be enough people who really want to find out the truth that these things would get resolved relatively quickly. So what causes might there be for all of these persisting disagreements?

* People might actually have different goals. By Hume's Guillotine, moral rules cannot be directly derived from physical facts. One person can believe that positive rights are inherently the most important things to achieve, while another believes that negative rights are more important. (This is what I suspect is behind a lot of the right-left dispute.) One person can believe that maximizing humanity's happiness is the most important thing, while another can believe that living a pure and sinless life in the eyes of God is. For as long as these underlying moral beliefs are axiomatic and not based on any other information (and some beliefs must be axiomatic, if a person is to have any at all), they cannot be challenged by learning new things.
* People might treat the same information differently based on extra-informational factors. For instance, they might have a genetic disposition towards optimism or pessimism. Also, the human mind is built so that when people learn something that conflicts with something they already know, they're more likely to discredit the new information than the old. Thus, simply changing the order in which information is received may alter how it is processed, even if one ultimately has all the same information as somebody else (see the sketch after this list). Somebody who is first trained as an engineer and then majors in the humanities will view both of their educations quite differently from somebody who first gets a humanities degree and then goes to study engineering.
* The issue may be too complex to comprehend fully, or simply so hard to understand that human minds can never hope to fully grasp it. Of course, in this situation the most rational choice would be to accept that it's impossible to really know, or that more research into the matter is needed - not to simply cling to whichever side you'd prefer to win.
* The sides discussing the matter may both have so much information that they cannot hope to share all of it in the limited time they have, or sharing it all simply isn't worth the time. This argument works for some issues, but it's more dubious for things like politics that are debated extensively - after all, if you're politically on the left at the age of 20, it's not very plausible that you couldn't communicate all the information that led you to this stance, even if you spent the rest of your life talking about it.
* Different ways of communicating information vary in effectiveness, and some things can't be communicated with speech alone. You can spend a whole day being told about the economist's mindset by a professor with PhDs in both economics and pedagogy, but you still won't internalize it as well as somebody who has spent five years at university studying economics. Also, people tend to give more weight to things they have experienced themselves than to things they have merely heard about from somebody else.
* One does not always know how one knows something. You can have beliefs that are well-founded in facts you know, but when asked to explain them, you no longer remember the original evidence that convinced you. Various incidents where you've seen a certain behavior can compress themselves in your mind until it's simply obvious to you that something works a certain way, and when somebody disagrees you think they're being silly without being able to prove that you're right.
* One can be affected by a large number of other biases. Just look at Wikipedia's list of cognitive biases. It's depressingly long.
* Finally, one might simply not care about the truth, and be uninterested in encountering conflicting points of view. An interesting question is how often this might actually be a good thing - there is a concept known as rational irrationality (HTML version), which basically states that in many situations the benefit people would derive from knowing the truth is practically nonexistent (believing or not believing in evolution doesn't directly influence your life in any way, whichever way you swing). Thus, spending even a minimal amount of effort trying to find out the truth might be pointless - irrational, even. And sometimes unfounded beliefs might even benefit you (religion is a good example).
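
To illustrate the order-of-information point from the list above, here is a small toy sketch (my own invention, not a real psychological model): an ideal Bayesian ends up with the same belief no matter what order the evidence arrives in, whereas a hypothetical agent who discounts evidence conflicting with its current belief gets anchored on whatever it happened to hear first.

```python
def bayes(prior, evidence):
    """Ideal updating: multiply in every (P(e|H), P(e|not H)) likelihood pair."""
    p_h, p_not = prior, 1 - prior
    for like_h, like_not in evidence:
        p_h *= like_h
        p_not *= like_not
    return p_h / (p_h + p_not)

def biased(prior, evidence, discount=0.5):
    """A made-up confirmation-bias rule: evidence conflicting with the current
    belief only gets partial weight (raised to `discount`). At exactly 0.5 the
    agent is treated as leaning toward H (an arbitrary tie-break)."""
    p = prior
    for like_h, like_not in evidence:
        favors_h = like_h > like_not
        leans_h = p >= 0.5
        weight = 1.0 if favors_h == leans_h else discount
        p_h = p * like_h ** weight
        p_not = (1 - p) * like_not ** weight
        p = p_h / (p_h + p_not)
    return p

pro = (0.9, 0.2)   # a piece of evidence favoring the hypothesis
con = (0.2, 0.9)   # an equally strong piece of evidence against it

print(bayes(0.5, [pro, con]), bayes(0.5, [con, pro]))    # 0.5 and 0.5: order is irrelevant
print(biased(0.5, [pro, con]), biased(0.5, [con, pro]))  # ~0.68 vs ~0.50: order changed the conclusion
```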

Let's assume, though, that you are a seeker of the truth - if not entirely, then at least in part. You want to know how things are in reality. What lessons should we draw from all of this, and how should you act? Here are my personal suggestions, though I don't claim to follow them all to the letter yet. Still, they are things to strive for.

* Study things from as many points of view as possible, and try to understand as many models of thought as you can. This way, you can better understand the behavior of other people, and how people can think in ways that seem incomprehensible to you. If an atheist, talk to religious people until you understand them well enough not to consider them silly; if religious, talk to atheists until you understand them in the same way. Get at least passingly familiar with all the existing genres of fiction, and especially study science fiction - the good sort of science fiction, the one that isn't just "laserguns for revolvers and spaceships for horses" but instead builds on premises and settings that are as bizarre and unusual for us as possible. At the same time, beware the fallacy of generalization from fictional evidence, and always keep in mind that you are reading fiction, not scientific studies. Fiction is just stuff that someone has invented. It doesn't prove that things would go that way in real life, and you should be very cautious of letting the images painted in fictional works color your concept of what the world really is like.
* Become interdisciplinary. Do for science what you did for fiction, for you never know what branch of human thought might grant you the knowledge you need to understand a phenomenon. Where fiction could lead you to mistaken conclusions, science will give you the methods you need to truly understand the world - even the methods that feel counterintuitive to those not skilled in them. Study mathematics, economics, history, psychology, physics, everything.
* Recognize your fallibility. Realize that in a quest for the truth, your own biases become your worst enemy. To defeat your enemy you must understand it, so set forth on studying it. Follow blogs like Overcoming Bias. Read up on the field of heuristics and biases - the book Judgment Under Uncertainty: Heuristics and Biases comes highly recommended, and though I haven't read it yet, I plan to do so soon. Find the time to peruse articles like Wikipedia's list of cognitive biases and Cognitive Biases Potentially Affecting Judgment of Global Risks. In your interdisciplinary studies, especially emphasize the sciences that help you understand and combat your biases, and the ones that allow you to think clearly - in his Twelve Virtues of Rationality (which is required reading for you), Eliezer Yudkowsky recommends evolutionary psychology, heuristics and biases, social psychology, probability theory and decision theory. Read texts that are obviously biased, so that you become better at spotting the milder biases. Bookmark lists of debating fallacies. Practice the Art of Rationality in whatever ways you can.
* Actively adjust your thoughts and hypotheses based on the information you have about yourself and about others. In Uncommon Priors Require Origin Disputes (it has some formal logic, but you can just read the plain English summaries and skip the formal bits - that's what I did), Robin Hanson discusses the example of two astronomers with differing opinions about the size of the universe. He notes that they cannot justify their difference of opinion by genetic differences influencing optimism and pessimism, because the laws of heredity work independently of the size of the universe - inheriting a gene for optimism does not alter the size of the universe (or vice versa) - so an astronomer who knows he carries such a gene should seek to remove its effect on his thinking. Find out which influences on your thought are correlated with a better understanding of the world, and eliminate the others. Having an IQ different from others is relevant to whether or not your hypotheses are accurate, but having been born in a geographical location where a certain point of view tends to be favored is not.
* Remember how, in my last point, I said that I skipped the formal bits of a paper and just read the summaries? Don't do that. Strive for a technical understanding of all things, as is explained in A Technical Explanation of Technical Explanation. If you know that "everything is waves" but do not understand the mathematical and physical concepts behind that sentence, then you do not really know anything but a fancy phrase. You cannot use it to make valid predictions, and if you can't use it to make predictions, it's useless to you. Strive for an ability to make testable predictions, not an ability to explain anything you encounter.
* Discuss the same subjects repeatedly, even with the same people. If you are losing a debate but still cannot admit you're wrong, ask for time to ponder it. Decide whether your hesitation was because you were too caught up in defending a position, in which case you only need time to get over it and accept your opponent's arguments, or because there was more relevant information in your mind that you couldn't recall at the moment, in which case you need time for your subconscious to bring it to mind. Be very sceptical of yourself if you disagree with something but cannot justify the disagreement even with time - you might be dealing with bias instead of forgotten knowledge. If questioned, be prepared to double-check your intuitions about what is obvious against scientific studies, and be ready to discard those intuitions if necessary.
* Avoid certainty, and of all people, be the harshest on yourself. 80% of drivers think they belong in the top 30% of all drivers, and even people aware of cognitive biases often seem to think those biases don't apply to them. People tend to find in ambiguous texts the points that support their opinions, while discounting the ones that disagree with them. Question yourself, and recognize that if you want your theories to find the truth, you can never be the only one to evaluate them. Subject them to criticism and peer review, and find those with the most conflicting views to look them over. Never think that you have found the final truth, for when you do, you stop looking for other explanations. Remember the scientists behind the Castle Bravo nuclear test, whose mistake was assuming their calculations were complete when they had in fact overlooked a crucial factor, making the explosion far larger than predicted. Consider impossible scenarios. Meditate on the mantra of "nothing is impossible, only extremely unlikely". Think of the world in terms of probabilities, not certainties (see the sketch after this list).
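
As a small illustration of that last point (my own toy example, nothing more): one concrete way to think in probabilities rather than certainties is to record your predictions as probabilities and score them once the outcomes are known. The Brier score below is one standard way of doing this; lower is better, and confident claims that turn out wrong are punished heavily.

```python
def brier_score(forecasts):
    """forecasts: list of (probability assigned, whether it actually happened)."""
    return sum((p - (1.0 if happened else 0.0)) ** 2
               for p, happened in forecasts) / len(forecasts)

# Hypothetical predictions made over a year, then resolved:
my_forecasts = [
    (0.9, True),    # "90% sure this project ships on time" - it did
    (0.9, False),   # another 90% claim that turned out to be wrong
    (0.6, True),
    (0.3, False),
]
# The same outcomes, but claimed with near-certainty every time:
overconfident = [(0.99, happened) for _, happened in my_forecasts]

print(brier_score(my_forecasts))    # ~0.27
print(brier_score(overconfident))   # ~0.49 - certainty is punished when you're wrong
```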