(no subject)

Date: 2008-05-13 09:10 pm (UTC)
You may be too rational for your own good? Hope you don't mind the terminology spam, but I've found psychology really useful in my overanalysis of things. All these labels aren't really precise, but they point to underlying dynamics.

From my perspective, there are a few cognitive errors in play there. First, premature closure:
http://www.isabelhealthcare.com/home/misdiagnosis_faq/media?ida=isabelstory
A 2005 study published in Archives of Internal Medicine found that cognitive error, often referred to as premature closure, is the single most common cause of diagnosis errors. Premature closure occurs when a clinician arrives at an initial diagnosis that seems to fit the facts then does not consider other reasonable possibilities.
Once a conclusion is reached, no further effort is spent re-examining the evidence that supposedly led to it. Intolerance of ambiguity can feed a tendency toward premature closure, and tendencies toward dogmatism don't help either, since some types dislike backtracking, deriding it as "flip-flopping" and so forth. Sometimes it's rational to reserve final judgment instead of running with a snap judgment. Then again, reserving judgment isn't always optimal in an evolutionary context: in a fight-or-flight situation, hesitating could mean becoming a predator's meal.

Second, people can sometimes trust what they hear from their social network despite contradictory evidence:
http://www.msnbc.msn.com/id/21311730/wid/11915773
The new study, published this week online in the Proceedings of the National Academy of Sciences, reveals individuals sometimes place so much stock in gossip that they accept it as true even if their own observations and experiences suggest otherwise.
That could be in-group bias at work. It's interesting that the effect has been found to be strong enough, at times, to produce a disconnect from reality. Cue Philip K. Dick.

Finally, intentions only go so far in my book. People may genuinely believe they're telling the truth, but sometimes they're undone by motivated reasoning:
http://www.ciadvertising.org/SA/fall_05/adv392/kasey/site1/motivated_reasoning2.htm
Research on motivation has consistently shown that people are motivated to come to a desired conclusion (see Kunda 1990 for review). Building support through a broad sample of research, Kunda's (1990) theory of motivated reasoning posits that "people rely on cognitive processes and representations to arrive at their desired conclusions, but motivation plays a role in determining which of these will be used on a given occasion"...

Kunda devotes much of the theory of motivated reasoning to motives arrived at through directional bias to reach a desired conclusion. Motivation affects reasoning "through reliance on a biased set of cognitive processes: strategies for accessing, constructing and evaluating beliefs"... This occurs when people are motivated to reach a certain, desired conclusion, and they conduct a biased memory search to find justification for their decision.
That's also reflected in confirmation bias: people tend to seek out information they already agree with and avoid looking for anything that might contradict their position.

More likely, people think they have ESP when they don't. Heh.