Talk:List of cognitive biases
Name/Organization of article
Can we call this something more specific than "list of biases"? There has to be a better, less confusing name for this category. - Texture 18:19, 7 Mar 2004 (UTC)
- Perhaps List of psychological biases? Isomorphic 18:23, 7 Mar 2004 (UTC)
- Much more descriptive. Any objection to moving this article to the new name? - Texture 18:28, 7 Mar 2004 (UTC)
- If you wanna move it, fine with me. My only thought is that all statistical biases could be considered ultimately cognitive ones. 67.86.96.95 18:31, 7 Mar 2004 (UTC)
- Would List of cognitive biases be better than List of psychological biases? - Texture 18:33, 7 Mar 2004 (UTC)
- Better ring to it, covers what I had in mind, yes. 67.86.96.95 18:38, 7 Mar 2004 (UTC)
- I'll move this article to List of cognitive biases - Texture 18:59, 7 Mar 2004 (UTC)
Maybe the list should be hierarchical. Categories include attributional biases, statistical biases, self-serving biases, group-serving biases, reporting biases (though there might be some overlap between some of them) 67.86.96.95 08:45, 7 Mar 2004 (UTC)
The list is not accessible
I think the list is not accessible. Each bias should mention the following:
- whether this is an informal term only used by lay people or a term used by psychologists
- whether the bias has been empirically verified
- whether the bias is undisputed by scientists.
- what other biases are closely related, overlap or are a part of it.
I can't do it myself because I don't know enough about the subject. Thanks for helping. Andries 15:56, 16 May 2004 (UTC)
- I agree with Andries. On another note, this is a really great list subject!! If only we could get Bill O'Reilly and Michael Moore to read it... --64.121.197.36 18:49, 21 Aug 2004 (UTC)
- I agree. I'll probably be working on this over the next few days, any help is appreciated. J.S. Nelson 07:57, 25 Apr 2005 (UTC)
- I've done my best to improve this list (organizing it into obvious categories and adding short descriptions). Please continue to update and improve this list as y'all see fit. Headlouse 01:33, 16 November 2005 (UTC)
Victim fallacy
People tend to assume that their problems are unique to them or a group to which they belong, when these problems are very often widespread and affect many people.
Is there a name for this bias or fallacy? I cannot find anything in the list which mentions it. I would tend to call it the "victim fallacy".
LordK 19:13, 20 Nov 2004 (UTC)
Coercion bias?
This is just an unconfirmed rumor I read somewhere. If there is a person with an opinion, people tend to agree with him more in person than when not in person. So this can lead to bias in that individual. But I am not even sure if this belongs here. Samohyl Jan 23:35, 21 Dec 2004 (UTC)
Why is there a separate "Other cognitive biases:" list?
Why is there a separate "Other cognitive biases:" list? There is no indication to the reader of any reason that these are separated out. I'd suggest they either be integrated in, or an explanation given to the reader why they are separate.
- I am not even convinced that things like optical illusions are cognitive biases in the same sense as the other things on the list. It might be better to remove them entirely, or put them under 'see also.' Tom harrison 00:21, 21 October 2005 (UTC)
Tunnel Vision
Just wondering if this is correctly linked? AndyJones 18:19, 9 November 2005 (UTC)
Realism Theory?
I believe that "realism theory" has another name. Does anyone know what it is? 9:08, 21 Nov 2005
Support for merging this page with Cognitive Bias
Like previous users, I see no reason for not merging this with the cognitive bias page. I would need help to do this... any offers please?? --Rodders147 11:26, 19 March 2006 (UTC)
- I don't know about the merging. It is a pretty long list, after all. --maru (talk) contribs 18:14, 19 March 2006 (UTC)
- I agree with Marudubshinki. This list is too long to include on the cognitive bias page. Plus there are several "List of _________" pages on Wikipedia, so this is not abnormal. Headlouse 22:59, 30 March 2006 (UTC)
Lake Wobegon effect
Where did this list come from? A lot of these, such as Lake Wobegon effect, must have some other name in psychology. It seems like some serious merging would help make these topics more informative. --65.25.217.79 10:12, 21 April 2006 (UTC)
Lake Wobegon effect, egocentric bias and actor-observer bias are closely related. Distinguishing (or merging) them requires expert advice. Peace01234 (talk) 03:52, 25 November 2007 (UTC)
Confusing causation with correlation
It seems as though the common phenomenon of assuming causation when all that exists is correlation or association should be on this list. Maybe it is and I could not identify it. Not my area of expertise; hope someone will address this. BTW: to those who maintain this page, it's a wonderful resource!
- Correlation implies causation is mentioned in the Logical fallacy article. -- Sundar \talk \contribs 16:22, 26 June 2006 (UTC)
Social Biases are not a type of Cognitive Bias
Cognitive Biases are one thing. Social Biases are another. Social Biases should be split to make a new article, "List of Social Biases."
- I disagree. Social biases stem from the way that people think (cogitate) about social interactions. At the very least, make some sort of argument for your assertion.
--NcLean 8th of October 2006
Valence effects
Optimism bias and Valence effects have separate articles, but what's the difference? Also, are there known systematic links between optimism effects (Rosy retrospection, valence effect, planning fallacy, overconfidence effect, false consensus...)? --NcLean 8th of October 2006
Comfort and Implications effects
Where does one's desire for comfort at the expense of something he knows to be more beneficial fit into this list of biases? In other words, a major obstacle to clarity is the human predisposition to adopt and maintain beliefs which are comfortable for us as opposed to true. Perhaps related, or not, is the skewing of one's perception due to the implications of his/her choices. For instance, if I decide that the right thing to do is to help my wife deal with a sick child, then I will have to stop surfing the web, which gives me more immediate pleasure. [Helping my wife and child is a deeper pleasure, but not immediately gratifying]
- Directly related to 'comfort bias' is a bias which isn't listed yet strikes me as one of the most significant of all: believing what you want to believe. Atheists argue that people believe in God and heaven because people very much want to believe in the afterlife. This bias is tremendous in the effect it has had on mankind. Where is it listed? Simon and Garfunkel wrote a song that had a line about this bias.
Contradiction?
"Déformation professionnelle — the tendency to look at things according to the conventions of one's own profession, forgetting any broader point of view." It seems to me that this line is self-contradictory. Not that it should be removed, but if a psychologist discovered this bias how would this psychologist know whether or not he/she was subject to this bias (the bias of a psychologist) during its discovery? Anyway, I just thought that was interesting. --Merond e 14:04, 18 March 2007 (UTC)
Reductive Bias
"A common thread running through the deficiencies in learning is oversimplification. We call this tendency the reductive bias, and we have observed its occurrence in many forms. Examples include the additivity bias, in which parts of complex entities that have been studied in isolation are assumed to retain their characteristics when the parts are reintegrated into the whole from which they were drawn; the discreteness bias, in which continuously dimensioned attributes (like length) are bifurcated to their poles and continuous processes are instead segmented into discrete steps; and the compartmentalization bias, in which conceptual elements that are in reality highly interdependent are instead treated in isolation, missing important aspects of their interaction "
Cognitive Flexibility, Constructivism, and Hypertext, R.Spiro et al. http://phoenix.sce.fct.unl.pt/simposio/Rand_Spiro.htm —The preceding unsigned comment was added by Difficult to pick a user name (talk • contribs) 18:59, 29 March 2007 (UTC).
Loss aversion, endowment effect, and status-quo bias
"Anomalies: The Endowment Effect, Loss Aversion, and Status Quo Bias" Daniel Kahneman; Jack L. Knetsch; Richard H. Thaler The Journal of Economic Perspectives, Vol. 5, No. 1. (Winter, 1991), pp. 193-206.
The first two paragraphs of this article:
- A wine-loving economist we know purchased some nice Bordeaux wines years ago at low prices. The wines have greatly appreciated in value, so that a bottle that cost only $10 when purchased would now fetch $200 at auction. This economist now drinks some of this wine occasionally, but would neither be willing to sell the wine at the auction price nor buy an additional bottle at that price.
- Thaler (1980) called this pattern (the fact that people often demand much more to give up an object than they would be willing to pay to acquire it) the endowment effect. The example also illustrates what Samuelson and Zeckhauser (1988) call a status quo bias, a preference for the current state that biases the economist against both buying and selling his wine. These anomalies are a manifestation of an asymmetry of value that Kahneman and Tversky (1984) call loss aversion: the disutility of giving up an object is greater than the utility associated with acquiring it.
I think it's very clear that my edit (suggesting that the three are related) is consistent with the conventional usage in economics. Anthon.Eff 22:24, 11 April 2007 (UTC)
- Look, armies of academics have worked on those "asymmetric" effects. Your citations do not support your point, as they show clearly that the three phenomena are different manifestations of something more general, but more vague, an "asymmetry". How can, for example, an investor understand his/her biases if he/she sees a confusing explanation? Please, treat each bias for itself, show the analogies if you like, but also the differences. Show why they are not called by the same name; this is the way to make things clear. In asset management the differences are as follows:
- Loss aversion is clearly the reluctance to sell when the price is lower.
- Endowment effect is clearly the idea that what somebody owns has more value than the market offers, even if its market price has already multiplied two, ten or a hundred times.
- Status quo bias does not concern only selling but also buying. It is the reluctance to change one's price estimate (whether upwards or downwards), as well as to make arbitrages between assets.
- The purpose of a list is to list things, not to merge them. I plan to eliminate again your misleading add-on, to avoid confusion in the mind of readers (yes, framing, which is the way things are presented and perceived, is another cognitive bias). But I'm sure it would not be needed, as I trust you will dig deeper into those phenomena and will refine your wording (the word "related" is seriously misleading as it hides the crucial differences) to avoid those confusions. --Pgreenfinch 07:10, 12 April 2007 (UTC)
- I think you need to give a citation for your definitions. I provided a citation, written by three well-known economists, including Thaler (who invented the term endowment effect), and Kahneman (who with Tversky invented the term loss aversion). It would help, when you provide your citation, to also do as I did: actually extract a few paragraphs providing the definitions that support your point. Until then, I think you should avoid reverting, since you have provided no evidence that you are correct. Anthon.Eff 11:54, 12 April 2007 (UTC)
- If I understood well, your only knowledge of the subject is a few old citations and you do not really want to explore the topic by yourself. Sorry, but you took the responsibility to make an add-on (which btw does not match clearly those citations) and to try to play the professor on something you do not really grasp. So you have to take that responsibility until the end and explain what your word "related" really means. Maybe you can look at Martin Sewell's treasure chest on the topic, a site I usually recommend; there you will learn a lot. You know, I respect the fact that you are an expert in philosophy, but those topics are very practical and precise ones and should be approached practically and precisely, avoiding reductionism. A philosophical concept, this word, seems to me, although here I'm not an expert and will not meddle in the article ;-). --Pgreenfinch 13:10, 12 April 2007 (UTC)
- Actually, I'm an academic economist, though behavioral economics is not my primary area of expertise. But to be honest, I think you are the one trying to "play the professor", as you put it, by presenting your own definitions as if they were somehow authoritative. That's a nice approach when writing one's own papers or playing on a blog, but that's not what is expected on Wikipedia (see WP:Attribution or WP:No original research). --Anthon.Eff 13:49, 12 April 2007 (UTC)
- I didn't give my page as reference for those concepts (although, by the way, various academics and various professional institutions link to that page and "play" with that forum). I just summed up here the usual definitions (which I didn't invent) of those effects. You are free to find better ones in the literature, which is why, to help you clarify your (original) add-on (what does "related" mean?), I suggested you get more expertise via a recognised academic portal on those topics. If you do not give clear definitions of those phenomena (I don't specially ask you to follow those I gave, if you find more explicit ones) that show how close / how far their relations are, how can you write that they are related, a very vague word? --Pgreenfinch 17:43, 12 April 2007 (UTC)
- I guess this is getting pretty unproductive. I did not use the word "related" in the article. I used the words "see also." And I have already provided a citation, from two of the men who coined the terms "loss aversion" and "endowment effect." Since we're having such trouble over this issue, I decided to simply quote these two men in the article for the definition of these terms. I have, of course, provided the source for these quotes. I trust that letting the authorities speak for themselves now resolves any differences we may have. --Anthon.Eff 18:47, 12 April 2007 (UTC)
- I see your point: an author talked about different things which are about 10% related, and this gives an authority argument to put "see also" between those. Status quo bias has very little to do with loss aversion, which is why your citations are not really explicit about that. I find that a poor way to use a Nobel prize author. Now that you have opened Pandora's box, we could add (we just need to find an author that talked about those different things) "see also" between scores of cognitive, emotional, individual, collective biases that have a few common points between them, and make a complete mess of the article. Just an example: status quo bias could be linked, by stretching the rubber band, to cognitive dissonance, cognitive overload, plain laziness, rationalization, overconfidence and many other things, nearly the whole list of biases. I have nothing against the "see also" tool, but on condition that it explains the similarities as well as the differences. With a "see also" signpost and no clear explanations, it is up to the reader to scratch his head wondering "Boy, what is this supposed to tell me? Do those concepts complement or oppose each other, and why? Do those roads converge or diverge, and where?". I think we now have a serious dent in an article which had the potential to be an example of encyclopedic quality. --Pgreenfinch 07:09, 13 April 2007 (UTC)
- The article contains 10 cases of "see also" in parentheses; all but one were there before I made my edit. Not one of the nine preexisting cases has the level of editorial detail you think is mandatory. By the way, note how this "serious dent in the article" only became salient after a dispute with another editor. What kind of bias do you think we are witnessing? My guess would be that this most closely matches Confirmation bias, in that we have seen one thing after another brought up in order to validate the initial negative reaction to the other editor's edit (first wrong definitions, then a few ad hominems, then the use of the word "related", then use of the words "see also"). But it could also be seen as an example of the Focusing effect, since you appear to think one minor edit has the potential to wreck the entire article. But then again, your commitment to the current state of the article may be an example of Status quo bias. Of course, as Pronin, Ross and Gilovich (2004: 781) tell us: "people readily detect or infer a wide variety of biases in others while denying such biases in themselves." An unnamed bias of bias attribution (someone should name this and put it on the list) has almost certainly caused me to overlook my own biases. Any lurkers out there? What kinds of biases do you think sustain such a long and unproductive discussion over a minor edit? --Anthon.Eff 13:38, 13 April 2007 (UTC)
- Pronin, Emily, Lee Ross, and Thomas Gilovich. (2004). "Objectivity in the Eye of the Beholder: Divergent Perceptions of Bias in Self Versus Others." Psychological Review. 111(3): 781–799.
- Anthon, you are becoming a real shrink, see all those biases you discovered in me, I'm impressed :-). Btw, the other "see also" links do not refer to a specific bias in the list. More generally, "see also" links are understandable in an article on a precise topic, as the relations can be seen clearly from the text. But in a list there is no real room to elaborate: either it becomes a mess by trying to explain every relation, however strong or weak, or, if those explanations are skipped, the "reductionist" risk is high. An alternative is to make smaller groupings, but here overlaps would be frequent and it would be subjective to find clear category criteria in a field that deals with human behavior. Thus we would face the same risks. Now I expect, like you, that lurkers will describe fully our respective biases. Certainly, from what I just said, they will find me overly "risk averse" ;-). --Pgreenfinch 15:14, 13 April 2007 (UTC)
Great List - Inaccurate quality/importance tag?
This article is one of the best pages on the whole of the Internet. It strikes me as bordering on presumptuous for someone to come along and lightly tag it as low-quality and unimportant. --New Thought 16:36, 12 January 2007 (UTC)
- I agree. This is a fantastic list. As a researcher, it was a big help to me. What would make it even more powerful is if the article citations for each bias (if available) were noted in the footnotes on this page, rather than just on the individual pages of the biases. Zfranco 00:25, 8 February 2007 (UTC)
- Don't repeat yourself. --Gwern (contribs) 03:15 8 February 2007 (GMT)
- This list is just great. It is not perfect as it is, but nowhere else in my life have I seen a list of cognitive biases which affect people's decisions so much. Unfortunately most people don't think too much about them. The world would certainly be a better place if everyone was aware of cognitive biases and thought more about the rationality in their decisions. I don't know how to express in Wikipedia that an article is important to me. I guess I am going to write it on my user page. Congratulations to all editors who made this list possible. A.Z. 00:16, 12 March 2007 (UTC)
- Indeed, this is a highly useful list and is more comprehensive than anything I have been able to find on the internet on the subject. Moreover, it is a great use of the Wiki approach of synthesizing information from so many different sources. As the article currently stands, the writing is sound, the topic important, and I suggest the tags be removed or at least be revised to reflect consensus. In that respect, I have moved the various comments on the quality of the page to this new section, for convenience of others in expressing their views. McIlwrath 20:50, 12 April 2007 (UTC)
Observer-expectancy effect
Do you have a source saying that Observer-expectancy effect is a cognitive bias? I have added a {{citation needed}} in the Observer-expectancy effect article. Thank you. Akkeron 10:44, 15 April 2007 (UTC)
Déformation professionnelle
My observation is that this bias is detected by folks outside the discipline/profession, not within it. 204.87.68.252 20:50, 24 April 2007 (UTC)
Bandwagon effect
I don't see a good psychology citation for this one either. Or lots of the other ones. That's why it's B-grade in my view. There are other pages with better citations - and good books (cited in the article). But this article is not a bad starting point. —The preceding unsigned comment was added by 74.95.10.169 (talk) July 16, 2007
Dunning-Kruger effect
I found this, which looks to me like a cognitive bias, but I'm not sure which category to put it in.
Dunning-Kruger effect Saraid 02:10, 22 July 2007 (UTC)
Perceptual vs. Conceptual
Perhaps the page should say "distortion in the way humans conceive reality" instead of "distortion in the way humans perceive reality" since the perception is the same, it's the concept that's distorted.
66.252.35.247 10:12, 22 July 2007 (UTC)
More general type of Post-purchase rationalization bias?
[moved from Talk:Post-purchase_rationalization ] Hi, I've been searching List_of_cognitive_biases for a type of bias and I think Post-purchase rationalization is closest to it, but it's more general than purchasing, where a decision has been made, then vacuous justifications are made up afterwards to support the decision, and justifications for alternative outcomes are deemphasized. Is this a separate type of bias that has its own name? 86.14.228.16 12:48, 24 July 2007 (UTC)
Illusion of control
The 'Illusion of Control' entry in the list has a negative bias. I propose 'clearly cannot' be changed to 'scientifically have no control over'. - Shax —Preceding unsigned comment added by 82.17.69.223 (talk) 01:13, 2 September 2007 (UTC)
Argument from incredulity
Argument_from_ignorance (or incredulity) is listed as a "logical fallacy" but seems like it belongs in this list. Whether one makes the logical argument explicitly or not doesn't really matter; it still is a cognitive bias.
LiamH (talk) 20:17, 6 February 2008 (UTC)
Is this list (and the article on Cognitive Bias) confusing Cognitive Bias with simple Irrational Thinking?
On reading the article and the discussions here, I wonder where the "authority" comes from to classify this list as comprising genuine cognitive biases. At first I wondered, as others have, about grouping and classifying the biases, but the more I read, the more the term Cognitive Bias seemed to be used as a catch-all for any illogical thought process. There seems to be a sort of populist psychology creeping in here. I am not sure that this is correct, but IMHO a cognitive bias is the sort of phenomenon noted in Prospect theory. Whilst it is true that, judged against mathematical formulae, the human response is inconsistent and illogical, at the same time it is also clear, from a human perspective, why the bias exists and indeed its utility in the survival process. On the other hand, simple illogical thinking can be termed "bad" thinking, e.g. "I can jump that gap because he could and I am as good a person as he is." There is a confusion in the example between being good (at jumping) and being good (as a person). Prospect theory, on the other hand, has more in common with valuing a "bird in the hand" more than "two in the bush". Simple illogical thinking is clearly not useful for survival and can even at times be described as a symptom of mental illness. However, cognitive bias can IMHO be seen as a "real" response to a probabilistic reality. If a coin is tossed and has come up heads four times, it is illogical to prefer to choose tails for the next toss. However, IMHO it is eminently sensible so to do. It all depends on HOW we frame the maths. Could we do some "disambiguation" here into "true" cognitive biases and examples of simple "bad"/illogical thinking? Also, I have been unable to find (quickly) any references to "lists" of cognitive biases, but only to particular studies showing an example of a cognitive bias, such as those described by Prospect Theory, Priming, Framing, etc.
LookingGlass (talk) 16:55, 18 February 2008 (UTC)