Difference between pages "Human behavior" and "Human perception"

From Faster Than 20
{{Draft}}


[http://psychlopedia.wikispaces.com/Social+Psychology Social psychology] and [[wikipedia:List of cognitive biases|cognitive biases]].
 
= Group Biases =
 
== Leadership ==
 
Narcissists tend to emerge as leaders of leaderless groups.<ref>Amy B. Brunell, William A. Gentry, W. Keith Campbell, Brian J. Hoffman, Karl W. Kuhnert, and Kenneth G. DeMarree. [http://journals.sagepub.com/doi/10.1177/0146167208324101 "Leader Emergence: The Case of the Narcissistic Leader."] ''Personality and Social Psychology Bulletin'' (October 2, 2008)</ref> Initially, groups tend to perceive these narcissistic leaders positively, even when they are negatively impacting performance.<ref>Barbora Nevicka, Femke S. Ten Velden, Annebel H. B. De Hoogh, and Annelies E. M. Van Vianen. [http://journals.sagepub.com/doi/10.1177/0956797611417259 "Reality at Odds With Perceptions: Narcissistic Leaders and Group Performance."] ''Psychological Science'' (September 19, 2011)</ref> However, these perceptions generally shift to match actual performance over time.<ref>Paulhus, D. L. (1998). [http://psycnet.apa.org/record/1998-01923-007 "Interpersonal and intrapsychic adaptiveness of trait self-enhancement: A mixed blessing?"] ''Journal of Personality and Social Psychology'', 74(5), 1197-1208.</ref> This is similar to the [[wikipedia:Dunning–Kruger effect|Dunning-Kruger effect]], where people who are incompetent tend to overestimate their competence, and people who are highly competent tend to underestimate their competence.
 
== Tribalism, Social Identity, and the Psychology of Categorization ==
 
According to social identity theory (developed by [[wikipedia:Henri Tajfel|Henri Tajfel]]):
 
* Humans are cognitively prone to categorizing and biased into thinking that similarities within categories and differences between categories are stronger than they are.<ref>Henri Tajfel. "Cognitive aspects of prejudice." ''Journal of Social Issues'', 25, 79-97. 1969.</ref>
* We require [[wikipedia:Minimal group paradigm|very little]] (including completely arbitrary distinctions) to form tribes. For example, Tajfel was able to form competing tribes among teenage boys within minutes based on whether they preferred paintings by Klee or Kandinsky.
* We tend to identify with groups in order to maximize positive distinctiveness.
 
When people are "on our side," we perceive them more warmly, and we take the time to get to know them as humans. When people cross boundaries to work together on shared goals, we also take the time to get to know them. We tend to dehumanize people who fall well outside our social boundaries. However, we are capable of re-humanizing them.<ref>Susan Fiske. [http://www.beinghuman.org/article/our-brains-teams "Our Brains on Teams: The Dark Side of Loyalty to a Group."] ''Being Human''. October 10, 2012.</ref>
 
Muzafer and Carolyn Sherif performed the [http://www.age-of-the-sage.org/psychology/social/sherif_robbers_cave_experiment.html Robbers Cave Experiment] in 1954, where they divided twenty-four 12-year-old boys into two groups and had them perform tasks individually, then in competition, then in cooperation. They found that competition for resources led to hostile attitudes, that [[contact hypothesis|contact alone]] did not reduce those attitudes, but that shifting to shared goals did.<ref name="konnikova2012">Maria Konnikova. [https://blogs.scientificamerican.com/literally-psyched/revisiting-the-robbers-cave-the-easy-spontaneity-of-intergroup-conflict/ "Revisiting Robbers Cave: The easy spontaneity of intergroup conflict."] ''Scientific American'', September 5, 2012.</ref>
 
= Groupthink =
 
Examples:
* Abu Ghraib
* rape culture
* bullying. Counter: “It gets easier”
 
= Strategy and Decision-Making Biases =
 
== Affective Errors ==
 
The tendency to make decisions based on what we wish were true.
 
This also applies to folks we like, which can affect the quality of our judgement. Brooke Harrington's [http://www.theepochtimes.com/n3/2014018-a-resurrection-for-investment-clubs/ research on investment clubs] showed that clubs formed primarily through professional bonds — as opposed to social ones — earned higher returns. “One of the things that can torpedo group performance is when people are too socially enmeshed with one another. They can become reluctant to really be direct and honest with one another.”
 
== [[wikipedia:Attribution bias|Attribution Bias]] ==
 
When trying to make sense of the world, we are prone to overemphasizing certain things at the expense of others. For example, we are more likely to attribute behavior to disposition ("This is who they are") than to situation ("This is what was going on"). This is known as Fundamental Attribution Error.
 
Another example is [[wikipedia:False consensus effect|false consensus effect]], where we overestimate how widely our views are shared with others. This is a form of [https://psychlopedia.wikispaces.com/Representative+bias representativeness bias] (the presumption that people or events share the features of other members in that category). Said another way, it's [[The Majority Illusion]]. Another example is the [[wikipedia:Abilene paradox|Abilene Paradox]].
 
You can leverage this effect to get people's true opinions. If you ask someone what they think about something, they have an incentive to say what they think you want to hear. If, however, you ask them, "What do you think others think?", it's likely that you'll get a good picture of what they think, thanks to these attribution biases.
 
== [[wikipedia:Availability Heuristic|Availability Heuristic]] ==
 
If you can recall it, it must be important.
 
Bias towards information or actions you can more easily recall (such as recent news or things with major consequences).
 
== [[wikipedia:Confirmation bias|Confirmation Bias]] ==
 
Our tendency to interpret information in a way that [https://youarenotsosmart.com/2010/06/23/confirmation-bias/ confirms our preexisting beliefs].
 
One example of this is the [https://youarenotsosmart.com/2011/06/10/the-backfire-effect/ backfire effect], where people's strongly-held beliefs get even stronger when presented with evidence that contradicts those beliefs. This [http://theoatmeal.com/comics/believe comic] describes this effect beautifully.
 
== [[wikipedia:Curse of knowledge|Curse of Knowledge]] ==
 
We assume that others know as much as we do. This may result in:
 
* Lack of [[empathy]]
* Over-explaining. In ''Made to Stick'', Chip and Dan Heath suggest that the Curse of Knowledge is why we're so bad at storytelling.
 
== Futures Thinking ==
 
Many cognitive biases affect our ability to [[scenario thinking|think about the future]].
 
Jane McGonigal's [http://www.slate.com/articles/technology/future_tense/2017/04/why_people_are_so_bad_at_thinking_about_the_future.html article] provides an excellent summary of several findings:
 
* The further out in time you think, the more you consider your future self a stranger, and the less you care about your future self.<ref>Pengmin Qin, Georg Northoff. [https://static1.squarespace.com/static/528facb6e4b0a18b7e9cde91/t/5314c1c3e4b05444a30d2b45/1393869251983/How+is+our+self+related+to+midline+regions+and+the+default-mode+network.pdf "How is our self related to midline regions and the default mode network?"] ''NeuroImage'' 57 (2011) pp1221-1233.</ref>
* How often do Americans think about something that might happen in:<ref>"The American Future Gap Survey." Institute for the Future, April 2017.</ref>
** 30 years? 17% at least once a week, 53% rarely or never (21% less than once a year, 32% never)
** 10 years? 29% at least once a week, 36% rarely or never (19% less than once a year, 17% never)
** 5 years? 35% at least once a week, 27% rarely or never
** Having children or grandchildren does not increase futures thinking. Near-death experiences do.
* Neurological deterioration makes it harder to think about the future as we get older. (We also care less, because we're less likely to be alive.)<ref>Peter G. Rendell, Phoebe E. Bailey, Julie D. Henry, Louise H. Phillips, Shae Gaskin, Matthias Kliegel. [https://www.researchgate.net/profile/Matthias_Kliegel/publication/263938085_Older_Adults_Have_Greater_Difficulty_Imagining_Future_Rather_Than_Atemporal_Experiences/links/542457910cf26120b7a740dc.pdf "Older Adults Have Greater Difficulty Imagining Future Rather Than Atemporal Experiences."] ''Psychology and Aging'', 2012 vol27, num4, pp1089-1098.</ref>
 
The more desirable we find something in the future to be, the more likely we think it will happen.<ref>J. Richard Eiser, Christine Eiser. [https://onlinelibrary.wiley.com/doi/full/10.1002/ejsp.2420050305 "Prediction of environmental change: Wish‐fulfillment revisited."] ''European Journal of Social Psychology'' volume 5, issue 3, July / September 1975, pp315-322.</ref> We see this manifest in presidential elections. From 1952 to 1980, 80% of major-candidate supporters in U.S. presidential elections expected their candidate to win; in other words, supporters predicted victory over defeat at a 4:1 ratio.<ref>Donald Granberg, Edward Brent. (1983). [http://psycnet.apa.org/record/1984-12066-001 "When prophecy bends: The preference–expectation link in U.S. presidential elections, 1952–1980."] ''Journal of Personality and Social Psychology'', 45(3), 477-491.</ref> See also Optimism Bias below.
 
== [[wikipedia:Hindsight bias|Hindsight Bias]] ==
 
Also known as the "knew-it-all-along" effect. Hindsight is 20/20.
 
== [[wikipedia:Loss aversion|Loss Aversion]] ==
 
We'd rather ''NOT'' lose something than gain the equivalent amount. For example, we prefer not losing $5 that we already have to gaining $5.
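Loss aversion is formalized in Kahneman and Tversky's prospect theory, whose value function weights losses more steeply than gains. A sketch, using their commonly cited 1992 parameter estimates:

<math>
v(x) =
\begin{cases}
x^{\alpha} & x \ge 0 \\
-\lambda\,(-x)^{\alpha} & x < 0
\end{cases}
\qquad \alpha \approx 0.88,\ \lambda \approx 2.25
</math>

With <math>\lambda \approx 2.25</math>, losing $5 feels roughly twice as bad as gaining $5 feels good, which is why we'd rather protect the $5 we have.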
 
== Self-Perception Biases ==
 
'''[[wikipedia:Optimism bias|Optimism Bias]]''' — Thinking that good things are more likely to happen to you than to others. Discovered by Neil Weinstein in the late 1970s.<ref>Neil Weinstein, [http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.535.9244&rep=rep1&type=pdf "Unrealistic Optimism About Future Life Events."] ''Journal of Personality and Social Psychology'' 1980, Vol. 39, No. 5, 806-820.</ref><ref>Neil D. Weinstein and William M. Klein (1996). [https://guilfordjournals.com/doi/abs/10.1521/jscp.1996.15.1.1 "Unrealistic Optimism: Present and Future."] ''Journal of Social and Clinical Psychology'': Vol. 15, No. 1, pp. 1-8.</ref> [https://www.theatlantic.com/science/archive/2017/11/humans-are-bad-at-predicting-futures-that-dont-benefit-them/544709/ This article] provides several excellent examples of optimism bias.
 
See also the Lake Wobegon effect and the Dunning–Kruger effect.
 
== What You See Is All There Is (WYSIATI) ==
 
(Show pie graph as counter to this.)
 
Ed Batista's [http://www.edbatista.com/2016/12/seeing-whats-not-there-the-importance-of-missing-data.html "Seeing what's not there (The importance of missing data)."]
 
=== [[wikipedia:Survivorship bias|Survivorship Bias]] ===
 
Focusing on things that succeeded, while ignoring things that failed. Another corollary to "correlation ≠ causation."
 
Said another way:
 
[[File:XKCD Survivorship Bias.png|350px|link=https://xkcd.com/1827/]]
 
Be wary of extrapolating from positive deviance!
 
= Gender Bias =
 
== Perception of Dominance ==
 
* Melissa J. Williams and Larissa Z. Tiedens. [http://psycnet.apa.org/journals/bul/142/2/165/ "The subtle suspension of backlash: A meta-analysis of penalties for women’s implicit and explicit dominance behavior."] ''Psychological Bulletin'', Vol 142(2), Feb 2016, 165-197. http://dx.doi.org/10.1037/bul0000039
 
= References =
<references />
 
= See Also =
 
* [[Human behavior]]
* [[Empathy]]
* [https://www.geckoboard.com/best-practice/statistical-fallacies/ Data Fallacies]

Revision as of 03:38, 5 April 2021