Information cascade
An information (or informational) cascade occurs when a person observes the actions of others and then – despite possible contradictions in his/her own private information signals – engages in the same acts. A cascade develops when people "abandon their own information in favor of inferences based on earlier people's actions".[1] Information cascades provide an explanation for how such situations can occur, how likely they are to cascade incorrect information or actions, how such behavior may arise and desist rapidly, and how effective attempts to originate a cascade tend to be under different conditions.[2] By explaining all of these things, the original Independent Cascade model sought to improve on previous models that were unable to explain cascades of irrational behavior, a cascade's fragility, or the short-lived nature of certain cascades.
There are five key conditions in an information cascade model:
- There is a decision to be made – for example, whether to adopt a new technology, wear a new style of clothing, eat in a new restaurant, or support a particular political position.
- A limited action space exists (e.g. an adopt/reject decision).
- People make the decision sequentially, and each person can observe the choices made by those who acted earlier.
- Each person has some private information that helps guide their decision.
- A person cannot directly observe the private information that other people hold, but can make inferences about it from their observable actions.
One assumption of information cascades that has been challenged is that agents always make rational decisions. More social perspectives on cascades, which suggest that agents may act irrationally (e.g., against what they think is optimal) when social pressures are great, exist as complements to the concept of information cascades.[3] While competing models exist, a more frequent problem is that the concept of an information cascade is conflated with ideas that do not match the key conditions of the model, such as social proof, information diffusion,[4] and social influence. Indeed, the term information cascade has even been used to refer to such processes.[5]
Basic model
Qualitative example
Information cascades occur when external information obtained from previous participants in an event overrides one's own private signal, irrespective of the correctness of the former over the latter. The experiment conducted by Anderson and Holt[6] is a useful example of this process. The experiment used two urns, A and B. Urn A contains two balls labeled "a" and one labeled "b"; urn B contains one ball labeled "a" and two labeled "b". The urn to be used in each run is chosen at random with equal probability (by the roll of a die). The contents of the chosen urn are emptied into a neutral container, and the participants are then asked, in random order, to draw a ball from this container. This entire process is termed a "run", and a number of such runs are performed.
Each time a participant draws a ball, he must decide which urn it belongs to. His decision is then announced for the benefit of the remaining participants in the room. Thus, the (n+1)th participant has information about the decisions made by all n participants preceding him, as well as his private signal, namely the label on the ball that he draws during his turn. The experimenters observed an information cascade in 41 of 56 such runs; that is, in those runs at least one participant gave precedence to earlier decisions over his own private signal. Such an occurrence can produce the wrong result, a phenomenon known as a "reverse cascade".
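The frequency of cascades in this setup can be illustrated with a small Monte Carlo sketch. The details below are assumptions made for the example rather than facts from the study: six participants per run, draws made with replacement (as in the laboratory procedure described under "Empirical studies"), and participants who follow the simple counting rule formalized in the quantitative description below, treating each announced guess as if it were an observed draw.

```python
import random

def simulate_run(n_participants=6, seed=None):
    """One run of the urn experiment: urn A (2 "a" balls, 1 "b") or urn B
    (1 "a", 2 "b") is chosen at random; each participant draws one ball,
    hears all earlier guesses, and announces the more likely urn.
    Returns True if at least one participant overrode his own draw."""
    rng = random.Random(seed)
    urn_is_A = rng.random() < 0.5
    q = 2 / 3                       # chance a draw matches the chosen urn
    guesses, overridden = [], False
    for _ in range(n_participants):
        draw_is_a = (rng.random() < q) == urn_is_A
        # Naive counting: each earlier announcement is treated like a draw.
        votes_A = sum(guesses) + draw_is_a
        votes_B = len(guesses) + 1 - votes_A
        guess_A = votes_A > votes_B or (votes_A == votes_B and draw_is_a)
        if guess_A != draw_is_a:
            overridden = True       # earlier decisions outweighed the draw
        guesses.append(guess_A)
    return overridden

runs = 10_000
rate = sum(simulate_run(seed=i) for i in range(runs)) / runs
print(f"share of runs containing a cascade: {rate:.2f}")
```

The printed share can be compared with the 41 of 56 runs (roughly 73%) reported for the laboratory sessions; the exact figure produced by this sketch depends on the assumed number of participants and the tie-breaking rule.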
Quantitative description
A person's signal telling them to accept is denoted as "H" (a high signal, where high signifies he should accept), and a signal telling them not to accept is "L" (a low signal). The model assumes that when the correct decision is to accept, individuals will be more likely to see an "H", and conversely, when the correct decision is to reject, individuals are more likely to see an "L" signal. This is essentially a conditional probability – the probability of "H" when the correct action is to accept, or P[H|A]. Similarly P[L|R] is the probability that an agent gets an "L" signal when the correct action is reject. If these likelihoods are represented by q, then q > 0.5. This is summarized in the table below.[1]
| Agent signal | True state: Reject | True state: Accept |
|---|---|---|
| L | q | 1-q |
| H | 1-q | q |
The first agent determines whether or not to accept solely based on his own signal. As the model assumes that all agents act rationally, the action (accept or reject) the agent feels is more likely is the action he will choose to take. This decision can be explained using Bayes' rule. If the agent receives an "H" signal, the likelihood of accepting is obtained by calculating P[A|H]:

P[A|H] = P[H|A]·P[A] / (P[H|A]·P[A] + P[H|R]·P[R]) = qp / (qp + (1-q)(1-p)),

where p = P[A] denotes the prior probability that accepting is the correct action. Because q > 0.5, the right-hand side is greater than p, so the first agent, acting only on his private signal, will always increase his estimate of p with an "H" signal. Similarly, it can be shown that an agent will always decrease his expectation of p when he receives a low signal. Recall that, if the value, "V", of accepting is equal to the value of rejecting, an agent will accept if he believes p > 0.5, and reject otherwise. Because this agent started out with the assumption that both accepting and rejecting are equally viable options (p = 0.5), the observation of an "H" signal allows him to conclude that accepting is the rational choice.
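As a concrete illustration of this update, the short snippet below plugs in p = 0.5 and q = 0.7; these values are chosen here for illustration and are not taken from the text.

```python
# Bayes' rule for the first agent after a single "H" signal:
# P[A|H] = q*p / (q*p + (1-q)*(1-p)).
p = 0.5   # prior probability that accepting is correct (assumed value)
q = 0.7   # probability the signal matches the true state (assumed, q > 0.5)

posterior = q * p / (q * p + (1 - q) * (1 - p))
print(posterior)   # 0.7, which exceeds p, so the first agent accepts on "H"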
The second agent then considers both the first agent's decision and his own signal, again in a rational fashion. In general, the nth agent considers the decisions of the previous n-1 agents along with his own signal, and makes a Bayesian decision:

P[A|a,b] = p·q^a·(1-q)^b / (p·q^a·(1-q)^b + (1-p)·(1-q)^a·q^b),

where "a" is the number of accepts in the previous set plus the agent's own signal (if it is "H"), and "b" is the number of rejects. Thus, a + b = n. The decision is based on how the value on the right-hand side of the equation compares with p.[1]
Explicit model assumptions
The original model makes several assumptions about human behavior and the world in which humans act,[2] some of which are relaxed in later versions[1] or in alternate definitions of similar problems, such as the diffusion of innovations.
- Boundedly Rational Agents: The original Independent Cascade model assumes humans are boundedly rational[7] – that is, they will always make rational decisions based on the information they can observe, but the information they observe may not be complete or correct. In other words, agents do not have complete knowledge of the world around them (which would allow them to make the correct decision in any and all situations). In this way, there is a point at which, even if a person has correct knowledge of the idea or action cascading, they can be convinced via social pressures to adopt some alternate, incorrect view of the world.
- Incomplete Knowledge of Others: The original information cascade model assumes that agents have incomplete knowledge of the agents which precede them in the specified order. As opposed to definitions where agents have some knowledge of the "private information" held by previous agents, the current agent makes a decision based only on the observable action (whether or not to imitate) of those preceding him. It is important to note that the original creators argue this is a reason why information cascades can be caused by small shocks.
- Behavior of all previous agents is known
Resulting conditions
- Cascades will always occur – as discussed, in the simple model the likelihood of a cascade occurring increases towards 1 as the number of people making decisions increases towards infinity.
- Cascades can be incorrect – because agents make decisions with both bounded rationality and probabilistic knowledge of the initial truth (e.g. whether accepting or rejecting is the correct decision), the incorrect behavior may cascade through the system.
- Cascades can be based on little information – mathematically, a cascade of infinite length can occur based only on the decisions of two people. More generally, a small set of people who strongly promote an idea as being rational can rapidly influence a much larger subset of the general population.
- Cascades are fragile – because agents receive no extra information after the difference between a and b increases beyond 2, and because such differences can occur after only a small number of agents, agents who weigh the opinions of those making decisions based on actual information can be dissuaded from a choice rather easily. The original model[2] thus suggests that cascades are susceptible to the release of public information, and it also discusses this result in the context of the underlying value p changing over time, in which case a cascade can rapidly change course (see the sketch after this list).
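A short sketch of this fragility, under the same illustrative values p = 0.5 and q = 0.7 used above (assumed, not from the source): agents in a cascade are effectively acting on a lead of only two informative signals, so a small amount of contrary public information is enough to pull their beliefs back to the prior.

```python
def posterior_accept(a, b, p=0.5, q=0.7):
    """P[accept is correct | a 'H' and b 'L' informative signals]."""
    num = p * q ** a * (1 - q) ** b
    return num / (num + (1 - p) * (1 - q) ** a * q ** b)

# During an "accept" cascade, the only informative evidence is a lead of two
# high signals; later imitative actions add nothing.
print(posterior_accept(2, 0))   # about 0.84
# Publicly releasing two contrary ("L") signals cancels that lead exactly,
# returning beliefs to the prior of 0.5 and dissolving the cascade.
print(posterior_accept(2, 2))   # 0.5
```

Because the belief sustaining the cascade was never far from the prior, even a brief release of public information can undo it.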
Responding
A literature exists that examines how individuals or firms might respond to the existence of informational cascades when they have products to sell but buyers are unsure of the quality of those products. Curtis Taylor (1999)[8] shows that when selling a house the seller might wish to start with high prices, as failure to sell at low prices is indicative of low quality and might start a cascade of not buying, while failure to sell at high prices could be construed as meaning the house is merely over-priced, and prices can then be reduced to get a sale. Daniel Sgroi (2002)[9] shows that firms might use "guinea pigs" who are given the opportunity to buy early in order to kick-start an informational cascade through their early and public purchasing decisions, and work by David Gill and Daniel Sgroi (2008)[10] shows that early public tests can have a similar effect (in particular, passing a "tough test" that is biased against the seller can instigate a cascade all by itself). Bose et al.[11] have examined how prices set by a monopolist might evolve in the presence of potential cascade behavior when the monopolist and consumers are unsure of a product's quality.
Examples and fields of application
Information cascades occur in situations where seeing many people make the same choice provides evidence that outweighs one's own judgment. That is, one thinks: "It's more likely that I'm wrong than that all those other people are wrong. Therefore, I will do as they do."
In what has been termed a reputational cascade, late responders sometimes go along with the decisions of early responders, not just because the late responders think the early responders are right, but also because they perceive their reputation will be damaged if they dissent from the early responders.[12]
Market cascades
Information cascades have become one of the topics of behavioral economics, as they are often seen in financial markets, where they can feed speculation and create cumulative and excessive price moves, either for the whole market (as in a market bubble) or for a specific asset, such as a stock that becomes overly popular among investors.
Marketers also use the idea of cascades to attempt to get a buying cascade started for a new product. If they can induce an initial set of people to adopt the new product, then those who make purchasing decisions later on may also adopt the product even if it is no better than, or perhaps even worse than, competing products. This is most effective if these later consumers are able to observe the adoption decisions, but not how satisfied the early customers actually were with the choice. This is consistent with the idea that cascades arise naturally when people can see what others do but not what they know.[13]
Information cascades are usually considered by economists:
- as products of rational expectations at their start,
- as irrational herd behavior if they persist for too long, which signals that collective emotions also come into play to feed the cascade.
Social network analysis
Dotey et al.[14] state that information flows through social networks in the form of cascades. According to the authors, analysis of the virality of information cascades on a social network may lead to many useful applications, such as determining the most influential individuals within the network. This information can be used for maximizing market effectiveness or influencing public opinion. Various structural and temporal features of a network affect cascade virality.
In contrast to work on information cascades in social networks, the Social Influence Model of belief spread argues that people have some notion of the private beliefs of those in their network.[15] The social influence model, then, relaxes the assumption of information cascades that people act only on the observable actions taken by others. In addition, the social influence model focuses on embedding people within a social network, as opposed to a queue. Finally, the social influence model relaxes the assumption of the information cascade model that people either complete an action or do not, by allowing for a continuous scale of the "strength" of an agent's belief that an action should be completed.
Historical examples
- Small protests began in Leipzig, Germany in 1989 with just a handful of activists challenging the German Democratic Republic.[16] For almost a year, protesters met every Monday, growing by a few people each time.[16] By the time the government attempted to address it in September 1989, it was too big to quash.[16] In October, the number of protesters reached 100,000, and by the first Monday in November over 400,000 people marched through the streets of Leipzig. Two days later the Berlin Wall was dismantled.[16]
- The adoption rate of drought-resistant hybrid seed corn during the Great Depression and Dust Bowl was slow despite its significant improvement over the previously available seed corn. Researchers at Iowa State University were interested in understanding the public's hesitation to adopt this significantly improved technology. After conducting 259 interviews with farmers,[17] they observed that the slow rate of adoption was due to the farmers valuing the opinions of their friends and neighbors over the word of a salesman. See[18] for the original report.
Empirical studies
In addition to the examples above, the existence of information cascades has been demonstrated in several empirical studies. Perhaps the best example is the one described above.[6] Participants stood in a line behind an urn containing balls of different colors. Sequentially, each participant drew a ball out of the urn, looked at it, and then placed it back into the urn. The participant then announced, for the rest of the participants to hear, which color of ball (red or blue) he believed to be in the majority in the urn. Participants received a monetary reward for guessing correctly, giving them an incentive to reason rationally.
Other examples include:
- De Vany and Walls[19] create a statistical model of information cascades where an action is required. They apply this model to people's decisions to go see a movie that has recently come out in theatres, and validate it on box-office data, finding that revenues across different movies follow a Pareto distribution, consistent with the model.
- Walden and Browne also adopt the original information cascade model, recasting it as an operational model that is more practical for real-world studies and allows for analysis based on observed variables. Walden and Browne test their model on data about the adoption of new technologies by businesses, finding support for their hypothesis that information cascades play a role in this adoption.[20]
Legal aspects
The negative effects of informational cascades sometimes become a legal concern, and laws have been enacted to neutralize them. Ward Farnsworth, a law professor, analyzed the legal aspects of informational cascades and gave several examples in his book The Legal Analyst. In many military courts, the officers voting to decide a case vote in reverse rank order (the officer of the lowest rank votes first); he suggested this may be done so that lower-ranked officers are not tempted by the cascade to vote with the more senior officers, who are believed to have more accurate judgement. Another example is that countries such as Israel and France have laws that prohibit polls in the days or weeks before elections, to prevent informational cascades that may influence the election results.[21]
See also
- Asch conformity experiments
- Conformity
- Groupthink
- Group polarization
- Herd behavior
- Sheeple
- Social proof
- Woozle effect
- Other modelling approaches
References
1. Easley, David; Kleinberg, Jon (2010). Networks, Crowds, and Markets: Reasoning About a Highly Connected World. Cambridge University Press. pp. 483–506.
2. Bikhchandani, S.; Hirshleifer, D.; Welch, I. (1992). "A Theory of Fads, Fashion, Custom, and Cultural Change as Informational Cascades". Journal of Political Economy. 100 (5): 992–1026.
3. Shiller, R.J. (1995). "Conversation, Information and Herd Behavior". American Economic Review. 85 (2): 181–185.
4. Gruhl, Daniel; Guha, R.; Liben-Nowell, D.; Tomkins, A. (2004). "Information diffusion through blogspace". WWW: 491–501. doi:10.1145/988672.988739.
5. Sadikov, E.; Medina, M.; Leskovec, J.; Garcia-Molina, H. (2011). "Correcting for Missing Data in Information Cascades" (PDF). WSDM. Retrieved March 23, 2012.
6. Anderson, L.R.; Holt, C.A. (1997). "Information Cascades in the Laboratory". The American Economic Review. 87 (5): 847–862.
7. Newell, A.; Simon, H.A. (1972). Human Problem Solving. Englewood Cliffs, NJ: Prentice Hall.
8. Taylor, C. (1999). "Time-on-the-Market as a Sign of Quality". Review of Economic Studies. 66: 555–578.
9. Sgroi, D. (2002). "Optimizing Information in the Herd: Guinea Pigs, Profits and Welfare". Games and Economic Behavior. 39: 137–166. doi:10.1006/game.2001.0881.
10. Gill, D.; Sgroi, D. (2008). "Sequential Decisions with Tests". Games and Economic Behavior. 63: 663–678. doi:10.1016/j.geb.2006.07.004.
11. Bose, S.; Orosel, G.; Ottaviani, M.; Vesterlund, L. (2006). "Dynamic Monopoly Pricing and Herding". RAND Journal of Economics. 37: 910–928. doi:10.1111/j.1756-2171.2006.tb00063.x.
12. Lemieux, Pierre (2003). "Following the Herd". Regulation. Cato Institute: 21. Retrieved 14 July 2010.
13. http://research.ivo-welch.info/palgrave.pdf
14. Dotey, A.; Rom, H.; Vaca, C. (2011). Information Diffusion in Social Media. Stanford University.
15. Friedkin, N.E.; Johnsen, E.C. (2011). Social Influence Network Theory: A Sociological Examination of Small Group Dynamics. Cambridge University Press.
16. Shirky, Clay (2008). Here Comes Everybody: The Power of Organizing Without Organizations. New York: Penguin Press. pp. 161–164. ISBN 1-59420-153-6.
17. Carboneau, Clark. "Using Diffusion of Innovations and Academic Detailing to Spread Evidence-based Practices". Journal for Healthcare Quality. Retrieved 2008-11-11.
18. Beal, George M.; Bohlen, Joe M. "The Diffusion Process" (PDF). Iowa State University of Science and Technology of Ames, Iowa. Retrieved 2008-11-11.
19. De Vany, A.; Walls, D. (1999). "Uncertainty in the movie industry: does star power reduce the terror of the box office?". Journal of Cultural Economics. 23: 285–318. doi:10.1023/a:1007608125988.
20. Walden, Eric; Browne, Glenn (2002). "Information Cascades in the Adoption of New Technology". ICIS Proceedings.
21. Farnsworth, Ward (2007). The Legal Analyst: A Toolkit for Thinking about the Law. Chicago: University of Chicago Press. ISBN 0-226-23835-0.
External links
- Informational Cascades and Rational Herding: An Annotated Bibliography and Resource Reference
- A Bibliography of Information Cascades and Herd Effects
- How a Bubble Stayed Under the Radar, Robert Shiller NYT article, may require login.
- How the Low-Fat, Low-Fact Cascade Just Keeps Rolling Along, John Tierney October 9, 2007 NYT blog, does not require login.
- Schopenhauer on Cascades, John Tierney, October 10, 2007 NYT blog, does not require login.
- Is Justin Timberlake a Product of Cumulative Advantage? An informational cascade by another name; NYT article, may require login.
- Information Cascades in Magic