Stochastic game
From Wikipedia, the free encyclopedia
In game theory, a stochastic game is a dynamic, competitive game with probabilistic transitions played by one or more players. The game is played in a sequence of stages. At the beginning of each stage the game is in some state. The players select actions and each player receives a payoff that depends on the current state and the chosen actions. The game then moves to a new random state whose distribution depends on the previous state and the actions chosen by the players. The procedure is repeated at the new state and play continues for a finite or infinite number of stages. The total payoff to a player is often taken to be the discounted sum of the stage payoffs or the limit inferior of the averages of the stage payoffs.
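The stage dynamics described above can be sketched in a few lines. The sketch below is a toy two-state, two-action game with illustrative payoff and transition numbers (all names and values are hypothetical, not from any standard example); both players simply randomize uniformly, and player 1's discounted sum of stage payoffs is accumulated.

```python
import random

# Hypothetical two-state, two-action stochastic game.
# payoffs[state][(a1, a2)] = player 1's stage payoff for that joint action
payoffs = {
    0: {(0, 0): 2, (0, 1): 0, (1, 0): 1, (1, 1): 3},
    1: {(0, 0): 1, (0, 1): 4, (1, 0): 0, (1, 1): 2},
}
# trans[state][(a1, a2)] = probability that the next state is state 0
trans = {
    0: {(0, 0): 0.9, (0, 1): 0.5, (1, 0): 0.3, (1, 1): 0.1},
    1: {(0, 0): 0.6, (0, 1): 0.2, (1, 0): 0.8, (1, 1): 0.4},
}

def discounted_payoff(stages=1000, beta=0.95, seed=0):
    """Simulate uniformly random play; return player 1's discounted sum."""
    rng = random.Random(seed)
    state, total = 0, 0.0
    for t in range(stages):
        # both players choose actions (here: uniformly at random)
        joint = (rng.randrange(2), rng.randrange(2))
        # stage payoff depends on the current state and the joint action
        total += (beta ** t) * payoffs[state][joint]
        # the new state is random, with a distribution depending on the
        # previous state and the chosen actions
        state = 0 if rng.random() < trans[state][joint] else 1
    return total
```

Since stage payoffs here lie in [0, 4] and the discount factor is 0.95, the discounted sum is bounded by 4/(1 − 0.95) = 80, illustrating why the discounted criterion is well defined even over infinitely many stages.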
If there is a finite number of players and the action sets and the set of states are finite, then a stochastic game with a finite number of stages always has a Nash equilibrium. The same is true for a game with infinitely many stages if the total payoff is the discounted sum. Vieille has shown that all two-person stochastic games with finite state and action spaces have approximate Nash equilibria when the total payoff is the limit inferior of the averages of the stage payoffs. Whether such equilibria exist when there are more than two players is a challenging open question.
Stochastic games have applications in economics and evolutionary biology. They generalize repeated games, which correspond to the special case in which there is only one state.
Stochastic games were invented by Lloyd Shapley in the early 1950s. The most complete reference is the book of articles edited by Neyman and Sorin. The more elementary book of Filar and Vrieze provides a unified rigorous treatment of the theories of Markov Decision Processes and two-person stochastic games. They coin the term Competitive MDPs to encompass both 1- and 2-player stochastic games.
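In the one-player case a stochastic game reduces to a Markov Decision Process, for which the discounted value can be computed by standard value iteration. The following is a minimal sketch (the function name, interface, and the illustrative data in the usage note are assumptions for exposition, not taken from Filar and Vrieze).

```python
def value_iteration(n_states, n_actions, reward, trans, beta=0.9, tol=1e-8):
    """Discounted value of a finite MDP (one-player stochastic game).

    reward[s][a]    = stage payoff for action a in state s
    trans[s][a][s2] = probability of moving from s to s2 under action a
    """
    V = [0.0] * n_states
    while True:
        # Bellman update: best one-stage payoff plus discounted continuation
        V_new = [
            max(
                reward[s][a]
                + beta * sum(trans[s][a][s2] * V[s2] for s2 in range(n_states))
                for a in range(n_actions)
            )
            for s in range(n_states)
        ]
        if max(abs(x - y) for x, y in zip(V, V_new)) < tol:
            return V_new
        V = V_new
```

For example, with two states where action 1 moves deterministically to state 1 (which pays 2 forever under action 1), `value_iteration(2, 2, [[1, 0], [0, 2]], [[[1, 0], [0, 1]], [[1, 0], [0, 1]]])` converges to values 18 and 20. Shapley's original treatment of two-person zero-sum discounted games proceeds analogously, with the maximum over actions replaced by the value of a matrix game at each state.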
References
- L.S. Shapley. Stochastic games. Proc. Nat. Acad. Sci., 39:1095–1100, 1953.
- A. Condon. The complexity of stochastic games. Information and Computation, 96:203–224, 1992.
- J. Filar and K. Vrieze. Competitive Markov Decision Processes. Springer-Verlag, 1997.
- N. Vieille. Stochastic games: Recent results. In Handbook of Game Theory, pages 1833–1850. Elsevier Science, 2002.
- A. Neyman and S. Sorin, editors, Stochastic Games and Applications. Kluwer Academic Press, 2003.