Talk:Nash equilibrium

From Wikipedia, the free encyclopedia

This article is part of WikiProject Game theory, an attempt to improve, grow, and standardize Wikipedia's articles related to Game theory. We need your help!



This article has been rated as B-Class on the assessment scale.
This article is on a subject of top-importance within game theory.
Nash equilibrium was a good article, but it has been removed from the list. There are suggestions below for improving the article to meet the good article criteria. Once these are addressed, the article can be renominated. Editors may also seek a reassessment of the decision if they believe there was a mistake.

Delisted version: July 14, 2007

WikiProject Mathematics
This article is within the scope of WikiProject Mathematics, which collaborates on articles related to mathematics.
Mathematics rating: B-Class, High Priority. Field: Discrete mathematics
One of the 500 most frequently viewed mathematics articles.

Finding NEs

Shouldn't there be some discussion of, or references on, how to actually find Nash equilibria in particular types of games?

I mean:

  • 2-player games
    • zero-sum games -> linear programming
    • general-sum games -> linear complementarity problem (LCP)
  • k-player games -> ?

(that's how I got here. I'm looking for methods to find a NE in multiplayer games)
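For the two-player zero-sum case mentioned above, the linear program collapses to a closed form when the game is 2x2. Below is a minimal stdlib-only sketch (the function name and structure are mine, not from any library): it first looks for a saddle point and otherwise applies the standard indifference ("oddments") formulas.

```python
from fractions import Fraction

def solve_2x2_zero_sum(a, b, c, d):
    """Solve the zero-sum game whose row-player payoff matrix is [[a, b], [c, d]].

    Returns (p, value), where p is the equilibrium probability the row
    player puts on the first row and value is the value of the game.
    """
    M = [[a, b], [c, d]]
    # A saddle point (an entry minimal in its row and maximal in its
    # column) gives a pure-strategy equilibrium directly.
    for i in range(2):
        for j in range(2):
            if M[i][j] == min(M[i]) and M[i][j] == max(M[0][j], M[1][j]):
                return Fraction(1 - i), Fraction(M[i][j])
    # Otherwise the equilibrium is fully mixed: p equalizes the row
    # player's expected payoff against either of the opponent's columns.
    denom = a - b - c + d
    return Fraction(d - c, denom), Fraction(a * d - b * c, denom)
```

For matching pennies ([[1, -1], [-1, 1]]) this returns p = 1/2 with value 0. For larger zero-sum games the same idea becomes a genuine linear program, and general-sum games need the LCP machinery (e.g. Lemke-Howson) noted above.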


Continuous set?

The article says: "if the set of strategies by player i, is a compact and continuous set"

What the heck is a continuous set?

John Nash

So what did Nash do, besides defining this equilibrium? Are there any interesting theorems here? AxelBoldt

Well, what was interesting was mainly this (I'm typing this from memory, so don't quote me or put this into the article without checking with some up-to-date math nerd): for all games for which there was a previously known "solution" (for some definition of solution appropriate to that type of game), Nash proved that those existing solutions were Nash equilibria; and further, he showed that any reasonable definition of "solution" for any other type of game must be a subset of the Nash equilibria for that game. And finally, he showed how to find the Nash equilibria. So he made it much easier to solve all kinds of games--even those for which a definition of "solved" isn't clear--by reducing the problem to finding all the Nash equilibria and evaluating them. --LDC

Nash equilibria of a game

[Ed: This is an incomplete example that does not illustrate. If I choose 9 and you choose 0, I'm out two bucks. If I choose 9 and you choose 7, you make $9 and I still make $5 - why is this not superior?]

The system of strategies 9-0 is not a Nash equilibrium, because the first player can improve the outcome by choosing 0 instead of 9. The system of strategies 9-7 is not a Nash equilibrium, because the first player can improve the outcome by choosing 9 instead of 7. The only Nash equilibrium of this game is 0-0, as stated. AxelBoldt 15:04 Aug 27, 2002 (PDT)

Boy, I sure don't understand this. Why is the Nash equilibrium not 10-10? Don't the players both get ten bucks if they do this? And the worst that could happen is the other player chooses a smaller number and you get less than ten. Why would anyone rational do that, when you could both pick 10 and get 10 bucks? Is there a statement missing from the problem definition? Jdavidb 04:17, 8 Apr 2004 (UTC)

The second player can choose 9 and get 11 while the other one will get only 8. That is why 10-10 is not a Nash equilibrium. -- AM

I think the thing to keep in mind here is that it is a competition game. Both players are trying to beat the other's "score". This example would work better, I think, using points instead of dollars. Yes, there is a benefit to both players if they both win some money, but no such benefit if they both win some points while each is trying to beat the other's score. -truthandcoffee /ed: Does anyone else confirm/deny this? I would like to see this example changed from dollars to points, as I think it is much clearer and makes much more sense this way. I'd like some feedback before I go ahead and make the change. --Truthandcoffee 04:27, 13 November 2006 (UTC)
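Axel's claim can be checked mechanically. Reading the example under discussion as: each player names an integer from 0 to 10; on a tie both receive the named amount; otherwise the player naming the lower number n receives n + 2 and the other receives n - 2 (this payoff rule is my reconstruction from the comments above, not quoted from the article). A brute-force sketch then confirms that 0-0 is the only pure-strategy Nash equilibrium:

```python
def payoff(a, b):
    """Row player's payoff when she names a and the opponent names b.

    Tie: both get the named amount.  Otherwise the lower number n earns
    its chooser n + 2 and the other player n - 2.  (Payoff rule as
    reconstructed from the discussion above.)
    """
    if a == b:
        return a
    low = min(a, b)
    return low + 2 if a == low else low - 2

def pure_nash_equilibria():
    """All profiles (a, b) from which neither player gains by a
    unilateral deviation."""
    eqs = []
    for a in range(11):
        for b in range(11):
            row_ok = all(payoff(a, b) >= payoff(a2, b) for a2 in range(11))
            col_ok = all(payoff(b, a) >= payoff(b2, a) for b2 in range(11))
            if row_ok and col_ok:
                eqs.append((a, b))
    return eqs
```

In particular, 10-10 fails the check because naming 9 against 10 pays 11 rather than 10, exactly as AM says.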

The modified version with 11 Nash equilibria

If the game is modified so that the two players win the named amount if they both choose the same number, and otherwise win nothing, then there are 11 Nash equilibria.

Does the Nash equilibrium not suppose that everybody is selfish and rational? So, if you suppose that the other player is selfish, you can safely assume that he/she chooses 10. IMHO that's the only equilibrium. Or does the Nash equilibrium not presuppose selfishness? Andy 12:53, 26 September 2006 (UTC)
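A quick enumeration (a sketch under the modified rule as stated: matching names pay each player that amount, mismatched names pay nothing) bears out the count of 11. Note that 0-0 also qualifies: a unilateral deviation from it still pays 0, so no player strictly gains, which is all the (weak) Nash condition requires. That is why 10-10, although the obvious focal point, is not the only equilibrium.

```python
def payoff(a, b):
    """A player's payoff when she names a and the opponent names b:
    both get the named amount on a match, nothing otherwise."""
    return a if a == b else 0

def pure_nash_equilibria():
    """All profiles (a, b) from which neither player gains by a
    unilateral deviation."""
    eqs = []
    for a in range(11):
        for b in range(11):
            row_ok = all(payoff(a, b) >= payoff(a2, b) for a2 in range(11))
            col_ok = all(payoff(b, a) >= payoff(b2, a) for b2 in range(11))
            if row_ok and col_ok:
                eqs.append((a, b))
    return eqs
```

The check returns exactly the 11 diagonal profiles (0,0), (1,1), ..., (10,10).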

Limitations of NE

I disagree with the comment "This indicates one of the limitations of using the Nash equilibrium to analyze a game".

The Nash equilibrium is a predictive tool, and indeed it correctly predicts the (unfortunate) result if self-interested players participate in a Prisoner's dilemma type situation (as borne out in reality, for instance, in the overfishing of the world's oceans).

The fact that Nash equilibrium correctly predicts an undesirable result is hardly a flaw or limitation.

Robbrown

I agree with Robbrown; that is not a limitation, since it correctly predicts the end outcome of a prisoner's dilemma. I will change it. Also, there are better definitions of the Nash equilibrium out there; it might be better to quote a source's definition, like this one (http://www.gametheory.net/Dictionary/NashEquilibrium.html):

Nash equilibrium, named after John Nash, is a set of strategies, one for each player, 
such that no player has incentive to unilaterally change her action. Players are in 
equilibrium if a change in strategies by any one of them would lead that player to earn 
less than if she remained with her current strategy.

--ShaunMacPherson 18:15, 15 Mar 2004 (UTC)

Nash equilibrium available online

The seminal journal paper in which Nash introduces what is now called the Nash equilibrium is "Non-Cooperative Games", John Nash, The Annals of Mathematics 54(2):286-295, 1951. It is available online for a fee at http://links.jstor.org/sici?sici=0003-486X%28195109%292%3A54%3A2%3C286%3ANG%3E2.0.CO%3B2-G (many universities subscribe to JSTOR, so this link should work for at least some people besides me)

I'm not posting that directly to the page since I'm not sure whether it's OK to post links to for-pay resources. If it is OK, then please copy this to the article page.

--JP, Nov 10 2005

Well, I just did that yesterday, before reading this note. I will remove what I did right away. The article is available for a fee, but many universities and colleges provide free access to this and other for-fee services for their students.

--Zsolt, June 7, 2006

On the other hand, referencing the article itself, without a link to the for-fee online article is probably ok. So I just took out the link to that on-line version. This way people can still find it if they want to.

--Zsolt, June 7, 2006

Removed sentence

I reverted an edit by an anon, which added this sentence to one section:

"choosing the best strategy given the strategies that others have chosen"

The sentence was out of context and incomplete. If the anon would like to add it in context I'm sure it would be helpful. --Kzollman 17:59, May 11, 2005 (UTC)

Coordination game

As it turns out the entry Coordination game redirects here. Given the wide discussion of coordination games, I think it deserves its own entry. Would folks mind if I seeded the entry with the material here, and removed the redirect? thanks! -Kzollman 23:43, Jun 2, 2005 (UTC)

Presently, nothing links to Coordination game other than daughters of Nash Equilibrium, and both of these actually give its pay-off matrix (Mixed strategy, Pure strategy). Therefore, there can't be a problem expanding that article from a redirect.
Cheers, Wragge 00:56, 2005 Jun 3 (UTC)
Done! best --Kzollman 00:48, Jun 8, 2005 (UTC)

Fixed points

In my game theory class and textbooks we proved the existence of the Nash equilibrium using the Kakutani fixed point theorem, a generalization of the Brouwer fixed point theorem. Does anyone smarter than me know if Brouwer's is strong enough to prove the existence (as stated in the article)? --Kzollman 00:48, Jun 8, 2005 (UTC)

Okay, I fixed the proof. --best, kevin ···Kzollman | Talk··· 06:03, August 2, 2005 (UTC)


correlated equilibrium more flexible than Nash equilibrium?

All the examples in the current Nash equilibrium article seem to be "one-shot" games (is there a better term?) -- as opposed to repeated games.

Strategies such as Tit for Tat don't work for "one-shot" games.

The Robert Aumann article mentions

Aumann's greatest contribution was in the realm of repeated games, which are situations in which players encounter the same situation over and over again.
Aumann was the first to define the concept of correlated equilibrium in game theory, which is a type of equilibrium in non-cooperative games that is more flexible than the classical Nash Equilibrium.

Does that mean that

Nash equilibrium only applies to one-shot games.
correlated equilibrium is used for repeated games.

? If that's true, the article should mention it.

--DavidCary 13:40, 11 October 2005 (UTC)


David, I don't really know much about correlated equilibrium (which is why I haven't written the article). From what I understand, those contributions are two different things. Correlated equilibria are generalizations of Nash equilibria (i.e. every Nash eq. is a correlated eq. but there are some correlated eq. that are not Nash). I don't know what his contributions to repeated games are. --best, kevin ···Kzollman | Talk··· 18:34, 11 October 2005 (UTC)

Nash Equilibria in a payoff matrix

Under the heading specified above, it claims that "...an NxN matrix may have 0 or N Nash Equilibriums." Shouldn't it be 1 or N NE, since at the beginning of the article it says that Nash proved the existence of equilibria for any finite game with any number of players?

No. For example the matrix [ 0,1 0,1 ][ 1,0 1,0 ] has no equilibrium. I assume the proof doesn't apply to that matrix because it's not a matrix representing a finite game (but that's just a guess).
I do think the sentence is wrong on the other side, though, and it should be 0 to NxN equilibria (note also that it's 0 TO N in the original, not 0 OR N) for the degenerate case of a matrix where all the cells have the same value. Hirudo 06:29, 31 March 2006 (UTC)
Every finite game has a Nash equilibrium in its mixed extension. Some games don't have pure-strategy Nash equilibria, and I assume that is what the article means. I will fix it. --best, kevin [kzollman][talk] 07:23, 31 March 2006 (UTC)
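Kzollman's point (every finite game has an equilibrium in mixed strategies, but possibly none in pure strategies) can be illustrated by counting pure-strategy equilibria directly. A sketch with a generic bimatrix counter (the helper is mine, not from a library): matching pennies has zero pure equilibria, while a degenerate game with identical payoffs everywhere has one in every cell, so the pure count for an NxN game really can range from 0 to NxN.

```python
def pure_ne_count(R, C):
    """Count pure-strategy Nash equilibria of a bimatrix game, where
    R[i][j] and C[i][j] are the row and column players' payoffs."""
    n, m = len(R), len(R[0])
    count = 0
    for i in range(n):
        for j in range(m):
            row_best = all(R[i][j] >= R[i2][j] for i2 in range(n))
            col_best = all(C[i][j] >= C[i][j2] for j2 in range(m))
            count += row_best and col_best
    return count

# Matching pennies: no cell is simultaneously best for both players.
pennies_R = [[1, -1], [-1, 1]]
pennies_C = [[-1, 1], [1, -1]]

# Degenerate game: every payoff identical, so every cell is an equilibrium.
flat = [[0, 0], [0, 0]]
```

Here `pure_ne_count(pennies_R, pennies_C)` is 0 and `pure_ne_count(flat, flat)` is 4, the two extremes for a 2x2 game.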

Depth

This section provides a rule for finding NEs. It fails to discuss either the rule's history - who discovered it and when - or why (the reader should believe) it works. While there's a place for pragmatic rules in an encyclopedia, the article should at least indicate where its theoretical underpinnings may be found. yoyo 23:09, 24 December 2006 (UTC)


Strategy profile

On this page, the phrase

resulting in strategy profile x = (x1,..,xn)

indicates that a strategy profile is simply a vector of strategies, one for each player. I agree. However, if one follows the link, a strategy profile is defined as something that "identifices, describes, and lastly examines a player's chosen strategy". It is conceivable that someone somewhere defined strategy profile in this way, but the "vector" meaning of the term is much more common. So at least the link, if not the linked page itself, is misleading.

Also there is something slightly wrong in the notation concerning strategy profiles on this page itself. You say that

S is the set of strategy profiles.

Okay, but then each element of S is a vector of strategies. Still, you write

x_i \in S

where xi is a single strategy. I think you want to write

S = S_1 \times S_2 \times ... \times S_n is the set of strategy profiles

and

x_i \in S_i.

Bromille 11:50, 31 May 2006 (UTC)
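Collecting Bromille's corrections in one place, the definition would read (a sketch of the intended notation, with f_i the payoff functions as used on the page):

```latex
% Sketch of the corrected notation:
S = S_1 \times S_2 \times \dots \times S_n
    \quad \text{(the set of strategy profiles)},
\qquad x = (x_1, \dots, x_n) \in S, \quad x_i \in S_i .
% x^* is a Nash equilibrium when no player i gains by deviating alone:
f_i(x^*) \;\geq\; f_i(x^*_1, \dots, x^*_{i-1},\, x_i,\, x^*_{i+1}, \dots, x^*_n)
\qquad \forall i,\ \forall x_i \in S_i .
```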

The strategy (game theory) article really needs some fixing. I reverted Littlebear1227's definition of strategy profile to the more correct one... Pete.Hurd 13:16, 31 May 2006 (UTC)


"Occurrence" section

There are some issues in the "Occurrence" section.

First, whatever the stated conditions are, they can only be sufficient, not necessary for equilibrium play. To see this, observe that nothing prevents a bunch of completely non-rational agents from playing the Nash equilibrium, not because it is a rational thing to do, but because non-rational agents can do anything they want.

Second, conditions 1. and 5. together say that agents are rational and they all believe that the other agents are also rational. But these are not sufficient conditions on rationality for equilibrium play. A classical counterexample is Rosenthal's centipede, which has a unique Nash equilibrium as assumed. In this game, if you and I play the game and I believe you are rational and you believe I am rational, we still might not play the equilibrium, if, for instance, I do not believe that you believe that I am rational. This could be the case, even if I do believe that you are rational. What is needed to ensure that the equilibrium is played in the case of Rosenthal's centipede is common knowledge of rationality (CKR), which means that

A: I am rational, B: You are rational, C: I believe B, D: You believe A, E: I believe D, F: You believe C, G: I believe F, H: You believe E, etc., etc.

The difference may seem like splitting hairs but is actually the key to understanding the discrepancy between how Rosenthal's centipede is actually played by rational-seeming people and the equilibrium play.

Reference: Robert Aumann, Backward induction and common knowledge of rationality, Games and Economic Behavior 8 (1995), p. 6--19.

Third, for general games, even if CKR is assumed, this usually only ensures that a correlated equilibrium is played, not a Nash equilibrium. The fact that a unique Nash equilibrium is assumed may make this ok - I'm not sure about this.

Bromille 13:54, 1 June 2006 (UTC)

Bromille, this is definitely right. I have intended for some time to rewrite this section to incorporate some of these issues. But you are welcome to beat me to it!  :) We do have a page on common knowledge (logic) which explains some issues about common knowledge. --best, kevin [kzollman][talk] 17:51, 1 June 2006 (UTC)

GA Re-Review and In-line citations

Members of the Wikipedia:WikiProject Good articles are in the process of doing a re-review of current Good Article listings to ensure compliance with the standards of the Good Article Criteria. (Discussion of the changes and re-review can be found here). A significant change to the GA criteria is the mandatory use of some sort of in-line citation (in accordance with WP:CITE) to be used in order for an article to pass the verification and reference criteria. Currently this article does not include in-line citations. It is recommended that the article's editors take a look at the inclusion of in-line citations as well as how the article stacks up against the rest of the Good Article criteria. GA reviewers will give you at least a week's time from the date of this notice to work on the in-line citations before doing a full re-review and deciding if the article still merits being considered a Good Article or would need to be de-listed. If you have any questions, please don't hesitate to contact us on the Good Article project talk page or you may contact me personally. On behalf of the Good Articles Project, I want to thank you for all the time and effort that you have put into working on this article and improving the overall quality of the Wikipedia project. Agne 05:49, 26 September 2006 (UTC)

Relationship to regulation?

I am curious about the relationship between NE and government regulation. Consider the case of airbags in cars. Without regulation requiring airbags, normal supply and demand will dictate the number of airbags produced. If air bags are mandated by the government, the unit cost will be much lower, and more consumers will choose to purchase cars with airbags and receive the consumer surplus from the purchase. The consumer was able to move to a better outcome through the intervention of a third party. In this case it is hard to say how the auto manufacturer is impacted. It is possible that they benefited because they can now offer new cars with more value at a lower cost than without regulation. More people buy a new car vs. a used car to benefit from the air bag. Measuring whether there was a benefit to the auto company is impossible, but let's assume there is. Did the regulation help both parties move out of a NE to a better outcome? 65.198.133.254 20:36, 30 November 2006 (UTC)David Wilson

Stability of strategy

Article states, in the "Stability" section: "Note that stability of the equilibrium is related to, but distinct from, stability of a strategy."

I thought it should be relatively easy to track down the meaning of "stability of a strategy". Unfortunately, a visit to the Stability disambiguation page and the linked Stability (probability) page have left me none the wiser.

So, unless the reader can find some meaning in this phrase, I suggest that leaving the quoted sentence in the article is both unhelpful and potentially discouraging. Should we remove it? yoyo 23:02, 24 December 2006 (UTC)

I think that the current section on stability is rather weak. In general, we know that in a mixed-strategy NE, all actions played with a strictly positive probability have the same expected payoff (i.e. if they were played with probability 1 they would yield the same payoff as the equilibrium mixed strategy). Hence if more than one strategy is played in equilibrium it must be unstable in sense 2. Only pure-strategy NE can be strict equilibria. Zosimos101 (talk) 17:41, 25 April 2008 (UTC)
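The indifference property Zosimos101 cites is easy to check concretely. A sketch for matching pennies (textbook payoffs; the column player's equilibrium mix is (1/2, 1/2)): both of the row player's pure actions lie in the support and earn the same expected payoff against the opponent's equilibrium strategy.

```python
# Row player's payoff matrix for matching pennies.
R = [[1, -1], [-1, 1]]

# Column player's equilibrium mixed strategy.
q = [0.5, 0.5]

# Expected payoff of each of the row player's pure actions against q.
expected = [sum(R[i][j] * q[j] for j in range(2)) for i in range(2)]
```

Both entries come out equal (here 0), as the indifference argument requires; a strict equilibrium is therefore impossible whenever the support contains more than one action.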

Notation Question

I have seen in some papers the notation x_{-i} used to describe the set of strategies for all players other than i. This cleans up notation greatly; for example,

f_i(x^*) \geq f_i(x^*_{1},...,x^*_{i-1},x_i, x^*_{i+1},...,x^*_n) \forall i \forall x_{i} \in S_{i}. (where S_{i} is the set of all admissible strategies for player i)

becomes

f_{i}(x^*_{i},x^*_{-i})\geq f_{i}(x_{i},x^*_{-i}) \forall i \forall x_{i} \in S_{i}.

This seems more compact and conveys the point more clearly. SanitySolipsism 22:27, 6 January 2007 (UTC)

Collaboration of the Month

OK, what to do? I put this on my watchlist now. ~ trialsanderrors 17:20, 6 March 2007 (UTC)

Here are my vague thoughts. Feel free to disagree or ignore them as you like.
  1. The introductory examples (which oddly come after the formal proof sketch) are overly complex and a bit scattered. I would rather have one simple example that makes the idea clear, and then perhaps one or two more complex examples which help to remove potential misunderstandings (like a NE which is weakly dominated). I started this once, at User:Kzollman/Nash equilibrium, but I'm not sure I'm happy with it.
  2. The stability section seems a bit imprecise, but a good idea. Perhaps the major ideas could be made more clear with a discussion of a dynamics, or ESS?
  3. I think a discussion of the interpretation of NE is needed. What exactly is NE used for? I recall a game theorist here rattling off a bunch of different interpretations for NE (like fixed point in a dynamics of strategy change, prediction for initial play, prediction for stability but not initial play, etc.). In particular a discussion of its empirical shortcomings would be useful. Something that mimics the stuff in game theory.
  4. The "occurrence" section needs tightening up, and could be more specific. I know Aumann has a paper about the epistemic conditions for NE. I have it, but probably won't read it until I do so for a class next quarter. If someone else knows this stuff, I think it would be good to have it.
  5. Some discussion of major weakenings/refinements would maybe be nice. But perhaps this would duplicate what solution concept ought to be... Almost certainly we should discuss the refinements that share the name like strict NE, symmetric NE, whatever.
  6. Maybe something about the precursors to Nash. Cournot, for example.
Phew... That list ended up being bigger than I thought. What do you think? --best, kevin [kzollman][talk] 05:58, 7 March 2007 (UTC)
I agree mostly. What I noticed first is that the article is an A7 candidate because it never bothers to tell us why the Nash equilibrium is important. So a historical relevance/refinements section should be somewhere at the beginning. The proof sketch should move further down; it's not really deletion material, but it's also not the most encyclopedically relevant aspect. On the examples section, I think examples with one, two and no NE's are necessary, and then maybe an example that's not immediately intuitive (maybe an extensive form game?). Also, the relationship of NE to dominance, minimax and Pareto optimality isn't made clear. Lots of work... ~ trialsanderrors 08:51, 9 March 2007 (UTC)
Okay, sorry I haven't had much time recently. Given the amount of information that needs to be added, I suggest maybe we put together an expected Table of Contents here before adding a bunch of stuff to the article. So here's what I think:
  1. Definition
    1. Intuitive
    2. Mathematical
    3. Mixed vs. pure strategy NE
  2. Examples
    1. Simple strict dominance example (Prisoner's dilemma?)
    2. Game with Multiple PSNE (stag hunt/battle of the sexes?)
    3. Game with no PSNE (matching pennies/rock paper scissors?)
  3. History
    1. Cournot and John Nash
  4. Uses for NE
    1. Normative
    2. Descriptive
      1. Epistemic requirements for NE
  5. Refinements
    1. Strict
    2. Stable
    3. Maybe some others
  6. Proof
How does this sound? --best, kevin [kzollman][talk] 22:18, 15 March 2007 (UTC)
Yeah, I thought about that last night too. I'd say the Cournot-Nash history can stay where it is; there isn't really much more to be said about it, other than that the structure looks better than what we currently have. Of course the lead should make a mention why the concept is important to game theory. ~ trialsanderrors 03:30, 17 March 2007 (UTC)
Also, PD is probably not a good example of a simple NE since it's also a dominant-strategy equilibrium. It might be useful to find an example without dominant strategies. ~ trialsanderrors 03:39, 17 March 2007 (UTC)

Samuel Goldstein

Does anyone know anything about this? How about what journal the article is in, or anything? I couldn't find anything; the IP of the anon who added the information is at U of Toronto, and another, eerily similar IP address claims at User talk:128.100.53.169 to be familiar with Goldstein, although not to think that it belongs in the lead. As it's unverifiable, I'm still for removing the mention, but loath to do it without mentioning it here, per WP:1RR. Smmurphy(Talk) 17:55, 14 March 2007 (UTC)

Yeah, looks zippable to me. If the anon editor reads this: 1. We need the name of the journal the article appears in, and 2. an independent source that makes the claim that Nash stole it from Goldstein. ~ trialsanderrors 19:01, 14 March 2007 (UTC)

confusion

This article needs an example that makes more sense. I don't really understand this concept.

Citation style

Hello all - The article currently uses two inconsistent citation styles (APA and wiki-ref style). We need to be consistent. I personally prefer APA inline citations, but I know wikipedia generally seems to be leaning toward the footnote style. Anybody have any druthers? --best, kevin [kzollman][talk] 23:19, 29 April 2007 (UTC)

Good article delist?

This is a notification of an intention to delist this article as a good article. In February this year at Wikipedia_talk:WikiProject_Game_theory#CotM, the consensus was that this article is B-Class (at best) from the point of view of WikiProject Game theory. I agree with this assessment, and last month I also rated it as B-Class from the point of view of WikiProject Mathematics. The article has not improved since these assessments, despite being Game theory collaboration of the month for two months.

The lead is inadequate as a summary of the article. The article is missing a history section, which is surely vital in this case (and is partly covered in the lead). The accessibility of the article could be much improved, the references are poor, and citation is both inadequate and inconsistent. I will attempt to improve the article myself over the next day or two, but I don't think I will be able to raise the article to GA standard myself. I hope others will contribute to ensuring that this article meets the criteria, otherwise I will have to delist it. An alternative would be to take the article to Good article review and any editor is welcome to do that. Geometry guy 21:28, 12 July 2007 (UTC)

GA review (see here for criteria)
  1. It is reasonably well written.
    a (prose): b (MoS):
  2. It is factually accurate and verifiable.
    a (references): b (citations to reliable sources): c (OR):
  3. It is broad in its coverage.
    a (major aspects): b (focused):
  4. It follows the neutral point of view policy.
    a (fair representation): b (all significant views):
  5. It is stable.
  6. It contains images, where possible, to illustrate the topic.
    a (tagged and captioned): b (lack of images does not in itself exclude GA): c (non-free images have fair use rationales):
  7. Overall:
    a Pass/Fail:


I think there is consensus that this is not yet a good article. The lead is weak, the prose poor. Citation is extremely weak. Improve and renominate: good luck! Geometry guy 20:35, 14 July 2007 (UTC)

Why does the term 'governing dynamics' redirect here?

Governing dynamics, i.e. the way an object or phenomenon will occur or behave under a given set of circumstances, is not Nash's idea, so why, when one types the phrase into the search box, does one get diverted to his page? It's ludicrous.
Governing dynamics is a centuries-old theory which simply refers to the way an object or phenomenon, for example, will behave given a set of prevailing circumstances. If anything it is a philosophical rationale more than a scientific theory. By understanding the laws which govern said object in said circumstances, one is essentially able to understand the true nature of all matter and occurrence, and abstracts are erased. The principle of governing dynamics serves to ground an object or occurrence in its pure and absolute state because it is essentially a malleable law, under which nothing is fixed nor finite; that is to say, X will behave as Y under Z, but that does not imply X will always behave as Y, because X is not fixed and will not always be under condition Z. Under condition V, X will behave as A and not Y, because it will reflect the new conditions and parameters it finds itself operating in. Thus, by understanding the laws, the governing dynamics, of a phenomenon, one is ultimately able to understand the true and original nature of the object or phenomenon under inspection, because one essentially has a series or set of laws which encompass all possible existence, and not simply a single law under which an object or phenomenon is expected to eternally function.
I therefore suggest that, for accuracy, these two pages be split and a separate page developed which specifically deals with the Law of Governing Dynamics in its historical and philosophical contexts.

Andrew Woollock 14:08, 13 July 2007 (UTC)
Feel free to create a page at Governing dynamics to replace the redirect. --best, kevin [kzollman][talk] 15:12, 13 July 2007 (UTC)

Nontechnical definition

I hope the specialists among us don't mind my attempt at defining Nash equilibrium in a lay person's language. If you are tempted to quibble with my definition, just remember how awfully vague the (non)explanation of Nash equilibrium was in that bar scene in the film of A Beautiful Mind. Such a fundamental and elegant idea deserves to be explained in words everybody can understand. --Rinconsoleao 19:46, 16 July 2007 (UTC)

Math Typo

I have added a comma after the \forall i. 194.223.231.3 15:26, 17 July 2007 (UTC)

Maybe I just don't get this, but...

I think it is weird that the proportion between the players' payoffs is not taken into account. Surely I would, in a two-player game, prefer weakening both myself and my opponent if I weakened him more. Consequently, I would, from the situation A,B (25,40), being player one, change to C (15,5) to force the opponent into also doing C (10,10). That is a truly stable position, in which no player can earn anything by changing strategy. Isn't that actually the only one of the three "stable" combinations that satisfies both stability criteria? —Preceding unsigned comment added by 77.40.128.194 (talk) 21:25, 15 January 2008 (UTC)

Well, this is one common problem people have when looking at normal-form game representations: the situation is given for the game at hand, where each player maximizes his or her payoff. Payoff doesn't have to mean money; it could mean pleasure or whatever as well. So if someone is happy with the counter-player not getting that much either, then that should be counted into the payoff (e.g. a 25,40 would be a 0,40 if the player does not want the other player to make 40), but it is not something that should be factored in when calculating a N.E. In other words, the example on the page does not necessarily mean the two players are competing to win a game (let's say they are both businessmen selling products on the same market and getting big profits without giving a damn how much the other one makes). Gillis (talk) 23:29, 15 January 2008 (UTC)

Strong/Weak explanation

I've noticed that although the article makes use of the terms strong & weak Nash equilibria, they are not defined anywhere. -- ricmitch 08:17, 21 April 2008 (UTC)

Check Gillis (talk) 18:37, 21 April 2008 (UTC)

History of Nash equilibrium. I just changed this section. I can develop it a bit more and add some discussion. The concept of a Nash equilibrium in pure strategies (to use the current name) was clearly in use in oligopoly theory in the 19th century and was well known: see the writings of Edgeworth, Hotelling, and Stackelberg, to name a few of the better-known names. I do not know for sure, but it is quite possible that Oskar Morgenstern did not know much about this, since it is left out of the Theory of Games and Economic Behavior (as far as I remember it). Von Neumann was primarily interested in card games and developed the idea of a mixed strategy (bluffing) in this context. In the Theory of Games, he showed that all zero-sum games possess a mixed-strategy equilibrium. The modern form of the Nash equilibrium (where the strategies are in general mixed) was the contribution of Nash: he extended von Neumann's concept to all games with a finite number of actions. There have of course been various extensions since then: in particular, Dasgupta and Maskin provided an existence theorem for the non-finite case (when the strategic variables, such as price, are continuous) in their 1987 Review of Economic Studies articles, and so on, but much of this is secondary. Zosimos101 (talk) 18:04, 25 April 2008 (UTC)

Pareto optimality example

Oh well, the bank run might be as good as well... feel free to revert me if you think so... but i think a cartel is something more people are prone to be familiar with than a bank run?... and in the cartel example all players get a boost in profits wheras in a bank-run only some get more than theit share of the banks assets value, Gillis (talk) 21:38, 31 May 2008 (UTC)