Talk:Tit for tat
I changed the 1984 date that had previously been noted for Axelrod's competition. It was actually two competitions, and at least one of them must have been held before Axelrod published the original paper in the journal Science in 1981. There appear to be good references in the Wikipedia article "Evolution of Cooperation", but I don't know how to make a link.
I'm not exactly sure why the fourth condition applies, so if anyone finds an explanation, please post.
I think the fourth condition is necessary because the retaliation only works if the agent plays the same opponent twice in a row.
Herman - hke@home.nl
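A rough sketch of the point above (in Python; the payoff values and the function are illustrative assumptions, not taken from the article): against an unconditional defector, tit for tat's retaliation only matters if the two agents keep meeting, since in a one-shot encounter the defection goes unpunished.

# Assumed standard payoffs: mutual cooperation 3/3, mutual defection 1/1,
# lone defector 5, lone cooperator 0.
PAYOFF = {('C', 'C'): (3, 3), ('C', 'D'): (0, 5),
          ('D', 'C'): (5, 0), ('D', 'D'): (1, 1)}

def tft_vs_always_defect(rounds):
    tft_score = defector_score = 0
    tft_move = 'C'                      # tit for tat opens by cooperating
    for _ in range(rounds):
        p_tft, p_def = PAYOFF[(tft_move, 'D')]
        tft_score += p_tft
        defector_score += p_def
        tft_move = 'D'                  # retaliate: copy the opponent's defection
    return tft_score, defector_score

print(tft_vs_always_defect(1))    # (0, 5): one meeting, so the defection is never punished
print(tft_vs_always_defect(10))   # (9, 14): the defector's lead stays at the 5 points won in round one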
[edit] "Downfall"
I think this section either needs to be removed or renamed/rewritten. I can only interpret the existence of this section as a misunderstanding of the workings of IPD and Tit for Tat.
The section says that Tit for Tat was "beaten for the first time" in 2004, which would be incorrect. The success of a given strategy competing in the IPD is entirely dependent on the environment. Robert Axelrod outlined this in his book The Evolution of Cooperation, where he also tested a population in which Tit for Tat didn't win. The situation in that population was similar to the one in the "Downfall" section - one category of strategies was induced to provide many points to another (although they were not deliberately designed for this purpose), which led to Tit for Tat losing the top position.
The lasting point is that Tit for Tat is robust in a wide variety of environments - including those where the populations of the strategies are allowed to increase or decline over a number of generations according to their scores. /dkristoffersson
The naive interpretation of "winning" doesn't include the sort of "winning" that resulted in one member outscoring Tit for Tat - the article should be more clear about the precise definition of "winning" used in this case.
Requested move
The term "Tat" in this article shouldn't be capitalized. According to the naming convention:
- Do not capitalize second and subsequent words unless the title is a proper noun (such as a name) or is otherwise almost always capitalized (for example: John Wayne, but not Computer game).
--goethean ॐ 19:41, 3 October 2005 (UTC)
- Add *Support or *Oppose followed by an optional one sentence explanation, then sign your vote with ~~~~
- Oppose Tit for Tat is a proper noun, it is the name of a specific strategy. --best, kevin ···Kzollman | Talk··· 01:11, 4 October 2005 (UTC)
Discussion
- Add any additional comments
- Support for the reason given. (And because "a specific strategy" does not make a proper noun as according to our comma-splicer.) Stephan Leeds 06:00, 6 October 2005 (UTC)
Decision
Page moved per request and double redirects fixed. Ryan Norton T | @ | C 07:03, 17 October 2005 (UTC)
What tournament?
What tournament is this article referring to? It just mentions a tournament without any explanation. - idiotoff 07:57, 24 April 2006 (UTC)
temporal confusion
In the introductory paragraph, "It was first introduced by Anatol Rapoport in Robert Axelrod's 1984 tournament." In the fourth paragraph of the Overview section, "For several decades Tit-for-Tat was the most effective strategy..."
So... am I missing a few decades?
--Theory.Of.Eli 20:59, 26 April 2006 (UTC)
- I removed the decades and reworded the sentence. It is not perfect, but now more accurate. Marc Harper 14:31, 11 August 2006 (UTC)
Practical Applications
Is there any room on this page for practical applications of Tit for tat? Examples: BitTorrent, stop lights. Swerty 20:05, 2 May 2006 (UTC)
- or a lot of cooperation in nature (between unrelated organisms). Yeah examples are good, I came to this article to see if it mentioned bittorrent. BrokenSegue 03:32, 8 August 2006 (UTC)
What about Tit for two tats
Tit for two tats is sometimes (but not always) more successful than tit for tat. It certainly deserves a mention.
It retaliates with defection only after two consecutive defections by the opponent. This avoids some lock-ins to the mutual-retaliation loop.
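A minimal sketch of the difference between the two strategies (illustrative Python; the function names and the convention True = cooperate, False = defect are assumptions, not from the article):

def tit_for_tat(opponent_history):
    # Cooperate on the first move, then copy the opponent's previous move.
    if not opponent_history:
        return True
    return opponent_history[-1]

def tit_for_two_tats(opponent_history):
    # Cooperate unless the opponent defected on both of the last two moves.
    if len(opponent_history) < 2:
        return True
    return opponent_history[-1] or opponent_history[-2]

Because tit for two tats tolerates a single (possibly accidental) defection, it can avoid the echoing retaliation that two tit for tat players fall into after one noisy move.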
Logical Fallacy
The article states that "A fifth condition applies to make the competition meaningful: if an agent knows that the next play will be the last, it should naturally defect for a higher score." There is a logical fallacy in the reasoning that defection is best for the last move. While it is true that choosing to defect always gains a higher score whether the other player cooperates or defects, this is a simplistic view. Generally, both players cooperating is more beneficial for both sides than both defecting. So, if both defect because they are aware it is the last play, then they both get fewer points than they would if they had both cooperated.
- There is no fallacy. My best option is to defect if I know you will have no recourse to exact revenge. Your statement, "Generally, both players cooperating is more beneficial for both sides than both defecting." is true only if it is iterated. In the non-iterated Prisoner's dilemma, you must agree, defect is the best option. When there is only one move left, it is effectively a single run of the non-iterated scenario. Thus, people should always defect on their last turn (which is why the rules state that they must never know when the game will end). BrokenSegue 04:39, 17 November 2006 (UTC)
- I believe the reference was to the payoff matrix, e.g. both cooperating gives 3 points each, both defecting gives 1 each. In this case, even if it is not iterated, if both defect on the last move they will have fewer points than if both cooperate. - AlKing464
- True (and I just added the payoff matrix to clarify the situation), however, there still is not a fallacy. In fact, the matrix is correct and consistent with other texts. The issue is that even though they would be better off both cooperating instead of both defecting, they will still both defect. This is because no matter what their opponent does, they are better off defecting (and have no way to enforce cooperation agreements). Thus, in a non-iterated round, they will both defect. BrokenSegue 14:17, 22 November 2006 (UTC)
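To make the dominance argument concrete, here is a hedged one-round check in Python, using the 3 and 1 values mentioned above plus the usual assumption of 5 for a lone defector and 0 for a lone cooperator (those two values are assumptions here, not quoted from the article):

# Row player's payoff under the assumed matrix: R=3, P=1, T=5, S=0.
PAYOFF = {('C', 'C'): 3, ('C', 'D'): 0,
          ('D', 'C'): 5, ('D', 'D'): 1}

for opponent in ('C', 'D'):
    coop = PAYOFF[('C', opponent)]
    defect = PAYOFF[('D', opponent)]
    print(f"Opponent plays {opponent}: cooperate -> {coop}, defect -> {defect}")

# Opponent plays C: cooperate -> 3, defect -> 5
# Opponent plays D: cooperate -> 0, defect -> 1

Whatever the opponent does, defecting scores strictly higher, which is why both players defect in a one-shot round even though mutual cooperation (3 each) would beat mutual defection (1 each).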
Evolutionary Stable Strategy
The strategy is not in itself an evolutionarily stable strategy, by definition, as is stated at the end of the first paragraph. While it prevents invasion by defectors, it is possible to invade a Tit for Tat environment with cooperators. TechnoBone 16:41, 2 December 2006 (UTC)
Prisoner's dilemma article information
There is a lot of information on Prisoner's dilemma that can be incorporated into this specific one. --165.230.46.148 22:02, 11 December 2006 (UTC)