Arimaa
From Wikipedia, the free encyclopedia
Arimaa | |
---|---|
[Image: An Arimaa Elephant] | |
Designer | Omar Syed |
Players | 2 |
Setup time | < 1 minute |
Playing time | 15 minutes – 2 hours |
Random chance | None |
Skills required | Tactics, Strategy |
Arimaa is a proprietary (patented and trademarked) two-player abstract strategy board game that can be played using the same equipment as chess. Arimaa has so far proven to be more difficult for artificial intelligences to play than chess.
History
Arimaa was invented by Omar Syed, a computer engineer trained in artificial intelligence. Syed was inspired by Garry Kasparov's defeat at the hands of the chess computer Deep Blue to design a new game which could be played with a standard chess set, would be difficult for computers to play well, but would have rules simple enough for his four-year-old son Aamir to understand. ("Arimaa" is "Aamir" spelled backwards plus an initial "a".) In 2002 Syed published the rules to Arimaa and announced a $10,000 prize, available annually until 2020, for the first computer program (running on standard, off-the-shelf hardware) able to defeat a top-ranked human player in a match of six games or longer.[1] The Syeds have been awarded a patent on the Arimaa rules, and the name "Arimaa" is trademarked; see below for more.
Rules
Arimaa is played on a chessboard with four squares distinguished as trap squares, namely c3, f3, c6, and f6 in algebraic chess notation. The two players, Gold and Silver, each control sixteen pieces. These are, in order from strongest to weakest, one elephant, one camel, two horses, two dogs, two cats, and eight rabbits. These may be represented by the king, queen, rooks, bishops, knights, and pawns respectively when one plays on a chessboard.
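The material and the trap squares lend themselves to a simple tabular encoding. The following sketch is purely illustrative; the names, strength values, and Python representation are not part of any official Arimaa notation:

```python
# Illustrative encoding of the Arimaa material and trap squares.

# The four trap squares, in algebraic notation.
TRAP_SQUARES = {"c3", "f3", "c6", "f6"}

# Relative strength, from strongest (elephant) to weakest (rabbit).
STRENGTH = {
    "elephant": 6,
    "camel": 5,
    "horse": 4,
    "dog": 3,
    "cat": 2,
    "rabbit": 1,
}

# Number of each piece per side.
PIECE_COUNT = {
    "elephant": 1,
    "camel": 1,
    "horse": 2,
    "dog": 2,
    "cat": 2,
    "rabbit": 8,
}

assert sum(PIECE_COUNT.values()) == 16  # sixteen pieces per player
```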
The objective of the game is to move a rabbit of one's own color onto the home rank of the opponent. Thus Gold wins by moving a gold rabbit to the eighth rank, and Silver wins by moving a silver rabbit to the first rank. However, because it is difficult to usher a rabbit to the goal line while the board is full of pieces, an intermediate objective is to capture opposing pieces by pushing or pulling them into the trap squares.
The game begins with an empty board. Gold places the sixteen gold pieces in any configuration on the first and second ranks. Silver then places the sixteen silver pieces in any configuration on the seventh and eighth ranks. The diagram at right shows one possible initial placement.
After the pieces are placed on the board, the players alternate turns, starting with Gold. A turn consists of making one to four steps. With each step a friendly piece may move into an unoccupied square one space left, right, forward, or backward, except that rabbits may not step backward. The steps of a turn may be made by a single piece or distributed between several pieces in any order. A turn must make a net change to the position. Thus one may not, for example, take one step forward and one step back with the same piece, effectively passing the turn.
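A single step is easy to validate mechanically. The sketch below is only an illustration: it assumes a hypothetical board encoding (a dict mapping squares such as "d3" to (color, piece) tuples) and leaves freezing and the no-net-change rule to be checked elsewhere.

```python
FILES = "abcdefgh"

def neighbours(square):
    """Yield the orthogonally adjacent squares of e.g. 'd3'."""
    f, r = FILES.index(square[0]), int(square[1])
    for df, dr in ((1, 0), (-1, 0), (0, 1), (0, -1)):
        nf, nr = f + df, r + dr
        if 0 <= nf < 8 and 1 <= nr <= 8:
            yield FILES[nf] + str(nr)

def step_is_legal(board, src, dst, color):
    """One step of a friendly piece into an empty adjacent square.

    board is a dict square -> (color, piece); color is 'gold' or 'silver'.
    Rabbits may not step backward.  Freezing and the rule that a turn
    must change the position are handled elsewhere.
    """
    if src not in board or dst in board:
        return False
    piece_color, piece = board[src]
    if piece_color != color or dst not in set(neighbours(src)):
        return False
    if piece == "rabbit":
        forward = 1 if color == "gold" else -1  # Gold rabbits head for rank 8
        if (int(dst[1]) - int(src[1])) * forward < 0:
            return False  # a backward step for this rabbit
    return True
```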
The second diagram, from the same game as the initial position above, helps illustrate the remaining rules of movement.
A player may use two steps of a turn to dislodge an opposing piece with a stronger friendly piece which is adjacent (in any cardinal direction). For example, a friendly dog may dislodge an opposing rabbit or cat, but not a dog, horse, camel, or elephant. The stronger piece may pull or push the adjacent weaker piece. When pulling, the stronger piece steps into an empty square, and the square it came from is occupied by the weaker piece. The silver elephant on d5 could step to d4 (or c5 or e5) and pull the gold horse from d6 to d5. When pushing, the weaker piece is moved to an adjacent empty square, and the square it came from is occupied by the stronger piece. The gold elephant on d3 could push the silver rabbit on d2 to e2 and then occupy d2. Note that the rabbit on d2 can't be pushed to d1, c2, or d3, because those squares are not empty.
Friendly pieces may not be dislodged. Also, a piece may not push and pull simultaneously. For example, the gold elephant on d3 could not simultaneously push the silver rabbit on d2 to e2 and pull the silver rabbit from c3 to d3. An elephant can never be dislodged, since there is nothing stronger.
A piece which is adjacent (in any cardinal direction) to a stronger opposing piece is frozen, unless it is also adjacent to a friendly piece. Frozen pieces may not be moved by the owner, but may be dislodged by the opponent. A frozen piece can freeze another still weaker piece. The silver rabbit on a7 is frozen, but the one on d2 is able to move because it is adjacent to a silver piece. Similarly the gold rabbit on b7 is frozen, but the gold cat on c1 is not. The dogs on a6 and b6 do not freeze each other because they are of equal strength. An elephant cannot be frozen, since there is nothing stronger, but an elephant can be blockaded.
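The freezing rule reduces to one small check. Continuing the same illustrative encoding, and reusing the hypothetical neighbours() and STRENGTH helpers sketched above:

```python
def is_frozen(board, square):
    """A piece is frozen if it is adjacent to a stronger enemy piece
    and not adjacent to any friendly piece."""
    color, piece = board[square]
    next_to_friend = False
    next_to_stronger_enemy = False
    for adj in neighbours(square):
        if adj not in board:
            continue
        other_color, other_piece = board[adj]
        if other_color == color:
            next_to_friend = True
        elif STRENGTH[other_piece] > STRENGTH[piece]:
            next_to_stronger_enemy = True
    return next_to_stronger_enemy and not next_to_friend
```

In the diagrammed position this check would report the silver rabbit on a7 as frozen but not the one on d2, matching the description above.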
A piece which enters a trap square is captured and removed from the game unless there is a friendly piece adjacent. Silver to move could capture the gold horse on d6 by pushing it to c6 with the elephant on d5. Also a piece on a trap square is captured if all adjacent friendly pieces move away. Thus if the silver rabbit on c4 and the silver horse on c2 move away, voluntarily or by being dislodged, the silver rabbit on c3 will be captured.
Note that a piece may voluntarily step into a trap square, even if it is captured thereby. Also, the second step of a pulling maneuver may be completed, even if the piece doing the pulling is captured on the first step. For example, Silver to move could step the silver rabbit from f4 to g4, step the silver horse from f2 to f3, which captures the horse, and still pull the gold rabbit from f1 to f2 as part of the horse's move.
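Trap captures follow mechanically from adjacency and are checked after every step. A sketch over the same illustrative encoding:

```python
def resolve_traps(board):
    """Remove any piece standing on a trap square with no adjacent
    friendly piece.  Called after every individual step."""
    for trap in TRAP_SQUARES:
        if trap not in board:
            continue
        color, _ = board[trap]
        if not any(adj in board and board[adj][0] == color
                   for adj in neighbours(trap)):
            del board[trap]  # the piece is captured
```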
In the diagrammed position, if it were Gold's turn to move, Gold could win in three steps: The dog on a6 can push the rabbit on a7 to a8, and when the dog is on a7, it unfreezes the rabbit on b7, which can step to b8 for the victory.
There are several ways for the game to end apart from a rabbit reaching its goal. (The frequency of each is shown in parentheses.)[2]
- (2.2%) If, at the beginning of a player's turn, no steps are possible because all friendly pieces are frozen or blockaded, the player whose turn it is loses.
- (0.2%) If the same position occurs three times with the same player to move, the player whose turn caused it to occur the third time loses. Only positions at the end of each turn are considered by this rule, not positions at the end of each step; a minimal bookkeeping sketch for this rule appears below. (This rule originally did not consider the side to move, but Syed changed it to be more like the chess repetition rule on May 28, 2005.)
- (0.0%) If all sixteen rabbits are captured, the game is a draw. (In elimination tournaments where a draw would be inconvenient, Syed has ruled that a player who captures all eight opposing rabbits wins the game, although as of September 2006 the difference has never mattered.)
Finally, if an opposing rabbit is dislodged onto its goal line and dislodged off within the same turn, the game continues.
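For the repetition rule mentioned in the list above, the bookkeeping amounts to counting end-of-turn positions keyed on the side to move. A minimal sketch, again assuming the illustrative board encoding used earlier:

```python
from collections import Counter

repetition_count = Counter()

def third_occurrence(board, side_to_move):
    """Record an end-of-turn position and report whether it has now
    occurred three times with the same side to move; if so, the player
    who just completed the turn loses."""
    key = (frozenset(board.items()), side_to_move)
    repetition_count[key] += 1
    return repetition_count[key] >= 3
```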
Strategy and tactics
For beginning insights into good play, see the Arimaa Wikibook articles on tactics and strategy.
Computer ineptitude
Several aspects of Arimaa make it difficult for computer programs to beat good human players. Because so much effort has gone into the development of strong chess-playing software, it is particularly relevant to understand why techniques applicable to chess are less effective for Arimaa.
Top chess programs use brute-force searching coupled with static position evaluation dominated by material considerations. Chess programs examine a vast number of possible moves, but they are not good (compared to humans) at determining who is winning at the end of a series of moves unless one side has more pieces than the other. Arimaa programs work the same way, but with less success in practice.
When brute-force searching is applied to Arimaa, the depth of the search is limited by the huge number of options each player has on each turn. An Arimaa player has nearly as many legal choices for each step as a chess player has for each turn. Thus a program which can search to a depth of sixteen ply can look ahead eight turns for each player in chess, but only approximately two turns (eight steps) for each player in Arimaa.
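The arithmetic behind that comparison can be made explicit. The numbers below are rough, assumed figures chosen only to illustrate the orders of magnitude, not measured statistics:

```python
# Rough, assumed figures for illustration only.
CHESS_MOVES_PER_TURN = 30     # typical order of magnitude of legal chess moves
ARIMAA_CHOICES_PER_STEP = 30  # "nearly as many choices per step as chess per move"

# Branching factor of one full turn.
chess_turn = CHESS_MOVES_PER_TURN            # one move per turn         -> 30
arimaa_turn = ARIMAA_CHOICES_PER_STEP ** 4   # up to four steps per turn -> 810,000
                                             # (before removing duplicate turns)

# A fixed 16-ply search, where one ply is one chess half-move or one Arimaa step.
print("chess:  16 plies =", 16 // 2, "turns for each player")       # 8
print("arimaa: 16 plies =", 16 // (4 * 2), "turns for each player")  # 2
```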
Brute-force search depth for chess software is nearly doubled by alpha-beta pruning, which allows the software to conclude that one move is better than another without examining every possible continuation of the weaker move. If the opponent can refute a certain move with one reply, it isn't necessary to examine other replies, which dramatically increases search speed. In Arimaa, however, the side to move switches only every four steps, which reduces the number of available cutoffs in a step-based search.
Furthermore, the usefulness of alpha-beta pruning is heavily dependent on the order in which moves are considered. Good moves must be considered before bad ones in order for the bad ones to be neglected. In particular, checking and capturing moves are key for pruning, because they are often much better than other moves. In Arimaa software the speedup provided by alpha-beta pruning is smaller, because captures are rarer: in rated games played on arimaa.com, only 3% of steps result in capture, compared to about 19% of chess moves.
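For reference, the pruning technique being discussed is the textbook alpha-beta algorithm, sketched generically below over an abstract game interface; it is not taken from any particular chess or Arimaa engine. The cutoffs are the break statements, and in a step-based Arimaa search the maximizing and minimizing sides would swap only every four steps, which is one reason those cutoffs fire less often.

```python
def alphabeta(state, depth, alpha, beta, maximizing, moves, apply, evaluate):
    """Textbook alpha-beta minimax over an abstract game.

    moves(state)       -> iterable of legal moves (ideally best ones first)
    apply(state, move) -> successor state
    evaluate(state)    -> static score from the maximizing side's point of view
    """
    legal = list(moves(state))
    if depth == 0 or not legal:
        return evaluate(state)
    if maximizing:
        value = float("-inf")
        for move in legal:
            value = max(value, alphabeta(apply(state, move), depth - 1,
                                         alpha, beta, False,
                                         moves, apply, evaluate))
            alpha = max(alpha, value)
            if alpha >= beta:
                break  # beta cutoff: the opponent already has a better option
        return value
    else:
        value = float("inf")
        for move in legal:
            value = min(value, alphabeta(apply(state, move), depth - 1,
                                         alpha, beta, True,
                                         moves, apply, evaluate))
            beta = min(beta, value)
            if alpha >= beta:
                break  # alpha cutoff
        return value
```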
In most Arimaa positions, particularly toward the beginning of the game when the board is still crowded, a competent player can avoid losing any pieces within the next two turns. Compared to chess, Arimaa allows either player to delay captures for longer. Indeed, the median move number of the first capture in chess is turn 6, whereas in Arimaa it is turn 12. The struggle is initially more positional in Arimaa, and revolves around making captures unavoidable at some point in the future. This magnifies the importance of correctly judging who is gaining ground in non-material ways. Thus the strength of computer programs (examining millions of positions) is not as significant as their weakness (judging the position apart from who has more pieces).
The weakness of Arimaa programs in the opening phases is further magnified by the setup phase. In chess every game starts from the same position. By compiling before the game a list of stock replies to all standard opening moves, chess programs may often make a dozen or more excellent moves before starting to "think". Humans do the same, but have a smaller and less reliable memory of openings, which puts humans at a relative disadvantage in chess. Arimaa, in contrast, has millions of possible ways to set up the pieces even before the first piece moves. This prevents programs from having any meaningful opening book.
As the game progresses, exchanges and the advancement of rabbits tend to make the position more open and tactical. Arimaa programs typically play better in this sort of position, because they see tactical shots which humans overlook. However, it is usually possible for humans to avoid wide-open positions by conservative play, and to angle for strategic positions in which computers fare worse. Against a conservative opponent it is almost impossible to bust open the position in Arimaa, whereas in chess it is merely difficult. One must beat defensive play by the accumulation of small, long-term advantages, which programs do not do very well.
One additional technique from computer chess which does not apply to Arimaa is endgame tablebases. Master-level chess games sometimes trade down into unclear endgames with only a few pieces, for example king and knight vs. king and rook. It is possible to build, by retrograde analysis, an exhaustive table of the correct move in all such positions. Programs have only to consult a pre-generated table in such positions, rather than "thinking" afresh, which gives them a relative advantage over humans. Arimaa, in contrast, seldom comes to an endgame. Equal exchanges of pieces are less common than in chess, so it is rare for a game of Arimaa to "trade down" and still be unclear. An average game of Arimaa has only eight captures (compared to seventeen for chess), and top humans can often defeat top programs in Arimaa without losing a single piece, for example the first game of the 2008 challenge match. In the 2007 Postal Championship, the game between the top two finishers featured only one capture, a goal-forcing sacrifice.
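Retrograde analysis itself is conceptually simple: start from positions that are already decided and work backward. The sketch below is a generic backward-induction outline over an abstract game graph, not an actual chess or Arimaa tablebase generator; the predecessors and successors functions are assumed to be supplied by the caller.

```python
from collections import deque

def build_tablebase(terminal_losses, predecessors, successors):
    """Generic retrograde analysis over a finite game graph.

    terminal_losses: positions already lost for the side to move
    predecessors(p): positions from which p can be reached in one move
    successors(p):   positions reachable from p in one move
    Returns a dict position -> 'win' or 'loss' for the side to move;
    positions never labelled are draws in this simplified model.
    """
    table = {p: "loss" for p in terminal_losses}
    queue = deque(terminal_losses)
    while queue:
        pos = queue.popleft()
        for prev in predecessors(pos):
            if prev in table:
                continue
            if table[pos] == "loss":
                table[prev] = "win"      # moving into a lost position wins
                queue.append(prev)
            elif all(table.get(s) == "win" for s in successors(prev)):
                table[prev] = "loss"     # every move hands the opponent a win
                queue.append(prev)
    return table
```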
Omar Syed hopes that, because traditional computer game-playing techniques are only marginally effective for Arimaa, programmers will be forced to use artificial intelligence techniques to create a strong Arimaa-playing program. The successful quest to build a world-championship-caliber chess program has produced many techniques for playing games well, but has contributed essentially nothing to more general reasoning; indeed, the techniques of chess-playing programs have been excluded from some definitions of artificial intelligence. A goal for Arimaa is that the techniques developed to play it will serve the larger goals of artificial intelligence.
The structure of Syed's man-against-machine challenge is focused on rewarding advances in AI software rather than advances in hardware. In the annual challenge, programs are run on machines chosen and provided by Syed himself, under the criterion that each be a typical, inexpensive, off-the-shelf home computer. The challenge is not open to anyone requiring expensive multi-processor machines such as those used to challenge top-level chess players, much less a custom-built supercomputer like Deep Blue, even though it was the success of this hardware-intensive approach which inspired Arimaa's invention. Syed believes that supercomputers might already have the power to conquer Arimaa by brute force, and that someday personal computers will too.
Challenge history
Year | Prize | Challenger / Developer | Human Defender (Human Rank) | Result | Notes |
---|---|---|---|---|---|
2004 | $10,000 | Bomb / David Fotland | Omar Syed (1) | 0-8 | Syed gave a rabbit handicap in the last game. |
2005 | $10,000 | Bomb / David Fotland | Frank Heinemann (4) | 1-7 | No handicap games. |
2006 | $17,500 | Bomb / David Fotland | Karl Juhnke (1), Greg Magne (2), Paul Mertens (6) | 0-3, 0-3, 1-2 | Mertens gave a camel handicap in his last game and lost. |
2007 | $17,100 | Bomb / David Fotland | Karl Juhnke (1), Brendan M (19), Omar Syed (20), N Siddiqui (27) | 0-3, 0-2, 0-3, 1-0 | Juhnke gave handicaps of a dog, a horse, and a camel respectively. Syed gave a cat handicap in his last game. Siddiqui substituted for Brendan's second game. |
2008 | $17,000 | Bomb / David Fotland | Jean Daligault (1), Greg Magne (3), Mark Mistretta (16), Omar Syed (17) | 0-3, 0-3, 0-1, 0-2 | No handicap games. Syed substituted for Mistretta's final two games. |
The Arimaa Challenge has been held five times so far. Prior to the third match, Syed changed the format to require the software to win two out of three games against each of three players, to reduce the psychological pressure on individual volunteer defenders. Syed also called for outside sponsorship of the Arimaa Challenge to build a bigger prize fund.
In each of the five challenge cycles David Fotland, renowned for his program Many Faces of Go, won the Arimaa Computer Championship and the right to play for the prize money, only to see his program beaten decisively each year. In 2008 the computer lost all nine games, three of them without winning a piece. Humanity's margin of dominance over computers currently appears to be widening, although this is largely because more players are becoming familiar with the game.
Patent and trademark
U.S. Patent No. 6,981,700 was filed on 3 October 2003 and granted on 3 January 2006. Omar Syed also holds a trademark on the name "Arimaa".
Syed has stated that he does not intend to restrict noncommercial use and has released a license called "The Arimaa Public License" with the declared intent to "make Arimaa as much of a public domain game as possible while still protecting its commercial usage". Items covered by the license are the patent and the trademark.
Footnotes
- ^ Syed, Omar; Syed, Aamir (2003). "Arimaa – a New Game Designed to be Difficult for Computers". International Computer Games Association Journal 26: 138-139.
- ^ The Arimaa server game archive as of December 2006 showed the following number of rated human versus human games: 1653 ending in goal, 38 ending in immobilization, 4 ending by repetition of position, and 0 ending in a draw.
External links
- Official Arimaa Website
- David Fotland's Arimaa Program (note: the free version is turn-limited)
- The Arimaa Public License
- Arimaa Strategy (wikibook)