Topic: A bot using a GPU (Read 1763 times)
UruramTururam
Forum Guru
Arimaa player #2537
Posts: 319
A bot using a GPU
« on: Feb 15th, 2011, 2:53am »
I had a talk about Arimaa with a friend of mine, an IT designer who specializes in local pattern recognition - that is, recognizing small patterns within larger pictures, e.g. "is there a plane among the other objects depicted?". He said that local pattern recognition could plausibly be applied to Arimaa positions, since the game is a "local" one: each step changes the position in the most local way possible. He added that if he were to write an Arimaa-playing program, he would use the computer's GPU (the graphics processor) to its full extent. GPUs are specialized in performing many simple algebraic operations simultaneously, so they are ideal tools for generating possible future game positions and searching them for certain patterns, while the CPU merely sorts through the possibilities and commits the most promising ones to further, deeper analysis.

I calmed him down by saying that Arimaa bots are not currently allowed to access the server's GPU - in fact, it is not even known whether the server has one. He was a bit disappointed, I guess. Still, I think the idea is worth mentioning here.

By the way, he asked me one question I don't know the answer to: are Arimaa bots allowed to compute whatever they want during their opponents' turns? That could save roughly 25% of the thinking time on the bot's own turn (depending on the position).
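As a rough illustration of that division of labor - the GPU scoring a large batch of candidate positions in one data-parallel pass while the CPU keeps only the best for deeper analysis - here is a toy Python sketch. The batch-scoring function only *simulates* a parallel kernel, and all names, the integer "positions", and the `42`-based evaluation are invented for illustration:

```python
import heapq

def gpu_batch_score(positions):
    """Stand-in for a data-parallel kernel: score every position at once.
    Here each 'position' is just an integer; a real kernel would run one
    simple evaluation per GPU thread over real board states."""
    return [(-abs(p - 42), p) for p in positions]  # toy evaluation

def cpu_select(positions, top_k=3):
    """CPU side: rank the batch results and keep only the best
    candidates for further, deeper analysis."""
    scored = gpu_batch_score(positions)
    return [p for _, p in heapq.nlargest(top_k, scored)]

candidates = list(range(100))       # pretend these are future positions
promising = cpu_select(candidates)  # CPU commits these to deeper search
print(promising[0])                 # → 42, the best-scoring candidate
```

The point of the split is that the batch call crosses the CPU/GPU boundary once per ply rather than once per position, which is what makes the scheme attractive in principle.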
« Last Edit: Feb 15th, 2011, 2:55am by UruramTururam »
Caffa et bucella per attactionem corporum venit ad stomachum meum. BGG Arimaa badges - get your own one!
rbarreira
Forum Guru
Arimaa player #1621
Posts: 605
Re: A bot using a GPU
« Reply #1 on: Feb 15th, 2011, 3:29am »
I think the problem with this approach (search on the CPU, evaluation completely or partially on the GPU) is that it would require very frequent back-and-forth communication between the CPU and the GPU, which AFAIK cannot be done with high performance.

There are a few people trying to implement chess engines entirely on the GPU, but so far they haven't been successful. As I understand it, GPUs don't support recursion, and while they can run lots of threads simultaneously, they expect every thread to be doing the same thing at any given moment, which makes it problematic to run heavily branching code (as search algorithms tend to be). I'm a bit skeptical, but I'm certainly not an expert in GPU programming, so it would be nice to see some feedback on these concerns...

As for your last question: yes, bots are allowed to do that (it's usually called "pondering").
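The no-recursion limitation has a well-known workaround: rewrite the recursive descent with an explicit stack, so every iteration does the same push/pop/compare work regardless of depth. A toy Python sketch of the idea - the tree, the leaf scores, and the function name are all invented for illustration, not taken from any engine:

```python
# Recursive tree search rewritten with an explicit stack -- the usual
# workaround on hardware that lacks a call stack. The tree is a made-up
# dict mapping each node to its children; leaves carry toy scores.
tree = {
    "root": ["a", "b"],
    "a": ["a1", "a2"],
    "b": ["b1"],
}
scores = {"a1": 3, "a2": 7, "b1": 5}

def best_leaf_score(root):
    """Depth-first search without recursion: keep frontier nodes on an
    explicit list used as a stack, track the best leaf score seen."""
    best = float("-inf")
    stack = [root]
    while stack:
        node = stack.pop()
        children = tree.get(node, [])
        if not children:              # leaf: evaluate it
            best = max(best, scores[node])
        else:                         # interior node: expand it
            stack.extend(children)
    return best

print(best_leaf_score("root"))  # → 7
```

This removes the recursion problem but not the divergence one: threads whose stacks hold different subtrees still take different branches, which is exactly the concern above.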
« Last Edit: Feb 15th, 2011, 3:45am by rbarreira »
UruramTururam
Forum Guru
Arimaa player #2537
Posts: 319
Re: A bot using a GPU
« Reply #2 on: Feb 15th, 2011, 6:22am »
Hm, I'm no specialist (I'll try to invite my friend here), but here is the concept he was talking about.

Design steps:
1) Identify and code obvious patterns in the game, such as goal threats, captures, hostages, true and false protections, etc.
2) Analyze the game database to find the significance (weight) of each pattern for the odds of winning.
3) Perform another search to find not-so-obvious patterns that lead to winning or losing the game, or to a material advantage.
4) For each of those patterns, find possible "good" moves and code them along with their weights.
5) Divide the patterns into sets so that processing each set takes roughly equal time. The number of sets equals the number of independent GPU threads.

Running the program:
1) During the setup move, the program uploads the pattern database into the graphics card's memory (treating it as a load of graphical data). This may take a while, but it only needs to be done once.
2) During its turn, the program loads the current board state into all the threads. The threads search for patterns and return the numbers and positions of those that are similar enough to the predefined ones.
3) The CPU selects a few of the most promising patterns and calculates its move to exploit one or more of them.
4) During the opponent's turn, the GPU keeps working on the current state of the board (as if the opponent made no move at all).
5) On the AI's next turn, some patterns have already been found, and only the fragment of the board changed by the opponent's latest move has to be searched again.

That's basically the idea, as far as I understand it... Note that it wouldn't work well for chess because of the long-range, nonlocal interactions between pieces there.
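The design steps above can be sketched in plain Python. Everything here is invented for illustration - the 2x2 string "patterns", the 4x4 board, the weights, and the round-robin split standing in for per-thread pattern sets - none of it is the friend's actual design:

```python
# Toy sketch of the weighted-pattern scheme: a pattern database split
# into roughly equal-cost sets (one per "thread"), each set slid over
# the board independently, results merged and ranked by weight on the
# "CPU" side. Board and patterns use single characters per square.
BOARD = [
    "rc..",
    ".R..",
    "..E.",
    "....",
]

# (name, 2x2 pattern as two row-strings, weight learned offline)
PATTERNS = [
    ("frame",   ("rc", ".R"), 1.5),
    ("threat",  (".R", ".."), 0.4),
    ("phalanx", ("..", "E."), 0.9),
]

def split_into_sets(patterns, n_threads):
    """Round-robin split so each 'thread' gets similar work (step 5)."""
    sets = [[] for _ in range(n_threads)]
    for i, p in enumerate(patterns):
        sets[i % n_threads].append(p)
    return sets

def match_set(pattern_set, board):
    """One thread's job: slide its patterns over the board and report
    (name, row, col, weight) for every exact 2x2 match."""
    hits = []
    for name, pat, w in pattern_set:
        for r in range(len(board) - 1):
            for c in range(len(board[0]) - 1):
                window = (board[r][c:c + 2], board[r + 1][c:c + 2])
                if window == pat:
                    hits.append((name, r, c, w))
    return hits

# "GPU": every set matched independently; "CPU": merge, rank by weight.
all_hits = []
for s in split_into_sets(PATTERNS, n_threads=2):
    all_hits.extend(match_set(s, BOARD))
all_hits.sort(key=lambda h: -h[3])
print(all_hits[0][0])  # → frame, the heaviest pattern found
```

A real version would use fuzzy similarity rather than exact equality, and re-scan only the changed board fragment between turns, as in running step 5.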
omar
Forum Guru
Arimaa player #2
Posts: 1003
Re: A bot using a GPU
« Reply #3 on: Feb 17th, 2011, 11:45am »
on Feb 15th, 2011, 2:53am, UruramTururam wrote: I calmed him down saying that Arimaa bots are not allowed to access the server GPU now, in fact it is not known if the server has one. He was a bit disappointed I guess. Yet I think that the idea is worth mentioning here.

The main problem is that servers don't come with any graphics card, so his program would not be able to run on the servers. Also, his program might be tuned very tightly to a particular graphics card and might not work on systems with other graphics cards. It's an interesting approach, though. I think in the next few years we will be seeing systems with not just multi-core processors, but multiple multi-core processors each.
« Last Edit: Feb 17th, 2011, 11:53am by omar »
leo
Forum Guru
Posts: 278
Re: A bot using a GPU
« Reply #4 on: Feb 26th, 2011, 5:18am »
A similar approach might work using neural network processors or general-purpose programmable processors, but neither is likely to be found on web servers soon. Anyway, the pattern-matching approach, be it hardware-accelerated or sequentially churned, has always looked highly promising to me for Arimaa.
UruramTururam
Forum Guru
Arimaa player #2537
Posts: 319
Re: A bot using a GPU
« Reply #5 on: Feb 28th, 2011, 8:50am »
Actually, a lot of present-day neural network computing is performed on Radeon-type graphics cards. They are fast, widely available, relatively cheap, and have pretty good backward compatibility. Yet programming them is tricky, AFAIK.