AAAI Publications, Fourteenth Artificial Intelligence and Interactive Digital Entertainment Conference

Action Abstractions for Combinatorial Multi-Armed Bandit Tree Search
Rubens O. Moraes, Julian R. H. Mariño, Levi H. S. Lelis, Mario A. Nascimento

Last modified: 2018-09-25


Search algorithms based on combinatorial multi-armed bandits (CMABs) are promising for state-space sequential decision problems. However, current CMAB-based algorithms do not scale to problem domains with very large action spaces, such as real-time strategy games played on large maps. In this paper we introduce CMAB-based search algorithms that use action abstraction schemes to reduce the action space considered during search. One of the approaches we introduce uses regular action abstractions (A1N), while the other two use asymmetric action abstractions (A2N and A3N). Empirical results on MicroRTS show that A1N, A2N, and A3N outperform an existing CMAB-based algorithm in matches played on large maps, and that A3N outperforms all state-of-the-art search algorithms tested.
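The algorithmic details are in the paper's full text; as a rough, hypothetical illustration of the naive-sampling idea behind CMAB search, the sketch below treats each unit's legal actions as a local bandit and the joint (combinatorial) action as a global bandit, alternating between the two with an epsilon-greedy rule. All names, the epsilon-greedy scheme, and the reward setup here are assumptions for illustration, not the paper's implementation:

```python
import random

class NaiveCMABSampler:
    """Illustrative naive sampling for a combinatorial multi-armed bandit.

    Each unit i has its own local bandit over its legal actions; the
    joint action (one action per unit) is also tracked as a global arm.
    This is a simplified sketch, not the algorithm from the paper.
    """

    def __init__(self, unit_actions, epsilon=0.3, seed=0):
        # unit_actions: list of lists, the legal actions of each unit
        self.unit_actions = unit_actions
        self.epsilon = epsilon
        self.rng = random.Random(seed)
        # per-unit [total_reward, count] estimates (local bandits)
        self.local = [{a: [0.0, 0] for a in acts} for acts in unit_actions]
        # joint-action [total_reward, count] estimates (global bandit)
        self.global_arms = {}

    def sample(self):
        # Explore: build a joint action from the local bandits;
        # exploit: replay the best joint action observed so far.
        if self.rng.random() < self.epsilon or not self.global_arms:
            return tuple(self._local_pick(i) for i in range(len(self.unit_actions)))
        return max(self.global_arms, key=lambda a: self._mean(self.global_arms[a]))

    def _local_pick(self, i):
        # Epsilon-greedy choice within unit i's local bandit.
        if self.rng.random() < self.epsilon:
            return self.rng.choice(self.unit_actions[i])
        stats = self.local[i]
        return max(stats, key=lambda a: self._mean(stats[a]))

    def _mean(self, stat):
        total, n = stat
        return total / n if n else float('inf')  # optimistic for untried arms

    def update(self, joint, reward):
        # Credit the reward to every unit's chosen action and to the joint arm.
        for i, a in enumerate(joint):
            self.local[i][a][0] += reward
            self.local[i][a][1] += 1
        g = self.global_arms.setdefault(joint, [0.0, 0])
        g[0] += reward
        g[1] += 1
```

In this toy setting an action abstraction would simply shrink each `unit_actions[i]` list before sampling begins, which is the lever the paper's A1N, A2N, and A3N variants pull on at different granularities.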


Real-time strategy games; Planning; Search; Combinatorial multi-armed bandits; NaiveMCTS; Asymmetric action abstractions
