Epsilon–First Policies for Budget–Limited Multi-Armed Bandits


Tran-Thanh, Long, Chapman, Archie, Munoz De Cote Flores Luna, Jose Enrique, Rogers, Alex and Jennings, Nicholas R. (2010) Epsilon–First Policies for Budget–Limited Multi-Armed Bandits. In: Twenty-Fourth AAAI Conference on Artificial Intelligence, Atlanta, Georgia, USA, 11-15 Jul 2010, pp. 1211-1216.

Download

PDF - Published Version (362 KB)
PDF - Accepted Version (105 KB)

Description/Abstract

We introduce the budget–limited multi–armed bandit (MAB), which captures situations where a learner’s actions are costly and constrained by a fixed budget that is incommensurable with the rewards earned from the bandit machine, and then describe a first algorithm for solving it. Since the learner has a budget, the problem’s duration is finite. Consequently an optimal exploitation policy is not to pull the optimal arm repeatedly, but to pull the combination of arms that maximises the agent’s total reward within the budget. As such, the rewards for all arms must be estimated, because any of them may appear in the optimal combination. This difference from existing MABs means that new approaches to maximising the total reward are required. To this end, we propose an epsilon–first algorithm, in which the first epsilon of the budget is used solely to learn the arms’ rewards (exploration), while the remaining 1 − epsilon is used to maximise the received reward based on those estimates (exploitation). We derive bounds on the algorithm’s loss for generic and uniform exploration methods, and compare its performance with traditional MAB algorithms under various distributions of rewards and costs, showing that it outperforms the others by up to 50%.
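The following Python sketch illustrates the epsilon-first structure described in the abstract: spend the first epsilon of the budget on uniform exploration to estimate each arm's mean reward, then spend the remaining 1 - epsilon on exploitation. The function name epsilon_first and the (pull_fn, cost) arm representation are illustrative assumptions, and the exploitation step here uses a simple reward-per-cost greedy rule as a stand-in for the budget-constrained arm-combination optimisation analysed in the paper.

    import random

    def epsilon_first(arms, budget, epsilon=0.1, rng=None):
        """Illustrative epsilon-first policy for a budget-limited MAB.

        `arms` is a list of (pull_fn, cost) pairs: pull_fn() returns a
        stochastic reward, and cost is the known, fixed (positive) cost of
        pulling that arm.  The first epsilon*budget is spent on uniform
        exploration; the remainder is spent greedily on the arm with the
        highest estimated reward/cost density that still fits in the budget
        (a simplification of the exploitation phase in the paper).
        """
        rng = rng or random.Random()
        n = len(arms)
        est_reward = [0.0] * n   # running mean reward estimate per arm
        pulls = [0] * n
        total_reward = 0.0
        spent = 0.0

        # Exploration: pull arms uniformly at random until epsilon*budget is used.
        explore_budget = epsilon * budget
        while True:
            affordable = [i for i, (_, c) in enumerate(arms)
                          if spent + c <= explore_budget]
            if not affordable:
                break
            i = rng.choice(affordable)
            pull, cost = arms[i]
            r = pull()
            pulls[i] += 1
            est_reward[i] += (r - est_reward[i]) / pulls[i]
            total_reward += r
            spent += cost

        # Exploitation: repeatedly pull the affordable arm with the best
        # estimated reward-to-cost ratio until the budget is exhausted.
        while True:
            affordable = [i for i, (_, c) in enumerate(arms)
                          if spent + c <= budget]
            if not affordable:
                break
            i = max(affordable, key=lambda j: est_reward[j] / arms[j][1])
            pull, cost = arms[i]
            total_reward += pull()
            spent += cost

        return total_reward

For example, with two hypothetical arms arms = [(lambda: random.gauss(1.0, 0.2), 1.0), (lambda: random.gauss(2.5, 0.5), 3.0)], calling epsilon_first(arms, budget=100, epsilon=0.1) spends roughly 10 units of budget estimating both arms before committing the rest to whichever looks best per unit cost.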

Item Type: Conference or Workshop Item (Paper)
Additional Information: Event Dates: 11 - 15 July, 2010
Divisions: Faculty of Physical Sciences and Engineering > Electronics and Computer Science > Agents, Interactions & Complexity
ePrint ID: 270806
Date Deposited: 06 Apr 2010 16:45
Last Modified: 27 Mar 2014 20:15
Further Information: Google Scholar
URI: http://eprints.soton.ac.uk/id/eprint/270806
