Adaptive incentive selection for crowdsourcing contests
Pages: 2100-2102
Truong, Nhat, Van Quoc
Stein, Sebastian
Tran-Thanh, Long
Jennings, Nick
2018
Truong, Nhat, Van Quoc, Stein, Sebastian, Tran-Thanh, Long and Jennings, Nick (2018) Adaptive incentive selection for crowdsourcing contests. 17th International Conference on Autonomous Agents and Multiagent Systems, AAMAS 2018, Stockholm, Sweden. 10 - 15 Jul 2018.
Record type: Conference or Workshop Item (Paper)
Abstract
The success of crowdsourcing projects relies critically on motivating the crowd to contribute. One particularly effective method for incentivising participants to perform tasks is to run contests. However, there are numerous ways to implement such contests in specific projects (they vary in how performance is evaluated, how rewards are allocated, and the sizes of the prizes). Additionally, given a financial budget and a time limit, choosing incentives that maximise the total outcome (e.g., the total number of completed tasks) is not trivial, as their effectiveness in a specific project is usually unknown in advance. Therefore, we introduce algorithms that select such incentives effectively using budgeted multi-armed bandits. To do so, we first introduce the incentive selection problem and formalise it as a 2D-budgeted multi-armed bandit, where each arm corresponds to an incentive (i.e., a contest with a specific structure). We then propose the HAIS and Stepped ε-first algorithms to solve the incentive selection problem. Both algorithms are shown to be effective in simulations with synthetic data. Stepped ε-first performs well, but requires a situation-specific parameter to be tuned appropriately (which may be difficult in settings with little prior experience). In contrast, HAIS performs better in most cases and does not depend significantly on parameter tuning.
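To illustrate the general idea behind an ε-first policy for a budgeted bandit over incentives, the sketch below spends a fraction ε of the budget exploring contest designs uniformly and then commits the remainder to the design with the best estimated tasks-per-cost ratio. This is a minimal illustration under simplified assumptions, not the HAIS or Stepped ε-first algorithms from the paper; the names Incentive fields, run_contest, and the reward/cost model are illustrative.

```python
import random

def run_contest(incentive):
    """Placeholder: run one contest under `incentive` and observe the outcome.
    Here we simulate a noisy number of completed tasks and a fixed prize cost."""
    tasks = max(0.0, random.gauss(incentive["mean_tasks"], 5.0))
    return tasks, incentive["prize_cost"]

def epsilon_first(incentives, budget, epsilon=0.2):
    """Explore with epsilon * budget, then exploit the best estimated arm."""
    explore_budget = epsilon * budget
    spent, total_tasks = 0.0, 0.0
    stats = {i: [0.0, 0.0] for i in range(len(incentives))}  # [tasks, cost] per arm

    # Exploration phase: cycle through incentives until the exploration budget is used.
    i = 0
    while spent < explore_budget:
        tasks, cost = run_contest(incentives[i])
        stats[i][0] += tasks
        stats[i][1] += cost
        spent += cost
        total_tasks += tasks
        i = (i + 1) % len(incentives)

    # Exploitation phase: commit to the arm with the highest estimated tasks per unit cost.
    best = max(stats, key=lambda k: stats[k][0] / stats[k][1] if stats[k][1] else 0.0)
    while spent + incentives[best]["prize_cost"] <= budget:
        tasks, cost = run_contest(incentives[best])
        spent += cost
        total_tasks += tasks
    return total_tasks

# Example usage with three illustrative contest designs.
incentives = [
    {"mean_tasks": 40.0, "prize_cost": 30.0},
    {"mean_tasks": 55.0, "prize_cost": 50.0},
    {"mean_tasks": 70.0, "prize_cost": 90.0},
]
print(epsilon_first(incentives, budget=2000.0, epsilon=0.2))
```

The tuning difficulty noted in the abstract corresponds to the choice of ε here: too small and the estimates are unreliable, too large and the budget is wasted on exploration.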
Text: aamas18ea - Accepted Manuscript
More information
Published date: 2018
Venue - Dates:
17th International Conference on Autonomous Agents and Multiagent Systems, AAMAS 2018, Stockholm, Sweden, 2018-07-10 - 2018-07-15
Identifiers
Local EPrints ID: 419826
URI: http://eprints.soton.ac.uk/id/eprint/419826
PURE UUID: 30e2ef39-253b-43e2-a9dd-14dc538650d6
Catalogue record
Date deposited: 23 Apr 2018 16:30
Last modified: 16 Mar 2024 03:57