University of Southampton Institutional Repository

Efficient and adaptive incentive selection for crowdsourcing contests


Truong, Nhat Van-Quoc, Dinh, Le Cong, Stein, Sebastian, Tran-Thanh, Long and Jennings, Nicholas R. (2022) Efficient and adaptive incentive selection for crowdsourcing contests. Applied Intelligence. (doi:10.1007/s10489-022-03593-2).

Record type: Article

Abstract

The success of crowdsourcing projects relies critically on motivating a crowd to contribute. One particularly effective method for incentivising participants to perform tasks is to run contests in which participants compete against each other for rewards. However, there are numerous ways to implement such contests in specific projects, which vary in how performance is evaluated, how participants are rewarded, and the sizes of the prizes. Moreover, the best way to implement contests in a particular project remains an open challenge, as the effectiveness of each contest implementation (henceforth, incentive) is unknown in advance. Hence, in a crowdsourcing project, a practical approach to maximising the overall utility of the requester (which can be measured by the total number of completed tasks or the quality of the task submissions) is to choose a set of incentives suggested by previous studies in the literature or by the requester's experience. An effective mechanism can then be applied to automatically select appropriate incentives from this set over different time intervals so as to maximise the cumulative utility within a given financial budget and a time limit. To this end, we present a novel approach to this incentive selection problem. Specifically, we formalise it as an online decision-making problem in which each action corresponds to offering a specific incentive. We then detail and evaluate a novel algorithm, HAIS, to solve the incentive selection problem efficiently and adaptively. Theoretically, we show that, if all the estimates in HAIS (except the estimates of each incentive's effectiveness) are correct, the algorithm achieves a regret bound of O(B/c), where B denotes the financial budget and c is the average cost of the incentives. In experiments, the performance of HAIS is about 93% (up to 98%) of the optimal solution and about 9% (up to 40%) better than that of state-of-the-art algorithms across a broad range of settings, which vary in budget size, time limit, number of incentives, standard deviation of the incentives' utilities, and group size of the contests (i.e., the number of participants in a contest).
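To illustrate the problem setting described in the abstract (not the HAIS algorithm itself), the sketch below simulates budget-limited incentive selection with a simple epsilon-greedy heuristic: at each step, the requester offers one incentive, pays its cost, and observes a noisy utility, until the budget runs out. All costs, utility distributions, and function names here are hypothetical.

```python
import random

def select_incentives(costs, true_means, budget, epsilon=0.1, seed=0):
    """Budget-limited online incentive selection via an epsilon-greedy
    heuristic (an illustration of the setting, not the HAIS algorithm).

    costs[i]      -- fixed cost of offering incentive i once
    true_means[i] -- hidden mean utility of incentive i (simulated here)
    budget        -- total financial budget B
    Returns the total utility collected before the budget is exhausted.
    """
    rng = random.Random(seed)
    k = len(costs)
    counts = [0] * k          # times each incentive has been offered
    mean_est = [0.0] * k      # running estimate of each incentive's utility
    total_utility = 0.0

    while budget >= min(costs):
        # Only incentives we can still afford are candidates.
        affordable = [i for i in range(k) if costs[i] <= budget]
        # Explore with probability epsilon (or if nothing has been tried);
        # otherwise exploit the best estimated utility-per-cost ratio.
        if rng.random() < epsilon or all(counts[i] == 0 for i in affordable):
            i = rng.choice(affordable)
        else:
            i = max(affordable, key=lambda j: mean_est[j] / costs[j])

        # Offer incentive i, pay its cost, observe a noisy utility.
        utility = rng.gauss(true_means[i], 1.0)
        budget -= costs[i]
        total_utility += utility
        counts[i] += 1
        mean_est[i] += (utility - mean_est[i]) / counts[i]

    return total_utility
```

Unlike this naive heuristic, HAIS additionally exploits structure such as the time limit and the contests' group sizes; the sketch only conveys the explore-versus-exploit trade-off under a budget constraint.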

Text: Efficient and adaptive incentive selection for crowdsourcing contests - Version of Record
Available under License Creative Commons Attribution.
Download (3MB)

More information

Accepted/In Press date: 6 April 2022
e-pub ahead of print date: 6 August 2022
Published date: 6 August 2022
Additional Information: Funding Information: Nhat Truong is funded by the 911 Project of the Vietnamese Government. Sebastian Stein is funded via a UKRI Turing AI Acceleration Fellowship (EP/V022067/1). The authors acknowledge the use of the IRIDIS High Performance Computing Facility, and associated support services at the University of Southampton, in the completion of this work. Publisher Copyright: © 2022, The Author(s).
Keywords: Crowdsourcing, Incentive, Online decision making

Identifiers

Local EPrints ID: 469543
URI: http://eprints.soton.ac.uk/id/eprint/469543
ISSN: 0924-669X
PURE UUID: 29b9a13e-0023-4a95-a952-2a998fe63721
ORCID for Nhat Van-Quoc Truong: orcid.org/0000-0003-4945-8197
ORCID for Le Cong Dinh: orcid.org/0000-0002-3306-0603
ORCID for Sebastian Stein: orcid.org/0000-0003-2858-8857

Catalogue record

Date deposited: 16 Sep 2022 17:10
Last modified: 17 Mar 2024 03:13

Contributors

Author: Nhat Van-Quoc Truong
Author: Le Cong Dinh
Author: Sebastian Stein
Author: Long Tran-Thanh
Author: Nicholas R. Jennings

Download statistics

Downloads from ePrints over the past year. Other digital versions may also be available to download e.g. from the publisher's website.

Contact ePrints Soton: eprints@soton.ac.uk

ePrints Soton supports OAI 2.0 with a base URL of http://eprints.soton.ac.uk/cgi/oai2

This repository has been built using EPrints software, developed at the University of Southampton, but available to everyone to use.
