Using adaptive learning in credit scoring to estimate take-up probability distribution
Seow, Hsin-Vonn and Thomas, Lyn C. (2006) Using adaptive learning in credit scoring to estimate take-up probability distribution. European Journal of Operational Research, 173 (3), 880-892. (doi:10.1016/j.ejor.2005.06.058)
Abstract
Credit scoring is used by lenders to minimise the chance of taking on an unprofitable account, with the overall objective of maximising profit. Profit is generated when a good customer accepts an offer from the organisation, so it is also necessary to get customers to accept the offer. A lender can "learn" about customers' preferences by observing which types of product different types of customer have accepted, and must then decide what offer to make. In this model of the acceptance problem, we formulate the lender's decision of which offer to make as a Markov decision process under uncertainty.
The aim of this paper is to develop an adaptive dynamic programming model in which Bayesian updating is employed to better estimate the take-up probability distribution. The significance of Bayesian updating in this model is that it allows previous responses to be incorporated into the decision process: learning from earlier responses helps in selecting the offers to prospective customers that are most likely to be taken up.
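The Bayesian updating of a take-up probability described above can be illustrated with a minimal sketch. This is a generic Beta-Bernoulli conjugate model, not the paper's exact formulation: the lender maintains a Beta belief over an offer's take-up probability and revises it as each accept/reject response is observed.

```python
# Illustrative Beta-Bernoulli updating of a take-up probability.
# This is a hypothetical sketch of Bayesian updating in general,
# not the specific model developed in the paper.

class TakeUpEstimator:
    """Tracks a Beta(alpha, beta) belief over an offer's take-up probability."""

    def __init__(self, alpha: float = 1.0, beta: float = 1.0):
        # Beta(1, 1) is a uniform prior over the take-up probability.
        self.alpha = alpha
        self.beta = beta

    def update(self, accepted: bool) -> None:
        # Each observed response shifts the posterior: an acceptance
        # increments alpha, a rejection increments beta.
        if accepted:
            self.alpha += 1
        else:
            self.beta += 1

    def mean(self) -> float:
        # Posterior mean estimate of the take-up probability.
        return self.alpha / (self.alpha + self.beta)


# Example: starting from a uniform prior, observe three acceptances
# and one rejection; the posterior mean take-up probability is 4/6.
est = TakeUpEstimator()
for response in [True, True, False, True]:
    est.update(response)
print(est.mean())
```

In a decision-process setting like the one the abstract describes, such posterior estimates would feed back into choosing which offer to present next, so that each customer response sharpens future offer selection.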
More information
Published date: 2006
Keywords:
dynamic programming, bayesian updating, take-up probability, credit scoring
Identifiers
Local EPrints ID: 36773
URI: http://eprints.soton.ac.uk/id/eprint/36773
ISSN: 0377-2217
PURE UUID: aabfecd7-03e4-4a0b-9271-9fc80978c205
Catalogue record
Date deposited: 11 Jul 2006
Last modified: 15 Mar 2024 07:57
Contributors
Author:
Hsin-Vonn Seow
Author:
Lyn C. Thomas