Regularised gradient boosting for financial time-series modelling


Article ID: iaor20172712
Volume: 14
Issue: 3
Start Page Number: 367
End Page Number: 391
Publication Date: Jul 2017
Journal: Computational Management Science
Authors: , ,
Keywords: time series: forecasting methods, simulation, investment
Abstract:

Gradient Boosting (GB) learns an additive expansion of simple basis-models. This is accomplished by iteratively fitting an elementary model to the negative gradient of a loss function with respect to the expansion's value at each training data-point. For the squared-error loss function, the negative gradient is simply the ordinary residual at a given training data-point. Studies have demonstrated that running GB for hundreds of iterations can lead to overfitting, while a number of authors have shown that when noise is present in the training data, generalisation can degrade even with relatively few basis-models. Regularisation is realised by shrinking every newly-added basis-model before it joins the expansion. This paper demonstrates that GB with shrinkage-based regularisation is still prone to overfitting on noisy datasets. We use a transformation based on a sigmoidal function to reduce the influence of extreme values in the residuals of a GB iteration without removing them from the training set. This extension is built on top of shrinkage-based regularisation. Simulations using synthetic, noisy data show that the proposed method slows down overfitting and reduces the generalisation error of regularised GB. The proposed method is then applied to the inherently noisy domain of financial time-series modelling. Results suggest that for the majority of datasets the method generalises better than standard regularised GB, as well as a range of other time-series modelling methods.
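The mechanics described in the abstract can be sketched as follows: squared-error GB fits each basis-model to the current residuals, shrinks its contribution, and (in the spirit of the paper's extension) squashes extreme residuals through a bounded sigmoidal map before fitting. This is a minimal illustration, not the authors' implementation; the stump base-learner, the `tanh`-based squashing with scale `c`, and all parameter names are assumptions made for the sketch.

```python
import numpy as np

def fit_stump(X, r):
    """Fit a depth-1 regression tree (stump) to targets r by
    exhaustive search over split points; a simple basis-model."""
    best = None
    for j in range(X.shape[1]):
        order = np.argsort(X[:, j])
        xs, rs = X[order, j], r[order]
        for i in range(1, len(xs)):
            if xs[i] == xs[i - 1]:
                continue
            thr = 0.5 * (xs[i] + xs[i - 1])
            left, right = rs[:i].mean(), rs[i:].mean()
            sse = ((rs[:i] - left) ** 2).sum() + ((rs[i:] - right) ** 2).sum()
            if best is None or sse < best[0]:
                best = (sse, j, thr, left, right)
    _, j, thr, left, right = best
    return lambda Z: np.where(Z[:, j] <= thr, left, right)

def boost(X, y, n_iter=100, shrinkage=0.1, c=1.0):
    """Regularised GB for squared-error loss.

    The negative gradient is the ordinary residual y - F(x).
    Before fitting each stump, residuals are passed through a
    bounded sigmoidal map (here c * tanh(r / c), an assumed choice)
    so that extreme values are damped rather than removed.
    """
    F = np.full(len(y), y.mean())     # initial constant model
    const, models = y.mean(), []
    for _ in range(n_iter):
        r = y - F                     # negative gradient of squared error
        r_sq = c * np.tanh(r / c)     # squash extreme residuals to (-c, c)
        h = fit_stump(X, r_sq)
        models.append(h)
        F = F + shrinkage * h(X)      # shrinkage-based regularisation

    def predict(Z):
        out = np.full(len(Z), const)
        for h in models:
            out += shrinkage * h(Z)
        return out
    return predict
```

For heavy-tailed residuals the squashing step limits how much any single extreme data-point can steer a basis-model, which is the mechanism the abstract credits with slowing down overfitting on noisy data.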
