Article ID: iaor20123377
Volume: 53
Issue: 1
Start Page Number: 245
End Page Number: 256
Publication Date: Apr 2012
Journal: Decision Support Systems
Authors: Hu Paul Jen-Hwa, Lee Yen-Hsien, Cheng Tsang-Hsiang, Hsieh Ya-Fang
Keywords: decision: studies
Existing supervised learning techniques can support product recommendations in business‐to‐consumer e‐commerce but become ineffective in scenarios characterized by single‐class learning, i.e., when the training sample contains labeled examples pertaining to only one outcome class (positive or negative). To address this challenge, we develop a COst‐sensitive Learning‐based Positive Example Learning (COLPEL) technique, which constructs an automated classifier from a training sample composed of positive examples and a much larger number of unlabeled examples. The proposed technique incorporates cost‐proportionate rejection sampling to derive, from the unlabeled examples, a subset likely to contain negative examples for the training sample. Our technique follows a committee machine approach, constructing a set of classifiers that make joint product recommendations and thereby mitigating the potential biases common to the use of a single classifier. We evaluate the proposed method with customers' book ratings collected from Amazon.com and include two prevalent techniques as benchmarks: positive naïve Bayes and positive example‐based learning. According to our results, the proposed COLPEL technique outperforms both benchmarks, as measured by accuracy as well as positive and negative F1 scores.
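The abstract does not give implementation details, but the two core ideas it names (cost‐proportionate rejection sampling over the unlabeled examples, followed by a committee of classifiers voting jointly) can be sketched roughly as follows. This Python sketch is illustrative only: the function names, the use of naïve Bayes base learners, and the assumption that each unlabeled example carries a cost reflecting how likely it is to be negative (e.g., from a preliminary classifier) are assumptions, not the authors' implementation.

```python
import numpy as np
from sklearn.naive_bayes import BernoulliNB


def rejection_sample(X_unlabeled, costs, rng):
    """Cost-proportionate rejection sampling: keep each unlabeled example
    with probability cost / max(cost), so examples with higher cost
    (assumed here to be those more likely negative) are retained more often."""
    costs = np.asarray(costs, dtype=float)
    keep = rng.random(len(costs)) < costs / costs.max()
    return X_unlabeled[keep]


def train_committee(X_pos, X_unlabeled, costs, n_members=5, seed=0):
    """Committee machine construction: each member is trained on all
    positive examples plus an independently drawn likely-negative subset."""
    rng = np.random.default_rng(seed)
    committee = []
    for _ in range(n_members):
        X_neg = rejection_sample(X_unlabeled, costs, rng)
        X = np.vstack([X_pos, X_neg])
        y = np.concatenate([np.ones(len(X_pos)), np.zeros(len(X_neg))])
        committee.append(BernoulliNB().fit(X, y))
    return committee


def predict_majority(committee, X):
    """Joint recommendation by majority vote across committee members,
    which reduces the bias of relying on any single classifier."""
    votes = np.mean([clf.predict(X) for clf in committee], axis=0)
    return (votes >= 0.5).astype(int)
```

In this sketch the committee's diversity comes entirely from the randomness of the rejection sampling step, so each member sees the same positives but a different likely-negative subset; the majority vote then plays the role of the joint recommendation described in the abstract.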