The K-nearest neighbor (KNN) estimate proposed by Loftsgaarden and Quesenberry gives an unbiased and consistent estimate of the probability density p(X) when K, the number of nearest neighbors considered, and N, the total number of observations available, tend to infinity in such a way that K/N → 0. Hence excellent results may be obtained in large sample problems by using the KNN method for either density estimation or classification. A class of new KNN estimates is proposed as weighted averages of K KNN estimates, and it is shown that in small sample problems they give estimates closer to the true probability density than the traditional KNN estimates. Further, on the basis of experimental results, the authors demonstrate that the KNN rules based on these estimates are suitable for small sample classification problems.
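The two estimators discussed above can be sketched as follows. The first function implements the standard Loftsgaarden-Quesenberry estimate, p̂(x) = K / (N · V_K(x)), where V_K(x) is the volume of the smallest ball centred at x that contains the K nearest observations. The second forms a weighted average of the k-NN estimates for k = 1, …, K; uniform weights are used here purely for illustration, since the abstract does not specify the weighting scheme of the proposed class.

```python
import numpy as np
from math import gamma, pi

def knn_density(x, data, K):
    """Loftsgaarden-Quesenberry K-NN density estimate at point x.

    p_hat(x) = K / (N * V_K(x)), where V_K(x) is the volume of the
    smallest ball centred at x containing the K nearest observations.
    """
    data = np.asarray(data, dtype=float)
    N, d = data.shape
    dists = np.sort(np.linalg.norm(data - x, axis=1))
    r = dists[K - 1]  # distance to the K-th nearest neighbour
    # volume of a d-dimensional ball of radius r
    v = pi ** (d / 2) / gamma(d / 2 + 1) * r ** d
    return K / (N * v)

def weighted_knn_density(x, data, K, weights=None):
    """Weighted average of the k-NN estimates for k = 1..K.

    A sketch of the proposed class; uniform weights are an
    illustrative assumption, not the paper's choice.
    """
    if weights is None:
        weights = np.full(K, 1.0 / K)
    ests = np.array([knn_density(x, data, k) for k in range(1, K + 1)])
    return float(np.dot(weights, ests))
```

For example, with 500 observations drawn from a standard normal, `knn_density(np.array([0.0]), data, 25)` should land near the true density value of about 0.4 at the origin.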