Singular value decomposition (SVD) has been widely used in the literature to recover the missing entries of a matrix. The basic principle in such methods is to assume that the correlated data have a low-rank structure, which is then exploited to predict the missing entries. SVD rests on the assumption that the data (user ratings) lie near a low-dimensional linear subspace. This is not always the case; the data may instead lie on a nonlinear manifold. Therefore, in this paper, we explore kernel feature extraction as a complement to off-the-shelf methods for improving their accuracy. The extracted features can be used to enhance a variety of existing methods, such as biased matrix factorization and SVD++. We present experimental results illustrating the effectiveness of this approach.
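To make the low-rank principle concrete, the following is a minimal sketch (not the paper's algorithm) of a standard iterative SVD-based imputation scheme: missing entries are initialized with the global mean, and the matrix is repeatedly projected onto a rank-k approximation while the observed entries are kept fixed. The helper name `svd_impute` and all parameters are illustrative assumptions.

```python
import numpy as np

def svd_impute(R, rank=2, n_iters=50):
    """Illustrative sketch: impute NaN entries of R via iterated rank-k SVD.

    Observed entries are held fixed; missing entries are repeatedly
    replaced by the rank-k SVD reconstruction (hard-impute style).
    """
    mask = ~np.isnan(R)                       # True where ratings are observed
    X = np.where(mask, R, np.nanmean(R))      # initialize missing entries with the global mean
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # best rank-k approximation
        X = np.where(mask, R, X_low)          # keep observed entries, update missing ones
    return X

# Toy example: a rank-1 ratings matrix with one missing entry.
R = np.outer([1.0, 2.0, 3.0], [1.0, 2.0, 3.0])
R[1, 2] = np.nan
completed = svd_impute(R, rank=1)
```

Because the toy matrix is exactly rank-1, the iteration recovers the missing entry; for real rating data, which this paper argues may lie on a nonlinear manifold, such a purely linear reconstruction is precisely where kernel feature extraction can help.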