A Comparative Study of Nearest Neighbor Regression and Nadaraya Watson Regression


  • Sarwar A. Hamad, Department of Mathematics, University of Zakho, Kurdistan Region, Iraq
  • Kawa S. Mohamed Ali, Department of Mathematics, University of Zakho, Kurdistan Region, Iraq




Keywords: Nadaraya–Watson regression, nearest neighbor regression, Monte Carlo simulation


Two non-parametric statistical methods are studied in this work: nearest neighbor regression and the Nadaraya–Watson kernel smoothing technique. We prove that, under a precise condition, the nearest neighbor estimator and the Nadaraya–Watson smoother produce smoothed data with the same error level, meaning they have the same performance. A further result of the paper is that the nearest neighbor estimator performs better locally, but it shows a graphical weakness when a large data set is considered on a global scale.
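The comparison described above can be sketched numerically. The following is a minimal illustration, not the paper's actual experiment: the test function (a noisy sine curve), the neighbor count k, the Gaussian kernel, and the bandwidth h are all our own choices for demonstration.

```python
import numpy as np

def knn_regression(x_train, y_train, x_eval, k):
    # Nearest neighbor estimate: average the responses of the k closest points.
    y_hat = np.empty_like(x_eval)
    for i, x0 in enumerate(x_eval):
        idx = np.argsort(np.abs(x_train - x0))[:k]
        y_hat[i] = y_train[idx].mean()
    return y_hat

def nadaraya_watson(x_train, y_train, x_eval, h):
    # Nadaraya-Watson estimate: kernel-weighted average with bandwidth h.
    y_hat = np.empty_like(x_eval)
    for i, x0 in enumerate(x_eval):
        w = np.exp(-0.5 * ((x_train - x0) / h) ** 2)  # Gaussian kernel weights
        y_hat[i] = np.sum(w * y_train) / np.sum(w)
    return y_hat

# Monte Carlo comparison on a noisy sine curve (illustrative setup only).
rng = np.random.default_rng(0)
mse_knn, mse_nw = [], []
for _ in range(200):
    x = np.sort(rng.uniform(0.0, 2 * np.pi, 100))
    y = np.sin(x) + rng.normal(0.0, 0.3, 100)
    grid = np.linspace(0.5, 2 * np.pi - 0.5, 50)  # interior grid avoids boundary effects
    truth = np.sin(grid)
    mse_knn.append(np.mean((knn_regression(x, y, grid, k=10) - truth) ** 2))
    mse_nw.append(np.mean((nadaraya_watson(x, y, grid, h=0.3) - truth) ** 2))

print(f"kNN mean MSE: {np.mean(mse_knn):.4f}")
print(f"NW  mean MSE: {np.mean(mse_nw):.4f}")
```

With comparable smoothing parameters the two estimators typically attain errors of the same order, consistent with the equivalence result stated in the abstract.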







How to Cite

A. Hamad, S., & S. Mohamed Ali, K. (2021). A Comparative Study of Nearest Neighbor Regression and Nadaraya Watson Regression. Academic Journal of Nawroz University, 10(2), 180–188. https://doi.org/10.25007/ajnu.v10n2a505