Abstract
The k-NN (k nearest neighbour) density estimate is a nonparametric estimation method widely used in machine learning and data analysis. The convergence of the k-NN approach has been intensively investigated; in particular, the equivalence of convergence in the weak and strong senses (i.e. in probability and almost surely) has been developed. In this note, we show that convergence in probability of the k-NN estimator is equivalent to its convergence in the \(L^2\) sense. Moreover, some relevant asymptotic results about the expectations of the k-NN estimator are established.
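As a concrete illustration (not part of the paper itself), the one-dimensional k-NN density estimate evaluates \(f_n(x) = k / (n \cdot 2R_k(x))\), where \(R_k(x)\) is the distance from \(x\) to its k-th nearest sample point, so that \(2R_k(x)\) is the length of the smallest symmetric interval around \(x\) containing k sample points. A minimal Python sketch, with the function name `knn_density` chosen here for illustration:

```python
import math
import random

def knn_density(x, sample, k):
    """One-dimensional k-NN density estimate at x:
    f_n(x) = k / (n * 2 * R_k(x)), where R_k(x) is the
    distance from x to its k-th nearest sample point."""
    n = len(sample)
    dists = sorted(abs(s - x) for s in sample)
    r_k = dists[k - 1]              # distance to the k-th nearest neighbour
    return k / (n * 2.0 * r_k)

# Estimate the standard normal density at 0 from n samples.
# With k on the order of sqrt(n), the estimate should approach
# 1/sqrt(2*pi) (approximately 0.399) as n grows.
random.seed(0)
n = 20000
sample = [random.gauss(0.0, 1.0) for _ in range(n)]
k = int(math.sqrt(n))
est = knn_density(0.0, sample, k)
print(est)
```

The convergence results in the note concern exactly this estimator as \(n \to \infty\) with \(k = k(n)\) growing suitably (e.g. \(k/n \to 0\) while \(k \to \infty\)).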
Acknowledgements
The authors acknowledge financial support from the National NSF of China under Grant No. 61472343, as well as from the Open Fund of the State Key Laboratory of Software Development Environment, Beihang University, under Grant No. SKLSDE-2015KF-05.
Copyright information
© 2016 Springer International Publishing AG
About this paper
Cite this paper
Ding, J., Zhu, X. (2016). A Note on the k-NN Density Estimate. In: Yin, H., et al. Intelligent Data Engineering and Automated Learning – IDEAL 2016. IDEAL 2016. Lecture Notes in Computer Science(), vol 9937. Springer, Cham. https://doi.org/10.1007/978-3-319-46257-8_9
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-46256-1
Online ISBN: 978-3-319-46257-8
eBook Packages: Computer Science (R0)