Adaptive and non-adaptive minimax rates for weighted Laplacian-Eigenmap based nonparametric regression

Zhaoyang Shi, Krishna Balasubramanian, Wolfgang Polonik
Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, PMLR 238:2800-2808, 2024.

Abstract

We show both adaptive and non-adaptive minimax rates of convergence for a family of weighted Laplacian-Eigenmap based nonparametric regression methods, when the true regression function belongs to a Sobolev space and the sampling density is bounded from above and below. The adaptation methodology is based on extensions of Lepski’s method and is over both the smoothness parameter ($s\in\mathbb{N}_{+}$) and the norm parameter ($M>0$) determining the constraints on the Sobolev space. Our results extend the non-adaptive result in Green et al. (2023), established for a specific normalized graph Laplacian, to a wide class of weighted Laplacian matrices used in practice, including the unnormalized Laplacian and random walk Laplacian.
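To make the family of estimators concrete, the following is a minimal, generic sketch (not the authors' code) of Laplacian-Eigenmap regression: build a weighted neighborhood graph on the design points, form a graph Laplacian (unnormalized or random walk), and regress the responses onto its first K eigenvectors. The Gaussian kernel, bandwidth h, and truncation level K below are illustrative assumptions, not choices prescribed by the paper.

```python
# Sketch of weighted Laplacian-Eigenmap regression on design points X with
# responses y. Assumed ingredients (kernel, bandwidth h, truncation K) are
# placeholders for illustration only.
import numpy as np
from scipy.spatial.distance import cdist


def laplacian_eigenmap_regression(X, y, h=0.3, K=10, laplacian="unnormalized"):
    # Gaussian edge weights on pairwise distances; zero the diagonal so the
    # graph has no self-loops.
    W = np.exp(-cdist(X, X) ** 2 / (2 * h ** 2))
    np.fill_diagonal(W, 0.0)
    D = W.sum(axis=1)  # degrees

    if laplacian == "unnormalized":
        # L = D - W is symmetric, so eigh applies directly.
        L = np.diag(D) - W
        _, V = np.linalg.eigh(L)
    elif laplacian == "random_walk":
        # L_rw = I - D^{-1} W is similar to the symmetric
        # L_sym = I - D^{-1/2} W D^{-1/2}; eigenvectors of L_rw are
        # D^{-1/2} times those of L_sym (a standard implementation trick).
        d_inv_sqrt = 1.0 / np.sqrt(D)
        L_sym = np.eye(len(D)) - (W * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]
        _, V_sym = np.linalg.eigh(L_sym)
        V = V_sym * d_inv_sqrt[:, None]
    else:
        raise ValueError(f"unknown Laplacian type: {laplacian}")

    # Least-squares projection of y onto the first K eigenvectors
    # (those with the smallest eigenvalues); returns fitted values at X.
    V_K = V[:, :K]
    coef, *_ = np.linalg.lstsq(V_K, y, rcond=None)
    return V_K @ coef


# Example usage on synthetic data:
# X = np.random.uniform(size=(200, 2))
# y = np.sin(3 * X[:, 0]) + 0.1 * np.random.randn(200)
# fhat = laplacian_eigenmap_regression(X, y, h=0.2, K=15, laplacian="random_walk")
```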

Cite this Paper


BibTeX
@InProceedings{pmlr-v238-shi24b,
  title     = {Adaptive and non-adaptive minimax rates for weighted {L}aplacian-Eigenmap based nonparametric regression},
  author    = {Shi, Zhaoyang and Balasubramanian, Krishna and Polonik, Wolfgang},
  booktitle = {Proceedings of The 27th International Conference on Artificial Intelligence and Statistics},
  pages     = {2800--2808},
  year      = {2024},
  editor    = {Dasgupta, Sanjoy and Mandt, Stephan and Li, Yingzhen},
  volume    = {238},
  series    = {Proceedings of Machine Learning Research},
  month     = {02--04 May},
  publisher = {PMLR},
  pdf       = {https://proceedings.mlr.press/v238/shi24b/shi24b.pdf},
  url       = {https://proceedings.mlr.press/v238/shi24b.html},
  abstract  = {We show both adaptive and non-adaptive minimax rates of convergence for a family of weighted Laplacian-Eigenmap based nonparametric regression methods, when the true regression function belongs to a Sobolev space and the sampling density is bounded from above and below. The adaptation methodology is based on extensions of Lepski’s method and is over both the smoothness parameter ($s\in\mathbb{N}_{+}$) and the norm parameter ($M>0$) determining the constraints on the Sobolev space. Our results extend the non-adaptive result in Green et al. (2023), established for a specific normalized graph Laplacian, to a wide class of weighted Laplacian matrices used in practice, including the unnormalized Laplacian and random walk Laplacian.}
}
Endnote
%0 Conference Paper
%T Adaptive and non-adaptive minimax rates for weighted Laplacian-Eigenmap based nonparametric regression
%A Zhaoyang Shi
%A Krishna Balasubramanian
%A Wolfgang Polonik
%B Proceedings of The 27th International Conference on Artificial Intelligence and Statistics
%C Proceedings of Machine Learning Research
%D 2024
%E Sanjoy Dasgupta
%E Stephan Mandt
%E Yingzhen Li
%F pmlr-v238-shi24b
%I PMLR
%P 2800--2808
%U https://proceedings.mlr.press/v238/shi24b.html
%V 238
%X We show both adaptive and non-adaptive minimax rates of convergence for a family of weighted Laplacian-Eigenmap based nonparametric regression methods, when the true regression function belongs to a Sobolev space and the sampling density is bounded from above and below. The adaptation methodology is based on extensions of Lepski’s method and is over both the smoothness parameter ($s\in\mathbb{N}_{+}$) and the norm parameter ($M>0$) determining the constraints on the Sobolev space. Our results extend the non-adaptive result in Green et al. (2023), established for a specific normalized graph Laplacian, to a wide class of weighted Laplacian matrices used in practice, including the unnormalized Laplacian and random walk Laplacian.
APA
Shi, Z., Balasubramanian, K. & Polonik, W. (2024). Adaptive and non-adaptive minimax rates for weighted Laplacian-Eigenmap based nonparametric regression. Proceedings of The 27th International Conference on Artificial Intelligence and Statistics, in Proceedings of Machine Learning Research 238:2800-2808. Available from https://proceedings.mlr.press/v238/shi24b.html.