Error Bounds and the Asymptotic Setting in Kernel-Based Approximation

Author
Karvonen T.
Abstract

We use ideas from Gaussian process regression to derive computable error bounds that can be used as stopping criteria in kernel-based approximation. The proposed bounds are based on maximum likelihood estimation and cross-validation of a kernel scale parameter and take the form of a product of the scale parameter estimate and the worst-case approximation error in the reproducing kernel Hilbert space induced by the kernel. We also use known results on the so-called asymptotic setting to argue that such worst-case-type error bounds are not necessarily conservative.
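As an illustrative sketch of the general shape of such bounds (not the paper's exact estimator or constants), the snippet below interpolates a hypothetical test function with a Gaussian kernel, forms the standard maximum likelihood estimate of a kernel scale parameter, and multiplies it by the worst-case error in the RKHS unit ball (the power function). The kernel, length-scale, test function, and node placement are all assumptions chosen for the demonstration.

```python
import numpy as np

def gauss_kernel(x, y, ell=0.2):
    # Gaussian (RBF) kernel with an assumed length-scale ell
    return np.exp(-(x[:, None] - y[None, :]) ** 2 / (2 * ell**2))

f = lambda x: np.sin(2 * np.pi * x)   # hypothetical test function
X = np.linspace(0, 1, 10)             # interpolation nodes
y = f(X)
n = len(X)

# Kernel matrix with a small jitter for numerical stability
K = gauss_kernel(X, X) + 1e-8 * np.eye(n)
c = np.linalg.solve(K, y)             # interpolation coefficients

# Maximum likelihood estimate of the scale parameter:
# sigma2_ml = y^T K^{-1} y / n
sigma2_ml = y @ c / n

x_new = np.linspace(0, 1, 200)
k_star = gauss_kernel(x_new, X)       # cross-kernel k(x, X)
s = k_star @ c                        # kernel interpolant s(x)

# Power function (worst-case error in the RKHS unit ball):
# P(x)^2 = k(x, x) - k(x, X) K^{-1} k(X, x); here k(x, x) = 1
P2 = np.maximum(
    1.0 - np.einsum("ij,ij->i", k_star, np.linalg.solve(K, k_star.T).T),
    0.0,
)

# Bound of the product form "scale estimate times worst-case error";
# the paper's bounds have this structure, though the exact constants differ.
bound = np.sqrt(sigma2_ml * P2)

err = np.abs(f(x_new) - s)
print(f"max error: {err.max():.2e}, max bound: {bound.max():.2e}")
```

In practice a bound of this form can serve as a stopping criterion: nodes are added until the computable quantity `bound` falls below a tolerance, with no access to the true error needed.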

Karvonen T. (2022) "Error Bounds and the Asymptotic Setting in Kernel-Based Approximation", Dolomites Research Notes on Approximation, 15(3), 65-77. DOI: 10.14658/PUPJ-DRNA-2022-3-7
Year of Publication
2022
Journal
Dolomites Research Notes on Approximation
Volume
15
Issue Number
3
Start Page
65
Last Page
77
Date Published
October 2022
ISSN Number
2035-6803
Serial Article Number
7
DOI
10.14658/PUPJ-DRNA-2022-3-7
Issue Section
Special Issue 3