Preasymptotic error bounds via metric entropy
The approximation of multivariate functions from limited amounts of data is a central subject in modern approximation theory, numerical mathematics, and statistics. In terms of worst-case error bounds, an essential problem arises when the number of function variables becomes large: many classical asymptotic a priori estimates then turn out to be valid only for exponentially large numbers of data points. As a consequence, these error estimates not only lose their practical value; they also tell us little about the inherent complexity of the approximation problem. In particular, it is not possible to decide whether or not the curse of dimensionality is present. The present thesis addresses this issue and proves new preasymptotic error bounds for a number of high-dimensional approximation problems, including the recovery of ridge functions (the building blocks of artificial neural networks) and the approximation of quantum-mechanical wave functions. The new error bounds allow a clearer understanding of the interplay between model assumptions, worst-case errors, complexity, and rates of convergence. To obtain these bounds, we prove novel two-sided error estimates which reduce the decay of worst-case errors to the decay of entropy numbers of certain balls and spheres in finite-dimensional quasi-Banach spaces.
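For orientation, the entropy numbers referred to above are the standard dyadic entropy numbers from the literature; the following is the textbook definition, not a formula specific to this thesis, and the symbols T, X, Y, B_X, B_Y are generic. For a bounded operator T: X -> Y between quasi-Banach spaces, with B_X and B_Y the respective unit balls, the k-th entropy number is

\[
e_k(T) \;=\; \inf\Bigl\{ \varepsilon > 0 \;:\; \exists\, y_1,\dots,y_{2^{k-1}} \in Y \ \text{such that}\ T(B_X) \subseteq \bigcup_{j=1}^{2^{k-1}} \bigl( y_j + \varepsilon\, B_Y \bigr) \Bigr\}.
\]

Thus the decay of e_k(T) as k grows quantifies how compactly T(B_X) can be covered, and two-sided estimates of the kind proved in the thesis transfer this decay to worst-case approximation errors.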