DSpace@MIT

Stable Extrapolation of Analytic Functions

Author(s)
Demanet, Laurent; Townsend, Alex
Download: 10208_2018_9384_ReferencePDF.pdf (868.0 KB)
Open Access Policy

Terms of use
Creative Commons Attribution-Noncommercial-Share Alike http://creativecommons.org/licenses/by-nc-sa/4.0/
Abstract
This paper examines the problem of extrapolation of an analytic function for $x > 1$ given $N+1$ perturbed samples from an equally spaced grid on $[-1,1]$. For a function $f$ on $[-1,1]$ that is analytic in a Bernstein ellipse with parameter $\rho > 1$, and for a uniform perturbation level $\varepsilon$ on the function samples, we construct an asymptotically best extrapolant $e(x)$ as a least squares polynomial approximant of degree $M^*$ determined explicitly. We show that the extrapolant $e(x)$ converges to $f(x)$ pointwise in the interval $I_\rho = [1,(\rho+\rho^{-1})/2)$ as $\varepsilon \rightarrow 0$, at a rate given by an $x$-dependent fractional power of $\varepsilon$. More precisely, for each $x \in I_{\rho}$ we have
$$|f(x) - e(x)| = \mathcal{O}\left( \varepsilon^{-\log r(x) / \log \rho} \right), \quad r(x) = \frac{x+\sqrt{x^2-1}}{\rho},$$
up to log factors, provided that an oversampling condition is satisfied, viz.
$$M^* \le \frac{1}{2} \sqrt{N},$$
which is known to be needed from approximation theory. In short, extrapolation enjoys a weak form of stability, up to a fraction of the characteristic smoothness length. The number of function samples does not bear on the size of the extrapolation error provided that it obeys the oversampling condition. We also show that one cannot construct an asymptotically more accurate extrapolant from equally spaced samples than $e(x)$, using any other linear or nonlinear procedure. The proofs involve original statements on the stability of polynomial approximation in the Chebyshev basis from equally spaced samples; these are expected to be of independent interest.
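The scheme described in the abstract can be sketched in a few lines of NumPy: fit a Chebyshev polynomial to the $N+1$ equispaced, perturbed samples by least squares, with the degree capped by the oversampling condition $M \le \frac{1}{2}\sqrt{N}$, then evaluate the fit at $x > 1$. This is an illustrative sketch only; the degree used below is simply the oversampling bound, not the paper's explicitly determined optimal degree $M^*$, and the test function and perturbation level are hypothetical choices.

```python
import numpy as np

def extrapolate(samples, x):
    """Least-squares Chebyshev fit to N+1 equispaced samples on [-1,1],
    evaluated at x (possibly x > 1)."""
    N = len(samples) - 1
    M = int(np.floor(0.5 * np.sqrt(N)))    # oversampling condition: M <= sqrt(N)/2
    grid = np.linspace(-1.0, 1.0, N + 1)   # equally spaced sample points
    # Vandermonde-type matrix in the Chebyshev basis T_0, ..., T_M
    A = np.polynomial.chebyshev.chebvander(grid, M)
    coeffs = np.linalg.lstsq(A, samples, rcond=None)[0]
    return np.polynomial.chebyshev.chebval(x, coeffs)

# Example: f is analytic in a Bernstein ellipse (poles at +-2i, so rho = 2 + sqrt(5)),
# and the samples carry a uniform perturbation of size eps.
rng = np.random.default_rng(0)
f = lambda x: 1.0 / (1.0 + x**2 / 4.0)
N, eps = 400, 1e-8
samples = f(np.linspace(-1, 1, N + 1)) + eps * rng.uniform(-1, 1, N + 1)
print(abs(f(1.2) - extrapolate(samples, 1.2)))  # small error at x = 1.2 inside I_rho
```

For this $f$, the extrapolation interval $I_\rho$ extends to $(\rho+\rho^{-1})/2 \approx 2.24$, so $x = 1.2$ lies well inside it; the error grows rapidly as $x$ approaches the endpoint, consistent with the $\varepsilon^{-\log r(x)/\log \rho}$ rate.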
Date issued
2018-03-21
URI
https://hdl.handle.net/1721.1/131505
Department
Massachusetts Institute of Technology. Department of Mathematics
Publisher
Springer US

Collections
  • MIT Open Access Articles
