One-sided convergence conditions of Lagrange interpolation based on Jacobi-type weights (Q1878573)
From MaRDI portal
scientific article; zbMATH DE number 2098981
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | One-sided convergence conditions of Lagrange interpolation based on Jacobi-type weights | scientific article; zbMATH DE number 2098981 | |
Statements
One-sided convergence conditions of Lagrange interpolation based on Jacobi-type weights (English)
7 September 2004
The authors state a one-sided convergence theorem for Lagrange interpolation. This result says that if \[ -1<\gamma=\max (\alpha,\beta)<\frac 1 2 \] is fixed, \(f\in C(I)\) and, as \(t\to 0^+\), \[ \delta(f;t)=\begin{cases} o\big(| \log t| ^{-1}\big), & \text{if}\;-1<\gamma\leq -\dfrac 1 2, \\ o\big(t^{\gamma+\frac 1 2}\big), & \text{if}\;-\dfrac 1 2 <\gamma <\dfrac 1 2, \end{cases} \] then \(\lim_{n\to\infty}\| L_n^{(\alpha,\beta)} (f)-f\| =0\). Here \(\| f\| \) is the usual supremum norm of \(f\) on \(I\), and \(\delta (f;t)\) denotes the so-called one-sided modulus of continuity of \(f\). A crucial point in the investigation of Lagrange interpolation is a precise estimate, at the nodes, of the first derivative of the polynomials whose zeros serve as the interpolation nodes. In the Jacobi case this derivative is easily estimated from the well-known differential equation; for generalized Jacobi polynomials the authors estimate the derivatives in a different way. They extend the convergence result to Lagrange interpolation based on the zeros of generalized Jacobi polynomials and prove that, in a sense, the results are best possible.
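As a rough illustration of the theorem (not taken from the paper), the following Python sketch interpolates a continuous test function at the zeros of the Jacobi polynomial \(P_n^{(\alpha,\beta)}\) and tracks the sup-norm error on \([-1,1]\); the test function, the parameters \(\alpha,\beta\), the degrees, and the use of SciPy's `roots_jacobi` and `BarycentricInterpolator` are assumptions made for this sketch only. For the chosen \(f\), the ordinary (hence also the one-sided) modulus of continuity is of order \(t^{0.8}=o(t^{\gamma+\frac 1 2})\), so the theorem predicts uniform convergence.

```python
# A small numerical sketch (not part of the reviewed paper): interpolate a
# continuous test function at the zeros of the Jacobi polynomial
# P_n^(alpha, beta) and estimate the sup-norm error on [-1, 1].
# The test function, the parameters alpha, beta, and the degrees below are
# illustrative assumptions only.
import numpy as np
from scipy.special import roots_jacobi
from scipy.interpolate import BarycentricInterpolator


def f(x):
    # Continuous on [-1, 1], with an inner singularity at 0;
    # its modulus of continuity behaves like t**0.8.
    return np.abs(x) ** 0.8


alpha, beta = -0.3, 0.1                  # gamma = max(alpha, beta) = 0.1 < 1/2
x_fine = np.linspace(-1.0, 1.0, 20001)   # dense grid for estimating the sup norm

for n in (8, 16, 32, 64, 128):
    nodes, _ = roots_jacobi(n, alpha, beta)        # zeros of P_n^(alpha, beta)
    L_n = BarycentricInterpolator(nodes, f(nodes))  # Lagrange interpolant L_n(f)
    err = np.max(np.abs(L_n(x_fine) - f(x_fine)))
    print(f"n = {n:4d}   sup-norm error ~ {err:.3e}")
```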
interpolation
generalized Jacobi polynomials
inner singularities