Convergence of the method of least squares (Q1326060)
From MaRDI portal
scientific article; zbMATH DE number 567902
13 July 1994
Let \(f(t)\) be defined and continuous on the interval \([-1,1]\). Suppose we are given measurements \(y_j = f(t_j) + g_j\) \((j = 0,1,\dots,N-1)\) of \(f\), where \(\{t_0,t_1,\dots,t_{N-1}\}\) is a system of distinct points on the segment \([-1,1]\), and the observational errors \(g_j\) \((j = 0,1,\dots,N-1)\) are independent random variables satisfying \(E(g_j) = 0\) and \(E(g_k g_j) = \sigma^2 \delta_{jk}\). Let \(S_{n,N}(t)\) denote an algebraic polynomial minimizing the sum \(\sum^{N-1}_{j=0} (y_j - P_n(t_j))^2\) over all polynomials \(P_n(t)\) of degree \(n \leq N-1\), and set \(J_{n,N}(f,t) = E((f(t) - S_{n,N}(t))^2)\). The following result is obtained. Theorem: If \(f \in \text{Lip}_s M\) \((s \geq 1/2)\), the condition \(n^2/N \leq a_n = o(1)\) \((N \to \infty)\) is necessary and sufficient for \(\max_{-1 \leq t \leq 1} J_{n,N}(f,t)\) to tend to zero as \(n \to \infty\). If \(f\) satisfies a Dini-Lipschitz condition on \([-1,1]\), then the condition \(n^2/N \leq a\) \((a > 0)\) is sufficient for \(\max\{J_{n,N}(f,t) : -1+\varepsilon \leq t \leq 1-\varepsilon\}\) to approach zero as \(n \to \infty\).
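As a rough numerical illustration of the quantity \(J_{n,N}(f,t)\), the following Python sketch (assuming NumPy is available; all function names, the choice of equispaced sample points, Gaussian noise, and parameter values are illustrative assumptions, not taken from the paper) estimates the mean-square error of the degree-\(n\) least-squares polynomial by Monte Carlo for the Lipschitz function \(f(t) = |t|\):

```python
import numpy as np

def mean_square_error_curve(f, n, N, sigma=0.1, trials=200, seed=None):
    """Monte Carlo estimate of J_{n,N}(f, t) = E[(f(t) - S_{n,N}(t))^2]
    on a grid of evaluation points, where S_{n,N} is the degree-n
    least-squares polynomial fitted to noisy samples y_j = f(t_j) + g_j.
    Sample points, noise model, and parameters are illustrative choices."""
    rng = np.random.default_rng(seed)
    t = np.linspace(-1.0, 1.0, N)        # N distinct sample points on [-1, 1]
    grid = np.linspace(-1.0, 1.0, 201)   # evaluation grid for J_{n,N}(f, t)
    sq_err = np.zeros_like(grid)
    for _ in range(trials):
        # errors g_j: independent, E(g_j) = 0, E(g_k g_j) = sigma^2 delta_{jk}
        y = f(t) + sigma * rng.standard_normal(N)
        # coefficients of the polynomial minimizing sum_j (y_j - P_n(t_j))^2
        coeffs = np.polynomial.polynomial.polyfit(t, y, n)
        s = np.polynomial.polynomial.polyval(grid, coeffs)
        sq_err += (f(grid) - s) ** 2
    return grid, sq_err / trials

# f(t) = |t| is Lipschitz on [-1, 1]; with n fixed and N large,
# n^2/N is small and the mean-square error stays uniformly small.
grid, j_est = mean_square_error_curve(np.abs, n=5, N=400, seed=0)
```

Plotting `j_est` against `grid` (or comparing runs with `N` shrunk so that \(n^2/N\) grows) gives a visual sense of the theorem's condition, though a finite experiment can of course only suggest, not verify, the asymptotic statement.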
Keywords: convergence; method of least squares; observational error; random variable