About almost independent linear statistics (Q1366363)
From MaRDI portal
scientific article; zbMATH DE number 1059787
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | About almost independent linear statistics | scientific article; zbMATH DE number 1059787 | |
Statements
About almost independent linear statistics (English)
30 October 1997
One of the first results in the area of characterizations by the independence of statistics was \textit{S. N. Bernstein}'s theorem [Trudy Leningr. Politech. Inst. 3, 21-22 (1941)]. According to this theorem, if independent, identically distributed random variables \(X_1\) and \(X_2\) are such that their sum \(X_1+X_2\) and difference \(X_1-X_2\) are independent, then \(X_1\) and \(X_2\) are normal random variables. \textit{B. Gnedenko} [Izv. Akad. Nauk SSSR, Ser. Mat. 12, 17-100 (1948; Zbl 0030.03901)] generalized this result in the following way: if the linear statistics \(L_1=a_1X_1+ a_2X_2\) and \(L_2=b_1X_1+ b_2X_2\) are independent for coefficients satisfying \(a_1b_1\neq 0\) and \(a_2b_2\neq 0\), then \(X_1\) and \(X_2\) are normal. By definition, the random variables \(X\) and \(Y\) are called \((\rho, \varepsilon)\)-independent (or simply \(\varepsilon\)-independent) if \[ \rho(F_{(X,Y)},F_X\cdot F_Y)= \sup_{x,y} \bigl| F_{(X,Y)}(x,y)-F_X(x)F_Y(y) \bigr| \leq \varepsilon, \] where \(F_{(X,Y)}(x,y)= P(X<x,Y<y)\), \(F_X(x)= P(X<x)\), and \(\rho\) is the uniform metric. Many authors have considered the problem of \(\varepsilon\)-independence of the statistics \(L_1\) and \(L_2\). This work is a further extension of investigations in this direction.
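As a purely illustrative sketch (not taken from the article under review), the quantity in the definition of \((\rho,\varepsilon)\)-independence can be estimated empirically. The snippet below forms Bernstein's statistics \(U=X_1+X_2\) and \(V=X_1-X_2\) from i.i.d. standard normal samples, for which \(U\) and \(V\) are exactly independent, and evaluates the supremum \(\sup_{x,y}|F_{(U,V)}(x,y)-F_U(x)F_V(y)|\) over a quantile grid, with all distribution functions replaced by their empirical counterparts; the sample size and grid resolution are arbitrary choices.

```python
import numpy as np

# Draw i.i.d. standard normal samples X1, X2 and form Bernstein's
# linear statistics U = X1 + X2 and V = X1 - X2 (exactly independent
# in the normal case, by Bernstein's theorem).
rng = np.random.default_rng(0)
n = 5000
x1 = rng.standard_normal(n)
x2 = rng.standard_normal(n)
u = x1 + x2
v = x1 - x2

# Empirical version of the uniform-metric quantity
# sup_{x,y} |F_{(U,V)}(x,y) - F_U(x) F_V(y)|,
# evaluated on a grid of sample quantiles of U and V.
xs = np.quantile(u, np.linspace(0.01, 0.99, 40))
ys = np.quantile(v, np.linspace(0.01, 0.99, 40))

F_joint = np.array([[np.mean((u < x) & (v < y)) for y in ys] for x in xs])
F_u = np.array([np.mean(u < x) for x in xs])
F_v = np.array([np.mean(v < y) for y in ys])

eps = float(np.max(np.abs(F_joint - np.outer(F_u, F_v))))
print(f"empirical epsilon: {eps:.4f}")
```

For genuinely independent statistics the estimate is small, of the order of the sampling error \(O(n^{-1/2})\); replacing the normal inputs with a non-normal distribution makes \(U\) and \(V\) dependent and the estimate correspondingly larger, in line with the characterization results recalled above.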
characterizations
independence
linear statistics