Inequalities for real random variables connected with Jensen's inequality and applications (Q2726121)
scientific article; zbMATH DE number 1620013
| Language | Label | Description | Also known as |
|---|---|---|---|
| English | Inequalities for real random variables connected with Jensen's inequality and applications | scientific article; zbMATH DE number 1620013 | |
Statements
1 March 2002
random variables
Jensen's inequality
Inequalities for real random variables connected with Jensen's inequality and applications (English)
The paper is divided into five sections: 1. Introduction, 2. Main results, 3. The case of i.i.d. real random variables, 4. Applications and examples, 5. A result of approximation. The second section contains some inequalities for sums of random variables, connected with Jensen's integral inequality for convex functions. Sections 3-5 contain applications of the inequalities from the second section. Unfortunately, none of the results of Sections 3-5 is new. Moreover, the inequalities from the second section are not needed to prove them; we show this using Theorems 3.1, 3.2 and 4.4.1 as examples. Let \(\{X_n\}_{n\geq 1}\) be a sequence of independent identically distributed random variables, \(S_n=X_1+\cdots+X_n,\) and \(\varphi\) a convex continuous function defined on an interval \((a,b).\) Suppose that \(X_n\in (a,b)\) for every \(n=1,2,\dots,\) \(E|X_1|<\infty,\) and \(E|\varphi(X_1)|<\infty.\) Theorems 3.1, 3.2 and 4.4.1 claim, respectively:
\[
\varphi(EX_1)\leq E\varphi(S_n/n)\leq E\varphi(X_1);\leqno (1)
\]
\[
\varphi(EX_1)=\lim_{n\rightarrow\infty}E\varphi(S_n/n)=\inf_{n\geq 1}E\varphi(S_n/n);\leqno (2)
\]
if \(-\infty<a<b<\infty\), then
\[
\varphi\left(\frac{a+b}{2}\right)\leq\frac{1}{(b-a)^n}\int_{a}^{b}\cdots\int_{a}^{b}\varphi\left(\frac{1}{n}\sum_{k=1}^{n}x_k\right)dx_1\cdots dx_n\leq\frac{\varphi(a)+\varphi(b)}{2}.\leqno (3)
\]
The left inequality in (1) is valid by Jensen's inequality: \(\varphi(EX_1)=\varphi(ES_n/n)\leq E\varphi(S_n/n).\) The right inequality in (1) results from applying Jensen's inequality to the conditional expectation \(E(X_1\mid S_n).\) Indeed, \(E(X_1\mid S_n/n)=S_n/n\) almost surely [see, for example, \textit{R. G. Laha} and \textit{V. K. Rohatgi}, ``Probability theory'' (1979; Zbl 0409.60001), p. 413].
By Jensen's inequality and properties of conditional expectations we have
\[
E\varphi(S_n/n)=E\varphi(E(X_1\mid S_n/n))\leq E(E(\varphi(X_1)\mid S_n/n))=E\varphi(X_1).
\]
The left equality in (2) is a consequence of the Kolmogorov strong law of large numbers. The right equality in (2) follows from the left inequality in (1), which bounds every term of the sequence from below by \(\varphi(EX_1)\), and the continuity of \(\varphi.\) To prove (3), note that the middle term equals \(E\varphi(S_n/n),\) where \(S_n\) is the sum of independent random variables \(X_1,\dots,X_n\) uniformly distributed on \([a,b].\) The left inequality follows from the left inequality in (1), since \(EX_1=(a+b)/2.\) The right inequality in (3) follows from the right inequality in (1), since by the convexity of \(\varphi\)
\[
E\varphi(X_1)=\frac{1}{b-a}\int_{a}^{b}\varphi(x)\,dx\leq \frac{1}{b-a}\int_{a}^{b}\left[\frac{b-x}{b-a}\varphi(a)+\frac{x-a}{b-a}\varphi(b)\right]dx= \frac{\varphi(a)+\varphi(b)}{2}.
\]
Thus the usefulness of the author's inequalities is doubtful. We hope that genuine applications for them will be found in the future.
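The chain of inequalities in (1) can be illustrated numerically. The following is a minimal Monte Carlo sketch (the reviewer's illustration, not from the paper), assuming the convex function \(\varphi(x)=x^2\) and \(X_i\) uniform on \((0,1)\), so that \(\varphi(EX_1)=1/4,\) \(E\varphi(S_n/n)=1/4+1/(12n)\) and \(E\varphi(X_1)=1/3\):

```python
import random

def jensen_chain_estimates(n=10, trials=200_000, seed=0):
    """Monte Carlo estimates of the three terms of the chain (1):
    phi(E X_1) <= E phi(S_n/n) <= E phi(X_1),
    with the (illustrative) convex phi(x) = x**2 and X_i ~ Uniform(0, 1)."""
    rng = random.Random(seed)
    phi = lambda x: x * x
    sum_phi_avg = 0.0   # accumulates phi(S_n / n)
    sum_phi_x1 = 0.0    # accumulates phi(X_1)
    for _ in range(trials):
        xs = [rng.random() for _ in range(n)]
        sum_phi_avg += phi(sum(xs) / n)
        sum_phi_x1 += phi(xs[0])
    lower = phi(0.5)                  # phi(E X_1), exact
    middle = sum_phi_avg / trials     # estimates E phi(S_n/n) = 1/4 + 1/(12 n)
    upper = sum_phi_x1 / trials       # estimates E phi(X_1) = 1/3
    return lower, middle, upper

print(jensen_chain_estimates())  # lower < middle < upper
```

Since \(\varphi\) is strictly convex here, both inequalities are strict, and the estimate of the middle term sits close to its exact value \(1/4+1/120.\)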
0 references
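Inequality (3) can likewise be checked deterministically in a concrete case. The following sketch (again the reviewer's illustration, with the assumed choices \(n=2,\) \(\varphi(x)=x^2,\) \([a,b]=[0,1]\)) evaluates the middle double integral by a composite midpoint rule:

```python
def hermite_hadamard_type(phi, a, b, m=400):
    """Evaluate the three terms of inequality (3) for n = 2:
    phi((a+b)/2) <= (b-a)^(-2) * iint phi((x1+x2)/2) dx1 dx2
                 <= (phi(a) + phi(b)) / 2,
    approximating the double integral with an m-by-m midpoint rule."""
    h = (b - a) / m
    total = 0.0
    for i in range(m):
        x1 = a + (i + 0.5) * h
        for j in range(m):
            x2 = a + (j + 0.5) * h
            total += phi((x1 + x2) / 2)
    middle = total * h * h / (b - a) ** 2
    return phi((a + b) / 2), middle, (phi(a) + phi(b)) / 2

left, middle, right = hermite_hadamard_type(lambda x: x * x, 0.0, 1.0)
print(left, middle, right)  # 1/4 <= 7/24 <= 1/2
```

For this choice the middle term is exactly \(E((X_1+X_2)/2)^2=7/24\) with \(X_1,X_2\) independent uniform on \([0,1],\) which matches the reviewer's observation that the middle part of (3) equals \(E\varphi(S_n/n).\)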