Reliability and expectation bounds based on Hardy’s inequality
DOI: 10.1080/03610926.2021.1966037
OpenAlex: W3202404694
MaRDI QID: Q6106245
No author found.
Publication date: 27 June 2023
Published in: Communications in Statistics - Theory and Methods
Full work available at URL: https://doi.org/10.1080/03610926.2021.1966037
Keywords: mean residual life; Tsallis entropy; hazard rate; Hardy's inequality; extropy; Pólya-Knopp's inequality; Glaser's function
MSC classifications: Inequalities; stochastic orderings (60E15); Inequalities involving derivatives and differential and integral operators (26D10)
Related Items (1)
Cites Work
- Unnamed Item
- Extropy: complementary dual of entropy
- A probabilistic proof of the Hardy inequality
- Characterization of distributions through failure rate and mean residual life functions
- On strengthened Hardy and Pólya-Knopp's inequalities.
- Some characterization results on dynamic cumulative residual Tsallis entropy
- Characterizations of continuous distributions through inequalities involving the expected values of selected functions.
- On Carleman and Knopp's inequalities.
- Some properties of cumulative Tsallis entropy of order \(\alpha \)
- Log-concave probability and its applications
- Inequalities involving expectations to characterize distributions
- Characterizations of distributions through selected functions of reliability theory
- Cumulative Residual Entropy: A New Measure of Information
- On cumulative residual extropy
- The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions