Entropy of type \((\alpha,\beta)\) and other generalized measures in information theory
Publication: 1226988
DOI: 10.1007/BF01899728
zbMath: 0328.94012
OpenAlex: W2090476622
MaRDI QID: Q1226988
Authors: Inder Jeet Taneja, Bhu Dev Sharma
Publication date: 1975
Published in: Metrika
Full work available at URL: https://eudml.org/doc/175682
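For orientation, the entropy of type \((\alpha,\beta)\) named in the title is usually quoted in the later literature in the following form. This is a sketch based on standard statements of the Sharma-Taneja measure, not a transcription from the paper itself, and the normalizing constant may differ from the one used there. For a probability distribution \(P=(p_1,\dots,p_n)\) and parameters \(\alpha,\beta>0\), \(\alpha\neq\beta\),

\[
H_{\alpha,\beta}(P)
  \;=\;
  \frac{\sum_{i=1}^{n}\bigl(p_i^{\alpha}-p_i^{\beta}\bigr)}
       {2^{\,1-\alpha}-2^{\,1-\beta}},
  \qquad
  \sum_{i=1}^{n} p_i = 1,\; p_i \ge 0 .
\]

Setting \(\beta=1\) and letting \(\alpha\to 1\) recovers the Shannon entropy \(-\sum_{i} p_i \log_2 p_i\) (by L'Hôpital's rule applied to the ratio above), which is the sense in which the family generalizes the classical measure.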
Related Items (49)
- Unnamed Item
- ON SOME INEQUALITIES AND GENERALIZED ENTROPIES: A UNIFIED APPROACH
- Divergence statistics based on entropy functions and stratified sampling
- Statistical applications of order \(\alpha\)-\(\beta\) weighted information energy
- Characterizations of sum form information measures on open domains
- Application of fractional techniques in the analysis of forest fires
- Jensen-information generating function and its connections to some well-known information measures
- Remarks on the Pseudo-Additivity in the Axiomatization of Tsallis Entropy
- Adiabatic thermostatistics of the two parameter entropy and the role of Lambert's \(W\)-function in its applications
- On non-additive measures of inaccuracy
- An entropic form for NLFP with coulombic-like potential
- Unnamed Item
- Cumulative and relative cumulative residual information generating measures and associated properties
- Statistical field theories deformed within different calculi
- Refined two-index entropy and multiscale analysis for complex system
- A new parametric intuitionistic fuzzy entropy and its applications in multiple attribute decision making
- Stationary solution of NLFP with Coulombic potential
- Some properties of past entropy and their applications
- On axiomatic characterization of entropy of type \((\alpha,\beta)\)
- On testing hypotheses with divergence statistics
- Large sample behavior of entropy measures when parameters are estimated
- On generalized measures of relative information and inaccuracy
- A two-parameter entropy and its fundamental properties
- PHYSICAL MEANING OF THE PARAMETERS IN THE TWO-PARAMETER (κ, ζ) GENERALIZED STATISTICS
- On the measurable solutions of certain functional equations
- A completeness criterion for Kaniadakis, Abe and two-parameter generalized statistical theories
- A Comparative Assessment of Various Measures of Entropy
- Nonlinear mean field Fokker-Planck equations. Application to the chemotaxis of biological populations
- Some results on generalized past entropy
- A two-parameter generalization of Shannon-Khinchin axioms and the uniqueness theorem
- Approach of complexity in nature: entropic nonuniqueness
- A Fokker-Planck equation for a piecewise entropy functional defined in different space domains. An application to solute partitioning at the membrane-water interface
- A general methodology for population analysis
- Multidimensional Scaling Visualization Using Parametric Entropy
- Analysis of fractal groups of the type \(d\)-\((m,r)\)-\textit{Cantor} within the framework of Kaniadakis statistics
- A spectral representation for the entropy of topological dynamical systems
- Entropic forms and related algebras
- Unnamed Item
- \((h,\Psi)\)-entropy differential metric
- A general class of entropy statistics
- Entropy as an integral operator: erratum and modification
- On Properties of Measures of Information
- Fuzzy MABAC method based on new exponential fuzzy information measures
- Measuring information beyond communication theory - why some generalized information measures may be useful, others not
- Sharma-Mittal quantum discord
- Trigonometric entropies, Jensen difference divergence measures, and error bounds
- Fuzzy Entropy Measure with an Applications in Decision Making Under Bipolar Fuzzy Environment based on TOPSIS Method
- The φ-Entropy in the Selection of a Fixed Number of Experiments
- On the asymptotic optimum allocation in estimating entropies
Cites Work
- A Mathematical Theory of Communication
- Relative information functions and their type (\(\alpha, \beta\)) generalizations
- Generalized information functions
- A directed-divergence function of type β
- On Shannon's entropy, directed divergence and inaccuracy
- On characterization of a generalized inaccuracy measure in information theory
- On a Functional Equation