Comparison, utility, and partition of dependence under absolutely continuous and singular distributions
DOI: 10.1016/j.jmva.2014.06.014 · zbMath: 1298.62013 · OpenAlex: W1965054760 · MaRDI QID: Q406508
Nima Y. Jalali, Ehsan S. Soofi, Nader Ebrahimi
Publication date: 8 September 2014
Published in: Journal of Multivariate Analysis
Full work available at URL: https://doi.org/10.1016/j.jmva.2014.06.014
Keywords: copula; entropy; utility; convolution; mutual information; shock models; predictability; elliptical distribution; Marshall-Olkin; Sarmanov families
MSC classifications: Measures of association (correlation, canonical correlation, etc.) (62H20); Measures of information, entropy (94A17); Information theory (general) (94A15); Statistical aspects of information-theoretic topics (62B10)
Related Items (7)
Cites Work
- Measuring and testing dependence by correlation of distances
- Detecting Novel Associations in Large Data Sets
- On quantifying dependence: a framework for developing interpretable measures
- Characterization of a Marshall-Olkin type class of distributions
- Kendall's \(\tau\) is equal to the correlation coefficient for the BVE distribution
- Shuffles of copulas and a new measure of dependence
- The meta-elliptical distributions with given marginals
- An introduction to copulas.
- Multivariate dynamic information
- On the sample information about parameter and prediction
- Multivariate maximum entropy identification, transformation, and dependence
- A class of models for uncorrelated random variables
- On nonparametric measures of dependence for random variables
- Monotone dependence
- Expected information as expected utility
- Formulas for Rényi information and related measures for univariate distributions.
- Bayesian experimental design: A review
- Expressions for Rényi and Shannon entropies for multivariate distributions
- Possible generalization of Boltzmann-Gibbs statistics.
- Expressions for Rényi and Shannon entropies for bivariate distributions
- Measuring stochastic dependence using \(\phi\)-divergence
- Assessing Dependence: Some Experimental Results
- Estimation and Hypothesis Testing for the Parameters of a Bivariate Exponential Distribution
- On a Measure of the Information Provided by an Experiment
- An informational measure of correlation
- The t Copula and Related Copulas
- Continuous Bivariate Distributions
- Pareto processes
- Relative Entropy Measures of Multivariate Dependence
- First-order autoregressive gamma sequences and point processes
- A continuous general multivariate distribution and its properties
- A Continuous Bivariate Exponential Extension
- Entropy expressions for multivariate continuous distributions
- A Dependence Metric for Possibly Nonlinear Processes
- Bayesian Hypothesis Testing: A Reference Approach
- Statistical Problem Classes and Their Links to Information Theory
- Importance of Components for a System
- Modeling Longitudinal Data Using a Pair-Copula Decomposition of Serial Dependence
- Mutual information as a measure of multivariate association: analytical properties and statistical estimation
- Some Concepts of Dependence
- A Multivariate Exponential Distribution
- Uncertainty, Information, and Sequential Experiments
- Correlation and Complete Dependence of Random Variables