Efficient approximation of the conditional relative entropy with applications to discriminative learning of Bayesian network classifiers (Q280459)

From MaRDI portal

scientific article; zbMATH DE number 6578303

    Statements

    Efficient approximation of the conditional relative entropy with applications to discriminative learning of Bayesian network classifiers (English)
    10 May 2016
    Summary: We propose a minimum variance unbiased approximation to the conditional relative entropy of the distribution induced by the observed frequency estimates, for multi-class classification tasks. This approximation extends a decomposable scoring criterion, the approximate conditional log-likelihood (aCLL), used primarily for discriminative learning of augmented Bayesian network classifiers. Our contribution is twofold: (i) it addresses multi-class classification tasks, not only binary ones; and (ii) it covers broader stochastic assumptions than a uniform distribution over the parameters. Specifically, we consider a Dirichlet distribution over the parameters, and the resulting criterion is shown experimentally to be a very good approximation to the conditional log-likelihood (CLL). In addition, for Bayesian network classifiers, a closed-form expression is found for the parameters that maximize the scoring criterion.
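    For orientation, a minimal sketch of the two quantities the summary refers to, written in standard notation rather than the paper's own (the symbols $B$, $C$, $X_1,\dots,X_n$, $D$, $N$, and $\hat P$ are ours): for a Bayesian network classifier $B$ over a class variable $C$ and attributes $X_1,\dots,X_n$, trained on data $D=\{(x^{(t)},c^{(t)})\}_{t=1}^{N}$ with empirical distribution $\hat P$,
    \[
      \mathrm{CLL}(B \mid D) \;=\; \sum_{t=1}^{N} \log P_B\bigl(c^{(t)} \mid x_1^{(t)},\dots,x_n^{(t)}\bigr),
    \]
    \[
      D\bigl(\hat P(C \mid \mathbf{X}) \,\big\|\, P_B(C \mid \mathbf{X})\bigr)
      \;=\; \sum_{\mathbf{x}} \hat P(\mathbf{x}) \sum_{c} \hat P(c \mid \mathbf{x})\,
            \log \frac{\hat P(c \mid \mathbf{x})}{P_B(c \mid \mathbf{x})}.
    \]
    The second expression differs from $-\mathrm{CLL}(B \mid D)/N$ only by a term that does not depend on $B$ (the empirical conditional entropy), so minimizing the conditional relative entropy over $B$ amounts to maximizing the CLL; the aCLL mentioned in the summary approximates this non-decomposable objective with a decomposable scoring criterion.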
    conditional relative entropy
    approximation
    discriminative learning
    Bayesian network classifiers

    Identifiers

    zbMATH DE number 6578303