Divergence Measures : Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems

Bibliographic Details
Other Authors:
Year of Publication:2022
Language:English
Physical Description:1 electronic resource (256 p.)
Description
Other title:Divergence Measures
Summary:Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions and focuses on the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. We hope that readers will find this Special Issue of interest and that it will stimulate further research on the mathematical foundations and applications of divergence measures.
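For reference, the divergences named in the summary admit the following standard definitions; this is a sketch in common notation, assuming P and Q are probability measures with densities p and q with respect to a dominating measure μ:
\[
D(P\|Q) = \int p \log\frac{p}{q}\,d\mu \quad \text{(relative entropy)}
\]
\[
D_\alpha(P\|Q) = \frac{1}{\alpha-1}\log \int p^{\alpha} q^{1-\alpha}\,d\mu, \qquad \alpha \in (0,1)\cup(1,\infty) \quad \text{(Rényi divergence)}
\]
\[
D_f(P\|Q) = \int q\, f\!\Big(\frac{p}{q}\Big)\,d\mu, \qquad f \text{ convex with } f(1)=0 \quad \text{(f-divergence)}
\]
The relative entropy is recovered from the Rényi divergence in the limit α → 1, and from the f-divergence family by taking f(t) = t log t.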
Hierarchical level:Monograph