Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems
Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that the readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
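As a minimal illustration of the two generalizations named in the abstract (not taken from the book itself), the sketch below computes the relative entropy (Kullback-Leibler divergence) and the Rényi divergence of order α between two discrete distributions, and checks numerically that the Rényi divergence recovers the relative entropy as α approaches 1. The distributions `p` and `q` are arbitrary toy examples.

```python
import math

# Two toy discrete probability distributions on the same finite support.
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

def kl_divergence(p, q):
    """Relative entropy D(p||q) = sum_i p_i * log(p_i / q_i), in nats."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def renyi_divergence(p, q, alpha):
    """Renyi divergence of order alpha (alpha > 0, alpha != 1):
    D_alpha(p||q) = (1 / (alpha - 1)) * log sum_i p_i^alpha * q_i^(1 - alpha)."""
    s = sum(pi ** alpha * qi ** (1 - alpha) for pi, qi in zip(p, q))
    return math.log(s) / (alpha - 1)

# Non-negativity of the relative entropy, and the limit D_alpha -> D as alpha -> 1.
print(kl_divergence(p, q))
print(renyi_divergence(p, q, 2.0))
print(renyi_divergence(p, q, 1.000001))  # close to kl_divergence(p, q)
```

Here the order α = 2 case is the log of (1 + chi-squared divergence), another quantity appearing in the keyword list below; varying α interpolates a whole family of dissimilarity measures.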
Saved in:
Other Authors: | Sason, Igal |
---|---|
Year of Publication: | 2022 |
Language: | English |
Physical Description: | 1 electronic resource (256 p.) |
id |
993544495804498 |
ctrlnum |
(CKB)5720000000008465 (oapen)https://directory.doabooks.org/handle/20.500.12854/84568 (EXLCZ)995720000000008465 |
collection |
bib_alma |
record_format |
marc |
spelling |
Sason, Igal edt Divergence Measures Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems Divergence Measures Basel MDPI - Multidisciplinary Digital Publishing Institute 2022 1 electronic resource (256 p.) text txt rdacontent computer c rdamedia online resource cr rdacarrier Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. It is our hope that the readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures.
English Research & information: general bicssc Mathematics & science bicssc Bregman divergence f-divergence Jensen-Bregman divergence Jensen diversity Jensen-Shannon divergence capacitory discrimination Jensen-Shannon centroid mixture family information geometry difference of convex (DC) programming conditional Rényi divergence horse betting Kelly gambling Rényi divergence Rényi mutual information relative entropy chi-squared divergence f-divergences method of types large deviations strong data-processing inequalities information contraction maximal correlation Markov chains information inequalities mutual information Rényi entropy Carlson-Levin inequality information measures hypothesis testing total variation skew-divergence convexity Pinsker's inequality Bayes risk statistical divergences minimum divergence estimator maximum likelihood bootstrap conditional limit theorem Bahadur efficiency α-mutual information Augustin-Csiszár mutual information data transmission error exponents dimensionality reduction discriminant analysis statistical inference 3-0365-4332-5 3-0365-4331-7 Sason, Igal oth |
language |
English |
format |
eBook |
author2 |
Sason, Igal |
author_facet |
Sason, Igal |
author2_variant |
i s is |
author2_role |
Sonstige |
title |
Divergence Measures Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems |
spellingShingle |
Divergence Measures Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems |
title_sub |
Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems |
title_full |
Divergence Measures Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems |
title_fullStr |
Divergence Measures Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems |
title_full_unstemmed |
Divergence Measures Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems |
title_auth |
Divergence Measures Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems |
title_alt |
Divergence Measures |
title_new |
Divergence Measures |
title_sort |
divergence measures mathematical foundations and applications in information-theoretic and statistical problems |
publisher |
MDPI - Multidisciplinary Digital Publishing Institute |
publishDate |
2022 |
physical |
1 electronic resource (256 p.) |
isbn |
3-0365-4332-5 3-0365-4331-7 |
illustrated |
Not Illustrated |
work_keys_str_mv |
AT sasonigal divergencemeasuresmathematicalfoundationsandapplicationsininformationtheoreticandstatisticalproblems AT sasonigal divergencemeasures |
status_str |
n |
ids_txt_mv |
(CKB)5720000000008465 (oapen)https://directory.doabooks.org/handle/20.500.12854/84568 (EXLCZ)995720000000008465 |
carrierType_str_mv |
cr |
is_hierarchy_title |
Divergence Measures Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems |
author2_original_writing_str_mv |
noLinkedField |
_version_ |
1787548883611549696 |
fullrecord |
<?xml version="1.0" encoding="UTF-8"?><collection xmlns="http://www.loc.gov/MARC21/slim"><record><leader>03909nam-a2200889z--4500</leader><controlfield tag="001">993544495804498</controlfield><controlfield tag="005">20231214133301.0</controlfield><controlfield tag="006">m o d </controlfield><controlfield tag="007">cr|mn|---annan</controlfield><controlfield tag="008">202206s2022 xx |||||o ||| 0|eng d</controlfield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(CKB)5720000000008465</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(oapen)https://directory.doabooks.org/handle/20.500.12854/84568</subfield></datafield><datafield tag="035" ind1=" " ind2=" "><subfield code="a">(EXLCZ)995720000000008465</subfield></datafield><datafield tag="041" ind1="0" ind2=" "><subfield code="a">eng</subfield></datafield><datafield tag="100" ind1="1" ind2=" "><subfield code="a">Sason, Igal</subfield><subfield code="4">edt</subfield></datafield><datafield tag="245" ind1="1" ind2="0"><subfield code="a">Divergence Measures</subfield><subfield code="b">Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems</subfield></datafield><datafield tag="246" ind1=" " ind2=" "><subfield code="a">Divergence Measures </subfield></datafield><datafield tag="260" ind1=" " ind2=" "><subfield code="a">Basel</subfield><subfield code="b">MDPI - Multidisciplinary Digital Publishing Institute</subfield><subfield code="c">2022</subfield></datafield><datafield tag="300" ind1=" " ind2=" "><subfield code="a">1 electronic resource (256 p.)</subfield></datafield><datafield tag="336" ind1=" " ind2=" "><subfield code="a">text</subfield><subfield code="b">txt</subfield><subfield code="2">rdacontent</subfield></datafield><datafield tag="337" ind1=" " ind2=" "><subfield code="a">computer</subfield><subfield code="b">c</subfield><subfield code="2">rdamedia</subfield></datafield><datafield tag="338" ind1=" " ind2=" "><subfield code="a">online 
resource</subfield><subfield code="b">cr</subfield><subfield code="2">rdacarrier</subfield></datafield><datafield tag="520" ind1=" " ind2=" "><subfield code="a">Data science, information theory, probability theory, statistical learning and other related disciplines greatly benefit from non-negative measures of dissimilarity between pairs of probability measures. These are known as divergence measures, and exploring their mathematical foundations and diverse applications is of significant interest. The present Special Issue, entitled “Divergence Measures: Mathematical Foundations and Applications in Information-Theoretic and Statistical Problems”, includes eight original contributions, and it is focused on the study of the mathematical properties and applications of classical and generalized divergence measures from an information-theoretic perspective. It mainly deals with two key generalizations of the relative entropy: namely, the Rényi divergence and the important class of f-divergences. 
It is our hope that the readers will find interest in this Special Issue, which will stimulate further research in the study of the mathematical foundations and applications of divergence measures.</subfield></datafield><datafield tag="546" ind1=" " ind2=" "><subfield code="a">English</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">Research & information: general</subfield><subfield code="2">bicssc</subfield></datafield><datafield tag="650" ind1=" " ind2="7"><subfield code="a">Mathematics & science</subfield><subfield code="2">bicssc</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Bregman divergence</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">f-divergence</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Jensen-Bregman divergence</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Jensen diversity</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Jensen-Shannon divergence</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">capacitory discrimination</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Jensen-Shannon centroid</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">mixture family</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">information geometry</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">difference of convex (DC) programming</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">conditional Rényi divergence</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">horse betting</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Kelly gambling</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Rényi 
divergence</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Rényi mutual information</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">relative entropy</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">chi-squared divergence</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">f-divergences</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">method of types</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">large deviations</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">strong data-processing inequalities</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">information contraction</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">maximal correlation</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Markov chains</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">information inequalities</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">mutual information</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Rényi entropy</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Carlson-Levin inequality</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">information measures</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">hypothesis testing</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">total variation</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">skew-divergence</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">convexity</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Pinsker's 
inequality</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Bayes risk</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">statistical divergences</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">minimum divergence estimator</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">maximum likelihood</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">bootstrap</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">conditional limit theorem</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Bahadur efficiency</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">α-mutual information</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">Augustin-Csiszár mutual information</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">data transmission</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">error exponents</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">dimensionality reduction</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">discriminant analysis</subfield></datafield><datafield tag="653" ind1=" " ind2=" "><subfield code="a">statistical inference</subfield></datafield><datafield tag="776" ind1=" " ind2=" "><subfield code="z">3-0365-4332-5</subfield></datafield><datafield tag="776" ind1=" " ind2=" "><subfield code="z">3-0365-4331-7</subfield></datafield><datafield tag="700" ind1="1" ind2=" "><subfield code="a">Sason, Igal</subfield><subfield code="4">oth</subfield></datafield><datafield tag="906" ind1=" " ind2=" "><subfield code="a">BOOK</subfield></datafield><datafield tag="ADM" ind1=" " ind2=" "><subfield code="b">2023-12-15 05:49:11 Europe/Vienna</subfield><subfield code="f">system</subfield><subfield 
code="c">marc21</subfield><subfield code="a">2022-07-02 22:45:44 Europe/Vienna</subfield><subfield code="g">false</subfield></datafield><datafield tag="AVE" ind1=" " ind2=" "><subfield code="i">DOAB Directory of Open Access Books</subfield><subfield code="P">DOAB Directory of Open Access Books</subfield><subfield code="x">https://eu02.alma.exlibrisgroup.com/view/uresolver/43ACC_OEAW/openurl?u.ignore_date_coverage=true&portfolio_pid=5337611350004498&Force_direct=true</subfield><subfield code="Z">5337611350004498</subfield><subfield code="b">Available</subfield><subfield code="8">5337611350004498</subfield></datafield></record></collection> |