New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from a theoretical and applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics, based on maximum likelihood estimators, as well as Wald’s statistics, likelihood ratio statistics and Rao’s score statistics, share several optimum asymptotic properties, but are highly non-robust in cases of model misspecification in the presence of outlying observations. It is well known that a small deviation from the underlying assumptions on the model can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators.
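The robust Wald-type construction mentioned in the description can be sketched as follows. This is a generic illustration of such tests, not notation taken from the book: here \(\hat{\theta}_\beta\) stands for a minimum divergence estimator (for instance, a minimum density power divergence estimator with tuning parameter \(\beta\)).

```latex
% Testing H_0 : \theta = \theta_0 for a p-dimensional parameter, using a
% minimum divergence estimator \hat{\theta}_\beta in place of the MLE:
W_n(\theta_0) \;=\; n\,\bigl(\hat{\theta}_\beta - \theta_0\bigr)^{\top}
  \,\Sigma_\beta^{-1}\!\bigl(\hat{\theta}_\beta\bigr)\,
  \bigl(\hat{\theta}_\beta - \theta_0\bigr),
% where \Sigma_\beta(\theta) denotes the asymptotic covariance matrix of
% \hat{\theta}_\beta. Under H_0, W_n converges in law to a chi-squared
% distribution with p degrees of freedom, so the test rejects H_0 when
% W_n exceeds the corresponding chi-squared quantile.
```

For \(\beta = 0\) the minimum density power divergence estimator reduces to the maximum likelihood estimator and \(W_n\) to the classical Wald statistic; positive \(\beta\) trades some efficiency for robustness to outlying observations.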

Bibliographic Details
Year of Publication: 2019
Language: English
Physical Description: 1 electronic resource (344 p.)
id 993544473104498
ctrlnum (CKB)4920000000095103
(oapen)https://directory.doabooks.org/handle/20.500.12854/54566
(EXLCZ)994920000000095103
collection bib_alma
record_format marc
spelling Pardo, Leandro auth
New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
MDPI - Multidisciplinary Digital Publishing Institute 2019
1 electronic resource (344 p.)
text txt rdacontent
computer c rdamedia
online resource cr rdacarrier
This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from a theoretical and applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics, based on maximum likelihood estimators, as well as Wald’s statistics, likelihood ratio statistics and Rao’s score statistics, share several optimum asymptotic properties, but are highly non-robust in cases of model misspecification in the presence of outlying observations. It is well known that a small deviation from the underlying assumptions on the model can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators.
English
mixture index of fit
Kullback-Leibler distance
relative error estimation
minimum divergence inference
Neyman Pearson test
influence function
consistency
thematic quality assessment
asymptotic normality
Hellinger distance
nonparametric test
Bernstein-von Mises theorem
maximum composite likelihood estimator
2-alternating capacities
efficiency
corrupted data
statistical distance
robustness
log-linear models
representation formula
goodness-of-fit
general linear model
Wald-type test statistics
Hölder divergence
divergence
logarithmic super divergence
information geometry
sparse
robust estimation
relative entropy
minimum disparity methods
MM algorithm
local-polynomial regression
association models
total variation
Bayesian nonparametric
ordinal classification variables
Wald test statistic
Wald-type test
composite hypotheses
compressed data
hypothesis testing
Bayesian semi-parametric
single index model
indoor localization
composite minimum density power divergence estimator
quasi-likelihood
Chernoff Stein lemma
composite likelihood
asymptotic property
Bregman divergence
robust testing
misspecified hypothesis and alternative
least-favorable hypotheses
location-scale family
correlation models
minimum penalized φ-divergence estimator
non-quadratic distance
robust
semiparametric model
divergence based testing
measurement errors
bootstrap distribution estimator
generalized Rényi entropy
minimum divergence methods
generalized linear model
φ-divergence
Bregman information
iterated limits
centroid
model assessment
divergence measure
model check
two-sample test
Wald statistic
3-03897-936-8
language English
format eBook
author Pardo, Leandro
spellingShingle Pardo, Leandro
New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
author_facet Pardo, Leandro
author_variant l p lp
author_sort Pardo, Leandro
title New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
title_full New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
title_fullStr New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
title_full_unstemmed New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
title_auth New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
title_new New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
title_sort new developments in statistical information theory based on entropy and divergence measures
publisher MDPI - Multidisciplinary Digital Publishing Institute
publishDate 2019
physical 1 electronic resource (344 p.)
isbn 3-03897-937-6
3-03897-936-8
illustrated Not Illustrated
work_keys_str_mv AT pardoleandro newdevelopmentsinstatisticalinformationtheorybasedonentropyanddivergencemeasures
status_str n
ids_txt_mv (CKB)4920000000095103
(oapen)https://directory.doabooks.org/handle/20.500.12854/54566
(EXLCZ)994920000000095103
carrierType_str_mv cr
is_hierarchy_title New Developments in Statistical Information Theory Based on Entropy and Divergence Measures
_version_ 1796652272562733056