New Developments in Statistical Information Theory Based on Entropy and Divergence Measures

Bibliographic Details
Year of Publication: 2019
Language: English
Physical Description: 1 electronic resource (344 p.)
LEADER 04592nam-a2201165z--4500
001 993544473104498
005 20231214133336.0
006 m o d
007 cr|mn|---annan
008 202102s2019 xx |||||o ||| 0|eng d
020 |a 3-03897-937-6 
035 |a (CKB)4920000000095103 
035 |a (oapen)https://directory.doabooks.org/handle/20.500.12854/54566 
035 |a (EXLCZ)994920000000095103 
041 0 |a eng 
100 1 |a Pardo, Leandro  |4 auth 
245 1 0 |a New Developments in Statistical Information Theory Based on Entropy and Divergence Measures 
260 |b MDPI - Multidisciplinary Digital Publishing Institute  |c 2019 
300 |a 1 electronic resource (344 p.) 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
520 |a This book presents new and original research in Statistical Information Theory, based on minimum divergence estimators and test statistics, from both a theoretical and an applied point of view, for different statistical problems with special emphasis on efficiency and robustness. Divergence statistics based on maximum likelihood estimators, as well as Wald's statistics, likelihood ratio statistics and Rao's score statistics, share several optimum asymptotic properties, but are highly non-robust in cases of model misspecification or in the presence of outlying observations. It is well known that a small deviation from the underlying model assumptions can have a drastic effect on the performance of these classical tests. Specifically, this book presents a robust version of the classical Wald statistical test, for testing simple and composite null hypotheses for general parametric models, based on minimum divergence estimators. 
546 |a English 
653 |a mixture index of fit 
653 |a Kullback-Leibler distance 
653 |a relative error estimation 
653 |a minimum divergence inference 
653 |a Neyman-Pearson test 
653 |a influence function 
653 |a consistency 
653 |a thematic quality assessment 
653 |a asymptotic normality 
653 |a Hellinger distance 
653 |a nonparametric test 
653 |a Bernstein-von Mises theorem 
653 |a maximum composite likelihood estimator 
653 |a 2-alternating capacities 
653 |a efficiency 
653 |a corrupted data 
653 |a statistical distance 
653 |a robustness 
653 |a log-linear models 
653 |a representation formula 
653 |a goodness-of-fit 
653 |a general linear model 
653 |a Wald-type test statistics 
653 |a Hölder divergence 
653 |a divergence 
653 |a logarithmic super divergence 
653 |a information geometry 
653 |a sparse 
653 |a robust estimation 
653 |a relative entropy 
653 |a minimum disparity methods 
653 |a MM algorithm 
653 |a local-polynomial regression 
653 |a association models 
653 |a total variation 
653 |a Bayesian nonparametric 
653 |a ordinal classification variables 
653 |a Wald test statistic 
653 |a Wald-type test 
653 |a composite hypotheses 
653 |a compressed data 
653 |a hypothesis testing 
653 |a Bayesian semi-parametric 
653 |a single index model 
653 |a indoor localization 
653 |a composite minimum density power divergence estimator 
653 |a quasi-likelihood 
653 |a Chernoff-Stein lemma 
653 |a composite likelihood 
653 |a asymptotic property 
653 |a Bregman divergence 
653 |a robust testing 
653 |a misspecified hypothesis and alternative 
653 |a least-favorable hypotheses 
653 |a location-scale family 
653 |a correlation models 
653 |a minimum penalized φ-divergence estimator 
653 |a non-quadratic distance 
653 |a robust 
653 |a semiparametric model 
653 |a divergence based testing 
653 |a measurement errors 
653 |a bootstrap distribution estimator 
653 |a generalized Rényi entropy 
653 |a minimum divergence methods 
653 |a generalized linear model 
653 |a φ-divergence 
653 |a Bregman information 
653 |a iterated limits 
653 |a centroid 
653 |a model assessment 
653 |a divergence measure 
653 |a model check 
653 |a two-sample test 
653 |a Wald statistic 
776 |z 3-03897-936-8 
906 |a BOOK 
ADM |b 2024-03-07 03:45:52 Europe/Vienna  |f system  |c marc21  |a 2019-11-10 04:18:40 Europe/Vienna  |g false 
AVE |i DOAB Directory of Open Access Books  |P DOAB Directory of Open Access Books  |x https://eu02.alma.exlibrisgroup.com/view/uresolver/43ACC_OEAW/openurl?u.ignore_date_coverage=true&portfolio_pid=5337634490004498&Force_direct=true  |Z 5337634490004498  |b Available  |8 5337634490004498