The Statistical Foundations of Entropy

Over the last two decades, the understanding of complex dynamical systems has undergone important conceptual shifts. The catalyst was the infusion of new ideas from the theory of critical phenomena (scaling laws, the renormalization group, etc.), (multi)fractals and trees, random matrix theory, network theory, and non-Shannonian information theory. The usual Boltzmann–Gibbs statistics have proven grossly inadequate in this context: while successful in describing stationary systems characterized by ergodicity or metric transitivity, they fail to reproduce the complex statistical behavior of many real-world systems in biology, astrophysics, geology, and the economic and social sciences. The aim of this Special Issue was to extend the state of the art with original contributions to the ongoing discussion on the statistical foundations of entropy, with particular emphasis on non-conventional entropies that go significantly beyond the Boltzmann, Gibbs, and Shannon paradigms. The accepted contributions address information-theoretic, thermodynamic, and quantum aspects of complex systems and present several important applications of generalized entropies.
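
For orientation only, and not quoted from the book itself: among the generalized entropies listed in the keywords below, the standard definitions of two of the most prominent, for a discrete probability distribution p = (p_1, ..., p_n), are

    Rényi entropy:    H_α(p) = (1 / (1 − α)) · ln Σ_i p_i^α,    α > 0, α ≠ 1
    Tsallis entropy:  S_q(p) = (1 / (q − 1)) · (1 − Σ_i p_i^q),  q ≠ 1

both of which reduce to the Shannon entropy H(p) = −Σ_i p_i ln p_i in the limit α, q → 1.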

Bibliographic Details
Editors: Jizba, Petr; Korbel, Jan
Year of Publication: 2022
Language: English
Published: Basel: MDPI - Multidisciplinary Digital Publishing Institute, 2022
Physical Description: 1 electronic resource (182 p.)
Illustrations: Not illustrated
Format: eBook (online resource)
ISBN: 3-0365-3557-8; 3-0365-3558-6
Identifiers: (CKB)5680000000037764; (oapen)https://directory.doabooks.org/handle/20.500.12854/80958; (EXLCZ)995680000000037764
Availability: DOAB Directory of Open Access Books
Subjects (bicssc): Research & information: general; Mathematics & science
Keywords: ecological inference; generalized cross entropy; distributional weighted regression; matrix adjustment; entropy; critical phenomena; renormalization; multiscale thermodynamics; GENERIC; non-Newtonian calculus; non-Diophantine arithmetic; Kolmogorov-Nagumo averages; escort probabilities; generalized entropies; maximum entropy principle; MaxEnt distribution; calibration invariance; Lagrange multipliers; generalized Bilal distribution; adaptive Type-II progressive hybrid censoring scheme; maximum likelihood estimation; Bayesian estimation; Lindley's approximation; confidence interval; Markov chain Monte Carlo method; Rényi entropy; Tsallis entropy; entropic uncertainty relations; quantum metrology; non-equilibrium thermodynamics; variational entropy; Landsberg-Vedral entropy; Gaussian entropy; Sharma-Mittal entropy; α-mutual information; α-channel capacity; maximum entropy; Bayesian inference; updating probabilities