Transfer entropy / edited by Deniz Gençağa.

Statistical relationships among the variables of a complex system reveal a lot about its physical behavior. Therefore, identification of the relevant variables and characterization of their interactions are crucial for a better understanding of a complex system. Linear methods, such as correlation,...

Bibliographic Details
Contributor: Gençağa, Deniz (editor)
Place / Publishing House: Basel, Switzerland : MDPI - Multidisciplinary Digital Publishing Institute, [2018]
©2018
Year of Publication: 2018
Language: English
Physical Description: 1 online resource (334 pages) : illustrations
LEADER 02381nam a2200301 i 4500
001 993602782904498
005 20230629144700.0
006 m o d
007 cr |||||||||||
008 230629s2018 sz a ob 000 0 eng d
035 |a (CKB)4920000000094810 
035 |a (NjHacI)994920000000094810 
035 |a (EXLCZ)994920000000094810 
040 |a NjHacI  |b eng  |e rda  |c NjHacI 
050 4 |a QC318.E57  |b .T736 2018 
082 0 4 |a 536.73  |2 23 
245 0 0 |a Transfer entropy /  |c edited by Deniz Gençağa. 
264 1 |a Basel, Switzerland :  |b MDPI - Multidisciplinary Digital Publishing Institute,  |c [2018] 
264 4 |c ©2018 
300 |a 1 online resource (334 pages) :  |b illustrations 
336 |a text  |b txt  |2 rdacontent 
337 |a computer  |b c  |2 rdamedia 
338 |a online resource  |b cr  |2 rdacarrier 
588 |a Description based on print version record. 
520 |a Statistical relationships among the variables of a complex system reveal a lot about its physical behavior. Therefore, identification of the relevant variables and characterization of their interactions are crucial for a better understanding of a complex system. Linear methods, such as correlation, are widely used to identify these relationships. However, information-theoretic quantities, such as mutual information and transfer entropy, have been proven to be superior in the case of nonlinear dependencies. Mutual information quantifies the amount of information obtained about one random variable through the other random variable, and it is symmetric. As an asymmetrical measure, transfer entropy quantifies the amount of directed (time-asymmetric) transfer of information between random processes and, thus, it is related to concepts such as Granger causality. This Special Issue includes 16 papers elucidating the state of the art of data-based transfer entropy estimation techniques and applications in areas such as finance, biomedicine, fluid dynamics and cellular automata. Analytical derivations in special cases, improvements on the estimation methods and comparisons between certain techniques are some of the other contributions of this Special Issue. The diversity of approaches and applications makes this book unique as a single source of invaluable contributions from experts in the field. 
504 |a Includes bibliographical references. 
650 0 |a Entropy. 
776 |z 3-03842-919-8 
700 1 |a Gençağa, Deniz,  |e editor. 
906 |a BOOK 
ADM |b 2023-07-08 12:08:10 Europe/Vienna  |f system  |c marc21  |a 2019-11-10 04:18:40 Europe/Vienna  |g false 
AVE |i DOAB Directory of Open Access Books  |P DOAB Directory of Open Access Books  |x https://eu02.alma.exlibrisgroup.com/view/uresolver/43ACC_OEAW/openurl?u.ignore_date_coverage=true&portfolio_pid=5338792990004498&Force_direct=true  |Z 5338792990004498  |b Available  |8 5338792990004498
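For reference, the abstract above contrasts mutual information (symmetric) with transfer entropy (directed). A minimal sketch of the standard discrete definitions, which the record itself does not spell out (the history lengths $k$ and $l$ are generic notation, not taken from this book):

\[
I(X;Y) \;=\; \sum_{x,y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)} \;=\; I(Y;X),
\]

\[
T_{Y \to X} \;=\; \sum p\!\left(x_{t+1},\, x_t^{(k)},\, y_t^{(l)}\right)\,
\log\frac{p\!\left(x_{t+1} \mid x_t^{(k)},\, y_t^{(l)}\right)}{p\!\left(x_{t+1} \mid x_t^{(k)}\right)},
\]

where $x_t^{(k)} = (x_t, \ldots, x_{t-k+1})$ and $y_t^{(l)} = (y_t, \ldots, y_{t-l+1})$ denote past states of the two processes. Unlike mutual information, $T_{Y \to X} \neq T_{X \to Y}$ in general, which is the time-asymmetry the abstract relates to Granger causality.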