
Transfer Entropy

Material type: Article
Language: English
Publication details: MDPI - Multidisciplinary Digital Publishing Institute, 2018
Description: 1 electronic resource (VIII, 326 p.)
ISBN:
  • 9783038429203
  • 9783038429197
Summary: Statistical relationships among the variables of a complex system reveal a great deal about its physical behavior. Identifying the relevant variables and characterizing their interactions are therefore crucial for a better understanding of such a system. Linear methods, such as correlation, are widely used to identify these relationships, but information-theoretic quantities such as mutual information and transfer entropy have proven superior in the presence of nonlinear dependencies. Mutual information quantifies the amount of information obtained about one random variable by observing another, and it is symmetric. Transfer entropy, by contrast, is an asymmetric measure: it quantifies the directed (time-asymmetric) transfer of information between random processes and is therefore related to concepts such as Granger causality. This Special Issue comprises 16 papers elucidating the state of the art in data-based transfer entropy estimation techniques and their applications in areas such as finance, biomedicine, fluid dynamics, and cellular automata. Analytical derivations for special cases, improvements to estimation methods, and comparisons between techniques are among the further contributions. The diversity of approaches and applications makes this book unique as a single source of invaluable contributions from experts in the field.
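
For reference, the two quantities named in the summary can be written out explicitly. The forms below are a minimal sketch for discrete variables, with a single-step history for transfer entropy; the contributions in this volume employ a variety of history lengths and estimators.

    I(X;Y) = \sum_{x,y} p(x,y) \,\log \frac{p(x,y)}{p(x)\,p(y)}

    T_{Y \to X} = \sum_{x_{t+1},\, x_t,\, y_t} p(x_{t+1}, x_t, y_t) \,\log \frac{p(x_{t+1} \mid x_t, y_t)}{p(x_{t+1} \mid x_t)}

The asymmetry is visible directly: swapping X and Y leaves I(X;Y) unchanged, but in general changes T_{Y \to X}, which is what allows transfer entropy to indicate the direction of information flow, in the same spirit as Granger causality.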

Open Access: Unrestricted online access


License: Creative Commons BY-NC-ND 4.0 (https://creativecommons.org/licenses/by-nc-nd/4.0/)

