Modelling Natural Language with Claude Shannon’s Notion of Surprisal / Michael Richter.

Bibliographic Details
Superior document: Title is part of eBook package: De Gruyter DG Plus DeG Package 2024 Part 1
Author: Michael Richter
Place / Publishing House: Berlin; Boston: De Gruyter Mouton, [2024]
©2024
Year of Publication: 2024
Language: English
Series: Quantitative Linguistics [QL], 76
Online Access: https://doi.org/10.1515/9783110788143
Physical Description: 1 online resource (XIII, 175 p.)
Description
Table of Contents:
Frontmatter
Acknowledgements
Contents
List of figures
List of tables
Introduction
Chapter 1 Information and contexts
Chapter 2 Previous work on application of Information Theory to the semantics of natural language
Chapter 3 Modelling intensification through Shannon information
Chapter 4 Shannon Information in information retrieval
Chapter 5 Shannon information, form and linguistic coding
Chapter 6 Some concluding remarks
References
Index
Summary: Have you ever wondered how the principles behind Shannon's groundbreaking Information Theory can be interwoven with the intricate fabric of linguistic communication? This book takes you on a fascinating journey, offering insights into how humans process and comprehend language. By applying Information Theory to the realm of natural language semantics, it unravels the connection between regularities in linguistic messages and the cognitive intricacies of language processing. Highlighting the intersections of information theory with linguistics, philosophy, cognitive psychology, and computer science, this book serves as an inspiration for anyone seeking to understand the predictive capabilities of Information Theory in modeling human communication. It elaborates on the seminal works from giants in the field like Dretske, Hale, and Zipf, exploring concepts like surprisal theory and the principle of least effort. With its empirical approach, this book not only discusses the theoretical aspects but also ventures into the application of Shannon's Information Theory in real-world language scenarios, strengthened by advanced statistical methods and machine learning. It touches upon challenging areas such as the distinction between mathematical and semantic information, the concept of information in linguistic utterances, and the intricate play between truth, context, and meaning. Whether you are a linguist, a cognitive psychologist, a philosopher, or simply an enthusiast eager to dive deep into the world where language meets information, this book promises a thought-provoking journey.
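The quantity named in the title, Shannon surprisal, can be stated briefly: an event (for example, a word in context) with probability P carries −log₂ P bits of information, so rarer words are more surprising and more informative. The following minimal sketch is an illustration added to this record, not material from the book; the word probabilities are hypothetical.

```python
import math

def surprisal(prob: float) -> float:
    """Shannon surprisal, in bits, of an event with probability prob."""
    if not 0.0 < prob <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return -math.log2(prob)

# Toy unigram model over a three-word vocabulary (hypothetical probabilities).
unigram = {"the": 0.50, "cat": 0.30, "platypus": 0.20}

for word, p in unigram.items():
    print(f"{word:10s} P={p:.2f}  surprisal = {surprisal(p):.2f} bits")
```

In surprisal theory as developed by Hale and others, the probabilities come from a language model conditioned on the preceding context rather than from raw unigram frequencies, but the information measure is the same.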
Format: Mode of access: Internet via World Wide Web.
ISBN: 9783110788143
9783111332192
9783111438047
ISSN: 0179-3616
DOI: 10.1515/9783110788143
Access: Restricted access
Hierarchical level: Monograph
Statement of Responsibility: Michael Richter.