
Information theory meets power laws : stochastic processes and language models / Łukasz Dębowski, Polish Academy of Sciences.

By: Dębowski, Łukasz Jerzy, 1975- [author.].
Material type: Book
Publisher: Hoboken, NJ : John Wiley & Sons, Inc., 2021
Copyright date: ©2021
Description: 1 online resource (xvi, 368 pages) : illustrations
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9781119625384; 1119625386; 9781119625377; 1119625378; 9781119625360; 111962536X
Subject(s): Computational linguistics | Stochastic processes
Genre/Form: Electronic books
Additional physical formats: Print version: Information theory meets power laws
DDC classification: 410.1/5195
Online resources: Wiley Online Library
Summary: "This book introduces mathematical foundations of statistical modeling of natural language. The author attempts to explain a few statistical power laws satisfied by texts in natural language in terms of non-Markovian and non-hidden Markovian discrete stochastic processes with some sort of long-range dependence. To achieve this, he uses various concepts and technical tools from information theory and probability measures. This book begins with an introduction. The first half of the book is an introduction to probability measures, information theory, ergodic decomposition, and Kolmogorov complexity, which is provided to make the book relatively self-contained. This section also covers less standard concepts and results, such as excess entropy and generalization of conditional mutual information to fields. The second part of the book discusses the results concerning power laws for mutual information and maximal repetition, such as theorems about facts and words. There is also a separate chapter discussing toy examples of stochastic processes, which should inspire future work in statistical language modeling" -- Provided by publisher.

Includes bibliographical references and index.


Description based on online resource; title from digital title page (viewed on January 27, 2021).

Wiley Frontlist Obook All English 2020
