Information and communication theory / Stefan Höst.

By: Höst, Stefan [author.].
Contributor(s): IEEE Xplore (Online Service) [distributor.] | Wiley [publisher.].
Material type: Book
Series: IEEE series on mobile & digital communication: 30.
Publisher: Piscataway, New Jersey : IEEE Press, [2019]
Distributor: [Piscataway, New Jersey] : IEEE Xplore, [2019]
Description: 1 PDF (368 pages).
Content type: text
Media type: electronic
Carrier type: online resource
ISBN: 9781119433828.
Subject(s): Information theory
Genre/Form: Electronic books.
Additional physical formats: Print version: No title
DDC classification: 003.54
Online resources: Abstract with links to resource
Contents:
Preface ix
Chapter 1 Introduction 1
Chapter 2 Probability Theory 5 -- 2.1 Probabilities 5 -- 2.2 Random Variable 7 -- 2.3 Expectation and Variance 9 -- 2.4 The Law of Large Numbers 17 -- 2.5 Jensen’s Inequality 21 -- 2.6 Random Processes 25 -- 2.7 Markov Process 28 -- Problems 33
Chapter 3 Information Measures 37 -- 3.1 Information 37 -- 3.2 Entropy 41 -- 3.3 Mutual Information 48 -- 3.4 Entropy of Sequences 58 -- Problems 63
Chapter 4 Optimal Source Coding 69 -- 4.1 Source Coding 69 -- 4.2 Kraft Inequality 71 -- 4.3 Optimal Codeword Length 80 -- 4.4 Huffman Coding 84 -- 4.5 Arithmetic Coding 95 -- Problems 101
Chapter 5 Adaptive Source Coding 105 -- 5.1 The Problem with Unknown Source Statistics 105 -- 5.2 Adaptive Huffman Coding 106 -- 5.3 The Lempel-Ziv Algorithms 112 -- 5.4 Applications of Source Coding 125 -- Problems 129
Chapter 6 Asymptotic Equipartition Property and Channel Capacity 133 -- 6.1 Asymptotic Equipartition Property 133 -- 6.2 Source Coding Theorem 138 -- 6.3 Channel Coding 141 -- 6.4 Channel Coding Theorem 144 -- 6.5 Derivation of Channel Capacity for DMC 155 -- Problems 164
Chapter 7 Channel Coding 169 -- 7.1 Error-Correcting Block Codes 170 -- 7.2 Convolutional Code 188 -- 7.3 Error-Detecting Codes 203 -- Problems 210
Chapter 8 Information Measures For Continuous Variables 213 -- 8.1 Differential Entropy and Mutual Information 213 -- 8.2 Gaussian Distribution 224 -- Problems 232
Chapter 9 Gaussian Channel 237 -- 9.1 Gaussian Channel 237 -- 9.2 Parallel Gaussian Channels 244 -- 9.3 Fundamental Shannon Limit 256 -- Problems 260
Chapter 10 Discrete Input Gaussian Channel 265 -- 10.1 M-PAM Signaling 265 -- 10.2 A Note on Dimensionality 271 -- 10.3 Shaping Gain 276 -- 10.4 SNR Gap 281 -- Problems 285
Chapter 11 Information Theory and Distortion 289 -- 11.1 Rate-Distortion Function 289 -- 11.2 Limit For Fix Pb 300 -- 11.3 Quantization 302 -- 11.4 Transform Coding 306 -- Problems 319
Appendix A Probability Distributions 323 -- A.1 Discrete Distributions 323 -- A.2 Continuous Distributions 327
Appendix B Sampling Theorem 337 -- B.1 The Sampling Theorem 337
Bibliography 343
Index 347.
Summary: An important text that offers an in-depth guide to how information theory sets the boundaries for data communication. In an accessible and practical style, Information and Communication Theory explores the topic of information theory and includes concrete tools that are appropriate for real-life communication systems. The text investigates the connection between theoretical and practical applications through a wide variety of topics, including an introduction to the basics of probability theory, information, (lossless) source coding, typical sequences as a central concept, channel coding, continuous random variables, Gaussian channels, discrete-input continuous channels, and a brief look at rate-distortion theory. The author explains the fundamental theory together with typical compression algorithms and how they are used in reality. He moves on to review source coding and how much a source can be compressed, and explains algorithms such as the LZ family, with applications to, e.g., zip or PNG. In addition to exploring the channel coding theorem, the book includes illustrative examples of codes. This comprehensive text:
- Provides an adaptive version of Huffman coding that estimates the source distribution
- Contains a series of problems that enhance an understanding of the information presented in the text
- Covers a variety of topics, including optimal source coding, channel coding, modulation, and much more
- Includes appendices that explore probability distributions and the sampling theorem
Written for graduate and undergraduate students studying information theory, as well as master's students and professional engineers, Information and Communication Theory offers an introduction to how information theory sets the boundaries for data communication.
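
As a quick illustration of the compression bound the summary refers to (how much a source can be compressed), here is a minimal Python sketch, not taken from the book: it compares the Shannon entropy of a memoryless source with the average codeword length of a Huffman code built for that source. The four-symbol distribution is an assumed example.

```python
# Minimal sketch (assumed example, not code from the book): compare the
# Shannon entropy of a memoryless source with the average codeword length
# of a Huffman code for the same source.
import heapq
from math import log2

# Assumed four-symbol source distribution (dyadic, for a clean result).
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}

# Entropy H(X) = -sum_x p(x) log2 p(x), in bits per source symbol.
entropy = -sum(p * log2(p) for p in probs.values())

# Huffman's algorithm: repeatedly merge the two least probable subtrees.
# Heap entries are (probability, tiebreaker, {symbol: code string}).
heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
heapq.heapify(heap)
tiebreak = len(heap)
while len(heap) > 1:
    p0, _, left = heapq.heappop(heap)
    p1, _, right = heapq.heappop(heap)
    # Prepend 0 to codes in one subtree and 1 to codes in the other.
    merged = {s: "0" + c for s, c in left.items()}
    merged.update({s: "1" + c for s, c in right.items()})
    heapq.heappush(heap, (p0 + p1, tiebreak, merged))
    tiebreak += 1
codes = heap[0][2]

avg_len = sum(probs[s] * len(c) for s, c in codes.items())
print(f"entropy H(X)       : {entropy:.3f} bits/symbol")
print(f"Huffman avg length : {avg_len:.3f} bits/symbol")
```

For this dyadic distribution both numbers come out to 1.75 bits/symbol; for a general distribution the Huffman average lies between H(X) and H(X) + 1.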

Restricted to subscribers or individual electronic text purchasers.

Also available in print.

Mode of access: World Wide Web

Online resource; title from PDF title page (EBSCO, viewed March 11, 2019)
