
Information Theory for Electrical Engineers [electronic resource] / by Orhan Gazi.

By: Gazi, Orhan [author.].
Contributor(s): SpringerLink (Online service).
Material type: Book
Series: Signals and Communication Technology
Publisher: Singapore : Springer Nature Singapore : Imprint: Springer, 2018
Edition: 1st ed. 2018
Description: IX, 276 p. 122 illus., 2 illus. in color. online resource
Content type: text
Media type: computer
Carrier type: online resource
ISBN: 9789811084324
Subject(s): Telecommunication | Coding theory | Information theory | Communications Engineering, Networks | Coding and Information Theory
DDC classification: 621.382
Contents:
Concept of Information, Discrete Entropy and Mutual Information -- Entropy for Continuous Random Variables, Discrete Channel Capacity, Continuous Channel Capacity -- Typical Sequences and Data Compression -- Channel Coding Theorem.
In: Springer Nature eBook
Summary: This book explains the fundamental concepts of information theory to help students better understand modern communication technologies. It was written especially for electrical and communication engineers working on communication subjects. The book focuses on making the topics understandable, using simple, detailed mathematics together with a wealth of solved examples. It consists of four chapters, the first of which explains entropy and mutual information for discrete random variables. Chapter 2 introduces entropy and mutual information for continuous random variables, along with channel capacity. Chapter 3 is devoted to typical sequences and data compression. One of Shannon's most important discoveries is the channel coding theorem, and it is critical for electrical and communication engineers to fully comprehend it; Chapter 4 therefore focuses solely on that theorem. To gain the most from the book, readers should have a fundamental grasp of probability and random variables; without that background, the topics will be very difficult to follow.
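To give a flavor of the discrete-entropy and channel-capacity concepts the summary mentions (the function names below are illustrative, not taken from the book), a minimal sketch:

```python
import math

def entropy(probs):
    """Shannon entropy H(X) = -sum p * log2(p), in bits, of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def bsc_capacity(eps):
    """Capacity C = 1 - H(eps) of a binary symmetric channel with crossover probability eps."""
    return 1.0 - entropy([eps, 1.0 - eps])

# A fair coin carries exactly one bit of information per toss.
print(entropy([0.5, 0.5]))   # 1.0
# A noiseless binary channel (eps = 0) has the full capacity of 1 bit per use,
# while a channel that flips every second bit at random (eps = 0.5) carries nothing.
print(bsc_capacity(0.0))     # 1.0
print(bsc_capacity(0.5))     # 0.0
```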
No physical items for this record

