This course introduces principles and techniques for the reliable and efficient digital encoding of information. The Information Theory part of the course provides models for studying redundancy in codes, with the goal of encoding information more efficiently by minimizing redundancy. The Coding Theory part, in contrast, deals with techniques for systematically introducing redundancy into codes, so that errors introduced by noise into encoded information can be detected and corrected.
Bibliography:
- Richard W. Hamming, Coding and Information Theory, 1980.
- Mark Nelson and Jean-Loup Gailly, The Data Compression Book, 1996.
- John Baylis, Error-Correcting Codes: A Mathematical Introduction, 1998.
Course Outline
- Introduction (about 3 hrs)
- Definitions
- Some Examples of Codes
- Model of a Communication System
- Probabilities and Source Encoding
- Probabilities and Channel Encoding
- Error-detecting Codes (3-6 hrs)
- Parity Check Codes
- Cyclic Codes
- Weighted Codes
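The simplest of these schemes, the single parity check, can be previewed in a few lines of Python (a minimal sketch; the function names are illustrative, not from the course materials):

```python
def add_parity(bits):
    """Append an even-parity bit so that the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(word):
    """True if the word passes the even-parity check.

    A single flipped bit always fails the check; two flipped bits
    cancel out, so this code detects errors but cannot correct them.
    """
    return sum(word) % 2 == 0
```

For example, `add_parity([1, 0, 1])` yields `[1, 0, 1, 0]`, and flipping any one bit of the result makes `parity_ok` return `False`.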
- Error-correcting Codes (3-6 hrs)
- Hamming Code
- Hamming Bound
- Single Error Correction Plus Double Error Detection
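The (7,4) Hamming code covered here can be sketched directly: three parity bits at positions 1, 2, and 4 are chosen so that the syndrome of a received word spells out the position of a single error. A minimal illustration (not necessarily the course's notation):

```python
def hamming74_encode(d):
    """Encode 4 data bits [d1, d2, d3, d4] into a 7-bit Hamming codeword."""
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4   # checks positions 1, 3, 5, 7
    p2 = d1 ^ d3 ^ d4   # checks positions 2, 3, 6, 7
    p3 = d2 ^ d3 ^ d4   # checks positions 4, 5, 6, 7
    return [p1, p2, d1, p3, d2, d3, d4]

def hamming74_decode(w):
    """Correct at most one flipped bit and return the 4 data bits."""
    s1 = w[0] ^ w[2] ^ w[4] ^ w[6]
    s2 = w[1] ^ w[2] ^ w[5] ^ w[6]
    s3 = w[3] ^ w[4] ^ w[5] ^ w[6]
    pos = s1 + 2 * s2 + 4 * s3  # syndrome = 1-based position of the error
    w = list(w)
    if pos:
        w[pos - 1] ^= 1  # flip the erroneous bit back
    return [w[2], w[4], w[5], w[6]]
```

Flipping any single bit of a codeword leaves the decoder able to recover the original data, since the syndrome names the corrupted position.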
- Algebraic Coding Theory (6-9 hrs)
- Definition of Algebraic Code
- Parity Check Matrix
- Relation of Polynomials and Matrices
- Shift Registers for Encoding
- A Double Error Correcting Code
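The parity-check-matrix viewpoint in this section reduces decoding to linear algebra over GF(2): a word is a codeword exactly when its syndrome Hw^T is zero. A small sketch, using the standard parity check matrix of the (7,4) Hamming code as the example:

```python
def syndrome(H, word):
    """Syndrome s = H w^T over GF(2); all-zero means every check passes."""
    return [sum(h * b for h, b in zip(row, word)) % 2 for row in H]

# Parity check matrix of the (7,4) Hamming code: column j is the binary
# representation of j+1, so a nonzero syndrome names the error position.
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]
```

For the codeword `[0, 1, 1, 0, 0, 1, 1]` the syndrome is `[0, 0, 0]`; flipping bit 3 gives syndrome `[1, 1, 0]`, which is 3 in binary (low-order bit first).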
- Variable-length Codes (9-12 hrs)
- McMillan-Kraft Inequality
- Huffman Code
- Adaptive Huffman Code
- Extensions of a Code
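Huffman's construction, the centerpiece of this section, repeatedly merges the two least probable symbols into one. A compact sketch using a heap (illustrative code, not from the course materials):

```python
import heapq

def huffman_code(freqs):
    """Build a binary prefix code from a dict of symbol frequencies."""
    # Heap entries: (weight, unique tiebreak, {symbol: code-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        # Merge the two least probable subtrees, prefixing 0 and 1.
        w1, _, c1 = heapq.heappop(heap)
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (w1 + w2, count, merged))
        count += 1
    return heap[0][2]
```

With probabilities {a: 1/2, b: 1/4, c: 1/8, d: 1/8} this yields codeword lengths 1, 2, 3, 3, for an average length of 1.75 bits per symbol.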
- Entropy and Shannon's First Theorem (6-9 hrs)
- Definition of Entropy
- Shannon's First Coding Theorem
- Shannon-Fano Code
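The entropy defined in this section, H(P) = -Σ p log₂ p, is the yardstick against which codes like Shannon-Fano and Huffman are measured: Shannon's first theorem says no uniquely decodable code can achieve an average length below H. A two-line sketch:

```python
from math import log2

def entropy(probs):
    """H(P) = -sum(p * log2(p)): the least possible average bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)
```

For the distribution (1/2, 1/4, 1/8, 1/8) this gives 1.75 bits, which a Huffman code achieves exactly because every probability is a power of 2.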
- Information Channels (6-9 hrs)
- Model of an Information Channel
- Binary Symmetric Channel
- System Entropies
- Channel Capacity (6-9 hrs)
- Definition of Channel Capacity
- The Uniform Channel
- Capacity of a Binary Symmetric Channel
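For the binary symmetric channel the capacity has a closed form, C = 1 - H(p), where H is the binary entropy function and p the crossover probability. A minimal sketch:

```python
from math import log2

def h2(p):
    """Binary entropy function H(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p."""
    return 1.0 - h2(p)
```

The capacity is 1 bit per use for a noiseless channel (p = 0), drops to 0 at p = 1/2 where the output is independent of the input, and is symmetric about that point.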
- Shannon's Second Theorem (6-9 hrs)
- Mathematical Preliminaries
- Decision Rules
- Random Encoding
- Shannon's Second Coding Theorem
- Fano Bound
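A decision rule, as studied in this final section, maps each received word to a codeword. For a binary symmetric channel with p < 1/2, the maximum-likelihood rule is simply nearest-neighbor decoding in Hamming distance; a sketch (the repetition codebook below is just an example):

```python
def hamming_distance(a, b):
    """Number of positions in which two words differ."""
    return sum(x != y for x, y in zip(a, b))

def ml_decode(received, codebook):
    """Minimum-distance decoding: the maximum-likelihood decision rule
    for a binary symmetric channel with crossover probability p < 1/2."""
    return min(codebook, key=lambda c: hamming_distance(received, c))
```

With the length-3 repetition code {000, 111}, the received word 101 decodes to 111, correcting the single flipped bit.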