Information and Coding Theory (EE 410)

Term: 2021 Spring
Faculty: Faculty of Engineering and Natural Sciences
Program: Electronics Engineering (EE)
SU Credits: 3
ECTS Credits: 6.00 / 6.00 (for students admitted in the 2013-14 Academic Year or following years)
Instructor: Özgür Erçetin, oercetin@sabanciuniv.edu
Language of Instruction: English
Level: Undergraduate
Prerequisites: MATH203, MATH201
Mode of Delivery: Formal lecture
Teaching Methods: Interactive, Communicative

CONTENT

Mathematical models for communication channels and sources; entropy, information, lossless data compression, Huffman coding, channel capacity, Shannon's theorems, rate-distortion theory.
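
As a brief illustration of the first topics listed above, the Python sketch below computes the entropy H(X) = -Σ p(x) log2 p(x) of a discrete memoryless source and the expected length of a binary Huffman code for it. The symbol probabilities are hypothetical and chosen only for illustration; the result previews Shannon's source coding bound H(X) ≤ L < H(X) + 1.

```python
import heapq
from math import log2

def entropy(probs):
    """Entropy H(X) = -sum p*log2(p) of a discrete source, in bits per symbol."""
    return -sum(p * log2(p) for p in probs if p > 0)

def huffman_code_lengths(probs):
    """Codeword length of each symbol under a binary Huffman code."""
    # Each heap entry: (probability, unique tie-breaker, symbol indices in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, t, s2 = heapq.heappop(heap)
        for s in s1 + s2:   # merging two subtrees adds one bit to every leaf below
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, t, s1 + s2))
    return lengths

# Hypothetical source distribution, for illustration only.
p = [0.4, 0.3, 0.2, 0.1]
H = entropy(p)
L = sum(pi * li for pi, li in zip(p, huffman_code_lengths(p)))
print(f"H(X) = {H:.3f} bits, Huffman expected length = {L:.3f} bits")
# Consistent with the source coding bound: H(X) <= L < H(X) + 1.
```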

OBJECTIVE

To learn what information is, how to measure it, and how to use it to design better information systems.

LEARNING OUTCOME

Define the information content of an information source mathematically, and define information-theoretic measures such as entropy, conditional entropy, joint entropy, mutual information, and differential entropy.
Describe the fundamental limit in source coding and learn Shannon's Source Coding Theorem.
Design and implement some practical source codes.
Describe the fundamental limit on the maximum rate at which information can be sent reliably and learn Shannon's Channel Capacity Theorem.
Design and implement some practical channel codes.
Describe the capacity of the Gaussian channel and optimal power allocation over Gaussian channels using the water-filling algorithm (a brief numerical sketch follows this list).
Describe the application of information theory to engineering problems through the course project.
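
The sketch referenced above is a minimal Python illustration of water-filling power allocation over parallel Gaussian sub-channels; the noise levels and power budget are hypothetical, and the resulting capacity is evaluated as the sum of 0.5*log2(1 + p_i/n_i) over sub-channels.

```python
from math import log2

def water_filling(noise, total_power):
    """Water-filling: allocate p_i = max(0, mu - n_i) so that sum(p_i) = total_power.
    `noise` holds the effective noise levels n_i of parallel Gaussian sub-channels."""
    n = sorted(noise)
    # Try using the k lowest-noise channels; pick the largest k whose water level
    # mu = (total_power + sum of their noise) / k lies above all k noise levels.
    for k in range(len(n), 0, -1):
        mu = (total_power + sum(n[:k])) / k
        if mu > n[k - 1]:
            break
    return [max(0.0, mu - ni) for ni in noise]

def capacity(noise, powers):
    """Sum capacity in bits per channel use: sum of 0.5*log2(1 + p_i/n_i)."""
    return sum(0.5 * log2(1 + p / ni) for p, ni in zip(powers, noise))

# Hypothetical sub-channel noise levels and power budget, for illustration only.
noise = [0.5, 1.0, 2.0, 4.0]
P = 3.0
p = water_filling(noise, P)
print("power allocation:", [round(x, 3) for x in p])
print("capacity (bits/use):", round(capacity(noise, p), 3))
```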

ASSESSMENT METHODS and CRITERIA

Assessment      Percentage (%)
Final           40
Midterm         40
Exam            10
Participation   10

RECOMMENDED or REQUIRED READINGS

Textbook

Cover and Thomas, "Elements of Information Theory", 2nd Edition, Wiley, 2006.

Readings

Shannon's seminal papers on Information Theory and Secrecy. Additional papers illustrating the applications of Information Theory in real-life systems will also be provided.