Number of hours
- Lectures: 16.5
- Projects: none
- Tutorials: 16.5
- Internship: none
- Laboratory works: none
- Written tests: none
ECTS
3.0
Goal(s)
Provide the theoretical fundamentals for the quantitative measurement of information (what is a bit of information?) and for its representation, storage, transmission, protection and hiding. Applications to lossless compression, error-correcting codes, transmission security and content security (steganography).
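As an informal illustration of the first question above (not part of the official course material; Python with NumPy is assumed purely for the example), the sketch below computes the Shannon entropy of a binary source in bits: a fair coin toss carries exactly one bit of information, a biased one carries less.

    import numpy as np

    def entropy_bits(p):
        # Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                      # convention: 0 * log2(0) = 0
        return float(-np.sum(p * np.log2(p)))

    print(entropy_bits([0.5, 0.5]))       # 1.0   -> a fair coin is worth one bit
    print(entropy_bits([0.9, 0.1]))       # ~0.47 -> a biased coin carries less information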
Jean-Marc BROSSIER
Content(s)
- Uncertainty and information. Entropies. The Shannon entropy and its properties. Entropy and information transfer.
- Structure of a communication chain. Information and compression. Channel capacity and channel coding.
- Coding of discrete sources. Asymptotic equipartition. Theoretically optimal codes. Construction of optimal codes.
- Transmitting and storing information. Discrete channels (the binary symmetric channel). Channel coding and the second Shannon theorem.
- Error-correcting codes. Repetition codes and the second Shannon theorem. Error-detecting codes. Linear block codes. Hamming distance and Euclidean distance. Maximum-likelihood decoding (a short illustrative sketch follows this list).
- Some applications: steganography, information leakage and security.
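As an informal companion to the channel-related items above (not part of the course material; Python with NumPy is assumed only for illustration), the sketch below computes the capacity of a binary symmetric channel and decodes a (7,4) Hamming linear block code by minimum Hamming distance, which coincides with maximum-likelihood decoding when the crossover probability is below 1/2.

    import itertools
    import numpy as np

    def bsc_capacity(p):
        # Capacity of the binary symmetric channel, C = 1 - H2(p) bits per use (0 < p < 1).
        return 1.0 + p * np.log2(p) + (1 - p) * np.log2(1 - p)

    # Systematic generator matrix of the (7,4) Hamming code over GF(2).
    G = np.array([[1, 0, 0, 0, 1, 1, 0],
                  [0, 1, 0, 0, 1, 0, 1],
                  [0, 0, 1, 0, 0, 1, 1],
                  [0, 0, 0, 1, 1, 1, 1]])

    # The 2^4 = 16 codewords: every message m encoded as m G (mod 2).
    messages = np.array(list(itertools.product([0, 1], repeat=4)))
    codebook = messages @ G % 2

    def bsc(word, p, rng):
        # Binary symmetric channel: flip each bit independently with probability p.
        return word ^ (rng.random(word.shape) < p).astype(int)

    def ml_decode(received):
        # Minimum Hamming distance = maximum-likelihood decoding on a BSC with p < 1/2.
        distances = np.sum(codebook != received, axis=1)
        return messages[np.argmin(distances)]

    rng = np.random.default_rng(seed=0)
    print("C(0.05) =", bsc_capacity(0.05))          # ~0.71 bit per channel use
    m = np.array([1, 0, 1, 1])
    r = bsc(m @ G % 2, p=0.05, rng=rng)
    print("sent:", m, "received:", r, "decoded:", ml_decode(r))  # any single bit flip is corrected

Brute-force search over all 16 codewords is viable only because the code is tiny; the sketch is meant solely to make the distance and decoding notions concrete.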
Prerequisites
Probability
N1 = E1
N2 = E2
The course exists in the following branches:
- Curriculum - Core curriculum - Semester 5
Course ID: 3MMTINF
Course language(s):
Bibliography
- C. Shannon, W. Weaver, La théorie mathématique de la communication, Cassini, April 2018.
- O. Rioul, Théorie de l'information et du codage, Hermès, 2007.
- T. M. Cover, J. A. Thomas, Elements of Information Theory, John Wiley & Sons, 2006.
- D. J. C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press, 2003. http://www.cs.toronto.edu/~mackay/itila/book.html
- R. G. Gallager, Information Theory and Reliable Communication, Wiley, 1968.