6.050J / 2.110J Information and Entropy, Spring 2003
Author(s)
Lloyd, Seth; Penfield, Paul
Download: 6-050JSpring-2003/OcwWeb/Electrical-Engineering-and-Computer-Science/6-050JInformation-and-EntropySpring2003/CourseHome/index.htm (16.27 KB)
Alternative title
Information and Entropy
Abstract
Unified theory of information with applications to computing, communications, thermodynamics, and other sciences. Digital signals and streams, codes, compression, noise, and probability. Reversible and irreversible operations. Information in biological systems. Channel capacity. Maximum-entropy formalism. Thermodynamic equilibrium, temperature. The Second Law of Thermodynamics. Quantum computation.

From the course home page: 6.050J / 2.110J presents the unified theory of information with applications to computing, communications, thermodynamics, and other sciences. It covers digital signals and streams, codes, compression, noise, and probability; reversible and irreversible operations; information in biological systems; channel capacity; maximum-entropy formalism; thermodynamic equilibrium and temperature; the Second Law of Thermodynamics; and quantum computation. Designed for MIT freshmen as an elective, this course has been jointly developed by MIT's Departments of Electrical Engineering and Computer Science and Mechanical Engineering. There is no known course similar to 6.050J / 2.110J offered at any other university.
Date issued
2003-06
Other identifiers
6.050J-Spring2003
local: 6.050J
local: 2.110J
local: IMSCP-MD5-0484a9ff1bd2f5a1168c64c3707e555c
Keywords
computing, communications, thermodynamics, codes, compression, noise, probability, reversible operations, irreversible operations, channel capacity, thermodynamic equilibrium, temperature, maximum-entropy formalism, second law of thermodynamics, quantum computation, biological systems, unified theory of information, digital signals, digital streams, bits, errors, processes, inference, physical systems, energy, quantum information, 6.050J, 2.110J, 6.050, 2.110, Entropy (Information theory)