General Information
This section provides miscellaneous information about MIT subject 6.050J / 2.110J Information and Entropy, offered in Spring 2003. This subject is designed for MIT freshmen. Academic credit of 6 units (half of that given by a typical MIT subject) is provided.
First Offering
Spring 2003 is the first offering of this subject. It was offered three times while being developed, under the numbers 6.095 and 2.995, in Spring 2000, 2001, and 2002.
This subject is offered jointly by the Department of Electrical Engineering and Computer Science, and the Department of Mechanical Engineering. Students may sign up for either 2.110 or 6.050.
Assignments
Problem sets are issued weekly, typically on Monday morning (or, if possible, the previous Friday), and are usually due on Friday of the same week. They relate to concepts presented in that week's Tuesday lecture. Students may submit their homework solutions either electronically or at the 6.050J / 2.110J office. Official solutions are posted shortly after the problem sets are due. Late submissions do not receive any credit.
Computers
Students are expected to access resources from all over the World Wide Web.
Laboratory
Laboratory exercises are part of some of the problem sets. These make use of the MATLAB® programming environment. MATLAB® is a commercial product.
Workload
6.050J / 2.110J is a six-unit subject. It is intended that the overall work required be approximately six hours per week, including two hours of lecture and recitation. Any students who find themselves spending substantially more than six hours any week should question whether they are stuck and might make more rapid progress if they asked the instructing staff for some hints or got advice from fellow students. In particular, students should avoid spending nonproductive time on the computer, either polishing a MATLAB® exercise unnecessarily, or surfing the Web aimlessly.
Collaboration
Weak collaboration is permitted on problem sets. In this context the term "weak collaboration" means that two or more students may discuss the problems and their ways of approaching them, but that each student must fully work out the problem and present only his or her own solution. Advice can be given and received, but no part of the solution can be copied from another, nor can identical portions appear in the submissions of two or more students. Any weak collaboration must be fully disclosed as part of the problem solution, for example by a phrase like "Alice Alison and Bob Robertson collaborated in part (b) by discussions of general approach." Since weak collaboration involves discussions among two or more people, all must have compatible statements.
Help from people not taking this course is also permitted, provided that it is fully disclosed, and that the solution submitted was written in the privacy of the submitter's own mind and body.
Strong collaboration is not permitted on problem sets. In this context the term "strong collaboration" is any collaboration in which work done by others is incorporated, with or without disclosure. Strong collaboration is normal and desirable in the work environment, where the principal purpose is to accomplish, as a team, some objective. In an academic setting, however, the purpose is to facilitate learning by all individual students, and strong collaboration does not support that goal.
It is, of course, a serious academic offense for a student to present another's work as his or her own. It is also an offense to fail to report collaboration in accordance with course policy. Such offenses will be treated seriously.
Prerequisites
The catalog description states that a prerequisite for 6.050J / 2.110J is one of the versions of 8.01 Physics I. This prerequisite is enforced. To qualify, a student must have received credit for 8.01 either through advanced standing or by earning a passing grade in 8.01, 8.012, 8.01L, or 8.01X.
Grades
Grades will be based on participation in class (15%), problem set solutions (30%), mid-term quiz (15%), final examination (30%), and subjective judgment of the instructing staff (10%). Assignment of grades is not an exact science, so these percentages should be regarded as approximate.
FAQs
- Are these questions really frequently asked?
Of course not. Most have never been asked in exactly this form. However, questions and answers are a good way to explain things.
- What is this course all about?
Information and entropy. Information is a measure of the amount of knowledge; it is not the knowledge itself. Thus information is like energy, in the sense that knowing the amount of energy does not tell you anything about its nature, location, form, etc. The same is true of information. Entropy is one kind of information. It is the information you do not have about a situation.
- The information I don't have? Is this different from the information you don't have?
Certainly. Information is subjective. You know things that I don't. This is, after all, why communication is useful.
- But is entropy also a subjective quantity?
Strictly speaking, yes. However, in physical situations the difference of entropy as perceived by two observers may be negligibly small.
- If information is the measure of a quantity, what are its units?
Information is measured in bits. More bits means more information.
- Is entropy also measured in bits?
Yes. However, in physical situations there is an enormous amount of information that is not known. (Think about how many bits of information are needed to specify the position of every atom in an object.) It is impractical to work with such large numbers of bits, so another unit is used: joules per kelvin (J/K).
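The two unit systems are related by Boltzmann's constant: one bit of entropy corresponds to k_B ln 2 joules per kelvin. A minimal sketch of the conversion (this script and its variable names are illustrative, not part of the course materials):

```python
import math

# Boltzmann's constant in J/K (exact by the 2019 SI definition)
K_B = 1.380649e-23

# Entropy of one bit expressed in conventional thermodynamic units:
# S = k_B * ln(2), about 1e-23 J/K
joules_per_kelvin_per_bit = K_B * math.log(2)
print(f"1 bit  = {joules_per_kelvin_per_bit:.3e} J/K")

# Conversely, one J/K of entropy expressed in bits
bits_per_joule_per_kelvin = 1.0 / joules_per_kelvin_per_bit
print(f"1 J/K = {bits_per_joule_per_kelvin:.3e} bits")
```

Running this shows that one bit is only about 10^-23 J/K, which is why the same quantity is quoted in such different-looking units in the two fields.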
- Wait a minute. Entropy is physical and information is mathematical. How can they be conceptually the same?
Historically, information storage and transmission required a physical artifact of some kind. Think of newspapers, books, and medieval manuscripts. So traditionally information has also been physical — most of the cost of information processing was due to the physical carrier of the information. It is only recently that the cost of storing, moving, or processing information has become so low that we think of information apart from its physical form, as though it were not even subject to physical laws. But this is only a temporary situation. Eventually information technology will have to face the limits imposed by quantum mechanics and thermodynamics, and it will again be necessary to understand the fundamental physics of information. This time it will be entropy that is the relevant physical concept.
- Why is information so important?
This is the beginning of the information age. Just as the industrial age was opened up by our ability to manage energy, so the information age is upon us because we are learning how to manage information effectively.
- Why is entropy so important?
Entropy is one of the most mysterious of all the concepts dealt with by scientists. The Second Law of Thermodynamics states that the entropy of a system cannot decrease unless there is at least as great an increase elsewhere. Thus entropy has the unusual property that it is not conserved, as energy is, but increases over time. The Second Law of Thermodynamics is often regarded as one of science's most glorious laws, and also one of the most difficult to understand.
- If entropy is so difficult, can a freshman really understand it?
Certainly. It's all in how the topics are approached. It is true that the concepts involved here are not normally taught to freshmen. This is a shame, because they have the background necessary to appreciate them if approached from the point of view of information. Traditionally the Second Law of Thermodynamics is taught as part of a course on thermodynamics, and a background in physics or chemistry is needed. Also, the examples used come from thermodynamics. In this course, the Second Law is treated as an example of information processing in natural and man-made systems; the examples come from many domains.
- I am thinking about taking this course. What do I need to know to start?
You need to understand energy, and how it can exist in one place or another, how it can be transported from here to there, and can be stored for later use. You need to know how to deal with a conserved quantity like energy mathematically, and to appreciate that if energy increases in one region, it must decrease elsewhere. More specifically, the prerequisite for this course is the first semester freshman physics subject 8.01 (or 8.012, 8.01L, or 8.01X).
- Is entropy useful outside of thermodynamics?
All physical systems obey the laws of thermodynamics. The challenge is to express these laws in simple but general forms so that their significance in, for example, a biological system gives insight. Besides, laws similar to the Second Law of Thermodynamics are found in abstract systems governed more by mathematics than physics — two examples discussed in the course are computers and communications systems. In these contexts the "information" part of "Information and Entropy" is important.
- Why aren't information and entropy normally thought of together?
Most scientists recognize that information can be exchanged for entropy and vice versa, but they don't consider that fact important. The reason is that in typical physical situations the number of bits of entropy is far larger than the number of bits of information that even the largest information processing systems can deal with. In other words, the scales involved are vastly different. This is because there are a large number of atoms in physical systems.
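To put rough numbers on this difference in scale, consider an everyday object. The figures below (such as the molar entropy of liquid water, roughly 70 J/(mol K)) are standard textbook values used only for illustration and do not come from the course notes:

```python
import math

K_B = 1.380649e-23           # Boltzmann's constant, J/K
BIT = K_B * math.log(2)      # entropy of one bit, J/K

# Standard molar entropy of liquid water is roughly 70 J/(mol K),
# and one gram of water is about 1/18 of a mole, so a single gram
# carries a few J/K of entropy.
entropy_one_gram_water = 70.0 / 18.0   # J/K, approximate

bits = entropy_one_gram_water / BIT    # entropy expressed in bits
terabyte = 8e12                        # bits in one terabyte

print(f"Entropy of a gram of water: about {bits:.1e} bits")
print(f"Equivalent storage: about {bits / terabyte:.1e} terabytes")
```

Even under these rough assumptions, a single gram of water carries on the order of 10^23 bits of entropy — more than ten billion terabytes — which is why the entropy of physical systems dwarfs the information handled by any computer.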
- But then why treat them together now?
For two reasons. First, the underlying principles are the same so you only have to learn them once. And second, modern technology is continuously increasing the amount of information that computers and communication systems can deal with. Another way of saying this is to observe that modern microelectronic systems control more bits with fewer atoms. As the number of atoms per bit comes down, the difference in scale between information in computers and communications systems and entropy in the corresponding physical systems shrinks. Eventually it will be possible to make devices that cannot be understood without considering the interplay between the information stored in the device and the entropy of its physical configuration.
- If I take this course, can I get a summer job?
Certainly, but probably not because of anything you learn here.