Demon's Memory
Historically, the concept of information in physics does not have a clear-cut origin. An important thread can be traced back to the paradox of Maxwell's demon, proposed in 1871 (fig. 1; see also Brillouin 1956). Recall that Maxwell's demon is a creature that opens and closes a trap door between two compartments of a chamber containing gas, and pursues the subversive policy of opening the door only when fast molecules approach it from the right, or slow ones from the left. In this way the demon establishes a temperature difference between the two compartments without doing any work, in violation of the second law of thermodynamics, and consequently permitting a host of contradictions.
Fig. 1. Maxwell's demon. In this illustration the demon sets up a pressure difference by only raising the partition when more gas molecules approach it from the left than from the right. This can be done in a completely reversible manner, as long as the demon's memory stores the random results of its observations of the molecules. The demon's memory thus gets hotter. The irreversible step is not the acquisition of information, but the loss of information if the demon later clears its memory.
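As a rough illustration of the sorting policy described above (a toy sketch of my own, not part of the quoted text; the distribution and parameters are arbitrary assumptions), one can simulate the demon on a box of molecules with random kinetic energies. The demon's decisions do separate hot from cold, but each decision deposits one bit in its memory:

    import random

    # Toy Maxwell's demon: sort molecules by kinetic energy.
    # (Illustrative sketch only; units and distribution are arbitrary.)
    random.seed(0)
    N = 10_000

    # Stand-in for a thermal distribution: kinetic energies drawn from an
    # exponential distribution with unit mean ("temperature" ~ 1).
    energies = [random.expovariate(1.0) for _ in range(N)]
    threshold = sorted(energies)[N // 2]  # the demon's fast/slow cutoff

    fast = [e for e in energies if e > threshold]   # admitted to the left
    slow = [e for e in energies if e <= threshold]  # kept on the right

    # Temperature is proportional to the mean kinetic energy per molecule.
    print(f"T_left  ~ {sum(fast) / len(fast):.2f}")   # ~ 1.69
    print(f"T_right ~ {sum(slow) / len(slow):.2f}")   # ~ 0.31

    # Each decision leaves one bit behind: N bits that must later be erased.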
A number of attempts were made to exorcise Maxwell's demon (see Bennett 1987), for example by arguing that the demon cannot gather information without doing work, or without disturbing (and thus heating) the gas; both arguments turn out to be untrue.
Some were even tempted to propose that the second law of thermodynamics could indeed be violated by the actions of an "intelligent being." It was not until 1929 that Leo Szilard made progress by reducing the problem to its essential components: the demon need merely identify whether a single molecule is to the right or left of a sliding partition, and its action allows a simple heat engine, called Szilard's engine, to be run. Szilard had still not solved the problem, however, since his analysis left it unclear whether the act of measurement, by which the demon learns whether the molecule is to the left or the right, must itself involve an increase in entropy.
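To make the engine quantitative (a standard textbook calculation, added here for illustration), suppose the demon learns which half of the box the molecule occupies. The partition can then serve as a piston, and the one-molecule gas expands isothermally from volume V/2 to V. With the single-molecule ideal-gas law p = k_B T / V, the work extracted per stroke is

    W = \int_{V/2}^{V} p \, dV = \int_{V/2}^{V} \frac{k_B T}{V'} \, dV' = k_B T \ln 2 ,

so each stroke converts one bit of acquired information into k_B T ln 2 of useful work.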
A definitive and clear answer was, surprisingly, not forthcoming until a further fifty years had passed. In the intervening years digital computers were developed, and the physical implications of information gathering and processing were carefully considered. The thermodynamic costs of elementary information manipulations were analysed by Landauer and others during the 1960s (Landauer 1961, Keyes and Landauer 1970), and those of general computations by Bennett, Fredkin, Toffoli and others during the 1970s (Bennett 1973, Toffoli 1980, Fredkin and Toffoli 1982). It was found that almost anything can in principle be done in a reversible manner, i.e. with no entropy cost at all (Bennett and Landauer 1985). Bennett (1982) made the relation between this work and Maxwell's paradox explicit by proposing that the demon can indeed learn where the molecule is in Szilard's engine without doing any work or increasing any entropy in the environment, and can therefore obtain useful work during one stroke of the engine. However, the information about the molecule's location must then be present in the demon's memory (fig. 1). As more and more strokes are performed, more and more information accumulates in the demon's memory. To complete a thermodynamic cycle, the demon must erase its memory, and it is during this erasure operation that we identify an increase in entropy in the environment, as required by the second law. This completes the essential physics of Maxwell's demon; further subtleties are discussed by Zurek (1989), Caves (1990), and Caves, Unruh and Zurek (1990).
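To put a number on the erasure cost (an added illustration, not from the quoted text), Landauer's principle sets the minimum dissipation for erasing one bit at k_B T ln 2. A minimal Python sketch, assuming room temperature:

    import math

    k_B = 1.380649e-23   # Boltzmann constant, J/K (exact SI value)
    T = 300.0            # assumed room temperature, K

    # Landauer bound: minimum heat dissipated per erased bit.
    landauer_bound = k_B * T * math.log(2)
    print(f"Erasing one bit at {T:.0f} K dissipates >= {landauer_bound:.2e} J")
    # ~ 2.87e-21 J

This is exactly the work gained per stroke of Szilard's engine, so over a complete cycle the demon at best breaks even and the second law survives.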
Source: Quantum Computing: introduction
http://eve.physics.ox.ac.uk/Personal/steane/qcintro.html
The second law of thermodynamics forbids (on grounds of statistical improbability) two bodies of equal temperature, brought into contact with each other and isolated from the rest of the Universe, from evolving to a state in which one of the two has a significantly higher temperature than the other. The second law can also be expressed as the assertion that in an isolated system, entropy never decreases.
wikipedia.com
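The parenthetical "statistical improbability" can be made quantitative (an added note, using the standard fluctuation argument of statistical mechanics): the probability of a spontaneous fluctuation that lowers the entropy of an isolated system by \Delta S is suppressed by a factor of order

    P \sim e^{-\Delta S / k_B} ,

which is astronomically small for any macroscopic entropy decrease.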
In topology, two continuous functions from one topological space to another are called homotopic (Greek homos = identical and topos = place) if one can be "continuously deformed" into the other, such a deformation being called a homotopy between the two functions. An outstanding use of homotopy is the definition of homotopy groups and cohomotopy groups, important invariants in algebraic topology.
wikipedia
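A standard concrete example (added for illustration): when the target is a convex subset of \mathbb{R}^n, any two continuous maps f, g \colon X \to \mathbb{R}^n into it are homotopic via the straight-line homotopy

    H(x, t) = (1 - t)\, f(x) + t\, g(x), \qquad t \in [0, 1],

which is continuous in (x, t) and deforms f (at t = 0) into g (at t = 1); convexity guarantees that H(x, t) stays inside the target.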