Entropy is a concept almost as important as energy, but far less widely understood. The book explains how the two concepts were confused with one another until the 1850s, and how a clear understanding of what entropy really is was reached only in 1948.
A French military engineer, Sadi Carnot, took the first step by thinking clearly about steam engines. For almost two decades his work was neglected, and then it impeded the efforts of an English brewer, James Joule, to establish the concept of energy. Eventually a couple of Germans and a Scotsman made energy and entropy into well-defined concepts. Over the next fifty years an Austrian and an American made the connection between entropy and disorder. The final step was taken by a mathematician at Bell Labs working on telecommunication protocols.
Today entropy is a hot topic in research into quantum information, black-hole physics and quantum field theory. Entropy is a measure of missing information, and quantum systems tend to leak information to their environments through the phenomenon of entanglement. Quantum technologies are concerned with limiting this leakage.
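To make "missing information" concrete, here is a minimal sketch in Python of Shannon's measure, the quantity the Bell Labs work formalized; the shannon_entropy helper and the coin examples are mine, not the book's:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: the average information still
    missing before we learn which outcome actually occurred."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin hides exactly one bit of information;
# a heavily biased coin hides much less, since its
# outcome is already nearly predictable.
print(shannon_entropy([0.5, 0.5]))  # 1.0
print(shannon_entropy([0.9, 0.1]))  # ~0.469
```

The more predictable the system, the less information is missing, and the lower the entropy.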
A black hole sets an upper limit on the entropy any body of a given size can have. Black holes contain stupendous quantities of entropy, but that entropy scales with the area of the event horizon rather than with the enclosed volume, so it grows far more slowly than quantum field theory, which counts degrees of freedom throughout the volume, would predict. Our current understanding of material reality is based on quantum field theory, so entropy is signalling a fundamental weakness in our understanding of the Universe.
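The result alluded to here is the standard Bekenstein–Hawking formula (not quoted in the text above), which ties a black hole's entropy to its horizon area A:

S_{\mathrm{BH}} = \frac{k_B\, c^3 A}{4\, G \hbar}

Because S grows with area rather than volume, a region's maximum entropy is vastly smaller than a naive volume count of field degrees of freedom suggests, which is the tension the paragraph describes.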


