Entropy

Daniel Davies dsquared at al-islam.com
Thu Apr 11 07:32:14 PDT 2002



> There has been an incredible amount of really stupid literary
> criticism written by critics who think "chaos" means chaos and
> "entropy" means entropy.
>
> Carrol


> You mean literary scholars don't use the term entropy to mean an
> increasing value for Boltzmann's H? I'm shocked, what DO you people
> learn in graduate school . . .

I'm sure that there is an incredible amount of really stupid literary criticism written for all sorts of reasons, but this particular modish fallacy can't really be pinned on the literary theorists a la Sokal. The tendency to read more into the concept of entropy than the evidence can justify is practically the Original Sin of thermodynamics, and Boltzmann's entropy concept (which I remember as W rather than H, an interesting example of transatlantic idiom) was being over-read pretty much the moment that Boltzmann's interpretation of the Second Law was off the presses. The Maxwell's Demon thought-experiment is the source of a lot of the confusion, as it encourages physicists to talk about "information" as an unproblematic concept which can be traded back and forth with energy. And there is a whole other load of problems arising from the fact that it's difficult to keep your head straight when talking about statistical mechanics; indeed, a number of physicists contemporaneous with Boltzmann had really serious problems with, and genuine arguments against, the statistical interpretation of the Second Law.
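For readers who haven't seen the statistical interpretation spelled out: Boltzmann's formula says S = k log W, where W counts the microstates consistent with a macrostate. A toy sketch (not from the original post; the coin system and function names are my own illustration) shows why the "equilibrium" macrostate wins on sheer multiplicity:

```python
import math

# Boltzmann's S = k_B ln W for a toy system: N coins, of which n show heads.
# W is the number of microstates (coin arrangements) consistent with the
# macrostate "n heads out of N".
K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(N, n):
    """Entropy of the macrostate 'n heads out of N coins'."""
    W = math.comb(N, n)      # multiplicity of the macrostate
    return K_B * math.log(W)

# The even mixture has vastly more microstates than an ordered state,
# hence higher entropy -- the statistical reading of the Second Law.
for n in (0, 10, 50):
    print(n, boltzmann_entropy(100, n))
```

Nothing here is about "information" at all; the demon confusion starts only when one tries to price the demon's knowledge of microstates in the same currency.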

But where the concept of "entropy" gets most of its baggage is Claude Shannon's information theory, which is also expressed in terms of a k log W formula, and in whose development absolutely *none* of the people involved took *any* care *at all* to discourage interpretations of the theory outside telecommunications engineering. Shannon and Weaver popularised the usage of "entropy" which made its way into the literary criticism Carrol abhors, so although "entropy" in this sense isn't entropy, the misuse began with the engineers. The counterpart to entropy, "information", split off into the economics profession, and did considerable damage there.
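The formal parallel is easy to see in a few lines. Shannon's H = -Σ p_i log p_i collapses to log W when all W messages are equally likely, which is the "k log W" shape the borrowing rests on (this sketch and its function name are my own illustration, not Shannon's notation):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p_i log2 p_i, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# For W equally likely messages, H reduces to log2 W -- the same
# shape as Boltzmann's k log W, which is where the analogy began.
W = 8
print(shannon_entropy([1 / W] * W))   # log2(8) = 3.0 bits

# A biased source carries less information per symbol:
print(shannon_entropy([0.9, 0.1]))    # well under 1 bit
```

The resemblance is purely formal, of course; nothing in it licenses carrying "entropy" from coding channels into novels or markets.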

All of which is a roundabout way to suggest that Philip Mirowski's book "Machine Dreams: Economics Becomes a Cyborg Science", recommended by Michael Perelman on this list a while ago, really is one of the best things I've read in a long time, and more or less everyone on this list would find something of interest to their own field in it.

dd



