On Thu, 11 Apr 2002, Daniel Davies wrote:
> But where the concept of "entropy" gets most of its baggage from is
> Claude Shannon's information theory, which is also expressed in terms of
> a k log W formula, and which absolutely *none* of the people involved in
> its development took *any* care *at all* to avoid or discourage
> interpretations of the theory outside telecommunications engineering.
Well, to be fair to those who came after, one of the most influential extenders of that information theory was also someone involved in its further development: Norbert Wiener. His _The Human Use of Human Beings: Cybernetics and Society_ was probably the most influential attempt to apply Shannon's entropy to society, and no one can say he didn't understand the math. And some of his arguments, like the one about the baleful effects of secrecy on weapons development, are still fresh today.
I think most of the American lit crit use of the term eventually traces back to him via Pynchon.
Michael