the term 'software' (fwd)

Les Schaffer godzilla at netmeg.net
Mon Jul 31 11:02:36 PDT 2000



> NEW BRUNSWICK, New Jersey (AP) -- John W. Tukey, a Princeton University
> statistician credited with coining the word "software," died here
> Wednesday. He was 85.

;-(


> As a professor at Princeton and researcher for AT&T's Bell Labs, Tukey
> developed important theories about how to analyze and present data
> clearly. But his most widely recognized contribution is his introduction
> of the term "software" to describe the programs used to run early
> computers. It first appeared in a 1958 article he wrote in the journal
> American Mathematical Monthly.

goddamn.... where do they find these dumbass writers....

Tukey is a household name where i come from, but not because he coined the term 'software', a claim i had NEVER heard before seeing this article.

to sum up his work as 'how to analyze and present data clearly' is plain stupid.

Tukey is known, among scientists and engineers involved in any kind of signal or digital processing, as being responsible for [further] developing fast Fourier transform (FFT) algorithms. The ability to calculate the transform quickly allowed all kinds of filtering and assorted other signal processing techniques and instruments to come into being. The household term is the Cooley-Tukey algorithm.

In fairness, these fast Fourier transform schemes date back to the early 1940s. At that time, a recursive algorithm for analyzing (transforming) a signal into its frequency components was developed by Danielson and Lanczos. This basic scheme, when combined with a digital computer operation called bit reversal, enabled FFTs to gain huge speed increases. Cooley and Tukey refined and popularized the scheme into what's now called decimation-in-time, or the Cooley-Tukey algorithm. There is also a Sande-Tukey algorithm (decimation-in-frequency), which plays the game a little differently.
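
to make the recursion concrete, here is a minimal Python sketch of the radix-2 decimation-in-time idea. this is my own illustration, not anybody's production code, and it assumes the input length is a power of two:

    import cmath

    def fft(x):
        # radix-2 decimation-in-time FFT; len(x) must be a power of two
        N = len(x)
        if N == 1:
            return list(x)
        # the Danielson-Lanczos step: split into even- and odd-indexed samples
        even = fft(x[0::2])
        odd = fft(x[1::2])
        # stitch the two half-size transforms together with twiddle factors
        out = [0j] * N
        for k in range(N // 2):
            w = cmath.exp(-2j * cmath.pi * k / N) * odd[k]
            out[k] = even[k] + w
            out[k + N // 2] = even[k] - w
        return out

the bit-reversal business mentioned above is just the iterative, in-place version of this same even/odd shuffle.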

How much faster are these FFTs?

Some idea can be gained as follows. The discrete Fourier transform, computed directly, is an order N^2 calculation, meaning that if you have a little more than a million samples of a signal in time, say

2^20 = 1048576

samples, it takes roughly

2^20 * 2^20 = 2^40

or 1.1 x 10^12 mathematical operations to calculate the spectrum. But FFTs make the process an

N * log_2 N

operation, or

2^20 * log_2 ( 2^20 ) = 2^20 * 20 = 5 * 2^22

operations. That gives a ratio of

2^40 / (5 * 2^22) = .2 * 2^18

or a speedup factor of roughly 52,429!!! and THAT contributed to the digital communications and processing "revolution".
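
the arithmetic is easy to check in a few lines of Python (these are leading-order operation counts only, not exact instruction counts):

    from math import log2

    N = 2**20                 # 1,048,576 samples
    dft_ops = N * N           # direct DFT: order N^2 = 2^40
    fft_ops = N * log2(N)     # FFT: N log_2 N = 2^20 * 20
    print(dft_ops)            # 1099511627776, about 1.1 x 10^12
    print(int(fft_ops))       # 20971520
    print(dft_ops / fft_ops)  # 52428.8, the speedup factor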

what do you say? does that qualify simply as 'how to analyze and present data clearly'??????

http://www-groups.dcs.st-and.ac.uk/~history/Mathematicians/Tukey.html

les schaffer

memo to marxists: can we think up a term that can be substituted for 'revolution' here, one that means more than linear development (something like punctuated equilibrium, except that development still occurs in between the punctuation marks), but doesn't usurp the social meaning of revolution???


