Einstein on FBI physics

Les Schaffer godzilla at netmeg.net
Thu Sep 7 12:50:37 PDT 2000


FBI physics == Fuzzy Bobbling Idealist/Intuitionist physics

First, Chris Burford said:


> His reliance on thought experiments presupposes an ideal simple
> logical structure to the universe.

this seems wrong to me -- not sharp enough to capture what Einstein accomplished. it's true that einstein was very much a ponderer: he liked thought experiments. but thought experiments have a place in research: they help the mind digest, organize and consolidate collective experience and theory, and see whether some simpler theory can be arrived at ...


> His lifelong search for a unified field theory is of the same
> character.

... so that while he was fond of gedanken experiments, this sentence is missing a big piece of the historical puzzle: the earlier Einstein himself, along with Planck, invented quantum ideas in the first place JUST to explain simple observable experimental facts -- that a finite-sized hot body does not radiate infinite energy as it cools by even a fraction of a degree (Planck), or the photoelectric effect (einstein's contribution), in which light energy was proposed to act in teensy-weensy (but very SPECIFIC-SIZED) bundles in its interactions with a metallic surface.
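
a minimal numeric sketch of those "specific sized" bundles (my numbers, chosen only for illustration -- the frequency and work function below are assumptions, not data from the posts): the photoelectric relation says a photon delivers energy h*f, and the ejected electron keeps what's left after paying the metal's work function.

h = 6.626e-34          # Planck's constant, J*s
e = 1.602e-19          # joules per electron-volt
f = 1.0e15             # frequency of incident light, Hz (ultraviolet, assumed)
phi_eV = 2.3           # work function of the metal, eV (roughly sodium, assumed)
photon_energy_eV = h * f / e
K_max_eV = photon_energy_eV - phi_eV   # max kinetic energy of the ejected electron
print(photon_energy_eV, K_max_eV)      # ~4.1 eV photons eject ~1.8 eV electrons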


> His difficulty in accepting empirical evidence in support of quantum
> theory is essentially idealist.

"The properties of elementary processes ... make it almost seem inevitable to formulate a truly quantized theory of radiation" [A. Einstein, Phys. Zeitschr. 18, 121, 1917 , as quoted in Inward Bound: Of matter and forces in the material world, by Abraham Pais]


> "God does not play dice", is an arbitrary rejection of the evidence
> that the universe is probabilistic.

except that Einstein made two significant contributions that were of a probabilistic nature. the first was on Brownian motion, where he formulated a theory accounting for the random motions of teeny-weeny particles suspended in fluids, as observed under microscopes -- a theory credited with strengthening the __atomic__ and __molecular__ theory of matter (see the bio by Pais and also Feynman). the second was his work on radiative transitions, where he picked up the ball passed from Planck and formulated a manifestly probabilistic explanation of light-matter interactions.
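
a minimal sketch (not Einstein's 1905 derivation, just an illustration of the statistical character of Brownian motion): for a simple random walk the mean squared displacement grows linearly with the number of steps, which is the kind of relation Einstein tied to measurable quantities.

import random

def mean_squared_displacement(n_steps, n_walkers=2000, step=1.0):
    total = 0.0
    for _ in range(n_walkers):
        x = 0.0
        for _ in range(n_steps):
            x += step if random.random() < 0.5 else -step
        total += x * x
    return total / n_walkers

for n in (10, 100, 1000):
    print(n, mean_squared_displacement(n))   # roughly 10, 100, 1000: <x^2> ~ n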

but, yes, he had a knee-jerk response against probabilistic explanation at the lowest (reduced) level; nevertheless, he made some of the most fundamental early contributions to quantum theory -- a theory which Born, Bohr and others later had to make manifestly probabilistic in interpretation precisely because of things like what einstein (and others) had earlier revealed in theory and practice.


> The proprositions that time can run backwards is not unique to him

in fact, i never read any place where einstein talked about time running backwards. are you sure you are not confusing einstein with another smart aleck, feynman, who DID talk about particles going backwards in time???

see the section on 'Virtual Particles: How can they be responsible for attractive forces?' in the Sci.physics FAQ:

http://www.math.ucr.edu/home/baez/physics/virtual_particles.html

in particular the latter part of:


>>> Now, consider a virtual photon that comes from the particle on the
right and is absorbed by the particle on the left. Actually calculating the photon's wave function is a little hairy; I have to consider the possibility that the photon was emitted by the other particle at any prior time. (However, I can save myself a little effort later by automatically including the possibility that the photon actually comes from the particle on the left and is absorbed by the particle on the right, with the recoil nudging the left particle: all I have to do is include situations in which the photon is "emitted on the right" in the future and goes "backward in time," and take its momentum to be minus what it really is! As long as I remember what's really going on, this trick is formally OK and saves a lot of trouble; it was introduced by Richard Feynman.) <<<

that is, considering backward motion in time is a __calculational__ strategy.


> but is common to the simplistic mathematical modelling of that
> approach to science, and I suggest is a fundamentally idealist,
> non-materialist assumption. (i.e. I suggest that along with a basic
> assumption that reality exists, a materialist approach needs to
> posit that time runs forwards, and cannot run backwards.)

chris, you'd like the stuff by prigogine and his crowd, i believe.

einstein took a huge step in removing the ideas of space and time from a purely god-given, idealistic realm. some of his early writings on this stuff, INCLUDING in his original papers on relativity, are full of eloquent expression of this:

"The only justification for our concepts and system of concepts is that they serve to represent the complex of our experiences; beyond this they have no legitimacy. I am convinced tha thet philosphers have had a harmful effect upon the progress of scientific thinking in removing certain fundamental concepts from the domain of empiricism, were they are under our control, to the intangible heights of the a priori. For even if it should appear that the universe of ideas cannot be deduced from experience by logical means, but is in a sense, a creation of the human mind, without which no science is possible, nevertheless this universe of ideas is just as little independent of the nature of our experiences as clothes are of the form of the human body. This is particualrly true of our concepts of time and space, which physicists have been obliged by the facts to bring down from the Olympus of the a priori in order to adjust them and put them in a serviceable condition." [Einstein, The Meaning of Relativity]

anyone who is tricked into thinking that einstein was a total idealist in his physics reads too much of the popular press and too little of einstein. for it is a perhaps dirty little secret that some of einstein's early papers -- the ones for which he is now timelessly (?) regarded as a genius and thus must be unapproachable by definition -- house some of the early, eloquent, simple, down-to-earth physics which belies this view of Chris' that einstein was too an idealist. maybe he died as an idealist (??), but his __concrete__ contributions to physics are anything but.

read about them for yourselves and fuck the capitalist scientific press and their one-minute sound-bite physics coverage. a press (or popular books, for that matter) which tends to harp on his 'see yourself in the mirror moving at the speed of light' koan -- err -- gedanken experiment, but misses all the juicy parts where he makes "serviceable", in a direct and simple way, the meanings we give to those constantly changing numbers we see on our (left?) wrist. in particular, the first part of his first paper on special relativity, where he dissects the idealist notion of simultaneous events, is a mind blower. see it at all good bookstores near you. [pp 37-43, On the electrodynamics of moving bodies, in The Principle of Relativity, Dover, $7 and change]
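
a minimal sketch of the simultaneity point (the distances and speeds below are made up for illustration, not taken from einstein's paper): two events simultaneous in one frame (dt = 0) but spatially separated (dx != 0) are NOT simultaneous in a frame moving at speed v, via the Lorentz transformation dt' = gamma * (dt - v*dx/c^2).

import math

c = 2.9979e8          # speed of light, m/s

def dt_prime(dt, dx, v):
    gamma = 1.0 / math.sqrt(1.0 - (v / c) ** 2)
    return gamma * (dt - v * dx / c**2)

# two lightning strikes 1 km apart, simultaneous for the platform observer:
print(dt_prime(dt=0.0, dx=1000.0, v=0.5 * c))   # ~ -1.9e-6 s for the moving observer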

in brief, einstein may have talked idealist later on, but his actions (contributions) were anything but...

then gordon fitch said:


> That the speed of light is the same in all directions and does not
> change over time is an empirical fact.

to which jks (Justin??) said:


> Empirical facts are those known through experience, as opposed to a
> priori facts like 2+2=4, known in some other way. There may also be
> a contrast with a theoretical fact, a fact derived from the
> structure of a theory. If that is a valid contrast, the constant
> velocity of c is a theoretical rather than a merely empirical fact.

and later:


> Basically, yeah, a theoretical fact is a fact that we accept because
> it is a part of a theory that we have reason to accept for some
> complex of reasons.

i don't quite follow this, but i think what's here is this:

1.) we had maxwell's equations, which unified electricity and magnetism and predicted light __wave__ phenomena propagating at a speed independent of any motion of the light wave's source (the doppler shift is a separate matter). the speed was a property of, well, the vacuum itself.

2.) the michelson-morley experiments, "the constancy of the speed of light", which looked for motion relative to an aether. their negative result gave further empirical support for maxwell's theory of light speed and its constant character independent of source motion relative to said aether (i.e., exactly what maxwell WOULD predict).

Einstein came along and said: let's keep 1.), the theoretical framework; raise 2.), the observed speed of light independent of frame of reference, to the level of an independent postulate; and amend 3.), newton's laws. thus did special relativity flow forth.

So, yes, you could say that c was also a fact derived from one theoretical framework -- Maxwell's equations, viz.,

c = 1 / sqrt( mu * epsilon )

where mu * epsilon is the product of something that was "all magnetic" and something that was "all electric". Though don't forget too that there was also an earlier and exceedingly interesting measurement of the speed of light in the 1600's involving the satellites of Jupiter.
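
as a quick numeric sketch (my own, plugging in the standard vacuum constants -- nothing here is from the original posts), maxwell's relation really does spit out the measured speed of light:

import math

mu0 = 4e-7 * math.pi         # vacuum permeability, T*m/A ("all magnetic")
eps0 = 8.854187817e-12       # vacuum permittivity, F/m ("all electric")
c = 1.0 / math.sqrt(mu0 * eps0)
print(c)                     # ~2.9979e8 m/s, the speed of light in vacuum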

But by raising Item 2. to a postulate, einstein arrived at a new interpretation of the underlying framework of maxwell's theory, that of the structure of space-time, upon which the electromagnetic waves ride, so to speak.

not bad for a couple years' work.

moving forwards in time, Dace said:


> Einstein was a firm believer in the Platonic religion of geometry.
> He wanted a purely geometrical theory of the universe, while quantum
> physics sought out a materialistic theory. Einstein referred to
> geometry as "marble" and particles as "wood." He believed in a
> universe of marble that merely takes on the illusion of wood.

what gives you this impression, i am curious?

Einstein took an element of idealism OUT OF space-time theory. he said that the marble was affected by the wood just as intimately as the wood was affected by the marble. in other words, while matter responds to space-time structure (send a comet past the sun and its arc bends a little as it passes near Jupiter), space-time structure (the gift formerly handed down to us from Providence) in fact arranges itself according to matter (the curvature of spacetime around Jupiter).
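
a minimal sketch of "matter arranges space-time" (the standard textbook deflection formula, not something quoted in the posts): a light ray grazing a mass M with impact parameter b gets bent by an angle theta = 4*G*M / (c^2 * b).

import math

G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.9979e8         # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.96e8       # solar radius, m (a ray grazing the sun's edge)

theta = 4 * G * M_sun / (c**2 * R_sun)     # deflection angle in radians
print(math.degrees(theta) * 3600)          # ~1.75 arcseconds -- the 1919 eclipse number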

This business of a universe of marble with an illusion of wood, i suppose, refers to einstein's and others' attempts to see whether the masses of various particles could themselves be deduced from some simpler field theory. for example, could the mass of the electron be explained strictly in terms of gravitational or electromagnetic energy (equivalence of mass and energy)?

But of course no such fundamental result has been obtained yet. The masses of the electron, the proton and neutron (well, quark triplets), and the photon (zero) are still fundamental measured constants.
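
a small worked number behind "equivalence of mass and energy" (standard constants, my arithmetic): the electron's rest-mass energy E = m*c^2, expressed in the units particle physicists use.

m_e = 9.109e-31        # electron mass, kg
c = 2.9979e8           # speed of light, m/s
e = 1.602e-19          # joules per electron-volt
E_MeV = m_e * c**2 / e / 1e6
print(E_MeV)           # ~0.511 MeV -- one of those fundamental measured constants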

again, espousal of idealist views does not imply that contributions were idealist in nature. one has to look and see what was put forward in actuality rather than read only the philosophy.

for that matter, you could call all particle physicists idealists because they seek a unified field theory, and there would be truth to that. certainly this belief in a unified field theory determines the direction of mainstream physics research, etc. and yet it's well to keep in mind that physicists generally try to keep at least one foot on materialist ground, so whatever philosophical pronouncement of the day appears in the press from one nobel physicist or the next, it's better to try and grasp a more complete historical and theoretical view of actually existing physics.

does that make sense?


> The rift between geometry and particles has apparently been solved
> by superstring theory. According to this theory, every type of
> matter is simply a different resonance of tiny, vibrating strings.
> Just as the voice resolves itself into discrete notes, superstrings
> take on a particular sequence of vibrations, corresponding to the
> various types of particles.

i kind of see what you're talking about here: that superstring theory may allow us to avoid the divergences that arise when blending gravitation and quantum theory together, albeit in dimensions higher than three.

of course, when you say "matter is simply a different resonance of tiny, vibrating strings", the story is a bit more complicated (of course, quantum field theory is complicated too). since time has run out on me, i refer you to Brian Greene's The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory. i am guessing maybe you are already reading this. anyway, greene is a principal contributor to superstring theories and his semi-popular book is one of the better physics books-for-everyone, in my opinion.

in particular, see the bottom of page 149 through page 151, starting with:

"This raises a crucial question directly related to the goal of reproducing rge particle properties in Tables 1.1 nd 1.2 [les: mass of the electron, quark, proton, guage bosons, etc]: if the "natural energy scale of string theory is some billion billion times that of a proton, how can it possibly account for the far-lighter particles -- electrons, quarks, photons, and so on -- making up the world around us?

[snip section on cancellation of quantum jitters in string with normal vibrations of string]

This tells us that the comparatively light fundamental particles of Tables 1.1 and 1.2 should arise, in a sense, from the fine mist above the roaring ocean of energetic strings. Even a particle as heavy as the top quark, with a mass about 189 times that of the proton, can arise from a vibrating string only if the string's enormous characteristic Planck-scale energy is canceled by the jitters of quantum uncertainty to better than one part in a hundred million billion. [snip] ... approximate calculations have conclusively shown that analogous energy cancellations certainly __can__ occur, but for reasons that will become increasingly clear in subsequent chapters, verifying the cancellations to such a high level of precision is generally beyond our theoretical ken at present."
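
a back-of-envelope check (my own arithmetic, not from greene's book) of that "billion billion times that of a proton": the Planck energy sqrt(h_bar * c^5 / G) versus the proton rest energy.

import math

h_bar = 1.0545e-34     # J*s
c = 2.9979e8           # m/s
G = 6.673e-11          # m^3 kg^-1 s^-2
m_p = 1.6726e-27       # proton mass, kg

E_planck = math.sqrt(h_bar * c**5 / G)   # ~2e9 J, i.e. ~1.2e19 GeV
E_proton = m_p * c**2                    # ~1.5e-10 J, i.e. ~0.94 GeV
print(E_planck / E_proton)               # ~1.3e19: a "billion billion" and change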

other physicists (some critical of string theory) have added that experimental verification of some aspect of superstring theory's predictions would be -- ahem -- welcome as well. so although superstring theory offers a promising theoretical route to unification, it's important not to jump the gun and assume that "The rift between geometry and particles has apparently been solved", with all the philosophical ramifications which would flow from that.


> The important thing is that the movements of the strings are
> self-consistent rather than force-consistent. Nothing outside of
> the string causes it to vibrate the way it does. Rather, its
> vibration arises from "within."

strings vibrate. that's one part. then strings interact with other strings; that's the other part. the precise way in which string paths merge in spacetime (like tubes joining in a Y-type connection) helps out with the divergence problem -- in some higher number of dimensions -- that formerly arose in interaction theories. so how non-point-particle things interact is an __essential__ geometrical/topological feature of superstring theory.


> Once self-consistency is posited, then the math describing the
> primary manifestation of the strings turns out to be *exactly* the
> same set of equations Einstein wrote out years ago for gravity.

you're jumping the gun here. one should be careful and note that superstring theory has no experimental tests yet. and also there are lots of smart people working on it, who hence write interesting books which get picked up in the press, and, well, ugh.....


> The mistake here is believing that laws of physics are absolute.
> Really a better word would be "habits" of physics.

are you a fan of rupert sheldrake?

now, running backwards in time, Lisa and Ian Murray said (simultaneously??):


> **Light can also be slowed down to less than three hundred miles an
> hour in experiments.

in experiments in different kinds of matter, which makes the deduced light speed an __effective__ velocity of light, not the speed of light in vacuum, which is a different and more fundamental kind of thing.
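
a minimal sketch of what "effective" means here: v = c / n, where n is the refractive index of the medium. the huge index below is a made-up illustrative number standing in for the extreme media used in the slow-light experiments, not data from any particular one.

c = 2.9979e8                       # speed of light in vacuum, m/s

def effective_speed(n):
    return c / n

print(effective_speed(1.33))       # water: ~2.25e8 m/s, still blazingly fast
print(effective_speed(2.0e7))      # hypothetical enormous effective index: ~15 m/s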


> Photons and electrons etc. emerged probabilistically out of a
> plasma. For a while in the very early universe there was no speed of
> light.

this is a muddle. i think you are saying that photons and electrons formed much "later" in the big bang, and that's where the muddle lies. photons and electrons are elementary particles in the standard model; i think you mean that protons and neutrons, and later atoms (nucleons plus electrons), condensed out of smaller constituent particles after the first moments.

actually, now that i think about your statement again, there are really two __different__ time milestones.

there was a later moment, so the theory goes, when photons effectively __decoupled__ from the particle soup and started flying around almost freely, with little interaction with matter. prior to this moment, matter was dense and hot enough that everything was banging into everything. this was, so goes the theory, at about the 100,000 year mark, when the average temperature was about 3000 degrees K -- which turns out to be cool enough for electrons to bind to protons into neutral atoms, leaving the photons free to stream.

there was an earlier milestone that your reference to "plasma" brings to mind, when things were hot and dense enough that there was a so-called quark-gluon plasma. this was the stuff prior to protons and neutrons, but the end of the quark-gluon plasma came much earlier than T plus 100,000 years.

be that as it may, all this does not preclude a (theoretically) well defined speed of light (photons) during the tightly coupled radiation phase, and also as far back as the quark-gluon plasma phase (gluons travel at the speed of light too). only that it would have been a hell of an experiment if you could have measured such a speed c over a finite distance back in those "days". in other words, we can experimentally measure the speed of light now because we can "catch" light propagating over finite distances and time the traversal. before decoupling, light was constantly absorbed and re-emitted by the dense particle soup, so a finite-distance traversal would have been hard to find. does that mean there was no well defined speed of light back then??? no, only that such speeds, if we are to carry this bizarre scenario a bit further, would have had to be detected from observations of electromagnetic interactions over short microscopic distances and interpreted using the laws of quantum mechanics.

By the way, this moment when photons decoupled from matter places a limit on how far back in time we can visualize early large-scale cosmological structure formation using the cosmic microwave background (CMB) radiation. The CMB results let us look backwards in time to a point when the radiation decoupling happened. (though i scanned through a paper recently purporting to extend the CMB observations back somewhat into the radiation dominated phase). so the clumpiness of the CMB should reflect spatial variations in matter density which existed at the time of decoupling.

You may have read that the CMB radiation is 3 degrees Kelvin __now__, whereas at the moment of radiation decoupling it must have been about 3000 degrees Kelvin if the particle soup was in thermal equilibrium with the radiation. if the CMB has been radiatively decoupled from matter since T plus 100,000 years, you might be tempted to ask: where did the "energy" go (3000 degrees down to 3 degrees)?

the crude answer is that the __expansion__ of the universe since T plus 100,000 years implies that the CMB radiation has shifted down in frequency from short-wavelength visible light to invisible long-wavelength microwave frequencies, a general relativistic result. for the aficionado: the radiation is not doppler shifted due to the recession rate of the early emitting matter; instead, the radiation wavelength scales with the radius (of curvature) of the universe. [see also Subject: I.17 at http://usenet.umr.edu/faqs/astronomy/faq/part9 ]

of course, 3000 deg K at T plus 100,000 years is the estimate one gets by analyzing the Hubble expansion in reverse, running time backwards, and watching the temp go up and up from 3 deg K as the universe condenses.
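
a minimal sketch of that scaling, using the round numbers from the discussion above: radiation temperature goes inversely with the cosmic scale factor, so the same factor that cools the radiation stretches its wavelengths.

T_decoupling = 3000.0      # K, roughly, at photon decoupling
T_now = 3.0                # K, roughly, the CMB today (measured ~2.73 K)
stretch = T_decoupling / T_now      # ~1000: factor by which wavelengths have stretched
redshift = stretch - 1              # the conventional redshift z of last scattering
print(stretch, redshift)            # ~1000, ~999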


> The unification of physics and computer science could quite possibly
> take us very far beyond Einstein's incredible ideas, with enormous
> economic and political consequences.

i don't know about the 'far beyond' part ... and i am not sure this is available to non-subscribers, but see the article by Seth Lloyd:

Nature 406, 1047 - 1054 (2000) © Macmillan Publishers Ltd.

Ultimate physical limits to computation

SETH LLOYD

d'Arbeloff Laboratory for Information Systems and Technology, MIT Department of Mechanical Engineering, Massachusetts Institute of Technology 3-160, Cambridge, Massachusetts 02139, USA (slloyd at mit.edu)

Computers are physical systems: the laws of physics dictate what they can and cannot do. In particular, the speed with which a physical device can process information is limited by its energy and the amount of information that it can process is limited by the number of degrees of freedom it possesses. Here I explore the physical limits of computation as determined by the speed of light c, the quantum scale h_bar and the gravitational constant G. As an example, I put quantitative bounds to the computational power of an 'ultimate laptop' with a mass of one kilogram confined to a volume of one litre.

Over the past half century, the amount of information that computers are capable of processing and the rate at which they process it has doubled every 18 months, a phenomenon known as Moore's law. A variety of technologies - most recently, integrated circuits - have enabled this exponential increase in information processing power. But there is no particular reason why Moore's law should continue to hold: it is a law of human ingenuity, not of nature. At some point, Moore's law will break down. The question is, when?

The answer to this question will be found by applying the laws of physics to the process of computation[1-85]. Extrapolation of current exponential improvements over two more decades would result in computers that process information at the scale of individual atoms. Although an Avogadro-scale computer that can act on 10^23 bits might seem implausible, prototype quantum computers that store and process information on individual atoms have already been demonstrated[64, 65, 76-80]. Existing quantum computers may be small and simple, and able to perform only a few hundred operations on fewer than ten quantum bits or 'qubits', but the fact that they work at all indicates that there is nothing in the laws of physics that forbids the construction of an Avogadro-scale computer.

The purpose of this article is to determine just what limits the laws of physics place on the power of computers. At first, this might seem a futile task: because we do not know the technologies by which computers 1,000, 100, or even 10 years in the future will be constructed, how can we determine the physical limits of those technologies? In fact, I will show that a great deal can be determined concerning the ultimate physical limits of computation simply from knowledge of the speed of light, c = 2.9979 x 10^8 m s^-1, Planck's reduced constant, h_bar = h/2 pi = 1.0545 x 10^-34 J s, and the gravitational constant, G = 6.673 x 10^-11 m^3 kg^-1 s^-2. Boltzmann's constant, k_B = 1.3805 x 10^-23 J K^-1, will also be crucial in translating between computational quantities such as memory space and operations per bit per second, and thermodynamic quantities such as entropy and temperature. In addition to reviewing previous work on how physics limits the speed and memory of computers, I present results - which are new except as noted - of the derivation of the ultimate speed limit to computation, of trade-offs between memory and speed, and of the analysis of the behaviour of computers at physical extremes of high temperatures and densities.

Before presenting methods for calculating these limits, it is important to note that there is no guarantee that these limits will ever be attained, no matter how ingenious computer designers become. Some extreme cases such as the black-hole computer described below are likely to prove extremely difficult or impossible to realize. Human ingenuity has proved great in the past, however, and before writing off physical limits as unattainable, we should realize that certain of these limits have already been attained within a circumscribed context in the construction of working quantum computers. The discussion below will note obstacles that must be sidestepped or overcome before various limits can be attained.

Energy limits speed of computation

To explore the physical limits of computation, let us calculate the ultimate computational capacity of a computer with a mass of 1 kg occupying a volume of 1 litre, which is roughly the size of a conventional laptop computer. Such a computer, operating at the limits of speed and memory space allowed by physics, will be called the 'ultimate laptop' (Fig. 1).

[snip]

Physical systems that can be programmed to perform arbitrary digital computations are called computationally universal. Although computational universality might at first seem to be a stringent demand on a physical system, a wide variety of physical systems - ranging from nearest-neighbour Ising models[52] to quantum electrodynamics[84] and conformal field theories (M. Freedman, unpublished results) - are known to be computationally universal[51-53, 55-65]. Indeed, computational universality seems to be the rule rather than the exception. Essentially any quantum system that admits controllable nonlinear interactions can be shown to be computationally universal[60, 61]. For example, the ordinary electrostatic interaction between two charged particles can be used to perform universal quantum logic operations between two quantum bits. A bit is registered by the presence or absence of a particle in a mode. The strength of the interaction between the particles, e^2/r, determines the amount of time t_flip = pi h_bar r / (2 e^2) it takes to perform a quantum logic operation such as a controlled-NOT on the two particles. The time it takes to perform such an operation divided by the amount of time it takes to send a signal at the speed of light between the bits, t_com = r/c, is a universal constant, t_flip/t_com = pi h_bar c / (2 e^2) = pi / (2 alpha), where alpha = e^2 / (h_bar c), approximately 1/137, is the fine structure constant. This example shows the degree to which the laws of physics and the limits to computation are entwined.

In addition to the theoretical evidence that most systems are computationally universal, the computer on which I am writing this article provides strong experimental evidence that whatever the correct underlying theory of physics is, it supports universal computation. Whether or not it is possible to make computation take place in the extreme regimes envisaged in this paper is an open question. The answer to this question lies in future technological development, which is difficult to predict. If, as seems highly unlikely, it is possible to extrapolate the exponential progress of Moore's law into the future, then it will take only 250 years to make up the 40 orders of magnitude in performance between current computers that perform 10^10 operations per second on 10^10 bits and our 1-kg ultimate laptop that performs 10^51 operations per second on 10^31 bits.

the rest at:

http://www.nature.com/cgi-taf/DynaPage.taf?file=/nature/journal/v406/n6799/full/4061047a0_fs.html

but i would say the thrust is __using__ einstein to characterise limits to computation, and then go after those limits.
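
as a tiny worked number behind the excerpt (my arithmetic, not the paper's): the quoted universal ratio pi/(2 alpha) is easy to check.

import math

alpha = 1.0 / 137.036          # fine structure constant (dimensionless)
ratio = math.pi / (2 * alpha)
print(ratio)                   # ~215: one logic operation takes ~215 light-crossing times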

les 'forwards-and-backwards-in-time' schaffer


