Matter & Memory

Scott Martens smartens at moncourrier.com
Mon Apr 3 11:19:30 PDT 2000



>You've helped me clarify the theory I'm presenting, or at least the
>presentation of it. Thanks. I appreciate your philosophical background,
>particularly your philosophy of science.

Thank you. I'm trying to clear up some of my ideas too as a prelude to going back to grad school (and clearing up my political ideas before going back to a country where I can vote).


>>I have profound disagreements with the Cartesian idea of minds
>>existing separately from bodies.
>
>Same here.
>
>>This is at the root of a lot of my
>>problems with both the hopes and aspirations of artificial
>>intelligence and the tenets of certain linguists and philosophers
>>(Roger Penrose comes to mind, Searle is another) who oppose the entire
>>programme of artificial intelligence and cognitive science.
>
>I really like Searle. What do you make of his "Chinese Room" argument?

I think that human cognition is a physical process, and if so, it must be expressible as an algorithm of some sort. (Penrose's arguments against that are highly doubtful.) If human cognition is an algorithm, I am forced to concede that Searle's Chinese Room can execute the algorithm.

So, I don't see a problem or a paradox.


>>I'm not entirely sure what a mind is, but I firmly believe that human
>>cognition - thinking, feeling, and planning - all take place within
>>the confines of the human body.
>
>Does the mind not exist? If it does not, then who's hallucinating it? If
>it does, then why not assume it has properties, namely mental properties
>like "thinking, feeling, and planning." (You evaded my argument.)

I suppose if I'm going to be completely honest, I'm not sure that mind exists apart from human cognition. If by "mind" you mean the activities of human cognition, then yes, minds exist as activities of the human nervous system. If not, then I have to say the concept is a little too nebulous for me to give it much credit.

The mind may be a kind of linguistic mirage. Humans recognise that other humans have cognitive processes like their own (so do some other animals), and we think of this as a property of human organisms. Thus, we say that people have minds. However, the notion may not have much scientific merit. Humans have lots of intuitive notions without much scientific merit. Race is the classic example.


>>A lot of people in the cognitive
>>sciences see the mind as some kind of program running on the brain's
>>hardware. I am inclined to disagree - I don't think the brain is
>>sufficiently abstract in structure to make that work.
>
>How could the brain be abstract in its structure? Isn't its structure
>material?

A personal computer is very abstract in structure. It is not designed to do any specific task; instead, it can execute general categories of algorithms. In this sense, I think humans are not like computers: we are not general purpose algorithmic machines.

Humans are animals. Animals may be a kind of machine, but they are sufficiently unlike the machines we are used to that analogies are suspect.


>>Humans evolved
>>to certain bodies and certain environments. Our mental functioning is
>>a part of our physical functioning, completely dependent on that
>>environment and evolved to meet the needs of survival within it.
>
>No. At 250 KYA (thousand years ago), when a core grammar was in place,
>consciousness was still exclusively a *social* phenomenon, as it had been
>since its origins in higher primate evolution. Our toolkit, which was fully
>established by 1.4 MYA, did not alter during the period that our vocal
>anatomy and group size began to reflect the presence of language.
>Consciousness and its offspring had not, in this period, been integrated
>with day-to-day functions of living, such as toolmaking and exploitation of
>nature. We don't see any appreciable changes in lifestyle (besides
>meaningless drift over the millennia) until about 100 KYA, when our
>ancestors demonstrated improved knowledge of animals, plants, seasons, and
>geography. Not until 60 KYA is there any sign of improvement in toolmaking,
>such as specializing weapons and scrapers for particular animals or using
>bone to make tools or giving them components, etc. The reason such things
>could finally come into being is that the specialized programs of
>problem-solving for social, natural, and technical tasks had always been
>separate. By circa 50 KYA all these domains were integrated into general,
>abstract intelligence. Language itself had become a domain (out of which
>mathematics later evolved), and was also integrated into general
>intelligence under the roof of consciousness. Homo sapiens-- defined by
>consciousness and language-- evolved into being according to the demands of
>social interaction, not bodily survival in particular environments. Mental
>functioning began from social, not survival, pressure. In other words, it's
>about ego survival, not body survival. But that's just how it originated.
>Mental functioning has continued to evolve.

I disagree with that history in several respects. Remember, it's very hard to deduce the origins of language and consciousness from the fossil record.

I don't think consciousness comes from the breakdown of a modular mind. I'm not convinced that minds are modular in the fashion described by Pinker and others at all. I've worked with genetic algorithms, and neat modular structures aren't what normally emerges.
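
For what it's worth, here is a minimal, generic sketch of the kind of genetic algorithm I have in mind (Python; the fitness function and all parameters are invented purely for illustration). Nothing in the selection pressure asks for modules, and the winning genome comes out as an opaque bit pattern rather than anything neatly decomposable.

# A minimal, generic genetic algorithm sketch. The fitness function and
# every parameter here are invented for illustration only.
import random

GENOME_LEN, POP_SIZE, GENERATIONS, MUTATION_RATE = 32, 60, 200, 0.02

def fitness(genome):
    # Toy scoring: reward 1-bits plus some arbitrary overlapping pairwise
    # constraints. Nothing here asks for, or rewards, modular structure.
    score = sum(genome)
    score += 5 * sum(genome[i] ^ genome[i + 1] for i in range(0, GENOME_LEN - 1, 2))
    return score

def mutate(genome):
    return [b ^ 1 if random.random() < MUTATION_RATE else b for b in genome]

def crossover(a, b):
    cut = random.randrange(1, GENOME_LEN)
    return a[:cut] + b[cut:]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
              for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    parents = population[:POP_SIZE // 2]                 # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP_SIZE - len(parents))]
    population = parents + children

best = max(population, key=fitness)
# The evolved genome is just a bit pattern shaped by selection; there are
# no module boundaries to read off it.
print("".join(map(str, best)), fitness(best))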

Language is not the source of thought, and mathematics doesn't come from it. Rather, language is a way of communicating that allows ideas to be transmitted from person to person, rather than each individual having to learn everything they can about the world unaided. It makes social evolution possible, granted, but linguistic structures are not the root of mathematical ones.


>>Is this hypothesis falsifiable? I would have to answer no.
>
>If you can prove that *any* mental functions exist in the brain, (cognition,
>will, memory, affect, behavioral habits, etc.) then the hypothesis I'm
>proposing is false. End of discussion.

Oh. Well, I can prove just that sort of thing in animals. I can certainly show that in humans brain damage can lead to memory loss. Stimulating certain areas of the brain electrically leads to people experiencing hallucinations, suddenly remembering strange things, or falling into particular moods. There is a well documented case of a person who was stimulated at a certain spot in the brain and suddenly found everything absolutely hilarious. Is that sufficient evidence?


>I am not proposing that mental traits are located external to the brain.
>Time is not external to space. If space is *ex*tended, then time is
>*in*tended. It's the brain that's external to the mind, for the mind has no
>space.

I still don't quite follow this. I know what space is, but I don't see how the mind is external to it.


>>Materialist cognition, however, has had a great deal of success in
>>increasing our knowledge.
>
>Assuming that the crude notion of "matter" is the whole of mentality has not
>helped us discover anything.

Well, yes it has. We can localise a sizeable part of image processing in the human brain. We know a great deal about how to modify moods and even behaviours, and a little about how humans plan motions and other activities, as well as how some kinds of chemicals can stimulate moods and behaviours.


>I'd say "90%" reduction in symptoms.

Fair enough. I'm certainly not a booster for Prozac. However, we can, using physical methods, effect changes in what is traditionally considered the mind.


>Back to our *analogy*: Change the tuner, get a new station-- yet station not
>in tuner. Therefore *in all cases* it does not follow from "change matter,
>change property" that property arises within matter. You must *prove* that
>mind is in brain.

Rather, I'm shifting the burden of proof to you. I see ample evidence that physical processes can account for everything humans think and do. I see no evidence of the existence of a separate mind. Why then should I consider your hypothesis credible?


>Mind is not measurable, no matter where you "locate" it (as if mind could
>possess the property of location). Your proposal that the mind is in the
>brain is not falsifiable. No matter how long we wait while neuroscientists
>try to "find" the mind, you can always say they're just about to set foot on
>abstract soil. Like the Second Coming, it's always right around the corner.

There is something of a truism in AI. Once you've figured out how something works, it doesn't seem that intelligent anymore. I expect science to uncover all the stuff that goes on in the brain, and find that there isn't anything left that we need a mind for.


>As Chomsky says, there is no problem, because there are no definable terms.
>It's not that we understand mind according to matter. It's that we
>understand neither.

However, we can understand the mind in terms of matter. There is progress, although research is by no means near an end. Chomsky sometimes seems to think the whole idea is hopeless, but I don't.


>>Acutally, we do know a bit about time. We know the arrow of entropy
>>only points one way, and that we can, within limits, trade time for
>>space. We also know some really freaky things about it through
>>quantum mechanics.
>>
>Do tell!

Hawking radiation (if it exists, though I think most physicists believe it does) requires mass to go backwards in time for brief periods, due to the uncertainty in its position in both space and time; however, it can't carry any information through that transition.

That's weird.
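
To put a rough number on "brief": the usual heuristic invoked for these short-lived fluctuations is the energy-time uncertainty relation (a back-of-the-envelope statement, not a derivation):

    \Delta E \cdot \Delta t \gtrsim \hbar / 2

i.e. an energy fluctuation of size \Delta E can only persist for a time of order \hbar / (2 \Delta E), which is the usual hand-waving reason given for why no usable information survives the transition.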


>>Well, if several different people can measure the time something
>>takes, and get the same results, I'm inclined to think that the
>>passage of time is a real thing.
>>
>What are they measuring? Are they measuring how long a second is? A
>minute? An hour? Or do they just measure how many seconds or minutes or
>hours are taken by a sequence of events to unfold? Time is subjective. If
>you think it's not, then tell me how long a second is. Try to impart this
>to me. Give up yet?

A second is the time it takes light to travel ~3*10^8 metres (299,792,458 metres, to be exact). Alternatively, a second is 9,192,631,770 oscillations of the radiation from a cesium-133 hyperfine transition. Neither of those measures is in any way subjective. They do depend on one's physical frame of reference, but that isn't subjective either.
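
A quick sanity check on those two figures, using the exact SI constants (just a sketch in Python, nothing more):

# Check of the two figures above, using the exact SI constants.
C = 299_792_458            # speed of light in vacuum, metres per second (exact)
CS133_HZ = 9_192_631_770   # cesium-133 hyperfine transition frequency, Hz (exact)

print(f"light covers {C:.3e} m in one second (~3*10^8 m)")
print(f"one second = {CS133_HZ:,} oscillations of the Cs-133 hyperfine radiation")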


>>Take a look at a barometer.
>>What does it measure?


>I don't know. What does it measure? (Barometric pressure?) Whatever it
>is, it's not time.

No, but you are suggesting that time doesn't exist and that any device that measures it is really just measuring space. A barometer doesn't measure space; it measures pressure. But it is constructed in the same manner as a clock.


>>I am able to impose infomation on a floppy disk. How does this
>>process differ from matter having memory?
>>
>Memory is not a property of matter.

No, perhaps not, but matter can represent information. How does the persistent representation in matter of otherwise ephemeral information differ from the notion of memory?
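
A trivial sketch of what I mean, in Python (the filename and message are made up for the example): the bytes written here persist as physical states on the disk until something reads them back.

# Information imposed on a physical medium persists there until read back.
# Filename and message are invented for the example.
message = "an otherwise ephemeral thought".encode("utf-8")

with open("memory.bin", "wb") as f:
    f.write(message)        # impose the information on the medium

with open("memory.bin", "rb") as f:
    recovered = f.read()    # recover it later, unchanged

assert recovered == message
print(recovered.decode("utf-8"))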


>>Actually, I don't think people think that way computers do. However,
>>the more important question is why assume a magical process of
>>thinking when an algorithmic one will do?
>>
>Why assume an abstract process instead of a real one? Just because the mind
>contains imagination doesn't mean it's imaginary. Just because abstraction
>is a product of the intellect doesn't mean the intellect itself is abstract.
>It is not abstract. It is actual mental form. Just like actual material
>form, only stretched out in time instead of space.

This doesn't make much sense to me. I am assuming real processes.


>Exactly. It doesn't make sense to conjoin matter and ideal. Newton's
>synthesis, which brought us the mechanistic paradigm, is not coherent and has
>always been wrong.

Huh? How was Newton's description of the universe inconsistent? It is wrong in some respects, but it is certainly coherent.


>The paradigm shift means that scientists change their method of
>investigation. It used to be strictly a search for the mechanisms
>underlying things. Ironically, the unraveling of mechanism began with
>Newton himself. He dismissed the aether, through which contact mechanics occurs
>across space, and declared that gravity works at a distance. Mechanism took
>another major hit in the early 20th century with quantum "mechanics" and
>later with cosmology, which proposes an organic rather than mechanical
>origin and development (evolution) of the universe. Next came chaos theory.
>Most recently, we have Rupert Sheldrake's natural memory. The pressure for
>a paradigm-quake has been building up for three hundred years, so it ought
>to be a good one. (Run for your lives.)

No, science does not change its methods even during revolutionary changes in theory. Science at all levels still searches for the mechanisms underlying things.


>>The first part I agree with, after a fashion. I don't think people
>>are big meat robots being driven by software running in their brains.
>
>Are you sure your viewpoint doesn't imply that by necessity? What's your
>alternative?

Humans are not the stimulus-response machines Skinner is credited with thinking them to be; however, they are also not autonomous software running in the brain, as Pinker and Hofstadter think they are.

It may be possible to create an algorithm capable of fully human cognition, but if we can do such a thing, we will not be shifting human minds to "another substrate" as Hofstadter believes. Rather, we would be emulating a human in a computer. We would have to provide it with stimuli and environments much like the ones we are used to now; otherwise the simulation would be incomplete and the result would not be intelligent in the way humans are at all.

I am sceptical of the idea that we can even define human cognition and human intelligence outside of the limits of human physiology and environment. Unless we encounter aliens, we have no model of intelligence other than human and no intelligence outside of human experience.

Since these are all physical processes, they probably are modelable by a sufficiently powerful computer, but that would involve simulating human bodies and environments, not disembodied minds.


>Physics has no bearing on time. It has no bearing on mental forms, i.e.
>self-existent mental properties. And it has no bearing on abstractions,
>such as "physics," which exist only in the mind of the beholder.
>Furthermore, since bodies don't have to be reduced to physical phenomena,
>you're still reducing mind to body.

Physics has plenty to do with time. If human cognition is a physical process, physics also has bearing on human cognition. I still haven't seen any reason to think human consciousness is not a purely physical process.


>>I have never heard
>>of anyone recovering a memory from the brain of a dead man.
>
>Has anyone recovered a memory from a live one? I'd certainly like to know.

Yup. I did just now. I remember going to a seminar at Stanford over the weekend on moving human minds to computers and thinking it was a bit of a joke. I'm alive, and I just recovered a memory from my brain. If I had no brain, or if I were dead, I would be hard put to accomplish this feat.


>You're assuming that the brain does those things. That's faith, not
>science.

No. If I damage the forebrain, human brains stop doing that sort of thing. It isn't faith to think that visual pattern recognition takes place in the brain. There is a visible, measurable train of consequences from the sight of something to its complete assimilation.


>Are you saying physical phenomena *can* be ideal or atemporal?

No.

Scott Martens



