On Tue, 4 Apr 2000, Curtiss Leung quotes Searle as saying:
> If we think of ourselves as living in [a world] which contains mental
> things in the sense in which it contains liquid things and solid
> things then there are no metaphysical obstacles to a causal account of
> such things. My beliefs and desires, my thirsts and visual
> experiences, are real causal features of my brain, as much as the
> solidity of the table I work at and the liquidity of the water I drink
> are causal features of tables and water. (Searle, _Intentionality_)
and then asks:
> Now, why he holds this view but then goes out of his way to attack
> everyone else's effort to advance a brain-based account of mind . . .
> beats me.
Actually he doesn't, as far as I know -- he's with you. Rather, Searle attacks AI because he thinks of it as exactly the opposite of a brain-based account -- he thinks of it as a "brains are irrelevant" account. He thinks the fundamental presupposition of AI is
1. Mind : Brain :: Software : Hardware
2. Therefore, brains are irrelevant, because any hardware in which you
   could instantiate the software is equivalent.

And naturally, his view is the opposite: minds are features of brains and only brains. (Although frankly, if he wants to cover all mental states, I think he should say minds are features of bodies. A pounding heart and a shiver of delight need more than a brain. Not to mention our spatial-temporal orientation and modes of visualization, which are kind of important for how we think too.)
__________________________________________________________________________ Michael Pollak................New York City..............mpollak at panix.com