> Or is your worry more the sort of thing that John
> Searle is getting at in his examples about the
> "Chinese Room," that no matter how complicated and
> intricate a system of symbolic operations you create
> -- Searle's is a big room that crunches Chinese
> ideograms, put in some, out come some that are
> appropriate -- it passes the Turing Test -- Searle says
> it won't be _conscious_, it won't be "there," it will
> lack that glow, whatever that is. Here there's no
> answer, there are only intuitions. Mine is that the
> Chinese room can think. Searle's, not.
Searle's analogy is ridiculous - akin to saying that we're not intelligent because our neurons don't know what they're doing. Whatever else intelligence is, it's definitely holistic, which is why the reductionist arguments are so ludicrous (and apply to humans as much as they would to a computer).