[lbo-talk] re: AI

andie nachgeborenen andie_nachgeborenen at yahoo.com
Wed Nov 19 14:47:18 PST 2003



>
> On Wednesday, November 19, 2003, at 04:57 PM, andie
> nachgeborenen wrote:
>
> >
> > By the way, I don't think that thinking makes for
> > rights. It's sensing and, specifically, suffering
> --
> > not just physical suffering.
>
> very peter singer. ;-)

More Rorty, in my case. I'm not a utilitarian.


> > I think it tells us that we are
> > fully part of the natural world, just a moderately
> > intelligent bit of it. That is a view that people
> have
> > been resisting tooth and claw since the dawn of
> the
> > modern age.
>
> two words: donna haraway. it's all always about how
> super-duper-special
> human beings are. we're just so *different* that
> we're *better*!
> neither ("mere") machine, nor ("mere") animal, nor
> god, we are Human.

It's curious, because there has been this ambivalence ever since Descartes. In the Middle Ages, the Great Chain of Being told you your place in the world: below the angels, above the animals. The debates about dualism, materialism, and mechanism, the nature of consciousness, etc., that exercise modern philosophers would be literally incomprehensible to Aristotle or Aquinas. No doubt Joanna is right that part of the story is that if we are relevantly like machines in the way we think, then we are machine-like in other ways too: usable, disposable, etc. Which of course under capitalism or Stalinism we are.

But the analogy is defective if it isn't the way we think that gives us moral standing, but the fact that we feel. It may be that machines could come to feel too -- so far I don't think they have. But they might, and if so, we should treat them with equal concern and respect. The point of this is not a science fiction exercise, but a humanistic one. What we should strive to avoid and to abolish is pointless suffering, humiliation, and cruelty, whether or not the beings that experience it are smart.


>
> we need to get over ourselves as a species. people
> who would have no
> truck with such talk when it comes to nations or
> classes or races
> traffic in it regularly when it comes to species . .
> . and life, for
> that matter. we base our arguments on some
> quasi-scientific notions of
> our uniqueness, and then what happens to our
> arguments if/when it turns
> out we aren't so unique, after all. the obvious
> analogy here is the
> copernican revolution . . . we were special because
> the universe
> revolved around us, but then, oops! it doesn't! so,
> uh, now what is it?
> and so on . . .

Or the Darwinian revolution. Eeek! We're not specially created. And I'm just a monkey's uncle. Or he's my uncle, I forget. Lots of people still find it hard to wrap their heads around that one.


>
> in philip k dick's _do androids dream of electric
> sheep_, the defining
> element of human-ness/humanity, as opposed to the
> androids, is taken to
> be the capacity for empathy rather than
> "consciousness" (not unlike the
> tin man wanting a heart). scott's film is true to
> the novel even while
> it turns it inside-out by making it pretty plain
> that the android
> "villains" actually feel.

That is why I made Blade Runner the Uncle Tom's Cabin of the Robot Rights movement . . . .

> the point is not that they
> do or even that
> they will, but for us to wonder how we would think
> of ourselves *if
> they did*.

And indeed, what we think of those among us who are not so smart, but who may feel, right now. Both human and animal. Now I do sound like Singer. But before Marta, et al., go off on a rant, this is a strong pro-disabled-rights position.


>
> all this business about turing tests and mere
> machines is, imo, a way
> of avoiding this fundamental question.

Well, if something passes the Turing test, it can feel, right? Not only if, of course.

jks



