Chomsky -- Put up or blah blah

Scott Martens smartens at moncourrier.com
Tue Mar 28 13:59:25 PST 2000



> Gravity is also counterintuitive --- Newton called it "occult" ---
> but, so what?


> To say that is not especially useful is vague at best. Not useful for
> what? For writing better rebuttals to someone's vague arguments?
> Perhaps not. Better for understanding how language develops and
> operates? I beg to differ...

Okay. Name one phenomenon explained by GG that can't be explained by dependency syntax, word grammar, or any other significant linguistic theory. What insight has GG provided on the subject of language acquisition, language education, or linguistic impairments? Has any outcome of GG been comparable to the success of Tesniere's structural syntax or the Russian traditions of dependency grammar in explaining linguistic phenomena? Name one study or experimental result, obtained after 1959, that lends support to the existence of grammatical functions mechanically separate from semantic or morphological functions.

Gravity may be counterintuitive. It is, however, falsifiable. Chomsky's generative grammar is either false or unfalsifiable, since no result has ever shown it to be superior to competing hypotheses.


> I see. You have discovered a way to measure linguistic distance.
> Please elucidate.


> Aside from this problem, it has been shown to apply to a variety of
> languages, from Spanish, French, German, Mandarin, BVE, and a host of
> others.

It is mathematically possible to write a generative grammar for any language, if you include enough rules, and fudge your definition of a language enough.

Use generative grammar to explain the following phenomena:

Lexical functions: In English, you "take a class" when you study in school. In French, you "follow a class" (suivre un cours). The definitions of "follow" and "suivre" are otherwise largely identical. Explain how generative grammar can be used to describe this phenomenon without recourse to lexical rules that take into account the meaning of words. Is this phenomenon explainable using any variant of X-bar theory, and if so, how, especially considering that verb choice is governed by elements lower on the grammar tree (or whatever it's being called this year)?

Another: "A rose is a rose is a rose." This sentence is unparsable using generative grammar. Yet, it forms a part of English usage. How can a generative theory account for this data?
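To make the point concrete, here is a toy sketch - an invented mini-grammar and recognizer, not any published analysis. A minimal generative grammar rejects the Stein sentence, and a single ad-hoc production "fixes" it, which is exactly the rule-fudging objection: with enough patch rules, any string can be made to parse.

```python
# Toy illustration only: invented mini-grammar and lexicon.
GRAMMAR = {
    "S":  [("NP", "VP")],   # sentence = noun phrase + verb phrase
    "NP": [("Det", "N")],   # noun phrase = determiner + noun
    "VP": [("V", "NP")],    # verb phrase = verb + noun phrase
}
LEXICON = {"a": "Det", "rose": "N", "is": "V"}

def derives(sym, words):
    """True if nonterminal `sym` derives the word sequence."""
    if len(words) == 1:
        return LEXICON.get(words[0]) == sym
    # Try every binary rule for `sym` and every split point.
    return any(derives(left, words[:k]) and derives(right, words[k:])
               for left, right in GRAMMAR.get(sym, ())
               for k in range(1, len(words)))

def parses(sentence):
    return derives("S", sentence.split())

print(parses("a rose is a rose"))            # True
print(parses("a rose is a rose is a rose"))  # False: no rule covers it
GRAMMAR["S"].append(("S", "VP"))             # the ad-hoc patch rule
print(parses("a rose is a rose is a rose"))  # True, after the fudge
```

The patched grammar accepts the sentence, but only because a rule was added for no reason other than to accept it - which is the point.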

One more: The transformation of "I am going to the party" to "Aren't I going to the party?" How does GG - in any of its forms - explain the change in verb? GG ought to explain it, since most speakers will refuse to accept "Amn't I going to the party?" and thus it is a part of linguistic competence by Chomsky's definition. (I'll give credit to Ivan Sag, et al, who do account for this in GPSG with a special rule, but the whole point of Chomskyan linguistics is that you shouldn't need a special rule.)

Latin or Russian, both of which enjoy relatively free word order, can be explained using a generic "move" transformation to restructure them into something more like the positional grammar of English (or French or Chinese), but isn't case grammar or dependency syntax much neater in the way it uses morphology to extract semantic dependencies in sentences? What privileges positional grammars (and the neat generative trees you can use to build them) as more universal than relations of case?
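The case-grammar point can be sketched in a few lines - an invented mini-lexicon, not a real morphological analyzer: the case endings alone fix the grammatical relations, so word order never enters into it.

```python
# Toy sketch: Latin case endings mapped to grammatical relations.
# Lexicon is invented for illustration.
CASES = {
    "puella":  ("girl", "nom"),   # nominative
    "puellam": ("girl", "acc"),   # accusative
    "canis":   ("dog",  "nom"),
    "canem":   ("dog",  "acc"),
    "videt":   ("sees", "verb"),
}
ROLE = {"nom": "subject", "acc": "object", "verb": "verb"}

def relations(sentence):
    """Map each word to its grammatical relation by morphology alone."""
    return {ROLE[case]: lemma
            for lemma, case in (CASES[w] for w in sentence.split())}

print(relations("puella canem videt"))
# {'subject': 'girl', 'object': 'dog', 'verb': 'sees'}
print(relations("canem videt puella"))
# equal as a mapping, whatever the word order
```

No tree, no movement: permuting the words changes nothing, because the dependencies are read straight off the morphology.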


> This I find hard to swallow. From what I have read, there is plenty
> of evidence, some of it adduced by Chomsky in his various publications
> on the topic, some of it elsewhere.

Like what? On the contrary, studies of how long it takes speakers to parse sentences with multiple transformations show that the structures which GG says ought to take longer to interpret do not.


> They seem to have carried the understanding of the functioning of
> human language quite a distance from where it stood 40 years ago.

For example? I give Chomsky credit for innateness. However, the only solid work on universals, even the stuff coming from generativists, produces results that have nothing to do with generative grammar. Berlin & Kay's work on colour universals, for instance, is completely independent of generative claims.


> This is nonsense. Please provide us a quote from Chomsky where he
> seriously "encourages linguists to treat introspection as a form of
> validation for their hypotheses". Chomsky has written repeatedly that
> linguistic structures are *beyond* introspection and must be verified
> by other means.

Cartesian Linguistics, 1966. (Afraid I haven't a copy in the office, so an exact quote isn't possible right now.)

Perhaps in more recent years he has pulled back from that position - I am trying to read The Minimalist Program (awful, awful writing) - but he certainly hasn't put forward any sort of empirical method for his linguistics. At each turn, we are asked to put our faith in abstractions he considers intuitively obvious, but which are impossible, or perhaps merely improbable, to falsify. For example, the distinction between competence and performance: how can you verify that competence exists as a real entity when speakers provide a variety of responses when asked about the correctness of a sentence? For Chomsky, this has to be accepted or rejected on the basis of introspection - a truly Cartesian method - rather than on the basis of falsification or predictive power.

Another problem with Chomsky's programme, particularly minimalism, is that it demands that grammars be constructed on the basis of the simplest, most intuitive formalism. Yet there is no evidence that the mind (or any other natural phenomenon) uses the simplest or most intuitive means of doing anything. This is formalism for its own sake: neat on paper, but unable to stand up to the test of falsification.


> This again, from what I have read, is false. He does not treat
> "grammar as simply an arbitrary way of ordering words without any
> reference to what those words mean." Perhaps his "dry, uninspiring"
> prose style has forced you to drop the book before you finished it...

Chomsky's position, from the very beginning, has been that syntax can be described without reference to semantics. He makes it quite explicit that surface grammatical relations do not reflect meaning; only his restructured deep grammar can do that, if even then. He makes this clear in his analysis of passive verbs, wh- movement, and elsewhere.

Some more recent Chomskyans have retreated from this position, such as Denis Bouchard, who argues for a syntax that takes into account only the minimum necessary semantic information - but again, this relies on judging what sort of information is necessary on the basis of what produces the best formalism.

Scott Martens




More information about the lbo-talk mailing list