Carl and Paul Henry on Ma(t)r(i)x

Jonathan Sterne j-stern1 at uiuc.edu
Sat Apr 24 07:59:12 PDT 1999



>Surely you're familiar with leftist academic status compensation.

Oh boy am I ever.


>You
>may, of course, argue this is not such an instance, but it sure smells
>like it from here in Californ-I-A.

Nope, just me being pissy. As for status, if I *had* such anxieties, I promise you I wouldn't be working them out here or on you or Carl. I'd have to go piss on other academics, wouldn't I?


>> >But that's NOT what Carl actually said, it's a considerable
>> >embellishment, courtesy of you. Carl simply wrote:
>> >
>> >>The art of special effects has advanced at the expense of
>> >>characterization and plot development. Movies have contributed
>> >>vastly to the domination of imagery over the written word and
>> >>to the stupefaction of people in general.
>> >
>> >I would agree with this as an almost trivially obvious statement.
>>
>> Here's where we disagree. I would like to see this statement
>> actually argued for, demonstrated, substantiated, rather than
>> hearing about its obviousness. As Stuart Hall says, "what's
>> most obvious is what's most ideological." Other writers on
>> ideology make a similar point.
>
>It's also trivially obvious that California is hotter in the summer than
>it is in winter. Is that ideological, too?

Sure, but movies and the weather have some important differences.


>Depending on the degree of rigor you require, this could well be a
>lifetime project. Sorry, but I decline. I'm busy. I'm trying to find
>ways to get thru the stupefaction you assure me doesn't exist. I note
>that this is a qualitatively DIFFERENT kind of struggle than back in the
>Twilight Zone days.

Sure, I'll go with that, but I don't think it's because of the change in the narrative style of televised and filmed science fiction.


>> Carl's claim and your reassertion are not obvious to me.
>
>Not obvious what we're saying or not obviously true?

Thanks for asking. Not obviously true.


>> His claim about the web being the revenge of the literate
>> (revenge against whom?) is both factually incorrect and
>> rather curious given his general suspicion of images.
>
>I think it's a case of naively appealing optimism. It reminds me of
>1992 or 3. I wish I could share his simple faith, I really do.

Me too.


>> But then, maybe *I've* been stupefied by watching the Matrix!
>> I should have thought of that sooner. This is your big chance
>> to get me back on track.
>
>But I didn't think "The Matrix" was stupefying. Didn'tcha read my post
>comparing it to Philip K. Dick? I'm not down on special effects per
>se.  It's just that almost all moviemakers can't hold their liquor.

Yes, though I'd forgotten it was you. Even if you liked The Matrix, I think we're still having the same disagreement. I don't think I'd use the word stupefying to describe a film.


>> >It's a LONG way away from claiming "that any film with
>> >narrative and character development makes people smart."
>>
>> No, but it does claim that movies contribute to making people stupid.
>> That's straight outta Carl's quote.
>
>To the extent that special effects have pushed everything else to the
>side.

I still disagree. If you want to object to special effects because they're an incredible waste of money, I'd be down with that. But arguing that a particular film or group of films is stupefying because they don't justify their use of special effects with narrative and character development is a different matter. I can only imagine this being reasoned out through a really simplistic theory of media effects, which is why -- rather than assuming that little bit -- I've spent this and the last message trying to get you to explain how a group of films can make people, well, less intelligent.


>Note that I have explicitly identified narrative intelligence as
>just one form -- albeit a crucial one for the purposes of sustained
>critical thought.

I don't think the kinds of narrative skills one develops from watching movies are the same as the kinds of critical skills one develops by thinking, talking, and in some cases reading to educate oneself about the power differences in our society and their causes.


>> I think he's wrong, and I'd like to see that assertion backed
>> up with something other than reassertion.
>
>And I’d just LOVE to be proven wrong. I’m always rooting for the
>triumph of intelligence.

I'd love to prove you wrong, but you've got to give me an actual argument to argue with. To use your metaphor, it's as if you told me the sky was red, insisted that this was obvious to you, and asked me to prove to you that it wasn't, without any further explanation or justification. I need some reasoning, or I can't do more than repeat my request that you explain your claim. I've got a few more pointed questions below.


>> So you can tell by watching the X-Files that people have less
>> "intelligence" (your word, not mine) in whatever form than those people who
>> watch the Twilight Zone? What about people who watch both?
>
>You are recasting this into a personalistic framework. Carl and I were
>both addressing mass-audience effects.

What is a mass audience effect?


>What are these shows trying to
>do? What assumptions do they make about their audiences?

What do those two things have to do with the actual audiences (and by extension, intelligence)?


> These are the
>kinds of questions we’d ask in coming to the conclusions we reach.
>Twilight Zone assumes *at least* a kind of latent critical intelligence
>on the part of its viewers that is witlessly mocked by the X-Files.

Here's the problem. Intelligence, as you use it here, is a highly individualistic (and dare I say naturalistic) term. Yet you claim to be describing mass audience effects, which means you have some implicit theory of how the mass communication process simmers down into the individual head, but as yet you haven't shared it with me.


>Oh, shoot, check out Janet Murray’s _Hamlet on the Holodeck: The Future
>of Narrative in Cyberspace_ for starters. She has plenty to say about
>this process as it's unfolding today, and draws parallels to the
>evolution of the book as well as movies. Got references, too.

I remember not being really taken with that book on a quick look. But I'll look again. And yes, she's of course a respected person, whether or not I agree with her. Have you tried the dated but still excellent _Resisting the Virtual Life_? Verging on technophobia in a couple places, but some really excellent stuff.


>I’m of the same mind. That’s why I put them in quotes. I've argued
>several times onlist against folks who say that all thought is
>language. I'd do the same with those who say all communication is
>language. But that doesn't mean language isn't a useful cognitive
>metaphor for describing either thought or communication. It clearly
>*IS* a useful cognitive metaphor for both, but NOT a literal truth for
>either.

Mmmm. I see. But from where I'm sitting, people tend to take those metaphors literally at some point, and very quickly come to think that they can apply the techniques of linguistic analysis to things that are only (loosely) analogically like language.


>As I explained above, it’s NOT an individualist argument Carl and I are
>making here. We're talking about historical processes, too, not
>individual works of art in splendid isolation.

Then explain the historical process, because at some point you move from the mass audience (a concept you just introduced) to the notion of intelligence, which seems to apply directly to individuals. I'm sorry, but I just don't understand how this can work.


>Look, you’re making some intelligent points in this post, but any
>argument is going to contain assertions in it. Jumping in like you do
>right here is what gave me a bad impression of your argument in the
>first place. If you want to object, then object to the whole statement,
>not to a common linguistic form which will be found almost anywhere you
>care to look.

Well, my problem is that these assertions come at the key moments in your argument, the very points where we disagree, and I can't argue with you if you just say "it's so".

<late quartets example snipped>


>I mean that art has rules, that it MUST have rules. The freest
>art--free jazz, Duchamp, John Cage, what have you--comes not from having
>no rules, but from having the deepest understanding of the rules. It
>need not be conscious, of course. The understanding can be deep in the
>bones. But these rules have to be grasped by the audience as well, or
>else they won’t be able to comprehend the work. (They might still get
>something out of it, they might like something relatively superficial
>about it, enjoy some facet or another, and there’s nothing wrong with
>that, but that’s because they understand only part of its necessity.)

Ah, but then we have to talk about what the rules of film really are. And even once we establish those rules, we still need some kind of link between them and their comprehension by an audience. I would say the Beethoven example you used, while fascinating, isn't necessarily a fair analogy, or a fair example of the universal condition of creative expression. Consider that 19th century art music (or whatever we like to call it) not only was extremely rule-conscious, but had a definite telos (around harmonic tension and resolution, e.g.) that composers and listeners consented to and sort of rode out over the course of the period. That broke early in the 20th century, and has caused people who think about so-called "classical" music no end of consternation, because they can't tell you where it's going, and they can't -- as a group -- even really agree upon a coherent telos when looking back on the 20th century and reconstructing it.

Is the history of classical music in the 19th century a good model for the history of film in the 20th? I'd say no. To start with, there's no sonata form.

--J



More information about the lbo-talk mailing list