Different Strokes for Different Folks (Contra Snobbery, continued)

Art is such a hard thing to qualify. And yet we do it all the time.

My earliest preferences were colored by my opinionated mother, who never waffled on anything, and always had a reason for her preferences. This place is better than that place because such and so happened at that place; this TV show is better than that one because its sex jokes are implied rather than explicit; this piece of music is intolerable because it's performed by hippies; and so it went. She could back up every one of her clearly defined judgments with some bit of information, some personal experience that rendered the conclusion obvious to anyone with a brain. As a child and adolescent, I clung to this binary view of all things.

But then along came college and professors whose opinions mattered to me even more than my mother's did. They clearly knew what they were talking about, and mostly what they wanted was for me to keep an open mind about music I initially disliked intensely. This was particularly true in band, where Martin Behnke regularly indulged his taste in new music. We played pieces that were neo-classical, serial, and aleatoric, that drew on eclectic sources, and that frequently had us digging into bizarre harmonies and complicated polyrhythms. For the most part, we were good sports about it, trusting his authoritative opinion. But there was one day he had to lecture us about making room in our hearts for a piece of music that just wasn't clicking with us. It was called "Decorations," a multi-movement piece with chance elements, which called for us to make strange sounds with our instruments, to pick random notes, and for its final movement, to sing a bit of doggerel that included the words, "Decorate your face with a smile, friend...and just decorate yourself with love, love, love, and get ready to celebrate!" Apart from the Hallmark lyrics, the sound of this piece, especially the "pick a note, any note" harmony, was grating to our tender ears, raised as they were on the easy harmonies of '70s rock and disco. We had taken to referring to the piece as "Defecations," and whenever Dr. Behnke announced it was time to work on it, there were audible groans.

So we got a lecture. He told us how he, personally, enjoyed music that was challenging to the ear, that tried new things, surprised the listener, and that expanded the sound repertoire of an ensemble. He told us that, from his experienced and well-studied perspective, "Decorations" was, in fact, very well-written, that here was a composer so gifted at orchestration, so knowledgeable about the aural possibilities of every instrument in the band, that he had created a work of spectacular originality. Unfortunately, since we weren't approaching it with an open mind, it did, indeed, sound like crap. He told us he was always open to suggestions, but that his was the final word in programming for this band, and that we were, indeed, going to give this piece the time and care it deserved, perform it as well as anything else in our repertoire, and only then would we be in the position to render judgment on it.

Ashamed at this revelation of our own closed-mindedness (what college student wants to be confronted with that?), we buckled down and gave it the old college try. And by the time we performed it, we had warmed to it. It'll never be one of my favorites--I still can't get over that part where we had to sing trite lyrics, even though their triteness might have, itself, been understood by the composer, and used for ironic effect--but there are parts of it that haunt me to this day. Most importantly, I've never forgotten the lesson I learned from it: I do not have the right to judge another person's taste, and I really don't have the right to judge any work of art until I've spent some time with it.

Here's another example of what I'm getting at in this essay. One of the first rules a student learns in music theory is to avoid parallel fourths, fifths, and octaves when harmonizing a melody. Breaking this rule causes the ear to perk up and take notice of the harmony, rather than the melody, calling attention to something that just doesn't feel right in the context of the piece. Ever since, I have diligently held to this rule whenever arranging music for multiple voices or instruments.

And then, two years ago, as I was studying for my Level III Orff certification, I learned that I had been coming at this rule from the wrong perspective, that it was descriptive, rather than prescriptive. As my teacher at the San Francisco Orff Course told me, it's not that parallel fifths sound bad. Not at all. It's that they're powerful, something rock guitarists discovered in the 1960s when they invented power chords: leaving out the third, using a single barred fingering, and moving up and down the neck of the guitar, driving the music with those parallel fifths.

This revelation blew me out of three decades of judging parallelism, because it made perfect sense. I still don't appreciate parallel fifths in the harmonization of hymns or folk melodies--such music is far too delicate for such a powerful device--but I have come to respect and enjoy their use in rock.

Seminarians are a lot like music majors: they've learned there are right ways to do things, particularly when it comes to worship, and they frown upon those who choose to violate these rules. They see liturgical theology as prescriptive, rather than descriptive. Though considering how little time they actually spend studying liturgy, it's startling how judgmental they can be.

My seminary offered one course on liturgy. Just one. And it was an elective. That means that many graduates of the Perkins School of Theology (SMU) left without ever learning the history and theory behind the one hour a week that put them in touch with the most people, and for which they were most likely to be judged by their congregations.

At least, that's how I felt for many years, until I realized that there wasn't much point in teaching The Right Way to do things. Because, at least in Methodism, The Right Way is whatever way works best given the received traditions of whichever church a pastor is appointed to.

Oh, I knew a lot of rules: where to put the sermon, what order to present the readings from the lectionary, when to use the Doxology, when to use the Gloria, when not to use each, when not to sing Alleluias (Lent), when not to sing carols (Advent), the proper form for a collect, for a benediction, waiting to break the bread until after the Great Thanksgiving, et cetera. There were reasons for all these rules, reasons that made ample sense to me at the time. I came out of that one-semester course with more dangerous knowledge than I gathered in any other class I've ever taken.

I say dangerous because, being young and not really understanding how liturgy actually develops, I took it all to be prescriptive, and as soon as I had a church of my own, I started prescribing. Worse than that, whenever I attended a service at a church pastored by one of my classmates or, more likely, by an older pastor, I quickly rushed to judgment about all the things that were wrong in the service, without stopping to think there might be good reasons for all those "violations." (And in my mind, "We've always done it that way" was never a good reason.)

That actually worked well in most of my congregations. Methodist churches are used to seeing new pastors with new ways of doing things every few years, and are usually quite adaptable.

But then came the exception: the Lents United Methodist Church.

Lents was a dying congregation, a church that should have disbanded before I even got to it, but that, under my predecessor, a retired minister sent in to close the place down, had grown into a sort of community center, with a large influx of AA members. I tackled worship at Lents just as I had in my previous three churches, researching old worship bulletins to get an idea of how things had been done (a lesson I learned from my father), but then gently coaxing them in what I believed to be the right direction. People were fine with almost every change I made, except one: they didn't care for intinction.

Intinction is a Communion practice preferred by Protestants who want to have a common cup in worship, but fill it with grape juice rather than wine. Each communicant receives a small piece of bread, preferably torn off a common loaf by the presider, dips it in the grape juice, then eats it. I was not a huge fan of intinction--one of my British parishioners had, in fact, compared it to the "traitor's sop," since the only person the Bible shows doing it is Judas--but given our society's fear of germs, it seemed a reasonable compromise.

But Lents would have none of it. They wanted the small individual glasses, what my worship professor had called "shot glasses," rather than dipping their bread in a common cup. The opposition became so heated we had to have a church meeting about it. I went over the rationale for intinction, acknowledged it was a compromise, but stated my belief that the symbolism of the common cup was important in a church this polarized. A parishioner stood up then and said that for him, it came down to what Jesus had instructed the disciples to do: take and eat, then take and drink. They were two separate acts, each symbolizing a different aspect of Jesus' identity. Eat, then drink. That's what it says to do.

I couldn't argue with that; and, as I said, I knew the practice was on shaky theological ground anyway. So I rolled with it, and we put aside the common cup in favor of the little shot glasses. And from then on, whenever I talked to a congregation about liturgy, I recalled the definition with which that professor had started class on the first day: liturgy is "the work of the people." It's not something imposed upon them; it's something that grows out of their experience of God. All that business about the right and wrong way to do it forgets the most important element: if it's not true to the experience of the people worshipping, then by definition, it's bad liturgy. The words I found to best explain this came in the form of an aphorism: "Different strokes for different folks."

Rules of grammar, music theory, liturgy, fashion, of anything that defines what it is to be human in a particular place and time, have far less to do with the dictates of the intelligentsia than ivory tower residents like to admit. The rules grow out of observation of common practice. Turning them into prescriptions for proper artistic interaction with the world is, itself, a violation of the most basic tenets of humanism: art, music, literature, liturgy, all the things that make us most human do, and ought to be allowed to, develop organically from our communal life. We are the ones who, as our music evolves, decide it sounds better without parallel perfect intervals in the harmony. We are the ones whose worship has evolved to take a certain form, and our reasons for that evolution are deeply rooted in our tradition. And yes, "we've always done it that way" is, at the very least, a reason for taking very seriously the reluctance of some people to change a practice that has worked for them possibly for decades, and underlines the importance of having very good reasons for messing with that practice. "If it ain't broke, don't fix it" still rings true.

The bottom line in any argument over the relative quality of one symbolic act or work of art over another is diversity. Human beings grow and mature within family systems and communities, and are formed and shaped by them. Experiences accumulated on their own, as well as within those systems, add to the lode of preferences that is ultimately mined when, as individuals, we state emphatically that this doesn't work for me, I prefer it this way instead. I may not always know the why of my preference--it may be rooted in early childhood, it may grow out of a traumatic experience buried in my subconscious, it may just rub me the wrong way because of how it's presented--but I have a right to those feelings. Insisting that my feelings are wrong is a violation of my personhood.

Now, it may very well be important for me to move out of my comfort zone. That is, after all, where future growth lies. Embracing a new way of doing things may be exactly what it takes for me to become a healthier, happier, more whole individual, to make the leap into the next stage in my development. Cognitive dissonance lays the groundwork for future learning. So I do need to consider the possibility that my preferences may be grounded in something outdated or illogical. But I also have the right to be taken seriously, to have my feelings considered as part of the decision process. And if those feelings are deep enough, it behooves the leaders pushing for the change to make it, important though it may be, as gentle and gradual as they can.

In the world of liturgy, it is ultimately pastors who have the responsibility to maintain the integrity of worship. But they aren't called "pastors" for nothing. Yes, they are supposed to lead their congregations into the future, but not with violence, not whacking them on the behind.

And in the world of art? In some senses, the rule is "anything goes." Often it is the radically different that captures the attention of the young simply because it is so very different from what their parents prefer. That doesn't negate the value of either the new or the old. It's just one more reason that aesthetic rules are made to be broken, and that today's revolution will someday be the old order that must itself be rebelled against.
