Sunday, February 28, 2016

Over-Engagement

It may be a stock photo, but this really is what engagement looks like in a general music class.

Let's be clear about one thing: I can keep children engaged without teaching them anything about music. When I do, they look and sound exactly the way they're supposed to--in their home room.

I do it by telling them stories. Sometimes I use puppets. Sometimes it's enough just to read them a picture book. When I'm doing these things, I'm using skills I mastered in my years as a pastor, tapping into narrative preaching techniques that held the attention of multi-generational and multi-ethnic congregations in Illinois, England, and Oregon. Preaching was challenging at first, but the more I practiced it, the easier it became, until I found myself able to reel off a sermon on the spot with little or no preparation. It was improv without a scene partner--except when I did it for African-American congregations, whose responses are very much a part of the sermon--and it was the one thing about being a pastor that I did very well.

So I can do that with my kids. I sit them down, I say "Once upon a time" or "Long ago and far away," or perhaps I have a quick exchange of questions to lead into whatever story I'm going to tell, and I've got them in the palm of my hand for as long as it takes me to reach the end of the story. I love doing it, love seeing their reactions, hearing their giggles at the funny parts, the way their hands thrust instantly into the air when I need a volunteer to use a puppet or play a small part in the narrative.

Make no mistake, though: there's not much I'm teaching these children when I tell them a story, any more than if I were showing them a video. I'm entertaining them. Anything they learn along the way is a bonus. That's why I use these stories sparingly: as much as I love telling them, and as much as the children enjoy hearing them, I can only justify them to the extent that I can make them hooks upon which to hang a music lesson.

And here's the thing about an engaging music lesson: it looks and sounds completely different from story time. The children are singing, moving, playing instruments. If they're being at all creative (and if the teaching space is right, that's something to be encouraged), the class will seem chaotic at times. There may be moments where the entire class grasps a concept at the same time, and performs it so brilliantly together that it's like a big bang of music-making; but more typically, there will be some children who just don't get it, but are fully engaged anyway, and in ways that drown out the learning of the others. This can be hard for non-musicians to appreciate--at times, it even tests my patience--but it's actually a good thing.

At least, that's what I'm going to tell myself the next time it happens--and carefully explain to my principal the next time she's horrified by it.

I write this because yesterday, at a workshop featuring master teacher Thom Borden, I heard something that made such perfect sense that I felt like slapping my forehead and saying "Duh!" I didn't follow through on that impulse--it would've been rude and disruptive--but seriously, how have I come this far, teaching for fourteen years, and not grasped the simple truth that children play instruments loudly because they're over-, not under-, engaged?

Thom talked about the need children have to express their excitement at playing a drum or recorder, waving a scarf, yanking on a stretchy band, or handling whatever other object I put in their hands, before they can settle down and acquire the deeper learning that the lesson seeks to impart. He also spoke of those moments in a lesson when, reading the signs of the times, we teachers have to realize it's time to move on, even if not everyone's had a turn, even if the part hasn't been played to perfection, because if we don't, we'll lose the whole bunch of them as their over-engagement spins off into a train wreck.

These are fundamental music room behavior management concepts that made sense to me because I've seen the dynamics at work in my own lessons. I've had moments when I've insisted on sticking with teaching a skill to every student even though the half of the class who've already mastered it are so bored they're on the verge of breaking into loud conversations. I've wasted valuable minutes of the one half hour I get with a class trying to get them to all stop drumming so I can introduce the lesson. I've taken instruments away from children who want nothing more than to make music because they're not waiting for me to tell them how to play them. I've erred again and again on the side of single-minded sanity for myself and (if she's in the room observing) my principal. And I've done it at the expense of the children's music education.

Far better to channel that enthusiasm into a warm-up activity that communicates to the over-engaged child that yes, you will get to play today, in fact you'll be playing a lot, because you'll learn much more by playing that drum than by me talking to you about it; so let's jam for a couple of minutes, bring it to a good solid "STOP," give me thirty seconds to introduce the concept, and then go back to playing.

Ugh. It's so obvious I'm feeling again like pounding myself on the forehead and chanting, "Duh! Duh! Duh!"

I know I'm being too hard on myself. I know the reality is that the two and a half years I've been teaching at Margaret Scott School have been like a journeyman period for me as an Orff teacher. I've had to learn how to teach in a variety of settings, with all manner of distractions, sometimes in a well-equipped space, this year with only borrowed space, sometimes in a room with excellent dry acoustics, this year (again) in a horribly echoey gymnasium. Apart from the equipment and the space, I've been teaching classes that almost all come blessed with high flyers who are frequently the bane of every teacher in the school--and who, handed an instrument, become instantly over-engaged (I'm studiously avoiding the word "over-stimulated," thanks to Thom's workshop).

Don't be deceived by my appearance. Even though I'm at an age (going on 55!) when most veteran teachers are considering early retirement, and have long since mastered techniques of classroom management, I've really only got eight and a half years of full-time general music teaching under my belt. And especially when an administrator makes a face at the noise-level of what's happening in the music room, or chides me for not having 100% of the children listening to every word I say, or for not having multiple ways of presenting a lesson so as to reach diverse learners who clearly aren't getting it because they keep banging on that damned drum instead of paying attention, I'm going to try and accommodate those concerns. But when I do, it's at the expense of the children who just need to move, to make a noise, to express themselves in ways that are disruptive in a general classroom, but if properly channeled, can actually enhance their experience in the music room.

Thom spoke about how every lesson needs to have a "gem," an Easter egg that makes it special to children: a moment of silliness, a twist in the choreography, a surprise buried in the song that makes them smile. For me, the gem of his workshop was this teaching about engagement. Plugging it into my personal educational credo, I've got a goal for as many more years as I'm able to practice this discipline: to engage all my students as musically as I can when I'm telling them a story.

Thanks, Thom.

Sunday, February 21, 2016

Originalism Isn't Just about the Constitution

Supreme Court Justice Antonin Scalia attending a Red Mass in 2012.

Antonin Scalia died a week ago, and the repercussions of his passing have only just begun to play out.

Our historically unproductive Senate announced immediately that it would hold true to form on the appointment of any nominee President Obama might put forward: it would refuse to act, preferring instead to leave the seat vacant for over a year, in hopes that the next President will be Republican, and appoint another hardshell conservative.

The President, on the other hand, stated plainly that he will put forward a highly qualified nominee, and he expects the Senate to fulfill its Constitutional duty of advising and casting a vote on that nominee.

Meanwhile, across America there has been a full spectrum of reactions, from sheer delight on the left to cries of conspiracy and even accusations of murder on the right. In the press, legal scholars have weighed in on Scalia's significance from a host of perspectives, and in particular, on his doctrine of "originalism": the interpretation of the Constitution as essentially a dead letter, a document that should only mean what its writers intended it to mean. I'm not a legal scholar, but from what I've heard, it doesn't take one to know that Scalia used this doctrine only when it suited him, wielding it as a hammer against opinions he disliked, while ignoring it when it stood in the way of making George W. Bush President.

But I'm not here to write about Scalia's hypocrisy, or to attempt to interpret his often virulent influence on the recent history of the United States. As I said, I'm not a legal scholar. I've got an amateur, at best, understanding and appreciation of the law, though I do enjoy debating the meaning of the Constitution when it comes to the Bill of Rights.

With that said, I do have a deep appreciation for the principle of originalism, though I come by it from two other disciplines I've studied in far greater depth: musicology and Biblical criticism.

Originalism is in play whenever a classically trained musician begins to interpret a score. That's true whether the musician is a solo pianist or the conductor of a symphony orchestra. Whatever the performer's instrument or ensemble, it is his or her task to play or sing, as nearly as possible, exactly what the composer had in mind. In the case of music composed since the mid-to-late 19th century, the performer is aided by a sophisticated system of notation that conveys every nuance of tempo, articulation, and volume. This doesn't rule out the possibility of individual expression for the performer, but it does mean that any deviation from the composer's intent is made by choice, not accident.

As I intimated above, this system of notation was largely standardized by the late 1800s. Work your way back before that, and the farther you go, the harder it is to decide precisely what the composer intended. This means that performers of early music--music written before the classical era (which began around 1750)--have to also be historians. The farther back one goes, the more the question of original intent is open to interpretation, and debates over that intent can become quite heated. If one is a conductor of a church choir in a progressive denomination, one has the added challenge of deciding whether inclusifying a sexist text is violating the composer's intention.

One might think that this musical originalism has been around for some time, but in fact, it only really emerged in the mid-twentieth century. Prior to that time, performers and conductors took great liberties with classic works of music. I have a fond memory of listening to a 78 rpm album--there were at least a dozen disks--my mother had of Leopold Stokowski conducting Handel's Messiah. I think it was my junior year of college, so I was studying music history at the time. The recording featured a huge orchestra and an enormous chorus, and to my ears, it felt far too rich for the material. What really blew me away was the moment in the "Hallelujah" chorus when a single trumpet plays a descending run of five pitches, a lovely modulation that always makes me sigh. In the Stokowski version, it was an entire trumpet section, sounding as if they were heralding the arrival of the apocalypse.

I'm not always doctrinaire in my approach to great works of music--I've been known to jazz up some Bach or Beethoven just for fun--but by and large, I'm a believer in playing and conducting them as close to the composer's intention as I realistically can.

That also goes for Biblical texts. As a seminarian, I was introduced to the historical-critical method of scriptural interpretation, a scholarly approach that seeks to understand the most likely original setting and meaning of a text before attempting to bring it forward to a contemporary audience. The responsible interpreter had to be a historian, anthropologist, sociologist, and literary critic. It helped enormously to be able to read the text in its original language, but even with that, one ought to have an extensive library of commentaries by scholars whose grasp of Greek and Hebrew was far better than one's own. Many of these scholars spent their entire careers studying the text of just a few books of the Bible. One might think that doing so, they could arrive at some kind of definitive understanding of those books, but in fact, competing schools of historical-critical interpretation could be every bit as venomous as competing schools of Constitutional interpretation. At my seminary (Perkins School of Theology at SMU), the resident iconoclast was William Farmer, whose theory of Matthean primacy, while elegantly argued, set him apart from the consensus of most other New Testament scholars. And it makes a difference: one has to read the gospel of Mark very differently if it was a paring down of, rather than source material for, the gospel of Matthew.

This brings me back to Scalia. As with Constitutional Originalism, Biblical Originalism necessitates dispensing with many dearly held beliefs, and if one is not willing to do so consistently, one ought not claim to be a true adherent of the text. For instance, the Bible says next to nothing about an afterlife, a concept traditional Christianity borrowed from Greek mythology. It says very little about marriage as we now understand it. It has only a handful of mentions of anything that approximates homosexuality as we understand it today, and when it condemns it, it does so in the context of condemning many other things that modern humanity has no problem with (eating shellfish, being a sassy teenager, wearing blended fabrics). It both blesses and condemns militarism, preaches both tolerance and anathematization of competing faiths, and declares that faith is the most important path to salvation--no, wait, it's works--scratch that, it's faith! How dare you? Faith without works is dead!

And so on. If one is really serious about honoring the original writers of the text, one ought to "preach the controversy." Unfortunately, I rarely see or hear about anyone, from any part of the theological spectrum, doing that. Instead, I see both liberals and conservatives cherry-picking texts that reinforce their own beliefs, often ignoring both the textual context of the verse they're using and its greater, historical context. They're practicing their own version of Scalia's selective originalism: if it backs you up, beat it like a dead horse; if not, pretend it's not even there.

Back in the days when I preached regularly, I liked to think of the Bible as a living document, revealing new truths through the ancient words of its original writers. For fifteen years, that was enough for me. Over time, though, historical criticism laid the groundwork for my disenchantment with the Christian faith: I knew too much about how the Bible came into existence to go on believing it was God's Word. It's a work of great diversity, profound thought, and at times tremendous beauty, but as the basis for a system of theology, it's far too self-contradictory--not to mention containing plenty of ideas that are simply repugnant. For many years, I clung to an adage several of my professors liked to use--"Sometimes to be true to a text, you have to disagree with it"--but in the end, there were just too many I disagreed with.

I'm going to continue practicing originalism at the piano, with my trumpet, or on the podium. There's much less at stake when I decide to play a Bach prelude a little faster than he probably did in 1720, than if I'm telling people how to vote based on a questionable reading of a Biblical passage. And when I experience the interpretations of others? Well, if their Bach is too fast, their Beethoven not romantic enough, I may roll my eyes a little, but I'll mostly keep that to myself. And if they're holding forth on how the Apostle Paul meant for his condemnation of child rape to be applied to loving relationships between same-gender adults, I will briskly walk the other direction. They're no more eager to hear my understanding of that passage than Scalia was to admit he was wrong about Bush v. Gore, and I've got better things to do than try to argue it out with them.

Why Is This Even an Issue?


I've been using gender-neutral restrooms all my life.

Mostly it's been a matter of necessity: growing up as part of a large family in houses that rarely had more than one bathroom, we took turns using said bathroom. Come to think of it, that's been true of every home I've ever been in, no matter whose house it was: the bathrooms are for everyone, regardless of gender. There are times, of course, when it's clear that a bathroom is primarily used by family members of a particular gender (this one's got one bottle of shampoo and a can of shaving cream on the counter, that one is equipped with a full line of skin and hair care products), but even so, there's no question but that the toilet works and is available for anyone who needs it, no matter how that person is plumbed. It must also be noted that these bathrooms were all "one hole" facilities, with just a single toilet making it unlikely there would ever be two adults in the room at the same time.

To be fair, opponents to gender-neutrality in the pee-and-poo-place are generally concerned about public facilities, rather than private homes. As luck would have it, though, I experienced a public gender-neutral bathroom (with multiple stalls!) as early as 1979.

The place was my college dormitory, Lausanne Hall at Willamette University. Most of the dorms on campus were co-ed, but only Lausanne was door-to-door co-ed. Both the second and third floors were equipped with two sets of bathrooms and showers. The first floor, though, had only one set of facilities, most likely because only half the floor had bedrooms, with the other half dedicated to the dining hall and lounge. If the restroom was designated as gender-specific, this would require one gender or the other to climb to the second floor to use its facilities, which would unfairly inconvenience both the first and second floor residents of that gender, since the second floor restrooms had only half the stalls and showers of the one restroom on the first floor. It also meant that whichever gender was using the first floor restroom would be using a room designed for twice the number of users visiting it.

This was, of course, silly. A floor meeting was held, and by unanimous consensus, the first floor agreed to designate their restroom co-ed. For the sake of shower privacy, signs were created to let people know which gender might be showering at any given time, but apart from that, students of both genders moved freely through that room, using it whenever they had need of it. This seemed a sensible, respectful solution to a ridiculous, arbitrary problem.

Imagine my surprise, then, when I first read of a state legislature passing a law requiring that persons use only the restroom for which they are biologically plumbed.

Let's be clear about this: gender-specific restrooms are a relic of the Victorian era. The first law mandating the existence of separate male and female restrooms in public places was passed in Massachusetts in 1887, primarily to address the discomfort men felt at the growing presence of women in the workplace--and to protect frail women from the aggressive leers of brutish men. (That last clause, though ironic on my part, accurately describes gender stereotypes still prevalent as recently as the 1990s.)

Fast-forward to the present, and the increasingly public presence of transgender persons. Trans women (born with male parts, but self-identifying, and often presenting, as women) would rather use the women's room than enter a men's room where they will both be exposed to men's private parts (urinals do not typically have stalls around them) and be subjected to the stares of men upon seeing a person who appears to be a woman in a men's space. The corollary scenario is also disturbing: a person sporting facial hair and dressed as a man entering a women's restroom is certain to raise red flags. Given the ongoing existence of Victorian-era segregated facilities, it makes perfect sense that trans persons of either gender would rather use the restrooms associated with the gender of their identity.

Unfortunately, that makes bigots uncomfortable. And as we saw (and continue to see) in the marriage equality struggle, the primary concern of bigots is preserving systems that oppress the people they hate.

Sex bigots love the Victorian norms of frail women and strong men. There is no place in their world view for same-gender attraction, or for any gender identity other than that which conforms to the organs one was born with between one's thighs. Anything that differs from these norms is perversion, deviance, abomination, and must be forced back into the narrow criteria of the norms. Sex bigots have largely lost the marriage battle, though some are still pressing for the right to discriminate based on religious bigotry. Losing does not go down easily for a bigot of any stripe--witness the continued existence of the Ku Klux Klan--and rather than be chastened by defeats in court and public opinion, sex bigots have simply shifted their hatred to trans issues, particularly where they impact public restrooms.

In prosecuting this phase of the sex war, the bigots have created a straw man: the leering pervert who wants to sneak into a women's room by claiming to be a trans woman, just so he can glimpse some women using the toilet. This threat is patently absurd: as I wrote above, the whole point of trans individuals being able to use restrooms that correspond to their identity is that someone presenting as a man would be completely out of place in a women's restroom, would be spotted immediately, and (unless it was an awkward accident, as happens to us all from time to time) chased out of that space. Add to this the reality that women's rooms have stalls with doors, so that the only place one is likely to actually see genitals out in the open is a men's room--a scenario that doesn't come up in the arguments of male sex bigots because they just don't see women being as perverted as they themselves clearly are. (Remember Mike Huckabee wishing he could've used his "feminine side" when he was in high school to play peeping tom in the girls' locker room?)

The reality is that gender-specific restrooms are an inefficient relic of the nineteenth century. Adding a gender-neutral restroom to a building that already has both male and female restrooms just increases that inefficiency. It makes the most sense to just follow the example of Lausanne Hall's first floor--and of 1990s dramedy Ally McBeal--and have a single, large restroom with enough stalls for everyone.

This will, of course, make the sex bigots uncomfortable. But at some point, we're just going to have to get their minds out of the gutter, and start acting like civilized people who can tolerate the presence of both men and women in the same bathroom.

Tuesday, February 16, 2016

Losing It

Carolina Panthers quarterback Cam Newton after a fourth quarter fumble during Super Bowl 50.

It was an ugly game. Most of the scoring came courtesy of the defense. Quarterbacks of both teams were sacked repeatedly, and both suffered humiliating fumbles that led to turnovers. The winning Broncos may simply have been the luckier team--or perhaps better at exploiting Carolina's errors, and at recovering from their own.

So this was not a passing game. Despite that, the quarterbacks were still at the center of the pageantry, the hoopla, and ultimately, both the credit and the blame. Peyton Manning, at 39 well past his prime, was overjoyed to claim his second championship ring. Cam Newton, a 26-year-old whose best days have not yet come, was so shattered at losing that he was caught on camera with his head in his hands, weeping uncontrollably.

I'm not a big football fan, though it always grabs me when I glimpse it at the sports bar where Amy and I shoot pool. There is something hypnotic about this brutal sport that turns exceptional athletes into limping pensioners before they've even reached middle age. The relatively small number of games compared to basketball and baseball (necessitated by its inherent violence and players' need of recovery time) makes the stakes of winning and losing much higher. In baseball, it can take seven games spread over a week to determine the world champion. In football, there is just one. It doesn't matter that it's an honor just to be competing in the Super Bowl: winning is everything.

So of course a young man, barely an adult, felt those stakes far more deeply than his elder rival. At 26, everything is heightened: break up with your girlfriend, lose your job, flunk a class in grad school, have your car break down, all of it feels like the end of your world. And these are things everyone goes through. Imagine what it must be like to lose the biggest game in the world at 26.

I remember 26 a bit too well. I remember how deeply I took perceived slights, how important it was to me to be taken seriously, how terrified I was of losing the love I'd found. I'd left teaching after just a year to enter seminary, traveling to an alien place that was not, I came to realize, a good place for me. I was completely out of my element in Dallas, Texas, a small town boy in a big city that was faster, meaner, and far more conservative than anything I'd experienced in the Pacific Northwest.

My self-esteem was low, and now I was studying subjects I'd had little preparation for. Academic theology is built on philosophy, and I'd never had time for anything as cerebral as that. All my college and graduate classes had been aimed at practical matters: how to teach a child to play an instrument, how to modify a choir's vowels to arrive at the right timbre for a particular piece, how to turn a piano piece into an orchestration for concert band. There were a few dips into the theory of aesthetics, but nothing to prepare me for the language of soteriology, hermeneutics, or any of the other philosophical disciplines I was expected to have under my belt. For all that, I did well without realizing it: I skipped an awards ceremony at the end of my first year assuming I had no reason to be there, and thus missed out on being honored as one of the top students in my class. Low grades shattered me, breaking up with my first girlfriend was devastating, and losing a church choir directing job almost did me in.

Half a lifetime later, I can look back on that callow 26-year-old and marvel at his innocence, his tenderness, his utter lack of perspective. I know that $300/month choir job was not right for him. I know (and he knew, really--he just didn't know how to initiate the breakup) that first girlfriend wasn't right for him, either. And I know those low grades, which were still passing grades, were subjective and had no real effect on him receiving his degree. All those apocalyptic experiences were, in the end, just life happening to him.

That's what I'd like to tell Cam Newton, too. No, you don't have a Super Bowl ring to wear. And yes, it's the second time you've lost. And you're right, you may not get the chance to be in that game again. This may very well have been your last chance at being a world champion. So yes, it feels very much like the end of the world, and I completely understand why you couldn't keep those tears under wraps as the game ground to its conclusion. You'd literally taken a beating from the game's first minutes right up to its bruising conclusion, as time and again the Bronco defense threw you to the ground. I could see it in your face every time a pass didn't connect, or you had to give up possession altogether. This game was no fun. Nobody was having fun. In fact, "game" wasn't the right word for what was happening on that field: it was an ordeal to watch it, let alone to have to play it.

So I'm cutting Cam Newton some slack. If I were 26, and losing the Super Bowl, I'd be bawling, too. Not at 54--I've been through enough failures and defeats that it takes a lot to wrench a tear from these withered ducts--but I'll bet even mercurial Cam Newton tones down by the time he's my age.

So let the poor man alone. 

Friday, February 5, 2016

Cat Fancy

Yeah. He had me at "hello."

How odd to find myself, at 54, falling for a skittish ball of fur.

I am not, by and large, a fan of other people's pets. It's not that I'm hostile to animals--I've been known to give an occasional stroke to a cat or dog, and I did care for and love both cats and dogs when I was a child--but for the most part, I'm much more interested in what the children of the house are doing. That suits me well for my profession, as I spend my work hours stimulating the musical creativity of 5-11 year olds and delighting in seeing what they come up with.

I think this is, in large part, due to a deep sense of responsibility I feel for the well-being of any creature in my care. There was a time, toward the end of my first marriage, when I acceded to pressure to buy a puppy, a lively black Lab/golden retriever mix. My reluctance to do so came out of my sense that our dual career household just didn't have the spare time available to care for a dog properly. And that's what happened: the lonely dog barked excessively, ran away twice, and was never completely house trained. After a year and a half of trying to make it work, we gave her to a farm family who could give her both the attention and the space she needed.

The lessons of that experience stuck with me a bit too well. During the long lonely stretch after my second divorce, only one pet passed through my household: a cat I adopted for my daughter, who was never able to have him live with her. As I found myself in my first solid relationship in years, I also found I was not able to be home with this cat very much, and so I gave him to a couple who had cared for him while I took vacations, and who were very happy to have him.

Being pet-free had many advantages over having an animal in my home: I could go away for weeks at a time and never have to worry about who was feeding the cat, cleaning the litter, and most importantly, giving him attention. It also saved me a bundle on kenneling, which I would've needed for a dog who could not travel with me. It saved my furniture from being used as a scratching post, it kept my rent down (not to mention making "no pets" rentals available to me in the first place), saved me from veterinarian bills, and was quite simply far more convenient in every regard.

But convenience is not companionship. And there were times when I was very, very lonely in that apartment, that house, that empty place that had no one in it to welcome me when I came home at the end of the day. When a relationship ended, when my teaching job was cut, there was no creature to curl up in my lap or at my feet and let me stroke my sadness away.

And then I met Amy, and I wasn't lonely anymore.

For the first six years of our relationship, we were renters, and it was simple to reply to the kids' appeals for pets that we weren't having them because of pet deposits or no-pet policies. But then we purchased our house, and that argument went away. Just over a year after becoming home-owners, the pressure to acquire a pet began to pick up. In contrast to my puppy-buying experience, we handled this the mature way, talking it out, weighing the benefits and challenges, insisting on a strict litter box regimen. And then one day between classes, I pulled out my phone to see that Amy had sent me a photo of a kitten.

From the moment I heard his first mew, he had me.

His name is Clyde, and he's now about six months old. He has beautiful fur, deep eyes, an alternately playful and affectionate nature (don't even try stroking him when he's hunting!), curiosity up the wazoo, claws he seems not to mind having trimmed but which he'd rather sharpen on our year-old leather couch (groan), and, in the words of a cat-loving friend of ours, he makes the world a better place.

In some ways, it's like the first time I tried a Belgian beer, and realized what I'd been missing for the first 49 years of my life. How could something so wonderful, so satisfying, not have been known to me for so long? How had I gotten by all those years without experiencing it?

At bedtime, he plays pillow mountain with us, pouncing on Amy's wiggling fingers. He likes to drink from the bathroom faucets, stick his nose into the shower, chew on the corners of cardboard boxes. He'll chase anything that can be moved across the floor. He sits for hours in the window, staring hungrily at the birds that visit the feeders on the patio. He can jump incredibly high. He would like to climb on the kitchen counter and dining room table, two surfaces that are forbidden to him, but knows not to do it when anyone's watching. He goes hunting in the garage, then scratches at the door to be let back in. His purr is audible from rooms away. I can hear it now as I write--he's sleeping next door.

Am I gushing? There's a reason, and his name is Clyde, and much to my amazement, I'm very happy to have him in my family, my home, my life.

If you've talked yourself into petlessness with hard facts and logic, you may want to reconsider. Pets really do make the world a better place.

Thursday, February 4, 2016

They're Both Right

Sure, they look like they're having fun, but just wait. This is gonna get ugly.

Pity the poor politician.

To succeed in politics, one must convince a majority of voters that one will enact policies congruous with those voters' wishes. One must further convince a subset of these voters that one sincerely believes in the principles behind these policies. Simultaneously, one must refrain from engaging publicly in behaviors or making comments, even in error, that these voters will find offensive. Finally, one must appear genuinely warm and friendly to one's voters, while appearing fierce and defiant, perhaps even rude, to interests they oppose. And don't even get me started on donors. It's a rare politician who can speak his or her mind to any hot microphone and stay in the game for long. It remains to be seen how long Donald Trump can pull it off: with his billions, he's immune from donor decay, but it's extremely doubtful he can attract a real plurality, let alone a majority, of voters nationwide to his fascistic prognostications.

So what about Bernie Sanders? Isn't he an exception to the balancing act? He appears to be, after all, a politician who speaks his mind, stays true to his principles, and does not have to rely on the big money interests that fuel most other campaigns.

As Jamelle Bouie points out in a brilliant piece in Slate, Sanders really does seem to fly in the face of the whole notion of wheeling, dealing, compromising, voter-pandering politics--except when it comes to gun control, where he has cut a deal with gun dealers that should make most liberals squirm. (As I've said, if I was going to be a single-issue voter, this would be my issue, and he'd lose me on it--though in fairness to Sanders, that's true of almost everyone who's occupied a seat in Congress or the White House in my lifetime.) He's taken advantage of a home base (Vermont) that is small, homogeneous, and partial to cranky iconoclasts, giving him freedom to be as ideologically pure as he cares to be; and his appeal nationwide appears to be to a class of voters who would fit in well in Vermont.

The primary thrust of Bouie's essay is a meditation on the term "progressive." I'm familiar with its history--originally a description of liberal Republicans who sought to promote free market economics over the robber barons of the nineteenth century, the word nearly vanished from popular usage until it was revived in the 1980s as a replacement for "liberal," a word that had been demonized by the Reagan administration. In the 1990s, "progressive" became interchangeable with "moderate Democrat," a term that meant both concern for the social safety net and a belief in holding individuals responsible for their own actions. In the 2000s, conservatives seemed to catch on to its association with the Democratic party which, by now, had been almost completely purged of the Southern conservatives who used to be one of its mainstays; and with no more need to pretend it meant anything other than "liberal," Democrats began using it as a synonym for that word, disassociating it from "moderate."

Which brings us to the present, and a war of words being fought between the Sanders and Clinton campaigns over who is the true progressive. Given the elastic history of the word's meaning, it's clear to me that it all depends on which decade's definition one is using.

Take the word's origin, in the time of Teddy Roosevelt. Considering the accusations leveled at Hillary Clinton by Bernie Sanders over her associations with corporate interests--interests that exist because of the progressive reforms of the late 1800s--there's no question but that she is the true, classic progressive.

But then let's look at the progressives of the 1980s, when the word was code for big government New Deal liberalism. That's Bernie Sanders, no question. It's also pre-White House Hillary Clinton, who earned the wrath of the radical right with her efforts to create universal health care.

The best examples of 1990s progressives, though, were Hillary's husband Bill and his VP, Al Gore, who were redefining the Democratic Party as a mainstream movement. Hillary was along for the ride, then, and continued to claim those values as she was elected to the Senate, then appointed Secretary of State.

In the Obama era, progressivism has taken on more of its original reformist luster, though now it's seen as a corrective to the publicly-owned corporations that were created by the first progressives. Elizabeth Warren, in particular, has emerged as a critic of Wall Street, of profiteering, and of the ever-widening income gap initiated by the Reagan administration. Bernie Sanders embraces this critique, and calls it progressive.

So who's right? Take a look at the title of this essay for my answer. And actually, I have to wonder if maybe "progressive" hasn't outlived its purpose, especially when one considers the mindset that gave birth to it.

Progressivism wasn't just about liberating money from robber barons, after all. "Progress" also meant "expansion": consolidating control of the now continent-spanning United States, growing the economy, making all Americans (persons of color excepted) wealthier, happier, freer. It was a freedom borne on the backs of immigrant labor, though those immigrants quickly assimilated into this culture of progress, working their ways up and out of the ghettos of Eastern cities. It came at a terrible price to the environment of the young nation, though the same Roosevelt who championed progressivism also set aside large swathes of land to protect them from the onslaught of industrial exploitation. To a large extent, this progressive era set the climate change juggernaut in motion.

That's why I wish Bernie Sanders would stick to his own self-definition, and keep calling himself a socialist. I've never been ashamed of being a socialist, and he seems to be quite comfortable with it. Socialism's critique of markets and wealth fits him and his movement far better than most of progressivism's incarnations.

And as for Hillary Clinton: while I do think she's got more of a claim to being a true progressive (and I mean that in the 1890s, 1980s, 1990s, and even the 2000s senses of the word), I'd rather she picked a different word for herself, too. Moderate, consensus-building, even liberal all work. And they don't have that 19th century baggage.

But fighting over who's the true progressive? That's just plain silly.

Tuesday, February 2, 2016

Heroes Regurgitated

If you missed it when it was on, you can probably see it on demand. But don't.

Once upon a time, there was a TV writer who took a whole mess of comic book tropes, added conspiracy theory, sprinkled in a thrilling serial killer plot, and came up with a serialized drama that was original, funny, and compelling. The characters were multi-cultural and multi-generational, and their powers ran the gamut from traditional (flight, super speed, invulnerability) to awe-inspiring (teleportation through time and space, as well as the ability to stop time) to ridiculous (the serial killer could take on the power of every hero whose brain he consumed, so that by the end of the series, he was so invincible nobody could figure out what to do with him). Over the course of its four-season run, it fell victim to convoluted conspiracy-spinning and the kind of bombastic stakes-raising that has become de rigueur for movies in the Marvel universe (Ant-Man blessedly excepted). How many times can the Earth be almost destroyed before our fear turns to yawns? Heroes went there early and often, so much so that four seasons was probably more than enough. And yes, I watched every episode--79 total, the same number as the original series of Star Trek (though the comparison ends there). But then, I was a Lost junkie, too.

When I heard it was coming back after a five year absence as Heroes Reborn, I had hopes that this might be a chance to clean out all the silliness that had progressively turned the original Heroes from a pleasure into a hate-watch. I eagerly set the DVR to record, and was intrigued with what I saw: a renewed X-Men-type world in which the people with powers are suspected, discriminated against, even hunted down, coupled with a Watchmen plot line about a city's destruction being pinned on them. But at the heart of this reboot was yet another rotten core of conspiracy, one that became more ludicrous with each passing episode. The parts that worked kept me watching through the winter hiatus, and brought me back in January for the final denouement, but the more I saw, the more I hated myself for continuing to watch what I could no longer deny was simply very bad TV. Apart from the silly plotting, the special effects were incredibly cheesy (think early 1980s), the dialogue so stilted it would've rung false in the bubbles over genuine comics characters' heads, and the directing--bleah.

Of course, like every previous episode of this franchise, that finale ended with the promise of another chapter, perhaps in fall 2016, but there's no word yet on whether that will come to pass. If it does, I may be drawn in yet again, though I'll do it at times when nobody else is around to make fun of it. Because this is a program I love to hate.

That's an odd experience for me. I've always considered my tastes to be discerning, though when I look back on some of my youthful TV loves, I can't help but roll my eyes: Lost in Space. The Six Million Dollar Man. Space:1999. The original Battlestar Galactica. The Starlost. Yes, those are all science fiction shows (you're aware I'm a Star Trek fan, right?), because that's what I loved best. If circumstances had been different--if my family had lived in range of larger cities, for instance--I could add quite a few other shows to that list, some of which I saw occasional episodes of (Voyage to the Bottom of the Sea, Time Tunnel, Land of the Giants), others which I only know by virtue of the fan magazines I started reading in high school (Thunderbirds, Secret Agent). Most of these shows--the ones I saw and those I only managed to sample--would initially grab me with their coolness. The gadgetry, the scenario, the music drew me in. Over time, though, I'd be disappointed. Even Star Trek suffered from inferior scripts in its latter seasons, and it was the gold standard for televised science fiction well into the 1980s, when it was surpassed by its reincarnated self, Next Generation. By the time The Six Million Dollar Man had entered its third season, I'd given up on it, and I barely made it through the first season of Space:1999, despite its inarguably excellent production values. I'd come to realize that it took more than space battles, flashing lights, and fantastic plotlines to make a science fiction series work: without good writing, directing, and acting, there was just no point.

This is what made the original Star Trek stand out from the rest of the pack. And yes, there are episodes of the original series that are painful to watch ("Spock's Brain," anyone? Or how about "Catspaw"?), but overall, there was a commitment to quality in that series that separated it not just from the rest of televised science fiction, but from episodic TV in general. "The City on the Edge of Forever" measures up to the best episodes of the best dramas of its era.

But back to Heroes (sigh) and Heroes Reborn (ugh): why did I keep watching this dreck?

I guess, when it comes down to it, I finally have a place in my heart for hate-watching.

A few years ago, I tried watching the pilot of Lost in Space, which was, I'm embarrassed to admit, my first real TV science fiction crush. I remember watching the scary "Keeper" episodes when I was just 5 years old, covering my eyes and peeking between my fingers at the huge spider monster that was strangling Dr. Smith, fearing in a later episode that Dr. Smith would be left behind when the Jupiter II left the planet it had crashed on two seasons earlier, worrying about the status of the robot when it had its battle with Robby, and much more. I didn't discover Star Trek until it was in reruns, probably because it was on too late for my tiny self to stay up for it. Seeing that first episode as a middle-aged adult, I cringed again and again at the sheer silliness of it. I couldn't even hate-watch it: I was just embarrassed for the people who'd made it, and for my childish self for loving it so much.

Perhaps that's what Heroes and its offspring kindled for me: nostalgia for the crap TV I watched as a child, updated to the present. I've also watched every episode of Glee, a gimmick show that, like Heroes, stayed on far longer than it should have--watched them and made fun of every ludicrous twist, even though I knew the series was deep into self-referential irony long before it was finally put to rest, to the extent that I wasn't sure whose leg was being pulled when Vice President Sue Sylvester took the stage in those final moments to congratulate her mortal enemy, Will Schuester, for all the music educational good he'd done. Glee ended last March, and Heroes Reborn premiered in September, filling the bad TV void for me.

If I'm honest with myself, I have to admit that I've always had some room in my brain for bad writing, bad music, bad television. The musical snob in me enjoys an occasional dose of schlock. The literary aesthete likes to pick up a pot boiler from time to time. And yes, I've always watched some bad TV, usually because I can't let go of series that once meant something to me, but are now years past their prime: the later seasons of The Cosby Show, LA Law, ER, and many others. I do eventually give up on many of these shows--I can't remember watching the final seasons of any of those I just mentioned--but that's only after staying with them much longer than I needed to. It's also rare for me to give up on a novel, even if I'm finding it frustratingly bad. Perhaps I just don't want to admit I've wasted a significant part of my life on something that's just not very good.

For all the shame of sticking it out with bad TV, I must admit there is a perverse pleasure in watching something crash and burn. It's rubbernecking syndrome, the phenomenon that arises when a wreck on one side of a divided highway causes the other side to slow down, as well, from people checking out the damage. There's something in us that makes us want to witness a disaster. Perhaps the apocalypse of a TV show collapsing under its own weight is more real to us than the ridiculous events it depicts. 

Or maybe we just can't help thinking, "That could be me."

No, I don't have millions to burn making an embarrassingly awful TV show. But I'm no stranger to the project that goes south, for all the best efforts of many gifted individuals. I've helmed a few, and been a part of more. It hurts when the dream turns sour, the audience wanders away, and the big idea refuses to take off. It hurts even more when a great idea turns into something that no one wants to claim, when the best minds available produce a pile of excrement.

So maybe--and this is just the beginning of a hypothesis--the real reason we hate-watch bad TV (or hate-read bad novels, or hate-eat junk food, or hate-do anything) is to empathize with the humans who created it, who invested time, effort, and money in something they thought would be great, only to see it turn into something awful.

And also to gloat. Because sometimes it's nice to be reminded that a shit-ton of money is no guarantee of success.

Postscript: I'm writing this toward the end of my second sick-day from my mid-winter cold, with another ahead of me tomorrow. Yesterday I also hate-watched the Syfy miniseries Childhood's End. Ugh, but that was a disappointment.

Monday, February 1, 2016

You Say You Want a Revolution

The passion's undeniable. The strategy? Hmmmm...

Let me get this out of the way up front: I agree with everything in Bernie Sanders' platform. He's on what I believe to be the right side of every issue save gun control--which, if I was a single-issue voter, would be my single issue. But I'm not. I'm a strategic voter, checking the box of the candidate I think is most likely, if elected, to make the most substantive progress in making this nation more free, open, compassionate, and affirming of diversity. If Bernie Sanders is the Democratic nominee for President, there is no question but that he'll have my vote.

Until that happens, though, I've got some concerns about Bernie.

I've spelled them out in this space. I'm not convinced Bernie's electable, given this nation's long-running distaste for socialism, and I worry that this could put Donald Trump or Ted Cruz in the White House. The possibility of Michael Bloomberg entering the race as an independent candidate--as he has threatened to do should the nominees be Trump and Sanders--deepens this concern.

But let's set that aside for now, and look at another problem I see in the Sanders platform: despite being a great set of talking points, an agenda that warms my leftist heart, it's unnervingly short on specifics. As I wrote in those previous essays on the Sanders phenomenon, hardly anything on this agenda is doable by executive order. It all depends on our bicameral legislature, at least one side of which is almost certain to remain in the hands of extremely conservative Republicans who've spent the last six years making life miserable for the White House, blocking any piece of legislation bearing the President's fingerprints, devoting their energy instead to trying to take down his signature health care reform. We don't have a parliamentary democracy: our President is not guaranteed a legislature that agrees with his or her agenda.

Confronted in interviews and debates with this reality, Bernie falls back on a single word: revolution. He believes his campaign will stir up the American people to rise up against their do-nothing Congresspersons, demanding that they roll over and enact socialist legislation. If enough Americans believe in change, his theory goes, then Congress will have to accede to their wishes, giving us a $15 minimum wage, free college, single-payer health insurance, paid family leave, publicly funded elections, et cetera. 

That's a lovely thought. Certainly that's how republics are supposed to work: the voters tell their legislators what to do, and they do it. Unfortunately, it flies in the face of generations of Congress ignoring the wishes of vast majorities of Americans, opting instead to preserve the status quo.

Consider, for instance, my single-issue voting point, gun control. In the wake of every school shooting, polls are published that confirm a significant majority of Americans want sensible limits on gun sales, limits that fall far short of any real control over gun ownership. None of these tweaks has made it out of Congress, leading the President to finally throw up his hands and simply put in place what mild regulations he had the authority to enact by executive order.

The reality of voter revolts is that they have to get ugly to have an impact: consider the Tea Party, whose virulent light is only now beginning to fade, as an even nuttier segment of Republicans swings toward the fascist views expressed by Donald Trump. The kind of revolt Bernie Sanders advocates--progressives politely insisting they be listened to--isn't going to have any impact on the nut wing of the GOP, the cabal responsible for keeping the House of Representatives firmly in the "Hell no" camp. These legislators were elected by reactionaries who will eject them from office the moment they make even the slightest of concessions to progressivism. Any progress to come from the House will be despite them, and likely bring the downfall of Speaker Ryan, just as it did for Speaker Boehner. 

All right then, suppose there's an expansion of the Occupy movement, with progressives camping out on Wall Street, the Capitol Mall, on state capitol grounds, city halls, any public place that stands in the way of progress. Suppose these activists add some bite to their bark, chaining themselves to gates, pushing back at police who come to impose order on their rowdiness, screaming so loudly at town hall meetings that conservative opinions are drowned out (a favorite Tea Party tactic). Suppose this becomes a real revolution, the kind LaVoy Finicum was advocating to his gun-toting right wing militia buddies. Are we really going to have violence in the streets? And is it going to bring the change Bernie Sanders is advocating?

No, it's not.

The Civil Rights movement of the 1960s was most effective when it was following Gandhi's principles of non-violent resistance. That brought change, though at a price--many activists lost their lives--and the change, even with significant pieces of legislation being passed, was too incremental to satisfy the more militant factions of the movement. Gun-toting Black Panthers and rioters led to a backlash that got Richard Nixon elected President, and set back the movement for decades.

And how about successful revolutions? I can name three: Soviet Russia, Nazi Germany, and Communist China.

Let's start with China, where a genuine people's movement succeeded, after decades, in ejecting a corrupt Western-backed government--then replaced it with a monolithic dictatorship that remains in power 65 years later. Reforms in China continue to be incremental, as the pragmatic oligarchs who continue to run the country make minor concessions to the populace, who have learned not to push too hard against their masters, for fear of massive reprisal.

Nazi Germany is an example of a small contingent manipulating the rules of democracy to its own advantage, placing a populist leader in a position to impose a reform agenda that pleased the majority of Germans, but contained within it the seeds of genocide and nearly destroyed European civilization.

And finally, the granddaddy of all leftist revolutions: the Bolsheviks rose to power in Russia by overthrowing an ineffective monarch, taking advantage of the chaos of a world war, to put in power a team of socialist leaders whose ideals could not survive the decade. The death of Lenin paved the way for decades of brutal, genocidal absolutism. The regime ultimately rotted from within, but the populist government that took its place fumbled for years before reverting to strong man rule under Vladimir Putin.

There are other, smaller examples of revolutions, many of them non-violent, bringing hope to a nation, ending long-lasting regimes: Cuba, Egypt, Libya, Nicaragua, El Salvador, Liberia, South Africa. It could be a very long list. And unfortunately, again and again, we'd see that the very forces that make a revolution effective do not translate to an equally effective government.

Because, sadly, revolution is just not a good way to run a country.

The one exception to this litany of failures is the United States. Within just a few years of their victorious revolution, the founders realized the survival of the republic depended on massive compromise. They drew up a Constitution built around a three-branched government that included a bicameral legislature. Running the country would depend, repeatedly, on the separate powers negotiating with each other, giving up ground to make incremental progress. Some of those compromises were horrendous: it took another revolution to end slavery, massive movements to give anything approaching equal rights to women, and the subjugation and near-extermination of the native population to make room for the expansion of the nation beyond its initial eastern dimensions. But compromise by compromise, the equity and justice envisioned by the founders continued to expand and grow, diversity taking root in communities across the nation.

It's the exception that proves the rule: this country would not exist but for the leadership of visionaries who were also pragmatists. Revolutionaries typically don't build up; they overthrow.

To close this out, I'll make one more appeal not to misunderstand me: I agree with so much that Bernie Sanders advocates. I just wish he had a realistic strategy for creating his socialist utopia. Because revolution ain't it.