Wednesday, July 30, 2014

Gassholes


Let's start with the disclaimer: roads are for cars. That's why they're paved and marked the way they are. Engineers design roads to accommodate drivers, not walkers, runners, or cyclists.

Now for the rant.

This morning at 7 a.m., I headed out the door of my suburban home for a summer run. Amy and I live at the far north end of Bethany, an unincorporated area within the Beaverton School District. Running or cycling, it takes me just a minute to leave the spec houses of my neighborhood and be on a lovely country road, looking out across pastures, fields, and valleys to the hills beyond. This morning I surprised a trio of young bucks, and snapped pictures of one before continuing my run. My runner's high was kicking in early as I breathed the fresh air, drank in the long sunrise shadows, and celebrated all the best things about rural life.

I was planning a run around the "block," a loop that includes Kaiser Road, Germantown Road, NW 185th, and finally Springville Road. All told, it's about seven miles, chockablock with pastoral scenes. Unfortunately, the north side of the block--Germantown--has some issues.

Germantown Road should be a beautiful experience for anyone traveling it, whether on foot or on wheels. Starting at Cornelius Pass Road, it runs through farms, past a beautiful white-steepled church, then climbs up to Skyline before descending through Forest Park to the St. Johns Bridge. That last leg is, unfortunately, what makes it far too deadly to enjoy. I've had good runs and rides on Germantown, but only in the middle of the day or on weekends. In the morning, in the evening, the road is dominated by commuters whose sole purpose in life appears to be shaving a few seconds off their record for getting to work.

I had barely turned onto Germantown when I was reminded of this reality, as car after car roared past me at 45 miles an hour, not yielding an inch to the red-shirted runner struggling to maintain his balance on the tiny strip of gravel that passes for a shoulder. Occasionally a driver would steer wide of me, giving me a chance for a deep breath. Two, unable to pull into the opposite lane because of oncoming traffic, courteously slowed down, as I would have done had I been overtaking a cyclist or runner in a similar situation. Most just didn't care. This was especially worrisome on those stretches of Germantown where there is no shoulder at all, meaning one must risk jumping into a ditch if a driver will not yield.

This was not my first time at this rodeo. I've run Germantown many times, cycled it as well; and in fact, many of these drivers seem to save their worst manners for cyclists. I've had truck drivers blast their horns at me and yell obscenities as they roared past, furious that my presence on the road cost them a second or two; worse still, I've had vehicles shoot past me with just inches to spare. Running against traffic, as I always do, I at least have the chance to step off the road into a ditch. Cycling with traffic, even if I spot the car in my rearview mirror (which I watch assiduously on Germantown), I can't safely get out of the way of a vehicle overtaking me without crashing my bike. So I grit my teeth, hope I don't get clipped, and shout loud obscenities as the car zooms by.

If you've driven at all in rural Washington County, you've seen signs like this one:
I heartily approve of the message on this sign, that cars and bikes (and runners, too, though they're not depicted) should be careful and courteous of one another on these back roads. As I said earlier, roads are designed with motor vehicles in mind. There's another thing to consider, though: many of the rights-of-way these roads follow predate the internal combustion engine. There was a time when traffic moved much more slowly on them, and much of it was on foot. Such modes of transportation didn't cease to exist with the invention of the automobile, and while paved roads may not have been created with pedestrians and cyclists in mind, we never stopped existing; in fact, traffic laws still give us the right of way for the simple reason that there is no question who comes out worse in a collision between a car and a pedestrian. We who walk, run, and ride on country roads understand this, and are probably overly sensitive to the risk of using these roads. I never force my right of way on a driver when I'm out in the country, because I don't trust them.

It's a strange thing: people move out here, on the edge of the urban growth boundary, because life is less hectic and schools are better than in the city. And yet, put us behind a wheel, and many of us forget all that: suddenly it's all about getting from A to B in the least time possible, and to hell with anyone on foot or on two wheels who gets in our way.

Maybe it's that, even knowing our address is a matter of choice, we resent the fact that we still have to drive into the city to work. Maybe it's that gasoline and commuting are a combustible mixture, and we just turn into angrier people when we're behind the wheel. Or maybe some of us are always there, in the hot zone, fuming at every slight delay, resenting every second we have to be on the road, ignoring all the beauty around us, ignoring the fragile body struggling to climb a hill under its own power that we could so easily zoom up in third gear. Maybe it's all of these things, rolled into one nasty mixture of rudeness and momentum that's itching for a manslaughter charge.

All I know is, as lovely as this run around the block is, I really shouldn't attempt it on a weekday morning.

Tuesday, July 29, 2014

Calling a Bigot a Bigot

Biblically Based Bigotry

A new book by Stephen Eric Bronner presents the argument that homophobes are bigots, nothing more, nothing less; and that, as bigots, there is no way to talk them out of their rejection of gay rights, gay marriage, gay ordination, gay anything. Refute an argument--point out, for example, that the passage from Leviticus in the picture above is surrounded by other laws presenting equivalent condemnations of eating shellfish, wearing mixed fiber clothing, and being sassy to one's parents--and they simply shift to a different argument, refusing to even acknowledge that the previous one has been deflated.

This is the experience I've had with homophobes for as long as I've attempted to deal with them. For a time, principally in the 1990s when I was serving as a United Methodist pastor, my approach to the homophobes I occasionally encountered in my congregations, and much more frequently in ecumenical settings, was to be patient and evasive: to avoid bringing up the topic at all and, if it did come up, to avoid at all costs entering into any sort of debate about it. To do otherwise was an exercise in futility. They felt what they felt, believed what they believed, and would not see reason on the topic. My hope was that, in time, the Supreme Court would make a ruling, attitudes across the country would ease up, and homophobia would wither away, just as racism has.

Of course, there's a huge fallacy in that hope: racism has not, in fact, vanished, though its proponents have had to alter their approach, finding cagier, more politically correct ways of couching its expression: battling affirmative action as reverse racism, for instance. But it is much harder to channel one's bigotry into racism now than it was forty years ago. And that is, perhaps, the main reason homophobia has seized so much of the national attention in the last twenty years: it may not be acceptable to hate persons of color anymore, but in much of the United States it has still been quite safe to go on hating persons with a same-gender sexual orientation.

That, too, is fading, though, especially as "defense of marriage" laws fall across the country. Homophobes are becoming shriller, more desperate, in their arguments, and have given up on many of their most precious myths about homosexuality. The one avenue remaining to them is a reversal so Kafkaesque it boggles the mind: insisting that according civil rights to gay couples means denying the right of religious expression to bigots who believe their hatred is Biblically grounded. The Supreme Court gave a nominal blessing to this argument with the Hobby Lobby decision, though the topic of that case was contraception; but the principle is already being applied by bigots hoping to avoid having to treat gay men and lesbians like any other human being.

This, too, is likely to fail, and in the end, bigots will have no choice but to surrender on the issues of equal rights to marriage, employment, housing, adoption, and whatever other privileges the "normal" community already enjoys. One could hope that homophobia will finally wither away. Unfortunately, it is far more likely that the hatred will simply shift to a new target; and already, right wing media attacks on transgender individuals are increasing. Simultaneously, there is the growing uproar over undocumented immigrants.

Hatred, it seems, is like life, as described in Jurassic Park: it will always find a way. Haters gonna hate. Whatever it is about bigots that makes them direct their rage against sexual minorities will, once those minorities' rights have the rule of law to back them up and the battle against them is finally conceded, simply shift to a new target, yet to be identified. Bigotry is a virulent pathogen that must have an outlet. Once the new victim is identified, justifications will quickly follow. The scripturally oriented will turn to their Bibles, poring over them until the right passages are located to prove that God hates Mexicans, or Wiccans, or vegans, or hipsters, whoever it is, just as much as God used to hate homosexuals but apparently no longer does.

And this is the challenge for progressives: the hatred will always be there, and we who just want everyone to get along, want everyone to enjoy all the Constitutional freedoms that are so precious to us, will be confronted with people claiming their Constitutional right to hate, and holding that hatred up as the be-all and end-all of their faith. Salt-of-the-earth Christians will go on insisting that God loves everyone, but hates whatever it is that makes this particular minority different from the rest of us, so we must purge them of their differences or, barring that, build a wall of exclusion around them to keep them from enjoying the same rights and privileges we normal folk do. Our liberal tendency will be to urge caution and patience, just to bide our time until the Supreme Court rules, as it must, in favor of expanding, rather than contracting, civil rights. We will want, above all else, not to apply labels to the haters.

And yet, Christians though they may be, Americans like us though they are, there is one word that best describes them in their rejection of fellow human beings, and that word is bigot. Not traditionalist, not conservative, not even reactionary. Bigot.

And the only way to deal with bigotry is to tell it "no," firmly, assertively, politely when possible; but, unquestionably, "No."

Sunday, July 27, 2014

Misogyny Done Right

Shotgun mass wedding? Happy ending? Sexist cliches turned on their heads? It's all this, and more.

If you read the screed I posted yesterday about James Bond movies, you may be of the opinion that I've become a humorless political corrector with no tolerance for fantasy, satire, or simple battle-between-the-sexes fun. If you are of that opinion, this post should dispel those concerns.

To rehash, briefly: to my dismay, I found when settling down to a classic James Bond film that the blatant sexism and misogyny of the franchise were trumping the fun for me, and I had to turn it off. I fumed about it for a day or two, then wrote up all the problems I saw in the disposable "Bond girls" and the hero's nonchalant attitude toward their frequent deaths, and why I have lost my tolerance for such plot devices. "But it's a fantasy!" someone commented, and she's right: it no more depicts the real world of espionage than Star Wars is an accurate portrayal of the space program. Fantasy or not, I've come to a point where the main character's reckless abuse of women is too much for me, and the whole thing feels like a relic of a bygone era when men were men and women were vaginas with tempers.

This is not the first time I have found a movie I once enjoyed leaving a bad taste in my mouth, by the way. I previously had this experience with Holiday Inn, the 1940s musical that introduced "White Christmas," and featured a startling firecracker dance by Fred Astaire. Watching it for the first time on my small black and white TV in 1983, just before embarking on a three-day train ride from Illinois to Oregon to celebrate Christmas with my family after my first semester of grad school, I loved it. Ten years later, I purchased a copy on VHS, put it on for my children, and had to turn it off. Between the stereotyped portrayal of the black servants at the hotel and the blackface song and dance number in celebration of Lincoln's birthday, I couldn't justify showing such blatant racism to anyone, nor could I enjoy it. I probably hadn't noticed it the first time around because, at that time, I'd known so few African-Americans. I had similar problems the last time--again, sometime in the 1990s--I tried to watch Gone with the Wind, and found the nostalgia for the nobility of antebellum slave culture revolting. And don't even get me started on Song of the South, Disney's paean to the happy lives slaves led on plantations, under the protective paternal gaze of their masters. Gah!

I could make excuses for these films, chalking up the racism and sexism in each of them to their times, saying the producers were just portraying popular attitudes and should not be faulted for their inability to present entertainments that broke away from those attitudes. In the 1940s, Jim Crow still ruled in the South and African-Americans living in the North were almost all employed in service jobs. The romanticized South of Gone with the Wind was in complete harmony with the version of history presented in public school textbooks, probably even in northern schools, where it was essential to maintaining the union of a nation still nursing its Civil War wounds. All of that may be true; and yet, this same period was already producing great works of art and literature by African-Americans, and had embraced their music, jazz, as the most American of art forms. There are, it must be admitted, few great works of popular entertainment that present an alternative approach to the black experience, but they do exist: Porgy and Bess, Stormy Weather, Cabin in the Sky. There are elements of these movies that bring on discomfort--minstrel songs, shuck-and-jive routines--but, in the context of black performers putting on these acts for black audiences, they come with the knowing wink of "Can you believe white people actually think these things about us?" Such entertainments tell me that Hollywood knew better, and could have done better, but simply chose not to.

Which brings us back to the misogyny of the entire Bond series. Sexist humor was enormously popular in the 1950s and 60s--cavemen dragging women by their hair back to the cave to rape them; Ralph Kramden threatening to send Alice "to the moon"; bosses chasing secretaries around desks to, like the cavemen, rape them; sex kittens begging to be mistresses of elderly rich men--and apart from that humor, popular culture was rife with images of infantilized women who couldn't be trusted with checkbooks or charge cards, who drove like maniacs, who burst into tears at the slightest criticism, who jumped up on chairs at the sight of a mouse. There was no question which gender ruled during this era, so why fault a series of movies that simply presents an accurate portrayal of contemporary attitudes?

The answer to that rhetorical question is that it didn't have to be that way. Hollywood was just as capable of turning sexism upside down as it was of portraying well-rounded black characters, and of doing it in entertaining ways. As evidence, I offer two films: The Quiet Man and Seven Brides for Seven Brothers.

John Ford's The Quiet Man tells the story of a retired boxer who, after killing a man in the ring, returns to his childhood home in Ireland to escape his guilty conscience. There he falls in love with a fiery-tempered local girl and, after a quick courtship, marries her. Their marriage quickly descends into cultural conflict, as his American sensibilities come up against the Irish customs she holds dear. The story culminates in a confrontation that both honors and subverts these customs. The stakes are high: for this man and this woman to have a life together, they must find a way to appreciate and affirm each other's identities, even as they push each other toward transcending those identities.

Seven Brides for Seven Brothers takes the caveman-dragging-the-woman-by-her-hair joke and turns it on its head. A family of seven hardscrabble pioneer men descend from their mountain cabin on a town that is one step closer to civilization, woo the only eligible bachelorettes in town, then make off with them, closing the pass behind them with an avalanche and trapping the women at the cabin until the spring thaw. The women have minds of their own, and while they are clearly traumatized by the experience, soon take charge of the situation, barring the well-meaning but clueless men from the house. Over the course of the winter, the men are tamed by the women, and ultimately put at their mercy. Once the pass is cleared, and their fathers come charging in to right the wrongs committed against their daughters, the situation is again upended, concluding with a shotgun wedding in which the women are calling the shots. The entire cliche is subverted from beginning to end: the rough-hewn pioneer brothers are dumb lunks, the town men are paternalistic jerks, and the women are the true heroes: wise, passionate, clever, and ultimately in charge of the entire situation.

To modern sensibilities, both these movies have shocking plots: John Wayne throwing Maureen O'Hara on the wedding bed or carrying her over his shoulder, the Pontipee men reenacting the rape of the Sabines by carrying off a wagonload of virgins; and yet both seem to understand the lie at the heart of the cliche of male dominance, tweaking it, subverting it, satirizing it with a sophistication remarkable for their early 1950s production dates.

These films are not outliers, either. Most musicals of the period present images of female power for the simple reason that a weak female lead is just not that interesting to watch. Thus Marian the librarian is the only person in River City with the power to reform conman Harold Hill, Maria rescues grieving widower Von Trapp from his lonely misery while healing his relationship with his children, and Mary Poppins may well be the most powerful female character in any movie ever made. Eliza Doolittle, too, proves herself more clever than her erstwhile mentor, Henry Higgins, though his final line in both the theatrical and cinematic versions of My Fair Lady--"Eliza? Where the devil are my slippers?"--is extremely problematic, as is the sense that the only way to solve a problem like Maria is to marry her off; but weighed against the rich character development and complex relationships depicted, the forced happy endings matter less: these works overwhelmingly transcend the sexism of the era.

I could list many other films, musical, comedic, and dramatic, that present strong female characters who are very much in control of their lives, but I will leave it here. I believe I've made my point: Hollywood knew misogyny was a joke with limited appeal, knew how to subvert and transcend it, and when it did, created timeless entertainment that is still watchable.

It also knew how to pander, though, as it still does, and so we have frat boy fantasies of bedding and discarding sexy women while using clever gadgets to defeat supervillains. Such fantasies still dominate box offices throughout the summer, and sexism still figures prominently in their plot lines--think of Megan Fox draped across a motorcycle in Transformers--because that is what the teenage boys in the audience want. As I pointed out in my last post, though, it's a rare film that callously puts bullets in its female characters. Except in the Bond universe.

Saturday, July 26, 2014

Oh, James...

Is this an action movie, or an excuse for putting scantily clad sexy women in sexy situations while men in tuxedos look on approvingly?

I used to love James Bond.

I don't remember not knowing about James Bond, or at least about agent 007 and his cool gadgets. I'm not sure how I knew about him: throughout my childhood my parents were as disapproving of violence as they were of smoking, drinking, and sexual references, so the only movies we went to were either cartoons or musicals. We did watch a lot of TV, though, and there may well have been ads for James Bond movies; and one of our favorite shows was the Mel Brooks spy spoof Get Smart, which was as appealing to me for the coolness of the gadgets as for the humor, which often went over my head. Whatever the source, by the time I was six I already knew the James Bond theme and knew that 007 was a spy, and was eager to experience some entertainment in that vein.

Twelve years later, I finally saw my first Bond movies.

There were two of them, an on-campus double feature, tied together by the theme of gold: Goldfinger and The Man with the Golden Gun. I was captivated by the coolness of the music, the over-the-top villains, the intricate universe in which Bond operated, the campy humor, the gadgets and, of course, the girls. As a sexually frustrated teenager, the Bond girls particularly appealed to me. "I like a girl in a bikini," says Scaramanga, the titular man with the golden gun. "No concealed weapons." I, too, liked looking at these girls in bikinis, liked Bond's easy way with them--but already, in my first full exposure, felt some discomfort at how disposable they were. When Bond discovers that, thanks to his meddling, Scaramanga has publicly executed the beautiful Andrea, he feels perhaps a moment of remorse, but it's fleeting. His real passion is solving the mystery, putting a bullet through the villain, and then moving on to his next conquest, the bikini-clad spy Goodnight, who has been lusting after him for the entire film, but doesn't get her "Oh James!" moment until the end.

I've seen Goldfinger several times since that first viewing in 1979. Golden Gun I have not revisited in 35 years, yet still those details pop out to me, most likely because of the guilt I felt, even then, for enjoying such a misogynistic bit of entertainment.

Last night, seeking some light entertainment, I turned to the DVR and a recording I had made months earlier: From Russia with Love, long hailed by film critics as the best of the early Bond movies for the depth of its characterizations. I watched maybe half an hour of it, and had to turn it off. In that half hour, I saw Bond casually bed another throwaway girl who was frantic to have him, saw two gypsy women have a catfight over a man, and saw Bond tasked by their chief with ending their conflict by, apparently, having a threesome with them. I don't know if it actually came to that, because I turned it off during the setup for this scene.

I've seen every James Bond movie, even the tiredest, tritest Roger Moore episodes, and over the years, I've observed a gradual evolution on most counts: technology, politics, diversity at MI-6 headquarters. The sexism bordering on misogyny, though, has proven especially stubborn. While casting Judi Dench as M was a welcome change, and led to some of the most dramatic moments in the Daniel Craig reboot, these most recent films still feature subplots in which Bond's reckless disregard for the consequences of bedding a villain's girlfriend is breathtaking. The superspy's body count doesn't stop with those he turns his pistol on (including, in the very first film, Dr. No, the assassin he shoots in cold blood after having his way with the woman who set him up); there's plenty of collateral damage, from the woman Goldfinger causes to suffocate by coating her in gold paint to the torture shooting of Severine in Skyfall. Whenever one of these deaths occurs, Bond is angered by it, but appears to feel no remorse for putting a woman in such a position. It may strengthen his resolve to use his license to kill to exact revenge on the villain, but that was going to happen anyway.

This aspect of the Bond universe is heightened by those few women who manage to pierce 007's confirmed, but active, bachelorhood and worm their way into his heart. James Bond is an old-fashioned rake: he may be willing to bed anything in a skirt, but he's no adulterer. Anytime he gets close to settling down, we know this woman is not long for this world. The only time he's permitted to marry--On Her Majesty's Secret Service--his wife is gunned down, in the wedding limousine, by arch-villain Blofeld.

What, then, are we to make of this beloved character whose career is littered with the corpses of beautiful women who had the bad luck to give in to his charms? Should we consign him to the dust heap along with the Lockhorns, Andy Capp, the Honeymooners, and every tired sexist bit of humor or drama that populated television up to the 1980s? Is there any way to cleanse misogyny from the Bond franchise?

There's been talk lately of casting Idris Elba as the first black Bond, but I have to say that's just not good enough. The only way I can see to save the Bond universe from more than half a century of dehumanized femininity is to do with it what Marvel is about to do with its most virile superhero, Thor: total gender reassignment.

That's right. If 007 is to remain relevant in an age that takes misogyny far more seriously, in which there is at least an even chance that the most powerful man in the world will soon be replaced by a woman, then Daniel Craig needs to hand the franchise over to a woman. Let 007 be known as Jane Bond, and maybe this beloved hero can be set back on a track with a future. Otherwise, I expect we'll see more and more people having the experience I had with From Russia with Love, and voting with their remotes.

Thursday, July 24, 2014

Putting Capital Punishment to Death

Why do we kill people who kill people to prove that killing people is wrong? --Holly Near

I know you've seen it, probably on a bumper sticker, possibly on a button, perhaps a banner, a sign, a poster. I've seen it many times, on all those things, but not until tonight did I, thanks to the magic of the internet, learn that the words I quoted came from a folk singer named Holly Near. Now you can attribute them, too.

I've seen those words so many times they strike me as a cliche, but even so, every time I encounter this quote, I nod. It's so obvious, so plainly true, that I'm stunned the most democratic nation in the world hasn't figured it out. If murder is wrong, then so is killing murderers. And yet we keep doing it.

There's a strong possibility, though, that capital punishment may finally be coming to an end. Not because of the dozens of exonerated prisoners, innocent people who could have been put to death for someone else's crimes if not for the diligence of researchers. Not because those exonerated prisoners probably represent just a fraction of the number of death row inmates, whether still alive or put to death, who were innocent. Not because there is a huge numerical imbalance between persons of color and white people on death row. Not because keeping a convicted person on death row is astronomically more expensive than simply locking him or her up in prison. Not because the appeals process can take decades, and has to, to minimize the possibility that an innocent will be put to death. Not because all of this puts the families of victims through vastly more hell than a life sentence would. And not because it is simply wrong for the state to be in the business of killing people, however horrible their crimes.

What's bringing capital punishment to the brink of unconstitutionality is lethal injection, the method that was supposed to address all the concerns liberals had about how inhumane it is to put a person to death by hanging, electrocution, gassing, decapitation, or shooting. It appears that this method may, in the final analysis, be the cruelest of all.

Let's start with the paralysis drug, the second part of the standard three-drug cocktail to be administered, the one that spares witnesses from watching a dying human's convulsions. This drug does nothing to alleviate pain. It simply makes it invisible.

Go from that drug to the other parts of the cocktail, drugs which have, in several recent cases, led to lengthy, excruciating deaths on execution tables, most recently in Arizona, where an inmate took almost two hours to die. The most effective drugs are no longer available for these purposes; the drug companies refuse to sell them to prisons, and the states have had to go elsewhere in pursuit of legally sanctioned poison. The alternatives they've come up with have resulted in the ugly revelation of just how inhumane lethal injection really is. Unable to pretend any longer that this manner of execution is peaceful, this nation may be left with no more options save one: incarceration for life.

I've written many times over the years about this issue. The first time I saw print, in fact, it was in a letter in the Oregonian about the death penalty, in which I made the same case I'm making now: forcing a murderer to live with the knowledge of what he or she has done, and live with it for decades, is a far greater punishment than execution. It also opens up the possibility that this person may discover remorse, find redemption, and leave this world with a kind of peace not afforded by being strapped to a table and filled with lethal drugs.

The standard response to the arguments I've made has been, "You'd feel differently if the victim was someone you knew." And that's where I drop the bombshell that I have, in fact, known a murder victim.

His name was Newt Aschim. I believe I've told his story before, but I'll bring it up again. He was 80 years old and a kick in the pants. I became his pastor in July, 1995, and quickly decided he was going to be my favorite parishioner at the Amity United Methodist Church. Newt was a hard worker around the church, a devoted father and grandfather, a man who was both warm and contrary. He made church meetings fun--something very few people can do. And he made me, a recently divorced single father struggling to understand what it meant to be alone, feel welcome and appreciated.

I only got to know him for three months. One night in October, a drunk wandered into the Aschims' home. Newt challenged him verbally, told him to get out--and was beaten into a coma. He lingered for two months, and died on Christmas day.

I sat with the Aschim family at the trial, and was disgusted by the defense attorney's closing argument: this man should not be held accountable for his actions because he was intoxicated, and as such, was not in control of himself. That was enough for at least one member of the jury. He was acquitted of second degree murder, and found guilty of first degree manslaughter. The judge gave him the maximum sentence, and since this was his third strike, he would be serving the entire time: nineteen years. If he's still alive, he'll be getting out sometime next year.

I remember how disappointed the Aschims were not to hear the word "murder" in the conviction. It wouldn't have been enough to bring the death penalty--the charge was second degree because the intoxication implied a lack of premeditation--but it would have confirmed for them their belief that this old man's life mattered, and that his death was significant enough to merit serious consequences for its perpetrator.

And yet, this man, who was in his 20s when he committed this violent act, forfeited his youth. When he emerges from prison, he will be middle aged. He will be beginning life in his late 40s. The world he knew when he went in is gone. That's if he survived prison.

I can't speak for the Aschim family. In fact, I haven't seen any of them since I left that church in 1999. If Newt's wife, Doris, is still alive, she'll be nearly 100, and she's really the only member of the family I got to know. I expect some of them would have been happy to see this man die--though if he'd received that penalty, he'd probably still be on death row, going through the interminable appeals process. I don't know whether they found closure in the conviction for a lesser crime, or in the strict, but not overwhelming, sentence. I expect most of them would probably rather he spent the rest of his life behind bars.

I can say for myself, though, that I'm glad he wasn't put to death by the state, that instead he has had to live for almost two decades with the proof, all around him, that he broke the social code so flagrantly that he had to be removed from society, that he has blood on his hands that can never be washed away. I'm glad that, when he gets out, he will know every day that he threw away his youth. I hope he got clean while he was in there. I hope he stays clean once he's out. And finally, I'm glad that he has the chance to make his peace with whatever higher power he believes in, and to somehow make amends once he's out.

I don't know that he has made peace. I don't know that he'll do anything to make amends. I'm just glad that he has that chance, a chance he would not have if he were strapped to that table and forced to endure a few minutes--or a couple of hours--of pain leading to oblivion.

Killing him would not bring Newt back, and it wouldn't prove to anyone that killing people is wrong. It just is, no matter who does it--a criminal, a soldier, or an executioner.

Monday, July 21, 2014

Bionic Me

I loved this when I was in junior high. But I never imagined I would BE this.

Half my mouth is still numb from this morning's dental ordeal. The occasion: whittling down a problem tooth to a nub so an artificial replacement can be glued onto it. This will be my fourth crown, making a complete set for the fifth tooth back in each quadrant.

Last month, I visited my audiologist to get my hearing aids tuned up, the better to distinguish what people were saying to me when I was in Ghana. I also had the last of my pre-Africa vaccinations, and picked up my malaria prophylaxis, making me immune to all sorts of creepy crawlies.

And, of course, in January space age lasers reshaped my corneas so that, for the first time in almost half a century, I can see the world with my own eyes.

Add to this the portable computer I carry around in my pocket, and you can see that I really have become bionic, which, according to Dictionary.com, means "utilizing electronic devices and mechanical parts to assist humans in performing difficult, dangerous, or intricate tasks, as by supplementing or duplicating parts of the body."

But wait, you're thinking, probably with reference to the 1970s TV series The Six Million Dollar Man and The Bionic Woman, don't those parts have to be attached to or implanted in your actual body? (Just in case you don't remember, or weren't born yet, the main characters in these shows had suffered horrible accidents and had three limbs each plus either an eye or an ear replaced by electronic prostheses that were better, stronger, faster than the original fleshy versions.) Well, no, not necessarily. The remote-control graspers used by nuclear engineers to handle radioactive material are bionic; and in fact, using the definition above, so is the car you drove to work this morning. But I'm going for a more specific and personal definition here: technology that enhances the body and mind that nature gave us.

Technology is replacing my teeth with porcelain/metal hybrids that will look better and last longer than the genetically flawed originals. The same goes for my eyes, misshapen by nature to be, by the time I was twelve, virtually useless without corrective lenses. The tiny devices I put on my ears digitally enhance frequencies I've long had a problem hearing properly. And the iPhone that is rarely more than a few feet from me not only supplements the cluttered and inefficient memory bank between my ears, it makes the knowledge of the entire world available to me in seconds. The 53-year-old me, transported back to 1974, when The Six Million Dollar Man premiered, would be wearing the same coke-bottle glasses I had at age 12, would have a clunky microphone hanging around my neck, would have four teeth missing from my mouth, and would have to carry an address book to jog my memory, as well as spend hours in university libraries to dig up a fraction of what Siri can tell me in seconds. Take me back fifty years before that, and I'm using an ear trumpet, and it takes me a full day to get to the university library because the highway infrastructure hasn't been built yet.

I could keep going back, but the point should be clear by now: technology that my 12-year-old science fiction nerdy self could only experience through novels and TV shows I now take for granted. Looking out at the roses in our flower bed, I don't even stop to think that a year ago I could only see them by virtue of plastic disks I inserted in my eyes every morning. Chewing my food, it never crosses my mind what it would be like to do that with large gaps in my teeth. And for two weeks in Ghana, I felt like part of my brain was missing because I couldn't access the internet from my phone.

It gets deeper than this: I know two people roughly my age who have artificial hips. This is a technology that didn't exist prior to World War II. So go back 75 years, and these people could expect to spend the rest of their lives hobbling about with canes or consigned to wheelchairs, instead of running, skiing, dancing, or whatever other active-middle-aged pursuits they actually engage in. I have a neighbor with an artificial leg. I've seen her walking her dog, and her stride looks as natural as my own. Two years ago, we had our first Olympic Games featuring a runner who had nearly been disqualified because his two artificial legs allegedly gave him an unfair advantage over the biological limbs of his rivals.

And then there's the matter of the singularity. The interface between me and the internet remains my fingertips and my voice, but the time is coming when people will have a direct connection between their brains and the Web. There are already wearables that project data onto spectacle-like screens, and soon these will be supplemented with smart contact lenses. At some point, implants will erase the division between human thought and the Cloud. Homo sapiens is inexorably being supplanted by Techno sapiens (a term I borrow from Slate magazine). And then, who knows? Immortality? At what point does our consciousness outgrow its need for a biological component?

In the profound words of Keanu Reeves as Neo in The Matrix: "Whoa..."

Sunday, July 20, 2014

A Kinder, Gentler Action Hero


James Garner is dead at 86.

As an actor, Garner played wise rascals who got out of fixes using their wits. In his iconic television roles as Bret Maverick, a gambler in the Old West, and Jim Rockford, a private investigator in 1970s Los Angeles, Garner created characters who preferred charm to gunfire. Rockford pointedly left his gun at home.

Reminiscing about The Rockford Files, I found myself nostalgic for another icon of my childhood: the G.I. Joe Explorer line. As the Vietnam War heated up, Hasbro found demand for military action figures was dropping off, and revamped its most famous toy line. G.I. Joe toys no longer wore uniforms, and the names of the toys emphasized the adventures in which they and their lifelike hair were engaged. I had the space explorer, which came with a scale model Mercury capsule my silver-suited astronaut Joe could ride in. My brother Stephen had (appropriately) a sea explorer, with red hair and beard, knit cap, and inflatable raft.

Jim Rockford and the G.I. Joe explorer inspire nostalgia in me for a time when adventure didn't have to be about body counts and explosions. Rockford still got into physically dangerous situations, and the stunts he pulled off during car chases were legendary; and some of the G.I. Joe explorer toys came with shoulder holsters and pistols that fit (awkwardly) in their rigid hands, but the focus was elsewhere. Rather than being the whole reason for the story, action was simply a part of it, something that happened in the third act; and when we played with our G.I. Joes, we took them on expeditions rather than campaigns.

I've often wondered why, as charming as he was, James Garner did not have a regular television role after Rockford went off the air in 1980. I suspect the reason is that, in the wake of the Iran hostage crisis and with Ronald Reagan in the White House, America rediscovered the joys of violence, and Hollywood realized there was money to be made in blowing things up. "Ready, fire, aim" became our de facto national motto. In a culture like that, there's no place for heroes who leave their guns in the cookie jar.

Still, it's instructive to think back to a time when, for a few short years, America looked at what it had become and backed away. How long will it be before we again realize that bullets make more problems than they solve?

Born That Way?

News from the world of science: conservatives (and liberals too, let's be inclusive here) are hard-wired to hold their political beliefs.

According to the research quoted in this article, conservatives have a much higher "negativity bias": stimuli they find annoying, disgusting, or threatening cause them to shrink away. The article speculates on possible evolutionary roots of this tendency, notes that this distaste for the other actually creates happier people who are more content with the status quo (while liberals tend to be more neurotic), and concludes with a plea for liberals and conservatives to stop trying to convert each other and make peace so Congress can go back to getting things done.

My first reaction to these ideas was simply this: Vindication! I've been saying something like this for decades. My experience of butting heads with conservatives long ago taught me that arguments are only persuasive to people who are already sympathetic to the ideas being propounded. Conservatives, who find contentment and security in preserving the way of life that makes them happy, and are physically repulsed by change, begin putting up barricades the moment new information starts to be presented. It's not just skepticism at play here, it's fight-or-flight survivalism: don't you mess with my happy place.

I first encountered this quality in conservatives in college, the place where I began to realize that my own inclination is to be intrigued by the very speculations that drive conservatives to cover their ears and scream the ABCs. My group of friends was diverse in many ways--race, ethnicity, religion, politics--and we frequently got into heated discussions of the topics that came up in our classes. While some of us were able to consider ideas contrary to our beliefs without abandoning those beliefs, there was one whose reaction was like the child who flips the game board over when things don't go her way (which she literally did once, during a Risk game): if you're not going to accept my world view, I refuse to play your verbal games at all. She lacked the arsenal to counter our assaults effectively, and at that point most of us would simply give up. I could not: I needed to explore why she was so attached to these ideas that were obviously (to me) wrong, so I would worry away at them, trying to convince her to open her mind, until she would furiously storm off.

Once I was in seminary, I encountered conservatives who could hold their own in an argument with me and who were just as dedicated to convincing me of my wrongness as I was to changing their minds--a futile endeavor on both our parts. Where my own eloquence proved inadequate, it seemed to me that our professors were far better armed, and even delighted in vivisecting a conservative student while the rest of us watched. In retrospect, I suspect this really just confirmed that student's belief that academia is the devil's playground.

It was ministry that taught me, finally, to treat conservatives with kid gloves, to give up on my hopes of converting them to my ideas. Granted, my profession was to exhort, to argue at length for justice, generosity, acceptance, and the transformation of this world into a new Eden; but really, I doubt I changed any minds that weren't already of a liberal persuasion. The people in my churches may have been older, may have even been Republicans, but just by dint of being Methodists, were already inclined to be socially progressive. They might have been (and many probably were) skeptical of some of my more leftist causes, but that had more to do with ignorance than personality. Any who radically disagreed with Methodist ideas about tolerance had long since migrated to a church that honored their distaste for social change.

By the time I left ministry, I'd learned my lesson: evangelism is a joke. People are predisposed to be conservative or liberal. Conversions do happen, but they have far more to do with something happening in a person's interior life than with the power of an evangelist's message. That's why I long ago gave up the time-honored theological art of prooftexting: however much a fundamentalist may claim to be a Biblical literalist, and roll out text after text in defense of his or her antiquated ideas, it's never really about the Bible, and no contradictory text, not even the words of Jesus, will change those ideas. And God forbid one suggest that maybe the Bible isn't crystal clear on that topic, that just maybe it should be put in its historical, cultural, and literary context.

So I quit trying. It's not about the ideas, I realized, it's about the person. There's no changing that mind; better to look for common ground: hey, we both have kids. Are yours struggling with being bullied as much as mine? Wow, it must have hurt when you lost your home in the financial crisis that cost me my job. Maybe the corporations aren't our friends, after all.

And this is where I depart from the ideas in this article, because I have seen people change their minds. I've known individuals who were absolutely convinced homosexuality was a perversion, and that homosexuals deserved whatever abuse was heaped upon them, and watched those people's hardened hearts begin to soften, seen them come to realize that gay men and lesbians are also human beings, that they deserve better treatment, basic rights, and finally accept them as neighbors and friends. Conversions like this don't happen overnight--my own took about a decade--but they do happen.

They don't happen from arguments and debates, though. Conservatives change their minds because of personal experiences: one of their children comes out. A Mexican family moves into the neighborhood. A decade of droughts forces them to sell the family farm. The most profound of these experiences come through relationships, from meeting and getting to know people who are different, who are other, and yet share so much common humanity that it's impossible to treat them as dangerous aliens.

This is what makes the university experience so important. Most conservatives grow up, as I did, in monoculture communities, surrounded by people who are just like them. As averse as they are to differences, they are likely to choose colleges that extend that sameness. This is unfortunate, even tragic, because of all the experiences people can have, college is the one that is most likely to begin the opening of their minds. In an independent university, a young person cannot help but meet people from other places, people whose faith, ethnicity, language, and politics are radically different from what they grew up with. And they don't just meet them: they befriend them. Taking a class together, studying together, eating meals together, and, the most important part of the college experience, having down time together, one cannot help but discover the shared humanity in people one once thought to be aliens.

I first got to know Catholics, African-Americans, Asians, and gays and lesbians at university. And I didn't just meet them. They became my friends. In befriending me, they broke open a conservative shell within me that wanted to stick to my own kind. More than any courses, any professors, this is what changed me into the person I am today.

My bottom line on the personality theory of political persuasion, then, is this: it does make sense at a very superficial level, and my experience of arguing ideas bears it out. Fundamentally, though, I believe it's flawed: we're really not that hard-wired in the long run. Take a soft approach, treat a conservative like a fellow human being, build relationships, share commonalities, and you can lay the seeds for a gradual conversion that will blow your mind.

Friday, July 18, 2014

Summer Wasteland

For a single parent with a low income, summer is hell.

My first stint of single parenting was from 1995-96. I was a rural pastor, on minimum salary, with very few resources to draw on, and my children were very small: 2 and 5 at the beginning of the divorce, 4 and 7 by the time I remarried almost two years later. During the school year, they were with me weekends and holidays. During the summer, I alternated entire weeks. Early on, I discovered a painful reality: on my parenting weeks, I had no freedom. I could only visit people who enjoyed children. I had to do all my office work from home. And I could forget about exercising. They were just too small to be by themselves, and I didn't know my new congregation well enough to ask for babysitting help.

Fortunately, rural ministers enjoy far greater flexibility than practitioners of almost any other trade or profession I'm aware of. I survived my first summer largely unscathed, and by the second, I had gained enough experience and confidence to fare far better on my parenting weeks. Three years later, when I became a single parent once more, the summer question was moot: the kids were old enough to go to camp. I also didn't see much of them the first summer of the second divorce for reasons I will not go into in this space. The summer after that, I was no longer a minister, and had all the flexibility I could ever want. Ever since, my summers have been blessed with an absence of work responsibilities, whether it was because I was without a job (2000-2002) or had completed my reentry into public school teaching, and wasn't working summers, anyway.

Still, I look back on the summer of 1995 as an especially hard time for me and my children. Apart from adjusting to the divorce, there really was no way to adequately juggle my professional and parental responsibilities. Had I held a job that involved reporting to a workplace during office hours, I would have been in dire straits, and I knew it. I felt fortunate to be able to work around my children's demanding schedule, rather than having to fit them into mine.

This gives me an extra dose of empathy for Debra Harrell, the South Carolinian mother who was arrested, and had her daughter placed in foster care, for letting the child spend the day in the park while Ms. Harrell reported to work at a McDonald's a mile and a half away. The nine-year-old child had a cell phone, but was otherwise completely on her own. At lunchtime, she walked back to her mother's workplace to eat, then returned to the park for the afternoon. This apparently happened for several days in a row before a concerned citizen called the police.

One can speculate on the racial aspects of this story (Ms. Harrell is African-American), and note, as does the poorly written TV news story, that the state does have child care assistance programs available to low-income parents. But thinking back to my own single-parenting on a shoestring budget, I'm reminded of how hard it was to think during my parenting weeks. In retrospect, I'm sure there were plenty of surrogate grandparents available to watch my children, perhaps even on a regular basis, not to mention my own parents, who lived just ten miles from the town where I was appointed; but the stress of adjusting to a new congregation, of creating a single-parenting life from scratch while in the midst of a painful divorce, and of simply being with small children 24-7, overloaded my thought processes to the point that I just could not think clearly.

The experience of having too many stresses in one's life to be able to think clearly is called bandwidth poverty. The concept is simple: the more survival challenges the brain must cope with, the less it is able to engage in higher-order reasoning. Parenting without a partner is a huge stress all by itself: there's just no time to oneself. I came to treasure nap time, the only time all day that I could concentrate on writing sermons, making phone calls, planning worship services, not to mention taking a little time for myself. Now take that stress and compound it with the task of working at a minimum wage job without child care. If it were me, I suspect I'd have to have a social worker sticking a pamphlet under my nose and reading the information about child care to me, then helping me fill out and submit the forms, before I'd be in a state to apply for such a thing. There's just too much on this woman's plate for her to think clearly.

And now to that terrible bit of parenting: leaving her daughter at a park all day. The summer I was 10, I had very few summer programs to participate in. Rural Filer, Idaho offered a few arts and crafts classes to children, and I played little league baseball, but for the most part, my younger brother Stephen and I were on our own. Our mother had her hands full with the two other boys in the family, who were aged 1 and 5. So we rode our bicycles (no helmets!) all over town, played in vacant lots, went to the library, hung out in the youth room at our father's church, met up with friends, used the playground equipment at the elementary school, all of it without supervision. We did spend some time at home--we had a swing set and sandbox in our back yard, and were both voracious readers--but whether at home or on the far side of town, we were completely on our own, setting our own agendas, playing our own games, coming home for lunch and then heading back out for another adventure.

Our parents thought nothing of this. Now, if we were out too late, there would certainly be trouble--I remember one evening when my brother Jon was so late coming home after school that my mother became extremely distraught--but such things hardly ever happened. The one accident I had occurred in our own back yard, when I broke my arm being stupid on the slide. We were all schooled in "stranger danger"--there's nothing new about this variety of paranoia--but in truth, such incidents were so rare, and our parents knew this, that they lost no sleep over it. So our summers were magical times for practicing independence.

A lot has changed since 1971. Parents no longer feel secure letting their children roam around town. Those who can afford it send their children to camp, whether it's day camps they are dropped off at and picked up from, or sleep-away camps they stay at for entire weeks, or even months. Camp is a wonderful experience, one I was fortunate to have as a teenager, and I don't begrudge at all the structured, mediated approach to giving children some independence from their parents. But it's expensive, and there's no way a minimum wage worker like Debra Harrell could afford it.

Some schools in low-income neighborhoods offer day programs for children during the summer that provide them with meals, activities, and literacy and math coaching, but for the most part, these children and their often single parents have nothing to fall back on. If they're lucky, there may be extended family members who can care for them while their parents are at work. If they're not, they may well find themselves alone in an apartment, left to their own devices.

Just as I was.

This country needs to take a long, deep breath, and think about what we do to our children when we project our darkest fears about stranger abductions on them. Children are three times more likely to be abducted by a family member than a stranger, and vastly more likely to be injured or killed in the back seat of their parents' car. The simple reality is that the world is nowhere near as scary a place as modern parents think it is; in fact, it's actually a much safer place than it was in our own childhoods. Bicycle helmets, child-proofed playground equipment, proliferating pedestrian crossings, and everywhere, including in the hands of even poor children, cell phones. And don't even get me started on how quickly a random adult will call 911 if he or she believes some strange child is at risk by, say, being alone at the park for too long. Yes, we need to lighten up and let children have adventures away from their parents' all-seeing eyes.

Of course, we also need to start supporting low-income working parents with affordable child care, better wages, and benefits, but that's another story. The moral of this story is simple: get a grip, and let kids be kids.

Thursday, July 17, 2014

California(s), Here I Come!

Six Californias map
Fearless blogger to the rescue! All twenty of my followers will, no doubt, rise to the occasion to turn back this scourge!

In the latest bald-faced evil Republican plot to steal the White House from the popular vote, Tim Draper, a Californian billionaire, has gathered over a million signatures on a petition to divide the state of California into six smaller states, thus diluting the impact of the huge liberal urban vote and tipping the balance of the US Senate solidly in the direction of the GOP.

It's gerrymandering on a macro scale, and it's hard to imagine anyone but the petition signers and the handful of conservative billionaires living in California voting for it--to say nothing of the challenge of getting it through Congress, which must approve changes to state borders by far larger majorities than the Republican party can muster. Even so, one has to give a nod of respect to the sheer audacity of the scheme. Claiming that this is in the best interests of California's citizens is almost as brazen a lie as the pretense that it is anything other than a power play by the haves to wrest away what little clout the have-nots retain.

It's wholly consistent with the dirty tricks Republicans have been bringing to bear in the years since the theft of the 2000 Presidential election: carving out Congressional districts so crazily drawn that there is no geographical sense to their boundaries, disqualifying minority voters by demanding identification that is difficult for them to produce, closing polling stations during the hours most conducive to minority participation--creating obstacles to true majority rule that haven't been seen since the days of Jim Crow.

The thought of tearing apart California is revolting for a lot of reasons. This was my birth state (San Jose, 1961), and while I only lived in it during my toddler years, I've been back plenty of times. Oregon is my home, but I feel a deep affinity for the big state to the south. It's a state of diversity: racial, ethnic, religious, political, cultural; a state whose largest cities bear Spanish names while maintaining identities unique from each other; a state of magnificent mountains, searing deserts, rugged coastlines, crystal beaches, farmland that feeds the nation, vineyards whose wines rival those of any other region in the world; a state that has so perfected the presentation of American culture that its iconic images and products are internationally ubiquitous. Breaking up this state that is, well, not a microcosm but at least a midicosm of America would be a travesty. Had I a vote in whatever election may result from this plot, it would be cast in strident opposition.

I can understand where the desire to do such a thing comes from. California's reliable electoral votes for the Democratic Presidential candidate are a thorn in the side of Republican schemers. I feel the same way about the reliably homophobic votes commanded by the United Methodist conferences of the South, which have kept my denominational home from making a single advance in gay inclusion over the last thirty years. And I admit I've often speculated (in this blog, even) that the answer is division: who needs the South, anyway? But that's just church politics I'm fulminating over. In this case, we're talking about an entire state which, despite a few thousand cranks in the far northern reaches who'd like to break away (along with Oregon's Rogue Valley) to form a new state called Jefferson, is both powerfully diverse and unified. No other state holds together the opposing forces of urban and rural politics as California does. If the Republicans want to nab some Californian electors, they need to craft a platform that works for a majority of Californians, and have it represented by a candidate who looks more like a 21st-century American than the old white guys in suits who keep heading the ticket. That platform needs to acknowledge the growing gap between rich and poor, and address it in a way that actually works for the poor, rather than continuing to throw more and more money at the wealthy, who already have more of the stuff than they know what to do with.

Of course, ranting about this from Portland, to which so many Californians have fled, is a futile endeavor. The people who can make this go away are all down there, not up here. But if you've got family or friends in California, I suggest you talk with them about the "Six Californias" idea, and get them on board with the resistance. California is a big, bright, crazily diverse state. Let's keep it in one piece.

Tuesday, July 15, 2014

Secular Pronouncements


Good news for atheists wanting to preside over weddings in Indiana: you no longer have to get ordained in a fake internet church before you can sign the papers!

I'm thrilled by this development for a variety of reasons, which I will tick off with bullet points:

  • Contrary to popular belief, weddings have not always been the province of the church. Prior to the Middle Ages, in fact, weddings in the Roman Empire and its successor states were civil events. In ancient Judaism, there was no priestly role at all: the wedding was consummated--made official--in the marriage bed. Only with the decline of literacy did marriage come into the church, and it did so literally through the back door. Typically the only person in town who could read and write, and thus keep records, was the parish priest. This led to the back door of the church becoming the place where announcements were made, contracts were sealed, and any other business requiring a notary was conducted. Once marriage vows had been taken at that back door, though, it made sense to go on in and have a service blessing the couple. Thus began the evolution of the church wedding. But to reiterate: it's a historical accident that this contractual relationship, which continued to have civil status, ever took on religious connotations; and requiring that weddings be performed by clergy (the requirement that gave rise to internet ordination mills) is a needless, and ultimately unconstitutional, intrusion of the state into the religious status of couples.
  • Of all the duties clergy are expected to perform, weddings are the most odious. Balancing the expectations of the bride, her family, the photographer, the wedding planner, and (sorry guys, but your interests almost always come last here) the groom and his family with the desire of any pastor to maintain theological integrity can be a nightmare. While I will admit to having some wonderful times preparing couples for marriage and performing their ceremonies, I've also got stories I could tell you that would make your toes curl. Some of them are funny. Mostly they just remind me how relieved I am not to have this job anymore, and what an improvement it would be if churches could push these events back onto the street.
  • In the last two years, I attended two weddings and officiated at a third. None of them took place in a church. The two I attended were officiated by friends of the couples being married, and the one I performed was for friends. Not being tied to a church, or to a book of ritual, meant the entire ceremony could be built around the couple. With a friend presiding, the service could be personalized far better than in almost any wedding I performed as a minister, when I typically met the couple for the first time at their initial premarital counseling session. Having a friend perform a wedding, whether or not he or she is ordained, makes vastly more sense than going to a perfect stranger, however well qualified, however well-respected by the bride's or groom's parents; and for couples who have no church affiliation (as the vast majority of young adults do not), it makes even more sense to choose a place of beauty that has significance for the couple, however lovely their parents' home church may be. I found all three services moving, and believe the couples will have far more indelible memories of the occasion than if they had shopped around for a church building, met with the pastor a handful of times, had their service, signed the papers, and never entered that building or met that pastor again.
  • For the friend a couple chooses to preside, the requirement of ordination in some made-up Universal Life Church is not only a silly thing to ask of an agnostic or atheist; it also cheapens the meaning of ordination for those of us who devoted many years of our lives to achieving it. It took me a full decade to become a United Methodist elder, a struggle that ultimately cost me a marriage and, I must admit, my faith in both the church and its doctrines. That someone can fill out a form, click a button, and be placed on a par with me and my fellow legitimate ordinands is appalling. Let's do away with this requirement and put these sham religions out of business.
  • Desanctifying the institution of marriage and wresting it from the clutches of religion can only help in the struggle to place committed same-gender relationships on a par with those of mixed-gender couples. Nearly all the resistance to marriage equality comes from conservative Christians, acting as if they have a monopoly on the ceremony and the institution it symbolizes. Once the requirement of ordination is removed from the equation, the notion that one segment of the population should be able to define marriage for the entire nation will lose what power it still has.
And now a story: in 2000, I moved into the Peace House, an intentional community of United Methodist activists located in northeast Portland. The house parents of this place (there really is no term that better describes their roles) were, and still are, John and Pat Schwiebert. John, like me, is a United Methodist elder, while Pat is a layperson, a registered nurse, and the only straight member of Portland's Metropolitan Community Church. The house has served as an AIDS hospice, and John and Pat have been active for decades in the movement to ordain sexual minorities in, and bring marriage equality to, the United Methodist Church. Soon after I moved into the Peace House, John told me that Pat would be performing a same-gender wedding. "Is that legal?" I blurted out before realizing two things: 1) since the marriage would not be recognized in Oregon anyway, it made no difference whether the officiant was ordained or not; and 2) as a layperson, Pat was at no risk of being put on trial by the United Methodist Church, which continues to forbid its clergy (like John and me) to perform any kind of union ceremony for a same-gender couple. This made Pat the perfect person to preside over this wedding.

Fourteen years later, marriage equality has finally come to Oregon, though with the ordination requirement still on the books, gay couples are in many cases having to turn to justices of the peace, pastors of churches other than their own, or friends willing to be "ordained" by an online "church." In this most unchurched of states, it's high time we follow the example set by Indiana, and get weddings out of the church and back where they belong: in the community that will support the couple in their life together.

One final note: in two and a half weeks, Amy and I will be solemnizing our "mountain marriage" at a spot we just picked out on Leif Erickson Drive, a trail through Portland's Forest Park. For our officiant, we chose a friend and fellow improviser, Scott Simon. We've asked our closest friends and relatives (including the Schwieberts) to be present. No papers will be signed, and nothing will be entered into county records, but in every other respect we will be making promises to each other and sealing them with rings and a kiss. Nine days later, we'll have an all-day reception on our patio. This is the kind of wedding that's only possible apart from a church: uniquely personalized, significant to us in ways it could be to no other couple, honoring our individual and mutual identities.

Here's hoping more and more weddings will break the bonds of religious expectations, and take place wherever they mean the most to the people being married, presided over by someone who knows them far better than any cleric ever can.

Not Our Children



"Not our children, not our problem."

The sign was seen at a protest in Murrieta, California, according to reporter Bob Ortega, speaking with Brooke Gladstone on NPR's On the Media program last Friday. The moment I heard those words, I knew I had to write about them, that they would haunt me until I did. So here I am.

The occasion for that sign was a blockade. Three buses of undocumented children were on their way to a Border Patrol processing station in Murrieta. The demonstrators managed to keep the buses out of Murrieta, but not out of the United States: the buses were simply diverted to a different processing station in Chula Vista. One other accomplishment of the protesters was giving Murrieta a black eye, as the publicity led the town's mayor to write an open letter to the White House apologizing on behalf of his community. It also delivered a message to the children on the buses, the message of the sign Bob Ortega saw (full disclosure: I Googled that sign for nearly an hour, and couldn't find an image of it): "You're not our children, not our problem, so go home."

Let's set aside the irony of Caucasian-Americans calling Central American Indian children illegal immigrants, and look instead at these three words: "Not our children."

I teach music to children. Last year in my job in the Reynolds School District of outer east side Portland, I had more than a thousand students. The majority of them were recent immigrants to the United States. Many were Mexican, but there were also children from China, Vietnam, Samoa, Russia, Ukraine and, most tellingly, Somalia. I don't know how many of these children were here legally: that's not a question we ask in a public school setting. I do know that the Somali children who arrived mid-year were refugees fleeing the violent civil war in their homeland. Whatever the situation their families left to come to America, these children were, like all the children I've taught (including the middle-class white suburban children), playful, inquisitive, affectionate, precocious, sometimes naughty, sometimes nice, empathetic, sensitive, creative, charming--the list could go on, but it can be summarized in one word: children. Read anything I write about children and it becomes clear I've got a very soft spot for them. It's the main reason I'm so well suited to be a teacher, more specifically, an elementary teacher.

My soft spot extends to children regardless of their ethnicity. I was utterly entranced with the children of Ghana, just as I have been with the highly diverse population of Reynolds kids, and by the near-monoculture of children in Banks, Oregon (at least 90% white). I love children, period.

I realize this is not a universal feeling, however. Growing up in churches and then having several of my own, I encountered many adults who had a low tolerance for children, who preferred them "seen but not heard"--and, in some cases, not seen at all. There were many in these churches who bemoaned the absence of young families, and pined for the days when the Sunday School was bursting with children, but were unwilling to adjust their worship or program in any way that might accommodate young people--particularly young people whose culture differed from that of the traditionalists who so resisted change in these churches.

Then there's the matter of financing school budgets. In my youth, Oregon funded schools locally through property taxes. As school populations grew and school properties needed upgrading, budgets came before voters in the form of a property tax increase. In my memory, these were almost always defeated. Eventually there was a property tax revolt, modeled on California's, that led to such a severe limit that the state had to take over most school funding through income taxes--which, again, have to come before voters, now in statewide elections, and are almost always defeated. If I were a child watching these election results, the message to me would be clear: we don't want to pay for your education.

And here's the saddest part of this equation: the voters turning down tax increases that would pay for schools, the church people resisting changes that would bring young people back to church, the protesters blocking buses of immigrant children, the Tea Party Republicans resisting immigration reform, are almost all older people, grandparents who no longer want to pay for the care and feeding of children who did not issue from their own loins. Not our children, not our problem.

Set aside for a moment that the children on those buses were fleeing drug war violence in Central America, that the children in schools across Oregon are being amassed in larger classes in decaying buildings with fewer teachers, that churches that cannot attract young families will ultimately close. Boil it all down to this: these are children.

When I was a child, I benefited from public education that was reasonably well-funded. The people who paid for that education were property owners, most of them middle-aged and older. As children, they, too, had benefited from public education. Now add in this piece: my great-grandparents were immigrants who benefited from the open-arms policy symbolized by the Statue of Liberty. Whether or not I'm a kid person, it would be ungrateful, selfish, and hypocritical of me to suddenly announce that I'm not going to pay one more cent to welcome or educate children who are not mine.

Because whether or not they're our children, whether or not they're a problem, they deserve far better than this:

Thursday, July 10, 2014

Religiously Exempt


In the wake of the Hobby Lobby ruling, which permits corporations to deny health benefits to employees if providing those benefits conflicts with the religious beliefs of the owners, "religious exemption" has become the stratégie du jour of the radical right to justify bad behaviors for which American society no longer has patience. While the wedge issue is health care, specifically contraception, it's easy to see why LGBTQ activists are nervous about the ruling. Don't want gay people working for you? Claim religious exemption from non-discrimination laws. Don't want to sell your services or rent an apartment to gay people? Religious exemption to the rescue! Want to keep gay people from visiting their partners in the hospital, or having a say in their care? Again, religious exemption. These two words put at risk every gain in civil rights that sexual minorities have eked out of a cluelessly bigoted culture over the last half century.

The US Constitution provides grounds for two levels of exemption: religious institutions maintaining theological standards for employees, and religious employees of secular institutions receiving protection for their beliefs. It is reasonable, for instance, for a church to require that its pastor adhere to a denominational creed, or that teachers in its parochial schools present an accurate representation of church doctrine to students. On the other side of the coin, employees of American businesses have a right to dress in accordance with their beliefs, and to be able to practice the rites of their faith without harassment, as long as such practice does not interfere to a harmful degree with the performance of their work duties.

The passage of the Affordable Care Act has led to conflict over a third level of religious exemption: religiously affiliated institutions that are not, themselves, houses of worship--church-affiliated hospitals and colleges, for instance. The question of whether these organizations must provide contraception in their health insurance benefits has led to a work-around that pleases no one, since the goal of the institutions is to paternalistically keep their employees from using contraception at all. The Hobby Lobby ruling extends this kind of exemption even further, to private corporations whose owners happen to hold religious beliefs that contradict federal statutes. As Mark Joseph Stern explains in a Slate piece on levels of religious exemption, it is not a long step from there to extending this right to individuals.

I have to admit to feeling some sympathy for religious persons being pressed by federal statute into behaviors they consider sinful. Since childhood, I've considered myself a conscientious objector--a person who, if drafted into the military, would refuse to serve due to religious or philosophical objections. Conscientious objection as a status dates back to the 1500s, though it did not become a part of military law until the late 18th century in Britain and the new United States. In any case, for the entire history of this republic, it has been understood that pacifists should not have to bear arms or use them on others, however just the conflict might seem to its perpetrators. This does not mean that conscientious objectors have always been treated well, but at least their right not to fight has been protected.

So I'm comfortable with individuals having a religious exemption to serving in the military. Refusing to bear arms and take lives seems to me a proper use of this principle.

What, then, of religious exemption to justify discriminatory behavior, whether it is in the matter of providing contraceptive health benefits or of according equal treatment to persons one considers sinful?

This is where I have to do some ethical parsing. In the matter of conscientious objection, a person is refusing to participate in a practice that leads directly to harming other people in tangible ways. When it comes to religious exemption from non-discrimination laws, or from providing contraception, we enter the area of protecting one's right to harm others. Granted, such harm is, in the case of discrimination, more a matter of economics and psychology: to refuse service to a gay customer may cause that person distress, inconvenience, and added expense, but no actual physical harm is being inflicted. If it were, assault statutes would come into play. Similarly, a woman denied the use of an IUD can still settle for a less convenient--and less effective--form of birth control, or pay for the device out of pocket, but except for the increased risk of conception, the harm to her is far less than that of firing a rifle at an enemy soldier.

The harm here, then, is at first glance far less severe than that of serving in the military. But again, this is a reversal of the conscientious objection principle: the individual or corporation claiming religious exemption is seeking to protect the right not of abstaining from harming another, but of inflicting harm on another, however much that harm may fall short of actual physical assault.

This argument ignores another factor, though, one that Americans have accepted with regard to racial discrimination and most forms of sex discrimination: psychological harm is still harm. Second-class treatment causes wounds that, while not immediately visible, create scars that may never heal, and may be passed on to future generations. Deny me a job or a home not because I am unqualified, but because of who I am, and I will internalize that rejection. Do it repeatedly, and however philosophically detached I may be, I will sink deeper and deeper into resentment. This will color my relationships with my partner, my family, my children, and it may take generations for them to heal the wound inflicted on me today. If you doubt this, just consider for a moment how much America is still struggling with the scars of slavery and Jim Crow, generations after these practices were rendered illegal.

One final thing about the religious exemption: there is deep irony in hiding behind a church to justify treating others uncharitably. True religion, as I and, I think, most Americans understand it, is about doing good. Using religion to justify harming others, and using the Constitution to protect such behavior, is a perversion of religion that harks back to the Inquisition, when concern for the souls of sinners was used to justify their torture and murder by the church. And yet, again and again in the United States, religion has been used as a shield protecting bigots and abusers from acknowledging and accepting consequences for the harmfulness of their actions. That such reasoning is ultimately counter-productive--who wants to belong to a church that defends bigotry?--is lost on those for whom the hatred of others matters more than the future of their church and nation.

It's probably just a matter of time until someone uses the Hobby Lobby decision to justify denying an apartment to a gay couple. Hopefully that action will throw the whole matter back up to the Supreme Court, making plain just how dangerous it is to create a shield of religious exemption for discriminatory behavior, and motivating the justices to reverse themselves and restore sanity to the American ethos.

Wednesday, July 9, 2014

Divided We Fall



Things are falling apart.

The end of World War I saw the establishment, across Africa, Asia, and even Europe, of new nations created by treaties drawn up by the winners of that conflict. Many of these nations functioned as colonies until the 1950s, when they began declaring independence. The victorious powers drawing up their borders paid little or no attention to the territories of ethnic groups, and thus throughout Africa ancient peoples like the Ewe found themselves divided among multiple nations, often with borders running down the middle of villages. The treaties also brought together groups of people who would not, of their own accord, have formed a state with each other: Serbs, Croats, and Bosnians in Yugoslavia; Christians, Jews, and Muslims in Lebanon; and, in Iraq, Shia, Sunni, and Kurds. Some of these nations were able to establish unity governments, and even to hold democratic elections, but for the most part, they stayed intact as long as they did because of the presence of a strongman in the president's mansion. Once that strongman died or was deposed, the unity of the nation began to unravel.

Yugoslavia is no more. The bloody civil war of the 1990s shattered it into seven states. Beirut has never recovered from the Lebanese civil war that began in the 1970s. Going back farther, British India split at the very moment of decolonization, the Muslim nation of Pakistan breaking away from the new Indian state. In our time, Iraq seems on the verge of dissolving into three different states.

For some, this is vindication: no longer forced to live in peace with their traditional ancestral enemies, they can now form an affinity state, a nation in which all speak the same language, worship the same god, and practice the same customs. For others, such division initiates an identity crisis: the nation they were born into and grew to love, warts and all, under the previous dictatorial government, was blessed with civil diversity. Shia could have Sunni friends, Muslim and Hindu children could play together, Serbs and Croats could be business partners. As long as unity was an essential aspect of nationhood, diversity reigned.

Loss of the dictator, and the arrival of democracy, has sadly almost always meant a shattering of that unity. A Shia Iraqi can say, "I don't want to live in a Shi'ite state! I want to live in Iraq, with neighbors who are Sunnis and Kurds. I want my children growing up in that country, not in the Shi'ite republic of whatever it would be called." That, unfortunately, is where Iraq appears to be headed.

There are tragedies running through every one of these conflicts, tragedies even more poignant than the partition of families brought on by the arbitrarily imposed colonial borders. Live in peace with people for a generation or two, and friendships grow, partnerships are created, intermarriage may even take place. The shattering of these unity states into smaller, ethnic monocultures either breaks apart these relationships or requires one or the other partner to live as an alien in the partner's country. Two weeks ago, This American Life featured a story about Iraqis who are grieving the loss of a community of diversity.

It is not unusual for commentators to brush off this tragedy, insisting that the unity state was a manufactured entity, that only the anti-democratic strength of a colonial power or a dictator kept such states together, and that left to their own devices, these peoples would never have lived in the same neighborhoods.

Be that as it may, many of these nations lasted for nearly a century before coming apart, and that is plenty of time to form relationships that transcend ethnicity.

The United States of America, with its founding (though never official) motto E pluribus unum (from many, one), suffered its own unraveling in the 1860s. As with the post-World War I unity states, the nation had managed to hold together for several generations, only to find itself splitting over cultural, economic, and political differences. Unlike the more modern examples above, the USA came back together in the aftermath of the Civil War, in large part because the chief principle of the winning side was not ethnic cleansing, not even the end of slavery, but reunification. The country then imposed on itself a reaffirmation of its Constitutional values. It took another century for the losing states to genuinely capitulate to those values, but it did happen.

And now it is beginning to come apart, Southern conservatives pushing back against the progressives of the West Coast and Northeast. It is highly unlikely that there can be an ethnic civil war in the United States as has come to pass in the Balkans, as is developing in Iraq, and as is festering in much of the European Union, because diversity has, more and more, become a broadly accepted reality of American life. If Americans divide, it will be on ideological, rather than racial or ethnic, lines.

And that will be a tragedy. There was a time when Republicans and Democrats could be friends, when Texans and Oregonians could find common cause, and it was not that long ago. My college friends ran the gamut of ideologies and were as culturally diverse as any group I've belonged to. I have the feeling, though, that if we were to meet in the same dormitory now, we'd be seeking out affinity groups for Catholics, mainline Protestants, Evangelicals, Republicans, Democrats--and simply not come together.

Our nation is coming apart. Ideologues insisting on political purity are pulling people away from each other. The diversity that has been the strength of America--and just as an aside, it's generally accepted that mongrels are hardier than purebreds--is fracturing. I don't know that we'll descend into the anarchy of Yugoslavia or Iraq, but there are frightening times ahead unless we can find common cause once more, overcome our differences, and out of these many ideological splinters, rediscover the unity that is the name of our country.

Tuesday, July 8, 2014

Republican't



First the shocker: until I was 23, I considered myself a Republican.

You probably find that hard to believe. If you've ever read this blog or had a conversation with me, even if it was when I was in high school or college, you'll knit your brow at this confession. Funny, you never seemed like a Republican. And certainly since graduate school, when I abandoned the party once and for all and started calling myself an independent (I only officially affiliated with the Democratic party in 1991), there's been no question about my political persuasion. Even in college, while still clinging to the GOP label, I found myself leaning socialist.

I blame my parents. In fact, most young people I know adhere to their parents' political ideals up to the point when they are developmentally motivated to declare independence. In the 1970s, my parents were a variety of Republican that no longer exists: socially progressive, really New Deal Democrats at heart. Theirs was the party of Abraham Lincoln, Teddy Roosevelt, Dwight Eisenhower, Gerald Ford--and, until his Watergate dealings became public, Richard Nixon. Ronald Reagan was another matter: in him, my parents saw their party changing direction, walking away from compassion and good sense, turning to war-mongering and tax cuts. So it ceased to be their party, just as it ceased to be mine.

Had my 18-year-old self believed the things my 53-year-old self does, I would never have registered as a Republican. But looking back, I still feel reasonably comfortable with that choice. The GOP of 1979 may not have been a hotbed of liberalism, but at least it had both a heart and a brain.

Which brings us to 2014, when the only organ still functioning in the Grand Old Pachyderm appears to be the liver, which is working overtime generating more bile than this country can handle.

The news item that ticked me off on this particular rant is right here: the state of Idaho will not permit a 74-year-old Navy veteran to have her ashes buried next to her deceased wife. This is one of those small-minded expressions of conservatism that is so blatantly bigoted it's hard to imagine anyone defending it. Yet millions of Republicans do, and will, because they believe that gays, lesbians, and transgender individuals do not deserve the same rights as straight people. Common-sense compassion is trumped by homophobia.

This week has seen the Supreme Court coming down on the side of Hobby Lobby, opening the door for both corporations and non-profits with a conservative bent to deny their employees the same birth control coverage accorded all US citizens by the Affordable Care Act. This is, again, a Republican issue, as is the constant whittling away of abortion rights (which, I must point out, Republicans were just as likely as Democrats to favor back in the days when I was registered with the GOP).

And please don't get me started on gun rights. You know how I feel about that--an opinion I've held since childhood. Just to avoid any vagueness: I'd be very happy if every gun in America were gathered up and melted down.

I'm aware not all Democrats are of a mind on these issues, but they're certainly more likely to be on the side I favor than Republicans. They're also more likely to favor progressive taxation, universal health care, liberal immigration policies, environmental protection, and expanded benefits for the poor and homeless.

It should come as no surprise to anyone who reads this and is startled by my initial confession that I'm a lefty on all these issues, and that I consider the official Republican position on every one of them to be wrong. I'm not alone in this: on almost all these issues, Americans are heavily in favor of the Democratic position. Unfortunately, gerrymandering and the two-senators-per-state clause conspire to create a Congress well to the right of majority American opinion. This can't last forever--at some point, Congress has to catch up with the ethos of the nation--but for now, it is the principal reason we can't pass sensible legislation. It's also why Congress is as despised an institution as it has ever been.

I'd like to think a loss at the polls this November might cause the GOP to wake up, but that didn't happen in 2008, and it certainly didn't in 2012; and besides, all indications are that the idiosyncrasies of this particular election will consolidate Republican power, even as Republican positions represent the opinions of the American people less and less.

Come on, elephants. You're on the wrong side of history. If you're not careful, you're liable to be hit by a mass extinction event.