Thursday, May 30, 2013

That Reminds Me of a Story


"You're going to tell one of your stories! I can't stand to hear another one of your stories!" (Secretary of War Edwin Stanton, in Steven Spielberg's Lincoln)

For a good chunk of my life, I was a preacher.

Preachers use stories to illustrate. It's a practice that goes back millennia. Jesus told many a parable, as did Buddha. The points made by the stories are not always obvious, and in fact, it's often best if there's not a simple, one-to-one correspondence between the story and the lesson. Parables, in particular, are stories that come at a teaching from an odd angle, and may contain multiple lessons, some of them not even intended by the storyteller. The sign of a great story, as with any work of art, whether performed or visual, is the life it takes on independent of its creator. A truly masterful story will have a different meaning for every person who hears it.

This can be frustrating for people who would rather have something simple, declarative, to the point. When someone comes to me with a specific concern, and I respond with a story, even if that story has obvious relevance to the issue at hand, it will come up wanting in the advice department. "What does this have to do with my concern?" I will hear; or, "Are you saying that because X happened to this person you know, I should do Y?" No, not necessarily, maybe not at all. The story was not told to give you a narrative how-to manual for solving your problem. It could illustrate a point I will make if you'll give me a little more time; or maybe I'm just trying to defuse the tension, as is the case in the scene from Lincoln that leads to Edwin Stanton's explosion. Lincoln goes on to tell his very funny story at that point, about Ethan Allen visiting England after the Revolutionary War. The Situation Room bursts into laughter, interrupted by the rattle of a telegraph bringing the war news everyone was anxiously awaiting.

I say all this to demonstrate that, as powerful as stories can be, they are rarely useful as tools of debate. There is nothing logical about an illustration. Its meaning in a given setting depends on far too many variables for it to be a useful tool of argument.

The same is, sadly, all too true of anecdotal evidence: experiences or news stories rolled out to shore up arguments that lack logical strength.

I grew up hearing such evidence. It's my mother's favorite way of settling an argument:

"Don't go to the Gresham Hospital; they'll make you wait two hours, and then overcharge you." Sometime in the early 1970s, we were traveling from Idaho to Oregon to visit my grandmother, and stopped for lunch at a rest area. We had lunch in our travel trailer. At some point, the screen door slammed on my little brother Jon's finger. We rushed him to the nearest emergency room, somewhere in Gresham, and apparently my mother was not satisfied with the experience. Ever since, Gresham is the place that mistreats ER patients. Mind you, I've had similar experiences in almost every ER I've ever been to, regardless of its location. It could also be true that this just happened to be a bad day at the Gresham ER. It might even be the case that changes have been made to ER procedures since then at this very hospital; but because something happened there 40 years ago, that's what this place is like.

I've heard many such anecdotes from my mother. They're always solidly grounded in an experience she had, and they all make the logical leap from "happened to me" to "will happen to everyone else."

We travel now to the year 2013, and the astounding realization that many internet commentators have, apparently, learned to debate at my mother's knee. Witness the following:

The UK: Gun Control = Safety?

“You people will never be safe. Remove your government, they don’t care about you.”
These words rang true on May 22nd 2013 in Woolwich, UK. 20 minutes is how long it took a unit of armed police officers to arrive on the scene of a brutal murder committed in broad daylight by two men. So who said those words?
One of the murderers.
They made no attempt to flee, and they made members of the public watch powerless as they literally butchered a man on the pavement using two machetes and a meat cleaver. Afterwards, they calmly stood around boasting about what they had done. One of them was filmed saying this, whilst holding two of the weapons, covered in blood, with both of his hands covered in blood:
“I apologise that women have had to witness this today,
but in our land our women have to see the same. You
people will never be safe. Remove your government, they
don’t care about you.”
[Video: http://www.bbc.co.uk/news/uk-22633269]
Fortunately, they didn’t attack anyone else, and were taken down when the armed police EVENTUALLY arrived. But who could have known that would happen? They could easily have turned it into a bloodbath, since nobody else in the country can be armed except government agents and criminals. They had a full 20 minutes in which the law abiding public were at their mercy.
In the UK it is illegal to carry ANY object in a public place with the intent of using it as a weapon, offensive or defensive. Never mind the right to keep and bear arms, daring to carry a rock in your pocket is a criminal offence.
America, please take a good long hard look at the UK. When lawmakers want to take your rights away, they will do it slowly. “Just an inch” makes a fuckton of difference if you keep taking inches. Running a marathon starts with the first inch of the track. Less than 100 years ago, Britons could keep firearms for self defence, and the rules were probably less restrictive than in the US at the time, certainly less than the US is now.
Gun control advocates frequently cite the high profile spree shootings in the USA as why our gun laws “work”. Yet they ignore the fact that these ALL took place in “gun free zones”. Don’t let your entire country turn into a gun free zone like the UK.
So there you have it: something horrible happened in England, nobody had a gun to take out the offender, therefore gun control is evil.

Here's a statistic: in 2010, there were 11,078 deaths by gunfire in the United States, or 3.6 per 100,000 population. That's the figure from the Centers for Disease Control and Prevention. That same year in the United Kingdom, there were 155, or 0.25 per 100,000 population, according to the World Health Organization. The rate of death by gunfire in the United States is more than fourteen times the rate in the United Kingdom. So yes, overall, I'd say the strict gun laws in the UK are working just fine, and the US could learn a lesson from them.
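For anyone who wants to check that arithmetic, here is a minimal back-of-the-envelope sketch in Python. The 2010 population figures (roughly 309 million for the US and 62 million for the UK) are my own rough assumptions, not part of the quoted statistics:

    # Back-of-the-envelope check of the gun-death rates quoted above.
    # Assumed 2010 populations (approximate, my own figures): US ~309 million, UK ~62 million.
    us_deaths, uk_deaths = 11078, 155
    us_pop, uk_pop = 309_000_000, 62_000_000

    us_rate = us_deaths / us_pop * 100_000   # works out to about 3.6 per 100,000
    uk_rate = uk_deaths / uk_pop * 100_000   # works out to about 0.25 per 100,000

    print(round(us_rate, 2), round(uk_rate, 2), round(us_rate / uk_rate, 1))
    # prints 3.59 0.25 14.3 -- roughly fourteen times the UK rate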

Unfortunately, we Americans are an anecdotal people. If we used science and statistics to inform our public policy, we'd be spending far more money on schools, infrastructure, and public health than we do on arming ourselves for the next war, whether it's as a nation or as individuals. We'd also stop trying to make the argument that school shootings prove the need for gun control. Statistically, they're a tiny percentage of the overall body count. Much more pressing is the high volume of gun deaths in inner cities, not to mention home shootings, whether they are accidental or brought on by domestic violence. I'd feel much safer with the UK statistic of 0.25 than our gobsmacking 3.6.

Anecdotal evidence features strongly in the continuing clamor against policies designed to alleviate climate change. One hard winter in Washington, DC is enough to convince deniers that it's all a fiction. And yet the ice caps continue to shrink, winters overall grow milder, and one has to be living in a refrigerator to claim that summers are not getting longer and hotter.

A single contradictory case does not make an argument. Bringing up counter-illustrations is little better, as it lends weight to the practice of logical ping pong. When debate becomes binary in this fashion, the better stories win. It doesn't matter if they represent flukes, while the counter-stories illustrate the norm. A couple of creeps in Britain commit murder using machetes. That story literally has a visceral impact. It doesn't matter that the statistical argument overwhelmingly demonstrates the fallacy of the writer's conclusion; you're still left with the thought that maybe a concealed handgun could've put a stop to that incident. That's exactly why the NRA continues, even in the face of the grieving parents of Newtown, to trumpet the need for more guns in public life.

More Americans have died of gun violence in the six months since Newtown than in the entire ten-year invasion of Iraq.

Stories prove nothing. They illustrate beautifully, but they do not, cannot, prove anything. Stories of bureaucratic inefficiency are regularly rolled out to counter arguments for a single-payer health care system, though the same inefficiencies can easily be found in private health insurance practices. Stories of children losing the family homestead are used to defend the repeal of "death taxes," though such "reforms" are generally aimed at protecting multi-millionaires rather than middle class homeowners, whose estates likely fall below the threshold of having to pay any estate tax at all.

They prove nothing, and yet in the minds of those that hear them, they trump many a better argument. So I, for one, pledge to never again use a story to prove a point. Henceforth, all my stories shall be used solely for the purpose of illustration--or better yet, amusement.

I'd do that for free!

December 30, 1999: the day I discovered my favorite place on earth.

The occasion was my second post-divorce retreat to the desert. In December 1995, I drove to Utah to tour the national parks, something I had wanted to do throughout my first marriage but had never been able to interest my wife in. As luck would have it, though, that trip coincided with the Clinton/Congress budget standoff, and all the national parks were closed. I was able to see parts of a few, but only if they were visible from the highway (Capitol Reef, Zion, and on my way home through California, Death Valley). Within days of my second marriage ending, I knew I would repeat the odyssey, and this time I'd see Arches.

On New Year's Eve, I made it. I only had one short winter day to see as much of it as I could. In terms of size, Arches is actually one of the smaller western parks. It's possible to see most of the highlights in a single day, and I set about doing just that, visiting Park Avenue, the Balanced Rock, and the Double Arch (all of which make appearances in "Indiana Jones and the Last Crusade"), with some extended hiking at the Devil's Garden, site of the Landscape Arch. At the end of the day, I hiked up to the Delicate Arch, in my mind the most spectacular natural structure in the world. I was there at sunset, to watch the colors shift from yellowish-orange to deep red. As the sun dipped below the horizon, one of the other hikers began softly playing a hand drum. It was awe-inspiring, solemn, more sacred than any worship service I'd ever attended, whether in a chapel or a cathedral. I made my way back to my car humbled by the ancient beauty of what I'd experienced.


So there I was, back in Utah, seeking perspective on my second bout of involuntary singleness, but with an added cause for self-reflection: the Hillsboro United Methodist Church, of which I had been associate pastor in charge of music for barely six months, was fed up with the sadness divorce had brought on. They wanted me to just get over it, and since that appeared not to be in the offing, they wanted me out.

As I explored Arches, I found myself digging deep into my sense of vocation: did I really even want to be a minister anymore? I was afraid of what would happen if I just quit and I suddenly lost the salary and housing allowance I had come to expect; but fear is not the same thing as vocation. There were many pastoral tasks for which I had no motivation whatsoever, and this had been going on longer than I had been dealing with divorce. Halfway through my day at Arches, I found myself at the foot of an arch like that at the top of this post, gazing up at a natural window that had taken millennia for the forces of nature to create. I closed my eyes to meditate, and felt something emerging from the recesses of my mind: I should only be a pastor if I would be willing to do this work without pay. And there it was, the answer to all my wrestling: I didn't love this work. I didn't even like it. Oh, I loved preaching, mainly because it was a weekly opportunity for me to do improvisational performance art; but in my new position, I preached at most once a month, and people weren't taking to it the way they had in my earlier, weekly preaching appointments. The other thing that could have been rewarding, music direction, came down to one evening of rehearsals a week, and again, people in the choir and praise band just weren't taking to my ideas.

But in all honesty to myself, I knew my dissatisfaction with the pastorate predated Hillsboro by years. Going all the way back to my internship in Illinois, I remembered how isolated and alone I felt most days after Brenda would drive off to Robinson. Forcing myself to go out and visit with my parishioners, I had many wonderful encounters, but it was always hard work. I came to love the people of Seed Chapel, and as undemonstrative as they were, they loved me. They were patient with me as I taught myself the art of manuscript-free preaching. Sometimes I'd forget where I was going, and just have to end the sermon--which was fine with them, since it meant they could beat the Baptists to the good restaurants for Sunday dinner. They took good care of me, just as they took good care of each other, and after eleven months, it was hard to leave. It was a good, growing year.

England was good, too, though often difficult. Visitation again came easy with a few people, but it was difficult with most; and in England, I could exploit my role as one half of a clergy couple, sending out my far more gregarious wife to do the casual socializing I didn't care for. The friends I made in England came to me in my role as Brenda's husband. I honed my skills as a preacher and worship leader, but started falling away from the intentional visitation discipline I had imposed on myself in Illinois.

Returning to Oregon in a pre-depressed state, the year in Medford nearly did me in professionally. I barely got to know my congregation in Talent, let alone the Medford church where I was supposed to spend half my time. In Estacada and Lents, my first year was a time of darkness, as I came to grips with my inner sadness over a failing marriage and career, neither of which I was admitting to myself. After a year, Lents was dropped from my appointment, and I was three-quarter time at Estacada. There I began to explore my depression in sermons, and the congregation identified with me, began to care for me in ways that reversed the pastor-parish role. In the middle of my third year, my tattered marriage finally came apart, and the church and town of Estacada became my support group. I believe I did good work there, providing a voice for the community activists who gathered around that church, but it was despite, not because of, my receding call to ministry.

Amity/McCabe/Sheridan was where my vocation called it quits. Now I was going through the motions, spending far more time than I needed to preparing services and sermons I could have improvised in minutes. I had that aspect of my work down. I can still do this: give me a text and a couple of minutes to think about it, and I'll preach you a sermon, as well as rattle off several hymns that would fit the message. I spent hours in the study, reading, organizing, creating beautiful layouts for bulletins, writing newsletters, doing all I could to avoid the work of visitation. During my third year, the Amity/McCabe parish entered into conversations with the Sheridan church and its pastor, Bert Hanson, over merging all three churches into a single cooperative parish. I'd known Bert for several years, and found him to be amiable, but no great shakes as a worship leader or preacher; and yet, in Bert, I saw someone who had the one essential element I utterly lacked. Bert had left a lucrative career as an industrial chemist to become a lay pastor assigned to a local church, and was on the very long course-of-study track to ordination. And he loved it. Every time I met with him, he talked enthusiastically about how much fun he was having, how he'd gladly keep doing this as long as he was allowed to. I just couldn't see it.

So I started looking elsewhere. I heard there was an opening for a chaplain at Portland State University, and applied for it. The application was similar to the mountains of paperwork needed to apply for ordination, and involved a great deal of soul-searching. I visited the campus of PSU, walked around, tried to imagine myself serving the student community there, and felt a mild stirring. It wasn't a full-blown call, but it gave me hope. When I was turned down without even an interview, I could feel the lid beginning to close on the coffin of my career.

Hillsboro was my last chance. I probably could have made it work. I knew how to do the visitation, had done it successfully in other places, though never to the degree I really needed to. I just didn't want to. Those words don't do it justice: I so badly didn't want to that I no longer cared what happened to me. I was stuck in a job I could not force myself to do, trying to spin out the busy work to stave off the ecclesial ax that was soon to fall on my clerical collar.

I should only be a pastor if I would be willing to do it for free.

Looking back on my career, I know I had met many pastors who, like Bert Hanson, would happily have done their work for free. Some of them grumbled about the low wages and substandard housing, but most were simply delighted to be able to do the work, and receiving a salary was a bonus. After I left ministry, I became part of the Metanoia Peace Community which, for its twenty-five year existence, was able to do all it did (much more than large churches with multiple pastors on staff) because John and Pat Schwiebert voluntarily took poverty wages to avoid paying war taxes, and yet devoted their entire lives to serving the poor and marginalized. John tired of wrestling with Metanoians as the years went by--the community could be extremely argumentative and simultaneously stuck on needing a "sense of the meeting" before making a decision--but I never knew him to covet the much more comfortable lifestyle of a senior pastor at a large church. He would have done it for free.

Please don't get me wrong. I am not saying that ministry should be done for free, though I doubt I was alone in feeling trapped in the profession by my need for shelter and income, and by a sense of obligation to the sheer amount of grief it had taken me to achieve ordination. I believe that most pastors do their work out of a sense of vocation, and find genuine rewards in serving God and others in this way.

To get to my point, I'm going to spend a little time on my transition back into teaching. I've previously described how my early days in education paralleled my pastoral career: I was full of doubts and misgivings, didn't feel like I had a knack for it, was easily discouraged by my student teaching experience with a supervisor who was, quite simply, a jerk, and was then summarily dismissed from my first true job by a principal/superintendent who was, himself, incompetent. The difference is that I gave up on teaching before I'd given it a chance. Knowing I had done this, I felt a real obligation to stay in ministry well past the time I probably should have left, rather than be a two-time quitter.

As my time away from ministry expanded, and my two attempts at returning to it were rebuffed (first by the conference, then by Emanuel Hospital, where I had sought an internship with the chaplaincy program), I knew I needed a new source of income, as my disability benefit was running out. And then it came to me: substitute teaching. I contacted the TSPC, and began the relatively long process of reinstatement: compared to my ten-year ordination odyssey, the months-long wait for my new certificate was a walk in the park.

Once I got back in classrooms, first as a substitute, then as a full time music teacher, I found, much to my amazement, that things had changed for me. The challenges were still the same, the children (despite radically different clothing choices, hairstyles, and electronic pursuits) were developmentally the same; what was different was me. I had grown up. Maybe it was the wear and tear of marriage, divorce, remarriage, redivorce; or the experience of raising my own children through the developmental stages I was encountering in students; or those years in the clergy trenches, and my struggles to stay in them. Whatever had happened, I was tougher, more assertive; and now I loved this work.

It didn't completely love me back. It took two full-time jobs with varying degrees of success before I hit on the right mixture of firmness and fun. And then it all went away, and when it came back, it was only half-time, and with the big kids I used to fear. Except that, in the years I was working with little kids, my own kids got bigger and passed through the same developmental stages as my current students. So it again comes back to this: I love this work. It seems like an amazing treat that I get paid to do it. In fact, I have four jobs, all music-related (essential for filling the budgetary hole left by half-time schoolwork), and I enjoy myself doing all of them. One of them I would gladly do for free, in fact.

Let me spell out a few details on that last bit: for almost four years now, I have been playing keyboard for ComedySportz. I was brought in through the side door by my partner, Amy, who had herself been improvising on stage with CSz for over a year at that point. After one of the first CSz "Farm Team" events I attended, many of us gathered at a McMenamins pub that is popular with them because of its large tables and late closing hour. There was one "Pro Team" player there with us, Bill Cernansky, an Intel engineer by day and a brilliant improviser by night. Bill is very well paid by Intel, enough so that he and his wife, Betse (another improv genius) recently bought a home in the spendy Irvington district of Portland, as well as a Nissan Leaf. Bill was in a speculative mode that night at the Tavern & Pool, and said dreamily at one point, "If I could have anything, it would be to be a full-time improviser." That's it: Bill works his day job to support his love for his night job, which pays him probably around $100 a month. Between rehearsals and performances, Amy does improv 3-4 nights a week, for which she receives, if she's in a show, the grand sum of about $25. This is something we both love so much we'd do it without the monthly "player share" check.

But as Bill's remark demonstrated, one can't be a full-time improviser and pay the mortgage. That's why most artists work at other jobs to cover expenses.

Now let me reiterate what I said earlier about pastors: I shouldn't have to do it for free. I should be paid a decent wage for the hard work I do with my students, or as a performer entertaining others, and I am. I put in plenty of extra hours for my school job, and I don't begrudge a minute of it, because this, really and truly, after all those years trying something else, is what I was called to do. Had I stuck with it for another year of subbing and perhaps a new job with a district, I might have figured this out for myself. But sometimes you have to take a long journey to realize how much you love your home.

Do what you love. If you hate what you're doing, but it's making it possible to do what you love on the side, that's a pragmatic compromise. If you hate your work, and you've got no time at all for what you love, it's time to find new work.

Tuesday, May 28, 2013

There's Money to Be Made

 
Forget "E Pluribus Unum." The real motto of the United States is "There's money to be made." It's no accident we print "In God we trust" on our cash, because materialism is our true national religion.
 
This first struck me my sophomore year of college, when I took Econ 101. The professor told us a story of Soviet athletes (this was nine years before the Berlin Wall came down) visiting the United States, and being shocked by the sheer variety of products on grocery shelves. Granted, they came from what was, at that time, a planned economy, and they were often lucky to find one version of a product on the shelves, let alone the half dozen available at even a small market in America. But the question they asked is even more relevant today than it was then: Why do we need that many brands of ketchup? Or, if you really want to blow your mind with how many ways the same product can be packaged, take a look at the toothpaste aisle. I did just that this morning at my neighborhood grocery store, the QFC. I counted 94 different varieties of toothpaste, and that's not counting how many different sizes of tubes were offered. Ask an honest dentist, and they'll tell you the only active ingredient in toothpaste is fluoride. Everything else is there to keep you brushing longer: colors, flavors, textures, foaming agents, breath fresheners.
 
This was the QFC. Absent from my survey were any generic tubes of toothpaste. Many stores also carry their own lines of toothpaste, significantly less costly than the brightly packaged name brands. I remember well the moment in Econ 101 when Professor Hanson told us about price discrimination, and how disgusted I was with the notion. The same toothpaste, the very same stuff, gets packaged in multiple tubes, and I'm not just talking different sizes with the same name on the label: the store brand is the same stuff, just called something different. Why do they do this? Because it has been proven again and again that some consumers will only buy a product if it's in their price range, while others are willing to pay extra for the very same product with a brand name attached to it. They're exploiting us. Since learning this, I have made it my practice to buy name-brand products only if, with sales and coupons, I can get them at or below the price of the store label.
 
All those different names and packages exist for this reason: to extract money from consumers' pockets. If we all quit paying attention, and just bought whatever was least expensive, most of that glut of toothpaste varieties would vanish.
 
"There's money to be made." I have to give some credit to the entrepreneurial spirit of innovation. Money is the driving force behind American creativity. Nations that have attempted to do away with this impulse have found that, without it, productivity falls away rapidly. The USSR collapsed under the burden of the arms race coupled with an inability for its planned economy to keep up with the needs and wants of its citizens. China avoided this fate by, beginning in the 1970s, reintroducing competition into its own economy. In both the former Soviet Union and China, it must be admitted that capitalism has worked far better for either of these nations than centralized communism ever did.
 
But here's the rub: the transition came at a high cost. Russia descended nearly into chaos, and many of the former Soviet republics have clung to tyranny on a smaller scale, and it could be argued that Russia itself is creeping back toward authoritarianism as an antidote to the rampant corruption of its initial foray into a free market economy. China's unstoppable economic engine is built upon horrendous working conditions and low wages, as well as the generation of enormous quantities of greenhouse gases.
 
For those whose god is money, virtues are all associated with maximizing profit and minimizing loss. Environmental and safety regulations are resisted because they cannot help but reduce efficiency and require capital expenditures. Labor unions are the enemy, striving as they do to improve wages, enhance working conditions, and expand benefits, all of which reduce overall profit. The work week is structured around productivity.
 
Conversely, family and community values are treated as sins. Does attending your child's concert mean you can't work a night shift you just found out about? Too bad. Is your health suffering because you can't adjust to working the swing shift? You're welcome to look for work elsewhere. What, you want a vacation? It'll be six months before you're allowed any time off. Is your marriage suffering from the long hours you're working? Tell your spouse to get a job, too. Long hours and making money: that's the American way of life.
 
Perhaps the most insidious aspect of our free-market religion is its addiction to short-term gain. The economic collapse of 2008, which resulted in my being laid off in 2009, and four years later still not being back to full-time work, came about because banks and investment firms became obsessed with generating profits from thin air, spinning out mortgage-backed securities and all manner of other financial products without regard for the future. Billions were made, billions more lost, and it felt like the entire country went into foreclosure. The nation could have recovered faster, but the Republican Congress that caused the collapse has refused to cooperate with the sort of stimulus that could bring back jobs, rebuild infrastructure, and plan for a sustainable economy rather than one that is subject to the excesses of profit-hungry speculators.
 
All of this came out of an addiction to the most ephemeral commodity of all: the dollar. Money only has value because we agree that it does. Take the constituent elements of a dollar or a dime: paper, linen, ink, nickel, copper. They're worth a tiny fraction of the face value of that currency. And before you tell me this is an argument for the gold standard, remember that the value of gold is just as ephemeral. Gold is valuable because we say it is; in fact, apart from some usefulness as a coating for high-end wiring, gold is good for nothing. This sense of ephemeral value becomes even more obvious when the entire system is computerized. Now it's just numbers rising and falling according to ridiculously complex algorithms in the heart of a server somewhere on Wall Street that determine who wins and who loses. All the lives ruined by stock market crashes, mortgage crises, bank collapses, boil down to flickering 1s and 0s.
 
Let's turn to something more tangible: oil. It has become a scientific certainty that the world is warming rapidly, and that fossil fuels are driving this process. Take a look at this story in Slate: "The Arctic Ice Death Spiral," by Phil Plait. In particular, read the portion about how oil companies can't wait for ice-free summers in the Arctic Ocean, where they hope to drill for more oil--which, of course, will drive global temperatures even higher, as gas prices come back down and consumers happily return to SUVs and Hemis. Concern for the livability of the planet is drowned out by the chorus of delighted executives screaming, "Money!" So what if the temperature in Southern California tops 100 on a cool day? Hop in your SUV and move north!
 
There was a time when I accumulated possessions, especially books, records, and movies. These were not large objects, but over time, they took up more and more space. When changes in my livelihood forced me to downsize my living space, I packed up my collections and stored them in my parents' attic. A few years ago, I went up to that stack of boxes, and pulled out the books. There were ten boxes of them. Many were filled with books I had read in my youth, books I had studied in college, grad school, seminary, books I had enjoyed; but there were also books I had purchased because they looked like books I should read, then never did. I scoured those boxes, and sorted them into three new collections: books I would want to refer back to or might read again; books that might have some value as collectibles; and books I knew I would never open again. The vast majority fell into the third category, seven boxes in all. I loaded them into my car, drove to Powell's, and took them to the bookbuyers. Of those seven boxes, less than one box was of any use to them, and I received $75 for it. The rest I left in the donation area, most likely to be picked up by Goodwill. I did similar things with my record and movie collections. When I think about all the ephemeral money I invested in these collections, I feel deep pangs of regret. I am now far more cautious about such purchases, especially as we enter into an age in which nearly all information and entertainment are available instantly online. The bottom line for me: I don't need much. Having less means I can live in a smaller, less expensive home, and when it comes to move again (perhaps in yet another downsizing), it's far less daunting.
 
I see millennials making this decision as a generation. They don't need hard copies of books, newspapers, albums, videos; everything entertainment or information related can be found in the cloud, and does not even need to be downloaded prior to use. Their politics mirror this shift toward simplicity: the Occupy movement, and its stress on the 99%, was a reaction against the excess consumption that had led to the 2008 recession. This shift has corporate executives scrambling. The youth market has traditionally driven the economy; for that market to be losing interest in acquisition is terrifying to producers. But have no fear: if there's a way to make money off the new paradigm, it will be discovered and exploited, and the US economy will continue to grow, though this time the products being purchased will be as ephemeral as the money used to buy them.
 
You may be wondering, at this point, if this is all just a socialist rant. But like most Americans, even the most idealistic, I am compromised when it comes to capitalism. I do participate in the economy. Apart from groceries, most of my tangible purchases are recreational: shoes appropriate to my many outdoor exercise activities, backpacks, ski gear, technical clothing, bicycle accessories; not to mention travel and lodging for the adventures Amy and I love to take. Given my earlier discussion of downsizing, you may be surprised to learn that our garage is rapidly filling with exercise-related gear. As with my feelings about consuming meat, I'm not a purist by any stretch of the imagination.
 
But I do yearn for a simpler lifestyle, and for a spirituality grounded less in stuff, and more in being. Every time I'm in the Peace House (my home for three years, and still a holy place to me), I see a small bit of calligraphy hanging in the first floor restroom: "It is you I want, not your possessions." (2 Corinthians 12:14) The Apostle Paul was writing, in this passage, about needing a place to stay when he next visited Corinth, but not wanting to be any sort of burden on the small church there. It's a sentiment I have long felt in myself with respect to my parents' estate, and whether I will benefit from the financial wherewithal of any romantic partner. I don't want anyone's stuff. I would far rather enjoy your company than your money. If there is a God, I expect that this being really has no use for our stuff. If anything, God would rather we chuck our stuff, because it tends to distract us from the presence of the Creator in our lives and our world.
 
I began this essay with a discussion of the irony of the motto "In God we trust" being on our money. Perhaps there's another way of looking at it. The next time you've got some cash in your hand, look at that motto and ask yourself: If I really trusted God, what would I do with this scrap of paper, this bit of metal?
 
Then go forth and do accordingly.

Monday, May 27, 2013

I got a gun, he got a gun, everybody got guns!

Gyp Rosetti

That's Gyp Rosetti, a very scary gangster in HBO's Boardwalk Empire. He spoke that line during a tense standoff, one of the few times in the series when everybody having guns didn't end in a shootout. In Boardwalk Empire, guns exist to be used on others. It happens a lot. The third season climaxed with a citywide gun battle, bodies piling up all over Atlantic City.

Everybody having guns is a symptom of everybody being scared. For the last twelve years, Americans have been afraid. Someone sent me this graphic in 2002:

 
With planes hijacked and crashed into buildings, anthrax in the mail, terrorist organizations plotting further attacks, perhaps we had cause to be afraid. In such a climate, it should be no surprise that Americans are buying guns, that people who previously would never think of owning a killing device would want one in the nightstand. But there are some real problems with that kneejerk reaction. First, none of the people we were afraid of in 2001 was in any position to invade our homes; in fact, the only home invasions that followed that terrible day were perpetrated by heavily armed Americans seeking terrorists in third-world countries, invasions that also swept up large numbers of civilians who were never going to rise up against anyone.
 
The wars started in the wake of 9-11 are finally tapering off, at a cost of hundreds of thousands of mostly innocent Iraqi and Afghan lives, as well as more than 6,000 young American servicepeople. And yet still we are afraid. Tea Partiers and the Republicans who speak on their behalf are afraid of change, and take it to paradoxical extremes (Get the government's hands off my Medicare!). Most recently, Americans seem to be afraid of having their guns taken away, and are reacting by buying more of them.
 
And yet even as the number of guns owned by Americans increases, the number of Americans owning guns is going down. This means some Americans have enormous arsenals. The primary argument for owning so many weapons is defense; and yet, what are they thinking they can defend themselves from?
 
The American police state? Hardly. Anyone who's seen footage of the Boston Police capturing the marathon bombers knows it would take an army to fight off our increasingly militarized "peace officers." Invasion by another country? Don't be ridiculous; the United States armed forces outspend and outgun the next ten militaries combined. Nobody's invading America anytime soon. Home invasion? Sneak downstairs with your urban assault weapon to take out whoever just made a bump in the night, and you will most likely either kill your brother-in-law having a midnight snack or be killed yourself by the jumpy and equally well-armed burglar who broke into your house.
 
Most of those legally owned guns stay in their cabinets gathering dust. Those that do get used on humans will most likely be used in anger, by accident, or to commit suicide (far more effective than taking a bottle of pills). And then there are the children who, steeped in a culture of cool violence, end a play date by firing a gun found at a friend's house, and killing that friend.
 
Then there are the libertarians who publish sentiments like this one:
 
Gay rights and gun rights poster
 
There are a lot of problems with this line of "reason," primary among them this: HAVING A GAY IN YOUR HOUSE DOES NOT ENDANGER MY CHILD. It's not just that I dislike guns; they terrify me. I want sensible gun laws based on safety, not the personal preferences of hobbyists. That child holding the assault weapon could kill an entire classroom in a matter of minutes.
 
So that's why I don't want a gun in my house, why I want you to register your gun, why I don't want every kid on the block to have a concealed weapon, why I just want everyone to for God's sake step back, take a deep breath, and think about how many weapons we really need in this country. Please.

Sunday, May 26, 2013

Methodist Dogmatics

 
Methodism is a non-credal denomination.
 
Ostensibly that means Methodists do not have a statement of faith to which they must adhere in order to belong to the church. That should mean there is no such thing in Methodism as heresy, as to be a heretic is to act contrary to the dogma of the church. And in traditional terms, Methodists remain non-dogmatic. While Methodists are expected to believe in the Trinity, the Resurrection, and other doctrines typical of Christians throughout the world, there is no one way in which they must believe. Methodists can believe, for instance, that Jesus was a great teacher and historical figure, but that the miracles presented in the Gospels are symbolic, that the Resurrection describes something that happened in the disciples' hearts, that, in fact, all the teachings about the divinity of Christ are metaphors for how humans can find spiritual fulfillment through prayer and good works, and still be fully within the fold of the church.
 
In practice, though, there are dogmas that United Methodists hold to, beliefs that, if contradicted, can lead to exclusion from full participation in the life of the church. Given the name of the denomination--originally an epithet describing early Methodists' obsession with organization--it should come as no surprise that these dogmas rise from the polity of the denomination. In this post, I'll be crafting a systematic theology of United Methodism.
 
At the heart of Methodist doctrine is the concept of Christian conferencing. Early Methodists met in classes, groups of believers who covenanted to hold each other accountable in their practice of spiritual disciplines. The Methodist classes grew out of a need felt by 18th century Anglicans for deeper spirituality than was available to them in their parish church life. Classes met to study the Bible, pray, and encourage each other to perform works of service and charity in their communities. Class meetings were believed to be, like Communion and Baptism, a means of grace, a way in which the community could experience God's presence in their lives. Eventually, these highly structured groups became the first Methodist congregations. The idea that a meeting itself had sacramental power carried over into church councils and denominational conferences, and was carried across the Atlantic as part of Methodism in early America.
 
John Wesley concluded early on that classes needed clerical supervision, and appointed circuit riders, traveling pastors responsible for a local group of classes (and eventually, when those classes became formalized and purchased meeting space, of chapels). This model also lent itself well to the American frontier, although circuit riders now had far more territory to cover as they traveled among the chapels in their parish.
 
Wesley concluded, based on his own reading of the Greek New Testament, that he had the authority to ordain without assistance from a Bishop--useful, as he had effectively been excommunicated from the Church of England at this point for his radical beliefs. Methodists in America at one point wrote to Wesley asking for assistance, as they were especially short on clergy, and did not have him there to perform ordinations. Acknowledging this need, Wesley commissioned two of his best circuit riders, Francis Asbury and Thomas Coke, to be "General Superintendents" of American Methodism, with authority to ordain and give centralized guidance to the church in the new nation of the United States; however, he was clear with them that they were not in any way to consider themselves bishops. In Wesley's mind, one of the greatest mistakes of the Anglican Reformation was the decision to retain the structure of the Roman Catholic Church, including the episcopacy. No church leader should have that much power or authority, Wesley believed, and so to this day British Methodism has no bishops, only a conference president who serves for one year and cannot be reelected. British Methodist clergy are ordained by their colleagues at annual conference, continuing Wesley's pragmatic practice.
 
Asbury and Coke had other ideas, however, and upon stepping off the boat in America, announced that they were the first Bishops of the new Methodist Episcopal Church. We've had Bishops ever since, and except for one schismatic portion of the church (Methodist Protestants, who broke away in the early 1800s, finally coming back into the fold in 1939), our Bishops, like Supreme Court Justices, have always served for life.
 
First Dogma of the United Methodist Church: Bishops are the heads of church government, elected for life from the clergy. Their decisions with regard to appointments are final.
 
Methodist churches throughout the world are led by itinerant clergy, descendants of the early circuit riders. Some Methodist pastors still serve circuits, appointments of multiple churches, but all are permitted to locate, to live in a single place, even if they must preach in a different church each Sunday. They are appointed on an annual basis, but unless something goes seriously wrong, generally stay in an appointment for several years. Appointments in United Methodism are arrived at through a complicated process involving communications between every church and pastor in a district and the district superintendent, who then shares these data with the other superintendents and the Bishop, who together make up the Cabinet. The process of making appointments consumes a huge portion of the Cabinet's time, and can take half the year to complete. The wishes of each church and each pastor must be weighed with the needs of every other church and the strengths of every other pastor in the entire conference (a conference, in this case, meaning a geographical region of the denomination). The belief is that the Cabinet knows every church and pastor better than they know themselves, and can wisely appoint them as necessary. In reality, the process is often skewed by the ambitions and needs of pastors and their families, who understandably may not want to move the year before a pastor's child is to graduate from high school, or to a parish so far away that a pastor's spouse must leave his or her job.
 
Second Dogma of the United Methodist Church: The appointment process is not to be questioned. The Cabinet knows best, and will prayerfully put every pastor exactly where her or his gifts can best be used to the glory of God.
 
Growing as it did out of Anglicanism, Methodism from its beginnings had a high pastoral theology. Only ordained ministers could preside over the Sacraments. This is the greatest distinction between Methodists and Baptists who, at least in their origins, had lay pastors, and believed that anyone so led by the Holy Spirit could and should preside over both Baptism and Communion. Early American Methodists suffered from the lack of ordained clergy in that they would not take Communion or baptize one another. Only with the arrival of circuit riders ordained by Wesley or, ultimately, of self-titled Bishops Coke and Asbury were they able to partake of these means of grace. To this day, only ordained persons and--here the essential pragmatism of Wesley breaks through--lay ministers authorized by the Bishop may preside over the Sacraments. In a pinch, a layperson can be given permission to preside. But only with the Bishop's okay. This heightens the sense in Methodism that ordained clergy are somehow magically different from ordinary laypeople.
 
Third Dogma of the United Methodist Church: Ordained clergy are set aside by God to preside over the Sacraments. Laypeople are not. When no ordained clergyperson is available to serve a congregation's sacramental needs, a layperson may be authorized by the Bishop to preside, but only when no ordained person is present.
 
It should be apparent by now that Methodism, especially in America, has a hierarchical structure modeled on that of the Church of England, which itself is based upon the structure of the Roman Catholic Church. This organization places pastors over parishes, superintendents over pastors, and Bishops over all. The result is an insensitivity in church headquarters to the preferences of local churches, and a paternalism that local churches find enervating and disempowering. Grassroots creativity is muzzled in favor of conference initiatives. Every few years, another plan for growth issues from the main office, usually the same ideas repackaged with new graphics and wording; but chances are veteran lay leaders and pastors have all seen it before. It was just called something else.
 
Fourth Dogma of the United Methodist Church: Program and mission priorities come from the main office. Innovation begins with the Bishop.
 
Once a year, representatives of local churches meet with all the ordained clergy in an annual conference at an event called, of course, Annual Conference. This is the only body that can make decisions about the policies of the conference. It's a huge meeting, hundreds of people gathered in a college field house or community conference center, filled with stirring worship services, powerful music, and dull reports, culminating in an ordination service and the reading of appointments for the next year. Every four years, representatives of all Annual Conferences, evenly divided between clergy and laypeople, meet for a General Conference, which serves the same function for the entire denomination that the Annual Conference does locally. General Conference lasts for two weeks, and its work is more substantive, more controversial, and far more newsworthy than anything that happens at Annual Conference. The decisions of conferences are final within the jurisdictions they govern, and in the case of General Conference, can have generations of repercussions. These meetings can be intensely partisan, breaking along regional, racial, national (most of the recent growth in United Methodism has been in Africa), and theological lines. The Bishops, all gathered on the dais but without voice in the General Conference, typically have far more liberal opinions than those of the GC, which is dominated by delegates from southern states and Africa. General Conference is accorded the same authority by United Methodists that Roman Catholics give the Pope.
 
Fifth Dogma of the United Methodist Church: The decisions of Conference are arrived at prayerfully and represent God's will for the church. They are not to be questioned.
 
United Methodism has other dogmas I could describe here, but I'm going to limit it to these five because I believe they demonstrate the rottenness at the heart of America's second largest Protestant denomination. The hierarchic power structure robs local congregations of the ability to innovate in ways that can best serve their needs and the needs of their community. The emphasis on ordination eliminates all but the most assertive lay leadership from having a voice in the ongoing governance of the church: if one is really called to ministry, one is expected to give up whatever career he or she may have, take an effective vow of poverty, surrender three years to seminary and another three to five to the ordination process, and only then be accorded a voice in the halls of Methodist leadership. Only then is this person granted authority to speak the magic words over the bread, grape juice, and water that make it possible to serve Communion or baptize a new believer.
 
I've been calling this "clergy privilege" for many years, though really it's more of an ordeal that must be endured if one is to have the privilege of living on substandard wages, spending an entire career paying off student loans, living in poorly maintained parsonages while working 50- and 60-hour weeks. There are good benefits--health insurance, pension, a liberal vacation policy, the ability, in most cases, to structure one's own work week independently of direct supervision--but considering what must be survived to arrive at this, it demands a great deal of an individual. It's also enormously expensive to a congregation to support and house a pastor, not to mention finding the cash to support the conference and denomination that appoint that pastor, with or without the approval of the congregation.
 
Things are simpler in other denominations that practice a "calling" system, in which congregations engage in searches for their own pastors and hire them after interviews. One problem with such denominations, however, is that it can be very difficult for women and members of minority groups to find a position. On the other hand, their congregations exercise far more authority in setting their own mission and program priorities. And pastors answer to them, rather than to their bishop.
 
I personally am of two minds on what works best. The one community that has worked best for me as a Methodist was Metanoia. Although officially a United Methodist congregation, Metanoia encouraged lay persons to preside over Communion, part of the radical equality at its core. Although nominally under the leadership of an appointed pastor, in practice Metanoia set its own priorities, whether or not he agreed with them, and often told him "no" when he came to council meetings with an idea he thought would work well for the community. Metanoia was able to be far more generous in mission than much larger churches because its pastor really had taken a vow of poverty. Every year of his active, pre-retirement career, John Schwiebert came before the entire clergy session of Annual Conference to receive their permission to earn a sub-minimum salary, just enough to keep him under the federal poverty level.
 
Metanoians were a rare breed. We really were an island of misfit toys, most of us disenchanted Methodists (though some retained membership in other churches, and I remained a clergy member of the Oregon-Idaho Annual Conference), many of us gay or lesbian, many in recovery, all of us believing in intentional community, and none of us respecters of dogma. I have often thought the conference allowed us to exist, turning a blind eye to our heretical lay-led Communion, authorizing John's poverty wage, as a release valve: we were the ones on the front lines, marching, getting arrested, hiring an out lesbian as a co-pastor, organizing with other institutions to better our community; and knowing we were doing these things took the pressure off other congregations, who could feel better about doing less along those lines.
 
But Metanoia is gone now. John has really, truly retired at the age of 74, and after a year spent trying to find a new home, the community finally disbanded. That throws the onus of change back on the conference, which desperately needs a bottom-up restructuring.
 
United Methodism in America has been bleeding membership since the 1960s. We've now had several generations that prefer local to regional or national control, who want worship services they structure for themselves, sermons they can apply to their own lives, and leadership that emerges organically from the community. The young adults who will take our places see little in Methodism that appeals to them, and understandably are founding their own religious communities, created in their image rather than John Wesley's. Ironically, these communities may be closer to the primitive Methodism Wesley founded in the early 1700s than anything that has come out of Nashville in the last fifty years. Like the Anglican Church Wesley knew, Methodism has grown tired, ossified, too wrapped up in the orthodoxy of its clergy dogmas to be able to innovate. Our leaders are far behind the curve, held back from what innovation they believe in by the conservative General Conference.
 
In a word, Methodism needs its own Reformation. The question facing all of us who grew up in this denomination is this: Will it come too late? Or, as with the English reformation that gave birth to Methodism, will it take an exodus to a new church to spur the changes we so desperately need?

Unity Is Over-Rated

The Great Seal of the United States (obverse)
 
E pluribus unum: out of many, one. That's our United States of America summed up in three neat Latin words. To the world, we're a single, monolithic entity, the only true superpower left. American culture has conquered the world. McDonalds, Starbucks, Wal-Mart--our corporations are everywhere. American tourists are known for their loud, aggressive friendliness. We don't mean to be rude, but sometimes we just can't help ourselves.
 
Twenty-five years ago, I had the privilege of spending two years serving an English Methodist/United Reformed congregation. It was a "united church," a merger between two Methodist congregations and one from the United Reformed denomination, and as such, it had an ongoing identity crisis. Every time there was a pastoral change, the position had to constitutionally switch between Methodist and United Reformed. My first wife, Brenda, and I came to Trinity Church as a clergy couple with one year of student pastorate experience behind us. We quickly discovered that, even fifteen years after the merger, the church was divided along denominational lines. Church council meetings were sometimes like battles between Methodists and Congregationalists (what the URC members had been before a denominational merger in the early 1970s), but those differences were minor next to the far more localized clashes among all three congregations. A number of members had simply fallen away with the merger, grieving the loss of whichever old building they had belonged to, not wishing to worship in the modern structure that now housed the church. There was also a rift between the younger members, who ran the "junior church" (Sunday school), and the older members, who preferred traditional worship.
 
One thing that united them all: Americans were greeted with suspicion.
 
Early on, I heard from my superintendent, Alan Mimmack, that a lapsed member, who had never been to the new building, had passed away, and her husband wished Alan to perform the service. Alan told me he would honor my wishes on this, and do the service if I thought it best, but he really thought I should go meet with the grieving family. I did so, and had a memorable visit. The issue, it turned out, was not that I was so young. It was that I was an American, and in the mind of this family, that meant I acted and talked like J.R. Ewing. The only Americans they had experienced up to then (not having been to a single service since our arrival) were on Dallas, which was at that time one of the most popular shows on British TV. Once they realized my accent was West Coast rather than Southern (despite my being a student at Southern Methodist University, in Dallas, at the time), they warmed to me, and agreed to hold the service at Trinity, with me in the pulpit.
 
This story illustrates a simple point: there is far more to us Americans than the rest of the world realizes. As a nation, we have a well-earned reputation for being friendly bullies, both economically and militarily, and our popular culture reinforces this image with explosive action movies, throbbing music, pushy advertising, and bright logos that crop up in the most inappropriate of places.
 
But there is far more to us as a people than the loud, pounding refrain of our culture. As individuals, we are far more pluribus than unum. Our Constitution is built around the principle of minority rights, held in balance with majority rule. Our Supreme Court at one time was a valiant guardian of those rights, and still can be, though it has taken to defining corporations as citizens, with all the rights and privileges (but rarely the responsibilities) thereunto appertaining. And Congress: the House was the defender of the little guy, as Representatives of smaller districts gave a national voice to their local constituencies; and the Senate was the place where diverse voices had to work together, compromising when necessary, but always aware that there was a wealth of ideas in circulation. Some of the sausages to emerge from this factory were ugly, but there were also civil rights and voting rights, environmental and consumer protections, fair labor practices, minimum wages, legislation that furthered the civilization of our rough-and-tumble pioneer nation, sometimes in baby steps, but also at times in great leaps.
 
No more. Congress is now where ideas go to die. Take a look at the political blogs in Slate magazine, and you'll see story after story of fractious, nasty behavior tearing apart both chambers of the legislative branch. The Supreme Court is also often divided along partisan lines. And getting either of these branches to work with the White House seems an utterly lost cause. Meanwhile, the executive branch follows in its centrist predecessor's footsteps, arrogating power to the White House because after all, in this partisan climate, how else can it hope to get anything done?
 
Remember the culture wars? Here, at least, there seems to be a growing swell of progress, primarily among the young. Our nation seems to be outgrowing the bigotry of its elders. Young people have no patience for homophobia or knee-jerk racism, and they are not, by and large, tied to the orthodoxy that has sustained the unwinnable war on drugs, kept gays and lesbians marginalized, and fed our profit-driven health care crisis. Often unable to secure the career track positions enjoyed by their parents, they also are pragmatically detached from the traditional American lust for material possessions. This sets them up for a new culture war, between simplicity and acquisition, especially as marketing desperately seeks new ways to convince the young to buy, buy, buy--pleas that fall on deaf ears and blind eyes that have learned to skip, fast-forward through, or simply ignore advertising.
 
The greatest division I see in America, though, remains geographical. Southern states cling to conservatism even as traditional northern bastions of orthodoxy are outgrowing its strictures. New Hampshire, ever the refuge of stubborn libertarianism, approved gay marriage in 2010, a year when the Democratic party was receiving a drubbing in the polls. Since then, a growing tide of northern, midwestern, and western states has joined this movement, while the south remains solidly traditionalist. This cultural counterweight to progress is, in my experience, consistent across institutional lines: the Boy Scouts of America, headquartered in Irving, Texas, is resolutely behind on gay rights (its recent acceptance of gay Scouts still falls far short of acknowledging that gay men and women can be role models); and the United Methodist Church's positions on ordaining and marrying sexual minorities remain firmly rooted in the reactionary language of 1972.
 
A district secretary once told me, for the church to move on, sometimes someone has to die. She was referring to older laypeople who stand in the way of local church progress by clinging to the past. I have come to believe that, for the United Methodist Church to survive and grow, it must give up its attachment to the word "united." Methodist unity has been forced for decades by its arcane polity: a governing body that meets once every four years, consisting of both lay and clerical representatives of the entire denomination, apportioned by population. This skews voting power to the south, and not just the southern American states; most United Methodist growth these days is taking place in Africa, also the source of resistance to Anglican progress on sexual issues.
 
A movement has recently arisen in Methodism of actively protesting the rigidity of the Discipline on these issues, insisting that Methodists are Biblically called to act in ways that go beyond regressive church laws. What this movement lacks is courageous church leaders, willing to face church trials and removal from their high offices, to stand against these backward policies. At present, the protest seems to be coming primarily from retired bishops who have nothing to lose. Bishops in office remain staunch defenders of the unity of the church, even though they may, in their hearts, believe the right thing to do is to openly ordain gay men and women who are called to ministry, and to marry same-gender couples who clearly have as much right to happiness as those who are traditionally oriented. "We have no choice but to prosecute," they say when faced with a complaint against a pastor who has performed a gay wedding. "It's in the Discipline."
 
The Discipline will not change as long as unity trumps justice. Scouting changed only because enough Scouts and Scout leaders insisted it was time, that too many young people were being hurt, until it became clear to the leadership that rebellion was afoot. What Methodism needs is secession: the Western and Northeastern Jurisdictions deciding they are fed up, that for the church to move on, its false unity must die. Perhaps then it can survive the flight of the young from dull institutional worship that never makes the leap to true social change.
 
I'll have more to say about barriers to Methodist progress in a subsequent post on clergy privilege. For now, I encourage you to embrace whatever change is in store for you, even if it means breaking with the comforts of orthodoxy and unity. Diversity is where the fun is.


Saturday, May 25, 2013

The Running Fool

Get a load of the quads on this guy.

It's September 25, 2001, two weeks after the towers fell. Two days ago, I completed my seventh marathon, the Top of Utah, setting a personal record of 4:19. Today I hiked every trail at Bryce Canyon. Every single one of them. I will never be this fit again.

Four months earlier, I ran my sixth marathon, the Avenue of the Giants, and spied a t-shirt on another runner claiming membership in the "50 States Plus One" club. That person had run marathons in all fifty states, plus the District of Columbia. I knew in an instant that this would be my lifelong fitness goal. I already had three states under my belt: Oregon, Washington, and now California. Utah was next. I got home from that trip, and began planning to add another state to my list. Perhaps Arizona, or New Mexico, or even Texas, someplace warm enough for an early spring marathon. I took my usual month off, letting the massive blisters from Bryce Canyon heal, and began easing back into training.

I hadn't gotten many miles under my belt when I noticed something: a very specific tender spot on my left shin. I could run through the pain--endorphins are magical--but it always came back, and over time got worse. Finally I took it to my doctor, had an x-ray, and learned I had a stress fracture.

I'll never know if it was doing two marathons in one year, the gonzo Bryce hiking day, or just my too-big-for-marathoning physique catching up with me. Most likely it was all of the above. Whatever it was that caused this, my distance running career ended on this day in 2001. I've made many attempts at coming back since then, but every time I would start ramping up the miles, an injury would intrude: a possible stress fracture developing, my trick ankle, or my old enemy since 1991, plantar fasciitis. It is clear that I will be lucky to run in a marathon ever again, let alone in the 46 states (plus the District of Columbia) still on my running bucket list.

If you knew me in high school or college, you would never guess that I could run a mile without collapsing, let alone 26.2. I was chunky and exercise-averse. I took PE because it was required. As a Scout, I hiked only to the extent required to earn merit badges. My sole exercise was walking, which I did mostly out of necessity, lacking a car of my own. The change came in the summer of 1984. I came home from grad school to begin looking for a teaching job. I had a lot of time to kill, and my parents thought it would be a good idea for me to have a physical, so I went in to be poked and prodded. The doctor compared my vitals to my age, and told me, in no uncertain terms, that I needed to start exercising if I wanted to live to see 40.

So I did. For the first time in my life, I began walking without a geographical goal in mind. By the end of the summer, I was walking for hours at a time. Then it came time to start my abortive first teaching career. I moved to LaGrande, and quickly discovered I wasn't going to have time to cover the distance I had while unemployed. I wanted to continue burning calories, and improving my cardiopulmonary health, but I wasn't going to have two to three hours a day for a long walk. It struck me then that I could accomplish the same fitness goals in less time if I were to transition to running, rather than walking. Over the next two months, I systematically worked running intervals into my walks, until I was able to run for 30 minutes without interruption. Then winter hit (it comes early in the high desert), and I was abruptly dismissed from my teaching job. I moved to Salem, began subbing, and as soon as weather permitted, returned to running on the trails of lovely Bush Park, next door to my apartment building. When I moved to Dallas, I explored the city by continuing to systematically add to the lengths of my workouts, going a block further each day. I ran in my first race in 1987, a 5K in Robinson, Illinois, finishing in 20:58. My first marathon was in Stoke-on-Trent in 1989, as was my second, the following year. Subsequent marathons came in 1995 (Portland, my post-divorce race), 1999 (Seattle, also a post-divorce race), 2000 (Portland again), and 2001 (Avenue of the Giants and Top of Utah).

Starting with my fourth marathon, I added a brief pre-race ritual: lined up with hundreds or thousands of people, sorted by estimated pace, waiting for the moment when my cohort could cross the starting line, I would say to myself, "What a bunch of crazy people!" I loved the atmosphere of the marathon, the sense that, except for the elites in the front of the pack, all of us were really competing against ourselves, and were simultaneously there to support each other in this endeavor. I loved the friendliness of the people lining the street, their eagerness to support not just whichever family member they were there for, but every runner on the course, and especially of the locals who were there just to watch the race: children sticking out their hands for high fives, people handing out orange slices, people holding hoses and offering to spray any runners in need of a cool-down.

In my second British marathon, as in my first, I delighted in the very English cheers of "Nicely run, lad!" In my first, I had lost steam at thirteen miles, and spent the second half of the race taking progressively longer walking breaks. This time I resolved not to walk at all (I had not yet hit on the method of walking for a minute each mile, usually at a water stop), and did well until mile 20, that point in the race traditionally called the Wall. It's where the body exhausts all its stored carbohydrate energy, and has to switch to burning fat--a transition that sucks the life out of a runner. After three hours of non-stop running, I found myself facing an excruciatingly long climb (the course in Stoke is all rolling hills), and slowed to a crawl. As is often the case at the beginning of the race, I had gone out too fast, and had not given myself the breaks I needed to keep my pace under control. For the next six miles I paid for that newbie choice, only managing to increase my pace to a run on downhills. By the time I reached mile 25, the roadside fans were thinning out, the anonymous cheers of "Nicely run, lad" had lost their power to cheer me, and I felt utterly alone, one miserable face amid the thousands out on that hot, humid day. I had hoped to run the last mile, but now I came very close to giving up, and slowed to my most defeated-feeling walk of the race. Then, up ahead, I saw a small cluster of spectators with a newspaper. I noticed one of them staring at me, then saying something to the others, one of whom haltingly called out, "Go, Bettinger-Anderson!" (Hopeless liberal that I was, and still am, I had hyphenated my name when I married. And it seems the local paper had printed the names of every entrant, along with their race numbers, in that morning's edition.) That was all I needed. I picked up my feet, began to run, and as I passed the group, I called out, "Thanks for the name!" "Oh, you're American?" one of them called back, and as one, they began singing, "Oh, say can you see, by the dawn's early light..."

I didn't slow again until I crossed the finish line.

I have run through the hardest times of my life. When I have not been able to run due to injury, illness, or out of an exaggerated sense of household responsibility (My shoes are on, but the baby's awake! I can't run now; my wife needs her sleep!), I have experienced a growing level of stress, and a disconnection from myself, that were ultimately alleviated only by a return to running. The choice to run has always been a choice to embrace life. Even when everything else in my life is turning to dross, running charges me with optimism and energy.

As I said above, in recent years I have had to severely curtail my running to avoid further injury. In fact, I am just now easing my way back into running after taking a year off due to my transition to the barefoot style. I moved too quickly in shifting away from the fasciitis-inducing rearfoot strike I had always used. Midfoot running solved that problem--my problem arch has given me no trouble at all since I made the change--but also put stress on other tendons, to the point that it was becoming difficult to walk. Now I'm using a more cushioned minimalist shoe, making sure I don't run two days in a row, and it's feeling wonderful. It helps that, in the interim, I took up bicycling, which I have come to love almost as passionately as running. And then there's hiking, which, especially since beginning my relationship with Amy, has become an activity I cannot experience often enough. And don't forget snow-shoeing (which we discovered together) and cross-country skiing (which Amy finally tried in earnest last winter, and now can't get enough of). We are outdoor recreation fools. It feeds us spiritually, emotionally, physically, and there is no fatigue that feels as good as that we experience as we drag ourselves into a brewpub for our post-recreation pint.

In 2001, I was regularly doing track workouts in which I ran ten 800 meter intervals, each in four minutes or less. That's five miles at sub-8 minute pace. Just running flat-out on the track, without overly straining myself, I frequently managed sub-7 minute miles. In that last marathon, my average pace was 9:54. These days I'm lucky to break 12 minute pace on a fast day. But I'm not complaining. It is such a gift just to be outside, even on a lousy drenching day, my feet falling into that delightful, soothing, energizing rhythm, the running fool going up and down hills, vibrantly alive and deliriously happy.

Why We Can't Have Nice Conversations

 
This. Is. A. Book.
 
So spoke the seminary professor after throwing a Bible to the floor and stomping on it. I never witnessed this act--it was years before my arrival at Perkins--but it was a story the professors I knew loved to tell. I don't remember the name of this particular professor, but I wholeheartedly agree with the point he was making to a shocked classroom of East Texan Methodists: idolatry is a sin. Anything besides God can be made into an idol, including, perhaps especially, this book. And yet Christians of every persuasion routinely assign it more weight, power, and authority than the God it supposedly tells them about--the one this very book quotes as saying not to make idols of anything, including books.
 
You may, like those East Texan seminarians, be in a state of shock that I am casting aspersions on this book. You may consider it the Word of God. You may even consider it the inerrant Word of God. If that's the case, it's best you stop reading this essay, and find someone else writing from your viewpoint, because you're not going to like what I have to say about the Bible. Perhaps you come up short of that "inerrant" thing, but still place a great deal of weight on what Scripture says, searching it for answers to your questions, advice in times of confusion, words of comfort in affliction, inspiration in defeat. You may still want to look elsewhere, but if you've got an open, probing mind, we may be able to have a dialogue.
 
Here's the problem: there are two irreconcilable ways of reading the Bible. The first takes it as, for the most part, inerrant and literally true: both versions of creation, the Flood, the sun standing still, Jonah in the belly of the fish, all four versions of the life of Jesus, and so on. The second approach views the Bible as a collection of historical documents, compiled over the course of a thousand years to tell the story of God's evolving relationship with humankind. All of this has to be read critically, placed in its historical, sociological, and literary context. For the most part, this approach views Scripture as symbolic and metaphorical, revealing deeper truths through stories and writings that should not be taken literally.
 
Representatives of these two approaches may appreciate each other for the sincerity of their beliefs, may enjoy the same kinds of religious music, may be moved by the same acts of worship and preaching. But they are never going to agree on this one thing, no matter what they may say to each other. In my mind, the literalist approach to reading the Bible is a form of idolatry. In their minds, that makes me a heretic. So it's best we just don't get into it, not if we want to be friends.
 
For the record, here's what I believe: the Bible is an amazing book of great power and beauty. There are parts of it that are among the finest things written in ancient times. The Succession Narrative, for instance, which tells the story of how David became king and then began the only dynasty in the history of ancient Israel, is much like Malory's Le Morte d'Arthur. The prophets wrote timeless social criticism that has as much relevance to modern times as it did two and a half millennia ago. Some of the parables of Jesus are so universally moving that they function as miniature gospels, presenting a complete spiritual worldview encapsulated in a few minutes of story-telling. Paul's soaring rhetoric provides a tremendous model for persuasive speaking. And so on. I could go on listing the good bits for hours, and still not be done. I honestly believe this is one of the greatest works of literature ever created.
 
But here's the rub: it's a book. These are stories, poems, essays, and yes, polemic. They are meant to inspire, chasten, encourage. While they are historical documents, from which we can learn a great deal about the times and places in which they were written, they are not history. This is hard for many people to accept, including many who, having the same kind of seminary training I did, should really know better, because much of the narrative in the Bible presents itself as history. But in fact, the discipline of history as we know it--sifting through actual documents and records, weighing eyewitness accounts, seeking to arrive at as accurate a picture of the past as can be done with the materials available, then acknowledging how much of the emerging picture still remains uncertain--did not exist at the time this book was being written. People told stories about history to reveal capital-T Truth. But they weren't writing small-t truth. Legends about the creation, the flood, Abraham, Sarah, Isaac, Rebekah, Jacob, Rachel, Leah, Joseph, Moses, David--all of these circulated for centuries before anyone thought of writing them down. The good part of this is that the stories took on added power as storytellers embellished them, tricked them out, created new details that heightened their power and added to the deeper Truth about human beings and their difficult relationship with the Creator. Eventually, they were written, and as one of my seminary professors liked to say, at that moment the inspiration died. Once a story is in print, it's set--especially if it is institutionally canonized. There's no more working with it, updating it for contemporary audiences, teasing out deeper meanings. Future generations must puzzle over what is actually going on in stories whose ancient details are as alien to them as if they came off a flying saucer.
 
That is, fortunately, how people of faith have handled these stories--for the most part. Prior to modern times, religious art usually clothed the characters of the Bible in garments that reflected the culture of the artist. There was no effort at historical accuracy; the artists understood they were working with an old story, and were attempting to make it relevant to a contemporary audience. This worked well until the Reformation, when a German monk named Martin Luther decided one of the problems with the Church was that its holy book was written in an ancient language only spoken by priests. He went to the original languages and translated them into vibrant sixteenth-century German. Suddenly anyone who could read could interpret Scripture for himself or herself. And now the gloves came off.
 
The Bible is the most-published book in the world. Its countless translations give ordinary, untrained believers access to it--or, really, to interpretations of it--which many of them read along the lines of the first camp, the literalists. Some, finding contradictions galore, not to mention wacky stories about God that just don't ring true, toss it aside, unable to appreciate the metaphorical power of the book. Others take it to mean what it says, and say what it means, and come undone when it is suggested to them that none of the events depicted probably happened as described, many never happened at all, that almost none of the "authors" actually wrote the books attributed to them, and hardly any of the quotes actually came from the people who are depicted as saying them. The Jesus Seminar has exploited this upsetting revelation with great success, reveling in the discomfort of Christians who would just rather not hear this stuff, let alone apply it to their precious book.
 
I understand how unsettling it can be. There was actually a time when I had a quasi-literalist view of the Bible. Then, at the age of 23, I got in an argument with a literalist friend over the existence of hell. I couldn't believe a loving God would condemn anyone to eternal damnation. He said it was in the Bible. I said the Bible was full of contradictions. He said, "Are you saying God can't write?" I decided to dig in and find those contradictions, and read the thing from cover to cover for the second time in my life. (I had previously done this while in high school.) By the time I finished, I had found a wealth of contradictions, but also had been deeply moved and inspired, and decided to go to seminary to find answers to my questions.
 
Seven years later, I emerged from seminary with more questions, but they were no longer about the Bible. I now understood it for what it was. My questions now were about whether I could ever sincerely present my understanding to a congregation. I found that, in fact, I could not, that I had to compromise, toning down my knowledge of the source and nature of the Bible. It's a compromise most preachers make. I don't have to anymore, because I took myself out of the preaching game thirteen years ago, and don't intend to go back.
 
It still haunts me, though. My previous post about the anti-Jewish core of the Christian message got me trolled by a Biblical literalist. No matter how I reworded my knowledge of the Bible's origins and nature, he refused to hear me, constantly beating on the drum of literal truth. In the end, I gave up. There is no seeing eye to eye on this. We can't agree to disagree. He's just wrong. And he can never admit to it, because at the core of his faith is his belief, not about God, but about this book.
 
It's just a book. A beautiful, powerful book. But a book.

Friday, May 24, 2013

A is for Arbitrary

Report Card
 
Midlife is a time when things start to slip away. I'm sure I've peaked as an endurance athlete; if I ever again run in a marathon, there's no way I'll come close to the 4:19 PR I set at the age of 40. Much of the detritus I accumulated in my early adulthood now seems redundant, so a few years ago I shed about 75% of my book collection. Eight years ago, I had to accept that I would never be the school parent I had always wanted to be; 700 miles is just too far away to help with homework, attend parent-teacher conferences, or be my kid's biggest fan at a concert. (Teaching high school and being in a relationship that includes adolescent children has given me another shot at that last lost opportunity.) And memory--ah, memory. The brain becomes so crowded with past experiences. Many of them blur together, much the way the Game of Thrones writers composite redundant characters and plotlines. Realizing that's happening, that my subconscious is constantly editing my memories into a more efficient plotline, makes it all the more special when I can recall a specific moment from long ago.
 
One such moment came back to me last night, as I was talking with Amy about the paper tiger that is scholastic assessment. In 1973, the sixth grade me received a copy of My Weekly Reader that had a cover story about grades, and about school districts doing away with them in favor of more realistic assessments that actually told parents something useful about their children's intellectual development. I was horrified at the thought: take away my grades, and what would I have left? I was a shy, pasty preacher's kid with coke-bottle glasses who was never going to be popular. The one area in which I could receive praise and admiration was my Straight A report card.
 
Turns out that article was just blowing smoke. Don't get me wrong, assessment reform has always been a priority in teacher colleges, and some progressive private schools do a spectacular job of individually assessing their students. And you may be aware that elementary students are by and large free of the "ABCDF" grading system, being evaluated instead as to whether they meet or exceed developmental expectations. But in reality, little has changed; it's still ABC with a different name. As a music teacher, most of my grading has been in the area that used to be called (and still is, in some places) "citizenship," that is, how well students participate in class, and whether they make it to concerts.
 
Here's a story: my first semester of seminary, I took a course with the misleading title "Introduction to Ministry." I thought that, like the "Introduction to Teaching" course I'd taken my sophomore year of college, this would be an omnibus exploration of all the different things involved in being a pastor: visitation, counseling, worship preparation, preaching, praying publicly in a variety of settings. Every entering student at Perkins School of Theology (SMU, Dallas, TX) was required to take this course, so the large lecture hall was packed. Three professors team-taught the course. And what did they teach? Mega-trends. Pop sociology. Pastoral theology--not, as you might think, about being a pastor. There were a couple of useful experiences we had outside the classroom, doing a case study of a church and participating in an "inter-ethnic experience" at a church where people were a different color than we were (mine was at a Spanish-language Episcopal church). Class time, though, was usually a joke. We were all frustrated, knowing how useful this could have been, furious at what it really was. Early on, we wrote essays on one of those sociology books we were reading. I write well; I've never had any problems with expressing myself on paper, and I know I answered the question being asked comprehensively. My score for that essay was, I believe, an 82.
 
I took it to the professor, demanding to know why it was such a low score. He launched into an explanation of how, a few years earlier, the faculty had realized they were inflating grades, and had decided as a group to lower the grading scale by 5 points. That meant the best grade one could get at Perkins was a 95. The worst was 70. This school had a 25-point grading scale that ran from 70 to 95. He went on to insist that, in his experience, the students who made the best pastors scored in the low 80s. Any higher than that and they should stick to academics. He never explained why my essay was only worth 82 points.
 
I walked out of that meeting understanding many things: Perkins professors clearly were not professional educators, and most likely had never received any training in assessment; academic achievement and pastoral effectiveness were two completely different things; and any system of letters or numbers used to measure achievement is, at least to some extent, arbitrary, irrelevant, and cruel.
 
The main point of assigning grades is to be able to compare students to each other. A student with poor grades needs assistance in improving his or her performance; a student with high grades needs a different kind of attention, to assist her or him in maximizing gifts for whatever subject the grades are measuring. "A" students are good candidates for higher education, while "C" students should probably be looking at vocational training, instead. As far as university admissions go, this is founded on the assumption that grades mean the same thing at Philomath High School that they do at Punahou School in Honolulu. In fact, though, for all the efforts at standardization, grading is still in the hands of individual teachers, and apart from reading and math, the subjectivity involved in measuring performance obliterates any ability to compare achievement between students at opposite ends of Burnside, let alone from different states, different types of school (public, charter, parochial, military), or even neighboring classrooms. Consider my final cumulative Perkins GPA of 89. Given that it's on an arbitrarily lowered scale, that 89 would be a 94 on a "normal" 100-point scale, solidly in the A range. But if I were applying to graduate schools to, say, pursue a Doctor of Theology degree, how could that be compared with the GPA of a student graduating from a seminary that used a more traditional ABC system? And how could either of those GPAs, coming as they did from schools staffed by professors who, though expert at ministry, never took a single course in educational methods, be compared with the GPA of a philosophy major from UC Berkeley? That's the dilemma admissions offices face, and it explains why higher education turned to standardized testing (the SAT, ACT, GRE, LSAT, GMAT, MCAT, etc.) as a tool for comparing prospective students from a variety of secondary schools and colleges in their decision-making process. Standardized testing has a plethora of problems of its own, but its one advantage is knowing that every score under consideration was produced using the same criteria, however irrelevant those criteria may be to ultimate academic success.
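To make the arbitrariness concrete, here is a toy sketch (my own invention, not any school's actual policy) of what converting that Perkins average involves. Undoing the five-point shift is trivial; mapping the result onto a 4.0 scale depends entirely on which letter cutoffs you assume, and the cutoffs themselves are arbitrary.

# Hypothetical illustration of converting a score from a shifted 70-95 scale.
# The cutoffs below are invented for the example, not taken from any institution.

def perkins_to_conventional(score: float) -> float:
    """Undo the five-point downward shift: 89 on the 70-95 scale becomes 94."""
    return score + 5.0

def to_four_point(score: float, a_cutoff: float = 93.0, b_cutoff: float = 85.0) -> float:
    """Map a 100-point score onto a crude 4.0 scale using arbitrary letter cutoffs."""
    if score >= a_cutoff:
        return 4.0
    if score >= b_cutoff:
        return 3.0
    return 2.0

conventional = perkins_to_conventional(89.0)
print(conventional)                                  # 94.0
print(to_four_point(conventional))                   # 4.0 under one set of cutoffs
print(to_four_point(conventional, a_cutoff=95.0))    # 3.0 under another

Same transcript, same 89, two different GPAs, depending on nothing but the conversion assumptions.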
 
That's what my seminary professor was getting at: the numbers assigned to essays, tests, and final grades really had no bearing on a student's ultimate success in the real world. I've found this to be true in both the vocations I chose. My success as a teacher has developed over time, and with experience. My 3.66 college GPA (and 3.75 graduate GPA) were lousy predictors for my first--and almost only--year in the field. Any school administrator will tell you that the first year is a hard one, that it takes experience to hone a teacher's skills, and that master teachers have that mastery thanks to many years of practice. Yes, some new teachers have gifts that set them apart from the pack of recent graduates, but it's the practice of teaching itself that shapes an excellent teacher. It's true with pastors, as well. The high marks I received in most of my seminary classes demonstrated my mastery of texts, but said nothing about my native ability to do the most important work of ministry: talking to people. As it turned out, my aptitude for and interest in that task were not up to the standards of this profession. I really was better suited for the classroom. Put another way, pastoring didn't love me. Teaching does. What that grizzled veteran professor told me was true: the numbers say nothing about how effective the graduate will be.
 
So the grades are arbitrary and irrelevant. And one more thing: they're cruel. Why do parents call up teachers, distraught over a D in history? It usually is because they're genuinely concerned about how well their children are performing, and want to know what they can do at home to help them improve; but it's also because they can see what an impact a low grade has on their children's sense of worth. I received two Bs in high school, and was shattered by them. (There would have been more, but my parents succeeded in having a school policy changed just for me so I could take PE pass-fail and protect my GPA.) All I had, I really believed, was my grades. Students who regularly receive low grades, despite working hard, can feel defeated, discouraged, inadequate, and if they've got nothing else going on in their lives at which they can excel, report cards can feel abusive.
 
The worst part of all this is how utterly unnecessary this cruel system is. As far back as the 1970s, competency-based education was already happening in Oregon. The legislature had approved a set of skills which, it was believed, were necessary for a young person to succeed in the adult world, skills like balancing a checkbook, applying for a job, following a set of instructions. These competencies were tested throughout high school, in the context of classes in which the skills would normally be taught, and they were pass/fail: either you could do them, or you couldn't, in which case you got special attention until you mastered them. Any subject being taught can be broken down into such skills, and while mastery need not be a binary pass/fail matter, a simple rubric can still be used, not unlike the meets/exceeds criterion in place for elementary report cards. In fact, I can guarantee you that this is exactly how every teacher really does assess students. It's how curriculum resources are structured, how textbooks are written, and how most courses are evaluated...but that's not what you see at the end of the term. All those individual competencies are shoehorned into an ancient, arbitrary system of letter grades, fed into a computer, and come out boiled down to a single letter that tells students and parents nothing about what really happened for four and a half months of interaction between student and subject matter.
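For readers who like to see things spelled out, here is a minimal sketch of what a competency-based record could look like--every name in it is hypothetical, chosen to echo the Oregon skills mentioned above. Each skill carries its own rubric level, and nothing is ever collapsed into a single letter.

# A minimal, hypothetical competency record: one rubric level per skill,
# plus a helper that lists the skills still needing attention.
from dataclasses import dataclass

RUBRIC = ("not yet", "meets", "exceeds")  # a simple three-level rubric

@dataclass
class Competency:
    skill: str   # e.g. "balance a checkbook"
    level: str   # one of RUBRIC

def needs_attention(record):
    """Return the skills the student has not yet mastered."""
    return [c.skill for c in record if c.level == "not yet"]

record = [
    Competency("balance a checkbook", "meets"),
    Competency("apply for a job", "exceeds"),
    Competency("follow a set of instructions", "not yet"),
]

print(needs_attention(record))  # ['follow a set of instructions']

A report built from a record like this tells a family exactly which skill needs more time, which is precisely what a letter grade cannot do.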
 
Why do we do it this way? Why don't we supply every student, throughout the term but especially at the end, with a printout of every task assessed over the course of a class, and the degree of mastery exhibited on each one?
 
Perhaps, you might be thinking, it's the time involved, but no, that's definitely not the case. Grading is no longer a paper-and-pencil chore. I enter all my grades into a program called eSIS. It's an ornery program with a clunky interface, and my district is actually transitioning to something more refined, but it has provisions for recording every graded activity in a class; and both students and parents can log in, and check each of those activities. I enter these grades daily. It's simple to provide each student with an itemized record of all his or her assignments and assessed activities. It would also be simple to present those data in a graphic format. There's no need to boil it all down to an arbitrary letter grade. Every student can have a complete copy of his or her line in the grade book, sort it by categories to know which areas need improvement, and adjust study and work habits accordingly.
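To show how little work that last step would take, here is a rough sketch of summarizing an itemized grade-book export by category. The entries and field names are invented for the example; I am not describing eSIS's actual data format.

# Hypothetical illustration: roll up an itemized grade-book export by category
# so a student can see which areas need work, instead of receiving one letter.
from collections import defaultdict

entries = [
    {"category": "sight-reading", "assignment": "Etude 3", "earned": 8, "possible": 10},
    {"category": "sight-reading", "assignment": "Etude 4", "earned": 6, "possible": 10},
    {"category": "participation", "assignment": "Week 12", "earned": 10, "possible": 10},
    {"category": "concerts", "assignment": "Winter concert", "earned": 20, "possible": 20},
]

totals = defaultdict(lambda: [0, 0])  # category -> [earned, possible]
for e in entries:
    totals[e["category"]][0] += e["earned"]
    totals[e["category"]][1] += e["possible"]

for category, (earned, possible) in sorted(totals.items()):
    print(f"{category}: {earned}/{possible} ({100 * earned / possible:.0f}%)")

Run it and the output reads like a report a student could actually act on: concerts and participation are solid, sight-reading needs more practice.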
 
But still we use that single letter. Because people want it.
 
Remember "No Child Left Behind"? (I prefer to call it "No School Left Unscathed.") I don't know a teacher or administrator who was in favor of that one-size-fits-all approach to grading schools and punishing, with reduced funding, those that couldn't bring their diverse populations up to arbitrary levels on reading tests. But legislators loved it, and so did voters who wanted a simple standard for comparing schools to each other. Nobody wants to hear that it's complicated, that inner city schools and schools catering to migrant workers and schools with large special education departments really can't be fairly compared to private schools that can selectively admit the cream of the academic crop for their student bodies. No, we want to point fingers at the faculty and administration of "low performing" schools, and insist they do better, get those test scores up, or else. The NCLB regime is, thankfully, finally being relaxed, but at enormous cost to an entire generation of students who have experienced school as a testing mill.
 
And that is why we still assign these arbitrary, irrelevant, cruel letters to students, telling them whether they are excellent, good, fair, poor, or failures: people want it. The thought of giving up those letter grades is as alien to them as it was to my sixth-grade self. Legislators are not professional educators. Nor are voters. Change is frightening. We want to be able to say, "My child is a straight A student, just like I was." (Or just like I wish I could have been?) We could have report cards that say, "Here are areas you have mastered, and here are areas in which you need more time with the subject matter, and perhaps different methods for learning it." And get this: There is such a system already in place for students on Individualized Education Plans. I've been in IEP meetings both as a parent and as a teacher, and they're wonderful, personalized, encouraging, compassionate, everything assessment should be. They take more time, and time is money. There's no getting away from that. They make a huge difference, though. At present, only students who are at risk of serious academic failure due to physical, psychological, or emotional disability are required to be on IEPs, and because there is such an investment involved in having a student on such a plan, public schools only place students on them when they are required to by law. Again, some private schools make this kind of investment in individual students, but they are both expensive and private.
 
It comes down to money, then, but only to an extent. Americans still prefer grades. But ask yourself: what would be better for me and my child, and the child next door, the child across the tracks, the child living in a migrant trailer park, for every child, however bright or challenged, who will someday enter the work force, fix your car, take your temperature, prepare your will, teach your grandchildren? The answer is clear to me: put the grades on the scrap heap with land lines, fax machines, and steam-powered automobiles, and put every child on an IEP. Our nation will be a far better place for it.