Archive for April, 2008

The unease that comes with moral flexibility

Monday, April 28th, 2008


Some religious leaders have howled about the moral relativism of our contemporary culture, pointing to this as the slippery slope that ultimately will lead to our collective downfall. In some respects, this can be argued successfully, I suppose, but it’s generally assumed we’re always talking about trending toward more liberal values rather than the other way around.

The recent raids on the polygamist compound in Texas raised a new awareness of this for me. Some ideals we now hold would not have jibed with those maintained back in so-called “biblical” times. In particular, people of that era might have viewed some of our takes on sexuality as unnecessarily restrained, and others as unabashedly exploitative.

For example, the notion of polygamy itself challenges my sense of propriety, but there are plenty of stories in the Bible about men taking more than one wife. Back in the day, it was customary for a brother to marry the wife of his deceased sibling to help protect and provide for the departed one’s family. Also, childbearing was held in such high regard that if a couple could not bear children, a husband was within his rights to use a concubine or slave to carry on his bloodline.

With high infant mortality rates, low life expectancies, constant war, famine and disease to contend with, a family had to work to ensure it persisted from generation to generation. They didn’t have the advantages of antibiotics, emergency rooms, vitamins or even balanced-diet options to help increase their chances of survival. So does this mean that a practice that once was considered acceptable must be considered in context?

Consider girls being impregnated at age 13. Who doesn’t shudder at the thought of some 50-year-old guy using a barely-teenage girl for personal satisfaction and for proliferating his genetic line? But consider Mary, the mother of Jesus, who bore the Son of Man at or around that age, and she wasn’t the only one! As soon as a young woman could do so, it was her familial and spiritual duty to bear as many offspring as possible before her number was up, or she became “barren.” There were no fertility doctors, and childbearing years did not span nearly as many decades back then as they do now.

So was it wrong then? Is it wrong now?

The list of practices that were common then but are taboo today is not a short one, including slavery and, some might argue, capital punishment and war. On the other hand, how many of us today cringe at the thought of tying knots on the Sabbath, or mixing the fibers of our clothing? Do we guys consider shaving our beards an affront to God, and how many of our wives still sleep in red tents in the back yard when they menstruate?

All of this suggests to me that, yes, given current knowledge, social conditions and the like, morality indeed should be considered in context. Such arguments have even been waged over the Mosaic commandment, “Thou shalt not kill.” Some maintain this means all killing of any kind is against God’s law, while others suggest it speaks specifically to the murder of human beings. How can such a seemingly direct handful of words be interpreted so differently?

Keep crying about the dangers of moral relativism if you must, but practically any search for absolute right and wrong will meet up with a confounding counter-argument. Most, if not all, of us could join together in condemning acts of rape and genocide, though the ranks would thin a bit, evidently, when it comes to opposing polygamy and the marrying off of young girls. Even fewer would argue passionately about the importance of keeping kosher, but there are those who believe fervently that this is a critical part of living a godly life.

So who’s right and who’s wrong? Who can pump his fists in self-righteousness and who ought to hang his head in shame? The answer changes from time to time, but I think I’ll add this to the growing list of questions I have to ask when God and I have a little sit-down.

When is a sin not a sin?

Tuesday, April 22nd, 2008

Let’s see, now: When is a sin not a sin?

Sometimes we humans can’t seem to win for losing.

Clearly, our collective dependence on fossil fuels raises myriad issues that we’re at least gradually beginning to address. Alternative energy sources are springing up and proliferating almost as quickly as our infrastructure can support them, including solar, wind, hydroelectric, nuclear and biofuel power, among others.

Nuclear power, including the security risks and the toxicity of its byproducts, has been passionately debated for decades now. Some see it as a viable alternative, while others consider it laden with disastrous potential.

Hydroelectric power is an exciting renewable source, but opponents decry the ecological damage of dams and turbines and contend that the negative consequences of this alternative outweigh its benefits.

Wind power seems a brilliant solution, harnessing an endless stream of currents canvassing the globe. But some have serious concerns about the disruption the giant windmills cause in local habitats, killing owls, bats and other creatures of flight in alarming numbers. On top of this, some neighborhoods simply don’t want their views cluttered by such monstrous eyesores.

Hydrogen fuel cells seem like a swell idea, until you consider the energy it takes to create, process and transport the hydrogen fuel in the first place. Electric cars are attractive, until we imagine millions of dead batteries with their toxic innards leaking into our water supplies. Plus, unless you are using something other than traditional electricity sources to charge your car, that power most likely comes from coal.

So far, I’ve not heard a beef with solar power, but I’m sure there’s at least one out there. Send it my way if you know of one.

The most recent culprit is biofuels. Colorado is among the frontrunners in the trend of converting crops such as wheat and corn into fuel, which may help reduce our dependence on oil. However, it’s now suggested that the dramatic spike in world food prices is due, in part, to the increased demand for biofuel crops that might otherwise feed starving people.

Believe it or not, on National Public Radio this week I heard one reporter say that leaders in some developing nations are calling the proliferation of biofuels a crime against humanity.

Are they serious? Can anyone really liken the distillation of corn crops into usable, renewable fuel to genocide? My first reaction was probably like that of many readers, which was to dismiss this as an overblown polemic, intended to grab headlines more than anything else.

But then I got to thinking a little more about it. If, in fact, we’re diverting viable food in volumes large enough to make any dent in global oil consumption, it would have to have a major ripple effect on the world food economy, and people will most likely die as a result.

Did Willie Nelson and his fellow biofuel advocates ever consider the potential for such complications when they initially championed the benefits of this alternative approach? Probably not. Sometimes, we only recognize the impact of our decisions after the fact. So is it a sin to be indirectly responsible for people dying of starvation, even if we had the best of intentions for a win-win outcome?

My estimation is that it becomes a sin when you know the result of your actions and still do nothing to change it. I call this a sin of occlusion: willfully blocking out reality to keep it from infringing on our personal comfort or way of life.

It would also be nice if we actually had to do something in order to commit a sin, and thus feel a little more agency and control over our own sins. But unfortunately, I would suggest we can sin simply by not responding to a need or an injustice once we know about it. I call this a sin of accountability.

So we’re no more off the hook in avoiding the truth than we are in doing nothing about it once we know it.

It’s almost like this sin thing, if someone took it seriously, could change someone’s entire way of life from the bottom up.

But where in the world to begin?

The challenge of living with opposable minds

Saturday, April 12th, 2008


One of the biggest knocks against organized religion these days is the tendency from within to take a firm ideological position, dig in collective heels and refuse to consider alternative points of view. Most of us can list the so-called “hot button” issues that have become the marquee causes, such as abortion, gay marriage, and, in general, any activity having to do with body parts touching other body parts.

Lest anyone assume I’m only referring to the evangelical right, I’ll point out explicitly that I’ve met my share of arrogant religious liberals who are as rigid and self-righteous about their personal views as any conservative. It’s a human tendency, but like many other bad habits, one that we can work against if we so choose.

But how?

In a recent column in The Christian Century magazine, Gregory Jones addressed what author Roger Martin calls “the opposable mind.” Both Martin and Jones contend that, although we are born with a basic ability to hold multiple, opposing concepts in tension in our minds, we tend to condition ourselves from childhood to do just the opposite.

The blame, it seems, does not rest entirely with churches. Martin argues that the problem begins in grade school, with the way we teach our children to think. This was a familiar notion to me, given that I spent years working professionally with teachers on how to teach kids to think. Though we begin early on to show children how to read, write and add, the skills of critical thought, rhetoric and analysis often come a decade or more later, if at all.

This vacuum of higher-order thinking spills over into a culture that, instead of welcoming debate and opposing views, feels threatened by difference. Rather than learning from alternate perspectives, we cling to our own ideals even tighter, lobbing verbal volleys at the “other side,” and if we’re lucky, the fight stops before things really get messy.

Jones, now the dean of the Duke University Divinity School, suggests a strategy he was taught in school, one that has had a significant impact on the way he views religion. He was given the challenge of selecting a topic about which he felt very strongly, and then researching and arguing in favor of the opposite view. The exercise pushed his intellectual and emotional skills to their limits, but he came away with a much richer sense of the importance of different viewpoints.

“Such exercises do not ask us to become less passionate or to compromise our views,” he says. “But they do help us learn to hold our own views in a deeper tension with alternative possibilities. Compelling us to find new patterns, patterns that are consistent with Jesus’ own teaching and life.”

When applied to the context of faith, Jones calls this practice “interpretive charity.” To consider opening ourselves to otherwise threatening, or at least uncomfortable, views as an act of charity somehow makes it easier to swallow. After all, we’re all called to a life of charity, even in matters of thought, dialogue and social interaction.

This is not a novel concept. In his book, “Christ and Culture,” H. Richard Niebuhr contended that our world conditions us to unlearn what Jones here calls “opposable thinking.” Niebuhr’s book, originally published in 1951, draws on similar arguments posed by F.D. Maurice and John Stuart Mill, who preceded Niebuhr by a century. One could even argue that the concept of opposable minds stretches back to the great philosophers of ancient Greece.

So why are we so slow on the uptake? Is it fear, laziness or something else that keeps us from using our opposable minds? Are we really so blind to the consequences of choosing a different, narrower path? Worse yet, do we really believe that we’re so singular in our righteousness that such attitudes are worth the price of willfully ignoring centuries of history?

Maybe the next 2,000 years will offer more promise in this regard, provided we don’t self-destruct in the meantime.

Is Mel Gibson to blame for ‘Horton’ paranoia?

Saturday, April 5th, 2008


I grew up enamored with Dr. Seuss books, reading every one I could get my hands on, including “Oh, the Places You’ll Go!,” which came out when I was already an adult. I’ve continued to share these books with my son, Mattias, who responds with similar enthusiasm.

So it was with no small degree of eagerness that I looked forward to taking him to see “Horton Hears a Who” when previews popped up last fall. The animation looked faithful to the book, and I was sure that Mattias and I would have a great time together.

Then I saw MaryAnn Johanson’s review of “Horton” in our local paper, which said the film’s creator “has turned it into something that looks astonishingly like far-right propaganda about how Christians are a persecuted minority – as if this were 100 A.D. in the Roman Empire – and loudmouthed atheists are ruining everything.”

Had it not been Dr. Seuss, those words alone might have been enough to keep me away. After all, who needs neo-con propaganda wrapped up in a cartoon elephant? True, there’s a more public debate between people of faith and atheists about the validity of organized religion. True, there have been more than a handful of biased portrayals of religion and its prominent figures pitched from both sides of the dividing line. But have we gone too far in reading more divisive subtexts into our films than actually exist?

I think this wave of hypersensitivity began with Mel Gibson’s “The Passion of the Christ.” Church communities came out in droves to support the film, and in some ways, the devoutly Catholic – some say fundamentalist – filmmaker and actor created as much controversy as “The Last Temptation of Christ” did some years prior.

The truth is that there has never been a shortage of controversial material in the media regarding faith. However, it has almost always come from the so-called heretics and blasphemers rather than from those who defend their faith in the public forum. Not surprisingly, there has been some push-back from those who feel the pro-faith contingent has either gone too far, or at least is gaining too much of a popular audience.

What has ensued, it seems, is a hypersensitivity from both sides about any potential ulterior motives within any book, film or television show. Is “The Golden Compass” a veiled atheist plot to turn all of our children into zombified unbelievers? Are the “Left Behind” books so publicly embraced that people are taking them as gospel? Is “The Da Vinci Code” the linchpin leading to the demise of the Catholic Church?

There are, in fact, two themes in “Horton” that could potentially be construed as slanted. First is the notion that there’s some virtue to be had in believing in things we can’t observe. Though this could be read as pro-religion, the broader context in which Seuss’ book presented the idea is generally benign, as is the film’s take.

Second, the phrase “a person’s a person, no matter how small” could be taken as code for pro-lifers, but considering the original book came out in 1954, it’s hard to imagine Horton’s creator was prophetic enough to predict the coming debate over abortion. More consistent with Seuss’ agenda is a general respect for equality, sensitivity and compassion – all values that are hard to argue with, whether one believes in God or not.

It’s a shame when a culture becomes so cynical and paranoid that it can wring the fun even out of “Horton Hears a Who.” Since when did we all become so delicate and threatened by differences of opinion? My greatest concern is not so much that people will be swayed en masse in one direction or another, but rather that producers of mainstream media will become so gun-shy that all of our content will be distilled down to the intellectual equivalent of tap water.

It’s just a movie, people. Go see it or don’t, but enough with the Pollyanna politicking before you suck the fun out of everything.