Thursday, December 29, 2011

Evanescence by Evanescence and Theseus' Paradox


Before buying Evanescence’s new self-titled album, I decided to check out the early customer reviews on Amazon. One reviewer raised the question of whether the album can really be considered an Evanescence album since the only original member of the band who appears on it is Amy Lee. I found the reviewer’s question, rhetorical and casual as it was, to be an interesting, if not difficult, one and immediately thought of Theseus’ paradox.

The ship of the Greek hero Theseus, Plutarch tells us in The Life of Theseus, was preserved by the Athenians for generations. Over time, the planks were replaced as they aged and decayed. The question then arose among philosophers whether it was really the same ship anymore. Suppose every last bit of it were replaced at some time or another. If not a single piece of the original ship remains, is it still Theseus’ ship?

Most of us today would say no, it is a replica of the ship of Theseus. However, let us reimagine the paradox. Suppose Theseus is still alive to sail his ship with those under his command. As the years pass and the adventures stack up, parts of the ship grow old or are damaged and must be replaced until no original piece of the ship remains. Is this still Theseus’ ship? 

Obviously, yes. In the same way, Evanescence is still Evanescence by virtue of Amy Lee, who, as founding member, constant presence, guiding force and frontwoman, defines the band.

Other music groups have been able to weather massive shakeups because one or more members remain at the helm, preserving the group’s name despite changing members and musical styles. Whitesnake is still going strong under David Coverdale’s leadership--and ownership of the band’s name--despite regular lineup changes. Billy Corgan reformed The Smashing Pumpkins for 2007’s Zeitgeist with all-new bandmates, and there have been more changes since.

“But,” you may say, “that isn’t really The Smashing Pumpkins,” which raises an interesting point: Sometimes we aren’t prepared to accept the new entity under the same name. 

Personally, as much as I admire Jimmy Chamberlin’s drumming and saw D’arcy and James Iha as important elements of the image of The Smashing Pumpkins during its rise to fame and zenith, I don’t mind recognizing anything Billy Corgan wants to call The Smashing Pumpkins as The Smashing Pumpkins. But something in me refuses to accept the band now calling itself Dream Theater as truly deserving the name. I accepted the earlier changes in keyboardists, but drummer Mike Portnoy was one of Dream Theater’s Amy Lees, the others being John Petrucci and James LaBrie. Without him, something of the essence of the band has been lost. Others may feel differently.

Evanescence is, in my opinion, the most solid of the group’s three studio albums. The previous two established a new sound and had amazing standout tracks, but the rest of their tracks tended to be stiff, just filling space. The new album has fewer filler tracks and demonstrates a liveliness and groove that the previous albums lacked. For this reason, some of the other reviewers on Amazon feel that the band has drifted away from its alternative and Goth roots toward commercial pop, but many others, like me, hear something slightly different from what is the same band.

Thursday, December 22, 2011

Kepler-22b Deals a Blow to the Design Argument


NASA’s Kepler space telescope has discovered a roughly Earth-sized planet, now named Kepler-22b, that orbits in its sun’s habitable zone, where temperatures are neither too hot nor too cold for life. This presents yet another challenge--as if any more were necessary--to the Christian apologist’s argument from design. 


Some Christian apologists customarily extend scientific findings to support supernatural claims, and never more so than when making the argument from design. They tell us that the conditions for planetary life are so implausible that a divine creator must have designed the universe, intentionally placing the Earth at just the sweet spot near the Sun where human beings can survive.


Even when I was a Christian, this argument never worked for me. After all, implausible events happen on a daily basis without any divine intervention--we run into acquaintances in unlikely places, say something in conversation at the same time someone else says the exact same thing, or get a lucky break at precisely the right time. On the scale of the cosmos, we should expect a few planets here and there to fall into the Goldilocks Zone.


But for some, the orbit of the Earth is not merely fortuitous for us but fine-tuned by an omnipotent intelligent being in order that we may live and be saved from our fallen nature and its resultant stay in Hell, both the works of the same Tuner.


That’s taking a great deal of liberty with the facts afforded by science. Indeed, there is a Goldilocks Zone and Earth is in it, but that isn’t anything special. Kepler-22b is one of 54 planets scientists have discovered in the habitable zone, and the universe remains mostly uncharted. The Kepler space telescope alone has discovered what appear to be five small, Earth-like planets in habitable zones and has viewed a couple thousand more candidate planets whose data scientists have yet to analyze.


Life elsewhere turning out to be plausible is unlikely to faze the many Christians whose faith doesn't rest on life's implausibility, but it does take at least one argument away from the polished apologists given to pseudo-science and specious reasoning.


Life after the discovery of Kepler-22b will remain precarious, and precious, but we now have less reason to believe that it is a miraculous occurrence unique to Earth.


Friday, November 25, 2011

The Two Subversive Turns in Ringer

The only television show that started this past fall that I continue to watch is Ringer, a thriller starring Sarah Michelle Gellar. The show has many flaws, but I like the touching way it subverts its own subversive noir sensibilities.

Ringer is about Bridget, a recovering drug addict and ex-stripper who witnessed a mafia hit. Afraid for her life, she ditches police protection and reaches out to her twin sister Siobhan. Siobhan appears to live a charmed existence, but then she disappears, apparently committing suicide. Bridget takes her sister’s place only to find out that Siobhan's life wasn’t so great after all: she was barely on speaking terms with her husband, her daughter was in a rebellious phase of late nights, booze and drugs, she was sleeping with her best friend’s husband, and someone was trying to kill her.

The show’s creators, Eric Charmelo and Nicole Snyder, have envisioned a noir drama with all the intrigue: mysterious deaths, extramarital liaisons, secret identities, double-dealings, police inquiries and abductions. It reminds me of the sort of thing Orson Welles would have loved, like The Third Man or Mr. Arkadin. What I have always liked about noir is the way it subverts the view that society is innocent and orderly in favor of the sordid acts and dark desires that are always there beneath the surface.

Ringer does more than that, however, by subverting its own subversion. No sooner does Bridget step into her sister’s life than she begins to set it right, ending Siobhan's affair and becoming a loving wife and mother. Meanwhile, the only one who knows she is really Bridget is her old sponsor from Narcotics Anonymous. Monogamous love. Familial harmony. Responsible parenting. Self-improvement. At times the script is like a moral manual to the dominant modern American values.

But this is where Ringer is at its best. The characters grow, which gives the show the potential to go somewhere when most television shows are populated with static characters who are doomed to be the same week after week, just in different situations.

Ringer’s execution is uneven, but its vision of light and dark--so tumultuous that it is hard to tell where the show comes down, on the spic-and-span world we are supposed to believe in or the world eating it from underneath, the superego or the id--is intriguing.

Sunday, November 20, 2011

Politics and the English Language: In Favor of More Arguing and Less Name-calling

In Politics and the English Language, George Orwell decries the poor use of language in the political writings of his day, calling for a less obfuscatory style and presenting examples of bad writing. No doubt political discourse has degenerated even further in the years since 1946, when Orwell’s essay first appeared. This may be a characteristic of political rhetoric common to all times and places about which little can be done, but there is one practice we really do need to put a damper on, and that is calling opponents “idiots.”

In recent years, I’ve been surprised to see how frequently people use the word “idiot” or an equivalent to describe anyone with whom they disagree. Perhaps this has always been true in the vulgar idiom of the street, but it is certainly new in higher spheres of discourse, such as public debate, from which one would expect more. It is hard to imagine past newsmen like Edward R. Murrow, Walter Cronkite or even more recent figures such as Dan Rather or Tom Brokaw calling public figures “idiots” on the air, but it has become quite common in recent years.

Fox News commentator Bill O’Reilly has his “pinheads,” and most other well-known news anchors have followed his lead in having a regular segment just for insulting someone. Politicians do it as well, such as when former Utah governor Jon Huntsman called Baptist pastor Robert Jeffress a “moron” recently (previous blog), or when former Speaker of the House Newt Gingrich in a recent Republican presidential debate offered open insult to a lot of people:

“What is amazing to me is the inability of much of our academic world, much of our news media and most of the people on Occupy Wall Street to have a clue about history.”

He called a lot of people idiots without using that exact word. The political and economic pundits are even worse about it, and even academic scholars, when debating contentious issues like evolution or religion, increasingly resort to insult.

Surely public figures, often educated at the nation’s best institutions of learning, know that calling someone an idiot is not an argument. Of course, the object of their ire may very well be an idiot, but what needs to be said is why that person’s view is so wrong--or why that person is an idiot. The summary insult is a shortcut that spares one the effort of formulating and expressing real arguments. The insult is superfluous and unconstructive.

I suspect that this tendency is at least in part a social phenomenon caused by information overload. Our lives are increasingly dominated by media that present us with a cacophony of contrary opinions all backed voluminously with facts and argumentation. The brain tires of the argument and wants to put it all to rest in one fell swoop. Calling someone an idiot accomplishes this nicely, at least in the mind of the one doing the name-calling, by obviating consideration of anything that person says.

As understandable as that is, we must take the time to form our opinions deliberately and express them eloquently. We must take the time to argue or we risk forfeiting the realm of public discourse to those least worthy of engaging in it--the least thoughtful and least well-spoken.

Thursday, November 17, 2011

Forgetting the History We All Know

Many recent public debates have shown that certain lessons from high school American History have either been forgotten or are being willfully ignored by our elected representatives, bestselling authors, experts and large swathes of the American people. We should, of course, be wary of simplistic applications of history to the present, but surely some lessons are beyond controversy and we should suspect that anyone who suggests otherwise is trying to lead us in the wrong direction.

In high school, I remember learning how the economic excesses of the Roaring Twenties resulted in Black Tuesday and the Great Depression. My instructor described FDR’s New Deal policies as a series of efforts to right the economy, some of which were more successful than others. The depression did not fully end, we were told, until the war economy of World War II, which was followed by the affluent society epitomized by the Leave It to Beaver world of the 1950s. One would think then that we would be wary of economic conditions such as those prevalent in the 1920s and view prudent government programs as a possible solution.

Yet, while economic disparities are as bad now as they were before Black Tuesday and unregulated markets continue to periodically crash the economy, big business and conservative politicians remain untouched and therefore unfazed. They continue to call for decreased regulation, lower taxes for the rich, dismantlement of welfare programs, and smaller government, this last to the point where the current Republican presidential candidates bicker over how many government departments they would abolish. These policies would roll back history, put us further behind the rest of the developed world, and quite possibly turn the Great Recession into something much worse, something that might even reach the 1% in their gated communities.

Another history lesson every American knows is that, while some balk at saying we lost the Vietnam War, we most certainly did not win it, nor was our waging of it always honorable. Think of the My Lai Massacre. One would think then that we would be wary of protracted military engagements and, when they’re necessary, carry them out with some humility.

Yet the armed forces remain a national fetish. Rarely does one hear a voice unconvinced of our military omnipotence or of the good-ol’-boyness of our troops. Indeed, to suggest otherwise is viewed as unpatriotic. The campaigns of the Republican presidential candidates in 2008 were based almost solely on an unseemly blend of military worship and jingoism. And dissent among their constituents and the media is uncommon. I think this is because many Americans prefer to ignore the experience of Vietnam in favor of the Good War, when the world really did need saving and we played a decisive, although hardly lone, role in saving it. Nonetheless, all wars and other military interventions since have been much messier affairs, with attainment of victory and our moral high ground more open to question.

I should add that I do have a great deal of respect for some members of the military and do support some of our overseas engagements, but I am less excited about the unquestioning adulation we are asked to lavish upon all members of the military and every operation they are called upon to perform.

Also disturbing are challenges to racial equality. American history is fraught with racial issues from slavery to Jim Crow, but I always believed that while some individuals may still practice discrimination and prejudice, our institutions and leaders had overcome the shameful practices of our past through struggles such as the Civil War and the Civil Rights movement of the 1960s. I thought that we had all learned the lesson that racism is bad and any view to the contrary is unspeakable.

Yet many in the public eye, again mostly conservatives, are attempting to stage a comeback for institutional discrimination against select non-white groups. Kentucky Senator Rand Paul, among others, has suggested that private businesses should have the right to refuse service on the basis of race, and recent tough immigration laws like Arizona Senate Bill 1070 have been drafted to target Hispanics, albeit in a sort of code language that thinly, and not very successfully, covers their vile core. Surely we are not still fighting this battle, are we?

Unfortunately, the answer to that question is “Yes, we are.” Karl Marx wrote in The Eighteenth Brumaire of Louis Bonaparte, “Hegel remarks somewhere that all facts and personages of great importance in world history occur, as it were, twice. He forgot to add: the first time as tragedy, the second as farce.” I do not know if the trends in forgetfulness I have mentioned above will lead to a second or third occurrence of past calamities, but there is surely much tragedy in the suffering that could result if we fail to learn even the most basic lessons from our past.

Thursday, November 10, 2011

Thawrat al-Karāmah: Dignity Revolution

“Men’s only hope lies in a revolutionary becoming: the only way of casting off their shame or responding to what is intolerable.”
     --Gilles Deleuze, Negotiations

In the summer of 2010, my wife and I took a trip to Tunisia. I knew very little about Tunisia at the time, as was true, I suspect, for many Americans. Now, I would guess that if not most, at least many more Americans are familiar with Tunisia, especially as the birthplace of the Arab Spring.

Our tour flew into Tunis and then described a big loop passing through Roman ruins at Dougga, Kairouan, Roman ruins at Sbeitla, the salt pan of Chott el Jerid,  the Berber village of Matmata, the dunes of the Sahara, the Roman amphitheater of El Jem, the popular tourist area of Sousse, and Carthaginian ruins in Carthage, eventually returning to Tunis. While there was much of interest, I have to say the highlight for me was the area around Chott el Jerid and Matmata, where some scenes set on Tatooine were filmed for Star Wars.

Our tour guide made much of how Tunisia was a secular democracy without any of the aspects of radical Islam so disturbing to the West. The impression we received was of a progressive and stable country, and very little we saw challenged that. Indeed, in Sousse, where European tourists were numerous, I purchased a nifty Crusaders-vs.-Muslims chessboard at an upscale shopping center staffed mainly by young women who apparently felt no need to even wear a scarf to cover their hair.

Looking back, however, I can see signs of the political and economic dysfunction that would soon inspire the Jasmine Revolution. Pictures of President Zine El Abidine Ben Ali, who had kept a tight hold on power since 1987, could be seen hanging in places of business, and one of the reasons we went to Tunisia in the first place was that the poor economy made it one of the cheapest tours available for seeing ancient ruins.

Six months later, in the town of Sidi Bouzid, a 26-year-old vegetable vendor by the name of Mohamed Bouazizi walked to a police station and immolated himself after incidents in which the police had confiscated his goods and insulted him. This set off protests. The Tunisian people had had enough and revolted--no longer would they allow the few with political, economic and brute power to grind them into the dust. Events escalated and Ben Ali eventually resigned. The country recently held an election in which about 60% of eligible voters participated, electing representatives from a handful of parties to a constitutional assembly. The party to win the most seats is the Islamist party Nahda, but at the moment, the party shows no signs of departing from democracy for theocracy.

Meanwhile, the revolution has bloomed, with regimes in Egypt and Libya falling and others in Syria, Yemen and Bahrain under pressure.  No one saw the Arab Spring coming and it is changing an entire region of the world on a scale so large that in future years we are likely to remember it as an era-defining event like the fall of the Berlin Wall. It is too soon, however, to say whether any of these countries will go on to become liberal democracies or, after the manner of the 1979 Iranian Revolution, lapse into regimes no better than what they have replaced. This reminds me of something I read a couple months back in Slavoj Zizek’s In Defense of Lost Causes:

“Recall how Arendt describes, in Badiouian terms, the suspension of temporality as the defining ontological characteristic of ontic political action: acting, as man’s capacity to begin something new, “out of nothing,” not reducible to a calculated strategic reaction to a given situation, takes place in the non-temporal gap between past and future, in the hiatus between the end of the old order and the beginning of the new which in history is precisely the moment of revolution.”

While one might say that the events shaking the Middle East are the result of the past--long years of repressive regimes who humiliated their peoples--and that the Middle East is racing toward a new future born of that past, we might also say, after Zizek’s summary of Hannah Arendt above, that they are in a timeless moment that refuses the past but has yet to embrace a future. They are at a tipping point, only no one can say which direction they will fall.

I like to think that Occupy Wall Street is a part of the movement sweeping the Arab world. Like the brave people who gathered in Tahrir Square to demand that then Egyptian president Hosni Mubarak step down or the Libyan dentist I saw on the news who had turned into a machinegun-toting rebel and was trading fire with Muammar Gaddafi’s thugs, the 99% in America have had enough of working too hard for too little while the 1% has gone from rich to super-rich and contemplates what comes next. While anything deserving the description “revolution” appears to be a ways off yet, and may never come, I hope Occupy Wall Street will continue and that its methods will remain peaceful but increasingly effective.

A greater worldwide movement against oligarchy would require a different name than “Arab Spring,” but its courageous origins needn’t be abandoned. The Tunisians called their revolution Thawrat al-Karāmah in Arabic. This is a name that could be used by the downtrodden anywhere because it means Dignity Revolution.

Monday, November 7, 2011

Bottom as the Most Human Character in A Midsummer Night's Dream, with Commentary on Baltar in Battlestar Galactica

“Lord, what fools these mortals be!”
     --Oberon in William Shakespeare’s A Midsummer Night’s Dream

 
A few years ago, lying in bed reading Shakespeare’s A Midsummer Night’s Dream, I had a flash of insight: Bottom, one of the Clowns, is the most human character. Ever since, I have believed that my insight came from a comment in the introduction to my copy of Shakespeare’s drama, but pulling the book out recently to see if I could explore this line of thought further, I found that no such comment exists. It must have been a genuine insight of my own. So, in order to pull off this blog post, I embarked on a review of A Midsummer Night’s Dream.

 
Nick Bottom is one of the hapless Joes who have undertaken to perform a play before Theseus and Hippolyta, duke and duchess of Athens. Bottom approaches the task with gusto, desiring to play multiple parts himself so that he can show off his thespian skills. When the actors meet in the forest at night to practice, the mischievous fairy Puck transforms him so that he has the head of a donkey. Under the influence of love juice, Titania, Queen of the Fairies, falls in love with him, but he is more interested in eating oats than in her affection. Later, he awakens back in human form, remembering what has happened as a lovely vision. A Midsummer Night’s Dream ends with the nobility of Athens poking fun at, and touched by, Bottom and his ridiculous troupe as they perform their play.

 
It is Bottom’s humanity as Clown that originally interested me, although he is the most fully human character in the play in other ways too numerous and complex to go into here. All his lofty aspirations are thwarted and he ends up looking the fool. There is much of the human situation in this. We would do great works, but we are imperfect creatures and all too often our fine intentions end up an embarrassing shambles. 

 
For all his clownishness, however, Bottom holds an exalted place in Shakespeare’s play. Not only is he the only character to cross the line between the mundane world and the Faerie Realm, there to share the queen’s bed, but upon returning to the world of Bottom the Weaver, he has an epiphany that he expresses in a celebrated passage:

 
“I have had a most rare vision. I have had a dream, past the wit of man to say what dream it was. Man is but an ass if he go about to expound this dream. Methought I was—there is no man can tell what. Methought I was, and methought I had—But man is but a patched fool if he will offer to say what methought I had. The eye of man hath not heard, the ear of man hath not seen, man’s hand is not able to taste, his tongue to conceive, nor his heart to report what my dream was.”

 
The footnotes tell me that the last sentence is a corruption of 1 Corinthians 2:9-10, but what strikes me most is how Bottom touches upon what he is—an ass and a fool. And yet, when he returns to Athens and runs into his buddies, he refuses to expound upon his dream, thereby refusing to play the fool. His experience in the Faerie Realm, turning from human to ass, has actuated a change from ass to something higher in the real world. This capacity to span worlds both mundane and numinous, and to rise by reflection, is it not distinctly human?

 
The other characters in A Midsummer Night’s Dream are more archetypal, each one a narrower representation of the human spirit. Theseus is a wise, benevolent, yet firm ruler, whereas Hippolyta is more given to emotion and romantic fancy. These two are mirrored by Oberon and Titania in the Faerie Realm. Puck is pure playfulness, and the other characters are generally nondescript, although their antics are quite amusing. Bottom alone shows that he has more dimensions than one.

 
Around the same time as my insight regarding Bottom, I realized that the character of Gaius Baltar in the new Battlestar Galactica television series is also a clown, and he too is the richest character in the series.

 
Baltar is a famous scientist and inveterate playboy on planet Caprica. He becomes infatuated with a Cylon—a type of robot who looks human—and gives her defense codes, which the Cylons then use to launch a nuclear strike that nearly drives the human race to extinction. He joins a ragtag fleet of survivors under the protection of the battlestar Galactica, which then flees across space in search of a new home on planet Earth. Along the way, he is continually thrust into positions of responsibility even as he bears the secret of his guilt.

 
That may not sound very funny—and indeed Baltar’s comedy is mixed with much tragedy—but Battlestar Galactica reserves what little outright humor it has for scenes involving Baltar. A great deal of this humor involves him talking to, sometimes making out with, a vision of his now deceased Cylon lover that appears only to him, thus making him look crazy to others. Perhaps the funniest scene of the whole series is when a Cylon who looks exactly like his dead lover, but isn't, appears. He can't understand why she is acting like she isn't who he thinks she is and angrily pursues her into the latrine. Of course, someone walks in as he is standing before a closed stall door, shaking his fist and shouting, “No more Mr. Nice Gaius!”

 
That is all slapstick, but like Bottom’s, Baltar’s clownishness cuts a truly human figure. His lofty goals fall to ruin, he expresses a manic range of human emotions, crosses boundaries between worlds spiritual and profane, and experiences epiphanies only to fall victim to fate or lapse into his old bad habits. Battlestar Galactica is not short on rich characters, but none displays this richness of human character and experience, except perhaps Starbuck. And none is a Clown whose sticky situations, funny to everyone but himself, are so representative of the human condition.

 
Like Bottom at his art or Baltar in his constant turmoil, we would do something great, but our efforts too often end in disarray and ridicule. Yet there is majesty in our striving and richness in the breadth of our souls.

Wednesday, September 28, 2011

Twilight of the Cultural Omnivores

Earlier this year, I read an article on NPR’s website called “In Praise of Cultural Omnivores” that referred to a study by the National Endowment for the Arts showing that cultural omnivores—people who enjoy both high and low culture—are on the decline. The article struck a chord with me because I realized that I am a cultural omnivore—I just never knew what to call myself before.

The study, “Age and Arts Participation: A Case Against Demographic Destiny,” examines whether age and generation have a strong correlation to arts participation. The results are a little complicated, but one thing is clear. Cultural omnivores are disappearing, as are highbrows, albeit to a somewhat lesser degree:

“Omnivore representation declined from 15 percent in 1982 to 10 percent in 2008. Highbrows represented just over 7 percent of all respondents in 1985 and 1992 and then declined to 5.3 percent in 2008.”

Much of the literature on these trends tends to focus its worry on the future of traditional highbrow art forms like classical music and ballet, but what worries me in addition to that is the inability or unwillingness of people to cross cultural lines either by ascending or descending through cultural strata.

The NPR article mentions “social status,” but doesn’t explore in much depth the possibility of a connection between wealth and cultural activity. The study also merely brushes against this disturbing possibility by drawing a connection between higher education and greater cultural participation. In an America with an ever smaller wealthy set, a disappearing middle class and growing poverty, many people simply cannot afford a night at the opera, and they have less access to the spheres of influence that would make them want to go in the first place.

Economic conditions--material conditions, to be Marxist about it--may to a certain extent impose constraints determining what people enjoy, but I feel as if, at the same time, more and more people choose to limit themselves to the perceived dictates of class. Many feel that to enjoy high culture you must be an expert or a snob. If you want to enjoy wine tasting, you must know all the etiquette and be able to wax lofty upon the bouquet, or you might as well not even bother. If you want to enjoy classical music, you must be able to facilely discourse upon the sublimity of Brahms’s A German Requiem. It’s never okay to dabble and learn; you must be that guy who knows it all, so it’s better to stick with low forms of culture, which are easier to come to grips with.

I do not, however, really buy that the distinction between high and low art can be drawn as a difference between genres or media. The distinction should be one of quality. I find a lot of literary fiction, for example, to be pretentious and uninsightful, while no small amount of genre fiction shows much greater craft and perspicacity about human nature. Likewise, much comic book art today—even the superhero stuff, not just underground or indie comics—is innovative and engaging, whereas the work in the spaces set aside for current visual artists at the San Francisco Museum of Modern Art is generally ho-hum.

Omnivores have grasped this. There is significant value in all forms of art and sometimes the highest experience is to be had with the lowest art form, making it not so low after all.

I am glad to be an omnivore, and while I would hesitate to say that you should be one, too—for making omnivorous cultural behavior a moral imperative would be the kind of snobbishness detrimental to the omnivore’s cause—I would encourage anyone to be one. It is always good to broaden one’s horizons.

Friday, September 23, 2011

President Obama Way Behind at the U.N.

President Obama’s approach to foreign policy is often described as “leading from behind.” This style of leadership is unsatisfactory for some who see it as suspiciously like doing very little, but I have generally cut the president some slack when critics express frustration with his reluctance for bold action. This week at the U.N., however, saw him trying to lead from so far behind that he was just plain behind.

Palestinian President Mahmoud Abbas’s call for U.N. recognition of Palestine is commendable as a bold and peaceful effort toward concrete progress in an old conflict. Israel occupied Palestine in the Six Day War of 1967 (which wasn’t a war with Palestine) and then stayed. To this day, Israel has continued to establish new settlements in Palestinian lands, and since 2007 it has maintained a blockade of Gaza that has caused widespread poverty, joblessness and hunger. Palestine should have full state status, and apparently most of the world is willing to grant it now—everyone but Israel and the U.S.


When President Obama addressed the U.N. General Assembly on Wednesday, he used his usual conciliatory rhetoric, accompanied by his this-is-common-sense-folks tone of voice, to reaffirm his support for eventual full statehood for Palestine, while also urging Palestine to delay its call for full membership:


“I am convinced that there is no short-cut to the end of a conflict that has endured for decades. Peace will not come through statements and resolutions at the U.N.—if it were that easy, it would have been accomplished by now. Ultimately, it is Israelis and Palestinians who must live side by side.” 


His words must have sounded hollow to everyone in that hall. Abbas is offering him a chance to make real progress—but by no means a final step—toward peace in the Middle East, real progress his previous efforts have failed to produce, and he’s passing it up.


French president Nicolas Sarkozy, meanwhile, has shown admirable leadership. In expectation of this conflict, he opened the General Assembly with an actual constructive suggestion. His comments, while they came before President Obama’s, seem designed as a response to them: 


“True, this peace will be built by the Israelis and the Palestinians. No one else can do it. And no one can claim to impose it on them. But we must help them.” 


Sarkozy proposes upgrading Palestine’s status from observer entity to observer state, with a one-year deadline for Israel and Palestine to reach an agreement. He probably wants to avoid escalating violence and instability in the region that might be instigated by a U.S. veto of Palestinian statehood in the Security Council, but I suspect he also views Obama as a friend, an embarrassing one who is behaving poorly and requires a little help not looking any more foolish than he already does.


Peace in the Middle East will come when the U.S. decides to stop sponsoring Israel’s occupation of Palestine and sends a clear signal that it will brook no more nonsense on this issue. The Obama administration should seize upon the opportunity presented by Abbas’s proposal to stop lagging behind and get caught up with the rest of the world.

Thursday, September 15, 2011

Reflections on the Necessary Evil of Government Today in the United States of America from the Viewpoint of a Hypothetical Situation in a Pamphlet by Thomas Paine

Earlier this year, I read Common Sense by Thomas Paine. In this short work of revolutionary reason is a passage in which its author presents a hypothetical scenario imagining the pristine development of government, with the suggestion that any rational government would preserve similar principles. I was shocked to find that the American system of government today falls far short of these principles.
 
Paine imagines a small group of people who find themselves isolated “in some sequestered part of the earth” in a “state of natural liberty.” It isn’t long before they realize that they need to help each other out if they are to survive. Thus, they become a society. At first, this society is harmonious, but inevitably disputes arise and government becomes necessary for laying down the law. While the colony is small, everyone can participate as a rule-maker, but eventually their society grows and there are too many people, so they have to elect representatives. The idea is that these representatives come from the people, legislate in the interests of the people, and will before long leave government and return to the people, allowing new representatives of the people to step in—so that “the ELECTED might never form to themselves an interest separate from the ELECTORS.”

 
Paine says that this is the origin of government and that “the simple voice of nature and of reason will say it is right.” I do think it makes sense, but the form of American government today does not closely resemble the fraternity described by Paine in certain key aspects.

 
Paine’s hypothetical situation allows for anyone and as many people as possible to get into government, but today you must be rich. In recent years nearly half the members of Congress have been millionaires. What’s worse, only those with the weight of vast amounts of money behind them can get the ear of the government. We all know about the lobbyists that course through Washington’s halls of power and how presidents appoint fat cats to key positions, but the whole American political process is awash in money. An example: As part of his 2012 campaign, President Obama plans to have intimate dinners with ordinary citizens . . . who pay $38,000 for the honor.

 
Another characteristic of the government described by Paine is the regular turnover of the elected, yet in our government, the elected often stay in for decades. A couple of obvious examples are Strom Thurmond, who was Senator for South Carolina for 49 years, and Ted Kennedy, who served nearly 47 years as a Senator from Massachusetts. Our politicians are professional politicians who, once they get in, do everything in their power to stay in until retirement. Even if they did at some point spring from the people, they have little intention of ever returning to them. We should not be surprised, then, if Washington’s interests are not those of the rest of the country.

 
In Paine’s proto-government, everyone participates, either as a voting member of the government while the colony is still small, or as a voter once the community must rely on elected representatives. Yet, in America today, large portions of the population face obstacles to voting. Perhaps the biggest obstacle is that the working class . . . has to work. The working class often cannot take time off, even for sickness, without some difficulty. Absentee balloting is available, but registering beforehand is just another obstacle in a process that should have very few.

 
All of these problems have what should be easy solutions: establishing campaign spending limits, setting term limits, and making election day a national holiday. Many modern democracies have exactly these practices in place. The primary obstacles to instituting such policies here are political—our politicians don’t want to do anything that would make them less well off, hand voters to the opposition, or put them out of a job.

 
We don’t have to do something just because Thomas Paine said it was a good idea, but we should if what he says makes sense. And tell me, doesn’t it?

Sunday, September 11, 2011

The Common Dynamic Underlying 9/11s

Today marks ten years since the attacks of September 11, 2001, so the airwaves are full of memorials and analyses, but something I find missing from all the talk is recognition that 9/11 does not belong only to America.

One used to hear the foreign casualties mentioned, but I have not heard them mentioned this week. A total of 90 countries lost citizens in the attacks that day. Of the 2,977 casualties (excluding the hijackers), 372 were from other countries. Twenty-eight were Muslims from European, African, Asian and Middle Eastern nations. I lived in Sapporo, Japan at the time of the attacks; twenty-four of the casualties were Japanese. And we must not forget that today’s most violent party of God has unleashed multiple attacks on other countries since. The Madrid train bombings, the London transport bombings and various attacks in Pakistan come to mind.


Three Chileans died in the attacks ten years ago. September 11 was, however, already an infamous date in  Chile, for it was on September 11, 1973 that a C.I.A.-backed coup overthrew the democratically elected government of Salvador Allende in favor of a military junta led by General Augusto Pinochet. The regime began in the blood of hundreds and went on to kill thousands and torture tens of thousands. I remember reading in The Japan Times about Pinochet’s arrest in Britain under universal jurisdiction and the ensuing controversy when he was released on medical grounds. One article showed a large caricature of the general, his oversized head framed by the cap and collar of his military uniform and surrounded by skulls. He returned to Chile in March 2000, just months before al-Qaeda bombed the USS Cole in Yemen.


I mention this first 9/11 because looking at the two together provides a broader perspective. The essence of a 9/11—which, of course, doesn’t have to fall on that date—is a conflict between liberal democracy and its enemies, be they megalomaniacal leaders like Richard Nixon in Washington D.C. or Islamofascists in the deserts of Afghanistan. There is an us-versus-them dynamic in the second 9/11 and the struggle against terror since, but the “us” with which we should be identifying is not just America but, at another level, anyone who sides with free and open societies.


According to this dynamic, while new towers on the WTC site will be a nice symbol, the best way for us to strike at them is to embody our highest values—refusing to turn intolerant of Muslims here at home or supporting Muslims in their battle against dictatorship in Libya, for example—by siding with the parties of liberty, whatever their nationality.

Tuesday, September 6, 2011

Psst!—We Are All Marxists Now

“The revenge of Marx begins now.”
     --Masahiro Mita (translation from Japanese mine)


Those on the Right today are quick to call anyone said to be on the Left a Marxist, and those on the Left are quick to deny it. In American politics, “Marxist” is an insult and Marxism is the bogeyman. I always find this funny, because as far as I can tell, everyone believes that this bogeyman’s most fundamental concepts hold true for America today. We are all in some ways Marxists; we just use different terminology.


The first thing anyone ever learns about Marx’s thought is his belief that societies split into two opposing groups, with one always having the advantage. In the Middle Ages, those two groups were the lords and serfs. Marx called the two groups in the capitalist societies of his day the bourgeoisie and the proletariat. The bourgeoisie is the class that possesses the means of production and the proletariat consists of those who do the work. Today, we talk about the haves and the have nots, and just as in the dichotomies of times past, one group has all the power and wealth, and one group has precious little.


Another fundamental concept of Marxism is the accumulation of wealth. Marx noticed that the bourgeoisie employs its money to make ever greater amounts of money for itself, while conditions for the proletariat tend to worsen. This is no different than what we today describe as the growing gap between the rich and the poor. Report after report for years has described a middle class falling in ever greater numbers into the ranks of the poor, while the rich become the super rich. A quote from a recent article by Nobel Prize-winning American economist Joseph Stiglitz in Vanity Fair:


“The upper 1 percent of Americans are now taking in nearly a quarter of the nation’s income every year. In terms of wealth rather than income, the top 1 percent control 40 percent. Their lot in life has improved considerably. Twenty-five years ago, the corresponding figures were 12 percent and 33 percent.”

The wealth is accumulating, comrades.


And a great part of Marx’s greatest work, Das Kapital, is dedicated to describing the exploitation of labor, terminology that has survived unchanged to our own day—long hours, little pay, no benefits, wretched work environments, and so on. You know the litany of workplace horrors visited on employees by their employers, because they are all too common in America today. Some of the worst conditions may have improved here thanks to regulations (although companies often avoid these regulations by setting up sweatshops overseas), but workers continue to suffer degradations at the hands of their employers.


In Proposed Roads to Freedom: Socialism, Anarchism and Syndicalism, English philosopher Bertrand Russell (1872-1970) discussed the ways that Marx’s philosophy had fallen short, among them the graduated class structure capitalist societies were demonstrating rather than two starkly different classes, and the absence of working-class revolutions in most capitalist nations. These criticisms are valid, but it seems to me that Marx got an awful lot right. British literary theorist Terry Eagleton appears to agree, since his latest book is called Why Marx Was Right.


I am not just saying that much of Marx’s analysis of capitalism holds true today; I am saying that everyone—from politicians to news analysts to businessmen to the “average American”—seems to have drawn many of the same conclusions. They just don’t call it Marxism, don’t want to call their beliefs by what has become a dirty name, or don’t know that that is what their beliefs are called.


The form of today’s economic and social dialectic may be different than it was in the 19th Century, but the content is essentially the same. And while revolutions may have failed to appear or to take hold, it is well past time that we lend an ear (a critical ear, of course) to this thinker from the past, whose voice has in many ways proven prescient, and look for a means to address the inequalities of our day.

Sunday, August 28, 2011

The Destruction of Wagner by the Artsy Elite


In this week’s The New Yorker (Aug. 15 & 22, 2011), Alex Ross reflects on his experiences at the Bayreuth Festival. As a Wagnerite, I have often dreamed of going to Bayreuth to see the Ring Cycle, but the last I checked, procuring tickets was a lengthy, complicated and expensive process lasting years. After reading Ross’s lamentations on the avant-garde productions at the Bayreuth Festspielhaus, I am a little less disappointed that I will likely never attend the festival.

This year’s festival includes a performance of Tannhauser, which is about a minstrel knight who, after spending a year and a day enjoying pagan delights with Venus, is refused forgiveness by the Pope but ultimately achieves it through his pure love for Elisabeth. The opera is based on the historical German minnesinger and poet Tannhauser, who lived in the 13th Century, as well as on legends and mythology, and it features various Medieval personages such as a landgrave along with a healthy helping of nubile demi-goddesses and fantastical creatures—the Three Graces, sirens, naiads, nymphs and bacchantes—so where better to set the action than in a “dystopian waste-recycling facility” (Ross) dominated by a pre-existing piece of art that is, in the words of its creator, the installation artist Joep van Lieshout, a “closed circuit of food, alcohol, excrement, and energy”?

I must admit, I am generally not a fan of this type of avant-garde reimagining and updating of a classic dramatic work, although it appears to be the norm in opera productions. 

I can speak from experience. In May this year, my wife and I went to see Siegfried at the San Francisco Opera House. Siegfried is about a boisterous young man who leaves his home in the forest, slays a dragon and takes its treasure, confronts his father Wotan (the ruler of the gods), and finally ascends mountainous heights, where he traverses a ring of magic fire and finds a sleeping Valkyrie with whom he promptly falls in love. The pre-performance speaker described the SF Opera’s production as beginning on the outskirts of an urban wasteland, and indeed, the curtain rose on a junkyard with a blasted trailer home in it. In Act Two, the dragon Fafner was a mechanical beast that looked like an overgrown four-wheeler with the head of a cheap robot toy. The quality of the musical performance was fine, but the early sets were cringe-inducing.

Luckily, these urban elements faded somewhat after the slaying of Fafner. The rest of Act Two was dominated by bright-green forest imagery projected on a screen, in front of which flitted the Woodbird in a bright red outfit. Act Three mostly took place in scenery suggesting rocky heights, soaring and majestic for the meeting between Wotan and Erda the earth goddess and turbulent for the awakening love between Siegfried and Brunnhilde. The second half of the opera was much more enjoyable as it assumed a congruence with the music and dramatic action.

The people who design productions such as those described above are no doubt incredibly talented musicians, directors, producers, set designers, and so forth, but I must cry from my seat in Philistia that they have missed the essence of the work. All their theory—of which they have too much—is wrong. The music, characters and stories speak for themselves, as plainly today as they did in Wagner’s day, and do not need any meddling from us.

I suspect the artsy elite simply look down their noses at fantasy, even fantasy of the mythological type present in many of the greatest operas. They would not be caught dead reading David Gemmell, nor would they put a horned helm on Brunnhilde’s head unless it were over the dead body of their sophistication. That is all so tacky and old-fashioned. But that is where the real power is. History, myth and imagination are much bigger and more meaningful than any narrow commentary on our own times, no matter how cleverly contrived.

Wagner knew that—after all, he set most of his stories in a mythical past rather than a context more literally resembling the Europe of his own day—and we should too. According to Ross in his article, boos have become part of the standard Bayreuth repertoire. It’s hard to imagine that would be so if the original operas were allowed to shine.