Friday, March 07, 2008

Killer Pretzel Strikes Again

Remember the January 2002 "killer pretzel" that left George W. Bush bruised after he choked on a pretzel and fainted? This goes against every political bone in my body, but I now have reason to think that maybe he told the truth for a change.

True personal experience this week: I was at my desk, racing to get work done, unable to go to lunch. I got up at about 3 or 4 pm and grabbed some pretzel sticks and a Diet Coke.

Sat down, popped one in and took a sip. Somehow, either the pretzel went down the wrong way or the Coke flooded my throat or ... I don't know. Next thing I knew, I had propelled myself out of my chair, noisily attempting to breathe.

Like Bush said, "I hit the deck." I fell, throwing a wastepaper basket out of the way and even shoving a bookcase against the wall so hard that the phone jack was twisted and the phone became inoperable.

I'm not sure what happened then. I blacked out. I came to in pain, lying on the chair mat and attempting to catch my breath. I could not speak, only signal that I needed a moment.

I felt myself sweat profusely. It was a cold, panicked sweat. Slowly breath returned to me and from shallow gasps I went to deeper, more moderate breathing.

Then I noticed I had hit my left leg badly. My big toe was swollen and, upon inspection later, at home, it was bruised -- just like Bush's face.

CNN called it a "vasovagal syncope" at the time. I'd come across that term once before, when someone I know had a horrible, humiliating loss of bodily function. According to Wikipedia, a syncope is
a sudden, and generally momentary, loss of consciousness, or blacking out caused by the Central Ischaemic Response, because of a lack of sufficient blood and oxygen in the brain. The first symptoms a person feels before fainting are dizziness; a dimming of vision, or brownout; tinnitus; and feeling hot. Moments later, the person's vision turns black, and he or she drops to the floor (or slumps if seated in a chair). If the person is unable to slump from the position to a near horizontal position, he or she risks dying of the Suspension trauma effect.
This approximates in many ways my own experience, and possibly Bush's.

Uncannily, Bush in 2002, when he had his episode, was almost exactly the age I am now. Perhaps it's a middle-aged-man thing. The killer pretzel attacked me, too.

Tuesday, February 26, 2008

The Hidden Norms in Religious Flux

Being part of a team that surveyed active and lapsed Catholics in the early 1980s prepared me to deal with today's news stories about a Pew study on religious change in the United States. Let me share two things I learned back then that make sense now.

Keep in mind that most of these surveys can only measure affiliation through a tangible behavior that is deemed to denote an inner disposition. While Scientologists claim to have machines that can measure advancement in their religion, social scientists do not have a soulmeter of any kind.

So, for the most part, the sociology of religion describes the behavior of churchgoers, often using measures that are not doctrinally correct. For example, for the study of Catholics we called someone "active" if they went to church on Sunday at least four times a year, not counting major holidays or family occasions.

This is well below the canonical obligation of Sunday Mass, but it is a behavioral indication of a certain degree of engagement. Indeed, in most predominantly Catholic countries perhaps a tenth of all Catholics go to Mass on a regular Sunday; in the United States, a survey in the 1990s found attendance as high as 45 to 55%, depending on how you counted it.
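
Incidentally, the behavioral rule is mechanical enough to write down. Here is a minimal sketch, in Python, of the kind of classification we applied; the field names and data shape are my own illustration, not the study's actual instrument:

    from dataclasses import dataclass

    @dataclass
    class Attendance:
        """One reported church visit (hypothetical fields)."""
        on_sunday: bool
        major_holiday: bool     # e.g., Christmas or Easter
        family_occasion: bool   # e.g., a wedding, funeral or baptism

    def is_active(visits):
        """The study's rule: 'active' means at least four ordinary-Sunday
        attendances a year; holidays and family occasions don't count."""
        ordinary_sundays = sum(
            1 for v in visits
            if v.on_sunday and not (v.major_holiday or v.family_occasion)
        )
        return ordinary_sundays >= 4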

OK, my insights now.

First, it is statistically normal for people between the ages of 15 and 30 to "drop out" of the organized religion in which they were raised. This I learned from sociologist Dean Hoge, who led the research teams and wrote the book, long out of print, about the study.

"Normal" to a sociologist only means that a behavior does not deviate significantly from the social average. It doesn't mean it is good or bad. There are many reasons why disaffiliation during adolescence and early adulthood might occur in societies in which this period involves a prolonged crisis.

The various Anabaptist denominations (Amish, Mennonites, Brethren, etc.) developed a detour around this by decreeing that they would not baptize or affiliate infants. Indeed, most Anabaptists don't formally join their churches before marriage.

This leads to the second interesting insight: most people's religious affiliation has very little to do with philosophy or theology.

Most plainly, I learned from interviewing people who had returned to the faith that the pattern was this: once they got married or, even more importantly, had children, many drifted back to churchgoing. It was almost as if they wanted to give their children something similar to rebel against.

Significantly, too, very few people described conversion or a return to faith as a process involving study and thought, or the reading of certain works. Most converts wanted to marry a Catholic.

At the time, I found this profoundly disappointing. I had been involved in the conversion of two people who had wrestled with ideas, read and discussed books with me, written lengthy letters with questions and concerns. They were modern St. Augustines, turning from one view to another with thoughtful deliberation.

Even in my questioning of religious faith, I have always felt the theological and philosophical issues were important. The idea of changing to get married or to pass on certain conventions to children seemed and still seems very hollow.

This is why I find the Pew study less than interesting. Yes, 28% of U.S. adults have changed from the church of their fathers (or mothers), most of them in their younger years. Given the pattern of social research, I doubt they were asked very deeply why.

Friday, February 22, 2008

Extra Ecclesiam Nulla Salus

Is it wrongheaded to hold that those who assert ideas contrary to your own are mistaken and that, ideally, they should see the error of their ways? Much as I bitterly disagree with the pope I call "Nazinger," what has been overlooked in the brouhaha over the Good Friday prayer for the conversion of Jews is, to my mind, a philosophical debate about conviction and tolerance.

In speaking of conviction, let's agree that we're talking about tested ideas about which you have a certainty that is, perhaps, not absolute, but sufficient to convince you of their validity or truth. Similarly, by tolerance I mean the amicable and peaceful acceptance of those who hold differing convictions.

Take the proposition that the Earth orbits the Sun. When Galileo affirmed it, there was no empirical way to verify whether this was true; we now have been able to "see" the orbit in motion from satellites and spaceships to the point that this is a fact. It wasn't in Galileo's time.

Was Galileo wrong to insist that his heliocentric scientific theory was right and that the views of his church inquisitors were mistaken? Assuming Galileo prayed on this matter, would it have been wrong for Galileo to ask God to help convince Cardinal Bellarmine?

Is it wrong for Democrats to seek to convince Republicans? For Obamans to woo Clintonites? For Keynesians to wish to persuade Adamsmithians that they're off the mark by a few points?

After all, not absolutely everything a Republican president does is without some redeeming value, and there isn't a huge policy difference between Hillary Clinton and Barack Obama, and Keynesian economics can be just as fallible as orthodox free market capitalism.

Yet wouldn't Democrats have a point or three in noting that Republican presidents brought us the Great Depression, the stagnation of wages beginning in 1973, the de-industrialization of the United States in the 1980s -- and I'd run out of space cataloging the current prez's disasters?

Wouldn't Obamaniacs have some bragging rights when it comes to their candidate's ability to sway and mobilize?

And didn't those who deficit-spent us out of World War II (and the Depression) and eliminated hunger for two decades through food subsidies show that pure-accounting balanced budgets and minimalist governance, such as propounded by McCain when he isn't squiring a blonde lobbyist, are not particularly useful policy recipes?

That's what conviction is all about: being sufficiently convinced of something to assert that it is the truth, even without total and absolute proof. Most of what we "know" is really a matter of reasonable conviction and/or trust in a given source, rather than actual, factually verified knowledge of our own.

A confusion arises in our day that mixes up syncretism, the attempted reconciliation of different or opposing principles, and relativism, the deeming of all ideas to be valid or truthful relative to a variety of factors, with tolerance.

In Western culture this is a debate that has as its center the classic ecclesiastical Latin phrase in my heading, which literally means "outside the Church there is no health." This was the conviction of Cyprian of Carthage, a third-century bishop who made the idea famous. (Personal note: Cyprian was converted from paganism by St. Caecilius, a North African presbyter who may be the source of my name.)

Cyprian faced two crucial issues for the Church of his time: whether the baptisms performed by heretics were valid and whether the Christians who defected to paganism and renounced their faith during the Decian persecution, a majority, should be welcomed back.

The Carthaginian prelate argued that the baptisms were invalid and refused absolution to the apostates without long and public penitence unless they were facing death. In the first matter he went against the consensus of his time, and of every era up to the present. In the latter, a council supported his view.

One need not be a believer to see logic in Cyprian's arguments.

If you do not believe or do not believe "rightly," no matter what words you use and what actions you take, the meaning of what you do cannot possibly go beyond your own convictions. If you betray your beliefs publicly to save your skin, while others are dying for the same beliefs, returning to fellowship with other believers might reasonably entail some action showing remorse before being accepted in fellowship.

Do note that in both controversies Cyprian, while intolerant of dissension and defection within his group, had nothing to say about the world outside, other than that it lacked "health," later translated as "salvation." Why would Cyprian have gone peacefully to his beheading, rather than publicly state he believed otherwise, if he didn't think that his way was the healthful one?

My point is that, even as I look in from the outside and disagree with the substance of Cyprian's conviction, I still admire and agree with the notion that one should stand for one's convictions.

People of conviction A are entitled to believe that A would be better for people of conviction B. Catholics are entitled to pray for the conversion of Jews, Muslims, Protestants, and even me, since they believe that believing in Catholicism is the best thing since sliced bread. Democrats are entitled to hope for a change of heart in Republicans.

Thursday, February 14, 2008

Rethinking "Terrorism"

A friend's philosophy course assignment prompts me to reconsider the term "terrorism," particularly in light of its recurrent invocation and abuse by the Bush Administration. Who is a terrorist and what is terrorism?

The specter of "terrorism" was applied with such a broad brush by the Argentine military in the dictatorship of 1976-83, at the cost of the lives of people I knew, among them a close friend, that it has long lost any meaning to me.

Terror? Maybe the White House aides whom I saw scrambling out like rats on the morning of Sept. 11, 2001, were frightened by the 19 fanatically misguided Muslims who in suicide attacks flew planes into several buildings.

Although I was well within the White House security perimeter, I only stopped working because the FBI kicked me out of my office -- allegedly to protect the president, who was hiding his own very brave hide in Nebraska at the time, as I recall.

People aren't terrorists just because we don't like 'em and would like to lock 'em up. They have to willfully inspire terror.

Yet that is not, insofar as I can tell, the aim of Al Qaeda. Osama and his buddies want to destroy the United States, scared or not. "Death to America" is not the same as "Terror to America."

Terror means intense fear throughout a large population. Neither the original Spanish guerrillas who fought Napoleonic troops in the early 19th century nor the admittedly effective French Maquis of World War II nor, arguably, even the Viet Cong managed to hold whole populations in the thrall of fear.

Indeed, the repeated failure of Ernesto Che Guevara is a testament to the inadequacy of insurgency as an instrument of terror. Even in suicide-bomber-rife Israel, the likelihood that alleged terrorists will get you is a crap shoot; you're just as likely to get hit by a crazy Israeli driver.

Historically, political terror has been the weapon of rulers intent on scaring large numbers of subjects into submission. The public hanging, drawing and quartering of Jesuits in England and the recurrent whacking of guillotine blades on the French nobility were both instances of terror. Most people feared being thought Catholic in Elizabethan England or a blue-blood in Revolutionary France.

Under Joseph Stalin, terror was evident in the applause sessions after speeches, which sometimes lasted as long as an hour because no one wanted NKVD agents to see them stop applauding first. McCarthy-era blacklisting was a form of economic terror: if some people thought you were a Communist, they felt entitled to deprive you of your livelihood without trial -- even though it was never illegal to be a Communist.

Who wields terror today? Think about it.

Al Qaeda doesn't care what Americans feel. These fanatically theocratic Muslims believe in wiping Western liberal (and illiberal) democracy, along with the Western humanistic mores that go back to the Renaissance, off the face of the Earth.

The only people who stand to gain from terror, politically and economically, are George W. Bush, Richard Cheney and their associates. Oh, yes, and the cops everywhere who act like they're rushing to smoldering Downtown Manhattan seven years ago every time someone doesn't halt quite long enough at a stop sign.

Those folks really scare me. Bush and Cheney have already launched two wars. The cops -- and every thick-necked wannabe vigilante -- are notorious bullies. That's terror.

Tuesday, February 12, 2008

Yes we can vote for a Black man

In a vapid attempt either to rescue sagging circulations, to pander to Hillary Clinton or merely to expose to the world how little they know about Hispanics, major American newspapers have trumpeted a Brown vs. Black rift that has never really existed.

Supposedly, Latinos despise African-Americans and for that reason are flocking to Clinton.

As proof, major papers from East to West have rediscovered Dolores Huerta, who otherwise never graces their pages. Huerta has been a great labor organizer whose claim to fame, per the Anglo press, is to have worked alongside Cesar Chavez (note to broadcast journalists: please pronounce that SAY-czar CHA-vase, not Caesar ChaVEZ).

Huerta has cast her hat in the ring for Clinton and thrown her influence behind winning the senator from New York a sizable portion of the Latino vote in California.

Enter the "West Side Story" narrative. Reporters and editors who are always searching for the oversimplification that will sell papers have fallen back on a script right out of the musical that was made into a hit movie in 1961.

Surely you remember the Romeo and Juliet saga of "impossible" love between the all-American clean-cut Anglo boy Tony (played by Richard Beymer) and the beautiful Puerto Rican señorita (played by Natalie Wood). Born Natalia Nikolaevna Zakharenko, the actress who played the Latina was not exactly of Hispanic origin; then again, in those days señorita, mispronounced, was about all the Spanish most American non-Hispanics knew.

Fast-forward to the presidential election of 2008 and you have an African-American candidate and a Latina labor leader backing the woman candidate running against him. What do reporters see? Rumble!

You can almost see the Obama campaign singing the Jets' lines -- the Anglo gang's side of Stephen Sondheim's "Tonight" Quintet:
The Puerto Ricans grumble: "Fair fight."
But if they start a rumble,
We'll rumble 'em right.
And the Clintonistas, with Huerta in the lead, belting out the war cry of the Puerto Rican gang, the Sharks:
We're gonna rock it tonight!
They're gonna get it tonight,
They began it.
We'll stop 'em once and for all.
The Sharks are gonna have their day,
We're gonna rock it tonight.
Tonight!
But, oh, did I forget that most Hispanics in California are of Mexican, not Puerto Rican, origin? You think it matters to the major Anglo press?

And, no, stop salivating, Mexicans and Puerto Ricans do not hate each other. If anything, Mexicans remember the 1848 theft of half their country by the Anglos, much as Puerto Rican children to this day are told that El Drako (the Anglo pirate Francis Drake) will come get them if they do not go to bed.

Hispanics vs. Blacks, Mexicans vs. Puerto Ricans are all part of the Anglo wet-dream, one in which the minorities keep each other down by fighting one another and the WASPs, who numerically are no longer the majority, get to divide, conquer and rule, laughing all the way to the bank.

This election isn't about ethnicity -- "race" is an unscientific term with no basis in fact -- or about sex -- "gender" is a grammatical, not biological term. It's not a choice between a Black man and an Anglo white woman.

Rather, there is a choice between two very solid Democratic candidates with positions and views that will likely, and at last, turn the ship of state away from the iceberg toward which George W. Bush is blithely steaming.

Frankly, I don't see how Hispanics could lose with either one. In fact, the election is not the endgame for Hispanics.

The late Willie Velázquez, founder of the Southwest Voter Registration and Education Project, whose biting oratory reminded me of a mixture of comedian Lenny Bruce and community organizer Saul Alinsky, once put it very succinctly almost three decades ago at an event I attended.

"Do you want to know why Hispanos don't vote?" he asked an audience in Albuquerque, N.M., back in 1982. "Because nothing happens, that's why. The national organizers come by every four years to pick the ripe, fresh Mexican vote. And the streets of the barrio stay just as dusty and the schools just as bad."

That's the real key to the Hispanic vote. Not who you are, but whether you'll respect me the morning after -- by putting in place solid programs and policies that benefit my community.

Today, when I vote in the primary in my area, that's what I, a Hispanic, will keep in mind.

Saturday, February 09, 2008

Meme 123

Still uncertain as to what exactly a meme is, I have been tagged by Alex at Abandon All Fear, a British Christian I often annoy with my unbelief.


The game is to
  • Pick up the nearest book (of at least 123 pages).
  • Open the book to page 123.
  • Find the fifth sentence.
  • Post the next three sentences.
  • Tag five people.
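
For the procedurally inclined, the rules above are mechanical enough to script. Here's a minimal sketch in Python, assuming a plain-text book with form-feed page breaks and a naive notion of what counts as a sentence (both assumptions mine, not the meme's):

    import re

    def meme_123(text):
        """Apply the meme to a plain-text book: go to page 123, skip the
        first five sentences, return the next three."""
        pages = text.split("\f")             # assumes form-feed page breaks
        if len(pages) < 123:
            raise ValueError("the book must have at least 123 pages")
        sentences = re.split(r"(?<=[.!?])\s+", pages[122].strip())
        return sentences[5:8]                # sentences six through eight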

Never do this in your office, even if it is after hours. The book at hand happened to be a multidisciplinary collection of papers titled Women Immigrants in the United States. Page 123 happened to fall in the paper titled "Detention of Women Asylum Seekers in the United States" by Marleine Bastien, founder and executive director of Haitian Women of Miami, and the recipient of a 2000 human rights award from Amnesty International.

Starting on the fifth sentence on p. 123, Bastien writes:
Women detainees at TGK and other facilities around the United States lack access to basic recreational facilities. The outdoor recreation at TGK consists of a small concrete wall space exposed to the elements. The women supposedly have access to it from 8 a.m. to 7 p.m. but actually do not because of frequent lockdowns and other unexplained emergencies.

Now I have to tag five people, presumably fellow bloggers:

Genevieve
Jen
Savia
Schmutzie
The Palinode

See, I can play blogger games, too!

Thursday, February 07, 2008

No-Cojones Congress

House Speaker Nancy Pelosi (P-Calif.) perpetuates the myth that women are terrible at math: she can't count noses in the House of Representatives, where Democrats have a comfortable -- let's spell it -- m-a-j-o-r-i-t-y. That's why she handed Sen. Harry Reid (P-Nev.) a "stimulus" bill that smelled like three-day-old fish left in the sun because it does nothing for the unemployed, who could do the most for the economy.

The "P" in the identification tags is not a mistake: it stands for Pseudo-Democrats. As in showing every sign of being outside what one presidential candidate called "the Democratic wing of the Democratic Party." That's the wing that puts workers and general well-being first.

Pelosi blew it by inexplicably failing to stop the House Republicans, who are in the minority (Nancy, check the House roll, will ya?), from constructing a monstrosity of a tax-rebate and business tax-cut bill. Reid failed by caving in and accepting Pelosi's stupid done deal with only a token gesture for the elderly and veterans.

Bad policy and terrible politics. This legislative work lacks what in very colloquial Spanish is called cojones (balls).

In the face of a recession, the most stimulative disbursements would be in the form of money going to people with the least disposable income -- that is, people for whom a dollar in hand is a means to fulfilling an immediate need by spending the dough. These are the folks most likely to generate consumer demand and boost the economy.

Giving money to wealthier people risks having the funds go into savings or investments with little or no demand effect at all.

In an early appraisal of the principles for an economic stimulus, prepared before Pelosi even had a bill to deal with, two major economic analysts were cited as specifically locating the highest stimulative effect in expenditures such as unemployment compensation and food stamps.

Economist Mark Zandi is quoted as demonstrating that for every dollar spent in the form of unemployment checks, the economy receives an economic consumer demand boost of $1.73. In contrast, the Republicans' much-vaunted increase in tax breaks for small business investment would yield only 25¢.
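
If those figures are right, the arithmetic is not subtle. A toy comparison in Python -- the multipliers are the ones attributed to Zandi; the $100 billion outlay is a number I made up purely for illustration:

    # Demand boost per dollar, per the figures cited above.
    MULTIPLIERS = {
        "unemployment checks": 1.73,
        "small-business tax breaks": 0.25,
    }

    outlay = 100e9  # a hypothetical $100 billion stimulus

    for policy, m in MULTIPLIERS.items():
        print(f"{policy}: ${outlay * m / 1e9:.0f} billion in consumer demand")
    # unemployment checks: $173 billion in consumer demand
    # small-business tax breaks: $25 billion in consumer demand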

The Republicans, along with Pelosi and Reid, chose a 25¢ stimulus over a $1.73 boost. Oh, yes, and a lot of pandering in an election year to folks to whom $300 won't mean very much. Certainly not more than it did in early 2001. Remember the Bush rebate (and the recession that followed it)?

One might accept that Reid, who was blocked by a single vote in a very tightly divided Senate, couldn't do otherwise. But what's Pelosi's excuse? Democrats have a comfortable, if not veto-proof, majority in the House. Doesn't Pelosi know that when you have a majority you can get things you want done?

What do I hear? Bush would have vetoed help for the neediest citizens that would help all of us the most? I would have replied "Make our day, Georgie! Let's make sure every American hears that the leader of the Republican Party would rather give money to the richest taxpayers than fight recession by aiding those who will spend the cash assistance."

Now Bush can say that the Dems are in the pockets of the rich just about as much as the Repubs.

And the unemployed? The poor? They don't count to either one.

Monday, February 04, 2008

On Compromising

By the time you get to middle age the life you have is very different from the life you planned -- unless you're the odd geek who started Microsoft or the poverty-inspired boy from a town called Hope who wanted to be in JFK's shoes one day. Is the answer to dream doable dreams? To work harder? To accept fate?

These questions will one day dog you, too, younger readers. Trust me on this. My favorite description of life is "life is what happens when you had other plans." (Anyone know this phrase's author?)

The answer depends in part on your philosophical system. The ancient Greeks subscribed to the invincibility of Fate.

On the other hand, core Judaism, Christianity and Shi'ite Islam all teach that we have free will, that the deity may well know the future, or rather be outside time, but that nothing is preordained. That is, unless you are a Calvinist, one of a small band within the Lubavitcher school of Hasidism or a Sunni Muslim.

By the time we become adults, most of us subscribe to some middle road. We have some power to alter the course of our lives, we think, but there are limits.

Some limits are givens: we are born rich or poor, male or female, a perceived member of the majority in our society or of a minority; our genes, science tells us, carry many predispositions. What little science I know and what experience I have tend to tell me that my individuality amounts to little more than a certain mix of chemicals that one day we will know how to completely control and manage.

Still I persist in thinking that by sheer willpower I can achieve a few things. Years ago, when I first learned the game, I spent weeks losing at backgammon consistently until I went to the library, borrowed a book about the game and evened my odds.

Why can't I do the same when it comes to becoming president of the United States, winning the Nobel Peace Prize or enticing Penélope Cruz to my lair? Where's the how-to book for dreams?

Even if I know that I will never be president nor be invited to the prize ceremony by the Swedish Academy nor spend a night with Penélope ... what would I really feel if I embarked on a campaign to achieve any of these things and actually succeeded?

Does John Updike wake up every morning thinking "Gee, how wonderful, I'm John Updike"? Or does he get depressed from time to time that he is not, say, Gustave Flaubert or Albert Schweitzer or Neil Armstrong?

I'm probably not the first to muse on why we conceive of dreams. They're archetypally human. Heaven and salvation, wealth and power and sexual satisfaction, the admiration of others and the feeling of conquest over oneself -- these are some of the things to which many of us aspire.

The story was once told to me of a saint who, upon applying to enter a monastery, was asked what job he would like. He said "abbot." He was placed as porter and later laughed that if he had said anything less he wouldn't have been admitted at all.

The paradox seems to be that attaining goals by sheer effort is illusory or happenstance and probably impossible. It's not true that the poor are lazy; most work more than the rich and at harder, more grueling jobs. Yet we would not be human if we didn't aspire to a reality beyond our present one.

So what do we do once we know that the big dreams won't come true? Three things.
  • We realize how unrealistic it was to believe that by our own single-minded, individual efforts we could succeed. Most success involves help from others and sheer luck. (Balzac put it another way: all wealth comes from a crime, he said.)
  • We gratefully accept the wisdom that falling short imparts.
  • We adjust our dreams to things that still stretch us but are no longer obviously unattainable.
I have more or less attained the presidency in my little world. I am still continually looking for the opportunity to build my own little Lambaréné, realizing that the real prize comes not from the Swedish Academy, but from the smiles of those you manage, by chance, to influence for the better.

Finally, I'm not sure that Penélope and I would actually get along or have much of a passionate night, but I'm daring to hope that, as Daniel Berrigan once wrote, there is "love, love in the end."

Tuesday, January 29, 2008

Death as a Way of Life

Having anticipated spending the weekend engaged in tea-leaf reading with harbingers of Mr. Death, I was surprised to discover instead living as a way of dying, in a way that applies to all of us. We treat death as Benjamin Franklin's joke, something unpleasant and unmentionable, rather than as the useful nudge to live, just as taxes are a necessary means to share.

Perhaps it helps that I am a mere four years away from the age at which my father died, although I am a comfortable 35 years from my father's father's age of demise.

My father died "young," or so his contemporaries said. I was just about to become a father for the first time and my middle-aged father did not seem young at all. Now he seems to have been too young to go; at nearly his age of death, I still have some living in me. I think.

My grandfather, on the other hand, voluntarily decided not to undergo a third operation that might -- or might not -- have extended his life an uncertain span of time. He knew it was his time to die and given his advanced age nearly everyone, save for those of us who loved him and still miss him, would agree.

Most people I have known who were aware of death's impending arrival at old age were ready, almost anxious for it to come, to be done with physical decline and pain, to end resistance to nature's course. This past weekend, however, I came across what seemed to me an entirely new, Zen-like approach.

A sick person within range of a reasonable age for a man to die -- no matter how unreasoning death will always feel to those who have loved him -- had given his family, and perhaps himself, a few scares. The fear and shock were perhaps enough that he seemed to embrace his fate -- one that's not imminent, yet feels closer than the demise of his youngest child -- with a joy and matter-of-fact calm that seemed to imbue his household with a way of living that is very much in the moment.

Because I am closer in age to the person who is ill than I am to my grandfather's age of death, the picture I took in seemed a reality not to be ignored: this is more or less how I will be when I get closer to my turn.

Then I was struck by how the healthier living, those whose dying seemed likely to stretch out for decades beyond even my time, were living day to day, even with an awareness of Mr. Death they had not had before.

There were tears and laughter and worry, of course, but fundamentally, as a grounding of all that was going on there was an air of letting go, of living to the fullest in tiny ways, of a normalcy that might have seemed unnatural were it not so wise.

Why not? We are all dying. If only we were more forcefully aware of it!

I could walk out and get hit by the proverbial truck. I might have a deadly disease incubating in me as I write. My body might just tire out inexplicably one night.

Am I ready for that? Have I let go of my resentments and angers and worries and fears, my navel gazing and self-pity, to replace them with a serious but not humorless sense of purpose and focus on the things that, to the best of my knowledge, are important?

It seemed, and perhaps I idealize, that the household of the man I went to see was trying to let go and live. Or rather, to take on dying as a way of life.

If I were on my deathbed -- or my death computer chair -- that is how I would like life around me to be. Indeed, I am in my death computer chair and I feel a greater urgency to focus on what is important.

Excuse me, then, I have to go do some work.

Thursday, January 24, 2008

Children or Dogs?

Perhaps it is the bruising cold that sharpens the critical faculty, but I see around me a depressing lack of discriminating judgment in distinctions that aren't so fine or difficult to make. Let me offer two instances.

Case #1 -- Surgery for Pets

It seems the past few weeks have been the time for pets to get expensive surgery that society does not see fit to grant to the 40 million Americans (many of them children) who simply cannot get any kind of preventive health care because they are uninsured.

One person is spending $1,400 on a cat's operation. Fellow-blogger Julie has had a dog diagnosed with cancer undergo surgery. Ever heard of putting an animal out of its misery with a shotgun? (Truth in advertising: I have never even touched a shotgun. But you get the idea.)

When I raised the question of a hierarchy of values -- among them, people before pets -- in a comment in Julie's blog, mommyblogger Dharmamama weighed in with an out-of-context biblical quote to propose that no one is facing a choice between pets and children. (This amid an ocean of there-theres and poor-yous.)

Julie, for her part, threatened to censor me. Never mind that child homelessness has not quite been eradicated within driving distance of her cancer-operated dog. To Julie's credit, the next day she aptly called the dog-cancer post a "pity party."

We all feel our hangnails are worse than a famine in India. But they're not, in fact, in truth and in reality.

Case #2 "He Crossed the Line"

Heard from a blonde, white-capped pedestrian commuter on her cell phone: "Brian, he f*cking crossed the line."

A man other than the patient Brian, whom she "f*cking" did not know well at all, had apparently invited this pretty, well-dressed but potty-mouthed cell-phone-toting young woman to a "f*cking" strip club. Then, some prodigious (and presumably expensive) amount of "f*cking" drinking had taken place. All ending up at his or her "f*cking" place in the middle of the "f*cking" night, where alcoholic intoxication lowered inhibitions to the point that clothes were discarded amid "f*cking" amorous activities (which, one imagines, were headed toward f*cking). Finally, some "f*cking" Maginot Line was crossed.

And all downtown, or at least everyone within the radius of a city block, heard about it.

The cognitive dissonance in this conversation begins with the understanding that in 2008 everyone knows that yelling into cell phones does not improve communication, any more than loud, slow diction and adding an "o" at the end of every word translates English into Italian. Certainly, yelling out one's angst at a line "crossed" when one is crossing so many socially accepted lines concerning public comportment is self-contradictory.

As is almost everything else in this overheard conversation. What delicate sensibility belongs to a young woman who has to f*cking cuss every other word? Where's the common sense in going with a little-known man anywhere, let alone a strip club and a private residence where intimate behavior may ensue?

If one can be held legally liable for driving drunk, can't one be held at least morally responsible for drinking to the point that one disregards the normal inhibitions about placing oneself in a situation of nudity with a stranger?

None of this suggests that the male stranger was therefore authorized to treat the unnamed bodily territory in question the way Germany twice treated Belgium in the 20th century. However, it does suggest that the frontier crossing was a folie à deux, as in the number of people it takes to tango.

So, what's more important: children or dogs, morning-after rescuing of self-respect or circumspect civility the evening before leading to a better morning after? Some people seem not to know the difference.

Sunday, January 20, 2008

Unblocking the Writer

If you've noticed, I have been a bit blog blocked. Everything I considered writing about seemed trite, or said, or a cliché. So now I'm taking a new tack in hopes that the blogging juices will once again flow freely.

Beginning on Monday, Jan. 21, and through the rest of the year, I will do my version of the x365 blog meme. This is a project started by a blogger to mark his 40th year by remembering 365 people who left an impression, one day per person.

My fellow blogger Schmutzie has been doing this with startling results. She posts 50 words every day. She has joined x365. Being a less than compulsive individualist you would not want to have on your team, I'm making up my own rules for my own people project.

I will post one 30-word note on a real person I have met, from Monday through Friday each week, for 250 days, which I calculate will take me to the end of this year. Moreover, I will attempt to recall people in order of appearance in my life. (Got my numbers wrong, think I met you before I did? Sue me.)
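
For the skeptics, the calendar math roughly holds. Here's a quick sanity check in Python; the date range is my own reading of the plan, starting Monday, Jan. 21, and running through Dec. 31:

    from datetime import date, timedelta

    day, weekdays = date(2008, 1, 21), 0
    while day <= date(2008, 12, 31):
        if day.weekday() < 5:   # Monday = 0 ... Friday = 4
            weekdays += 1
        day += timedelta(days=1)

    print(weekdays)  # 248 -- close enough to 250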

I will, however, make every attempt to keep appropriately private the actual identities of those about whom I write.

Saturday, January 19, 2008

To Want, To Need, Perchance To Love

With only 14 shopping days until Christmas, a correspondent inquired as to the difference between needing, wanting and loving anything from a PC game to a trip to the Bahamas to true love and to a peaceful world. The season of shopping and greed ... um ... peace and love ... is over, but not the question.

As I see it, we need very little. Water, air, food, shelter from the elements and clothing. If we do not wish to survive, we do not even need these.

My correspondent, who is French, of course, says we need sex. I'd question that. I'd agree to the stipulation that we probably need some form of affection in our lives.

Mais oui, we often want sex and want sex often. But need? What will happen without sex? We'll be a little irritable? We'll squirm? We'll soil our bedsheets? That's about all I can think will happen. Not exactly the Four Horsemen of the Apocalypse.

We want everything under the sun, but especially what we see others enjoying (in commercials). We want convenience and well-being and ease, but we also want the things that will make us feel so much more powerful, handsomer, desirable. Hence the market for sports cars.

Want is our problem. We desire much we do not need for our survival or even our well-being, whereas necessity, true need, is the mother of invention. The less we need, the more we merely want, the less creative and more consuming we become.

Is it absolutely necessary to leave so many office buildings lit up at night, sucking in energy for no one to enjoy? Of course not.

Do we need purified water in bottles? Or purifiers? Do we need meat every day, three full square meals, ample desserts? Do we need a closet with umpteen pairs of shoes (OK, women do), suits, shirts, jackets and coats? Do we need a home with several regularly unused bathrooms, a yard, a two-car garage?

Of course not. Yet that's the normal North American dream.

I spent the bulk of my adulthood in a two-bedroom apartment that was at maximum legal occupancy (two adults, two children), without television or a car. I may have taken the odd vacation here and there, but I spent many of them on my balcony, reading detective novels in long summer days.

I was the "poorest" in my leafy neighborhood of million-dollar homes of Washington wonks and journalists. In the global village, however, I was undoubtedly a potentate, what with running water and electricity (not to mention a computer). About four-fifths of humanity do not have any version of these "necessities."

At the risk of sounding self-satisfied (I now have a TV, a car and an under-occupied apartment), the way of life into which I once stumbled was modest enough that the world -- and I mean every citizen in the globe -- could conceivably aspire to live as I did without a huge drain on resources. A (much needed) redistribution would have done the trick.

Sell one CEO's Gulfstream Jet (about $57 million) and you could get four-bedroom apartments for several African villages. Hell, several U.S. towns.

But -- aha! -- who's going to be the first to step forward? How do we let go of our wants and focus more precisely on our real needs?

For that we need love. The love we all want, the love we all want to give and are sometimes too scared to part with, the love others need and deserve.

Monday, January 07, 2008

Pornographic or Risqué?

Savia's recent post on the joys of a toy for gals and related matters has set off an e-mail controversy: is the Savia Bella blog pornographic or merely risqué?

I only cyberknow Savia through Schmutzie, another cyber-acquaintance. They both strike me as charming women too old to be my daughters, but too young to date, who are articulate about some poignant experiences -- and occasionally a little edgy, saucy and, yeah, not quite what you would read out loud to your great-aunt Julia.

They are articulate and funny and painfully honest and Saskatchewanian -- I've never met anyone like them in real life. For all I know, they may be one overweight, 45-year-old beer-delivery guy in Yonkers. But I doubt it.

I found Schmutzie's Milkmoney ... goodness, I don't remember how! Someone's blog roll, I'm sure. I was amazed to discover someone blogging about such serious setbacks as being diagnosed with cancer (and beating it!) with compassion-evoking lightness. This is how I would like to get cancer (knock on wood) if I had to.

Then Savia guest-posted on Milkmoney about her incestuous-but-not-quite adventures with her hunky Italian cousins. She revealed to me the female side of sexual temptation and limits in a way I had never quite encountered before, in a language franker than any woman I know uses, or has used, at least since college.

Part of the allure is hearing the in-your-face raw sexuality of the younger generation, of course. But another part is that it is literate, delicate and well short of raunchy.

I would argue that it is not pornographic. To me pornography aims to titillate, to profit, to manipulate the hormonal imagination. Savia seems merely to speak her mind (and body) in a "just us girls" tone that makes all of it very natural.

We all like sex. Want some. Know that some people are off limits. Would rather focus on just one, but are maybe less virginal than the nuns said we should be.

To my mind, Savia (occasionally) holds up this aspect of life for all to titter a little but ultimately enjoy in a good, clean sense. And besides, she writes about any number of things, such as the death of a loved one's parent or getting soaked in a London afternoon rain, in ways that are memorable and even moving.

Schmutzie, for her part, may prefer to have the first syllable of her blogging handle pronounced like "smut," but she is delightfully child-like and heart-warmingly adoring of her mate. Even when she's edgy. Sorry, Schmuts.

Tuesday, January 01, 2008

USA Number One?

Few recent political events have stayed with me as totalitarian emblems as the sight of a young Republican throng chanting "USA! USA! USA!" Now comes a foreigner questioning how the United States could possibly be no. 1 given an allegedly inferior educational system, which prompts me to ask how the United States got here and what it means for the future of the world.

Of course, it's almost un-American to be as "patriotic" as the young people at the last Republican convention. The USA is historically and essentially a nation of oddballs ornery enough to be embarrassed by orchestrated cheering. True American patriotism has always been best represented by dissenters.

The notion of American empire, at last openly acknowledged by those in power, is also at odds with democracy. All empires have been autocratic and the imperial behavior of Americans abroad is often grossly at odds with the national democratic vocation: our diplomats and soldiers have repeatedly shown they want to force others to adopt what we think is best for them, like it or not.

Part of the reason for this is the mistaken belief that the ascendancy of the United States is an inevitable result of a superior culture or form of government, when in fact it is a major historical accident. Had the European powers -- in what Churchill aptly described as a thirty years' total war with a long truce -- avoided reducing each other to sheer rubble by 1945, the United States would have remained the ungainly, greedy older child of the British Empire and no more.

U.S. hegemony is merely the result of a large, untouched industrial base filling a global vacuum half a century ago. I have already pointed out that American military prowess was of as dubious value in the 20th century as it is in this one (see here).

The real source of U.S. power has always been primarily economic.

This has involved huge foreign inputs, in terms of labor, investment and creativity -- rather than the much ballyhooed "know how." We tend to forget, for example, that without a Scottish inventor, immigrants from Ireland and China, and hefty British investment in the 19th century, there would never have been a continental U.S. railroad network, the dominant interstate form of transportation until Eisenhower's highway program. The same could be said about any number of major U.S. economic projects.

Moreover -- and this foreigners often miss -- U.S. economic strength lies primarily in its dynamic and large internal market, rather than external trade. This is how the United States remains much more powerful economically than China, which is several times larger in many senses.

Indeed, this is why, should the United States decline, as is historically inevitable, I think China is unlikely to fill the gap -- the People's Republic is a vast underdeveloped heartland that faces the world with the mask of its glittering coastal regions.

What the U.S. ascendancy has meant for the world, and what can still be its enduring legacy, is the leveling effect of a relatively transparent economy and a stable but adversarial political system.

In sum, the United States is not no. 1 in brains, brawn or brass. The U.S. originality is an economic and political constitution for "men who disagree," as Oliver Wendell Holmes put it, one that is potentially open to improvement.

Sunday, December 30, 2007

Geography of Education and Truth

Mention an obscure painter or poet to a continental European and you'll get an elegant summation of the artist's work, the movement that inspired it and perhaps a word on its relevance to the world today. An American will frankly admit not knowing about the artist and probably ask a question; a Brit will offer a clever joke that changes the topic.

The differences in response do not necessarily mean the Continentals are more learned. They merely mean the Continentals have been taught differently about the truth.

There may be many more systems of education in the world, but the two educational approaches to which I and people I know have been exposed might be labeled Anglo-American and Continental. They have fundamental epistemological differences, especially in those fields that are not empirical.

The Continentals, according to observations of mine and others, study humanities as a collection of facts subject to approved, taught interpretations. This novel is about X and its symbolism means Y; remember that for the exam and spew it back exactly as taught or fail the class.

In the past two weeks, for example, a French correspondent provided an unwitting example of this. A secondary school teacher, exposing his students to the idea of colonization, offered them several quotes on the subject, then asked:
1. Compare the arguments put forward in 1885 by Jules Ferry, a prime minister favoring French colonization of Indochina, and Georges Clemenceau, a member of the opposition.

2. How was European leadership being called into question at the beginning of the twentieth century?
So I asked what the students had answered, and my correspondent replied, "I suppose what [the teacher] taught them." Such a response conveys the assumption that the teacher's role is to provide not merely facts, but also the "correct" interpretation of the events in question.

If the teacher is in a progressive secular school, I would expect the answers to lean toward describing Ferry as a retrograde racist, Clemenceau as a visionary and European "leadership" (quaint description of genocide, ecological rape and theft, but never mind) as a thing of the past well worth burying.

In a religious and conservative environment, on the other hand, one might lament the loss of the "wise" European stewardship of the world and note that Ferry might have had a point about the tutelage needed by the Third World.

In either case, education sets up the student as a parroter of the correct line of interpretation. The European who seems to opine about an obscure poet is likely repeating something learned in secondary school. By rote.

This is the system that Napoleon spread throughout continental Europe, alongside his famous legal code.

In the United States, Canada and Britain (and in British schools abroad), after 1945 at least, I would venture to say that in a similar situation the students would be pointed to sources (as the French teacher did), then left to their own devices as to interpretation.

Because the Anglo-American student is not encouraged to imbibe opinions, but rather to consider and search for information, typically Anglo-American school systems cover less material than their Continental equivalents. Thus, it is more likely that Americans, Canadians and Brits may come across as "ignorant" and not know the obscure artist mentioned at the outset -- but if an opinion is ventured, it is more than likely that of the speaker, not of the speaker's high school teacher.

Anglo-American educators not only worship at the altar of open-ended inquiry, but also engage in a full-fledged debate concerning the canonical information to which students should be exposed. For example, there's the library of dead white men as opposed to multicultural readings that include women, people of color and sources that were not conventional 50 years ago. Textbook versus textbook-less.

Is one system better? Not necessarily.

The Anglo-American student typically has a narrower frame of reference tending toward specialization, depth and creativity. The European peer has the advantage of a broader base of basic information, yet also tendencies toward more conventional thinking, surface knowledge and generalization. The eclectic and the specialist can complement each other.

Socially, however, they speak of societies with different vocations and temptations.

European indoctrination aspires to develop renaissance men and women, yet it carries the temptation toward the totalitarian conformity of Fascism and Stalinism. Girded in a philosophical absolutism traceable back to medieval, Catholic Europe, its insistence on one truth and one truth only may spur the desire to uncover her. Once found, European Truth, like Reason during the French Revolution, risks becoming a worshipped statue.

Anglo-American inquiry hopes to develop free democratic citizens who insist on their own truths, yet it can yield unreasoning, over-confident zealotry. Undergirded in the Reformation epistemology according to which each Bible reader was to be seen as the sole legitimate interpreter of truth, under the influence of the more modern offshoots of rationalism and empiricism it can spur the development of scientific and technological marvels. However, these may become soulless and rudderless innovation for its own sake in a vast and stormy ocean of relativism; witness the hollow chatter one so often hears on cell phones.

In the end, I am torn. I like the European palaver, but I admire the Anglo-American thirst for knowledge. I wonder what you think.

Friday, December 28, 2007

An Education Dictator?

Back in 1996, Republican presidential candidate Lamar Alexander pledged to become the "education president," a promise George W. Bush stole for his pack of lies, I mean, his campaign. Surrounded by educators this week, I find myself wondering whether, since this has proven woefully inadequate, an education dictator would do better -- and what would such a potentate do.

The problems are well-known. Only half of all high school graduates go to postsecondary education. Their incomes and general well-being are stagnating. Only half of those who go to college complete four years. Yet all the future jobs demand higher and higher order skills.

Meanwhile, high percentages of youths go straight from dropping out to jail, at enormous social and fiscal cost.

Education would seem to be the natural ticket out of poverty and stagnation for such young people, yet the schools can't manage the job. Why?

Part of it is declining standards.

The New York Regents exams, once the hallmark ticket to a coveted high school diploma, are no longer obligatory for graduation in the Empire State. Students who don't make the grade can go for a "local diploma," which community colleges accept. While the Regents require minimum scores of 65 percent to pass, the local diplomas accept 55 percent.

This is a way to pad the graduation rates, which fell precipitously during the 1980s, when a B-film actor presided over the first effort to bankrupt educational and social programs. I'm told that half the schools in New York would be closed if Regents were the only ticket to graduation, as they must graduate a certain percentage of the student body by law.

But it's not just that.

Kids who are hungry, who are brought up by guardians rather than parents in prison or imprisoned by addictions, who know no one who has a conventional job and thrives, who must toughen up before their time -- such kids are half defeated before they take their first step into a school.

Only a sustained, intensive, broad-based frontal campaign to address the entire network of social problems that are creating a permanent underclass -- and thus undoing the foundation of democracy -- can hope to succeed.

Here's where the ancient Roman notion of a dictator, someone drafted by the Senate during an emergency to literally dictate what everyone should do, seems a plausible answer. Not a tyrant, mind you, a dictator. Someone appointed as immovably as a federal judge, say, to see things through to the resolution of the problems decisively, persistently, immune to fashion and citizen fatigue.

What should such a person do?

1. Federalize education. There is no rhyme or reason to the patchwork of 16,000 school districts, which operate as if the world of the mind stopped at the county, city or district line. No other advanced nation has as balkanized a system.

2. Consolidate bureaucracies so the bulk of the funds can be directed strategically at problems and the doers in the system are left alone to do their best.

3. Connect educational systems to community and disciplined civilian work agencies and programs with modern apprenticeships and practice-based credentials.

4. Require all university students to serve for one year in literacy and educational support activities as a condition of graduation with a bachelor's degree.

5. Coordinate education with child welfare and family economic self-sufficiency programs, so that enrollment in school becomes the gateway to all necessary services to ensure the well-being of every American or immigrant from birth to 18.

Put together, the school districts, states and the small present federal contribution add up to more than $400 billion a year. These funds need only be better directed.

Don't have children? Think about whether you'd like the ambulance driver taking you to a hospital to be able to read street signs.

Education and social well-being are for all of us. Happiness spreads. When the poorest are reasonably cared for, the richest can sleep soundly.

Then, after 20 or 25 years, the dictator should be asked to resign and hand things over to elected and appointed officials, who will then have another 200 years to run amok.

Friday, December 21, 2007

A Call for Glückenfreude

We all cheer for the underdog, the person who is depressed, who lost a job, who is ill. Secretly, we also occasionally cheer when someone we dislike experiences misfortune, deservedly we believe: schadenfreude. But perhaps the opposite is required somewhat more -- and is considerably nobler.

Schadenfreude, we all know, comes from the German Schaden (harm) and Freude (joy): joy in the misfortune of another.

Face it, you think you might not feel a teensy weensy bit of it if Bill and Melinda Gates got divorced? If Osama got cancer? You weren't secretly glad when Barry Bonds got caught using steroids, Hugh Grant was arrested for getting oral sex from a prostitute in a car, banks lost money due to shady loans, when Scooter Libby was convicted?

Good. Now it's out in the open. We all feel a little schadenfreude now and then. Now let's consider the opposite.

Your pal gets a promotion or award while you're still stuck in the same old job. Your best friend falls madly in love and you can't get a first date to save your life. Your neighbors take that dream vacation you've always wanted and you haven't been to the next town in three years.

Don't these people make you mad?

For years I felt invidious irritation toward James Fallows. Although he is only three years older than I am, he was Jimmy Carter's chief speechwriter when I was an apprentice aide to the speechwriter of an international diplomat.

He glided from the White House to the Atlantic Monthly, NPR and endless books, essays and a generally placid and comfortable life with wife and, I believe, daughter. I was let go, later fired from another job and have since toiled obscurely on an economic publication that is revered in its field -- but let's face it, I'm no James Fallows.

How dare he show me up like this!

At first I comforted myself that his passage through Harvard and Oxford was a mere perquisite of being born with a silver spoon. But no! He had the effrontery to come from a working-class background and win scholarships on his own merit.

Surely he would divorce. Surely he would have children with disabilities. Get cancer. Turn out to be a plagiarizer. No, no, no.

People like James Fallows should be shot.

So imagine my shock when I discovered that other people felt similarly about me. Ten years ago I had the good fortune to manage a very leveraged buyout of the firm where I worked. I went to lunch with a dear friend, showed her my new business card with "President" on it. Her face was blank. I thought she didn't understand, so I told her.

"Oh, I have thought of starting a publication," she said. No "congratulations" or "I'm so happy for you," no matter how insincere. I chased her for another lunch over the next three months and it was clear she despised me for my good luck. At least, she was honest; she just couldn't deal with my admittedly modest success.

Since then, I have experienced moments in which I wanted to cry out for joy -- all amid the humdrum teeth-gritting reality in this vale of tears. Sons getting into prestigious universities and embarking upon challenging, make-a-Dad-proud careers.

I have gradually learned that no one is interested in my good fortune. Indeed, they'll likely get upset.

So beginning in this Winter Solstice season, I am calling for a new campaign of Glückenfreude -- joy in the happiness and good fortune of others.

Let me begin with James Fallows: I raise a toast to you, sir. I am honored to have read your marvelous prose and delighted that you have traveled well with your delightful family. If we ever meet, I admit, I will be starstruck; I will bask in your good fortune and consider such a privilege my own.

Thursday, December 20, 2007

Is U.S. news about 9/11 (self-)censored?

You might think so after you learn of the arrest two weeks ago of a French journalist who in April 2007 reported that France's secret service knew minute details of Al-Qaeda's 9/11 plan as early as January 2001 and passed them to U.S. intelligence. Add to that the fact that neither the original French story nor the reporter's arrest has appeared in a single major U.S. newspaper.

The story is very simple.

On April 16, 2007, the Paris daily Le Monde, France's top newspaper, ran a story by Guillaume Dasquié describing a sheaf of 328 pages stamped "Confidential-Defense" and "Strictly National Usage" that he obtained from a source with access to secret documents of the Direction générale de la sécurité extérieure (DGSE, the General Directorate for External Security).

These documents described Al-Qaeda's detailed discussions concerning the hijacking of planes on U.S. soil, including the selection of American Airlines and United flights -- all information available months before the attacks.

Moreover, a Jan. 5, 2001 DGSE memo on this subject was given to the Central Intelligence Agency's chief of station in Paris, Bill Murray. Not only did the French know, the CIA knew.

You can read the full story here.

Dasquié, who has also been writing controversial stories concerning French government corruption, was arrested Dec. 5 and charged with "publishing defense secrets" after refusing to name his source or sources.

“We are troubled by the criminal probe against Guillaume Dasquié and his detention for two days by French security services who pressured him to reveal his sources,” the New York-based Committee to Protect Journalists’ Executive Director Joel Simon said. “Dasquié should not be prosecuted for serving the public’s right to know.”

The Associated Press picked up this story and The Guardian of London ran it (see here). But is it anywhere in The New York Times or The Washington Post? Does it turn up in any U.S. newspaper or major media outlet in a Google search?

Nope.

The Times last mentioned Dasquié in 2002 in a book review. The Post appears never to have heard of him. Why? Has everyone in the newsrooms been so full of eggnog for the last two weeks that they couldn't be bothered?

As a journalist, I find this appalling. Frightening. We are about to have a presidential election, and significant information -- that our government knew beforehand about the signal event of the present century -- is being swept under the carpet.

Tuesday, December 11, 2007

What Elections?

It's not even the presidential election year and how many candidate debates have occurred? I've lost count. U.S. presidential elections last too long, cost too much and the results are unimpressive. There are solutions.

As regards timing, I like the 90-day campaigns under the system that prevails in much of the British Commonwealth. Granted, those are possible because elections are not fixed at four-year intervals.

Also, there are no truly national elections under the parliamentary system: you vote for the candidates in your riding or constituency and cumulatively a party acquires a majority -- or not; thus, most voters have some personal knowledge of the person they are voting for -- or against.

The cost has been amply remarked upon elsewhere as a barrier to truly popular candidates, with the inevitable end result that presidents first enter the White House already hostage to the sources of cash that put them there.

The duration of the campaign is a factor in raising the cost, but there are silent partners in this: the uninformed voter and the relative secrecy in which decision-making occurs.

Every campaign involves debate of policy questions posed in overly simplified terms for citizens who have not attended to the duty of keeping up. Nothing struck me as more symptomatic of the problem than the question posed to President George H.W. Bush at an open forum by an individual who obviously did not know the difference between the federal deficit and the national debt. (Hint: the deficit is a negative annual balance, the debt is the cumulative borrowing to cover the deficits.)
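
To make the hint concrete -- a minimal sketch with illustrative round numbers, not actual budget figures -- the relationship is just cumulative addition:

\[
\text{debt}_T = \text{debt}_0 + \sum_{t=1}^{T} \text{deficit}_t
\]

A government that starts with zero debt and runs a $200 billion deficit every year for five years ends up $1 trillion in debt, even though no single year's deficit ever exceeded $200 billion.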

The cost of the quadrennial education campaign -- or in many cases, the quadrennial play on people's ignorance and basest emotions (yes, Republicans, I mean you) -- is largely the result of poor citizenship. If we don't look after our interests, no one will.

Is it any surprise that the results are so unimpressive? Think about the notion that the electorate in 1980 chose an actor whose sole talent was the ability to read and declaim as if the words he was using were his own. Yes, of course, the current president also comes to mind among the disasters of the electoral system.

Hunter Thompson cannily remarked that up to the 1972 campaign neither major political party had put up a candidate that garnered less than 40 percent of the vote nationally.

Yet the percentage of eligible citizens who vote has been cumulatively declining from the 63 percent recorded in 1960 to the low of 49 percent in the 1996 election. The massive electoral fraud of the year 2000 changed that: in 2004, a full 56 percent of the eligible electorate actually voted.

So think about it. John F. Kennedy got a razor thin margin (49.7 percent of the popular vote), garnering in reality about 31 percent of all eligible citizens' votes. Even Lyndon Johnson, with his "landslide" 61 percent of the popular vote, really had the assent of 37.1 percent of eligible voters. The Ronald Reagan "landslide" of 1984 (58.8 percent) still only won 31.2 percent of all citizens eligible to vote.
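
For anyone who wants to check these figures, the arithmetic is just turnout multiplied by popular-vote share. A worked example with the 1960 numbers cited above:

\[
\text{share of all eligible voters} = \text{turnout} \times \text{popular-vote share} \approx 0.63 \times 0.497 \approx 0.31,
\]

which matches the roughly 31 percent attributed to Kennedy.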

So neither qualitatively nor quantitatively can anyone argue that two years of bombast achieves worthy results.

Thus, three remedies strike me as plausible incentives for change:

1. Set presidential campaigns to take no more than 180 days, with one national primary and one national election.

2. Establish a public fund to subsidize candidacies and bar any other source of funding.

3. Make voting mandatory, with periodic citizenship tests and penalties for failing to keep up with the basic decisions that we must make as a society.

Monday, December 03, 2007

Democracy in Latin America

The champagne must be flowing in the White House over the vote in Venezuela. The vote cheered me for very different reasons: to me, it shows that while Latin America wants systemic change, people no longer believe a strongman is needed to achieve that.

For as long as I've been politically aware -- and I started young -- I have known that the key political issue in the region was, and remains, the redistribution of income and wealth from the neofeudal socioeconomic structures that have persisted for half a millennium to ... something else.

What else, and how, has been a widely debated and hotly contested question.

In the 1930s, movements such as the APRA in Peru proposed a kind of socialism with an autochthonous flavor and a vindication of the peoples of the Inca Empire. In Nicaragua at that time a peasant rebel named Augusto César Sandino, who conceivably never read Marx, prompted the intervention of U.S. Marines.

During the 1940s and '50s, individuals like Juan Domingo Perón of Argentina and Getúlio Vargas of Brazil offered a different way -- a right-wing form of anti-imperialism, labor power and redistributionism. It was an era of strongmen.

In the 1960s and '70s came Gen. Augusto Pinochet's theory of "the national security state," which he proposed in a military journal in 1965, just as the Brazilian military regime that most successfully embodied it began to take shape. By the decade's end, with the connivance of the CIA-run "traffic school," torturers wearing military boots were in power in Buenos Aires, Montevideo, Santiago and other Latin American capitals.

With the 1980s, democratization began. We are still in the democratic era, one in which almost all countries have tried wild and extreme laissez-faire policies -- in Buenos Aires the municipality went so far as to privatize parks! -- and abandoned them.

Now Néstor Kirchner -- and soon his wife -- in Argentina, Michelle Bachelet in Chile and Luiz Inácio Lula da Silva of Brazil, along with others, represent a wave of social democracy, expanding rights from the civic realm to the social and economic arenas. These are reformist, pro-union, pro-worker leaders who nonetheless recognize the need to rule from consensus and compromise.

This is what people have long wanted. Hugo Chávez of Venezuela, along with Evo Morales of Bolivia, represent the vanguard of Latin America's "new left" -- neither is too far apart from what solid majorities want. At least in their ideals.

What the Sunday vote in Venezuela showed, however, was a new maturity. Left-leaning majorities have learned that power does not grow from the barrel of a gun, as Mao and a good number of guerrilla leaders have suggested. They have also learned not to trust even the greatest of saviors, such as Chávez.

In Sunday's plebiscite Venezuelan voters rejected, by 51 percent to 49, the proposal to expand Chávez's powers and accelerate his move to socialize the economy. The slim margin suggests that the country is deeply divided and that his program has not been resoundingly defeated.

Instead, it seems clear to me that Venezuelans are quarreling mainly with the strategy. They want economic and social democracy. But without a strongman. Cuba without Fidel and one-party rule -- or perhaps merely Sweden.

To me, having watched decades of blood flowing in the streets to no good end -- over strongmen and guerrilla strongman-wannabes, over militaries and ideologies -- it is heartening to see Latin Americans choosing, indeed forcing, peaceful debate and the ballot box on their own leader. Chávez also looms greater in my esteem for accepting the verdict.

Friday, November 30, 2007

After the Boomers

Looking ahead from the end of 2007 seems a gloomy exercise, given war, recession and global warming. John Maynard Keynes had a snappy comeback for sunny prognostications of better times further into the future: "In the long run we are all dead." That thought, precisely, is my point of departure today.

When all of us Boomers are dead, sometime in 2064 or thereabouts, or indeed 20 to 40 years earlier when someone pries our cloven hooves from our work, a huge bounty will open up to those born after 1964.

In the United States, the generations that follow us are smaller, even our own children, the Echo Boomers; yet we are roughly replacing ourselves. In Europe, population can be expected to decline by as much as 25%.

Meanwhile, economic resources continue to increase or at least remain constant. Imagine the coming boon.

First of all, of course, jobs at the top will empty out just in time for Gen-Xers and Echo Boomers, as will housing units built for nuclear families. More money, ample supply, will mean lower prices for a comfortable life.

Granted, hospitals and nursing homes will become crowded -- as will cemeteries -- but only for a while. After all, in the long run we Boomers, too, will all be dead. In fact, I predict that society will either find an affordable way to support and keep us healthy -- or we will be euthanized.

Good riddance, too. Who wants millions of useless, gray, wrinkled people who cannot do anything but consume? Hell, who wants to be one?

My only regret will be not living to see how humanity will overcome our challenges, how someone will find what will seem the obvious solution to many of our problems -- yielding, of course, a new problematic paradigm. Hey, that's not my problem.

In the meantime, I suppose, I can only hope to be as productive, engaging and amusing as possible, to convince those around me to put off the day I am asked to step into the Eu-machine for the one-way trip to Neverland. But I am ready.

Happy future, next generations!

Monday, November 26, 2007

The Madding Crowd

Finding myself in church on Sunday, I realized that my problem with faith has to do with the sense that I -- along with the rest of humanity, Christians especially -- am one of the crowd spitting at Jesus. I do not believe Jesus' words to the good thief, "this day thou shalt be with me in paradise," suggesting that the drama of history will have a happy ending.

In fact, I perceive the wars in Iraq and Afghanistan, corporate fraud and the exploitation of humans by humans -- or any of the million big and small misdeeds most of us do -- as part and parcel of a picture of reality askew. Where is the evidence otherwise?

It's the conflict described by Henry Wadsworth Longfellow in his poem "Christmas Bells," written in 1863 upon hearing that his son had been wounded in battle:
And in despair I bowed my head
“There is no peace on earth,” I said,
“For hate is strong and mocks the song
Of peace on earth, good will to men.”

Then pealed the bells more loud and deep:
“God is not dead, nor doth He sleep;
The wrong shall fail, the right prevail
With peace on earth, good will to men.”

Cast an eye to the four-fifths of humanity living in benighted squalor and degradation and the conclusion is clear: God is dead and right does not prevail. The feast of Christ the King is a monarchist delusion.

Neither the deity nor the man-god Son rules or exerts sovereign power, so far as anyone can tell. I ceased believing when I realized that I was in the first ranks of the crowd whose lives mock all professions of faith.

Friday, November 23, 2007

Noose Media

Even a Hispanic-friendly editor to whom I pitch commentary on news of this nature seemed nervous about my writing a piece on the story of the Mexican-American Boston transit worker who was punished for wearing a Día de los Difuntos (Day of the Dead) costume to work on Halloween. The outfit was a black three-piece suit with a red noose around his neck -- but the noose was all people saw.

Jaime Garmendia, 27, was suspended by the Massachusetts Bay Transportation Authority for five days without pay, forced to write a letter of apology and undergo racial sensitivity training, the Boston Herald reported. A columnist in that paper even called the costume part of a "pagan ritual."

It was a knee-jerk reaction to a Hispanic custom by people who didn't know what it was about. The response did nothing to undo the wrongs against African Americans in Jena, Louisiana, or -- more to the point -- Boston itself.

Instead, race-obsessed Anglophones should be taking cultural sensitivity classes. After all, it was they who historically lynched African Americans -- not Mexicans or any other Hispanics.

Sure, prejudice exists in Latin America and in U.S. Hispanic communities, as everywhere. I don't condone it.

Yet history shows that Hispanic culture has been remarkably open to the mixing of peoples. In Latin America today there are millions of people of African, Asian, American native and European background ... all at once. Among Hispanics there never was anything so filled with racial contempt as a legally enforced separate drinking fountain, or restroom, or bus seat.

Besides, Halloween comes from England's "All Hallows' Eve," the festivities on the eve of the Christian holiday of All Saints, Nov. 1. The following day is the equally ancient, and inextricably linked, Christian feast of All Souls, the day on which the "faithful departed" are traditionally recalled. Nothing "pagan" or voodoo about that.

That's what the Mexican Day of the Dead festivities are all about. In small towns people dress up as skeletons and an informal parade takes place, led by a person in a "living corpse" costume -- presumably Garmendia's model. People throw oranges and other goodies at the "corpse," who gets to keep the loot, just like trick-or-treaters.

So, in fact, Garmendia's costume was actually a very canny cultural translation for Halloween. It was only his employer and the local press who displayed their cultural tin ears. Day of the Dead costumes, far from being about hate, are about love of life and love of those we recall fondly even after their death.

If anyone should apologize it's the MBTA -- and the noose media.

Friday, November 16, 2007

Never Intervene Again

It's astounding to learn that U.S. military commanders in Iraq are wringing their hands over what they describe as the "intransigence" of the Shiite-dominated government. What did they expect? Moderate European-style liberal democrats crawling out of the rubble they made?

The real pattern of U.S. intervention since 1945 has been this: whenever the United States has meddled in another country's politics, that country's ideological spectrum has polarized into two irreconcilable extremes, and the centrist, compromising, moderate middle has fallen out.

Chile was a model democracy in the 1960s until the CIA, through the program of the Jesuit Roger Vekemans, decided to intervene, destabilizing the centrist, moderate Christian Democratic Party and ushering in, first, in 1970, the socialist Salvador Allende, who was never quite the Marxist-Leninist his successors painted him as, and then, in 1973, the draconian right-wing regime of Gen. Augusto Pinochet.

In 1970 Cambodia was a neutralist, peaceable country run by an ancient monarchic dynasty, until the USA decided that it was time to plug a supply line of the Viet Cong and bring the Vietnam War into its neighbor's territory. Prince Norodom Sihanouk was overthrown by a CIA-led military coup, in turn overthrown by the Khmer Rouge, who killed an estimated 2.5 million of their own people.

Much the same had happened with South Vietnam, which was run by a neutralist, Ngo Dinh Diem, overthrown in 1963 by the CIA simply because he was perceived as not being rightist enough, although he represented a moderate, Catholic elite that was Western-oriented. We all know how successful that turned out to be.

What did they expect in Iraq when they removed Saddam Hussein? After all, he was the only figure who -- through admittedly utter ruthlessness -- held together the three major segments of the Mesopotamian territory dubbed Iraq by the British in the 1920s.

Of course, the Shiites are intransigent. Of course, the Sunnis would love to slit their throats. Of course, the Kurds would like independence. Of course, the middle-class, secularist professionals have all fled, by the millions, to Jordan and elsewhere.

What did anyone expect?

Until the United States learns to be more subtle, more agreeable to compromise and more respectful of other nations, there's not a snowball's chance in hell that any U.S. intervention, however well meant (and this one was not), will succeed at really contributing peace and stability to any other region of the world.

Perhaps we ought to make a national pledge: never again intervene. Never.

Monday, November 12, 2007

On Contributing to Poverty

"How did the United States contribute to the poverty in Latin America?" asks commenter and fellow-blogger Jen. The drum roll of military interventions and roster of investment companies and list of rebels killed springs to mind, but that is not her question. She asks something well worth pondering that doesn't often get addressed: how have we, collectively and individually contributed to poverty outside our immediate context?

Indeed, how does anyone contribute to poverty? How have we contributed to poverty around us? The short answer is that most of us who do not hold the major economic and political levers in our hands do so primarily by omission, inaction and neglect.

Things Undone

In the 1928 Book of Common Prayer there was a general confession recited in Morning Prayer that said, in part:
We have left undone those things which we ought to have done;
And we have done those things which we ought not to have done;
And there is no health in us.
The idea is that we all know that we are born into a human society that is morally askew, whatever the reason and however it came to be.

In this sense, while it is true the U.S. did not systematically create poverty in Latin America (or elsewhere), it's a fair question to ask what our country, we collectively, have left "undone" that might have alleviated or diminished poverty.

As someone with one foot culturally in Latin America and one here in the USA, I have long struggled to understand how it was that, say, the United States, Perú, Argentina and Haiti started out at more or less the same starting line about 200 years ago, yet reached vastly different levels of socioeconomic and technological advancement and well-being.

Travel these countries' histories and you'll find a distant European exploration and colonization, with all the attendant tragedies of the meeting of newcomers and inhabitants, the importation of African slaves, the establishment of miniature European political and social structures, an often bloody war of independence, followed by conflicts in nation-building throughout the 19th century.

Compare the USA, Perú, Argentina and Haiti in 1861, when one of my grandfathers was born, and there really wasn't such a huge difference. Sure, de Tocqueville had predicted in the 1830s that the United States and Russia would be the major powers of the 20th century, but that was based merely on their land mass and continental expansion.

From Baring Brothers to United Fruit

In 1861, all were agricultural countries in which land tenure had become largely hereditary and oligarchic. Although slavery had been abolished in all but the United States, the agricultural labor regime in all four countries had in common elements of medieval serfdom.

In 1907 it was not yet a sure bet that, of the four, the United States would become the richest. Even though U.S. industrial development far outstripped that of the other three countries, it was early enough in industrialization to allow a quick sprint by Perú or Argentina -- although probably not tiny Haiti -- to an equal spot. Certainly, Argentina had the resources.

One missing piece in this history is neocolonialism, the system by which one country controls another through economic, rather than political or military, means. Early in the 19th century, George Canning, the British foreign secretary, wrote that South America, freed from political bondage to Spain, would be "in our thrall" provided Britain managed its business with the new republics well.

Without firing a shot, British railroads and banks positioned their nation in a controlling role in many South American countries. In Central America, the British model began to be imitated by U.S. companies such as the infamous United Fruit Company (since 1984, Chiquita Brands International Inc.), which arranged the election and deposition of countless governments, along with multiple U.S. military interventions.

Still, some ask, how come foreign investment in the United States didn't wreak the havoc that it did in Latin America? The short answer is that, first of all, it did: the hated railroad men who spawned countless popular outlaws in the U.S. West worked for British and European investors. (Just wait until foreigners start dumping their U.S.-denominated investments -- coming soon to a financial market near you -- and see how you like foreign investors.)

Indeed, my grandfather participated in an 1890 popular uprising in Argentina to stop the government from paying what were deemed exorbitant interest charges to Baring Brothers & Co. (later Barings Bank), which then went into its first near-collapse, causing a continent-wide financial panic in Europe. My father burned Union Jacks in the 1930s. (Of course, I then did them both the dishonor of being born in the United States, heir to perfidious Albion.)

The U.S. pre-eminence in the Western Hemisphere does not date back to 1823, when President James Monroe first claimed that "as a principle in which the rights and interests of the United States are involved, that the American continents, by the free and independent condition which they have assumed and maintain, are henceforth not to be considered as subjects for future colonization by any European powers." The United States lacked the power to enforce the position -- and did not try in the most egregious and obvious example, Canada.

Bully

The real change was brought about by the Spanish-American war and the "hero" of San Juan Hill, Theodore Roosevelt, who in 1904 added to Monroe's position the view that
If a nation shows that it knows how to act with reasonable efficiency and decency in social and political matters, if it keeps order and pays its obligations, it need fear no interference from the United States. Chronic wrongdoing, or an impotence which results in a general loosening of the ties of civilized society, may in America, as elsewhere, ultimately require intervention by some civilized nation, and in the Western Hemisphere the adherence of the United States to the Monroe Doctrine may force the United States, however reluctantly, in flagrant cases of such wrongdoing or impotence, to the exercise of an international police power.
From this declaration, with loopholes and vagaries large enough to run a truck through them, sprang the bulk of the 150 U.S. military interventions in Latin America -- from the 19th-century bombardment of Nicaragua by the U.S. Navy, for the effrontery of attempting to charge a fee on Cornelius Vanderbilt's yacht, to the 20th-century occupation by U.S. Marines that led to the execution of one Augusto Nicolás Calderón Sandino in 1934.

In every instance, U.S. troops, spies and influence conveyed not the alleged message of liberty and freedom for all, but the message of the freedom of the wealthy, of their corporate structures and of their local landowning oligarch allies to squeeze the last drop of labor from anyone as they please for as little as possible.

That's how the governments of the United States, my country, contributed to squelching every legitimate claim to human dignity in Latin America (and elsewhere), and to supporting those who would deny the essentials of living to the majority.

And it's not just history: in 2002 the Bush Administration attempted to overthrow President Hugo Chávez of Venezuela.

I won't claim that Chávez or Castro or the Sandinistas have the answer, or even an answer I would advocate. I know many Latin Americans feel the same way.

In fact, every time the United States has intervened, the political space for reasonable and balanced compromises has shrunk, in favor of the extremes of the (usually U.S.-supported) right and left. I explained how and why here. Want moderate answers to come from Latin America? Let's keep the hell out of their politics.

Discerning the path to socioeconomic fairness and prosperity in Latin America is not something to be handled in the boardrooms of Wall Street or the situation rooms of the White House or the Pentagon. It's something that, left to their own devices, Latin Americans are perfectly capable of figuring out on their own.

Saturday, November 10, 2007

United States of Brazil

The title of this essay was, in fact, Brazil's original legal name -- Estados Unidos do Brasil -- just as Mexico is legally the United Mexican States (Estados Unidos Mexicanos). What I mean is not to evoke these countries but to suggest the general drift of the historical and socioeconomic current propelling the nation we know as the United States of America. We are slouching toward Brazil, or worse, Bolivia.

These are double-edged, complicated ideas for me. I have visited relatives in Brazil many times and have counted among my personal acquaintances and friends a number of Bolivians, including one president. To me, these countries are not distant, abstract instances of Latin American stupidity or laziness or [throw in your pejorative here].

Rather, they are expressions of what Uruguayan writer Eduardo Galeano called "el continente de despojo" (the continent of dispossession) in his famous work Open Veins of Latin America, which recounts the sad, sad tale of my parents' ancestral society in a continental context.

"Latin America," like "Hispanic," is an abstraction as seen from outside the reality. There is a common historical, linguistic, religious and to some extent ethnic heritage uniting the score of nations south of the Rio Grande. However, Latin Americans think of themselves as nationals of a country before they think of themselves as citizens of "la patria grande" (the larger homeland), Hispanic or Latin America.

Where all citizens of the region share an important commonality is in the sense of belonging in the Third World, a place where
  • telephones often malfunction (to the point of being a great excuse for not keeping in contact);
  • wages of government officials, technicians, and every kind of service worker a middle class person is likely to need, are so low that nothing gets done without greasing a palm;
  • middle-class status itself is a privilege bestowed on a few -- or, often enough, a way station on the slide down the slippery Maypole of social stratification;
  • as few as one or two percent of the population owns and controls the overwhelming majority of the land and productive resources;
  • vast majorities live in a crushing, degrading poverty that makes the average U.S. slum look luxurious;
  • clear pluralities or majorities do not have regular access to electricity or running water, three meals a day, new clothes, an actual formal building for shelter or regular employment, let alone benefits such as health care;
  • governments, elected and not, are really committees formed by and for the top of society's heap;
  • reform has historically been crushed ruthlessly (since 1945, in some countries earlier) with ample U.S. aiding and abetting; and
  • on and on and on ...
I have to stop here, before I become a crashing bore or grow so angry I cannot write any more.

My point is that the Third World conditions that exist in Latin America are, in general, way below what most Americans would deem a normal part of life. Even so, Latin America offers among the best of the conditions affecting the four-fifths of humanity to which no one who is reading this even remotely belongs.

The keen observer will have noted already that many of these conditions are no longer entirely foreign to the United States, as they largely were during the second half of the 20th century. Our country is a place where:
  • telephone service has become erratic since the breakup of Ma Bell;
  • the average, inflation-adjusted wage in 2006 was 22 percent below that of 1973;
  • the middle class is stagnating, as indicated by declining median household incomes for the five years of this century;
  • unemployment spells are becoming longer, and the safety net for those who slip out of the middle class is frayed to nonexistent;
  • the top 20 percent of households ($92,032 a year or higher) took home 51 percent of all income, while the bottom 20 percent ($20,035 annually or less) took home about 3.4 percent (2006 figures) -- and that's just income; on the wealth side, the top 1 percent of households owned 33.4 percent of all privately held wealth and the next 19 percent owned another 51 percent, so the wealthiest 20 percent owned 84 percent of all private property, leaving only 16 percent for the rest (2001 figures; the sum is spelled out just after this list);
  • an increasing proportion of poor households in the USA experiences food insecurity, spotty or no access to health care, inability to pay rent and other essential bills, substandard housing, irregular employment, low wages, a lack of career advancement prospects, poor education and more;
  • the current government came to power against the wishes of the majority in the year 2001 and has ruled to benefit a tiny, tiny elite; and
  • ask the Wobblies, the Molly Maguires, Sacco and Vanzetti and the Black Panthers if U.S. political repression is harsh, or ask the blacklisted people during McCarthyism, or those lynched in the 1930s ...
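
As noted in the income-and-wealth item above, the 84 percent figure is a simple sum of the two published wealth shares:

\[
33.4\%\ (\text{top 1 percent}) + 51\%\ (\text{next 19 percent}) \approx 84\%,
\]

leaving roughly 16 percent of all privately held wealth for the remaining four-fifths of households.
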
What disturbs me is not so much the U.S. reality -- in which every social advance in our history has been soaked in blood -- but that the trend is now sliding downward.

The poor are becoming poorer, the rich richer, the middle class is dwindling. With that comes a deterioration of an admittedly charmed style of life.

The telephones are bad? I just read about a 75-year-old lady in Virginia who went to the telephone company's offices with a hammer and started smashing computers after being utterly unable to get the attention of "customer service" staff for three months when her phone was mistakenly cut off. She's in jail, when the telephone company executives who cut everything to the bone should be in irons.

They do it because investors demand profits? Then the investors' greed should be limited -- by a government that isn't in the pocket of the highest bidder.

Let things slide, work off your frustrations with Comedy Central or the Fox network's over-the-top cartoon humor, chill ... and by the time you take a good look, there will be Brazilian favelas in New York and you will have to bribe the cable man -- that is, if you still have a respectable job with its rapidly vanishing health insurance and pension benefits.

Don't think it can't happen. In 1907, Argentina had the world's seventh-largest economy. "Rich as an Argentine" was a popular phrase in the United States, which was not yet the towering, all-powerful, super-rich nation we have known since 1945.

There is nothing divinely ordained about U.S. wealth or institutions that attempt to achieve greater socioeconomic equality. Both are severely at risk.

It is likely that, much as the 20th century was aptly dubbed "the American Century" by Henry Luce, the 21st may be the Chinese century or -- my guess -- the European century. Very little can be done about that. What goes up must come down.

Absent social and political forces to level not just the playing field but to some extent the scores of the game, however, the United States shows all the earmarks of drifting toward a Third World social structure. This need not happen.

When European nations lost their pre-eminence and vast colonial empires starting in 1945, they introduced the most generous "cradle to grave" systems of social insurance ever known in history. Some may need their sails trimmed a bit, but on the whole, these are viable and necessary systems that the USA, as an advanced nation, should have.

Else, welcome to Brazil.