San Francisco 1906

April 29, 2010

I have always been fascinated by old film footage – particularly that which captures people unaware that they are being filmed.

A few days ago this wonderful footage was sent to me: a 35mm print of a cable car trip down a street in San Francisco in 1906. Automobiles have definitely made their impression on this scene, and it is startling to see how many of them traverse the path of the cable car as it rattles along the street.

But more impressive is the quotidian nature of the footage. Here we look in on the life of an average day (and one assumes before the great earthquake which almost destroyed the city in the same year) of a great American city at the turn of the 20th Century. Men in greatcoats and bowler hats amble across the road. Boys in outsized hats ride bicycles. Billboards advertise medicines and root beer. Life is busy.

Viewing such footage gives one a certain feeling of familiarity and warmth, much like the photographs I presented in my piece The Color of History.

We could ask where this cable car is going and whether any surprises are in store. But the answer is that it is not going anywhere in particular, and the only surprises are in the fascinating details of the journey itself.

So enjoy this footage as you experience a day in the life of San Francisco, 1906.

News as Entertainment

April 27, 2010

If you want an idea of how network news has completely transformed over the past ten years from the sober conveyance of information to a place for larks and comedy, look no further than the ad that came wrapped around the Los Angeles Times on Thursday, April 22.

The ad displays all eight members of the KTLA 5 Morning News team (four women and four men) posing with their arms outstretched, apparently singing as the words ‘Find Me A Comic’ appear. Here are the lines that then follow:

“The KTLA 5 Morning News, in association with the Jon Lovitz Comedy Club at Universal CityWalk, is searching for the next great Stand-Up Comedian. Go to KTLA.com/Comic for details. Upload your best two-minute routine and you could be on your way to a booking in Los Angeles or Las Vegas.”

It concludes with the words “KTLA5 Morning News – News, Weather and OUTRAGEOUS FUN!” (emphasis added).

If you think that sinks to a new level of imbecility, you might want to flip over the news sheet. For there you can read what looks like a page ripped from a ninth-grade yearbook, with all eight members of the team individually profiled. One by one they have jotted down their biographical details, in order to give you important insight into their sparkling personalities:

Here’s Michaela Pereira:

“Greatest Achievement: Not losing my mind in this town or in this business; Biggest Dislike: Judgment and negativity; Favorite Journey: Home.”

Jessica Holmes goes one better:

“Greatest Love: Hmmm… still waiting… on that answer; Greatest Extravagance: food… I would spend a paycheck on a good meal; Greatest Fear: Being stuck in an elevator. Swarm of bees.”

And we surely are better off for gleaning this information from Sam Rubin:

“Favorite Journey: Amalfi Coast, Italy; My Most Treasured Possession: My Kindle; Most Marked Characteristic: It’s my own hair.”

Well, it’s certainly a relief, Sam, to know that you have your own hair.

This sophomoric posturing wouldn’t be so offensive if KTLA5 didn’t command a sizable morning audience that relies upon its newscasters for reliable information. According to Nielsen, the station has the second largest market share for morning news in Southern California and ranks 12th in the nation.

There is no great revelation in declaring that the division between news and entertainment has disappeared. Turn on almost any news program from MSNBC to Fox News and it becomes clear: newscasters are presented as celebrities in their own right whose images are carefully groomed; the news is tailored to capture sensation, often at the expense of veracity and proportion; and aimless banter between newscasters is given almost as much emphasis as the news itself.

So on any given day, we can watch newscasters deliver, in somber tones, news of the death of hundreds in an earthquake or flood, only to follow it a few minutes later with lighthearted banter about home cooking or the hosts’ latest travel plans.

The repugnant narcissism of the whole enterprise should make almost anyone with a sense of propriety and reverence for traditional news service turn off their television and never return. That, of course, will not happen. Most of us will still tune into televised news in order to be brought up to date as quickly as possible on events unfolding around us. No one can say they regret having had a television on September 11, 2001 and learning, instantly, of the events unfolding in New York City that day. Television is a vital source of information and communication, and was at one time regarded as a great blessing to our society and civilization.

There is almost no question that that perspective on the medium has changed, and for many of us it is difficult to turn on a news broadcast without a sense of being used or manipulated.

Perhaps, in light of KTLA’s search for new comedic talent, it might pay us to be aware that those providing us with the information we often need to make daily choices and decisions are nothing more than paid actors, enjoying their celebrity at our expense – and all in the name of outrageous fun.

ANZAC Day: Well Done Ted

April 26, 2010

No one can deny that the memory of the First World War is almost as doddery as its survivors. Save for two men and one woman, all aged 109, every veteran of that war, on both sides of the conflict, is now dead. Harry Patch, the last of the British veterans, died last year at 111, within months of the two other surviving Britons.

Today there are no surviving Australians from the conflict. This year, for the first time, only a riderless horse represented the fallen at the annual April 25 ANZAC (Australian and New Zealand Army Corps) Day parade.

Growing up in Australia, I was well acquainted with how seared into the national consciousness was the Gallipoli campaign in Turkey, which took place between April and November 1915. I had watched for years as the aged soldiers of that forgotten war would amble down St. Kilda Rd. in Melbourne, bearing their medals and wearing their Digger caps.

It was the First Lord of the Admiralty, Winston Churchill, who conceived the idea of attacking the Turkish Dardanelles, the narrow strait that led to Constantinople (today’s Istanbul), which lay athwart the entrance to the Black Sea. Churchill’s idea – and it was a sound one – was that if Constantinople could be taken, Turkey and the Ottoman Empire it controlled could be knocked out of the war, and the British and Russian fleets could link up. This would give the Allies overpowering control of the Mediterranean and would inevitably tip the balance of the war in their favor.

In order to break the Turkish resistance on the Dardanelles and allow the British navy safe passage, the British High Command decided to send, with their own troops, the newly minted Australian and New Zealand Army Corps, which had been training in Egypt. It was decided to attack the Gallipoli Peninsula on its western coast, which faced the Aegean.

The mistakes made – underestimating the strength of the Turkish defenses, the steepness of the terrain, the absence of shade and the difficulty of resupply – have been heavily documented. The ANZACs in the north of the peninsula and the combined British and French forces in the south could not break the Turks, who had command of the highlands and far more effective resupply from their hinterland. After seven long months of attrition and nearly 15,000 dead, the Allies were forced into an ignominious retreat.

No Australian schoolchild grows up without knowing these facts. Although a relatively small number of Australians perished at Gallipoli (8,700), the war was the first armed conflict in which the young nation had been involved, and the devastating impact of the losses on a country with a population of only 5 million began to meld a sense of national purpose and identity.

Like other schoolboys I had been quite taken with the stories of the courageous Australian “diggers” who had fought in the trenches and endured shocking casualties in their assault on the Turks. So naturally I felt my first trip to Turkey should include a visit to the Australian military cemeteries on the Gallipoli highlands. Since I was riding a bicycle, I knew the trip was going to be difficult.

But I didn’t realize how difficult. Impossible headwinds, rutted roads, broken spokes and blown tires bedeviled the journey – and that was all in the first six days! I huffed up those hills under a scorching summer sun, never having endured anything quite like it before.

When I arrived, exhausted, at the Gallipoli battlefield, I traveled along ridges where the ANZAC and Turkish trenches could still be seen on opposite sides of the road – in places only ten yards apart. The ANZAC cemeteries were immaculately groomed and maintained, and the numerous memorials told the story of the legendary battles which took place there.

I have always maintained an interest in military cemeteries. I am fascinated by how little remains of the men who fought and died in the places they are buried, save for a name, a date of birth and a date of death. Few other associations are left for us to appreciate – the comradeship under fire, the relationships with commanding officers, or the homesickness of boys eating bully beef out of tin cans while longing for their mothers’ home cooking.

Among the gravestones in the Gallipoli cemetery, I read the usual inscriptions one would find on any military tombstone in Australia.

Until I came to one in the middle of the field. It was a simple plinth with the soldier’s name and dates of birth and death inscribed as usual. But below the inscription were just three words, which seemed to tell me more about the soldier and his loved ones than any other memorial in the park.

“Well done Ted.”

Those three words spoke to me across the generations about two parents’ reverence for their lost son’s sacrifice, for his calm in the face of battle and for the sense that he stood for something beyond his own preservation.

We live in a cynical age, with little respect for, or interest in, the sacrifices of an older generation. But we shouldn’t forget how it all happens. Young boys, shipped overseas on a great adventure, come face to face with the sheer brutality of war, its indifference to human suffering and the shattering realization of how life can end in a split second. Under such circumstances one grows up in a hurry, and the awakening maturity leaves its scar not only on the men who come to fight, but on the nation that has sent them.

Western civilization has yet to recover from the shock of the First World War. We are all deeply scarred by it, and almost everything that has occurred in history since that time can find its roots planted in that conflict’s soil. Australians are generally nonchalant and casual people, not standing on ceremony, nor given to hyperbole. For their sacrifices, remembered on this 95th anniversary of the day Australia truly became a united nation, perhaps the most fitting words spoken to the country’s First World War veterans might be “Well done, boys” – even if there is no one now left to hear them.


Remembering the Armenian Genocide

April 26, 2010

Saturday, April 24, 2010 marked the 95th anniversary of the beginning of the Armenian Genocide. On April 24, 1915, in Constantinople, the Ottoman authorities arrested some 250 Armenian intellectuals and community leaders. Thereafter, the Ottoman military uprooted Armenians from their homes and forced them to march for hundreds of miles, deprived of food and water, to the desert of what is now Syria. Massacres were indiscriminate of age or gender, with rape and other sexual abuse commonplace.

A number of years ago I wrote a short story to commemorate the 80th anniversary.  I reproduce it here.


Anoush can’t remember much these days.

The hurried packing of the household goods; the clatter of pots and tin cups as they were loaded into sacks. Dishes smashing as the Ottoman police forced them on. The squeak and crank of the cart as it grumbled over the rutted road.

She can barely recall the river, where the guards tied hundreds of them together, attached rocks to their feet and watched them drown.

She has only hazy memories of the desert. The sun burning her face red; the desperate need for water; the pounding hooves of the Kurds on horseback as they poured down from the hills. The screams of the girls as they were set upon and raped.

She can’t remember now how her husband died or what became of her two girls.   But she remembers the boy and the tree she felt might save his life.

“Leave him!” she remembers her husband saying with tears in his eyes. “Leave him or he will kill you and he will kill us!”

She remembers looking into his eyes one last time. His skin blanched and his cheeks swollen. She thought she saw him smile. And the fading sounds of his gurgles as the ragged convoy moved on.

As she stumbled on, she noticed the silhouette of two wolves on a nearby hill.

“Gone,” she mumbles, as she looks out now at the distant hills. “But the boy… what happened to the boy?”

She rests back in her chair and wipes a veined, wrinkled hand across her brow.

“The wolves,” she smiles wanly as she rocks back and forth. “I know the wolves are taking care of my son.”

Good William Hunting

April 25, 2010

If there’s at least one thing we are fairly confident about in William Shakespeare’s hazy biography, it is the date of his birth. The parish register of Holy Trinity Church at Stratford-upon-Avon in Warwickshire shows that he was baptized there on April 26, 1564. Since baptisms in 16th Century England were traditionally celebrated three days following birth, we have come to accept that Shakespeare was born on April 23rd of that year.

After this, things start to get murky. We know nothing of his youth and education, what he read, or his relationship with a father who was engaged in various forms of trade and seems, despite being elected an alderman and bailiff (the mid-16th Century equivalent of a mayor), to have had fluctuations in prosperity. The next mention in the official record is of his marriage to Anne Hathaway at the age of 18 and the birth of his three children in the 1580s. But of what Shakespeare was actually doing during this decade, we know nothing.

His first appearance in London is recorded only in a sarcastic note penned by the playwright Robert Greene. From then on we have little to munch on regarding Shakespeare’s private life, even if there exist official documents notarizing his land purchases and the performance of his plays. We have his date of death and a will – a long document, signed in a shaky hand. But little else remains of the flesh-and-blood man.

The dearth of information on Shakespeare the man has mystified scholars for years. It has plunged biographers, academics, amateur historians and would-be literary sleuths into an endless struggle over the question: Who was William Shakespeare?

Well, since at least the 1850s it has been argued by a legion of skeptics that whoever was born in Stratford in April 1564 and died there 52 years later was not the author of the greatest works in English literature. It has been suggested, by no less eminent figures than Mark Twain, Henry James, Helen Keller and Sigmund Freud – among many lesser lights – that the author of such soaring masterpieces as Othello, Macbeth, King Lear and Hamlet must have been a man of great distinction, well traveled, schooled in many languages and close to the Elizabethan and Jacobean Courts. Twain, whose own authorial canon was drawn from the wellspring of his own experiences, maintained the literary theory that all fiction is in essence autobiography. And if this is the case, William Shakespeare of the obscure village of Stratford could not have been the same man who penned the immortal lines of a Hamlet or the soliloquies of a King Lear.

If not William Shakespeare, then who?

This has been the literary community’s holy grail for centuries, and the search has settled on two main contenders: the polymath and supreme Man of Letters Francis Bacon, and Edward de Vere, the 17th Earl of Oxford. I am not going to go into the extensive arguments made for either man, only to contend that, to our knowledge, Bacon did not write plays and that his published poetry was inferior in quality to Shakespeare’s. The other contender, de Vere, who died in 1604 (which would put him beyond the capacity to have penned King Lear, Macbeth and The Tempest), was reputedly a playwright and poet of high distinction; but since nothing in his own hand has survived, we have had no opportunity to compare his skills with those of the much-maligned glover’s son in distant Stratford.

There might be good reason to shake one’s head in disbelief at the level of sophistication required to write the Shakespearean canon. As an enthusiast for Shakespearean drama since my early teens, the question I have often asked is not whether Shakespeare could have written his own plays, but whether any human could have done so. So deep is their understanding of human nature, so filled are they with pathos and majesty of poetic expression, so given to consistency of character development and so brimming with vim and vitality (such that the characters appear virtually to lift off the page) that they leave any other playwright of his own time – or of any other time – flapping in the dust. The endurance of the plays and the survival of their often archaic language into our own day offer the sneaking suspicion that the Shakespearean pen was guided by the light of Divine inspiration.

This is perhaps where all the Shakespearean skeptics have gone so wrong. What they fail to account for is the power of the human imagination, which was perhaps perfected in the person of a low-born boy from the West of England. They fail to appreciate that even a child living in the hovels of Calcutta, the barrios of Rio de Janeiro or the windswept deserts of the Sahara might have the capacity to stare out a window and dream of things he has never seen or experienced before. That such imagination can lift him on a journey of exploration and provide him with the power to shape in his own mind characters, conversations and scenes which he had never previously even contemplated is beyond their ken.

We know nothing of William Shakespeare’s boyhood. But is it not possible that a young, lonely boy spent his days daydreaming about kings and princes, fairies and goblins, castle moats and palace intrigues while playing in nearby fields and forests – and that parents, lovingly aware of their son’s interests and gifts, provided him with any book they could find to feed his insatiable need for detail and information?

Anyone who has written fiction and has had supportive parents knows this experience. The gift of William Shakespeare was that he took historical and mythical characters, drawn from books and pamphlets he might have read as a child, and molded them, through an extraordinarily powerful imagination and a mature, refined intellect, into real life. The skeptics, in the end, for want of documentary proof of Shakespeare’s exceptionalism, embrace the greatest hoax in literary history because they cannot accept the reality of this genius.

Maybe he was indeed a fluke of history – given exactly the right time and environment for his genius to flourish. Maybe these circumstances only converge once every 500 or 1,000 years. Yet I firmly believe that there are, or will be, other young William Shakespeares, gazing at a cobblestone road or the sky from a bedroom window and dreaming of distant lands and of people whom they may never meet, but seem to already know. All they might need is pen, paper and enough adult awareness to recognize the breadth of a child’s imagination and the power of their insight to shape and change the world.

Earth Day as Global Guilt Fest

April 23, 2010

There are plenty of people around today who can tell you about the first Earth Day, held on April 22, 1970. They are in their late 50s and early 60s now, but back then they were students, part of the Woodstock Generation and eager to find a cause to which to attach themselves. In the midst of the Vietnam War, in which defoliants such as Agent Orange were used to strip away forests concealing enemy combatants, and with a massive oil spill having contaminated the Southern California coastline, the idea of celebrating the Earth and its natural bounty seemed a redeeming response to human despoliation.

But there was, surprisingly, already an “Earth Day” in existence in 1970 – one that had been enthusiastically celebrated for close to 100 years. Arbor Day had been instituted in Nebraska in 1872 by Julius Morton, a Michigan transplant who felt that the planting of trees could bring life again to the Great Plains. He believed that tree planting was “no more than a desire to pay a just debt to our forefathers who had cultivated trees before us.” Tree husbandry was also an expression of the human impulse to increase the beauty of the land, “to endeavor to make the world lovely because he has been a dweller on it.” On the very first Arbor Day alone, one million trees were planted.

You would hardly know it, but National Arbor Day is still around, organized and coordinated by the non-profit Arbor Day Foundation, whose sole mission is to support the planting and maintenance of trees. It may seem decidedly uncool and retrograde to support such a moribund notion, but in fact today’s environmentalists could learn a great deal from the arborists. They could learn that global warming could be dealt with through a massive reafforestation program around the world. They could be guided by the goodwill and genuine level of cooperation between arborists. And they could absorb the lesson that love for nature, rather than contempt for humanity, should be the ultimate guiding principle of any environmental movement.

Because contempt for humanity and censure of its transgressions are at the core of today’s environmental movement. Global warming hysteria has clearly taken aim at human incompetence in managing the environment. Animal rights organizations regularly decry the hubris of human beings who consider themselves in some way superior to the animal kingdom. And supporters of such outré international policy platforms as Agenda 21 regularly dismiss inconvenient notions such as individual property rights as an interference in their quest to save the planet.

In contrast, National Arbor Day is pro-human and focuses on human productivity and hope for the future.   “The only stand we take,” says Mark Derowitsch of the Arbor Day Foundation, “is that it’s a great thing to plant trees.” Arbor Day does not require nor ask for government intervention, regulation, restriction or taxation.  All it calls for is publicly spirited individuals and organizations to plant trees. “Anyone can plant a tree,” says Derowitsch. “You get your hands dirty and make a huge difference in the world.”

Is it any wonder that many people who once called themselves environmentalists are now loath to do so because of that term’s negative connotations? In 1990, according to an ABC News/Gallup survey series, 75 percent of Americans surveyed said they considered themselves environmentalists. But those numbers have been slowly reversing over the last decade. As of 2008 (the most recent year the question was asked), only 41 percent of Americans identified themselves as environmentalists, with 58 percent saying they do not.

And Gallup’s annual environmental survey also finds the public now favors economic growth over environmental protection by a 53-38 margin. For most of the last 25 years, even during previous recessions, the public favored the environment over the economy by as much as a two-to-one margin. That trend is now reversing as more and more citizens begin to understand that absurd environmental moratoria are making it impossible to compete with countries that are not so hindered. Meanwhile, opinion in favor of increased oil and gas exploration is surging, as is the demand for nuclear power as a safe, cheap and clean alternative to fossil fuels.

Perhaps this is all because the militancy of the environmental movement has turned people against it, as they increasingly realize that such self-aggrandizing organizations as the Sierra Club, the World Wildlife Fund, the Natural Resources Defense Council and the Environmental Defense Fund have transformed from grassroots advocacy institutions into multimillion-dollar fundraising franchises, with radical political agendas which stray far from their original mission of environmental protection.

That is not to mention the widespread realization of the alarming religious overtones which have crept into environmental language. In 1990, the Earth Day Foundation introduced the Equinoctial Earth Day, celebrated on the March equinox (around March 20) to mark the precise moment of astronomical mid-spring in the Northern Hemisphere, and of astronomical mid-autumn in the Southern Hemisphere. Such equinox and solstice festivals are pagan in origin, going back to the earliest days of civilization. In 1992, Maurice Strong, the Secretary-General of the Earth Summit in Rio, hinted at the overtly religious agenda proposed for a future Earth Charter (2000) when, in his opening address to the delegates, he said, “It is the responsibility of each human being today to choose between the force of darkness and the force of light… We must therefore transform our attitudes and adopt a renewed respect for the superior laws of Divine Nature.” Strong finished to unanimous applause from the crowd.

In anticipation of the conference, his wife, Hanne Strong, held a three-week vigil with Wisdomkeepers, a group of “global transformationalists.” Through round-the-clock sacred fire, drumbeat, and meditation, the group helped hold the “energy pattern” for the duration of the summit.

As if to prove the wild-eyed ambition of this New Age millenarianism, the authors of the Earth Charter, an environmental manifesto promulgated at a UNESCO meeting held in Paris in March 2000, commissioned the building of the Ark of Hope, a latter-day replica of the Ark of the Covenant, as a place of refuge for the Earth Charter document. The Ark was later carried on foot from Vermont to New York City (just as the Ancient Israelites had once carried their Ark) and exhibited at the United Nations.

Is it any wonder that Strong would therefore comment, after the promulgation of the Earth Charter: “The real goal of the Earth Charter is that it will in fact become like the Ten Commandments”?

Or that Mikhail Gorbachev, one of the world’s leading proponents of sustainability, could state: “Do not do unto the environment of others what you do not want done to your own environment… My hope is that this Charter will be a kind of Ten Commandments, a ‘Sermon on the Mount’, that provides a guide for human behavior toward the environment in the next century”?

In preparation for AFA’s The Green Movement seminar in February 2010, all of the speakers explained to me that they must regularly qualify themselves as sincere supporters of clean air and pure water before launching into any jeremiad against the environmental movement. This is largely a result of the guilt laid upon our society by environmentalists and ecologists over the past 40 years, who insist that mankind is preternaturally oriented towards environmental degradation.

With such a sad record of negativism, there should be little wonder that this year’s Earth Day, the 40th anniversary of that seminal event, passed by with a whisper, not a bang. Perhaps it is because Americans are beginning to understand that the world is not quite as grim a place as the Earth Day promoters would have us believe. Or that our future is indeed tethered to an environment that is put to humanity’s use and better purposes, and not the other way around.

Has the United States Given Up on Missile Defense?

April 22, 2010

A debate has raged lately in the Wall Street Journal over the impact of the new START Treaty signed by Barack Obama and Russian president Dmitry Medvedev in Prague on April 8.

In an editorial on April 17, the WSJ editorial board opined that the treaty would prevent the U.S. from converting an ICBM silo into a missile defense site. It noted that the preamble to the Treaty points to “the interrelationship between strategic offensive arms and strategic defensive arms,” thus linking existing weapons and America’s missile defenses. Article V contains a binding clause that the U.S. or Russia “shall not convert and shall not use ICBM launchers and [submarine-launched ballistic missile] launchers for placement of missile defense interceptors therein.” Article XIV lets either party withdraw if “extraordinary events . . . have jeopardized its supreme interests.”

The Journal then concludes – and in my opinion correctly – that:

“In practice, this will mean that any new defense initiative will have to overcome a Russian threat to withdraw from START. Opponents of missile defenses in Congress and abroad will claim that any such move would be destabilizing and start a new ‘arms race.’”

In a furious rejoinder, James L. Jones, the White House National Security Adviser, responded with a denial of any impact on missile defense.

“As Gen. Patrick O’Reilly, the head of the Missile Defense Agency, explained to Congress on April 15, we have no plans to convert any additional ICBM silos. In fact, it would be less expensive to build a new silo rather than convert an old one. In other words, if we were to ever need more missile defense silos in California, we would simply dig new holes, which is not proscribed by the treaty (nor are we barred from building new missile defense silos anywhere else). Gen. O’Reilly also told Congress that launching missile defense interceptors from submarines was reviewed and deemed an ‘unattractive and extremely expensive option.’”

Jones concludes his letter by asserting that the Obama administration is unequivocally committed to missile defense:

“The treaty restrains neither our program for missile defense of the U.S. (at bases in California and Alaska) nor the new phased adaptive approach for missile defense in Europe. The president remains committed to developing and deploying missile defenses, as evidenced by his nearly $10 billion budget request for fiscal year 2011, almost $700 million more than the current year. A Russian threat to withdraw from START will not deter the president from taking steps necessary to protect the U.S. and her allies.”

Is there or is there not, then, a linkage between missile deployment and missile defense in this Treaty that gives the Russians an effective veto over the development of U.S. missile defense systems?

The Russians seem to think so.  Before the Treaty was signed, Russian Federation Foreign Minister Sergei Lavrov stated in a Business Week report:

“Linkage to missile defense is clearly spelled out in the accord and is legally binding. Russia will have the right to exit the accord if the U.S.’s build-up of its missile defense strategic potential in numbers and quality begins to considerably affect the efficiency of Russian strategic nuclear forces.”

Hillary Clinton, however, clearly doesn’t see it that way. On April 9, following the signing, she commented:

“Now, one aspect of our deterrent that we specifically did not limit in this treaty is missile defense. The agreement has no restrictions on our ability to develop and deploy our planned missile defense systems or long-range conventional strike weapons now or in the future.”

Well, then, whose interpretation is right – the Russians’ or the Americans’? In the absence of an adjudicator, the question cannot be settled, for what is clear is that there is no meeting of the minds on the subject. The Russians continue to see the construction of an effective missile defense shield as an aggressive act in itself, one that would jeopardize any of their nuclear arms reduction commitments. This is in line with Russian policy going back as far as the Reagan era, and it was most recently on full display in their resistance to the deployment of missile defense systems in Poland and the Czech Republic.

In that instance, the Obama administration cravenly backed out of its commitments to those countries, souring the East European leadership on American resolve. The United States won nothing for that retreat, save for this treaty, over whose interpretation the two countries cannot now agree.

The Treaty, under the Russian interpretation, will do serious damage to a program that has already been eviscerated by the Obama administration. The United States, to put it bluntly, does not have an effective missile defense shield, and last year’s cut of approximately 30% in the missile defense budget may have set the program back by as much as ten years.

The START Treaty, under the U.S. Constitution, must now be submitted to the U.S. Senate for ratification. The Senate should not ratify it until it receives ironclad commitments from the Obama administration that it will pursue an aggressive missile defense program and that it will not allow Russian intimidation to veto any system now in development or to be developed in the future.

In the end, without a clear U.S. demand that there be no linkage between missile deployment and missile defense, the Treaty will not be worth the paper it is written on and will be dead before even one bolt is unscrewed on any ICBM silo.

Oklahoma City, Columbine and Waco in Popular Memory

April 22, 2010

What image comes to your mind when you think of the Oklahoma City bombing?

The destruction of the Alfred P. Murrah Federal Building on April 19, 1995, in which 168 people lost their lives and hundreds were injured, seems to have paled in significance in the shadow of the much larger and more historic events in New York City six-and-a-half years later. The memory has faded, but not simply as a result of the passage of time. It is because it is far easier to picture outsiders conducting such a heinous attack upon Americans than it is to accept that it was perpetrated by home-grown citizens.

That it was indeed an American who set the fuse which brought down the building and damaged 384 others in the immediate vicinity is still hard for many to fathom. Timothy McVeigh and Terry Nichols may have been very disturbed young men with wildly grandiose projections of themselves as true patriots, but they had nonetheless served in the armed forces and at one time lived fairly normal suburban lives with loving families.

How does it happen that someone grows up to be Timothy McVeigh? The same question can be asked about Eric Harris and Dylan Klebold, the teenagers who eleven years ago in this same week ran amok at Columbine High School in Jefferson County, Colorado, killing 12 students and a teacher and wounding many more before killing themselves.

And six years before them, the siege of the Branch Davidian sect at Waco, Texas, had ended in tragedy when, after 51 days, all 75 residents of the compound died in a conflagration ordered by the sect’s leader, David Koresh.

All three events are inextricably linked. McVeigh fixed on the date of April 19 to commemorate the Waco siege. At his trial, he wanted his attorney, Stephen Jones, to present a “necessity defense,” which would have argued that he was in “imminent danger” and that his bombing was intended to prevent future crimes by the government, “such as the Waco and Ruby Ridge incidents.”

A journal found in Harris’ bedroom described events such as the Oklahoma City bombing and  Waco, and noted  how the two wished to “outdo” these atrocities, focusing especially on what Timothy McVeigh had accomplished in Oklahoma City.

In the six years which separated these three events, the media became rife with speculation about a home-grown far-right conspiracy that had taken hold of a sector of the U.S. population. Many nascent plots were allegedly hatched and then foiled before they occurred. But no unified conspiracy was ever discovered, and there have been no other events quite like them since.

What then inspired these men and teenagers to wreak such damage and permanently scar the lives of hundreds of people? Both Harris and Klebold, unhappy and unpopular young men, had spoken of wanting to leave a lasting impression on the world through violence. The disaffected McVeigh wanted to change history and to make a statement through an act of political theater. David Koresh claimed that he was the new King Cyrus and a Messiah to boot.

In all of them a malignant narcissism reigned, encouraged and inflamed by local circumstances and popular culture, but in the end, operating as a  fatal personality disorder that led to the slaughter of innocent Americans in the name of a higher good.

There was, then, no conspiracy, no ideology and no political platform linking these three crimes. What linked them was a desperate need for attention that could only be satiated by a spectacular act of destruction.

As we remember these tragic events today and seek to explain them to ourselves, we should not forget that, yes, all the perpetrators were Americans. But first and foremost they were highly disturbed and disaffected human beings, for whom violence became the key to the door of everlasting fame and attention.

Israel and the Meaning of Independence

April 21, 2010

Many people are puzzled as to why the State of Israel seems to celebrate the anniversary of its establishment on a different date every year. After all, the State of Israel came into existence after a declaration by the Provisional Council of State in Palestine, led by David Ben-Gurion, on May 14, 1948. Yet in the intervening 62 years, Israel’s Independence Day has fallen on May 14 only once.

The answer is that Israel marks its anniversaries by the Hebrew calendar, which means that from year to year anniversaries, holy days and even birthdays are often celebrated as much as a month apart from the dates to which they are attached in the Gregorian calendar.

But another fact that is often glossed over is that Israel did not actually achieve independence 62 years ago, because there was nothing to claim independence from. British suzerainty over Palestine had been mandated not by the international body, the League of Nations, but under a resolution of the San Remo Conference (1920), which was later ratified by the Treaty of Lausanne (1923). Both effectively recognized the British conquest of Palestine and ended Ottoman rule. In fact, the British Mandatory Authority, established thereafter, was not a sovereign body and was not universally recognized by all nations (the United States being the most prominent holdout). Its legal legitimacy was in question for 30 years. So while the creation of the state in 1948 derived its standing in international law from U.N. Resolution 181, Israel’s declaration of “independence” was no more than a dramatic means of stating its formation as a contiguous and indivisible state. On May 14, 1948, it became independent of nothing.

These might seem like picayune legal arguments, with no particular relevance to today’s politics or diplomacy. Yet understanding the concept and meaning of independence is vital to appreciating how Israel sees itself today.

For the question of the country’s independence has been a determining factor in Israel’s survival until now, and is today a deciding factor in how it proposes to deal with the menace to its existence arising from the Persian Gulf.

History has some important things  to say about the matter.

In 1956, Dwight Eisenhower exerted enormous diplomatic pressure on Israel to withdraw from the Sinai Desert after its troops captured it in a lightning strike on Egypt during the Suez War. Expressing pique at the way the joint British-French-Israeli action sought to reverse Gamal Abdel Nasser’s nationalization of the Suez Canal without his involvement, Eisenhower made it clear that he would join the Soviet Union in condemning Israel’s actions at the United Nations if there was not a prompt withdrawal. His impetuous actions were to have devastating consequences for the region and the world, handing Nasser a diplomatic victory at the expense of Western unity and resolve.

To sweeten the bitter pill the young country was forced to swallow, the Eisenhower administration offered to guarantee the safe passage of Israeli shipping in the Red Sea, with military force if necessary.

Ten years later, when that agreement was put to the test, the United States proved unable to uphold its end of the bargain. In May 1967, the Egyptian government once again closed the Straits of Tiran to Israeli shipping, imposing enormous hardship on Israel. This act of aggression, along with the repeated genocidal pronouncements of the Egyptian leader, placed Israel on a war footing. President Lyndon Johnson sought to defuse the crisis by attempting to assemble an international maritime force to break the blockade. Only two nations responded to his call. By June 5 of that year, recognizing that the American guarantee was worthless, Israel staged its now famous pre-emptive strike on Egyptian and Syrian airfields, which effectively ended the Arab offensive and determined the ultimate outcome of the Six Day War.

In 1991, during the first Gulf War, Iraq fired 39 Scud missiles at Israel, causing millions of dollars’ worth of damage to the country and 74 deaths. The United States asked Israel not to retaliate, in the hope of preventing the outbreak of a regional war. But part of the bargain for exercising restraint was a tacit American agreement with the Shamir government that Israel would receive increased aid to resettle the stream of Soviet Jews pouring into the country, as well as diplomatic space in dealing with the Palestinian rebellion known as the Intifada.

Yet rather than rewarding the Israelis for their stoicism and restraint in the face of these unprovoked attacks, the government of George H.W. Bush withheld vitally important loan guarantees needed for the resettlement of Soviet Jews, and soon after the war placed inordinate pressure on the Israelis to attend a peace conference in Madrid without preconditions. Hoodwinked by Bush and Secretary of State James Baker, the Israelis were drawn into the Madrid Conference, which initiated the series of events leading to the 1993 Oslo Accords – in hindsight, the greatest diplomatic debacle in Israel’s history.

The train of misguided guarantees did not stop there.

On April 14, 2004, President George W. Bush, in a letter to then prime minister Ariel Sharon, accepted that in exchange for Israeli withdrawals from Gaza and certain areas of the West Bank, the U.S. would expect Israel to retain settlement blocs in the West Bank as well as maintain Jerusalem as a united city. The wording of the letter ran thus:

“…is unrealistic to expect that the outcome of final status negotiations will be a full and complete return to the armistice lines of 1949, and all previous efforts to negotiate a two-state solution have reached the same conclusion.”

That sentence was regarded as another guarantee, and it committed Israel to a course of action that would ultimately bring into existence a terrorist-run government on its southern border.

Both Ariel Sharon and George W. Bush have now left the political stage, and Israel faces a new administration that shows neither desire nor interest in honoring the previous administration’s written guarantees. Instead it has turned on the Jewish state, describing Israel’s obstinacy in holding on to settlements (and East Jerusalem!) as having a deleterious impact on American interests in other parts of the region.

Taking this unhappy history into account, one can well imagine the current Israeli prime minister, an avid student of history, holding a thoroughly jaundiced view of American resolve. He almost certainly recognizes that very little of what the United States government has to say about guaranteeing or supporting the security of the State of Israel can be taken at face value.

And so we return to the issue of independence. Like any modern state, Israel justly reserves the right of self-defense and the right to take independent action in exercising it. Moreover, the Israelis have consistently demonstrated over the past 62 years that they are willing to risk the ire of the United States and the condemnation of the world in order to provide for their national security.

All of which makes it increasingly likely that in the event the Obama administration fails to take concrete action to prevent Iran’s emergence as a nuclear power, Israel will feel unconstrained by U.S. warnings or offers of protection, and will proceed with a devastating strike against the Iranian regime.

This will be Israel’s ultimate declaration of independence. For this country, which does not even celebrate the anniversary of its creation on a fixed calendar date, there is a pattern of challenging orthodoxies and expectations. In formulating its Middle East policies, the Obama administration would therefore be making a grievous error in judgment by underestimating the determination of the Jewish state to fill the vacuum left by American irresolution.

To Badger Hunters Everywhere: We Will Rock You

April 19, 2010

You can’t get a creature much cuter than a badger. And an English badger is a cut above the rest. Celebrated and anthropomorphized in the works of C.S. Lewis, Kenneth Grahame, Beatrix Potter and Roald Dahl, the badger is about as British an animal as you might want, even if much of the world sports its own variety.

It shouldn’t be much of a surprise, then, that a new movement has arisen in England and Wales to defend the badger from unwanted culling. Led by Brian May, guitarist of the legendary U.K. rock band Queen, the movement calls for an end to a government-mandated program of curtailing badger populations, now proposed for the highlands of Wales. The reason for the cull? Badgers, which carry tuberculosis without harm to themselves, urinate and salivate on grazing fields, causing Welsh cattle to ingest contaminated grass. Bovine TB is fatal to the cattle, necessitating trips to the slaughterhouse. In the past several years, thousands of head of Welsh cattle have been put down after contracting the disease.

The solution would seem to be a no-brainer, right? Get rid of the badgers. That might have been the approach in the 1980s, when a devastating outbreak of trichinellosis affected Russian badgers and cattle, and bovine TB later struck English farms. The answer of the English authorities then was a program of gassing, which effectively ended the plague.

But that was so 1980s. Since then, English acceptance of the badger’s value has grown widespread, and over 60 associations have sprung up around the country to lobby for its protection. Spurred by these groups, the 1992 Protection of Badgers Act made it an offense to kill, injure or take a badger, or to damage or interfere with its lair (known as a sett), unless a license is obtained from a statutory authority. An exemption that allowed fox hunters to loosely block setts to prevent chased foxes from escaping into them was ended by the passage of the 2004 Hunting Act.

Desperate to save their farms from growing danger, the Welsh farmers are urging a new cull of 1,000 badgers to eliminate the threat. Mr. May is having none of it. An animal lover who maintains a menagerie for sick animals at his 19th-century mansion, May insists that the animals be inoculated rather than killed. The farmers point to the difficulties involved in such inoculation (it is not, for instance, as if the badgers will line up at the local veterinarian for their shots).

The dispute is emblematic of a struggle being waged across the Western world between animal liberationists and those whose livelihoods depend on animal husbandry. As I wrote in last week’s piece, How Would You Like Your Eggs?, people like the Welsh farmers win little sympathy from activists who believe that the world’s priorities rest with conserving the animal kingdom, whatever the cost to human beings. And so rich rock stars like May will pump hundreds of thousands of pounds of their own money into animal rights campaigns in an effort to harness support for their pet projects, indifferent to how this might affect the livelihoods of thousands of farmers.

The outrage of the farmers is palpable. In a recent Wall Street Journal article, Brian Walters, the vice-president of the Farmers’ Union of Wales, stated:

“It is completely galling for those who have to live with the misery and financial losses caused by TB to see a millionaire rock star dropping in to talk about the proposed cull when he has no idea of the desperate need to control this disease.”

Christianne Glossop, chief veterinary officer of Wales, added before the hearings:

“By Day Two, Brian May had gone back to wherever he lives in the English home counties, and here we are in Wales, and we still have TB.”

May himself is unapologetic. In the same Wall Street Journal article he is quoted as expressing astonishment at the furor:

“Why do we as a species think we have the right to exterminate another animal species?”

In those few words May encapsulates the nub of the dispute. The rampant and growing belief that humans are like all other animals on earth, with no greater claim to the earth’s resources, is in direct conflict with the notion of human exceptionalism – the idea that humanity’s destiny is to control the planet and utilize its resources for the benefit of mankind. The former kind of thinking can only lead to a further collapse in moral values, to an attack on the protection and valuation of human life, and to an unwillingness to make compromises when a crisis affects the health and welfare of human beings.

No one, of course, is talking about exterminating badgers. But the animal liberationists among us know a crusade when they see one, and will stop at nothing, even the destruction of a local farm economy, to enforce their world view on their fellow countrymen. We will rock you, indeed.
