A GRAND OASIS IN THE VASTNESS OF SPACE

July 14, 2009

 

Perhaps it is an outworn cliché, but it still holds true: everyone remembers exactly where they were and what they were doing at the moment of the first moon landing.  I was a 5th grade schoolchild in Melbourne, Australia, sitting in a tiny classroom with 20 other children as the voice, crinkled with static, rumbled from the television set and across the room.  We sat transfixed, knowing without any real prompting from our teacher that we were watching a major historical event, quite unlike any other we were likely to witness in our lifetimes.

Neil Armstrong’s first step on the moon and his resonant words, “That’s one small step for a man, one giant leap for mankind,” still impress me as a mark of extraordinary human daring and technological wonder.  Perhaps it all seems rather commonplace now, but in the 1950s the idea that man would travel in space or place a foot on an extraterrestrial surface seemed as remote as the ability to travel through time. But in the eight years that passed between Yuri Gagarin’s epochal orbit of the Earth in April 1961 and the Apollo 11 moon mission of July 1969, our entire perspective on what applied human intelligence coupled with unfettered determination could achieve was greatly expanded.  Suddenly we were aware that the cosmos was not some inky, impenetrable blackness that could not be understood, but a vast panorama of possibilities for exploration, study and adventure.

The conversation which followed on that wet winter’s day (remember, this was Melbourne, Australia) revolved not around what we had just seen, but around the next step humanity would take in its exploration of space.  A mission to Mars or Venus seemed inevitable, and for the next two hours we debated with one another the new civilizations that would soon be discovered and the possibilities for travel toward them.

Our generation was to be flatly disappointed in its expectations.  In fact, despite several more lunar landings in the five years that followed, the NASA program, at least from a relatively uninformed adolescent perspective, seemed to slow down, and its greatest implied quest – finding other forms of intelligent life in the universe – became a passing interest rather than its fundamental mission.  As the years passed, the space shuttle program, the unmanned explorations of Venus and Mars, and the Mariner, Venera, Viking and Voyager expeditions sent to explore the outer reaches of our solar system may all have been historic undertakings, yet they seemed to pale in comparison to the tactile act of placing a human foot on the surface of an extraterrestrial sphere.

Why was this?  Because, gazing for millennia into the vast night sky, we humans have longed to be reassured that we are not alone.  The conviction that there must be other forms of complex life or intelligent beings in the universe has embedded itself in the human imagination and become an obsession.  It has also led, sadly, to a dismissal of the notion of Earth’s uniqueness.  From the time of the first modern astronomical discoveries in the 16th century, most scientists have supposed that our solar system is rather ordinary and that the emergence of life somewhere other than Earth is almost certain, given the vast size and age of the universe.  The discovery of other planets, and the realization that our sun is one of hundreds of billions of stars in the Milky Way – itself one of hundreds of billions of galaxies in a very large and very ancient universe – is indeed humbling and can leave us with an extreme sense of isolation.  This has led many to cast the Earth as an inconsequential planet, lacking any unique purpose or place in the universe’s general order.  This “Principle of Mediocrity,” popularized by the late Carl Sagan, has been adopted with gusto by many scientists today who also espouse, unsurprisingly, a denial of the existence of a Creator or of a higher intelligence involved in the design of the universe.

Yet since those formative years I have come to understand some important things about the Earth’s place in the universe that I could not have appreciated as a child: that the mere presence of other planets, and Earth’s position in the inner solar system, reduces the number of asteroids and comets likely to hit Earth, giving us a level of safety not enjoyed by planets in the outer solar system; that Earth’s transparent atmosphere provides a platform from which to study and explore the universe, an ability unavailable on planets shrouded in gaseous, opaque atmospheres; that its position – both within the Milky Way and relative to the sun, neither so close as to be scorchingly hot nor so distant as to be uncompromisingly cold – puts it at the greatest of advantages for the development of life; that the conditions for the existence of complex life are exceedingly rare, and the probability of all those conditions coalescing at the same time and place is vanishingly small; and that carbon and water, the two most important ingredients necessary for the creation of life, have not been detected on any other planet in the combinations necessary for life – a fact that is extremely perplexing.

Today it is possible to look up at the night sky, possessed of the knowledge of both the immensity of the cosmos and the incomprehensible distances across which it stretches – and feel crushed by our seeming insignificance.

 But isn’t there another way to look at this existential dilemma? 

Could it be that the universe came into existence not as a random accident but for both the Earth’s and humanity’s benefit?  Is there perhaps a purpose and order to the universe that we have actually been programmed to discover?  Jim Lovell, aboard Apollo 8, the first manned mission to orbit the moon, sensed this.  Gazing out the window of his spacecraft and watching the Earth “rise” above the Moon’s horizon, he exclaimed: “The Earth from here is a grand oasis in the big vastness of space.”

The idea of an oasis feeding and watering the universe is a profound understanding of life, one that gives us not only confidence in exploring space but also a sense of purpose that the current proponents of the Principle of Mediocrity can neither fathom nor appreciate. If the universe is truly as dead and barren as the surface of the Moon, have we, in fact, been created in order to seed it with life?

As a boy I could not have imagined that forty years after Neil Armstrong’s famous walk we would be no closer to the discovery of intelligent life in the cosmos than we had been in 1969.  But science itself, coupled with the ingenuity of the human mind, may have provided us with something far richer and more significant than any such discovery could afford: the overpowering acceptance of our uniqueness and purpose.  And it is this realization that has provided me with a deep appreciation of this tiny blue dot in the “big vastness of space” and makes me feel not alone, but glad to be alive.


THE CONTINUING BATTLE OF THE LITTLE BIGHORN

July 14, 2009

I have never been a big fan of Westerns or caught up in the romance of the Wild West.  But a visit to the site of the Battle of the Little Bighorn in eastern Montana, long venerated in American hagiography as Custer’s Last Stand, does, I admit, stir some emotion.  The story of the attack on the villages of the combined Sioux, Arapaho and Cheyenne tribes by the U.S. Seventh Cavalry on June 25, 1876, and the consequent decisive defeat of the U.S. military was a true tragedy for both sides.  It bespeaks the appalling toll that misunderstanding and miscalculation can exact in any struggle over territory. And it gives pause to anyone who believes that right and wrong can be neatly parsed in determining the moral demarcations of that violent conflict.

Strolling to the top of the so-called Last Stand Hill, where what remained of Custer’s battalion reputedly fought a desperate battle for survival against an overwhelming Indian force (although no one really knows, because no white witness survived to tell the full story), one can read the names of the 210 U.S. soldiers, citizens and scouts who died that day, as well as view the 19th-century markers showing where the most famous among them – George Custer, his brothers Tom and Boston, and his more junior officers – fell.  Farther down the hillside is the monument to the Indians who perished during the Indian campaign of 1876.  That monument, erected only six years ago, offers a perspective on both the battle itself and the toll that the Indian Wars of the late 19th century took on the culture and lives of the Great Plains Indians.

Yet where Custer and his impetuous actions of the summer of 1876 truly stand in American consciousness today is almost impossible to determine with any clarity.  On the one hand, he is lauded as a symbol of American derring-do, courage and perseverance; on the other, he is condemned as a reckless adventurer, imperialist puppet and genocidal murderer.  In the 133 years since his death he has divided historians, politicians and military enthusiasts alike – so much so that Custer Battlefield National Monument was compelled to change its name to Little Bighorn Battlefield National Monument in 1991, for fear that it paid too much homage to the fallen lieutenant colonel and thus to the perceived 19th-century campaigns of extermination.

This new twist on the Custer mythology has penetrated battlefield brochures and latter-day re-enactments. Today the Little Bighorn Battlefield has been established as a place where the descendants of both sides can simultaneously claim the mantle of victory and the moral authority of defeat.  The guides at the battle site and the announcers at the re-enactment itself all refer to an ineradicable clash of cultures and civilizations that began with the arrival of Europeans on the American continent and most probably continues today:

“The Battle of the Little Bighorn was but the latest encounter in a centuries-long conflict that began with the arrival of the first Europeans in North America.  It reached its peak in the decade following the Civil War, when settlers resumed their vigorous westward movement. These western emigrants, possessing little or no understanding of the Indian way of life, showed slight regard for the sanctity of the hunting grounds or the terms of former treaties. The Indians’ resistance to those encroachments on their domains only served to intensify hostilities.”

Here, baldly stated, is what multiculturalists, revisionist historians and Indian rights advocates have been arguing for decades: that the Indian narrative of the struggle over territory in the 19th century was just as significant, if not more so, than the white man’s version of events – and that such a perspective must become part of America’s historical memory.  This is argued, of course, in much the same way the black community demands that the memory of slavery underpin America’s relations with African-Americans today.

‘Well, what is wrong with that?’ might be the question most would ask when presented with such a demand.  After all, native Indians are no less American than whites, and their history certainly deserves recognition, particularly since they have been here so much longer.

Well, recognition is one thing.  But when a sub-national consciousness is used to promote separatism, hatred and violence toward the majority culture, that is entirely another.

The militancy of native American Indians has absorbed momentum from the prevailing culture wars of the past three decades and the fragmentation of U.S. national identity.  In 1969, Vine Deloria Jr., a Standing Rock Sioux Indian, published Custer Died for Your Sins: An American Indian Manifesto – a political best seller.  In that book Deloria cast Custer’s death as a necessary blood sacrifice for the crimes of U.S. expansionism. Deloria’s book accelerated the growth of the American Indian Movement (AIM), and for the 1976 centennial the Indian activist Russell Means led a band of 200 Sioux Indians to the battle site carrying an upside-down American flag, demanding and receiving unscheduled time on the speakers’ platform. He then prevented a descendant of Custer, Lieut. Col. George A. Custer III, from being officially recognized or presenting remarks.

Means upped the ante even further for the 112th anniversary in 1988, when he led a group that brought its own historical plaque of welded steel, planted it in the sod on Last Stand Hill and poured concrete around it.  The plaque read:

“In honor of our Indian Patriots who fought and defeated U.S. Calvary [sic].  In order to save our women and children from mass murder.  In doing so, preserving our rights to our Homeland, Treaties and Sovereignty.”

AIM has since forcefully promoted a reparations agenda and has argued that tribal law, rather than American law, should apply on native American reservations and in native communities. That particular demand is eerily reminiscent of the demands of British Muslims for their own sharia courts to adjudicate family disputes.

Means and others like him have vigorously promoted the image of the Indians as innocents who led peaceful lives on the plains before they were victimized and then militarized by the advent of white incursion.  But this is a total fabrication.  The Plains Indians were already militarized, having spent centuries in mortal combat with one another – a fact attested to by the detail that most of Custer’s scouts were Crow Indians, who bore their Sioux neighbors such an abiding hatred that they were willing to provide vital details on the Sioux encampment.  Not mentioned by Means or many other Indian archivists is the brutal way in which white settlers were massacred at every stage of American westward expansion.  In fact, in the recorded depredations of Indians against white families, there are almost no instances in which the women of the settlements, when attacked, were not raped and then mutilated by Indian war parties.

In addition, we should remember the barbarism of the Indian warriors and their cohorts at the Little Bighorn itself.  How many people with even a casual interest in the West know that in the aftermath of the Battle of the Little Bighorn the bodies of Custer and his men were stripped naked by their Indian victors; that many were decapitated or eviscerated, their hands, feet and private parts severed; and that these depredations were carried out not by the warriors who did the killing but mainly by the women and children of the encampment, after the warriors had left the field?  U.S. Cavalry units arriving on the scene on June 27, two days later, could barely make out the faces of many of their comrades, so flattened had they been by tomahawks and clubs.

The charge of genocide against 19th-century U.S. administrations has often been leveled, and indeed an entire historiography has arisen in our universities that relies on the “facts” of a concerted U.S. campaign to exterminate the American Indian.  And it is on this platform of guilt that American multiculturalists raise the banner of national rights for Indians.  Describing it as a “holocaust,” in line with other “holocausts” of the Aztec peoples of Mexico and the Inca of Peru, is an attempt to give Indians the moral authority to make demands that the majority white population should not, in good conscience, refuse.

The effort to recast the “Manifest Destiny” policies of mid-19th-century American governments as genocidal campaigns has been a cynical exercise in capitalizing on the collapse of identity in this country, in which those of Caucasian extraction are made to feel the burden of guilt for any number of depredations against minorities – grievances that can only be redressed by an acknowledgement that “white” culture is somehow morally and ethically bankrupt and inherently inferior to other cultures.

Yet let’s understand something.  The advance of Western civilization across the American continent and the attempts of successive American administrations to consolidate the hinterland were parts of a historical process that could not, ultimately, be resisted.  In that process, U.S. governments, military leaders and citizens indisputably committed acts of betrayal, wanton murder and destruction.  But these injustices, as egregious as some of them were, did not amount to exterminationist policies; nor do they warrant an unending apology from the perpetrators’ descendants, to the point where whites are forced to acknowledge a superior Indian claim to moral authority.

These are the vulnerabilities that other “victimized” peoples, from the Palestinians to the many Muslims in European countries, have chosen to exploit today.  Multiculturalists would have you believe that such acknowledgements will heal old wounds.  But if that is what is meant by “burying the hatchet,” we would be better off recognizing that militant Indians are glad Custer died and, given the opportunity, would gleefully kill him all over again.


CALIFORNIA ROOTS OF THE EUGENICS MOVEMENT

July 14, 2009

Every now and then you stumble across one of those odd historical facts so outrageous and so far beyond belief that it forces you to hold your breath in sheer incomprehension.

Such a moment came for me in April this year, when I was interviewing a group of scientists for a Western Word Radio program focused on the debate over intelligent design.  I discovered then, courtesy of Dr. John West, that over a roughly 50-year period beginning in 1905, more than 60,000 people deemed unfit for reproduction had been forcibly sterilized in the United States.

Although Indiana was the first U.S. state to enact sterilization legislation, the hub of activity soon moved to the West Coast, where California’s first sterilization law was enacted in 1909 – exactly one hundred years ago this week. Like many Midwestern transplants, the practice found fewer restrictions in the Golden State, and by 1921 more eugenic sterilizations had been performed in California than in the rest of the United States combined.  Unlike in other states, the practice faced no legal challenge or hindrance in California until the Supreme Court validated forced sterilization nationwide in the landmark case of Buck v. Bell.

The movement behind the forced sterilization laws was known as eugenics.  Eugenics stressed the application of science to human heredity and breeding in order to improve the human species both mentally and physically; some Progressives referred to it as “the science and the art of being well born.” Human sterilization was carried out for many reasons. It might be imposed as punishment, perhaps in the form of castration for repeat sex offenders. It might be used for social reasons, to prevent individuals from having children they were unable to care for physically, emotionally or financially.  But when the state sterilizes an individual because he is judged genetically defective and therefore likely to pass his defects on to offspring, that is eugenic sterilization. And this was the type of sterilization that many California policymakers sought to carry out.

The United States was the first country to undertake compulsory sterilization programs for eugenic purposes in a concerted way, but the movement thereafter spread like wildfire through the rest of the world.  In Japan, in the first part of the Showa era, governments promoted increasing the number of healthy Japanese while simultaneously decreasing the number of people suffering from mental retardation, disability, genetic disease and other conditions that led to their being viewed as “inferior” contributors to the Japanese gene pool. Japan’s Leprosy Prevention laws of 1907, 1931 and 1953 permitted the segregation of patients in sanitariums, where forced abortions and sterilization were common, and authorized the punishment of patients for “disturbing the peace.” Under the colonial Korean Leprosy Prevention Ordinance, Korean patients were also subjected to hard labor.

Eugenics programs, including forced sterilization, existed in most Northern European countries, as well as in other largely Protestant countries. Some, such as Canada’s and Sweden’s, lasted well into the 1970s. Other countries with notably active sterilization programs included Australia, Norway, Finland, Estonia and Switzerland.

Organizations in support of eugenics were established around the world. One year after Buck v. Bell, for instance, the Human Betterment Foundation came into existence in Pasadena, California, with the aim “of fostering and aiding constructive and educational forces for the protection and betterment of the human family in body, mind, character, and citizenship.”  It served primarily to compile and distribute information about compulsory sterilization legislation in the United States for the purposes of eugenics.

An appreciation of the widespread support forced sterilization enjoyed in California can be gleaned from the group’s inaugural Board of Trustees.  Its members included Henry M. Robinson (a Los Angeles banker), George Dock (a Pasadena physician), David Starr Jordan (chancellor of Stanford University), Charles Goethe (a Sacramento philanthropist), Justin Miller (dean of the college of law at the University of Southern California), Otis Castle (a Los Angeles attorney), Joe G. Crick (a Pasadena horticulturist), and the biologist and eugenicist Paul Popenoe.  Later members included Lewis Terman (a Stanford psychologist best known for creating the Stanford-Binet IQ test), William B. Munro (a Harvard professor of political science), and the UC Berkeley professors Herbert M. Evans (anatomy) and Samuel J. Holmes (zoology).

In other words, some of the top members of the political, business and scientific elites in the United States were among eugenics’ most enthusiastic benefactors and moral supporters.

In England, at about the same time, a national eugenics movement was taking shape.  In 1908 the Eugenics Education Society was founded with the hearty endorsement of some of the leading intellectuals of the day, including H.G. Wells, George Bernard Shaw and Beatrice Webb, among other leading politicians, scientists and society patrons.

The most infamous sterilization program of the 20th century took place, of course, under the Third Reich. One of Adolf Hitler’s first acts after achieving control over the German state was to pass the Law for the Prevention of Hereditarily Diseased Offspring (Gesetz zur Verhütung erbkranken Nachwuchses) in July 1933.  The bill was signed into law by Hitler himself, and over 200 eugenic courts were created specifically as a result.  Under the German law, all doctors in the Reich were required to report patients who were mentally retarded, mentally ill (including those with schizophrenia and manic depression), epileptic, blind, deaf or physically deformed, and a steep monetary penalty was imposed for any patient who was not properly reported.

The individual’s case was then presented to a court of Nazi officials and public health officers who would review a patient’s medical records, take testimony from friends and colleagues, and eventually decide whether or not to order a sterilization operation performed on the individual – using force if necessary.  By the end of World War II, over 400,000 individuals were sterilized under the German law, most within its first four years of enactment.

When the issue of compulsory sterilization was raised at the Nuremberg trials after the Second World War, many Nazi leaders defended their actions by pointing out that it was from the United States itself that they had taken their inspiration.

They were right on target. 

The question, then, is why?  Why did forced sterilization gain such traction in the United States?  What could have compelled Oliver Wendell Holmes Jr., the country’s leading jurist and otherwise a redoubtable liberal champion of free speech and human rights, to declare in the majority opinion in Buck v. Bell: “It is better for all the world, if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind… Three generations of imbeciles are enough”?

It has often been argued that you cannot judge one generation’s moral viewpoint from the vantage of the future.  But in this case that argument appears tendentious.  There were numerous legal challenges to the eugenics laws of state governments, brought by both individuals and organizations, and a fairly vigorous editorial campaign was waged against the practice.  In addition, the argument that these were not regarded as moral issues at all by early 20th-century Americans, but as practical, utilitarian measures designed to save society the expense of caring for those who could not care for themselves, also falls flat.  Numerous associations and welfare organizations equipped to assist the mentally and physically disabled had begun to sprout by the beginning of the century, making the United States the most prodigious locus of charitable volunteerism in the world.

The answer to this imponderable question is more likely to be found in the “progressive” thinking that had gripped the intellectual, political and social elites of the West since the mid-19th century.  The advent of Darwinist thought and the coining of the expression “survival of the fittest” (a phrase credited to the English philosopher Herbert Spencer rather than to Darwin himself, though Darwin adopted it in later editions of his work) led many to invest in the idea of racial purity in order to protect the future of their progeny in an increasingly competitive world.  In the 1880s and ’90s, as England, France, Russia, Germany and the Johnny-come-lately United States tussled with one another in carving spheres of influence into the world map, national greatness seemed to hinge on the ability of a civilization to produce a race of men worthy of empire and capable of holding on to it.

In the mad rush to secure their places on the totem pole of national grandeur, it was commonly accepted throughout all of these societies that only the fit would survive.  This meant that the “unfit” – the blind, the deaf, the mute, the spastic, the leprous, the incurably diseased and even the chronically poor – had to be quietly and efficiently neutered so that they would not contaminate the remainder of the national stock.

Leading progressive intellectuals of the early 20th century had, in other words, interpreted Darwinian theory as a writ to “interfere” with human natural selection.  The crass inhumanity of it all was beside the point, since in the eugenicists’ eyes such beings were only half or quarter human anyway.

Looking back at this dark history, we must all feel a twinge of deep embarrassment as we realize that our vaunted civilization is not quite as lily-white as we once might have considered it.  But that kind of regret is wasted if we learn nothing from this stain on our national reputation.  Totalitarianism in Europe did not begin with blackshirts breaking bones on the streets of Rome, but with ideas that would brook no opposition.  Today there are many other commonly accepted ideas – from anthropogenic global warming to the social utility of gay marriage to scientific certainties about the origins of life and the universe – whose proponents turn viciously against those who question or deny them. The casualties in these culture wars may not be the incomparable unfortunates of the 20th century who suffered physical deformation.  Nonetheless they are innocents who suffer a sterilization of another sort – the stigma of isolation and the pain of exclusion from the national debate.

The inescapable truth is that totalitarian thinking – ideologies that can turn against peaceable citizens – can sprout in any country, even one with as proud a record of protecting human liberty as the United States.

Social Darwinism, the ideology that gave life to the eugenics movement, is still very much with us today.  It often reappears in the abortion debates, in the writings of such elite and highly respected philosophers as Princeton’s Peter Singer and among animal rights advocates who elevate animal life above that of humans.  In this centennial month, we should remember its repercussions and vow that never again will it be allowed to override men’s better moral instincts in the name of a nebulous and ultimately soul-destroying sense of progress.


OBAMA’S TOUGH LOVE

July 14, 2009

There was something for everyone in Barack Obama’s speech in Cairo last month.  It hit all the acceptable notes on compassion, human dignity and peaceful coexistence, and delivered its vision of a world without conflict, war or inequality with generous helpings of hyperbole. Perhaps no one should fault the President for the desire to rebuild damaged relations with the Muslim world or for calling for a world free of nuclear weapons. But he can and should be faulted for presenting facts that are patently untrue and for an implicit willingness to accommodate evil in his search for an unrealizable vision of world amity.

Obama’s speech was, for all intents and purposes, a “hovering speech”: in the great Wilsonian tradition, it hovered above conflicts, border disputes and nuclear proliferation issues, sagaciously dispensing wisdom on how they might be resolved.  Taking no particular side (not really even his own), he could then declare that he had come to Cairo to build a new relationship between the United States and the Muslim world, one based on mutual interest and mutual respect: “America and Islam are not exclusive and need not be in competition. Instead, they overlap, and share common principles — principles of justice and progress; tolerance and the dignity of all human beings.”

Well, that is all well and good if you are talking to the Rotary Club of Cairo, Georgia.  But the crowd in Cairo, Egypt, to which he addressed his remarks, may not be quite as open to “sharing what is in our heart” or to expressing solidarity with Obama’s universalist vision of a humanity with similar goals and aspirations.

After all, what does the United States really have in common with a culture and civilization that in the main denies human and political rights to women; that has little respect for democratic government; where, in places such as Saudi Arabia, Iran and even Kuwait, an accusation of thievery can result in the amputation of hands; where a woman accused of adultery has a 70% chance, regardless of her innocence, of being stoned to death; and where human rights activists such as Egypt’s own Ayman Nour and other dissidents have spent years rotting in jail for the expression of their political beliefs?

In the Cairo speech, nearly every historical allusion was nonfactual or inexact: the fraudulent claims that Muslims were responsible for European, Chinese, and Hindu discoveries; the notion that Cordoba – by then a Christian city – was an example of Islamic tolerance during the Inquisition; that the Renaissance and Enlightenment were fueled by Arab learning; that abolition and civil rights in the United States were accomplished without violence.

Then there were the other statements made by Obama which simply deny reality:

What is one to make of Obama’s statement that “regardless of race, religion, or station in life, all of us share common aspirations — to live in peace and security; to get an education and to work with dignity; to love our families, our communities, and our God”?

It seems that Obama was coloring in his picture of the Islamic world from his own palette of multicultural experience. The claim that the two civilizations share “common aspirations” was one of his more egregious mistakes, for the Muslim family unit differs fundamentally from the Western model.  The kind of patriarchal structure that prevails in the Muslim world, with its emphasis on honor, its absorption with shame and its apotheosis of the family head, precludes respect for individual rights and needs within the family circle.

And why, exactly, is it Barack Obama’s responsibility, as President of the United States, “to fight against negative stereotypes of Islam wherever they appear”?  Once again the comparison to Wilson is apt: the self-appointed guardian of righteousness vowing to protect the rights and dignity of the disenfranchised. Wilson’s utopianism came to grief on the shoals of political and cultural realities. Obama should prepare himself for a similar outcome.

And what about this:  “And America will not turn our backs on the legitimate Palestinian aspiration for dignity, opportunity, and a state of their own.”

Oh, if only Palestinian leaders had ever felt the same way.  The struggle for Palestinian self-determination has left behind it a blood-soaked trail of demagoguery, an absence of statesmanship, graft, venality and terrorism.  But more than that, there has never emerged in the Palestinian intellectual narrative much support for the idea of a nation-state with its own cultural and political identity.  That idea is largely a Western one, foisted on the Palestinians in the name of national dignity.

Today, Palestinian nationalism, such as it is, revolves largely around the elimination of another state rather than the creation of a new one. Nothing in Palestinian circles approximates the kind of intellectual energy that the founding fathers of the United States or of Israel expended in detailing the institutions necessary to sustain the superstructure of a new democratic state capable of living in peace with its neighbors.  Nor is there any debate or discussion of these issues.

When talking about dignity, Mr. Obama might also have referred to the expectation that any people wishing to join the family of nations display dignity in its struggle for independence. Perhaps Mr. Obama and his advisors should ask themselves what the Palestinians have done to warrant statehood. Have they demonstrated a willingness to forswear violence, to alter their founding documents to reflect a desire for peaceful coexistence, or to build institutions that would safeguard individual rights?  What have they done other than strew the streets of their neighbor with the wreckage of exploded city buses and the burning flesh of their occupants?

Obama, like many American presidents, likes to deal with complicated issues through orotund expressions of moral symmetry.  The penchant for finding moral, political and cultural equivalence between competing national narratives, however, leads to gross distortions of historical fact.  In Obama’s case it appears in the conflation of the Jewish people’s suffering through centuries of persecution with the so-called denial of rights to the Palestinians; in the moral comparison of the construction of Israeli West Bank settlements with the Palestinians’ incessant campaigns of suicide bombing; and in the parallels drawn between the extraordinary growth of highly organized civil societies such as Japan and South Korea and the chaotic, often dysfunctional economic structures of the Arab world.

When we think of other great presidential speeches delivered abroad, we remember that they were tethered to an idea of American exceptionalism – the conviction that America stood for freedom and against tyranny. John F. Kennedy’s and Ronald Reagan’s Berlin speeches were the words of unintimidated American leaders willing to confront brutal regimes that exhibited no respect for human rights or the principles of Western freedom. They set the high-water mark of American leadership abroad and of its resolve to confront evil.  There was no evidence in Obama’s speech that anyone in the Arab or Muslim world has anything to fear from failing to abide by civilized norms.  Rather, he imparted the strong sense that violations of civilized conduct would draw no punishment beyond mild expressions of tough love.

The United States may well need the help and support of the Muslim world in stymieing Iran’s nuclear ambitions.  But that need not mean that the United States should accept or excuse the abuses of civilized conduct, driven by Islamic teachings, that run rampant in the Muslim world.  In the struggle between these two civilizations, in which one will and must ultimately predominate, Obama has undoubtedly given our adversaries a helping hand.