A University’s Decline · 09 June 2009

By David Horowitz - FrontpageMagazine.com
Filed under: State of the Campus

The following is the text of a talk given by Columbia University alumnus David Horowitz at the fifty-year reunion of the Columbia Class of ’59, on a panel titled “Fifty Years Since We Graduated.” – The Editors

I’m glad to be here with you at this our fiftieth class reunion. In fact (and sorry to remind you of this) we’ve all reached that point in life where we can appreciate George Burns’ crack that at his age he was glad to be anywhere.

I am especially pleased to be on a platform with my old teacher, Professor Bernard Wishy, whom I have not seen in fifty years. I have often reflected on the impact that a few select teachers, encountered early, can have over the course of a life. Even though Professor Wishy was an instructor of mine for only a single term in my sophomore year in 1956, he is one of these influential mentors for me.

I vividly recall an exchange in his class with a student who strenuously objected to Freud’s Moses and Monotheism for reasons I have long since forgotten. What I do recall is Professor Wishy’s command of Freud’s sources and of the texts of his critics in answering the student, even though the subject was not his own academic field of expertise. In that exchange Professor Wishy conveyed a powerful message to us — that knowledge was a serious business and that there were no simple answers to the questions that truly mattered.

But the most profound impression he left was his classroom demeanor, which exemplified Columbia’s official mission, defined in those days as “the disinterested pursuit of knowledge.”

Professor Wishy was a scholar, not a proselytizer. We never knew when he might be playing devil’s advocate, taking positions he didn’t himself hold in order to shake us from our reflexive assumptions. I don’t recall him ever expressing his personal beliefs in the classroom, whether political or religious or otherwise. I don’t know how he voted in the 1956 election, or whether he was of the opinion that religion is an illusion, as Freud maintained.

Instead, the point of his teaching was to illuminate the process by which one confronts an intellectual argument, understanding that in order to do so one must be acquainted not only with the facts but with the arguments that have preceded one’s own. He was there, in other words, to teach us how to think and not to tell us what to think — therefore to respect the divergent opinions of others. I am afraid this is a vanishing ethos in our culture and a dying pedagogical art in our university classrooms today.

In short, what Professor Wishy taught by example was respect for the difficulty we experience as ordinary mortals in arriving at the truth concerning life’s most vexing questions. This was not a lesson I absorbed easily. I was too filled with my own youthful certitudes for that. Nonetheless, I kept the memory of Professor Wishy’s classroom with me for the next twenty years until a personal crisis of belief finally allowed me to appreciate what he had taught.

Another influential Columbia teacher was Moses Hadas, a professor of classics who is no longer with us. In one memorable class, Professor Hadas drew our attention to the Roman general Scipio Africanus, who wept when his soldiers burned the great city of Carthage because he saw in the flames the future of his beloved Rome. The ancients did not have our view of history as progress. For them it was a series of cycles, the story of civilizations that rose and fell, came into being and were gone.

In retrospect, it seems odd to me now that Scipio’s tears should have made such an impression on a young radical. I arrived at Columbia believing that a progressive future was imminent and that it would transform everything we knew.

Consequently, I viewed my college education not as a step on a personal career path but as a preparation for my life mission, which was to participate in a revolution that would change the world. Grandiose as this may sound, it was an audacity of hope shared in one form or another by all progressives, and it remains so today.

By contrast, the ancients believed that the world cannot be fundamentally changed, at least not by human beings. At the end of the first chapters of Genesis, an angel with a flaming sword is said to stand at the gates of Eden to prevent us from re-entering because the first man and woman had already demonstrated that it is not within our nature to achieve an earthly bliss.

As a result of the flaws in our nature, although we may secure justice in this case or that, injustice we will have with us always. This is why the preacher, Ecclesiastes, said, “There is nothing new under the sun.” The French have a similar phrase: Plus ça change, plus c’est la même chose: the more things change, the more they are the same.

These statements recognize the fact that the world that so obviously needs repair is a world that we and human beings like us have made. Consequently, our efforts to make it a different world will necessarily fail. This is the religious view of the circumstance we find ourselves in, and it is also the conservative view, and it is also mine.

Have there been changes since we left Columbia fifty years ago? There have. But from a conservative vantage, none of them have altered the fundamental pattern of our lives — the self-centered and selfish desires, the envy and resentment of others, the resort to dishonesty when it suits our ends, which are the real causes of the social ills we wish to redress.

Some of the changes of these fifty years have been good; others have been bad; some of the good changes have come with consequences that are bad; and regarding some of the changes, there is less to them than meets the eye. Plus ça change.

I came to Columbia more than fifty years ago at the tail end of the McCarthy era as a leftist whose Communist parents had lost their teaching jobs because of their political views. Today, I have returned to Columbia as a conservative.

But in fact the views I hold on the issues that are thought to define these labels, such as race and freedom of expression, and my concerns for the poor and those left behind, are no different today from what they were then. The circumstances around me have changed, and so has my understanding of how things work, but not my fundamental values. Fifty years ago, my radical views caused me to feel like an outsider at Columbia. Returning as a conservative, I find myself an outsider still – and again it is because of my political views.

In the half century since I graduated, this is the first time that I have been invited to an official Columbia function, and even so the occasion is an alumni reunion, not a formal academic event. This exclusion has occurred despite the fact that I am the well-known author of many books, several concerned with university reform; despite the fact that my son, who is also a Columbia alumnus, has donated a generous scholarship fund to the college for minority students; and despite the fact that my granddaughter is currently a Columbia student, so that we are, in a manner of speaking, a Columbia family. Evidently, I have been more loyal to Columbia than Columbia has been to me. Even the invitation to this alumni function had to be sustained against strenuous resistance from some of my classmates, who are now professors at other schools and are apparently of the opinion that my views should be suppressed.

And this attitude of exclusion is a prevailing one among current Columbia faculty. So far as I can ascertain, there is not a single prominent conservative intellectual on Columbia’s liberal arts faculty today. The dozen or so books I have written, like those of other well-known conservatives, though widely praised and highly regarded in the world outside Columbia, are more effectively banned in its classrooms than were the books of Marxists fifty years ago, during the height of the McCarthy era.

From a conservative vantage, the changes that have taken place in the last fifty years can be regarded as the result of scientific and technological advances, and do not represent a fundamental reordering of the relations between human beings themselves.

This is the case, for example, with the changes that have taken place in the lives of women, who have moved into a variety of public roles in unprecedented numbers. These developments are quite different from a change in the fundamental relationships between the genders, in male respect for women, or in the nature of women themselves.

To the politically incorrect like myself, these new roles and the respect they earn are the result of technological developments that have relieved women of arduous tasks on their end of the division of labor, and of scientific innovations that allow them to control their reproductive cycles and to be protected from routine mortality in childbirth.

This conclusion is reinforced by my experience as a student of English literature at Columbia fifty years ago. One of the leading and most honored Shakespearean scholars in the nation at the time was Columbia professor Caroline Spurgeon. Benighted as we may have been back then, I do not remember anyone who thought it odd that Professor Spurgeon was a woman or thought less of her work because of it.

Similarly, when I took a course in the 19th Century English novel, five of the twelve authors we read were women, and this was well before the publication of The Feminine Mystique and the beginnings of the so-called “women’s liberation movement,” whose subtext was that men, a category that would have included my teachers, were their oppressors.

It is true that in recent years we have witnessed the appointments of the first three women secretaries of state, and the first two women Supreme Court justices, with a third now on the way. But these are easily understood as a consequence of technological improvements that afford women new freedom to pursue such careers, rather than the overthrow of an oppressive ruling “patriarchy.”

Thus, the Elizabethans I studied in my literature classes were called “Elizabethans” in deference to one of the most powerful monarchs in English history, a woman who ruled her era long before the women’s movement.

In sum, as we embark on the 21st Century, women and men are pretty much the familiar genders we encountered in our first days as undergraduates, reading in our Humanities sections Homer’s 3,000-year-old epic about Helen of Troy, who had the power even then to cause the launching of a thousand ships and the burning of the “topless towers of Ilium.”

Of course if you were to enroll today in Columbia’s Department of Women’s Studies you would be taught that we still live in an oppressive patriarchy and that gender differences are “socially constructed” and can be re-constructed, and then eliminated as we reach the highest stage of women’s liberation. But this is ideology not reality.

The fact that this ideology is a required creed for students of Women’s Studies reflects not an advance in consciousness but the retrogressive return of American liberal arts colleges to their 19th Century roles as doctrinal institutions, the difference being that this time the doctrines are secular and political rather than religious.

Of course a large and important sector of our modern research universities has not regressed. The hard sciences – the engines of our technological futures – continue to progress. If one were to walk over to the departments of biology and neuroscience, one would learn that gender differences are not “socially constructed” but hard-wired as part of our genetic makeup. We can already see the next academic reformation coming as the new progressive religions increasingly clash with empirical discoveries in the biological sciences. Plus ça change.

While some changes add up to less than meets the eye, others have led to consequences that are nothing short of catastrophic. The last fifty years have witnessed the growth of a new environmental consciousness, for example, whose modest goal is to “save the planet.” Talk about hubris! Shortly after we graduated from Columbia, Rachel Carson published a book that is regarded as a founding document of the environmental crusade. Her tract warned that the continued use of DDT pesticides would kill the world’s bird population and create a “silent spring.” A little over a decade later, because of the influence of her book, DDT pesticides were globally banned.

As it happens, at the time Carson wrote, the world had been recently freed from the scourge of malaria, which had previously accounted for three million deaths a year. This was thanks to the Rockefeller Foundation and its funding of a malaria eradication program, which relied on the pesticide DDT. Soon after the pesticide was banned, malaria reappeared. The resulting epidemics have produced a toll of preventable deaths that already exceeds any other in the grim annals of man-made mortalities.

Since the progressive doctrine of Silent Spring was implemented, three million people have died of malaria every year for more than thirty years, adding up to a total now of nearly 100 million. Ninety-five percent of the victims have been black African children under the age of five. As a footnote to this tragedy, Carson’s claim that DDT was harmful to birds has since been discredited.

Of all the battles that Americans have fought to advance agendas that are generally regarded as “progressive,” the one that appears to have had the most uncontroversial success is the fight against racial discrimination. There was a time not long before we came to Columbia when there were overt and unapologetic bigots in the U.S. Congress, such as Senator Theodore Bilbo, a member of the Ku Klux Klan and an avowed racist. Today no anti-black bigot could stand up in the public square and proclaim his bigotry and survive with a public career. And now we have our first black president.

At Columbia last year a noose was hung anonymously on an African American professor’s office door. The entire university – administrators, faculty and students – recoiled in horror and came to the defense of the target. We have come so far that no one could be surprised at that.

But it is only half the story. At the same time that anti-black prejudice has retreated from the public square, other forms of prejudice using other groups as targets have become acceptable, even normal, particularly in the most “progressive” circles. At Duke University not long ago, a drug-addicted prostitute who was black accused three white students of a crime they did not commit.

There was not a shred of evidence to sustain the charge, and much to contradict it. Yet the prosecutor, seeking the support of the black vote in Durham, was not deterred. So reckless and racially motivated was his prosecution of the innocent students that he was subsequently disbarred for his actions, which included suppressing evidence that proved conclusively that they had not committed the crime.

Yet because they were white and the alleged victim was black, the public lynching of their reputations continued for a year. The president of Duke, an Ivy League scholar, expelled them in advance of any trial, terminated their team’s athletic season, and fired their coach.

Eighty-eight professors condemned them as racists, associating them with slave owners and white rapists of the past. While the press protected the name of their accuser, it paraded their images before a mass audience and made them national pariahs.

This is a particularly ugly case, but the new racism reflected in its details has become institutionalized. At colleges and professional schools across the country, privileges are routinely granted to individuals officially designated as members of so-called “under-represented minorities” and withheld from others who belong to so-called “over-represented minorities.”

The result is that if you are an impoverished and discriminated-against Asian student, universities will deny you financial aid that is available to wealthy African Americans, and you will have to score much higher on your graduate admissions tests just to be able to apply to medical and law schools.

Forty-five years after the civil rights revolution, we have taken a giant step backwards in our efforts to create a society where the rules are color blind and individuals are rewarded on their merit.

Taking a personal view of these developments, I note that when we entered Columbia in 1955 we understood that there was a quota system for Jewish applicants. It was masked as a geographical diversity program, just as deceptive as the one I’ve just described, and rationalized as an attempt to create a student body drawn from all parts of the country. Its architects had figured out that the pool of Jews in states such as Arkansas and Nebraska was likely to be small.

Still, the overall quota was rumored to be 48% of the entering class, which seemed generous to us then. The Nazis’ “Final Solution” had recently (but only recently) given anti-Semitism a bad name, and it seemed as though things were changing for the better for the Jews. I was privileged, for example, to have taken a class with Lionel Trilling, the first Jew ever to be hired by an Ivy League English Department. From this perspective, a 48% quota persuaded us we were making real progress.

Today, even though there are many Jews on the Columbia faculty and Jews even sit on the board of trustees, there are also overt and unapologetic anti-Semites lecturing in Columbia classrooms, which would have been unheard of in our day. There are now tenured bigots on the Columbia faculty whose classes are an assault on the only existing Jewish state – a tiny nation under continuous attack from an Arab world determined to extinguish it from the day of its creation more than 60 years ago.

More than six decades after Hitler’s demise, an Islamic death cult in the Arab world has made very clear – and in so many words – that it is determined to finish the job he started. A state leader of this cult, whose government is about to become a nuclear power and who has declared his intention to wipe both Israel and America from the face of the earth, was not too long ago invited by Columbia’s president to speak to its students.

It is true that President Bollinger was rude to the dictator when he came, and criticized him as a tyrant – an act of minimum decency (which nonetheless elicited protest from Columbia’s radical faculty). But why was a genocidal maniac whose declared goal is to kill the Jews so honored in the first place?

When we arrived at Columbia fifty-four years ago, America was engaged in a world war with another totalitarian ideology seeking to put an end to the West. Today we are faced with yet another that seeks our extinction. Plus ça change, plus c’est la même chose.

What is this chose anyway, this thing that doesn’t change? It is the human desire to fill the emptiness that is our fate, which is unchanging and unchangeable: that we are born alone and we die alone and we are forgotten. Over this emptiness human beings drape their mythic causes and impossible dreams, their hopes for an earthly redemption – for a change that will fill the emptiness by creating a world that is holy or just. It is this hope that allows us to forget who we are. It is this vision that inspired the ideologues of communism; and it is this vision that drives the Islamic radicals, who believe they are making the world safe for Allah by purging it of infidels, and the unfaithful, and especially Jews.

In these visions we Americans are seen as the party of Satan, as the unbelievers who stand in their way with our pragmatism and tolerance, our devotion to enterprises and pleasures that are bourgeois and mundane; and our hope that is reserved for individual lives and not for grandiose social collectives and schemes.

————————————————————————————————————————
David Horowitz is the founder of The David Horowitz Freedom Center and author of the new book, One Party Classroom.