Forget About It!

I have blogged in the past about our country’s anti-intellectualism, which is glaringly obvious and has been commented upon by numerous others. I refer to our increasing determination to deny the higher purpose of the human mind, its capacity to achieve order, inclusiveness, and coherence. Our country was founded by practical people who were busy building lives in a new country. Following those early years we seem to have attracted a great many people, with notable exceptions, who were convinced that such things as education were esoteric and not really worth the time or attention they received in Europe, for example. Since then we have seen increasingly pragmatic people who have narrowed their focus to the here and now and such things as the making of profits. Today, as I have noted on numerous occasions, we have reduced everything to the business model, including religion and education. The human mind now simply calculates profit and loss — or checks out social media.

There were exceptions, as noted, and one of those exceptions was Thomas Jefferson who in his Notes on the State of Virginia proposed a system of public education for all (boys) that would be capped off by several years at his university where the very best and brightest would be given the best possible classical education then available.

Interestingly, even in the three primary grades of his proposed public education, Jefferson did not stress such things as reading, writing, and figuring. He thought those things were a given — all kids learned them at home. In the very early years he advocated more substantive subject matter, such as history. The memories of young children were to be

“. . . stored with the most useful facts of Greek, Roman, European, and American history. . . . History by apprising them of the past will enable them to judge the future; it will avail them of the experiences of other times and actions; it will qualify them as judges of the actions and designs of men.”

Jefferson was clearly in the minority, since history has never been the strong suit of American schools, and by the time of the intellectual rebellion of the 1960s history was rejected by student radicals as “irrelevant.” It has now been removed from the basic core requirements of the majority of American colleges and universities, and many high schools as well. Henry Ford thought it “bunk,” a sentiment taken up by Huxley in his Brave New World, in which his citizens were nothing more than ignorant pleasure-seekers. Young American men captured in Korea during that “police action” were easily programmed to take anti-American half-truths as the whole truth because they were ignorant of their own history. Moreover, many of those who teach, even today, insist that the teaching of such things as “facts” is a waste of time when, indeed, facts are the building blocks of thought and, like it or not, they must be learned if thinking is to take place. Without those blocks thinking and speaking are merely gobbledygook — as we can tell by reading or listening to our Fearless Leader. And history is the subject best able to prepare the young to be “judges of the actions and designs of men.”

Santayana famously said that those who ignore history are doomed to repeat its mistakes, and we have seen how true that is. But in Jefferson’s program outlined above there are other elements that also deserve to be considered. For one thing, he is advocating what might be called a “natural aristocracy” in which the best and brightest rise, like cream, to the top. Borrowing from Plato, he thought the preservation of our Republic depended on this. Education was the key. The Republic, if it was to be successful, must attract the best and brightest to the halls of power to make the important decisions regarding the correct path the country should follow. We have no idea how that might have worked because we have never really committed ourselves to the education of all citizens as Jefferson would have us do. Job training, yes. Education, no. And our anti-intellectual sentiments lead a great many people to regard a liberal education, for example, as “elitist,” a citadel of social privilege, if you will. In fact, a liberal education is one that would provide the very best possible foundation for anyone with a mind to make important decisions and be aware of the forces that operate around them — forces that threaten to imprison them in chains of bias and ignorance and overwhelm them with such things as “alternative facts.”

We pay a huge price for our ignorance, not only of the past which we blindly ignore, but also of such things as science and mathematics which enable us to better understand the world around us and make sense of things. Jefferson’s was a pipe dream, many would say, though he rested his hopes for the future of his beloved Republic on that base. And my dream of a liberal education for all — which owes its origin to such thinkers as Jefferson and Plato, among others — is also a pipe dream. I have kicked this poor, dead horse so many times my foot is numb (and the damned horse simply will not budge). But we might do well to recall that one of the founders of this nation who had high hopes for a free country of free minds once outlined a program for maintaining freedom in the years to come. And in ignoring his admonition to educate (not train) all citizens we may well have made ourselves a bed of thorns upon which we now must sleep. If we can.

 

 


Decline of the West

 

This is a slightly modified and updated version of a previous post.

Oswald Spengler wrote a classic study of what he regarded as the rise and fall of various civilizations throughout the history of mankind. The key for Spengler was that these civilizations are natural organisms and, like any other natural entity, they are born, grow, decay, and eventually die. The British historian Arnold Toynbee wrote his Study of History after Spengler, and while he agreed with Spengler on many points, he regarded civilizations as artificial, not natural. There is no reason to expect that all civilizations will necessarily die out. But in his study, he noted that sixteen of the twenty-one fully developed civilizations he identified have, in fact, died out and four of the remaining five were in their death throes. The only relatively “healthy” one remaining was Western civilization.

But despite its relatively healthy state, Western civilization is in the latter portion of its cycle — a series of stages that every civilization goes through — and while its roots grew strong in the rich soil provided by the fall of the Western Roman Empire, Toynbee could see the beginnings of a trend toward dissolution beginning in the Reformation with the failure of Christianity to withstand a variety of attacks from without and within. The most vital society, as Toynbee saw it, was the new kid on the block, India — because of its

“vast literature, magnificent opulence, majestic sciences, soul touching music, awe-inspiring gods. It is already becoming clear that a chapter which has a western beginning will have to have an Indian ending if it is not to end in the self-destruction of the human race. At this supremely dangerous moment in history the only way of salvation for mankind is the Indian way.”

A healthy spirituality is essential to the well-being of any human civilization.

In general, Toynbee presented the history of each civilization in terms of challenge-and-response. Civilizations arose in response to some set of challenges of extreme difficulty, when “creative minorities” devised solutions that reoriented their entire society. Challenges and responses were physical, as when the Sumerians exploited the intractable swamps of southern Iraq by organizing the Neolithic inhabitants into a society capable of carrying out large-scale irrigation projects; or social, as when the Catholic Church resolved the chaos of post-Roman Europe by enrolling the new Germanic kingdoms in a single religious community. When a civilization responds to challenges, it grows. Civilizations declined when their leaders stopped responding creatively, and the civilizations then sank owing to loss of control over the environment, nationalism, militarism, and the tyranny of a despotic minority. Again, Toynbee believed that societies do not die from natural causes, but nearly always from self-inflicted wounds. And that death necessarily involves the death of the soul — the vital spirit that kept the civilization alive throughout the ages, though this sounds much like Spengler’s “organic” view of civilizations.

Whether or not we agree that India will dance on the charred remains of Western civilization (or whether we agree with Toynbee at all) we can certainly agree that the cycles he insisted all civilizations repeat seem to be very much in evidence today — even if we simply focus on a small part of Western civilization, namely, the United States of America. Clearly, we have lost control over our environment, given global warming, which many of us continue to deny. Further, the growth of nationalism, militarism, and the “tyranny of a despotic minority” are very much in evidence as I write this brief blog. In particular, we can see the increase of militarism today as so many political decisions seem to be directed by the military, which enjoys the lion’s share of our annual budget, just as we can see the immense influence the “despotic minority” of the wealthy have on the President and this Congress and their determined attempt to turn this democracy into an oligarchy. But the growth of nationalism and especially militarism, along with the failure of a “creative minority” to maintain a foothold in this society, seem to have brought about what Toynbee called “an answering withdrawal of mimesis on the part of the majority” — i.e., apathy. This is especially disconcerting.

Looking at both the ancient Greek and Sumerian civilizations, Toynbee saw a movement through what the Greeks called “koros, hubris, and atē.” These signify the growth, especially of the military in those societies, from a surfeit of power through excessive pride to disaster. If he were alive today he would doubtless note a similar pattern emerging in this country, if not in the West generally. And it all seems to be hidden under the cloak of “national security” born of the fear of terrorism.

New History?

I have been exploring two themes recently in my posts. On the one hand, I am concerned about the current state of civilization, that is, the delicate fiber that holds together diverse peoples out of respect for law, tradition, and one another. On the other hand, I have explored many of the problems in higher education that seem somehow to have had an adverse effect on the world outside the ivory towers that once protected those inside from prying eyes. I have been especially concerned about the movement called “postmodernism” that has taken over in our universities and which rests on the central tenet that there is no such thing as truth, only “texts.”

A major movement within the academy since the late 1960s has been “New History,” one of the bastard offspring of postmodernism. It is based on the notion that history is simply another form of literature and historians are no longer to be held to the standards and rigor that ruled the discipline for generations, demands for evidence and the desire to approximate the truth about the past as much as possible. Footnotes and reliable references are no longer required. Again, since there is no such thing as truth, there cannot possibly be any accurate depiction of the past. The new historian, therefore, is free to wing it, make things up and tell it like he or she would like it to have been. New history is more about the historians than it is about history itself.

One of the most prominent historians to have defended Old History against the onslaught of the New Historians is Gertrude Himmelfarb, whom I have mentioned in past posts. She has done a remarkable job of seeking to defend truth against the attacks of the subjectivists and relativists, but one has the sense that she is spitting against the wind — and she knows it. In any event, she has written a number of books attempting to show the absurdity of rejecting standards of evidence and attempts to reconstruct the past as accurately as possible, and one of those books, The New History and The Old, addresses the topic directly. In that book, a collection of her papers, she recounts the following anecdote about a conference she attended in 1969, when New History was aborning and was regarded by most historians as merely a passing fad, a novelty soon to be dismissed. As Himmelfarb tells us:

“. . . what the history profession needed was a ‘little anarchy.’ This . . . was the great merit of the new history — its variety, openness, and pluralism. . . . there is no meeting ground between [different ways of approaching history] and there need not be. All that was necessary was the tolerance to permit ‘different people doing different kinds of things in different ways.’”

What we have here is the wheels of an academic discipline falling off. The notion that two or three or four historians are free to reconstruct events in accordance with any loose principles whatever, drawing on psychology, anthropology, science, or any other unrelated discipline, and that every one of those views is somehow legitimate and is to be respected by historians across the board, is on its face absurd. Tolerance is here carried out to the extreme of denial that there is anything we ought to agree about, anything beyond different ways of doing things. Anything goes. We are intolerant if we do not make room for the absurd and the outrageous. There is no truth available, only opinion.

Traditionally, the various academic disciplines each had its own distinctive manner of approaching problems that require reasonable solutions. There has always been disagreement about the best way to approach those problems and one never really expected any two thinkers in diverse academic disciplines to agree with one another about which is the better way. Hell, it was seldom the case that two academics within the same discipline agreed about much of anything! But that disagreement was the key to keeping lines of communication open and encouraging the exchange of diverse opinions and theories which were designed to eventually lead us all closer to the truth about the human condition. Dialogue requires open minds and a conviction that there is a goal to be achieved in the end, no matter how long it takes. Difference of opinion was a good thing because it made us careful about the way we conducted research and put together evidence and arguments. Difference was a means to an end, not the end in itself; but it was required in order to eventually reach some agreement about what is true and what is not. With New History, as Himmelfarb notes,

“Two historians working on the same subject are apt to produce books so disparate that they might be dealing with different events centuries and continents apart.”

What has occurred, not only in history but in all of the humanistic disciplines and the social sciences as well, is that they are all dangerously close to becoming as like one another as possible in their unanimous rejection of the notion that there is a truth worth pursuing, rejecting in one way or another the conviction that if one applied the techniques of the various disciplines one could at least hope to reach some degree of accord about what is and what is not the case. In a word, it used to be held that there is an answer to every question, but that answer must be sought by each thinker in accordance with the rules laid down within the discipline he or she has chosen to pursue, different ways to achieve a common goal, as it were. The current relativism, the rejection of the notion that there is any truth, blurs the distinctions among the various disciplines and tells us that it really doesn’t matter what anyone says about much of anything because there is no point in reasonable pursuit of truth since there is no such thing as reason or truth anyway. There is no point in searching for a common meeting ground on which we could all stand in search for something beyond personal opinion. The most persuasive or colorful writer or speaker wins.

Needless to say, this relativism has found its way into the world outside of the academy and we now find ourselves surrounded by such things as “alternative facts” and the notion that truth is a matter of who shouts loudest and is able to shut down opposing points of view. Might makes truth.

 

Popular Culture

I have written recently about how the movements that begin within the hallowed halls of academe tend to find their way outside those halls much like a scientific experiment that went wrong in a science-fiction movie. The most recent example of this is the notion of “alternative facts” that almost certainly is the bastard offspring of the postmodern movement born in Germany and France and now in ascendency in American Universities that stresses such things as the denial that there is such a thing as truth.

One of the offshoots of this movement, which would reject all “modern” academic courses of study in history, literature, philosophy, and sociology, is what is called “popular culture.” This is the study of such things as movies, television shows, comic books, and the like. This movement, in addition to embracing the notion that history may be written without footnotes because it’s only a matter of subjective opinion anyway, has given birth to the following sorts of phenomena — as recently reported by the American Council of Trustees and Alumni:

• Rice University offers a first-year writing intensive course titled “Star Wars and the Writing of Popular Culture.”

• Appalachian State University requires its freshmen to take a first-year seminar to help them develop “creative and critical thinking abilities.” Seminars this spring include “Death (and Rebirth?) of the Hippie.”

• The English department at the University of Pennsylvania — an Ivy League School — offers a course on “Wasting Time on The Internet.”

And this is just a tiny sample at a time when a recent poll of college graduates revealed that:

• 34% could not identify correctly when Election Day is held.

• 25% could not identify Tim Kaine as a candidate for vice president of the United States.

• 50% could not name Franklin Roosevelt as the last president to win more than two elections to the presidency.

A number of colleges and universities now offer not only courses in Popular Culture, but also majors in that field as well as PhDs for those who want to go on to teach in that academic “discipline.” And, A.C.T.A. concludes, “When many of our colleges and universities treat popular culture and entertainment as subjects worthy of serious study, it surely isn’t surprising that so many college graduates can’t identify key civic leaders, events, and their significance.” Indeed.

So what? you might ask. The answer is, of course, that this is coming at a time when we need young people who can think, and who can think critically. The recent election should have proven how vital that is and how far short we are falling as a nation. In this regard, there are two major problems that lie at the heart of this movement. To begin with, courses in Popular Culture emphasize information at the cost of thinking about information. I shall return to that notion in a moment. Secondly, the movement shoves aside other courses in the college curriculum that actually might help put young people in possession of their own minds, make them intelligent, critical thinking adults who can discriminate between a well-qualified candidate for president, say, and a complete fraud.

To return to the first point, it has been shown in tests conducted years ago that there are certain academic courses that help young people to think. This is reflected in tests such as the LSAT that students take in order to enter law school. Law requires critical thinking skills and the fields that do well, it has been shown, are mathematics, economics, philosophy, engineering, English, Foreign Language, Chemistry, Accounting, and History (in that order). The fields of study that score lowest in the LSAT are those that stress information and memorization. I shall not mention them out of respect to those who wasted their time and money earning degrees in those subject areas. But Popular Culture would certainly be at the top of that list if it had been offered at the time these studies were conducted.

The point is that the sorts of shenanigans that are going on behind the hallowed halls of academe have consequences for those who pay little or no attention to what is going on there. The graduates who have shown themselves to be badly informed about American history and government, and unable to think critically, grow in number, their minds filled with drivel, fostered by colleges and universities now being run as businesses, catering to the whims of their “customers.” And this at a time when our democracy desperately needs intelligent, well-informed, thoughtful citizens. Courses in such non-fields as “Popular Culture” are the sort of things that guarantee that this will not happen.

 

News That Sells

I found the following remarks in an article about how we should take reports about the latest polling results with a grain of salt. I have always done so, but it was most interesting to read what the writer said about news reporting generally:

Our research suggests yet another reason not to overreact to news stories about the newest poll: Media outlets tend to cover the surveys with the most “newsworthy” results, which can distort the picture of where the race stands.

Why? Consider the incentives of the news business. News outlets cover polls because they fit the very definition of newsworthiness. They’re new, timely, often generate conflict and allow political reporters to appear objective by simply telling readers and viewers what the public thinks. Horse-race stories are also popular.

Given that readers are drawn to drama and uncertainty, polls that offer intrigue or new developments — such as a close race or signs that one candidate is surging — are more likely to be deemed newsworthy. In particular, polls with unusual results may be more likely to make the news.

Note, please, the “incentives” of the news business. To begin with, news is regarded as a business, not a public service. This is, of course, true. The catch is that as a business news sources must worry about who pays the piper. That is to say, news reporting should be about what we need to know to be an informed citizenry; instead, it’s about what sells newspapers or air time. “Newsworthiness” is nothing more or less than what sells.

But I was struck by the notion that reporters should “appear to be objective,” as though objectivity should not be their highest goal. Clearly, it is impossible to be completely objective — how could one be objective about a person such as Donald Trump, for example? One either hates or (apparently) loves the man. But objectivity, for a reporter as for an historian, should be the first order of business. I recall a friend once saying that he wished someone would write an objective history of the Civil War — from the Southern point of view! As I say, it can’t be done. Nonetheless, it should always be the goal of any historian or reporter. But this writer says it is enough to “appear” to be objective. The polls do this by giving us numbers. But the selection of those polls can be very subjective, and it appears as though that choice is based on what strikes the reporter as sensational (“drama and uncertainty”).

It’s a good idea to take what we hear and read with a grain of salt generally. It pays to be suspicious and question all sources of information. We cannot always do this, but it, too, is a goal we should all seek to achieve. This is the point of thinking critically — not to reject, but to accept on reasonable grounds, which requires that we have a good idea of what constitutes reasonable grounds. This is especially difficult in an age like ours in which the reports we read and see on television are selected for all the wrong reasons.

I have noted in past blogs that reporting has become an arm of the entertainment industry. But it is interesting to have reinforcement of that idea by someone who seems to accept as a given the fact that reporting is all about getting through to an audience rather than about telling the world what is going on and letting the world decide what they want to read, see, or hear. Apparently TV is the worst culprit in this decline of reporting as news provider and this is because TV is a cut-throat business and as we all know business is what our world is all about these days: it’s all about the bottom line.

Making Widgets (Once More)

We are having a hot, tropical summer here in Minnesota and I decided to repost a previous entry rather than simply repeat what I have already said in order to avoid getting even more overheated. This post deals with my favorite topic, the failure of our education system (which I think is at the root of many of our current difficulties and helps us to understand why a moron could be seriously considered for the highest office in the land). Please note that I have made some subtle changes to update the entry.

Some time ago I wrote a post about the need to make distinctions in order to be clear about the things we discuss. One of the distinctions I mentioned is that between “wants” and “needs.” We rarely make the distinction and that leads to major confusion, especially when raising our kids, forming policies, or selling goods. In the latter case, for example, we are told that people need the product they are buying when, in fact, they may simply want the product. Or they may not even want it at all until an ad convinces them they do. One of the things marketing people are very good at doing is creating wants, and they do this by insisting that those wants are needs. (Do we really need a 5-hour energy drink??)

Surprisingly, educators do the same thing. They talk about what the kids need when they are really talking about what the kids want. It’s easier to determine wants than needs, because we can simply ask the kids: “what do you want?” Or we can continue to dumb-down the curriculum and provide them with electronic toys until they stop complaining. When it comes to needs, the kids don’t have the slightest clue. Sad to say, neither do many of their teachers and professors. And this is a very important point, because it leads us to the central reason why education is in deep doo-doo: those who are in a position to determine what the kids really need are either unaware of what those needs are or fail to act on that knowledge and fall into the marketing trap of simply determining what the kids want and then attempting to meet those fickle wants by insisting that they are providing the things the kids really need. It’s the path of least resistance. The confusion is widespread and until it is cleared up there is little likelihood that those who teach will lead those who learn rather than the other way around. (Note the interesting parallel here with parenting.)

But there’s another distinction that we seldom make and that is the distinction between education and training. I have discussed this confusion in previous blogs but have never focused on the key difference — until now. Training involves teaching learners how to do something, say, make widgets. Education involves understanding why we might want to make widgets in the first place. This is a critical difference, and the fact that education has devolved into job training is a serious blunder, because we need folks now more than ever who ask the troubling questions — why DO we make widgets?

There is a growing number of company CEOs who insist that educators are failing because the people coming out of college lack the ability to communicate, read and write memos, and speak before an audience. These highly paid corporate bosses talk a great deal about the need for these young people to have a broader, “liberal education,” though what they mean is that the folks they hire should be more effective at their jobs. However, at the level at which people are hired the message to hire broadly educated employees has failed to filter down and the initial search is simply for college graduates who can do a particular job, who can make widgets. The computer apps these recruiters use tend to screen out applicants who have majored in, say, philosophy, because presumably those people cannot make widgets (even though they could be trained to do so in a matter of weeks [days?]). So the job market looks bleak for graduates in such subjects as philosophy, literature, and history, because those folks are weeded out by a process that is designed to assure companies that the people hired can do meaningless jobs without the companies themselves having to spend money training them: the colleges are now expected to turn out people to make widgets, not ask why those widgets are being made in the first place.

Thus the CEOs who speak about the need for liberally educated employees don’t really mean it. The last thing they want is employees who ask why they are making widgets. They want workers who are already trained and can effectively make and market the products. The irony is that those who stop to ask the troubling questions would make the best employees in the long run because it is those people who can not only learn how to make and market the products, but they can also figure out how to improve those products as the world changes and demands for new products arise — as they most assuredly will. Because the only certain thing about the future is that things will change. And this is why America needs educated citizens, not simply those trained to make widgets.

Hate Breeds Hate

We have read often about the terrible conditions endured by the American rag-tag army during the freezing cold Winter at Valley Forge, a year after the surprise attack on the Hessians at Trenton during the Revolution. But we don’t read as often about the many other such Winters, both at Valley Forge and elsewhere, that had to be endured as the war dragged on for eight long years while the underfed and ill-clothed condition of the army remained virtually the same. Washington Irving in his biography of George Washington described one such Winter at Morristown in some detail:

“The dreary encampment at Valley Forge has become proverbial for its hardships, yet they were scarcely more severe than those suffered by Washington’s army during the present winter [1780] while hutted among the heights of Morristown. The winter set in early and was uncommonly rigorous. The transportation of supplies was obstructed, the magazines were exhausted, and the commissaries had neither money nor credit to enable them to replenish them. For weeks at a time the army was on half allowance, sometimes without meat, sometimes without bread, sometimes without both. There was a scarcity too of clothing and blankets so that the poor soldiers were suffering from cold as well as hunger. . . . The severest trials of the Revolution in fact were not in the field, where there were shouts to excite and laurels to be won, but in the squalid wretchedness of ill-provided camps, where there was nothing to cheer and everything to be endured. To suffer was the lot of the revolutionary soldier.”

The details of the picture sketched here are graphically completed in a letter written by General Anthony Wayne, who was in charge of six regiments hutted near Morristown:

“Poorly clothed, badly fed, and worse paid. . . . some of them not having received a paper dollar for near twelve months, exposed to winter’s piercing cold, to drifting snows and chilling blasts, with no protection but old worn-out coats, tattered linen overalls and but one blanket between three men.”

Needless to say, there was widespread sickness, and desertions were common, even mutiny. The wonder is that any of the soldiers stuck it out and that Washington had enough men to continue the fight when the war resumed after the long, cold Winters. But he did.

Much of this remarkable fact is attributed by many historians to Washington’s undeniable charisma, his devotion to his troops, and his willingness to endure the same conditions as they did. But there is another factor that needs to be mentioned: the British and their allies were intent on demoralizing the colonists by burning whole villages and pillaging everything in sight. This activity had precisely the opposite effect. One famous incident, involving the wife of the Rev. James Caldwell, is recounted by Irving:

“When the sacking of the village took place she retired with her children into a back room of the house. Her infant of eight months was in the arms of an attendant. She herself was seated on the side of a bed holding a child of three years of age by the hand, and was engaged in prayer. All was terror and confusion in the village when suddenly a musket was discharged in at the window. Two balls struck her in the breast and she fell dead on the floor. The parsonage and church were set on fire and it was with difficulty her body was rescued from the flames.”

The terrible incident became a rallying cry for the angry colonists, who grew to hate the invaders and became more determined than ever to drive them from their homeland. Their hatred helped keep them warm during the harsh winters.

There were a great many loyal British subjects as the war began, and the colonies had a difficult time raising enough militia to engage in a war against one of the most powerful armies on earth, especially since many of those “loyal” British subjects joined with the invaders to fight against their former countrymen. But as the war went on and the atrocities multiplied, despite the harsh conditions of the Winters and the lack of pay accompanied by the diminishing value of printed currency, the number of loyal British subjects diminished and the intensity of the colonists grew fierce. And they became better soldiers.

In any number of ways throughout history the same story, or stories very much like it, has been repeated in the innumerable wars that humans have waged against one another. And yet the lesson is never learned. One side or the other determines to “escalate” the war and demoralize the enemy by dropping bigger bombs or sending drones — the modern version of pillaging — only to discover that such actions merely enrage the enemy and make them more determined than ever to retaliate.

We find this today with the rapid growth of terrorist groups that has resulted from the “war on terror” this nation has declared as a result of the attack on the Twin Towers. The number of terrorists doesn’t diminish, it expands. Hatred breeds hatred. This is one of the lessons that history has held before us and it is one of the many lessons that we continue to ignore.

Failing To Deliver

I have from time to time bemoaned the fact in these blogs that our schools are failing to educate students. I have also noted that the American Council of Trustees and Alumni in Washington, D.C. has decided to do something about the failure of the colleges and universities, in particular. I would argue that the lower grades are failing their students as well, but the approach of the ACTA is to embarrass higher education into cleaning up its house in the expectation that this will require that the lower grades do so as well. If, for example, colleges and universities required two years of a foreign language upon entrance (as they once did), then high schools would have to provide such courses for those students who plan to attend college (as they once did). And this is true even for such basic things as English grammar, which is now being taught in remedial courses in a majority of the colleges across this great land of ours — and a few professional schools as well — if you can imagine.

In any event, the ACTA recently sent out a mailing to help raise monies to further their cause. In that material they sent along some disturbing facts that help them make the case for a solid core requirement in all American undergraduate colleges to provide their graduates with the basic tools they will need in order to be productive citizens in a democracy and better able to advance in whatever profession they choose to follow after college. They identify seven areas, from composition and literature to mathematics and science, which all colleges need to cover; moreover, they have found after an exhaustive survey over several years that the vast majority of American colleges get failing grades. My undergraduate college received an “A” grade, but my graduate school received a grade of “D” because their undergraduate core includes only foreign language and science. If you want to know more you might contact them (info@goacta.org). The only question that is not raised in their material is why the high schools aren’t teaching these basic courses. One does wonder. In any event, here are some of the facts that they bring forward to make their case against so many of our colleges today:

Even after the highly publicized television series on the Roosevelts, “recent college graduates showed, in large numbers, that they simply don’t know or understand what the Roosevelts did or even the difference between Teddy and Franklin.” Further, one of the ACTA’s recent surveys showed that “More than half of college graduates didn’t know that Franklin D. Roosevelt served four terms in office; A third of college graduates couldn’t pick FDR out from a multiple-choice list of the presidents who spearheaded the New Deal; Barely half of the college graduates could identify Teddy Roosevelt as leading the construction of the Panama Canal.” The problem extends much further than failure to know about the Roosevelts. In general terms, quoting from an editorial in the Wall Street Journal,

“A majority of U.S. college graduates don’t know the length of a congressional term, what the Emancipation Proclamation was, or which Revolutionary War general led the American troops at Yorktown. . . .The reason for such failures, according to a recent study: Few schools mandate courses in core subjects like U.S. government, history, or economics. The sixth annual analysis of core curricula at 1,098 four-year colleges and universities by the American Council of Trustees and Alumni found that just 18% of schools require American history to graduate, 13% require a foreign language, and 3% require economics.”

The truly astonishing (and distressing) thing is that an increasing number of American colleges and universities allow “Mickey Mouse” courses to count as core courses: the University of Colorado offers “Horror Films and American Culture,” UNC-Greensboro considers “Survey of Historic Costumes” a core course, and Richard Stockton College of New Jersey lets students satisfy the core history requirement with “Vampires: History of the Undead.” I kid you not. These are examples picked at random from a list that continues to grow as college faculty seek to draw students to their classes (and thereby guarantee their jobs) without any consideration whatever of the benefits of such courses — or lack thereof — to the student. Believe me, I know whereof I speak. As former Harvard President Larry Summers wrote recently:

“The threat today is less from overreaching administrators and trustees than it is from prevailing faculty orthodoxies that make it very difficult for scholars holding certain views to advance in certain fields.”

What Summers is speaking about is the determination of a great many faculty members at our colleges and universities to teach courses they want to teach simply to increase enrollments or, perhaps, to remedy what they perceive as past injustices; most are unwilling to teach courses that draw on Western tradition, the subject matter that has informed generations, because they firmly believe the works of “dead, white European males” are at the core of what is wrong with the world today. Worse yet, they discourage their students from taking such courses and disparage their colleagues who want to teach them. In my experience, many of these same people reveal their own ignorance of the very tradition they turn their backs upon and deny to their students. And they certainly don’t care whether the courses they teach instead will benefit their students in the long run — which would appear to be the central question.

The ACTA seeks to publicly embarrass trustees and alumni at American colleges and universities into putting pressure on the administrations and governing boards to remedy this situation. And it is working. The organization has attracted a great deal of attention to the problem and keeps a list of the colleges and universities that have modified their core requirements; it annually gives grades to all in an attempt to draw attention to the fact that in so many cases parents and students are simply not getting their money’s worth — especially given the escalating costs of college tuition these days.

The Now Generation

The psychiatrists who studied the American prisoners of war released after the Korean conflict were amazed at the success of the “brainwashing” techniques that were used on those men. Captured documents revealed that one of the secrets to that success was the North Koreans’ discovery that Americans were generally ignorant of history, even their own. These young men could be told pretty much anything bad about their country and they tended to believe it because they had no frame of reference. For example, they could be told that in America children were forced to work in the coal mines, and a couple of the men vaguely remembered hearing of this and were willing to embrace the half-truth and share it with their fellow prisoners. True, there were children working in the coal mines at one time, but no longer. It was precisely those half-truths that enabled the North Koreans to convince the ignorant young men of blatant falsehoods. Couple that treatment with the censored mail the prisoners received from wives and sweethearts complaining about how bad things were back home, not to mention the seeds of suspicion that were planted among the men to break down their trust in one another, and you have a formula for success. There was not a single attempt by an American soldier to escape imprisonment during the entire conflict!

Today’s young people are equally ignorant of their history, perhaps even more so. We make excuses for these kids by moaning about how much “pressure” they are under. Nonsense! I would argue they are under less pressure than those young men who were fighting in Korea, or even the generation that followed them. Today’s young people need not fear the draft. Moreover, they are the beneficiaries of the sexual revolution and are therefore free from the restraint experienced by prior generations who were told to wait for sex until they were married. In fact, they don’t seem to show much restraint about much of anything, truth to tell. And there is considerably less expected of them in school these days than was expected of their fathers and mothers. They are told they are wonderful: they feel entitled. So let’s hear no more about how much pressure they are under.

Now, social scientists — who would rank below even the geologists on Sheldon Cooper’s hierarchy of sciences, I suspect — love to label the generations. We have read about the “me” generation and the “millennials,” the “X” generation, and the “Y” generation. While I hesitate to lump myself together with the social scientists, I would nonetheless suggest that we call today’s young people the “Now Generation.” They, like their parents before them, don’t know diddly about their own history, much less world history. In fact, studies of recent college graduates have shown that an alarming number of these folks cannot name the first five presidents of the United States, cannot recognize the Gettysburg Address, don’t know who our allies were during the Second World War, or when the First World War was fought — or what countries it involved. Much ink has been spilled, along with weeping and gnashing of teeth, over these sad revelations, but very little of substance has resulted from all the angst. History is still not considered important in our schools or in this culture. As Henry Ford would have it: “History is bunk!”

Santayana famously said that those who are ignorant of their history are doomed to repeat its mistakes. This presupposes a cyclical view of history and is predicated on the notion that human beings don’t really change that much. Because events tend to repeat themselves — we seem to be constantly at war, for example — and humans have become increasingly locked in the present moment, ignorant of their own past, they will tend to fall into the same traps as their predecessors. On a smaller scale, every parent laments the fact that their kids don’t listen to them and seem determined to make the same mistakes their parents made twenty years before. History is a great teacher. But we have to read it, assimilate it, and take it to heart. We tend not to do that. History is not bunk, Mr. Ford, and we are certain to repeat the mistakes of previous generations if we continue to remain locked in the present moment, ignoring not only the past (from which we have so much to learn) but also ignoring our obligations to the future as well.

So, I recommend that a more appropriate label for the present younger generation is the one suggested above. It is certainly true, as psychological and sociological studies have revealed, that today’s youth are addicted to electronic toys, immersed in themselves, uncaring, and seemingly unaware of the world outside themselves; the label “Me Generation” does seem to fit. But my suggestion is designed to expand the domain of the label to include not only the young, but their parents as well. We all need to read and study the past in order to avoid the traps and pitfalls that most assuredly lie ahead.

Forgetting The Past

The student protests in this country during the turbulent 1960s, led by well-intentioned, idealistic young people, seem to have marked the death-throes of the American spirit. Directed as it was, unsuccessfully, against the “establishment” of materialistic, commercial, and militaristic power that increasingly controlled this country, the effort sought in its blind way to breathe life into the spirit that had made this country remarkable. But blind it was, led by uneducated zealots who lacked a coherent plan of action, confused freedom with license, and targeted education, which they barely understood and were convinced was turning into simply another face of the corporate corruption that was suffocating their country. In their reckless enthusiasm they decided that the core academic requirements at several of America’s leading universities were “irrelevant,” and they bullied bewildered, frightened, and impotent professors and administrators into cutting and slashing those requirements. Other institutions soon followed. One of the first casualties was history, which was regarded by militant students as the least relevant of subjects for a new age they were convinced they could bring about by force of will and intimidation.

Had they been inclined to read at all, they might have done well to heed the words of Aldous Huxley when, in Brave New World, he pointed out that the way the rulers of that bizarre world controlled their minions was by erasing history. One of Huxley’s slogans, lifted from Henry Ford, was “history is bunk.” By erasing and re-writing history, those in power could control the minds of the population, redirect the nation, and determine its future. In the end, of course, the students who led the protests in this country and who thought history irrelevant were themselves (inevitably?) co-opted by the corporations and eventually became narrow, ignorant Yuppies, running up huge credit card debt and worried more about making the payments on their Volvos and their condos than about the expiring soul of a nation they once claimed to love. Or they became politicians tied to corporate apron-strings, thereby rendered incapable of compromise and wise leadership.

In 1979 Christopher Lasch wrote one of the most profound and informative analyses of the cultural malaise that resulted in large part from the failure of the protests in this country in the 1960s. In his remarkable book The Culture of Narcissism: American Life In An Age of Diminishing Expectations, which I have referred to in previous blogs, he warned us about this attempt to turn our backs on history:

“. . .the devaluation of the past has become one of the most important symptoms of the cultural crisis to which this book addresses itself, often drawing on historical experience to explain what is wrong with our present arrangements. A denial of the past, superficially progressive and optimistic, proves on closer analysis to embody the despair of a society that cannot face the future. . . . After the political turmoil of the sixties, Americans have retreated to purely personal preoccupations. Having no hope of improving their lives in any of the ways that matter, people have convinced themselves that what matters is psychic self-improvement: getting in touch with their feelings, eating health food, taking lessons in ballet or belly dancing, immersing themselves in the wisdom of the East, jogging, learning how to ‘relate,’ overcoming the ‘fear of pleasure.’ Harmless in themselves, these pursuits, elevated to a program and wrapped in the rhetoric of authenticity and awareness, signify a retreat from politics and a repudiation of the recent past. Indeed, Americans seem to wish to forget not only the sixties, the riots, the new left, the disruptions on college campuses, Vietnam, Watergate, and the Nixon presidency, but their entire collective past, even in the antiseptic form in which it was celebrated during the Bicentennial. Woody Allen’s movie Sleeper, issued in 1973, accurately caught the mood of the seventies. Appropriately cast in the form of a parody of futuristic science fiction, the film finds a great many ways to convey the message that ‘political solutions don’t work,’ as Allen flatly announces at one point. When asked what he believes in, Allen, having ruled out politics, religion, and science, declares: ‘I believe in sex and death — two experiences that come once in a lifetime.’ . . . To live for the moment is the prevailing passion — to live for yourself, not for your predecessors or posterity.”

If there were any questions about the spiritual health of this country, the loss of hope, the rejection of religion, history, and science, and the abandoned expectations of viable political solutions provide clear answers. We do seem to be a vapid people, collecting our toys and worrying about how to pay for them, wandering lost in a maze of our own making, ignoring the serious problems around us as we follow our own personal agendas — and remaining ignorant of the history lessons that might well show us the way to a more promising future.