More Scandal

I published a piece in 2001 about the corruption of higher education resulting from the huge amounts of money in collegiate sports, especially at the NCAA Division I level, and especially in men’s basketball and football. If I had any wild notions that my revelations would cure the problem, I was wrong! The problem has simply grown worse in the interim, and I have blogged several times about outrageous scandals in the collegiate ranks. The latest incident involves the men’s basketball program at the University of Louisville; more importantly, it involves the FBI. Now things will get serious.

The FBI has been investigating evidence of “pay-for-play” scandals in several universities for a couple of years now and the revelations regarding Louisville’s men’s basketball program are the headline-makers; but apparently there are a number of other men’s basketball programs involved and the web of intrigue will continue to grow and eventually, it is believed, will include some of the major collegiate football programs as well. We haven’t heard nothin’ yet! But what we have heard makes a person cringe, especially if that person likes to think that college is about education and not about high-power sports involving millions of dollars.

In any event, Louisville has been charged with improprieties involving Adidas, which recently signed a contract with the university for $160 million over a ten-year period, reportedly including $2.1 million for Rick Pitino, the long-time Hall-of-Fame coach of the men’s basketball team (whose annual salary is $7.7 million without the additional money from Adidas). The contract pays the university for requiring the sports teams, presumably all of them, to wear uniforms and equipment, provided by Adidas, with the Adidas logo prominently on display. This is not unusual and has been going on for years, not only with Adidas but with Nike and Under Armour as well. The rationale for taking money from these corporations, as put forth by people like Bobby Bowden, former head coach of the Florida State football team, is that “somehow we have to pay the bills.” Indeed.

As it turns out, the liaison person between Adidas and the University of Louisville agreed to pay the family of a high-school basketball player $100,000 to make sure their son would play for Louisville. Apparently another high-school player is involved as well. This is the “pay-for-play” element and, of course, it could also be regarded as bribery. The FBI is now involved, and it apparently doesn’t like what it sees.

Louisville is in the process of firing Pitino and the Athletics Director as well in order to cover their butts — though it’s a bit late for that. And Adidas will fire the head of global sports marketing who made the arrangements with the university to pay for the high school basketball player’s favors. But, more to the point, the university will attempt to keep the $160 million that Adidas has agreed to pay them for the privilege of supplying free athletics equipment. And this raises an interesting moral question: is it not the case that this sort of hypocrisy on the part of the university is precisely at the core of what is wrong with collegiate sports at the highest levels? The Louisville administration knows their relationship with this corporation has soiled the university’s reputation, but they will continue to enjoy the bribe (let’s call a spade a spade) because it’s a lot of money and they want to keep it. Presumably. The university ought to be setting an example for its students and putting things right with the academic and athletics sides of things. But they are simply going through the motions by firing Pitino and the athletics director while hoping the financial arrangements with a corporation that makes and sells athletics equipment will continue as though nothing has occurred.

The stink from the major colleges is rank, and it just seems to get worse. One would like to think that with the FBI turning over rocks the stink will get so bad that steps will finally be taken to cure the problem. The universities are about education; sports has its place, but as it is now it is the tail that wags the dog, and that is not the way it should be. Not at all.

 


Why The Humanities?

I have referred to a book by Anthony Kronman defending, if not in fact attempting to resurrect, the humanities. He never quite defines what he means by the term, but it appears he means what I and others have meant by the liberal arts, namely, those studies that help us better understand what it means to be human and how we are to make sense of a world that seems on its face to be meaningless. His book has the cumbersome title Education’s End: Why Our Colleges and Universities Have Given Up On the Meaning of Life. Kronman stepped down as Dean of the Yale Law School in order to teach in “Directed Studies,” a freshman elective course that focuses attention on the Great Books of Western Civilization.

In his book Kronman makes a strong case that the study of such things as great literature, philosophy, history, and the fine arts can help us gain a wider perspective on our own lives, a deeper understanding of ourselves and the world in which we live. He is convinced, as am I, that in our frenzy to follow wherever science and technology (especially the latter) lead we have lost the better part of ourselves. The only alternative for many people is fundamentalist religion.

Those who teach the humanities in our colleges and universities have bought into such things as political correctness and the research ideal that places careers ahead of classroom teaching, and this, according to Kronman, has cost the humanities their very soul. They are dying from self-inflicted wounds; society and the academic community both agree that they have passed their must-sell-by date, that they are passé and not worth pursuing. While students need more than ever to wonder what great minds have had to say about the meaning of life, humanities teachers are busy trying to convince the world that they are as respectable as the hard sciences by devising schemes that provide them with spurious “theories” about truth and reality. The end result is postmodernism, with its rejection of Western ideas and ideals.

There is considerable data suggesting that Kronman is correct in his assessment, as increasing numbers of students ignore the humanities altogether in their pursuit of a career — which, they and their parents are convinced, is the sole purpose of a higher education. As Kronman himself puts it:

“However urgently students feel pressed to choose a career, to get in a groove and start moving along, the college years are their last chance to examine their lives from a wider perspective and to develop the habit, which they will need later on, of looking at things from a point of view outside the channels of their careers. This is precisely what [the humanities] encourage. In doing so, they run against the grain of the belief  most students share that there is no point of view outside those channels. That a life is a career is for them an article of faith. [The humanities] put this piety in doubt by insisting on the importance of the idea of life as a whole. For the young person on the threshold of a career, nothing could be more disturbing or helpful.”

In a word, we live at a time when we need to ask the deeper questions about the meaning of our own lives, and we are wasting our time, and that of our children and students, in pushing them into narrow career paths from which they lose perspective and forget what is truly important.

Kronman is a bit overwrought at times, and I hesitate to embrace his claims all at once. But he makes a sound point: our confusing and confused times demand a way, other than religious fundamentalism, to escape from the narrow world of self and relish the remarkable accomplishments of our fellow humans in the arts, science, and the humanities. We are cutting ourselves off from the past to our own detriment, forgetting those on whose shoulders we must stand if we are ever to get some sort of idea of who we are and why we are here.

The colleges and universities are especially to blame for holding the humanistic studies in low esteem, but this simply reflects a world in which the practical and immediate are all-important and the past and the truly remarkable are ignored in an attempt to make ourselves more comfortable and make sure we are up to speed with the latest invention or the latest gadget that we are confident will make our lives more pleasant, if not more meaningful.

Militant Multiculturalism

I have held forth on a number of occasions (too many, some might say) about the battles going on in higher education since at least the 1960s, when the wheels started to fall off. The battles wear many faces but occur under the umbrella term “postmodernism,” a new age that will replace the old. One of those faces is that of “multiculturalism,” which has become increasingly militant and focuses on an attack against Western Civilization — regarded as the source of all major problems now confronting the world. It began with an attack on the “establishment” in the 1960s and expanded to take in the whole of Western Civilization, especially during the Viet Nam war, because of the West’s consistent pattern of aggression and exploitation in an attempt to bring other peoples to their knees and force them to yield up their treasure — exacerbated by the presumption of greatness on the part of Western Europe and America, and of Western art, literature, and philosophy in particular.

It’s a movement that is well intended, to be sure, though it tends to dwell all too intently on the failures of the Western way of looking at the world. Certainly there have been terrible mistakes, such as genocide, greed, slavery, pointless wars, and intolerance of other ways of looking at the world. But in the tossing-out process something precious is being glossed over, and in the tizzy to replace the old with the new some important elements are being ignored or forgotten altogether.

Beaten down by this attack, for example, are the “Great Books” of Western Civilization, which are now regarded as the villains in the drama, the source of the ideas that have made our culture rotten at the core — though one must wonder how many of the zealots have bothered to read any of those books. Indeed, it is mainly dwindling numbers of old geezers such as myself who continue to spit into the wind while defenders of the New Age proudly display their ignorance and triumph in their newly won victories. Their goal is to “rid the world of colonial oppression,” to convert students to one way of thinking, toss out the old, and pave the way to a new and more open way of engaging the world in an effort at what is called “globalization.” And they are winning. Indeed, they may have already won.

One of the old geezers to have joined the battle in a rear-guard effort to save the humanities — where these battles have been fought for the most part — is Anthony Kronman of Yale University, who has written a book that describes the battles in some detail in an effort to save what remains and perhaps even to resuscitate the humanities as they lie dying in agony from self-inflicted wounds. His book, Education’s End: Why Our Colleges And Universities Have Given Up On the Meaning of Life, points out some of the many ironies of the attack on the tradition that is being replaced. To begin with, there is the fact that replacing our culture with another, presumably superior, culture would require a total immersion in that culture, which is not possible — even in theory — for American students who have spent their lives absorbing scraps of the very culture they hope to displace. Furthermore, the attack on Western Civilization draws on the categories and ideals of that very civilization, which also provides the intellectual framework, such as it is, for that attack. And ironically those ideas and ideals, though they arose in the West, are now endemic to most, if not all, of the cultures that the militants regard as superior to our own. As Kronman points out:

“The ideals of individual freedom and toleration; of democratic government; of respect for the rights of minorities and for human rights generally; a reliance on markets as a mechanism for the organization of economic life and the recognition of the need for markets to be regulated by a supervenient political authority; a reliance, in the political realm, on the methods of bureaucratic administration, with its formal functions and legal separation of office from officeholder; an acceptance of the truths of modern science and the ubiquitous employment of its technical products: all these provide, in many parts of the world, the existing foundations of political, social, and economic life, and where they do not, they are viewed as aspirational goals toward which everyone has the strongest moral and material reasons to strive.  . . . all of them, all of these distinctively modern ideas and institutions, are of Western origin. . . . The ideas and institutions of the West, liberated from the accidental limits of their historical beginnings, have become the common possession of humanity.”

Moreover, as Kronman points out,

“The idea of tolerance [which the militants champion] finds support in many traditions, especially religious ones. But only in the modern West did it become — fitfully, hesitantly, but with increasing clarity and determination — an axiom of political life.”

I have often noted that we seem to be throwing out the baby with the bath water, but those who would do the throwing couldn’t care less as they reach left and right for the latest Western evil to be tossed. However, while there are indeed many reasons to feel disdain for our past, even for the terrible mistakes that we in the West have made, there are also many things that are worth saving and preserving. To be sure, the universities should be open to new ideas and make the students aware of the many cultures around the world other than their own — all of which also have made mistakes, by the way. But at the same time they should seek to preserve the best of what we have all learned from our own past in order to pass those things along. Healthy criticism is a good thing, along with honest appraisal and a weighing of pros and cons, but a hysterical rejection of all things Western in the name of “tolerance” is itself the most intolerant view one can possibly exhibit.

Computers and Kids

I have blogged about this before, but I was disturbed a bit by a recent post by a dear friend congratulating a former teacher for taking time out of her retirement to fit out a bus with computers and take this “fully equipped mobile tech center” to the kids to help them get a leg up on education. It made me recall what I had read some time ago about computers and the kids. It’s not at all clear that getting young children on computers — or any sort of electronic device — will help them develop their minds. The jury is still out on the question, but there is growing clinical evidence that those devices develop the right side of the human brain and leave the left side almost totally undeveloped. In addition, there are “windows” when certain types of brain development must take place in young children or it will never happen.

The problem here is that the left hemisphere of the human brain is the side that controls language and thought. The right side is the “affective” side, the side of imagination and emotion. There’s nothing wrong with developing the right side of the child’s brain — unless the left hemisphere is left undeveloped as a result. And that seems to be the case when we rely on computers to teach. In addition, it has been shown that there is a direct correlation between increased computer usage and attention deficit disorder.

Ironically, the schools are on the bandwagon, buying computers for the kids — or accepting them from all-too-willing corporations that are delighted to get the kids hooked as soon as possible. And the parents applaud these efforts, which often include providing the child with his or her very own computer, because they are convinced that this will put their kids squarely on the information highway and on their way to a successful life. They may not support increased salaries for the teachers, but they will gladly see their tax money spent on computers.

Nothing provides us with information as quickly or as efficiently as computers. That much is clear. Moreover, we all know that information is a key to understanding. It is a sine qua non of all knowing. But it is not sufficient by itself. Humans must also know how to process information, separate the wheat from the chaff and determine what is true and what is fiction — recognize “false facts.” Thought requires the development of the left hemisphere of the human brain, and as Jane M. Healy has told us in her book Endangered Minds: Why Children Don’t Think, recent clinical studies of human brain development involving brain scans and MRIs have shown that electronic devices do not help that portion of the brain to develop. To quote Dr. Healy directly:

“The experiences of children today [involving television and the use of electronic devices such as computers] may be predisposing them to deficits both in effective coordination between hemispheres and in higher-level linguistic and organizational skills of the left hemisphere [of the brain]. They may particularly lack practice in the use of left-hemisphere systems of auditory analysis and in the skills of logical, sequential reasoning.”

Moreover, as Marie Winn points out regarding television in the book referred to above,

“. . . a carefully controlled study designed to explore the relationship between television viewing and the language spoken by preschool children discovered an inverse relationship between viewing time and performance on tests of language development; the children in the study who viewed more television at home demonstrated lower language levels.”

Computers, like television, are essentially passive devices — even when “interactive.” They cannot substitute for a human being sitting down with another human being, or several other human beings, and having a discussion. Human interaction at a young age, whether telling stories, reading stories, making stories up, or simply visiting and chatting about the sort of day the child is having, is a sure way to help the child’s mind grow and develop fully — not just on one side. I hasten to point out that we are talking about young children here, kindergarten through eighth grade. There is plenty of time to teach students basic computer skills to help them get a leg up in the job-hunting arena when they reach high school, after the critical windows in early brain development have closed. These skills could at that time be taught along with civics, history, literature, mathematics, and science, subjects that will deepen the young students’ minds and broaden their horizons as well.

My wife and I gave a book of brain-teasers to a precocious young child we love dearly, thinking it would help her develop her mind and that she would enjoy the challenge. After a very few minutes she was looking up the answers in the back of the book! This is learned behavior. One wonders how often this happens with computers as attention spans shrink. In any event, it is something that would not happen with another human being. There would be give and take, exchanges back and forth, encouragement, hints, and the kind of coaching that goes into good teaching. That’s what should have been happening on the “mobile tech center.” Computers are not the answer to helping young kids learn how to use their minds. Good teaching and good parenting are the answers.

Culture Studies

I have made passing reference from time to time to the postmodern trend in the academy away from traditional coursework in the standard academic disciplines and toward something that has come to be called “Culture Studies.” These studies are an attempt to replace those traditional disciplines, which are regarded by a growing number of academics as irrelevant or even “a part of the problem,” in an effort to radically change the climate not only within the universities but also in society at large. As literature professor James Seaton tells us in Literary Criticism From Plato to Postmodernism:

“In the twenty-first century, the academic study of popular culture has become a part of culture studies, a transdisciplinary approach whose attraction derives in  large part from its implicit promise that adepts gain the ability to make authoritative pronouncements about all aspects of human life without going to the trouble of learning the rudiments of any particular discipline.”

I have discussed in previous posts the birth from this movement of the New History, which insists that historians simply express their own particular view of events — without footnotes or corroboration of facts — because, they say, the traditional view of how to write history is based on the absurd notion that there are such things as facts and even a thing called “truth.” In the end, the postmodern movement as a whole rejects such “absurd” notions and in the process moves toward a more radical manner of viewing one’s world and the things that go on in that world. I have noted the tendency of this movement within the academy to morph into movements outside the academy in society at large — in the form, most recently, of “alternative facts.” In a word, the repercussions of what growing numbers of academics do within the hallowed halls of academe have an effect on the way people think both within and without the academy. Most interesting in Seaton’s remarks above is the notion that culture studies — which is his special concern in his book — are an attempt to replace traditional academic disciplines, especially in literature, history, and philosophy, and transform them into something that loosely resembles sociology, badly done.

To what end, one might ask? The answer is: to the end of radically transforming the world. Revolutionizing the world, if you will. The three editors of an anthology titled Culture Studies, published in 1992, put it quite explicitly:

“. . .a continuing preoccupation within culture studies is the notion of radical social and cultural transformation . . . in virtually all traditions of culture studies, its practitioners see culture studies not simply as a chronicle of cultural change but as an intervention in it, and see themselves not simply as scholars providing an account but as politically engaged participants.”

Thus we should not be surprised that on many college campuses across the land militant faculties and students are turning away prospective speakers with whom they disagree and are steamrolling their political agendas through committee meetings, commandeering professional journals, and turning the curriculum into a homogeneous series of studies in like-minded writers that will indoctrinate students into their way of thinking. This unanimity of opinion is regarded by this group as essential to the ends they have in view, namely “a commitment to education as a tool for progressivist politics.” This has disturbed even a few of those who regard themselves as liberal members of the faculty. As one recently noted (and please note that this person is not a reactionary conservative):

“. . .by putting politics outside of discussion, and insisting that intellectual work proceed within an a priori view of proper leftist belief — conveyed between the lines, parenthetically, or with knowing glances and smiles — all sorts of intellectual alliances have been foreclosed at the outset.”

When he says that politics is put “outside of discussion,” what he means, of course, is that political issues have already been decided: America is a corrupt imperialistic country, our democracy is irremediably damaged, racism and sexism are rampant, and corruption is the order of the day. These things may or may not be true, but they are not to be discussed. The matter has been settled, “foreclosed at the outset.” Their success, which has been surprising, has been due to simple tactics: intimidation and guilt. Much of what they say is true, or at least half-true, but it is all beyond discussion.

Folks like this writer, and a diminishing number of other relics, following in the footsteps of the brilliant Black historian W.E.B. DuBois, attempt to defend what was once called “High Culture” and is now regarded as “elitist” or “undemocratic.” Such folks are regarded as past their must-sell-by-date, not worth a moment’s reflection or worry on the way toward the transformation of the university from a place where ideas are freely exchanged and discussion is open-ended and hopefully leads to something we can agree is true or factual (or at least plausible) to an institution where future leaders of shared radical views of society are bred and raised in a comforting and comfortable atmosphere of inflated grades where they will find only support and agreement.

The agenda in “higher” education has changed radically: it is no longer about putting young people in possession of their own minds. It is now about making sure they see that the only way to transform society and eliminate injustice is to read and discuss those who agree with the program that has been carefully laid out for them by growing numbers of faculty who see themselves as having arrived at a place where disagreement can no longer be tolerated if it is likely to lead students away from what they regard as the truth — despite the fact, of course, that they insist that there is no such thing as “truth.”

This may help us to understand why at the moment 45% of America’s college graduates think the sitting president is doing a good job. It is a figure that surprises many but makes perfect sense to those who see this man as the embodiment of radical change — and who have not been taught how to think, only what to think.

Tennis Lessons

I was a tennis teaching professional for 35 years giving private lessons — first at a private club just outside of Chicago and then in the summers while I was teaching full-time in philosophy at the university — running tennis camps, and coaching for 15 years, both men and women. My approach to private lessons was pretty much the same: hit with the pupil for a few minutes to see what his or her strengths and weaknesses were. I didn’t have a formula that I tried to force on every player. I took what they had and tried to work with it, bending the lesson to fit the pupil. I usually saw fairly quickly what they needed to do to improve their game, and I would make a few suggestions to them — avoiding criticism and making sure I didn’t say something that might undermine their confidence or make them self-conscious. If what I said failed to work, I tried to say the same thing using different words: everyone is different. Eventually something I said would seem to work, and I piled on the praise and relied on repetition to help groove the stroke and make it work better for the pupil.

The year after college I taught arithmetic, history, geography, and science to boys in grades 3 through 7 at a private school. I also coached football and basketball. I learned during that year to apply the same techniques I had used on the tennis court: listen carefully and observe; be patient and full of praise when the students got the message. And I tried to keep my sense of humor throughout — giving private lessons, in the classroom, or while coaching intercollegiate tennis players.

What I learned over the years is that teaching is not a science; it is an art. There are no “methods” that can be taught to every aspiring teacher that will work with all the students — or even the majority of them. This is why I have become so critical of the methods courses taught in education programs across the nation. They rest on the faulty assumption that teaching is a science. The best thing that could be done for our teachers is to encourage them to take an academic major in college — history, English, biology, chemistry, sociology, mathematics, whatever — and then take a year as an apprentice to a veteran teacher. The veteran can give the aspiring teacher tips on what has worked over the years for them — how to reach the quiet or subdued pupils in the class, for example, instead of teaching to the ones that always raise their hands. There are things that can be learned, but not sitting in a classroom in college working through a manual on “methods.”

I have come to believe that this is the best plan. I would note in passing that the teachers at the private school where I taught for that year all had legitimate college degrees and none of them (that’s right, none of them) was “certified” to teach. They were not driven away from teaching like so many bright, young people by having to take tedious and pointless “methods” courses. They learned on the go and, for the most part, were very good at what they did. Granted, the students were bright but the principle is the same. The best way to learn how to teach is to be patient, be aware of what is going on around you, and have adequate communication skills to make your point in a variety of ways in order to reach the largest number of pupils. These are not things that can be learned in a department of education. They must be natural or acquired on the job in a classroom teaching others what you yourself have learned, what excites you.

I realize that I am drawing on my own personal experience to make a general point, and I hasten to note that I do not regard myself as an outstanding teacher. I always taught to the brightest and most loved working with the best athletes. But I have observed countless other teachers over the years, good and bad, and have tried to understand what made the difference. And as Director of an Honors Program I saw many a bright, aspiring teacher turn away from teaching because of the boring methods courses they were required to take. To repeat, teaching is an art, not a science. And if we want to start attracting the best and brightest students to the teaching profession, we need to admit that we cannot teach others how to teach.

Under Attack

I often wonder how many people outside the Academy realize (or care?) how severe the attack on Western Civilization is within the Academy, as students and faculty on a growing number of campuses across the country have determined that Western Civilization is the source of most of the world’s problems today. Indeed, I wonder how many people within the Academy are aware of the seriousness of the problem.

In a recent acceptance speech at the American Council of Trustees and Alumni annual banquet, one of the recipients of their “Philip Merrill Award for Outstanding Contributions to Liberal Arts Education,” Ms Ayaan Hirsi Ali, a Fellow at the John F. Kennedy School of Government at Harvard, paints a bleak picture indeed. She cites a battle at Stanford University in 2016 in which a group of students sought to reinstate a course requirement in “Western Civilization” that had been eradicated 25 years earlier. The attempt was overwhelmingly rejected by the student body.

“In the run-up to the vote, one Stanford student [a young woman in this case] wrote in the Stanford Daily that ‘a Western Civ requirement would necessitate that our education be centered on upholding white supremacy, capitalism, and colonialism, and all other oppressive systems that flow from Western civilizations.'”

The ignorance of this student’s comment beggars belief and, sad to say, it is a view that many think is shared by the majority of students (and faculty) on today’s campuses. Let’s take a look at this comment.

To begin with, one course requirement would not result in an education “centered” on Western Civilization. This is what logicians call a “straw man,” and it is a fallacy. The young lady would know this if she knew more about Western Civilization, since logic was first formalized by Aristotle and later refined by the Scholastics during the Middle Ages. In any event, even if the course were required, it would not comprise the whole of a student’s study for his or her four years. Moreover, there is no reason to believe that there could not also be a requirement in “Eastern Civilization” as well. But, more to the point, the comment passes over the benefits of Western Civilization that this student has chosen to ignore — if, indeed, she was aware of them. I speak of such things as women’s equality, the abolition of slavery, individual freedom, religious tolerance, and freedom of expression (which makes possible ignorant comments like that of the student in question). As Ms Ali points out:

“One cannot dismiss the sum total of Western Civilization without losing one’s moral compass. And one cannot participate meaningfully in the battle of ideas raging in the world today while dismissing the value of Western Civilization as a whole.”

While there are many things to note and regret about the baggage brought along by the folks who struggled to create what we call “Western Civilization,” and here we would have to acknowledge the half-truth hidden in the rhetoric of the Stanford student, we must insist upon a wider perspective and note the extraordinary beauty of Western art, the intellectual triumphs, and the moral gains (as noted above) that form the warp and woof of Western Civilization. Perspective, when speaking of such a large issue, is essential. And this student has lost hers entirely (if she ever had it to begin with). To take an obvious example, capitalism, for all its faults, has made it possible for this particular student to attend one of the most prestigious universities in the world. She bites the hand that feeds her.

As one who has read, taught, and defended the Great Books of the Western World, I have an obvious bias against this sort of blanket condemnation. But even if this were not the case, the intolerance built into the ignorant comment by this student would be disquieting. After all, college is a place where one broadens one’s mind, not shrinks it — ideally. And the comment reflects the growing attitude on many college campuses across the country that results in the exclusion of certain “types” of speakers from appearing on campus, because they represent views that are regarded as unacceptable. This includes Ms Ali, who was denied access to Brandeis University by militant students and faculty after initially being invited to speak about the crisis within Islam and receive an honorary degree. It is an attitude that has also resulted in the prohibition against saying certain words or thinking certain thoughts, an attitude that reflects a fascist approach to education — if this is not, in fact, a contradiction in terms. The “battle of ideas” requires that we keep an open mind.

My concerns are obvious to anyone who has read any of my blogs. But I do not think they are misplaced or even exaggerated. Higher education is supposed to be a place where students do not necessarily learn certain things but learn to use their minds to determine which things are worth knowing and which are not. And a blanket condemnation of the whole of “Western Civilization” by a group of students at Stanford University who, we may assume, know little or nothing about that which they reject, is nothing short of presumptuous, if not arrogant. And the fact that the faculty at Stanford did not take the lead in determining which courses were to be required in the first place is also to be regretted, but not surprising in an age in which the students and the children are mistaken for those who should lead rather than follow. And here we have a graphic example of why they should not be allowed to lead.

Indoctrination

Readers of my blog are fully aware that I am somewhat fixated on the topic of education — what it is and what it is not. In reading Jean-Jacques Rousseau’s notions about education (he is the author of Emile, one of the supposedly great works on education), I found myself disturbed by his confusion of education with indoctrination. It made me reflect on the fact that we tend to fall into the same confusion — though we would be reluctant to admit it. After all, who would agree to pay teachers to indoctrinate their children rather than educate them? The answer should be obvious: most of us do (to a degree).

But back for a moment to Rousseau, who, among other things, did not believe that the children of the poor and disenfranchised should be educated. In his words:

“The poor man does not need to be educated. His station gives him a compulsory education. He could have no other. . . . Those who are destined to live in country simplicity have no need to develop their faculties in order to be happy. . . . Do not at all instruct the villager’s child, for it is not fitting that he be instructed; do not instruct the city dweller’s children, for you do not know yet what instruction is fitting for him.”

The sort of “education” that Rousseau recommends for the remaining few is most interesting:

“It is education which must give souls the national form, and so direct their opinions and their tastes that they are patriots by inclination, by passion, by necessity. A child, on opening his eyes, should see his country, and until he dies he should see nothing but his country.”

These two comments are worth considerable reflection. They both raise red flags, for different reasons. The first quote focuses on Rousseau’s conviction that some people (most people?) cannot be educated. The hero of his book, Emile, was the privileged son of a wealthy father and was privately tutored. Rousseau simply took for granted that the children of poor villagers could not be educated and that any attempt would fail. This is interesting because we are, as a society, committed to the notion of universal education, the notion that all are educable and “no child should be left behind.” Unfortunately, as it happens, this is not true. To an extent Rousseau is correct. Not all children are educable. Take it from me! But it is impossible to state a priori who is and who is not educable, and therefore the opportunity should be made available to all. The notion that all children can be taught something by good teachers is a stronger position, though teaching children “something” does not necessarily mean they are educable.

This leads to the notion of indoctrination, which is clearly implied in Rousseau’s second comment above. So much of our teaching is directed toward teaching children “something” rather than teaching them how to use their own minds to determine which “somethings” are worth knowing and which are worth ignoring altogether. In point of fact, much of what passes for education in this culture is really job training, teaching the young those skills that will enable them to make a living. This is assuredly not education; it is indoctrination by another name. And there are those among us who would insist that the sorts of flag-waving that Rousseau recommends should be taught as well. In a word, we ignore the fundamental distinction between education, training, and indoctrination. These are not at all alike, and while training may be advantageous to all, education ought to be available to all; but, as Robert Hutchins said long ago, we have never really made the effort. We are satisfied if the kids can get a job after they graduate, whether they are able to use their own minds or not. And were the schools to buy into the sort of brain-washing that Rousseau recommends, it is fairly certain that a great many parents would rejoice.

In brief, we need to be clear in our minds just what it is we are talking about when we talk about “universal education.” If we really believe in it, we should embrace the concept fully and make it available to all — and not settle for indoctrination or job training. A democracy, as I have said on numerous occasions, requires an educated citizenry. It was the assumption of the Founders that all who voted would be aware of and concerned about the common good and also that they would be “schooled” to the point where they could distinguish the worthy candidates for public office from the frauds. Recent experience has proven that a great many of our citizens do not exhibit “social virtue” and cannot vote intelligently, and this should make us more determined than ever to insist that teachers focus on enabling all of their students to use their own minds and not settle for anything less.

Popular Culture

I have written recently about how the movements that begin within the hallowed halls of academe tend to find their way outside those halls, much like a scientific experiment gone wrong in a science-fiction movie. The most recent example of this is the notion of “alternative facts,” which is almost certainly the bastard offspring of the postmodern movement, born in Germany and France and now in the ascendancy in American universities, which stresses such things as the denial that there is any such thing as truth.

One of the heads of this movement, which would reject all “modern” academic courses of study in history, literature, philosophy, and sociology, is what is called “popular culture.” This is the study of such things as movies, television shows, comic books, and the like. This movement, in addition to insisting that history should be written without footnotes because it’s only a matter of subjective opinion anyway, has given birth to the following sorts of phenomena — as recently reported by the American Council of Trustees and Alumni:

• Rice University offers a first-year writing-intensive course titled “Star Wars and the Writing of Popular Culture.”

• Appalachian State University requires its freshmen students to take a first-year seminar to help them develop “creative and critical thinking abilities.” Seminars this spring include “Death (and Rebirth?) of the Hippie.”

• The English department at the University of Pennsylvania — an Ivy League School — offers a course on “Wasting Time on The Internet.”

And this is just a tiny sample at a time when a recent poll of college graduates revealed that:

• 34% could not identify correctly when Election Day is held.

• 25% could not identify Tim Kaine as a candidate for vice president of the United States.

• 50% could not name Franklin Roosevelt as the last president to win more than two elections to the presidency.

A number of colleges and universities now offer not only courses in Popular Culture, but also majors in that field as well as PhDs for those who want to go on to teach in that academic “discipline.” And, A.C.T.A. concludes, “When many of our colleges and universities treat popular culture and entertainment as subjects worthy of serious study, it surely isn’t surprising that so many college graduates can’t identify key civic leaders, events, and their significance.” Indeed.

So what? you might ask. The answer is, of course, that this is coming at a time when we need young people who can think, and who can think critically. The recent election should have proven how vital that is and how far short we are falling as a nation. In this regard, there are two major problems that lie at the heart of this movement. To begin with, courses in Popular Culture emphasize information at the cost of thinking about information. I shall return to that notion in a moment. Secondly, the movement shoves aside other courses in the college curriculum that actually might help put young people in possession of their own minds, make them intelligent, critical thinking adults who can discriminate between a well-qualified candidate for president, say, and a complete fraud.

To return to the first point, it has been shown in tests conducted years ago that there are certain academic courses that help young people to think. This is reflected in tests such as the LSAT that students take in order to enter law school. Law requires critical thinking skills, and the fields that do well, it has been shown, are mathematics, economics, philosophy, engineering, English, foreign languages, chemistry, accounting, and history (in that order). The fields of study that score lowest on the LSAT are those that stress information and memorization. I shall not mention them out of respect for those who wasted their time and money earning degrees in those subject areas. But Popular Culture would certainly be at the top of that list if it had been offered at the time these studies were conducted.

The point is that the sorts of shenanigans going on behind the hallowed halls of academe have consequences for those who pay little or no attention to what is going on there. The graduates who have shown themselves to be badly informed about American history and government, and unable to think critically, grow in number, while those who cannot use their minds, filled as they are with drivel, increase accordingly, fostered by colleges and universities now being run as businesses, catering to the whims of their “customers.” And this at a time when our democracy desperately needs intelligent, well-informed, thoughtful citizens. Courses in such non-fields as “Popular Culture” are the sorts of things that guarantee that this will not happen.

 

The Blind Leading

I strongly opposed the appointment of Betsy DeVos as Secretary of Education. She is obviously unqualified, since she has no experience whatever with American public education. (And I hasten to note in passing that I attended public schools for the requisite 12 years.) In any event, my blogging buddy Jill was spot on when she noted in a recent post that the DeVos appointment appears to be a determined effort on the part of this Administration to dumb down America even further.

But, then, I reflected on one basic question: is the American public education system already beyond repair? Can it be saved? And that question took me to some very sad truths (not alternative facts, but actual facts). To begin with, there is the “Blob.” This is the name one theorist has given to the huge bureaucracy that controls public education in this country. I have first-hand experience with such a bureaucracy on a smaller scale in my years in public higher education in Minnesota. When I started teaching in this state in 1968, there were six state universities and one Chancellor who, with his secretary, oversaw the system from his office in St. Paul. By the time I retired 37 years later, his office took up an entire city block in St. Paul and was peopled by hundreds of drones who scurried back and forth issuing directives that necessitated more and more administrative positions at the (now seven) universities simply to keep up and issue their countless reports.

In the public schools the same situation can be found. In spades. There are innumerable functionaries at all levels who are paid large salaries out of money that ought to go to the teachers. Their job is to issue directives and determine policy, including curriculum, for the many schools in the various state systems. This is the Blob. The system as it now stands is top-heavy. There are far too many chiefs and not enough Indians.

Moreover, as I have mentioned on numerous occasions, teachers are paid slave wages, and this leads to the fact (as shown by several studies) that our teaching force in the public primary and secondary education system is drawn from the lower third or lower quarter (depending on which study you refer to) of our college students. The low salaries make teaching unattractive for many students, as do the “methods courses” prospective teachers are required to take. The pupils have been raised to think that successful people make large salaries, and since teachers make very little they must all be losers. They tend not to respect their teachers. And given that parents are too busy these days to raise their children, the schools are expected to do so — except that the teachers must discipline their charges with hands tied behind their backs by countless regulations laid down by the bureaucrats mentioned above, who worry about possible lawsuits and not about the pupils or the teachers. Teaching and the pupils are lost in the shuffle.

There are good teachers who have taken the required vow of poverty. No doubt about it. But studies all show that American public education is in a shambles, and the question of how it can be saved is a profound and perplexing one. It must start at the top, but at the top we find people who control the purse strings and who seem to regard their own positions as sacrosanct. Since they are at the top, they are first in line to receive funding. In my state the State University Board takes its portion after the legislative allocation comes down and then doles out what is left to the several universities, which are all told that since budgets are tight they will have to make draconian cuts — usually in the humanities and arts faculty. (Never in sports. But that’s another topic.)

The international comparisons with schools in other countries strongly suggest that things have gone from bad to worse. And a new start may not be a terrible thing. I realize that DeVos is not the brightest bulb on the tree and has no credentials whatever (which seems to be a trait among Trump’s appointments), but perhaps she will bring some new ideas to the job. If, for example, she were to eradicate, or even seriously injure, the Blob and dispense with certification requirements (including “methods” courses) while making it possible for young people to attend schools with bright, well-paid teachers, that might not be a bad thing. She is known to favor charter schools, for example, which are not in all cases a bad thing. Two of my grandchildren attend a charter school in the Twin Cities that teaches Latin and Greek along with logic, mathematics, and science. The curriculum is built around the original seven liberal arts, and the kids love the challenge and are getting a very good, free education — complete with homework, can you imagine?

In any event, it will be interesting to see what happens. I am much more worried, I confess, about what this president is doing to the E.P.A. and other regulating agencies than I am about this particular appointment, given the current state of public education. It could turn out to be a good thing if it results in a fundamental shake-up of a system that seems to be tottering and about to fall under its own weight.