Under Attack

I often wonder how many people outside the Academy realize (or care?) how severe the attack on Western Civilization within the Academy has become, as students and faculty on a growing number of campuses across the country have determined that Western Civilization is the source of most of the world’s problems today. Indeed, I wonder how many people within the Academy are aware of the seriousness of the problem.

In a recent acceptance speech at the American Council of Trustees and Alumni annual banquet, one of the recipients of their “Philip Merrill Award for Outstanding Contributions to Liberal Arts Education,” Ms Ayaan Hirsi Ali, a Fellow at the John F. Kennedy School of Government at Harvard, paints a bleak picture indeed. She cites a battle at Stanford University in 2016 in which a group of students sought to reinstate a course requirement in “Western Civilization” that had been eliminated twenty-five years earlier. The attempt was overwhelmingly rejected by the student body.

“In the run-up to the vote, one Stanford student [a young woman in this case] wrote in the Stanford Daily that ‘a Western Civ requirement would necessitate that our education be centered on upholding white supremacy, capitalism, and colonialism, and all other oppressive systems that flow from Western civilizations.'”

The ignorance of this student’s comment beggars belief and, sad to say, it is a view that many believe is shared by the majority of students (and faculty) on today’s campuses. Let’s take a look at this comment.

To begin with, one course requirement would not result in an education “centered” on Western Civilization. This is what logicians call a “straw man,” and it is a fallacy. The young lady would know this if she knew more about Western Civilization, since logic was first formalized by Aristotle and later refined by the Scholastics during the Middle Ages. In any event, even if the course were required, it would not comprise the whole of a student’s study for his or her four years. Moreover, there is no reason to believe that there could not be a requirement in “Eastern Civilization” as well. But, more to the point, the comment disregards the benefits of Western Civilization that this student has chosen to ignore — if, indeed, she was aware of them. I speak of such things as women’s equality, the abolition of slavery, individual freedom, religious tolerance, and freedom of expression (which makes possible ignorant comments like that of the student in question). As Ms Ali points out:

“One cannot dismiss the sum total of Western Civilization without losing one’s moral compass. And one cannot participate meaningfully in the battle of ideas raging in the world today while dismissing the value of Western Civilization as a whole.”

While there is much to note and regret about the baggage brought along by the folks who struggled to create what we call “Western Civilization” (and here we would have to acknowledge the half-truth hidden in the rhetoric of the Stanford student), we must insist upon a wider perspective and note the extraordinary beauty of Western art, the intellectual triumphs, and the moral gains (as noted above) that form the warp and woof of Western Civilization. Perspective, when speaking of such a large issue, is essential. And this student has lost hers entirely (if she ever had it to begin with). To take an obvious example, capitalism, for all its faults, has made it possible for this particular student to attend one of the most prestigious universities in the world. She bites the hand that feeds her.

As one who has read, taught, and defended the Great Books of the Western World, I have an obvious bias against this sort of blanket condemnation. But even if this were not the case, the intolerance built into this student’s ignorant comment would be disquieting. After all, college is a place where one broadens one’s mind, not shrinks it — ideally. And the comment reflects the growing attitude on many college campuses across the country that results in the exclusion of certain “types” of speakers from appearing on campus because they represent views that are regarded as unacceptable. This includes Ms Ali, who was denied access to Brandeis University by militant students and faculty after initially being invited to speak about the crisis within Islam and to receive an honorary degree. It is an attitude that has also resulted in the prohibition against saying certain words or thinking certain thoughts, an attitude that reflects a fascist approach to education — if this is not, in fact, a contradiction in terms. The “battle of ideas” requires that we keep an open mind.

My concerns are obvious to anyone who has read any of my blogs. But I do not think they are misplaced or even exaggerated. Higher education is not supposed to be a place where students merely learn certain things; it is a place where they learn to use their minds to determine which things are worth knowing and which are not. And a blanket condemnation of the whole of “Western Civilization” by a group of students at Stanford University who, we may assume, know little or nothing about that which they reject is nothing short of presumptuous, if not arrogant. And the fact that the faculty at Stanford did not take the lead in determining which courses were to be required in the first place is also to be regretted, though it is not surprising in an age in which students and children are mistaken for those who should lead rather than follow. And here we have a graphic example of why they should not be allowed to lead.

Am I Dreaming?

Lewis Carroll’s classics Alice In Wonderland and Through the Looking Glass focus on a perennial philosophical question first propounded by Bishop George Berkeley in the eighteenth century: whether the things we take to be real, material, and substantial are merely intangible “sorts of things” in the mind of God. We do not know what is real and what is merely apparent. Further, we cannot say at any given moment whether we are awake or dreaming, because there is no reliable criterion that enables us to distinguish the two states from one another.

Bishop George Berkeley
(Courtesy of Wikipedia)

In a conversation Alice has with Tweedledee and Tweedledum in Looking-Glass Land, we hear the following exchange, which follows their discovery of the Red King sleeping under a nearby tree:

“I’m afraid he will catch cold with lying in the damp grass,” said Alice, who was a very thoughtful little girl.

“He’s dreaming now,” said Tweedledee: “and what do you think he is dreaming about?”

Alice said, “Nobody can guess that.”

“Why, about you!” Tweedledee exclaimed, clapping his hands triumphantly. “And if he left off dreaming about you, where do you suppose you would be?”

“Where I am now, of course,” said Alice.

“Not you!” Tweedledee retorted contemptuously. “You’d be nowhere. Why, you’re only a sort of thing in his dream!”

“If that there King was to wake,” added Tweedledum, “you’d go out — bang! — just like a candle!”

“I shouldn’t!” Alice exclaimed indignantly. “Besides, if I’m only a sort of thing in his dream, what are you, I should like to know?”

“Ditto,” said Tweedledum.

“Ditto, ditto,” said Tweedledee.

Both of Carroll’s tales have a surreal quality, and throughout them Alice constantly wonders whether she is awake or just dreaming. This generates the pithy problem: how do we determine that we are awake? Berkeley was convinced we could not; that is, we cannot say just why it is that we know we are awake at any given moment and not dreaming. We may have strong feelings. Common sense insists that we are awake and not dreaming when we ask the very question. But the problem is HOW do we know this? We cannot distinguish dreams from reality with any certainty. And this is because any claim to knowledge must produce the criteria that make the claim knowledge and not a pretender.

If I claim that this computer before me is real, I can say I know it because I can see it and touch it. But how do I know I am really seeing it and touching it and not just dreaming that I am seeing it and touching it? As you can see, it’s a tough one! No one has really answered Berkeley satisfactorily in the many years since he suggested the paradox, and it is still out there. David Hume suggested that reality has greater “force and vivacity,” but this won’t work because many people have very vivid dreams, and for many people reality is a blur — especially if they are prone to the occasional tipple. So Lewis Carroll is having great fun with it in his Alice stories. Children’s stories, eh?? I don’t think so!

Carroll later wrestled with the problem in his book Sylvie and Bruno, in which the narrator shuttles back and forth mysteriously between real and dream worlds.

“So, either I’ve been dreaming about Sylvie,” he says to himself in the novel, “and this is not reality. Or else I’ve really been with Sylvie and this is a dream! Is life a dream, I wonder?”

If it is, perhaps we will all wake up soon and discover that this is so and breathe a sigh of relief. Otherwise this dream is a nightmare.

 

Inverted Consciousness

I wrote this post several years ago in an attempt to understand why folks seem to have become lost in the world inside their own heads. It is a topic I wrote a book about some years ago and one that continues to intrigue me. With a few adjustments to the earlier version, here is my attempt to understand the condition known medically as “Asperger’s Syndrome.”

“The Big Bang Theory’s” Sheldon Cooper has it. BBC’s Sherlock Holmes and Doc Martin have it as well. It’s all the rage these days. It’s called “Asperger’s Syndrome” and it is defined as follows:

“a developmental disorder resembling autism that is characterized by impaired social interaction, by restricted and repetitive behaviors and activities, and by normal language and cognitive development —called also Asperger’s disorder.”

Actually, “language and cognitive development” is often exceptional. But these folks have to be taught how to interact with others, because they are not fully aware of the other’s presence — except insofar as the other person accommodates or interferes with their own desires. They seem emotionally stunted, lacking a sense of humor and any instinctive reaction to other people’s feelings and the subtle nuances of human behavior.
I wrote a book about the phenomenon years before I had ever heard the word. I called it “inverted consciousness” and argued that it is a widespread cultural phenomenon resulting from the subject’s fixation on his or her own experience, an inability to see beyond that experience. For this person “the” world is “my” world. Paintings and music are beautiful or ugly because the subject likes or dislikes them; behavior is right or wrong because it pleases the individual or fails to do so; all opinions are of equal merit — there is no such thing as truth or even expertise. I maintained that there are degrees of this disorder, from the extremely inverted consciousness of what I now know as Asperger’s down to the occasional or intermittent inversion. It is usually found in men, though I know of a woman or two who have it. My sense of it is that women are, as a rule, more empathetic and compassionate than men, and those qualities do not live comfortably alongside a condition that blinds the person to the fact that there are others in their world — except insofar as the others serve their own purposes. That sounds sexist, but I still think there are important differences between men and women, and in this case women are being complimented: Asperger’s is very unattractive. However, I apologize in advance to any readers who find this differentiation offensive!
As I say, I do regard the condition as widespread in our culture, and I took my cue from Ortega y Gasset, who noted the symptoms in Europe in the 1930s and wrote about them in describing Mass Man in his classic The Revolt of the Masses. Defining “barbarism” as simply “the failure to take others into account,” Ortega was convinced that Europe was then on the brink of a new barbarism, an age in which people would become more and more removed from one another and “hermetically sealed” within themselves. World War II soon followed, interestingly enough.
Describing this type of person, Ortega said at the time,

“The innate hermetism of his soul is an obstacle to the necessary condition for the discovery of his insufficiency, namely: a comparison of himself with other beings. To compare himself would mean to go outside of himself for a moment and transfer himself to his neighbor.”

But he is incapable of that.
I am not sure what causes this phenomenon, but it does appear to be more and more prevalent. It seems apparent that our current President suffers from the condition to a high degree as well. I suppose our increasingly crowded living conditions, together with the almost constant bombardment of images and sounds around us, tend to turn our attention inward. In addition, the countless electronic toys that seem designed to discourage human interaction must also be considered. I recall vividly the photograph of a group of teenagers sitting in front of Rembrandt’s “Night Watch,” texting — totally unaware of the extraordinary painting nearby. Note how all of these devices turn the individual’s attention away from what is around them (he said, sitting alone at his computer).
In any event, I thought what Ortega had to say was a powerful message when I first read it, and I find it even more so today. If we are, indeed, “from birth deficient in the faculty of giving attention to what is outside [ourselves], be it fact or persons,” this is something we need to ponder seriously, since it suggests we are becoming increasingly isolated from one another — like Sheldon. And Sherlock. And Doc Martin — who are all funny up to a point, but also pathetic.

Photography As Art

The question of whether or not photography can be regarded as art is a very tough one. I have never addressed it myself. But I recently picked up a follower who does beautiful photography, and the question forced itself upon me: when does a photograph become a work of art?

To address this question, I will begin with Monroe Beardsley’s definition of the artwork, which he proposed in the 1980s in the last essay he wrote on the subject before he died. Beardsley told us then:

“An artwork is something produced with the intention of giving it the capacity to satisfy the aesthetic interest.”

Beardsley chose his words carefully. He stresses the “capacity” of the work to “satisfy the aesthetic interest.” The object may not, in fact, generate any response whatever. But the notion of aesthetic interest is particularly important. It contrasts with the sorts of interest we take from day to day in ordinary objects, interest that generates feelings of sentiment, fear, anger, lust, or whatever. The aesthetic response, when it occurs, is distinctive. It results from attention that is focused entirely on the object itself. Eliseo Vivas called this “rapt, intransitive attention” to the object. The object holds our attention to itself and does not let our minds or feelings wander off into memories, associations, or irrelevancies. When we look at a Norman Rockwell painting, in contrast, it conjures up all sorts of fond memories of past Thanksgivings, childhood pains, family gatherings, Boy Scouts, Girl Scouts — complete with a small dog at our feet. These are not aesthetic responses, and Rockwell characterized himself as an “illustrator,” not an artist. He knew whereof he spoke.

An artist can produce an object, a painting, sculpture, dance, musical composition, poem, or even a piece of driftwood that demands of us a “rapt, intransitive” response. We behold the object and we become lost in it. Our minds do not wander and strong feelings do not obtrude. We simply feel at one with the object. It approaches the religious experience described by mystics.

Now Beardsley doesn’t say the object always yields such a response. He says, rather, that it is the artist’s “intention” that it have the “capacity” to satisfy the aesthetic interest. He wants us to focus on the artist’s intention — to the extent that we can figure out what that was (and assuming that the artist herself even knew what it was at the time). We look at the object and we see forms, shapes, colors, and relationships that announce the presence of beauty. My new blogger friend is a photographer, and her blog contains a number of photographs that are clearly works of art. They are beautiful. They speak for themselves and need no explanation. In fact, any attempt at explanation is doomed to fail, because explanations involve discursive language, whereas the language of art is immediate, intuitive, and instinctive. Works of art do not seek to evoke nostalgia, memories of long walks in the woods, or lost moments in childhood. None of these things is present as we simply look and our interest is absorbed by the photograph itself. That distinctive, decidedly aesthetic response is the only one that seems appropriate.

I do think there are photographs that rise to the level of art. In days long gone, the photographer was able to control the finished product as it developed by altering the time spent taking and developing the photograph, changing chemicals, and so forth. In cases such as the photographs of Ansel Adams, for example, the results (even in black and white) were clearly works of art.

Today, with digital cameras — and iPhones! — the artist requires an eye for composition, color, shadow, subject-matter, and the subtleties of form. One can simply point and shoot with a PhD camera (“push here, dummy”) and relish the shot of one’s self (!) or friends, or the lovely spot where we saw the eagle soaring in the deep blue sky. But only the artist is able to capture the moment when all of the pieces fit together and the finished product speaks for itself.

It can happen by accident, of course. On a recent trip to the North Shore with my wife’s niece and her brother, for example, our niece took a photograph of her brother sitting on some rocks at the shore of Lake Superior in the evening, with the moon shining on the lake. It was a work of art. The photographer captured just the right moment, when everything was aligned, and the finished product demands our complete attention. It is a truly beautiful shot.

The Shores of Lake Superior at Dusk

Yes, photography can become art. But it usually falls short, because not all of us see with the artist’s eye, and it is difficult even for an artist to capture the precise moment when all things come together; and, of course, there is never any guarantee that those who view the photograph will have an aesthetic response. In a world flooded with images, sounds, and diversions, there are only rare moments when we are willing to take the time simply to look and appreciate, to allow ourselves to get lost in the picture. It takes imagination, time, patience, experience, and sensitivity. And these things are becoming less and less common. But artists are still among us, and they paint, they sculpt, they dance, they play. And they take photographs.

Thank goodness!

The Patient

There he sits, over there near the window. He looks a bit pale and not full of vim and vinegar, as they say. We had some tests done last week and the doctor wants to run some more next week. In the meantime, he tells us to humor the boy and give him what he wants. Don’t force him to eat anything he doesn’t want to eat and try to keep him as calm as possible. They don’t know what the problem is, but they know that he needs to be cared for, and we plan to do just that. After all, we love him and want him to be happy. If only we could find out what is wrong with him. We ask, but all he does is mumble and play with that damned iPhone, telling us he needs to keep up with what’s going on with his friends. Sometimes I just wish we could snatch that thing and throw it in the garbage. But I suspect he would be right in there after it! Sick or not sick, he always has that thing in his hand and pretty much ignores us. But we love him, and the doctor says to do what he wants, so we will continue to humor him.

The patient, of course, is America’s child. And he is not physically sick, but whatever the problem is we aren’t going to solve it by humoring him.

Indoctrination

Readers of my blog are fully aware that I am somewhat fixated on the topic of education — what it is and what it is not. In reading the notions about education of Jean Jacques Rousseau (the author of Emile, one of the supposedly great works on education) I found myself disturbed by his confusion of education with indoctrination. It made me reflect on the fact that we tend to make the same confusion — though we would be reluctant to admit it. After all, who would agree to pay teachers to indoctrinate their children rather than educate them? The answer should be obvious: most of us do (to a degree).

But back to Rousseau for a moment. Among other things, he did not believe that the children of the poor and disenfranchised should be educated. In his words:

“The poor man does not need to be educated. His station gives him a compulsory education. He could have no other. . . . Those who are destined to live in country simplicity have no need to develop their faculties in order to be happy. . . . Do not at all instruct the villager’s child, for it is not fitting that he be instructed; do not instruct the city dweller’s children, for you do not know yet what instruction is fitting for him.”

The sort of “education” that Rousseau recommends for the remaining few is most interesting:

“It is education which must give souls the national form, and so direct their opinions and their tastes that they are patriots by inclination, by passion, by necessity. A child, on opening his eyes, should see his country, and until he dies he should see nothing but his country.”

These two comments are worth considerable reflection. They both raise red flags, though for different reasons. The first quote focuses on Rousseau’s conviction that some people (most people?) cannot be educated. The hero of his book, Emile, was the privileged son of a wealthy father and was privately tutored. Rousseau simply took for granted that the children of poor villagers could not be educated and that any attempt would fail. This is interesting because we are, as a society, committed to the notion of universal education, the notion that all are educable and that “no child should be left behind.” Unfortunately, as it happens, this is not true. To an extent Rousseau is correct: not all children are educable. Take it from me! But it is impossible to state a priori who is and who is not educable, and therefore the opportunity should be made available to all. The stronger position is that all children can be taught something by good teachers, for teaching children “something” does not necessarily mean they are educable.

This leads to the notion of indoctrination, which is clearly implied in Rousseau’s second comment above. So much of our teaching is directed toward teaching children “something” rather than teaching them how to use their own minds to determine which “somethings” are worth knowing and which are worth ignoring altogether. In point of fact, much of what passes for education in this culture is really job training, teaching the young those skills that will enable them to make a living. This is assuredly not education; it is indoctrination by another name. And there are those among us who would insist that the sort of flag-waving that Rousseau recommends should be taught as well. In a word, we ignore the fundamental distinctions among education, training, and indoctrination. These are not at all alike, and while training may be advantageous to all, education ought to be as well; but, as Robert Hutchins said long ago, we have never really made the effort. We are satisfied if the kids can get a job after they graduate, whether they are able to use their own minds or not. And were the schools to buy into the sort of brain-washing that Rousseau recommends, it is fairly certain that a great many parents would rejoice.

In brief, we need to be clear in our minds just what it is we are talking about when we talk about “universal education.” If we really believe in it, we should embrace the concept fully and make it available to all — and not settle for indoctrination or job training. A democracy, as I have said on numerous occasions, requires an educated citizenry. It was the assumption of the Founders that all who voted would be aware of and concerned about the common good, and that they would be “schooled” to the point where they could distinguish the worthy candidates for public office from the frauds. Recent experience has proven that a great many of our citizens do not exhibit “social virtue” and cannot vote intelligently, and this should make us more determined than ever to insist that teachers focus on enabling all of their students to use their own minds, and not settle for anything less.

Religion and Morality

It has always struck me as odd that those of a liberal political persuasion are frequently, if not always, averse to any talk about religion or morality — especially religion. I suspect it has something to do with the historical record of religions, especially Christianity, in which the Church, as the embodiment of the religion, has shown itself to be intolerant and authoritarian, not to mention responsible for thousands of deaths. The Church decides what is right and wrong, and throughout its history it has been intolerant of those who would dispute its absolute authority on such matters as good and evil.

Dostoevsky had problems with this role the Church has played and pilloried it in his remarkable book The Brothers Karamazov. He was himself a deeply religious man, but he was also distrustful and suspicious of the Church and insisted that its claim to absolute authority on matters of ethics has threatened, if not removed altogether, the freedom that makes human beings human. In any event, I share his distrust of the Church as an institution and would follow him in insisting that religion be separated from the institution in which it finds itself housed, to wit, the Church. The two are not the same, by any means. Christ preached love; the Church, historically, has preached intolerance — as have so many of its followers.

And this brings us to the point I raised at the outset: why so many intellectuals have rejected the Church as well as the religion they often confound with the institution that houses it. I suspect it is all about tolerance, or the lack of same. As I have noted in past blogs, we hear again and again (and again) that we must not be “judgmental,” which is to say, we need to be more open-minded and tolerant of other ways of living and believing. But the notion of tolerance is a double-edged sword. On the one hand, we should tolerate other points of view — not blindly, not always accepting, but after thinking our way through them, listening and questioning, yet tolerant nonetheless. On the other hand, we should not tolerate, say, views that promote violence, hatred, and fear. In a word, we need to be circumspect but not refuse to make judgments (to be “judgmental”), acknowledging that we must remain open to the possibility that we do not have all the answers and that those very answers may come from the most unexpected sources — even from others whose opinions are diametrically opposed to our own.

There are certain things we come across in our lives that simply should not be tolerated. The insistence that we not be “judgmental” is simplistic nonsense, because it ignores those very actions that we not only should not but must not tolerate, namely those that violate another’s personhood or the universal principle of fairness that transcends all ethical systems. And these sorts of actions are precisely those that religions preach against. The tendency to turn away from religion and morality toward a relativism that insists all actions are somehow good simply because they are practiced by someone is wrong-headed, as I have noted in the past, because it makes impossible the judgment that some practices are quite simply wrong. Words like “right” and “wrong,” “good” and “evil” are not frightening. It is possible that in talking about these things we might become intolerant when we should remain open to other points of view. That is a mistake and something we should avoid at all costs; it is not, however, a necessary concomitant of searching for answers to complex moral issues. We should not be afraid to talk about those things that we and others do that are simply not right. If I see a young woman being attacked on a dark night, I should not tolerate such an action; I should instead intercede on her behalf. Intolerance may at times involve intervention, but it need not do so. The determination to be tolerant and not to interfere with the actions of others should not blind us to the fact that we, as humans, should never fear making judgments and that, at times, intervention may be necessary. Good judgment is the key.

In any event, it is not religion and morality that we should be wary of, but the reluctance to acknowledge that at times it makes perfect sense to be intolerant. And it always makes good sense to exercise judgment; it’s what leads to informed action rather than impulsive behavior.

The Welfare State

When President Franklin Roosevelt initiated steps to thwart the Depression his country was mired in, he cautioned against the real possibility that citizens would become dependent on the hand-outs the Federal Government was taking steps to provide. As he said at the time:

“Continued dependence upon relief indicates a spiritual and moral degeneration fundamentally destructive to the national fiber. To dole out relief in this way is to administer a narcotic, a subtle destroyer of the human spirit.”

The possibility that Roosevelt alluded to had been noted years before by such intellectual giants as Nietzsche and Dostoevsky, who both saw “socialism” as a step toward the destruction of human freedom. Indeed, Dostoevsky thought socialism was the bastard offspring of the Church, which, by making moral decisions for mankind, had robbed people of their humanity. If the Church or the State takes care of people, they will stop taking care of themselves. And taking care of themselves involves struggle and, at times, suffering; these are the things that make us fully human. It is a dilemma: on the one hand, there are folks who desperately need help, and all who are able have a duty to care for them. On the other hand, this care can become a habit and rob those folks of the very freedom that makes them human.

Robert Kennedy, in a speech in 1966, echoed Roosevelt’s warning, adding that “higher welfare payments . . . often lead to lifelong dependency.” The problem is how to find a balance between meeting genuine human needs and creating a situation in which those who receive assistance become dependent on it and find themselves unable to take care of themselves. The obvious solution takes the form of assistance with strings attached, assistance that demands that those who receive it do so for a limited time and then fend for themselves, frequently referred to as “workfare.” Presumably this is what welfare reform is all about.

It’s not a Republican/Democrat sort of problem either, though there are Democrats who support all forms of welfare and there are Republicans who oppose all forms of welfare, which they see as hand-outs to lazy ne’er-do-wells. In a country that ponders spending billions of dollars building walls to keep “terrorists” out and spends more billions to build planes and ships that can travel the world with nuclear weapons tucked away in their bellies, the notion that spending millions to help those in need wastes our hard-earned money is truly ironic. And the notion that those in need are lazy is incredibly insensitive and wrong-headed. It is not the fact that millions are being spent on those in need that bothers so many people, however; it is the fact that they see those millions as better spent on building higher walls. Or they point to anecdotes about abuse of the system, about those who take without needing. In a word, we have a serious problem with perception and a loss of the sense of balance between what is being done and what should be done. And this in a nation that prides itself on its Judeo-Christian heritage!

Clearly, a wealthy nation such as the United States can afford to take care of those in need — whose numbers grow daily. The money that is spent elsewhere could be reshuffled easily to cover all costs. But the real problem is that those who receive this aid, regardless of how much money it turns out to be, must be enabled to take care of themselves. Many who receive welfare admit this and insist that their own self-respect depends on their eventually earning a living, taking care of themselves and their families — even if the income they earn turns out to be less than the money they are receiving on welfare!  The notion that these people are all lazy ne’er-do-wells is twisted and distorted — and self-serving. These are folks like you and like me who have come on hard times. The issue is not whether we spend some of our tax dollars to take care of those who desperately need it; the question is how we do this while still making possible the retention of the self-respect and the degree of human freedom that they require to go on with their lives and become healthy, productive citizens.

Regulations

We live in a time of ferocious de-regulation, as the Republican majority in both houses of Congress is in a positive tizzy to rid the country of those nasty regulations that have been interfering with the increase of profits for the very few. But there are regulations and there are regulations. Some are in the spirit of “mercantilism,” which sought to increase a nation’s wealth by regulating all of the nation’s commercial interests. Those are the regulations people like Adam Smith and Edmund Burke had in mind when they argued for a system of “free enterprise” that would increase human liberty and contribute to the common good. That was in the eighteenth century, the age of Enlightenment.

But there are regulations today that are designed to protect citizens from the dangers following upon the blind pursuit of profit that threaten the health and well-being of us all. Smith fought against “mercantilism” because the government at that time was intent on decreasing wages and expanding the pool of needy workers that would then be available to the wealthy who owned the factories. Smith argued vociferously for raising the wages of the working classes. The attacks of Karl Marx were also against the same propertied class in the name of the “workers of the world.” Today’s regulations that are designed to protect citizens from corporate abuse, not to mention the destruction of the planet that sustains us all, are of a different order and would most certainly not have been opposed by Adam Smith. One wonders about Edmund Burke who, while a student of Adam Smith, was also a more ferocious defender of the rights of the propertied classes — though he had some rough words for the “sophists, economists, and calculators” who perverted the true principles of economy by promoting policies that were inimical to the welfare of the country.

In any event, those who might refer to Smith or to Burke in pursuing the elimination of regulations might want to reflect on the intention of the two thinkers. Both were concerned about the liberty of all citizens, though Smith was primarily concerned about the liberty of the ordinary workers who were busily being exploited in his day by the mercantile classes — the owners of the means of production, as Marx would have it a century later. Smith was a compassionate thinker, a pillar of the British Enlightenment and firmly at the center of the Scottish Moral Sense School of philosophy, who was convinced that men would, left to themselves, do the right thing. His famous comment, often invoked in defense of free-enterprise capitalism, should be taken in the context of his entire thesis and of his earlier work in moral philosophy. When he says that

“It is not from the benevolence of the butcher, the brewer, or the baker, that we expect our dinner, but from their regard to their own self-interest”

we must pause and reflect that this is the man who regards human sympathy and benevolence as fundamental traits in the human soul. The pursuit of self-interest, in Smith’s view, would not conflict with benevolence or the well-being of others, because we are all, Smith thought, concerned not only about our own good but also the good of our fellow human beings. By pursuing self-interest we are at the same time pursuing the best interest of others. There is no conflict, in Smith’s view, because all humans want all other humans to be happy and well off. All, that is, but those who own the factories that employ humans at starvation levels. Smith fought hard to demand that the government, if it interfered at all, fund public education and work to promote policies that raise the wages of working men and women rather than reduce them, as many would do in his name today. The mercantile system that Smith criticized sought to direct the economy in the interests of national wealth and power, not in the direction of the ordinary worker. Thus, when he advocates free enterprise it is because he was convinced that, left to themselves, workers would care for one another and help the economy at the same time. He was convinced, for example, that educating the workers and raising their wages would increase productivity, improve worker morale, and increase profits, while at the same time making it possible for the workers to live fuller, richer lives. This is the free-enterprise system he advocated.

Smith would not have fought against the sorts of regulations that protect citizens today against the abuses of the large corporations that would poison the air and water. Nor would he defend the supposed “right” of mega-corporations to be deregulated in the name of increased profits. Certainly not if those actions were undertaken at the cost of increasing poverty for increasing numbers of men and women and of direct threats to their health and to the preservation of the planet on which we all depend — which they most assuredly are. Smith would never condone, for example, the sort of attacks on the Environmental Protection Agency we have seen of late. So those who invoke his name in defense of their attacks on regulations might do well to actually read Adam Smith’s book and pause to reflect on the long-term costs of their short-term thinking.  It’s not all about profits. It’s all about the common good. It was in Smith’s time and it still is in ours.

Rights and Responsibilities

One hears so much about “rights” these days that it might be a good idea to see if folks know what the hell they are talking about. When I hear the word it usually means something like “wants.” Thus, when Albert says he has a “right” to that parking space over there, what he means is that he wants it. I recently heard a man from Charleston explain that he hadn’t voted in the last election because he “had a right not to vote.” This is absurd. What he meant to say, as so many like him mean to say, is that he didn’t want to vote.

The notion of rights comes from the Enlightenment tradition that informed our own Constitution and was firmly in the minds of the founders of this nation as they worried about separation from the most powerful country on earth at the time. They were concerned about their rights, their human rights. The word has strong moral overtones and suggests, when properly used, that one is morally permitted a certain course of action. Thus, when I say that I have a right to free speech, the implication is that it is morally right that I be allowed to speak my mind and that others are morally bound to allow me to do so — as long as I don’t shout “Fire!” in a crowded theater, engage in hate speech, or promote civil insurrection (or tell lies with the intention to misinform the public).

In any event, rights imply a corresponding responsibility. Rights are one side of the coin; responsibilities, or duties, are the other. But we hear very little about the responsibilities that are intimately bound up with rights, because we have reduced the notion of rights to wants — and wants do not imply responsibilities. Again, the moral connotations are strong in the case of both rights and responsibilities. And in saying this I am speaking about what folks like John Locke and Thomas Jefferson regarded as human rights, the rights that every human being is entitled to simply because he or she is a human being. This contrasts with civil rights, which attach to membership in a specific polity and which can be taken away by those in power if we abuse them by breaking the law. The rights in our Bill of Rights are civil rights and are not absolute in any sense — even the Second Amendment, which guarantees the militia (not every Tom, Dick, and Sally) the right to “bear arms.”

Human rights, as Jefferson says, are “inalienable,” that is, they cannot be taken away. They can, however, be forfeited: if I ignore the corresponding responsibilities, I can be said to forfeit the rights I might otherwise lay claim to. If I kill someone, according to Thomas Aquinas, I forfeit my right to life and am therefore subject to capital punishment. I myself think this is simplistic, since it is not always clear whether a person has in fact killed another, and thus never clear when those rights can be said to have been forfeited. But the point is that no one else can take my rights from me. Or you. They are “inalienable.” The principle is quite clear.

Two things are important to keep in mind when speaking about human rights: (1) they are moral in nature. Those in power can take them from us, but they should not do so; no one should, because there is no moral justification whatever for doing so. The “should” here marks the moral nature of human rights. And this raises the second point: (2) rights have reciprocal responsibilities, in the sense that if I claim to have rights this implies that you have a (moral) responsibility to recognize those rights — and I to recognize yours, since we are both human beings. The only humans who can be said to have rights without responsibilities are children and the mentally infirm. In these cases alone, those who are not capable of recognizing their responsibilities still have rights because they are human beings. But with these rare exceptions (and these are debatable), all who have rights also have responsibilities, and if we ignore our responsibilities we can no longer lay claim to our rights. We might want to keep this in mind the next time we hear Albert shouting about his “right” to the parking space. There is no such right.