Self-Interest

I recall reading, years ago, a book in ethics that built an entire ethical system out of the notion of self-interest. This was not simple egocentricity, not raw selfishness. It was self-interest properly understood: enlightened self-interest. If I ask not “what do I want here and now?” but “what will I want in a few days’ time?” I begin to see what is in my true self-interest. I will mark the difference by putting caps on the notion when it is properly understood: Self Interest.

On a mundane level, Self Interest translates into “I will scratch your back because there may come a time when I need you to scratch mine.” Thus, if your car breaks down and you need a ride to the garage, I will take you there even though I was on my way to the Mall to buy an item I really wanted, since it is on special this week and the sale ends today. I really want to go to the Mall, but I realize that it is in my Self Interest to help you out, because there may come a time when I need you to help me out. Conscience may enter into it, or it may not. It may simply be a matter of calculation. But the end result is that I do the right thing. Similarly, if you make me really angry and I want to smack you upside the head, I realize that if I walk away you will still be my friend and we can continue to have fun together. It’s in my Self Interest to swallow my anger, simply walk away, and cool off.

A good citizen who is calculating his or her Self Interest will realize the need to vet each candidate carefully, get out and vote, and continue to keep an eye on the voting habits of the candidates of choice in order to determine whether they deserve to be reelected. He or she will pay taxes in the realization that taxes benefit the schools (whether that citizen has kids in the schools or not), help the state pay for road repair, support fire and police salaries, and keep up the public parks — all of which benefit each of us in the long run. (Even someone else’s kids will vote and act wisely in the future if they are well schooled, presumably.) In a word, Self Interest requires taking the long view, considering the consequences of actions, and asking the question: what will benefit me in the long term?

The owner of a factory who knows he can save big bucks by neglecting to put scrubbers on his factory’s chimneys takes the view of Self Interest and spends the money on the scrubbers because he realizes that doing so will improve the air quality, benefiting the health of those around him, including his employees, himself, and his family as well. Short-term profits are sacrificed for long-term benefits to a great many more people. And, in the end, these are the people who will continue to work for him and will buy his products. The long view always involves a sense that each of us is in the same boat as everyone else. It’s not just about me or you: it’s about all of us. What is good for each is good for all. It’s not rocket science, but it takes a bit of imagination and patience and a willingness to think before acting.

At the highest levels, of course, ethics demands that those who make the major decisions that indirectly affect us all adopt the perspective of Self Interest. It may be in my self-interest (lower case) to cheat on my taxes and save a few bucks, put pressure on my political cronies to get them to vote my way, cut health care because doing so will benefit the few who support my candidacy, fail to fill vacant federal judgeships that stand in the way of my political objectives, or eliminate regulatory agencies because they interfere with profits. But if I step back and take the perspective of Self Interest I realize that paying my taxes, cooperating with my political cronies (whether I like them or not), promoting universal health care, promoting a strong and healthy judiciary, and funding regulatory agencies that protect us all are in my Self Interest: they are in the best interest of all and therefore of myself as well. When we all benefit, each of us as individuals benefits as well.

This system is not the be-all and end-all of ethics, but anyone who seeks to follow the path will find that he or she ends up doing the right thing most of the time. It takes imagination and a willingness to set aside short-term desires for long-term benefits. But if each of us followed that path, our democracy would be a stronger and healthier political system, one that does, in the end, help to promote the Common Good — which was always the goal of a republican system of government.

Good Behavior

I taught ethics for many years. It was my area of primary study in graduate school; I wrote and defended a dissertation on the subject and later published a book trying to convince readers that one could think critically about ethical issues — one doesn’t simply have to go by hunches and gut feelings. But the thing I always found most difficult when teaching and thinking about ethical issues was how to close the gap between the determination of what is right and wrong and actually doing what one has decided is right.

For example, let’s say I live in a border state in the American Southwest. My government has decided to build a wall to keep the Mexicans out of this country, and I am aware that the local police randomly arrest Mexicans off the streets, whether they are here legally or not, and keep them locked up for days at a time. I fear for my family because I am aware that many of the people here from Mexico are poor and unable to find work; as a result I worry that they are likely to steal from me and possibly do us harm. It matters not whether these people actually pose a threat to my family: what is important here is the perception that they may, because that perception is my primary motivation. In any event, I know that from an ethical perspective the determination to keep “foreigners” out is wrong, as are the racial profiling and the false arrests. But I support the efforts of my government and the actions of the police because they seem to be a way to keep my family safe.

Note the conflict here. The ethical considerations are clear: the Mexicans have as much right to share our way of life, if they so choose, as we had (if not more) when we took this country away from the native people. Human rights are based on the capacity to make moral choices, according to Immanuel Kant, and the Mexicans have that capacity as surely as I do. So, on the one hand, I must recognize their rights while, on the other, I experience fear and suspicion of those who are different from me, and I support steps I know are wrong in order to keep my family safe. Here is the gap between what I know is right and my ability to act on that knowledge. In the best of all possible worlds, where everyone does the right thing, I would welcome the Mexicans to my town and make an effort to ease their transition to a new way of life. But this is not the best of all possible worlds. This is the real world, where people base their actions on perceived danger, real or not, and act out of ignorance or on impulse rather than on sound reasoning.

In my book I distinguish between justification, explanation, and rationalization in ethics. The first is the ability to find sound ethical reasons to support a claim. I know, for example, that the right thing to do in my example is to treat all humans, including “foreigners,” with respect. An explanation simply accounts for my determination to act as I do. I can explain my reluctance to welcome those who differ from me even though I cannot justify my actions: I fear for my family’s safety. And finally, I find it easy to rationalize my actions: it’s what everyone else is doing, so why shouldn’t I? Rationalization is an attempt to find bogus reasons for what we are inclined to do anyway. One would like to find sufficient justification for doing the right thing. But, as Dostoevsky noted in several of his novels, the problem is frequently not one of justification, explanation, or rationalization but of reconciliation — reconciling ourselves to the fact that at times we must do the thing we know is wrong.

In the end the gap is still there. I may know what is right, but I am unable to do it even though I can rationalize and even explain it. I cannot justify my actions from an ethical perspective. I know I am not doing the right thing. Knowing what is right and doing what is right are two entirely different things. How do we close the gap between thought and the real world, which, as Machiavelli tells us, is full of humans who are “ungrateful, fickle, liars, deceitful, fearful of danger, and greedy of gain”? In the end I have come to realize that this is not a philosophical problem; it is a psychological problem. Why do we find it so difficult to do the right thing?

Lying, Of Course

It started with advertising, I think — though I can’t be sure. I refer, of course, to lying. I don’t mean the occasional lie. I mean the chronic lie, lying as a matter of course. Selling the car to the unsuspecting customer by telling him that it was owned by an old lady and never driven over forty; selling the house without mentioning the fact that the basement leaks whenever it rains; insisting in the face of overwhelming evidence that global warming is a fiction.  I realize, of course, that people have always lied. But what I am talking about is the blind acceptance of lying as a way of life. It seems to have become the norm. Everybody does it, so it must be OK.

As one who taught ethics for forty-one years, I have a bone to pick with this sort of logic. Just because everyone does it (which is a bit of an exaggeration) does not make it right. In fact, the cynic in me is tempted to say that if everyone does it, it is almost certainly not right! From an ethical perspective it is never right to lie, not even in an extreme case, although in such a case one might plead expediency. But it is never right, not even the “little white lie” that we might tell about our neighbor’s hat in order not to hurt her feelings. I might tell the little white lie, but I must realize that it is not the right thing to do, strictly speaking. In this case it’s just the expedient thing to do, since hurting her feelings would be much more upsetting than simply telling her that her hat is lovely when in fact it’s perfectly awful. It’s the lesser of two evils, if you will. In any event, the little white lie is not the problem. The big black lie is the problem: it has become commonplace. And it is the fact that lying has become accepted behavior that is of greatest concern.

When my wife and I were babysitting our granddaughters some time back, I sat and watched several Walt Disney shows the girls seemed to like. The plots involving teenagers and their bumbling parents were absurdly simple, but they tended to focus on a lie told by one of the characters that generated a situation requiring several other lies to resolve. It was supposed to be funny. I was reminded of the “I Love Lucy” shows (which I did love) that were also frequently based on a lie Lucy told Ricky, generating a situation from which it took all of Lucy’s cleverness to extricate herself. I then began to reflect on how many TV shows generate humor in this way. These situations are funny, of course, as were the Disney shows, I suppose. But the point is that the lie was simply an accepted way of doing things. If you are in a tight situation, lie your way out of it.

On our popular TV shows, it’s not that big a deal. But when our kids see this day after day, it must send them the message that lying is simply the normal way of dealing with situations that might be embarrassing or uncomfortable. In any event, when lying becomes widespread and commonplace, as it clearly has in today’s world, it does become a larger problem. When Walmart claims it always has the lowest prices and has to be taken to court to reduce the claim to always having low prices, we become aware that the rule of thumb seems to be: say it until someone objects, and only after the courts have ruled will we make the change. In the meantime we will tell the lie and expect greater profits. And we all know politicians lie without giving it a second thought: whatever it takes to remain in a well-paid position requiring little or no work whatever.

As we listen to the political rhetoric that fills the airwaves and makes us want to run somewhere and hide, we realize that bald-faced lying has become commonplace in politics. Tell the people what they want to hear, regardless of the consequences. It’s all about getting the nomination and then winning enough votes to be elected. If those lies result in harm to other people, say people of another religion or skin color, so be it. Consequences be damned! It is possible to check the facts, of course, but very few bother to take the time, since if the lie supports the listener’s deep-seated convictions and prejudices it will readily be believed, true or false. And if it doesn’t, we simply stop listening. For example, one could simply search “FactCheck” and discover that the majority of Donald Trump’s claims are fabrications or blatantly false. But, then, truth does not enter in. We don’t seem to care much about that any more. Sell the house. Sell the car. Sell the political candidate. Whatever it takes. The end justifies the means.

This, of course, is utter nonsense.

The Tail of That Dog

I have written about the tail that wags the dog for many years, and while general awareness has increased, the problem isn’t any closer to being solved. I speak of the inordinate amount of money and time spent on athletics, especially in NCAA Division I schools, which seriously undermines the higher purpose of education. A recent article in Sports Illustrated about the scandal at the University of North Carolina focuses the issue nicely. The author, a graduate of UNC, turns his attention to the weakening of the academic program that is in direct proportion to the rise of the athletics programs at one of the most prestigious Division I schools. He raises the question: “How Did Carolina Lose Its Way?”

It is especially disturbing to see the problem growing in the face of the inordinate costs of athletics, reflected in the fact that public universities like UNC now spend three to six times as much on athletics per athlete as they do on academics per student. Even more remarkable is the fact that the average amount of money lost, I repeat, lost, on athletics among Division I public universities is $11.6 million each year. So the myth that athletics brings in the dough turns out to be just that, a myth — except for those schools at the top of the pyramid, including the University of North Carolina, where the cost of athletics has grown from $9.1 million in 1984 to $83 million last year, and where the cost to the university in the reduced quality of education is beyond rubies.

The problem doesn’t end with the cost of the athletics program at that university. It extends into the classroom as well. At UNC, where the recent controversy centers on the Department of African and African-American Studies, the main problem started to appear in 1993, the year that a woman by the name of Debbie Crowder began to run the AFAM department office. The SI story describes the program she initiated:

She began to devise “paper classes.” The “shadow curriculum” run by Crowder and department head Julius Nyang’oro “required no class attendance or course work other than a single paper, and resulted in consistently high grades that Crowder awarded without reading the papers,” the report said. (Crowder retired in 2009 and Nyang’oro was forced to retire in ’11). A disproportionate 47.4% of the enrollees in AFAM classes were athletes, mostly the football and men’s basketball players.

The problem at UNC also includes “special admits,” the alarming number of students who are admitted to the university with “rock-bottom SAT verbal scores of 200,” scores well below the acceptable level, coupled with the placement of those students very carefully into special classes designed to guarantee their success — at the university if not in later life. One is put in mind of parents who allow their child to continue to eat candy thinking they are doing the child a favor while, in fact, the child’s teeth are rotting out. In any event, as it happens, the problem at UNC goes beyond the AFAM program and included

“philosophy lecturer Jan Boxill, who was chair of the faculty and head of UNC’s Parr Center for Ethics (!), [who] was discharged last October for steering athletes into sham courses, doctoring students’ papers, and sanitizing an official report in an attempt to shield the athletics department from NCAA scrutiny. From 2004 to 2012, The Daily Tar Heel reported, Boxill also taught 160 independent studies — 20 in one semester. (The standard runs between one and three per year).”

Those independent studies courses, of course, were a joke. But apparently the fox was caught guarding the chicken coop! (A philosophy professor, I shudder to admit, and chair of an Ethics Center to boot!) And the problem extended beyond the playing fields and the gymnasium, as students across campus became aware of the “cake courses” being offered by various departments. According to the report, those taking Crowder’s “paper classes” numbered 3,100 students, the majority of whom were not athletes. This is not new — students will always find the easy courses to help their GPA — but it has simply grown by leaps and bounds at North Carolina, where some courses aren’t even real courses, and where many of them are encouraged by the demands of the athletics department.

Thus does the infection seep into the bowels of the university itself and infect the entire student body. A recent book by two professors at UNC, Cheated: The UNC Scandal, the Education of Athletes, and the Future of Big-Time College Sports, focuses attention on the problems at that university. As its authors put it, “We show pretty persuasively that it all started with easy-grade independent studies in the late ’80s for a handful of weak students on the men’s basketball team and mushroomed from there.” But as the SI article points out, the issue is broad and deep. Discussing the current situation with the new chancellor at UNC, where things have reportedly been put straight, the author of the article asks “. . . [whether] the money in college sports — at least $16 billion in TV contracts alone — [makes] ‘the right way’ impossible”? That is the $64 million question. In saying this, however, it is important to point out that it isn’t only at the University of North Carolina that these sorts of problems exist. They are becoming all too common, not to say prevalent. The tail is indeed wagging the dog.

The Cat In The Room

In a comment on a previous post I was trying to explain to my good friend Dana the various colors in ethics — black, white, and gray. In doing so I came to realize that I could be clearer about where I stand on the issue. And where I stand is not where many others stand, so it behooves me to make my position clear in case it might be close to the truth, as I like to think it is. The issue concerns the question of whether there is a right and a wrong in ethics.

The prevailing opinion as late as the medieval period was that there is a clear difference between the two, an absolute right and an absolute wrong. The Church, of course, knew the difference and if men and women were in a moral quandary they would simply ask the priest. And if he didn’t know he would refer to Church dogma. I think there are echoes of that conviction among church-goers today who still ask their parish priest or parson for advice when facing a moral dilemma. Many, however, came to regard this black/white position in ethics as leading straight to intolerance and a host of atrocities all in the name of ethical certainty. And it did. So for the most part the view of absolute right and absolute wrong has been tossed aside along with the Ptolemaic hypothesis about the neat arrangement of our finite universe. We are now living in a relativistic age and we tend to think that when it comes to ethics, at the very least, it is all a matter of opinion.

What I have tried to do is to carve out a middle ground between the two views, to insist that there is an absolute right and an absolute wrong — but that we do not know them absolutely. It is this last proviso that keeps us from the intolerance and even arrogance that often came with the supposed certainty that one was right about which side God was on in a war, for example, or whether heretics should be burned alive in an auto-da-fé. We pride ourselves on being more tolerant and, in the name of tolerance, ask the question “who’s to say?” when it comes to ethics. We then end up with a mishmash of conflicting opinions that cannot possibly all be correct. But I am convinced that this view leads us away from dialogue and the search for answers when it comes to ethical issues — especially since so many people are convinced there is no answer. Let me propose an analogy — one that will appeal to Dana. He’s a poet.

The search for the right answer in ethics is like searching for a black cat in a dark room with a blindfold on. I insist that there is a cat in the room — somewhere — whereas the prevailing view is that since no one seems to know where the cat is, it isn’t there at all. It’s just your opinion and mine: there’s really no cat. My conviction that there is a cat in the room rests on the fact that, in ethics, we have discovered a number of clear truths that are universally agreed upon, even though it has taken a struggle over many years (and even wars) to reach agreement. I speak of the evils of slavery and human sacrifice, for example, and the conviction that all persons have rights that ought to be respected, regardless of the circumstances. We know now that we were wrong for lo those many centuries to deny women the rights that men took for granted. We also know that in a democracy the vote should be allowed to all who are of age and must not be restricted to men with property. In fact, one could even argue that over the years there has been something akin to moral progress — for all our stupidity and our determination to reduce ethics to a wrestling match. It appears that when men and women put their heads together and think things through, they sometimes (rarely?) find the black cat in the dark room — despite the fact that the blindfold frustrates them and makes things extremely difficult and even painful at times.

The fact is that it is very difficult indeed to continue to search for that elusive cat. And this is why so many people simply give up and insist that it’s all a matter of opinion. We have become intellectually lazy. We prefer to save ourselves a passel of work, having decided that the difficult thinking is just not worth the effort. So many of us throw up our hands and ask “who’s to say?” It saves us the trouble of opening our minds and sifting through whatever evidence there is, scrutinizing arguments, and trying to reach even tentative conclusions. We prefer to think there is no cat. But I am convinced there is. We have held it from time to time, and that assures me that we might get hold of the cat every now and again, even if briefly. There are answers to ethical dilemmas. We just have to work hard to find them, and most often, because we are human, we must be content with reasonable suppositions and tentative conclusions, though at times certain ethical truths are clear as crystal: what the Nazis did to the Jews was wrong by any standard one chooses to invoke. Now there’s a black cat if there ever was one!

Discrimination

It wasn’t that long ago that discrimination was a desirable sort of thing. One learned about art, music, and wine in order to acquire a “discriminating” taste. One could, presumably, distinguish good wine, art, and music, separating them from the wannabes. But those times have passed. Much like the word “discipline,” which has acquired negative connotations, “discrimination” has become a nasty word, reflective of a determination to deny folks their inalienable rights. No one should be discriminated against, no matter what.

This is a classic example of a half-truth that has taken on all the force of an axiom in this culture. To be sure, there are cases in which discrimination is without grounds and ethically unacceptable — as when a black couple is denied an apartment, not because they lack references or are unable to pay the rent, but simply because they are black. And we know this happens; in fact, it happens more often than we like to admit. We don’t want to accept the fact that people could be that narrow-minded, but many are.

On the other hand, there are cases in which discrimination would appear to be the better part of wisdom. Consider the following case. You are interviewing candidates to broadcast the evening news, and a young woman appears with a pierced lower lip; she is unable to pronounce foreign names or read the teleprompter without squinting and considerable hesitation over two-syllable words. Bear in mind that as the person responsible you need to be aware of your audience, and your sponsors are certainly going to make sure you are. Your audience wants to see a pleasant face, someone who seems relaxed and is able to pronounce the names of the great many folks who make the news each night — not someone whose appearance is off-putting and who cannot seem to do her job. This would appear to be a legitimate instance of warranted discrimination, as opposed to unethical discrimination. You refuse this woman the job. You are discriminating. But you are discriminating not because she is a woman but because she will not be able to do the job required of her — much like a 98-pound man who is refused a position with a heavy-construction company because he cannot lift, as required, 200-pound bags of concrete eight or nine times a day on the job.

In a word, there are cases in which discrimination seems not only proper but warranted. It does not always raise ethical red flags. Those flags go up when the determination not to hire, let us say, is based on arbitrary criteria, such as gender, race, or creed — things that do not affect the person’s ability to perform the job at hand. And that seems to be the key: can this person do the job he or she is applying for? It would be wrong to assume that a woman, let us say, should not be hired for a job in heavy construction just because she is a woman. But if the job requires her to do things she is physically incapable of doing — not because she is a woman, but because she is simply not strong enough — then one would seem to be justified in turning her down for the job, assuming she is given the chance to show she can do the work and is not being dismissed on grounds of prejudice. The determination is not to be made a priori.

To return to our original point: discrimination is a key to a good education. One learns about good art, good music, and good literature. But one also learns what criteria are applicable when it comes to determining whether a person is fit for a job — or political office. A well-educated person is able to separate the relevant from the irrelevant, sound reasons and solid evidence from the bloat and rhetoric that issue forth from the mouths of so many political candidates. One learns how to discriminate against those who are incapable of doing the job they are asked to do, namely, to lead the country in times of great need. Discrimination is not always wrong: it is sometimes the sign of a person who is well informed and able to make sound judgments. The key is to know when discrimination is ethically wrong and when it is central to a well-reasoned argument — when the criteria applied are arbitrary and when they are pertinent.

Thick and Thin

One of the more interesting books I read in my checkered past was written by a sociologist. I say that because it is remarkable that a sociologist should have more interesting things to say about my own field, ethics, than most of the philosophers I have read since Immanuel Kant and John Stuart Mill. The author, Michael Walzer, begins with an anecdote and expands his argument into broader territory.

“I want to begin my argument by recalling a picture (I have in mind a film clip from the television news, late in that wonderful year 1989) . . . It is a picture of people marching in the streets of Prague; they carry signs, some of which simply say “Truth” and others “Justice.” When I saw the picture I knew immediately what the signs meant — and so did everyone else who saw the same picture. Not only that, but I recognized and acknowledged the values that the marchers were defending — and so did (almost) everyone else. . . . How could I penetrate so quickly and join so unreservedly in the language game or the power play of a distant demonstration?”

Imagine, I might add, that we are sitting in our living room watching the news and are confronted by a story about some folks on the other side of the world who are taken from their homes at night and locked up without a trial, never to be heard from again. Despite the fact that this is happening in another part of the world, we would not hesitate to judge that it is wrong. Walzer calls this “thin” morality — a few basic principles (he focuses on justice) that are binding anywhere and at all times. He makes a strong case, since any child can tell when injustice has reared its ugly head: just give one of them a smaller piece of birthday cake than their sibling! “It’s not fair,” they will shout. And since justice is essentially a matter of fairness, none would really argue with the child. That is the nature of thin morality: it is straightforward and compelling to any open mind.

Of course, when it comes to morality we are not dealing with open minds. In this egalitarian age, where all are regarded as equal in every possible respect and “discrimination” has become a nasty thing, we are admonished not to be “judgmental” and are asked repeatedly “who’s to say” what’s right and what’s wrong. Walzer argues that in the region of “thick” morality, namely the hundreds of mores that are peculiar to specific cultures, things are, indeed, relative. We don’t really care what the marriage customs are in far-off countries, how people dress, whether they shave their faces, or whether kissing is considered unacceptable in public. Nor should we. It’s none of our business. In fact, when it comes to thick morality, the only people in a position to judge are those actually living in the culture in question.

And this is where folks go wrong: they lump all of morality together, thick and thin, and draw the hasty conclusion that it’s all relative — relative to particular cultures or even to particular individuals. It’s part and parcel of our anti-intellectualism, which has fostered a deep distrust of experts and an unwillingness to acknowledge that some people know more than others and that some things are simply wrong. In itself, this may not be a matter of concern. But when we reflect that the war in Iraq, as an example, was undertaken by a small clique of small-minded people who were on a power trip and who refused to confer with known experts about the dangers such a war would invariably entail, we can see how this sort of blindness can lead to tragedy on a broad scale — thousands of lives lost and millions more displaced or out of mind. The war was wrong from the git-go.

In a word, ethics is not relative, and there are some who know more than others about the world and about which things might lead to catastrophe (and are therefore clearly wrong). I would only add to Walzer’s notion of justice as the central concept in “thin” morality the related concept of human rights, which seems a bit broader. It would rule out such things as lying to Congress and the rest of the country about so-called “weapons of mass destruction,” since we all have a right to the truth. In any event, human rights certainly include justice, since all persons clearly have the right to be treated fairly. This does not mean people are all the same, or that everyone knows as much as everyone else. It simply means that all persons are equal before the law and are entitled to be treated the same way. It is a “thin” precept so simple that a child can see it clearly.

It’s Not All Relative

During my time as a professor of philosophy I taught a great many ethics courses, including business ethics — which was actually one of my favorites: there are so many real-life incidents in business to discuss from an ethical perspective. But during all those years I continued to run up against a stone wall in the form of a mindless relativism. “It’s all a matter of opinion.” Or, “It’s all relative.” Or, “We really shouldn’t be judgmental.” I came to understand that these sorts of responses were just a dodge allowing the students to avoid thinking about problems that are complex, do not admit of quantification, and require a modicum of objectivity. But I hit my head against that stone wall for years, and it gave me many a headache.

Thus, while I have blogged about this before, an article in the news jumped out at me today that simply demands comment. It was a Yahoo News story about a terrible incident in far-off Papua New Guinea:

CANBERRA, Australia (AP) — On a tropical island in Papua New Guinea where most people live in huts, a mob armed with guns, machetes and axes stormed a wooden house by night. They seized Helen Rumbali and three female relatives, set the building on fire and took the women away to be tortured. Their alleged crime: Witchcraft.

After being repeatedly slashed with knives, Rumbali’s older sister and two teenage nieces were released following negotiations with police. Rumbali, a 40-something former schoolteacher, was beheaded.

The standard response to such a story in one of my classes, had I brought it up, would have been something like this: who are we to judge whether that is wrong? It’s not our country, and we don’t know enough of the details of what really went on. In its shortened form it is the cliché “who’s to say? We haven’t walked a mile in their shoes.” It’s called “cultural relativism.”

The objections ring true, of course, but they are irrelevant. We haven’t walked a mile in their shoes — or even two yards. But we know enough from the article to make an informed judgment — subject to further correction if later information alters the ethical perspective. But at this point we can say with some assurance that even in a country on the other side of the earth, men coming into a home at night and taking four women suspected of witchcraft to be tortured and/or killed is simply wrong. That is to say, even though we have not walked in those shoes, the people who do walk in them are engaged in actions that cannot possibly be justified in a neutral court of rational appeal. And that is the test for all ethical claims: the neutral court of rational appeal. It is something like a jury, except that it has no formal status. But thinking persons anywhere read and assimilate the information provided and attempt to see both sides of complex issues and then render a judgment. Failure to do so would be morally irresponsible: indifference disguised as tolerance.

As I have said before, moral condemnation does not necessarily result in an invasion of another country — as though they were hiding weapons of mass destruction, for example. It simply means that when we read such a story we are appalled, thank our lucky stars we don’t live in such a country, and recognize that we have become enlightened enough to see that “witchcraft” is hardly grounds for decapitation and torture — or for anything much other than bemused indifference. But when concern over witchcraft leads to acts of violence and murder, it is simply wrong, wherever it may occur. When something is wrong, it is wrong whether it happens next door or on the other side of the world. All that is required is careful judgment, imagination, and a lively sensibility. This does not imply our cultural superiority; it simply implies that we have thought about the actions of those men and condemned them — just as we would if they had happened next door.

Big Mistake

A recent Yahoo News story underscores the stupidity of undertaking a military operation in Afghanistan:

KANDAHAR, Afghanistan (AP) — Militants killed six Americans, including a young female diplomat, and an Afghan doctor in a pair of attacks in Afghanistan on Saturday. It was the deadliest day for the United States in the war in eight months.

The violence — hours after the U.S. military’s top officer arrived for consultations with Afghan and U.S.-led coalition officials — illustrates the instability plaguing the nation as foreign forces work to pull nearly all their combat troops out of the country by the end of 2014.

The current plan is to withdraw the vast majority of the 66,000 U.S. troops, leaving a few thousand to train the Afghan forces, who are then supposed to restrain the insurgents. So far the plan doesn’t seem to be working very well, as this latest incident attests. The female diplomat mentioned above was on her way to a school to deliver some books the teachers needed for their classes. As this incident suggests, our entire involvement in Afghanistan has been one bloody mistake after another.

Our initial involvement in Afghanistan was part of George W. Bush’s plan to democratize that part of the world and, of course, to gain control of the oil fields in Iraq. When Barack Obama won the presidency for the first time, George McGovern, who had a PhD in history from Northwestern, warned Obama not to get further involved in that country. It is a hornet’s nest and has been for centuries. It brought the Soviet Union to its knees, and many think it was largely responsible for dealing the death-blow to what was left of the British Empire. The country has a history of internecine unrest and tribal hatred, and the latest chapter, written by the Taliban, is simply one of many that can be read over the graves of thousands of dead.

McGovern knew whereof he spoke, and Obama made a huge mistake in ignoring him and ratcheting up the war in an effort to bring stability to such a volatile country. We learn that every day to our chagrin. Obama seems to take the advice of his military advisers far too seriously. We should never have gotten involved in the first place, and every American death can be chalked up to the stupidity of those who think there can still be military victories and that they can determine the way the rest of the world lives.

This may strike the careful reader as inconsistent thinking on my part. After all, I am the champion of ethical judgment across cultural boundaries. I have insisted in a number of past blogs that it is our responsibility as moral agents to be aware of what is going on around us and to refuse to stop thinking at national borders when we become aware that wrong is being done. And clearly a great deal of wrong is being done in Afghanistan, where women, for example, are treated like chattel and people have little or no self-determination. But it is one thing to judge this to be wrong and quite another to send in troops that will simply exacerbate an already volatile situation. It is one thing to judge an action to be wrong and quite another to act to bring about change in a country that, history has taught us, simply doesn’t want things to be any different from the way they have been for as long as anyone can remember. Indeed, it is one thing to think and quite another to act: thinking should always take place; action is frequently ill-advised, as careful thought will attest.

Spurious Reasoning

This post is aimed at those among us who think everything is a “matter of opinion.” During my years of teaching that became one of my pet peeves — and I have many. There is such a thing as truth and there is such a thing as sound reasoning. The opposite of truth is falsehood and the opposite of sound reasoning is spurious reasoning. During the recent presidential race we were witness to innumerable examples of spurious reasoning. My favorite was: The economy is in the toilet. Sitting presidents are responsible for the economy. Barack Obama is the sitting president. Therefore Obama is responsible for the poor economy. There are two problems with this reasoning: it smacks of what logicians call “false cause,” about which I have blogged previously. And the second premise is false: the sitting president is not alone responsible for the weak economy. This raises the interesting question: how many politicians does it take to weaken the economy? I leave the answer to you.
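Laid out schematically (my own reconstruction of the reasoning, not anything the campaigners themselves spelled out), the argument runs:

P1. The economy is weak.
P2. The sitting president is responsible for the state of the economy.
P3. Barack Obama is the sitting president.
C. Therefore, Obama is responsible for the weak economy.

The form is tidy enough: if all three premises were true, the conclusion would follow. The trouble lies in P2, which smuggles in the false-cause assumption that one office-holder single-handedly steers the economy. An argument can be formally valid and still unsound when one of its premises is false, and that distinction between validity and soundness is precisely what spurious reasoning trades on.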

But my favorite example of spurious reasoning of all time occurred some years ago when a woman in Maine was shot and killed by a deer hunter while in her back yard hanging up the wash. After the trial, in which the shooter was found “not guilty” of manslaughter (!), one of the jurors was asked why he voted as he did. He answered: she should have known better than to have been in her yard during deer season. Now, the woman had recently moved to Maine from Ohio, so there might have been a bit of bias against “Westerners” — those who live west of the Maine boundary. But whatever the man’s reasoning process might have been, and I doubt there was any at all, his answer is most assuredly a prime example of spurious reasoning. It requires us to accept that a person’s being shot and killed is not an instance of “manslaughter,” which by definition it is. It also avoids altogether the ethical principle that one should not shoot at another person — even during deer season. In a word, it avoids the central issue altogether.

And this brings me to my main point: even in ethics, where the common notion is that everyone is “entitled” to his or her opinion, there are arguments and claims that are just plain silly, and opinions that are just plain stupid. Ethical arguments, in which we try to establish a conclusion by appeal to specific principles and relevant facts, can be sound or spurious — just like our reasoning in any other sphere of investigation. We need to separate the facts from the falsehoods and examine the reasoning critically, which is why critical reasoning is such an important part of everyone’s education. We rely on sound reasoning in nearly every endeavor we undertake, every day of our lives — and especially when we seek the “moral high ground,” or when we are deciding which candidate is best qualified for political office (which I suppose should be called the “moral low ground”).