Stupid!

One of the very few sit-coms I watch on the telly is “Young Sheldon,” the spin-off from “The Big Bang Theory.” It stars the truly remarkable child actor Iain Armitage and is in many ways more delightful (and funny) than its predecessor.

Young Sheldon is a nine-year-old Sheldon Cooper who likes to brag (even in Church with his Fundamentalist mother) that he doesn’t believe in God: he believes in science. This is amusing when it comes from the mouth of a small boy sitting next to an adult, but it is also a bit stupid. As Pastor Jeff tells Sheldon in an exchange they have in Church, even some of the most brilliant scientists believed in God — to wit, Albert Einstein and Charles Darwin. In another episode, Sheldon comes across Pascal’s wager, in which the brilliant mathematician argues that it is smarter to believe in God than to disbelieve, because if God exists the believers will be rewarded while the disbelievers will not. And even if God doesn’t in fact exist, those who believe will have lived better lives. This is a bit of a simplification, but you get the idea.
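For what it is worth, the structure of the wager can be sketched as a toy dominance argument. The payoff numbers below are my own illustrative stand-ins, not anything Pascal wrote down; the point is only that belief does at least as well as disbelief in every possible world.

```python
# Pascal's wager as a toy dominance argument.
# The payoffs are invented stand-ins for illustration only.
payoffs = {
    # (choice, world) -> payoff
    ("believe", "God exists"):    float("inf"),  # infinite reward
    ("believe", "no God"):        1,             # a well-lived life anyway
    ("disbelieve", "God exists"): 0,             # reward forfeited
    ("disbelieve", "no God"):     0,
}

def dominates(choice_b, choice_a):
    """True if choice_b does at least as well as choice_a in every world."""
    worlds = ["God exists", "no God"]
    return all(payoffs[(choice_b, w)] >= payoffs[(choice_a, w)] for w in worlds)

print(dominates("believe", "disbelieve"))  # True: belief dominates
```

The standard objections (can one choose to believe? which God?) are untouched by this sketch, of course; it captures only the bare payoff logic.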

In any event, young Sheldon, for all his intelligence, has committed the fallacy of bifurcation: either God or science, not both. But why not both (ask Einstein and Darwin)? Indeed, it is a bit stupid to insist, as so many intellectuals do, that there is only one way to know anything and that is the way of science. This, of course, is what has been called “scientism,” and I have written about it before; it commits the fallacy of poisoning the well. That is to say, it rules out the possibility that there are other ways of knowing and it ignores the uncomfortable fact that there may be things we simply cannot know — mysteries, if you will. This, too, is stupid. We have already encountered two fallacies in the minds of those who, like young Sheldon, insist there is only one way to know.

But it is equally stupid to ignore the findings of science, including medical science — such things as evolution and climate change, for example. Science can deliver us a great many truths that simply cannot be denied without being completely stupid. And perhaps it is because many people who identify themselves with religion insist that science is the work of the devil that intellectuals don’t want to acknowledge that there could be any semblance of truth in religion. This is guilt by association. Those intellectuals conflate religion, organized religion, and faith. This, too, is stupid — as Pascal would attest. But the fact is that a great many people who insist that faith is the only road to the Truth are as stupid as those who think science is that road. Either road requires a form of denial and an assumption that our way is the only way. There may, in fact, be many roads.

In a word, there are, as Hamlet tells us, a great many things in heaven and earth which we cannot explain with science. There are limits to human truth. But there is truth and it is available to those who are willing to search for it; while a little knowledge is a dangerous thing, the unexamined life is not worth living. And the start of that search begins with the acknowledgement that we do not know everything and may never know everything. Not in this life, anyway.

It may well be the case that we will only know the truth after we die. Heaven may consist of a world in which the Truth is revealed to us. And Hell, of course, may be a place where truth is denied and everyone tells lies, a world in which everyone makes everything up as they go along and in which there is nothing whatever that is solid and we are surrounded by incessant confusion and uncertainty — a world of Donald Trumps, if you can imagine.

In any event, I have no problem whatever accepting the very real possibility that I do not know everything and that there are things which I simply must accept on faith. But I also believe that there are things that are true, things that stand on a solid base of empirical evidence and intuitive truths that simply cannot be denied. In the end, though, there is only one certainty and that is that there is no absolute certainty. That much I do know.

Inverted Consciousness

I wrote this post several years ago in an attempt to understand why folks seem to have become lost in the world inside their own heads. It is a topic I wrote a book about a few years ago and one that continues to intrigue me. With a few adjustments to the earlier version, here is my attempt to understand the condition known medically as “Asperger’s Syndrome.”

“The Big Bang Theory’s” Sheldon Cooper has it. BBC’s Sherlock Holmes and Doc Martin have it as well. It’s all the rage these days. It’s called “Asperger’s Syndrome” and it is defined as follows:

“a developmental disorder resembling autism that is characterized by impaired social interaction, by restricted and repetitive behaviors and activities, and by normal language and cognitive development —called also Asperger’s disorder.”

Actually, “language and cognitive development” is often exceptional. But these folks have to be taught how to interact with others, because they are not fully aware of others’ presence — except insofar as the other person accommodates or interferes with their own desires. They seem to be emotionally stunted, lacking a sense of humor and any instinctive reaction to other people’s feelings and the subtle nuances of human behavior.
I wrote a book about the phenomenon years ago before I had ever heard the word. I called it “inverted consciousness” and argued that it is a widespread cultural phenomenon resulting from a fixation on the part of the subject with his or her own experience, an inability to see beyond that experience. For this person “the” world is “my” world. Paintings and music are beautiful or ugly because the subject likes or dislikes them; behavior is right or wrong because it pleases the individual or fails to do so; all opinions are of equal merit — there is no such thing as truth or even expertise. I maintained that there are degrees of this disorder, from the extremely inverted consciousness of what I now know is Asperger’s down to the occasional or intermittent inversion. It is usually found in men, though I know of a woman or two who have it. My sense of it is that women are as a rule more empathetic and compassionate than men, and those qualities do not live comfortably alongside a condition that blinds the person to the fact that there are others in their world — except insofar as the others serve their own purposes. That sounds sexist, but I still think there are important differences between men and women, and in this case women are being complimented: Asperger’s is very unattractive. However, I apologize in advance to any readers who find this differentiation offensive!
As I say, I do regard the condition as widespread in our culture, and I took my cue from Ortega y Gasset, who noted the symptoms in Europe in the 1930s and wrote about them in describing Mass Man in his classic The Revolt of the Masses. Defining “barbarism” as simply “the failure to take others into account,” Ortega was convinced that Europe was then on the brink of a new barbarism, an age in which people would become more and more removed from one another and “hermetically sealed” within themselves. World War II soon followed, interestingly enough.
Describing this type of person, Ortega said at the time,

“The innate hermetism of his soul is an obstacle to the necessary condition for the discovery of his insufficiency, namely: a comparison of himself with other beings. To compare himself would mean to go outside of himself for a moment and transfer himself to his neighbor.”

But he is incapable of that.
I am not sure what causes this phenomenon, but it does appear to be more and more prevalent. It seems apparent that our current President suffers from the condition to a high degree as well. I suppose our increasingly crowded living conditions together with the almost constant bombardment of images and sounds around us tend to turn our attention inward. In addition, the countless number of electronic toys that seem designed to discourage human interaction must also be considered. I recall vividly the photograph of a group of teenagers sitting in front of Rembrandt’s “Night Watch” texting — totally unaware of the extraordinary painting nearby. Note how all of these devices turn the individual’s attention away from what is around them (he said, sitting alone at his computer).
In any event, I thought what Ortega had to say was a powerful message when I first read it, and I find it even more so today. If we are, indeed, “from birth deficient in the faculty of giving attention to what is outside [ourselves], be it fact or persons,” this is something we need to ponder seriously, since it suggests we are becoming increasingly isolated from one another — like Sheldon. And Sherlock. And Doc Martin — who are all funny up to a point, but also pathetic.

The Now Generation

The psychiatrists who studied the American prisoners of war released after the Korean conflict were amazed at the success of the “brainwashing” techniques that were used on those men. Captured documents revealed that one of the secrets to that success was the claim of the North Koreans that Americans were generally ignorant of history, even their own. These young men could be told pretty much anything bad about their country and they tended to believe it because they had no frame of reference. For example, they could be told that in America children were forced to work in the coal mines and a couple of the men vaguely remembered hearing of this and were willing to embrace the half-truth and share it with their fellow prisoners. True, there were children working in the coal mines at one time, but no longer. It was precisely those half-truths that enabled the North Koreans to convince the ignorant young men of blatant falsehoods. Couple that treatment with censored mail that the prisoners received from wives and sweethearts complaining about how bad things were back home, not to mention the seeds of suspicion that were planted among the men that broke down their trust in one another, and you have a formula for success. There was not a single attempt by an American soldier to escape imprisonment during the entire conflict!

Today’s young people are equally ignorant of their history, perhaps even more so. We make excuses for these kids by moaning about how much “pressure” they are under. Nonsense! I would argue they are under less pressure than those young men who were fighting in Korea, or even the generation that followed them. Today’s young people need not fear the draft. Moreover, they are the beneficiaries of the sexual revolution and are therefore free from the restraint experienced by prior generations, who were told to wait for sex until they were married. In fact, they don’t seem to show much restraint about much of anything, truth to tell. And there is considerably less expected of them in school these days than was expected of their fathers and mothers. They are told they are wonderful: they feel entitled. So let’s hear no more about how much pressure they are under.

Now, social scientists — who would rank below even the geologists on Sheldon Cooper’s hierarchy of sciences, I suspect — love to label the generations. We have read about the “me” generation and the “millennials,” the “X” generation, and the “Y” generation. While I hesitate to lump myself together with the social scientists, I would nonetheless suggest that we call today’s young people the “Now Generation.” They, like their parents before them, don’t know diddly about their own history, much less world history. In fact, studies of recent college graduates have revealed an alarming number who cannot name the first five presidents of the United States, cannot recognize the Gettysburg Address, and don’t know who our allies were during the Second World War, or when the First World War was fought — or what countries it involved. Much ink has been spilled, along with weeping and gnashing of teeth, over these sad revelations, but very little of substance has resulted from all the angst. History is still not considered important in our schools or in this culture. As Henry Ford would have it: “History is bunk!”

Santayana famously said that those who are ignorant of their history are doomed to repeat its mistakes. This presupposes a cyclical view of history and is predicated on the notion that human beings don’t really change that much. Because events tend to repeat themselves — we seem to be constantly at war, for example — and humans have become increasingly locked in the present moment, ignorant of their own past, they will tend to fall into the same traps as their predecessors. On a smaller scale, every parent laments the fact that their kids don’t listen to them and seem determined to make the same mistakes their parents made twenty years before. History is a great teacher. But we have to read it, assimilate it, and take it to heart. We tend not to do that. History is not bunk, Mr. Ford, and we are certain to repeat the mistakes of previous generations if we continue to remain locked in the present moment, ignoring not only the past (from which we have so much to learn) but our obligations to the future as well.

So, I recommend that a more appropriate label for the present younger generation is the one suggested above. It is certainly true, as psychological and sociological studies have revealed, that today’s youth are addicted to electronic toys, immersed in themselves, uncaring, and seemingly unaware of the world outside themselves; the label “Me Generation” does seem to fit. But my suggestion is designed to expand the domain of the label to include not only the young, but their parents as well. We all need to read and study the past in order to avoid the traps and pitfalls that most assuredly lie ahead.

Bang You’re Dead!

Just when you think you’ve heard about the most absurd human behavior — a contest to see who can eat the most worms and cockroaches in order to win a python — you read a story like the one in a recent Minneapolis Tribune article that tells about a couple in a Minneapolis suburb who are making money from having created a “simulated killing of Osama Bin Laden experience.” (Seriously, I did not make this up!) Let’s start with a couple of paragraphs from the story itself:

Six AR-15 semi-automatic rifles are loaded with paint bullets. The Kevlar vests are 25 pounds of light body armor. And the nondescript industrial park in New Hope sits 6,900 miles from Osama bin Laden’s former compound in Abbottabad, Pakistan.

But the mixture of fear, adrenaline and smell of gunpowder was real enough to jump-start the heart rates of five mock Navy SEALs who cashed in Groupons for this simulated adventure that has transformed a firearms studio north of Minneapolis into a gung-ho war-game night out.

Eighteen months after a team of SEALs killed the world’s most-wanted terrorist, everyday folks like these guys can plunk down $150 for their own vicarious shot at Operation Geronimo.

The story attempts to make the project seem quite reasonable — giving a struggling business in a Minneapolis suburb a boost while helping people learn how to protect themselves under simulated conditions that make the adrenaline flow and the palms sweat, just like the “real thing.” Participants wear plastic body armor and use paintball rifles. But this is not Sheldon Cooper and his nerdy friends shooting paint balls at the pseudo-scientists in the geology department. This is supposed to simulate real life with real villains. The bottom line is that we have people pretending they are Navy SEALs, shooting cardboard cutouts of women who represent Bin Laden’s wives — “who might be carrying a bottle of kerosene” — or they might not. And, of course, there’s the fact that the real-life character they shoot in the end — a former police sniper disguised to look like Bin Laden — constitutes a racial profile if there ever was one. Needless to say, there are minority groups in the Twin Cities who are deeply disturbed by reports of these goings-on.

So we have several interesting moral issues here in the name of teaching people how to protect themselves: racial profiling; acceptable “killing” of persons who might well be innocent bystanders in the name of “self-defense”; and fostering aggressive impulses in ordinary people who have a spare $150.00 to blow on playing a war game, of sorts. One of the participants was Ben Leber, a former Minnesota Viking, whose wife bought him a gift certificate for his birthday. Seriously?

I have blogged before about our limitless appetite for distractions, which Aldous Huxley noted many years ago. And to be sure, we need distractions in our stressful world. But isn’t there a point when these distractions start telling us something deeply disturbing about ourselves? Just when you think you have heard the most bizarre example possible, you read about this newest attempt to give bored folks something to do in their spare time that will give them an adrenaline rush and make them feel like they have actually accomplished something important. Apparently, participants get so worked up they have trouble sleeping the following night. It does give one pause.

Mill’s Methods and Violence

John Stuart Mill wrote a book many years ago that very few people have read — except maybe his mother and his wife . . . and me (don’t ask me why). It is a book on inductive logic and scientific method. I learned a number of interesting things in reading the book. For instance, I learned that evidence and arguments “imply” conclusions; we “infer” the conclusion from the evidence. In a word, inference is something we do, whereas implication is something arguments and evidence do. Further, I learned that points cannot be “valid,” and neither can ideas — though Sheldon Cooper (on “The Big Bang Theory”) keeps insisting they are. Arguments are valid (or invalid), whereas points and ideas can be spot on, insightful, interesting, telling, or perhaps simply stupid — they cannot be “valid.” So you can see my time was not wasted. I dearly love Sheldon Cooper but am delighted to trip him up!

But I also learned something much more interesting, because in his book Mill explained his methods for determining causes. These rules are the “method of agreement” and the “method of difference.” I won’t go into detail, but the former tells us that if you want to know why a group of people got sick at a convention, for example, look for the common denominator — something they all ate or drank. Isolate the item and you can pretty much figure that’s the cause. Years ago it was determined that the cause of a large group of convention-goers getting sick was the ventilating system at their hotel. They all ate and drank different things, but they all breathed the same air.

On the other hand, the method of difference seeks to isolate the causal factor by looking at the one thing that is different in a group that exhibits some strange affliction. Let’s say we want to know why America is such a violent nation. Now we know that there is violence in other countries, but that violence pales in comparison with the frequent violence in this country. Why is that? Michael Moore made a movie (“Bowling for Columbine”) that attempted to determine the cause, and he concluded that it was probably (we can’t be sure) the frequent violence in our news broadcasts. Let’s examine his reasoning.

We begin by comparing and contrasting America with, say, Britain, Japan, and Germany and we look at what the countries have in common: they all watch violent TV, play violent video games, and watch violent movies (often American movies that are known for their violence). Moore thus ruled out those factors as causes of violence in America. What he found was that American news broadcasts are much more violent than the news broadcasts in other developed countries. So he suggested that violence in the news we watch is the likely cause of violence in this country.

This reasoning is sound as far as it goes. But we might just as well pick out coffee as the single factor that separates America from the other countries. In the other three countries tea is the drink of choice; Americans drink a lot of coffee. Or perhaps it’s widespread ownership of guns: Americans own more guns than most other people on earth — except for the Canadians, as I understand it. And the Canadians watch just as much violent TV, movies, and play violent video games. So it can’t be gun ownership. Perhaps it is the violent news programs, as Moore suggested. Or the coffee. Canadians also drink tea as their drink of choice, not coffee. I’m going with coffee.

But then, perhaps it is a combination of the factors listed above. We know that animals learn by imitation and that humans are animals. Further, we know that Americans watch a great deal of violence in their TV, video games, and movies — and their news programs. They also own a great many guns. And they also drink a lot of coffee. So, perhaps, the cause of violence in America is a combination of these factors: the many guns we own together with the violence we are exposed to plus the stimulation of a drink spiked with lots of caffeine. You never know! Damn! Causal reasoning is hard.
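Mill’s two methods boil down to simple set operations over “factor profiles”: the common-denominator method (Mill’s “method of agreement”) intersects the factors of the cases that show the effect, while the method of difference subtracts the unaffected cases’ factors from the affected case’s. Here is a toy sketch; the countries, guests, and factor lists are invented for illustration and follow the examples above only loosely:

```python
# Mill's methods as set operations over candidate causal factors.
# All data below is invented for illustration.

def method_of_agreement(cases):
    """Factors common to every case that exhibits the effect."""
    profiles = list(cases.values())
    common = set(profiles[0])
    for factors in profiles[1:]:
        common &= set(factors)
    return common

def method_of_difference(case_with_effect, cases_without_effect):
    """Factors present in the affected case but absent from every unaffected one."""
    candidates = set(case_with_effect)
    for factors in cases_without_effect.values():
        candidates -= set(factors)
    return candidates

# The sick convention-goers: they ate and drank different things,
# but they all breathed the same air.
sick = {
    "guest_a": ["fish", "wine", "hotel air"],
    "guest_b": ["steak", "beer", "hotel air"],
    "guest_c": ["salad", "water", "hotel air"],
}
print(method_of_agreement(sick))  # {'hotel air'}

# The violence example: what does the USA have that the comparison
# countries lack?
factors = {
    "USA":     ["violent TV", "violent games", "guns", "violent news", "coffee"],
    "Britain": ["violent TV", "violent games", "tea"],
    "Japan":   ["violent TV", "violent games", "tea"],
    "Germany": ["violent TV", "violent games", "guns", "tea"],
}
unique = method_of_difference(
    factors["USA"],
    {k: v for k, v in factors.items() if k != "USA"},
)
print(sorted(unique))  # ['coffee', 'violent news']
```

Note that the method of difference by itself cannot choose between the violent news and the coffee: both survive the cut, which is exactly why the reasoning above keeps running into trouble.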

And yet, there are politicians out there who think it’s easy. They say that the sitting President is the cause of the poor economy because he is the sitting president and the economy is weak. It’s like saying coffee is the cause of violence in America because Americans drink a lot of coffee. And it’s just as stupid.

“The Big Bang,” Science, and Ethics

I have blogged before about our need to make distinctions in order to be clear about what we say, and there is a key distinction that we frequently fail to make. That is where I am going now, with the help of a popular sit-com.

Science is not technology. Sheldon Cooper — of “The Big Bang Theory” — is a theoretical physicist. He is a pure scientist (or the character is). Like Einstein, he doesn’t care a whit for applied science (note Sheldon’s low opinion of Leonard in the show because the latter is an experimental physicist; the suggestion that he needs to conduct experiments to prove his theories makes Sheldon laugh — or snicker). As a general rule, however, scientists do not eschew experimentation. Indeed, there is an episode of “Big Bang” where Sheldon and Leonard collaborate and are asked to deliver a paper together. In any event, experiments are routinely conducted to verify theories in science, though at the highest levels of theoretical physics mathematics can sometimes suffice. Einstein didn’t need to conduct experiments to establish his theory of relativity, for example.

But there are other sciences, of course, both exact (like physics and chemistry) and inexact (like geology and biology), which rely on mathematics to a greater or lesser degree. And there are social sciences that mimic the exact sciences by using mathematics in the form of statistics — though their experiments and even their calculations are notoriously inexact, dealing in probabilities rather than certainties. But all of these sciences, exact or not, rely on the empirical method — looking and recording — and some sort of calculation. And all are desirous of knowing why things happen as they do. Why do objects tend toward the center of the earth? Why do blond parents have red-headed children? Why did the dinosaurs become extinct? Scientists want to know. That’s what they do: they look and they record, and they draw tentative conclusions that lead to theories that are in turn verified — or falsified — by experiments or new empirical data.

Technology, on the other hand, is not science for a number of reasons. Technology is all about “How?” and not “Why?” On “The Big Bang Theory,” Howard Wolowitz is the designated technologist. Because Howard has “only” a Master’s Degree from M.I.T. in engineering — which involves considerable math and physics — he is relegated on this show to designing toilets and telescopes for NASA — merely technical tasks. In a word, he figures out how to do things and he does them without asking why. In the case of toilets for the space station, the “why” is fairly obvious, but what about the “why” question as it regards the entire NASA endeavor? Few of us question that at all. In any event, the difference between science and technology was made clear in an episode of “Big Bang” when Penny’s car engine failed but the scientists could not fix it, even though they knew all about how internal combustion engines work — in theory.

As Jacques Ellul said many years ago, ours is a technological age: we tend to denigrate theory. We laugh at Sheldon, not just because Jim Parsons is a superb comic actor and the writers have given him some juicy lines, but because he is a theoretician in a world in which, strange to say, Howard Wolowitz is much more at home, much more like the rest of us. Like Howard, we don’t seem to care about why things happen as they do, we just keep doing what we are doing and worry about the consequences later on when they become another problem to be solved. And we are convinced someone can solve it regardless of how complicated it might be — a dangerous assumption indeed.

Interestingly, what neither the scientist nor the technologist ventures into is the ethical implications of what they do. Thus, we have theoretical physicists who work together to develop the atomic bomb. Or we have medical technologists who conduct experiments to determine whether certain cosmetics will blind rabbits without asking whether or not this is the right thing to do. We have medical researchers who give placebos to cancer patients as part of an experiment. There’s a wonderful scene in one of the “Big Bang” episodes where Penny asks the guys why they rigged their computer so it could turn on the light by sending signals around the world; they respond in unison: “because we can.” Note that even Sheldon chimes in. Indeed. That is our society’s answer, and we are content with it — until a crisis arises that we simply cannot fix because we failed to look deep enough or far enough — or ask “why?” As Ellul suggests, it is precisely the failure to inquire into the moral and theoretical implications of what we do that gets us in trouble. And some of it is deep trouble indeed.

Sheldon’s Problem

In one of my favorite episodes of “The Big Bang Theory,” in which Laurie Metcalf plays a large role, Sheldon Cooper has had a miserable day because his Mom spends the day with his friends. She would rather do that than go to a lecture with him and listen to him trip up the Nobel Prize-winning speaker. She tells Sheldon she would rather go to Hollywood so she can talk to a wax statue of Ronald Reagan and thank him for his service to his country! At the end of the day Sheldon has caught a cold and lies in bed while his Mom rubs Vaporub onto his chest and sings “Soft Kitty.” Near the end of the scene Sheldon tells his mother that he regrets that he hasn’t been able to spend time with her on this trip. His mother asks, “And whose fault is that?” Sheldon replies, “Yours.” Funny, yes. But also a bit sad. In Sheldon’s world it is always the other person’s fault. Increasingly this seems to be the same world we all live in: when things go wrong, it is always someone else’s fault.

Edith Wharton’s novel The Custom of the Country features a heroine, Undine Spragg, who marries Ralph Marvell, a man with little money but “Old New York” family connections. The marriage goes terribly wrong because Undine cannot control her spending and her husband, like her parents before him, can’t say “no.” Undine, like Sheldon, blames her husband: the failed marriage couldn’t possibly be her fault. And that is the problem: Undine is a self-absorbed, spoiled child. No one has ever denied her anything. And despite the fact that these are fictional characters, they both reflect a common feature of our culture.

We may not have Sheldon’s problem — which would require extensive psychological counselling. But more and more of us have Undine’s problem: we are spoiled rotten and our attention is turned inward. I suspect there is a connection here. I have spoken about the unwillingness of so many of us in this culture to accept responsibility for our actions. The problem is widespread. But while I have never discussed the possible causes, I think there is a definite connection between our increasing preoccupation with ourselves and our unwillingness to accept responsibility for our actions.

As we become increasingly self-centered, others are reduced to a means toward our personal ends. Those around us (especially our parents and teachers) confirm this preoccupation with self by gratifying our every wish and telling us how wonderful we are, and our sense of entitlement grows. We are thus subjected to ego enhancement at every turn. As the ego becomes enlarged and the world becomes our world, it becomes harder and harder to accept that we may have caused the things that go wrong in that world. It simply cannot be our fault: we are too wonderful. It must be someone else’s fault.

Sheldon Cooper has Asperger’s Syndrome, which may not be treatable. Undine Spragg is simply a self-absorbed, spoiled brat. But her problem may also be untreatable, since she has reached adulthood and it has developed into a character flaw. These are fictional characters, but they resemble us in important ways; as their condition becomes widespread among the population at large, our society takes on the character of those who comprise it. We are used to seeing people duck responsibility and indulge themselves at others’ expense. A few people complain, but on the whole, it’s what we do. It’s who we are.

In 1810 Thomas Jefferson wrote a letter to his friend John Langdon in which he spoke of kings; he said, in part, “. . .take any race of animals, confine them in idleness and inaction, whether in a stye or a stateroom, pamper them with a high diet, gratify all their sexual appetites, immerse them in sensualities, nourish their passions, let everything bend before them, and they [become delusional and self-absorbed].” We are all kings today, with little or no political power, but with more power over the things that affect us directly than even the kings of Jefferson’s day might have had. Let us hope we don’t all turn into Undine Spragg.

Sit-Com Philosophy

My wife and I wait eagerly each week for the newest episode of “The Big Bang Theory.” In the interim we watch re-runs that we have stored on the DVR, so much so that we can say the lines along with the actors. Very funny stuff! It has some of the cleverest writing I have come across on TV, and Jim Parsons is the best comic actor I have ever seen. He makes a humorless, self-absorbed character almost likeable. Almost. And when they bring in Laurie Metcalf as Sheldon Cooper’s mother it makes our day. She is perfectly cast as the spiritually certain Texas mother of the brilliant theoretical physicist Sheldon Cooper.

The episodes often provide food for thought as well, and Sheldon is a wealth of information, much of which his friends find boring (as do many in the audience, I dare say). But it is remarkably well done. One such episode struck me as worthy of extended comment. It appeared at the beginning of a new season when Sheldon and his three friends return from the North Pole where they have spent three months doing research to substantiate one of Sheldon’s theories.

Sheldon is walking on air as they return to the apartment because he is convinced the data prove him right. He has already announced his triumph to the scientific world, and he now awaits the inevitable Nobel Prize, which will give his life new meaning. But, as it happens, the data that “prove” his hypothesis were provided by the three friends using the static produced by an electric can opener. When Sheldon finds out, he is humiliated and furious. He is disgraced in the eyes of his peers and must write a retraction which, for him, is a gargantuan task. In a giant pout, he quits his job and returns to Texas and his mother.

Throughout the episode, Sheldon’s attempt to put the blame for his humiliation on the shoulders of his three friends raises questions about his willingness to take responsibility for his own actions. It is true that they provided him with flawed data, but he is the one who spread the word about his latest scientific triumph. It never occurs to him that he is in any way responsible for the public humiliation one could say he brought upon himself. He didn’t have to shoot his mouth off! To make matters worse, his friends seem willing to accept the blame, though this is a comic device that makes the episode funny. If they confronted Sheldon with the fact that he is the one responsible for his own humiliation, it wouldn’t get laughs. And I dare say the character would deny it: he’s very good at that. But it would be true. One hears echoes of Todd Blackledge’s attempts to shift blame to the media for Joe Paterno’s recent behavior at Penn State, when Paterno refused to take action upon learning that his assistant coach had been seen abusing a young boy in the team showers. But whereas this episode is funny, Blackledge’s rationalization is borderline absurd. The point is the same, though: actions have consequences, however much we want to deny it.

In the end, we really ought to focus on the fact that the freedom we prize so highly brings with it a responsibility to accept the consequences of our free choices. You can’t have freedom (even as we understand that term) without responsibility, and vice versa. They are two sides of the same coin. In this comic episode, Sheldon has made his bed but he refuses to lie in it. That can be funny when his friends go along with his delusions, but it sends the wrong message. Sheldon is a study in Asperger’s syndrome, a condition that renders the subject unaware of the effect he is having on other people. He is so immersed in himself he is barely aware of others at all. As his roommate Leonard says, Sheldon is “irony impaired” — a characteristic of this type of personality. (Leonard, by the way, is played by Johnny Galecki who is, unfortunately, talent-impaired in an otherwise gifted cast.) Sheldon must constantly learn “social protocols” just to muddle through a quasi-normal public life. That makes for terrific humor when handled by the likes of Jim Parsons. But it is just possible that we all share Sheldon’s condition to a degree in our self-absorption and our inability to acknowledge responsibility for our actions, not to mention the urge to find someone else to blame for our own mistakes.

Ali’s Courage

Sports Illustrated recently did a nice story about Muhammad Ali on the occasion of his 70th birthday. In a letter praising the magazine and the fighter, the writer noted the man’s “courage. . .which always allowed him to be himself, and his unwavering insistence on living by his beliefs. . .” Let’s think about this.

To begin with, “living by his beliefs” is empty praise. Hitler lived by his beliefs, as did Torquemada, Joseph Stalin, and other, lesser, fanatics. History is full of evil men who lived (and died) by their beliefs. That’s hardly praise. Further, it’s never clear what it means to say his “courage allowed him to be himself.” Again, the same thing could be said of so many evil people. How could a man be anything but himself, anyway? I gather the writer wants to praise Ali for being honest and courageous.

To be sure, Ali stood by his beliefs, which is to say his principles, during the Viet Nam war, in which he refused to participate. It took courage to buck the system, and it cost him some of the best years of his fighting life. But standing up for one’s principles, being “always himself,” is praiseworthy only if the principles one stands up for are also worthy of praise. Standing up — and even being willing to die — for the principle of white supremacy, let us say, is certainly not praiseworthy, except in a bizarre sense of that term. It crosses the line between courage and stupidity. In Ali’s case, then, it is not yet clear that his courage was admirable.

If we are going to praise a man or a woman for being courageous and standing by his or her principles in difficult times, the principles themselves need to be examined.

In Ali’s case, the principles would stand the test of critical examination if they reflected an opposition to the Viet Nam war on grounds of conscience. War itself is never a good thing (except, perhaps, in self-defense or as a last resort). But the Viet Nam war was of questionable moral worth, and those who stood against it were often persons of courage, especially if they suffered as a result. In Ali’s case, unfortunately, his public statement, “I ain’t got no quarrel with those Viet Cong,” suggests self-interest, which is not an acknowledged moral principle. Ali may have been honest and willing to accept the consequences of his actions. In some sense, his action took courage, as mentioned above. But it is not at all clear that he was taking the moral high ground. Contrast Ali’s simple statement with Martin Luther King’s “Letter from a Birmingham Jail” and you see immediately how different an appeal to ethical principles is from mere self-interest.

Honesty is not an inherently good thing, either, despite the fact that we tend to embrace it blindly in this culture. If I tell my neighbor lady that her new hat is hideous, it may be an honest thing to say, but it hurts her feelings unnecessarily and is therefore morally reprehensible. Sheldon Cooper on the popular TV show “The Big Bang Theory” is relentlessly honest, and the people he hurts are legion. But their hurt is skin deep, as it must be if we are to laugh. In his case, Sheldon’s honesty is one of his comical features on a very funny show. But honesty is not in itself at all funny. Nor is it necessarily admirable.

So we need to be clear about what the honest talk or courageous action amounts to, and what its context is. If Ali’s honesty and courage were praiseworthy, it was not because of his “unwavering insistence on living by his beliefs,” but because his beliefs were themselves praiseworthy. If, for example, his conversion to Islam resulted in a deep antipathy to war of all kinds, we could acknowledge that his decision was a matter of principle, or a matter of conscience at the very least. But we don’t know that. It’s not at all clear that Muhammad Ali acted on principle at all. Until we know more about the principles that informed his decision we must in all honesty withhold praise.