Who Are The Trustworthy?

I have referred to Charles Pierce and his marvelous book several times. The first time was back in 2013. No one “liked” it or made a single comment — perhaps because I attack our blind faith in the wisdom of children? Anyway, I will post it again (with modifications), because what he had to say is still very much to the point.

The truth is something different from the habitual lazy combinations begotten by our wishes. (George Eliot)

One of the major curiosities in this most curious age in which we live is the undue adulation the young receive at the hands of their elders. In fact, one might say the young now command center stage in this drama we call contemporary living, while their elders are ignored and shunted off to stage left, despite the countless hours those elders spend trying to pretend they are young themselves. The young can do no wrong, and we listen at doors for the latest piece of wisdom that might slip from their lips. They are charming, lovely, beautiful — untainted by the stains of a corrupt world. And they are wise beyond their years, presumably.

If families are talking over the dinner table and the young speak up, silence immediately ensues in order to allow them to say their piece. The notion that the kids are simply being rude and should not interrupt has gone the way of the dinosaur. In any event, it never occurs to anyone that what the kids have to say may not be worth listening to, or that their withdrawal from the adult world as they grow older is nothing more than a sign of their budding narcissism. But there it is: the result of the youth rebellion.

Mark Bauerlein, author of The Dumbest Generation, insists that it started in the 1960s when groups like the S.D.S. led the attack on the “establishment” in general and the universities in particular, giving birth to the slogan “Don’t trust anyone over thirty.” Richard Hofstadter would insist, I dare say, that it started a decade earlier during the McCarthy hearings, or, perhaps, when Dwight Eisenhower was running against Adlai Stevenson and Americans suddenly began to distrust “eggheads” like Stevenson. The youth movement, he might say, is simply the logical development of the anti-intellectual movement that came out into the open in the 1950s and has since been fostered by growing numbers of people in this commodified culture who have never trusted those impractical types who live in “ivory towers.” In any event, as a culture we have come to distrust the elderly (especially those who can think and speak coherently) and instead we consult our gut feelings and listen to the young as the sources of what we like to call “truth.”

The attack on the universities has resulted in grade inflation and the dumbing down of the curriculum in the schools, and the distrust of those over thirty has resulted in the mindless rejection of all in authority, including parents and teachers, and the almost total dismissal of the notion of expertise, which, we are told, is “elitist.” To be sure, teachers and parents have been party to the retreat, showing little courage and practically no confidence in themselves in the face of this assault. But, face it, some are in a better position to know than others, and those who have lived longer and studied complex issues carefully probably know a thing or two. Perhaps it is time to invent a new slogan: “Don’t trust anyone under thirty.” Or so says Mark Bauerlein, and this sentiment, if not those same words, is echoed in the writing of another contemporary student of America’s current cultural malaise.

I refer to Charles Pierce who, in his best-selling book Idiot America: How Stupidity Became a Virtue in the Land of the Free, points out that this attack on authority and expertise — and those over thirty — has resulted in a lowering of intelligence (in a country where more people vote for the latest American Idol than for the President of the United States), along with the reduction of all claims, including scientific claims, to simple matters of individual opinion, anyone’s opinion. And this in a nation based on Enlightenment ideas articulated and defended by the likes of John Jay, James Madison, Thomas Jefferson, and Alexander Hamilton. We have devolved into a nation that has declared war on intelligence and reason, the cornerstones of the Enlightenment, and prefers instead the alleged certainty of gut feelings and the utterances of children. We have turned from books and hard evidence to the mindless drivel of reality shows and video games. Pierce defends three “Great Premises” that he is convinced sum up the attitude of Americans in our day to matters of fact and questions of ultimate truth:

(1) Any theory is valid if it sells books, soaks up ratings, or otherwise moves units.

(2) Anything can be true if someone says it [often and] loudly enough.

(3) Fact is that which enough people believe. (Truth is determined by how fervently they believe it.)

I suppose the last parenthetical comment might be regarded as a corollary of the third premise. But the fact is that in this relativistic age we distrust those who are in a position to know, we wait for the latest poll to decide what is true, and we adulate the young while we ignore the fact that, lost as they are in the world of digital toys, they know very little indeed. As Pierce has shown so convincingly, we are all becoming idiots. We have lost the respect for that truth which we do not manufacture for ourselves, but which stands outside the self and requires a relentless effort to grasp even in part — together with our conviction that some things are truly evil while others are truly good. All truth is now mere opinion and the moral high ground has been leveled. We ignore the beauty all around us along with the ugly truths about what we are doing to the planet while we indulge ourselves in the latest fashion and seek the liveliest pleasure, convinced that it is the good. And all the while we wait eagerly to see what pearls of wisdom might fall from the young who are busy playing with their digital toys.

What will come of all this remains to be seen, but we might be wise to recognize that those under thirty are still wet behind the ears and don’t know diddly about much of anything that really matters. Their elders don’t seem to know much either, but the admission of our own ignorance is, as Socrates so famously said, the beginning of wisdom. That may be how the adults in this country can begin to resume their role as mentors, and how our distrust of authority and expertise might be put to rest, while we acknowledge that the children know even less than we do and that the majority does not determine what is true or false.

My Truth

One of my favorite bloggers, and one who makes frequent insightful and thought-provoking comments on my blogs as well, recently included this statement in a response she made to a blog post:

“That’s my truth, and the beauty of it is, it remains my truth though no one else may accept it.”

This claim is worth pondering. In fact, acceptance by others is the heart and soul of truth. “Truth” is a word that applies to claims. Some of these claims are private, as in “These pretzels are making me thirsty.” This claim cannot be corroborated by anyone else: it is private. It is “my truth.” But it is also somewhat uninteresting, except to close friends and, perhaps, one’s psychiatrist — or bartender. And, strictly speaking, it is not a “truth” at all. Truth claims are public and require corroboration in order to be called “true.” And some of those claims, such as “2+3=5” and “the earth travels around the sun in an elliptical orbit,” are absolutely true. They are true for me and they are true for you. They have always been true (even though not always accepted as such) and they always will be. Denial of those truths would engender a contradiction, violating the law of non-contradiction, one of the three laws of thought that govern all human thinking.

Acceptance, or what I have called “corroboration,” is the heart of the matter. Truth claims must be tested and verified by others in order to be true. To make the claim that “my truth” is mine and mine alone is, on its face, pointless. That is, a claim that no one else can accept is not a truth claim at all. It is an intuition or private conviction that we may hold dear but which we do not expect anyone else to share with us. Indeed, we may not even care whether anyone else agrees with us! Nonetheless, such things can be convictions that we hold dear and which help us survive in this insane world of ours. But, strictly speaking, those are not “truths.” They are very personal and they sustain us in times of struggle. They sit comfortably alongside matters of faith.

“So what?” you might well ask. The reason these sorts of distinctions are important, pedantic though they may seem at first blush, is that there are those “out there” in our shared world who deny truth in order to redefine it as consisting of claims they want us to accept as true, whether they are or not. We confront such claims on a daily basis these days. We quite correctly call them “lies.” The denial that there is objective truth leads invariably to a type of subjectivism which, when institutionalized by those in power, can lead directly to indoctrination. That way lies totalitarianism.

One of the first things both Adolf Hitler and Joseph Stalin did when they came into power — and, indeed, on their way to power — was to redefine truth as consisting of those claims they insisted were true even though they could not be corroborated by others. They were true by fiat and repetition. Such claims as “The Jews are an inferior race,” for example. This cannot be corroborated because it is blatantly false: we are all members of the same race, uncomfortable though that thought might have been for Adolf Hitler. Only by dehumanizing certain groups could they be eradicated, and that was the “final solution.” And while Hitler was making the Jews scapegoats for all of Germany’s ills, Stalin was rewriting history. Truth was cast aside in order to realize twisted dreams.

Thus, in the end, how we define “truth” is important. And it is important to insist that truth is something we must all agree upon, something shared, something we all accept because it can be corroborated by anyone else at any time. It is not “my truth.” It is “our truth.”

Stupid!

One of the very few sit-coms I watch on the telly is “Young Sheldon,” the spin-off from “The Big Bang Theory.” It stars the truly remarkable child actor Iain Armitage and is in many ways more delightful (and funny) than its predecessor.

Young Sheldon is a nine-year-old Sheldon Cooper who likes to brag (even in Church with his Fundamentalist mother) that he doesn’t believe in God: he believes in science. This is amusing when it comes from the mouth of a small boy sitting next to an adult, but it is also a bit stupid. As Pastor Jeff tells Sheldon in an exchange they have in Church, even some of the most brilliant scientists believed in God — to wit, Albert Einstein and Charles Darwin. In another episode, Sheldon comes across Pascal’s wager, in which the brilliant mathematician argues that it is smarter to believe in God than to disbelieve, because those who believe will be rewarded while those who do not cannot be. And even if God doesn’t in fact exist, those who believe will have lived better lives. This is a bit of a simplification, but you get the idea.
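For readers who like to see the bones of an argument, Pascal’s wager can be laid out as a small decision matrix. The sketch below uses illustrative finite payoffs of my own choosing (Pascal’s original argument trades on an infinite reward, for which the large number here is merely a stand-in); any assignment with the same ordering yields the same conclusion.

```python
# Pascal's wager as a decision matrix. The payoff numbers are
# illustrative assumptions, not Pascal's: his argument uses an
# infinite reward, for which 1000.0 is a finite stand-in.

PAYOFF = {
    # (believe, god_exists): payoff
    (True, True): 1000.0,    # believe and God exists: reward
    (True, False): 1.0,      # believe and God does not: a well-lived life
    (False, True): -1000.0,  # disbelieve and God exists: loss
    (False, False): 0.0,     # disbelieve and God does not: nothing gained
}

def expected_payoff(believe: bool, p_god_exists: float) -> float:
    """Expected payoff of a choice, given a probability that God exists."""
    p = p_god_exists
    return p * PAYOFF[(believe, True)] + (1 - p) * PAYOFF[(believe, False)]

# Under these numbers, belief wins for any nonzero probability.
for p in (0.5, 0.1, 0.01):
    assert expected_payoff(True, p) > expected_payoff(False, p)
```

The point of the sketch is only the structure of the reasoning: so long as the reward for correct belief swamps every other entry in the matrix, belief comes out ahead for any nonzero probability, which is the lesson the episode hands young Sheldon.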

In any event, young Sheldon, for all his intelligence, has committed the fallacy of bifurcation: either God or science, not both. But why not both (ask Einstein and Darwin)? Indeed, it is a bit stupid to insist, as so many intellectuals do, that there is only one way to know anything and that is the way of science. This, of course, is what has been called “scientism,” and I have written about it before; it commits the fallacy of poisoning the well. That is to say, it rules out the possibility that there are other ways of knowing and it ignores the uncomfortable fact that there may be things we simply cannot know — mysteries, if you will. This, too, is stupid. We have already encountered two fallacies in the minds of those who, like young Sheldon, insist there is only one way to know.

But it is equally stupid to ignore the findings of science, including medical science — such things as evolution and climate change, for example. Science can deliver a great many truths that simply cannot be denied without being completely stupid. And perhaps it is because many people who identify themselves with religion insist that science is the work of the devil that intellectuals refuse to acknowledge that there could be any semblance of truth in religion. This is guilt by association, and it ignores the differences among religion, organized religion, and faith. This, too, is stupid — as Pascal would attest. But the fact is that a great many people who insist that faith is the only road to the Truth are as stupid as those who think science is that road. Either insistence requires a form of denial and an assumption that our way is the only way. There may, in fact, be many roads.

In a word, there are, as Hamlet tells us, a great many things in heaven and earth which we cannot explain with science. There are limits to human truth. But there is truth and it is available to those who are willing to search for it; while a little knowledge is a dangerous thing, the unexamined life is not worth living.  And the start of that search begins with the acknowledgement that we do not know everything and may never know everything. Not in this life, anyway.

It may well be the case that we will only know the truth after we die. Heaven may consist of a world in which the Truth is revealed to us. And Hell, of course, may be a place where truth is denied and everyone tells lies, a world in which everyone makes everything up as they go along and in which there is nothing whatever that is solid and we are surrounded by incessant confusion and uncertainty — a world of Donald Trumps, if you can imagine.

In any event, I have no problem whatever accepting the very real possibility that I do not know everything and that there are things which I simply must accept on faith. But I also believe that there are things that are true, things that stand on a solid base of empirical evidence and intuitive truths that simply cannot be denied. In the end, though, there is only one certainty and that is that there is no absolute certainty. That much I do know.

Facts (As Opposed to Opinions)

I wrote this in the early years of this blog, but, with a few additional comments added, it seems especially relevant today with “false facts” floating around us. And, Heaven knows, we need a respite from the truly ugly political shenanigans going on.

One of the most popular segments on E.S.P.N.’s SportsCenter is called “Cold Hard Facts,” and it consists of one or more “experts” sitting down and giving their opinions about upcoming sports events. The confusion here between “facts” and “opinions” is instructive. We seem to have lost sight of a rather important distinction.

While there is nothing we claim to know that should ever be held beyond doubt, there is certainly a basic distinction between an opinion — which can be silly or sensible — and a fact, which has the weight of evidence and argument behind it. It is a fact that water freezes at 32 degrees Fahrenheit. It is a fact that objects fall toward the center of the earth. The most reliable facts are in the hard sciences and in mathematics (though there is some discussion whether a mathematical formula is a fact or simply a tautology). But even when an expert tells us that the New England Patriots are sure to win the game on Sunday, that is an opinion.

As mentioned, opinions can be silly — as in “there’s a monster in my closet,” or sensible, as in “don’t raise the bet when holding a pair of twos — unless you are a really good bluffer.” And opinions can differ in degree, some being more likely or more probable than others. But they do not cross over into the territory of fact until the weight of argument and evidence is so heavy it cannot be moved. Thus the opinion that smoking causes cancer became fact once the correlation between the two became overwhelming (there are still exceptions). And the opinion that humans evolved from lower forms of animals became fact when the weight of evidence became so heavy it could no longer be ignored — except by looking the other way.

One of the big controversies in our schools, especially in the South, is whether “intelligent design” is a fact or an opinion, that is, whether or not it should be taught along with the theory of evolution. But as there is no possible way to disprove intelligent design and there are any number of ways one might try to disprove evolution, the latter can be regarded as fact whereas the former cannot.  Intelligent design, the claim that human evolution is guided by a Creator, is a matter of faith. It may have plausibility, but it cannot be proved or, more importantly, disproved. This is where Socratic doubt comes in.

The secret to Socrates’ method was to doubt until we could doubt no longer. At the point where a claim seems to be beyond doubt, we can claim it is true — so far as we know. The key to the Socratic method was questioning and attempting to disprove. That is the key to scientific method as well. Claims become factual to the extent that they withstand attempts to disprove them. If there is no way to disprove a claim, even in principle, it cannot ever rise to the status of fact. The Freudian position is usually denied the status of fact precisely because it cannot be disproved, even in principle. Still, it functions as an explanation of many of our human foibles and can be regarded as plausible.
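The asymmetry at work here, that a universal claim can be refuted by a single counterexample but never conclusively proved by any number of confirming instances, can be put in a few lines. The swan example below is a stock illustration of my own, not the author’s:

```python
# Illustration of the asymmetry between proof and disproof for
# universal claims: confirming instances never prove the claim,
# but a single counterexample disproves it.

def falsified(claim, observations):
    """Return True if any observation contradicts the universal claim."""
    return any(not claim(obs) for obs in observations)

def all_swans_are_white(swan):
    # The universal claim under test: every observed swan is white.
    return swan == "white"

# A thousand white swans do not falsify the claim (nor prove it) ...
assert not falsified(all_swans_are_white, ["white"] * 1000)

# ... but one black swan disproves it conclusively.
assert falsified(all_swans_are_white, ["white"] * 1000 + ["black"])
```

A claim for which no conceivable observation could ever make `falsified` return True is, in the terms of the passage above, one that can never rise to the status of fact.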

We can talk until we are blue in the face about who was the best basketball player ever, or whether the souls of evil persons will suffer eternal punishment, but since no claim we make could ever be proved false, we never get beyond the realm of personal opinion. The claim that the polar ice caps are melting is a fact. The claim that humans are part of the cause of global warming is an opinion, though a probable one. And in this case it would be wise to treat it as fact, because even if it turns out to be false, it will not have cost us a great deal to seek ways to reverse the trend. And if it turns out to be true, we will have taken steps to solve a serious problem facing our earth.

Distinctions help to clarify our thinking. When they are glossed over, it leads to confusion. That is my opinion, but it seems plausible. That is the most I can say until further review.

Words

    “I don’t know what you mean by ‘glory,’ ” Alice said.
    Humpty Dumpty smiled contemptuously. “Of course you don’t—till I tell you. I meant ‘there’s a nice knock-down argument for you!’ ”
    “But ‘glory’ doesn’t mean ‘a nice knock-down argument’,” Alice objected.
    “When I use a word,” Humpty Dumpty said, in rather a scornful tone, “it means just what I choose it to mean—neither more nor less.”
    “The question is,” said Alice, “whether you can make words mean so many different things.”
    “The question is,” said Humpty Dumpty, “which is to be master—that’s all.”

It’s interesting, to say the least, how folks bandy words about, making them mean what they want them to mean — not unlike Humpty Dumpty who pays them extra when they work overtime.

Take the word conservative, for example, which ought to include environmentalists, who are regarded by many so-called conservatives as liberal “tree-huggers.” Environmentalists are dedicated to conserving our world. But those conservative critics are really dollar conservatives who care only about the bottom line, the profits that are frequently the result of attacks on the environment. There are also intellectual conservatives who are dedicated to preserving those ideas that have helped to create a better world. I number myself among such types. And then there are those liberals, usually identified as democrats, who advocate human freedom and number among themselves the bleeding heart liberals who react in a programmed manner to all types of human pain and misery — real and supposed. They leave their minds on the shelf and lead with their gut. Endorsing political correctness, they also head the attack against the Canon in the universities and all books written by “dead, white European males.” The pain and misery resulting from this attack, in the form of uninformed and confused students with shrunken minds, is ignored in the name of “social justice” — which can be loosely translated as “what I want to be the case.”

Oddly, it is quite possible for someone to embrace a number of these positions simultaneously and without inconsistency. One can be, for example, a democratic socialist who seeks greater social equality through democratic means.

Socialism, according to Karl Marx, is the economic system that arises upon the death of capitalism, an economic system that feeds on the rotting carcasses of exploited workers — speaking of human pain and misery. Marx was convinced that the state would commandeer the means of production and socialism would result. But eventually the workers would themselves own the means of production and all would share equally — an economic system called Communism. Many an intellectual in the early part of the last century embraced the ideals of Communism until, like George Orwell, they discovered that so many of those who said they were promoting Communism were actually fostering totalitarianism and were responsible for the deaths of millions of their fellow humans — all in the name of “equality” and “justice.” It is worthy of note that Communism, as embraced by Marx, resembles in important ways the Christianity preached in the Gospels.

And speaking of Christians, there are those who claim to be Christians and who are quite happy with their own prejudices and even preach hatred against all of those they regard as different from themselves. These should be called nominal Christians, as they are Christian in name only. The real Christians, who are rare, are those who do the right thing because it is the right thing and try hard to love their fellow humans, as was preached by the original (and some might say the only) true Christian. There are some who seek to do the right thing, as our beloved blogger Jill Dennison tells us each week, pointing out those who truly deserve our respect and admiration. And, I dare say, many of those people are not even nominal Christians! So it goes.

In any event, words do have relatively fixed meanings, as our dictionaries attest. But, in the spirit of Humpty Dumpty, many of us think that meaning, like truth itself, is something we make up and which dances to the tunes we play. This leads us, as we are becoming increasingly aware, toward a relativism of the meanest sort, a relativism in which hate comes to mean the same thing as love and truth is a fabrication of those in power whose private agenda centers around themselves and their ugly urges toward more and more power. It pays us to beware and to tread carefully, to make sure we know whereof we speak and insist that those claims that we are told are true have the force of evidence and argument to support them. And we should make sure folks say what they mean even though they seldom seem to mean what they say. Otherwise our minds will become prisoners of those who delight in making others a means toward their own ends.

 

“My” Truth

In a comment she made to a post I wrote about the nature of truth and falsity, Sha’Tara made the following remark: “In my world there is no truth whatever unless it is ‘my’ truth. . . “ I dismissed this claim as “indefensible,” which was a bit flippant. It deserves closer examination, though in the end I will try to show why it is indefensible. In effect the position is what has been called “solipsism,” and it has been around since humans began to think about truth and falsehood. The position rests on the assurance that I am the only one: I alone know the “world,” and only what I know is regarded as “real.” Truth, which is my personal fiction, is mine and mine alone.

As I noted in my comment to Sha’Tara, the claim “there is no truth” also claims to be true, and this paradox is the key to the dismissal of the position. At best the claim itself is a half-truth. Some truths are mine while most have nothing whatever to do with me. To be sure, we all look at the world differently; each of us brings with us a large suitcase full of bias, prejudice and, at the very least, individual perspective. No question. But we bring this large piece of luggage to a world we share with others who also bring their own luggage. And we try to make sense of it, to make claims that can withstand criticism and that rest on evidence. The evidence is itself available to others and can be examined and verified or rejected as the case warrants.

But in the end, claims from the axioms of Euclid (“Things equal to the same thing are equal to one another”) to the claims of the scientist (“For every action there is an equal and opposite reaction”) to the claims of the historian (“Caesar crossed the Rubicon”) can be verified. They are true because there is considerable intuitive, mathematical, historical, or sensory evidence to support them. They are not “my” truth: they are “our” truth. If we disagree about the claim that Caesar crossed the Rubicon, for example, we must bring forward evidence that provides an indisputable case against that claim. Indeed, like any claim, this one can be dismissed willy-nilly, because no one holds a gun to the head of anyone else (we would hope), but the dismissal is pure whimsy. There are no grounds for doing so, except that it makes the person himself or herself feel good. All the evidence supports the claims in parentheses above.

The chemist/philosopher Michael Polanyi wrote a book in 1958 titled Personal Knowledge in which he argued that the scientist, no matter how exact the science itself, always brings with him or her a personal element. All claims, even the most precise and exact ones supported by mathematics and empirical observation, are couched within a context of “personal knowledge.” Nonetheless, Polanyi insists, the knowledge itself is not in question because of the personal element. Polanyi’s goal was to restore science to a place within the body of human studies, to show that it is human knowledge, not something so impersonal and clinical that one would approach it only with rubber gloves and with no interest whatever in the relationship between scientific truth and the truth in the social sciences and the Humanities. In a word, Polanyi wanted to substitute for the objective, impersonal ideal of scientific detachment an alternative ideal which gives attention to the personal involvement of the knower in all acts of understanding. But, note please, he did not reject all knowledge or all truth out of hand. Indeed, he affirmed the certainty of certain truths, while admitting the elements of personal involvement in the discovery and formulation of those truths.

The point of all this philosophical rambling is to show that truth is in a sense “mine,” but as truth it is there for anyone else. I am not all alone (solus ipse). I share a world with others with whom I can agree or disagree but with whom I also share a body of knowledge. I can get on the airplane with confidence that it will take off and land safely — because science tells me it is safe. I can drive my car and it will start and stop when I ask it to. The danger in rejecting truth is that we become open to manipulation by those in power who seek to instill in us a body of half-truths and “false facts” that allows them to realize their political goals. We are susceptible to the demagogue and the politically ambitious. We need to insist that there is truth and knowledge available to all who take the time to search and seek to validate it because if we do not do so we have nothing with which to defend ourselves from clap-trap and political nonsense.

This is why education is so important, especially in an age in which there are people “out there” who would have their way with us, convince us that black is white and that theirs is the only truth when, in fact, the truth belongs to no one. It belongs to all of us.

True Or False?

I begin with a rather lengthy quote from Wikipedia regarding one of the greatest atrocities ever committed by one group of human beings against another. I refer, of course, to the Holocaust.

The Holocaust, also referred to as the Shoah, was a genocide during World War II in which Nazi Germany, aided by its collaborators, systematically murdered some six million European Jews, around two-thirds of the Jewish population of Europe, between 1941 and 1945. Jews were targeted for extermination as part of a larger event involving the persecution and murder of other groups, including in particular the Roma and “incurably sick”, as well as ethnic Poles, Soviet citizens, Soviet prisoners of war, political opponents, gay men and Jehovah’s Witnesses, resulting in up to 17 million deaths overall.

Germany implemented the persecution in stages. Following Adolf Hitler’s rise to power in 1933, the government passed laws to exclude Jews from civil society, most prominently the Nuremberg Laws in 1935. Starting in 1933, the Nazis built a network of concentration camps in Germany for political opponents and people deemed “undesirable”. After the invasion of Poland in 1939, the regime set up ghettos to segregate Jews. Over 42,000 camps, ghettos, and other detention sites were established.

The deportation of Jews to the ghettos culminated in the policy of extermination the Nazis called the “Final Solution to the Jewish Question”, discussed by senior Nazi officials at the Wannsee Conference in Berlin in January 1942. As German forces captured territories in the East, all anti-Jewish measures were radicalized. Under the coordination of the SS, with directions from the highest leadership of the Nazi Party, killings were committed within Germany itself, throughout German-occupied Europe, and across all territories controlled by the Axis powers. Paramilitary death squads called Einsatzgruppen, in cooperation with Wehrmacht police battalions and local collaborators, murdered around 1.3 million Jews in mass shootings between 1941 and 1945. By mid-1942, victims were being deported from the ghettos in sealed freight trains to extermination camps where, if they survived the journey, they were killed in gas chambers. The killing continued until the end of World War II in Europe in May 1945.

There are those among us who would insist that we cannot judge the Nazis because we haven’t walked in their boots. Seriously. There are also those among us who deny that the Holocaust ever happened, who insist that it is a fiction. These people also believe, many of them at any rate, that the moon landing was staged and never happened. I suspect these people also believe the earth is flat and that the sitting President of the United States is an exemplary human being.

What we need to think about when it comes to truth and falsity — which are being conflated these days in order to carry forth hidden agendas by those in power, I strongly suspect — is that the truth need not be pleasant. It need not fit in with our preconceptions and predilections. It can even be a bit ugly — like the truth about the Holocaust. The sheer numbers in the above quote beggar belief. And since the quote is from Wikipedia there are many who would question the truth of those claims. But there is a considerable body of evidence — available to anyone who wants to examine it — that those figures are accurate. Indeed, this is the nature of truth and how we can separate it from the falsehoods that parade as true because we want to believe them, or because someone out there wants us to. The truth can be corroborated by anyone at any time and in any place. Falsehoods cannot: they dissolve in the face of evidence, criticism, and sound argumentation. More than ever before, perhaps, it is imperative that we insist upon the difference between the two.

The way one goes about proving a statement, as we know from the hard sciences, is to seek to disprove the statement. If we cannot do so, we must accept it as true, like it or not. This was once known as the “Socratic method,” the method Socrates used in pleasant conversations with young men in Athens to test the claims that were floating about in the air — seeing if he could prove them to be mere “wind-eggs.” So much of what we hear today is in that category, and we, as responsible adults, should dismiss such claims out of hand and insist that we be told the truth.

There is much to learn from history and we ignore it at our peril. We must test all claims, including those of historians — and any historian worth reading would insist that we do so. But if those claims can stand the test of criticism and review then we must accept them, like it or not. That’s the nature of truth.

The Hollow Man

Bartley Hubbard is a hollow man. He is a flawed character and totally without principles. He is self-absorbed and uses others to improve his standing in his own mind. He is not a wicked man in the strict sense of that word: he hasn’t killed anyone and hasn’t raped any women — so far as we know. Though, in all honesty, he does flirt mercilessly with pretty young women while in the company of his beautiful wife. Oh, did I mention? His wife is beautiful and worships the ground Bartley walks on — which is why he married her. While she is away one summer, after they have been married for some years, he ruminates on his wife and his feelings for her, recalling that when they broke apart some years before, she was the one who sought him out and wanted to be with him, accepting all the blame for his many shortcomings:

“As he recalled the facts, he was in a mood of entire acquiescence; and the reconciliation had been of her own seeking; he could not blame her for it; she was very much in love with him and he was fond of her. In fact, he was still fond of her; when he thought of the little ways of hers, it filled him with tenderness. He did justice to her fine qualities, too; her generosity, her truthfulness, her entire loyalty to his best interests; . . .[however,] in her absence he remembered that her virtues were tedious and even painful at times. He had his doubts whether there was sufficient compensation for them. He sometimes questioned whether he had not made a great mistake to get married; he expected now to stick it through; but this doubt occurred to him.”

Bartley and his wife Marcia have a child. Bartley is only a fiction, of course, a figment of William Dean Howells’ imagination. But he is, in Howells’ words, a “modern instance” in the novel of that name. Bartley Hubbard, pragmatic and unfeeling at the core, is a modern instance of a hollow man whom Howells worried was becoming more and more common in the late nineteenth century, the so-called “modern” age. In our “post-modern” age his type is becoming legion. And in a country led by the grand pooh-bah of hollow men, we should be quite familiar with the type by this time.

Bartley drifts along writing for newspapers and accepting the accolades and financial rewards, when they come, as a matter of course. A turning point in the novel, when Bartley steps over a line and becomes less a hollow man and perhaps more a cad, is when he steals intellectual property from an old and trusted (and trusting) friend, Kinney, “the philosopher from the logging camp.” Kinney was, among many things, a cook at that logging camp in Maine who had befriended Bartley because he saw in him a bright and good-humored person. One evening Kinney shares with Bartley and another friend stories of his exploits during his long and fascinating life. He plans one day to write them down and get them published, but before he can do that Bartley has written them down and had them published himself, to wide acclaim. In the process he allows it to be mistakenly believed that the friend who was with him that evening wrote the stories — letting that friend take the blame for the theft of another’s intellectual property when it becomes known. Needless to say, Bartley loses two close friends. But he cares not. Not really; he has lost a number of friends along the way, people who have seen through the facade and don’t like what lies behind. After all, his story was a success and it garnered him a large financial reward. And money is very important to Bartley — along with the prestige it gives him.

The truth slowly comes out about what Bartley has done and he finds himself fired from his high-paying job on one of Boston’s most popular newspapers and set somewhat adrift. He borrows some money from a man he regards as a friend and proceeds to gamble it away. His wife finally begins to see the sort of man she has married and sends him packing, though she immediately regrets it because she can never quite shake the image she has of the man she still loves. It bothers him not, because he can rationalize that what he did is not wrong and that others are wrong to persecute him. Bartley is very good at rationalizing and placing the blame on others. As a hollow man he has no center, no principles that might otherwise give his life meaning and direction. This is one reason he remained with his wife as long as he did: she had been very willing to take the blame for his many faults and brush them aside when they did not fit her image of the man she married.

William Dean Howells is a brilliant novelist and A Modern Instance may be his best work. In any event, he was prescient, seeing soon after the Civil War that the Bartley Hubbards would become increasingly numerous — men who are hollow at the core, lost within the labyrinth of their own diminished selves, whose only goal is to seek pleasure and financial ease. And like any great work of literature, there is much food for thought and many insights into the modern, and the post-modern, temper. We can learn a great deal from those old, dead, white, European (or in this case American) men, can we not?

 

The Habit of Lying

I am reposting here on a topic that seems even more relevant today than it was when it was originally posted more than a year ago. It does seem to me that lying has become the new TRUTH and we need to get a grasp on this problem lest we become lost in a world of make-believe — if we aren’t already lost in that world. There is such a thing as truth and there is such a thing as a blatant lie. Just because there are those who manage to convince people otherwise does not mean that we should not hold fast to the distinction between truth and falsehood like a life-raft in the swirling chaos of confused thought that surrounds us.

It started with advertising, I think — though I can’t be sure. I refer, of course, to lying. I don’t mean the occasional lie. I mean the chronic lie, lying as a matter of course. Selling the car to the unsuspecting customer by telling him that it was owned by an old lady and never driven over forty; selling the house without mentioning the fact that the basement leaks whenever it rains; insisting in the face of overwhelming evidence that global warming is a fiction. I realize, of course, that people have always lied. But what I am talking about is the blind acceptance of lying as a way of life. It seems to have become the norm. Everybody does it, so it must be OK.

As one who taught ethics for forty-one years I have a bone to pick with this sort of logic. Just because everyone does it (which is a bit of an exaggeration) does not make it right. In fact, the cynic in me is tempted to say that if everyone does it, it is almost certainly not right! From an ethical perspective it is never right to lie, not even in an extreme case, although one might plead expediency in such a case. But it is never right, not even the “little white lie” that we might tell about our neighbor’s hat in order not to hurt her feelings. I might tell the little white lie, but I must realize that it is not the right thing to do, strictly speaking. In this case it’s just the expedient thing to do, since hurting her feelings would be much more upsetting than simply telling her that her hat is lovely when in fact it’s perfectly awful. It’s the lesser of two evils, if you will. In any event, the little white lie is not the problem. The big black lie is the problem: it has become commonplace. And it is the fact that lying has become accepted behavior that is of greatest concern.

When my wife and I were babysitting our granddaughters some time back I sat and watched several Walt Disney shows the girls seemed to like. The plots involving teenagers and their bumbling parents were absurdly simple, but they tended to focus on a lie told by one of the characters that generated a situation requiring several other lies to resolve. It was supposed to be funny. I was reminded of the “I Love Lucy” shows (which I did love) that were also frequently based on a lie that Lucy told Ricky, generating a situation from which Lucy needed all her cleverness to extricate herself. I then began to reflect on how many TV shows generate humor in this way. These situations are funny, of course, as were the Disney shows, I suppose. But the point is that the lie was simply an accepted way of doing things. If you are in a tight situation, lie your way out of it.

On our popular TV shows, it’s not that big a deal. But when our kids see this day after day it must send them the message that lying is simply the normal way of dealing with certain sorts of situations that might be embarrassing or uncomfortable. In any event, when it becomes widespread and commonplace, as it has clearly done in today’s world, it does become a larger problem. When Walmart claims it always has the lowest prices and has to be taken to court to reduce the claim to always having low prices, we become aware that the rule of thumb seems to be: say it until someone objects, and only after the courts have ruled will we make the change. In the meantime we will tell the lie and expect greater profits. And we all know politicians lie without giving it a second thought: whatever it takes to remain in a well-paid position requiring little or no work whatever.

As we listen to the political rhetoric that fills the airwaves and makes us want to run somewhere to hide, we realize that bald-faced lying has become commonplace in politics. Tell the people what they want to hear, regardless of the consequences. It’s all about getting the nomination and then winning enough votes to be elected. If those lies result in harm to other people, say people of another religion or skin color, so be it. Consequences be damned! It is possible to check the facts, of course, but very few bother to take the time, since if the lie supports the listener’s deep-seated convictions and prejudices it will readily be believed, true or false. And if it doesn’t, we simply stop listening. For example, one could simply search “FactCheck” and discover that the majority of Donald Trump’s claims are fabrications or are blatantly false. But, then, truth does not enter in. We don’t seem to care much about that any more. Sell the house. Sell the car. Sell the political candidate. Whatever it takes. The end justifies the means.

This, of course, is utter nonsense.

 

Still Wondering

I posted this (slightly modified) piece two years ago — before the Age of The Trumpet and Alternative Facts — but it still seems pertinent. Perhaps more so! So I decided to repost it in the hope that it might be of interest to some of my readers who missed it the first time around.

As Hannah Arendt uses the term, “totalitarianism” is any form of government in which those in power seek to gain “total domination” of the minds and actions of the citizens by any means — violent or otherwise. In this sense, Huxley’s Brave New World is a totalitarian state in which a benign dictator, convinced that he is doing the right thing, makes sure his people think they are free while all the time he guarantees their continued mental captivity in a world of pleasure and endless diversions. If this sounds a bit familiar, it may well be, though in these United States it is not clear whether there is a single person or a group that is in complete control. But it is certainly the case that we are provided with endless diversions and a mind-boggling array of entertainment to keep us convinced we are free while all the time we are buying what the media are selling, electing inept officials who are cleverly marketed like toothpaste, and embracing the platitudes we hear repeatedly. Seriously, how many people in this “free” nation really use their minds?

In any event, I came across a passage or two in Arendt’s remarkable book about totalitarianism — which I have alluded to previously — that are well worth pondering. Bear in mind that she was writing in 1948 and was primarily interested in Joseph Stalin and Adolf Hitler and their totalitarian governments. Donald Trump was not a name on everyone’s lips. She was convinced that this period in history was when the “mob mentality” that later theorists latched upon came into the historical picture and “mass man” was born: Eric Hoffer’s “true believer.” This was before political correctness, of course, when “man” was generic. The “elite” of whom she is speaking is the educated and cultured individuals in those countries who should have known better — but who did not. There are subtle differences in the mentality of the two groups, but Arendt was convinced that they were both easily led astray.

“This difference between the elite and the mob notwithstanding, there is no doubt that the elite was pleased whenever the underworld frightened respectable society into accepting it on an equal footing. The members of the elite did not object at all to paying a price, the destruction of civilization, for the fun of seeing how those who had been excluded unjustly in the past forced their way into it. They were not particularly outraged at the monstrous forgeries in historiography of which the totalitarian regimes are guilty and which announce themselves clearly enough in totalitarian propaganda. They had convinced themselves that traditional historiography was a forgery in any case, since it had excluded the underprivileged and oppressed from the memory of mankind. Those who were rejected by their own time were usually forgotten by history, and the insult added to injury had troubled all sensitive consciences ever since faith in a hereafter where the last would be the first had disappeared. Injustices in the past as well as the present became intolerable when there was no longer any hope that the scales of justice eventually would be set right.”

And again,

“To this aversion of the intellectual elite for official historiography, to its conviction that history, which was a forgery anyway, might as well be the playground of crackpots, must be added the terrible, demoralizing fascination in the possibility that gigantic lies and monstrous falsehoods can eventually be established as unquestioned facts, that man may be free to change his own past at will, and that the difference between truth and falsehood may cease to be objective and become a mere matter of power and cleverness, of pressure and infinite repetition.”

Those who might question the notion of a historical parallel here might do well to reflect on the fact that postmodernism has “taken over” our college campuses. And “New History” is all the rage. The basic tenet of deconstructionism, which lies at the heart of postmodern thought, is that truth is a fiction — or, as the American philosopher Richard Rorty has said, truth is nothing more than “North Atlantic bourgeois liberalism.” His famous predecessor Jacques Derrida said, unblushingly, that truth is simply a “plurality of readings” of various “texts.” A great many of these intellectuals are convinced that history is a fiction that has for too long ignored the disenfranchised, and they are determined to right this wrong by rewriting the history books to stress the role of those who have been excluded by an elite white, male hegemony. And while the motive may be admirable, one must question the premise on which these folks operate, since this is coming from those whose job, traditionally, has been that of protectors and transmitters of civilized thought. Popular culture [and politicians] have simply latched on to the droppings of these intellectuals and reduced truth to subjectivity: truth is what you want to be the case; we do not discover it, we manufacture it. Say something often enough and loudly enough and it becomes true.

In the event that anyone should suggest that the rejection of objective truth is trivial, I present the following observation by Ms Arendt:

“The ideal subject of totalitarian rule is not the convinced Nazi or the convinced Communist, but people for whom the distinction between fact and fiction and the distinction between true and false no longer exist.”

Bearing in mind that totalitarianism need not be violent, this appears to be the direction in which we are headed. Or am I wrong in thinking that the signs of totalitarianism are increasingly clear? A small group of wealthy and powerful men — supported in their ivory towers by “elite” intellectuals who would never admit their allegiance to this group even as they deny objective truth and busily rewrite history — seems slowly but surely to be gaining control of the media. By attacking the public school system, ignoring such things as global warming, eliminating regulatory agencies, approving numerous invasions of personal privacy, and picking and choosing stupid and malleable people to run for public office, they are increasingly able to make us think we are free when, in fact, we are simply doing their bidding. I wonder.