Whom To Trust?

The truth is something different from the habitual lazy combinations begotten by our wishes. (George Eliot)

One of the major curiosities of this most curious age in which we live is the undue adulation the young receive at the hands of their elders. In fact, one might say the young now command center stage in this drama we call contemporary living, while their elders are ignored and shunted off to stage left — despite the countless hours those elders spend trying to pretend they are young themselves. The young can do no wrong, and we listen at doors for the latest piece of wisdom they might let slip from their lips. They are charming, lovely, beautiful — untainted by the stains of a corrupt world. If families are talking over the dinner table and the young speak up, silence immediately ensues to allow them to say their piece, though as they grow older they withdraw and become sullen and disinclined to speak at all. The notion that the kids are simply being rude has gone the way of the dinosaur. In any event, it never occurs to anyone that what the kids have to say may not be worth listening to, or that their withdrawal from the adult world is nothing more than a sign of budding narcissism. But there it is: the result of the youth rebellion.

Mark Bauerlein, author of The Dumbest Generation, insists that it started in the 1960s when groups like the S.D.S. led the attack on the “establishment” in general and the universities in particular, giving birth to the slogan “Don’t trust anyone over thirty.” Richard Hofstadter would insist, I dare to say, that it started a decade earlier during the McCarthy hearings, or, perhaps, when Dwight Eisenhower was running against Adlai Stevenson and suddenly Americans began to distrust “eggheads” like Stevenson. The youth movement, he might say, is simply the logical development of the anti-intellectual movement that began in the 1950s and has since been fostered by growing numbers of people in this commodified culture who have never trusted those impractical types who live in “ivory towers.” In any event, as a culture we have come to distrust the elderly (especially those who can think and speak coherently) and instead we consult our gut feelings and listen to the young as the sources of what we like to call “truth.” The result has been a general lowering of the culture to the level of what I have called the “new barbarism.” The attack on the universities has resulted in grade inflation and the dumbing down of the curriculum in the schools, and the distrust of those over thirty has resulted in the mindless rejection of all in authority, including parents and teachers, and the almost total dismissal of the notion of expertise, which, we are told, is “elitist.” To be sure, teachers and parents have been party to the retreat, as they have shown little courage and practically no confidence in themselves in the face of this assault. But, face it, some are in a better position to know than others, and the odds are that those who have lived longer and studied complex issues carefully probably know a thing or two.
Perhaps it is time to invent a new slogan: “Don’t trust anyone under thirty.” Or so says Mark Bauerlein, and this sentiment, if not those same words, is echoed in the writing of another contemporary student of America’s current cultural malaise.

I refer to Charles Pierce who, in his best-selling book Idiot America: How Stupidity Became a Virtue In The Land of The Free, points out that this attack on authority and expertise — and those over thirty — has resulted in a lowering of intelligence (in a country where more people vote for the latest American Idol than for the President of the United States), along with the reduction of all claims to simple matters of individual opinion, anyone’s opinion. And this in a nation founded on Enlightenment ideas articulated and defended by the likes of John Jay, James Madison, Thomas Jefferson, and Alexander Hamilton. We have devolved into a nation that has declared war on intelligence and reason, the cornerstones of the Enlightenment, and prefers instead the alleged certainty of gut feelings and the utterances of children. We have turned from books and hard evidence to the mindless drivel of reality shows and video games. Pierce identifies three “Great Premises” that he is convinced sum up the attitude of Americans in our day toward matters of fact and questions of ultimate truth:

(1) Any theory is valid if it sells books, soaks up ratings, or otherwise moves units.

(2) Anything can be true if someone says it [often and] loudly enough.

(3) Fact is that which enough people believe. (Truth is determined by how fervently they believe it.)

I suppose the last parenthetical comment might be regarded as a corollary of the third premise. But the fact is that in this relativistic age we distrust those who are in a position to know, we wait for the latest poll to decide what is true, and we adulate the young while we ignore the fact that, lost as they are in the world of digital toys, they know very little indeed. As Pierce has shown so convincingly, we are all becoming idiots. We have lost the respect for that truth which we do not manufacture for ourselves, but which stands outside the self and requires an assiduous effort to grasp even in part — together with our conviction that some things are truly evil while others are truly good. All truth is now mere opinion and the moral high ground has been leveled. We ignore the beauty all around us along with the ugly truths about what we are doing to the planet while we indulge ourselves in the latest fashion and seek the liveliest pleasure, convinced that it is the good. And all the while we wait eagerly to see what pearls of wisdom might fall from the young who are busy playing with their digital toys.

What will come of all this remains to be seen, but we might be wise to recognize the fact that those under thirty are still wet behind the ears and don’t know diddly about much of anything of importance. Their elders don’t seem to know much either, but if we recall that the admission of our own ignorance (as Socrates so famously said) is the beginning of wisdom, then that may be the way the adults in this country might begin to resume their role as mentors and our distrust of authority and expertise might be put to rest while we acknowledge that the children know even less than we do, and the majority does not determine what is true or false.

A Fact Is Not An Opinion

One of the most popular segments on E.S.P.N.’s Sports Center is called “Cold Hard Facts,” and it consists of one or more “experts” sitting down and giving their opinions about upcoming sports events — not facts. The confusion here between “facts” and “opinions” is instructive. We seem to have lost sight of a rather important distinction.

While there is nothing we claim to know that should ever be held beyond doubt, as I mentioned in an earlier blog, there is certainly a basic distinction between an opinion — which can be silly or sensible — and a fact, which has the weight of evidence and argument behind it. It is a fact that water freezes at 32 degrees Fahrenheit. It is a fact that objects fall toward the center of the earth. The most reliable facts are in the hard sciences and in mathematics (though there is some discussion whether a mathematical formula is a fact or simply a tautology). But even when an expert tells us that the Baltimore Ravens will repeat as Super Bowl Champions, that is an opinion.

As mentioned, opinions can be silly — as in “there’s a monster in my closet,” or sensible, as in “unless you are a really good bluffer, don’t raise the bet when holding a pair of twos.” And opinions can differ in degree, some being more likely or more probable than others. But they do not cross over into the territory of fact until the weight of argument and evidence is so heavy it cannot be moved. Thus the opinion that smoking causes cancer became fact once the correlation between the two proved very nearly invariable (there are still exceptions). And the opinion that humans evolved from more primitive animal species became fact when the weight of evidence became so heavy it could no longer be ignored — except by looking the other way.

One of the big controversies in our schools, especially in the South, is whether “intelligent design” is a fact or an opinion; that is, whether or not it should be taught along with the theory of evolution. But there is no possible way to disprove intelligent design, whereas there are any number of ways evolution could, in principle, be disproved — and it has withstood every such test. Thus the latter can be regarded as fact whereas the former cannot. Intelligent design, the claim that human evolution is guided by a Creator, is a matter of faith. It may have plausibility, but it cannot be proved or, more importantly, disproved. This is where Socratic doubt comes in.

The secret to Socrates’ method was to doubt until we could doubt no longer. At the point where a claim seems to be beyond doubt, we can claim it is true — so far as we know. The key to the Socratic method was questioning and attempting to disprove. That is the key to scientific method as well. Claims become factual when they are testable in principle and yet withstand every attempt at disproof. If there is no way, in principle, to test a claim, it can never rise to the status of fact. Karl Popper argued this was the case with Freud’s and Jung’s theories: they cannot be tested and proved or disproved, and therefore they cannot be regarded as scientific fact — no matter how useful they might prove to be in explaining human behavior.

We can talk until we are blue in the face about who was the best basketball player ever, or whether the souls of evil persons will suffer eternal punishment, but since no claim we make about the soul or the best basketball player ever could be tested or proved one way or the other, we never get beyond the realm of personal opinion or belief. The claim that the polar ice caps are melting is a fact. The claim that humans are part of the cause of global warming is an opinion, though a plausible one. There are core samples that support the claim on the basis of the amounts of carbon dioxide in the air over the past 150 years — since the Industrial Revolution. And in this case it would be wise to treat it as fact, because even if it turns out to be false, doing so will not have cost us a great deal. And if it turns out to be true, we will have taken steps to solve a serious problem facing our earth.

Distinctions help to clarify our thinking. When they are glossed over, it leads to confusion. That is my opinion, but it seems plausible. That is the most I can say until further review.