Whom To Trust

This is a post from four years ago which still seems relevant except for the fact that the lowered intelligence I speak of became even more apparent in the recent presidential election.

The truth is something different from the habitual lazy combinations begotten by our wishes. (George Eliot)

One of the major curiosities in this most curious age in which we live is the undue adulation the young receive at the hands of their elders. In fact, one might say the young now command center stage in this drama we call contemporary living, as their elders are ignored and shunted off to stage left, despite the fact that they spend countless hours trying to pretend they are young themselves. The young can do no wrong and we listen at doors for the latest piece of wisdom they might let slip from their lips. They are charming, lovely, beautiful — untainted by the stains of a corrupt world. If families are talking over the dinner table and the young speak up, silence immediately ensues in order to allow the youngsters to say their piece, though as they grow older they withdraw, becoming sullen and disinclined to speak at all. The notion that the kids are simply being rude has gone the way of the dinosaur. In any event, it never occurs to anyone that what the kids have to say may not be worth listening to, and that their withdrawal from the adult world is nothing more than a sign of their budding narcissism. But there it is: the result of the youth rebellion.
Mark Bauerlein, author of The Dumbest Generation, insists that it started in the 1960s when groups like the S.D.S. led the attack on the “establishment” in general and the universities in particular, giving birth to the slogan “Don’t trust anyone over thirty.” Richard Hofstadter would insist, I dare to say, that it started a decade earlier during the McCarthy hearings, or, perhaps, when Dwight Eisenhower was running against Adlai Stevenson and suddenly Americans began to distrust the “eggheads” like Stevenson. The youth movement, he might say, is simply the logical development of the anti-intellectual movement that began in the 1950s and which has since been fostered by growing numbers of people in this commodified culture who have never trusted those impractical types who live in “ivory towers.” In any event, as a culture we have come to distrust the elderly (especially those who can think and speak coherently) and instead we check our gut feelings and listen to the young as the sources of what we like to call “truth.” The result has been a general lowering of the culture to the level of what I would label the “new barbarism.” The attack on the universities has resulted in grade inflation and the dumbing down of the curriculum in the schools, and the distrust of those over thirty has resulted in the mindless rejection of all in authority, including parents and teachers, and the almost total dismissal of the notion of expertise which, we are told, is “elitist.” To be sure, the teachers and parents have been party to the retreat as they have shown little courage and practically no confidence in themselves in the face of this onslaught. But, face it, some are in a better position to know than others and the odds are that those who have lived longer and studied complex issues carefully probably know a thing or two.
Perhaps it is time to invent a new slogan: “Don’t trust anyone under thirty.” Or so says Mark Bauerlein and this sentiment, if not those same words, is echoed in the writing of another contemporary student of America’s current cultural malaise.
I refer to Charles Pierce who, in his best-selling book Idiot America: How Stupidity Became a Virtue in the Land of the Free, points out that this attack on authority and expertise — and those over thirty — has resulted in a lowering of intelligence (in a country where more people vote for the latest American Idol than for the President of the United States), along with the reduction of all claims to simple matters of individual opinion, anyone’s opinion. And this in a nation based on Enlightenment ideas articulated and defended by the likes of John Jay, James Madison, Thomas Jefferson, and Alexander Hamilton. We have devolved into a nation that has declared war on intelligence and reason, the cornerstones of the Enlightenment, and prefers instead the alleged certainty of gut feelings and the utterances of children. We have turned from books and hard evidence to the mindless drivel of reality shows and video games. Pierce defends three “Great Premises” that he is convinced sum up the attitude of Americans in our day to matters of fact and questions of ultimate truth:
(1) Any theory is valid if it sells books, soaks up ratings, or otherwise moves units.
(2) Anything can be true if someone says it [often and] loudly enough.
(3) Fact is that which enough people believe. (Truth is determined by how fervently they believe it).
I suppose the last parenthetical comment might be regarded as a corollary of the third premise. But the fact is that in this relativistic age we distrust those who are in a position to know, we wait for the latest poll to decide what is true, and we adulate the young while we ignore the fact that, lost as they are in the world of digital toys, they know very little indeed. As Pierce has shown so convincingly, we are all becoming idiots. We have lost the respect for that truth which we do not manufacture for ourselves, but which stands outside the self and requires an assiduous effort to grasp even in part — together with our conviction that some things are truly evil while others are truly good. All truth is now mere opinion and the moral high ground has been leveled. We ignore the beauty all around us along with the ugly truths about what we are doing to the planet while we indulge ourselves in the latest fashion and seek the liveliest pleasure, convinced that it is the good. And all the while we wait eagerly to see what pearls of wisdom might fall from the young who are busy playing with their digital toys.
What will come of all this remains to be seen, but we might be wise to recognize the fact that those under thirty are still wet behind the ears and don’t know diddly about much of anything of importance. Their elders don’t seem to know much either, but if we recall that the admission of our own ignorance (as Socrates so famously said) is the beginning of wisdom, then that may be the way the adults in this country might begin to resume their role as mentors and our distrust of authority and expertise might be put to rest while we acknowledge that the children know even less than we do, and the majority does not determine what is true or false.


Adam Smith Revisited

The usual take on Adam Smith is that he was the father of modern capitalism, an apologist for man’s greed and ambition, inventor of the notion of the “invisible hand” that would lead to prosperity and happiness for one and all in a capitalistic economy — trickle down, as it were. The fact is that he was much more famous in his day for his moral philosophy as author of The Theory of Moral Sentiments, in which he insisted that human beings were born with a natural sympathy for one another that would temper their dealings and — in the case of capitalism — keep them from gouging one another and making huge profits at the cost of exploiting their workers and screwing one another. As he said in Moral Sentiments:

“How selfish soever man may be supposed, there are evidently some principles in his nature which interest him in the fortune of others, and render their happiness necessary to him, though he derives nothing from it except the pleasure of seeing it.”

Smith’s reference here to the supposed selfishness of human beings is a direct reference to the cynical Bernard Mandeville, who insisted that thinkers like Lord Shaftesbury and Adam Smith were all wet to insist that men were naturally virtuous because, in fact, they are selfish and self-seeking. Mandeville’s infamous little book The Fable of the Bees, which develops this theme at length, was severely attacked by an eighteenth-century English audience led by thinkers such as Shaftesbury, Bishop Butler, Francis Hutcheson, Edward Gibbon, and Adam Smith, who agreed that Mandeville was all wet. The group even included such skeptical thinkers as David Hume, though he was not as vociferous a proponent of the moral sense theory as the others. And these thinkers were supported by John Wesley and his Methodist followers, who were very active, especially among the very poor. In any event, these folks were all great minds who comprised what came to be called the Scottish “moral sense” school of philosophy, insisting that humans are born with a natural sensitivity to others, that we all exhibit the “social virtues” of sympathy, benevolence, compassion, and fellow-feeling. As Smith notes, sympathy cannot be a disguised form of self-interest, or we could not explain how a man could sympathize with a woman feeling the pains of childbirth. Sympathy is primal; it is not self-interest posing as something else.

The theme was presupposed when he later wrote the Wealth of Nations. Very few have read the 900-page book, but many have perused its pages and picked out passages that reinforce their own particular views of the nature of capitalism and the desirability of the capitalistic enterprise to guarantee human happiness. It is not necessary to repeat here what I have written before of Smith’s reservations about raw capitalism, nor to repeat the excellent comments on my blog by Jerry Stark, except to note that Smith had serious concerns about the deleterious effects of the profit motive on human beings.

To be sure, there is no question but that capitalism has improved the lot of most people in this society. We live in a country where the average person has so many things that would have made kings jealous in Smith’s day; we live longer, and we are healthier. But what is noticeably lacking today is the social virtue that Smith presupposed in his treatise. And without moral sensibility, the “fellow-feeling” of which Smith speaks, capitalism is reduced to fierce competition among people who are all reaching for the same goals of fabulous wealth, status, power, and prestige. Somewhere along the line the social virtues that Smith simply assumed were prevalent in humankind have all but disappeared, and the ugly qualities that accompany capitalism are left unrestrained by the gentler, human sympathies.

The fact is that the eighteenth-century thinkers who founded this nation, who wrote the “Declaration of Independence” and the “Constitution,” all presupposed the very same social virtues that Smith speaks of. They assumed, as James Madison says quite clearly in a number of the Federalist Papers, that virtuous people would elect wise and virtuous leaders who would promote the common good. This was axiomatic in English and American political and moral thought at that time, and was regarded as the sine qua non of a republican government. And yet we look around and fail to see much virtue at all; it has been replaced by the greed and avarice that capitalism breeds when it is not tempered, as Smith simply assumed it would be, by the social virtues. Recall Madison’s comment in Federalist Paper #55:

“Were the pictures which have been drawn by the political jealousy of some among us [Mandeville?] faithful likenesses of the human character, the inference would be that there is not sufficient virtue among men for self-government; that nothing less than the chains of despotism can restrain them from destroying and devouring one another.”

I have spoken before about the transition of the word “virtue” into “value,” and the consequent reduction of virtues to feelings that are not in the least bit shared by all, but are purely subjective and personal. You like what you like and you value what you value; I like and value what I like and value. And that’s an end to it. But this seemingly innocent alteration in the way we look at things and speak about things reflects a deeper attitude toward our fellow human beings, a lack of sympathy and fellow-feeling accompanied by a conviction that there is nothing that is valuable or true, and that human happiness can be bought and paid for by grubbing about in the market place, trading stocks, exploiting our fellow humans, accumulating as much stuff as possible, climbing the political and social ladder, and ignoring our responsibilities to one another.

We have come a long way, baby, in the name of “progress.” What is not so clear is that we are any the happier or that what we have thrown away was not more valuable than what we have kept.

Term Limits

The Federalist Papers are a collection of essays written by James Madison, Alexander Hamilton, and John Jay. They were an attempt by these men to persuade the citizens of New York to ratify the Constitution, and the book is generally regarded as the best collective statement of the meaning and purpose of the document they wanted New York to ratify. Madison is usually credited with writing the 55th Paper. In that Paper he shows how the Founders simply assumed that the members of the House of Representatives would change every two years. They thought that a good thing — new blood, and folks elected because they more closely represented the wishes of their constituency than did the Senate, which was to be chosen by the several State Legislatures. There are other assumptions at work in this Paper, as there are throughout the Federalist Papers as a whole. One of those assumptions had to do with the “virtue” — which at that time meant “civic virtue” — of the ordinary citizen, who would always attempt to do what was best for the country at large. In response to the critics who had their doubts about the virtue of the citizens, or indeed of those who represented them, Madison had this to say:

“I am unable to conceive that the people of America, in their present temper, or under circumstances which can speedily happen, will choose, and every second year repeat the choice of, sixty-five or a hundred men who would be disposed to form and pursue a scheme of tyranny or treachery. . . . I am equally unable to conceive that there are at this time, or can be in any short time, in the United States, any sixty-five or a hundred men capable of recommending themselves to the choice of the people at large, who would either desire or dare, within the short space of two years, to betray the solemn trust committed to them. . . . Were the pictures which have been drawn by the political jealousy of some among us faithful likenesses of the human character, the inference would be that there is not sufficient virtue among men for self-government; that nothing less than the chains of despotism can restrain them from destroying and devouring one another.”

What we have here, by contemporary standards, is eighteenth century naiveté. Madison shows himself convinced that the citizens of this country have sufficient virtue to select the very best legislators and that those same legislators would commit themselves to the common good — since they are in office for only two years — or they would be dismissed from office and replaced by those who would more nearly reflect the views of those who elected them in the first place.

What has come about, as we all now know, is a government of extremely well-paid professional politicians who are elected again and again and who cling to the offices they are elected to the way a drowning man clings to the life raft that will save his life. The citizens have shown themselves bereft of “virtue” to the extent that if they vote at all they vote for individuals who represent the interests not of the citizens at large, but of the corporations that put up the money to have them nominated in the first place. The allegiance of those elected officials is, naturally, to those very corporations they are bound to and not to the people whom they supposedly represent.

What it all boils down to is that term limits would be the only thing at this point that would restore this government to a shadow of the image the Founders had in mind when they wrote the Constitution. The basic concept that comes through loud and clear on nearly every page of the Federalist Papers is that of a well-informed citizenry that would insist that their representatives work for them or they would be summarily replaced. This will not, it cannot, happen today as long as members of Congress are allowed to hold office interminably. We have term limits for the President and there should be term limits for members of Congress. Otherwise, we shall have the continued boondoggle that passes for representative government in which representatives pursue self-interest (which is identical with corporate interest) and not the best interest of their constituents or their country, a country in which the citizens are currently bound by the “chains of despotism” if you will.

Madison’s Amendment

In an interesting article about the original 20 items on James Madison’s Bill of Rights — reduced to 12 after considerable debate in the First Congress and later to 10 during the ratification process — it is made fairly clear what the man was thinking when he wrote those amendments.

We know that the major concern of those who were debating the Constitution was the issue of ratification. How to write the Constitution in such a way that the required number of states would agree to it? Originally it mentioned the abolition of slavery, but that had to be cut to assure that the Southern states would climb on board. A number of those items also had to be cut from Madison’s 20 “Rights,” though they were eventually reworked into later amendments — such things, for example, as restricting Federal judicial powers. Another was added as late as 1992. Compromise was necessary in a new nation where individual rights, and the rights of the states themselves, must be guaranteed. The original Second Amendment reads as follows:

James Madison
“The right of the people to keep and bear arms shall not be infringed; a well armed and well regulated militia being the best security of a free country: but no person religiously scrupulous of bearing arms shall be compelled to render military service in person.”

The final clause was dropped, sad to say. But, like that clause, the remaining part of the statement makes it abundantly clear that the major concern in this amendment is the right of the militia to bear arms, since the right of “the people” is predicated on the claim that “a well armed and well regulated militia” is necessary to guarantee that the country remain free. And the reference to “military service” in the omitted clause also makes it clear that the militia was of major concern — for reasons of self-defense.

It is a wonder in these days of heated debate over the need for some sort of gun control to limit the sales of automatic weapons to possible terrorists in this country that few bother to recall what the founders were most concerned about when they agreed to the Second Amendment. Much is said about our “Constitutional Right” to bear arms, but nothing whatever is said about this so-called right being predicated on the maintenance of a militia. With the disappearance of the militia the right to bear arms also disappears. At best, one could argue that the National Guard has such a right. But not every Tom, Dick, and Sally — and certainly not those who are not of sound mind.

Note: After writing this post I was pleased to read an article quoting various Constitutional lawyers on this topic that support what I have said here:

For almost 200 years after it was adopted, the Second Amendment was interpreted to protect the right of militias to bear arms, but not individuals. In 1939, the Supreme Court ruled in United States v. Miller that restricting access to shotguns or machine guns by citizens outside the military was permissible. . . .

[Harvard Law Professor Laurence Tribe added that] the Second Amendment does not stand in the way of gun legislation to make the country safer.

“The largest misconception is that the Second Amendment justifies — or ever has justified — our nation’s abysmal record in protecting innocent people from avoidable gun violence, . . . The Second Amendment and the Constitution as a whole are abused by those who treat them as a sick suicide pact.”

So while there is a legitimate political debate to be had about the merits of gun control, Tribe says, conservatives are wrong to make it a constitutional issue.

This, of course, does not imply that the debate over gun control will end, though it should quiet those who argue that carrying automatic weapons is a “right” guaranteed by the Second Amendment. However, it most assuredly will not.


Movers and Shakers

Machiavelli (Courtesy of Wikipedia)


Machiavelli’s Prince was written in the sixteenth century ostensibly as advice to the rulers of Florence — especially Lorenzo de’ Medici — about how to achieve and maintain power. Or it may have been written to alert the common folk to what their rulers were up to. It is so vivid and frank that people like Jean-Jacques Rousseau have been tempted to insist that it is satirical: surely, politics isn’t that rough and cut-throat! The Catholic Church disagreed with Rousseau and banned the book soon after it appeared. For my part, I think Machiavelli was being quite honest: politics is, indeed, a matter of doing whatever it takes to achieve the desired objective. And the “objective” is always to gain and maintain power. In his day, it was the Medici family who pursued that goal. In our day it is the corporations, where the CEOs make 475 times as much money as their average employees and “morality” is a word never used.

In fact, there is a most interesting and provocative parallel here that might have missed a great many readers of Machiavelli’s classic. The Medici were the wealthiest family in Florence. Today’s power-brokers are the very wealthy, as was the case in Machiavelli’s day. Money is power. Thus, while we like to delude ourselves about democracy resting upon the power of the people, Machiavelli would insist that the people who have the power are, in fact, those who hold the purse strings. The people simply go through the motions and exercise the very few options open to them.

Thus, while you and I might bemoan the fact that the planet is suffering from severe attacks by greedy people and something must be done, and the quicker the better, as long as people like the Koch brothers are the ones who decide what will be done, the planet must suffer. They hope to stack the political deck with hand-picked puppets and rid the country of restraints on “free enterprise” — by such agencies as the EPA. To be sure, today’s movers and shakers failed to achieve all they hoped for during the past election, despite the millions of dollars they spent to guarantee that the puppets they had selected for public office were successful in the national elections. But they have sworn that this will not happen again in the mid-term elections. And given their determination together with the money they have at their disposal, success seems inevitable. The vision of the forefathers that was framed in the Enlightenment optimism of the eighteenth century, the vision that assured those who embraced their new nation that the people would in fact rule in this Democracy — as reflected in Madison’s statement in the Federalist Papers that those in positions of political prominence would be removed if they failed to attend to the voice of those who elected them — turns out to have been a pipe dream. Sad to say.

In the end, then, those of us who care about our planet and our country will have to sit by with hands tied and watch those who rule — who are, in fact if not in principle, the movers and shakers of today. They are the ones who hold the reins of power by means of the monies they have to spend on electing puppets who will respond only to the pull of the strings wielded by the power-brokers themselves. And, of course, those same people couldn’t care less about the planet or their country. They care only about the bottom line. They are blinded by greed and the love of power and care only about what will bring them what they want. So let’s not fool ourselves. Machiavelli told us all about it centuries ago, and things have not really changed that much since then. Those who have money and power seek only to maintain their positions of strength while the rest of us seek the latest diversion they provide us with.

Does this mean that I, personally, will no longer hope for real change, that I will no longer send in my piddling amounts of money to help support those few politicians who seem to have something resembling a conscience? Certainly not. One must free one’s hands and continue to swim against the tide, even when it is certain to be heading in the wrong direction. I will continue to hope and I will continue to struggle and raise my shrill voice. But though I am not a pessimist or even a fatalist, I am a realist who has learned from the wisest and brightest of those who have passed before me. I have a pretty good idea how things will turn out.

The Second Amendment

James Madison, who wrote the Constitution in close association with his friend Thomas Jefferson, did not think a Bill of Rights was necessary. Alexander Hamilton agreed and said in a lengthy discussion of a possible Bill of Rights in Federalist Papers #84,  “The Constitution is its own Bill of Rights.” These men worried that if a list of such rights was drawn up something would be left out or, worse yet, folks would think those were the only rights that citizens have. Indeed, Hamilton went on to note that a Bill of Rights is both “dangerous” and “unnecessary,” since he thought such rights are clearly implied in the Constitution itself and need not be specified or if specified could be circumvented by devious minds. Hamilton assures his readers that “Here in strictness people surrender nothing [by not having their rights specified]; and as they retain every thing they have no need of particular reservations. . . . [the Constitution] contains all which in relation to their objects, is reasonably desired.” Further, the men thought that citizens’ rights were self-evident, a favorite concept of Enlightenment thinkers.

But since several states were reluctant to ratify the Constitution without a specified Bill of Rights, Madison eventually drew up a list of twelve such rights that were soon pared down to ten. The one that is most talked about these days is the right of citizens to keep and bear arms, the Second Amendment. This right was specified because the Founders regarded militias, raised by the states and paid by the states as the need arose, as essential to the freedom of the American people. Their model, in all likelihood, was Cincinnatus, the citizen/farmer in the early days of Rome, who fought when the need arose and then went back to his farm when the danger had passed. The Founders were known to have greatly admired the Roman Republic, using it as a model for their own government. And given their experience with the constant presence of the red-coated British, they were very concerned about the possibility of a standing army — even their own army — that would strengthen the government and weaken the people’s freedom. Indeed, when they were considering ratification of the Constitution, Hamilton had to assure his New York readers, in Federalist #24, that they need not fear the presence during peacetime of a standing army: it simply wouldn’t happen. The states would retain the power to raise militias when necessary and disband them when the danger had passed: they would be “well regulated.” Thus, in order to avoid a standing army, state militias were essential. The conjoined militias had, after all, won the Revolution; and during Washington’s presidency, a collection of several state militias amounting to 17,000 men was quickly rounded up and, led by the President himself, headed west to put down the Whiskey Rebellion in Western Pennsylvania. Word got out that the militia was headed their way and the Rebellion broke up. At that time it was determined that the militias could safely protect the citizens of the new nation.

The point of this little history lesson is to show that the Second Amendment was less about the right to keep and bear arms than it was about the need for armed militia. Indeed, when, much later, in 1934, the Congress passed the National Firearms Act to keep such things as sawed-off shotguns out of the hands of gangsters, the case eventually went to the Supreme Court whose decision clearly centered around the Founders’ express need for a militia. In their decision, they reasoned that “The Court cannot take judicial notice that a shotgun having a barrel less than 18 inches long has today any reasonable relation to the preservation or efficiency of a well regulated militia, and therefore cannot say that the Second Amendment guarantees to the citizen the right to keep and bear such a weapon.” Might not the very same thing be said of today’s automatic weapons?

In fact, if you read the Second Amendment carefully, you will see that it presents us with a compound statement in which two clauses are interdependent. It reads, “A well regulated militia being necessary to the security of a free state, the right of the people to keep and bear arms shall not be infringed.” In other words it states that since a militia is necessary to defend freedom, the right to keep and bear arms shall not be infringed. The statement is quite precise: one thing necessitates the other. If there were no need for a militia — as, say, if there were a standing army, navy, marine corps, air force, and national guard — then there would be no grounds for the so-called “right” to keep and bear arms. And, conversely, the right to keep and bear arms need not be recognized when the need for a militia disappears — because of the presence of a standing army, for example.

The relentless attempts by the arms manufacturers — for the most part — to bully this Congress and the Supreme Court into allowing any and all weapons in the hands of any and all citizens, regardless of age, fly in the face of the Second Amendment as it was written and understood for many years. The arguments by groups such as the NRA tend to focus exclusively on the “right” itself, and ignore the explicit concern for militias. But, assuredly, the fact that state militias are a thing of the past implies that the right to keep and bear arms can no longer be said to be protected by this Amendment. Perhaps in the end Hamilton was right — certainly with respect to the Second Amendment: it has proven to be “dangerous.”

Defining Moments

In the truly remarkable seven-part HBO series on John Adams there is one of those defining moments that almost redeems the American movie-making industry, allowing us to forget for a moment that so many movies today are just technical display with no plot and maximum sex and violence. That moment occurs immediately after the representatives from the thirteen colonies meeting in Philadelphia have voted to become independent from England. After months of acrimonious debate and the delaying tactics of a number of cautious representatives who sensibly feared the might of British arms and pleaded patience, the vote was taken and the results read to the small contingent in the crowded room. At that moment, the camera backs off and slowly pans the faces in the room; there is no sound; there is little or no movement for nearly 10 seconds — it seems like hours — as the delegates realize what they have just done. One imagines them thinking: “My God! We have just declared war on one of the most powerful nations on earth — and we have no army and no navy! We are marked men with targets on our chests. If we are caught we will be hanged.” The moment is powerful and extremely effective.

At the time of America’s declaration of independence there were no political parties. There were, of course, grave differences among the various colonies, each of which prized its own uniqueness. There was a growing rift between North and South which would eventually erupt into the Civil War — a slave economy in the South violently opposed to the aggressive, commercial enterprise of the North. That tension soon gave birth to what eventually became political parties, the Federalists in the North and the Republicans in the South. The former, led by people like Alexander Hamilton and John Adams, tended toward stronger national unity; the latter, led by folks like Thomas Jefferson and James Madison, insisted on autonomy for the individual states and a minimum of national interference. As President, Adams signed into law the infamous “Alien and Sedition Act,” designed to protect the new nation from foreign spies. And one of the first things Jefferson did as President was to disband the navy — which was a bit of a joke to begin with. This difference of opinion about what the new nation was to become eventually broke up the friendship between Jefferson and Adams, who had grown very close in those formative years. Late in their lives they were reconciled, and they died on the same day, fifty years to the day after the Declaration of Independence. Remarkable!

But those early differences among the various delegates were buried in a common concern: rid the colonies of the dreaded British and declare independence as a confederation of states free of English Parliamentary abuse. During the country’s bicentennial celebration Henry Steele Commager was asked what the major difference was between the America of 1776 and the America of 1976. He did not hesitate, but said the major difference was that two hundred years before, America was looking to the future; now we have become focused on the present and tend to ignore the future altogether. There is no question that Commager was right. But there was another difference as well: in the eighteenth century the men who got together in Philadelphia to deal with the abuses of a common enemy were able to put aside their differences and act in common. Despite the acrimony, deep and genuine ideological differences, and the relentless heat of a Philadelphia summer, they were able to decide on a common course of action and prepare to act together, whatever the costs. They were marked men, traitors to the Mother country. But they were determined and of one mind (for the most part). That doesn’t even seem possible any more.

We are at a time in our history when we need more than ever to act in accord. Our country is not under attack (seriously), but our planet is. We need to put aside our differences, like those delegates, and act with one common accord to attempt to reverse the terrible consequences of a damaged planet we are in the process of destroying. But the special interests, Big Oil and Gas and folks like the Koch brothers, have all the cards and seem determined to play out the hand they have dealt themselves — regardless of the consequences. Once again we have acrimony and tension between those who fear for the future of the planet and those whose love of unlimited profits — or just plain ignorance — blinds them to the problems that stare us all in the face. In Congress, loyalty to political party has completely erased loyalty to what the Founders referred to as the Common Good. It would appear that this time there will be no meeting of the minds; folks will not come together, put aside their differences, and cooperate to reach agreement on what must be done. This is, assuredly, a defining moment, not in a film made for television, but in real life.

Defending The Eggheads

In 1952 the right-wing novelist and essayist Louis Bromfield wrote the following barb regarding the intellectual, who was increasingly referred to as an “egghead.”

Egghead: A person of spurious intellectual pretensions, often a professor or the protégé of a professor. Fundamentally superficial. Over-emotional and feminine in reactions to any problem. Supercilious and surfeited with conceit and contempt for the experience of more sound and able men. Essentially confused in thought and immersed in a mixture of sentimentality and violent evangelism. A doctrinaire supporter of Middle-European socialism as opposed to Greco-French-American ideas of democracy and liberalism. Subject to the old-fashioned philosophical morality of Nietzsche which frequently leads him into jail or disgrace. A self-conscious prig, so given to examining all sides of a question that he becomes thoroughly addled while remaining always in the same spot. An anemic bleeding heart.

To add to the mix, President Eisenhower later offered: “by the way, I heard a definition of an intellectual that I thought was very interesting: a man who takes more words than are necessary to tell more than he knows.” And so, led by the likes of Joe McCarthy, the war against those who use their minds and choose their words carefully began. And despite McCarthy’s dwindling popularity, the cry was swiftly taken up by hordes of more practical and down-to-earth folks who have always had a distrust of poets, artists, dreamers, and those reputed to live in ivory towers.

But we might note that Eisenhower’s definition might well include Bromfield, who uses way too many words and doesn’t seem to know what he is talking about. After all, it’s not at all clear how those dry intellectuals can at the same time be “over-emotional and feminine” (whatever that might mean). Further, socialism cannot easily be set in opposition to democracy, since they are not of a kind: one is an economic system and the other a political one. There are highly successful countries that blend socialism and democracy in interesting ways. Moreover, Bromfield might even fit his own description of an egghead, since he is “supercilious and surfeited with conceit and contempt for the experience of more sound and able men.” But we leave these enticing thoughts because there are larger issues here.

To begin with, Bromfield does make a couple of good points. For one, intellectuals do tend to look at every side of complex issues, and it often renders them ineffectual. Accurate or not, the common image of the intellectual is the philosopher Thales, who reportedly fell into a well while gazing at the stars! However, we might recall that Plato’s notion that philosophers should be kings was dismissed out of hand by that other egghead, Aristotle, who preferred a person of “practical wisdom,” which meant a person with good common sense. Neither, however, would have approved of a political leader who rushes blindly into action before he or she has fully assessed the consequences of that action — like, say, engaging in war in Iraq or Afghanistan. Thus, if the alternative to the egghead is the “real world” person of a practical mien who jumps to conclusions and rushes headlong into disaster, then one would think the intellectual approach is to be preferred. Or, perhaps, there is a third option: careful deliberation followed by determination to follow the agreed-upon course of action. Indeed, this is the sort of thing James Madison envisioned when he helped draft the Constitution. (Now, there’s an egghead if there ever was one!) This was supposed to be the strength of a democracy. We were to be a nation that took its time to do things right, examining both sides of complex issues and reaching a consensus when possible. We were supposed to deliberate and use our minds; in order to make sure we could do that, an educated citizenry was the keystone. Both Madison and his close friend (and another egghead) Thomas Jefferson agreed about that.

But the anti-intellectual ethos that permeates this culture today has lent its considerable weight to the attack on the public schools and the notion that education will lead this country into a brighter tomorrow has been lost in the concern over more practical matters: like job training and the economy. To be sure, Bromfield is right that intellectuals can be a pain in the ass. But one would hope that in this complex world of ours we would willingly take time to listen to a person who knows what he or she is talking about rather than mindlessly follow the person who shoots off his mouth and rushes blindly into situations filled with hidden dangers.

Whom To Trust?

The truth is something different from the habitual lazy combinations begotten by our wishes. (George Eliot)

One of the major curiosities in this most curious age in which we live is the undue adulation the young receive at the hands of their elders. In fact, one might say the young now command center stage in this drama we call contemporary living, as their elders are ignored and shunted off to stage left, despite the fact that they spend countless hours trying to pretend they are young themselves. The young can do no wrong and we listen at doors for the latest piece of wisdom they might let slip from their lips. They are charming, lovely, beautiful — untainted by the stains of a corrupt world. If families are talking over the dinner table and the young speak up silence immediately ensues in order to allow them to say their piece, though as they grow older they withdraw, become sullen and disinclined to speak at all.  The notion that the kids are simply being rude has gone the way of the dinosaur. In any event, it never occurs to anyone that when they speak what the kids have to say may not be worth listening to and their withdrawal from the adult world is nothing more than a sign of their budding narcissism. But there it is: the result of the youth rebellion.

Mark Bauerlein, author of The Dumbest Generation, insists that it started in the 1960s when groups like the S.D.S. led the attack on the “establishment” in general and the universities in particular, giving birth to the slogan “Don’t trust anyone over thirty.” Richard Hofstadter would insist, I dare to say, that it started a decade earlier during the McCarthy hearings, or, perhaps, when Dwight Eisenhower was running against Adlai Stevenson and suddenly Americans began to distrust the “eggheads” like Stevenson. The youth movement, he might say, is simply the logical development of the anti-intellectual movement that began in the 1950s and which has since been fostered by growing numbers of people in this commodified culture who have never trusted those impractical types who live in “ivory towers.” In any event, as a culture we have come to distrust the elderly (especially those who can think and speak coherently) and instead we check our gut feelings and listen to the young as the sources of what we like to call “truth.” The result has been a general lowering of the culture to the level of what I have called the “new barbarism.” The attack on the universities has resulted in grade inflation and the dumbing down of the curriculum in the schools, and the distrust of those over thirty has resulted in the mindless rejection of all in authority, including parents and teachers, and the almost total dismissal of the notion of expertise which, we are told, is “elitist.” To be sure, the teachers and parents have been party to the retreat as they have shown little courage and practically no confidence in themselves in the face of this assault. But, face it, some are in a better position to know than others and the odds are that those who have lived longer and studied complex issues carefully probably know a thing or two. 
Perhaps it is time to invent a new slogan: “Don’t trust anyone under thirty.” Or so says Mark Bauerlein, and this sentiment, if not those same words, is echoed in the writing of another contemporary student of America’s current cultural malaise.

I refer to Charles Pierce who, in his best-selling book Idiot America: How Stupidity Became a Virtue in the Land of the Free, points out that this attack on authority and expertise — and those over thirty — has resulted in a lowering of intelligence (in a country where more people vote for the latest American Idol than for the President of the United States), along with the reduction of all claims to simple matters of individual opinion, anyone’s opinion. And this in a nation based on Enlightenment ideas articulated and defended by the likes of John Jay, James Madison, Thomas Jefferson, and Alexander Hamilton. We have devolved into a nation that has declared war on intelligence and reason, the cornerstones of the Enlightenment, and prefers instead the alleged certainty of gut feelings and the utterances of children. We have turned from books and hard evidence to the mindless drivel of reality shows and video games. Pierce defends three “Great Premises” that he is convinced sum up the attitude of Americans in our day to matters of fact and questions of ultimate truth:

(1) Any theory is valid if it sells books, soaks up ratings, or otherwise moves units.

(2) Anything can be true if someone says it [often and] loudly enough.

(3) Fact is that which enough people believe.  (Truth is determined by how fervently they believe it).

I suppose the last parenthetical comment might be regarded as a corollary of the third premise. But the fact is that in this relativistic age we distrust those who are in a position to know, we wait for the latest poll to decide what is true, and we adulate the young while we ignore the fact that, lost as they are in the world of digital toys, they know very little indeed. As Pierce has shown so convincingly, we are all becoming idiots. We have lost the respect for that truth which we do not manufacture for ourselves, but which stands outside the self and requires an assiduous effort to grasp even in part — together with our conviction that some things are truly evil while others are truly good. All truth is now mere opinion and the moral high ground has been leveled. We ignore the beauty all around us along with the ugly truths about what we are doing to the planet while we indulge ourselves in the latest fashion and seek the liveliest pleasure, convinced that it is the good. And all the while we wait eagerly to see what pearls of wisdom might fall from the young who are busy playing with their digital toys.

What will come of all this remains to be seen, but we might be wise to recognize the fact that those under thirty are still wet behind the ears and don’t know diddly about much of anything of importance. Their elders don’t seem to know much either. But if we recall that the admission of our own ignorance (as Socrates so famously said) is the beginning of wisdom, then the adults in this country might begin to resume their role as mentors, and our distrust of authority and expertise might be put to rest, as we acknowledge both that the children know even less than we do and that the majority does not determine what is true or false.

Do Corporations Have Rights?

There is no mention of corporations in either the Declaration of Independence or the Constitution of the United States. But as early as 1819, in Dartmouth College v. Woodward, the Supreme Court suggested that corporations were entitled to make and enforce contracts, thus implying early on that they should be treated as persons with rights protected by the Constitution. By 1886 it was simply assumed “without argument” that corporations are persons. The absurdity of this interpretation became glaringly clear not long ago when the Supreme Court decided in the Citizens United case that spending limits should not be placed on corporations, under protection of the First Amendment. That is, corporations should be allowed to spend as much on political campaigns as they see fit on the grounds that, as persons, they have a right to freedom of speech. Yes, that’s right: corporations are not only persons, they are entitled to give politicians as much money as they want under the aegis of freedom of speech.

None of these court decisions considered the rather basic fact that if corporations have rights they must also have responsibilities. Fines are levied against corporations in some cases for the atrocities they commit, and in that sense they can be “held responsible” for those acts, but this can hardly be called “having responsibilities.” The only responsibilities corporations acknowledge are to their stockholders, and these, too, can hardly be called “responsibilities,” since maximizing profits is simply what corporations are supposed to do. There is very little, if any, talk about responsibilities to “stakeholders” in corporate inner circles — or about moral or ethical responsibilities, either. Further, it’s never clear just who the corporations are. Are they the CEOs or the boards that govern them? Or are they the stockholders? Or are they the engineer who turns the handle that releases poisonous gas and kills 2500 people? The question threatens to become positively metaphysical. But assigning corporations rights without acknowledging their responsibilities makes no sense whatever. Rights without responsibilities can apply only to children and the mentally challenged; otherwise the notion is absurd on its face. (I hesitate to discuss the question whether corporations can be said to be mentally challenged.)

I have always thought that the concept of balance of powers under the Constitution is one of the most brilliant ideas ever conceived by the human mind. It arose, of course, in the mind of the French philosopher Montesquieu in the eighteenth century, who saw this balance as necessary for the protection of individuals in a political group. Kings are not to be trusted. Presidents are not to be trusted. Those in power in general are not to be trusted. But if we balance power among the executive, the legislature, and the judiciary, we can control the abuse that nearly always follows from too much power in the hands of one person. That’s the idea.

The United States Supreme Court was the result of this thinking, of course, as it worked its way down through John Locke, Thomas Jefferson, and James Madison. And it is an inspired notion: a court that would be above political influence since members are not elected but appointed for life. And, indeed, some of the decisions of the court over the years have been brilliant. But the decision in January of 2010 to grant corporations the status of persons with rights under the First Amendment is simply stupid, if not absurd — as noted above. And it certainly does not appear to have been apolitical. Not only are corporations not persons, unlimited donations to a political election clearly do not constitute free speech.

In any event, the concept of “person” is a moral concept, fully explored in the ethics of Immanuel Kant and used by the Founders to apply to citizens with both rights and responsibilities. As Kant examined the notion, persons are “ends in themselves,” never merely a means to an end. In other words, it is morally wrong to use others for one’s own purposes. Kant stressed responsibilities, or duties, over rights: it is precisely because we can recognize our duties to other persons (who are also ends in themselves) that we have rights. Responsibilities are primary; rights are derivative. But corporations are clearly not “ends in themselves”; they are simply a means to an end, namely, profit. Further, as mentioned, they have no responsibilities. The appropriation of a moral concept for legal purposes by the Court in 1819, applied to an entity that was not even human, was inappropriate; extending the notion further as the Court did recently borders on the bizarre.

The absurdity of this decision can be seen by considering what other rights are guaranteed to persons under the First Amendment, namely, the right to practice religion as one sees fit, to assemble, and to petition the government for redress of grievances. The Constitution also guarantees every citizen the right to vote and to run for national office. Is the Court now saying that a corporation can run for President if it is thirty-five years old? Nonsense! But just as it would be absurd to think about corporations assembling, practicing religion, running for public office, or voting, it is also absurd to think that “they” have the right to free speech — assuming that this is what giving stacks of money to political candidates amounts to. This has to be one of the worst decisions ever to come from this Court, and it deserves to be overturned by a Constitutional amendment; a movement to do so is afoot. That movement, however, seems sluggish at best — a reflection, perhaps, of the population’s general indifference to political issues and the unwillingness of those in power to bite the hand that feeds them.