Still Wondering

I posted this (slightly modified) piece two years ago — before the Age of The Trumpet and Alternative Facts — but it still seems pertinent. Perhaps more so! So I decided to repost it in the hope that it might be of interest to some of my readers who missed it the first time around.

As Hannah Arendt uses the term, “totalitarianism” is any form of government in which those in power seek to gain “total domination” of the minds and actions of the citizens by any means — violent or otherwise. In this sense, Huxley’s Brave New World is a totalitarian state in which a benign dictator, convinced that he is doing the right thing, makes sure his people think they are free while all the time he guarantees their continued mental captivity in a world of pleasure and endless diversions. If this sounds a bit familiar, it may well be, though in these United States it is not clear whether there is a single person or a group that is in complete control. But it is certainly the case that we are provided with endless diversions and a mind-boggling array of entertainment to keep us convinced we are free while all the time we are buying what the media are selling, electing inept officials who are cleverly marketed like toothpaste, and embracing the platitudes we hear repeatedly. Seriously, how many people in this “free” nation really use their minds?

In any event, I came across a passage or two in Arendt’s remarkable book about totalitarianism — which I have alluded to previously — that are well worth pondering. Bear in mind that she was writing in 1948 and was primarily interested in Joseph Stalin and Adolf Hitler and their totalitarian governments. Donald Trump was not a name on everyone’s lips. She was convinced that this period in history is when the “mob mentality” that later theorists latched upon came into the historical picture and “mass man” was born: Eric Hoffer’s “True Believer.” This was before political correctness, of course, when “man” was generic. The “elite” of whom she speaks comprises the educated and cultured individuals in those countries who should have known better — but who did not. There are subtle differences in the mentality of the two groups, but Arendt was convinced that they were both easily led astray.

“This difference between the elite and the mob notwithstanding, there is no doubt that the elite was pleased whenever the underworld frightened respectable society into accepting it on an equal footing. The members of the elite did not object at all to paying a price, the destruction of civilization, for the fun of seeing how those who had been excluded unjustly in the past forced their way into it. They were not particularly outraged at the monstrous forgeries in historiography of which the totalitarian regimes are guilty and which announce themselves clearly enough in totalitarian propaganda. They had convinced themselves that traditional historiography was a forgery in any case, since it had excluded the underprivileged and oppressed from the memory of mankind. Those who were rejected by their own time were usually forgotten by history, and the insult added to injury had troubled all sensitive consciences ever since faith in a hereafter where the last would be the first had disappeared. Injustices in the past as well as the present became intolerable when there was no longer any hope that the scales of justice eventually would be set right.”

And again,

“To this aversion of the intellectual elite for official historiography, to its conviction that history, which was a forgery anyway, might as well be the playground of crackpots, must be added the terrible, demoralizing fascination in the possibility that gigantic lies and monstrous falsehoods can eventually be established as unquestioned facts, that man may be free to change his own past at will, and that the difference between truth and falsehood may cease to be objective and become a mere matter of power and cleverness, of pressure and infinite repetition.”

Those who might question the notion of a historical parallel here might do well to reflect on the fact that postmodernism has literally “taken over” our college campuses. And “New History” is all the rage. The basic tenet of deconstructionism, which lies at the heart of postmodern thought, is that truth is a fiction — or, as the American philosopher Richard Rorty has said, truth is nothing more than “North Atlantic bourgeois liberalism.” His famous predecessor Jacques Derrida said, unblushingly, that truth is simply a “plurality of readings” of various “texts.” A great many of these intellectuals are convinced that history is a fiction that has for too long ignored the disenfranchised, and they are determined to right this wrong by rewriting the history books to stress the role of those who have been excluded by an elite white, male hegemony. And while the motive may be admirable, one must question the premise on which these folks operate, since this is coming from those whose job, traditionally, has been that of protectors and transmitters of civilized thought. Popular culture [and politicians] have simply latched on to the droppings of these intellectuals and reduced truth to subjectivity: truth is what you want to be the case; we do not discover it, we manufacture it. Say something often enough and loudly enough and it becomes true.

In the event that anyone should suggest that the rejection of objective truth is trivial, I present the following observation by Ms Arendt:

“The ideal subject of totalitarian rule is not the convinced Nazi or the convinced Communist, but people for whom the distinction between fact and fiction and the distinction between true and false no longer exist.”

Bearing in mind that totalitarianism need not be violent, this appears to be the direction in which we are headed. Or am I wrong in thinking that the signs of totalitarianism are increasingly clear? A small group of wealthy and powerful men — supported in their ivory towers by “elite” intellectuals who would never admit their allegiance to this group while they deny objective truth and busily rewrite history — are slowly but surely gaining control of the media. And by attacking the public school system, ignoring such things as global warming, eliminating regulatory agencies, approving numerous invasions of personal privacy, and picking stupid and malleable people to run for public office, they are increasingly able to make us think we are free when, in fact, we are simply doing their bidding. I wonder.

P.B.F.

The initials in my title stand for the words “Post-Birdie-F%$kup.” They were taught to me by a friend I play golf with, and they describe a pattern one finds in a great many sports — not just golf. But in golf they describe the tendency of players to score a birdie and then, on the next hole, to card a triple bogey. “Can’t stand success,” they say. It happens a great deal. In tennis, for example, I noted that many of the people I played with (never me, of course) double-faulted after an ace. Great shot, then PGGFFFFGH.

The saying goes: “Pride goeth before destruction; a haughty spirit before a fall.” This is usually shortened to “Pride Goeth Before a Fall,” which is a lazy way of saying the same thing. But whether we are talking about pride or a haughty spirit, we are referring to the tendency, which has apparently been around for a great many years, to blow a lead, choke in a crisis, get a big head, or get cocky after a good shot. Take your pick.

One of the aspects of this phenomenon is the tendency of highly rated players — say a top seed in a tennis tournament — to choke under the pressure (the air is thin at the top). When I coached tennis and took my players to the National Tournament in the spring after the regular season, I realized (years later) that the players I managed to get seeded never did well. The ones who did well, including three All-Americans, were always unseeded. They “flew beneath the radar.” If I had noticed it early on I would never have allowed my players to be seeded in the first place. It put undue pressure on them; they felt it and had difficulty making their bodies obey the commands of their minds. In a word, they choked. As all athletes know, it is easier to play when behind than when ahead — or favored to win.

Arthur Ashe once said that all athletes choke. The great ones learn how to play well even under the pressure. This is what separates the great athletes in every sport from the average to good ones: they handle the pressure better. This would include people like Tiger Woods in his prime, Jack Nicklaus, Chris Evert in her prime, Rod Laver, Roger Federer, and teams such as the Chicago Bulls with Michael Jordan, the 1950s Yankees, the current Golden State Warriors, and other teams and players noted for their winning ways — regardless of the pressure. No P.B.F. for them, though even the great ones have problems at times.

Dustin Johnson, the golfer recently named the #1 player in the P.G.A., is an example. In a recent W.G.C. match-play tournament he built a large 5-hole lead in the final match, then saw it wither away and had to hold on to squeak out a win by one hole. Even the great ones feel the pressure.

So what do we learn from this — those of us who aren’t involved in athletics at the higher levels? We learn that it is best to remain silent and fight the tendency to get smug when things go well, for fear that it “will come back to bite us.” A president, for example, who is convinced that his personal prestige and bullying tactics are sufficient to move a bill through Congress may discover that his smug attitude is the very thing that turns those Congressmen against him, and he may lose the fight. P.B.F.

Beware the bug that comes back to bite you. Beware of P.B.F. It can strike anywhere and at any time!

Puzzled about the ACA – take this quick quiz

This man is one of the few among us who truly understands the ramifications of the Affordable Care Act, which has been the target of the far Right for so many years and which is helping so many people. It’s worth a read.

musingsofanoldfart

Now that the AHCA effort by the President and Republican majority has fizzled, it would be appropriate to step away from the rhetoric and ask a few questions about the Affordable Care Act (ACA). I would also suggest you may not want to listen only to politicians on this as I have learned the health care awareness of politicians is not as high as we need it to be and some are more interested in optics than impact.

The questions and answers have been provided by a retired benefits actuary, consultant and manager for a Fortune 500 company.

Question 1: The ACA is: (a) undergoing a death spiral, (b) a disaster and will implode, (c) doing well in a number of places, but needs help in a few others.

Question 2: The reasons for rising costs under the ACA are: (a) adverse selection where more bad risks are signing up than good…


Other Cultures

I have been rereading Yasunari Kawabata’s Beauty and Sadness and came across the following description, which made me think. It comes early in the story, as a middle-aged man, Oki Toshio, sits by the window reflecting on his first love, from whom he separated twenty years earlier:

“He looked out of the small French window of his study. At the base of the hill behind the house a high mound of earth, dug out during the war in making an air raid shelter, was already hidden by weeds so modest one barely noticed them. Among the weeds bloomed a mass of flowers the color of lapis lazuli. The flowers too were extremely small, but they were a bright, strong blue. Except for the sweet daphne, these flowers bloomed earlier than any in their garden. And they stayed in bloom a long time. Whatever they were, they could hardly be familiar harbingers of spring, but they were so close to the window that he often thought he would like to take one in his hand and study it. He had never yet gone to pick one, but that only seemed to increase his love for these tiny lapis-blue flowers.”

This passage, like so many in this novel, reflects the main theme of beauty and sadness. The description of the beautiful flowers almost hides the reference to the air-raid shelter that harkens back to the Second World War and makes the reader recall the terrible effects of the firebombings that destroyed an estimated 40% of the 64 major cities in Japan toward the end of the war, coupled with the dropping of the atomic bombs that killed another 129,000 men, women, and children. The end of the war was followed by a seven-year Allied occupation by 300,000 men that brought about the Westernization of Japan, with its sports, music, movies, clothing, fast-food restaurants, and love of money. The older Japanese, like Kawabata himself, struggled with the loss of pride coupled with the transmogrification of their culture from the old ways to the faster, more frenetic new ways. His novels are filled with references to this struggle within himself and in the hearts of his countrymen.

But what struck me powerfully was the fact that we can read passages like this in a novel written by a man in another culture and “relate” to it, because we share a common humanity. We have lost sight of this fact in our preoccupation with the differences among cultures stressed by anthropologists and social scientists like Margaret Mead, who started the movement toward cultural relativism that led us, wrongly I insist, to the conclusion that we are not in a position to judge what folks do in other cultures. From the undeniable truth that we can never fully understand what people in other cultures feel and think, we draw the unwarranted conclusion that we cannot sympathize with them at all. But this flies in the face of the human sympathy that the moral sense theorists in the eighteenth century brought to our attention, which allows each of us to sympathize with other human beings, all other human beings. In stressing difference we have lost sight of our fundamental similarities.

We can read passages like that above, read their poetry, hear their music, watch their dances, view their art, and feel many of the same things those people feel — not all, but many to be sure. We are not all that different. And, as a result, when we read about suttee in India, or the stoning of adulteresses in the Middle East, or clitoridectomies forced upon young women in Africa, or the denial of fundamental rights to women around the world, we can judge these things to be wrong because we do know better. Values are relative to cultures up to a point, but that point is reached when a violation of fundamental human rights is in question. We know this because we feel it deeply and because our reasoning capacity tells us that if it were us we would not stand for it.

In a word, there is such a thing as “human nature,” and it is something we share with the world at large and which, even though many of those in power and those who possess great wealth seem to have denied it, defines all of us as human. But why is this discussion significant? Or even of interest? I can do no better than end with a quote by one of the finest minds I have ever encountered, Hannah Arendt, who writes this in her book The Origins of Totalitarianism:

“If the idea of humanity, of which the most conclusive symbol is the common origin of the human species, is no longer valid, then nothing is more plausible than a theory according to which brown, yellow, or black races are descended from some other species of apes than the white race, and all together they are predestined by nature to war against each other until they have disappeared from the face of the earth.”

Religious Americans?

In reading books by Gertrude Himmelfarb, whom I have cited on numerous occasions in these posts, I delight in the fact that she and I agree so much with one another. This, of course, leads me to conclude that she is a brilliant woman, since brilliance is defined as “in agreement with oneself.” In any event, we do agree about so much and I have learned a great deal in reading her books. She insists on one point, however, that strikes me as simply mistaken and I decided to write this post pointing out just where I think she went wrong.

Himmelfarb insists that America is the most religious nation on earth — or certainly in the West, at any rate. She cites in support de Tocqueville, who, when travelling in America in the nineteenth century, was struck by the religiosity of so many Americans. Indeed, he was convinced that the American Republic rested on religious faith. As he said:

“Religion is the first of [America’s] political institutions because it was the prerequisite of both freedom and morality — and thus of republican government itself. . . . [Freedom] considers religion as the safeguard of mores; and mores as the guarantee of laws and the pledge of its own duration. . . . At the same time that the law allows the American people to do everything, religion prevents them from conceiving everything and forbids them to dare everything.”

The problem is, of course, that de Tocqueville visited America in 1831 for nine months, and while his book was extraordinary — and still is — it may not be totally adequate to describe the state of things in this country today. But, more to the point, de Tocqueville and Himmelfarb both neglect to define what they mean by “religion,” and this causes problems. Himmelfarb seems to mean by the word simply church and synagogue attendance, which is higher in this country than it is in many European countries, especially France. As it happens, though, fewer than 40% of us report that we attend church regularly – and critics insist that this figure is inflated. In fact, church attendance among the young has lately fallen off drastically, and a growing share of the “millennial” generation – born after 1980 – claim no church affiliation whatever. But, regardless of these figures, church attendance does not determine religiosity, especially in the age of mega-churches that serve our favorite latte and provide us with television sets on site to fill our empty minutes when we are not browsing in the bookstore for souvenirs. Indeed, many churches are nothing more or less than social clubs where folks go to meet and greet one another for an hour or so of a Sunday in order to make themselves feel good about themselves.

But it behooves me to define what I mean by “religion.” When I was a freshman in college back in the dark ages I wrote a seminar paper on Lucretius’ De Rerum Natura as a religious work. The first question out of my seminar leader when I sat down to defend the paper was “What is religion?” I looked aghast. I gaped; I was stunned. I thought everyone knew what religion is! So I struggled and tried to bluff my way through, which did not serve me well. Accordingly, I now seek to make amends for past failures and will define religion as a set of beliefs based on the conviction that there is something in the universe greater than the self and that we owe that entity respect and reverence, even devotion. Those who are indeed religious center their lives around the worship of this entity and find meaning in their lives by devoting themselves to something greater than themselves.

Contrast today’s notion of what it means to be “religious” with the medieval world in Europe, in which church was the center of most people’s lives, with daily attendance (sometimes twice daily), prayers in the evenings, and total dedication to making one’s life on this earth a preparation for the next one. In that regard, I do think Lucretius’ book was religious, and his “entity” was Nature, which he sought to love and respect and, as far as possible, become one with. In doing so, as an Epicurean, he was convinced that, with discipline and determination, we could become one with something greater than ourselves and find peace in a chaotic world. For the truly religious, there is profound mystery in the world and it gives meaning to their lives.

In that regard, there do not seem to me to be many religious Americans. The data suggest that the traditional churches are closing their doors or seeking to conform to the pattern of the non-denominational churches that focus on fellowship and good feeling, demanding as little as possible from the parishioners and continually reassuring them that they are loved and are among the happiest and luckiest people on this earth. In a word, those churches that do manage to fill their pews do not demand “respect and reverence” for the God they profess to worship. Certainly not sacrifice. Parishioners, for the most part, do not center their lives around the church and its teachings. Indeed, the churches demand very little of their worshippers at all. They seek, rather, to make things as easy as possible for the congregation so they will continue to attend and help pay for the new roof.

I exaggerate, of course, but I seek to make a serious point: the claim that Himmelfarb makes about the supposed religiosity of the American people rests on flimsy evidence and flies in the face of the fact that so many “religious” people in this country have tended to resort quickly to violence, elect self-absorbed morons to political offices, and are caught up in the self-as-God movement which places the focus of their lives on themselves and not on something greater than themselves “out there” in the world. I conclude therefore that Himmelfarb was mistaken — at least on this topic.

Dum and Dee

In Through the Looking Glass by Lewis Carroll we find the indomitable twins Tweedledee and Tweedledum. I have noted their presence before in an earlier blog post, but I have another point to make about this pair in what is usually regarded as a child’s story — unless you count the horror story made about Wonderland by Hollywood (which I don’t). I also discount Walt Disney’s abortive attempt to capture the magic of Carroll’s masterpiece. One really has to read the original and delight in the remarkable drawings by John Tenniel.

In any event, Alice comes across the pair in a thick woods and they stand arm-in-arm looking sideways at Alice as she struggles to tell them apart. They form what geometers call “enantiomorphs,” mirror-image forms of each other. This is appropriate since they are encountered in looking-glass world where everything is a bit backwards and “it takes all the running you can do to keep in the same place. If you want to get somewhere else, you must run twice as fast as that.” — as the Red Queen explains to Alice earlier in the story. (Reminds us all of the condition of politics in this country at the moment  — or is that the caucus race?)

Alice With the Twins

But the Tweedle twins also pose another philosophical conundrum, which may have been in the back of Carroll’s mind. Logicians call it the “identity of indiscernibles.” The two are identical and differ only with respect to place: one is to the left of the other. This suggests that unless we regard position in space as part of a thing’s essence, the two twins would not be two but only one. If the only respect in which they differ is that one is to the left of the other — and vice versa — then, if they are identical in every other respect, they are one and the same person! But this seems absurd. We must therefore conclude that a thing’s position in space is essential to its being what it is. But that is also absurd. It would mean that every time you move you would become someone else — because your spatial location (which we have assumed is part of who you are) has changed. In fact, no one would ever be the same person in one place that she is in another. This paradox is embraced by such thinkers as Hegel, but most philosophers refuse to regard spatial location as in any way a defining characteristic of who they are. It is a mere accident.
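The principle at issue, usually credited to Leibniz, can be stated compactly in second-order logic. This is my own minimal sketch of the standard formulation, not anything Carroll wrote:

```latex
% Identity of indiscernibles: if x and y share every property F,
% then x and y are one and the same thing.
\forall F\,\bigl(F(x) \leftrightarrow F(y)\bigr) \rightarrow x = y

% Its uncontroversial converse, the indiscernibility of identicals
% (often called Leibniz's Law): identical things share every property.
x = y \rightarrow \forall F\,\bigl(F(x) \leftrightarrow F(y)\bigr)
```

The twins’ puzzle turns on whether spatial predicates such as “stands to the left of the other” count as genuine properties F. If they are admitted, the twins are discernible and safely two; if they are excluded as mere accidents, the first formula forces Dee and Dum to be one and the same.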

Thus Tweedledee and Tweedledum are the same person. Or so it would seem.

The delightful thing about Carroll’s tale is not only that it is a chess game (which it is) but it is also filled with logical puzzles like this one and we are reminded on nearly every page that the man was a mathematician with the most remarkable imagination. What a delight!

Under Attack

I often wonder how many people outside the Academy realize (or care?) how severe the attack on Western Civilization is within the Academy, as students and faculty on a growing number of campuses across the country have determined that Western Civilization is the source of most of the world’s problems today. Indeed, I wonder how many people within the Academy are aware of the seriousness of the problem.

In a recent acceptance speech at the American Council of Trustees and Alumni annual banquet, one of the recipients of their “Philip Merrill Award for Outstanding Contributions to Liberal Arts Education,” Ms Ayaan Hirsi Ali, a Fellow at the John F. Kennedy School of Government at Harvard, paints a bleak picture indeed. She cites a battle at Stanford University in 2016 in which a group of students sought to reinstate a course requirement in “Western Civilization” that had been dropped 25 years earlier. The attempt was overwhelmingly rejected by the student body.

“In the run-up to the vote, one Stanford student [a young woman in this case] wrote in the Stanford Daily that ‘a Western Civ requirement would necessitate that our education be centered on upholding white supremacy, capitalism, and colonialism, and all other oppressive systems that flow from Western civilizations.'”

The ignorance of this student’s comment beggars belief and, sad to say, it is a view that is shared by what many think is the majority of students (and faculty) on today’s campuses. Let’s take a look at this comment.

To begin with, one course requirement would not result in an education “centered” on Western Civilization. This is what logicians call a “straw man,” and it is a fallacy. The young lady would know this if she knew more about Western Civilization, since logic was first formalized by Aristotle and later refined by the Scholastics during the Middle Ages. In any event, even if the course were required, it would not comprise the whole of a student’s study for his or her four years. Moreover, there is no reason to believe that there could not also be a requirement in “Eastern Civilization” as well. But, more to the point, the comment ignores the benefits of Western Civilization that this student has chosen to overlook — if, indeed, she was aware of them. I speak of such things as women’s equality, the abolition of slavery, individual freedom, religious tolerance, and freedom of expression (which makes possible ignorant comments like that of the student in question). As Ms Ali points out:

“One cannot dismiss the sum total of Western Civilization without losing one’s moral compass. And one cannot participate meaningfully in the battle of ideas raging in the world today while dismissing the value of Western Civilization as a whole.”

There is much to note and regret about the baggage brought along by those who struggled to create what we call “Western Civilization” — and here we would have to acknowledge the half-truth hidden in the rhetoric of the Stanford student — but we must insist upon a wider perspective and note the extraordinary beauty in Western art, the intellectual triumphs, and the moral gains (as noted above) that form the warp and woof of Western Civilization. Perspective, when speaking of such a large issue, is essential. And this student has lost hers entirely (if she ever had it to begin with). To take an obvious example, capitalism, for all its faults, has made it possible for this particular student to attend one of the most prestigious universities in the world. She bites the hand that feeds her.

As one who has read, taught, and defended the Great Books of the Western World, I have an obvious bias against this sort of blanket condemnation. But even if this were not the case, the intolerance built into the ignorant comment by this student would be disquieting. After all, college is a place where one broadens one’s mind, not shrinks it — ideally. And the comment reflects the growing attitude on many college campuses across the country that results in the exclusion of certain “types” of speakers from appearing on campus because they represent views that are regarded as unacceptable. This includes Ms Ali herself, who was denied access to Brandeis University by militant students and faculty after initially being invited to speak about the crisis within Islam and receive an honorary degree. It is an attitude that has also resulted in the prohibition against saying certain words or thinking certain thoughts, an attitude that reflects a fascist approach to education — if this is not, in fact, a contradiction in terms. The “battle of ideas” requires that we keep an open mind.

My concerns are obvious to anyone who has read any of my blogs. But I do not think they are misplaced or even exaggerated. Higher education is not supposed to be a place where students necessarily learn certain things, but one where they learn to use their minds to determine which things are worth knowing and which are not. And a blanket condemnation of the whole of “Western Civilization” by a group of students at Stanford University who, we may assume, know little or nothing about that which they reject, is nothing short of presumptuous, if not arrogant. And the fact that the faculty at Stanford did not take the lead in determining which courses were to be required in the first place is also to be regretted, though it is not surprising in an age in which students and children are mistaken for those who should lead rather than follow. And here we have a graphic example of why they should not be allowed to lead.

Am I Dreaming?

Lewis Carroll’s classics Alice In Wonderland and Through the Looking Glass focus on a perennial philosophical question first propounded by Bishop George Berkeley in the eighteenth century: the things we take to be real, material, and substantial are merely intangible “sorts of things” in the mind of God. We do not know what is real and what is merely apparent. Further, we cannot say at any given moment whether we are awake or dreaming, because there is no reliable criterion that enables us to distinguish the two states from one another.

Bishop George Berkeley
(Courtesy of Wikipedia)

In a conversation Alice is having with Tweedledee and Tweedledum in Looking-Glass Land we hear the following exchange, which follows their discovery of the Red King sleeping under a nearby tree:

“I’m afraid he will catch cold with lying in the damp grass,” said Alice, who was a very thoughtful little girl.

“He’s dreaming now,” said Tweedledee: “and what do you think he is dreaming about?”

Alice said, “Nobody can guess that.”

“Why, about you!” Tweedledee exclaimed, clapping his hands triumphantly. “And if he left off dreaming about you, where do you suppose you would be?”

“Where I am now, of course,” said Alice.

“Not you!” Tweedledee retorted contemptuously. “You’d be nowhere. Why, you’re only a sort of thing in his dream!”

“If that there King was to wake,” added Tweedledum, “you’d go out — bang! — just like a candle!”

“I shouldn’t!” Alice exclaimed indignantly. “Besides, if I’m only a sort of thing in his dream, what are you, I should like to know?”

“Ditto,” said Tweedledum.

“Ditto, ditto,” said Tweedledee.

Both of Carroll’s tales have a surreal quality, and throughout Alice is constantly wondering if she is awake or just dreaming. This generates the pithy problem: how do we determine that we are awake? Berkeley was convinced we could not — that is, we cannot say just why it is that we know we are awake at any given moment and not dreaming. We may have strong feelings. Common sense insists that we are awake and not dreaming when we ask the very question. But the problem is HOW do we know this? We cannot distinguish dreams from reality with any certainty. And this is because any claim to knowledge must produce the criteria that make the claim knowledge and not a pretender.

If I claim that this computer before me is real, I can say I know it because I can see it and touch it. But how do I know I am really seeing it and touching it and not just dreaming that I am? As you can see, it’s a tough one! No one has answered Berkeley satisfactorily in the many years since he proposed the paradox, and it is still out there. David Hume suggested that reality has greater “force and vivacity,” but this won’t work, because many people have very vivid dreams and for many people reality is a blur — especially if they are prone to the occasional tipple. So Lewis Carroll is having great fun with it in his Alice stories. Children’s stories, eh? I don’t think so!

Carroll later wrestled with the problem in his book Sylvie and Bruno, in which the narrator shuttles back and forth mysteriously between real and dream worlds.

“So, either I’ve been dreaming about Sylvie,” he says to himself in the novel, “and this is not reality. Or else I’ve really been with Sylvie and this is a dream! Is life a dream, I wonder?”

If it is, perhaps we will all wake up soon and discover that this is so and breathe a sigh of relief. Otherwise this dream is a nightmare.

 

Inverted Consciousness

I wrote this post several years ago in an attempt to understand why folks seem to have become lost in the world inside their own heads. It is a topic I wrote a book about some years ago and one that continues to intrigue me. With a few adjustments to the earlier version, here is my attempt to understand the condition known medically as “Asperger’s Syndrome.”

“The Big Bang Theory’s” Sheldon Cooper has it. The BBC’s Sherlock Holmes and ITV’s Doc Martin have it as well. It’s all the rage these days. It’s called “Asperger’s Syndrome,” and it is defined as follows:

“a developmental disorder resembling autism that is characterized by impaired social interaction, by restricted and repetitive behaviors and activities, and by normal language and cognitive development —called also Asperger’s disorder.”

Actually, “language and cognitive development” is often exceptional. But these folks have to be taught how to interact with others, because they are not fully aware of the other’s presence — except insofar as that other person accommodates or interferes with their own desires. They seem emotionally stunted, lacking a sense of humor and any instinctive reaction to other people’s feelings and the subtle nuances of human behavior.
I wrote a book about the phenomenon years ago, before I had ever heard the word. I called it “inverted consciousness” and argued that it is a widespread cultural phenomenon resulting from a fixation on the part of the subject with his or her own experience, an inability to see beyond that experience. For this person “the” world is “my” world. Paintings and music are beautiful or ugly because the subject likes or dislikes them; behavior is right or wrong because it pleases the individual or fails to do so; all opinions are of equal merit — there is no such thing as truth or even expertise. I maintained that there are degrees of this disorder, from the extreme inversion of what I now know is Asperger’s down to the occasional or intermittent inversion.

It is usually found in men, though I know of a woman or two who have it. My sense is that women are, as a rule, more empathetic and compassionate than men, and those qualities do not live comfortably alongside a condition that blinds a person to the fact that there are others in their world — except insofar as those others serve their own purposes. That sounds sexist, but I still think there are important differences between men and women, and in this case women are being complimented: Asperger’s is very unattractive. However, I apologize in advance to any readers who find this differentiation offensive!
As I say, I do regard the condition as widespread in our culture, and I took my cue from Ortega y Gasset, who noted the symptoms in Europe in the 1930s and wrote about them in describing Mass Man in his classic The Revolt of the Masses. Defining “barbarism” simply as “the failure to take others into account,” Ortega was convinced that Europe was then on the brink of a new barbarism, an age in which people would become more and more removed from one another and “hermetically sealed” within themselves. World War II soon followed, interestingly enough.
Describing this type of person, Ortega said at the time,

“The innate hermetism of his soul is an obstacle to the necessary condition for the discovery of his insufficiency, namely: a comparison of himself with other beings. To compare himself would mean to go outside of himself for a moment and transfer himself to his neighbor.”

But he is incapable of that.
I am not sure what causes this phenomenon, but it does appear to be more and more prevalent. It seems apparent that our current President suffers from the condition to a high degree as well. I suppose our increasingly crowded living conditions, together with the almost constant bombardment of images and sounds around us, tend to turn our attention inward. In addition, the countless electronic toys that seem designed to discourage human interaction must also be considered. I recall vividly the photograph of a group of teenagers sitting in front of Rembrandt’s “Night Watch,” texting — totally unaware of the extraordinary painting nearby. Note how all of these devices turn individuals’ attention away from what is around them (he said, sitting alone at his computer).
In any event, I thought what Ortega had to say was a powerful message when I first read it, and I find it even more so today. If we are, indeed, “from birth deficient in the faculty of giving attention to what is outside [ourselves], be it fact or persons,” this is something we need to ponder seriously, since it suggests we are becoming increasingly isolated from one another — like Sheldon. And Sherlock. And Doc Martin — who are all funny up to a point, but also pathetic.

Photography As Art

Whether or not photography can be regarded as art is a very tough question. I have never addressed it myself. But I recently picked up a follower who does beautiful photography, and the question forced itself upon me: when does a photograph become a work of art?

To address this question, I will begin with Monroe Beardsley’s definition of the artwork, which he proposed back in the 1980s in the last essay he wrote on the subject, just before he died. Beardsley told us then:

“An artwork is something produced with the intention of giving it the capacity to satisfy the aesthetic interest.”

Beardsley chose his words carefully. He stresses the “capacity” of the work to “satisfy the aesthetic interest.” The object may not, in fact, generate any response whatever. But the notion of aesthetic interest is particularly important. It contrasts with the sorts of interest we take from day to day in ordinary objects, interest that generates feelings of sentiment, fear, anger, lust, or whatever. The aesthetic response, when it occurs, is distinctive. It results from attention that is focused entirely on the object itself. Eliseo Vivas called this “rapt, intransitive attention” to the object. The object holds our attention to itself and does not let our minds or feelings wander off into memories, associations, or irrelevancies. When we look at a Norman Rockwell painting, in contrast, it conjures up all sorts of fond memories of past Thanksgivings, childhood pains, family gatherings, Boy Scouts, Girl Scouts — complete with a small dog at our feet. These are not aesthetic responses, and Rockwell characterized himself as an “illustrator,” not an artist. He knew whereof he spoke.

An artist can produce an object, a painting, sculpture, dance, musical composition, poem, or even a piece of driftwood that demands of us a “rapt, intransitive” response. We behold the object and we become lost in it. Our minds do not wander and strong feelings do not obtrude. We simply feel at one with the object. It approaches the religious experience described by mystics.

Now, Beardsley doesn’t say the object always yields such a response. He says, rather, that it is the artist’s “intention” that it have the “capacity” to satisfy the aesthetic interest. He wants us to focus on the artist’s intention — to the extent that we can figure out what that was (and assuming that the artist herself knew what it was at the time). We look at the object and we see forms, shapes, colors, and relationships that announce the presence of beauty. My new blogger friend is a photographer, and her blog contains a number of her photographs that are clearly works of art. They are beautiful. They speak for themselves and need no explanation. In fact, any attempt at explanation is doomed to fail, because explanations involve discursive language, whereas the language of art is immediate, intuitive, and instinctive. Works of art do not seek to evoke nostalgia, memories of long walks in the woods, or lost moments in childhood. None of these things is present as we simply look and our interest is absorbed by the photograph itself. That distinctive, decidedly aesthetic, response is the only one that seems appropriate.

I do think there are photographs that rise to the level of art. In days long gone, the photographer was able to control the finished product as it developed by altering the time spent taking and developing the photograph, changing chemicals, and so on. In cases such as the photographs of Ansel Adams, for example, the results (even in black and white) were clearly works of art.

Today, with digital cameras — and iPhones! — the artist requires an eye for composition, color, shadow, subject matter, and the subtleties of form. One can simply point and shoot with a PhD camera (“push here, dummy”) and relish the shot of one’s self (!) or friends, or of the lovely spot where we saw the eagle soaring in the deep blue sky. But only the artist is able to capture the moment when all the pieces fit together and the finished product speaks for itself.

It can happen by accident, of course. On a recent trip to the North Shore with my wife’s niece and her brother, for example, our niece took a photograph of her brother sitting on some rocks at the shore of Lake Superior in the evening, with the moon shining on the lake. It was a work of art. The photographer was able to capture just the right moment, when things were aligned and the finished product demands our complete attention. It is a truly beautiful shot.

The Shores of Lake Superior at Dusk

Yes, photography can become art. But it usually falls short, because not all of us see with the artist’s eye, and it is difficult even for an artist to capture the precise moment when all things come together; and, of course, there is never any guarantee that those who view the photograph will have an aesthetic response. In a world flooded with images, sounds, and diversions, there are rare moments when we are willing to take the time to just look and appreciate, to allow ourselves to get lost in the picture. It takes imagination, time, patience, experience, and sensitivity. And these things are becoming less and less common. But artists are still among us, and they paint, they sculpt, they dance, they play. And they take photographs.

Thank goodness!