Revisiting “E-Literacy.”

As much as I hate to admit it, there are some who would disagree with my take on the sad state of affairs in the world of American education. Indeed, there are a great many people — some of whom write books and many others who teach in that world — who insist that things couldn’t be better. They love the kids and they love the way things are going. They explain away the wealth of data that show that the kids are not learning anything with the claim that the tests are simply archaic and don’t register the intellectual skills the kids in the millennial generation are acquiring with their electronic toys. Indeed, many of them think the schools themselves are archaic and the kids are learning what they really need to know to get along in tomorrow’s world OUTSIDE of school, with those toys. While there are those of us who would insist that the toys are rotting the kids’ brains (as I have said in an earlier blog), there are a great many people who defend the toys and insist that the kids will save the world with the digital facility and e-literacy they are acquiring with those very toys.

In fact, in 2005 Randy Bomer of the National Council of Teachers of English (!) attacked as too narrow a study called the American Diploma Project, which was designed to help shape curricula that would better prepare young people for work in a changing world. Bomer defended the use of electronic toys and applauded the proficiency with which the kids use them, insisting that their critics are out-of-the-loop idiots. He remarked that today’s high school graduate (who may not be a-literate, as they say) is e-literate: he or she

“can synthesize information from multiple information and technical sources. . . .[they can] analyze the setting, plot, theme, characterization, and narration of classic and contemporary short stories and novels. . . .They are inventing new forms of literature.”

High praise indeed. And a breath of fresh air for those who find the constant criticism of America’s schools unsettling. We always like to hear those things that make us feel better about the way things are and allow us to dismiss the nay-sayers with a sigh of relief. Unfortunately, however, it’s a pile of rubbish.

One must wonder what this new “viewer literacy” really amounts to — if it can be called “literacy” at all. And the claims Bomer makes are outlandish, given that every test devised (and one must agree that tests don’t always tell the whole story) reflects the inability of these young people to understand the printed word or work with figures. How can such people be said to be able to “analyze the setting, plot, theme, characterization, and narration of classic and contemporary short stories and novels”? Especially when they don’t even read comics or cereal boxes — as the students themselves defiantly tell investigators (reading is “too analogue”). They take great pride in the fact that they don’t read and generally regard reading as a waste of time — though they will spend more than three hours a day, on average, watching television (while they send text messages and check their Facebook pages) and never think for a moment that it is a waste of time.

But, in the end it is all about thinking, which requires both synthesis and analysis. Mark Bauerlein, author of The Dumbest Generation, has made a study of e-literacy. He quotes his critics who defend it on the grounds that e-literacy

“is not just knowing how to download music, program an iPod, create a virtual profile, and comment on a blog. It’s a general deployment capacity, a particular mental flexibility. E-literacy accommodates hypermedia because e-literates possess hyperalertness. Multitasking entails a special cognitive attitude toward the world, not the orientation that enables slow concentration on one thing, but a lightsome, itinerant awareness of numerous and dissimilar inputs.”

So say its defenders who go on to insist that

“The things that have traditionally been done — you know, reflection and thinking and all that stuff — are in some ways too slow for the future. . . .Is there a way to do these things faster?”

But, jargon and wishful thinking aside, thought does take time, much as we might hate to admit it. And faster is not necessarily better. The fact that the kids show remarkable dexterity and quickness with their toys — one claim is that they can read four books at once (!) — is praiseworthy on some level. But when we are told that this dexterity will (or should) replace the traditional way of knowing and thinking about the world, we must pause. The kids feel out of place in schoolrooms. I get that. But we know enough about them to realize that this is a statement about their narcissism, not about the schools. Many would consider it a condition that needs to be addressed and remedied so these kids can make their way in the real world, where things are not always to our liking and where problems must be thought through and solutions found by careful, and slow, reflection and the consideration of possible outcomes in dialogue with other thinkers. If computers can help speed up that process, perhaps this is a good thing. Defenders of video games contend that they encourage “collateral learning” and teach kids how to “make the right decision” quickly. But there is no hard evidence that these toys teach anything that can in all seriousness be called “thinking.”

In the end a human being, or a group of human beings, must carefully consider what the computer spews out and determine which of several alternatives is the best course of action. Whether games will help people acquire the necessary skills remains to be seen. The “right decision” taught by the electronic game may simply prove to be the one that directs the drones to kill the most people. But the kids themselves will become adults who are expected to play a role in this democracy. Electronic toys cannot make moral judgments or judge which of two or three political candidates will do the best job. E-literacy won’t get them there. A-literacy is required: the ability to read and understand what they read, to write coherent sentences that can be readily understood by others, and to speak persuasively in order to help others grasp the claims they are determined to make. And people need to judge between better and worse, whether they like to admit it or not.

In the end, we may well admire the skills these kids show with multiple electronic toys, and even their ability to learn new ways to do things that take their elders seemingly forever. But we should hesitate to admit that this way of doing things will prove superior at the end of the day — especially since we really don’t know where e-literacy will take us. And as a general rule, we should not allow the kids to tell us how to design educational curricula: they have no idea where they are going, and their toys may indeed be taking them down an intellectual blind alley. In any event, given the addiction that has already been attributed to so many of them, we will have to depend on the toys themselves to pave the way to a new tomorrow: the kids will simply be doing what their toys tell them to do. I prefer to take the path well-travelled. At least I have a pretty good idea where the traps and pitfalls might be found, and I can use the wisdom of past generations as a guide.

Improved Learning

I repost once again because I have begun to think that some of my best thoughts have already been set forth and there are a few new readers who might find them worth pondering.

There must be many school administrators who have too much time on their hands. They keep trying to come up with new ways to teach and learn, forgetting that the best way to do that is to get the brightest teachers you can by paying them a decent salary and then turning them loose in the classrooms. Instead, they have fallen hook, line, and sinker for the electronic toys that have been incorporated into schools at nearly every level. This is part of the common educational practice of bringing the subject matter down to the level of the student rather than having the student stretch and grow to reach a higher level. “Dumbing down the curriculum,” as it is called. Give ’em what they want. The kids play with electronic toys, so let’s incorporate them into the curriculum. Somehow. The latest educational fad in “higher” education is to make learning even easier and less painful: let the students stay at home, where they can sit in front of a computer screen as passive vessels instead of in a classroom where they might accidentally interact with each other or, worse yet, the instructor.

I’m with Albert here: led by a purblind educational bureaucracy we are rapidly turning out idiots who cannot interact with one another and cannot use their minds except to turn things on and off. Socrates was never “certified” to teach, and he didn’t use the latest electronic gadget, either. Plato’s Academy also did rather well without the latest electronic toy, thank you very much. After all, Plato was able to turn out people like Aristotle without a huge cadre of administrators looking over his shoulder, a committee of well-meaning board members to answer to, or a single computer.

Our addiction to electronic toys has seriously inhibited human interaction: we see people walking down the street holding electronic devices to their ears or looking down at the devices they are sending text messages from. They don’t talk to one another any more; they talk at one another — in broken English. As suggested above, the latest fad in higher education is the trend toward on-line learning, which is simply another way to guarantee that students will learn very little. I dare say it will soon catch on at the high school level as well.

However, studies have shown repeatedly that the lecture method — in the classroom or on-line — is the worst way to teach a subject for most students. In addition, the drop-out rates in on-line education are off the charts. Real learning takes place when people interact with one another. On-line lecturing is simply multiplying the lecture-system mistake by making it easier and faster — and cheaper. And there’s the rub. Education has become so costly that students are turning to on-line “universities” like The University of Phoenix, and the other colleges and universities realize they must either join the party or sit by as their high-paid faculty lecture to empty halls. It’s sink or swim. We are now told that a group of so-called “prestige” universities wants to join the fray:

Now 30 Under 30 alum 2U, which has previously focused on online graduate degree programs, has decided to throw its hat into the ring. This week, the company, formerly known as 2tor, announced a partnership with a consortium of 10 universities to offer undergraduate courses online. The company’s new program, Semester Online, will launch in September 2013 with a catalog of about 30 courses offered by Brandeis, Duke, Emory, Northwestern, University of North Carolina, Notre Dame, University of Rochester, Vanderbilt, Wake Forest, and Washington University in St. Louis.

Whatever the reason, we insist on embracing the latest fashion even when the evidence shows that it not only fails to deliver the goods but actually inhibits the results — teaching and learning in this case. Electronic gadgets do not enhance learning; as Jane Healy has shown, they actually inhibit it. Their use has left parts of the brain unable to function as they should, thereby making thought and coherent speech more and more difficult for growing numbers of young people. Instead of embracing the latest fad, we might be better advised simply to reflect on the goal of education, which is to enable young minds to grow and develop. We need to stop worrying about what is latest, or cheapest and easiest, and start to recall what is most effective: a good teacher in a room interacting with interested and curious students. Preferably they should sit in a circle or, better yet, around a table.

Human Interaction

A former colleague of mine has written a book of science fiction suggesting what our great-grandchildren might look forward to in 2092. It promises to be the first in a series about the world far into the future, after a world war has destroyed pretty much all of what we call “civilization.” All that is left, besides vast wasteland, is a few communities, cantonments, and the remnants of some of the major cities, along with a wealth of technology that has allowed for regular trips to the moon and even a Mars colony. The major player in the game is the Mars Corporation, which pretty much runs the world after the nations spent themselves and were left powerless.

The book is a clever indictment and brilliant satire of the corporate world, its depersonalization and relentless thirst for power and wealth — the only real values left besides the urge simply to maintain one’s own life, ascend the corporate ladder, and gain a bit more power for oneself. It raises, among other things, the question of whether the world would indeed be better off with one central power, even a corrupt one, keeping all other powers at bay — if the price is human freedom. After all, where has that freedom brought us?

I have only begun the first volume and cannot comment on the whole book (much less the entire saga), but an insight that I found most thought-provoking was the author’s claim that the disintegration of civilization that led to the world war and its terrible aftermath began with the dehumanization resulting from the technical world: the replacement of human relationships by electronic toys and social media.

The dissident Miller lives outside of the cities (such as they are) and spends his days, among other things, tending to his garden. He tries to explain what happened to the world to his daughter who is loyal to Marsco:

“‘I don’t know how to explain it, Tess, except that the techworld that evolved, that world seemed to make this world unnecessary.’ He placed his bare hand gently on her forearm. . . .’Human touch became unnecessary for some — to enough. Cyberspace became authenticity for far too many.'”

Such is happening all around us at present: people find themselves ignoring one another more and more as human relationships degenerate and the electronic toys and the desire to be “liked” on social media replace genuine contact with other human beings. Disappearing are such things as feelings of love and respect, fear and embarrassment — you know, those things that make us human, for better or worse.

I think this is a profound insight on my colleague’s part, as I have for years shouted warnings myself (on these pages) about the dangers of those electronic toys. The evidence is overwhelming that they leave parts of the human brain undeveloped (the thinking parts) and that they are addictive. Those shouts fall on deaf ears, of course, because, in fact, the toys are addictive: it is not clear that, even if they wanted to, folks could put them down for even a day or two, look around them, and interact with others and with the world itself, which offers us so much joy and delight. Our author is convinced that we are paying a severe price: the loss of our basic humanity.

Novels have a way of making a point so much more effectively than the sort of prose I write, but this novel has been self-published and lacks the promotional punch that could be provided by a major book company, the kind of push that might even lead to one helluva movie series. This is too bad, because the book is insightful, well written, and remarkably imaginative. It opens us to the possibility of what the world might be like after we have encountered the near-fatal catastrophe that will finally get our attention, make us realize what a self-involved people we are and the kinds of damage we are doing to our planet, and force us once again to reach out and treat one another with the respect and love we both crave and deserve. Those who survive, that is.

The book is titled The Marsco Dissident and is written by James Zarzana. It is available on Amazon and promises to be a good read. And, no, I will not receive a kickback! Jim doesn’t even know I have written this and will almost certainly not read it. Take care and have a Happy New Year, one and all.

D.I.C. (Revisited)

In the spirit of saving myself the trouble of repeating myself, and given the wealth of new readers of this blog 😆, I reblog a post that may be of some interest.

One of the sobering consequences of the revolution that has placed electronic toys in the hands of everyone who can hold one is what I would call “D.I.C.” — diminished imaginative capacity. By coining this term I join with others who seem to love to make up names, and especially acronyms, for common events and phenomena in order to seem more learned. (We need not dwell on the acronym in this case!) The electronic toys the kids play with today and the movies they see do not require that they use their imaginations at all: they are loud, graphic, and vivid, and they present themselves to a largely passive audience. All the person has to do is sit and watch, or play with a joystick, and their world is at their fingertips with all its violence and noise. And because these kids read far less than their parents and grandparents and visit fewer art galleries, dance recitals, or symphony performances, this is of considerable concern: it is symptomatic.

To begin with, the appreciation of all great art and literature requires an effort of imagination. Take Joseph Conrad, for example. He is what many have called an “impressionistic” writer, and this causes problems for many readers for two reasons. First, although Conrad worked in a second language, his vocabulary is very rich, and it demands an extensive knowledge of words on the part of a reader. But second, and more to the point, Conrad leaves gaps and spaces in his writing that require an imaginative effort on the part of the reader in order to engage his writing fully. And that effort is one that a great many people are unwilling or unable to make, especially given their shrunken vocabularies of late. The same might be said of the highly imaginative Shakespeare, whose language is rapidly becoming foreign to growing numbers of young people. But the list of writers who demand an effort on the part of their readers could be extended endlessly. And the same could be said for art and music: they require an effort of imagination to engage the works fully. So, the question before us is: Why should anyone make the effort when they can pick up an electronic device, push buttons, sit back, and let the thrills begin? The answer is that these folks are living in a shrunken world, and they shrink as a result.

The results of all this have been analyzed and cataloged by a number of psychologists who have shown that the young, especially, are going forth into a complicated world with short attention spans and what amounts to a form of brain damage. They cannot attend to any subject, especially one that doesn’t interest them, for any significant length of time; further, portions of their brains are simply not developed. There is, indeed, quite a controversy among so-called experts about whether these people will or will not be able to cope in the future. I have written about it in previous blogs and choose not to repeat myself here. But the evidence suggests that it will be increasingly difficult, if not impossible, for these people to think their way through complex issues or use their imaginations to consider alternative consequences of future actions. And this is serious, indeed.

Moreover, I worry about the loss of capacity to imagine when it comes to great literature and great art because it means that these things will simply slide into oblivion, pushed aside by a growing number of people whose interest is focused on the immediate present and the graphic nature of the images and sounds that issue forth from their electronic toys that require no effort whatever. It may not be a problem on the scale of global warming, but coupled with that problem — and others of major proportions — it does not bode well for the future. Those who solve the problems we face now and in the future will have to use their analytic powers and, above all else, their imaginations. So, on the growing list of things that ought to have our undivided attention, we most assuredly should add D.I.C. and insist that the schools continue to require literature and art and that teachers discourage the use of toys as a substitute for those activities that will fully engage their minds and hearts.

If only the teachers would . . .


Reality

One of the first essay topics I assigned as a brand-new Instructor at the University of Rhode Island many years ago was the question “What Is Real?” The students were allowed to take the question wherever they wanted and to provide reasonable answers. It was one of my first thought exercises in the spirit of Robert Hutchins’ admonition that the only questions worth asking are those that have no answers.

Be that as it may, there is a genuine problem out there in our world that has seldom, if ever, been addressed in a direct manner. It surfaced recently in a comic strip I like to check out each day: a young girl staring at her iPhone told her parents, who were captivated by a fireworks display, that “Snapshot” had shown a much more thrilling event recently. She was completely bored by the real thing. Think about that: reality is boring because it fails to measure up to make-believe.

Freud talks about the “reality principle” that is essential for humans to develop in a healthy manner — the ability to separate reality from illusion. At birth we know only hunger, and we crave the pleasure that comes from satisfying that hunger and the quick response to our other immediate needs — including love from our parents. We spend the rest of our lives wishing we were back in the womb, where it was safe and all our needs were immediately satisfied. But life hits us squarely in the buttocks and we grow painfully into adulthood. It’s called becoming an adult. In the process we occasionally retreat into our own heads and find them a safe refuge when things in the real world become too threatening. But a large part of growing up involves the realization that we cannot remain within our own heads and become healthy, mature adults at the same time.

The point is that as we grow older we are also supposed to grow more certain about what is real and what is make-believe. And frightening as reality can be at times (especially these times!), we must prefer it to an imaginary world in which we are all-powerful and in complete control — like the world of electronic toys. We already know these toys are addictive: they release quantities of dopamine into the brain, just as gambling or alcohol does. But I speak here of a deeper problem. For many who engage with these toys reality becomes hard, too hard, and they retreat into a make-believe world which seems safer but which can entrap them for the remainder of their lives. Reality shrinks, the world of make-believe grows larger, and it becomes OUR world. It’s called “delusion,” or eventually “psychosis.”

Many of us are aware that our feckless leader lives in such a world. It is disturbing to say the least. But it pales in contrast to the fact that he is joined in that make-believe world by growing numbers of people who find reality simply too hard to deal with in a direct and honest manner. Thus do games, and, indeed, the world of entertainment as a whole, draw us to them and the imaginary world becomes the real world, a world in which we are at the center and a world that bends to our every wish. The problem is that this is not the real world. The real world is one of pain and struggle with a blend of heroism, love, sympathy for others and, we would hope, a sincere wish to belong with others to a world we share but cannot bring utterly under our control.

One must wonder where this will eventually lead us all, given the genuine need to address real problems and suggest real solutions. There is much to do and there are problems waiting to be addressed. We start in the wrong direction if we take in hand an electronic toy that leads us to believe that it is all very simple and problems that arise can be solved by pushing an icon.

In answer to my own question, then, I would say reality is what we experience daily; it is a struggle tempered by occasional beauty, a remarkable number of good people, and those few who are close to us whom we love. It involves frustration at times, but it also rewards heroic efforts — or even the slightest effort — to do the right thing. We cannot solve all the world’s problems, but we can certainly address those closest to us which allow us to make small inroads into solutions that will help make the world a better place. The real world, not an imaginary one.

Intelligence

In 2008 Northwestern University Press published a collection of essays by Lionel Trilling, edited by Leon Wieseltier, under the title The Moral Obligation to Be Intelligent. Wieseltier chose the title because one of Trilling’s teachers, John Erskine, had once published an essay by that title. The problem I have with this title is that it makes no sense whatever, and given that Trilling was a brilliant man, he would have known this. The collection is in some way an insult to the man Wieseltier hoped to praise. There is no question he held Trilling in very high regard, but he should have given the title of the book more thought.

The title makes no sense because we cannot have an obligation to be intelligent. We either are or we are not intelligent. As Immanuel Kant argued many years ago, “ought implies can.” We cannot choose to be intelligent, though we can choose to be as intelligent as possible. Thus the title “The Moral Obligation To Be As Intelligent As Possible” would have made sense. But it is a bit cumbersome and was doubtless rejected on those grounds. Again, we can try to be intelligent. Indeed, according to much of the collective wisdom of the Western tradition, we have a moral obligation to develop our potential, including our mental capacity, and not to waste it.

Our president and his minions have set the benchmark for intelligence at a very low level. In addition, the electronic toys the kids are addicted to have been shown to diminish intelligence. Popular culture and the entertainment industry have replaced “high culture” and civil discourse. And our schools don’t see intelligence as having any real value. But then intelligence in this country has never been regarded as an especially good thing, a thing to be sought after as desirable in its own right. Ours is a nation of practical folks who have always been suspicious of those who exhibit intelligence, those “eggheads” so derided not long ago. The notion that we should pursue knowledge for its own sake, and not simply because it may someday translate into greater profits for ourselves and the companies we might happen to work for, is anathema in this culture. And, to a lesser extent, it always has been, despite the fact that the founders of this nation were a remarkably intelligent group of men, as were the two presidents we revere most highly — namely, Abraham Lincoln and George Washington. But, then, consistency has never been our forte.

Moreover, it makes no sense to say that we have a moral obligation to do something we cannot do. I cannot tell you, for example, that you really should leap off the highest building in town and fly — where “should” reflects a moral obligation to do just that. Thus, if intelligence is something we are either born with or not, it makes no sense to tell someone that they really should be intelligent. Even the phrase reflects the nonsense at the heart of the demand. But the notion that we should all, in this day and age, try as hard as we can to become as intelligent as possible makes perfectly good sense — despite the current cultural pressures to be as stupid as possible. Wasting our time and our minds on electronic toys, social media, violent movies, and mindless people shouting at one another on television is not designed to make us smarter. It is tantamount to wasting our talents, our potential as human beings, our potential as specific human beings with specific abilities and talents.

We pay lip service to this idea when we note that “the mind is a terrible thing to waste” (or, as Dan Quayle said in this regard, “What a waste it is to lose one’s mind. . .” Quayle knew whereof he spoke). And our sitting president, who spends his time tweeting inanities and taking mulligans on the golf course at the expense of the American taxpayer, is certainly not my choice to be captain of the intelligence corps. But he is revered by countless Americans who see him as the Great White Hope, a man of extraordinary intelligence (as he insists he is) who will lead us to a brighter tomorrow. Probably not. Certainly not if we continue to waste our minds on trivia and toys and ignore the obligation to try to be as intelligent as possible and to elect politicians in the future who exhibit at least a modicum of intelligence.

Homework

As a rule I mute television commercials. I can’t stand most of them as they send us all subconscious messages from multinational corporations that seek to entrap the will and bring about the purchase of something we simply do not need. Some are clever and I try to listen to them, just for a laugh. But there is a new Apple iPad commercial that I happened to listen to recently, because I was remote from the remote, and that commercial gets my goat!

The commercial shows a middle school teacher assigning homework to his class, presumably on a Friday, and a voice-over starts intoning the message “Ugh, homework. I hate homework.” The style of the commercial is reminiscent of Jean Shepherd’s A Christmas Story and perhaps that is what they were going for. It shows the kids having fun, playing and larking about, at times with their iPads (presumably suggesting that homework on iPads can be fun? Or perhaps the kids are just checking social media?), while all the time the voice tells us repeatedly how much they all hate homework.

And we wonder why our kids are falling behind the students of nearly all of the other so-called “developed” nations! This sort of anti-intellectualism, which is all too prevalent in America and has been for many years, guarantees that those children will never catch up to the rest of the world. We know the public schools are under attack, and the data show that we draw into public school teaching those who are in the bottom third of the students in our colleges. They are paid a pittance and asked to raise the kids in addition to teaching them — or, most recently, to arm themselves against possible terrorists. And if we now start to send the message that they should not assign homework — presumably because the kids don’t like to do homework — we simply add fuel to a fire already threatening to go out of control.

Homework, like it or not, helps young people deepen their knowledge of the subject matter after an all-too-short school day — in addition to teaching the skills of self-discipline and self-denial, which we all dearly need. It also helps them to become independent learners instead of mere recipients of the teacher’s bits of knowledge. To be sure, none of us wants to do work of any sort — which is why we are paid, I suppose. But work is necessary, and homework in the schools is a necessary component of the load the student is asked to bear. And let’s face it, that load is not back-breaking. We seem to be asking our students to do less and less due to the fiction that they are under so much pressure already. And at the same time grade inflation convinces them that the work they are doing is stellar when, in my experience and from what I have read, it is generally sub-par. The result, of course, is our age of entitlement.

Needless to say, this is an issue close to the heart of a retired college professor who has read and thought about education at all levels for many years (and blogged endlessly, some would say). I have even written a book about the current condition of education in this country, and it has always been a concern of mine — because it is a problem that can be solved if we simply put our minds to it. If tiny Finland can do it, we certainly can! Initially it would require that we somehow stop the mindless attacks from the political Right against public education and determine to put a much larger share of the annual federal and state budgets into education, thereby attracting better teachers and showing them that education matters.

In any event, the attack on homework by a corporation determined to sell more electronic toys to a generation already stupefied by those toys is a compound felony in my view. I have always thought Apple a cut above the rest, but I must now revise my views. At the same time I will continue to worry about the present state of education in this country, convinced as I am that it holds the key to the success or failure of this democracy. And I will continue my practice of muting the commercials.

The Younger Generation

It’s a cliché that the older generation has complained about the younger generation since God wore short pants, as they say. But I have been maintaining for some time now that something new has appeared on the horizon: the “millennials” — those born in the middle to late 80s of the last century — are a new breed posing new problems.

Accordingly, it was most interesting to come across a YouTube interview with Simon Sinek, a “Leadership Expert” (?), who is making quite a splash with his analysis of “what is wrong with the present generation.” According to Sinek there are four major areas of concern that must be explored to understand what is going on. He stresses that he is not making judgments about the younger generation and he refuses to blame them. Rather, he blames (1) bad parenting, (2) technology, (3) impatience, and (4) the environment.

I have touched on most, if not all, of these points in many of my blogs — most especially the “self-esteem” movement that has caught fire in the schools and in parenting (thereby contributing to what Sinek calls “bad parenting”). This movement rests upon the totally false psychological premise that by praising kids endlessly we will raise their self-esteem, whereas clinical studies have shown that false praise and the awarding of such things as “participation trophies” actually decrease self-esteem. They send false messages and instill in the young an expectation of being praised for everything they do, thereby reducing their motivation to put out the effort to achieve something difficult. This leads invariably to a sense of “entitlement” on the part of growing young people. True achievement, of course, would in fact raise their self-esteem and would give them the sense of satisfaction they now expect to receive for no effort whatever.

Sinek stresses how damaging this is to the young who know, deep down, that they have done nothing to deserve the praise. But worse yet, they later become depressed because they do not receive the same praise for every effort when out in the workplace — the “environment” of which Sinek speaks. In the real world of real work, folks have to make an effort and many times their efforts are unrewarded. That’s just how it is. But Sinek has himself interviewed a great many bright and able young people who, after a few months on the job, find themselves deeply depressed and disillusioned, even suicidal. Others drift with no goal or sense of purpose. They simply are not getting the stroking they have become used to.

Of considerable interest to me is Sinek’s second point, the role of technology in the world of the young — in a word, the electronic toys. I have written endlessly (some would say) about this problem, as these toys have always seemed to me to drive their users deeper within themselves and to erect barriers between themselves and the world outside. They promote what I have called the “inversion of consciousness,” a preoccupation with the self and its reactions. Worse yet, Sinek says there is considerable evidence that these electronic toys are addictive. Like such things as gambling and alcohol, social media and the “likes” on the toys increase levels of dopamine, the same brain chemical implicated in addictive behaviors. Thus our intuitive sense that these toys are addictive is well-founded. We (and this includes the schools that hand out electronic toys as a sign of their advanced educational views) are handing these young kids an invitation to a make-believe world where they sit all-powerful at the center and which they find increasingly difficult to escape — much like the alcoholic who tries to go on the wagon.

The third item on his list, it seems to me, is the result of a combination of #1 and #2 above: the refusal of parents to deny their kids anything coupled with the ready availability of toys that provide users with immediate gratification in so many ways. They are impatient because they have never learned to put off gratification for a later and fuller sense of satisfaction. So many parents tell us that they don’t want their kids to have to “do without” as they did — while it may very well be that putting off gratification, learning self-discipline, is the key to true satisfaction and happiness.

Sinek is not long on solutions, suggesting only that we encourage the young to put aside their iPhones and iPads for a few hours each day and try to build bridges with other people in the real world. This is an excellent suggestion, but one that is easier said than done. It takes “tough love” on the part of parents who truly care about their children and who are determined to take more time to be with their kids and interact with them on a personal level. And the schools need to get back to good teaching and stop turning the kids into addicts.

The only other element I would add to Sinek’s list is the entertainment industry, which compounds the problems Sinek points out. The ultimate cause of the problems he discusses is the removal of these young people from the real world, the weakening of what Freud calls “the reality principle” that allows them to function fully in the world of people and things, interact with others, build meaningful relationships, and find true joy in living and working in the world. This, in my view, is the central problem, and it is one that we all need to think about and deal with in our interactions with a generation that is in danger of becoming lost in a world of make-believe where their sense of power and importance is imaginary and can never live up to the real thing. This must ultimately lead to depression — and worse. And the cost to society at large is beyond reckoning.

Spectators?

Is it possible we are becoming a society of spectators? Is it the case that we are so removed from the world that we have become passive observers of the scene around us, as though we are watching a movie? I do wonder sometimes. I have gone on (and on) about the danger of the electronic toys we all seem to be addicted to and the damage they are doing to our collective brains. There is hard evidence that this is the case, but it doesn’t seem to deter anyone. We walk through life with our eyes down, fixed on the toys in our hands and checking social media to see if we have new friends — or if the old ones still “like” us. Meanwhile the real world around us becomes less and less real as the pictures we are fascinated by become true reality. This detachment from the real (REAL) world is a sign of mental imbalance, folks. Just ask Freud, who talked a good bit about the “reality principle” that governs the gradual maturing of the young child as he or she grows and becomes an adult. The make-believe world of the child is supposed to be replaced by the (at times painful) world of things and people that the child slowly realizes is the real world. Things seem to have become turned upside-down. The real world is now for so many the world of make-believe: the world in our hands that we can control, not the world “out there.” Worse yet, we have become emotionally detached, many of us, and see tragic events as simply another episode in a drama we are not really a part of. We have become a nation of spectators, it would appear.

A story in Yahoo news recently brought this possibility home in a rather graphic way:

Shocking surveillance video shows the moment a Pittsburgh woman was knocked out cold by a man on a busy sidewalk — but that’s not the worst of it. The footage also shows the woman being beaten and robbed by bystanders — who proceed to take pictures of her, including selfies — as she lay unconscious on the ground. “They don’t treat animals like that. They wouldn’t treat a dog that way,” the victim’s mother told KDKA on Thursday. “It’s disgusting. My daughter needs help.”

I suppose the woman being knocked down and robbed shows us a side of ourselves we have always known was there. Recall the Kew Gardens incident in 1964, when thirty-seven or thirty-eight people ignored the cries of Kitty Genovese as she was stabbed to death outside their bedroom windows. In a crowded world we tend to become a bit more callous, and the robbing and beating of a helpless woman seems like yesterday’s news to a people who have become jaded and over-exposed to violence and mayhem. What is unique about the Yahoo story is the observation that people were taking photos, including “selfies,” as the woman lay beaten and suffering on the sidewalk.

We need to keep our perspective here (speaking of the reality principle). This is not about all of us, and it is only an anecdote. There are good people out there doing good things every day. But there are growing numbers of people who seem to have become inured to the suffering of others, as though it’s not real but something to watch and get their own emotional high from. We don’t experience the woman’s pain, only our own emotional reaction to the incident, “getting a rush.” This seems to be what it’s all about for many, indeed for an increasing number of people, in a world of detached spectators who “get off” by watching rather than becoming truly emotionally and intellectually connected with the events taking place. Taking a photograph freezes the event and allows us to see it as something happening in our own little world where we are in control and sympathy and empathy are no longer part of the equation.

Language

Once upon a time, long ago, after humans had freed themselves from the primeval ooze and struggled to stand upright, they gradually invented language in order to communicate with one another. Initially, it was through pictures and gestures, but eventually they developed an alphabet and put words together. All of this was in order to communicate their ideas and feelings to one another, to make clear what they had in mind.

It was thought for many years that language was the one thing that separated humans from other animal species. But then it was discovered by people like Wolfgang Köhler that chimpanzees could communicate with one another and it was later learned that they could even teach one another the language. Then we learned that other animal species also have communication skills and even something similar to language. This was about the time when humans were losing their own use of language. Coincidence? Perhaps. But in the event, humans discovered their vocabularies shrinking and their ability to grasp such things as compound sentences slipping away. It was about the time when they started playing with electronic gadgets designed to increase their ability to contact other people and, presumably, to communicate with them. Coincidence? Perhaps.

But, it turns out, the idea is no longer to use language to communicate with one another. Language is now for self-expression. We use it to tell others how we feel or, at best, to order pizza. We discovered that we don’t need a rich vocabulary or complicated sentences. We can use images and gestures. Just like our ancestors. 🙂

The problem is, of course, that language is necessary for thought, and as language becomes impoverished so also does our ability to think. This is demonstrated, if we require a demonstration, by the alarming number of people who support Donald Trump. Obviously, these people have lost the ability to think. I haven’t been listening at doorways, but I would wager they can’t speak, either. The point is that language was initiated in order to make it possible for us to communicate with one another. And this means that a fairly sophisticated vocabulary, along with the rules of grammar and usage, is also necessary if we are to tell each other what’s on our minds. The point was wonderfully made by John Barth in his novel The End of the Road, in which the hero, Jake Horner, is dealing with a reluctant student in his basic college English class. The student insists that because language came before grammar we don’t need grammar. After a lengthy Socratic exchange between Jake and the student, Horner concludes as follows:

“. . .if we want our sentences to be intelligible to very many people, we have to go along with the convention [the rules of grammar]. . . You’re free to break the rules, but not if you are after intelligibility. If you do want intelligibility, then [you must master the rules].”

But, it would appear that a great many of us are like the student in this exchange: we don’t want to obey the rules of grammar because ultimately we are not really interested in communicating, in intelligibility. Language is simply a device we employ to express ourselves. Period.

In a word, we as a species regress. And as we regress we are surrounded by a growing number of problems that require careful thought and imagination. This at a time when thought and imagination have become impoverished by “advances” in technology and the growing influence of the entertainment industry, whose motto is: take it down to the lowest level in order to attract the largest audience. Educators have followed suit, lowering expectations and providing their students with electronic toys. Coincidence? Perhaps. But a bit unnerving nonetheless.

Thus we discover around us folks whose attention is directed at the toys in their hands — even when they are next to one another — and who find it difficult, if not impossible, to say what they mean or understand what others say to them. But since language is no longer about communication, since it is now about self-expression, it really doesn’t matter. As long as others know that I am angry, hungry, or sad, that’s really all that matters. If they don’t understand what I am feeling, so much the worse for them. It’s all about me. I don’t need language. 🙂