Improved Learning

I repost once again because I have begun to think that some of my best thoughts have already been set forth and there are a few new readers who might find them worth pondering.

There must be many school administrators who have too much time on their hands. They keep trying to come up with new ways to teach and learn, forgetting that the best way to do that is to get the brightest teachers you can by paying them a decent salary and then turning them loose in the classrooms. Instead, they have fallen hook, line, and sinker for the electronic toys that have been incorporated into schools at nearly every level. This is part of the common educational practice of bringing the subject matter down to the level of the student rather than having the student stretch and grow to reach a higher level. “Dumbing down the curriculum,” as it is called. Give ’em what they want. The kids play with electronic toys, so let’s incorporate them into the curriculum. Somehow. The latest educational fad in “higher” education is to make learning even easier and less painful: let the students stay at home, where they can sit in front of a computer screen as passive vessels instead of in a classroom where they might accidentally interact with each other or, worse yet, the instructor.

I’m with Albert here: led by a purblind educational bureaucracy we are rapidly turning out idiots who cannot interact with one another and cannot use their minds except to turn things on and off. Socrates was never “certified” to teach, and he didn’t use the latest electronic gadget, either. Plato’s Academy also did rather well without the latest electronic toy, thank you very much. After all, Plato was able to turn out people like Aristotle without a huge cadre of administrators looking over his shoulder, a committee of well-meaning board members to answer to, or a single computer.

Our addiction to electronic toys has seriously inhibited human interaction as we see people walking down the street holding electronic devices to their ears or looking down at the device they are sending text messages from: they don’t talk to one another any more, they talk at one another — in broken English. As suggested above, the latest fad in higher education is the trend toward on-line learning, which is simply another way to guarantee that students will learn very little. I dare say it will soon catch on at the high school level as well.

However, studies have shown repeatedly that the lecture method — in the classroom or on-line — is the worst way to teach a subject for most students. In addition, the drop-out rates in on-line education are off the charts. Real learning takes place when people interact with one another. On-line lecturing is simply multiplying the lecture-system mistake by making it easier and faster — and cheaper. And there’s the rub. Education has become so costly that students are turning to on-line “universities” like The University of Phoenix, and the other colleges and universities realize they must either join the party or sit by as their high-paid faculty lecture to empty halls. It’s sink or swim. We are now told that a group of so-called “prestige” universities wants to join the fray:

Now 30 Under 30 alum 2U, which has previously focused on online graduate degree programs, has decided to throw its hat into the ring. This week, the company, formerly known as 2tor, announced a partnership with a consortium of 10 universities to offer undergraduate courses online. The company’s new program, Semester Online, will launch in September 2013 with a catalog of about 30 courses offered by Brandeis, Duke, Emory, Northwestern, University of North Carolina, Notre Dame, University of Rochester, Vanderbilt, Wake Forest, and Washington University in St. Louis.

Whatever the reason, we insist on embracing the latest fashion even when the evidence shows that it not only fails to deliver the goods but actually inhibits the results — teaching and learning in this case. Electronic gadgets do not enhance learning; as Jane Healy has shown, they actually inhibit it. Their use has impaired the capacity of parts of the brain to function as they should, making thought and coherent speech more and more difficult for growing numbers of young people. Instead of embracing the latest fad, we might be better advised simply to reflect on the goal of education, which is to enable young minds to grow and develop. We need to stop worrying about what is latest, cheapest, or easiest, and start to recall what is most effective: a good teacher in a room interacting with interested and curious students. Preferably they should sit in a circle or, better yet, around a table.


Stupid!

One of the very few sit-coms I watch on the telly is “Young Sheldon,” the spin-off from “The Big Bang Theory.” It stars the truly remarkable child actor Iain Armitage and is in many ways more delightful (and funny) than its predecessor.

Young Sheldon is a nine-year-old Sheldon Cooper who likes to brag (even in Church with his Fundamentalist mother) that he doesn’t believe in God: he believes in science. This is amusing when it comes from the mouth of a small boy sitting next to an adult, but it is also a bit stupid. As Pastor Jeff tells Sheldon in an exchange they have in Church, even some of the most brilliant scientists believed in God — to wit, Albert Einstein and Charles Darwin. In another episode, Sheldon comes across Pascal’s wager, in which the brilliant mathematician explains that it is smarter to believe in God than to disbelieve, because those who believe will be rewarded while those who do not cannot be. And even if God doesn’t in fact exist, those who believe will have lived better lives. This is a bit of a simplification, but you get the idea.

In any event, young Sheldon, for all his intelligence, has committed the fallacy of bifurcation: either God or science, not both. But why not both (ask Einstein and Darwin)? Indeed, it is a bit stupid to insist, as so many intellectuals do, that there is only one way to know anything and that is the way of science. This, of course, is what has been called “scientism,” and I have written about it before; it commits the fallacy of poisoning the well. That is to say, it rules out the possibility that there are other ways of knowing and it ignores the uncomfortable fact that there may be things we simply cannot know — mysteries, if you will. This, too, is stupid. We have already encountered two fallacies in the minds of those who, like young Sheldon, insist there is only one way to know.

But it is equally stupid to ignore the findings of science, including medical science — such things as evolution and climate change, for example. Science can deliver a great many truths that simply cannot be denied without being completely stupid. And perhaps it is because many people who identify themselves with religion insist that science is the work of the devil that intellectuals refuse to acknowledge that there could be any semblance of truth in religion. This is guilt by association: those people conflate the differences among religion, organized religion, and faith. This, too, is stupid — as Pascal would attest. But the fact is that a great many people who insist that faith is the only road to the Truth are as stupid as those who think science is that road. Either stance requires a form of denial and an assumption that our way is the only way. There may, in fact, be many roads.

In a word, there are, as Hamlet tells us, a great many things in heaven and earth which we cannot explain with science. There are limits to human truth. But there is truth and it is available to those who are willing to search for it; while a little knowledge is a dangerous thing, the unexamined life is not worth living.  And the start of that search begins with the acknowledgement that we do not know everything and may never know everything. Not in this life, anyway.

It may well be the case that we will only know the truth after we die. Heaven may consist of a world in which the Truth is revealed to us. And Hell, of course, may be a place where truth is denied and everyone tells lies, a world in which everyone makes everything up as they go along and in which there is nothing whatever that is solid and we are surrounded by incessant confusion and uncertainty — a world of Donald Trumps, if you can imagine.

In any event, I have no problem whatever accepting the very real possibility that I do not know everything and that there are things which I simply must accept on faith. But I also believe that there are things that are true, things that stand on a solid base of empirical evidence and intuitive truths that simply cannot be denied. In the end, though, there is only one certainty and that is that there is no absolute certainty. That much I do know.

Lessons Learned?

The latest word from Afghanistan is disturbing.

KABUL (Reuters) – The U.S. military said in a secret report that the Taliban, backed by Pakistan, are set to retake control of Afghanistan after NATO-led forces withdraw, raising the prospect of a major failure of Western policy after a costly war.

This, of course, should not surprise us, though it will surprise some for all the wrong reasons. George McGovern wrote an open letter to President Obama upon the latter’s assumption of the Presidency, warning him not to get further involved in that part of the world. History has shown that such a step is ill-advised. McGovern pointed out that the Russians and the English, in recent history, learned tough lessons in Afghanistan and went home with their tails between their legs. He even went so far as to suggest that Britain’s involvement in the war in Afghanistan brought about the final days of the British Empire. The NATO forces now engaged in that war are finding out how frustrating it can be — not only because of the elusive Taliban, who are the known target, but also because of native security forces who have turned on them in significant numbers, according to recent reports.

Now whether or not we want to agree with McGovern — who has a PhD in history from Northwestern and has also had considerable “real-world” experience — we should have learned enough by this time to realize that (as Santayana said long ago) those who ignore history are doomed to repeat its mistakes. So here we are.

We are told that “ontogeny recapitulates phylogeny,” meaning that the human embryo seems to repeat the stages of evolution the human race has gone through, complete with a vestigial tail and gills. It has occurred to me that humans after they are born exhibit the same sort of “recapitulation.” The children refuse to learn from their elders, just as their elders, for centuries past, have refused to learn from the collective wisdom of the human race. We prefer to make our own mistakes, even if those mistakes are costly in both lives and money. Einstein defined “stupid” as the determination to repeat an act that is known not to work. We claim to be the most evolved species on earth. I think not!

As one who has become convinced that we can not only learn from history but also from great literature, I watch with amazement as seemingly intelligent people like our President listen to the wrong kind of advice and make the wrong choices. We were mistaken to get involved in Afghanistan in the first place, though chasing down Osama Bin Laden was a viable excuse in the minds of many. But we know Pakistan is not a worthy ally and we also know that the tribes in Afghanistan have been at one another’s throats for centuries. And we also know, or should know, that McGovern’s analysis was based on weighty historical evidence.  But all that is cast aside in the frenzy to impose our will on another culture and eliminate a man whose cause would certainly not die with him.

In the end, we have made our own bed and we must now lie in it. But we should have known enough not to make the bed in the first place. The refusal to learn from others’ mistakes may turn out to be our fatal flaw.