Not long ago I had a disturbing visit from an Australian colleague. She told me that her brother was a neonatologist, whose responsibilities included training graduate nurses and physicians in his specialty. But over the past several years, he had noticed ominous changes in his students. So little had been required of them in their previous studies in the way of computational precision and memory that he could no longer trust them to mix or administer the solutions and medications on which the well-being of his newborn patients depended. His mood, she reported, lurched between terror and fury. The obvious question: how did these young people get so far with so dangerously little knowledge at their fingertips?
Another vignette redoubled my unease: Comedian Jay Leno routinely ventures outside his California TV studio to ask passersby seemingly obvious questions, which invariably receive comically uncertain replies. “Why'd the Pilgrims come to America?” “For dinner?” “Where'd they land?” “Santa Monica?” The studio audience laughs, Leno shakes his head in mock disbelief at the grotesque ignorance on display, implicitly asking how people could reach adulthood with so little general knowledge.
Those two stories popped into mind when I recently heard an avowedly progressivist educator on my campus urge that “100% of the students can get 100% of the material 100% of the time.” Granted, they may choose not to (but not really, since such a choice is merely a defense mechanism against unimaginative instruction); however, the [progressivist] catechism concluded, we are all creative, we are all “limitless learners.”
Other forums made clear how these notions were to apply to the study of history. Insistence on clear narrative, precise geographical and chronological placement of events, vivid biographical sketches, logical causation, line-by-line explication of primary source documents—the staples of my teaching and testing—these were not only retrograde and unfashionable, but positively harmful. They should neither be required nor even striven for. They only inhibited students from “telling their own stories,” from “constructing their own learning.” Such goals were unfair, elitist, bad. We must rather set students free from the tyranny of “fact” (the quotation marks were essential to the “argument”). Far more important that the student be made comfortable and have positive feelings toward the subject, that his self-esteem never suffer from a demonstrated inability to produce and deploy “facts.”
By this time I couldn't help wondering whether such doctrines were behind the distress registered by Leno and the Australian doctor. I wondered whether the proponents of such views, should they ever double over in pain, would wish to be taken to a physician trained on such principles. I even wondered whether we hadn't, to paraphrase Daniel Patrick Moynihan, defined competency down for fear of finding out that there are some things some of us can't do, or can't do well. Classroom trends seemed to suggest as much: What used to be expected of freshmen in conceptual grasp and spoken and written articulateness gets deferred or forgotten; assignment lists are trimmed; papers shortened or abandoned; monographs are out, simple texts with big print and bigger pictures are in; group evaluations are preferred to individual efforts. If a student fails a test, renorm the test; if a student fails a marking period, renorm the teacher. Just don't upset the children.
On the premise of defining competency down, the questions I annually get from my students before their first quiz make perfect sense: “Does spelling count?” “Do you take off for grammar?” “Do dates matter?” “Do we have to know this?” My routine answer to this barrage, “What possible reason would I have for saying No?” begets more consternation than anger. The same premise explains why students, despite weeks of preliminary work, resist writing thesis-and-evidence essays, much preferring a content-free paper in which proof becomes synonymous with “I really feel.”
Somewhere they got the idea that there are no real “facts” available for public inspection, but only affective states, and how dare anyone presume to judge those?
There is an undeniable surface appeal to defining competency down. Everyone does well, whatever well now means. Grading is far less onerous. There are no invidious comparisons of the dull and the dutiful; a cozy egalitarianism envelops all. The phones don't ring with anxious parental complaints. The elaborate preparation necessary for effective lecture-discussion classes disappears. Above all, students are comfortable. Yet nagging questions remain: Is excellence ever comfortable? Doesn't striving for excellence presuppose mastery of a body of knowledge? Should we be content with feelings?
Guidance about how and what I should teach came from an unexpected source, a senior who spoke to our school about his desire to be a rock star. However, while he wanted to gyrate with 50,000 watts, his teacher had other ideas. He gave the boy a “horribly uncool” acoustic instrument and set him to learning simple songs, scales, and chord fingerings. He insisted upon correct notes, correct rhythms, correct tempi, correct phrasing, unafraid to use words like ugly, lazy, or wrong. The boy pouted, raged, remonstrated: “Hey, man, don't you get it? Forget this stuff. I want to jam.” The music teacher nodded and assigned more exercises, more analyses, more simple stuff in a variety of idioms: folk, blues, jazz, classical. The boy howled and squealed: “I hate this stuff. I want to express myself.” “Until this stuff becomes second nature,” came the devastating reply, “what you express won't be worth listening to.”
Finally, the boy realized that the appearance of musicianship was no substitute for the reality and returned to the teacher ready to face the discipline of music. Today he can play in a way people want to hear. No, he still cannot perform tricky solos written for performers of wholly exceptional, maybe even unique, gifts. But he has found that the entire heavy metal songbook pales in interest beside the wide repertoire he now plays, along with his own tunes. He has become a competent musician.
He found that music was both delight and work; that esteem was a consequence, not a cause, of demonstrable accomplishment; that there were lots of things he simply had to know, without excuse. He found, paradoxically, that structure frees: only tuned instruments and trained fingers are truly free to make music. In returning to the teacher, acknowledging the latter's experience and insight, the boy made what philosopher Dietrich von Hildebrand called a “response to value,” the necessary preliminary for genuine learning. Not least, the boy had met someone who took his student and his subject seriously enough not to dumb things down. All in all, not a bad educational outcome.
In teaching history, I hope I have the courage to take my cues from people like the guitarist instead of the educationist. Call me mad, bad, and dangerous to know, but I don't want any of my students making unscheduled appearances on the Tonight Show any time soon.
P.M. Alliazi writes from Cleveland. This article is reprinted from Basic Education with the permission of Dr. Madelyn Holmes of the Council for Basic Education.