Son of God Movie Review by Nic Haros

Son of God Movie Review 

by Nic Haros, Director of Facebook Apostles.
February 28, 2014

Yes.  Now I can say that I’ve actually seen the Son of God movie, and I can wholeheartedly recommend it to Catholics, Christians, and non-believers alike.  The scenes and dialogue from the movie (and the emotions it evoked) are still playing in my mind.  I do have some thoughts on the movie, which I’d like to share.

In summary, I was awestruck by the performance of Roma Downey (co-producer) as Mary, the Mother of Jesus.  Though missed by most, I think, this movie not only presented the life of Jesus and His mission on earth but also shed some light on the story as perceived by Mary.  I will develop that idea in a bit.

First, I’d like to respond to some of the thoughts shared by you, Page Fans of Facebook Apostles.  All in all, those who have seen it have had a very good experience watching the film, and most have liked it.  Some have “criticized” the movie (mistakenly, I think) on a number of points.

Most respondents have drawn comparisons to other Jesus movies such as the Passion of the Christ and Jesus of Nazareth.  I think such comparisons (favorable and unfavorable) are unfair.  All three are very different movies, though the subject matter is the same.  The Jesus of Nazareth movie was, in my opinion, a narrative of the events in the New Testament.  The Passion of the Christ was a cerebral film—it appealed just as much to our psyche as to our emotions.  Son of God, I think, was different.  Of the three movies, SOG presented the most hopeful ending and message.  The joy and peace of living and loving the Beatitudes is made known to Christians to come, and the film clearly offers a path of redemption through Christ.  The message of this movie is the universal call to evangelize.  Not so with the other two films. Continue reading

Plato’s Big Mistake by Louis Markos

Plato’s Big Mistake

Plato never cared much for the sophists, viewing them as amoral peddlers of a relativistic kind of wisdom with the potential to corrupt the souls of those who hired them. It is therefore not surprising that when they appear in his dialogues, they are generally treated in a negative or at least suspect manner. In Protagoras, however, Plato treats the sophist of the title with considerable respect. He even has Socrates debate with Protagoras—on fairly equal terms!—a two-part question that Plato considered vital: what is the nature of virtue and can it be taught to others? Although the more elitist Socrates begins the dialogue by asserting that virtue cannot be taught, as the dialogue proceeds, he slowly adopts a position concerning the nature of virtue that drives him—almost against his will—toward the necessary conclusion that virtue can be taught.

In striking contrast to the Christian doctrine of original sin, Plato argues in Protagoras—and elsewhere—that human evil is not the result of rebellion or disobedience. Although G. K. Chesterton was certainly right when he claimed that original sin was “the only part of Christian theology which can really be proved,” Plato seems to have overlooked this proof in favor of a different cause for vicious behavior. “For myself,” says Socrates, “I am fairly certain that no wise man believes anyone sins willingly or willingly perpetrates any evil or base act. They know very well that all evil or base action is involuntary” (345e). Later in the dialogue, Socrates explains more clearly what the cause is of this involuntary evil:

…when people make a wrong choice of pleasures and pains—that is, of good and evil—the cause of their mistake is lack of knowledge….no one who either knows or believes that there is another possible course of action, better than the one he is following, will ever continue on his present course when he might choose the better. To “act beneath yourself” is the result of pure ignorance, to “be your own master” is wisdom. (357e, 358c)

Evil actions, that is to say, are caused not by sin but by ignorance. If we knew of another, better course of action, we would take it. Continue reading

Controlled Burn, Alinskyian organizing, and Common Core by Stephanie Block

(Book Review)
Controlled Burn, Alinskyian organizing, and Common Core: A Book Review of A Match on Dry Grass: Community Organizing as a Catalyst for School Reform, by Mark R. Warren, Karen L. Mapp, and the Community Organizing and School Reform Project (Oxford University Press, 2011)

The title of this book, A Match on Dry Grass, is a metaphor. The education system is desiccated; parents are frustrated and angry. In such an environment, all it takes is a small push for reform, supplied by professional organizers around the country, and a wild prairie-fire of a movement against the “’savage inequalities’ of American public education” will be ignited. (p. 5) At least, that’s the plan.

Continue reading

Two Noble Ends of an Authentic Education by Steven Jonathan Rummelsburg

Two Noble Ends of an Authentic Education

The Oracle of Delphi foretold countless fortunes, futures, prophecies, and mysteries over many centuries and is the same ancient fount of wisdom who declared Socrates to be the wisest man in the world. A great sign above the entrance to the Temple at Delphi exhorts all who enter her sacred halls to “know thyself,” for without such knowledge, the Oracle’s prophecies remain enigmatic and undecipherable.

Devoted teachers have faithfully transmitted the Great Western Tradition to countless souls over many generations. For the better part of three millennia, cultivated men knew that true and accurate knowledge of self is necessary for every authentically educated soul. To “know thyself” remains one of the twin ends of the complete man, the other being the attainment of deep and precise knowledge of reality. These two ends allow us to attain rhetorical skills needed to describe reality as it is, not as we wish it to be. The accurate conveyance of reality is a duty to justice and is owed to the other through the proper use of speech. Continue reading

Who Killed the Liberal Arts? And why we should care by Joseph Epstein

Who Killed the Liberal Arts?

And why we should care

SEP 17, 2012, VOL. 18, NO. 01 • BY JOSEPH EPSTEIN

When asked what he thought about the culture wars, Irving Kristol is said to have replied, “They’re over,” adding, “We lost.” If Kristol was correct, one of the decisive battles in that war may have been over the liberal arts in education, which we also lost.

In a loose definition, the “liberal arts” denote college study anchored in preponderantly Western literature, philosophy, and history, with science, mathematics, and foreign languages playing a substantial, though less central, role; in more recent times, the social science subjects—psychology, sociology, political science—have also sometimes been included. The liberal arts have always been distinguished from more specialized, usually vocational training. For the ancient Greeks, the liberal arts were the subjects thought necessary for a free man to study. If he is to remain free, in this view, he must acquire knowledge of the best thought of the past, which will cultivate in him the intellectual depth and critical spirit required to live in an informed and reasonable way in the present.

For many years, the liberal arts were my second religion. I worshipped their content, I believed in their significance, I fought for them against the philistines of our age as Samson fought against the Philistines of his—though in my case, I kept my hair and brought down no pillars. As currently practiced, however, it is becoming more and more difficult to defend the liberal arts. Their content has been drastically changed, their significance is in doubt, and defending them in the condition in which they linger on scarcely seems worth the struggle.

The loss of prestige of the liberal arts is part of the general crisis of higher education in the United States. The crisis begins in economics. Larger numbers of Americans start college, but roughly a third never finish—more women finish, interestingly, than do men. With the economic slump of recent years, benefactions to colleges are down, as are federal and state grants, thus forcing tuition costs up, in public as well as in private institutions. Inflation is greater in the realm of higher education than in any other public sphere. Complaints about the high cost of education at private colleges—fees of $50,000 and $55,000 a year are commonly mentioned—are heard everywhere. A great number of students leave college with enormous student-loan debt, which is higher than either national credit card or automobile credit debt. Because of the expense of traditional liberal arts colleges, greater numbers of the young go to one or another form of commuter college, usually for vocational training.

Although it is common knowledge that a person with a college degree will earn a great deal more than a person without one—roughly a million dollars more over a lifetime is the frequently cited figure—today, students with college degrees are finding it tough to get decent jobs. People are beginning to wonder if college, at its currently extravagant price, is worth it. Is higher education, like tech stocks and real estate, the next big bubble to burst?

A great deal of evidence for the crisis in American higher education is set out in College: What It Was, Is, and Should Be. Its author, Andrew Delbanco, the biographer of Herman Melville, is a staunch defender of the liberal arts, which he studied as an undergraduate at Harvard and currently teaches at Columbia. The continuing diminution of the liberal arts worries him. Some 18 million people in the United States are now enrolled in one or another kind of undergraduate institution of higher learning—but fewer than 100,000 are enrolled in liberal arts colleges.

At the same time, for that small number of elite liberal arts colleges—Harvard, Yale, Princeton, Stanford, Duke, the University of Chicago, and a few others—applications continue to rise, despite higher and higher tuition fees. The ardor to get into these schools—for economic, social, and snobbish reasons—has brought about an examination culture, at least among the children of the well-to-do, who from preschool on are relentlessly trained to take the examinations that will get them into the better grade schools, high schools, colleges, and, finally, professional schools. Professor Delbanco is opposed to the economic unfairness behind these arrangements, believing, rightly, that as a result, “the obstacles [to getting into the elite colleges] that bright low-income students face today are more insidious than the frank exclusionary practices that once prevailed.”

Whether students today, despite all their special tutoring and testing, are any better than those of earlier generations is far from clear. Trained almost from the cradle to smash the SATs and any other examination that stands in their way, the privileged among them may take examinations better, but it is doubtful if their learning and intellectual understanding are any greater. Usually propelled by the desires of their parents, they form a meritocracy that, in Delbanco’s view, as in that of the English sociologist Michael Young whom he quotes, comprises a dystopia of sorts, peopled by young men and women driven by high, but empty, ambition. “Are these really the people we want running the world?” Delbanco asks. Unfortunately, they already are. I am not the only one, surely, to have noticed that some of the worst people in this country—names on request—are graduates of the Harvard and Yale law schools.

Attending one of a limited number of elite colleges continues to yield wide opportunities for graduates, but fewer and fewer people any longer believe that someone who has finished college is necessarily all that much smarter than someone who hasn’t. With standards lowered, hours of study shortened, reports appearing about how many college graduates can no longer be depended upon to know how to write or to grasp rudimentary intellectual concepts, having gone to college seems to have less and less bearing on a person’s intelligence.

Studies cited by Delbanco in his footnotes claim an increase among college students in cheating, drinking, and depression. In their book Academically Adrift, Richard Arum and Josipa Roksa argue that the gain in critical thinking and complex reasoning among the majority of students during college years is very low, if not minimal. In an article in the Chronicle of Higher Education drawn from their book, Arum and Roksa write:

Parents—although somewhat disgruntled about increasing costs—want colleges to provide a safe environment where their children can mature, gain independence, and attain a credential that will help them be successful as adults. Students in general seek to enjoy the benefits of a full collegiate experience that is focused as much on social life as on academic pursuits, while earning high marks in their courses with relatively little investment of effort. Professors are eager to find time to concentrate on their scholarship and professional interests. Administrators have been asked to focus largely on external institutional rankings and the financial bottom line. Government funding agencies are primarily interested in the development of new scientific knowledge. .  .  . No actors in the system are primarily interested in undergraduates’ academic growth, although many are interested in student retention and persistence.

What savvy employers are likely to conclude is that those who graduate from college are probably more conformist, and therefore likely to be more dependable, than those who do not. Paul Goodman, one of the now-forgotten gurus of the 1960s, used to argue that what finishing college really meant is that one was willing to do anything to succeed in a capitalist society. In getting a college degree, Goodman held, one was in effect saying, I want in on the game, deal me a hand, I want desperately to play. Education, meanwhile, didn’t have a lot to do with it.

Not everywhere in higher education have standards slipped. One assumes that in engineering and within the sciences they have been maintained, and in some ways, owing to computer technology, perhaps improved. Relatively new fields of learning, computer science chief among them, have not been around long enough to have lost their way. Medical and legal education are probably not greatly different than they have traditionally been. Chiefly in the liberal arts subjects do standards seem most radically to have slipped.

Early in the 19th century, Sydney Smith, one of the founders of the Edinburgh Review, remarked that if we had made the same progress in the culinary arts as we have made in education, we should still be eating soup with our hands. Apart from eliminating corporal punishment and widening the educational franchise, we can’t be sure if, over the centuries, we have made much progress in education. At the moment there is great enthusiasm about “advances” in education owing to the Internet. Two teachers at Stanford, for example, put their course on Artificial Intelligence online and drew an audience of 160,000 students from all around the world. But science, which deals in one right answer, is more easily taught without a physical presence in the room, and probably works better online than humanities courses, whose questions usually have many answers, few of them permanently right. The Washington Monthly, in its May-June issue, has a special section called “The Next Wave of School Reform,” a wave that, in the words of the editor, aims to “improve students’ ability to think critically and independently, solve complex problems, apply knowledge to novel situations, work in teams and communicate effectively.” The problem with these waves of school reform, of course, is that a new one is always needed because the last one turns out to have tossed up more detritus on the shore than was expected.

The fact is that we still don’t know how to assess teaching—trial by student test scores, except in rudimentary subjects, isn’t very helpful—and we remain ignorant about the true nature of the transaction between teacher and student that goes by the name of learning. In undergraduate education, we may even have retreated a step or two through the phenomenon known as grade inflation and through the politicization of curricula.

The division between vocational and liberal arts education, which began during the 19th century with the advent of the land-grant state universities in the United States, is today tilting further and further in favor of the vocational. Even within the liberal arts, more and more students are, in Delbanco’s words, “fleeing from ‘useless’ subjects to ‘marketable’ subjects such as economics,” in the hope that this will lend them the practical credentials and cachets that might impress prospective employers.

Delbanco reminds us of Max Weber’s distinction between “soul-saving” and “skill-acquiring” education. The liberal arts, in their task to develop a certain roundedness in those who study them and their function, in Delbanco’s phrase, “as a hedge against utilitarian values,” are (or at least were meant to be) soul-saving. Whether, in the majority of students who undertook to study the liberal arts, they truly were or not may be open to question, but what isn’t open to question is that today, the liberal arts have lost interest in their primary mission. That mission, as Delbanco has it, is that of “attaining and sustaining curiosity and humility,” while “engaging in some serious self-examination.” A liberal education, as he notes, quoting John Henry Cardinal Newman, “implies an action upon our mental nature, and the formation of our character.”

Delbanco warns that it won’t do to posit some prelapsarian golden age when higher education approached perfection. Surely he is correct. A good deal of the old liberal arts education was dreary. The profession of teaching, like that of clergyman and psychiatrist, calls for a higher sense of vocation and talent than poor humanity often seems capable of attaining. Yet there was a time when a liberal arts education held a much higher position in the world’s regard than it does today. One of the chief reasons for its slippage, which Delbanco fails directly to confront, is that so many of its teachers themselves no longer believe in it—about which more presently.

I mentioned earlier that the liberal arts were for a good while my second religion. Here let me add that I had never heard of them until my own undergraduate education had begun.

When I was about to graduate from high school as an amiable screw-off, ranked barely above the lower quarter of my class, my father, who had not gone to college, told me that if I wished to go he would pay my way, but he encouraged me to consider whether my going wouldn’t be a waste of time. He personally thought I might make a hell of a good salesman, which was a compliment, for he was himself a hell of a good salesman, and a successful one. I eschewed his advice, not because it wasn’t sound, but chiefly because I felt that, at 18, I wasn’t ready to go out in the world to work.

In those days, the University of Illinois was, at least for residents of the state, an open-enrollment school. If you lived in Illinois, the school had to take you, no matter how low in your high school class you graduated. Lots of kids flunked out, and my own greatest fear on the train headed from Chicago down to Champaign-Urbana, in white bucks and reading The Catcher in the Rye, was that I would be among them.

Most of my friends, Jewish boys from the rising lower-middle class, went to the University of Illinois to major in business. “Business major” nicely rang the earnestness gong. Yet the courses required of a business major struck me as heart-stoppingly boring: accounting, economics, marketing, advertising, corporation finance, also known as “corp fin,” which sounded to me like nothing so much as a chancy seafood dish. I was especially nervous about accounting, for I had wretched handwriting and a disorderly mind, which I viewed as two strikes against me straightaway. Wasn’t there something else I might study instead of business? A fellow in the fraternity that was rushing me suggested liberal arts. This was the first time I had heard the phrase “liberal arts.” What it initially stood for, in my mind, was no accounting.

In my first year at the University of Illinois, I had slightly above a B average. I attained this through sheer memorization: of biological phyla, of French irregular verbs and vocabulary, of 17th-century poems. I also discovered, in a course called Rhetoric 101, that I had a minor skill at prose composition, a skill all the more remarkable for my excluding all use of any punctuation trickier than commas or periods.

After this modest success, I decided that I was ready for a more exotic institution, the University of Chicago, to which I applied during my second semester at Illinois. What I didn’t know then, but have since discovered, was that my demographic cohort, those people born toward the middle and end of the Depression, were lucky when it came to college admission, for our small numbers made colleges want us quite as much as we wanted them. In short, I was accepted at the University of Chicago, though I would never have been accepted there today, and that is where I spent the next, and final, three years of my formal education.

The University of Chicago had a reputation for great teachers, but I managed, somehow, to avoid them. I never sat in a class conducted by Leo Strauss, Joseph Schwab, Norman Maclean, David Grene, or Edward Shils. (Of course, great teachers, like great lovers, can sometimes be overrated. Later in life, I met a few men and women reputed to be great teachers and found them pompous and doltish, their minds spoiled by talking too long to children.) I attended a lecture by David Riesman, who was then Time magazine-cover famous, and was impressed by what then seemed to me his intellectual suavity. I sat in on a couple of classes taught by Richard Weaver, the author of Ideas Have Consequences, but left uninspired. I was most impressed by teachers from Mittel-Europa, Hitler’s gift to America, whose culture seemed thicker than that of the native-born teachers I encountered, and could not yet perceive the commonplace mind that sometimes lurked behind an English accent.

I took a course from Morton Dauwen Zabel, who was the friend of Harriet Monroe, Marianne Moore, and Edmund Wilson. Although not a great teacher, Zabel was an impressive presence who gave off whiffs of what the literary life in the great world was like. I took a summer course from the poet and critic Elder Olson, who kept what seemed a full-time precariously long ash on the end of his cigarette, and who, after reading from The Waste Land, ended by saying, “How beautiful this is. Too bad I can’t believe a word of it.”

The students at the University of Chicago were something else. In his book, Delbanco, defending the small classroom, refers to something he calls “lateral learning”: what a college student learns in class from his fellow students. He cites Cardinal Newman and John Dewey on this point, and quotes Nathaniel Hawthorne:

It contributes greatly to a man’s moral and intellectual health, to be brought into habits of companionship with individuals unlike himself, who care little for his pursuits, and whose sphere and abilities he must go out of himself to appreciate.

A great many of my fellow students in the College at the University of Chicago seemed to come from New York City, several others from academic families. They appeared to have been reading the Nation and the New Republic from the age of 11. Their families argued about Trotsky at the dinner table. A few among them had the uncalled-for candor of psychoanalysands. I recall a girl sitting next to me at a roundtable in Swift Hall volunteering her own menstrual experiences in connection with a discussion of those of the Trobriand Islanders.

Some among these University of Chicago students had an impressive acquaintance with books. One morning in Elder Olson’s class in modern poetry, Olson began quoting Baudelaire (mon semblable,—mon frère!) and a student next to me, named Martha Silverman, joined him, in French, and together, in unison, the two of them chanted the poem to its conclusion. This was one of those moments when I thought it perhaps a good time to look into career opportunities at Jiffy Lube.

“I invariably took the first rank in all discussions and exercises, whether public or private, as not only my teachers testified, but also the printed congratulations and carmina of my classmates.” So wrote Leibniz about his own classroom performance. Reverse everything Leibniz wrote and you have a fairly accurate picture of my classroom performance at the University of Chicago. None among my teachers there ever suggested that I had intellectual promise. Nor should they have done, for I didn’t show any, not even to myself. I made no “A”s. I wrote no brilliant papers. I didn’t do especially well on exams. I was not quick in response in the classroom.

Only years later did I realize that quickness of response—on which 95 percent of education is based—is beside the point, and is required only of politicians, emergency-room physicians, lawyers in courtrooms, and salesmen. Serious intellectual effort requires slow, usually painstaking thought, often with wrong roads taken along the way to the right destination, if one is lucky enough to arrive there. One of the hallmarks of the modern educational system, which is essentially an examination system, is that so much of it is based on quick response solely. Give 6 reasons for the decline of Athens, 8 for the emergence of the Renaissance, 12 for the importance of the French Revolution. You have 20 minutes in which to do so.

At the University of Chicago I read many books, none of them trivial, for the school in those years did not allow the work of second- or third-rate writers into its curriculum. Kurt Vonnegut, Toni Morrison, Jack Kerouac, Adrienne Rich, or their equivalents of that day, did not come close to making the cut. No textbooks were used. You didn’t read “Karl Marx postulated .  .  .”; you read Karl-bloody-Marx. The working assumption was that one’s time in college is limited, and mustn’t be spent on anything other than the first-rate, or on learning acquired (as with textbooks) at a second remove.

Nor did Chicago offer any “soft” majors or “lite” courses. I remember, in my final year, looking for such a course to fill out a crowded schedule, and choosing one called History of Greek Philosophy. How difficult, I thought, could this be? Learn a few concepts of the pre-Socratics (Thales believed this, Heraclitus that), acquire a few dates, and that would be that. On the first day of class, the teacher, a trim little man named Warner Arms Wick, announced that there was no substantial history of Greek philosophy, so we shall instead be spending the quarter reading Aristotle and Plato exclusively.

How much of my reading did I retain? How much does any 19- or 20-year-old boy, whose hormones have set him a very different agenda, retain of serious intellectual matter? How much more is less than fully available to him owing to simple want of experience? What I do remember is the feeling of intellectual excitement while reading Plato and Thucydides and an almost palpable physical pleasure turning the pages of Max Weber’s The Protestant Ethic and the Spirit of Capitalism as he made one dazzling intellectual connection after another. I can also recall being plunged into a brief but genuine depression reading Freud’s Civilization and Its Discontents.

The idea behind the curriculum at the College of the University of Chicago was the Arnoldian one, abbreviated to undergraduate years, of introducing students to the best that was thought and said in the Western world. Mastery wasn’t in the picture. At least, I never felt that I had mastered any subject, or even book, in any of my courses there. What the school did give me was the confidence that I could read serious books, and with it the assurance that I needed to return to them, in some cases over and over, to claim anything like a genuine understanding of them.

I was never more than a peripheral character, rather more like a tourist than a student, at the University of Chicago. Yet when I left the school in 1959, I was a strikingly different person than the one who entered in 1956. What had happened? My years there allowed me to consider other possibilities than the one destiny would appear to have set in grooves for me. I felt less locked into the social categories—Jewish, middle-class, Midwestern—in which I had grown up, and yet, more appreciative of their significance in my own development. I had had a glimpse—if not much more—of the higher things, and longed for a more concentrated look.

Had I not gone to the University of Chicago, I have often wondered, what might my life be like? I suspect I would be wealthier. But reading the books I did, and have continued to throughout my life, has made it all but impossible to concentrate on moneymaking in the way that is required to acquire significant wealth. Without the experience of the University of Chicago, perhaps I would have been less critical of the world’s institutions and the people who run them; I might even have been among those who do run them. I might, who knows, have been happier, if only because less introspective—nobody said the examined life is a lot of laughs—without the changes wrought in me by my years at the University of Chicago. Yet I would not trade in those three strange years for anything.

I turned out to be a better teacher than student. In fact I took to saying, toward the close of my 30-year stint in the English department at Northwestern University, that teaching provides a better education than does being a student. If he wishes to elude boredom among his students and embarrassment for himself, a teacher will do all he can to cultivate the art of lucid and interesting presentation and the habits of thoroughness. Thereby, with a bit of luck, education may begin to kick in.

Yet even after completing three decades of teaching, I am less than sure that what I did in the classroom was effective or, when it might have been effective, why. Of the thousands of inane student evaluations I received—“This guy knows his stuff” .  .  . “Nice bowties” .  .  . “Great jokes”—the only one that stays in my mind read: “I did well in this course; I would have been ashamed not to have done.” How I wish I knew what it was that I did to induce this useful shame in that student, so that I might have done it again and again!

Student evaluations, set in place to give the impression to students that they have an important say in their own education, are one of the useless intrusions into university teaching by the political tumult of the 1960s. Teaching remains a mysterious, magical art. Anyone who claims he knows how it works is a liar. No one tells you how to do it. You walk into a classroom and try to remember what worked for the teachers who impressed you, or, later in the game, what seemed to work best for you in the past. Otherwise, it is pure improv, no matter how extensive one’s notes.

As a testimony to the difficulty of evaluating the quality of teaching, Professor Delbanco includes a devastating footnote about student evaluations. One study found that students tend to give good evaluations “to instructors who are easy graders or who are good looking,” and to be hardest on women and foreign teachers; another, made at Ohio State University, found “no correlation between professor evaluations and the learning that is actually taking place.” As Delbanco notes, the main result of student evaluations is to make it easier for students to avoid tough teachers or, through harsh reviews, punish these teachers for holding to a high standard.

I was not myself regarded as a tough teacher, but I prefer to think that I never fell below the line of the serious in what I taught or in what I asked of my students. What I tried to convey about the writers on whom I gave courses was, alongside the aesthetic pleasures they provided, their use as guides, however incomplete, to understanding life. Reading Joseph Conrad, Henry James, Leo Tolstoy, Fyodor Dostoyevsky, Willa Cather, and other writers I taught was important business—possibly, in the end, though I never said it straight out, more important than getting into Harvard Law School or Stanford Business School. When I taught courses on prose style, I stressed that correctness has its own elegance, and that, in the use of language, unlike in horseshoes, close isn’t good enough; precision was the minimal requirement, and it was everything.

How many students found helpful what I was trying to convey I haven’t the least notion. If anything I said during the many hours we were together mattered to them, I cannot know. Not a scholar myself, I never tried to make scholars of my students. A small number of them went on to do intellectual work, to become editors, critics, poets, novelists; a few became college teachers. Did my example help push them in their decision not to go for the money? Some of the brightest among them did go for the money, and have lived honorable lives in pursuit of it, and that’s fine, too. A world filled with people like me would be intolerable.

When I taught, I was always conscious of what I thought of as the guy in the next room: my fellow teachers. During my teaching days (1973-2003), I could be fairly certain that the guy in the next room was teaching something distinctly, even starkly, different from what I was teaching. This was the age of deconstruction, academic feminism, historicism, Marxism, early queer theory, and other, in Wallace Stevens’s phrase, one-idea lunacies. A bright young female graduate student one day came to ask me if I thought David Copperfield a sexual criminal. “Why would I think that?” I asked. “Professor X thinks it,” she said. “He claims that because of the death in childbirth of David Copperfield’s wife, he, Copperfield, through making her pregnant, committed a crime.” All I could think to reply was, “I guess criticism never sleeps.”

While not wishing to join the dirge-like chorus of those who write about the fate of higher education in our day, Andrew Delbanco does not shy from setting out much that has gone wrong with it. He highlights the importance everywhere accorded to research over teaching among faculty. He notes the preeminence of science over the humanities, due to the fact that science deals with the provable and can also lead to technological advancement, and hence pays off. (He mentions the sadly mistaken slavishness of the humanities in attempting to imitate science, and cites the advent of something called the “literature lab” as an example.) He brings up the corruption implicit in university presidents sitting on corporate boards, the fraudulence of big-time college athletics, some of whose football and basketball coaches earn more than entire academic departments, and much more.

Delbanco, a secular Jew and a man of the Vietnam generation, is nonetheless ready to allow the pertinence of the earlier Protestant view of higher education in the liberal arts:

The era of spiritual authority belonging to college [when it was under religious auspices] is long gone. And yet I have never encountered a better formulation—“show me how to think and how to choose”—of what a college should strive to be: an aid to reflection, a place and process whereby young people take stock of their talents and passions and begin to sort out their lives in a way that is true to themselves and responsible to others.

College: What It Was, Is, and Should Be gives a clear picture of all the forces, both within and outside the university, working against the liberal arts. Yet Delbanco lets off the hook the people who were in the best position to have helped save them—the teachers, those “guys in the next room.” Much could be said about teaching the liberal arts before the Vietnam generation came to prominence (which is to say, tenure) in the colleges: that it could be arid, dull, pedantic, astonishingly out of it. But it never quite achieved the tendentious clownishness that went into effect when “the guys in the next room” took over.

Not that the ground hadn’t been nicely prepared for them. Universities had long before opened themselves up to teaching books and entire subjects that had no real place in higher education. Take journalism schools. Everyone who has ever worked on a newspaper knows that what one learns in four years in journalism school can be acquired in less than two months working on a newspaper. But as journalism schools spread, it slowly became necessary to go through one in order to get a job on a large metropolitan daily. Going to “journ” school became a form of pledging the fraternity. Everyone else in the business had pledged; who are you, pal, to think you can get in without also pledging? And so journalism schools became mainstays of many universities.

Then there is the business school, especially in its MBA version. Business schools are not about education at all, but about so-called networking and establishing, for future employers, a credential demonstrating that one will do anything to work for them—even give up two years of income and pay high tuition fees for an MBA to do so. As with an American Express card, so with an MBA, one daren’t leave home without one, at least if one is applying for work at certain corporations. Some among these corporations, when it comes to recruiting for jobs, only interview MBAs, and many restrict their candidate pools to MBAs from only four or five select business schools. Pledging the fraternity again.

Soon, the guys in the next room, in their hunger for relevance and their penchant for self-indulgence, began teaching books for reasons external to their intrinsic beauty or importance, and attempted to explain history before discovering what actually happened. They politicized psychology and sociology, and allowed African-American studies an even higher standing than Greek and Roman classics. They decided that the multicultural was of greater import than Western culture. They put popular culture on the same intellectual footing as high culture (Conrad or graphic novels, three hours credit either way). And, finally, they determined that race, gender, and social class were at the heart of all humanities and most social science subjects. With that finishing touch, the game was up for the liberal arts.

The contention in favor of a liberal arts education was that contemplation of great books and grand subjects would take students out of their parochial backgrounds and elevate them into the realm of higher seriousness. Disputes might arise from professor to professor, or from school to school, about what constituted the best that was thought and said—more Hobbes than Locke, more Yeats than Frost—but a general consensus existed about what qualified to be taught to the young in the brief span of their education. That consensus has split apart, and what gets taught today is more and more that which interests professors.

Columbia still provides two years of traditional liberal arts for its undergraduates. The University of Chicago continues to struggle over assembling a core curriculum based on the old Robert Hutchins College plan. St. John’s College, both in Annapolis and in Santa Fe, has, from its founding, been devoted to the cult of the liberal arts, even to the point of having its students study medieval science. The hunger among students for the intellectual satisfaction that a liberal arts education provides is not entirely dead. (At Northwestern, a course in Russian novels taught by Gary Saul Morson attracts 600 students, second only to the recently canceled notorious course in sex education offered by the school.) But the remaining liberal arts programs begin to have the distinct feel of rearguard actions.

The death of liberal arts education would constitute a serious subtraction. Without it, we shall no longer have a segment of the population that has a proper standard with which to judge true intellectual achievement. Without it, no one can have a genuine notion of what constitutes an educated man or woman, or why one work of art is superior to another, or what in life is serious and what is trivial. The loss of liberal arts education can only result in replacing authoritative judgment with rivaling expert opinions, the vaunting of the second- and third-rate in politics and art, the supremacy of the faddish and the fashionable in all of life. Without that glimpse of the best that liberal arts education conveys, a nation might wake up living in the worst, and never notice.

Joseph Epstein, a contributing editor to The Weekly Standard, is the author, most recently, of Essays in Biography.

Our New Albigensian Age by Stephen M. Krason

SEPTEMBER 17, 2013

Our New Albigensian Age

by Stephen M. Krason

***

Ruins of Holyrood Chapel (1824) by Louis Daguerre

***

In an old (1950) monograph entitled The Truth about the Inquisition, Dr. John A. O’Brien, a Notre Dame history professor of the time, provides a brief but interesting exposé of the Albigensian heresy. Few people recall that that almost maniacal rebellion against Catholic teaching and, for that matter, commonsensical and civilized living was the trigger for the much-misunderstood Inquisition. O’Brien’s discussion makes one think of many aspects of our current civilizational crisis, even though the comparison could not have been so evident in 1950.

The Albigensians, or Catharists, were neo-Manicheans, regarding material creation as an evil and viewing all of existence as a conflict between evil matter and good spirit—but O’Brien says it was much more. Like all Gnostics, of which Manicheanism was a branch, they believed themselves to be the only “pure” ones and the only ones to have the truth. They were certainly a forerunner of Protestantism and even more specifically of the most ardent of contemporary fundamentalists, with their complete rejection of the Real Presence, transubstantiation, the Eucharist, and the Mass, and their belief that the pope was the Antichrist. Their teaching and practice, however, had enormous implications for marriage, sexual morality, and social and political life.

The parallels to the present are almost uncanny. While hatred for the Church is nothing new, the visceral character of the Albigensians’ hatred bears a resemblance to the ugliest side of the Reformation and today’s assaults on religion. For example, O’Brien tells us how the Albigensians were known for indiscriminately chopping down crosses and stamping on them. In America today, we see the relentless efforts by rabid, uncompromising church-state separationist groups to remove all religious symbols from public places and the heightened vandalism of crosses and other Christian monuments.

The sexual libertinism, views about marriage, and feminism of our time resemble the Albigensian heresy. While the Albigensians considered sex an “inherent evil,” it seems as if it was not so much sex per se that they rejected but the proper context for it. They utterly rejected marriage, mostly because it meant bringing children into the world. Pregnancy for them was diabolical. Their confusion about sexual matters made them believe that marriage was worse than fornication and adultery. In our time, people don’t quite make this claim, but marriage has become irrelevant as the condition for engaging in sexual activity, and no judgment is made about the morality of almost any sexual practices. For many, particularly in lower socioeconomic status groups, marriage almost seems obsolete; children are routinely born out of wedlock. Others, particularly among the affluent, enter marriage—or what is called that—but have no intention of bearing children. While people may not proclaim pregnancy as evil, they act as if it is in our contracepting age. As O’Brien says, for the Albigensians even perversion was preferable to marriage. In our time, we witness the celebration of sexual perversion as a good thing—as “LGBT pride.” While the Albigensians wanted to abolish marriage, we have transformed it into something that they would have lauded: an association devoid of procreative intent or even, in the case of same-sex “marriage,” capability. As far as traditional, true marriage is concerned, we increasingly give it no special support or even recognition as uniquely important for society. We say that people are free to choose what “version” of it they prefer—and be officially “affirmed” in their choice.

So the Albigensians, who so rejected sex as part of their disdain for the material world and supposedly in the interest of spiritual purity, actually opened the door to sexual debauchery and the corruption of both body and soul. This was typical of Manicheans historically. Some would become extreme ascetics, and others utter hedonists.

Contemporary feminism has a ring of the Albigensian. Instead of equality in marriage, the heresy effectively placed women in a dominant position. As O’Brien explains, since pregnancy was despised, married women who were converted to Albigensianism unilaterally abrogated their husbands’ marital rights and consigned them to “an enforced celibacy.” It was considered “sinful and degrading” to even touch a woman (even if innocently and in a pure way). This almost rings of the extremes to which sexual harassment has gone in our day. It makes one think of the anti-male ethos in the statements of some of today’s feminists. The female dominance was further seen in that a religious punishment of fasting for inter-gender touching could only be imposed on a man, even if the woman did the touching.

Today, abortion seems to have become a positive good for ardent feminists and their fellow-travelers. It’s much like the Albigensians, for whom O’Brien says “abortion was highly to be commended.”

The Albigensians anticipated today’s assault on human life in other areas, as well. Believing that the seriously ill would gain eternal bliss if they did not recover their health, they encouraged them to commit suicide. In fact, they practiced assisted suicide. The assisted suicide advocates of today are different only in that their methods are (usually) more technologically sophisticated. The Albigensians either suffocated or starved the person. Today’s practice in medical facilities of hastening death by withholding nutrition and hydration was what they did—except it took place in the person’s home. Like today, the person was supposedly given a choice: the Albigensians offered him a choice between these two methods of death; today, people sign living wills. Either way, the supposed choice is no real choice. In both eras, there is a coercive backstop. The Albigensian leaders forbade the sick person’s family from feeding him, or would forcibly remove him from his home if they weren’t “reliable.” In our day, family members may make a choice for death even if the patient didn’t want it, or, increasingly, the medical authorities do it even when it’s against the patient’s or the family’s wishes.

The present era, prodded along by the likes of Peter Singer, pushes more and more toward post-partum infanticide. Even on this, the Albigensians were a precursor. They insisted upon—even enforced—among their followers the starvation of very sick children. To make sure their parents didn’t lose their nerve, the sect leaders frequently visited their homes to monitor them. So, the Albigensians also anticipated our era’s undermining of parental rights. Continue reading

Humane Learning in the Age of the Computer by Russell Kirk

Humane Learning in the Age of the Computer

by Russell Kirk

Permit me to offer you some desultory reflections concerning the effect of the electronic computer upon the reason and the imagination. We are told by many voices that the computer will work a revolution in learning. So it may; but that accomplishment would not be salutary.

The primary end of the higher learning, in all lands and all times, has been what John Henry Newman called the training of the intellect to form a philosophical habit of mind. University and college were founded to develop right reason and imagination, for the sake of the person and the sake of the republic. The higher education, by its nature, is concerned with abstractions — rather difficult abstractions, both in the sciences and in humane studies. Most people, in any age, are not fond of abstractions. Therefore, in this democratic time, higher education stands in danger everywhere from levelling pressures.

In Britain, a few years ago, the member of the opposition who had been designated minister of education in a prospective Labour government denounced Oxford and Cambridge universities as “cancers.” Presumably he would have converted those ancient institutions, had it been in his power, into something like the Swedish “people’s universities” — that is, institutions at which everybody could succeed, because all standards would be swept away for entrance or for graduation. Every man and woman an intellectual king or queen, with an Oxbridge degree! The trouble with this aspiration is that those kings and queens would be impoverished intellectually — and presumably Britain in general would be impoverished in more ways than one.

Recently we have heard similar voices in the graduate schools of Harvard. Why discriminate against indolence and stupidity? Why not let everybody graduate, regardless of performance in studies? Wouldn’t that be the democratic way? If young people don’t care for abstractions, and manifest a positive aversion to developing a philosophical habit of mind, why not give them what they think they would like: that is, the superficial counter-culture? Continue reading
