The Catholic Tolkien and the Knights of Middle-earth by Stratford Caldecott

This month, fans around the world will flock to the cinema to watch the first of three installments of Peter Jackson’s adaptation of The Hobbit—the “prequel” to the award-winning Lord of the Rings trilogy that was also released in three parts between 2001 and 2003 (The Hobbit: An Unexpected Journey will be released in U.S. theaters Dec. 14). Based on J.R.R. Tolkien’s classic novels, the films depart from the original storyline in significant details, but go to great lengths to respect the author’s vision of Middle-earth—a world of great natural beauty and intense moral drama, set in the distant past.

Many will argue that translating such a story from book into film, no matter how impressive the result, is a mistake. A movie presents the audience with the filmmakers’ visualization, not the author’s or the reader’s. Moreover, reading or listening to a story engages the imagination at a deeper level than watching it on screen. Yet if a film had to be made, we should be grateful that efforts have been made to remain faithful to the spirit and texture of Tolkien’s stories.

The Catholic Tolkien

The spirit of Tolkien’s hugely successful fantasy novels is deeply Christian. Born in 1892, the author was a devout Catholic who grew up under the influence of Blessed John Henry Newman’s Oratory in Birmingham, England. All through his busy life as an Oxford professor and popular writer, he tried to attend Mass every day. His eldest son even became a Catholic priest. The stories that Tolkien wrote were more than entertainment; they were written to express a profound Christian wisdom.

In a letter Tolkien drafted to the manager of the Newman Bookshop in 1954, but never sent because it sounded too self-important (Letter 153 in the published collection), he admitted that his aim in writing the stories was “the elucidation of truth, and the encouragement of good morals in this real world, by the ancient device of exemplifying them in unfamiliar embodiments, that may tend to ‘bring them home.’” In another letter to a Jesuit friend in 1953, he explained that while he had consciously “absorbed” the religious element “into the story and the symbolism” (because he had no intention of making religious propaganda), The Lord of the Rings remains “a fundamentally religious and Catholic work.”

Tolkien’s Christian wisdom can pop out at readers in unexpected ways, but most often it simply sinks in at a deep level without distracting our attention from the story. I noticed an example as I read The Lord of the Rings to my youngest daughter recently. The story concerns the attempt to destroy a magical “Ring of Power” that threatens the freedom of all the peoples of Middle-earth. As the little hobbits Frodo and Sam struggle up Mount Doom in the final stage of their quest to reach the volcanic furnace in which the Ring can be unmade, Frodo comes to the end of his strength—drained by the ever-growing weight of the Ring he bears around his neck and the constant temptation to claim its power for his own.

His faithful servant Sam, who knows he is not permitted to bear the Ring, invites Frodo to climb onto his back. “I can’t carry it for you, but I can carry you and it as well. So up you get!” Staggering to his feet, Sam finds to his amazement “the burden light.” Tolkien writes, “[Sam] had feared that he would have barely strength to lift his master alone, and beyond that he had expected to share in the dreadful dragging weight of the accursed Ring. But it was not so. Whether because Frodo was so worn by his long pains, wound of knife, and venomous sting, and sorrow, fear, and homeless wandering, or because some gift of final strength was given to him, Sam lifted Frodo with no more difficulty than if he were carrying a hobbit-child pig-a-back in some romp on the lawns or hayfields of the Shire. He took a deep breath and started off.”

Does this not remind you, as if in a faint echo, of a certain well-known passage in the Gospels? I am thinking of the one where Jesus says, “Come to me, all who labor and are heavy laden, and I will give you rest. Take my yoke upon you, and learn from me, for I am gentle and lowly in heart, and you will find rest for your souls. For my yoke is easy, and my burden is light” (Mt 11:28-30).

The echo may be faint, yet the whole journey of the two Hobbits across Mordor—including descriptions of the Ring and Frodo’s many falls under its weight—recalls the Way to Calvary, where Jesus bore the weight of the world’s sin. Those who are familiar with the Gospels can hardly fail to recognize a similarity. If the Ring is analogous to the Cross (because it represents sin), and Frodo as Ringbearer is analogous to Christ, then when Sam hauls the burden up onto his shoulders he finds exactly what Christ has promised: It feels light because Christ himself is still bearing the major part of the weight.

The link to the Christian story is even reinforced by the calendar date. The Ring is destroyed on March 25, which in our world is the Solemnity of the Annunciation, the day Christ was conceived in the womb of Mary to bear our sins away.

Nobility of the Soul

There are plenty of other parallels with Christianity in The Lord of the Rings and The Hobbit, but as the author insisted, the important point lies deeper than this. The story is meant to be enjoyed for its own sake, not merely decoded. A story is a way of exploring how the world works. No author can avoid bringing his own understanding of free will and fate or providence, not to mention some conception of good and evil, to his writing. Tolkien’s understanding was shaped by his faith, which is the truth revealed by God about the way the world really works — and not only this world, but every possible world.

An important part of Catholic wisdom is the ethical tradition that rests on the natural laws of our nature, made in the image of God. This tradition could be called “nobility of soul” or “spiritual chivalry.” We see in both The Hobbit and The Lord of the Rings a learning process that Tolkien called “the ennoblement (or sanctification) of the humble,” which he believed was an important theme of his writing as a whole. In both novels, the hobbit heroes (Bilbo in the one, Frodo and Sam and their friends in the other) are lifted from the narrow, comfortable world of the Shire into a much vaster landscape to play key roles in battles that decide the fate of Middle-earth. This was a process that Tolkien observed among the soldiers he fought beside in the Battle of the Somme, in the First World War.

Through suffering and trial, the hobbits are fashioned into heroes, empowered to save their little world of the Shire from the spiritual evil that has corrupted it while they were away. Gandalf the wizard tells them, “That is what you have been trained for.” Although the film versions of The Lord of the Rings unfortunately omit this last stage, it is still clear that the hobbits have attained greater maturity and courage through their adventures.

After all, Tolkien wove the idea of “nobility of soul” very deeply into his mythology. This concept is represented partly in the Elves. The human beings and hobbits who are closest to the Elves by influence or nature are the noblest: Frodo (named “Elf-friend”) among the hobbits, Aragorn and Imrahil and Faramir among the men. The “elvish” tendency in man is always towards physical beauty, artistic ability and respect for creation. It is associated with a love for God’s creation that seeks to improve, protect, celebrate and adorn.

The “chivalry” that reveals this nobility is shown in behavior towards others, such as kindness and mercy, the refusal to mistreat even prisoners of war, and the showing of honor to the bodies of the dead. We see this, for instance, when Aragorn, heir to the throne of Gondor and leader of the fellowship of the Ring, insists on a proper funeral for Boromir before they continue with their quest. The knights of Middle-earth defend the weak from their oppressors and remain faithful to friends and liege-lord. Such behavior outwardly signifies the presence of heroic virtue within the soul, especially the cardinal virtues of prudence, fortitude, temperance and justice.

It is with these virtues that we are equipped to defend the truly important things, the little things, the domestic world of the free family, and the love that binds people together in fellowship.

Aragorn exemplifies all of these virtues in the highest degree, but we see them develop in the hobbits, too, as they learn to submit to discipline and overcome their fear to achieve great deeds without hope of reward — just because it is the right thing to do. This is Tolkien’s challenge to us: to become, in our own way, the knights of Middle-earth.


The Age of Intolerance by Mark Steyn

DECEMBER 20, 2013 6:00 PM

When Worlds Collide: Pajama Boy and Duck Dynasty’s Phil Robertson

Mark Steyn

Last week, following the public apology of an English comedian and the arrest of a fellow British subject both for making somewhat feeble Mandela gags, I noted that supposedly free societies were increasingly perilous places for those who make an infelicitous remark. So let’s pick up where we left off:

Here are two jokes one can no longer tell on American television. But you can still find them in the archives, out on the edge of town, in Sub-Basement Level 12 of the ever-expanding Smithsonian Mausoleum of the Unsayable. First, Bob Hope, touring the world in the year or so after the passage of the 1975 Consenting Adult Sex Bill:

“I’ve just flown in from California, where they’ve made homosexuality legal. I thought I’d get out before they make it compulsory.”

For Hope, this was an oddly profound gag, discerning even at the dawn of the Age of Tolerance that there was something inherently coercive about the enterprise. Soon it would be insufficient merely to be “tolerant” — warily accepting, blithely indifferent, mildly amused, tepidly supportive, according to taste. The forces of “tolerance” would become intolerant of anything less than full-blown celebratory approval.

Second joke from the archives: Dean Martin and Frank Sinatra kept this one in the act for a quarter-century. On stage, Dino used to have a bit of business where he’d refill his tumbler and ask Frank, “How do you make a fruit cordial?” And Sinatra would respond, “I dunno. How do you make a fruit cordial?” And Dean would say, “Be nice to him.”

But no matter how nice you are, it’s never enough. Duck Dynasty’s Phil Robertson, in his career-detonating interview with GQ, gave a rather thoughtful vernacular exegesis of the Bible’s line on sin, while carefully insisting that he and other Christians are obligated to love all sinners and leave it to the Almighty to adjudicate the competing charms of drunkards, fornicators, and homosexuals. Nevertheless, GLAAD — “the gatekeepers of politically correct gayness” as the (gay) novelist Bret Easton Ellis sneered — saw their opportunity and seized it. By taking out TV’s leading cable star, they would teach an important lesson pour encourager les autres — that espousing conventional Christian morality, even off-air, is incompatible with American celebrity.

Some of my comrades, who really should know better, wonder why, instead of insisting Robertson be defenestrated, GLAAD wouldn’t rather “start a conversation.” But, if you don’t need to, why bother? Most Christian opponents of gay marriage oppose gay marriage; they don’t oppose the right of gays to advocate it. Yet thug groups like GLAAD increasingly oppose the right of Christians even to argue their corner. It’s quicker and more effective to silence them.

As Christian bakers ordered to provide wedding cakes for gay nuptials and many others well understand, America’s much-vaunted “freedom of religion” is dwindling down to something you can exercise behind closed doors in the privacy of your own abode or at a specialist venue for those of such tastes for an hour or so on Sunday morning, but when you enter the public square you have to leave your faith back home hanging in the closet. Yet even this reductive consolation is not permitted to Robertson: GLAAD spokesgay Wilson Cruz declared that “Phil and his family claim to be Christian, but Phil’s lies about an entire community fly in the face of what true Christians believe.” Robertson was quoting the New Testament, but hey, what do those guys know? In today’s America, land of the Obamacare Pajama Boy, Jesus is basically Nightshirt Boy, a fey non-judgmental dweeb who’s cool with whatever. What GLAAD is attempting would be called, were it applied to any other identity group, “cultural appropriation.”

In the broader sense, it’s totalitarian. While American gays were stuffing and mounting the duck hunter in their trophy room, the Prince of Wales was celebrating Advent with Christian refugees from the Middle East, and noting that the land in which Christ and Christianity were born is now the region boasting “the lowest concentration of Christians in the world — just four percent of the population.” It will be three, and two, and one percent soon enough, for there is a totalitarian impulse in resurgent Islam — and not just in Araby. A few miles from Buckingham Palace, Muslims in London’s East End are now sufficiently confident to go around warning local shopkeepers to cease selling alcohol. In theory, you might still enjoy the right to sell beer in Tower Hamlets or be a practicing Christian in Iraq, but in reality not so much. The asphyxiating embrace of ideological conformity was famously captured by Nikolai Krylenko, the People’s Commissar for Justice, in a speech to the Soviet Congress of Chess Players in 1932, at which he attacked the very concept of “the neutrality of chess.” It was necessary for chess to be Sovietized like everything else. “We must organize shock brigades of chess players, and begin immediate realization of a Five-Year Plan for chess,” he declared.

Six years later, the political winds having shifted, Krylenko was executed as an enemy of the people. But his spirit lives on among the Commissars of Gay Compliance at GLAAD. It is not enough to have gay marriage for gays. Everything must be gayed. There must be Five-Year Gay Plans for American bakeries, and the Christian church, and reality TV. There must be shock brigades of gay duck-hunters honking out the party line deep in the backwoods of the proletariat. Obamacare pajama models, if not yet mandatorily gay, can only be dressed in tartan onesies and accessorized with hot chocolate so as to communicate to the Republic’s maidenhood what a thankless endeavor heterosexuality is in contemporary America.

Mentally Strong People: The 13 Things They Avoid by Amy Morin

Amy Morin is a licensed clinical social worker and writer (Image courtesy of AmyMorinLCSW.com)

Editors’ Note: Following the huge popularity of this post, its source, Amy Morin, has authored a Dec. 3 guest post on exercises to increase mental strength here. Cheryl Conner has also interviewed Amy Morin in a Forbes video chat that expands on this article here.

For all the time executives spend concerned about physical strength and health, when it comes down to it, mental strength can mean even more. Particularly for entrepreneurs, numerous articles talk about critical characteristics of mental strength—tenacity, “grit,” optimism, and an unfailing ability, as Forbes contributor David Williams says, to “fail up.”

However, we can also define mental strength by identifying the things mentally strong individuals don’t do. Over the weekend, I was impressed by this list compiled by Amy Morin, a psychotherapist and licensed clinical social worker, that she shared in LifeHack. It impressed me enough that I’d like to share her list here along with my thoughts on how each of these items is particularly applicable to entrepreneurs.

1. Waste Time Feeling Sorry for Themselves. You don’t see mentally strong people feeling sorry for their circumstances or dwelling on the way they’ve been mistreated. They have learned to take responsibility for their actions and outcomes, and they have an inherent understanding of the fact that frequently life is not fair. They are able to emerge from trying circumstances with self-awareness and gratitude for the lessons learned. When a situation turns out badly, they respond with phrases such as “Oh, well.” Or perhaps simply, “Next!”

2. Give Away Their Power. Mentally strong people avoid giving others the power to make them feel inferior or bad. They understand they are in control of their actions and emotions. They know their strength is in their ability to manage the way they respond.

3. Shy Away from Change. Mentally strong people embrace change and they welcome challenge. Their biggest “fear,” if they have one, is not of the unknown, but of becoming complacent and stagnant. An environment of change and even uncertainty can energize a mentally strong person and bring out their best.

4. Waste Energy on Things They Can’t Control. Mentally strong people don’t complain (much) about bad traffic, lost luggage, or especially about other people, as they recognize that all of these factors are generally beyond their control. In a bad situation, they recognize that the one thing they can always control is their own response and attitude, and they use these attributes well.

5. Worry About Pleasing Others. Know any people pleasers? Or, conversely, people who go out of their way to dis-please others as a way of reinforcing an image of strength? Neither position is a good one. A mentally strong person strives to be kind and fair and to please others where appropriate, but is unafraid to speak up. They are able to withstand the possibility that someone will get upset and will navigate the situation, wherever possible, with grace.

6. Fear Taking Calculated Risks. A mentally strong person is willing to take calculated risks. This is a different thing entirely than jumping headlong into foolish risks. But with mental strength, an individual can weigh the risks and benefits thoroughly, and will fully assess the potential downsides and even the worst-case scenarios before they take action.

7. Dwell on the Past. There is strength in acknowledging the past and especially in acknowledging the things learned from past experiences—but a mentally strong person is able to avoid miring their mental energy in past disappointments or in fantasies of the “glory days” gone by. They invest the majority of their energy in creating an optimal present and future.

8. Make the Same Mistakes Over and Over. We all know the definition of insanity, right? It’s when we take the same actions again and again while hoping for a different and better outcome than we’ve gotten before. A mentally strong person accepts full responsibility for past behavior and is willing to learn from mistakes. Research shows that the ability to be self-reflective in an accurate and productive way is one of the greatest strengths of spectacularly successful executives and entrepreneurs.

9. Resent Other People’s Success. It takes strength of character to feel genuine joy and excitement for other people’s success. Mentally strong people have this ability. They don’t become jealous or resentful when others succeed (although they may take close notes on what the individual did well). They are willing to work hard for their own chances at success, without relying on shortcuts.

10. Give Up After Failure. Every failure is a chance to improve. Even the greatest entrepreneurs are willing to admit that their early efforts invariably brought many failures. Mentally strong people are willing to fail again and again, if necessary, as long as the learning experience from every “failure” can bring them closer to their ultimate goals.

11. Fear Alone Time. Mentally strong people enjoy and even treasure the time they spend alone. They use their downtime to reflect, to plan, and to be productive. Most importantly, they don’t depend on others to shore up their happiness and moods. They can be happy with others, and they can also be happy alone.

12. Feel the World Owes Them Anything. Particularly in the current economy, executives and employees at every level are gaining the realization that the world does not owe them a salary, a benefits package and a comfortable life, regardless of their preparation and schooling. Mentally strong people enter the world prepared to work and succeed on their merits, at every stage of the game.

13. Expect Immediate Results. Whether it’s a workout plan, a nutritional regimen, or starting a business, mentally strong people are “in it for the long haul.” They know better than to expect immediate results. They apply their energy and time in measured doses and they celebrate each milestone and increment of success on the way. They have “staying power.” And they understand that genuine changes take time.

Do you have mental strength? Are there elements on this list you need more of? With thanks to Amy Morin, I would like to reinforce my own abilities further in each of these areas today. How about you?

Did ‘The Great Society’ Ruin Society? by Pat Buchanan

“I’m not concerned about the very poor. We have a safety net there. If it needs a repair, I’ll fix it.”

Thus did Mitt Romney supposedly commit the gaffe of the month — for we are not to speak of the poor without unctuous empathy.

Yet, as Robert Rector of the Heritage Foundation reports in “Understanding Poverty in the United States: Surprising Facts About America’s Poor,” Mitt was more right about America’s magnanimity than those who bewail her alleged indifference.

First, who are the poor?

To qualify, a family of four in 2010 needed to earn less than $22,314. Some 46 million Americans, 15 percent of the population, qualified.

And in what squalor were America’s poor forced to live?

Well, 99 percent had a refrigerator and stove, two-thirds had a plasma TV, a DVD player and access to cable or satellite, 43 percent were on the Internet, half had a video game system like PlayStation or Xbox.

Three-fourths of the poor had a car or truck, nine in 10 a microwave, 80 percent had air conditioning. In 1970, only 36 percent of the U.S. population enjoyed air conditioning.

America’s poor enjoy amenities almost no one had in the 1950s, when John K. Galbraith described us as “The Affluent Society.”

What about homelessness? Are not millions of America’s poor on the street at night, or shivering in shelters or crowded tenements?

Well, actually, no. That is what we might call televised poverty. Of the real poor, fewer than 10 percent live in trailers, 40 percent live in apartments, and half live in townhouses or single-family homes.

Forty-one percent of poor families own their own home.

But are they not packed in like sardines, one on top of another?

Not exactly. The average poor person’s home in America has 1,400 square feet — more living space than do Europeans in 23 of the 25 wealthiest countries on the continent.

Two-thirds of America’s poor have two rooms per person, while 94 percent have at least one room per person in the family dwelling.

Only one in 25 poor persons in America uses a homeless shelter, and only briefly, sometime during the year.

What about food? Do not America’s poor suffer chronically from malnutrition and hunger?

Not so. The daily consumption of proteins, vitamins and minerals of poor children is roughly the same as that of the middle class, and the poor consume more meat than the upper middle class.

Some 84 percent of America’s poor say they always have enough food to eat, while 13 percent say sometimes they do not, and less than 4 percent say they often do not have enough to eat.

Only 2.6 percent of poor children report stunted growth. Poor kids in America are, on average, an inch taller and 10 pounds heavier than the youth of the Greatest Generation that won World War II.

In fiscal year 2011, the U.S. government spent $910 billion on 70 means-tested programs, which comes to an average of $9,000 per year on every lower-income person in the United States.
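
A note on the arithmetic, as an inference from the figures quoted here rather than anything stated in Rector’s report itself: dividing the total outlay by the per-person average implies a “lower-income” base of roughly 100 million people, a far broader group than the 46 million who fall below the official poverty line, so the two figures describe different populations:

\[
\frac{\$910\ \text{billion}}{\$9{,}000\ \text{per person}} \approx 101\ \text{million persons}
\]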

Among the major programs from which the poor receive benefits are Temporary Assistance for Needy Families, the Earned Income Tax Credit, Supplemental Security Income, food stamps, the Women, Infants and Children (WIC) food program, Medicaid, public housing, low-income energy assistance and the Social Service Block Grant.

Children of the poor are educated free, K-12, and are eligible for preschool Head Start and, for college, Perkins Grants, Pell Grants and student loans.

Lyndon Johnson told us this was the way to build a Great Society.

Did we? Federal and state spending on social welfare is approaching $1 trillion a year, $17 trillion since the Great Society was launched, not to mention private charity. But we have witnessed a headlong descent into social decomposition.

Half of all children born to women under 30 in America now are illegitimate. Three in 10 white children are born out of wedlock, as are 53 percent of Hispanic babies and 73 percent of black babies.

Rising right along with the illegitimacy rate are the drug-use rate, the dropout rate, the crime rate and the incarceration rate.

The family, cinder block of society, is disintegrating, and along with it, society itself. Writes Rector, “The welfare system is more like a ‘safety bog’ than a safety net.”

Heritage scholars William Beach and Patrick Tyrrell put Rector’s numbers in perspective:

“Today … 67.3 million Americans — from college students to retirees to welfare beneficiaries — depend on the federal government for housing, food, income, student aid or other assistance. … The United States reached another milestone in 2010. For the first time in history, half the population pays no federal income taxes.”

The 19th-century statesman John C. Calhoun warned against allowing government to divide us into “tax-payers and tax-consumers.” This, he said, “would give rise to two parties and to violent conflicts and struggles between them, to obtain the control of the government.”

We are there, Mr. Calhoun, we are there.


Who Killed the Liberal Arts? And why we should care by Joseph Epstein

SEP 17, 2012, VOL. 18, NO. 01 • BY JOSEPH EPSTEIN

When asked what he thought about the culture wars, Irving Kristol is said to have replied, “They’re over,” adding, “We lost.” If Kristol was correct, one of the decisive battles in that war may have been over the liberal arts in education, which we also lost.

In a loose definition, the “liberal arts” denote college study anchored in preponderantly Western literature, philosophy, and history, with science, mathematics, and foreign languages playing a substantial, though less central, role; in more recent times, the social science subjects—psychology, sociology, political science—have also sometimes been included. The liberal arts have always been distinguished from more specialized, usually vocational training. For the ancient Greeks, the liberal arts were the subjects thought necessary for a free man to study. If he is to remain free, in this view, he must acquire knowledge of the best thought of the past, which will cultivate in him the intellectual depth and critical spirit required to live in an informed and reasonable way in the present.

For many years, the liberal arts were my second religion. I worshipped their content, I believed in their significance, I fought for them against the philistines of our age as Samson fought against the Philistines of his—though in my case, I kept my hair and brought down no pillars. As currently practiced, however, it is becoming more and more difficult to defend the liberal arts. Their content has been drastically changed, their significance is in doubt, and defending them in the condition in which they linger on scarcely seems worth the struggle.

The loss of prestige of the liberal arts is part of the general crisis of higher education in the United States. The crisis begins in economics. Larger numbers of Americans start college, but roughly a third never finish—more women finish, interestingly, than do men. With the economic slump of recent years, benefactions to colleges are down, as are federal and state grants, thus forcing tuition costs up, in public as well as in private institutions. Inflation is greater in the realm of higher education than in any other public sphere. Complaints about the high cost of education at private colleges—fees of $50,000 and $55,000 a year are commonly mentioned—are heard everywhere. A great number of students leave college with enormous student-loan debt, which is higher than either national credit card or automobile credit debt. Because of the expense of traditional liberal arts colleges, greater numbers of the young go to one or another form of commuter college, usually for vocational training.

Although it is common knowledge that a person with a college degree will earn a great deal more than a person without one—roughly a million dollars more over a lifetime is the frequently cited figure—today, students with college degrees are finding it tough to get decent jobs. People are beginning to wonder if college, at its currently extravagant price, is worth it. Is higher education, like tech stocks and real estate, the next big bubble to burst?

A great deal of evidence for the crisis in American higher education is set out in College: What It Was, Is, and Should Be. Its author, Andrew Delbanco, the biographer of Herman Melville, is a staunch defender of the liberal arts: he studied them as an undergraduate at Harvard and teaches them currently at Columbia. The continuing diminution of the liberal arts worries him. Some 18 million people in the United States are now enrolled in one or another kind of undergraduate institution of higher learning—but fewer than 100,000 are enrolled in liberal arts colleges.

At the same time, for that small number of elite liberal arts colleges—Harvard, Yale, Princeton, Stanford, Duke, the University of Chicago, and a few others—applications continue to rise, despite higher and higher tuition fees. The ardor to get into these schools—for economic, social, and snobbish reasons—has brought about an examination culture, at least among the children of the well-to-do, who from preschool on are relentlessly trained to take the examinations that will get them into the better grade schools, high schools, colleges, and, finally, professional schools. Professor Delbanco is opposed to the economic unfairness behind these arrangements, believing, rightly, that as a result, “the obstacles [to getting into the elite colleges] that bright low-income students face today are more insidious than the frank exclusionary practices that once prevailed.”

Whether students today, despite all their special tutoring and testing, are any better than those of earlier generations is far from clear. Trained almost from the cradle to smash the SATs and any other examination that stands in their way, the privileged among them may take examinations better, but it is doubtful if their learning and intellectual understanding are any greater. Usually propelled by the desires of their parents, they form a meritocracy that, in Delbanco’s view, as in that of the English sociologist Michael Young whom he quotes, comprises a dystopia of sorts, peopled by young men and women driven by high, but empty, ambition. “Are these really the people we want running the world?” Delbanco asks. Unfortunately, they already are. I am not the only one, surely, to have noticed that some of the worst people in this country—names on request—are graduates of the Harvard and Yale law schools.

Attending one of a limited number of elite colleges continues to yield wide opportunities for graduates, but fewer and fewer people any longer believe that someone who has finished college is necessarily all that much smarter than someone who hasn’t. With standards lowered, hours of study shortened, reports appearing about how many college graduates can no longer be depended upon to know how to write or to grasp rudimentary intellectual concepts, having gone to college seems to have less and less bearing on a person’s intelligence.

Studies cited by Delbanco in his footnotes claim an increase among college students in cheating, drinking, and depression. In their book Academically Adrift, Richard Arum and Josipa Roksa argue that the gain in critical thinking and complex reasoning among the majority of students during college years is very low, if not minimal. In an article in the Chronicle of Higher Education drawn from their book, Arum and Roksa write:

Parents—although somewhat disgruntled about increasing costs—want colleges to provide a safe environment where their children can mature, gain independence, and attain a credential that will help them be successful as adults. Students in general seek to enjoy the benefits of a full collegiate experience that is focused as much on social life as on academic pursuits, while earning high marks in their courses with relatively little investment of effort. Professors are eager to find time to concentrate on their scholarship and professional interests. Administrators have been asked to focus largely on external institutional rankings and the financial bottom line. Government funding agencies are primarily interested in the development of new scientific knowledge. .  .  . No actors in the system are primarily interested in undergraduates’ academic growth, although many are interested in student retention and persistence.

What savvy employers are likely to conclude is that those who graduate from college are probably more conformist, and therefore likely to be more dependable, than those who do not. Paul Goodman, one of the now-forgotten gurus of the 1960s, used to argue that what finishing college really meant is that one was willing to do anything to succeed in a capitalist society. In getting a college degree, Goodman held, one was in effect saying, I want in on the game, deal me a hand, I want desperately to play. Education, meanwhile, didn’t have a lot to do with it.

Not everywhere in higher education have standards slipped. One assumes that in engineering and within the sciences they have been maintained, and in some ways, owing to computer technology, perhaps improved. Relatively new fields of learning, computer science chief among them, have not been around long enough to have lost their way. Medical and legal education are probably not greatly different than they have traditionally been. Chiefly in the liberal arts subjects do standards seem most radically to have slipped.

Early in the 19th century, Sydney Smith, one of the founders of the Edinburgh Review, remarked that if we had made the same progress in the culinary arts as we have made in education, we should still be eating soup with our hands. Apart from eliminating corporal punishment and widening the educational franchise, we can’t be sure if, over the centuries, we have made much progress in education. At the moment there is great enthusiasm about “advances” in education owing to the Internet. Two teachers at Stanford, for example, put their course on Artificial Intelligence online and drew an audience of 160,000 students from all around the world. But science, which deals in one right answer, is more easily taught without a physical presence in the room, and probably works better online than humanities courses, whose questions usually have many answers, few of them permanently right. The Washington Monthly, in its May-June issue, has a special section called “The Next Wave of School Reform,” a wave that, in the words of the editor, aims to “improve students’ ability to think critically and independently, solve complex problems, apply knowledge to novel situations, work in teams and communicate effectively.” The problem with these waves of school reform, of course, is that a new one is always needed because the last one turns out to have tossed up more detritus on the shore than was expected.

The fact is that we still don’t know how to assess teaching—trial by student test scores, except in rudimentary subjects, isn’t very helpful—and we remain ignorant about the true nature of the transaction between teacher and student that goes by the name of learning. In undergraduate education, we may even have retreated a step or two through the phenomenon known as grade inflation and through the politicization of curricula.

The division between vocational and liberal arts education, which began during the 19th century with the advent of the land-grant state universities in the United States, is today tilting further and further in favor of the vocational. Even within the liberal arts, more and more students are, in Delbanco’s words, “fleeing from ‘useless’ subjects to ‘marketable’ subjects such as economics,” in the hope that this will lend them the practical credentials and cachets that might impress prospective employers.

Delbanco reminds us of Max Weber’s distinction between “soul-saving” and “skill-acquiring” education. The liberal arts, in their task to develop a certain roundedness in those who study them and their function, in Delbanco’s phrase, “as a hedge against utilitarian values,” are (or at least were meant to be) soul-saving. Whether, in the majority of students who undertook to study the liberal arts, they truly were or not may be open to question, but what isn’t open to question is that today, the liberal arts have lost interest in their primary mission. That mission, as Delbanco has it, is that of “attaining and sustaining curiosity and humility,” while “engaging in some serious self-examination.” A liberal education, as he notes, quoting John Henry Cardinal Newman, “implies an action upon our mental nature, and the formation of our character.”

Delbanco warns that it won’t do to posit some prelapsarian golden age when higher education approached perfection. Surely he is correct. A good deal of the old liberal arts education was dreary. The profession of teaching, like that of clergyman and psychiatrist, calls for a higher sense of vocation and talent than poor humanity often seems capable of attaining. Yet there was a time when a liberal arts education held a much higher position in the world’s regard than it does today. One of the chief reasons for its slippage, which Delbanco fails directly to confront, is that so many of its teachers themselves no longer believe in it — about which more presently.

I mentioned earlier that the liberal arts were for a good while my second religion. Here let me add that I had never heard of them until my own undergraduate education had begun.

When I was about to graduate from high school as an amiable screw-off, ranked barely above the lower quarter of my class, my father, who had not gone to college, told me that if I wished to go he would pay my way, but he encouraged me to consider whether my going wouldn’t be a waste of time. He personally thought I might make a hell of a good salesman, which was a compliment, for he was himself a hell of a good salesman, and a successful one. I eschewed his advice, not because it wasn’t sound, but chiefly because I felt that, at 18, I wasn’t ready to go out in the world to work.

In those days, the University of Illinois was, at least for residents of the state, an open-enrollment school. If you lived in Illinois, the school had to take you, no matter how low in your high school class you graduated. Lots of kids flunked out, and my own greatest fear on the train headed from Chicago down to Champaign-Urbana, in white bucks and reading The Catcher in the Rye, was that I would be among them.

Most of my friends, Jewish boys from the rising lower-middle class, went to the University of Illinois to major in business. “Business major” nicely rang the earnestness gong. Yet the courses required of a business major struck me as heart-stoppingly boring: accounting, economics, marketing, advertising, corporation finance, also known as “corp fin,” which sounded to me like nothing so much as a chancy seafood dish. I was especially nervous about accounting, for I had wretched handwriting and a disorderly mind, which I viewed as two strikes against me straightaway. Wasn’t there something else I might study instead of business? A fellow in the fraternity that was rushing me suggested liberal arts. This was the first time I had heard the phrase “liberal arts.” What it initially stood for, in my mind, was no accounting.

In my first year at the University of Illinois, I had slightly above a B average. I attained this through sheer memorization: of biological phyla, of French irregular verbs and vocabulary, of 17th-century poems. I also discovered, in a course called Rhetoric 101, that I had a minor skill at prose composition, a skill all the more remarkable for my excluding all use of any punctuation trickier than commas or periods.

After this modest success, I decided that I was ready for a more exotic institution, the University of Chicago, to which I applied during my second semester at Illinois. What I didn’t know then, but have since discovered, was that my demographic cohort, those people born toward the middle and end of the Depression, were lucky when it came to college admission, for our small numbers made colleges want us quite as much as we wanted them. In short, I was accepted at the University of Chicago, though I would never have been accepted there today, and that is where I spent the next, and final, three years of my formal education.

The University of Chicago had a reputation for great teachers, but I managed, somehow, to avoid them. I never sat in a class conducted by Leo Strauss, Joseph Schwab, Norman Maclean, David Grene, or Edward Shils. (Of course, great teachers, like great lovers, can sometimes be overrated. Later in life, I met a few men and women reputed to be great teachers and found them pompous and doltish, their minds spoiled by talking too long to children.) I attended a lecture by David Riesman, who was then Time magazine-cover famous, and was impressed by what then seemed to me his intellectual suavity. I sat in on a couple of classes taught by Richard Weaver, the author of Ideas Have Consequences, but left uninspired. I was most impressed by teachers from Mittel-Europa, Hitler’s gift to America, whose culture seemed thicker than that of the native-born teachers I encountered, and could not yet perceive the commonplace mind that sometimes lurked behind an English accent.

I took a course from Morton Dauwen Zabel, who was the friend of Harriet Monroe, Marianne Moore, and Edmund Wilson. Although not a great teacher, Zabel was an impressive presence who gave off whiffs of what the literary life in the great world was like. I took a summer course from the poet and critic Elder Olson, who kept what seemed a full-time precariously long ash on the end of his cigarette, and who, after reading from The Waste Land, ended by saying, “How beautiful this is. Too bad I can’t believe a word of it.”

The students at the University of Chicago were something else. In his book, Delbanco, defending the small classroom, refers to something he calls “lateral learning,” meaning what a college student learns in class from his fellow students. He cites Cardinal Newman and John Dewey on this point, and quotes Nathaniel Hawthorne:

It contributes greatly to a man’s moral and intellectual health, to be brought into habits of companionship with individuals unlike himself, who care little for his pursuits, and whose sphere and abilities he must go out of himself to appreciate.

A great many of my fellow students in the College at the University of Chicago seemed to come from New York City, several others from academic families. They appeared to have been reading the Nation and the New Republic from the age of 11. Their families argued about Trotsky at the dinner table. A few among them had the uncalled-for candor of psychoanalysands. I recall a girl sitting next to me at a roundtable in Swift Hall volunteering her own menstrual experiences in connection with a discussion of those of the Trobriand Islanders.

Some among these University of Chicago students had an impressive acquaintance with books. One morning in Elder Olson’s class in modern poetry, Olson began quoting Baudelaire (mon semblable,—mon frère!) and a student next to me, named Martha Silverman, joined him, in French, and together, in unison, the two of them chanted the poem to its conclusion. This was one of those moments when I thought it perhaps a good time to look into career opportunities at Jiffy Lube.

“I invariably took the first rank in all discussions and exercises, whether public or private, as not only my teachers testified, but also the printed congratulations and carmina of my classmates.” So wrote Leibniz about his own classroom performance. Reverse everything Leibniz wrote and you have a fairly accurate picture of my classroom performance at the University of Chicago. None among my teachers there ever suggested that I had intellectual promise. Nor should they have done, for I didn’t show any, not even to myself. I made no “A”s. I wrote no brilliant papers. I didn’t do especially well on exams. I was not quick in response in the classroom.

Only years later did I realize that quickness of response—on which 95 percent of education is based—is beside the point, and is required only of politicians, emergency-room physicians, lawyers in courtrooms, and salesmen. Serious intellectual effort requires slow, usually painstaking thought, often with wrong roads taken along the way to the right destination, if one is lucky enough to arrive there. One of the hallmarks of the modern educational system, which is essentially an examination system, is that so much of it is based on quick response solely. Give 6 reasons for the decline of Athens, 8 for the emergence of the Renaissance, 12 for the importance of the French Revolution. You have 20 minutes in which to do so.

At the University of Chicago I read many books, none of them trivial, for the school in those years did not allow the work of second- or third-rate writers into its curriculum. Kurt Vonnegut, Toni Morrison, Jack Kerouac, Adrienne Rich, or their equivalents of that day, did not come close to making the cut. No textbooks were used. You didn’t read “Karl Marx postulated .  .  .”; you read Karl-bloody-Marx. The working assumption was that one’s time in college is limited, and mustn’t be spent on anything other than the first-rate, or on learning acquired (as with textbooks) at a second remove.

Nor did Chicago offer any “soft” majors or “lite” courses. I remember, in my final year, looking for such a course to fill out a crowded schedule, and choosing one called History of Greek Philosophy. How difficult, I thought, could this be? Learn a few concepts of the pre-Socratics (Thales believed this, Heraclitus that), acquire a few dates, and that would be that. On the first day of class, the teacher, a trim little man named Warner Arms Wick, announced that there was no substantial history of Greek philosophy, so we shall instead be spending the quarter reading Aristotle and Plato exclusively.

How much of my reading did I retain? How much does any 19- or 20-year-old boy, whose hormones have set him a very different agenda, retain of serious intellectual matter? How much more is less than fully available to him owing to simple want of experience? What I do remember is the feeling of intellectual excitement while reading Plato and Thucydides and an almost palpable physical pleasure turning the pages of Max Weber’s The Protestant Ethic and the Spirit of Capitalism as he made one dazzling intellectual connection after another. I can also recall being plunged into a brief but genuine depression reading Freud’s Civilization and Its Discontents.

The idea behind the curriculum at the College of the University of Chicago was the Arnoldian one, abbreviated to undergraduate years, of introducing students to the best that was thought and said in the Western world. Mastery wasn’t in the picture. At least, I never felt that I had mastered any subject, or even book, in any of my courses there. What the school did give me was the confidence that I could read serious books, and with it the assurance that I needed to return to them, in some cases over and over, to claim anything like a genuine understanding of them.

I was never more than a peripheral character, rather more like a tourist than a student, at the University of Chicago. Yet when I left the school in 1959, I was a strikingly different person than the one who entered in 1956. What had happened? My years there allowed me to consider other possibilities than the one destiny would appear to have set in grooves for me. I felt less locked into the social categories—Jewish, middle-class, Midwestern—in which I had grown up, and yet, more appreciative of their significance in my own development. I had had a glimpse—if not much more—of the higher things, and longed for a more concentrated look.

Had I not gone to the University of Chicago, I have often wondered, what might my life be like? I suspect I would be wealthier. But reading the books I did, and have continued to throughout my life, has made it all but impossible to concentrate on moneymaking in the way that is required to acquire significant wealth. Without the experience of the University of Chicago, perhaps I would have been less critical of the world’s institutions and the people who run them; I might even have been among those who do run them. I might, who knows, have been happier, if only because less introspective—nobody said the examined life is a lot of laughs—without the changes wrought in me by my years at the University of Chicago. Yet I would not trade in those three strange years for anything.

I turned out to be a better teacher than student. In fact I took to saying, toward the close of my 30-year stint in the English department at Northwestern University, that teaching provides a better education than does being a student. If he wishes to elude boredom among his students and embarrassment for himself, a teacher will do all he can to cultivate the art of lucid and interesting presentation and the habits of thoroughness. Thereby, with a bit of luck, education may begin to kick in.

Yet even after completing three decades of teaching, I am less than sure that what I did in the classroom was effective or, when it might have been effective, why. Of the thousands of inane student evaluations I received—“This guy knows his stuff” .  .  . “Nice bowties” .  .  . “Great jokes”—the only one that stays in my mind read: “I did well in this course; I would have been ashamed not to have done.” How I wish I knew what it was that I did to induce this useful shame in that student, so that I might have done it again and again!

Student evaluations, set in place to give the impression to students that they have an important say in their own education, are one of the useless intrusions into university teaching by the political tumult of the 1960s. Teaching remains a mysterious, magical art. Anyone who claims he knows how it works is a liar. No one tells you how to do it. You walk into a classroom and try to remember what worked for the teachers who impressed you, or, later in the game, what seemed to work best for you in the past. Otherwise, it is pure improv, no matter how extensive one’s notes.

As a testimony to the difficulty of evaluating the quality of teaching, Professor Delbanco includes a devastating footnote about student evaluations. One study found that students tend to give good evaluations “to instructors who are easy graders or who are good looking,” and to be hardest on women and foreign teachers; another, made at Ohio State University, found “no correlation between professor evaluations and the learning that is actually taking place.” As Delbanco notes, the main result of student evaluations is to make it easier for students to avoid tough teachers or, through harsh reviews, punish these teachers for holding to a high standard.

I was not myself regarded as a tough teacher, but I prefer to think that I never fell below the line of the serious in what I taught or in what I asked of my students. What I tried to convey about the writers on whom I gave courses was, alongside the aesthetic pleasures they provided, their use as guides, however incomplete, to understanding life. Reading Joseph Conrad, Henry James, Leo Tolstoy, Fyodor Dostoyevsky, Willa Cather, and other writers I taught was important business—possibly, in the end, though I never said it straight out, more important than getting into Harvard Law School or Stanford Business School. When I taught courses on prose style, I stressed that correctness has its own elegance, and that, in the use of language, unlike in horseshoes, close isn’t good enough; precision was the minimal requirement, and it was everything.

How many students found helpful what I was trying to convey I haven’t the least notion. If anything I said during the many hours we were together mattered to them, I cannot know. Not a scholar myself, I never tried to make scholars of my students. A small number of them went on to do intellectual work, to become editors, critics, poets, novelists; a few became college teachers. Did my example help push them in their decision not to go for the money? Some of the brightest among them did go for the money, and have lived honorable lives in pursuit of it, and that’s fine, too. A world filled with people like me would be intolerable.

When I taught, I was always conscious of what I thought of as the guy in the next room: my fellow teachers. During my teaching days (1973-2003), I could be fairly certain that the guy in the next room was teaching something distinctly, even starkly, different from what I was teaching. This was the age of deconstruction, academic feminism, historicism, Marxism, early queer theory, and other, in Wallace Stevens’s phrase, one-idea lunacies. A bright young female graduate student one day came to ask me if I thought David Copperfield a sexual criminal. “Why would I think that?” I asked. “Professor X thinks it,” she said. “He claims that because of the death in childbirth of David Copperfield’s wife, he, Copperfield, through making her pregnant, committed a crime.” All I could think to reply was, “I guess criticism never sleeps.”

While not wishing to join the dirge-like chorus of those who write about the fate of higher education in our day, Andrew Delbanco does not shy from setting out much that has gone wrong with it. He highlights the importance everywhere accorded to research over teaching among faculty. He notes the preeminence of science over the humanities, due to the fact that science deals with the provable and can also lead to technological advancement, and hence pays off. (He mentions the sadly mistaken slavishness of the humanities in attempting to imitate science, and cites the advent of something called the “literature lab” as an example.) He brings up the corruption implicit in university presidents sitting on corporate boards, the fraudulence of big-time college athletics, some of whose football and basketball coaches earn more than entire academic departments, and much more.

Delbanco, a secular Jew and a man of the Vietnam generation, is nonetheless ready to allow the pertinence of the earlier Protestant view of higher education in the liberal arts:

The era of spiritual authority belonging to college [when it was under religious auspices] is long gone. And yet I have never encountered a better formulation—“show me how to think and how to choose”—of what a college should strive to be: an aid to reflection, a place and process whereby young people take stock of their talents and passions and begin to sort out their lives in a way that is true to themselves and responsible to others.

College: What It Was, Is, and Should Be gives a clear picture of all the forces, both within and outside the university, working against the liberal arts. Yet Delbanco lets off the hook the people who were in the best position to have helped save them—the teachers, those “guys in the next room.” Much could be said about teaching the liberal arts before the Vietnam generation came to prominence (which is to say, tenure) in the colleges: that it could be arid, dull, pedantic, astonishingly out of it. But it never quite achieved the tendentious clownishness that went into effect when “the guys in the next room” took over.

Not that the ground hadn’t been nicely prepared for them. Universities had long before opened themselves up to teaching books and entire subjects that had no real place in higher education. Take journalism schools. Everyone who has ever worked on a newspaper knows that what one learns in four years in journalism school can be acquired in less than two months working on a newspaper. But as journalism schools spread, it slowly became necessary to go through one in order to get a job on a large metropolitan daily. Going to “journ” school became a form of pledging the fraternity. Everyone else in the business had pledged; who are you, pal, to think you can get in without also pledging? And so journalism schools became mainstays of many universities.

Then there is the business school, especially in its MBA version. Business schools are not about education at all, but about so-called networking and establishing, for future employers, a credential demonstrating that one will do anything to work for them—even give up two years of income and pay high tuition fees for an MBA to do so. As with an American Express card, so with an MBA, one daren’t leave home without one, at least if one is applying for work at certain corporations. Some among these corporations, when it comes to recruiting for jobs, only interview MBAs, and many restrict their candidate pools to MBAs from only four or five select business schools. Pledging the fraternity again.

Soon, the guys in the next room, in their hunger for relevance and their penchant for self-indulgence, began teaching books for reasons external to their intrinsic beauty or importance, and attempted to explain history before discovering what actually happened. They politicized psychology and sociology, and allowed African-American studies an even higher standing than Greek and Roman classics. They decided that the multicultural was of greater import than Western culture. They put popular culture on the same intellectual footing as high culture (Conrad or graphic novels, three hours credit either way). And, finally, they determined that race, gender, and social class were at the heart of all humanities and most social science subjects. With that finishing touch, the game was up for the liberal arts.

The contention in favor of a liberal arts education was that contemplation of great books and grand subjects would take students out of their parochial backgrounds and elevate them into the realm of higher seriousness. Disputes might arise from professor to professor, or from school to school, about what constituted the best that was thought and said—more Hobbes than Locke, more Yeats than Frost—but a general consensus existed about what qualified to be taught to the young in the brief span of their education. That consensus has split apart, and what gets taught today is more and more that which interests professors.

Columbia still provides two years of traditional liberal arts for its undergraduates. The University of Chicago continues to struggle over assembling a core curriculum based on the old Robert Hutchins College plan. St. John’s College, both in Annapolis and in Santa Fe, has, from its founding, been devoted to the cult of the liberal arts, even to the point of having its students study medieval science. The hunger among students for the intellectual satisfaction that a liberal arts education provides is not entirely dead. (At Northwestern, a course in Russian novels taught by Gary Saul Morson attracts 600 students, second only to the notorious, recently canceled course in sex education offered by the school.) But the remaining liberal arts programs begin to have the distinct feel of rearguard actions.

The death of liberal arts education would constitute a serious subtraction. Without it, we shall no longer have a segment of the population that has a proper standard with which to judge true intellectual achievement. Without it, no one can have a genuine notion of what constitutes an educated man or woman, or why one work of art is superior to another, or what in life is serious and what is trivial. The loss of liberal arts education can only result in replacing authoritative judgment with rivaling expert opinions, the vaunting of the second- and third-rate in politics and art, the supremacy of the faddish and the fashionable in all of life. Without that glimpse of the best that liberal arts education conveys, a nation might wake up living in the worst, and never notice.

Joseph Epstein, a contributing editor to The Weekly Standard, is the author, most recently, of Essays in Biography.

Breaking Bad: A Contemporary Tragedy by Dutton Kearney

Breaking Bad: A Contemporary Tragedy

by Dutton Kearney

The final eight episodes of Breaking Bad are upon us. If you haven’t been following the series, you’re missing what many media critics are calling the best show on television and one of the best of all time. Perhaps so. For many, it has been a five-year guilty pleasure. The writing is quite good, and characters like Saul Goodman are so interesting that they could very well have a show of their own. It’s not for children, with its language, drug economy, and extreme violence. However, it does seem that violence has become the tenor and vehicle of American television drama, and many of its best series—especially those available only on cable—are indeed violent. The Writers Guild of America recently issued its list of the 101 best-written television shows, and it is probably no surprise to anyone that The Sopranos is at the top of that list. Breaking Bad is number 13, and The Wire, another cable drama featuring drugs and violence, is number 9.

What is the appeal of a high school chemistry teacher who decides to cook methamphetamine when he discovers that he has stage four lung cancer? The characters are vivid, well-drawn, and memorable, but that’s not why so many people can’t get enough of the show. The setting is New Mexico—with its beauty on the one hand, and, because of its shared border with Mexico, its proximity to the drug trade on the other—but that’s not why viewers tune in either. Nor is it the relentless series of violent cliffhangers—watching an episode replicates the exhilaration and exhaustion of a roller coaster. As Justin Jackson, a colleague at Hillsdale College, put it, the tragic plot of Walter White is what makes the show so interesting. It is probably the closest we will get to experiencing tragedy as the Greeks did in Athens. We know the characters. We know the conclusion. We know the plotline’s inevitability. We are simultaneously drawn to Walter and repelled by him, just as the Greeks were with Oedipus, just as the Elizabethans were with Macbeth. We pity Walter White as much as we fear him.


Informing Catholic Consciences: Compiled by the Diocese of Trenton

Informing Catholic Consciences

Bishop O'Connell, October 21, 2013

The Roman Catholic Church (RCC) in the United States does not endorse any candidate for political office whatsoever and does not endorse any political party. In its 2007 statement “Forming Consciences for Faithful Citizenship,” the United States Conference of Catholic Bishops (USCCB) states that “we bishops do not intend to tell Catholics for whom or against whom to vote” but, rather, “to help Catholics form their consciences in accordance with God’s truth” (paragraph 7). The bishops continue, “In the Catholic Tradition, responsible citizenship is a virtue, and participation in political life is a moral obligation” (paragraph 13). Neither the USCCB nor the Diocese of Trenton provides, presents, or endorses a “voters’ guide” or a “scorecard of issues” with directions on how to vote. In an effort to help Catholics form and inform their consciences, however, we do attempt to present Catholic teaching on moral and social issues to the faithful clearly and consistently in accordance with the Gospel and the Church’s rich tradition regarding matters of faith and morals. Although by no means exhaustive, the following summary briefly attempts to do that.

Abortion, Euthanasia and Life Issues: RCC teaches unqualified and absolute support for all human life in all its stages from conception to natural death. RCC considers abortion and euthanasia grave moral evils (see the Catechism of the Catholic Church (CCC), paragraphs 2270 through 2283).

Death Penalty: RCC teaches that “at the heart of the Catholic teaching on the death penalty is the belief that ‘human life is sacred because from its beginning it involves the creative action of God and it remains forever in a special relationship with the Creator, who is its sole end’” (CCC, paragraph 2258). In paragraphs 2266 and 2267, CCC goes on to teach that

The State’s effort to contain the spread of behaviors injurious to human rights and the fundamental rules of civil coexistence corresponds to the requirement of watching over the common good. Legitimate public authority has the right and duty to inflict penalties commensurate with the gravity of the crime. The primary scope of the penalty is to redress the disorder caused by the offense. When his punishment is voluntarily accepted by the offender, it takes on the value of expiation. Moreover, punishment, in addition to preserving public order and the safety of persons, has a medicinal scope: as far as possible it should contribute to the correction of the offender (CCC, paragraph 2266).

The traditional teaching of the Church does not exclude, presupposing full ascertainment of the identity and responsibility of the offender, recourse to the death penalty, when this is the only practicable way to defend the lives of human beings effectively against the aggressor. “If, instead, bloodless means are sufficient to defend against the aggressor and to protect the safety of persons, public authority should limit itself to such means, because they better correspond to the concrete conditions of the common good and are more in conformity to the dignity of the human person. Today, in fact, given the means at the State’s disposal to effectively repress crime by rendering inoffensive the one who has committed it, without depriving him definitively of the possibility of redeeming himself, cases of absolute necessity for suppression of the offender ‘today … are very rare, if not practically non-existent.’” [John Paul II, encyclical Evangelium Vitae, paragraph 56.] (CCC, paragraph 2267)
