An Editor’s Life

For the last several months, I have taken up the responsibility of editing The Journal of Faith and the Academy, a publication of the Institute of Faith and the Academy at Faulkner University. The role of an editor, I am learning, is more complex than simply organizing and formatting articles. It is an amalgamation of duties, including writer, marketer, logistician, researcher, counselor, and politician. So taxing has been this new assignment that I have found little time for creativity, recreation, and writing of my own. I might spend some time lamenting this fact, but your time is short, so I will edit that digression and move on to more interesting thoughts.

Some people confuse editing with proofreading. In truth, proofreading is easy. Grammar follows a fairly standard set of rules. Convention and use have laid the path, and the writer needs merely to follow it. A prescriptive doctrine of subject-verb agreement, dangling participles, and (there it was, if you just missed it) the Oxford comma acts as a railroad track to lead readers to clarity.

Leading readers to enlightenment, however, is a more arduous journey. As an editor, your goal is to ensure that the writer says what he thinks he says—in such a way, of course, that will make sense to the audience. Standing in as the imagined reader, the editor must walk the middle ground of statement and intention. In this respect, the editor should do as little revising as necessary out of respect to the author. Time, if not ethics, does not permit the editor to meddle. And yet, there’s always that one author who can’t seem to stay on the rails and whose train just refuses to climb the hill, that one whose lack of clarity screams out for a firm hand…

Nevertheless, revision errs in making the author say what the editor wants him to say. That impulse is always there—for surely I could make this idea more profound, could draw out this obscure connection, could argue this point so much more clearly. But it is an impulse that can only be suggestive, not prescriptive. For if Dumbledore is correct, and words are the most powerful magic we wield, then the responsibility of handling the work of the author is a weighty one indeed.

Thought is composed of language, and language has the unique ability to affect thoughts, introducing ideas that can later become whole perspectives on the world. With this in mind, it is no large step from the grammatical to the psychological, even less from the psychological to the metaphysical.

What parts of our lives do we edit out? Surely we edit our past, as I have remarked elsewhere. But I think this is even more true when it comes to our interpretation and transformation. First, as believers, we attempt to edit out those behavior patterns, thoughtless actions, and evil intentions that prevent us from writing a masterpiece of our lives. We attempt—sometimes successfully—to delete gossip, lying, lust, or any other sin that would tarnish our divine likeness. This is, to a large extent, necessary. Yet at the same time, there is no good story without conflict, and the Father has chosen to make his kingdom out of sinful citizens. In the city of God, the weakest characters are those in whom Christ shines the brightest. He is, after all, the master editor.

But good editing can only occur when we have a good style guide and know how to use it well (I fear my metaphor is descending into obscurity and banality, but bear with me). Doctrine is fairly easy to grasp, as it is determined by scripture, tradition, and local (sometimes global) authority. Interpretation is much harder. One can become lost in the minutiae of grammar, denotation and connotation, and etymology, and never see the symbolism, allusion, paradox, and overall unity of the work. Sometimes I think we become so hung up on the task of proofreading that we neglect the more important work of editing. The Christ gives meaning to Hosea’s echo of desiring mercy and not sacrifice, spiritual metamorphosis over religious purity. Focusing on doctrine means never making the mistake of revision (though doctrinal purists commit their own brand of revisionism as well). Yet it also means missing out on the importance of editing. The work of editing lies not only in laboring to make one’s life a reflection of the Author, but also in partnering with him to write a beautiful poem of one’s life in the larger poetry of humanity.

The task is not an easy one. It requires a right intention, an observant mind, and a steady hand with nimble fingers. But, when accomplished, the right words can create redemptive ideas.

Pop Culture, Pop Guns, and the Politicization of Celebrities

In the midst of Ebola scares, a midterm election, and the rise of the Islamic State, you probably knew (not that you could help it, but much to your shame) that George Clooney recently married, that Chris and Gwyneth held a divorce ceremony, and that Renee Zellweger used more plastic than a Lego factory to change her face. This is hardly news, and yet it occupies the headlines of current events from The Sun to the Wall Street Journal.

Celebrities are not a new phenomenon. People have been breathlessly taken by their rulers since governments first formed. People gravitate toward the rich and famous. This attraction not only gives us, the little people, heroes to follow but also reaffirms our belief that one day we can “make it” too. If Taylor Swift can become famous with little more than a smile, then maybe I can achieve fame and success. Most classical writers had fans who might have hash-tagged them were it not for the absence of social media and the threat of execution. It was, at least in part, Julius Caesar’s popularity with the mob that contributed to his assassination—desires to overthrow the republic notwithstanding. And while politics is merely show business for the, shall we say, aesthetically disenfranchised, media figures distastefully use their attention-grabbing status to opine on everything political from environmentalism to economics.

As an antagonist of pop culture, I find it difficult to support the pet causes of people whose greatest claim to fame is pretending (albeit convincingly) to be someone else. If one makes a living fictionalizing, it is hard not to wonder cynically whether the outrage is similarly feigned. Other celebrities have even fewer talents, like being able to sing better than most—and if they don’t possess that talent, they can at least remove more clothes while singing subpar. For this reason alone, their social and political judgments should be discounted.

I think many media figures are self-consciously aware of their own inanity, which is why they must expand their platform to be taken seriously. They don’t cure cancer, but they can make it look like they care about curing cancer. To some degree, this is endemic to their industry. For a profession that must outdo itself every summer season, bigger must be better. Yet we may or may not be reaching a point of diminishing returns for the theatre-going audience; time, as in all things, will tell. Are we no longer able to suspend disbelief in the predictable highwire jumps, slow motion effects, and CGI? Somehow I doubt it. It is more likely that the convenience of home viewing is out-pacing the perceived value of soaring ticket prices, popcorn and soda that are more expensive than a McDonald’s value meal, and the annoyance of sitting with loud and socially-inappropriate viewers.

But if there is any cultural backlash against Hollywood beginning to assert itself, it may (and this is a really big MAY) be rooted in the celebrity phenomenon. It is hard to take seriously, for instance, a video of celebrities condemning gun violence when the blockbuster movies they sell make money through crafted explosions, fake bullets, and fictional body counts. To my knowledge, the only action hero who has ever had to account for his actions is Jack Bauer, who appears before a congressional committee to explain his counter-terrorist measures in the seventh season of 24 (apparently you can’t kill 309 people without someone taking notice), and even then his use of torture is considered sufficiently justified to allow him to save the world for another three seasons. But when the Avengers help save New York City, we never see the President granting emergency federal funds or the painstaking process of rebuilding billion-dollar skyscrapers. Rarely do we see one of Denzel’s characters facing guilt-ridden sleepless nights for all the bad guys he’s shot in the line of duty. We don’t hear the residents of Westeros standing up and saying “Enough” to the violence. Ironically, if the American news media saw the residents of Westeros standing up and saying “Enough” to the nudity, incest, and general promiscuity in town, they’d be heckled as too Puritanical for a more enlightened age.

The beauty of fiction is in paring down a story to its most essential elements; we are told only those plots that communicate the greatest meaning. Though it takes place in real time, we are thankfully spared Jack Bauer’s real needs to eat and pee (fighting terrorists never allows for a bathroom break—except perhaps during the commercials). This is an acceptable aspect of fiction wherein we, the audience, suspend our disbelief. Yet the downside is that we consumers of pop culture never see the day-to-day reality of what would occur outside the narrative. We see only the sensational events, never the consequences.

And herein lies the problem of the political celebs. They appeal to the low-information voter, urging them to act without ever considering the costs of action, and rarely the benefits of inaction. Ignoring the realities of everyday life, they consider neither the end result nor the unintended consequences. Because pop culture reduces everything to the lowest common denominator, the least truth, the least goodness, and the least beauty are merely enough—though the least never is. While this is certainly democratic, it means the most superficial engagement suffices for electoral success.

In the marketplace of ideas, there will always be bad ones. In the media landscape, we will always find stories of no inherent value. But this also means we don’t have to justify their existence with our attentiveness.

The Middle of King Lear

Sitting high atop the dung heap of humanity, the solitary weeper weeps alone. Like the biblical Job, he laments his fall and cries at life’s injustice. He tears at his hair, scratches lines into the dirt, speaks in gibberish. He speaks without meaning, and we are left assembling broken puzzle pieces, vainly trying to create meaning. I and three dozen other audience members watch the actor transform into King Lear, witnessing high art yet at a loss to know what to do with it.

What is insanity? When I ask this question in class, my student athletes claim they know the answer: doing the same thing over and over again and expecting a different result. Far be it from me to contradict a generation of coaches looking to motivate their players, but that answer belongs to the question, What is stupidity? (I’ll take “Worn-Out Clichés” for $300, Alex…).

In truth, insanity is the inability to see reality as it actually is. Having never been insane (that I know of), I feel unequipped to explore the answer to this question. It is not as comprehensive as, What is love? nor as abortively academic as, What is death? But an insane person, I conjecture, cannot—or chooses not to—live in the real world.

But then this raises the question, What is real? I think we can safely answer that point by asserting that the real is what everyone accepts the real to be. Everyone accepts that the sky is blue, that gravity is irresistible, that murder is wrong. Yet legal experts and theologians can find plenty of instances where murder is acceptable; does that then mean that the majority view is insufficient? It may be, but exceptions do not fully invalidate the morally secure position of the majority. But then what if a minority asserted a metaphysical truth, like, for instance, the belief that a god could rise from the dead, and claimed a new moral code with which the majority did not agree? Would they be considered insane? And if so, must they be dealt with by society, using means that might appear harsh yet were perhaps necessary to protect the majority?

You see the problem?

Insanity cannot be defined as disputes between moral viewpoints, nor even fully the inability to relate to the world as it is. Insanity may be a clinical definition for a psychological world of schizophrenia and psychotic breaks, though I prefer the more antiquated and literary term madness. Madness removes, sometimes violently and often temporarily, an individual from the external world to a world within. It is not an affliction of the brain but of the soul.

Shakespeare’s King Lear addresses the unimaginable horrors of old age, of isolation, and of its accompanying madness. We agonizingly watch, scene after scene, as Lear’s very identity is slowly stripped from him. He loses his deepest love in his daughter Cordelia, loses the presumed affections of his other, duplicitous daughters, and loses his kingship and his home. Finally, having lost everything else that would identify us as human and give us meaning, he loses his mind.

In the one-man performance of The Middle of King Lear I was fortunate to see last month, Royal Shakespeare Company veteran Cedric Liqueur portrays Lear alone on the heath with nothing remaining to him—not even his mind. He pours forth a litany of Lear’s lines from the iconic original play, but in isolation, without anyone to talk to, the words lose all meaning. Deeper into his soliloquy, Lear begins speaking to the audience members around him, selecting one observer to be his Edgar, one his Kent, two his Goneril and Regan, one his Cordelia. He looks at all of us surrounding him, and the stage becomes a place where we become the silent ghosts of Lear’s past. What would he have said to them—to us—in his madness if he could?

Notably, one of the markers of insanity is talking to one’s self. What if the mind, poisoned by pain and despair, imagines its friends, its families, its enemies—recreates them out of the heavy air of dreams? What if madness is simply trying to reconstruct a new reality out of what might have been? Further, what if we, the sane, scornfully watch the insane, never realizing the futility of our own state? We too try to reconstruct new realities: comfortable decadence, insulated from the darkness of the world, fantastical utopias. Perhaps we are mad for ignoring the darkness and trying to replace it with false light.

One of the constant refrains of Man of La Mancha is that human beings are born into a dung heap (which is not a skewed perspective in seventeenth-century Spain). It is Don Quixote who is thought insane because he refuses to lose hope. Or rather, he creates hope with fantasy, choosing to believe the world is a finer, richer, and more magnificent place than it actually is. And in believing it to be so, he infects others with his redeemable lunacy. Don Quixote asks, “If life is insanity, what then is madness?”

The human spirit must rise above the human condition. If we hope for more from this life—and potentially from the next—we must transcend the dung heap. The revered Union general Joshua Lawrence Chamberlain, having hunkered down before the stone wall at Fredericksburg, triumphed in that inspirational charge at Little Round Top at Gettysburg, and survived the mud and the horrors of Petersburg, was frequently asked how he did it. How did he endure the mud, the cold, the blood, and the indecency? His response was that he imagined himself a medieval knight, a warrior doing the business of God where little was glorious and all was to be gained. He could courageously face the abyss believing himself something more than himself. He, like my man Don, reminds us that we can lament the dung, burn it all, or build something with it. Most of us are not destructive, but even more of us lack the insane courage to create something significant out of something so filthy, worthless, and wicked. And yet, I am reminded, thus did the Christ.

Is madness innate? Does it manifest itself in some sort of Freudian nightmare? Or do we, forced into the cuckoo’s nest, retreat into madness for an ounce of security? Or is madness the only rational response in an irrational world?

Maimonides’ Quandary and the Perils of Interpretation

Last month, an article popped up on my News Feed, “Does Personal Bible Reading Destroy the Church?” Naturally, a title like this is written to attract attention—and it certainly did mine. The author argues that the development of Protestantism fractured Christianity into 34,000 different sects. Consequently, both insiders and outsiders don’t know what to believe about scripture—and finding out the answers for themselves may be just as misleading and costly as seeking them out within the fenced-in doctrines of one of Christianity’s many sects.

Ironically, this fracturing of Christianity was one of the prophecies leveled against Martin Luther in his enforced break from Catholicism. The church, it was said, would shatter into a thousand pieces, which evidently turned out to be true (give or take ten thousand or so). Catholicism splinters into Lutheranism, which breaks into Protestantism, out of which arises Methodism and the various Baptist fellowships, which eventually devolves into the Cult of Fullman. Without a central authority to interpret and enforce Holy Writ, each man becomes his own glossator of scripture. This is especially problematic for two reasons.

The first is our tenuous relationship to truth. Nietzsche argued, most convincingly, in “On Truth and Lying in a Non-Moral Sense” that most people are not interested in truth, only in what makes them comfortable. We will believe, he claims, what we want to believe—not what is commensurate with reality. While this sounds terribly cynical, there is still more than a little substance to what he says. My subjective, individual will is a more immediately compelling force than an objective, external truth.

We can see this in the way people fashion for themselves narratives about the world. This begins with our parents, as we hear their political opinions, religious convictions, and social commentary. As we mature we adopt, reject, or modify our parents’ narratives. Perhaps we choose to spank our children because our parents did; perhaps we don’t. As we age and encounter new experiences, these narratives become more solidified in our minds, and it becomes difficult to change or alter them. When confronted with a new idea that challenges the narrative, people are naturally resistant. How we’ve always done things is sufficient, we tell ourselves. Sufficiency is later replaced with complacency, which resigns itself to the assumption that we have already discovered the truth, that we understand it, and that we communicate it. The first problem is that we may not see reality as it is—only as we wish it to be.

Stemming from this premise, the second problem is that, especially in matters of divinity, only some people may be trusted with the truth. In The Guide for the Perplexed, the twelfth-century philosopher Maimonides acknowledges the very real difficulty in interpreting scripture. Against our largely Western perspective that each person can fairly understand anything, his Jewish take is that the truth is too vast and mysterious a concept to be grasped. Even the learned are perplexed at the intricacies and various shades of meaning implicit in God’s word—how much more so the vulgar? The learned have a special advantage in that they have a greater understanding of the world, and divine science can be approached through natural science. This does not preclude teaching the vulgar, but you can only cast pearls before swine so often before they turn and tear you to pieces. This form of criticism asserts that there is indeed a palpable interaction between text and audience, but Maimonides insists that deeper understanding is only for the committed few. The question becomes not only how we should interpret, but who should interpret?

This is a quandary most 21st century Westerners would likely not even entertain. So entrenched is our post-literate thinking that we adopt a posture of arrogance with the assumption that we can understand the deep mysteries of God. But if God reserves special revelation for Abraham, Moses, and the prophets, surely there are things he does not tell the masses. If Christ reserves special explanation for his apostles (Mark 4:10-12), surely there are things the average disciple could not grasp. If the Word of scripture contains “things that are hard to understand” (2 Pet 3:16), surely not all can accurately interpret.

We democratize our virtues and special gifts of the Spirit to our peril. If not everyone is equipped to become pastors and teachers, should everyone be given the benefit of the doubt in their own personal analysis of a text? If we wouldn’t make each man an administrator, should we deem each man a qualified reader? Dare we go so far as to say that interpretation should be handled by professionals? But then again, what if the interpreters have been poorly trained? What if they are similarly ill-equipped to rightly handle the word of truth? If power over scripture resides in a select few, then to whom will we turn when the few intentionally—or unintentionally—misinterpret?

The dilemma would not solve itself were Christianity suddenly to reunify under one umbrella. Even the popular credo Sola Scriptura from my own and from other faith traditions is rife with problems—since people interpret scripture differently. An argument between an Independent Baptist and a Free-Will Baptist may be hostile and infused with urgency, though from an outsider’s perspective largely inconsequential. A premillennialist or postmillennialist view of eschatology should not disqualify one from salvation or sanctification. Of course, from another perspective these conclusions reveal my own interpretive biases—and maybe my perspective on the Holy Spirit should be the very thing that disqualifies me. But if Paul’s instructions regarding the freedom of conviction are any indication (Rom. 14), then I may not be condemned after all. Blaming the divisions in Christianity on interpretation is only tenable from a position of hegemony. Once one sect becomes the dominant representation of the faith, then everyone else becomes a heretic. If, however, we focus on those articles that unite us—the supremacy of Christ, for instance—then we become less enamored of those articles that divide us.

Maimonides brings much to bear in this discussion but does not offer many solutions. And therein may be just the point. We presuppose for some reason that the “fracturing” of Christianity is a problem, confusing unity of spirit with unity of thought. But if God has given us the freedom to interpret, then does misinterpretation automatically indicate divine disapproval? And if we obey the Augustinian injunction that all interpretation should lead to the double love of God and one’s neighbor, then do we not choose what is best?

I believe this may be one of the most important questions I’ve ever considered in these pages. While it may appear that there is a bit of hyperbole in that statement, it is nevertheless true that what, how, and why we interpret are fundamental to a life lived in pursuit of truth.

The Braveheart Dilemma: Scottish Nationalism and the Vote for Independence

In 1320, the Declaration of Arbroath, signed by Robert the Bruce and more than fifty noblemen, was sent to Pope John XXII with the intention that he would recognize Scottish independence. Prompted by the political efforts of the Bruce, and by the earlier military efforts of William Wallace, the Declaration was an achievement akin to the spirit of the Magna Carta, if not with its strength. It supported the Bruce’s royal claims, arguing for his right to rule his own Scotsmen against the presumed rights of the English king Edward II. Asserting itself to be the vox populi, it held that “It is in truth not for glory, nor riches, nor honours that we are fighting, but for freedom—for that alone, which no honest man gives up but with life itself.”

I have held mixed feelings about last week’s referendum. I have also held off discussing them until now (in the aftermath of which they may be even more unwelcome). The perspective of an American may be unsolicited in an election of a sovereign nation, much as it is when other nations weigh in on our electoral college. It is, of course, easy to play the Monday morning quarterback—which is not my intention. Rather, I hope to speak briefly to those who may not fully grasp the implications of this particular decision.

On the one hand, a national movement for independence is understandable and in keeping with our own political heritage. Freedom is the natural desire of all peoples. States work better when they are smaller and less encumbered by bureaucracies. Citizens feel more engaged and empowered when they feel their voices are heard. Localization of control increases both the efficiency of government and the power of the people. Scotland has felt disenfranchised from its faraway rulers since the 1707 Act of Union. Such has been the criticism of England from its global subsidiaries for many centuries, and since the mid-twentieth century British colonies have been replaced by local, sovereign governments. The United States, Australia, Kenya, and India have all outgrown London. Why should Scotland be any different? Scottish complaints against Westminster are legitimate, and they demand greater attention from both Parliament and Downing Street. Their yearly “allowance” from England is a pittance, and like a teenager who is never permitted autonomy (or like a thirty-year-old living in his parents’ basement), a nation governed by a distant father is never likely to grow up.

On the other hand, I feared that Scotland would be crushed under its own weight. Unfortunately for them, they will never have the chance to prove themselves. The United States and India are not bound geographically and culturally to England; their departures from Great Britain were natural, evolutionary outgrowths of those disunities. But for a number of historical divisions, some of them violent, Scotland has always been tied to England, and is as much a part of its fabric as Massachusetts is to Maine and Maine to Oregon. I recognize, though, this analogy is not perfect, as Scotland is not a state constituted under a federalist system. Further, unlike the American Tea Party, which wants less government intervention, the Scottish nation would exercise more. The National Health Service, which each British citizen appears to value so highly (and blindly), would have been permanently enshrined as an entitlement in the Scottish Constitution. Such a binding law would allow no flexibility in a national budget. Taxes would have to be raised to pay for the inevitable expansion of this program. Scottish nationalists claim that control over their own resources—timber, water, and oil in particular—would offset these costs. But one entitlement produces many more, as we have seen in our own country. And a people given over to entitlements lose the very right of self-government that they eagerly proclaim.

Why should any of this matter to uninterested Americans across the pond? For one, it deeply affects the nature of our “special relationship” with Britain. British culture encompasses the Scots as well as the English—not to mention the Welsh and a few Irishmen. The British economy and military are the only powers in Europe that might successfully stand against the rise of another tyrant. Tumult within invites advances from abroad, which we have already witnessed in the rise of the Russian bear. As a leading member of NATO, the UK can ill afford the loss of important peoples and resources. I am aware of the paternalistic overtones of such a sentiment, but while sentiment is not by itself reason enough to remain in union, the “stronger together” rhetoric is not only rhetoric.

Additionally, Edinburgh’s calls for secession are occurring around the globe and within our own nation. Catalonia has wanted to break from Spain, and Flanders from Belgium. Corsican dissidents have recently resorted to violence to shock their French rulers. And let us not forget that Putin annexed the Crimea only after its people publicly claimed a stronger affinity for Russia than for Ukraine. Here at home, Texas citizens are boasting that their infrastructure and oil supplies (though likely not their water reservoirs) are strong enough that they need no oversight from Washington. People around the world are growing increasingly fed up with the political class.

The conventional wisdom, in this country at least, is that the right to secede was negated with the resolution of the Civil War in 1865. But I believe this did not so much resolve secession as temporarily quash it. In the aftermath of the Civil War, no rule of law determined that the South lacked legal standing to leave the union; in fact, it was President Lincoln’s more powerful army that determined the extent of their rights. No Supreme Court decision interpreted the perpetuity of the Union until Texas v. White in 1869, well after the Confederacy had lost the desire to fight.

If the modern nation state is to survive, then it must be bound together by the rule of law. Secession, by its nature, rejects one law, purportedly to favor another created by the secessionist. At the same time, the Civil War, and all such wars, teach us that the rule of law is only as strong as the ability to enforce it and the will of its people to believe in it. The world is, and forever will be, ruled by might and power. While law is the foundation of a stable civilization, it is only a check against the aggressive, natural state of mankind. Thus, it is disconcerting to ponder: if one nation breaks away from another because of political disenfranchisement, will the next generation see yet another division? If the American Confederacy had been successful in its aim of creating a second United States, then how long would it have been before Alabama found itself disenfranchised from Richmond? Had the nationalist movement been successful last week, how long would it have been before the more populous Glasgow felt estranged from the capital of Edinburgh? This, too, is not an entirely adequate argument to reject independence, but it does remind us that independence is not absolute.

Thankfully, Scotland exercised its right to decide its destiny through the democratic process. This is both legal and socially constructive: liberalism at its finest. No revolution has been necessary, and though a sizable portion of the Scottish people are deflated, they have not taken up arms in support of their cause. Their behavior in this gentleman’s quarrel will hopefully urge Westminster to uphold the gentleman’s agreements made during the debates. Powers of devolution should continue to be invested in the Scottish Parliament. And with time, perhaps a new generation will see the value in severing themselves from London.

I am deeply sympathetic to the independence movement. I have heard and seen firsthand the yearning for sovereignty. William Wallace and Robert the Bruce remain the founding fathers of Scotland’s national identity. Braveheart is perhaps the most poetic expression of freedom in cinematic history. But the historical and the contemporary realities are far more complex. And are these feelings sufficient warrant to justify independence for all nations in all circumstances? I can honestly say, I don’t know.

A Defense of Poetry

The title of this post amuses me. Some of my faithful readers (all three of you) will quickly pass this by, assuming the poetic art form has no relevance for an era of science, for an epoch dominated by the very real concerns of ISIS, illegal immigration, and nude pictures of celebrities. Other readers might find this post simply superfluous. They think poetry needs no defense, that it exists alone and for itself. Yet Ars gratia artis may not be a sufficient justification to read literature or to create it. Plato has long been accused of rejecting poetry for its ability to communicate falsehood—a mischaracterization if ever there was one. But everything we do needs defending; for if we do not think to justify our actions, then no self-reflective thought guides them.

Around 1580, Sir Philip Sidney composed The Defence of Poesy, an apology responding to Renaissance charges that English drama was a hotbed of wickedness. Though no one during the Renaissance had ever witnessed an episode of Game of Thrones, these claims against the stage were understandable for an age that, only a few steps removed from medievalism, mistrusted any representation divorced from the control of the church.

The first objection Sidney addresses is that there are supposedly more important things to learn and spend our time on than poetry. Aquinas seems to agree when he asserts in the Summa that poetry is the lowest of the sciences, but he redeems poetry from this irrelevance when he asserts that God has used the lowest science to communicate the highest science of theology. After all, it is true that not everyone can read the complex reasoning of the Summa, but everyone can read Narnia. Thus, if God has used literature to communicate eternal truths, it cannot wholly be a waste of time.

The second objection is that fiction is simply a lie. This is Plato’s problem with the Homeric gods, and why he is mistakenly thought to issue a wholesale condemnation of poetry. But Plato’s objection is rooted in the Greek misunderstanding of Homer’s verse as truth; it does not mean he is against literature—especially since he uses literature as a vehicle for his philosophy. We should not confuse the vehicle with its passenger. (Though I have yet to meet one, it is possible that some very smart people drive Smart cars.) As Sidney asks, “shall the abuse of a thing make the right use odious?” Is erotic love wrong in marriage simply because others wrongly use it outside of marriage? Is social media an evil because more people use it for evil than for good? (Yes, I’m talking to you.) Wisdom is necessary to discern between right use and wrong.

The final objection, still touted by the fundamentalist inheritors of Puritanism today, is that poetry urges us to think on evil. Through book and screen we witness injustice and corruption, we watch people be murdered, we voyeuristically participate in lovers’ passions. There is some argument to be made here, but we must distinguish between the erotic and the pornographic. The erotic displays beauty, which should point us toward the God who created beauty; the pornographic turns inward, twisting the erotic toward base impulses that only reflect and gratify the self. Sir Guyon of The Faerie Queene, for instance, sees the bathing beauties in the Bower of Bliss and longs to go to them. Jane Eyre wants desperately to marry the already-married Rochester. Both Spenser and Brontë know their readers will be seduced by the imaginative possibilities presented in their narratives and want us rooting for our heroes to give in to their bestial natures. But herein lies the point. Poetry can deceive us into accepting its premises and promoting immorality. Few authors are as skilled as Spenser and Brontë at then pulling the rug out from under us and reaffirming the Christian truth that we had forgotten in our unreflective consumption of literature. Poetry, therefore, is not morally bankrupt but is itself a form of moral currency to be deposited in our minds.

Perhaps the final objection demands the most attention. Yet the problem seems not to be that we don’t trust stories, as might have been the case in Sidney’s day. After all, how many people do you know who refuse to go to the movies or watch television out of some purist sense of principle? Most people, believers and pagans alike, raptly follow the compelling serials, blockbusters, and trilogies. What is needed in our day, then, is not a defense of poetry but a defense of good poetry. Anyone with moving brain waves can watch a film (maybe even those with flat-lined brain waves can watch), but it takes effort to watch film well. All can consume, but we must urge reflection as and after we consume.

The first obstacle to properly encountering art is, unfortunately, ourselves. We myopically value only that which we already like. Alexander Pope warns in An Essay on Criticism: “Fondly we think we honour Merit then, / When we but praise Our selves in Other Men.” Everyone thinks they have good taste, just as everyone assumes that God must see the world the way they themselves do. But taste must be cultivated, like intelligence, the muscles, or any other faculty. The more cheeseburgers we chow down, the less we will be able to distinguish between a Cabernet and a Merlot, or between different years of a Pinot. The more Modern Family or even Facing the Giants we watch, the less we will be able to read, understand, and discern the Christian powers at work in Spenser or Brontë. We become what we eat, and this is no less true of our minds and souls than of our stomachs.

For the reader still unconvinced of the premises in this treatise, little may be done to compel him to pick up Keats. But for the man who searches eagerly for truth, he may yet find it in the eloquence of poetry. May we then surround ourselves with beauty and contemplate the higher things.

The Deity Made Me Do It: Christian Fatalism in American Evangelicalism

When does God make us do things?

A loaded question, to be sure. And one pored over by theologians for millennia. It is not likely that we will solve the problem here. But I have been intrigued by some of the statements I have heard issued from the mouths of evangelicals.

  • “God doesn’t make mistakes.” This comment has been used as an explanation for every human action, from sinners showing up in church at an exact moment to saints finding their “soulmate” to others justifying their sexual behavior. And while I believe it true that God doesn’t make mistakes, it does not follow that everything which occurs in life is controlled by God and is therefore predetermined.
  • “There is a reason for everything.” I don’t know how many times I have heard this one. It is frequently tossed around, a buzz-phrase designed to mollify our doubts and suffering. Its speaker is often throwing up his hands and resigning himself to some sort of disappointment. Rarely does the speaker realize that he himself may be the reason.
  • “God causes everything for good.” Such is an alternative statement to the one above with some slight modifications. I understand the good intentions of this sentiment, but it is a misquoted, misused verse, hacked up and hackneyed to mean that God is the cause of all things—Hitler, the Khmer Rouge, and ISIS among them—with a definitive end game in mind. Though I assent to the telos of history, I cannot fully embrace the unscriptural notion that God intentionally causes wickedness in order to create good.

To assume God is the originator of all human events ignores the power of evil, and by extension must presume that God is unjust. As I said, we won’t be able to outdo the Augustinians or the Thomists here, who have wrestled with this quandary ad infinitum. But, in a few short moments, I hope to let a little light into the very real darkness of Christian fatalism.

Though not a popular label, Christian fatalism holds that God manipulates every event in our lives, from what clothes we wear each morning to where we will ultimately die. He does this, presumably, because He knows more than we do and because He wants each and every event to work out in our favor. As soothing as this concept sounds, I think it foolishness to hold that this is the rhyme and reason of every act in the universe, as if we were all Trumans in our own carefully orchestrated Show. However, neither do I believe the universe is governed by luck (this is perhaps the keenest of oxymorons). Chance appears to be just as brutal, though maybe not as oppressive, as the notion of the divine dictator. If nothing is providential, then God is indifferent, choosing not to intervene in human history—something which scripture would vehemently reject.

The first and most superficial response is the ever-popular retort, “What about free will?” as if this forever solves the predestination issue. (I have heard this red herring in more theological debates than I care to rehearse.) We who stand against the Calvinist tradition argue for free will incessantly while failing to acknowledge that scripture says less about free will than about predestination and God’s sovereignty. The concept of the human will as we know it wasn’t even fully codified until St. Augustine in the fifth century. Nevertheless, it remains true that human choice runs throughout holy writ. Individual agency is at the heart of God’s love for us. As Horace aptly stated, “To save a man against his will is the same as killing him.”

Within the context of God’s purposes, He certainly works within natural evil to create positive results (Rom. 8.28). It also seems clear that for those who love Him, God has special purposes in mind. But we must remember that the Good is not always clear to our way of thinking. To reach the Good, we may have to endure natural evil. Conversely, what we believe is the Good (for instance, personal popularity) may be the very thing that creates evil.

I hope I am not too apophatic with my theology here, but I do think we frequently misunderstand the Good in favor of whatever we think is good for me. We trust that God doesn’t lead us toward evil (2 Cor. 10.13, Jas. 1.13-15). And we have faith that God is the originator of all good gifts (Jas. 1.17). But we must first learn to discern good gifts. Did God give me the job that made me work longer hours and lose touch with my children? On one level, the job was a blessing and what I did with it a curse; on another level, had my priorities been family and not money, I might have refused the offer in the first place because of its potential to distract me from my family.

The Western church would do well to recall that the Christian ethos—up until the 20th century—was in enduring suffering, not escaping it. This is something our Eastern brethren learned well, centuries before we in the West forgot it. After all, success—in finances, in one’s career, even in ministry—can be a trap rather than an indicator of God’s favor. Our prayers for relief rather than for endurance, or for success rather than for godliness, may be robbing us of transformation. God, out of His great providence, will sometimes give us the desires of our heart. Even if those desires will lead us away from Him, they reveal that we were never truly close to Him anyway. Our vision of God and His purposes, therefore, must be tempered by the active power in the Word.

So when is a good truly a Good? An existential approach can be helpful here, mistrustful as I am of the assumptions of existentialism. When experiencing a good, be it an open parking spot on a hot and hurried day, an admission to the college of one’s choice, or an amazing wife beyond one’s imaginings, we should attribute it to Christ—even if we cannot be certain that God dropped that particular blessing in one’s lap. Seeing the work of God where it may be always beats being unable to see the work of God where it is. Yet I also believe it true that if we deliberately choose to be moved by the Spirit of God, He will move us where He wills. I don’t mind thinking that God used me in a particular place and time when I have decided I want Him to use me. But in this I have to be careful because God is also not bound by what I think about Him. Therefore, if He wants to use me—as either a partner in good works or as a cautionary tale—He will do so.

My fear is that Christian fatalism will become fatalistic. In other words, if we hold that God is responsible for everything that happens, then all of life is inevitably reduced to destruction. While final apocalyptic cataclysm is certain, God’s ultimate joy is not in chaos but in creation. Thus, our souls are not simply to be redeemed and translated to the heavens but are first to be sanctified and transformed. If this is not true, then we should, as the fourteenth-century poem Piers Plowman suggests, pray for death immediately after baptism. Sanctification and transformation are lifelong processes leading us deeper into Christ and preparing our souls for the ecstasies of eternity.

Fatalism resigns itself to evil without addressing it. It gives up with the hope that something like the Second Coming will eventually redress all wrongs. This is indeed the residual substance of our Hope, but such Hope should stir us to confront darkness with the light—without the giddy expectation that it’s all going to burn anyway. An apathetic faith is, in the final analysis, a pathetic faith.