Media Matters: Truth (?) to Power (!)

A fire is being ignited in America today, its flames evident in the bizarre media reports floating around the websites and airwaves. The various controversies surrounding Lena Dunham. The Rolling Stone rape story and the frightening lack of due process in college rape cases. The (mis)information disseminated during the Michael Brown incident and the premeditated riots that followed. Now, it would seem, is the winter of our discontent. One wonders not only what has happened to the country but where all of this unrest is coming from—and where it is going.

The goal, it would appear, of these characters in our seriously silly play is indeed the unrest: the overthrow of the social order. Romantic notions of revolution seem to occupy the minds of these figures, all in the name of making this world a more “just” place to live. Racial, sexual, and aesthetic marginalization demands, we are told, a loud and sometimes violent retort. Forgetting the hard lessons of the French Revolution—genuine injustices committed in the cause of equality—they press on toward some indeterminate end. The fire, like that same one sparked in Gotham in The Dark Knight Rises, is given the name Justice but is actually its opposite.

What seems to be unifying each of these causes, real or imagined, is the desire for power. Populism, of the left- and right-wing varieties, taps into the generalized national angst and into specific local or community concerns. With the rise of demographic-focused politics beginning in the 1960s, we have all become segmented into competing special interest groups jockeying for attention. Black, brown, Anglo, feminist, gay, religious, secular, rural, and urban all want a seat at the throne. Assuming one’s cause is just and the other’s not, rarely do these groups think about the ultimate conclusion of their philosophy. They want it—and now.

I cannot help but wonder whether these flames are fueled by the media landscape. And I don’t mean The Media in some sort of conspiratorial, industrial complex way (although there might be room for that sort of discussion). I mean the greedy, irresponsible, crisis-driven reporting of events that has captivated mainstream, cable, and internet news.

The bent of journalists and editors toward crisis existed long before the early twentieth-century days of the muckrakers. But at that time, news existed in a one-week (newspaper) or twenty-four-hour (network) cycle. Today, we’re dominated by a four-hour news cycle, which means we have terribly trifling attention spans. It also means we’re dependent on dramatically shifting media narratives. Our information is filtered through the biases and errors of eyewitness accounts, grand jury testimony, and, for most of us, the news outlets that report this information. We cannot pay attention constantly, yet we want to be informed—and so we have little choice but to trust (even with some skepticism) those who report those events.

But sensationalism rather than truth has become the medium of exchange for modern news outlets. Whatever will make ratings, sell magazines, or get the most likes in the moment is what matters. Consider, briefly, the news stories above. For these few weeks at least, this is the fare to which we have been sumptuously treated. But then what? Into what abyss are sent the stories once furiously occupying our televisions and news feeds? What was the result, for instance, of the congressional investigation into Benghazi? What is the status of the prison at Guantanamo Bay and the rendition sites around the world? Why was it not boldly announced that Lois Lerner’s emails had been miraculously found after all? Where is the coverage of Russia’s incursion into Ukraine? What is happening in the streets and parliaments in the aftermath of the Scottish referendum? (More importantly, why is the world’s supply of chocolate running short?) And when something of great significance does occur, like the surprisingly tame and remarkably conservative pronouncements of Pope Francis in the last few years, more is made of it than it might actually deserve, or it is reinterpreted to fit a perceived pattern. The media builds the intensity of the crisis and then rides the ideological waves of change, claiming front-row seats to history. For without change, there is no history—and no ratings.

The thirst for ratings taps into the sensational and least discerning of the human faculties. Further, when it reaches those parts of ourselves, the emotion it most likely generates is fear. This is especially true of those with an agenda. Those who seek power must use fear to cow their opponents into silence. The oppressed, dispossessed, and disenfranchised rarely remember the abuses suffered, real or imagined, under the former regime; or they remember their abuses so clearly that they invoke the lex talionis against their oppressors. Anger is a poor substitute for wise thinking, yet it seems that the angrier someone is, the more others will pay attention. Our characters are not motivated by virtue but by air time, book contracts and movie deals, and stretching their fifteen minutes into thirty.

American consumers seem at least tacitly aware of this conundrum, as they rarely let media events dramatically challenge their worldview. School shootings, for instance, almost always precede a brief upswing in public calls for increased gun control, which plummet back to previous levels a couple of months later. This has both positive and negative components, for people often stubbornly refuse to believe what their eyes tell them; yet if their eyes are being manipulated, then their refusal is justified. Lacking trust, we wind up believing whatever fits our preexisting (meta)narrative. If I think that police brutality against African Americans is more common than the media reports, then I’m likely to feel sympathy for the Michael Brown family and the entire Ferguson community. If I think police brutality against African Americans is exaggerated, I’m more likely to pity Darren Wilson and law enforcement officers. Both factions believe they are pursuing truth, and perhaps they are, yet the result is chaos.

Torn as I am between cynicism and optimism, I yet fear a society so ungoverned by virtue that we would abandon our institutions in the vain hope of finding money, fame, and attention, only to despair in the meaninglessness of our discovery. Long gone are the days of acolytes seeking to honor their civic oaths, their business ethics, or their religious orders in humble service to a greater good. We have fallen far from the examples of King and Mandela, of Jefferson and Monroe. But then again, they were men with direction and, we can reasonably argue, a commitment to virtue. How many of our journalists, celebs, and pols can say that?

But perhaps truth is not found in the news ticker or feeds. Truth must deal with objective facts, yes, but it must also transcend those facts. Truth is never its own end, though it should be hunted with the same moral and intellectual vigor as if it were. Rather, truth is the means by which the individual moves to action. While I cannot know the facts in the Michael Brown case beyond what I read of the grand jury testimony, what I do know is that one young man lost his life and another his livelihood. Such (limited) knowledge is likely to throw me into ambivalence or despair, neither of which is productive for the good life. Emotional wallowing would seem to be the default position here. Yet if I choose to employ the truth I can know in pursuit of the good I can do, then I can help build a world I can dream of. Such aspirations are found in the simple choices of life, ones that will never receive attention in television reports, but ones that will make a lifetime’s worth of difference for each person and in the lives of those around them.

Media truly matters in a free society when it becomes the arbiter of facts and truth. Better that we, in our pursuit of virtue, become more discerning in our consumption of media.

The Tyrannical Daemon

I hate clichés. And I hate bandwagons. And, with a few exceptions, I hate people. My misanthropic tendencies arise from a superiority complex that values thoughts over feelings. Seeing thoughts in short supply, my disgust with the musical tour bus similarly develops from the kumbaya singing, self-realization, and chakra formation inherent in most groupthink. And my distaste for clichés comes from a combination of the other two aversions: if you’re going to say something profound, it should be new and original, I often say.

But my distaste for clichés does not make them wrong. Perhaps some profundities last in the cultural consciousness because of their deep truth married to simple expression. I haven’t fully figured out why cats are overly nervous in a room full of rocking chairs, nor do I understand why airing dirty laundry is a problem if everyone’s clothes are stained from time to time. And though a rose by any other name would surely smell as sweet, “rose” is a far more appealing word for a flower of love than the phonetically abrasive “maggot,” “moist,” or “lugubrious.”

Nowhere are these observations more true than in the clichéd expression rocketing around Christendom that “God is good all the time. And all the time God is good.”

But what about when God is not good?

We trust in God because he is all-powerful. But what happens when he doesn’t use that power to protect our home, to allow us to retire, to save our loved ones from death? We trust in God because he is all-knowing. But what happens when our knowledge seems far more accurate than his (as my son recently informed me of my own weak knowledge compared to his own)?

Christianity’s answer to the existence of evil has been challenged by men no less insightful than Voltaire. Should God be good, then why do so many bad things happen? Should God have immense power, then why does he not act on that power to prevent evil? Should this earth be not the best of all possible worlds, then God may not be good and loving but instead a tyrannical daemon who treats mankind like puppets on a stage, or, as in Voltaire’s Candide, rats on a ship. Satan’s rationale to Eve in Book 9 of Paradise Lost sounds frighteningly accurate:

“God therefore cannot hurt ye, and be just; / Not just, not God; not feared then, nor obeyed: / Your fear itself of death removes the fear.”

Not just, not God. If God is defined in the superlative, then he is all justice. But if he does not enact justice, is he really as wise and potent and beautiful as we would like to believe? Theoretically, God could also be all anger and jealousy as well. As the poet William Blake muses, the being who created the innocence of the lamb also created the ferocity of the tiger.

But I think we get lost in the clichés and happy moralizing of the Christian walk. We love God because he first loved us. We love God because he gives us stuff. We love God because we fear not to. Do we love God because he is?

I believe the story of Job exists for this reason: to show us God’s darkness as well as his light. When everything is stripped from Job, his family, his riches, even his identity, all he has left is God. And while he believes God wrong, he still says, “Though he slay me, yet will I hope in him” (13.15). What faith must it take to hold fast to God even when he shows himself unjust? Even in his darkness, God is still far brighter than us—and it is only because of his ineffable, marvelous light that he can use the darkness. For God is not merely an instigator of darkness but one who saves within its shadows. Which means he is good because he cares. Pouring out one’s wrath on an innocent son is definitely not good for the son. Yet it is just for all of humanity whom the son volunteered to save.

I expect an intense moment of surprise will hit me when I arrive in heaven. Unencumbered by flesh, I will be able to look in the face of Christ and recount the times when I was so ungrateful, when I doubted, when I even hated him for what happened in my life. How could you let in a wretch like me? And I imagine he will pull out a mustard seed and say, “Thank you for believing in me.”

We are the ones who lack vision. The petty disagreements we have with one another. The self-induced stress of worrying about tomorrow. The trials of the day. These moments become so self-defining yet are insignificant next to his incomparably great power for us who believe. The truth is that our knowledge is so finite that we cannot know why God does what he does. We can never know what trials and tragedies he saves us from experiencing. We cannot answer our doubts with information. Sometimes we must trust the tyrannical daemon.

Is God love? The cross tells us he is. Is God good? Most assuredly. But he does not always show himself good. Sometimes he shows himself unjust. But I think that has more to do with our perspective than with a change in the character of God. It is true that he shows himself pure to the pure, faithful to the faithful, devious to the shrewd (Ps. 18.25-26). Much like the son being punished or the daughter not receiving her cherished ice cream cone an hour before supper, the parent sometimes seems harsh and unloving. Indeed, sometimes the broader perspective of the parent demands they be unloving, if we define unloving as refusing to give children what they want rather than what is best for them. Only as the child grows does he realize the operations of fatherhood. If this be true, then we can still say with confidence, quieting all doubt…

God is good.

An Editor’s Life

For the last several months, I have taken up the responsibility of editing The Journal of Faith and the Academy, a publication of the Institute of Faith and the Academy at Faulkner University. The role of an editor, I am learning, is more complex than simply organizing and formatting articles. It is an amalgamation of roles: writer, marketer, logistician, researcher, counselor, and politician. So taxing has been this new assignment that I have found little time for creativity, recreation, and writing of my own. I might spend some time lamenting this fact, but your time is short, so I will edit that digression and move on to more interesting thoughts.

Some people confuse editing with proofreading. In truth, proofreading is easy. Grammar follows a fairly standard set of rules. Convention and use have laid the path, and the writer needs merely to follow it. A prescriptive doctrine of subject-verb agreement, dangling participles, and (there it was if you just missed it) the Oxford comma acts as a railroad track to lead readers to clarity.

Leading readers to enlightenment, however, is a more arduous journey. As an editor, your goal is to ensure that the writer says what he thinks he says—in such a way, of course, that will make sense to the audience. Standing in as the imagined reader, the editor must walk the middle ground of statement and intention. In this respect, the editor should do as little revising as necessary out of respect to the author. Time, if not ethics, does not permit the editor to meddle. And yet, there’s always that one author who can’t seem to stay on the rails and whose train just refuses to climb the hill, that one whose lack of clarity screams out for a firm hand…

Nevertheless, revision errs in making the author say what the editor wants him to say. That impulse is always there—for surely I could make this idea more profound, could make this more obscure connection, could argue this point so much more clearly. But it is an impulse that can only be suggestive, not prescriptive. For if Dumbledore is correct, and words are the most powerful magic we wield, then the responsibility of handling the work of the author is a weighty one indeed.

Thought is composed of language, and language has the unique ability to affect thoughts, introducing ideas that can later become whole perspectives on the world. With this in mind, it is no large step from the grammatical to the psychological, even less from the psychological to the metaphysical.

What parts of our lives do we edit out? Surely we edit our past, as I have remarked elsewhere. But I think this is even more true when it comes to our interpretation and transformation. First, as believers, we attempt to edit out those behavior patterns, thoughtless actions, and evil intentions that prevent us from writing a masterpiece of our lives. We attempt—sometimes successfully—to delete gossip, lying, lust, or any other sin that would tarnish our divine likeness. This is, to a large extent, necessary. Yet at the same time, there is no good story without conflict, and the Father has chosen to make his kingdom out of sinful citizens. In the city of God, the weakest characters are those in whom Christ shines the brightest. He is, after all, the master editor.

But good editing can only occur when we have a good style guide and know how to use it well (I fear my metaphor is descending into obscurity and banality, but bear with me). Doctrine is fairly easy to grasp, as it is determined by scripture, tradition, and local (sometimes global) authority. Interpretation is much harder. One can become lost in the minutiae of grammar, denotation and connotation, and etymology, and never see the symbolism, allusion, paradox, and overall unity of the work. Sometimes I think we become so hung up on the task of proofreading that we neglect the more important work of editing. The Christ gives meaning to Hosea’s refrain of desiring mercy and not sacrifice, spiritual metamorphosis over religious purity. Focusing on doctrine means never making the mistake of revision (though doctrinal purists commit their own brand of revisionism as well). Yet it also means missing out on the importance of editing. The work of editing lies not only in laboring to make one’s life a reflection of the Author, but also in partnering with him to write a beautiful poem of one’s life in the larger poetry of humanity.

The task is not an easy one. It requires a right intention, an observant mind, and a steady hand with nimble fingers. But, when accomplished, the right words can create redemptive ideas.

Pop Culture, Pop Guns, and the Politicization of Celebrities

In the midst of Ebola scares, a midterm election, and the rise of the Islamic State, you probably knew (not that you could help it, but much to your shame) that George Clooney recently married, that Chris and Gwyneth held a divorce ceremony, and that Renée Zellweger used more plastic than a Lego factory to change her face. This is hardly news, and yet it occupies the headlines of current events from The Sun to the Wall Street Journal.

Celebrities are not a new phenomenon. People have been breathlessly taken with their rulers since governments first formed. People gravitate toward the rich and famous. It not only gives us, the little people, heroes to follow but reaffirms our belief that one day we can “make it” too. If Taylor Swift can become famous with little more than a smile, then maybe I can achieve fame and success. Most classical writers had fans who might have hashtagged them were it not for the absence of social media and the threat of execution. It was, at least in part, Julius Caesar’s popularity with the mob that contributed to his assassination—desires to overthrow the republic notwithstanding. And while politics is merely show business for the, shall we say, aesthetically disenfranchised, media figures distastefully use their attention-grabbing status to opine on everything political from environmentalism to economics.

As an antagonist of pop culture, I find it difficult to support the pet causes of people whose greatest claim to fame is pretending (albeit convincingly) to be someone else. If actors make their living fictionalizing, it is hard not to cynically wonder if their outrage is similarly feigned. Other celebrities have even fewer talents, like being able to sing better than most—and if they don’t possess that talent, they can at least remove more clothes while singing subpar. For this reason alone, their social and political judgments should be discounted.

I think many media figures are self-consciously aware of their own inanity, which is why they must expand their platform to be taken seriously. They don’t cure cancer, but they can make it look like they care about curing cancer. To some degree, this is endemic to their industry. For a profession that must outdo itself every summer season, bigger must be better. Yet we may or may not be reaching a point of diminishing returns for the theatre-going audience; time, as in all things, will tell. Are we no longer able to suspend disbelief in the predictable highwire jumps, slow motion effects, and CGI? Somehow I doubt it. It is more likely that the convenience of home viewing is outpacing the perceived value of soaring ticket prices, popcorn and soda that cost more than a McDonald’s value meal, and the annoyance of sitting with loud and socially inappropriate viewers.

But if there is any cultural backlash against Hollywood beginning to assert itself, it may (and this is a really big MAY) be rooted in the celebrity phenomenon. It is hard to take seriously, for instance, a video of celebrities condemning gun violence when the blockbuster movies they sell make money through crafted explosions, fake bullets, and fictional body counts. To my knowledge, the only action hero who has ever had to account for his actions is Jack Bauer, who appears before a congressional committee to explain his counter-terrorist measures in the seventh season of 24 (apparently you can’t kill 309 people without someone taking notice), and even then his use of torture is considered sufficiently justified to allow him to save the world for another three seasons. But when the Avengers help save New York City, we never see the President granting emergency federal funds and the painstaking process of rebuilding billion-dollar skyscrapers. Rarely do we see one of Denzel’s characters facing guilt-ridden sleepless nights for all the bad guys he’s shot in the line of duty. We don’t hear the residents of Westeros standing up and saying “Enough” to the violence. Ironically, if the American news media saw the residents of Westeros standing up and saying “Enough” to the nudity, incest, and general promiscuity in town, they’d be heckled as too Puritanical for a more enlightened age.

The beauty of fiction is in paring down a story to its most essential elements; we are told only those plots that communicate the greatest meaning. Though 24 takes place in real time, we are thankfully spared Jack Bauer’s real needs to eat and pee (fighting terrorists never allows for a bathroom break—except perhaps during the commercials). This is an acceptable aspect of fiction wherein we the audience suspend our disbelief. Yet the downside is that we consumers of pop culture never see the day-to-day reality of what would occur outside the narrative. We only see the sensational events but never the consequences.

And herein lies the problem of the political celebs. They appeal to the low-information voter, urging them to act without ever considering the costs of action, much less the benefits of inaction. Ignoring the realities of everyday life, they consider neither the end result nor the unintended consequences. Because pop culture reduces everything to the lowest common denominator, the least truth, the least goodness, and the least beauty are merely enough—though the least never is. While this is certainly democratic, it means the most superficial engagement suffices for electoral success.

In the marketplace of ideas, there will always be bad ones. In the media landscape, we will always find stories of no inherent value. But this also means we don’t have to justify their existence with our attentiveness.

The Middle of King Lear

Sitting high atop the dung heap of humanity, the solitary weeper weeps alone. Like the biblical Job, he laments his fall and cries at life’s injustice. He tears at his hair, scratches lines into the dirt, speaks nonsensical gibberish. He speaks without meaning, and we are left assembling broken puzzle pieces, vainly trying to create meaning. Three dozen other audience members and I watch the actor transform into King Lear, witnessing high art yet at a loss to know what to do with it.

What is insanity? When I ask this question in class, my student athletes claim they know the answer. Doing the same thing over and over again and expecting a different result. Far be it from me to contradict a generation of coaches looking to motivate their players, but that answer belongs to the question, What is stupidity? (I’ll take “Worn-Out Clichés” for $300, Alex…).

In truth, insanity is the inability to see reality as it actually is. Having never been insane (that I know of), I feel unequipped to explore the answer to this question. It is not as comprehensive as, What is love? nor as abortively academic as, What is death? But an insane person, I conjecture, cannot—or chooses not to—live in the real world.

But then this raises the question, What is real? I think we can safely answer that point by asserting that the real is what everyone accepts the real to be. Everyone accepts that the sky is blue, that gravity is irresistible, that murder is wrong. Yet legal experts and theologians can find plenty of instances where murder is acceptable; does that then mean that the majority view is insufficient? It may be, but exceptions do not fully invalidate the morally secure position of the majority. But then what if a minority asserted a metaphysical truth (the belief, for instance, that a god could rise from the dead) and claimed a new moral code with which the majority did not agree? Would they be considered insane? And if so, must they be dealt with by society, using means that might appear harsh yet were perhaps necessary to protect the majority?

You see the problem?

Insanity cannot be defined as disputes between moral viewpoints, nor even fully the inability to relate to the world as it is. Insanity may be a clinical definition for a psychological world of schizophrenia and psychotic breaks, though I prefer the more antiquated and literary term madness. Madness removes, sometimes violently and often temporarily, an individual from the external world to a world within. It is not an affliction of the brain but of the soul.

Shakespeare’s King Lear addresses the unimaginable horrors of old age, of isolation, and of its accompanying madness. We agonizingly watch, scene after scene, as Lear’s very identity is slowly stripped from him. He loses his deepest love in his daughter Cordelia, loses the presumed affections of his other, duplicitous daughters, and loses his kingship and his home. Finally, having lost everything else that would identify us as human and give us meaning, he loses his mind.

In the one-man performance of The Middle of King Lear I was fortunate to see last month, Royal Shakespeare Company veteran Cedric Liqueur portrays Lear alone on the heath with nothing remaining to him—not even his mind. He pours forth a litany of Lear’s lines from the iconic original play, but in isolation, without anyone to talk to, the words literally lose all meaning. Deeper into his soliloquy, Lear begins speaking to the audience members around him, selecting one observer to be his Edgar, one his Kent, two his Goneril and Regan, one his Cordelia. He looks at all of us surrounding him, and the stage becomes a place where we become the silent ghosts of Lear’s past. What would he have said to them—to us—in his madness if he could?

Notably, one of the markers of insanity is talking to one’s self. What if the mind, poisoned by pain and despair, imagines its friends, its families, its enemies—recreates them out of the heavy air of dreams? What if madness is simply trying to reconstruct a new reality out of what might have been? Further, what if we, the sane, scornfully watch the insane, never realizing the futility of our own state? We too try to reconstruct new realities: comfortable decadence, insulated from the darkness of the world, fantastical utopias. Perhaps we are mad for ignoring the darkness and trying to replace it with false light.

One of the constant refrains of Man of La Mancha is that human beings are born into a dung heap (which is not a skewed perspective in seventeenth-century Spain). It is Don Quixote who is thought insane because he refuses to lose hope. Or rather, he creates hope with fantasy, choosing to believe the world is a finer, richer, and more magnificent place than it actually is. And in believing it to be so, he infects others with his redeemable lunacy. Don Quixote asks, “If life is insanity, what then is madness?”

The human spirit must rise above the human condition. If we hope for more from this life—and potentially from the next—we must transcend the dung heap. The revered Union general Joshua Lawrence Chamberlain, having hunkered down before the stone wall at Fredericksburg, triumphed in that inspirational charge at Little Round Top at Gettysburg, and survived the mud and the horrors of Petersburg, was frequently asked how he did it. How did he endure through the mud, the cold, the blood, and the indecency? His response was that he imagined himself a medieval knight, a warrior doing the business of God where little was glorious and all was to be gained. He could courageously face the abyss believing himself something more than himself. He, like my man Don, reminds us that we can lament the dung, burn it all, or build something with it. Most of us are not destructive, but even more of us lack the insane courage to create something significant out of something so filthy, worthless, and wicked. And yet, I am reminded, thus did the Christ.

Is madness innate? Does it manifest itself in some sort of Freudian nightmare? Or do we, forced into the cuckoo’s nest, retreat into madness for an ounce of security? Or is madness the only rational response to an irrational world?

Maimonides’ Quandary and the Perils of Interpretation

Last month, an article popped up on my News Feed, “Does Personal Bible Reading Destroy the Church?” Naturally, a title like this is written to attract attention—and it certainly did mine. The author argues that the development of Protestantism fractured Christianity into 34,000 different sects. Consequently, both insiders and outsiders don’t know what to believe about scripture—and finding out the answers for themselves may be just as misleading and costly as seeking them out within the fenced-in doctrines of one of Christianity’s many sects.

Ironically, this fracturing of Christianity was one of the prophecies leveled against Martin Luther in his enforced break from Catholicism. The church, it was said, would shatter into a thousand pieces, which evidently turned out to be true (give or take ten thousand or so). Catholicism splinters into Lutheranism, which breaks into Protestantism, out of which arises Methodism and the various Baptist fellowships, which eventually devolves into the Cult of Fullman. Without a central authority to interpret and enforce Holy Writ, each man becomes his own glossator of scripture. This is especially problematic for two reasons.

The first is our tenuous relationship to truth. Nietzsche argued, most convincingly, in “On Truth and Lying in a Non-Moral Sense” that most people are not interested in truth, only in what makes them comfortable. We will believe, he claims, what we want to believe—not what is commensurate with reality. While this sounds terribly cynical, there is still more than a little substance to what he says. My subjective, individual will is a more immediately compelling force than an objective, external truth.

We can see this in the way people fashion for themselves narratives about the world. This begins with our parents, as we hear their political opinions, religious convictions, and social commentary. As we mature we adopt, reject, or modify our parents’ narratives. Perhaps we choose to spank our children because our parents did; perhaps we don’t. As we age and encounter new experiences, these narratives solidify in our minds, and it becomes difficult to alter them. When confronted with a new idea that challenges the narrative, people are naturally resistant. How we’ve always done things is sufficient, we tell ourselves. Sufficiency is later replaced with complacency, which resigns itself to the assumption that we have already discovered, understood, and communicated the truth. The first problem is that we may not see reality as it is—only as we wish it to be.

Stemming from this premise, the second problem is that, especially in matters of divinity, only some people may be trusted with the truth. In The Guide for the Perplexed, the twelfth-century philosopher Maimonides acknowledges the very real difficulty in interpreting scripture. Against our largely Western perspective that each person can fairly understand anything, his Jewish take is that the truth is too vast and mysterious a concept to be grasped. Even the learned are perplexed at the intricacies and various shades of meaning implicit in God’s word—how much more so the vulgar? The learned have a special advantage in that they have a greater understanding of the world, and divine science can be approached through natural science. This does not preclude teaching the vulgar, but you can only cast pearls before swine so often before they turn and tear you to pieces. This form of criticism asserts that there is indeed a palpable interaction between text and audience, but Maimonides insists that deeper understanding is only for the committed few. The question becomes not only how we should interpret, but who should interpret.

This is a quandary most 21st-century Westerners would likely not even entertain. So entrenched is our post-literate thinking that we adopt a posture of arrogance, assuming we can understand the deep mysteries of God. But if God reserves special revelation for Abraham, Moses, and the prophets, surely there are things he does not tell the masses. If Christ reserves special explanation for his apostles (Mark 4:10-12), surely there are things the average disciple could not grasp. If the Word of scripture contains “things that are hard to understand” (2 Pet 3:16), surely not all can accurately interpret.

We democratize our virtues and special gifts of the Spirit to our peril. If not everyone is equipped to become a pastor or teacher, should everyone be given the benefit of the doubt in their own personal analysis of a text? If we wouldn’t make each man an administrator, should we deem each man a qualified reader? Dare we go so far as to say that interpretation should be handled by professionals? But then again, what if the interpreters have been poorly trained? What if they are similarly ill-equipped to rightly handle the word of truth? If power over scripture resides in a select few, then to whom will we turn when the few intentionally—or unintentionally—misinterpret?

The dilemma would not solve itself were Christianity suddenly to reunify under one umbrella. Even the popular credo Sola Scriptura from my own and from other faith traditions is rife with problems—since people interpret scripture differently. An argument between an Independent Baptist and a Free-Will Baptist may be hostile and infused with urgency, though from an outsider’s perspective largely inconsequential. A premillennialist or postmillennialist view of eschatology should not disqualify one from salvation or sanctification. Of course, from another perspective these conclusions reveal my own interpretive biases—and maybe my perspective on the Holy Spirit should be the very thing that disqualifies me. But if Paul’s instructions regarding the freedom of conviction are any indication (Rom. 14), then I may not be condemned after all. Blaming the divisions in Christianity on interpretation is only tenable from a position of hegemony. Once one sect becomes the dominant representation of the faith, then everyone else becomes a heretic. If, however, we focus on those articles that unite us—the supremacy of Christ, for instance—then we become less enamored of those articles that divide us.

Maimonides brings much to bear in this discussion but does not offer many solutions. And therein may be just the point. We presuppose for some reason that the “fracturing” of Christianity is a problem, confusing unity of spirit with unity of thought. But if God has given us the freedom to interpret, then does misinterpretation automatically indicate divine disapproval? And if we obey the Augustinian injunction that all interpretation should lead to the double love of God and one’s neighbor, then do we not choose what is best?

I believe this may be one of the most important questions I’ve ever considered in these pages. While it may appear that there is a bit of hyperbole in that statement, it is nevertheless true that what, how, and why we interpret are fundamental to a life lived in pursuit of truth.

The Braveheart Dilemma: Scottish Nationalism and the Vote for Independence

In 1320, the Declaration of Arbroath, signed by Robert the Bruce and more than fifty noblemen, was sent to Pope John XXII with the intention that he would recognize Scottish independence. Prompted by the political efforts of the Bruce, and by the earlier military efforts of William Wallace, the Declaration was an achievement akin to the spirit of the Magna Carta, if not with its strength. It supported the Bruce’s royal claims, arguing for his right to rule his own Scotsmen against the presumed rights of the English crown. Asserting itself to be the vox populi, it held that “It is in truth not for glory, nor riches, nor honours that we are fighting, but for freedom—for that alone, which no honest man gives up but with life itself.”

I have held mixed feelings about last week’s referendum. I have also held off discussing them until now (in the aftermath of which they may be even more unwelcome). The perspective of an American may be unsolicited in an election of a sovereign nation, much as it is when other nations weigh in on our electoral college. It is, of course, easy to play the Monday morning quarterback—which is not my intention. Rather, I hope to speak briefly to those who may not fully grasp the implications of this particular decision.

On the one hand, a national movement for independence is understandable and in keeping with our own political heritage. Freedom is the natural desire of all peoples. States work better when they are smaller and less encumbered by bureaucracies. Citizens feel more engaged and empowered when they feel their voices are heard. Localization of control increases both the efficiency of government and the power of the people. Scotland has felt disenfranchised from its faraway rulers since the 1707 Act of Union. Such has been the criticism of England from its global subsidiaries for many centuries, and since the mid-twentieth century British colonial rule has given way to local, sovereign governments. The United States, Australia, Kenya, and India have all outgrown London. Why should Scotland be any different? Scottish complaints against Westminster are legitimate, and they demand greater attention from both Parliament and Downing Street. Their yearly “allowance” from England is a pittance, and like a teenager who is never permitted autonomy (or like a thirty-year-old living in his parents’ basement), a nation governed by a distant father is never likely to grow up.

On the other hand, I feared that Scotland would be crushed under its own weight. Unfortunately for them, they will never have the chance to prove themselves. The United States and India are not bound geographically and culturally to England; their departures from Great Britain were natural, evolutionary outgrowths of those disunions. Despite a number of historical divisions, some of them violent, Scotland has always been tied to England, and is as much a part of its fabric as Massachusetts is to Maine and Maine to Oregon. I recognize, though, this analogy is not perfect, as Scotland is not a state constituted under a federalist system. Further, unlike the American Tea Party, which wants less government intervention, the Scottish nation would exercise more. The National Health Service, which each British citizen appears to value so highly (and blindly), would have been permanently enshrined as an entitlement in the Scottish Constitution. Such a binding law would allow no flexibility in a national budget. Taxes would have to be raised to pay for the inevitable expansion of this program. Scottish nationalists claim that control over their own resources—timber, water, and oil in particular—would offset these costs. But one entitlement produces many more, as we have seen in our own country. And a people given over to entitlements lose the very right of self-government that they eagerly proclaim.

Why should any of this matter to uninterested Americans across the pond? For one, it deeply affects the nature of our “special relationship” with Britain. British culture encompasses the Scots as well as the English—not to mention the Welsh and a few Irishmen. The British economy and military are the only powers in Europe that might successfully stand against the rise of another tyrant. Tumult within will welcome advances from abroad, which we have already witnessed in the rise of the Russian bear. As a leading member of NATO, the UK can likely ill afford the loss of important peoples and resources. I am aware of the paternalistic overtones of such a sentiment, but while sentiment alone is not reason enough to remain in union, the “stronger together” rhetoric is not only rhetoric.

Additionally, Edinburgh’s calls for secession are echoed around the globe and within our own nation. Catalonia has wanted to break from Spain, and Flanders from Belgium. Corsican dissidents have recently resorted to violence to shock their French rulers. And let us not forget that Putin annexed the Crimea only after its people publicly claimed a stronger affinity for Russia than for Ukraine. Here at home, Texas citizens are boasting that their infrastructure and oil supplies (though likely not their water reservoirs) are strong enough that they need no oversight from Washington. People around the world are growing increasingly fed up with the political class.

The conventional wisdom, in this country at least, is that the right to secede was negated with the resolution of the Civil War in 1865. But I believe this did not so much resolve secession as temporarily quash it. In the aftermath of the Civil War, no rule of law determined that the South lacked legal standing to leave the union; in fact, it was President Lincoln’s more powerful army that determined the extent of their rights. No Supreme Court decision interpreted the eternal power of the Constitution; the Confederacy simply lost the desire to fight.

If the modern nation state is to survive, then it must be bound together by the rule of law. Secession, by its nature, rejects one law, purportedly to favor another created by the secessionist. At the same time, the Civil War, and all such wars, teach us that the rule of law is only as strong as the ability to enforce it and the will of its people to believe in it. The world is, and forever will be, ruled by might and power. While law is the foundation of a stable civilization, it is only a check against the aggressive, natural state of mankind. Thus, it is disconcerting to ponder: if one nation breaks away from another because of political disenfranchisement, will the next generation see yet another division? If the American Confederacy had been successful in its aim of creating a second United States, then how long would it have been before Alabama found itself disenfranchised from Richmond? Had the nationalist movement been successful last week, how long would it have been before the more populous Glasgow felt estranged from the capital of Edinburgh? This, too, is not an entirely adequate argument against independence, but it does remind us that independence is not absolute.

Thankfully, Scotland exercised its right to decide its destiny by the democratic process. This is both legal and socially constructive: liberalism at its finest. No revolution has been necessary, and though a sizable portion of the Scottish people are deflated, they have not taken up arms in support of their cause. Their behavior in this gentleman’s quarrel will hopefully urge Westminster to uphold the gentleman’s agreements made during the debates. Devolved powers should continue to be vested in the Scottish Parliament. And with time, perhaps a new generation will see the value in severing themselves from London.

I am deeply sympathetic to the independence movement. I have heard and seen firsthand the yearning for sovereignty. William Wallace and Robert the Bruce remain the founding fathers of Scotland’s national identity. Braveheart is perhaps the most poetic expression of freedom in cinematic history. But the historical and the contemporary realities are far more complex. And are these feelings sufficient warrant to justify independence for all nations in all circumstances? I can honestly say, I don’t know.