Thursday, November 5, 2015

It's Democracy, Dumb Ass!

On Tuesday, my home state of Kentucky elected Republican Matt Bevin as governor. To outsiders, this hardly seems surprising. Kentucky consistently votes conservative in national elections. Currently both senators and five of the state's six representatives are Republican. Since the 1960s, Democratic presidential candidates have only carried the state four times, and each time the candidate was a Southerner (LBJ, Jimmy Carter, and Bill Clinton -- twice).

However, state-level elections typically tell a different story. Democrats have held the governorship for 40 of the last 44 years. Additionally, the Kentucky House of Representatives is the only state legislative body in the South currently held by Democrats. So why the change?

Voter turnout has become the scapegoat. Indeed, the numbers are troubling. Results show that Bevin captured 52.5%, while the Democratic nominee, Jack Conway, came in at 43.8%. That looks like a pretty resounding mandate, until you consider that voter turnout was 30.7%, which is abysmally low.
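For perspective, a quick back-of-the-envelope calculation (assuming turnout here is measured against registered voters) shows just how thin that mandate really is:

0.525 × 0.307 ≈ 0.161

In other words, only about 16 percent of the state's registered voters actually cast a ballot for Bevin.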



It's also worth noting that there are more registered Democrats in Kentucky than Republicans, meaning that lower turnout benefits conservatives -- as it does in much of the country. It's sad for the Republican Party that suppressed voter turnout is its best ally, but arguably even sadder for the Democratic Party that its constituents can't be bothered to vote.

Low voter turnout is a national problem, and there are several proposals to correct it: make voter registration automatic rather than an opt-in process, expand early voting, move Election Day to the weekend, etc.

I agree that voter registration should be automatic. A recent Census Bureau report estimates that a little over 35% of eligible voters are not registered. Still, expanded registration won't truly solve the problem as fewer than half of registered voters bother to vote in non-presidential election years.

Convenience could be a factor. Expanding early voting would likely help, though I'm skeptical about moving elections to the weekend: Would Americans be more likely to take time out of an off day than a work day? A better solution might be to develop a system in which voters could cast their ballots at any polling station. After all, it's not 1850. Many Americans don't live and work in the same proximity that they once did, and voting at a polling place closer to work might help boost turnout.

But still, that doesn't fix the apathy. Part of the problem is that Americans are uninterested in and ignorant of political issues and processes. Voters perceive national, and in particular presidential, elections to be most important because they involve higher offices; voter turnout supports this assumption. Presumably, they see the stakes as higher, so they show up.

But this is democracy, dumb ass! The stakes are always high. State and local elections are just as important as -- and arguably more important than -- national elections. Meaningful change happens at these levels, and voters have enhanced influence. Mathematically, your vote has more meaning because the pool of voters is smaller and the ridiculousness of an electoral college is a nonfactor. Not to mention the fact that you may actually have real access to and influence over the candidate.

Indeed, local politics may be the last bastion of representative democracy left in America. The U.S. Congress now has more millionaires than non-millionaires. In what way are these people our peers? In state and local elections, there at least remains the hope of true citizen governance, where intelligent and civic-minded people can serve without being independently rich or owned by moneyed interests.

Until citizens understand the importance of civic engagement -- at all levels -- all other efforts will be half-measures at best. Wake up, America. Let 'em know you're there.

Thursday, October 8, 2015

Considering the tradeoff: The cost of the Second Amendment

This semester, like every other, I teach my PR writing students that one element of newsworthiness is unusualness. If something is happening for the first or last time, rarely happens, or is just plain strange, it probably has some news value. It's been about a week since the Oregon shooting, and I remember vividly my reaction to hearing the news: "Eh." The sad truth is that acts like these have become prevalent enough that they are no longer unusual, and as such we've become desensitized to them.

I was prepared for the predictable news cycle to run its course: the shooting happens; the president, grief stricken, speaks to the nation; prominent figures and the media half-heartedly debate gun control measures; we delay action; and then Donald Trump says something stupid and/or racist and we forget the whole thing ever happened.

I must say, however, that I felt President Obama's remarks displayed more anger than grief, and I found that refreshing, particularly when he commented that "our thoughts and prayers are not enough" (as if they ever are). This attack on our societal complacency regarding gun violence will hopefully jar us into acting, but at the very least it's made for a more interesting conversation than we've become accustomed to.

Still, the predictable pro-gun arguments popped up all over social and traditional media. So let's take a look at what I consider the top five, starting with the most absurd and working our way up.

5.) You can ban guns, but that won't stop criminals from obtaining them.

True. But this is more an argument against laws in general than an argument against gun control measures. Laws exist to deter undesirable behavior and provide means for punishment and isolation for those who commit heinous acts. Criminals, by definition, are those who break laws, and so long as laws exist there will always be criminals. The success of a law is best measured by the reduction -- not the elimination -- of unwanted actions committed by the population as a whole.

4.) I have a right to protect my family.

You absolutely do, and the best way to protect your family is actually not owning a gun. Generally speaking, in households with guns, deaths of family members and suicides are far more common than in households where no gun is present. In truth, your guns are more likely to be used -- either purposely or accidentally -- to kill a member of your family than some masked intruder.

3.) Guns don't kill people. People kill people.

It's been shown through various statistics that increased access to guns correlates with increases in accidental gun deaths, gun homicides, and suicides. The gist of this line of argumentation is that if we banned guns, we'd still find ways to kill one another, so what's the point? Well, the point is that we'd almost assuredly do so at a slower pace. Studies on suicide best illustrate this point. First, as you might expect, those who attempt suicide using firearms are far more successful at killing themselves than those using other means. There is also considerable evidence to suggest that once a preferred means of suicide is eliminated, many people simply choose not to attempt suicide at all.

2.) The problem isn't guns, it's mental illness.

I'm somewhat skeptical of this argument, particularly since we only seem to categorize gunmen as mentally disturbed in hindsight. More than anything, I see the mental health argument as a convenient red herring. But for the sake of argument, let's assume our inability to diagnose and properly treat the mentally ill -- and thus keep firearms out of their reach -- is the true problem. Why aren't we doing anything to correct this? The U.S. has a rather appalling history concerning the treatment of the mentally ill, and we're clearly not doing enough to help these people or to stop perpetuating a horrible stigma. As mental illness relates to gun violence, the discourse here is that guns aren't the problem, the mentally ill are the problem, and...and that's it. No concrete solution is ever put forth. In the words of John Oliver, if we're going to continue this ruse "then the very least we owe [the mentally ill] is a fucking plan." The fact that no one ever develops a viable way to address the mental-illness-gun problem suggests that this argument is largely nonsensical.

1.) The Second Amendment guarantees my right to own a gun.

Let's go to the text:
A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.
Many argue that the framers' intent was to provide for a national defense because the U.S. had no standing military at the time of Constitutional ratification; under that reading, more recent and more expansive interpretations overreach by extending to all citizens a right to privately own firearms. While I am sympathetic to this argument, I tend to prefer interpretations of the Constitution that afford the greatest amount of personal freedom and then work backward to restriction, if necessary.

With guns, it's now necessary to work backward. And I'll even acknowledge that doing so infringes on our Constitutional right to own and bear arms. But perhaps it's time we asked ourselves what the right is ultimately worth to us.

All rights come with some tradeoff. Take the rights of free speech and freedom of religion enumerated in the First Amendment. The tradeoff for my right to openly criticize my government and to practice a religion of my choosing -- which happens to be none -- means that I must also allow individuals to spout racist nonsense or express moronic and uninformed opinions. And, of course, I also have to allow Scientology to exist. Terrible as the downside of the First Amendment is, the good far outweighs the bad.

And for what it's worth, there are certain instances where we have agreed to place reasonable restrictions on speech. Threatening speech is forbidden, as are libel and slander. Commercial speech is also highly regulated and false advertising is downright illegal.

Our gun rights are already reasonably restricted to some degree as well. Firearms are forbidden on airplanes, in most schools, and on most government property. Upholding our gun rights simply isn't worth the potential costs in these scenarios.

In truth, it's not worth it in most scenarios. As I see it, guns have two legitimate functions: self defense and hunting. And as I explained earlier, statistics show that guns aren't all that effective when it comes to self defense. As far as hunting is concerned, a bolt-action rifle and a breech-loading shotgun are more than sufficient. If we banned every other firearm tomorrow, would we really be affected so negatively?

Still, even if we didn't want to go that far, there are a host of other reasonable actions we might take. More extensive background checks, tougher regulations on dealers, requirements for safe gun storage, mandatory gun safety courses, and the implementation of smart technology are just a few examples. Which would work best? Unfortunately we have no idea because, believe it or not, the U.S. Congress has banned federal agencies from conducting most gun violence research.

There's no denying that guns are deeply ingrained in the American culture. Hell, I own a gun. I love to go shooting at the range. It's fun. It makes you feel powerful. And, I think for many, gun ownership serves as a symbol of control and self-reliance. But I'm no longer persuaded that the benefits of upholding the Second Amendment outweigh the costs, and if we're not willing to at least consider what a different reality concerning guns in America might look like (i.e., funding studies on potential gun control initiatives), then societally, we're probably not too sure about this tradeoff either.

Sunday, September 13, 2015

Why it's okay to call people stupid...sometimes...sort of...

It's never been particularly polite to insult people in public. But people often say dumb or questionable things. Challenging such foolishness was once the duty of a reasoned citizenry, but now we practically consider such behavior rude. What went wrong?

It seems to me that we've mistakenly sacrificed our ability to "call bullshit" on the altar of pluralism. And I say mistakenly because I believe we fundamentally misunderstand what it means to live in a pluralist society. Opinions are not meritorious simply by virtue of the fact that you hold and express them. Opinions deserve a voice in the marketplace of ideas, but the very notion of a marketplace assumes the existence of competition; therefore, all thoughts and ideas must be subject to ridicule if we ever hope to achieve any semblance of consensus on which ideas hold water.

In an effort to avoid giving offense, we shy away from applying much needed ridicule. This is costing us dearly, and perhaps because I work as a professor, nowhere is it more evident to me than in the classroom.

Generally, I shy away from bashing millennials (probably because I am one), but I found myself agreeing with many of Caitlin Flanagan's arguments about the decline of college education -- though I find her thoughts on "farm to table dining" and "idiot politically correct humanities curriculum" to be either misinformed or non sequiturs. Still, I agree that universities should offer a means through which students might "be relieved of [a] great burden of ignorance."

Constructive criticism is the educator's greatest friend. I can remember very vividly my first semester of college. I was fortunate enough to take Communication 201 with William Thompson, who not only had no qualms about challenging your ignorance but rather enjoyed it -- almost sadistically, in point of fact. He saw it as his duty to expand the way in which his students viewed particular issues, and not just those related to the communication subject matter.

I also had the great fortune of studying English 105 with Dr. Dennis Hall. Regularly -- by which I mean weekly -- he made it his mission to put my ignorance on display. He often read aloud to the class passages of my meager attempts at writing, opening the floor to waves of public critique. I was offended and embarrassed, but I never spoke out for a very simple reason: his criticisms were completely valid. When I spoke with him privately at the close of the semester, he confessed that he actually thought my writing was a bit better than what my peers were producing, but he feared I would become complacent and fail to improve if I weren't challenged. Right again! Hall motivated me, and my writing -- such as it is -- would be much worse without his instruction.

I went on to take three more electives with Thompson and two more with Hall over the next four years, and both men served on my honors thesis committee. The challenges, ridicule, and occasional outright scorn they applied were never meant as personal affronts. They encouraged me to think more broadly, to act with clearer purpose, and to become a more well-rounded, functioning individual.

Now, as I transition into my position as an assistant professor, I inherit these responsibilities. Problematically, the waters are much more treacherous than they were even a decade ago -- or perhaps just as treacherous and I simply didn't recognize the struggle from the vantage of my comfortable lounge chair, rooted firmly in the sandy shore.

Perhaps my critiques must be subtler. Every semester I teach writing, and I always enforce this rule: it's better to show it than to tell it. My point here is twofold. One, if you can back up your claim with evidence, people take it more seriously. Two, if you have a hard time finding evidence for your claim, perhaps it holds no merit. Admittedly, this is a rather meager challenge, but it's a start.

It's rather easy for me to critique assignments, but it's immensely more difficult to critique ideas. Tenure and promotion for junior and adjunct faculty are to some degree determined by a flawed student evaluation system. The easiest ways to boost evaluation scores are to dole out mostly As and Bs -- which leads to grade inflation, a concern among some -- or to get students to just plain like you. Higher grades help in this regard, as do minimizing assignments and dodging confrontations. But having beliefs and ideas confronted and challenged defines education, and it's downright necessary when someone makes a stupid or unfounded claim.

Generally, I've been fortunate to teach many bright students, but I have heard some express a variety of what I and the scientific majority consider stupid or ill-informed opinions. When a student remarks that evolution is "just a theory," should I point out that overwhelming evidence for the process of evolution suggests otherwise? When a student suggests that the universe is only  a few thousand years old, should I direct him or her to the eloquent remarks of Lawrence Krauss, who explains quite clearly how cosmologists determine the age of the universe to be around 13.72 billion years? When a student argues that vaccines cause autism, should I explain the difference between causation and correlation -- and should I also point out that this fallacious notion arose from erroneous studies and that even organizations like Autism Speaks agree that no scientific evidence exists to support such a claim? When a student argues that global warming is a hoax, should I point out that there is virtually undeniable evidence that the planet is indeed getting hotter, that human beings are contributing to this warming, that we are currently experiencing the early effects of climate change, that there is vast consensus among climatologists on all these points, and that the inability among lay persons to distinguish between climate and weather is a chief contributor to unwarranted skepticism? When a student claims Donald Trump would make a good president, should I abruptly kill him or her for the betterment of our species?

All tough questions.

Most days I feel trapped in a Catch-22, where doing my job effectively puts my job security at risk. That dilemma, perhaps as much as any other single factor, is a major reason for the declining quality of American education -- at all levels. As a scholar of ethics, I'm becoming increasingly convinced that I have a moral obligation to make students uncomfortable if it helps them learn. William Thompson, Dennis Hall, and Leon Festinger will at least be proud.

Friday, August 28, 2015

When Social Science Fails Itself

Yesterday, The New York Times reported on a study suggesting that less than half of research findings published in prominent psychology journals could be confirmed upon replication. The stunned, and at times stupefied, reactions from many readers show the public lacks a fundamental understanding of the scientific process.

The authors of "Estimating the reproducibility of psychological science," published in Science, investigated 100 manuscripts published in three leading psychology journals. Their goal was to test the veracity of the findings by replicating the original procedures; only 36 percent of findings in the original studies remained significant upon replication.

To many, these inaccuracies represent a damning failure for social sciences. I would argue they actually embody a triumph for the process of science, but at the same time point to a glaring problem in the process of publishing scientific research.

The public views publication as the pinnacle of research, believing that if it makes it to print then it must be fact. Therefore, when studies fail the test of replication, public confidence in science is shaken. And it is shaken.

But that's largely because of a misconception that a single study is enough to establish research findings as facts. In truth, science is an exercise in consensus. When we are able to establish that findings hold true over time and across varied situations, we build a reliable body of knowledge that becomes the basis for scientific understanding. However, when replication fails to yield support for a particular finding, it is dismissed and we move forward with different ideas. Or at least that's how it's supposed to work.

In truth, the problem lies not with the scientific method, but instead with the publish-or-perish environment of academia. Tenure and promotion are based largely on one's ability to publish original research, and to publish it often. Problematically, we take the notion of "original" a bit too literally. As the authors of the Science piece put it:
Reproducibility is not well understood because the incentives for individual scientists prioritize novelty over replication. Innovation is the engine of discovery and is vital for a productive, effective scientific enterprise. However, innovative ideas become old news fast. Journal reviewers and editors may dismiss a new test of a published idea as unoriginal. The claim that "we already know this" belies the uncertainty of scientific evidence.
Put simply, replication is a necessary component of science, but it's not sexy, so it's hard to publish. And since academics must publish to survive, they don't replicate studies often.

Perhaps the saddest part of this indictment is that it's our own damn fault. Most reputable social scientific journals are peer reviewed, meaning that we have the power to rectify a problem we know exists simply by changing our review policies.

So why don't we? I thought The New York Times was spot on there:
The act of double-checking another scientist's work has been divisive. Many senior researchers resent the idea that an outsider, typically a younger scientist, with less expertise, would critique work that often has taken years of study to pull off.
Certainly some -- but not all -- senior researchers feel this way, and I find it shameful. The whole premise of scientific inquiry is that no person or idea is above reproach. To quote Albert Einstein, "The important thing is not to stop questioning." 

It's worth remembering that when Einstein first published papers on the photoelectric effect, Brownian motion, and special relativity -- the last of which shook the very foundations of Newtonian physics -- he was only in his mid-20s, very much a junior scholar. But how much could a 26-year-old possibly know anyway?

Thursday, August 6, 2015

Jon Stewart: The Walter Cronkite of a Generation

[Photo caption: Stewart at the 2008 USO-Metro Merit Awards]
Yeah, I said it. Well, more specifically Lewis Black said it:
Weirdly enough, [Jon Stewart] was, on a certain level, the Walter Cronkite of his generation. He was the trusted source. People trusted what came out of his mouth. What made Cronkite that and what makes Jon that is a certain kind of honesty that is related through the medium, and the one thing I learned from television is that it doesn't lie.
While Lewis Black may have stolen my headline in his Entertainment Weekly interview, and while I agree with the gist of his argument, I think there's more to the Stewart-Cronkite comparison. Specifically, I would argue temperance, sincerity, accuracy, and depth are the chief contributors to Stewart's legacy.

The gut reaction to the Stewart-Cronkite comparison is easy to sum up: Stewart is a comedian and Cronkite was a newsman, so how can you compare the two? But Stewart was more than a comedian. He was able to rise to higher levels of credibility than his comedic counterparts, due largely to his temperate nature.

Stewart speaks sternly, but without the rage of a Lewis Black. Stewart hits with biting satire, but refrains from the obtuse and deadpan tendencies of his protege, Stephen Colbert. And Stewart has always been more influential than the late night hosts of network television because he was never afraid to say something of substance or to alienate segments of his would-be audience. But, while Stewart is certainly left-leaning, he avoids playing the outraged -- and often off-putting -- liberal we see in Bill Maher.

In fact, in 2010 Maher very publicly criticized Stewart for going too easy on Republicans. I happen to agree with Maher's assessment that, if we were to dole out points for craziness between the two major parties, the Republicans would win in a landslide that would make the Reagan-Mondale election look like a nail-biter.

The brilliance of Stewart was that he never took the bait. While he criticizes the right with greater frequency than the left -- and at times more harshly -- Stewart lacks the smugness that Maher constantly displays. There is a real sense that Stewart is wrestling with the same political non sequiturs as his viewers, whereas Maher appears stubbornly entrenched in a preset ideology.

In many ways, the ongoing struggle of Stewart's personal politics fuels his quest for accuracy in reporting. Love him or hate him, you have to admire that Stewart works incredibly hard to accurately represent the facts, even though, as a comedian, he has no professional responsibility to do so. Even more impressive to me, he freely admits the factual errors he makes, perhaps most notably his mischaracterization of Dante Parker's death as the result of a police shooting rather than a drug overdose.

In that same segment, Stewart described The Daily Show as a "media counter-errorism" program and discussed the difficulties of working within such confines. Predictably, Fox News dismissed Stewart's genuine effort to raise issues of police brutality and militarization in favor of a more simplistic narrative: Jon Stewart hates police officers. Fox News anchor Brian Kilmeade went so far as to imply that Stewart was at best unfeeling toward police officers who die in the line of duty. Stewart's response to Kilmeade was characteristic of the complexity and nuance he attempts to bring to reporting:
You can truly grieve for every officer who has been lost in the line of duty in this country and still be troubled by cases of police overreach. Those two ideas are not mutually exclusive. You can have great regard for law enforcement and still want them to be held to high standards.
More than anything, that nuance, that context, and that depth which Stewart brings to important issues make him an effective newsman. I don't watch television news anymore. My main source for news is The New York Times. Largely this is because I can't remember the last time I watched a nightly network newscast or a 24-hour cable news program and walked away feeling informed.

Part of that is a function of a fragmented media environment. It's difficult for anyone to have the gravitas and impact of a Cronkite because news audiences just aren't that large anymore. Also, a key aspect of news is that it is, well, new. Being the outlet to break a story matters, and the continuous nature of the news cycle in our digital environment puts an even higher premium on being first. Sadly, the drive to be first often leads to speculation, and even worse, misinformation. Unfortunately, many traditional news outlets aren't as responsible about correcting their factual errors as The Daily Show.

Another factor in the decline of TV news is a lack of resources. There was a time when networks had reporters on the ground, actually gathering information. Now, more often than not, on-location reporters are there just to say they are there -- and to get a good background shot, not to provide background on the story. That's one reason the "on location" reports from The Daily Show correspondents are so funny: we all know they're not on location, and we all know it doesn't matter.

What's worse, I all too frequently see anchors reading tweets from viewers as if they were field reports. I just start throwing things at the screen. Man-on-the-street interviews were always sorry excuses for news coverage, and viewer tweets are no different. I don't care what @hawkeyewoman9834 (or Kathy, 33, from Iowa) thinks about the administration's new immigration policy, though since she's an unemployed dental hygienist and mother of three, I can see why CNN sought out her expert opinion.

Institutional forces aside, much of the blame lies with the men and women in the television news business. The news media is supposed to be our watchdog, which means calling bullshit on institutional powers and their leaders. Somehow that core function of the news has been relabeled as subjective opinion, or even worse, partisanship. Good journalists have been scared into a position of reporting facts with no context, leaving relatively uninformed and often unqualified viewers to interpret what they see and hear. Bad journalists have devolved into pure punditry, where ideology dictates the facts rather than being informed by them.

This wasn't always the case. In 1954, Edward R. Murrow unabashedly took on Joseph McCarthy. He pointed out the contradictions made by the senator and demonstrated that, even if the Red Scare were not an overreaction, the conduct of McCarthy's investigations was itself, ironically, un-American. Was that a partisan attack on a Republican senator or simply a statement of what was and a reasoned argument that it ought not be? Currently, the Edward R. Murrow Award is among the most prestigious honors given in recognition of outstanding electronic journalism.

Following the Tet Offensive in 1968, Cronkite said the following concerning the Vietnam War:
To say that we are closer to victory is to believe, in the face of the evidence, the optimists who have been wrong in the past. To suggest we are on the edge of defeat is to yield to unreasonable pessimism. To say that we are mired in stalemate seems the only realistic, yet unsatisfactory conclusion. On the off chance that military and political analysts are right, in the next few months we must test the enemy's intentions, in case this is indeed his last big gasp before negotiation. But it is increasingly clear to this reporter that the only rational way out then will be to negotiate, not as victors, but as an honorable people who lived up to their pledge to defend democracy, and did the best they could.
Cronkite traveled to Vietnam, gathered information first-hand, analyzed it, and developed a reasoned argument that America was in the midst of an unwinnable war. Was Cronkite acting as an unpatriotic communist unsupportive of U.S. troops, or a responsible journalist seeking to inform his audience as best he could? It's worth noting that Walter Cronkite hosted The CBS Evening News for another 13 years after that report, retiring as one of the most respected journalists and revered public figures in U.S. history.

Like Murrow and Cronkite, Stewart isn't afraid to call bullshit. He isn't afraid of controversy. He isn't afraid of reasoned editorializing. In short, he isn't afraid to inform his viewers, which sadly can't be said of many "real" television journalists. What's more, he makes you laugh, if only to keep you from crying. In the broader American zeitgeist, Stewart's legacy won't even remotely challenge that of Cronkite, but for his generation and for mine, Stewart may be remembered as the most trusted man in America.

Tuesday, July 21, 2015

Kentucky Clerks' Stance Unrelated to Religion

Kentucky clerks in five counties (Casey, Clinton, Lawrence, Montgomery, and Rowan) are refusing to issue marriage licenses to homosexual couples. The clerks argue that to do so violates their religious liberties. The ACLU has filed suit against Rowan County Clerk Kim Davis, and my hope is that the federal case reveals the most important truth in this farcical debate: opposition to gay marriage has nothing to do with religion.

An article in Louisville's Courier-Journal provides some insight into Davis' reasoning for refusing to issue marriage licenses. In the article, Davis claims that her actions were "thought out" and that she "sought God on it." According to Mike Wynn, author of the article:
On the stand Monday, Davis described herself as an Apostolic Christian who believes marriage is defined as the union of one man and one woman under the Bible -- "God's holy word" -- and said she contemplated her policy for months beforehand.
Supporters of Davis have shown up in droves outside the courtroom, many sporting signs with biblical verses that reinforce their belief that marriage should be between one man and one woman. The most commonly featured passage is Genesis 2:22-24:
Then the rib which the Lord God had taken from man He made into a woman, and He brought her to the man. And Adam said: "This is now bone of my bones and flesh of my flesh; she shall be called woman because she was taken out of man." Therefore a man shall leave his father and mother and be joined to his wife, and they shall become one flesh.
Many other passages that simply repeat Genesis 2:24 are also featured. Matthew 19:5 and Mark 10:7-8 are fan favorites, I assume because they demonstrate Jesus' support of Old Testament doctrine. Interestingly, the next verses (Matthew 19:6 and Mark 10:9) are frequently omitted. In both gospels, Jesus says the following:
Therefore what God has joined together, let not man separate.
You're probably familiar with these verses as they are often invoked at the end of wedding ceremonies. In these verses, Jesus very clearly condemns divorce. There are other biblical condemnations of divorce as well, most notably from early church leader Paul in 1 Corinthians 7:10-11:
Now to the married I command, yet not I but the Lord: A wife is not to depart from her husband. But even if she does depart, let her remain unmarried or be reconciled to her husband. And a husband is not to divorce his wife.
If we're going to follow the letter of the law as outlined by scripture, accepting that biblical marriage constitutes only the union of one man and one woman also means accepting that the Bible does not recognize divorce or endorse second marriages, and considers both sinful.

Adultery is another big no-no related to marriage. Apart from being expressly forbidden in the Ten Commandments (Exodus 20:14), it is condemned in other books, as is premarital sex:
Marriage is honorable among all, and the bed undefiled; but fornicators and adulterers God will judge. (Hebrews 13:4)
In fact, that judgment manifests itself in a rather harsh punishment prescribed in Leviticus 20:10-13:
The man who commits adultery with another man's wife, he who commits adultery with his neighbor's wife, the adulterer and the adulteress, shall surely be put to death. The man who lies with his father's wife has uncovered his father's nakedness; both of them shall surely be put to death. Their blood shall be upon them. If a man lies with his daughter-in-law, both of them shall surely be put to death. They have committed perversion. Their blood shall be upon them. If a man lies with a male as he lies with a woman, both of them have committed an abomination. They shall surely be put to death. Their blood shall be upon them.
Certainly I over-quoted here, but I did so to prove a point. The Bible treats adultery and homosexuality as equivalent sins. Has Rowan County Clerk Kim Davis attempted to deny marriage licenses to adulterers? Or to divorced heterosexuals seeking to marry another? According to The Courier-Journal and Davis' own testimony, the answer is a resounding no:
After working in the clerk's office for nearly 30 years, she said she has never denied a license on religious grounds or asked applicants about relationships she might find sinful.
It appears Davis has been rather derelict in her religious duties. In fact, from a biblical standpoint, as a woman she has no place instructing us on matters of religion in the first place:
Let your women keep silent in the churches, for they are not permitted to speak; but they are to be submissive, as the law also says. And if they want to learn something, let them ask their own husbands at home; for it is shameful for women to speak in church. (1 Corinthians 14:34-35)
In this particular instance, I agree with scripture. I think we'd all be better off if Davis would simply shut the hell up.

Why? Clearly Davis is not interested in preserving biblical marriage. She has a 30-year track record of failing to do so. Her objection to issuing marriage licenses boils down to a discomfort with homosexuality, and she is not alone in that feeling. Many people believe homosexuality to be unnatural and strange and its practice to be disturbing or outright disgusting.

The right to have that belief is actually something that the First Amendment does uphold. However, the fact that something makes you uncomfortable is not a compelling reason to suppress it or make it illegal, and the Supreme Court was correct to extend to homosexual couples equal protection under the law.

Moreover, the religious freedom argument that Davis and the other four county clerks in Kentucky are making is nonsensical. Even if these officials' actions were motivated by religious belief -- which they are not -- there is nothing that prohibits adherence to both the laws of man and of God. Jesus was arguably the first advocate for the separation of church and state:
Then they asked Him, saying, "Teacher, we know that You say and teach rightly, and You do not show personal favoritism, but teach the way of God in truth: Is it lawful for us to pay taxes to Caesar or not?" But He perceived their craftiness, and said to them, "Why do you test Me? Show Me a denarius. Whose image and inscription does it have?" They answered and said, "Caesar's." And He said to them, "Render therefore to Caesar the things that are Caesar's and to God the things that are God's." (Luke 20:21-25)
Upholding this law is not an affront to religious liberty. And if these five clerks feel it is, they should resign. I see no reason why we should praise, let alone pay, an individual for refusing to do his or her job.

Wednesday, July 3, 2013

Rethinking Patriotism for Millennials: Beyond Answering a Truly Dumb Ass Question

I check Twitter as part of my morning routine, and as it often does, Pew's "Daily Number" caught my eye. Today was especially relevant for me because (a) I'm a sucker for generational studies and (b) my generation was the topic of discussion.
Here's the gist of their findings:
Just 32% of Millennials believe the U.S. is the greatest country in the world. That number progressively increases among the Gen X (48%), Boomer (50%) and Silent generations (64%). Millennials were also the most likely generation to say America is not the greatest country in the world (11%).
I take no conceptual issue with this work. It's well done, as is most all of what Pew does. My issue is with the framing, in particular the title: A generational gap in American patriotism.

My patriotism does not depend on answering in the affirmative to a question so asinine as, "Is America the greatest country on earth?" Unfortunately, I suspect it does rest on that answer for a lot of Americans, which likely explains the gap in reported rates of patriotism.

First of all, the greatest country question is pretty loaded. I'd wager that most Americans who say the United States is the greatest country probably reason as follows: "I live here, I like it, and therefore it is the best." Never mind that most Americans have never even been to another country (only 36% of us hold valid passports), which severely limits even anecdotal comparisons.

I consider myself to be patriotic in that I am devoted to my country -- which is the dictionary definition of the term -- but I would say based on odds alone (the United States is just one of 196 countries) that we're not number one.

But then again, it depends on the criteria we're using to rank nations.

If we're talking economic prowess, military might, or (sadly) incarceration rate, the United States tops the list. But what about the country with the happiest population? Switzerland is currently numero uno, though in years past it has often been Denmark. What about price and quality of health care? That goes to France. How about education? Finland is the winner there.

I think you get my point as to why determining which nation is truly the greatest is a complicated matter.

It might help to think of it in terms of those best-rock-album-of-all-time conversations you've had with friends over a few beers. My answer is typically The Beatles' "Revolver." But there are dozens of other candidates; perhaps the other I most often hear is the Rolling Stones' "Exile on Main Street." And just because I didn't put "Exile" atop my list doesn't mean I don't think it's a damn good record, just maybe not the best.

In much the same way, not believing that America is the greatest country on earth, period, end of discussion, doesn't mean I'm unpatriotic or that I dislike America. I'm devoted to what I consider to be the founding and most critical principles of the United States: you are free to determine your own destiny, to make a better life for yourself and your family, to live with few constraints on your day-to-day activities, and -- most importantly -- to question the established way of doing things in the hopes of producing something better.

I worry about America. We've got problems. We rank 37th in health care, 17th in education, 14th in happiness, and 1st in incarceration rates. We have serious issues concerning wealth inequality and an eroding middle class further threatened by holes in our safety net programs and unemployment that hovers around 8%. We also have difficulties facing long-term problems, climate change and the never-ending accrual of debt chief among them. And every one of these problems is further complicated by our currently polarized political system.

Still, despite all these issues and our apparent lack of patriotism, Millennials are the least likely generation to ring America's death knell. Quite the opposite: Millennials are the most optimistic generation about the state of our nation and our ability to improve both individually and collectively. I would argue that a hopeful outlook for America's future is a better measure of patriotism than a response to a truly dumb ass question.