Monday, December 7, 2015

Obama speaks, but will the Middle East listen?

Last night, President Obama addressed the nation from the Oval Office, speaking on a number of issues related to national security in the wake of recent terrorist attacks. Specifically, he defended his strategy for fighting ISIS, lobbied Congress to pass some critical but relatively minor restrictions on who may purchase firearms, and urged Americans not to be baited into discrimination or lured into a state of perpetual war.


The New York Times editorial board praised Obama for projecting strength and advocating for calm. While I have been critical of Obama's handling of ISIS, I must say I remain impressed with his resolve to act with the patience needed to attain a truly workable solution. Predictably, the Republican response was a slightly more polite version of, "Obama...man, what a pussy."

But for all their rhetoric, the Republican presidential candidates have no short-term military solution that is in any way distinguishable from what Obama is already doing. Front-runner Donald Trump recently articulated his exceedingly complex plan:
ISIS is making a tremendous amount of money because they have certain oil camps, certain areas of oil that they took away. [...] They have some in Syria, some in Iraq. I would bomb the shit out of 'em.
Such elegance. But more or less, this is what Obama is doing and what he has been doing since September 2014. So far as I know, Lindsey Graham is the only presidential candidate explicitly calling for a sizable force of ground troops in Syria and Iraq. The other Republicans' position is that Obama is weak and that they would project strength by...well, by continuing his policies.

Perhaps the most ironic response was that of House Speaker Paul Ryan:
Our primary responsibility is to keep the American people safe from the real and evolving threat of radical Islamic terrorism. That will require the president to produce a comprehensive strategy to confront and defeat ISIS. The enemy is adapting, and we must too. That's why what we heard tonight was so disappointing: no new plan, just a half-hearted attempt to defend and distract from a failing policy.
Ryan may be correct that the Obama policy is failing -- which I'll address in a moment. But somewhere in the midst of his required outrage at Obama, he must have forgotten that he is arguably the most powerful member of the legislative branch, which is also capable of establishing a definitive policy agenda. Congress is also the only branch of government with the constitutional authority to declare war, though, to my knowledge, no such vote has been called. So again, I ask: what is the alternative strategy to the Obama plan?

Interestingly, the Senate did call a terrorism-related vote on Dec. 3. Before it were two gun control measures, one requiring more stringent background checks on individuals purchasing firearms at gun shows and another preventing suspects on the FBI terror watch list from purchasing guns at all. Both were voted down, with presidential hopefuls Ted Cruz, Marco Rubio, Rand Paul, and Lindsey Graham all voting against the measures.

This is truly mind-boggling. First, the fact that Congress would vote against more stringent background checks runs counter to widespread, bipartisan public opinion: 85% of Americans -- including 79% of Republicans -- are in favor of such laws. Pew Research Center, which conducted the polling, didn't ask about public opinion concerning potential terrorists' gun rights, I assume because the very question is laughable. Second, I can only imagine the level of cognitive dissonance required to allow a person to say, on the one hand, that he believes national security is the most important responsibility of the presidency, while on the other hand casting a vote that continues to allow terror suspects to legally acquire firearms.

Still, the Republicans are correct on one point: Obama's strategy will not work to defeat ISIS. But the GOP error here is twofold. The first is believing that defeating ISIS is the intent of the airstrikes. The second is blaming the strategy when, in truth, the problem is a lack of will and political feasibility.

I firmly believe Obama -- and any American politician for that matter -- wants to see ISIS defeated and terrorism stamped out. I don't believe that's the aim of our efforts in Syria. We're attempting to buy time by containing a threat, not eliminating it. By that lesser measure of success, the airstrike strategy is completely viable.

The reason for this lower bar is simple: Americans lack the will to win this war on our traditional military terms. The debacle in Afghanistan and the quagmire that was/is Iraq are fresh in the public's mind. Nobody wants to go down that road again, which is why no politician with a snowball's chance in hell of becoming president is calling for ground troops. John McCain lost the 2008 election to Obama for a variety of reasons, one of which was his commitment to the Second Iraq War. The electorate was wary of his assertion that a 100-year occupation of Iraq might be necessary for a total military victory. But he was probably right then, and he's probably right now.

The problem is that nobody wants to commit 100 years to a ground war. Simultaneously, there seems to be a recognition that dropping bombs is not enough. Obama is unquestionably right that "many Americans are asking whether we are confronted by a cancer that has no immediate cure."

We are.

But there is an eventual cure, and Obama, who favors the long game, nailed it during his speech:
If we’re to succeed in defeating terrorism we must enlist Muslim communities as some of our strongest allies, rather than push them away through suspicion and hate. That does not mean denying the fact that an extremist ideology has spread within some Muslim communities. This is a real problem that Muslims must confront, without excuse.
There is a war at our doorstep. We can no longer prevent it. But we can control how the conflict will be framed. We must adhere to our values and temper caution with compassion, particularly in our treatment of refugees from the Syrian conflict. We must embrace the peaceful Muslim as our neighbor, for only then can we rightly condemn the Islamic terrorist as our enemy. And in so doing, we put pressure on other countries in the region -- predominantly Muslim countries -- to take military action against ISIS. 

A war between Muslim states would not carry the immense and overt religious connotations that a war between the U.S. and ISIS would. A war of the latter kind has no immediate end, if any end at all. As Americans, we need to exercise the wisdom to see this conflict for what it is: a war that we cannot win, at least not alone.

Monday, November 30, 2015

Something for nothing leaves nothing

About a week ago, Vox contributor Dylan Matthews pointed out that the media has no idea how to deal with Donald Trump's constant lying. I have mixed feelings about Matthews' assertion, partly because he forgot that "media" is a plural noun.

While I agree that many journalists like George Stephanopoulos seem stymied, there are some media personalities who have a better grip on Americans' current infatuation with political outsiders -- which is a nice way of calling someone unqualified. I'm partial to the Stephen Colbert approach: making fun of them. Humor, after all, is based in the absurd, and there are few things in this world more absurd than whatever Donald Trump says at pretty much any given moment.

Of course comedic figures like Colbert have a key advantage in that there's no real concern about appearing partisan. Were a journalist to truly take many of these candidates to task, he or she might be labeled as liberal and therefore biased. The moderators of the CNBC GOP debate learned that lesson the hard way. Still, I fail to see how asking about a candidate's proposed tax policies constitutes a "gotcha" question, or how CNN's Democratic debate was the love fest Republicans claimed.

But perceptions persist in the face of facts, and in many cases, exposure to facts actually strengthens the effects of misinformation. Cognitive dissonance lies at the heart of such effects. Once individuals identify with a particular position, they find it difficult to process information that contradicts that position. Rather than changing their stances -- which is psychologically difficult -- they simply double down on their stupid.

And this brings me back to Matthews' Vox article, and in particular its third paragraph:
But Trump also has a tendency to use his appearances on TV news to spout flagrant lies about a variety of topics. His statements aren't false the way that, say, Marco Rubio's claim that he can cut taxes by $12 trillion and still balance the budget is false. False claims of that variety are a long and distinguished tradition in American electoral politics, and it's an established policy on programs like This Week to not challenge them too aggressively.
Double take. Say that again? Media -- journalistic media -- have an "established policy" not to question falsehoods, and that is somehow okay?

Most of what Trump says is obviously bullshit, meaning we as the public require less help from media in seeing it as such. But many other candidates operate using covert bullshit, and we need more help sniffing it out, not less.

From what I can tell, elections are based in fantasy, appealing contradictions, and careful maneuvering. It's more or less a constant peddling of nonsense. You can't pay for universal healthcare and universal college education simply by raising taxes on the top 1%. You can't cut taxes and increase military spending and claim to be fiscally responsible. You can't be for states' rights and against state laws legalizing recreational marijuana use. You can't be for smaller government and against gay marriage. You can't be for free speech and against flag burning. And so on, and so on...

But in a democracy, the behavior of our potential leaders may say more about us than it does about them. Politicians lie to us for one simple reason: we don't want the truth. We say we do, but we don't.

In his so-called malaise speech, Jimmy Carter warned us very clearly about focusing too intently on materialistic goals:
In a nation that was proud of hard work, strong families, close-knit communities, and our faith in God, too many of us now tend to worship self-indulgence and consumption. Human identity is no longer defined by what one does, but by what one owns. But we've discovered that owning things and consuming things does not satisfy our longing for meaning. We've learned that piling up material goods cannot fill the emptiness of lives which have no confidence or purpose.
But rather than working to restore a sense of community and deeper purpose, we ousted Carter and elected Ronald Reagan, mostly because he told us we could have everything without paying for it. Moreover, the two men who tried to address Reagan's disastrous fiscal legacy -- his 1984 challenger, Walter Mondale, and his 1988 successor, George H.W. Bush -- were marginalized and ultimately defeated for telling the truth: more tax money was needed to pay for the Reagan spending spree.

And despite my derision of conservatives, liberals don't fare much better. I was pleased to see a New York Times report this morning stating that two-thirds of Americans want the U.S. to join a climate change pact. Well, that is so long as we don't actually have to do anything:
Thinking about policies to reduce carbon emissions, Americans generally favor regulating business activity more than taxing consumers. The poll found broad support for capping power plant emissions. Half of all Americans said they thought the government should take steps to restrict drilling, logging and mining on public lands, compared with 45 percent who opposed such restrictions. Support for limiting mineral extraction on public lands rose to 58 percent among Democrats. But just one in five Americans favored increasing taxes on electricity as a way to fight global warming; six in 10 were strongly opposed, including 49 percent of Democrats. And support was not much higher for increasing gasoline taxes, at 36 percent over all.
There's no incentive to supply the truth when lies are in high demand. Accepting a fantasy is easy. It requires nothing from us. Living in reality is hard. It means making tough decisions, accepting tradeoffs, and dealing with opportunity costs. Americans have been running full speed from hard truths toward an impending dead end. Someday soon we're going to hit the brakes or hit the wall, and in large part, our willingness to accept ideological falsehoods may be the determining factor.

Thursday, November 19, 2015

Why Mitch McConnell Hates Coal Miners

Earlier this week, the Senate voted to block two key initiatives from the EPA intended to reduce emissions from coal-fired power plants and halt global warming. Using a familiar tactic, Republicans leading the fight justified their votes by pitting progress on climate change against progress on job growth. In reality, we don't need to choose between the two, and even if we did, we shouldn't care that much about protecting coal mining jobs.

Mitch McConnell, senior senator from Kentucky (and senior mutant neocon turtle), led the attack on the EPA and President Obama:
These regulations make it clearer than ever that the president and his administration have gone too far, and that Congress should act to stop this regulatory assault. [...] Here's what is lost in this administration's crusade for ideological purity: the livelihoods of our coal miners and their families. Folks who haven't done anything to deserve a 'war' being declared upon them.
As a native Kentuckian myself, I admit there's an instinctual response to defend coal miners. We're a coal state after all, right?

Well, yes and no. Kentucky is the third-largest coal-producing state, behind Wyoming and West Virginia. However, there are only about 12,000 coal miners employed in Kentucky, which accounts for less than 1% of total jobs statewide. Moreover, coal production amounts to just over 1% of the state's total GDP.

So it turns out, coal isn't as big a part of the Kentucky economy as you might think. In fact, there aren't that many coal miners in the U.S. as a whole. The Kentucky Department for Energy Development and Independence estimates that there are 78,300 coal mining jobs in America. Fortune puts that number a bit higher at just over 93,000. That means coal mining employs roughly 0.06% of the current workforce -- at most.
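
A quick back-of-the-envelope check, assuming a U.S. labor force of roughly 150 million (my rough assumption, not a figure from either source): 93,000 ÷ 150,000,000 ≈ 0.0006, or about 0.06 percent. Even using the higher jobs estimate, coal mining is a rounding error in the national employment picture.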

But when you think about it, those numbers aren't all that surprising. Alternative energies are growing as technologies become more affordable. Plus, coal mining is a shitty job. The life of a coal miner is nasty, brutish, and short -- largely thanks to diseases like cancer and black lung, in combination with poor safety conditions in mines. And coal mining doesn't pay particularly well either. Again, looking just at Kentucky, the Appalachian counties in which coal mining is most prevalent are consistently among the poorest counties in the U.S.

Finally, it's worth noting that progress always comes at a price. Remember the pianist who played vaudeville tunes to accompany silent films? Of course you don't. Since you've been alive, theaters have only shown "talkies." Huzzah! Hell, the expansion of electrical service that coal-fired power plants made possible also hastened the decline of kerosene lamp manufacturing. It was worth the trade each time, and the same is true now.

It's disingenuous to accuse Obama of waging a war on coal miners when in truth their jobs are just collateral damage in a larger battle. I doubt there are many politicians who truly relish eliminating jobs. But quite frankly, coal mining jobs aren't worth protecting, especially at the cost of addressing the more pressing problem of global warming. And if Mitch McConnell and the other Senate Republicans really gave a damn about coal miners, they'd try to find them better jobs, doing literally anything else.

Monday, November 9, 2015

The Blogger's 10 Commandments

"Moses with the Ten Commandments"
by Rembrandt, 1659
The world changes quickly. Faster when you’re in PR. Even faster if you deal with online communication. Still, we have to keep up.

Think about how much has changed in just the last decade. Tweeting used to be something only birds did, “friend” is a verb now, and around every turn you’ll find a variety of specialized blogs (“blog” being short for “web log,” though it sounds like some species of monster that lived in your childhood closet).

Blogs are great ways for organizations and individuals to communicate ideas to anyone interested in listening. The problem is that a lot of blogs suck...hard.

Bloggers commit several errors, usually because they write before they think. Blogging is like all other writing: it’s systematic, it’s rules-driven and it takes time to master. But rather than leave you to wander for years in the digital desert learning this stuff the hard way, I thought you could use some guidance. I give to you, my chosen people, The Blogger’s 10 Commandments:

I.) Remember your audience. Every blog reader asks the same thing: How does this affect ME? It’s sometimes okay to inject yourself and your personal stories into your posts, but make sure you offer a clear take-away for your readers. A blog caters to them, not you.

II.) Master writing headlines. This is the most important part of each post. Few people will make it past the headline unless you give them a reason. Create reader interest with your headlines. Some helpful headline techniques include: asking questions, using superlatives, creating intriguing analogies or employing numbers – particularly if you’re making a list like I am. How many “Top #” lists have you seen just today?

III.) Stay focused on your purpose. Every blog serves some purpose and every post should advance that purpose. Most blog hosting sites let writers clearly state what they intend to accomplish with their blogs, so follow the guidelines you set.

IV.) Be conscious of length. Posts should be long enough to cover the topic at hand, but not so long that readers lose interest. A good rule of thumb is to keep blogs between 200 and 500 words if you post frequently. Feature length blogs are sometimes appropriate, but be careful with those.

V.) Write like you speak. Blogs are meant to be conversational, so bend the rules of grammar to match your speech: end a sentence in a preposition, start sentences with conjunctions, even split an infinitive or two. But don’t get sloppy. And don’t, like, you know, take it too far, like, like...ugh.

VI.) Participate in the conversation. Speaking of conversation…you know those little comment thingies? They aren’t there to facilitate random rants. They exist to facilitate conversation, both among readers and between you and your readers. Shorter posts leave room for them to interject, and for you to respond.

VII.) Keep a schedule. Your readers expect to hear from you at certain intervals. Don’t disappoint. Keep a posting schedule. Usually twice a week is a good way to go, but depending on your purpose you might post more or less often. Different content requires different schedules. I find the 1-7-30-4-2-1 mnemonic pretty helpful.

VIII.) Think past words. It’s the internet! You have endless amounts of information to rely on, so don’t use only words. Embed photos and video. Link to relevant information. Just like any other medium, you want to make full use of this one. But be aware of copyright limitations or you can get into serious trouble.

IX.) Know the power of design. Everything about your blog communicates something. The template, style and background are no different. Select ones that advance your set purpose and speak to your audience. Usually these overarching design choices are handled through cascading style sheets (CSS), but bloggers often introduce subtle design elements using HTML. If you want to bold, italicize or underline something, there’s markup for that, and similar markup handles photos, videos and links (there’s a short sketch of what it looks like just after this list).

X.) Move the audience to act. Give readers something to do. Engaging them in your message is a means to an end, so provide an end. Prompt them to share the post. Ask them to make comments. Invite them to ask questions about products or services. Direct them to sites to buy tickets for events. Creating enthusiasm about your topic only takes readers so far. Show them where to go next.
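
To make Commandment IX a bit more concrete, here’s a minimal sketch of the kind of inline markup a blogger might paste into a post editor. The file name and URL are placeholders for illustration, not references to any real page:

<p>This sentence has a <strong>bold</strong> word, an <em>italicized</em> word,
and a <a href="https://example.com">link to more information</a>.</p>
<!-- An embedded image; the alt text describes the photo for readers who can't see it -->
<img src="my-photo.jpg" alt="A short description of the photo">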

This list is not a complete guide to blogging. I’m not sure there even is such a thing. But this list should at least get you started as you sit down to write your first posts. After some practice, maybe you can comment here and impart your wisdom to me. The learning never ends…not for anyone.

Thursday, November 5, 2015

It's Democracy, Dumb Ass!

On Tuesday, my home state of Kentucky elected Republican Matt Bevin as governor. To outsiders, this hardly seems surprising. Kentucky consistently votes conservative in national elections. Currently both senators and five of the state's six representatives are Republican. Since the 1960s, Democratic presidential candidates have only carried the state four times, and each time the candidate was a Southerner (LBJ, Jimmy Carter, and Bill Clinton -- twice).

However, state-level elections typically tell a different story. Democrats have held the governorship for 40 of the last 44 years. Additionally, the Kentucky House of Representatives is the only state legislative body in the South currently held by Democrats. So why the change?

Voter turnout has become the scapegoat. Indeed, the numbers are troubling. Results show that Bevin captured 52.5% of the vote, while Democratic challenger Jack Conway came in at 43.8%. That looks like a pretty resounding mandate, until you consider that voter turnout was 30.7%, which is abysmally low.
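
Run the arithmetic (assuming the 30.7% turnout figure is measured against registered voters): 0.525 × 0.307 ≈ 0.16, meaning only about 16% of registered voters actually cast a ballot for Bevin. Some mandate.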



It's also worth noting that there are more registered Democrats in Kentucky than Republicans, meaning that lower turnout benefits conservatives -- as it does in much of the country. It's sad for the Republican Party that suppressed voter turnout is its best ally, but arguably even sadder for the Democratic Party that its constituents can't be bothered to vote.

Low voter turnout is a national problem, and there are several proposals to correct it: make voter registration automatic rather than an opt-in process, expand early voting, move Election Day to the weekend, etc.

I agree that voter registration should be automatic. A recent Census Bureau report estimates that a little over 35% of eligible voters are not registered. Still, expanded registration won't truly solve the problem as fewer than half of registered voters bother to vote in non-presidential election years.

Convenience could be a factor. Expanding early voting would likely help, though I'm skeptical about moving elections to the weekend: Would Americans be more likely to take time out of an off day than a work day? Perhaps a better solution would be to develop a system in which voters could cast their ballots at any polling station. After all, it's not 1850. Many Americans no longer live and work in the same proximity they once did, and voting at a polling place closer to work might help boost turnout.

But still, that doesn't fix the apathy. Part of the problem is that Americans are uninterested in, and ignorant of, political issues and processes. Voters perceive national elections, and in particular presidential elections, to be the most important because they involve higher offices; turnout figures bear this out. Presumably, they see the stakes as higher, so they show up.

But this is democracy, dumb ass! The stakes are always high. State and local elections are just as important as -- and arguably more important than -- national elections. Meaningful change happens at these levels, and voters have greater influence. Mathematically, your vote carries more weight because the pool of voters is smaller and the ridiculousness of an electoral college is a nonfactor. Not to mention that you may actually have real access to, and influence over, the candidate.

Indeed, local politics may be the last bastion of representative democracy left in America. The U.S. Congress now has more millionaires than non-millionaires. In what way are these people our peers? In state and local elections, there at least remains the hope of true citizen governance, where intelligent and civic minded people can serve without being independently rich or owned by moneyed interests.

Until citizens understand the importance of civic engagement -- at all levels -- all other efforts will be half-measures at best. Wake up, America. Let 'em know you're there.

Thursday, October 8, 2015

Considering the tradeoff: The cost of the Second Amendment

This semester, like every other, I teach my PR writing students that one element of newsworthiness is unusualness. If something is happening for the first or last time, rarely happens, or is just plain strange, it probably has some news value. It's been about a week since the Oregon shooting, and I remember vividly my reaction to hearing the news: "Eh." The sad truth is that acts like these have become prevalent enough that they are no longer unusual, and as such we've become desensitized to them.

I was prepared for the predictable news cycle to run its course: the shooting happens; the president, grief stricken, speaks to the nation; prominent figures and the media half-heartedly debate gun control measures; we delay action; and then Donald Trump says something stupid and/or racist and we forget the whole thing ever happened.

I must say, however, that I felt President Obama's remarks displayed more anger than grief, and I found that refreshing, particularly when he commented that "our thoughts and prayers are not enough" (as if they ever are). This attack on our societal complacency regarding gun violence will hopefully jar us into acting, but at the very least it's made for a more interesting conversation than we've become accustomed to.

Still, the predictable pro-gun arguments popped up all over social and traditional media. So let's take a look at what I consider the top five, starting with the most absurd and working our way up.

5.) You can ban guns, but that won't stop criminals from obtaining them.

True. But this is more an argument against laws in general than an argument against gun control measures. Laws exist to deter undesirable behavior and provide means for punishment and isolation for those who commit heinous acts. Criminals, by definition, are those who break laws, and so long as laws exist there will always be criminals. The success of a law is best measured by the reduction -- not the elimination -- of unwanted actions committed by the population as a whole.

4.) I have a right to protect my family.

You absolutely do, and the best way to protect your family is actually not owning a gun. Generally speaking, in households with guns, deaths of family members and suicides are far more common than in households where no gun is present. In truth, your guns are more likely to be used -- either purposely or accidentally -- to kill a member of your family than some masked intruder.

3.) Guns don't kill people. People kill people.

It's been shown through various statistics that increased access to guns correlates with increases in accidental gun deaths, gun homicides, and suicides. The gist of this line of argumentation is that if we banned guns, we'd still find ways to kill one another, so what's the point? Well, the point is that we'd almost assuredly do so at a slower pace. Studies on suicide best illustrate this point. First, as you might expect, those who attempt suicide using firearms are far more successful at killing themselves than those using other means. There is also considerable evidence to suggest that once a preferred means of suicide is eliminated, many people simply choose not to attempt suicide at all.

2.) The problem isn't guns, it's mental illness.

I'm somewhat skeptical of this argument, particularly since we only seem to categorize gunmen as mentally disturbed in hindsight. More than anything, I see the mental health argument as a convenient red herring. But for the sake of argument, let's assume our inability to diagnose and properly treat the mentally ill -- and thus keep firearms out of their reach -- is the true problem. Why aren't we doing anything to correct this? The U.S. has a rather appalling history concerning the treatment of the mentally ill, and we're clearly not doing enough to help these people or to stop perpetuating a horrible stigma. As mental illness relates to gun violence, the discourse here is that guns aren't the problem, the mentally ill are the problem, and...and that's it. No concrete solution is ever put forth. In the words of John Oliver, if we're going to continue this ruse "then the very least we owe [the mentally ill] is a fucking plan." The fact that no one ever develops a viable way to address the mental-illness-gun problem suggests that this argument is largely nonsensical.

1.) The Second Amendment guarantees my right to own a gun.

Let's go to the text:
A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.
Many argue that the framers' intent was to provide for a national defense because the U.S. had no standing military at the time of Constitutional ratification. On that reading, more recent and more liberal interpretations -- those justifying a private right of all citizens to own firearms -- constitute an overreach. While I am sympathetic to this argument, I tend to prefer interpretations of the Constitution that allow the greatest amount of personal freedom and then work backward to restriction, if necessary.

With guns, it's now necessary to work backward. And I'll even acknowledge that doing so infringes on our Constitutional right to own and bear arms. But perhaps it's time we asked ourselves what the right is ultimately worth to us.

All rights come with some tradeoff. Take the rights of free speech and freedom of religion enumerated in the First Amendment. The tradeoff for my right to openly criticize my government and to practice a religion of my choosing -- which happens to be none -- means that I must also allow individuals to spout racist nonsense or express moronic and uninformed opinions. And, of course, I also have to allow Scientology to exist. Terrible as the downside of the First Amendment is, the good far outweighs the bad.

And for what it's worth, there are certain instances where we have agreed to place reasonable restrictions on speech. Threatening speech is forbidden, as are libel and slander. Commercial speech is also highly regulated and false advertising is downright illegal.

Our gun rights are already reasonably restricted to some degree as well. Firearms are forbidden on airplanes, in most schools, and on most government property. Upholding our gun rights simply isn't worth the potential costs in these scenarios.

In truth, it's not worth it in most scenarios. As I see it, guns have two legitimate functions: self-defense and hunting. And as I explained earlier, statistics show that guns aren't all that effective when it comes to self-defense. As far as hunting is concerned, a bolt-action rifle and a breech-loading shotgun are more than sufficient. If we banned every other firearm tomorrow, would we really be affected so negatively?

Still, even if we didn't want to go that far, there are a host of other reasonable actions we might take. More extensive background checks, tougher regulations on dealers, requirements for safe gun storage, mandatory gun safety courses, and the implementation of smart technology are just a few examples. Which would work best? Unfortunately we have no idea because, believe it or not, the U.S. Congress has banned federal agencies from conducting most gun violence research.

There's no denying that guns are deeply ingrained in the American culture. Hell, I own a gun. I love to go shooting at the range. It's fun. It makes you feel powerful. And, I think for many, gun ownership serves as a symbol of control and self-reliance. But I'm no longer persuaded that the benefits of upholding the Second Amendment outweigh the costs, and if we're not willing to at least consider what a different reality concerning guns in America might look like (i.e., funding studies on potential gun control initiatives), then societally, we're probably not too sure about this tradeoff either.

Sunday, September 13, 2015

Why it's okay to call people stupid...sometimes...sort of...

It's never been particularly polite to insult people in public. But people often say dumb or questionable things. Challenging such foolishness was once the duty of a reasoned citizenry, but now we practically consider such behavior rude. What went wrong?

It seems to me that we've mistakenly sacrificed our ability to "call bullshit" on the altar of pluralism. And I say mistakenly because I believe we fundamentally misunderstand what it means to live in a pluralist society. Opinions are not meritorious simply by virtue of the fact that you hold and express them. Opinions deserve a voice in the marketplace of ideas, but the very notion of a marketplace assumes the existence of competition; therefore, all thoughts and ideas must be subject to ridicule if we ever hope to achieve any semblance of consensus on which ideas hold water.

In an effort to avoid giving offense, we shy away from applying much needed ridicule. This is costing us dearly, and perhaps because I work as a professor, nowhere is it more evident to me than in the classroom.

Generally, I shy away from bashing millennials (probably because I am one), but I found myself agreeing with many of Caitlin Flanagan's arguments about the decline of college education -- though I find her thoughts on "farm to table dining" and "idiot politically correct humanities curriculum" to be either misinformed or non sequiturs. Still, I agree that universities should offer a means through which students might "be relieved of [a] great burden of ignorance."

Constructive criticism is the educator's greatest friend. I can remember very vividly my first semester of college. I was fortunate enough to take Communication 201 with William Thompson, who not only had no qualms about challenging your ignorance but rather enjoyed it -- almost sadistically, in point of fact. He saw it as his duty to expand the way in which his students viewed particular issues, and not just those related to the communication subject matter.

I also had the great fortune of studying English 105 with Dr. Dennis Hall. Regularly -- by which I mean weekly -- he made it his mission to put my ignorance on display. He often read aloud to the class passages of my meager attempts at writing, opening the floor to waves of public critique. I was offended and embarrassed, but I never spoke out for a very simple reason: his criticisms were completely valid. In speaking with him privately at the close of the semester, he confessed that he actually thought my writing was a bit better than what my peers were producing, but he feared I would become complacent and fail to improve if I weren't challenged. Right again! Hall motivated me, and my writing -- such as it is -- would be much worse without his instruction.

I went on to take three more electives with Thompson and two more with Hall over the next four years, and both men served on my honors thesis committee. The challenges, ridicule, and occasional outright scorn they applied were never meant as personal affronts. They encouraged me to think more broadly, to act with clearer purpose, and to become a more well-rounded, functioning individual.

Now, as I transition into my position as an assistant professor, I inherit these responsibilities. Problematically, the waters are much more treacherous than they were even a decade ago -- or perhaps just as treacherous and I simply didn't recognize the struggle from the vantage of my comfortable lounge chair, rooted firmly in the sandy shore.

Perhaps my critiques must be subtler. Every semester I teach writing, and I always enforce this rule: it's better to show it than to tell it. My point here is twofold. One, if you can back up your claim with evidence, people take it more seriously. Two, if you have a hard time finding evidence for your claim, perhaps it holds no merit. Admittedly, this is a rather meager challenge, but it's a start.

It's rather easy for me to critique assignments, but it's immensely more difficult to critique ideas. Tenure and promotion for junior and adjunct faculty are to some degree determined by a flawed student evaluation system. The easiest ways to boost evaluation scores are to dole out mostly As and Bs -- which leads to grade inflation, a concern among some -- or to get students to just plain like you. Higher grades help in this regard, as do minimizing assignments and dodging confrontations. But having beliefs and ideas confronted and challenged defines education, and it's downright necessary when someone makes a stupid or unfounded claim.

Generally, I've been fortunate to teach many bright students, but I have heard some express a variety of what I and the scientific majority consider stupid or ill-informed opinions. When a student remarks that evolution is "just a theory," should I point out that overwhelming evidence for the process of evolution suggests otherwise? When a student suggests that the universe is only  a few thousand years old, should I direct him or her to the eloquent remarks of Lawrence Krauss, who explains quite clearly how cosmologists determine the age of the universe to be around 13.72 billion years? When a student argues that vaccines cause autism, should I explain the difference between causation and correlation -- and should I also point out that this fallacious notion arose from erroneous studies and that even organizations like Autism Speaks agree that no scientific evidence exists to support such a claim? When a student argues that global warming is a hoax, should I point out that there is virtually undeniable evidence that the planet is indeed getting hotter, that human beings are contributing to this warming, that we are currently experiencing the early effects of climate change, that there is vast consensus among climatologists on all these points, and that the inability among lay persons to distinguish between climate and weather is a chief contributor to unwarranted skepticism? When a student claims Donald Trump would make a good president, should I abruptly kill him or her for the betterment of our species?

All tough questions.

Most days I feel trapped in a Catch-22, where doing my job effectively puts my job security at risk. That dilemma, perhaps as much as any other single factor, is a major reason for the declining quality of American education -- at all levels. As a scholar of ethics, I'm becoming increasingly convinced that I have a moral obligation to make students uncomfortable if it helps them learn. William Thompson, Dennis Hall, and Leon Festinger will at least be proud.

Friday, August 28, 2015

When Social Science Fails Itself

Yesterday, The New York Times reported on a study suggesting that less than half of research findings published in prominent psychology journals could be confirmed upon replication. The stunned, and at times stupefied, reactions from many readers show the public lacks a fundamental understanding of the scientific process.

The authors of "Estimating the reproducibility of psychological science," published in Science, investigated 100 manuscripts published in three leading psychology journal. Their goal was to test the veracity of the findings by replicating the original procedures; only 36 percent of findings in the original studies remained significant upon replication.

To many, these inaccuracies represent a damning failure for social sciences. I would argue they actually embody a triumph for the process of science, but at the same time point to a glaring problem in the process of publishing scientific research.

The public views publication as the pinnacle of research, believing that if it makes it to print then it must be fact. Therefore, when studies fail the test of replication, public confidence in science is shaken. And it is shaken.

But that's largely because of a misconception that a single study is enough to establish research findings as facts. In truth, science is an exercise in consensus. When we are able to establish that findings hold true over time and across varied situations, we build a reliable body of knowledge that becomes the basis for scientific understanding. However, when replication fails to yield support for a particular finding, it is dismissed and we move forward with different ideas. Or at least that's how it's supposed to work.

In truth, the problem lies not with the scientific method, but instead with the publish-or-perish environment of academia. Tenure and promotion are based largely on one's ability to publish original research, and to publish it often. Problematically, we take the notion of "original" a bit too literally. As the authors of the Science piece put it:
Reproducibility is not well understood because the incentives for individual scientists prioritize novelty over replication. Innovation is the engine of discovery and is vital for a productive, effective scientific enterprise. However, innovative ideas become old news fast. Journal reviewers and editors may dismiss a new test of a published idea as unoriginal. The claim that "we already know this" belies the uncertainty of scientific evidence.
Put simply, replication is a necessary component of science, but it's not sexy, so it's hard to publish. And since academics must publish to survive, they don't replicate studies often.

Perhaps the saddest part of this indictment is that it's our own damn fault. Most reputable social scientific journals are peer reviewed, meaning that we have the power to rectify a problem we know exists simply by changing our review policies.

So why don't we? I thought The New York Times was spot on there:
The act of double-checking another scientist's work has been divisive. Many senior researchers resent the idea that an outsider, typically a younger scientist, with less expertise, would critique work that often has taken years of study to pull off.
Certainly some -- but not all -- senior researchers feel this way, and I find it shameful. The whole premise of scientific inquiry is that no person or idea is above reproach. To quote Albert Einstein, "The important thing is not to stop questioning." 

Incidentally, when Einstein first published papers on the photoelectric effect, Brownian motion, and special relativity -- the last of which shook the very foundations of Newtonian physics -- he was only in his mid-20s, very much a junior scholar. But how much could a 26-year-old possibly know anyway?

Thursday, August 6, 2015

Jon Stewart: The Walter Cronkite of a Generation

Stewart at the 2008 USO-Metro Merit Awards
Yeah, I said it. Well, more specifically Lewis Black said it:
Weirdly enough, [Jon Stewart] was, on a certain level, the Walter Cronkite of his generation. He was the trusted source. People trusted what came out of his mouth. What made Cronkite that and what makes Jon that is a certain kind of honesty that is related through the medium, and the one thing I learned from television is that it doesn't lie.
While Lewis Black may have stolen my headline in his Entertainment Weekly interview, and while I agree with the gist of his argument, I think there's more to the Stewart-Cronkite comparison. Specifically, I would argue temperance, sincerity, accuracy, and depth are the chief contributors to Stewart's legacy.

The gut reaction to the Stewart-Cronkite comparison is easy to sum up: Stewart is a comedian, Cronkite was a newsman, so how can you compare the two? But Stewart was more than a comedian. He rose to higher levels of credibility than his comedic counterparts, due largely to his temperate nature.

Stewart speaks sternly, but without the rage of a Lewis Black. Stewart hits with biting satire, but refrains from the obtuse and deadpan tendencies of his protege, Stephen Colbert. And Stewart has always been more influential than the late night hosts of network television because he was never afraid to say something of substance or to alienate segments of his would-be audience. But, while Stewart is certainly left-leaning, he avoids playing the outraged -- and often off-putting -- liberal we see in Bill Maher.

In fact, in 2010 Maher very publicly criticized Stewart for going too easy on Republicans. I happen to agree with Maher's assessment that, if we were to dole out points for craziness between the two major parties, the Republicans would win in a landslide that would make the Reagan-Mondale election look like a nail-biter.

The brilliance of Stewart was that he never took the bait. While he criticizes the right with greater frequency than the left -- and at times more harshly -- Stewart lacks the smugness that Maher constantly displays. There is a real sense that Stewart is wrestling with the same political non sequiturs as his viewers, whereas Maher appears stubbornly entrenched in a preset ideology.

In many ways, the ongoing struggle of Stewart's personal politics fuels his quest for accuracy in reporting. Love him or hate him, you have to admire that Stewart works incredibly hard to represent the facts accurately, even though, as a comedian, he has no professional responsibility to do so. Even more impressive to me, he freely admits the factual errors he makes, perhaps most notably his mischaracterization of Dante Parker's death as the result of a police shooting rather than a drug overdose.

In that same segment, Stewart described The Daily Show as a "media counter-errorism" program and discussed the difficulties of working within such confines. Predictably, Fox News dismissed Stewart's genuine effort to raise issues of police brutality and militarization in favor of a more simplistic narrative: Jon Stewart hates police officers. Fox News anchor Brian Kilmeade went so far as to imply that Stewart was at best unfeeling toward police officers who die in the line of duty. Stewart's response to Kilmeade was characteristic of the complexity and nuance he attempts to bring to reporting:
You can truly grieve for every officer who has been lost in the line of duty in this country and still be troubled by cases of police overreach. Those two ideas are not mutually exclusive. You can have great regard for law enforcement and still want them to be held to high standards.
More than anything, that nuance, that context, and that depth which Stewart brings to important issues make him an effective newsman. I don't watch television news anymore. My main source for news is The New York Times. Largely this is because I can't remember the last time I watched a nightly network newscast or a 24-hour cable news program and walked away feeling informed.

Part of that is a function of a fragmented media environment. It's difficult for anyone to have the gravitas and impact of a Cronkite because news audiences just aren't that large anymore. Also, a key aspect of news is that it is, well, new. Being the outlet to break a story matters, and the continuous nature of the news cycle in our digital environment puts an even higher premium on being first. Sadly, the drive to be first often leads to speculation, and even worse, misinformation. Unfortunately, many traditional news outlets aren't as responsible about correcting their factual errors as The Daily Show.

Another factor in the decline of TV news is a lack of resources. There was a time when networks had reporters on the ground, actually gathering information. Now, more often than not, on-location reporters are there just to say they are there -- and to get a good background shot, not to provide background on the story. That's one reason the "on location" reports from The Daily Show correspondents are so funny: we all know they're not on location, and we all know it doesn't matter.

What's worse, I all-too-frequently see anchors reading tweets from viewers as if they were field reports. I just start throwing things at the screen. Man-on-the-street interviews were always sorry excuses for news coverage, and viewer tweets are no different. I don't care what @hawkeyewoman9834 (or Kathy, 33, from Iowa) thinks about the administration's new immigration policy, though as an unemployed dental hygienist and mother of three, I can see why CNN sought out her expert opinion.

Institutional forces aside, much of the blame lies with the men and women in the television news business. The news media is supposed to be our watchdog, which means calling bullshit on institutional powers and their leaders. Somehow that core function of the news has been relabeled as subjective opinion, or even worse, partisanship. Good journalists have been scared into a position of reporting facts with no context, leaving relatively uninformed and often unqualified viewers to interpret what they see and hear. Bad journalists have devolved into pure punditry, where ideology dictates the facts rather than being informed by them.

This wasn't always the case. In 1954, Edward R. Murrow unabashedly took on Joseph McCarthy. He pointed out the senator's contradictions and demonstrated that, even if the Red Scare were not an overreaction, the conduct of McCarthy's hearings was itself, ironically, un-American. Was that a partisan attack on a Republican senator or simply a statement of what was and a reasoned argument that it ought not be? Currently, the Edward R. Murrow Award is among the most prestigious honors given in recognition of outstanding electronic journalism.

Following the Tet Offensive in 1968, Cronkite said the following concerning the Vietnam War:
To say that we are closer to victory is to believe, in the face of the evidence, the optimists who have been wrong in the past. To suggest we are on the edge of defeat is to yield to unreasonable pessimism. To say that we are mired in stalemate seems the only realistic, yet unsatisfactory conclusion. On the off chance that military and political analysts are right, in the next few months we must test the enemy's intentions, in case this is indeed his last big gasp before negotiation. But it is increasingly clear to this reporter that the only rational way out then will be to negotiate, not as victors, but as an honorable people who lived up to their pledge to defend democracy, and did the best they could.
Cronkite traveled to Vietnam, gathered information first-hand, analyzed it, and developed a reasoned argument that America was in the midst of an unwinnable war. Was Cronkite acting as an unpatriotic communist unsupportive of U.S. troops, or a responsible journalist seeking to inform his audience as best he could? It's worth noting that Walter Cronkite hosted The CBS Evening News for another 13 years after that report, retiring as one of the most respected journalists and revered public figures in U.S. history.

Like Murrow and Cronkite, Stewart isn't afraid to call bullshit. He isn't afraid of controversy. He isn't afraid of reasoned editorializing. In short, he isn't afraid to inform his viewers, which sadly can't be said of many "real" television journalists. What's more, he makes you laugh, if only to keep you from crying. In the broader American zeitgeist, Stewart's legacy won't even remotely challenge that of Cronkite, but for his generation and for mine, Stewart may be remembered as the most trusted man in America.

Tuesday, July 21, 2015

Kentucky Clerks' Stance Unrelated to Religion

Kentucky clerks in five counties (Casey, Clinton, Lawrence, Montgomery, and Rowan) are refusing to issue marriage licenses to homosexual couples. The clerks argue that to do so violates their religious liberties. The ACLU has filed suit against Rowan County Clerk Kim Davis, and my hope is that the federal case reveals the most important truth in this farcical debate: opposition to gay marriage has nothing to do with religion.

An article in Louisville's Courier-Journal provides some insight into Davis' reasoning for refusing to issue marriage licenses. In the article, Davis claims that her actions were "thought out" and that she "sought God on it." According to Mike Wynn, author of the article:
On the stand Monday, Davis described herself as an Apostolic Christian who believes marriage is defined as the union of one man and one woman under the Bible -- "God's holy word" -- and said she contemplated her policy for months beforehand.
Supporters of Davis have shown up in droves outside the courtroom, many sporting signs with biblical verses that reinforce their belief that marriage should be between one man and one woman. The most commonly featured passage is Genesis 2:22-24:
Then the rib which the Lord God had taken from man He made into a woman, and He brought her to the man. And Adam said: "This is now bone of my bones and flesh of my flesh; she shall be called woman because she was taken out of man." Therefore a man shall leave his father and mother and be joined to his wife, and they shall become one flesh.
Many other passages that simply repeat Genesis 2:24 are also featured. Matthew 19:5 and Mark 10:7-8 are fan favorites, I assume because they demonstrate Jesus' support of Old Testament doctrine. Interestingly, the next verses (Matthew 19:6 and Mark 10:9) are frequently omitted. In both gospels, Jesus says the following:
Therefore what God has joined together, let not man separate.
You're probably familiar with these verses as they are often invoked at the end of wedding ceremonies. In them, Jesus very clearly condemns divorce. There are other biblical condemnations of divorce as well, most notably from early church leader Paul in 1 Corinthians 7:10-11:
Now to the married I command, yet not I but the Lord: A wife is not to depart from her husband. But even if she does depart, let her remain unmarried or be reconciled to her husband. And a husband is not to divorce his wife.
If we're going to follow the letter of the law as outlined by scripture, accepting that biblical marriage constitutes only the union of one man and one woman also means accepting that the Bible recognizes neither divorce nor second marriages and considers both sinful.

Adultery is another big no-no related to marriage. Apart from being expressly forbidden in the Ten Commandments (Exodus 20:14), it is condemned in other books, as is premarital sex:
Marriage is honorable among all, and the bed undefiled; but fornicators and adulterers God will judge. (Hebrews 13:4)
In fact, that judgment manifests itself in a rather harsh punishment prescribed in Leviticus 20:10-13:
The man who commits adultery with another man's wife, he who commits adultery with his neighbor's wife, the adulterer and the adulteress, shall surely be put to death. The man who lies with his father's wife has uncovered his father's nakedness; both of them shall surely be put to death. Their blood shall be upon them. If a man lies with his daughter-in-law, both of them shall surely be put to death. They have committed perversion. Their blood shall be upon them. If a man lies with a male as he lies with a woman, both of them have committed an abomination. They shall surely be put to death. Their blood shall be upon them.
Certainly I over-quoted here, but I did so to prove a point. The Bible treats adultery and homosexuality as equivalent sins. Has Rowan County Clerk Kim Davis attempted to deny marriage licenses to adulterers? Or to divorced heterosexuals seeking to marry another? According to The Courier-Journal and Davis' own testimony, the answer is a resounding no:
After working in the clerk's office for nearly 30 years, she said she has never denied a license on religious grounds or asked applicants about relationships she might find sinful.
It appears Davis has been rather derelict in her religious duties. In fact, from a biblical standpoint, as a woman she has no place instructing us on matters of religion in the first place:
Let your women keep silent in the churches, for they are not permitted to speak; but they are to be submissive, as the law also says. And if they want to learn something, let them ask their own husbands at home; for it is shameful for women to speak in church. (1 Corinthians 14:34-35)
In this particular instance, I agree with scripture. I think we'd all be better off if Davis would simply shut the hell up.

Why? Clearly Davis is not interested in preserving biblical marriage. She has a 30-year track record of failing to do so. Her objection to issuing marriage licenses boils down to a discomfort with homosexuality, and she is not alone in that feeling. Many people believe homosexuality to be unnatural and strange and its practice to be disturbing or outright disgusting.

The right to hold that belief is actually something the First Amendment does uphold. However, the fact that something makes you uncomfortable is not a compelling reason to suppress it or make it illegal, and the Supreme Court was correct to extend equal protection under the law to homosexual couples.

Moreover, the religious freedom argument that Davis and the other four county clerks in Kentucky are making is nonsensical. Even if these officials' actions were motivated by religious belief -- which they are not -- there is nothing that prohibits adherence to both the laws of man and of God. Jesus was arguably the first advocate for the separation of church and state:
Then they asked Him, saying, "Teacher, we know that You say and teach rightly, and You do not show personal favoritism, but teach the way of God in truth: Is it lawful for us to pay taxes to Caesar or not?" But He perceived their craftiness, and said to them, "Why do you test Me? Show Me a denarius. Whose image and inscription does it have?" They answered and said, "Caesar's." And He said to them, "Render therefore to Caesar the things that are Caesar's and to God the things that are God's." (Luke 20:21-25)
Upholding this law is not an affront to religious liberty. And if these five clerks feel it is, they should resign. I see no reason why we should praise, let alone pay, an individual for refusing to do his or her job.