John Boehner’s Legacy

Published: The Good Men Project (October 29, 2015)

As Republicans scramble to anoint Rep. Paul Ryan as the successor to soon-to-be-former Speaker of the House John Boehner, it is worthwhile to examine precisely why the Ohio congressman’s tenure has been such a failure. After all, Boehner has reportedly been planning his resignation for a long time, even claiming that he had considered stepping down as early as last November. His reasoning, then and now, was straightforward and reasonable: The staunch opposition of Tea Partyers and other hardline conservatives to his leadership had made it impossible for him to reclaim his current title next year without a prolonged and nasty fight.

Of course, if Boehner had been able to balance his conciliatory nature with a strong vision for his party and an even stronger grip – in the tradition of the best Speakers – he might not have found himself in this predicament in the first place.

To understand why this is the case, it’s illustrative to look at the last two Speakers who presided while a president of the opposing party was in the White House. The last Republican to find himself in this situation was Newt Gingrich, who led a memorable GOP takeover of the House during the 1994 midterm elections that forced President Bill Clinton to deal with a hostile House for the last six years of his presidency. While Gingrich was hardly short of invective against Clinton (as his spearheading of the successful movement to impeach him later demonstrated), his focus during those contests was on unifying his party behind a clear ideological vision, the so-called “Contract with America.” Although he was later pressured into resigning after being blamed for his party’s setbacks in the 1998 midterm elections (to say nothing of his growing list of scandals and well-known massive ego problem), for a time he was able to unite his party into an effective counterweight against the Clinton administration.

It’s worth noting here that this did not mean Gingrich simply obstructed everything Clinton tried to accomplish. To be a truly effective Speaker, Gingrich needed a list of positive accomplishments, and that in turn necessitated collaborating with the president. As explained by Steven M. Gillon, the author of “The Pact: Bill Clinton, Newt Gingrich, and the Rivalry that Defined a Generation,” “they worked behind the scenes crafting significant legislation that had a positive impact on the economy. And they were doing so at great risk.” Gingrich even worked privately with Clinton to reform Social Security and Medicare, with each man attempting to form a centrist coalition despite knowing that doing so would alienate the stalwart bases in their respective parties (the Lewinsky scandal ultimately sabotaged their efforts).

Boehner’s immediate predecessor, Rep. Nancy Pelosi of California, took over as a liberal Democrat during the final two years of the Republican presidency of George W. Bush. Like Boehner, Pelosi had to deal with plenty of partisan overzealousness from within her own party, incurring the ire of the far left almost immediately by declaring that efforts to impeach Bush were “off the table.” Just as notably, she was able to find common ground with Bush on important issues like energy independence and the bank bailouts, something she has not been reluctant to point out when criticizing House Republicans’ current treatment of President Obama.

Although Pelosi didn’t come up with an ideological agenda akin to Gingrich’s Contract with America, she did exert her influence to shape her party in a similarly meaningful way. As the highest-ranking Democrat to oppose the 2002 Iraq War resolution – a vote on which her chief opponent has staked his campaign against her – Pelosi was frustrated that so many Democrats had failed to heed her warning about the impending invasion. Consequently she made it clear that her sympathies rested with then-Senator Obama (who had also opposed the 2002 resolution) over then-Senator Hillary Clinton (who had supported the resolution) during their close campaign for the Democratic presidential nomination in 2008, even though political decorum prohibited her from making an official endorsement. The fact that she believed Clinton was a more divisive candidate than Obama, and as such would hurt the party’s chances in the general election, almost certainly strengthened her resolve.

When compared to Gingrich and Pelosi on the aforementioned grounds, Boehner consistently comes up short. Unlike Gingrich, Boehner never had a coherent ideological vision that could have united the Republican Party during Obama’s administration behind anything more than a visceral dislike of the president himself. He wasn’t even able to meaningfully bolster the GOP’s chances of replacing Obama with a president from its own ranks, infamously telling one crowd during the 2012 presidential election that in his opinion “the American people probably aren’t going to fall in love with Mitt Romney.”

Boehner’s biggest failure, however, was his inability to rein in the wilder elements in his party when their opposition to the president got out of control. Instead of collaborating with Obama on a series of centrist achievements much as Gingrich did with Clinton or Pelosi did with Bush, Boehner sat helplessly by while his party’s obstructionist ranks forced him to preside over two of the least productive congresses in American history. When he leaves office next month, he will have no significant legislative accomplishments of any kind associated with his tenure as Speaker.

While Boehner isn’t wrong to argue that it was the “false prophets” of his party’s far right who reduced him to such a sorry state, this isn’t a defense so much as an acknowledgment of his shortcomings as a leader. Legislative leadership has never been easy, but the great House Speakers – men like Henry Clay, Thomas Reed, and Joe Cannon – are lauded for their ability to forge compromises among even the most disparate political factions (in Clay’s case this skill was so invaluable that his death is widely considered a precipitating event of the Civil War). We’ll never know if another Republican might have done a better job as Speaker than Boehner – or, indeed, if anyone could have risen to the challenge – but it is undeniable that Boehner failed.

Why the Democrats need Martin O’Malley…

Published: The Good Men Project (October 27, 2015)

To begin, I must emphasize that I am not officially supporting the presidential campaign of Martin O’Malley, the former governor of Maryland. This article focuses on his candidacy because there is a practical argument to be made in favor of nominating him – one that is too compelling, even alarming, to be safely ignored.

Right now the Democratic Party has a problem. There are only three candidates left in the race – former Secretary of State Hillary Clinton, Senator Bernie Sanders of Vermont, and O’Malley – and it is widely assumed that one of them, Sanders, is unelectable. As political scientist Monica Bauer explained in an editorial for The Huffington Post, “it seems clear to me his open dismissal of capitalism makes him pretty much unelectable in a general election, and thus a disaster for the Democratic party, if they were to nominate him for president.” This view is hardly limited to Bauer; in fact, it sums up the conventional wisdom held by the Democratic Party for pretty much as long as “socialist” has been used as an epithet in this country. Although I personally feel that the stigma surrounding that term is unfounded, that doesn’t alter the objective facts regarding its potency. At the very least, a detached analyst has to concede that Sanders is a very risky candidate in terms of his ability to win the general election.

The same must be said about Clinton. While she is certainly more moderate than Sanders, the former First Lady has prohibitively high unfavorability ratings in swing states like Colorado, Iowa, Virginia, Florida, Ohio, and Pennsylvania. “Since voters in general elections normally won’t vote for a candidate they don’t like or at least find trustworthy, it’s imperative that a Democratic nominee hold positive favorability ratings going into Election Day,” observed columnist H. A. Goodman in The Huffington Post, who went on to note that the combined electoral votes of the six aforementioned swing states are enough to deny the presidency to any candidate who loses all of them. What’s more, Clinton’s numbers against her potential Republican opponents are disturbingly weak: A recent CNN/ORC survey had her losing by one point to Ben Carson and ahead of Donald Trump by only five (Sanders loses to Carson by two and beats Trump by eleven), while a Fox News poll taken at roughly the same time found her losing to Trump by five, to Carson by eleven, to Jeb Bush by four, and to Carly Fiorina by three (Sanders wasn’t included in that poll).

That leaves O’Malley, who despite being a popular two-term governor has so far not been taken seriously enough even to appear in head-to-head polls against various Republicans. Examining his record, one finds a strong enough resume: As mayor of Baltimore, his innovative cost-saving measures helped the city earn its first budget surplus in years, and his overall reputation for competence led to his being considered one of America’s top young big-city mayors during his tenure. Upon being elected governor of Maryland, O’Malley proceeded to rack up a series of impressive progressive accomplishments, including abolishing the death penalty, raising the minimum wage to $10.10 an hour, legalizing same-sex marriage, extending in-state tuition breaks to undocumented immigrants, and passing stricter gun control regulations despite heavy opposition from conservative Democrats as well as Republicans. This isn’t to say that O’Malley is without his own weaknesses – his “zero tolerance” policies as Baltimore mayor are blamed by many activists for worsening the mistreatment of racial minorities by that city’s police, while his various social programs resulted in tax increases that could prove damaging if brought up in a general election. Nevertheless, as Maryland Senate President Thomas V. Mike Miller put it, O’Malley was “the most effective Maryland governor since the mid-1970s,” which if nothing else earns him the right to be taken seriously next to his primary opponents.

None of this means that Democrats currently supporting Clinton or Sanders should abandon those options for O’Malley. That said, any Democrat who sincerely believes that a Republican victory next November would be disastrous for America must acknowledge that serious questions exist about both Clinton’s and Sanders’ electability. If they want to effectively confront these concerns, they need to do three things:

  1. They must interrogate Clinton and Sanders as to why they believe they can win. It isn’t enough for the two frontrunners to simply mouth platitudes about how their message and/or record will be strong enough to impress ordinary voters during the general election. Each one should be pressed to lay out, in a logical fashion, how they expect to win despite their potentially debilitating weaknesses. If they can’t or won’t do this, then their nominations would be liabilities to the party.
  2. They must start taking Martin O’Malley seriously as a presidential candidate. During the Saturday Night Live parody of the first Democratic debate, the sketch comedians included several jokes about O’Malley’s political irrelevance to that contest. This is a foolish attitude, if for no other reason than that it cuts the party off from an alternative that clearly requires sober consideration right now. This brings me to the third lesson…
  3. Democrats must never again allow their candidate field to become this limited. At its largest, the field in this race included five candidates – Clinton, Sanders, O’Malley, Lincoln Chafee (former senator and governor of Rhode Island), and Jim Webb (former senator from Virginia). Even if Clinton or Sanders is nominated and subsequently elected, the mere fact that the party is in its current predicament demonstrates the danger of not having a variety of contenders in the primaries. This should be instructive.

In an ideal America, the Democratic Party wouldn’t face the possibility of defeat more than a year away from the general election simply because there are only three candidates left before the race has officially started. If the party didn’t find itself in that very specific situation, there would be no need for pragmatic editorials like this one. Be that as it may, Democrats cannot afford to put Trump, Carson, Cruz, or Bush in the White House simply because they’d rather not admit that their preferred choice is unable to win. Because O’Malley is in the race, there is a third option that potentially staves off the dangers posed by a Clinton or Sanders nomination.

In light of the dire stakes involved, it behooves us to take him seriously.

Before “Steve Jobs,” there was “Pirates of Silicon Valley”: What a made-for-TV movie got right that Aaron Sorkin didn’t

Published: Salon (October 26, 2015)

As the new biopic “Steve Jobs” continues to receive rave reviews, it seems appropriate to stop and take a look at its predecessor — the “Citizen Kane” of made-for-TV movies, “Pirates of Silicon Valley.” I’m not simply comparing this film to “Citizen Kane” as a way of drawing attention to its quality. Much like Orson Welles’ 1941 magnum opus based loosely on the life of William Randolph Hearst, Martyn Burke’s 1999 motion picture is at its core the tale of how massive business empires can be built and destroyed by the egos and weaknesses of their creators. That said, there are two key differences between these films:

  1. “Pirates of Silicon Valley” focuses on two personalities, Steve Jobs and Bill Gates, instead of just one.
  2. “Pirates of Silicon Valley” is surprisingly accurate.

Obviously it would be a stretch to say that “Pirates of Silicon Valley” is as accurate as a documentary – this is a Hollywood dramatization, after all – but it certainly didn’t get much of the flak leveled at “Steve Jobs,” either. “Sorkin chose to cherry-pick and exaggerate some of the worst aspects of Jobs’ character, and to focus on a period of his career when he was young and immature,” complains Walt Mossberg of The Verge (who also found some “Citizen Kane” parallels in the Jobs biopic approach), while Laurene Powell (Jobs’ widow) condemned it as “fiction.” By contrast, Apple co-founder Steve Wozniak (who was a consultant on the new Jobs biopic) praised “Pirates of Silicon Valley” for being surprisingly close to the mark. “One of the things is, yes, it not only captures inside of Steve Jobs,” Wozniak explained to Business Insider. “It’s the events that occurred and what was their meaning in the development of computers and ‘Pirates of Silicon Valley’ was intriguing, interesting. I loved watching it.”

This trait of “Pirates of Silicon Valley” is best captured in the two scenes that bookend the film. As it opens, we see Noah Wyle as Steve Jobs talking directly to the camera about the importance of the personal computer revolution. To him and many of the other PC revolutionaries, this wasn’t just an ambitious business venture. “We’re here to make a dent in the universe,” he intones. “Otherwise, why even be here?” As the shot pans back, we see that he’s talking to Ridley Scott (best known then for directing “Alien” and “Blade Runner”), who is in the process of shooting the legendary “1984” commercial that introduced the world to the Macintosh.

Then we flash forward a decade and a half: Jobs is being hired back by Apple – the company we will watch fire him before the film’s running time expires – by none other than his nemesis Bill Gates himself (brilliantly captured by Anthony Michael Hall). The image of Gates looms over Jobs, deliberately evoking the Big Brother imagery to which we were introduced mere moments ago, with Jobs barely concealing his inner anguish as he plasters a fake smile on his face and pretends to be delighted at reuniting with the Microsoft founder. For all intents and purposes, the rest of the movie covers how Jobs was transformed from the man who imagined himself destroying Big Brother into the man forced to capitulate to Big Brother – and learn to love it.

Of course, if “Pirates of Silicon Valley” were merely an exploration of personalities, it would be intriguing – but perhaps not quite great. What truly elevates it above the run of normal biopics is that it never loses sight of the greater significance of what its principals are doing. At one point in the film, as Jobs and Wozniak run through the University of California, Berkeley circa 1971 (i.e., in the full throes of countercultural upheaval), Jobs remarks that “those guys think they’re revolutionaries. They’re not revolutionaries, we are.” This theme pervades the motion picture – not merely an awe of computers, but a recognition that their creators realized they were going to change the world. For Jobs, the personal computer revolution was a religious crusade; for Gates, a ripe business venture; for Wozniak and Paul Allen (depicted here as Gates’ number two, with a comic-relief Steve Ballmer close at his heels), it was a nerdy enthusiasm. All of them, however, see something that no one else can recognize – the potential for personal computers to completely transform how we live our lives – and the film shows how that realization shaped their personalities, and with them history.

Unfortunately, there was a darker side to the PC revolution, which is why the word “Pirates” appears in the title of “Pirates of Silicon Valley.” While Jobs and Gates were indisputably brilliant men, they did not invent much of the technology that is widely attributed to them. No, the credit for those innovations belongs to countless obscure men and women – many of them employees at Xerox, which paid them to create marvels and then failed to make bank on their work because it didn’t appreciate what it had. Jobs realized this and, characteristically, charmed Xerox into forcing its resentful employees to share the fruits of their labors with him, whereupon he thought nothing of harvesting their bounty and acting as if he had cultivated it himself. This brings us to the other scene that captures the essence of this movie’s greatness, an exchange between Jobs and Gates after the former realizes the latter has been stealing his innovations (much as Jobs did to the hapless Xerox employees), which I dare not quote here for fear of spoiling it for others. Suffice to say this much: This is as much a film about intellectual theft, and the grandiose egotism necessary to morally justify such actions, as it is about genius and inspiration and the world-changing technology they wrought.

The greatness of “Pirates of Silicon Valley” lies in the fact that it manages to brilliantly balance all of these elements into a single hour-and-a-half-long narrative, paying tribute to the achievements of its subjects without excessively glorifying or vilifying them. While audiences will need to decide whether “Steve Jobs” performs a comparable feat (Wozniak has praised the new film — not as effusively as he did “Pirates,” perhaps, but at least he didn’t dismiss it as “crap” as he did the Ashton Kutcher vehicle “Jobs”), it’s fair to say that “Pirates of Silicon Valley” established the standard that all worthy movies about the computer revolution will need to follow. Its themes are the themes not only of these particular stories, but of our era in history as a whole. To understand our time, we need to understand the feats and foibles that helped create it, and so far no work of art has done this better than “Pirates of Silicon Valley.”

Is God One of Us?

Published: The Good Men Project (October 24, 2015)

It’s somewhat amazing that, despite having written more than 400 articles in the past four years, I’ve never really touched on my religious beliefs. My Asperger’s Syndrome and depression, romantic relationships and childhood traumas, even my Jewish background (from a heritage standpoint)… All these sensitive and personal matters have been explored in my previous articles. Yet until this week, I’ve never discussed my opinions about God.

Like most secular writers who suddenly feel compelled to delve into the metaphysical, I’ve reached this point due to a particularly bad week that I’m grateful to see end. While I’m not going to discuss these personal matters here (no worries, I’ll be fine), they have caused me to contemplate a variety of spiritual matters, some of them only tangentially related to my immediate difficulties. In no particular order, I’m going to address them below.

1. The wisest Biblical verse of them all.

Courtesy of Ecclesiastes 9:11 –

“I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favour to men of skill; but time and chance happeneth to them all.”

Some of the most intelligent people I know work in food service or retail jobs. Similarly, some of the kindest and wisest people I know face constant hardship and mistreatment. Meanwhile, I know plenty of mediocre people – intellectually, morally, and otherwise – who (on the surface at least) appear to be happy in their professional and personal lives.

These are hardly original thoughts, but they have bothered me for years. My political ideology is in many ways informed by these observations – one can only see so many great minds and decent souls forced to suffer before feeling desperate to do something, anything, to create a political and social order that will reward individuals based on merit rather than mere chance.

As I’ve studied history over the years, however, I’ve begun to seriously worry that these hopes are in vain. Sure, technological advances have increased our life expectancy and made it easier for the masses to muffle their pain with cheap entertainment. At the same time, it still seems as if the vast majority of the opportunities for meaningful socioeconomic advancement are restricted to the lucky few who are born with privilege; for much of the rest, Henry David Thoreau put it best when he wrote that “the mass of men lead lives of quiet desperation.”

Although I have my own professional frustrations, by the way, this passage is more about friends and loved ones than about myself – I’m ABD in a PhD program, a freelance writer, and a local elected official, all jobs that I cherish and find immensely satisfying. This isn’t to say that I’m not ambitious or that I don’t hope to rise even higher in the near future, but when I think of Ecclesiastes 9:11, my thoughts aren’t primarily on myself. The same cannot be said for my next observation…

2. Right and wrong – both morally and practically speaking – have very little to do with determining the ultimate outcome of important events.

Without going into specifics, I will say that this has been a bad week for me because of several occasions in which I was challenged, tried to do “the right thing” (more on that term in a moment), and found that regardless of what I said or did, a negative result ensued.

When I speak of “the right thing,” I’m doing so in two senses: First, the notion that if you treat others with respect and compassion (i.e., following the Golden Rule), and are always as open and honest as possible with both other people and yourself, you will more often than not receive positive results; and second, the idea that – moral considerations aside – there is a logic to how events proceed, meaning that if you are prudent, hard-working, able to learn from your mistakes, and capable of standing your ground when necessary, these practically useful qualities will significantly increase the probability of a desired outcome.

Unfortunately, in addition to my own recent experiences, I’ve watched many of the people around me do “the right thing” when interacting with others… only to wind up suffering anyway. This doesn’t mean that I think being a bad or irrational person is somehow more likely to lead to success than being a good one, but rather that doing right versus doing wrong makes very little difference. To the extent that any variable can predetermine the outcome of a given situation, the determining factor usually seems to be power – the party with money, authority, and/or vital resources tends to win, and the one without tends to lose, regardless of whether the powerful happen to be in the right or not.

This is usually the part of the discussion when someone chimes in to say that if we didn’t have suffering in the world, we wouldn’t be able to appreciate all of the good things. Well…

3. If God happens to exist, and has a moral code, his moral philosophy bears little relationship to our conventional definition of that term.

First, I need to point out that the question of whether God exists is entirely separate from whether there is an objective, divinely sanctioned “morality.” After all, there have been plenty of brilliant men (Albert Einstein comes to mind) who did believe in a God but also felt that this divine entity, whatever it was, certainly did not intervene in human affairs or pass any kind of moral judgment on our actions. Depending on how you define God, it’s entirely possible to conceive of a universe in which a supernatural hand is behind all creation and yet doesn’t give a damn about what we measly humans do on this single speck of dust in the incomprehensibly vast cosmos.

Assuming, however, that one not only believes in God, but thinks that God has a system of “right” and “wrong” which he applies to our universe, it is inconceivable to me that this notion bears any meaningful relationship with our own sense of those terms. For instance, whenever someone argues that pain is necessary to appreciate pleasure, I like to point out that the only reason such a contrast is necessary is because the alleged omnipotent God who created our universe decided it should be so. Indeed, virtually any argument made to justify the existence of suffering in this world can be rebutted using the same logic: God created the rules, which means that he could have jiggered with them to make it so that those who suffer wouldn’t have to do so. Yet he chose not to.

In fact, when you really examine our world, you realize that God made it so that life itself is fueled by pain. Think about it: Living creatures are, in the end, nothing more than piles of physical matter which happen to be self-aware. Thanks to the laws of biology (at least on this planet), the only way for such self-aware creatures to maintain their brief flashes of consciousness is to consume organic compounds – in other words, to snuff out the lives of other creatures. This is true not only for carnivores, but also for herbivores – after all, aren’t plants also alive? And plants aren’t blameless either, for while many of them rely on sunlight to survive, their roots draw nutrients from a soil enriched by the decomposed remains of previous living creatures. Everything that is alive maintains its existence by profiting off of death.

I may be open to the idea that God exists, but it is inconceivable to me that a God whose moral code is based in compassion would choose to construct this system. If God is real and applies moral rules, they certainly aren’t governed by empathy.

Normally this is the point in which I would try to tie together my article’s meandering themes, but truth be told I don’t know how to do it. This is one of those essays that a writer composes to unburden himself, not so much because he has an agenda in mind. I needed to get these thoughts off my chest so I could slough off the refuse of this week and hopefully make the next one better for me. So it is done.

Biden: The president we deserve, but not the one we need

Published: The Good Men Project (October 22, 2015)

To paraphrase a famous line from “The Dark Knight”: Joe Biden is the president we deserve, but not the one we need.

I’ve written quite a bit about Biden over the course of my career because – as a PhD student focusing on American political history – I can’t help but admire such a quintessential throwback. We live in a time of “authenticity” rather than authenticity, one in which presidential candidates can skyrocket to the top of the polls through blatant pandering and demagoguery that seems “real,” rather than by offering actual substance. By contrast, Biden has always been a politician who speaks his mind from a clearly genuine place (even if that makes him a bit gaffe-prone), usually with the underlying message that we need to create an America which focuses less on partisan bickering and more on helping those who need it most.

These qualities were particularly evident in the Rose Garden speech during which he announced that he wasn’t going to run in the 2016 presidential election. First he explained, with characteristic candor, that “as my family and I have worked through the grieving process [for the death of his son Beau earlier this year], I’ve said all along – what I’ve said time and again to others – that it may very well be that that process, by the time we get through it, closes the window on mounting a realistic campaign for president,” and that it turns out they are now “out of time.” After that, Biden went on to urge Americans to embrace President Obama’s legacy, which led America from “crisis to recovery,” and reminded the powerful that in the end they aren’t fighting simply for themselves. “Go back to your old neighborhoods,” he urged the reporters and politicians who had gathered to hear his announcement. “Talk to your contemporaries who aren’t as successful as you’ve been.”

If you think Biden came up with this rhetoric simply because it matched the occasion of his “choose not to run” moment, you’d be wrong. In fact, Biden first became a national figure by expressing very similar thoughts in a speech at an Iowa Jefferson-Jackson Day dinner in 1985. At that time, the Democratic Party was mired in a fierce internal battle between center-conservatives who believed the party should abandon the progressive legacy of forebears like Franklin Roosevelt and John Kennedy and staunch liberals who were convinced that dyed-in-the-wool rhetoric could win converts despite the landslide defeats of George McGovern and Walter Mondale. In that address, Biden reminded his fellow Democrats that success would only come from focusing on the concerns of struggling Americans – from members of the working class to victims of gender or racial oppression – and listening to them regardless of whether their words fit into either a preexisting ideological agenda or the demands of specific special interest groups. “It’s time we hear the sound of our country singing and soaring in the dawn of a new day,” he proclaimed. “It’s time to restore America’s soul. It’s time to be on the march again. It’s time to get America on the move again. Our time has come.”

Throughout his career, Biden has remained faithful to this message, whether in terms of policy (such as passing landmark legislation protecting women’s rights or blocking the confirmation of Robert Bork to the Supreme Court) or his own life, most notably his decision to live in Wilmington and commute an hour and a half to Washington during his Senate career rather than lose touch with his constituents and family by moving to DC. All of these qualities speak to a deeper idealism that America deserves… and, indeed, will be much poorer for losing. At the same time, they are qualities that seem out of place in our current election cycle, in which PR spectacle and incendiary rhetoric propel candidates far more effectively than any kind of coherent national vision.

That’s why the aforementioned line from “The Dark Knight” comes to mind as I reflect on the end of Biden’s career as a presidential candidate. At this point in our history, America probably won’t elect a politician like Joe Biden, but that doesn’t mean future Joe Bidens don’t have a place in our political life. Quite to the contrary, anyone who believes in America’s underlying ideals should hope that we will continue to benefit from their service in our political life. We deserve nothing less.

Here’s why Donald Trump is hosting ‘SNL,’ as explained by Twitter

Published: The Daily Dot (October 21, 2015)

Donald Trump is a racist blowhard who shouldn’t set foot anywhere near the White House—but that doesn’t mean he shouldn’t be allowed on TV.

The National Hispanic Leadership Agenda recently denounced Republican presidential candidate Donald Trump’s forthcoming appearance on Saturday Night Live. Referring to his “bigoted comments” about undocumented workers, the group called next month’s hosting gig a “slap in the face.” While they’re correct to target Donald Trump’s anti-Latino views, the NHLA is wrong in one important respect: Racism will not be given a “thumbs up” when Trump hosts an episode of Saturday Night Live next month.

Their concern is reminiscent of a popular sentiment on platforms like Facebook and Twitter, where people who retweet a racist article or comment are often criticized for promoting said views, even though they may simply be referencing them to make an opposing point. Many users boost sentiments they disagree with to spread awareness of their existence; doing so draws an implicit arrow, as if to say: “Hey, look at this thing. What do you think about it?”

In that sense, Saturday Night Live is acknowledging an unavoidable reality: Whether we like it or not, Donald Trump could very well be the Republican presidential nominee next year. The important question is what SNL will do with Trump once he’s on their program.

This doesn’t mean that I support Saturday Night Live’s decision. I would agree with the NHLA when they argue that SNL is essentially trying to “enable Trump’s hateful speech for nothing (more) than a ratings ploy.” After all, this is the same network that banned Trump from the Miss USA pageant and dropped him as the host of The Apprentice when it was trendy to do so. But that was back when the hashtag campaign to #DumpTrump was a national topic on Twitter and Trump looked like a long shot for the nomination. Now that his polling numbers have risen, the network has changed its tune.

Making matters worse, SNL has never hired a Latina cast member and has recruited only two Latinos in its 41 years on the air—Horatio Sanz and Fred Armisen. This makes it all the more insulting that the program has invited as its host a man whose presidential campaign took off because he argued that Mexican immigrants are criminals, drug-smugglers, and rapists.

That said, the fact that Trump’s views are toxic doesn’t mean that, as the NHLA claims, those ideas are being effectively “sanitized” by virtue of appearing on SNL. The TV show won’t legitimize Trump—he already has legitimacy by sheer virtue of his frontrunner status. His presence on the program does little more than reflect his prominence in the current news cycle—like asking Miley Cyrus to host after her controversial VMAs hosting gig.

Instead of “sanitizing” Trump, the writers of Saturday Night Live have a golden opportunity. Rather than buying into Donald Trump’s “straight-talker” persona and making him appear likable, the show could use his hosting gig to take him down a peg—by addressing his record on race.

But when it comes to actually holding famous figures accountable, the show has a mixed track record at best. For instance, let’s look at two other prominent politicians and presidential aspirants who have hosted episodes of SNL. In 1984, SNL invited the Rev. Jesse Jackson to host the show, even though his presidential campaign had hit a major bump earlier that year when he was caught using anti-Semitic slurs in front of a reporter. And when former New York mayor Rudy Giuliani hosted an episode in 1997, his administration had been plagued with controversy involving the extensive use of racial profiling by members of the NYPD.

On neither occasion was their SNL appearance widely perceived as an implicit endorsement of their candidacy. But at the same time, neither the Jackson nor Giuliani programs took these men to task for their moral failings. Giuliani’s appearance consisted mainly of an opportunity to show off his zany side (including a sketch in which he appeared in drag), while Jackson’s segments were less memorable than his monologue calling for increased diversity among the Saturday Night Live cast.

The missed chance to confront these two men was a real problem, since America might have benefited from seeing some of the nation’s most influential comic minds openly challenging powerful men.

If nothing else, Donald Trump has given the SNL writers a gift, should they choose to lampoon him. In addition to his famous hairstyle, Trump’s announcement speech was both racist and objectively wrong: Immigrant workers are much less likely to be involved in criminal activity than native U.S. citizens. Yet because his un-PC language was viewed as “refreshing” in our current political climate, he has continued to gain traction from it.

This is reminiscent of the brilliant Trump parody that appeared in a recent episode of South Park—in which a Canadian right-wing candidate clearly modeled on Trump is elected “president of Canada” (which technically has a prime minister) to everyone’s subsequent chagrin. The monologue of one Canadian immigrant character in the show captures the joke perfectly:

There were several candidates during the Canadian elections. One of them was this brash a–hole who just spoke his mind. He didn’t really offer any solutions. He just said outrageous things. We thought it was funny. Nobody really thought he’d ever be president. It was a joke. But we just let the joke go on for too long. He kept gaining momentum, and by the time we were all ready to say, “Okay, let’s get serious now. Who should really be president?” He was already being sworn into office. We weren’t paying attention!

The fact that Donald Trump will soon be hosting an episode of Saturday Night Live seems trivial in comparison to a more terrifying prospect: that he could win next November because we’d prefer to simply ignore him.

What separates Trump’s SNL appearance from the “RT ≠ endorsement” mantra is that signal boosting Trump isn’t just a matter of “[showing] people the different views that are out there,” as Baltimore Sun reporter Justin Fenton has argued. Giving him a platform provides the program with a historic opportunity to take a stand against political bigotry—in the same way that Tina Fey’s much-lauded impression helped deflate Sarah Palin’s rising balloon back in 2008.

Palin might seem like a humorous historical footnote now—but let me remind you: Before accepting the vice presidential spot on the McCain ticket and becoming a frequent SNL target, she was widely billed as the most popular governor in the country. Her addition was considered a boost for John McCain’s campaign, and Saturday Night Live helped change the conversation.

At a time when what was once a joke is quickly becoming a reality, a little humor may be just what America needs to bring Trump—and his supporters—back down to earth.

Why Biden’s Decision Matters (for the Vice Presidency)

Published: The Good Men Project (October 20, 2015)

As America prepares for Joe Biden’s decision on whether or not to run for president next year, it is perhaps appropriate to reflect on the impact his choice will have on the office he currently holds – the vice presidency of the United States.

In one sense, Biden’s legacy as vice president is already secure. While scaling back the excesses that caused his predecessor, Dick Cheney, to be widely viewed as the secret power behind the throne, he still played a key role in shaping and pushing for the Obama administration’s agenda on issues ranging from fiscal policy and gun control to the war in Afghanistan. He also deserves a great deal of credit for improving his commander-in-chief’s political fortunes, in particular turning attention away from Obama’s poorly reviewed performance in the first 2012 presidential debate by decisively besting Paul Ryan in the vice presidential matchup. Despite occasionally attracting the wrong kind of attention with highly publicized gaffes, Biden ultimately falls into the modern tradition established by predecessors like Walter Mondale, Al Gore, and even Cheney himself – namely, that of an important player in the administration of the president whose name will always be associated with his own.

That said, if Biden runs for president and doesn’t win the Democratic nomination, he will be the first sitting vice president to be spurned by his own party since Alben W. Barkley in 1952. And to paraphrase Biden himself, that will be a big frickin’ deal.

It’s important to remember that since 1960, every incumbent vice president who has sought his party’s nomination has managed to receive it: Richard Nixon in 1960, Hubert Humphrey in 1968, George H. W. Bush in 1988, Al Gore in 2000. Even Walter Mondale, though not a sitting vice president in 1984, managed to win his party’s nomination that year on the strength of his record while serving under President Jimmy Carter. Indeed, the only vice presidents in the second half of the twentieth century who didn’t at least get nominated by their party either lacked the benefit of incumbency when their turn rolled around (Dan Quayle), passed away before they had the chance (Nelson Rockefeller), or had been forced to resign in disgrace (Spiro Agnew).

Once again, Cheney deserves special attention in this discussion. By the time he was elected in 2000, the expectation that any vice president would eventually make a White House bid of his own had become so firmly entrenched that Cheney felt compelled to emphatically disavow any such ambitions as soon as he was picked by George W. Bush. One could argue that Cheney wouldn’t have been nominated had he later changed his mind (his approval ratings were infamously low by that time), but because he had already disclaimed any presidential ambitions, it’s impossible to know whether he could have built a strong base of support had he actively cultivated one from the get-go. All that we know for certain is that, by not running for president in 2008, Cheney broke a precedent that had been in place for nearly fifty years.

This brings us back to why, if Biden seeks and fails to win the Democratic presidential nomination in 2016, the fact of his defeat will signify an important diminution in the vice presidential office’s political clout. Before the 1960 election, vice presidents had a pretty poor track record when it came to being tapped by their party as the sitting president’s heir apparent. This was largely because the office itself wasn’t held in particularly high regard; the Constitution stated only that the vice president would take over if the president died or was incapacitated and would preside over (and occasionally cast the tie-breaking vote in) the Senate. America’s first vice president, John Adams, described the position as the “most insignificant office that ever the invention of man contrived.” Not surprisingly, only three incumbent vice presidents had been elected directly to the presidency before 1960 – John Adams in 1796, Thomas Jefferson in 1800, and Martin Van Buren in 1836. Until Theodore Roosevelt’s election in 1904, even vice presidents who rose to the presidency due to the president’s death were consistently passed over when they sought to be nominated in their own right (including John Tyler, Millard Fillmore, Andrew Johnson, and Chester Arthur).

Of course, if Biden runs for president and does receive the Democratic nomination, it will affirm that the power the office has accrued since the mid-twentieth century continues to be a potent political force. That said, a defeat would indicate that – regardless of the realities of what a vice president may do behind the scenes – the public no longer views a vice president as the rightful heir to the leadership of his or her party. This would be an event of no small consequence, and as such deserves to be acknowledged.

We were promised hoverboards: Of course “Back to the Future II” got 2015 mostly wrong — here’s why

Published: Salon (October 19, 2015)

Let’s get the obvious out of the way: Wednesday is October 21, 2015 – the date visited by Marty McFly in the time-traveling DeLorean 26 years ago in “Back to the Future: Part II” – and we do not have hoverboards. I repeat, hoverboards do not exist — these toys don’t count.

In fact, not much of the technology that appears most ubiquitously in the 1989 classic seems to have materialized. Our cars don’t run on garbage — indeed, as a hilarious College Humor skit observed, our refusal to break free from carbon fuels has destroyed our planet in the twenty-six years since the film’s release — and our highways are still confined to terra firma. That’s just the tip of the iceberg: We don’t have self-lacing sneakers or self-drying jackets, there are no robots to pump our gas, our weather forecasts are still unreliable, doorknobs haven’t been phased out by thumb scanners, and miniature pizzas still remain fun-sized after you stick them in an oven.

Instead of merely observing where “Back to the Future: Part II” was wrong — or, for that matter, noting the occasional odd area where it may actually have been prophetic (could the Chicago Cubs win the World Series?) — perhaps it would be more useful to analyze why it was wrong in certain areas. What does that tell us about how we viewed the near future back in the late ‘80s, and how does that compare to other famous sci-fi stories that made their own prognostications? More importantly, how can we apply those lessons to the predictions we might make today?

When it comes to technological innovation, the practical beats the glamorous

Because science fiction writers are meant to depict future worlds that audiences will find entertaining, they tend to predict technological advancements that are wondrous to behold. The problem, of course, is that ideas which look good on a page or screen often don’t make sense in real life. Jet packs, for example, were a staple of comic books, TV shows, and movies in the mid-twentieth century, but when they were invented in 1961 it became apparent that they were too fuel-inefficient for average consumers and too high-risk for the military (a soldier in a jet pack makes for an easy target).

Similarly, although space-age predictions about regular trips to the stars (see “Star Trek”), moon bases (“Project Moon Base”), and Martian colonies (“Total Recall“) seemed plausible when our budget regularly included generous allocations to NASA, the end of the Cold War wound up removing a major incentive behind American investment in space exploration — our competition with the Soviet Union.

Incidentally, this rule explains why the flying cars of “Back to the Future: Part II” aren’t a reality. They might look cool, but they’re also expensive, sensitive (they’d be hard to drive in bad weather), and dangerous (even the slightest malfunction would cause you to fall out of the sky).

Medicine has improved, but the basic rules of human biology are unchangeable

Though it often isn’t remembered, “Back to the Future: Part II” did take a stab at biotech prediction. In one scene, Doc Brown tells Marty that he went to a rejuvenation clinic that repaired his hair, transfused his blood, and replaced his spleen and colon to add “thirty to forty years” to his life. This is consistent with a long history of sci-fi writers making absurdly optimistic predictions about how technology will improve our lives — certainly we are farther along now than we were in 1989, but nowhere near that level.

The same can be said for “Brazil,” which correctly predicted our obsession with plastic surgery but grossly overestimated how long a human lifespan could be extended, or “Blade Runner,” which assumed we would have genetic clones (or replicants) that could perform slave labor or serve as organ farms.

This isn’t to say that we haven’t made remarkable advances in medicine since the 1980s, but scholars tend to agree that science fiction and fantasy writers often engage in wish fulfillment (usually assuming we will be immortal or at least live significantly beyond the usual 75-to-100-year life expectancy) instead of making reliable predictions.

When technological progress merges with politics, the results tend to be more rather than less democratic

The future depicted in “Back to the Future: Part II” may be utopian from a technological standpoint, but it is distinctly dystopian when it comes to the social order. Doc Brown makes an offhand reference to how Marty McFly’s son is arrested, tried, and convicted in the same day (because lawyers have been banned, har har), while Biff’s grandson is able to publicly bully Marty Jr. with apparent impunity. Fortunately, as economists Daron Acemoglu and James Robinson have pointed out, countries with democratic institutions tend to thrive technologically when compared to their despotic counterparts. Since it can be assumed that a nation which rushes suspected criminals into prison is far from democratic, the world of “Back to the Future: Part II” can be safely lumped in with the unrealistic high-tech dystopias seen in films like “Blade Runner,” “Fahrenheit 451” or “Logan’s Run.”

As for the bullying… Well, College Humor’s skit addressed this best when it joked that while there are still bullies, our tendency to shame those who publicly abuse their victims has caused much of the worst bullying to occur anonymously online. The notion that a Griff could brazenly assault a weakling in the light of day without alerting the police and causing a social media scandal seems practically quaint.

None of these observations are meant to detract from “Back to the Future: Part II.” I rewatched the film twice for this article and can attest that, as a moviegoing experience, it remains as smart, funny, and engaging as ever. That said, if future science fiction scribes want to write about the future in a way that will prove prophetic, they may be well advised to learn from exactly how that film went wrong.