My Life in Review: The People Who Changed Me

Published: Good Men Project (December 31, 2014)

As the calendar moves forward, Matthew Rozsa looks back at 6 life-changing relationships he’s had and lost.

___

In less than 72 hours, the dictates of the Gregorian calendar system will draw the current year to a close and a new chapter—on this occasion 2015 A.D.—will begin. As humanity collectively commences with this latest small slice in its epic history, it’s generally regarded as useful for each person to take a step back and look at the history of their individual life … with an emphasis on the word “story” and the type of protagonist everyone hopes to be when writing their own.

For me to do that here, I’m going to talk about six people who had a major impact on my life, including two ex-girlfriends, two former friends, a would-be mentor, and an erstwhile colleague. As my rampant use of the past tense in the preceding sentence most likely revealed, I’m not currently in touch with any of them. Indeed, the point of this article will be to explore what we can learn from the loss of important relationships. Because I don’t want my observations confined to a single type of relationship, I have used a wide spectrum here, including close professional and platonic relationships as well as romantic and sexual ones. Similarly, because it’s equally important to learn both from how we were wronged and how we’ve wronged others, these stories run the gamut in terms of my own moral culpability in the relationships having ended (a factor that is often subjective anyway). Finally, I must stress that it’s quite likely the parties mentioned here would disagree with part or all of how I recall events. While I can’t fully account for how memories may differ or how my own biases may have slanted my interpretations, I can at least assure you (and them, if they ever read this) that I have done my best to be as detached and self-critical as possible.

♦◊♦

I’ve already written extensively about the bullying and general social ostracism I experienced while growing up. While my experiences stemmed from having Asperger’s Syndrome at a time when the autism spectrum wasn’t widely understood, and to a lesser extent from being Jewish in communities with very small Jewish populations, there are plenty of people who bear the scars of childhood mass peer rejection. The causes can range from being marginalized for one’s race, sexual orientation, or gender to simply having unusual personality traits, being socially awkward, or not adhering to consensus standards of attractiveness, but there is one bottom line: When you are victimized by this brutal alienation, isolation, and abuse at such a developmentally formative time in your life, it defines every single relationship you have for the rest of your life.

We can start with romantic relationships (a topic that I’ve discussed in the past in a piece co-authored with a close friend). While I’ve stayed in touch with several of my ex-girlfriends, there are two who have severed all ties with me—and both still haunt me to this day. One of them, who I’ll refer to here as R, was in a relationship with me for five-and-a-half years, encompassing the last two-thirds of my undergraduate career at Bard College as well as nearly four years after that. That relationship ended for a variety of reasons, ranging from ongoing tensions between me and her parents (who made no secret to her, or me, of their opposition to the relationship) to our mutually volatile temperaments, which caused incessant bickering due to both of us carrying fresh insecurities from being bullied in high school. Regardless of why it ended, however, the fact that our couplehood corresponded with so many milestone events in my life makes it hard to return to those memories without feeling that they’ve been tainted by this sour note (more on that in a moment). The other, known here as K, dated me from the end of December to the middle of April this year. She had reached out to me as a fan of my writing, and I was naturally thrilled to subsequently form a deep bond with her over what I consider to be my art. What’s more, as a fellow writer, I enjoyed running article ideas by her and being genuinely stimulated by the insightfulness and creativity of her feedback, as well as her sharp eye as a proofreader. When she broke up with me, she specifically cited the fact that I was “an Aspie” and that “70% of what you do is objectively annoying” (her examples included my oft-brazen loquaciousness, odd gesticulations, difficulty making eye contact, and need for tightly structured day-to-day schedules).

While these romantic tribulations were certainly quite painful, they weren’t more so than the friendships I’ve also lost over time. Both of my examples actually took place this year. First there was B, with whom I’d maintained a continuous friendship for longer than with anyone else in my life, even as he had frequently blown hot-and-cold with virtually everyone else in our mutual social group. Although we had never fought, that abruptly changed when I invited a female friend to an informal party I was throwing to celebrate acquiring the minimum number of signatures necessary to appear on the ballot for a local elected office; he threw a temper tantrum because he didn’t want any girls to join us. After several months of refusing to speak to me, he finally warmed up, only to immediately freeze me out after reading an article I’d written for this site about Gamergate, a movement to which he was sympathetic. There was also A, an art school dropout with whom I’d had numerous arguments during our sixteen-year friendship. The straw that broke the camel’s back on this occasion was a financial dispute: After hiring him to be my driver while I worked as a field organizer for Tom Wolf’s gubernatorial campaign, I was dismayed by his unreliability and his unavailability on important occasions when he was needed, which ultimately prompted me to fire him. While that didn’t end our friendship on its own, our subsequent dispute over his compensation—in which he demanded half of what I’d earned from the day I’d hired him until the day he was discharged, while I argued that he should only be paid half of what I earned on the days when he actually showed up for work (an offer I felt was still incredibly generous, since his sole responsibility had been driving me)—put the nail in its coffin as far as I was concerned.

Finally, I’ll discuss two professional mentorships that have since come to a close. The first was with C, a professor at Bard College who taught one of the most fascinating classes I’ve ever taken (on the history of American intelligence gathering) and soon inspired me with his iconoclastic and fiercely independent-minded political philosophy. He took me under his wing, offering tips on the art of crafting op-eds and networking as a writer that I still follow to this day. That relationship was terminated rather abruptly, after I accused him of being a bigot following an email in which he expressed angry criticism over actions committed by the State of Israel (in retrospect, I would probably agree with 90% of his arguments if I received that same letter today). The other incident occurred with L, a young woman my age who worked as my editor during one of my earlier writing gigs. More than a year after we both left that organization, I found myself chatting with her about ways I could improve my career. While the conversation was initially quite normal, I soon began venting to her and projecting my own frustrations onto her, at times even implicitly blaming her for not doing more to help me out. Instead of apologizing afterward, I tried to act as if nothing had happened and she was overreacting. She promptly cut me off.

♦◊♦

While the loss of important romantic, platonic, and professional relationships is very painful for anyone, it is especially so for those who struggled socially at an early age because of how easily those past agonies can be looked upon as “failures.” When an important interpersonal connection is severed, and then rendered incapable of being restored, it reinforces the sense many of us have that we are inherently unlovable and/or unlikeable, as well as that there is something intangibly and ineradicably “wrong” with us that will prevent us from ever succeeding in our careers or personal lives. Perhaps worst of all, the accumulation of such experiences is extraordinarily disempowering. After all, few people intentionally sever important relationships for no good reason; when a relationship fails and then can’t even be partially restored, it cements the idea that we simply lack the power to better our own lives, no matter how hard we try or how much we learn from our past mistakes.

These lost relationships are also painful for another reason: Those meaningful connections were instrumental in shaping the human being you are today—and, with very rare exceptions, had an equally powerful impact on the other person as well—so losing touch with them can cause a jarring sense of confusion and disconnection in terms of how you perceive your own identity. For one thing, it is always important to know that your recollections of the past are roughly the same as those of the other individuals who shared portions of it with you; permanently losing touch with men and women who were previously central characters in your life leaves a host of lingering questions about whether the shared story that you’ve incorporated into your own larger narrative is, in its essential respects, the same one that they recall. Existential and introspective questions abound: Maybe I’m not who I thought I was? Maybe the narrative didn’t really progress as I thought it did? Maybe I don’t have as much influence on my own life as I would like to think (see my earlier observation about disempowerment)?

♦◊♦

This brings me to the major lessons we can learn from these lost connections:

  1. Talk about it.

Both men and women face a significant stigma when it comes to discussing lost relationships. While I will leave to a female writer the task of exploring the taboos held against women who try to be vulnerable on these issues, I can say that as a man there are two main struggles: (A) There seems to be an expiration date on how long you’re allowed to still be upset over these things, after which you’re viewed as “lame” or “pathetic” if you acknowledge that they still bother you; and (B) You’re almost never allowed to express emotions like heartbreak or mournfulness in an unrestrained manner, which is viewed as “unmanly.”

Again, I have no idea how much these themes overlap with what women experience, but I can say that—regardless of gender—no one should ever feel ashamed to openly acknowledge and converse about parts of their past that still hurt.

  2. Learn from your mistakes: Accept that you were sometimes morally wrong.

This is almost certainly the toughest rule to follow, but we can’t let ourselves off the hook here. It’s especially difficult for individuals who, like me, spent most of their childhood being persecuted; there is something galling, even rage-inducing, about the notion that those who have spent so much time being victims could also adopt the role of victimizer (Gamergate is a great example of this). At the same time, the sad truth is that those who have been mistreated in the past are more likely to mistreat others, in part due to pent-up resentment and anger and in part because they are subconsciously patterning their behaviors off of what they became accustomed to during their early years.

When it comes to my own experiences, I can honestly say that there are many occasions which have left me with deep moral regrets. Because R was my first serious girlfriend, I found myself bungling the awkwardness of her parents’ opposition (which, in an eerie foreshadowing, was rooted in the fact that they felt our idiosyncrasies made us too similar, with her mother exclaiming after first meeting me, “Oh God, there’s two of them!”). More seriously, I frequently argued with her when I allowed the doubts and insecurities I’d developed in my earlier years to flare up. With A, I forgot to treat him like a friend and instead “fired” him in a letter that was insulting and demeaning, which set things off on a sour note from which it subsequently became impossible to recover. It was similar with C, wherein I disregarded years of friendship and kindness by allowing my momentary feeling of outrage at what I perceived to be an anti-Semitic slur to overshadow what I’d long known about him as a human being. Of all the parties discussed here, none were more unambiguously in the right than L, who I forgot was a person with her own vulnerabilities and whom I instead used as a symbol for my own frustrations—none of which were her fault, and many of which she had actively worked to help me effectively address. Although she did accept my apology when I offered one later, I still feel enormous regret over how I handled that situation.

  3. Learn from your mistakes: Accept that others have wronged you.

Though this may seem paradoxical, the same impulse that makes it difficult to admit that we can be wrong also makes us inclined to be too harsh on ourselves. Even as we’re reluctant to think that the victim can ever be the victimizer, so too are we inclined to think that we were victims because we deserved to be. Consequently, it is far too easy to assume that the hurt others have inflicted on us was deserved even when a detached perspective reveals quite the opposite. From there, it behooves us to learn how to stand up for ourselves and never again allow others to abuse us.

R, for instance, never stopped projecting the rage she had developed from her own high school persecution onto me, just as I was doing to her (although she did stop hitting me when I pointed out that I had never done that to her). With K, I committed virtually the opposite mistakes from those I had made with R. After having had several girlfriends between R and K, I had developed an almost-obsequious demeanor in which I allowed myself to be nitpicked for harmless quirks (many of them related to having Asperger’s) under the notion that if I stood up for myself, I would be “abusing” her. In retrospect, K’s constant ridicule—often public—for things like being obsessed with scheduling or struggling to make eye contact was in itself severe emotional abuse, teaching me that I should never allow a woman to act as if “having Asperger’s” were a valid cause for criticism or complaint. Just as L was unambiguously right to be upset with me, B was unambiguously wrong; having never had a girlfriend and being notorious within our group for his frequent sexist comments, it’s no coincidence that our two blow-ups both occurred over gender-related issues. While A was correct to call me out for being disrespectful and demeaning in my email terminating relations with him, his subsequent demand to be paid for days he didn’t work was so transparently irrational that it seemed almost deliberately vindictive.

  4. Learn to balance trusting what you know with questioning who you are.

As I mentioned earlier, one of the greatest struggles that comes with losing connections which were important to you—be they romantic, platonic, or professional—is the sense of disconnection from your past that follows. To answer two of the three questions posed earlier in this article:

Maybe I’m not who I thought I was? Maybe the narrative didn’t really progress as I thought it did?

Maybe not, but in the end you have no choice but to trust your own memory. Instead of viewing it as an infallible recorder of truth, however, look at your recollections in the same way that you might look at one movie director’s adaptation of a classic book; it’s based on the same source material and, unless you’re severely mentally ill, the odds are that it has the basic plot points and major details down right. That said, certain aspects of that interpretation are likely to be plumb wrong, and many others will be neither right nor wrong but subjective, hinging on questions like “What happened?” and “Who was right or wrong?” As such, while you should trust your memory, don’t hesitate to question it.

To answer the third question—Maybe I don’t have as much influence on my own life as I would like to think?—I turn to my final lesson.

  5. Pain is an unavoidable part of life. Once you understand that, figure out what you want in the relationships that are important to you and never hesitate to be bold and adventurous in pursuing them.

It may be impossible to restore past bonds that have been lost, but I’ve always believed that hesitation and self-doubt are the mortal enemies of emotional intimacy and true happiness. No matter what you do, you are always going to be hurt, and you are always going to lose contact with people who had once been very important to you. The worst mistake you can make is to give up on forging future bonds because you’ve been burned in the past; not only will this deprive you of the only authentic cure for loneliness that exists in this life, but it reveals an unrealistic expectation about how relationships unfold. Show me a person who hasn’t lost touch with people that he or she once cared about, and I’ll show you someone whose timidity has prevented them from reaching out at all. One thing you’ll never find, I assure you, is someone with numerous positive relationships and no negative ones simply because they lack the flaws that can alienate or disconnect people from them; no one is perfect, and so such an example simply does not exist.

Once all of this is understood, the rational course of action becomes clear: Talk about your past with others, learn how to stop hurting those you care about, learn how to stop letting those you care about hurt you, cautiously trust your memory, and—most importantly of all—do your best to find the right people with whom to form new relationships in the future. When you feel those insecurities bubbling up that insist you’re unlovable, ignore them; when you feel the hesitation emerge that urges you to avoid reaching out lest you be rejected, recognize it as foolishly timid and dismiss it accordingly; and finally, when you feel that spark which tells you that you’ve found a true mentor, friend, or romantic partner, do everything you can to cultivate it.

To me, Orson Welles understood this best:

We’re born alone, we live alone, we die alone. Only through our love and friendship can we create the illusion for the moment that we’re not alone.

There is an important corollary to this lesson—the importance of learning from your own experiences to become a better person. “From these damaged goods, Saints arise,” blogger and activist Ariel Gordon once wrote of adults who were bullied growing up. “Not always, of course, but there are those special individuals whose decency and courage cannot be shattered. When these Saints stand back up, push their way through the carnage, and allow themselves to be reborn, they become the ones this world needs.”

Why Facebook makes you think the world is ending

Published: Daily Dot (December 29, 2014)

Back in October, a Fox News article on apocalyptic belief systems by Dr. Robert Jeffress observed that “47 percent of American Christians believe that the end of the world as described in the Bible will occur within the next 40 years.” Around the same time—but, notably, on the opposite side of the ideological spectrum—Roger Cohen of the New York Times proclaimed that “many people I talk to, and not only over dinner, have never previously felt so uneasy about the state of the world,” before concluding that “the search is on for someone to dispel the foreboding and embody, again, the hope of the world.”

A Gallup poll last week seemed to tie these sentiments together when it discovered that the number of Americans satisfied with the direction in which their country was headed had stagnated at 23 percent.

While the Eeyore-esque doomsday prophecies might be justifiable if we lived during the Black Death or Great Depression, Steven Pinker and Andrew Mack of Slate recently argued that the problems we assume are steadily worsening have, in many ways, actually gotten better. To summarize: Homicide throughout the world is on the decline, increased awareness of women’s rights is gradually reducing violence against women, democracy is spreading, genocide and the mass murder of civilians are far less common, and even the Middle East offers sober observers valid cause for cautious optimism.

What’s more: Americans are increasingly optimistic that, for them personally, 2015 will be better than 2014, with an Associated Press/Times Square Alliance poll indicating that 48 percent think the upcoming year will be an improvement and 11 percent believe it will be worse.

While 2014 was a gloomy year in American culture—with Ebola and Michael Brown dominating headlines—Pinker and Mack show that Americans need to believe the sky is falling even when the weather is sunny and clear. It is a tendency that, though rooted in the journalistic culture of our mainstream media, is exacerbated by the socio-psychological conditions created by the Internet. In other words, there is a considerable disconnect between how we feel about our own lives and how we need to perceive the world around us; to quote Don DeLillo in Mao II, there’s “an unremitting mood of catastrophe… We don’t even need catastrophes, necessarily. We only need the reports and predictions and warnings.”

Nowhere is this seeming paradox more vividly illustrated than in social media. On the one hand, as the New York Times reported last year, an analysis from social psychologist Jonah Berger at the University of Pennsylvania revealed that when social media users chose to pass along content, “they preferred good news to bad.” According to Berger, “the more positive an article, the more likely it was to be shared.” As other neurological and psychological researchers eventually discovered, this is because social media users are conscientious not only about being interesting, but about reinforcing a positive impression of themselves to others. Indeed, as the article noted, “this social consciousness comes into play when people are sharing information about their favorite subject of all: themselves,” with 80 percent of all Twitter users sharing content about themselves—most of which, of course, was positive.

However, even though social media users often shared positive news more than negative news, the material that seems to have the greatest impact on the consumer’s worldview is that which skews pessimistic. It all has to do with how the media’s famous mantra that “if it bleeds, it leads” translates into the real world—or as Jeb Lund of the Guardian put it in his op-ed “2014 was a terrible, horrible, no good, very bad year. You probably don’t even remember why”: “Some critics credit viral social media news with intensifying this phenomenon… but that’s letting the media off too easy.”

“Fear-based news programming has two aims,” explained Dr. Deborah Serani in an article for Psychology Today. “The first is to grab the viewer’s attention. In the news media, this is called the teaser. The second aim is to persuade the viewer that the solution for reducing the identified fear will be in the news story.” While this may seem cruelly manipulative, research suggests that the media does this because their audiences instinctively gravitate toward this material. As the BBC reported earlier this year, an experiment run by McGill University that tricked subjects into reading the political stories toward which they were naturally most inclined (by telling them that only their eye movements, and not the content of what they read, would be monitored) found that “participants often chose stories with a negative tone— corruption, setbacks, hypocrisy and so on—rather than neutral or positive stories.”

The reason for this is negativity bias, which is motivated not merely by the enjoyment of others’ misfortune for its own sake, but by evolution. According to Dr. Serani, “we’ve evolved to react quickly to potential threats.” Serani said, “Bad news could be a signal that we need to change what we’re doing to avoid danger.”

When a handful of the stories declaring 2014 to be the “worst” year are deconstructed, this evolutionary instinct becomes especially evident. For example, in Dean Obeidallah’s piece for the Daily Beast on how the summer of 2014 was the “worst ever,” he specifically mentions the deaths of Michael Brown and Eric Garner, the Arab-Israeli conflict, the spread of ISIS in Iraq and Syria, and the Russian invasion of Ukraine—all stories that speak to existing threats either faced by Americans at home or potential ones in which the nation could be mired abroad.

By simply adding “Ebola” or “cyberhacking” to that list, one also gets the bulk of the stories that Jeffress and Cohen each considered when issuing their gloomy assessments of the year. When the media focuses on stories like these, it feeds on the public’s desire to feel informed about potential dangers. Because social media already caters to an instinct to feel one’s life is inadequate compared to that of those around you (which is also a very real, albeit somewhat different, type of threat), these stories inevitably add fuel to the fire.

While Jeffress is hopefully wrong about Americans becoming indifferent to these problems because they take them for granted, there are real-world political consequences to this attitude. After all, despite a strong economy and a solid record of achievement, President Obama’s Democratic Party wound up suffering significant losses in the 2014 midterm elections due less to a low approval rating than to low voter turnout. While a strong job performance may not have been enough to inspire Obama’s supporters to keep his party in power, the animosity felt by his opponents did sufficiently inspire them to appear at the polls.

Beyond the immediate realm of electoral politics, however, this issue has real-world ramifications in another important way. Although it is dangerous to be ill-informed of the real dangers that exist in our world, it is unhealthy to have a distorted point of view that skews toward the melancholy and/or fearful. As Dr. Serani explained, being bombarded with negative news stories online has joined watching TV news or reading the newspaper as a “psychologically risky pursuit, which could undermine your mental and physical health.” This isn’t to say that Americans should close their eyes to bad news, but at the same time, the exaggeration does us no favors. Being prepared for the worst and appreciating how things have gotten better aren’t mutually exclusive attitudes.


‘The Interview’ Has A Smart Argument That Deserves Respect

Published: Good Men Project (December 28, 2014), The Daily Dot (December 30, 2014)

The Interview has been getting a lot of sympathy but, from a critical standpoint, not a great deal of respect. After seeing the film twice, I find this attitude a tad dismaying.

For those who haven’t seen the movie and wish to watch it with a fresh set of eyes, there are two things you should know:

1) At its core, The Interview has a remarkably insightful thesis about human nature and politics—namely, that people respond more to theatricality, sensationalism, and emotional appeals than they do to facts and reason. While it’s certainly not the first satire to offer this observation, it is surprisingly effective at making its point, which is why it deserves more respect than it’s getting—and, even if they weren’t directly responsible for the Sony hacks, it almost certainly explains the North Korean regime’s efforts to denounce and suppress the film.

2) The Interview is also a lowbrow comedy chock full of toilet humor, dick jokes, sexual innuendos (to say nothing of explicit descriptions of sex acts), and the other staples of vulgarity one expects in an R-rated American yukfest. If these types of jokes aren’t your cup of tea, the chances are you won’t enjoy 45-55 percent of the humor in this movie. Personally, I laughed just shy of a dozen times, which is an above-average rate for a 107-minute motion picture. Make of that what you will.

The Interview begins making its argument right from the start, when it juxtaposes a montage of scenes establishing North Korea’s anti-American propaganda and Kim Jong-un’s potential to be “the next Hitler” with a clip from Skylark Tonight, a fictitious tabloid news show hosted by Dave Skylark (James Franco) and produced by his best friend Aaron Rapaport (Seth Rogen). While Rapaport has his reservations about his role in dumbing down American journalism with cheap celebrity gossip—Eminem coming out of the closet, Rob Lowe being secretly bald, that sort of thing—Skylark openly relishes being able to “give the people what they want,” which he erroneously considers “the first rule of journalism.”

Skylark may be wrong about journalistic ethics, but he is absolutely right about the importance of theatricality and mass appeal in the realm of politics—a point The Interview clearly understands and emphasizes. After Skylark and Rapaport score their big interview and are recruited by the CIA to assassinate Kim Jong-un, Skylark insists that America showcase its role in deposing the leader in a dramatic, action-packed fashion. Naturally his ideas are met with derision, and just as inevitably (from a storytelling perspective anyway), the climax winds up unfolding almost exactly as Skylark predicted/hoped it would.

What directors Evan Goldberg and Seth Rogen and writer Dan Sterling intuit, however, is that while violence and explosions win battles, it is lowest-common-denominator image-making—the type of journalism in which Dave Skylark specializes—that wins wars. Hence the heart of the story, the movie’s depiction of Kim Jong-un himself (brilliantly performed by Randall Park), which is not only humanizing but oddly sympathetic. When first introduced to the audience, Kim Jong-un is presented as nothing more than a shy fanboy, as awe-struck and intimidated at meeting Skylark as vice versa. As Skylark and Kim Jong-un get to know each other, a seemingly genuine friendship forms, with each one revealing deep insecurities over their failed attempts to win their fathers’ approval and what they believe to be the unfair ways they’re perceived by the public. The two men form a bond over their shared emotional immaturity and craving for respect, perhaps best captured by Kim Jong-un’s concession to Skylark that he is indeed “batshit crazy”:

I’m 31 years old. The fact that I’m running a country is batshit crazy. What am I to do when 24 million people look to me as their leader, their god? What am I to do when my father’s dying wish was to carry his torch?

While The Interview understands the human frailty of its real-world subject, however, it never excuses it. In fact, its ultimate position is that Kim Jong-un’s man-child psychology—the same psychology satirists have skewered in dictators since the days of Charlie Chaplin’s The Great Dictator—is precisely what makes him so dangerous. This is most ominously exposed during a dinner party scene in which the despot unleashes an outburst that could just as easily have come from the mouth of Adolf Hitler or Joseph Stalin as from the fictionalized (or real) Kim Jong-un:

To those who seek to undermine me at home and abroad, I cannot respond with anything less than the totality of my strength! If a billion people across the earth and in my own country must be burned to prove it, then my worthiness as a king will be demonstrated!

It’s easy to forget, amidst the hype, that this entire movie is meant to culminate in an internationally-televised interview between Skylark and Kim Jong-un, but it is that titular interview which finally and powerfully deflates the North Korean tyrant. First, it builds a considerable amount of tension over whether Kim Jong-un’s snow job has worked on Skylark or whether the schlocky journalist will have the chops to expose the Supreme Leader to his people—one that pays off in a particularly satisfying manner when Skylark, after unctuously sticking to his script, poses a devastatingly simple question.

From there, the interview evolves into an exchange of facts: Skylark mentions that 200,000 North Koreans are imprisoned in concentration camps, that 16,000,000 are starving, and that the country spends $800 million on nuclear weapons, while Randall Park’s Kim Jong-un rebuts—not inaccurately—that America has imposed crippling economic sanctions on his country and has more incarcerated people per capita. For a moment it looks as if Skylark is going to “lose” to Kim Jong-un, until he shifts gears from well-researched debater to his forte as a premier sensationalist, using the same skills of psychological manipulation displayed earlier in the film to reduce the strongman and would-be living god to a pathetic, blubbering wreck.

The scene works well not only as comedy but as a sharp critique of the politics that allow dictators like Kim Jong-un to acquire and maintain power. Insofar as its observations specific to Kim Jong-un himself are concerned, they work because The Interview effectively fleshes out his vulnerabilities from the moment he’s introduced, adding plausibility to the notion that this is a man who could be tricked into shitting himself (metaphorically and literally) in front of the whole world.

More importantly, however, the film understands that the masses empathize with their political leaders in the same way that they do with celebrities. Facts and figures, though intellectually persuasive, are sterile players in the theater of international politics. It is personal image-making that, for better or worse, creates and destroys aspiring statesmen. Machiavelli may have been right when he wrote that it is better to be feared than loved, but The Interview follows in the tradition of The Great Dictator in demonstrating the Achilles’ heel of that approach: When a leader who depends on being feared is made to look ridiculous, his legitimacy is demolished.

Make no mistake about it, this is why Kim Jong-un is afraid of The Interview, and with good reason: It’s working. Not only has the movie already developed a cult following in America, but it has even become something of a sensation in China, North Korea’s only ally.

What’s more, if the movie is pirated into North Korea, it could have a devastating effect on the Supreme Leader’s image there. As Bruce Bennett of the RAND Corporation explained in an email to Sony CEO Michael Lynton, “a story that talks about the removal of the Kim family regime and the creation of a new government by the North Korean people (well, at least the elites) will start some real thinking in South Korea and, I believe, in the North once the DVD leaks into the North (which it almost certainly will).” Amy Nicholson of LA Weekly explained more succinctly in her article “Why Is Kim Jong-un So Afraid of Seth Rogen?” that there is a reason why the North Korean government didn’t protest last year’s Olympus Has Fallen—a thriller in which generic North Korean baddies attack the White House—but is genuinely shaken by The Interview.

This brings us back to the critical reaction. Considering the overwhelming praise lavished on the 2006 satire Borat, it stands to reason that this film’s vulgarity was not the source of its critical lambasting. Another common theme emerges instead. You can spot it in A. O. Scott of The New York Times smugly seconding a colleague’s observation that “the only real mystery is how something this ordinary could have caused so much agitation,” in Joe Morgenstern of The Wall Street Journal sniping that “never has less of a film had more of an impact on the studio—and the nation—that produced it,” in Christy Lemire complaining on her self-named website, “This is what all the fuss was about?” and in Scott Foundas of Variety writing that “For all its pre-release hullabaloo…this half-baked burlesque about a couple of cable-news bottom-feeders…should feel like a kind of terror attack to any audience with a limited tolerance for anal penetration jokes.”

In short, if there is one thing on which critics have no choice but to agree, it’s that The Interview has made cinematic history. For better or worse, this is a movie that is destined to be remembered decades from now, if for no other reason than that it may have been the catalyst behind one of the most financially destructive acts of corporate hacking to date, provoked an international incident, and was nearly suppressed outright by Sony in an act of cowardice. While it would be unfair to claim that every critic who has panned the movie has done so out of envy, the constant references to its virtually guaranteed place in history do bring to mind one of the film’s most memorable lines: They hate us ‘cause they ain’t us.

What a shame. The Interview is certainly not a perfect movie, but even if it hadn’t become a symbol for free speech and opposition to authoritarianism, it deserves to be seen.

The ‘War on Christmas’ isn’t real – but these 6 Yuletide controversies are

Published: Daily Dot (December 25, 2014)

The nativity, whether on a church lawn or brazenly displayed at City Hall, is a fixture of American Christmases—as is, of course, controversy. While arguments over nativity scenes trend toward questions of the separation between church and state, nativity scenes gone wrong run the gamut from bizarre to funny to actively offensive. It’s not a War on Christmas, per se, but a war over Christmas, one that speaks deeply to how a changing America defines itself.

If Christmas is a time for fighting—whether it’s for that last beautiful evergreen tree or with your family after you’ve had too much eggnog—there seems to be nothing more popular to fight about than the most famous Christmas staple of all.

1) Zombie nativity

There was no good reason for the Ohio homeowner who created this zombie nativity scene to be ordered to take down his handiwork. For one thing, his neighbors have known, for the fifteen years he’s displayed it, that it’s intended as a harmless, cheeky parody; however, its harmlessness is beside the point, considering that our founding fathers not only intended to protect iconoclasm, but occasionally indulged in it (see: Thomas Jefferson’s rewriting of the New Testament). The town officials in Deer Park need to lighten up and read their Constitution.

But why zombies in particular? The walking dead have become increasingly in vogue in recent years, and according to researcher Sarah Lauro, there’s a reason for that. Lauro told Mic, “We are more interested in the zombie at times when as a culture we feel disempowered.” As Mic’s Hannah Klapp-Kote further explains, “The first popular zombie movie was in 1968, a tumultuous year in American politics with the Vietnam War, the unrest at the Democratic Convention, and the general malaise of the 1960s.”

Whereas the past decade has been marked by the proliferation of global conflict and concerns over the effects of climate change and overconsumption, the zombie might hold a particular power after a year marked by racial unrest. Either way, if zombies can take over our televisions and multiplexes, why wouldn’t they come for Christmas next?

2) Nintendo World nativity

It’s interesting how the zombie nativity has stirred up outrage, but a transparent corporate promotion from Nintendo did not. In using contemporary video game characters to stand in for the nativity, Nintendo’s PR ploy is another sign of the commercialization of Christmas, a trend stretching all the way back to the Three Kings. According to a recent Pew poll, 52 percent of Americans think that the holidays are too commercialized. As Chris Rock put it in a recent SNL monologue, “Jesus kept a low profile and we turned his birthday to the most materialistic day of the year. Matter of fact, we have the Jesus birthday season. It’s a whole season of materialism.”

But in this case, commercialism isn’t all bad, since the Nintendo nativity is quite delightful. To be fair, I’m not sure Sackboy should be playing the baby Jesus in Gamestation’s Nintendo World nativity scene. Given that Mario is the most recognizable video game character in the world, it’s hard to argue that he isn’t the proper savior of the Nintendo universe (and before you ask, Baby Mario is a thing). This is a nitpick, though; the bottom line is that there are plenty of people who worship video games with far more reverence than they do the Roman holiday used as a proxy for the birth of their Lord and Savior.

3) Pop culture potpourri nativity

As with the aforementioned Nintendo World nativity scene, I’m surprised (and happy) this one hasn’t attracted controversy. I’m also astounded that this picture hasn’t broken the Internet, since what we have here, in a single tableau, are some of the greatest fixtures of American pop culture: William Riker and Deanna Troi from Star Trek: The Next Generation, a trio of Darth Vaders as the wise men, and a smattering of random onlookers ranging from Bill S. Preston and Theodore Logan of the Bill & Ted movie series to the Tyrannosaurus rex from Jurassic Park.

Like the Nintendo scene before it, this scene blends the traditional and the modern, using the iconography of geek culture to put the story into our own symbolic framework. While this remixing is a hallmark of fandom, it’s also an important part of how we understand the world, whether it’s Stephen Colbert satirizing the GOP with Lord of the Rings metaphors or teens attempting to unpack Ferguson by drawing parallels to The Hunger Games.

While a pop culture nativity is far less fraught than the latter, it shows that these scenes aren’t just a byproduct of today’s culture—they are today’s culture.

4) Tampon nativity

In 2012, a website called Tampon Crafts launched its own tampon nativity scene; as the website makes everything from tampon blowguns to tampon earrings and anything else you might want to imagine, the potential for this particular arts-and-crafts medium is limitless. Unfortunately, the site had to take down the tampon nativity scene because of death threats, while the zombie nativity scene got a simple cease-and-desist order. What gives?

The difference between the two reactions has less to do with the level of offense than with our extreme discomfort with the finer details of female sexuality, especially when it comes to menstruation. According to Think Progress’ Tara Culp-Ressler, this is because “society’s conceptions about what it means to remain ‘pure’ have an outsized impact on women.” Culp-Ressler writes, “While men are expected and encouraged to constantly seek out sex, women are shamed for dressing or acting in a way that’s perceived to be promiscuous.”

If she’s right that these messages “begin at a very young age,” it’s important to remember that a holiday about a virgin birth need not be about purity. Sure, the mythic Mary is often venerated as a saint because she was chaste, but would she be any less important if she weren’t?

5) Atheist Nativity

There’s nothing necessarily wrong with celebrating secular rationalism as well as Judeo-Christian religiosity every December. My objection to this atheist nativity scene is not with the idea itself, but rather with the condescending message of its creators:

At this season of the Winter Solstice, may reason prevail. There are no gods, no devils, no angels, no heaven or hell. There is only our natural world. Religion is but myth and superstition that hardens hearts and enslaves minds.

That’s the official statement of Anne Nicol Gaylor, co-president emerita of the Freedom from Religion Foundation, which is responsible for the project. Her attitude is particularly unfortunate because there is a healthier way for atheists to feel their voices are being heard on Christmas. When a Santa Monica atheist went to court two years ago to defend his right to display an atheist nativity scene in a park, Charles Haynes of the First Amendment Center and the Newseum’s Religious Freedom Education Project pointed out that “the tactic of many in the atheist community has been, if you can’t beat them, join them. If these church groups insist that these public spaces are going to be dominated by a Christian message, we’ll just get in the game—and that changes everything.”

The Freedom from Religion Foundation had a better approach last year when protesting the placement of a nativity scene at the Florida State Capitol. They responded with a banner that depicted Thomas Jefferson, James Madison, Benjamin Franklin, and the Statue of Liberty adoring an infant Bill of Rights, with the message: “Let us also honor the birth of our Bill of Rights, which reminds us there can be no freedom of religion without having freedom FROM religion in government.”

In short, you can be a proud atheist on Christmas without being mean-spirited about it.

6) Anti-Semitic Nativity

It’s chic to criticize Israel these days. This doesn’t mean that Israel isn’t guilty of human rights violations—far from it—but this nativity scene criticizing the construction of a separation wall in Jerusalem reeks of bigotry. Setting aside the details of the Arab-Israeli conflict for a moment, it’s also important to note that Christmas has a long history of being used as a focal point for anti-Semitism and even outright persecution of Jews. As recently as last year, there was controversy over a Romanian Christmas song that became a hit precisely because it glorified the Holocaust and called for Jews to be burned over Israel.

Criticisms of Israel may be valid, but unless you think it’s okay to foster attitudes that discriminate against Jews while making the point that it’s wrong to persecute Muslims, you should avoid using a nativity scene as your vehicle for expressing those opinions. If Christmas is a time too often used to drive us apart and perpetuate culture wars, maybe we should think instead about how to use the nativity scene to help bring us together.

A Call To Dispense With Our Culture of Fear

Published: Good Men Project (December 23, 2014)

Matthew Rozsa offers three suggestions to protect both the police and the communities that fear them and calls for both groups to demand that politicians enact them.

___

Believe it or not, Michael Brown and Eric Garner shared one very important quality with the pair of NYPD officers who were brutally slain earlier this week:

All of them lived in a society that forced them to fear for their lives.

While the 2016 presidential election is nearly two years away, America mustn’t ignore the implications—positive as well as negative—of the fact that this potentially unifying factor exists between two communities that are currently so fiercely divided. There is an opportunity here to promote policies that can address both the individual concerns of each group and the underlying shared fear that their members carry every day.

On the one side there are the men and women of American law enforcement, the so-called “thin blue line” that protects ordinary citizens from crime and upholds the law. We take it for granted these days that our municipalities, counties, states, and federal government will provide us with their protection. It’s easy to forget that the first official American police department wasn’t established until Boston decided to do away with volunteers and fee-grabbers in 1838, and even then, it took another half century for the concept of guaranteed police protection to be institutionalized in every major American city. As the 21st century increasingly complicates every layer of our society, the need for a large and active police force is as strong as ever. The days are long gone when Jeffersonian idealism was enough to instill in all Americans a stirring belief in the importance of upholding the law, to say nothing of the communal unity necessary to punish lawbreakers without the use of a centrally organized force. If our laws are to be enforced—and regardless of what you may think of specific policies and statutes, it is axiomatic that no society can long endure when its laws aren’t being followed—then we need people like Wenjian Liu, Rafael Ramos, and Bryon Dickson to devote their careers to law enforcement.

You may notice that, although there were only two victims of the NYPD assassinations, I added a third name to that list. Dickson was the Pennsylvania state trooper killed by a homegrown sniper, Eric Frein, due to the latter’s ideological opposition to police specifically and what he perceived to be an excessively powerful government in general. Although Frein was an anarchical survivalist and Ismaaiyl Brinsley—the main suspect in the killing of Liu and Ramos—appeared to have been exacting retribution for the death of Eric Garner, both were linked by a single bond: They saw individual police officers not as human beings trying to do a job, but as agents of oppression.

It is doubtful that any cop in America is unaware that people like Frein and Brinsley perceive them this way. For a job that already carries the threat of serious injury or death every day, this knowledge has no doubt created a deep existential fear within the law enforcement community.

♦◊♦

On the other side there are the men and women, disproportionately racial minorities and the poor, who also live with a constant fear of losing their lives. Evidence abounds that racial profiling is a common practice throughout the country. Had Michael Brown and Eric Garner been anomalous cases of unfortunate police excess, it is unlikely that their deaths would have become causes célèbres. Unfortunately, police racism has an ugly history that still conjures up images of Alabama troopers turning fire hoses on civil rights protesters and LAPD officers beating Rodney King to within an inch of his life … and also, inevitably, being acquitted.

The pervasive sense among racial minorities today is that their lives are viewed as less valuable in the eyes of police and, by extension, of society as a whole. Even when they aren’t literally being gunned down or strangled to death, they are having their freedom stripped away from them. Just look at the statistics: Today America has more people in prison than any other country in the world—2.4 million, or 25% of all the world’s prisoners (despite America representing only 5% of the world population)—and African Americans and Hispanics make up more than half of that community. Racial minorities are far more likely to be convicted of a crime than whites facing the same charges and, during sentencing, are likely to receive longer and tougher penalties for their offenses.

While much of this is undoubtedly due to ingrained racist attitudes, there is a big business community that has a vested interest in maintaining this status quo. Since 1970, America’s prison population has increased by 700% despite an overall drop in crime, thanks to the privatization of America’s prison system. Whereas government-run prisons would be under pressure to spend as little money as possible, corporations like CCA (the Corrections Corporation of America) and Wackenhut make millions each year by building new facilities, feeding inmates, and providing cheap labor to various industries. In addition, despite the overwhelming evidence that our war on drugs has failed, they also successfully lobby for stricter drug laws and harsher penalties to guarantee a steady stream of new inmates. Between that and the militarization of our police—i.e., defense contractors arming our law enforcement officials with unprecedented weaponry—one is left with the sense that the business of policing has worked to the detriment of the practice.

♦◊♦

Now for the policies that can solve this. While there are many great ideas being propounded, I’d like to explore three:

  1. Mandatory body cameras for cops.

As Time Magazine reported in August, cities that have required law enforcement officials to wear body cameras have seen significant reductions in use-of-force incidents and complaints of police abuse by citizens. In the words of Police Foundation executive fellow Chief Tony Farrar:

“The findings suggest more than a 50% reduction in the total number of incidents of use-of-force compared to control-conditions, and nearly ten times more citizens’ complaints in the twelve months prior to the experiment.”

  2. Strengthen gun regulations to protect cops.

When President Obama summoned the Major Cities Chiefs Association and the Major County Sheriffs Association to the White House last year to discuss his proposed bill banning assault-style weapons and restricting high-capacity ammunition magazines, he found that while the latter organization disagreed with the gun control legislation (due mainly to sheriffs coming from rural areas with lower crime rates), the chiefs—who hailed from major cities that regularly deal with crime and, notably, lose the lives of their officers because of the availability of sophisticated firearms to civilians—were overwhelmingly supportive. As former New York City Mayor Michael Bloomberg—who was known for visiting the families of officers killed in the line of duty, and witnessed firsthand how often these weapons made the difference between life and death for these cops—bluntly put it:

“I don’t understand why the police officers across the country don’t stand up collectively and say, ‘We’re going to go on strike. We’re not going to protect you unless you, the public, through your legislature, do what’s required to keep us safe.’”

  3. Ending the war on drugs.

As discussed earlier, the war on drugs costs trillions of dollars to achieve very little … which perhaps explains why five Nobel Prize-winning economists all say we should do away with it. More to the point, however, the war on drugs accounts for such a disproportionate share of arrests that it keeps our police occupied monitoring citizens’ lifestyle choices (unwise as those decisions may be) instead of focusing on more serious threats to their communities. What’s more, because of the aforementioned racial disparity in how drug laws are applied, the ongoing war on drugs only exacerbates tensions between cops who are required to enforce those laws and the minority groups who—despite being less likely to use drugs than whites—are more likely to face the police over drug-related issues.

♦◊♦

It must be emphasized that I am under no illusions that these policies alone will completely solve the ongoing controversy over American law enforcement. That said, it is essential that our citizens—and in particular the racial minorities who rightly feel persecuted—establish a positive relationship with the men and women whose literal job is to make sure our social fabric remains intact. When Democrats like Hillary Clinton, Joe Biden, and Elizabeth Warren face off against Republicans like Jeb Bush, Mitt Romney, and Rand Paul, all of them will be asked how they plan on handling this crisis. Here are three good ways they can start.

Stop criticizing ‘The Interview’ for killing Kim Jong-un

Published: Daily Dot (December 22, 2014)

Say what you will about The Interview, but let’s abandon the notion that the filmmakers were wrong for their fictionalized depiction of Kim Jong-un’s death. The reason is simple: It is healthy to assert that Kim Jong-un deserves to die.

Kim Jong-un is not just some interchangeable foreign leader. This is a point critics, like Justin Moyer of the Washington Post, partially acknowledge, conceding that “it’s impossible to summon much sympathy for Kim Jong-un,” before urging his readers to “imagine this assassination farce was made not in Hollywood, but in North Korea or Moscow, and the leader assassinated in the film was a president of the United States. Or imagine the film was made by Iran, and the leader assassinated in the film was the prime minister of Israel.”

Dean Obeidallah offered a variation of this thinking on CNN when he criticized right-wingers as hypocrites whose defense of The Interview was “hollow pandering,” since, after all, they’d expressed outrage over the fictionalized depiction of George W. Bush’s assassination in the 2006 British mockumentary Death of a President. Amy Nicholson of LA Weekly, meanwhile, observed that North Korea didn’t care about Olympus Has Fallen, which depicted a North Korean attack on the White House, because it didn’t have the “goal to debase him and humiliate him” (Seth Rogen’s quote, not hers). “Fear is fine,” she insists, “but humiliation means war.”

Ironically, it’s entirely possible that The Interview is more geopolitically responsible than its critics anticipate. As the leaked Sony emails reveal, Bruce Bennett, a senior defense analyst at the RAND Corporation who specializes in North Korea, actually felt The Interview might have a positive effect on the Korean peninsula, since “a story that talks about the removal of the Kim family regime and the creation of a new government by the North Korean people … will start some real thinking in South Korea and, I believe, in the North once the DVD leaks into the North (which it almost certainly will).”

In terms of its message to domestic audiences, we can turn to Richard Brody’s review in the New Yorker, which describes The Interview as “a post-9/11 and, especially, a post-Iraq War meditation on a pre-9/11 theme,” specifically, “at what point is an act of war—because, of course, that’s what the planned assassination is—justified?” If Brody’s summary of the plot is correct, the film goes to extraordinary lengths to ensure that its fictionalized version of Kim Jong-un winds up posing a direct existential threat to the United States. “Rogen and Goldberg offer a comedic dramatization of a political principle that would pass muster over craft beers in a liberal barroom,” he concludes, namely that “the killing of a foreign leader, or an act of war, would be justified if and only if an attack were verifiably imminent.”

For the sake of argument, however, let’s assume that Bennett and Brody are completely wrong. Even if The Interview is just a half-baked buddy comedy with little serious foreign policy thought behind it, we still need to remember something very, very important: This isn’t Zoolander having fun with the idea of assassinating the Prime Minister of Malaysia or, again, Death of a President (which Moyer covered in a separate recent article but, tellingly, offered no personal commentary on). Even in a world in which America tortures innocent people in the name of a war on terrorism, Israeli bombs snuff out Palestinian refugee lives in the Gaza Strip, and Vladimir Putin attempts to gradually recreate the Russian empire, Kim Jong-un still belongs in a class all his own.

We must never forget that North Korea has consistently ranked among the world’s worst violators of human rights. As a United Nations report put it earlier this year, “we are in the position to in fact directly put culpability on the supreme leader for these massive human rights violations,” including ordering the suppression of political dissent, torture, massive starvation, and murder. “Eighty-four percent [of North Koreans], according to the World Health Organization, suffered from borderline malnutrition last year,” explained Sue Terry, a former CIA analyst who specialized in North Korea, during an interview last month with Fox News. “There are 120,000 that are sitting in Soviet-style gulags, and they were consigned there for political crimes, not real crimes.”

Now to pose a series of questions akin to Moyer’s: Would there be a moral outcry if a comedy depicted the on-screen death of a notorious serial killer while he was still alive—e.g., Ted Bundy, Jeffrey Dahmer, John Wayne Gacy? What about a movie that did the same for a mass murderer—e.g., Charles Manson, Lee Boyd Malvo (one of the two “Beltway Snipers”), or James Eagen Holmes (who faces a possible death sentence for the Dark Knight Rises theater shootings)?

While some might be offended on those occasions because of a broader theoretical belief in the sanctity of human life, it’s inconceivable that there would be any kind of meaningful editorial backlash against it, and for a reason we often don’t consciously recognize: We elevate political leaders, particularly when they rule entire countries, above the rest of the population. Even the ones we detest, like Kim Jong-un, are still assumed on some level to be ineffably “different” from ordinary citizens.

In a hypothetical movie showing the death of a real and living serial killer or mass murderer, only the heinous acts themselves are perceived as being attacked; apply the same treatment to a political leader, however, and people may perceive (rightly or wrongly) that you are attacking that leader’s entire country and/or the very institution of government itself. Consequently, a man like Kim Jong-un is treated differently than a common mass murderer, even though he is without question responsible for exponentially more deaths than Bundy, Dahmer, Gacy, Manson, Malvo, and Holmes combined.

Of course, there are still potential problems with how The Interview might have handled its subject. As S.E. Smith, my colleague at the Daily Dot, pointed out in an editorial on Friday, America has a long tradition of emasculating and “othering” Asian males for comic effect. When we ridicule Kim Jong-un using racist stereotypes about Asian men, the victim is not Kim Jong-un himself but the 25 million North Koreans who languish under his rule. Without seeing the film to assess its nuances, one can’t be entirely sure whether these racist tropes appear in The Interview. But if the movie commits this offense, it deserves to be condemned for it.

Perhaps one way of testing this would be to see how much energy the film invests in skewering Kim Jong-un for his qualities as a dictator (a la Charlie Chaplin with Adolf Hitler in The Great Dictator) versus how much it does so for adhering to racial stereotypes (a la Trey Parker and Matt Stone with Kim Jong-il in Team America: World Police). There is no reason for it to use the latter at all, quite frankly, and if it does, the movie will have tainted its ostensibly noble objectives.

Insofar as the question involves whether we should feel comfortable depicting the death of a real-life despot, however, the answer must be clear. Once a nation’s leader does certain terrible things, he or she forfeits the right to the respect inherent in the office and becomes fair game for the same pop culture characterizations we might bestow on anyone responsible for similar crimes. Indeed, humanity will be better served by severing Kim Jong-un, individual human being, from any identification he currently possesses as a valid representative of his people or a symbol of a respectable form of government. Kim Jong-un’s atrocities may not be sufficient grounds for America to go to war with North Korea, but they are sound cause for our culture to conclude that his authority should be delegitimized.

The moment we fail to state that as fact, we surrender something more important than our national security, more precious indeed even than our First Amendment rights. We surrender our souls.

Does Sony’s Fear Know No Bounds?

Published: Good Men Project (December 19, 2014)

Matthew Rozsa is mad as hell that Sony pulled ‘The Interview.’ But he’s equally horrified by the company’s racist attitudes.

___

If North Korea has shown the world anything, it is that the executives at Sony are abject cowards.

This may sound harsh, but it needs to be said. And since I’ve already expounded at length upon the free speech issue involved here—which, make no mistake about it, is the most pressing one in terms of current events—something should be said about what the hacked emails reveal about that company’s racist attitude toward African Americans.

Because there are two halves to the problem, we’ll inspect them separately.

Exhibit A: Sony chairman Michael Lynton arguing against casting black leads to avoid offending audiences.

“I believe that the international motion picture audience is racist,” he explained in a private email conversation. “In general pictures with an African American lead don’t play well overseas.”

This musing was prompted not by a box office catastrophe, by the way, but by the Denzel Washington hit The Equalizer. That film apparently did very well both domestically and overseas, so Lynton was complaining not about money lost but about a hypothetically higher amount the film might have grossed.

Having deflated the logic of his gripe, we should also note the quickness of his moral capitulation. After all, Lynton admits that he personally believes Washington to be “the best actor of his generation.” Certainly Washington has made enough money for the studio; Lynton, however, simply can’t help but flinch at the thought that the studio might upset bigots by promoting black leads.

If there is one silver lining to this dimension of the controversy, it’s that the outing of Sony’s racism has prompted them to strike a deal with Al Sharpton in which he will monitor their exclusionary policies in the future. At least their eagerness to please goes both ways.

♦◊♦

Exhibit B: Sony’s jokes about President Obama.

If the business community should have been taught anything by this year’s earlier humiliation of Los Angeles Clippers’ owner Donald Sterling—who was caught surreptitiously making racist remarks to his girlfriend—it is that the Internet era does not allow for privacy when it comes to repulsive opinions.

On the one hand, there is a certain hesitation any decent person feels over being too judgmental about the confidentially conveyed prejudices held by others. After all, no one is truly free from the blight of bigotry. Even those of us not raised in the Christian faith acknowledge the importance of the admonition to “judge not, lest ye be judged.”

Then again, it would be nice if people like Sterling had the chutzpah to say publicly what they confided privately. Which brings us to Sony’s jokes about our Commander-in-Chief.

The exchange in question began when Amy Pascal, another executive at Sony Pictures, asked producer Scott Rudin what she should say to President Obama during a fundraiser. Among her comments …

“Should I ask him if he liked DJANGO?” [Django Unchained]

Rudin’s reply?

“12 YEARS” [12 Years A Slave]

Pascal: “Or the butler. Or think like a man?” [The Butler, Think Like A Man]

Rudin: “Ride-along. I bet he likes Kevin Hart.” [Ride-Along]

It’s safe to assume that Pascal and Rudin never intended for this exchange to see the light of day. And if there wasn’t any evidence that Sony’s racist assumptions about President Obama reflected their business policies, these jokes could be dismissed as distasteful but little more.

Of course, we now know that these racist attitudes do percolate into their business practices (see Exhibit A). And that’s where the cowardice comes into play.

These days it is unfashionable to be known as a racist, but hardly uncommon to harbor racist views. The key (as grand juries from Ferguson to Staten Island have recently reminded us) is to enforce racist ideas under the cover of non-racist theories. Very often, those who hold these opinions don’t realize that they’re being racist. Because the same obviously is not true for Pascal and Rudin, however, one has to feel more than a little disgust at their willingness to sound like Sterling in private and like progressives in public.

I turn to Ralph Ellison’s review of An American Dilemma, the sociological classic by Swedish scholar Gunnar Myrdal on the psychological origins of racism in this country:

In our society it is not unusual for a Negro to experience a sensation that he does not exist in the real world at all. He seems, rather, to exist in the nightmarish fantasy of the white American mind as a phantom that the white mind seeks unceasingly, by means both crude and subtle, to lay to rest. Myrdal proves this no idle Negro fancy. He locates the Negro problem “in the heart of the [white] American … the conflict between his moral valuations on various levels of consciousness and generality.”

This is where the word “cowardice” comes into play. Pascal and Rudin, though conscious of their own strain of racist ideology, insist, like Sterling, on keeping it under a veil except when they are confident they will not be held accountable for their views.

♦◊♦

For better or worse, the North Korean hackers have forced Sony executives—and hopefully other business elites throughout America—to recognize that the possibility of escaping accountability is rapidly evaporating. Again, the point must be emphasized that the main concern right now is their failure to stand up for the civil liberties that have allowed them to profit so handsomely within the paradigm of American culture. The fact that this is the paramount issue, however, cannot obscure that they should also be held accountable for how their words and deeds have perpetuated America’s uglier as well as nobler traditions.

7 things Americans think are more plausible than global warming

Published: Daily Dot (December 18, 2014), Salon (December 20, 2014)

Seventy-seven percent of the country believe in angels. Only 40 percent of those who concede climate change is happening accept that it’s man-made.

In a video that went viral this week, Bill O’Reilly has spoken, not only for himself but (generously) for the rest of America: “It’s easier to believe in a benevolent God and the baby Jesus than it is about some kind of theory about global warming. It’s just easier!”

Is he right? Roughly 73 percent of Americans believe Jesus was born of a virgin, while only 61 percent believe the Earth’s temperature has been warming. Even worse, only 40 percent of the Americans who concede that climate change is happening will admit that it’s primarily due to man-made activity.

For context, let’s compare those polling figures with seven things Americans are more likely to believe than man-made climate change.

1) 77 percent of Americans believe in angels.

This 2011 AP/GFK poll found not only that more than three out of four Americans believe angels literally exist, but that so do more than four out of 10 of those who never attend religious services. A poll taken five years earlier found that 81 percent of Americans believed in angels, meaning the number has remained essentially unchanged.

2) 55 percent of Americans believe that the Founding Fathers established this country as a Christian nation in the Constitution.

In a similar vein, this figure comes from a First Amendment Center survey taken in 2007. I’m sure these Americans would be fascinated to read Thomas Jefferson’s rewriting of the New Testament, which he felt perfected Jesus Christ’s teachings by removing all theological and supernatural elements from his life story. Lest there remain any doubt, we can return to the subject of angels as we review Jefferson’s letter to John Adams opining that “to talk of immaterial existences is to talk of nothings.” Jefferson continued, “To say that the human soul, angels, God, are immaterial, is to say they are nothings, or that there is no God, no angels, no soul. I cannot reason otherwise.”

3) 1 in 4 Americans believe the sun revolves around the Earth.

This comes from a National Science Foundation study conducted earlier this year. Before you accuse me of including a statistic that doesn’t belong on this list—this figure and the one for man-made climate change are, sadly, only fifteen points apart—bear in mind that, whereas evolution and the Big Bang theory are relatively new to our collective consciousness, Copernicus and Galileo cracked our solar system’s biggest secret roughly five centuries ago. Believing that the sun revolves around the Earth is no less ludicrous than believing that the Earth is flat.

4) Only 60 percent of Americans believe in evolution.

According to a 2013 Pew Research poll, 33 percent of Americans believe that “humans and other living things have existed in their present form since the beginning of time.” Of all Americans, 32 percent believe that modern organisms evolved through natural selection, while 24 percent believe that evolution occurred through the direct intervention of God. By comparison, 97 percent of scientists believe in evolution.

5) 51 percent of Americans don’t trust the Big Bang theory.

This refers to the scientific theory explaining the origin of the universe. If the statistic referred to the sitcom, I would be wholly sympathetic (I don’t trust Chuck Lorre either).

Kidding aside, according to an AP/GFK poll from earlier this year, slightly more than half of all Americans were either “not too confident” or “not at all confident” that the Big Bang happened. This same survey also found that only 27 percent believe the Earth to be 4.5 billion years old, which is the consensus figure among scientists.

6) Only 44 percent of Americans are confident that vaccines don’t cause autism.

According to a University of Chicago study conducted earlier this year, 20 percent of Americans believe “vaccines cause autism and other psychological disorders,” while another 36 percent weren’t sure enough to agree or disagree with that statement. As someone on the autism spectrum who has written on this subject before, I must reiterate that the only basis for the belief that vaccines cause autism is a single scientific report that has since been universally discredited as a fraud.

7) 48 percent of Americans think the Civil War was about states’ rights, while only 38 percent of Americans believe it was over slavery.

While the 2011 Pew Research survey that yielded this statistic might lead you to believe that there is legitimate debate over the cause of the Civil War, there really isn’t: Over 90 percent of historians with graduate degrees accept that it was prompted by opposition to the election of Abraham Lincoln on the grounds that he would prohibit the expansion of slavery into the Western territories. Although the Southern states argued that they had a legal right to secede because of their sovereignty as states in a federal union, it was their opposition to Lincoln’s policies on slavery that incited them to leave the Union.

While it would take quite a bit of time to provide the full gamut of scientific and historical lessons necessary to correct all of these misconceptions, we can at least return to Bill O’Reilly’s climate change denialism with some measure of hope. As psychologist Dr. Michael Ranney of the University of California, Berkeley discovered, part of the challenge in getting Americans to believe in climate change comes down to simply clarifying the process by which it works. He found that straightforward, step-by-step illustrations of that process frequently changed people’s minds.

Fortunately, there’s a video for that.

If that sounds easy, the statistics above suggest that, for the majority of Americans, it’s anything but.