What the debate over Kim Kardashian’s race says about the changing face of America

Published: Daily Dot (January 30, 2015)

America’s obsession with celebrities has a tendency to reveal the complex depths of our social and cultural prejudices. In the case of Kim Kardashian, one observation becomes very clear: Americans need to stop seeing race as a black-and-white issue.

Kim Kardashian is not a woman of color she is Armenian = European = Caucasian not a minority #listen
— Brooke (@Brookii_Babii) December 22, 2014

Kardashian’s husband, rap luminary Kanye West, drew attention to his wife’s experiences with racism at a BET Honors event last week when he addressed rumors that his wife only married him because of his successful career. During the speech, he shared an anecdote about how her father (the late attorney Robert Kardashian, best known as a member of O. J. Simpson’s legal defense team) responded to bigoted threats, telling his daughter, “One day, you may have a black child…a beautiful, beautiful, beautiful black child…and it’s gonna be hard. You’re gonna see how hard it is.” West was no doubt at least partially referring to an incident in June when Kardashian reported being harassed on an airplane by a fellow passenger who yelled at her to “shut that black baby up.”

On the Internet, people clearly have no idea what to do with Kim Kardashian.

Along with catching flak for being in an interracial relationship, Kardashian has been part of a larger cultural trend in which celebrities are attacked for embracing diverse identities that don’t easily fit into a narrow black-white dichotomy. For Kardashian herself, this occurred in November when a Paper magazine cover celebrating her famously ample posterior was widely criticized on the Internet for echoing stereotypes about black bodies, prompting debate over whether the Armenian-American should be considered “white” or “ethnic white.” The controversy was also marked by mother-shaming, class-shaming, and general garden-variety prudishness on the Web.

Last month, Rashida Jones faced a similar problem regarding her race, documented in a video that has recently gone viral. While being interviewed on the Screen Actors Guild red carpet, a confused reporter complimented the actress by saying, “You look like you just came off an island. You’re very tan, very tropical.” Jones, who is biracial, replied: “I’m ethnic.” Outside of the African-American paradigm, there have been plenty of celebrities whose status as “white Hispanics” has caused confusion over their racial identity, including Cameron Diaz, Jessica Alba, and Freddie Prinze, Jr. Similarly, a white-passing actor like Keanu Reeves has often benefited from white privilege in casting even though he is partially of Asian descent.

On each of these occasions, a celebrity’s racial identity became the subject of scrutiny precisely because their image allowed different sections of the public to project their own perceptions onto them. More importantly, as with Kim Kardashian, their bodies were used as the cultural text for determining their whiteness or lack thereof, rather than any informed analysis of their actual genealogy.

The racial politics seen in Kardashian’s body aren’t making the news due to simple confusion over skin pigmentation, however. They also derive from how she juxtaposes her sexual identity with physical and cultural tropes commonly associated with the black community. “The photo shoot, which evoked controversial images of black women,” wrote Bethonie Butler of the Washington Post, “has stoked an ongoing debate over how black women are represented (or not) in mainstream media.” Similarly, as West noted in his speech, Kardashian’s openness about her interracial relationships has also raised questions about her views on race and how she racially self-identifies.

Ultimately, this is an issue that has very little to do with Kardashian or any other specific celebrity. Instead it is about the nuanced ways in which we as a society perceive race, which is often in the eye of the beholder. To use one particularly notorious example: After the controversial shooting of Trayvon Martin in 2012, a great deal of media attention focused on whether the perpetrator George Zimmerman should be identified as white or Latino. “To those grieving an injustice, Zimmerman was a white man,” wrote Isa Hopkins of Salon, “those who celebrate his acquittal have been quick to point out that Zimmerman is, in fact, not white at all, but Hispanic.”

Because the conventional historical pattern of white-on-black racial violence was partially inverted by Zimmerman’s half-Latino background, his racial identity became a touchy subject in the case. “His father suggest[ed] that Zimmerman look[ed] white but his mother’s Peruvian heritage means that he too understands racial discrimination and would never discriminate against another racial minority,” observed Professor Isabel Molina-Guzman of the University of Illinois at Urbana-Champaign. “Shortly thereafter the news media correct themselves and finally and controversially Zimmerman becomes a ‘white Hispanic,’” just like Cameron Diaz and Jessica Alba.

While it would be easy to glean the lesson that we shouldn’t judge people based purely on their physical appearance, the underlying problem runs deeper. As America rapidly approaches the day when whites will no longer be the majority, a phenomenon known as “white minority politics” has emerged, in which Americans who identify as white claim to feel attacked on the basis of their heritage. While the most conspicuous target of this development has been President Obama, it has also exacerbated Americans’ historic tendency to whitewash their own culture or simplify the boundaries of diversity.

Race is Kim Kardashian’s problem—because it’s everyone’s problem.

In Kardashian’s case, the core of the problem is that her ethnic background doesn’t fully sync with her image. Although Armenians have been legally considered white in this country since 1909, Kardashian’s physical attributes make it easy for her to be juxtaposed with stereotypes about African Americans. Similarly, her determination to self-identify with black culture because of her upbringing and personal life further blurs the lines of her racial perception. While she isn’t a passive recipient of ethnic confusion like Rashida Jones, or someone of non-white descent whose body allows her to benefit from white privilege like so-called “white Hispanic” celebrities, the dialogue around her racial identity is still symptomatic of a cultural desire for clear-cut racial archetypes.

This speaks to the ever-changing bounds of who gets to be white in today’s America (remember: there was a time when the Irish weren’t considered white), and Salon’s Erin Keane suggests Kardashian is the most fitting symbol of our contemporary confusion. Keane writes, “Like a round, shiny mirror, Kardashian’s butt reflects back to us our myriad cultural panics and anxieties, inviting us to oil them up and present them to the world in one glorious compendium of concern trolling and shame that reveals more about us than any magazine cover could of her.”

The Washington Post’s Sally Kohn suggested that Kim Kardashian is exploiting our racial mores for profit: “She’s like a buck-naked Trojan horse for a devious message: that the rest of us shouldn’t get our panties in a bunch about sexism and racism because, hey, Kim Kardashian is laughing about it. All the way to the bank.” However, it’s just as likely she isn’t thinking about it at all. After giving birth to her first child last year, Kardashian wrote, “To be honest, before I had North, I never really gave racism or discrimination a lot of thought. … I guess it was easier for me to believe that it was someone else’s battle.”

However, race is Kim Kardashian’s problem—because it’s everyone’s problem. She might be smiling when you look at her in the Paper magazine photos, but that doesn’t mean the reality is all that funny.

Be a Good Guy, Not a Nice Guy: A Lesson From History

Published: Good Men Project (January 29, 2015)

Liskula Cohen and Matt Rozsa


Liskula Cohen and Matthew Rozsa discuss the difference between being a nice guy and being a good one.


There is a difference between being a nice guy and being a good guy.

It’s easy to forget this, especially as the cliché that “nice guys finish last” reaches a fever pitch among Men’s Rights Activists and self-proclaimed dating gurus. Though usually invoked as a complaint, the cliché overlooks the big difference between being “nice” and being “good.” Many of the men we admire most—and whom their partners found extremely attractive—were known for being brilliant, opinionated, eccentric, vulgar … and yet, in their romantic relationships, incredibly kind, supportive, and loving.

While being “nice” merely involves knowing how to be polite, being “good” requires you to develop a genuine connection with the person you love.

This is because while being “nice” merely involves knowing how to be polite, being “good” requires you to develop a genuine connection with the person you love. Similarly, while “nice” men are accommodating and pleasant, “good” men are interesting and bring something to the table that their partners can’t find anywhere else. Finally, these men didn’t view their significant others as rewards to be collected in return for following “nice guy” rules, but as true partners who shared and influenced their lives.

Here are five famous examples that illustrate this point:

1. John Adams (and Abigail Adams; Politics)

The second president of the United States referred to his wife Abigail as his “ballast,” an allusion to the weight that keeps a ship stable and upright in rough waters. As their voluminous correspondence demonstrates, John and Abigail based their relationship on a deep mutual respect for each other’s intelligence, shared political ideals, and strength in enduring the hardships of the Revolutionary War. When John faced disloyalty and chicanery during his careers as a diplomat and president, his relationship with Abigail—the one person whose intellect he could fully trust—became vital to the success of his political career.

There is a saying, “Behind every good man there is a good woman.” Every man should acknowledge and appreciate when he’s got a wife who is willing to take these risks with him and support him.


2. Francis Crick (with Odile Crick; Molecular Biology)

While the Adamses came from Puritan stock, the Cricks were on the opposite extreme: They were Bohemians, intellectuals, and free thinkers straight out of a countercultural stereotype. Francis Crick was a molecular biologist, his wife Odile an art student trained in Vienna—and fate conspired to make their relationship one of the most important in the history of human genetics. After a night of partying (and, as the oft-told story goes, dropping LSD), Francis stumbled home and began rambling to his wife about visions of two spirals spinning in opposite directions, twirling together like pasta around a fork. Her resulting sketch was the double helix. By being on the same wavelength, Francis the scientist and Odile the artist were able to give humanity our understanding of the structure of DNA (along with Francis’s collaborator, James Watson).

They didn’t just look for what was wrong, but also for what was right. And in the process, Odile’s love of art wound up making Francis’s greatest scientific contribution possible.

It’s part of the saying “opposites attract.” These were two people who came from different sides of the playing field in terms of their interests, but it didn’t stop them from having a successful relationship. They didn’t just look for what was wrong, but also for what was right. And in the process, Odile’s love of art wound up making Francis’s greatest scientific contribution possible.


3. Bill Gates (with Melinda Gates; Business/Philanthropy)

Bill Gates’s name may be synonymous with nerdiness, but his wife Melinda actually has the superior education—a fact that provides Bill with no shortage of amusement. That said, Melinda is more than just her husband’s intellectual equal: after years of courting controversy with monopolistic business practices, Bill began to redeem his public image by pledging to give away his wealth, in the process becoming one of history’s most generous philanthropists. By all accounts, the inspiration for that was Melinda’s influence on her husband. By bringing out the best in the man she loves, Melinda has managed to use that love to make the world a better place.

By bringing out the best in the man she loves, Melinda has managed to use that love to make the world a better place.

One would hope that within every relationship, the two partners are capable of bringing out the best in their spouse. In my personal experience, I’ve only seen that happen sometimes, but it’s incredibly important. Wouldn’t it be great if all the other billionaires found women like Melinda?


4. Harry Houdini (with Bess Rahner; Magician)

Some stories have bittersweet endings. Like the other couples profiled here, Harry Houdini and his wife Bess were in a loving and faithful lifelong relationship, with Bess working as his stage assistant the entire time. After Harry’s tragic death in 1926 (on Halloween no less), Bess began to hold séances to see if her spouse would attempt to contact her. As an outspoken skeptic of occultism and spiritualism in his later years, Harry had agreed to use a secret password to communicate with his wife should he encounter her on the other side. By 1936, after one last failed séance, Bess famously told her friends, “Ten years is long enough to wait for any man.” Though she wanted very badly to see her one true love again, she refused to delude herself in a way that would have insulted his memory and the principles for which he stood.

Though she wanted very badly to see her one true love again, she refused to delude herself in a way that would have insulted his memory and the principles for which he stood.

Men sometimes don’t realize how long women have the capacity to hold on and wait. It’s not uncommon for women—especially in literature—to wait for years and years, embracing the memory of the men who left them behind. It’s important for men to try to have these sentimental qualities just as much as women.


5. George Takei (with Brad Altman; Acting)

George Takei may be best known as Mr. Sulu on Star Trek, but he found his future partner Brad Altman after asking him for help—in this case, getting in shape with the eventual goal of competing in a marathon. Although they kept their relationship secret for many years, they remained faithful in both good times and bad, with George referring to Brad as a “saint” for helping care for his terminally ill mother. Eventually they were rewarded for their patience and loyalty by finally being allowed to legally wed in the state of California.

It doesn’t take a woman to notice what is a good man or how to be good in a relationship. Gender doesn’t determine these things; you don’t need a woman to know whether a man is good news or not.

It doesn’t take a woman to notice what is a good man or how to be good in a relationship. Gender doesn’t determine these things; you don’t need a woman to know whether a man is good news or not. Their story also demonstrates that being faithful, even if it doesn’t lead to an immediate reward, can still be worth it.



That, really, is the bottom line. A nice guy may be polite, but a good guy seeks to share his life with you in a meaningful way and brings something wonderful of his own to the table.

1. Photo – Via Tsuji/Flickr

2. Photo – Cliff/Flickr

3. Photo – Ioppear/Flickr

4. Photo – World Economic Forum/Flickr

5. Photo – Battle Creek CVB/Flickr

6. Photo – Nathan Rupert/Flickr

The Pros and Cons of Workaholism

Published: Good Men Project (January 24, 2015)

Being a workaholic has its advantages and disadvantages. Matt Rozsa takes a look at both.

I am a workaholic. And if you think that statement is anything to be proud of, you might be one too.

For a very long time, I secretly aspired to workaholism. If you had thrown a penny at me six years ago, it probably would have struck me while I sat alone in a basement apartment in Montclair, NJ, doing nothing but watching TV and reflecting on my screw-ups to pass the time. The Great Recession was at its worst, my personal depression wasn’t doing much better, and I was almost four years out of college without any meaningful work experience to show for it.

Eventually I decided to change that. Within three years I had created a blog for writing political articles, written a couple of plays (one of which was performed at an Off Broadway venue, the Lark Theatre), and completed my MA in American history at Rutgers University – Newark. Three years after that, I find myself with an increasingly successful career as a freelance political columnist, a nearly completed PhD program at Lehigh University (again in history; I’ve finished my coursework, which leaves me with my comps and dissertation), and a local elected office within the Democratic Party. Whereas January 2009 found me stuck in a slump, January 2015 finds me with so much on my plate that I’m regularly scooping the metaphorical mashed potatoes off the tablecloth.


There are upsides and downsides to this.

The major advantages [of workaholism] are obvious and, to be fair, not inconsiderable. Foremost among them is the fact that I’m busy every day doing things that I love, which provides me with an immense sense of personal gratification.

The major advantages are obvious and, to be fair, not inconsiderable. Foremost among them is the fact that I’m busy every day doing things that I love, which provides me with an immense sense of personal gratification. It reminds me of an observation by Mark Twain:

“Who was it who said, ‘Blessed is the man who has found his work’? Whoever it was he had the right idea in his mind. Mark you, he says his work – not somebody else’s work. The work that is really a man’s own work is play and not work at all. Cursed is the man who has found some other man’s work and cannot lose it. When we talk about the great workers of the world we really mean the great players of the world. The fellows who groan and sweat under the weary load of toil that they bear never can hope to do anything great. How can they when their souls are in a ferment of revolt against the employment of their hands and brains? The product of slavery, intellectual or physical, can never be great.”

By waking up every morning and knowing that I’ll spend my day doing what I love (and being paid to do it… sometimes), I have an enormous blessing. Not only would I refuse to trade this for anything, I’m doing all I can to double-down on my workload and fill my life even more with these professional academic, literary, and political activities.

This is a good thing. In fact, it is sometimes too much of a good thing.


Recently I started keeping a sleep journal and noticed that I average five to six hours each night. During the times I’ve scheduled for relaxation, I frequently find my mind wandering to my work… and far more often than not, it does so in a way that stirs up anxiety. Over time, this state of constant agitation has grown so bad that I’ve needed to seek medical help, with several doctors listing my stress levels as a serious health concern. Even my editors and collaborators have noted that I seem to always be “on.”

Aside from the toll taken on my physical and mental health, my workaholism also diminishes my ability to enjoy life’s simple pleasures. So much of what makes our day-to-day existence enjoyable can’t be achieved with a schedule and sheer strenuous effort. You can’t force yourself to have fun during a night on the town, or relax with friends over a couple of beers, or really invest yourself in the plot of a movie you’re seeing in theaters, or get lost in music that crashes over you in waves at a rock concert. These things require an inner equanimity that eludes the grasp of the workaholic.

It was an equanimity that I possessed in spades six years ago. I could (and did) do all of those things back then, and I suspect that on some level I was happier for it.

This doesn’t mean that I believe everyone should live as I did at that time. Indeed, I have several friends who have refused to move on from that lifestyle and are gradually becoming bitter, jealous husks of their former selves as a result. Nevertheless, I’ve also seen that being a workaholic leads to problems of its own. Instead of being a shell, you become a nub, worn out from overstimulation and overuse.

The only path that truly offers no reward is that of the slave (to use Twain’s earlier definition). While a slacker doesn’t work at all and a workaholic is at least engaged in activities that feel right in his heart, a slave is forced to work in employment for which he or she is simply not well-suited. We live in a society that assumes it is worse to be a slacker than either a slave or a workaholic, but in truth the slave experiences a level of misery and despair that shouldn’t be envied by anyone. Whether they do this out of economic necessity (as is far too often the case), because they never found a line of work that was right for them, or for any other reason, they deserve boundless sympathy. Any society that doesn’t prioritize ending this type of slavery is unworthy of respect.

At the same time, I suspect a middle ground exists between workaholism and being a slacker. The happiest people in the world are those who strike a balance between their professional and personal lives, not only quantitatively (how much time they set aside for both) but qualitatively (how much enjoyment they derive from both). As I anticipate the years that lie ahead, I know that I have two priorities: To continue making substantial progress in my career and to develop a rich and fulfilling personal life that can accompany it.

I have my work cut out for me.

Don’t hate on critics of ‘American Sniper’ – criticize its racist hero

Published: Daily Dot (January 22, 2015), Republished on MediaREDEF (January 23, 2015), Salon (January 28, 2015)
This article is not about American Sniper. It is about Chris Kyle.

That’s an important distinction because, as American Sniper continues to break box office records, many of its fans are having trouble separating the flesh-and-blood human being from his celluloid counterpart. This was perhaps best embodied by Kid Rock’s response to the public criticisms of the film made by Seth Rogen and Michael Moore on Twitter. “F*** you Michael Moore, you’re a piece of s*** and your uncle would be ashamed of you,” he wrote. “Seth Rogen, your uncle probably molested you. I hope both of you catch a fist to the face soon. God bless you, Chris Kyle. Thank you for your service.”

The fact that Moore had focused on Kyle the man and not Clint Eastwood’s movie, while Rogen had limited his criticism entirely to the movie without mentioning a word about the real man, escaped Kid Rock, much as it has eluded others in the pro-American Sniper camp, including Craig Morton, Blake Shelton, and Sarah Palin.

All of this is noteworthy because it’s symptomatic of a larger problem. For far too many Americans, it is impossible to separate criticism of individuals within certain institutions—or even of systemic injustices perpetrated by those institutions—from the institutions themselves.

This was seen last year in the right-wing backlash against those who protested racial profiling among law enforcement. “If you read the liberal mainstream media,” argued Ben Stein, you’d think “that the main problem with race in America was poor innocent black people being set upon and mistreated by the police.” In his dismissal of the #BlackLivesMatter protests, Rudy Giuliani claimed that “they are tearing down respect for a criminal justice system that goes back to England in the 11th century.” After a crazed cop-hater assassinated two police officers in December, New York Patrolmen’s Benevolent Association president Patrick Lynch blamed it on those who “incited violence on the street under the guise of protest.”

There is an obvious logical response to these attitudes. “You can truly grieve for every officer who’s been lost in the line of duty in this country, and still be troubled by cases of police overreach,” argued Jon Stewart on The Daily Show. “Those two ideas are not mutually exclusive. You can have great regard for law enforcement and still want them to be held to high standards.”

The same thing can be said for those who criticize Chris Kyle. After all, regardless of one’s opinion on the American military, there is much that is problematic about Kyle himself. His autobiography is filled with distortions and outright lies, from claiming that he shot carjackers in Texas and New Orleans looters during Hurricane Katrina to libel against Jesse Ventura that ultimately led to a civil settlement in the former governor’s favor. He refers to Iraqis as “savages” about whom he “couldn’t give a flying f***,” says he is drawn to killing because it’s “fun,” and brags about telling investigators into his sniping of a civilian that “I don’t shoot people with Korans—I’d like to, but I don’t.”

“In Kyle’s version of the Iraq War, the parties consisted of Americans, who are good by virtue of being American, and fanatic Muslims whose ‘savage, despicable evil’ led them to want to kill Americans simply because they are Christians,” writes Laura Miller of Salon. Tellingly, Kyle’s book never challenges the Bush administration’s assumption that Iraq was somehow involved in the Sept. 11 terrorist attacks, even though no such connection has ever been proved; Kyle even claims that he personally discovered some of the weapons of mass destruction (WMDs) that were brandished as the pretext for our invasion, although all of the confirmed materials found in that country were leftovers from the early 1990s that were already known at the time.

None of these observations are meant to detract from Kyle’s physical courage and unwavering support for his fellow soldiers. If anything, the fact that a man capable of such heroism could also be so flawed only underscores the complexity of human nature—something that those who insist on Kyle’s beatification studiously ignore. In the same vein, drawing attention to Kyle’s faults is not tantamount to attacking the character of every soldier who fought and/or died in Iraq. Kyle was one man, not a symbol for the entire American military.

Of course, the reason we are seeing such reflexive rallying behind American Sniper and Kyle’s character is that there are Americans who wish to turn him into such a symbol. “Treating Kyle as a patriot and ignoring any other possibility,” observes Dennis Jett of the New Republic, “allows Americans to ignore the consequences of invading a country that had no weapons of mass destruction, had nothing to do with 9/11, and had no meaningful ties to Al Qaeda.” Just as important, the canonization of Chris Kyle allows Americans to duck the morally thorny questions involving Kyle’s possible killing of innocent civilians, his dehumanization of both Muslims in general and Iraqis specifically, and his bloodthirsty attitude toward war itself. Because his supporters don’t wish to see these things (or, even worse, secretly condone them), they gloss over the inconvenient details and insist that drawing attention to them is un-American.

This speaks to an issue even larger than questions about the Iraq War, America’s military presence overseas, or even racism among law enforcement (to refer to the earlier analogy in this article). If America is going to have an intelligent public debate on any political issue, it is essential that its citizens be able to participate without fear of having their motives baselessly attacked. More specifically, if we are to hold our government accountable for its actions, we absolutely must be able to criticize its most powerful institutions—particularly those who use violence, be it the military abroad or the police at home—without being intimidated into silence.

It’s not un-American to question Chris Kyle and the military operation he worked for. In fact, it might just be the most patriotic thing you can do.

Thoughts on Jewish Identity in a Post-Charlie Hebdo World

Published: Good Men Project (January 22, 2015)

Matthew Rozsa explores the divided politics of the American Jewish community.


Although I recall a great deal about the day I was nearly murdered, my conscious memory has suppressed the few minutes during which my head was held under water and my 6th grade classmates chanted “Drown the Jew!” Like a scratched DVD, it skips from the moment right before I was dunked to my immediate reaction after being pulled out of the water by my best friend’s older brother. Arms flailing violently, I landed indiscriminate blows on whoever was in arm’s reach—including the man who saved my life—before rushing ashore to gulp down air and vomit up mud.

To fully appreciate the plight facing American Jews in the aftermath of the Charlie Hebdo attacks, it is essential to understand that we live in a world where millions of people have marked us as targets of hatred, or even violence, simply because we are Jewish.

As the American Jewish community reacts to the terrorist attack in Paris, many of its members are feeling a primordial panic very similar to what I experienced as a twelve-year-old boy. It isn’t simply that some of the same Islamofascists who shot up Charlie Hebdo on Wednesday deliberately chose a Jewish supermarket as the location of their last stand, ultimately leading to the deaths of four hostages. Even those of us who don’t follow current events are familiar with the millennia of persecution that our co-religionists have endured, culminating in a Holocaust that claimed six million Jewish lives as part of Adolf Hitler’s “Final Solution.” To fully appreciate the plight facing American Jews in the aftermath of the Charlie Hebdo attacks, it is essential to understand that we live in a world where millions of people have marked us as targets of hatred, or even violence, simply because we are Jewish. The resulting state of nonstop existential peril felt by Jews everywhere accounts for the two greatest challenges faced by the American Jewish community in this century – one involving our political character, the other our very survival.

For most of American history, Jews have been disproportionately associated with the political left, from their conspicuous involvement in trade unionism and the civil rights movement to the fact that a plurality have voted Democratic in every presidential election since 1924.

For most of American history, Jews have been disproportionately associated with the political left, from their conspicuous involvement in trade unionism and the civil rights movement to the fact that a plurality have voted Democratic in every presidential election since 1924. As Jay Michaelson of The Jewish Daily Forward put it, “Jews are predominantly liberal because we are still mindful of being outsiders.” Conservative pundit Norman Podhoretz came to a similar conclusion, arguing that for Jews “liberalism has become more than a political outlook. It has for all practical purposes superseded Judaism and become a religion in its own right.” A Pew Research poll from 2013 confirmed this, referring to Jews as “among the most strongly liberal, Democratic groups in U.S. politics.”

French Jews alone have seen anti-Semitic hate crimes double over the past year; hate speech targeted against Israelis and Jews online has undergone a similarly dramatic increase.

Clashing with this traditional Jewish progressivism, however, is the stark reality that global anti-Semitism is currently on the rise. Last year, an in-depth Anti-Defamation League survey of more than 100 countries found that 26 percent of the people polled held anti-Semitic beliefs. French Jews alone have seen anti-Semitic hate crimes double over the past year; hate speech targeted against Israelis and Jews online has undergone a similarly dramatic increase; explicitly anti-Semitic political parties have been gaining power from Greece to Hungary; and a 2013 survey by the European Union Agency for Fundamental Rights found that almost a third of Europe’s Jews have considered emigrating because of anti-Semitism. Not coincidentally, 2014 saw the largest spike in Jewish immigration to Israel in ten years.

At first glance, the tenets of political liberalism would seem entirely consistent with opposition to the anti-Semitism that has been breaking out all over Europe (much of which has originated from the political right, such as in Greece and Hungary). That said, while liberals do overwhelmingly oppose European anti-Semitism, they have been much more likely than conservatives to hold unfavorable views toward Israel since the 1970s. By building settlements in Palestinian territory, engaging in military campaigns that lead to extraordinarily high loss of Palestinian life, and generally treating Palestinians like second-class citizens, Israel has engaged in exactly the type of behavior that American Jews would have been historically predisposed to oppose … had it been perpetrated by any other country.

Indeed, while Jews on average voted 80% for Democrats in the thirty-six years after Franklin Roosevelt’s first election in 1932, that number fell to an average of 70% in the forty years since the 1972 nomination of George McGovern – the first Democratic presidential candidate to be outspokenly critical of Israel.

This most likely explains why Jewish support for Barack Obama fell from 78% in 2008 to 69% in 2012 (the lowest share received by any Democrat since Michael Dukakis), with Jews showing unprecedented support for Republican congressional candidates (33%) in the most recent midterm elections. To the extent that polls indicate the reason for the drop-off, it appears to be Obama’s comparatively harsher stances toward Israel. Indeed, while Jews on average voted 80% for Democrats in the thirty-six years after Franklin Roosevelt’s first election in 1932, that number fell to an average of 70% in the forty years since the 1972 nomination of George McGovern – the first Democratic presidential candidate to be outspokenly critical of Israel. Although a 2013 Pew Research poll found Jews were critical of certain Israeli policies, they remained overwhelmingly supportive of Israel overall, particularly when compared to other liberals.

This is because, for all of Israel’s faults, anti-Semitism like that displayed against the Parisian Jews during the Charlie Hebdo shootings triggers an entirely human fear – namely, that you will be threatened by oppression and violence because of your heritage. Contrary to popular prejudices, Jews do not secretly control the world or support Israel out of dual loyalties, and they did not fake the Holocaust or create wars in the Middle East for profit. When Israelis undergo a sharp rightward shift in their politics and, at the end of the day, receive the support of American Jews, it is because they are acting like any group of people that has been persecuted in the past and is continuing to experience it at present. While this doesn’t justify Jewish double-standards when it comes to Israel’s treatment of the Palestinians, the fact that it is understandable and valid should be a prominent part of the conversation.

The 5 most important takeaways from Obama’s State of the Union address

Published: Daily Dot (January 21, 2015)

The gist of President Obama’s State of the Union speech can be found in one sentence: “The shadow of crisis has passed and the State of the Union is strong.”

This was not the address of a president whose party suffered a historic shellacking in the most recent set of midterm elections, but one whose approval rating has reached 50 percent and who is emerging as a tough, creative statesman during his lame duck years. There was no attempt to apologize, no undercurrent of defensiveness, no sense of feeling beleaguered. As Obama enters the final quarter of his presidency, he has made it clear that he intends to position himself as a strong leader.

Here are the five most important ways Obama got it done last night.

1) He was confrontational toward Republicans

The line from last night most likely to be remembered deserves to be detailed in full. It began when Obama told Congress, “I have no more campaigns to run.” After Republicans responded by laughing and applauding, a twinkle appeared in his eye as the president added, “I know ’cause I won both of ’em.”

This may have been the funniest line of the evening—it’s certainly the one that will stand out when historians look back on this speech, akin to Rep. Joe Wilson’s heckling of Obama’s 2009 health care reform speech. As far as proverbial throwings of the gauntlet go, however, it was hardly isolated. Instead of simply telling Republicans that they shouldn’t threaten the economy with partisan government shutdowns or stymie progress on issues like immigration reform and Wall Street regulation, he warned them that any bills sent to his desk to do these things would “earn my veto.”

As a rebuttal to conservatives who insist that there is no need to raise the minimum wage, he further issued the challenge that if they “truly believe you could work full-time and support a family on less than $15,000 a year, go try it.” These jabs and others like them cast Obama in the role of pugnacious progressive rather than shrinking violet, even though the latter would have normally been anticipated after the 2014 midterms. Of course, in the campaigns when Obama himself was on the ballot, he did “win both of ’em.”

2) He showed the human side of his agenda

When the president awkwardly transitioned into an anecdote about a Minnesota couple who had slowly recovered during the darkest days of the Great Recession, Congress initially responded with a knowing laugh. Politicians are used to the poignant “ordinary citizen” stories that are a staple of American political rhetoric, so much so that they react with cynicism when it’s clear another one is about to make an appearance.

Obama’s segue into the story of Rebekah and Ben Erler may have been awkward, but his point was still well-taken. More importantly, it was politically essential; for Obama to succeed, he must establish himself as a champion for the little guy against forces that he or she can’t control; as Obama himself put it, he needs “to restore the link between hard work and growing opportunity for every American.” The theme of middle-class economics appeared repeatedly, and the Erlers’ narrative helped drive the point home.

3) He focused on fairness

The word “fair” appeared eight times in Obama’s speech (e.g., “Make sure everyone gets a fair shot,” “It’s not fair to them,” and “As Americans, we don’t mind paying our fair share of taxes, as long as everybody else does, too”). Because Obama’s economic agenda includes boldly progressive initiatives like providing free community college for all academically eligible students, raising the minimum wage, and subsidizing child care costs for impoverished families, it was necessary to preemptively rebut the anticipated attacks on these proposals as being “socialistic.”

The same principle explains how he justified his plans to increase business regulations or raise taxes on the wealthy. Many Americans may be reluctant to openly criticize the humanitarian flaws of under-regulated capitalism, but it’s quite a different matter when you appeal to their basic sense of fairness.

4) He wasn’t shy about promoting his accomplishments

Because he entered the presidency with vast expectations, it has been fashionable for conservatives to claim that Obama has failed—indeed, to do so even before he had been in office long enough to actually accomplish anything positive or negative. In last night’s speech, however, he rattled off an impressive list of achievements: The economy is growing at a rate unseen since the booming ’90s, our unemployment rate is lower than it has been since before the financial crisis, Obamacare has insured more Americans than ever before, and our troop levels in Afghanistan and Iraq are one-twelfth of what they were when Obama took office.

This was more than mere bragging. ABC News perhaps put it best when it described this tendency as “the big turn—from ‘yes we can’ to ‘yes we did.'” For Obama’s presidential legacy to be secured (not only in terms of his own administration but the ability of future presidents to pick up his torch after he has left office), it is essential that he not allow its strengths to fade into the background or go entirely unnoticed. By devoting the entire introduction of his speech to framing his presidency within this positive light, Obama has advanced an argument that is long overdue—namely, that he has been a very, very good president.

5) He tried to bring everyone together

It’s easy to lump this in with his confrontational stance toward Republicans, but whereas the aforementioned jokes and jabs were intended to display a fighter’s spirit, the close of Obama’s speech was an eloquent attempt to appeal to America’s more virtuous angels. This could be referred to as the “better politics” section of his speech. He called for Americans to “appeal to each other’s basic decency instead of our basest fears,” where we don’t demonize our opponents but “talk issues, and values, and principles, and facts,” where we have substantive arguments instead of attack ads and find common ground on issues like women’s rights, immigration reform, and racial profiling instead of torpedoing all progress over those areas where we disagree.

The “better politics” section of Obama’s speech wound up closing the entire State of the Union address, which in retrospect seems very appropriate. After all, as Obama himself noted, the speech at the 2004 Democratic National Convention that transformed him overnight into a national figure had been focused on drawing Americans together by bridging regional, racial, political, and other polarizing differences. As his presidency draws to a close, Obama wants to make it clear that he has not lost sight of that mission. One way to do that, naturally, is to project an image of strength when advancing your agenda. In last night’s speech, Obama did exactly that.

Confessions of a Former Football Nerd

Published: Good Men Project (January 20, 2015)

Can sports fans be nerdy—in that good nerdy way? Matt Rozsa thinks so.


I’m not sure if I can be considered a Green Bay Packers fan anymore.

It isn’t simply that I missed the NFC championship on Sunday, which by all accounts was a spectacular game. It’s that I didn’t particularly care about missing it. Indeed, were it not for a well-timed email from my parents, I probably wouldn’t have known that it was on at all.

This never would have happened in the ’90s. My fandom began when, as a six-year-old child, I woke my parents in the early a.m. and proudly informed them that any team that had green in its name and its uniform deserved my support (green has always been my favorite color, so this logic seemed impeccable at the time). From that moment until I left for college, I followed the NFL season as diligently as millions of other red-blooded Americans. I memorized statistics, learned the intricacies of offensive and defensive strategy, and was swept up in the martial metaphors and weekly displays of athletic prowess.

In other words, I used to be a football nerd. To understand why that changed, it’s important to first understand how it’s possible to be a sports fan and a nerd … and what it says about our culture that those concepts will undoubtedly strike many readers as mutually exclusive.


For a working definition of what it means to be a nerd, I turn to the author John Green:

… nerds like us are allowed to be unironically enthusiastic about stuff … Nerds are allowed to love stuff, like jump-up-and-down-in-the-chair-can’t-control-yourself love it. Hank, when people call people nerds, mostly what they’re saying is ‘you like stuff.’ Which is just not a good insult at all. Like, ‘you are too enthusiastic about the miracle of human consciousness.’

Green was hardly the first social critic to observe that geek has become chic. Today we celebrate science fiction nerds and fantasy nerds, comic book nerds, and video game nerds, nerds who focus on the hard sciences and history and literature and art. We live in the era of Comic Con and The Big Bang Theory, of the Marvel Cinematic Universe and Netflix-induced binge watching, of bitter disputes over whether Han shot first or Gamergate is about more than harassing women (it isn’t). And yet, colloquially speaking, sports fans are rarely considered nerds.

Sure, most coaches and many athletes need to master and effectively employ complex strategies to win in any major organized sport you can shake a stick at (football, baseball, basketball, hockey, soccer). Similarly, it takes considerable mathematical and analytical dexterity to be a conversant fan in any of these games, be it creating a fantasy league, setting odds for a betting pool, or simply maximizing your enjoyment as an audience member by fully appreciating the intricacies of the game you’re watching. Yet when all is said and done, being a truly knowledgeable sports fan simply doesn’t carry a great deal of cachet.

One big reason for this, I suspect, is our embedded cultural bias against jock culture. Just as we have been conditioned to assume athletes are big dumb lugs, so too do we tend to view the sports fan as a visceral rather than intellectual creature. Beyond this snobbery, it’s also easy for the uninitiated to dismiss athletic competitions as simple games. Finally, the scandals that keep cropping up in the sports world—from high schools and colleges that give special privileges to their “student athletes” to steroid abuse in baseball and more NFL scandals than can be listed in a single article—have undoubtedly sullied the reputation of sports among many in the intelligentsia, who, I can vouch for from firsthand experience as a PhD student and professional editorialist, have never thought very highly of sports in the first place.

Like most prejudices, however, the assumptions that prevent die-hard sports fans from taking their rightful place in the nerd pantheon disintegrate with even modest scrutiny. “I can date my sports nerdom back to first grade, when my friend Evan and I would memorize the names of the 49ers’ (our favorite team) starting lineup,” recalled Mike Raymond, who currently hosts ESPN Radio of the Lehigh Valley. Modern technology has increased the level of sophistication with which fans can learn about their favorite sports in the couple of decades since Raymond’s childhood. “Websites like FanDuel and Draftkings have monetized the participation in fantasy sports,” he explained, while “video games such as Madden, FIFA, and NBA 2K allow you to play with teams from around the country and the world, learning more about the players and hence gaining more interest in the sport.”

Salomea Grucela, another football fan whose Facebook page remains one of my main sources of football-related information, knows plenty of people with stories like Raymond’s. “I think there’s definitely an overlap in ‘nerd culture’ that includes sports, particularly the kinds of fans who dedicate the time to memorizing stats, rosters, play calling, etc,” she explained. “The personal investment fans place in ‘their teams,’ which they themselves have never played for, or ‘their players’ who they’ve never met and don’t know on a personal level, really isn’t all that different from drawing those same emotional connections or devotions to fictional literary characters or historical heroes.” Her observations echoed those of Jessica Blasco, who told me that she hadn’t become a football fan until college. “Football was ‘just a game’ to me, and I didn’t understand everyone’s obsession with it,” she recalled, until she went to her first NCAA football game and experienced “the pride, tradition, and camaraderie that surrounded it … In an instant, the realization that you have a shared love for a team can bond you with a stranger, whether it’s at a bar, a tailgate, or seeing someone a thousand miles from home wearing your favorite jersey.”


This brings me back to my own experience as an erstwhile cheesehead (the nickname for Packers fans). There was never a single moment in which I became disenchanted with football or made a conscious decision to pay less attention to it. As I grew older, the intellectual energy that I invested in being a football fan gradually found itself directed toward other subjects, primarily the ones from which I make my career today (history, in which I’m getting my PhD, and political science and literature, which I use as an editorialist). During that time, football faded from being a secondary interest to a tertiary one, and eventually a passing curiosity with the faint aura of nostalgia affixed to it.

While the subjects that occupied my nerdy attentions may have changed, however, the passion and intensity are exactly the same. The respectable history/political/literature nerd that I am today is the direct descendant of the sports nerd who, less than twenty years ago, cheered when the Green Bay Packers won their first Super Bowl in twenty-nine years and wept when they lost their first Super Bowl ever twelve months later. As I reflect on my numb response to the Packers’ last-minute loss to the Seattle Seahawks, I can’t help but hear the elegy for the inner child who would have been devastated by that event. Make no mistake about it: That part of me was a bona fide nerd, even if the cultural consensus seems to dictate otherwise. It is a genuine tragedy when something that once helped a person plug into “the miracle of human consciousness” evaporates. After all, the best part of being a nerd is feeling that connection—and for the sake of our humanity, it’s important that we never lose it.

A Guide to Online Etiquette – and Basic Human Decency

Published: Good Men Project (January 20, 2015)

Matthew Rozsa offers a set of rules to give us a kinder, gentler Internet.


My first experience with Internet flaming occurred more than a decade ago, when I was still an undergraduate at Bard College. The 2004 presidential election had just ended, and although I’d staunchly supported John Kerry (more on that in a moment), I’d openly disapproved of how many of my fellow student activists had blocked a four-way intersection in our host community following George W. Bush’s reelection. The night after my editorial was published, I received a very unexpected email from one of my professors:

Just a heads up: The student radicals aren’t happy with you. Expect to be flamed.

Fortunately, this did not culminate in my being literally fricasseed, but I can say that the subsequent campaign of email harassment was particularly unpleasant. The head of one of the campus’s most vociferous leftist groups (though hardly its largest) sent the following message to her followers:

Matthew Rosza (sic) is trying to print an editorial about what happened on weds—please feel free to write to him what you think of the article in order to help his poor disillusioned soul! According to him we ‘alienated the community’ and committed acts of DISRESPECT towards the cops.

Certainly I’ve gotten used to it by now—my accumulated hate mail tally is without question in the hundreds, and possibly in the thousands—but the shock of my initial encounter was extremely jarring. When I reported this to our college president Leon Botstein a few weeks later (during one of his regular open houses), he offered an observation that still resonates with me today:

Emails and anonymous comments are the phone calls you shouldn’t make.

I’d like to play with this analogy here, because it actually works on several levels. If you think about the rules of etiquette that you would instinctively follow in a phone conversation, you quickly realize how many of them depend on your subconscious awareness that there is another person receiving the brunt of what you have to say. We may not be in the physical company of our guest in a phone conversation, but our attention is glued to them and we constantly require their input for validation (even if, for us talkative fellows, that involves little more than an occasional “uh huh” and “yep”).

By contrast, the recipients of online communication—by email, on Facebook, on a message board, or anywhere else—are usually nothing more than text and pixelated images on a screen (exempting Skype and other video messaging services, which aren’t as common). That’s why the first rule for being a good person when chatting online is to remember what my editor Thomas Fiffer pointed out in a Facebook forum earlier this week:

I’d like to offer a short course on respectful disagreement. A person makes a statement with which you disagree. How far do you go in your response:

A. That’s not how I feel.
B. Here’s why I don’t feel the way you do.
C. I don’t like that you feel that way.
D. You shouldn’t feel that way.
E. You have no right to feel that way.
F. I judge you for feeling that way.
G. I’m going to assume you haven’t thought your position through or read any of the research.
H. Your feeling that way makes you a bad or stupid person.
I. Your feeling that way makes me better than you.
J. Hey, everyone, can you believe this jerk feels that way?
K. I’m deeply offended that you won’t agree with me.

If you stopped at B., you get an A. Anything beyond that constitutes disrespect and communication failure.

All of these tidbits are structurally supported by the same logical premise: If something would sound rude when said aloud, it is almost certainly going to sound rude in an online conversation. Indeed, the second rule for being a good person involves precisely that: Before you submit a comment, recite it to yourself. Often it will help you catch turns of phrase and subtleties that you may have unintentionally inserted into your text … or perhaps will help you realize that what you believe comes across as righteous and informed indignation is instead condescending, confrontational, and even juvenile. As an added bonus, this practice almost always helps you improve the quality of your writing, from catching grammatical errors and fragmented logic to helping you think of a better way of expressing your ideas (personally, I use this to rein in my penchant for verbosity).

Finally, it’s important to think not merely of how you’ll sound, but of what kind of argument would be most likely to persuade you if you were on the other side of the debate. Once more, this works both as a mechanism for improving your empathy and as a convenient way to improve your skills as a writer: No matter how distant someone else’s perspective may seem from your own, a truly great debater is capable of effectively rebutting that point of view precisely because he or she can empathize (if not necessarily sympathize) with the reasoning behind it. Empathy is absolutely essential not only to comporting yourself with dignity and character, but also to sharpening your skills as a thinker and advocate.

Believe me, I’m not writing any of this from a presumed lofty perch. Although I have never participated in a formal flaming, I have certainly been guilty of emotional outbursts and downright nastiness that I have since come to regret. It’s difficult to keep your head on an even keel when it’s buffeted by emotional tempests, and anyone who writes for a living has more than a few issues that can knock them off their footing.

When I criticized many of my Bard classmates for their behavior on Election Day 2004, it was because I shared their pain and hoped they could find a constructive outlet for expressing themselves. Years later, when I read Bob Woodward’s account of the Bush administration’s experience during that contest, I was struck by how it depicted Kerry’s handling of the concession process. At first, it seemed like controversies over voting irregularities in Ohio would present Kerry with an opportunity to challenge the results similar to the way Gore had done in Florida four years earlier, but once it became clear that there was no realistic chance that the Kerry campaign could make a difference, their candidate had to stop viewing himself as President-elect and instead call to concede the election to his opponent. This was an agonizing ordeal even for a man who had won a Silver Star, a Bronze Star, and three Purple Hearts, but because the Bush team didn’t attempt to confront or pressure him, he had enough time to calm down and handle the challenge with propriety (as well as deliver as fine a concession speech as any I’ve read or seen).

Whether it was Bush showing grace toward a defeated adversary or Kerry waiting until his rational judgment could prevail, both sides of the ideological spectrum can learn a great deal from how the candidates in that election handled themselves. So too with everyone else who wishes to be a constructive participant in our public debate—political or non-political, online or off.