From Gamergate to “Ghostbusters” to Suicide Squad: The Problem of Fan Entitlement

Published: Salon (August 9, 2016)

It’s easy to roll your eyes at the “Suicide Squad” petition. In case you’ve been lucky enough to miss the news, fans of the new movie “Suicide Squad” have created an online movement to shut down aggregation site Rotten Tomatoes for posting predominantly negative reviews of their beloved film. Cue the inevitable jokes about how nerds need to get a life.

Is it really that simple, though? Over the past few years, it’s become increasingly clear that fans of pop culture properties – whether movies, TV shows, books, video games, or anything else – don’t merely view them as forms of entertainment, or themselves as consumers of said media. From Comic Cons to the nostalgia craze, it is clear that millions of people deeply identify with the culture produced by others, and, as a result of this feeling of ownership, many of them have developed a deep sense of entitlement that at its most innocuous is merely silly, and at its worst manifests itself in ugly bigotries.

The brouhaha over “Suicide Squad” offers a great starting point for tracing this evolution from the absurd to the sinister. While there is a highly unflattering whininess in those “Suicide Squad” fans who assume that critics are compelled to share their views, Rotten Tomatoes hasn’t exactly been victimized by their petition (no one believes it’s going to be effective). The same can be said of Ben Affleck, who three years ago was targeted by a petition demanding he be dropped as Batman before critics and audiences had a chance to see that he’d wind up being the best thing about “Batman v Superman: Dawn of Justice.” Incidents like these can be safely lumped under the “silly” category.

But what about the female film critic who received misogynistic death threats from a comic book fan incensed over her negative review for “Man of Steel”?

The same entitlement that can cause DC Comics fans to complain about unpopular actors or unfavorable movie reviews can also, if they harbor certain prejudices, come across in more harmful ways. Because only 15 percent of major movies star female characters, it was easy for fanboys with a sense of entitlement to denounce the new “Ghostbusters” reboot in viciously misogynistic language for recasting the lead roles with female performers. Similarly, because video games have traditionally targeted white men as their core audience, movements like Gamergate can spring up when reactionary gamers hear feminists call for increased gender diversity in gaming. These sexist attitudes even appear around franchises where you wouldn’t expect it; just ask Anna Gunn, who has endured years of harassment for her role as Skyler White in the TV drama “Breaking Bad.”

Unfortunately, the problem of nerd entitlement isn’t limited to misogyny. Last year a number of racists made waves with their movement to boycott “Star Wars: The Force Awakens” because it had cast African-American actor John Boyega in one of the starring roles. A similar backlash occurred when it came out that Michael B. Jordan had been cast as the Human Torch in last year’s reboot of “Fantastic Four.” Skip over to the realm of literature and things aren’t much better, as evidenced by the considerable number of “Twilight” fans who harassed indie pop singer FKA Twigs in vile racist language for daring to enter a relationship with the male star of their franchise’s film universe, Robert Pattinson.

These are only a handful of examples (I had to cut more than three-quarters of my research for this article just to save space), but they all underscore a common theme. It isn’t simply that consumers of popular culture often harbor ugly racist and sexist views; it’s that, because they personally identify with the properties in question, their inflated sense of entitlement over these products can make them quick to anger when that identity is challenged. This is why latent racism and sexism so often bubble to the surface among those members of the fan community who think of their identity in terms of being white and male.

The underlying logic is fundamentally irrational: It’s the belief that, because they’ve financially supported these industries their whole lives and endured social stigma for doing so, these industries owe them. While being a fan gives you a legitimate emotional connection to a product, the underlying relationship is still that of consumer with product. Any loyalty that you feel is a personal choice you make about how to invest your time and money; any choice made by a producer, from corporations to individuals, is made to promote their own self-interest. Because that involves appealing to as broad an audience as possible, it sometimes means ignoring fans who insist on exclusivist attitudes.

What can be done about this? More than anything else, we need to change the conversation that we’re having about pop culture in general. For better or worse, the fact that our generation places pop culture on such a pedestal means that the cultural has become political. As a result, when a disproportionately large number of our movies, TV shows, video games, and books feature white, straight, and male characters at the expense of other groups, this is an inherently political act (deliberately or otherwise) and needs to be confronted. Indeed, when nerds react to calls for diversity with hostility, they are only demonstrating how true this is. There is a poignant symbolic significance to including non-white, non-male, and non-straight voices in cultural roles that were traditionally reserved for members of privileged groups… and, conversely, it is terribly disheartening when the producers of entertainment refuse to recognize the cultural power they wield and utilize it in an inclusive way.

Beyond simply calling for diversity, though, we also must infuse our debate with an awareness that being a fanboy doesn’t entitle you to anything. The common thread linking the “Suicide Squad” petition to other nerd-based racist and misogynist incidents this decade is that, at their core, all of them betray an assumption that producers of popular entertainment are beholden to the nerd community. This misunderstands a basic principle of a free market society – while consumers have the right to invest or not invest their time and money as they see fit, they don’t have the right to demand that producers act as obedient servants to their will. It’s certainly nice when an author or actor or critic or film studio shows deference to the wishes of fans, but they are in no way ethically obligated to do so. Indeed, because many fans (like many people from all walks of life) harbor terrible social views, it is very often necessary for producers to disregard the will of the more vocal segments of their fanbases. Just because a lot of gamers don’t want increased diversity doesn’t mean it shouldn’t happen; just because a lot of moviegoers liked “Suicide Squad” (myself included) doesn’t mean the critics on Rotten Tomatoes should feel likewise.

At the same time, it’s also necessary for progressives to maintain an even keel about the greater significance of these cultural properties. The sexist backlash against the “Ghostbusters” reboot was certainly despicable, but that doesn’t justify alleging misogyny in every moviegoer who disliked the film (I personally thought it was good and worth seeing). It’s important to oppose racism, but that doesn’t mean we should start hashtags like #CancelColbert that willfully ignore the difference between satire and bigotry. While it’s important for progressives to stand up to problematic trends and tropes in cultural products, we still need to remember that they are ultimately just that – products. When we lose sight of this, we risk overreacting against those whose opinions and actions reflect nothing more than the fact that they, like us, are acting first and foremost as consumers of entertainment.

I suspect that, years from now, future cultural historians will love to mine incidents like Gamergate and the “Ghostbusters” controversy for deeper meaning. There is a great deal to be said about a society that loves its popular culture so fervently that it turns entertainment properties into platforms on which greater social justice causes are fought. For right now, though, it behooves all of us to take a step back and recognize that there is an air of entitlement which makes all of this possible… and none of us look good so long as it remains unaddressed.

The troubling and counterproductive trend of liberals policing free speech: We cannot silence those we disagree with

Published: Salon (August 5, 2016)

co-authored with Mark Schierbecker

When Donald Trump was asked last November to give his opinion on the student-led protests at the University of Missouri, he called them “disgusting,” adding to Fox Business News that “I think the two people that resigned are weak, ineffective people. […] Trump should have been the chancellor of that University. Believe me. There would have been no resignation.”

While Trump’s language was hyperbolic and insulting, he wasn’t simply making these comments out of spite. Throughout his presidential campaign, free speech has been a touchstone issue for right-wing politicians… and not entirely without cause. Trump may be hypocritical in criticizing others for suppressing free speech, but there is a deeper problem in our political culture that has bred demagoguery like his.

For the past few years, there have been a number of incidents in which the ethos of free speech has been outright attacked. Recently an elected student official at the University of Houston who posted #AllLivesMatter on her Twitter feed was suspended from her position, forced to attend a “diversity” workshop, and required to participate in three cultural events each month. That same week, conservative pundit Ben Shapiro was banned from speaking at DePaul University because administrators anticipated protests from the liberal students. These are only the most recent incidents in a series that stretches back to similar suppressions at Yale, Wesleyan, and other schools last year… and helps explain why 69 percent of students would favor prohibiting “intentionally offensive” speech on campuses.

Another example of the ways perceived personal slights have been used and abused is a “microaggression” tip sheet that has been adopted by many universities which labels innocuous expressions such as “America is a melting pot” as “racial microaggressions.” In higher education, programs dedicated to rooting out microaggressions, such as one instituted last year at New York’s Ithaca College, are teaching young people that microaggressions constitute grounds for punishment. The program may even be responsible for the abrupt retirement of Ithaca president Tom Rochon amid anger over improper handling of alleged racist incidents on campus. Among the charges of racism, students complained of a fraternity party’s ‘90s-inspired dress code which suggested attendees appear as their choice of either “Preps” or “Crooks.” Earlier this year, one of the authors of an influential book on microaggressions was horrified to learn that his research was being misrepresented to shame students into compliance. A co-author of the above-mentioned microaggression tip sheet was similarly displeased that it was being used to mete out punishment, rather than as a way of starting conversation.

Just to be clear, this problem is not entirely confined to the left. For all of his focus on speech suppression on college campuses, Trump himself has openly declared that he would use his executive powers to sue journalists who write critical things about him. This is a man who in three months could be elected to the presidency of the United States because many in his own party did not find his declarations to be particularly troubling. Indifference to censorship is no better than actively perpetrating it.

Different facets of this issue have been personally apparent to both of the authors of this piece. Mark Schierbecker, then a student journalist at the University of Missouri, became a national news story when in November of last year he filmed a viral video showing protesters at Mizzou forcing journalists to leave a protest site. When Schierbecker stood his ground, a communication professor assaulted him and then called for “muscle” to forcefully remove him.

That same week when Schierbecker spoke at a skeptic conference in Springfield, MO, several attendees questioned his motives for covering the story, and implied that he had racist intentions. Subsequently the conference’s organizer formally apologized to those in attendance for hosting a speaker whose story “only reflected the views of white people.”

By contrast, Matthew Rozsa (the other author of this article) had an incident from the other extreme – one that a radio host suggested tested the limits of free speech. After he wrote an article criticizing Trump for comments he considered anti-Semitic, prominent neo-Nazi websites published several pieces defaming Rozsa in overtly bigoted language. Alongside emails attacking him with racist slurs, Rozsa also received a few from leftists who argued that this proved he should reconsider the views on free speech he had expressed in earlier pieces.

The problem with this logic is that it mistakes the right to call out hate speech for the right to suppress it. Certainly it is the moral responsibility of anyone who witnesses racist, sexist, anti-Semitic, homophobic, or other bigoted language to openly call it out as such. Unless they do so, prejudiced ideas and attitudes will not be effectively identified and confronted. A line must be drawn, however, at practices that go beyond simple denunciation and instead try to silence those with contrary opinions (however unsavory they might be). When accusations of bigotry are used to shut down opinions that may not in fact be bigoted, or when the argument is made that a bigoted view should be censored altogether, everyone’s liberty is jeopardized.

This is the main lesson we must learn from the events of 2016. There is a fundamental truth about political debate that seems to have been lost – namely, that there is a difference between the specific ideas you’re supporting in a conversation and the ethics you practice in how you conduct that discussion. Regardless of whether you’re left-wing, right-wing, or anything else under the sun, it is imperative to respect the right of others to express views that are different from your own… even if those opinions, or the language with which they are conveyed, is offensive to you. Not only does this guarantee that your own right to expression will be protected if others find your ideas offensive, but it creates a climate that stimulates creative thought instead of stifling it. This should be true wherever a political debate is being held – on the Internet, at a university campus, or anywhere else.

Campus protests can go viral in no time–so can the backlash

Published: The Daily Dot (November 27, 2015)

It’s hard to follow the recent flurry of college protests without being reminded of President Harry S. Truman, who famously said that “there is nothing new in the world except the history you do not know.”

As the media continues to lavish attention on high-profile student demonstrations at Yale, Wesleyan, Princeton, and the University of Missouri, one could be forgiven for thinking there is something novel about the state of American universities today.

While this assumption isn’t entirely wrong, it’s important to identify what exactly has changed and why. As history shows, and my personal experiences confirm, college campuses have always been hotbeds for radical student activism. The difference today is that, because so many students use social media to gather news and shape their political philosophies, collegiate conflagrations that would have died out on their own in the past can instead blaze on indefinitely and spread more widely thanks to the Internet.

To illustrate my point, a brief look at my own past as an undergraduate at Bard College (which I attended from 2003 until 2006, when I graduated after completing my coursework in three years) can be particularly instructive.

The year was 2004. George W. Bush had just been re-elected and I, like the dozens of other Bard students who had campaigned for Democrat John Kerry, was devastated. Because early exit polls had predicted a Kerry victory but the candidate himself had already conceded defeat, many of us decided that a march on the local town of Red Hook, New York, would tell the world that our voices would be heard. I wrote about the protest for the Bard Observer, the main campus newspaper, roughly two weeks later:

“[A]s the incendiary rhetoric and actions of the radical students became more prominent, the more reasoned individuals – myself included – began to disassociate ourselves from the main proceedings. By the time the sit-in had started in the town of Red Hook, the initial group of 250 protesters had dwindled to what locals approximate was a band of three or four dozen highly determined activists, who found themselves sitting in the primary [four-way] intersection of Red Hook.”

Inevitably, the police were called out to handle the demonstration, and what followed was a seemingly endless dialogue between the protest leaders and law enforcement officials. Although the students were eventually persuaded to end the traffic jam they had started, several of them began tossing pebbles and other small objects at the officers as they walked back to campus. Arrests swiftly followed, leaving the bulk of the student body in an uproar.

Because the prevailing impression among students who hadn’t participated was that the cops had arbitrarily “gone berserk” – when in fact I had personally witnessed the students provoking them – I felt compelled to use my column in the Bard Observer to offer a counterpoint to the consensus story. Unfortunately, after I showed the draft of my op-ed to one of the protest leaders, the editorial was leaked throughout the campus. “Matthew Rosza [sic] is trying to print an editorial about what happened on weds,” one widely circulated email declared, “please feel free to write to him what you think of the article in order to help his poor disillusioned soul!”

What followed were two of the most intensely unpleasant weeks of my life.

As I went about my day-to-day business on the campus, I was frequently berated by total strangers – as well as more than a handful of casual acquaintances – who would accuse me of everything from fascism to plain old conservatism. In one bizarre twist, some even declared they were going to “boycott” me – not the Observer, mind you, which would have at least made sense, but me as a human being – a nomenclature faux pas that baffles me to this day (to be fair, I was guilty of a rhetorical imprecision of my own, referring to the Red Hook altercation as a “riot” in my article). The good news, though, is that as time wore on the students’ interest in this particular kerfuffle faded away. By the time I was telling this story to my extended family on Thanksgiving, I was already referring to it in the past tense.

If these same events had occurred today, I suspect the outcome would have been very different.

“What is unique about these issues is how social media has changed the way protests take place on college campuses,” explained Tyrone Howard, associate dean of equity, diversity and inclusion at UCLA, in an interview with the Los Angeles Times. “A protest goes viral in no time flat. With Instagram and Twitter, you’re in an immediate news cycle. This was not how it was 20 or 30 years ago.”

Professor Daniel W. Drezner of Tufts University made a similar point in a recent editorial for The Washington Post, remarking that “as a professor with 20 years of experience, I can assure you that college students have been saying stupid things since the invention of college students. The difference today is that because of social media, it is easy for college students to have their opinions go viral when that was not the original intent.”

Certainly there is no arguing that social media has transformed how millennials perceive and participate in our political process. As a survey by the Pew Research Center discovered in June, “when it comes to where younger Americans get news about politics and government, social media look to be the local TV of the millennial generation,” with 61 percent reporting getting their political news from Facebook in a given week. A separate survey also taken this year by the American Press Institute yielded similar results, finding that 57 percent of millennials get news from Facebook at least once a day, with others mentioning social media (or potential social media) sites like YouTube (29 percent), Instagram (26 percent), Twitter (13 percent), Pinterest (10 percent), Reddit (8 percent), and Tumblr (7 percent).

Because so many young people are engaging with politics through social media, the medium itself has shaped how they respond to current events. “Social media favors the bitty over the meaty, the cutting over the considered,” observed Nicholas Carr in a Politico article about the popularity of fringe presidential candidacies on social media platforms. “It also prizes emotionalism over reason. The more visceral the message, the more quickly it circulates and the longer it holds the darting public eye.”

In many ways, this creates a perfect storm of conditions for student protests to be blown out of proportion by both participants and third parties alike. The students provide the first ingredient by using social media to express and/or mobilize a demonstration regarding an intensely emotional position (legitimate or otherwise) that they hold on a touchstone political or social issue. If their rhetoric and/or protest activities are sensational enough to go viral, the media catches on and begins feverishly reporting it, with the subsequent heightened attention exacerbating the pre-existing histrionic climate and further fueling the cycle.

While social media was already beginning to take off during my undergraduate days in the mid-2000s, it was nowhere near as pervasive as it is today, and in terms of my personal experience that may have made all the difference. Had the same events occurred 11 years later, it is hard to imagine that either the Red Hook protest or the subsequent backlash against me would have been mere footnotes in the national news.

Certainly the fact that Bard College has consistently been ranked as one of the most left-wing schools in America (then and now) would have featured prominently in the news coverage, along with the fact that I was an active member of the campus Democratic Club and could by no means be considered conservative. The odds are also strong that, as we’ve seen with high profile student protests today, each side would have seen its mistakes mercilessly nitpicked by critics: Cell phone videos taken of the protests would have been analyzed to confirm or contradict both our versions of events, our rhetorical excesses would have been dutifully chronicled, and the emotions that might have otherwise subsided over a couple weeks would have instead intensified and metastasized. For all I know, they would have ultimately defined my entire college experience–or brought it to a premature close.

The point here is not that student protests should be discouraged or condemned. Many important issues have been brought to the fore of our national debate because of these activities, from the anti-Vietnam War and pro-civil rights movements that swept campuses in the 1960s to the concerns about racial and gender inequality that motivate students today. Although my personal experience has taught me to be critical of student protesters when they attempt to punish or silence dissenting views, I’m grateful to have had the opportunity to play a role – however small – in the vibrant political debate that is the lifeblood of American democracy.

That said, as social media continues to transform our world, it is important that we make sure not to unwittingly amplify words and deeds that could conceivably cool off on their own. In the end, this type of excess means that we have allowed our technological tools to manipulate us instead of the other way around–and when that happens, everyone loses.

Hillary Clinton, unpaid internships, and the culture of exploiting young workers

Published: Daily Dot (July 9, 2015)

Although Hillary Clinton has proposed a series of programs for reducing youth unemployment, she has also come under fire on the Internet for staffing her presidential campaign with unpaid interns. In response, Clinton announced on Tuesday that she was hiring 20 paid staffers in Iowa (the first state to hold a presidential primary or caucus next year). But, as Joanna Rothkopf of Jezebel noted, “the new hires don’t do anything to interrupt the trend of forcing even high-ranking staffers to work as unpaid ‘volunteers’ before being officially hired.”

How fast can Hillary Clinton’s unpaid interns photoshop a red white and blue soccer ball onto a bumper sticker let’s find out
— Matthew (@sixgunguerilla) July 6, 2015

While it’s valid to criticize Clinton for professing concern about youth unemployment while refusing to pay many of the young people who work for her, there is a deeper problem at play here—namely, the fact that unpaid internships have become an all-too-common way of economically exploiting young people.

“Prior to the 1990s, formal internships were rare. They functioned as apprenticeships in credentialed professional programs such as health care or accounting,” explained economist Neil Howe in a Forbes editorial last year. “But starting with late-wave Xers, this formality began to fade. College credit started to replace pay as more high-prestige companies offered unpaid positions, which continued to attract plenty of well-qualified applicants willing to compete for free.”

Because millennials have been taught that unpaid internships will help them build their resumes and develop valuable professional connections, they continue to seek them out—often without realizing that, as the National Association of Colleges and Employers reported in 2013, only 37 percent of college graduates with experience as unpaid interns received at least one job offer as a result. This is only marginally higher than the success rate of graduates who never interned (35 percent) and significantly lower than that for those who held paid internships (63 percent).


However, thanks to the problem of high youth unemployment (the latest data from the Department of Labor places the number at more than 12 percent), an increasing number of ambitious college graduates are left with no choice but to work for free if they want to get their foot in the door within their chosen fields. “No one keeps statistics on the number of college graduates taking unpaid internships, but there is widespread agreement that the number has significantly increased,” reported the New York Times in 2012.

By last year, the practice had become so widespread that the same newspaper described it as the creation of a “permanent intern underclass: educated members of the millennial generation who are locked out of the traditional career ladder and are having to settle for two, three and sometimes more internships after graduating college, all with no end in sight.”

College graduates aren’t the only ones victimized by this institution. “Unpaid internships contribute to recessions as well as are triggered by them,” writes Dr. Nicolas Pologeorgis—an expert in business, economics, and finance—in Investopedia.

Along with misleading hopeful millennials into thinking they can obtain full-time work, Dr. Pologeorgis notes that “an increased supply of free labor tends to displace full-time workers and increase unemployment,” as well as “close off opportunities for minority applicants or people coming from disadvantaged backgrounds since high-quality and prestigious internships tend to favor the students/interns who come from affluent or relatively wealthy families and can afford to work for free.”

Technically, unpaid internships that exist outside of the nonprofit sector and aren’t being used for college credit are legal only if they meet six criteria established by the Department of Labor, including that the internship must be “similar to training which would be given in an educational environment,” that the interns not “displace regular employees,” and that the employer “derives no immediate advantages from the activities of the intern.”

Unfortunately, some of the criteria contradict each other, in spirit if not precisely literally; for example, even though one stipulation requires unpaid internships to be “for the benefit of the intern,” another states that the intern isn’t “entitled to a job at the conclusion of the internship,” a detail that is frequently downplayed by employers as it would remove one of the main incentives that drives unpaid internships in the first place.

Even worse, although the Economist noted that “most unpaid internships in the private sector in America look decidedly iffy” when it comes to both the aforementioned labor regulations and minimum wage laws, it has been notoriously difficult to actually hold the employers accountable in court, despite more than 30 cases being filed on behalf of unpaid interns in the past four years.


Indeed, one particularly prominent case made Internet headlines last week: In Glatt v. Fox Searchlight, two unpaid interns who worked on the movie Black Swan from 2009 to 2010 sued to be compensated for labor that included “copying documents, maintaining takeout menus, assembling furniture, taking out trash and, in one case, procuring a nonallergenic pillow for the movie’s director, Darren Aronofsky.”

Although a federal district court ruled in 2013 that the working conditions did not meet federal standards and the plaintiffs should thus have been classified as employees, the United States Court of Appeals for the Second Circuit reversed that decision, arguing that a company may disregard the Department of Labor’s guidelines if it can be determined that the intern rather than the employer “is the primary beneficiary of the relationship” (even though an intern can be viewed as the primary beneficiary by simple virtue of having gained “experience”).

Since the opinion by Judge John R. Walker, Jr. also argued that courts should be flexible when “[examining] the economic reality as it exists between the intern and the employer,” employers have basically been given carte blanche to pick which federal guidelines they follow when using unpaid interns, then rely on appeals courts to bail them out if they lose in a lower court when accused of labor exploitation.

If there is any silver lining to all this, it’s the fact that more attention is being paid to the unfairness of the existing system. Celebrities like Charlie Rose and Lena Dunham have already been taken to task on Twitter for their use of unpaid labor (the former settled for $250,000 and the latter voluntarily agreed to start paying her performers). Although Clinton is still using plenty of unpaid interns for her campaign, her decision to pay 20 of her Iowa staffers reveals that she is at least aware of the controversy surrounding this issue.

That said, at a time when publications like the Washington Post have started referring to millennials as “the lost generation” because of their bad luck in inheriting a post-Great Recession job market, these paltry steps forward simply aren’t enough. If we want America to be a land of opportunity in practice as well as theory, we need to do away with unpaid internships altogether.

What 9 World Leaders Were Doing In Their 20s

Published: Question of the Day (June 2, 2015)

These are truly formative years. The 20-something years are often defined as sharply by the “something” in that term as by their numerical designation. This is the decade in which so many of us struggle to find ourselves in our careers; for some, the path lies clearly in front of us, while for others it winds around and is covered in shadows. With that in mind, what were the world’s most powerful leaders of today doing in this formative decade of their lives?

The Ones With Clear Paths

Some of them had already gotten a head start on their eventual career paths. Former U.S. senator and secretary of state (and current 2016 Democratic presidential candidate) Hillary Clinton spent her 20s shedding the conservative political beliefs she had been taught as a child and embracing left-wing student activism. According to her biography, as an undergraduate at Wellesley College, she organized protests for causes ranging from civil rights to ending the Vietnam War. After graduating, she went to Yale Law School (where she met her future husband Bill Clinton), distinguishing herself there with a scholarly article on children’s rights that is still widely cited today.

In a similar vein, Governor Scott Walker (R-Wisc.) attributes his catching the “political bug” to his childhood work in Boys Nation, a civic group run by the American Legion. By the time he was 25, he had managed to get elected to the Wisconsin legislature, putting him on the career track in state politics that would eventually take him to the governor’s mansion.

The Ones With Unexpected Turns

Not every future political leader had such a simple time discovering their destiny. Germany’s Chancellor Angela Merkel didn’t become engaged in politics until the age of 35, when the fall of the Berlin Wall opened up a new world of political opportunities to East Germans such as herself. Before that historic event, Merkel had focused on earning her doctorate as a physical chemist, and whatever political aspirations she might have had were subsumed by the political restrictions that she had known for her entire life as an East German. It’s hard to imagine that she, or anyone else inhabiting the bleak repression of Communist Europe, could have conceived of how radically her life would change after the revolutions of 1989.

Unlike Merkel, President Barack Obama had displayed an interest in political and social issues during his 20s, although his eventual desire to pursue a career in politics had not yet become fully evident to him, according to his biography. After graduating from Columbia University with a degree in political science (specializing in international relations), Obama moved to Chicago and spent three years as a community organizer. While studying at Harvard Law School, he told Ebony Magazine that he was pursuing his JD because it would help him in his work as a community organizer. This was a period of considerable hardship for Obama: he barely scraped by, worked grueling hours, and often felt that his various career goals were thwarted by those with more experience and pull than he had.

The Ones Who Follow Their Fathers’ Paths

Despite representing opposite ends of the conservative spectrum, 2016 Republican presidential candidate and current Kentucky Senator Rand Paul and former Florida governor Jeb Bush are both notable for closely following the career paths of their prominent fathers, Ron Paul and George H.W. Bush. In Paul’s case, this entailed spending his undergraduate years at Baylor University as an anti-tax activist with a notorious independent streak, which he then followed by completing medical school and becoming a doctor (although he became an ophthalmologist, while his father was an obstetrician).

For Bush, this meant spending the years after his graduation trying his hand at various business ventures until he managed to make his own fortune; although he eventually succeeded, Bush was so broke when he moved to Miami at the age of 27 that he had to use his American Express card to pay his MasterCard bill. “After Bush moved his young family to Miami, making money — lots of it — became a priority,” wrote Alecia Swasy and Robert Trigaux of the St. Petersburg Times. “Bush was raised in a wealthy household and wanted the same living standards for his family.”

A 26-Year-Old Barack Obama

[Photo: A family photograph of Barack Obama with his step-grandmother Sarah Obama, taken during a 1987 visit to Kogelo, Kenya, where his father was born and raised. Photo by Peter Macdiarmid/Getty Images]


The Ones Who Started In Business or the Military

Sometimes the business world brought future political leaders together in surprising ways. Both Israeli Prime Minister Benjamin Netanyahu and former Massachusetts governor Mitt Romney worked at the prestigious Boston Consulting Group in 1976, where they became good friends. More importantly, despite having had drastically different life experiences beforehand — Romney had served as a Mormon missionary in France, studied business in college, and worked on his father’s presidential campaign, while Netanyahu mixed service in the Israeli military with completing his higher education — both learned how to “employ similar methods in analyzing problems and coming up with solutions for them” (Netanyahu’s own words) during their time together at BCG.

The last group to be included here are the politicians who spent their 20-something years defending their countries. Senator John McCain (R-Ariz.) and Russian President Vladimir Putin hold very different political ideologies, but both came of age in their nations’ service: McCain spent his 20s in the Navy (he wasn’t captured and held as a prisoner of war in Vietnam until his early 30s), while Putin spent his rising through the ranks of the KGB.

If there is any lesson to be learned from studying the early adulthoods of these men and women, it is that one can wind up having a successful career in politics through any number of routes. Sometimes a future politician already sensed that he or she was going to pursue a career as an elected official, whether through their own initiative (Clinton, Walker) or because they were following in their father’s footsteps (Bush, Paul). Others started out in completely different fields before finding their way into politics (Merkel, Obama), or entered through a sideways route such as business (Romney, Netanyahu) or military and intelligence service (McCain, Putin). There is no single path that needs to be taken to become a national or world leader.

The Plight of the Basement Dweller

Published: Good Men Project (April 21, 2015)

Basement dwellers. Trolls. We’re all familiar with this personality type: Matthew Rozsa offers an explanation as to why they exist in the first place.


“Cellar dwellers.” That was the expression that my friend’s husband used when referring to the trolls who posted misogynistic comments under her latest article. After laughing at the generation gap revealed by that statement (the youngins like me prefer the term “basement dweller,” although the rhyming “cellar dweller” has an undeniable ring to it), I began to reflect on my own experience. After all, several of my closest friends fit the general description of being a basement dweller: They live with their parents, have failed to continue their education, languish in matters of romance and love, and are either sporadically employed or remain stuck in dead-end menial jobs.


For a very long time, these friends were like a second family to me. During the summer of 2010, as I recovered from a devastating break-up, they provided me with much-needed comfort and fun; the parties we threw in those months will almost certainly remain among the cherished memories of my twenty-something years. To a man they are intelligent, generous, and friendly individuals, so much so that the prospect of them not realizing their potential summons to mind the classic Allen Ginsberg verse:

“I saw the best minds of my generation destroyed by madness …”

It is this madness—the madness of a society that has produced a generation of basement dwellers—that I wish to dissect in this essay. While I’ve written about the so-called basement generation in the past, I’ve generally done so in reaction to a position they have taken (usually on gender politics) rather than by trying to bring about a better understanding of their plight. This was a serious oversight on my part, and one that needs to be corrected. There are many great minds trapped in America’s basements, and while even one lost soul is tragic enough, the fact that there are millions of basement dwellers (as a quick glance at any message board should indicate) represents a staggering waste of human potential … and, perhaps more importantly, an incalculable amount of terrible suffering.


As I see it, there are three underlying causes for the basement dweller phenomenon:

1. The economy.

This line from a recent CNBC story sums it up best: “Turning 30 used to mean hitting your stride as an official adult. But for many of the country’s millennials, it feels like being stuck in perpetual late adolescence.”

It wasn’t always this way. Back when Adlai Stevenson ran for president in 1952, he could tell an audience of college graduates that “one worry you are spared is the worry of finding a job. When you finish college and military service, you will enter a world which wants and needs you. This seems a natural thing—when you have it. It is a terrible thing when you don’t ….” More than sixty years later, the vast majority of millennials have had to struggle mightily to obtain even marginally sustainable employment, much less the kind of work that can enable the meaningful autonomy that is a prerequisite to adulthood. For those who didn’t further their education beyond high school, the job market has been even more abysmal. This is why so many millennials have become “boomerangers” who move back in with their parents long after their teenage years have ended. For men, the bleak situation is rendered downright damnable by the social stigma attached to lacking economic independence during one’s early adult years.  Inevitably they become depressed and seek a form of escape, which brings us to our second point …

2. The addictions.

During the Great Depression, countless young men who had been left emasculated and desperate by America’s harsh economic conditions became compulsive gamblers. While many social critics of the era viewed this development with understandable contempt, today psychologists understand that gambling addiction is a real and serious medical condition. Although gambling addicts don’t use mind-altering substances to artificially induce a high, they nevertheless develop a very real dependency, one that is partly neurological (the brain releases dopamine and other chemicals that “reward” the behavior) and partly psychological (people need to escape from terrible realities, and being swept up in gambling allows them to do exactly that).

When future historians look back at early 21st century America, they will most likely classify video game addiction and Internet addiction as the great scourges of the millennial generation. Like drugs and alcohol (which are often used while playing video games and/or surfing online), these addictions provide their users with an escape where they can feel good about their lives. Although harmless when indulged in moderation, it is easy to become dependent upon them when everything else in one’s life seems hopeless or burdensome. Of course, the fact that one turns to these venues for escape doesn’t mean that the anger, bitterness, and other negative emotions are left behind in the real world; indeed, the reason basement dwellers so quickly become trolls is that trolling offers them an emotional outlet that is safe (read: anonymous and thus unaccountable). So why can’t they find a more constructive way of dealing with their troubles? Perhaps it is because they are faced with …


3. The indifference.

Make no mistake about it: When basement dwellers leave racist comments on message boards or create sexist online campaigns like Gamergate; when they engage in hyperbole over their favorite pop culture properties (e.g., asserting that George Lucas raped their childhood by making bad Star Wars prequels, calling on President Obama to step in and prevent Ben Affleck from playing Batman, that sort of thing); or when they in general behave like boors … Well, they are absolutely wrong, and need to be called out on it.

At the same time, someone who finds himself in the plight of a basement dweller has legitimate emotional needs that are not being met, and if we wish to be a compassionate and rational society, we owe it to ourselves to address them. For example, much has been written about how Men’s Rights Activists depict women as prizes to be won and then complain when a woman spurns their advances as if she owes them sex. Again, when it comes to the underlying ideological issue, the feminists are right and the Men’s Rights Activists are wrong; no woman owes a man sex ever, for any reason. At the same time, heterosexual men face enormous social pressure not only to “get laid,” but to settle down in long-term romantic relationships with a suitable partner. Indeed, even if these social pressures did not exist, men have a basic psychological need for companionship that is no less strong than the one commonly attributed to women. While they don’t have the right to sleep with or date any woman they choose, they certainly have a right to eventually wind up with someone they find attractive; to argue that it is morally acceptable for them to remain alone against their wishes is not only cruelly indifferent to their needs, but a recipe for disaster. After all, a moral code that protects everyone else’s interests but is comfortable letting these men suffer doesn’t have much of a case for commanding their respect.


Although the last point used dating and sexuality as its prime example, it really applies to the whole gamut of issues facing the average basement dweller. Ideally, every human being should have the opportunity to maximize his or her latent potential and thereby flourish in the type of career that Mark Twain aptly described as one’s “true work” — i.e., “the work that is really a man’s own work is play and not work at all.” Barring that, they should at the very least have the ability to obtain employment that will allow them to achieve economic independence, and with it the psychological foundation of feeling like a true adult. From there, it stands to reason that no one who honestly seeks genuine romantic companionship should be denied it for reasons they can’t control (physical appearance, economic insolvency, etc.).

While I don’t have any solutions to these problems, I think it’s high time that we not only acknowledge their existence, but actively search for solutions to them. There are many great minds being destroyed by the madness of having to live in a basement—figuratively as well as literally—and I have been privileged to know a few of them. They deserve advocates who will help them out of this slump, just as they should be held accountable when they behave in a juvenile or hateful fashion. If we continue to fail them, we will also be failing ourselves.

Why I’ve Never Thrown Myself A Birthday Party

Published: Good Men Project (April 4, 2015)

Like so many Millennials, Matthew Rozsa isn’t sure he sees the point in throwing birthday parties.

It recently occurred to me that, in less than five weeks, I will be turning thirty years old. May 8th, to be exact – I was born on the same day that New Coke was released into the global marketplace with infamously lackluster results. It was also the 40th anniversary of V-E Day, the 101st anniversary of Harry Truman’s birthday, and (for specialists in arcane history) the 27th anniversary of the day on which Vice President Richard Nixon was nearly murdered by an angry mob in Lima, Peru.

Considering my introspective nature, I’m a little surprised that the ramifications of this impending milestone have taken so long to set in. Then again, this probably has something to do with another unusual aspect of my personality – namely, that I’ve never thrown myself a birthday party.


The birthday party is a more or less universal ritual. Different cultures obviously have their own ways of celebrating the anniversary of when one began the adventure of life, but there are few that don’t at least make note of the event. Yet I’m hardly alone among the Millennials I know in taking a relaxed view of the traditional birthday party as it is observed here in the West. More and more often, it seems to me, people our age prefer a casual approach – meeting up with a few friends, drinking some beer, watching good movies.


For the past five years I’ve had a very distinct ritual of my own. It began on May 8, 2010, less than two months after the messy conclusion to a long-term relationship and a little more than two months before I began my master’s program at Rutgers University–Newark. Needless to say, I was in a state of great flux, and what I desired more than anything at that time was some equanimity. Consequently I met up with a few very close friends – Adam, Tommy, Brian, Andy, Sean, Jen, Tommy M. (as distinguished from the first Tommy) – ordered a huge bucket of fried chicken, and watched an episode of “Mystery Science Theater 3000.”

That was it. A simple little ritual that I have thus far reenacted without exception every subsequent year. It may sound silly, but in the interim I have lived a packed life: I’ve completed my MA degree, made significant headway in my PhD program, become a published author, and met countless fascinating people as significant others, professional colleagues, and friends. After my older sister had a child, I even became an uncle – an entirely novel experience for me.


I guess what I’m saying is that, when your day-to-day life is packed, it’s easy to view the ordeal of organizing and throwing a birthday party as just one more exhausting burden. By simply surrounding myself with a few good friends and unwinding, I not only avoid turning a celebration of my life into another stressor, but pay respect to what I really want on my birthday, which is some time to myself. When I was a little child, I liked “fun” in its archetypal manifestations, and as a teenager I became more ambitious in my party plans. As an adult, though, it seems so… well, frivolous.


Of course, I may have to make an exception this year. When one of my friends invited me to a 30th birthday bash she threw for herself with characteristic panache, I began idly chit-chatting about whether I should do something special for myself this time around. That said, I wonder if there are more people out there like me. After all, our generation is one of the most put-upon in recent history. We came of age in the aftermath of 9/11 and the Great Recession, spend more of our time worrying about making ends meet than any other group of Americans since the Great Depression, and thanks to the digital revolution, we are better informed about everything that’s wrong in the world around us than any of our predecessors.

In short, we are a generation that is chronically stressed out. Am I alone in finding birthday parties to be more trouble than they’re worth?

In Defense of Being Opinionated

Published: Good Men Project (April 2, 2015), Daily Dot (April 6, 2015)

Is it entitled for millennials to believe their opinions matter? Matthew Rozsa argues: No.


Writers tend to be argumentative creatures (it’s part of our opinionated personality type), and consequently some of my most memorable debates have been with other practitioners of my craft. Such was the case a couple months ago when I had this exchange with a fellow writer who insisted, somewhat confrontationally, that people our age should not be writing op-eds:

My friend: “I think for some reason our generation thinks the world cares about our opinions. They don’t. I think it’s tremendously self-entitled. I don’t care about anyone’s opinion unless they have significant first-hand knowledge of something, or they’ve been reporting for about 30 years. Real reporting.”


Not much else needs to be said about the author of these remarks, aside from the fact that this individual is a very successful journalist. There are plenty of people throughout the world who share that fundamental opinion – to wit, that it is entitled to believe the world should and/or does care about your opinion, that it is absurd to take anyone’s opinion seriously unless they have first-hand knowledge or are a hardened journalistic veteran, and that when it comes to caring about our every inane thought and opinion, millennials are particularly egregious offenders.

As someone who has made his living writing op-eds for more than three years, I’ve had this story on my proverbial backburner almost from the moment the conversation happened (unfortunately, it is for that same reason that I’ve kept having to put off writing it as other, more pressing matters came to the fore). Now that I have some free time, however, here are a few thoughts on why everyone’s opinion matters.


First, I must stress one point: Saying that everyone’s opinion matters is NOT the same as saying that everyone’s opinion is valid. There are plenty of asinine ideas out there, and no one should hesitate in calling them out as such. That said, our society isn’t capable of differentiating between bad ideas and good ones unless we have a free marketplace for speech. Without allowing every voice to be heard, we always run the risk that an idea we might otherwise dismiss as rubbish actually has great merit, or vice versa.

Having dispensed with that possible misconception, it is time to hit upon the nut of why our generation’s opinionated nature is so valuable. For one thing, at a time when the economy has financially disempowered American millennials to a degree not seen since the Great Depression, it is essential that we recognize the difference between not being well-traveled and not being well-informed. It costs a great deal of money to go everywhere and see everything, resources that most millennials do not have at their disposal. While this in its own right doesn’t mean that the average schlub’s opinion is worth hearing – after all, the fact that someone is ill-informed through no fault of their own doesn’t make them any less ill-informed – it does require us to take into account that there are many intelligent individuals whose thoughts should be heard who, for practical reasons, won’t be able to provide on-the-scene analysis.

Thanks to the Internet (and other electronic media), however, this obstacle doesn’t have to be debilitating. In the digital era, virtually anyone can perform the research and acquire the factual basis necessary to offer meaningful commentary on any number of issues. In this regard, as in so many others, cyberspace serves as a great equalizer. Indeed, it could be something more: Just as the scholar poring over books in a library has a perspective that the explorer or archaeologist lacks – so much so that any fieldworker worth his salt would never dismiss the importance of his counterpart in the study – so too do we need the perspective of men and women who live ordinary lives but arrive at extraordinary conclusions through dint of education and hard work.

In the digital era, virtually anyone can perform the research and acquire the factual basis necessary to offer meaningful commentary on any number of issues.

It’s also important to remember that our generation is hardly the first to be opinionated. Since the days when ordinary ancient Romans carved graffiti on the walls of Pompeii or colonial Americans passed out polemical pamphlets with talk of revolution against the British Empire, people have held the unshakable conviction that their point of view matters. What makes millennials different from their forebears is, again, the Internet: While it hasn’t fundamentally altered our instinctive belief that our views on affairs external to ourselves are important, it has amplified this tendency by providing everyone with an easily accessible outlet for self-expression.

More significantly, it has allowed voices that might otherwise have been silenced to change the world. Had it not been for the Internet, it’s entirely possible that Barack Obama would never have won the Democratic presidential nomination in 2008 and would not be governing America right now (whether this is a good thing or not depends entirely on your world view). The various social justice movements that have sprung up online have made waves in feminism, gay rights, marijuana legalization, and a host of other grassroots causes that had been gestating for years but needed that cyber-push to become politically relevant. Indeed, the Arab Spring revolutions that have swept through the Middle East have been almost entirely the product of Internet activism – and their long-term impact on our world may wind up being greater than all of the aforementioned American political phenomena combined.


Though I’ve already expended nearly 900 words rebutting my journalist friend’s contempt for millennial editorializing, I really only needed twenty:

If freedom of speech is taken away, then dumb and silent we may be led, like sheep to the slaughter.

That was George Washington, a man who needs no introduction, and his quote is more germane than one might think. Although he was referring to the importance of politically guaranteed free speech as a bulwark against possible tyranny, I’d argue that cultural deference to self-proclaimed experts is simply another route by which “dumb and silent we may be led, like sheep to the slaughter.” A life merely observed is hardly worth living at all, so naturally everyone should be encouraged to throw in their two cents. This is the case to be made for the producers of free speech; as for the consumers, I simply rest with the observation that in a true meritocracy, the quality of the words, research, and logic should always surpass any other variable.