Why the Internet needs Monica Lewinsky

Published: Daily Dot (October 23, 2014), Huffington Post (October 24, 2014)

It’s hard to imagine anyone wanting to trade places with Monica Lewinsky. Not only is she destined to be remembered in textbooks as the “other woman” in one of America’s most infamous sex scandals (culminating in the first presidential impeachment since Reconstruction), but she continues to be a cheap and popular punchline in the contemporary zeitgeist. A decade-and-a-half may have passed since her name was a staple in headline news and talk-show monologues, but “Lewinsky” is still synonymous with everything from soapy political melodrama in general to the specific sex act she performed on President Bill Clinton.

Viewed from this perspective, it’s hardly surprising that she eventually returned to the spotlight in an attempt to revise her public identity. More striking is the fact that her recent speech at Forbes’ “30 Under 30 Summit” not only offered valuable insights about her role in America, but by extension drew attention to how the Internet has fundamentally changed our culture. Even as political pundits speculate about her potential effect on Hillary Clinton’s probable presidential campaign or debate whether she should be chiefly remembered as a victim or a villain (both of which, it must be emphasized, are valid questions), we shouldn’t overlook the universal lesson Americans can draw from Lewinsky’s legacy.

As the first major scandal of the digital age—political or otherwise—the Clinton-Lewinsky scandal proved that the Internet was a game-changer in the public’s relationship with notable people, from politicians and celebrities to recipients of Warhol’s “fifteen minutes of fame.”

To be clear, there were plenty of presidential sex scandals before Clinton took office (see: Thomas Jefferson and Sally Hemings, Grover Cleveland and Maria Halpin, Franklin Roosevelt and Lucy Rutherfurd, John Kennedy and Judith Exner, et al.). One of the main reasons Clinton’s illicit liaison nearly cost him the presidency, however, was that it occurred just as the Internet was beginning to come into its own.

As Lewinsky explained:

Thanks to the internet and a website that at the time was scarcely known outside of Washington D.C. (but a website most of us know today called the Drudge Report), within 24 hours [of the story being scooped and leaked online] I became a public figure, not just in the United States but around the entire globe. As far as major news stories were concerned, this was the very first time that the traditional media was usurped by the Internet.

For Clinton, the main consequence here was scandal and impeachment, both of which marred his reputation without overshadowing the public’s esteem for his impressive record as president. Lewinsky, on the other hand, became “the first person to have their reputation completely destroyed worldwide via the Internet.” Her subsequent emotional ordeal was nothing short of hellish:

The experience of shame and humiliation online is different than offline. There is no way to wrap your mind around where the humiliation ends—there are no borders. It honestly feels like the whole world is laughing at you… I couldn’t imagine ever showing my face in public again. I cringed. I yelled. I sobbed. And the mantra continued: I just want to die.

As she notes, all of this happened to Lewinsky before the rise of Facebook, Twitter, and YouTube. In short, before social media and society’s growing dependence on the Internet made everything exponentially worse.

To be fair, Lewinsky’s claim of having been an early victim of cyberbullying is somewhat flawed. Although she draws comparisons to the likes of Jennifer Lawrence and Tyler Clementi, the former was victimized by acts of theft and sexual exploitation, while the latter fell prey to a malicious prank that drove him to suicide. Lewinsky, on the other hand, actually harmed innocent parties (namely Hillary and Chelsea Clinton) through her actions.

What’s more, regardless of whether one believes this ought to be the case, the public has long held an interest in the sexual indiscretions of its political leaders; had the Internet never existed, Lewinsky’s actions still would have been considered newsworthy (although it’s unlikely the ramifications would have been nearly as overwhelming for her personally).

At the same time, Lewinsky has a strong point about the brutality of modern Internet culture. Our society takes for granted that a single mistake in one’s professional or personal life should have permanently destructive consequences. In the years since Lewinsky, Google has become so powerful that its name is synonymous with the act of conducting thorough research on another human being.

As a result, a person’s entire identity can now be defined by a single mistake from their past—one angry email, one embarrassing video, or one off-color joke. The notion that human beings are complex creatures has gone by the wayside; in its place is the unspoken belief that any wayward action (real or, in the case of actual bullying victims like Lawrence and Clementi, not so) can fairly be used as the sole point of identification for the rest of a person’s life.

Indeed, we relish these public humiliations. Hit television shows exist solely to mock those whose foibles somehow wound up online for the world to see, be they celebrities (TMZ) or ordinary folks on YouTube (Tosh.0). Photographs circulated online are used to shame beautiful people who have “let themselves go” (e.g. Renee Zellweger) or chronicle the downward spirals of prominent individuals experiencing financial, personal, or drug-related difficulties.

“We have created, to borrow a term from historian Nicolaus Mills, a ‘culture of humiliation,’” Lewinsky wrote in a recent Vanity Fair article, “that not only encourages and revels in Schadenfreude but also rewards those who humiliate others, from the ranks of the paparazzi to the gossip bloggers, the late-night comedians, and the Web ‘entrepreneurs’ who profit from clandestine videos.”

In this sense, Lewinsky’s chief legacy is more a cultural than a political one. While she unwittingly provided a new breed of journalist with their first taste for blood, her post-scandal life illustrates the human cost of our prevailing ethos. As Jonathan Bernstein, a crisis consultant in Los Angeles, aptly put it: “There have always been people whose aim in life was to cause pain to others. If they saw people embarrassing themselves, they got pleasure in sharing that information. Before the Internet, they had to gossip with their neighbors. Now they can gossip with the world.”

Perhaps Lewinsky’s newly minted Twitter page will provide her with an opportunity to change the conversation. Given Twitter’s notorious reputation as a breeding ground for misogyny, it will be interesting to see Lewinsky take on her detractors and (hopefully) draw attention to other victims of Internet cruelty. Should that happen, she will provide a genuine service for America. If nothing else, the misogynistic tweets she’s been receiving are proof that her voice is as needed as ever.

It is here that we return to Lewinsky the laughingstock, the one trying to reclaim her identity today. Whatever errors she committed in her past, it is hard to reflect on the commentary surrounding her in the late 1990s without cringing at the slut-shaming, fat-shaming, and other cruel jokes she endured. Surveying the decades that have passed since, it appears that the Internet’s malevolent streak has only gotten worse.

If politicians were the only ones forced to be increasingly image-conscious as a result, the net effect might be positive. Considering that virtually anyone can be caught in the web, however, we may be wise to heed some of what Lewinsky is saying.

Gamergate and the Basement Generation

Published: Good Men Project (October 23, 2014), Daily Dot (October 27, 2014)

Matt Rozsa doesn’t see the passion of the basement generation as necessarily a bad thing, but it needs to be used for good.

“Man, I see in Fight Club the strongest and smartest men who’ve ever lived. I see all this potential, and I see it squandered. God damn it, an entire generation pumping gas, waiting tables – slaves with white collars. Advertising has us chasing cars and clothes, working jobs we hate so we can buy shit we don’t need. We’re the middle children of history, man. No purpose or place. We have no Great War. No Great Depression. Our great war is a spiritual war. Our great depression is our lives. We’ve all been raised on television to believe that one day we’d all be millionaires, and movie gods, and rock stars, but we won’t. And we’re slowly learning that fact. And we’re very, very pissed off.”

– Brad Pitt in “Fight Club” (1999)

“It is a crisis of confidence… The confidence that we have always had as a people is not simply some romantic dream or a proverb in a dusty book that we read just on the Fourth of July. It is the idea which founded our nation and has guided our development as a people… Our people are losing that faith, not only in government itself but in the ability as citizens to serve as the ultimate rulers and shapers of our democracy.”

– President Jimmy Carter (1979)

Introduction:

I’m not the first pundit to observe that Gamergate isn’t really about video games. As Chris Tognotti of Bustle succinctly put it, the popular online campaign directed against feminist video game developers and critics (most notably Anita Sarkeesian, Zoe Quinn, Jenn Frank, and Brianna Wu) is “a long-simmering pot of male privilege, misogyny, and slut-shaming in the gamer community boiling over.” Authors from Adi Robertson at The Verge to Amanda Marcotte at The Daily Beast have already analyzed the sexist assumptions driving the Gamergate movement, while I wrote a piece at The Good Men Project about the “angry mob” mentality being displayed by the Gamergaters, as seen most recently in the threatened school shooting at Utah State.

It isn’t enough, however, to simply say that Gamergate is at its core a clash between progressives and enraged reactionaries over the acceptability of gender archetypes in video games. These games may not be the fundamental source of Gamergate, but it’s very telling that they have become the front on which this battle is being waged. At a time when Americans are feeling increasingly disenchanted with and detached from their own society, the passion and energy that in the past would have been invested in real world issues is instead being directed toward the fantasy worlds created in various fictional media, most notably those facilitated by the digital revolution (film, television, music, video games). The act of consuming pop culture, which used to be an escapist diversion, has now become the central focus in life for many millennials. If we are to effectively address the crisis of masculinity that has been exposed by Gamergate, we must first confront this dimension of our zeitgeist – namely, the rise of the Basement Generation.

The Basement Generation:

The term “basement dweller,” as defined by the online encyclopedia of colloquialisms known as the Urban Dictionary, is “a male over the age of eighteen, usually with no formal education, who continues to live at home with his parent(s) and has no ambition to move out or contribute to society. His will to live is sucked by his daily marijuana regiment. He can be found working second shift at your local gas station. Lives in a perpetual state of ‘Getting his shit together.’”

This description, though obviously intended to be humorous, contains a great deal of existential truth for American millennials. If anything, its main inaccuracy is that it singles out men for derision (more on that in a moment), since studies clearly show millennial women joining men in residing with their parents through their 20s and 30s, languishing in unemployment or underemployment, and abandoning higher education in increasing numbers. Unlike women, however, male “basement dwellers” face the stigma of failing to measure up to America’s standards of masculinity. Since the days of the early republic, a certain level of economic and social autonomy has been a prerequisite to American manhood – the former has generally entailed working hard at a full-time job, residing in one’s own home, and “providing” for one’s wife and children, while the latter encouraged active engagement with larger political and social issues. So many young men today are unable to meet these expectations that they fall under the broad heading of “basement dweller” on an epidemic level, making it necessary to designate them as members of a “basement generation.”

While the aforementioned quote from “Fight Club” goes a long way toward explaining the attitude of the basement generation, its assessment is incomplete in the post-Bush world. After all, “Fight Club” was released during a period of unprecedented peace and prosperity, when the Internet was still young, the Great Recession was more than eight years in the future, and pop culture nerds were a fringe profiled in cult classics like “Clerks.” Consequently, although the ennui captured by “Fight Club” is still prevalent today, it has been coupled with a socioeconomic stagnation unseen since the Great Depression – a period which, not coincidentally, also saw Americans turn to mass entertainment for escape at unprecedented levels. Unlike the 1930s, however, the 2010s offer a medium (the Internet) that allows consumers of mass entertainment to indulge in the escapist entertainment of their choosing for as long as they please and form tight-knit communities with like-minded individuals all over the world. When these variables are combined with the inherently addictive nature of today’s most popular forms of entertainment (including not only video games but TV and the Internet), the final result is a perfect storm for a very specific type of mass delusion – namely, one in which the line between fiction and nonfiction is dangerously blurred.

How Basement Men View Women:

This is why Sarkeesian’s feminist critique of video games is so important. Mass culture has a profound influence on how people view the world regardless of the period in history, but for members of the basement generation, it often serves as their primary paradigm. When women are depicted as things rather than people – damsels in distress to be rescued, sex objects to be lusted over, supporting characters who exist only to propel forward the journey of a male protagonist – these notions creep into the gamer’s perception of reality outside of the game. It results in the type of misogyny that, as was recently reported, is scaring female gamers away from pursuing their dreams. It contributes to the attitudes that fill the ranks of Men’s Rights Activists with angry white men who embrace a hysterical anti-woman ideology (and I’m basing the assumption that Gamergaters and MRAs are overwhelmingly composed of young white men on the fact that the hundreds I’ve met or encountered online all fit that description). This isn’t the same as saying that gamers can’t tell the difference between fictitious violence and the real thing (an argument used to attack the medium whenever a mass shooting hits the news); extreme violence, though sometimes gratifying as an abstract thought, falls too far beyond the pale. Gender archetypes, on the other hand, are much more subtle – and thus more influential.

The Gamergate backlash against Quinn, Sarkeesian, and Wu demonstrates this problem in two ways. The sheer vitriol of the attacks, combined with the utter absurdity of their logic (the notion that it’s about journalism ethics is brilliantly torn apart here), comes from a sense of outrage that the already-emasculated basement man is now seeing one of the few fragments of gender privilege he still possesses under attack. Since the basement man can’t control the real world, he opts for one that he can manipulate to his will – not a fight club, but a fantasy realm. Thus, while Sarkeesian and other feminists express concern over how the tropes in a fictional medium might influence real-world attitudes and behaviors, basement generation gamers view their fictional medium as a reality in its own right, one in which their status must be protected.

It is here that the basic underlying difference between a “gamergater” and a regular gamer becomes clear. To an ordinary gamer, video games are a pleasurable diversion – perhaps one to which they devote a lot of time and enthusiasm, to be sure, but still distinct from the external world they actually inhabit. As a result, the countless gamers who either side with the feminists or are simply indifferent to the current brouhaha aren’t likely to become outraged over a proposal that video games be more socially conscious (unless those gamers already happened to harbor misogynist views). To the basement gamer, however, the danger is as real as that felt by conservatives when the women’s suffrage movement crested in the early 20th century. The realization that they are stirring up Sturm und Drang over a simulation of reality rather than the real thing may not fully elude them, but in an era of widespread jadedness, this doesn’t perturb them.

Conclusion:

When I try to think of an antidote to the malaise afflicting the basement generation, I am reminded of a speech delivered by Theodore Roosevelt back when he was governor of New York in 1899:

“I wish to preach, not the doctrine of ignoble ease, but the doctrine of the strenuous life, the life of toil and effort, of labor and strife; to preach that highest form of success which comes, not to the man who desires mere easy peace, but to the man who does not shrink from danger, from hardship, or from bitter toil, and who out of these wins the splendid ultimate triumph.”

While the era in which Roosevelt spoke was in many ways quite different from our own, the value he placed on hard work and risk-taking would quite neatly address the dilemma facing the basement generation. This isn’t to say that they have to hunt moose or even engage in strenuous physical activity; rather, it means that they need to find ways of channeling the passion they now invest in simply consuming an entertainment product into more constructive outlets. Video games don’t have to be an exclusionary pastime; they can be one that brings people together and forges wonderful relationships (former NFL punter Chris Kluwe waxes colorfully about this in his own anti-Gamergate piece). One is reminded of Simon Pegg’s description of what it means to be a geek:

“Being a geek is all about being honest about what you enjoy and not being afraid to demonstrate that affection. It means never having to play it cool about how much you like something. It’s basically a license to proudly emote on a somewhat childish level rather than behave like a supposed adult. Being a geek is extremely liberating.”

This idea, when combined with Roosevelt’s gospel of joie de vivre, offers a way out for those gamers who want to embrace their passion without transforming into grotesque caricatures. Indeed, it is a philosophy that could actually lead to their making common cause with the Sarkeesians and other feminists – anyone who truly loves an art form, after all, should seize on an opportunity to correct institutional and cultural flaws that tarnish it. So long as they remain firmly planted in the basement of modern times, however, the Gamergaters will oppose not only fairness to women, but also the values that will help them most in their own lives.

How being best friends with Siri saved an autistic boy’s life

Published: Daily Dot (October 23, 2014)

Earlier this week, the New York Times ran an editorial by Judith Newman about her autistic child and the bond he has developed with Siri, the iPhone’s voice assistant. After discovering there was “someone who would not just find information on his various obsessions…but actually semi-discuss these subjects tirelessly,” Newman has found that her son Gus now has a “friend,” one who doesn’t judge him when he fails to pick up on interpersonal cues or indulges in esoteric interests.

Indeed, thanks to his extensive use of the knowledge navigator, Gus has improved his enunciation, learned important rules about etiquette, and even become comfortable carrying out prolonged conversations. “It’s not that Gus doesn’t understand Siri’s not human,” Newman explains. “He does—intellectually.”

Is this healthy? The short answer is an unequivocal “Yes,” but with a complicated subtext. As someone who was diagnosed with high-functioning autism (colloquially known as Asperger’s Syndrome) when I was in high school in the late ‘90s, I can say from firsthand experience that the digital revolution has made a world of difference in my own life. If we are to fully realize the potential of these technological advances for autistic youth, however, we must fully understand what causes those benefits in the first place.

Because Gus clearly struggles with a more severe form of autism than my own, I can’t neatly compare what I gleaned from my childhood with his own experiences. That said, I was struck by one major parallel: As I put it in an article I wrote shortly after the Sandy Hook shooter was revealed to have had Asperger’s Syndrome, I spent the better part of my childhood as a voracious bookworm because “there are no unspoken rules that you’re supposed to ‘just know’ when you enter the realm of words and information.” While other kids my age were making friends and starting their earliest relationships, I spent any time I wasn’t studying scouring my school, community, and home libraries.

What Siri seems to do for Gus, books did for me. Not only did they allow me to both flesh out existing intellectual passions and discover new ones but, like Gus Newman, I used the impersonal world of information to develop my social skills. As I delved further into the worlds of history and political science, my preferred disciplines (more on that in a moment), I absorbed fundamental lessons about appropriate professional and personal interactions that my condition had made it excruciatingly challenging to pick up through instinct and first-hand observation.

Similarly, even though books didn’t simulate direct interactivity like Siri, I still developed meaningful relationships with the innumerable men and women whose books passed through my fingers; my “conversations” with these authors, whether internal monologues or verbalized expressions of opinion, were as real to me as I imagine the discussions with Siri are to Gus Newman.

Finally, although I’d never had trouble striking up conversations with strangers (quite to the contrary, I was notorious for spontaneously lecturing my peers on subjects that interested me regardless of whether the enthusiasm was mutual), it was through books that I learned how to have those discussions by talking to instead of at the other party, although I’m still told that I talk in paragraphs instead of conversational sentences.

In short, Siri is performing the same essential service for autistic children today that books did for me, albeit in a more technologically sophisticated manner. Because children with Asperger’s will be inclined to “retreat from what they can’t understand by plunging into what they can,” Siri helps them acquire vital interpersonal skills as they pursue their interests. Indeed, considering that people with Asperger’s often develop specialized intellectual skills because of their autistic traits, it may have the dual benefit of cultivating their unique strengths even as it helps them overcome their social weaknesses.

Of course, there is always the danger that a method of escape will become an end unto itself instead of a means to an end, whether it’s talking to an iPhone or burying one’s self in a book. Fortunately, technology can be a very useful tool in preventing this. The key is for the dynamic between the autistic child and his or her “outlet” to transition from a passive relationship to an active one.

More often than not, this starts with the child learning to stop “absorbing” and start “creating.” Because my chief passions were history and political science, I began writing editorials—first for college newspapers and the occasional larger press but then, starting with my first publication at an online news site in 2012, as a bona fide career. Not only did this allow me to turn a manifestation of my autism into my bread-and-butter, it gave me a sense of power and control over my own life that I’d never known before. While I could have conceivably become a pundit even without the Internet, it undoubtedly facilitated the process enormously.

Fortunately for autistic children, there are very few interests that can’t find similar expression through the infinite venues available in cyberspace. The post-Internet world rewards those with a genuine enthusiasm for and expertise in a given topic by guaranteeing that, with enough persistence, they can find an audience which will share their passion and respect their knowledge, from blogs and message boards to YouTube videos and organizations that will publish their opinions (whether written or recorded).

Once an autistic person has found what Mark Twain once astutely called an individual’s “own true work,” he or she will find that valuable interpersonal relationships follow almost organically. The skills needed to garner respect and success online are very similar to those in the flesh-and-blood world—the ability to network, honor deadlines, say what you mean and mean what you say.

In my experience, the vast majority of people with AS have trouble holding down a job or, at the very least, feeling comfortable in professional environments (myself included). Whereas this challenge can be insurmountable when one is left with no middle ground between direct confrontation and avoiding it entirely, the Internet allows autistic children to engage in real interactions behind the safety of their computer screens and comforted by the familiarity of the enthusiasms that first drew them in.

That said, professional achievement and intellectual fulfillment aren’t the only objectives that matter. “Of all the worries the parent of an autistic child has,” Newman explained, “the uppermost is: Will he find love? Or even companionship?”

Fortunately, studies have already shown that social media helps people with autism develop confidence in a non-threatening environment. In addition, like any lonely individual who gradually discovers kinship with a larger community, people with Asperger’s who come into their own through the opportunities available online frequently wind up with a plethora of close friendships almost by default.

The same is true for romance; although I’ve entered several relationships through dating sites, it’s hardly a coincidence that I met my last two girlfriends when they contacted me after reading one or more of my articles. Just as social media helps autistics by removing the cues and countless unspoken rules that trip them up in the non-digital world, so too can their ability to speak directly to others purely through the prism of their interests help them make deep connections that may not have been possible in the pre-Internet era.

When I reflect on the changes in society’s collective understanding of autism since I was first diagnosed barely a decade-and-a-half ago, it’s hard not to feel optimistic. It’s true that we have a long way to go toward erasing the stigmas associated with autism—not only with the label itself, but with the traits caused by autism that are frequently met with confusion or scorn. At the same time, the digital revolution not only offers valuable tools for personal growth to people with AS; it also provides them with new frontiers of self-expression.

If there is one benefit that our era has to offer autistics which overshadows all the others, it is the fact that it gives us the greatest gift of all—a voice.

5 ways to make sure you’re not that customer the Internet hates

Published: Daily Dot (October 21, 2014)

If there’s one topic food service employees love to discuss, it’s the pros and cons of working in their industry. People seem to hold a different standard for how they treat waiters, fast food clerks, and the wide range of other men and women who serve their meals. This was most recently evidenced by the viral story of a high school football player who thought it was humorous to use his bill as toilet paper in lieu of leaving a tip for his waiter.

This story prompted me to talk with roughly half a dozen food service employees about their experiences. Even if you’re not as boorish as the aforementioned teenager, it is generally a good idea to remember the following list of tips on food service etiquette. You don’t have to be like the folks who left a $100 tip for bad service, but a little courtesy is a good place to start.

1) Always say “Hello!”

“Not saying hello to or ignoring a hello from a waiter is a huge pet peeve,” explained Blipka, a barista at a local coffee shop. “I know you’re in a hurry, but Jesus, take two seconds to acknowledge me as a person.”

There is more supporting Blipka’s position than basic common courtesy. Studies have shown that the simple act of saying hello helps people feel included and, in the case of the ones delivering your food, increases your chances of receiving quality service. It also improves the psychological and physical health of the recipients of your social generosity, which is a big plus for anyone who wants to perform a simple but powerful daily good deed.

2) If your server makes a mistake, be calm and polite about it.

While customer service employees are generally the ones who are taught about the importance of being polite, the customers themselves are often in dire need of that lesson.

“There were a few times when the sandwich was missing or the fries were cold, especially during an overnight shift,” Nate, an employee at a fast food restaurant, told me. “Believe it or not, it isn’t uncommon for people to verbally and even physically attack us over a forgotten burger.”

This almost certainly plays into the modern tendency to view those who serve us food as automatons rather than people. That factor is then exacerbated by the traditional stigma associated with food service work, leading many customers to assume that mistakes are due to incompetence or stupidity. Not only is that not always the case, but even when it is true, it doesn’t mean that taking your day’s frustrations out on your waiter or server will accomplish anything constructive. By keeping your cool, your problem is more likely to be solved—and you’ll feel better, too.

3) Don’t blame the servers for policies established by their bosses.

This may seem like common sense, but every waiter and food service worker I’ve talked to has told me stories of being blamed for their employers’ mistakes. “One time a woman came up to me and asked for two cups of ranch dressing,” recalled Dylan, who works at a local pizzeria. “When I told her that they would cost fifty cents each, she shouted ‘Ugh! That is just terrible customer service.’”

Again, a large element of this can be traced back to the depersonalization of the people who serve us our food. Specifically, there seems to be a prevailing notion that it is acceptable to take one’s life frustrations out on the people who enforce stupid or unfair policies laid down by their higher-ups. If a food service employee is seen as a fellow human being, of course, one might be more inclined to avoid using them as a proverbial punching bag and to recognize that they have no more power over the people who write their paychecks than any other employee. More than any other single theme, however, this is the one that I encountered over and over again during my interviews for this article.

4) Line cutters are the bane of a food server’s existence, too.

Believe it or not, line cutters are as much the bane of a food server’s existence as they are of the people waiting patiently in line. “People jumping in line is one of the most irritating things anyone can do,” explained George, who works at a local restaurant. “It is not only rude to other customers, but can be quite frustrating and confusing to the servers as well.”

As a recent study found, people have an instinctive sense of fairness that comes into play when interacting with waiting lines. Although they’re willing to make exceptions for individuals who are clearly pressed for time or facing some other trying circumstance, in general they become “upset,” “angry,” and even “outright hostile” when they perceive one individual clearly believing he or she is more entitled than the others. If you find yourself facing a line cutter in a restaurant and want to strangle the offender, the chances are your server feels the same way.

5) Acknowledge when they’ve done something to make your experience better.

Like everyone else, food service employees appreciate it when you acknowledge the things they’ve done right. “If there is an issue, you should say something immediately and politely, but then give your server a chance to correct the issue to make your experience [enjoyable],” explained Alice, who has worked as a waitress at several restaurants. “After you’re done, just say thank you. It goes a long way.”

Indeed it does. Studies are unambiguous about the social and economic benefits of simply saying thank you. “Receiving expressions of gratitude makes us feel a heightened sense of self-worth,” explained Francesca Gino of Harvard Business School, “and that in turn triggers other helpful behaviors toward both the person we are helping and other people, too.” Because food service employees are so frequently met with hostility (see: points #2 and #3), it stands to reason that they would be even more susceptible to the kindness of basic gratitude.

None of this is being written to imply that all Americans are as crass and cruel as the teenager who relieved himself on a waiter’s bill. Even if you aren’t as depraved as that young man, however, it’s always valuable to try to improve the way you treat others. Food servers aren’t going anywhere, and they’re people, too: They deserve to be treated as such.


The one thing everyone forgets about Obama’s legacy

Published: Daily Dot (October 16, 2014)

Unlikely though it might seem, First Lady Michelle Obama’s new viral video “Turnip for What” is more than just a silly parody of a hip-hop hit designed to teach Americans about the joys of eating the titular vegetable. Indeed, even as pundits continue to debate Paul Krugman’s controversial Rolling Stone article detailing Barack Obama’s policy successes and arguing that they place him among America’s greatest presidents (a claim with which I completely agree), “Turnip for What” is being dismissed as a humorous novelty.

This may not be surprising, but it doesn’t change the fact that “Turnip for What” highlights an aspect of Obama’s legacy overlooked by the aforementioned editorial—namely, how the Obama administration pioneered new ways of using the Internet (both as a tool for communication and a hub of a distinct “digital age” culture) to achieve its electoral and policy goals.

Although the Internet was commercialized near the end of George H. W. Bush’s presidency (thanks to a little-known legislative initiative by Senator Al Gore of Tennessee), Obama was the first president to use it as more than an extension of “real world” apparatuses for political mobilization (such as those successfully employed by George W. Bush in 2004). When he was accused of being a “celebrity” during the 2008 election, it was a testament to just how completely his campaign had mastered the Internet as a tool for strengthening its candidate’s political brand.

Much has already been written about how Obama’s use of social media sites like Facebook and Twitter was instrumental in converting and mobilizing millennials during his victories over Hillary Clinton, John McCain, and Mitt Romney. Just as important, however, were the numerous ways his supporters used the Internet to control the narrative of his presidential bids.

Major developments in the 2008 and 2012 elections occurred because of the Internet’s potential as a medium for galvanizing support and leaking information, whether it was the Obama campaign making sure his most inspiring public statements became viral hits, will.i.am and other celebrities cementing their candidate’s status as a pop culture icon by creating a music video out of quotes from his now-famous “Yes We Can” speech, or an obscure Florida bartender embarrassing Romney by surreptitiously recording and publishing (through Mother Jones) his notorious “47 percent” remarks.

This brings us to 2014. While the Obama campaign wasn’t responsible for the “Yes We Can” or “47 percent” videos, the White House has since learned to produce viral hits of its own; “Turnip for What” isn’t even the Obamas’ first this year. Back in March, Obama incurred considerable criticism for encouraging young people to sign up for the Affordable Care Act’s insurance exchanges by appearing on Between Two Ferns with Zach Galifianakis. This gambit, which some alleged undermined the dignity of the executive by allowing the president to be openly insulted in a satirical interview sketch, paid off when the novelty of his appearance on a popular online show led to a spike in traffic to Healthcare.gov and helped the administration reach its enrollment goals.

Even if Michelle Obama’s attempt to inspire a healthy eating craze is less successful (it will certainly be harder to objectively quantify), the popularity of her video shows that the White House knows how to control the news cycle by finding innovative ways to use the Internet.

Then again, innovative use of technological advances in political campaigns is an American tradition as old as the republic itself. In the Midst of Perpetual Fetes, historian David Waldstreicher’s acclaimed monograph on political rituals in the early American republic, identifies the 1800 presidential election between Thomas Jefferson and John Adams as an early turning point. “At the turn of the nineteenth century, elections were both annual and seasonal,” Waldstreicher explains, thanks to a vibrant and festive approach to politics that treated local elections as a form of popular entertainment.

In the hotly contested battle over John Adams’ reelection, this celebratory atmosphere trickled up to the national stage for the first time, with Jefferson’s Democratic-Republican Party taking advantage of both the newspaper boom and America’s advanced federal postal system to speak directly to ordinary voters who, during the Federalist-oriented presidency of George Washington and the actual Federalist presidency of John Adams, had often felt they were expected to defer to “elites” in making political decisions. The partisan literature became so vicious that it even prompted President Adams to launch a direct attack against the First Amendment by signing a bill outlawing much of that political literature as “seditious.”

The use of political rallies and partisan newspapers was further refined in the presidential campaigns of Andrew Jackson in 1828 and William Henry Harrison in 1840, both of which used the press to create carefully constructed narratives that depicted their candidates as common men with rustic virtues and their opponents as elitists. Although these characterizations weren’t always accurate (Harrison was a plantation aristocrat while his opponent in 1840, President Martin Van Buren, was a lifelong Jeffersonian who had been born in a tavern), they still brought about the largest increases in voter turnout ever, with the percentages jumping from 27 percent to 58 percent between 1824 and 1828 and from 58 percent to 80 percent between 1836 and 1840.

Similarly, Jackson used newspapers to increase the power of the executive office in 1832 by writing his colorful message vetoing the Second Bank of the United States with a newspaper-literate audience in mind. By the time Samuel Morse famously demonstrated that the telegraph could communicate over long distances in May 1844, Americans weren’t at all surprised that Morse followed his histrionic first telegram (which proclaimed “What Hath God Wrought”) with an inquiry as to the results of that year’s Democratic National Convention (which nominated James K. Polk). Communications technology and politics had been intertwined for so long that Americans could take the relationship for granted.

As newly constructed railroads and transportation infrastructure continued to crisscross the continent during the nineteenth century, Americans would occasionally be treated to the spectacle of a presidential candidate who attempted to win votes by appearing directly before the voters instead of choosing surrogates to stump on their behalf (as had been traditional). Not coincidentally, the few major party nominees willing to do this were the ones whose political careers had been rooted in charisma instead of substance, thus giving them good reason to hope that the “personal touch” might help them—most notably Republican James G. Blaine in 1884 and Democrat William J. Bryan in 1896.

The first sitting president to use the same approach was Theodore Roosevelt, whose larger-than-life persona made him a superstar. Of course, like Blaine and Bryan, Roosevelt’s goal wasn’t only to shake as many hands and kiss as many babies as humanly possible; by inviting journalists to cover everything from his stirring whistle-stop orations articulating progressive ideals to his numerous hunting adventures (including a bear hunt in 1902 that, thanks to embellished reports, inspired the teddy bear), he created a cult of personality grounded in the newspapers’ breathless accounts of his whirlwind adventures, energy, and accessibility.

The invention of radio brought on the next phase in the history of American politicking. Although Warren Harding was the first sitting president to deliver a nationally broadcast speech, it was Franklin Roosevelt’s so-called “fireside chats” that maximized the political potential of that medium. Inheriting a country ravaged by the Great Depression and convinced that its leaders had grown calloused to the suffering of ordinary Americans (especially FDR’s immediate predecessor, the hopelessly dour Herbert Hoover), Roosevelt delivered a series of radio addresses that projected confidence, warmth, and candor, giving Americans an unprecedented sense of familiarity with their president.

In the process, he not only restored the prestige of the office but made it impossible for any future president to hope for political success without establishing a direct connection of their own with the American people.

What Roosevelt did for radio, Dwight Eisenhower, John Kennedy, and Richard Nixon did for television. In his 1952 presidential campaign, Eisenhower hired some of the best ad men on Madison Avenue to create a series of infectiously memorable and upbeat TV spots that are considered today to be the first successful presidential campaign commercials. Meanwhile, his running mate, Richard Nixon, used the medium to defuse a political scandal that nearly cost him his spot on the ticket (through his shrewdly emotional “Checkers speech”).

The next milestone event was the first presidential debate between John Kennedy and Nixon in 1960, in which Kennedy was widely perceived as the “winner” because he looked better on television than the haggard Nixon, even though polls of Americans who had only heard the debate over the radio either found that Nixon had won or declared it a draw.

By the time televised presidential debates were brought back for good in 1976 (the favorites in the 1964, 1968, and 1972 elections, Lyndon Johnson and Nixon, remembered the lessons of 1960 and avoided jeopardizing certain victory with an embarrassing performance), television had more or less finished its work in changing the practice of American politics, from the rise of media handlers to the increasing role of the early primary states in determining each party’s eventual nominee (caused by television’s tendency to award “momentum” to the candidates who either win or do unexpectedly well in the first states).

Like the print press, radio, and television, the Internet has been a game-changer in American politics. It has not only fundamentally altered how the average voter processes information about the larger world, but also forced candidates to develop skills and adopt strategies so that they can succeed in the distinct culture of the digital age, a culture that can be reached with appeals to idealism (such as in Obama’s 2008 campaign), outrage (see the response to the “47 percent” tape), and even humor (see the Obama White House’s humorous videos this year).

Obama has been as much a trailblazer in his use of the Internet as Jefferson and Jackson were in using the print press, or Franklin Roosevelt was in his use of radio, or Dwight Eisenhower and John Kennedy were in their use of television. While it’s too early to foresee exactly how Obama’s successors will avail themselves of the medium, it’s likely that we’ll be seeing plenty of political videos like “Turnip for What” in the foreseeable future.

I’m a real teacher, and I love Rate My Professor

Published: Daily Dot (October 13, 2014)

The website Rate My Professor, which allows users to rank and write reviews of their teachers, is more than just a valuable tool for college students who want to be smart consumers about their classroom experiences. It also embodies the greatest promise of the Internet revolution—its ability not only to disseminate information, but also to encourage people to better understand why quality education is so important in the first place.

When I worked as an assistant T.A. for a seminar on medieval Europe, I was asked to teach The Cheese and the Worms, a classic microhistory by the legendary Carlo Ginzburg that tells the tragic story of Menocchio (real name Domenico Scandella), a 16th-century Italian miller who was tried and eventually executed for heresy by the Inquisition. Since this also happens to be one of my favorite books, I made a vow to myself that no student would leave the classroom without having caught my eager-beaver bug for the material. I even gave out some comical brain food before class (cheese puffs and gummy worms—I will fiercely defend my pedagogical privilege to force puns on my students) to make sure everyone was in good spirits when I started my lecture.

The enthusiastic discussion that ensued, which even continued for a few minutes after the class had officially ended, remains one of the most satisfying memories of my academic life.

Although I have no idea how my students would have commented on this if I’d had a page on Rate My Professor (which I don’t), the basic lesson remains the same: For those entrusted with carrying the torch of education to future generations, there is no substitute for student feedback. While the criticisms can be cruel and unfair (more on that in a moment), they also help instructors see themselves through the eyes of those they need to reach in order to fulfill their fundamental duty as teachers, and, when necessary, to change so they can be better at their jobs. Just as importantly, it allows them to understand why they succeed, providing them with both a precedent they can follow in the future and an inspiring reminder of why they entered the field of education in the first place.

On one level, Rate My Professor can be viewed as a manifestation of the Internet’s growing role as an instrument for empowering consumers. Before, a customer dissatisfied with a certain good or service had to go to the Better Business Bureau, a local newspaper, or (if applicable) a courtroom for redress of their grievances. Today, the Internet gives every consumer a forum to express their views. Whether it’s rating a corporation on Consumer Reports or a vendor on eBay, expressing an opinion on a blog or assigning a star rating to a movie on IMDb, there is no limit to the ways buyers can reward or punish those who ask them to part with their hard-earned dollars. Thanks to cyberspace, caveat emptor (buyer beware) is gradually giving way to caveat venditor (seller beware).

This isn’t to say that there aren’t downsides to this phenomenon, and specifically to Rate My Professor. Although professors are given an opportunity to respond to negative reviews on their pages, many still argue that the comments are off-putting and mean-spirited instead of constructive. There is also the controversy surrounding the inclusion of categories such as “Easiness,” which risks punishing teachers who insist on intellectually challenging their students (including many of the professors who mentored me), and “Attractiveness,” which is frankly downright creepy. Like any site that offers a soap box to any and all, the reviews on Rate My Professor are only as insightful as the students who write them. Groupthink, shallowness, and the distinct brand of cruelty abetted by online anonymity are inevitably rampant.

Ultimately, though, it’s hard to argue that Rate My Professor does more harm than good. For one thing, it holds professors as accountable as members of any other craft or trade. As an article in Assessment and Evaluation in Higher Education put it, “students who post ratings may be regarded as experts who have had significant experience with the professors” and can thus help the best ones earn tenure and promotions. Moreover, the website serves as a potent reminder of the deeper responsibilities inherent in the education profession. There is a reason the maxim often attributed to Thomas Jefferson, that “an educated citizenry is a vital requisite for our survival as a free people,” endures.

For any democratic society to fully realize the potential of its human resources, it needs to pass on the legacies of past generations so that their successors can learn from their mistakes and build on their accomplishments. We need experts in physical sciences like chemistry, physics, and earth sciences so we can better understand and manipulate the world around us; experts in life sciences like biology and medicine so we can improve the length and quality of our lives and maintain a respectful relationship with the ecosystems of which we are a part; experts in social sciences like history, political studies, economics, and law so our government can best protect individual rights and serve the needs of society as a whole; and experts in arts such as literature, music, cinema, theater, and painting so we can fully appreciate the richness of the world we inhabit.

While the Internet has done wonders in spreading opinions and ideas, formal education still plays an indispensable role in a democratic nation. Professors teach those who study at their colleges and universities how to separate sound reasoning and information from nonsense (which is particularly useful given the avalanche of bullshit available online today), provide aspiring scholars with the intellectual training they’ll need to master their fields, and offer society’s brightest and most well-informed minds the institutional credibility to promote their work.

Considering the significant threats facing academics today—from the right-wing’s reflexive anti-intellectualism to the fact that new professors are grossly underpaid—any modern development that strengthens academia, even with harsh criticism, should be embraced. As I learned from my own time in front of the classroom, it makes the triumphs of the job taste that much sweeter.


Navigating with Anxiety

Published: Good Men Project (October 12, 2014)
co-author: Liskula Cohen

Two people, one condition. See how anxiety affected the lives of Liskula Cohen and Matt Rozsa.

There is a chaotic quality to suffering from anxiety. As this article was being developed, it became clear that it’s impossible to effectively capture the experience of clinical anxiety in the same linear fashion used for other medical conditions. In the following piece, I cut between Liskula’s experiences and my own, covering them in rough chronological order (her words are in plain text and mine are italicized). – MR

I thought I was dying a slow, torturous, evil death.

My anxiety condition first appeared right after the crash of TWA Flight 800 on July 17, 1996. I was in Rome waiting for that plane; it was scheduled to stop in Paris before picking me up and flying back to New York City.

I had a good friend on that flight. The minute my feet touched ground in the United States, the anxiety started.

At first I didn’t know what was causing the attacks. My mind would go into hyperdrive and I’d become uncontrollably paranoid. Soon I was spending a fortune on doctors. I felt that I couldn’t be alone, and although my then-husband was mostly helpful, he still wound up enabling my attacks.

Eventually I found solace in my mother. For a few months I lived at home with her, sleeping on the floor beside her bed out of fear of being alone. She took me to see all kinds of specialists; after that, we went to a Chinese herbalist and relaxation classes taught by a 90-year-old blind woman. The herbalist made me an awful tea with licorice and cinnamon, the worst I’d ever tasted, and told me to “go swim in big water” to gain perspective on how small my life was and how beautiful the world can be. The old woman made me an elastic wristband with a cat bell on it and told me to snap it whenever I felt an anxiety attack coming on. All of this worked… the first time.

            If you trace the roots of my anxiety, you’ll probably find that they’re planted in the soil of my childhood – more specifically, the emotional isolation I experienced throughout my formative years. When you have Asperger’s Syndrome, you need to learn from scratch the rudimentary social skills most other people develop instinctively. Until you do this, your esoteric passions and awkward behavior will cause you to be shunned by your peers. You embarrass yourself; you’re annoying to others; no one shares your interests or enjoys your company. It never stops hurting, but eventually you get used to it. Perhaps it’s akin to the characters in “The Shawshank Redemption” who were ‘institutionalized’ – so used to the misery of prison that the uncertainty of freedom is terrifying. After spending most of my childhood as an outcast, I panic whenever I feel accepted… to say nothing of loved. This has happened on and off, with varying degrees of intensity, for my entire adult life.

My second series of anxiety attacks occurred shortly after September 11th.

I was in a horrible relationship at the time. My partner was leading many lives and I learned to live in fear of him. He had me feeling trapped – but this time I sought help from a therapist and got a prescription for Celexa. Unlike my previous anxiety condition, which was induced by the realization that life is fleeting, this one was brought on by depression. Thanks to the medicine and psychological care, I eventually developed the strength to end my toxic relationship.

Even so, I will always live in fear of letting my mind take over like that again. I felt like I was on the brink of schizophrenia, like I had no more control of what my mind and body wanted to do. That said, I know now that I will never let myself spiral down like that again. I know what situations trigger it in me and accept that I can’t be in control of everything. After ten years of being too afraid to visit Europe, I can now fly without serious incident (it’s different with my daughter, but I suppose that’s to be expected).

I doubt I will ever be “normal,” but I have accepted that. What’s important is that I keep it moving.

      More than a decade has passed since my childhood ended. During that time the traits I developed as a high functioning autistic have stopped being an albatross around my neck. Indeed, they’re a boon to my professional life, which has inevitably improved my personal life as well. Yet despite having countless close friends (in person and online) and being a veteran of half a dozen or so relationships, my anxiety is no less severe today than it was years ago. Necessity has required me to internalize it, and so I do. When the internal pressure becomes too much and I find it impossible to work, as occurred very recently, I seek help until I can put a cork on the bottle and return to my routine. It’s fun to work as a writer, a scholar, a politician. There is a perverse glee in turning the source of my inner turmoil into my greatest strength and claim to social status. It doesn’t stop the demons from tormenting me, but every moment in which I’m excelling at the work I was meant to do, I torment the demons right back.

If there is one thing that little old lady taught me that I carry with me today, it’s her special acronym for “fear”:

Forget Everything And Relax

I think about that all the time.

Our hope is that others who recognize their own struggles in these stories will reach out for help – whether from a specialist, a spouse, friends, family, or anyone else. There is nothing to be ashamed of… and no reason why you should have to feel like this.

America’s flip-flop foreign policy: Rand Paul, Hillary Clinton, and the new millennial voter

Published: Salon (October 11, 2014)

Previously, politicians fought to be seen as more hawkish than their opponents. Here’s why those days are over

Senator Rand Paul, ostensibly heir to his father’s reputation as one of America’s go-to libertarians, recently said that as president he would “destroy ISIS militarily.” Given that libertarians often see eye-to-eye with the left in opposing imperialism and the security state, the media gave the story a moderate amount of attention before letting it fade into the static. After all, in the post-Romney era, what’s so new about a probable presidential aspirant flip-flopping when it’s politically expedient?

Pundits would do well to scratch a little deeper here. More specifically, they should pay attention to the backlash Paul is receiving from millennials, the group whose disproportionate support is a reason he is currently among the frontrunners for the 2016 GOP nomination. Hillary Clinton, too, has paid a price for her rhetorical vacillations from hawkishness to dovishness and back again, as partially indicated by the attention given to the prospective presidential candidacy of another prominent anti-interventionist, independent Senator Bernie Sanders (who caucuses with the Democrats). Combined, these developments point to one of the most important emerging stories of the 21st century: National politicians hoping to win over millennials must speak to their concerns about the post-9/11 military-industrial complex and security state.

Before World War II, American foreign policy was theoretically based on the principles embodied in George Washington’s Farewell Address, the public letter he published near the end of his presidency in 1796. It stated that the United States would “steer clear of permanent alliances with any portion of the foreign world” and was “neither seeking nor granting exclusive favors or preferences” in its commercial or geopolitical relationships. Americans should “avoid the necessity of those overgrown military establishments,” he argued, recognizing them as “inauspicious to liberty” and “particularly hostile to republican liberty.” This isolationism was somewhat broadened in 1823 when James Monroe, through the so-called “Monroe Doctrine,” officially reiterated America’s avowed avoidance of international entanglements while simultaneously vowing to oppose any attempts by foreign empires to harass or colonize other nations in the Western Hemisphere. Hence when Woodrow Wilson attempted to bring America into the League of Nations after World War I, he was met with resounding defeat by the nation’s bedrock isolationism.

None of this stopped the United States from starting acquisitive wars when powerful business interests could effectively lobby for them (the Mexican-American War, the Spanish-American War). At the same time, America didn’t become a superpower until the outbreak of World War II. In the same year that he broke the two-term precedent set by Washington, Franklin Roosevelt ran for reelection on an open program of providing financial assistance to the Allies in their struggle against the Nazi Empire – a program also supported by his Republican opponent, Wendell Willkie. While both candidates in the election of 1940 had to remain nominally isolationist to appease large anti-interventionist wings in their parties, the Japanese attack on Pearl Harbor and the German declaration of war against America changed everything. Within months, America had ended 12 years of the Great Depression by building up an unprecedented military-industrial complex.

Psychologically, Americans adjusted to living in a world in which there loomed a genuinely sinister existential threat. First it was fascism and the coalition led by the Third Reich (1941-1945). During the Cold War (1945-1991), it was Communism and the Soviet Union. The few Republicans and Democrats who ran for president while openly opposing the new security state model either failed to get nominated (Republican Robert Taft in 1940, 1948 and 1952) or lost in landslides (Democrats George McGovern in 1972, Walter Mondale in 1984, Michael Dukakis in 1988).

Millennials, on the other hand, have been shaped in a world where the national traumas of World War II and the Cold War are the stuff of history textbooks. When history looks back at the factors that shaped the foreign policy views of the millennial generation, it will identify two definitive experiences, among others.

Most obvious are the September 11th terrorist attacks. The genuine fear and devastation felt by millions of Americans entirely unconnected to the nearly 3,000 victims lost that day cannot be questioned. At the same time, a far smaller percentage of Americans were directly affected by that event than personally felt the effects of World War II and the Cold War, especially the Cold War’s bloodiest “hot” manifestations, the Korean and Vietnam Wars. The existential fear of Islamofascist terrorism has been accompanied by a wariness of the less savory aspects of the American security state, from erosions of civil liberties (the Patriot Act, revelations of domestic spying by the NSA) to violations of human rights (the detentions at Guantanamo Bay, the use of torture on terrorist suspects). Even the militarization of law enforcement at home is tied to this deeper distrust: in the unprecedented number of Americans living behind bars (at 716 per 100,000, we now have the highest incarceration rate in the world), in the draconian response to protesters in Ferguson, and in our increased awareness of systemic injustices like the war on drugs and pervasive racism in policing and the criminal justice system, millennials have a palpable connection to the security state.

Another turning point, of course, was the war in Iraq. As it became increasingly clear that George W. Bush’s channeling of America’s post-9/11 energies into toppling Saddam Hussein was based on a deliberately misleading premise (namely, that Hussein was tied to the 9/11 attacks), many millennials were led during their politically formative years to deeply distrust the integrity of their government’s foreign policymaking apparatus. In addition, much as the Vietnam War did for a generation earlier, the Iraq War provided millions of young voters with their first rallying issue, stimulating their participation in the electoral process. It’s important to remember that when Barack Obama became the first president elected in part due to support from millennials, it was largely because of his longstanding opposition to both that conflict and other Bush-era excesses like the Patriot Act and Guantanamo Bay – both of which had initially been supported by his main opponent in the Democratic primaries and the presumptive frontrunner, Hillary Clinton. It’s hardly a coincidence that the other prominent presidential candidates most associated with pioneering ways to use the Internet for mobilizing millennials were also conspicuously anti-security state – namely Howard Dean in 2004, Barack Obama in 2008, and Ron Paul in 2008 and 2012. Outside of electoral politics, polls consistently show millennials to be more anti-security state and anti-interventionist than previous generations (best chronicled in the survey databases at PollingReport.com).

Just as Rand Paul implicitly acknowledged the lingering influence of hawks in the conservative movement through his recent statements about ISIS, so too has Obama deferred to the weight of internationalist Democrats in his multiple interventions (such as during the 2011 Libyan revolution or in the ongoing crisis with ISIS in Syria and Iraq). Their lack of ideological fealty notwithstanding, however, it is important to remember that both Obama and Paul owed their large national followings among youth to their initial Washingtonian foreign policy beliefs.

It’s true that their stances sprang from very different ideological sources: Obama drew on the anti-imperialism and anti-statism the New Left developed in response to the unpopular Vietnam War (its greatest success before Obama was the nomination of George McGovern in 1972, although his landslide defeat stifled the New Left’s sway in the Democratic Party for the next third of a century), while Paul is a throwback to the paleoconservatism that last peaked among Republicans in the first decade of the Cold War (its last major candidate was Robert Taft, who barely missed the GOP nomination in 1940, 1948 and 1952). Nevertheless, their appeal to millennials was rooted in the collective historical experiences that have gradually shaped our generation’s foreign policy philosophy. Just as World War II and the Cold War taught earlier generations that America needed to be a superpower for both its own good and that of the entire world, so too have 9/11, the war in Iraq and the consequent growth of the security state convinced this one that this apparatus is fundamentally dangerous and un-American.

For now, of course, the stance du jour among leading presidential candidates is to dance between America’s now-established tradition of militarism and the new generation’s inclinations; politicians like Obama on the left and Paul on the right have tried to strike a balance between the two, and have often been charged with flip-flopping in the process. Over time, however, millennials will gradually replace baby boomers as the dominant generational voting bloc. When that happens, the wiser course for future anti-imperialists on both sides will be to confront the post-9/11 military-industrial complex and security state head-on. At that point, America’s flip-flop foreign policy will be an historical footnote.