How Leonard Nimoy made Spock an American Jewish icon

Published: Salon (February 27, 2015)

Nimoy transformed the classic intellectual self-questioning archetype into a dashing “Star Trek” action hero

I suspect I can speak for most American Jews when I say: Before I’d watched even a single episode of “Star Trek,” I knew about Leonard Nimoy.

Although there are plenty of Jews who have achieved fame and esteem in American culture, only a handful have their Jewishness explicitly intertwined with their larger cultural image. Much of the difference has to do with how frequently the celebrities in question allude to their heritage within their body of work. This explains why, for instance, a comedian like Adam Sandler is widely identified as Jewish while Andrew Dice Clay is not, or how pitcher Sandy Koufax became famous as a “Jewish athlete” after skipping Game 1 of the 1965 World Series to observe Yom Kippur, while wide receiver Julian Edelman’s Hebraic heritage has remained more obscure.

With this context in mind, it becomes much easier to understand how Nimoy became an iconic figure in the American Jewish community. Take Nimoy’s explanation of the origin of the famous Vulcan salute, courtesy of a 2000 interview with the Baltimore Sun: “In the [Jewish] blessing, the Kohanim (the Jewish priestly class) makes the gesture with both hands, and it struck me as a very magical and mystical moment. I taught myself how to do it without even knowing what it meant, and later I inserted it into ‘Star Trek.’”

Nimoy’s public celebration of his own Jewishness extends far beyond this literal gesture. He has openly discussed experiencing anti-Semitism in early-20th century Boston, speaking Yiddish to his Ukrainian grandparents, and pursuing an acting career in large part due to his Jewish heritage. “I became an actor, I’m convinced, because I found a home in a play about a Jewish family just like mine,” Nimoy told Abigail Pogrebin in “Stars of David: Prominent Jews Talk About Being Jewish.” “Clifford Odets’s ‘Awake and Sing.’ I was seventeen years old, cast in this local production, with some pretty good amateur and semiprofessional actors, playing this teenage kid in this Jewish family that was so much like mine it was amazing.”

Significantly, Nimoy did not disregard his Jewishness after becoming a star. Even after his depiction of Mr. Spock became famous throughout the world, Nimoy continued to actively participate in Jewish causes, from fighting to preserve the Yiddish language and narrating a documentary about Hasidic Jews to publishing a Kabbalah-inspired book of photography, The Shekhina Project, which explored “the feminine essence of God.” He even called for peace in Israel by drawing on the mythology from “Star Trek,” recalling an episode in which “two men, half black, half white, are the last survivors of their peoples who have been at war with each other for thousands of years, yet the Enterprise crew could find no differences separating these two raging men.” The message, he wisely intuited, was that “assigning blame over all other priorities is self-defeating. Myth can be a snare. The two sides need our help to evade the snare and search for a way to compromise.”

As we pay our respects to Nimoy’s life and legacy, his status as an American Jewish icon is important in two ways. The first, and by far most pressing, is socio-political: As anti-Semitism continues to rise in American colleges and throughout the world at large, it is important to acknowledge beloved cultural figures who not only came from a Jewish background, but who allowed their heritage to influence their work and continued to participate in Jewish causes throughout their lives. When you consider the frequency with which American Jews will either downplay their Jewishness (e.g., Andy Samberg) or primarily use it as grounds for cracking jokes at the expense of Jews (e.g., Matt Stone of “South Park”), Nimoy’s legacy as an outspokenly pro-Jewish Jew is particularly meaningful right now.

In addition to this, however, there is the simple fact that Nimoy presented American Jews with an archetype that was at once fresh and traditional. The trope of the intellectual, self-questioning Jew has been around for as long as there have been Chosen People, and yet Nimoy managed to transmogrify that character into something exotic and adventurous. Nimoy’s Mr. Spock was a creature driven by logic and a thirst for knowledge, yes, but he was also an action hero and idealist when circumstances demanded it. For the countless Jews who, like me, grew up as nerds and social outcasts, it was always inspiring to see a famous Jewish actor play a character who was at once so much like us and yet flung far enough across the universe to allow us temporary escape from our realities. This may not be the most topically relevant of Nimoy’s legacies, but my guess is that it will be his most lasting as long as there are Jewish children who yearn to learn more, whether by turning inward into their own heritage or casting their gaze upon the distant stars.

My Top 3 Pet Peeves (That Are Actually A Big Deal)

Published: Good Men Project (February 26, 2015)

Matthew Rozsa shares the three pet peeves that bug him the most. Someone has to do it, right?


It’s strange writing an article about pet peeves. Normally I try to write about topics that have significant consequence, and a pet peeve by definition is an annoyance that can be easily brushed off.

Except sometimes they can’t. There are certain habits that, though commonly regarded as mere pet peeves, are actually far more consequential. This article is about the three that bug me the most.


1. People who loudly whisper and/or text at a movie theater.

After spending almost a century as a staple of American entertainment, movie theaters have developed a pretty well-known set of etiquette rules. Foremost among them, of course, is to not talk to other members of the audience while the film is playing. Over time that stipulation has had various supplements—don’t talk on your cell phone, don’t talk during the previews, no laser pointers, etc.—but there are two standards whose adoption is long overdue:

First, there should be no texting during a movie screening. This may sound draconian, but I’ve lost count of how many times I’ve been snapped out of the fantasy world being woven in front of me by the flickering glow of a dozen illuminated cell phone screens. Since the majority of theater etiquette rules can be reduced to a single imperative—don’t distract other people from enjoying the movie—it stands to reason that doing so with a visual annoyance isn’t any better than an auditory one (hence the aforementioned prohibition against laser pointers). The same holds true for “whisper-yelling” to other people: If I can hear you, then you’re being too loud, regardless of whether it’s with your “inside voice” or the “whispering equivalent of your inside voice.”


2. Bureaucrats who make their jobs into your problem.

This one requires a bit of elaboration: For the past two semesters in a row, I have had difficulty with student loans. From the moment I began graduate school in the summer of 2010 (to get my master’s at Rutgers-Newark) until the summer of 2014 (when I completed my PhD coursework at Lehigh), I was a full-time student. Starting last autumn, however, I began studying for my comprehensive exams, a series of tests that every prospective recipient of a history doctorate is required to take. Although I was still a full-time student, my credit status became a tad more complicated; naturally, I called my school’s financial aid office to figure out how to make sure this change didn’t damage my student loan situation. When I decided to take a year-long leave of absence for 2015, I did the same.

On both occasions, the proverbial ball was dropped, and without delving into too many details, the bottom line is that it happened each time for the exact same reason: Someone, somewhere along the way, didn’t give me complete information about every step I needed to take in order to be squared away. Consequently, on both occasions, I was forced to spend hours and hours fixing some faceless bureaucrat’s mistake before they destroyed my credit rating or began consuming sizeable chunks from my meager savings.

Before you blast me, I get it: Bureaucrats have dreary, mind-numbing jobs. But when their indifference to their work costs other people money, they should be forced to fix those mistakes on their own free time—and without pay—rather than forcing the cost of their apathy onto other people’s shoulders.


3. People who freeze each other out.

Unlike the other items on this list, this one actually has some valid applications, most of which come into play when you’re dating or leaving a relationship. If someone won’t take no for an answer or is behaving in an inappropriate or aggressive way, then yes, freezing them out is entirely understandable and necessary.

That said, freezing someone out should be the exception to the rule, and one only called into play in extreme circumstances. Unfortunately, I’ve noticed that my generation in particular seems to use the freeze out at the drop of a hat—with friends and family, doctors and co-workers—as a substitute for canceling previous engagements or a tool for avoiding unpleasant confrontations. I suspect the culprit here is a combination of an undue sense of entitlement (the idea that if you don’t want to talk to someone, you should never be obligated to do so) and the more pernicious legacies of the digital age (with so many alternatives to face-to-face interaction, the act of ignoring someone doesn’t seem as rude).

The problem is that when you freeze someone out, you aren’t committing a neutral act; you are directly insulting that individual by deciding that their convenience and/or feelings are less important than your wish to avoid an encounter with them. Even in the dating world, the freeze out should be used very sparingly (see the aforementioned examples), since people are rarely as vulnerable as when they are attempting to find meaningful companionship.


And that is my article on pet peeves. I have to say, I’ve been waiting for a long time to get these three annoyances off my chest, but never found the right hook that would allow me to do so. Until now, that is.

Why old white Republicans keep mansplaining women’s vaginas

Published: Daily Dot (February 25, 2015)

When yet another old white man nationally embarrasses himself with his ignorance about how women’s bodies work, it isn’t enough to hold him up for richly deserved derision, as the Internet is wont to do. We must also learn a very important lesson: namely, that such men’s poor understanding of experiences outside their own frame of reference demonstrates why we need more women in positions of power.

This may seem like a self-evident statement, but as both the controversial question asked by Idaho Rep. Vito Barbieri (R) and the subsequent online brouhaha it caused make clear, it is anything but. As the state legislature continues to preside over hearings on a controversial anti-abortion bill, Barbieri caused a firestorm on the Internet after a physician testifying against the bill told him that some colonoscopy patients swallow a small camera to get a closer look at their intestines. “Can this same procedure then be done in a pregnancy?” Barbieri asked. “Swallowing a camera and helping the doctor determine what the situation is?”

“Prior to asking if women’s vaginas are somehow connected to their digestive tracts, Barbieri was holding himself out as extremely knowledgeable about the practice of medicine,” explains Amanda Marcotte of Slate. Although experts agree that telemedicine abortions are perfectly safe, the goal of Barbieri’s question was to justify banning the practice on medical grounds by arguing that “there are certain examinations and procedures which require personal hands-on exams, and I think this is one of them.”

Hence the problem, as Kate Brown of Jezebel pointed out, is that while men like Barbieri clearly possess a poor understanding of how women’s bodies work, “these exact idiot men get to make medical decisions on behalf of women.”

Tara Culp-Ressler of ThinkProgress summed up the online reaction pretty perfectly when she wrote that “the GOP lawmaker, who sits on the board of a right wing ‘crisis pregnancy center’ that works to dissuade women from choosing abortion, added that ‘this is a proper role of government to protect life,’” even though “research into Iowa’s telemedicine abortion program has found that there’s no difference between women who are prescribed the abortion pill in person versus the women who consult with a doctor remotely. Surveys among the patients who have used the service report low rates of complications and high levels of satisfaction.”

Of course, this isn’t the first time a Republican politician has gotten himself in trouble by failing to comprehend basic human biology. Perhaps the most notorious instances occurred during the infamous “War on Women” from the 2012 elections, when two GOP candidates for the Senate made similarly offensive comments about rape and abortion rights in short succession. It began in August with Todd Akin of Missouri, who insisted that abortion shouldn’t be legalized in cases when the pregnancy was due to rape because “if it’s a legitimate rape, the female body has ways to try to shut the whole thing down.” Richard Mourdock of Indiana joined Akin in the Republican Hall of Shame a couple of months later when he argued that “even when life begins in that horrible situation of rape, that is something that God intended to happen.”

As recently as last week, a South Dakota Republican lawmaker named Isaac Latterell introduced a bill making it illegal for a physician to “knowingly behead a living unborn child with the intent of endangering the life or health of the child,” which is a grossly inaccurate description of dilation and evacuation (D and E) abortion procedures. (The bill’s language also makes you wonder how someone could behead a living creature without knowingly endangering its life.)

In the end, politicians from Akin and Mourdock to Latterell and Barbieri are united in two fundamental ways: They’re all right-wing lawmakers who wish to pass policies controlling what women choose to do with their bodies, and they all display a shocking lack of knowledge about how those bodies actually work.

Of course, this doesn’t mean that there aren’t female right-wingers who are similarly uninformed (see Texas Rep. Jodie Laubenberg’s claim in June 2013 that abortion shouldn’t be allowed in cases of rape because “they have what’s called rape kits where a woman can get cleaned out”). At the same time, there is something distinctly galling about hearing male politicians pontificate about the medical and moral implications of a procedure that they will not only never have to consider having themselves, but which they frequently fail to even comprehend on the most basic conceptual level.

It is impossible to understand why this happens without contextualizing it within the framework of male privilege. In a 2013 report published in Psychological Science, researchers Ashleigh Rosette of Duke University and Leigh Plunkett Tost of the University of Michigan observed that one of the defining characteristics of male privilege is callousness. Rosette and Tost hypothesized that “if you occupy a historically marginalized position on one dimension (being a woman or African-American, for example) and a dominant position on another (e.g., on an executive team), you’ll be more aware of social inequities in that dominant group,” as the Association for Psychological Science reported about the study. “But that awareness, they posited, could be dampened the higher your level of success in the dominant group.”

In other words, the reason so many old white men are capable of making asinine or even bafflingly inaccurate remarks about women’s bodies isn’t simply that they aren’t women but that they don’t feel any strong imperative to empathize with a woman’s point of view. As such, when they approach issues like abortion and contraception coverage with a pre-existing conservative perspective, they view the inevitable discussions of female biology as obstacles to be overcome en route to making their argument. Playing fast and loose with the basic facts on such a subject—as Barbieri tried to do with his attempt at a “gotcha” question at the expense of a qualified doctor, his subsequent denials to the contrary notwithstanding—is easy when you’ve already disregarded the potential relevance of women’s experiences in these matters.

If there is one picture that can best sum up the problem facing old white men today as they attempt to shape the politics of women’s bodies, it is a shot from nearly a dozen years ago. When George W. Bush met with congressional leaders to sign the Partial-Birth Abortion Ban Act of 2003, many pundits noted the irony: The picture is filled with middle-aged and elderly white men, standing side-by-side and grinning ear-to-ear, as they use the strong arm of the law to make a decision that will impact countless women. One doesn’t have to question their good intentions to be disgusted at their obliviousness.

Such is the lesson we’ve learned again this week, courtesy of Vito Barbieri.

What John Oliver doesn’t understand about America’s justice system

Published: Daily Dot (February 24, 2015)

John Oliver’s eloquent rant about corruption in the American judiciary, from this weekend’s edition of “Last Week Tonight,” has been going viral since the show aired Sunday night. “What he revealed was truly disturbing,” wrote Sarah Gray of Salon. Forrest Wickman of Slate echoed this sentiment, observing that “as a British immigrant, John Oliver has often demonstrated a knack for bringing fresh eyes to America’s absurdities.” Kyle Whitmire, a columnist for an Alabama news website, captured the essence of this commentary with his proclamation that Oliver’s video “should be required viewing in pretty much any high school civics class.”

What revelation did Oliver present that could have such a powerful effect? Simply this: that although our judges “occupy an exalted position in American life,” to use Oliver’s own words, many of our jurists act no better than common politicians. This is all well and good on its own, but then Oliver goes on to say that America shouldn’t be electing its judges at all.

That is one step too far.

The first problem with this argument becomes apparent as soon as Oliver presents it: He never proves that judicial elections are responsible for the main problems he identifies. After explaining how Alabama Chief Justice Roy Moore is objectively wrong that the Supreme Court doesn’t have the authority to trump state courts (“that’s basically its job”), Oliver immediately blames Moore’s rise to the bench on the fact that “like 85 percent of state judges in America, he was elected.”

Because this is the last time he focuses on Moore in that segment, Oliver’s assertion that men like Moore—who has promised to block same-sex marriages in Alabama despite a higher court’s mandate—rise to power because of elections is never substantiated. Indeed, upon further scrutiny, it becomes rather confusing: Is he implying that a judge appointed by politicians would be more likely to understand the Constitution than an elected one? Does he believe the electorate is more likely to anoint a judge with anti-gay prejudices than an executive and legislature? Does he think Moore’s election was somehow corrupt?

It is a bad sign that Oliver raises so many unanswered questions before his argument has moved past its first case-in-point, and the pattern of incomplete reasoning continues for the rest of the broadcast. He rattles off wonky statistics (“Thirty-nine states hold elections for judges and America is virtually alone in doing this”), analyzes how campaign fundraising can cause serious conflicts of interest (e.g., judges being more likely to rule in favor of contributors), and showcases a number of egregiously pandering, gimmicky, and/or dishonest campaign commercials. After briefly noting that appointing judges has imperfections of its own, he concludes by telling his viewers that “if we’re going to keep electing judges, we may have to alter our idea of what justice is.”

Again, Oliver’s reasoning is flawed because he doesn’t prove that this type of corruption is inextricably connected to the practice of electing judges. As such, his thesis is weakened by the fact that many infamous judicial scandals throughout American history have involved appointed judges; examples from the last few years include G. Thomas Porteous Jr. of Louisiana, Samuel B. Kent of Texas, and Federal District Judge Jack Camp.

It also overlooks the frequency with which blatant conflicts of interest influence judges as high up as the Supreme Court. Justices Antonin Scalia and Clarence Thomas were heavily criticized for participating in a lavish political retreat hosted by billionaire conservative activist Charles Koch despite his opposition to campaign finance reform, which was under consideration in the landmark Citizens United ruling (Scalia and Thomas voted as Koch would have wanted them to). One year later, Thomas refused to recuse himself when Obamacare reached the nation’s highest bench, even though his wife was a paid lobbyist for interest groups that actively opposed health care reform (Thomas voted to overturn the bill).

Even when Oliver does identify a problem that is clearly linked to judicial elections, his claim still buckles upon closer inspection. Although Oliver is rightly horrified that judges are statistically more likely to vote against a criminal defendant’s appeal during election season, this isn’t entirely dissimilar from how appointed judges will misuse their power by attempting to shape electoral politics. The Supreme Court’s decision to award the 2000 presidential election to the Republican candidate in Bush v. Gore was so transparently partisan that Justice John Paul Stevens wrote in his dissent that “although we may never know with complete certainty the identity of the winner of this year’s Presidential election, the identity of the loser is perfectly clear. It is the Nation’s confidence in the judge as an impartial guardian of the rule of law.”

Later, when Chief Justice John Roberts joined the four court liberals in upholding Obamacare, multiple sources reported that he was motivated not only by his belief that the individual mandate was constitutionally sound (a position on which 19 of 21 constitutional scholars across political affiliations agreed), but also by concern that the conservative judges opposing the law were acting out of partisan rather than jurisprudential considerations.

None of this is meant to imply that Oliver was wrong for being outraged at the corruption and incompetence that run rampant in the American judicial system today. That said, as the above examples demonstrate, there are a number of root causes behind this problem. Foremost among them is the inordinate influence of money in every phase of our political process, including not only the purchasing of influence through election contributions but also through lobbying groups and political action committees.

In addition to that, there is the simple fact that judges appointed by politicians are just as likely to allow political considerations to unduly influence their decisions as those who have reelection campaigns to worry about. While the immediate incentives differ somewhat between a judge merely aiming to get reelected and one who is angling to satisfy his or her partisan supporters, both pervert the notion of a blind and fair justice with their rulings.

The most important policy prescription for addressing this is comprehensive campaign finance reform. To achieve it, we must make it illegal for lawyers to donate to a judge’s campaign, while also barring judges from ruling on cases in which election fundraising considerations might cause a conflict of interest. Over time, however, Americans should also transition to publicly financed political campaigns for judges, so that a level playing field removes the disproportionate power of big money over the candidates. Although these steps won’t entirely eliminate partisanship and other unfair bias among judges, they would certainly go a long way toward helping out.

Of course, the law can only do so much to address this type of crisis. The other part of the solution must be cultural rather than legal or political, and to its enormous credit, Oliver’s segment had the right idea on that front by raising awareness about the need for a higher standard of integrity among our judges. By blaming judicial corruption solely on elections, however, Oliver misdiagnosed the illness even as he performed a public service by identifying many of its symptoms.

February 2015 Has Been A Historic Month for Science

Published: Pixable (February 24, 2015)

This has been a historic month for science. Even if all scientific progress crawls to a stop this week, the knowledge we’ve already acquired since February 1, 2015 is likely to have a direct impact on all of our lives. Among the lessons we’ve learned:

1. The theory of the Big Bang may be false

(Source: Flickr/nasamarshall)


In a paper published by Physics Letters B titled “Cosmology from quantum potential,” researchers Ahmed Farag Ali in Egypt and Saurya Das in Canada propose that the universe is filled with a quantum fluid that expands space by exerting a slight but constant force on all matter. If correct, their argument could disprove the Big Bang theory, which has long been criticized due to its equations being unable to take scientists back to the universe’s origin point – one that, if the Big Bang theory is to be believed, would be an infinitely small and dense singularity. The Ali-Das model not only fixes this by positing a universe that has always existed, but even explains physical phenomena like gravity and the theory of relativity.

2. Global warming is causing a megadrought on the verge of literally transforming America.

(Source: NASA Goddard)

As President Obama pushes for stricter regulations to curb global warming, Americans are being offered an ominous glimpse of what will happen if we fail to act. First, researchers from NASA joined experts at Cornell University and Columbia University to warn against an impending “megadrought” that will grip the Southwestern and Plains states by 2050 if climate change continues at its current pace. According to the report, shifts in weather patterns will lead to major water shortages, threatening America’s agricultural industry and increasing the likelihood of wildfires throughout the country.

Then, a panel of climatologists submitted a report projecting that rising temperatures and sea levels will bury large sections of New York City under water by the middle of the century, turning the Big Apple into another Venice.

(Source: Flickr/mtaphotos)

3. Losing weight isn’t as simple as exercising and dieting.


(Source: Flickr/Sirwiseowl)

A massive study led by I-Min Lee of the Harvard School of Public Health has found that eating less and exercising may not be enough to cure obesity. Numerous biological mechanisms encourage people to eat more when they’re trying to diet, the paper argued, from the body slowing the rate at which it burns calories during exercise to the brain making high calorie foods seem more appealing. Since 80 to 95 percent of obese people regain the weight they lose solely through diet and exercise, the study concluded doctors should prescribe obesity medications and/or bariatric surgery in conjunction with diet and exercise for patients who want to permanently lose large amounts of weight. Similarly, international genetics researchers have discovered more than 90 gene regions that explain why some people are more likely to gain weight than others, which can be used to further improve weight loss treatment in the future.

As we enter the final ten months of 2015, our best hope for the future is to learn from the fruits of modern scientific progress. Whether we’re discussing the shapes of our bodies, the landscape of our continent, or the origin of the universe, the stakes clearly couldn’t be higher.

Sean Penn’s Big Mistake and What We Should Learn From It

Published: Good Men Project (February 24, 2015)

Sean Penn forgot that behaviors which may seem appropriate in private are often inappropriate in public.


For those of you who haven’t heard, two-time Academy Award-winning actor Sean Penn is under fire for a racist joke he made during the Oscars on Sunday. As he presented “Birdman” director Alejandro Gonzalez Inarritu with the award for Best Picture, he decided to crack wise at the expense of his long-time friend with this remark:

“Who gave this son of a bitch his green card?”

As many pundits have already observed, Penn’s quip was absolutely inappropriate. Not only did it draw attention to Inarritu’s heritage even though his background has nothing to do with the subject of his film, but it did so by relying on a stereotype about Mexican-Americans (i.e., that they’re not legitimate Americans) at a time when racial discrimination against them is often rationalized with anti-immigration rhetoric and Hispanics are grossly underrepresented in Hollywood.


That said, while all of these points are relevant, there is another dimension to this story, which revealed itself when Inarritu defended Penn on the grounds that the two of them like to tease each other (more on that in a moment). While this explanation may exonerate Penn from the charge of racism, it unintentionally draws attention to a much deeper problem. As the era of mass media and the Internet continues to blur the lines between our public identities and our private selves, it is becoming increasingly difficult for us to realize where one realm ends and the other begins.

Every human being reveals different sides of their personalities depending upon the social context. How we behave around our families is different from how we behave around close friends, which is in turn different from how we behave around casual friends, which in turn differs from how we’d behave around employers or police officers or strangers in the street …

You get the picture. It is doubtful I’m going to run into anyone who disagrees with me on this point. I’m even confident enough in the soundness of my next observation—namely, that these numerous interpersonal relationships can be broadly classified into two categories, those pertaining to our public image and those that we reserve for those with whom we are close—that I’ll also pass over it as a given. This leaves me with the final and most important point:

Sean Penn forgot that friends who “get you” aren’t the same as members of the public who don’t.


Public figures make this mistake all the time. In the last few years, celebrities from Justin Bieber to Donald Sterling have been caught in embarrassing moments of casual racism—an un-PC joke, a bigoted rant—precisely because the private face they only showed to friends was somehow leaked to the public. Of course, because he actually uttered his offensive comment in a setting that he knew would be consumed by the public, Penn’s jocularity is more on par with Brett Ratner (director of the “Rush Hour” trilogy) saying rehearsals are for “fags” during an interview. On both occasions the celebrity allowed his private side to slip out in a public setting.

Does that mean that Penn is a racist? Considering the man’s extensive humanitarian work and outspoken progressivism (which studies have shown make people less inclined to hold racist opinions), he certainly deserves the benefit of the doubt. At the same time, it isn’t enough for Penn to point out that his joke was directed at a friend from the minority group he was insulting, not only because the “I have ___ friends” argument is offensive in its own right, but because it misses the point about the faux pas he committed. Even if Penn isn’t a genuine racist, he still made a vulgar and insulting comment in front of a much larger audience than the single person of Alejandro Gonzalez Inarritu. If he wants to make sure his public identity is not associated with casual racism, he needs to be humble and admit that he was wrong.

There is a beauty to the simplicity of the lesson in this story. It comes down to something we were taught as young children: When you do something wrong, just apologize. No matter how much technology changes society, certain fundamental lessons will always be relevant, and this is one of them. If we can sincerely admit when we’re wrong, we can fix just about any problem, whether in the world at large or in our own souls.

The biggest problem with virtual reality is us

Published: Daily Dot (February 23, 2015)

Are we approaching the dawn of the virtual reality age?

“You’re looking at the future,” declared Chris Cox, the chief product officer at Oculus VR, in reference to his company’s virtual reality headset being purchased by Facebook, “and it’s going to be awesome.” Recent reports indicate that the company is developing its own virtual reality version of the service, with the ostensible goal of “[bringing] social media to the world of VR” (at least according to Tech Times). But as virtual reality seems closer than ever before to mainstream viability, are we dreaming too big?

For example, take Monica Kim waxing poetic in a recent article for the Atlantic about how “one could have breakfast at the Louvre beside the Winged Victory of Samothrace, followed by a lunchtime spelunk through Thailand’s water caves” before noting, as virtual reality prognosticators are wont to do, that “if virtual reality becomes a part of people’s day-to-day lives, more and more people may prefer to spend a majority of their time in virtual spaces.”

While I may personally hope Cox and Kim are right, it is important to maintain an attitude of healthy skepticism toward optimistic predictions about virtual reality. Like flying cars and casual space travel, virtual reality is one of those technological advances that has long been promised but never winds up being brought to full fruition. If Facebook actually succeeds in making virtual reality a part of the average consumer’s life, it will have bucked a trend that has been in place for at least two decades. Of course, this is a big “if,” and as we attempt to anticipate the major technological advances to come, it’s a good idea to avoid jumping the proverbial gun.

Part of the problem with virtual reality is that the actual experience has never been as appealing in practice as it seemed it would be in theory. Kyle Fowle detailed the reality of experiencing VR in Kill Screen: “Stepping into the pod, though, shattered any illusions. … Three minutes later you stumbled out of the pod, an attendant wearily guiding your now vertigo-ravaged body away from the machine, praying that your vomit would subdue itself until you rounded the corner to Cinema #8.”

The problem was, as Matthew Schnipper writes in the Verge, that “though a true believer could immerse him or herself in the roughly built digital landscape, the chasm between that crude digital experience and the powerful subtlety of real life was too great.” In short, “the vision simply did not match the means.”

From a sociological perspective, this can be attributed to a phenomenon profiled in the Gartner Hype Curve. “The Gartner Hype Curve communicates that there are ‘technology triggers’ that lead to new opportunities and possibilities,” writes Dr. Mark Bolas of the Institute for Creative Technologies. As creative people in the sciences and arts imagine the possible manifestations of a particularly promising technology, hopes are inflamed beyond what existing scientific capabilities can deliver.

“This all leads to growing anticipation resulting in a climb to the ‘peak of inflated expectations,’” Bolas continues. “This was VR in the 1990s. When issues like technology limitations, costs, and other issues begin to deflate those inflated expectations, there is a slide down to the ‘trough of disillusionment.’ This is where public sentiment has been towards VR for the past several years.”

This doesn’t mean that we can’t make any predictions about the future use of virtual reality. It simply means that we need to be more modest and realistic when it comes to those projections about which we’re especially certain. According to “Killer Apps in the Gigabit Age,” a new report published by the Pew Research Center and Elon University, the most interesting potential uses for virtual reality would occur in health and education.

“Business and economics professor Ed Lyell suggested that teachers will become less talking-head experts, and more ‘teacher-coach’ figures for their students,” writes Stuart Dredge of the Guardian. “Healthcare is a major topic for the report, with its predictions of ‘continuous health monitoring’ through a variety of sensors and personal devices.”

There are two additional lessons to be gleaned from our spotty history with virtual reality. The first is that it’s important to factor human nature into our understanding of a technology and the predictions we make about it. In the case of virtual reality, a common mistake seems to be assuming that people will automatically find something appealing about an entertainment technology that offers total visual and auditory immersion.

While there are plenty of men and women who use their entertainment options as a means of escaping from their real-world lives, the virtual reality experience can be uncomfortable and distracting rather than an actual, viable alternative to, well, reality. Of course, this doesn’t preclude the use of VR for practical reasons, like supplementing classroom lesson plans or providing patients with more comprehensive diagnoses, but it is more of a leap to assume people will choose such a disconcerting sensory experience for leisure activities.

Similarly, we need to remember that futurists are notoriously hit-and-miss when it comes to the accuracy of their forecasts. For instance, when a 2012 Forbes article analyzed a dozen of the major predictions made by popular science writer Ray Kurzweil, it found that he was flat-out wrong two out of three times. From predicting that the majority of text would be created using continuous speech recognition and that “human musicians [will] routinely jam with cybernetic musicians” to asserting that the tech bubble of the ’90s (which fueled the unprecedented prosperity of that decade) would last indefinitely, Kurzweil erred with embarrassing frequency.

When he erred, most of his mistakes were the same as those made by people who have been overly optimistic about VR: He went too far in guessing how much people would accept technology as a substitute for direct human activity.

While it will probably prove unpopular to be a wet blanket on the burning dreams of the VR enthusiasts, it is necessary to remain cautious even in our optimism. Advances in digital technology have so revolutionized our lives that it’s easy to get caught up in a wave of irrational mass excitement over what is perceived as the next big thing. Instead of succumbing to these impulses, we should recall an old cliche that seems oddly appropriate for this topic: Believe it when you see it.

When They Laughed at Rape…

Published: Good Men Project (February 19, 2015), Salon (February 22, 2015)

Matthew Rozsa discusses how an off-hand observation during a speech illustrated an important point about rape culture.


Every so often, a single experience will instantaneously illuminate an aspect of the human condition to a person who had previously failed to viscerally grasp it. Such was the case for me earlier this month, when I joined Professor James Peterson in a round-table discussion with Lehigh University students and faculty about how to get published as an op-ed columnist.


When one of the attendees asked us to elaborate on the type of negative feedback one can anticipate upon being published, I remarked that it often depends on the demographic profile of the author in question. “Figure out what aspects of your racial, religious, or sexual identity are most likely to be attacked, and then assume that a bunch of trolls are going to make the most vulgar comments they can think of based on them,” I pointed out.

“Can you give a specific example?” asked one student.

After taking a moment to think about it, I replied, “Well, as those of you who followed Gamergate last year have already seen, female writers frequently receive rape threats. Many of the women I personally know in the industry get rape threats all the time, even when they’re not writing about video games or gender-related topics. As a man who has never gotten a rape threat …”

Before I could finish that thought, a chuckle rippled throughout the room. Although I hadn’t intended to say anything humorous, I had inadvertently caught a glimpse of a deep gender bias in our culture, one that equally disadvantages men and women. To wit:

—For women: Although I am certain the audience members did not intend for their mildly jocular response to be interpreted this way, the reality is that they laughed because our society has normalized certain horrifying assumptions about women being raped. When the term “rape culture” is used, it is because of occasions like this, wherein the concept of a woman being raped is easy to understand but the notion of a man having the same experience is literally scoffed at. It also goes a long way toward explaining why rape threats are such a common default response for Internet commenters—even though most polite society distances itself from such rhetoric, on a primal level we still haven’t shaken the inner bigot that deems them acceptable.

—For men: According to a 2008 study, over 200,000 prison inmates had been victimized by sexual assault during their time behind bars; by comparison, there were over 90,000 reported cases of rape outside of prison that same year. Considering that the majority of those prison rape victims were men, it stands to reason that rape is hardly an exclusively female problem in our society. Acknowledging this reality does not diminish the problem of rape culture or culturally ingrained misogyny, but instead isolates rape as an act of barbarism that should never be normalized, regardless of the gender of its victim.

When I pointed out to the audience that their laughter reflected the pervasiveness of rape culture, most of them nodded in agreement. After all, they, like me, were leftist academics, and as such generally sympathetic to the tenets of Third Wave feminism.


Indeed, this incident reinforced why it is so important for the West to have a robust feminist movement at this point in our history. For all of the progress that has been made in women’s rights over the past century, there are still plenty of embedded cultural assumptions that teach us dangerous, even dehumanizing lessons about the kinds of experiences that should be considered normal for women versus those anticipated for men. When feminists speak of a patriarchy, this is why. While I’ve always understood that intellectually, for a brief moment the harsh reality stared me right in the face.