Gawker Agrees: The End of Reality Postscript

April 6th, 2010

Continuing the trend of major publications echoing the ideas discussed here, Gawker’s Brian Moylan last Thursday traced the same narrative in reality TV that I described in my last post: from artless, captivating beginnings to streamlined, artificial ends. His recapitulation (which doesn’t appeal to the concept of Weberian rationalization) was conducted, though, in support of a pointed overarching argument: MTV Must Cancel The Real World.

In his polemic against the documentary series, which just concluded its 23rd season, Moylan picked up on the adverse effects reality TV’s rationalization has had on its ability to say something authentic about human experience:

I remember the excitement, the magazine covers, and the buzz surrounding the original sociological experiment. This was the first time a bunch of strangers had been thrown together and the results taped. They fought, they loved, they hooked up, they went on vacation. It was just like the program is now (minus the vacation, which D.C. skimped out on) except it seemed that the people had real lives.

Sure, we never heard much from Heather B’s rap career or Andre’s band Reigndance after the show, but these people seemed less like characters or types and more like actual people. They were ambitious twentysomethings already involved in finding their way in their chosen field. They also had some sort of life in the city where it was being filmed, so outside friends and interests filtered onto the show, much in the same way that sharing a house with a bunch of roommates really does. Over time, the characters calcified into “types”—the angry black man, the gay one, the slut, the conservative, the sheltered zealot—and people were cast less as individuals than as stock characters who would create conflict.

The serious sociological aspect of the show quickly started to diminish after the San Francisco season, perhaps the show’s most poignant and famous thanks to the death of AIDS activist Pedro Zamora and the ouster of his nemesis Puck, who was so nasty the roommates kicked him out of the house. Remember on that season that Pam was in med school while it was being filmed? That was some serious stuff. Now we’re lucky if one of the kids works one day a week at something other than exhibitionism and self-promotion. In later seasons, the show started giving the cast projects, like starting a business or working a job, to give the show some cohesion, but even those shortly fell by the wayside.

What do we get now? The people on the show don’t seem to be actually doing anything outside of the house. They have silly internships that don’t involve much work and seem more like pre-arranged camera dates than documented work experience. Either that or they have little hobbies that the producers try to blow up into a huge thing. Callie is a photographer! Andrew is an artist! Emily is a (really bad) poet! Erika the quitter and Josh are musicians! Ashley is…well, just whiny!

No, they are practically forbidden to do anything outside other than get drunk, go to the gym, party, and hook up with people. Otherwise, they are trapped within the confines of their messy, faux Ikea domicile to claw each other’s eyes out, sob on the phone, and have petty squabbles and heavy petting. Thanks to the rule-breaking Las Vegas season, which was the start of The Real World’s descent into trash for trash’s sake, there is only a thin patina of social relevance to the entire enterprise. Ironically, it is that earnestness that makes it seem stodgy and outdated.

From casts of participants whose interpersonal interactions were authentic and dramatic, we have “types” cast only for their predisposition to engage in explosive and pathetic behavior. From a show structured to document real life, we have contrivances designed to maximize conflict.

But Moylan blames audiences as much as producers and participants for wanting to be fed only the fat of the reality animal, arguing that it is our hunger for disembodied discord that spurred the rationalization of reality. We have forced the devolution of documentary, from representations of real experience to manipulated, empty interactions between easy-to-cast types. And, he claims, it was the early seasons of The Real World that served to whet our appetites:

Thanks to The Real World itself, we have catapulted ourselves headfirst into the reality television black hole. Seven or eight strangers followed by cameras is no longer a novelty now that every two-bit celebrity will mug for the camera and countless shows pit strangers against each other in much more extreme and exotic locations. The audience no longer demands low brow entertainment disguised as high brow documentary. We want to wallow in the muck. Give us the Kardashians. Give us Tinsley Mortimer and her fake racist socialites. Give us the Bad Girls Club. Shockingly, MTV mastered this art form quickly with Jersey Shore, the crown jewel of the reality treasure chest. If you’re going to lock a bunch of people in a house and make them drink, fight, and fuck their way to fame and salvation, then this is the way to do it. No Real World cast ever will be able to top Snooki, The Situation, and crew in unabashed trashiness. With its continued innovation, MTV made their old innovation obsolete.

He and I agree that shows like Keeping Up with the Kardashians and Bad Girls Club are the product of rationalized reality — the food of early reality processed to extract the basest bits — but I can’t get on board with the assertion that Jersey Shore is this trend’s apotheosis (or nadir, depending on how you look at it).

No Real World season to come can top Snooki and The Situation, but not because the latter are more trashy. Unlike the current seasons of The Real World, Jersey Shore and its participants are compelling because they are untrained and artless. As fake as they are, we watch them because they are real.

Max Weber and The Rationalization of Reality

March 30th, 2010

The End of Reality: Part II.

This is what happens in the blogosphere: I vow to post more persistently, then go five months without an update. No more vows — just a futile hope that I can muster the energy and wherewithal to actually record my thoughts for you, my faithful(?) audience.

So where was I?


Yes, even reality shows can jump the shark, because even reality shows can have artistic integrity and grounding assertions. In the case of Project Runway, it had continually cast itself as the high-brow reality show (embracing the implied contradiction), insisting it is meritocratic even within its convoluted constraints.

During its last season, Heidi went so far as to verbalize its internal logic: “three strikes, and you’re out.” But in Christopher’s survival past three egregiously heinous strikes, the foundational arguments of the show were thrown over and the series — or, at least, the season (for each new batch of contestants provides its own potentially-redemptive slate-wiping) — jumped the shark.

Where does this leave us? With the realization that we are nearing, at, or just past a critical inflection point in the genre.


It’s taken for granted these days that “reality shows” no longer represent anything “real.” Shows that, at their launch, trained their cameras on non-camera-trained individuals in unfamiliar settings and constructs (The Real World, The Bachelor, Survivor, American Idol) have become repetitive and clichéd. New reality shows have eschewed the goals of their antecedents entirely, uninterested in gleaning insight about real people in microcosm (The Hills, the entire VH1 reality line-up).

When Court TV distanced itself from trial coverage, moving towards documentary shows about true crime and dangerous jobs, it renamed itself “TruTV” and worked our disenchantment with reality TV right into its motto: “Not Reality. Actuality.” “Reality” as a TV genre has become meaningless, a codeword for nothing more than non-fiction (not necessarily unscripted) starring individuals playing themselves, or versions of themselves (not necessarily non-actors).

But the meaninglessness of “Reality” and the inescapable cliché of contemporary reality shows are merely symptoms of culture — they are not the Ding an sich (the thing in itself).

Of what are they symptoms? The Rationalization of Reality.


Father of sociology Max Weber described “rationalization” as the unavoidable progression of systems (both physical systems and systems of thought) from inefficient abstraction to cold logic that occurs as we gain better understanding of means and ends, cause and effect, and adapt accordingly. It’s a bit of a difficult concept to understand, and I’m doing it no favors with my abstruse attempts at definition. Examples are the best way to get at it — metonymically.

Bureaucratization is a great example: From early governments and companies that deal with issues ad hoc, with messy delegating and overlapping domains, we develop bureaucracies, with clearly delineated institutions and internal hierarchies for each carefully differentiated issue. So we get the Deputy Assistant Secretary for Canada, Mexico and NAFTA Issues in the State Department’s Bureau of Western Hemisphere Affairs.

Health has also been extremely rationalized over the last few centuries: from a vague understanding of illness tied into conceptions of sin and virtue, we’ve developed keen observations of patterns of sickness and of the world on a microscopic level — we now understand how germs are disseminated, and we’ve developed highly organized systems of treatments for every conceivable array of symptoms.

Even something as simple as our usage of a park can become rationalized. From an open field, we develop well-trod paths where the most people have found the most amenable routes. From free and spontaneous play all around, we designate an area for picnics and an area for baseball. From inconsistent self-policing, we develop rules and guidelines and post them on big green signs forbidding cell phone usage from 11am to 4pm.

While rationalization makes these systems and our lives more efficient, we become constrained by the rigidity of the structures we’ve made for ourselves. We become, as Weber wrote a bit melodramatically, trapped in an “iron cage” and our world devolves into a “polar night of icy darkness.”

But we can see where Weber’s coming from. In a hyper-rationalized landscape of, for instance, mental health, every possible deviation from “normalcy” becomes its own syndrome. As Louis Menand recently wrote in the New Yorker (paraphrasing David Healy in “The Antidepressant Era”), “if a drug (in this case, Paxil) proves to change something in patients (shyness), then that something becomes a disorder to be treated (social anxiety). The discovery of the remedy creates the disease.” As we hone in on ever tauter relationships between causes and effects, we can become blinded to the bigger picture.


From a brief survey of reality programming over the last decade, we can clearly see the bigger picture of authenticity being lost as shows become rationalized to milk drama from ultimately inauthentic characters and conventions. But before we can perform that survey, we have to understand the shows and the goals of their subjects and producers.

To generalize, there are two main sub-genres of reality TV: the documentary series (The Real World, The Hills, Jersey Shore) and the game show (Survivor, Project Runway, The Bachelor). Though the lines are blurry — there’s not much fundamental difference between I Love New York and New York Goes to Work — there is an essential distinction. While contestants on game shows are competing for a prize (be it a million dollars or the love of an over-the-hill 80s hip hop artist), with individuals often voted off each week, the subjects of documentary series need only exist within the contrived situations mapped out for them (New York works at a farm! Eight strangers stop being polite and start getting real!).

The goal of early contestants on game shows was to win. Now, contestants want to win, but they also hope to gain some moderate level of fame and future opportunity through participation. Tabatha Coffey parlayed her appearance on the reality game show Shear Genius into her own reality documentary series, Tabatha’s Salon Takeover; Big Brother’s Jeff and Jordan won $500,000 and $25,000, respectively, and won enough of America’s affection to land them on The Amazing Race; myriad former contestants on American Idol, America’s Next Top Model, and Project Runway have leveraged their fifteen minutes of fame into much longer periods of moderate success in their chosen fields.

The goal of the subjects of reality’s documentary series was, at one point, simply to participate (think the early Real Worlders). Now, it seems their goals are primarily focused toward the attention they can earn by being interesting “characters” on their shows.

In both cases, the goal of the shows’ producers is viewership, achieved by making their programs interesting. Interesting can take many forms — cloyingly romantic (The Bachelor), cringingly pathetic (Celebrity Rehab), explosively charged (The Bad Girls Club) — but, in all cases, producers hope that their programs’ drama will translate into throngs of dedicated viewers.

Understanding the goals of the constituent individuals, we can see how reality TV becomes rationalized: participants and producers better understand the means and ends of achieving success, however defined (a million dollars, future opportunities, high ratings), and act accordingly.


Let’s consider game shows first. Like people walking in a well-trod park looking for the best routes, early participants in reality contests found themselves more or less successful depending on different strategies of behavior, leading to the carving out of conventional types. From the complete blank slate of the first season of Survivor — in which Sue Hawk and Rudy Boesch had no touchstone against which to judge Richard Hatch, no model for suggesting they should suspect his scheming and double-dealing — there is now the season of “Heroes” and “Villains,” with contestants from past seasons so neatly fitting into the types pioneered by their reality forebears that the subtextual “types” have become the text itself.

Every kind of game show — from talent to matchmaking to social experiment — has gone through enough iterations to develop these same conventions, these same paths through the park, and now contestants cannot help but retread the same steps. Reality game shows now have such clearly articulated narratives of success and failure that contemporary seasons cannot feel like anything more than variations on a theme.

And what about documentary shows? At the beginning, producers plumbed drama from the conflict between individuals from disparate backgrounds in contrived social situations. Untrained and unfamiliar with what patterns of behavior would lead to post-participatory fame — and unfamiliar with the notion that participation could lead to fame at all — the individuals on whom the cameras were focused acted authentically, and to the fascination of viewing audiences. But once producers noticed which moments were most likely to draw the camera’s and audience’s attention — fights, sex, sloppy drunkenness leading to fights and sex — they began casting participants most likely to slap each other, sleep with each other, and drink to excess. The first season of the Real World becomes every subsequent season, with the frat-boy jock, the Mormon, the gay guy, the alcoholic — characters who were at one time simply compelling real people — cast to foster the contrived drama the producers think will attract audiences and that now-savvy participants think will attract future job opportunities.

In some cases, like The Hills, the producers have gone so far as to hire writers to ensure that each episode has the drama that unscripted reality shows cannot guarantee will arise on a regular schedule. Whole shows like Celebrity Rehab are built around premises designed for maximum pathos with little regard for documenting relatable human experience. Reality documentary shows are so manipulated to foster the drama that authenticity once provided that they have become scripted echoes of their true-to-life ancestors.

The problem with this rationalization is that any value reality TV once had as a genre inhered in its representation of authentic human experience. Settings like Survivor’s deserted island or American Idol’s big stage or the Real World apartment were contrived, but there was no behavioral model to follow for the early participants — no conventions of “successful” participation. Their behaviors and conflicts were thus authentic and engaging: Pedro on The Real World, Richard Hatch on Survivor, Jay McCarroll on Project Runway, and Omarosa on The Apprentice were compelling because they had not yet learned they were performing.


But there is hope for the genre; or, there was at least a glimmer of hope during the fall of 2009, when MTV assembled a group of youngsters who wanted no more than to participate in the opportunity provided: a summer at the Jersey Shore.

What made Jersey Shore so compelling to viewers was that it was authentic in a way reality TV hasn’t been in years. Snooki, Sammi, JWow, The Situation, Ronnie, Pauly D, and Vinny were not there to perform — and, indeed, they seemed unaware of the promise of recognition and fame (unaware even of the cameras) until after the show had begun airing. They were there only for a swank house on the Shore and like-minded guidos and guidettes with whom to party. Indeed, Angelina’s early departure is evidence of the fact that her goal was not to be the focus of a reality camera; when she dragged her trash-bag of belongings into the house, one sensed she was there for no more than a good time. When she failed to have that good time, she left.

Though some of the conflicts on the show may have been prompted by the producers (one can’t believe that Vinny really seduced the girlfriend of his boss and landlord unwittingly), the interactions between the characters — and between them and the other people at the shore — were strikingly, unsettlingly realistic.

The phenomenon was fostered by the guido/guidette-framed nature of the grouping. Unlike The Real World, whose social experiment was once premised on people from diverse backgrounds coming into conflict, Jersey Shore had no such pretensions of diversity — a shallowness that in fact bolstered its representational success. When one goes from a community of like-minded people to a setting in which one is a minority (think The Mormon on The Real World), one must be as much a representative of one’s group as a normal version of oneself. Snooki and her kin did not need to be “the guido” in an unsympathetic group — they needed only be themselves.

But the magical moment of Jersey Shore season 1 is not replicable. Copycat shows (the as-yet-unnamed-Brighton-Beach-based spinoff, Jerseylicious) now have an implicit script to follow, characters to cast. Even the cast of Jersey Shore season 1 will be camera-trained and ratings-minded when they shoot season 2 this summer.

Still, there is a lesson here: Rather than manipulating reality shows to wring compelling television out of known-to-be-dramatic characters and conventions, we must find the last batch of people who are not yet characters and the last batch of contrivances that are not yet conventions. Any show with an existing script for success and drama, a script written by the last decade of the genre, will be fated to staleness. Only by a renewed commitment to authenticity can we break out of the “iron cage” of rationalization — only with a jettisoning of characters and conventions can reality TV be real again.

This Isn’t Funny Anymore. Or, The Night Project Runway Jumped The Shark.

October 26th, 2009

The End of Reality: Part I.

On Thursday, October 22, 2009, at 9:57 PM, Project Runway jumped the shark.

I know ‘jumping the shark’ is a loaded concept that’s now bordering on the cliché. And it’s easy to indict a show that’s having a lackluster season — especially a reality competition that’s suffering from inconsistent and frustrating judging — for having debased itself in some core way. But I think ‘jumping the shark’ is a very particular kind of invalidation, one perpetrated by PR in its last episode.

First, the facts. Spoiler alert.

In the bottom two on Thursday night: feather prince Nicolas Putvinski, with his malproportioned Grecian fantasy; and fragile autodidact Christopher Straub, with his indescribably bad “Santa Fe”-“inspired” “outfit” to match his unfortunate, hairline-thin, jawline-hugging facial hair.

Christopher, an earnest if overconfident soul from Shakopee, Minnesota, was making his fourth appearance in the bottom in as many weeks. After a strong showing early in the competition, Christopher went on to display an utter lack of taste; it was his third time in the bottom two, a perch from which he had outlasted better competitors Louise and Shirin.

Somehow, Christopher had continued to squeak by on something — remembered potential? Simple favoritism?

This week, though, there was simply no way he could get another reprieve after running so long on fabric fumes. Michael Kors described his Santa Fe garment as “costume.” Heidi was more frank: “unwearable,” she said; and, later, “just ugly.”

It was thus with the collective gasp of a million viewers that Heidi announced, “Christopher… you’re in.”


This season of Project Runway was problematic long before last week. After relocating to Lifetime and Los Angeles, the show was unmoored by innumerable absences from New York-based judges Nina Garcia and Michael Kors.

Consistent judging is essential for a show like Project Runway, where contestants prove their mettle and articulate their point-of-view over a season’s worth of wacky challenges. If I had missed school as many times as either judge has abandoned their post (or, more accurately, their runway-side stool) this season, I would’ve never made it past the seventh grade.

There have been other problems, too.

None of the contestants has impressed audiences with innovative design. Each week, the winning designs seem to be the ones conceived and executed with the most competence, not originality.

And none of the personalities has proven exceptionally engaging, leaving an absence of interesting interpersonal dynamics. Yes, Irina is a bitch and Carol Hannah thinks Logan is attractive. But it’s hard to summon hatred for Irina, as she is the most consistently successful of the designers; it’s harder to empathize with Carol Hannah, as Logan is criminally devoid of personality.

So why was Christopher’s third bottom-two survival the moment that marked the jumping of the shark?


Let’s take a step back. What does it mean to jump the shark?

Wikipedia defines the term as “a colloquialism coined by Jon Hein and used by TV critics and fans to denote the point in a television program’s history where the plot veers off into absurd story lines or out-of-the-ordinary characterizations. This usually corresponds to the point where a show with falling ratings apparently becomes more desperate to draw in viewers.”

This definition approaches the phenomenon by metonymy: yes, jumping the shark is often found in conjunction with declining ratings, and it often occurs vis-à-vis absurdity or inconsistency. But these are not the Ding an sich.

What these associations hint at is the core of shark-jumping: a cultural object’s forfeiture of artistic integrity. A TV show jumps the shark when it ceases playing by its internally-established rules or abandons its foundational premises.

Happy Days jumped the shark when Fonzie literally jumped a shark on water skis (still in his trademark leather jacket), but it jumped the shark because in that moment it gave up the pretense that it was a naturalistic representation of the lives of Richie Cunningham and his 50s teenage friends.

Cousin Oliver came to stay with the Brady Bunch because of their declining ratings, but the show jumped the shark because his arrival fundamentally altered its premise as a sitcom built on the foibles of what happened after a lovely lady bringing up three very lovely girls married a man named Brady who was busy with three boys of his own — this was a show with its premises built right into the theme song!

When Christopher lived to sew another day after first taking up residence in the bottom and then living there comfortably for a month, it wasn’t just an opportunity to scream at the screen — it marked Project Runway’s loss of artistic integrity.


Much of the best cultural criticism being written today can be found on a blog called FourFour, where Rick Juzwiak meditates on music, web culture, and, most prominently, reality TV. (His recaps of America’s Next Top Model offer enough motivation in themselves to continue watching.)

On the occasion of Project Runway’s sixth season premiere, he wrote about the show he once recapped but never fell in love with:

Project Runway has a reputation for being a high-brow reality show, probably because of its supposed investment in talent, its tempered contestants and its consistent pacing. I think assigning high- and low-culture status within the genre of reality TV is like assigning a hierarchy of pork products, from, say, belly to scrapple. In the end, it’s all fucking pig…

I don’t mean to hold its hype against it, and it’s not like Project Runway ultimately does that great of a job in avoiding being what it is, anyway. People are not there to make friends, they throw each other under the bus, this isn’t the last you’ve heard of them when they’re bounced. As though sniffing out truffles, the casting agents fill the show with types…

There is an androgynous, aggressively coiffed pseudo-intellect who described his design as “ineffable,” but was unfortunately incorrect as he didn’t then shut up.

In response to the task of designing for the red carpet, this one also said “I don’t differentiate between different colored carpets,” which, uh, yeah you do because you just called them “different.” It was here that I was reminded of maybe the main reason I stopped watching this show: I find humorless snobs too excruciating to even laugh at, and as a fashion-design competition, pretension runs thick on Project Runway. It’s not the show’s fault, per se, it’s just how it works out.

Juzwiak has never been able to sign onto Project Runway’s premises — that it is a cut above the typical reality competition, a true search for the best that rewards the excellent and dismisses the dilettantes — but these are its premises indeed. These are the reasons discerning viewers, who would never deign to watch Top Model, have fawned over Daniel Vosovic and Jeffrey Sebelia and Christian Siriano and Korto Momolu for years.

But Juzwiak is right: Project Runway was never perfect, and it has always had more base reality conventions sewn into the muslin core beneath its silk exterior. Yes, contestants who make for good TV might outlast their less interesting competitors. Yes, the challenges with their money- and time-limits are contrived.

Still, the internal logic of the competition demands that continued ineptitude be punished. The show is built on its premise of pretension, of being the highbrow reality competition that may give a second and third chance, but never a fourth.


At the beginning of this season, there was a contestant named Mitchell, whose last name I forget. Technically talentless, he seemed constitutionally incapable of assembling a wearable garment by the time of the runway show.

He was in the bottom two in week one, but was kept over the otherworldly Ari Fish. He was in the bottom two in week two, but was kept over the ineffable Malvin Vein. Viewers were frustrated, seeing admittedly eccentric designers leave before the bungling Mitchell.

But, then, justice.

In week three, Mitchell found himself in the bottom two for the third time — and this was after a challenge in which his team had won!

It was unprecedented, but clearly required by the logic of the show — his continued failure could not be countenanced.

Heidi made the awaited pronouncement: “Never in Project Runway history has a team member for a winning design been eliminated. Three strikes and you’re out.”

Flash forward to October 22. Christopher sews together fabric that leaves fellow designer Althea dumbstruck: “If Christopher can put that garment down the runway and not get eliminated, then I don’t know what’s going on.” We all agree.

He lands in the bottom two for the third time. The logic of the competition, the internal rules of the show articulated by Heidi herself, demand his expulsion.

But he survives. And he’s not even good TV.

The rules are broken. The premises are thrown over. The foundation collapses.

Project Runway jumps the shark.


In my next post, I’ll explore what Project Runway’s shark-jumping says about the state of reality TV — a genre built on the premise of representing “reality” that may be increasingly incapable of fulfilling its foundational requirement.

Note that this series is also being posted on Tears and Jeers, a pop culture blog written with Sachi Ezura. It was relevant to both blogs’ interests, and I couldn’t choose just one place to post. And some cross-blog promotion never hurts.

What Ever Happened to Ostracism?

October 11th, 2009

A couple of months ago, during the late-lamented summer, my parents and I found ourselves driving from Shelter Island’s Heights back to the Center, from the pharmacy and Stars Café to the post office and George’s IGA.

Turning into Dering Harbor village (population: 13), we were treated to an unusual sight for our small, modest island community: two young women in bikinis skipped down the street arm-in-arm, Laverne-and-Shirley-style, with their bikini bottoms pulled down beneath their pert-but-untanned buttocks. My father later recounted that day as his favorite of the summer.

We also later discovered that this semi-nude jaunting had been a summer-long habit of the two women, likely a fun way to get a rise out of the more staid and sheltered residents of the island.

Unfortunately for one of the women, who had been working as a hostess at one of the island’s inns, her reputation got back to the dining room. When it did, she was fired.

While my parents thought it was ridiculous that the woman should be dismissed just for having a little fun, my grandmother and I agreed that the inn’s owners were right — or at least had the right — to dissociate their business from their hostess’s indecent public displays.

It surprised me, though, to see an institution actually exercising a desire to uphold somewhat stuck-up standards of “decency”; the idea of a small-town community collectively looking down their noses at an impetuous young woman — and actually ostracizing her in some real way — seemed to belong more to the age of Ellen Olenska or even Hester Prynne than the age of Lindsay Lohan and Lady Gaga.

I thought about the incident again last month after reading an article in the New York Times “Vows” section about a couple that met and fell in love while performing together in La Bohème:

…When he kissed her, she momentarily lost her footing. “I was thinking, ‘What was that?’ ” she said. “There was definitely something there.”

After the rehearsal, Mr. Miller decided he had to see Ms. Kabanuck outside of work and invented a reason to call. A question about their schedule quickly turned into an invitation to a movie. That evening they went to see “50 First Dates.”

“I was so drawn to him immediately and tried to talk myself out of it,” Ms. Kabanuck said. Theirs was a clash of outlooks, if not cultures. He wore red cowboy boots, had earrings in both ears and spiked hair. She had been raised as a Baptist fundamentalist and said she remained devout, describing herself as “a little church girl.”

A sweet story so far, an opposites attract rom-com plot against the backdrop of a classic love story. Very Kate Hudson/Matthew McConaughey. Just one problem:

The date led to a few other encounters, but he was about to depart for Piacenza, Italy, for what he expected to be a triumph as the Duke of Mantua in a new production of “Rigoletto.” She drove him to the airport. Neither of them knew what would happen next. She was still married, but very much wanted to be close to him. He later described the experience of looking into her eyes on the first date as “that thunderstruck moment.”

“I was in love,” he said, “not just in my heart but in my head, my body, my soul. That was it.”

…Holed up in a hotel in the Latin Quarter for two weeks, they reveled in their own vie bohème. Only in this version, the two lovers began planning his next career move, an audition for the pop-opera quartet, Il Divo, then being put together by Simon Cowell. She scraped together the last of her money to buy him an MP3 player so he could rehearse.

The player turned out to be a solid investment. He became a member of Il Divo and now tours the world with the group.

Ms. Kabanuck, when she returned from Paris, moved out of the home in New Jersey that she shared with her husband and found an apartment in Manhattan. The decision to leave her marriage and devote herself to Mr. Miller was extraordinarily difficult, she conceded. Still, she added, “from the moment our eyes met through those two weeks of being in Paris and the pain of going through a divorce, I knew that I loved him.”

Emphasis mine. Call me old-fashioned, but I was a little thrown to be reading this in the Times.

Sure, love doesn’t always happen neatly, but should adulterers be rewarded with a profile in the Sunday Styles section? The Times chooses whom to include in their highly competitive Weddings pages — isn’t the inclusion of the cheating coloratura and her Divo an implicit (bordering on explicit) endorsement of flouting marital bonds?

The devout “little church girl” shouldn’t have to be marked with a scarlet A, but shouldn’t cheating on her spouse disqualify her from being celebrated in a national newspaper?

I wasn’t the only one surprised: a post on New York Magazine’s “Daily Intel” blog slammed the couple — and others who end up in Vows after cheating on their spouses — for wanting the world to applaud their disregard for their first husbands and wives:

We at Daily Intel are not naïve. We understand that sometimes people in relationships fall in love with other people, and that they sometimes want to marry those people, which necessitates ending their current relationship. The heart wants what the heart wants, and all of that. We get it. We’ve even applauded it, bizarrely. But what we do not understand, what we cannot abide, is when said people, in the throes of connubial bliss, lobby to have themselves included in the New York Times “Vows” column, and then proceed to tell the reporter about how they cheated on their previous partner in a way that suggests they think of it not as something crap they have done to another person but instead like it is a part of their personal love story…

We actually just find it kind of distracting as a reader of Vows, because it raises all kinds of questions that then go unanswered, such as: Do the people who tell these stories really realize this stuff is going to end up in the Times, really? Do they worry that it’s going to ruin their wedding announcement by making them sound awful? And what do the exes think? What’s their version of events?

The authors fault the Times for lazy reporting in not getting the story of the abandoned husbands and wives, but, really, it’s a question of values. Why offer your institution’s extremely well-respected stamp of approval to clearly distasteful if not unethical behavior?

In Edith Wharton’s world, one whiff-of-a-hint of an adultery scandal that coalesced into an acknowledged item of society gossip could push someone out of social life forever.

That end of the spectrum seems too extreme. One mistake doesn’t define a person; there should be room for rehabilitation — of one’s reputation if not of one’s character.

A few weeks ago, after Joe Wilson shouted “You lie!” during Obama’s address to Congress; after Serena Williams told a line judge at the U.S. Open that she’d shove the f-ing tennis ball down her f-ing throat; and after Kanye West assured Taylor Swift he was really happy for her and he was gonna let her finish, but Beyoncé’s video was one of the best of all time, the blogosphere punditocracy’s take-away message was that civility was dead.

But my take-away was slightly different and more reassuring: ostracism was alive, if not totally well.

Joe Wilson was “rebuked” by the House of Representatives, Serena Williams was fined by the tournament, and Kanye West was called a jackass by none other than Barack Obama.

The institutions which these individuals represent — Congress, professional tennis, the United States of America — made clear that their constituents’ actions were not in line with their institutional values.

Like the inn on Shelter Island, and unlike the New York Times Vows section, these institutions (metaphorically) fired their flashing hostesses.

But we have a short societal memory and a shorter cultural attention span. These events will remain wrinkles on their perpetrators’ reputations forever, but they won’t bar all reputational rehabilitation.

Case in point: Eliot Spitzer. Eighteen months after resigning in a prostitution scandal, he has a column in Slate and may even run for office again.

This is the kind of provisional ostracism that we now generally practice. Serena can earn back the respect of her fans and become a model sportswoman. The flashing hostess can be hired by another Shelter Island restaurant next summer. Institutions can censure those who show disregard for their values while still leaving the door open for redress.

If we want to keep civility alive, though, we must keep ostracism working. We must sometimes sustain collective scowls at distasteful behavior. Let’s congratulate former adulterers on their weddings but keep them out of the Weddings sections. Let’s let Michael Vick play football but not give him endorsement deals. Let’s let Joe Wilson keep his seat but not make him minority leader.

And let’s get the flashing hostess a job at the Gardiner’s Bay Country Club so my dad can see her more often.

The Economist Stole My Idea!

October 14th, 2008

Well, not really, but the similarities are uncanny…

From my September 26 post, Climate Change and The Winner’s Curse:

…the research on the rate and reach of climate change, even if it’s all done by good scientists using sound data-collection and analysis, is likely to result in findings that fall along a distribution. But while the truth of the matter is likely found in considering the distribution as a whole, the findings on the ends are going to be the ones that stick out to journal editors as the most interesting to prospective readers….

Let’s say there’s an auction of a good with an objective but unknown value (think fields for oil drilling, not a painting that each prospective buyer will value differently). Each buyer will estimate the value differently. Maybe they’ll each hire someone to professionally survey and appraise the good. The real value is probably around the mean estimate, but it’s the buyer with the high estimate who will buy the good, thinking the others suckers for passing on such a valuable purchase. But that buyer will almost certainly have over-valued the good. In an auction like this, you don’t want to be the winner.

Similarly, science journals are buying the articles that most highly estimate the costs of climate change — but they might be overpaying.

From an article in the current issue of The Economist, Publish and be wrong:

IN ECONOMIC theory the winner’s curse refers to the idea that someone who places the winning bid in an auction may have paid too much. Consider, for example, bids to develop an oil field. Most of the offers are likely to cluster around the true value of the resource, so the highest bidder probably paid too much.

The same thing may be happening in scientific publishing, according to a new analysis. With so many scientific papers chasing so few pages in the most prestigious journals, the winners could be the ones most likely to oversell themselves—to trumpet dramatic or important results that later turn out to be false. This would produce a distorted picture of scientific knowledge, with less dramatic (but more accurate) results either relegated to obscure journals or left unpublished.

More accurate, perhaps, than saying “The Economist Stole My Idea!” would be to say “The Economist reported on scientific findings that support ideas I discussed!”

Starbucks Baristas and Incentives for Store Activity

October 13th, 2008

Yesterday evening, I went into my favorite Starbucks for a drink and got to chatting with one of my favorite baristas. She was exhausted; there had been Oktoberfest festivities all day around Harvard Square, and so Starbucks was packed all day. (As my barista put it, the endless alcohol consumption was making people tired, so they’d come in for coffee then go back to their drinking.)

I was a bit puzzled and asked whether increased store activity didn’t have any upside for them — after all, it’s the baristas who have to stay cheerful and diligent while ringing up long lines of customers and making drink after drink.

Nope. Their salary stays the same no matter how busy the store is. And, though at one point an all-day rush would’ve left the baristas flush with tips, these days almost everyone pays with a credit card; there’s little change changing hands — and even less being dropped into the tip jar.

Long gone are the days when “Kristina Doran, who works behind the counter at a Starbucks in SoHo, said she has been known to take home an extra $160 a week in tips.” That was 2002, when the New York Times reported on the rising trend of ubiquitous tip jars.

And Starbucks has been having a lot of tip-related trouble recently. In March, as described by columnist Connie Schultz (no relation to Howard),

A California judge has ordered Starbucks to pay more than $100 million to its low-wage coffee servers, called “baristas,” after ruling that the company violated state law in allowing supervisors to share in the tip pool. The decision applies only to California but could influence tip jar policies across the country….

Starbucks called the decision “fundamentally unfair and beyond all common sense and reason.” Interestingly, many Starbucks employees — including baristas — agree, which is why this is more complicated than the typical management tip-skimming maneuver. Baristas insisted to journalists, including me, that their supervisors often brew coffee and wait on customers just like they do.

“I can’t hire or fire anybody,” one supervisor in the Cleveland area told me. “The only difference between me and a barista is that I count the money and I have keys.” Supervisors also reportedly make $1 to $2 more an hour. I don’t know for sure because no one at Starbucks’ corporate headquarters would talk to me.

My barista and I agreed that without substantial tips, the baristas’ incentives suggest they should want fewer customers: they’ll make the same income and won’t be overwhelmed with long lines and piles of drinks to be made.

As I sipped my mocha, I got to thinking about why this struck me as strange, and about what the possible alternatives are.

First off, it’s plain odd that, with Starbucks refocusing its corporate energies on the quality of its coffee and of each drink made — with promises that if your drink isn’t perfect, your barista should gladly make it again — its payment structure doesn’t align with this goal.

Think about it: when making a drink, the barista has an incentive to make it satisfactory, so that the customer doesn’t ask him to make it again. But, unless he has an incentive to make sure his location is packed with customers, he doesn’t have any extrinsically-motivated reason to make the drink great enough that the customer will definitely keep coming back.

The same is true for the customer’s general experience. A barista has an incentive not to make the customer have an unpleasant experience, because that could reflect negatively on her job performance and could lead to disciplinary action; but, she doesn’t have an incentive to make the customer experience so positive that people flood the store. (I’m assuming, as my barista suggested, that tips won’t increase markedly enough to justify this extra effort.)

(Sidenote: I’m talking pure economic incentives. I know firsthand that most baristas are wonderful people who want to make each customer thrilled with being at Starbucks even though they don’t necessarily benefit financially from that extra effort.)

The bottom line is that the corporation does nothing to make baristas excited rather than grumbly about a crowded store. That can’t be good for business.

As I reached the middle of my mocha, I started to wonder about how this payment scheme jibed (or didn’t) with similar service industries — a thought experiment that served to highlight how unique the Starbucks barista job description is.

On one hand, we could compare being a barista to being a cashier at a grocery store, perhaps someone who also walks the floor restocking shelves. There’s no incentive for this person to have a busy store, but his actions generally don’t reflect powerfully on the customer experience. And it doesn’t take much skill or training to ring up customers.

At Starbucks, the customer experience is almost completely determined by the baristas: how friendly they are, how well they make the drinks, how much they make you want to come back. This importance is reflected in the training baristas get and in their being called “partners” by the company — I don’t think Food Emporium feels so strongly about its employees.

It makes sense that the grocery cashier’s pay isn’t tied to store activity, but it doesn’t make sense for Starbucks baristas.

On the other hand, we could compare being a barista to working at a clothing store. Clothing store employees spend some time simply ringing up customers, but they also spend time doing more skill- and time-intensive work: finding clothes for customers, ensuring that the store is in order, making the customer happy and ready to purchase. This extra effort — akin, it seems, to the skill and time needed to make drinks to customer satisfaction — is rewarded with a commission on clothing sales for which there is no analogue in the Starbucks world.

Moreover, Food Emporium can see when it’s going to be busiest and simply employ more cashiers during those shifts. Same for Banana Republic and its staff. But there’s only so much room behind the counter and at the espresso machines at a Starbucks, and so the variance in employees from shift to shift is necessarily small. When the store is especially busy, then, the brunt of the extra work falls directly on those baristas’ shoulders.

It seems pretty obvious that given the kind of work baristas are doing and the close relationship of that work to customer experience — and thus to store activity — the baristas should be incentivized to want the store busy by more than the skimpy possibility of tips.

As I reached the last few sips of my mocha, I wondered: how can this be done?

Above the base salary for baristas, there should be performance bonuses that come with increased store activity. If Starbucks has data on the revenue from each location, it can presumably see when a given location’s activity is increasing (perhaps above predictable seasonal changes in activity) and then reward the baristas accordingly.
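To make the proposal concrete, here is a minimal sketch of such a bonus rule in Python. Everything in it is hypothetical, not anything Starbucks actually does: the 2% rate, the weekly figures, and the function name are all illustrative. It just shows the shape of a pay scheme tied to activity above a seasonal baseline:

```python
def activity_bonus(weekly_revenue, seasonal_baseline, base_pay, bonus_rate=0.02):
    """Hypothetical pay rule: a barista earns base pay plus a small cut of
    location revenue above the predicted seasonal baseline for that week.
    Slow weeks pay exactly the base; busy weeks pay more."""
    excess = max(0.0, weekly_revenue - seasonal_baseline)
    return base_pay + bonus_rate * excess

# An Oktoberfest-style rush: $12,000 of revenue against an expected $9,000 week.
busy_week = activity_bonus(weekly_revenue=12_000, seasonal_baseline=9_000, base_pay=300)

# An ordinary slow week: no excess revenue, so pay stays at the base.
slow_week = activity_bonus(weekly_revenue=8_000, seasonal_baseline=9_000, base_pay=300)

print(busy_week, slow_week)
```

Because the baseline already absorbs predictable seasonal swings, the bonus only rewards the surges that actually land extra work on the baristas’ shoulders.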

And workers on shifts that face especially overwhelming crowds should be compensated for bearing that burden with grace and discipline. This salary bump would show that the corporation really does care about the effort of its partners.

In general, Starbucks is a good employer. As Schultz wrote in March:

Starbucks was the first major U.S. company to offer health care coverage to some part-timers. It also offers tuition reimbursements and a 401(k) program. That’s a high standard I wish more companies would meet.

But Starbucks has its problems with workers, too. Earlier this month, the company agreed to pay an undisclosed benefit to about 350 managers in Texas who claimed they were forced to work off the clock.

And now there’s this business with the tip jars.

Starbucks supervisors work hard, and they should be paid for their efforts. The company should stop relying on customers’ generosity to compensate them adequately. (emphasis mine)

No matter how uniform and involved corporate policies are, the personality of each Starbucks location is resolutely in the hands of its baristas.

As such, baristas should be rewarded financially for making their locations especially fun, welcoming, delicious places to be.

An Open Letter to The New York Times

September 30th, 2008

In an earlier post I talked about the impending demise of print media and offered a probably not-incredibly-useful proposal for improving the quality of new hires into the industry.

Here’s another idea, which I’m convinced could actually add a revenue stream for circulation-lacking papers like The New York Times. And with the demise of The New York Sun this week just another in a long line of portents, it’s time for them to listen.

During a car ride to the city the other day, my cousin Jarema was talking about how she wished she could read the newspaper every day, but she didn’t have the time. And she described how, in addition to listening to music, she loves using her iPod to listen to audiobooks while she works (as a painter for an art collective). If only she could listen to the newspaper on her iPod!

Well, why can’t she?

The Times needs subscription revenue, but readers are flocking to the website instead. The paper tried charging for some web content, but with the abundance of free news online there’s no reason to pay.

Meanwhile, many people (like me) love the content from the Times but just can’t read every article. We like the actual paper for the variety, depth, and quality of its coverage. In contrast, the local and even national TV news and news radio lack this quality and depth, and lack the user-side control of clicking around on the Times Online.

Add to that the efficiency of using iPods for purposes other than music. For instance, I download History Channel documentaries and listen to them while I walk around.

So what’s the prescription? The Times should team up with Apple to offer a daily download of the paper, divided into tracks for each article. It shouldn’t be too difficult to have a few voice artists record the articles every night; use one artist for the dozen or so articles in each section — a Diane Sawyer type for International News; a Ray Romano sound-alike for Sports; for the Metro Section, Fran Drescher (she might be available for this, right?).

I don’t subscribe to the print version of the paper because I move around too much, it’s too bulky, and there’s not enough value added over the online version. But I’d subscribe to the audio downloadable paper for sure. Just as I get an email whenever there’s a new Mad Men episode available for download, I’d have an email in my inbox every morning with a one-click link to the audio of the day’s paper. A minute later, I’d have my iPod earbuds in, on my way to the elevator, hearing the day’s headline article read to me.

Maybe I’d skip articles on telecom mergers or soccer matches, but I’d get a much wider variety of news than when I click around the articles that pop out on me on the website.

Another benefit: people love to dissociate payment from their purchases — it adds utility. It’s why we convert money into chips when we go into a casino: we suffer the expense once and then we don’t have to think about it. Paying for a subscription to the Times on iTunes would be quick and painless, making us more likely to part with money we wouldn’t hand over in daily increments of $1.50, 365 days a year.

Bottom line: with minimal effort and expense, the paper can make a whole new generation into Times subscribers. By making our currently unproductive time productive — letting us hear the Times while walking the streets — they’ll add value to their reporting that makes it worthwhile for us to spend money on the news. New revenue abounds.

Update (10/1/08): So it turns out that a company called Audible, bought by Amazon in January for $300 million, offers an “Audio Digest” version of the Times for like $13/mo. So someone over at the Times is recording an abridged version of the paper every night. They’re just not making it easily available — nor marketing it aggressively — to the iPod generation. To get it, one has to go to Audible’s site and find it, then create an account, download it, and import it into iTunes. And let’s face it: nothing with the word “Digest” in its name is being marketed to millennials. This Digest should be made a lot sexier and be made easily available through the iTunes Store. Of course, it’s also worth noting that you can subscribe to some New York Times podcasts, but nothing akin to what I describe above.

Climate Change and The Winner’s Curse

September 26th, 2008

Over the last few years, most mainstream doubt about the existence of climate change has been quelled — just think about Al Gore, Nobel Prize Laureate — and only the most strident of zealots resist the coalescing consensus that the activities of man are adversely affecting our environment.

As a result, I was surprised and intrigued to hear from a friend over dinner recently that her boyfriend — a Harvard business school student who holds numerous science degrees, including a master’s from Cambridge — is resistant to the forecasting that appears in major scientific publications like Science and Nature.

No one’s accusing these publications of being anything less than rigorously peer-reviewed bastions of scientific research, so why disbelieve them?

My friend said her boyfriend felt that lots of good, legitimate research is done on climate change, but the publications — and the media at large — focus on the extreme, the provocative, and so it’s the most pessimistic views that end up in their pages.

This makes perfect sense: the research on the rate and reach of climate change, even if it’s all done by good scientists using sound data-collection and analysis, is likely to result in findings that fall along a distribution. But while the truth of the matter is likely found in considering the distribution as a whole, the findings on the ends are going to be the ones that stick out to journal editors as the most interesting to prospective readers.

It’s a case of the classic economic phenomenon known as The Winner’s Curse.

Let’s say there’s an auction of a good with an objective but unknown value (think fields for oil drilling, not a painting that each prospective buyer will value differently). Each buyer will estimate the value differently. Maybe they’ll each hire someone to professionally survey and appraise the good. The real value is probably around the mean estimate, but it’s the buyer with the high estimate who will buy the good, thinking the others suckers for passing on such a valuable purchase. But that buyer will almost certainly have over-valued the good. In an auction like this, you don’t want to be the winner.

Similarly, science journals are buying the articles that most highly estimate the costs of climate change — but they might be overpaying.
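The auction logic above is easy to check with a quick Monte Carlo sketch. All the numbers here (ten bidders, a true value of 100, an estimate spread of 20) are illustrative assumptions, not drawn from any real auction:

```python
import random

def simulate_auctions(true_value=100.0, n_bidders=10, noise_sd=20.0, trials=10_000):
    """Winner's curse sketch: each bidder's estimate of the good is unbiased
    (centered on the true value), but the highest estimate wins the auction,
    so the winning bid is systematically too high."""
    total_winning_bid = 0.0
    overpay_count = 0
    for _ in range(trials):
        estimates = [random.gauss(true_value, noise_sd) for _ in range(n_bidders)]
        winner = max(estimates)  # the most optimistic bidder takes the good
        total_winning_bid += winner
        if winner > true_value:
            overpay_count += 1
    return total_winning_bid / trials, overpay_count / trials

avg_winning_bid, overpay_rate = simulate_auctions()
print(f"average winning bid: {avg_winning_bid:.1f} (true value is 100)")
print(f"fraction of auctions where the winner overpaid: {overpay_rate:.0%}")
```

Even though each individual appraisal is right on average, the winning bid lands well above the true value, and the winner overpays in nearly every simulated auction — exactly the selection effect that favors the most dramatic climate findings.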

Infomercials and Attack Ads

September 24th, 2008

What do infomercials and many political attack ads have in common?

They both change the way we think even though they employ words and imagery we all know to be blatantly false.

Last week, Saturday Night Live did a fake commercial based on my favorite infomercial convention: making simple everyday tasks, from peeling a potato to doing a sit-up, look Sisyphean. In the SNL commercial, Kristen Wiig just CANNOT get a jar open! In trying, she ends up accidentally killing her husband, burying him, lying to the cops, getting arrested, being convicted, then getting chased by dogs after breaking out of jail. “There’s got to be a better way!” she cries. Then in a clear color shot, she uses the “jar glove” to easily remove the lid. “Jar glove. The better way!”

Infomercials all seem to use this tactic: in black and white dramatizations, we watch unfortunate people with contorted faces flail this way and that in a desperate attempt to get the knife to move straight or to get their chests to meet their knees. We all know that these tasks are not this hard. We know these dramatizations are absurd. And yet they do seem to make those Tater Mitts and Ab Lounges more attractive.

George Orwell described the “schizophrenia” of “holding simultaneously two beliefs which cancel out.” And though we’re not schizophrenic when we watch infomercials, we do seem to inhabit a double consciousness, knowing that our lives are not the black and white ordeals we see onscreen but still feeling that they could be easier.

And we inhabit this same double consciousness when seeing, hearing, or reading malicious political attacks.

Two months ago, John McCain said that Barack Obama would “rather lose a war in order to win a political campaign.” This is a statement that no reasonable person, even a McCain supporter, would believe. Add to this other scurrilous rumors — Obama is secretly a practicing Muslim; he was childhood friends with Iranian president Mahmoud Ahmadinejad — that are either conspiratorially unbelievable or quickly, easily falsified. Yet these clearly false attacks persist — probably because they are effective. We’ll likely see even more attacks like this in the next six weeks; FactCheck.org is a good place to keep track.

Why are they effective? As John Bullock writes in his article “The enduring importance of false political beliefs”:

Much work on political persuasion maintains that people are influenced by information that they believe and not by information that they don’t. By this view, false beliefs have no power if they are known to be false…But findings from social psychology suggest that this view requires modification: sometimes, false beliefs influence people’s attitudes even after they are understood to be false.

Negative associations change our attitudes, even if these associations are as transparent as false rumors or impossible sit-ups.

The producers of infomercials know it’s true. Clearly, so do politicians.

It’s intriguing that simple lying might be as effective as aggressively spinning the truth. The conventional wisdom is that bullshit is much more insidious than bald-faced lies, being harder to refute. The manipulation of words and images, subtly misleading statements (like President Bush mentioning Iraq and 9/11 together, giving the impression that Iraq was somehow involved in the terrorist attacks), the framing of issues, etc., are supposed to be the most dangerous forms of political maneuvering. But maybe the conventional wisdom is wrong.