Winter 2010: Features
Or How I Learned to Stop Worrying and Love Top Chef
By Eddy Mueller 07PhD
If you are one of those inclined to agree with Newton Minow, you might think reality television proves his point once and for all.
Minow was the chair of the Federal Communications Commission under John Kennedy and, incidentally, one of the few academics ever to hold the post. Making few friends in the industry whose conduct he was to oversee, he famously branded prime time television a “vast wasteland.” And that was way back in 1961; Minow never saw a single episode of Sex Rehab with Dr. Drew or The Swan.
Yet, however viewers or critics might sneer at individual programs, so-called reality television is everywhere, from the once-monolithic broadcast networks to narrowcast cablecasters zeroed in on a single demographic. “Reality TV” is a generic term for a mode of television production that now encompasses an extraordinary variety of programs.
Perhaps you take guilty pleasure from The Real Housewives, or experience earnest (if voyeuristic) pathos watching Intervention. Maybe you keep your eyes peeled in case your path crosses that of one of America’s Most Wanted, or maybe you’ve sprained your thumbs textercising your franchise to make Adam Lambert or Fantasia or any of the assorted Davids the next American Idol. But whether you keep up with the Kardashians or with Jon and Kate’s adorable eight (and there are now shows featuring even more fecund families), you are part of reality TV’s staggering global audience.
So successful and so pervasive has the model become that unless you are Amish or have banned the box entirely from your life in some sort of Luddite frenzy, it’s now hard not to watch reality TV. Love it, loathe it, or both, reality content rules contemporary small-screen entertainment.
Yael Sherman 09PhD, who completed her dissertation in Emory’s Department of Women’s Studies examining issues of gender and power in makeover shows, taught a popular course on reality television last semester. She emphasizes the international scope and significance of the phenomenon. “Reality TV, like any other form of popular culture, offers valuable insights into different cultures,” she says.
To be sure, television has always traded on the reality factor. Like radio, its immediate forebear and the source of most of its forms and conventions, TV was in its early years primarily a live medium. Sharing as well radio’s miraculous capacity to reach into our very homes, television opened a window—albeit initially a tiny, blurry, black-and-white window—onto actual events occurring in more or less real time.
Even after the broadcast industry began to “go to film”—live programming was too prone to gaffes and accidents that could mar the image sponsors sought to create—television constantly blurred the boundaries between entertainment and actuality, from the early enthusiasm for pseudo-sports like roller derby and professional wrestling, to fantastically popular semireal celeb sitcoms, like Ozzie and Harriet, The George Burns and Gracie Allen Show, and of course the iconic I Love Lucy, in which the actual Arnazes played the fictional Ricardos. As Jack Webb intoned every week in his somber preamble to Dragnet, the original “ripped from the headlines” procedural drama, only the names had been changed, and those not by much.
In fact, much of the reality material we see today has precedents in programs past. Arthur Godfrey’s Talent Scouts is the indisputable ancestor of the Idol-style TV talent show; Divorce Hearing, a surprise 1957 syndicated success, was a precursor to many a prurient People’s Court; quiz show champ Dr. Joyce Brothers, trailblazer for Dr. Phil and friends, was shrinking heads and saving families on the tube as early as 1958; and Allen Funt’s Candid Camera had been “punking” people for a quarter century before Ashton Kutcher was even born.
And there has also always been on television, as there had been on radio, the tantalizing prospect of sharing an ostensibly ordinary citizen’s scramble for celebrity, cash, and fabulous prizes on any of scores of game shows and “quizzers.” As the grand jury that investigated the quiz shows—most of which turned out to be rigged—discovered, there are always limits to the reality of the realities sold on television.
While there are radio and television ancestors of virtually every mode that has emerged in the reality racket, the contemporary boom began in 1989, in large part because of a tectonic shift—not in public taste, but in television technology.
Over the course of the 1980s, videotape became an almost ubiquitous part of the home viewing experience. Despite lawsuits from film distributors and networks, by 1989 more than 62 million Americans owned VCRs, representing more than 70 percent of American television households. Still more rented players at any of the tens of thousands of stores that had sprouted up all over the country to peddle small-screen cinema.
Videotape did more than create a booming ancillary market for films, though. It also provided an efficient, low-cost medium for the production of motion pictures. Film, even the relatively user-friendly small-gauge 16mm and Super 8mm formats, is notoriously temperamental and slow. All filmmakers, from the humble local news cameraman to the slickest Hollywood shooter, were obliged to wait for a processing lab to do its thing before they could even know whether they had caught the moment they wanted to memorialize forever.
Several video camera models were being offered in the United States by 1985, and whatever they might lack in resolution and aesthetic nuance, they more than made up for in convenience. The picture might not look like much by cinematographic standards, but what you saw, you got, and right away. Although only pornographers, telejournalists, and a handful of avant-garde videographers embraced the medium professionally, video “camcorders” revolutionized amateur home moviemaking. Highly portable, simple to operate, and ever cheaper as the decade wore on, camcorders allowed private citizens everywhere to generate for their delectation millions and millions of hours of content, a gold mine of potential awaiting exploitation.
Enter, Thanksgiving weekend, 1989, America’s Funniest Home Videos. The cost of content has always been a major obstacle to profitability in the entertainment business. AFV, as it is known to aficionados, cannily sidesteps that hurdle by getting viewers to provide the content for free, all of them sharing their pain with the world in the hopes of being one of a tiny handful of entrants to be graced with a cash award and an on-camera verbal slaparound by sitcom funnyman Bob Saget.
Initially cribbed from a Japanese TV hit, this edifying celebration of public humiliation and accidental injury gamely soldiers on today, the second-longest-running entertainment show ever to air on ABC. Cops, another surprise success, also appeared in 1989 on Fox, sharing with the viewers at home the drudgery, adrenaline, and tragicomedy of rank-and-file law enforcement. Exploiting low- or no-cost sources of videotaped content, Cops helped AFV lay the groundwork for what has become a reality revolution.
The success of America’s Funniest Home Videos revealed the willingness of members of the television audience both to view and to volunteer amateur video content. Granted, there are in some programs additional attractions and incentives—like the remote promise of reward in the form of prize money or celebrity, or free services to make over one’s hair or home or habit—but the possibility of participation and a moment in the television sun alone drives millions of aspirants literally to make spectacles of themselves. And for every wannabe supermodel, survivor, inventor, interior designer, and Pussycat Doll who makes it onto the air, hundreds more turn up for cattle calls or turn in videotaped “auditions.”
Creatives and cost-conscious producers were quick to embrace the vogue in video voyeurism. Not only did these new models and technologies hold down the cost of content, but using self-selected, nonprofessional “performers” provided other advantages. Most of the exhibitionists—as many as a thousand on TV at any given moment, according to a recent column in the New York Times—obligingly yielding their lives, loves, and kitchen renovations to the reality production machine are amateurs, undefended by the agencies, guilds, and unions that leverage compensation and residual rights for the pros playing host and peopling scripted shows. Reality content has also proved largely immune to labor actions within the entertainment industry. All icing on the cake.
The expansion in content coincides with another revolution in television’s delivery system: the cable boom. From the mid-1970s to the mid-1990s, the number of content providers available to many consumers swelled from a dozen or so on-air “stations” to more than fifty. By 2002, some 280 national cable channels were available twenty-four hours a day, seven days a week.
With so much airtime to fill, the shot-on-video reality paradigm has proved by far the most economical means of generating content for the cable industry. Quickly mutating to claim every possible niche in this new wasteland (“waste” only in the sense of “emptiness,” I hasten to clarify, lest I offend fans of Dog the Bounty Hunter, I Shouldn’t Be Alive, or Flavor of Love), the reality paradigm developed a startling variety of forms, from makeover shows like What Not to Wear and Queer Eye for the Straight Guy, to reality-competition programs like The Bachelor and The Biggest Loser, to celebrity ethnography-coms like The Osbournes and The Girls Next Door. By the end of the 1990s, many national cable channels were filling nearly their entire schedule with programming that falls under reality’s big umbrella.
All this goes a long way toward explaining why content producers and providers are so enamored of the reality mode, but leaves as a mystery the often obsessive interest of the audience. What exactly drives our unquenchable appetite for the stuff?
“America, and the world, loves reality television because it is both familiar and new,” offers Emory’s Sherman, one of a growing number of scholars and researchers interested in this varied and vibrant cultural form. “It thrives on hybridization—taking elements from talk shows, game shows, dating shows, dramas, social experiments, and documentaries and making them new through mixing them together and mixing them with reality TV elements.”
Indeed, despite dire predictions throughout the early 2000s that the reality lode had played out, the mode has proved a Hydra. Though some shows have had impressive staying power, most reality shows are relatively short-lived—a few seasons, a DVD or two, and done. But for each cancellation, it seems, two new entries spring up, shot ever faster and cheaper as the price of video production and postproduction continues to collapse. True, there is always in the mix a million-dollar purse, or an expensive show plunking a handful of Americans hand-picked to scoop in desired demographics into a tropical locale in which people have been living for thousands of years and asking them to survive with nothing more than several camera crews, a trained medical team, and a small army of lawyers to help them along. But by and large the cost of reality television production has been prone to steady deflation, a boon in trying times. And cable competition shows built around identifying people worth building shows around have allowed some producers to profit even from the development and test-marketing process.
Energetic academics and critics are busily gleaning such insights, exploring the mode’s historical, sociological, psychological, political, and even aesthetic dimensions. And with no end to the bonanza in sight, they have their work cut out for them.
As for the rest of us, dwellers in the wasteland, can we be blamed for being besotted with television so precisely tuned to our desires, dreads, and self-regard?
“Reality TV does a great job of reaching people,” says Sherman. “After all, these are ordinary people, just like you, on TV.”
Eddy Mueller 07PhD is a lecturer in film studies.