The Moviegoer


WHEN WALKER PERCY put the finishing touches to his novel The Moviegoer in 1961, he had gotten it right: sitting in a darkened movie theater is just the place for figuring things out. For David A. Cook, director of Emory’s Film Studies Program, the same has long been true. Unlike Percy’s protagonist, however, most of Cook’s revelations concern the movies themselves and the peculiar brand of alchemy they practice on viewers.

Cook’s latest project, Lost Illusions: American Cinema in the Shadow of Watergate and Vietnam, 1970–1979, is simultaneously a book on 1970s filmmaking and a meditation on an era that has been, says Cook, “commodified.” Television, for instance, has already showcased the era with the program That ’70s Show.

“Most traffickers in 1970s nostalgia,” asserts Cook, “herald the new freedom of the youth culture, yet blunt the ramifications of the Vietnam War.”

On the checkerboard of academic life, it was clearly a triple jump that brought Cook from a 1971 dissertation on the Wessex novels of John Cowper Powys to the recent contemplation of why early 1970s musicals did poorly at the box office. He arrived at Emory in 1973 as an assistant professor of English. The climate of the time dictated that a movie might be one’s entertainment on a Saturday afternoon, but it was not part of the canon. During the proving years, Cook joined ranks with his fellow professors and filled blackboards with evidence of Dante Gabriel Rossetti’s Asian influences and Vladimir Nabokov’s dark puns. Determined, nonetheless, that film merited a double bill with literature, Cook did all in his power to prepare the way for its sustained study at Emory.

That day came in 1986, with the creation of the Department of Theater and Film Studies. In less than six years, the Film Studies Program stood on its own, offering a graduate-degree program and the only undergraduate film studies major in the Southeast. Today, the lights are down and the enthusiasm is up as Cook works alongside three other full-time professors of film studies to offer everything from core courses in film theory to electives such as Movie Censorship and American Culture.

GIVEN THAT COOK’S OWN GENESIS as a film scholar came in the 1970s, it is fitting that his current book project take this decade as its period of study–an epoch that is, says Cook, “ancient history” to many of his students.

“Everything the American cinema is today,” maintains Cook, “began in the 1970s. It was a decade whose changes were as revolutionary as the introduction of sound was to cinema in 1928, yet it was a quieter shift.”

Lost Illusions not only offers a complete history of a pivotal decade at the movies, it is also a barometer for who we were–politically and morally–as Americans in those years.

November 4, 1980–the date of Ronald Reagan’s election–marked the end of at least two illusions the 1970s had fostered. Cook describes them with a sense of personal resignation:

First was the illusion of a liberal political consensus created by the antiwar movement, the Watergate scandal, and the subsequent resignation of Richard Nixon as president of the United States: the decade that began with the [October 15, 1969] moratorium, Kent State, and the Washington Post’s exposé of “the White House horrors” ended with the election of Ronald Reagan and the ascendancy of right-wing conservatism in our political leadership for the next twelve years. The second illusion, intermingled with the first, was that mainstream American movies might aspire to the same sort of serious social or political content described above on a permanent basis.

The antecedents for what happened in the American film industry in the 1970s are apparent in the big-budget musicals of the late 1960s. Fox’s The Sound of Music in 1965 marked the crest of the genre’s popularity. Although it seems impolite to blame Julie Andrews and her sunny ilk, Cook notes that “the musical entered the 1970s with the onerous distinction of having helped more than any other single genre to create the financial crisis of 1969–1971.”

Overproduced, big-budget musicals triggered more than $60 million in losses ($264 million in contemporary terms) for the “majors”–Paramount, Warner Brothers, Columbia, Fox, Universal, and MGM/United Artists–between 1967 and 1970. Using Julie Andrews vehicles as a point of comparison, The Sound of Music cost around $7 million and grossed $135 million, while Star! (1968) cost $15 million to make and returned only $4.2 million in domestic rentals. The recession in the industry triggered $500 million in losses for the majors, resulted in all-time lows for weekly audience attendance, and left an estimated 40 percent of Hollywood filmmakers unemployed by 1970.

It became incumbent upon the industry to imagine a new way to produce and market its product. From the ashes of such musicals as Star! arose a defensive production strategy known as “the blockbuster syndrome.”

In 1971, notes Cook, “185 pictures returned $364 million in domestic rentals, with fourteen producing 52 percent of this income, and the remaining 171 left to scramble for the rest. . . . By this kind of logic, only films that were carefully packaged and laden with ‘proven’ elements, like pre-sold properties (best-selling books, hit plays, popular comic strips) and bankable stars, had a reasonable chance of becoming top-echelon blockbusters.”

And what of the character of these blockbusters? Cook writes, “In marketing and distribution, as well as in production content, the majors adopted practices . . . that were designed to maximize profit quickly, regardless of a film’s quality or merit.” Making an unforgettable splash at the midpoint of the decade was Universal’s Jaws (1975), a film directed by Steven Spielberg.

“As sharks were raised to the level of a national fetish,” observes Cook, nothing would ever be the same in the way films were cost-projected and marketed. The great-white-hype was produced for only $12 million, yet went on to earn $100 million in domestic rentals in its initial summer run (and $129.5 million overall). Such singular sensations soon became, paradoxically, the norm. Jaws was followed in 1977 by George Lucas’s Star Wars and Spielberg’s Close Encounters of the Third Kind.

Jaws was, says Cook, the first “high-concept” film—an industry term denoting a film whose story is easily reducible to a single image, “which then becomes the basis for an aggressive advertising campaign keyed to merchandising tie-ins and ancillary markets, creating ‘synergy’ between film, products, and related media.”

As proof of the economic impact of the blockbuster, Jaws coproducer Richard Zanuck made more money on his share of the film’s profits in six months than his father, studio executive Darryl F. Zanuck, had made in his entire career. Twenty-six at the time Jaws was filmed, Spielberg would go on to become, in the words of David Cook, “the dominant commercial force in American cinema for the next twenty years.” With the creation in 1994 of his own multimedia conglomerate, DreamWorks, Spielberg packed a bite that rivaled that of any other executive in the industry.

DAVID COOK DID NOT SET OUT to write a book about the 1970s; his abiding interest was in the period from 1967 to 1975. For it was then that a phenomenon quietly took place between the recession and the advent of the blockbuster: American auteurism. In 1954, the French director François Truffaut was the first to speak of “la politique des auteurs” (the policy of authors) to describe the way in which a film should be a means of personal artistic expression for its director-writer. Even before a term appeared to describe the phenomenon, American moviegoers generally had recognized the special power of directors such as Orson Welles and Alfred Hitchcock—men who had worked within the studio system yet had managed to impart a distinctive signature element to their work.

Concurrent with the economic nadir of the film industry in the late 1960s was a discovery of the importance of the youth market. For those aware of the tight controls historically exerted by the studios, it is refreshing indeed to hear that studio executives jointly made the remarkable decision to grant artistic control to young directors in order to capture the youth market. Although their decision was market-driven, notes Cook, “the studios’ embrace of auteurism represented a genuine attempt to bridge the generation gap, which brought with it a few years of artistic freedom and resulted in some of the most original American films since the late forties.”

Cook devotes an exuberant chapter to charting the very real inroads made by the likes of Arthur Penn, Stanley Kubrick, Sam Peckinpah, Francis Ford Coppola, Martin Scorsese, and Robert Altman. As with anything in Hollywood, days of innocence are not even twenty-four hours long. And so it was that what began as “personal” cinema—Cook cites the ecstatic rebelliousness of films such as Bonnie and Clyde (Arthur Penn, 1967) or Easy Rider (Dennis Hopper, 1969)—became, by decade’s end, corporate and impersonal.

The auteurs themselves suffered a similar fate, finding celebrity the equivalent of wearing a mink coat in the middle of a Hollywood summer. In David Cook’s description, many of them became “high-rolling celebrity directors . . . with their own chauffeurs, Lear jets, and bodyguards.” Sadly, the intoxicating change wrought by the first wave of auteurs would “boomerang” on those who followed and, argues Cook, “recast their films as branded merchandise to be consumed along with T-shirts, action figures, Happy Meals, and, by the end of the decade, miniaturized and badly framed versions of the films themselves called ‘videos.’ ”

The other principal joy for Cook in writing Lost Illusions was the opportunity Emory gave him to build a seminar around 1970s films and his evolving book. Taught in fall 1997, Cinema of the 1970s gave Cook what he calls “the perfect test-audience.” Cook considers his time in front of the class valuable in many senses, but especially so if “the course communicated the breadth of the cultural phenomenon posed by the blockbuster.”

Cook chose three notable contributors to add chapters to his volume. Douglas Gomery, professor of communication arts at the University of Maryland, wrote the chapter “Motion Picture Exhibition in 1970s America.” Flanking Gomery’s work is “Looking Back and Turning Inward: American Documentary Films of the Seventies,” by William Rothman, a professor in the motion picture program of the School of Communication at the University of Miami. Rounding out the book is the essay “Avant-Garde Cinema of the Seventies” by Robin Blaetz, author of Visions of the Maid: Women, War, and Joan of Arc in Twentieth-Century America.

AFTER A LONG PERIOD of immersion in cinema’s past, Cook has emerged with an interest in the industry’s next step. He predicts that it will be to make “CGI”–computer-generated imagery–“the real thing,” by which he means that CGI will be planned from a film’s beginning rather than added in postproduction. The technologies grouped under the rubric of CGI make it possible to integrate photographic images into a film digitally, without traditional photography.

Already, the tireless film historian is chronicling the uses of CGI. When it was used in Terminator 2: Judgment Day in 1991, Cook reports, CGI consumed a third of the film’s budget. Today, however, it often represents a savings for filmmakers, who no longer must rely on virtuosity to achieve an effect. Of the many recent uses of CGI, Cook cites the Denzel Washington film The Bone Collector, which portrays the gritty realism of New York City streets without ever entering those mean streets: all the dramatic exchanges take place in front of a blue screen.

Cook’s musings about cinema’s flirtation with the digital domain will be incorporated into the fourth edition of his ambitious History of Narrative Film. In circulation since 1981, this major textbook of film history is in use at approximately four hundred colleges and universities in North America, the United Kingdom, and Europe.

Perhaps the spanning of a decade is a puddle-jump for a historian versed in limning the history of cinema. Still, the films of the 1970s are complex, ranging from the bland euphony of Sound of Music imitators to the disturbing cacophony of Taxi Driver (1976) as the period progressed. Weathering remarkably strong tensions within a comparatively short period of time, the 1970s was both “Hollywood Renaissance,” based on the stirring aesthetic revival engineered by the young auteurs, and “New Hollywood,” a haven for dollar-driven cynics. Withal, writes David Cook, “lies a cinema of great expectations and lost illusions that mirrored what the historian Peter N. Carroll has called ‘the tragedy and promise of America in the 1970s.’ ”

Susan M. Carini is director of publications at Emory and a master’s degree candidate in the film studies program.

 

© 2000 Emory University