Last year, at least, the Academy Awards posed a compelling question: would acclaimed director Martin Scorsese finally break through and win the Best Director Oscar he’d so often been denied? His most recent vehicle, “The Departed,” offered a powerhouse cast (Jack Nicholson, Leonardo DiCaprio, Matt Damon, Mark Wahlberg) and better commercial performance than Marty’s usual box office duds. His victory allowed the show (even with Ellen’s oddly understated and tone-deaf performance as host) to avoid identification as a total disaster. Meanwhile, the other films in prominent contention (“The Queen,” “Pan’s Labyrinth,” “The Last King of Scotland,” “Little Miss Sunshine,” “Babel,” “Letters from Iwo Jima”) might qualify as worthy artistic endeavors but hardly counted as mass audience crowd pleasers.
For 2008, the choices for Best Picture and other major awards are even more removed from mainstream sensibilities, and even the titles themselves will discourage many potential viewers. “There Will Be Blood” doesn’t exactly sound like an old-fashioned, pass-the-popcorn good time at the movies, nor does “Atonement,” “No Country for Old Men,” or “Michael Clayton.” In fact, the only vaguely upbeat entry among these relentlessly grim, artfully intense offerings is a quirky comedy (“Juno”) about a pregnant high school girl trying to find an adoptive home for her new baby – not the sort of plot line (despite its admirable pro-life messages) normally devised to satisfy all those who yearn for more wholesome, family-friendly entertainment.
The disconnect between Oscar nominees and the industry’s biggest box office winners remains a daunting problem for the motion picture establishment. It’s no accident that the recent Oscar broadcast that drew by far the biggest audience came in 1998, when “Titanic” not only set records for ticket sales but also sailed away with the most prominent and prestigious Academy Awards. An impressive 55 million viewers tuned in to see James Cameron (remember him?) declare “I’m King of the World!” and to watch as the potentates of pop culture decided to anoint a movie that ordinary people had seen and liked.
Does anyone expect a similar response to the prospect of the Coen Brothers getting Oscar gold for “No Country for Old Men” (and maybe explaining the Yeats reference in their title), or to Paul Thomas Anderson (who?) claiming the prize for “There Will Be Blood”?
Of course, the producers of the yearly telecast would love it if the Academy voters found a way to honor the big box office winners with prominent Oscar nods. But this year, not even the most populist critics could make a plausible case for the towering artistic excellence of “Spider-Man 3,” “Shrek the Third,” “Transformers” or “Pirates of the Caribbean: At Wit’s End,” the money-making big four of 2007.
Naturally, such sure-thing blockbusters hardly need validation or promotion from the motion picture Academy, and defenders of the Oscar process applaud the recent habit of emphasizing worthy independent and low budget pictures (“The Savages,” “The Diving Bell and the Butterfly”) that otherwise fail to reach a mainstream audience. Though it’s easy to respect the good intentions involved in using the Academy Awards to call attention to such quirky, adventurous work, the declining public significance of the Oscar ceremony has undermined the ability of the yearly telecast to accomplish that purpose. Even after “Crash” won Best Picture in a little-seen Academy Awards broadcast in 2006, few Americans rushed out to see it, and that audacious, intensely moving film is now largely forgotten.
When the Academy of Motion Picture Arts and Sciences produced its first award ceremony on May 16, 1929, none of the movie moguls behind the celebration planned to use such occasions to call attention to under-appreciated art films that had escaped public attention. Instead, the whole purpose of the Academy was to add prestige and a patina of “class” to big studio productions that already appealed to a mass audience. Classic Best Picture winners managed to combine lavish budgets, epic ambition, and crowd-pleasing spectacle, like “Gone With the Wind” (1939), “Ben-Hur” (1959), “The Sound of Music” (1965), or “The Godfather” (1972). Only recently did the Academy begin making a habit of selecting Best Picture winners that clearly aimed at more limited, selective, sophisticated audiences, as did “The English Patient” (1996), “Shakespeare in Love” (1998), “A Beautiful Mind” (2001), “Million Dollar Baby” (2004) and “Crash” (2005).
In a sense, this alteration in emphasis reflected the changed status of movie-going from a wildly popular form of entertainment with universal appeal to a specialized interest appealing primarily to niche audiences (particularly young singles). The numbers tell a dramatic tale of the sharp decline in the movie-going habit. In 1930, nearly two-thirds of Americans went to the movies in any given week. By 1960, with the advent of TV and other entertainment alternatives, the number had declined to about one-third. The next ten years saw a disastrous collapse in movie-going (associated, I’ve long argued, with the new emphasis on harsher, darker, more violent fare): weekly attendance in the US stood at 40 million in 1960, then lost more than half its audience by 1970, falling to 18 million in the “Easy Rider”/”Midnight Cowboy” era. The motion picture industry never recovered, hovering around 25 million weekly filmgoers in recent years despite the huge increases in population.
Of course, part of the reason for the stagnant ticket sales (aside from the sharply escalating price of tickets) involves countless new entertainment alternatives – video games, DVDs, home theatre systems, the internet, and so forth. Regardless of the cause, however, the effect remains undeniable: motion pictures no longer appeal to everyone, and a full third of the country fails to go to the local multiplex even once in the course of a year.
In this context, the Oscar show can’t possibly wield the impact it once did, and the proliferation of alternative televised award shows (many catering to the public’s alternate interests) further shrinks the Academy Award audience. This year, gluttons for punishment can view, to name just a few possibilities, “The People’s Choice Awards,” “The American Music Awards,” “The Country Music Awards,” “The Grammy Awards,” “The Emmy Awards,” “The MTV Awards,” “The Golden Globes,” “The Tony Awards,” “The Independent Spirit Awards,” “The Screen Actors Guild Awards,” “The NAACP Image Awards” and, if you’re really desperate, even “The Critics Choice Awards” (sponsored by the Broadcast Film Critics Association, a group of which I’m a voting member).
By the time the Oscars finally roll around, we’ve all experienced award fatigue; it’s hardly surprising that the old air of electric anticipation has become difficult to recapture.
Many conservatives will look at the ailing Oscars with ill-disguised glee: anything that indicates tough times for liberal Hollyweird seems worthy of celebration.
There are reasons, however, to see the eclipse of the Academy Awards, however tacky and decadent they may seem, as an unwelcome development for those who believe in the importance of a unified pop culture rather than the current craze for a gorgeous, multicultural mosaic of diversity. It’s worth noting that the nation felt much less fragmented fifty years ago than it does today in part because the public chose from far fewer entertainment alternatives. If you wanted to watch the tube, you viewed one of three networks – without a hundred cable options plus DVDs to fit your fancy. If you liked movies, you chose each week from the big releases of the eight big Hollywood studios, with few odd-ball “indie” options and none of the thousands of titles you can now consider at Blockbuster. Even big-time sports provided fewer options, with baseball the dominant national pastime and virtually everyone choosing to support one of the sixteen well-established teams in the American and National Leagues.
Of course, this situation offered far less choice than we enjoy today, and far less capacity to indulge eccentric individualism. Nevertheless, it provided a largely unified frame of reference in school, at work, in the neighborhood. Nearly everyone saw “I Love Lucy” or “Father Knows Best,” so everyone could talk about them. When “Around the World in Eighty Days” won Best Picture in 1956 (beating out DeMille’s Biblical epic, “The Ten Commandments”), tens of millions could discuss the upset because so many people had seen both films – even the young kids in elementary school at the time.
The Oscar ceremony once provided a unifying moment of secular solemnity that helped bring the nation together. It didn’t matter if you were young or old, rich or poor, high school dropout or college grad, immigrant or native born, loony lefty or crazy conservative: it still felt enjoyable and important to tune in to Hollywood’s big show. Optimistically, Academy Award promoters still like to compare their glittering night to the Super Bowl as the two yearly occasions that command truly national attention.
In truth, there’s no longer much comparison: the most recent Super Bowl set ratings records with 97.5 million viewers. The most recent Oscar show (a year before the feeble and star-crossed offering scheduled for this Sunday night) captured only 39.9 million.
Looking forward to that event, it’s true that there’s no real reason to care whether the Academy voters give their top prize to a dark, violent, nihilistic movie like “No Country for Old Men,” or else choose to reward a dark, violent, nihilistic movie like “There Will Be Blood.” Nor should we invest too much emotional energy or prayer in the possibility that a gentler, more life-affirming film like “Juno” could slip through in an upset.
It does matter, however, that a time-honored, slightly nostalgic and tradition-bound occasion that once helped to define our common culture as Americans now looks less relevant and less significant than ever before.