How We Got Here: “A Nonexistent Piece of the Future”

Hi, everyone. Sam here. Chris didn’t put up a HWGH this week, and in light of recent events, y’all clearly need a pick-me-up, so here’s part of a never-published manuscript I wrote many years ago called The Night Was Spent that suddenly seems relevant to the coming midterm elections. Part 1 can be found here and Part 2 here.

In October 2007, The Daily Show with Jon Stewart was at its peak of popularity and critical acclaim. The show had begun in 1996 as a loose parody of news programs, hosted by comedian Craig Kilborn, but creator Lizz Winstead had always envisioned Stewart as host, and he got his chance in 1999, just in time for the 2000 election and the 9/11 attacks to turn America into a nation of news junkies. Although Stewart and his cast and crew set out to playfully scrutinize the media itself, the various Presidential and Congressional scandals of the mid-to-late 2000s were too juicy not to mock for all they were worth. Indeed, many of those scandals probably would not have had the impact they did without The Daily Show, especially the then-ongoing crisis in the Justice Department. In turn, these scandals gave the show a much-needed sense of direction whose absence would be keenly felt when the Bush years ended.

So it came as quite a shock when, on November 1, 2007, Stewart abruptly announced that The Daily Show would go off the air indefinitely. The show was far from alone. The Writers’ Guild of America, the union and governing body of the nation’s credited screenwriters, was days away from announcing a strike. Its goals: to win residuals (royalty payments) for writers from streaming video services such as Hulu, which had launched only days before, on October 29, and to double the residual rate on DVD sales from 0.3% to 0.6%.

Although the WGA’s collective bargaining agreement also covered Hollywood screenwriters, the bulk of its members were the much more numerous television writers, who mostly worked in teams. Thus, the impact of the strike was felt most broadly and lastingly in television. Talk shows were the first to go off the air, since they were mostly written and recorded the same day they were broadcast. Then scripted sitcoms and dramas departed; many episodes were left untitled because their writers had walked off the job before naming them. Projects in development were canceled before filming could begin. Others were drastically truncated. Writing the first season of the sleeper hit Breaking Bad, showrunner Vince Gilligan had planned to end the season after nine episodes with the death of supporting character Jesse Pinkman, played by Aaron Paul. When the strike forced the AMC network to cut the episode order back to seven, Jesse lived, and Paul got a career.

For the WGA’s public relations, the strike couldn’t have come at a more advantageous time. Television in America was being celebrated as never before. For as long as the medium had existed, conventional wisdom had dismissed it as an opiate of the masses, beholden to corporate advertisers and limited budgets, and forever inferior to the true art of film. But things began to change in the 1990s. Shows like Twin Peaks and Northern Exposure expanded the public’s conception of what TV was capable of, while the likes of NYPD Blue pushed the broadcast networks to their limits, in terms of both racy content and narrative complexity. Cable took things even further. HBO’s The Sopranos proved that exclusive paid content could actually become a pop cultural phenomenon.

The most elemental changes, however, were technical. First, television became more like film. Dramas had almost always been shot on film, in a closed studio, with a single-camera setup; in the 2000s, more and more comedy and variety shows began to do the same, a style adapted from Great Britain.

Second was widescreen. Before the 2000s, all TV shows were presented in a 4:3 aspect ratio (that is, the image is a third wider than it is tall), as that had been the industry standard for film when television was invented. In the 1950s, movie studios rapidly switched to various widescreen formats (such as 1.85:1, or the even wider 2.39:1 “CinemaScope”) to create a moviegoing experience that couldn’t be replicated on the small screen. In the late 1990s, television finally caught up, reshaping its screens to the wider 16:9 ratio that remains the standard today. This, in turn, led television to take on a more cinematic visual style and language. It’s no coincidence that traditional multi-camera sitcoms performed in front of live audiences suffered a precipitous decline when this happened.

Third and most importantly, DVD arrived. Before the 2000s, few TV shows were ever sold to the public on home video, at least not as complete sets: commercial VHS tapes were simply too bulky on the shelf, and a single cassette couldn’t hold much more than two hours of footage. DVDs, which could store up to four hours per disc and fit into a storage space just a quarter of an inch thick, made it practical for years’ worth of television series to be purchased and placed on the shelf. Suddenly, people could watch their favorite shows whenever they wanted. In the early 2000s, most series that got DVD releases were at least a couple of years old, but that changed in 2005, when The Office, an American adaptation of a British comedy, released its first season on DVD shortly after it finished airing. Many in the media reacted with shock at such a quick release, but within a year it was standard practice.

On the March 16, 2007 edition of the public radio show This American Life, host Ira Glass declared, “…the most interesting idea you hear about television right now is this idea that– right now, all of us– we are living through the Golden Age of Television.” At the time, the term “Golden Age of Television” had always referred to the 1950s, when TV in America was still developing its conventions and had the nation’s undivided attention; even now, the term “Golden Age” is viewed uneasily, but the sentiment of Glass’s statement was wholly uncontroversial. In previous decades, people would brag that they never watched TV. In 2007, anyone who said such a thing was ridiculed as pretentious and out of touch.

If anyone in the television business suffered from this change, it was the broadcast networks, who saw more and more of their market share disappear as cable became the vanguard of what critic Alan Sepinwall only-slightly-jokingly called “the Revolution.” Although the networks were still the most-watched channels at any given time in 2007, they no longer controlled the market for content: the proliferation of digital recording and DVD had freed viewers from having to watch certain shows at certain times and opened the floodgates for original content on cable, while the networks, as users of the public airwaves, were subject to crippling restrictions on what they could show. If anything, the networks’ residual dominance at this time was self-defeating, since they received far more pitches from eager producers than anyone else, and thus turned far more of them down. Still, the networks did experiment with high-end serialized shows. Most of them flopped, but Fox’s 24 and ABC’s Lost were among the most talked-about shows– hell, the most talked-about things– in offices, at schools, and at dinner tables.

Not only had television achieved the respect given to cinema, but it had even developed its own distinct auteur theory. Because of the rapid turnaround between episodes of a TV series, and because individual episodes were handled by a rotating roster of directors, authorship of a TV show was credited not to its directors, as in film, but to the show’s head writer, a term soon to be replaced in America by the more authoritative coinage “showrunner.” The overall sense of goodwill toward the television industry carried over into the strike.

None of this is to say that film had receded from public interest. In fact, 2007 was regarded by many professional film critics as the greatest year in American cinematic history. That they had the chutzpah to say such a thing even in the moment speaks to the sheer volume of classic films that came out that year, among them I’m Not There, Superbad, Zodiac, Once, Ratatouille, Persepolis, Grindhouse, Hot Fuzz, The King of Kong: A Fistful of Quarters, Eastern Promises, Sunshine, No Country for Old Men, There Will Be Blood, Charlie Wilson’s War, Before the Devil Knows You’re Dead, and The Assassination of Jesse James by the Coward Robert Ford. But the film industry was nevertheless changing in a way that most Americans would come to see as negative. The success of Peter Jackson’s Lord of the Rings trilogy had driven Hollywood to depend heavily on blockbusters with recognizable titles (and thus a built-in audience); of the ten highest-grossing films of 2007, all were sequels, remakes, or adaptations, and the majority were panned by critics. The budgets for these films had gotten so out of control that the industry could no longer rely on domestic audiences to turn a profit; a film’s appeal abroad now took priority. In the words of film historian Bridget Murnane, “we don’t make movies for Americans anymore.”

The organization contesting the Writers’ Guild’s demands was the Alliance of Motion Picture and Television Producers, or AMPTP. The last time the WGA had gone on strike, in 1988, to demand residuals from home video in the first place, it lacked public support, as the five-month ordeal left the American people with nothing to watch for the rest of the year. This time, the public was rallying behind the writers, and the AMPTP accordingly kept a low profile. Only two major executives spoke out on behalf of their billionaire brethren. One was Barry Diller, former CEO of Paramount and Fox. Normally a rare progressive voice in big business, Diller declared, “There are no profits for the work that writers do that is then digitized and distributed through the Internet.” At the time, of course, he himself was profiting from a vast wealth of online content as CEO of InterActiveCorp. The other was Diller’s protégé and one of the most hated men in America: Michael Eisner.

Born in 1942, Eisner had gotten his start in television programming, mainly working under Diller at ABC. When Diller took over Paramount, he brought Eisner with him in 1976, and the company enjoyed a long string of successes on both big and small screens. When Diller left for Fox in 1984, Eisner was unexpectedly passed over to replace him; instead, he took the job of CEO at the Walt Disney Company. In 1966, the same year that Eisner began his career, Walt Disney died, creating a severe leadership vacuum that went unfilled for nearly two decades and resulted in multiple takeover attempts. The company’s product had suffered, too, with a slew of unoriginal, blandly inoffensive films. In the words of Disneyologist Tony Goldmark, “[the company] was too busy wondering ‘what Walt would have done.’” Disney was in such dire straits that when Eisner took over in 1984, most viewed it as a humiliating step down for him. A decade later, he was a hero, a god among men who had saved an American institution and restored it to its former glory.

How much credit Eisner deserves for saving the Walt Disney Company is debatable. In 1986, he actually proposed shutting down the company’s feature animation department, then on the brink of its vaunted “Renaissance” of 1989-99. Most of Eisner’s plans for the 1990s involved expanding Disney’s theme park and television empire. To that end, he acquired his old network, ABC, and turned it into a mouthpiece for the company. What upset people most, though, inside the company and out, was that Eisner wasn’t a team player. He refused to treat Disney’s legacy as anything but a transaction, and in failing to recognize the company’s unique position in American culture, he helped bring down what he had once saved. Most famously to fans, he commissioned the company’s TV animation wing, DisneyToon Studios, to create over a dozen off-brand sequels to Disney’s signature animated features. These were decried as cheap cash-ins, and both fans and investors worried that Eisner was digging Disney into another hole by cheapening its brand. The final straw was the Ovitz affair: in 1995, Eisner engineered a hostile appointment (an appointment made without the involvement of the Board of Directors) of Michael Ovitz to the presidency of Disney. Ovitz lasted only fourteen months, but Eisner agreed to give him a severance package worth over $100 million. The Board of Directors revolted, with many members actively campaigning to have Eisner fired. He left in disgrace in 2005, shortly after finally killing the hand-drawn feature animation department. So for Eisner to come out two years after his ouster to decry the “stupid strike” over “a nonexistent piece of the future” only benefited the writers’ cause.

By mid-December, the television listings had become startlingly spare. The broadcast networks scrambled to replace their primetime scripted programs with reality shows. So-called “reality TV” had been a boon to television producers for nearly a decade, since such shows supposedly required neither actors nor writers. In fact, one of the biggest consequences of the WGA strike was the revelation that reality shows were not only scripted, but scripted by non-union scabs. By late 2007, the blockbuster novelty of these programs had long since worn off, and they had largely been relegated to cable.

Thus, the networks mostly responded to the dearth of content by airing more news coverage. The biggest news, of course, was that Barack Obama had defeated Hillary Clinton in the Iowa caucuses. The obvious conclusion for the casual viewer, unaware that the Iowa caucuses often produced dark-horse victories, was that Obama might actually have a chance to win the nomination– even the presidency. This perception may have been a deciding factor in Obama’s ultimate success.

In addition to being the first programs to disappear from the airwaves, late-night talk shows were also the first to return. Worldwide Pants, the production company behind CBS’ The Late Show, was owned by the show’s host, David Letterman, who struck an interim agreement with the WGA and announced his return to the 11:30 slot on December 28, 2007. This pushed the other networks to force their own late-night shows back into production. By mid-January of 2008, all of the late-night talk shows had returned, adding fuel to the fire of Obama’s high-profile win.

This was not without controversy. During the strike, some shows, such as Saturday Night Live, had performed versions of their programs live on stage, without cameras, as a form of protest, and late-night had effectively served as the public relations wing of the strike. For the most part, that continued. While NBC’s Jay Leno and Carson Daly were accused of strike-breaking, the other hosts resumed their shows without scripts, effectively improvising every night. David Letterman and NBC’s Conan O’Brien refused to shave the beards they had grown in solidarity with the writers, a breathtaking fashion statement for any public figure of the time. Jon Stewart temporarily re-titled his show A Daily Show rather than The Daily Show, while his friend Stephen Colbert, whose Colbert Report followed The Daily Show directly, deliberately mispronounced his own name.

The most bizarre incident to come out of this involved none other than former Arkansas governor and Republican Iowa caucus winner Mike Huckabee. Over the course of 2007, Huckabee had appeared on both The Daily Show and The Colbert Report. Colbert’s persona on his own show was that of an egotistical, hyperactive manchild, so of course Colbert declared on air that he was the cause of Huckabee’s victory, having given him the “Colbert Bump.” In response, Conan O’Brien, host of NBC’s Late Night and next in line to host the more prestigious Tonight Show, declared that it was he who had caused Huckabee’s victory, because he had previously pulled the “Walker, Texas Ranger Lever,” a gag device that triggered video clips of actor and Huckabee supporter Chuck Norris, and because he had made Colbert popular just by mentioning him. Finally, Stewart claimed to have been responsible for the success of O’Brien, Colbert, and Huckabee. Before returning to air, the three hosts had agreed to create an improvised, farcical feud that would fill airtime, boost their ratings, keep viewers interested in the strike, and demonstrate that their shows were more fun than the stodgier Tonight Show and Late Show. It was a massive success on all fronts, culminating on February 4, when the three hosts made guest appearances on each other’s shows to “fight” one another.

The only person who didn’t profit was Huckabee, who lost the New Hampshire primary to John McCain and went on to win only seven more states. The same vacuum of scripted media that had boosted Obama proved to be Huckabee’s undoing. Despite his media savvy and his initial public persona as a self-effacing conservative whose Evangelical faith transcended party orthodoxy, Huckabee revealed himself in the actual debates to be a ruthless social conservative of a common mold, one who forcefully opposed many of the economic principles he had previously extolled. He withdrew from the race on March 4, long after it had become apparent that McCain would prevail.

Things weren’t nearly as simple for the Democrats. By the end of January, Barack Obama had defeated Hillary Clinton in Iowa and South Carolina, and Clinton’s air of inevitability was gone. The Writers’ Guild of America strike ended on February 12, after one hundred days. Without scripted content, sports and news programming had become the only reliable way for networks to keep people watching, which may have given Democratic voters more confidence in Barack Obama than they otherwise would have had. But even if that was the case, it was far from the only factor that led to his eventual victory. Though nobody could have seen it coming, a victory like Obama’s had been years in the making.