With the release of “Nashville” and “Jaws,” the summer of ’75 delivered both the culmination — and the beginning of the end — of that period. “Nashville” seemed to incarnate a film buff’s hopes for American movies. Here was an artist putting the machinery of popular culture to work for the sake of art, yet entering into the spirit of popular culture and partaking of its energy too. That was the dream: the power of popular art combined with the complexity of fine art, high and low not at war, and not blurred indistinguishably into each other, but embracing.
“Nashville” was debated in the mainstream press in a way that seems inconceivable now: The New York Times ran at least eight pieces about the movie, and editorial writers and critics weighed in with opinions and interpretations for months after the film opened. (The movie’s 25th anniversary isn’t going unnoted. The Times and Premiere have already run major pieces about Altman; Fox Television will broadcast a documentary about him, “Altman: On His Own Terms,” on August 13; the Academy of Motion Picture Arts and Sciences screened the film on June 22 in Los Angeles, with Altman and various cast and crew members in attendance; and, in November, Simon & Schuster will publish “The ‘Nashville’ Chronicles,” by the Newsday film critic Jan Stuart. Paramount will release the DVD version, offering its proper Panavision screen-aspect ratio, on August 15.)
But it was “Jaws” that captured the mass audience and really changed movies. It wasn’t the first big success of the boomer generation, but it was a hit on a scale no one had ever seen before. (Within a month of its release, the stock of MCA, the conglomerate that owned the film company that released “Jaws,” went up 22 points.) The aftereffects of “Jaws” rattled the world of film from top to bottom: Soon the artists were coming a cropper — Altman spent the rest of the decade creating ever-more-perverse head-scratchers; Coppola spent years on the debilitating “Apocalypse Now,” and seems never to have recovered his energy or concentration; Scorsese tripped himself up making the overambitious epic musical “New York, New York.” In 1977, George Lucas’ “Star Wars” was released, and the intellectual and art side of filmmaking and filmgoing has been scattered to the four winds ever since. Despite the occasional good movie, the news since has all been about technology, effects, gender, race and business.
In 1975, film was potentially the greatest of all the arts; in 2000, it’s one data stream among many. The hierarchical, centralized culture the baby boomers reacted against could be exclusionary, and its emphasis on ego and on greatness could be annoying. But it offered the possibility of something called “depth,” and it also provided a shared culture and language. The atomized, decentered culture we have now allows for horizontal ranging about; the new digital tools (and media) are irresistible; and the openness to cultural mixing is certainly a relief. But this mix-and-match culture can also seem shallow. If everything’s always available, why bother trying to unearth anything? (If it isn’t on a database, it doesn’t exist.)
A young Ivy League graduate I know made a success in arts journalism without ever having seen a Bergman picture. When she finally caught up with one, she was stunned to realize that there’d once been a time when people went to a movie theater to watch characters agonize and philosophize at each other. She hasn’t seen another Bergman since, and she hasn’t gone on to read any Scandinavian literature or to seek out other Swedish films either. In Altman’s “The Player,” a comedy about what has become of Hollywood, a young studio executive watches his career dissolve, and recovers his momentum only when he learns to stop worrying about integrity and depth. During my lunch with him, Altman observed wryly that one thing he could say for the executives he’d battled in the ’70s was that they cared enough about the work being done to get angry at you, and to hate your movies. Nowadays, when someone takes an idea upstairs for a decision, there’s nothing there but a computer.
Watched on videotape today, “Nashville” seems in its element in a way many movies don’t. It’s alive, and it doesn’t suffer from the fragmenting effects of stop-and-start, at-home viewing. This may be because Altman is instinctively drawn to multiple points of view and unresolved resolutions. It doesn’t exactly cohere, but it seems to bring our channel-surfing minds and experiences into some kind of loose relationship. It gives the impression of being a video installation rather than a routine feature; you can get the feeling that it’s playing on several monitors at once. Watching it made me think that one way of conceiving of TV is as movies gone to pieces and turned into wallpaper.
It also made me think that an upbeat way of looking at where we’ve arrived is this: We have been freed — perhaps against our will — of our attachment to the idea of art as a rebel activity, a gesture toward freedom made for the sake of the unconscious and revolution. Now it has become simply an activity some people pursue, and perhaps get something out of — as legitimate as (but no more vanguard than) business, cleaning, sports, science and child-rearing. “Nashville,” seen at this distance, looks like a snapshot of the moment when substance began to vaporize into information.