Grandmaster flashback
DB here:
Elsewhere I’ve sung the glories of Turner Classic Movies. Would that the other basic-cable staple, the Fox Movie Channel, were as committed to classic cinema. It’s curious that a studio with a magnificent DVD publishing program (the Ford boxed set, the Murnau/ Borzage one) is so lackluster in its broadcast offerings. Fox was one of the greatest and most distinctive studios, and its vaults harbor many treasures, including glossy program pictures that would still be of interest to historians and fans. Where, for instance, is Caravan (1934), by the émigré director Erik Charell who made The Congress Dances (1931)? Caravan‘s elaborate long takes would be eye candy for Ophuls-besotted cinephiles.
Occasionally, though, the Fox schedulers bring out an unexpected treat, such as the sci-fi musical comedy Just Imagine (1930). Last month, the main attraction for me was The Power and the Glory (1933), directed by William K. Howard from a script by Preston Sturges.
This was an elusive rarity in my salad days. As a teenager I read that it prefigured Citizen Kane, presenting the life of a tycoon in a series of daring flashbacks. I think I first saw it in the late 1960s at a William K. Everson screening at the New School for Social Research. I caught up with it again in 1979, at the Thalia in New York City, on a double bill with The Great McGinty (1940). In my files, along with my scrawls on ring-binder paper, is James Harvey’s brisk program note, which includes lines like this: “One of Sturges’ achievements was to make movies about ordinary people that never ever make us think of the word ‘ordinary.’” I was finally able to look closely at The Power and the Glory while doing research for The Classical Hollywood Cinema (1985). The UCLA archive kindly let me see a 16mm print on a flatbed viewer.
So after a lapse of twenty-eight years I revisited P & G on the Fox channel last month. It does indeed prefigure Kane, but I now realize that for all its innovations it belongs to a rich tradition of flashback movies, and it can be correlated with a shorter-term cycle of them. Rewatching it also teased me to think about flashbacks in general, and to research them a little. You see, I am very fond of what contemporary practitioners like to call broken timelines.
A trick, an old story
On our subject for today, the indispensable book, which ought to be brought back into print or archived online, is Maureen Turim’s Flashbacks in Film: Memory and History (Routledge, 1989). We may think of the flashback as a modern technique, but Turim shows that flashbacks have been a mainstay of filmic storytelling since the 1910s.
Although the term flashback can be found as early as 1916, for some years it had multiple meanings. Some 1920s writers used it to refer to any interruption of one strand of action by another. At a horse race, after a shot of the horses, the film might “flash back” to the crowd watching. (See “Jargon of the Studio,” New York Times for 21 October 1923, X5.) In this sense, the term took on the same meaning as then-current terms like “cut-back” and “switch-back.” There was also the connotation of speed, as “flash” was commonly used to denote any short shot.
But around 1920 we also find the term being used in our modern sense. You can find it in popular fiction; one short story has its female protagonist remembering something “in a confused flashback.” F. Scott Fitzgerald writes in The Beautiful and Damned of 1922:
Anthony had a start of memory, so vivid that before his closed eyes there formed a picture, distinct as a flashback on a screen.
At about the same time writers on theatre start to adopt the term and credit it to film. A historian of drama writes in 1921 of a play that rearranges story order:
The movies had not yet invented the flashback, whereby a thing past may be repeated as a story or a dream in the present.
Within film circles, there were signs of an exasperation with the device. One 1921 writer calls the flashback a “murderous assault on the imagination.” Turim quotes a New York Times review of His Children’s Children (1923):
For once a flash-back, as it is made in this photoplay, is interesting. It was put on to show how the older Kayne came to say his prayers.
In the same year, a critic discusses Elmer Rice’s On Trial, an influential 1914 stage play. Rice employs
a dramatic technique which up to its time was probably unique, though since then the ever recurrent “flash back” of the movies has made the trick an old story.
During the 1930s, although some critics and filmmakers employed older terms like “switch back” and “retrospect,” flashback seems to have become the standard label. It denoted any shot or scene that breaks into present-time action to show us something that happened in the past. It probably speaks to the intuitive and informal nature of filmmaking that writers and directors didn’t feel a need to name a technique they had been using confidently for two decades.
The early flashback films pretty much set the pattern for what would come later. Turim shows that all the sorts we find today have their precedents in the 1910s and 1920s. Adapting her typology a little bit, we can distinguish between character-based flashbacks and “external” ones.
A character-based flashback may be presented as purely subjective, a person’s private memory, as in A Letter to Three Wives or The Pawnbroker or Across the Universe. There’s also the flashback that represents one character’s recounting of past events to another character, a sort of visual illustration of what is told. This flashback is often based on testimony in a trial or investigation (Mortal Thoughts, The Usual Suspects), but it may simply involve a conversation, as in Leave Her to Heaven, Titanic, or Slumdog Millionaire. It can also be triggered by a letter or diary, as happens with the doubly-embedded journals in The Prestige.
An alternative is to break with character altogether and present a purely objective or “external” flashback. Here an impersonal narrating authority simply takes us back in time, without justifying the new scene as character memory or as illustration of dialogue. The external flashback is uncommon in classic studio cinema (although see A Man to Remember, 1938) but was common in the 1900s and 1910s and has returned in contemporary cinema. Typically the film begins at a point of crisis before a title appears signaling the shift to an earlier period. Recent examples are Michael Clayton (“Three days earlier”), Iron Man (“36 Hours Before”), and Vantage Point (“23 Minutes Earlier”).
In current movies, flashbacks can fall between these two possibilities. Are the flashbacks in The Good Shepherd the hero’s recollections (cued by him staring blankly into space) or more objective and external, simply juxtaposing his numb, colorless life with the past disintegration of his family? The point would be relevant if we are trying to assess how much self-knowledge he gains across the present-time action of the film.
Rationales for the flashback
What purposes does a flashback fulfill? Why would any storyteller want to arrange events out of chronological order? Structurally, the answers come down to our old friends causality and parallelism.
Most obviously, a flashback can explain why one character acts as she or he does. Classic instances would be Hitchcock’s trauma films like Spellbound and Marnie. A flashback can also provide information about events that were suppressed or obscured; this is the usual function of the climactic flashback in a detective story, filling in the gaps in our knowledge of a crime.
By juxtaposing two incidents or characters, flashbacks can enhance parallels as well. The flashbacks in The Godfather Part II are positioned to highlight the contrasts between Michael Corleone’s plotting and his father’s rise to power in the community. Citizen Kane’s flashbacks are famous for juxtaposing events in the hero’s life to bring out ironies or dramatic contrasts.
Of course, flashbacks need not explain or clarify things; they can make things more complicated too. We tend to think of the “lying flashback” as a modern invention (a certain Hitchcock film has become the prototype), but Turim shows that The Goose Woman (1925) and Footloose Widows (1926) did the same thing, although not with the same surprise effect. Kristin points out to me that an even earlier example is The Confession (1920), in which a witness at a trial supplies two different versions of a killing we have already (sort of) seen.
At the limit, flashbacks can block our ability to understand characters and plot actions. This is perhaps best illustrated by Last Year at Marienbad, but the dynamic is already there in Jean Epstein’s La Glace à trois faces (“The Three-Sided Mirror,” 1927).
I argue in Poetics of Cinema that, at bottom, flashbacks are tactics fulfilling a broader strategy: breaking up the story’s chronological order. You can begin the film at a climactic moment; once the viewers are hooked, they will wait for you to move back to set things up. You can create mystery about an event that the plot has skipped over, then answer the question through a flashback. You can establish parallels between past and present that might not emerge so clearly if the events were presented in 1-2-3 order. And you can justify the switch in time by setting up characters as recalling the past, or as recounting it to others.
Having a character remember or recount the past might seem to make the flashback more “realistic,” but flashbacks usually violate plausibility. Even “subjective” flashbacks usually present objective (and reliable) information. More oddly, both memory-flashbacks and telling-flashbacks usually show things that the character didn’t, and couldn’t, witness.
I don’t suggest that recollections and recountings are merely alibis for time-juggling. They bring other appeals into the storytelling mix, such as allegiance with characters, pretexts for point-of-view experimentation, and so on. Still, the basic purpose of nonchronological plotting, I think, is to pattern information across the film’s unfolding so as to shape our state of knowledge and our emotional response in particular ways. Scene by scene and moment by moment, flashbacks play a role in pricking our curiosity about what came before, promoting suspense about what will happen next, and enhancing surprise at any moment.
A trend becomes a tradition
When The Power and the Glory was released in August 1933, it was part of a cycle of flashback films. The Trial of Mary Dugan (1929), The Trial of Vivienne Ware (1932), and other courtroom films rendered testimony in flashbacks. A film might also wedge a brief or extended flashback into an ongoing plot. The most influential instance was probably Smilin’ Through (1931), which is notable for using a crane shot through a garden to link present and past.
Also well-established was the extended insert model. Here we start with a critical situation that triggers a flashback (either subjective or external), and this occupies most of the movie. Digging around, I found these instances, but I haven’t seen all of them; some apparently don’t survive.
- Behind the Door (1919): An old sea salt recalls life in World War I and, back in the present, punishes the man responsible for his wife’s death. A ripoff of Victor Sjöström’s Terje Vigen (1917)?
- An Old Sweetheart of Mine (1923): A husband goes through a trunk in an attic and finds a memento that reminds him of his childhood sweetheart. The pair grow up and marry, facing tribulations. At the end, back in the present, she comes to the attic with their kids.
- His Master’s Voice (1925): Rex the dog is welcomed home from the war. An extended flashback shows his heroic service for the cause, and back in the present he is rewarded with a parade.
- Silence (1926): A condemned man explains the events that led up to the crime. Back in the present, on his way to be executed, he is saved.
- Forever After (1926): On a World War I battlefield, a soldier recalls what brought him there.
- The Woman on Trial (1927): A defendant recalls her past.
- The Last Command (1928): One of the most famous flashback films of the period. An old movie extra recalls his life in service of the tsar.
- Mammy (1930): A bum reflects on the circumstances leading him to a life on the road.
- Such is Life (1931): A ghoulish item. A fiendish scientist confronts a young man with the corpse of the woman he loves. A flashback to their romance ensues.
- The Sin of Madelon Claudet (1931; often cablecast on TCM): A young wife bored with her husband is told the story of a neighbor woman who couldn’t settle down.
- Two Seconds (1932): A man about to be executed remembers, in the two seconds before death, what led him here. A more mainstream reworking of a premise of Paul Fejos’s experimental Last Moment (1928), which is evidently lost.
An interesting variant of this format is Beyond Victory, a 1931 RKO release. The plot presents four soldiers on the battlefield, each one recalling his courtship of the woman he loves back home. The principle of assembling flashbacks from several characters was at this point prised free of the courtroom setting, and multiple-viewpoint flashbacks became important for investigation plots like Affairs of a Gentleman (1934), Through Different Eyes (1942), The Grand Central Murder (1942), and of course Citizen Kane, itself a sort of mystery tale.
Why this burst of flashback movies? It’s a good question for research. One place to look would be literary culture. The technique of flashback goes back to Homer, and it recurs throughout the history of both oral and written narrative. Literary modernism, however, made writers highly conscious of the possibility of scrambling the order of events. From middlebrow items like The Bridge of San Luis Rey (1927) to high-cultural works by Dos Passos and Faulkner, elaborate flashbacks became organizing principles for entire novels. It’s likely that Sturges, a Manhattanite of wide literary culture, was keenly aware of this trend.
It’s just as likely that he noticed similar developments in another medium. By 1931, when Katharine Seymour and J. T. W. Martin published How to Write for Radio (New York: Longmans, Green), they could devote considerable discussion to frame stories and flashbacks in radio drama (pp. 115-137). Especially interesting for Sturges’ film, radio programs were letting the voice of the announcer or the storyteller drift in and out of the action that was taking place in the past.
For whatever reasons, the technique became more common. The year 1933 saw several flashback films besides The Power and the Glory. In the didactic exploitation item Suspicious Mothers, a woman recounts her wayward path to redemption. Mr. Broadway offers an extensive embedded story using footage from another film (a common practice in the earliest days). Terror Aboard begins with the discovery of corpses on a foundering yacht, followed by an extensive flashback tracing what led up to the calamity. A borderline case is the what-if movie Turn Back the Clock (1933). Ever-annoying Lee Tracy plays a small businessman run down by a car. Under anesthesia, he reimagines his life as it might have been had he married the girl he once courted. Call it a rough draft for the “hypothetical flashbacks” that Resnais was to exploit in his great La Guerre est finie.
The point of this cascade of titles is that in writing The Power and the Glory, Sturges was working with a set of conventions already in wide circulation. His inventiveness stands out in two respects: the handling of voice-over and the ordering of the flashbacks.
Now I’m about to divulge details of The Power and the Glory.
Narratage, anyone?
The film begins with what became a commonplace opening gesture of film, fiction, and nonfiction biography: the death of the protagonist. We are at the funeral of Thomas Garner, railroad tycoon. His best friend and assistant Henry slips out of the service. After visiting the company office, Henry returns home. Sitting in the parlor with him, his wife castigates Garner as a wicked man. “It’s a good thing he killed himself.” So we have the classic setup of retrospective suspense: We know the outcome but become curious about what led up to it.
Henry’s defense of Garner launches a series of flashbacks. As a boyhood friend, Henry can take us to three stages of the great man’s life: adolescence, young manhood, and late middle age. Scenes from these time periods are linked by returns to the narrating situation, when Henry’s wife will break in with further criticisms of Garner.
Sturges boasted in a letter to his father: “I have invented an entirely new method of telling stories,” explaining that it combines silent film, sound film, and “the storytelling economy and the richness of characterization of a novel.” At the time, the Paramount publicists trumpeted that the film employed a new storytelling technique labeled narratage, a wedding of “narrating” and “montage.” One publicity item called it “the greatest advance in film entertainment since talking pictures were introduced.” Hyperbole aside, what did Sturges have in mind?
There is evidence that some screenwriters were rethinking their craft after the arrival of sound filming. Exhibit A is Tamar Lane’s book, The New Technique of Screen Writing (McGraw-Hill, 1936). Lane suggests that the talking picture’s promise will be fulfilled best by a “composite” construction blending various media. From the stage comes dialogue technique and sharp compression of action building to a strong climax. From the novel comes a sense of spaciousness, the proliferation of characters, a wider time frame, and multiple lines of action. Cinema contributes its own unique qualities as well, such as the control of tempo and a “pictorial charm” (p. 28) unattainable on the stage or page.
Vague as Lane’s proposal is, it suggests a way to think about the development of Hollywood screenwriting at the time. Many critics and theorists believed that the solution to the problem of talkies was to minimize speech; this is still a common conception of how creative directors dealt with sound. But Lane acknowledged that most films would probably rely on dialogue. The task was to find engaging ways to present it. Several films had already explored some possibilities, the most notorious probably being Strange Interlude (1932). In this MGM prestige product, the soliloquies spoken by characters in O’Neill’s play are rendered as subjective voice-over. The result, unfortunately, creates a broken tempo and overstressed acting. A conversation will halt, and through changes of facial expression the performer signals that what we’re now hearing is purely mental.
The Power and the Glory responds to the challenge of making talk interesting in a more innovative way. For one thing, there is the sheer pervasiveness of the voice-over narration. We’re so used to seeing films in which the voice-over commentary weaves in and out of a scene’s dialogue that we forget that this was once a rarity. Most flashback films in the early sound era had used the voice-over to lead into a past scene, but in The Power and the Glory, Henry describes what we see as we see it.
Most daringly, in one scene Henry’s voice-over substitutes for the dialogue entirely. Young Tom and Sally are striding up a mountainside, and he’s summoning up the nerve to propose marriage. What we hear, however, is Henry at once commenting on the action and voicing the lines the couple speak, though their own voices are never heard.
This scene, often commented upon by critics then and now, seems to have exemplified what Sturges late in life recalled “narratage” to be. Describing that technique in his autobiography, he wrote: “The narrator’s, or author’s, voice spoke the dialogue while the actors only moved their lips” (p. 272).
So one of Sturges’ innovations was to use the voice-over not only to link scenes but to comment on the action as it played out. In her pioneering book Invisible Storytellers: Voice-Over Narration in American Fiction Film (University of California Press, 1988), Sarah Kozloff has argued that the pervasiveness of Henry’s narration has no real precedent in Hollywood, and few successors until 1939 (pp. 31-33). (There’s one successor in Sacha Guitry’s Roman d’un tricheur.) The novelty of the device may have led Sturges and Howard toward redundancies that we find a little labored today. The transitions into the past from the frame story are given rather emphatically, with Henry’s voice-over aided by camera movements that drift away from the couple. (Compare the crisp shifts in Midnight Mary, below.) Henry’s comments during the action are sometimes accentuated by diagonal veils that drift briefly over the shot, as if assuring us that this speech isn’t coming from the scene we see.
The “montage” bit of “narratage” also invokes the idea of a series of sequences guided by the voice-over narrator. The concept might also have encompassed the most famous innovation of The Power and the Glory: Sturges’ decision to make Henry’s flashbacks non-chronological.
Even today, most flashback films adhere to 1-2-3 order in presenting their embedded, past-tense action. But Sturges noticed that in real life people often recount events out of order, backing and filling or free-associating. So he organized The Power and the Glory as a series of blocks. Each block contains several scenes from either boyhood, youth, or middle age. Within each block, the scenes proceed chronologically, but the narration skips around among the blocks.
For example, a block of boyhood scenes gives way to a set showing Garner, now in middle age, ordering around his board of directors. The next cluster of flashbacks returns to Garner’s youth and his courtship of his first wife, Sally. Then we are carried back to his middle age, with scenes showing Garner alienated from Sally and his son Tommy but also attracted to the young woman Eve. And from there we return to Garner’s early married life with Eve.
To keep things straight, Sturges respects chronology along another dimension. Not only do the scenes within each block follow normal order, but the plotlines developing across the three phases of Garner’s life are given 1-2-3 treatment. In one block of flashbacks, we see Tom and Sally courting. When we return to that stage of their lives in another block, they are happily married. The next time we see Garner as a young man, he is improving himself by attending college. The later romance with Eve develops in a similar step-by-step fashion across the blocks devoted to middle age.
A major effect of the shuffling of periods is ironic contrast. Maureen Turim points out that seeing different phases of Garner’s life side by side points up changes and disparities. In his youth, Tom watches the birth of his son with awe; in the next scene, we are reminded what a wastrel young Tommy turned out to be.
The juxtaposition of time frames also nuances character development. As Sally ages, she turns into something of a nag, quarreling with her husband and pampering Tommy. But in the next sequence we see her young, ambitiously pushing Tom to succeed and willing to undergo sacrifice by taking up his job as a railroad track-walker. The next scenes show Tom in class and in a bar while Sally walks the desolate tracks in a blizzard. She has given up a lot for her husband. In the next scene, set in middle age, Garner confesses his love to Eve but says he could never leave Sally, and the juxtaposition with Sally’s solitary track-walking suggests that he recognizes her sacrifice. And in the following scene, when Sally comes to Garner’s office, she admits that she has become disagreeable and asks if they couldn’t take a trip to reignite their love. The juxtaposition of scenes has turned a caricatural shrew into a woman who is a more complex mixture of devotion, disenchantment, and self-awareness.
Other characters aren’t given this degree of shading—Tommy is pretty much a wastrel, Eve a vamp—but another married couple deepens the central parallel. Meek Henry is dominated by his wife, but by the end she is chastened by what she learns of Garner’s real motives. Critic Andy Horton, in his helpful introduction to Sturges’ published screenplay, indicates that this couple adds a note of contentment to what is otherwise a pretty sordid melodrama of adultery and quasi-incest.
The innovative flashbacks and voice-overs are an important part of the film’s appeal, but director William K. Howard supplied some craftsmanship of his own. Particularly striking are some silhouette effects, low angles, and deep-focus compositions that underscore the parallels between Sally’s suicide and Garner’s impending death.
The original screenplay suggests that Sturges intended to push his innovations further. About halfway through, he starts to break down the time-blocks. In the script, Sally visits Garner while he’s working on a bridge. The next scene shows their son Tommy already grown and spoiled, being taken back into his father’s good graces. Then the script returns to the bridge, where Sally tells Tom she’s pregnant. The interruption of the bridge scene reminds us of how badly their child turned out.
The script jumps back to the birth of the baby. In the film the birth scene plays out in its entirety, but in the screenplay Sturges interrupts it with the scene (retained in the film) showing Garner’s marriage to Eve. The final moments of the birth scene, when Garner prays (“Thou art the power and the glory”), become in the script the very end of the film. Coming after Tom’s death at the hand of his son, this epilogue is a bitter pill, rendered all the harder to take by providing no return to Henry and his wife.
The greater fragmentation of the second part of the script, along with Garner’s death as a sort of murder-suicide and the failure to return to the narrating frame, is striking. It’s as if Sturges felt he could take more chances, counting on his viewers’ familiarity with current flashback conventions and on his film’s firmly established time-shuttling method. But if, as sources report, Sturges’ script was initially filmed exactly as written, then it seems likely that the film’s June 1933 preview provoked the changes we find in the finished product. “The first half of the picture,” he remarked in a letter, “went magnificently, but the storytelling method was a little too wild for the average audience to grasp and the latter half of the picture went wrong in several spots. We have been busy correcting this and the arguments and conferences have been endless.”
Even the compromised film proved difficult for audiences. Tamar Lane, proponent of the “composite” form suitable for the sound cinema, felt that the “retrospects” in The Power and the Glory were too numerous and protracted. Nonetheless, he praised it for its “radical and original cinema handling” (p.34). That handling rested upon tradition—a tradition that in turn encouraged innovations. Once flashbacks had become solid conventions, Sturges could risk pushing them in fresh directions.
Mary remembers
Finally, two more flashy flashback movies from 1933. Some spoilers.
Midnight Mary (MGM, William Wellman) works a twist on the courtroom template. The defendant Mary Martin is introduced jauntily reading a magazine while the prosecutor demands that the jury find her guilty of murder. This also sets up a nice little motif of shots highlighting Loretta Young’s lustrous eyes. The motif pays off with a soft-focus shot of her in jail just before the climax.
As the opening scene ends, Mary is led to a clerk’s office to wait for the verdict. There’s an automatic dose of suspense (Will she be found guilty?) but there’s also considerable curiosity: Whom has she killed? How was she caught?
These questions won’t be answered for some time. Lounging in the clerk’s office, Mary runs her eye across the annual reports filling his shelves. The flashbacks, which comprise most of the film, are introduced as close-ups of the volumes’ spines—1919, 1923, 1926, 1927, and so on up to the present. They serve as neatly motivated equivalents of those clichéd calendar pages that ripple through montage sequences of the 1930s.
The flashbacks are motivated as subjective; Mary doesn’t recount her life to the clerk but simply reviews it in her mind. Unlike the flashbacks in The Power and the Glory, they are chronological and without gaps. Nothing is skipped over to be revealed later. As usual, though, once Mary’s recollections have triggered the rearrangement of story order, the flashbacks are filmed as any ordinary scenes would be, including bits of action that she isn’t present to witness. The film is a good example of using the extended-flashback convention chiefly to delay the resolution of the climactic action. Told in chronological order, Mary’s tale of woe would have had much less suspense.
Transitions between present and past are areas open to innovation, and early sound filmmakers took advantage of them. In Midnight Mary, the long flashback closes with gangsters pounding on the door of Mary’s boudoir; this sound continues across the dissolve to the present, with Mary roused from her reverie by a knock on the clerk’s office door. Earlier, one transition into the past begins with Mary blowing cigarette smoke toward the bound volumes on the shelf.
Dissolve to a close-up of one book as smoke wafts over it, and then to a shot of Mary’s gangster boyfriend blowing cigarette smoke out before he sets up a robbery.
At one point the narration supplies a surprise by abruptly shifting into the present. Once Mary has become a prostitute, she is slumped over a barroom table in sorrow, while her pal Bunny consoles her. In a tight shot, Bunny (Una Merkel, always welcome), leans over and says: “Oh, what’s the diff, Mary? A girl’s gotta live, ain’t she?”
Cut directly to the present, with Mary murmuring: “Not necessarily, Bunny. The jury’s still out on that.”
Mary’s reply casts Bunny’s question about needing to live in a new light, since Mary is facing execution, and the use of the stereotyped phrase, “The jury’s still out,” now with a double meaning, reminds us of the present-tense crisis. It is a crisper, more concise link than the transitions we get in The Power and the Glory. But then, Wellman has no need for continuous voice-over, which gives the Sturges/ Howard film its more measured pace.
Filmmakers were concerned with finding storytelling techniques appropriate to the sound film, and these unpredictable links between sequences became characteristic of the new medium. Similar links had appeared in silent films, but they gained smoothness and extra dimensions of meaning when the images were blended with dialogue or music. For more on transitional hooks, go here.
Nora and narratage
The hooks between scenes are perhaps the least outrageous stretches of The Sin of Nora Moran, a Majestic release that, thanks to a gorgeous restoration and a DVD release, has rightly earned a reputation as the nuttiest B-film of the 1930s.
It is a flashback frenzy, boxes within boxes. A District Attorney tells the governor’s wife to burn the apparently incriminating love letters she’s found. In explaining why, the D. A. introduces a flashback (or is it a cutaway?) to Nora in prison. We then move into Nora’s mind and see her hard life, the low point occurring when she’s raped by a lion tamer.
Now we start shuttling between the D. A. telling us about Nora and Nora remembering, or dreaming up, traumatic events. At some points, characters in her flashbacks tell her that what she’s experiencing is not real. In one hazy sequence, her circus pal Sadie materializes in her cell to remind Nora that she killed a man. (Actually, she didn’t.) At other points, Nora’s flashbacks include moments in which she insists that if she does something differently, the outcome of the story will change. Another character will then point out that they can’t change the outcome because it has already happened . . . of course, since this is a flashback.
By the end, after the governor has had his own flashback to the end of his affair with Nora and after she appears as a floating head, things have gotten out of hand. The rules, if there are any, keep changing. And the whole farrago is propelled by furious montage sequences built out of footage scavenged from other films.
Publicity and critical response around The Sin of Nora Moran implied that the movie followed the “narratage” method. There was surely some influence. Scenes contain fairly continuous voice-over commentary, and director Phil Goldstone occasionally drops in the diagonal veil used in The Power and the Glory. But on the whole this delirious Poverty Row item falls outside the strict contours of Sturges’ experiment. Nora Moran blurs the line separating flashbacks and fantasy scenes, and it illustrates how easily we can lose track of what time zone we’re in. Watching it, I had a flashback of my own—to Joseph Cornell’s Rose Hobart, another compilation revealing that Hollywood conventions are only a few steps from phantasmagorias.
Unwittingly, Nora Moran’s peculiarities point forward to the flashback’s golden age, the 1940s and early 1950s. Then we got contradictory flashbacks, flashbacks within flashbacks within flashbacks, flashbacks from the point of view of a corpse (Sunset Boulevard) or an Oscar statuette (Susan Slept Here). Filmmakers knew they had found a good thing, and they weren’t going to let it go.
The original screenplay of The Power and the Glory is included in Andrew Horton, ed., Three More Screenplays by Preston Sturges (Berkeley: University of California Press, 1998). Sturges’ reflections from the late 1950s are to be found in Preston Sturges by Preston Sturges: His Life in His Words, ed. Sandy Sturges (New York: Simon and Schuster, 1990). The quotations from Sturges’ letters and from publicity about “narratage” can be found in Diane Jacobs, Christmas in July: The Life and Art of Preston Sturges (Berkeley: University of California Press, 1992), 123-129 and James Curtis, Between Flops: A Biography of Preston Sturges (New York: Harcourt Brace Jovanovich, 1982), 87.
My citations of literary uses of the term come from Elliott Field, “A Philistine in Arcady,” The Black Cat 24, 10 (July 1919), 33; Fitzgerald’s The Beautiful and Damned (1922), available here, 433; Samuel A. Eliot, Jr., ed., Little Theater Classics vol. 3 (Boston: Little, Brown, 1921), 120; The Outlook (11 May 1921), 49, available here; review of His Children’s Children, quoted in Turim p. 29; commentary on On Trial, in The New York Times (25 March, 1923), X2.
For more on the history of flashback construction, apart from Maureen Turim’s Flashbacks in Film, see Barry Salt’s Film Style and Technology: History and Analysis, 2nd ed. (London: Starword, 1992), especially 101-102, 139-141. There are discussions of the technique throughout David Bordwell, Janet Staiger, and Kristin Thompson, The Classical Hollywood Cinema: Film Style and Mode of Production to 1960 (New York: Columbia University Press, 1985), especially 42-44.
P.S. 15 November 2015: 1940s flashback technique is surveyed in my Reinventing Hollywood: How 1940s Filmmakers Changed Movie Storytelling.
Categorical coherence: A closer look at character subjectivity
Subjective
[NOTE: There are some spoilers here, though I’ve tried to avoid giving away the ends of the films I mention. Teachers who show clips in class would probably want to do the same. Some of the films mentioned here would be good choices to show in their entirety to classes when they study Chapter 3 of Film Art.]
Kristin here—
We have had occasion to mention the Filmies list-serve of the Department of Communication Arts’ Film Studies area here at the University of Wisconsin-Madison. Current and past students and faculty, sometimes known as the “Badger squad,” share news, links, and requests for information. Once in a while a topic is raised for discussion. These exchanges are usually fascinating, and when we have felt that they might be of interest to a general audience, we have used them as the basis for blog entries. (See here and here.)
Another occasion arose recently, one which relates to the teaching of Film Art: An Introduction. Matthew Bernstein, of Emory University, queried the group about suggestions for teaching a section of the third chapter, “Narrative as a Formal System.” He found that the section, “Depth of Story Information,” gives some students trouble. They can’t grasp the distinction we make between perceptual and mental subjectivity.
Our definitions of the terms go like this. Perceptual subjectivity is when we get “access to what characters see and hear.” Examples are point-of-view shots and soft noises suggesting that the source is distant from the character’s ear. In contrast, “We might hear an internal voice reporting the character’s thoughts, or we might see the character’s inner images, representing memory, fantasy, dreams, or hallucinations. This can be termed mental subjectivity.”
We then go on to give a few examples. The Big Sleep has little perceptual subjectivity, while The Birth of a Nation contains numerous POV shots. We see the heroine’s memories in Hiroshima mon amour and the hero’s fantasies in 8 ½. Filmmakers can use such devices in complex ways, as when in Sansho the Bailiff, the mother’s memory ends not by returning to her in the present but by shifting to her son, who is apparently thinking of the same things at the same time.
Yeah, but what about …?
Filmmakers don’t always use these categories in straightforward ways. They may oscillate between perceptual and mental representations of events. Films may present narrative events that make the distinction between characters’ perceptions and their mental events ambiguous. That’s why in introducing the concepts, we stress that “Just as there is a spectrum between restricted and unrestricted narration, there is a continuum between objectivity and subjectivity.”
Easy to say, but not necessarily so easy to grasp for a student who’s trying to understand these categories. Students need to start with simple examples to familiarize themselves with the basic distinction before going on to what the likes of Resnais, Fellini, and Mizoguchi can throw at them. But some students immediately proffer exceptions. As Charlie Keil, of the University of Toronto, put it in his post on the subject, “I hardly need add that students, even the least analytically astute, border on the brilliant when it comes to suggesting examples that provide challenges to categorical coherence.”
Looking over the one and a half pages of the textbook that we devoted to depth of narration, you might find it fairly straightforward. Once you start looking for ways to teach the concepts without bogging down in too many nuances and subcategories, though, the passage does seem challenging. What follows is a suggestion about how someone might go about explaining and utilizing examples. It also points up some ways we might revise this passage of the book for its next edition.
What it’s not
One obstacle that Matthew has found is that students are used to thinking of “subjective” as meaning something like “biased,” as in “here’s my subjective opinion on that.” It might be useful to point out that this is only one meaning of the word. The dictionary definition that comes closest to the way we use it in Film Art is this: “relating to properties or specific conditions of the mind as distinguished from general or universal experience.” For film, we specify that “subjective” means either sharing the characters’ eyes and ears (properties) or getting right inside his or her mind (conditions).
Another misconception comes when students assume that the facial expressions, vocal tones, and gestures used by actors are subjective because they convey the characters’ feelings. Again, best to scotch that notion up front. Those techniques are projected outward from the character, and we observe them. Cinematic subjectivity goes inward.
One step at a time
Another distinction to stress early on is the basic one we make between film technique and function. There are a myriad of film techniques that could be used in either objective or subjective ways. To take a simple example, a low camera angle might indicate the POV of a character lying down and looking up at something; that’s perceptual subjectivity. A low angle of a skyscraper in an establishing shot might simply be the objective narration’s way of showing where a new scene will occur.
In contrast, indicating perceptual and mental subjectivity are two specific kinds of functions. Filmmakers can call upon whatever cinematic techniques they choose, and historically the more imaginative ones have shown immense creativity in trying to convey what characters see and think.
Some films depend heavily upon subjectivity, but objective narration is more common. There are many films that give us little access to characters’ perceptual and mental activities. Apart from The Big Sleep, there are films like Anatomy of a Murder, or virtually anything by Preminger. Most films use subjectivity sparingly. So let’s assume the students can tell that objective narration is the default.
With that out of the way, we can go back to basics. Both perceptual and mental subjectivity depend on being “with” the character in a strong way, as opposed to observing him or her as we would see another person in real life.
Perceptual subjectivity is fairly simple. The camera is in the character’s place, showing what he or she sees. The microphone acts as the character’s ears. Other characters present in the scene could step into the same vantage point and observe the same things.
So the test could run like this: As a viewer, when we see something in a film, is it really present within the scene? Could someone else in the same position see and hear the same things? Or is it a purely mental event, something no one else could see and hear, even if they stood beside the character or stepped into the place where that character had been standing?
If I were teaching the concept of narrational subjectivity as Film Art defines it, I would stress the notion of a continuum. After starting with very clear examples of both perceptual and mental subjectivity, I would progress to more ambiguous, mixed, or tricky instances.
Contributing to this discussion on the Filmies list, Chris Sieving, of the University of Georgia, says he shows a clip from Lady in the Lake, the film noir where the camera always shows the POV of the detective protagonist. As Chris says, “It seems to effectively get across what perceptual subjectivity means (and how awkward it is in large doses), as well as what is meant by a (highly) restricted range of narration (as is the case with most whodunits).” As Chris also says, Lady in the Lake is a sort of limit case, a film that depends as much on perceptual subjectivity as it’s possible to do. (That’s “our” hand taking the paper in the shot at the right.) One could show several minutes from any section of the film and make the point thoroughly.
There are other films that contain a lot of POV shots but intersperse them with objective ones. Rear Window is an obvious case. When Jefferies is alone, we see the courtyard events from his POV, a fact which is stressed by his use of binoculars and his long camera lenses to spy on his neighbors. This is clearly perceptual rather than mental. We never doubt that what we see through the hero’s eyes is real; we don’t believe that he’s making things up to entertain himself or because his mind is unbalanced. For one thing, there are two other characters who visit at intervals and see the same things that he does. We and they might question whether Jefferies’ interpretation of what he sees is correct, but we assume that the story’s real events have been conveyed to us through his eyes and ears. Only if we saw something like his fantasy of how he imagines his neighbor might have killed his wife would we move into the mental realm.
These two films foreground their use of POV. Usually, though, POV shots are slipped into the flow of the action smoothly. Scenes of characters looking at small objects or reading letters often cut to a POV shot to help us get a look at an important plot element. At intervals during Back to the Future, Marty looks at a picture of himself and his siblings; the figures in it gradually fade away to indicate that the three of them might never be born if he doesn’t succeed in bringing their parents together. It’s a simple way of reminding us what’s at stake and that Marty’s time to solve the problem is running out.
The Silence of the Lambs uses many POV shots and provides an excellent case of narration switching frequently and seamlessly between objective and subjective. Most of the POV views are seen through Clarice’s eyes, but sometimes through those of other characters. The first view of Lecter is a handheld tracking shot clearly established as what she sees as she walks along the corridor in front of the row of cells.
The scenes of Clarice conversing with Lecter develop from conventional over-the-shoulder shot/ reverse shot to POV shot/ reverse shot as both characters stare directly into the lens. (In Film Art, we use this device as an example of how style can shape the narrative progression of a scene [p. 307, 8th edition]). Other characters have POV shots as well, most noticeably, when Buffalo Bill twice dons night-vision goggles and we briefly see the world as he does.
Occasionally we see the POV of even minor characters, as when Lecter’s attack on his guard is rendered with a quick POV shot of Lecter lunging open-mouthed at the camera and then an objective shot of him grabbing the guard.
In their minds
Let’s jump to the other end of the continuum. Here we find a clear-cut use of mental subjectivity for fantasies, hallucinations, and the like. Obviously no one else present, unless he or she is posited as having special telepathic abilities, can see and hear what takes place only in a character’s mind.
Such fantasies are a running gag on The Simpsons, where Homer’s misinterpretations, distractions, and visions of grandeur are shown either in a thought balloon or superimposed on his skull. In the example above, he abruptly starts “watching” a little cartoon after assuring Lisa that she has his complete attention. No one could mistakenly assume that these exist anywhere but in Homer’s imagination.
In Buster Keaton’s Our Hospitality, the hero receives a letter telling him he has inherited a Southern home. After an image of him thinking, there is a dissolve to a view of a large, pillared house. Later, when he arrives at his small, ramshackle house, he stands staring in a similar situation, and here there is a fade-out to his dream house, which abruptly blows up. The humor in the second scene would be impossible to grasp if we didn’t easily understand that the shot of the house exists only in his mind.
Whole films can be built around fantasies. A large central section of Preston Sturges’ black comedy Unfaithfully Yours consists of a husband envisioning three different ways he might kill his supposedly cheating wife, all set to the musical pieces he is conducting at a concert. Fellini’s 8 ½ is only the most obvious example of how the art cinema often brings in fantasies. Other examples include Jaco van Dormael’s Toto le héros or Bergman’s Persona.
Once students have grasped the basic distinction between perceptual and mental subjectivity, it might be useful to emphasize the continuum by moving directly to its center, where the two types coexist.
The middle of the continuum: Ambiguity and simultaneity
Right in the middle of the continuum between the pure cases we find ambiguous cases. Filmmakers can create deliberate, complex, and important effects by keeping it unclear whether what we see is a character’s perception of reality or his/her imaginings.
1961 was a big year for ambiguous subjectivity, with two of the purest cases appearing. One was a genre picture, the other a controversial art film.
The first is the British horror film, The Innocents, directed by Jack Clayton and based on Henry James’s novella, The Turn of the Screw. In several scenes, a new governess at a country estate sees frightening figures whom she takes to be ghosts haunting the two children entrusted to her. We see these figures as she does, but we never see them except when she does. The children behave very oddly in ways that might be consistent with her belief, yet they deny seeing any ghosts. Finally the governess tries to force the little girl to admit that she also sees the silent female figure standing in the reeds across the water. Does the child’s horrified expression reflect her realization that the governess knows about her secret relationship with the ghosts? Or is she simply baffled and frightened by the governess’s increasingly frantic demands that she confess to seeing something that in fact she can’t see?
From the first appearance of the eerie figures, the question arises as to whether the governess is imagining the ghosts or whether they are real, controlling the children, who try to keep them secret. There are apparent clues for either answer. By the end, we arguably are no closer to knowing whether the ghosts are real or figments of the heroine’s imagination.
Since Last Year at Marienbad appeared, critics have spilled gallons of ink trying to fathom its symbolism and sort out the “real” story that it tells. Clearly there are contradictions in events, settings, and voiceover narration. The second of the three accompanying images depicts the heroine as the hero’s voiceover describes their first meeting. They talked about the statue that stands beside her, seen against the formal garden of walks lined with pyramid-shaped shrubs. Yet another version of the first meeting starts, this time with the characters and the same statue against a background of a large pool. Later scenes in these locales display the same inconsistencies.
Are such contradictions the result of one character’s fantasies or of the conflicting memories that two characters have of the same event? Or are the contradictions not the products of subjectivity at all but just the playfulness of an objective, impersonal narration, manipulating characters like game pieces to challenge the viewer? (Our analysis of Last Year at Marienbad is available here on David’s website. It and all the Sample Analyses that have been eliminated from Film Art to make room for new essays are available here as pdf’s; the index to them is here, about halfway down the page.)
It’s not a good example of subjectivity to show students, but just as an aside, the screwball comedy Harvey is an interesting case, a sort of reversal of the Innocents situation. The narration withholds a character’s perceptual and mental events. Elwood P. Dowd describes what he sees and hears: a six-foot talking rabbit invisible to us and to all the other characters. The story concerns whether Dowd’s relatives will institutionalize him for insanity, and we assume along with them that Harvey is a mere delusion. Thus the narration seems to be objective—or, the ending asks, has it been very uncommunicative, withholding something that the hero really does see and hear? True, throughout the film the framing leaves room for Harvey, as if he were there. One could argue, though, that these framings simply emphasize that the giant rabbit isn’t visible to us or the other characters and hence isn’t likely to really exist.
A film can easily present both perceptual and mental subjectivity at the same time. A POV shot may be accompanied by a character’s voice describing his or her thoughts and feelings. Matthew shows his class part of the opening section of The Diving Bell and the Butterfly, where a stroke victim’s extremely limited sight and hearing are rendered in juxtaposition with his voice telling of his reactions to his new situation. It is one of the most extensive and successful uses of such a combination in recent cinema, and it might be very effective in differentiating the two for introductory students.
A scene can also move rapidly back and forth between a character’s perceptions and his or her thoughts. Charlie writes that he shows his students the scenes of Marion Crane driving in Psycho. While the car is moving, almost every alternating shot is a POV framing of the rearview mirror or through the windshield. At the same time, we hear the voices of her lover, her boss, her fellow secretary, and the rich man whose money she has stolen while she imagines how they would react to her crime.
Showing scenes like these, where both types of subjectivity are used and clearly distinguishable, might be more useful for students than showing several scenes that contain only one or the other.
Flashbacks, Voiceovers, and Altered States
In his original query to the Filmies, Matthew also said that students have problems with flashbacks: “Particularly, they resist the idea that a flashback is an example of subjective depth in general, even if the flashback unfolds objectively.” I can think of two ways to explain this.
First, classical films tend not to have flashbacks that just start on their own. To be sure, some do: a track-in to an important object and a dissolve can signal the start of a passage from the past, without a character being there. But more often a character is used to motivate the move into the past. A thoughtful look may do it, or a character may describe the past to someone else. In either case, the flashback is coming “from” the character and is assumed to show approximately what he or she is remembering. Occasionally the flashback may be a lie rather than objective truth, as we know from Hitchcock’s Stage Fright, The Usual Suspects, and a few other films.
In Poetics of Cinema’s third essay, David argues that most flashbacks in films are motivated as a character’s memory, but what is shown often strays from what he or she knew or could have known. He suggests that the prime purpose of most flashbacks is to rearrange the order of story events, and the character recalling or recounting simply provides an alibi for the time shift.
So we might think of character-motivated flashbacks as subjective frameworks that also contain objectively conveyed narrative information.
Second, the fact that flashbacks slip objective information into characters’ memories is a convention. It’s a widely used method for presenting us with two things at once: first, a character’s memories and second, some story information that the viewer needs to have—even if the character couldn’t know about it. As we watch movies, we frequently accept conventions for the sake of being entertained. Beings that travel to Earth from distant galaxies speak English, high school kids can put together shows that wind up triumphing on Broadway, and people who drive up to buildings in crowded cities always find perfect parking spots. In a similar way, implausible mixtures of subjectivity and objectivity in flashbacks are just something we have to accept.
There are two other techniques that didn’t get mentioned during the discussion but that might confuse students.
What about shots showing the vision of a character who is drunk, dizzy, drugged, or otherwise unable to see straight? The most common convention is to include a POV shot that’s out of focus, perhaps accompanied by a bobbing handheld camera. Other characters in the scene, assuming they are not similarly impaired, would not see the surroundings in the same way.
Still, I think the same perceptual/mental distinction holds for such moments. The fuzziness and the lack of coordination are physical effects that are not being imagined by the character. They remain in the perceptual realm. Mental effects of impairment would be dreams or hallucinations. Some examples: the “Pink Elephants on Parade” sequence in Dumbo and the DTs vision of the protagonist of Wilder’s The Lost Weekend. As the title suggests, Altered States takes such mental activities as its subject matter and has many scenes that represent them.
A harder case is voiceover. Are all cases of voiceover subjective? Clearly not. If we have a situation where a character tells a story to a group and his or her voice continues over a flashback, the narration remains objective. We assume that the group can still hear the storyteller. Cases where the voice exists only in the mind, as when a character speaks to himself or herself, but not aloud, are mental subjectivity.
That said, there are many unclear cases. Just what is the status of the narration in Jerry Maguire? Jerry seems to speak directly to us, pointing out things that happen during the action, yet clearly there is no suggestion that he made the film we are watching. The problem is compounded in Sunset Boulevard, where the protagonist not only implicitly addresses us but is also dead. In many cases when a character’s voice is heard over a scene, it might occur to audience members to wonder where the character is or was when speaking these words. And does a character’s voice describing his or her feelings constitute objective or subjective narration? Probably only when the voice is posited as strictly internal, audible only to the character, would we want to dub it subjective.
The problem with voiceovers arises, I think, because it’s such a slippery technique to begin with, and therefore often hard to categorize. Is a character speaking narration over events that happened in the past diegetic sound or nondiegetic? If there’s never an establishment of where and when that character does the narrating, he or she exists in a sort of limbo in between the two states: diegetic because he or she is a character, nondiegetic because he or she is in some ineffable way removed from the story world.
For voiceovers, then, I think it’s best simply to categorize the ones that obviously are straightforwardly objective or subjective. In tricky cases, we just have to admit that not all uses of film technique are easy to pin labels on. But the point of having categories like these isn’t to pin labels. In part knowing them allows us simply to notice things in films that might otherwise remain a part of an undifferentiated flow of images. They enable us to see underlying principles that make films into dynamic systems rather than collections of techniques. They give us ways to organize our thoughts about films and convey them to others. And, though students may doubt this, watching for such things becomes automatic and effortless once we have understood such categories and watched a lot of films. As a child, I don’t think I knew about the concept of editing or ever really noticed cuts. Now I’m aware of every cut in every film I see, and I notice continuity errors and graphic matches and other related techniques, all automatically, without that awareness impinging in the least on my following the story and being entertained. Learning the categories is only the beginning.
Playing with subjectivity
Once students have seen some clear cases of each type of cinematic subjectivity and understand the difference, the teacher could move on to emphasize that filmmakers can play with both in original ways. It’s not really possible in an introductory textbook to discuss all the possibilities—and probably not possible to come up with a typology that would cover every example of subjectivity that could exist across that continuum we mentioned. Imaginative filmmakers will always find new variants on how to use techniques for this purpose. But here are a few intriguing cases.
In his class, Chris shows the scene in Hannah and Her Sisters where the Woody Allen character, a hypochondriac, visits a doctor for some hearing tests. Initially the doctor comes into the room and gives a dire diagnosis of inoperable cancer. After Mickey has reacted to that, a cut takes us back to an identical shot of the doctor entering, but this time he gives Mickey a clean bill of health. The first part of the scene is retroactively revealed to have been a mental event, Mickey’s pessimistic fantasy.
The same sort of thing happens in a more extended way with the familiar “it was only a dream” revelations that make the audience realize that a major part of the plot has been subjective. In a more sophisticated way, as Chris points out, other sorts of mental subjectivity, usually lies or extended fantasies, can be revealed retrospectively, as in The Usual Suspects, Mulholland Drive, and Fight Club.
A flashback from one character’s viewpoint may reveal something new about an earlier scene. In Ford’s The Man Who Shot Liberty Valance, the shootout is initially shown through objective narration. Only later in the plot does another character reveal that, unbeknownst to us or the other people present, he had also witnessed the shootout. The flashback to his account reveals that the shootout happened very differently from the way we had assumed when first seeing it.
I’ll close with an example from The Silence of the Lambs. This shows how subtle and effective a play with perceptual and mental narration can be. In the scene at the funeral home where a recently discovered body of a murder victim is to be examined, Clarice is left waiting in the midst of a group of state troopers who stare at her. To get out of the situation, she turns and looks into the chapel, where a funeral is taking place. We see her face and then her POV as she surveys the room. A cut shows Clarice suddenly within the chapel, moving forward toward the camera and staring straight into the lens. We will only realize retrospectively that this image begins a fantasy that leads quickly into a memory. Clarice has not actually left her previous position just outside the door.
The next shot is a track forward through the center aisle toward the casket. Since Clarice had been walking in that direction before the cut, we recognize this as a POV and assume that Clarice is continuing to walk. Yet the man in the casket, as we soon will learn, is Clarice’s father. The shot represents a different funeral, one in the past which she has been triggered to remember by her glance into the chapel. The fantasy has become a flashback. A second, similar view of Clarice’s face returns us to her fantasizing adult self, the one remembering this scene but not the one still standing outside the door. A closer view of the father’s casket shows it from a lower vantage-point, as if that of a child.
The reason for the change becomes apparent at the next cut, which radically repositions the camera on the far side of the casket, filming from a low angle. The sudden shift moves us away from the adult Clarice in order to show her as a child approaching her father and leaning down to kiss him. A noise pulls Clarice out of her memory, and a cut back to the hallway shows her turning away from the chapel. (As often happens, sound, and particularly music, helps guide us through the scene, marking the beginning and ending of the fantasy/flashback.)
The most experienced film specialist could not track all the rapidly shifting levels of subjectivity in this scene on first viewing. Still, later analysis using some categories of subjective narration can help us appreciate how Demme has woven them into a scene that helps explain Clarice’s motives in becoming an FBI agent and her determination in pursuing her first case.
Categories matter
To some students, the categories I’ve just discussed may seem like trivial distinctions. They’re not. The use of subjective narration is one of the key ways the filmmaker has to engage our thoughts and emotions with the characters. In Psycho, we become involved in Marion Crane’s life in a remarkably short time, partly because of her situation but also partly because Hitchcock keeps us so close to her once she prepares to steal the money. The camera not only frames her closely, but to a considerable degree we see and imagine what she does: her fearful forebodings of how her rash act will turn out. Much the same thing happens with Clarice Starling in The Silence of the Lambs, though there our emotional involvement lasts throughout the film, and we are given glimpses of Clarice’s memories.
Perhaps choosing a scene or two from such films and going through them with the students, trying to imagine what it would be like without the POV shots and the imaginings and the memories, would convince them of the value of learning these categories. After all, the exceptions they find are only exceptional because they play in a zone defined by solid concepts.
[Note added October 24: I should have referred back as well to David’s entry “Three Nights of a Dreamer,” largely on POV.]
Superheroes for sale
DB here:
After a day at the movies, maybe I am living in a parallel universe. I go to see two films praised by people whose tastes I respect. I find myself bored and depressed. I’m also asking questions.
Over the twenty years since Batman (1989), and especially in the last decade or so, some tentpole pictures, and many movies at lower budget levels, have featured superheroes from the Golden and Silver Ages of comic books. By my count, since 2002, there have been between three and seven comic-book superhero movies released every year. (I’m not counting other movies derived from comic books or characters, like Richie Rich or Ghost World.)
Until quite recently, superheroes haven’t been the biggest money-spinners. Only eleven of the top 100 films on Box Office Mojo’s current worldwide-grosser list are derived from comics, and none ranks in the top ten titles. But things are changing. For nearly every year since 2000, at least one title has made it into the list of top twenty worldwide grossers. For most years two titles have cracked this list, and in 2007 there were three. This year three films have already arrived in the global top twenty: The Dark Knight, Iron Man, and The Incredible Hulk (four, if you count Wanted as a superhero movie).
These 2008 successes have vindicated Marvel’s long-term strategy of investing directly in movies and have spurred Warners to slate more comic-book titles. David S. Cohen analyzes this new market here. So we are clearly in the midst of a Trend. My trip to the multiplex got me asking: What has enabled superhero comic-book movies to blast into a central spot in today’s blockbuster economy?
Enter the comic-book guys
It’s clearly not due to a boom in comic-book reading. Superhero books have not commanded a wide audience for a long time. Statistics on comic-book readership are closely guarded, but the expert commentator John Jackson Miller reports that back in 1959, at least 26 million comic books were sold every month. In the highest month of 2006, comic shops ordered, by Miller’s estimate, about 8 million books (and this total includes not only periodical comics but graphic novels, independent comics, and non-superhero titles). There have been upticks and downturns over the decades, but the overall pattern is a steep slump.
Try to buy an old-fashioned comic book, with staples and floppy covers, and you’ll have to look hard. You can get albums and graphic novels at chain stores like Borders, but not the monthly periodicals. For those you have to go to a comics shop, and Hank Luttrell, one of my local purveyors of comics, estimates there aren’t more than 1,000 of them in the U.S.
Moreover, there’s still a stigma attached to reading superhero comics. Even kitsch novels have long had a slightly higher cultural standing than comic books. Admitting you had read The Devil Wears Prada would be less embarrassing than admitting you read Daredevil.
For such reasons and others, the audience for superhero comics is far smaller than the audience for superhero movies. The movies seem to float pretty free of their origins; you can imagine a young Spider-Man fan who loved the series but never knew the books. What’s going on?
Men in tights, and iron pants
The films that disappointed me on that moviegoing day were Iron Man and The Dark Knight. The first seemed to me an ordinary comic-book movie endowed with verve by Robert Downey Jr.’s performance. While he’s thought of as a versatile actor, Downey also has a star persona—the guy who’s wound a few turns too tight, putting up a good front with rapid-fire patter (see Home for the Holidays, Wonder Boys, Kiss Kiss Bang Bang, Zodiac). Downey’s cynical chatterbox makes Iron Man watchable. When he’s not onscreen we get excelsior.
Christopher Nolan showed himself a clever director in Memento and a promising one in The Prestige. So how did he manage to make The Dark Knight such a portentously hollow movie? Apart from enjoying seeing Hong Kong in Imax, I was struck by the repetition of gimmicky situations–disguises, hostage-taking, ticking bombs, characters dangling over a skyscraper abyss, who’s dead really once and for all? The fights and chases were as unintelligible as most such sequences are nowadays, and the usual roaming-camera formulas were applied without much variety. Shoot lots of singles, track slowly in on everybody who’s speaking, spin a circle around characters now and then, and transition to a new scene with a quick airborne shot of a cityscape. Like Jim Emerson, I thought that everything hurtled along at the same aggressive pace. If I want an arch-criminal caper aiming for shock, emotional distress, and political comment, I’ll take Benny Chan’s New Police Story.
Then there are the mouths. This is a movie about mouths. I couldn’t stop staring at them. Given Batman’s cowl and his husky whisper, you practically have to lip-read his lines. Harvey Dent’s vagrant facial parts are especially engaging around the jaws, and of course the Joker’s double rictus dominates his face. Gradually I found Maggie Gyllenhaal’s spoonbill lips starting to look peculiar.
The expository scenes were played with a somber knowingness I found stifling. Quoting lame dialogue is one of the handiest weapons in a critic’s arsenal and I usually don’t resort to it; many very good movies are weak on this front. Still, I can’t resist feeling that some weighty lines were doing duty for extended dramatic development, trying to convince me that enormous issues were churning underneath all the heists, fights, and chases. Know your limits, Master Wayne. Or: Some men just want to watch the world burn. Or: In their last moments people show you who they really are. Or: The night is darkest before the dawn.
I want to ask: Why so serious?
Odds are you think better of Iron Man and The Dark Knight than I do. That debate will go on for years. My purpose here is to explore a historical question: Why comic-book superhero movies now?
Z as in Zeitgeist
More superhero movies after 2002, you say? Obviously 9/11 so traumatized us that we feel a yearning for superheroes to protect us. Our old friend the zeitgeist furnishes an explanation. Every popular movie can be read as taking the pulse of the public mood or the national unconscious.
I’ve argued against zeitgeist readings in Poetics of Cinema, so I’ll just mention some problems with them:
*A zeitgeist is hard to pin down. There’s no reason to think that the millions of people who go to the movies share the same values, attitudes, moods, or opinions. In fact, all the measures we have of these things show that people differ greatly along all these dimensions. I suspect that the main reason we think there’s a zeitgeist is that we can find it in popular culture. But we would need to find it independently, in our everyday lives, to show that popular culture reflects it.
*So many different movies are popular at any moment that we’d have to posit a pretty fragmented national psyche. Right now, it seems, we affirm heroic achievement (Indiana Jones and the Kingdom of the Crystal Skull, Kung Fu Panda, Prince Caspian) except when we don’t (Get Smart, The Dark Knight). So maybe the zeitgeist is somehow split? That leads to vacuity, since that answer can accommodate an indefinitely large number of movies. (We’d have to add fractions of our psyche that are solicited by Sex and the City and Horton Hears a Who!)
*The movie audience isn’t a good cross-section of the general public. The demographic profile tilts very young and moderately affluent. Movies are largely a middle-class teenage and twentysomething form. When a producer says her movie is trying to catch the zeitgeist, she’s not tracking retired guys in Arizona wearing white belts; she’s thinking mostly of the tastes of kids in baseball caps and draggy jeans.
*Just because a movie is popular doesn’t mean that people have found the same meanings in it that critics do. Interpretation is a matter of constructing meaning out of what a movie puts before us, not finding the buried treasure, and there’s no guarantee that the critic’s construal conforms to any audience member’s.
*Critics tend to think that if a movie is popular, it reflects the populace. But a ticket is not a vote for the movie’s values. I may like or dislike it, and I may do either for reasons that have nothing to do with its projection of my hidden anxieties.
*Many Hollywood films are popular abroad, in nations presumably possessing a different zeitgeist or national unconscious. How can that work? Or do audiences on different continents share the same zeitgeist?
Wait, somebody will reply, The Dark Knight is a special case! Nolan and his collaborators have strewn the film with references to post-9/11 policies about torture and surveillance. What, though, is the film saying about those policies? The blogosphere is already ablaze with discussions of whether the film supports or criticizes Bush’s White House. And the Editorial Board of the good, gray Times has noticed:
It does not take a lot of imagination to see the new Batman movie that is setting box office records, The Dark Knight, as something of a commentary on the war on terror.
You said it! Takes no imagination at all. But what is the commentary? The Board decides that the water is murky, that some elements of the movie line up on one side, some on the other. The result: “Societies get the heroes they deserve,” which is virtually a line from the movie.
I remember walking out of Patton (1970) with a hippie friend who loved it. He claimed that it showed how vicious the military was, by portraying a hero as an egotistical nutcase. That wasn’t the reading offered by a veteran I once talked to, who considered the film a tribute to a great warrior.
It was then I began to suspect that Hollywood movies are usually strategically ambiguous about politics. You can read them in a lot of different ways, and that ambivalence is more or less deliberate.
A Hollywood film tends to pose sharp moral polarities and then fuzz or fudge or rush past settling them. For instance, take The Bourne Ultimatum: Yes, the espionage system is corrupt, but there is one honorable agent who will leak the information, and the press will expose it all, and the malefactors will be jailed. This tactic hasn’t had a great track record in real life.
The constitutive ambiguity of Hollywood movies helpfully disarms criticisms from interest groups (“Look at the positive points we put in”). It also gives the film an air of moral seriousness (“See, things aren’t simple; there are gray areas”). That’s the bait the Times writers took.
I’m not saying that films can’t carry an intentional message. Bryan Singer and Ian McKellen claim the X-Men series criticizes prejudice against gays and minorities. Nor am I saying that an ambivalent film comes from its makers delicately implanting counterbalancing clues. Sometimes they probably do. More often, I think, filmmakers pluck out bits of cultural flotsam opportunistically, stirring it all together and offering it up to see if we like the taste. It’s in filmmakers’ interests to push a lot of our buttons without worrying whether what comes out is a coherent intellectual position. Patton grabbed people and got them talking, and that was enough to create a cultural event. Ditto The Dark Knight.
Back to basics
If the zeitgeist doesn’t explain the flourishing of the superhero movie in the last few years, what does? I offer some suggestions. They’re based on my hunch that the genre has brought together several trends in contemporary Hollywood film. These trends, which can commingle, were around before 2000, but they seem to be developing in a way that has created a niche for the superhero film.
The changing hierarchy of genres. Not all genres are created equal, and they rise or fall in status. As the Western and the musical fell in the 1970s, the urban crime film, horror, and science-fiction rose. For a long time, it would be unthinkable for an A-list director to do a horror or science-fiction movie, but that changed after Polanski, Kubrick, Ridley Scott, et al. gave those genres a fresh luster just by their participation. More recently, I argue in The Way Hollywood Tells It, the fantasy film arrived as a respectable genre, as measured by box-office receipts, critical respect, and awards. It seems that the sword-and-sorcery movie reached its full rehabilitation when The Lord of the Rings: The Return of the King scored its eleven Academy Awards.
The comic-book movie has had a longer slog from the B- and sub-B-regions. Superman, Flash Gordon, and Dick Tracy were all fodder for serials and low-budget fare. Prince Valiant (1954) was the only comics-derived movie of any standing in the 1950s, as I recall, and you can argue that it fitted into a cycle of widescreen costume pictures. (Though it looks like a pretty camp undertaking today.) Much later came revivals of the two most popular superheroes, Superman (1978) and Batman (1989).
The success of the Batman film, which was carefully orchestrated by Warners and its DC Comics subsidiary, can be seen as preparing the ground for today’s superhero franchises. The idea was to avoid simply reiterating a series, as the Superman movie did, or mocking it, as the Batman TV show did. The purpose was to “reimagine” the series, to “reboot” it as we now say, the way Frank Miller’s The Dark Knight Returns re-launched the Batman comic. Rebooting modernizes the mythos by reinterpreting it in a thematically serious and graphically daring way.
During the 1990s, less famous superheroes filled in as the Batman franchise tailed off. Examples were The Rocketeer (1991), Timecop (1994), The Crow (1994) and The Crow: City of Angels (1996), Judge Dredd (1995), Men in Black (1997), Spawn (1997), Blade (1998), and Mystery Men (1999). Most of these managed to fuse their appeals with those of another parvenu genre, the kinetic action-adventure movie.
Significantly, these were typically medium-budget films from semi-independent companies. Although some failed, a few were huge and many earned well, especially once home video was reckoned in. Moreover, the growing number of titles, sometimes featuring name actors, fueled a sense that this genre was becoming important. As often happens, marginal companies developed the market more nimbly than the big ones, who tend to move in once the market has matured.
I’d also suggest that The Matrix (1999) helped legitimize the cycle. (Neo isn’t a superhero? In the final scene he can fly.) The pseudophilosophical aura this movie radiated, as well as its easy familiarity with comics, videogames, and the Web, made it irrevocably cool. Now ambitious young directors like Nolan, Singer, and Brett Ratner could sign such projects with no sense they were going downmarket.
The importance of special effects. Arguably there were no fundamental breakthroughs in special-effects technology from the 1940s to the 1960s. But with motion-control cinematography, showcased in the first Star Wars installment (1977), filmmakers could create a new level of realism in the use of miniatures. Later developments in matte work, blue- and green-screen techniques, and digital imagery were suited to, and driven by, the other genres that were on the rise—horror, science-fiction, and fantasy—but comic-book movies benefited as well. The tagline for Superman was “You’ll believe a man can fly.”
Special effects thereby became one of a film’s attractions. Instead of hiding the technique, films flaunted it as a mark of big budgets and technological sophistication. The fantastic powers of superheroes cried out for CGI, and it may be that convincing movies in the genre weren’t really ready until the software matured.
The rise of franchises. Studios have always sought predictability, and the classic studio system relied on stars and genres to encourage the audience to return for more of what it liked. But as film attendance waned, producers looked for other models. One that was successful was the branded series, epitomized in the James Bond films. With the rise of the summer blockbuster, producers searched for properties that could be exploited in a string of movies. A memorable character could tie the installments together, and so filmmakers turned to pop literature (e.g., the Harry Potter books) and comic books. Today, Marvel Enterprises is less concerned with publishing comics than with creating film vehicles for its 5000 characters. Indeed, to get bank financing it put up ten of its characters as collateral!
Yet a single character might not sustain a robust franchise. Henry Jenkins has written about how popular culture is gravitating to multi-character “worlds” that allow different media texts to be carved out of them. Now that periodical sales of comics have flagged, the tail is wagging the dog. The 5000 characters in the Marvel Universe furnish endless franchise opportunities. If you stayed for the credit cookie at the end of Iron Man, you saw the setup for a sequel that will pair the hero with at least one more Marvel protagonist.
Merchandising and corporate synergy. It’s too obvious to dwell on, but superhero movies fit neatly into the demand that franchises should spawn books, TV shows, soundtracks, toys, apparel, and so on. Time Warner’s acquisition of DC Comics was crucial to the cross-platform marketing of the first Batman. Moreover, most comics readers are relatively affluent (a big change from my boyhood), so they have the income to buy action figures and other pricy collectibles, like a Batbed.
The shift from an auteur cinema to a genre cinema. The classic studio system maintained a fruitful, sometimes tense, balance between directorial expression and genre demands. Somewhere in recent decades that balance has split into polarities. We now have big-budget genre films made by directors of no discernible individuality, and small “personal” films that showcase the director’s sensibility. There have always been impersonal craftsmen in Hollywood, but the most distinctive directors could often bring their own sensibilities to projects big or small.
David Lynch could make Dune (1984) part of his own oeuvre, but since then we have many big-budget genre pictures that bear no signs of directorial individuality. In particular, science-fiction, fantasy, and superhero movies demand so much high-tech input, so much preparation, so many logistical tasks in shooting, and such intensive postproduction, that economy of effort favors a standardized look and feel. Hence perhaps the recourse to well-established techniques of shooting and cutting; intensified continuity provides a line of least resistance. A comic-book movie can succeed if it doesn’t stray from the fanbase’s expectations and swiftly initiates the newbies. Not much directorial finesse is needed, as 300 (2007) shows.
The development of the megapicture may have led the more talented directors to the “one for them, one for me” motto. Think of the difference between Burton’s Planet of the Apes or even Sweeney Todd and, say, Ed Wood or Big Fish. Or think of the moments of elegance in Memento and The Prestige, as opposed to the blunt handling of Batman Begins and The Dark Knight.
Shock and awe in presentation. The rise of the multiplex meant not only an upgrade in comfort (my back appreciates the tilting seats) but also a demand for big pictures and big sound. Smaller, more intimate movies look woeful on your megascreen, and what’s the point of Dolby surround channels if you’re watching a Woody Allen picture? Like science-fiction and fantasy, the adventures of a superhero in yawning landscapes fill the demand for immersion in a punchy, visceral entertainment. Scaling the film for Imax, as Superman Returns and The Dark Knight have, is the next step in this escalation.
Too much is never enough. Since the 1980s, mass-audience pictures have gravitated toward ever more exaggerated presentation of momentary effects. In a comedy, if a car is about to crash, everyone inside must stare at the camera and shriek in concert. Extreme wide-angle shooting makes faces funny in themselves (or so Barry Sonnenfeld thinks). Action movies shift from slo-mo to fast-mo to reverse-mo, all stitched together by ramping, because somebody thinks these devices make for eye candy. Steep high and low angles, familiar in 1940s noir films, were picked up in comics, which in turn re-influenced movies.
Movies now love to make everything airborne, even the penny in Ghost. Things fly out at us, and thanks to surround channels we can hear them after they pass. It’s not enough simply to fire an arrow or bullet; the camera has to ride the projectile to its destination—or, in Wanted, from its target back to its source. In 21, released earlier this year, blackjack is given a monumentality more appropriate to buildings slated for demolition: giant playing cards whoosh like Stealth fighters or topple like brick walls.
I’m not against such one-off bursts of imagery. There’s an undoubted wow factor in seeing spent bullet casings shower into our face in The Matrix.
I just ask: What do such images remind us of? My answer: Comic book panels, those graphically dynamic compositions that keep us turning the pages. In fact, we call such effects “cartoonish.” Here’s an example from Watchmen, where the slow-motion effect of the Smiley pin floating down toward us is sustained by a series of lines of dialogue from the funeral service.
With comic-book imagery showing up in non-comic-book movies, one source may be greater reliance on storyboards and animatics. Spfx demand intensive planning, so detailed storyboarding was a necessity. Once you’re planning shot by shot, why not create very fancy compositions in previsualization? Spielberg seems to me the live-action master of “storyboard cinema.” And of course storyboards look like comic-book pages.
The hambone factor. In the studio era, star acting ruled. A star carried her or his persona (literally, mask) from project to project. Parker Tyler once compared Hollywood star acting to a charade; we always recognized the person underneath the mime.
This is not to say that the stars were mannequins or dead meat. Rather, like a sculptor who reshapes a piece of wood, a star remolded the persona to the project. Cary Grant was always Cary Grant, with that implausible accent, but the Cary Grant of Only Angels Have Wings is not that of His Girl Friday or Suspicion or Notorious or Arsenic and Old Lace. Or compare Barbara Stanwyck in The Lady Eve, Double Indemnity, and Meet John Doe. Young Mr. Lincoln is not the same character as Mr. Roberts, but both are recognizably Henry Fonda.
Dress them up as you like, but their bearing and especially their voices would always betray them. As Mr. Kralik in The Shop around the Corner, James Stewart talks like Mr. Smith on his way to Washington. In The Little Foxes, Herbert Marshall and Bette Davis sound about as southern as I do.
Star acting persisted into the 1960s, with Fonda, Stewart, Wayne, Crawford, and other granitic survivors of the studio era finishing out their careers. Star acting continues in what scholar Steve Seidman has called “comedian comedy,” from Jerry Lewis to Adam Sandler and Jack Black. Their characters are usually the same guy, again. Arguably some women, like Sandra Bullock and Ashley Judd, also continued the tradition.
On the whole, though, the most highly regarded acting has moved closer to impersonation. Today your serious actors shape-shift for every project—acquiring accents, burying their faces in makeup, gaining or losing weight. We might be inclined to blame the Method, but classical actors went through the same discipline. Olivier, with his false noses and endless vocal range, might be the impersonators’ patron saint. His followers include Streep, Our Lady of Accents, and the self-flagellating young De Niro. Ironically, although today’s performance-as-impersonation aims at greater naturalness, it projects a flamboyance that advertises its mechanics. It can even look hammy. Thus, as so often, does realism breed artifice.
Horror and comic-book movies offer ripe opportunities for this sort of masquerade. In a straight drama, confined by realism, you usually can’t go over the top, but given the role of Hannibal Lecter, there is no top. The awesome villain is a playground for the virtuoso, or the virtuoso in training. You can overplay, underplay, or over-underplay. You can also shift registers with no warning, as when hambone supreme Orson Welles would switch from a whisper to a bellow. More often now we get the flip from menace to gargoylish humor. Jack Nicholson’s “Heeere’s Johnny” in The Shining is iconic in this respect. In classic Hollywood, humor was used to strengthen sentiment, but now it’s used to dilute violence.
Such is the range we find in The Dark Knight. True, some players turn in fairly low-key work. Morgan Freeman plays Morgan Freeman, Michael Caine does his usual punctilious job, and Gary Oldman seems to have stumbled in from an ordinary crime film. Maggie Gyllenhaal and Aaron Eckhart provide a degree of normality by only slightly overplaying; even after Harvey Dent’s fiery makeover Eckhart treats the role as no occasion for theatrics.
All else is Guignol. The Joker’s darting eyes, waggling brows, chortles, and restless licking of his lips send every bit of dialogue Special Delivery. Ledger’s performance has been much praised, but what would count as a bad line reading here? The part seems designed for scenery-chewing. By contrast, poor Bale has little to work with. As Bruce Wayne, he must be stiff as a plank, kissing Rachel while keeping one hand suavely tucked in his pocket, GQ style. In his Bat-cowl, he’s missing as much acreage of his face as Dent is, so all Bale has is the voice, over-underplayed as a hoarse bark.
In sum, our principals are sweating through their scenes. You get no strokes for making it look easy, but if you work really hard you might get an Oscar.
A taste for the grotesque. Horror films have always played on bodily distortions and decay, but The Exorcist (1973) raised the bar for what sorts of enticing deformities could be shown to mainstream audiences. Thanks to new special effects, movies like Total Recall (1990) were giving us cartoonish exaggerations of heads and appendages.
But of course the caricaturists got here first, from Hogarth and Daumier onward. Most memorably, Chester Gould’s Dick Tracy strip offered a parade of mutilated villains like Flattop, the Brow, the Mole, and the Blank, a gentleman who was literally defaced. The Batman comics followed Gould in giving the protagonist an array of adversaries who would raise eyebrows even in a Manhattan subway car.
Eisenstein once argued that horrific grotesquerie was unstable and hard to sustain. He thought that it teetered between the comic-grotesque and the pathetic-grotesque. That’s the difference, I suppose, between Beetlejuice and Edward Scissorhands, or between the Joker and Harvey Dent. In any case, in all its guises the grotesque is available to our comic-book pictures, and it plays nicely into the oversize acting style that’s coming into favor.
You’re thinking that I’ve gone on way too long, and you’re right. Yet I can’t withhold two more quickies:
The global recognition of anime and Hong Kong swordplay films. During the climactic battle between Iron Man 2.0 and 3.0, so reminiscent of Transformers, I thought: “The mecha look has won.”
Learning to love the dark. That is, filmmakers’ current belief that “dark” themes, carried by monochrome cinematography, somehow carry more prestige than light ones in a wide palette. This parallels comics’ urge for legitimacy by treating serious subjects in somber hues, especially in graphic novels.
Time to stop! This is, after all, just a list of causes and conditions that occurred to me after my day in the multiplex. I’m sure we can find others. Still, factors like these seem to me more precise and proximate causes for the surge in comic-book films than a vague sense that we need these heroes now. These heroes have been around for fifty years, so in some sense they’ve always been needed, and somebody may still need them. The major media companies, for sure. Gazillions of fans, apparently. Me, not so much. But after Hellboy II: The Golden Army I live in hope.
Thanks to Hank Luttrell for information about the history of the comics market.
The superhero rankings I mentioned are: Spider-Man 3 (no. 12), Spider-Man (no. 17), Spider-Man 2 (no. 23), The Dark Knight (currently at no. 29, but that will change), Men in Black (no. 42), Iron Man (no. 45), X-Men: The Last Stand (no. 75), 300 (no. 80), Men in Black II (no. 85), Batman (no. 95), and X2: X-Men United (no. 98). The usual caveat applies: This list is based on unadjusted grosses and so favors recent titles, because of inflation and the increased ticket prices. If you adjust for these factors, the list of 100 all-time top grossers includes seven comics titles, with the highest-ranking one being Spider-Man, at no. 33.
For a thoughtful essay written just as the trend was starting, see Ken Tucker’s 2000 Entertainment Weekly piece, “Caped Fears.” It’s incompletely available here.
Comics aficionados may object that I am obviously against comics as a whole. True, I have little interest in superhero comic books. As a boy I read the DC titles, but I preferred Mad, Archie, Uncle Scrooge, and Little Lulu. In high school and college I missed the whole Marvel revolution and never caught up. Like everybody else in the 1980s I read The Dark Knight Returns, but I preferred Watchmen (and I look forward to the movie). I like the Hellboy movies too. But I’m not gripped by many of the newest trends in comics. Sin City strikes me as a fastidious piece of draftsmanship exercised on formulaic material, as if Mickey Spillane were rewritten by Nicholson Baker. Since the 80s my tastes have run to Ware, Clowes, a few manga, and especially Eurocomics derived from the clear-line tradition (Chaland, Floc’h, Swarte, etc.). I believe that McCay and Herriman are major twentieth-century artists, with Chester Gould and Cliff Sterrett worth considering for the honor too.
You can argue that Oliver Stone’s films create ambivalence inadvertently. JFK seems to have a clear-cut message, but the plotting is diverted by so many conspiracy scenarios that the viewer might get confused about what exactly Stone is claiming really happened.
On the ways that worldmaking replaces character-centered media storytelling, the crucial discussion is in Henry Jenkins, Convergence Culture: Where Old and New Media Collide (New York University Press, 2007), 113-122.
On franchise-building, see the detailed account in Eileen R. Meehan, “‘Holy Commodity Fetish, Batman!’: The Political Economy of a Commercial Intertext,” in The Many Lives of the Batman, ed. Roberta E. Pearson and William Uricchio (Routledge, 1991), 47-65. Other essays in this collection offer information on the strategies of franchise-building.
Just as Star Wars helped legitimate itself by including Alec Guinness in its cast (surely he wouldn’t be in a potboiler), several superhero movies have a proclivity for including a touch of British class: McKellen and Stewart in X-Men, Caine in the Batman series. These old reliables like to keep busy and earn a spot of cash.
PS: 21 August 2008: This post has gotten some intriguing responses, both on the Internets and in correspondence with me, so I’m adding a few here.
Jim Emerson elaborated on the zeitgeist motif in an entry at Scanners. At Crooked Timber, John Holbo examines how much the film’s dark cast owes to the 1990s reincarnation of Batman. Peter Coogan writes to tell me that he makes a narrower version of the zeitgeist argument in relation to superheroes in Chapter 10 of his book, Superhero: The Secret Origin of a Genre, to be reprinted next year. Even the more moderate form he proposes doesn’t convince me, I’m afraid, but the book ought to be of value to readers interested in the genre.
From Stew Fyfe comes a letter offering some corrections and qualifications.
*Stew points out that chain stores like Borders do sell some periodical comics titles, though not always regularly.
*Comics publishing, while not at the circulation levels seen in the golden era, is undergoing something of a resurgence now, possibly because of the success of the franchise movies. Watchmen sales alone will be a big bump in anticipation of the movie.
*As for my claim that film is driving the publishing side, Stew suggests that the relations between the media are more complicated. The idea that the tail wags the dog might apply to DC, but Marvel has made efforts to diversify the relations between the books and the films.
They’ve done things like replacing the Hulk with a red, articulate version of the character just before the movie came out (which is odd, because if there’s one thing the general public knows about the character, it’s that he’s green and he grunts). They’ve also handed the Hulk’s main title over to a minor character, Hercules. They’ve spent a year turning Iron Man, in the main continuity, into something of a techno-fascist (if lately a repentant one) who locks up other superheroes.
Stew speculates that Marvel is trying to multiply its audiences. It relies on its main “continuity books” to serve the fanbase who patronizes the shops, and the films sustain each title’s proprietary look and feel. In addition, some of the books offer fresh material for anyone who might want to buy the comic after seeing the film; this tactic includes reprinted material and rebooted continuity lines in the Ultimate series. Marvel has also brought in film and TV creators as writers (Joss Whedon, Kevin Smith), while occasionally comics artists work in TV shows like Heroes, Lost, and Battlestar Galactica. So the connections are more complex than I was indicating.
Thanks to all these readers for their comments.
In critical condition
DB here:
A Web-prowling cinephile couldn’t escape all the talk about the decline of film criticism. First, several daily and weekly reviewers left their print publications, as Anne Thompson points out. Then one of our brightest critics, Matt Zoller Seitz, suspended writing in order to return to film production. This and the departure of another web critic, Raymond Young of Flickhead, has prompted Tim Lucas to ponder, at length and in depth, why one would maintain a film blog. Go to Moviezzz for a summing up.
I’ve been teaching film history and aesthetics since the early 1970s, but before that I wrote criticism for my college newspaper, the Albany Student Press, and then for Film Comment. When I set out for graduate school, film criticism was virtually the only sort of film writing I thought existed. Auteurism was my faith, and Andrew Sarris its true apostle (for reasons explained in an essay in Poetics of Cinema). In grad school, I learned that there were other ways of thinking about cinema. Since then, I’ve tried to steer a course among film criticism, film history, and film theory—sometimes doing one, sometimes mixing them. But criticism has remained central to my interest in cinema.
What, though, does the concept mean? I think that some of the current discussions about the souring state of movie criticism would benefit from some thoughts about what criticism is and does.
Watch your language
Consider criticism as a language-based activity. What do critics do with their words and sentences? Long ago, the philosopher Monroe Beardsley laid out four activities that constitute criticism in any art form, and his distinctions still seem accurate to me. (1) We use them in Chapter 2 of Film Art.
*Critics describe artworks. Film critics summarize plots, describe scenes, characterize performances or music or visual style. These descriptions are seldom simply neutral summaries. They’re usually perspective-driven, aiding a larger point that the critic wants to make. A description can be coldly objective or warmly partisan.
*Critics analyze artworks. For Beardsley, this means showing how parts go together to make up wholes. If you simply listed all the shots of a scene in order, that would be a description. But if you go on to show the functions that each shot performs, in relation to the others or some broader effect, you’re doing analysis. Analysis need not concentrate only on visual style. You can analyze plot construction. You can analyze an actor’s performance; how does she express an arc of emotion across a scene? You can analyze the film’s score; how do motifs recur when certain characters appear? Because films have so many different kinds of “parts,” you can analyze patterns at many levels.
*Critics interpret artworks. This activity involves making claims about the abstract or general meanings of a film. The word “interpret” is used in lots of ways, but in the sense meant here, figuring out the chronological order of scenes in Pulp Fiction wouldn’t count. If, though, you claim that Pulp Fiction is about redemption, both failed (Vincent) and successful (Jules’ decision to quit the hitman trade), you’re launching an interpretation. If I say that Cloverfield is a symbolic replay of 9/11, that counts as an interpretation too.
*Critics evaluate artworks. This seems pretty straightforward. If you declare that There Will Be Blood is a good film, you’re evaluating it. For many critics, evaluation is the core critical activity; after all, the word critic in its Greek origins means judge. Like all the other activities, however, evaluation turns out to be more complicated than it looks.
Why break the process of criticism into these activities? I think they help us clarify what we’re doing at any moment. They also offer a rough way to understand the critical formats that we usually encounter.
In paper media, on TV, or on the internet, we can distinguish three main platforms for critical discussion. A review is a brief characterization of the film, aimed at a broad audience who hasn’t seen the film. Reviews come out at fixed intervals—daily, weekly, monthly, quarterly. They track current releases, and so have a sort of news value. For this reason, they’re a type of journalism.
An academic article or book of criticism offers in-depth research into one or more films, and it presupposes that the reader has seen the film (or doesn’t mind spoilers). It isn’t tied to any fixed rhythm of publication.
A critical essay falls in between these types. It’s longer than a review, but it’s usually more opinionated and personal than an academic article. It’s often a “think piece,” drawing back from the daily rhythm of reviewing to suggest more general conclusions about a career or trend. Some examples are Pauline Kael’s “On the Future of Movies” and Philip Lopate’s “The Last Taboo: The Dumbing Down of American Movies.” (2) Critical essays can be found in highbrow magazines like The New Yorker and Artforum, in literary quarterlies, and in film journals like Film Comment, CinemaScope, and Cahiers du Cinéma.
Any critic can write on all three platforms. Roger Ebert is known chiefly for his reviews, but his Great Movies books consist of essays. (3) J. Hoberman usually writes reviews, but he has also published essays and academic books. And the lines between these formats aren’t absolutely rigid, as I’ll try to show later.
Reviewing reviewers
How do these forums relate to the different critical activities? It seems clear that academic criticism, the sort published in research articles or books, emphasizes description, analysis, and interpretation. Evaluation isn’t absent, but it takes a back seat. Usually the academic critic is concerned to answer a question about the films. How, for instance, is the theme of gender identity represented in Rebecca, and what ambiguities and contradictions arise from that process? In order to pursue this question, the critic needn’t declare Rebecca a great film or a failure.
Of course, the academic piece could also make a value judgment, either at the outset (I think Rebecca is excellent and want to scrutinize it) or at the end (I’m forced to conclude that Rebecca is a narrow, oppressive film). But I don’t have to pass judgment. I have written about a lot of ordinary films in my life. They became interesting because of the questions I brought to them, not because they had a lot going for them intrinsically.
The academic article has a lot of space to examine its question—several thousand words, usually—and of course a book offers still more real estate. By contrast, a review is pinched by its format. It must be brief, often a couple of hundred words. Unlike the academic critique, the review’s purpose is usually to act as a recommendation or warning. Most readers seek out reviews to get a sense of whether a movie is worth seeing or even whether they would like it.
Because evaluation is central to their task, reviewers tend to focus their descriptions on certain aspects of the film. A reviewer is expected to describe the plot situation, but without giving away too much—major twists in the action, and of course the ending. The writer also typically describes the performances, perhaps also the look and feel of the film, and chiefly its tone or tenor. Descriptions of shots, cutting patterns, music, and the like are usually neglected. And what is described will often be colored by the critic’s evaluation. You can, for instance, retell the plot in a way that makes your opinion of the film’s value pretty clear.
Reviews seldom indulge in analysis, which typically consumes a lot of space and might give away too much. Nor do reviewers usually float interpretations, but when they do, the most common tactic is reflectionism. A current film is read in relation to the mood of the moment, a current political controversy, or a broader Zeitgeist. A cynic might say that this is a handy way to make a film seem important and relevant, while offering a ready-made way to fill a column. Reviewers don’t have a monopoly on reflectionism, though. It’s present in the essayistic think-piece and in academic criticism too. (4)
The centrality of evaluation, then, dictates certain conventions of film reviewing. Those conventions obviously work well enough. But we can learn things about cinema through wide-ranging descriptions and detailed analyses and interpretations, as well. We just ought to recognize that we’re unlikely to get them in the review format.
The good, the bad, and the tasty
Let’s look at evaluation a little more closely. If I say that I think that Les Demoiselles de Rochefort is a good film, I might just be saying that I like it. But not necessarily. I can like films I don’t think are particularly good. I enjoy mid-level Hong Kong movies because I can see their ties to local history and film history, because I take delight in certain actors, because I try to spot familiar locations. But I wouldn’t argue that because I like them, they’re good. We all have guilty pleasures—a label that was coined exactly to designate films which give us enjoyment, even if by any wide criteria they aren’t especially good.
They needn’t be disastrously bad, of course. I do like Les Demoiselles de Rochefort, inordinately. It’s my favorite Demy film and a film I will watch any time, anywhere. It always lifts my spirits. I would take it to a desert island. But I’m also aware that it has its problems. It is very simple and schematic and predictable, and it probably tries too hard to be both naive and knowing. Artistically, it’s not as perfect as Play Time or as daring as Citizen Kane or as… well, you go on. It’s just that somehow, this movie speaks to me.
The point is that evaluation encompasses both judgment and taste. Taste is what gives you a buzz. There’s no accounting for it, we’re told, and a person’s tastes can be wholly unsystematic and logically inconsistent. Among my favorite movies are The Hunt for Red October, How Green Was My Valley, Choose Me, Back to the Future, Song of the South, Passing Fancy, Advise and Consent, Zorns Lemma, and Sanshiro Sugata. I’ll also watch June Allyson, Sandra Bullock, Henry Fonda, and Chishu Ryu in almost anything. I’m hard-pressed to find a logical principle here.
Taste is distinctive, part of what makes you you, but you also share some tastes with others. We teachers often say we’re trying to educate students’ tastes. True, but we should admit that we’re trying to broaden their tastes, not necessarily replace them with better ones. Elsewhere on this site I argue that tastes formed in adolescence are, fortunately, almost impossible to erase. But we shouldn’t keep our tastes locked down. The more different kinds of things we can like, the better life becomes.
The difference between taste and judgment emerges in this way: You can recognize that some films are good even if you don’t like them. You can declare Birth of a Nation or Citizen Kane or Persona an excellent film without finding it to your liking.
Why? Most people recognize some general criteria of excellence, such as originality, or thematic significance, or subtlety, or technical skill, or formal complexity, or intensity of emotional effect. There are also moral and social criteria, as when we find films full of stereotypes objectionable. All of these criteria and others can help us pick out films worthy of admiration. These aren’t fully “objective” standards, but they are intersubjective—lots of people with widely varying tastes accept them.
So critics not only have tastes; they judge. The term judgment aims to capture the comparatively impersonal quality of this sort of evaluation. A judge’s verdict is supposed to answer to principles going beyond his or her own preferences. Judges at a gymnastic contest provide scores on the basis of their expertise and in recognition of technical criteria, and we expect them to back their judgment with detailed reasons.
One more twist and I’m done with distinctions. At a higher level, your tastes may make you weigh certain criteria more heavily than others. If you most enjoy movies that wrestle with philosophical problems, you may favor the thematic-significance criterion. So you’ll love Bergman and think he’s a great director. In other words, you can have tastes in films that you also consider excellent. Presumably this is what we teachers are trying to cultivate as well: to teach people, as Plato urged, to love the good.
Of course we can disagree about relevant criteria, particularly about what criteria to apply to a particular movie. I’d argue that profundity of theme isn’t a very plausible criterion for judging Cloverfield; but formal originality, technical skill, and intensity of emotional appeal are plausible criteria to apply. Many of the best Hong Kong films don’t rank high on subtlety of theme or character psychology, but they do well on technical originality and intensity of visceral and emotional response. You may disagree, but now we’re arguing not about tastes but about what criteria are appropriate to a given film. To get anywhere, our conversation will appeal to both intersubjective standards and discernible things going on in the movie–not to whether you got a buzz from it and I didn’t.
There’s a reason they call that DVD series Criterion
Now back to film reviewing. Judgment certainly comes into play in a film review, because the critic may invoke criteria in evaluating a movie. Such criteria are widely accepted as picking out “good-making” features. For instance:
The plot makes no sense. Criterion: Narrative coherence helps make a film, or at least a film of this sort, good.
The acting is over the top. Criterion: Moderate performance helps make a film, or at least a film of this sort, good.
The action scenes are cut so fast that you can’t tell what’s going on. Criterion: Intelligibility of presentation helps make a film, or at least a film of this sort, good.
Most reviewers, though, can’t resist exposing their tastes as well as their judgments. This is a convention of reviewing, at least in the most high-profile venues. Readers return to reviewers with strongly expressed tastes. Some readers want to have their own tastes reinforced. If you think Hollywood pumps out shoddy product, Joe Queenan will articulate that view with a gonzo relish that gives you pleasure. Other readers want to have their tastes educated, so they seek out a strong personality with clear-cut tastes to guide them. Still other readers want to have their tastes tested, so they read critics whose tastes vary widely from theirs. I’m told that many people read Armond White for this reason. Tastes come in all flavors.
Celebrity critics—the reviewers who attract attention and controversy—are usually vigorous writers who have pushed their tastes to the forefront. Top critics like Andrew Sarris and Pauline Kael are famous partly because they flaunted their tastes and championed films that they liked. (Of course they also thought that the films were good, according to widely held criteria.) It isn’t only a matter of praise, either. Every so often critics launch all-out attacks on films, directors, or other critics, and some are permanently cranky. Movie reviewer Jay Sherman (above) ranks films by analogy to diseases. Hatchet jobs assure a critic notoriety, but they also prove Valéry’s maxim that “Taste is made of a thousand distastes.”
At a certain point, celebrity critics may even give up justifying their evaluations altogether, simply asserting their preferences. They trust that their track record, their brand name, and their forceful rhetoric will continue to engage their readers. It seems to me that after decades of stressing the individuality of their tastes, many of the most influential reviewers are emitting two main messages: You see it or you don’t and Differ if you dare. I’d like to see more argument and less strutting. But then, that’s my taste.
Stuck in the middle, with us
There’s much more to say about the distinctions I’ve floated. They are rough and need refining. But they’ll do for my purpose today, which is to indicate that everything I’ve said can apply to Web writing.
For instance, it seems likely that one cause of critical burnout is that reviews dominate the Net. They’re typically highly evaluative, mixing taste and judgment. Many people will find a bombardment of such items eventually too much to take. I could imagine somebody abandoning Net criticism simply because of the cacophony of shrieking one-liners. We’re all interested in somebody else’s opinions, but we can’t be interested in everybody else’s opinions.
In addition, the distinguishing feature among these thousands of reviews won’t necessarily be wit or profundity or expertise, but style. I think that, years ago, the urge toward self-conscious critical style arose from the drudgery of daily reviewing. Faced with a dreadful new movie, you could make your task interesting only by finding a fresh way to slam it. In addition, magazines that wanted to appear smart encouraged writers to elevate attitude above ideas. In the overabundance of critical talk on the Net, saying “It’s great” or “It stinks” in a clever way will draw more attention than plainer speaking, but even that novelty will wear off eventually.
Fortunately, there are other formats available to cybercriticism. At first glance, the Web seems to favor the snack size, the 150-word sally that’s all about taste and attitude. In fact, the Net is just as hospitable to the long piece. There are in principle no space limitations, so one can launch arguments at length. (It’s too long to read scrolling down? Print it out. Maybe you have to do that with this essay.) Thanks to the indefinitely large acreage available, one of the heartening developments of Web criticism is the growth of the mid-range format I’ve mentioned: the critical essay.
Historically, that form has always been closer to the review than to the academic piece. It relies more on evaluation. That’s centrally true of the Kael and Lopate essays I mentioned above, both of which warn about disastrous changes in Hollywood moviemaking. But the tone can be positive too, most often seen in the appreciative essay, which celebrates the accomplishment of a film or filmmaker. Dwight Macdonald’s admiring piece on 8 ½ and Susan Sontag’s 1968 essay on Godard, despite their differences, seem to me milestones in this genre. (5)
The critical essay is, I think, the real showcase for a critic’s abilities. We say that good critics have to be good writers, usually meaning that their style must be engaging, though it needn’t explode at the end of every paragraph. More generally, in a long essay, you are forced to use language differently than in a snippet. You need to build and delay expectations, find new ways to repeat and modify your case, and seek out synonyms.
Just as important, the long piece separates the sheep from the goats because it shows a critic’s ability to sustain a case. The short form lets you pirouette, but the extended essay—unless it’s simply a rant—obliges you to show all your stuff. In the long form, your ideas need to have heft. Stepping outside film for a moment, consider Gary Giddins on Jack Benny, or Geoffrey O’Brien on Burt Bacharach, or Robert Hughes on Goya, and in each you will see a sprightly, probing, deeply informed mind develop an argument in surprising ways. (6) Strikingly, all these writers venture into subtleties of analysis and interpretation, putting them close to the academic model. So who needs footnotes?
Above all, the critical essay can develop new depth on the Web. With more space available, the Web invites critics to lay out their assumptions and evidence more fully. After years of “writing short,” of firing off invectives, put-downs, and passing paeans to great filmmakers unknown to most readers, critics now have an opportunity—not to rant at greater length but to go deeper. If you think a movie is interesting or important, please show us. Don’t simply assert your opinion with lots of hand-waving, but back it up with some analysis or interpretation. The Web allows analysis and interpretation, which take a lot of effort, to come into their own.
Need an example? Jim Emerson, time and again. There are plenty of other instances hosted by journals like Rouge and the extraordinary Senses of Cinema, and many solo efforts, such as a recent one from Benjamin Wright.
Some will object that this is a pretty unprofitable undertaking. Who’ll pay people to write in-depth critical studies on the movies they find compelling? Well, who’s paying for all those 100-word zingers? And who has paid those programmers who continue to help Linus Torvalds develop Linux? People do all kinds of things for love of the doing and for the benefit of strangers. Besides, no one should expect that writing Web criticism will pay the bills. If Disney can’t collect from people who have downloaded Pirates of the Caribbean 3 for free, why should you or I expect to be paid for talking about it? Maybe only idlers, hobbyists, obsessives, and retirees (count me among all four) have the leisure to write long for the Web.
I envision another way to be in the middle. If most critical essays have been akin to reviews, what about essays that lie closer to the other extreme, the academic one? I’d like to see more of what might be called “research essays.” If the critical essay of haut journalism tips toward reviewing while being more argument-driven, the research essay leans toward academic writing, while not shrinking from judgment, and even parading tastes. I’ve tried my hand at several research essays, in books as well as in pieces you’ll find on the left side of this page; and occasionally one of our blog entries moves in this direction.
This isn’t to discourage people from jotting down ideas about movies and triggering a conversation with readers. The review, professional or amateur, shouldn’t go extinct. But we also benefit from ambitious critical essays, pieces that illuminate movies through analysis and interpretation. Web critics could write less often, but longer. In an era of slow food, let’s try slow film blogging. It might encourage slow reading.
(1) Monroe C. Beardsley, Aesthetics: Problems in the Philosophy of Criticism (New York: Harcourt, Brace and World, 1958), 75-78.
(2) Pauline Kael, “On the Future of Movies,” in Reeling (New York: Warner Books, 1977), 415-444; Philip Lopate, “The Last Taboo: The Dumbing Down of American Movies,” in Totally Tenderly Tragically: Essays and Criticism from a Lifelong Love Affair with the Movies (New York: Anchor, 1998), 259-279.
(3) In addition, Ebert often manages to build his daily pieces around a general idea, not necessarily involving cinema, so he can be read with enjoyment by people not particularly interested in film. I talk about this a little in my introduction to his collection Awake in the Dark.
(4) Reflectionist interpretation usually seems to me unpersuasive, for reasons I’ve discussed in Poetics of Cinema, pp. 30-32. I realize that I’m tilting at windmills. Reflectionism will be with us forever.
(5) Dwight Macdonald, “8 ½: Fellini’s Obvious Masterpiece,” in On Movies (Englewood Cliffs, NJ: Prentice-Hall, 1969), 15-31; Susan Sontag, “Godard,” in Styles of Radical Will (New York: Delta, 1969), 147-189.
(6) Gary Giddins, “’This Guy Wouldn’t Give You the Parsley off His Fish,’” in Faces in the Crowd: Musicians, Writers, Actors, and Filmmakers (New York: Da Capo Press, 1996), 3-13; Geoffrey O’Brien, “The Return of Burt Bacharach,” in Sonata for Jukebox: Pop Music, Memory, and the Imagined Life (New York: Counterpoint, 2004), 5-28; and Robert Hughes, “Goya,” in Nothing If Not Critical: Selected Essays on Art and Artists (New York: Penguin, 1992), 50-64. I pay tribute to Giddins as a critic elsewhere on this site. Hughes later wrote a fine monograph on Goya.