{No. 62: Method Acting} [1979] Kramer vs. Kramer

Unless it’s the editor’s intent, an audience shouldn’t notice cuts between shots or transitions between scenes. Think of the wipe edits in A New Hope, the match cut in 2001: A Space Odyssey, or the dolly zoom interspersed with the violent action shots in Raging Bull. These edits are iconic for adding style and substance to their respective films; they’re integral to the success of their storytelling.

Unless it’s the editor’s intent, the audience should not notice transitions between characters in a dialogue scene, or quick fades that flow as effortlessly as the narrative itself. Editing, we learn by studying editors, is method. Editors learn by immersing themselves in the script, in the dailies, and in the dark rooms with hundreds of terabytes of footage that, printed to film, would run miles long (sometimes it does). A good editor makes a director’s vision shine. A great editor’s director gets them the shots they need to build the story.

Acting is different from editing, he writes, seriously. Great acting, as with great editing, should lift a script into the stratosphere. It should inspire! What, then, constitutes great acting: technical touchpoints, or a “feeling”? Is it how, and how much, an actor appropriately emotes? Is it the ability to recite long stretches of soliloquy, or to trade rapid-fire lines with one or more scene partners? Is it “you know it when you see it”?

Actors engage in method, too. This immersion technique is meant to cut the distance between the actor and the character. Perfect method acting aims to remove the human from the performance entirely, as if the person were merely a vessel for lines and blocking. It’s not a new technique, but it’s rarely practiced anymore, if it ever truly was (known cobbler and part-time actor Daniel Day-Lewis is a famous, noted exception). Anecdotal evidence points to words like “arrogant” and “self-indulgent.” If the “point” is to immerse oneself so deeply in character study that the performance feels “real,” can it ever? If one were not a soldier in World War I, should one attempt to achieve appropriate levels of shell shock to play a soldier with smoldering PTSD? Should a man who hasn’t experienced loss and death fake it for real? Continue reading

[1976] All The President’s Men

There’s a film (not nominated for Best Picture, probably incorrectly) called The Thin Blue Line, which doesn’t really distinguish between documentary and narrative fiction but asks the audience to follow incredibly closely and decide for themselves what happened. Errol Morris took this film in a brilliant direction: each person watching the movie (documentary?) was asked to examine their own biases in the name of fairness, correctness, and real-life tragedy. His work draws an important distinction and is groundbreaking in that before The Thin Blue Line, film was very obviously either true or false; a director took license only where absolutely necessary. A few hypotheses as to why this was the case, in order from probably true to certainly not true:

  • Technical limitations set the parameters for what could be staged, shot, edited, and pressed. Until the advent of more advanced cameras, computers, and software to handle the ambition, storytellers limited their ideas to plausible narratives and the naturally insane.
  • Film was expensive, and shooting too much of it in the wayward sense of exposition and exploration would have driven budgets beyond what a financier would consider “acceptable” overruns.
  • Inventing a whole new type of storytelling takes a bold visionary, and one had not yet come along.
  • Audiences cared much more and were far more naive about what was truth and what was not. Critical narratives were not readily accessible, and without them audiences could not fathom a distinction between manipulative intent and honesty.
  • There was no incentive or market to bust up inertia and jump-start creativity [Ed. – This might be true in the 2010s, somewhat]

This last point is not true, though film in the mid-to-late 1980s had lost some of the ferocity brought forth starting in the late 1960s, and The Thin Blue Line had started to shake up storytelling techniques that would carry forward, especially into Oliver Stone’s JFK in 1991 and a wave of neo-noir works like L.A. Confidential in 1997 and Mystic River in 2003. There was a cascading acceptance of newness toward the late 1980s. Continue reading

[1969] Midnight Cowboy

What a difference a year makes. Between 1969 and 1970, between Midnight Cowboy and Patton, some monumental shift realigned what kind of film could earn the Most Prestigious Award in western filmmaking. Not only are both movies enshrined as Best Picture winners, but they are almost thematic polar opposites released just a few months apart. If we extend the film metaphor, that what we capture and release on film accurately reflects some kind of zeitgeist, then it follows that the world changed significantly between the end of one decade and the start of the next. But let’s talk about a film’s MPAA “rating”: the elusive “X” given to Midnight Cowboy and the harmless “PG” awarded to Patton in 1970. Was public attitude shifting away from the queer and more toward the center and the normal?

Since its creation, the Motion Picture Association of America has attempted to create soft and hard guidelines to regulate the movie-making process. Originally founded in 1922 (making it older than the Academy), the MPAA sought to create a standard for filmmakers, actors, producers, and financiers to ensure stability, both financially and, for a while, morally. For its first 46 years in existence, the MPAA sought (especially under Will Hays) to standardize theme, content, and production according to a code focused on “wholesome” films and ones that don’t include “profanity” or “indecency.” In 1968, after several revisions and the unraveling of the restrictive code, Jack Valenti reworked Hays’ code into the modern rating system still in use today – shifting the morality burden off of the producers and onto the viewers, specifically the parents of the children Hays tried to protect.

Curious, then, that Midnight Cowboy won Best Picture as the first (and only) X-rated film to do so. This fact is mostly irrelevant, seeing as the definition of an X-rated film changed even more dramatically from 1968 to 2014 than the code did from 1922 to 1968; the definition of profanity has changed more than the actuality of the content; the technology and clarity of the filmmaking process has somewhat overshadowed the content. More likely than not, the rating created fantastic hype around the film, whose only truly X-rated premise delves into the correlation between male prostitution and homosexuality. In 2014 these themes would most likely earn the film a soft R rating – and in fact the newly reformed MPAA re-rated it from X to R fewer than two years after its release. Continue reading