Check out this gallery of photographs by photojournalist Timothy O’Sullivan, who documented the interactions between settlers and Native Americans in the Old West. O’Sullivan’s ethnographic style and eye for detail are impressive, and, most importantly, he made a deliberate effort to be authentic:
O’Sullivan was famous for not trying to romanticize the Native American plight or way of life in his photographs; instead of asking his subjects to wear tribal dress, he was happy to photograph them wearing denim jeans.
Adapted from notes and essays for a graduate section of the Ethnographic and Documentary Film course at UF.
Jean-Luc Godard observed that in filmmaking, “you can start with fiction or documentary. But whichever you start with you inevitably find the other.” How true. From the latest simpering inspirational movie “based on a true story” or shocking horror-action movie “inspired by real events” (notice the semantic difference?) to the (often necessary) addition of artifice or narrative construction to documentaries, it sometimes seems pointless to label a film as fiction or non-fiction. In the interest of making documentaries palatable, or perhaps more plausible, filmmakers slap an improvised story or Hollywood trope on top of them, as James Marsh did with Man on Wire. In the interest of making horror films more intriguing, scripts are drawn from reported events and even fleshed out by directors through extensive research with the real persons involved, as Ole Bornedal did with The Possession.
As Picasso observed, art is a lie that tells the truth. Artifice simply smooths the truth-telling process for documentary and fictional filmmakers alike. Of course, documentaries may include misrepresentation, or their level of fiction may rise high enough that the narrative supersedes the on-screen evidence. In short, the impact of a documentary lies in its execution more than in its ratio of pretense to corroboration.
On the other side of the coin, docudramas or “truth-inspired” fiction films are much more likely to shoehorn the real-world persons and events into the archetypes and tropes time-tested by the consuming society. Because of studios’ marketing interests and screenwriters’ need for linearity, beautiful people step into the historical roles and a confluence of events that informed the inspiring event is streamlined into a smooth narrative for the script. Some films attempt to redeem themselves by drawing in real-world footage, but overall, true-story dramas’ tendency towards misrepresentation seems to detract from the truth-telling potential of the lie.
What’s more interesting is the capacity of fiction films to effectively tell the truth, or to border on documentary in terms of social impact. Films such as Sometimes in April, Hotel Rwanda, Babel, and The Help tackle enduring social concerns, important events in history, or world events underreported by U.S. media, without any traditional elements of documentary film. Instead, honest characterizations, serious or violent portrayals of events, and the lack of an explicit message immerse the viewer in a situation, or shades of one, in a way that the interrupted, dictated format of a documentary precludes. Fiction films like these have the effect of a documentary and fulfill Godard’s maxim, despite being “just a story.”
But “just a story” means very little when one considers the pre-printing press importance of oral tradition, in which stories efficiently transmitted a wealth of cultural information and left the listener with a sense of truth. This changed in Western culture with the advancements in visual technology: first photography, then films, then films with synchronized sound, then digital video. The ever-individualized forms of audiovisual storytelling seem to generate a higher level of truth: isn’t it more authentic and more real if there isn’t all that corrupted industry BS, FX distortion, and acting? In fact, the opposite is true because truth is less about authenticity and more about the social construction of what is true. The power given by the words “real,” “true,” “honest,” and the like carries an exceptional weight in Western culture. Why, I don’t know. But our obsession with it is the operative factor in what we constitute as true. We’ve ignored Picasso’s observation in favor of constructing truth as a social experience, and deliberately applying those key words to things that defy their definition: hello, reality TV.
The development of a “constructed reality” in film began when artistic film did. American filmmakers might not like to admit this, but they owe much to the Russians. Soviet films of the 1920s were montages of recorded, re-created, and designed events, based upon librettos, or guides, instead of screenplays—much like the docudramas and found-footage films described above. Soviet-influenced newsreels and documentaries in Britain and the United States similarly involved montage, location shooting, and a heightened aesthetic representation of social reality. John Grierson’s and Pare Lorentz’s films, commissioned by the National Film Board of Canada and the U.S. Film Service, respectively, were stylistic and epistemological precursors to the docudrama genre, which, after the U.S. Film Service was dissolved in 1940, was developed by newsreel producer Louis de Rochemont. Docudrama’s ontological analogy with the noir genre grew out of an increasing preoccupation with matters of film reality in both fiction and semi-fiction films, and was hardly an accident. The importance of photographs in finding the truth in criminal matters, as seen in many noir films, was a reflection of the authenticating power of photographs. The shady witness could lie; the camera could not!
But increasingly savvy filmgoers have realized the illusion and control of the audiovisual medium. Before, filmgoers strongly identified with the reality of the medium; film’s immersive method, drawn from its emphasis on visual information, the removal of artificial division, and the re-presentation of “authentic” photographic “data,” was escapist (or at least purportedly so)—and had to rely on the reproduction of conventional ideology to be both accessible and salient. Now, with more people having access to even simple moviemaking equipment (even iPhones!), and following the postmodernist zeitgeist, films’ illusory methods are almost too explicit. The sense of the unreal knocks us over the head. However, the apparent removal of the director, and especially of the film editor, restores immersion to a new level while undoing the sense of illusion: in the found-footage genre, events and actors are presented in seemingly whole, unshaped, unpolished form. To offset the apparent artificiality, the found footage is usually framed as an ethnographic or autobiographical project; thus found-footage films often operate on a meta-level. This technological self-awareness often has the effect of social commentary on the Western paradigm: namely, its obsession with the real and the separation of the real from the ideal.
Recommended Found-Footage Films:
Leading anti-censorship crusader George Carlin once said, “By and large, language is a tool for concealing the truth.” If so, attempts to conceal certain language, as well as certain visual content, are an attempt at double concealment. The United States’ reputation for prudishness among Europeans is not undeserved. After a year in which several horrific acts of violence occurred and the outcry began again against violence in the media and the decline of Western society, it seems appropriate to consider the history of this debate.
Film is probably the most seminal medium of cultural transmission. Its unprecedented combination of story, visuals, sound, and sociopolitical context means that it can interact with viewers on multiple levels. Film’s power was embraced by propagandists in the U.S. and abroad; its importance is attested to by scores of film journals, magazines, and academic departments across the U.S. and Europe. And despite the increasing availability of audiovisual media distributed through private channels, movie theaters continue to rake in millions of dollars in domestic gross.
As most Americans know, the Motion Picture Association of America (MPAA) gives a rating to every film released in U.S. theaters. The rating is intended to measure and indicate the offensiveness of the film. This practice underscores the volatility of the film product, but, I argue, it also shapes the filmgoing experience and enforces an American conservatism against artistic excess.
The Production Code, also called the Hays Code after the studios’ consultant on indecency, Will H. Hays, might more fairly have been called the Breen Code, after Joseph Breen, who assumed control of the Production Code Administration devised by a Catholic interest group with the intimidating title of the Legion of Decency. The group’s mission statement, which claimed the film industry was engaged in “the dissemination of the false, atheistic and immoral doctrines repeatedly condemned by all accepted moral teachers,” encouraged the MPAA (then called the Motion Picture Producers and Distributors of America, a reflection of the studio monopolies) to control the content of its films rather than judge them afterwards. The Code abhorred all anti-religious, anti-family, pro-violence, pro-lust, and anti-American sentiment in films. The precedent was set for the censorship of films, although the buck passed from studio head to director in the 1960s.
Once the Code was lifted in 1968, filmmakers could make films with any content (although certain classic films had not always abided by the Code, with little repercussion). The Code had already been weakened by the decision in United States v. Paramount Pictures, which dismantled studios’ control of the theaters and thus allowed the distribution of independent (read: naughty) films. In addition, Catholic interest groups no longer held as much sway over the film industry. Without the selection being limited to studio-grown films that expressed the “American” values held by the Hays/Breen Office, and thanks to an influx of profanity and nudity in 60s films, it became necessary to classify films according to their level of offensive content. It is important to note that the ratings system was enacted by MPAA President Jack Valenti in an attempt to guide parents rather than enforce filmmaking values. In effect, though, the ratings reflected the ideologies of a conservative section of society, represented by the Classification and Ratings Administration, and solidified certain elements of content as “immoral” or “offensive.” The ratings system, although technically voluntary, is required for films released in theaters; moreover, distribution is limited for films with higher ratings.
Throughout the various incarnations of film censorship organization, four primary concerns remain constant: the depiction of sexual or lewd acts, the demonstration or justification of violence, the portrayal of minorities, and the construal of what the MPAA now vaguely refers to as “thematic content” and what the Hays Office blatantly called “blasphemy.” Unfortunately, the high ratings given to films that depict these types of content do not completely prevent the distribution of films, nor do they effectively tackle the social concerns that are often blamed upon film. The same problem applies to that endless scapegoat for teen violence, video games.
This construction of the offensive film is inextricably tied to the pejorative social context of certain “swear words.” These words are deliberately and effectively negative, and have such a neurological effect upon utterance as to divert attention from other pain. As with all types of symbols, there is nothing inherently volatile or negative about these words. In fact, they’re used quite commonly (more if you’re a character on premium television), despite their “taboo” label. Most swear words in English are of Germanic origin. As Latin was not only the language of the church but the language of the learned classes, it’s understandable that the Germanic words in English would come to be regarded as common and profane (itself a Latinate word meaning “outside the temple”). Ultimately, the social construction of swear words is largely influenced by class divisions. Moreover, the psychosocial impact of swearing is entirely shaped by our level of aversion to the words (hence people who swear more often experience less of an analgesic effect from swearing). Should we discourage children from swearing? Perhaps, but it provides them little benefit to make the words taboo. On the other hand, teaching children total acceptance of the words deprives them of an instant painkiller.
Since the portrayal of minorities and women in film is something I’d like to write about in a future post building on this one, for now I will conclude by reviewing the prudish tendencies of CARA. According to the documentary This Film Is Not Yet Rated, CARA allows films with proportionately more violent content than sexual content to pass with a PG-13, while films with proportionately more sexual content are given an R or NC-17. Moreover, films with some sexual content but no violent content are more likely to be given an R. Filmmakers Kirby Dick and Eddie Schmidt argue that this reflects a latent bias against sexuality that falls outside conservative-minded frameworks of romance and heteronormativity. The film controversially features an exposé of CARA members (some of whom are not eligible to keep their seats according to CARA’s own by-laws) and a montage of rape scenes from PG-13 movies. Although the documentary’s thesis could be said to be overly deductive, it is interesting to consider the gender divisions, heteronormativity, and relative offensiveness of sexual acts portrayed in film. We know, for example, that a certain film with a nude painting scene, followed by adulterous sex, had a PG-13 rating despite a high on-screen body count and general terror. But Blue Valentine, featuring a loving straight couple engaged in oral sex, was slapped with an NC-17, while Gods and Monsters, featuring nude art scenes and a homosexual central character, was given an R rating. Zoolander evaded an R rating for its orgy scene only by omitting the goat that was scripted to be involved. Admittedly, it’s easier for a child to ignore the shot of two sweaty heads backlit by candlelight that adults decode as “lovemaking” than the more explicit shot required to show oral sex. But this defense cannot extend to homosexuality, and cannot explain the excessive violence, in particular sexual violence, deemed suitable for teenagers.
While I do not attribute teen violence entirely to media consumption, one can only deduce, based on the explicit function of the MPAA, that violence is simply considered less offensive than sex. It is more easily coded as an extension of the character’s personality; it may even be justified by the committing character’s reasons, be they national defense, a lover’s revenge, or speciesist superiority. Sex is also coded, but is hidden within character archetypes. Thus sex is acceptable for the leather-clad female action hero, but not for a gay woman. It is acceptable for the heterosexual Prince Charming, but not for the teenage sex hound. There are exceptions (hello, Grease). But if an MPAA rating is a measure of the offensive potential of a film, we must question the cultural source of those conclusions, and what might actually happen if a child sees two men kissing. Would it be so different from seeing a man and a woman kissing? And of a sex scene and a slaughter scene, which is more likely to shape a child’s view of her social world—or give her nightmares?
When I was a child, I was fascinated by science. I don’t recall what got me hooked, but my parents certainly encouraged my interest by providing me with books, informative videos, and subscriptions to children’s science magazines. Up until I was 16, I wanted to study in the realm of natural science, in particular the fields of evolutionary biology and paleontology.
However, my passion for science competed with my love for the narrative visual arts. Raised by two writers, one a fan of classic sitcoms and Broadway musicals, the other a fan of documentary films and the narrative-driven folk rock prevalent in the 70s, I was primed to be seriously interested in the narrative aspects of auditory and visual media. Thus my dreams of a science career conflicted with my love of movies and plays. Perhaps I could combine them, I reasoned. Following a blatantly adolescent interest in “making it big” after I graduated, I focused on star-making possibilities. Having watched many science documentaries, I decided documentary filmmaking was the way to combine science and film. I had been inspired by a scene in Sam Mendes’ American Beauty in which a young filmmaker describes how he uses film to demonstrate the beauty of the world. What a perfect application of filmmaking to science, I thought!
Unfortunately, my family was not wealthy enough to send me to the private universities where I applied, despite the generous scholarships I was offered. I settled on the one in-state public university on my list, which did not have a Filmmaking program. Having devoted most of my spare time in high school to drama club, I declared a Theatre Arts major in college, reasoning that I would develop important crossover skills and a foundational knowledge of narrative visual media.
I did indeed, but ultimately the program was not a good match for me; I felt under-stimulated and under-challenged. Meanwhile, my Anthro 101 class had completely reinvigorated my left brain. Moreover, I saw important theoretical parallels to the social and philosophical underpinnings of the theatre arts; I was interested in how theatre as a cultural practice and social event might be analyzed anthropologically. I switched to the Anthropology department, but continued my theatre courses as I completed my Anthropology degree. By that time I had accrued credits in comparative religion, philosophy, mythology, theatre history, and playwriting in addition to my coursework in cultural anthropology. I had learned the name of the fusion of my chosen disciplines: visual anthropology. Moreover, I had learned that it could be applied. My interest in the natural sciences had waned, although I reasoned that I could use it if I ever wanted to make an intelligent science fiction film.
After a long undergraduate career marked by a tumultuous personal life, declining funds for public education, and an exhausting job in retail, I was accepted to the University of Florida, where I feel that my truly (and somewhat messily) interdisciplinary plan of action can and will happen. I have devised a plan for the applied part of my science, and I have learned that all of my academic interests matter to what I do, because anthropology for me has evolved from a mere major into a major lifestyle.
Somewhat of a departure from the topics we’ve been discussing of late, but interesting: a Knox College study of young girls brings to light the factors behind girls’ self-sexualization:
Media consumption alone didn’t influence girls to prefer the sexy doll. But girls in the study who watched a lot of TV and movies, and whose mothers reported self-objectifying tendencies, such as worrying about their clothes and appearance many times a day, were more likely to say the sexy doll was popular.
The authors suggest that the media or moms who sexualize women may predispose girls toward objectifying themselves; then, the other factor (mom or media) reinforces the messages, amplifying the effect. On the other hand, mothers who reported often using TV and movies as teaching moments about bad behaviors and unrealistic scenarios were much less likely to have daughters who said they looked like the sexy doll. The power of maternal instruction during media viewing may explain why every additional hour of TV- or movie-watching actually decreased the odds by 7 percent that a girl would choose the sexy doll as popular, Starr said. “As maternal TV instruction served as a protective factor for sexualization, it’s possible that higher media usage simply allowed for more instruction.”
Mothers’ religious beliefs also emerged as an important factor in how girls see themselves. Girls who consumed a lot of media but who had religious mothers were protected against self-sexualizing, perhaps because these moms “may be more likely to model higher body-esteem and communicate values such as modesty,” the authors wrote, which could mitigate the images portrayed on TV or in the movies.
However, girls who didn’t consume a lot of media but who had religious mothers were much more likely to say they wanted to look like the sexy doll. “This pattern of results may reflect a case of ‘forbidden fruit’ or reactance, whereby young girls who are overprotected from the perceived ills of media by highly religious parents … begin to idealize the forbidden due to their underexposure,” the authors wrote.
The authors [of the 2007 APA study] cited examples like “advertisements (e.g. the Skechers naughty and nice ad that featured Christina Aguilera dressed as a schoolgirl in pigtails, with her shirt unbuttoned, licking a lollipop), dolls (e.g. Bratz dolls dressed in sexualized clothing such as miniskirts, fishnet stockings and feather boas), clothing (e.g. thong underwear sized for 7- to 10-year-olds, some printed with slogans such as ‘wink wink’), and television programs (e.g. a televised fashion show in which adult models in lingerie were presented as young girls).”
I will say that I think adults dressing as children are probably less of an influence on girls’ self-sexualization than the plethora of kid-size adult clothing styles. Years ago, I saw girls at the pool dressed in halter-top swimsuits…with nothing to halter! I see girls in miniskirts, mini cowboy boots, spaghetti-strap tops, mini-heels, the works.