How Prosumer Cameras, Apple and YouTube Have Changed Documentary Storytelling in the 10 Years Since 9/11

by Edward J. Delaney

The Naudet brothers, Jules and Gedeon, hadn’t set out to make a documentary about 9/11. On the morning of September 11, 2001, in the early days of the digital-media revolution, they were following a rookie firefighter checking out a gas leak and ended up at the World Trade Center, capturing the first plane hitting the towers (conspiracy theories about the film notwithstanding) and the only footage shot from inside the towers that day.

At a time when mainstream media dominated news coverage, the day’s digital technology allowed the Naudets to capture on-the-go video footage in a way that would not have been possible for low-budget documentarians 10 years earlier. (By the same token, there weren’t as many documentary filmmakers then.)

In a week that finds us reflecting on the changes, both domestically and globally, in the 10 years that have followed the 9/11 attacks, I’m fascinated by how far digital technology has come and how documentary filmmaking has changed along with it.



The Rise of Digital Video

On this 10th anniversary, we find ourselves living in a consumer-electronics landscape where millions of Americans have video recorders in their pockets (be it a cell phone or a point-and-shoot camera with video capabilities), the means to edit on personal computers, and the means to distribute online — for free.

But in 2001, the “prosumer” market, which provided professional-grade equipment at consumer-grade costs, was just picking up steam on the back of the new DV (Digital Video) format. We also forget that reliable video sharing didn’t come to be until 2005, with the online video repository YouTube, and that bandwidth has grown so much since the turn of the millennium that it’s become an afterthought when creating media. By 2007, YouTube’s offerings were so vast and so popular that it was consuming nearly as many bits as the entire Internet had in 2000.

Today, companies associated with easy creation, editing and sharing of video are some of America’s largest. Apple, maker of the iPhone and video editing software Final Cut, was for a moment this summer our largest public company, surpassing the market capitalization of Exxon Mobil. YouTube owner and phone-operating-system-maker Google holds steady in the top 25.

The Filmmaking Burden Loosens

As filmmaking and online distribution have become even more accessible, the very definition of “documentary” has both evolved and eroded.

A decade ago, making a film had a monumental feel to it. Because of the difficulties of making one, in both cost and technical skill, a documentary about mercurial or thin subjects (or by a newcomer) took a back seat to one where a filmmaker invested time commensurate with the cost of the equipment. Now, every would-be filmmaker can create high-definition video with high-quality audio. Topics can be as small as the pocket-sized equipment the films are shot on, and filmmakers can find niche audiences online, what Wired‘s Chris Anderson called “the long tail” in 2004. That, as much as anything, has changed documentary film: no longer having to squeeze every frame through the narrow (and often profit-needy) outlets that once exercised control over the broadcast of content.

Ten years ago, network broadcast standards and theaters’ need for 35mm prints made it easy to shun a DV video (especially one shot quickly or cheaply). In the mainstream media, the new reality of documentary video has meant some odd splitting. We watch the nightly news in crystalline HD, then cut instantly to correspondents reporting, on camera, via satellite phone. Newspapers have become documentary video sources (via their websites, just as television outlets now deliver text on theirs). “User-generated” videos are given labels like “crowdsourcing” and “iReporting” to separate them from the real stuff, and when one video breaks through, reaching the masses against all odds, it’s called a “viral video.” (Surely its success is not based on its merits.) To see video produced by The New York Times today is to forget the partitioning of media 10 years ago.

A New Aesthetic

And, as the saying goes, one man’s junk is another’s treasure. Work that failed traditional “quality” measures was also succeeding on sheer storytelling. When the documentary Tarnation came out in 2004, made for the legendary sum of $218.32 and edited on Apple’s iMovie, its string of best-documentary awards showed that pixels were not the final determiner of a film’s potential. Viewers are willing to accept what they see if it makes sense.

The acclaimed 2009 documentary Tehran Without Permission was made surreptitiously on a Nokia N95 camera phone by Sepideh Farsi and played at festivals worldwide. YouTube conditioned people to watch films on business-card-sized viewers, and personal devices only furthered that, but storytelling has shown itself to trump format.

Crowdsourced documentaries, such as One Day on Earth, #18DaysInEgypt and YouTube’s Life in a Day, continue to push the form on every front, from how they are conceived to how they are produced to how they are exhibited (or interacted with).

A New Decade

Beyond all the equipment, beyond all the digital means, what’s changed in the past decade is our reality.

Documentary is about taking a hard look at ourselves. If the 1990s were a bit of a fantasy, with easy-to-win “wars” and an easy economy, they were a prelude to the stark truths of the 2000s. We have lived through a decade of constant fact, and documentaries are a tool for understanding our world, as unpleasant as it might be. On 9/11, we could hardly bear to watch, but we couldn’t look away. That is something the documentary form continues to demand of us.

Find more documentary news and features on POV’s Blog, or follow POV on Facebook or Twitter.

Edward J. Delaney
Edward J. Delaney is a journalist, author, filmmaker and editor of DocumentaryTech, an online project that explores documentary filmmaking techniques and technology.