A group of documentary filmmakers, producers, and archivists has written a series of guidelines on how they believe filmmakers should, and should not, use generative AI in documentaries.
While many in entertainment would sum up their AI guidelines as “never, ever, a billion times no,” the reality is that generative AI has already crept into documentary filmmaking and is likely here to stay. An organization called the Archival Producers Alliance has outlined its best practices for filmmakers when it comes to handling consent, being transparent, and preserving history and truth.
“We recognize that AI is here, and it is here to stay. And we recognize that it brings with it potential for amazing creative opportunities,” APA co-founder Jennifer Petrucelli (“Crip Camp”) said at the IDA’s Getting Real event on Wednesday. “At the same time, we want to really encourage people to take a collective breath and move forward with thoughtfulness and intention as we begin to navigate this new and rapidly changing landscape.”
The initial guidelines — a nine-page document obtained by IndieWire — are just a draft at this point, with the group intending to formally publish them in June. And by no means will any of these practices be binding for people across the documentary field. (The APA will be soliciting more feedback on the proposals in the meantime.)
The proposals advocate for the use of original images and video footage rather than anything created by AI. The APA is a group of several hundred archival producers who aim to uphold “truthfulness” and journalistic integrity in documentaries.
In their view, it’s OK to use AI to lightly touch up or restore an image (the group distinguishes between “GenAI” and other machine learning used for workflow improvements), but anything newly created, and anything that would alter primary sources or “change their meaning in ways that could mislead the audience,” is a big no-no. The APA acknowledges, too, that even archival footage can be biased or problematic, but says the source material’s intent can be known and put into context. AI, the guidelines say, has “no accountability of authorship.”
If you must use AI because no primary source is available, it’s important to take into account the bias that could be implicit in the training data, to take special legal care, and to consider how any images you create could be put out into the world and be “in danger of forever muddying the historical record,” the draft reads. The group also believes filmmakers should get “additional consent” from subjects about how AI is being used, mimicking some of the language actors have pushed for in their contracts.
To that end, the APA’s guidelines advocate for transparency: disclosing to fellow filmmakers, to the subject, to an estate, and especially to viewers that what they are hearing or seeing is AI-generated. Just as with a reenactment in a documentary, the use of AI should be abundantly clear both to everyone involved in the production (through real-time communication and time-codes in editing) and to everyone watching on screen (through appropriate lower-thirds or visual cues), and filmmakers should apply “the same intentionality” they would to other material. For instance, the APA advises against using AI to make a real person do or say something they didn’t actually do, to create a realistic-looking depiction of a historical event that never happened, or to alter footage of a real place or event.
One controversial example: filmmaker Morgan Neville used AI to create a digital replica of Anthony Bourdain’s voice for the 2021 documentary “Roadrunner.” The AI voice, which sounded like Bourdain, read aloud a few lines from Bourdain’s journals, things he hadn’t actually spoken or put to tape. Neville revealed the technique after the fact in interviews, but it wasn’t clear to viewers that what they were hearing wasn’t Bourdain. A more agreeable example is “The Andy Warhol Diaries,” which used an AI-generated Warhol voice only after his estate gave its approval.
The APA, founded only in 2023, now has 300 members, including several Oscar nominees in the documentary field. In November, the group published an open letter warning against AI and calling for industry standards around the use of generative material, particularly in documentary films.
The group’s next steps are to seek endorsements of the proposals from other industry and awards organizations, to gather input from distributors and streamers, and to establish an “AI Board” that will review changes in the space annually.
THR first reported the news of the guidelines.