DGA members have until June 23 to vote on the tentative agreement with the studios that was reached earlier this month. Ratifying the deal is no slam dunk, and one line about AI could crater the whole thing.
“Employers may not use [generative AI] in connection with creative elements without consultation with the Director or other DGA-covered employees,” the tentative agreement reads. The use of the word “consultation” just hangs in the air.
Some Directors Guild members have called that language “weak” and “dangerous.” Probably most nervous are DGA members who also belong to the still-striking Writers Guild, the WGA.
“The Matrix” director Lilly Wachowski said the DGA-AMPTP’s legalese “has a stink of deviousness.” “Law & Order: SVU” showrunner Warren Leight said on Twitter he’s been around long enough to know that taking the studio’s word on that whole “consultation” thing doesn’t mean much.
The trouble is, there is no single, consistent definition for what such a “consultation” actually means. So we asked the only folks who could make sense of such a thing: entertainment lawyers.
“It’s more than a notification but considerably less than an approval,” Simon Pulman, a partner and co-chair of Pryor Cashman’s media and entertainment groups, told IndieWire. “The studio or the producer would have to tell the director of their plan to use generative AI, and would have to give them the opportunity to air their thoughts and feelings about that. But there’s no requirements within that to actually engage with that feedback.”
So the studio or producer can’t just drop the director a text or an email about planned usage of AI and call it a day. They must engage in good faith. But it doesn’t mean the director gets the final say — or veto power — on the matter.
It’s understandable that some directors, writers, and especially director-writers want more than that. Ivy Kagan Bierman, Loeb & Loeb partner and the chair of the Entertainment Labor Group, sees “consultation” as a stand-in for “compromise,” which she called “very meaningful.”
“If the DGA agreement said that the companies simply had to notify the directors that they were going to be using AI in connection with the creative elements, that would be a much bigger issue because the directors would have no involvement whatsoever in that decision making,” she told IndieWire. “The purpose of that consultation is to be meaningful, is to take into consideration the director’s perspective, the director’s notes and comments, and then to make decisions in part based upon that consultation with the director.”
Kagan Bierman, Pulman, and one DGA-WGA member who spoke with IndieWire on the condition of anonymity all agree on one thing: thank goodness the agreement defines AI as not being a person.
“I was really pleasantly relieved to see them willing to say on the record that AI is not a person and that people are required for these jobs,” the DGA-WGA member told IndieWire. “I know that sounds like a really dumb thing. But that really was an existential dread that everybody was feeling because of a lack of responsibility to the writer at all in terms of engaging them on those issues.”
Pulman may not be as optimistic as our DGA-WGA member, but he offered up another silver lining that hinges on Hollywood being a relationship business: If a studio wants to keep working with an A-list director, it had better not finish that director’s film with Midjourney. The bad news? Not every director has such status.
Worse yet? Even some AI advocates think the DGA deal’s language is soft. Edward Saatchi of The CultureDAO, which represents a collective of filmmakers telling stories with AI (without the aid of studios), believes directors deserve more autonomy than the pending agreement gives them.
“That is the bare minimum, and we can all go much further for a kind of technology that is very disturbing to people,” Saatchi told IndieWire. “One would hope it would be a director-led process, rather than the producer checking in with the director, the director having the power to figure out how to use generative AI for those who are making entirely AI movies. That probably is too weak and artists should have more control.”
When reached, a DGA rep said the AI language also ties into previously codified language on consultation over creative elements such as the director’s cut, spelled out in Section 7-202 of the DGA’s 2020 basic agreement, which requires employers to give good-faith consideration to the director’s advice and suggestions. The spokesperson declined to comment further.
We’ll find out in one week whether the language was good enough for (a majority of) directors. Regardless, the writers see a reason to continue holding out: AI threatens to impact writing credits more than it threatens to hurt directors. The WGA maintains its own fight is not tied to the DGA’s — or to SAG-AFTRA’s, for that matter. Pulman says he gets it, though he believes the DGA’s language will still be “helpful” for the WGA to build upon at the negotiating table.