Let’s Talk About The Shady Ethics Of The New Anthony Bourdain Documentary

The new documentary Roadrunner about Anthony Bourdain has been creating a lot of buzz among audiences and critics, but not for the reasons you'd expect. After watching the documentary, many are questioning the ethics around the use of machine-generated soundbites in the narration of the film, which, to unknowing audiences, appear to have come from the late Bourdain himself.
It has recently been revealed that Roadrunner features three sentences that were recreated from written notes using A.I. technology. Filmmaker Morgan Neville teamed up with a software company, which created the roughly 45 seconds of A.I. narration by feeding hours of archived footage of Bourdain's real voice into its model. Bourdain's A.I. voice speaks at different points in the film, and even reads out part of a real email that he had sent to his friend David Choe: "My life is sort of shit now. You are successful, and I am successful, and I'm wondering: Are you happy?"
The issue here is less that Neville decided to use A.I.-generated voiceover rather than, say, have an actor read out Bourdain's words, and more that nowhere in the film is this disclosed. In fact, the information only came to light when Neville was directly asked in an expansive interview about how certain quotes were obtained. During a conversation earlier this month, Helen Rosner of The New Yorker asked how the filmmakers could've gotten audio of Bourdain — who died by suicide on June 8, 2018 — reading his own email, and Neville revealed that the voice was in fact machine-generated. He explained the decision, saying, "There were a few sentences that Tony wrote that he never spoke aloud. With the blessing of his estate and literary agent we used AI technology. It was a modern storytelling technique that I used in a few places where I thought it was important to make Tony's words come alive." At the time of the interview, Rosner writes, the director did not acknowledge that two other quotes were also A.I.-generated.
However, after the news of the manufactured quotes began to circulate, Bourdain's estranged wife Ottavia Busia said on Twitter that she never gave her permission for Bourdain's voice to be rendered in this way. She later told The New Yorker that while she remembered A.I. being mentioned in a conversation, she didn't realize it would be used in the film. "I took the decision to remove myself from the process early on because it was just too painful for me," she wrote via email to the magazine.
Neville, however, still doesn't think the matter is an issue, and even said that it's something that "Tony would've been cool with." (Busia denied this claim on Twitter writing, "I certainly was NOT the one who said Tony would have been cool with that.") "We can have a documentary-ethics panel about it later," he told The New Yorker.
Many on social media think it needs to be addressed now. Yes, some feel that because these are Bourdain's own words and the clips are so short, it's not a huge deal, but many specifically take issue with the fact that audiences aren't alerted to its use in the film.
"When I wrote my review I was not aware that the filmmakers had used an A.I. to deepfake Bourdain's voice for portions of the narration. I feel like this tells you all you need to know about the ethics of the people behind this project," film critic Sean Burns wrote on Twitter.
"I feel like using AI to make Anthony Bourdain's voice appear to say something that he never actually said is not ONLY wildly unethical but also something that Anthony Bourdain would absofuckinglutely hate on account of it being ghoulish and awful," wrote another user.
Much of the discomfort also comes from the increasing pervasiveness of unverified and false information on the internet, as well as the weaponization of deepfake videos to mislead and dupe. We tend to trust audio and video more than text and quotes these days, but the fear is that people will start to doubt what they see and hear.
But what some felt was perhaps even more troubling than the Bourdain deepfake was the exclusion of Bourdain's former girlfriend Asia Argento from the film — especially given that she's presented in it as "the agent of his unravelling," per Rosner. Neville told Vulture that he deliberately chose not to interview her. He claimed that because she's done many interviews, including her would've "distracted" from Bourdain's story and "been painful for a lot of people."
“It instantly just made people want to ask ten more questions,” Neville said of not including Argento. “It became this kind of narrative quicksand.”
Alternatively, it seems that the shady ethics of the documentary have become a critical quicksand all its own.
Refinery29 reached out to Focus Features for comment.