On the future of cinema, creating with AI, and stories as intellectual DNA
In conversation with Jeff Desom, film director and VFX artist for Everything Everywhere All At Once
Jeff Desom is a VFX artist who worked on Everything Everywhere All At Once and a film director who has directed music videos for Cuco, Mitski, and Father John Misty. His work is fantastical, featuring vivid miniature worlds and alternate realities brought to life with VFX. In one video, a car goes on a bizarre flight through space; in another, events in a replica world intersect in gruesome and unexpected ways with the real world.
The film industry is an archaic, slow-moving beast, but parts of it—visual effects, animation, cameras—are inherently technical fields. Software and hardware innovations are not unusual: computer graphics evolved from 2D to 3D animation over the past century, and cameras have improved to the point that several feature films were shot on iPhones in the 2010s. With Jeff, we extend the conversation about technological shifts to AI, speaking about his experiments with AI tools, visions for the future of cinema, and the craft of storytelling.
On becoming a filmmaker at the dawn of the digital revolution
Tell us about yourself. How did you get into film?
I grew up in Luxembourg on the cusp of the digital revolution. Everything analog was moving towards the digital and new editing systems were being introduced. Suddenly, with all this technology available to me, I could edit my own movies. Using visual effects, I could make something that I filmed in my garage look a bit more expensive, a bit bigger than it actually was.
For those who don’t know, what is visual effects (VFX)?
I’d like to say that visual effects is any alteration made to the image captured by the camera after the shoot—but nowadays VFX starts even earlier, in the pre-production stage. The delineations between pre-production and post-production are starting to blur. For instance, in Disney’s show The Mandalorian, they’re shooting against an LED screen. The images on that LED screen have to be prepared in advance, meaning VFX happens before the first frame is even filmed.
How would you describe your style as a filmmaker?
I have a very eclectic style—I like to lean into the technical aspect of things and I have a tendency toward the fantastical. I love Jean-Pierre Jeunet (Amélie), Tim Burton (Edward Scissorhands), Wes Anderson (The Grand Budapest Hotel), and David Fincher (Fight Club). In the conceptual stage of a project, I’m always asking myself how I can involve VFX. If you want to push the medium, you have to embrace the technology behind it and always try to play with the latest tools to see what new things you can come up with.
On infinite video streams
If you could know the answer to any question about how AI is going to change the filmmaking process, what would you ask?
I love films. I love the two-hour, film-in-a-darkened-theater experience. That's what I live for and that's what I’ve always wanted to do. I want to know if that will survive. Whatever the other thing is, will it be its own thing? Or will it swallow up this experience that is so cherished?
Have you ever seen the movie Strange Days? It starts off somewhere in the distant future and people have cameras in their heads. There's this moment when the character looks into a TV that projects whatever he is seeing. I feel like there’s going to be a point in the future where someone else will be able to watch whatever you’re imagining. It’s going to be a constant stream of everybody hallucinating. If you want to, you’ll be able to let this dream wash over you. It’ll adapt to your feelings and play you like an instrument. It’ll get so good at recognizing your emotional state that I’m afraid we’ll all be in a trance that we can’t come out of anymore.
On creating and consuming AI-generated content
Are you currently engaging with AI tools? Do you have any experiments or proofs of concept?
I have a very clear imagination. I want to regurgitate what I see in my mind’s eye, so that I can share it with my team or an audience. When I used Midjourney, I couldn’t get exactly what I was looking for. If I had painted from scratch, I would've been done in half the time. I can see how, a few generations down the line, these AI tools will really change things, but they’re not quite there yet.
So often, you see people talk about how AI helps them, but I find myself thinking: how much of that is them looking at what the computer spat out and then talking about it as if they had conceptualized it themselves?
It sounds like what you’re saying is that image generation has become more accessible at the expense of creative control. Do you think audiences are cognizant of this difference when they are consuming a piece of media?
I'd like to think that an audience can tell the difference between good craftsmanship and bad craftsmanship, but now you have this third category, which is AI. So, you’re asking: Was this even filmed? Was there a person behind this or wasn't there? Even I’m getting to the point where I couldn’t tell you.
Does it feel intuitively important for you either as an artist or a viewer to know who the maker of a piece of media is?
Stories are our intellectual DNA, a hand-me-down from one generation to the next. They contain lessons that have been learned by other people—so you don't have to learn them again. As such, I think stories have to be told from person to person. The computer's not going to have an intricate life experience. It's going to regurgitate something that might look like an experience, but it won’t know the purpose of sharing that story, so there’ll be an uncanny valley effect. I think you can feel the intention behind a story. If the intention doesn't come from sharing a life experience or a very particular point of view, something’s going to feel off.
Is there any way you are trying to or might want to try to incorporate AI tools into your process?
I've been using it on a current project in the concepting stage to come up with quick iterations of an image. I had it spit out a few options and then took the client’s feedback to rework the images in a more traditional flow. I definitely see it as an accelerant for concepting. From a technical standpoint, it's going to make things like rotoscoping or 3D animation faster. I'm only mad that AI is coming out after I've already wasted so many hours of my life on rotoscoping.
That said, sometimes in the process of making something, having time to reflect while you're doing a repetitive or menial task is how you discover a spark. I’m sure we’ll lose some of that. But I think as long as there is human interaction, you’ll find those moments somewhere else in the creative process. There’s always room for happy accidents.
On the life of unfinished films
Any closing thoughts?
It doesn’t take a lot to imagine AI tools that spit out a finished product, so that the film will be malleable until the very end. There’s this phrase “films aren’t finished, they’re abandoned.” But what if they were never finished in the first place? The strangest thing is that movies might continue on to have their own weird life.
Even now, on a streaming platform, you might see a Coke replaced with a Pepsi because an advertising deal has expired. Imagine that someone or something would continue to tinker with a film after it’s been released. When you see a movie again, it’s different. It’s not what you imagined. Not because you forgot something, but because it has changed on the screen.
That sounds like a media multiverse.
Yes, crazy times ahead.
Embeddings is an interview series exploring how generative AI is changing the way we create and consume culture. In conversation with AI researchers, media theorists, social scientists, poets, and painters, we’re investigating the long-term impacts of this technology.