When an exhibition called We Are At War opens in northern France, visitors will be stunned. It brings together many of the D-Day images shot by the famed war photographer Robert Capa on the beaches of Normandy in 1944 and believed to have been lost due to a disastrous processing error in the lab back in the UK. Scenes of close combat, of the terrified and the dead, of the bloody mayhem are revealed for the first time.
Or, at least, that’s how it will look until visitors exit the show—when it will be revealed that the brutal and harrowing images have, in fact, been created by the artist Phillip Toledano using AI. Only those with a nerdish knowledge of military history are likely not to have been fooled.
“People need to understand how easy it is to be manipulated by AI now,” says Toledano, who worked with AI on his previous series of more obviously altered, often surreal images, Another America. “We’re at a cultural hinge point in relation to being able to trust what we see now, one that only a small demographic—academics, journalists, media—seem to [be] talking about. But you have to be fooled by AI first to get what’s happening. I want regular people to see this show, enjoy it and then realise how easy it is for them to be lied to [by imagery], to realise that they’ve been had—because we’re going to live in a constant state of being had now”.
Toledano concedes that the ability to manipulate images has been with us since the invention of photography. The iconic 1860 photograph of Abraham Lincoln, for example, is actually the president’s head on someone else’s body, some trickery deemed necessary at the time because newspapers lacked a sufficiently "heroic" image of him. Between 1917 and 1920 two girls created an international hullabaloo with what they claimed were their photos of fairies at the bottom of the garden—the photos had Arthur Conan Doyle, creator of Sherlock Holmes, convinced. Stalin would have his comrades-turned-class-traitors airbrushed out of photos following their execution.
More recently—well, in 1982—National Geographic courted controversy with a shot that moved two Egyptian pyramids closer together so they fitted better on the cover. And tools the likes of airbrushing and Photoshop have been widely used in publishing—to tidy up many a Hollywood cover star, for example—since the early '90s. The difference, Toledano contends, is that using these tools requires time, money and expertise.
“But AI represents a quantum leap in what can be done and the speed with which it can be done, and it’s only going to get better and faster,” Toledano says. “To make convincing images of soldiers in battle on a beach would have required huge skill and resources. AI can give an image a new depth of truth. And it’s getting better all the time. You could sense with images generated using earlier AI that something was a bit off. You don’t sense that now”.
We Are At War
Planches Contact at the Deauville Photography Festival
Indeed, recent years have seen instances of an image generated using AI winning a photography prize—only for the artist to refuse it once he revealed his methods—and a jury disqualifying an image from another photography competition for being AI-generated when it wasn’t. Both entries required skill to produce. But the jury is not alone in its confusion. A study by psychologist Dr Sophie Nightingale of the University of Warwick, UK, suggests that just 65 per cent of people—not much better than chance—are able to identify whether an image is "real" or AI-generated. What’s just as concerning is that, in a second study, participants typically found images of AI-generated people more attractive and trustworthy than those of real people.
We do seem to be at a tipping point, at least in terms of attempts to deliberately mislead. A research paper released this year tracked misinformation trends, analysing some 136,000 fact-checks going back to 1995, and found that AI-based imagery accounted for very few of them—until the spring of 2023. What happened then? The democratisation of accessible generative AI tools. And an explosion in unreal imagery.
But this is not just about AI. Importantly, what’s also changing in our relationship with imagery is the way that AI-generated and other imagery is disseminated. The digital realm in which we now exist—the Internet, and social media especially—means that an image is everywhere almost instantaneously, viewed on the small screens of our phones with little time to assess it before it goes viral.
Such is the pace of the dissemination of these images that, the Brazilian philosopher Vilém Flusser argued, we no longer bother to decode the scenes in them as signifiers of the world. Rather, we now experience the world as a series of scenes. If it isn’t captured as an image, it did not happen. Hence the rise of holiday destinations and restaurants chosen for their Instagram appeal, and the enduring popularity (and ceaseless narcissism) of the selfie.
That’s given us the likes of Kim Kardashian saying—seemingly without any self-judgement—that in the short spell of a four-day holiday in Mexico in 2016 she managed to take 6,000 photos. Of herself. Maybe she had a point: Flusser also argued that the more images are produced of a particular person or event, the more socially or economically relevant that person or event becomes.
But why have we ever trusted photography, privileging it as a source of realism—relative to, say, a painting, through which the artist is assumed to have interpreted what they have seen? After all, for all of its claims to being more "real", black and white photography is obviously anything but. Twentieth-century philosophers the likes of Guy Debord have often pointed out that images actively shape our perception of reality by manipulating visual representation, while Jean Baudrillard argued that contemporary society is saturated with simulations that blur the lines between reality and representation. With his famous aphorism, Marshall McLuhan warned that “the medium is the message”.
Flusser even argued that there was a world before and after the invention of the camera because it completely changed our collective understanding of events, despite his contention that a technical image—one produced by a machine such as a camera, with its framing, composition, filters, depth of field and so on—can never be an objective representation of reality even as it offers the illusion of being just that. Rather than help us get to the truth, it stands in the way, the images it creates further detaching us from the real world.
Of course, we rarely take the time to consider all this when presented with an image, even while using software—the likes of the AI-based Facetune and other image-editing apps—to routinely doctor images ourselves. We like, we share, we scroll on, we reflexively snap away and then use whatever tools we can to perfect before we publish.
Indeed, perhaps we are somehow already beyond the idea of photography as truth. Toledano argues that while it had a dual purpose as art and as reportage from the outset, that distinction is now breaking down. Perhaps, he proposes, photography—pioneered less than two centuries ago—represents just a brief moment in human history when a medium was granted a role as some kind of truth-giver, and now that moment has passed. We’re back, in effect, to pre-photography times, when it was accepted that every image came slanted with a point of view, that every image was in some sense art. Does that mean we need to find something to be the new arbiter of the truth?
According to Bryan Neumeister, yes—because while fake images have been found to have a negative impact on our well-being (especially on the way teenagers come to regard their less-than-flawless selves), and deepfakes of a naked Taylor Swift may upset her fans, AI-manipulated pictures are increasingly creeping into places where the effects can be profound: into journalism and the courts.
Neumeister, through his Detroit-based company USA Forensic, is one of the few independent experts in digital forensics called on to assess imagery from around the world. He’s busier than ever. It was USA Forensic that was asked to act as a rebuttal witness in the acrimonious Johnny Depp-Amber Heard court case, in which both sides claimed that images presented as evidence could not be trusted. Are hostage images from an ongoing conflict real? It’s likely those wanting to know talk to Neumeister, who will dig down into the image’s hexadecimal data—the electrons to binary data’s atoms—to find out.
“AI imagery has its weaknesses and doesn’t do some things well, like eyes and hands. But you have to know what you’re looking for, and the problem is that, apart from obviously artistic images, the average person has no way of telling whether an image is real or has been generated using AI with deception in mind,” says Neumeister. That’s especially the case when an image has been uploaded to social media, because ownership passes to the host and useful data is washed from it.
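The kind of low-level check Neumeister describes—reading a file’s raw bytes to confirm its claimed format and see whether metadata has survived re-encoding—can be sketched in a few lines. This is a minimal illustration in Python, not any real forensic tool; the function names are our own, and real analysis goes far deeper.

```python
# Minimal sketch of a byte-level image check: identify the format from its
# "magic number" signature, and look for a JPEG APP1/Exif metadata segment
# (the segment social-media re-encoding typically strips).

MAGIC_NUMBERS = {
    b"\xff\xd8\xff": "JPEG",
    b"\x89PNG\r\n\x1a\n": "PNG",
}

def identify_format(data: bytes) -> str:
    """Match the file's leading bytes against known image signatures."""
    for magic, name in MAGIC_NUMBERS.items():
        if data.startswith(magic):
            return name
    return "unknown"

def has_exif(data: bytes) -> bool:
    """Walk the JPEG marker segments looking for an APP1 'Exif' header."""
    i = 2  # skip the SOI marker (FF D8)
    while i + 4 <= len(data) and data[i] == 0xFF:
        marker = data[i + 1]
        length = int.from_bytes(data[i + 2:i + 4], "big")
        if marker == 0xE1 and data[i + 4:i + 10] == b"Exif\x00\x00":
            return True
        if marker == 0xDA:  # start of scan: no more metadata segments follow
            break
        i += 2 + length
    return False
```

A JPEG fresh from a camera will typically report `has_exif(...) == True`; the same picture saved back down from a social platform usually will not, which is exactly the lost context Neumeister laments.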
“AI already means more people can fake images and do so very well, even if we’re not there yet for professional-standard images,” he adds. “But the emphasis there is on ‘yet’. In just six months it will be much better. I’ve seen some remarkably convincing fakes. That means AI imagery is a dangerous weapon when you know people don’t have the time, motivation or ability to check whether what they’re seeing can be trusted. I get news networks sending me photos and asking ‘Is this real?’ because they don’t want to get sued. But finding out takes time and they’re up against deadlines”.
Heavyweight news organisations the likes of Reuters have already been caught out publishing images that proved to be doctored—many, flabbergasted and outraged, issued a ‘correction’ withdrawing a photo of Kate Middleton and her family earlier this year after the future Queen of England was found to have lightly tinkered with it on her laptop. In academia, doctored images have been used to fabricate research results.
But when the right image—even when that image is wrong—can change the course of a national election, or justify military action, we’re in deep water. In 2004, John Kerry’s US presidential campaign was damaged when a doctored photo of him sharing a stage with the anti-Vietnam War protestor Jane Fonda was widely shared. Now something similar could be knocked up on an iPhone and go global in minutes. Indeed, last year a mostly innocent but faked photo of Pope Francis looking very street in a white Balenciaga puffer coat was commonly believed to be real.
These kinds of faked images can have lasting resonance too. Several studies over the last 20 years have shown how easy it is to use doctored images to implant memories. In one, participants were shown photos of real events from their childhood, with one doctored to show them taking a hot-air balloon ride that never happened. When interviewed some time later, half of them remembered the ride as if it had. Doctored images of public events—protests, for example—have been shown to change the way participants later recall the events.
Meanwhile, the very ubiquity of such fakery is corrupting because it allows people to claim any real image to be fake—to play on the sense of unease and mistrust that AI image generation is creating. Alondra Nelson, a sociologist at the Institute for Advanced Study in Princeton, speaks of “the liar’s dividend”—that suddenly no information is deemed trustworthy—and the cost of that in terms of, for example, holding public officials to account. “I think over time we [will] be much more sceptical [about all imagery we see]. And that’s not good for social cohesion,” she warns.
No wonder that some have proposed that any AI manipulation should be listed within an image’s metadata as standard—and, indeed, earlier this year the EU passed landmark AI regulation rules forcing AI companies to label fake images. Others have proposed all AI systems come with guardrails that prevent them from creating fake news images or have called for the certification of images through the use of so-called watermarks—a kind of technological stamp that would authenticate an image as real or flag up that it’s been manipulated by AI.
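To see what “certifying” an image could mean in the simplest terms, here is a toy sketch using a keyed hash (HMAC). Real proposals, such as C2PA content credentials, embed cryptographically signed provenance manifests inside the file itself; this standalone fragment only illustrates the underlying tamper-evidence idea, and the function names are our own.

```python
import hashlib
import hmac

# Toy illustration of image certification: a publisher signs an image's
# bytes with a secret key, producing a stamp. Any later manipulation of the
# bytes makes verification fail. (Real schemes use public-key signatures
# embedded in the file; HMAC here just keeps the sketch short.)

def certify(image_bytes: bytes, key: bytes) -> str:
    """Produce a tamper-evident stamp for the image."""
    return hmac.new(key, image_bytes, hashlib.sha256).hexdigest()

def verify(image_bytes: bytes, key: bytes, stamp: str) -> bool:
    """True only if the bytes are exactly what was certified."""
    return hmac.compare_digest(certify(image_bytes, key), stamp)
```

The arms-race problem Neumeister raises still applies: a stamp proves the bytes are unchanged since signing, not that the scene they depict ever happened.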
But, Neumeister worries, this inevitably leads to some kind of technological arms race—“not just to detect where images have been manipulated but to improve how they can be manipulated”—each side trying to keep ahead of the other. Phillip Toledano leans towards concern rather than confidence too.
“Some people are still very priggish about AI pictures, saying they have no soul. But that’s just not true. Show the images from We Are At War to a photojournalist and they see that they have a power, an emotion. And if images generated using AI can move you as much as ‘real’ photos can, then guess what comes next? People are going to use that to negative ends. Either we’re going to have to get used to being fooled by imagery or get to the point where nobody will care”.
But for all that younger generations especially appear to embrace image editing with few concerns—to the point where the final image bears little resemblance to reality—Nelson argues that this stands them in good stead. “I don’t think that they care less about the truth. [It’s more that] the way they use this tech seamlessly gives them expertise and a sense of what fakery is possible,” she says. They know to be alert to fakery in a way that older generations perhaps do not.
“Fakery has always been the norm. It’s as old as human culture,” she adds. “We’ve lived through shifts before in terms of different kinds of visual reproduction so I think in the course of time society shows resilience. There’s always a transitional moment when it seems that truth has been cast to the wind. But we develop the ability to discern—to pause and pose the question ‘Is this true?’ We’re just going to have to figure out how to take that pause. We didn’t do that five years ago. But now it’s do that or get duped. And nobody wants to be a dupe”.
We Are At War will be at Planches Contact at the Deauville Photography Festival until 5 January 2025. A book of the work is out now.