
Kate Middleton and the End of Shared Reality

If you’re looking for an image that perfectly showcases the confusion and chaos of a choose-your-own-reality information dystopia, you probably couldn’t do better than yesterday’s portrait of Catherine, Princess of Wales. In just one day, the photograph has transformed from a hastily released piece of public-relations damage control into something of a Rorschach test—a collision between plausibility and conspiracy.

For the uninitiated: Yesterday, in celebration of Mother’s Day in the U.K., the Royal Family released a portrait on Instagram of Kate Middleton with her three children. But this was no ordinary photo. Middleton has been away from the public eye since December, reportedly because of unspecified health issues, leading to a ceaseless parade of conspiracy theories. Royal watchers and news organizations naturally pored over the image, and they found a number of alarming peculiarities. According to the Associated Press, “the photo shows an inconsistency in the alignment of Princess Charlotte’s left hand”—it looks to me as if part of the princess’s sleeve is disappearing. Such oddities were enough to cause the AP, Agence France-Presse, and Reuters to release kill notifications—alerts that the wire services would no longer distribute the photo. The AP noted that the photo appeared to have been “manipulated.”

Across social media, onlookers offered amateur analyses of the photograph, suggesting that it was poorly Photoshopped or perhaps touched up using AI. They wondered why there are leaves on the trees despite the photo supposedly having been taken in early March. The children seem to have weird hand positions. There are unexpectedly blurred lines in the image, and Middleton is missing her wedding and engagement rings. “I wasn’t in on this whole conspiracy about Kate Middleton missing and the royals covering it up until they dropped this obviously fake photo today to appease public concern,” one amateur photographer wrote on X, citing a “few unexplainable issues.”

In response to the blowback, Kensington Palace released a statement earlier today—signed with a “C,” likely in reference to Middleton’s formal name, Catherine—saying in part that “like many amateur photographers, I do occasionally experiment with editing.” That post has only made things worse. As one popular response to the statement put it, “I am struggling to believe that the most famous royal family in the world—and the woman who would be queen—fiddled around with photoshop and put out a family pic (designed to quash rumours about her whereabouts) without anyone in the ranks inspecting it. Nah. Not buying it.”

For years, researchers and journalists have warned that deepfakes and generative-AI tools may destroy any remaining shreds of shared reality. Experts have reasoned that technology might become so good at conjuring synthetic media that it becomes difficult for anyone to believe anything they didn’t witness themselves. The royal-portrait debacle illustrates that this era isn’t merely on its way. We’re living in it.

This post-truth universe doesn’t feel like chaotic science fiction. Instead, it’s mundane: People now feel a pervasive, low-grade disorientation, suspicion, and distrust. And as the royal-photo fiasco shows, the deepfake age doesn’t need to be powered by generative AI—a hasty Photoshop will do.

Back in 2018, I spoke with Renee DiResta, a researcher at the Stanford Internet Observatory, about AI tools being used by bad actors to cast doubt on real events. “You don’t need to create the fake video for this tech to have a serious impact,” she said at the time. “You just point to the fact that the tech exists and you can impugn the integrity of the stuff that’s real.” This dynamic works in the opposite direction too, as demonstrated by the royal portrait released yesterday. The popular emergence of generative AI has deepened uncertainty in an already-skeptical information environment, leading people to suspect that anything could be the product of synthetic media.

To my untrained eye, there appears to be no sign that the image of Middleton and her children was made with a generative-AI tool. It does not, for example, have any of the gauzy hallmarks of some of the big-name programs, such as Midjourney. Yet some people have seized upon small details in the photo to claim that it is indeed synthetic: Observers have argued that the children’s hands and teeth look off, classic giveaways that an image was made by AI. The most likely explanation, of course, is that the children were squirming, and that the image was clumsily Photoshopped to combine the best individual take of each child from multiple shots. But the fact that AI image tools exist offers a juicier, perhaps more sinister possibility of fakery, one that might imply that the princess is far worse off than the monarchy is letting on. This is tinfoil-hat stuff—and yet, it is also theoretically, technically possible. And it is true that some hyperrealistic image models produce such high-quality images that it’s quite difficult to distinguish between real people and fake ones. Even hastily made AI photos can fool casual observers—remember the Pope Coat from last year?

All of these anxieties and suspicions are most potent when they intersect with a subject where genuine conspiracy seems plausible. And when it comes to the Princess of Wales, there is some weird stuff going on. As my former colleague Ellie Hall, who extensively covers the Royal Family, noted in an interview last week, Kensington Palace’s public-relations strategy has been “out of character”—the communications team doesn’t usually respond to gossip. There’s also been a dearth of speculative coverage from British tabloids, which Hall notes has aroused suspicions. And then, of course, there’s the photo, which Hall wrote was distributed by the palace in an “unprecedented” manner. The seemingly sloppy Photoshopping, then, is merely the final, very odd straw. A good conspiracy theory involves a lot of world building—the more twists and turns and murky details, the better, Mike Caulfield, a researcher at the University of Washington, told me last year: “And it’s all possible because there is some grain of reality at the center of it.” The princess’s Mother’s Day portrait slots easily into the already-dense, opaque universe of the Royal Family.

Most important, as Hall notes, people have recently lost trust both in the Royal Family as an institution and in the organizations that cover the monarchy. In part due to Prince Harry and Meghan Markle’s departure from royal life, there is a newfound sense of the royals as conniving and manipulative, and the press plays into this. This trust vacuum, when it collides with a still-new technology such as generative AI, creates the optimal conditions for conspiracy theories to grow. It seems clear, at least in the case of the Royal Family, that the institutions aren’t sure how to handle any of this. It makes sense, then, that two of the biggest “Is it real?” image controversies of the past year have centered on figures from archaic cultural-political organizations: the papacy and the monarchy.

None of these dynamics is particularly new—Adobe Photoshop, the likely culprit of any supposed “manipulation” in the royal portrait, has been around for more than three decades. And although the tools are getting considerably better, the bigger change is cultural. The royal-photo debacle is merely a microcosm of our current moment, in which trust in both governing institutions and gatekeeping organizations such as the mainstream press is low. That distrust has been building for some time and was exacerbated by the corrosive political lies of the Trump era.

But synthetic media seems poised to act as an amplifier—a vehicle to exacerbate the misgivings, biases, and gut feelings of anyone with an internet connection. It’s never been easier to collect evidence that sustains a particular worldview and build a made-up world around cognitive biases on any political or pop-culture issue. It’s in this environment that these new tech tools become something more than reality blurrers: They’re chaos agents, offering new avenues for confirmation bias, whether or not they’re actually used.

When I look at Middleton’s portrait and the cascade of memes, posts, and elaborate theories about which elements of the image are real, I’m reminded of the title of a book by the journalist Peter Pomerantsev: Nothing Is True and Everything Is Possible. The book is about post-Soviet Russia, where distrust, corruption, and propaganda created a surreal and toxic culture and politics, but the title may as well be describing the events of the past 30 hours. And I fear that it may be an apt descriptor of the months and years to come.



