‘All eyes on Rafah’: How a viral campaign exposed an unfolding AI war

Over the past eight months, we have witnessed the unspeakable horror of Israel’s war against Gaza.

After the most recent offensive on May 26 against a tent camp for displaced Palestinians in Rafah, the world saw images of a decapitated baby held aloft as bodies burned to ash and fires raged in the background. Forty-five people were killed that Sunday evening, in what was likely retaliation for the International Court of Justice’s order that Israel halt its military operations in Rafah.

These apocalyptic scenes were a reminder that Palestinians cannot escape the bombings.

The relentless onslaught since October 7 has left more than 36,000 Palestinians dead (not counting those trapped under the rubble), tens of thousands injured and nearly two million displaced as a result of Israel's strategic and targeted violence. These scenes have circulated on social media with unprecedented volume and have been described as the “first live-streamed genocide.”

There is no shortage of content broadcast from the heart of Gaza, bearing witness to the terror in real time. Indeed, in this age of influencers, it is disturbing, if predictable, that the brave Palestinian voices documenting and broadcasting to the world have become familiar names to us all.

The sheer volume of raw, gruesome footage circulating on social media has long meant that mainstream media has been displaced as the primary source of news by TikTok, Instagram and X.

We follow the feeds of young people on the ground who share not just pictures but insights into their lives and personal experiences. We share their fear and their grief, and we worry for their safety when they have not posted that they survived the night.

With this in mind, we watched with great interest as the now ubiquitous Instagram template “All Eyes on Rafah” quickly went viral.

The image, generated by artificial intelligence (AI), shows the slogan spelled out in white tents amid an endless expanse of neatly erected tents in various shades, set against snow-capped mountains.

The capitalised slogan and the tidy, pleasant image sit oddly together; the reality on the ground in Rafah bears no resemblance to the scene depicted. But perhaps that is precisely why the post is so easy to share.

It is possible that the complete sanitization and deliberate omission of graphic images made the post easier for social media users to “share” at a time when influencers and celebrities have been heavily criticized for their lack of engagement on the issue of Palestine.

But if ease or convenience are key factors in the content we share, then we need to ask ourselves a fundamental question: what are we improving by posting this image instead of a real one? What exactly are we protecting?

Digital activism

The outbreak of the Gaza war brought a significant shift in how we use technology and the digital space. Students protesting against the atrocities often face the threat of doxxing, as well as concerns about professional consequences if employers are tagged in or informed of social media posts.

In a recent case, Faiza Shaheen, a prospective parliamentary candidate for the UK Labour Party, was reportedly deselected for liking tweets supporting the boycott, divestment and sanctions (BDS) movement against Israel. The intimidation of activists is real, as are its effects, and so the virality of the “Rafah” post, which has been shared more than 40 million times, may owe something to the feeling of safety in a crowd.

Many critics of the AI-generated graphic compared it to other viral gestures widely seen as superficial and performative, such as the infamous black profile squares and associated #BlackoutTuesday posts that proliferated during the 2020 George Floyd protests as a show of solidarity with Black communities against the violent injustices and exclusion they faced.

Others believe that the “Rafah” piece is proof of the political power of art and its potential as a form of protest and resistance. The lack of “horror” may have contributed to the image’s enormous reach, by bypassing all filtering and blocking mechanisms.

But this episode prompts us to look more deeply at the role and influence of AI in developing online narratives and activism. The ease and speed with which AI can generate content is alternately hailed as fascinating, entertaining, and even empowering.

Enter a few keywords and an image (or text) is immediately at your fingertips, one that in some fashion expresses our thoughts while also revealing our fears and prejudices.

Just as the digital space has undoubtedly opened up a multitude of opportunities for activism, it also raises questions about the effectiveness of viral online trends in achieving goals.

What exactly are the effects of hashtags and mass blockouts in this neoliberal age?

While they may indeed influence their intended targets, the question remains: what information about ourselves are we revealing to social media companies through our engagement in this activism, and are such trends potentially tools to control, contain or appease activism?

“AI genocide”

In response to the popularity of the “Rafah” image, numerous countergraphics were produced, including one showing a Hamas fighter holding a gun and looking down at a baby, with the question “Where were your eyes on October 7?” emblazoned above it in capital letters.

Regardless of our political leanings, it is clear that AI-generated content is amplifying our narratives and appealing to a content-driven audience. At the same time, it is becoming increasingly difficult to distinguish truth from lies as ideology-spreading bots and fake accounts proliferate, a trend that will only deepen our distrust of technology. What makes this particularly complicated, however, is that it comes at a moment when citizen journalism is at its peak.


We are literally witnessing the first AI war unfolding in real time, whether through viral AI-generated posts to fuel narratives or the use of AI surveillance and weaponry to inflict maximum damage.

While opinions vary on whether sharing AI-generated content is acceptable, we urgently need to address how AI is being used to wage war through surveillance and “targeted” killings.

In 2021, Israel boasted of its use of innovative technology during an incursion known as Operation Guardian of the Walls, in which 261 Palestinians were killed, as Gaza has become a laboratory for AI weaponry.

The Israeli military’s continued reliance on AI targeting and tracking systems such as “Lavender,” “The Gospel,” and “Where’s Daddy?” gives these programs frighteningly broad powers, authorising kills with “dumb bombs” (unguided munitions) under minimal human oversight.

The use of drones and sophisticated surveillance programs to destabilise the lives of Palestinians, while protecting the lives of Israeli soldiers by avoiding the need for a ground invasion, shows how AI technologies are being used to prioritise some lives over others.

In this way, AI is being used to dehumanize, erase, and threaten Palestinian lives, whether intentionally through the use of weapons against entire populations, or perhaps inadvertently through the dissemination of sanitized images that obscure the atrocities routinely inflicted on Palestinians.

In these pressing times, AI's obfuscation of the truth will only serve to cloud our vision. This is a paradox we must be aware of as the amount of content at our disposal continues to grow unabated.

The views expressed in this article are those of the author and do not necessarily reflect the editorial line of Middle East Eye.