Simon Kupfer

The surveillance state comes to campus

Use of private-sector AI technology to reveal protesters' identities amounts to a digital witch hunt to suppress dissent
An anti-Israel protester wearing a Hamas headband outside the NYU Langone Medical Center in New York City, January 6, 2025. (Luke Tress/Times of Israel)

There is a new battlefront at the heart of American academia. It is fought, this time, not with placards or slogans, but with algorithms and artificial intelligence. Across US campuses, the fallout from the Israel-Hamas war has intensified. With it comes a growing campaign to identify, expose, and deport foreign students involved in anti-Israel protests.

When a demonstrator was captured on video in January at a New York rally against the State of Israel, a mask and headscarf concealed her face, leaving only her eyes on display. Days later, photos of her entire, unconcealed face, along with her name and her employer, were distributed online.

Private groups have begun using facial recognition technology to identify, expose, and even seek the deportation of foreign students participating in anti-Israel protests – expulsions that President Trump appears to be all too happy to approve.

NesherAI, software developed to unmask previously anonymous demonstrators, has helped change how governments track and manage protesters. Its use represents a notable shift from traditional protest monitoring to algorithm-driven identification. From the French revolutionaries to the civil rights marchers of the 1960s, protesters’ anonymity was a safeguard against retaliation. Today, that protection has crumbled, and a new era of protest has begun. NesherAI, developed by software engineer Eliyahu Hawila, scans images of masked demonstrators, matches them with social media profiles, and exposes their identities. Occasionally, their names are forwarded to US authorities.

Eliyahu Hawila, a software engineer, tinkers with the coding of a facial-recognition program he wrote to identify masked protesters in New York on March 7, 2025. (AP Photo/Adam Geller)

What we see here is essentially the use of facial recognition as a tool to curb political activism. Hawila and his supporters argue that this is simply a matter of accountability – “You come here on a visa, you chant for intifada, you celebrate Hamas’ atrocities – you don’t get to hide,” he says – but it looks more like a digital witch hunt, an unprecedented use of AI to suppress dissent. This controversy is not merely a matter of technology, but of power. Foreign students who support Hamas, occupy buildings, or intimidate Jewish peers have undeniably crossed a line. Yet this may also be an attempt to silence criticism of Israel, conflating protest with extremism. Both arguments recognise the stakes.


Surveillance is not new: aside from Big Brother, historical reality offers chilling parallels of its own. COINTELPRO, the FBI’s covert 1960s program that sought to neutralise civil rights leaders, Black Panthers, and anti-war activists through surveillance, infiltration, and exposure, could never have competed with the tools of today. McCarthyism blacklisted suspected communists, driving many into exile or economic ruin.

There is, however, a crucial difference: COINTELPRO was secretive, operating in the shadows of government agencies. NesherAI, in contrast, is a private actor that operates in plain sight, its creators openly celebrating each ‘unmasking’ online. Abed Ayoub, national executive director of the American-Arab Anti-Discrimination Committee, described it best: ‘It’s a very concerning practice. We don’t know who these individuals are or what they’re doing with this information… Essentially, the administration is outsourcing surveillance.’

One of the most striking aspects of this technological shift is the manner in which it bypasses traditional legal structures. Governments have historically controlled the mechanisms of surveillance and law enforcement. Today, that power is becoming increasingly decentralised, and private individuals – NesherAI among them – have begun to take political matters, it seems, into their own hands.

Democratic societies limit surveillance and punishment to legally accountable institutions for a reason: unchecked power invites abuse. When private groups become judge, jury, and executioner, one is left with a system with no checks or balances. Mistakes, misidentifications, and deliberate abuse are inevitable. Once a precedent is set, the technology will not remain confined to one ideological cause. What is used today to combat terrorism on campuses can tomorrow be used to crush dissent against whatever regime finds itself in power.

This is especially so given that what happens in the United States rarely stays in the United States. The use of artificial intelligence to track, identify, and target protesters is not unique to this conflict. Across the world, authoritarian regimes have long exploited facial recognition to suppress dissent: China has long used facial recognition, particularly against Uighurs and pro-democracy activists in Hong Kong; in Russia, similar tools have been deployed to identify and arrest anti-war protesters.

Historically, student activism has shaped the way society views an issue. If new technologies render activism too dangerous, an entire tradition of political engagement is at great risk of being lost. NesherAI may have been created with a specific purpose in mind, but history has shown that once a weapon is developed, it rarely remains in the hands of its creators. It is, after all, a slippery slope.

About the Author
English writer exploring Zionism, diaspora, and what makes a democracy. Contributor to the Times of Israel, Haaretz and other platforms.