Technologies of Genocide: Episode Four | The Uyghur Genocide: Surveillance, Global Capital, and Racialised Counterterrorism

By Suchitra Vijayan | December 1, 2025

The Uyghur genocide is a Chinese state-organized campaign of mass persecution, social destruction, and demographic engineering against Uyghurs and other Turkic Muslim communities in the Xinjiang region of China. The Chinese state has built an infrastructure of disappearance through a dense network of digital surveillance, predictive-policing algorithms, police build-up, and an archipelago of detention and forced-labor sites designed to remove people from their everyday lives while erasing the political meaning of that violence. Over the years, multiple independent experts have argued that this meets the legal definition of genocide, or at least of crimes against humanity.

For this episode of Technologies of Genocide, I spoke to linguist, activist, writer, and poet Abduweli Ayup. The conversation locates the Uyghur genocide at the intersection of surveillance, global capital, and racialized counterterrorism. But more importantly, it asks what it means to live in a world where thoughts, emotions, language, and labor flow through the same data infrastructures. Prof Ayup describes cameras in prison cells, apps on our phones, shrimp in European supermarkets made possible by Uyghur labor, and children who no longer speak their mother tongue as parts of a single continuum of oppression, enabled by those same infrastructures.

Prof Ayup begins with a powerful reflection:

“As human being[s], all of us believed until now that we are safe in our thoughts. People can put us in prison, but they will not imprison our heart, our will, our ideas.”

Political prisoners, dissidents, and poets have long consoled themselves with the idea that no matter what is done to their bodies, their minds will always remain free. Yet the Chinese state has set itself the task of conquering even that last refuge. Prof Ayup, in his steady voice, describes a cell with cameras above, recording twenty-four hours a day, their eyes trained on the inmate—they read the movement of the mouth, the small changes in the face, even the way the body holds grief or anger. “Inside the prison, there are three cameras on top of your head,” he says. “When you move your mouth, they learn you are repeating something inside. If you cry, if you feel sad or angry, you are identified and taken to interrogation.”

The image that emerges as he speaks, in his calm, composed voice, is chilling—a dystopian present, with its slow, corrosive assault on a person’s interior life. Every movement is logged, coded, and assigned a value; a silent repetition of a prayer becomes an event in a dataset, a flash of anger is assigned a risk score, and grief or discomfort is recorded as a potential act of dissent or disloyalty. If the system that constantly tracks and watches you decides that your face has become too expressive, your name is called over the loudspeaker. The inmate disappears again into the carceral black hole—solitary confinement, further interrogation, and punishment can all be triggered by an emotion caught on camera and judged dangerous by an AI system.

Prof Ayup is telling us that under algorithmic rule, even if a state does not read your mind in any metaphysical sense, it can still punish every sign that you possess one. The goal is not simply to know what you feel, but to train you to feel less, think less—to move your lips less, to shrink and finally shut down the inner world before the cameras notice. Here is a fully functional system, built, deployed, and perfected, in which terror is fluid, all-encompassing, and perpetual. It runs as a constant background score to everyday existence.

As we spoke, a vivid image began to form in my mind—I saw people draw circles around their existence and learn to live within them. With each successive recording and logging of human movement as data, the circles become smaller. It is not that the world around them has ceased to exist; rather, it has been rendered incomprehensible. The violence is never overt, yet it is omnipresent, like a panopticon. Always buzzing. I am reminded of a powerful passage from John Keane’s Violence and Democracy:

“The terror must be neither directly graspable nor manageable: it should function as noise whizzing through the heads of its potential victims. The name of the game is militant defeatism. The minds and bodies of the enemy should be shaken to their core. They should (to use prison language) be buried alive, tortured in their isolation, compelled to doubt themselves into oblivion. Meaning itself should be destroyed.”

Language forbidden, children unmoored

From here, the conversation moved to language and linguistic erasure.

Prof Ayup narrates another powerful scene: a parent, waiting for a child to return home from school, calls and asks, “Where are you, what are you doing?” in Uyghur. The call is automatically transcribed and flagged. “If you speak your language on the phone,” he says, “your conversation is identified, your child will be punished in school, and you will be questioned why you are speaking Uyghur to your kids.”

For a people already systematically targeted as a security threat, speaking to your child in the language of your ancestors immediately becomes an act of immense risk. An intimate conversation no longer carries any guarantees of privacy, and fear slowly does its work. Parents begin switching languages when they call, or stop calling altogether. Children learn that the mother tongue can put them “in danger,” a phrase Prof Ayup uses very deliberately.

The boarding school system established around 2014 took Uyghur children from ages three to nineteen. As parents vanished into camps and prisons, the children vanished into dormitories where they learned to speak Chinese, behave as Chinese, and perform Chinese culture. The Uyghur language was first removed from the curriculum, then from public institutions altogether. Over the years, Prof Ayup has interviewed many who have managed to leave or flee; they told him that children now under thirteen cannot speak Uyghur anymore. The genocide, he argues, has “achieved its goal without any smoke,” because advanced technology and carefully designed policies—implemented at the scale of society and institutions—do the work quietly. You are not killed, but broken down and rebuilt in the vision conjured by the Chinese state.

As I listened to our conversation again, I felt a particular kind of cruelty in this design: the parent on the phone trying to locate a child, the child who learns that the sound of their own language can summon punishment, the school where they spend two hundred and sixty days a year, the rule that they cannot communicate freely with family, and the official red-header notice of 2017 declaring that Uyghur was no longer to be used in education in the autonomous region. The sum of these details means that a generation of Uyghur children will grow up unable to imagine their future in the language of their grandparents.

Language carries everyday tenderness, anger, desire, insult, prayer, and humor. To strip that away is to interfere not only with identity, but with how people love and grieve each other.

Life inside the data architecture

None of this happens by accident. It is administered through a well-oiled system: the Integrated Joint Operations Platform (IJOP). The phrase sounds bureaucratic, almost dull; the reality is something else.

Prof Ayup walks us through it step by step. “All of the information related to your daily life is connected to the Integrated Joint Operations Platform,” he explains. “Your phone, your ID card, your bank card, your electricity, your health, your schooling, all in one big data.”

The genealogy of this system is as important as its function. It did not appear out of nowhere in Ürümqi. Prof Ayup traces it back to the years after September 11, 2001, when the US Department of Homeland Security commissioned IBM to build a system that could hold all of the information about those, quote unquote, “terrorists.” That system was used at Guantánamo and then retired. Chinese companies later bought the architecture and refined it, first using it against Tibetans, then against Uyghurs.

Imagine that Prof Ayup is in Ürümqi, and I am in the United States, and we are having this conversation online. Because his provider is connected to the IJOP, the platform flags his contact with a foreign number. An automated notification goes to the local police station. A message arrives on his phone: “Please come to the station.” He goes; he is arrested and taken to prison.

Once inside, he discovers something worse. The interrogators already know his family history going back three generations, his old classmate who now lives in the United States, and his previous work. The police recite these details back to him: what his grandfather did, who his friends are, and who he once briefly spoke to about human rights.

“You cannot remember all of the things about yourself,” he says, “then somebody tells you. You feel you are just like naked. They know everything about you.”

The word “naked” here is not a metaphorical flourish; it is a precise description of the loss of self. Every human is now digital data, a data body. Memory, which is meant to be imperfect and malleable, is catalogued into digital dust. More importantly, this strips away the right to decide what of oneself is visible and what can remain hidden. It is also an inversion of the usual promise of big data, which claims to know populations only in the abstract; in Xinjiang, the data is intimate enough to be used against each individual.

Today, we speak about data and the loss of privacy as unfortunate but inevitable consequences of modern life. We treat surveillance as an irritation, a flaw.

In reality, however, data architecture is the nervous system of a carceral state: an integrated mesh of databases, predictive scores, and tracking devices that learns to recognize, contain, and punish particular kinds of lives. The same algorithmic systems that decide which ads should follow you across the internet are being retrained to determine which neighborhood should be flooded with patrols, which family should face a welfare fraud investigation, and which migrant should be pulled aside for secondary screening.

Put differently, the “recommendations” and “suggestions” do not stop at the marketplace. The same tools that infer a taste can diagnose a person as a “threat.” And once a person has been rendered as a risky profile in a system, that profile can be passed along a chain of agencies until it ends in a cell, an ankle monitor, or a street designated for constant surveillance.

In fact, these systems have already learned to recommend bodies for detention.

Tibet as a laboratory, Xinjiang as perfected genocide

The Chinese state’s digital infrastructure was not born in Xinjiang. It was first tested in Tibet. Prof Ayup recalls that we once regularly saw images of self-immolation by Tibetan monks protesting Chinese rule. That particular form of resistance has almost disappeared. Leading up to 2015, the Chinese state installed cameras in temples, extended surveillance into religious neighborhoods, categorized people and concentrated them in particular areas, collected information, and refined its ability to intervene before a public act of dissent could occur.

Those methods, tested on Tibetan monks, later traveled to Xinjiang on a larger scale. In Tibet, Prof Ayup says, the tools were applied primarily to temples and small dissident groups. In Xinjiang, they were perfected through an unprecedented totalization—from monks to entire communities, from religious institutions to all aspects of life, from selected sites of dissent to whole villages.

Before 2017, the Chinese government’s declared goal was accelerated assimilation, bringing minorities “into Chinese society.” After 2017, the policy shifted considerably. First, the Uyghur language was banned from schools and public institutions. Second, birth control and forced operations were imposed on Uyghur women aged eighteen to fifty, and the birth rate dropped sharply. Third, Uyghurs were transferred in large numbers to Chinese provinces and “immersed” in Han society. Fourth, children were separated from their families and routed into three kinds of boarding schools: inside Xinjiang, in Chinese-dominant cities, and in distant provinces. Fifth, public use of Uyghur was forbidden in banks, hospitals, police stations, and government offices. In Prof Ayup’s words, “These five factors show us this is genocide against a specific ethnic group—the Uyghurs.”

When he calls Tibet and Xinjiang part of the same continuous project, he traces a through line of experimentation, escalation, and refinement. A technology first aimed at monks becomes a system capable of reordering an entire population. That trajectory forces us to see laboratories of mass violence as testing grounds where new techniques of control are developed before they circulate outward.

Genocide is a business model

“Genocide is also a business model,” Prof Ayup says. “It is not only the Chinese government’s behaviour,” he argues. “International companies, international stockholders, big entertainment companies are also involved with this genocide.”

None of this—the zones of death, the architectures of mass disappearance—is sustained by ideology alone; these are also profitable enterprises. Xinjiang has become a laboratory for surveillance technology and an engine of profit for Chinese and foreign firms. Huawei, for instance, produces surveillance cameras, and AliCloud provides servers. But the investments and profits are also global. Norway’s national funds, including an oil foundation and a national pension fund, invested in companies tied to the camps and divested only recently, in 2020 and 2021. There are, as always, American firms: IBM, which developed the original big data system, and Microsoft and others that provide software and hardware.

The end credits of Disney’s 2020 live-action Mulan extended “special thanks” to the Turpan Public Security Bureau and the publicity department of the CPC Xinjiang Uyghur Autonomous Region Committee, entities implicated in the mass detention, surveillance, and abuse of over a million Uyghurs and other Muslim minorities in Xinjiang’s internment camps. Earlier films shot in the region include the U.S.-China co-production Warriors of Heaven and Earth (2003) and The Kite Runner (2007).

Tesla opens factories in China, happy to operate in a landscape shaped partly by forced labor. Apple manufactures phones in China through suppliers that have employed Uyghur workers under coercive conditions. Consumer brands like Nike, H&M, and Zara profit from global supply chains in which prison factories and “poverty alleviation” schemes make labor cheap and disposable.

The line between state violence and private profit has all but disappeared. The same firms that manage pension funds in Europe or growth capital in Silicon Valley help underwrite the tools that empty villages, sterilize women, and break families. The genocide now appears in PowerPoint presentations and investment prospectuses as a stable market for cameras, chips, cloud services, and low-cost labor. Genocide, and the labor extracted from communities targeted for annihilation, become line items on spreadsheets.

For too long, we have been sold the comforting fiction that atrocities happen in isolation, far from the global economy. The language of “doing business in China” obscures the fact that these are businesses operating inside a political project that treats Uyghur people and bodies as both threats to be eradicated and resources to be exploited.

Exporting the Uyghur model

Once the system proved efficient in Xinjiang, it was quickly exported to other parts of the world. The Chinese state turned the same technologies inward on the Han majority during the pandemic, using them to track movement and enforce lockdowns. Then it exported them along the Digital Silk Road.

By 2019, China’s surveillance technology had been exported to countries like Ecuador and Zimbabwe as “smart city” or “safe city” solutions. These “packages” promise public safety, efficient policing, and modern infrastructure. In practice, they provide the foundations of political control: monitoring, social scoring, and sorting.

In Qatar, during the 2022 FIFA World Cup, an app was required to enter certain areas, one functionally identical to an app imposed on Uyghurs, with a different name and an Arabic interface. The United Arab Emirates used a similar system during elections. The code remains, the language changes, and the oppression becomes global, repeating, mutating, turning into technologies of genocide that can be installed anywhere.

If you download consumer apps like Temu or Shein, your data is collected: “What you buy, where you go, what your preference is, they learn,” Prof Ayup says. “Another part is your political view. At the end, they will use it however they want.”

He calls the Uyghur model “a perfect model for dictators and for international businessmen.” It manufactures and delivers an underclass of labor, predictive social control, and lucrative datasets. It travels quietly, embedded in contracts, export packages, and free downloads that promise bargains.

The point is not that every instance of surveillance or every app is a direct extension of Xinjiang. The point is that the techniques refined there have become part of a global repertoire for managing populations. Somewhere in that repertoire, what began as genocide against a specific people becomes a template for governing many others.

Forced labor and relocation: the second phase

Under international scrutiny, China shifted from highly visible mass detention to something, in Prof Ayup’s view, more dangerous. Mass arrests peaked around 2017; by 2020, the visible expansion of the camps had slowed. Since 2022, a new vocabulary has dominated: “poverty alleviation,” “labor transfers,” “skills training.”

In practice, Prof Ayup explains, this means relocating Uyghurs from their homeland to factories in at least eighteen Chinese provinces. It means transforming prisons into factories and factories into prison-like environments. It means stocking production lines with coerced workers who cannot refuse.

Prof Ayup was himself held in Chinese prisons, and he argues that even in prison, you live with your own community and share a culture. You see faces you recognize and hear your language. Even if you work sixteen hours a day, you know that the others around you understand who you were before the cell.

Relocation to Han-majority cities strips away even that fragile sense of safety. “You will disappear in Chinese society more easily than in your own culture,” Prof Ayup says. You lose the comfort of recognition. Everything is designed to dissolve you into the majority, to disappear you.

The products of this Uyghur labor circulate everywhere.

“Just imagine we are eating seafood in Europe,” he adds. “It includes Uyghur forced labor. Our car spare parts also include Uyghur forced labor. Nike, H&M, Zara, they include Uyghur forced labor.”

Here is one of the many ways the genocide enters the bloodstream of global consumption, not only through chips and cameras, but through shrimp, steering wheels, and fast fashion. We ingest it unknowingly, live in it, and walk around in it.

He calls this new phase more dangerous than the camps because it cuts people off from their roots, their culture, and the environments where they still feel some small measure of hope. People are scattered into production chains, their present spent for the comfort of others, their children taught to speak another tongue.

How do conversations like this end?

By the end of the conversation, Prof Ayup returns to the question that has been quietly present throughout: Whose future is being shaped in Xinjiang?

Without Silicon Valley, without international investors, without companies that sell efficiency and safety to states hungry for control, China’s system could not operate at this scale. “Without those chips, how can they operate those advanced surveillance cameras?” he asks. First the system, then the hardware; then Chinese companies put them together and turn them against Uyghurs.

He is unsparing about responsibility. “For me, no one is innocent,” he repeats. “Everybody is connected to this genocide, as a customer, as a supporter, also as ignorance.”

He calls the Uyghur genocide “the final call for humanity, for all humanity, to think about what we are doing and what we are buying and who we stand with and what is the right thing to do and what is wrong. Who will be controlling us and what we can do to protect ourselves as human beings with our own language, our own culture, our own tradition?”

It is tempting, when faced with a machinery of domination this vast, to retreat into helplessness. The scale is global, the actors are powerful, and the systems are deeply embedded. Yet his words do not come from a place of surrender; they are a warning. A people stands at the edge of annihilation, and that process reveals the map of a world many of us already inhabit.

To listen carefully to this conversation is to understand that the Uyghur genocide is a map of what is possible when states, corporations, and technologies converge without meaningful constraint. The question that remains is not whether the system will spread. It already has. The question is whether those of us who hear this “final call” will demand to rethink how we live, what we buy, what we endorse, and who we choose to stand beside when the cameras turn toward us.

SUPPORT US

We are committed to bringing you the stories that don’t get told. For that, we need your support. However small, your contribution would be appreciated.


Suchitra Vijayan is the founder and executive director of The Polis Project and the author of Midnight’s Borders: A People’s History of Modern India and How Long Can the Moon Be Caged? Voices of Indian Political Prisoners.
