Artificial intelligence has been weaponized in China. That should be a wake-up call for the world

CBC News

The Chinese government is using AI-powered facial recognition systems to monitor and target members of the Uighurs, a persecuted Muslim minority in China. (Ng Han Guan/Associated Press)

China is using AI to persecute its Uighur minority

At this year’s World Economic Forum in Davos, the philanthropist George Soros caught everyone’s attention when he warned that the Chinese government’s use of artificial intelligence (AI) presents an “unprecedented danger” to its citizens and to all open societies.

His reading of the situation was prophetic. Last month, The New York Times confirmed Soros' fears when the newspaper revealed that the Chinese government is using AI-powered facial recognition systems to monitor and target members of the Uighurs, a persecuted Muslim minority in China. Human Rights Watch, in its recently released report titled "China's Algorithms of Repression," provides additional evidence of Beijing's use of new technologies to curtail the rights and liberties of the Uighurs.

In the region of Xinjiang, where the majority of China's Turkic minorities reside, surveillance cameras equipped with facial recognition are omnipresent on street corners and at mosques and schools. Commuters travelling between towns must pass through security checkpoints where police, with the help of a mobile app, can access information ranging from their religious practices and political affiliations to their social media activity and even their blood type. In this ecosystem of intense social monitoring, even routine, legal behaviour, such as exiting through a back door, can be treated as suspect and serve as grounds for dubious arrests.

China has already faced international condemnation for its large-scale arbitrary detention of the Uighurs. The Global Center for the Responsibility to Protect and the Asia-Pacific Center for the Responsibility to Protect, in their report titled "The Persecution of the Uighurs and Potential Crimes Against Humanity in China," estimate that approximately one million Uighurs and other Turkic Muslim minorities have been placed against their will in "re-education" facilities.

The report cautions: “If urgent measures are not implemented to end the current state of systematic persecution, there is a clear and imminent danger of further crimes against humanity occurring.”

Social credit system

China's willingness to use AI to control its wider population and stamp out disorder is already well reflected in its nascent social credit system. Developed jointly by private entities and the state, AI-powered algorithms collect data on individuals' financial and social behaviour to calculate a social score and determine whether they pose a threat to the Communist Party of China.

Citizens with low creditworthiness are publicly shamed as their names and faces appear on billboard-sized displays. However, the use of AI-based facial recognition systems to target minorities pushes this systematic repression one step further. This is the world's first case of a government using AI to carry out what many human rights experts consider mass atrocity crimes.

Continue reading…

Source: Kyle Matthews & Alexandrine Royer | CBC News
