predictive policing 2020

Ciaramella | From the December 2020 …

A July 2020 article from the MIT Technology Review argued that predictive policing algorithms are racist. Here, we’re replacing Minority Report’s precogs with massive data sets and AI algorithms, but the intent is remarkably similar: use every ounce of technology we have to predict who might commit a crime, and where and when, and intervene to prevent the “bad” people from causing harm. In May 2018, the grassroots organization Stop LAPD Spying Coalition released a report raising concerns over the use of Palantir and other technologies by the Los Angeles Police Department for predicting where crimes are likely to occur, and who might commit them. PredPol was used by the LAPD for nine years, during which time critics lobbied for police departments to stop using it, arguing that it is unjust and racist. Kanagawa police near Tokyo hoped to put a predictive policing system into practice on a trial basis before the 2020 Tokyo Olympics and Paralympics, sources said.

Human Rights Watch analyzed this material to assess the credibility of the Excel spreadsheet, as it was obtained from the same source. Previous Human Rights Watch research suggests that the IJOP system’s requirement that officials respond to many perceived abnormalities in people’s lives is grueling. According to one government notice, the information collected is uploaded to the Xinjiang Traffic Police Headquarters in a timely manner. At both levels, most people were flagged over their relationships and communications (通联): for being related to (家族人员 or 亲属), traveling with, or staying with (同行同住) someone the authorities consider suspicious. Other grounds include “generally acting suspiciously,” “having complex social ties” or “unstable thoughts,” and “having improper [sexual] relations.”
The list includes detainees flagged by a Chinese predictive policing program, called the Integrated Joint Operations Platform (IJOP), which collects data and identifies candidates for detention. They determined that the list appears authentic based on details such as detention duration and the language used. This contradicts the Chinese authorities’ claims that their “sophisticated,” “predictive” technologies, like the IJOP, are keeping Xinjiang safe by “targeting” criminals “with precision.” The authorities listed “terrorism” and “extremism,” both perilously over-expansive terms in Chinese law, as the reasons for detaining about 10 percent (or over 200) of the people on the list. About four in five were listed as being related to someone who had downloaded or shared such “extremist” content, or to someone who was detained for terrorism or extremism. According to the Aksu List, the IJOP can flag people at both the prefecture and regional levels, though it is unclear how, or if, those two levels of analysis differ. Other flagged behaviors include using suspicious (or “minority” 小众) software, in particular peer-to-peer file-sharing applications, and going “off grid” (去向不明 or 轨迹不明), for example “switching off their phone repeatedly” or being missing for periods of time. People were also flagged for traveling: internationally to “sensitive” countries, including Turkey, Afghanistan, Saudi Arabia, and Kyrgyzstan; domestically outside Aksu, including elsewhere in Xinjiang such as Urumqi and Kashgar, or to other parts of China such as Beijing and Shanghai; or without notifying local officials. “‘Predictive policing’ platforms are really just a pseudo-scientific fig leaf for the Chinese government to justify vast repression of Turkic Muslims,” Wang said.

Law enforcement authorities (LEAs) elsewhere have also begun using artificial intelligence and predictive policing applications that are likely to raise ethical, data protection, social, political, and economic issues. Fortunately, criticism of such systems has grown as well.
The big data program, the Integrated Joint Operations Platform (IJOP), apparently flagged the people on the Aksu List, whom officials then evaluated and sent to “political education” camps in Xinjiang. People were flagged for having relatives in a group the authorities have labeled as terrorist, including the East Turkistan Islamic Movement (ETIM) and a couple of local Aksu groups. In one case, a man was detained for having studied the Quran in the mid-1980s and for having “let his wife wear a veil” in the early 2000s. People are also detained for having no fixed address. One entry notes that a woman, Ms. T, was detained because the IJOP system had flagged her for “links to sensitive countries”; it reported that Ms. T had received four calls from a foreign number in March 2017, down to the number of seconds. Human Rights Watch was able to find 14 of the listed phone numbers on WeChat. The Aksu List, dated around late 2018, is similar to another leaked file, the Karakax List. Human Rights Watch research indicates that neither the procuratorate nor the courts, the agencies whose approval China’s constitution requires for arrests, appears to be involved in these detentions. There is some preliminary evidence that the IJOP system itself is being used in China outside of Xinjiang. Ningxia has the country’s highest concentration of ethnic Hui, also a Muslim (though not Turkic) minority. The mass surveillance and arbitrary detention of Xinjiang’s Turkic Muslims violate fundamental rights under China’s constitution and international human rights law.

In 2003, a group of entrepreneurs set up the company Palantir, named after J. R. R. Tolkien’s seeing-stones in The Lord of the Rings. Palantir is just one of an increasing number of data collection and analytics technologies being used by law enforcement to manage and reduce crime. Predictive policing and programs like it have been subjected to much warranted criticism in recent years. *It looks like the Bureau of Justice Assistance Smart Policing Initiative was rebranded as Strategies for Policing Innovation (SPI, same acronym) sometime in the past two to three years.
At both stages, the IJOP system assists officials in selecting targets. The list also shows that some people are allowed to return home when they are sick (or, in one case, breastfeeding), but are sent back to detention once they regain their health. “Often … we asked them leading questions,” the official said. Neither the initial collection of personal data nor the sharing and use of such data involves asking for consent, illustrating how powerless Xinjiang’s residents are. Human Rights Watch is confident that all the people on the Aksu List are Uyghurs. Ms. T’s sister believes Ms. T is being forced to work in a factory against her will, noting that Ms. T had been training for a different career before she was detained. The Aksu List also contains 27 unique Chinese mobile phone numbers.

Opposition groups even gathered academics to speak out against the use of PredPol. Critics say the predictive policing program has led to heavier policing of minority neighborhoods. But critics often miss the deeper and older problem: the preventive principle of policing. The resolution, released by the National Speech and Debate Association on February 1, 2020, is: “Resolved: Predictive policing is unjust.” On the contrary, she insists, predictive policing raises glaring civil rights concerns and reinforces harmful racial biases.

The company largely flew under the radar for many years, working with other companies and intelligence agencies to extract as much information as possible out of massive data sets. I wrote about Palantir a couple of years ago in Films from the Future, and given the company’s prominence this week, thought it was worth posting a short excerpt from the chapter below. Special thanks to Yael Grauer for additional writing and research.
In Xinjiang, the IJOP system is connected to information databases on second-hand cars, allowing the authorities “to compare information between persons and vehicles in real time, to discover suspicious clues in a timely manner, and to use information technology to effectively eliminate and prevent criminals from using second-hand vehicles to engage in various criminal activities that endanger the safety of society,” as a publicly available Xinjiang Department of Commerce notice says. The notice further explains: “Each automobile trading hall in our region is equipped with ID card recognition devices and video image capture systems to examine the ID cards, residence permits, community certificates, convenient contact cards, vehicle information … of both buyers and sellers, to ensure that the person and their identity match.” The IJOP is also repeatedly mentioned in the Karakax List. The case of “Ms. T” on the Aksu List illustrates how the program’s algorithms identify legal behaviors as grounds for detention. People were also flagged for having “extremist thoughts” or downloading “extremist” audiovisual content. In one case, losing an ID, which was subsequently used by someone else, was a cause for detention. A vetting scheme in Turpan County also draws on information gathered from “Physicals for All,” another compulsory data collection program in which medical and health data is collected. In the words of one official: “In the beginning we were arresting those who spread terrorism videos, those who receive or give funds to ETIM, [and] those who participated in riots, and we would send them to the local political education centers.” A leaked list of over 2,000 detainees from Aksu prefecture provided to Human Rights Watch is further evidence of China’s use of technology in its repression of the Muslim population.

The company Palantir hit the headlines this week as it made its debut on the New York Stock Exchange. An Amnesty International investigation found that a predictive policing system deliberately targets Eastern Europeans.
A 2019 study by the AI Now Institute, for example, describes how some police departments rely on “dirty data,” data that is “derived from or influenced by corrupt, biased, and unlawful practices,” including both discriminatory policing and manipulation of crime statistics, to inform their predictive policing systems. In general, it is practically impossible to disentangle the use of predictive policing tools from other factors that affect crime or incarceration rates. In June 2020, Santa Cruz, California became the first city in the United States to ban municipal use of predictive policing, a method of deploying law enforcement resources according to data-driven analytics that supposedly are able to predict perpetrators, victims, or locations of future crimes. While in theory this process could possibly enhance public safety, in practice it creates or worsens far more problems than it solves.

Human Rights Watch’s analysis of the Aksu List strongly suggests that the vast majority of the people flagged by the IJOP system are detained for everyday lawful, non-violent behavior. People were also flagged for resisting official policies or official “management”; in one case, a man was detained for not paying rent on his land. In most cases, their photos, names, and locations indicate that they are Uyghurs from Aksu. The use of intrusive surveillance, including in and around people’s homes, also violates everyone’s right to privacy. While it is unclear how the system is being used, religious restrictions on Hui Muslims, such as closing mosques and Arabic-language schools and scrubbing Arabic script from halal restaurants, have been on the rise in Ningxia and other Hui Muslim areas since 2019.
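The “dirty data” feedback loop the AI Now study describes, biased records steering patrols, which then generate more biased records, can be sketched in a few lines. The following is a toy simulation with invented neighborhood names and rates, not any deployed system’s algorithm:

```python
# Toy sketch (assumed numbers, hypothetical neighborhoods) of the feedback
# loop critics describe: patrols go wherever the most crime has been
# *recorded*, and crime is only recorded where patrols are present.
import random

random.seed(42)

TRUE_CRIME_RATE = {"north": 0.30, "south": 0.30}  # identical by construction
recorded = {"north": 10, "south": 5}              # biased historical records

def patrol_target(records):
    """Send the patrol to the neighborhood with the most recorded crime."""
    return max(records, key=records.get)

for day in range(1000):
    patrolled = patrol_target(recorded)
    for hood, rate in TRUE_CRIME_RATE.items():
        crime_occurred = random.random() < rate
        # Crime only enters the data set where officers are there to see it.
        if crime_occurred and hood == patrolled:
            recorded[hood] += 1

print(recorded)
```

Because recording depends on patrol presence, the neighborhood with the biased head start (“north”) accumulates records indefinitely while “south” stays frozen at its initial count, even though both have identical true crime rates.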
In Turpan County, for example, officials vet people using the IJOP system to select politically reliable people to take part in a “labor transfer” scheme, a program that involves organizing and transporting groups of Uyghurs to work in factories elsewhere in China under closely supervised, and often coercive, conditions. The official recalled, “Let’s say we had to evaluate a person whom the system had lost track of,” and added, “Later on, we rarely marked someone as suspicious.” People were also flagged for having previously been a target of Xinjiang government actions, such as being detained or convicted of ordinary or political crimes. Column F appears to list the reasons officials gave for detaining a person. Their detention dates ranged from mid-2016 to late 2018. Ms. T’s sister said that Xinjiang police interrogated Ms. T around the time the Aksu List recorded her detention date. In August, Radio Free Asia’s Uyghur Service provided Human Rights Watch with an Excel spreadsheet titled “List of IJOP Trainees,” containing the names of over 2,000 people, obtained from an anonymous Xinjiang source in late 2018. This analysis provides Human Rights Watch with additional confidence in the authenticity of the Aksu List.

A September 30, 2020 article, “A Black Mathematician Says Scholars Need to ‘Engage’ on Predictive Policing,” reports Daniel Krashen of Rutgers arguing that the algorithms won’t go away, so it is important to make them fair. The excerpt below is part of a longer piece on the dangers of using advanced technology to try to differentiate between a propensity toward “good” and “bad” behavior, and it riffs off the movie Minority Report (hence the references to the film below); you can read a longer excerpt from the chapter on Medium. * Smart Policing aims to develop and deploy “evidence-based, data-driven law enforcement tactics and strategies that are effective, efficient, and economical.” It’s an initiative that makes a lot of sense, as evidence-based and data-driven crime prevention is surely better than the alternatives.
In mid-2019, Human Rights Watch was able to speak with an official in the region involved in carrying out the IJOP system. About half of the people on the list are men and the other half women. The use of automation, which the authorities claim helps them identify those harboring the “ideological virus” of disloyalty to the Chinese Communist Party in a thorough and “precise” manner, can also lead to sloppy policing. At the peak, well over 100 people were detained on a single day. The Aksu List contains two mobile phone numbers from outside China. One is still functional and, as noted, the person who picked up the phone confirmed she is the sister of the woman on the list we have called Ms. T. Flagged religious behaviors include: studying the Quran without state permission or allowing one’s children to study the Quran; reciting the Quran, including Khitma [海提玛], the recitation of the entire Quran; preaching the Quran without state permission, or listening to such preaching; wearing religious clothing, such as the burqa or veil, or having a long beard; having more children than allowed by China’s family planning policy; marrying or divorcing outside of official Chinese legal requirements, for example marrying before reaching the legal age (22 for men and 20 for women), marrying through a Nikah (an Islamic law marriage contract), or practicing polygamy; going on Hajj (the annual pilgrimage to Mecca in Saudi Arabia, considered a religious duty in Islam) without state permission; and performing the Hijra (伊吉拉特), a form of migration to escape religious persecution following the pattern of the migration of the Prophet Mohammed from Mecca to Medina in 622 CE, which the authorities consider to be motivated by adherence to Islam.

A coalition of civil rights groups, including the American Civil Liberties Union and the Electronic Frontier Foundation, issued a statement criticizing the tendency of predictive policing to perpetuate racial profiling. A note on the term “predictive policing” is in order.
The ACLU’s Ezekiel Edwards forwards the case that such software is more accurate at predicting policing practices than at predicting crimes. As an aside, the company also has an interesting, although indirect, link to ASU. Government reports state that data collected via the IJOP system is also being used for a variety of vetting tasks, from screening applicants for police jobs or public services such as poverty alleviation programs, to picking out model Communist Party members. The file’s metadata suggests that it was last modified in late 2018. Yet there’s growing concern that, without sufficient due diligence, seemingly beneficial data- and AI-based approaches to policing could easily slip into profiling and “managing people” before they commit a criminal act. And here, there are real dangers that predictive policing systems will end up targeting people who are assumed to have bad tendencies, whether they do or not. Ms. T’s sister said she has had no direct contact with her family in Xinjiang since then, but heard via an intermediary that Ms. T, presumably upon her release from a political education camp, was now working in a factory five days a week and allowed to go home only on weekends. In another case, a woman was detained for once going to Kashgar, and once staying overnight in Hotan, both in 2013. But in recent years, Palantir’s use in “predictive policing” has been attracting increasing attention. The Santa Cruz Police Department, which began predictive policing with a pilot project in 2011, had put a moratorium on the practice in 2017, when Andy Mills started as police chief. Administrative officials, including police officers, make the sole decision to detain someone. CETC did not respond to a Human Rights Watch request for comment.
The Aksu List does not provide any additional information on the content of such audiovisual materials. Although the sender has not been identified, the list appears to come from a part of Aksu prefecture where 80 percent of the residents are Uyghurs. Some entries say the individual was detained in a camp after completing a prison sentence, and 20 list the crimes. In one case, a man was subjected to “political education” because he had been detained in 2014 for 15 days for carrying a knife and for not “properly explaining” the incident. The official recalled: “We’d [go to his home and] ask, ‘these last few days you were working the fields?’” Such bias leads to discriminatory results, with higher risk scores for certain groups. But laws that impose criminal punishment for what has been called “indirect incitement,” for example justifying or glorifying terrorism, encroach on expression protected under international human rights law.

The hope is, of course, that we learn to wield this tremendously powerful technology responsibly and humanely because, without a doubt, if it’s used wisely, big data could make our lives safer and more secure. Early on, Palantir received seed funding from In-Q-Tel, the venture capital arm of the CIA, where the chairman of the Board of Trustees is none other than ASU’s Michael Crow. If the designers of predictive policing systems believe they know who the “bad people” are, or even if they have unconscious biases that influence their perceptions, there’s a very real danger that crime prevention technologies end up targeting groups and neighborhoods that are assumed to have a higher tendency toward criminal behavior. One headline asks: “Predictive Policing or Targeted Harassment?” Predictive policing efforts continue to expand around the world.
Notably, three cases in the Aksu List suggest that people are treated with different levels of “strictness” depending on their level of obedience, and that people who disobey or “talk back” are placed in areas with stricter management. Governments may prosecute speech that incites criminal acts, that is, speech that directly encourages the commission of a crime, is intended to result in criminal action, or is likely to result in criminal action, whether or not criminal action does, in fact, result. The integration of the IJOP system with other government systems illustrates China’s expansive definition of security and perceived need for extensive surveillance and control. The same Xinjiang source who provided the Aksu List also provided other audiovisual content to Radio Free Asia’s Uyghur Service between mid- and late 2018. Human Rights Watch searched the Supreme People’s Court’s database of Chinese court verdicts for these 20 people’s full names, but the search did not yield results, though the court verdict database is far from comprehensive. A study argues this has major implications for …

The missing piece of this predictive policing puzzle is facial recognition technology, which can hardly be considered inaccessible, as shown by Russian apps like FindFace, an NtechLab app launched in the mid-2010s that allowed users to take a picture of someone and match their face to social media profiles on the Russian site VKontakte (VK). Known for its cutting-edge use of big data to support security information and interventions, especially with three-letter agencies, the company is also known for its controversial work on predictive policing.
Human Rights Watch used various methods to verify the Aksu List. The language and terms used in the Aksu List are consistent with those in other Xinjiang official documents that Human Rights Watch reviewed. To protect the anonymous source, Human Rights Watch will not describe specifics of the analysis performed, but concluded that the audiovisual content was taken from inside a detention facility in Aksu. Article 37 of the constitution states that all arrests must be approved by either the procuratorate (the state prosecution agency) or the courts.

One woman was detained for having relatives abroad; her sister said the police had specifically asked about her because she lives abroad. A later record, dated around June 2019, provides an assessment of whether an individual should remain in detention. Other entries flag people simply as “untrustworthy persons born after the 1980s or 1990s” (80、90后不放心人员), an approach that treats people as guilty until proven innocent, detaining them until their loyalty can be ascertained and, as needed, instilled. Describing the verification work, the official said: “It was like that for all other verification tasks.” Material explaining how the predictive policing apparatus functioned has surfaced in two versions: one from 2016 and a more recent version from 2018.

Predictive policing is also spreading elsewhere: Canadian police departments have begun using predictive policing software, and empirical assessments of such systems have yielded mixed to negative results. In Santa Cruz, the program offered hot-spot policing with a twist, forecasting where a crime, specifically property crime, would take place; the police department ultimately said it would stop using the AI-driven software. The original Smart Policing Initiative website is no longer functional, but an archived version can be accessed through the Wayback Machine. One of the more prominent concerns raised around predictive policing is the danger of human bias swaying the data. The method may work in certain cases, but real life is perhaps taking us down an even more worrying path, and if we’re serious about ensuring a just and vibrant future for everyone, we ignore this at our peril.
