The team used high-resolution peripheral quantitative computed tomography, or HR-pQCT, which can measure 3-D bone microarchitecture at scales of 61 microns, finer than the thickness of a human hair, to image the bone structure of the tibia in the lower leg and the radius in the lower arm.
An artificial intelligence can now predict the location and rate of crime across a city a week in advance with up to 90 per cent accuracy. Similar systems have been shown to perpetuate racist bias in policing, and the same could be true in this case, but the researchers who created this AI claim that it can also be used to expose those biases.
Ishanu Chattopadhyay at the University of Chicago and his colleagues created an AI model that analysed historical crime data from Chicago, Illinois, from 2014 to the end of 2016, then predicted crime levels for the weeks that followed this training period. The model predicted the likelihood of certain crimes occurring across the city, which was divided into squares about 300 metres across, a week in advance with up to 90 per cent accuracy. It was also trained and tested on data for seven other major US cities, with a similar level of performance.
Previous efforts to use AIs to predict crime have been controversial because they can perpetuate racial bias. In recent years, Chicago Police Department has trialled an algorithm that created a list of people deemed most at risk of being involved in a shooting, either as a victim or as a perpetrator. Details of the algorithm and the list were initially kept secret, but when the list was finally released, it turned out that 56 per cent of Black men in the city aged between 20 and 29 featured on it.
Chattopadhyay concedes that the data used by his model will also be biased, but says that efforts have been taken to reduce the effect of bias and the AI doesn’t identify suspects, only potential sites of crime. “It’s not Minority Report,” he says. Chattopadhyay says the AI’s predictions could be more safely used to inform policy at a high level, rather than being used directly to allocate police resources. He has released the data and algorithm used in the study publicly so that other researchers can investigate the results.
The researchers also used the data to look for areas where human bias is affecting policing. They analysed the number of arrests following crimes in neighbourhoods in Chicago with different socioeconomic
levels. This showed that crimes in wealthier areas resulted in more arrests than they did in poorer areas, suggesting bias in the police response.
This article is excerpted from: New Scientist
perpetuate UK / pəˈpetʃueɪt / US / pərˈpetʃueɪt /
likelihood UK / ˈlaɪklihʊd / US / ˈlaɪklihʊd /
concede UK / kənˈsiːd / US / kənˈsiːd /
1. in this case: in this instance; if that is so
2. turn out: to happen or develop (in a particular way); to prove to be in the end; to come out or attend (an event); to switch off or extinguish (a light, gas, etc.); to produce or manufacture
Original sentence: This showed that crimes in wealthier areas resulted in more arrests than they did in poorer areas, suggesting bias in the police response.
Pattern: This showed that XX in wealthier areas resulted in more XX than they did in poorer areas, suggesting bias in the XX.
Example sentence: This showed that students in wealthier areas achieved better grades than those in poorer areas, suggesting bias in the allocation of education resources.