AI in Prison? Robot Guards? How the Criminal Justice System Is Adopting Tech

Metro Loud

This is The Marshall Project’s Closing Argument newsletter, a weekly deep dive into a key criminal justice issue. Want this delivered to your inbox? Sign up for future newsletters.

In July, Tesla fans lined up for hours in Los Angeles to check out the new “retro-futuristic” diner and charging station opened by Elon Musk. Among the attractions was the company’s “Optimus” robot, which served popcorn to hungry customers near the humans grilling Wagyu burgers. Fifty miles east in Chino, Delinia Lewis, the associate warden of the California Institution for Women, hopes to someday put AI-powered machines like these to work in her prison, doing far more important jobs than slinging snacks. As staffing shortages continue to plague prisons around the country, Lewis believes AI could help close the gap.

“Medication distribution, cell feeding, security searches, package searches for fentanyl, all the hazardous and routine tasks that staff don’t want to do,” said Lewis. “Why not let the robot do it? Then staff can focus on more intricate parts of the job.”

Lewis has written about the use of AI in corrections, and said she is forming a business to offer AI-driven robots for use in corrections settings. While she hopes the tech could be deployed within the next 10 years, the state’s budget crisis makes purchasing cutting-edge AI tools tough.

“Who knows when California will be back in the green,” Lewis said of the state’s budget, “but we’re losing staff at a record rate, so the bridge has got to break, and we’ve gotta really take advantage of technology.”

Robots behind bars may be a ways off, but prisons and jails have been rapidly adopting other AI and machine-learning tools. Advocates critical of the technology are concerned about opaque data collection practices, privacy violations and bias.

Prison telecommunications companies were some of the first to dip their toes into AI technology. In 2017, LeoTech began marketing Verus, a phone surveillance tool that records and monitors calls. The company uses Amazon’s cloud and transcription services to flag keywords that might alert staff to “valuable intelligence.” At least three states used the tool to monitor phone calls for mentions of coronavirus during the pandemic, in an attempt to track outbreaks, according to The Intercept. While tools like Verus were initially marketed as add-ons to existing phone services, many prison telecommunications giants have since made AI call monitoring a default part of their services.

“Given Securus and Global Tel Link are now providing it, it means it’s going to be a lot more accessible in a lot more places,” said Beryl Lipton, an expert on law enforcement and prison surveillance tools at the Electronic Frontier Foundation.

The use of these tools has led to serious breaches of attorney-client privilege. Over the last five years, lawsuits have been filed in multiple states against Securus, alleging that the company recorded privileged calls. Securus has settled some of the lawsuits and has denied purposely recording protected calls. The controversy hasn’t stopped corrections departments from using the technology, or vendors from marketing it. LeoTech has been lobbying in Ohio, where lawmakers passed a budget this year that includes $1 million for the state’s prison system to pay for software that will “transcribe and analyze all inmate phone calls” beginning next year, according to Signal Ohio. Florida inked a deal with LeoTech in 2023.

Lipton’s main concern with AI tools in prisons and police departments is how the data they gather is stored, retained, and later fed into other systems.

“Law enforcement and the companies helping them do this are very interested in gathering all the information they possibly can collect on anybody, because they think that’s going to help them in solving or preventing a future crime,” said Lipton.

While some AI technology is making its way into the system, in some ways the U.S. is playing catch-up with other countries. Last month, the U.K.’s Ministry of Justice laid out its plan to embed AI across prisons, probation services and courts. Some of the agency’s goals include integrating AI transcription and document processing tools for probation officers, and the creation of a “digital assistant…to help families resolve child arrangement disputes outside of court.”

But the star of the announcement is a new “AI violence predictor” that promises to prevent prison violence by analyzing data, including an incarcerated person’s age and previous involvement in violent incidents. If this sounds familiar, you may be thinking of risk assessment tools that have long been used across the U.S., which ProPublica documented nearly 10 years ago to be rife with racial bias and “remarkably unreliable in forecasting violent crime.” The older tools typically assess risk by weighing a set of variables, such as age and prior convictions, either manually or with an algorithm. AI-driven “predictors” are like risk assessment tools on steroids, drawing on much larger datasets.

While today’s AI-driven tools are more sophisticated in some ways, the risk of bias and error is still there, and the efficacy of predictive tools has repeatedly been called into question.

“A lot of these predictive tools can create unintended errors where certain communities are underserved or misunderstood because of how the model missed or wrongly accounted for individuals’ risks in that community,” said Albert Fox Cahn, founder and executive director of the Surveillance Technology Oversight Project, who has studied AI surveillance in prisons.

In addition to predicting violence against others, some correctional staff want to use “biometric behavioral profiling” tools along with AI to prevent in-custody deaths and medical emergencies. The Maricopa County Sheriff’s Office, in Arizona, wants to buy wearable technology to track heart rate, body temperature, and other “key indicators,” according to AZ Central. Jails in Colorado, Alabama, and elsewhere in Arizona have already begun using similar tools.

Lewis, the associate warden in California, is well aware of the ethical concerns that come with AI tools, and believes criticism will ultimately produce better outcomes.

“I welcome concerns, because that gives us an opportunity to do more research and resolve those concerns,” said Lewis. “I don’t think it’s going to inhibit us, I think it’s just going to help us make a more advanced and a better product.”
