The Houston Police Department (HPD) in Texas recently found itself engulfed in scandal. The controversy erupted over the suspension of more than 200,000 reported crime incidents, with officers using a “Lack of Personnel” code to drop criminal investigations of sexual assaults and even some homicide cases.
HPD Chief Troy Finner was forced to retire following a thorough investigation into what he knew, and when, about the code that effectively closed cases, including cases with good leads to apprehend suspects.
The practice persisted amid the department’s ongoing shortage of police officers.
To compensate for the lack of officers in a city of more than two million residents, an AI technology firm has stepped in to address concerns about limited manpower on the force.
64-Camera Network Able to Execute Facial Recognition
A Houston Chronicle article reported last month that the AI technology company Airship AI will install 64 new cameras throughout Houston to enhance public safety. According to the report, the move is a response to the staffing shortages faced by the police department.
The initiative is part of a statewide push to implement cutting-edge technology across various departments and agencies. Proponents assert that machine-learning tools will greatly amplify law enforcement capabilities and revolutionize public safety.
“The backdrop to all this, whether in policing or corrections, is the ongoing staffing crisis in the public safety sector,” said Marc Levin, chief policy counsel at the Council on Criminal Justice. “With the right guardrails, this type of technology can improve the efficiency and efficacy of police work.”
Levin pointed to the recent controversy surrounding HPD’s crime lab backlog. He said that, when used properly and within the guidelines of the Constitution, AI-driven technologies can free up resources dedicated to menial work.
“Right now we’re only clearing about half of reported homicides,” Levin said. “Property crime is at about 10%. There’s clearly a use for technologies that can help us get those numbers up. It just comes down to whether we’re willing to accept some of the tradeoffs that come with using AI.”
Growing Concern
There is growing concern among civil liberties advocates about the use of AI surveillance tools in public safety. Savannah Kumar, an attorney with the American Civil Liberties Union who focuses on surveillance and police bias, said the agreements reflect a broader pattern of government entities rapidly adopting such technology.
“We’re seeing AI crop up pretty much across the entire state,” Kumar said. “It’s definitely a source of alarm and concern, particularly when it comes to the use of predictive technology and other tools in law enforcement.”
Kumar argued that predictive technologies, which can flag individuals or groups as being at high risk for criminal behavior, directly conflict with the Fourth Amendment’s prohibition against unreasonable searches and seizures.
Kumar further suggested that AI technologies could be used to police citizens preemptively, before any unlawful act is committed.
Houston Police May Not Use Predictive Tech
Although the expansion of Airship AI’s infrastructure does not necessarily mean the Houston Police Department (HPD) will integrate predictive algorithms into its daily law enforcement activities, the eagerness to embrace advanced technology may lead government agencies to compromise due process inadvertently, she cautioned.
“There’s a real lack of transparency and understanding in terms of what algorithms are being used, how these tools are functioning, or even when they’re a part of the process,” Kumar said in a Houston Chronicle article.
Kumar added, “Particularly as it applies to law enforcement, surveillance companies are making use of proprietary technology that is sometimes opaque even to the departments using them.”
No Mention of How the AI Will Be Used
The current contract makes no mention of specific algorithms or analytics, yet Airship AI’s website states part of its mission is to provide law enforcement and security operations with “… predictive analysis of events before they occur …”
The Chronicle article also reported that Levin, despite his support for the technology, largely shared Kumar’s concerns about the lack of transparency. He pointed to recent legislation from the European Union that requires AI providers to share code with regulators.
In the U.S., such information is generally treated as intellectual property and is not accessible to government officials.
Levin mentioned that there have been cases where evidence obtained using AI was not admissible in court because no one was available to explain how the software worked or how it reached its conclusions.
Wrongful Arrests
Experts and community members are concerned about the system’s incorporation into broader surveillance networks, despite police insistence that it does not use artificial intelligence.
John Zandi told ABC13, “I’m all for less crime, but privacy is very important and should be maintained.”
Carroll Robinson, a professor at Texas Southern University and a former member of the city council, foresees issues.
“Some innocent person, misidentified, not by a real-life person but by a camera, ends up in the criminal justice system, incarcerated at the county jail,” he stated.
Dr. Michael O. Adams, Robinson’s colleague, says they have advocated for state legislation that would guard against AI-based racial discrimination.
Despite concerns about transparency, the Chronicle reported that the technology has been used at the southern border of Texas.
In late 2023, Airship AI was given a $10.9 million contract by an agency within the Department of Homeland Security to support intelligence-gathering operations. Levin sees the use of AI in roles such as border security not as an option, but as a necessity in a competitive environment.
“We’ve seen an explosion of AI use among organized crime groups like cartels,” Levin said. “At the end of the day, they often have access to a lot of the same technology we do.”
This is not the first time police departments have used questionable technology. Earlier this year, we reported on Houston Police trialing gunshot-detection technology that is also used in Chicago.
NewsBlaze Reporter Clarence Walker can be reached at [email protected]