The pitfalls of increased AI use in policing


As part of its body-worn camera program, the RCMP recently completed a pilot project using artificial intelligence (AI) to draft reports. The AI-generated reports are created from audio captured by officers’ body cameras, and a report can be drafted in mere seconds. The pilot, which ran for about six months and concluded in January, spanned eight detachments in British Columbia and generated nearly 800 reports.


Opinion


Harnessing AI to write police reports raises serious and unresolved concerns, and the practice must be immediately discontinued.

It isn’t even entirely clear why police need to use AI in the first place.

The Royal Canadian Mounted Police recently completed a pilot project using artificial intelligence to draft reports, but Brandon University sociologist Christopher Schneider writes that using AI to write police reports “is replete with some serious and unresolved concerns and must be immediately discontinued.” (The Canadian Press files)


The primary justification for the expanding use of AI to generate police reports across law enforcement is to free police from the administrative burden of having to write reports in the first place. The idea is that officers could do more relevant police work, presumably patrol work.

The majority of police work is dull, often punctuated by hours of inactivity and sedentary behaviour, and it already includes tasks like patrol work and traffic duty. While there is some evidence that increasing preventative patrol can reduce crime, a stronger case would need to be made for relieving police of the task of report writing.

Have calls for service suddenly become so onerous that police are no longer able to engage in routine patrolling? On the contrary, evidence reveals that calls to the RCMP in some jurisdictions declined in 2025.

Writing reports has arguably never been more important because of delays in case processing, with officers sometimes testifying in court months or years later. The report is the written record of the event, and officers rely on these notes to recall important details in their testimony. A failure to accurately recall an incident can lead to charges of perjury and obstruction of justice, and can produce unjust outcomes. Report writing should therefore remain a central part of how we think about police work. Research illustrates that police reports play a central role in criminal cases and that errors in police reports can hinder the pursuit of justice.

Errors are documented across industries that have incorporated AI, including in policing.

Earlier this year in Utah, an AI-generated police report matter-of-factly stated that an officer had morphed into an amphibian. The audio on the officer’s body camera had captured the Disney movie “The Princess and the Frog” playing in the background. Astonishingly, it was only in this extraordinary context that the subject officer reported having “learned the importance of correcting these AI-generated reports.”

But what about unremarkable details of reports that might be overlooked by officers that could jeopardize court proceedings?

Evidence indicates that AI-generated police reports have already been used in plea deals in the U.S. Plea bargaining is used to resolve most criminal cases in Canada. According to the Canadian Department of Justice, “For the court to accept a plea of guilty, the facts alleged by the prosecutor must be accepted by the accused as being substantially accurate.”

One of the most popular AI report-generating tools currently used by police is Draft One, proprietary software created by Axon Enterprise, the U.S.-based company the RCMP has also contracted to supply its body cameras and digital evidence management system.

According to a 2025 investigation, “it’s often impossible to tell which parts of a police report were generated by AI and which parts were written by an officer.” The investigation also found that “there is no meaningful way to audit Draft One usage.” As the authors point out, if a report includes a statement like, “the subject made a threatening gesture,” there is no way to determine if this account was provided by the AI or the officer.

AI can only simulate interpretation; it cannot interpret the way an officer can. In other words, there is no way to verify whether AI-generated police reports are substantially accurate. The possible impact that AI reports might have on plea bargaining in Canada remains unknown.

As if all of this weren’t enough, Draft One uses a customized variant of ChatGPT, the most popular generative AI chatbot, to create its police reports. A recent MIT study examining the influence of generative AI on critical thinking skills found that ChatGPT users “consistently underperformed in neural, linguistic and behavioural levels.” The authors note that while generative AI might initially reduce mental effort, the long-term consequences include “diminished critical inquiry [and] increased vulnerability to manipulation.”

Writing is an important kind of thinking, and writing reports requires that officers think. Policing is an occupation that demands critical thinking in high-stress, fast-paced, evolving situations. Do we really want even the remote chance of officers — empowered with the ability to use force, including deadly force — operating with the cognitive debt that could result from repeated reliance on AI to generate police reports?

I think not.

» Christopher J. Schneider is professor of sociology at Brandon University. His most recent book (with Erick Laming) is “Police Body-Worn Cameras: Media and the New Discourse of Police Reform” (Routledge, 2026).
