TL;DR

Civil-rights advocates and some prosecutors have raised alarms about police departments using generative AI to draft incident narratives. Concerns center on transparency, evidence integrity, and vendor data practices; two states, Utah and California, have passed laws forcing disclosure and preserving machine-origin drafts.

What happened

Over 2024–2025, activists and legal officials highlighted risks from police departments adopting generative AI to produce incident reports. Axon’s Draft One — a tool that ingests body-worn camera audio, produces a draft narrative for officers to edit, and then exports the final report — became a focal point because the system does not retain the original AI-generated draft after export. That design choice, critics say, complicates efforts to determine which passages were machine-suggested and which were penned or altered by an officer.

In response to these concerns, the King County prosecuting attorney’s office announced it will not accept narratives produced with AI assistance. Advocacy groups published investigative and practical guides to help the public learn whether local agencies are using AI and to tailor records requests. Legislatures also moved: Utah passed SB 180, requiring AI disclosures and officer certification of accuracy, while California’s SB 524 added disclosure mandates, limited vendor use of agency inputs, and required retention of the initial draft for review.

Why it matters

  • Police reports influence charging decisions, court testimony and detention; opaque AI contributions can undermine accountability and fairness.
  • When original AI drafts are not preserved, it becomes harder to verify who authored or edited specific report language.
  • Vendor bundling of body cameras and AI tools concentrates control and data with a few companies, complicating oversight.
  • New state laws create transparency and retention standards that could shape how departments adopt or limit AI tools.

Key facts

  • EFF first raised concerns about AI-written police reports in 2024 and later produced a two-part report examining Axon’s Draft One.
  • Axon is described in the source as the largest supplier of body-worn cameras to U.S. police departments and the maker of Draft One.
  • Draft One workflow: officers upload body-camera audio, the system generates a draft, officers edit, then export; the original draft is not stored after export.
  • Axon representatives stated the system is designed not to retain the original draft and to avoid creating additional stored copies.
  • King County’s prosecuting attorney’s office announced it will not accept police narratives produced with AI assistance.
  • EFF published a public-records guide to help communities determine whether agencies are using AI to create police reports.
  • Utah’s SB 180 requires that reports created in whole or part by generative AI carry a disclaimer and that officers certify accuracy.
  • California’s SB 524 requires disclosure on reports if AI contributed, prohibits vendors from selling or sharing agency-provided data, and mandates keeping the first draft so reviewers can see machine-originated content.

What to watch next

  • More state legislatures considering rules or bans on AI-authored police reports, as anticipated by advocates.
  • Whether Axon or other vendors change Draft One’s storage or logging behavior in response to criticism or new laws (not confirmed in the source).
  • How courts and prosecutors handle submissions that incorporate AI-generated text and whether retention requirements affect evidence practices (not confirmed in the source).

Quick glossary

  • Generative AI: Machine learning systems that produce new text, audio, images or other content based on input data and learned patterns.
  • Body-worn camera: A recording device worn by officers to capture audio and video of interactions and incidents.
  • Records Management System (RMS): A software platform used by police agencies to store, manage and share incident reports and related records.
  • Disclaimer: A label or notice included on a document to indicate that content was generated or assisted by AI.

Reader FAQ

Are police departments using AI to write reports?
The source indicates police departments have adopted AI tools for report drafts and highlights Axon’s Draft One as a widely used example.

Does Draft One keep the original AI-generated draft?
According to the reporting cited here, no. Draft One does not retain the initial draft after an officer exports or copies the final report; the system is designed not to store that original version.

Have any prosecutors or states restricted AI-written reports?
Yes. The King County prosecuting attorney’s office will not accept AI-assisted narratives, and Utah and California passed laws requiring disclosure and retention measures.

Can the public check whether local reports were AI-generated?
EFF released a guide to help craft public-records requests tailored to learn about AI use, but the source notes that obfuscation by tools can make audits difficult.

