A child pornography case in Eau Claire County is testing a recently enacted state law because the images that brought six felony charges were created entirely by artificial intelligence. The defendant’s attorney has vowed to challenge the constitutionality of Wisconsin’s virtual child porn law.

A criminal complaint filed against 24-year-old Kai Jon of Altoona alleges he possessed six images of graphic sexual acts involving babies, which were shared on chat forums where users commented on the images. A sergeant with the Altoona Police Department reported “that it appears all the images were fully AI-generated images and not that of a real human child.”

Jon was charged with six felony counts of possession of virtual child pornography, which come with maximum penalties of 15 years in prison and up to $100,000 in fines for each image. Wisconsin’s ban on AI-generated child pornography went into effect in March 2024.


According to a tally by the nonprofit Enough Abuse, 45 states have criminalized AI-generated or computer-edited child pornography.

Court records show that Jon’s public defender notified the circuit court on Aug. 14 that she will be filing a “constitutional motion” challenging the statute. The attorney did not respond to a request for comment about the grounds for the challenge.

Jonathan LaVoy is a partner with Kim & LaVoy Attorneys at Law in Milwaukee who has spent 25 years as a defense attorney and has handled child pornography cases before. He said it’s not uncommon for lawyers to challenge the constitutionality of a new law. LaVoy said arguments could include claims that the statute is overly broad, “or potentially that there’s some type of free speech area that allows for this.”

“My personal opinion is that these laws will stand up to challenges in the appellate courts,” LaVoy said. “Many of our legislatures have enacted these laws, and it’s clear that it’s a necessity in our society right now because AI is becoming so prevalent and so lifelike.”

Last year, Wisconsin Attorney General Josh Kaul said people using AI to create child pornography is a “growing problem that’s impacting communities around the country.” The comment followed the arrest of 42-year-old Steven Anderegg of Holmen, who used AI to create thousands of images of minors by using sexually explicit text prompts. He was indicted in federal court in May on charges of creating a visual depiction of minors engaging in sexually explicit conduct, distributing it across state lines, and sharing it with a boy under 16 years old.

Anderegg is also facing a felony first-degree child sex assault charge in state court.

In February, a federal judge in Wisconsin dismissed the federal charge related to Anderegg’s creation of the child pornography, ruling that “the private production of obscenity” is protected under a 1969 U.S. Supreme Court decision, which held that the First and Fourteenth Amendments of the U.S. Constitution don’t allow the government to prohibit the private possession of obscene materials.

The U.S. attorneys involved with that case have appealed the ruling, arguing the provisions in the 1969 decision from the Supreme Court don’t apply to obscene materials depicting children.

LaVoy said obscene materials, especially those involving depictions of children, haven’t typically been considered protected free speech under prior case law. 

“So I think it would be odd for the Court of Appeals in this situation to uphold the district court decision on a dismissal based on that older case,” LaVoy said, noting that it was “before this whole AI thing kicked in.”

“AI is really a different animal,” LaVoy added.

Dietram Scheufele is a professor at the University of Wisconsin-Madison who specializes in misinformation, social media and AI. He told WPR the rise of artificial intelligence models has opened numerous legal and ethical questions that courts are left to grapple with. 

On the technical side, Scheufele said, one question is how AI models are able to create lifelike images of child pornography. Another is whether the people or businesses that create the algorithms underlying the AI models that ultimately produce the images could be held liable.

“The same logic that applies to child pornography will apply to a whole bunch of other things — not in the sense of obscenity, but in the sense of responsibility and copyright, and all the other things that come along with that,” Scheufele said.