HAMPTON ROADS, Va. (WAVY) – A bipartisan coalition is urging the federal government to consider regulating artificial intelligence in order to keep kids safe.

The coalition includes Virginia Attorney General Jason Miyares and North Carolina Attorney General Josh Stein.

Miyares said the technology can be used for good, but some are using AI to exploit children online.

“Artificial intelligence is a pretty amazing tool,” Miyares said. “It can be both a tool for good but also, unfortunately, with bad actors, a tool for unbelievable evil.”

Miyares said AI’s use in the production of child sexual abuse material, or CSAM, is becoming more common.

“What we have seen with AI already is to both digitally alter the face of children,” Miyares said. “To take a child that’s maybe been previously exploited, exploit them again or take a child that’s never been exploited and put them in a scene of child pornography, it’s unbelievable evil.”

In a letter addressed to Congress and signed by Miyares and other attorneys general, the coalition said AI is making it easier to create deepfake images. Previously, producing such images required a skilled photo editor.

Miyares said states are trying to tackle the issue, but it’ll take some help from the federal government.

“Listen, we need a federal solution that applies across the country, protecting our kids,” Miyares said. “Imagine your child’s face suddenly appearing in horrific child pornography image. That’s what we are seeing spread around the web and we want to stop that.”

That’s why the letter drafted by the National Association of Attorneys General pushes Congress to act quickly.

It’s asking for a commission to study how AI is being used to exploit children and find ways to deter child exploitation.

One of the solutions suggested is to expand the existing restrictions on CSAM to explicitly cover AI-generated CSAM.

The letter said that change would give prosecutors the tools they need to protect children.