Chinese police claim to have recently caught a ChatGPT user for allegedly using the chatbot to fabricate a train crash report.
It’s one of the first enforcement measures under a recently passed Chinese regulation governing artificial intelligence (AI)-generated “deep fakes,” which are fake digital photos, videos, or other material that appear realistic.
A man identified only by his surname, Hong, allegedly used ChatGPT to fabricate a news story about a crash that supposedly killed nine construction workers in the northwestern Chinese province of Gansu.
The fabricated story quickly spread across 21 accounts on a well-known social media platform belonging to a southern Chinese media outlet.
ChatGPT is theoretically unavailable in China, as are most foreign websites and services, because of the country’s “Great Firewall,” which censors the internet for citizens.
However, determined users can gain access through readily available virtual private network (VPN) software that circumvents the firewall.
The police report does not say how Hong accessed ChatGPT.
According to the report, the article had been viewed 15,000 times by the time Gansu security officials realized it was a hoax.
After raiding Hong’s home to gather evidence, police placed him under “criminal coercive measures,” a term police use to denote short-term restrictions on a suspect’s freedom.
China’s new deepfake law came into force on January 10. It forbids several subcategories of “deep synthesis technologies” (technologies such as machine learning and virtual reality) used to create fake media, though it provides only hazy definitions for many of the prohibited classes.
The law explicitly forbids using such technologies to produce, publish, or transmit fake news, according to a translation of the regulation provided by the crowdsourced website China Law Translate.
It also forbids deep fakes used in activities that endanger national security, harm the nation’s image or the societal public interest, or disturb “economic or social order.”