After a 6.8-magnitude earthquake hit Dingri county in Southwest China's Xizang autonomous region, an image of "a child buried under rubble" gained significant attention, eliciting sympathy and support from many netizens. However, it was later revealed that the image had been created with AI tools. The creator had previously shared multiple AI-generated videos on a video platform.
The widely shared image, initially posted by its creator on Nov 18, 2024, with a clear AI-generated label, exhibits telltale signs of artificial creation, such as unnatural colors and lighting, awkward movements, implausibly shifting stones, and even a hand with six fingers.
[Image annotation: stones shifting in implausible ways]
Following the earthquake, numerous fake images and videos circulated widely, portraying collapsed buildings, homeless children, and mothers shielding their children from debris. Authorities have debunked these materials, confirming that they were AI-generated, outdated, or edited.
A lawyer has warned that deliberately using AI to fabricate images of severe earthquake disasters, spreading false information that deviates from reality, and inciting public concern or panic may constitute a disruption of public order.
Experts note that many online platforms only add disclaimers like "suspected to be AI-generated" about an hour after images or videos are posted, revealing inefficiencies in their review processes.
AI-generated images and videos are produced by analyzing text descriptions, transforming semantic details into realistic visuals, and gradually incorporating relevant elements. If you come across a suspected AI-generated image, look for the telltale signs described above, check whether the platform has labeled it, and verify it against official sources before sharing.
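One automatable check is scanning an image's metadata for traces of known generation tools, since some AI services record their name in fields such as "Software" or embed provenance (C2PA) markers. Below is a minimal sketch of that idea; the function name, field names, and marker list are illustrative assumptions, not a description of any platform's actual review system, and real reviews would combine this with reverse image search and visual inspection.

```python
# Hypothetical helper: flag images whose metadata mentions a known
# AI-generation tool or provenance marker. The marker list is an
# illustrative assumption, not an exhaustive or authoritative set.
AI_MARKERS = (
    "midjourney",
    "stable diffusion",
    "dall-e",
    "firefly",
    "c2pa",
    "ai-generated",
)

def looks_ai_generated(metadata: dict) -> bool:
    """Return True if any metadata value mentions a known AI marker.

    `metadata` is assumed to be a dict of tag names to values, e.g.
    as extracted from EXIF or XMP fields by an external tool.
    """
    for value in metadata.values():
        text = str(value).lower()
        if any(marker in text for marker in AI_MARKERS):
            return True
    return False
```

Note that absent metadata proves nothing: tags are easily stripped or forged, so a negative result only means this particular check found no evidence.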


