AI Threat: Misuse of Children's Photos Soars
Concern is growing over the use of artificial intelligence (AI) to manipulate photographs of children for harmful and exploitative purposes.
The Alarming Rise of AI-Generated Abuse Material
Holly Brooker, co-founder of the organization Makes Sense, has highlighted that the transformation of children's pictures into abusive content is an issue the New Zealand Police are increasingly bringing to public attention. "It’s an emerging trend that’s growing rapidly," Brooker stated in an interview with Breakfast.
The gravity of the situation is underscored by recent data. "The Internet Watch Foundation reported recently that there’s been a 380% increase in AI-generated child abuse material in the last year," she revealed. This staggering statistic points to a dangerous escalation in the misuse of AI technology.
Schools Urged to Strengthen Protective Measures
Brooker believes that educational institutions need to adopt a more proactive stance in addressing this escalating threat. "I don’t think that it’s something that all schools have thought about and considered well," she commented.
She pointed out that the Privacy Commissioner released guidelines in May this year on sharing images and videos that feature children. "What I’d love to see is schools adopt really robust policies about ‘how are we going to share these images, what are the risks?’, really tighten things up a bit," Brooker urged.
Navigating Digital Sharing in a New Tech Era
Reflecting on the rapid changes in technology, Brooker noted, “We come from an era in the early days of Facebook where we weren’t worried about these types of things. With tech, things have escalated quite quickly.”
As a final piece of advice, Brooker urged parents to stay vigilant about where their children's images are posted online, suggesting that private messaging threads offer a more secure alternative for sharing such content and minimise exposure to potential misuse.