YouTubers Must Remove AI-Generated Content Depicting Deceased Minors

YouTube has taken a firm stance against the use of artificial intelligence (AI) to create content depicting deceased minors or victims of violent events. To combat the growing abuse of AI technology, the platform has updated its harassment and cyberbullying policies, with enforcement set to begin on January 16.

The move responds to content creators who have been using AI to recreate the likenesses of deceased or missing children, giving these AI-generated characters childlike voices that narrate the details of their own deaths. The practice gained significant attention following a report by The Washington Post, which highlighted instances where AI was used to narrate the abduction and death of children.

YouTube says any content violating the policy will be promptly removed and the creators notified by email. Where the safety of a posted link cannot be verified, the platform may remove the link as well. The company emphasized its commitment to removing harmful content and protecting its users.

YouTube also applies its existing strike system to these violations: a channel that receives three strikes within a 90-day period faces termination.

YouTube's action mirrors a similar development on TikTok, the short-video platform owned by China's ByteDance. TikTok recently introduced a feature that lets creators disclose when their content includes AI-generated or manipulated media with realistic scenes, part of the platform's efforts to address the ethical and safety concerns around AI in content creation.

With these measures, YouTube is signaling its commitment to a safe and respectful environment for its users. As AI tools become more accessible, online platforms will need to keep addressing the potential misuse of the technology and protecting vulnerable individuals from exploitation.