YouTube has updated its child safety policies and will now block all videos containing “adult or violent topics”, even if those videos explicitly claim to be intended for children or families.
In a post on the site’s help center, the YouTube team said that over the next 30 days it will begin removing videos tagged with keywords such as “family fun” or “for children” when they actually offer content not intended for minors.
While YouTube faces a multitude of child safety issues, the new policy update aims to curb this kind of content, especially after the 2017 Elsagate scandal, in which videos featured popular cartoon characters but included violent and disturbing material, including depictions of death.
YouTube explains that content clearly aimed at adults is still allowed and will not be removed. But the company also cautioned: “If you create adult content that could be confused with family entertainment, make sure your titles, descriptions, and tags match your target audience.”
YouTube will delete all videos that violate the new policy, including those uploaded before the update; during the next 30 days, the channels that uploaded them will not receive a warning. After that deadline, however, channels uploading videos that violate the policy will receive a warning.
The YouTube team wrote: “Protecting minors and families from inappropriate content is always a top priority, and that’s why we expanded our child safety policies.” The team added: “We are constantly taking steps to protect minors on YouTube, but we recommend that parents use YouTube Kids if they plan to allow children under 13 to watch independently.”