YouTube may start removing harmless videos because of a staff shortage… this will be interesting.
In a post on its Creator Blog, titled “Protecting our extended workforce and the community,” YouTube states:
“Our Community Guidelines enforcement today is based on a combination of people and technology: Machine learning helps detect potentially harmful content and then sends it to human reviewers for assessment. As a result of the new measures we’re taking, we will temporarily start relying more on technology to help with some of the work normally done by reviewers. This means automated systems will start removing some content without human review, so we can continue to act quickly to remove violative content and protect our ecosystem, while we have workplace protections in place.
As we do this, users and creators may see increased video removals, including some videos that may not violate policies. We won’t issue strikes on this content except in cases where we have high confidence that it’s violative. If creators think that their content was removed in error, they can appeal the decision and our teams will take a look. However, note that our workforce precautions will also result in delayed appeal reviews.
We’ll also be more cautious about what content gets promoted, including livestreams. In some cases, unreviewed content may not be available via search, on the homepage, or in recommendations.”