YouTube recently announced that their content moderators were being sent home from the office and that automated AI systems would handle the removal of inappropriate videos. As most content creators know all too well, the AI doesn’t always get it right, often removing perfectly acceptable videos because it lacks context. The moderators are the ones who correct those mistakes. With the moderators gone, wrongly removed videos will remain offline until someone gets around to reviewing them.
The outcry from content creators was, “Why can’t the moderators work from home?”
Although YouTube released a video explaining this policy change, it didn’t fully address the work-from-home question. Having worked at Google before and after the Great Recession ten years ago, I can tell you why the moderators can’t work from home. It’s not a technical issue; it’s the business model.
If you haven’t heard it on the Internet, I’ve quit YouTube — for the rest of the year. No new videos until 2020. I’ve already lost three subscribers for not putting out a video on Wednesday. I may lose more when I don’t put out a video this Sunday. Or YouTube could be purging closed accounts that were subscribed to my channel. Meanwhile, I’m busy re-branding my channel and website. Both have the new header graphics. See you in 2020!
Updated 29 December 2019: Looks like YouTube had a glitch where the subscriber count was lower on the back end than on the front end. The three subscribers that I “lost” since Christmas have come back.
Did four AI robots kill 29 scientists at a Japanese weapons research lab? A tweet with that headline and a linked video lit up social media for a week in December 2018. The mainstream media didn’t bother to cover it at all, which — to believers — suggested a government cover-up. Like most people who saw the story, I thought it was too sensational to be true. When I saw the thumbnail for the linked video, I knew it was fake news.