A frequently cited fundamental problem here is transparency: YouTube doesn't give anywhere near clear enough guidance up front, and when videos are demonetised it doesn't explain why, so creators can't avoid it in future. There is also fairly clear evidence that some channels, particularly mainstream media ones, avoid the blacklisting completely. Mostly it just seems arbitrary. Perhaps sometimes it's end users flagging content as inappropriate.
There are several possible explanations for the arbitrary nature of demonetisation, but end-user flagging aside, the algorithms are clearly crap. That's quite a frightening indictment of a company whose fundamental business is built on the quality of its software. More frightening still is the possibility that those algorithms rely on AI to any degree: in a complex system like YouTube's, which I suspect is the organic product of a series of spaghetti-wired tweaks, I doubt anyone knows how it works at all. All you can do is prod it empirically and hope it doesn't break too much. I'd contend that they've already prodded it too much, hence the level of excessive demonetisation.
One way they could help themselves is to restore the revenue lost during a period of arbitrary demonetisation, which I assume they don't do. The first hours after release, when demonetisation tends to happen, are also when the bulk of views occur. More fundamentally, they should be fully transparent about the detailed reasons for demonetisation, both before and after the fact, although as I mentioned, I don't think they actually have a clue most of the time. Evidently Google's not such a great software company after all.