Probably not AI, but clustering of keywords and ranking the matches.
These algorithms are more or less a positive feedback loop, which is fine if you search for some knowledge. In other words, once you choose autoplay, you are drawn toward the center of some topic: with every video, the highest-ranked keyword matches from the last few videos become the top candidates for the next recommendation.
It is bad if the starting point is some opinion-based content: that is where you get the most matches, and it seems some people in their basement choose wider ranges of keywords than news outlets do, so they beat them in that regard and match more often in these positive feedback loops, no matter where you started.
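The loop described above can be sketched roughly like this. Everything here is a toy assumption: the video names, keyword sets, and the raw-overlap scoring are made up purely to illustrate why a channel tagging itself with a wider keyword set keeps winning the ranking once it enters the history window.

```python
# Toy sketch of a keyword-overlap feedback loop (all titles/keywords invented).
videos = {
    "news_report":   {"election", "economy"},
    "analysis":      {"election", "economy", "policy"},
    "opinion_vlog":  {"election", "economy", "policy", "conspiracy"},
    "basement_rant": {"election", "economy", "policy", "conspiracy", "elites"},
}

def recommend(history, videos, window=3):
    # Pool the keywords of the last few watched videos...
    recent = set().union(*(videos[v] for v in history[-window:]))
    # ...and rank every unwatched video by raw keyword overlap.
    scores = {v: len(kw & recent) for v, kw in videos.items() if v not in history}
    return max(scores, key=scores.get)

history = ["news_report"]
while len(history) < len(videos):
    history.append(recommend(history, videos))

print(history)  # drifts from the news report toward the widest-tagged video
```

Each recommendation adds its keywords to the recent pool, which in turn favors the video with the broadest tag set, so the walk converges on "basement_rant" even though it started from a plain news report.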
What these algorithms can't detect is satire and cynicism, which often match certain keywords more strongly (by naming them more often and using them more specifically) than most serious content does; the recommendations then drift off into fictional areas that are somewhat related.
(political rant) How novel... A story sourced from NYT about a "radical right wing" movement blaming the system and not the people that actually got awakened by 14 years of vile theft and mismanagement from the left wing party. This is a similar phenomenon that happens here in the US: people got woke, others reacted with vitriol to their BS.
I am not sure a purely stereotypical approach does what you think. Without a healthy amount of dissent, most systems are bound to fail, no matter the intentions they started out with. It might be a bad idea to brush it off this way.