I'm not sure that 'averages' is a word I'd use.
Think of your signal as being like a lottery, and every time you take a sample of it to measure its voltage - the outcome of the lottery - that's like buying a ticket. If the measured level is at or very close to the mean dc level, that's a losing ticket, and if the measured level deviates from that level by more than a couple of pixels' worth, then that's a 'win'.
The display on a digital scope typically refreshes at around 50-60 Hz. Between refreshes, it has taken millions of samples and combined them into a single picture of everything that happened to the signal since the last frame.
Unless the lottery is rigged, buying literally millions of tickets virtually guarantees you a win at some point. So the fat trace on the digital scope shows that it IS possible to win the lottery you're studying - which is absolutely true. But because there's no intensity grading, all it's showing is that some tickets were winners and some were losers. Important information has been lost.
On the analogue scopes, the brightness of any given point depends on the proportion of time the beam spends pointing at that spot. What it's showing is that, just like a real-life lottery, it is overwhelmingly likely that you will lose.
Both displays are telling the truth, of course: you could win, but that outcome is extraordinarily unlikely.
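The "millions of tickets" point is easy to check numerically. The sketch below is purely my own illustration, not anything a real scope computes: it draws two million Gaussian noise samples (roughly one frame's worth at the sample rates discussed above) and treats any sample landing more than 4 sigma from the mean as a winning ticket. The 4-sigma threshold and sample count are arbitrary choices for the demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative numbers: one "frame" of two million samples of pure
# Gaussian noise (mean 0, sigma 1). A "win" is any sample that lands
# more than 4 sigma away from the mean DC level.
n_samples = 2_000_000
samples = rng.standard_normal(n_samples)

hits = np.abs(samples) > 4.0
fraction_winning = hits.mean()   # proportion of time spent "winning"
ever_won = bool(hits.any())      # did any ticket win at all?

print(f"fraction of samples beyond 4 sigma: {fraction_winning:.2e}")
print(f"at least one sample beyond 4 sigma:  {ever_won}")
```

With these numbers, `ever_won` comes out true essentially every run (each ticket wins with probability ~6e-5, so two million tickets are all but guaranteed a winner), while `fraction_winning` stays down around that same ~6e-5. The fat digital trace is reporting the first quantity collapsed to a yes/no; the analogue scope's brightness is reporting the second.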