Maybe a blasphemous question, but what do you need 6-digit absolute accuracy for, except for calibrating other instruments?
All validation/qualification labs I know had basic requirements of 0.1%, or in rare cases 0.05%. If your meter is supposed to be one order of magnitude better than your specification, you end up at 0.005%, which is 4 1/2 digits.
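The arithmetic behind that digit count can be sketched as follows (a minimal illustration I'm adding, assuming the conventional "N 1/2 digit" display where the leading digit only shows 0 or 1):

```python
# Relate a DMM's full-scale count to its best-case resolution.
# An "N 1/2 digit" meter displays up to 1 followed by N nines,
# i.e. a 4 1/2 digit meter reads up to 19999 counts.

def resolution_percent(full_scale_counts: int) -> float:
    """One count expressed as a percentage of full scale."""
    return 100.0 / full_scale_counts

# 4 1/2 digits: 20000 counts -> 0.005% of full scale (50 ppm),
# matching the 10x-better-than-0.05% requirement above.
print(resolution_percent(20_000))      # 0.005

# 6 1/2 digits: 2000000 counts -> 0.00005% (0.5 ppm) resolution.
print(resolution_percent(2_000_000))   # 5e-05
```

Note this is display resolution only; the meter's actual accuracy specification (ppm of reading plus ppm of range) is a separate, usually looser, number.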
Of course you can use 7 or more digits for relative purposes. Seeing small changes, even over a long timespan, is a great feature, so the instrument should have sufficient stability. From a pragmatic point of view, this stability can also be ensured by tracking an external standard instead of going through the exhaustive and expensive full calibration process.
What are real-life applications for 6-digit absolute accuracy, and not just stability? I'm curious.
Few professional people will need the absolute accuracy, but because DVM technology has been 'boringly brilliant' for several decades now, the internal reference accuracy and stability justify 6.5 digits or maybe more for a decent bench multimeter.
In my experience, 6.5-digit meters generally offer other features that make them very attractive for professional ATE systems: fast read rates, various built-in math functions, and GPIB, USB, and/or LAN interfaces.
In all my years at work, I have never seen an engineer get excited when a decent DMM came back from calibration. Nor have I ever seen an engineer check two 6.5-digit DMMs against each other the way many hobbyists obsessively do. Many of those hobbyists seem to progress to obsessing over state-of-the-art voltage references and calibrators.
Don't ask me what they think they need this stuff for, but in other hobbies there will always be people who want the titanium/platinum/silver/gold version of whatever widget(s) the hobby demands. I think it's partly a status thing, partly a FOMO thing, and partly an OCD thing.