Because doing otherwise (blanking leading zeros) would give you a false impression of the accuracy.
With all the leading zeros shown, you can always see at a glance whether your DMM is switched to a reasonable range - if there are too many of them, you're using the wrong range.
I must admit, the HPAK multimeter in your first pic (with the space separating the first zero from the significant digits) does look wrong, and indeed it is: an "old style" DMM would suppress that first leading zero on the 1000 V range, since it could only display a "1" or a "-" in the first digit, not a "0".
"Computerized" displays lead to all kinds of stupidity when displaying measured values. Worst are the ones that treat everything as a float and show, e.g., a room temperature reading with 5 digits after the decimal point - no joke, you see this quite often in PC based software written by coders who do not understand resolution and accuracy.
So I'd always prefer a constant number of displayed digits (or blanked but still faintly visible ones, as on many multimeters with discrete LCD / LED displays), representing the actual resolution of the displayed value.
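To illustrate the point, here is a minimal Python sketch of the idea - fixing the digit count from the range and resolution instead of printing a raw float. The function name and parameters are my own invention, not from any real instrument firmware:

```python
def dmm_format(value, total_digits=5, decimals=4):
    """Format a reading the way a fixed-resolution DMM display would:
    a constant digit count, with leading zeros kept so the resolution
    of the selected range stays visible."""
    width = total_digits + 1  # +1 for the decimal point
    return f"{value:0{width}.{decimals}f}"

# 0.235 V on a 5-digit, 4-decimal range -> leading zero kept
print(dmm_format(0.235, 5, 4))   # -> 0.2350
# Room temperature on a 2-decimal range: two decimals, not five
print(dmm_format(21.5, 5, 2))    # -> 021.50
```

The key difference from naive `print(value)` is that the number of decimals is a property of the range, not of whatever the float happens to contain.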
BTW.
Though being European and having been taught "," as the decimal separator, I consider "." _the only valid_ decimal separator in science and engineering, so any scientific or technical application displaying a "," gets cursed by me.