In the past, the V/cm and time/cm controls were often used in the "uncal" (uncalibrated) position.
In many situations, the percentage difference between two voltage levels is more important than the absolute voltages.
For example, the nominal voltage between peak white and peak sync of a PAL analog TV signal is 1.0 volt.
Between peak sync and blanking there will be 300 mV (30% of the overall voltage), and between blanking and peak white, 700 mV (70%).
All this is cool if you are looking at a signal that is exactly at nominal voltage, so it fits perfectly between the appropriate graticule lines, but the signal might be a bit off (while still within limits), say 0.995 V or 1.005 V.
In the former case, peak sync to blanking will be 298.5 mV and blanking to peak white 696.5 mV, while in the latter the values will be 301.5 mV and 703.5 mV respectively.
OK, pretty close, but they don't quite fit the graticule lines.
The easy fix is to adjust the V/cm (in "uncal") until they do fit, and then you can read off the values as percentages.
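To make the arithmetic concrete, here is a minimal Python sketch (function and variable names are purely illustrative) that reproduces the figures above: the 30%/70% split holds whether or not the overall amplitude is exactly nominal; the levels simply no longer land on graticule lines drawn for a 1.0 V signal.

```python
def pal_levels_mv(total_v):
    """Return (peak sync to blanking, blanking to peak white) in millivolts,
    assuming the nominal 30%/70% split of the overall amplitude."""
    sync_to_blanking = 0.30 * total_v * 1000.0   # 30% of the overall amplitude
    blanking_to_white = 0.70 * total_v * 1000.0  # 70% of the overall amplitude
    return sync_to_blanking, blanking_to_white

for total in (1.000, 0.995, 1.005):
    s2b, b2w = pal_levels_mv(total)
    print(f"{total:.3f} V total -> sync-blanking {s2b:.1f} mV, blanking-white {b2w:.1f} mV")
```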
With NTSC, a special IEEE graticule is used because, unlike PAL, the black level is slightly elevated above blanking, so you need to fit the two voltage extremes into the graticule and determine whether the levels have the correct relationships.
Special graticules were widely used in analog TV, for the above and many other tests.
These were both physical and projected, with the latter preferred as they don't have parallax problems.
Two factors arose:
(1) The graticules were a standard size, and not all analog 'scope faces were, so the 'scope display size needed to be adjusted to suit the graticule.
(2) Some tests required the amplitude and apparent duration of only the part of the test signal that was of immediate interest to be "tweaked" to fit into the graticule. (Reference points were normally provided in the graticule for this.)
With the advent of DSOs, the need for continuously adjustable V/cm and time/cm has diminished, as cursors can easily provide percentage readings, and special graticules have been supplanted by masks.
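For what it's worth, the percentage a cursor readout gives you amounts to something like the sketch below (again purely illustrative, not any particular scope's cursor feature):

```python
def percent_of_amplitude(level_delta_v, overall_v):
    """Express a measured level difference as a percentage of the overall amplitude."""
    return 100.0 * level_delta_v / overall_v

# Cursors on peak sync and blanking of a 0.995 V signal:
print(f"{percent_of_amplitude(0.2985, 0.995):.1f}%")  # prints 30.0%
```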