The answer that "it just sucks in the voltage axis" isn't quite accurate enough for me, so I decided to explore exactly what's going on.
A scope's vertical accuracy is pretty opaque, since generally you're not looking for super-precise vertical measurements - that much is definitely true.
For example, let's take a reference DC measurement and analyze the uncertainties. I've set my bench supply to 2V, measured by my let's-assume-accurate 4.5-digit meter to be 1.9997V.
(See first screenshot, which I guess I can't embed)
In order to get as accurate a measurement as possible, I zoom in on the scope pretty far - the 10mV/div range. I could have gone further, but scrolling the position knob endlessly is a huge pain: it scrolls about 1 div/turn, which at 10mV/div means roughly 200 turns to dial in the full 2V offset. Anyway. Full scale, I've got 4 positive and 4 negative vertical divisions, so 8 total. Assuming the ADC clips at those edges for maximum displayed resolution, that's 8 div * 10mV/div = 80mV, and 80mV / 256 ADC steps = 0.31mV per ADC step. [Aside: I think it actually doesn't clip there, because the trigger can be set ±5 divisions from center, and in fact the volts-per-division setting isn't even restricted to these coarse values.]
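That step-size arithmetic is simple enough to sketch in a few lines (the 8-bit ADC is real; the assumption that it spans exactly the 8 on-screen divisions is the same one made above, and the function name is mine):

```python
# ADC step size at a given vertical scale, ASSUMING the 8-bit ADC
# spans exactly the 8 on-screen divisions (the aside above notes the
# real converter may actually span a bit more).
def adc_step_mv(mv_per_div, divisions=8, adc_bits=8):
    full_scale_mv = mv_per_div * divisions
    return full_scale_mv / (2 ** adc_bits)

print(adc_step_mv(10))   # 10mV/div -> 0.3125 mV per ADC step
```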
Of course, because I'm measuring a full scale of 80mV range, I have to offset by a lot to see this signal at 2V.
Pause for a sec. Here's the datasheet for my Rigol DS1104z:
http://beyondmeasure.rigoltech.com/acton/attachment/1579/f-0317/1/-/-/-/-/DS1000Z%20Data%20Sheet.pdf

Page 5 has the relevant information.
On the 10mV range (in fact, all ranges 500mV/div or lower) the maximum offset is ±2V, which means I can offset the signal down 2V to put it on the centerline of the scope. Here's the first bit of interesting inaccuracy from the datasheet: DC offset accuracy is ±0.1 div ±2mV ±1% of the offset setting. For my settings, that's ±1mV ±2mV ±20mV, for ±23mV total. So we can already ignore ADC graduation, as it's two orders of magnitude smaller down here.
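That offset error budget, as a quick sketch (the ±0.1 div, ±2mV, and ±1%-of-offset terms are straight off the datasheet; the function name is mine):

```python
# DC offset accuracy budget per the DS1000Z datasheet:
# +/-0.1 div, +/-2 mV, +/-1% of the offset setting.
def offset_error_mv(mv_per_div, offset_mv):
    return 0.1 * mv_per_div + 2.0 + 0.01 * abs(offset_mv)

# 10 mV/div with the full 2 V of offset dialed in:
print(offset_error_mv(10, 2000))   # -> 23.0 mV
```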
Now, we know we're going to measure our signal to within ±23mV, and assuming my meter is spot on, what we actually measure is 2.004V on a 1.9997V signal, or 4.3mV out. That's actually pretty good!
And we didn't even account for DC gain accuracy: ±4% of full scale for ranges below 10mV/div, ±3% of full scale at 10mV/div and above. Not that we had to, because we centered the ADC on our signal exactly via the offset.
Let's take a different measurement, something with no offset. I've set the PSU to 1V, measured at 0.9962V actual. I can easily see this on-screen with the vertical scale at 500mV/div.
(SCREENSHOT 2)
Now we're measuring 1.030V on the scope, both per the cursor and the average measurement. That's a bit worse than before: 33.8mV of error. Okay, let's break this one down. You'd think the offset error would be lower, given that the offset is set to zero volts (surely it just shorts the offset input to ground, right?! Maybe not!). But assuming it comes into play, and we do have to assume that, we've got ±0.1 * 500mV/div ±2mV ±1% of zero = ±52mV. That right there is larger than our measured error, so assuming the offset circuit isn't magically perfect at zero offset (which it may or may not be), we're within spec!
But now we've also got to account for DC gain accuracy, because we're not centered on the zero of the ADC! Our range is above 10mV/div, so we're at 3% of full scale error. Now, I don't know what "full scale" means for the ADC here, but it could well be ±5 divisions = 5V. 3% of 5V is 150mV, so if that's right, we're WAY better than worst case! Let's be conservative and assume the datasheet meant 3% of our INPUT signal: 3% of 0.9962V is about 30mV. Ignoring offset error, that's still pretty darn close to our actual measurement error.
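Putting the two readings of "full scale" side by side (the 3% figure is from the datasheet; which quantity it applies to is my guesswork, same as above):

```python
# DC gain accuracy at 500 mV/div, under two interpretations of
# "full scale" (the datasheet doesn't say which is meant).
gain_pct = 0.03                  # 3% for ranges >= 10 mV/div
mv_per_div = 500.0

fs_span_mv = 10 * mv_per_div     # +/-5 divisions -> 5 V total span
signal_mv = 996.2                # the 0.9962 V input itself

print(gain_pct * fs_span_mv)     # worst case if "full scale" = screen span
print(gain_pct * signal_mv)      # much smaller if it means the input
```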
Now, OP said the measured DS2072A was out by 100mV, which is significantly higher than what I've measured my scope to be off by. But again, depending on how the measurement was set up, 100mV of offset error isn't crazy if volts/div is relatively high, nor is it insane for gain error.
All of these parameters (except the DC offset accuracy, which doesn't depend on the input signal) should improve with frequency too, though I can't find specs for that in the datasheet. Making components accurate at both DC and at frequency is hard, considering DC is infinitely low frequency. As an example, you'll never find a traditional swept spectrum analyzer that goes all the way down to DC. Keeping things linear across an infinite number of decades (0.1Hz, 0.01Hz, 0.001Hz...) is basically impossible, and ultra-low frequencies aren't generally useful to pay attention to anyway! If you were going to measure those, you'd use a multimeter.
Conclusion: despite being, on an absolute scale, terrible resolution, 8 bits is actually high enough that ADC step size isn't really the problem here. However, because scopes aren't really designed for operation at DC, gain error at DC is QUITE high, and if you're using the offset (maybe even if you aren't? I'd love to know in detail exactly how the input offset circuit works), offset error is pretty high too.