You can consider them offset and gain errors. Say you are measuring resistance with a 3.5 digit meter. Lead/contact resistance might add 0.5 ohm, plus an additional 0.1% uncertainty from the current source. If we wanted to specify the 100 ohm range with a single percentage covering readings from 10 to 100 ohm, we would have to quote the worst case, 5.1% (a true 10 ohm might be measured as 10.51 ohm). That figure would greatly overestimate the error for 100 ohm resistors, where the actual error would be only 0.6% (100.6 ohm instead of 100 ohm). It would also be wrong for 1 ohm, where the 0.5 ohm offset alone already makes the actual error exceed 50%.
If we instead specify the 100 ohm range as 0.1% + 5 digits (on a 3.5 digit meter the resolution on that range is 0.1 ohm, so 5 digits is 0.5 ohm), the spec gives correct results all the way from 1 ohm to 100 ohm. Of course, specifying the error as a linear function of the reading is again an approximation; you may find even more detailed specs for the $$$$ high-precision meters. The offset term can be a combination of various factors, like offset in the buffer amplifier or non-linearity of the ADC near zero.
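To make the two terms concrete, here is a minimal sketch of the "gain % + fixed offset" error model using the numbers from the example above; the function name and structure are my own illustration, not from any particular meter's datasheet.

```python
def worst_case_error(reading_ohms, gain_pct, offset_ohms):
    """Worst-case absolute error for a 'gain % + offset' spec:
    a term proportional to the reading plus a fixed floor."""
    return reading_ohms * gain_pct / 100.0 + offset_ohms

# Physical error model from the text: 0.1% current-source
# uncertainty plus 0.5 ohm lead/contact resistance.
for r in (1, 10, 100):
    err = worst_case_error(r, 0.1, 0.5)
    print(f"{r:>3} ohm: +{err:.3f} ohm = {100 * err / r:.1f}% of reading")
# prints roughly: 50.1% at 1 ohm, 5.1% at 10 ohm, 0.6% at 100 ohm
```

A "0.1% + 5 digits" spec on the 100 ohm range expresses exactly this: with 0.1 ohm resolution, 5 digits equals the 0.5 ohm offset term, so the quoted bound tracks the real error at every point of the range instead of only at one end.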