What you observed is very common with low-end or older DMMs that are calibrated using pots. Because of the non-linear nature of their A/D converter, and with only a single calibration point available from the pot, when the low end is right the top end may be off, and vice versa.
Often, if there's a service manual, the calibration voltage for these meters is a single-digit voltage, and not necessarily near the top of a range. A lab-grade DMM, by contrast, has multiple adjustment points: separate zero and gain adjustments for each range. For example, in DC alone the Keysight 34401A has six separate gain adjustments across the voltage ranges and five different zero adjustments, for a total of eleven.
It used to drive me crazy trying to optimize the accuracy of a handheld DMM using the service manual, because the high end always ended up way off spec. These days I'm down to one of two approaches:
* The first approach requires a voltage calibrator: map out the measured voltage on a graph across the entire measurement range, then pick a calibration voltage that lets both the high and low ends fall within spec.
* Alternatively, determine the voltage you care about most, and calibrate against that particular point. If 3 V is what you measure most often, apply 3 V from your reference and adjust the pot for a 3 V reading. Since we often care mostly about voltages below 20 V DC, the high end will likely still be off spec, but at least the voltage that matters to you is accurate.
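The first approach above can be sketched numerically. This is a hypothetical illustration, not a model of any particular meter: the gain-error and non-linearity coefficients are made-up values, and in practice you would use the voltages you actually mapped out with the calibrator. The idea is the same, though: for each candidate calibration point, assume the pot is set so the meter reads exactly right there, then see which point keeps the worst-case error over the whole range smallest.

```python
# Hypothetical sketch: model a meter whose A/D has a small gain error
# plus a mild non-linearity, then search for the single-point
# calibration voltage that minimizes the worst-case reading error.
# The 1.01 gain and 0.0005 non-linearity terms are made-up examples.

def raw_reading(v_true):
    """Uncalibrated meter: 1% gain error plus a mild non-linearity."""
    return 1.01 * v_true - 0.0005 * v_true ** 2

def calibrated_reading(v_true, v_cal):
    """Reading after the pot is adjusted so the meter is exact at v_cal."""
    gain = v_cal / raw_reading(v_cal)
    return gain * raw_reading(v_true)

# Test voltages you would map out with the calibrator: 0.5 V .. 20 V
test_points = [0.5 * i for i in range(1, 41)]

def worst_error_pct(v_cal):
    """Largest relative error (%) across the range for a given cal point."""
    return max(abs(calibrated_reading(v, v_cal) - v) / v * 100
               for v in test_points)

best = min(test_points, key=worst_error_pct)
print(f"best single cal point: {best} V, "
      f"worst-case error {worst_error_pct(best):.3f}%")
```

With this made-up error model the best point lands near the middle of the range rather than at the top, which matches the observation that service manuals often specify a single-digit calibration voltage instead of a full-scale one.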