You are missing how the whole control system works. I know that the driver has a 16-bit PWM, so the voltage can be controlled in steps of about 1.6 mV on our 53 V supply.
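As a sanity check on that step size, here is the back-of-the-envelope arithmetic. Whether one LSB works out to roughly 0.8 mV or 1.6 mV depends on whether the output swings 0..53 V or, as in a full H-bridge, -53..+53 V; the bridge topology is an assumption on my part, not something stated above.

```python
# LSB size for a 16-bit PWM on a 53 V supply (numbers from the discussion;
# the H-bridge / bipolar-swing case is my assumption).
supply_v = 53.0
steps = 1 << 16                        # 65536 PWM steps

lsb_unipolar = supply_v / steps        # output swings 0..53 V
lsb_bipolar = 2 * supply_v / steps     # output swings -53..+53 V (H-bridge)

print(f"unipolar: {lsb_unipolar * 1e3:.2f} mV/step")  # ~0.81 mV
print(f"bipolar:  {lsb_bipolar * 1e3:.2f} mV/step")   # ~1.62 mV
```

The ~1.6 mV figure matches the bipolar case.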
This is far less critical than one might think.
Some time ago I did a position controller in an FPGA, and since the controller rate was pretty high (I think it was 170 kHz), I was afraid that the less-than-8-bit voltage resolution (a 40 MHz FPGA clock gives only about 235 PWM steps per period) would lead to problems. It turned out that even with an artificially reduced PWM resolution of a few bits, the controller worked flawlessly.
Viewed from a signal processing point of view, the chain voltage -> current/torque -> velocity -> position is a triple integrator; seen as a filter, each integration gives 20 dB/decade of attenuation, so 60 dB/decade in total. If you have a PWM frequency of 16 kHz and a position "frequency" of 16 Hz, that is three decades, or 3*60 = 180 dB of attenuation. It is roughly the same principle as a delta-sigma DAC, where you start with a fast 1-bit DAC and filter it down to a slower, but much higher resolution, signal.
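A toy simulation makes the delta-sigma analogy concrete: a crude 1-bit output, updated fast and then low-pass filtered, recovers a much finer signal than its resolution suggests. The 16 kHz update rate and 16 Hz signal are the numbers from above; the first-order error-feedback loop and the 100 Hz filter corner are my own illustrative choices, standing in for the motor's integrating dynamics.

```python
# Compare plain 1-bit quantization against first-order delta-sigma
# modulation, both followed by the same low-pass filtering.
import math

FS = 16000          # modulator / PWM update rate, Hz
F_SIG = 16          # slow "position-loop" frequency, Hz
N = FS              # simulate one second

def lowpass(x, fc, fs):
    """Single-pole IIR low-pass; applied twice below for 40 dB/decade."""
    a = 1.0 - math.exp(-2.0 * math.pi * fc / fs)
    y, out = 0.0, []
    for s in x:
        y += a * (s - y)
        out.append(y)
    return out

def rms(x):
    return math.sqrt(sum(v * v for v in x) / len(x))

sig = [0.5 + 0.4 * math.sin(2 * math.pi * F_SIG * i / FS) for i in range(N)]

# Plain 1-bit quantizer: its error sits in-band and survives the filter.
plain = [1.0 if s >= 0.5 else 0.0 for s in sig]

# First-order delta-sigma (error feedback): the quantization error is
# pushed to high frequencies, where the filtering removes it.
ds, err = [], 0.0
for s in sig:
    u = s + err
    bit = 1.0 if u >= 0.5 else 0.0
    err = u - bit
    ds.append(bit)

def filt2(x):
    return lowpass(lowpass(x, 100, FS), 100, FS)

ref, f_plain, f_ds = filt2(sig), filt2(plain), filt2(ds)
skip = 2000  # drop the filter start-up transient
e_plain = rms([a - b for a, b in zip(f_plain[skip:], ref[skip:])])
e_ds = rms([a - b for a, b in zip(f_ds[skip:], ref[skip:])])
print(f"filtered error, plain 1-bit:  {e_plain:.4f}")
print(f"filtered error, delta-sigma:  {e_ds:.4f}")
```

The noise-shaped version ends up with a far smaller in-band error than the plain quantizer, even though both emit only 0 or 1 at each step.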
For motor control it probably makes sense to keep some reasonable PWM resolution so as not to have excessive current ripple, but for a motor whose inductance is not too low, a handful of bits should be enough for that.
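To put numbers on the ripple point, the standard formula for a hard-switched half-bridge gives a feel for how inductance and PWM frequency trade off. The 53 V supply is from the discussion; the inductance and PWM frequency below are assumed example values, not figures from this thread.

```python
# Peak-to-peak inductor current ripple in a hard-switched half-bridge:
#   delta_i = V_dc * D * (1 - D) / (L * f_pwm)
# 53 V is from the discussion; L and F_PWM are assumed example values.
V_DC = 53.0     # supply voltage, V
F_PWM = 16e3    # assumed switching frequency, Hz
L = 1e-3        # assumed motor inductance, H

def ripple_pp(duty):
    return V_DC * duty * (1.0 - duty) / (L * F_PWM)

for d in (0.1, 0.5, 0.9):
    print(f"D = {d:.1f}: {ripple_pp(d):.2f} A pk-pk")  # worst case at D = 0.5
```

With these example values the worst-case ripple (at 50% duty) is under an ampere; halving the inductance or the PWM frequency doubles it, which is where the "not too low inductance" caveat comes from.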