The reduced resolution is really odd. There may be a reason to use reduced resolution for intermediate data to save memory - though that's still odd, as 1 µV resolution would only need a little over 24 bits, and 32 bits would give plenty of headroom.
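Quick back-of-the-envelope check of that bit count (my numbers - I'm assuming a ±10 V input range here, which the post doesn't state):

```python
import math

span_v = 20.0   # ±10 V full scale (assumed range, not from the meter's spec)
step_v = 1e-6   # 1 µV resolution

# Number of distinct 1 µV steps across the span, and the bits needed to hold them
counts = span_v / step_v
bits = math.log2(counts)
print(f"{counts:.0f} counts -> {bits:.2f} bits")
```

For that assumed range it comes out a little over 24 bits, so a 32-bit integer (or a float) covers it with room to spare.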
Yes, I had assumed this originally, but the internal data is absolutely fine (that's what I used to do the Excel graph) and you can zoom in and see the full resolution, so it's purely some unnecessary (in my view) rounding ... maybe it's for performance, but I can't really see it.
There are so many horrible (but easy to fix) things with this meter ... just looking at the above screenshot: why would you show min, max, and avg at only 10 µV resolution? It just makes them useless; I think I'd rather they weren't there at all!
There's another not-so-nice feature that was noted before: the grid lines are 1.1 µV apart, which is an odd choice. At least it looks like exactly 1.1, with no extra rounding error in the labels.
Don't get me started on the auto-scaling! I keep meaning to capture screenshots of all the bizarre ways it manifests itself. I can understand the 1.1 µV from a pure "best efforts" scaling perspective, but we should have the ability to set a min and max. Being able to set only factors of 10 per division is awful (and again I don't understand this - their auto-scaling can set the division to whatever it wants, yet we can only choose 1 µV, 10 µV, 100 µV, etc.). Ugh!
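For what it's worth, the 1.1 µV spacing looks like a plain span-divided-by-divisions with no snapping at all. Most plotting tools instead round the raw step up to a "nice" 1-2-5 value. A minimal sketch of that classic approach (my own illustration, not the meter's code):

```python
import math

def nice_step(raw_step: float) -> float:
    """Round a raw grid step up to the nearest 1/2/5 x 10^n value,
    the usual trick plotting libraries use for readable axis divisions."""
    exp = math.floor(math.log10(raw_step))
    frac = raw_step / 10 ** exp      # normalised to [1, 10)
    if frac <= 1:
        nice = 1
    elif frac <= 2:
        nice = 2
    elif frac <= 5:
        nice = 5
    else:
        nice = 10
    return nice * 10 ** exp

# e.g. a 10.5 µV span over 10 divisions gives a raw step of 1.05 µV;
# a 1-2-5 scaler would snap that to 2 µV rather than drawing 1.1 µV lines.
print(nice_step(1.05e-6))
```

The irony is that doing it "properly" like this is less code than whatever produces the 1.1 µV grid, and it would also make user-settable min/max trivial to support.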
The graphing has the potential to be so much better than on other meters (the 34470, for example), but the implementation just lets it down all over the place. Still, I guess you can always export the data.