I received my SDM3065-SC a couple days ago. So far I haven't had time to use it for anything real, but I've been playing with it a bit. I did a quick resistance measurement of a short, just clipping a couple of leads together so there was something to measure, and watched the trend chart scroll by with the statistics displayed underneath. It varies more than I expected: 0.0411 to 0.0449 ohm for a couple of brand new (literally first time connected to anything, still with that fresh silicone smell) test leads clipped together. I hope the uA range doesn't have the same variance.
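Those extremes work out to roughly a 3.8 milliohm peak-to-peak spread. For anyone curious how that spread compares to the statistics line under the trend chart, here's a minimal Python sketch of the same min/max/mean/stdev calculation; only the 0.0411 and 0.0449 endpoints are from my actual test, the in-between readings are made up for illustration:

```python
from statistics import mean, pstdev

# Hypothetical short-circuit readings in ohms; only the 0.0411 and
# 0.0449 extremes are real values quoted above, the rest are filler.
readings = [0.0411, 0.0423, 0.0435, 0.0449, 0.0418, 0.0440]

lo, hi = min(readings), max(readings)
spread = hi - lo  # peak-to-peak variation, like the meter's span stat
print(f"min={lo} max={hi} span={spread * 1000:.1f} m-ohm")
print(f"mean={mean(readings):.4f} stdev={pstdev(readings) * 1000:.2f} m-ohm")
```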
General first impression: the UI has some fixable awkwardness. The depth of menus you have to dig through to reach the screenshot feature is a good example. There is that nice bright blue, tangible, physical "Shift" button on the keypad, but it is underutilized, with no effect on any on-screen menus or under-screen button usage. The numbering and ordering of the "Probe Hold" measurements displayed is counterintuitive. It makes sense up to the 8th measurement, but after that it keeps the "1st/2nd/3rd measurement" labels while shifting the measured values up for the 9th measurement and beyond. If they changed the ordering of the measured values so the most recent is always first on the list, or else incremented the numbering to reflect the actual Nth measurement, it would make much more sense.
The layout of the scanner card seems odd to me... I was planning on making a breakout box with a bunch of banana jacks and using ribbon cable or something similar to keep everything nice and neat going from breakout to DMM, but with the board's layout and terminal block choice I'm thinking ribbon cable will be a PITA. Also, it would be nice if the front panel measurements were exposed as "channel 0" when in scanning mode, so you'd have one channel with the full V and A ranges. Not sure if the front panel technically counts as one channel plus a second dedicated current channel, so perhaps "channel 0" alone wouldn't be sufficient...
The custom sensor feature is pretty nice, for my purposes anyway, but it needs work. It has the same UI awkwardness, and you have to navigate back and reload your custom sensor file every time you switch to any other function (DCV/DCI/etc.). For the built-in supported temperature sensors the last one selected is always the default, but not for your custom sensor. You can use arbitrary current-, voltage-, or resistance-based sensors, specify whatever custom unit you want displayed, and it will automatically prefix the magnitude (m/u/n/etc.). I'll be trying it out more over the next couple of weeks.
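That automatic magnitude prefixing is just engineering notation, i.e. scaling the value to the nearest power-of-1000 SI prefix. A minimal Python sketch of the idea (the function name and the supported prefix range are my own choices here, not anything from Siglent's firmware):

```python
import math

# Power-of-1000 exponent -> SI prefix symbol (clamped range, my choice)
_PREFIXES = {-9: "n", -6: "u", -3: "m", 0: "", 3: "k", 6: "M"}

def si_format(value, unit, digits=3):
    """Format value with an engineering SI prefix, DMM-display style."""
    if value == 0:
        return f"0 {unit}"
    exp = math.floor(math.log10(abs(value)) / 3) * 3
    exp = max(min(exp, 6), -9)  # clamp to the prefixes defined above
    scaled = value / 10 ** exp
    return f"{scaled:.{digits}g} {_PREFIXES[exp]}{unit}"
```

So a raw reading of 0.00123 with unit "V" would display as "1.23 mV", and a custom unit string just rides along after the prefix.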
But I'm happy so far. At this price, with a 16-channel scanner card, my nitpicks are easy to work around.