I'm sort of interested in the idea of this, so realistically, what is required to make a meter? Conceptually it seems like a pretty straightforward idea: a voltage reference and an ADC. Compare the incoming voltage with the reference. Use a DSP to analyze the input and do whatever you want with it, whether that's measuring the average of the input, peak-to-peak voltage, Fourier analysis, etc. Feed it to an MCU/MPU and display it on a nice little screen. Of course that's just voltage; I would think current could be measured in a similar manner. Really, as far as quality goes, what would be required to make it as high-quality as possible? I would think basically an extremely accurate and precise voltage reference, a high-quality ADC, and measures to minimize input noise. Is there anything (probably a lot) that I'm missing? Because it really sounds fairly simple to me. Simple idea, hard to pull off, I would imagine.
I've seen people talking about 24-bit or even crazy 32-bit devices (which top out around 21.3 bits ENOB anyway). If you're designing such a device in practice, it's not a case of slamming together whatever parts have the highest specs; it's better to get the highest specification that's still reliable out of every part in your circuit.
A 24-bit part is really not necessary. If you want to design a 20,000-count multimeter, it basically means the A/D converter needs an effective resolution of 20k steps, which is about 14.3 bits. There are 16-bit A/D converters on the market that reach that resolution easily, even in bipolar form to make things 'easier'. Remember these circuits are extremely low noise (anything from 14 bits up is), which requires very well-thought-out circuit design and prototyping.
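Just to show where that number comes from, here's a quick sanity-check calculation (nothing here is specific to any particular part):

```c
#include <math.h>
#include <stdio.h>

int main(void)
{
    /* Bits needed to resolve 20,000 distinct counts. */
    printf("20000 counts -> %.2f bits\n", log2(20000.0));   /* ~14.29 bits */

    /* A 16-bit converter gives far more raw steps than that. */
    printf("16-bit ADC   -> %.0f steps\n", pow(2.0, 16.0)); /* 65536 steps */
    return 0;
}
```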
I also hear people talking about power measurement. As has already been demonstrated, it's enough of a challenge for a community project just to agree on a single set of measurements to be taken; adding power to that is not sane. Furthermore, it would push the budget up considerably because you would need to select a multi-channel A/D converter. On top of that, good power measurements are made by sampling both channels (voltage and current) simultaneously rather than sequentially, and those parts are more expensive as well.
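Just to make the simultaneous-sampling point concrete, this is roughly what the power calculation looks like on paired samples; the arrays are placeholders for whatever the frontend would deliver, so treat it as a sketch rather than a proposal:

```c
#include <stddef.h>

/*
 * Real (average) power from simultaneously sampled voltage/current pairs.
 * If the channels are sampled sequentially instead, the time skew between
 * v[k] and i[k] turns directly into a phase error in the power reading.
 */
static double real_power_watts(const double *v, const double *i, size_t n)
{
    double acc = 0.0;
    for (size_t k = 0; k < n; k++)
        acc += v[k] * i[k];
    return acc / (double)n;
}
```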
What any project needs first, I think, is a budget. I haven't seen people talking about prices, except to say it might become too expensive. Let's make it a reasonable $200. I can tell you that is going to be really tight for an accurate meter if you look at the specs below. MCU, A/D, 'basic parts' and mechanical items (PCB, casing, buttons) will easily cost over $125 for a small project. Then again, I guess a case wouldn't be required if it's not going to be a marketed product; people might be able to build it into their own enclosures.
VDC: 20,000 count (V ranges), 2,000 count (mV), 0.25% in the high ranges
VAC: 2000 count, 2%
AAC: 2000 count, 2%
ADC (DC current): 2000 count, 0.5%
Ohms: 2000 count, 0.5%
(Cap: 100 count, 2%)
Diode test: 1 mA test current, up to 2 V, 2000 count
Frequency: 100 kHz should be possible with a fast enough processor (a rough counting sketch is below)...
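For the frequency range, the simplest scheme I can think of is a gated counter; the sketch below assumes a hypothetical count_edges_during_ms() helper standing in for the timer/input-capture peripheral of whatever MCU gets picked:

```c
#include <stdint.h>

/* Hypothetical helper: count rising edges on the input for 'gate_ms' ms,
 * using whatever timer/input-capture hardware the chosen MCU provides.  */
extern uint32_t count_edges_during_ms(uint32_t gate_ms);

static uint32_t measure_frequency_hz(void)
{
    const uint32_t gate_ms = 1000;            /* 1 s gate -> 1 Hz resolution */
    uint32_t edges = count_edges_during_ms(gate_ms);
    return (edges * 1000u) / gate_ms;         /* edges per second = Hz */
}
```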
An adjustable test current sounds fun and useful, but I have some arguments against incorporating it:
1) When do you actually need to measure the voltage of an LED accurately?
2) If you do, you can build a standalone 'diode tester' more easily. There are dozens of schematics on the internet that do just that, with adjustable current (tenths of a mA to several mA) and respectable resolution.
3) It sounds like a cheap argument, but high test currents drain the battery.
In my mind, the easiest solution for making a multimeter is an MCU with an external ADC linked to it (SPI/parallel). In front of the A/D there is, of course, some amplification, protection, detection and range-selection circuitry. The MCU will have to be a 16- or 32-bit device for sure. The MSP430 sounds reasonable because it's famous for low-power battery operation, but Microchip has some interesting controllers as well in their dsPIC, PIC24 and PIC32 ranges. Since this is a hobbyist community project, I think Microchip is more accessible. But the exact selection isn't necessary now, as I will explain.
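To illustrate how little the 'digital' side really is once the parts are picked, here is a generic read of a 24-bit SPI converter; the spi_* functions are made-up placeholders for the chosen MCU's own driver, not any vendor's actual API:

```c
#include <stdint.h>

/* Placeholder SPI primitives -- substitute the chosen MCU's own driver. */
extern void    spi_select(void);
extern void    spi_deselect(void);
extern uint8_t spi_transfer(uint8_t out);

/* Read one 24-bit two's-complement conversion result, MSB first. */
static int32_t adc_read_sample(void)
{
    spi_select();
    int32_t raw = ((int32_t)spi_transfer(0x00) << 16) |
                  ((int32_t)spi_transfer(0x00) << 8)  |
                   (int32_t)spi_transfer(0x00);
    spi_deselect();

    /* Sign-extend the 24-bit value to 32 bits (bipolar input). */
    if (raw & 0x800000)
        raw -= 0x1000000;
    return raw;
}
```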
Communication with the outside world could be done over USB (an FTDI chip to provide a virtual COM port). Easy, always works. Add a digital isolator and it's a bit safer, but I still wouldn't want to measure mains while it's connected to a PC. Optical isolation is best.
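On the PC side a virtual COM port is just a serial stream, so the meter only needs to print its readings as text lines; uart_write_line() below is a made-up stand-in for whatever UART driver feeds the FTDI chip:

```c
#include <stdio.h>

/* Hypothetical UART driver call feeding the FTDI virtual COM port. */
extern void uart_write_line(const char *line);

static void report_reading(double volts)
{
    char buf[32];
    snprintf(buf, sizeof buf, "VDC %+.4f", volts);  /* e.g. "VDC +1.2345" */
    uart_write_line(buf);
}
```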
Just to note something: it's funny that people are discussing ARMs and connection methods, but I haven't seen much attention paid to the analog part at all.
How do you design the overload protection? Surges, being able to handle high voltages on every input setting, etc.
How do you determine the required accuracy of the electronics you build, instead of gambling on unaffordable high-spec parts?
How do you create ultra-low-noise preamplifier circuits?
How do you set up a reliable form of ranging that doesn't blow up components (e.g. 230 V AC applied on the mV DC range)?
Calibration, reliability, accuracy, bandwidth, noise levels, power supplies, bipolar measurements, diode/cap/resistance measurements..?
etc.
To me, the digital part of a multimeter is just a matter of common sense about which microcontroller and A/D converter fit the bill best. But you do this after you've worked out what you want to capture from the analog frontend and to what degree. In these kinds of circuits the A/D and MCU are tightly coupled. If a project can get away with 10-bit resolution, it might be fine to use the on-chip A/Ds. If high-speed measurements are in order, a fast A/D connected to a 16-bit-wide PMP on a PIC32 might be necessary. The A/D converter is selected on the required speed and on the SNR or SINAD calculated from the desired ranges and resolutions, plus margins and things like that.
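For the SNR/SINAD part of that selection, the usual ideal-quantization rule of thumb, SNR ≈ 6.02·N + 1.76 dB, gives a quick first estimate; a minimal sketch:

```c
#include <math.h>
#include <stdio.h>

/* Ideal-quantization relations between bits and SNR/SINAD (dB). */
static double snr_db_for_bits(double n_bits)  { return 6.02 * n_bits + 1.76; }
static double enob_for_sinad(double sinad_db) { return (sinad_db - 1.76) / 6.02; }

int main(void)
{
    /* ~14.3 effective bits are needed for 20,000 counts (see above). */
    printf("SNR needed for 14.3 bits: %.1f dB\n", snr_db_for_bits(14.3));          /* ~87.8 dB  */
    printf("ENOB of a 90 dB SINAD part: %.1f bits\n", enob_for_sinad(90.0));        /* ~14.7 bits */
    return 0;
}
```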
I think we should get those figures and schematic(!) ideas on the table first, before adding anything else to the discussion of the 'digital section'. In my opinion a high-count multimeter is mostly about the analog circuitry, because that is what needs to be done well. The digital circuit is more a matter of connecting the pins correctly once the right parts are chosen for the job.