Some food for thought: what is current limiting, really? How is it usually implemented in a power supply circuit? What happens when the output terminals (i.e. the device being powered) see a short circuit and the output voltage can't drop below a certain level?
Often voltage regulation and current regulation are done separately. It could be that the supply cannot regulate a voltage below a certain level, but is still capable of regulating a current that results in a voltage below that level.
1. What is the voltage (1.25 V?) below which it usually becomes more difficult to design a bench power supply, i.e. where you need to start adding extra circuitry to go any lower? For instance, a simple linear regulator or a switching regulator has a lowest voltage it can output.
2. For what kind of lab work (i.e. what kind of circuits) would a person regularly need to go below this easily achievable lowest voltage? Perhaps you kind forum users can list the cases where you use low voltages, with and without high currents, in your daily work and home projects. It would be interesting to see what applications need such low voltages.
3. How often (in what percentage of your projects) do you have to go down to really low voltages?
Thank you
Cicada
1] Depends on the topology. The main issue is that if you want to work from a single supply rail, you need rail-to-rail capable feedback networks. Otherwise, you need negative supplies for your control circuitry so that a zero-volt output is not sitting right at the rail of the feedback network.
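As an illustration (assuming the classic LM317-style adjustable linear regulator that the 1.25 V in your question hints at): the output is set by \$V_{OUT} = V_{REF}\left(1 + \frac{R_2}{R_1}\right) + I_{ADJ} R_2\$ with \$V_{REF} \approx 1.25\ \mathrm{V}\$. With the adjustment network returned to ground, the minimum output (at \$R_2 = 0\$) is \$V_{REF}\$ itself, hence the familiar 1.25 V floor. Returning the bottom of \$R_2\$ to roughly \$-1.25\ \mathrm{V}\$ instead of ground shifts the adjustment range so it includes 0 V, which is exactly the "negative supply for the control circuit" case above.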
2] I often go very close to zero when I am doing things like playing with precision measurements. I might want to use a few hundred millivolts as a reference voltage in a feedback circuit, and use a power supply to generate it.
At work, I do research on ICs, and we might have bias voltages below 100 mV that we want to control with very high precision (to a few tens of microvolts). If I were to apply 1.25 V to most of the chips we work with, they would die within seconds. (\$V_{DD}\$ below 800 mV is not uncommon. The low-power guys in my research group do ARM cores that operate at a \$V_{DD}\$ of 300 mV, if I remember correctly.)
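Rough numbers, assuming a typical 0–30 V bench supply (my assumption, not a figure from above): holding a bias to within \$20\ \mu\mathrm{V}\$ out of \$100\ \mathrm{mV}\$ is \$\frac{20\ \mu\mathrm{V}}{100\ \mathrm{mV}} = 0.02\,\%\$, and relative to the supply's 30 V full scale it is under 1 ppm, far finer than the setting resolution of an ordinary bench supply.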
3] At work, most of the time. At home, not that commonly, although I am in the process of making a device to provide precision voltages specifically for biasing and calibration purposes.
If you mean with a power supply, I would say none, because if I am doing this, I likely need more performance than a power supply will provide. Sourcemeters, which combine a multimeter and a precision power supply, would be more suitable for these applications, but I usually cobble something together as needed.
Interesting point, though I would like to add that source meters are far more expensive. If you are working on something where you need 7–8 bias voltages in the range of a few hundred mV, using a source meter for each quickly becomes very expensive (compared to using precision supplies).