I came across this design for an inrush-limiting soft-start circuit:
The initial voltage at vg is set by the capacitive voltage divider formed by C3 and the series combination of C1 and C2. If C2 << C1, that series combination is approximately just C2, and at the instant of the voltage step the capacitors have much lower impedance than R1, so R1 can be ignored too. The startup voltage at vg is therefore V1 * C3/(C3 + C2), and C3 is chosen so that vg starts high enough to keep M1 initially off.

The voltage at vg then decays until it reaches the turn-on threshold of M1, at which point negative feedback through C2 holds vg up to keep the current Iout delivered to the out node approximately constant. With C2 << C1, the current through C2 is approximately (C2/C1)*Iout, so R1 acts as a current-sense resistor while M1 is in saturation. R1 and C2 are chosen so that R1*C2 = Vg/(Iout/C1), where Vg is the gate voltage that biases the MOSFET in saturation to deliver Iout. In this case, a 5 V supply and a desired current of 100 mA yield Vg = 4.1 V, which requires R1*C2 = 0.041.

Eventually v(out) ramps up to V1, the current through the capacitors falls to zero, the MOSFET exits the linear region, and the voltage at vg resumes its decay with time constant R1*C3.
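The component-value math above can be sketched numerically. This is a minimal sketch, not a verified design: the values of C1 (1000 µF), R1 (100 kΩ), and the startup gate-voltage target (4.6 V) are my assumptions for illustration, not from the original circuit; only V1 = 5 V, Iout = 100 mA, and Vg = 4.1 V come from the post.

```python
# Component-value math for the soft-start circuit (assumed values noted).
V1   = 5.0      # supply step (V), from the post
Iout = 0.100    # desired inrush current limit (A), from the post
Vg   = 4.1     # gate voltage that biases M1 for Iout (V), from the post
C1   = 1000e-6  # ASSUMED bulk/output capacitance (F)

# Constant-current condition: Vg = R1 * (C2/C1) * Iout, so R1*C2 = Vg*C1/Iout
R1C2 = Vg * C1 / Iout
print(f"R1*C2 = {R1C2:.3f} ohm*farad")   # matches the 0.041 in the post

# Pick an R1 (ASSUMED) and derive C2 from the product
R1 = 100e3
C2 = R1C2 / R1
print(f"C2 = {C2 * 1e9:.0f} nF")

# Startup divider: vg(0+) = V1 * C3/(C3 + C2); solve for C3 given a
# target startup voltage (ASSUMED margin above Vg so M1 starts off)
vg0_target = 4.6
C3 = C2 * vg0_target / (V1 - vg0_target)
print(f"C3 = {C3 * 1e6:.2f} uF")
```

With these assumptions the required C3 works out to a few microfarads; a larger startup margin pushes C3 (and the startup delay) up further.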
It's a simple, cheap circuit that works, but it has some disadvantages:
1. The current ramps up and down slowly.
2. Choosing C3 with enough margin that Vgs starts below threshold means a long startup delay, during which the circuit is waiting for the MOSFET to begin turning on and no current flows.
3. Once the capacitors are charged, R1 becomes a problem: it slows the decay of the gate voltage. The MOSFET is "on", but rDS(on) is probably higher than optimal; ideally vg would drop to ground as soon as the inrush is done. Reducing R1 forces C2 to increase (and C3 along with it), so the time constant is essentially fixed by the supply voltage and the desired current limit.
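Point 3 can be checked numerically: holding the current limit and the startup divider ratio constant while shrinking R1 leaves R1*C3 unchanged. Again a sketch with assumed values (C1 = 1000 µF and a 4.6 V startup target are my assumptions, not from the post):

```python
# Show that the gate-decay time constant R1*C3 is pinned by the supply
# and current target, independent of the R1 chosen.
V1, Vg, Iout = 5.0, 4.1, 0.100   # from the post
C1  = 1000e-6                    # ASSUMED bulk capacitance (F)
vg0 = 4.6                        # ASSUMED startup gate-voltage target (V)

taus = []
for R1 in (100e3, 10e3):         # try shrinking R1 by 10x
    C2 = Vg * C1 / Iout / R1     # keep the current limit the same
    C3 = C2 * vg0 / (V1 - vg0)   # keep the startup divider the same
    taus.append(R1 * C3)         # gate-voltage decay time constant
    print(f"R1 = {R1:8.0f} ohm  ->  C3 = {C3 * 1e6:6.1f} uF  tau = {taus[-1]:.3f} s")

# tau = (Vg*C1/Iout) * vg0/(V1 - vg0) in both cases: R1 cancels out.
```

Algebraically, tau = R1*C3 = (Vg*C1/Iout) * vg0/(V1 - vg0), so R1 drops out entirely, which is why the slow gate decay can't be fixed just by resizing R1.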
Are there easy ways to improve this circuit to address some of its shortcomings?