Just another general "heads-up," and, as David L. Jones would put it, "a trap for young players" if you're building a DIY 'scope calibrator box:
If you're using something like an LTC2057 - or any precision amp in general - for your calibrator circuit, the datasheet may give you just an "indefinite" short-circuit output current, usually around 20 or 30mA. As Edwin pointed out, that is a useless condition, since at 20mA or so you're driving the amp's output into a zero-volt short. Normally, when you design your circuit, you'll keep your normal operating current to no more than, say, 10% of the spec'd "short-circuit" current if that's all the datasheet gives you. That works out to roughly 2mA max normal operating output current for the '2057, and any more than that and the quiet, soft output stage of a precision amp starts acting like it has noisy, distorting resistor drivers rather than transistor drivers - and it becomes obvious they're bouncing along on the same die as that 100kHz chopper system. Oops.
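If you want to sanity-check that 10%-of-short-circuit rule of thumb against your own part, here's a minimal sketch (Python). The 20mA short-circuit figure and the 10V full-scale output below are example assumptions, not datasheet guarantees - plug in your own numbers:

[code]
# Rough sanity check of the "keep normal output current under ~10% of the
# spec'd short-circuit current" rule of thumb. Numbers below are example
# assumptions, not datasheet guarantees -- substitute your own part's figures.

I_SHORT_CIRCUIT_A = 0.020   # assumed "indefinite" short-circuit spec, 20 mA
DERATING_FACTOR   = 0.10    # keep normal operation at ~10% of that spec
V_OUT_FULL_SCALE  = 10.0    # assumed calibrator full-scale output, volts

i_max = I_SHORT_CIRCUIT_A * DERATING_FACTOR   # max "quiet" output current
r_load_min = V_OUT_FULL_SCALE / i_max         # lightest load you should drive

print(f"Max recommended output current: {i_max * 1e3:.1f} mA")
print(f"Minimum load at {V_OUT_FULL_SCALE:.0f} V out: {r_load_min / 1e3:.1f} kOhm")
# -> roughly 2 mA and 5 kOhm with these example numbers. Remember the
#    feedback network is also part of the load the output stage has to drive.
[/code]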
Another possible hidden issue here is if you're using that '2057 (check this for any op-amp, really) as some sort of output buffer that drives the calibrator box output terminals. Say the op-amp is used as a unity-gain or boost buffer, and you accidentally short the output terminals to zero volts. Now trace the feedback path back to the inputs of that op-amp. For example: if you're using a 10V Vref on the non-inverting input, and your output is fed back to the inverting terminal (which is suddenly sitting at zero volts), you've just created a 10V differential voltage across the amp's input terminals. In the case of the '2057, the absolute max differential input voltage is only 6V. Oops. So you've just blown out the amp's -input- stage, even though you thought you were safe with that 20~30mA "indefinite output short-circuit current" spec. Now you have to solder in a new chip, and you realize why you should've used a better op-amp in an 8-pin DIP socket <Laughing>.
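To make that failure mode concrete, here's a minimal sketch (Python) of what the inputs see when the output is shorted. The 10V reference and the 6V abs-max figure come from the scenario above - check your own part's abs-max table:

[code]
# What the op-amp's inputs see when the calibrator output is shorted to 0 V.
# Unity-gain buffer case: Vref on the non-inverting input, output fed straight
# back to the inverting input. Numbers are from the example scenario above.

V_REF          = 10.0   # volts on the non-inverting input
V_OUT_SHORTED  = 0.0    # output terminals shorted to ground
V_DIFF_ABS_MAX = 6.0    # assumed absolute-max differential input voltage

v_diff = V_REF - V_OUT_SHORTED   # inverting input is dragged to 0 V by the short
print(f"Differential input voltage during the short: {v_diff:.1f} V")

if abs(v_diff) > V_DIFF_ABS_MAX:
    print("Exceeds the abs-max differential input rating -- input stage at risk.")
else:
    print("Within the abs-max differential input rating.")
[/code]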
So when you design your "simple cheapskate calibrator box with op-amp" <Grin>, keep those things in mind and trace everything out as if the output terminals were shorted. Typically you might want your Vref at, say, 5V (use an off-the-shelf chip) and boost the output to 10V if you need it - of course that needs decent feedback resistors and maybe a trimmer pot, as sketched below. If you can calibrate often, you don't need exotic resistors.
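For the 5V-boosted-to-10V idea, the non-inverting gain is just 1 + Rf/Rg, so equal-value feedback resistors get you x2. A minimal sketch (Python); the 10k values and the 1% mismatch figure are arbitrary illustration numbers, not a recommendation:

[code]
# Non-inverting "boost" stage: Vout = Vref * (1 + Rf / Rg).
# Equal-value resistors give a gain of exactly 2, so a 5 V reference
# becomes 10 V out. Resistor values are arbitrary illustration numbers.

V_REF = 5.0        # off-the-shelf 5 V reference
R_F   = 10_000.0   # feedback resistor, ohms
R_G   = 10_000.0   # gain-setting resistor to ground, ohms

gain  = 1.0 + R_F / R_G
v_out = V_REF * gain
print(f"Gain = {gain:.3f}, Vout = {v_out:.3f} V")

# A 1% mismatch between Rf and Rg shifts the gain by about 0.5%, which is
# why a small trimmer in series with Rg is handy if you can't (or won't)
# buy matched resistors -- or you just recalibrate often.
mismatch = 0.01
gain_err = (1.0 + R_F * (1 + mismatch) / R_G) / gain - 1.0
print(f"Gain error with {mismatch:.0%} Rf mismatch: {gain_err:.2%}")
[/code]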
I would also look at what you -really- need for DC calibrator accuracy on an old, low-res 'scope (what are those TDS300s - 2% vertical accuracy, 8-bit?). I'll bet something like an '6655-5 would be an adequate 5V source; gain that up x2 with a quiet bipolar amp and you'd be fine. Or just put a 10V Vref chip in a box and be done (perhaps with a low-value series resistor for a small bit of short-circuit protection) - maybe that's all you need? Just an idea if you want something as cheap as possible.
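On the "what do you really need" question, a quick back-of-envelope sketch (Python). The 2% scope spec is from the TDS300 example above; the 4:1 test-accuracy ratio is just a common rule of thumb I'm assuming, not a requirement:

[code]
# Back-of-envelope error budget: how accurate does the calibrator need to be
# to check a ~2%-vertical-accuracy, 8-bit 'scope? The 4:1 "test accuracy
# ratio" is a common rule of thumb, not a hard requirement.

SCOPE_ACCURACY = 0.02   # 2% vertical accuracy (from the TDS300 example)
TAR            = 4.0    # desired test-accuracy ratio (calibrator 4x better)

calibrator_budget = SCOPE_ACCURACY / TAR
print(f"Calibrator needs to be good to about {calibrator_budget:.2%}")
# -> ~0.5%, which a decent off-the-shelf 5 V or 10 V reference chip should
#    beat comfortably, even before any trimming.
[/code]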