My understanding: the calibration and adjustment process of a modern sub-GHz DSO is pretty simple; all you need is calibrated DC and AC sources, and the software handles the rest. For example, for a traditional Tektronix TDS/TBS scope, you need to (see the sketch after this list):
1. Send a bunch of accurate DC voltages into the scope, e.g. 20 V, 2 V, 1.6 V, 0.8 V, 0.4 V, 0.32 V, 0.2 V, 0.16 V, 0.08 V, 0.04 V, 0.03 V, 0.02 V, 0.015 V, 0 V, both polarities.
2. Send a bunch of leveled AC sine waves into the scope, e.g. 5 kHz, 1 MHz, 50 kHz, 20 MHz, 100 MHz (max bandwidth), at amplitudes of 1 Vpp and 2.5 Vpp.
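If the adjustment is automated, the DC half is nothing more than stepping a programmable source through that voltage list while the scope's self-cal firmware does its thing. A minimal sketch, assuming a SCPI-controllable calibrator on GPIB; the commands and address are placeholders, not any particular instrument's actual syntax:

```python
# Hypothetical sketch: step a programmable DC calibrator through the
# adjustment points while the scope's self-cal routine runs.
# SCPI commands and the GPIB address are placeholders -- check your
# calibrator's manual for the real syntax.
import time
import pyvisa

DC_POINTS_V = [20, 2, 1.6, 0.8, 0.4, 0.32, 0.2, 0.16,
               0.08, 0.04, 0.03, 0.02, 0.015, 0]

rm = pyvisa.ResourceManager()
cal = rm.open_resource("GPIB0::4::INSTR")  # assumed address

for v in DC_POINTS_V:
    for sign in (+1, -1):
        if v == 0 and sign < 0:
            continue                          # 0 V has no polarity
        cal.write(f"SOUR:VOLT {sign * v}")    # placeholder SCPI command
        cal.write("OUTP ON")
        time.sleep(5)  # let the source settle while the scope runs its step
```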
This is 90% of the work. Then you also need:
1. A 1 kHz, 0 mV to 800 mV edge signal from a pulse or square-wave generator, for calibrating the trigger system.
2. Time-mark generator for checking timebase accuracy.
Then you're done.
I think the time-mark generator is the easiest piece of test equipment here. Many oscilloscopes only have a low-performance timebase, so verifying its accuracy only takes a 10 ppm reference clock. In 2020 this is almost trivially solvable; for best results you can use a $50 oven-controlled crystal timebase for sub-1-ppm accuracy. In this age, any signal generator built around a Direct Digital Synthesizer can meet the requirement, provided it has a good timebase. However, the calibration procedure may call for a special pulsed waveform rather than a standard square wave, and a suitable trigger output may also be necessary, so plug-and-play may not work. It should still be an easy problem to solve with suitable configuration (e.g. programming a waveform, as sketched below) or some external circuitry. Alternatively, a complete DIY time-mark generator can be built for $30.
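For example, if your generator speaks SCPI, turning it into a crude time-mark source is a few commands. A minimal sketch, assuming roughly the Keysight 33500-series command dialect (other generators will differ, so treat these strings as placeholders):

```python
# Minimal sketch: configure a DDS/AWG function generator as a crude
# time-mark source (narrow pulses at a 1 us period). SCPI dialect is
# assumed to be roughly Keysight 33500-series; check your manual.
import pyvisa

rm = pyvisa.ResourceManager()
gen = rm.open_resource("USB0::0x0957::0x2807::MY12345678::INSTR")  # placeholder

gen.write("SOUR1:FUNC PULS")             # pulse waveform, not a square wave
gen.write("SOUR1:FREQ 1E6")              # 1 MHz -> one mark every 1 us
gen.write("SOUR1:FUNC:PULS:WIDT 20E-9")  # narrow 20 ns marks
gen.write("SOUR1:VOLT 1")                # 1 Vpp
gen.write("OUTP1 ON")
```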
The edge signal is only a minor detail; any signal generator should be okay.
Then you need an accurate DC source, 0.5% accurate. Getting an inherently accurate DC source is difficult and expensive, but you can always adjust a voltage manually against a multimeter, and even a relatively cheap meter can easily reach 0.5% at DC. For manual performance verification that's all you need. For automatic adjustment, the voltage also needs to stay stable for a few minutes so the scope can complete each adjustment step.
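One thing worth sanity-checking is the meter's spec at the millivolt end of the voltage list, where the "counts" term of a DMM spec dominates over the headline percentage. A throwaway check, with made-up spec numbers standing in for a real datasheet:

```python
# Does a DMM spec of +/-(pct of reading + counts) stay within the 0.5%
# target at each DC cal point? Spec numbers below are invented for
# illustration -- substitute your meter's datasheet values.
POINTS_V = [20, 2, 1.6, 0.8, 0.4, 0.32, 0.2, 0.16,
            0.08, 0.04, 0.03, 0.02, 0.015]

PCT = 0.05 / 100   # assumed: +/-0.05% of reading
COUNTS = 5         # assumed: +5 counts
RANGES = [(0.6, 0.0001), (6.0, 0.001), (60.0, 0.01)]  # (full scale V, 1 count in V)

for v in POINTS_V:
    fs, lsb = next(r for r in RANGES if v <= r[0])  # lowest usable range
    err = PCT * v + COUNTS * lsb
    print(f"{v:>7.3f} V: +/-{err*1e3:.2f} mV = {100*err/v:.3f}% "
          f"({'OK' if err / v < 0.005 else 'TOO COARSE'})")
```

Running this shows the low points (15 mV, 20 mV, ...) blowing past 0.5% purely on counts unless the meter has enough resolution on its lowest range, which is exactly the kind of trap to check before trusting a cheap meter here.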
Finally, a leveled sine-wave generator, which is basically an RF signal generator, but it needs an amplitude accuracy of around 3% to 7%, from 0.5 Vpp to 5 Vpp, from near-DC to sub-GHz. This is the most difficult and challenging instrument in the entire calibration procedure; 80% of signal generators will NOT be able to meet this requirement. A 3% amplitude error is only 20·log10(1.03) ≈ 0.26 dB, but most RF signal generators are only 1 dB accurate. There are several workarounds:
1. Used equipment. Tektronix SG50x series is the vintage classic. But the premise of the question is "without correct equipment", so...
2. Leveling the amplitude manually against an accurate RF power meter, as long as the signal generator's amplitude stays stable for a few minutes. For example, the R&S NRP-Z81 power sensor has an amplitude uncertainty of around 0.05 dB between +20 and +25 °C. RF power meters are expensive, especially if you want a meter you can trust to be completely within specification, but at least they're standard lab instruments with many uses, unlike a leveled sine-wave generator or a scope calibrator. (See the conversion sketch after this list.)
3. Some really high-end RF signal generators allow you to connect an external power meter or sensor and use that for automatic level control. Basically just like 2, but now it's automatic.
4. If the RF signal generator supports amplitude modulation with a DC-coupled input, you can DIY the automatic level control by building a suitable error amplifier.
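Since the power meter reads dBm into 50 Ω while the scope spec is in Vpp, you end up doing the same conversions constantly. A small helper sketch (plain math, nothing assumed beyond a 50 Ω system):

```python
# Conversions between scope-land (Vpp) and power-meter-land (dBm, 50 ohm),
# plus the percent-to-dB figure quoted above. Pure math, 50 ohm assumed.
import math

R = 50.0  # system impedance in ohms

def vpp_to_dbm(vpp: float) -> float:
    """Sine wave: P = Vrms^2 / R, with Vrms = Vpp / (2*sqrt(2))."""
    p_watts = (vpp / (2 * math.sqrt(2))) ** 2 / R
    return 10 * math.log10(p_watts / 1e-3)

def amplitude_error_db(fraction: float) -> float:
    """Voltage (amplitude) error expressed in dB, e.g. 0.03 -> ~0.26 dB."""
    return 20 * math.log10(1 + fraction)

print(vpp_to_dbm(1.0))           # 1 Vpp -> ~3.98 dBm
print(vpp_to_dbm(5.0))           # 5 Vpp -> ~17.96 dBm
print(amplitude_error_db(0.03))  # 3%    -> ~0.26 dB
```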
But note that 2, 3, and 4 have pitfalls. You must ensure that the signal the oscilloscope sees is exactly what you're measuring with the meter. All adapters, cables, connectors, power dividers, and especially attenuators (needed for RF leveling, since 5 Vpp, around +18 dBm, is too much for most RF power sensors) will introduce amplitude errors due to VSWR and insertion loss. Any extra attenuation must be measured and zeroed out manually.
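To get a feel for how much the mismatch term alone can cost, here's a back-of-the-envelope sketch; the VSWR figures are typical-datasheet-style guesses, not measured values:

```python
# Worst-case mismatch ripple between two mismatched ports:
# 20*log10(1 +/- G1*G2), with G = (VSWR - 1) / (VSWR + 1).
# VSWR numbers below are illustrative guesses, not from any datasheet.
import math

def gamma(vswr: float) -> float:
    return (vswr - 1) / (vswr + 1)

def mismatch_db(vswr_source: float, vswr_load: float) -> tuple[float, float]:
    g = gamma(vswr_source) * gamma(vswr_load)
    return 20 * math.log10(1 - g), 20 * math.log10(1 + g)

lo, hi = mismatch_db(1.5, 1.3)   # e.g. generator output vs. scope input
print(f"mismatch ripple: {lo:+.3f} / {hi:+.3f} dB")

# A well-matched pad (VSWR ~1.1) in front of the load shrinks the term:
lo, hi = mismatch_db(1.5, 1.1)
print(f"with pad:        {lo:+.3f} / {hi:+.3f} dB")
```

With those guessed numbers the bare mismatch ripple comes out around ±0.23 dB, which already eats almost the entire 0.26 dB budget; that's why the pads and dividers have to be characterized rather than assumed.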
Oscilloscope calibration procedures already say you must use known cable types and lengths, known power dividers, adapters, etc., specified by the manufacturer, so that attenuation won't create errors. Doing all of this manually with improvised procedures should definitely be possible, but it will have more pitfalls.
Of course, if you're not a voltnut and you just need to fix a broken scope, I don't think 4% leveling is really needed; an RF signal with 0.5 dB flatness should be good enough. But this is the Metrology forum.
BTW, right now I'm playing with the idea of whether it's possible to reach the required amplitude accuracy with inexpensive circuitry. My idea: feed a CW signal into a variable attenuator, then split the signal into two paths, one to the output (the scope under test) and another to a detector. The detector can be an inexpensive log converter (Analog Devices' high-performance RF log amps are 0.25 dB accurate). Finally, drive the variable attenuator with the error signal from the detector, close the control loop, and you get a leveled sine wave. With this design, it should be possible to perform the same task with a standard signal generator. But I don't really expect a success.
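As a sanity check of the loop idea, here's a toy simulation (first-order integrating loop; the source ripple, detector error, and gains are invented numbers, not tied to any real part):

```python
# Toy simulation of the leveling loop: CW source with flatness ripple
# -> variable attenuator -> output, with a log detector feeding an
# integrating error "amplifier". All numbers invented for illustration.
import math

SETPOINT_DBM = 4.0    # target output level (~1 Vpp into 50 ohm)
SOURCE_DBM   = 10.0   # raw generator level
KI           = 0.3    # integrator gain per step

atten_db = 0.0        # control variable: attenuation we command
for step in range(50):
    ripple = 1.5 * math.sin(step / 8)            # pretend flatness error
    out_dbm = SOURCE_DBM + ripple - atten_db
    detected = out_dbm + 0.05 * math.sin(step)   # made-up detector error
    error_db = detected - SETPOINT_DBM
    atten_db = max(0.0, atten_db + KI * error_db)  # integrate toward setpoint

print(f"final output: {out_dbm:+.2f} dBm (target {SETPOINT_DBM:+.2f} dBm)")
```

In hardware the integrator would just be an op-amp error amplifier driving a PIN-diode or voltage-variable attenuator, but the loop math is the same: the residual error ends up dominated by the detector's own accuracy, which is why the 0.25 dB log amp figure matters.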