Yes, good point, that could very well be the case. The Jupiter-T is just a receiver with a 1 PPS and 10 kHz output. Any suggestions on how to properly test the jitter on the 1 PPS? Looking at a few tutorials, it seems easy if you have a 100K scope <g>. I can use persistence mode on my DS1052e and monitor the rising edge of the 1 PPS pulse stretched across the display.
You can't just look at the pulse; you have to look at the period of the 1 PPS. So you'd trigger on the 1 PPS, but tell the DSO to delay the data acquisition by about 1 s so that you can see the rising edge of the next pulse. The jitter on that rising edge will show you the quality of the output. Maybe your DSO can also measure the period of the 1 PPS directly and report that.
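If you can get edge timestamps out of the scope (many DSOs can log period measurements over USB), the same period-jitter idea reduces to simple statistics. Here's a minimal sketch, assuming you've already exported a list of rising-edge times in seconds; the function name and data format are mine, not anything the scope provides:

```python
import numpy as np

def pps_period_jitter(edge_times):
    """Given rising-edge timestamps of a 1 PPS train (in seconds),
    return (mean period, RMS period jitter).

    The period list is just the first difference of the edge times;
    its standard deviation is the cycle-to-cycle jitter the delayed
    scope trigger would let you see."""
    periods = np.diff(np.asarray(edge_times, dtype=float))
    return periods.mean(), periods.std(ddof=1)
```

A perfect 1 PPS (edges at exactly 0, 1, 2, 3 s) gives zero jitter; real GPS receiver output will show the sawtooth and quantization wander in the std figure.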
If your DSO can't do that, you could trigger off the 1 PPS on one channel and look at your Rb on the other. But you'd have to tweak the Rb to be exactly on frequency; otherwise the Rb signal will just walk across the screen and smear the results.
The best way to make this measurement is with a time interval counter that measures the time from the rising edge of the 1 PPS from the GPSDO to the rising edge of a signal (ideally 1 PPS) derived from the Rb. Make many measurements, collect the data electronically, then process it with a program like TimeLab. I don't know if your DSO can be coaxed into doing something like that. In the end, the results should look something like the attached. The diagonal part on the left is due to limitations of the measuring equipment. The flat part shows the performance of the OCXO. Looks like the Trimble UCCM is okay, but nowhere near as good as the Z3801A. Both graphs then turn and follow the "GPS line," which is the approximate performance limit of the GPS system. Finally, at about 10K sec., the Rb starts to peek out from underneath the GPS line. The black line would have turned and followed a similar path to the blue line if the data run had lasted longer.