Took a brief look at the analog color TV standards. What a marvelous piece of engineering, wow!
The first one was NTSC, the American standard. They had to keep backwards compatibility with the existing black&white TV receivers, with the already allocated spectrum, and with the existing channel bandwidth, yet somehow squeeze the color information in there. They came up with a very clever design. There were many details to consider, most of them neglected here; the focus is on the 64us delay line (a delay line that an NTSC receiver doesn't need, or have - there might be other, smaller delays in NTSC receivers, but not the 64us one, or at least that's my understanding).
The color information was to be sent as analog quadrature amplitude modulation of a 3.579545 MHz subcarrier for NTSC (4.43361875 MHz +/-1Hz for PAL).
In the complex plane representation, the amplitude of the (U+V) vector corresponds to saturation, and the phase of the (U+V) vector corresponds to the hue (the color itself). In practice, because of propagation conditions, there was a problem with unwanted phase shifts in the (U+V) vector, where anything bigger than 5° was perceived by the eye as a wrong color (hence the joke that NTSC stands for Never Twice the Same Color).
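To make the geometry concrete, here is a tiny Python sketch (standard library only, with made-up U and V values) showing that the length of the chroma vector gives the saturation, its angle gives the hue, and that an unwanted 5° phase shift changes the hue while leaving the saturation untouched:

import cmath, math

U, V = 0.20, 0.15                 # made-up color-difference values for one pixel
chroma = complex(U, V)            # the chroma vector in the (U, V) plane

saturation = abs(chroma)                        # vector length -> saturation
hue_deg = math.degrees(cmath.phase(chroma))     # vector angle  -> hue

# an unwanted 5 degree phase shift picked up on the way rotates the vector,
# i.e. the hue changes while the saturation stays exactly the same
shifted = chroma * cmath.exp(1j * math.radians(5))
print(saturation, hue_deg)
print(abs(shifted), math.degrees(cmath.phase(shifted)))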
Very briefly, R, G, B were the video signals for each color. They were summed together (a weighted sum, whose coefficients will be neglected in all the following formulas, to keep it simple). That was the Y signal, the luma (often loosely called luminance), the same as the video signal in black and white TV.
Y = R + G + B
Most of the info was in the Y signal. The color info was encoded in two other analog signals, U and V, as the difference between blue and luma, and between red and luma (again with some weighting factors omitted here for simplicity):
U = B - Y
V = R - Y
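For reference, the weighting factors skipped above look roughly like this in practice; the numbers below are the usual textbook values for PAL-style encoding, shown only as an illustration, not as the exact broadcast spec:

# approximate PAL-style weighting factors, for illustration only
def rgb_to_yuv(r, g, b):
    y = 0.299 * r + 0.587 * g + 0.114 * b   # luma: weighted sum of R, G, B
    u = 0.492 * (b - y)                     # scaled "B minus Y" difference
    v = 0.877 * (r - y)                     # scaled "R minus Y" difference
    return y, u, v

print(rgb_to_yuv(1.0, 1.0, 1.0))   # pure white: Y = 1, U = V = 0 (no chroma)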
In practice, these differences were small, and their bandwidth (<1.5MHz) was narrower than the bandwidth of the Y signal. The U and V analog signals were used to quadrature-modulate a chroma subcarrier, at 4.43361875 MHz +/-1Hz for PAL. In the end, the spectrum of a color TV broadcast looked something like this:
Spectrum of a System I television channel with PAL. Source: https://en.wikipedia.org/wiki/PAL

The receiver received three analog signals: Y, U and V. From these, R, G and B can be recovered by simply adding and/or inverting analog signals.
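To see the quadrature part in numbers, here is a rough Python/NumPy sketch: baseband only, no filtering or bandwidth limiting, and the usual U-on-sine / V-on-cosine convention is assumed. It modulates constant U and V values onto the subcarrier, demodulates them with the two reference phases, then turns Y, U, V back into R, G, B with the same weighting factors as above:

import numpy as np

FSC = 4.43361875e6                  # PAL chroma subcarrier, Hz
t = np.arange(0, 10 / FSC, 1e-8)    # ~10 subcarrier cycles, sampled at 100 MHz

u, v = 0.20, 0.15                   # color-difference values, held constant here
# quadrature modulation: U rides on one phase of the subcarrier, V on the other
chroma = u * np.sin(2 * np.pi * FSC * t) + v * np.cos(2 * np.pi * FSC * t)

# synchronous (product) demodulation with the two reference phases, followed by
# low-pass filtering (a plain average here, good enough for a demo)
u_rx = 2 * np.mean(chroma * np.sin(2 * np.pi * FSC * t))
v_rx = 2 * np.mean(chroma * np.cos(2 * np.pi * FSC * t))
print(u_rx, v_rx)                   # roughly 0.20 and 0.15 come back out

# and from Y, U, V back to R, G, B (same weighting factors as before)
y = 0.55                                    # made-up luma for this pixel
r = y + v_rx / 0.877
b = y + u_rx / 0.492
g = (y - 0.299 * r - 0.114 * b) / 0.587     # G falls out of the luma equation
print(r, g, b)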
The PAL TV standard came after NTSC, and one of its goals was to reduce the color errors. Thus, the quadrature modulation was slightly different from NTSC's. The new trick in PAL was that odd and even TV lines alternated between sending the V signal and the -V signal: a 180° phase shift (an inversion of the signal) was applied, only to the V component, on every second TV line.
Thus, by knowing both the previous line's and the current line's chroma signals, and assuming two consecutive lines carry about the same color info and suffer about the same error phase shift, one can recover the correct phase (phase = color) by simply mirroring the color vector Ec relative to the U axis and then averaging it with the vector from the previous line.
Hence the need to delay the chroma info by one line (64us). Because of this trick of mirroring and comparing the color vector between consecutive lines, in practice the picture was still OK to watch in PAL for phase shift errors as big as 18°, vs only 5° in NTSC.
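The error cancellation can be checked numerically. Mirroring about the U axis is just taking the complex conjugate, so, assuming both lines carry the same color and pick up the same phase error, the sketch below (made-up numbers again) shows the hue coming out exactly right, with only a tiny loss of saturation (a factor of the cosine of the error angle):

import cmath, math

U, V = 0.20, 0.15
phi = math.radians(10)                 # a fairly large unwanted phase shift
rot = cmath.exp(1j * phi)

prev_line = complex(U,  V) * rot       # normal line, rotated by the error
curr_line = complex(U, -V) * rot       # "PAL switched" line (V inverted), same error

# mirror the current line about the U axis (= complex conjugate),
# then average it with the previous line coming out of the 64us delay line
corrected = (prev_line + curr_line.conjugate()) / 2

print(math.degrees(cmath.phase(complex(U, V))))   # the true hue angle
print(math.degrees(cmath.phase(corrected)))       # same angle: the error cancels
print(abs(corrected) / abs(complex(U, V)))        # cos(10 deg): slight desaturation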
Contrary to what I thought at the beginning, my understanding now is that the 64us delay line in PAL was not used to delay the entire luma+chroma signal (from 50Hz to 6.5MHz), but only the 4.43MHz quadrature-modulated chroma (the signal shown in red in the spectrum pic).