Distortion, like frequency response, is improved (i.e., reduced) by applying negative feedback.
Approaching the cutoff frequency, loop gain is dropping, so the voltage gain no longer exactly matches the resistor divider ratio; distortion rises in inverse proportion to the remaining loop gain.
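A quick sketch of that, with made-up numbers (the G0 and pole frequency are assumptions, not a specific part): a single-pole op-amp model at a closed-loop gain of 10, showing loop gain falling and the distortion multiplier 1/(1 + T) rising as frequency approaches the closed-loop cutoff (~GBW/G).

```python
import math

G0, f_p, G = 1e5, 10.0, 10.0   # assumed: DC open-loop gain, open-loop pole (Hz), closed-loop gain
GBW = G0 * f_p                 # gain-bandwidth product (1 MHz with these numbers)

for f in [100.0, 1e3, 1e4, 1e5]:
    A = G0 / math.sqrt(1 + (f / f_p) ** 2)  # open-loop gain magnitude at f
    T = A / G                               # loop gain
    print(f"f = {f:>8.0f} Hz: loop gain = {T:9.1f}, "
          f"distortion multiplier = {1 / (1 + T):.2e}")
```

At low frequency the multiplier is tiny; right at the closed-loop cutoff (100 kHz here) loop gain has fallen to about 1, so distortion suppression is only a factor of two.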
Normally, distortion is spec'd at unity gain, where it is lowest (and bandwidth is highest, approaching GBW). It isn't always measured at unity gain, though, especially for amps where the distortion there is vanishingly small; a THD figure of 0.00003% or so is usually measured at a higher gain, 100 or 1000, where the THD is correspondingly higher.
The ratio is not exactly the gain ratio: distortion is reduced from its open-loop value by a factor of the form 1 / (1 + G0/G), where G0 is the amp's DC open-loop gain and G is the closed-loop gain you're measuring at. Op-amps have very high G0, so this factor is nearly proportional to G and the ratio looks like the gain ratio. A lower-gain amp, like a single transistor stage, makes this distinction more apparent.
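To put numbers on that (the G0 values are illustrative assumptions, not measurements): compare the distortion reduction factor 1/(1 + G0/G) at closed-loop gains of 1 and 10, for a typical op-amp and for a low-gain stage.

```python
def reduction(G0, G):
    """Distortion reduction factor relative to open loop: 1 / (1 + G0/G)."""
    return 1.0 / (1.0 + G0 / G)

# assumed open-loop gains: ~1e6 for an op-amp, ~50 for a single transistor stage
for name, G0 in [("op-amp", 1e6), ("one-transistor stage", 50.0)]:
    ratio = reduction(G0, 10) / reduction(G0, 1)
    print(f"{name}: THD(G=10)/THD(G=1) = {ratio:.2f} "
          f"(naive gain ratio would be 10)")
```

With G0 = 1e6 the ratio comes out essentially 10, matching the gain ratio; with G0 = 50 it is (1 + 50)/(1 + 5) ≈ 8.5, visibly short of it, which is the distinction above.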
Tim