Yeah, it is just too complicated to say. It only makes sense to compare two specific devices under specific operating conditions, or to compare two approaches under a common optimization metric including cost, efficiency, and so forth. A 50/60 Hz transformer followed by a linear regulator will essentially always be less efficient than an SMPS, but that is due to many factors, not only the transformer itself, and is guided as much by cost effectiveness as by anything else.
The advantages for low frequency transformers are lower eddy current losses, stray capacitances, and antenna losses. The advantage for high frequency devices is that the magnetizing current is much lower for a given inductance. This means you need less inductance, so you can use fewer turns of larger gauge copper, reducing copper losses, and cores of lower permeability, which allow lower hysteresis losses and higher resistivity for lower eddy current losses even at high frequency.
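The fewer-turns point can be made concrete with Faraday's law for a sinusoidal drive, N = V_rms / (4.44 · f · B_max · A_core). The core area and flux densities below are illustrative assumptions (silicon steel saturating near 1.5 T at 50 Hz, ferrite run near 0.2 T at 100 kHz), not values from the answer above:

```python
# Sketch: minimum primary turns from Faraday's law, N = V / (4.44 f B A).
# All component values here are assumed, datasheet-style numbers.

def turns_required(v_rms, f_hz, b_max_t, core_area_m2):
    """Minimum turns to keep peak flux density at or below b_max_t."""
    return v_rms / (4.44 * f_hz * b_max_t * core_area_m2)

V_RMS = 230.0   # mains-level primary voltage (assumed)
AREA = 4e-4     # 4 cm^2 core cross-section (assumed)

n_50hz = turns_required(V_RMS, 50.0, 1.5, AREA)     # silicon steel, ~1.5 T
n_100khz = turns_required(V_RMS, 100e3, 0.2, AREA)  # ferrite, ~0.2 T

print(f"50 Hz:   {n_50hz:.0f} turns")
print(f"100 kHz: {n_100khz:.1f} turns")
```

Even though the ferrite runs at far lower flux density, the 2000x jump in frequency still cuts the turn count by more than two orders of magnitude, which is where the copper-loss saving comes from.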
However, this is dwarfed by the main difference between a linear and a switch-mode supply. An SMPS transfers less energy per cycle because of its higher operating frequency, so the energy for each cycle can be stored in a small inductor or gapped transformer. This allows efficient regulation through duty cycle control. A linear supply instead stores the energy in a capacitor and uses a variable resistor to regulate the voltage, dissipating a lot of power in the process. This makes SMPSs much more efficient than linear power supplies with line frequency transformers, but it isn't really related to the losses in the transformer at all.
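A quick back-of-the-envelope comparison shows why the pass element dominates a linear supply's losses. The input/output voltages and load current here are illustrative assumptions:

```python
# Sketch: linear-regulator dissipation vs ideal buck-converter transfer.
# Numbers are illustrative assumptions, not from the answer above.

def linear_efficiency(v_in, v_out):
    """Best case for a linear regulator: the pass transistor drops
    (v_in - v_out) at the full load current, so eta = v_out / v_in."""
    return v_out / v_in

def linear_dissipation(v_in, v_out, i_load):
    """Power burned in the pass element ('variable resistor')."""
    return (v_in - v_out) * i_load

# 12 V rectified/filtered input regulated down to 5 V at 2 A:
eta = linear_efficiency(12.0, 5.0)
p_loss = linear_dissipation(12.0, 5.0, 2.0)
print(f"linear: eta = {eta:.0%}, pass-element loss = {p_loss:.0f} W")

# An ideal buck converter instead runs at duty cycle D = v_out / v_in
# and hands the stored inductor energy to the load with no dissipative
# element in the path:
duty = 5.0 / 12.0
print(f"buck: D = {duty:.2f}, ideal eta = 100% (real parts lose a few %)")
```

The linear supply is capped at v_out/v_in efficiency no matter how good its parts are, while the switcher's duty-cycle regulation has no such floor, which is the point made above.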
Finally, an SMPS must consider switching losses in the MOSFETs and diodes. A line transformer supply simply doesn't have these losses because the AC waveform comes to you from the power company. These losses get worse with frequency, so generally high frequency SMPSs have more loss than lower frequency SMPSs. It isn't really cut and dried though, because the higher frequency you go, the smaller and cheaper the transformers, inductors, and capacitors can be, allowing for cheaper 'oversizing' for improved efficiency. A properly cost-optimized design has to trade off the cost of the silicon vs. the cost of the passive devices vs. the desired efficiency, plus performance requirements like ripple, noise, regulation, and emissions, to find an optimal design.
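The "worse with frequency" part follows from a first-order loss model: each switching transition wastes a roughly fixed amount of energy, so the power lost is that energy times the switching frequency. The device values below (switched voltage/current, transition times, gate charge) are illustrative datasheet-style assumptions:

```python
# Sketch: first-order MOSFET switching-loss estimate, showing the
# linear growth with switching frequency. Device values are assumed.

def switching_loss(v_ds, i_d, t_rise, t_fall, f_sw):
    """Overlap (V*I) loss during the turn-on and turn-off transitions:
    roughly 0.5 * V * I * (t_rise + t_fall) joules per cycle."""
    return 0.5 * v_ds * i_d * (t_rise + t_fall) * f_sw

def gate_drive_loss(q_gate, v_gate, f_sw):
    """Energy to charge and discharge the gate once per cycle."""
    return q_gate * v_gate * f_sw

V_DS, I_D = 100.0, 5.0   # switched voltage and current (assumed)
T_R = T_F = 20e-9        # 20 ns transitions (assumed)
QG, VG = 30e-9, 12.0     # 30 nC gate charge, 12 V drive (assumed)

for f in (100e3, 1e6):
    p = switching_loss(V_DS, I_D, T_R, T_F, f) + gate_drive_loss(QG, VG, f)
    print(f"{f / 1e3:>6.0f} kHz: {p:.2f} W")
```

Ten times the frequency means ten times the switching loss for the same transitions, which is exactly the trade against smaller, cheaper magnetics described above.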
If I had to give an answer to your relatively meaningless question, I would probably say that the physics favors low frequency but economics favors high frequency. If you really want to set a record for minimum transformer losses and aren't concerned with cost or output power, a low frequency transformer will probably win. That lets you minimize antenna losses and eddy currents, which are a problem even at zero power. Signal transformers can be really high efficiency at low audio frequencies, but are expensive, bulky, and very low power. If you have to make something cost effective, a high frequency transformer will win every time.