The official definition of "high speed" that I had to learn for a recent exam was: any situation where the rise or fall time of a given signal is faster than 20% of the propagation time of that signal. The propagation time is the length of the PCB trace or signal path divided by the signal speed. And the signal speed is the speed of light divided by the square root of the dielectric's relative permittivity.
In formula form:
high speed is where t_rise <= 20% * t_propagation.
t_propagation = l/v (l = length of PCB trace or signal path)
v = c/sqrt(epsilon_r) (v = signal speed, c = speed of light, epsilon_r = relative permittivity)
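As a quick sketch, the criterion above is easy to check numerically. This is just a minimal helper (the function names and the FR-4 permittivity of ~4.3 are my own assumptions; the 20% threshold and the formulas come straight from the definition above):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def propagation_time(length_m: float, eps_r: float) -> float:
    """t_propagation = l / v, with v = c / sqrt(epsilon_r)."""
    v = C / eps_r ** 0.5
    return length_m / v

def is_high_speed(t_rise_s: float, length_m: float, eps_r: float) -> bool:
    """High speed per the definition above: t_rise <= 20% of t_propagation."""
    return t_rise_s <= 0.2 * propagation_time(length_m, eps_r)

# Example (assumed values): a 10 cm trace on FR-4 with eps_r ~ 4.3.
# A 100 ps edge qualifies as high speed, a 1 ns edge does not.
print(is_high_speed(100e-12, 0.10, 4.3))  # True
print(is_high_speed(1e-9, 0.10, 4.3))     # False
```

Note how the answer depends on both the edge rate and the trace length, which is exactly why no single frequency cutoff exists.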
So there's no standard frequency that divides high speed from not high speed. It really depends on the situation. When sending 50 Hz power across Russia you could speak of a high speed signal according to the above definition: the 50 Hz sine is definitely rising and falling faster than it takes to transport it across all of Russia ;-)
Also, the engineers I spoke with at Thales Netherlands who work on radar systems consider everything below 1 GHz to be DC ;-)
Edit: stupid mistake corrected