There is no rule of thumb. You should consider the specs valid, i.e., no derating is needed. Sometimes you can even go the other way: for example, many inductors are specified for a 40 degC temperature rise at rated current, and in some applications a 50 degC rise is just fine.
However, it's quite challenging to actually analyze:
(1) the real losses in the inductor, especially the AC losses (copper: skin and proximity effects; core: hysteresis losses), and
(2) the thermal performance of your system, i.e., how the heat is removed.
The AC part of the losses is often the hard one, since almost no inductor manufacturer provides enough data to model it. If you only do the DC (I²R) calculation in a switching converter, the actual losses may be double or even triple that figure!
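To make that concrete, here is a back-of-the-envelope sketch in Python. All the numbers (RMS current, DCR, and especially the AC multiplier) are made-up illustrations, not values from any datasheet:

```
# Rough inductor loss estimate - illustrative numbers only.
i_rms = 2.0    # A, RMS inductor current in the converter (assumed)
dcr = 0.050    # ohm, DC resistance from the datasheet (assumed)

p_dc = i_rms**2 * dcr      # plain DC copper loss
ac_multiplier = 2.5        # guess: real losses are often 2-3x the DC-only figure
p_total_est = p_dc * ac_multiplier

print(f"DC copper loss:       {p_dc:.2f} W")
print(f"Estimated total loss: {p_total_est:.2f} W")
```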
The thermal side requires analyzing the PCB layout (in the case of SMD inductors), heatsinking, airflow, worst-case ambient temperature, etc.
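A crude steady-state check looks like this, assuming you have a thermal resistance figure from the datasheet or from a prototype measurement (again, the numbers below are invented for illustration):

```
# Crude steady-state temperature check - values assumed, not measured.
p_total_est = 0.5      # W, total inductor loss from the estimate above
r_th = 40.0            # K/W, inductor-to-ambient thermal resistance (assumed)
t_ambient_max = 60.0   # degC, worst-case ambient inside the enclosure (assumed)

t_rise = p_total_est * r_th
t_part = t_ambient_max + t_rise

print(f"Temperature rise: {t_rise:.0f} K")
print(f"Part temperature: {t_part:.0f} degC at worst-case ambient")
```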
You should analyze these effects as well as you can, given your skills and the design time available, then estimate how accurately you were able to do so, and derate based on that confidence. Prototyping and measuring actual temperatures in an emulated worst-case environment is highly recommended. BTW, the same applies to most types of components.
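One way to turn "how much do I trust my analysis" into a number is simply to inflate the loss estimate by your expected error before checking it against the rating. Purely illustrative, with assumed figures:

```
# Confidence-based derating sketch - pick your own factors.
p_total_est = 0.5          # W, estimated loss (see above)
confidence_factor = 1.5    # only a rough AC-loss guess was done, so allow +50% error
p_design = p_total_est * confidence_factor

r_th = 40.0                # K/W, same assumed thermal resistance as above
rated_rise = 40.0          # K, temperature-rise rating from the datasheet (assumed)

worst_case_rise = p_design * r_th
print(f"Design-for loss: {p_design:.2f} W -> {worst_case_rise:.0f} K rise "
      f"(rating: {rated_rise:.0f} K)")
```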
Only tantalums are a rare exception to the general rule of "the part is (barely) usable within its specifications - just add some safety margin for your own part of the design". For some reason, the datasheet specifications for tantalums imply ridiculously poor reliability, so you need to apply a significant extra derating factor (typically 0.4 to 0.6) in addition to your standard safety margin for surges, worst-case conditions, voltage regulation tolerances, etc.
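As a worked example of stacking that tantalum factor on top of a normal design margin (both factors below are example choices, not recommendations for any specific part):

```
# Tantalum voltage derating example - numbers are illustrative.
v_rated = 25.0            # V, tantalum capacitor voltage rating
tantalum_factor = 0.5     # the extra 0.4-0.6 factor mentioned above
design_margin = 0.8       # your usual margin for surges/tolerances (assumed)

v_max_allowed = v_rated * tantalum_factor * design_margin
print(f"Use this 25 V tantalum only up to about {v_max_allowed:.0f} V")  # ~10 V
```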