The 100mA max has actually been a deprecated part of the standard for a long time.
Well that is certainly a bold claim. I don't think it's correct, though.
Let's put it differently: try to find a USB port which enforces it... I have never found a USB port that delivers less than 500mA. Limiting the current based on negotiation just causes the user grief, at the cost of extra parts and software. It never made sense to negotiate power at this level. Maybe the very, very first USB host ports had variable current limiting, but I guess they found out quickly that it wasn't a good idea.
I guess you've never used an unpowered USB 1.1/2.0 hub. It has 500mA for the whole hub and all unpowered downstream devices.
But you also seem to mistakenly think that it's the host's responsibility to enforce the negotiated current. It isn't. It's the device's responsibility to obey. If it doesn't follow this, it's noncompliant.
A port can be designed to protect itself, and that's OK. I've never heard of the protection being variable, but I have seen hosts that shut down the port if a device pulls more than the port's maximum current.
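To make the negotiation concrete: a bus-powered device declares its current requirement in the bMaxPower field of its configuration descriptor (in 2mA units for USB 2.0), and the host-side stack is supposed to refuse to configure the device when the hub's remaining power budget can't cover it. A minimal sketch in C, with field names as in the USB 2.0 spec; the literal values are only illustrative, not from any particular device:

```c
#include <stdint.h>

/* USB 2.0 configuration descriptor layout, packed to match the wire format.
 * This is just to show where the "negotiated" current actually lives. */
struct usb_config_descriptor {
    uint8_t  bLength;             /* 9 for a configuration descriptor */
    uint8_t  bDescriptorType;     /* 0x02 = CONFIGURATION */
    uint16_t wTotalLength;        /* this descriptor plus all interface/endpoint descriptors */
    uint8_t  bNumInterfaces;
    uint8_t  bConfigurationValue;
    uint8_t  iConfiguration;
    uint8_t  bmAttributes;        /* bit 7 reserved (1), bit 6 self-powered, bit 5 remote wakeup */
    uint8_t  bMaxPower;           /* requested bus current in 2 mA units */
} __attribute__((packed));

/* A bus-powered device asking for the full 500 mA would report something like
 * this (a real descriptor set also contains interface and endpoint descriptors,
 * so wTotalLength would be larger): */
static const struct usb_config_descriptor cfg = {
    .bLength             = 9,
    .bDescriptorType     = 0x02,
    .wTotalLength        = 9,
    .bNumInterfaces      = 1,
    .bConfigurationValue = 1,
    .iConfiguration      = 0,
    .bmAttributes        = 0x80,  /* bus-powered */
    .bMaxPower           = 250,   /* 250 * 2 mA = 500 mA */
};
```

Whether anything actually limits the current to that number is exactly the point under discussion: the host only does bookkeeping, and the device is trusted to stay within what it asked for.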
Also, there are dual-use ports for both data and charging (Charging Downstream Port) defined in the USB battery charging standard.
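For what it's worth, a device figures out whether it is on such a charging-capable port (CDP) or on a dumb dedicated charger (DCP) by wiggling the data lines before enumeration. A rough sketch of the BC1.2 primary/secondary detection, with hypothetical phy_* helpers standing in for whatever the real hardware access looks like; thresholds are the nominal values from the spec:

```c
typedef enum { PORT_SDP, PORT_CDP, PORT_DCP } usb_port_type;

/* Hypothetical PHY helpers, assumed to exist in the target firmware. */
void  phy_drive_vdp_src(int on);   /* source ~0.6 V onto D+ */
void  phy_drive_vdm_src(int on);   /* source ~0.6 V onto D- */
float phy_read_dm_voltage(void);   /* sample D- */
float phy_read_dp_voltage(void);   /* sample D+ */

usb_port_type detect_port_type(void)
{
    /* Primary detection: drive D+ and watch D-. On a plain Standard
     * Downstream Port (SDP) nothing echoes back, so D- stays low. */
    phy_drive_vdp_src(1);
    float dm = phy_read_dm_voltage();
    phy_drive_vdp_src(0);
    if (dm < 0.4f)                 /* VDAT_REF is 0.25-0.4 V in BC1.2 */
        return PORT_SDP;           /* data-only port: stick to the enumerated current */

    /* Secondary detection: drive D- and watch D+ to tell a dedicated
     * charger (DCP, D+/D- shorted together) from a Charging Downstream
     * Port (CDP), which also does data. */
    phy_drive_vdm_src(1);
    float dp = phy_read_dp_voltage();
    phy_drive_vdm_src(0);
    return (dp > 0.4f) ? PORT_DCP : PORT_CDP;
}
```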
That is true, but so what? I am discussing the port on the RTM2004, not every type of USB port that exists in the universe.
The whole USB power delivery situation is a huge mess; better to design something with wide margins, as long as it doesn't make attached devices go up in smoke.
Well, find a time machine and tell early-90s Intel that USB would become the dominant DC charging port. Back when they designed it, nobody even distantly envisioned it as a de facto DC power supply standard. The little bit of power envisioned was just a courtesy so small gadgets wouldn't need a separate power supply. Remember, it was conceived as a replacement for the keyboard and mouse ports, the serial port, and the printer port. Every other application basically got tacked on later. And of those original applications, only the keyboard and mouse ports had power at all. So the original power architecture was designed for that kind of thing. The 500mA power level was itself generous overkill for all the original applications.
At the time, storage devices and the like were envisioned to use FireWire, which had up to 1.5A of power (at 9-30V), though actual available power varied wildly by system.
It's so easy to forget that USB only accidentally became the de facto DC power supply standard (and later still a de jure one in the EU!), since it's now everywhere. Had we known that back then, we doubtless would have designed it differently. But it's disingenuous to decry decisions made ages ago as poor, in a context where those decisions absolutely made sense.