For wires of a particular conductor composition and diameter, thicker electrical insulation means thicker thermal insulation too. That does not necessarily mean the wire's ampacity has to be derated. For most applications, ampacity is limited by the ability of the electrical insulation to withstand heat and by the ability of the surroundings in the intended application to withstand heat. For example, the ampacity rating of mineral-insulated wire (type MI) is higher than that of thermoplastic wire (type THW) of the same conductor diameter because mineral insulation can withstand higher temperatures than thermoplastic insulation, even though the minimum standard thickness of mineral insulation is slightly greater than the minimum standard thickness of thermoplastic insulation for wire of the same gauge and voltage rating.
I’ve seen considerable variation in the thickness of thermoplastic insulation (i.e. departure from the standard minimum), always in the upward direction. A common place to see this is zip cord versus hookup wire. Insulation on zip cord is often thicker than the minimum standard to provide toughness and ease of handling, and to serve as part of the “bridge” joining the wires in the cord. I wouldn't worry that the small increase in thermal insulation from the thicker jacket necessitates decreasing the ampacity rating. As long as currents are within the usual acceptable range for the wire diameter and insulation type, the insulation will not suffer thermal damage just because it's somewhat thicker than the minimum required.
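To get a feel for why the extra thickness is a non-issue, here is a rough back-of-the-envelope sketch using the standard radial-conduction formula for a cylindrical insulation layer. The specific numbers (18 AWG copper, PVC thermal conductivity of roughly 0.19 W/m·K, 10 A) are my own illustrative assumptions, not values from any standard:

```python
import math

def insulation_delta_t(current_a, r_cond_m, t_ins_m,
                       rho_ohm_m=1.72e-8, k_ins=0.19):
    """Approximate steady-state temperature drop (kelvin) across a
    cylindrical insulation layer on a round conductor.

    Assumed defaults (illustrative, not from a standard):
      rho_ohm_m -- copper resistivity, ~1.72e-8 ohm*m at 20 C
      k_ins     -- PVC thermal conductivity, ~0.19 W/(m*K)
    """
    area = math.pi * r_cond_m ** 2                      # conductor cross-section
    q_per_m = current_a ** 2 * rho_ohm_m / area         # I^2*R heating, W per meter
    # Radial conduction resistance of a cylindrical shell, per meter of wire:
    r_thermal = math.log((r_cond_m + t_ins_m) / r_cond_m) / (2 * math.pi * k_ins)
    return q_per_m * r_thermal

# 18 AWG (radius ~0.51 mm) at 10 A: minimum-ish vs doubled insulation thickness
dt_thin = insulation_delta_t(10, 0.51e-3, 0.4e-3)   # ~1.0 K
dt_thick = insulation_delta_t(10, 0.51e-3, 0.8e-3)  # ~1.7 K
```

Doubling the insulation thickness adds well under a kelvin to the temperature drop across the jacket in this sketch, which is negligible next to the 60–90 °C ratings of thermoplastic insulation. (The thicker jacket also has a larger outer surface area, which slightly improves heat dissipation to the air and offsets part of even that small increase.)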
It's not heating of the metal conductor itself that is usually the concern in assigning ampacity. It's heating of the insulation and of the surroundings, such as the insulation of adjacent wires, wood framing, etc., that matters.
Mike