It is always dangerous to go beyond the manufacturer's maximum values.
Typically, tubes were designed to a heater voltage spec with a stated maximum, although some series-string types were specified by heater current instead.
If you apply 6.3 VAC to a cold tube heater, the initial current (and therefore the power) will be much higher than the hot value after warm-up.
Therefore, it can be useful to include a current-limiting resistor or NTC thermistor in series to reduce that inrush.
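As a rough feel for the numbers, here is a minimal Python sketch; the 10:1 hot-to-cold resistance ratio and the 4.7 ohm limiter value are illustrative assumptions, not datasheet figures:

```python
# Rough inrush estimate for one 6.3 V / 0.3 A heater fed directly from 6.3 VAC.
v = 6.3                  # heater voltage, VAC RMS
r_hot = v / 0.3          # hot heater resistance: 21 ohms
r_cold = r_hot / 10      # assumed 10:1 hot/cold ratio: ~2.1 ohms

i_inrush = v / r_cold                # ~3 A, roughly 10x the rated 0.3 A
r_limit = 4.7                        # illustrative series resistor / cold NTC value
i_limited = v / (r_cold + r_limit)   # ~0.93 A with the limiter in place
print(f"unlimited inrush: {i_inrush:.1f} A")
print(f"with {r_limit} ohm limiter: {i_limited:.2f} A")
```

An NTC thermistor is often the nicer choice here, since its resistance falls as it self-heats, so it drops little voltage once the heater is warm.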
Simplifying your circuit to a single 6.3 V / 0.3 A tube and assuming 60 Hz, it is equivalent to applying
(230/110) x 6.3 V ≈ 13.17 V through a series capacitor to the heater, which when hot is 6.3 V / 0.3 A = 21 ohms, purely resistive.
Doing a "complex" calculation, that would require 69 uF, for a total circuit impedance of 43.9 ohms (absolute value).
When the tube is cold, we can assume its resistance is very low, so the total impedance becomes 38.5 ohms (capacitive).
Feel free to audit my calculations, remembering that the total impedance is complex, and to scale them to your actual heater circuit at 0.45 A for three tubes; sketches of both follow.
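Here is a short Python sketch of that audit, reproducing the figures above up to rounding (with the 13.17 V equivalent source and 60 Hz assumed earlier):

```python
import math

f = 60.0                    # line frequency, Hz
v_src = 230 / 110 * 6.3     # equivalent source voltage, ~13.17 V RMS
i_rated = 0.3               # rated heater current, A
r_hot = 6.3 / i_rated       # hot heater resistance, 21 ohms

z_total = v_src / i_rated               # impedance magnitude needed: ~43.9 ohms
x_c = math.sqrt(z_total**2 - r_hot**2)  # reactance in quadrature with R: ~38.5 ohms
c = 1 / (2 * math.pi * f * x_c)         # ~69 uF

i_cold = v_src / x_c        # cold heater ~0 ohms, so the capacitor alone limits current
print(f"|Z| = {z_total:.1f} ohm, Xc = {x_c:.1f} ohm, C = {c * 1e6:.0f} uF")
print(f"cold current = {i_cold:.2f} A ({(i_cold / i_rated - 1) * 100:.0f}% above rated)")
```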
With the tube cold, then, the voltage across the heater is very low, the current is only about 14% above rated (13.17 V / 38.5 ohms ≈ 0.34 A), and the initial heater power is actually low.
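For the scaling, the same formula can be wrapped in a helper; treating the three heaters as a single 6.3 V / 0.45 A load is an assumption (series versus parallel wiring changes the details), with the same 13.17 V equivalent source and 60 Hz:

```python
import math

def dropper_cap(v_src, v_heater, i_heater, f=60.0):
    """Series capacitor (farads) that delivers i_heater at v_heater from v_src (RMS)."""
    r_hot = v_heater / i_heater             # hot heater resistance
    z_total = v_src / i_heater              # required total impedance magnitude
    x_c = math.sqrt(z_total**2 - r_hot**2)  # capacitive reactance, in quadrature
    return 1 / (2 * math.pi * f * x_c)

v_src = 230 / 110 * 6.3
print(f"one tube, 0.30 A:    {dropper_cap(v_src, 6.3, 0.30) * 1e6:.0f} uF")  # ~69 uF
print(f"three tubes, 0.45 A: {dropper_cap(v_src, 6.3, 0.45) * 1e6:.0f} uF")  # ~103 uF
```

By this sketch, the three-tube string would need roughly 103 uF under those assumptions.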
With a "straight" connection to a filament transformer, rated at 6.3V/@>0.3A secondary, the initial power would be far higher with 6.3 V and a high current (>0.3 A).
The problem with your test circuit is that the heater voltage after warm-up is too low for proper tube operation: it should be above 5.7 V.