24 Hz flickers massively; it almost "blinks". No one can stand it for long.
Cinematic projection has never actually been shown at 24 Hz. Depending on the projector shutter type, each film frame is flashed two or three times, so the flicker rate is 48 Hz or 72 Hz. TV was 50 or 60 Hz depending on the country, but a CRT phosphor has some persistence (a gradual fade during the off-time) that an LED doesn't have.
So now we are looking at a complex problem, which we can simplify:
1) Above around 100 Hz (there is some room for argument), the eye integrates the light perfectly, and only the average matters,
2) Between roughly 20 and 100 Hz, it's a fuzzy "depends who you ask" type of problem: different people perceive rapidly flickering lights differently.
3) Below about 20 Hz, you have a clearly blinking light, and of course it appears brighter during the on-time, because your eye perceives the on-time as a separate "event".
For 1), the only thing that matters is the light-output vs. input-current curve of the particular LED. For modern LED types, this curve is often quite linear over a large part of its range. For a modern standard-power LED rated for a 40 mA absolute maximum, the linear range would likely be somewhere around 2-20 mA. If that is your case, then only the average current matters, and PWM makes no difference. OTOH, if your PWM "on" current is at the absolute-maximum 40 mA, the LED likely runs at lower efficiency, lowering the average light output for the same average input current. But all this speculation is meaningless: look at the curve of the actual LED you are using, and you have your answer!
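To make the ">100 Hz, only the average matters" case concrete, here is a minimal sketch in Python. The flux-vs-current points are made up purely for illustration (they are not from any real datasheet), and it simply compares a constant 10 mA drive against 25% PWM at a 40 mA peak, both of which have the same average current.

```python
# Minimal sketch: time-averaged light output of DC drive vs. PWM drive at the
# same average current, given a (hypothetical) relative-flux-vs-current curve.

def flux(current_ma, curve):
    """Linearly interpolate relative luminous flux at a given drive current."""
    for (i0, f0), (i1, f1) in zip(curve, curve[1:]):
        if i0 <= current_ma <= i1:
            return f0 + (f1 - f0) * (current_ma - i0) / (i1 - i0)
    raise ValueError("current outside the characterised range")

# Made-up curve (current in mA, relative flux): roughly linear up to 20 mA,
# then drooping in efficiency towards the 40 mA absolute maximum.
CURVE = [(0, 0.0), (5, 0.26), (10, 0.52), (20, 1.00), (30, 1.40), (40, 1.75)]

duty = 0.25                      # 25 % duty cycle
peak_ma = 40                     # PWM "on" current at the absolute maximum
avg_ma = duty * peak_ma          # same average current: 10 mA

dc_flux  = flux(avg_ma, CURVE)           # constant-current drive at 10 mA
pwm_flux = duty * flux(peak_ma, CURVE)   # eye averages the pulsed output (>100 Hz)

print(f"DC  at {avg_ma:.0f} mA      : relative flux {dc_flux:.3f}")
print(f"PWM at {peak_ma} mA, {duty:.0%}: relative flux {pwm_flux:.3f}")
```

With this made-up droop the PWM case comes out dimmer (about 0.44 vs. 0.52 relative flux); if the curve were perfectly linear through 40 mA, the two drives would give exactly the same average light. That is the whole point of checking the curve of your actual LED.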