By your own logic, if a wire has 1A flowing in it, but almost zero voltage drop across it, then integrating the power over time for that wire proves that there is minimal energy in that wire.
If you do the same calculation for the resistor you get reliable results, so why not for a wire? What is so special about it?
And where along the transition between "resistor" and "wire" does this specialness happen? At what resistance or current does your math become invalid?
A wire is a resistor with lower resistance, so there is no difference.
Not quite sure you know what you are asking. Can you be more exact, maybe give a proper example with values?
Yes, 1A through a 0.1Ohm resistor will result in a 0.1V drop across the resistor/wire and thus 0.1W of power lost in the wire as heat.
The same 1A through a 1kOhm wire/resistor will result in a 1000V drop and thus 1000W of power loss in that wire/resistor.
Where do you see any difference or problem between the two examples?
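If it helps, here is the same arithmetic as a quick Python sketch, nothing more. The values are just the ones from the two examples above (same 1A, only the resistance differs):

```python
# Same 1A through two different resistances: a wire-like 0.1Ohm and a 1kOhm resistor.
current_a = 1.0
for resistance_ohm in (0.1, 1000.0):
    voltage_v = current_a * resistance_ohm  # Ohm's law: V = I * R
    power_w = current_a * voltage_v         # P = I * V (equivalently I^2 * R)
    print(f"{resistance_ohm} Ohm: {voltage_v} V drop, {power_w} W dissipated as heat")
```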
None, as you say:
"electrical current multiplied with voltage is power"
Sure, a good wire has 1A flowing in it and 0V measured across it, so that's 0W.
"and power integrated over time is energy"
Sure, let's integrate 0W for as long as we want... it's 0 J.
"it means energy flows only in wire."
This doesn't follow - the result of your calculation is 0 J.
Well, following your own calculations, there was 0 J in that wire.
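Spelled out as a quick Python sketch of that same calculation, assuming constant power so the time integral is simply power × time (the 24-hour duration is an arbitrary illustration of "for as long as we want"):

```python
# Good wire: 1A flowing, ~0V measured across it.
current_a = 1.0
voltage_v = 0.0                      # idealised "almost zero" drop
power_w = current_a * voltage_v      # P = I * V = 0W
hours = 24.0
energy_j = power_w * hours * 3600.0  # constant power, so the integral is P * t
print(power_w, energy_j)             # 0.0 0.0 -> 0 J, however long we integrate
```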
If it was a 1 ohm resistor with 1A through it:
"electrical current multiplied with voltage is power" - 1A at 1V is 1W.
"and power integrated over time is energy" - let's integrate 1W for a second... it's 1 J.
Yep, 1 J as expected. That checks out.
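And as a quick numerical check of the "power integrated over time" step, here is a sketch that actually sums P·dt over one second; the 1024-step discretisation is arbitrary, chosen only so the floating-point sum comes out exact:

```python
# Numerically integrate a constant 1W over one second.
# P = I * V = 1A * 1V = 1W at every sample.
steps = 1024
dt = 1.0 / steps                                 # exactly representable in binary floating point
energy_j = sum(1.0 * 1.0 * dt for _ in range(steps))
print(energy_j)                                  # 1.0 -> 1 J, as expected
```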