I am assuming that, as the resistor I have is a lower value than what I require based on the previous calculation, it will not sink the voltage I require it to, and therefore the voltage drop over the LED will increase?
This is where you're going wrong and confusing yourself. For all intents and purposes you can consider the voltage drop across the diode to stay the same no matter what: it will ALWAYS (try to) drop the "same" voltage, somewhere within that roughly 0.7v-wide range of 3.0 to 3.7v [OK, I'm hand-waving away the messy details, but they're not important here].
If you change the resistance of your limiting resistor it will change the current, but the voltage drops across the diode and the resistor will remain (approximately) as they were.
Now work your calculations based on the voltage across the diode being fixed at, say, an average of 3.5v:
12v - 3.5v = 8.5v drop across your resistor
It doesn't matter what value the resistor is; by necessity it will drop 8.5v.
So, knowing that, simply apply Ohm's law to the resistor to find the current in the circuit (and, by extension, the power dissipation, especially in the resistor); it's this current which is the important factor for the LED.
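If it helps, here's that arithmetic as a minimal Python sketch, using the figures above (12v supply, 3.5v assumed nominal LED drop) and the 10R resistor value from the worked example below:

```python
# Series LED + resistor: the diode holds (roughly) its own forward drop,
# and the resistor is forced to drop whatever is left of the supply.
SUPPLY_V = 12.0     # supply voltage from the example
LED_DROP_V = 3.5    # assumed nominal forward drop of the LED
R_OHMS = 10.0       # example resistor value (the 10R used below)

resistor_drop = SUPPLY_V - LED_DROP_V      # 8.5 V across the resistor
current = resistor_drop / R_OHMS           # Ohm's law: I = V / R = 0.85 A
resistor_power = resistor_drop * current   # P = V * I, about 7.2 W in the resistor
led_power = LED_DROP_V * current           # about 3.0 W in the LED

print(f"current              = {current:.2f} A")
print(f"resistor dissipation = {resistor_power:.1f} W")
print(f"LED dissipation      = {led_power:.1f} W")
```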
So with a 10R resistor (convenient for the maths) we get 8.5v / 10R = 0.85A at a 3.5v drop on the diode.
But that diode could "want" to drop anywhere from 3.0v to 3.7v, so we'd better look at the extremes too.
At 3.0v the resistor drops 9.0v, giving 0.9A; at 3.7v it drops 8.3v, giving 0.83A.
Clearly, 10R is going to make your LED very bright, for a brief period of time, and then very dark.
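And if you want to see how little the forward-voltage spread actually matters to the current (and how much the 10R choice does), here's a quick sweep over that assumed 3.0 to 3.7v range with the same example values:

```python
# Sweep the assumed forward-voltage range for the same 12 V supply and 10R resistor.
SUPPLY_V = 12.0
R_OHMS = 10.0

for led_drop in (3.0, 3.5, 3.7):               # extremes plus the nominal value
    current = (SUPPLY_V - led_drop) / R_OHMS   # Ohm's law on the resistor
    print(f"Vf = {led_drop:.1f} V -> I = {current:.2f} A")

# Prints roughly 0.90 A, 0.85 A and 0.83 A: the current barely moves with Vf,
# but every one of those values is far more than the LED in the example can
# survive, which is why it goes bright and then dark.
```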