As the title suggests, I'm trying to learn how an inductor behaves. I read in my book that it follows the equation V = L*(di/dt). To confirm this, I set up a circuit in LTspice, which can be seen below. Basically, I'm driving the inductor with a linear current ramp from 0 A to 1 A over 5 ms; then the current ramps back down from 1 A to 0 A over 2 ms. If my calculation is correct, the inductor should have 0.2 mV across it during the rising ramp (until 5 ms), and then 0.5 mV during the falling ramp (5 ms to 7 ms). However, the simulation graph shows a different result. So what I want to ask is this: how does this happen? Did I make a mistake?
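For reference, here is a small sketch of the hand calculation I'm doing, written out in Python. The inductance value of 1 µH is an assumption on my part (it isn't stated above), chosen because it is consistent with the 0.2 mV figure for the rising ramp:

```python
L = 1e-6  # assumed inductance: 1 µH (not stated in the post; consistent with the 0.2 mV figure)

# Rising ramp: 0 A -> 1 A over 5 ms
di_dt_rise = (1.0 - 0.0) / 5e-3   # 200 A/s
v_rise = L * di_dt_rise           # V = L * (di/dt) during the rise

# Falling ramp: 1 A -> 0 A over 2 ms
di_dt_fall = (0.0 - 1.0) / 2e-3   # -500 A/s (note the sign of the slope)
v_fall = L * di_dt_fall           # V = L * (di/dt) during the fall

print(f"v_rise = {v_rise * 1e3:.2f} mV")  # 0.20 mV
print(f"v_fall = {v_fall * 1e3:.2f} mV")  # -0.50 mV
```

This is just the textbook formula applied to each ramp segment; the sign of di/dt follows from the direction of the current change.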