The driver you bought doesn't look very useful for this application.
The problem with dimming LEDs via voltage is that there is a very small voltage range where the current varies wildly, and the LED goes from not lit to burned out.
The correct way is usually to regulate current instead of voltage - that gives you more linear-ish control of the brightness.
There are two basic ways to do that:
One is regulating the current with some sort of analog circuitry: a negative feedback loop built around an op-amp, with a sense resistor to measure and control the current.
The advantage of analog current control is that the LED is not blinking, so it's suitable as lighting for photography / video recording.
The disadvantage is higher power losses (the circuit gets hot), and some LEDs tend to shift color at different brightness levels.
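To give a feel for sizing the sense resistor in the analog approach, here is a rough calculation. The reference voltage (0.5 V) and target current (1.5 A) are assumptions for illustration, not values from your setup; the op-amp adjusts its output until the voltage across the sense resistor equals the reference:

```python
# Sketch of sense-resistor sizing for an analog constant-current loop.
# Assumed values: the op-amp servos the voltage across R_sense to a
# reference V_ref, so I_led = V_ref / R_sense.
V_ref = 0.5   # V, assumed op-amp reference voltage
I_led = 1.5   # A, target LED current

R_sense = V_ref / I_led          # resistor that gives V_sense == V_ref at I_led
P_sense = I_led ** 2 * R_sense   # heat dissipated in the sense resistor

print(f"R_sense = {R_sense:.3f} ohm")   # 0.333 ohm
print(f"P_sense = {P_sense:.2f} W")     # 0.75 W, so pick e.g. a 2 W part
```

A lower reference voltage reduces the loss in the sense resistor, at the cost of being more sensitive to op-amp offset.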
The other is PWM control (pulse width modulation): you fix a maximum allowable current (perhaps set with a small resistor and a constant-voltage supply) and switch the LED on and off quickly, varying the ratio of on time to off time.
The advantages are lower losses (and thus longer battery life) and constant color.
The disadvantage is flickering, which can cause weird interference patterns when two light sources are PWM-switched at similar but not identical frequencies.
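The relationship between duty cycle and brightness in the PWM approach is simple: the average current (and thus perceived brightness, roughly) scales linearly with the on-time fraction. A minimal sketch, assuming the peak current is pinned at 1.5 A by the supply and series resistor:

```python
# Average LED current as a function of PWM duty cycle.
# Assumes the peak current while the switch is on is fixed at 1.5 A.
I_peak = 1.5   # A, current during the on phase

def average_current(duty):
    """duty is the on-time fraction, 0.0 .. 1.0."""
    return I_peak * duty

for duty in (0.1, 0.5, 0.9):
    print(f"{duty:.0%} duty -> {average_current(duty):.3f} A average")
```

Keeping the switching frequency well above a few hundred hertz avoids visible flicker, though cameras may still pick it up.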
For your setup you probably need a 38 V / 1.5 A power supply, as the forward voltage required by the LED is fairly high.
You could use a beefy 5 W, 1.5 Ω resistor to limit the current, and some kind of MOSFET to switch the LED on/off, controlled either by a 555-timer-based PWM circuit or by a small, cheap microcontroller.
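As a sanity check on those part values, here is the Ohm's-law arithmetic. The LED forward voltage of roughly 36 V at the operating point is an assumption (check your LED's datasheet); everything else follows from it:

```python
# Sanity check of the suggested series resistor.
# V_led is an assumed forward voltage, not a value from the datasheet.
V_supply = 38.0   # V, supply voltage
V_led = 36.0      # V, assumed LED forward voltage at operating current
R = 1.5           # ohm, series current-limiting resistor

I = (V_supply - V_led) / R   # current through the LED
P_r = I ** 2 * R             # heat dissipated in the resistor

print(f"I = {I:.2f} A, P_r = {P_r:.2f} W")  # ~1.33 A, ~2.7 W -> 5 W part has margin
```

Note how steep the curve is: a 1 V shift in forward voltage moves the current by about 0.67 A, which is exactly why current regulation beats voltage regulation for LEDs.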