If you really want to go down the potentiometer route, it's going to be tricky with such a high power LED. Here's what you need anyway. First, double check the "forward" voltage drop of the LEDs. 12V sounds very high, but then again, 90W is huge for an LED; most likely there are quite a few LEDs bundled together. Once you're very sure of the voltage, V, and power, P, of the LED, calculate the current at full brightness as I = P/V. Using your numbers that gives I = 90/12 = 7.5A.
Now take your power supply voltage, say 18V, subtract the LED voltage and divide by your max current to find your resistance at maximum brightness, e.g. R = V/I = (18-12)/7.5 = 0.8 ohms. To figure out the power rating of your potentiometer, take the square of the max current and multiply it by the resistance: P = I*I*R = 7.5*7.5*0.8 = 45W. Adjusting the potentiometer to a higher resistance will dim the LED (though it won't necessarily be particularly linear!).
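If it helps to sanity-check the arithmetic, here's a quick Python sketch of those calculations. The 18V supply is just the assumed example figure used above; swap in your own measured values.

```python
# Rough sizing for a series potentiometer dimming a high-power LED.
# The numbers below are the example figures from the text, not measured values.
led_power = 90.0       # W, LED rating
led_voltage = 12.0     # V, LED forward voltage (double-check this!)
supply_voltage = 18.0  # V, assumed supply voltage

led_current = led_power / led_voltage                      # I = P/V = 7.5 A
resistance = (supply_voltage - led_voltage) / led_current  # R = (Vs - Vled)/I = 0.8 ohm
pot_power = led_current**2 * resistance                    # P = I*I*R = 45 W

print(f"Full-brightness current: {led_current:.1f} A")
print(f"Series resistance at full brightness: {resistance:.2f} ohm")
print(f"Potentiometer power rating needed: {pot_power:.0f} W")
```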
If you're still keen to proceed, consider a few practical matters:
1) a 45W resistor will be very large (at least as big as your hand) and get very, very hot
2) trying to make a variable resistor that is accurate down to 0.8 ohms is tricky - this is a pretty specialist part that might be hard to find off the shelf
3) as resistors get hot, their resistance (usually) goes up. While this is good from a stability point of view, it does mean your brightness will tend to wander.
4) at such low series resistances, the internal impedance of your power supply will likely be significant. It might just mean your potentiometer needs to go a bit lower than the simple calculation above gives (see the sketch after this list).
5) Using the numbers above, you'll need at least a 135W power supply (18V x 7.5A). That's a very big, low voltage DC power supply.
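To put rough numbers on points 4 and 5, here's a short sketch. The 0.1 ohm supply internal resistance is purely a made-up illustrative figure, not a measurement of your supply.

```python
# Rough illustration of points 4 and 5 above.
supply_voltage = 18.0    # V, assumed supply
led_voltage = 12.0       # V, LED forward voltage
led_current = 7.5        # A, from the earlier calculation
supply_internal_r = 0.1  # ohm, hypothetical internal resistance of the supply

# Point 4: the supply's internal resistance eats part of the series resistance
# budget, so the potentiometer has to sit a bit lower than the simple figure.
total_series_r = (supply_voltage - led_voltage) / led_current  # 0.8 ohm
pot_resistance = total_series_r - supply_internal_r            # 0.7 ohm
print(f"Potentiometer setting at full brightness: {pot_resistance:.2f} ohm")

# Point 5: minimum supply power = supply voltage * full-brightness current.
supply_power = supply_voltage * led_current  # 18 V * 7.5 A = 135 W
print(f"Minimum supply power: {supply_power:.0f} W")
```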
I appreciate you may have your own motivations and requirements, and it's certainly possible to do it, but don't neglect Skogsgurra's suggestion - it has some fairly serious practical advantages!