If you are trying to size the dropping resistors you first have to know what voltage the power supply puts out when it is under load. For many other applications you would only need a ballpark figure, but for powering LEDs, especially IR LEDs, you need a more exact figure. With LEDs, an extra half volt over the rated forward voltage will shorten the LED's life, and an extra volt can burn it out right away. Too low a voltage, on the other hand, greatly reduces the LED's output. If the power supply actually puts out 18.3 volts under load, the LEDs drop 1.2 volts at 100 mA, and you wire the LEDs in parallel with a separate resistor for each one, then you would use 180 ohm resistors. If you build the circuit, measure the voltage at the power supply, and find it is actually less than 18.3 volts, you could go to lower-value resistors to make sure the LEDs reach their full output. Here is the calculator that I used:
http://ledcalc.com/. Since I have a good background in electronics I don't really need a calculator of this sort, but using it gave me the nearest standard resistor to the calculated value of 171 ohms. You only need to do the calculation for one LED/resistor combination when they are all wired in parallel.
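If you would rather do the same arithmetic yourself, here is a minimal sketch of it, assuming the example numbers above (18.3 V supply under load, 1.2 V forward drop, 100 mA per LED, one resistor per parallel branch). The function names and the E12 standard-value lookup are just illustrative; the online calculator may round differently.

def dropping_resistor(v_supply, v_forward, i_led):
    """Ohm's law across the resistor: R = (Vsupply - Vf) / I."""
    return (v_supply - v_forward) / i_led

def nearest_e12(r):
    """Find the closest E12 standard resistor value to r (decade-scaled)."""
    e12 = [10, 12, 15, 18, 22, 27, 33, 39, 47, 56, 68, 82]
    candidates = [v * 10 ** exp for exp in range(-1, 7) for v in e12]
    return min(candidates, key=lambda v: abs(v - r))

r_exact = dropping_resistor(18.3, 1.2, 0.100)
print(f"calculated: {r_exact:.0f} ohms, nearest standard: {nearest_e12(r_exact):.0f} ohms")
# prints: calculated: 171 ohms, nearest standard: 180 ohms

Note that picking the next value up (180 ohms) errs on the side of slightly less current, which protects the LEDs; going down a value instead (150 ohms) pushes them harder, which is why you would only do that after confirming the supply voltage is really lower than assumed.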
You should be aware that the sensors you use should be matched in wavelength to the emitters. Even a small mismatch would result in the sensor seeing only a fraction of the emitter's output. If you had used the emitters and detectors that Radio Shack sells in the same package, those do not match: the detectors are rated at 850 nm and the emitters are rated at 950 nm. With that combination you might lose three-quarters of the emitter's output.