Omega.com on Infrared Temperature Measurement Theory:
- An infrared thermometer measures temperature by detecting the infrared energy emitted by all materials at temperatures above absolute zero (0 K). The most basic design consists of a lens that focuses the infrared (IR) energy onto a detector, which converts the energy to an electrical signal that can be displayed in units of temperature after being compensated for ambient temperature variation.
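The basic design above can be sketched as an idealized radiometric model (this is a hedged illustration, not Omega's actual signal chain): using the Stefan-Boltzmann law, the detector responds to the net exchange between the object and the instrument's own ambient temperature, so recovering the object temperature requires compensating for the ambient term. The emissivity value and function names here are assumptions for illustration.

```python
# Idealized ambient-compensated IR thermometer model (sketch, not a real
# instrument's firmware). Assumes a graybody object and an ideal detector.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def detected_flux(t_object_k: float, t_ambient_k: float,
                  emissivity: float = 0.95) -> float:
    """Net radiant flux (W/m^2) seen by the detector: object minus ambient."""
    return emissivity * SIGMA * (t_object_k**4 - t_ambient_k**4)

def object_temperature(flux: float, t_ambient_k: float,
                       emissivity: float = 0.95) -> float:
    """Invert the model: compensate for ambient, then solve for T_object."""
    return (flux / (emissivity * SIGMA) + t_ambient_k**4) ** 0.25
```

For example, a 373.15 K object viewed from a 293.15 K instrument produces a positive net flux, and inverting that flux with the same ambient temperature recovers 373.15 K.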
- The infrared detector or detector system acts as a transducer, converting radiation into electrical signals. It forms the core of an IR imaging system, and the quality of this transduction largely determines the performance of the imaging system.
- Infrared detectors can be separated into two groups: photon detectors and thermal detectors.
- In photon (or quantum) detectors, a single-step transduction leads to changes in the concentration or mobility of the free charge carriers in the detector element upon absorption of photons from the infrared radiation.
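The single-step character of photon detection can be sketched numerically (a hedged illustration, not a specific device): each absorbed photon of energy hc/λ frees, on average, η charge carriers, so the photocurrent is a direct function of incident optical power. The quantum efficiency and power values below are assumed for illustration.

```python
# Sketch of single-step photon-detector transduction: optical power -> photon
# arrival rate -> photocurrent. Constants are exact SI values.
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
Q = 1.602176634e-19  # elementary charge, C

def photocurrent(power_w: float, wavelength_m: float,
                 quantum_efficiency: float) -> float:
    """Photocurrent (A) = eta * q * (photon arrival rate)."""
    photon_rate = power_w * wavelength_m / (H * C)  # photons per second
    return quantum_efficiency * Q * photon_rate
```

A design consequence visible in the model: for fixed power, longer wavelengths deliver more (lower-energy) photons, so the responsivity of an ideal photon detector rises with wavelength up to its cutoff.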
- Thermal detectors can be treated as two-step transducers. First, the incident radiation is absorbed to change the temperature of a material. Second, the electrical output of the thermal sensor is produced by a respective change in some physical property of a material (e.g., temperature-dependent electrical resistance in a bolometer).
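The two-step transduction above can be sketched for the bolometer example (a minimal model, not a real device): step one converts absorbed power into a temperature rise via the element's thermal conductance, and step two converts that rise into a resistance change read out under a bias current. All parameter values are illustrative assumptions.

```python
# Two-step bolometer transduction sketch.
# Step 1: radiation -> temperature rise, dT = P / G (steady state).
# Step 2: temperature -> resistance change, dR = TCR * R0 * dT, read as a
# voltage under constant bias. Parameter values are illustrative only.

def bolometer_signal(power_w: float,
                     g_thermal: float = 1e-7,   # thermal conductance, W/K
                     r0_ohm: float = 100e3,     # nominal resistance, ohm
                     tcr_per_k: float = -0.02,  # temp. coeff. of resistance, 1/K
                     bias_a: float = 1e-6) -> float:
    """Output voltage change (V) for a given absorbed radiant power (W)."""
    d_temp = power_w / g_thermal           # step 1: absorbed power heats element
    d_res = tcr_per_k * r0_ohm * d_temp    # step 2: resistance tracks temperature
    return bias_a * d_res                  # Ohm's law readout
```

The negative TCR used here reflects a common choice of semiconductor bolometer material, where resistance falls as temperature rises; a metal element would have a positive TCR.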
- In other words, the detector (starting at a lower temperature than the object) first absorbs incident radiation from the object until it reaches radiative equilibrium, registering a temperature difference relative to the instrument's background temperature,
- then converts that temperature difference into an electrical signal calibrated to report the detected temperature of the object.
This may be a common misconception.