Thermography

from Wikipedia
Thermogram of a traditional building in the background and a "passive house" in the foreground

Infrared thermography (IRT), also known as thermal imaging, is a measurement and imaging technique in which a thermal camera detects infrared radiation originating from the surface of objects. This radiation has two main components: thermal emission from the object's surface, which depends on its temperature and emissivity, and reflected radiation from surrounding sources. The result is a visible image called a thermogram. Thermal cameras most commonly operate in the long-wave infrared (LWIR) range (7–14 μm); less frequently, systems designed for the mid-wave infrared (MWIR) range (3–5 μm) are used.

Since infrared radiation is emitted by all objects with a temperature above absolute zero according to the black body radiation law, thermography makes it possible to see one's environment with or without visible illumination. The amount of radiation emitted by an object increases with temperature, and thermography allows one to see variations in temperature. When viewed through a thermal imaging camera, warm objects stand out well against cooler backgrounds. For example, humans and other warm-blooded animals become easily visible against their environment in day or night. As a result, thermography is particularly useful to the military and other users of surveillance cameras.

Thermogram of a cat

Some physiological changes in human beings and other warm-blooded animals can also be monitored with thermal imaging during clinical diagnostics. Thermography is used in allergy detection and veterinary medicine. Some alternative medicine practitioners promote its use for breast screening, despite the FDA warning that "those who opt for this method instead of mammography may miss the chance to detect cancer at its earliest stage".[1] Notably, government and airport personnel used thermography to detect suspected swine flu cases during the 2009 pandemic.[2]

Thermography has a long history, although its use has increased dramatically with the commercial and industrial applications of the past 50 years. Firefighters use thermography to see through smoke, to find persons, and to locate the base of a fire. Maintenance technicians use thermography to locate overheating joints and sections of power lines, which are a sign of impending failure. Building construction technicians can see thermal signatures that indicate heat leaks in faulty thermal insulation, improving the efficiency of heating and air-conditioning units.

The appearance and operation of a modern thermographic camera are often similar to those of a camcorder. Often the live thermogram reveals temperature variations so clearly that a photograph is not necessary for analysis, and a recording module is therefore not always built in.

Specialized thermal imaging cameras use focal plane arrays (FPAs) that respond to longer wavelengths (mid- and long-wavelength infrared). The most common types are InSb, InGaAs, HgCdTe and QWIP FPA. The newest technologies use low-cost, uncooled microbolometers as FPA sensors. Their resolution is considerably lower than that of optical cameras, mostly 160×120 or 320×240 pixels, and up to 1280 × 1024[3] for the most expensive models. Thermal imaging cameras are much more expensive than their visible-spectrum counterparts, and higher-end models are often export-restricted due to potential military uses. Older bolometers or more sensitive models such as InSb require cryogenic cooling, usually by a miniature Stirling cycle refrigerator or with liquid nitrogen.

Thermal energy

A comparison of a thermal image (top) and an ordinary photograph (bottom). The plastic bag is mostly transparent to long-wavelength infrared, but the man's glasses are opaque.
This thermogram shows excessive heating on a terminal in an industrial electrical fuse block.
A thermal image showing temperature variation in a hot air balloon

Thermal images, or thermograms, are visual displays of the total infrared energy emitted, transmitted, and reflected by an object. Because there are multiple sources of the infrared energy, it is sometimes difficult to get an accurate temperature of an object using this method. A thermal imaging camera uses processing algorithms to reconstruct a temperature image. Note that the image shows an approximation of the temperature of an object, as the camera integrates multiple sources of data in the areas surrounding the object to estimate its temperature.[4]

This phenomenon may become clearer upon consideration of the formula:

Incident Radiant Power = Emitted Radiant Power + Transmitted Radiant Power + Reflected Radiant Power;

where incident radiant power is the radiant power profile when viewed through a thermal imaging camera. Emitted radiant power is generally what is intended to be measured; transmitted radiant power is the radiant power that passes through the subject from a remote thermal source; and reflected radiant power is the amount of radiant power that reflects off the surface of the object from a remote thermal source.
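
As a rough numerical illustration of this balance, the sketch below assumes an opaque grey-body surface (so the transmitted component is zero) and combines the emitted and reflected contributions using the Stefan–Boltzmann law; all values are made up for the example and do not come from any particular measurement.

```python
# Minimal sketch of the radiant power balance for an opaque object
# (transmitted component assumed to be zero). Values are illustrative.
SIGMA = 5.67e-8          # Stefan-Boltzmann constant, W/(m^2 K^4)

emissivity = 0.95        # assumed surface emissivity
t_object = 320.0         # object surface temperature, K
t_surroundings = 295.0   # temperature of the surroundings reflected by the surface, K

emitted = emissivity * SIGMA * t_object ** 4
reflected = (1.0 - emissivity) * SIGMA * t_surroundings ** 4
transmitted = 0.0        # opaque object

incident_at_camera = emitted + transmitted + reflected
print(f"emitted={emitted:.1f} W/m^2, reflected={reflected:.1f} W/m^2, "
      f"total seen by camera={incident_at_camera:.1f} W/m^2")
```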

This phenomenon occurs everywhere, all the time. It is a process known as radiant heat exchange, since radiant power × time equals radiant energy. However, in the case of infrared thermography, the above equation is used to describe the radiant power within the spectral wavelength passband of the thermal imaging camera in use. The radiant heat exchange requirements described in the equation apply equally at every wavelength in the electromagnetic spectrum.

If the object is radiating at a higher temperature than its surroundings, then net power transfer takes place from the warm object to its cooler surroundings, following the principle stated in the second law of thermodynamics. So if there is a cool area in the thermogram, that object is absorbing radiation emitted by the surrounding warm objects.

The ability of an object to emit radiation is called emissivity; its ability to absorb radiation is called absorptivity. In outdoor environments, convective cooling from wind may also need to be considered when trying to obtain an accurate temperature reading.

Emissivity


Emissivity (or emissivity coefficient) represents a material's ability to emit thermal radiation and is an optical property of matter. A material's emissivity can theoretically range from 0 (completely non-emitting) to 1 (perfectly emitting). An example of a substance with low emissivity is silver, with an emissivity coefficient of 0.02; an example of a substance with high emissivity is asphalt, with an emissivity coefficient of 0.98.

A black body is a theoretical object with an emissivity of 1 that radiates thermal radiation characteristic of its contact temperature. That is, if the contact temperature of a thermally uniform black body radiator were 50 °C (122 °F), it would emit the characteristic black-body radiation of 50 °C (122 °F). An ordinary object emits less infrared radiation than a theoretical black body. In other words, the ratio of the actual emission to the maximum theoretical emission is an object's emissivity.

Each material has a different emissivity, which may vary with temperature and infrared wavelength.[5] For example, clean metal surfaces have emissivity that decreases at longer wavelengths; many dielectric materials, such as quartz (SiO2), sapphire (Al2O3), and calcium fluoride (CaF2), have emissivity that increases at longer wavelengths; and simple oxides, such as iron oxide (Fe2O3), display relatively flat emissivity across the infrared spectrum.
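
To make these coefficients concrete, the following sketch compares the power radiated per unit area by the materials mentioned above at 50 °C, using the Stefan–Boltzmann relation; the temperature is an arbitrary example value.

```python
# Sketch comparing radiated power for materials of different emissivity,
# using the emissivity values quoted above (silver ~0.02, asphalt ~0.98).
SIGMA = 5.67e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
T = 323.15        # 50 degrees C expressed in kelvin

materials = {"black body": 1.0, "asphalt": 0.98, "silver": 0.02}
for name, eps in materials.items():
    power = eps * SIGMA * T ** 4   # radiated power per unit area, W/m^2
    print(f"{name:10s} emissivity={eps:4.2f}  radiated power={power:6.1f} W/m^2")
```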

Measurement

Thermogram of a snake held by a human

A thermal imaging camera performs radiometric processing to convert the detected infrared radiation into estimates of the object's surface temperature. This is achieved through the application of the thermography equation, which accounts for emitted and reflected components of radiation, as well as the influence of the atmosphere, which emits its own thermal radiation and attenuates the radiation originating from the measured surface. In the case of a radiometric thermal camera (the current standard), the output images contain not only visual information but also radiometric data that represent the detected radiation and allow accurate temperature evaluation based on the computational model provided by the thermography equation.

The spectrum and amount of thermal radiation depend strongly on an object's surface temperature. This enables thermal imaging of an object's temperature. However, other factors also influence the received radiation, which limits the accuracy of this technique: for example, the emissivity of the object.

For a non-contact temperature measurement, the emissivity setting needs to be set properly. An object of low emissivity could have its temperature underestimated by the detector, since it only detects emitted infrared rays. For a quick estimate, a thermographer may refer to an emissivity table for a given type of object, and enter that value into the imager. It would then calculate the object's contact temperature based on the entered emissivity and the infrared radiation as detected by the imager.

For a more accurate measurement, a thermographer may apply a standard material of known, high emissivity to the surface of the object. The standard material might be an industrial emissivity spray produced specifically for the purpose, or as simple as standard black insulation tape, with an emissivity of about 0.97. The object's known temperature can then be measured using the standard emissivity. If desired, the object's actual emissivity (on a part of the object not covered by the standard material) can be determined by adjusting the imager's setting to the known temperature. There are situations, however, when such an emissivity test is not possible due to dangerous or inaccessible conditions; in those cases the thermographer must rely on tables.
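
A minimal sketch of this reference-tape procedure is shown below, assuming a simple opaque grey-body model in which the reflected background behaves as a blackbody at a known ambient temperature; the readings are simulated rather than taken from a real camera.

```python
# Sketch of estimating an unknown surface emissivity from a reference patch
# of known emissivity (e.g., tape with eps ~0.97), assuming an opaque surface
# and a blackbody-like reflected background at t_background.
SIGMA = 5.67e-8  # W/(m^2 K^4)

def radiance(eps, t_surface, t_background):
    """Total radiant exitance the camera attributes to the surface."""
    return eps * SIGMA * t_surface**4 + (1.0 - eps) * SIGMA * t_background**4

eps_tape = 0.97          # known emissivity of the reference tape
t_background = 293.0     # reflected ambient temperature, K (assumed)

# Step 1: the tape-covered spot yields the true surface temperature.
measured_tape_radiance = radiance(eps_tape, 330.0, t_background)  # simulated reading
t_true = ((measured_tape_radiance - (1 - eps_tape) * SIGMA * t_background**4)
          / (eps_tape * SIGMA)) ** 0.25

# Step 2: solve for the bare surface's emissivity from its own reading.
measured_bare_radiance = radiance(0.80, 330.0, t_background)      # simulated reading
eps_bare = ((measured_bare_radiance - SIGMA * t_background**4)
            / (SIGMA * (t_true**4 - t_background**4)))
print(f"recovered surface temperature ~{t_true:.1f} K, emissivity ~{eps_bare:.2f}")
```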

Other variables can affect the measurement, including the absorption and ambient temperature of the transmitting medium (usually air). Surrounding infrared radiation can also be reflected off the object. All of these factors affect the calculated temperature of the object being viewed.

Color scale


Images from infrared cameras tend to be monochrome because the cameras generally use an image sensor that does not distinguish different wavelengths of infrared radiation. Color image sensors require a complex construction to differentiate wavelengths, and color has less meaning outside of the normal visible spectrum because the differing wavelengths do not map uniformly into the color vision system used by humans.

Sometimes these monochromatic images are displayed in pseudo-color, where changes in color are used rather than changes in intensity to display changes in the signal. This technique, called density slicing, is useful because although humans have much greater dynamic range in intensity detection than in color overall, the ability to see fine intensity differences in bright areas is fairly limited.

In temperature measurement the brightest (warmest) parts of the image are customarily colored white, intermediate temperatures reds and yellows, and the dimmest (coolest) parts black. A scale should be shown next to a false color image to relate colors to temperatures.
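
The sketch below illustrates this kind of false-color display with a synthetic temperature array and a standard matplotlib colormap standing in for a thermal palette, plus the scale bar mentioned above; the data and palette choice are illustrative only.

```python
# Sketch: render a synthetic "thermogram" in pseudo-color with a temperature scale.
import numpy as np
import matplotlib.pyplot as plt

# Synthetic temperature field (degrees C): a warm blob on a cooler background.
y, x = np.mgrid[0:240, 0:320]
temperature = 20.0 + 15.0 * np.exp(-((x - 160) ** 2 + (y - 120) ** 2) / (2 * 40.0 ** 2))

fig, ax = plt.subplots()
im = ax.imshow(temperature, cmap="inferno")        # white/yellow = warm, dark = cool
fig.colorbar(im, ax=ax, label="Temperature (°C)")  # the scale shown next to the image
ax.set_title("Synthetic thermogram (pseudo-color)")
plt.show()
```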

Cameras

Image of a Pomeranian taken in mid-infrared ("thermal") light (false-color)

A thermographic camera (also called an infrared camera or thermal imaging camera, thermal camera or thermal imager) is a device that creates an image using infrared (IR) radiation, similar to a normal camera that forms an image using visible light. Instead of the 400–700 nanometre (nm) range of the visible light camera, infrared cameras are sensitive to wavelengths from about 1,000 nm (1 micrometre or μm) to about 14,000 nm (14 μm). The practice of capturing and analyzing the data they provide is called thermography.

Thermal cameras convert the energy in the far infrared wavelength into a visible light display. All objects above absolute zero emit thermal infrared energy, so thermal cameras can passively see all objects, regardless of ambient light. However, most thermal cameras are sensitive to objects warmer than −50 °C (−58 °F).

Some specification parameters of an infrared camera system are number of pixels, frame rate, responsivity, noise-equivalent power, noise-equivalent temperature difference (NETD), spectral band, distance-to-spot ratio (D:S), minimum focus distance, sensor lifetime, minimum resolvable temperature difference (MRTD), field of view, dynamic range, input power, and mass and volume.

Their resolution is considerably lower than that of optical cameras, often around 160×120 or 320×240 pixels, although more expensive ones can achieve a resolution of 1280×1024 pixels. Thermographic cameras are much more expensive than their visible-spectrum counterparts, though low-performance add-on thermal cameras for smartphones became available for hundreds of US dollars in 2014.[6]

Types


Thermographic cameras can be broadly divided into two types: those with cooled infrared image detectors and those with uncooled detectors.

Cooled infrared detectors

A thermographic image of several lizards
Thermal imaging camera & screen, in an airport terminal in Greece. Thermal imaging can detect fever, one of the signs of infection.

Cooled detectors are typically contained in a vacuum-sealed case or Dewar and cryogenically cooled. Cooling is necessary for the operation of the semiconductor materials used. Typical operating temperatures range from 4 K (−269 °C) to just below room temperature, depending on the detector technology. Most modern cooled detectors operate in the 60 to 100 kelvin (K) range (−213 to −173 °C), depending on type and performance level.[7]

Without cooling, these sensors (which detect and convert light in much the same way as common digital cameras, but are made of different materials) would be 'blinded' or flooded by their own radiation. The drawbacks of cooled infrared cameras are that they are expensive both to produce and to run. Cooling is both energy-intensive and time-consuming.

The camera may need several minutes to cool down before it can begin working. The most commonly used cooling systems are Peltier coolers, which, although inefficient and limited in cooling capacity, are relatively simple and compact. To obtain better image quality or to image low-temperature objects, Stirling cryocoolers are needed. Although the cooling apparatus may be comparatively bulky and expensive, cooled infrared cameras provide greatly superior image quality compared to uncooled ones, particularly of objects near or below room temperature. Additionally, the greater sensitivity of cooled cameras allows the use of higher F-number lenses, making high-performance long-focal-length lenses both smaller and cheaper for cooled detectors.

An alternative to Stirling coolers is to use gases bottled at high pressure, nitrogen being a common choice. The pressurised gas is expanded via a micro-sized orifice and passed over a miniature heat exchanger resulting in regenerative cooling via the Joule–Thomson effect. For such systems the supply of pressurized gas is a logistical concern for field use.

Materials used for cooled infrared detection include photodetectors based on a wide range of narrow gap semiconductors including indium antimonide (3-5 μm), indium arsenide, mercury cadmium telluride (MCT) (1-2 μm, 3-5 μm, 8-12 μm), lead sulfide, and lead selenide. Infrared photodetectors can also be created with structures of high bandgap semiconductors such as in quantum well infrared photodetectors.

Cooled bolometer technologies can be superconducting or non-superconducting. Superconducting detectors offer extreme sensitivity, with some able to register individual photons; an example is ESA's Superconducting Camera (SCAM). However, they are not in regular use outside of scientific research. In principle, superconducting tunneling junction devices could be used as infrared sensors because of their very narrow gap. Small arrays have been demonstrated but have not been broadly adopted, because their high sensitivity requires careful shielding from background radiation.

Uncooled infrared detectors


Uncooled thermal cameras use a sensor operating at ambient temperature, or a sensor stabilized at a temperature close to ambient using small temperature control elements. Modern uncooled detectors all use sensors that work by the change of resistance, voltage or current when heated by infrared radiation. These changes are then measured and compared to the values at the operating temperature of the sensor.

In uncooled detectors the temperature differences at the sensor pixels are minute; a 1 °C difference at the scene induces just a 0.03 °C difference at the sensor. The pixel response time is also fairly slow, in the range of tens of milliseconds.

Uncooled infrared sensors can be stabilized to an operating temperature to reduce image noise, but they are not cooled to low temperatures and do not require bulky, expensive, energy-consuming cryogenic coolers. This makes the cameras smaller and less costly. However, their resolution and image quality tend to be lower than those of cooled detectors, due to differences in their fabrication processes, limited by currently available technology. An uncooled thermal camera also needs to deal with its own heat signature.

Uncooled detectors are mostly based on pyroelectric and ferroelectric materials or microbolometer technology.[8] These materials are used to form pixels with highly temperature-dependent properties, which are thermally insulated from the environment and read electronically.

Thermal image of steam locomotive

Ferroelectric detectors operate close to the phase transition temperature of the sensor material; the pixel temperature is read as the highly temperature-dependent polarization charge. The achieved NETD of ferroelectric detectors with f/1 optics and 320×240 sensors is 70–80 mK. A possible sensor assembly consists of barium strontium titanate bump-bonded via thermally insulating polyimide connections.

Silicon microbolometers can reach NETD down to 20 mK. They consist of a layer of amorphous silicon, or a thin-film vanadium(V) oxide sensing element, suspended on a silicon nitride bridge above the silicon-based scanning electronics. The electrical resistance of the sensing element is measured once per frame.

Current improvements of uncooled focal plane arrays (UFPA) are focused primarily on higher sensitivity and pixel density. In 2013 DARPA announced a five-micron LWIR camera that uses a 1280 × 720 focal plane array (FPA).[9] Some of the materials used for the sensor arrays are amorphous silicon (a-Si), vanadium(V) oxide (VOx),[10] lanthanum barium manganite (LBMO), lead zirconate titanate (PZT), lanthanum doped lead zirconate titanate (PLZT), lead scandium tantalate (PST), lead lanthanum titanate (PLT), lead titanate (PT), lead zinc niobate (PZN), lead strontium titanate (PSrT), barium strontium titanate (BST), barium titanate (BT), antimony sulfoiodide (SbSI), and polyvinylidene difluoride (PVDF).

CCD and CMOS thermography

Color contours of temperature for a smoldering ember measured with a CMOS camera.

Non-specialized charge-coupled device (CCD) and CMOS sensors have most of their spectral sensitivity in the visible light wavelength range. However, by utilizing the "trailing" area of their spectral sensitivity, namely the part of the infrared spectrum called near-infrared (NIR), and by using an off-the-shelf CCTV camera, it is possible under certain circumstances to obtain true thermal images of objects with temperatures at about 280 °C (536 °F) and higher.[11]

At temperatures of 600 °C and above, inexpensive cameras with CCD and CMOS sensors have also been used for pyrometry in the visible spectrum. They have been used for soot in flames, burning coal particles, heated materials, SiC filaments, and smoldering embers.[12] This pyrometry has been performed using external filters or only the sensor's Bayer filters. It has been performed using color ratios, grayscales, and/or a hybrid of both.

Infrared films


Infrared (IR) film is sensitive to black-body radiation in the 250 to 500 °C (482 to 932 °F) range, while the range of thermography is approximately −50 to 2,000 °C (−58 to 3,632 °F). So, for an IR film to work thermographically, the measured object must be over 250 °C (482 °F) or be reflecting infrared radiation from something that is at least that hot.

Comparison with night-vision devices


Starlight-type night-vision devices generally only magnify ambient light and are not thermal imagers.

Some infrared cameras marketed as night vision are sensitive to near-infrared just beyond the visual spectrum, and can see emitted or reflected near-infrared in complete visual darkness. However, these are not usually used for thermography due to the high equivalent black-body temperature required, but are instead used with active near-IR illumination sources.

Passive vs. active thermography


All objects above the absolute zero temperature (0 K) emit infrared radiation. Hence, an excellent way to measure thermal variations is to use an infrared sensing device, usually a focal plane array (FPA) infrared camera capable of detecting radiation in the mid (3 to 5 μm) and long (7 to 14 μm) wave infrared bands, denoted as MWIR and LWIR, corresponding to two of the high transmittance infrared windows. Abnormal temperature profiles at the surface of an object are an indication of a potential problem.[13]

In passive thermography, the features of interest are naturally at a higher or lower temperature than the background. Passive thermography has many applications such as surveillance of people on a scene and medical diagnosis (specifically thermology).

In active thermography, an energy source is required to produce a thermal contrast between the feature of interest and the background.[14] The active approach is necessary in many cases given that the inspected parts are usually in equilibrium with the surroundings. Given the super-linearity of black-body radiation, active thermography can also be used to enhance the resolution of imaging systems beyond their diffraction limit or to achieve super-resolution microscopy.[15]

Advantages


  • Thermography shows a visual picture, so temperatures over a large area can be compared.[16][17][18]
  • It is capable of catching moving targets in real time.[16][17][18]
  • It is able to find deterioration, i.e., higher-temperature components, prior to their failure.
  • It can be used to measure or observe in areas inaccessible or hazardous for other methods.
  • It is a non-destructive test method.
  • It can be used to find defects in shafts, pipes, and other metal or plastic parts.[19]
  • It can be used to detect objects in dark areas.
  • It has some medical application, essentially in physiotherapy.

Limitations and disadvantages


Quality thermography cameras often have a high price (often US$3,000 or more) due to the expense of the larger pixel array (state of the art 2560×2048[20][21][22]), although less expensive models (with pixel arrays of 40×40 up to 160×120 pixels) are also available. Having fewer pixels than traditional cameras reduces image quality, making it more difficult to distinguish proximate targets within the same field of view.

There is also a difference in refresh rate. Some cameras may have a refresh rate of only 5–15 Hz, while others (e.g., the FLIR X8500sc[3]) reach 180 Hz or even more when not operating in full-window mode.

There are various types of lenses available, including fixed focus, manual focus, and auto focus. Most thermal cameras only support digital zoom and lack true optical zoom capabilities. However, a few models (e.g. FOTRIC P7MiX) offer dual-view optical zoom, combining lenses with different fields of view (e.g., 25° and 12°, or 25° and 7°).

Many models do not provide the irradiance measurements used to construct the output image; the loss of this information without a correct calibration for emissivity, distance, and ambient temperature and relative humidity entails that the resultant images are inherently incorrect measurements of temperature.[23]

Images can be difficult to interpret accurately when based upon certain objects, specifically objects with erratic temperatures, although this problem is reduced in active thermal imaging.[24]

Thermographic cameras create thermal images based on the radiant heat energy they receive.[25] Because radiation levels are influenced by the emissivity of the surface being measured and by reflected radiation such as sunlight, this causes errors in the measurements.[26]

  • Most cameras have ±2% accuracy or worse in measurement of temperature and are not as accurate as contact methods.[16][17][18]
  • Methods and instruments are limited to directly detecting surface temperatures.

Applications

Kite aerial thermogram revealing features on/under a grassed playing field.
UAS thermal imagery of a solar panel array in Switzerland
Thermographic image of a ring-tailed lemur

Thermography finds many uses, and thermal imaging cameras are excellent tools for the maintenance of electrical and mechanical systems in industry and commerce. For example, firefighters use it to see through smoke, find people, and localize hotspots of fires. Power line maintenance technicians locate overheating joints and parts, a telltale sign of their failure, to eliminate potential hazards. Where thermal insulation becomes faulty, building construction technicians can see heat leaks and improve the efficiency of heating and air-conditioning systems.

By using proper camera settings, electrical systems can be scanned and problems can be found. Faults with steam traps in steam heating systems are easy to locate.

In the energy savings area, thermal imaging cameras can see the effective radiation temperature of an object as well as what that object is radiating towards, which can help locate sources of thermal leaks and overheated regions.

Viewed from space by WISE using a thermal camera, asteroid 2010 AB78 appears redder than the background stars as it emits most of its light at longer infrared wavelengths. In visible light and near-infrared it is very faint and difficult to see.

Cooled infrared cameras can be found at major astronomy research telescopes, even those that are not infrared telescopes. Examples include UKIRT, the Spitzer Space Telescope, WISE, and the James Webb Space Telescope.[27]

For automotive night vision, thermal imaging cameras are also installed in some luxury cars to aid the driver, the first being the 2000 Cadillac DeVille.

In smartphones, a thermal camera was first integrated into the Cat S60 in 2016.

Industry


In manufacturing, engineering, and research, thermography is used for a variety of inspection and monitoring tasks.

Thermography is also widely used in building inspection.[29]

Health


Some physiological activities, particularly responses such as fever, in human beings and other warm-blooded animals can also be monitored with non-contact thermography. This can be compared to contact thermography such as with traditional thermometers.

Healthcare-related uses include:

  • Dynamic angiothermography
  • Peripheral vascular disease screening
  • Medical imaging in infrared
  • Thermography (medical): medical testing for diagnosis
  • Carotid artery stenosis (CAS) screening through skin thermal maps[32]
  • Active dynamic thermography (ADT) for medical applications[33][34][35]
  • Neuromusculoskeletal disorders
  • Extracranial cerebral and facial vascular disease
  • Facial emotion recognition[36][37]
  • Thyroid gland abnormalities
  • Various other neoplastic, metabolic, and inflammatory conditions

Security and defence

The thermographic camera on a Eurocopter EC135 helicopter of the German Federal Police
AN/PAS-13 thermal rifle scope mounted on an AR-15 rifle

Thermography is often used in surveillance, security, firefighting, law enforcement, and anti-terrorism.[38]

In weapons systems, thermography can be used for military and police target detection and acquisition.

In computer hacking, a thermal attack is an approach that exploits heat traces left after interacting with interfaces, such as touchscreens or keyboards, to uncover the user's input.[40]

Other applications

Hot hooves indicate a sick cow.

These techniques are used in a variety of other areas as well.

Standards

ASTM International (ASTM)
  • ASTM C1060, Standard Practice for Thermographic Inspection of Insulation Installations in Envelope Cavities of Frame Buildings
  • ASTM C1153, Standard Practice for the Location of Wet Insulation in Roofing Systems Using Infrared Imaging
  • ASTM D4788, Standard Test Method for Detecting Delamination in Bridge Decks Using Infrared Thermography
  • ASTM E1186, Standard Practices for Air Leakage Site Detection in Building Envelopes and Air Barrier Systems
  • ASTM E1934, Standard Guide for Examining Electrical and Mechanical Equipment with Infrared Thermography
International Organization for Standardization (ISO)
  • ISO 6781, Thermal insulation – Qualitative detection of thermal irregularities in building envelopes – Infrared method
  • ISO 18434-1, Condition monitoring and diagnostics of machines – Thermography – Part 1: General procedures
  • ISO 18436-7, Condition monitoring and diagnostics of machines – Requirements for qualification and assessment of personnel – Part 7: Thermography

Regulation


Higher-end thermographic cameras are often deemed dual-use military grade equipment, and are export-restricted, particularly if the resolution is 640×480 or greater, unless the refresh rate is 9 Hz or less. The export from the USA of specific thermal cameras is regulated by International Traffic in Arms Regulations.

In biology


Thermography, by strict definition, is a measurement using an instrument, but some living creatures have natural organs that function as counterparts to bolometers, and thus possess a crude type of thermal imaging capability. This is called thermoception. One of the best known examples is infrared sensing in snakes.

History


Discovery and research of infrared radiation


Infrared was discovered in 1800 by Sir William Herschel as a form of radiation beyond red light.[46] These "infrared rays" (infra is the Latin prefix for "below") were used mainly for thermal measurement.[47] There are four basic laws of IR radiation: Kirchhoff's law of thermal radiation, the Stefan–Boltzmann law, Planck's law, and Wien's displacement law. The development of detectors was mainly focused on the use of thermometers and bolometers until World War I.

A significant step in the development of detectors occurred in 1829, when Leopoldo Nobili, using the Seebeck effect, created the first known thermocouple, fabricating an improved thermometer, a crude thermopile. He described this instrument to Macedonio Melloni. Initially, they jointly developed a greatly improved instrument. Subsequently, Melloni worked alone, creating an instrument in 1833 (a multielement thermopile) that could detect a person 10 metres away.[48] The next significant step in improving detectors was the bolometer, invented in 1880 by Samuel Pierpont Langley.[49] Langley and his assistant Charles Greeley Abbot continued to make improvements in this instrument. By 1901, it could detect radiation from a cow 400 metres away and was sensitive to temperature differences of one hundred-thousandth of a degree Celsius (0.00001 °C).[50][51] The first commercial thermal imaging camera was sold in 1965 for high-voltage power line inspections.

The first civil sector application of IR technology may have been a device to detect the presence of icebergs and steamships using a mirror and thermopile, patented in 1913.[52] This was soon outdone by the first accurate IR iceberg detector, which did not use thermopiles, patented in 1914 by R.D. Parker.[53] This was followed by G.A. Barker's proposal to use the IR system to detect forest fires in 1934.[54] The technique was not genuinely industrialized until it was used to analyze heating uniformity in hot steel strips in 1935.[55]

First thermographic camera


In 1929, Hungarian physicist Kálmán Tihanyi invented the infrared-sensitive (night vision) electronic television camera for anti-aircraft defense in Britain.[56] The first American thermographic camera developed was an infrared line scanner. This was created by the US military and Texas Instruments in 1947[57][failed verification] and took one hour to produce a single image. While several approaches were investigated to improve the speed and accuracy of the technology, one of the most crucial factors dealt with scanning an image, which the AGA company was able to commercialize using a cooled photoconductor.[58]

The first British infrared linescan system was Yellow Duckling of the mid-1950s.[59] This used a continuously rotating mirror and detector, with Y-axis scanning by the motion of the carrier aircraft. Although unsuccessful in its intended application of submarine tracking by wake detection, it was applied to land-based surveillance and became the foundation of military IR linescan.

This work was further developed at the Royal Signals and Radar Establishment in the UK when they discovered that mercury cadmium telluride was a photoconductor that required much less cooling. Honeywell in the United States also developed arrays of detectors that could be cooled to a lower temperature,[further explanation needed] but they scanned mechanically. This method had several disadvantages, which could be overcome using an electronic scanning system. In 1969, Michael Francis Tompsett at English Electric Valve Company in the UK patented a camera that scanned pyro-electronically and which reached a high level of performance after several other breakthroughs during the 1970s.[60] Tompsett also proposed an idea for solid-state thermal-imaging arrays, which eventually led to modern hybridized single-crystal-slice imaging devices.[58]

By using video camera tubes such as vidicons with a pyroelectric material such as triglycine sulfate (TGS) as their targets, a vidicon sensitive over a broad portion of the infrared spectrum[61] is possible. This technology was a precursor to modern microbolometer technology and was mainly used in firefighting thermal cameras.[62]

Smart sensors


One of the essential areas of development for security systems was the ability to intelligently evaluate a signal and to warn of a threat's presence. Under the encouragement of the US Strategic Defense Initiative, "smart sensors" began to appear. These are sensors that can integrate sensing, signal extraction, processing, and comprehension.[63] There are two main types of smart sensors. One, similar to what is called a "vision chip" when used in the visible range, allows for preprocessing using smart sensing techniques enabled by the growth of integrated microcircuitry.[64] The other technology is more oriented to a specific use and fulfills its preprocessing goal through its design and structure.[65]

Towards the end of the 1990s, infrared technology was moving towards civilian use. There was a dramatic lowering of costs for uncooled arrays, which, along with significant technical advances, led to a dual-use market encompassing both civilian and military applications.[66] These uses include environmental control, building and art analysis, functional medical diagnostics, and car guidance and collision-avoidance systems.[67][68][69][70][71][72]

from Grokipedia
Thermography is a non-invasive technique that detects and visualizes surface temperature variations by measuring the infrared radiation emitted from objects, converting it into visible thermograms using specialized cameras. This method relies on the principle that all objects above absolute zero emit thermal radiation in proportion to their temperature, governed by laws such as the Stefan-Boltzmann law, which states that the total energy radiated is directly proportional to the fourth power of the absolute temperature. Unlike visible-light imaging, thermography operates in the infrared spectrum (typically 0.9–14 μm), allowing detection of heat patterns without physical contact.

The development of thermography traces back to the mid-20th century, with early applications emerging in 1956 when physician Robert Lawson first used it for breast cancer detection by recording surface temperature differences on the skin. Building on foundational work in infrared detection from the early 20th century, such as Kálmán Tihanyi's development of an infrared-sensitive camera, practical thermographic systems became viable for military surveillance and targeting, and later advances in detector technology enabled widespread adoption in medical diagnostics and industrial inspections.

Thermography finds diverse applications across fields, including medical diagnostics for detecting inflammation, circulatory disorders, and early cancer indicators through skin temperature anomalies; industrial maintenance to identify overheating electrical components, mechanical wear, or insulation failures in equipment like motors and pipelines; and building energy assessments to locate heat leaks in structures for improved efficiency. In aerospace, space agencies employ it for thermal modeling of spacecraft components and debris detection during missions. Its passive nature, requiring no external heat source, makes it suitable for real-time monitoring in applications such as surveillance and firefighting.

Key advantages of thermography include its speed, safety, and ability to cover large areas quickly, though accuracy depends on factors like emissivity (the efficiency of radiation emission, often around 0.96 for many surfaces) and environmental conditions such as ambient temperature and humidity. Modern systems integrate quantitative analysis for precise measurements, often from -20°C to over 150°C, supporting predictive maintenance that reduces downtime and costs in industrial settings. As of 2025, the field has seen significant growth, with the thermal imaging market exceeding $7 billion, driven by AI integration for enhanced detection and analysis. Despite its benefits, thermography is typically used as a complementary tool rather than a standalone diagnostic method, particularly in medicine, where it aids screening but does not replace techniques such as mammography.

Fundamentals

Thermal Radiation Basics

Thermal radiation refers to the electromagnetic radiation emitted by any object with a temperature above absolute zero, arising from the thermal motion of charged particles within the material. This emission occurs across a spectrum of wavelengths, with the intensity and distribution determined solely by the object's temperature in the case of an ideal emitter. For most practical purposes in thermography, thermal radiation in the infrared portion of the electromagnetic spectrum is of primary interest, as it allows non-contact assessment of everyday objects.

A blackbody is an idealized physical body that absorbs all incident electromagnetic radiation, regardless of frequency or angle of incidence, and re-emits the absorbed energy in a manner dependent only on its temperature. The spectral distribution of this emitted radiation is described by Planck's law, which quantifies the radiance as a function of wavelength and temperature:

B(\lambda, T) = \frac{2hc^2}{\lambda^5} \frac{1}{e^{hc / \lambda k T} - 1}

where B(\lambda, T) is the spectral radiance in watts per square meter per steradian per micrometer, h is Planck's constant, c is the speed of light, k is Boltzmann's constant, \lambda is the wavelength, and T is the absolute temperature in kelvin. This formulation resolved the ultraviolet catastrophe in classical physics and provides the foundation for understanding how thermal emission peaks at longer wavelengths for cooler objects.

The total energy radiated by a blackbody across all wavelengths is governed by the Stefan-Boltzmann law, which states that the power radiated per unit surface area is proportional to the fourth power of the absolute temperature. For a real body, the total radiated power P is given by P = \epsilon \sigma A T^4, where \epsilon is the emissivity (a dimensionless factor between 0 and 1 indicating how closely the body approximates a blackbody), \sigma = 5.67 \times 10^{-8} W/m²K⁴ is the Stefan-Boltzmann constant, A is the surface area, and T is the absolute temperature. This law underscores the rapid increase in radiative output with temperature, making it essential for quantitative thermography.

Kirchhoff's law of thermal radiation establishes that, in thermodynamic equilibrium, the emissivity of a body at a given wavelength equals its absorptivity at that same wavelength, ensuring that good absorbers are also good emitters. Formally stated, for a body in thermal equilibrium, the power radiated equals the power absorbed, linking the material's emission and absorption properties directly. This principle holds for opaque bodies and is fundamental to the definition of blackbodies, where both emissivity and absorptivity are unity across all wavelengths.

In thermography, the relevant range for terrestrial applications, such as room-temperature objects around 300 K, corresponds to peak emission in the long-wave infrared band of approximately 8 to 15 μm, where atmospheric transmission is favorable and detectors are sensitive. Accurate interpretation of this emission requires accounting for emissivity variations across materials, as deviations from blackbody behavior can introduce measurement errors.
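
The short sketch below evaluates Planck's law at a few wavelengths for a 300 K blackbody and numerically checks that integrating it recovers the Stefan-Boltzmann total; it is a plain numerical illustration, not a calibrated radiometric model.

```python
# Numerical sketch of Planck's law and the Stefan-Boltzmann law for a 300 K blackbody.
import numpy as np

h = 6.626e-34      # Planck's constant, J s
c = 2.998e8        # speed of light, m/s
k = 1.381e-23      # Boltzmann's constant, J/K
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)

def spectral_radiance(wavelength_m, T):
    """Planck's law: spectral radiance in W / (m^2 sr m)."""
    return (2 * h * c**2 / wavelength_m**5) / np.expm1(h * c / (wavelength_m * k * T))

T = 300.0
for wl_um in (4.0, 10.0, 14.0):
    print(f"B({wl_um:4.1f} um, {T:.0f} K) = {spectral_radiance(wl_um * 1e-6, T):.3e} W/(m^2 sr m)")

# Integrating Planck's law over wavelength and the hemisphere (factor pi)
# should approximately recover sigma * T^4.
wl = np.linspace(1e-7, 1e-3, 200_000)          # 0.1 um to 1 mm, uniform grid
total = np.pi * spectral_radiance(wl, T).sum() * (wl[1] - wl[0])
print(f"integrated: {total:.1f} W/m^2   vs   sigma*T^4 = {SIGMA * T**4:.1f} W/m^2")
```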

Emissivity and Surface Properties

Emissivity (ε), a key property in thermography, is defined as the ratio of the thermal radiation emitted by a surface to that emitted by an ideal blackbody at the same temperature and wavelength, with values ranging from 0 (perfect reflector) to 1 (perfect emitter). This measure quantifies how effectively a surface emits radiation relative to a blackbody, directly influencing the accuracy of temperature readings in infrared imaging since real surfaces deviate from ideal blackbody behavior.

Typical values vary widely by material; for instance, polished metals exhibit low emissivity around 0.05, making them poor emitters and challenging for thermographic detection. In contrast, human skin has a high emissivity of approximately 0.98, facilitating reliable non-contact measurements in thermography. Oxidized surfaces, such as those on metals exposed to air, show intermediate values around 0.8, which can enhance emission compared to their polished counterparts but still require adjustment for precise readings.

Several factors influence emissivity, leading to variations in thermographic interpretations. Surface roughness increases emissivity by trapping and re-emitting radiation more effectively than smooth surfaces do. Oxidation forms a layer that elevates emissivity, as seen in metals where oxide coatings boost emission. Emissivity also depends on temperature, often rising at higher temperatures due to changes in surface structure, and on wavelength, with materials showing different values across infrared bands.

To address emissivity variations in thermography, correction techniques are essential for accurate temperature mapping. One common approach involves using reference sources, such as calibrated blackbody emitters placed adjacent to the target, to normalize readings and estimate the surface's effective emissivity. Alternatively, applying high-emissivity coatings, like black paint with ε near 0.95, to low-emissivity surfaces standardizes emission and simplifies measurements without altering the underlying material significantly. These methods ensure that thermographic data reflect true thermal conditions rather than surface artifacts.

Principles of Operation

Measurement Techniques

Infrared detection in thermography relies on the absorption of infrared radiation by specialized detector materials, which convert the incident radiation into measurable electrical signals. Detectors can be thermal detectors, which respond to temperature rises from absorbed radiation by changing electrical properties (e.g., resistance in bolometers or microbolometers, common in uncooled systems for broad applications), or quantum (photon) detectors, which operate via the photoelectric effect, where photons with energy matching the material's bandgap excite electrons from the valence band to the conduction band, generating electron-hole pairs that produce a photocurrent. This quantum process avoids intermediate thermal conversion, enabling a direct and sensitive response, though it often requires cooling to reduce thermal noise and dark current. Common materials for quantum detectors include mercury cadmium telluride (MCT) for broad spectral coverage and indium antimonide (InSb) for mid-wave applications, with the absorption efficiency depending on the photon's wavelength relative to the material's bandgap.

Thermography employs distinct spectral bands tailored to different temperature ranges and applications, each offering unique sensitivities based on blackbody radiation characteristics. The short-wave infrared (SWIR) band spans 0.9–1.7 μm and is primarily used for imaging reflected light, but it can detect thermal emissions from very high temperatures (typically above 800°C), where thermal emission peaks shift to shorter wavelengths; however, it also captures reflected solar radiation, limiting pure thermal sensitivity at ambient or moderate conditions. The mid-wave infrared (MWIR) band, from 3–5 μm, provides high thermal contrast for moderate temperatures (0–500°C), as objects in this range emit strongly within the band according to Wien's displacement law, making it ideal for detecting hot sources against cooler backgrounds. The long-wave infrared (LWIR) band, covering 8–14 μm, excels in sensitivity for lower temperatures (down to -20°C), capturing peak emissions from room-temperature objects (around 300 K), where radiation intensity is higher at longer wavelengths, though it is more susceptible to atmospheric interference. Accurate temperature readings in these bands depend on accounting for surface emissivity, which varies by material and affects the interpreted radiance.

Spatial resolution in thermographic measurements is quantified by the instantaneous field of view (IFOV), defined as the angular extent subtended by a single detector pixel, typically expressed in milliradians (mrad). A smaller IFOV, such as 1.0 mrad, allows finer detail resolution, enabling the camera to distinguish smaller targets at a given distance without averaging over larger areas, which is critical for applications like defect detection in structures. Thermal resolution, meanwhile, is characterized by the minimum detectable temperature difference (MDTD), often measured as the noise-equivalent temperature difference (NETD), representing the smallest temperature variation the system can reliably detect above noise levels, commonly in the range of 20–100 mK for uncooled detectors. Lower NETD values enhance the ability to identify subtle thermal gradients, with performance improving under stabilized conditions that minimize noise contributions from the detector itself.

Atmospheric effects significantly influence measurements by attenuating signals through absorption and scattering, primarily by water vapor and carbon dioxide (CO2). Water vapor absorbs strongly in the LWIR band (e.g., around 6–7 μm and beyond 15 μm) and contributes to continuum absorption across broader regions, reducing signal and introducing path-dependent errors in long-range thermography. CO2 exhibits prominent absorption in the MWIR band near 4.3 μm and weaker features in LWIR around 15 μm, further complicating signal interpretation in humid or elevated-CO2 environments. Correction methods include narrow-band filtering to isolate atmospheric transmission windows, such as 3–5 μm for MWIR or 8–12 μm for LWIR, where absorption is minimal, thereby preserving signal integrity without full modeling. Additional techniques involve on-site measurements of atmospheric profiles (e.g., temperature and humidity) to apply empirical corrections, ensuring quantitative accuracy over distances up to several kilometers.
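
As a small worked example of these resolution figures, the sketch below converts an assumed IFOV into the spot size covered by one pixel at a given distance and compares a scene temperature contrast against an assumed NETD; none of the numbers describe a particular camera.

```python
# Sketch: spatial and thermal resolution estimates from IFOV and NETD.
# Numbers are illustrative assumptions, not specifications of a real camera.

ifov_mrad = 1.0        # instantaneous field of view per pixel, milliradians
distance_m = 20.0      # distance to the target, metres
netd_mK = 50.0         # noise-equivalent temperature difference, millikelvin

# Size of the ground spot covered by one pixel (small-angle approximation).
spot_size_m = distance_m * ifov_mrad * 1e-3
print(f"one pixel covers roughly {spot_size_m * 100:.1f} cm at {distance_m:.0f} m")

# A thermal feature is only reliably detectable if its temperature contrast
# comfortably exceeds the NETD (a common rule of thumb is several times NETD).
delta_T_mK = 200.0     # assumed scene temperature contrast, millikelvin
print("detectable" if delta_T_mK > 3 * netd_mK else "likely lost in noise")
```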

Thermography Equations

In infrared thermography, the relationship between measured radiance and temperature is fundamentally governed by the Stefan-Boltzmann law adapted for greybody emitters, where the radiance L emitted by an object is proportional to the fourth power of its absolute temperature T: L \propto \epsilon \sigma T^4. This non-linear dependence implies that small changes in temperature produce disproportionately large variations in detected signal, particularly at higher temperatures, which is critical for qualitative imaging where relative thermal contrasts are visualized without absolute calibration.

For quantitative temperature derivation, the absolute temperature is obtained by inverting the greybody radiance equation, accounting for the object's emissivity \epsilon (0 < \epsilon ≤ 1) and the Stefan-Boltzmann constant \sigma = 5.67 \times 10^{-8} W/m²K⁴:

T = \left( \frac{L}{\epsilon \sigma} \right)^{1/4}

Here, L represents the measured radiance corrected for calibration factors such as detector response and integration over the instrument's spectral band. In practice, this formulation incorporates additional terms for noise \delta L in the radiance measurement and calibration offsets, yielding T = \left( \frac{L + \delta L}{\epsilon \sigma} \right)^{1/4} + \Delta T_{cal}, where \Delta T_{cal} adjusts for instrument-specific biases.

Over small temperature ranges, where the non-linearity is minimal, a linear approximation simplifies data interpretation by differentiating the radiance equation:

\Delta T \approx \frac{1}{4 \epsilon \sigma T^3} \Delta L

This derives from the sensitivity \partial L / \partial T = 4 \epsilon \sigma T^3, allowing \Delta T to be estimated directly from signal fluctuations \Delta L. The approximation holds well for \Delta T / T < 10\%, common in thermographic applications monitoring subtle thermal gradients, but accuracy degrades over larger ranges due to the underlying T^4 curvature.

Error analysis in thermography reveals that uncertainties in temperature arise primarily from emissivity estimation, atmospheric effects, and detector limitations, each propagating through the radiance-to-temperature inversion. Emissivity uncertainty \Delta \epsilon dominates bias errors, as a relative error of 5% in \epsilon (e.g., from surface variability or an incorrect assumption) can induce \Delta T / T \approx (\Delta \epsilon / \epsilon) / 4 at typical operating temperatures around 300 K, often exceeding 1 K for low-emissivity materials like metals. Atmospheric attenuation introduces path-dependent losses via transmittance \tau < 1, modeled in the full radiance equation L = \tau (\epsilon \sigma T^4 + (1 - \epsilon) L_{refl}) + (1 - \tau) L_{atm}, where errors from uncompensated \Delta \tau (due to humidity or CO₂ absorption) can add 0.5–2 K over 10 m distances in mid-wave infrared bands. Detector noise contributes random errors quantified by the noise-equivalent temperature difference (NETD, typically 20–50 mK for uncooled microbolometers), relating to radiance noise via \Delta T \approx \text{NETD} = \Delta L / (4 \epsilon \sigma T^3), which limits precision in low-contrast scenes and is mitigated by averaging or spectral filtering. Overall, combined uncertainties often yield total \Delta T of 1–2 K for well-calibrated systems, emphasizing the need for site-specific corrections.
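
The sketch below implements the grey-body radiance model and its inversion as quoted above, with illustrative values for emissivity, atmospheric transmittance, and the reflected and atmospheric terms; it is a simplified band-integrated model rather than a real camera's calibration pipeline.

```python
# Sketch: invert measured radiance to temperature using the grey-body model
# L = tau * (eps * sigma * T^4 + (1 - eps) * L_refl) + (1 - tau) * L_atm.
# All parameter values are illustrative assumptions.
SIGMA = 5.67e-8   # W/(m^2 K^4)

def blackbody_exitance(T):
    return SIGMA * T ** 4

def simulate_measurement(T_obj, eps, tau, T_refl, T_atm):
    """Radiance a camera would attribute to the scene under this simple model."""
    return (tau * (eps * blackbody_exitance(T_obj)
                   + (1 - eps) * blackbody_exitance(T_refl))
            + (1 - tau) * blackbody_exitance(T_atm))

def invert_to_temperature(L_meas, eps, tau, T_refl, T_atm):
    """Solve the model above for the object surface temperature."""
    emitted = ((L_meas - (1 - tau) * blackbody_exitance(T_atm)) / tau
               - (1 - eps) * blackbody_exitance(T_refl))
    return (emitted / (eps * SIGMA)) ** 0.25

eps, tau = 0.90, 0.95            # assumed emissivity and atmospheric transmittance
T_refl, T_atm = 293.0, 290.0     # reflected-background and atmosphere temperatures, K
L = simulate_measurement(310.0, eps, tau, T_refl, T_atm)
print(f"recovered object temperature: {invert_to_temperature(L, eps, tau, T_refl, T_atm):.2f} K")
```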

Visualization Methods

In thermography, raw thermal data captured as grayscale intensity values representing temperature variations is typically transformed into visual representations to facilitate human interpretation. This process begins with the application of color scales, where pseudocolor palettes map temperature gradients to a spectrum of colors, enhancing the perceptual distinction of thermal differences. Common palettes include the ironbow, which transitions from black through red, yellow, and white to highlight hot spots with high contrast, and the rainbow palette, which employs a broad chromatic range from blue (cool) to red (hot) for intuitive visualization of subtle gradients. For instance, in detecting laptop overheating, a thermal camera may display hot areas like the heat sink in red or glowing white, depending on the selected palette and heat intensity. These mappings leverage non-linear temperature scaling to align with human visual sensitivity, ensuring that small temperature changes are perceptually amplified without altering the underlying quantitative data.

Isotherms and false-color imaging further refine this visualization by overlaying contour lines that delineate regions of constant temperature, allowing users to identify thresholds such as critical overheating zones. In false-color approaches, specific temperature bands are assigned uniform hues, creating binary or segmented displays where, for instance, areas above a defined limit appear in a stark color like red against a neutral background. This technique is particularly useful for isolating anomalies in complex scenes, as isotherms can be dynamically adjusted to focus on user-specified ranges, improving the detection of thermal boundaries.

To address the often low contrast inherent in thermal images due to limited dynamic range, image enhancement techniques are applied post-capture. Histogram equalization redistributes pixel intensities to span the full grayscale range, thereby amplifying contrast and revealing fine details in uniform temperature areas. Edge detection algorithms, such as those based on gradient operators, sharpen boundaries between thermal regions, aiding in the highlighting of defects like cracks or voids by emphasizing discontinuities in the temperature field. These methods, when combined, preserve thermal accuracy while improving visual clarity, though they must be calibrated to avoid introducing artifacts.

Despite these advances, common pitfalls in thermographic visualization arise from over-reliance on color without quantitative validation, leading to misinterpretation of apparent anomalies as actual thermal events. For example, perceptual biases in palette choice can exaggerate or obscure gradients, potentially causing users to overlook subtle issues if the color scheme does not align with the scene's thermal dynamics. To mitigate this, visualizations should always be corroborated with numerical temperature readings rather than treated as standalone qualitative indicators.
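
A possible sketch of the enhancement and isotherm steps described above, applied to a synthetic low-contrast temperature map; the rank-based equalization and the threshold value are illustrative choices, not a standard toolchain.

```python
# Sketch: histogram equalization plus a simple isotherm overlay on synthetic data.
import numpy as np
import matplotlib.pyplot as plt

# Synthetic low-contrast temperature field (degrees C).
rng = np.random.default_rng(0)
temperature = 25.0 + 2.0 * rng.standard_normal((240, 320))
temperature[100:140, 150:200] += 6.0   # a warm rectangular "defect"

# Histogram equalization of the displayed intensities (display only;
# the underlying temperature values are left untouched).
flat = temperature.ravel()
ranks = flat.argsort().argsort()
equalized = (ranks / (flat.size - 1)).reshape(temperature.shape)

# Isotherm: outline everything above a chosen threshold.
threshold_c = 29.0
isotherm_mask = temperature > threshold_c

fig, axes = plt.subplots(1, 2, figsize=(10, 4))
axes[0].imshow(equalized, cmap="gray")
axes[0].set_title("Equalized display")
axes[1].imshow(temperature, cmap="inferno")
axes[1].contour(isotherm_mask, levels=[0.5], colors="cyan")  # isotherm outline
axes[1].set_title(f"Isotherm > {threshold_c} °C")
plt.show()
```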

Equipment and Technology

Infrared Camera Systems

Infrared camera systems for thermography consist of several core components that enable the capture and processing of thermal radiation. The optical subsystem typically employs lenses made from germanium, a material with high transparency in the infrared spectrum (particularly 8-14 μm for long-wave infrared applications), to focus incoming thermal energy onto the detector array without significant absorption. The detector array, often a focal plane array of thousands of pixels, converts the focused infrared photons into electrical signals. Electronics for signal amplification and readout follow, processing these signals to form a digital image, while cooling systems, such as Stirling coolers in high-end models, maintain the detector at cryogenic temperatures (around 77 K) to reduce thermal noise and enhance sensitivity.

Key specifications of these systems determine their performance in thermographic applications. Spatial resolution commonly reaches 640 × 480 pixels, allowing detailed imaging of temperature distributions over a scene. Frame rates typically range from 25 Hz to 60 Hz for real-time monitoring, with higher rates (up to 100 Hz or more) available in specialized models for dynamic processes. The noise equivalent temperature difference (NETD), a measure of sensitivity, is generally below 50 mK, enabling detection of subtle temperature variations as small as 0.05°C.

Calibration is essential to ensure accuracy and uniformity in infrared camera outputs. Two-point non-uniformity correction (NUC) involves imaging uniform blackbody sources at two distinct temperatures to compute and apply gain and offset adjustments for each pixel, compensating for detector response variations. Flat-fielding complements this by using a single uniform reference field to correct spatial non-uniformities, often performed periodically during operation to account for environmental drifts.

These cameras integrate with software platforms for enhanced functionality. Real-time analysis tools process live video feeds to overlay temperature data, perform hotspot detection, and generate isotherms, while export options support formats like CSV or DICOM for post-processing in tools such as MATLAB or dedicated thermography suites.
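
The two-point NUC procedure can be sketched as follows; the per-pixel gain and offset spread is simulated here purely to show the gain/offset solve, and does not reflect any vendor's calibration routine.

```python
# Sketch of a two-point non-uniformity correction (NUC).
# Per-pixel gain/offset are simulated; a real camera would image two
# uniform blackbody sources at known temperatures instead.
import numpy as np

rng = np.random.default_rng(1)
shape = (480, 640)
true_gain = 1.0 + 0.05 * rng.standard_normal(shape)    # pixel-to-pixel gain spread
true_offset = 2.0 * rng.standard_normal(shape)          # pixel-to-pixel offset spread

def raw_reading(uniform_signal):
    """Simulated detector output for a spatially uniform input signal."""
    return true_gain * uniform_signal + true_offset

# Image two uniform references (e.g., blackbodies at two temperatures)
# whose expected signals are known from the radiometric calibration.
s_cold, s_hot = 100.0, 200.0
raw_cold, raw_hot = raw_reading(s_cold), raw_reading(s_hot)

# Solve per-pixel gain and offset so both references map to their known signals.
gain_corr = (s_hot - s_cold) / (raw_hot - raw_cold)
offset_corr = s_cold - gain_corr * raw_cold

# Apply the correction to a new frame: residual non-uniformity should be tiny.
corrected = gain_corr * raw_reading(150.0) + offset_corr
print(f"corrected frame: mean={corrected.mean():.2f}, std={corrected.std():.4f}")
```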

Detector Types

Infrared detectors are essential components in thermography systems, converting thermal radiation into electrical signals for imaging. They are broadly categorized into cooled photon detectors and uncooled thermal detectors, each offering distinct performance characteristics suited to different applications. Cooled detectors provide superior sensitivity for high-precision measurements, while uncooled detectors prioritize portability and cost-effectiveness.

Cooled detectors, primarily photon-based, operate by absorbing infrared photons to generate electron-hole pairs, requiring cryogenic cooling to suppress thermal noise and achieve high detectivity. Common materials include indium antimonide (InSb) for mid-wavelength infrared (MWIR, 3–5 μm) detection and mercury cadmium telluride (HgCdTe) for both MWIR and long-wavelength infrared (LWIR, 8–12 μm) ranges. These detectors are typically cooled to approximately 77 K using liquid nitrogen or Stirling cryocoolers to enable background-limited performance. For instance, InSb detectors exhibit detectivity (D*) values exceeding 10^10 cm Hz^1/2/W at 80 K, enabling noise equivalent temperature differences (NETD) as low as 20–25 mK. HgCdTe variants offer tunable bandgaps for multicolor imaging and D* > 10^11 cm Hz^1/2/W in LWIR under cryogenic conditions. However, their high power consumption (often >10 W for cooling) and bulkier design make them better suited to stationary, laboratory-based thermography than to portable use.

Uncooled detectors, in contrast, rely on thermal effects and operate at ambient temperatures, eliminating the need for cryogenic cooling and reducing system complexity. Microbolometers, the dominant type, use materials like vanadium oxide (VOx) or amorphous silicon (a-Si) in focal plane arrays, where incident radiation causes resistance changes via heating. VOx microbolometers provide higher sensitivity, with NETD around 20–50 mK, while a-Si variants achieve 50–100 mK but offer better uniformity and lower cost. Response times for microbolometers are typically 10–20 ms, slower than the microsecond-scale response of cooled photon detectors, but sufficient for most thermographic frame rates (30–60 Hz). Their low power consumption (<1 W) and compact size (e.g., systems under 1 kg) suit portable and field-deployable thermography, though with lower detectivity (D* ~10^8–10^9 cm Hz^1/2/W) compared to cooled options.
Metric | Cooled photon detectors (e.g., InSb, HgCdTe) | Uncooled microbolometers (e.g., VOx, a-Si)
Detectivity (D*) | >10^10 cm Hz^1/2/W | 10^8–10^9 cm Hz^1/2/W
NETD | 20–25 mK | 20–100 mK
Response time | <10 μs | 10–20 ms
Power consumption | >10 W (including cooling) | <1 W
Suitability | Stationary, high-sensitivity applications | Portable, cost-effective field use
This table highlights the key trade-offs: cooled detectors excel in sensitivity for demanding scenarios such as medical diagnostics, while uncooled detectors dominate consumer and industrial thermography because of their affordability. Emerging trends in detector technology focus on enhancing resolution and efficiency through advanced focal plane arrays (FPAs) and hybrid architectures. Large-format hybrid FPAs, combining HgCdTe detector layers with silicon read-out integrated circuits via indium bumps, achieve resolutions up to 4K × 4K pixels at high frame rates, improving spatial detail in thermographic imaging. High-operating-temperature (HOT) designs, operating above 120 K, reduce cooling demands while maintaining D* > 10^10 cm Hz^1/2/W. Additionally, type-II superlattice materials are gaining traction as HgCdTe alternatives for hybrid detectors, offering better uniformity and reduced dark current for next-generation LWIR thermography. These advancements enable compact, high-performance systems that bridge the gap between cooled and uncooled capabilities.
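
As a rough illustration of what the detectivity figures in the table imply, the following sketch converts an assumed D* into a per-pixel noise-equivalent power using the standard relation NEP = sqrt(A_d · Δf) / D*; the pixel pitch and electrical bandwidth are illustrative assumptions, not specifications of any listed detector.

```python
import math

# Minimal sketch relating detectivity D* to single-pixel noise-equivalent
# power (NEP): NEP = sqrt(pixel_area * bandwidth) / D*. Pitch and bandwidth
# below are illustrative assumptions only.

def noise_equivalent_power(d_star_cm, pitch_um=17.0, bandwidth_hz=60.0):
    """NEP in watts for a square pixel of the given pitch and bandwidth."""
    area_cm2 = (pitch_um * 1e-4) ** 2            # pixel area in cm^2
    return math.sqrt(area_cm2 * bandwidth_hz) / d_star_cm

for label, d_star in [("cooled HgCdTe (D* ~ 1e11)", 1e11),
                      ("uncooled microbolometer (D* ~ 1e9)", 1e9)]:
    print(f"{label}: NEP ~ {noise_equivalent_power(d_star):.2e} W")
# The roughly two-orders-of-magnitude gap in D* maps directly onto NEP,
# which is why cooled detectors reach lower NETD for comparable optics.
```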

Alternative Imaging Approaches

Reflected thermography represents an alternative to direct emission detection, utilizing charge-coupled device (CCD) and complementary metal-oxide-semiconductor (CMOS) sensors originally designed for visible light. In this method, an external illumination source is applied to the target surface, and the sensors capture variations in reflectivity caused by temperature changes, a phenomenon known as thermoreflectance. This approach enables high-resolution thermal mapping, particularly for microelectronic devices, with demonstrated temperature resolutions below 0.01 °C when lock-in techniques are used to enhance the signal-to-noise ratio. However, it is inherently limited to measuring surface temperatures and requires materials with measurable thermoreflectance coefficients, restricting its applicability compared with conventional systems.

Historical alternatives include photographic emulsions sensitized to near-infrared wavelengths, which served early thermographic needs before digital detectors dominated. These films, exposed either to reflected near-infrared light or to the output of rudimentary infrared scanners, allowed static recording of thermal patterns in applications such as medical diagnostics and industrial inspections during the mid-20th century. Though now obsolete due to lower sensitivity, slower processing, and an inability to capture real-time dynamics, they played a key role in pioneering thermographic documentation.

Thermography also differs fundamentally from night-vision technologies that employ image intensification, in which ambient visible and near-infrared light is amplified to produce visible images in low-light conditions. Image intensifiers rely on photocathodes to convert photons into electrons, which are then accelerated and multiplied before reconversion to visible light, enabling detailed scene rendering but failing in complete darkness or through heat-obscuring media such as smoke. In contrast, thermography passively detects mid- to long-wavelength infrared emissions from thermal sources, providing temperature-based detection independent of ambient illumination and superior penetration in adverse weather.

Hybrid imaging systems integrate thermography with visible-spectrum cameras to fuse thermal and structural data, yielding overlaid images that enhance interpretive accuracy. By aligning fields of view through beam splitters or software registration, these setups correlate heat signatures with visual details, which is valuable wherever contextual awareness is critical. Such fusion mitigates the lower resolution of thermal images while preserving the non-contact, real-time benefits of thermography. These alternative methods, however, often exhibit narrower spectral coverage than dedicated infrared detectors, limiting their sensitivity at certain wavelengths.
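
A minimal sketch of the thermoreflectance relation underlying reflected thermography is shown below; the coefficient value and the measured reflectivity change are illustrative assumptions, chosen only to show how lock-in-level signal resolution translates into millikelvin temperature resolution.

```python
# Minimal sketch of the thermoreflectance relation: a relative reflectivity
# change dR/R maps to a temperature change dT through the material's
# thermoreflectance coefficient kappa, dT ~ (dR/R) / kappa. The kappa value
# (~2e-4 per kelvin, typical of some metals at visible wavelengths) and the
# measured dR/R below are illustrative assumptions.

def thermoreflectance_delta_t(delta_r_over_r, kappa_per_k=2e-4):
    """Temperature rise inferred from a lock-in-measured reflectivity change."""
    return delta_r_over_r / kappa_per_k

# A lock-in measurement resolving dR/R of 1e-6 corresponds to ~5 mK here,
# consistent with the sub-0.01 C resolution quoted for microelectronics.
print(thermoreflectance_delta_t(1e-6))   # -> 0.005 K
```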

Techniques

Passive Thermography

Passive thermography is a non-invasive imaging technique that captures the infrared radiation naturally emitted by objects and surfaces due to their temperature, without applying any external energy sources or stimulation. The method relies on the principle that all objects above absolute zero emit infrared radiation, which is detected by thermal cameras to produce thermograms representing temperature distributions. The setup typically involves positioning a calibrated infrared camera to view the target area under ambient conditions, allowing measurement of steady-state thermal patterns or of transient changes arising from heat sources within the object itself.

In practice, passive thermography is employed to identify hotspots indicative of electrical or mechanical faults, to detect insulation flaws that lead to energy loss in structures, and to screen for elevated body temperatures associated with fever. These uses leverage the technique's ability to visualize thermal anomalies in real-world scenarios, such as building envelopes or human subjects, providing a quick overview of thermal performance without disrupting operations. Standard infrared camera systems facilitate this imaging by converting detected radiation into visible temperature maps.

Data interpretation in passive thermography depends heavily on ambient environmental factors, such as surrounding air temperature, airflow, and solar loading, which influence the overall contrast in the captured images. Surface emissivity, the measure of how efficiently a surface emits radiation compared with a blackbody, plays a critical role, as variations (e.g., lower values for polished metals) can lead to inaccurate temperature readings if not properly accounted for during analysis. Analysts must therefore adjust for these parameters to distinguish genuine thermal signatures from reflections and other artifacts.

One key advantage of passive thermography is its suitability for real-time monitoring, enabling continuous observation of thermal behavior in dynamic settings without the need for surface preparation or intervention. This makes it ideal for ongoing assessments where immediacy is essential. However, limitations arise in low-contrast scenes, where subtle temperature differences are masked by uniform ambient conditions or high-emissivity backgrounds, potentially reducing detection sensitivity for minor anomalies.
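
The emissivity adjustment described above can be sketched with a simple radiometric balance; the example below uses the Stefan-Boltzmann law over total radiance as a simplification (real cameras integrate Planck radiance over their spectral band), and all temperature and emissivity values are illustrative.

```python
# Minimal sketch of emissivity correction for a passive measurement. Inputs:
# the apparent blackbody temperature reported by an uncorrected camera, the
# surface emissivity, and the reflected ambient temperature. All values are
# illustrative; atmospheric transmission is ignored for simplicity.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emissivity_corrected_temperature(t_apparent_k, emissivity, t_reflected_k):
    """Solve eps*sigma*T_obj^4 + (1-eps)*sigma*T_refl^4 = sigma*T_apparent^4."""
    radiance_total = SIGMA * t_apparent_k ** 4
    radiance_reflected = (1.0 - emissivity) * SIGMA * t_reflected_k ** 4
    t_obj4 = (radiance_total - radiance_reflected) / (emissivity * SIGMA)
    return t_obj4 ** 0.25

# A painted surface (eps ~ 0.95) read as 60 C with 20 C surroundings:
print(emissivity_corrected_temperature(333.15, 0.95, 293.15) - 273.15)
# The same reading from bare polished metal (eps ~ 0.2) implies a much hotter
# surface, illustrating why low-emissivity targets are easy to misjudge.
print(emissivity_corrected_temperature(333.15, 0.20, 293.15) - 273.15)
```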

Active Thermography

Active thermography is a non-destructive testing (NDT) technique that employs external energy sources to stimulate a material, generating thermal contrasts that highlight subsurface defects or anomalies otherwise undetectable through passive observation. This approach builds on passive thermography principles by actively enhancing temperature differences to improve defect visibility in materials such as composites, metals, and coatings.

Common excitation methods in active thermography include optical stimulation using flash lamps, which deliver short, high-energy pulses to rapidly heat the surface; mechanical excitation via vibrothermography, where ultrasonic vibrations induce frictional heating at defect sites; inductive heating, which uses electromagnetic fields to generate eddy currents and localized heating in conductive materials; and microwave excitation, employing electromagnetic waves to penetrate and volumetrically heat the sample. Each method is selected based on material properties and defect types, with optical techniques suiting non-conductive surfaces and inductive approaches preferred for conductive metals.

The thermal response in active thermography is governed by the heat diffusion equation, which describes transient heat conduction as ∂T/∂t = α∇²T, where T is temperature, t is time, α is the thermal diffusivity, and ∇² is the Laplacian operator. This models how heat diffuses through the material following excitation, enabling the analysis of cooling or heating curves to infer the presence and characteristics of defects. For enhanced depth resolution, phase-based analysis via lock-in thermography modulates the excitation source sinusoidally and extracts phase information from the thermal response, which attenuates less with depth than amplitude, allowing precise localization of subsurface flaws. This technique improves signal-to-noise ratios and defect sizing in layered structures.

Active thermography's suitability for NDT stems from its non-contact nature and its ability to detect subsurface flaws, such as delaminations or voids, without surface preparation, making it valuable for quality assurance and in-service inspection. Limitations include sensitivity to non-uniform heating and surface conditions, which are addressed through controlled excitation and data post-processing.
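
A minimal sketch of the lock-in (phase-based) analysis described above is given below, assuming a sinusoidally modulated excitation and a per-pixel temperature sequence recorded at the camera frame rate; the synthetic "defect" signal and all parameter values are illustrative, not drawn from any particular instrument.

```python
import numpy as np

# Minimal sketch of lock-in demodulation for active thermography: each pixel's
# temperature time series is correlated with reference sine and cosine waves at
# the excitation frequency to recover amplitude and phase images. The phase lag
# of the thermal response is what carries depth information about defects.

def lockin_demodulate(frames, frame_rate_hz, f_mod_hz):
    """frames: array of shape (n_frames, H, W). Returns (amplitude, phase)."""
    n = frames.shape[0]
    t = np.arange(n) / frame_rate_hz
    ref_sin = np.sin(2 * np.pi * f_mod_hz * t)[:, None, None]
    ref_cos = np.cos(2 * np.pi * f_mod_hz * t)[:, None, None]
    s = (frames * ref_sin).mean(axis=0)      # in-phase component
    c = (frames * ref_cos).mean(axis=0)      # quadrature component
    amplitude = 2 * np.hypot(s, c)
    phase = np.arctan2(c, s)
    return amplitude, phase

# Synthetic example: 0.1 Hz modulation, 30 Hz camera, 30 s record (an integer
# number of periods). One pixel responds like sound material, another like a
# subsurface defect with a larger phase lag and reduced amplitude.
fs, fmod, n = 30.0, 0.1, 900
t = np.arange(n) / fs
frames = np.zeros((n, 2, 2))
frames[:, 0, 0] = 1.0 * np.sin(2 * np.pi * fmod * t - 0.2)   # sound area
frames[:, 1, 1] = 0.8 * np.sin(2 * np.pi * fmod * t - 0.9)   # defect area
amp, ph = lockin_demodulate(frames, fs, fmod)
print(ph[0, 0], ph[1, 1])   # the defect pixel shows a distinctly larger lag
```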

Advantages and Limitations

Key Advantages

Thermography offers significant advantages as a non-destructive testing method, allowing inspection of materials and structures without causing physical damage or alteration to the subject being examined. This non-invasive approach is particularly beneficial for evaluating components that must remain operational during assessment, such as equipment in ongoing service.

A primary benefit is its non-contact nature, which enables remote temperature measurement from a safe distance without physical interaction with the heat source. This capability facilitates efficient scanning of large areas, making it ideal for broad-surface inspections that would be impractical or hazardous with direct-contact methods. For instance, thermography can detect thermal anomalies across expansive surfaces such as pipelines or building facades in a single pass.

Thermography provides real-time imaging, capturing dynamic thermal patterns and changes instantaneously on stationary or moving targets. This high-speed capability surpasses traditional contact methods such as thermocouples, which require point-by-point attachment and exhibit slower response times, often limiting them to static measurements. In contrast, thermographic systems deliver immediate visual data over entire fields of view, enabling the monitoring of evolving processes without interruption.

The technique demonstrates versatility in challenging environments, such as high-voltage electrical systems or dusty industrial settings, where direct contact could pose safety risks or be infeasible. By operating remotely, thermography allows inspections in hazardous conditions without exposing personnel to dangers such as electrical arcs or airborne contaminants, thus enhancing operational safety.

In terms of cost-effectiveness, thermography supports predictive maintenance by identifying early signs of malfunction, such as overheating connections or friction in machinery, thereby preventing unexpected breakdowns and the associated downtime. Studies indicate that integrating thermography into maintenance programs can reduce maintenance costs by 25–30%, eliminate 70–75% of breakdowns, and cut downtime by 35–45%, leading to substantial long-term savings through proactive intervention.

Challenges and Disadvantages

Thermography exhibits significant sensitivity to environmental factors, including reflections from nearby surfaces, variations in surface emissivity, and atmospheric interference such as humidity, airflow, or gas absorption, which collectively contribute to measurement inaccuracies often in the range of ±2–5 °C. Reflections can create misleading hot spots in thermal images, particularly in outdoor or reflective environments such as electrical substations, complicating accurate defect identification. Emissivity errors arise when the camera's assumed emissivity deviates from the actual surface property, leading to over- or underestimation of temperatures, while atmospheric attenuation further distorts the signal over distance.

Spatial resolution in thermography is constrained by diffraction effects in the optics, analogous to the Abbe diffraction limit, where the long wavelengths of thermal infrared radiation (typically 8–14 μm for long-wave systems) inherently limit the smallest resolvable feature to several times the wavelength. Additionally, pixel size in focal plane array detectors imposes practical limits, often resulting in resolution insufficient for detecting fine subsurface defects or small, distant anomalies without advanced optics. As of 2025, advances in AI super-resolution algorithms and uncooled detector technology are helping to mitigate these resolution limitations by enhancing detail without increasing hardware costs.

The high initial cost of thermography equipment, especially systems with cooled detectors that require cryogenic maintenance, represents a substantial barrier to adoption, often making quantitative imaging setups prohibitively expensive compared with uncooled alternatives. Furthermore, effective use demands considerable operator expertise in camera calibration, environmental control, and protocol adherence to minimize errors during acquisition and interpretation.

Interpreting thermographic images poses challenges due to false positives induced by non-thermal artifacts, such as transient shadows, equipment reflections, or physiological variations unrelated to the target anomaly, which can mimic defects and lead to erroneous diagnoses. Emerging techniques, including machine learning models for artifact and anomaly filtering, are addressing these issues by 2025, achieving up to a 42% reduction in false positives through automated error correction and enhanced specificity in applications such as medical screening. Emissivity corrections, such as those based on reference measurements, can mitigate some accuracy losses but require integration with surface property assessments for optimal results.
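
To put the resolution limits above in rough numbers, the sketch below evaluates the diffraction-limited (Airy) spot diameter and the geometric instantaneous field of view for an assumed LWIR lens; the wavelength, f-number, pixel pitch, and focal length are illustrative values only.

```python
# Minimal sketch of the optical limits discussed above: the diffraction-limited
# spot diameter for an LWIR lens (Airy disk, d ~ 2.44 * wavelength * f-number)
# and the geometric instantaneous field of view (IFOV = pixel pitch / focal
# length). All parameter values are illustrative assumptions.

def airy_spot_diameter_um(wavelength_um=10.0, f_number=1.0):
    return 2.44 * wavelength_um * f_number

def ifov_mrad(pixel_pitch_um=17.0, focal_length_mm=25.0):
    return pixel_pitch_um / (focal_length_mm * 1000.0) * 1000.0  # milliradians

spot = airy_spot_diameter_um()   # ~24 um: comparable to the pixel itself
ifov = ifov_mrad()               # ~0.68 mrad, i.e. ~0.68 mm per metre of range
print(f"diffraction spot ~ {spot:.1f} um, IFOV ~ {ifov:.2f} mrad")
# At 10 m range a single pixel subtends roughly 7 mm, so small or distant
# defects can fall below the resolvable feature size regardless of pixel count.
```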

Applications

Industrial and Engineering Uses

In industrial and engineering contexts, infrared thermography serves as a non-destructive testing method to monitor equipment performance, ensure structural integrity, and optimize operational efficiency across manufacturing and infrastructure sectors. By detecting thermal anomalies such as overheating components or material defects, it enables early intervention to prevent failures, reduce downtime, and lower maintenance costs. This technique is particularly valuable in high-stakes environments such as factories, power plants, and construction sites, where traditional inspection methods may be invasive or time-consuming.

Predictive maintenance represents one of the primary applications of thermography in industry, allowing technicians to identify potential issues in mechanical and electrical systems before they escalate. For instance, it detects electrical faults in switchgear and transformers by revealing hotspots caused by loose connections or insulation degradation, which can prevent catastrophic failures and fires. Similarly, thermography monitors bearing wear in rotating machinery, such as motors and pumps, by capturing elevated temperatures indicative of friction or lubrication deficiencies, enabling scheduled repairs that extend equipment lifespan. In process industries, it identifies steam leaks in piping and valves through visible thermal plumes, reducing energy waste and improving safety in chemical plants and refineries.

Quality control in manufacturing leverages thermography to inspect welds and composite materials, ensuring compliance with stringent standards in sectors such as aerospace. Active thermography, which applies external heat sources to reveal subsurface defects, is commonly used to detect delaminations in carbon fiber composites during production, where even minor flaws can compromise airworthiness, with studies reporting up to 95% accuracy in identifying subsurface anomalies. For weld inspections, post-process thermography identifies lack of fusion or cracks by analyzing residual heat patterns, providing rapid, non-contact evaluation without disassembling components. These applications have been shown to enhance defect detection rates in composites and welds.

Building diagnostics employ thermography to assess thermal performance and identify inefficiencies in structures, supporting energy efficiency and occupant comfort. It reveals insulation gaps in walls and roofs by highlighting areas of uneven heat loss during cold-weather scans, allowing targeted retrofits that can reduce heating costs. In HVAC systems, thermography pinpoints issues such as blocked ducts or faulty heat exchangers through temperature differentials, facilitating proactive maintenance that minimizes energy consumption and extends system durability. This method's ability to provide whole-building surveys in a single pass makes it indispensable for commercial and residential energy audits.

Recent expansions of thermography into renewable energy infrastructure highlight its adaptability to emerging engineering challenges. In solar photovoltaic systems, drone-mounted thermography detects hotspots on panels caused by microcracks or soiling, enabling timely cleaning or replacement to improve energy yields. For wind turbines, it assesses blade stress and subsurface damage by capturing thermal variations during operation, often using active techniques to simulate load conditions and predict failures. These integrations support the reliability of clean energy projects by minimizing operational disruptions.

Medical and Health Applications

Thermography, particularly passive imaging, plays a role in medical diagnostics by visualizing surface temperature variations associated with physiological changes, such as increased blood flow or metabolic activity in diseased tissues. In breast imaging, dynamic thermography detects abnormal vascular patterns and heat signatures indicative of tumors, often through protocols involving cooling and reheating to highlight thermal asymmetries. However, its efficacy remains controversial due to variable sensitivity (pooled around 88%) and specificity (70–90%) compared with mammography, leading to high false-positive rates and recommendations against its use as a standalone tool. The U.S. Food and Drug Administration (FDA) has issued multiple warnings emphasizing that thermography is not approved to detect breast cancer and should not replace mammograms, as unproven claims have misled patients into delaying effective screening. Recent studies explore AI integration to improve image analysis, but clinical validation is ongoing.

For inflammation detection, thermography identifies skin temperature asymmetries linked to conditions such as arthritis and thyroid abnormalities. In inflammatory arthritis, elevated joint temperatures (often 1–2 °C above the contralateral side) correlate with disease activity, enabling non-invasive monitoring of synovial inflammation and treatment response. For thyroid disorders, such as thyroiditis or nodules, thermal patterns show localized warming due to increased vascularity, with AI-enhanced analysis demonstrating good performance in distinguishing benign from malignant nodules by processing thermal data alongside other imaging findings.

In veterinary medicine, thermography aids in diagnosing lameness in horses by detecting heat from inflamed tendons or joints, often identifying issues before overt clinical signs emerge, with temperature differences of 0.5–1 °C signaling early inflammation. It also monitors wound healing by tracking the progressive normalization of thermal profiles, where persistent hotspots indicate infection or delayed recovery in surgical sites.

Post-2020 updates highlight FDA cautions on thermography for broad fever screening during pandemics such as COVID-19, noting inaccuracies in non-contact devices used without proper calibration. Emerging AI-enhanced protocols, however, have improved fever-detection reliability, using machine learning to process thermal images for elevated forehead temperatures (≥37.5 °C) in high-traffic settings, achieving sensitivities over 95% in controlled trials.

Security and Defense

Thermography is extensively employed in border security and facility protection to detect concealed individuals or vehicles through their thermal signatures, enabling effective surveillance in total darkness where visible light is absent. Border surveillance systems (BSS), for example, integrate thermal imaging into fixed towers and small unmanned aerial systems (sUAS), allowing detection of border incursions at distances of 0.5 to 7 miles without requiring visual recognition or ambient lighting. These systems capture the infrared radiation emitted by human bodies or vehicle engines, facilitating the identification and apprehension of unauthorized entrants in remote terrain. Automated thermal panoramic imaging sensors further enhance perimeter monitoring by providing 360-degree, real-time coverage in the 8–14 micron range, automatically classifying and tracking multiple targets based on temperature differentials to alert personnel to potential intrusions. Unlike night-vision technologies that rely on light amplification and falter in pitch-black conditions, thermography passively senses emitted thermal radiation, ensuring reliable operation across weather and lighting scenarios.

In search and rescue (SAR) missions, thermography proves invaluable for locating survivors amid smoke, rubble, or dense foliage, as thermal radiation penetrates or bypasses these obscurants without the need for external illumination. Drone-mounted thermal cameras, using algorithms such as YOLOv5 for detection and Kalman filters for tracking, achieve high performance in challenging environments; for example, in tests over mountainous areas with occlusions and rapid aerial movement, systems demonstrated an average total track life of 0.987 and track purity near 1.0, enabling persistent monitoring at altitudes of 40–60 meters. Convolutional neural networks applied to datasets captured from aerial perspectives further support real-time person detection in wooded or occluded settings, attaining 95% accuracy for partially hidden individuals using mid-range sensors such as the FLIR Vue Pro, thus expediting rescue efforts.

Military operations leverage thermography for surveillance and target acquisition by exploiting thermal contrasts from personnel, vehicles, and equipment, which remain detectable even under cover of darkness or adverse weather. Mid-wave infrared (MWIR) systems excel in long-range identification of heat-emitting targets, such as missile launches or armored vehicles, while short-wave infrared (SWIR) complements this by reflecting off surfaces to pierce haze, fog, and basic camouflage, distinguishing synthetic materials from natural surroundings. In camouflage detection, thermal imaging counters advanced concealment methods, including ultra-light netting systems and adaptive materials that minimize thermal emissions; for instance, U.S. developments such as the Improved Ghillie System integrate thermal-suppressing fabrics, yet high-resolution sensors can still identify residual heat signatures. Russian systems, such as the Ratnik-3 soldier system, incorporate thermal viewers in visors to aid soldiers in spotting camouflaged adversaries, underscoring thermography's role in maintaining tactical superiority.

Advancements in drone-based intelligence, surveillance, and reconnaissance (ISR) have integrated thermography with multi-spectral capabilities, enhancing military effectiveness through compact, airborne platforms. Tactical unmanned aerial systems (UAS), such as the AR4, carry electro-optical/infrared (EO/IR) gimbals for day-night operations, delivering high-resolution feeds for threat assessment and target designation over extended ranges. These drone systems support persistent ISR by fusing thermal data with visible and other spectral bands, improving material identification (such as distinguishing explosives or camouflage materials) via enhanced spectral discrimination, with market analyses projecting multi-spectral adoption to drive detection accuracy in contested environments through the decade.

Biological and Environmental Uses

In wildlife monitoring, thermography enables the non-invasive detection of animals through their heat signatures, facilitating studies of migration patterns, poaching prevention, and population assessments. Drone-mounted thermal cameras have been employed to track large mammals such as deer in forested areas, overcoming visibility challenges posed by dense vegetation and low-light conditions during nocturnal migrations. Thermography also aids real-time anti-poaching efforts by identifying human and animal heat sources in protected reserves, with algorithms processing imagery to alert rangers promptly. For population surveys, it supports counts of elusive species, such as birds during seasonal movements, by capturing surface temperature variations without disturbing natural behaviors.

In plant health assessment, thermography detects physiological stress by measuring canopy temperature variations, which indicate issues such as water deficits, photosynthetic inefficiencies, or pest infestations in agricultural settings. Elevated canopy temperatures signal stomatal closure under drought stress, allowing early intervention to optimize water use and crop yields. Thermal imaging has revealed temperature increases of 0.4 to 0.9 °C in plants infested by parasites, due to reduced transpiration, enabling targeted pest management without chemical overuse. Low-cost thermal imaging systems further enhance precision agriculture in vertical farms and field crops by automating canopy mapping for irrigation scheduling.

Environmental applications of thermography include monitoring volcanic activity, where thermal infrared sensors detect hotspots and lava flows through atmospheric obscuration, providing early warning for eruption forecasting. In wildfire detection, unmanned aerial systems equipped with thermal cameras identify ignition points and fire fronts in real time, even through smoke, supporting rapid response and containment. Urban heat island mapping uses drone-based thermography to visualize surface temperature disparities, revealing hotspots in built environments that exacerbate climate vulnerabilities.

Recent advancements in the 2020s have expanded thermography's role in glaciological research, particularly for melt analysis and thermal modeling. Ground-based and UAV thermal imaging measures supraglacial debris thickness and surface temperatures, correlating them with melt rates to quantify climate-driven glacier retreat. For instance, thermal data from debris-covered ice indicate enhanced melting where insulating effects are minimal, informing models of ice loss. In ecological studies, refined thermal protocols enable consistent body-temperature monitoring, revealing heat stress responses that influence survival under warming scenarios. As of 2025, AI integrations continue to enhance accuracy in these applications, such as automated defect detection in renewable infrastructure and real-time environmental monitoring. These applications underscore thermography's utility in tracking environmental thermal dynamics, with emissivity adjustments accounting for biological tissue variations.

Standards and History

Standards and Regulations

Standards and regulations for thermography encompass a range of international guidelines and legal frameworks that ensure the accuracy, safety, and ethical use of thermographic practices and equipment across various domains. These standards address measurement protocols, device certification, operational safety, and data handling to mitigate risks and promote reliability in applications such as industrial inspections and medical diagnostics.

Key standards include ASTM E1934, which provides a guide for examining electrical and mechanical equipment using infrared thermography, incorporating procedures for emissivity adjustments to support both qualitative and quantitative assessments. Complementing this, ISO 18434 outlines general procedures for applying thermography in machinery condition monitoring and diagnostics, emphasizing consistent data collection under varying operational and environmental conditions to detect anomalies effectively. These standards promote standardized methodologies that enhance the reproducibility and trustworthiness of thermographic evaluations.

Device certification is governed by regulations tailored to specific application areas. For photovoltaic systems, IEC TS 62446-3 specifies requirements for outdoor thermographic inspections, including protocols for identifying defects in modules, cables, and connections to verify system performance and safety. In medical contexts, the U.S. Food and Drug Administration (FDA) regulates thermographic devices as Class I or II medical devices under the 510(k) clearance process, requiring demonstrations of safety and effectiveness for uses such as fever screening or tissue assessment, while prohibiting unapproved claims for diagnostic purposes.

Safety regulations focus on potential hazards associated with thermographic systems. In active thermography, where excitation sources such as lasers are employed, limits on laser power and exposure are enforced by IEC 60825-1, which classifies lasers and mandates protective measures to prevent eye and skin injuries during operation. For security applications involving thermal imaging, data privacy is regulated under frameworks such as the EU's General Data Protection Regulation (GDPR), which treats thermal data revealing health or biometric information as personal data, requiring consent, anonymization, and secure storage to protect individuals from unauthorized processing. Recent developments include emerging EU requirements under the AI Act (Regulation (EU) 2024/1689), in force since August 2024, which classify AI systems used in thermographic analysis, such as automated defect detection or anomaly identification, as high-risk if they impact safety or rights, mandating conformity assessments, transparency, and oversight under the phased implementation beginning in 2024 to address bias and reliability in AI-enhanced processing. Proper calibration of thermographic equipment remains essential to maintain measurement accuracy, using techniques from measurement standards that account for environmental factors and detector drift.

Historical Development

The discovery of infrared radiation, foundational to thermography, occurred in 1800 when the British astronomer William Herschel dispersed sunlight through a prism and measured the temperature across the visible spectrum using thermometers placed in each color band. He observed that the highest temperatures were recorded beyond the red end of the spectrum, in an invisible region he termed "calorific rays," later identified as infrared radiation.

Early efforts to visualize infrared emissions advanced in the early 20th century, culminating in 1929 when the Hungarian physicist Kálmán Tihanyi invented the first infrared-sensitive electronic television camera, designed for anti-aircraft defense to detect the heat signatures of aircraft at night. This device marked the initial step toward practical thermographic imaging by converting infrared signals into visible electronic images.

By the 1960s, thermography transitioned to commercial applications with the development of scanning systems, including the first infrared line scanner introduced by Agema Infrared Systems (formerly AGA) in 1965, engineered specifically for inspecting electrical power lines and enabling non-contact temperature mapping in industrial settings. These mechanical scanning devices laid the groundwork for broader adoption in industry and research.

A major milestone of the 1990s was the commercialization of uncooled infrared detectors, particularly microbolometer arrays, which eliminated the cryogenic cooling required by earlier photon detectors and thereby reduced size, cost, and complexity. In 1997, Agema released the first infrared camera incorporating an uncooled microbolometer detector, significantly expanding thermography's accessibility for field use in industrial inspection and medical diagnostics.

The 2000s saw the integration of digital processing into thermographic systems, with advancements in image capture, storage, and software allowing real-time radiometric handling and enhanced resolution. Falling prices and the miniaturization of cameras, driven by detector improvements, facilitated widespread industrial adoption and the emergence of portable digital thermography for building inspections and electrical testing.

In the 2010s and 2020s, thermography evolved further with smartphone-compatible attachments, such as Seek Thermal's compact modules that clip onto mobile devices for on-the-go imaging, democratizing access for consumers and professionals in fields such as home energy audits. By 2025, AI-driven processing has become integral, with algorithms enhancing resolution, anomaly detection, and interpretation of thermal data; for instance, models automate the identification of heat loss in building facades to improve energy efficiency assessments. These developments have accelerated thermography's integration into everyday tools, fostering innovations in predictive maintenance and automated diagnostics.
