Thermography and Infrared Light
Thermography is the use of an infrared imaging and measurement camera to “see” and “measure” thermal energy emitted from an object. Thermal, or infrared, energy is light that is not visible because its wavelength is too long for the human eye to detect; it is the part of the electromagnetic spectrum that we perceive as heat. Unlike visible light, infrared is emitted by everything with a temperature above absolute zero. Even very cold objects, like ice cubes, emit infrared radiation.
The higher an object’s temperature, the greater the infrared radiation it emits. Infrared allows us to see what our eyes cannot: infrared thermography cameras produce images of this invisible “heat” radiation and provide precise, non-contact temperature measurement.
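The relationship between temperature and emitted radiation can be quantified by the Stefan–Boltzmann law, which states that the total power radiated per unit area grows with the fourth power of absolute temperature. The sketch below (an illustrative example, not part of any camera’s actual firmware; the function name and the emissivity default are assumptions) shows why even an ice cube radiates measurably, while warmer skin radiates noticeably more:

```python
# Stefan-Boltzmann law: radiant exitance M = emissivity * sigma * T^4
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def radiant_exitance(temp_kelvin, emissivity=1.0):
    """Total power radiated per unit area (W/m^2) by a surface at temp_kelvin.

    emissivity = 1.0 models an ideal blackbody; real surfaces emit less.
    """
    return emissivity * SIGMA * temp_kelvin ** 4

# An ice cube at 0 degC (273.15 K) still radiates roughly 316 W/m^2,
# while skin at about 37 degC (310 K) radiates roughly 524 W/m^2.
ice = radiant_exitance(273.15)
skin = radiant_exitance(310.0)
```

Because the dependence is on T to the fourth power, even a small temperature difference between two regions of a body surface produces a disproportionately larger difference in emitted infrared, which is what a thermal camera detects.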
Since ancient times, the detection and monitoring of heat emitted from the body have been used as a diagnostic and management tool. The Egyptians monitored skin temperature changes by moving their fingers across the body surface. The Greek physician Hippocrates was recorded applying wet mud to a cloth and draping it over a patient’s thorax; he judged the area that dried first to be the problematic point, and in doing so took the first ‘thermogram’ over 2,400 years ago.
Vets and owners have been feeling and palpating limbs for centuries to gauge where injuries have occurred. Temperature difference is a key indicator, and nothing new in injury detection. What is new, though, is the sensitivity and degree of objectivity that thermal imaging technology provides: it is 40 times more sensitive than the human hand.