Measurements at the speed of light

FLIR Systems

Researchers on the military test range need to perform a wide variety of thermal signature measurements, so they turn to infrared (IR) cameras to capture the IR light coming off the objects under test. They track the flight of aircraft, rockets, and projectiles. They watch flares as they burn and rockets as they explode. Whether it's fast movement through space or the rapid transformation of an explosion, these events happen at high speed, so to record them, the cameras need to operate at high speed as well. Several factors go into making such rapid measurements and help determine whether a camera is up to the task.

High Sampling Rates

IR cameras have improved markedly in recent years. Not long ago, 100 frames per second (fps) was the maximum frame rate you could expect, providing only one sample every 10 milliseconds. A lot of the activity being studied could take place between frames, so missing information had to be interpolated from what was available. Today's cameras, by contrast, can capture 1,000 fps at the full resolution of 640 × 512 pixels, a tenfold increase in frame rate. That allows users to take many more measurements of whatever they're studying.
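The relationship between frame rate and the interval between samples is a simple reciprocal, which a short sketch makes concrete (numbers taken from the text):

```python
def sampling_interval_ms(fps: float) -> float:
    """Time between successive frames, in milliseconds."""
    return 1000.0 / fps

print(sampling_interval_ms(100))   # 10.0 ms between frames at 100 fps
print(sampling_interval_ms(1000))  # 1.0 ms between frames at 1,000 fps
```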

Say you’re examining how a rocket or a round of ammunition explodes. Researchers want to understand and make computer models of explosions, so they can design better rockets and ammunition. To do that, they need to track the trajectory of various fragments of the exploding object, which can be done with IR cameras that can easily see the pieces through the flames. The transformation involved takes place in under 1 ms, and the fragments travel fast, so sampling at 10-ms intervals misses a lot, whereas there’s more information when the samples are only 1 ms apart.

Another aspect researchers look at is which gases are emitted during an explosion or while a rocket or aircraft engine is in flight. Those are invisible to the naked eye, but obvious to the camera at IR wavelengths. Researchers want to know what the plumes look like at various wavelengths, so they can, for instance, design flares that can be used as decoys to confuse enemy targeting systems. The spectral signature of the flare needs to resemble that of the aircraft, so to optimize the decoy, researchers need to know what the gas emissions look like at any given time. That level of detail requires fast frame rates.

Short Sampling Times

But frame rate is not the only factor in making high-speed measurements. Another consideration is the sampling time, or integration time, of each individual frame: the time during which the detector is actually collecting photons to form the image. The optimal integration time depends on several factors, among them the object's temperature and speed.

As with frame rate, though, a shorter integration time is better for the military test range. Say, for example, you want to catch a bullet in mid-flight to understand its ballistic behavior. In that case, what you need, effectively, is a single still image. As the bullet flies past your camera, you want at least one frame to capture a clear shot of the bullet virtually frozen in flight. If the integration time is too long, the bullet will move through the frame during the exposure and the motion will blur the image. Sampling times on the order of tens of microseconds can avoid such motion blur. Monitoring an explosion, in which transformations happen very quickly and gases and debris move across the scene very fast, requires both high frame rates and short integration times.
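A rough back-of-the-envelope check shows why the integration time matters so much: the distance an object smears across the scene is simply its speed multiplied by the exposure. The bullet velocity below is an illustrative assumption, not a figure from the text:

```python
def blur_distance_mm(speed_m_per_s: float, integration_time_us: float) -> float:
    """Distance an object travels during one integration period, in millimeters."""
    return speed_m_per_s * integration_time_us * 1e-6 * 1000.0

# A rifle bullet at roughly 900 m/s (assumed illustrative value):
print(blur_distance_mm(900, 1000))  # 1 ms integration -> 900 mm of travel
print(blur_distance_mm(900, 10))    # 10 us integration -> 9 mm of travel
```

At a 10 µs exposure the bullet moves only millimeters, which is why such short sampling times can effectively freeze it in flight.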

High-Speed Lenses

The very short sampling times in high-speed measurements mean a small window of opportunity for enough light from the object to reach the detector. With visible-light cameras, you can handle that issue by lighting up the object with a floodlight or a flash, thus assuring a higher rate of photons coming off it. IR cameras, however, work with passive radiation: all the photons they capture are emitted by the object being imaged, and there's no external source to make it brighter. That reliance on emitted radiation makes high frame rates and short integration times a bigger challenge for IR cameras than for visible-light ones.

For that reason, it’s important to have a fast lens. The speed of the lens is determined by the diameter of the aperture, the opening that lets the photons in. A larger lens with a larger aperture can capture more light and transfer it to the pixels in the detector element, allowing for a short integration time—hence a “fast” lens. Apertures are measured in terms of the f-number, which is the ratio of the system’s focal length to the diameter of the opening. The lower the f-number, the larger the aperture and the faster the lens.

But it's not simply a matter of getting a bigger lens with a bigger opening to collect more light. For one thing, a larger aperture means a shallower depth of field, so focusing becomes more challenging, especially on the rapidly moving objects military test ranges are likely to study. A larger lens also adds cost to the overall system. For a high-speed IR imaging system, there is an optimal point at which focusing ability and cost, on the one hand, and light-collecting ability, on the other, balance out.

In older IR cameras, the standard aperture was f/3. These days, the standard is f/2.5. That may seem like a small difference, but because light gathering scales with the area of the aperture, it goes as the square of the diameter. Going from f/3 to f/2.5 therefore yields a 44 percent increase in sensitivity to light.
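Since collected light scales with aperture area, and area goes as the inverse square of the f-number at a fixed focal length, the gain from a faster lens can be computed directly:

```python
def relative_light_gain(f_old: float, f_new: float) -> float:
    """Fractional increase in collected light going from f_old to f_new.

    Light gathering scales as 1 / f_number**2 (proportional to aperture area).
    """
    return (f_old / f_new) ** 2 - 1.0

print(f"{relative_light_gain(3.0, 2.5):.0%}")  # 44% more light from f/3 to f/2.5
```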

Pixel Pitch

Another feature that affects how much light the camera receives is pixel pitch, the distance between the centers of adjacent pixels. In the past, the standard pixel pitch was 15 µm. For today's high-speed cameras, it's 25 µm. A 15-µm pitch corresponds to a light-sensitive area on the detector of 225 µm² per pixel, whereas the larger 25-µm pitch yields 625 µm². That dramatic improvement in area, and thus in light collection, allows a much shorter integration time for an equally bright scene.
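Because pixel area goes as the square of the pitch, the gain is easy to verify; this sketch treats each pixel as a square whose side equals the pitch, a simplifying assumption that ignores fill factor:

```python
def pixel_area_um2(pitch_um: float) -> float:
    """Light-sensitive area of an idealized square pixel, in square micrometers."""
    return pitch_um ** 2

print(pixel_area_um2(15))                        # 225 square micrometers
print(pixel_area_um2(25))                        # 625 square micrometers
print(pixel_area_um2(25) / pixel_area_um2(15))   # ~2.78x the collecting area
```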

Wavelength Range

The waveband in which the measurement is performed is also relevant. In general, IR cameras operate in one of two ranges, either the midwave IR (MWIR), with wavelengths of about 3 to 5 µm, or the longwave IR (LWIR), at 7 to 14 µm. The wavelengths in which objects glow brightest are related to the temperature of those objects. Hotter objects emit more at shorter wavelengths. The hottest objects are therefore easier to see in the MWIR.
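The connection between an object's temperature and its brightest wavelength is captured by Wien's displacement law, λ_peak = b / T, with b ≈ 2898 µm·K. A quick sketch shows why cooler scenes favor the LWIR band and hotter ones the MWIR:

```python
WIEN_B_UM_K = 2898.0  # Wien's displacement constant, in micrometer-kelvins

def peak_wavelength_um(temp_celsius: float) -> float:
    """Wavelength of peak blackbody emission, in micrometers (Wien's law)."""
    return WIEN_B_UM_K / (temp_celsius + 273.15)

print(round(peak_wavelength_um(25), 1))   # ~9.7 um: room temperature peaks in the LWIR
print(round(peak_wavelength_um(700), 1))  # ~3.0 um: very hot objects peak in the MWIR
```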

The detectors in MWIR cameras are made from materials such as indium antimonide (InSb). Recently, LWIR detectors made with Type II Strained Layer Superlattice (SLS) material have been introduced. The SLS technology allows for shorter integration times in the LWIR band. At temperatures from -20°C to 150°C, for example, the integration time for an LWIR SLS camera is 10 µs. For an MWIR InSb camera, the integration time for temperatures from -20°C to 55°C is 1 ms, a hundred times as long. The advantage shifts as temperatures increase: at 350°C, the integration time for the MWIR camera is about 10 µs, while for the LWIR it's 30 µs. Another difference is that an LWIR detector can capture a much wider temperature range than an MWIR detector can.

A major factor in choosing between MWIR and LWIR is the temperature range of the object being studied. If you're looking at cooler temperatures, LWIR provides superior high-speed performance, but often the physics of the particular application will dictate the choice of detector. LWIR cameras are also more expensive; if price is an issue and you don't need the highest-speed detection, an MWIR system might be preferable.

Where to Put the Data?

The final issue to consider is, what do you do with all the data these cameras generate? When you’re capturing 1,000 discrete, full-frame images every second, the total amount of data adds up quickly. You don’t want to lose any of the frames, because each could be the one specific frame that is crucial to understanding the phenomenon that you’re measuring.

One possibility is transmitting the data to a desktop computer. That computer would need a fast, high-capacity hard drive to accommodate all the input, and it's not always practical to link a camera out on a test range to a desktop. It's also harder to guarantee there will be no frame loss with such a setup. Many IR cameras are now equipped with solid-state drives, which operate very quickly and which won't drop any frames. Built-in recording is limited by the capacity of the onboard RAM, commonly 16 GB, which translates to holding approximately 60 seconds of recording at the maximum frame rate.
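A rough sizing calculation shows how quickly the data piles up. The figure of 2 bytes per pixel below is an assumption (actual bit depth and packing vary by camera), so this is an estimate of the raw, uncompressed rate rather than a definitive number:

```python
def data_rate_gb_per_s(width_px: int, height_px: int,
                       bytes_per_px: int, fps: int) -> float:
    """Raw uncompressed data rate, in gigabytes per second."""
    return width_px * height_px * bytes_per_px * fps / 1e9

# 640 x 512 pixels at 1,000 fps, assuming 2 bytes per pixel:
rate = data_rate_gb_per_s(640, 512, 2, 1000)
print(f"{rate:.2f} GB/s")  # ~0.66 GB/s of raw image data
```

At well over half a gigabyte per second, even a 16 GB onboard buffer fills in well under a minute, which is why longer captures call for an external high-speed recorder.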

The other option is to transfer the data to a high-speed data recorder, which consists of a dedicated frame grabber and several SSDs. Such recorders provide storage capacities of up to one terabyte, allowing for longer recording times.

Linking the camera to any external storage device requires some sort of connection. The standard for some time has been Camera Link, a communications protocol that uses sturdy but inflexible cables and limits transmission to a distance of about 5 meters. A newer solution, now integrated into some cameras, is CoaXPress, which uses flexible coaxial cable that can carry data at high speed over distances of tens of meters, making it more practical for many setups.

There are many applications on the military test range that can benefit from IR cameras capable of making high-speed measurements. Various factors affect whether a camera is suitable for such an application, including lens speed, pixel pitch, and the waveband it covers. Today’s cameras offer both high sampling frequencies and short sampling times, which combined increase the amount of useful data that researchers can capture and use in their modeling.