Advances In Radar Imaging
Jul 1, 1999

WHAT IS RADAR?

Radar, a contraction of the words radio detection and ranging, is an electronic device for detecting and locating objects. It operates by transmitting a particular waveform pattern and detecting the nature of the echo (return) signal.1 Radar extends the capability of human senses, especially vision. We can think of radar as a substitute for the eye, although it can do much more: it can see objects through darkness, haze, fog, rain, and snow, conditions impervious to visible light, because its wavelengths are much longer than those of visible or infrared light. The human eye works as a passive device, since the object it sees is illuminated by sunlight or other light sources. Radar, however, produces its own illumination via electromagnetic waves, which makes it an active device.

APPLICATIONS OF RADAR AND RADAR IMAGING

Radar is used in civilian applications as air-traffic-control radar to guide aircraft to a safe landing, and in commercial aircraft as radar altimeters to determine height, as weather-avoidance radars, and as wind-shear radars for navigating in severe weather conditions.

The military uses radar for surveillance and weapons control. Examples include DEW (Distant Early Warning) and AEW (Airborne Early Warning) radars, which detect aircraft; long-range search radars; and guided-missile radars.2

Research scientists use radar as a measurement tool. Radars have been placed on satellites, space modules, and shuttles to explore meteors, planets, and other objects in the solar system.

In the case of an imaging radar, the radar travels along an airplane's or a space shuttle's flight path. The area underneath is illuminated by the radar, and the radar builds up the image as it moves over its footprint (Fig. 1). Fine resolution in the radar image is achieved by using a very long antenna array to focus the transmitted and received energy into a sharp beam.2 The beam's sharpness defines the resolution. Similarly, optical systems such as telescopes require large apertures (mirrors or lenses that are analogous to the radar antenna) to obtain fine imaging resolution. Synthetic Aperture Radar (SAR) is a common and very popular radar imaging technique that achieves very fine resolution.3 In the following sections, we introduce and explain different types of SAR imaging techniques.

SYNTHETIC APERTURE RADAR (SAR)

SAR refers to a technique that synthesizes a very long antenna by combining echoes received by the radar as it travels.4,5 Typically, SAR is used to produce a two-dimensional (2-D) image. One dimension in the image is called range (or cross track), and is a measure of the "line-of-sight" distance from the radar to the target (Fig. 1). Range is determined by precisely measuring the time from a pulse's transmission to the reception of its echo from the target. Range resolution is determined by the transmitted pulse's width (i.e., narrow pulses yield fine range resolution).
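
To make these relationships concrete, the short Python sketch below computes range from the round-trip echo delay and range resolution from the pulse width. The delay and pulse width are purely hypothetical numbers chosen for illustration and do not describe any particular radar.

```python
# Range from the round-trip echo delay, and range resolution from pulse width.
# All numbers are hypothetical and for illustration only.
c = 3.0e8                # speed of light, m/s

t_delay = 1.0e-4         # measured round-trip delay, s
R = c * t_delay / 2      # one-way range: 15,000 m

tau = 1.0e-7             # transmitted pulse width, s
range_res = c * tau / 2  # range resolution of a simple pulse: 15 m

print(R, range_res)
```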

The other dimension is called azimuth (or along track), and is perpendicular to range. Usually, the length of the radar antenna determines azimuth resolution. However, good azimuth resolution would require an antenna far too large to be carried by an airborne platform, because imaging radars operate at much lower frequencies (1 to 10 GHz) than optical systems (on the order of hundreds of terahertz). The required antenna could be several hundred meters long, which obviously cannot be carried by an air vehicle.
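
A rough back-of-the-envelope calculation illustrates why. The sketch below assumes a 10-GHz radar (3-cm wavelength), a hypothetical 10-km slant range, and a desired 1-m azimuth resolution, and solves the beamwidth relation (wavelength / antenna length) for the required physical antenna:

```python
# Real-aperture azimuth resolution is roughly (wavelength / antenna length) x range.
# Solving for the antenna length with hypothetical numbers:
wavelength = 0.03        # m, about a 10 GHz radar
slant_range = 10_000.0   # m, radar-to-scene distance (hypothetical)
desired_res = 1.0        # m, desired azimuth resolution (hypothetical)

antenna_length = wavelength * slant_range / desired_res
print(antenna_length)    # 300 m -- far too long for an aircraft to carry
```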

However, SAR differs from other radars in that it collects data along the flight path as it travels, instead of using a large antenna, so a very small physical antenna is adequate for the job. After collecting the data, SAR processes it as if it came from a physically long antenna. The distance the aircraft flies while synthesizing the antenna is known as the synthetic aperture. The relatively long synthetic aperture produces a narrow synthetic beamwidth, which yields finer resolution than is possible from the smaller physical antenna.
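
The classical stripmap result can be sketched the same way, using the same hypothetical numbers as above: the synthetic aperture is roughly as long as the ground patch illuminated by the real beam, and processing it gives an azimuth resolution of about half the physical antenna length, independent of range.

```python
# Synthetic-aperture (stripmap) azimuth resolution with hypothetical numbers.
wavelength = 0.03        # m
slant_range = 10_000.0   # m
D = 2.0                  # m, physical antenna length carried by the aircraft

L_synth = wavelength * slant_range / D                   # synthetic aperture: 150 m
azimuth_res = wavelength * slant_range / (2 * L_synth)   # = D / 2 = 1 m
print(L_synth, azimuth_res)
```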

SAR is not as simple as described above. Transmitting short pulses to provide range resolution is generally not practical. Typically, longer pulses with wide-bandwidth modulation are transmitted, which complicates range processing but decreases the peak power required of the transmitter. For even moderate azimuth resolutions, the range from the radar to a target changes along the synthetic aperture. The energy reflected from the target must be "mathematically focused" to compensate for this range dependence across the aperture before image formation. Additionally, for fine-resolution systems, range and azimuth processing are coupled (dependent on each other), which greatly increases the computational load. The trick in SAR processing is to correctly match the variation in frequency due to motion (moving target or moving radar) for each point in the image.
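
The range-processing part of this, pulse compression, can be sketched in a few lines of Python with NumPy. The chirp parameters, target delay, and noise level below are all invented for illustration; the point is only that correlating the echo with a replica of the long transmitted pulse (matched filtering) recovers a narrow peak at the target's delay.

```python
import numpy as np

# Pulse compression of a linear-FM (chirp) pulse via matched filtering.
# A long, low-peak-power pulse is transmitted; correlating the echo with a
# replica of the pulse compresses it to a width of roughly 1/bandwidth.
fs = 200e6                  # sampling rate, Hz (hypothetical)
T = 10e-6                   # pulse length, s
B = 50e6                    # chirp bandwidth, Hz
t = np.arange(0, T, 1 / fs)
chirp = np.exp(1j * np.pi * (B / T) * t**2)      # baseband linear-FM pulse

# Simulated echo: the pulse delayed and attenuated, plus a little noise.
delay = 400
echo = np.zeros(4096, dtype=complex)
echo[delay:delay + len(chirp)] = 0.5 * chirp
echo += 0.01 * (np.random.randn(4096) + 1j * np.random.randn(4096))

# Matched filter: correlate the echo with the transmitted pulse.
compressed = np.correlate(echo, chirp, mode="full")
peak = np.argmax(np.abs(compressed))
print(peak - (len(chirp) - 1))   # recovers the delay; peak width ~ fs/B samples
```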

An example of SAR imaging is shown in Fig. 2. The colors in the image reflect the received signal intensity: the strongest signal level is red, the weakest is black. The figure is a SAR image of San Francisco, California, obtained by the Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar (SIR-C/X-SAR) when it flew aboard the space shuttle Endeavour on October 3, 1994. The size of the image is about 26 miles by 36 miles. The center of the area is at 37.83 degrees north latitude, 122.38 degrees west longitude.

This particular SAR image is a good illustration of how SAR distinguishes urban areas from nearby, relatively less populated areas. Densely populated regions such as downtown San Francisco (center) and the city of Oakland (at the right, across San Francisco Bay) show up in red due to the alignment of streets and buildings vis-à-vis the incoming radar beam. The bridges in the area are easily detected by the imaging radar, including the Golden Gate Bridge (left center) at the opening of San Francisco Bay, the Bay Bridge (right center), and the San Mateo Bridge (bottom center). The dark regions on the image represent smooth water. Radar also easily detects the major faults in the area: those bounding the San Francisco-Oakland urban areas and the San Andreas Fault (at the lower left). As seen from the image, faults appear as dark straight lines in the SAR image.

INVERSE SAR (ISAR)

While SAR images a region of the Earth from an airplane or a space shuttle, Inverse SAR (ISAR) images a flying object, such as an airplane or an asteroid, from a land-based radar. ISAR is very popular, and also very critical, in military applications.6 It is commonly used for identification purposes. In a combat scenario with many aircraft in the sky, it is almost impossible to tell which are friendly and which are hostile. In that case, the ISAR imaging technique is used to identify an approaching aircraft and classify it against a collection of possible targets.

In theory, ISAR is an imaging technique that maps the locations of the dominant scattering points of a target from multi-frequency, multi-aspect backscattered data.7 In this data, the signal's amplitude carries the magnitude information of the scattering points on the target, while the backscattered signal's phase carries the location information of those scattering points. After collecting this 2-D raw data, signal-processing tools extract the amplitude and location information of the scattering centers, and a 2-D image of the target is then constructed using a convenient image-processing technique.
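
A minimal sketch of this idea, under a simplified point-scatterer, small-angle model, is given below. The frequencies, aspect span, and scatterer positions are all hypothetical; the sketch generates multi-frequency, multi-aspect backscatter for a few point scatterers and forms the image with a 2-D FFT.

```python
import numpy as np

# Minimal ISAR sketch: backscatter is recorded over a band of frequencies and a
# narrow span of aspect angles, and a 2-D FFT maps the data to a
# down-range / cross-range image. All parameters are hypothetical.
c = 3.0e8
freqs = np.linspace(9.0e9, 11.0e9, 64)            # 2 GHz band around 10 GHz
angles = np.radians(np.linspace(-2.0, 2.0, 64))   # narrow aspect-angle sweep

scatterers = [(0.0, 0.0, 1.0), (3.0, 1.0, 0.8), (-2.0, -1.5, 0.6)]  # (x, y, amp)

F, TH = np.meshgrid(freqs, angles, indexing="ij")
data = np.zeros_like(F, dtype=complex)
for x, y, a in scatterers:
    # Round-trip phase to a point scatterer at (x, y) in the target frame.
    data += a * np.exp(-1j * 4 * np.pi * F / c * (x * np.cos(TH) + y * np.sin(TH)))

# The 2-D FFT of the frequency/aspect data is the ISAR image; the scatterers
# appear as bright peaks at positions proportional to their (x, y) locations.
image = np.fft.fftshift(np.abs(np.fft.ifft2(data, s=(256, 256))))
print(image.shape, image.max())
```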

An example of ISAR imagery is shown in Fig. 3. The model of the test airplane (a C-29 model) is shown in the lower portion of Fig. 3, while the 2-D ISAR image of the airplane constructed from the data is shown in the upper portion. The measurement is taken at a center frequency of 10 GHz with a frequency bandwidth of 16 GHz. The data is collected in 0.1-degree steps to cover the entire 360-degree azimuth. Finally, a 2048 by 2048 2-D grid is constructed by the ISAR algorithm. Comparing the two, it is seen that ISAR imaging provides accurate target information. By looking at this image, it is very easy to identify and classify the aircraft.

ISAR is an active radar operation performed in the target's far field: both the receiving and transmitting antennas must be far away from the target. Recently, new ISAR-like imaging techniques that allow passive radar operation have been developed. The Antenna SAR (ASAR) and Antenna Coupling SAR (ACSAR) imaging techniques use the direct radiation from an antenna mounted on an airplane or a ship, measured in the near field, to image the dominant radiation points on these platforms. In these cases, the radar functions only as a receiver, since the target's own antenna provides the illumination. These techniques are mainly used to determine the dominant radiation points on the target in order to explore ways to cancel or mitigate undesired extra radiation from the target's platform.

The development of fast computers during the 1980s allowed researchers to apply intensive computational electromagnetics (CEM) tools that ultimately led to new SAR/ISAR algorithms. One of the most appreciated and widely used tools is Interferometric Synthetic Aperture Radar (INSAR) imaging, which allows the extraction of height information that can be used to render 3-D topographic views of a SAR scene.

INTERFEROMETRIC SAR (INSAR)

Radar interferometry involves coherently combining radar measurements made by two or more radar antennas displaced by a relatively small distance.8 Depending on the relative geometry of the two antennas, the combined measurements can be turned into measurements of surface topography, topographic change, or displacement over time. Mapping precision of around 2 m in three dimensions over a wide area is now possible from airborne interferometric radars.

Here is how INSAR works: a radar system transmits electromagnetic energy to illuminate the ground terrain to be imaged. Two radar antennas collect the backscattered wave to obtain two different snapshots of the SAR image. To avoid phase ambiguity, these antennas must be close enough to each other. Since the waves travel different distances from a particular scatterer to each antenna, the resulting phases of the two SAR images are different. In the next step, an image called an interferogram is formed by multiplying one SAR image by the complex conjugate of the other. The phase of the interferogram represents the difference in range to the scattering centers of each pixel in the image; these differences are caused by the terrain's topography. A signal-processing algorithm then converts this phase information into the terrain's topographic features. Finally, a 3-D INSAR image of the region is formed by combining the SAR images with the height information.
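
A minimal sketch of the interferogram step might look like the following. It assumes two co-registered complex SAR images are already available as NumPy arrays, and it lumps the full geometric phase-to-height conversion (which depends on baseline, wavelength, range, and look angle) into a single hypothetical scale factor; flat-Earth phase removal and phase unwrapping are omitted.

```python
import numpy as np

def form_interferogram(slc_1: np.ndarray, slc_2: np.ndarray) -> np.ndarray:
    """Multiply one complex SAR image by the complex conjugate of the other."""
    return slc_1 * np.conj(slc_2)

def phase_to_relative_height(interferogram: np.ndarray,
                             meters_per_radian: float) -> np.ndarray:
    """Convert interferometric phase to relative height (unwrapping omitted)."""
    phase = np.angle(interferogram)
    return phase * meters_per_radian

# Tiny synthetic example: a phase ramp in one image stands in for topography.
slc_1 = np.exp(1j * np.linspace(0, 4 * np.pi, 100)).reshape(10, 10)
slc_2 = np.ones((10, 10), dtype=complex)
igram = form_interferogram(slc_1, slc_2)
heights = phase_to_relative_height(igram, meters_per_radian=10.0)  # hypothetical scale
print(heights.min(), heights.max())
```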

An example of INSAR imaging is illustrated in Fig. 4, which depicts Long Valley in east central California. The images were taken by the Spaceborne Imaging Radar-C/X-band Synthetic Aperture Radar (SIR-C/X-SAR) aboard the space shuttle Endeavour during its two flights in April and October 1994. The four images show the steps necessary to produce 3-D data from radar interferometry. The image covers an area of 21 by 37 miles. The radar illumination is from the top of the image. The bright areas are hilly regions of big rocks and pine forest; the darker areas are the relatively smooth, sparsely vegetated valley floors. The curving ridge running across the image's center from top to bottom is the northeast rim of the Long Valley caldera, a remnant crater from a massive volcanic eruption roughly 750,000 years ago.

The image in the upper right is an interferogram of the same region, constructed by combining data from the April and October flights. The different phases are shown as different color levels; these variations are caused by elevation differences in the area, so regions with the same color level are at the same altitude. The image in the lower left shows a topographic map derived from the interferometric data. The bold black contour lines represent levels of elevation, spaced at 250-meter intervals in this particular image. The last image is a 3-D view of the northeast rim of the caldera, looking toward the northwest. As can be seen from the image, geologic structural and landform features such as elevation, vegetation, and soil type can be extracted with the help of INSAR processing.

Another example of INSAR imaging is shown in Fig. 5, which depicts the Mall area of Washington, DC. A similar approach is used to form this 3-D image. The region extends from the Capitol building (top) to the Lincoln Memorial and the Arlington Memorial Bridge (toward the bottom right). The Washington Monument is very easy to spot at the center of the image. The bright areas (from white to yellow) represent places of higher elevation; darker colors (from green to dark blue) represent areas of lower elevation. The Potomac River (bottom right of the image) and the reflecting pool (running from the Lincoln Memorial toward the Washington Monument) appear in dark blue because they are water at the lowest elevations. We can also clearly distinguish Constitution Avenue running from bottom to top. The green regions are at intermediate elevations and consist mostly of vegetation. As seen from the image, the highest elevations are the top of the Washington Monument, the Library of Congress building, and the Capitol building.

CONCLUSION

In this paper, we presented a survey of radar basics and radar imagery. Radar has clearly been a very important and useful tool throughout the 20th century, both in the military and in industry. With developments in computing and new imaging algorithms, it looks like it will be a very critical tool in the 21st century as well. Thanks to new developments in computational electromagnetics (CEM) methods, it is now possible to simulate very complex models and targets at radar frequencies in a reasonable computation time. Examples are Xpatch9 (a high-frequency code that can predict the scattering from large, complex bodies) and FISC10 (a fast simulator of electromagnetic scattering from bodies at high frequencies). As computers continue to grow faster, new electromagnetic simulators are also becoming faster and more efficient. As a result, more compact, faster, and more accurate radar-imaging techniques are being developed.

REFERENCES

  1. Morris, G. V. and Harkness, L. (1996) 'Airborne Pulsed Doppler Radar', Artech House.
  2. Mensa, D. L. (1981) 'High Resolution Radar Imaging', pp. 185-189, Artech House.
  3. Wehner, D. R. (1994) 'High-Resolution Radar', Artech House.
  4. Carrara, W. C., Goodman, R. S. and Majewski, R. M. (1995) 'Spotlight Synthetic Aperture Radar: Signal Processing Algorithms', Artech House.
  5. Franceschetti, G. and Lanari, R. (1999) 'Synthetic Aperture Radar Processing', CRC Press LLC.
  6. Baltes, H. P. (1980) 'Inverse Scattering Problems in Optics', Springer-Verlag.
  7. Chu, T. H. and Lin, D. B. (1991) 'Microwave diversity imaging of perfectly conducting objects in the near-field region', IEEE Trans. Antennas Propagat., vol. 39, pp. 480-487.
  8. Askne, J., et al. (1997) 'C-band repeat-pass interferometric SAR observations of the forest', IEEE Trans. on Geoscience and Remote Sensing, vol. 35, pp. 25-35.
  9. Lee, S. W. (1992) 'Test cases for XPATCH', Electromagn. Lab. Tech. Rept., ARTI-92-4, Univ. of Illinois.
  10. Ctr. Computat. Electromagn. (1997) 'User's Manual for FISC (Fast Illinois Solver Code)', Univ. of Illinois, Urbana-Champaign, and DEMACO, Inc.