Suggested Citation:"4 ADVANCED SENSORS." National Research Council. 1997. Technology for the United States Navy and Marine Corps, 2000-2035: Becoming a 21st-Century Force: Volume 3: Information in Warfare. Washington, DC: The National Academies Press. doi: 10.17226/5864.

4
Advanced Sensors

INTRODUCTION

A major theme of this report is the revolutionary impact of information technology emerging from the commercial sector on the prosecution of military operations. Advanced sensor technology is a crucial element of information collection, and one example of an evolving commercial business area with obvious military applications is airborne and particularly space-based image collection. In the near term, new ventures in this area will offer affordable submeter-resolution panchromatic as well as colorized imagery of most areas of Earth from commercially launched space platforms. Details of these enterprises are provided in Appendix C. In addition to providing new services and products, this industry will drive the development of low-cost, lightweight advanced sensors that will have spin-offs for uniquely military applications.

In spite of the major contributions the commercial sector is likely to make toward satisfying future military sensing requirements, there will always be a subset of those requirements that has no identifiable, profitable commercial counterpart. This chapter explores some of the sensor technologies that will be critical for future military operations, many of which will require DOD and possibly Department of the Navy investment to ensure their robust development and tailoring to naval applications. The focus here is on standoff sensors such as radar and electro-optical systems, which will be the workhorses of future reconnaissance and surveillance platforms. These platforms are expected to be under the control of the Joint Task Force Commander and will provide the information necessary to conduct naval missions and operations. The Department of the Navy must ensure that it provides connectivity to those assets provided by other Services or the National
community, and invest in organic sensors and platforms to meet unique Navy Department requirements that will not otherwise be satisfied. The issue of sensor platform types is also discussed, since the requirements on these platforms are often intimately tied to the capabilities of the sensors they carry.

RADAR TECHNOLOGY ISSUES FOR FUTURE NAVAL WARFARE

Introduction

Warfare in the future will become increasingly dependent on technological force multipliers as numbers of personnel and quantities of equipment shrink in response to economic pressures, and as adversaries avail themselves of similar capabilities in the open marketplace. Surveillance and reconnaissance are two military capabilities that will undergo dramatic growth in performance as a result of the explosion in information technology. Processing technology will enable surveillance coverage rates that are orders of magnitude higher than those achieved today. Wide-band communication via satellite or terrestrial channels will provide surveillance products on demand to warfighters in the field, who will be provided with the data storage and tools necessary to take advantage of them. As military commanders seek near-perfect knowledge of the battle space in which they fight, it is critical that both the capabilities and limitations of these technologies be well understood as we postulate sensors and systems that might exist decades in the future. In all cases, it is necessary to assess such future capabilities in terms of their military utility to find, identify, and prosecute targets of interest to the forces.

To fully appreciate the role of reconnaissance and surveillance on the future battlefield, it is also necessary to extend the several "system of systems" concepts that are emerging as part of the current revolution in military affairs. One such concept is surveillance/precision strike, seeking the seamless integration of emerging highly accurate sensor location systems with the new precision-guided weapons based on GPS, terminal seeker, or other guidance concepts enabling hit-to-kill accuracy in the end game. A second example is the automatic fusion of real-time sensor and intelligence data in the context of various geographic and intelligence preparation-of-the-battlefield (IPB) databases to find and identify individual critical mobile targets such as the Scud transporter-erector launchers (TELs) that caused frustration in the Desert Storm campaign. Each of these needed capabilities suffers today from shortfalls in either basic sensor technology or exploitation technology. Many of these shortfalls will gradually disappear, but some limitations will remain due to physics-based limitations or cost constraints.

Platform Issues

A major issue for the future of reconnaissance and surveillance is the types of platforms in which the Services, and in particular the Navy, should invest.
General categories are space-based, airborne, and shipboard, the latter including both surface and subsurface platforms. A significant focus of this chapter is littoral operations, with particular emphasis on force projection ashore, whether for major regional contingencies, special operations, or operations other than war. Since most such operations will require deep look capability into hostile territory, the major platform competition in the future will be between spaceborne and airborne assets. Secondarily, there will be a competition in the airborne category between organic carrier-based assets and land-based assets within the inventory of the Navy or one of the other Services. Each of these surveillance platforms has its own set of advantages and disadvantages that must be fully understood and weighed against one another to arrive at a reasonable strategy for technology and system investment.

Many people believe that space is the ideal place to put most of the surveillance and reconnaissance assets in the long run. The reasons are many. First, space provides a vantage point from which no point on Earth is denied to a sensor system. Airborne assets are usually required by international law to fly over friendly territory in the absence of an outbreak of hostilities, and in most cases must do the same under wartime conditions out of consideration for safety. In fact, wartime conditions usually drive airborne platforms many tens of miles further back from the front once hostilities begin. As a consequence, airborne sensors are significantly limited in their ability to see deep into enemy territory—active radar sensors because of limitations on power versus slant range, and passive imaging sensors because of loss of spatial resolution due to limited angular resolution. Although these effects are suffered to some extent by both airborne and space-based sensors, airborne sensors are subject to the increasingly deleterious effects of atmospherics and weather as grazing angles become shallower, and sensors that must look at surface or low-flying targets become increasingly screened by terrain masking in theaters such as Bosnia or Korea. By maximizing the grazing angles over which it deploys its sensors, a space-based system minimizes the amount of atmosphere its signals must pass through, and suffers virtually no shadowing effects due to mountains. A doctrine of military intelligence existing from time immemorial is to seize the high ground so as to see the enemy, and clearly there can be no higher ground than space.
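The grazing-angle argument can be made concrete with a flat-slab atmosphere approximation: the one-way path through the atmosphere grows roughly as the cosecant of the grazing angle. A minimal sketch (the model and the angles chosen are illustrative, not drawn from this report):

```python
import math

def relative_path_length(grazing_deg: float) -> float:
    """Approximate one-way atmospheric path length, relative to the
    straight-down (90-degree) case, in a flat-slab atmosphere model.
    Valid only for grazing angles well above zero."""
    return 1.0 / math.sin(math.radians(grazing_deg))

# A space-based sensor looking down at a steep 60-degree grazing angle
# versus an airborne sensor forced to a shallow 5-degree grazing angle:
steep = relative_path_length(60.0)    # only slightly more than the zenith path
shallow = relative_path_length(5.0)   # more than ten times the zenith path
```

The disparity grows without bound as the grazing angle approaches zero, which is one reason standoff airborne sensors suffer disproportionately from atmospherics and weather.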

A second advantage argued for space-based sensor systems is that once the nonrecurring cost of producing and launching the sensors has been paid, the continuing infrastructure cost of utilizing the sensor is dramatically less than that of large airborne surveillance platforms. This argument no longer applies exclusively to space-based platforms, however, because the same case can be made for the evolving UAVs and for proposed future uninhabited reconnaissance aerial vehicles (URAVs). An advantage of a space-based system over URAVs is the high survivability afforded by the platform against potential enemy attack. Since developing and fielding antisatellite weapons will be prohibitively expensive and will require a high degree of advanced technological capability, only the most sophisticated of potential future adversaries will threaten such platforms. A more likely threat to space-based sensors will be electronic countermeasures and other techniques of information warfare, including camouflage, concealment, and deception (CCD).

On the negative side of the argument for space-based sensors is the issue of nonrecurring cost. The special environmental conditions in space obviously drive costly hardware solutions to meet temperature, radiation, and low-operating-power requirements, as well as the mechanical constraints associated with launch stress and foldability of the payload into the launch vehicle. The unique requirements of surveillance itself, moreover, can be significant cost drivers, depending on the level of performance sought. For example, if near-continuous coverage of an area on Earth is required, the choice is generally between one or a few satellites in synchronous orbit and a large constellation of low-altitude satellites such as those being implemented for cellular communications. But if an active radar sensor is required, the option of having satellites in synchronous orbit becomes prohibitive due to the R^4 dependence of received radar power on target range. Even a radar sensor in low orbit suffers from the R^4 dependence, since practical considerations of orbital decay require significant satellite altitude, which maps into slant range requirements at least as severe as those of airborne radars, and typically worse. As a consequence, depending on the nature of the surveillance requirement, even a low-orbiting system may have to carry a very expensive radar sensor. For many tens (or hundreds) of such satellites, the nonrecurring bill could be quite large. Examples of requirements that may fall in this category are wide-area moving target indicator (MTI) surveillance of ground targets and airborne early warning (AEW) radar surveillance in support of theater missile defense. In general, providing surveillance for moving targets strains the ability to field affordable radar solutions from space because it requires a high revisit rate for near-continuous coverage.
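The R^4 penalty can be illustrated with the monostatic radar range equation: for equal sensitivity with all else held fixed, required transmit power scales as the fourth power of slant range. A back-of-the-envelope sketch with purely illustrative slant ranges (not actual system parameters):

```python
def relative_radar_power(slant_range_km: float, ref_range_km: float = 200.0) -> float:
    """Transmit power needed for equal single-pulse SNR, relative to a
    reference radar at ref_range_km, with aperture, target RCS,
    wavelength, and losses all held equal.  Follows the R**4 dependence
    of the monostatic radar range equation."""
    return (slant_range_km / ref_range_km) ** 4

# Illustrative slant ranges for an airborne standoff radar, a low-orbit
# satellite, and a synchronous-orbit satellite:
airborne = relative_radar_power(200.0)    # 1.0 by definition
leo      = relative_radar_power(1000.0)   # 625x the airborne power
geo      = relative_radar_power(36000.0)  # about a billion times the airborne power
```

The synchronous-orbit case makes plain why active radar from that altitude is considered prohibitive, while the low-orbit case shows that even there the penalty relative to airborne radars is substantial.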
Fixed-target imaging systems in space, on the other hand, have been tremendously successful since the required rate for revisiting a given point on Earth may be much lower. Even active imaging systems are more cost-effective when the target is fixed, because they are able to use long integration times for coherent image formation at the desired resolution, with the side benefit of being able to operate with much lower radiated power than short-dwell active systems.

The primary alternatives to space-based surveillance systems are various categories of airborne platforms. Typical of today's land-based airborne surveillance systems are big platforms such as the P-3, the E-3, the E-8, Rivet Joint, and others. Other platforms include the carrier-based E-2C, as well as a variety of smaller signal intelligence (SIGINT) and other special-purpose platforms such as Guard Rail. A major issue to be considered for the future is the place of carrier-based versus land-based surveillance. As the Navy transitions to a posture of littoral operations and force projection ashore, the case for organic platforms versus land-based support is weakened. Furthermore, the concept of joint
warfighting argues that the surveillance assets of the other components will be available to support Navy and Marine Corps forces when required. On the other side of the coin, the number of foreign bases from which these joint assets may stage is diminishing at an alarming rate, and it is not clear that many of these platforms will have the "legs" to reach certain theaters of operation. One potential solution to this problem is the deployment of long-endurance surveillance UAVs, for which the model in today's technology might be the Defense Advanced Research Projects Agency's (DARPA's) Global Hawk (Tier-II+). With future, perhaps larger versions of such platforms, it should be possible to carry large surveillance payloads to the fleet from thousands of nautical miles away, loiter in the operating area for one or more days, and then return safely to the originating base when a relieving platform arrives on the scene. In the near term, land-based airborne surveillance will continue to be dominated by large manned platforms, and the Navy and Marine Corps should provide the appropriate data links and connectivity to these platforms so as to benefit from their presence in joint operations.

The Department of the Navy should also evaluate and potentially develop organic surveillance assets that would be attached to the carrier battle group for those circumstances in the early phases of a conflict where naval forces are first on the scene and do not have the necessary surveillance of the littorals required for the projection of power ashore. Such an organic asset could provide surveillance over land to ranges of 50 to 100 kilometers prior to the appearance of joint airborne or UAV assets in the theater of operations.

In the long term, the concept of URAVs is certainly technically feasible and will soon be demonstrated in near-term advanced concept technology demonstrations (ACTDs). The total operational concept for the deployment of such surveillance systems must, however, be thoroughly understood. For example, the classical problem of providing radar surveillance of ground or airborne moving targets usually requires a radar with a certain power-aperture product in response to a requirement for target radar cross section and slant range. Since this requirement varies as R^4, the size of the radar, and hence of the platform, increases very quickly with separation between radar and target. To satisfy the requirement with a small UAV carrying a limited payload (e.g., the total Tier-II+ payload is 2,000 lb), it may become necessary to penetrate enemy territory to achieve the necessary slant range for target acquisition. Against a strong adversary, this approach may impose a requirement for low observability on the vehicle and its sensor package. In addition to the cost of designing and maintaining a stealthy penetrating vehicle, another factor comes into play: as the vehicle is forced to penetrate enemy territory to overcome the R^4 disadvantage, the radar emissions themselves compromise the survivability of the platform. Thus low probability of intercept (LPI) may have to be added to the requirements of the radar design, which may have a significant impact on range performance. It is not clear where this regression ends, particularly as it affects the cost of the surveillance system as well as its net performance. An alternate solution is to
place a much larger radar sensor with a large power-aperture product at standoff ranges from the enemy's defenses so as to afford a reasonable level of survivability, as well as the radar "horsepower" needed to perform the mission at much longer range. In this case, system designers typically choose a large platform to provide the lift, power, and cooling necessary to support the sensor. Given the large platform, the tendency has been to populate the aircraft with large crews to support and exploit the system's full capabilities on board.
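The payload trade described above can be sketched by inverting the radar range equation: with everything else fixed, detection range scales as the fourth root of the power-aperture product. The reference values below are hypothetical, chosen only to show the shape of the trade:

```python
def detection_range_km(power_aperture: float, ref_pa: float, ref_range_km: float) -> float:
    """Detection range achievable with a given power-aperture product,
    scaled from a reference system via the R**4 radar range equation
    (same target RCS, wavelength, dwell time, and losses assumed)."""
    return ref_range_km * (power_aperture / ref_pa) ** 0.25

# Hypothetical numbers: a large standoff radar with a power-aperture
# product of 100 (arbitrary units) reaches 250 km.  A small UAV radar
# limited by payload to 1 unit must close to under 80 km -- hence the
# pressure to penetrate enemy territory.
standoff_km = detection_range_km(100.0, 100.0, 250.0)
uav_km = detection_range_km(1.0, 100.0, 250.0)
```

A hundredfold reduction in power-aperture costs only a factor of about 3.2 in range, which cuts both ways: it explains why modest payloads retain useful short-range capability, and why very long range demands disproportionately large sensors and platforms.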

There is a tendency to view the large platform solution that is common in today's surveillance inventory as being driven by the on-board exploitation requirement. In fact, the size of the sensor may be the more fundamental driver. In the future, it should be possible to combine these two modes of surveillance by having much larger unmanned or lightly manned platforms capable of carrying standoff radars, but without putting so many people in harm's way, and without the expense of training and deploying large, highly trained flight crews. Wideband, secure satellite or point-to-point communications will enable near-real-time reconnaissance and surveillance to be done wherever the commander would like, and automated flight control systems together with telepresence will permit large URAVs to replace manned standoff platforms in the future. Such platforms will be able to interoperate with small, stealthy penetrating UAVs carrying passive sensors or active LPI imaging sensors. A standoff ground surveillance system could provide high-quality wide-area moving target detection, location, and target development, tasking the small systems to provide positive target identification based on cues from the "mother ship." The small systems could also complement the coverage of the standoff platform using LPI "spot mode" moving target indicator and synthetic aperture radar capability, together with intermittent operating protocols, to provide survivable focused surveillance of small areas, such as those screened from the large URAV by mountains. When cued by intelligence information to a narrow search area, this mode of operation could also be used to find deep targets out of range of the standoff system. 
Such a configuration of large standoff URAVs supported by constellations of small penetrators could provide the target development necessary for weapon delivery systems such as the arsenal ship and submarine-launched ballistic and cruise missiles, as well as for today's shipboard and airborne weapon delivery systems.

Key Radar Technology Areas

Radar technology development is likely to continue its evolutionary pace over the next several decades. Advances in solid-state transmit/receive (T/R) modules will include higher output power, greater direct-current-to-RF conversion efficiency, and increasing miniaturization. Even more importantly, costs will drop dramatically as production volumes increase, leading to extensive use of this technology in future systems. This will enable a variety of active array designs with two-dimensional electronic beam steering and dynamically
reconfigurable apertures that will optimize multimode radar performance. Multipolarization and multifrequency shared apertures will enhance the information-gathering capabilities of future systems for such purposes as target classification, and will aid in the rejection of various sources of clutter. Large apertures processing wide-instantaneous-bandwidth signals at large scan angles will be enabled by photonic manifold technology or by direct digital techniques. Fighter radars will exploit T/R module technology to provide a variety of sophisticated air-to-air and air-to-ground modes, both detection and imaging. The higher average power achieved will enable fire control solutions at very long range against conventional targets, and will begin to have benefit against small-cross-section threats. Ship-based air defense radars will see a similar benefit in enhanced sensitivity as well as flexibility in the prosecution of multiple simultaneous fire control solutions.
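Electronic beam steering in such active arrays reduces, for each T/R module, to applying a programmed phase shift. A minimal one-dimensional sketch (the array geometry and frequency are illustrative only):

```python
import math

def steering_phases_deg(n_elements: int, spacing_m: float, wavelength_m: float,
                        steer_deg: float) -> list:
    """Per-element phase shifts (degrees, wrapped to [0, 360)) that steer
    a uniform linear array of T/R modules to steer_deg off broadside."""
    k = 2.0 * math.pi / wavelength_m        # wavenumber
    phases = []
    for n in range(n_elements):
        phi = k * n * spacing_m * math.sin(math.radians(steer_deg))
        phases.append(math.degrees(phi) % 360.0)
    return phases

# X-band (3 cm wavelength) array with half-wavelength element spacing,
# steered 30 degrees off broadside: the phase advances ~90 degrees per element.
phases = steering_phases_deg(8, 0.015, 0.03, 30.0)
```

Because the phase commands are just numbers loaded into each module, the beam can be repointed in microseconds, which is what enables the dynamically reconfigurable, multimode operation described above.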

Exciter and receiver technology will achieve increasing levels of stability, permitting the detection of small moving targets in very-high-clutter backgrounds. As analog-to-digital converter technology improves in both sampling rate and dynamic range, more receiver functions will be performed digitally, leading to precise control of receiver characteristics and channel matching. Many antenna designs will exploit multiple subarrays on receive to enable a variety of array signal processing techniques to be applied. Space-time adaptive processing (STAP) of digitally equalized subarray channels will provide unprecedented levels of MTI performance. The ability to form multiple simultaneous (or near-simultaneous via pulse interleaving) beams will permit SAR images of multiple areas to be created in a single coherent collection interval. Ultimately, particularly at lower RFs, conversion to digital signals may occur at the output of each T/R module, leading to all-digital array manifolds and receivers. Achievement of this long-term goal will provide the ultimate in flexibility for array processing, with the most sophisticated of algorithms possible, provided that a sufficiently powerful signal processor is available.
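The adaptive processing of digitized subarray channels can be illustrated with the minimum-variance (Capon) beamformer, the spatial-only special case of the space-time processing described above. A toy numerical sketch (the channel count and jammer strength are arbitrary):

```python
import numpy as np

def adaptive_weights(R: np.ndarray, steering: np.ndarray) -> np.ndarray:
    """Minimum-variance adaptive weights for a set of digitized channels:
    w = R^-1 s / (s^H R^-1 s).  STAP applies the same idea jointly over
    space (subarrays) and time (pulses)."""
    Rinv_s = np.linalg.solve(R, steering)
    return Rinv_s / (steering.conj().T @ Rinv_s)

# Toy 4-channel example: unit-power noise plus a strong jammer.
n = 4
target = np.exp(1j * np.pi * np.arange(n) * np.sin(np.radians(10.0)))
jammer = np.exp(1j * np.pi * np.arange(n) * np.sin(np.radians(-40.0)))
R = np.eye(n) + 100.0 * np.outer(jammer, jammer.conj())  # interference covariance
w = adaptive_weights(R, target)

# Distortionless gain on the target direction, deep null on the jammer:
target_gain = abs(w.conj().T @ target)
jammer_gain = abs(w.conj().T @ jammer)
```

The weights preserve unity gain toward the target while placing a null on the interference, which is the mechanism behind both the MTI clutter cancellation and the jammer suppression discussed elsewhere in this chapter.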

The radar signal processor will be the most critical element in any sophisticated radar of the future. The tendency to exploit COTS solutions involving massively parallel configurations of high-performance processors will continue. Processing systems with teraflop levels of performance will be employed for radar signal processing, tracking, and target identification algorithms. Algorithm design and signal processing software development will be areas of increasing focus, with sophisticated graphically based development tools providing rapid prototyping capability. Similarly, detailed control and dynamic reconfiguration of radar will become increasingly software dominated, with the hardware elements tending toward programmable universal smart modules with embedded processors capable of a wide range of performance when commanded over digital control buses.

Several radar system concepts may reach maturity over the next several decades. Bistatic radar has held much promise but has been hindered by implementation difficulties and a lack of well-defined concepts of operation. Hybrid systems involving spaceborne radar illuminators and stealthy UAVs carrying bistatic receivers and signal processors may prove their military advantage in hostile environments. Similarly, the bistatic exploitation of existing signals in the environment may lead to practical low-cost radar detection or imaging equipment. AEW systems using the lower RF bands together with STAP processing will permit the detection and tracking of low-cross-section aircraft and missiles. In radar imaging, SAR image resolution will continue to improve within limits imposed by atmospheric distortion. Advances in autofocusing techniques will likely extend the ranges at which these higher resolutions can be achieved, and operational microwave SARs with several-inch resolution should be achievable.
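The resolution limits mentioned here follow from the stripmap SAR relation: azimuth resolution is approximately the wavelength times range divided by twice the synthetic aperture length. The very long apertures this implies at standoff range are one reason autofocusing matters. A sketch with illustrative numbers:

```python
def synthetic_aperture_m(wavelength_m: float, range_m: float, resolution_m: float) -> float:
    """Synthetic aperture length needed for a given azimuth resolution:
    L = lambda * R / (2 * delta_az), the stripmap SAR approximation."""
    return wavelength_m * range_m / (2.0 * resolution_m)

# X-band (3 cm) SAR at a 100 km standoff range, seeking roughly
# 4-inch (0.1 m) azimuth resolution: the platform must coherently
# integrate over about 15 km of flight path.
L = synthetic_aperture_m(0.03, 100e3, 0.1)
```

Holding phase coherence over kilometers of flight path in a distorting atmosphere is precisely the problem that motion compensation and autofocus algorithms must solve.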

The explosion in digital processing technology will provide great benefits in reducing the vulnerability of radars to electronic countermeasures (ECMs). Electronically scanned arrays together with sophisticated algorithms will be employed to sense the jamming or casual interference environment in time, space, frequency, and polarization dimensions, adapting the radar's operation in real time to changing conditions. Spatial/temporal adaptive cancellation algorithms will not only provide greatly increased levels of performance against conventional sources of direct interference, but will also be effective against airborne jammers employing terrain-scattered signals. The use of T/R modules will provide very wide radar operating bands, making frequency agile radars more effective against moderately sophisticated reactive countermeasures.

Sensor Exploitation Issues

One of the weakest links in current sensor-to-shooter concepts is the capability to derive classification or identification information from sensor data. If lighting and weather conditions permit the collection of high-resolution optical imagery of a target complex, human operators and increasingly powerful machines can provide highly reliable classification of selected targets, but wide-area high-resolution imaging requires a high degree of automation. Electronic intelligence (ELINT) collectors can frequently provide precise classification of signal-emitting complexes based on the unique signatures observed. The most reliable long-range all-weather sensor is radar, however, and the state of the art in this realm is by no means complete. A great deal of investment has been made in automatic classification of targets seen in SAR imagery, with some success against particular targets in the clear. Much is left to be done to achieve robust classification at low false alarm rates for targets partially screened by foliage or other obstructions, and for targets deliberately camouflaged by an enemy to defeat the classifier. As sensor resolutions improve, yielding more pixels on target, and as processing technology continues to advance at its rapid pace, enabling more sophisticated algorithms to be employed, the performance of the classifiers will improve steadily, and should provide a very powerful capability over the next several decades. This applies primarily to microwave-region SAR systems. At the low end of the RF band, ongoing research seeks to develop foliage penetration (FOPEN) and ground penetration (GOPEN) SAR imaging systems capable of finding and classifying obscured or buried targets. In contrast with microwave SAR, these systems suffer from a fundamental conflict between the desire to utilize the lowest RF possible so as to achieve maximum penetration of the obscuring medium and the desire to maximize RF bandwidth to achieve high-resolution images. This area of investigation should converge to some optimum balance between these mutually exclusive requirements, and performance may not improve further beyond that point. Nonetheless, this technology may provide great military value based on modest levels of classification performance.
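The FOPEN conflict can be quantified through the range-resolution relation ΔR = c/2B: a low carrier frequency caps the usable bandwidth B, which in turn caps resolution. The bandwidths below are illustrative only:

```python
C = 299_792_458.0  # speed of light, m/s

def range_resolution_m(bandwidth_hz: float) -> float:
    """Slant-range resolution c / (2B) for a waveform of bandwidth B."""
    return C / (2.0 * bandwidth_hz)

# A FOPEN radar pushed down to VHF for canopy penetration can never use
# more bandwidth than its own band offers: even the (optimistic) entire
# 30-300 MHz band yields ~0.56 m resolution, while a microwave SAR can
# use a 600 MHz waveform for ~0.25 m -- and far more in principle.
vhf_res = range_resolution_m(270e6)
microwave_res = range_resolution_m(600e6)
```

Since lowering the carrier frequency improves penetration but shrinks the available bandwidth, the two design goals pull directly against each other, which is the "optimum balance" the text anticipates.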

The collection capability of future SAR systems will result in astronomical quantities of imagery at very high resolution. The fundamental limitations on SAR collectors today are not the sensors themselves, but the signal processors necessary to generate imagery and the data links necessary to disseminate it. Both of these limitations will all but disappear in the coming decades, resulting in collection systems capable of imaging entire theaters of operations at very high resolution daily, or even several times a day. Human operators will be relegated to examining imagery that has been highly screened by automated techniques so as to reduce the volume to a level manageable by a finite number of interpreters. The automatic target recognition (ATR) problem of finding and keeping track of mobile targets will be greatly assisted by automatic change detection applied to SAR and other imagery, and three-dimensional interferometric processing of multipass SAR imagery will add target height, providing another dimension of information to enhance ATR performance. Dual-polarization and multispectral SAR, augmented by multispectral passive imaging under benign weather and cloud conditions, will further increase the information content for each target on which the ATR system operates. In response to this pervasive fixed-target surveillance capability, future enemies will engage in ever more sophisticated CCD techniques to evade detection. This will drive sensor and ATR developers to seek ever-increasing capabilities from their systems to identify partially obscured or disguised targets. It is not clear today which side will gain the upper hand in the long run, but economics will ultimately be the governing factor.
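Automated change detection of the kind described can be sketched, in its simplest form, as thresholding the amplitude ratio between two registered passes over the same scene. The array sizes and threshold below are arbitrary, and operational systems add speckle filtering and coherent techniques:

```python
import numpy as np

def amplitude_change_mask(before: np.ndarray, after: np.ndarray,
                          threshold_db: float = 6.0) -> np.ndarray:
    """Flag pixels whose amplitude ratio between two registered SAR
    passes exceeds threshold_db in either direction -- a crude stand-in
    for the automated screening that precedes human review."""
    eps = 1e-12  # guard against division by zero in dark pixels
    ratio_db = 20.0 * np.log10((after + eps) / (before + eps))
    return np.abs(ratio_db) > threshold_db

# Toy scene: uniform background, with one strong scatterer (e.g., a
# newly arrived vehicle) appearing between passes.
before = np.ones((4, 4))
after = before.copy()
after[2, 2] = 10.0
mask = amplitude_change_mask(before, after)   # flags only the changed pixel
```

Only flagged pixels (or chips around them) would be forwarded to interpreters or to an ATR stage, which is how the imagery volume is reduced to a manageable level.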

A relatively embryonic area of radar exploitation is in the area of moving ground targets. Virtually all airborne or spaceborne ground surveillance systems are either imaging systems, or systems that exploit a target's own emissions. Consequently, there is very little knowledge in the surveillance community about the potential benefits of moving target exploitation, and almost no prior body of knowledge on the subject. Moving target exploitation comes in several flavors, starting with the knowledge to be gained from the basic scan-to-scan detection picture obtained from a wide-area airborne ground surveillance system. Future MTI systems will have very high revisit rates on the order of once every few
seconds over very large areas (e.g., 50,000 square kilometers). These systems will have high power-aperture products to achieve the rapid revisit rate, unaided circular error probable (CEP) values of a few tens of meters at extreme sensor range, and high single-scan signal-to-noise ratio on individual targets to exploit unique signatures associated with rotating or other articulating parts of the target. The MTI modes will operate at very high range resolution so as to measure the length and down-range radar cross section (RCS) profile of each moving target, providing a crude one-dimensional ATR capability as well as a powerful association variable for maintaining track continuity. All moving targets will be automatically tracked, and individual targets will be aggregated into groups based on various rule-based filtering criteria. The RCS templating concept will be extended to groups of targets, enabling multiple-hypothesis tracking algorithms to reacquire convoys hours after they have disappeared behind a mountain or driven into a foliage-obscured area. The ability to track each moving entity on the ground indefinitely will provide a mechanism for aggregating knowledge about that object over time, whether it is obtained from the on-board radar itself or from off-board sources of sensor or intelligence data. The radar itself can be used to generate high-resolution ISAR images of selected targets as their track state is predicted to pass through curved road segments in the on-board geographic road network databases. These images, when classified by an ATR subsystem, would add to the knowledge base being built within that target's track file. Ultimately, the ability to indefinitely track and classify most moving objects on the ground, together with intelligence data on the source and sink locations of vehicle movement, should permit a complete dissection of the enemy's infrastructure and yield a real-time ground order of battle.
It will also provide a powerful mechanism for maximizing the utilization of narrow field-of-view sensor systems such as high-resolution spot SARs. Prior to an outbreak of hostilities, such a detailed analysis of the enemy's movement patterns should telegraph that adversary's intentions to intelligence analysts. In a peacekeeping mission, noncompliance with treaty obligations will be readily apparent in near real time as the movement of armaments and personnel is tracked throughout the theater.
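The length and down-range RCS profile described above can serve directly as track-association features. The following is a minimal sketch of that idea, assuming a hypothetical data layout, gate width, and correlation threshold (none of which come from the report): candidates are gated on measured length and then ranked by normalized correlation of their RCS profiles.

```python
import numpy as np

def profile_similarity(p1, p2):
    """Normalized cross-correlation of two down-range RCS profiles
    (zero-mean, unit-norm), used as a crude 1-D template match."""
    a = (p1 - p1.mean()) / (np.linalg.norm(p1 - p1.mean()) + 1e-12)
    b = (p2 - p2.mean()) / (np.linalg.norm(p2 - p2.mean()) + 1e-12)
    return float(np.dot(a, b))

def associate(track, detections, length_gate_m=3.0, min_corr=0.7):
    """Pick the detection whose length matches the track's and whose
    RCS profile correlates best; return its index, or None if no
    candidate passes the gate and threshold."""
    best, best_score = None, min_corr
    for i, det in enumerate(detections):
        if abs(det["length_m"] - track["length_m"]) > length_gate_m:
            continue  # length is the coarse gate
        score = profile_similarity(track["rcs_profile"], det["rcs_profile"])
        if score > best_score:
            best, best_score = i, score
    return best
```

A fielded multiple-hypothesis tracker would of course carry many hypotheses and kinematic gates rather than this single greedy choice; the sketch only illustrates how the one-dimensional RCS profile adds discriminating power.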

Surveillance in Systems of Systems

The seamless integration of surveillance functions into an operational architecture for prosecuting a specific military mission represents a challenging systems engineering problem. Many of the architectures used in the past were developed on an ad hoc basis in response to an urgent need, and they do not come close to optimizing the utilization of the various subsystems composing the architecture. The continuing revolution in information and communications technology will provide the physical basis for optimizing these architectures, but tremendous strides must be made in developing and automating viable concepts of operations for such large meta-systems. The development of these capabilities will require the coordination of large government and industry teams, both to create the new architecture and to engineer new interfaces that permit legacy systems to be incorporated into it. Given that a single major platform development by one prime contractor is already a highly challenging engineering task, the creation of a system-of-systems architecture will demand considerably greater effort.

One example of such a desired architecture is surveillance/precision strike for near-real-time detection and identification of surface targets, followed immediately by their assignment in priority order to appropriate strike systems already in the field. This architecture would provide for the timely prosecution of moving ground targets, or mobile targets whose agility places their relocation period inside current planning cycles. Among the design challenges to be faced are the networking of surveillance, C2I, and strike systems via appropriate digital communications, the prioritization of targets based on importance and vulnerability, the pairing of targets with appropriate weapons, and the dynamic positioning of forces to optimize system effectiveness. The surveillance component of this architecture must be designed to provide a sufficiently high number of viable target nominations per unit time to keep the strike component efficiently employed, with emphasis on finding the target types that are known a priori to be of highest military value. Since targets may be fixed or moving, both SAR and MTI radar capability must be employed. As described above, wide-area coverage will be possible for both radar modes, and technology will permit exhaustive SAR imaging of large areas in the future to match the wide-area MTI capability available today. The fusion of SAR change detection products and FOPEN radar products with MTI will provide a further enhancement to the long-term tracking of mobile targets.
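The target-weapon pairing step named above can be illustrated with a deliberately simplified greedy sketch. All field names, the value ordering, and the compatibility rule here are illustrative assumptions, not the report's design; a real system would also weigh vulnerability, geometry, weapon inventory, and timing.

```python
def pair_targets(targets, weapons):
    """Greedy pairing: take the highest-value target first and match it
    to the first remaining weapon whose effectiveness list covers the
    target's class. Returns a list of (target_id, weapon_id) pairs."""
    assignments = []
    free = list(weapons)
    for tgt in sorted(targets, key=lambda t: t["value"], reverse=True):
        for w in free:
            if tgt["class"] in w["effective_against"]:
                assignments.append((tgt["id"], w["id"]))
                free.remove(w)  # each weapon is expended once
                break
    return assignments
```

Even this toy version shows why prioritization matters: assigning the flexible weapon to the low-value target first could leave the high-value target unserviced.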

A major challenge in the surveillance component of this architecture is the target identification problem discussed above. The degree of confidence required in the identification is a function of the particular scenario. In a free-fire zone presumed to contain only enemy combatants, the main concern is the efficient use of ordnance, and the allowable false-identification probability may be relatively high. On the other hand, for a decision to attack targets close to a battle area where ground forces are engaged, the acceptable level of error may be so low as to preclude achieving it with the state of the art in available radar sensor and ATR technology. In these instances, other sensors such as high-resolution electro-optical/infrared (EO/IR) or SIGINT, specific human intelligence (HUMINT) reports, or the absence of the expected identification-friend-or-foe (IFF) response may be required to reach the level of confidence necessary to shoot. In operations other than war, political considerations may be of overriding importance in deciding what level of identification is sufficient. If the goal of the surveillance/precision strike architecture is to achieve timely execution through automation of most functions, then the algorithms will need to be extremely sophisticated in order to adapt to such scenario-dependent constraints. More likely, there will always be people in the loop to handle these tough questions. These operators will require rapid access to a great deal of information in as efficient a manner as possible so that they do not become the limiting factor in the targeting cycle.

ADVANCED ELECTRO-OPTICAL SENSING TECHNOLOGIES

Introduction

A number of enabling electro-optical phenomenological and sensing technologies will be important for future military applications. This section begins with a broad overview of the available technologies and then, for each, considers more specific platform basing and functional applications, including intelligence, surveillance, and reconnaissance (ISR), targeting, weapon delivery, and threat warning systems.

Enabling Technologies

Passive Multispectral and Hyperspectral Imaging

Over the last decade, it has become increasingly clear that color can provide significant information of military value for both manmade objects (targets) and natural backgrounds (clutter). Initial systems will be multispectral, with tens of relatively coarse bands (e.g., 50 to 100 nm wide) spread over the visible, near-infrared (NIR), and short-wave infrared (SWIR) spectral range, and with spatial resolutions that, depending on the basing, can range from very high (inches to feet) to coarse (meters). These sensing concepts will be used predominantly in the daytime. The technology to achieve such capability is off the shelf and will even be available in commercial applications.

For example, Landsat and Systeme Probatoire d'Observation de la Terre (SPOT) satellite multispectral imagery in the visible through NIR portion of the spectrum has been used for terrain classification, trafficability assessment, and change detection, as well as for commercial applications in agriculture, geology, and resource monitoring. There is even promising work by some groups in detecting military targets that are significantly underresolved (say, 5 percent pixel fill) in Landsat imagery. The U.S. Air Force has successfully demonstrated reliable automatic target detection in airborne multispectral imagery in the visible through SWIR region. The Navy and Marine Corps have also demonstrated reliable detection of surface minefields from the Pioneer UAV using six bands in the visible region.

Most airborne spectral imagers are multispectral, with 5 to 20 bands spanning the visible to SWIR region. Some systems are confined to wavelengths shorter than the cutoff wavelength of silicon detectors at about 1 micron. Typical spectral bandwidth is about 100 nm. Typical instantaneous field of view (IFOV) is about 1 mrad, giving ground resolutions ranging from a few centimeters from low-altitude UAVs to 20 m from U-2 altitudes. In many cases these sensors have been developed more for commercial mapping applications than for military applications and can be expected to become available at decreasing cost as time progresses.
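The quoted ground resolutions follow directly from the small-angle relation between IFOV and range. A one-line check (the specific ranges are chosen for illustration):

```python
def ground_resolution(ifov_rad, slant_range_m):
    """Ground sample footprint ~= IFOV * range (small-angle approximation)."""
    return ifov_rad * slant_range_m

# A 1-mrad IFOV gives ~0.3 m from a 300-m UAV and ~20 m from ~20-km U-2 altitude
low = ground_resolution(1e-3, 300.0)
high = ground_resolution(1e-3, 20_000.0)
```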

A natural evolution of the sensing concepts outlined above would be to operate in the same spectral regions but measure many (say hundreds) of narrow (10 nm or so) bands. This hyperspectral measurement approach is enabled by the advances in large focal plane array (FPA) technology and can support the sensing of very narrow spectral features for specific target discrimination tasks or to enable adaptive measurement of fewer or coarser bands (through postdetection aggregation) depending on the application or mission of interest. At present, very compact and efficient instruments can be built in the visible spectral range; complexity grows as multiple FPAs are needed to span a broader spectral range.

Hyperspectral sensors are currently being developed primarily for research into military applications. One well-known experimental sensor is HYDICE, which has 210 bands from 0.4 to 2.5 microns with a 0.5-mrad IFOV. Flown in a Convair CV-580, it achieves 1-m resolution on the ground. Hyperspectral imaging is usually obtained by using a slit and a prism or grating spectrometer. This instrument produces a two-dimensional image with one dimension of space and the other dimension of wavelength. The true image is built up by "pushbrooming" the sensor with aircraft motion. The number of spatial samples and wavelength bands is determined by the size of the focal plane array. Silicon arrays for visible and NIR sensing can easily be in the 1,000 × 1,000 size. Another example, the Navy PHILLS sensor, uses a silicon CCD detector and has about 400 bands in this region. The spectral bandwidths of this system are even smaller than 10 nm. In the SWIR, a different detector material must be used. Indium antimonide (InSb) is often the choice (as was true with HYDICE). InSb arrays are limited to about 512 × 512 in size. One can expect further increases in size and decreases in cost. The signal-to-noise ratio (SNR) of current systems ranges from 50 to 200. It is limited primarily by detector noise, which perhaps will improve with further research in InSb and other detector materials.
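The pushbroom geometry described above can be sketched as stacking per-frame slices, each with one cross-track spatial axis and one wavelength axis, along the flight direction to form the hyperspectral cube. The array sizes below are illustrative, not those of any particular sensor.

```python
import numpy as np

def build_cube(frames):
    """Stack pushbroom frames (each cross_track x n_bands) along the
    along-track axis to form an (along, cross, bands) hyperspectral cube."""
    return np.stack(frames, axis=0)

# e.g., 3 frames of 4 cross-track pixels x 5 bands -> a (3, 4, 5) cube
frames = [np.random.rand(4, 5) for _ in range(3)]
cube = build_cube(frames)
spectrum = cube[0, 0, :]   # full spectrum of one ground pixel
```

The number of cross-track samples and bands is thus fixed by the focal plane array format, while the along-track dimension grows with flight time, exactly as described above.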

Processing of hyperspectral imagery is still in its infancy. If a sensor is well calibrated (and few are), the spectral signature from a pixel containing only one type of material may be fairly easily matched to a (laboratory) reference spectrum. However, in actual images few pixels contain just one type of material. In this case, spectral unmixing methods must be used: the pure signatures present in a scene must be determined (sometimes by examining the scene to identify pure, or "basis," spectra) and used to estimate the proportions of the different materials found in any one pixel. It is computationally difficult to determine these pure signatures in a hyperspectral space of typically 100 to 200 dimensions. The problem is complicated by the fact that scene composition, and therefore the "basis" spectra, changes from place to place in the scene. It is expected that considerable improvement in this process will come from further work and experience with hyperspectral data.
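Once the basis spectra are in hand, the per-pixel unmixing step itself is a small linear problem. The sketch below uses unconstrained least squares with made-up endmember spectra; a fielded system would add nonnegativity and sum-to-one constraints on the abundances.

```python
import numpy as np

def unmix(pixel, endmembers):
    """Least-squares linear unmixing: solve pixel ~= E @ a for the
    abundance vector a, where E is (n_bands, n_materials)."""
    a, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    return a

# two hypothetical endmember spectra over 4 bands (columns of E)
E = np.array([[0.9, 0.1],
              [0.8, 0.2],
              [0.3, 0.7],
              [0.2, 0.9]])
mixed = 0.3 * E[:, 0] + 0.7 * E[:, 1]   # pixel that is a 30%/70% mixture
abundances = unmix(mixed, E)
```

With hundreds of bands and only a handful of materials the system is heavily overdetermined, which is what makes subpixel detection possible at all; the hard part, as the text notes, is finding the endmembers.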

A second important growth path is to exploit the spectral character of clutter and manmade objects in the mid- and long-wave IR spectral regime. Recently, the Air Force and Navy have shown through single-pixel Fourier transform spectrometer measurements that military targets (including those employing camouflage, concealment, and deception (CCD)) have good color contrast with the background. This contrast is often 1 to 2 percent of the total available signal. The measurements have also shown that for individual narrow bands, significant band-to-band correlation exists between midwave and long-wave subbands and that the background correlation is quite high (typically 0.999 to 0.99999). This implies that background variations due to temperature variations (which are typically 5 to 10 percent of the scene signal) can be estimated in one band and used to cancel clutter in the other band. A two-band spectral sensor in the long-wave IR can often achieve an "effective" signal-to-clutter ratio (SCR) of 10 to 20 where an otherwise equivalent single-band sensor would have an SCR of less than 1. To exploit this correlation structure, it is of course necessary to have an extremely low noise sensor; i.e., an SNR of 1,000 or greater is required. Fortunately, in many applications there is sufficient signal that such SNR levels can be achieved in a relatively short time by multiframe integration. This area will also benefit from improved detector arrays that are spatially registered, such as quantum-well devices.

In summary, this phenomenon can be used to separate the effects of temperature and material emissivity as viewed through apparent irradiance as well as to compensate for the effects of the intervening atmosphere. With proper sensor design (significant signal-to-noise ratios), the sensor can be operated (with post-detection processing) as though it is noise-limited as opposed to the more common clutter-limited operation. This shows great promise for detection and classification of deeply hidden targets and can support nighttime operation.
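The band-to-band correlation described above can be exploited with a simple regression-based canceller. The sketch below uses synthetic data, not measured signatures: the clutter gain between bands is estimated from scene statistics and the scaled reference band is subtracted, leaving targets whose color departs from the background.

```python
import numpy as np

def cancel_clutter(band_a, band_b):
    """Regress band A's clutter on band B (both zero-meaned) and
    subtract; the residual keeps objects whose two-band color differs
    from the highly correlated background."""
    a = band_a - band_a.mean()
    b = band_b - band_b.mean()
    gain = np.dot(a.ravel(), b.ravel()) / np.dot(b.ravel(), b.ravel())
    return a - gain * b

# synthetic scene: band B sees background only; band A sees 1.5x that
# background plus a target injected at pixel 4
background = np.arange(9.0)
band_b = background.copy()
band_a = 1.5 * background
band_a[4] += 10.0          # target present in band A only
residual = cancel_clutter(band_a, band_b)
```

In the residual the background is suppressed toward the noise floor while the target pixel stands out, which is the sense in which a clutter-limited sensor can be operated as though it were noise-limited.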

Active Multispectral Imaging

While passive multispectral methods have been developed and utilized for many years, there is an active-system analog that has not received significant attention but is likely to play an important role in future systems. Rather than relying on solar or thermal (passive) spectral signatures, the active version uses lasers to illuminate scenes at specific wavelengths and interrogate their reflective spectral features. The primary attraction of active multispectral sensing is that, unlike passive sensing, the illumination and apparent reflectivities do not depend on sun illumination or on the thermal state of the target. Thus a great deal of the variability of the spectral signature can be controlled by the active illuminator. This benefit comes with certain limitations, however. In particular, active systems are limited in their ability to perform wide-area search because of the limited available laser power levels. With the development of high-power, robust, wavelength-agile lasers, active multispectral imaging does show promise for many applications.

One possible scenario envisioned for active multispectral sensors is the assessment of littoral regions. An aircraft containing an active multispectral sensor could fly along the coast with the laser scanning perpendicularly to the flight path to form a swath image of the littoral region. The platform altitude and swath width would be limited by laser power. This system could provide both daytime and nighttime operation. Laser wavelengths that cannot be detected with the naked eye or with standard night vision instruments could be used. The laser could be range-gated to avoid bias from backscatter from aerosols and fog, and, using blue-green wavelengths, could also image under the water to assess depth and look for underwater mines. The envisioned active sensor would scan the ground with a beam composed of light from a set of lasers. The specific wavelengths would be tuned to detect targets of interest via their spectral features. For example, to detect land mines, one could look for specific features of the paint by tuning the lasers to known wavelengths where the specific paint exhibits high contrast. Also, one could look for specific types of camouflage, vehicle paint, or disturbed soil.

For missions evaluated thus far, active systems inherently require fewer bands than passive systems because the signatures exhibit less variability due to time-varying properties such as sun illumination or cloud cover. Many missions require as few as two or three wavelengths.

Active multispectral methods could also be used for long-distance target identification. Once a target is detected by radar, a multispectral laser sensor could be used to interrogate the target of interest. Again, a range-gated laser system would be used to avoid the difficulties of atmospheric scattering and path irradiance encountered with passive systems. The spectral distribution of the illuminating lasers could be tuned to interrogate a specific target class to perform long-range identification.

Another application for laser systems is the detection and identification of "soft" targets such as gas clouds that may contain chemical or biological warfare agents, or missile or aircraft plumes. The spectral features of these targets are, again, interrogated with a set of specific laser wavelengths, and the returns are used to perform detection and identification. Both color-signature and differential-absorption techniques could be used to identify cloud or plume contents, and plumes could also be evaluated via their shapes or dispersal patterns.

Finally, active multispectral measurements can be coupled with additional laser discriminants to perform more thorough target identification. Active systems can readily measure a target's range and three-dimensional shape via pulsed illumination. Also, by illuminating with polarized light and using two polarization detection channels, one can measure the amount of depolarization from a target. It has been shown that manmade objects retain polarization, whereas natural targets tend to depolarize the return.

In summary, active imaging systems show promise for providing additional discriminants that can be used to detect and identify manmade targets that are deeply hidden. The discriminants available from active systems include spectral response, polarization, and three-dimensional spatial shape.

Electronic Beam Steering

Optical sensors are currently burdened with heavy, complex, and expensive gimbals. Electronic optical-phased array technology has the potential to provide lightweight, agile, and simple beam-steering subsystems that not only can rapidly and accurately point a single beam but also can point multiple simultaneous beams. Electronic steering of optical beams can be divided into two areas. One is the steering of narrowband (nearly monochromatic) light, such as with laser-based systems, and the other is the steering of wideband light as used in passive systems. Steering of monochromatic light is technologically easier since chromatic dispersive devices can be used directly and true time delay techniques are not required. It may be possible to design compact compensating optics that will allow useful wideband beam steering with intrinsically dispersive devices.

A spatial light modulator acting as a grating with an electronically controllable spatial frequency can serve as an optical phased-array modulator (OPAM) for steering in either a transmitting or a receiving mode. OPAMs can operate as amplitude or phase devices, with continuous or binary levels of control, and in pixel or continuous spatial formats. The most efficient OPAM would be a pure phase modulator with multiple phase levels. Initial OPAMs have been constructed with two technologies that appear attractive for fabricating high-capability devices in the future. These devices use either electronically controlled liquid crystals or quantum-well Fabry-Perot vertical cavities to generate the phase shifts. Liquid-crystal OPAMs with apertures on the order of 4 cm × 4 cm have been fabricated for steering green, red, and 1.06-micrometer-wavelength light. Using these liquid-crystal devices, steering efficiencies of 57 percent over 4 degrees of scan have been achieved, with switching times on the order of a few milliseconds. Switching times on the order of tens of microseconds seem possible with such devices. More recently, quantum-well devices have been constructed showing 3 degrees of steering capability over a small area at 830 nanometers. These devices have the potential for switching times in the tens of nanoseconds, with operation over much larger angles and sizes. Both technologies allow scaling to larger devices and mass production to reduce device costs.
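The steering geometry of a phase-ramp OPAM follows the ordinary grating equation: writing a 2π phase ramp across N pixels of pitch p creates an effective grating of period Np, deflecting light by sin θ = λ/(Np). The device parameters below are hypothetical, chosen only to show the scale of achievable angles.

```python
import math

def steering_angle_deg(wavelength_m, pixel_pitch_m, pixels_per_ramp):
    """First-order deflection of a phase-ramp (blazed-grating) OPAM:
    sin(theta) = wavelength / (pixels_per_ramp * pixel_pitch)."""
    period = pixels_per_ramp * pixel_pitch_m
    return math.degrees(math.asin(wavelength_m / period))

# hypothetical device: 1.06-um light, 5-um pixels, 8 pixels per 2-pi ramp
theta = steering_angle_deg(1.06e-6, 5e-6, 8)
```

Shrinking the ramp period (fewer pixels per 2π) steers to larger angles, which is why finer pixel pitch is a key driver for these devices.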

Continued progress in the development of coherent surface-emitting laser diode arrays may allow useful direct laser beam steering. Electronically steerable narrow-beamwidth light has been generated using two-dimensional grating-surface-emitting diode laser arrays.

Video and Related Imaging Technologies

A revolution is under way in the commercial video area that will produce very high frame rates, high pixel counts, high dynamic range (12 bits), and electronically stabilized imagery. For example, formats of 1,000 × 1,000 pixels running at 500 to 1,000 frames per second seem possible. (At sizes of 5,000 × 5,000 pixels, the rates will be a few frames per second.) These megapixel cameras will have a great impact on the ability to perform ISR and targeting missions. Moreover, they will enable either synoptic-area surveillance (using advanced mosaic technologies under development) or spot sensors that provide either high-resolution or moving-target imagery. Because of their high frame rates, video sensors are invaluable for real-time detection of change and motion.
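The formats quoted above imply formidable raw data rates, which is part of why on-board compression and processing matter; a quick calculation:

```python
def data_rate_gbps(width, height, fps, bits_per_pixel):
    """Raw (uncompressed) sensor output rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

# a 1,000 x 1,000 pixel sensor at 500 frames/s with 12-bit dynamic range
rate = data_rate_gbps(1000, 1000, 500, 12)   # 6.0 Gb/s raw
```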

Another certain growth path is the expansion of video technology into the infrared to enable nighttime operation. Initially these cameras will be cooled, will have formats in the 500 × 500 pixel class, and will operate at 30 frames per second. Eventually a 1,000 × 1,000 pixel system operating at 500 frames per second seems possible. These systems will operate at room temperature and will become increasingly affordable as commercial applications expand.

Long-range Laser Designation

Military missions of the future will employ long-range laser designators to reduce U.S. casualties and increase weapon effectiveness. Such a system could include UAV- or satellite-based designators, which would receive target coordinates from off-board sensors and then be directed to maintain a laser spot on the target for the duration of the laser-guided munition flyout. The munitions themselves could be delivered by either artillery or fighter aircraft. A significant feature of a designator system is that it does not rely on the pilot and his platform to perform target identification, designation, or bomb damage assessment. The pilot flies near the target, releases the weapon, and then leaves the area, thus minimizing the risk of being hit by enemy fire.

Another advantage of long-range designation is that in times of military conflict, high-value targets would likely be protected by GPS jamming equipment, making GPS-guided munitions less effective. A laser designation system, in contrast, would not be susceptible to GPS jamming. It is, however, limited by the requirement for a clear line of sight (no clouds) to the target.

There are two possible modes of operation for a long-range designation system: one in which the designating platform does not receive energy from the designating beam (open loop) and one in which the designating platform receives and uses energy from the laser spot to derive pointing information (closed loop). The first type of system would use a star tracker to determine its orientation and precision GPS to determine its location. Target coordinates would be delivered to the platform, and the designator would then illuminate the target for the period from weapon release to impact. The designating platform might contain an imaging system to perform some functions; however, laser radiation would not be used to close the loop with the designator. The advantage of this approach is that the laser power does not have to be boosted to compensate for the 1/R² loss on the return path to the designator. As a result, the laser power requirements would roughly match current laser power levels. This is attractive because low laser power allows the designator platform to be less complex (due to smaller aperture sizes, a smaller laser, and relaxed cooling requirements) and less observable. The remaining technical question is the degree to which an illuminating beam can be maintained on target in the presence of turbulence and scintillation.

To ensure that turbulence effects are properly compensated, a closed-loop alternative can be used in which the designating platform derives beam-pointing information from energy returned by the designating beam. In this mode the beam power must be significantly larger to ensure suitable contrast for the in-band laser tracking loop; depending on the required standoff range, this demands a significantly more powerful laser.
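The open-loop versus closed-loop power trade can be made concrete with a back-of-envelope scaling argument (a scaling sketch only, not a real link budget): the one-way spot irradiance on the target falls off as 1/R², while the two-way return used by the closed-loop tracker falls off as 1/R⁴, so the closed-loop power penalty relative to the open-loop case grows as R².

```python
def closed_loop_penalty(range_m, ref_range_m):
    """Relative extra transmit power a closed-loop designator needs as
    standoff range grows: the two-way 1/R^4 return vs. the one-way
    1/R^2 spot irradiance leaves a (R/R_ref)^2 penalty."""
    return (range_m / ref_range_m) ** 2

# doubling the standoff range quadruples the closed-loop power penalty
penalty = closed_loop_penalty(20_000.0, 10_000.0)
```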

One auxiliary use of long-range designators would be to activate laser defenses (smoke bombs) that protect targets. Triggering of smoke-bomb defenses would confuse the enemy and make the target location more obvious for subsequent missions.

Polarimetric Infrared Imaging

Polarization is an additional element of measurement diversity that can be exploited to improve clutter suppression, target discrimination, and object characterization. Measurement experience supports the general statement that the emitted signature of manmade objects tends to be partially polarized, while natural clutter (including ocean and terrestrial backgrounds) is highly correlated among the linear Stokes vector elements. Thus this measurement scheme, coupled with suitable processing, leads to the capability to detect very low contrast (dim) targets in cluttered scenes that are not observable (without target motion) using traditional radiometric imaging. Polarimetry also offers an ability to perform emissivity/temperature separation and, since the orientation of linear polarization radiance vectors is determined by the emitting surface normal, target geometry estimation.

Exploiting polarization for clutter suppression is challenging because Stokes vector measurements must be made with excellent spatial and temporal registration, sensitivity, and relative calibration between channels. Typically, channel pixel alignments of 1 percent or better and sensitivity/relative-calibration errors on the order of 10 mK are required. Current technology uses modest-sized FPAs (128 × 128) with deep wells (10 to 30 × 10⁶ electrons) and discrete-component optical systems to achieve this performance. Future technology will allow denser FPA implementations with the necessary sensitivity (using analog or integrated digital technology) along with integrated optical components (to maintain precision alignment tolerances) and mixed measurements with other diversity options (e.g., color or phase).
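The Stokes-vector processing described here reduces, in its simplest form, to a per-pixel degree-of-linear-polarization (DoLP) map; the tiny scene below is synthetic and serves only to show the computation.

```python
import numpy as np

def degree_of_linear_polarization(s0, s1, s2):
    """DoLP = sqrt(S1^2 + S2^2) / S0 per pixel. Manmade surfaces tend
    to show elevated DoLP against weakly polarized natural clutter."""
    return np.sqrt(s1**2 + s2**2) / np.maximum(s0, 1e-12)

# synthetic 2x2 scene: unpolarized background, one partially polarized pixel
s0 = np.full((2, 2), 100.0)
s1 = np.zeros((2, 2))
s2 = np.zeros((2, 2))
s1[1, 1], s2[1, 1] = 6.0, 8.0     # 'manmade' pixel, DoLP = 0.1
dolp = degree_of_linear_polarization(s0, s1, s2)
```

This is why the channel registration and relative calibration tolerances above are so tight: S1 and S2 are small differences of large, nearly equal channel signals, so channel-to-channel errors masquerade as polarization.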

Improved Focal Plane Array Technologies

Focal plane array developments over the next 35 years will be determined by a combination of supply-and-demand forces—the demand forces of the commercial and military marketplace and the supply forces that are "fueled" by related technology developments in university and corporate research laboratories. The panel first considered the FPA developments that are likely due to the forces of market demand and next considered some of the emerging areas of technology development that are likely to occur due to related research and development.

Commercial and Other Nonmilitary Market Demand Forces

The demand forces of the commercial and nonmilitary marketplace will be responsible for technology improvements that will directly benefit military applications of focal plane arrays. These market areas include:

  • Facility or site surveillance (for security needs, law enforcement, drug interdiction);

  • Remote sensing (meteorological needs, land use, geological exploration);

  • Industrial inspection;

  • Entertainment industry—high-definition television (HDTV);

  • Automotive industry (augmented low-light-level and night vision); and

  • Astronomy and scientific research.

General areas where improvements in FPA technology can be expected to develop include:

  • Larger number of pixels per array;

  • Smaller interpixel spacing (detector element pitch);

  • Increased reliability;

  • Improved producibility (with lower cost);

  • Better sensitivity, lower noise, better uniformity;

  • Room temperature/uncooled IR FPAs (e.g., micro-bolometer array); and

  • Faster frame/readout rates.

As noted above, the move to HDTV standards will result in inexpensive megapixel arrays in the 2,000 × 2,000 class. Frame rates well in excess of 60 Hz (eventually closer to 1,000 Hz) will be commonplace. More gradually, digital photography markets will demand arrays in the 20,000 × 20,000 class to satisfy professional requirements. Smaller interpixel spacing will result from the commercial demand for ever-smaller camera optical systems.

Currently, the cryo-cooler accounts for about 90 percent of the cost of an IR detector assembly. Thus only those applications that require high performance (noise equivalent detection temperature [NEDT] much less than 0.1 K) will utilize cooled IR. Driven by the need for private surveillance, law enforcement, and night vision for motorists, there will be uncooled arrays that support 0.1-K NEDT sensitivity.

The current limitation for hybrid mercury-cadmium-telluride (HgCdTe) arrays due to thermal mismatch will be overcome by using new lithographic techniques now under development. Array sizes in the 1,000 × 1,000 class seem feasible in the near term.

Developments in market areas related to digital imaging technology can also be expected to indirectly assist the development of focal plane array technology. These areas include the following:

  • Digital image compression, storage, and transmission; and

  • Digital image manipulation and display.

These FPA and FPA-related developments in the commercial marketplace can be expected to keep pace with the general needs of the Navy in the areas of communications, operations, and some space-based needs (e.g., meteorological).

Military Market Demand Forces

Military application areas with a continuing strong Navy need that is not likely to be met by developments in the commercial marketplace are as follows:

  • Missile warning receivers;

  • Bomb, missile, and projectile guidance; and

  • Identification friend or foe.

The above applications are less affected by developments in the commercial sector because the information processing techniques required to discriminate targets from background clutter and to identify targets must exploit unique signatures of the target and background. These unique characteristics include spatial and temporal features as well as electromagnetic wavelength and polarization. As discussed above, sensing these unique radiation characteristics requires specialized detectors (and optical methods) to achieve target signal-to-background-clutter ratios adequate to provide acceptable detection (or discrimination) and false-alarm probabilities.

In the above application areas, the demand will be for multiband focal plane arrays (for spectral discrimination) with a large number of elements (for wide field-of-view coverage) and dense detector elements (for compact and lightweight systems). Since the wavelength bands most suited for a particular target and background vary greatly, a means to "tune" the spectral bandpasses is necessary. Detection techniques (perhaps using quantum wells) that can be spectrally configured in real time will permit the fielding of sensors that are "agile"—more adaptive to terrain and target variations and more general purpose.
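The band-selection logic such an "agile" sensor would run can be sketched simply: given in-band target and background radiances for each available bandpass, choose the band that maximizes the clutter-normalized contrast. The band labels and radiance values below are hypothetical, purely for illustration:

```python
# Hypothetical in-band radiances (arbitrary units): (target, background, clutter sigma)
bands = {
    "MWIR 3.5-4.1 um": (9.0, 6.0, 1.5),
    "MWIR 4.5-5.0 um": (7.0, 6.5, 0.4),
    "LWIR 8.0-9.5 um": (20.0, 18.0, 2.5),
}

def contrast_to_clutter(target, background, sigma):
    """Target/background contrast normalized by background clutter level."""
    return abs(target - background) / sigma

# Pick the spectral bandpass with the best clutter-normalized contrast.
best_band = max(bands, key=lambda b: contrast_to_clutter(*bands[b]))
print(best_band)
```

A real-time spectrally configurable detector would re-evaluate this selection as terrain and target signatures change along the mission.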

Because of the unique FPA architectures that must be developed and the lesser need in the commercial marketplace for such capabilities, continued FPA development in the above application areas will require government funding.

Research and Development Impacts on FPA Technology

Many incremental improvements in FPA technology will naturally occur due to the commercial and military market forces discussed above. Research and development efforts in university, corporate, and government laboratories are likely to produce breakthroughs in areas related to FPA technology:

  • Development of synthetic materials (e.g., nanotechnology); and

  • The understanding and emulation of biological systems, particularly their visual processing methods.

Progress in these areas is more difficult to predict, but can be expected to result in the following:

  • Detector materials whose detection properties can be controlled in real time;

  • On-chip (or close proximity) parallel processing for temporal change (e.g., motion); and

  • Processing characteristics that can be configured (in near real time) for a particular mission, i.e., provide a multirole sensor.

Progress in research and development efforts that can be expected to substantially improve FPA technology will depend strongly on government basic and applied research funding levels.

Phase/Wavelength Diversity for Aperture Synthesis

Optical aberrations that degrade image quality and resolution can arise from atmospheric turbulence, mirror misfigure, and misalignments among optical components. A variety of sophisticated techniques have been developed to combat the effects of such aberrations. One of the most compelling is a technique known as phase diversity. A phase-diversity data set consists of two images. The first is a conventional focal-plane image degraded by the unknown aberrations. A second image of the same object is formed by perturbing these unknown aberrations in some known fashion, thus creating the phase diversity, and then reimaging. This can be accomplished with very simple optical hardware. For example, the combination of a simple beam splitter and a second detector array, translated along the optical axis, further degrades the imagery with a known amount of defocus. Alternatively, both images can be collected simultaneously on the same camera with the use of a prism. Notice that the second image is degraded by the original aberrations in addition to the known defocus. It is rather remarkable that these two images can be digitally processed to jointly estimate both the unknown aberrations and the undegraded image that would have formed in the absence of any aberrations.
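The joint-estimation step can be sketched numerically. For a candidate aberration, the unknown object can be eliminated in closed form, leaving an error metric over the two Fourier-domain images that, in the noiseless case, vanishes only when the candidate matches the true aberration; a nonlinear optimizer would then search over aberration coefficients. The pupil model, aberration shapes, and amplitudes below are illustrative assumptions, not parameters from any fielded system:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 64
y, x = np.mgrid[-N//2:N//2, -N//2:N//2]
r2 = (x**2 + y**2) / float((N // 4) ** 2)   # normalized radius^2 (= 1 at pupil edge)
pupil = (r2 <= 1.0).astype(float)           # circular aperture mask

def otf(phase):
    """Optical transfer function for a given pupil-phase aberration (radians)."""
    field = pupil * np.exp(1j * phase)
    psf = np.abs(np.fft.fft2(field)) ** 2   # incoherent point-spread function
    return np.fft.fft2(psf)

defocus = 2.0 * r2 * pupil                    # known diversity phase (defocus)
true_ab = 1.5 * (x / (N / 4.0)) * r2 * pupil  # "unknown" coma-like aberration

obj = rng.random((N, N))            # unknown scene
O = np.fft.fft2(obj)
D1 = O * otf(true_ab)               # conventional channel (Fourier domain)
D2 = O * otf(true_ab + defocus)     # diversity channel

def metric(candidate):
    """Objective with the object eliminated: zero (noiselessly) iff candidate is correct."""
    H1, H2 = otf(candidate), otf(candidate + defocus)
    num = np.abs(D1 * H2 - D2 * H1) ** 2
    den = np.abs(H1) ** 2 + np.abs(H2) ** 2 + 1e-12
    return float(np.sum(num / den))

e_true = metric(true_ab)                   # candidate = true aberration
e_wrong = metric(np.zeros_like(true_ab))   # candidate = no aberration
print(f"metric at truth: {e_true:.3e}, at wrong guess: {e_wrong:.3e}")
```

Minimizing this metric over a basis of aberration modes recovers the unknown phase; the object estimate then follows from a standard Wiener-style combination of the two channels.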

Phase diversity has been used to retrieve diffraction-limited images of solar granulation using ground-based telescope data, thus overcoming the degrading effects of daytime atmospheric turbulence. In solar astronomy, phase diversity is making the transition from academic curiosity to routine operation. Phase diversity has also been successfully demonstrated in nighttime astronomy. Some of the most impressive reconstructions to date have come from applying phase-diversity methods to ground-based images of satellites where adaptive-optics correction was used. In this case, phase diversity was used to overcome residual aberrations not compensated for by the adaptive optics. The resulting improvement in image quality and resolution is dramatic.

Technology Trends for Phase Diversity

Several alternative solutions to the problem of imaging in the presence of aberrations require expensive and complicated hardware, including Shack-Hartmann wavefront sensors and/or deformable mirrors. By contrast, phase-diversity technology requires only simple optical hardware at the cost of increased computational burden. In addition, the phase-diversity algorithm is evolving rapidly with respect to computational speed. Continued gains in algorithm speed can be anticipated with the integration of concepts such as improved initial estimates, tracking aberration evolution, and precomputing with neural networks. A special-purpose computing architecture using off-the-shelf components was recently designed for processing phase-diverse speckle (a variant on phase diversity) data. This special-purpose computer would produce 64 × 64 image reconstructions at video rates and evolving aberration estimates at an update rate of about 160 Hz. Given current trends in computing hardware and projections in algorithm development, it should be possible in a decade to process phase-diversity reconstructions of large images at kilohertz rates.

To date, phase diversity has been applied to imaging scenarios for which the aberrations are well modeled as localized to the pupil. This is a valid model for turbulence-induced aberrations in upward-looking scenarios. However, there are many applications of interest with horizontal-path or standoff geometries in which a volume turbulence model must be adopted. The volume-turbulence problem is considerably more challenging because the image blur function will change across the field of view (space variance). In addition, amplitude aberration (scintillation) is often a factor. As a consequence, there are more parameters to estimate, and the computations required are considerably more burdensome. Preliminary simulations suggest that phase diversity can be used to recover undegraded images in these challenging scenarios. However, the problem of imaging through volume turbulence is sufficiently challenging that both pre- and postprocessing will likely be needed. It is projected that, in the next 15 years, phase-diversity technology in conjunction with adaptive optics will provide a means of collecting diffraction-limited images through volume turbulence.

A technology known as wavelength diversity, a close relative of phase diversity, was recently suggested for use in multi- and hyperspectral systems. In such systems, the performance of classification and identification tasks is enhanced when spatial resolution is improved. Therefore, the determination of aberrations in such systems can significantly improve performance. Like phase diversity, wavelength diversity affords the joint estimation of the system aberrations and the images that would have formed in the absence of any aberrations. However, wavelength diversity can be accomplished with a raw multispectral data set and does not require any system changes to obtain defocused images, because the "diversity" comes from the change in wavelength, which is already built into a multispectral data set. Wavelength diversity has been demonstrated in simulation, although the algorithms are embryonic at this stage. As exploitation of multispectral data matures so that spatial resolution becomes more important, wavelength diversity will provide increased performance at no additional sensor cost.
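The physical origin of the "built-in" diversity is that a fixed optical-path-difference (OPD) error on a mirror produces a wavelength-dependent phase, phi = 2*pi*OPD/lambda. Two spectral bands therefore see the same figure error through two different, but exactly known, relative phase screens, playing the role that the known defocus plays in phase diversity. A two-band illustration (the band centers and OPD value are hypothetical):

```python
import math

opd_nm = 100.0                             # fixed mirror figure error, nm of optical path
bands_nm = {"red": 650.0, "nir": 850.0}    # hypothetical band-center wavelengths, nm

# Same physical error, different phase in each band: phi = 2*pi*OPD/lambda
phases = {name: 2.0 * math.pi * opd_nm / lam for name, lam in bands_nm.items()}

# The known phase ratio (lambda2/lambda1) is what lets two bands jointly
# constrain the OPD, just as a known defocus would in phase diversity.
ratio = phases["red"] / phases["nir"]
print(f"phase at 650 nm: {phases['red']:.3f} rad, at 850 nm: {phases['nir']:.3f} rad")
print(f"ratio = {ratio:.4f} (equals 850/650 = {850/650:.4f})")
```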

Future Applications

A capability for imaging through volume turbulence lends itself to a variety of uses in ship-based, littoral missions. In such scenarios, a ship-based telescope located up to tens of kilometers from the shore could recover diffraction-limited imagery while imaging through extended-path turbulence. Such imagery could be used for surveillance, targeting, and bomb damage assessment. At these ranges, the aperture can be relatively small while still acquiring very fine-resolution images. For example, if the range is 5 km, diffraction-limited resolution of 2 cm can be achieved with a telescope primary that has a diameter of about 12 cm at visible wavelengths. Given a video-rate processing capability, such assets will be particularly useful for time-critical monitoring of rapidly developing events.
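The aperture figure quoted above follows directly from the diffraction limit: the achievable ground resolution at range R with aperture diameter D is roughly lambda*R/D, so the required diameter is D = lambda*R/resolution. A quick check of the report's numbers:

```python
wavelength = 0.5e-6   # visible light, m
range_m = 5_000.0     # 5-km standoff
resolution = 0.02     # desired ground resolution: 2 cm

# Diffraction limit: resolution ~ lambda * R / D  =>  D = lambda * R / resolution
aperture = wavelength * range_m / resolution
print(f"required aperture diameter: {aperture * 100:.1f} cm")  # about 12.5 cm
```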

A significant portion of the cost of space-based optical platforms is the cost of launch, owing to the weight of the system. This is particularly true of long-dwell systems, but is also true for a fleet of LEO platforms. A candidate solution to the current technology shortfall is to relax optical (and structural) tolerances (thereby reducing weight) on the primary mirror(s) and recover the loss with postdetection processing via phase diversity. The driving principle is to "trade mirror mass for megaflops." This is a favorable tradeoff, given the cost/performance trends in computing technology. By allowing optical tolerances in the primary collector to be relaxed, one seeks to achieve (1) a significant reduction in structural weight, (2) simplified deployment, and (3) reduced fabrication expense. The added freedom in design afforded by relaxing optical tolerances suggests unconventional concepts for light collection. For example, the primary collector might be nonrigid (floppy), monolithic or segmented, filled or sparse. The collector could be a very thin Mylar surface stretched over a skeleton structure that could be deployed like an umbrella or inflated upon deployment. Time-varying aberrations introduced by the relaxed structural tolerances (differential solar heating, mechanical vibrations, and so on) would then be overcome with phase-diversity methods. It may be important to include a small compensating element conjugate to the primary collector to correct gross figure errors in the primary. Such compensation could be passive (fixed) or possibly active with a large dynamic range. Residual aberrations would then be overcome with postdetection processing. Note that optical precision is needed only on this small element and not on the large primary collector. Relaxed-optical-tolerance imaging concepts offer a low-cost approach to real-time, fine-resolution imaging with global coverage.

Passive Interferometric (Synthetic Aperture) Imaging

Interferometric measurements can be used in the visible or infrared region to "synthesize" an aperture in the same manner as is done in (active, coherent) SAR. A passive synthetic aperture can be constructed to defeat the diffraction limit of the collecting telescope or to provide differential range information so that a three-dimensional representation of an object can be collected.

As an example, interferometric imaging systems are being used for fine-resolution astrometry. Astrometry is especially important for establishing databases of star locations for star-tracking systems. These applications are effectively addressed by multiaperture interferometric imaging systems. One such system, the Navy Prototype Optical Interferometer (NPOI), is a long-baseline multiple-aperture imaging system. By using multiple apertures that obtain image information via interference phenomena, systems such as NPOI are able to obtain fine-resolution images while avoiding the expense of large monolithic apertures.
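The payoff of a long-baseline system follows from the fact that its angular resolution is set by the separation B between apertures rather than the diameter of any single aperture: theta is approximately lambda/B. The baseline value below is a hypothetical number chosen for illustration, not an NPOI specification:

```python
import math

wavelength = 0.7e-6   # visible/near-IR observing wavelength, m
baseline = 38.0       # hypothetical inter-aperture baseline, m

theta_rad = wavelength / baseline                   # angular resolution, radians
theta_mas = math.degrees(theta_rad) * 3600.0 * 1e3  # convert to milliarcseconds
print(f"angular resolution: {theta_mas:.2f} mas")
# Matching this with a single filled aperture would require a mirror
# roughly as large as the baseline itself.
```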

In the future, passive interferometric systems will also be used for several other purposes. One of these is to perform passive synthetic aperture three-dimensional imaging of military targets. With such a system, an airborne or space-based platform would be equipped with a relatively small multiple-aperture image collection system. It has been demonstrated that the relative motion between the platform and the stationary scene can be used to synthesize a (larger) imaging aperture. Two-dimensional images with resolution more than a factor of 10 finer than the diffraction limit of the collection aperture have been demonstrated in the visible and infrared spectral bands. Furthermore, a passive multiple-aperture system has been used to collect three-dimensional images of tactical targets at 2-km range in the mid-wave infrared (MWIR) band. Yet another use of the passive interferometric imaging mode is to perform the radar analog of moving target detection. To accomplish this, one measures the Doppler beating of light that originates from a single object point but propagates through separate apertures, or paths, before interfering in the image plane. Geometrical path differences between the two optical paths cause the light to exhibit differential Doppler shifts. By examining the temporal content of the optical interferogram, one can readily distinguish between stationary and moving targets and thereby perform moving target identification with a passive multiple-aperture system.
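The differential Doppler measurement described above can be quantified with a simple geometry: for a target moving transversely at speed v at range R, the optical path difference between two apertures separated by baseline B changes at a rate of roughly v*B/R, producing a fringe beat frequency f = v*B/(R*lambda), while a stationary point gives f = 0. All numbers below are hypothetical:

```python
v = 10.0           # transverse target speed, m/s
baseline = 1.0     # separation between the two apertures, m
range_m = 2_000.0  # target range, m
wavelength = 4e-6  # MWIR observing wavelength, m

# Rate of change of the inter-aperture optical path difference (small-angle approx.)
opd_rate = v * baseline / range_m
# Fringe beat frequency seen in the temporal content of the interferogram
beat_hz = opd_rate / wavelength
print(f"moving-target beat: {beat_hz:.0f} Hz; stationary target: 0 Hz")
```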

In summary, interferometric optical systems show promise for passively providing additional target information through finer spatial resolution, three-dimensional shape measurement, or detection of target motion via processing of differential Doppler signatures.

CONCLUSIONS

Future surveillance capabilities in support of force projection ashore will be truly astounding compared to what exists today, yet no scientific breakthroughs are necessary to achieve them. The expansion in scale of today's sensor systems in radar, electro-optics, acoustics, and SIGINT, enabled by computer and communications technologies coming from the commercial sector, will make it happen at an affordable price. Much thought must be given to platform issues and concepts of deployment, however, so as to understand the cost-survivability-performance tradeoffs necessary to guide future Navy investment. Since most future conflicts will be fought jointly with other components and coalition partners, every effort must be made to provide connectivity and interoperability with surveillance platforms of the other Services and of U.S. allies, taking advantage of the bandwidth revolution occurring in communications technology. The Department of the Navy should consider, however, the development of organic sensor platforms to support surveillance of the littorals during the early stages of conflict, when the Navy/Marine Corps team will be first on the scene. Each sensor capability must be extrapolated in light of credible countermeasures an enemy might take to defeat its effectiveness, and in light of the estimated difficulty of achieving fully automated exploitation tools for its utilization. The ability to perform target classification or identification based on radar sensor data alone is an area where much progress must be made. The incorporation of surveillance systems into larger architectures to perform specific military missions only emphasizes the importance of the identification problem. Although the goal of "near-perfect knowledge" may never, in fact, be truly achievable due to fundamental physical limitations, removal of implementation limitations through advanced technology will nonetheless provide surveillance systems that will be dramatic force multipliers for future naval forces.
