arrays are not especially commonplace elements either. They tend to be quite expensive in general, as hybrid implementations are painstaking and monolithic (MIMIC) implementations of subsystems have not proven economically feasible. Interestingly enough, though, optics has been proposed as a technology that could lead to considerable cost reduction of phased arrays, possibly to the point that this technology could become viable in a mass consumer market.

A major problem with applying any new technology to an already developed application area is that of convincing those already in the area that the new technology will not cause more new problems than it solves old ones. Phased array systems require high dynamic range and low crosstalk, among other attributes. Moreover, the performance levels that have been achieved with standard microwave components are impressive and were hard won. An engineer in such an area is loath to embrace a new technology that may increase bandwidth at, perhaps, the cost of an increased probability of false detection or, worse, one that promises improvement while actually degrading system performance in every way. In order to "sell" such a new technology, one needs to demonstrate marked improvements in end-to-end system performance. Prototyping is not generally a good way to do this: a first prototype is generally of modest performance and possibly astronomical cost, and a phased array system is expensive and can probably be built most cost-effectively by the very people to whom the prototyper would like to demonstrate the improved technique. Still, simulation of noncommercial (which some engineers call imaginary) components is not enough to convince a systems engineer. Some operational characteristics, beyond best-case figures for all components, are a necessity.

The approach that we have taken over the years has been a phenomenological one. We have worked on the fabrication and characterization of various integrated optical and microwave components while simultaneously constructing simple, physical models for these components. These models can then be calibrated against the experimental data in order to make them predictive, in the sense that the input to the model is the actual fabrication conditions rather than physical (and perhaps indeterminable) parameters. The necessary physical parameters can, if needed, be extracted from the model. For example, the fabrication of an integrated optical phase shifter requires one to mask off a channel, indiffuse it to obtain an index difference, and then remask and deposit electrodes. The fabrication parameters in such a case are the design dimensions of the masks, the design metal thicknesses, and the indiffusion times. The actual fabrication parameters, of course, will never be identical to the design parameters, due to the basic nature of the fabrication process; therefore, calibrations through mask measurement, line thickness measurement, and so on are necessary. The physical parameters would be such things as the index profile and the electric field distributions in the substrate. Obtaining good prediction of the modulation depth will also, in general, require calibration of the physical parameters through such measurements as m-lines, near-field profiles, and half-wave voltages. These predictive models can then be used to fabricate components to the achievable specifications. There is a limit to the amount of pure trial and error that can be carried out in an academic laboratory, and the phenomenological approach is adopted so as to minimize this trial and error. There is also a limit to the complexity of fabrication and characterization that can be carried out in an academic laboratory.
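As a minimal sketch of this kind of calibration, the following Python fragment fits a single physical parameter (an electro-optic overlap factor) of a simple phase-shifter model to measured half-wave voltages, then uses the calibrated model to predict the half-wave voltage of a new design from its fabrication parameters. The model form is the standard textbook expression for a LiNbO3-like phase shifter; all device dimensions and measured values are assumed for illustration and are not taken from the text.

```python
import numpy as np

# Assumed material constants (LiNbO3-like values, illustrative only).
LAMBDA = 1.55e-6   # operating wavelength (m)
N_EFF = 2.15       # effective index of the channel mode
R33 = 30.8e-12     # electro-optic coefficient (m/V)

def v_pi_model(gap, length, gamma):
    """Textbook half-wave voltage of an electro-optic phase shifter:
    V_pi = lambda * g / (n^3 * r * Gamma * L), where Gamma is the
    field-mode overlap factor to be calibrated from measurements."""
    return LAMBDA * gap / (N_EFF**3 * R33 * gamma * length)

# Calibrated fabrication parameters of three fabricated devices
# (electrode gap and length as actually measured after fabrication),
# together with their measured half-wave voltages -- all hypothetical.
gaps      = np.array([6e-6, 8e-6, 10e-6])   # electrode gaps (m)
lengths   = np.array([1e-2, 1e-2, 1e-2])    # electrode lengths (m)
v_pi_meas = np.array([4.1, 5.6, 6.9])       # measured V_pi (V)

# V_pi is proportional to 1/Gamma, so the least-squares estimate of
# Gamma from the measurements has a closed form: with x = model output
# at Gamma = 1, minimizing sum((x/Gamma - V_meas)^2) over 1/Gamma gives
x = v_pi_model(gaps, lengths, gamma=1.0)
gamma_hat = np.sum(x * x) / np.sum(x * v_pi_meas)

# The calibrated model now predicts performance of a new design
# directly from its fabrication parameters.
v_pi_new = v_pi_model(gap=7e-6, length=1.2e-2, gamma=gamma_hat)
print(f"fitted overlap factor Gamma = {gamma_hat:.3f}")
print(f"predicted V_pi for new design = {v_pi_new:.2f} V")
```

The point of the sketch is the division of labor described above: the inputs are fabrication parameters (gap, length), the single fitted quantity is a physical parameter (the overlap factor), and the calibrated model is then run forward to predict a measurable figure of merit for a design that has not yet been built.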
If one is to work with state-of-the-art components that must by their nature be fabricated in house, the educational process (being of finite duration per student) precludes serious experimental work on complex systems. The very existence of phenomenological models, however, forms a basis for system modeling. Simple models can be assembled into packages to attempt to predict complex system behavior. The modeling process that follows from this phenomenological basis is hierarchical: smaller models are successively grouped into larger models, which are in turn grouped into system blocks, and so on. Such modeling is not uncommon. As will be discussed in the next section, most of the standard computer-aided design (CAD) tools employed in both the very large scale integration (VLSI) circuit industry and the microwave industry are hierarchical tools. The analysis techniques

Copyright © National Academy of Sciences. All rights reserved.