recognition, new systems have claimed efficiencies greater than 90 percent since research began, but few have been adopted for general use because they fail for many users. Only a few general theoretical results are known in pattern recognition and machine learning. These include Bayes' theorem, PAC (probably approximately correct) learning, developed by Leslie Valiant [1] and others, and the Vapnik-Chervonenkis dimension for distribution-free learning [2]. A general theory of pattern recognition would be highly desirable but is probably out of reach at present. In the absence of such a theory, the development of techniques to evaluate the effectiveness of ad hoc approaches rationally and efficiently should be a priority.
Data Compression or Encryption
Chaos is a behavior that is typically associated with unpredictable and undesirable phenomena. For example, similar but separate chaotic systems have motion that is never correlated and that is noise-like with broadband spectral densities. In the last five years, research at NRL by Pecora and coworkers [3,4] has demonstrated new ways to configure coupled chaotic systems to give unexpected, but useful, behavior and has shown that the unique properties of chaotic systems permit them to accomplish things that better-behaved systems cannot.
The earliest finding was that chaotic systems could be coupled so that their behavior synchronizes exactly, in a stable, reproducible fashion. The overall motion was chaotic, but the linked systems moved in lockstep. The method for accomplishing such synchronization is remarkably simple and immediately suggests techniques for synthesizing such setups. A signal (the drive) is taken from a specific part of an autonomous chaotic system and transmitted to another system (the response). The transmitted signal takes the place of that part of the response, so the remaining parts see only each other and the incoming signal, as though the whole system were still in place. The remaining signals in the response synchronize with their counterparts back in the drive. Theory subsequently developed at NRL establishes the conditions under which this arrangement is stable.
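The drive-response construction can be sketched numerically. The following is a minimal illustration, not the NRL circuit implementation, using the familiar Lorenz equations as a stand-in chaotic system: the drive's x signal replaces the response's own x variable, and the response's remaining variables (y, z) fall into lockstep with their counterparts in the drive.

```python
# Minimal sketch of Pecora-Carroll drive-response synchronization,
# assuming a Lorenz system with standard chaotic parameters.
SIGMA, RHO, BETA = 10.0, 28.0, 8.0 / 3.0

def deriv(state):
    # state = (x, y, z, yr, zr): the drive (x, y, z) plus an x-driven
    # response whose own x is replaced by the transmitted drive signal.
    x, y, z, yr, zr = state
    return (
        SIGMA * (y - x),        # drive dx/dt
        x * (RHO - z) - y,      # drive dy/dt
        x * y - BETA * z,       # drive dz/dt
        x * (RHO - zr) - yr,    # response dy/dt, driven by the drive's x
        x * yr - BETA * zr,     # response dz/dt, driven by the drive's x
    )

def rk4_step(state, dt):
    # One fourth-order Runge-Kutta step for the combined system.
    k1 = deriv(state)
    k2 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)))
    k3 = deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)))
    k4 = deriv(tuple(s + dt * k for s, k in zip(state, k3)))
    return tuple(
        s + dt / 6.0 * (a + 2 * b + 2 * c + d)
        for s, a, b, c, d in zip(state, k1, k2, k3, k4)
    )

def synchronization_error(steps=5000, dt=0.01):
    # Start the response far from the drive; the error should decay to
    # roughly floating-point precision, since the (y, z) subsystem
    # driven by x is globally stable for these parameters.
    state = (1.0, 1.0, 1.0, -5.0, 10.0)
    for _ in range(steps):
        state = rk4_step(state, dt)
    x, y, z, yr, zr = state
    return abs(y - yr) + abs(z - zr)
```

Despite both trajectories being chaotic, the error (yr - y, zr - z) obeys a damped linear equation for this decomposition, which is why the lockstep behavior is stable rather than accidental.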
The synchronization approach demonstrates that a chaotic system can be broken into parts and that, by duplicating some of those parts and linking them with chaotic signals, new behavior, such as synchronization, becomes possible. The main theme here is that the synthesis of new chaotic systems leads to interesting and potentially useful results, including the synthesis of new signals and signal-processing hardware.
The first drive-response approach to synchronous chaos was patented along with several elementary approaches to using such behavior in communications with chaotic drives and carriers. Amplifying the theme of synthesis, new drive-response constructs have been pursued. Several drive-response subsystems have been cascaded to give a system that can reproduce the incoming chaotic drive signal, provided the duplicated subsystems have the same parameters as the original. This arrangement allows incoming signals to be identified uniquely: only signals that come from the same type of chaotic circuit with the same parameters will yield matching synchronous behavior in the response receiver. This cascading of electronic components has been patented and is currently being investigated for potential use in identification-friend-or-foe (IFF) applications.
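The cascaded arrangement can also be sketched in Lorenz form (again a stand-in for the actual patented circuits): the received x signal drives a first subsystem that regenerates y, and that regenerated y in turn drives a second subsystem that regenerates x itself. The regenerated signal matches the incoming one only when the receiver's parameters match the transmitter's, which is the basis of the identification scheme.

```python
# Sketch of a cascaded drive-response receiver, assuming Lorenz
# dynamics. The receiver reproduces the incoming chaotic signal x
# only when its parameter (here rho_rx) matches the transmitter's.
SIGMA, BETA = 10.0, 8.0 / 3.0
RHO_DRIVE = 28.0  # transmitter's parameter

def cascade_deriv(state, rho_rx):
    # state = (x, y, z, y1, z1, x2, z2):
    # drive (x, y, z) -> broadcast x -> stage 1 (y1, z1)
    # -> regenerated y1 -> stage 2 (x2, z2), which reproduces x.
    x, y, z, y1, z1, x2, z2 = state
    return (
        SIGMA * (y - x),             # drive
        x * (RHO_DRIVE - z) - y,
        x * y - BETA * z,
        x * (rho_rx - z1) - y1,      # stage 1, driven by x, receiver's rho
        x * y1 - BETA * z1,
        SIGMA * (y1 - x2),           # stage 2, driven by stage 1's y1
        x2 * y1 - BETA * z2,
    )

def rk4_step(state, dt, rho_rx):
    k1 = cascade_deriv(state, rho_rx)
    k2 = cascade_deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k1)), rho_rx)
    k3 = cascade_deriv(tuple(s + 0.5 * dt * k for s, k in zip(state, k2)), rho_rx)
    k4 = cascade_deriv(tuple(s + dt * k for s, k in zip(state, k3)), rho_rx)
    return tuple(
        s + dt / 6.0 * (a + 2 * b + 2 * c + d)
        for s, a, b, c, d in zip(state, k1, k2, k3, k4)
    )

def mean_reproduction_error(rho_rx, steps=6000, dt=0.01, tail=1000):
    # Average |x2 - x| over the final `tail` steps, after transients.
    state = (1.0, 1.0, 1.0, 0.0, 0.0, 0.0, 0.0)
    errs = []
    for i in range(steps):
        state = rk4_step(state, dt, rho_rx)
        if i >= steps - tail:
            errs.append(abs(state[5] - state[0]))
    return sum(errs) / len(errs)
```

With a matched receiver (rho_rx = 28.0) the reproduction error falls to near floating-point precision; with a mismatched receiver (say rho_rx = 30.0) it stays at the order of the signal itself, so a simple threshold on the error distinguishes friend from foe.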
Nonautonomous chaotic systems have been constructed using approaches similar to drive-response synchronization that allow the chaotic drive to carry phase information about the sinusoidal forcing of the transmitter and receiver. This configuration allows the receiver to follow phase changes in the transmitter, much as in FM, even though only chaos is broadcast.
1. Leslie Valiant, "Computational Learning Theory," Proceedings of the Fourth Annual Workshop on Computational Learning Theory, Morgan-Kaufmann, San Mateo, Calif., 1991.
2. Vladimir N. Vapnik, The Nature of Statistical Learning Theory, Springer, New York, 1995.
3. T.L. Carroll and L.M. Pecora, eds., Nonlinear Dynamics in Circuits, World Scientific, River Edge, N.J., 1995.
4. W.M. Ditto and L.M. Pecora, "Mastering Chaos," Scientific American, Vol. 269, No. 2, Aug. 1993, p. 78.