At present, only a handful of programs in photogrammetry exist in the United States (see Table A.2 in Appendix A). A few, such as those at Ohio State University and Purdue University, are top tier, yet are struggling to survive. Retiring faculty are not being replaced, and the number of faculty will soon decline below the critical mass needed to sustain these programs. Some 2-year technology programs, such as those in surveying or construction technology, offer hands-on training in the use of photogrammetric instruments to compile data. Most of these programs impart some photogrammetric skills but lack the rigorous mathematical basis of photogrammetry programs in 4-year colleges.

Outside of formal academic education, employers often provide in-house training, and some educational institutions and professional societies offer short courses ranging from a half day to a full week. The American Society for Photogrammetry and Remote Sensing regularly devotes a day or more to concurrent half-day or full-day short courses on specific topics in conjunction with its annual and semiannual meetings. Most of those who take these courses are employees seeking professional development.

REMOTE SENSING

Remote sensing is the science of measuring some property of an object or phenomenon by a sensor that is not in physical contact with the object or phenomenon under study (Colwell, 1983). Remote sensing requires a platform (e.g., aircraft, satellite), a sensor system (e.g., digital camera, multispectral scanner, radar), and the ability to interpret the data using analog and/or digital image processing.

Evolution

Remote sensing originated in aerial photography. The first aerial photograph was taken from a tethered balloon in 1858. The use of aerial photography during World War I and World War II helped drive the development of improved cameras, films, filtration, and visual image interpretation techniques. During the late 1940s, 1950s, and early 1960s, new active sensor systems (e.g., radar) and passive sensor systems (e.g., thermal infrared) were developed that recorded electromagnetic energy beyond the visible and near-infrared parts of the spectrum. Scientists at the Office of Naval Research coined the term remote sensing to more accurately encompass the nature of the sensors that recorded energy beyond the optical region (Jensen, 2007).

Digital image processing originated in early spy satellite programs, such as Corona and the Satellite and Missile Observation System, and was further developed after the National Aeronautics and Space Administration’s (NASA’s) 1972 launch of the Earth Resource Technology Satellite (later renamed Landsat) with its Multispectral Scanner System (Estes and Jensen, 1998). The first commercial satellite with pointable multispectral linear array sensor technology was launched by SPOT Image, Inc., in 1986. Subsequent satellites launched by NASA and the private sector have placed several sensor systems with high spatial resolution in orbit, including IKONOS-2 (1 × 1 m panchromatic and 4 × 4 m multispectral) in 1999, and satellites launched by GeoEye, Inc., and DigitalGlobe, Inc. (e.g., 51 × 51 cm panchromatic) from 2000 to 2010. Much of the imagery collected by these companies is used for national intelligence purposes in NGA programs such as ClearView and ExtendedView.

Modern remote sensing science focuses on the extraction of accurate information from remote sensor data. The remote sensing process used to extract information (Figure 2.5) generally involves (1) a clear statement of the problem and the information required, (2) collection of the in situ and remote sensing data to address the problem, (3) transformation of the remote sensing data into information using analog and digital image processing techniques, and (4) accuracy assessment and presentation of the remote sensing-derived information to make informed decisions (Jensen, 2005; Lillesand et al., 2008; Jensen and Jensen, 2012).
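To make step (3) concrete, the following is a minimal sketch (not drawn from this report) of one widely used digital image processing transformation, the normalized difference vegetation index (NDVI), which converts raw multispectral band values into vegetation information. The band arrays below are synthetic stand-ins, and NumPy is assumed.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).

    Transforms raw near-infrared and red band values into a -1..1 index
    in which healthy vegetation scores high (strong NIR reflectance).
    """
    nir = nir.astype(float)
    red = red.astype(float)
    denom = nir + red
    # Guard against division by zero over dark pixels (water, shadow).
    return np.where(denom == 0, 0.0, (nir - red) / np.where(denom == 0, 1.0, denom))

# Synthetic 2 x 2 scene: the top row mimics vegetated pixels,
# the bottom row mimics bare soil or water.
nir_band = np.array([[200, 180], [50, 40]])
red_band = np.array([[40, 50], [45, 38]])
print(ndvi(nir_band, red_band))
```

Analogous band-ratio and classification transforms underlie much of the analog and digital image processing referred to in the text.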

State-of-the-art remote sensing instruments include analog and digital frame cameras, multispectral and hyperspectral sensors based on scanning or linear/area arrays, thermal infrared detectors, active microwave radar (single frequency-single polarization, polarimetric, interferometric, and ground penetrating radar), passive microwave detectors, lidar, and sonar. Selected methods for collecting optical analog and digital aerial photography, multispectral imagery, hyperspectral imagery, and lidar data are shown in Figure 2.6. Lidar imagery is increasingly being used to produce digital surface models, which include vegetation



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.