Automatic Particle-Image Velocimetry Utilizing Laser-Induced Fluorescent Particles

T. C. Fu, R. Bing and J. Katz, The Johns Hopkins University, Baltimore, USA
T. T. Huang, David Taylor Research Center, Bethesda, USA

Abstract

Microscopic, neutrally buoyant particles containing a fluorescing compound have been adapted as tracers for velocity measurements of large scale turbulent flows. The technique consists of illuminating a thin slice of the flow field with a laser sheet, which is pulsed according to a specific illumination code. The multiple-exposure image is recorded on photographic film and later enhanced while being digitized. Algorithms have been developed for analyzing the resulting images. They rely on the illumination code and the particle streak morphology to identify and compute the tracer velocity vectors. A one-inch diameter jet has been used as a flow field for preliminary tests.

Introduction

The present paper focuses on the development of a quantitative flow visualization system which is particularly suited for large scale towing tank experiments. Until recently, flow visualization has been used to provide only qualitative information, while quantitative data, namely the velocity field, has been determined by single-point measurement techniques (hot-wire anemometry, laser Doppler velocimetry, etc.). Due to their nature, as well as their cost, these techniques are limited to simultaneous sampling at a few spatial locations. Early experiments with quantitative flow visualization were performed by Kobayashi [1] and Marko & Rimai [2]. Both used long-exposure photography to record the position and mean velocity of passive tracers within a fluid. High-speed photography has also been adopted to record time series of tracer particle positions (Racca & Dewey [3]). A further refinement has been to identify particles in successive frames and reconstruct the velocity with a time base equal to the framing time (Racca & Dewey [3]).
Another approach has been to record multiple-exposure images and measure the displacements of small suspended particles to obtain full-field velocity maps (Adrian [4]). Gharib & Willert [5] performed a similar analysis; however, they used a single extended exposure of particle streaks with prescribed variations in the intensity of the light emitted from each particle. The variation in intensity was achieved by utilizing fluorescing particles and by varying the wavelength of the illumination light. Finally, one should mention the work of Khalighi [6], who utilized digital image processing techniques to automatically analyze particle streak images and produce full-field velocity maps. The primary factors affecting the ability to adapt particle displacement velocimetry to large scale complex flows are the capacity to handle large amounts of data, the required processing time, variations in velocity scales within the same flow field, the capability to record and identify fine details in large scale images, and the availability of appropriate particle tracers in large quantities. The approach adopted here utilizes double-exposed images of laser-induced fluorescent particles which are analyzed automatically by digital image processing. This method resolves the entire velocity field simultaneously, thus allowing the identification of large scale flow structures as well as providing quantitative details about the velocity, circulation, etc. The technique is specifically suited to the study of large scale complex turbulent flows. Similar to the techniques of Gharib & Willert [5] and Khalighi [6], it consists of recording two exposures on a single frame. This approach minimizes the difficulty of identifying particle traces on successive frames, and still allows for short sampling intervals. Unlike the others, we have opted to identify the direction of flow by keeping one of the exposures longer than the other.
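This unequal-exposure idea makes the sign of the velocity unambiguous from a single frame. The following is a minimal sketch of the decoding step for one matched trace pair, not the authors' actual code; the function name, argument conventions and units are our own illustrative assumptions:

```python
import numpy as np

def trace_velocity(streak_end, dot_center, pulse_interval, pixel_scale):
    """Sketch: velocity from one matched streak/dot pair.

    streak_end     -- (row, col) of the particle center at the end of the
                      long exposure (streak), in pixels
    dot_center     -- (row, col) of the particle center in the short
                      exposure (dot), in pixels
    pulse_interval -- time between the two exposures, in seconds
    pixel_scale    -- physical length of one pixel, in meters

    Because the streak is longer than the dot, the flow direction is
    unambiguous: it points from the streak toward the dot.
    """
    d = np.asarray(dot_center, dtype=float) - np.asarray(streak_end, dtype=float)
    dist = float(np.linalg.norm(d))
    speed = dist * pixel_scale / pulse_interval
    direction = d / dist  # unit vector, streak -> dot
    return speed, direction
```

With a single long exposure of equal pulses, the pair would yield only a speed and an axis; the streak/dot asymmetry above is what resolves the two possible directions.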
The following paper focuses on this method. An important assumption in all particle tracking techniques is that the seed particles follow the flow without slipping and do not alter the flow dynamics. This requirement prescribes the size, concentration and specific gravity of particles that can be used as tracers. Most of the past studies (Adamczyk and Rimai [7], Landreth, Adrian and Yao [8], Khalighi [6]) rely on the light reflected from the particles. This limitation has prohibited the use of very small particles, due to the low intensity of the reflected light. By utilizing tracers containing fluorescing material (Gharib and Willert [5]), the intensity of the emitted light is increased by several orders of magnitude, so that even microscopic particles can be used. The generation of particles with embedded fluorescing material has been one of the critical problems, particularly when they are needed in large quantities.

Experimental Procedures

During the past two years we have constructed a large scale flow visualization facility with multiple light sheets in the 140 ft. towing basin at the David Taylor Research Center (a sketch is provided in Figure 1). While developing the flow visualization system in the towing tank, we have used water jets as the flow field in order to generate data for the imaging system. As a result, the images provided in this paper have been recorded in a steady flow water jet facility located in a transparent test section. The jet was 1 inch in diameter. Thin sections (approximately 1 mm) of the flow field were illuminated by a pulsed 300 mW argon-ion laser. The water was seeded with microscopic (5-10 microns in diameter), neutrally buoyant particles containing embedded fluorescing dyes. These particles were invisible in most of the flow field, but responded with intense spontaneous fluorescence within the illuminated section. The production of these particles will be discussed later. The temporal light modulation followed a specific illumination code. Figure 2 shows the pattern used in the present work. The signal consisted of a long exposure (streak) followed by a shorter pulse (dot). The magnitude of the velocity was determined from the distance between the two traces of the same particle, while the flow direction could be determined by comparing the lengths of the two traces. As will be discussed later, the automatic image analysis algorithm could match the traces as well as identify and remove streak patterns which did not resemble the illumination code. The images were recorded on 35 mm film. We opted for film since its resolution is much higher than that of video. As an example, Figure 3 contains two digitized images, both originating from the same negative. In the first, the entire negative was translated to a single video frame and then a portion of the frame was magnified. The second, on the other hand, was magnified prior to translation to video and as a result is clearer and sharper. A video frame has a resolution on the order of 500 x 500 lines, which is less than the resolution of 1 mm² of emulsion. Thus the translation to video should be performed carefully to avoid loss of detail. Storing the original image in the form of a film negative allows variation in magnification while digitizing the image, and as a result enables us to control the resolution. While analyzing the image, one can focus a video camera on the negative and select the magnification depending on the desired detail.
This is especially important when examining turbulent flows where wide ranges of velocities are present. For example, Figure 4 shows a typical image of the flow near the exit of a 1 inch diameter nozzle. Portions of the flow are unresolvable at this magnification. By focusing on a smaller portion of the negative (Figure 5), the magnification and hence the resolution are increased, allowing for analysis of the originally unresolvable traces.

Image Analysis and Processing

As noted before, the image analysis and processing algorithms were developed with the specific objective of examining large scale, real turbulent flows. As a result, the selected technique had to be able to analyze images with a large number of particles over a wide range of velocities. Thus, it was necessary that the image resolution be such that both very short (low velocity) traces and long (high velocity) streaks could be clearly identified and handled efficiently. It was also necessary for the algorithms to be as simple as possible to maximize the speed of the analysis. The images, recorded on film, were digitized by illuminating the negative from behind and focusing an RCA video camera with a microscope objective zoom lens on a section of the film. The camera was linked to a high resolution video recorder as well as to an Imaging Technologies Inc. Series 100 image processor and frame grabber installed in a Sun 4/260 workstation. Each video frame was digitized to a 512x512 pixel, 8 bit array. Each pixel was assigned an intensity value, ranging from 0 to 255, corresponding to its relative brightness. The digitized image was then enhanced by color filtering, a smoothing convolution and contrast enhancement to reduce the noise. The use of fluorescing particles has the added benefit that the emitted light is of a longer wavelength (in the yellow range) than the green light reflected from bubbles and contaminants.
This feature allows significant enhancement of the input images by removing much of the reflected laser light through color filtering. The filtered image was then sharpened by convolving with the following kernel:

-1 -1 -1
-1 12 -1
-1 -1 -1

Namely, each pixel value was multiplied by 12, its eight nearest neighboring pixels were multiplied by -1, and the sum of these values was added to the original pixel intensity. Performing this process on the entire image effectively sharpens the edges of the traces (Figure 6). The image was then equalized, namely the intensity values of the entire image were normalized to range from 0 to 255, to improve contrast. The next step was to "threshold" the image. Pixel values above a selected intensity level were set to 255 and values below it were set to 0. The threshold level was determined from an intensity histogram of the entire image. In an optimal situation the intensity histogram would be bimodal, with well separated peaks; that is, the particle traces would be easily distinguishable from the background and their edges would be distinct and clear. In practice this was not usually the case. In fact, it was not uncommon for the brightest background pixel to be brighter than the faintest pixel of a particle. This phenomenon occurred when the background illumination was not uniform. Therefore, construction of an accurate binary image using threshold analysis required the use of local threshold intensity levels. Additional techniques could be utilized, if needed, to further aid in distinguishing particles from the background. These techniques include the use of gradient and Laplacian operators to provide edge enhancement (Rosenfeld and Kak [9]), and examination of the slope of the thresholded average intensity vs. threshold curve (Prasad & Sreenivasan [10]). The thresholded image was then reduced to a binary bit map of "0's" and "1's".
The "0's" represented the background (pixels of value 0) and the "1's" (pixels of value 255) were parts of particle traces (Figure 7). This step reduced the needed computer memory and processing time, in that the image was now represented by an array of 1-bit integers. The bit map was then searched pixel by pixel, row by row, until a pixel representing part of a particle trace (a "1") was found. Then the total size of the trace, as well as its length and orientation, were determined by examining connected pixels. The centroid of the trace was also found at this time. The program assumed that the trace found was the longer streak. The length and orientation of this trace, coupled with the illumination code, were then used to determine the probable location of the matching trace. In the photographs presented in Figures 3-6 the illumination code was such that the exposure time for the streak and the delay between exposures were 4 and 5 times the exposure time of the dot, respectively (Figure 2). Once the search distance and orientation of the streak were calculated, the matching dot was searched for only within a limited area. It was necessary, however, to examine the image on both sides of the streak, since the flow direction was not known a priori. If a second trace was identified within the prescribed area, the ratio of the streak length to the separation distance was compared to the respective time ratios in order to ensure that they were traces of the same particle. The last position of the center of the particle within the streak was then estimated by determining the width of the trace at three positions along the trace's major axis (Figure 8). The variation in their values had to remain within a specified range for them to be accepted as the actual width of the trace. The position of the center of mass at the end of the streak was then determined to be at half a width from the edge of the trace and centered along its major axis (Figure 8). The same process was applied to the shorter trace. The velocity was calculated from the estimated separation distance, center of mass to center of mass (Figure 8). This sequence was repeated until the entire image had been analyzed and a map of the entire velocity field (Figure 9) was produced. Since both sides of the streak were searched, there was a small possibility that dots fulfilling the above criteria would be found on both sides of the streak. If this situation occurred, the velocities were compared to neighboring values to determine the correct direction. If the correct direction and magnitude could not be inferred, the data point was not used for the final velocity map. The computer program also contains additional procedures to handle unmatched traces, variations in slopes, flow near the core of a vortex, etc. The digitization, enhancement and processing phases were performed interactively, which allowed for operator control of the thresholding and the scale of digitization. The analysis of the bit maps was performed automatically, its output being the velocity vector for each particle trace found. A 512x512 pixel, 8 bit image was usually completely analyzed in approximately 2 minutes of CPU time. The more "trouble spots" (unmatched traces, multiple dots for one streak, noise, etc.) there were, the longer it took the computer to complete the analysis. The entire process, consisting of selection of an image, digitization, enhancement, thresholding, and image analysis, took approximately 10 minutes per image. More sophisticated automatic edge detection techniques as well as automatic thresholding are being implemented at the present time.
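The enhancement and segmentation pipeline described above can be sketched in a few lines of modern Python with NumPy and SciPy. This is our own illustrative stand-in, not the authors' implementation (which ran on a Sun 4/260 with an Imaging Technologies processor); the kernel matches the one given in the text, while the trace extraction and dot search are deliberately simplified versions of the algorithm:

```python
import numpy as np
from scipy import ndimage

# Sharpening kernel from the text: center weight 12, eight neighbors -1.
KERNEL = np.array([[-1, -1, -1],
                   [-1, 12, -1],
                   [-1, -1, -1]], dtype=float)

def sharpen(img):
    """Add the kernel response back to the original intensity, then clip."""
    img = img.astype(float)
    return np.clip(img + ndimage.convolve(img, KERNEL, mode="nearest"), 0, 255)

def equalize(img):
    """Stretch intensities so they span the full 0-255 range."""
    lo, hi = float(img.min()), float(img.max())
    return (img - lo) * 255.0 / max(hi - lo, 1.0)

def binarize(img, threshold):
    """Global threshold to a 0/1 bit map. In practice a local threshold is
    needed when the background illumination is nonuniform."""
    return (img >= threshold).astype(np.uint8)

def find_traces(bitmap):
    """Connected-component pass: size and centroid of each trace."""
    labels, n = ndimage.label(bitmap)
    traces = []
    for i in range(1, n + 1):
        ys, xs = np.nonzero(labels == i)
        traces.append({"size": int(ys.size), "centroid": (ys.mean(), xs.mean())})
    return traces

def match_dot(streak, dots, search_dist, tol=0.25):
    """Look on both sides of the streak for a dot near the coded separation
    (the paper also checks the streak-length-to-separation ratio)."""
    sy, sx = streak["centroid"]
    for dot in dots:
        dy, dx = dot["centroid"]
        r = np.hypot(dy - sy, dx - sx)
        if abs(r - search_dist) <= tol * search_dist:
            return dot
    return None
```

The centroid-based matching here is cruder than the paper's end-of-streak center-of-mass estimate, but it reproduces the overall flow: sharpen, equalize, threshold, label connected pixels, then search a limited annulus for the partner trace.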
Particle Production

For the technique to be practical, particularly for large scale towing tank flows, an efficient method of manufacturing microscopic fluorescent particles was needed. A substantial effort has been invested in developing a reliable and controllable manufacturing process. The particles were composed of a specific mixture of acrylics and several fluorescing dyes, adjusted to produce a neutrally buoyant substance. They were manufactured by dissolving the acrylics and then mixing the solution with the dyes. The mixture was atomized and the resulting "dust" (5-10 microns in diameter) was collected and used as velocity tracers.

Error Estimation

An important aspect of the analysis is the estimation of the measurement error. Geometric distortion due to the lenses can be corrected by using the techniques described by Green [11]. Other errors are predominantly due to the digitization, which sets the accuracy of each measurement to ±0.5 pixels. The velocity is equal to the separation distance divided by the time lag between exposures. The separation distance is found by calculating the edge-to-edge distance between traces plus half the thickness of each trace. The error in the edge-to-edge distance is 1 pixel and the error in estimating the center of the particle is 0.5 pixels, so the total error is 1.5 pixels. This is a rough analysis, since filtering and enhancement may also introduce errors. As a result, future calibrations will be utilized for determining the error more accurately. In the future, the images will generally be digitized to provide a separation distance of 15 pixels, resulting in an error of approximately 10%. The error can be further reduced by reducing the digitization scale. For example, if the image is digitized such that the separation distance is increased to 60 pixels, the error decreases to 2.5%. However, the processing time increases accordingly.
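The quoted figures follow directly from the 1.5-pixel total position uncertainty. A one-line check (our own illustrative helper, not part of the original analysis):

```python
def relative_error(separation_px, position_error_px=1.5):
    """Relative velocity error due to digitization, as a fraction of the
    measured velocity; separation_px is the particle displacement in pixels."""
    return position_error_px / separation_px

# A 15-pixel separation gives a 10% error; 60 pixels gives 2.5%.
```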
Thus, a judgement of the best digitization scale must be made. By recording the original image on film, different digitization scales can be used for different portions of the image. This provides the capability to optimize between processing speed and error. This feature is especially important if gradients of the velocity are desired, i.e. during vorticity analysis. Hesselink [12] estimated that the maximum acceptable relative error must be less than 0.5% to ensure an accurate vorticity determination. This error level is quite impractical for the technique presented in this paper. However, errors on the order of 1-10% can be achieved, depending on the length of the analysis. It should be noted here that to achieve an error of 1% the traces of a single particle should occupy the entire 512x512 frame.

Summary and Future Work

A particle displacement velocimetry technique utilizing digital image processing has been developed for examining large scale complex turbulent flows. The technique consists of illuminating a section of the flow field with an argon-ion laser sheet while seeding the water with microscopic, fluorescing, neutrally buoyant particles. These tracers are invisible in most of the flow field, but respond with intense fluorescence within the illuminated section. By pulsing the laser twice while recording a single frame, each particle leaves two traces on the same frame. The velocity is determined from the distance between the traces, and the direction of the flow is identified by keeping one of the pulses longer than the other. Each recorded film negative is analyzed by digitizing a portion of the frame to a 512x512 pixel array, which is then enhanced, thresholded and translated to a bit map. The analysis of the bit map consists of searching the array for traces. Once a trace is found, its position, orientation, width and length are determined.
From this information, as well as the illumination code, a search distance and direction for the matching second trace are calculated. If the second trace is found within the predetermined area, the velocity is determined from the separation distance. This procedure is repeated until the entire image is analyzed. At present, the system is installed in the 140 foot towing tank at the David Taylor Research Center and is being utilized in the study of three-dimensional separated flows. Further refinements of the image processing and analysis procedures are also currently underway. In the future the image processing and analysis will be fully automated, and expert systems will be utilized to determine the proper levels of enhancement, thresholding and accuracy of analysis.

Acknowledgement

This work was sponsored in part by the Office of Naval Research Applied Hydrodynamic Research Program and in part by DARPA's Submarine Technology Program. Their support is gratefully acknowledged.

References

1. Kobayashi, T., 1983, Proc. 3rd Int. Symp. on Flow Visualization, Sept. 6-9, Ann Arbor, Michigan, p. 261.
2. Marko, K. A. and Rimai, L., 1985, Appl. Opt., Vol. 24, pp. 3666-3672.
3. Racca, R. G. and Dewey, J. M., 1988, Experiments in Fluids, Vol. 6, pp. 25-32.

4. Adrian, R. J., 1984, Appl. Opt., Vol. 23, pp. 1690-1691.
5. Gharib, M. and Willert, C., 1988, AIAA Paper 88-3776-CP.
6. Khalighi, B., 1989, Experiments in Fluids, Vol. 7, pp. 142-144.
7. Adamczyk, A. A. and Rimai, L., 1988, Experiments in Fluids, Vol. 6, pp. 373-380.
8. Landreth, C. C., Adrian, R. J. and Yao, C. S., 1988, Experiments in Fluids, Vol. 6, pp. 119-128.
9. Rosenfeld, A. and Kak, A. C., 1982, Digital Picture Processing, 2nd edition, Academic Press, New York.
10. Prasad, R. R. and Sreenivasan, K. R., 1989, Experiments in Fluids, Vol. 7, pp. 259-264.
11. Green, W. B., 1983, Digital Image Processing: A Systems Approach, Van Nostrand Reinhold, New York.
12. Hesselink, L., 1988, Ann. Rev. Fluid Mech., Vol. 20, pp. 421-485.

Figure 1: Sketch of the large scale flow visualization facility in the 140 ft. towing tank at the David Taylor Research Center. (Legend: CL - cylindrical lens, F - filter, M - mirror, L - lens; labeled components include the argon laser, acousto-optical switch, optical fiber, beam splitter, laser sheet, back and side illumination probes, trailing and side cameras, trailing and side periscopes, strut support, and hydrofoil.)

Figure 2: Incident illumination modulation code with a typical pair of traces of the same particle.

Figure 3: (a) Image of particle traces enlarged after digitization. (b) Same image as (a) but magnified prior to digitization.

Figure 4: Typical double exposed image of the flow near a 1 inch diameter nozzle.

Figure 5: Enlarged section of Figure 4.

Figure 6: Figure 5 sharpened through convolution.

Figure 7: Binary bit map of Figure 6.

Figure 8: General sequence of image analysis steps. (1) Determine the length and orientation of the trace. (2) Determine the width and the position of the center of the particle at the end of the trace. (3) Search a small area at the calculated search distance. (4) Measure the particle displacement.

Figure 9: Map of the velocity field of the flow near a 1 inch nozzle, determined through analysis of Figure 7, by digital image processing.