tried to recognize the clustered communities within the Web. One could, for example, keep the user away from particular communities or exclude certain communities from the list of allowed Web sites.


In the Stanford WIPE system,3 we use software to analyze image content and classify each image as appropriate or not. Speed and accuracy are both at issue: we try to avoid false positives as well as false negatives. Common image-processing challenges include a nonuniform image background; textual noise in the foreground; and a wide range of image quality, camera positions, and composition.

This work was inspired by the Fleck-Forsyth-Bregler System at the University of California at Berkeley, which classifies images as pornographic or not.4 The published results were 52 percent sensitivity (i.e., 48 percent false negatives) and 96 percent specificity (i.e., 4 percent false positives). The Berkeley system had a rather long processing time of 6 minutes per image.

In comparison, the WIPE system has higher sensitivity (96 percent) and somewhat lower, though still high, specificity (91 percent), with a processing time of less than 1 second per image. This technology is most applicable to automated identification of commercial porn sites; it could also be purchased by filtering companies and added to their products to increase accuracy.
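The sensitivity and specificity figures above are simple ratios over a labeled test set. A minimal sketch of the arithmetic, using hypothetical counts chosen only to match the published percentages (the actual test-set sizes are not given here):

```python
def rates(true_pos, false_neg, true_neg, false_pos):
    """Sensitivity = fraction of objectionable images caught;
    specificity = fraction of benign images correctly passed."""
    sensitivity = true_pos / (true_pos + false_neg)
    specificity = true_neg / (true_neg + false_pos)
    return sensitivity, specificity

# Hypothetical 100-image test sets per class, matching the reported figures:
rates(52, 48, 96, 4)   # Berkeley system -> (0.52, 0.96)
rates(96, 4, 91, 9)    # WIPE system     -> (0.96, 0.91)
```

Note that a 48 percent false-negative rate is simply 1 minus sensitivity, and a 4 percent false-positive rate is 1 minus specificity.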

In the WIPE system, an image is acquired and features are extracted using wavelet technology. If the image is classified as a photograph (rather than a drawing), further processing compares its feature vector with prestored vectors; the image is then classified as pornographic or not, and the user can reject it or let it pass on that basis. The assumption is that only photographs, and not manually generated images such as an artist's rendering, would be potentially objectionable. Manually generated images can be distinguished on the basis of tone: smooth tones for manually generated images versus continuous tones for photographs. Only photographs, then, require the next processing stage.


For a technical discussion, see James Z. Wang, Integrated Region-based Image Retrieval, Dordrecht, Holland: Kluwer Academic Publishers, 2001, pp. 107-122. The acronym WIPE stands for Wavelet Image Pornography Elimination.


Margaret Fleck, David Forsyth, and Chris Bregler, “Finding Naked People,” Proceedings of the European Conference on Computer Vision, B. Buxton and R. Cipolla, eds., Berlin, Germany: Springer-Verlag, Vol. 2, 1996, pp. 593-602.

The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.