A STREAM PROCESSOR FOR EXTRACTING USAGE INTELLIGENCE FROM HIGH-MOMENTUM INTERNET DATA

Lee RHODES

The data streams of the Internet are quite large and present significant challenges to those wishing to analyze them on a continuous basis. Opportunities for analysis for a Network Service Provider include understanding subscriber usage patterns for developing new services, network demand flows for network operations and capacity planning functions, and early detection of network security breaches. The conventional analysis paradigm of store first, then analyze later has significant cost and latency issues when applied to these high-momentum streams. This article presents a deployed architecture for a general-purpose stream processor that includes dynamically configurable Capture Models, which can be tailored for compact collection of statistics of the stream in real time. The highly configurable flow processing model is presented with numerous examples of how multiple streams can be merged and split based on the requirements at hand.

Key Words: DNA; IUM; Real-time statistics; Statistical pre-processing.

1. INTRODUCTION

In 1997 a small R&D group was formed inside of Hewlett-Packard's Telecommunications Business Unit to develop Internet usage management software for Network Service Providers (NSPs). The services offered by these NSPs ranged from Internet backbone to Internet access. The range of access services included residential and commercial broadband (cable and xDSL), dial-up, mobile data, as well as numerous flavors of hosting and application services. Early on our focus was the processing of usage data records (e.g., NetFlow® or sFlow®) produced by Internet routers.
However, it quickly broadened to include convergent voice Call Detail Records (CDRs), as well as the ability to collect and process data from a very broad range of sources such as log files, databases, and other protocols. The diverse technological histories (and biases) of the different segments of the communications industry created interesting challenges for us in creating a software architecture that was

Lee Rhodes is Chief Scientist/Architect, IUM/DNA, Hewlett-Packard Co. (E-mail: lee.rhodes@hp.com).

©2003 American Statistical Association, Institute of Mathematical Statistics, and Interface Foundation of North America. Journal of Computational and Graphical Statistics, Volume 12, Number 4, Pages 927–944. DOI: 10.1198/1061860032706
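The abstract describes a flow-processing model in which multiple usage-record streams are merged and split, with configurable "Capture Models" accumulating statistics of the stream in real time rather than storing records first and analyzing later. The sketch below illustrates that idea in miniature; all class and function names (`CountSumCapture`, `merge_streams`, `split_stream`) are hypothetical stand-ins, not the deployed IUM/DNA API.

```python
# Illustrative sketch only: names and structure are assumptions, not the
# article's actual implementation of Capture Models or flow processing.

class CountSumCapture:
    """A minimal capture model: running count and sum of one record field."""
    def __init__(self, field):
        self.field = field
        self.count = 0
        self.total = 0

    def update(self, record):
        # Statistics are folded in as records stream past; no record is stored.
        self.count += 1
        self.total += record[self.field]


def merge_streams(*streams):
    """Combine several usage-record feeds into one logical stream."""
    for stream in streams:
        yield from stream


def split_stream(stream, predicate):
    """Route each record to one of two downstream branches."""
    left, right = [], []
    for rec in stream:
        (left if predicate(rec) else right).append(rec)
    return left, right


# Usage: merge two router feeds, split by subscriber class, capture stats
# on the residential branch. Field names here are invented for the example.
feed_a = [{"bytes": 100, "residential": True},
          {"bytes": 300, "residential": False}]
feed_b = [{"bytes": 50, "residential": True}]

merged = merge_streams(feed_a, feed_b)
residential, commercial = split_stream(merged, lambda r: r["residential"])

cap = CountSumCapture("bytes")
for rec in residential:
    cap.update(rec)
# cap.count == 2 and cap.total == 150 for the sample feeds above
```

The point of the design, as the abstract frames it, is that the capture model keeps only compact running statistics, so the high-momentum stream never needs to be stored before analysis.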