Leveraging Big Data to Improve Traffic Incident Management

Glossary (pages 121-124)
Batch Processing
Batch processing refers to a computer working automatically through a queue, or batch, of separate jobs or programs in a non-interactive manner.

Big Data
Big Data is data that traditional data management systems cannot manage due to its size and complexity.
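The batch model can be sketched in a few lines: jobs are queued up front and then drained sequentially with no user interaction. This is an illustrative example, not code from the report; the job names and tasks are made up.

```python
# Minimal batch-processing sketch: a queue of jobs is drained
# automatically, with no prompts or interaction mid-run.
from collections import deque

def run_batch(jobs):
    """Drain a queue of (name, func, arg) jobs and collect their results."""
    queue = deque(jobs)
    results = {}
    while queue:
        name, func, arg = queue.popleft()
        results[name] = func(arg)  # each job runs to completion unattended
    return results

if __name__ == "__main__":
    batch = [("square", lambda x: x * x, 3),
             ("shout", str.upper, "done")]
    print(run_batch(batch))  # {'square': 9, 'shout': 'DONE'}
```

Contrast this with interactive use, where a person issues one command at a time and waits for each response.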
Database Schema
A database schema is a type of data model used to organize data inside a relational database.

Distributed Computing
Distributed computing is a model in which components of a software system are shared among multiple computers to improve efficiency and performance.
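A schema is typically expressed as table definitions. The sketch below uses Python's built-in sqlite3 module; the incident/response tables and their columns are illustrative assumptions, not taken from the report.

```python
# Minimal relational-schema sketch using the standard-library sqlite3 module.
# Table and column names are hypothetical examples for a traffic-incident domain.
import sqlite3

SCHEMA = """
CREATE TABLE incident (
    incident_id INTEGER PRIMARY KEY,
    road        TEXT NOT NULL,
    started_at  TEXT NOT NULL
);
CREATE TABLE response (
    response_id INTEGER PRIMARY KEY,
    incident_id INTEGER NOT NULL REFERENCES incident(incident_id),
    agency      TEXT NOT NULL
);
"""

def create_db():
    """Create an in-memory database from the schema and return the connection."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(SCHEMA)
    return conn

if __name__ == "__main__":
    conn = create_db()
    tables = [row[0] for row in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table' ORDER BY name")]
    print(tables)  # ['incident', 'response']
```

The schema (tables, columns, types, and the foreign-key relationship) is the data model; the rows stored later are the data it organizes.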
NetCDF
NetCDF is a set of software libraries and self-describing, machine-independent data formats that support the creation, access, and sharing of array-oriented scientific data. NetCDF was inspired by NASA's Common Data Format (CDF) and is developed and maintained by the Unidata program of the University Corporation for Atmospheric Research (UCAR).
Server Clusters
A server cluster, or computer cluster, is a set of connected computers that work together so that, in many respects, the cluster can be viewed as a single system. Computer clusters are used to increase performance and reliability when processing very large datasets.
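The divide-and-combine pattern behind cluster processing can be shown with a single-machine analogy: split a large dataset into chunks, let parallel workers each process one chunk, then merge the partial results. This sketch uses threads in place of separate machines; a real cluster would run the workers on different computers.

```python
# Single-machine analogy for cluster-style processing: split the data,
# process chunks in parallel workers, combine the partial results.
# In an actual cluster, each worker would run on a separate connected computer.
from concurrent.futures import ThreadPoolExecutor

def chunk_sum(chunk):
    """Work done by one worker: reduce its chunk to a partial result."""
    return sum(chunk)

def clustered_sum(data, workers=4):
    """Split data into roughly equal chunks, sum them in parallel, and merge."""
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(chunk_sum, chunks))

if __name__ == "__main__":
    print(clustered_sum(list(range(1000))))  # 499500
```

The key property, as in the definition above, is that the caller sees one answer from one "system" even though many workers shared the job.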

Key Terms



This material may be derived from roughly machine-read images, and so is provided only to facilitate research.
More information on Chapter Skim is available.