D Glossary

“Overall frequency accuracy” and “clock accuracy” are defined by military specification MIL-PRF-55310D as follows:

6.4.33 Overall frequency accuracy. The maximum permissible frequency deviation of the oscillator frequency from the assigned nominal value due to all combinations of specified operating and non-operating parameters within a specified period of time. In the general case, overall accuracy of an oscillator is the sum of the absolute values assigned to the following:

  1. The initial frequency-temperature accuracy . . .

  2. Frequency-tolerances due to supply voltage changes . . . and other environmental effects . . .

  3. Total frequency change from an initial value due to frequency aging . . . at a specified temperature.

6.4.4 Clock accuracy. The degree of conformity of a clock’s rate with that of a time standard. Clock accuracy [is] expressed as the worst case time error that can accumulate over specified operating conditions and over a specified duration following clock synchronization (e.g., 10 milliseconds per day) . . .
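The relationship between a clock's rate error and its accumulated time error can be sketched numerically. Assuming a constant fractional frequency offset y (an idealization; real oscillators also drift), the time error after an interval T following synchronization is simply y · T. The function name and numbers below are illustrative, not from the specification:

```python
def accumulated_time_error(frac_freq_offset, elapsed_seconds):
    """Time error accumulated by a clock running at a constant
    fractional frequency offset, starting from synchronization."""
    # With a constant rate error y, the clock gains (or loses)
    # y seconds for every second of elapsed time: error = y * T.
    return frac_freq_offset * elapsed_seconds

# Example: a fractional frequency offset of 1e-10 sustained over
# one day (86 400 s) accumulates 8.64 microseconds of time error.
error = accumulated_time_error(1e-10, 86_400)
```

By the same arithmetic, the example figure of 10 milliseconds per day quoted in the definition corresponds to an average fractional frequency offset of about 1.2 × 10⁻⁷.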

Stability is the maximum deviation of the oscillator frequency due to operation over a specified parameter range. For example, MIL-PRF-55310D defines frequency-temperature stability as follows:

6.4.16 Frequency-temperature stability. The maximum permissible deviation of the oscillator frequency, with no reference implied, due to operation over the specified temperature range at nominal supply and load conditions, other conditions constant.



The National Academies | 500 Fifth St. N.W. | Washington, D.C. 20001
Copyright © National Academy of Sciences. All rights reserved.




The short-term stability of an oscillator is measured by its Allan deviation, denoted σy(τ): the square root of one half the average of the squares of the differences between successive normalized frequency departures, each averaged over the sampling time τ, under the assumption that there is no dead time between the averaged normalized frequency departure samples.
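The verbal definition above can be written out as a short numerical sketch. This is an illustrative implementation of the non-overlapping Allan deviation (the function name and the NumPy dependency are our own choices, not from the source):

```python
import numpy as np

def allan_deviation(y, m=1):
    """Non-overlapping Allan deviation from normalized frequency
    departure samples y, assumed taken with no dead time.

    m is the averaging factor: each block of m consecutive samples
    is averaged, so each element of ybar is the frequency departure
    averaged over the sampling time tau = m * (sample interval).
    """
    n = len(y) // m
    # Successive averages over the sampling time tau.
    ybar = np.asarray(y, dtype=float)[: n * m].reshape(n, m).mean(axis=1)
    # sigma_y(tau) = sqrt( (1/2) * mean( (ybar[k+1] - ybar[k])^2 ) )
    d = np.diff(ybar)
    return np.sqrt(0.5 * np.mean(d * d))
```

A perfectly constant frequency departure yields an Allan deviation of zero, since successive averages never differ; any variation between successive averages contributes to σy(τ).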