Nyquist Theorem


A theorem stating that when an analogue waveform is digitised, only the frequencies in the waveform below half the sampling frequency will be recorded faithfully. To reconstruct (interpolate) a signal from a sequence of samples, enough samples must be taken to capture the peaks and troughs of the original waveform. If a waveform is sampled at less than twice its highest frequency, the components above half the sampling rate are not simply lost: they reappear in the reconstructed waveform as spurious lower frequencies, corrupting it. This phenomenon is called "aliasing" (the high frequencies appear "under an alias").
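The effect can be shown numerically. In this sketch (the sampling rate and frequencies are illustrative, not from the definition), a 7 kHz cosine sampled at 10 kHz, i.e. above the 5 kHz Nyquist limit, produces exactly the same sample values as a 3 kHz cosine, its alias at fs - f:

```python
import math

fs = 10_000            # sampling rate (Hz); Nyquist limit is fs / 2 = 5 kHz
f_high = 7_000         # tone above the Nyquist limit
f_alias = fs - f_high  # 3 kHz: the frequency the 7 kHz tone "appears" at

high = [math.cos(2 * math.pi * f_high * n / fs) for n in range(20)]
alias = [math.cos(2 * math.pi * f_alias * n / fs) for n in range(20)]

# The two sample sequences are identical, so no reconstruction method
# can distinguish the 7 kHz tone from its 3 kHz alias.
assert all(math.isclose(a, b, abs_tol=1e-9) for a, b in zip(high, alias))
```

This is why anti-aliasing filters remove frequencies above half the sampling rate before digitisation: once sampled, the alias is indistinguishable from a genuine low-frequency component.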
This is why the best digital audio is sampled at 44,100 Hz - slightly more than twice the roughly 20,000 Hz upper limit of human hearing.
The Nyquist Theorem is not specific to digitised signals (those quantised to discrete amplitude levels); it applies to any sampled signal (one taken at discrete instants in time), not just sound.
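The reconstruction (interpolation) mentioned above can be sketched with the standard Whittaker-Shannon formula, which sums sinc-weighted samples. The signal, rate, and record length below are illustrative assumptions; with a finite record the result is approximate, so the check uses a loose tolerance:

```python
import math

fs = 8_000   # sampling rate (Hz)
f = 1_000    # sinusoid frequency, well below the 4 kHz Nyquist limit
N = 2_000    # number of recorded samples

samples = [math.sin(2 * math.pi * f * n / fs) for n in range(N)]

def reconstruct(t):
    """Whittaker-Shannon interpolation: sum of sinc-weighted samples."""
    total = 0.0
    for n, x in enumerate(samples):
        u = t * fs - n
        total += x * (1.0 if u == 0 else math.sin(math.pi * u) / (math.pi * u))
    return total

# Evaluate between two sample instants, near the middle of the record,
# and compare against the true continuous-time value.
t = 1000.5 / fs
assert math.isclose(reconstruct(t), math.sin(2 * math.pi * f * t), abs_tol=1e-2)
```

Because the signal is sampled above twice its frequency, the interpolated value between samples matches the original waveform; sampled below that rate, the same formula would reconstruct the alias instead.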
Nyquist (http://geocities.com/bioelectrochemistry/nyquist.htm) (the man, somewhat inaccurate).
The Free On-line Dictionary of Computing, © Denis Howe 2010 http://foldoc.org
