Jitter


Overview

The term "jitter" describes variations in a periodic signal: deviations in the frequency, amplitude, or "phase" of the signal relative to its ideal form, or to its form at the point of generation. In digital audio, one of the most problematic issues involving jitter is "clock recovery" from signals transmitted between pieces of equipment. If jitter in the recovered clock signal affects the timing of the conversion, even small amounts of jitter can effectively reduce the resolution of the conversion to far below the theoretical limits of 16-24 bit conversion.
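To make the effect concrete, the following is a minimal sketch (not taken from this article) of how sampling-clock jitter alone limits converter resolution. The sample rate, test-tone frequency, and 1 ns RMS jitter figure are assumptions chosen purely for illustration; a real converter's clock jitter would normally be far smaller.

<pre>
# Illustrative sketch: how sampling-clock jitter limits effective resolution
# of an otherwise ideal converter. All numeric values are assumptions.
import numpy as np

fs = 48_000.0          # sample rate in Hz (assumed)
f_sig = 10_000.0       # test-tone frequency in Hz (assumed)
n = 1 << 16            # number of samples
rms_jitter = 1e-9      # 1 ns RMS clock jitter (assumed, exaggerated)

rng = np.random.default_rng(0)
t_ideal = np.arange(n) / fs
t_jittered = t_ideal + rng.normal(0.0, rms_jitter, n)

ideal = np.sin(2 * np.pi * f_sig * t_ideal)
jittered = np.sin(2 * np.pi * f_sig * t_jittered)

# Error introduced solely by the timing uncertainty of the clock
noise = jittered - ideal
snr_db = 10 * np.log10(np.mean(ideal**2) / np.mean(noise**2))
enob = (snr_db - 1.76) / 6.02   # effective number of bits

print(f"SNR limited by jitter: {snr_db:.1f} dB (~{enob:.1f} bits)")
# Closed-form estimate for comparison: SNR = -20*log10(2*pi*f_sig*rms_jitter)
print(f"Theoretical limit:     {-20 * np.log10(2 * np.pi * f_sig * rms_jitter):.1f} dB")
</pre>

With these assumed values the simulation lands near 84 dB, roughly 13-14 effective bits, which is why even nanosecond-scale jitter matters for 16-24 bit conversion.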

Basics

Because even "digital" signals are in reality very high frequency analog signals, the receiving device must reconstruct the data by applying some form of amplitude "threshold" to the incoming waveform.
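The sketch below (not from this article) illustrates this idea: a received edge with a finite rise time is sliced by an amplitude threshold, and amplitude noise on that edge shifts the moment the threshold is crossed, converting noise into timing jitter. The rise time, noise level, and threshold value are all assumed for illustration.

<pre>
# Illustrative sketch: recovering a logic transition with an amplitude
# threshold, and how noise on a finite-speed edge becomes timing jitter.
# All numeric values are assumptions.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 100e-9, 2001)                  # 100 ns window, 50 ps steps
rise_time = 20e-9                                   # assumed rise time of the edge
edge = np.clip((t - 40e-9) / rise_time, 0.0, 1.0)   # idealized 0 -> 1 transition
threshold = 0.5                                     # amplitude threshold

def crossing_time(waveform):
    """Return the time at which the waveform first exceeds the threshold."""
    idx = np.argmax(waveform >= threshold)
    return t[idx]

clean_cross = crossing_time(edge)
# Add amplitude noise and observe how the crossing time moves (this is jitter)
crossings = [crossing_time(edge + rng.normal(0.0, 0.05, t.size)) for _ in range(1000)]
print(f"clean crossing:   {clean_cross * 1e9:.2f} ns")
print(f"crossing std dev: {np.std(crossings) * 1e9:.2f} ns (amplitude noise converted to jitter)")
</pre>

The slower the edge and the noisier the link, the more the threshold-crossing time wanders, which is one reason clock recovery from transmitted signals is so sensitive to signal quality.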