Q: What is the difference between jitter and wander?

A: Jitter is defined as the magnitude of phase variation (with respect to a reference clock or data signal) whose frequency of variation is greater than 10 Hz. Jitter is measured in unit intervals (UI), where 1 UI equals one data bit width. For an E1 signal, 1 UI is 488 ns; for a DS1 signal, 1 UI is 648 ns. If the rate of phase variation is less than 10 Hz, the phenomenon is instead known as wander, which is measured in nanoseconds.
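
The figures above follow directly from the line rates: 1 UI is one bit period, i.e. the reciprocal of the bit rate. Here is a minimal sketch in Python illustrating that arithmetic and the 10 Hz jitter/wander split described in the answer; the rate table, function names, and threshold constant are illustrative choices, not part of any standard library.

```python
# Line rates from the answer: E1 = 2.048 Mbit/s, DS1 = 1.544 Mbit/s.
LINE_RATES_BPS = {
    "E1": 2_048_000,   # 1 / 2.048 MHz ~= 488 ns per UI
    "DS1": 1_544_000,  # 1 / 1.544 MHz ~= 648 ns per UI
}

# Phase variation faster than 10 Hz is jitter; slower is wander.
JITTER_WANDER_THRESHOLD_HZ = 10.0


def unit_interval_ns(bit_rate_bps: float) -> float:
    """One UI equals one bit period: 1 / bit rate, in nanoseconds."""
    return 1e9 / bit_rate_bps


def classify_phase_variation(freq_hz: float) -> str:
    """Classify a phase-variation frequency as jitter or wander."""
    return "jitter" if freq_hz > JITTER_WANDER_THRESHOLD_HZ else "wander"


if __name__ == "__main__":
    for name, rate in LINE_RATES_BPS.items():
        print(f"{name}: 1 UI = {unit_interval_ns(rate):.0f} ns")
    print(classify_phase_variation(50.0))  # -> jitter
    print(classify_phase_variation(0.1))   # -> wander
```

Running this prints 488 ns for E1 and 648 ns for DS1, matching the values quoted in the answer.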