Q:

What is the difference between jitter and wander?


Answers

A: Jitter is defined as the magnitude of phase variation (with respect to a reference clock or data signal) whose rate of variation is greater than 10 Hz. Jitter is measured in unit intervals (UI), where 1 UI equals one data bit-width: for an E1 signal (2.048 Mbit/s), 1 UI is 488 ns, and for a DS1 signal (1.544 Mbit/s), 1 UI is 648 ns. If the rate of change in phase is less than 10 Hz, the phenomenon is instead known as wander, and it is measured in nanoseconds.
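The UI figures above follow directly from the line rates: one UI is one bit period, i.e. the reciprocal of the bit rate. A minimal sketch of that arithmetic (the helper name `unit_interval_ns` is just illustrative):

```python
def unit_interval_ns(bit_rate_hz: float) -> float:
    """One unit interval (UI) is one bit period: 1 / bit_rate, in nanoseconds."""
    return 1e9 / bit_rate_hz

# E1 runs at 2.048 Mbit/s; DS1 (T1) runs at 1.544 Mbit/s.
print(round(unit_interval_ns(2.048e6)))  # E1:  488 ns per UI
print(round(unit_interval_ns(1.544e6)))  # DS1: 648 ns per UI
```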