It is desired to drive a digital system with a clock period that is always at least 1 ns. The clock has a nominal period T_0 = 1.1 ns and a Gaussian jitter distribution. If the RMS period jitter is 5 ps, with what probability will any particular clock period be less than 1 ns?
The clock period will be less than 1 ns whenever the period jitter J_k < −0.1 ns. The PDF of J_k is
\[
f_J(t) = \frac{1}{(5\cdot 10^{-12})\sqrt{2\pi}}\,\exp\!\left\{-\frac{t^2}{2\,(5\cdot 10^{-12})^2}\right\}
\]
Hence, the probability that the clock period is less than 1 ns is given by the integral of f_J(t) over the range −∞ to −100 ps; since 100 ps is 20 times the 5 ps RMS jitter, this is a 20σ tail event.
\[
\int_{-\infty}^{-10^{-10}\,\mathrm{s}} f_J(t)\,dt = 2.75\cdot 10^{-89}
\]
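Equivalently, the result is the standard-normal tail probability Q(20) = ½ erfc(20/√2). As a quick sanity check, the short Python sketch below evaluates that tail directly; it assumes only the values stated above (5 ps RMS jitter, a −100 ps threshold), and the variable names are illustrative.

```python
import math

sigma = 5e-12          # RMS period jitter, in seconds
threshold = -100e-12   # jitter that makes the period fall below 1 ns, in seconds

# P(J_k < threshold) for a zero-mean Gaussian with standard deviation sigma:
# Phi(z) = 0.5 * erfc(-z / sqrt(2)), where z is the threshold in units of sigma.
z = threshold / sigma                     # = -20, i.e. a 20-sigma event
p = 0.5 * math.erfc(-z / math.sqrt(2))    # standard-normal CDF evaluated at z

print(f"P(period < 1 ns) ≈ {p:.3g}")      # prints ≈ 2.75e-89
```

Using erfc rather than integrating the PDF numerically keeps the computation accurate even this far out in the tail, where the probability underflows any naive quadrature.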