I am running a transient noise simulation of a PLL and checking both the phase noise and the jitter. I calculate the "phase noise" as follows: I measure the delay between the input (reference) signal and the feedback signal, subtract its average, and take the RMS of the result. I also calculate the jitter at the output of the PLL using the jitter function.

Here is the strange part: I enable transient noise only for the divider, so that I can see the effect of the divider's noise alone, both on the input/feedback phase error and on the jitter of the whole PLL. In Cadence, these two numbers (the RMS of the input-to-feedback delay and the jitter at the output) come out the same.

Next, I put the divider in a separate testbench, drive it with an ideal input signal, run PSS and pnoise, and get its noise plot. I then feed these values into MATLAB, and MATLAB shows that the two "jitters" (the one between the input and feedback signals and the one at the output) have different values. I think the MATLAB calculation is reasonable, because the divider noise is shaped by a high-pass transfer function (modelled in MATLAB) when I look at the jitter between input and feedback, and by a low-pass transfer function when I look at the output of the PLL.

What is your opinion about this? Why does Cadence show that the divider noise has the same effect whether I look at the input (delay between input and feedback signal) or at the jitter at the output? Any idea will be very helpful.
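In case it helps, this is roughly how I post-process the transient data in MATLAB (a minimal sketch; t_ref and t_fb are placeholder names for the exported threshold-crossing times of the reference and feedback signals, assumed paired edge for edge):

    % t_ref, t_fb : rising-edge crossing times of the reference (input) and
    %               feedback signals from the transient-noise run
    %               (placeholder names; edges assumed paired, equal length)
    delay   = t_fb - t_ref;           % instantaneous delay of each edge pair
    delay   = delay - mean(delay);    % subtract the average (static) delay
    jit_err = sqrt(mean(delay.^2));   % rms of the zero-mean delay, in seconds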
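And this is a sketch of the MATLAB calculation with the loop transfer functions. It assumes a simple type-II charge-pump loop model; Icp, R, C, Kvco, N, fref, fout and the exported pnoise data (f, Ldiv) are placeholders for my actual loop parameters and divider noise plot:

    % f    : offset-frequency vector from the divider pnoise run [Hz]
    % Ldiv : divider output phase noise from pnoise [dBc/Hz]
    % Icp, R, C, Kvco, N, fref, fout : charge-pump current, loop-filter
    %   components, VCO gain [Hz/V], divide ratio, reference/output frequency
    Sdiv = 2*10.^(Ldiv/10);                 % convert L(f) to a phase PSD [rad^2/Hz]

    s   = 1j*2*pi*f;
    Gol = (Icp/(2*pi)) .* (R + 1./(s*C)) .* (2*pi*Kvco./s) ./ N;   % open-loop gain

    Herr = 1./(1 + Gol);                    % divider noise -> input/feedback error (high-pass)
    Hout = N*Gol./(1 + Gol);                % divider noise -> PLL output (low-pass)

    jit_err = sqrt(trapz(f, Sdiv.*abs(Herr).^2)) / (2*pi*fref);  % rms jitter at the PFD [s]
    jit_out = sqrt(trapz(f, Sdiv.*abs(Hout).^2)) / (2*pi*fout);  % rms jitter at the output [s]

With this model jit_err and jit_out integrate over differently shaped PSDs, which is why I expect them to differ.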
Many thanks in advance!