Hi all,
I would like to run a transient noise analysis on a mixed-signal system. To get a better estimate of the noise bandwidth of interest, I first ran an AC noise analysis of the system in steady state and found that the relevant noise contributions occur below 10 GHz. However, when I run the transient simulation with noise enabled up to 10 GHz for a simulated time of more than 100 us, the simulation practically takes forever.
Without having any deep knowledge of transient noise simulation, I came up with the idea of enabling transient noise only up to 1 GHz instead of 10 GHz and compensating for the "lost" noise by scaling it up with the noise scale parameter. I calculate the necessary scaling factor by dividing the integrated AC noise over the 10 GHz bandwidth by the integrated AC noise over the 1 GHz bandwidth.
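To make the calculation concrete, here is a minimal Python sketch with hypothetical numbers. I'm assuming here that the "integrated AC noise" values are the RMS output noise voltages reported by the AC noise analysis, and that the noise scale parameter acts linearly on the generated noise amplitude; the numbers themselves are just placeholders, not from my design.

```python
# Minimal sketch of the scaling-factor calculation described above.
# Assumptions (mine, not simulator documentation): the two values below are
# the RMS output noise voltages from the AC noise analysis, integrated up to
# 10 GHz and 1 GHz respectively, and the noise scale parameter multiplies
# the generated noise amplitude linearly.

vnoise_rms_10g = 350e-6  # hypothetical integrated output noise, 0..10 GHz [V rms]
vnoise_rms_1g  = 240e-6  # hypothetical integrated output noise, 0..1 GHz  [V rms]

# Scale factor so that noise generated only up to 1 GHz carries the same
# integrated (RMS) noise as the full 10 GHz bandwidth would.
noise_scale = vnoise_rms_10g / vnoise_rms_1g
print(f"noise scale factor: {noise_scale:.3f}")

# Sanity check: the scaled 1 GHz integrated noise matches the 10 GHz value.
assert abs(noise_scale * vnoise_rms_1g - vnoise_rms_10g) < 1e-12
```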
My question is very simple: does the described methodology for speeding up the simulation give realistic results, or am I running into trouble here without seeing an important problem?
My idea behind this is that the bandwidth in which the noise is generated in the transient simulation is, of course, lower than the expected 10 GHz, but within this lower bandwidth I scale the noise up artificially so that the integrated noise over the full bandwidth ends up being the same as if I had simulated it up to 10 GHz.
Thank you very much in advance for your replies!