Hello,
This is my first post on the forum, so I apologize in advance if my question doesn't follow the community guidelines.
In 1990, a research team showed that cycling a MOSFET between inversion and accumulation reduces its intrinsic 1/f noise. The reduction is not merely a change in signal gain caused by the transistor being cut off half the time; it is attributed to a "refresh" of the "long-time memory" of the energy states at the Si/SiO2 interface, which are believed to cause most of the flicker noise.
Several later papers confirmed this effect.
I was wondering whether this phenomenon is modelled in Virtuoso, i.e., in the MOS flicker-noise models the simulator uses.
I am running periodic steady-state and periodic noise analyses (PSS and PNoise) on a common-source switched-bias pMOS, but I cannot observe this effect, no matter what cut-off gate-source voltage I set.
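For context, my setup is essentially the following, sketched here in Spectre netlist syntax. The model name (`pch`), supply, geometry, clock frequency, and bias levels are illustrative placeholders, not my actual design values:

```spectre
// Sketch of a switched-bias common-source pMOS PNoise testbench.
// All device/bias values below are placeholders.
simulator lang=spectre
Vdd (vdd 0) vsource dc=1.8
// Gate square wave: val0 puts the pMOS in strong inversion (Vgs = -0.9 V),
// val1 drives the gate above the source, toward accumulation (Vgs = +0.4 V).
Vg  (g 0)   vsource type=pulse val0=0.9 val1=2.2 period=1u rise=1n fall=1n
RL  (vdd d) resistor r=10k
M0  (d g vdd vdd) pch w=10u l=1u
// PSS at the switching fundamental, then PNoise at the output node d.
pss1    pss fund=1M errpreset=conservative
pnoise1 (d 0) pnoise start=1 stop=100k maxsideband=10
```

The expectation would be that lowering the output-referred flicker noise as val1 pushes the device deeper toward accumulation, but I see no dependence on that level in the PNoise results.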
I am on Cadence Virtuoso version 6.1.8.
Thanks,
Giuseppe
