Hi,
I am using an m-factor on instances (for example, a cell containing transistors) to model parallel instances, instead of using instances with buses, which would slow down simulations.
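To make the setup concrete, here is a rough sketch of the two styles in Spectre netlist form (the subcircuit and node names are just placeholders for my real cell):

    // single instance with an m-factor (what I normally do)
    I1 (in out vdd vss) my_cell m=4

    // equivalent explicit parallel copies (much slower with wide buses)
    I1a (in out vdd vss) my_cell
    I1b (in out vdd vss) my_cell
    I1c (in out vdd vss) my_cell
    I1d (in out vdd vss) my_cell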
I expect the standard deviation of my output voltage to be divided by sqrt(m). This has worked so far in most of my testbenches.
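Just to spell out the statistics behind that expectation (assuming the mismatch contributions of the m copies are independent and simply average at the output):

    \sigma_{out}(m) = \frac{1}{m}\sqrt{\sum_{i=1}^{m}\sigma^2} = \frac{\sigma}{\sqrt{m}}

If the m copies are instead 100% correlated, the averaging brings no benefit and the output standard deviation stays equal to sigma, which matches what I now observe.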
However, it recently stopped working (meaning that the m-factor no longer decreases the standard deviation of my output) in another technology PDK, where we also updated the Cadence tools.
I noticed on Cadence Support that there is an option nullmfactorcorrelation for Monte Carlo mismatch simulation. This option seems to default to no, meaning that there is 100% correlation between the m copies of an instance.
If I set it to yes, there is 0% correlation and I get back the behavior I had before.
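For completeness, here is roughly what the netlist looks like with the option added (the analysis names are placeholders, and I am assuming the parameter goes directly on the montecarlo analysis statement, so please correct me if it should be set elsewhere, e.g. as a global option):

    mc1 montecarlo variations=mismatch numruns=200 nullmfactorcorrelation=yes {
        tran1 tran stop=1u
    }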
Is this option controlled by the IC module or by the MMSIM/Spectre module, and in which version did it first appear and start defaulting to no?
Thanks
Alex