Channel: Cadence Custom IC Design Forum

Determine corner-dependent setting during operating point calculation with VerilogA model


Hello,

I have a circuit that can be controlled in discrete steps to compensate for different process corners. My idea was to add a Verilog-A component that determines the correct control setting during the operating-point calculation. However, I cannot get it to work. My current code looks something like this:

integer digCur = digMin;

analog begin
   if (analysis("static")) begin
      if (V(vin) < V(vref) && digCur < digMax) begin
         digCur = digCur + 1;
      end
   end

   V(vout) <+ digCur;
end

It is supposed to check, only during the dc operating-point analysis, whether the input value 'vin' is smaller than the reference 'vref' and, if so, increase the control word by 1. It should stop once the input exceeds the reference or the maximum control word is reached. The resulting value should then be kept and used for all subsequent analyses (e.g. ac or hb). However, even though the code runs, it does not behave as expected: the 'if' statement is executed only once, and the dc simulation finishes with digCur = digMin + 1 as the final value.
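For reference, the search I am trying to express is simple when written outside the simulator. Below is a minimal Python sketch of the intended behavior, where `vin_of` is a hypothetical stand-in for the circuit's DC response to a given control word (assumed monotonically increasing):

```python
def find_control_word(vin_of, vref, dig_min, dig_max):
    """Linear search for the smallest control word whose DC input
    voltage reaches the reference, mirroring the intended Verilog-A loop."""
    dig_cur = dig_min
    # Increase the control word while the input is still below the
    # reference and the maximum code has not been reached.
    while vin_of(dig_cur) < vref and dig_cur < dig_max:
        dig_cur += 1
    return dig_cur

# Hypothetical monotone DC response: each code step raises vin by 0.1 V.
code = find_control_word(lambda c: 0.1 * c, vref=0.55, dig_min=0, dig_max=15)
print(code)  # first code with 0.1*c >= 0.55 -> 6
```

In the Verilog-A version, each iteration of this loop would require the DC solver to re-solve the circuit with the updated 'vout', which is exactly the part that does not seem to happen.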

Is it actually possible to realize what I want? I thought that after the value of 'vout' changes, the dc simulation would have to restart, because the input to the controlled block has changed.

Best regards
Paul

