Digital Control Delay
Once you have applied all the usual tricks (oversampling, hardware acceleration, and so on), what is the smallest delay you have achieved in implementing the compensation algorithm? (In both absolute time and as a fraction of a switching period.)
I'm not talking about the final comparison in the inner loop; let's assume that stage has reverted to an analog comparator so it contributes no delay.
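To make the question concrete, here is a small sketch of the quantity being asked for: total loop latency expressed as a fraction of the switching period. The numbers (500 kHz switching, 400 ns latency) are purely illustrative assumptions, not measured results.

```python
# Hypothetical example: express a control-loop latency as a fraction of
# one switching period. All numbers are illustrative, not measurements.

def delay_fraction(delay_s: float, f_sw_hz: float) -> float:
    """Return the delay as a fraction of one switching period (T = 1/f_sw)."""
    return delay_s * f_sw_hz

# e.g. a 500 kHz converter with 400 ns of assumed ADC + compute latency
f_sw = 500e3       # switching frequency, Hz (assumed)
t_delay = 400e-9   # total digital delay, s (assumed)
frac = delay_fraction(t_delay, f_sw)
print(f"{t_delay * 1e9:.0f} ns is {frac:.2f} of a switching period")
```

So a 400 ns delay at 500 kHz would be 0.20 of a period; the question is how small people have actually managed to make both numbers in practice.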