Thursday, October 30, 2008

GAMS Function and Derivative Evaluation

I have two different formulations of a linearly constrained NLP (i.e., only the objective is nonlinear). The only difference is in how the objective is formulated. In the fragments below, x is a variable and c and e are parameters.

Original Version

cost1 ..   z  =e=  sum((i,j,k), x(i,j,k)*log(x(i,j,k)+e)) -
                   sum((i,j,k), x(i,j,k)*log(c(i,j)+e));

Total CPU secs in IPOPT (w/o function evaluations) = 6.033
Total CPU secs in NLP function evaluations = 33.971


Fast Version


cost2 .. z =e= sum((i,j,k), x(i,j,k)*log((x(i,j,k)+e)/(c(i,j)+e)));

Total CPU secs in IPOPT (w/o function evaluations) = 5.905
Total CPU secs in NLP function evaluations = 0.541
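Note that the two objectives are algebraically identical: merging the two sums in cost1 and using log(a) - log(b) = log(a/b) gives exactly the expression in cost2:

\[
\sum_{i,j,k} x_{ijk}\log(x_{ijk}+e) \;-\; \sum_{i,j,k} x_{ijk}\log(c_{ij}+e)
\;=\; \sum_{i,j,k} x_{ijk}\log\frac{x_{ijk}+e}{c_{ij}+e}
\]

So both versions return the same objective values and the same optimal solutions; the difference is purely in the cost of evaluating the function and its derivatives.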


There is probably something wrong with how GAMS handles the derivatives for the first version. I have passed this on to GAMS so they can investigate.
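A quick check of the per-term derivatives (my own calculation, not something reported by GAMS) shows they are equally simple for both forms:

\[
\frac{\partial}{\partial x}\Bigl[x\log(x+e) - x\log(c+e)\Bigr]
= \log(x+e) + \frac{x}{x+e} - \log(c+e)
= \frac{\partial}{\partial x}\Bigl[x\log\tfrac{x+e}{c+e}\Bigr]
\]

so there is no obvious mathematical reason for the first version to be roughly 60 times more expensive to evaluate.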


The real model is much larger (more than half a million variables), and at that size the difference becomes really significant: one version solves the model quickly to near optimality (with Mosek), while the other cannot finish in a reasonable time.
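For anyone who wants to experiment with this, a small self-contained sketch along the lines below (with made-up set sizes, random data and an arbitrary linear constraint, i.e. not the actual model from this post) can be used to compare the two formulations side by side:

* minimal sketch: sets, data and the linear constraint are invented for illustration
set i /i1*i50/, j /j1*j50/, k /k1*k20/;

scalar e 'smoothing constant' /1e-6/;
parameter c(i,j) 'reference values';
c(i,j) = uniform(1,10);

positive variable x(i,j,k);
variable z 'objective';
x.l(i,j,k) = 1;

equation cost1, cost2, bal(i,j);

* original formulation: two separate sums
cost1.. z =e= sum((i,j,k), x(i,j,k)*log(x(i,j,k)+e)) -
              sum((i,j,k), x(i,j,k)*log(c(i,j)+e));

* fast formulation: single sum with the quotient inside the log
cost2.. z =e= sum((i,j,k), x(i,j,k)*log((x(i,j,k)+e)/(c(i,j)+e)));

* an arbitrary linear constraint to keep the problem bounded
bal(i,j).. sum(k, x(i,j,k)) =e= c(i,j);

model m1 /cost1, bal/;
model m2 /cost2, bal/;

option nlp = ipopt;
solve m1 using nlp minimizing z;
solve m2 using nlp minimizing z;

Comparing the "NLP function evaluations" timings in the two listing files should show whether the gap reproduces on a given GAMS version.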
