Sunday, August 24, 2008

NNLS

> Hi!
> In a least squares fitting application, I'm trying to determine the parameters x_1, x_2 and x_3 so that the function
> f(x_1, x_2, x_3) = \sum_{i=1}^N (a_i*x_1 + b_i*x_2 + c_i*x_3 - d_i)^2
> is minimized under the constraints x_1, x_2, x_3 >= 0 (LaTeX notation).
> I was pointed to the keyword "nonnegative least squares" (NNLS), but wasn't able to find out much about it beyond:
> Lawson, Hanson: "Solving Least Squares Problems", Ch. 23
> Briggs: "High Fidelity Deconvolution of Moderately Resolved Sources", Ch. 4.3
> Can you recommend further literature or web resources on this topic, ideally with examples of how to solve this type of problem?
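
In matrix notation this is the standard NNLS problem: with A the N x 3 matrix whose columns are the vectors (a_i), (b_i), (c_i), and d the data vector, minimize ||A*x - d||^2 subject to x >= 0. That is the form used in the references above and in the model below.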

Any NLP solver can solve this type of problem quite efficiently. See e.g.
http://www.princeton.edu/~rvdb/ampl/nlmodels/nnls/index.html
A GAMS version can look like:

set
   i /1*1000/
   j /1*300/
   ;

parameters
   b(i)
   A(i,j)
   ;

* random sparse test data: A is roughly 21% dense, b = A*x0 + noise
A(i,j)$(uniform(0,1) < 0.21) = 10*(uniform(0,1)-1);
parameter x0(j);
x0(j) = uniform(0,1);
b(i) = sum(j, A(i,j)*x0(j)) + 100*(uniform(0,1)-0.5);

positive variable x(j);
variable sum_sqs;

equation esum_sqs;

esum_sqs.. sum_sqs =e= sum(i, sqr( b(i) - sum(j, A(i,j)*x(j)) ));

model m /all/;
solve m minimizing sum_sqs using nlp;

* solution times:
* conopt : 0.702 seconds
* minos  : 1.264
* snopt  : 0.343
* knitro : 7.385
* ipopt  : 1.088
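
For the three-parameter problem in the original question, the same approach can be written down directly. The sketch below is only an illustration: the set sizes, the names p1..p3, C and nnls3, and the uniform placeholder data for a(i), b(i), c(i), d(i) are all made up; in practice the data vectors come from the fitting application.

set
   i 'data points'       /1*100/
   j 'parameters to fit' /p1*p3/
   ;

parameter a(i), b(i), c(i), d(i);
* placeholder data; replace with the real vectors
a(i) = uniform(0,1);
b(i) = uniform(0,1);
c(i) = uniform(0,1);
d(i) = uniform(0,1);

* collect the three coefficient vectors as columns of one matrix
parameter C(i,j);
C(i,'p1') = a(i);
C(i,'p2') = b(i);
C(i,'p3') = c(i);

positive variable x(j) 'x_1, x_2, x_3 >= 0';
variable f 'sum of squared residuals';

equation ef;
ef.. f =e= sum(i, sqr(sum(j, C(i,j)*x(j)) - d(i)));

model nnls3 /all/;
solve nnls3 minimizing f using nlp;

The objective matches the f(x_1, x_2, x_3) in the question, and the positive variable declaration enforces the nonnegativity constraints.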
