ILNumerics Ultimate VS

Optimization.leastsq_pdl Method

Powell's dog leg algorithm for vectorial least squares minimization problems.

[ILNumerics Optimization Toolbox]

Namespace:  ILNumerics.Toolboxes
Assembly:  ILNumerics.Toolboxes.Optimization (in ILNumerics.Toolboxes.Optimization.dll) Version: 5.5.0.0 (5.5.7503.3146)
Syntax

public static RetArray<double> leastsq_pdl(
	OptimizationObjectiveFunction<double> objfunc,
	InArray<double> x0,
	OptimizationDerivativeFunction<double> jacobianFunc = null,
	int maxIter = 200,
	InArray<double> ydata = null,
	OutArray<double> iterations = null,
	OutArray<int> iterationCount = null,
	OutArray<double> gradientNorm = null,
	OutArray<int> nFct = null,
	double tol = 1E-08,
	double tolf = 1E-08
)

Parameters

objfunc
Type: ILNumerics.Toolboxes.OptimizationObjectiveFunction<Double>
Vectorial cost function defined from R^n to R^m
x0
Type: ILNumerics.InArray<Double>
Starting guess for the parameter vector in R^n
jacobianFunc (Optional)
Type: ILNumerics.Toolboxes.OptimizationDerivativeFunction<Double>
[Optional] function to compute the Jacobian matrix of the objective function at a certain point. Default: null - internal finite difference algorithm (gradient_fast).
maxIter (Optional)
Type: System.Int32
[Optional] Maximum number of iteration steps allowed. Default: 200
ydata (Optional)
Type: ILNumerics.InArray<Double>
[Optional] Data point vector y in R^m, for problems of the form min ||f(x) - ydata||^2. Leave null if y is already included in objfunc. Default: null
iterations (Optional)
Type: ILNumerics.OutArray<Double>
[Optional] Output array of intermediate positions at each iteration. Default: null (not provided)
iterationCount (Optional)
Type: ILNumerics.OutArray<Int32>
[Optional] Number of effective iterations. Default: null (number is not tracked)
gradientNorm (Optional)
Type: ILNumerics.OutArray<Double>
[optional] Output array of gradient function norms after each iteration step. Used for convergence verification. Default: null (not computed)
nFct (Optional)
Type: ILNumerics.OutArray<Int32>
[Optional] Number of effective cost function evaluations. Default: null (number is not tracked)
tol (Optional)
Type: System.Double
[Optional] Exit criterion for the distance of the minimizer to the optimal solution. Default: 1e-8
tolf (Optional)
Type: System.Double
[Optional] Maximum absolute value allowed for the function value at the solution. Default: 1e-8
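When jacobianFunc is left null, the toolbox falls back to an internal finite difference scheme (gradient_fast). The following Python snippet sketches the general idea of such a forward-difference Jacobian approximation; it is an illustrative sketch only, not the ILNumerics implementation, and the function names are hypothetical:

```python
import numpy as np

def fd_jacobian(F, x, eps=1e-7):
    """Forward-difference approximation of the Jacobian of F at x."""
    x = np.asarray(x, dtype=float)
    f0 = np.asarray(F(x), dtype=float)
    J = np.empty((f0.size, x.size))
    for j in range(x.size):
        xp = x.copy()
        xp[j] += eps                 # perturb one coordinate at a time
        J[:, j] = (np.asarray(F(xp), dtype=float) - f0) / eps
    return J

# sample objective from the examples below: f0 = 10*(x0+3)^2, f1 = (x1-3)^2
F = lambda x: np.array([10 * (x[0] + 3)**2, (x[1] - 3)**2])
J = fd_jacobian(F, np.zeros(2))      # analytic Jacobian at 0: [[60, 0], [0, -6]]
```

A user-supplied jacobianFunc avoids the extra function evaluations such a scheme needs (one per parameter and iteration) and is generally more accurate.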

Return Value

Type: ILNumerics.RetArray<Double>
A vector of the same length as x0 containing the solution of min ||f(x) - ydata||^2
Exceptions

Exception / Condition
ArgumentNullException: if x0 was null.
ArgumentOutOfRangeException: if objfunc is not defined at x0, or if objfunc was found not to be a vectorial function.
Remarks

leastsq_pdl finds the (local) minimum of a vectorial function of several variables, starting at an initial guess:

argmin_x { 0.5 * sum_i f_i(x)^2 }, where x = [x_1, ..., x_n] and F(x) = (f_i(x)) in R^m.

The function also finds the solution of the minimization problem that can be written in the following form:

min ||f(x)-ydata||^2.

The returned value will be of the same size as the initial guess x0 provided by the user. On empty initial guess input, an empty array will be returned.

It is recommended to design the objective function and the starting parameter vector such that the data are arranged in columns.
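Powell's dog leg method combines a Gauss-Newton step with a steepest-descent (Cauchy) step inside a trust region. The following Python sketch illustrates the general scheme under common textbook conventions; it is a simplified illustration, not the toolbox's actual implementation, and all names are hypothetical:

```python
import numpy as np

def dogleg_step(J, f, radius):
    """One Powell dog leg step for min 0.5*||f(x)||^2 with Jacobian J."""
    g = J.T @ f                                    # gradient of 0.5*||f||^2
    p_gn = -np.linalg.lstsq(J, f, rcond=None)[0]   # Gauss-Newton step
    if np.linalg.norm(p_gn) <= radius:
        return p_gn
    p_sd = -(g @ g) / (g @ (J.T @ J @ g)) * g      # Cauchy (steepest-descent) step
    if np.linalg.norm(p_sd) >= radius:
        return -radius / np.linalg.norm(g) * g
    # walk from p_sd toward p_gn until the trust-region boundary is hit
    d = p_gn - p_sd
    a, b, c = d @ d, 2 * (p_sd @ d), p_sd @ p_sd - radius**2
    t = (-b + np.sqrt(b * b - 4 * a * c)) / (2 * a)
    return p_sd + t * d

def leastsq_pdl_sketch(F, Jac, x0, max_iter=200, tol=1e-8):
    x, radius = np.asarray(x0, dtype=float), 1.0
    for _ in range(max_iter):
        f, J = F(x), Jac(x)
        p = dogleg_step(J, f, radius)
        pred = 0.5 * (f @ f) - 0.5 * np.linalg.norm(f + J @ p)**2
        f_new = F(x + p)
        actual = 0.5 * (f @ f) - 0.5 * (f_new @ f_new)
        rho = actual / pred if pred > 0 else -1.0
        if rho > 0:                                # accept the step
            x = x + p
        # grow or shrink the trust region depending on model quality
        radius = radius * 2 if rho > 0.75 else (radius * 0.5 if rho < 0.25 else radius)
        if np.linalg.norm(p) < tol:
            break
    return x

# sample objective from the examples below: f0 = 10*(x0+3)^2, f1 = (x1-3)^2
F = lambda x: np.array([10 * (x[0] + 3)**2, (x[1] - 3)**2])
Jac = lambda x: np.array([[20 * (x[0] + 3), 0.0], [0.0, 2 * (x[1] - 3)]])
xm = leastsq_pdl_sketch(F, Jac, np.zeros(2))       # should converge to approximately [-3, 3]
```

The trust-region radius update and acceptance thresholds shown here are one common choice; the toolbox's internal constants may differ.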

  • leastsq_pdl(objfunc,x0) returns a local minimizer of the objective function, computed with Powell's dog leg algorithm.
    Examples

     public static RetArray<double> vect_function(InArray<double> x)
     {
         using (Scope.Enter(x))
         {
             // 
             // this callback calculates
             // f0(x0,x1) = 10*(x0+3)^2,
             // f1(x0,x1) = (x1-3)^2
             // 
             Array<double> fi = array<double>(x.S);
             fi[0] = 10 * pow(x[0] + 3, 2);
             fi[1] = pow(x[1] - 3, 2);
             return fi;
         }
     }
     // Now compute the minimum of the function.
     Array<double> xm = Optimization.leastsq_pdl(vect_function, zeros<double>(2, 1));
     // and the result...
     xm
     ><Double> [2,1]
     >      [0]:    -3.0000 
     >      [1]:     3.0000
  • leastsq_pdl(objfunc,x0,jacobianFunc) returns a local minimizer of the objective function, computed with Powell's dog leg algorithm and a user-supplied Jacobian.
    Examples

     // Sample function definition
     public static RetArray<double> vect_function(InArray<double> x)
     {
         using (Scope.Enter(x))
         {
             // f0(x0,x1) = 10*(x0+3)^2,
             // f1(x0,x1) = (x1-3)^2
             Array<double> fi = array<double>(x.S);
             fi[0] = 10 * pow(x[0] + 3, 2);
             fi[1] = pow(x[1] - 3, 2);
             return fi;
         }
     }
     // Jacobian matrix J = [dfi/dxj]
     public static RetArray<double> function1_jac(InArray<double> x, InArray<double> F_x)
     {
         using (Scope.Enter(x))
         {
             Array<double> j = zeros<double>(2, 2);
             j[0, 0] = 20 * (x[0] + 3);
             j[0, 1] = 0;
             j[1, 0] = 0;
             j[1, 1] = 2 * (x[1] - 3);
             return j;
         }
     }
     // Now compute the minimum of the function.
     Array<double> xm = Optimization.leastsq_pdl(vect_function, zeros<double>(2, 1), jacobianFunc: function1_jac);
     // and the result...
     xm
     ><Double> [2,1]
     >      [0]:    -3.0000 
     >      [1]:     3.0000
  • Other combinations of the input parameters are possible.

