This article discusses typical problems arising in numerical optimization and nonlinear fitting. We have tried to gather examples which cover different aspects of the optimization process.

1 General questions

- Variables with wildly different magnitudes
- Functions with singularities

2 Downloads section

Real-life optimization problems often have parameters with wildly different magnitudes. For example, one parameter can be somewhere between *10^9...10^10*, and another one can be approximately equal to *1*.

Badly scaled problems are hard to solve for two reasons.
First, the different magnitudes of the variables make it hard to formulate reasonable stopping criteria.
For example, if a "sufficiently small step" for the first variable is about *1000*, then we need much higher accuracy for the second one (about *0.000001*).
Second, different variable magnitudes increase the condition number of the problem, so more iterations are needed to converge.
Both problems are discussed in more detail in the article about variable scaling.

Both problems can be solved with a transformation which makes the magnitudes of the variables approximately equal.
Although such a transformation can be done manually, the ALGLIB package can do it for you.
You need to do two things.
First, inform the optimization algorithm about the scale of your variables by calling a special function (its name is algorithm-dependent and ends with `...setscale()`).
Second, the algorithm can use the scaling coefficients for preconditioning (you can turn on scale-based preconditioning with the `...setprecscale()` function).
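The effect of such a transformation can be sketched without ALGLIB at all. Below is a minimal Python illustration (the function, the scale values and the toy optimizer are all hypothetical, not ALGLIB code): we substitute *x_i = s_i·y_i* so that a naive gradient-descent loop only ever works with variables of unit magnitude:

```python
# Sketch of manual variable scaling (illustration only, not ALGLIB code).
# The first variable lives near 1e9, the second near 1 - a badly scaled pair.

def f(x):
    return ((x[0] - 2.0e9) / 1.0e9) ** 2 + (x[1] - 3.0) ** 2

S = [1.0e9, 1.0]  # per-variable scales - the kind of input ...setscale() expects

def g(y):
    # well-scaled objective: the optimizer only ever sees unit-magnitude y
    return f([s * yi for s, yi in zip(S, y)])

def grad(h, y, eps=1.0e-6):
    # forward-difference gradient, good enough for the sketch
    g0 = h(y)
    return [(h(y[:i] + [y[i] + eps] + y[i + 1:]) - g0) / eps
            for i in range(len(y))]

y = [0.0, 0.0]
for _ in range(200):
    y = [yi - 0.1 * di for yi, di in zip(y, grad(g, y))]

x = [s * yi for s, yi in zip(S, y)]  # map back to the original variables
```

After the substitution, a single step size and a single stopping tolerance work for both variables; run the same loop on *f* directly and the *1000* vs *0.000001* mismatch described above reappears.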

Quite often you have to solve optimization problems with functions which are defined on a bounded domain and have singularities at the domain boundaries.
For example, *f(x)=(1+x)^(-0.2)+(1-x)^(-0.3)+x* (see the first chart below) is defined only in the interior of *[-1,+1]*.

The extremum is located at an interior point, where the function is smooth enough (twice continuously differentiable).
However, while converging to the solution, the algorithm may examine a point which is *outside* of *[-1,+1]*.
What will happen in this case?
The most probable outcome is that your program will crash with an exception (you can't raise a negative number to a fractional power).
Alternatively, you may get NAN (not-a-number) as the result, and your program will fail later, when trying to work with a non-numeric value which can't be compared with anything else.
An attempt to use the BLEIC algorithm with the boundary constraint *-1≤x≤+1* won't help you:
the algorithm will surely try to evaluate the function at the singularity.
You can use constraints like *-0.999≤x≤+0.999*, but it is hard to decide how many nines after the decimal point are needed.
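Both failure modes are easy to reproduce. A small Python sketch (Python is used here only for illustration; the same thing happens in any language):

```python
import math

# The example function from above, written naively with math.pow.
def f_naive(x):
    return math.pow(1.0 + x, -0.2) + math.pow(1.0 - x, -0.3) + x

# Outside [-1,+1] the second term raises a negative number to a
# fractional power - math.pow signals this with an exception.
try:
    f_naive(1.5)
    outcome = "no error"
except ValueError:
    outcome = "ValueError"

# The ** operator fails differently: it silently yields a complex number -
# the analog of getting NAN and failing later in unrelated code.
silent = (1.0 - 1.5) ** -0.3
```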

Luckily, this problem has a simpler solution. The ALGLIB package silently modifies your target function: it "trims" it, making it constant wherever it is larger than some threshold value (see the second chart above). For simplicity, we can think of this threshold as being equal to the initial function value, although the actual selection rule is a bit more complex. This modification doesn't change the solution of the problem (the threshold is always larger than the initial function value). Then why do we need it at all?

This trick is useful in the following situation.
When the algorithm tries to evaluate the target function at a singular point (or outside the domain of the function),
you can return some large value as the result (we recommend *10^300*) instead of trying to raise a negative number to a fractional power.
The algorithm will see that the function value is too large, "trim" the function and zero its gradient.
As a result, when we step outside of the domain, the program won't crash, and the iterate will be gently moved back to the feasible area.
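A sketch of the trick in plain Python (the backtracking descent loop below is a stand-in for whatever optimizer you use, not ALGLIB internals; only the `BIG` return value mimics the recommendation above):

```python
BIG = 1.0e300

# Target from the example above: instead of evaluating the singular terms
# at or beyond the boundary, return a huge value and let "trimming" react.
def f(x):
    if x <= -1.0 or x >= 1.0:
        return BIG
    return (1.0 + x) ** -0.2 + (1.0 - x) ** -0.3 + x

def minimize(x0, iters=200):
    # Toy gradient descent with backtracking: a BIG trial value simply
    # fails the decrease test, so the step shrinks and the iterate is
    # gently pushed back into (-1, +1) instead of crashing the program.
    x = x0
    for _ in range(iters):
        eps = 1.0e-7
        d = (f(x + eps) - f(x)) / eps   # forward-difference derivative
        step = 0.25
        while step > 1.0e-18 and f(x - step * d) >= f(x):
            step *= 0.5
        if f(x - step * d) < f(x):
            x -= step * d
    return x

xmin = minimize(0.0)   # converges to the interior minimum near x = -0.766
```

With a real optimizer the mechanics are analogous: the trimmed, constant-beyond-threshold function (with zeroed gradient) gives the line search no incentive to stay in the forbidden area, so it backs off.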

The ALGLIB Reference Manual includes several examples which show how to use this trick: mincg_ftrim, minlbfgs_ftrim, minbleic_ftrim. All these examples solve the same problem with different optimizers (CG, L-BFGS, BLEIC).

**Note #1**

Some users try to use this trick to solve constrained optimization problems instead of using specialized algorithms.
**Don't do it - it won't work!**
This trick works only when the extremum is located at an interior point of the domain - at a point where the function is smooth enough.
In this case 'trimming' changes only areas which are far away from the solution - areas we don't want to visit at all.
However, if the extremum is at the boundary of the feasible area (as in most constrained problems), then 'trimming' will make your optimization problem non-smooth,
and the algorithm will stall right after encountering the boundary.

**Note #2**

We recommend returning a very large value (*10^300*) not only when the point is exactly at the singularity, but also when it is outside the domain or so close to the singularity that the function value overflows the floating point range.

*This article is licensed for personal use only.*

ALGLIB Project offers you two editions of ALGLIB:

**ALGLIB Free Edition**:

- delivered for free
- offers full set of numerical functionality
- extensive algorithmic optimizations
- no low-level optimizations
- non-commercial license

**ALGLIB Commercial Edition**:

- flexible pricing
- offers full set of numerical functionality
- extensive algorithmic optimizations
- high performance (SMP, SIMD)
- commercial license with support plan

Links to download sections for Free and Commercial editions can be found below:

- C++ library. Delivered with sources. Monolithic design. Extreme portability.
- C# library with native kernels. Delivered with sources. VB.NET and IronPython wrappers. Extreme portability.
- Delphi wrapper around C core. Delivered as precompiled binary. Compatible with FreePascal.
- CPython wrapper around C core. Delivered as precompiled binary.