# Nonsmooth constrained optimization

Nonsmooth constrained optimization is the minimization of a nonsmooth target function subject to optional nonsmooth nonlinear constraints g(x) and h(x) (in addition to traditional box and linear constraints).
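To make the problem class concrete, here is a tiny hand-made instance (the target function and constraint are of my own choosing, not taken from ALGLIB):

```python
# A tiny illustrative nonsmooth constrained problem:
#   minimize   f(x) = |x0| + 2*|x1|           (nonsmooth target: kinks at x0=0 and x1=0)
#   subject to g(x) = |x0 + x1| - 1 <= 0      (nonsmooth inequality constraint)

def f(x):
    return abs(x[0]) + 2.0 * abs(x[1])

def g(x):
    return abs(x[0] + x[1]) - 1.0

x = [0.5, -0.25]
print(f(x))          # target value at x: 1.0
print(g(x) <= 0.0)   # True: x satisfies the constraint
```

Both f and g have kinks (points where the gradient jumps), which is exactly what breaks smooth solvers and motivates the methods below.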

Non-smooth optimization has a very high cost, and if you can find an equivalent smooth formulation for your problem, it is better to do so. Smooth algorithms run much faster and have better convergence guarantees. However, sometimes you have no choice but to solve a problem with nonsmoothness in either the target function or the constraints. In this case ALGLIB provides an optimized implementation of the adaptive gradient sampling algorithm - a robust solver with convergence guarantees.

# Nonsmooth optimization overview

Several families of algorithms exist for non-smooth optimization, each with its own benefits and drawbacks. We review them briefly before proceeding to the one implemented in ALGLIB:

• the gradient sampling method was the first successful approach to nonsmooth optimization. It performs ~2N gradient evaluations at randomly sampled points around the current iterate, then solves a constrained 2N×2N QP problem whose solution gives a descent direction. Gradient sampling has nice convergence guarantees (convergence to a stationary point under a wide range of conditions), although its performance is lower than that of other methods. It works like a bulldozer - slowly but steadily moving towards the solution.
• surprisingly, the BFGS method with a modified line search turned out to be quite efficient on some nonsmooth problems. A traditional line search fails as soon as we reach the first non-smoothness in the target function, but a slight modification of it performs surprisingly well. Nonsmooth BFGS is faster than gradient sampling, but comes without a strong convergence theory. Sometimes it stalls for no obvious reason, and it tends to build highly ill-conditioned Hessians near the solution. It works like a red sports car - fast, but unstable.
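A single (simplified, unconstrained) gradient sampling step can be sketched in pure Python: sample gradients near the iterate, find the minimum-norm point of their convex hull, and step along its negative. This sketch approximates the min-norm point with a few Frank-Wolfe iterations instead of solving the QP exactly; the target function and all names are of my own choosing:

```python
import math
import random

def f(x):                     # nonsmooth target with a kink along x0 = 0
    return abs(x[0]) + 2.0 * abs(x[1])

def grad(x):                  # gradient where it exists (almost everywhere)
    return [math.copysign(1.0, x[0]), 2.0 * math.copysign(1.0, x[1])]

def min_norm_in_hull(gs, iters=200):
    """Approximate the min-norm point of conv(gs) by Frank-Wolfe iterations
    (a simplification: real gradient sampling solves a small QP exactly)."""
    g = list(gs[0])
    for k in range(iters):
        # vertex of the hull most opposed to the current point
        j = min(range(len(gs)),
                key=lambda i: sum(a * b for a, b in zip(g, gs[i])))
        t = 2.0 / (k + 2.0)   # standard Frank-Wolfe step size
        g = [(1.0 - t) * a + t * b for a, b in zip(g, gs[j])]
    return g

random.seed(0)
x, radius = [1e-6, 1.0], 0.1  # the iterate sits almost on the kink x0 = 0
samples = [[xi + random.uniform(-radius, radius) for xi in x]
           for _ in range(2 * len(x))]
gs = [grad(p) for p in samples] + [grad(x)]
g = min_norm_in_hull(gs)
d = [-gi for gi in g]         # descent direction built from sampled gradients
x_new = [xi + 0.1 * di for xi, di in zip(x, d)]
print(f(x_new) < f(x))        # True: the sampled direction decreases f
```

The point of sampling is visible here: the gradient at x alone points along the kink, but gradients sampled from both sides of the kink combine into a direction that safely decreases f.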

## Overview

The AGS (Adaptive Gradient Sampling) solver is provided by the minns subpackage of the Optimization package. This solver is intended for nonsmooth problems with box, linear and nonlinear (possibly nonsmooth) constraints. The AGS solver includes several important features:

• special handling of box/linear constraints
• variable scaling (in order to handle badly scaled problems)
• built-in support for numerical differentiation
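Numerical differentiation of the kind listed above is typically a central-difference scheme with a per-variable step tied to that variable's scale. A minimal sketch (the names and the exact stepping rule are my assumptions, not the ALGLIB internals):

```python
def numgrad(f, x, s, diffstep=1e-4):
    """Central-difference gradient with per-variable step h_i = diffstep * s_i,
    where s_i is the user-supplied scale of variable i (illustrative sketch)."""
    g = []
    for i in range(len(x)):
        h = diffstep * s[i]
        xp = list(x); xp[i] += h
        xm = list(x); xm[i] -= h
        g.append((f(xp) - f(xm)) / (2.0 * h))
    return g

f = lambda x: x[0] ** 2 + 3.0 * x[1]
print(numgrad(f, [1.0, 2.0], [1.0, 1.0]))  # approximately [2.0, 3.0]
```

Tying the step to the variable's scale keeps the finite difference meaningful when variables differ in magnitude by orders of magnitude.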

The most important limitation of the AGS solver is that it is not intended for high-dimensional problems. One AGS step requires ~2·N gradient evaluations at randomly sampled locations (for comparison, L-BFGS requires O(1) evaluations per step). Typically you will need O(N) iterations to converge, which results in O(N²) gradient evaluations per optimization session.

On the other hand, this solver requires very little tuning. Once you select an initial sampling radius (almost any sufficiently large value will work; start with 1.0) and a penalty parameter for nonlinear constraints, it will work like a bulldozer - slowly but robustly moving towards the solution. The algorithm is very tolerant of its tuning parameters, so you won't have to make numerous trial runs to find good settings for your optimizer.
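One common way a penalty parameter folds nonlinear constraints into the target is an exact (L1-style) merit function. A minimal sketch, assuming this penalty form; the names are illustrative and this is not the ALGLIB implementation:

```python
def merit(f, ineqs, x, rho):
    """Exact nonsmooth penalty: target plus rho times total constraint violation.
    For a large enough rho, minimizers of the merit function solve the original
    constrained problem (illustrative sketch, not the ALGLIB internals)."""
    violation = sum(max(0.0, g(x)) for g in ineqs)
    return f(x) + rho * violation

f = lambda x: (x[0] - 2.0) ** 2
g = lambda x: x[0] - 1.0                # inequality constraint x0 <= 1
print(merit(f, [g], [0.5, 0.0], 10.0))  # feasible point: no penalty, 2.25
print(merit(f, [g], [1.5, 0.0], 10.0))  # infeasible: 0.25 + 10*0.5 = 5.25
```

Note that the max(0, ·) term is itself nonsmooth, which is why a penalty formulation fits naturally inside a nonsmooth solver.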

## Solver API

Like all ALGLIB optimizers, the minns subpackage is built around the minnsstate object. You should use it for everything optimization-related - from problem specification to retrieval of results:

• create an instance of the optimizer with the minnscreate function (or minnscreatef, for numerical differentiation)
• choose one of the optimization algorithms (at the moment only one solver, AGS, is present; it is activated by the minnssetalgoags call)
• pass problem specifications to the optimizer: box/linear constraints (minnssetbc and minnssetlc functions), the scale of the variables (minnssetscale function), the number of nonlinear constraints, if present (minnssetnlc function), and stopping criteria (minnssetcond function)
• start the optimizer with the minnsoptimize function
• retrieve results with the minnsresults function

## Examples

The ALGLIB Reference Manual includes several examples for the minns subpackage:

• minns_d_unconstrained - unconstrained nonsmooth optimization
• minns_d_diff - nonsmooth optimization with numerical gradient
• minns_d_bc - nonsmooth optimization subject to box constraints
• minns_d_nlc - nonsmooth optimization subject to mixed nonlinear equality/inequality constraints (in this example - smooth ones, although general non-smooth constraints are supported too)

# Other questions

## Scale of the variables

Before you start using the optimizer, we recommend setting the scale of the variables with the minnssetscale function. Scaling is essential for the correct work of the stopping criteria (and sometimes for the convergence of the optimizer). You can use the optimizer without scaling if your problem is well scaled. However, if some variables differ in magnitude by up to 100 times, we recommend telling the solver about their scale, and we strongly recommend setting the scale when the difference in magnitudes is even larger.
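To see why scaling matters for stopping criteria, consider a step-based test that compares each component of the step against a tolerance times that variable's scale (a sketch under that assumption, not the exact ALGLIB test):

```python
def small_step(x_prev, x_new, s, epsx):
    """Scaled step-length stopping test: the step counts as 'small' only if
    every component is below epsx relative to that variable's scale (sketch)."""
    return all(abs(a - b) <= epsx * si
               for a, b, si in zip(x_new, x_prev, s))

x_prev = [1000.0, 0.001]       # variables of wildly different magnitude
x_new  = [1000.5, 0.0010001]
# with unit scales, the 0.5 move of the first variable looks large...
print(small_step(x_prev, x_new, [1.0, 1.0], 1e-3))        # False
# ...with scales matching the variables' magnitudes, both moves are tiny
print(small_step(x_prev, x_new, [1000.0, 0.001], 1e-3))   # True
```

Without scale information the solver would either stop too early on large-magnitude variables or grind forever on small-magnitude ones.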

We also recommend reading the separate article on variable scaling, which is worth your time unless you are solving a simple toy problem.
