# Sparse Optimizers

`DataDrivenDiffEq.Optimize.STLSQ` — Type

`mutable struct STLSQ{T} <: DataDrivenDiffEq.Optimize.AbstractOptimizer{T}`

`STLSQ` is taken from the original paper on SINDy and implements a sequentially thresholded least squares iteration, where `λ` is the threshold of the iteration. It is based upon the original MATLAB implementation. It solves the following problem:

\[\argmin_{x} \frac{1}{2} \|Ax - b\|_2^2 + \lambda \|x\|_2^2\]

**Fields**

`λ`: Sparsity threshold

**Example**

```
opt = STLSQ()
opt = STLSQ(1e-1)
opt = STLSQ(Float32[1e-2; 1e-1])
```

**Note**

This was formerly `STRRidge` and has been renamed.
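The iteration described above is simple enough to sketch: fit by least squares, zero every coefficient whose magnitude falls below `λ`, and re-fit on the surviving columns until the support stabilizes. Below is a minimal, language-neutral illustration in pure Python (the package itself is Julia); the helper names `solve`, `lstsq`, and `stlsq` are mine, not the package's API.

```python
def solve(M, v):
    # Gaussian elimination with partial pivoting for a small n×n system M x = v.
    n = len(M)
    aug = [row[:] + [v[i]] for i, row in enumerate(M)]
    for i in range(n):
        p = max(range(i, n), key=lambda r: abs(aug[r][i]))
        aug[i], aug[p] = aug[p], aug[i]
        for r in range(i + 1, n):
            f = aug[r][i] / aug[i][i]
            for c in range(i, n + 1):
                aug[r][c] -= f * aug[i][c]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (aug[i][n] - sum(aug[i][c] * x[c] for c in range(i + 1, n))) / aug[i][i]
    return x

def lstsq(A, b):
    # Least squares via the normal equations (AᵀA) x = Aᵀ b.
    m, n = len(A), len(A[0])
    AtA = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)] for i in range(n)]
    Atb = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]
    return solve(AtA, Atb)

def stlsq(A, b, lam=0.1, maxiter=10):
    # Sequentially thresholded least squares: fit, prune below lam, re-fit.
    n = len(A[0])
    x = lstsq(A, b)
    for _ in range(maxiter):
        support = [j for j in range(n) if abs(x[j]) >= lam]
        if not support:
            return [0.0] * n
        Asub = [[row[j] for j in support] for row in A]
        xs = lstsq(Asub, b)
        xnew = [0.0] * n
        for j, v in zip(support, xs):
            xnew[j] = v
        if xnew == x:  # support (and fit) stabilized
            break
        x = xnew
    return x

# Orthogonal toy design; the true coefficients are [2, 0, -1] plus a tiny spurious term.
A = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0], [0.0, 0.0, 0.0]]
b = [2.0, 0.01, -1.0, 0.0]
print(stlsq(A, b, lam=0.1))  # the 0.01 coefficient is pruned
```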

`DataDrivenDiffEq.Optimize.ADMM` — Type

`mutable struct ADMM{T, R} <: DataDrivenDiffEq.Optimize.AbstractOptimizer{T}`

`ADMM` is an implementation of Lasso using the alternating direction method of multipliers, loosely based on a reference implementation. It solves the following problem:

\[\argmin_{x} \frac{1}{2} \|Ax - b\|_2^2 + \lambda \|x\|_1\]

**Fields**

`λ`: Sparsity threshold

`ρ`: Augmented Lagrangian parameter

**Example**

```
opt = ADMM()
opt = ADMM(1e-1, 2.0)
```
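For intuition, the ADMM iteration for the Lasso alternates a ridge-like update in `x`, soft thresholding in `z`, and a dual update in `u`. A minimal Python sketch for the special case `A = I`, where the `x`-update has a closed form (this is an illustration of the iteration, not the package's implementation; the function names are mine):

```python
def soft(v, t):
    # Soft thresholding: shrink v toward zero by t.
    return (abs(v) - t) * (1.0 if v > 0 else -1.0) if abs(v) > t else 0.0

def admm_lasso_identity(b, lam, rho=1.0, iters=200):
    # ADMM for min ½‖x − b‖² + λ‖x‖₁ (i.e. A = I).
    n = len(b)
    x, z, u = [0.0] * n, [0.0] * n, [0.0] * n
    for _ in range(iters):
        # x-update: argmin ½(x − b)² + ρ/2 (x − z + u)², solved coordinate-wise
        x = [(b[i] + rho * (z[i] - u[i])) / (1.0 + rho) for i in range(n)]
        # z-update: proximal step = soft thresholding with λ/ρ
        z = [soft(x[i] + u[i], lam / rho) for i in range(n)]
        # dual (scaled multiplier) update
        u = [u[i] + x[i] - z[i] for i in range(n)]
    return z

print(admm_lasso_identity([3.0, 0.05, -2.0], 0.5))  # ≈ [2.5, 0.0, -1.5]
```

With `A = I` the Lasso has the closed-form solution `soft(b, λ)`, so the iterate can be checked directly: large entries are shrunk by `λ`, small entries are zeroed.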

`DataDrivenDiffEq.Optimize.SR3` — Type

`mutable struct SR3{T, V, P<:DataDrivenDiffEq.Optimize.AbstractProximalOperator} <: DataDrivenDiffEq.Optimize.AbstractOptimizer{T}`

`SR3` is an optimizer framework introduced by Zheng et al., 2018 and used within Champion et al., 2019. `SR3` contains a sparsification parameter `λ` and a relaxation parameter `ν`. It solves the following problem:

\[\argmin_{x, w} \frac{1}{2} \|Ax - b\|_2^2 + \lambda R(w) + \frac{\nu}{2}\|x - w\|_2^2\]

where `R` is a proximal operator and the result is given by `w`.

**Fields**

`λ`: Sparsity threshold

`ν`: Relaxation parameter

`R`: Proximal operator

**Example**

```
opt = SR3()
opt = SR3(1e-2)
opt = SR3(1e-3, 1.0)
opt = SR3(1e-3, 1.0, SoftThreshold())
```

**Note**

As opposed to the original formulation, we use `ν` as a relaxation parameter, as given in Champion et al., 2019. In the standard case of hard thresholding, the sparsity parameter is interpreted as `λ = threshold^2 / 2`; otherwise `λ = threshold`.
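The relaxed objective can be minimized by alternating over its two blocks: a least-squares update in `x`, where the `ν`-coupling acts like Tikhonov regularization, and a proximal update in `w`. A hedged Python sketch of this alternation for the scalar-friendly case `A = I` with a soft-threshold proximal operator (the function names and the `A = I` restriction are mine, for illustration only):

```python
def soft(v, t):
    # Soft thresholding: shrink v toward zero by t.
    return (abs(v) - t) * (1.0 if v > 0 else -1.0) if abs(v) > t else 0.0

def sr3_identity(b, lam=0.5, nu=1.0, iters=200):
    # Alternating minimization of ½‖x − b‖² + λ‖w‖₁ + ν/2 ‖x − w‖² (A = I).
    n = len(b)
    w = [0.0] * n
    for _ in range(iters):
        # x-update: argmin ½(x − b)² + ν/2 (x − w)², closed form for A = I
        x = [(b[i] + nu * w[i]) / (1.0 + nu) for i in range(n)]
        # w-update: proximal operator of λ‖·‖₁ with step 1/ν → soft(x, λ/ν)
        w = [soft(x[i], lam / nu) for i in range(n)]
    return w  # the sparse, relaxed result is read off from w

print(sr3_identity([3.0, 0.05, -2.0]))  # ≈ [2.0, 0.0, -1.0]
```

Note that the relaxed solution differs from the plain Lasso result for the same data: the coupling term spreads the shrinkage across `x` and `w`, which is the trade-off the relaxation parameter `ν` controls.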

`DataDrivenDiffEq.Optimize.ImplicitOptimizer` — Type

`mutable struct ImplicitOptimizer{T} <: DataDrivenDiffEq.Optimize.AbstractSubspaceOptimizer{T}`

Optimizer for finding a sparse implicit relationship by alternating the left hand side of the problem and solving the explicit problem, as introduced in the corresponding paper. It solves the following problem:

\[\argmin_{x} \|x\|_0 ~s.t.~ Ax = 0\]

**Fields**

`o`: Explicit optimizer

**Example**

```
ImplicitOptimizer(STLSQ())
ImplicitOptimizer(0.1f0, ADMM)
```

`DataDrivenDiffEq.Optimize.ADM` — Type

`mutable struct ADM{T} <: DataDrivenDiffEq.Optimize.AbstractSubspaceOptimizer{T}`

Optimizer for finding a sparse basis vector in a subspace, based on the alternating directions method described in the corresponding paper. It solves the following problem:

\[\argmin_{x} \|x\|_0 ~s.t.~ Ax = 0\]

**Fields**

`λ`: Sparsity threshold

**Example**

```
ADM()
ADM(λ = 0.1)
```

**Note**

While usable for implicit problems, a better choice in general is the `ImplicitOptimizer`, which tends to be more robust.

## Related Functions

`DataDrivenDiffEq.Optimize.sparse_regression!` — Function

```
sparse_regression!(X, A, Y, opt; maxiter, abstol, progress, progress_outer, progress_offset, kwargs...)
```

Implements a sparse regression, given an `AbstractOptimizer` or `AbstractSubspaceOptimizer`. `X` denotes the coefficient matrix, `A` the design matrix, and `Y` the matrix of observed or target values. `X` can be derived via `init(opt, A, Y)`. `maxiter` indicates the maximum number of iterations for each call of the optimizer, and `abstol` the absolute tolerance of the difference between iterations in the 2-norm. If the optimizer is called with a `Vector` of thresholds, `maxiter` indicates the maximum number of iterations for each threshold.

If `progress` is set to `true`, a progress bar will be available. `progress_outer` and `progress_offset` are used to compute the initial offset of the progress bar.

If used with a `Vector` of thresholds, the functions `f` with signature `f(X, A, Y)` and `g` with signature `g(x, threshold) = G(f(X, A, Y))`, with the arguments given as stated above, can be passed in. These are used for finding the Pareto-optimal solution to the sparse regression.

`DataDrivenDiffEq.Optimize.set_threshold!` — Function

```
set_threshold!(opt, threshold)
```

Set the threshold(s) of an optimizer to (a) specific value(s).

`DataDrivenDiffEq.Optimize.get_threshold` — Function

```
get_threshold(opt)
```

Get the threshold(s) of an optimizer.

## Proximal Operators

`DataDrivenDiffEq.Optimize.SoftThreshold` — Type

`struct SoftThreshold <: DataDrivenDiffEq.Optimize.AbstractProximalOperator`

Proximal operator which implements the soft thresholding operator

`sign(x) * max(abs(x) - λ, 0)`

`DataDrivenDiffEq.Optimize.HardThreshold` — Type

`struct HardThreshold <: DataDrivenDiffEq.Optimize.AbstractProximalOperator`

Proximal operator which implements the hard thresholding operator

`abs(x) > sqrt(2*λ) ? x : 0`

`DataDrivenDiffEq.Optimize.ClippedAbsoluteDeviation` — Type

`struct ClippedAbsoluteDeviation{T} <: DataDrivenDiffEq.Optimize.AbstractProximalOperator`

Proximal operator which implements the (smoothly) clipped absolute deviation operator

`abs(x) > ρ ? x : sign(x) * max(abs(x) - λ, 0)`

where `ρ = 5λ` by default.

**Fields**

`ρ`: Upper threshold

**Example**

```
opt = ClippedAbsoluteDeviation()
opt = ClippedAbsoluteDeviation(1e-1)
```
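The three operators above can be compared directly on a few scalars. The following is a straightforward Python transcription of the formulas given in the docstrings (the function names and the choice `λ = 0.1` are mine, for illustration):

```python
import math

def soft_threshold(x, lam):
    # sign(x) * max(abs(x) - λ, 0)
    return math.copysign(max(abs(x) - lam, 0.0), x)

def hard_threshold(x, lam):
    # abs(x) > sqrt(2λ) ? x : 0
    return x if abs(x) > math.sqrt(2.0 * lam) else 0.0

def clipped_absolute_deviation(x, lam, rho=None):
    # abs(x) > ρ ? x : sign(x) * max(abs(x) - λ, 0), with ρ = 5λ by default
    rho = 5.0 * lam if rho is None else rho
    return x if abs(x) > rho else soft_threshold(x, lam)

lam = 0.1
print(soft_threshold(0.3, lam))              # ≈ 0.2: shrunk toward zero by λ
print(hard_threshold(0.3, lam))              # 0.0: below sqrt(2λ) ≈ 0.447
print(clipped_absolute_deviation(0.3, lam))  # ≈ 0.2: inside ρ = 0.5, soft thresholded
print(clipped_absolute_deviation(0.7, lam))  # 0.7: beyond ρ, left untouched
```

The comparison highlights the design trade-off: soft thresholding biases every surviving coefficient, hard thresholding is unbiased but discontinuous, and the clipped operator shrinks small values while leaving large ones untouched.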