Sparse Optimizers

DataDrivenDiffEq.Optimize.STLSQ - Type
mutable struct STLSQ{T} <: DataDrivenDiffEq.Optimize.AbstractOptimizer{T}

STLSQ is taken from the original paper on SINDy and implements a sequentially thresholded least-squares iteration, where λ is the sparsity threshold of the iteration. It is based upon the original MATLAB implementation. It solves the following problem:

\[\argmin_{x} \frac{1}{2} \| Ax-b\|_2^2 + \lambda \|x\|_2^2\]

Fields

  • λ

    Sparsity threshold

Example

opt = STLSQ()
opt = STLSQ(1e-1)
opt = STLSQ(Float32[1e-2; 1e-1])

Note

This was formerly STRRidge and has been renamed.
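For intuition, the iteration itself is easy to sketch: solve the least-squares problem, zero every coefficient below the threshold, and refit on the surviving columns. Below is a minimal NumPy illustration (a sketch only, not the package's Julia implementation; the name `stlsq` is made up for this example):

```python
import numpy as np

def stlsq(A, b, threshold=0.1, maxiter=10):
    """Sequentially thresholded least squares (illustrative sketch)."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(maxiter):
        small = np.abs(x) < threshold      # coefficients to prune
        x[small] = 0.0
        if small.all():
            break
        # refit using only the surviving columns of A
        x[~small] = np.linalg.lstsq(A[:, ~small], b, rcond=None)[0]
    return x

# recover a sparse coefficient vector from noiseless data
rng = np.random.default_rng(0)
A = rng.normal(size=(100, 4))
x_true = np.array([1.5, 0.0, 0.8, 0.0])
b = A @ x_true
x = stlsq(A, b, threshold=0.1)
```

On noiseless data like this, the iteration recovers the sparse coefficients exactly; with noise, the threshold trades off sparsity against fit.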

DataDrivenDiffEq.Optimize.ADMM - Type
mutable struct ADMM{T, R} <: DataDrivenDiffEq.Optimize.AbstractOptimizer{T}

ADMM is an implementation of Lasso using the alternating direction method of multipliers, loosely based on a reference implementation. It solves the following problem:

\[\argmin_{x} \frac{1}{2} \| Ax-b\|_2^2 + \lambda \|x\|_1\]

Fields

  • λ

    Sparsity threshold

  • ρ

    Augmented Lagrangian parameter

Example

opt = ADMM()
opt = ADMM(1e-1, 2.0)
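A minimal sketch of the underlying Lasso-via-ADMM iteration may help: a ridge-like update for x, a soft-thresholding update for the auxiliary variable z, and a dual update. This is a NumPy illustration only, not the package's Julia code, and `lasso_admm` is a hypothetical name:

```python
import numpy as np

def lasso_admm(A, b, lam=0.1, rho=1.0, maxiter=200):
    """Solve argmin_x 0.5*||Ax - b||_2^2 + lam*||x||_1 via ADMM (sketch)."""
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    H = AtA + rho * np.eye(n)                  # system matrix, formed once
    x, z, u = (np.zeros(n) for _ in range(3))
    for _ in range(maxiter):
        x = np.linalg.solve(H, Atb + rho * (z - u))                       # x-update
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0.0)   # soft threshold
        u = u + x - z                                                     # dual update
    return z

rng = np.random.default_rng(1)
A = rng.normal(size=(50, 5))
x_true = np.array([2.0, 0.0, 0.0, -1.0, 0.0])
b = A @ x_true
z = lasso_admm(A, b, lam=0.1, rho=1.0)
```

The augmented Lagrangian parameter ρ controls the step size of the dual update; λ controls how aggressively z is driven to zero.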
DataDrivenDiffEq.Optimize.SR3 - Type
mutable struct SR3{T, V, P<:DataDrivenDiffEq.Optimize.AbstractProximalOperator} <: DataDrivenDiffEq.Optimize.AbstractOptimizer{T}

SR3 is an optimizer framework introduced by Zheng et al., 2018 and used within Champion et al., 2019. SR3 contains a sparsification parameter λ and a relaxation parameter ν. It solves the following problem:

\[\argmin_{x, w} \frac{1}{2} \| Ax-b\|_2^2 + \lambda R(w) + \frac{\nu}{2}\|x-w\|_2^2\]

where R is a regularizer applied through its proximal operator, and the sparse result is given by w.

Fields

  • λ

    Sparsity threshold

  • ν

    Relaxation parameter

  • R

    Proximal operator

Example

opt = SR3()
opt = SR3(1e-2)
opt = SR3(1e-3, 1.0)
opt = SR3(1e-3, 1.0, SoftThreshold())

Note

In contrast to the original formulation, ν is used as a relaxation parameter, as given in Champion et al., 2019. In the standard case of hard thresholding, the sparsity is interpreted as λ = threshold^2 / 2; otherwise λ = threshold.
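For intuition, the SR3 iteration with R chosen as the l1 norm (so the proximal step is soft thresholding) can be sketched in NumPy. This is an illustration under that specific choice of R, not the package's implementation:

```python
import numpy as np

def sr3(A, b, lam=1e-3, nu=1.0, maxiter=100):
    """SR3 sketch with R = ||.||_1, i.e. a soft-thresholding prox."""
    n = A.shape[1]
    AtA, Atb = A.T @ A, A.T @ b
    H = AtA + nu * np.eye(n)
    w = np.zeros(n)
    for _ in range(maxiter):
        x = np.linalg.solve(H, Atb + nu * w)                   # relaxed LS update for x
        w = np.sign(x) * np.maximum(np.abs(x) - lam / nu, 0.0) # prox update for w
    return w

rng = np.random.default_rng(3)
A = rng.normal(size=(60, 4))
x_true = np.array([1.0, 0.0, -0.5, 0.0])
b = A @ x_true
w = sr3(A, b)
```

A larger ν couples x and w more tightly; a smaller ν relaxes the problem, which tends to speed up and stabilize the sparsification.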

DataDrivenDiffEq.Optimize.ImplicitOptimizer - Type
mutable struct ImplicitOptimizer{T} <: DataDrivenDiffEq.Optimize.AbstractSubspaceOptimizer{T}

Optimizer for finding a sparse implicit relationship by alternating the left-hand side of the problem and solving the resulting explicit problem, as introduced in the implicit sparse regression (implicit SINDy) approach.

\[\argmin_{x} \|x\|_0 \quad \text{s.t.} \quad Ax = 0\]

Fields

  • o

    Explicit Optimizer

Example

ImplicitOptimizer(STLSQ())
ImplicitOptimizer(0.1f0, ADMM)
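The alternating idea can be sketched as follows: each column of A is in turn moved to the left-hand side, the remaining explicit problem is solved sparsely, and the sparsest consistent null-space candidate is kept. A NumPy illustration with hypothetical helper names (not the package's code, and the sparsity-plus-residual score is an ad hoc stand-in for the package's selection logic):

```python
import numpy as np

def implicit_sparse(A, threshold=0.1, maxiter=10):
    """Find a sparse x with A @ x ~ 0 by cycling the left-hand side (sketch)."""
    def stlsq(M, y):
        c = np.linalg.lstsq(M, y, rcond=None)[0]
        for _ in range(maxiter):
            small = np.abs(c) < threshold
            c[small] = 0.0
            if small.all():
                break
            c[~small] = np.linalg.lstsq(M[:, ~small], y, rcond=None)[0]
        return c

    m, n = A.shape
    best, best_score = None, np.inf
    for j in range(n):
        idx = [k for k in range(n) if k != j]
        c = stlsq(A[:, idx], -A[:, j])       # express column j via the others
        cand = np.zeros(n)
        cand[j] = 1.0
        cand[idx] = c
        residual = np.linalg.norm(A @ cand)
        score = np.count_nonzero(cand) + residual   # ad hoc: sparsity + fit
        if residual < 1e-6 and score < best_score:
            best, best_score = cand, score
    return best

# columns satisfy 2*c0 - c1 - c2 = 0, so a sparse null vector exists
rng = np.random.default_rng(2)
c0, c1 = rng.normal(size=(2, 40))
A = np.column_stack([c0, c1, 2 * c0 - c1])
x = implicit_sparse(A)
```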
DataDrivenDiffEq.Optimize.sparse_regression! - Function
sparse_regression!(X, A, Y, opt; maxiter, abstol, progress, progress_outer, progress_offset, kwargs...)

Implements a sparse regression, given an AbstractOptimizer or AbstractSubspaceOptimizer. X denotes the coefficient matrix, A the design matrix, and Y the matrix of observed or target values. X can be derived via init(opt, A, Y). maxiter indicates the maximum number of iterations for each call of the optimizer; abstol is the absolute tolerance, measured in the 2-norm, of the difference between successive iterations. If the optimizer is called with a Vector of thresholds, maxiter indicates the maximum number of iterations for each threshold.

If progress is set to true, a progressbar will be available. progress_outer and progress_offset are used to compute the initial offset of the progressbar.

If used with a Vector of thresholds, a function f with signature f(X, A, Y) and a function g with signature g(x, threshold) = G(f(X, A, Y)), with the arguments given as stated above, can be passed in. These are used to find the pareto-optimal solution of the sparse regression.
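The threshold scan can be illustrated as follows: run the optimizer once per threshold, score each candidate with a composition like g(f(...)), and keep the best-scoring solution. A NumPy sketch with simplified signatures (illustration only; `pareto_select` is a made-up name, and the default f and g below merely mimic a sparsity-versus-residual trade-off, not the package's exact defaults):

```python
import numpy as np

def stlsq(A, b, threshold, maxiter=10):
    """Explicit sparse solver reused for each threshold (sketch)."""
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    for _ in range(maxiter):
        small = np.abs(x) < threshold
        x[small] = 0.0
        if small.all():
            break
        x[~small] = np.linalg.lstsq(A[:, ~small], b, rcond=None)[0]
    return x

def pareto_select(A, b, thresholds,
                  f=lambda x, A, b: np.array([np.count_nonzero(x),
                                              np.linalg.norm(A @ x - b)]),
                  g=np.linalg.norm):
    """Scan thresholds, score each candidate via g(f(x, A, b)), keep the best."""
    candidates = [stlsq(A, b, t) for t in thresholds]
    scores = [g(f(x, A, b)) for x in candidates]
    return candidates[int(np.argmin(scores))]

rng = np.random.default_rng(4)
A = rng.normal(size=(80, 5))
x_true = np.array([1.0, 0.0, 0.0, 0.7, 0.0])
b = A @ x_true
x_best = pareto_select(A, b, thresholds=np.logspace(-3, 0, 10))
```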


Proximal Operators

DataDrivenDiffEq.Optimize.ClippedAbsoluteDeviation - Type
struct ClippedAbsoluteDeviation{T} <: DataDrivenDiffEq.Optimize.AbstractProximalOperator

Proximal operator which implements the (smoothly) clipped absolute deviation operator.

abs(x) > ρ ? x : sign(x) * max(abs(x) - λ, 0)

Where ρ = 5λ per default.

Fields

  • ρ

    Upper threshold

Example

opt = ClippedAbsoluteDeviation()
opt = ClippedAbsoluteDeviation(1e-1)

See Zheng et al., 2018.
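The operator's action is easy to illustrate: coefficients with magnitude above the upper threshold ρ pass through unchanged, while the rest are soft-thresholded by λ. A vectorized NumPy sketch of the scalar rule above (illustrative, not the package's implementation):

```python
import numpy as np

def clipped_absolute_deviation(x, lam, rho=None):
    """abs(x) > rho ? x : sign(x) * max(abs(x) - lam, 0), elementwise (sketch)."""
    rho = 5 * lam if rho is None else rho   # rho = 5*lam per default
    x = np.asarray(x, dtype=float)
    soft = np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)
    return np.where(np.abs(x) > rho, x, soft)

# small values vanish, mid values shrink, large values pass through
y = clipped_absolute_deviation(np.array([0.05, 0.3, 2.0]), lam=0.1)
```

With λ = 0.1 and the default ρ = 0.5, the inputs above map to 0.0 (below λ), 0.2 (shrunk by λ), and 2.0 (above ρ, untouched).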
