# Symbolic Regression

DataDrivenDiffEq includes the following symbolic regression algorithms.

## SymbolicRegression

**Warning:** This feature requires explicitly loading SymbolicRegression.jl in addition to DataDrivenDiffEq. It will only be usable if loaded like:

```julia
using DataDrivenDiffEq
using SymbolicRegression
```

### Symbolic Regression

See the tutorial.

`DataDrivenDiffEq.EQSearch` (Type)

```julia
struct EQSearch <: DataDrivenDiffEq.AbstractSymbolicRegression
```

Options for using SymbolicRegression.jl within the `solve` function. Automatically creates `SymbolicRegression.Options` with the given specification. Sorts the operators stored in `functions` into unary and binary operators on conversion.

**Fields**

- `functions`: Operators used for symbolic regression
- `kwargs`: Additional keyword arguments passed to `SymbolicRegression.Options`

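As a minimal sketch of constructing the algorithm (the operator list and the forwarded `verbosity` keyword are illustrative assumptions, not a prescribed configuration):

```julia
using DataDrivenDiffEq
using SymbolicRegression

# Unary and binary operators are sorted automatically on conversion.
# Any extra keyword arguments (here `verbosity`, an assumption) are
# forwarded to SymbolicRegression.Options.
alg = EQSearch([+, *, sin, exp]; verbosity = 0)
```

The resulting `alg` can then be passed to `solve` together with a data-driven problem.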

## OccamNet

See the tutorial.

**Warning:** This feature requires explicitly loading Flux.jl in addition to DataDrivenDiffEq. It will only be usable if loaded like:

```julia
using DataDrivenDiffEq
using Flux
```
`DataDrivenDiffEq.OccamNet` (Type)

```julia
mutable struct OccamNet{F, C, P} <: DataDrivenDiffEq.AbstractOccam
```

Defines an `OccamNet` which learns symbolic expressions from data using a probabilistic approach. See *Interpretable Neuroevolutionary Models for Learning Non-Differentiable Functions and Programs* for more details.

It is constructed via:

```julia
net = OccamNet(inp::Int, outp::Int, layers::Int, f::Vector{Function}, t::Real = 1.0;
               constants = typeof(t)[], parameters::Int = 0, skip::Bool = false,
               init_w = ones, init_p = Flux.glorot_uniform)
```

`inp` describes the size of the input domain, `outp` the size of the output domain, `layers` the number of layers (including the input layer and excluding the linear output layer), and `f` the functions to be used. The temperature `t` is optional and set to 1.0 initially.

Keyword arguments are `constants`, a vector of constants such as π and ℯ which can be concatenated to the input; `parameters`, the number of trainable parameters; and `skip`, which activates skip connections. Initializers for the weights and parameters can be passed in via `init_w` and `init_p`.

`OccamNet` is callable with and without a specific route, which can be sampled from the network's weights via `rand(net)`.
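A hedged sketch of the constructor and the two calling modes described above; the dimensions, batch size, and basis functions are arbitrary choices for illustration:

```julia
using DataDrivenDiffEq
using Flux

# Network with 2 inputs, 1 output, and 3 layers over the basis functions below
net = OccamNet(2, 1, 3, Function[+, *, sin]; skip = true)

X = rand(2, 10)          # batch of 10 samples
Y = net(X)               # evaluate using all weights

route = rand(net)        # sample a concrete route from the network's weights
Y_route = net(X, route)  # evaluate along that sampled route
```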

**Fields**

- `c`: The `Chain` representing the network
- `constants`
- `parameters`

`DataDrivenDiffEq.OccamSR` (Type)

```julia
struct OccamSR{F, C, T} <: DataDrivenDiffEq.AbstractSymbolicRegression
```

Options for using OccamNet within the solve function. Automatically creates a network with the given specification.

**Fields**

- `functions`: Functions used within the network
- `constants`
- `layers`: Number of layers
- `parameters`: Number of parameters
- `skip`: Activate skip connections

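As a sketch only: assuming a keyword constructor that mirrors the fields above, an `OccamSR` instance could be passed to `solve` with a `DirectDataDrivenProblem`. The data and keyword choices here are illustrative assumptions:

```julia
using DataDrivenDiffEq
using Flux

# Illustrative data: Y depends on X through sin and a product
X = rand(2, 100)
Y = sin.(X[1:1, :]) .+ X[1:1, :] .* X[2:2, :]

prob = DirectDataDrivenProblem(X, Y)

# Keyword construction mirroring the fields above (an assumption)
alg = OccamSR(functions = Function[+, *, sin], layers = 3)

res = solve(prob, alg)
```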
`DataDrivenDiffEq.ProbabilityLayer` (Type)

```julia
mutable struct ProbabilityLayer{F, W, T, A} <: DataDrivenDiffEq.AbstractProbabilityLayer
```

Defines a basic ProbabilityLayer in which the parameters act as probabilities via the softmax function for an array of functions.

The layer is callable either via `layer(x)`, using all weights to form the output, or via `layer(x, route)`, where `route` is the result of `rand(layer)`, which samples the function arguments from the underlying distribution.

**Fields**

- `op`: Nonlinear functions forming the basis of the layer
- `weight`: Weights
- `t`: Temperature controlling the shape of the distribution
- `arieties`: Arities of the functions
- `skip`: Skip connection

`DataDrivenDiffEq.probabilities` (Function)

```julia
probabilities(p)
```


Return the probabilities associated with a `ProbabilityLayer` or `OccamNet` by applying softmax to the weights.

`DataDrivenDiffEq.logprobabilities` (Function)

```julia
logprobabilities(p)
```


Return the log-probabilities associated with a `ProbabilityLayer` or `OccamNet` by applying logsoftmax to the weights.

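To make the relationship between these two functions concrete, here is a self-contained sketch of softmax and logsoftmax in plain Julia; the helper definitions are local illustrations, not the package's internals:

```julia
# Numerically stable softmax / logsoftmax over a weight vector:
# subtracting the maximum avoids overflow in exp without changing the result.
softmax(w) = exp.(w .- maximum(w)) ./ sum(exp.(w .- maximum(w)))
logsoftmax(w) = (w .- maximum(w)) .- log(sum(exp.(w .- maximum(w))))

w = [1.0, 2.0, 3.0]
p = softmax(w)       # probabilities: positive, summing to 1
lp = logsoftmax(w)   # equal to log.(p), but numerically stabler
```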