
x/netsaur/web.ts > Activation

Powerful machine learning, accelerated by WebGPU
enum Activation
Re-export
import { Activation } from "https://deno.land/x/netsaur@0.2.8/web.ts";

Activation functions apply a (usually nonlinear) transform to a layer's output before it is passed to the next layer, allowing the network to model relationships that a purely linear stack of layers cannot.

Members

Elu = "elu"

Elu activation function: f(x) = x if x >= 0, 1.01 * (e^x - 1) otherwise. This is a rectified linear unit with a smooth exponential transition for negative inputs.

LeakyRelu = "leakyrelu"

Leaky ReLU activation function: f(x) = x if x > 0, 0.01 * x otherwise. The small negative slope keeps gradients flowing for negative inputs.

Relu = "relu"

ReLU activation function: f(x) = max(0, x). This is a rectified linear unit, which passes positive inputs through unchanged and outputs zero otherwise.

Relu6 = "relu6"

Relu6 activation function: f(x) = min(max(0, x), 6). This is a rectified linear unit with its output clamped to the range [0, 6].

Selu = "selu"

Selu activation function: f(x) = x if x >= 0, 1.67 * (e^x - 1) otherwise. This is a scaled variant of the Elu function, which smooths the sharp corner of the ReLU function at zero.

Sigmoid = "sigmoid"

Sigmoid activation function: f(x) = 1 / (1 + e^(-x)). Outputs fall in the range (0, 1).

Tanh = "tanh"

Tanh activation function: f(x) = (e^x - e^(-x)) / (e^x + e^(-x)). This is a rescaled, shifted sigmoid whose outputs fall in the range (-1, 1) and are zero-centered.
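
As a reference for the formulas above, here is a standalone TypeScript sketch of each member's activation as a plain scalar function. These mirror the documented formulas only; they are illustrative, not netsaur's internal (WebGPU-accelerated) implementation:

```typescript
// Scalar implementations of the documented activation formulas.
// Not netsaur's internals -- a plain-TypeScript sketch for reference.
const elu = (x: number): number => (x >= 0 ? x : 1.01 * (Math.exp(x) - 1));
const leakyRelu = (x: number): number => (x > 0 ? x : 0.01 * x);
const relu = (x: number): number => Math.max(0, x);
const relu6 = (x: number): number => Math.min(Math.max(0, x), 6);
const selu = (x: number): number => (x >= 0 ? x : 1.67 * (Math.exp(x) - 1));
const sigmoid = (x: number): number => 1 / (1 + Math.exp(-x));
const tanh = (x: number): number => Math.tanh(x);

console.log(relu(-2));   // 0
console.log(relu6(10));  // 6 (clamped to the [0, 6] range)
console.log(sigmoid(0)); // 0.5
```

In the library itself you would not call these directly; you pass the enum member (e.g. `Activation.Sigmoid`) when configuring a layer, and the backend applies the corresponding function element-wise.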