Module

x/netsaur/backends/gpu/activation.ts

Powerful machine learning, accelerated by WebGPU
import * as netsaur from "https://deno.land/x/netsaur@0.1.5/backends/gpu/activation.ts";

Classes

Elu activation function f(x) = x if x >= 0, 1.01 * (e^x - 1) otherwise An exponential linear unit: linear for non-negative inputs, with a smooth exponential curve for negative inputs.

Leaky ReLU activation function f(x) = x if x > 0, 0.01 * x otherwise

Linear activation function f(x) = x

ReLU activation function f(x) = max(0, x) A rectified linear unit: passes positive inputs through unchanged and clamps negative inputs to zero.

Relu6 activation function f(x) = min(max(0, x), 6) A rectified linear unit whose output is capped at 6.

Selu activation function f(x) = x if x >= 0, 1.67 * (e^x - 1) otherwise A scaled variant of the Elu function, using a larger coefficient on the exponential branch.
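The rectifier-family formulas above can be sketched as plain scalar TypeScript functions. This is only an illustration of the math, not netsaur's WebGPU implementation; the constants 1.01 and 1.67 are taken directly from the descriptions above.

```typescript
// Scalar sketches of the rectifier-family activations listed above.
// Coefficients follow the formulas in this page's class descriptions.
const elu = (x: number): number => (x >= 0 ? x : 1.01 * (Math.exp(x) - 1));
const leakyRelu = (x: number): number => (x > 0 ? x : 0.01 * x);
const linear = (x: number): number => x;
const relu = (x: number): number => Math.max(0, x);
const relu6 = (x: number): number => Math.min(Math.max(0, x), 6);
const selu = (x: number): number => (x >= 0 ? x : 1.67 * (Math.exp(x) - 1));

console.log(relu(-2), relu(3), relu6(10), leakyRelu(-1));
```

Note how every variant is the identity for positive inputs; they differ only in how they treat negative inputs (zero, a small slope, or an exponential decay) and whether the positive side is capped.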

Sigmoid activation function f(x) = 1 / (1 + e^(-x))

Tanh activation function f(x) = (e^x - e^(-x)) / (e^x + e^(-x)) A rescaled sigmoid (tanh(x) = 2 * sigmoid(2x) - 1) whose output lies in (-1, 1) and is centered at zero.
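The two saturating activations can likewise be sketched directly from their formulas; the assertion-friendly identity tanh(x) = 2 * sigmoid(2x) - 1 shows how the two relate. Again, this is an illustrative sketch, not netsaur's GPU code.

```typescript
// Scalar sketches of the saturating activations listed above.
const sigmoid = (x: number): number => 1 / (1 + Math.exp(-x));
const tanh = (x: number): number =>
  (Math.exp(x) - Math.exp(-x)) / (Math.exp(x) + Math.exp(-x));

// tanh is a rescaled sigmoid: same S-shape, but centered at zero
// with outputs in (-1, 1) instead of (0, 1).
console.log(sigmoid(0), tanh(0), 2 * sigmoid(2) - 1);
```

Sigmoid squashes inputs into (0, 1), which suits probability-like outputs; tanh's zero-centered range often trains better in hidden layers.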