```ts
import * as netsaur from "https://deno.land/x/netsaur@0.1.2/src/cpu/activation.ts";
```
Classes
| Class | Description |
| --- | --- |
| Elu | Elu activation function: f(x) = x if x >= 0, 1.01 * (e^x - 1) otherwise. An exponential linear unit: linear for non-negative inputs and smoothly saturating for negative ones. |
| LeakyRelu | Leaky ReLU activation function: f(x) = x if x > 0, 0.01 * x otherwise. Passes a small, non-zero slope for negative inputs. |
| Linear | Linear activation function: f(x) = x. The identity; applies no non-linearity. |
| Relu | ReLU activation function: f(x) = max(0, x). A rectified linear unit: piecewise linear, passing positive inputs through and clamping negative inputs to zero. |
| Relu6 | Relu6 activation function: f(x) = min(max(0, x), 6). A rectified linear unit with its output capped at 6. |
| Selu | Selu activation function: f(x) = x if x >= 0, 1.67 * (e^x - 1) otherwise. A scaled version of the Elu function; unlike ReLU, it is smooth for negative inputs. |
| Sigmoid | Sigmoid activation function: f(x) = 1 / (1 + e^(-x)). Squashes inputs into the range (0, 1). |
| Tanh | Tanh activation function: f(x) = (e^x - e^(-x)) / (e^x + e^(-x)). A rescaled sigmoid whose output lies in (-1, 1), making it zero-centered. |
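The formulas above translate directly into code. The sketch below is a set of standalone TypeScript functions written purely for illustration; it mirrors the documented formulas and constants but is not netsaur's class API, whose method signatures this page does not show.

```ts
// Standalone illustrations of the activation formulas documented above.
// These are plain functions for reference, not netsaur's exported classes.

const elu = (x: number): number => (x >= 0 ? x : 1.01 * (Math.exp(x) - 1));
const leakyRelu = (x: number): number => (x > 0 ? x : 0.01 * x);
const linear = (x: number): number => x;
const relu = (x: number): number => Math.max(0, x);
const relu6 = (x: number): number => Math.min(Math.max(0, x), 6);
const selu = (x: number): number => (x >= 0 ? x : 1.67 * (Math.exp(x) - 1));
const sigmoid = (x: number): number => 1 / (1 + Math.exp(-x));
const tanh = (x: number): number =>
  (Math.exp(x) - Math.exp(-x)) / (Math.exp(x) + Math.exp(-x));

// Spot-check a few values.
console.log(relu(-2), relu(3)); // 0 3
console.log(sigmoid(0));        // 0.5
console.log(tanh(1));           // ~0.7616
```

Note that JavaScript already provides Math.tanh; the explicit exponential form is used here only to match the formula documented above.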