import { Activation } from "https://deno.land/x/netsaur@0.4.0-patch/packages/core/mod.ts";
Activation functions apply a transformation, usually non-linear, element-wise to the output of a layer; without them, a stack of layers would collapse into a single linear map.
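As a conceptual illustration (not netsaur's internal code, which evaluates activations in its backend), an activation can be modeled as an element-wise map over a layer's raw output:

type ActivationFn = (x: number) => number;

// Apply an activation element-wise to a layer's raw output vector.
function applyActivation(output: number[], fn: ActivationFn): number[] {
  return output.map(fn);
}

// Example: ReLU zeroes out the negative entries.
const relu: ActivationFn = (x) => Math.max(0, x);
console.log(applyActivation([-2, 0, 3], relu)); // [0, 0, 3]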
Members
Elu activation function f(x) = x if x >= 0, 1.01 * (e^x - 1) otherwise. An exponential linear unit: linear for non-negative inputs and exponentially saturating toward -1.01 for negative inputs, which smooths ReLU's hard cutoff at zero.
ReLU activation function f(x) = max(0, x). A rectified linear unit: passes non-negative inputs through unchanged and maps negative inputs to zero.
Relu6 activation function f(x) = min(max(0, x), 6). A rectified linear unit capped at 6, constraining outputs to the range [0, 6].
Selu activation function f(x) = 1.0507 * x if x >= 0, 1.0507 * 1.6733 * (e^x - 1) otherwise. This is a scaled version of the Elu function; the two constants are chosen so that activations self-normalize toward zero mean and unit variance across layers (a plain TypeScript sketch of these formulas follows this list).
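For reference, a minimal sketch of the formulas above in plain TypeScript, using the constants shown in the definitions (netsaur evaluates these inside its backend, so this is illustrative rather than the library's implementation; ReLU is sketched earlier):

// Elu: identity above zero, scaled exponential below (alpha = 1.01 per the docs).
const elu = (x: number): number => (x >= 0 ? x : 1.01 * (Math.exp(x) - 1));

// Relu6: ReLU clipped so outputs stay in [0, 6].
const relu6 = (x: number): number => Math.min(Math.max(0, x), 6);

// Selu: scaled Elu; lambda and alpha give the self-normalizing property.
const LAMBDA = 1.0507;
const ALPHA = 1.6733;
const selu = (x: number): number =>
  x >= 0 ? LAMBDA * x : LAMBDA * ALPHA * (Math.exp(x) - 1);

console.log(elu(-1), relu6(10), selu(-1)); // ~-0.638, 6, ~-1.111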