Module: x/netsaur/backends/cpu/activation.ts > Selu

Powerful machine learning, accelerated by WebGPU
class Selu implements CPUActivationFn
import { Selu } from "https://deno.land/x/netsaur@0.1.5/backends/cpu/activation.ts";

Selu activation function: f(x) = x if x >= 0, 1.67 * (e^x - 1) otherwise. This is a scaled version of the Elu function, which is a smoother approximation of the ReLU function. (The canonical SELU from the literature additionally multiplies both branches by a scale factor λ ≈ 1.0507, with α ≈ 1.6733.)

Properties

name: string

Methods

activate(val: number): number
prime(val: number, error?): number
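The two methods correspond directly to the documented formula and its derivative. The sketch below is an illustrative standalone version of that math (the function names `seluActivate`/`seluPrime` are ours, not the library's), not netsaur's actual source:

```typescript
// Selu as documented above: f(x) = x if x >= 0, 1.67 * (e^x - 1) otherwise.
function seluActivate(val: number): number {
  return val >= 0 ? val : 1.67 * (Math.exp(val) - 1);
}

// Derivative of each branch: d/dx x = 1; d/dx 1.67*(e^x - 1) = 1.67 * e^x.
// `prime` is what backpropagation multiplies the incoming error by.
function seluPrime(val: number): number {
  return val >= 0 ? 1 : 1.67 * Math.exp(val);
}

console.log(seluActivate(2));  // positive inputs pass through unchanged
console.log(seluActivate(-1)); // negative inputs are squashed toward -1.67
console.log(seluPrime(-1));    // gradient stays nonzero for negative inputs
```

With the class itself, the equivalent calls would go through an instance, e.g. `new Selu().activate(x)` and `new Selu().prime(x)`, matching the `CPUActivationFn` interface.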