Module

x/netsaur/src/cpu/activation.ts › Relu

Powerful machine learning, accelerated by WebGPU
class Relu
implements CPUActivationFn
import { Relu } from "https://deno.land/x/netsaur@0.1.2/src/cpu/activation.ts";

ReLU activation function, f(x) = max(0, x). The rectified linear unit passes positive inputs through unchanged and maps negative inputs to zero. It is piecewise linear rather than smooth; the smooth approximation to ReLU is the softplus function, not the sigmoid.
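A minimal usage sketch follows. It assumes Relu can be constructed with no arguments, which this page does not show; the sample values simply illustrate f(x) = max(0, x).

import { Relu } from "https://deno.land/x/netsaur@0.1.2/src/cpu/activation.ts";

// Assumed no-argument constructor.
const relu = new Relu();

relu.activate(3.5); // 3.5: positive inputs pass through unchanged
relu.activate(-2);  // 0: negative inputs are clamped to zero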

Methods

activate(val: number): number
prime(val: number): number
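activate applies f(x) = max(0, x); prime returns the derivative f'(x), which backpropagation uses to scale gradients. The sketch below shows how such a pair is typically implemented. It is based on the standard ReLU definition, not netsaur's actual source, and the CPUActivationFn interface shape is assumed from the signatures above.

// Illustrative sketch only; the real netsaur implementation may differ.
interface CPUActivationFn {
  activate(val: number): number;
  prime(val: number): number;
}

class Relu implements CPUActivationFn {
  // f(x) = max(0, x)
  activate(val: number): number {
    return Math.max(0, val);
  }

  // f'(x) = 1 for x > 0 and 0 for x < 0; the derivative at x = 0 is
  // undefined, and this sketch follows the common convention of using 0.
  prime(val: number): number {
    return val > 0 ? 1 : 0;
  }
}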