Module

x/netsaur/backends/gpu/activation.ts › Relu

Powerful machine learning, accelerated by WebGPU
class Relu implements GPUActivationFn
import { Relu } from "https://deno.land/x/netsaur@0.1.5/backends/gpu/activation.ts";

ReLU activation function, f(x) = max(0, x). The rectified linear unit outputs its input when positive and zero otherwise. It is piecewise linear rather than smooth; its smooth approximation is the softplus function ln(1 + eˣ).
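For reference, here is the same function and its derivative written out in plain TypeScript. This is only a sketch of the math the class encodes, not part of the netsaur API; the GPU backend generates shader code rather than evaluating this on the CPU.

// ReLU: pass positive inputs through, clamp negatives to zero.
const relu = (x: number): number => Math.max(0, x);

// Its derivative ("prime"): 1 for positive inputs, 0 otherwise.
// The derivative is undefined at x = 0; using 0 there is a common convention.
const reluPrime = (x: number): number => (x > 0 ? 1 : 0);

console.log(relu(-2), relu(3));           // 0 3
console.log(reluPrime(-2), reluPrime(3)); // 0 1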

Properties

name: string

Methods

activate(type: string): string
prime(type: string): string
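
Going by the signatures above, both methods take a scalar type name and return a string; in the GPU backend these are presumably WGSL source snippets for the forward pass (activate) and its derivative (prime). A minimal usage sketch, where "f32" is an assumed type argument and the exact returned strings depend on netsaur's shader templates:

import { Relu } from "https://deno.land/x/netsaur@0.1.5/backends/gpu/activation.ts";

const relu = new Relu();
console.log(relu.name);            // the activation's registered name
console.log(relu.activate("f32")); // shader source for f(x) = max(0, x)
console.log(relu.prime("f32"));    // shader source for the derivative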