
Powerful machine learning, accelerated by WebGPU
class Relu
implements GPUActivationFn
import { Relu } from "https://deno.land/x/netsaur@0.1.2/src/gpu/activation.ts";

ReLU activation function f(x) = max(0, x). The rectified linear unit is a piecewise linear function that returns its input when positive and zero otherwise. (A smooth approximation to ReLU is the softplus function ln(1 + e^x), whose derivative is the sigmoid.)
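
For reference, a minimal CPU-side sketch of the forward function and the derivative used during backpropagation. Note that the Relu class itself does not compute values this way: per the signatures below, its methods return strings, presumably shader source for these same expressions.

// Plain TypeScript reference for ReLU and its derivative.
function relu(x: number): number {
  return Math.max(0, x); // f(x) = max(0, x)
}

// Derivative of ReLU, applied when backpropagating through the layer:
// 1 for positive inputs, 0 otherwise.
function reluPrime(x: number): number {
  return x > 0 ? 1 : 0;
}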

Methods

activate(type: string): string
prime(type: string): string
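
A minimal usage sketch. The "f32" argument is an assumption: the type parameter is only documented as a string, and a WGSL scalar type name is a plausible value. Both methods return strings rather than numbers.

import { Relu } from "https://deno.land/x/netsaur@0.1.2/src/gpu/activation.ts";

const relu = new Relu();

// Both calls return strings; logging them shows the generated code for
// the forward activation and its derivative respectively.
console.log(relu.activate("f32")); // "f32" is an assumed type value
console.log(relu.prime("f32"));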