
x/netsaur/src/gpu/activation.ts>Relu6

Powerful machine learning, accelerated by WebGPU
class Relu6 implements GPUActivationFn
import { Relu6 } from "https://deno.land/x/netsaur@0.1.2/src/gpu/activation.ts";

Relu6 activation function: f(x) = min(max(0, x), 6). This is a rectified linear unit whose output is clamped to the range [0, 6].
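To make the formula concrete, here is a minimal CPU-side sketch of Relu6 and its derivative in TypeScript. The function names `relu6` and `relu6Prime` are illustrative, not part of the netsaur API; the GPU class below emits equivalent shader code instead of running on the CPU.

```typescript
// f(x) = min(max(0, x), 6): negative inputs map to 0, inputs above 6 map to 6,
// and everything in between passes through unchanged.
function relu6(x: number): number {
  return Math.min(Math.max(0, x), 6);
}

// The derivative ("prime"): 1 inside the open interval (0, 6), 0 outside,
// which is why gradients stop flowing for saturated inputs.
function relu6Prime(x: number): number {
  return x > 0 && x < 6 ? 1 : 0;
}

console.log([-2, 0.5, 3, 7].map(relu6));      // [0, 0.5, 3, 6]
console.log([-2, 0.5, 3, 7].map(relu6Prime)); // [0, 1, 1, 0]
```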

Methods

activate(type: string): string
prime(type: string): string
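Both methods take a type name and return a string, which suggests they generate shader source parameterized by the scalar type. The sketch below is a hypothetical illustration of a class shaped like this interface; the interface body and the emitted WGSL-style strings are assumptions for demonstration, not netsaur's actual output.

```typescript
// Interface shape inferred from the method signatures documented above.
interface GPUActivationFn {
  activate(type: string): string;
  prime(type: string): string;
}

// Hypothetical implementation: returns shader source as strings, with
// `type` assumed to select the scalar type (e.g. "f32"). Illustrative only.
class Relu6Sketch implements GPUActivationFn {
  // Forward pass: clamp the input to [0, 6].
  activate(type: string): string {
    return `fn activation(x: ${type}) -> ${type} { return min(max(${type}(0), x), ${type}(6)); }`;
  }
  // Backward pass: derivative is 1 inside (0, 6), 0 outside.
  prime(type: string): string {
    return `fn prime(x: ${type}) -> ${type} { return select(${type}(0), ${type}(1), x > ${type}(0) && x < ${type}(6)); }`;
  }
}

const fn = new Relu6Sketch();
console.log(fn.activate("f32"));
console.log(fn.prime("f32"));
```

Generating shader strings per scalar type lets one activation class serve pipelines compiled for different precisions.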