class Relu6
implements GPUActivationFn
import { Relu6 } from "https://deno.land/x/netsaur@0.1.2/src/gpu/activation.ts";
Relu6 activation function: f(x) = min(max(0, x), 6). This is a rectified linear unit whose output is capped at 6.