class Relu
implements GPUActivationFn
import { Relu } from "https://deno.land/x/netsaur@0.1.1/src/gpu/activation.ts";
ReLU activation function: f(x) = max(0, x). This is the rectified linear unit, a piecewise-linear function that passes positive inputs through unchanged and maps negative inputs to zero. (Its smooth approximation is the softplus function, whose derivative is the sigmoid function.)
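The formula can be illustrated in a few lines of plain TypeScript. Note this is a CPU sketch of the math only, not the library's `GPUActivationFn` implementation, which evaluates the same operation in GPU kernel code:

```ts
// Minimal sketch of f(x) = max(0, x), for illustration only.
function relu(x: number): number {
  return Math.max(0, x);
}

console.log(relu(2.5));  // 2.5 (positive inputs pass through unchanged)
console.log(relu(-1.0)); // 0   (negative inputs are clamped to zero)
```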