Module

x/netsaur/src/core/api/layers.ts

Powerful machine learning, accelerated by WebGPU
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";
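Each function listed below returns a plain layer descriptor that is later collected into a network's layer stack. Named imports work as well as the namespace import above; as a hedged sketch (the `DenseLayer` name and the `{ size }` option are assumptions drawn from the descriptions below, not verified against this exact version):

```ts
// Assumed export name and option shape; check this version's source for the exact API.
import { DenseLayer } from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

const layer = DenseLayer({ size: 8 }); // returns a layer descriptor for an 8-unit dense layer
```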

Functions

Creates an average pooling layer. Pooling layers are used for downsampling. See https://en.wikipedia.org/wiki/Convolutional_neural_network#Pooling_layer

Creates a BatchNorm1D layer. BatchNorm1D layers normalize the input. They are usually used to improve the speed, performance, and stability of neural networks. See https://en.wikipedia.org/wiki/Batch_normalization

Creates a BatchNorm2D layer. BatchNorm2D layers normalize the input. They are usually used to improve the speed, performance, and stability of neural networks. See https://en.wikipedia.org/wiki/Batch_normalization
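In practice a batch-normalization layer is usually placed between a weighted layer and its activation. A sketch of that ordering, assuming exports named `Conv2DLayer`, `BatchNorm2DLayer`, and `ReluLayer` and guessing at the option shapes (not verified for 0.2.14):

```ts
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

// Hypothetical names and options: convolution -> batch norm -> activation.
const convBlock = [
  netsaur.Conv2DLayer({ kernelSize: [3, 3] }), // assumed option shape
  netsaur.BatchNorm2DLayer(),                  // normalizes the conv output across the batch
  netsaur.ReluLayer(),
];
```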

Creates a convolutional layer. Convolutional layers are used for feature extraction. They are commonly used in image processing. See https://en.wikipedia.org/wiki/Convolutional_neural_network
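A convolutional layer slides a learned kernel over its input to produce feature maps. A hedged sketch, assuming a `Conv2DLayer` export whose options cover kernel size, strides, and padding (the option names are guesses; consult the module source):

```ts
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

// Assumed export name and options for a 2D convolution over image-like input.
const conv = netsaur.Conv2DLayer({
  kernelSize: [3, 3], // 3x3 filter (assumed option name)
  strides: [1, 1],    // move the filter one step at a time (assumed)
  padding: [1, 1],    // keep the spatial size unchanged for a 3x3 kernel (assumed)
});
```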

Creates a dense layer (also known as a fully connected layer). Dense layers feed all outputs from the previous layer to all its neurons, each neuron providing one output to the next layer. See https://en.wikipedia.org/wiki/Feedforward_neural_network#Fully_connected_network
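For instance, a small fully connected network for a two-input problem such as XOR could be expressed as two dense layers with sigmoid activations. The `DenseLayer`/`SigmoidLayer` names and the `{ size }` option are assumptions:

```ts
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

// Two dense layers: a hidden layer of 3 neurons and a single output neuron.
const xorLayers = [
  netsaur.DenseLayer({ size: 3 }), // assumed option: number of output units
  netsaur.SigmoidLayer(),
  netsaur.DenseLayer({ size: 1 }),
  netsaur.SigmoidLayer(),
];
```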

Creates a dropout layer. Dropout is a regularization technique for reducing overfitting. The technique temporarily drops units (artificial neurons) from the network, along with all of those units' incoming and outgoing connections. See https://en.wikipedia.org/wiki/Dropout_(neural_networks)

Creates a dropout layer. Dropout is a regularization technique for reducing overfitting. The technique temporarily drops units (artificial neurons) from the network, along with all of those units' incoming and outgoing connections. See https://en.wikipedia.org/wiki/Dropout_(neural_networks)
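Dropout is typically inserted between dense layers so the network cannot come to rely on any single neuron during training. A sketch, assuming a `DropoutLayer` export that takes a drop probability; both the name and the option are assumptions:

```ts
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

// Hypothetical names/options: drop roughly half of the hidden activations while training.
const layers = [
  netsaur.DenseLayer({ size: 64 }),
  netsaur.ReluLayer(),
  netsaur.DropoutLayer({ probability: 0.5 }), // assumed export name and option
  netsaur.DenseLayer({ size: 10 }),
];
```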

Creates an Elu layer. Elu layers use the elu (exponential linear unit) activation function.

Creates a Flatten layer. Flatten layers flatten the input. They are usually used to transition from convolutional layers to dense layers.
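Flatten is what lets the output of a convolutional or pooling stack feed a dense classifier head. A sketch of that transition (export names and option shapes are assumptions):

```ts
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

// Hypothetical names/options: image features -> flat vector -> dense classifier.
const layers = [
  netsaur.Conv2DLayer({ kernelSize: [3, 3] }),
  netsaur.ReluLayer(),
  netsaur.FlattenLayer(),           // collapse the feature maps into one vector
  netsaur.DenseLayer({ size: 10 }), // one unit per class
];
```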

Creates a leaky relu layer. Leaky relu layers use the leaky relu activation function, which allows a small, non-zero gradient when the input is negative.

Creates a max pooling layer. Pooling layers are used for downsampling. See https://en.wikipedia.org/wiki/Convolutional_neural_network#Pooling_layer

Creates a pooling layer. Pooling layers are used for downsampling. See https://en.wikipedia.org/wiki/Convolutional_neural_network#Pooling_layer
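Pooling usually follows a convolution plus activation and halves the spatial resolution. A sketch with an assumed `MaxPool2DLayer` export and a `strides` option (swap in the average-pooling constructor above for average pooling); the names and options are guesses:

```ts
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

// Hypothetical names/options: convolution -> activation -> 2x2 max pooling.
const downsamplingBlock = [
  netsaur.Conv2DLayer({ kernelSize: [3, 3] }),
  netsaur.ReluLayer(),
  netsaur.MaxPool2DLayer({ strides: [2, 2] }), // halves height and width (assumed option)
];
```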

Creates a relu6 layer. Relu6 layers use the relu6 activation function (relu capped at a maximum value of 6).

Creates a relu layer. Relu layers use the relu (rectified linear unit) activation function.

Creates a Selu layer. Selu layers use the selu (scaled exponential linear unit) activation function.

Creates a sigmoid layer. Sigmoid layers use the sigmoid activation function. See https://en.wikipedia.org/wiki/Sigmoid_function
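The activation-layer constructors above (elu, leaky relu, relu, relu6, selu, sigmoid, and tanh below) are all used the same way: placed directly after a weighted layer and, judging by their descriptions, taking little or no configuration. A sketch with assumed export names:

```ts
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

// Assumed export names; each activation is applied element-wise to the previous layer's output.
const hidden = [
  netsaur.DenseLayer({ size: 32 }),
  netsaur.LeakyReluLayer(), // small negative slope instead of a hard zero
];
const output = [
  netsaur.DenseLayer({ size: 1 }),
  netsaur.SigmoidLayer(),   // squashes the output into (0, 1)
];
```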

Creates a softmax layer. Softmax layers are used for classification. See https://en.wikipedia.org/wiki/Softmax_function
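A softmax layer normally sits on top of a dense layer whose size equals the number of classes, turning raw scores into a probability distribution. A sketch with assumed names:

```ts
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

// Hypothetical classifier head for a 10-class problem.
const head = [
  netsaur.DenseLayer({ size: 10 }), // one raw score (logit) per class
  netsaur.SoftmaxLayer(),           // scores -> probabilities that sum to 1
];
```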

Creates a tanh layer. Tanh layers use the tanh (hyperbolic tangent) activation function.
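Putting the pieces together, a small image classifier could be described entirely with the constructors in this module and then handed to the network class exported from the package's mod.ts. All export names and option shapes below are assumptions drawn from the descriptions above, not verified against this version:

```ts
import * as netsaur from "https://deno.land/x/netsaur@0.2.14/src/core/api/layers.ts";

// Hypothetical end-to-end layer stack: conv feature extractor, then a dense softmax head.
const layers = [
  netsaur.Conv2DLayer({ kernelSize: [3, 3] }),
  netsaur.BatchNorm2DLayer(),
  netsaur.ReluLayer(),
  netsaur.MaxPool2DLayer({ strides: [2, 2] }),
  netsaur.FlattenLayer(),
  netsaur.DenseLayer({ size: 64 }),
  netsaur.TanhLayer(),
  netsaur.DropoutLayer({ probability: 0.3 }),
  netsaur.DenseLayer({ size: 10 }),
  netsaur.SoftmaxLayer(),
];
// This array would then be passed as the `layers` option of the network constructor in mod.ts.
```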