Module

x/netsaur/mod.ts

Powerful machine learning, accelerated by WebGPU
import * as netsaur from "https://deno.land/x/netsaur@0.2.7/mod.ts";
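For example, a minimal end-to-end sketch adapted from the project's README: pick a backend, build a small Sequential network, train it on XOR, and predict. All imported names are exports listed on this page, but the constructor options (size, layers) and the train/predict signatures are assumptions from the 0.2.x README and may differ between releases.

import {
  CPU,
  DenseLayer,
  Sequential,
  setupBackend,
  SigmoidLayer,
  tensor2D,
} from "https://deno.land/x/netsaur@0.2.7/mod.ts";

// Load a backend before constructing a network.
await setupBackend(CPU);

// A small network for XOR: 2 inputs, one hidden layer of 3 units, 1 output.
// `size: [4, 2]` (batch size x input size) is assumed from the README.
const net = new Sequential({
  size: [4, 2],
  layers: [
    DenseLayer({ size: [3] }),
    SigmoidLayer(),
    DenseLayer({ size: [1] }),
    SigmoidLayer(),
  ],
});

// Train on the XOR truth table for 10000 epochs.
net.train(
  [{
    inputs: tensor2D([[0, 0], [1, 0], [0, 1], [1, 1]]),
    outputs: tensor2D([[0], [1], [1], [0]]),
  }],
  10000,
);

console.log(await net.predict(tensor2D([[0, 0], [1, 0], [0, 1], [1, 1]])));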

Classes

Sequential Neural Network

A generic N-dimensional tensor.

Enums

Activation functions are used to transform the output of a layer into a new output.

BackendType represents the type of backend to use.

Init represents the type of initialization to use.

Rank Types.

Variables

The AUTO backend is chosen automatically based on the environment.

CPU Backend written in Rust.

WebAssembly Backend written in Rust and compiled to WebAssembly.
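For example, a sketch of selecting the WebAssembly backend (setupBackend and WASM are the exports listed on this page):

import { setupBackend, WASM } from "https://deno.land/x/netsaur@0.2.7/mod.ts";

// WASM runs anywhere Deno runs; CPU requires the native Rust bindings,
// and AUTO picks whichever backend the environment supports.
await setupBackend(WASM);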

Functions

Creates an average pooling layer. Pooling layers are used for downsampling. See https://en.wikipedia.org/wiki/Convolutional_neural_network#Pooling_layer

Creates a BatchNorm2D layer. BatchNorm2D layers normalize the input. They are usually used to improve the speed, performance, and stability of neural networks. See https://en.wikipedia.org/wiki/Batch_normalization

Creates a convolutional layer. Convolutional layers are used for feature extraction. They are commonly used in image processing. See https://en.wikipedia.org/wiki/Convolutional_neural_network

Creates a dense layer (also known as a fully connected layer). Dense layers feed all outputs from the previous layer to all its neurons, each neuron providing one output to the next layer. See https://en.wikipedia.org/wiki/Feedforward_neural_network#Fully_connected_network

Creates a dropout layer. Dropout is a regularization technique for reducing overfitting. The technique temporarily drops units (artificial neurons) from the network, along with all of those units' incoming and outgoing connections. See https://en.wikipedia.org/wiki/Dropout_(neural_networks)

Creates an Elu layer. Elu layers use the elu activation function.

Creates a Flatten layer. Flatten layers flatten the input. They are usually used to transition from convolutional layers to dense layers.

Creates a leaky relu layer. Leaky relu layers use the leaky relu activation function.

Creates a max pooling layer. Pooling layers are used for downsampling. See https://en.wikipedia.org/wiki/Convolutional_neural_network#Pooling_layer

Creates a pooling layer. Pooling layers are used for downsampling. See https://en.wikipedia.org/wiki/Convolutional_neural_network#Pooling_layer

Creates a relu6 layer. Relu6 layers use the relu6 activation function.

Creates a relu layer. Relu layers use the relu activation function.

Creates a Selu layer. Selu layers use the selu activation function.

setupBackend loads and initializes the chosen backend.

Creates a sigmoid layer. Sigmoid layers use the sigmoid activation function. See https://en.wikipedia.org/wiki/Sigmoid_function

Creates a softmax layer. Softmax layers are used for classification. See https://en.wikipedia.org/wiki/Softmax_function

Creates a tanh layer. Tanh layers use the tanh activation function.

Create an nth-rank tensor from the given n-dimensional array and shape; a short example follows these entries.

Create a 1D tensor from the given 1D array.

Create a 2D tensor from the given 2D array.

Create a 3D tensor from the given 3D array.

Create a 4D tensor from the given 4D array.

Create a 5D tensor from the given 5D array.

Create a 6D tensor from the given 6D array.
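A short sketch using two of the constructors above; the tensor's shape follows from the nesting depth of the input array:

import { tensor1D, tensor2D } from "https://deno.land/x/netsaur@0.2.7/mod.ts";

// Rank-1 tensor with shape [3].
const v = tensor1D([1, 2, 3]);

// Rank-2 tensor with shape [2, 2].
const m = tensor2D([
  [1, 2],
  [3, 4],
]);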

Interfaces

The Backend is responsible for everything related to the neural network.

Base neural network structure. All neural networks should implement this.

Shape Interface

Type Aliases

1D Array.

2D Array.

3D Array.

4D Array.

5D Array.

6D Array.

Array Map Types.

DataSet is a container for training data; a short example follows this list.

NetworkConfig represents the configuration of a neural network.

1D shape.

2D shape.

3D shape.

4D shape.

5D shape.

6D shape.
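As a sketch of how a DataSet entry is shaped, matching the training example near the top of this page (treating DataSet as an importable type, and the inputs/outputs field names, are assumptions):

import { tensor2D, type DataSet } from "https://deno.land/x/netsaur@0.2.7/mod.ts";

// One training batch: four XOR samples and their expected outputs.
const batch: DataSet = {
  inputs: tensor2D([[0, 0], [1, 0], [0, 1], [1, 1]]),
  outputs: tensor2D([[0], [1], [1], [0]]),
};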