nnlib
GPU-accelerated, C/C++ neural network library.
Layer Class Reference

Represents a single layer of a neural network.

#include <layer.h>

Public Member Functions

 Layer (size_t inSize, size_t outSize, std::string activation, DataLocation location)
 Construct a new layer.
 
 ~Layer ()
 The destructor of the layer object.
 
sTensor forward (const sTensor &batch) const
 Forward one batch of data through the layer.
 
void applyGradients (size_t batchSize, float learningRate=0.01)
 Apply the computed gradients.
 

Public Attributes

DataLocation location
 The location of the layer.
 
size_t outSize
 The output size of the layer.
 
size_t inSize
 The input size to the layer.
 
std::string activation
 The activation function.
 
sTensor weights
 The weights of the layer. Stored as a matrix.
 
sTensor biases
 The biases of the layer. Stored as a vector.
 

Detailed Description

Represents a single layer of a neural network.

Constructor & Destructor Documentation

◆ Layer()

Layer::Layer(size_t inSize, size_t outSize, std::string activation, DataLocation location)

Construct a new layer.

The constructor also pre-allocates the buffers used during computation. This enables in-place computation, avoiding repeated allocation and freeing of memory during training.

Parameters
inSize: The input size to the layer.
outSize: The output size of the layer (equal to the number of neurons).
activation: The activation function that should be used.
location: The location of the layer. See Layer::location.

Member Function Documentation

◆ applyGradients()

void Layer::applyGradients(size_t batchSize, float learningRate = 0.01)

Apply the computed gradients.

This method should be called only after the gradients have been computed for all layers in the network.

Parameters
batchSize: The size of the batch.
learningRate: The learning rate of the model.

◆ forward()

sTensor Layer::forward(const sTensor &batch) const

Forward one batch of data through the layer.

This also allocates buffers that could not be allocated in the constructor because their size depends on the batch size. These buffers are reallocated only when the batch size changes; if all batches have the same size, no further allocation occurs. The allocation is performed in the Layer::allocate() method.

Parameters
batch: The batch that should be propagated.

Member Data Documentation

◆ activation

std::string Layer::activation

The activation function.

The name of the activation function used by the layer. It selects among LinearActivation, ReLUActivation, and SigmoidActivation.

◆ inSize

size_t Layer::inSize

The input size to the layer.

Equal to the number of neurons in the previous layer, or the size of the input in the case of the input layer.

◆ location

DataLocation Layer::location

The location of the layer.

Specifies the location of all the data used by the layer. See DataLocation for more info.

◆ outSize

size_t Layer::outSize

The output size of the layer.

Equal to the number of neurons in the layer.


The documentation for this class was generated from the following files: