nnlib
GPU-accelerated, C/C++ neural network library.
Represents a single layer of a neural network.
#include <layer.h>
Public Member Functions

    Layer (size_t inSize, size_t outSize, std::string activation, DataLocation location)
        Construct a new layer.
    ~Layer ()
        The destructor of the layer object.
    sTensor forward (const sTensor &batch) const
        Forward one batch of data through the layer.
    void applyGradients (size_t batchSize, float learningRate=0.01)
        Apply the computed gradients.
Public Attributes

    DataLocation location
        The location of the layer.
    size_t outSize
        The output size of the layer.
    size_t inSize
        The input size to the layer.
    std::string activation
        The activation function.
    sTensor weights
        The weights of the layer. Stored as a matrix.
    sTensor biases
        The biases of the layer. Stored as a vector.
Represents a single layer of a neural network.
Layer::Layer (size_t inSize, size_t outSize, std::string activation, DataLocation location)
Construct a new layer.
Also allocates space that will be used during computation. This allows for in-place computation, which avoids allocating/freeing memory during training.
Parameters
    inSize        The input size to the layer.
    outSize       The output size of the layer (equal to the number of neurons).
    activation    The activation function that should be used.
    location      The location of the layer. See Layer::location.
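A minimal construction sketch. The DataLocation enumerator and the activation string used below are illustrative assumptions; check DataLocation and Layer::activation for the names the library actually accepts.

    #include <layer.h>

    // Hidden layer with 784 inputs and 128 neurons, kept on the device.
    // DataLocation::DEVICE is a hypothetical enumerator name.
    Layer hidden(784, 128, "relu", DataLocation::DEVICE);
    // Weights, biases, and the scratch buffers used for in-place computation
    // are allocated here, before any batch is forwarded.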
void Layer::applyGradients (size_t batchSize, float learningRate = 0.01)
Apply the computed gradients.
This method should be called only after the gradients have been computed for all the layers in the network.
Parameters
    batchSize       The size of the batch.
    learningRate    The learning rate of the model.
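A sketch of where the call sits in a training step, assuming the gradients of every layer have already been filled in by the network's backward pass (not shown here); the layers container and the hyperparameter values are illustrative.

    // After the backward pass has computed gradients for every layer:
    for (Layer &layer : layers)                  // layers: a hypothetical std::vector<Layer>
        layer.applyGradients(batchSize, 0.01f);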
sTensor Layer::forward (const sTensor &batch) const
Forward one batch of data through the layer.
This includes allocating space that could not be allocated in the constructor because it depends on the batch size. This additional space is only reallocated when the batch size changes, so if every batch has the same size no further allocations occur. The allocation is performed in the Layer::allocate() method.
Parameters
    batch    The batch that should be propagated.
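A sketch of forwarding one batch through two already-constructed layers (hidden and output below); this page does not show how an sTensor batch is built, so makeBatch is purely hypothetical.

    sTensor batch = makeBatch(32, 784);   // hypothetical helper: 32 samples, 784 features each
    sTensor h = hidden.forward(batch);    // first call allocates the batch-size-dependent buffers
    sTensor y = output.forward(h);
    // Another 32-sample batch reuses those buffers; a different batch size
    // triggers a reallocation inside Layer::allocate().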
std::string Layer::activation |
The activation function.
Names the activation function used by the layer. The supported activations are LinearActivation, ReLUActivation, and SigmoidActivation.
size_t Layer::inSize |
The input size to the layer.
Equal to the number of neurons in the previous layer, or the size of the input in the case of the input layer.
DataLocation Layer::location |
The location of the layer.
Specifies the location of all the data used by the layer. See DataLocation for more info.
size_t Layer::outSize |
The output size of the layer.
Equal to the number of neurons in the layer.