In short, PyreNet is a C++ neural network static library developed as a simple, elegant, multi-purpose solution.
To elaborate, this library offers a simple interface for home-cooked, reinforcement-learning-based deep learning projects. It is optimized for multi-threaded environments, aiming to deliver solid performance and simple, essential features without the complexity that comes with larger-scale libraries. The library does not currently support backpropagation; weights are instead adjusted through the mutation methods documented below.
The library is designed to be fully thread-safe for use in such environments.
For a pre-built library file, check out our latest release. No additional dependencies are required.
#include "PyreNet.h"
int main() {
// Define middle and output layers
std::vector<PyreNet::LayerDefinition> layerDefs;
layerDefs.emplace_back(50, PyreNet::LayerDefinition::activationType::relu); // Middle (50 nodes)
layerDefs.emplace_back(50, PyreNet::LayerDefinition::activationType::sigmoid); // Middle (50 nodes)
layerDefs.emplace_back(5, PyreNet::LayerDefinition::activationType::relu); // Output (5 nodes)
// Initialize the network
PyreNet::NeuralNet nn(5, layerDefs); // Defines the network to have an input size of 5
nn.mutate_gaussian(0, 1); // Mutates network weights from a gaussian sample with mean 0, standard deviation 1
// Run a prediction on an input vector
std::vector<double> predictions = nn.predict(std::vector<double>{0, 1, 2, 3, 4});
}
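Because the library is designed to be thread-safe, a single network can serve predictions from several threads at once. The sketch below is a minimal illustration of that usage, assuming only the `predict` interface shown above; the threading code itself is plain standard C++ and not part of PyreNet.

```cpp
#include <thread>
#include <vector>

#include "PyreNet.h"

// Minimal sketch: several threads query the same network concurrently.
// Assumes only the predict() interface shown above; threads are plain std::thread.
int main() {
    std::vector<PyreNet::LayerDefinition> layerDefs;
    layerDefs.emplace_back(50, PyreNet::LayerDefinition::activationType::relu);
    layerDefs.emplace_back(5, PyreNet::LayerDefinition::activationType::sigmoid);
    PyreNet::NeuralNet nn(5, layerDefs);

    std::vector<std::vector<double>> results(4);
    std::vector<std::thread> workers;
    for (int t = 0; t < 4; ++t) {
        workers.emplace_back([&nn, &results, t]() {
            // Each thread runs its own prediction against the shared network.
            results[t] = nn.predict(std::vector<double>{0, 1, 2, 3, double(t)});
        });
    }
    for (auto& w : workers) w.join();
}
```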
Activation Type | Identifier |
---|---|
ReLU | `LayerDefinition::relu` |
Linear | `LayerDefinition::linear` |
Hyperbolic Tangent | `LayerDefinition::tanh` |
Sigmoid | `LayerDefinition::sigmoid` |
Step | `LayerDefinition::step` |
For convenience, all networks can easily be serialized and deserialized through standard stream operators.
```cpp
#include <fstream>

PyreNet::NeuralNet nn(5, layerDefs);

std::ofstream ofs("output.txt");  // serialize to file
ofs << nn;
std::ifstream ifs("output.txt");  // deserialize from file
ifs >> nn;
```
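Since the serialization hooks are ordinary stream operators, they should also work with in-memory streams. The snippet below, continuing from the example above, is a small sketch of copying a network's weights through a `std::stringstream`; it assumes the operators accept any `std::ostream`/`std::istream`, which only the file-stream usage above demonstrates.

```cpp
#include <sstream>

// Sketch: copy a network's weights through an in-memory stream.
// Assumes the stream operators accept any std::ostream / std::istream;
// the documented example above only shows file streams.
std::stringstream buffer;
buffer << nn;                          // write the current network
PyreNet::NeuralNet copy(5, layerDefs); // same topology as nn
buffer >> copy;                        // load the same weights into the copy
```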
`mutate_gaussian(mean, std, OptionalInt(layerIndex));`

Mutates the weights using samples drawn from a Gaussian distribution with the given mean and standard deviation.
If layerIndex is specified, only that layer is mutated; indexing starts at 0 with the first set of weights.
`mutate_uniform(lower_bound, upper_bound, OptionalInt(layerIndex));`

Mutates the weights by a modifier drawn uniformly from the range [lower_bound, upper_bound]. The optional layerIndex behaves the same way as above.
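As a rough illustration of how the two mutation calls might be combined, the sketch below (continuing from the earlier example's `nn`) applies a small Gaussian perturbation to the whole network and then a uniform tweak to only the output layer. The call shapes follow the signatures documented above; the unqualified `OptionalInt` spelling is taken from those signatures and may need namespace qualification depending on how the header exposes it.

```cpp
// Sketch: combining the mutation calls documented above.
// OptionalInt is written unqualified, matching the signatures above;
// it may need a PyreNet:: prefix depending on how the header exposes it.
nn.mutate_gaussian(0.0, 0.1);                   // small Gaussian noise on every layer
nn.mutate_uniform(-0.05, 0.05, OptionalInt(2)); // uniform tweak on the output layer only (index 2)
```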
Feel free to open a pull request if you have any useful features or bug fixes. For inquiries, contact [email protected].
- Hunter Harloff - Lead Developer - Poppro
This project is licensed under the MIT License.