A dense neural network built from scratch in Python using NumPy. Supports a variety of popular optimizers.
A demo can be found in `main.py`.
To use, simply import `NeuralNetwork` from `network.py` and `DenseLayer` from `dense_layer.py`. Utilize the built-in activation, loss, and optimizer functions by checking out the available functions within the function folder, and import each as needed. A list of all available functions, and where to import them from, can be found below.
Initialize the network by creating a `NeuralNetwork` object. Add a layer using `your_neural_network.add()` and pass in a `DenseLayer` object. Please make sure to set the appropriate attributes, such as activation functions or loss functions, when initializing the objects. Again, see `main.py` for a fully functioning sample neural network.
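A minimal usage sketch of this workflow is shown here. Only `NeuralNetwork` (from `network.py`), `DenseLayer` (from `dense_layer.py`), the `add()` method, and the function names listed below are taken from this README; the module paths inside the function folder, the constructor arguments, and the training call are assumptions and may differ from the actual code, so treat `main.py` as the authoritative example.

```python
# Minimal usage sketch. NeuralNetwork, DenseLayer and add() come from this
# README; the function-folder module paths, constructor arguments and the
# train() call below are assumed and should be checked against main.py.
from network import NeuralNetwork
from dense_layer import DenseLayer

# Hypothetical import paths inside the function folder:
from functions.activation import ReLU, Softmax
from functions.loss import CategoricalCrossEntropy
from functions.optimizer import StochasticGradientDescent

# Build the network layer by layer.
nn = NeuralNetwork()
nn.add(DenseLayer(n_inputs=4, n_neurons=16, activation=ReLU()))     # hidden layer (arguments assumed)
nn.add(DenseLayer(n_inputs=16, n_neurons=3, activation=Softmax()))  # output layer (arguments assumed)

# Attach a loss function and an optimizer (attribute names assumed).
nn.loss = CategoricalCrossEntropy()
nn.optimizer = StochasticGradientDescent()

# Train on some data X (features) and y (class labels); the call is hypothetical.
# nn.train(X, y, epochs=100)
```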
Activation functions:

- Linear
- ReLU (Rectified Linear)
- Softmax
Loss functions:

- CategoricalCrossEntropy
Optimizers:

- Dumb (a rudimentary, poor-performing custom optimizer used as a proof of concept)
- StochasticGradientDescent (with momentum)
- Adaptive Gradient (AdaGrad)
- Root Mean Squared Propagation (RMSProp)
- Adaptive Momentum (Adam)
There are plans to add validation data testing functionality and support for regression models.