SPBA - Single Perceptron Backpropagation Algorithm

This project implements the smallest unit of an artificial neural network in software: a single perceptron. One version is written in C; the other in Python/TensorFlow/Keras. The perceptron can learn simple logical operations such as AND, OR, NAND, NOR, and NOT. Its coefficients, two weights and one bias, are calculated with a backpropagation-style learning process. The C version logs all coefficients over the training iterations into a text file for later analysis; the Python version tracks the loss and plots it as a chart. Because a single perceptron can only represent linearly separable functions, it cannot learn the XOR or XNOR operations.
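The sketch below illustrates the idea in C, not the repository's actual source: one perceptron with two weights and one bias, trained by gradient descent on the AND truth table, with the coefficients of every epoch written to a text file. The file name `coefficients.txt`, the initial weights, the learning rate, and the epoch count are all illustrative assumptions.

```c
/* Minimal single-perceptron sketch (illustrative, not the repository code):
 * two weights + one bias, sigmoid activation, squared-error gradient descent
 * on the AND truth table, logging coefficients per epoch to a text file. */
#include <stdio.h>
#include <math.h>

static double sigmoid(double x) { return 1.0 / (1.0 + exp(-x)); }

int main(void)
{
    /* AND truth table: two inputs, one target per row */
    const double in[4][2]  = { {0,0}, {0,1}, {1,0}, {1,1} };
    const double target[4] = { 0, 0, 0, 1 };

    double w1 = 0.5, w2 = -0.5, b = 0.0;  /* arbitrary initial coefficients */
    const double lr = 0.5;                /* learning rate (assumed value)  */

    FILE *log = fopen("coefficients.txt", "w");  /* hypothetical log file   */
    if (!log) return 1;

    for (int epoch = 0; epoch < 10000; ++epoch) {
        for (int i = 0; i < 4; ++i) {
            double out = sigmoid(w1 * in[i][0] + w2 * in[i][1] + b);
            /* gradient of squared error propagated through the sigmoid */
            double delta = (out - target[i]) * out * (1.0 - out);
            w1 -= lr * delta * in[i][0];
            w2 -= lr * delta * in[i][1];
            b  -= lr * delta;
        }
        fprintf(log, "%d %f %f %f\n", epoch, w1, w2, b);
    }
    fclose(log);

    for (int i = 0; i < 4; ++i)
        printf("%g AND %g -> %f\n", in[i][0], in[i][1],
               sigmoid(w1 * in[i][0] + w2 * in[i][1] + b));
    return 0;
}
```

Compile with `gcc perceptron.c -lm` (file name assumed); the logged columns (epoch, w1, w2, bias) can then be plotted to see the coefficients converge, which mirrors the kind of analysis the logging feature is meant to support.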
