This project explores machine learning through the implementation of neural networks and consists of two key examples.
The first example introduces the perceptron, the simplest neural network, consisting of a single neuron. It can recognize simple patterns, such as the relationship between two sets of numbers. Visualizations before and after the learning process clearly show how the model adapts over time.
The goal is for the perceptron to learn the relationship between the first set (0, 1, 2, 3, 4, 5) and the second set (0, 2, 4, 6, 8, 10), which is simply multiplication by 2 (with nothing added).
As the visualization shows, the model initially multiplies by 3.91 and adds -0.14, so its predictions are incorrect.
During training, the model adjusts these values from 3.91 toward 2 and from -0.14 toward 0.
This is driven by the cost function: the entire goal of training is to minimize its value, adjusting the parameters and improving the model's predictions.
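As a rough illustration of the idea, the sketch below trains a single neuron y = w*x + b on the data above using gradient descent on a mean-squared-error cost. The starting values, learning rate, and iteration count are assumptions for the sketch, not the project's actual code.

```c
/* Minimal perceptron sketch (illustrative, not the project's actual code):
 * a single neuron y = w*x + b trained with gradient descent to learn y = 2*x. */
#include <stdio.h>

int main(void) {
    float xs[] = {0, 1, 2, 3, 4, 5};   /* first set  */
    float ys[] = {0, 2, 4, 6, 8, 10};  /* second set */
    int n = 6;

    float w = 3.91f, b = -0.14f;  /* starting values mentioned in the text */
    float rate = 0.01f;           /* learning rate (assumed) */

    for (int epoch = 0; epoch < 10000; ++epoch) {
        /* Gradient of the mean-squared-error cost with respect to w and b. */
        float dw = 0.0f, db = 0.0f;
        for (int i = 0; i < n; ++i) {
            float err = (w * xs[i] + b) - ys[i];
            dw += 2.0f * err * xs[i] / n;
            db += 2.0f * err / n;
        }
        w -= rate * dw;  /* step downhill on the cost surface */
        b -= rate * db;
    }

    printf("w = %f, b = %f\n", w, b);  /* converges toward w = 2, b = 0 */
    return 0;
}
```

With these settings, w and b move from 3.91 and -0.14 toward 2 and 0, which is exactly the adjustment described above.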
The second example involves connecting multiple neurons to create a more capable system.
This model, with three connected neurons, demonstrates the ability to recognize more complex patterns and rules, such as the XOR truth table.
Given two binary inputs (e.g., 0 and 0), the model must accurately predict the output value (0 or 1), as shown in the table.
This model has nine trainable values in total (the weights and biases of its three neurons), and after training its predictions become far more precise.
Achieving this requires more complex forward and cost functions.
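The sketch below shows one way such a three-neuron model could be written. It is illustrative only: the sigmoid activation, the parameter layout, and the finite-difference gradient estimate are assumptions, not necessarily how the project implements its forward and cost functions.

```c
/* Sketch of a three-neuron XOR model (illustrative assumptions throughout). */
#include <stdio.h>
#include <math.h>

#define N_PARAMS 9  /* 3 neurons x (2 weights + 1 bias) */

static float sigmoidf(float x) { return 1.0f / (1.0f + expf(-x)); }

/* Forward pass: two hidden neurons feed one output neuron. */
static float forward(const float p[N_PARAMS], float x1, float x2) {
    float h1 = sigmoidf(p[0] * x1 + p[1] * x2 + p[2]);
    float h2 = sigmoidf(p[3] * x1 + p[4] * x2 + p[5]);
    return sigmoidf(p[6] * h1 + p[7] * h2 + p[8]);
}

/* XOR truth table used as training data: {input1, input2, expected output}. */
static const float train[4][3] = {
    {0, 0, 0},
    {0, 1, 1},
    {1, 0, 1},
    {1, 1, 0},
};

/* Cost: mean squared error over the truth table. */
static float cost(const float p[N_PARAMS]) {
    float c = 0.0f;
    for (int i = 0; i < 4; ++i) {
        float d = forward(p, train[i][0], train[i][1]) - train[i][2];
        c += d * d;
    }
    return c / 4.0f;
}

int main(void) {
    /* Small fixed starting values stand in for random initialization. */
    float p[N_PARAMS] = {0.5f, -0.4f, 0.3f, -0.6f, 0.7f, -0.2f, 0.8f, -0.5f, 0.1f};
    float eps = 1e-2f, rate = 1.0f;

    for (int iter = 0; iter < 100000; ++iter) {
        /* Estimate each partial derivative with a finite difference. */
        float g[N_PARAMS];
        float c = cost(p);
        for (int j = 0; j < N_PARAMS; ++j) {
            float saved = p[j];
            p[j] += eps;
            g[j] = (cost(p) - c) / eps;
            p[j] = saved;
        }
        for (int j = 0; j < N_PARAMS; ++j) p[j] -= rate * g[j];
    }

    /* After training, each prediction should be close to the truth table. */
    for (int i = 0; i < 4; ++i)
        printf("%g XOR %g = %f\n", train[i][0], train[i][1],
               forward(p, train[i][0], train[i][1]));
    return 0;
}
```

The forward function chains the three neurons together, and the cost function measures how far the model's outputs are from the truth table; training repeatedly nudges all nine values in whichever direction lowers that cost.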
If you have any questions, reach out at [email protected]
Thanks to the TSoding YouTube channel for streaming informative neural network content.