This repository contains my submissions for the courses in my data science degree program. The following is a list of the courses and their descriptions:
Course Basics (CE-802):
- Reasonable programming background required
Description: This course provides an understanding of machine learning algorithms, the methods used to evaluate them, and their application to real-world problems. The course covers classification and regression learning along with other techniques, and applies them to particular classes of problems. Topics covered in the course include:
- Introduction to machine learning: what is machine learning, taxonomy of machine learning algorithms, inductive bias, data mining.
- Learning to classify: decision tree induction, Naïve Bayes methods, Bayesian networks, K-nearest neighbor method, support vector machines.
- Learning to predict numeric values: linear regression, regression trees, evaluating learning procedures, overfitting and the 'bias-variance trade-off'.
- Clustering: k-means algorithm, agglomerative hierarchical methods.
- Association rules mining: Apriori algorithm.
- Reinforcement learning: Q-learning.
- Multiple learners: bagging, boosting, random forests, and stacking (a small illustrative sketch follows this list).
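To make a couple of these topics concrete, here is a minimal sketch, assuming scikit-learn is available, of a decision-tree classifier and a bagging ensemble evaluated with 5-fold cross-validation on the library's built-in Iris dataset. The dataset, tree depth, and number of estimators are illustrative placeholders, not anything from the actual coursework.

```python
# Minimal sketch (not coursework): decision tree vs. a bagging ensemble,
# evaluated with 5-fold cross-validation. Dataset and hyperparameters are
# illustrative placeholders only.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = load_iris(return_X_y=True)

# A single decision tree learned by recursive partitioning.
tree = DecisionTreeClassifier(max_depth=3, random_state=0)

# Bagging: many trees trained on bootstrap samples, combined by majority vote.
bagged = BaggingClassifier(DecisionTreeClassifier(max_depth=3, random_state=0),
                           n_estimators=50, random_state=0)

for name, model in [("decision tree", tree), ("bagged trees", bagged)]:
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name}: mean accuracy {scores.mean():.3f}")
```

Swapping the base learner or the number of estimators is a quick way to see the bias-variance trade-off mentioned above in action.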
I will share the repository link for the project we were given, along with my report and codebase, so that it can serve as a guide for others as well as for myself, since it covers quite a few of the ML paradigms that come up almost every time a new project is started.
Find Project for CE-802 here: https://github.com/DarthAmk97/CE802_ML
Course Basics (CE889):
- Reasonable programming background required
Description: This course covers the basic concepts and principles of neural computation as an approach to intelligent problem-solving. Students will learn commonly used neural network architectures and learning algorithms, distinguish classes of problems to which neural networks offer solutions superior to other methods, and design a neural network to solve a particular problem. Topics covered in the course include:
- Introduction to artificial neural networks: basic concepts and principles, biological motivations and brief history of ANNs, neuron models and neural network architectures, computational power of ANNs in comparison with conventional AI methods, ANN applications.
- Basic learning rules and theories: basic issues in neural network learning, derivative-based methods such as error gradient descent learning algorithms, derivative-free methods such as simulated annealing, genetic algorithms, Hebbian learning, and competitive learning, the bias-variance dilemma in learning from data.
- Feedforward neural networks using supervised learning: feedforward neural network architectures and supervised learning, perceptron architecture, error-correction learning and its limitations, multilayer perceptron (MLP) architecture, back-propagation learning algorithm (a minimal sketch follows this topic list), radial basis function (RBF) network architecture and learning algorithm, comparison with MLP.
- Self-organizing neural networks using unsupervised learning: unsupervised learning, adaptive resonance theory (ART) neural network architecture, learning algorithm, self-organizing map (SOM) neural network architecture, learning algorithm.
- Recurrent neural networks: recurrent neural network architectures, Hopfield neural network energy function, Hebbian learning, stability analysis.
- Deep neural networks: concepts and architectures.
- ANN applications and recent advances: basic issues and strategies in neural network applications, data collection and preprocessing, classification, regression, prediction, and intelligent control, recent advances in neural network research and development, support vector machine (SVM), reinforcement learning, neuro-fuzzy networks, etc.
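As a small illustration of the MLP and back-propagation material above, here is a minimal sketch of a one-hidden-layer perceptron trained by gradient descent in plain NumPy on the XOR problem. The layer sizes, learning rate, and iteration count are arbitrary choices for demonstration, not the course's reference implementation.

```python
# Minimal MLP sketch (not coursework): one hidden layer with sigmoid activations,
# trained by back-propagation on XOR. Sizes, learning rate, and iteration count
# are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Input -> hidden and hidden -> output weights and biases.
W1, b1 = rng.normal(size=(2, 4)), np.zeros((1, 4))
W2, b2 = rng.normal(size=(4, 1)), np.zeros((1, 1))
lr = 0.5

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)      # hidden activations
    out = sigmoid(h @ W2 + b2)    # network output

    # Backward pass: error gradients via the chain rule (squared-error loss).
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)

    # Gradient-descent weight updates.
    W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0, keepdims=True)
    W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0, keepdims=True)

print(np.round(out, 2))  # should be close to [[0], [1], [1], [0]]
```

Deep-learning libraries do the same thing automatically via automatic differentiation; writing the updates out by hand is just a way to see what the back-propagation lectures describe.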
My two cents on the course: I have yet to see SOM or ART mentioned in the mainstream ML/DL world, and I follow quite a lot of quality content from Yann LeCun, Francois Chollet, and the likes of Yoshua Bengio and Geoff Hinton; I have never seen any of them mention these techniques. Still, I feel the course served its purpose, although to what end I am not sure, given that neuro-fuzzy networks were taught and the professor seems to be the only specialist to contact about them; out in the real world, I am not sure what the practical impact of ART, SOM, and neuro-fuzzy networks really is.
However, the project was really amazing, and you can find it here: https://github.com/DarthAmk97/CE889_DL