-
Data Science 102 Bootcamp that I gave in association with Kodluyoruz
-
I want to give my special thanks to the assistants of this bootcamp for their hard work helping students and their continuous support throughout the bootcamp.
-
You can find the lecture videos at:
🔗 https://www.youtube.com/playlist?list=PLoaCNumrILN8D5rfBtv83g3WspOyI8bhT
-
Mathematics Review:
- Why does the gradient give the direction of greatest increase? (a short derivation follows this list)
- Linear Algebra
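A one-line derivation (standard vector calculus, nothing course-specific): for a unit vector u, the directional derivative is

D_u f = \nabla f \cdot u = \|\nabla f\| \, \|u\| \cos\theta = \|\nabla f\| \cos\theta

which is maximized when \cos\theta = 1, i.e. when u points along \nabla f. So the gradient is the direction of greatest increase, and its norm is the rate of increase in that direction.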
-
Random Forest
- Feature Importance (a permutation-importance sketch follows this list)
- Correlated Variables
- Feature Selection
- Profiling to speed up training
- Extrapolation Problem
- How to Deal with Extrapolation
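Permutation importance is one simple way to measure feature importance; here is a minimal sketch, assuming scikit-learn (the toy dataset and model are my stand-ins, not the bootcamp's code):

# Minimal sketch of permutation feature importance, assuming scikit-learn.
# Idea: shuffle one column at a time and see how much the validation score drops.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=5, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
base_score = model.score(X_valid, y_valid)

rng = np.random.default_rng(0)
for col in range(X_valid.shape[1]):
    X_perm = X_valid.copy()
    X_perm[:, col] = rng.permutation(X_perm[:, col])  # break this feature's link to y
    print(f"feature {col}: importance ~ {base_score - model.score(X_perm, y_valid):.4f}")

Note this is also where correlated variables bite: permuting one of two correlated features understates its importance, because the model can lean on the other.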
-
When not to use random splitting and k-fold cross-validation?
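The classic case is time-ordered data: a random split leaks the future into training. A time-based split keeps validation rows after training rows; a minimal sketch using scikit-learn's TimeSeriesSplit:

# Sketch: for time-ordered data, split by time instead of randomly.
import numpy as np
from sklearn.model_selection import TimeSeriesSplit

X = np.arange(10).reshape(-1, 1)  # rows assumed already sorted by time
y = np.arange(10)
for train_idx, valid_idx in TimeSeriesSplit(n_splits=3).split(X):
    print("train:", train_idx, "valid:", valid_idx)  # validation always comes later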
-
Naive Bayes
- Bayes Formula
- Writing Naive Bayes from Scratch
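A minimal Gaussian Naive Bayes written from scratch as an illustration (my sketch, not necessarily the bootcamp's exact code); it applies Bayes' formula P(c|x) ∝ P(c) Π_i P(x_i|c) in log space:

# Gaussian Naive Bayes from scratch (illustrative sketch).
import numpy as np

class GaussianNB:
    def fit(self, X, y):
        self.classes = np.unique(y)
        self.priors = np.array([(y == c).mean() for c in self.classes])
        self.means = np.array([X[y == c].mean(axis=0) for c in self.classes])
        self.vars = np.array([X[y == c].var(axis=0) + 1e-9 for c in self.classes])
        return self

    def predict(self, X):
        # log P(c) + sum_i log N(x_i; mean_ci, var_ci), in log space for stability
        diff = X[:, None, :] - self.means                 # shape (n, classes, features)
        log_lik = -0.5 * (np.log(2 * np.pi * self.vars) + diff ** 2 / self.vars).sum(axis=2)
        return self.classes[np.argmax(np.log(self.priors) + log_lik, axis=1)]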
-
Neural Network from Scratch
- Why do Neural Networks work?
- Why do we normalize input?
- Why do we normalize layers?
- Writing Matrix Multiplication
- Writing Forward and Backward Passes
- Training Loop
- Understanding Optimizers (a from-scratch sketch follows this list)
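To make the forward pass, backward pass, and training loop concrete, here is a compact from-scratch sketch (one hidden layer, ReLU, MSE loss, plain SGD; all sizes are arbitrary choices of mine):

# Tiny MLP trained from scratch: forward pass, manual backward pass, SGD loop.
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((64, 10))
y = rng.standard_normal((64, 1))

W1 = rng.standard_normal((10, 32)) * (2 / 10) ** 0.5   # Kaiming-style scaling
b1 = np.zeros(32)
W2 = rng.standard_normal((32, 1)) * (1 / 32) ** 0.5
b2 = np.zeros(1)

for step in range(100):
    # forward
    z1 = X @ W1 + b1
    a1 = np.maximum(z1, 0)                  # ReLU
    out = a1 @ W2 + b2
    loss = ((out - y) ** 2).mean()

    # backward: chain rule, written out by hand
    dout = 2 * (out - y) / len(y)
    dW2, db2 = a1.T @ dout, dout.sum(axis=0)
    dz1 = (dout @ W2.T) * (z1 > 0)          # ReLU gradient mask
    dW1, db1 = X.T @ dz1, dz1.sum(axis=0)

    # SGD update (the simplest optimizer)
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= 0.1 * g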
-
Deep Learning for Tabular Data
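The usual deep-learning pattern for tabular data is categorical embeddings concatenated with continuous features; a minimal PyTorch sketch (layer and embedding sizes are my arbitrary choices):

# Sketch: embeddings for categorical columns + raw continuous columns -> MLP.
import torch
import torch.nn as nn

class TabularModel(nn.Module):
    def __init__(self, cardinalities, n_cont, n_out):
        super().__init__()
        # one embedding per categorical column; sizing rule here is just a heuristic
        self.embeds = nn.ModuleList(nn.Embedding(c, min(50, (c + 1) // 2))
                                    for c in cardinalities)
        n_emb = sum(e.embedding_dim for e in self.embeds)
        self.net = nn.Sequential(nn.Linear(n_emb + n_cont, 100), nn.ReLU(),
                                 nn.Linear(100, n_out))

    def forward(self, x_cat, x_cont):
        embs = [e(x_cat[:, i]) for i, e in enumerate(self.embeds)]
        return self.net(torch.cat(embs + [x_cont], dim=1))

model = TabularModel(cardinalities=[10, 4], n_cont=3, n_out=1)
out = model(torch.randint(0, 4, (8, 2)), torch.randn(8, 3))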
-
When not to use Softmax?
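One answer: when classes are not mutually exclusive, or when "none of the above" is a valid outcome. Softmax forces the outputs to sum to 1, so it always asserts exactly one class; for multi-label problems a sigmoid per class fits better. A small sketch:

# Multi-label targets: use a per-class sigmoid (BCEWithLogitsLoss), not softmax.
import torch
import torch.nn as nn

logits = torch.randn(4, 3)                    # batch of 4, 3 independent labels
targets = torch.tensor([[1., 0., 1.],
                        [0., 0., 0.],         # "none of the above" is allowed
                        [0., 1., 0.],
                        [1., 1., 1.]])
loss = nn.BCEWithLogitsLoss()(logits, targets)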
-
PyTorch Hooks: Wanna see what is going on in your model?
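For example, a forward hook can record activation statistics as data flows through the model (an illustrative sketch; the model here is a stand-in):

# Forward hooks: record the mean/std of every layer's output during a forward pass.
import torch
import torch.nn as nn

model = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 2))
stats = []

def hook(module, inputs, output):
    stats.append((type(module).__name__, output.mean().item(), output.std().item()))

handles = [m.register_forward_hook(hook) for m in model]
model(torch.randn(32, 10))
for name, mean, std in stats:
    print(f"{name}: mean={mean:+.3f} std={std:.3f}")
for h in handles:
    h.remove()    # always remove hooks when you are done with them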
-
Initialization does matter - a lot!
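A quick demonstration of why: with a poorly scaled init, activations vanish (or explode) layer by layer, while Kaiming-style scaling keeps them stable through a deep ReLU stack:

# Activation std through 50 ReLU layers: naive init vs Kaiming-style init.
import torch

x = torch.randn(512, 256)
for scale, label in [(0.01, "too small"), ((2 / 256) ** 0.5, "kaiming")]:
    h = x
    for _ in range(50):
        h = torch.relu(h @ (torch.randn(256, 256) * scale))
    print(f"{label}: std after 50 layers = {h.std():.3e}")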
-
Dataset, DataLoader from scratch
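The two abstractions are small: a Dataset is anything indexable with a length, and a DataLoader turns it into (optionally shuffled) batches. A minimal sketch, ignoring workers, pinning, and the rest:

# Dataset = indexing + length; DataLoader = shuffling + batching on top of it.
import numpy as np

class Dataset:
    def __init__(self, x, y): self.x, self.y = x, y
    def __len__(self): return len(self.x)
    def __getitem__(self, i): return self.x[i], self.y[i]

class DataLoader:
    def __init__(self, ds, bs, shuffle=False):
        self.ds, self.bs, self.shuffle = ds, bs, shuffle
    def __iter__(self):
        n = len(self.ds)
        idxs = np.random.permutation(n) if self.shuffle else np.arange(n)
        for i in range(0, n, self.bs):
            xs, ys = zip(*[self.ds[j] for j in idxs[i:i + self.bs]])
            yield np.stack(xs), np.stack(ys)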
-
Why does BatchNorm work? No, it is not about internal covariate shift
- Writing BatchNorm from scratch
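An illustrative from-scratch version (1D case, simplified: batch statistics in training, a running average at inference, plus a learned scale and shift):

# BatchNorm1d from scratch: normalize per batch, keep running stats for
# inference, and learn a scale (gamma) and shift (beta).
import torch
import torch.nn as nn

class BatchNorm(nn.Module):
    def __init__(self, nf, mom=0.1, eps=1e-5):
        super().__init__()
        self.mom, self.eps = mom, eps
        self.gamma = nn.Parameter(torch.ones(nf))
        self.beta = nn.Parameter(torch.zeros(nf))
        self.register_buffer("running_mean", torch.zeros(nf))
        self.register_buffer("running_var", torch.ones(nf))

    def forward(self, x):
        if self.training:
            mean, var = x.mean(dim=0), x.var(dim=0, unbiased=False)
            with torch.no_grad():
                self.running_mean.lerp_(mean, self.mom)   # exponential moving average
                self.running_var.lerp_(var, self.mom)
        else:
            mean, var = self.running_mean, self.running_var
        return self.gamma * (x - mean) / (var + self.eps).sqrt() + self.beta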
-
Writing Learning Rate Finder
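The core loop is short: raise the learning rate exponentially every mini-batch, record the loss, stop once it diverges, then pick an LR on the steep downward part of the loss-vs-LR curve. A sketch (model, loss_fn, and train_dl are assumed to exist):

# Learning rate finder: exponentially increasing LR, stop when the loss blows up.
import torch

def lr_find(model, loss_fn, train_dl, lr_min=1e-7, lr_max=10, n_steps=100):
    opt = torch.optim.SGD(model.parameters(), lr=lr_min)
    mult = (lr_max / lr_min) ** (1 / n_steps)
    lrs, losses, best = [], [], float("inf")
    for step, (xb, yb) in enumerate(train_dl):
        loss = loss_fn(model(xb), yb)
        opt.zero_grad(); loss.backward(); opt.step()
        lrs.append(opt.param_groups[0]["lr"]); losses.append(loss.item())
        best = min(best, loss.item())
        if step >= n_steps or loss.item() > 4 * best:   # diverged -> stop
            break
        opt.param_groups[0]["lr"] *= mult               # exponential LR schedule
    return lrs, losses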
-
MixUp augmentation
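The essence of MixUp (Zhang et al.) in a few lines: blend pairs of inputs and their one-hot targets with a Beta-distributed weight and train on the blend. A simplified sketch:

# MixUp: train on convex combinations of example pairs and their targets.
import torch

def mixup_batch(x, y_onehot, alpha=0.4):
    lam = torch.distributions.Beta(alpha, alpha).sample()
    perm = torch.randperm(x.size(0))
    x_mix = lam * x + (1 - lam) * x[perm]
    y_mix = lam * y_onehot + (1 - lam) * y_onehot[perm]
    return x_mix, y_mix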
-
Label Smoothing
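The idea in one line: replace hard 0/1 targets with 1 - eps on the true class and eps spread over all classes, so the model is never pushed toward infinitely confident logits. A sketch of the loss:

# Label-smoothing cross-entropy: (1 - eps) * NLL + eps * uniform-target part.
import torch.nn.functional as F

def label_smoothing_loss(logits, target, eps=0.1):
    log_probs = F.log_softmax(logits, dim=1)
    nll = F.nll_loss(log_probs, target)       # loss on the true class only
    smooth = -log_probs.mean(dim=1).mean()    # average loss over all classes
    return (1 - eps) * nll + eps * smooth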
-
Wanna train your models fast? Try Mixed Precision Training
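The standard PyTorch recipe is autocast plus gradient scaling; a sketch on an assumed model, loss_fn, and train_dl (CUDA required):

# Mixed precision: run the forward pass in float16 where safe, and scale the
# loss so small gradients do not underflow in float16.
import torch

scaler = torch.cuda.amp.GradScaler()
opt = torch.optim.SGD(model.parameters(), lr=0.1)

for xb, yb in train_dl:
    xb, yb = xb.cuda(), yb.cuda()
    opt.zero_grad()
    with torch.cuda.amp.autocast():
        loss = loss_fn(model(xb), yb)
    scaler.scale(loss).backward()   # backprop on the scaled loss
    scaler.step(opt)                # unscales grads; skips the step on inf/nan
    scaler.update()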
-
Instructor: Engin Deniz Alpman
-
Assistants (alphabetically):
- Ahmet Arif Avcı
- Burak Bagatarhan
- Elif Bayındır
- Fahri Bilici
- Kubilay Gazioğlu
- Melis Han
- Neris Özen
- Rana Kalkan
- Uğur Emek
-
In creating this course, I drew on a good deal of knowledge from StatQuest and fastai; thank you both for creating great content. You can find their work at the links below:
- StatQuest: https://www.youtube.com/user/joshstarmer
- fastai: https://www.fast.ai/