This repository contains the code I used to compete in the Kaggle Titanic - Machine Learning from Disaster competition: https://www.kaggle.com/competitions/titanic/overview
The best-performing model, an XGBoost random forest, reached an accuracy of 0.777, which placed in the top 20% of the competition leaderboard.
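For reference, a minimal sketch of how an XGBoost random forest could be fit to the Titanic training data is shown below. This is not the exact training script from this repository; the file path, feature selection, and hyperparameters are assumptions for illustration only.

```python
# Minimal sketch: fitting an XGBoost random forest (XGBRFClassifier)
# on the Kaggle Titanic training data. Paths, features, and
# hyperparameters are illustrative assumptions, not the repo's script.
import pandas as pd
from sklearn.model_selection import train_test_split
from xgboost import XGBRFClassifier

# Load the Kaggle training file (path is an assumption).
df = pd.read_csv("train.csv")

# Basic feature preparation: encode sex, fill missing ages and fares.
df["Sex"] = (df["Sex"] == "male").astype(int)
df["Age"] = df["Age"].fillna(df["Age"].median())
df["Fare"] = df["Fare"].fillna(df["Fare"].median())

features = ["Pclass", "Sex", "Age", "SibSp", "Parch", "Fare"]
X = df[features]
y = df["Survived"]

X_train, X_val, y_train, y_val = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

# XGBoost's random-forest mode: many trees grown in a single boosting
# round, each on a row subsample with per-node column subsampling.
model = XGBRFClassifier(
    n_estimators=200,
    max_depth=6,
    subsample=0.8,
    colsample_bynode=0.8,
    random_state=42,
)
model.fit(X_train, y_train)

print("Validation accuracy:", model.score(X_val, y_val))
```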
I do not own the underlying data used by these models. This repository is published only as a public record of my practical experience with Keras, scikit-learn, and bagging/boosting algorithms.