
Airbus-Ship-Detection

Project Overview

The aim of this project is to detect ships in satellite images using an event-driven architecture. The project implements an end-to-end U-Net-based deep learning model for detecting ships. The model predicts segmentation masks indicating the ships within each image.
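
Segmentation masks in the Airbus Ship Detection dataset on Kaggle are stored as run-length-encoded (RLE) strings rather than image files. The sketch below shows one plausible way to decode such a string into a binary mask; the function name and the pure-Python list representation are illustrative, not taken from this repository.

```python
def rle_decode(rle, shape):
    """Decode a run-length-encoded mask into a 2D binary mask.

    `rle` is a string of 'start length' pairs with 1-based pixel indices,
    counted column-major (top-to-bottom, then left-to-right), as used by
    the Airbus Ship Detection dataset. `shape` is (height, width).
    """
    h, w = shape
    flat = [0] * (h * w)  # flattened mask in column-major order
    nums = list(map(int, rle.split()))
    for start, length in zip(nums[0::2], nums[1::2]):
        for i in range(start - 1, start - 1 + length):
            flat[i] = 1
    # Pixel i lives at row i % h, column i // h (column-major layout).
    return [[flat[col * h + row] for col in range(w)] for row in range(h)]
```

For example, `rle_decode("1 3", (3, 3))` marks the first column of a 3x3 mask.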

Architecture

The project consists of two main architectures, each containing specific pipelines for different purposes:

  • Training Pipeline Architecture

This pipeline trains the model and consists of four components, orchestrated using Airflow:

  1. Data Ingestion: Downloads the datasets from Kaggle.
  2. Preprocessing: Applies preprocessing techniques to the datasets.
  3. Model Training: Builds and trains the ship detection model.
  4. Model Evaluation: Evaluates the performance of the trained model.
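
The README does not state which metric the evaluation component computes; the Dice coefficient is a common overlap metric for segmentation models like this one. A minimal pure-Python sketch, assuming flat binary masks:

```python
def dice_coefficient(pred, target, eps=1e-7):
    """Dice overlap between two flat binary masks (sequences of 0/1).

    Returns a value in [0, 1]; 1.0 means the masks match exactly.
    `eps` guards against division by zero when both masks are empty.
    """
    intersection = sum(p * t for p, t in zip(pred, target))
    return (2.0 * intersection + eps) / (sum(pred) + sum(target) + eps)
```

For instance, a prediction covering two pixels where the ground truth covers one overlapping pixel scores 2/3.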

Training Workflow

  • Inference Pipeline

This architecture is based on an event-driven approach for making predictions using the trained model.

  1. Producer: Publishes new satellite images from a Satellite API.
  2. Consumer: Consumes the published images and sends a POST request to the prediction API endpoint.
  3. API: Handles the prediction requests and returns the results.
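
The flow above can be sketched with Python's standard library, using an in-memory `queue.Queue` as a stand-in for the real message broker; the `/predict` endpoint path and the `image_id` payload field are assumptions for illustration, not the repository's actual API contract.

```python
import json
import queue
import urllib.request

API_URL = "http://localhost:8585/predict"  # hypothetical endpoint path

broker = queue.Ueue() if False else queue.Queue()  # stand-in for the real broker

def produce(image_id: str) -> None:
    """Producer: publish a reference to a newly available satellite image."""
    broker.put(image_id)

def build_prediction_request(image_id: str) -> urllib.request.Request:
    """Consumer side: wrap the image reference in a POST for the API."""
    payload = json.dumps({"image_id": image_id}).encode()
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def consume_one() -> urllib.request.Request:
    """Consumer: take the next published image and build its request."""
    return build_prediction_request(broker.get())
```

In the deployed system the request would then be sent with `urllib.request.urlopen` (or an HTTP client library) and the API would return the predicted mask.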

Inference Architecture

Getting Started

  1. Clone the Repository

    git clone https://github.com/ldebele/Airbus-Ship-Detection.git
    cd Airbus-Ship-Detection 
  2. Install Docker and Docker Compose

    Follow the instructions on the Docker website to install Docker and Docker Compose.

  3. Build Docker images

  • To build all inference-related images:
    make all_inference
  • To build all training-related images:
    make all_training
    
  4. Start the pipelines
  • To start the inference pipeline:

    make start-inference
  • To start the training pipeline:

    • Initialize the Airflow database:
    make airflow-init
    • Start Airflow:
    make start-training
  5. Access the web interfaces
  • Airflow Web Interface

    Once the cluster has started, you can log into the web interface and begin experimenting with the pipelines.

    Access the Airflow web interface at http://localhost:8080 using the default credentials: Username: airflow, Password: airflow.

  • MLflow Web Interface

    Access the MLflow experiment tracker at http://localhost:5000

  • API Web Server

    Access the prediction API web server at http://localhost:8585

  6. Stop and delete the containers
    make cleanup

License

This project is licensed under the MIT License. See the LICENSE file for more details.

Contact

Lemi Debela - [email protected]

Project Link: https://github.com/ldebele/Airbus-Ship-Detection
