From 38779c405025f6fed00a68acc51b5bf25fc68111 Mon Sep 17 00:00:00 2001
From: davidho27941
Date: Wed, 11 Sep 2024 11:05:13 +0800
Subject: [PATCH] chore: complete basic information.

---
 README.md | 19 +++++++++++++++++--
 1 file changed, 17 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index db094d1..de83c33 100644
--- a/README.md
+++ b/README.md
@@ -1,6 +1,21 @@
 # weather_data_dbt
-A dbt based data transformation project.
-## Overview
+## Introduction
+
+This is a side project that builds an end-to-end automated data pipeline. We use weather data provided by the Taiwan government as the data source. The pipeline is orchestrated by Apache Airflow, a well-known workflow automation tool. The data downloaded by the Airflow tasks is uploaded to AWS S3, and the S3 storage is configured as an external table in the Snowflake data warehouse. ![Overview](./images/en/project_overview_en.jpg)
+
+## Extract - Load Process
+
+The automated data pipeline in this project is built on a self-hosted Apache Airflow deployment and uses AWS S3 as remote storage. The DAGs run on different schedules and fetch the desired data from the open data platform provided by Taiwan's government. The fetched data is then uploaded to AWS S3.
+
+![Extract-Load](./images/en/extract_load_en.jpg)
+
+## Transformation Process
+
+In the data transformation pipeline, the S3 storage is exposed to Snowflake as an external table, and dbt (data build tool) transforms the data read from that table. The transformation is organized into three layers: staging, intermediate, and artifacts, each serving a different purpose.
+
+![Transformation](./images/en/transformation_en.jpg)
+
+For detailed transformation and processing information, please refer to the auto-generated dbt documentation: [Web Page](https://davidho27941.github.io/weather_data_dbt/#!/overview)
\ No newline at end of file
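
Below is a minimal sketch of what a staging-layer model in this kind of dbt-on-Snowflake setup might look like. The source name, model name, and external-table column paths are illustrative assumptions, not taken from the actual repository; Snowflake external tables expose each raw record through a single `VALUE` variant column, which a staging model typically flattens, renames, and typecasts.

```sql
-- models/staging/stg_weather__observations.sql
-- Hypothetical staging model: reads the Snowflake external table that
-- points at the S3 weather data, then renames and typecasts the raw fields.
-- The source and column names below are assumptions for illustration only.

with source as (

    select * from {{ source('weather_external', 'raw_observations') }}

),

renamed as (

    select
        value:stationId::string       as station_id,
        value:obsTime::timestamp_ntz  as observed_at,
        value:weatherElement          as weather_element
    from source

)

select * from renamed
```

Intermediate models would then build on this with `{{ ref('stg_weather__observations') }}`, and artifact models would expose the final tables; the auto-generated dbt documentation linked above describes the project's actual models.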