GitHub / donjude / data-pipelines-with-airflow
This data pipeline is an ETL process built with Apache Airflow that loads data from an Amazon S3 bucket, stages the workload, and inserts it into an Amazon Redshift data warehouse.
JSON API: http://repos.ecosyste.ms/api/v1/hosts/GitHub/repositories/donjude%2Fdata-pipelines-with-airflow
PURL: pkg:github/donjude/data-pipelines-with-airflow
Stars: 1
Forks: 0
Open issues: 0
License: None
Language: Python
Size: 660 KB
Dependencies parsed at: Pending
Created at: almost 4 years ago
Updated at: over 3 years ago
Pushed at: almost 4 years ago
Last synced at: over 2 years ago
Topics: amazon-redshift, apache-airflow, elt, etl, pipeline, python, redshift, s3, s3-bucket, sql
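The S3 → staging → Redshift flow in the description above typically boils down to rendering a Redshift COPY statement (to stage JSON from S3) followed by an INSERT into the final tables. A minimal sketch of that SQL-rendering step is below; the table names, bucket path, and IAM role ARN are illustrative placeholders, not values from this repository.

```python
# Hypothetical sketch of the S3 -> staging -> Redshift pattern described above.
# All identifiers (tables, bucket, role) are assumptions for illustration.

def build_copy_sql(table: str, s3_path: str, iam_role: str,
                   json_format: str = "auto") -> str:
    """Render a Redshift COPY statement that stages JSON data from S3."""
    return (
        f"COPY {table}\n"
        f"FROM '{s3_path}'\n"
        f"IAM_ROLE '{iam_role}'\n"
        f"FORMAT AS JSON '{json_format}';"
    )

def build_insert_sql(target: str, select_sql: str) -> str:
    """Render the INSERT that moves staged rows into a final table."""
    return f"INSERT INTO {target}\n{select_sql};"

if __name__ == "__main__":
    copy_sql = build_copy_sql(
        "staging_events",                                    # assumed table
        "s3://example-bucket/log_data",                      # assumed path
        "arn:aws:iam::123456789012:role/redshift-s3-read",   # placeholder ARN
    )
    insert_sql = build_insert_sql(
        "songplays",                                         # assumed table
        "SELECT * FROM staging_events",
    )
    print(copy_sql)
    print(insert_sql)
```

In an Airflow deployment, each rendered statement would be executed by a task (for example via a PostgresHook pointed at Redshift), with the DAG ordering the staging tasks before the insert tasks.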