Features:

- Support of advanced Airflow features for job prioritization such as slots and priorities.
- Integration with GCP data services such as DLP and Data Catalog.
- Well tested - we maintain a rich suite of both unit and integration tests.

Install with `pip install 'gcp-airflow-foundations'`. See the full gcp-airflow-foundations documentation for more details.

Sample DAGs that ingest publicly available GCS files can be found in the dags folder, and they start as soon as Airflow is run locally. In order to have them run successfully, please ensure the following:

- Enable the BigQuery, Cloud Storage, Cloud DLP, and Data Catalog APIs (a CLI sketch follows this list).
- Create a BigQuery dataset for the HDS and ODS (also covered in the first sketch below).
- Update the `gcp_project`, location, and dataset values, the DLP config, and the policy tag configs with your newly created values.
- Create a service account in GCP, and save it as helpers/key/keys.json - don't worry, it is in .gitignore and will not be pushed to the git repo (see the second sketch below).

To run locally:

- Run Airflow locally: `docker-compose up` (the Airflow UI will be accessible locally; Airflow's webserver defaults to port 8080).
- Default authentication values for the Airflow UI are provided in lines 96 and 97 of docker-compose.yaml.
- Uncomment line 11 in docker-compose.yaml.
- Set the env var PROJECT_ID to your test project (the third sketch below shows a full run sequence).
- Authorize gcloud to access the Cloud Platform with Google user credentials: helpers/scripts/gcp-auth.sh.
- Run the unit tests: `docker-compose run airflow "pytest --cov=gcp_airflow_foundations tests/unit"`.
- Rebuild the docker image if requirements changed: `docker-compose build`.
- Install pre-commit hooks for linting, format checking, etc. (a suggested setup is in the last sketch below).
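A minimal sketch of the API and dataset steps using the gcloud and bq CLIs. The dataset names `airflow_ods` and `airflow_hds` and the `US` location are illustrative placeholders, not names taken from this repo; use whatever values you then put in your config.

```bash
# Enable the APIs the sample DAGs rely on:
# BigQuery, Cloud Storage, Cloud DLP, and Data Catalog.
gcloud services enable \
  bigquery.googleapis.com \
  storage.googleapis.com \
  dlp.googleapis.com \
  datacatalog.googleapis.com \
  --project "$PROJECT_ID"

# Create BigQuery datasets for the ODS and HDS.
# Dataset names and location are placeholders - match them to your config.
bq --location=US mk --dataset "${PROJECT_ID}:airflow_ods"
bq --location=US mk --dataset "${PROJECT_ID}:airflow_hds"
```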
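One way to produce the helpers/key/keys.json key file, assuming the gcloud CLI. The service account name and the IAM role shown are assumptions for illustration; grant whatever roles the DAGs actually need in your project.

```bash
# Create a service account for local Airflow runs (name is a placeholder).
gcloud iam service-accounts create airflow-foundations \
  --project "$PROJECT_ID" \
  --display-name "gcp-airflow-foundations local testing"

# Grant access - roles/bigquery.admin is an illustrative assumption;
# narrow it to what your sample DAGs require.
gcloud projects add-iam-policy-binding "$PROJECT_ID" \
  --member "serviceAccount:airflow-foundations@${PROJECT_ID}.iam.gserviceaccount.com" \
  --role roles/bigquery.admin

# Save the key where the repo expects it (path from the steps above;
# this file is covered by .gitignore).
gcloud iam service-accounts keys create helpers/key/keys.json \
  --iam-account "airflow-foundations@${PROJECT_ID}.iam.gserviceaccount.com"
```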
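Put together, a typical local run sequence might look like the following; `my-test-project` is a placeholder project ID.

```bash
# Authorize gcloud with your Google user credentials
# (script shipped in the repo).
./helpers/scripts/gcp-auth.sh

# Point the stack at your test project (placeholder ID).
export PROJECT_ID=my-test-project

# Start Airflow; the UI is served by the webserver container.
docker-compose up
```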
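The pre-commit step above does not name a tool invocation; a common setup, assuming the repo ships a .pre-commit-config.yaml at its root, is:

```bash
# Install the pre-commit tool and register its git hook
# (assumes a .pre-commit-config.yaml in the repo root).
pip install pre-commit
pre-commit install

# Optionally run all hooks against the whole tree once.
pre-commit run --all-files
```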