# 🐞 This repo and `Airflow` repo conflict over desired state of `dags` folder. #6419
Labels: bug
### Describe the bug
Currently, we provision `airflow_local_settings.py`, a file that allows us to set status messages in the Airflow UI, from this repo. However, to be processed, Airflow requires that the script live in the `dags` folder. This means that after this repo pushes these files, the state of that folder is in conflict with the Sync Dags to S3 script in the `Airflow` repo. That script runs

```shell
aws s3 sync ./environments/${{ matrix.env }}/dags/ s3://mojap-airflow-${{ matrix.env }}/dags/ --exclude "*" --include "*.py" --delete
```

every time a DAG is altered by a user. When this occurs, the `--delete` flag means that our `airflow_local_settings.py` is correctly identified by the script as not existing in the `Airflow` repo, and is therefore destroyed. This back and forth of placing and deleting the file will occur each time someone opens a PR against the relevant sections of either repo, and should be resolved so that we can put up long-lasting status messages.
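The effect of `--delete` can be reproduced locally without touching AWS; the sketch below emulates the sync's filter-and-delete semantics in plain shell (the file and directory names are illustrative, not taken from either repo):

```shell
# Emulate `aws s3 sync SRC DST --exclude "*" --include "*.py" --delete`:
# copy every .py file from src to dst, then delete any .py file in dst
# that has no counterpart in src.
src=$(mktemp -d); dst=$(mktemp -d)

echo "dag"      > "$src/my_dag.py"                  # tracked in the Airflow repo
echo "settings" > "$dst/airflow_local_settings.py"  # pushed separately by this repo

cp "$src"/*.py "$dst"/           # the "sync" half: copy matching files across

for f in "$dst"/*.py; do         # the "--delete" half: prune files absent from src
  [ -e "$src/$(basename "$f")" ] || rm "$f"
done

ls "$dst"                        # only my_dag.py survives; the settings file is gone
```

Because the settings file exists only on the destination side, the delete pass always removes it, exactly as the real sync does.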
### To Reproduce

See `analytical-platform-data-production/airflow` for the `airflow_local_settings.py` files being created in the Airflow S3 buckets, and the `Airflow` repo for the sync script.

### Expected Behaviour
We should be able to store `airflow_local_settings.py` in the bucket without it being removed.

### Additional context
We can either move the local settings to being managed from within the `Airflow` repo, or re-write the sync action so it will correctly ignore any DAGs not in a sub-directory. Either will achieve what we want here.
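If the second option is taken, one possible shape for the rewritten sync step (a sketch, not the agreed fix) is to shield the settings file with a trailing `--exclude`; under the AWS CLI's last-match-wins filter ordering, an excluded file is neither copied nor deleted:

```shell
# Workflow fragment (GitHub Actions context as in the existing step).
# The final --exclude overrides the earlier --include "*.py" for this one
# file, so --delete leaves airflow_local_settings.py in the bucket alone.
aws s3 sync ./environments/${{ matrix.env }}/dags/ \
  s3://mojap-airflow-${{ matrix.env }}/dags/ \
  --exclude "*" --include "*.py" \
  --exclude "airflow_local_settings.py" \
  --delete
```

Alternatively, narrowing the include pattern to `--include "*/*.py"` would sync only files inside sub-directories, which is closer to the "ignore anything not in a sub-directory" option described above.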