Hi team~
I have two Airflow clusters that run a DAG with the same DAG ID but different workflow configurations. When I ingest from both Airflow clusters into DataHub, all tasks are recognized as one data pipeline because the DAG ID is the same. Is there any way to recognize them as separate data pipelines?
My environment is as follows:
Are you using UI or CLI for ingestion? UI + Airflow plugin
Which DataHub version are you using? 0.11.0
What data source(s) are you integrating with DataHub? Apache Airflow
You can use the `cluster` field of the `[datahub]` section in airflow.cfg to distinguish different Airflow instances. I believe the value of the `cluster` field becomes part of the URNs of the Airflow entities: https://datahubproject.io/docs/lineage/airflow/#configuration
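A minimal sketch of what this could look like, assuming the `[datahub]` section is configured per the Airflow plugin docs linked above (the `conn_id` and cluster names below are hypothetical placeholders):

```
# airflow.cfg on the first Airflow cluster
[datahub]
enabled = true
conn_id = datahub_rest_default
cluster = prod_cluster_a

# airflow.cfg on the second Airflow cluster
[datahub]
enabled = true
conn_id = datahub_rest_default
cluster = prod_cluster_b
```

With distinct `cluster` values, the two DAGs should ingest as separate entities, since the cluster is part of the DataFlow URN (e.g. something like `urn:li:dataFlow:(airflow,<dag_id>,prod_cluster_a)` vs `urn:li:dataFlow:(airflow,<dag_id>,prod_cluster_b)`).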