Hello everyone,
During BigQuery ingestion through the UI, we’ve noticed that if we have multiple tables with the same name in different datasets, only the table from the last dataset gets ingested.
We have configured our ingestion recipe to allow specific datasets and we are running a single ingestion for the whole project.
Is there a configuration option that we are missing? The ingestion recipe is in the
Should we consider having separate ingestion recipes? We have a single ingestion for Hive and it ingests data into separate databases.
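For reference, this is roughly what our setup looks like, expressed as a programmatic pipeline instead of the UI recipe (project id, dataset names, and sink address are placeholders, and field names may need adjusting for your DataHub version):
```
from datahub.ingestion.run.pipeline import Pipeline

# Roughly equivalent to our UI recipe: one ingestion for the whole project,
# with an allow-list of specific datasets. All identifiers are placeholders.
pipeline = Pipeline.create(
    {
        "source": {
            "type": "bigquery",
            "config": {
                "project_ids": ["project_id"],  # placeholder project
                "dataset_pattern": {
                    "allow": ["dataset_1", "dataset_2"]  # the allowed datasets
                },
                # credentials omitted; we rely on GOOGLE_APPLICATION_CREDENTIALS
            },
        },
        "sink": {
            "type": "datahub-rest",
            "config": {"server": "http://localhost:8080"},  # placeholder GMS address
        },
    }
)
pipeline.run()
pipeline.raise_from_status()
```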
I only see one table, from the last ingested dataset. For example, if dataset_1 and dataset_2 both contain a table called test_table, I only see project_id.dataset_2.test_table
I was able to debug this. The issue was that the missing table has type CLONE, and in the BigQuery ingestion’s queries.py the query that fetches table metadata only considers the BASE TABLE and EXTERNAL table types.
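For anyone hitting the same thing, this is roughly how I confirmed the table type (project and dataset names are placeholders):
```
from google.cloud import bigquery

client = bigquery.Client(project="project_id")  # placeholder project

# List the table types in the dataset whose table was "missing" after ingestion.
# The table that never showed up in DataHub reports table_type = 'CLONE' here.
sql = """
    SELECT table_name, table_type
    FROM `project_id.dataset_1.INFORMATION_SCHEMA.TABLES`
    ORDER BY table_name
"""
for row in client.query(sql).result():
    print(row.table_name, row.table_type)
```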
Does it make sense to add a filter for table types? I’ve noticed that the Snowflake queries have the same condition. If it does, I can create a feature request.
<@U04G3HGFB88> you mean that on Snowflake CLONEd tables are not filtered out?
I think it makes sense not to filter out cloned tables, especially if they are not filtered out on Snowflake
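The change would basically be to widen the table_type filter in the metadata query to also accept CLONE. The real query lives in the BigQuery source’s queries.py and is more involved; this is just a sketch of the shape of it, with illustrative names:
```
# Sketch only: shows widening the table_type filter, not the actual queries.py code.
ALLOWED_TABLE_TYPES = ("BASE TABLE", "EXTERNAL", "CLONE")  # CLONE added

def tables_for_dataset(project_id: str, dataset: str) -> str:
    # Build an INFORMATION_SCHEMA query restricted to the allowed table types.
    type_list = ", ".join(f"'{t}'" for t in ALLOWED_TABLE_TYPES)
    return f"""
        SELECT table_name, table_type
        FROM `{project_id}.{dataset}.INFORMATION_SCHEMA.TABLES`
        WHERE table_type IN ({type_list})
    """

print(tables_for_dataset("project_id", "dataset_1"))
```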