Finding an Efficient Way to Ingest S3 Buckets with DataHub

Original Slack Thread

Hello team, I have a question about S3 ingestion.
I'm trying to add S3 to DataHub, and I want to ingest the files under every folder path in the bucket.
To do that, I have to do something like the below.

```
include: 's3://{bucket_name}/folder1/*.*'
include: 's3://{bucket_name}/folder2/*.*'
include: 's3://{bucket_name}/folder1/folder1/*.*'
...
```
But I have many buckets to ingest, and this gets very complicated.
Is there any other way?
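
For context: in a DataHub S3 ingestion recipe, each of these patterns becomes its own entry under path_specs in the source config, so covering every folder means one entry per prefix. A minimal sketch of such a recipe, assuming a version of the S3 source that accepts a path_specs list - the region and sink URL are placeholders, not values from the thread:

```
# Sketch of an S3 ingestion recipe with one path_spec per folder prefix.
# Bucket name, region, and sink URL are placeholder values.
source:
  type: s3
  config:
    path_specs:
      - include: 's3://{bucket_name}/folder1/*.*'
      - include: 's3://{bucket_name}/folder2/*.*'
      - include: 's3://{bucket_name}/folder1/folder1/*.*'
      # ...one entry for every folder path you want ingested
    aws_config:
      aws_region: 'us-east-1'

sink:
  type: datahub-rest
  config:
    server: 'http://localhost:8080'
```

The recipe grows linearly with the number of folders, which is exactly the pain point raised above.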

<@U04N9PYJBEW> might help you

Cc: <@UV14447EU>

<@U0608DME3UH> have you tried using `**`?
```
include: 's3://{bucket_name}/**/*.*'
```

<@U01GZEETMEZ>
I tried, but it failed.

The error message is here

and my ingestion source config is below

Ah, my bad - only exclude paths can use `**`.

You can do something like this:

```
include: 's3://{bucket_name}/*/*.*'
include: 's3://{bucket_name}/*/*/*.*'
```
It's not great, but it should get the job done.
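
Put together as a recipe, the workaround looks roughly like this - one include per directory depth you expect files at, and (per the note above) the exclude list is where `**` is allowed. A sketch; the excluded `_tmp` prefix is just an illustrative placeholder:

```
# Sketch: wildcard includes, one per directory depth.
source:
  type: s3
  config:
    path_specs:
      - include: 's3://{bucket_name}/*/*.*'
      - include: 's3://{bucket_name}/*/*/*.*'
      - include: 's3://{bucket_name}/*/*/*/*.*'
        # exclude (set per path_spec entry) can use ** to skip
        # prefixes at any depth; '_tmp' is just an example.
        exclude:
          - '**/_tmp/**'
# (aws_config and sink config same as in the earlier sketch)
```

You add as many include lines as the deepest folder level you need to cover; files nested deeper than the listed patterns won't be picked up.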

Thank you for your help <@U01GZEETMEZ>