DataHub 0.12.1 Docker ingestion issue with apparent 350-table limit

Original Slack Thread

Hey everybody, I'm running DataHub 0.12.1 on top of Docker and I'm trying to build lineage for SQL Server tables by ingesting the table metadata into DataHub. I ran into a problem: I've already ingested 350 tables and DataHub doesn't seem to let me ingest more than those 350 tables… I have a lot more to ingest…
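(For context, a typical programmatic SQL Server ingestion looks roughly like the sketch below. This is only a minimal example assuming the `mssql` source plugin and DataHub's Python Pipeline API; the host, database, credentials, and DataHub server URL are placeholders, not values from this thread.)

```python
# Minimal sketch of a SQL Server -> DataHub ingestion pipeline.
# Assumes the mssql plugin is installed: pip install 'acryl-datahub[mssql]'
# All connection values below are placeholders.
from datahub.ingestion.run.pipeline import Pipeline

pipeline = Pipeline.create(
    {
        "source": {
            "type": "mssql",
            "config": {
                "host_port": "localhost:1433",   # placeholder SQL Server host
                "database": "MyDatabase",        # placeholder database name
                "username": "datahub_reader",    # placeholder credentials
                "password": "example-password",
            },
        },
        "sink": {
            "type": "datahub-rest",
            "config": {"server": "http://localhost:8080"},  # placeholder GMS endpoint
        },
    }
)

pipeline.run()
# Print the per-run report (successes, warnings, failures) -- useful for
# spotting why an ingestion appears to stop after a certain number of tables.
pipeline.pretty_print_summary()
pipeline.raise_from_status()
```

If you're using the CLI instead, the same source/sink configuration goes into a recipe YAML and is run with `datahub ingest -c <recipe>.yml`; the summary printed at the end of the run usually contains the error that stopped the ingestion.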

Hey there! :wave: Make sure your message includes the following information if relevant, so we can help more effectively!

  1. Are you using UI or CLI for ingestion?
  2. Which DataHub version are you using? (e.g. 0.12.0)
  3. What data source(s) are you integrating with DataHub? (e.g. BigQuery)

<@U06TM7M40RK> Can you help with the questions in this thread?


:information_source: We’re experimenting with a new AI assistant for questions in the <#CUMUWQU66> channel to help you track down helpful resources more quickly. Keep in mind that responses may not always be spot-on; feel free to ask it follow-up questions and don’t forget to give its responses a :thumbsup: or :thumbsdown: so other folks in the Community know if you’re still looking for help on the topic!

I’m not able to find any data sources to answer your question.

Could you provide more details on the error messages or logs you’re encountering during the ingestion process?

If this doesn’t help, please reach out to the DataHub support team for further assistance. :bulb: Hint: Mention @runllm in the thread for followups.

<@U06494MENN8> Normally we don’t have such a limit.
Can you please share what kind of error you’re getting?