Hello,
I would like to ingest all the entities I have in the Hive metastore into DataHub. For example, Spark writes some tables into the metastore using .saveAsTable(). How can I ingest the Hive metastore metadata? When I checked the Hive ingestion source, I saw that it expects HiveServer2 rather than the metastore thrift port (9083). Has anyone succeeded in ingesting Hive metadata this way?
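For context, this is roughly how the tables end up in the Hive metastore (a minimal PySpark sketch; the database/table names are placeholders):

```python
from pyspark.sql import SparkSession

# Spark session backed by the Hive metastore (the thrift://...:9083 service)
spark = (
    SparkSession.builder
    .appName("write-to-metastore")
    .enableHiveSupport()
    .getOrCreate()
)

spark.sql("CREATE DATABASE IF NOT EXISTS my_db")

df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "value"])

# Registers a managed table in the Hive metastore
df.write.mode("overwrite").saveAsTable("my_db.my_table")
```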
Hey there! Make sure your message includes the following information if relevant, so we can help more effectively!
- Are you using UI or CLI for ingestion?
- Which DataHub version are you using? (e.g. 0.12.0)
- What data source(s) are you integrating with DataHub? (e.g. BigQuery)
<@UV14447EU> do you have any ideas here? Thanks!
<@U04HJ2C8Y49> if you can connect to the Hive Metastore DB (MySQL or Postgres), then you can use the presto-on-hive source to ingest metadata ->
https://datahubproject.io/docs/generated/ingestion/sources/presto-on-hive/
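As a rough sketch, the same recipe can be run programmatically with DataHub's Python Pipeline API. The config keys and connection values below are placeholders based on the linked presto-on-hive docs, so double-check them against your DataHub version and metastore backend:

```python
from datahub.ingestion.run.pipeline import Pipeline

# Connects directly to the metastore backing database, not HiveServer2.
pipeline = Pipeline.create(
    {
        "source": {
            "type": "presto-on-hive",
            "config": {
                "host_port": "metastore-db-host:5432",   # placeholder
                "database": "metastore",                 # metastore DB name
                "username": "hive",
                "password": "hive",
                "scheme": "postgresql+psycopg2",         # or mysql+pymysql for a MySQL metastore
            },
        },
        "sink": {
            "type": "datahub-rest",
            "config": {"server": "http://localhost:8080"},
        },
    }
)
pipeline.run()
pipeline.raise_from_status()
```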