Hi team
Hello, this is the error message I am getting on dbt ingestions (DH version 11.0).
<@U04N9PYJBEW> Thanks for helping me out on this. This is the error from the Office Hours call just now.
```
{'error': 'Unable to emit metadata to DataHub GMS: java.lang.NullPointerException',
 'info': {'exceptionClass': 'com.linkedin.restli.server.RestLiServiceException',
          'message': 'java.lang.NullPointerException',
          'status': 500,
          'id': 'urn:li:dataset:
```

```
java.io.IOException: Unable to parse response body for Response{requestLine=POST /_bulk?timeout=1m HTTP/1.1, host=https://vpc-datahub-2frsptdmsahrwqd5lhurzzx4d4.us-west-2.es.amazonaws.com:443, response=HTTP/1.1 200 OK}
```
Is it possible that ES is unhealthy?
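One quick way to check is the cluster health endpoint. A minimal sketch in Python, assuming the domain is reachable from wherever you run this and doesn't need signed requests (adjust auth for your OpenSearch/VPC setup):
```
import requests

# Ask Elasticsearch/OpenSearch for cluster health; a "red" status or a pile of
# unassigned shards would line up with the bulk-indexing failure above.
ES_HOST = "https://vpc-datahub-2frsptdmsahrwqd5lhurzzx4d4.us-west-2.es.amazonaws.com:443"

resp = requests.get(f"{ES_HOST}/_cluster/health", timeout=10)
resp.raise_for_status()
health = resp.json()
print(health["status"], health.get("number_of_nodes"), health.get("unassigned_shards"))
```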
I suspect that this is a more general problem in the backend, and not something specific to dbt, because it looks like it's failing on both the subtypes aspect and the main MCEs.
In any case, the NPE definitely should be handled better, although I'm not exactly sure where it's originating from, since it seems unlikely that there's a bug in `RestliUtil.toTask`. cc <@UV5UEC3LN>
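One way to narrow it down would be to take dbt out of the picture entirely and emit a single aspect straight at GMS with the Python emitter. A rough sketch, assuming a recent acryl-datahub; the GMS URL and the table name are placeholders, not the real ones from this thread:
```
from datahub.emitter.mcp import MetadataChangeProposalWrapper
from datahub.emitter.rest_emitter import DatahubRestEmitter
from datahub.metadata.schema_classes import SubTypesClass

# Placeholder GMS endpoint -- point this at your instance.
emitter = DatahubRestEmitter(gms_server="http://localhost:8080")

# Placeholder dataset urn in the same shape as the failing ones.
mcp = MetadataChangeProposalWrapper(
    entityUrn="urn:li:dataset:(urn:li:dataPlatform:snowflake,dev_mart_db.ea_ops_trans.some_table,PROD)",
    aspect=SubTypesClass(typeNames=["Table"]),
)

# If this also comes back as a 500/NPE, the problem is in GMS (or ES behind it),
# not in the dbt source.
emitter.emit(mcp)
```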
```
java.net.URISyntaxException: mismatched paren nesting: urn:li:dataset:(urn:li:dataPlatform:snowflake,dev_mart_db.ea_ops_trans.sharp_ai_conversations_dr_predict
```
This also seems problematic
The one above is a Snowflake dataset, but it comes in through the dbt source.
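For comparison, a well-formed dataset urn has balanced parens, e.g. urn:li:dataset:(urn:li:dataPlatform:snowflake,db.schema.table,PROD). The urn in that exception looks cut off before the closing paren (at least in the paste), which would explain the mismatched-paren complaint. A quick sketch of building one with the SDK helper (the table name is a placeholder):
```
from datahub.emitter.mce_builder import make_dataset_urn

# Placeholder table name; this prints something like
# urn:li:dataset:(urn:li:dataPlatform:snowflake,dev_mart_db.ea_ops_trans.some_table,PROD)
urn = make_dataset_urn(
    platform="snowflake",
    name="dev_mart_db.ea_ops_trans.some_table",
    env="PROD",
)
print(urn)
```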
We do have a similar issue when ingesting Snowflake directly. That's the weird part: it errors on two of the schemas while everything else runs smoothly, which is why I thought it was a source-side issue rather than a DH issue. But I am not so sure.
Here are some more clues. We are seeing an NPE when ingesting dbt; however, when searching DataHub for the Snowflake object, I also get a 500 error in the UI.
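If it helps to isolate it further, you could try pulling an aspect for that urn straight from GMS with the Python client and see whether that 500s too; if it does, it's the stored entity/backend rather than just the search path in the UI. Rough sketch, assuming a recent acryl-datahub (server URL and urn are placeholders):
```
from datahub.ingestion.graph.client import DataHubGraph, DatahubClientConfig
from datahub.metadata.schema_classes import StatusClass

# Placeholder GMS endpoint and urn -- substitute the failing table's urn.
graph = DataHubGraph(DatahubClientConfig(server="http://localhost:8080"))
urn = "urn:li:dataset:(urn:li:dataPlatform:snowflake,dev_mart_db.ea_ops_trans.some_table,PROD)"

# Fetch a single aspect; a 500 here as well would point at GMS/ES rather than the UI.
status = graph.get_aspect(entity_urn=urn, aspect_type=StatusClass)
print(status)
```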
Getting the following error on a dbt ingestion:
And when I pull up that same table in DH, I also get an error: