<@U06TM7M40RK> Hi, how can I upload all metadata fields at once using the Java client RestEmitter provided by DataHub?
To upload all metadata fields at once using the Java client RestEmitter provided by DataHub, you can use the MetadataChangeProposalWrapper to construct a metadata change proposal and then emit it with the RestEmitter. Here's a summarized approach based on the available data sources:
- Create a `RestEmitter`: you can create one with the `RestEmitter.createWithDefaults()` method, or customize it via the builder pattern to specify server details and other configuration (see the sketch after this list).
- Construct a `MetadataChangeProposalWrapper`: this involves specifying the entity type, entity URN, and the aspect you want to update. Each wrapper carries a single aspect, so emit one proposal per aspect (see the looping example further below).
- Emit the metadata: use the `emit` method of the `RestEmitter` to send the metadata change proposal to DataHub. You can choose between a blocking call using a `Future` or a non-blocking call using a `Callback`.
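For the customized variant mentioned in the first step, the builder-based factory looks roughly like this; the server URL and token below are placeholders you would replace with your own GMS endpoint and access token:
```java
import datahub.client.rest.RestEmitter;

// A minimal sketch of a customized emitter, assuming a GMS endpoint
// at localhost:8080 and a personal access token.
RestEmitter emitter = RestEmitter.create(b -> b
    .server("http://localhost:8080")
    .token("<your-access-token>"));
```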
Here is a basic example of how you might implement this:
```java
import com.linkedin.dataset.DatasetProperties;
import datahub.client.MetadataWriteResponse;
import datahub.client.rest.RestEmitter;
import datahub.event.MetadataChangeProposalWrapper;
import java.util.concurrent.Future;

// Create the emitter (defaults to a local GMS at http://localhost:8080)
RestEmitter emitter = RestEmitter.createWithDefaults();

// Construct the metadata change proposal
MetadataChangeProposalWrapper mcpw = MetadataChangeProposalWrapper.builder()
    .entityType("dataset")
    .entityUrn("urn:li:dataset:(urn:li:dataPlatform:bigquery,my-project.my-dataset.user-table,PROD)")
    .upsert()
    .aspect(new DatasetProperties().setDescription("This is the canonical User profile dataset"))
    .build();

// Emit the metadata; call requestFuture.get() if you want to block on the result
Future<MetadataWriteResponse> requestFuture = emitter.emit(mcpw, null);
```
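If you prefer the non-blocking path, pass a `Callback` instead of polling the `Future`. A minimal sketch, assuming the `datahub.client.Callback` interface and that `emitter` and `mcpw` come from the example above:
```java
import datahub.client.Callback;
import datahub.client.MetadataWriteResponse;

// Non-blocking emit: the callback fires when the write completes or fails.
emitter.emit(mcpw, new Callback() {
  @Override
  public void onCompletion(MetadataWriteResponse response) {
    System.out.println("Emit succeeded: " + response.isSuccess());
  }

  @Override
  public void onFailure(Throwable exception) {
    System.err.println("Emit failed: " + exception.getMessage());
  }
});
```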
These examples demonstrate how to set up a `RestEmitter`, construct a `MetadataChangeProposalWrapper` with a dataset description, and emit it to DataHub. Because each `MetadataChangeProposalWrapper` carries exactly one aspect, uploading all metadata fields at once means constructing and emitting one proposal per aspect against the same entity URN.
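A sketch of that per-aspect loop, continuing from the example above; the `Ownership` aspect, the `jdoe` owner, and the dataset URN are illustrative placeholders, and checked-exception handling is elided:
```java
import com.linkedin.common.Owner;
import com.linkedin.common.OwnerArray;
import com.linkedin.common.Ownership;
import com.linkedin.common.OwnershipType;
import com.linkedin.common.urn.CorpuserUrn;
import com.linkedin.data.template.RecordTemplate;
import com.linkedin.dataset.DatasetProperties;
import java.util.List;

String datasetUrn = "urn:li:dataset:(urn:li:dataPlatform:bigquery,my-project.my-dataset.user-table,PROD)";

// One aspect per proposal: the description and ownership become two MCPs.
List<RecordTemplate> aspects = List.of(
    new DatasetProperties().setDescription("This is the canonical User profile dataset"),
    new Ownership().setOwners(new OwnerArray(List.of(
        new Owner().setOwner(new CorpuserUrn("jdoe")).setType(OwnershipType.DATAOWNER)))));

for (RecordTemplate aspect : aspects) {
  MetadataChangeProposalWrapper mcpw = MetadataChangeProposalWrapper.builder()
      .entityType("dataset")
      .entityUrn(datasetUrn)
      .upsert()
      .aspect(aspect)
      .build();
  // Block on each write so failures surface immediately;
  // emit(...) and get() throw checked exceptions you would handle or declare.
  emitter.emit(mcpw, null).get();
}
```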
Would you like more detailed information or further examples?
Sources:
- https://datahubproject.io/docs/0.13.1/metadata-integration/java/as-a-library#usage
- https://datahubproject.io/docs/0.14.0/metadata-integration/java/as-a-library
Note: I wasn't able to find highly relevant data sources, but above are a few potentially relevant links.
<@U06TM7M40RK> I keep failing when I use this approach; every attempt reports "Transfer-Encoding and Content-Length". Please explain why.
I'm not able to find any data sources to answer your question. You might want to provide details about the specific error message or logs you are encountering. If this doesn't help, please reach out to the DataHub support team for further assistance.
<@U06TM7M40RK> I used the following code to upload all of the metadata fields to DataHub, but it fails every time with "Transfer-Encoding and Content-Length".
The code is as follows:
```java
RestEmitter emitter = RestEmitter.createWithDefaults();

// Construct the metadata change proposal
MetadataChangeProposalWrapper mcpw = MetadataChangeProposalWrapper.builder()
    .entityType("dataset")
    .entityUrn("urn:li:dataset:(urn:li:dataPlatform:bigquery,my-project.my-dataset.user-table,PROD)")
    .upsert()
    .aspect(new DatasetProperties().setDescription("This is the canonical User profile dataset"))
    .build();

// Emit the metadata
Future<MetadataWriteResponse> requestFuture = emitter.emit(mcpw, null);
```
I'm not able to find any data sources to answer your question. You might want to provide details about the server configuration or any custom headers being used. If this doesn't help, please reach out to the DataHub support team for further assistance.
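For what it's worth, "Transfer-Encoding and Content-Length" usually points at the HTTP layer rather than DataHub itself: HTTP/1.1 forbids a message from carrying both headers at once, and the Apache HttpClient underneath `RestEmitter` rejects requests that do. A common trigger is setting `Content-Length` or `Transfer-Encoding` manually, for example via `extraHeaders` or an intermediate proxy. A sketch of an emitter that passes only custom application headers and leaves message framing to the client; the header name and URL are placeholders:
```java
import java.util.Collections;
import datahub.client.rest.RestEmitter;

// Let the HTTP client compute message framing itself; never set
// Content-Length or Transfer-Encoding via extraHeaders.
RestEmitter emitter = RestEmitter.create(b -> b
    .server("http://localhost:8080")
    .extraHeaders(Collections.singletonMap("X-Custom-Header", "custom-val")));
```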
Hi! Did this interaction resolve your issue, or do you still require assistance?
Please tag @RunLLM if you still need further assistance!
Or tag @Datahub Community Support if you'd like a person to take a look.
Hope this helps! ><