Hello there, any tips on how to fix this?
Sink (datahub-rest) report:
{'total_records_written': 36241,
 'records_written_per_second': 3,
 'warnings': [],
 'failures': [{'error': 'Unable to emit metadata to DataHub GMS: java.lang.RuntimeException: java.util.concurrent.ExecutionException: '
                        'org.apache.kafka.common.errors.RecordTooLargeException: The message is 1296474 bytes when serialized which is larger than '
                        '1048576, which is the value of the max.request.size configuration.',
               'info': {'exceptionClass': 'com.linkedin.restli.server.RestLiServiceException',
                        'message': 'java.lang.RuntimeException: java.util.concurrent.ExecutionException: '
                                   'org.apache.kafka.common.errors.RecordTooLargeException: The message is 1296474 bytes when serialized which is '
                                   'larger than 1048576, which is the value of the max.request.size configuration.',
                        'status': 500,
                        'id': 'urn:li:dataset:(urn:li:dataPlatform:mssql,CIGAM.dbo.GFHISOPB,PROD)'}}],
 'start_time': '2023-10-12 23:46:47.724130 (2 hours, 59 minutes and 13.03 seconds ago)',
 'current_time': '2023-10-13 02:46:00.756905 (now)',
 'total_duration_in_seconds': 10753.03,
 'gms_version': 'null',
 'pending_requests': 0}
Pipeline finished with at least 1 failures; produced 36242 events in 2 hours, 59 minutes and 12.77 seconds.
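For context on the error itself: `RecordTooLargeException` means a single serialized metadata record (here 1296474 bytes, about 1.24 MiB) exceeded Kafka's default 1 MiB `max.request.size` on the producer inside GMS, so the limit has to be raised on the producer and on the topic/broker side. The sketch below shows the Kafka settings typically involved; the topic name `MetadataChangeProposal_v1` and the GMS environment variable are assumptions about a default DataHub deployment, so verify both against your own setup before applying.

```shell
# Sketch, assuming a default DataHub Kafka deployment -- verify names first.

# 1. Raise the limit on the GMS-side Kafka producer (the component that
#    threw RecordTooLargeException). In Docker/Helm deployments this is
#    usually passed as a Spring Kafka property via an environment variable
#    on the GMS container; the exact variable name below is an assumption,
#    so check your deployment's configuration reference:
#      SPRING_KAFKA_PROPERTIES_MAX_REQUEST_SIZE=5242880   # 5 MiB

# 2. Allow the topic (and broker) to accept larger messages.
#    kafka-configs.sh ships with the standard Kafka distribution;
#    max.message.bytes is the per-topic limit:
kafka-configs.sh --bootstrap-server localhost:9092 --alter \
  --entity-type topics --entity-name MetadataChangeProposal_v1 \
  --add-config max.message.bytes=5242880

# 3. If consumers (e.g. the MCP consumer) re-read these records, raise
#    their fetch limit as well:
#      max.partition.fetch.bytes=5242880
```

All three limits should be kept consistent: a producer allowed to send 5 MiB records will still fail if the topic caps messages at 1 MiB, and a consumer with a smaller fetch size will stall on the oversized record.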