Troubleshooting SSL Connection Issue with Python Kafka Client

Original Slack Thread

Hi All,

I am trying to connect to SSL-enabled Kafka from datahub-actions (executor). I reused the Java truststore that I had used earlier in GMS, but it is not working.
I'm getting the error below. Is the truststore format different for Python? I followed the link but it didn't help much.

```
KafkaException: KafkaError{code=_INVALID_ARG,val=-186,str="Java TrustStores are not supported, use `ssl.ca.location` and a certificate file instead. See https://github.com/confluentinc/librdkafka/wiki/Using-SSL-with-librdkafka for more information."}
```

Yes, for the Python client these certs/keys should not be in a truststore -> https://github.com/confluentinc/librdkafka/wiki/Using-SSL-with-librdkafka#create-standard-client-keys-for-librdkafka-etal
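
For context, here is a minimal sketch of what librdkafka (and therefore the Python `confluent-kafka` client that datahub-actions uses) expects: PEM files passed via `ssl.ca.location`, `ssl.certificate.location`, and `ssl.key.location` rather than a JKS truststore. The broker address, topic name, file paths, and password below are placeholders, not values from this thread.

```python
# Minimal sketch: confluent-kafka consumer over SSL using PEM files
# (broker, topic, paths, and password are placeholders - adjust for your setup).
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "kafka.example.com:9092",    # placeholder broker
    "group.id": "ssl-smoke-test",
    "security.protocol": "SSL",
    "ssl.ca.location": "/certs/ca.pem",                # CA bundle (PEM), not a JKS truststore
    "ssl.certificate.location": "/certs/client.pem",   # client certificate (PEM)
    "ssl.key.location": "/certs/client.key",           # private key matching that certificate
    "ssl.key.password": "changeit",                    # only needed if the key is encrypted
})

consumer.subscribe(["MetadataChangeLog_Versioned_v1"])  # placeholder topic
msg = consumer.poll(10.0)
if msg is None:
    print("no message received within the timeout")
elif msg.error():
    print(f"consumer error: {msg.error()}")
else:
    print(f"received message from {msg.topic()} [{msg.partition()}]")
consumer.close()
```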

Is it possible to extract/convert from the Java truststore?
The JKS I'm currently using for Kafka has 7 certificates inside it.
I wasn't able to convert all of them into a single file.

maybe this can help -> https://dev.to/adityakanekar/connecting-to-kafka-cluster-using-ssl-with-python-k2e
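
As a rough illustration of the conversion that article walks through, the sketch below shells out to `keytool` and `openssl` to turn a JKS truststore into a single PEM bundle containing all of its certificates. The file names and password are placeholders, not the ones from this thread.

```python
# Sketch: JKS truststore -> single PEM CA bundle, via keytool and openssl.
# File names and password are placeholders.
import subprocess

JKS = "kafka.truststore.jks"
P12 = "kafka.truststore.p12"
PEM = "combined_CARoot.pem"
PASSWORD = "changeit"

# 1. Convert the JKS to PKCS12 (keytool ships with the JDK).
subprocess.run([
    "keytool", "-importkeystore",
    "-srckeystore", JKS, "-srcstoretype", "JKS", "-srcstorepass", PASSWORD,
    "-destkeystore", P12, "-deststoretype", "PKCS12", "-deststorepass", PASSWORD,
], check=True)

# 2. Dump every trusted certificate from the PKCS12 into one PEM file.
subprocess.run([
    "openssl", "pkcs12", "-in", P12, "-out", PEM,
    "-nokeys", "-passin", f"pass:{PASSWORD}",
], check=True)
```

A client keystore (certificate plus private key) is handled the same way, except the second step becomes two `openssl pkcs12` calls: one with `-clcerts -nokeys` for the certificate and one with `-nocerts -nodes` for the private key.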

I tried the above approach and created the files as well, but I'm still getting an error:

```
KafkaException: KafkaError{code=_INVALID_ARG,val=-186,str="Failed to create consumer: ssl.key.location failed: error:05800074:x509 certificate routines::key values mismatch"}
```

executor.yaml file

```
source:
  type: "kafka"
  config:
    connection:
      bootstrap: xxxx.visa.com:9092
      schema_registry_url: http://xxxx.visa.com:8080/schema-registry/api/
      consumer_config:
        security.protocol: ${KAFKA_PROPERTIES_SECURITY_PROTOCOL:-SSL}
        ssl.ca.location: ${KAFKA_PROPERTIES_SSL_KEYSTORE_LOCATION:-../certs/combined_CARoot.pem}
        ssl.certificate.location: ${KAFKA_PROPERTIES_SSL_TRUSTSTORE_LOCATION:-../certs/combined_certificate.pem}
        ssl.key.location: ${KAFKA_PROPERTIES_SSL_TRUSTSTORE_LOCATION:-../certs/combined_key.pem}
        ssl.key.password: ${KAFKA_PROPERTIES_SSL_TRUSTSTORE_PASSWORD:-Password@123}
    topic_routes:
      mcl: ${METADATA_CHANGE_LOG_VERSIONED_TOPIC_NAME:-vdc-MetadataChangeLog_Versioned_v1}
filter:
  event_type: "MetadataChangeLogEvent_v1"
  event:
    entityType: "dataHubExecutionRequest"
    changeType: "UPSERT"
action:
  type: "executor"
datahub:
  server: ${DATAHUB_GMS_HOST}
  token: ${DATAHUB_GMS_TOKEN}
```
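
The `key values mismatch` error above is raised by OpenSSL when the certificate in `ssl.certificate.location` and the private key in `ssl.key.location` do not belong to each other. A quick way to check the generated files is to compare the public key embedded in the certificate with the one derived from the private key; below is a minimal sketch using the `cryptography` package and the PEM file names from the config above.

```python
# Sketch: verify that a PEM certificate and a PEM private key belong together
# by comparing their public keys (file names taken from the config above).
from cryptography import x509
from cryptography.hazmat.primitives import serialization

with open("combined_certificate.pem", "rb") as f:
    cert = x509.load_pem_x509_certificate(f.read())

with open("combined_key.pem", "rb") as f:
    # Pass the key password as bytes if the private key is encrypted.
    key = serialization.load_pem_private_key(f.read(), password=None)

cert_pub = cert.public_key().public_bytes(
    serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo
)
key_pub = key.public_key().public_bytes(
    serialization.Encoding.PEM, serialization.PublicFormat.SubjectPublicKeyInfo
)

print("certificate and key match" if cert_pub == key_pub
      else "MISMATCH: this key does not belong to this certificate")
```

If the script reports a mismatch, the two PEM files were likely extracted from different keystore entries.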

Hi Pankaj! If you’re still hitting this issue, I highly suggest joining us at the Community Marathon! We have a variety of troubleshooting & support sessions scheduled throughout the day where we can give live help :slight_smile: https://datahubspace.slack.com/archives/CUMV92XRQ/p1705433341623299