Troubleshooting Postgres Cloud SQL deployment and authentication errors

Original Slack Thread

I’m trying to use Postgres Cloud SQL as my DB (replacing the internal SQL instance). I had a lot of trouble with gcloud-sqlproxy, so I just bypassed it like in this thread: https://datahubspace.slack.com/archives/C029A3M079U/p1634804799009300. In my prerequisites values, I’ve disabled mysql, postgres, and gcloud-sqlproxy. In my datahub values, I filled in the SQL datasource values just like in the aforementioned thread. I also disabled both the mysql and postgres setup jobs. I am now getting an error during deployment in the datahub-datahub-system-update-job pod. Did I go about this the right way?
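For reference, the changes described above amount to something like the following values. This is only a sketch: the key names (mysql.enabled, postgresql.enabled, gcloud-sqlproxy.enabled, the setup-job toggles, and global.sql.datasource.*) are assumed from the DataHub Helm charts around v0.12.x and should be checked against your chart version, and the username and secret names are placeholders.

```yaml
# prerequisites values.yaml -- disable the bundled databases and the proxy
mysql:
  enabled: false
postgresql:
  enabled: false
gcloud-sqlproxy:
  enabled: false
```

```yaml
# datahub values.yaml -- point DataHub directly at the Cloud SQL Postgres instance
mysqlSetupJob:
  enabled: false
postgresqlSetupJob:
  enabled: false
global:
  sql:
    datasource:
      host: "x.x.x.x:5432"                # Cloud SQL instance IP
      hostForpostgresqlClient: "x.x.x.x"
      port: "5432"
      url: "jdbc:postgresql://x.x.x.x:5432/datahub"
      driver: "org.postgresql.Driver"
      username: "datahub"                 # placeholder user
      password:
        secretRef: postgres-secrets       # pre-created Kubernetes secret (placeholder name)
        secretKey: postgres-password
```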

Hey there! :wave: Make sure your message includes the following information if relevant, so we can help more effectively!

  1. Which DataHub version are you using? (e.g. 0.12.0)
  2. Please post any relevant error logs on the thread!

Using DataHub v0.12.1. Unfortunately, I don’t have access to the error logs, but more generally I want to know whether I followed the correct process.

I’ve tested with the postgres setup job enabled and that errors as well.

OK, I got access to the logs. I’m deploying with Helm and getting this error:

psql: error: connection to server at "x.x.x.x", port 5432 failed: FATAL: connection requires a valid client certificate

Sounds like you might be using certificate authentication? In that case you would have to pass the client certificate and key via the sslcert and sslkey parameters of the connection URL:
https://jdbc.postgresql.org/documentation/use/#connection-parameters
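Concretely, that means extending the datasource URL with the driver's SSL parameters, roughly like this (a sketch: the mount paths are placeholders, and sslmode/sslrootcert/sslcert/sslkey are the standard pgJDBC parameters documented at the link above):

```yaml
# datahub values.yaml -- certificate-authentication variant (sketch)
global:
  sql:
    datasource:
      # The referenced files must exist at these paths inside the container.
      url: "jdbc:postgresql://x.x.x.x:5432/datahub?sslmode=verify-ca&sslrootcert=/etc/ssl/cloudsql/server-ca.pem&sslcert=/etc/ssl/cloudsql/client-cert.pem&sslkey=/etc/ssl/cloudsql/client-key.pk8"
```

One common gotcha: pgJDBC generally expects sslkey to point at a PKCS-8 key (e.g. a .pk8 file converted with openssl pkcs8 -topk8) rather than the PEM key Cloud SQL hands out, so a plain PEM key may not be accepted.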

Tried that, but it was not picking them up for some reason.

We ended up just using the Cloud SQL Auth Proxy.

Yeah that’s probably easier. For cert auth, you would also have to mount the cert/key files inside the relevant containers.
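If you do go the cert-auth route, the mount would look roughly like the following for GMS. This is a sketch under a couple of assumptions: the cert, key, and CA have already been loaded into a Kubernetes secret (the name cloudsql-client-certs below is a placeholder), and your version of the datahub chart exposes extraVolumes/extraVolumeMounts for the component.

```yaml
# datahub values.yaml -- mount Cloud SQL client certs into GMS (sketch)
datahub-gms:
  extraVolumes:
    - name: cloudsql-certs
      secret:
        secretName: cloudsql-client-certs   # placeholder secret holding server-ca.pem, client-cert.pem, client-key.pk8
  extraVolumeMounts:
    - name: cloudsql-certs
      mountPath: /etc/ssl/cloudsql          # matches the paths used in the JDBC URL above
      readOnly: true
```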

Yeah, I’m almost certain it’s because our DB has very unusual restrictions/permissions.

We tried mounting the files and passing the file paths to sslcert, etc., but nothing was working.

Oh, well yeah, but glad to hear it’s working now.
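For anyone hitting the same wall: the setup that ended up working re-enables the gcloud-sqlproxy dependency in the prerequisites chart and points the datasource at the in-cluster proxy service instead of the instance IP. A rough sketch (the instance/project/region values and the service-account secret name are placeholders, and the field names come from the gcloud-sqlproxy chart bundled with prerequisites, so verify them against your chart version):

```yaml
# prerequisites values.yaml -- run the Cloud SQL Auth Proxy in-cluster (sketch)
gcloud-sqlproxy:
  enabled: true
  cloudsql:
    instances:
      - instance: my-cloudsql-instance   # placeholder Cloud SQL instance name
        project: my-gcp-project          # placeholder GCP project
        region: us-central1
        port: 5432
  # Proxy credentials: a pre-created secret with a service account key,
  # or Workload Identity, depending on the chart version.
  existingSecret: cloudsql-proxy-sa-key
```

With that in place, global.sql.datasource.host and url in the datahub values point at the proxy service (for example prerequisites-gcloud-sqlproxy:5432, depending on your release name) rather than the Cloud SQL IP, and no client certificates are needed on the DataHub side.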