
We just started integrating data into GCP BigQuery by adding a system environment variable called ‘GOOGLE_APPLICATION_CREDENTIALS’ and pointing it to the service account key file. I am anticipating a need to integrate with a different BigQuery instance and would like to know how to run parallel integrations to two different GCP accounts. Any thoughts? Thanks.
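For context, our current single-account setup looks roughly like this in Python (the key file path is just a placeholder):

```python
import os
from google.cloud import bigquery

# Equivalent to setting GOOGLE_APPLICATION_CREDENTIALS at the system level:
# the client library reads this variable and authenticates with that one key file.
os.environ["GOOGLE_APPLICATION_CREDENTIALS"] = "/path/to/service-account-key.json"  # placeholder path

# The client picks up the key file from the environment variable automatically.
client = bigquery.Client()
for row in client.query("SELECT 1 AS ok").result():
    print(row.ok)
```

Since the variable can only point at one key file at a time, this is what makes a second service-account integration tricky.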

Hi @suresh007

According to the answer in this post, a possible way to get the job done is to use an OAuth 2.0 web connection instead of JSON key files.

Thanks for the link! I guess we can only have one connection with a service account key file but many OAuth 2.0 connections. Our current data-sharing partner only wants to provide a service account key file, so future integration needs could pose problems.


Hi @suresh007!

Unfortunately, because of how the BigQueryConnector itself works, there isn't an easy way to run parallel integrations using multiple service account key files. When you use the environment variable as the credential source, the transformer looks for that specific variable, so you'd have to go in and change the path it points to yourself, or possibly run a Python script as shown here:


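I won't reproduce the linked script here, but as a rough sketch of the general idea in Python, you could load each key file explicitly instead of relying on the environment variable at all (the paths below are placeholders):

```python
from google.cloud import bigquery
from google.oauth2 import service_account

# Load each service account key explicitly, bypassing GOOGLE_APPLICATION_CREDENTIALS.
creds_a = service_account.Credentials.from_service_account_file(
    "/path/to/partner-a-key.json"  # placeholder path
)
creds_b = service_account.Credentials.from_service_account_file(
    "/path/to/partner-b-key.json"  # placeholder path
)

# Each client is bound to its own credentials/project, so both can run side by side.
client_a = bigquery.Client(credentials=creds_a, project=creds_a.project_id)
client_b = bigquery.Client(credentials=creds_b, project=creds_b.project_id)

print(client_a.project, client_b.project)
```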
So OAuth 2.0, as @pepato suggested, is probably the best way to go. OAuth 2.0 is more flexible because it lets you easily swap between Cloud Console projects, unlike service account key files, which are tied to a single project. You can also create separate web service connections for different BigQuery accounts.
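Just as an illustration of what happens outside the connector, an OAuth 2.0 user flow in Python looks roughly like this (the client secret file and project ID are placeholders):

```python
from google_auth_oauthlib.flow import InstalledAppFlow
from google.cloud import bigquery

# Run the OAuth 2.0 consent flow once for the Google account you want to connect.
flow = InstalledAppFlow.from_client_secrets_file(
    "client_secret.json",  # placeholder: OAuth client downloaded from the Cloud Console
    scopes=["https://www.googleapis.com/auth/bigquery"],
)
credentials = flow.run_local_server(port=0)

# The same user credentials can then be pointed at whichever project you need.
client = bigquery.Client(credentials=credentials, project="my-other-project")  # placeholder project ID
```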

If you’re interested in using OAuth, here is a great article to get started:

Hope this helps! 🙂