Added on May 24, 2024 8:40AM
Likes: 0
Replies: 2
Hi there,
I am attempting to create a connection between BigQuery and Dataiku so that the dataset used in my flow can refresh itself automatically on a schedule. Right now I have to download and re-upload the BigQuery table myself every time, which is quite inconvenient.
Do I need to do any setup in GCP BigQuery before I configure the connection in Dataiku? I'm also confused about what to enter in each field. I entered the information below, but I get an error when I click the "Test" button:
Project ID: the project ID of the BigQuery project I am trying to connect to
Client ID: my GCP account
Client secret: my GCP account's password
Scope: the default scope
Others: blank
Error message: User 'xxxxxxxxxxxx' does not have credentials for connection 'xxxxxxxxxxxxxx'.
I guess that I should first obtain OAuth 2.0 credentials from the Google API Console, is that right?
Are there any other settings I should configure?
Thanks a lot,
Rakuwa
Operating system used: Dataiku Cloud
The easiest way to set up a BigQuery connection is to create a service account in GCP and then create a service account key for that account. Finally, copy the service account key to any location you want on your DSS server (it should be readable by the user that runs the DSS services) and set up the connection in Dataiku.
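If you want to sanity-check the service account key before wiring it into the Dataiku connection, a minimal sketch along these lines can help (assuming the google-cloud-bigquery Python package is installed; the key file path is a placeholder for wherever you saved the downloaded JSON key):

```python
# Minimal sketch: verify that a GCP service account key can query BigQuery
# before configuring it in the Dataiku connection.
from google.cloud import bigquery
from google.oauth2 import service_account

# Hypothetical path to the service account key JSON you downloaded from GCP.
KEY_PATH = "/path/to/service-account-key.json"

# Load credentials from the key file with the BigQuery scope.
credentials = service_account.Credentials.from_service_account_file(
    KEY_PATH,
    scopes=["https://www.googleapis.com/auth/bigquery"],
)

# Create a BigQuery client bound to the service account's project.
client = bigquery.Client(credentials=credentials, project=credentials.project_id)

# Run a trivial query to confirm the key, project, and permissions are valid.
rows = client.query("SELECT 1 AS ok").result()
print(list(rows))
```

If this query succeeds, the same key file should work for the Dataiku connection; the usual failure modes (wrong project ID, missing BigQuery roles on the service account) will show up here as explicit errors.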
Hi Turribeach,
Thanks for your reply. I will try it.