Quite often my users create API services with multiple endpoints. Currently they just hardcode the API service's API key in code using the following code snippet:
import os
from dataikuapi import APINodeClient

uri = "http://localhost:" + os.environ["DKU_APIMAIN_PORT"]
# Non-production key
# client = APINodeClient(uri, "API_Service_Name", "mynonprodkey")
# Production key
client = APINodeClient(uri, "API_Service_Name", "myprodkey")
They then use "client" to invoke other endpoints of the same API service, which are often SQL queries or other Python endpoints. We in IT then have to make sure the right key is commented/uncommented before deploying the API service to each infrastructure. The question is twofold:
1. Is it really necessary to use a key if my endpoint needs to call another endpoint of the same API service? If not, please let me know the code snippet I should use.
2. Is there a way to store the API key securely if I need to call an endpoint of another Dataiku user API service?
There is a specific way to handle such use cases: the "dispatcher pattern" described in our reference doc. With it, you no longer need to handle API keys explicitly.
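A minimal sketch of the idea: instead of sibling endpoints calling each other over HTTP with an API key, a single exposed Python endpoint routes each request to internal handler functions. The handler names, the `target`/`payload` parameters, and the payload shape below are illustrative assumptions, not the exact code from the reference doc.

```python
def run_sql_lookup(payload):
    # Placeholder for what would otherwise be the SQL-query endpoint's logic
    return {"result": "sql", "echo": payload}

def run_python_scoring(payload):
    # Placeholder for what would otherwise be another Python endpoint's logic
    return {"result": "python", "echo": payload}

# Registry mapping a routing key to its internal handler
HANDLERS = {
    "sql": run_sql_lookup,
    "python": run_python_scoring,
}

def api_py_function(target, payload):
    """Single exposed endpoint: dispatch in-process on 'target' instead of
    calling sibling endpoints over HTTP with a hardcoded API key."""
    try:
        return HANDLERS[target](payload)
    except KeyError:
        raise ValueError("Unknown target endpoint: %r" % target)
```

Because the dispatch happens in-process within one API service, no APINodeClient and no key are needed for endpoint-to-endpoint calls.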
That is great, thank you!
I'm wondering if there are any good patterns for handling API keys when the dispatcher endpoint needs to call an endpoint in another API service. At the moment, our data scientists are hardcoding the keys in their code for this use case too. :(
Each API service has its own set of API keys, so there is unfortunately no way to have consolidated "cross-service" authentication.
If possible, you should gather all endpoints that are meant to work together under the same API service. In addition to solving the API key hardcoding issue, this also simplifies deployment and maintenance: you only have to deploy one artefact containing all endpoints, which prevents the version mismatches that can occur in production with multiple "linked" API services.
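For the cases where a cross-service call with a key is unavoidable, one common practice (standard environment-variable configuration, not a Dataiku-specific feature) is to keep the key out of the code entirely and read it from a variable that is set differently on each infrastructure, so nothing needs commenting or uncommenting at deployment time. The variable-naming convention below is an assumption for illustration.

```python
import os

def get_api_key(service_name):
    """Look up the API key for a given service from the environment.

    Assumed convention: each infrastructure sets a variable named
    API_KEY_<SERVICE_NAME> (e.g. API_KEY_OTHER_SERVICE) with the key
    valid for that environment.
    """
    var = "API_KEY_" + service_name.upper()
    key = os.environ.get(var)
    if not key:
        raise RuntimeError("API key not found; please set " + var)
    return key

# Usage with the client from the snippet above (commented out here
# because it requires a running API node):
# client = APINodeClient(uri, "Other_Service", get_api_key("Other_Service"))
```

This way the same code artefact can be promoted from non-production to production unchanged, with the key supplied by each environment's configuration.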