Issue with get_connection in Python Endpoint for Snowflake – API Node

sandeeppradeep
Registered Posts: 2 ✭✭

Hi all,

I'm currently working on a Python endpoint in Dataiku API Node, where I need to dynamically generate and execute SQL queries against a Snowflake connection. These SQL queries are generated via OpenAI, so I cannot use a predefined SQL endpoint (as those require fixed queries).

I followed the official Dataiku documentation for SQL query endpoints, and attempted to retrieve the connection using the get_connection function. However, I’m running into the following error:

"error": "No connection adapters were found for 'http://<IP_ADDRESS>:<PORT>/dip/publicapi/connections/<CONNECTION_NAME>/info'"
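
For context, here is roughly what I'm attempting in the endpoint code (the host, API key, and connection name are placeholders):

    import dataikuapi

    # Sketch of the call pattern; all values below are placeholders
    client = dataikuapi.DSSClient("http://<IP_ADDRESS>:<PORT>", "<API_KEY>")
    connection = client.get_connection("<CONNECTION_NAME>")
    info = connection.get_info()  # this is the call that hits /dip/publicapi/connections/<name>/info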

Here's what I’ve tried:

  • Verified the API Node is up and the Python endpoint is running correctly.
  • Double-checked that the connection works fine within DSS.
  • Ensured that the connection name matches exactly between DSS and the API node config.

Key Constraints

  • I can’t use a SQL endpoint since the query needs to be dynamically generated based on user input.
  • I need to use dataiku.core.sql_query or a similar approach in the Python backend of the API Node (see the sketch after this list for the pattern I mean).
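
For reference, this is the kind of pattern I mean. It works on a Design node (the connection name below is a placeholder), and it's this behavior I'm trying to reproduce in the API node's Python backend:

    from dataiku import SQLExecutor2

    # Sketch of the Design-node pattern; "<CONNECTION_NAME>" is a placeholder
    executor = SQLExecutor2(connection="<CONNECTION_NAME>")
    df = executor.query_to_df("SELECT * FROM SOME_TABLE LIMIT 10")  # in my case the query is generated dynamically

My questions: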

  1. Is get_connection supported in API Node Python endpoints?
  2. If not, is there an alternative method to execute dynamic SQL queries using a configured connection on the API Node?
  3. Are there specific configuration requirements on the API Node (e.g., plugins, proxies, or internal routing) that might be missing?

Any help or suggestions would be greatly appreciated!

Thanks in advance!
Sandeep

Answers

  • Turribeach
    Dataiku DSS Core Designer, Neuron, Dataiku DSS Adv Designer, Registered, Neuron 2023 Posts: 2,575 Neuron
    edited August 6

    The way the API node stores connections is different from the Automation and Design nodes. Connections on the API node exist to allow enrichment of prediction queries. What you are trying to do is completely different and not a pattern that seems suitable for the API node. There are many things to consider here, but two that stand out for me: APIs should respond relatively quickly, which a lot of SQL queries won't, and the data volumes in the response should be smallish, which, depending on the SQL you execute, might not be the case.

    Having said that, nothing stops you from opening a custom SQL connection inside your Python function in your API service. However, if you do it this way, it seems a bit pointless to use a Dataiku API service just to run custom SQL over a custom connection; what would you gain by doing that? You might as well build your own ad hoc SQL microservice in a well-supported framework that you can manage and deploy outside Dataiku. You will probably get a more robust solution.
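
    If you do go down that route, here is a rough sketch of the idea, assuming the snowflake-connector-python package is available in your API service's code environment. All credentials and object names are placeholders and should come from environment variables or a secrets store, never hardcoded:

        import os
        import snowflake.connector

        def run_dynamic_query(sql):
            # Placeholder credentials, read from the service environment
            ctx = snowflake.connector.connect(
                user=os.environ["SNOWFLAKE_USER"],
                password=os.environ["SNOWFLAKE_PASSWORD"],
                account=os.environ["SNOWFLAKE_ACCOUNT"],
                warehouse="<WAREHOUSE>",
                database="<DATABASE>",
                schema="<SCHEMA>",
            )
            try:
                cur = ctx.cursor()
                cur.execute(sql)
                return cur.fetchall()
            finally:
                ctx.close()

    And since the SQL here is LLM-generated, you would also want to validate or restrict the statements, set query timeouts and cap result sizes, which ties back to the response-time and volume concerns above.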

  • JordanG
    Dataiker, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered Posts: 9 Dataiker

    @sandeeppradeep are you attempting to use OpenAI to perform text-2-sql transformations? If so, I would recommend other ways of doing this. For example, you can use the AI SQL Assistant inside a SQL Notebook, or, if you're looking to provide a chat interface for this, you can leverage Dataiku Answers with dataset retrieval.

  • JordanG
    Dataiker, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered Posts: 9 Dataiker

    @sandeeppradeep check out this blog I wrote on doing text-2-sql inside of Dataiku.
