Hi @Prakhyath ,
Where is your Python recipe executed? It can run in several places:
1. On the DSS server (local execution). You can see this in the Advanced tab of the recipe, and the activity logs will show "Running a DSS Python recipe locally".
2. On Kubernetes (K8s). This is also reflected in the Advanced tab of the recipe and in the logs.
3. On Snowflake via Snowpark: https://doc.dataiku.com/dss/latest/connecting/sql/snowflake.html#snowpark-integration, but this requires a specific setup.
Most likely it will be 1 or 2. In that case, storage and queries will be billed in Snowflake, while the actual Python compute will be done by your DSS instance or K8s cluster.
In terms of storage, you can check the Status tab of the input/output datasets and calculate Snowflake storage costs for those tables.
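As a rough sketch, the table sizes shown in the Status tab can be turned into a monthly storage estimate. The $23/TB-month rate below is an assumption based on typical on-demand pricing; capacity contracts are usually cheaper, so check your own Snowflake agreement.

```python
# Rough monthly storage cost estimate for Snowflake tables.
# ASSUMPTION: on-demand pricing of ~$23 per compressed TB per month;
# your contracted rate may differ.
PRICE_PER_TB_MONTH = 23.0

def monthly_storage_cost(table_bytes: dict) -> float:
    """table_bytes maps table name -> compressed size in bytes
    (the size shown in the dataset's Status tab)."""
    total_tb = sum(table_bytes.values()) / 1024**4
    return total_tb * PRICE_PER_TB_MONTH

# Example: two tables totalling 1.5 TB compressed.
sizes = {"MY_INPUT": 1024**4, "MY_OUTPUT": 512 * 1024**3}
print(round(monthly_storage_cost(sizes), 2))  # 34.5
```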
The actual query cost on the Snowflake side is harder to determine: if multiple queries run simultaneously on the same warehouse, the cost is still reported only per hour for the whole warehouse. You can ask Snowflake support whether they can provide additional granularity.
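One common approximation (not an official Snowflake breakdown) is to pull hourly credits from SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_METERING_HISTORY and the per-query elapsed times from SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY, then split each hour's credits across the queries weighted by runtime. A minimal sketch of the apportioning step, assuming you have already fetched those two numbers:

```python
# Approximate per-query cost attribution: split a warehouse's hourly
# credits across the queries that ran in that hour, weighted by each
# query's elapsed time. This is only an estimate -- Snowflake bills
# credits per warehouse, not per query.
def apportion_credits(hour_credits: float, query_ms: dict) -> dict:
    """query_ms maps a query label -> elapsed milliseconds
    (e.g. TOTAL_ELAPSED_TIME from QUERY_HISTORY)."""
    total_ms = sum(query_ms.values())
    if total_ms == 0:
        return {q: 0.0 for q in query_ms}
    return {q: hour_credits * ms / total_ms for q, ms in query_ms.items()}

# Example: 2 credits consumed in an hour, split across three queries.
runtimes = {"recipe_a": 30_000, "recipe_b": 60_000, "recipe_c": 30_000}
print(apportion_credits(2.0, runtimes))
# {'recipe_a': 0.5, 'recipe_b': 1.0, 'recipe_c': 0.5}
```

Queries that overlap hour boundaries or run concurrently make this a rough guide only, which is why asking Snowflake support about finer-grained reporting is still worthwhile.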