We are using custom Python code recipes for a few transformations in our assignment. The input and output datasets are in Snowflake. Since we don't have an option to choose the engine that runs a code recipe, we need information on utilization. A few questions:
* Does the code recipe use the Spark engine?
* Since the input and output are in Snowflake, does code recipe execution add to Snowflake compute, and are we billed for that?
* How do we calculate the utilization (storage and compute) for the code recipe?
Most likely the recipe will run on the DSS engine itself or in a Kubernetes cluster; in that case, storage and queries will be billed in Snowflake, while the actual compute is done by your DSS instance/K8s cluster.
In terms of storage, you can check the Status tab of the input/output datasets and estimate the Snowflake storage cost for those tables from their size.
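As a rough illustration of that storage estimate, the sketch below converts a table's size (as shown in the dataset's Status tab) into a monthly dollar figure. The rate used here is an assumption for illustration only; check your actual Snowflake contract pricing.

```python
def monthly_storage_cost(active_bytes: int, usd_per_tb_month: float = 23.0) -> float:
    """Estimate monthly Snowflake storage cost for one table.

    active_bytes: table size, e.g. as reported in the dataset's Status tab.
    usd_per_tb_month: illustrative on-demand rate (assumed, not authoritative);
    substitute the rate from your own Snowflake agreement.
    """
    terabytes = active_bytes / (1024 ** 4)  # bytes -> TB
    return terabytes * usd_per_tb_month

# Example: a 500 GB output dataset at the assumed rate
cost = monthly_storage_cost(500 * 1024 ** 3)
```

Summing this over the recipe's input and output tables gives a ballpark for the storage side of the utilization question.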
The actual query cost on the Snowflake side is harder to determine: if multiple queries run simultaneously on a warehouse, the cost is still reported per hour for the whole warehouse. You can speak with Snowflake support to see if they can provide some additional granularity.
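In the absence of per-query billing, one common workaround is to apportion an hour's warehouse credits across the queries that ran in that hour, weighted by execution time. This is a rough attribution heuristic under stated assumptions, not an official Snowflake cost figure:

```python
def apportion_credits(credits_for_hour: float, query_seconds: dict) -> dict:
    """Split one hour's warehouse credits across queries, weighted by
    each query's execution time. Heuristic only: Snowflake bills the
    warehouse as a whole, so this is an attribution, not a real invoice line.
    """
    total = sum(query_seconds.values())
    if total == 0:
        return {q: 0.0 for q in query_seconds}
    return {q: credits_for_hour * s / total for q, s in query_seconds.items()}

# Example: the recipe's query ran 300 s, another workload 100 s,
# and the warehouse consumed 4.0 credits that hour.
shares = apportion_credits(4.0, {"recipe_query": 300, "other_query": 100})
# -> {"recipe_query": 3.0, "other_query": 1.0}
```

The per-query durations could come from Snowflake's query history; multiplying each share by your per-credit price then gives an approximate compute cost for the recipe.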