We have created resources to help you find the assistance you need. Here you can find a description of the many sources of support open to you, including: * The Community - you are already in a great place for help! * DSS Global Search - find answers right within Dataiku DSS itself * Your Internal Community -…
I'm hoping to use the VARCHAR2 column storage type in Oracle rather than NVARCHAR2, which appears to be the default storage type Dataiku uses for strings in Dataiku-managed datasets stored in Oracle. What's the best way to do this?
The https://doc.dataiku.com/dss/latest/operations/monitoring.html#concepts documentation describes how to push DSS metrics to a Graphite server. However, is it possible to push metrics to an existing Prometheus environment (already in use as a central monitoring solution) instead of having to deploy a specific graphite…
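One commonly used bridge between the two systems is Prometheus's graphite_exporter, which accepts Graphite's plaintext protocol and re-exposes the metrics for Prometheus to scrape. As a minimal sketch of the translation that bridge performs (the metric name and values below are illustrative, not actual DSS metrics):

```python
import re

def graphite_to_prom(line):
    """Parse one line of Graphite's plaintext protocol
    ("<metric.path> <value> <unix_timestamp>") and return a
    Prometheus-compatible (name, value, timestamp) tuple."""
    path, value, ts = line.strip().split()
    # Prometheus metric names only allow [a-zA-Z0-9_:], while
    # Graphite paths are dot-separated; map everything else to "_".
    name = re.sub(r"[^a-zA-Z0-9_:]", "_", path)
    return name, float(value), int(ts)

print(graphite_to_prom("dss.backend.heapUsed 1024 1700000000"))
# → ('dss_backend_heapUsed', 1024.0, 1700000000)
```

In practice one would point DSS's Graphite reporting at the graphite_exporter's listening port rather than at a Carbon daemon, and let the exporter (with a mapping configuration) handle this renaming.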
I would like to know how to configure K8s-related settings in the infrastructure of the Dataiku Deployer and make the ingress controller (NGINX) option visible in the service exposition settings. I know the ingress controller must be set up in K8s first, and that has been done. However, I would like to know how to make the Ingress…
When I try to use Spark on Dataiku Cloud launched in a GCP environment, I get an error. The documentation says Spark is available by default on Dataiku Cloud, but when I run a PySpark recipe I get the error shown in the attached image. Do I need to configure something? Operating system used: Windows
Hi, I created a yfinance Python environment to get hands-on. On executing the code recipe I get the error: "Error in python process. Class module not found…" How do I switch my code recipe (or a specific project) to use my environment (yfinance)? Thanks for your time. Operating system used: Windows
How can we use an AWS EFS file share for Dataiku APIs deployed to Kubernetes? We can configure the CSI driver and storage class, then create a PVC on the Kubernetes cluster. But how can we use it as a mount in deployments that get deployed through the DSS API Deployer?
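The PVC half of this is standard Kubernetes; the open question is getting the deployer-generated pods to reference it. As a sketch, here are the PVC and the volume/volumeMount fragments the pod spec would need to carry, expressed as plain manifest dictionaries (the storage class `efs-sc`, claim name `dss-api-efs`, and mount path are illustrative assumptions, not DSS-generated names):

```python
# Hypothetical PersistentVolumeClaim backed by the EFS CSI storage class.
pvc = {
    "apiVersion": "v1",
    "kind": "PersistentVolumeClaim",
    "metadata": {"name": "dss-api-efs"},
    "spec": {
        "accessModes": ["ReadWriteMany"],  # EFS supports RWX across pods
        "storageClassName": "efs-sc",      # assumed EFS CSI storage class
        "resources": {"requests": {"storage": "5Gi"}},
    },
}

# Fragments the API deployment's pod template would need to merge in:
volume = {
    "name": "efs-share",
    "persistentVolumeClaim": {"claimName": "dss-api-efs"},
}
volume_mount = {"name": "efs-share", "mountPath": "/data/shared"}
```

The practical question for the deployer is where such fragments can be injected into the generated Deployment objects; absent a supported setting, some teams patch the live Deployment or use an admission webhook to add the volume.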
Hello Dataiku Community, in my Dataiku console, under Administration - Code Envs, I can see that several code envs are defined in General Env Info with the mode AUTOMATION_SINGLE and several with the mode AUTOMATION_VERSIONED. I can't find any explanation in the Dataiku documentation…
Hello My environment uses two automation nodes, so I'm using multinode deployment. I'm trying to use a deployment hook, but I get a message saying the required automation_client value was not passed to the execute function of the deployment hook. I tested the same deployment hook on a single node, and the issue persists.…
I integrated DSS Desktop as admin (Free edition) with SQL, but when running the SQL recipe I face the issue below. I need guidance to fix it. Thanks in advance.