-
Getting Help - Start Here!
We have created resources to help you find the assistance you need. Here you can find a description of the many sources of support open to you, including: * The Community - You are in a great place for help as it is! * DSS Global Search - find answers right within Dataiku DSS itself * Your Internal Community -…
-
How to Change Oracle Data Storage Type
I'm hoping to use the VARCHAR2 column storage type in Oracle rather than NVARCHAR2, which appears to be the default storage type used by Dataiku for strings in Dataiku-managed datasets stored in Oracle. What's the best way to do this?
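If DSS does not expose a per-column Oracle type override in your version, one workaround is to pre-create the table with VARCHAR2 columns and point the Dataiku dataset at the existing table instead of letting DSS create it. A minimal sketch with the python-oracledb driver; the connection details, table, and columns are hypothetical placeholders:

```python
# Sketch: pre-create the Oracle table with VARCHAR2 columns, then have the
# Dataiku dataset use this existing table rather than a DSS-created one.
# Connection details, table name, and columns are hypothetical placeholders.
import oracledb

conn = oracledb.connect(user="dss_user", password="***", dsn="dbhost:1521/ORCLPDB1")
with conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE my_managed_dataset (
            customer_id   NUMBER,
            customer_name VARCHAR2(200)  -- instead of the NVARCHAR2 default
        )
    """)
conn.close()
```

The dataset's schema can then be read back from the existing table in its settings, assuming your DSS version supports managing an externally created table this way.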
-
Can Dataiku connect to Azure Data Studio?
-
Use prometheus instead of graphite for monitoring
The documentation at https://doc.dataiku.com/dss/latest/operations/monitoring.html#concepts describes how to push DSS metrics to a Graphite server. However, is it possible to push metrics to an existing Prometheus environment (already in use as a central monitoring solution) instead of having to deploy a specific graphite…
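One bridge worth considering (not a DSS feature, but a standard Prometheus-community tool): graphite_exporter accepts the Graphite plaintext protocol and re-exposes the received metrics for Prometheus to scrape, so DSS's existing Graphite settings can simply point at the exporter instead of a real Graphite server. A quick Python sanity check of that path, assuming a local exporter on its default ports (9109 for Graphite ingest, 9108 for scraping):

```python
# Sketch: send one metric in Graphite plaintext format ("name value timestamp")
# to a running graphite_exporter, which re-exposes it for Prometheus to scrape.
# Host and ports assume a local exporter with default settings.
import socket
import time

line = f"dss.test.heartbeat 1 {int(time.time())}\n"
with socket.create_connection(("localhost", 9109)) as sock:
    sock.sendall(line.encode("ascii"))
# The metric should then appear on http://localhost:9108/metrics
```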
-
About deployer infrastructure setting
I would like to know how to configure the K8s-related settings in the Dataiku Deployer infrastructure so that the ingress controller (NGINX) option becomes visible in the service exposition. I understand, of course, that the ingress controller itself must already be set up in K8s. However, I would like to know how to make the Ingress…
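For reference while this is unanswered, this is roughly the Ingress object that the NGINX exposition would need to end up creating, sketched with the official kubernetes Python client; the namespace, host, and service name/port are hypothetical placeholders:

```python
# Sketch: an NGINX Ingress routing to a service exposed on the cluster.
# Namespace, host, and service name/port are hypothetical placeholders.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() inside the cluster

ingress = client.V1Ingress(
    metadata=client.V1ObjectMeta(name="dss-api-ingress"),
    spec=client.V1IngressSpec(
        ingress_class_name="nginx",
        rules=[client.V1IngressRule(
            host="api.example.com",
            http=client.V1HTTPIngressRuleValue(paths=[client.V1HTTPIngressPath(
                path="/",
                path_type="Prefix",
                backend=client.V1IngressBackend(
                    service=client.V1IngressServiceBackend(
                        name="my-api-service",
                        port=client.V1ServiceBackendPort(number=12000),
                    )
                ),
            )]),
        )],
    ),
)
client.NetworkingV1Api().create_namespaced_ingress(namespace="dss", body=ingress)
```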
-
How to use Spark on Dataiku Cloud
When I try to use Spark on a Dataiku Cloud instance launched in a GCP environment, I get an error. The documentation says that Spark is available by default on Dataiku Cloud, but when I run a PySpark recipe I get the error shown in the attached image. Do I need to configure something? Operating system used: Windows
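For whoever picks this up, a minimal PySpark recipe that isolates whether Spark starts at all, modeled on the standard DSS Spark integration as far as I know it; the dataset names are placeholders:

```python
# Minimal PySpark recipe to check that Spark starts at all in DSS.
# Dataset names are placeholders for the recipe's actual inputs/outputs.
import dataiku
from dataiku import spark as dkuspark
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext.getOrCreate()
sqlContext = SQLContext(sc)

input_ds = dataiku.Dataset("my_input")
df = dkuspark.get_dataframe(sqlContext, input_ds)
print(df.count())  # forces an actual Spark job to run

output_ds = dataiku.Dataset("my_output")
dkuspark.write_with_schema(output_ds, df)
```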
-
Steps to switch to a created code environment
Hi, I created a yfinance Python environment to get hands-on. On executing the code recipe I get the error: "Error in Python process. Class module not found…" How do I switch to my environment (yfinance) for my code recipe, or for a specific project? Thanks for your time. Operating system used: Windows
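Two places usually control this: the recipe's Advanced tab (per-recipe env) and the project's settings (project-wide default). The project default can also be scripted through the public API; a sketch, where the host, API key, project key, and especially the "codeEnvs" settings path are my assumptions to verify against your DSS version:

```python
# Sketch: set a project's default Python code env via the public API.
# Host, API key, project key, and the exact settings path are assumptions;
# inspect project.get_settings().settings on your instance before relying on it.
import dataikuapi

client = dataikuapi.DSSClient("https://dss.example.com", "YOUR_API_KEY")
project = client.get_project("MY_PROJECT")

settings = project.get_settings()
settings.settings["settings"]["codeEnvs"]["python"] = {
    "mode": "EXPLICIT_ENV",  # use a specific env instead of the instance default
    "envName": "yfinance",
}
settings.save()
```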
-
How can we use an AWS EFS file share for Dataiku APIs deployed to Kubernetes?
How can we use an AWS EFS file share for Dataiku APIs deployed to Kubernetes? We can configure the CSI driver and storage class, then create a PVC on the Kubernetes cluster. But how can we use it as a mount in the deployments that get deployed through the DSS API Deployer?
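The PVC half of this is plain Kubernetes; the open question is how the Deployer-generated Deployment picks it up (any manual patch risks being overwritten on redeploy). A sketch of the PVC creation with the kubernetes Python client, assuming an EFS-backed storage class named "efs-sc"; the names and namespace are placeholders:

```python
# Sketch: create a PVC bound to an EFS-backed storage class.
# Storage class name, namespace, and PVC name are placeholders.
from kubernetes import client, config

config.load_kube_config()

pvc = client.V1PersistentVolumeClaim(
    metadata=client.V1ObjectMeta(name="dss-api-shared"),
    spec=client.V1PersistentVolumeClaimSpec(
        access_modes=["ReadWriteMany"],      # EFS supports RWX across pods
        storage_class_name="efs-sc",
        resources=client.V1ResourceRequirements(
            requests={"storage": "5Gi"}      # required field; EFS itself is elastic
        ),
    ),
)
client.CoreV1Api().create_namespaced_persistent_volume_claim(
    namespace="dss-apis", body=pvc
)
```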
-
Code Envs - What is the difference between AUTOMATION_SINGLE and AUTOMATION_VERSIONED?
Hello Dataiku Community, in my Dataiku console under Administration > Code Envs, I can see that several code envs are defined in General Env Info with the mode AUTOMATION_SINGLE and several with the mode AUTOMATION_VERSIONED. I can't find any explanation in the Dataiku documentation…
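If it helps while waiting for an authoritative answer: as far as I understand the automation-node docs, AUTOMATION_SINGLE means the code env exists in a single version on the node, while AUTOMATION_VERSIONED keeps one version of the env per imported project bundle, so each bundle runs with the package versions it was built against. The modes can be listed with the public API; a sketch where the host and API key are placeholders and the "deploymentMode" field name is my assumption:

```python
# Sketch: list code envs and their deployment mode via the public API.
# Host and API key are placeholders; "deploymentMode" is the field name I
# believe list_code_envs() returns, but verify on your DSS version.
import dataikuapi

client = dataikuapi.DSSClient("https://automation-node.example.com", "YOUR_API_KEY")
for env in client.list_code_envs():
    print(env["envLang"], env["envName"], env.get("deploymentMode"))
```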
-
Using a deployment hook on multinode (2 automation nodes)
Hello, my environment uses two automation nodes, so I'm using multinode deployment. I'm trying to use a deployment hook, but I get a message saying the required automation_client value was not passed to the execute function of the deployment hook. I tested the same deployment hook on a single node, and the issue persists.…
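A defensive skeleton for such a hook, with the execute signature inferred from the error message quoted above (the exact arguments DSS passes, and whether they differ on multinode, is precisely the open question; treat everything here as a guess):

```python
# Sketch of a deployment hook that tolerates automation_client being absent.
# The execute() signature is inferred from the error message; verify it against
# the deployment-hook documentation for your DSS version.
def execute(automation_client=None, **kwargs):
    if automation_client is None:
        # Reportedly not passed in on the multinode setup; fail with a clear
        # message instead of an opaque downstream error.
        raise RuntimeError("automation_client was not provided to the hook")
    # Hypothetical post-deployment check using the automation node's client:
    for key in automation_client.list_project_keys():
        print("Project on automation node:", key)
```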
-
I integrated DSS Desktop as admin with SQL, but running the SQL recipe fails with the issue below.
I integrated DSS Desktop as admin (Free edition) with SQL, but when running the SQL recipe I'm facing the issue below. I need guidance to fix it. Thanks in advance.