-
delete vs drop
Hi, I am currently working on a project where I have to collect new data that keeps the same schema as the previous dataset, but without keeping the past data. How can I do this? At first I thought the delete option would work for me, but in the end it didn't do what I was looking for. Thanks in advance Operating…
-
Import txt file in dataset
Hello, I have a fixed-width txt file. When I import the file into a dataset, Dataiku does not respect the position of the strings, so when I extract the data from the selected row in col_0 the positions change. For example, if in the original file the amount is a substring at position 9 for 9 characters, after I import the file this position…
-
Calculate a single metric/check via the Dataiku API
Hi, It is currently not possible to calculate a single metric or check via the Dataiku API, while this is possible via the GUI. The following APIs exist: dataset.compute_metrics() dataset.run_checks() but they will calculate all enabled metrics/checks, which may take a lot of time. So this idea is to provide an API to allow…
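For reference, a minimal sketch of the current behaviour described above using the public Python API; the project key and dataset name are hypothetical:

```python
import dataiku

client = dataiku.api_client()
dataset = client.get_project("MYPROJECT").get_dataset("my_dataset")

# These compute *all* enabled metrics/checks on the dataset, which is
# exactly the limitation this idea asks to make more granular.
dataset.compute_metrics()
dataset.run_checks()
```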
-
Copying / saving Checks and metrics
Hello, Is there any way to copy checks and metrics from one dataset to another? Is it possible to save a custom code check as a plugin? Operating system used: CentOS 7
-
Edit default metrics and checks as a project-wide setting
When creating a new dataset, I practically always edit the default metrics and checks to run row counts after build. Ideally, I could define this from the project settings so that every new dataset created automatically has my desired metrics and checks configured. Of course, this doesn't apply to column-specific values,…
-
Enable/disable scenario step
Hi, While debugging a scenario you may disable some prior steps in order to avoid some long SQL queries. If you disable the previous steps, it also removes the conditions that were set. Could you disable this behavior? Greetings, Steven
-
This problem is occurring on the Dataiku platform. What can I do to solve it?
-
Visibility condition in plugin parameter
I have a Python plugin with a parameter A of type "MULTISELECT" and another parameter B which I'd like to hide based on A. For instance, if A can take the values (1, 2, 3), what should the "visibilityCondition" for parameter B be in each of these cases? * Hide B if A has no values selected * Hide B if A =…
-
Add Seldon to deployment options
One of the deployment options in our company is Seldon (Seldon, MLOps for the Enterprise.). It would be great if Dataiku had the option to deploy directly to Seldon, the way deployment to Kubernetes, AWS, Databricks or Azure is now possible. Seldon generally deploys MLflow artefacts.
-
add an incremental column in dataset
The requirement is to add an incremental column to a dataset; it should not be an identity column, but the data in it must be unique.
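A minimal Python-recipe sketch of one way to do this, assuming input and output datasets named "my_input" and "my_output" (both hypothetical):

```python
import dataiku

# Hypothetical dataset names for illustration.
input_ds = dataiku.Dataset("my_input")
output_ds = dataiku.Dataset("my_output")

df = input_ds.get_dataframe()

# Add a unique, monotonically increasing column without relying on a
# database identity column.
df.insert(0, "row_id", list(range(1, len(df) + 1)))

output_ds.write_with_schema(df)
```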
-
Add support for Snowflake key-pair authentication
Currently Dataiku only supports "User/Password" and "OAuth" for Snowflake connections. Snowflake offers key-pair authentication (https://docs.snowflake.com/en/user-guide/key-pair-auth), which I would like to use for my service accounts.
-
Field grouping
Hello, A lot of the time, when you have a dataset, you want to know if there is a group of fields that work together. That can help you normalize (like de-joining) your data model for dataviz, for performance, or to simplify your analysis. Example: order_id item_id label model_id length color amount 1 1 A 10 15…
-
Export and Import of Code Studio Template
Hi, On Dataiku 12, I do not see a way to export a code studio template. How can I export a code studio template and import it into a new Dataiku instance? Thanks Operating system used: Windows 10
-
LLM Batch requests
I am experiencing rate limit errors for my LLM requests. Is there a simple method to batch my requests? My thanks in advance. Tom Operating system used: Mac OS
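A minimal, generic sketch of throttling requests to stay under a rate limit; call_llm is a hypothetical placeholder for whatever client call you use (for example through the LLM Mesh), and the batch size and pause are assumptions to tune:

```python
import time

def call_llm(prompt):
    # Hypothetical placeholder: replace with your actual LLM call.
    raise NotImplementedError

def run_in_batches(prompts, batch_size=5, pause_seconds=10):
    """Send prompts in small batches, pausing between batches
    to stay under the provider's rate limit."""
    results = []
    for i in range(0, len(prompts), batch_size):
        batch = prompts[i:i + batch_size]
        results.extend(call_llm(p) for p in batch)
        if i + batch_size < len(prompts):
            time.sleep(pause_seconds)
    return results
```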
-
Can we install Dataiku on OpenShift?
Can we install Dataiku on OpenShift?
-
How to build a 'Zero Records Check'
In Alteryx I was able to build a zero-records check that made it possible to have a "no data" entry for a particular path in my workflow. This is used in reporting to show that there were no items in that particular section. The steps were: count records, append the record count (could include 0), replace nulls with 0 or blank…
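A minimal Python-recipe sketch of the same idea in Dataiku, assuming an input dataset "section_items" and an output "section_report" (both names hypothetical): if the input has zero records, emit a single "no data" row so downstream reporting still has an entry.

```python
import dataiku
import pandas as pd

# Hypothetical dataset names for illustration.
input_ds = dataiku.Dataset("section_items")
output_ds = dataiku.Dataset("section_report")

df = input_ds.get_dataframe()
record_count = len(df)

if record_count == 0:
    # Emit a single "no data" row so the section still shows up in reports.
    df = pd.DataFrame([{"status": "no data", "record_count": 0}])
else:
    df["status"] = "ok"
    df["record_count"] = record_count

output_ds.write_with_schema(df)
```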
-
Add/Delete rows button for VisualEdit Plugin
The Visual Edit plugin is really good and does what we need; however, I am having to code a Dash webapp that uses the Visual Edit class and add my own functions for adding and deleting rows in the editable dataset. It would be great if the Visual Edit plugin could be configured to do this instead of having to custom-code it.
-
How to get metrics (jmx, prometheus, etc) for Dataiku DSS ?
From what I can tell from the documentation, DSS does not export metrics via Prometheus or JMX. The only thing it can do is export metrics to a Graphite/Carbon server. The documentation does not mention which metrics are actually exported either, so for me it's hard to tell whether it's even worth going to all the trouble…
-
How to merge 2 tables by date comparison and precedence constraints
Hello. I have 2 tables. The first one shows vehicle consumption records:
| ID_VHL | date | Type_Of_Record | Value_Of_Record |
| A | 05/01/2024 | hour | 10 |
| A | 15/01/2024 | km | 2 |
| A | 05/08/2024 | hour | 20 |
| A | 15/09/2024 | km | 28 |
and the second one represents the distribution of these vehicles, variable over…
-
Option to display short descriptions on flow
Hi All, Forgive me if this has been discussed before, or if it is a polarizing topic as far as visual design goals. In evaluating Dataiku against other products, and ultimately deciding on Dataiku due to its many strengths, one thing my team lamented was that it was not possible to display descriptions of flow elements on…
-
Way to store certificate and key securely without using folder
Hi! Our plugin needs a certificate and key (types which are not supported as a Dataiku dataset) for authentication purposes to fetch the data. I'm providing these by creating a folder in a secure location (using a project connection and macros), attaching it to the plugin as an input, and specifying in the UI the file names it should look for…
-
How to download a document for all the Scenarios implemented in Project
Hi, for example, we can export a flow in document format. Is there a similar way to export all scenario info in document format? Thanks
-
My custom Plugin changes are not applied.
My environment is Dataiku built on-premise on AWS. I created a custom plugin from a Python program recipe. I edited and saved the recipe.json of this plugin, but when I invoke the plugin on the flow screen, the changes are not applied. I then restarted Dataiku and all the changes were applied. How can I apply the changes to…
-
How to change the underlying table in a dataset?
I am using this code to change the table associated with a dataset. The table is getting changed, as I verified from the dataset settings, but I am getting the error 'Column name not matching between query and schema'. However, when I click Test Table in the connection settings, the dataset gets updated. I am using this…
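A minimal sketch of changing the table through the public API; the project key and dataset name are hypothetical, and note that the error usually means the dataset's stored schema no longer matches the columns of the new table.

```python
import dataiku

client = dataiku.api_client()
dataset = client.get_project("MYPROJECT").get_dataset("my_dataset")

settings = dataset.get_settings()
# For SQL datasets the table name lives in the connection params.
settings.get_raw()["params"]["table"] = "NEW_TABLE_NAME"
settings.save()

# The dataset schema stored in DSS must still match the new table's
# columns, otherwise "Column name not matching between query and schema"
# can appear at read time.
```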
-
Clear Notebook Output before checking into Version Control
Suppose we are working with sensitive data and do not want the output of the cells to be stored in the version control system; how can we clear the outputs by default? When checking in the notebooks separately, we use the following library, which clears the output cells: # Install libraries: pip install nbstripout…
-
abc
-
Can't create a code env from a newer Python version (local DSS installation)
Hi, I installed the local free version of Dataiku on macOS 15.0.1. The Dataiku DSS version is 13.2.1. By default, I can only create Python code env for Python 3.7. The rest of the Python versions are marked as "Not available" in the code env creation form. I need a higher version than Python 3.7 to create a code env for…
-
How to manage dss nodes through python API when using fleetmanager
Hello, I want to manage Fleet Manager's resources using the API. After searching, I found that only the Python API was available, and a problem occurred during testing. I received the following error message: import dataikuapi key_id = "?" key_secret = "?" # <Cloud vendor> is either AWS, Azure or GCP client =…
-
How to integrate GPU from an external instance to my actual Dataiku instance
Hi, I am currently running Dataiku DSS on a general-purpose EC2 instance in AWS. Due to some projects and proposals, I am required to use GPUs to accelerate model training, and I have an accelerated-computing EC2 instance that I want to use for these new projects, but migrating all…
-
switch function not working with 'dayOfWeek'?
Hi all, I tried to apply a condition to find the latest previous weekday date; for example, if today is Sunday or Monday, then it will be Friday, and for other days it will just be yesterday. I tried using the switch function like this: switch( datePart(now(), 'dayofweek'), 7, trunc(inc(asDate(now(), 'yyyy-MM-dd'), -2,…
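For comparison, a minimal plain-Python sketch of the intended logic (using Python's datetime convention of Monday=0 ... Sunday=6), independent of the formula language:

```python
from datetime import date, timedelta

def previous_weekday(today=None):
    """Return the latest previous weekday: Friday if today is
    Sunday or Monday, otherwise yesterday."""
    today = today or date.today()
    offset = {6: 2, 0: 3}.get(today.weekday(), 1)  # Sunday -> 2 days back, Monday -> 3
    return today - timedelta(days=offset)

print(previous_weekday(date(2024, 9, 15)))  # a Sunday -> 2024-09-13 (a Friday)
```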
-
Error when using Embed recipe
Hi Team, When I try to embed a dataset in Dataiku with textual data, I encounter this error: Error in Python process: <class 'Exception'>: At least one embedding request failed: HTTP code: 429, response: { "statusCode": 429, "message": "Token limit is exceeded. Try again in 48 seconds." } Could you please help…
-
custom dataiku install on linux reverse proxy
I set up the proxy like this after finishing the Dataiku installation, but the homepage won't open. Where did I go wrong with the nginx settings?
-
Snowpark - Error in writing using pandas
Hi, I am getting the following error when trying to convert a Snowpark dataframe into a pandas dataframe. The Snowpark dataframe and processing work well; converting to pandas to store the data causes the problem below: ERROR:snowflake.connector.ocsp_snowflake:No CA bundle file is found in the system. Set REQUESTS_CA_BUNDLE to the…
-
Not sure what this error is:"invalid resource name projects/${Dataiku-BQDefaultProject};"
Hi, I am new to Dataiku. I am getting this error and am not sure what it means: Oops: an unexpected error occurred Invalid resource name projects/${Dataiku-BQDefaultProject}; Project id: ${Dataiku-BQDefaultProject} HTTP code: 500, type: com.dataiku.dip.sql.bigquery.BigQueryException Operating system used: Windows
-
Guidance on Integrating GPU Instance with Main Dataiku Instance and Potential Alternatives
Hello Dataiku Community, We’re currently working on a project where we require GPU support for intensive training tasks. Our setup includes a primary Dataiku instance that is already configured and running in our AWS environment. To handle the more resource-intensive parts of our workflow, we’re exploring the option of…
-
"Forecasting Sales with Machine Learning" sample project unavailable?
Hi, Was the "Forecasting Sales with Machine Learning" sample project and related blog posted moved? The project links now return a 404: Blog post: https://www.dataiku.com/learn/samples/forecasting-sales/ Project link: https://gallery.dataiku.com/projects/DKU_STORESALES/ I found the project's description in some old…
-
How to do auto update of the "Variable display" tile in the Application designer
I am displaying project variables in the "Variable display" tile. Even after updating the project variables, the "Variable display" tile does not show their updated values; I need to explicitly refresh the browser to see the latest values. How…
-
Correcting Typos - Text Preparation Plugin?
Hi everyone, Context: I have data from a survey. One question in the survey is multiple choice with predefined answers, but there is no data validation built into the survey. As a result, I have typos in the data. For example, a column "Genre" can include "Rcok, Clasisc, Jaz". Question: Is there a smart/quick way to…
-
Can we use a Kubernetes cluster with the free edition of Dataiku?
Hi, Can we use a Kubernetes cluster with the free edition of Dataiku? Let's say we have a Linux VM in AWS or Azure environment where we have deployed the free edition of DSS Ver 12 or later. Is it possible to use a Kubernetes cluster in this environment to reduce model training time? Thank you. Taka Operating system used:…
-
Bundle secret credentials to automation node
Hi, I've not done a release to the automation node before. My model uses APIs to connect to external data, which rely on passing credentials. These credentials are stored under my account within Dataiku, and using Python I pass the secrets through securely. What I am not sure about is, when I bundle this model, how will the…
-
Present Connection Credentials in sorted order
Connection credentials are currently presented to the user in what seems to be a random order. That, combined with the fact that we have about 70 connections defined, makes it pretty difficult to find a particular connection. It would be great if the connections were sorted as they are in most other areas of DSS. For…
-
Snowpark UDF issue
Below is code for calling a Snowpark UDF in a Python recipe. The environment is connected to Snowflake. def ABC(session, df): def BCD_UDF(X,Y,Z): pandas series processing return pd.DataFrame(…) # Register the function as a UDF BCD = session.udf.register( func = BCD_UDF, return_type=StructType([ StructField("MN",…
-
Assistance with Integrating YOLOv5 in Dataiku
Hello Dataiku Community, I'm currently working on a project where I need to use the YOLOv5 algorithm to detect potholes in images. I have a dataset with labeled images ready, and I’m trying to set up YOLOv5 within a Dataiku Python recipe for model training. However, I’ve encountered several issues along the way, and I’m…
-
Multiple Joins to mimic VLOOKUP?
Hi everyone, I'm new to Dataiku and have a question about joins. Context: I have two datasets: Activities and Goals. Activities is an uploaded Excel file, with one row reflecting one activity. Each activity is linked to one or more goals. The link between activities and goals is captured in one column of the dataset. For…
-
Full Screen Streamlit Apps (stand-alone view through Dataiku)
The title says it all. Publishing your Streamlit Webapp via Dataiku feels more like a clunky embedded experience than a bespoke application. What I'm looking for here is a native experience through Dataiku's platform. I know that Dataiku offers users the ability to publish their web apps publicly (full screen / native…
-
Databricks as pyspark engine
Currently, Dataiku only supports Databricks via Databricks Connect (in a Python recipe), but does not support PySpark recipes or selecting the Spark engine for visual recipes. As a result, we are not able to develop visual recipes with Spark, and either need to use Databricks Connect in Python recipes, or develop directly in…
-
Granting Editing and Webapp Launch Capabilities to Explorer License Users
Hello, Our project managers hold DSS Explorer licenses, but they are limited in two important ways. Editing wikis: while they can help document projects in DSS wikis, they are unable to insert images (see Figure 1). Figure 1: error message shown to an "explorer" user attempting to insert an image in a wiki where they have…
-
Identify Product ICC using Attribute Values
I am new to Dataiku. How can I identify/predict the ICC of a product from attribute values in raw data using Dataiku's AI/ML capabilities?
-
importing module to dataiku jupyter
I am new to Dataiku and I am trying to learn how to import modules into Dataiku. I have already gone through the documentation, but when I try to import and use the module in the same project in a Jupyter notebook it says the module is not found. I have attached screenshots below. Operating system used: Windows 11
-
Oops: an unexpected error occurred java.lang.IllegalStateException: Expected a double but was BEGIN
Hi, I am working on the "Advanced Designer Assessment" project. After modifying the Prepare recipe to add the column "qualifies", when I open the dataset "Online_Retail_Prepared" I get this error message: Oops: an unexpected error occurred java.lang.IllegalStateException: Expected a double but was BEGIN_ARRAY at line 377 column 21 path…
-
Exporting a file in dynamic name
Hi, I have my output dataset "Final_output" at the end of the flow. I want to export it to S3 as a CSV named "Final_output_$datetime.csv", so that every time the flow runs it creates a file with that timestamp. I tried with a variable, but it didn't work for the file name creation. Thanks, Vinothkumar M
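One way to approach this is a Python step that writes the CSV into an S3-backed managed folder under a timestamped name; a minimal sketch, assuming a dataset named "Final_output" and a managed folder named "exports" (the folder name is hypothetical):

```python
import dataiku
from datetime import datetime

df = dataiku.Dataset("Final_output").get_dataframe()
folder = dataiku.Folder("exports")  # hypothetical managed folder backed by S3

# Build a timestamped file name, e.g. Final_output_20240915_143000.csv
file_name = "Final_output_{}.csv".format(datetime.now().strftime("%Y%m%d_%H%M%S"))
with folder.get_writer(file_name) as writer:
    writer.write(df.to_csv(index=False).encode("utf-8"))
```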
-
Webapps and file structure
I would like to create a webapp which I have developed locally; the file structure is below. It appears that standard webapps have a simplified interface with tabs for CSS/JS/Python etc. Is there a way to create a webapp that will allow me to maintain my file structure without rewriting it to fit the rigid structure of a…
-
How to resolve permission error related to DeepFace in Dataiku?
I am using the DeepFace library in Dataiku. When I write code in a Jupyter notebook and try to import the library, I am able to do it. But when I export it as a Python recipe and execute the same code, it does not work; it gives me a Permission Error. How do I resolve this?
-
How to set a project variables with Python
I am trying to set project variables that will be used by scenario recipes as dates. In the project, the global variable is of the form {"start_date": "10/27/2024"}. I am able to use this in recipes and dashboards. Using a Python notebook, I am able to change the variables, but now I want to make it dynamic based on the current…
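A minimal sketch of setting the variable dynamically from the current date with the public Python API; the variable name follows the example above, and using the "standard" scope for global project variables is the assumption here (local variables live under "local"):

```python
import dataiku
from datetime import date

client = dataiku.api_client()
project = client.get_default_project()

variables = project.get_variables()
# Update the global ("standard") variable with today's date in MM/DD/YYYY format.
variables["standard"]["start_date"] = date.today().strftime("%m/%d/%Y")
project.set_variables(variables)
```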
-
Knowledge bank
I have my final file in a managed folder; it is a document with various graphs and summaries in it. How can I embed the data in a knowledge bank and create a prompt to generate a summary?
-
How do I automate a notebook?
Hello, I have been attempting to automate a flow so that it runs hourly through an automation scenario. It starts from a Python recipe, builds multiple datasets, then I retrain the model, which appends everything to a logging table. This script works if I run it manually; however, I do not know how to create an…
-
export dataiku plots directly as svg
It would be great if users could easily download all plots/charts created in dataiku directly as an svg (including model performance plots). For users who code, it would be even better if you could open an ipynb notebook with the code for generating the plot so the user could customize it and then export in a…
-
Setting project tags in v12 seems oddly difficult
Hi, It is very easy to get the list of tags for a project, but the set_tags function expects a dictionary. I created a dictionary (mydict) like so: {'BAC': {'color': '#1ac2ab'}, 'ABC': {'color': '#1ac2ab'}, 'POO': {'color': '#1ac2ab'}} I then use set_tags(mydict), which does not work but also does not error. Any ideas? thx…
-
Sync Recipe - Append instead of overwrite
Greetings, I have a Sync recipe that uses an SQL query as the Input to load the past 6 months of data thru the previous calendar day to an Output table. Going forward, all I want is to load is the previous calendar day's data. Is this as simple as placing a check mark in the Append instead of overwrite checkbox shown…
-
Configure Windows OS in Dataiku
Hi Dataiku Team, I have a new requirement to track Windows file logs using Dataiku. Our development team has implemented code that generates logs on a Windows server, and I need to set up a connection in Dataiku to access these logs from my system. Could you please guide me on how to establish this connection and…
-
Co-Marketing Material
Where can I find the co-marketing materials?
-
Overlap area in Intersection objects
Hi. I have two geometries that overlap one another. Please advise: is there a way to find the overlap area, i.e. how much one polygon overlaps another polygon in terms of area or percentage, using Geospatial Analytics?
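If a code-based approach is acceptable, a minimal sketch with shapely (the polygons and their coordinates here are made up for illustration; real geometries could be loaded from WKT or GeoJSON columns):

```python
from shapely import wkt

# Hypothetical polygons for illustration.
poly_a = wkt.loads("POLYGON((0 0, 4 0, 4 4, 0 4, 0 0))")
poly_b = wkt.loads("POLYGON((2 2, 6 2, 6 6, 2 6, 2 2))")

overlap = poly_a.intersection(poly_b)
print("Overlap area:", overlap.area)                                        # 4.0
print("Share of A overlapped: {:.1%}".format(overlap.area / poly_a.area))   # 25.0%
```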
-
Test querie return prediction from empty parameter
Hi, I was testing my prediction model endpoint with test queries from my datasets, and it works perfectly. But when I test my model with an empty features query, it still returns a prediction, even though I passed nothing in the feature parameters, where it should rather return an empty value. What do you think the problem is…
-
Is there any limitation on the number of recipients that can be added when sending an email
Is there any limitation on the number of recipients that can be added in the 'Send email' step used in scenarios? I need to add around 250 recipients to the email. Is there an alternative if the above approach is not optimal?
-
how to generate rows in DataIku
Hi Team, I want to generate some date rows based on a criterion in Dataiku, like the attached screenshot from Alteryx. This will generate a new column 'Date'. I came across the link below from the community, but there is no proper solution given there. Any help would be greatly appreciated. Thanks & Regards, Dil Operating…
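A minimal pandas sketch of generating date rows in a Python recipe; the start date, end date, and daily frequency are hypothetical stand-ins for the actual criteria shown in the screenshot:

```python
import pandas as pd

# Hypothetical criteria: one row per day in a fixed range.
dates = pd.date_range(start="2024-01-01", end="2024-01-31", freq="D")
df = pd.DataFrame({"Date": dates})
print(df.head())
```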
-
"Project key is already used", but I cannot find it
Hello, I am trying to duplicate a project and have set a new project key "NEW_KEY" for the copy. When I try this I get the following error: "Project key NEW_KEY is already used. Please check that you have not already imported this project." Since the message states the project already exists, I tried accessing it (via the URL…
-
Could Fleet Manager and DSS do Cross-Region Backing up in AWS?
Hello, Has anyone used Cloud Stacks for AWS to build a Fleet Manager and DSS instance for cross region backups? For example, we have FM and DSS hosted in AWS region “ap-northeast-1” and we want to keep these snapshots in “ap-northeast-1” but automatically transfer them to “ap-northeast-3”. Looking at the Fleet Manager…
-
Join recipe with Multiple datasets
Hello, I am studying Dataiku DSS. I have 3 datasets (Customer, Computer, Printer) that I am joining by CustomerID. Each dataset has a column called CustomerID. The Computer and Printer datasets also have a column with the date when the item was purchased. I would like to know if the customer bought a…
-
Enhance dashboards to include a Python Visual tile
Hi, For our analytics, we need custom Python visuals nearly all of the time. Would be good to allow the user to create a Python visual tile that uses a standard Python recipe. thx
-
Sharing Applications
Hi there, We are working on best practices for deploying application instances, and have two items which together are causing us a bit of a headache. If I want an end-user of a Dataiku application to be able to modify a variable through a parameter (say a dropdown), they need write project permissions. By giving them write…
-
Updating models in scenarios
Hello, I set up a model with the DSS machine learning modules. In my scenario, I've added a dedicated step to run the model training queries. Will this step re-train the model and modify the model coefficients and metrics, or not?
-
Evaluate LLM recipe
I want to evaluate an LLM like in the picture above, but although the tutorial shows a recipe called Evaluate LLM, I can't see that recipe. How do I evaluate an LLM?
-
delete api service using py api
Hi, I can't seem to find a method to delete an existing service with its associated endpoints in the API Designer using the Python API. Is there an alternative way?
-
Rename automatically a dataset & export several datasets at same time
Hello! I have two questions concerning the Dataiku platform: I would like to automatically rename a dataset (I am a visual designer user) with the date of the day. Then, I would like to export several datasets at the same time to my desktop (via a scenario, for example). Does someone know how to do that?…
-
Option to rearrange output columns in join recipe
I would like to have the option to rearrange output columns in the join recipe. Perhaps by making the 'hamburger' icons on the Output panel draggable.
-
Dataiku kafka connection
Hi, I want to create a Kafka connection from Dataiku: Administration -> Connections -> New connection -> Kafka. I configured NEW CONNECTION NAME and BOOTSTRAP SERVERS according to the installed Kafka. However, I encountered the following error: "Connection to Kafka failed: org.apache.kafka.common.errors.TimeoutException:…
-
Housekeep reports
I have a managed folder where I am storing custom reports generated every month. I would like to housekeep reports older than 6 months. Could you please guide me on how to do that?
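A minimal sketch of a Python scenario step that removes files older than roughly 6 months from the managed folder; the folder name is hypothetical, and the get_path_details() call with a lastModified field (epoch milliseconds) is an assumption to verify against your DSS version:

```python
import dataiku
from datetime import datetime, timedelta

folder = dataiku.Folder("monthly_reports")  # hypothetical folder name
cutoff = datetime.now() - timedelta(days=183)  # roughly 6 months

for path in folder.list_paths_in_partition():
    details = folder.get_path_details(path)  # assumed to expose a lastModified timestamp in ms
    last_modified = datetime.fromtimestamp(details["lastModified"] / 1000.0)
    if last_modified < cutoff:
        folder.delete_path(path)
```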
-
How does Dataiku run a recipe from a scenario if the recipe has multiple outputs?
Let us say there is a Python recipe with input A and outputs B, C and D. I want to build datasets B, C and D from a scenario. Should I specify datasets B, C and D in the build step of the scenario? Will it run the recipe three times to create each of the datasets? If so, is there a way to run the whole recipe from the…
-
Keyword for creating sql spark recipe via Python API
Hi Team, Could you please help me clear up the doubts below? The use case is that the output of a window recipe (on S3) will be the input to a Spark SQL recipe whose query we will dynamically create, add and run from Python. The output will also be on S3. builder = project.new_recipe(??) #what would be the keyword for…
-
LLM Response streaming in API Designer
Hey, I am trying to implement a streaming API in the API Designer so I can request the endpoint in streaming mode to display LLM responses in a chatbot UI. I took inspiration from the Response streaming section of LLM Mesh - Dataiku Developer Guide, but I am unsure what to use as a replacement for the print…
-
Advanced container settings for R code environments
Just as Python code environments can have Dockerfiles defined to be applied when building the code env, we need the same for R. We find ourselves modifying the base DSS image to accommodate some features needed in an R code environment.
-
My dashboard is not working; it is showing a 404 error
-
Practical Use of Code Studios
I am starting this thread to learn about how others are using code studios (such as VSCode, JupyterLab, and Streamlit) and for what purposes. In our organization, we were initially excited about the feature introduced in DSS v11. However, our enthusiasm was quickly dampened by the fact that users cannot select the…
-
Haruyuki Yamada
I was able to install Dataiku 13.0, but it will not start at all. Why is that? I actually wanted to install 13.2. Operating system used: Windows
-
using pre deployment hook create a pv and pvc?
"I am trying to add a Persistent Volume Claim (PVC) to a Kubernetes deployment using a pre-deployment hook in Dataiku. Could you provide any documentation or steps outlining how to add a volume to a deployment through a pre-deployment hook?" Operating system used: windows Operating system used: windows
-
XGBoost error
Our solution: Dear Abhinav, The issue you're encountering seems to stem from the missing `libomp` library, which is required by XGBoost on macOS. Let's walk through a possible resolution: Steps to resolve the missing `libomp` issue: 1. Install OpenMP (libomp): On macOS, you need to install `libomp`, which provides the…
-
Does Automation NODE support Assume Role With Saml?
Hi everyone, I have a question. Does the Automation node support Assume Role with SAML for server-to-server authentication when connecting to Amazon S3 and integrating with corporate IAM? Operating system used: Ubuntu 22.04
-
Difference in minutes and seconds between two date
Hi everyone, How can I get the difference in hours, minutes and seconds (example: 14:30:24) between two dates? Example: Col 1: 2018-01-01T09:50:15.000Z Col 2: 2018-01-01T10:07:55.000Z Col 3 (difference): 00:17:40 Thanks so much for your help, Maxime
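A minimal pandas sketch of this computation in a Python recipe; the column names follow the example above:

```python
import pandas as pd

df = pd.DataFrame({
    "Col 1": ["2018-01-01T09:50:15.000Z"],
    "Col 2": ["2018-01-01T10:07:55.000Z"],
})

# Difference as total seconds, then formatted as HH:MM:SS.
seconds = (pd.to_datetime(df["Col 2"]) - pd.to_datetime(df["Col 1"])).dt.total_seconds().astype(int)
df["Col 3"] = seconds.apply(lambda s: "{:02d}:{:02d}:{:02d}".format(s // 3600, (s % 3600) // 60, s % 60))
print(df["Col 3"].iloc[0])  # 00:17:40
```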
-
Identifying the Node Type in a DSS Notebook using Python
In Python, in a DSS notebook, I want to know if the code is running in the design node or the automation node. How can I do that?
-
Integration with Microsoft Fabric and its OneLake
Hi, I couldn't find anything yet in the Dataiku docs/release notes on an integration with Microsoft Fabric through OneLake. Is this coming soon? As I read the Microsoft docs, I understand we can't connect directly via ADLS, only via APIs or SDKs. Thanks in advance, Jonathyan Operating system used: RHEL 8
-
Dashboards or piece of dashboard lost during collaborative work on a project
Has anyone had this problem? When editing a tile or a graph in a dashboard, if two team members edit the same graph at the same time, the graph is lost (question: why is such a situation possible? In most collaborative tools there is a locking mechanism to avoid that), and even a dashboard which has been edited and saved…
-
Git Version Control - merging branch
OK, I'll admit it up front: I am new to the Git model, and after watching a few introductory videos I tried to use the version control feature of creating a new branch. I selected the option when creating a new branch, "Edition mode: Use current project to work on new branch". (Yes, I know, the documentation says to use…
-
Export monthly files
I want to export files from 9 months back from today's date, but always the 1st of that month. I have this working, but when the month changes it doesn't automatically include November, for example, when testing. Is there a way to automate this so I don't have to manually change the time period?
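A minimal sketch of computing that date dynamically (the 1st of the month, 9 months before today), which could then drive a project variable or file filter; python-dateutil is assumed to be available in the code environment:

```python
from datetime import date
from dateutil.relativedelta import relativedelta

# First day of the month, 9 months before today.
target = (date.today() - relativedelta(months=9)).replace(day=1)
print(target.strftime("%Y-%m-%d"))
```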
-
"NumberFormatException: For input string" in scenario with integer partitioned dataset
I have a non-partitioned dataset D1. Its first column, "dt_partition", will be used to partition the next dataset. dt_partition is of type integer, representing the month (for example 202409). It only contains one value at a time, so the data will go into a single partition. My database is Snowflake. At the output of D1,…
-
ML Practitioner Certificate expires or its life time validity.?
I completed my ML Practitioner certification on 28th July 2023, and when I try to take it again I cannot see the exam; only my certificate appears. Kindly help me understand its validity.
-
How to debug provisioning errors using Fleet manager
I get an error when I provision Automation servers using Fleet manager. How can I debug this? Operating system used: Amazon Linux
-
Publishing Feature Groups in the Dataiku Feature Store from a Prod Node to a Dev Node
We've used a Dataiku project on the Design Node to produce a number of shared data assets. These are published as Feature Groups in the Feature Store where other users can bring them into their own projects. When the project producing the shared datasets is promoted through Test and Prod environments, is there a way the…
-
Run a Time Series Forecasting Model
I get the following error message: Failed to train: <class 'ImportError'>: libcuda.so.1: cannot open shared object file: No such file or directory Operating system used: 13.1.4
-
Code Snippet as Edit option in Support Chats and Community Posts
Hey there, I do not know why this feature is not present. It would be really great to be able to send code as code snippets, as provided by many other tech community and support platforms, for better readability and communication in these posts. I think this is essential for giving better context.