Clearer visual experience when working with Partitioned Visual Model summary screens
User Story
As a new Dataiku DSS user getting familiar with building partitioned visual models, I would like it to be clearer that there is a set of summary graphics for each of the partitions. This would make the partition feature easier to use and build confidence in users, making it feel less like an "easter egg" and more like a productive feature.
As a creator of partitioned visual models trying to follow best practices for k-fold cross-validation testing, it would be helpful to see the error margin (± 0.###) on the summary screens as we do with non-partitioned models (see the sketch below for the kind of figure meant). This would allow for quicker comparison of models.
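To illustrate the kind of "score ± margin" number being requested, here is a minimal sketch of how such a figure is typically derived from k-fold results. This is not how DSS computes its summary metrics; the use of scikit-learn, the choice of AUC, and the per-partition data variables are all assumptions for illustration only.

```python
# Minimal sketch, NOT Dataiku's implementation: shows how a "score (± margin)"
# figure of the kind requested can be derived from k-fold cross-validation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def kfold_summary(X, y, k=5):
    """Return the mean AUC across k folds and its spread (standard deviation)."""
    scores = cross_val_score(RandomForestClassifier(), X, y, cv=k, scoring="roc_auc")
    return scores.mean(), scores.std()

# Hypothetical usage, one call per partition (e.g. Florida, California), so each
# partition row on the summary screen could read something like "AUC 0.912 (± 0.008)":
# mean_auc, margin = kfold_summary(X_partition, y_partition)
# print(f"AUC {mean_auc:.3f} (± {margin:.3f})")
```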
Notes:
- There is no indication, when you look at the screen, that you can click on anything.
- Hovering over "All Partitions" shows no underline indicating that the words "All Partitions" are clickable.
- Even without any visual cue, clicking on "All Partitions" produces a chart similar to, but not the same as, the one in the documentation here.
- Clicking the All Partitions "trained" link takes you to a training session log.
- Clicking the name of the first partition (Florida, in this case) or the second partition (California) produces a different behavior: you are taken to that partition's model summary page.
- Clicking the Florida or California "trained" link takes you to a training session log.
- However, there appears to be nothing obvious to click on to get the behavior described in the training materials. One ends up having to click on the gray space between values on this summary screen to reproduce the feature shown in the training.
What I was trying to produce was the following, as shown in the training materials.
For the second comment, about k-fold cross-validation, here is what I'm currently seeing.
For this testing, I'm using DSS v8.0.4.
cc: @AshleyW
Comments
-
tgb417 · Neuron · Posts: 1,601
@Krishna, I've updated this idea with a second, closely related idea about the display of k-fold cross-validation results in a partitioned model scenario. I can break this out into a separate idea if you need it.
-
tgb417 · Neuron · Posts: 1,601
I did find a way to see the k-fold cross-validation results. However, it takes a fair amount of digging to find this information.
-
Krishna · Dataiker, Product Ideas Manager · Posts: 18
Thanks, Tom, for this very thorough request; the detailed steps for reproducibility are much appreciated.
On your first suggestion, I'm afraid I wasn't able to reproduce it with Safari or Chrome (I tried on my existing install of 8.0.2, and upgraded to 8.0.4 as you specified). On hover, the rows should appear highlighted, indicating a clickable action (hopefully the GIF below demonstrates this).
That being said, I acknowledge the suggestion to make it even more explicit, and have logged it along with the second suggestion around displaying the k-fold margin on the model summary screen.
-
tgb417 · Neuron · Posts: 1,601
Yes, it does work.
However, you know that the graphics are there; as a new user, I did not. The only indications are that the mouse pointer changes and, yes, the rows highlight. The issue for me was that, in most other places in DSS, clickable items show an underline, and these items do not behave the same way. (And because of all of the graphic building, the long delay in opening these rows also felt really odd.)
Regarding including the error range in the overview of all of the summaries: is that something that is going to be doable?