Specify multiple layers when designing a Neural Network?

macamhi
Level 1
In Dataiku v4.2.1, I'm trying to build a neural network model with multiple (2 or 3) hidden layers. How do I specify the sizes of the different layers in the Design interface? With the MLPRegressor class I can pass a tuple of layer sizes, e.g. '(100,50)', but this, or plain comma-separated values such as '100,50', gets rejected when I launch the training with the error message 'Layer sizes must be positive' (as if the input were not taken into account). When I enter different values separated by a CR, a one-hidden-layer model is trained for each of the values provided, which is expected behaviour but not what I'm looking for.
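For reference, this is what I can do directly with scikit-learn outside the Design UI, and what I would like to reproduce there (a minimal sketch using MLPRegressor):
from sklearn.neural_network import MLPRegressor
# Two hidden layers: 100 neurons in the first, 50 in the second
model = MLPRegressor(hidden_layer_sizes=(100, 50))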
6 Replies
Samuel_R_
Dataiker

Hello,

To specify the sizes of the different layers in the Design interface, just type the integers separated by a space.

The result should look like the image below:

[screenshot of the hidden layer sizes field in the Design interface]
macamhi
Level 1
Author
This is what I initially thought based on the UI hint, but what it actually does is run a grid search over three ANNs, one per layer size. This is confirmed when you export the produced model to a Python notebook: the best-scoring layer size is the one selected, but the model still has a single hidden layer.
I can indeed do the computation in the Python notebook, but I would have thought there was a way to do this through the Design UI, as the UI itself suggests.
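The notebook version I have in mind is along these lines (a minimal sketch, assuming the features and target have already been prepared as X and y):
from sklearn.neural_network import MLPRegressor
# X, y assumed to be prepared from the Dataiku dataset beforehand
reg = MLPRegressor(hidden_layer_sizes=(100, 50), max_iter=500)
reg.fit(X, y)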
Alex_Combessie
Dataiker Alumni
Thanks for reporting this; we will investigate this behavior further.
Wuser92
Level 3
Any update on this? I ran into the same issue on Dataiku 4.2.3 and 4.2.0.
Alex_Combessie
Dataiker Alumni
Hello, we confirmed there is a bug in the current version: the layer sizes are grid-searched instead of being combined into multiple layers. Our R&D team is aware of this and will work on a fix; we'll keep you updated. In the meantime, you can work around it by using a "Custom Python Model" (https://www.dataiku.com/learn/guide/visual/machine-learning/custom-model.html) and writing something like this:
import sklearn.neural_network
# Three hidden layers of 100 neurons each
clf = sklearn.neural_network.MLPClassifier(hidden_layer_sizes=(100, 100, 100))
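For a regression task like the one in the original question, the same pattern should presumably work with the regressor class (a sketch of the equivalent custom model):
import sklearn.neural_network
# Regression counterpart of the workaround above, with two hidden layers
reg = sklearn.neural_network.MLPRegressor(hidden_layer_sizes=(100, 50))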
bdub2you
Level 2

Has this bug been fixed yet? I.e. is there now a way to specify a neural network with multiple layers, each layer having its own number of neurons, or are we still required to write custom code for that?
