

Solved!


Train multiple neural networks in one Analysis?

The title basically says it all. I want to try different hyperparameters for my Neural Network (or algorithms in general). For some, like random forest, I can specify a list - e.g., max_depth. What I need is a queue of Neural Networks with different hyperparameters, so that I can start them in the evening and come back to the results in the morning.

How can I do this?

1 Solution


Hello,

This is not currently possible in the visual interface.

Instead, for hyperparameter optimization on neural networks, we invite you to code your own custom Python model (in the Analysis > Design > Algorithms section). For instance, for a neural network from scikit-learn (MLP), you can use this:

from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import GridSearchCV

# Hyperparameter grid to explore
parameters = {
    'alpha': [1, 10, 0.1],
    'activation': ["logistic", "relu"],
}

mlp = MLPClassifier()

# DSS expects the custom model to be assigned to `clf`; the grid search
# trains one model per parameter combination and keeps the best one.
clf = GridSearchCV(
    estimator=mlp,
    param_grid=parameters,
    n_jobs=-1,
    verbose=2,
    cv=5,
)
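For reference, here is a minimal, self-contained version of the same grid search run outside DSS. The synthetic dataset and the `max_iter` setting are illustrative assumptions, not part of the original answer; in DSS the platform supplies the training data and calls `fit` on `clf` itself.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

# Toy data standing in for the DSS-provided training set
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

parameters = {
    'alpha': [1, 10, 0.1],
    'activation': ["logistic", "relu"],
}

clf = GridSearchCV(
    estimator=MLPClassifier(max_iter=500),
    param_grid=parameters,
    n_jobs=-1,
    cv=3,
)
clf.fit(X, y)

# The winning hyperparameter combination
print(clf.best_params_)
```

After fitting, `clf.cv_results_` also records the score of every combination, which is exactly the overnight queue of networks described in the question.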

Note that we are looking to integrate neural networks more deeply into our product 😉. We will keep you posted!

Cheers,

Alex

4 Replies



Thanks! I will try the custom version as soon as the current NN classifier is finished.


NameError: name 'gridSearchCV' is not defined

--> Fixed by using GridSearchCV instead.

After fixing this, I now get:

get_params() must be called with MLPClassifier instance as first argument.

and I'm not really sure what to do about that.
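A likely cause of this error (an assumption, since the failing code is not shown) is that the `MLPClassifier` class itself was passed to `GridSearchCV` instead of an instance. scikit-learn then calls `get_params()` on the class rather than on an object, which produces exactly this message:

```python
from sklearn.model_selection import GridSearchCV
from sklearn.neural_network import MLPClassifier

parameters = {'alpha': [0.1, 1], 'activation': ["logistic", "relu"]}

# Wrong: passing the class (no parentheses) triggers
#   "get_params() must be called with MLPClassifier instance as first argument"
# clf = GridSearchCV(estimator=MLPClassifier, param_grid=parameters)

# Right: pass an instance of the estimator
clf = GridSearchCV(estimator=MLPClassifier(), param_grid=parameters)
```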
