Define shared variables during model training

pratikgujral-sf

Hi Community,

I have a Deep Learning model in a Visual Analysis. I need a few variables that are accessible from both the Architecture and the Feature Handling code. These variables would be computed in one place and then reused in build_model as well as in feature handling.

 

How and where do I declare such variables? Note that the values are not static; they are computed in Python. However, once set, they need to be available in all the places mentioned above.

I came across 

from dataiku.doctor.deep_learning.shared_variables import get_variable

but could not find any documentation or code examples on setting their values.
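
For reference, this is the kind of pattern I am hoping for. Note that the set_variable counterpart and the compute_vocab_size helper below are only assumptions for illustration; I have not been able to confirm that a setter exists in that module:

# In the feature handling (custom preprocessing) code:
from dataiku.doctor.deep_learning.shared_variables import set_variable  # assumed setter, not confirmed
vocab_size = compute_vocab_size(input_texts)  # hypothetical helper, value computed at runtime
set_variable("vocab_size", vocab_size)

# Later, in build_model (Architecture tab):
from dataiku.doctor.deep_learning.shared_variables import get_variable
vocab_size = get_variable("vocab_size")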


Operating system used: Red Hat Enterprise Linux



1 Reply
JordanB (Dataiker)

Hi @pratikgujral-sf,

As described in the "Advanced Training Mode" documentation, DSS builds a list of base_callbacks (to compute metrics, interrupt the model if requested from the UI, etc.) that must be passed in the call to fit_generator. You are then free to append any custom callback to this list. The fit_model function is where you define such custom Keras callbacks. For example, the EarlyStopping callback stops training before all planned epochs have completed if the tracked metric is no longer improving.
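
For illustration, adding an early-stopping callback from within fit_model could look roughly like the sketch below. The argument names follow the default advanced-mode code sample, so check your own template in case the exact signature differs:

from keras.callbacks import EarlyStopping

def fit_model(model, train_sequences, validation_sequences, base_callbacks):
    # Keep the DSS base callbacks and append your own.
    callbacks = base_callbacks + [
        EarlyStopping(monitor="val_loss", patience=3, restore_best_weights=True)
    ]
    model.fit_generator(
        train_sequences,
        validation_data=validation_sequences,
        epochs=20,
        callbacks=callbacks,
        shuffle=False,
    )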

Check out Writing Custom Callbacks to learn how to build your own. 
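
As a minimal sketch, a custom callback is just a subclass of keras.callbacks.Callback that overrides the hooks you care about; the class below only prints the logged metrics at the end of each epoch:

from keras.callbacks import Callback

class EpochLogger(Callback):
    # Toy example: print the metrics Keras has logged after every epoch.
    def on_epoch_end(self, epoch, logs=None):
        print("Epoch {} finished: {}".format(epoch, logs or {}))

An instance of such a class can then be appended to the callbacks list in fit_model, alongside the base_callbacks.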

Please let us know if you have any questions.

Thanks!

Jordan
