Manual import of a bundle from the dev node to the automation node

gikeylian
Level 2

Hello Everyone,

As the title suggests, I encountered an issue while importing a project bundle from the Development node to the Automation node. I have 2 datasets, one BigQuery and one GCS, as shown below.
flow.PNG
It worked well on the Development node, but when I tried to launch the steps from the bundle on the Automation node, it failed due to a connection timeout.
This is strange because the GCS connection works on the dev node, it has been added on the Automation node, and its connection test passes.

I tried to rebundle, this time including the 2 datasets in the "Configure content" section, but I also got a failure while trying to activate the datasets ("WARN_BUNDLE_ACTIVATE_DATASET_LOAD_FAILED").

Any help would be greatly appreciated. Does anyone know how to resolve this?

 

JordanB
Dataiker

Hi @gikeylian,

Are you using the Project Deployer to transfer the project bundle or are you manually importing the bundle zip file? 
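
If you are importing manually and at some point want to script it, here is a rough sketch of that workflow using the dataikuapi Python client (the URLs, API keys, project key and bundle id below are placeholders to adapt to your setup):

```python
import dataikuapi

# --- Design (dev) node: create the bundle and download the archive ---
design = dataikuapi.DSSClient("https://dss-design.example.com:11200", "DESIGN_API_KEY")   # placeholders
design_project = design.get_project("MYPROJECT")                                          # placeholder project key
design_project.export_bundle("v1")                                                         # bundle id of your choice
design_project.download_exported_bundle_archive_to_file("v1", "/tmp/MYPROJECT-v1.zip")

# --- Automation node: import the archive and activate the bundle ---
auto = dataikuapi.DSSClient("https://dss-automation.example.com:12200", "AUTO_API_KEY")   # placeholders

# First time only: create the project on the Automation node from the archive
auto.create_project_from_bundle_local_archive("/tmp/MYPROJECT-v1.zip")

# Subsequent versions: import into the existing project, then activate
auto_project = auto.get_project("MYPROJECT")
auto_project.import_bundle_from_archive("/tmp/MYPROJECT-v1.zip")
auto_project.activate_bundle("v1")
```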

Can you please take a screenshot of the error that you see?

Have you set up connection remapping? https://knowledge.dataiku.com/latest/data-sourcing/connections/tutorial-remap-connections.html

Please make sure that you have write permissions on the connection you are using on the target instance.
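
Also, if you have an admin API key on the Automation node, you can double-check that the connection the bundle's datasets point to actually exists there and look at its definition through the public API. A minimal sketch (the instance URL, key and connection name are placeholders):

```python
import dataikuapi

auto = dataikuapi.DSSClient("https://dss-automation.example.com:12200", "AUTO_API_KEY")  # placeholders

# List the connections defined on the Automation node (requires an admin API key)
connections = auto.list_connections()
print(sorted(connections.keys()))

# Inspect one connection's definition ("gcs_connection" is a placeholder name:
# use the name the bundle's datasets expect, or the one you will remap to)
conn = auto.get_connection("gcs_connection")
definition = conn.get_definition()
print(definition.get("type"))
# Usage/security settings (who may use the connection, whether writing is allowed)
# are part of this definition; the exact keys depend on your DSS version, so
# print the whole dict and check them there.
print(definition)
```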

Thanks,
Jordan

 

gikeylian
Level 2
Author

Hey @JordanB,

I'm importing the bundle manually into the Automation node.

Here is a screenshot of the error I receive when I try to activate the bundle.

 activation error dataiku.PNG

Also, about the remapping: I already set up the BigQuery and GCS connections in the settings and the connection test passes,
but I don't see any remapping options when uploading the zip file.
import_auto_dataiku.PNG

Do you think it could be a permissions issue, or is it something else entirely?

JordanB
Dataiker

Hi @gikeylian,

It's difficult to tell without seeing an instance diagnostic. If you will be using the same connection in the target instance, the permissions should remain the same. However, if you are remapping to a different connection definition, then yes, please check that you have write permissions as this could potentially be the issue. 

My recommendation would be to remove these BigQuery datasets from the configured content and rebundle (you can reconnect the datasets later in the target instance). You mentioned receiving some sort of connection timeout error when doing it this way; can you please send me a screenshot of this error?
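
For the "reconnect later" part: once the bundle is activated, you can repoint each dataset to the connection defined on the Automation node, either from the dataset's Settings screen or via the API. A rough sketch (project, dataset and connection names are placeholders, and the exact place where the connection name lives in the settings can vary by dataset type, so print the raw settings first to confirm):

```python
import dataikuapi

auto = dataikuapi.DSSClient("https://dss-automation.example.com:12200", "AUTO_API_KEY")  # placeholders
project = auto.get_project("MYPROJECT")                                                  # placeholder project key

dataset = project.get_dataset("my_gcs_dataset")   # placeholder dataset name
settings = dataset.get_settings()
raw = settings.get_raw()
print(raw)                                        # confirm where the connection name is stored

# For most connection-backed datasets the connection name sits under params
raw["params"]["connection"] = "gcs_connection"    # placeholder connection name on the Automation node
settings.save()
```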

Thanks!

Jordan

gikeylian
Level 2
Author

Sure @JordanB, I can provide you with an instance diagnostic.

I did rebundle the package without the datasets, and here are the errors I got:
build_error_2.PNG

Unfortunately, the full diagnostic is too big to be uploaded here.

JordanB
Dataiker

Hi @gikeylian,

I would not recommend sending a diagnostic on the community forum. Can you please send us a message at support@dataiku.com instead (https://doc.dataiku.com/dss/latest/troubleshooting/obtaining-support.html#editor-support-for-all-oth...)?

Please attach a diagnostic of the TARGET DSS instance (the instance where you are importing the bundle), generated under Administration > Maintenance > Diagnostic tool, and send the resulting file.
 
If the resulting file is too large for mail (> 15 MB), you can use https://dl.dataiku.com to send it to us. Please don't forget to send the link that is generated when you upload the file.
 
Thanks!
Jordan

 
