Updating a Dataiku instance without losing the existing projects, code envs and plugins
Hi all,
I want to upgrade the Dataiku instance on my local server to the latest version, but I don't want to do a fresh install of Dataiku 11. Is there any way I can directly upgrade the server, keeping all the existing projects and plugins, in one go?
Currently, the only method I know of is the one mentioned here:
But when I do this, most of my plugins stop working or throw version errors, and I have to reinstall the required dependencies on the server again.
I hope there is an easier and more effective way of updating the instance without having to go back onto the server again and again.
Regards
Operating system used: Linux
Best Answer
Turribeach Dataiku DSS Core Designer, Neuron, Dataiku DSS Adv Designer, Registered, Neuron 2023 Posts: 2,113 Neuron
Hi, I am not sure which method you are following. I have upgraded my Dataiku instance many times and have yet to face any issues. I follow one of two approaches:
1) In-place upgrade: I first back up my whole DATA_DIR and store the backup in a safe location. Then I perform the steps "Unpack the new software", "Perform the upgrade" and "Post-upgrade tasks" (if required) from the page you linked. This option works by "installing" (unpacking, really) the new Dataiku binaries and then running the installer script to "repoint" your existing DATA_DIR to the new binaries. You must unpack the new Dataiku binaries into a different directory than your current binaries. For instance, if your 9.0.5 binaries are in /dataiku/dataiku-dss-9.0.5 and you want to upgrade to v10, you should put the v10 binaries in /dataiku/dataiku-dss-10.0.0
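The in-place upgrade can be sketched roughly as below. All paths, version numbers and the backup location are illustrative placeholders; the `installer.sh -d DATA_DIR -u` invocation follows the standard Dataiku upgrade procedure, but always check the release notes for your target version before running anything.

```shell
#!/bin/sh
# Sketch of an in-place DSS upgrade. All paths/versions are examples only.
set -e

OLD_BIN=/dataiku/dataiku-dss-9.0.5     # current binaries (left untouched)
NEW_BIN=/dataiku/dataiku-dss-10.0.0    # new binaries, in a DIFFERENT directory
DATA_DIR=/dataiku/dss_data             # existing DATA_DIR to be repointed

# 1. Stop DSS and back up the whole DATA_DIR to a safe location
"$DATA_DIR/bin/dss" stop
tar -czf /backup/dss_data_backup.tar.gz \
    -C "$(dirname "$DATA_DIR")" "$(basename "$DATA_DIR")"

# 2. Unpack the new software next to (never on top of) the old binaries
tar -xzf dataiku-dss-10.0.0.tar.gz -C /dataiku   # creates $NEW_BIN

# 3. Perform the upgrade: repoint the existing DATA_DIR to the new binaries
"$NEW_BIN/installer.sh" -d "$DATA_DIR" -u

# 4. Start DSS again and work through any post-upgrade tasks
"$DATA_DIR/bin/dss" start
```

If anything goes wrong, you still have the old binaries and the DATA_DIR backup, so you can roll back by repointing the restored DATA_DIR at the old binary directory.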
2) Full VM rebuild: I first back up my whole DATA_DIR and store the backup in a safe location. All our DSS instances are built automatically via our IaC pipeline, so all I need to do to upgrade one instance is download the new Dataiku binary, put it in our Artifactory server and set the IaC pipeline config to use the new Dataiku version. I then run the IaC pipeline, which destroys the existing DSS VMs and rebuilds them with all the required OS packages and the new Dataiku version. I then restore the DATA_DIR from the backup and finally run the "Perform the upgrade" and "Post-upgrade tasks" steps (if required) from the page you linked.
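For the rebuild path, the restore step on the freshly built VM might look something like the following sketch. Again, the paths, version and backup file name are assumptions for illustration, not the exact pipeline commands.

```shell
#!/bin/sh
# Sketch of restoring DATA_DIR onto a rebuilt VM, then upgrading it.
set -e

NEW_BIN=/dataiku/dataiku-dss-11.0.0   # version installed by the IaC pipeline
DATA_DIR=/dataiku/dss_data

# 1. Restore the DATA_DIR backup taken before the rebuild
tar -xzf /backup/dss_data_backup.tar.gz -C /dataiku

# 2. "Perform the upgrade": repoint the restored DATA_DIR to the new binaries
"$NEW_BIN/installer.sh" -d "$DATA_DIR" -u

# 3. Start DSS and carry out any post-upgrade tasks (e.g. rebuilding code envs)
"$DATA_DIR/bin/dss" start
```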
For both option 1) and 2) we perform testing on a different VM before doing the in-place upgrade or full VM rebuild on the actual VM we want to upgrade.
By far we do more upgrades using option 2) than 1), as we use the VM rebuild as an opportunity to update to the latest VM OS images from our Engineering team, pick up the latest OS-level package updates, and test all of this together with the new Dataiku version in a single step. This also gives us strong confidence that our backup/restore strategy is being tested and validated. It is a bit unnerving to destroy all your Production VMs and rebuild them from scratch; so many things could go wrong and leave you with nothing. But this is the new way of doing things in the Cloud.
Answers
BhaskarDevGoel Partner, Dataiku DSS Core Designer, Dataiku DSS ML Practitioner, Dataiku DSS Adv Designer, Registered, Frontrunner 2022 Participant Posts: 7 Partner
Thank you Turribeach for your response.
It worked!