This is a good question. There is no out-of-the-box feature for this today, but we are thinking about it.
Actually, it does work in a simple case since v2.0.0: when your data preparation lives in the script of an Analysis (called "Analyse"), deploying the model will also reproduce the script (i.e. all the processors).
If you have a more complex flow (with Python recipes, etc.), there is a workaround: stack your two sources into a single dataset, apply your flow of transformations once, then split it back into two datasets just before modelling.
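To make the stack/transform/split pattern concrete, here is a minimal sketch in plain pandas (outside DSS, with made-up column names), showing how tagging each row with its origin lets you recover the two datasets after a shared transformation:

```python
import pandas as pd

# Hypothetical stand-ins for the two source datasets.
source_a = pd.DataFrame({"x": [1, 2], "y": [10.0, 20.0]})
source_b = pd.DataFrame({"x": [3, 4], "y": [30.0, 40.0]})

# Stack: tag each row with its origin, then concatenate into one dataset.
source_a["origin"] = "a"
source_b["origin"] = "b"
stacked = pd.concat([source_a, source_b], ignore_index=True)

# Apply the shared transformation once, on the stacked data.
stacked["y_scaled"] = stacked["y"] / stacked["y"].max()

# Split back into two datasets just before modelling.
train = stacked[stacked["origin"] == "a"].drop(columns="origin")
score = stacked[stacked["origin"] == "b"].drop(columns="origin")
```

In DSS the same idea maps onto a Stack recipe, your transformation recipes, and a Split recipe on the origin column.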
Is there any update? I've downloaded DSS 4.3.1 to evaluate whether it will support a project with serial data tables that will be inserted as records in a database over time. I'd like to run a code recipe on each new 'record'. Note that a database record here means a file/table containing multiple columns and rows. Is this currently possible?
Isn't it possible to export your project (project home page > Actions > Export this project), reimport it under a different name (DSS home page > menu on the left > Import), and then change the input dataset?