Note: this post is just one recommendation following this broader idea.
Writing dataset aliases* in SQL scripts can be tedious, as they can get pretty long. Fortunately, when clicking on an input dataset in a [Spark] SQL recipe, Dataiku fills it in for the user in the script. However, the metastore value is hardcoded rather than written with variables.
1st improvement: it would be great if this path could be filled in using variables. For instance, at our organization, we have to specify 3 (!) prefixes before reaching the dataset's name. For both readability and portability reasons, we prefer writing the dataset's alias using variables (see the screenshot and the sketch below for an illustration).
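To make the request concrete, here is a minimal sketch of the difference (the prefixes and variable names are purely illustrative, not our real ones; it relies on Dataiku expanding ${...} variables inside SQL recipes):

```sql
-- What Dataiku currently inserts when clicking the input dataset:
-- the full metastore alias is hardcoded.
SELECT *
FROM env_prefix.zone_prefix.project_prefix_customers;

-- What we write by hand today, and would like to get auto-filled:
-- the same alias built from project/instance variables.
SELECT *
FROM ${envPrefix}.${zonePrefix}.${projectPrefix}_customers;
```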

2nd improvement: I would love an auto-fill feature when referencing an input dataset (e.g. when hitting the "Tab" key; more generally speaking, auto-completion is sorely missing in Dataiku SQL scripts).
Implementing these 2 features would save a LOT of time for users who prefer SQL to visual recipes.
* a dataset alias = the metastore key for that dataset (i.e. the key for which the metastore returns the physical path)
cc: @Marlan