I have two questions about reading and writing data from an Oracle DB:
1) Reading data - Whenever I use custom SQL to read data from a table, why doesn't the datatype defined in Oracle show up in my schema? How do I sync my Dataiku dataset's schema with the schema as it exists in the Oracle DB?
2) Writing data
a. Can I delete and append the data every time I write into an Oracle table?
b. Is it possible to use the same schema as in the Oracle DB when using Dataiku to load the Oracle table?
For your first question, please note that there is a difference between storage type and meaning in DSS: https://doc.dataiku.com/dss/latest/schemas/definitions.html
If there is a specific Oracle datatype that you are not seeing in DSS, could you please clarify which one?
As for writing into Oracle: by default, DSS writes into datasets using "Overwrite" mode, which means DSS first drops the target table and then recreates it. In some cases you can use "Append" mode when writing into an output dataset from a recipe (check the recipe's Inputs/Outputs tab); in that case no drop occurs and the data is simply appended, which may or may not be what you are looking for.
A couple of ideas that may be helpful. First, consider using a SQL Script recipe (rather than a SQL Query recipe) if you aren't already. It only works within one database, and you take on more work because you are responsible for dropping and creating tables, but you also get more control over the structure of the output tables. In particular, the datatypes of the output tables are controlled by the SQL itself, so if you do a "create table as", the resulting table will have the same datatypes as the source table.
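To make the "create table as" idea concrete, here is a minimal sketch using Python's built-in sqlite3 module as a stand-in for an Oracle connection (the `customers` table and its columns are made up for illustration). In a DSS SQL Script recipe against Oracle, the same CREATE TABLE ... AS SELECT statement would make the output table inherit the source table's datatypes:

```python
import sqlite3

# In-memory SQLite database stands in for the Oracle connection.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical source table with explicit column datatypes.
cur.execute("CREATE TABLE customers (id INTEGER, name TEXT, balance REAL)")
cur.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                [(1, "alice", 10.5), (2, "bob", 20.0)])

# "Create table as" copies both the data and the column structure;
# on Oracle the new table keeps the source table's datatypes.
cur.execute("DROP TABLE IF EXISTS customers_copy")
cur.execute("CREATE TABLE customers_copy AS SELECT * FROM customers")

rows = cur.execute(
    "SELECT id, name, balance FROM customers_copy ORDER BY id"
).fetchall()
print(rows)  # [(1, 'alice', 10.5), (2, 'bob', 20.0)]
```

Note that in a SQL Script recipe you own the DROP/CREATE logic yourself, which is exactly what gives you control over the output structure.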
Check the slide deck here https://community.dataiku.com/t5/Online-Events/Using-the-Dataiku-DSS-Python-API-for-Interfacing-with... for information on how datatypes are mapped between SQL and Dataiku, and on how to clear an existing table before inserting records into it (e.g., by executing a truncate on the table) within a Python recipe.
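As a rough sketch of that truncate-then-append pattern, here is a self-contained example, again using sqlite3 in place of an Oracle connection (table and column names are invented). SQLite has no TRUNCATE statement, so DELETE FROM plays that role here; against Oracle you would issue TRUNCATE TABLE instead, through whatever SQL execution mechanism your Python recipe uses:

```python
import sqlite3

# sqlite3 stands in for the Oracle connection managed by DSS.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?, ?)", [(1, 5.0), (2, 7.5)])

def clear_and_append(cur, new_rows):
    # Clear the existing records WITHOUT dropping the table, so the
    # table keeps the schema defined in the database. On Oracle this
    # would be: TRUNCATE TABLE orders (SQLite lacks TRUNCATE).
    cur.execute("DELETE FROM orders")
    cur.executemany("INSERT INTO orders VALUES (?, ?)", new_rows)

clear_and_append(cur, [(3, 9.9)])
rows = cur.execute("SELECT id, amount FROM orders").fetchall()
print(rows)  # [(3, 9.9)]
```

Because the table is never dropped, this gives you the "delete and append" behavior asked about in question 2a while preserving the Oracle-side schema from question 2b.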