Hi,
You would generally not do that in a recipe, which is part of the Flow, rerunnable, and meant essentially to process data. Start instead in a Python notebook; you can then automate it as a "Macro", as part of a DSS plugin.
This would use the DSS public API (https://doc.dataiku.com/dss/latest/api/public/client-python/index.html)
Something like this (pseudo-code):
import dataiku
client = dataiku.api_client()
project = client.get_project("PROJECT_NAME")
dataset = project.get_dataset("dataset_name")
current_schema = dataset.get_schema()
# current_schema is now a dict containing "columns", a list of dicts. Each of these dicts has at least "name" and "type"
# Build the new columns list (col_names and col_types hold the desired column names and storage types)
new_cols = []
for name, col_type in zip(col_names, col_types):
    new_cols.append({"name": name, "type": col_type})
# Replace the columns in the schema and save it back on the dataset
current_schema["columns"] = new_cols
dataset.set_schema(current_schema)
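For instance, with two hypothetical columns (names and types below are just illustrations, not taken from your project; the types must be DSS storage type names such as "string", "int" or "double"), the inputs and a quick check could look like:
col_names = ["customer_id", "amount"]   # hypothetical column names
col_types = ["string", "double"]        # hypothetical storage types
# ... run the snippet above, then re-read the schema to verify the change
print(dataset.get_schema()["columns"])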