Using Dataiku
- I would like to save a Keras model in a folder. I cannot figure out how to save the weights of my models because I cannot find the correct filepath. The code needed to achieve this is: model.sa… Solution by Clément_Stenac
Hi,
Keras can only save H5 files to a regular filesystem, not to arbitrary storage locations.
You can either (recommended) switch your managed folder to a local folder, or:
* Save weights to a local file
* Then upload the model file to the managed folder using folder.upload_file(path_of_the_local_file, "path_in_the_managed_folder")
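The two steps above can be sketched as a small helper. This is illustrative only: the helper name is mine, `model` stands for any Keras model, `folder` for a `dataiku.Folder` handle, and the `upload_file` argument order follows the answer above.

```python
import os
import tempfile

def save_model_to_folder(model, folder, remote_path="/model_weights.h5"):
    """Save Keras weights to a local H5 file, then push it into a managed folder.

    `model` is any object exposing Keras' save_weights(); `folder` is assumed
    to be a dataiku.Folder handle with upload_file(local, remote) as above.
    """
    local_path = os.path.join(tempfile.gettempdir(), "model_weights.h5")
    model.save_weights(local_path)               # Keras writes the H5 file to local disk
    folder.upload_file(local_path, remote_path)  # then copy it into the managed folder
    return remote_path
```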
You'll need something like the following to retrieve the file in the scoring recipe:
with open("localfile", "wb") as out:
    with folder.get_download_stream("path-of-the-h5-file") as stream:  # "in" is a reserved word in Python, so use another name
        out.write(stream.read())
- Hello, could you give an example of how to read a CSV file with Python/pandas from an HDFS managed folder? Thanks, Milko. Last answer by Vinothkumar
To add more details: I tried to create a Python script.
1. First option: read the files available in the paths. I am able to read the file when it is in CSV format, but not when it is in Excel format. I am not sure why, but it looks like DSS mainly supports txt and csv. Code:
with handle1.get_download_stream('/dqs/DQS_Reference Study Sites.csv') as f:
    data = f.readlines()  # able to read CSV; with an Excel file in the same place, the content comes back as a kind of XML component
2. Second option: instead of reading the Excel file via Python, create an empty Excel file with the specific headers (like the original file) and place it in S3, so that the regular flow can run. Here I am able to write the empty dataframe (columns only) as a CSV file, but not as an Excel file. Code:
with handle1.get_writer(Filename) as writer:
    writer.write(network_df.to_csv().encode("utf-8"))  # works fine, but the same with to_excel does not
If either option works, that will solve my problem. Can someone help me here?
- I have some code where I need to run an HDFS command in Python to check if a file is present. See below for an example:
import subprocess
command = 'hdfs dfs -ls /sandbox'
ssh = subprocess.Popen(comman…Last answer by
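For a presence check like the one above, a common sketch (assuming the `hdfs` CLI is on the PATH; the helper name and the `runner` hook are mine) is to use the exit code of `hdfs dfs -test -e` rather than parsing `-ls` output:

```python
import subprocess

def hdfs_path_exists(path, runner=subprocess.run):
    """Return True if `path` exists in HDFS.

    `hdfs dfs -test -e` exits with status 0 when the path exists and
    non-zero otherwise, which is more robust than parsing `-ls` output.
    The `runner` parameter only exists so the call can be stubbed out.
    """
    result = runner(
        ["hdfs", "dfs", "-test", "-e", path],
        stdout=subprocess.DEVNULL,
        stderr=subprocess.DEVNULL,
    )
    return result.returncode == 0
```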
- Hi, I'm unable to get started with establishing connectivity between DSS and HiveServer2. HDFS integration works, and I have added the jars required for the Hive client to a folder owned by the dss user, and …Last answer by
- Hi, what defines the list of available engines for data-processing recipes such as Prepare? I have an HDFS dataset created by Impala, then a Prepare or Sync to another HDFS dataset, but only Spark/MR (…Solution by
- Hi, I am using HDFS datasets in my workflow which update on a daily basis, and I would like to find out whether these daily changes can be tracked by DSS and saved in a separate "delta" file through a…Last answer by
- Hi, I am evaluating DSS, so I installed it on my server and added a 2-week enterprise trial license. I am working against a kerberized Cloudera CDH 5.12 cluster. I am able to connect and browse HDFS, but Hiv…Solution by
- Hi, I get this message from a Hive recipe on a partitioned dataset stored on HDFS: validation failed: Cannot insert into target table because number/types are different "2018-02": Table inclause-0 has…Last answer by