[2020/05/20-18:53:56.602] [ActivityExecutor-38] [INFO] [dku] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - ----------------------------------------
[2020/05/20-18:53:56.602] [ActivityExecutor-38] [INFO] [dku] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - DSS startup: jek version:7.0.1
[2020/05/20-18:53:56.602] [ActivityExecutor-38] [INFO] [dku] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - DSS home: /home/dataiku/DATA_DIR
[2020/05/20-18:53:56.602] [ActivityExecutor-38] [INFO] [dku] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - OS: Linux 5.3.0-51-generic amd64 - Java: Private Build 1.8.0_252
[2020/05/20-18:53:56.602] [ActivityExecutor-38] [INFO] [dku.flow.jobrunner] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Allocated a slot for this activity!
[2020/05/20-18:53:56.603] [ActivityExecutor-38] [INFO] [dku.flow.jobrunner] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Run activity
[2020/05/20-18:53:56.618] [ActivityExecutor-38] [INFO] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Executing default pre-activity lifecycle hook
[2020/05/20-18:53:56.634] [ActivityExecutor-38] [INFO] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Checking if sources are ready
[2020/05/20-18:53:56.642] [ActivityExecutor-38] [DEBUG] [dku.dataset.hash] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Readiness cache miss for dataset__admin__CHURNALGO.Telco_Customer_Churn_prepared_distinct__NP
[2020/05/20-18:53:56.651] [ActivityExecutor-38] [INFO] [dku.datasets.file] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Building Filesystem handler config: {"connection":"filesystem_managed","path":"/CHURNALGO.Telco_Customer_Churn_prepared_distinct","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[2020/05/20-18:53:56.653] [ActivityExecutor-38] [INFO] [dku.datasets.ftplike] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Enumerating Filesystem dataset prefix=
[2020/05/20-18:53:56.663] [ActivityExecutor-38] [DEBUG] [dku.fs.local] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Enumerating local filesystem prefix=/
[2020/05/20-18:53:56.665] [ActivityExecutor-38] [DEBUG] [dku.fs.local] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Enumeration done nb_paths=1 size=155674
[2020/05/20-18:53:56.667] [ActivityExecutor-38] [INFO] [dku.dataset.hash] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Caching readiness for dataset__admin__CHURNALGO.Telco_Customer_Churn_prepared_distinct__NP s=READY h=J7OqQujbO0pUZDPXDnzrhQ
[2020/05/20-18:53:56.667] [ActivityExecutor-38] [INFO] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Checked source readiness CHURNALGO.Telco_Customer_Churn_prepared_distinct -> true
[2020/05/20-18:53:56.677] [ActivityExecutor-38] [INFO] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Checked source readiness CHURNALGO.8yvXZCIF -> true
[2020/05/20-18:53:56.678] [ActivityExecutor-38] [DEBUG] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Computing hashes to propagate BEFORE activity
[2020/05/20-18:53:56.681] [ActivityExecutor-38] [DEBUG] [dku.dataset.hash] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Readiness cache miss for dataset__admin__CHURNALGO.Telco_Customer_Churn_prepared_distinct__NP
[2020/05/20-18:53:56.682] [ActivityExecutor-38] [INFO] [dku.datasets.file] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Building Filesystem handler config: {"connection":"filesystem_managed","path":"/CHURNALGO.Telco_Customer_Churn_prepared_distinct","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[2020/05/20-18:53:56.682] [ActivityExecutor-38] [INFO] [dku.datasets.ftplike] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Enumerating Filesystem dataset prefix=
[2020/05/20-18:53:56.683] [ActivityExecutor-38] [DEBUG] [dku.fs.local] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Enumerating local filesystem prefix=/
[2020/05/20-18:53:56.683] [ActivityExecutor-38] [DEBUG] [dku.fs.local] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Enumeration done nb_paths=1 size=155674
[2020/05/20-18:53:56.684] [ActivityExecutor-38] [INFO] [dku.dataset.hash] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Caching readiness for dataset__admin__CHURNALGO.Telco_Customer_Churn_prepared_distinct__NP s=READY h=J7OqQujbO0pUZDPXDnzrhQ
[2020/05/20-18:53:56.690] [ActivityExecutor-38] [DEBUG] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Recorded 4 hashes before activity run
[2020/05/20-18:53:56.691] [ActivityExecutor-38] [DEBUG] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Building recipe runner of type
[2020/05/20-18:53:56.708] [ActivityExecutor-38] [DEBUG] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Recipe runner built, will use 1 thread(s)
[2020/05/20-18:53:56.708] [ActivityExecutor-38] [DEBUG] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Starting execution thread: com.dataiku.dip.analysis.ml.prediction.flow.EvaluationRecipeRunner@671c457f
[2020/05/20-18:53:56.709] [ActivityExecutor-38] [DEBUG] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Execution threads started, waiting for activity end
[2020/05/20-18:53:56.712] [FRT-43-FlowRunnable] [INFO] [dku.flow.activity] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Run thread for activity evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP starting
[2020/05/20-18:53:56.740] [FRT-43-FlowRunnable] [DEBUG] [dku.recipes.ml] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Model partition mode: UNPARTITIONED
[2020/05/20-18:53:56.788] [FRT-43-FlowRunnable] [INFO] [dku.shaker.data] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Need to compute sampleId before checking memory cache
[2020/05/20-18:53:56.797] [FRT-43-FlowRunnable] [DEBUG] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Script settings sampleMax=104857600 processedMax=-1
[2020/05/20-18:53:56.803] [FRT-43-FlowRunnable] [DEBUG] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Processing with sampleMax=104857600 processedMax=524288000
[2020/05/20-18:53:56.819] [FRT-43-FlowRunnable] [DEBUG] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Computed required sample id : 8d070a2de0d75dc7fa494a35f351d97a-NA-31c57f559ab672625afa057353f0de860--d751713988987e9331980363e24189ce
[2020/05/20-18:53:56.828] [FRT-43-FlowRunnable] [DEBUG] [dku.shaker.cache] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Shaker MemoryCache get on CHURNALGO.Telco_Customer_Churn_prepared_distinct key=ds=e047573100b76c0ed682a57e964e6689--scr=cdde123927ccaac6c74855afbf7c87f2--samp=8d070a2de0d75dc7fa494a35f351d97a-NA-31c57f559ab672625afa057353f0de860--d751713988987e9331980363e24189ce: miss
[2020/05/20-18:53:56.829] [FRT-43-FlowRunnable] [INFO] [dku.shaker.data] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Memory cache miss
[2020/05/20-18:53:56.831] [FRT-43-FlowRunnable] [INFO] [dku.datasets.file] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP dataset:CHURNALGO.Telco_Customer_Churn_prepared_distinct - Building Filesystem handler config: {"connection":"filesystem_managed","path":"/CHURNALGO.Telco_Customer_Churn_prepared_distinct","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[2020/05/20-18:53:56.837] [FRT-43-FlowRunnable] [INFO] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Getting sample: null
[2020/05/20-18:53:56.838] [FRT-43-FlowRunnable] [DEBUG] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Script settings sampleMax=104857600 processedMax=-1
[2020/05/20-18:53:56.838] [FRT-43-FlowRunnable] [DEBUG] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Processing with sampleMax=104857600 processedMax=524288000
[2020/05/20-18:53:56.842] [FRT-43-FlowRunnable] [INFO] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Computed required sample id : 8d070a2de0d75dc7fa494a35f351d97a-NA-31c57f559ab672625afa057353f0de860--d751713988987e9331980363e24189ce
[2020/05/20-18:53:56.843] [FRT-43-FlowRunnable] [INFO] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - cache hit
[2020/05/20-18:53:56.843] [FRT-43-FlowRunnable] [INFO] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Opening sample 8d070a2de0d75dc7fa494a35f351d97a-NA-31c57f559ab672625afa057353f0de860--d751713988987e9331980363e24189ce fileSize=813740
[2020/05/20-18:53:56.879] [FRT-43-FlowRunnable] [INFO] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Reading sample done, read 7043 rows: MemTable stats nrows=7043 ncols=21 allocChars=658515 usefulChars=658515 estimatedMem=2767888
[2020/05/20-18:53:56.887] [FRT-43-FlowRunnable] [INFO] [dku.table] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - MemTable stats nrows=7043 ncols=21 allocChars=658515 usefulChars=658515 estimatedMem=2767888
[2020/05/20-18:53:56.888] [FRT-43-FlowRunnable] [INFO] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Serialized warnings com.dataiku.dip.warnings.WarningsContext@640978d6 -> {"warnings":{},"totalCount":0}
[2020/05/20-18:53:56.889] [FRT-43-FlowRunnable] [INFO] [dku.shaker.types.inferrer] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Starting ANALYSIS type inference on 7043 rows
[2020/05/20-18:53:56.974] [analysis-typeinfer-2] [INFO] [dku.meanings.service] - Loading list of meanings
[2020/05/20-18:53:57.009] [analysis-typeinfer-3] [INFO] [dku.shaker.types.inferrer] - DONE: VERIF column=InternetService notEmpty=2 shuf=1 main=2 schema=0
[2020/05/20-18:53:57.092] [FRT-43-FlowRunnable] [INFO] [dku.shaker.types.inferrer] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Type inference elapsed=203 detect=20 verify=1 colCacheHit=0 colCacheMiss=20 ht=250 pt=270 dt=382
[2020/05/20-18:53:57.095] [FRT-43-FlowRunnable] [INFO] [dku.shaker.data] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - DataService done, initTime=6 runTime=51 inferTime=207
[2020/05/20-18:53:57.106] [FRT-43-FlowRunnable] [DEBUG] [dku.shaker.cache] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Shaker MemoryCache put on CHURNALGO.Telco_Customer_Churn_prepared_distinct key=ds=e047573100b76c0ed682a57e964e6689--scr=cdde123927ccaac6c74855afbf7c87f2--samp=8d070a2de0d75dc7fa494a35f351d97a-NA-31c57f559ab672625afa057353f0de860--d751713988987e9331980363e24189ce
[2020/05/20-18:53:57.112] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column customerID meaning=Text fail=0
[2020/05/20-18:53:57.115] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column gender meaning=Gender fail=0
[2020/05/20-18:53:57.115] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column SeniorCitizen meaning=LongMeaning fail=0
[2020/05/20-18:53:57.115] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column Partner meaning=Boolean fail=0
[2020/05/20-18:53:57.116] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column Dependents meaning=Boolean fail=0
[2020/05/20-18:53:57.116] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column tenure meaning=LongMeaning fail=0
[2020/05/20-18:53:57.116] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column PhoneService meaning=Boolean fail=0
[2020/05/20-18:53:57.116] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column MultipleLines_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.117] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column InternetService meaning=Text fail=0
[2020/05/20-18:53:57.117] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column OnlineSecurity_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.117] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column OnlineBackup_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.117] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column DeviceProtection_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.118] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column TechSupport_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.118] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column StreamingTV_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.118] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column StreamingMovies_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.119] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column Contract meaning=Text fail=0
[2020/05/20-18:53:57.119] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column PaperlessBilling meaning=Boolean fail=0
[2020/05/20-18:53:57.119] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column PaymentMethod meaning=Text fail=0
[2020/05/20-18:53:57.119] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column MonthlyCharges meaning=DoubleMeaning fail=0
[2020/05/20-18:53:57.120] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column TotalCharges meaning=DoubleMeaning fail=0
[2020/05/20-18:53:57.120] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column Churn meaning=Boolean fail=0
[2020/05/20-18:53:57.154] [FRT-43-FlowRunnable] [INFO] [dku.shaker.data] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Need to compute sampleId before checking memory cache
[2020/05/20-18:53:57.155] [FRT-43-FlowRunnable] [DEBUG] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Script settings sampleMax=104857600 processedMax=-1
[2020/05/20-18:53:57.155] [FRT-43-FlowRunnable] [DEBUG] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Processing with sampleMax=104857600 processedMax=524288000
[2020/05/20-18:53:57.157] [FRT-43-FlowRunnable] [DEBUG] [dip.shaker.runner] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Computed required sample id : 8d070a2de0d75dc7fa494a35f351d97a-NA-31c57f559ab672625afa057353f0de860--d751713988987e9331980363e24189ce
[2020/05/20-18:53:57.160] [FRT-43-FlowRunnable] [DEBUG] [dku.shaker.cache] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Shaker MemoryCache get on CHURNALGO.Telco_Customer_Churn_prepared_distinct key=ds=e047573100b76c0ed682a57e964e6689--scr=cdde123927ccaac6c74855afbf7c87f2--samp=8d070a2de0d75dc7fa494a35f351d97a-NA-31c57f559ab672625afa057353f0de860--d751713988987e9331980363e24189ce: hit
[2020/05/20-18:53:57.161] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column customerID meaning=Text fail=0
[2020/05/20-18:53:57.161] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column gender meaning=Gender fail=0
[2020/05/20-18:53:57.162] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column SeniorCitizen meaning=LongMeaning fail=0
[2020/05/20-18:53:57.162] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column Partner meaning=Boolean fail=0
[2020/05/20-18:53:57.162] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column Dependents meaning=Boolean fail=0
[2020/05/20-18:53:57.163] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column tenure meaning=LongMeaning fail=0
[2020/05/20-18:53:57.163] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column PhoneService meaning=Boolean fail=0
[2020/05/20-18:53:57.164] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column MultipleLines_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.164] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column InternetService meaning=Text fail=0
[2020/05/20-18:53:57.164] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column OnlineSecurity_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.165] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column OnlineBackup_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.165] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column DeviceProtection_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.166] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column TechSupport_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.166] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column StreamingTV_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.166] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column StreamingMovies_0 meaning=Boolean fail=0
[2020/05/20-18:53:57.167] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column Contract meaning=Text fail=0
[2020/05/20-18:53:57.167] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column PaperlessBilling meaning=Boolean fail=0
[2020/05/20-18:53:57.168] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column PaymentMethod meaning=Text fail=0
[2020/05/20-18:53:57.168] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column MonthlyCharges meaning=DoubleMeaning fail=0
[2020/05/20-18:53:57.168] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column TotalCharges meaning=DoubleMeaning fail=0
[2020/05/20-18:53:57.169] [FRT-43-FlowRunnable] [INFO] [dku.shaker.schema] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Column Churn meaning=Boolean fail=0
[2020/05/20-18:53:57.171] [FRT-43-FlowRunnable] [INFO] [dku.recipes.prediction.evaluation] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Will predict with threshold = 0.3415067297141797
[2020/05/20-18:53:57.497] [FRT-43-FlowRunnable] [INFO] [dku.shaker.resources] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Serialized 0 Python processor(s)
[2020/05/20-18:53:57.500] [FRT-43-FlowRunnable] [INFO] [dku.shaker.resources] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Job required file dku.shaker.yarn.python.bins -> /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/yarn-python-bins0uRw67ec.json
[2020/05/20-18:53:57.501] [FRT-43-FlowRunnable] [INFO] [dku.shaker.resources] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Job required file dku.shaker.contrib -> /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-hadoop/contributed_python_processorskB6B5ITd.json
[2020/05/20-18:53:57.501] [FRT-43-FlowRunnable] [INFO] [dku.shaker.resources] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Job required file python-lib-instance -> /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-ITk9gUDV.zip.nodecompression
[2020/05/20-18:53:57.501] [FRT-43-FlowRunnable] [INFO] [dku.shaker.resources] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Job required file python-package-dataikuapi -> /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-YfRNgeLJ.zip.nodecompression
[2020/05/20-18:53:57.501] [FRT-43-FlowRunnable] [INFO] [dku.shaker.resources] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Job required file python-package-dataiku -> /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-BHQI2Oqk.zip.nodecompression
[2020/05/20-18:53:57.505] [FRT-43-FlowRunnable] [INFO] [dip.cluster.selector] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Build cluster settings hadoop=__builtin__ k8s=__builtin__ authSource=USER_FROM_UI
[2020/05/20-18:53:57.546] [FRT-43-FlowRunnable] [INFO] [dku.spark.activity] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - configure activity helper activityProjectKey=CHURNALGO activity={ "activityId": "evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP", "recipes": [ { "projectKey": "CHURNALGO", "name": "evaluate_on_Telco_Customer_Churn_prepared_distinct_1", "type": "evaluation" } ], "sources": [ { "projectKey": "CHURNALGO", "datasetName": "Telco_Customer_Churn_prepared_distinct", "partitionIds": [ "NP" ] }, { "projectKey": "CHURNALGO", "datasetName": "8yvXZCIF", "partitionIds": [ "NP" ] } ], "targets": [ { "projectKey": "CHURNALGO", "datasetName": "predict_tcs", "partitionId": "NP" }, { "projectKey": "CHURNALGO", "datasetName": "metric_tcs", "partitionId": "NP" } ], "requiredReason": { "type": "NON_RECURSIVE_FORCED_BUILD", "previousRecordTime": 0 }, "dependencies": [], "recipeProjectKey": "CHURNALGO", "recipeName": "evaluate_on_Telco_Customer_Churn_prepared_distinct_1", "recipeType": "evaluation" } type=class com.dataiku.dip.dataflow.JobActivity
[2020/05/20-18:53:57.546] [FRT-43-FlowRunnable] [INFO] [dku.spark.activity] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Resulting relevant projects: ["CHURNALGO"]
[2020/05/20-18:53:57.547] [FRT-43-FlowRunnable] [INFO] [dip.cluster.selector] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Build cluster settings hadoop=__builtin__ k8s=__builtin__ authSource=USER_FROM_UI
[2020/05/20-18:53:57.548] [FRT-43-FlowRunnable] [INFO] [dku.spark.exec] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Building Spark exec env impersonationEnabled=false user=null
[2020/05/20-18:53:57.557] [FRT-43-FlowRunnable] [INFO] [dku.logging.spark] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Copy additional Spark logging from /home/dataiku/dataiku-dss-7.0.1/resources/logging/dku-spark-log4j.properties
[2020/05/20-18:53:57.559] [FRT-43-FlowRunnable] [INFO] [dku.logging.spark] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Copy additional Spark logging from /home/dataiku/DATA_DIR/resources/logging/dku-spark-log4j.properties
[2020/05/20-18:53:57.562] [FRT-43-FlowRunnable] [INFO] [dku.datasets.file] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Building Filesystem handler config: {"connection":"filesystem_managed","path":"CHURNALGO/predict_tcs","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[2020/05/20-18:53:57.563] [FRT-43-FlowRunnable] [INFO] [dku.datasets.file] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Building Filesystem handler config: {"connection":"filesystem_managed","path":"CHURNALGO/metric_tcs","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[2020/05/20-18:53:57.564] [FRT-43-FlowRunnable] [WARN] [dku.fs.local] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - File does not exist: /home/dataiku/DATA_DIR/managed_datasets/CHURNALGO/metric_tcs
[2020/05/20-18:53:57.564] [FRT-43-FlowRunnable] [INFO] [dku.datasets.file] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Building Filesystem handler config: {"connection":"filesystem_managed","path":"/CHURNALGO.Telco_Customer_Churn_prepared_distinct","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[2020/05/20-18:53:57.569] [FRT-43-FlowRunnable] [WARN] [dku.code.projectLibs] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - External libraries file not found: /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/localconfig/projects/CHURNALGO/lib/external-libraries.json
[2020/05/20-18:53:57.571] [FRT-43-FlowRunnable] [INFO] [dku.code.projectLibs] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - EXTERNAL LIBS FROM CHURNALGO is {"gitReferences":{},"pythonPath":["python"],"rsrcPath":["R"],"importLibrariesFromProjects":[]}
[2020/05/20-18:53:57.571] [FRT-43-FlowRunnable] [INFO] [dku.code.projectLibs] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - chunkFolder is /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/localconfig/projects/CHURNALGO/lib/R
[2020/05/20-18:53:57.600] [FRT-43-FlowRunnable] [INFO] [xxx] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - RSRC PATH: ["/home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/localconfig/projects/CHURNALGO/lib/R"]
[2020/05/20-18:53:57.602] [FRT-43-FlowRunnable] [INFO] [dku.spark.helper] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Store environment variables in local file for executions via spark-submit
[2020/05/20-18:53:57.611] [FRT-43-FlowRunnable] [INFO] [dku.recipes.spark.base] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Execute spark-submit: /home/dataiku/spark-2//bin/spark-submit --conf spark.app.name=DSS (predict): evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP --conf spark.dku.limitedLogs={"filePartitioner.noMatch":100,"s3.ignoredPath":100,"s3.ignoredFile":100} --conf spark.executor.memory=2400m --conf spark.dku.ml.hiveDb= --conf spark.dku.ml.repartitionNonHDFS=1 --conf spark.dku.ml.useGlobalMetastore=false --conf spark.yarn.executor.memoryOverhead=600 --conf inferSchema=true --conf spark.sql.shuffle.partitions=40 --conf spark.dku.ml.preparedDF.storageLevel=MEMORY_ONLY_SER --jars /home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-httpclient-3.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/poi-ooxml-schemas-4.0.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-core-2.8.10.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-fileupload-1.3.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/pdfbox-2.0.9.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/mail-1.4.7.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jaxen-1.1.6.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/json-simple-1.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/java-cup-10k.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/asm-5.0.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/compress-lzf-1.0.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-opengis-19.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/curvesapi-1.05.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/accessors-smart-1.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsr305-1.3.9.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xml-resolver-1.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.eclipse.wst.xml.xpath2.processor-2.1.100.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-main-19.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/JavaEWAH-0.7.9.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/api-all-1.0.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/guava-18.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xmlsec-1.5.7.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/activation-1.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-math3-3.6.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/livy-client-http-0.5.0-incubating.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/HdrHistogram-2.1.6.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.apache.servicemix.bundles.dom4j-1.6.1_5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/postgresql-9.1-901.jdbc4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsr166e-1.1.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/twitter4j-stream-4.0.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/adal4j-1.6.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/openws-1.5.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/poi-ooxml-4.0.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsch-0.1.54.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/joda-time-2.9.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/mongodb-driver-3.2.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-shapefile-19.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-databind-2.8.11.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/oauth2-oidc-sdk-6.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/juniversalchardet-2.3.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/geoip2-0.4.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jgridshift-1.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-compress-1.18.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-dataformat-yaml-2.8.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/metrics-core-3.1.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/bcprov-jdk15on-1.51.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/avro-1.7.7.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/antlr-runtime-3.5.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/ejml-core-0.32.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-geojson-19.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-dataformat-smile-2.8.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/asm-commons-5.0.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/java-image-scaling-0.8.6.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-context-3.2.18.RELEASE.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/ant-1.6.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/slf4j-api-1.7.25.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jts-1.13.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xlsx-streamer-2.1.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jython-standalone-2.7-b3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/twitter4j-core-4.0.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/GeographicLib-Java-1.44.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/metrics-jvm-3.1.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/asm-tree-5.0.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/fontbox-2.0.9.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-core-asl-1.9.13.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.eclipse.jgit-4.5.4.201711221230-r.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-expression-3.2.18.RELEASE.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spatial4j-0.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/snappy-java-1.0.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/lang-tag-1.4.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/poi-4.0.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-logging-1.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/mina-core-2.0.18.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/ooxml-schemas-1.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-pool-1.6.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-data-19.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/opensaml-2.6.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/esapi-2.0.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-collections4-4.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jdom-1.1.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/velocity-1.7.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/json-smart-2.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/paranamer-2.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/javax.mail-1.6.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/lz4-1.2.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xml-apis-1.4.01.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/icu4j-4.6.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-metadata-19.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/json-path-2.1.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/trove4j-3.0.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsoup-1.6.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/hbc-core-2.0.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/livy-api-0.5.0-incubating.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-beans-3.2.18.RELEASE.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/bson-3.2.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/sqlite-jdbc-3.8.11.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-annotations-2.8.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-collections-3.2.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/opencsv-2.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/securesm-1.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-io-2.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xerces2-xsd11-2.11.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-cli-1.3.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xercesImpl-2.9.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/antlr4-runtime-4.7.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-lang3-3.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/mongodb-driver-core-3.2.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xmlbeans-3.0.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xmltooling-1.4.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jdom2-2.0.6.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-codec-1.11.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/quartz-2.2.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jai_core-1.1.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-aop-3.2.18.RELEASE.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/hbc-twitter4j-2.0.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/cassandra-driver-core-3.0.8.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/c3p0-0.9.1.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsr-275-1.0-beta-2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.apache.servicemix.bundles.antlr-2.7.7_5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/hppc-0.7.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jts-core-1.14.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/not-yet-commons-ssl-0.3.9.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jcip-annotations-1.0-1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-dataformat-cbor-2.8.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/maxminddb-0.2.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-lang-2.6.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gson-2.8.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-net-3.3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/snakeyaml-1.15.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/hsqldb-2.3.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/filters-2.0.235.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-referencing-19.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/ejml-ddense-0.32.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/nimbus-jose-jwt-8.10.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/urlrewritefilter-4.0.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/lucene-analyzers-common-5.5.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xpp3-1.1.4c.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-mapper-asl-1.9.13.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/t-digest-3.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-api-19.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/joauth-6.0.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-webmvc-3.2.18.RELEASE.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/cglib-nodep-2.2.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/css-selectors-1.0.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/metrics-json-3.1.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-epsg-hsql-19.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-core-3.2.18.RELEASE.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-web-3.2.18.RELEASE.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.apache.servicemix.bundles.xpp3-1.1.4c_7.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/log4j-1.2.17.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-servlet-9.1.6.v20160112.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-client-9.1.6.v20160112.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/jodd-3.3.8.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/json-20151123.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/ini4j-0.5.4.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-server-9.1.6.v20160112.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/slf4j-log4j12-1.7.21.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/jetty-io-9.1.6.v20160112.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/freemarker-2.3.28.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/javax.annotation-api-1.2.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-common-9.1.6.v20160112.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/gson-2.8.5.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/javax.servlet-api-3.1.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/h2-1.4.195.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-api-9.1.6.v20160112.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/slf4j-api-1.7.21.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/spark-run_2.11/livy-scala-api_2.11-0.5.0-incubating.jar,/home/dataiku/dataiku-dss-7.0.1/lib/ivy/spark-run_2.11/util-eval_2.11-6.27.0.jar,/home/dataiku/dataiku-dss-7.0.1/lib/shadelib/dss-shadelib-assembly.jar,/home/dataiku/dataiku-dss-7.0.1/dist/dataiku-core.jar,/home/dataiku/dataiku-dss-7.0.1/dist/dataiku-app-platform.jar,/home/dataiku/dataiku-dss-7.0.1/dist/dataiku-dss-core.jar,/home/dataiku/dataiku-dss-7.0.1/dist/dataiku-dip.jar,/home/dataiku/dataiku-dss-7.0.1/dist/dataiku-scoring.jar,/home/dataiku/dataiku-dss-7.0.1/dist/dss-spark-public_2.11-2.0.jar,/home/dataiku/dataiku-dss-7.0.1/dist/dss-spark-main_2.11-5.1.jar,/home/dataiku/dataiku-dss-7.0.1/lib/third/LevenshteinAutomatonTest.jar,/home/dataiku/dataiku-dss-7.0.1/lib/third/MDAG.jar,/home/dataiku/dataiku-dss-7.0.1/lib/third/sassyreader.jar,/home/dataiku/dataiku-dss-7.0.1/lib/third/jcurses.jar,/home/dataiku/dataiku-dss-7.0.1/lib/third/junit-ext-1.0-junit-4.5-RC-3.jar,/home/dataiku/dataiku-dss-7.0.1/lib/third/spnego-7.0.jar --files /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/yarn-python-bins0uRw67ec.json,/home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-hadoop/contributed_python_processorskB6B5ITd.json,/home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-ITk9gUDV.zip.nodecompression,/home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-YfRNgeLJ.zip.nodecompression,/home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-BHQI2Oqk.zip.nodecompression,/home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/dku-spark-log4j.properties --class com.dataiku.dip.spark.MLLibEvaluationJob /home/dataiku/dataiku-dss-7.0.1/dist/dss-spark-main_2.11-5.1.jar /home/dataiku/DATA_DIR/saved_models/CHURNALGO/8yvXZCIF/versions/1589980789114 CHURNALGO Telco_Customer_Churn_prepared_distinct predict_tcs metric_tcs /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/evaluation-recipe/pyrunMn0DtWFJ3i41
[2020/05/20-18:53:57.611] [FRT-43-FlowRunnable] [INFO] [xxx] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - RSRC PATH: ["/home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/localconfig/projects/CHURNALGO/lib/R"]
[2020/05/20-18:53:57.612] [FRT-43-FlowRunnable] [INFO] [dku.recipes.code.base] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Run command insecurely, from user dataiku
[2020/05/20-18:53:57.613] [FRT-43-FlowRunnable] [INFO] [dku.security.process] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Starting process (regular)
[2020/05/20-18:53:57.615] [FRT-43-FlowRunnable] [INFO] [dku.security.process] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Process started with pid=26988
[2020/05/20-18:53:57.619] [FRT-43-FlowRunnable] [INFO] [dku.recipes.code.base] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Process reads from nothing
[2020/05/20-18:53:58.488] [null-err-50] [INFO] [dku.utils] - Warning: Ignoring non-Spark config property: inferSchema
[2020/05/20-18:53:58.673] [null-err-50] [INFO] [dku.utils] - 20/05/20 18:53:58 WARN Utils: Your hostname, admin1 resolves to a loopback address: 127.0.1.1; using 10.139.4.79 instead (on interface ens160)
[2020/05/20-18:53:58.674] [null-err-50] [INFO] [dku.utils] - 20/05/20 18:53:58 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
[2020/05/20-18:53:58.699] [null-err-50] [INFO] [dku.utils] - 20/05/20 18:53:58 WARN SparkConf: The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
[2020/05/20-18:53:59.058] [null-err-50] [INFO] [dku.utils] - 20/05/20 18:53:59 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[2020/05/20-18:53:59.715] [null-err-50] [INFO] [dku.utils] - log4j:WARN No appenders could be found for logger (dku.spark.entrypoint).
[2020/05/20-18:53:59.716] [null-err-50] [INFO] [dku.utils] - log4j:WARN Please initialize the log4j system properly.
[2020/05/20-18:53:59.716] [null-err-50] [INFO] [dku.utils] - log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
[2020/05/20-18:53:59.719] [null-err-50] [INFO] [dku.utils] - Initialize DKU app logging
[2020/05/20-18:53:59.720] [null-err-50] [INFO] [dku.utils] - Logging resource not found ...
[2020/05/20-18:53:59.724] [null-out-49] [INFO] [dku.utils] - 1051 [main] INFO dku.logging.spark - Configuring additional Spark logging from dku-spark-log4j.properties
[2020/05/20-18:53:59.725] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:53:59.725] [main] [INFO] [dku.logging.spark] - No dku-spark-log4j.user.properties to configure logging from
[2020/05/20-18:53:59.727] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:53:59.726] [main] [DEBUG] [dku.remoterun.envhelper] - Fetching remote run env def
[2020/05/20-18:53:59.748] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:53:59.747] [main] [INFO] [dku.remoterun.fileshelper] - Look for resource or file remote-run-env-def.json
[2020/05/20-18:53:59.748] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:53:59.748] [main] [INFO] [dku.remoterun.fileshelper] - > found in cwd
[2020/05/20-18:53:59.811] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:53:59.811] [main] [INFO] [dku.spark.entrypoint] - Running in folder /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/.
[2020/05/20-18:53:59.812] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:53:59.811] [main] [INFO] [dku.spark.entrypoint] - Driver working directory /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/.
[2020/05/20-18:53:59.812] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:53:59.811] [main] [INFO] [dku.spark.entrypoint] - yarn-cluster=false databricks=false [2020/05/20-18:54:00.578] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:00.577] [main] [DEBUG] [dku] - Lambda prepare package step not loaded: com.dataiku.dip.scheduler.steps.PrepareLambdaPackageStepRunner [2020/05/20-18:54:00.578] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:00.578] [main] [DEBUG] [dku] - API Deployer step not loaded: com.dataiku.dip.scheduler.steps.UpdateAPIDeployerDeploymentStepRunner [2020/05/20-18:54:00.695] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:00.695] [main] [INFO] [dku.spark.entrypoint] - Use real dip home but no conf [2020/05/20-18:54:00.865] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:00.864] [main] [INFO] [org.springframework.context.annotation.AnnotationConfigApplicationContext] - Refreshing org.springframework.context.annotation.AnnotationConfigApplicationContext@784c5ef5: startup date [Wed May 20 18:54:00 IST 2020]; root of context hierarchy [2020/05/20-18:54:01.012] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.012] [main] [INFO] [org.springframework.beans.factory.annotation.AutowiredAnnotationBeanPostProcessor] - JSR-330 'javax.inject.Inject' annotation found and supported for autowiring [2020/05/20-18:54:01.029] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.028] [main] [INFO] [org.springframework.beans.factory.support.DefaultListableBeanFactory] - Pre-instantiating singletons in org.springframework.beans.factory.support.DefaultListableBeanFactory@4aedaf61: defining beans [org.springframework.context.annotation.internalConfigurationAnnotationProcessor,org.springframework.context.annotation.internalAutowiredAnnotationProcessor,org.springframework.context.annotation.internalRequiredAnnotationProcessor,org.springframework.context.annotation.internalCommonAnnotationProcessor,springUtils,noopTransactionService,noopSymetricCryptoService,manualVariablesService,noopImpersonationResolverService,manualConnectionsDAO,sparkDriverNoopPluginSettingsDAO,regularPluginsRegistryService,pluginRequirementService,notebooksTemplatesService,reportsTemplateService,webAppsTemplatesService,customCodeRecipesService,customPythonDatasetsService,customJavaDatasetsService,customPythonProbesService,customPythonFormatsService,customJavaFormatsService,customPythonChecksService,customSQLProbesService,customPythonExportersService,customJavaExportersService,customWebAppsService,pythonPluginStepsService,pythonPluginTriggersService,customPythonRunnablesService,customJavaRunnablesService,customPythonPredictionAlgoService,customPythonFSProvidersService,customJavaFSProvidersService,customJavaDialectsService,customJythonProcessorsService,pythonPluginClustersService,pluginParameterSetsService,customPolicyHooksRegistry,customFieldsService,customJavaPolicyHooksService,customExpositionsService,pluginsLoadService,org.springframework.context.annotation.ConfigurationClassPostProcessor.importAwareProcessor]; root of factory hierarchy [2020/05/20-18:54:01.270] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.270] [main] [INFO] [dku.plugins] - Init plugin service [2020/05/20-18:54:01.272] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.272] [main] [INFO] [dku.plugins] - Loading plugins [2020/05/20-18:54:01.274] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.274] [main] [INFO] [dku.plugins] - Loading plugin from 
/home/dataiku/DATA_DIR/plugins/installed/deeplearning-image-cpu [2020/05/20-18:54:01.297] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.297] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/deeplearning-image-cpu/custom-recipes/deeplearning-image-retrain [2020/05/20-18:54:01.305] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.304] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/deeplearning-image-cpu/custom-recipes/deeplearning-image-extract [2020/05/20-18:54:01.305] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.305] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/deeplearning-image-cpu/custom-recipes/deeplearning-image-score [2020/05/20-18:54:01.307] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.306] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/deeplearning-image-cpu/python-runnables/download-models [2020/05/20-18:54:01.315] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.314] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/DATA_DIR/plugins/installed/deeplearning-image-cpu/python-runnables/download-models/long-description.md [2020/05/20-18:54:01.317] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.317] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/deeplearning-image-cpu/python-runnables/create-api-service [2020/05/20-18:54:01.319] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.318] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/DATA_DIR/plugins/installed/deeplearning-image-cpu/python-runnables/create-api-service/long-description.md [2020/05/20-18:54:01.331] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.330] [main] [INFO] [dku.plugins] - Loading plugin from /home/dataiku/DATA_DIR/plugins/installed/gdpr [2020/05/20-18:54:01.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.332] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/gdpr/python-runnables/gdpr-ds-check-up [2020/05/20-18:54:01.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.332] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/DATA_DIR/plugins/installed/gdpr/python-runnables/gdpr-ds-check-up/long-description.md [2020/05/20-18:54:01.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.332] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/gdpr/python-runnables/gdpr-audit [2020/05/20-18:54:01.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.333] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/DATA_DIR/plugins/installed/gdpr/python-runnables/gdpr-audit/long-description.md [2020/05/20-18:54:01.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.333] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/gdpr/custom-fields/gdpr-fields [2020/05/20-18:54:01.337] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.337] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/gdpr/java-policy-hooks/gdpr-hooks [2020/05/20-18:54:01.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.341] [main] [INFO] [dku.plugins] 
- Loading plugin from /home/dataiku/DATA_DIR/plugins/installed/model-drift [2020/05/20-18:54:01.342] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.342] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/model-drift/webapps/model-drift-view [2020/05/20-18:54:01.355] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.355] [main] [INFO] [dku.plugins] - Loading plugin from /home/dataiku/DATA_DIR/plugins/installed/object-detection-cpu [2020/05/20-18:54:01.356] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.356] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-cpu/custom-recipes/object-detection-draw-bounding-boxes-cpu [2020/05/20-18:54:01.356] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.356] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-cpu/custom-recipes/object-detection-retrain-object-detection-model-cpu [2020/05/20-18:54:01.357] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.357] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-cpu/custom-recipes/object-detection-detect-objects-cpu [2020/05/20-18:54:01.357] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.357] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-cpu/custom-recipes/object-detection-detect-video-cpu [2020/05/20-18:54:01.358] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.357] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-cpu/python-runnables/object-detection-download-models-cpu [2020/05/20-18:54:01.358] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.358] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/DATA_DIR/plugins/installed/object-detection-cpu/python-runnables/object-detection-download-models-cpu/long-description.md [2020/05/20-18:54:01.359] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.358] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-cpu/python-runnables/create-api-service [2020/05/20-18:54:01.361] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.358] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/DATA_DIR/plugins/installed/object-detection-cpu/python-runnables/create-api-service/long-description.md [2020/05/20-18:54:01.361] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.361] [main] [INFO] [dku.plugins] - Loading plugin from /home/dataiku/DATA_DIR/plugins/installed/object-detection-gpu [2020/05/20-18:54:01.362] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.362] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-gpu/custom-recipes/object-detection-retrain-object-detection-model-gpu [2020/05/20-18:54:01.363] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.362] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-gpu/custom-recipes/object-detection-detect-video-gpu [2020/05/20-18:54:01.363] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.362] [main] [DEBUG] [dku.plugins.code] - Loading custom code 
element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-gpu/custom-recipes/object-detection-draw-bounding-boxes-gpu [2020/05/20-18:54:01.364] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.362] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-gpu/custom-recipes/object-detection-detect-objects-gpu [2020/05/20-18:54:01.364] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.363] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-gpu/python-runnables/object-detection-download-models-gpu [2020/05/20-18:54:01.364] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.363] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/DATA_DIR/plugins/installed/object-detection-gpu/python-runnables/object-detection-download-models-gpu/long-description.md [2020/05/20-18:54:01.365] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.363] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/object-detection-gpu/python-runnables/create-api-service [2020/05/20-18:54:01.365] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.364] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/DATA_DIR/plugins/installed/object-detection-gpu/python-runnables/create-api-service/long-description.md [2020/05/20-18:54:01.367] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.366] [main] [INFO] [dku.plugins] - Loading plugin from /home/dataiku/DATA_DIR/plugins/installed/speech-to-text-cpu [2020/05/20-18:54:01.368] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.367] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/speech-to-text-cpu/custom-recipes/speech-to-text-speech-to-text-cpu [2020/05/20-18:54:01.369] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.368] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/DATA_DIR/plugins/installed/speech-to-text-cpu/python-runnables/speech-to-text-download-models-cpu [2020/05/20-18:54:01.369] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.369] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/DATA_DIR/plugins/installed/speech-to-text-cpu/python-runnables/speech-to-text-download-models-cpu/long-description.md [2020/05/20-18:54:01.375] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.371] [main] [INFO] [dku.plugins] - Loading plugin from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros [2020/05/20-18:54:01.376] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.372] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/clear-job-logs [2020/05/20-18:54:01.376] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.372] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/clear-job-logs/long-description.md [2020/05/20-18:54:01.377] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.372] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/clear-scenario-logs [2020/05/20-18:54:01.377] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.373] [main] [INFO] [dku.plugins.macros] - Looking for 
/home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/clear-scenario-logs/long-description.md [2020/05/20-18:54:01.377] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.373] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/project-connections-audit [2020/05/20-18:54:01.377] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.373] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/project-connections-audit/long-description.md [2020/05/20-18:54:01.377] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.373] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/kill-jupyter-sessions [2020/05/20-18:54:01.378] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.373] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/kill-jupyter-sessions/long-description.md [2020/05/20-18:54:01.378] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.373] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/delete-datasets-by-tags [2020/05/20-18:54:01.378] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.374] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/delete-datasets-by-tags/long-description.md [2020/05/20-18:54:01.378] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.374] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/remove-old-exports [2020/05/20-18:54:01.379] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.374] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/remove-old-exports/long-description.md [2020/05/20-18:54:01.379] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.374] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/remove-old-container-images [2020/05/20-18:54:01.379] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.375] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/remove-old-container-images/long-description.md [2020/05/20-18:54:01.380] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.375] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/list-datasets-using-connection [2020/05/20-18:54:01.380] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.376] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/python-runnables/list-datasets-using-connection/long-description.md [2020/05/20-18:54:01.380] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.376] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/java-runnables/drop-pipeline-views 
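Editor's note: the [dku.plugins] / [dku.plugins.code] records above trace DSS scanning each installed plugin for its component folders (custom-recipes, python-runnables, java-runnables, webapps, custom-fields, java-policy-hooks) and probing each runnable for an optional long-description.md. A minimal sketch of that on-disk layout, assuming only what the paths in the log show (illustrative Python, not DSS's actual loader):

    import os

    # Hypothetical sketch: enumerate plugin components the way the
    # [dku.plugins.code] lines suggest -- one directory per component
    # type, one subdirectory per element, with an optional
    # long-description.md next to each runnable.
    COMPONENT_DIRS = ["custom-recipes", "python-runnables", "java-runnables",
                      "webapps", "custom-fields", "java-policy-hooks"]

    def list_plugin_elements(plugin_root):
        for component in COMPONENT_DIRS:
            component_path = os.path.join(plugin_root, component)
            if not os.path.isdir(component_path):
                continue
            for element in sorted(os.listdir(component_path)):
                long_desc = os.path.join(component_path, element,
                                         "long-description.md")
                yield component, element, os.path.exists(long_desc)

    for component, element, has_desc in list_plugin_elements(
            "/home/dataiku/DATA_DIR/plugins/installed/deeplearning-image-cpu"):
        print(component, element, "long-description.md" if has_desc else "-")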
[2020/05/20-18:54:01.381] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.378] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/java-runnables/clear-internal-dbs
[2020/05/20-18:54:01.381] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.378] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/builtin-macros/java-runnables/backup-internal-databases
[2020/05/20-18:54:01.381] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.381] [main] [INFO] [dku.plugins] - Loading plugin from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/colorbrewer-palettes
[2020/05/20-18:54:01.384] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.384] [main] [INFO] [dku.plugins] - Loading plugin from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/local-r-dev-setup
[2020/05/20-18:54:01.384] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.384] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/local-r-dev-setup/python-runnables/create-dataiku-intercom-conf
[2020/05/20-18:54:01.385] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.384] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/local-r-dev-setup/python-runnables/create-dataiku-intercom-conf/long-description.md
[2020/05/20-18:54:01.385] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.385] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/local-r-dev-setup/python-runnables/install-dataiku-r-package
[2020/05/20-18:54:01.386] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.385] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/local-r-dev-setup/python-runnables/install-dataiku-r-package/long-description.md
[2020/05/20-18:54:01.386] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.385] [main] [DEBUG] [dku.plugins.code] - Loading custom code element from /home/dataiku/dataiku-dss-7.0.1/resources/plugins/local-r-dev-setup/python-runnables/make-rstudio-project
[2020/05/20-18:54:01.386] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.385] [main] [INFO] [dku.plugins.macros] - Looking for /home/dataiku/dataiku-dss-7.0.1/resources/plugins/local-r-dev-setup/python-runnables/make-rstudio-project/long-description.md
[2020/05/20-18:54:01.388] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.388] [main] [DEBUG] [dku.plugins] - Done init plugins service
[2020/05/20-18:54:01.394] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.394] [main] [WARN] [org.apache.spark.SparkConf] - The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
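Editor's note: the SparkConf WARN above (repeated below each time a SparkConf is built) fires because the job's Spark settings still carry spark.yarn.executor.memoryOverhead, which Spark 2.3 superseded with spark.executor.memoryOverhead. A minimal sketch of the rename (the value is a placeholder, not taken from this job):

    from pyspark import SparkConf

    conf = SparkConf()
    # Old key, deprecated since Spark 2.3; keeping it set is what
    # triggers the WARN above:
    #   conf.set("spark.yarn.executor.memoryOverhead", "512")
    # Replacement key (the value "512m" is a placeholder):
    conf.set("spark.executor.memoryOverhead", "512m")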
[2020/05/20-18:54:01.401] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.400] [main] [INFO] [dku.spark.entrypoint] - Loading exec env
[2020/05/20-18:54:01.401] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.401] [main] [INFO] [dku.spark.exec] - Loading exec env with impersonation enabled=false user=null
[2020/05/20-18:54:01.416] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.416] [main] [WARN] [org.apache.spark.SparkConf] - The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
[2020/05/20-18:54:01.418] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.418] [main] [INFO] [dku.spark.mllib] - Creating Spark context
[2020/05/20-18:54:01.431] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.430] [main] [WARN] [org.apache.spark.SparkConf] - The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead.
[2020/05/20-18:54:01.446] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.446] [main] [INFO] [org.apache.spark.SparkContext] - Running Spark version 2.4.5
[2020/05/20-18:54:01.472] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.472] [main] [INFO] [org.apache.spark.SparkContext] - Submitted application: DSS (predict): evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP
[2020/05/20-18:54:01.527] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.527] [main] [INFO] [org.apache.spark.SecurityManager] - Changing view acls to: dataiku
[2020/05/20-18:54:01.528] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.527] [main] [INFO] [org.apache.spark.SecurityManager] - Changing modify acls to: dataiku
[2020/05/20-18:54:01.528] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.527] [main] [INFO] [org.apache.spark.SecurityManager] - Changing view acls groups to:
[2020/05/20-18:54:01.528] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.527] [main] [INFO] [org.apache.spark.SecurityManager] - Changing modify acls groups to:
[2020/05/20-18:54:01.529] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.527] [main] [INFO] [org.apache.spark.SecurityManager] - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(dataiku); groups with view permissions: Set(); users with modify permissions: Set(dataiku); groups with modify permissions: Set()
[2020/05/20-18:54:01.844] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.843] [main] [INFO] [org.apache.spark.util.Utils] - Successfully started service 'sparkDriver' on port 41629.
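Editor's note: "Submitted application: DSS (predict): ..." shows that DSS names the Spark application after the flow activity, and the SecurityManager records reflect default ACL behavior (no spark.authenticate; view/modify ACLs fall back to the submitting OS user, here dataiku). A sketch of an equivalently named context under those defaults, not DSS's actual invocation:

    from pyspark import SparkConf, SparkContext

    # Sketch only: with the ACL-related keys left at their defaults,
    # Spark logs "Changing view acls to: <user>" for the submitting
    # user, exactly as in the SecurityManager lines above.
    conf = SparkConf().setAppName(
        "DSS (predict): evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP")
    sc = SparkContext(conf=conf)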
[2020/05/20-18:54:01.878] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.877] [main] [INFO] [org.apache.spark.SparkEnv] - Registering MapOutputTracker
[2020/05/20-18:54:01.904] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.904] [main] [INFO] [org.apache.spark.SparkEnv] - Registering BlockManagerMaster
[2020/05/20-18:54:01.908] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.908] [main] [INFO] [org.apache.spark.storage.BlockManagerMasterEndpoint] - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
[2020/05/20-18:54:01.908] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.908] [main] [INFO] [org.apache.spark.storage.BlockManagerMasterEndpoint] - BlockManagerMasterEndpoint up
[2020/05/20-18:54:01.922] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.922] [main] [INFO] [org.apache.spark.storage.DiskBlockManager] - Created local directory at /tmp/blockmgr-6db7c0c0-6ae0-4064-b09b-197a8a99cec8
[2020/05/20-18:54:01.949] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.949] [main] [INFO] [org.apache.spark.storage.memory.MemoryStore] - MemoryStore started with capacity 366.3 MB
[2020/05/20-18:54:01.968] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:01.968] [main] [INFO] [org.apache.spark.SparkEnv] - Registering OutputCommitCoordinator
[2020/05/20-18:54:02.067] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.066] [main] [INFO] [org.spark_project.jetty.util.log] - Logging initialized @4215ms
[2020/05/20-18:54:02.151] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.151] [main] [INFO] [org.spark_project.jetty.server.Server] - jetty-9.3.z-SNAPSHOT, build timestamp: unknown, git hash: unknown
[2020/05/20-18:54:02.177] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.176] [main] [INFO] [org.spark_project.jetty.server.Server] - Started @4327ms
[2020/05/20-18:54:02.204] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.204] [main] [INFO] [org.spark_project.jetty.server.AbstractConnector] - Started ServerConnector@10f7c8ee{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
[2020/05/20-18:54:02.205] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.204] [main] [INFO] [org.apache.spark.util.Utils] - Successfully started service 'SparkUI' on port 4040.
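Editor's note: under Spark 2.x unified memory management, the MemoryStore capacity logged above is approximately (usable heap - 300 MB reserved) x spark.memory.fraction (default 0.6), so 366.3 MB is consistent with a driver heap of roughly 1 GB (about 910 MB usable). The heap figure is inferred, not present in the log; a back-of-envelope check:

    # Assumes Spark 2.x defaults: spark.memory.fraction = 0.6 and a
    # fixed 300 MB reserved region. usable_heap_mb is an assumption.
    RESERVED_MB = 300
    MEMORY_FRACTION = 0.6
    usable_heap_mb = 910.5  # roughly what -Xmx1g leaves usable (assumed)
    capacity_mb = (usable_heap_mb - RESERVED_MB) * MEMORY_FRACTION
    print(round(capacity_mb, 1))  # 366.3, matching the MemoryStore line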
[2020/05/20-18:54:02.240] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.239] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@4ed9f7b1{/jobs,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.240] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.240] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@39c85c1a{/jobs/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.241] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.241] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@2b148329{/jobs/job,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.242] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.242] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@c889805{/jobs/job/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.243] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.243] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@77db3d02{/stages,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.243] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.243] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@78d9f51b{/stages/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.244] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.244] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@23ed382c{/stages/stage,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.246] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.245] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@72b6832e{/stages/stage/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.246] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.246] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@3850e90c{/stages/pool,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.247] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.247] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@3d9f5016{/stages/pool/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.248] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.248] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@7e91ed74{/storage,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.249] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.249] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@3878be7b{/storage/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.250] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.249] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@7df28f1{/storage/rdd,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.250] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.250] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@4087c7fc{/storage/rdd/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.251] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.251] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started 
o.s.j.s.ServletContextHandler@318c68d5{/environment,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.252] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.252] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@4cfa83f9{/environment/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.253] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.253] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@321ca237{/executors,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.254] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.254] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@4c5a2baf{/executors/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.255] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.254] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@1ec22831{/executors/threadDump,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.255] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.255] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@63f855b{/executors/threadDump/json,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.265] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.265] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@516592b1{/static,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.266] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.266] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@3913f206{/,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.267] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.267] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@3e8799f{/api,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.268] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.268] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@41ccb3b9{/jobs/job/kill,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.269] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.269] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@53d9826f{/stages/stage/kill,null,AVAILABLE,@Spark} [2020/05/20-18:54:02.272] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.272] [main] [INFO] [org.apache.spark.ui.SparkUI] - Bound SparkUI to 0.0.0.0, and started at http://10.139.4.79:4040 [2020/05/20-18:54:02.302] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.302] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-httpclient-3.1.jar at spark://10.139.4.79:41629/jars/commons-httpclient-3.1.jar with timestamp 1589981042301 [2020/05/20-18:54:02.302] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.302] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/poi-ooxml-schemas-4.0.1.jar at spark://10.139.4.79:41629/jars/poi-ooxml-schemas-4.0.1.jar with timestamp 1589981042302 [2020/05/20-18:54:02.303] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.302] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR 
file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-core-2.8.10.jar at spark://10.139.4.79:41629/jars/jackson-core-2.8.10.jar with timestamp 1589981042302 [2020/05/20-18:54:02.303] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.303] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-fileupload-1.3.3.jar at spark://10.139.4.79:41629/jars/commons-fileupload-1.3.3.jar with timestamp 1589981042303 [2020/05/20-18:54:02.303] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.303] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/pdfbox-2.0.9.jar at spark://10.139.4.79:41629/jars/pdfbox-2.0.9.jar with timestamp 1589981042303 [2020/05/20-18:54:02.304] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.303] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/mail-1.4.7.jar at spark://10.139.4.79:41629/jars/mail-1.4.7.jar with timestamp 1589981042303 [2020/05/20-18:54:02.304] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.303] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jaxen-1.1.6.jar at spark://10.139.4.79:41629/jars/jaxen-1.1.6.jar with timestamp 1589981042303 [2020/05/20-18:54:02.305] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.304] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/json-simple-1.1.jar at spark://10.139.4.79:41629/jars/json-simple-1.1.jar with timestamp 1589981042304 [2020/05/20-18:54:02.305] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.304] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/java-cup-10k.jar at spark://10.139.4.79:41629/jars/java-cup-10k.jar with timestamp 1589981042304 [2020/05/20-18:54:02.306] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.304] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/asm-5.0.4.jar at spark://10.139.4.79:41629/jars/asm-5.0.4.jar with timestamp 1589981042304 [2020/05/20-18:54:02.306] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.304] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/compress-lzf-1.0.2.jar at spark://10.139.4.79:41629/jars/compress-lzf-1.0.2.jar with timestamp 1589981042304 [2020/05/20-18:54:02.306] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.304] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-opengis-19.1.jar at spark://10.139.4.79:41629/jars/gt-opengis-19.1.jar with timestamp 1589981042304 [2020/05/20-18:54:02.307] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.305] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/curvesapi-1.05.jar at spark://10.139.4.79:41629/jars/curvesapi-1.05.jar with timestamp 1589981042305 [2020/05/20-18:54:02.307] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.305] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/accessors-smart-1.2.jar at spark://10.139.4.79:41629/jars/accessors-smart-1.2.jar with timestamp 1589981042305 
[2020/05/20-18:54:02.308] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.305] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsr305-1.3.9.jar at spark://10.139.4.79:41629/jars/jsr305-1.3.9.jar with timestamp 1589981042305 [2020/05/20-18:54:02.308] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.305] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xml-resolver-1.2.jar at spark://10.139.4.79:41629/jars/xml-resolver-1.2.jar with timestamp 1589981042305 [2020/05/20-18:54:02.308] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.305] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.eclipse.wst.xml.xpath2.processor-2.1.100.jar at spark://10.139.4.79:41629/jars/org.eclipse.wst.xml.xpath2.processor-2.1.100.jar with timestamp 1589981042305 [2020/05/20-18:54:02.309] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.306] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-main-19.1.jar at spark://10.139.4.79:41629/jars/gt-main-19.1.jar with timestamp 1589981042306 [2020/05/20-18:54:02.309] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.306] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/JavaEWAH-0.7.9.jar at spark://10.139.4.79:41629/jars/JavaEWAH-0.7.9.jar with timestamp 1589981042306 [2020/05/20-18:54:02.309] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.306] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/api-all-1.0.2.jar at spark://10.139.4.79:41629/jars/api-all-1.0.2.jar with timestamp 1589981042306 [2020/05/20-18:54:02.309] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.306] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/guava-18.0.jar at spark://10.139.4.79:41629/jars/guava-18.0.jar with timestamp 1589981042306 [2020/05/20-18:54:02.310] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.306] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xmlsec-1.5.7.jar at spark://10.139.4.79:41629/jars/xmlsec-1.5.7.jar with timestamp 1589981042306 [2020/05/20-18:54:02.310] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.307] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/activation-1.1.jar at spark://10.139.4.79:41629/jars/activation-1.1.jar with timestamp 1589981042306 [2020/05/20-18:54:02.310] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.307] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-math3-3.6.1.jar at spark://10.139.4.79:41629/jars/commons-math3-3.6.1.jar with timestamp 1589981042307 [2020/05/20-18:54:02.311] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.307] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/livy-client-http-0.5.0-incubating.jar at spark://10.139.4.79:41629/jars/livy-client-http-0.5.0-incubating.jar with timestamp 1589981042307 [2020/05/20-18:54:02.311] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.307] [main] [INFO] 
[org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/HdrHistogram-2.1.6.jar at spark://10.139.4.79:41629/jars/HdrHistogram-2.1.6.jar with timestamp 1589981042307 [2020/05/20-18:54:02.312] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.307] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.apache.servicemix.bundles.dom4j-1.6.1_5.jar at spark://10.139.4.79:41629/jars/org.apache.servicemix.bundles.dom4j-1.6.1_5.jar with timestamp 1589981042307 [2020/05/20-18:54:02.312] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.308] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/postgresql-9.1-901.jdbc4.jar at spark://10.139.4.79:41629/jars/postgresql-9.1-901.jdbc4.jar with timestamp 1589981042308 [2020/05/20-18:54:02.318] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.308] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsr166e-1.1.0.jar at spark://10.139.4.79:41629/jars/jsr166e-1.1.0.jar with timestamp 1589981042308 [2020/05/20-18:54:02.318] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.308] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/twitter4j-stream-4.0.1.jar at spark://10.139.4.79:41629/jars/twitter4j-stream-4.0.1.jar with timestamp 1589981042308 [2020/05/20-18:54:02.318] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.308] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/adal4j-1.6.4.jar at spark://10.139.4.79:41629/jars/adal4j-1.6.4.jar with timestamp 1589981042308 [2020/05/20-18:54:02.318] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.308] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/openws-1.5.4.jar at spark://10.139.4.79:41629/jars/openws-1.5.4.jar with timestamp 1589981042308 [2020/05/20-18:54:02.319] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.308] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/poi-ooxml-4.0.1.jar at spark://10.139.4.79:41629/jars/poi-ooxml-4.0.1.jar with timestamp 1589981042308 [2020/05/20-18:54:02.319] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.309] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsch-0.1.54.jar at spark://10.139.4.79:41629/jars/jsch-0.1.54.jar with timestamp 1589981042309 [2020/05/20-18:54:02.319] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.309] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/joda-time-2.9.5.jar at spark://10.139.4.79:41629/jars/joda-time-2.9.5.jar with timestamp 1589981042309 [2020/05/20-18:54:02.319] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.309] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/mongodb-driver-3.2.2.jar at spark://10.139.4.79:41629/jars/mongodb-driver-3.2.2.jar with timestamp 1589981042309 [2020/05/20-18:54:02.319] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.309] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR 
file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-shapefile-19.1.jar at spark://10.139.4.79:41629/jars/gt-shapefile-19.1.jar with timestamp 1589981042309 [2020/05/20-18:54:02.319] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.309] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-databind-2.8.11.2.jar at spark://10.139.4.79:41629/jars/jackson-databind-2.8.11.2.jar with timestamp 1589981042309 [2020/05/20-18:54:02.320] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.310] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/oauth2-oidc-sdk-6.5.jar at spark://10.139.4.79:41629/jars/oauth2-oidc-sdk-6.5.jar with timestamp 1589981042310 [2020/05/20-18:54:02.320] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.310] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/juniversalchardet-2.3.0.jar at spark://10.139.4.79:41629/jars/juniversalchardet-2.3.0.jar with timestamp 1589981042310 [2020/05/20-18:54:02.320] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.310] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/geoip2-0.4.1.jar at spark://10.139.4.79:41629/jars/geoip2-0.4.1.jar with timestamp 1589981042310 [2020/05/20-18:54:02.320] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.310] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jgridshift-1.0.jar at spark://10.139.4.79:41629/jars/jgridshift-1.0.jar with timestamp 1589981042310 [2020/05/20-18:54:02.320] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.310] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-compress-1.18.jar at spark://10.139.4.79:41629/jars/commons-compress-1.18.jar with timestamp 1589981042310 [2020/05/20-18:54:02.320] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.311] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-dataformat-yaml-2.8.1.jar at spark://10.139.4.79:41629/jars/jackson-dataformat-yaml-2.8.1.jar with timestamp 1589981042311 [2020/05/20-18:54:02.320] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.311] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/metrics-core-3.1.5.jar at spark://10.139.4.79:41629/jars/metrics-core-3.1.5.jar with timestamp 1589981042311 [2020/05/20-18:54:02.321] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.311] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/bcprov-jdk15on-1.51.jar at spark://10.139.4.79:41629/jars/bcprov-jdk15on-1.51.jar with timestamp 1589981042311 [2020/05/20-18:54:02.321] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.311] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/avro-1.7.7.jar at spark://10.139.4.79:41629/jars/avro-1.7.7.jar with timestamp 1589981042311 [2020/05/20-18:54:02.321] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.311] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/antlr-runtime-3.5.2.jar at 
spark://10.139.4.79:41629/jars/antlr-runtime-3.5.2.jar with timestamp 1589981042311 [2020/05/20-18:54:02.321] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.312] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/ejml-core-0.32.jar at spark://10.139.4.79:41629/jars/ejml-core-0.32.jar with timestamp 1589981042312 [2020/05/20-18:54:02.321] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.312] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-geojson-19.1.jar at spark://10.139.4.79:41629/jars/gt-geojson-19.1.jar with timestamp 1589981042312 [2020/05/20-18:54:02.321] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.312] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-dataformat-smile-2.8.1.jar at spark://10.139.4.79:41629/jars/jackson-dataformat-smile-2.8.1.jar with timestamp 1589981042312 [2020/05/20-18:54:02.322] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.312] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/asm-commons-5.0.3.jar at spark://10.139.4.79:41629/jars/asm-commons-5.0.3.jar with timestamp 1589981042312 [2020/05/20-18:54:02.322] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.312] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/java-image-scaling-0.8.6.jar at spark://10.139.4.79:41629/jars/java-image-scaling-0.8.6.jar with timestamp 1589981042312 [2020/05/20-18:54:02.322] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.313] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-context-3.2.18.RELEASE.jar at spark://10.139.4.79:41629/jars/spring-context-3.2.18.RELEASE.jar with timestamp 1589981042312 [2020/05/20-18:54:02.322] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.313] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/ant-1.6.5.jar at spark://10.139.4.79:41629/jars/ant-1.6.5.jar with timestamp 1589981042313 [2020/05/20-18:54:02.322] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.313] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/slf4j-api-1.7.25.jar at spark://10.139.4.79:41629/jars/slf4j-api-1.7.25.jar with timestamp 1589981042313 [2020/05/20-18:54:02.322] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.313] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jts-1.13.jar at spark://10.139.4.79:41629/jars/jts-1.13.jar with timestamp 1589981042313 [2020/05/20-18:54:02.322] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.313] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xlsx-streamer-2.1.0.jar at spark://10.139.4.79:41629/jars/xlsx-streamer-2.1.0.jar with timestamp 1589981042313 [2020/05/20-18:54:02.323] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.314] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jython-standalone-2.7-b3.jar at spark://10.139.4.79:41629/jars/jython-standalone-2.7-b3.jar with timestamp 1589981042314 
[2020/05/20-18:54:02.323] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.314] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/twitter4j-core-4.0.1.jar at spark://10.139.4.79:41629/jars/twitter4j-core-4.0.1.jar with timestamp 1589981042314 [2020/05/20-18:54:02.323] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.314] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/GeographicLib-Java-1.44.jar at spark://10.139.4.79:41629/jars/GeographicLib-Java-1.44.jar with timestamp 1589981042314 [2020/05/20-18:54:02.323] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.314] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/metrics-jvm-3.1.5.jar at spark://10.139.4.79:41629/jars/metrics-jvm-3.1.5.jar with timestamp 1589981042314 [2020/05/20-18:54:02.323] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.314] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/asm-tree-5.0.3.jar at spark://10.139.4.79:41629/jars/asm-tree-5.0.3.jar with timestamp 1589981042314 [2020/05/20-18:54:02.323] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.315] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/fontbox-2.0.9.jar at spark://10.139.4.79:41629/jars/fontbox-2.0.9.jar with timestamp 1589981042314 [2020/05/20-18:54:02.323] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.315] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-core-asl-1.9.13.jar at spark://10.139.4.79:41629/jars/jackson-core-asl-1.9.13.jar with timestamp 1589981042315 [2020/05/20-18:54:02.324] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.315] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.eclipse.jgit-4.5.4.201711221230-r.jar at spark://10.139.4.79:41629/jars/org.eclipse.jgit-4.5.4.201711221230-r.jar with timestamp 1589981042315 [2020/05/20-18:54:02.324] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.315] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-expression-3.2.18.RELEASE.jar at spark://10.139.4.79:41629/jars/spring-expression-3.2.18.RELEASE.jar with timestamp 1589981042315 [2020/05/20-18:54:02.324] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.315] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spatial4j-0.5.jar at spark://10.139.4.79:41629/jars/spatial4j-0.5.jar with timestamp 1589981042315 [2020/05/20-18:54:02.324] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.316] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/snappy-java-1.0.5.jar at spark://10.139.4.79:41629/jars/snappy-java-1.0.5.jar with timestamp 1589981042315 [2020/05/20-18:54:02.324] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.316] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/lang-tag-1.4.4.jar at spark://10.139.4.79:41629/jars/lang-tag-1.4.4.jar with timestamp 1589981042316 [2020/05/20-18:54:02.324] [null-err-50] [INFO] [dku.utils] - 
[2020/05/20-18:54:02.316] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/poi-4.0.1.jar at spark://10.139.4.79:41629/jars/poi-4.0.1.jar with timestamp 1589981042316 [2020/05/20-18:54:02.325] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.316] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-logging-1.2.jar at spark://10.139.4.79:41629/jars/commons-logging-1.2.jar with timestamp 1589981042316 [2020/05/20-18:54:02.325] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.316] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/mina-core-2.0.18.jar at spark://10.139.4.79:41629/jars/mina-core-2.0.18.jar with timestamp 1589981042316 [2020/05/20-18:54:02.325] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.316] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/ooxml-schemas-1.1.jar at spark://10.139.4.79:41629/jars/ooxml-schemas-1.1.jar with timestamp 1589981042316 [2020/05/20-18:54:02.325] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.317] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-pool-1.6.jar at spark://10.139.4.79:41629/jars/commons-pool-1.6.jar with timestamp 1589981042317 [2020/05/20-18:54:02.325] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.317] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-data-19.1.jar at spark://10.139.4.79:41629/jars/gt-data-19.1.jar with timestamp 1589981042317 [2020/05/20-18:54:02.325] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.317] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/opensaml-2.6.4.jar at spark://10.139.4.79:41629/jars/opensaml-2.6.4.jar with timestamp 1589981042317 [2020/05/20-18:54:02.325] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.317] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/esapi-2.0.1.jar at spark://10.139.4.79:41629/jars/esapi-2.0.1.jar with timestamp 1589981042317 [2020/05/20-18:54:02.326] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.317] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-collections4-4.2.jar at spark://10.139.4.79:41629/jars/commons-collections4-4.2.jar with timestamp 1589981042317 [2020/05/20-18:54:02.326] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.318] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jdom-1.1.3.jar at spark://10.139.4.79:41629/jars/jdom-1.1.3.jar with timestamp 1589981042318 [2020/05/20-18:54:02.326] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.318] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/velocity-1.7.jar at spark://10.139.4.79:41629/jars/velocity-1.7.jar with timestamp 1589981042318 [2020/05/20-18:54:02.326] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.318] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/json-smart-2.3.jar at 
spark://10.139.4.79:41629/jars/json-smart-2.3.jar with timestamp 1589981042318 [2020/05/20-18:54:02.326] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.318] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/paranamer-2.3.jar at spark://10.139.4.79:41629/jars/paranamer-2.3.jar with timestamp 1589981042318 [2020/05/20-18:54:02.327] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.318] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/javax.mail-1.6.1.jar at spark://10.139.4.79:41629/jars/javax.mail-1.6.1.jar with timestamp 1589981042318 [2020/05/20-18:54:02.327] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.318] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/lz4-1.2.0.jar at spark://10.139.4.79:41629/jars/lz4-1.2.0.jar with timestamp 1589981042318 [2020/05/20-18:54:02.327] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.319] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xml-apis-1.4.01.jar at spark://10.139.4.79:41629/jars/xml-apis-1.4.01.jar with timestamp 1589981042319 [2020/05/20-18:54:02.327] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.319] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/icu4j-4.6.jar at spark://10.139.4.79:41629/jars/icu4j-4.6.jar with timestamp 1589981042319 [2020/05/20-18:54:02.327] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.319] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-metadata-19.1.jar at spark://10.139.4.79:41629/jars/gt-metadata-19.1.jar with timestamp 1589981042319 [2020/05/20-18:54:02.327] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.319] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/json-path-2.1.0.jar at spark://10.139.4.79:41629/jars/json-path-2.1.0.jar with timestamp 1589981042319 [2020/05/20-18:54:02.327] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.319] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/trove4j-3.0.3.jar at spark://10.139.4.79:41629/jars/trove4j-3.0.3.jar with timestamp 1589981042319 [2020/05/20-18:54:02.328] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.319] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsoup-1.6.3.jar at spark://10.139.4.79:41629/jars/jsoup-1.6.3.jar with timestamp 1589981042319 [2020/05/20-18:54:02.328] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.320] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/hbc-core-2.0.2.jar at spark://10.139.4.79:41629/jars/hbc-core-2.0.2.jar with timestamp 1589981042320 [2020/05/20-18:54:02.328] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.320] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/livy-api-0.5.0-incubating.jar at spark://10.139.4.79:41629/jars/livy-api-0.5.0-incubating.jar with timestamp 1589981042320 [2020/05/20-18:54:02.328] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.320] [main] [INFO] 
[org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-beans-3.2.18.RELEASE.jar at spark://10.139.4.79:41629/jars/spring-beans-3.2.18.RELEASE.jar with timestamp 1589981042320 [2020/05/20-18:54:02.328] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.320] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/bson-3.2.2.jar at spark://10.139.4.79:41629/jars/bson-3.2.2.jar with timestamp 1589981042320 [2020/05/20-18:54:02.328] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.320] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/sqlite-jdbc-3.8.11.2.jar at spark://10.139.4.79:41629/jars/sqlite-jdbc-3.8.11.2.jar with timestamp 1589981042320 [2020/05/20-18:54:02.329] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.321] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-annotations-2.8.0.jar at spark://10.139.4.79:41629/jars/jackson-annotations-2.8.0.jar with timestamp 1589981042321 [2020/05/20-18:54:02.329] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.321] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-collections-3.2.2.jar at spark://10.139.4.79:41629/jars/commons-collections-3.2.2.jar with timestamp 1589981042321 [2020/05/20-18:54:02.329] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.321] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/opencsv-2.3.jar at spark://10.139.4.79:41629/jars/opencsv-2.3.jar with timestamp 1589981042321 [2020/05/20-18:54:02.329] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.321] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/securesm-1.0.jar at spark://10.139.4.79:41629/jars/securesm-1.0.jar with timestamp 1589981042321 [2020/05/20-18:54:02.329] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.321] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-io-2.4.jar at spark://10.139.4.79:41629/jars/commons-io-2.4.jar with timestamp 1589981042321 [2020/05/20-18:54:02.329] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.321] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xerces2-xsd11-2.11.1.jar at spark://10.139.4.79:41629/jars/xerces2-xsd11-2.11.1.jar with timestamp 1589981042321 [2020/05/20-18:54:02.330] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.322] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-cli-1.3.1.jar at spark://10.139.4.79:41629/jars/commons-cli-1.3.1.jar with timestamp 1589981042322 [2020/05/20-18:54:02.330] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.322] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xercesImpl-2.9.1.jar at spark://10.139.4.79:41629/jars/xercesImpl-2.9.1.jar with timestamp 1589981042322 [2020/05/20-18:54:02.330] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.322] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR 
file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/antlr4-runtime-4.7.jar at spark://10.139.4.79:41629/jars/antlr4-runtime-4.7.jar with timestamp 1589981042322 [2020/05/20-18:54:02.330] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.322] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-lang3-3.5.jar at spark://10.139.4.79:41629/jars/commons-lang3-3.5.jar with timestamp 1589981042322 [2020/05/20-18:54:02.330] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.322] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/mongodb-driver-core-3.2.2.jar at spark://10.139.4.79:41629/jars/mongodb-driver-core-3.2.2.jar with timestamp 1589981042322 [2020/05/20-18:54:02.330] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.322] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xmlbeans-3.0.2.jar at spark://10.139.4.79:41629/jars/xmlbeans-3.0.2.jar with timestamp 1589981042322 [2020/05/20-18:54:02.330] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.323] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xmltooling-1.4.4.jar at spark://10.139.4.79:41629/jars/xmltooling-1.4.4.jar with timestamp 1589981042323 [2020/05/20-18:54:02.331] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.323] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jdom2-2.0.6.jar at spark://10.139.4.79:41629/jars/jdom2-2.0.6.jar with timestamp 1589981042323 [2020/05/20-18:54:02.331] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.323] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-codec-1.11.jar at spark://10.139.4.79:41629/jars/commons-codec-1.11.jar with timestamp 1589981042323 [2020/05/20-18:54:02.331] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.323] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/quartz-2.2.3.jar at spark://10.139.4.79:41629/jars/quartz-2.2.3.jar with timestamp 1589981042323 [2020/05/20-18:54:02.331] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.323] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jai_core-1.1.3.jar at spark://10.139.4.79:41629/jars/jai_core-1.1.3.jar with timestamp 1589981042323 [2020/05/20-18:54:02.331] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.323] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-aop-3.2.18.RELEASE.jar at spark://10.139.4.79:41629/jars/spring-aop-3.2.18.RELEASE.jar with timestamp 1589981042323 [2020/05/20-18:54:02.331] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.324] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/hbc-twitter4j-2.0.2.jar at spark://10.139.4.79:41629/jars/hbc-twitter4j-2.0.2.jar with timestamp 1589981042324 [2020/05/20-18:54:02.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.324] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/cassandra-driver-core-3.0.8.jar at 
spark://10.139.4.79:41629/jars/cassandra-driver-core-3.0.8.jar with timestamp 1589981042324 [2020/05/20-18:54:02.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.324] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/c3p0-0.9.1.1.jar at spark://10.139.4.79:41629/jars/c3p0-0.9.1.1.jar with timestamp 1589981042324 [2020/05/20-18:54:02.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.324] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jsr-275-1.0-beta-2.jar at spark://10.139.4.79:41629/jars/jsr-275-1.0-beta-2.jar with timestamp 1589981042324 [2020/05/20-18:54:02.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.324] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.apache.servicemix.bundles.antlr-2.7.7_5.jar at spark://10.139.4.79:41629/jars/org.apache.servicemix.bundles.antlr-2.7.7_5.jar with timestamp 1589981042324 [2020/05/20-18:54:02.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.324] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/hppc-0.7.1.jar at spark://10.139.4.79:41629/jars/hppc-0.7.1.jar with timestamp 1589981042324 [2020/05/20-18:54:02.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.325] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jts-core-1.14.0.jar at spark://10.139.4.79:41629/jars/jts-core-1.14.0.jar with timestamp 1589981042325 [2020/05/20-18:54:02.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.325] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/not-yet-commons-ssl-0.3.9.jar at spark://10.139.4.79:41629/jars/not-yet-commons-ssl-0.3.9.jar with timestamp 1589981042325 [2020/05/20-18:54:02.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.325] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jcip-annotations-1.0-1.jar at spark://10.139.4.79:41629/jars/jcip-annotations-1.0-1.jar with timestamp 1589981042325 [2020/05/20-18:54:02.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.325] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-dataformat-cbor-2.8.1.jar at spark://10.139.4.79:41629/jars/jackson-dataformat-cbor-2.8.1.jar with timestamp 1589981042325 [2020/05/20-18:54:02.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.325] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/maxminddb-0.2.0.jar at spark://10.139.4.79:41629/jars/maxminddb-0.2.0.jar with timestamp 1589981042325 [2020/05/20-18:54:02.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.325] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-lang-2.6.jar at spark://10.139.4.79:41629/jars/commons-lang-2.6.jar with timestamp 1589981042325 [2020/05/20-18:54:02.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.326] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gson-2.8.0.jar at spark://10.139.4.79:41629/jars/gson-2.8.0.jar with timestamp 
1589981042326 [2020/05/20-18:54:02.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.326] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/commons-net-3.3.jar at spark://10.139.4.79:41629/jars/commons-net-3.3.jar with timestamp 1589981042326 [2020/05/20-18:54:02.334] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.326] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/snakeyaml-1.15.jar at spark://10.139.4.79:41629/jars/snakeyaml-1.15.jar with timestamp 1589981042326 [2020/05/20-18:54:02.334] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.326] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/hsqldb-2.3.0.jar at spark://10.139.4.79:41629/jars/hsqldb-2.3.0.jar with timestamp 1589981042326 [2020/05/20-18:54:02.334] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.326] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/filters-2.0.235.jar at spark://10.139.4.79:41629/jars/filters-2.0.235.jar with timestamp 1589981042326 [2020/05/20-18:54:02.334] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.326] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-referencing-19.1.jar at spark://10.139.4.79:41629/jars/gt-referencing-19.1.jar with timestamp 1589981042326 [2020/05/20-18:54:02.334] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.327] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/ejml-ddense-0.32.jar at spark://10.139.4.79:41629/jars/ejml-ddense-0.32.jar with timestamp 1589981042327 [2020/05/20-18:54:02.334] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.327] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/nimbus-jose-jwt-8.10.jar at spark://10.139.4.79:41629/jars/nimbus-jose-jwt-8.10.jar with timestamp 1589981042327 [2020/05/20-18:54:02.334] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.327] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/urlrewritefilter-4.0.4.jar at spark://10.139.4.79:41629/jars/urlrewritefilter-4.0.4.jar with timestamp 1589981042327 [2020/05/20-18:54:02.335] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.327] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/lucene-analyzers-common-5.5.4.jar at spark://10.139.4.79:41629/jars/lucene-analyzers-common-5.5.4.jar with timestamp 1589981042327 [2020/05/20-18:54:02.335] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.327] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/xpp3-1.1.4c.jar at spark://10.139.4.79:41629/jars/xpp3-1.1.4c.jar with timestamp 1589981042327 [2020/05/20-18:54:02.335] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.327] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/jackson-mapper-asl-1.9.13.jar at spark://10.139.4.79:41629/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1589981042327 [2020/05/20-18:54:02.335] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.328] [main] 
[INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/t-digest-3.0.jar at spark://10.139.4.79:41629/jars/t-digest-3.0.jar with timestamp 1589981042328 [2020/05/20-18:54:02.335] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.328] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-api-19.1.jar at spark://10.139.4.79:41629/jars/gt-api-19.1.jar with timestamp 1589981042328 [2020/05/20-18:54:02.335] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.328] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/joauth-6.0.2.jar at spark://10.139.4.79:41629/jars/joauth-6.0.2.jar with timestamp 1589981042328 [2020/05/20-18:54:02.336] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.328] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-webmvc-3.2.18.RELEASE.jar at spark://10.139.4.79:41629/jars/spring-webmvc-3.2.18.RELEASE.jar with timestamp 1589981042328 [2020/05/20-18:54:02.336] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.328] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/cglib-nodep-2.2.2.jar at spark://10.139.4.79:41629/jars/cglib-nodep-2.2.2.jar with timestamp 1589981042328 [2020/05/20-18:54:02.336] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.328] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/css-selectors-1.0.5.jar at spark://10.139.4.79:41629/jars/css-selectors-1.0.5.jar with timestamp 1589981042328 [2020/05/20-18:54:02.336] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.328] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/metrics-json-3.1.5.jar at spark://10.139.4.79:41629/jars/metrics-json-3.1.5.jar with timestamp 1589981042328 [2020/05/20-18:54:02.336] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.329] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/gt-epsg-hsql-19.1.jar at spark://10.139.4.79:41629/jars/gt-epsg-hsql-19.1.jar with timestamp 1589981042329 [2020/05/20-18:54:02.337] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.329] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-core-3.2.18.RELEASE.jar at spark://10.139.4.79:41629/jars/spring-core-3.2.18.RELEASE.jar with timestamp 1589981042329 [2020/05/20-18:54:02.337] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.329] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/spring-web-3.2.18.RELEASE.jar at spark://10.139.4.79:41629/jars/spring-web-3.2.18.RELEASE.jar with timestamp 1589981042329 [2020/05/20-18:54:02.337] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.329] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/backend-run/org.apache.servicemix.bundles.xpp3-1.1.4c_7.jar at spark://10.139.4.79:41629/jars/org.apache.servicemix.bundles.xpp3-1.1.4c_7.jar with timestamp 1589981042329 [2020/05/20-18:54:02.337] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.329] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR 
file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/log4j-1.2.17.jar at spark://10.139.4.79:41629/jars/log4j-1.2.17.jar with timestamp 1589981042329 [2020/05/20-18:54:02.337] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.329] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-servlet-9.1.6.v20160112.jar at spark://10.139.4.79:41629/jars/websocket-servlet-9.1.6.v20160112.jar with timestamp 1589981042329 [2020/05/20-18:54:02.338] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.329] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-client-9.1.6.v20160112.jar at spark://10.139.4.79:41629/jars/websocket-client-9.1.6.v20160112.jar with timestamp 1589981042329 [2020/05/20-18:54:02.338] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.329] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/jodd-3.3.8.jar at spark://10.139.4.79:41629/jars/jodd-3.3.8.jar with timestamp 1589981042329 [2020/05/20-18:54:02.338] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.330] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/json-20151123.jar at spark://10.139.4.79:41629/jars/json-20151123.jar with timestamp 1589981042330 [2020/05/20-18:54:02.338] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.330] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/ini4j-0.5.4.jar at spark://10.139.4.79:41629/jars/ini4j-0.5.4.jar with timestamp 1589981042330 [2020/05/20-18:54:02.339] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.330] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-server-9.1.6.v20160112.jar at spark://10.139.4.79:41629/jars/websocket-server-9.1.6.v20160112.jar with timestamp 1589981042330 [2020/05/20-18:54:02.339] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.330] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/slf4j-log4j12-1.7.21.jar at spark://10.139.4.79:41629/jars/slf4j-log4j12-1.7.21.jar with timestamp 1589981042330 [2020/05/20-18:54:02.339] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.330] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/jetty-io-9.1.6.v20160112.jar at spark://10.139.4.79:41629/jars/jetty-io-9.1.6.v20160112.jar with timestamp 1589981042330 [2020/05/20-18:54:02.339] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.330] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/freemarker-2.3.28.jar at spark://10.139.4.79:41629/jars/freemarker-2.3.28.jar with timestamp 1589981042330 [2020/05/20-18:54:02.339] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.330] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/javax.annotation-api-1.2.jar at spark://10.139.4.79:41629/jars/javax.annotation-api-1.2.jar with timestamp 1589981042330 [2020/05/20-18:54:02.340] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.330] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR 
file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-common-9.1.6.v20160112.jar at spark://10.139.4.79:41629/jars/websocket-common-9.1.6.v20160112.jar with timestamp 1589981042330 [2020/05/20-18:54:02.340] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.331] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/gson-2.8.5.jar at spark://10.139.4.79:41629/jars/gson-2.8.5.jar with timestamp 1589981042331 [2020/05/20-18:54:02.340] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.331] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/javax.servlet-api-3.1.0.jar at spark://10.139.4.79:41629/jars/javax.servlet-api-3.1.0.jar with timestamp 1589981042331 [2020/05/20-18:54:02.340] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.331] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/h2-1.4.195.jar at spark://10.139.4.79:41629/jars/h2-1.4.195.jar with timestamp 1589981042331 [2020/05/20-18:54:02.340] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.331] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/websocket-api-9.1.6.v20160112.jar at spark://10.139.4.79:41629/jars/websocket-api-9.1.6.v20160112.jar with timestamp 1589981042331 [2020/05/20-18:54:02.340] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.331] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/common-run/slf4j-api-1.7.21.jar at spark://10.139.4.79:41629/jars/slf4j-api-1.7.21.jar with timestamp 1589981042331 [2020/05/20-18:54:02.340] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.331] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/spark-run_2.11/livy-scala-api_2.11-0.5.0-incubating.jar at spark://10.139.4.79:41629/jars/livy-scala-api_2.11-0.5.0-incubating.jar with timestamp 1589981042331 [2020/05/20-18:54:02.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.331] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/ivy/spark-run_2.11/util-eval_2.11-6.27.0.jar at spark://10.139.4.79:41629/jars/util-eval_2.11-6.27.0.jar with timestamp 1589981042331 [2020/05/20-18:54:02.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.332] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/shadelib/dss-shadelib-assembly.jar at spark://10.139.4.79:41629/jars/dss-shadelib-assembly.jar with timestamp 1589981042332 [2020/05/20-18:54:02.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.332] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/dist/dataiku-core.jar at spark://10.139.4.79:41629/jars/dataiku-core.jar with timestamp 1589981042332 [2020/05/20-18:54:02.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.332] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/dist/dataiku-app-platform.jar at spark://10.139.4.79:41629/jars/dataiku-app-platform.jar with timestamp 1589981042332 [2020/05/20-18:54:02.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.332] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/dist/dataiku-dss-core.jar at 
spark://10.139.4.79:41629/jars/dataiku-dss-core.jar with timestamp 1589981042332 [2020/05/20-18:54:02.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.332] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/dist/dataiku-dip.jar at spark://10.139.4.79:41629/jars/dataiku-dip.jar with timestamp 1589981042332 [2020/05/20-18:54:02.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.332] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/dist/dataiku-scoring.jar at spark://10.139.4.79:41629/jars/dataiku-scoring.jar with timestamp 1589981042332 [2020/05/20-18:54:02.342] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.333] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/dist/dss-spark-public_2.11-2.0.jar at spark://10.139.4.79:41629/jars/dss-spark-public_2.11-2.0.jar with timestamp 1589981042332 [2020/05/20-18:54:02.342] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.333] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/dist/dss-spark-main_2.11-5.1.jar at spark://10.139.4.79:41629/jars/dss-spark-main_2.11-5.1.jar with timestamp 1589981042333 [2020/05/20-18:54:02.342] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.333] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/third/LevenshteinAutomatonTest.jar at spark://10.139.4.79:41629/jars/LevenshteinAutomatonTest.jar with timestamp 1589981042333 [2020/05/20-18:54:02.342] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.333] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/third/MDAG.jar at spark://10.139.4.79:41629/jars/MDAG.jar with timestamp 1589981042333 [2020/05/20-18:54:02.343] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.333] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/third/sassyreader.jar at spark://10.139.4.79:41629/jars/sassyreader.jar with timestamp 1589981042333 [2020/05/20-18:54:02.343] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.333] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/third/jcurses.jar at spark://10.139.4.79:41629/jars/jcurses.jar with timestamp 1589981042333 [2020/05/20-18:54:02.343] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.334] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/third/junit-ext-1.0-junit-4.5-RC-3.jar at spark://10.139.4.79:41629/jars/junit-ext-1.0-junit-4.5-RC-3.jar with timestamp 1589981042334 [2020/05/20-18:54:02.343] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.334] [main] [INFO] [org.apache.spark.SparkContext] - Added JAR file:///home/dataiku/dataiku-dss-7.0.1/lib/third/spnego-7.0.jar at spark://10.139.4.79:41629/jars/spnego-7.0.jar with timestamp 1589981042334 [2020/05/20-18:54:02.344] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.335] [main] [WARN] [org.apache.spark.SparkContext] - The jar file:/home/dataiku/dataiku-dss-7.0.1/dist/dss-spark-main_2.11-5.1.jar has been added already. Overwriting of added jars is not supported in the current version. 
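The long run of "Added JAR" entries above is the Spark driver registering every DSS library with its context-local file server (spark://10.139.4.79:41629), from which executors later fetch them; the closing WARN appears because dss-spark-main_2.11-5.1.jar was submitted a second time, and this Spark version does not overwrite an already-registered jar. As a rough illustration (not DSS's actual submission code, and the jar path is hypothetical), the same log lines are produced when jars are listed in spark.jars at session startup:

    # Sketch only: registering extra jars with a Spark session.
    # Each jar listed here yields one "Added JAR ... at spark://<driver>:<port>/jars/..."
    # log line at context startup; re-submitting a jar triggers the
    # "has been added already" WARN seen above.
    from pyspark.sql import SparkSession

    spark = (
        SparkSession.builder
        .appName("jar-registration-sketch")
        .config("spark.jars", "file:///opt/libs/example-1.0.jar")  # hypothetical path
        .getOrCreate()
    )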
[2020/05/20-18:54:02.363] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.363] [main] [INFO] [org.apache.spark.SparkContext] - Added file file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/yarn-python-bins0uRw67ec.json at file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/yarn-python-bins0uRw67ec.json with timestamp 1589981042362 [2020/05/20-18:54:02.365] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.365] [main] [INFO] [org.apache.spark.util.Utils] - Copying /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/yarn-python-bins0uRw67ec.json to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/yarn-python-bins0uRw67ec.json [2020/05/20-18:54:02.373] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.373] [main] [INFO] [org.apache.spark.SparkContext] - Added file file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-hadoop/contributed_python_processorskB6B5ITd.json at file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-hadoop/contributed_python_processorskB6B5ITd.json with timestamp 1589981042373 [2020/05/20-18:54:02.373] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.373] [main] [INFO] [org.apache.spark.util.Utils] - Copying /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-hadoop/contributed_python_processorskB6B5ITd.json to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/contributed_python_processorskB6B5ITd.json [2020/05/20-18:54:02.377] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.377] [main] [INFO] [org.apache.spark.SparkContext] - Added file file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-ITk9gUDV.zip.nodecompression at file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-ITk9gUDV.zip.nodecompression with timestamp 1589981042377 [2020/05/20-18:54:02.378] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.377] [main] [INFO] [org.apache.spark.util.Utils] - Copying /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-ITk9gUDV.zip.nodecompression to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-ITk9gUDV.zip.nodecompression [2020/05/20-18:54:02.381] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.381] [main] [INFO] [org.apache.spark.SparkContext] - Added file file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-YfRNgeLJ.zip.nodecompression at 
file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-YfRNgeLJ.zip.nodecompression with timestamp 1589981042381 [2020/05/20-18:54:02.381] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.381] [main] [INFO] [org.apache.spark.util.Utils] - Copying /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-YfRNgeLJ.zip.nodecompression to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-YfRNgeLJ.zip.nodecompression [2020/05/20-18:54:02.385] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.385] [main] [INFO] [org.apache.spark.SparkContext] - Added file file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-BHQI2Oqk.zip.nodecompression at file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-BHQI2Oqk.zip.nodecompression with timestamp 1589981042385 [2020/05/20-18:54:02.385] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.385] [main] [INFO] [org.apache.spark.util.Utils] - Copying /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-BHQI2Oqk.zip.nodecompression to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-BHQI2Oqk.zip.nodecompression [2020/05/20-18:54:02.390] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.390] [main] [INFO] [org.apache.spark.SparkContext] - Added file file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/dku-spark-log4j.properties at file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/dku-spark-log4j.properties with timestamp 1589981042390 [2020/05/20-18:54:02.391] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.390] [main] [INFO] [org.apache.spark.util.Utils] - Copying /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/dku-spark-log4j.properties to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dku-spark-log4j.properties [2020/05/20-18:54:02.454] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.454] [main] [INFO] [org.apache.spark.executor.Executor] - Starting executor ID driver on host localhost [2020/05/20-18:54:02.530] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.530] [main] [INFO] [org.apache.spark.util.Utils] - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 40737. 
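Here the driver stages the recipe's support files (shaker descriptors, the log4j config) for the executors: each "Added file" / "Copying ... to /tmp/spark-*/userFiles-*" pair is one registration. A minimal sketch of the same mechanism, with a hypothetical file name; tasks later resolve the staged copy through SparkFiles.get:

    # Sketch only: staging an auxiliary file so executors can read it locally.
    from pyspark import SparkFiles
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("addfile-sketch").getOrCreate()
    sc = spark.sparkContext

    sc.addFile("/tmp/recipe-config-example.json")          # hypothetical file; logged as "Added file ..."
    local = SparkFiles.get("recipe-config-example.json")   # path to the copy under userFiles-*
    print(local)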
[2020/05/20-18:54:02.531] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.531] [main] [INFO] [org.apache.spark.network.netty.NettyBlockTransferService] - Server created on 10.139.4.79:40737
[2020/05/20-18:54:02.533] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.533] [main] [INFO] [org.apache.spark.storage.BlockManager] - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
[2020/05/20-18:54:02.564] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.563] [main] [INFO] [org.apache.spark.storage.BlockManagerMaster] - Registering BlockManager BlockManagerId(driver, 10.139.4.79, 40737, None)
[2020/05/20-18:54:02.568] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.567] [dispatcher-event-loop-2] [INFO] [org.apache.spark.storage.BlockManagerMasterEndpoint] - Registering block manager 10.139.4.79:40737 with 366.3 MB RAM, BlockManagerId(driver, 10.139.4.79, 40737, None)
[2020/05/20-18:54:02.571] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.571] [main] [INFO] [org.apache.spark.storage.BlockManagerMaster] - Registered BlockManager BlockManagerId(driver, 10.139.4.79, 40737, None)
[2020/05/20-18:54:02.571] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.571] [main] [INFO] [org.apache.spark.storage.BlockManager] - Initialized BlockManager: BlockManagerId(driver, 10.139.4.79, 40737, None)
[2020/05/20-18:54:02.714] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.714] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@7455dacb{/metrics/json,null,AVAILABLE,@Spark}
[2020/05/20-18:54:02.861] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.861] [main] [INFO] [org.apache.spark.sql.internal.SharedState] - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:///tmp/mwd6665093703188540778').
[2020/05/20-18:54:02.861] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.861] [main] [INFO] [org.apache.spark.sql.internal.SharedState] - Warehouse path is 'file:///tmp/mwd6665093703188540778'.
[2020/05/20-18:54:02.868] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.868] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@6a55594b{/SQL,null,AVAILABLE,@Spark}
[2020/05/20-18:54:02.869] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.869] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@632b305d{/SQL/json,null,AVAILABLE,@Spark}
[2020/05/20-18:54:02.869] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.869] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@2db15f70{/SQL/execution,null,AVAILABLE,@Spark}
[2020/05/20-18:54:02.870] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.870] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@25974207{/SQL/execution/json,null,AVAILABLE,@Spark}
[2020/05/20-18:54:02.871] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:02.871] [main] [INFO] [org.spark_project.jetty.server.handler.ContextHandler] - Started o.s.j.s.ServletContextHandler@11b5f4e2{/static/sql,null,AVAILABLE,@Spark}
[2020/05/20-18:54:03.269] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.269] [main] [INFO] [org.apache.spark.sql.execution.streaming.state.StateStoreCoordinatorRef] - Registered StateStoreCoordinator endpoint
[2020/05/20-18:54:03.277] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.277] [main] [WARN] [dku.spark.hive] - Unable to set property spark.sql.warehouse.dir : Cannot modify the value of a static config: spark.sql.warehouse.dir;
[2020/05/20-18:54:03.282] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.282] [main] [INFO] [dku.remoterun.fileshelper] - Look for (potentially zipped) /home/dataiku/DATA_DIR/saved_models/CHURNALGO/8yvXZCIF/versions/1589980789114
[2020/05/20-18:54:03.283] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.282] [main] [INFO] [dku.remoterun.fileshelper] - > found locally
[2020/05/20-18:54:03.283] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.282] [main] [INFO] [dku.remoterun.fileshelper] - Look for (potentially zipped) /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/evaluation-recipe/pyrunMn0DtWFJ3i41
[2020/05/20-18:54:03.283] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.282] [main] [INFO] [dku.remoterun.fileshelper] - > found locally
[2020/05/20-18:54:03.559] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.558] [main] [INFO] [dku.spark.mllib] - Using model threshold: 0.3415067297141797
[2020/05/20-18:54:03.656] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.656] [main] [INFO] [dku.spark.env] - Got ticket from env N6moNE5h1AsN1T7LCUICrGbyUha1oZQuUjJIZztyFqT3wj2nmCkdisdJQQ9m6oD3
[2020/05/20-18:54:03.658] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.658] [main] [INFO] [dku.spark.entrypoint] - Spark: initializeApplication
[2020/05/20-18:54:03.658] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.658] [main] [INFO] [dku.spark.entrypoint] - Spark: initializeApplication: ALREADY done
[2020/05/20-18:54:03.658] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.658] [main] [INFO] [dku.spark.entrypoint] - Loading exec env
[2020/05/20-18:54:03.659] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.658] [main] [INFO] [dku.spark.exec] - Loading exec env with impersonation enabled=false user=null
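The "Using model threshold" entry shows the evaluation applying the decision cut-off stored with saved model 8yvXZCIF rather than a default 0.5. Not DSS's actual scoring code, but a minimal sketch of what such a threshold does to positive-class probabilities:

    # Sketch only: turning churn probabilities into labels with the logged threshold.
    THRESHOLD = 0.3415067297141797  # value reported by dku.spark.mllib above

    def predict_churn(probability: float) -> bool:
        # Classify as churn when the probability reaches the stored cut-off.
        return probability >= THRESHOLD

    print(predict_churn(0.40))  # True
    print(predict_churn(0.20))  # False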
[2020/05/20-18:54:03.659] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.658] [main] [INFO] [dku.spark.context] - Running for job
[2020/05/20-18:54:03.659] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.659] [main] [INFO] [dku.spark.context] - Init: running in flow, JEK port = 35569 activityId = evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP
[2020/05/20-18:54:03.659] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.659] [main] [INFO] [dku.spark.env] - Got ticket from env N6moNE5h1AsN1T7LCUICrGbyUha1oZQuUjJIZztyFqT3wj2nmCkdisdJQQ9m6oD3
[2020/05/20-18:54:03.662] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.662] [main] [INFO] [dip.plugin.presets] - Setting up remote plugin settings access to admin1:35569 jek=true
[2020/05/20-18:54:03.662] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.662] [main] [INFO] [dku.spark.context] - Init: api host = admin1 port = 35569
[2020/05/20-18:54:03.662] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.662] [main] [INFO] [dku.spark.context] - Init: effective api host = admin1 port = 35569
[2020/05/20-18:54:03.982] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.982] [main] [INFO] [dku.spark.context] - getAsDataFrame CHURNALGO.Telco_Customer_Churn_prepared_distinct: Loading dataset as Spark dataframe type=Filesystem
[2020/05/20-18:54:03.983] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:03.982] [main] [WARN] [dku.spark.context] - getAsDataFrame CHURNALGO.Telco_Customer_Churn_prepared_distinct: Reading dataset as a remote dataframe. Computation will not be distributed
[2020/05/20-18:54:05.642] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:05.641] [main] [INFO] [dku.spark.shaker.compiler] - Loading meanings exec env
[2020/05/20-18:54:05.643] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:05.641] [main] [INFO] [dku.spark.shaker.compiler] - Meanings are []
[2020/05/20-18:54:05.645] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:05.645] [main] [INFO] [dku.spark.shaker.compiler] - Not using native processors (disabled by config)
[2020/05/20-18:54:05.904] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:05.904] [main] [INFO] [dku.spark.entrypoint] - SPC using sparkFilesOverride= null
[2020/05/20-18:54:05.905] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:05.904] [main] [INFO] [dku.spark.env] - Got ticket from env N6moNE5h1AsN1T7LCUICrGbyUha1oZQuUjJIZztyFqT3wj2nmCkdisdJQQ9m6oD3
[2020/05/20-18:54:05.967] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:05.966] [main] [INFO] [dku.spark.shaker.compiler] - Done FullCompiler. Dumping final DF plan
[2020/05/20-18:54:06.015] [null-out-49] [INFO] [dku.utils] - == Parsed Logical Plan ==
[2020/05/20-18:54:06.015] [null-out-49] [INFO] [dku.utils] - LogicalRDD [customerID#85, gender#86, SeniorCitizen#87L, Partner#88, Dependents#89, tenure#90L, PhoneService#91, MultipleLines_0#92, InternetService#93, OnlineSecurity_0#94, OnlineBackup_0#95, DeviceProtection_0#96, TechSupport_0#97, StreamingTV_0#98, StreamingMovies_0#99, Contract#100, PaperlessBilling#101, PaymentMethod#102, MonthlyCharges#103, TotalCharges#104, Churn#105], false
[2020/05/20-18:54:06.016] [null-out-49] [INFO] [dku.utils] - == Analyzed Logical Plan ==
[2020/05/20-18:54:06.016] [null-out-49] [INFO] [dku.utils] - customerID: string, gender: string, SeniorCitizen: bigint, Partner: boolean, Dependents: boolean, tenure: bigint, PhoneService: boolean, MultipleLines_0: boolean, InternetService: string, OnlineSecurity_0: boolean, OnlineBackup_0: boolean, DeviceProtection_0: boolean, TechSupport_0: boolean, StreamingTV_0: boolean, StreamingMovies_0: boolean, Contract: string, PaperlessBilling: boolean, PaymentMethod: string, MonthlyCharges: double, TotalCharges: double, Churn: boolean
[2020/05/20-18:54:06.016] [null-out-49] [INFO] [dku.utils] - LogicalRDD [customerID#85, gender#86, SeniorCitizen#87L, Partner#88, Dependents#89, tenure#90L, PhoneService#91, MultipleLines_0#92, InternetService#93, OnlineSecurity_0#94, OnlineBackup_0#95, DeviceProtection_0#96, TechSupport_0#97, StreamingTV_0#98, StreamingMovies_0#99, Contract#100, PaperlessBilling#101, PaymentMethod#102, MonthlyCharges#103, TotalCharges#104, Churn#105], false
[2020/05/20-18:54:06.016] [null-out-49] [INFO] [dku.utils] - == Optimized Logical Plan ==
[2020/05/20-18:54:06.016] [null-out-49] [INFO] [dku.utils] - LogicalRDD [customerID#85, gender#86, SeniorCitizen#87L, Partner#88, Dependents#89, tenure#90L, PhoneService#91, MultipleLines_0#92, InternetService#93, OnlineSecurity_0#94, OnlineBackup_0#95, DeviceProtection_0#96, TechSupport_0#97, StreamingTV_0#98, StreamingMovies_0#99, Contract#100, PaperlessBilling#101, PaymentMethod#102, MonthlyCharges#103, TotalCharges#104, Churn#105], false
[2020/05/20-18:54:06.017] [null-out-49] [INFO] [dku.utils] - == Physical Plan ==
[2020/05/20-18:54:06.017] [null-out-49] [INFO] [dku.utils] - Scan ExistingRDD[customerID#85,gender#86,SeniorCitizen#87L,Partner#88,Dependents#89,tenure#90L,PhoneService#91,MultipleLines_0#92,InternetService#93,OnlineSecurity_0#94,OnlineBackup_0#95,DeviceProtection_0#96,TechSupport_0#97,StreamingTV_0#98,StreamingMovies_0#99,Contract#100,PaperlessBilling#101,PaymentMethod#102,MonthlyCharges#103,TotalCharges#104,Churn#105]
[2020/05/20-18:54:06.018] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:06.015] [main] [INFO] [dku.spark.shaker.compiler] - Plan dumped
[2020/05/20-18:54:08.009] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.009] [main] [INFO] [org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator] - Generated method too long to be JIT compiled: org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext is 15778 bytes
[2020/05/20-18:54:08.010] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.010] [main] [INFO] [org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator] - Code generated in 453.235012 ms
[2020/05/20-18:54:08.105] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.104] [main] [INFO] [org.apache.spark.SparkContext] - Starting job: collect at BinaryClassificationMetrics.scala:195
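Two things happen above: the input dataset is loaded through the DSS-to-Spark bridge as a single remote partition (hence the WARN that computation will not be distributed), and the resulting DataFrame's four query plans are dumped; because the source is an existing RDD, all four sections reduce to the same LogicalRDD / Scan ExistingRDD node. A sketch of the equivalent read in a DSS 7-era PySpark recipe, assuming the standard dataiku.spark API:

    # Sketch only: load a DSS dataset as a Spark DataFrame and dump its plans.
    import dataiku
    from dataiku import spark as dkuspark
    from pyspark import SparkContext
    from pyspark.sql import SQLContext

    sc = SparkContext.getOrCreate()
    sqlContext = SQLContext(sc)

    dataset = dataiku.Dataset("Telco_Customer_Churn_prepared_distinct")
    df = dkuspark.get_dataframe(sqlContext, dataset)

    # Prints the same "== Parsed/Analyzed/Optimized Logical Plan ==" and
    # "== Physical Plan ==" sections that appear in the log above.
    df.explain(True)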
[2020/05/20-18:54:08.134] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.133] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.DAGScheduler] - Registering RDD 16 (map at MLLibEvaluationJob.scala:142) as input to shuffle 1 [2020/05/20-18:54:08.136] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.136] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.DAGScheduler] - Registering RDD 17 (combineByKey at BinaryClassificationMetrics.scala:151) as input to shuffle 0 [2020/05/20-18:54:08.140] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.139] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.DAGScheduler] - Got job 0 (collect at BinaryClassificationMetrics.scala:195) with 1 output partitions [2020/05/20-18:54:08.141] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.140] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.DAGScheduler] - Final stage: ResultStage 2 (collect at BinaryClassificationMetrics.scala:195) [2020/05/20-18:54:08.142] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.141] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.DAGScheduler] - Parents of final stage: List(ShuffleMapStage 1) [2020/05/20-18:54:08.144] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.144] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.DAGScheduler] - Missing parents: List(ShuffleMapStage 1) [2020/05/20-18:54:08.152] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.152] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.DAGScheduler] - Submitting ShuffleMapStage 0 (MapPartitionsRDD[16] at map at MLLibEvaluationJob.scala:142), which has no missing parents [2020/05/20-18:54:08.324] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.323] [dag-scheduler-event-loop] [INFO] [org.apache.spark.storage.memory.MemoryStore] - Block broadcast_0 stored as values in memory (estimated size 202.3 KB, free 366.1 MB) [2020/05/20-18:54:08.352] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.352] [dag-scheduler-event-loop] [INFO] [org.apache.spark.storage.memory.MemoryStore] - Block broadcast_0_piece0 stored as bytes in memory (estimated size 56.3 KB, free 366.0 MB) [2020/05/20-18:54:08.355] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.354] [dispatcher-event-loop-5] [INFO] [org.apache.spark.storage.BlockManagerInfo] - Added broadcast_0_piece0 in memory on 10.139.4.79:40737 (size: 56.3 KB, free: 366.2 MB) [2020/05/20-18:54:08.359] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.359] [dag-scheduler-event-loop] [INFO] [org.apache.spark.SparkContext] - Created broadcast 0 from broadcast at DAGScheduler.scala:1163 [2020/05/20-18:54:08.372] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.372] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.DAGScheduler] - Submitting 1 missing tasks from ShuffleMapStage 0 (MapPartitionsRDD[16] at map at MLLibEvaluationJob.scala:142) (first 15 tasks are for partitions Vector(0)) [2020/05/20-18:54:08.373] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.373] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.TaskSchedulerImpl] - Adding task set 0.0 with 1 tasks [2020/05/20-18:54:08.423] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.423] [dispatcher-event-loop-6] [INFO] [org.apache.spark.scheduler.TaskSetManager] - Starting task 0.0 in stage 0.0 (TID 0, localhost, executor driver, partition 0, PROCESS_LOCAL, 7647 bytes) [2020/05/20-18:54:08.449] [null-err-50] [INFO] [dku.utils] - 
[2020/05/20-18:54:08.448] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Running task 0.0 in stage 0.0 (TID 0) [2020/05/20-18:54:08.454] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.453] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-YfRNgeLJ.zip.nodecompression with timestamp 1589981042381 [2020/05/20-18:54:08.502] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.502] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-YfRNgeLJ.zip.nodecompression has been previously copied to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-YfRNgeLJ.zip.nodecompression [2020/05/20-18:54:08.506] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.506] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-BHQI2Oqk.zip.nodecompression with timestamp 1589981042385 [2020/05/20-18:54:08.509] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.509] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-BHQI2Oqk.zip.nodecompression has been previously copied to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-BHQI2Oqk.zip.nodecompression [2020/05/20-18:54:08.519] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.519] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-hadoop/contributed_python_processorskB6B5ITd.json with timestamp 1589981042373 [2020/05/20-18:54:08.520] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.520] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-hadoop/contributed_python_processorskB6B5ITd.json has been previously copied to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/contributed_python_processorskB6B5ITd.json [2020/05/20-18:54:08.526] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.526] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/dku-spark-log4j.properties with timestamp 1589981042390 [2020/05/20-18:54:08.527] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.526] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - 
/home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/dku-spark-log4j.properties has been previously copied to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dku-spark-log4j.properties [2020/05/20-18:54:08.530] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.529] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/yarn-python-bins0uRw67ec.json with timestamp 1589981042362 [2020/05/20-18:54:08.530] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.530] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/yarn-python-bins0uRw67ec.json has been previously copied to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/yarn-python-bins0uRw67ec.json [2020/05/20-18:54:08.535] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.535] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching file:///home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-ITk9gUDV.zip.nodecompression with timestamp 1589981042377 [2020/05/20-18:54:08.536] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.535] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/shaker-mr/rec-ITk9gUDV.zip.nodecompression has been previously copied to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-ITk9gUDV.zip.nodecompression [2020/05/20-18:54:08.541] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.541] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/opencsv-2.3.jar with timestamp 1589981042321 [2020/05/20-18:54:08.585] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.585] [Executor task launch worker for task 0] [INFO] [org.apache.spark.network.client.TransportClientFactory] - Successfully created connection to /10.139.4.79:41629 after 34 ms (0 ms spent in bootstraps) [2020/05/20-18:54:08.595] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.595] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/opencsv-2.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6174790809971170369.tmp [2020/05/20-18:54:08.634] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.634] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/opencsv-2.3.jar to class loader [2020/05/20-18:54:08.635] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.634] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching 
spark://10.139.4.79:41629/jars/ejml-ddense-0.32.jar with timestamp 1589981042327 [2020/05/20-18:54:08.635] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.635] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/ejml-ddense-0.32.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2913371078888428709.tmp [2020/05/20-18:54:08.644] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.643] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/ejml-ddense-0.32.jar to class loader [2020/05/20-18:54:08.644] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.643] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gson-2.8.5.jar with timestamp 1589981042331 [2020/05/20-18:54:08.644] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.644] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gson-2.8.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1888174797863960422.tmp [2020/05/20-18:54:08.650] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.650] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gson-2.8.5.jar to class loader [2020/05/20-18:54:08.650] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.650] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/curvesapi-1.05.jar with timestamp 1589981042305 [2020/05/20-18:54:08.651] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.650] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/curvesapi-1.05.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1061154535375681856.tmp [2020/05/20-18:54:08.656] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.655] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/curvesapi-1.05.jar to class loader [2020/05/20-18:54:08.656] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.655] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/LevenshteinAutomatonTest.jar with timestamp 1589981042333 [2020/05/20-18:54:08.656] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.656] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/LevenshteinAutomatonTest.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6874749424393819567.tmp [2020/05/20-18:54:08.662] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.662] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding 
file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/LevenshteinAutomatonTest.jar to class loader [2020/05/20-18:54:08.663] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.662] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/snappy-java-1.0.5.jar with timestamp 1589981042315 [2020/05/20-18:54:08.663] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.663] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/snappy-java-1.0.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1154203890462653693.tmp [2020/05/20-18:54:08.671] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.671] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/snappy-java-1.0.5.jar to class loader [2020/05/20-18:54:08.671] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.671] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/xlsx-streamer-2.1.0.jar with timestamp 1589981042313 [2020/05/20-18:54:08.672] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.672] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/xlsx-streamer-2.1.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6899588121165614080.tmp [2020/05/20-18:54:08.676] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.676] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/xlsx-streamer-2.1.0.jar to class loader [2020/05/20-18:54:08.676] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.676] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/dss-spark-public_2.11-2.0.jar with timestamp 1589981042332 [2020/05/20-18:54:08.677] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.676] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/dss-spark-public_2.11-2.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2119527247703867056.tmp [2020/05/20-18:54:08.682] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.681] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dss-spark-public_2.11-2.0.jar to class loader [2020/05/20-18:54:08.682] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.681] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/poi-ooxml-4.0.1.jar with timestamp 1589981042308 [2020/05/20-18:54:08.682] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.682] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/poi-ooxml-4.0.1.jar to 
/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp827726145384610466.tmp [2020/05/20-18:54:08.696] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.696] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/poi-ooxml-4.0.1.jar to class loader [2020/05/20-18:54:08.697] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.696] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/spnego-7.0.jar with timestamp 1589981042334 [2020/05/20-18:54:08.698] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.697] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/spnego-7.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7118417150287198600.tmp [2020/05/20-18:54:08.703] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.703] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/spnego-7.0.jar to class loader [2020/05/20-18:54:08.703] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.703] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/asm-commons-5.0.3.jar with timestamp 1589981042312 [2020/05/20-18:54:08.704] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.703] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/asm-commons-5.0.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1646821572977768951.tmp [2020/05/20-18:54:08.708] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.708] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/asm-commons-5.0.3.jar to class loader [2020/05/20-18:54:08.709] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.708] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/metrics-json-3.1.5.jar with timestamp 1589981042328 [2020/05/20-18:54:08.709] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.709] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/metrics-json-3.1.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9013066264797100697.tmp [2020/05/20-18:54:08.714] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.713] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/metrics-json-3.1.5.jar to class loader [2020/05/20-18:54:08.714] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.714] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/websocket-client-9.1.6.v20160112.jar with timestamp 1589981042329 
[2020/05/20-18:54:08.714] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.714] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/websocket-client-9.1.6.v20160112.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp980981650160952396.tmp [2020/05/20-18:54:08.718] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.717] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/websocket-client-9.1.6.v20160112.jar to class loader [2020/05/20-18:54:08.718] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.718] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/icu4j-4.6.jar with timestamp 1589981042319 [2020/05/20-18:54:08.719] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.718] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/icu4j-4.6.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8789545713644630632.tmp [2020/05/20-18:54:08.737] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.737] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/icu4j-4.6.jar to class loader [2020/05/20-18:54:08.738] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.737] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/xpp3-1.1.4c.jar with timestamp 1589981042327 [2020/05/20-18:54:08.738] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.738] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/xpp3-1.1.4c.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2867883590534111533.tmp [2020/05/20-18:54:08.743] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.742] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/xpp3-1.1.4c.jar to class loader [2020/05/20-18:54:08.743] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.742] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-codec-1.11.jar with timestamp 1589981042323 [2020/05/20-18:54:08.744] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.742] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-codec-1.11.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6580765016524769479.tmp [2020/05/20-18:54:08.747] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.747] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-codec-1.11.jar to class loader [2020/05/20-18:54:08.747] 
[null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.747] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/paranamer-2.3.jar with timestamp 1589981042318 [2020/05/20-18:54:08.748] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.747] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/paranamer-2.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp216699915315290085.tmp [2020/05/20-18:54:08.754] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.754] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/paranamer-2.3.jar to class loader [2020/05/20-18:54:08.755] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.754] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gt-api-19.1.jar with timestamp 1589981042328 [2020/05/20-18:54:08.755] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.755] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gt-api-19.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7460230545695710663.tmp [2020/05/20-18:54:08.759] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.759] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gt-api-19.1.jar to class loader [2020/05/20-18:54:08.759] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.759] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jetty-io-9.1.6.v20160112.jar with timestamp 1589981042330 [2020/05/20-18:54:08.760] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.759] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jetty-io-9.1.6.v20160112.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7568975863980941223.tmp [2020/05/20-18:54:08.763] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.763] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jetty-io-9.1.6.v20160112.jar to class loader [2020/05/20-18:54:08.764] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.763] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-compress-1.18.jar with timestamp 1589981042310 [2020/05/20-18:54:08.764] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.763] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-compress-1.18.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3787704774666460437.tmp [2020/05/20-18:54:08.770] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.770] [Executor task launch 
worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-compress-1.18.jar to class loader [2020/05/20-18:54:08.771] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.770] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/dss-shadelib-assembly.jar with timestamp 1589981042332 [2020/05/20-18:54:08.771] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:08.771] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/dss-shadelib-assembly.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3900655162161110432.tmp [2020/05/20-18:54:09.708] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.708] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dss-shadelib-assembly.jar to class loader [2020/05/20-18:54:09.708] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.708] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/org.eclipse.jgit-4.5.4.201711221230-r.jar with timestamp 1589981042315 [2020/05/20-18:54:09.708] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.708] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/org.eclipse.jgit-4.5.4.201711221230-r.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5772515085107253592.tmp [2020/05/20-18:54:09.721] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.721] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/org.eclipse.jgit-4.5.4.201711221230-r.jar to class loader [2020/05/20-18:54:09.721] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.721] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jdom-1.1.3.jar with timestamp 1589981042318 [2020/05/20-18:54:09.723] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.723] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jdom-1.1.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8064801779646923749.tmp [2020/05/20-18:54:09.784] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.784] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jdom-1.1.3.jar to class loader [2020/05/20-18:54:09.785] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.784] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/joda-time-2.9.5.jar with timestamp 1589981042309 [2020/05/20-18:54:09.786] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.786] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching 
spark://10.139.4.79:41629/jars/joda-time-2.9.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp296216810546444194.tmp [2020/05/20-18:54:09.791] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.791] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/joda-time-2.9.5.jar to class loader [2020/05/20-18:54:09.791] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.791] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/metrics-core-3.1.5.jar with timestamp 1589981042311 [2020/05/20-18:54:09.792] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.792] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/metrics-core-3.1.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1747821672580186103.tmp [2020/05/20-18:54:09.796] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.796] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/metrics-core-3.1.5.jar to class loader [2020/05/20-18:54:09.797] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.796] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/json-path-2.1.0.jar with timestamp 1589981042319 [2020/05/20-18:54:09.798] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.798] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/json-path-2.1.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2116736202765268151.tmp [2020/05/20-18:54:09.855] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.854] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/json-path-2.1.0.jar to class loader [2020/05/20-18:54:09.855] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.854] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/junit-ext-1.0-junit-4.5-RC-3.jar with timestamp 1589981042334 [2020/05/20-18:54:09.856] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.856] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/junit-ext-1.0-junit-4.5-RC-3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8507047461679202217.tmp [2020/05/20-18:54:09.858] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.858] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/junit-ext-1.0-junit-4.5-RC-3.jar to class loader [2020/05/20-18:54:09.859] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.858] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - 
Fetching spark://10.139.4.79:41629/jars/postgresql-9.1-901.jdbc4.jar with timestamp 1589981042308 [2020/05/20-18:54:09.860] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.860] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/postgresql-9.1-901.jdbc4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6614503299202792631.tmp [2020/05/20-18:54:09.867] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.867] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/postgresql-9.1-901.jdbc4.jar to class loader [2020/05/20-18:54:09.867] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.867] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/ejml-core-0.32.jar with timestamp 1589981042312 [2020/05/20-18:54:09.868] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.867] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/ejml-core-0.32.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6271035527784728320.tmp [2020/05/20-18:54:09.872] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.872] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/ejml-core-0.32.jar to class loader [2020/05/20-18:54:09.872] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.872] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/pdfbox-2.0.9.jar with timestamp 1589981042303 [2020/05/20-18:54:09.873] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.872] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/pdfbox-2.0.9.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1186308844352672019.tmp [2020/05/20-18:54:09.880] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.880] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/pdfbox-2.0.9.jar to class loader [2020/05/20-18:54:09.880] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.880] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/fontbox-2.0.9.jar with timestamp 1589981042314 [2020/05/20-18:54:09.881] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.880] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/fontbox-2.0.9.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5061093230165893880.tmp [2020/05/20-18:54:09.888] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.887] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding 
file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fontbox-2.0.9.jar to class loader [2020/05/20-18:54:09.888] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.888] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jackson-mapper-asl-1.9.13.jar with timestamp 1589981042327 [2020/05/20-18:54:09.888] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.888] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jackson-mapper-asl-1.9.13.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3482312992967093294.tmp [2020/05/20-18:54:09.892] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.892] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jackson-mapper-asl-1.9.13.jar to class loader [2020/05/20-18:54:09.893] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.892] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/dataiku-core.jar with timestamp 1589981042332 [2020/05/20-18:54:09.893] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.893] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/dataiku-core.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1829323056620752594.tmp [2020/05/20-18:54:09.896] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.896] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dataiku-core.jar to class loader [2020/05/20-18:54:09.897] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.896] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/hbc-twitter4j-2.0.2.jar with timestamp 1589981042324 [2020/05/20-18:54:09.897] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.896] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/hbc-twitter4j-2.0.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3602972301630077481.tmp [2020/05/20-18:54:09.900] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.900] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/hbc-twitter4j-2.0.2.jar to class loader [2020/05/20-18:54:09.900] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.900] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/livy-api-0.5.0-incubating.jar with timestamp 1589981042320 [2020/05/20-18:54:09.901] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.901] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/livy-api-0.5.0-incubating.jar to 
/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6637762598507602306.tmp [2020/05/20-18:54:09.904] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.904] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/livy-api-0.5.0-incubating.jar to class loader [2020/05/20-18:54:09.904] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.904] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/mail-1.4.7.jar with timestamp 1589981042303 [2020/05/20-18:54:09.904] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.904] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/mail-1.4.7.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9107695940102978811.tmp [2020/05/20-18:54:09.908] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.908] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/mail-1.4.7.jar to class loader [2020/05/20-18:54:09.909] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.908] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jackson-annotations-2.8.0.jar with timestamp 1589981042321 [2020/05/20-18:54:09.909] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.909] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jackson-annotations-2.8.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3134480012234145889.tmp [2020/05/20-18:54:09.912] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.912] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jackson-annotations-2.8.0.jar to class loader [2020/05/20-18:54:09.912] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.912] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-lang-2.6.jar with timestamp 1589981042325 [2020/05/20-18:54:09.913] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.912] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-lang-2.6.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7252177606064119067.tmp [2020/05/20-18:54:09.916] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.916] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-lang-2.6.jar to class loader [2020/05/20-18:54:09.916] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.916] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/dss-spark-main_2.11-5.1.jar with 
timestamp 1589981042333 [2020/05/20-18:54:09.916] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.916] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/dss-spark-main_2.11-5.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3597834316381483409.tmp [2020/05/20-18:54:09.968] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.967] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dss-spark-main_2.11-5.1.jar to class loader [2020/05/20-18:54:09.968] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.968] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/twitter4j-core-4.0.1.jar with timestamp 1589981042314 [2020/05/20-18:54:09.968] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.968] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/twitter4j-core-4.0.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7330664778089240624.tmp [2020/05/20-18:54:09.972] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.972] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/twitter4j-core-4.0.1.jar to class loader [2020/05/20-18:54:09.972] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.972] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/org.apache.servicemix.bundles.dom4j-1.6.1_5.jar with timestamp 1589981042307 [2020/05/20-18:54:09.972] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.972] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/org.apache.servicemix.bundles.dom4j-1.6.1_5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5466441072846931384.tmp [2020/05/20-18:54:09.991] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.991] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/org.apache.servicemix.bundles.dom4j-1.6.1_5.jar to class loader [2020/05/20-18:54:09.991] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.991] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/ant-1.6.5.jar with timestamp 1589981042313 [2020/05/20-18:54:09.992] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.992] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/ant-1.6.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4835703931762786742.tmp [2020/05/20-18:54:09.998] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.998] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding 
file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/ant-1.6.5.jar to class loader [2020/05/20-18:54:09.998] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.998] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-logging-1.2.jar with timestamp 1589981042316 [2020/05/20-18:54:09.999] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:09.998] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-logging-1.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4745995794339650443.tmp [2020/05/20-18:54:10.006] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.006] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-logging-1.2.jar to class loader [2020/05/20-18:54:10.006] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.006] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/poi-4.0.1.jar with timestamp 1589981042316 [2020/05/20-18:54:10.007] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.006] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/poi-4.0.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6830874006428329707.tmp [2020/05/20-18:54:10.016] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.016] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/poi-4.0.1.jar to class loader [2020/05/20-18:54:10.017] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.016] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-fileupload-1.3.3.jar with timestamp 1589981042303 [2020/05/20-18:54:10.017] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.017] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-fileupload-1.3.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4034916846154126543.tmp [2020/05/20-18:54:10.022] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.022] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-fileupload-1.3.3.jar to class loader [2020/05/20-18:54:10.022] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.022] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jackson-dataformat-cbor-2.8.1.jar with timestamp 1589981042325 [2020/05/20-18:54:10.022] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.022] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jackson-dataformat-cbor-2.8.1.jar to 
/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1289180003307479148.tmp [2020/05/20-18:54:10.069] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.069] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jackson-dataformat-cbor-2.8.1.jar to class loader [2020/05/20-18:54:10.070] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.069] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/xml-apis-1.4.01.jar with timestamp 1589981042319 [2020/05/20-18:54:10.071] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.070] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/xml-apis-1.4.01.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8291291933216478757.tmp [2020/05/20-18:54:10.074] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.073] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/xml-apis-1.4.01.jar to class loader [2020/05/20-18:54:10.074] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.074] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/dataiku-app-platform.jar with timestamp 1589981042332 [2020/05/20-18:54:10.074] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.074] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/dataiku-app-platform.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8901809270318532112.tmp [2020/05/20-18:54:10.078] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.078] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dataiku-app-platform.jar to class loader [2020/05/20-18:54:10.078] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.078] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/lz4-1.2.0.jar with timestamp 1589981042318 [2020/05/20-18:54:10.079] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.078] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/lz4-1.2.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp619048849649071222.tmp [2020/05/20-18:54:10.124] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.124] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/lz4-1.2.0.jar to class loader [2020/05/20-18:54:10.125] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.124] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/livy-client-http-0.5.0-incubating.jar with timestamp 
1589981042307 [2020/05/20-18:54:10.125] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.124] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/livy-client-http-0.5.0-incubating.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9008435973253430079.tmp [2020/05/20-18:54:10.149] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.149] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/livy-client-http-0.5.0-incubating.jar to class loader [2020/05/20-18:54:10.149] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.149] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jaxen-1.1.6.jar with timestamp 1589981042303 [2020/05/20-18:54:10.150] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.149] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jaxen-1.1.6.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6253903364984328405.tmp [2020/05/20-18:54:10.154] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.153] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jaxen-1.1.6.jar to class loader [2020/05/20-18:54:10.154] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.154] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-io-2.4.jar with timestamp 1589981042321 [2020/05/20-18:54:10.155] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.155] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-io-2.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp429941564040508100.tmp [2020/05/20-18:54:10.158] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.158] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-io-2.4.jar to class loader [2020/05/20-18:54:10.159] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.159] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-cli-1.3.1.jar with timestamp 1589981042322 [2020/05/20-18:54:10.159] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.159] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-cli-1.3.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2875464670710909016.tmp [2020/05/20-18:54:10.162] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.162] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-cli-1.3.1.jar to class loader 
[2020/05/20-18:54:10.163] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.162] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/xml-resolver-1.2.jar with timestamp 1589981042305 [2020/05/20-18:54:10.163] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.162] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/xml-resolver-1.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5203357075700842933.tmp [2020/05/20-18:54:10.165] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.165] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/xml-resolver-1.2.jar to class loader [2020/05/20-18:54:10.166] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.165] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jgridshift-1.0.jar with timestamp 1589981042310 [2020/05/20-18:54:10.166] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.166] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jgridshift-1.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7135707318613363832.tmp [2020/05/20-18:54:10.170] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.170] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jgridshift-1.0.jar to class loader [2020/05/20-18:54:10.171] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.170] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/spring-web-3.2.18.RELEASE.jar with timestamp 1589981042329 [2020/05/20-18:54:10.171] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.171] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/spring-web-3.2.18.RELEASE.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5717616220898888642.tmp [2020/05/20-18:54:10.179] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.179] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/spring-web-3.2.18.RELEASE.jar to class loader [2020/05/20-18:54:10.180] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.179] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/mina-core-2.0.18.jar with timestamp 1589981042316 [2020/05/20-18:54:10.180] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.180] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/mina-core-2.0.18.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6176090436139608458.tmp [2020/05/20-18:54:10.187] [null-err-50] [INFO] [dku.utils] - 
[2020/05/20-18:54:10.187] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/mina-core-2.0.18.jar to class loader [2020/05/20-18:54:10.188] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.187] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jodd-3.3.8.jar with timestamp 1589981042329 [2020/05/20-18:54:10.189] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.188] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jodd-3.3.8.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8578175613399729746.tmp [2020/05/20-18:54:10.194] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.193] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jodd-3.3.8.jar to class loader [2020/05/20-18:54:10.194] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.194] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/metrics-jvm-3.1.5.jar with timestamp 1589981042314 [2020/05/20-18:54:10.195] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.194] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/metrics-jvm-3.1.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2231789977853794708.tmp [2020/05/20-18:54:10.239] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.238] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/metrics-jvm-3.1.5.jar to class loader [2020/05/20-18:54:10.239] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.239] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/json-20151123.jar with timestamp 1589981042330 [2020/05/20-18:54:10.240] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.239] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/json-20151123.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp202793741727550248.tmp [2020/05/20-18:54:10.244] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.244] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/json-20151123.jar to class loader [2020/05/20-18:54:10.245] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.244] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jsr-275-1.0-beta-2.jar with timestamp 1589981042324 [2020/05/20-18:54:10.245] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.245] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching 
spark://10.139.4.79:41629/jars/jsr-275-1.0-beta-2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5244457501352897727.tmp [2020/05/20-18:54:10.248] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.248] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jsr-275-1.0-beta-2.jar to class loader [2020/05/20-18:54:10.248] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.248] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/juniversalchardet-2.3.0.jar with timestamp 1589981042310 [2020/05/20-18:54:10.248] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.248] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/juniversalchardet-2.3.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4707408007354455889.tmp [2020/05/20-18:54:10.252] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.252] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/juniversalchardet-2.3.0.jar to class loader [2020/05/20-18:54:10.252] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.252] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-net-3.3.jar with timestamp 1589981042326 [2020/05/20-18:54:10.252] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.252] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-net-3.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5445967059002636897.tmp [2020/05/20-18:54:10.256] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.256] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-net-3.3.jar to class loader [2020/05/20-18:54:10.256] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.256] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/MDAG.jar with timestamp 1589981042333 [2020/05/20-18:54:10.256] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.256] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/MDAG.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1097407323555199462.tmp [2020/05/20-18:54:10.259] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.259] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/MDAG.jar to class loader [2020/05/20-18:54:10.259] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.259] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching 
spark://10.139.4.79:41629/jars/jackson-core-asl-1.9.13.jar with timestamp 1589981042315 [2020/05/20-18:54:10.260] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.260] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jackson-core-asl-1.9.13.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8482500028060553845.tmp [2020/05/20-18:54:10.263] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.263] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jackson-core-asl-1.9.13.jar to class loader [2020/05/20-18:54:10.263] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.263] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/livy-scala-api_2.11-0.5.0-incubating.jar with timestamp 1589981042331 [2020/05/20-18:54:10.263] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.263] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/livy-scala-api_2.11-0.5.0-incubating.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9060311744041304770.tmp [2020/05/20-18:54:10.266] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.266] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/livy-scala-api_2.11-0.5.0-incubating.jar to class loader [2020/05/20-18:54:10.267] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.266] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/opensaml-2.6.4.jar with timestamp 1589981042317 [2020/05/20-18:54:10.267] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.266] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/opensaml-2.6.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9222625771314994727.tmp [2020/05/20-18:54:10.273] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.273] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/opensaml-2.6.4.jar to class loader [2020/05/20-18:54:10.274] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.273] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/spring-core-3.2.18.RELEASE.jar with timestamp 1589981042329 [2020/05/20-18:54:10.274] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.274] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/spring-core-3.2.18.RELEASE.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7138578361804807078.tmp [2020/05/20-18:54:10.280] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.280] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding 
file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/spring-core-3.2.18.RELEASE.jar to class loader [2020/05/20-18:54:10.280] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.280] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/twitter4j-stream-4.0.1.jar with timestamp 1589981042308 [2020/05/20-18:54:10.280] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.280] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/twitter4j-stream-4.0.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7411787457164334922.tmp [2020/05/20-18:54:10.288] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.288] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/twitter4j-stream-4.0.1.jar to class loader [2020/05/20-18:54:10.289] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.288] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/hbc-core-2.0.2.jar with timestamp 1589981042320 [2020/05/20-18:54:10.289] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.288] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/hbc-core-2.0.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5288207604529453179.tmp [2020/05/20-18:54:10.292] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.292] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/hbc-core-2.0.2.jar to class loader [2020/05/20-18:54:10.293] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.292] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/quartz-2.2.3.jar with timestamp 1589981042323 [2020/05/20-18:54:10.293] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.293] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/quartz-2.2.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7344501048150478943.tmp [2020/05/20-18:54:10.316] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.315] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/quartz-2.2.3.jar to class loader [2020/05/20-18:54:10.317] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.315] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/guava-18.0.jar with timestamp 1589981042306 [2020/05/20-18:54:10.318] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.316] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/guava-18.0.jar to 
/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3117412634140371729.tmp [2020/05/20-18:54:10.324] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.324] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/guava-18.0.jar to class loader [2020/05/20-18:54:10.324] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.324] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/util-eval_2.11-6.27.0.jar with timestamp 1589981042331 [2020/05/20-18:54:10.325] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.324] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/util-eval_2.11-6.27.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6395268598529383442.tmp [2020/05/20-18:54:10.327] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.327] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/util-eval_2.11-6.27.0.jar to class loader [2020/05/20-18:54:10.328] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.327] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gt-metadata-19.1.jar with timestamp 1589981042319 [2020/05/20-18:54:10.328] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.327] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gt-metadata-19.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9161788691243551335.tmp [2020/05/20-18:54:10.332] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.332] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gt-metadata-19.1.jar to class loader [2020/05/20-18:54:10.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.332] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/snakeyaml-1.15.jar with timestamp 1589981042326 [2020/05/20-18:54:10.333] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.332] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/snakeyaml-1.15.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7381128989943639323.tmp [2020/05/20-18:54:10.337] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.337] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/snakeyaml-1.15.jar to class loader [2020/05/20-18:54:10.338] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.337] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jython-standalone-2.7-b3.jar with timestamp 
1589981042314 [2020/05/20-18:54:10.338] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.338] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jython-standalone-2.7-b3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5981004213386323339.tmp [2020/05/20-18:54:10.783] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.783] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jython-standalone-2.7-b3.jar to class loader [2020/05/20-18:54:10.784] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.783] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-pool-1.6.jar with timestamp 1589981042317 [2020/05/20-18:54:10.784] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.784] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-pool-1.6.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8323405262242290124.tmp [2020/05/20-18:54:10.788] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.788] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-pool-1.6.jar to class loader [2020/05/20-18:54:10.789] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.788] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/filters-2.0.235.jar with timestamp 1589981042326 [2020/05/20-18:54:10.789] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.789] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/filters-2.0.235.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6129903881203470391.tmp [2020/05/20-18:54:10.794] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.794] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/filters-2.0.235.jar to class loader [2020/05/20-18:54:10.794] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.794] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/asm-tree-5.0.3.jar with timestamp 1589981042314 [2020/05/20-18:54:10.795] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.794] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/asm-tree-5.0.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1362922220822806964.tmp [2020/05/20-18:54:10.832] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.831] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/asm-tree-5.0.3.jar to class loader 
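The repeated Fetching/Adding pairs in this part of the log are Spark's standard dependency distribution, not an error: the driver's RPC file server exposes every jar registered with the application (here under spark://10.139.4.79:41629/jars/...), each executor downloads them once into its spark-*/userFiles-* working directory before running its first task, and then appends them to the task class loader. At INFO level this produces one Fetching and one Adding line per jar, so a large classpath such as this job's makes the log verbose but harmless. A minimal sketch of how jars end up on that served list, assuming a plain Spark setup (the object name, app name, and jar paths below are illustrative, not taken from this job):

import org.apache.spark.{SparkConf, SparkContext}

object JarDistributionDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("jar-distribution-demo") // illustrative name
      // Every entry of spark.jars is served by the driver's file server
      // as spark://<driver-host>:<port>/jars/<name>, which is exactly the
      // URL form the executor fetches in the log above.
      .set("spark.jars", "/opt/libs/guava-18.0.jar,/opt/libs/dataiku-dip.jar")

    val sc = new SparkContext(conf)

    // Jars can also be registered after startup; executors fetch them the
    // same way (download to a fetchFileTemp* file in the userFiles-*
    // directory, then add to the task class loader).
    sc.addJar("/opt/libs/extra-udfs.jar") // hypothetical path

    sc.stop()
  }
}

The same pattern continues below for the remainder of the classpath; each jar is fetched once per executor (the "with timestamp" value lets the executor skip re-downloads) and reused by all subsequent tasks.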
[2020/05/20-18:54:10.832] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.832] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/freemarker-2.3.28.jar with timestamp 1589981042330 [2020/05/20-18:54:10.833] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.832] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/freemarker-2.3.28.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2850172837678095333.tmp [2020/05/20-18:54:10.842] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.842] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/freemarker-2.3.28.jar to class loader [2020/05/20-18:54:10.843] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.842] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/ooxml-schemas-1.1.jar with timestamp 1589981042316 [2020/05/20-18:54:10.843] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:10.843] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/ooxml-schemas-1.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8916767923114139490.tmp [2020/05/20-18:54:11.198] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.198] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/ooxml-schemas-1.1.jar to class loader [2020/05/20-18:54:11.199] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.198] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/lucene-analyzers-common-5.5.4.jar with timestamp 1589981042327 [2020/05/20-18:54:11.199] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.198] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/lucene-analyzers-common-5.5.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1866937037156349033.tmp [2020/05/20-18:54:11.207] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.207] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/lucene-analyzers-common-5.5.4.jar to class loader [2020/05/20-18:54:11.208] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.207] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/sqlite-jdbc-3.8.11.2.jar with timestamp 1589981042320 [2020/05/20-18:54:11.208] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.208] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/sqlite-jdbc-3.8.11.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp654996642195418966.tmp [2020/05/20-18:54:11.261] [null-err-50] 
[INFO] [dku.utils] - [2020/05/20-18:54:11.261] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/sqlite-jdbc-3.8.11.2.jar to class loader [2020/05/20-18:54:11.262] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.261] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/compress-lzf-1.0.2.jar with timestamp 1589981042304 [2020/05/20-18:54:11.262] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.262] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/compress-lzf-1.0.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6387802615468112535.tmp [2020/05/20-18:54:11.265] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.265] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/compress-lzf-1.0.2.jar to class loader [2020/05/20-18:54:11.265] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.265] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/websocket-api-9.1.6.v20160112.jar with timestamp 1589981042331 [2020/05/20-18:54:11.265] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.265] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/websocket-api-9.1.6.v20160112.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5845513025532686220.tmp [2020/05/20-18:54:11.268] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.268] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/websocket-api-9.1.6.v20160112.jar to class loader [2020/05/20-18:54:11.268] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.268] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/slf4j-log4j12-1.7.21.jar with timestamp 1589981042330 [2020/05/20-18:54:11.269] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.268] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/slf4j-log4j12-1.7.21.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8414278943220303204.tmp [2020/05/20-18:54:11.271] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.271] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/slf4j-log4j12-1.7.21.jar to class loader [2020/05/20-18:54:11.272] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.271] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/maxminddb-0.2.0.jar with timestamp 1589981042325 [2020/05/20-18:54:11.272] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.272] [Executor task launch worker 
for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/maxminddb-0.2.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7395431272138547636.tmp [2020/05/20-18:54:11.349] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.349] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/maxminddb-0.2.0.jar to class loader [2020/05/20-18:54:11.350] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.349] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jackson-databind-2.8.11.2.jar with timestamp 1589981042309 [2020/05/20-18:54:11.350] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.349] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jackson-databind-2.8.11.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4015867805536244668.tmp [2020/05/20-18:54:11.355] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.355] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jackson-databind-2.8.11.2.jar to class loader [2020/05/20-18:54:11.356] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.355] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jackson-core-2.8.10.jar with timestamp 1589981042302 [2020/05/20-18:54:11.356] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.356] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jackson-core-2.8.10.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6173038932541261176.tmp [2020/05/20-18:54:11.359] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.359] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jackson-core-2.8.10.jar to class loader [2020/05/20-18:54:11.359] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.359] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jts-1.13.jar with timestamp 1589981042313 [2020/05/20-18:54:11.359] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.359] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jts-1.13.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4778695652891850669.tmp [2020/05/20-18:54:11.364] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.364] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jts-1.13.jar to class loader [2020/05/20-18:54:11.364] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.364] [Executor task launch worker for task 0] [INFO] 
[org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/joauth-6.0.2.jar with timestamp 1589981042328 [2020/05/20-18:54:11.364] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.364] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/joauth-6.0.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1698440468174071765.tmp [2020/05/20-18:54:11.368] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.368] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/joauth-6.0.2.jar to class loader [2020/05/20-18:54:11.369] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.368] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/urlrewritefilter-4.0.4.jar with timestamp 1589981042327 [2020/05/20-18:54:11.369] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.369] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/urlrewritefilter-4.0.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2837312830046422266.tmp [2020/05/20-18:54:11.373] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.373] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/urlrewritefilter-4.0.4.jar to class loader [2020/05/20-18:54:11.374] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.373] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/HdrHistogram-2.1.6.jar with timestamp 1589981042307 [2020/05/20-18:54:11.375] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.374] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/HdrHistogram-2.1.6.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4602708791775477232.tmp [2020/05/20-18:54:11.380] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.380] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/HdrHistogram-2.1.6.jar to class loader [2020/05/20-18:54:11.380] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.380] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gt-geojson-19.1.jar with timestamp 1589981042312 [2020/05/20-18:54:11.381] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.381] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gt-geojson-19.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2439574078187616063.tmp [2020/05/20-18:54:11.385] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.384] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding 
file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gt-geojson-19.1.jar to class loader [2020/05/20-18:54:11.385] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.384] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/openws-1.5.4.jar with timestamp 1589981042308 [2020/05/20-18:54:11.385] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.385] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/openws-1.5.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8685344710381994258.tmp [2020/05/20-18:54:11.389] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.389] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/openws-1.5.4.jar to class loader [2020/05/20-18:54:11.389] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.389] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-math3-3.6.1.jar with timestamp 1589981042307 [2020/05/20-18:54:11.390] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.390] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-math3-3.6.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7935508395875904374.tmp [2020/05/20-18:54:11.398] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.398] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-math3-3.6.1.jar to class loader [2020/05/20-18:54:11.398] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.398] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jts-core-1.14.0.jar with timestamp 1589981042325 [2020/05/20-18:54:11.399] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.399] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jts-core-1.14.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5571685497015068230.tmp [2020/05/20-18:54:11.404] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.404] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jts-core-1.14.0.jar to class loader [2020/05/20-18:54:11.404] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.404] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/slf4j-api-1.7.21.jar with timestamp 1589981042331 [2020/05/20-18:54:11.405] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.405] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/slf4j-api-1.7.21.jar to 
/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4270266000811873909.tmp [2020/05/20-18:54:11.408] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.408] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/slf4j-api-1.7.21.jar to class loader [2020/05/20-18:54:11.408] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.408] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/dataiku-dip.jar with timestamp 1589981042332 [2020/05/20-18:54:11.409] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.408] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/dataiku-dip.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7199984212299183316.tmp [2020/05/20-18:54:11.637] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.636] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dataiku-dip.jar to class loader [2020/05/20-18:54:11.637] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.637] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/org.apache.servicemix.bundles.antlr-2.7.7_5.jar with timestamp 1589981042324 [2020/05/20-18:54:11.638] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.638] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/org.apache.servicemix.bundles.antlr-2.7.7_5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6960907068432984778.tmp [2020/05/20-18:54:11.647] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.647] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/org.apache.servicemix.bundles.antlr-2.7.7_5.jar to class loader [2020/05/20-18:54:11.647] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.647] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/spring-expression-3.2.18.RELEASE.jar with timestamp 1589981042315 [2020/05/20-18:54:11.648] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.648] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/spring-expression-3.2.18.RELEASE.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9197644787217653468.tmp [2020/05/20-18:54:11.652] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.652] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/spring-expression-3.2.18.RELEASE.jar to class loader [2020/05/20-18:54:11.653] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.653] [Executor task launch worker for task 0] [INFO] 
[org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gt-epsg-hsql-19.1.jar with timestamp 1589981042329 [2020/05/20-18:54:11.653] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.653] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gt-epsg-hsql-19.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7422169183660229810.tmp [2020/05/20-18:54:11.761] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.761] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gt-epsg-hsql-19.1.jar to class loader [2020/05/20-18:54:11.761] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.761] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gt-main-19.1.jar with timestamp 1589981042306 [2020/05/20-18:54:11.762] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.762] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gt-main-19.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6090297874707697999.tmp [2020/05/20-18:54:11.803] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.802] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gt-main-19.1.jar to class loader [2020/05/20-18:54:11.803] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.803] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/h2-1.4.195.jar with timestamp 1589981042331 [2020/05/20-18:54:11.804] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.804] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/h2-1.4.195.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3360942615000360053.tmp [2020/05/20-18:54:11.810] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.810] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/h2-1.4.195.jar to class loader [2020/05/20-18:54:11.811] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.811] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/spatial4j-0.5.jar with timestamp 1589981042315 [2020/05/20-18:54:11.811] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.811] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/spatial4j-0.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6171914669191681563.tmp [2020/05/20-18:54:11.860] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.859] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding 
file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/spatial4j-0.5.jar to class loader [2020/05/20-18:54:11.860] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.860] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/websocket-server-9.1.6.v20160112.jar with timestamp 1589981042330 [2020/05/20-18:54:11.861] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.861] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/websocket-server-9.1.6.v20160112.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1735653796972694558.tmp [2020/05/20-18:54:11.865] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.865] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/websocket-server-9.1.6.v20160112.jar to class loader [2020/05/20-18:54:11.865] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.865] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jsoup-1.6.3.jar with timestamp 1589981042319 [2020/05/20-18:54:11.866] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.866] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jsoup-1.6.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp282653109281804885.tmp [2020/05/20-18:54:11.870] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.870] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jsoup-1.6.3.jar to class loader [2020/05/20-18:54:11.871] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.870] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gt-referencing-19.1.jar with timestamp 1589981042326 [2020/05/20-18:54:11.871] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.871] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gt-referencing-19.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6898749404400113248.tmp [2020/05/20-18:54:11.926] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.925] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gt-referencing-19.1.jar to class loader [2020/05/20-18:54:11.926] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.926] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/mongodb-driver-3.2.2.jar with timestamp 1589981042309 [2020/05/20-18:54:11.927] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.927] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/mongodb-driver-3.2.2.jar to 
/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2893365882614399172.tmp [2020/05/20-18:54:11.932] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.932] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/mongodb-driver-3.2.2.jar to class loader [2020/05/20-18:54:11.933] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.933] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/javax.servlet-api-3.1.0.jar with timestamp 1589981042331 [2020/05/20-18:54:11.933] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.933] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/javax.servlet-api-3.1.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3039096267299806049.tmp [2020/05/20-18:54:11.937] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.937] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/javax.servlet-api-3.1.0.jar to class loader [2020/05/20-18:54:11.937] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.937] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/poi-ooxml-schemas-4.0.1.jar with timestamp 1589981042302 [2020/05/20-18:54:11.938] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.938] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/poi-ooxml-schemas-4.0.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2027977551173638530.tmp [2020/05/20-18:54:11.963] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.963] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/poi-ooxml-schemas-4.0.1.jar to class loader [2020/05/20-18:54:11.964] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.964] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/bson-3.2.2.jar with timestamp 1589981042320 [2020/05/20-18:54:11.964] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.964] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/bson-3.2.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1275789464763905187.tmp [2020/05/20-18:54:11.968] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.968] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/bson-3.2.2.jar to class loader [2020/05/20-18:54:11.968] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.968] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/slf4j-api-1.7.25.jar with 
timestamp 1589981042313 [2020/05/20-18:54:11.969] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.968] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/slf4j-api-1.7.25.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7655410201479082866.tmp [2020/05/20-18:54:11.971] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.971] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/slf4j-api-1.7.25.jar to class loader [2020/05/20-18:54:11.972] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.972] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/spring-webmvc-3.2.18.RELEASE.jar with timestamp 1589981042328 [2020/05/20-18:54:11.972] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:11.972] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/spring-webmvc-3.2.18.RELEASE.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3707411203480495497.tmp [2020/05/20-18:54:12.035] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.035] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/spring-webmvc-3.2.18.RELEASE.jar to class loader [2020/05/20-18:54:12.036] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.036] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-httpclient-3.1.jar with timestamp 1589981042301 [2020/05/20-18:54:12.037] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.037] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-httpclient-3.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2224582328180467166.tmp [2020/05/20-18:54:12.042] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.042] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-httpclient-3.1.jar to class loader [2020/05/20-18:54:12.042] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.042] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jackson-dataformat-yaml-2.8.1.jar with timestamp 1589981042311 [2020/05/20-18:54:12.043] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.043] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jackson-dataformat-yaml-2.8.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6065204314746161943.tmp [2020/05/20-18:54:12.046] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.046] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding 
file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jackson-dataformat-yaml-2.8.1.jar to class loader [2020/05/20-18:54:12.046] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.046] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/api-all-1.0.2.jar with timestamp 1589981042306 [2020/05/20-18:54:12.047] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.047] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/api-all-1.0.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8761022664470706849.tmp [2020/05/20-18:54:12.095] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.095] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/api-all-1.0.2.jar to class loader [2020/05/20-18:54:12.095] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.095] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/websocket-common-9.1.6.v20160112.jar with timestamp 1589981042330 [2020/05/20-18:54:12.096] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.096] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/websocket-common-9.1.6.v20160112.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7898542187079593704.tmp [2020/05/20-18:54:12.100] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.100] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/websocket-common-9.1.6.v20160112.jar to class loader [2020/05/20-18:54:12.100] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.100] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/log4j-1.2.17.jar with timestamp 1589981042329 [2020/05/20-18:54:12.101] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.101] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/log4j-1.2.17.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5614646333393521212.tmp [2020/05/20-18:54:12.105] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.105] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/log4j-1.2.17.jar to class loader [2020/05/20-18:54:12.105] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.105] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/cglib-nodep-2.2.2.jar with timestamp 1589981042328 [2020/05/20-18:54:12.106] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.106] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/cglib-nodep-2.2.2.jar to 
/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5365463285401777269.tmp [2020/05/20-18:54:12.109] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.109] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/cglib-nodep-2.2.2.jar to class loader [2020/05/20-18:54:12.109] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.109] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gt-shapefile-19.1.jar with timestamp 1589981042309 [2020/05/20-18:54:12.110] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.110] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gt-shapefile-19.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3264170587632377365.tmp [2020/05/20-18:54:12.150] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.150] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gt-shapefile-19.1.jar to class loader [2020/05/20-18:54:12.150] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.150] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jcurses.jar with timestamp 1589981042333 [2020/05/20-18:54:12.152] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.152] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jcurses.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5575817593744186953.tmp [2020/05/20-18:54:12.155] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.155] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jcurses.jar to class loader [2020/05/20-18:54:12.155] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.155] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gson-2.8.0.jar with timestamp 1589981042326 [2020/05/20-18:54:12.156] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.155] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gson-2.8.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1761670553781564996.tmp [2020/05/20-18:54:12.159] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.159] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gson-2.8.0.jar to class loader [2020/05/20-18:54:12.160] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.160] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/c3p0-0.9.1.1.jar with timestamp 1589981042324 [2020/05/20-18:54:12.160] [null-err-50] [INFO] 
[dku.utils] - [2020/05/20-18:54:12.160] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/c3p0-0.9.1.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2482480071578375516.tmp [2020/05/20-18:54:12.178] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.178] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/c3p0-0.9.1.1.jar to class loader [2020/05/20-18:54:12.179] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.179] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/xercesImpl-2.9.1.jar with timestamp 1589981042322 [2020/05/20-18:54:12.179] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.179] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/xercesImpl-2.9.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6709978072862162057.tmp [2020/05/20-18:54:12.186] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.186] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/xercesImpl-2.9.1.jar to class loader [2020/05/20-18:54:12.186] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.186] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/sassyreader.jar with timestamp 1589981042333 [2020/05/20-18:54:12.187] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.187] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/sassyreader.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp889257852495514023.tmp [2020/05/20-18:54:12.191] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.191] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/sassyreader.jar to class loader [2020/05/20-18:54:12.191] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.191] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/antlr-runtime-3.5.2.jar with timestamp 1589981042311 [2020/05/20-18:54:12.192] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.191] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/antlr-runtime-3.5.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4247315676031779321.tmp [2020/05/20-18:54:12.195] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.195] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/antlr-runtime-3.5.2.jar to class loader [2020/05/20-18:54:12.196] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.196] [Executor 
task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-collections4-4.2.jar with timestamp 1589981042317 [2020/05/20-18:54:12.196] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.196] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-collections4-4.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6039346211809475142.tmp [2020/05/20-18:54:12.203] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.203] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-collections4-4.2.jar to class loader [2020/05/20-18:54:12.203] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.203] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jdom2-2.0.6.jar with timestamp 1589981042323 [2020/05/20-18:54:12.204] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.204] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jdom2-2.0.6.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3161723607472973092.tmp [2020/05/20-18:54:12.208] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.208] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jdom2-2.0.6.jar to class loader [2020/05/20-18:54:12.208] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.208] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/spring-beans-3.2.18.RELEASE.jar with timestamp 1589981042320 [2020/05/20-18:54:12.209] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.209] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/spring-beans-3.2.18.RELEASE.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6218540625394400417.tmp [2020/05/20-18:54:12.213] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.213] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/spring-beans-3.2.18.RELEASE.jar to class loader [2020/05/20-18:54:12.214] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.214] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/websocket-servlet-9.1.6.v20160112.jar with timestamp 1589981042329 [2020/05/20-18:54:12.214] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.214] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/websocket-servlet-9.1.6.v20160112.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5876465800630126205.tmp [2020/05/20-18:54:12.218] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.217] [Executor task launch 
worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/websocket-servlet-9.1.6.v20160112.jar to class loader [2020/05/20-18:54:12.218] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.218] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/xmltooling-1.4.4.jar with timestamp 1589981042323 [2020/05/20-18:54:12.218] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.218] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/xmltooling-1.4.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6422513040835627983.tmp [2020/05/20-18:54:12.224] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.224] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/xmltooling-1.4.4.jar to class loader [2020/05/20-18:54:12.224] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.224] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jai_core-1.1.3.jar with timestamp 1589981042323 [2020/05/20-18:54:12.225] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.225] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jai_core-1.1.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9217250343440345627.tmp [2020/05/20-18:54:12.245] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.245] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jai_core-1.1.3.jar to class loader [2020/05/20-18:54:12.246] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.246] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/lang-tag-1.4.4.jar with timestamp 1589981042316 [2020/05/20-18:54:12.247] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.246] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/lang-tag-1.4.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7734203819377853210.tmp [2020/05/20-18:54:12.251] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.251] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/lang-tag-1.4.4.jar to class loader [2020/05/20-18:54:12.252] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.252] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/oauth2-oidc-sdk-6.5.jar with timestamp 1589981042310 [2020/05/20-18:54:12.253] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.253] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/oauth2-oidc-sdk-6.5.jar to 
/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6864034550771103074.tmp [2020/05/20-18:54:12.258] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.258] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/oauth2-oidc-sdk-6.5.jar to class loader [2020/05/20-18:54:12.259] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.259] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/bcprov-jdk15on-1.51.jar with timestamp 1589981042311 [2020/05/20-18:54:12.260] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.259] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/bcprov-jdk15on-1.51.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2151964505029511883.tmp [2020/05/20-18:54:12.276] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.276] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/bcprov-jdk15on-1.51.jar to class loader [2020/05/20-18:54:12.277] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.276] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/t-digest-3.0.jar with timestamp 1589981042328 [2020/05/20-18:54:12.277] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.277] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/t-digest-3.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp193864180025052392.tmp [2020/05/20-18:54:12.282] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.282] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/t-digest-3.0.jar to class loader [2020/05/20-18:54:12.282] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.282] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/spring-context-3.2.18.RELEASE.jar with timestamp 1589981042312 [2020/05/20-18:54:12.283] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.283] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/spring-context-3.2.18.RELEASE.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1648443586257039939.tmp [2020/05/20-18:54:12.288] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.288] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/spring-context-3.2.18.RELEASE.jar to class loader [2020/05/20-18:54:12.289] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.289] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching 
spark://10.139.4.79:41629/jars/mongodb-driver-core-3.2.2.jar with timestamp 1589981042322 [2020/05/20-18:54:12.289] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.289] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/mongodb-driver-core-3.2.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7881440218053421025.tmp [2020/05/20-18:54:12.322] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.321] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/mongodb-driver-core-3.2.2.jar to class loader [2020/05/20-18:54:12.322] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.322] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/JavaEWAH-0.7.9.jar with timestamp 1589981042306 [2020/05/20-18:54:12.323] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.323] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/JavaEWAH-0.7.9.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6367671285374487951.tmp [2020/05/20-18:54:12.328] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.328] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/JavaEWAH-0.7.9.jar to class loader [2020/05/20-18:54:12.329] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.329] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/javax.mail-1.6.1.jar with timestamp 1589981042318 [2020/05/20-18:54:12.330] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.329] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/javax.mail-1.6.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2627714831599543856.tmp [2020/05/20-18:54:12.336] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.336] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/javax.mail-1.6.1.jar to class loader [2020/05/20-18:54:12.336] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.336] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/spring-aop-3.2.18.RELEASE.jar with timestamp 1589981042323 [2020/05/20-18:54:12.337] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.337] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/spring-aop-3.2.18.RELEASE.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4341140327030835573.tmp [2020/05/20-18:54:12.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.341] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding 
file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/spring-aop-3.2.18.RELEASE.jar to class loader [2020/05/20-18:54:12.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.341] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/css-selectors-1.0.5.jar with timestamp 1589981042328 [2020/05/20-18:54:12.341] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.341] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/css-selectors-1.0.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4501537908285603095.tmp [2020/05/20-18:54:12.378] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.378] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/css-selectors-1.0.5.jar to class loader [2020/05/20-18:54:12.379] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.378] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/activation-1.1.jar with timestamp 1589981042306 [2020/05/20-18:54:12.379] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.379] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/activation-1.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1255849096189443612.tmp [2020/05/20-18:54:12.382] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.382] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/activation-1.1.jar to class loader [2020/05/20-18:54:12.383] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.383] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/xmlsec-1.5.7.jar with timestamp 1589981042306 [2020/05/20-18:54:12.383] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.383] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/xmlsec-1.5.7.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1863755571774704051.tmp [2020/05/20-18:54:12.388] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.387] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/xmlsec-1.5.7.jar to class loader [2020/05/20-18:54:12.388] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.388] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/hppc-0.7.1.jar with timestamp 1589981042324 [2020/05/20-18:54:12.388] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.388] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/hppc-0.7.1.jar to 
/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp120214115578567960.tmp [2020/05/20-18:54:12.403] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.403] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/hppc-0.7.1.jar to class loader [2020/05/20-18:54:12.404] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.404] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/nimbus-jose-jwt-8.10.jar with timestamp 1589981042327 [2020/05/20-18:54:12.404] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.404] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/nimbus-jose-jwt-8.10.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1511017186322396692.tmp [2020/05/20-18:54:12.408] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.408] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/nimbus-jose-jwt-8.10.jar to class loader [2020/05/20-18:54:12.408] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.408] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/xerces2-xsd11-2.11.1.jar with timestamp 1589981042321 [2020/05/20-18:54:12.409] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.409] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/xerces2-xsd11-2.11.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9100219487187041666.tmp [2020/05/20-18:54:12.432] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.432] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/xerces2-xsd11-2.11.1.jar to class loader [2020/05/20-18:54:12.433] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.433] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jcip-annotations-1.0-1.jar with timestamp 1589981042325 [2020/05/20-18:54:12.434] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.434] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jcip-annotations-1.0-1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2021919459665492042.tmp [2020/05/20-18:54:12.437] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.437] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jcip-annotations-1.0-1.jar to class loader [2020/05/20-18:54:12.438] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.438] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/esapi-2.0.1.jar with 
timestamp 1589981042317 [2020/05/20-18:54:12.439] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.439] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/esapi-2.0.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7288774415415246958.tmp [2020/05/20-18:54:12.443] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.443] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/esapi-2.0.1.jar to class loader [2020/05/20-18:54:12.444] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.443] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/velocity-1.7.jar with timestamp 1589981042318 [2020/05/20-18:54:12.444] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.444] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/velocity-1.7.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8259300453072028055.tmp [2020/05/20-18:54:12.448] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.448] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/velocity-1.7.jar to class loader [2020/05/20-18:54:12.448] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.448] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-lang3-3.5.jar with timestamp 1589981042322 [2020/05/20-18:54:12.449] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.449] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-lang3-3.5.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7243282684171956058.tmp [2020/05/20-18:54:12.453] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.453] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-lang3-3.5.jar to class loader [2020/05/20-18:54:12.453] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.453] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/javax.annotation-api-1.2.jar with timestamp 1589981042330 [2020/05/20-18:54:12.454] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.453] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/javax.annotation-api-1.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1637292640546925571.tmp [2020/05/20-18:54:12.456] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.456] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/javax.annotation-api-1.2.jar to class loader 
[2020/05/20-18:54:12.457] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.456] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jackson-dataformat-smile-2.8.1.jar with timestamp 1589981042312 [2020/05/20-18:54:12.457] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.457] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jackson-dataformat-smile-2.8.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6219565349166650023.tmp [2020/05/20-18:54:12.460] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.460] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jackson-dataformat-smile-2.8.1.jar to class loader [2020/05/20-18:54:12.460] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.460] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/accessors-smart-1.2.jar with timestamp 1589981042305 [2020/05/20-18:54:12.461] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.461] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/accessors-smart-1.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1407896223777472986.tmp [2020/05/20-18:54:12.463] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.463] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/accessors-smart-1.2.jar to class loader [2020/05/20-18:54:12.464] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.464] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/antlr4-runtime-4.7.jar with timestamp 1589981042322 [2020/05/20-18:54:12.464] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.464] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/antlr4-runtime-4.7.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5996297636071700894.tmp [2020/05/20-18:54:12.479] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.479] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/antlr4-runtime-4.7.jar to class loader [2020/05/20-18:54:12.480] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.480] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jsr305-1.3.9.jar with timestamp 1589981042305 [2020/05/20-18:54:12.480] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.480] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jsr305-1.3.9.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4010880943212683343.tmp [2020/05/20-18:54:12.484] [null-err-50] 
[INFO] [dku.utils] - [2020/05/20-18:54:12.484] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jsr305-1.3.9.jar to class loader [2020/05/20-18:54:12.484] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.484] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gt-data-19.1.jar with timestamp 1589981042317 [2020/05/20-18:54:12.485] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.485] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gt-data-19.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6171184549716663139.tmp [2020/05/20-18:54:12.488] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.488] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gt-data-19.1.jar to class loader [2020/05/20-18:54:12.488] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.488] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/GeographicLib-Java-1.44.jar with timestamp 1589981042314 [2020/05/20-18:54:12.489] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.489] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/GeographicLib-Java-1.44.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3648681701128059601.tmp [2020/05/20-18:54:12.520] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.520] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/GeographicLib-Java-1.44.jar to class loader [2020/05/20-18:54:12.521] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.521] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/json-smart-2.3.jar with timestamp 1589981042318 [2020/05/20-18:54:12.522] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.522] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/json-smart-2.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4118083896179265059.tmp [2020/05/20-18:54:12.526] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.526] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/json-smart-2.3.jar to class loader [2020/05/20-18:54:12.527] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.527] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/avro-1.7.7.jar with timestamp 1589981042311 [2020/05/20-18:54:12.528] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.528] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching 
spark://10.139.4.79:41629/jars/avro-1.7.7.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp94746355354917193.tmp [2020/05/20-18:54:12.533] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.533] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/avro-1.7.7.jar to class loader [2020/05/20-18:54:12.534] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.533] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/asm-5.0.4.jar with timestamp 1589981042304 [2020/05/20-18:54:12.534] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.534] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/asm-5.0.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3927714218160310310.tmp [2020/05/20-18:54:12.538] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.538] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/asm-5.0.4.jar to class loader [2020/05/20-18:54:12.538] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.538] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/securesm-1.0.jar with timestamp 1589981042321 [2020/05/20-18:54:12.539] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.539] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/securesm-1.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8200893057756744348.tmp [2020/05/20-18:54:12.542] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.542] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/securesm-1.0.jar to class loader [2020/05/20-18:54:12.542] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.542] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/xmlbeans-3.0.2.jar with timestamp 1589981042322 [2020/05/20-18:54:12.543] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.543] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/xmlbeans-3.0.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp7890088980202526166.tmp [2020/05/20-18:54:12.551] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.551] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/xmlbeans-3.0.2.jar to class loader [2020/05/20-18:54:12.552] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.552] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/gt-opengis-19.1.jar with timestamp 1589981042304 
[2020/05/20-18:54:12.552] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.552] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/gt-opengis-19.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9082137837964084622.tmp [2020/05/20-18:54:12.556] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.556] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/gt-opengis-19.1.jar to class loader [2020/05/20-18:54:12.556] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.556] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/java-cup-10k.jar with timestamp 1589981042304 [2020/05/20-18:54:12.557] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.557] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/java-cup-10k.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8472271360403037242.tmp [2020/05/20-18:54:12.560] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.560] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/java-cup-10k.jar to class loader [2020/05/20-18:54:12.560] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.560] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/dataiku-dss-core.jar with timestamp 1589981042332 [2020/05/20-18:54:12.560] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.560] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/dataiku-dss-core.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp153713697119677919.tmp [2020/05/20-18:54:12.564] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.564] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dataiku-dss-core.jar to class loader [2020/05/20-18:54:12.564] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.564] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/cassandra-driver-core-3.0.8.jar with timestamp 1589981042324 [2020/05/20-18:54:12.565] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.565] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/cassandra-driver-core-3.0.8.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4774768876322323436.tmp [2020/05/20-18:54:12.570] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.570] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/cassandra-driver-core-3.0.8.jar to class loader 
[2020/05/20-18:54:12.571] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.571] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/trove4j-3.0.3.jar with timestamp 1589981042319 [2020/05/20-18:54:12.571] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.571] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/trove4j-3.0.3.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp6189458016539466603.tmp [2020/05/20-18:54:12.579] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.579] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/trove4j-3.0.3.jar to class loader [2020/05/20-18:54:12.579] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.579] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/org.apache.servicemix.bundles.xpp3-1.1.4c_7.jar with timestamp 1589981042329 [2020/05/20-18:54:12.580] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.580] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/org.apache.servicemix.bundles.xpp3-1.1.4c_7.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4872604972436018281.tmp [2020/05/20-18:54:12.584] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.584] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/org.apache.servicemix.bundles.xpp3-1.1.4c_7.jar to class loader [2020/05/20-18:54:12.584] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.584] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/dataiku-scoring.jar with timestamp 1589981042332 [2020/05/20-18:54:12.585] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.585] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/dataiku-scoring.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3704656750153086417.tmp [2020/05/20-18:54:12.590] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.590] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/dataiku-scoring.jar to class loader [2020/05/20-18:54:12.590] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.590] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/not-yet-commons-ssl-0.3.9.jar with timestamp 1589981042325 [2020/05/20-18:54:12.591] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.591] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/not-yet-commons-ssl-0.3.9.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8833891313827484762.tmp 
[2020/05/20-18:54:12.594] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.594] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/not-yet-commons-ssl-0.3.9.jar to class loader [2020/05/20-18:54:12.595] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.595] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jsr166e-1.1.0.jar with timestamp 1589981042308 [2020/05/20-18:54:12.595] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.595] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jsr166e-1.1.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3284614131304793052.tmp [2020/05/20-18:54:12.599] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.599] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jsr166e-1.1.0.jar to class loader [2020/05/20-18:54:12.599] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.599] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/jsch-0.1.54.jar with timestamp 1589981042309 [2020/05/20-18:54:12.599] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.599] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/jsch-0.1.54.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp9192418601115768606.tmp [2020/05/20-18:54:12.602] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.602] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/jsch-0.1.54.jar to class loader [2020/05/20-18:54:12.603] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.602] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/java-image-scaling-0.8.6.jar with timestamp 1589981042312 [2020/05/20-18:54:12.603] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.603] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/java-image-scaling-0.8.6.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp1388913443812646122.tmp [2020/05/20-18:54:12.606] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.606] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/java-image-scaling-0.8.6.jar to class loader [2020/05/20-18:54:12.606] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.606] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/commons-collections-3.2.2.jar with timestamp 1589981042321 [2020/05/20-18:54:12.607] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.607] [Executor task launch worker for 
task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/commons-collections-3.2.2.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3381881194888246936.tmp [2020/05/20-18:54:12.610] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.610] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/commons-collections-3.2.2.jar to class loader [2020/05/20-18:54:12.611] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.610] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/adal4j-1.6.4.jar with timestamp 1589981042308 [2020/05/20-18:54:12.611] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.611] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/adal4j-1.6.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp3752283960917201563.tmp [2020/05/20-18:54:12.614] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.613] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/adal4j-1.6.4.jar to class loader [2020/05/20-18:54:12.614] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.614] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/ini4j-0.5.4.jar with timestamp 1589981042330 [2020/05/20-18:54:12.614] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.614] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/ini4j-0.5.4.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp8943571677503576960.tmp [2020/05/20-18:54:12.617] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.617] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/ini4j-0.5.4.jar to class loader [2020/05/20-18:54:12.617] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.617] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/org.eclipse.wst.xml.xpath2.processor-2.1.100.jar with timestamp 1589981042305 [2020/05/20-18:54:12.618] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.617] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/org.eclipse.wst.xml.xpath2.processor-2.1.100.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp2048236993202814374.tmp [2020/05/20-18:54:12.680] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.680] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/org.eclipse.wst.xml.xpath2.processor-2.1.100.jar to class loader [2020/05/20-18:54:12.681] [null-err-50] [INFO] [dku.utils] - 
[2020/05/20-18:54:12.680] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/json-simple-1.1.jar with timestamp 1589981042304 [2020/05/20-18:54:12.681] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.681] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/json-simple-1.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp5028487869768440959.tmp [2020/05/20-18:54:12.685] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.685] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/json-simple-1.1.jar to class loader [2020/05/20-18:54:12.686] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.686] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/hsqldb-2.3.0.jar with timestamp 1589981042326 [2020/05/20-18:54:12.686] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.686] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/hsqldb-2.3.0.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp423181432579846942.tmp [2020/05/20-18:54:12.692] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.692] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/hsqldb-2.3.0.jar to class loader [2020/05/20-18:54:12.692] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.692] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Fetching spark://10.139.4.79:41629/jars/geoip2-0.4.1.jar with timestamp 1589981042310 [2020/05/20-18:54:12.693] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.693] [Executor task launch worker for task 0] [INFO] [org.apache.spark.util.Utils] - Fetching spark://10.139.4.79:41629/jars/geoip2-0.4.1.jar to /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/fetchFileTemp4896455286318861140.tmp [2020/05/20-18:54:12.695] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.695] [Executor task launch worker for task 0] [INFO] [org.apache.spark.executor.Executor] - Adding file:/tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/geoip2-0.4.1.jar to class loader [2020/05/20-18:54:12.734] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.734] [Executor task launch worker for task 0] [WARN] [org.apache.spark.SparkConf] - The configuration key 'spark.yarn.executor.memoryOverhead' has been deprecated as of Spark 2.3 and may be removed in the future. Please use the new key 'spark.executor.memoryOverhead' instead. 
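Everything up to this point is routine executor startup: each jar registered on the job's spark.jars list is fetched from the driver's file server (spark://10.139.4.79:41629) into the executor's userFiles directory and added to its class loader. The WARN that closes the block is likewise benign; Spark renamed the key in 2.3 and the message itself names the replacement. A minimal sketch of how both settings are expressed in code, with hypothetical jar paths and overhead value (a DSS job like this one would normally get them from a named Spark configuration in the recipe or instance settings rather than from code):

    import org.apache.spark.SparkConf

    // Sketch only: illustrative values, not this job's actual configuration.
    val conf = new SparkConf()
      // Every entry on spark.jars is what executors later fetch from the
      // driver's spark:// file server and add to their class loader.
      .setJars(Seq("/opt/libs/commons-lang3-3.5.jar", "/opt/libs/avro-1.7.7.jar"))
      // Non-deprecated replacement for spark.yarn.executor.memoryOverhead;
      // the value is read as MiB unless a unit suffix is given.
      .set("spark.executor.memoryOverhead", "512")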
[2020/05/20-18:54:12.738] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.738] [Executor task launch worker for task 0] [INFO] [dku.spark.dataset.remote] - compute: dataset=CHURNALGO.Telco_Customer_Churn_prepared_distinct columns= partitions= filter=
[2020/05/20-18:54:12.978] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:12.977] [Executor task launch worker for task 0] [INFO] [org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator] - Code generated in 66.263818 ms
[2020/05/20-18:54:13.006] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.006] [Executor task launch worker for task 0] [INFO] [org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator] - Code generated in 14.912211 ms
[2020/05/20-18:54:13.013] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.013] [Executor task launch worker for task 0] [INFO] [dip.plugin.presets] - Setting up remote plugin settings access to admin1:35569 jek=true
[2020/05/20-18:54:13.018] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.018] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Looking for resource yarn-python-bins0uRw67ec.json in localFiles
[2020/05/20-18:54:13.018] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.018] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Distributed cache has file : /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/yarn-python-bins0uRw67ec.json
[2020/05/20-18:54:13.019] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.019] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Reassign file dku.shaker.yarn.python.bins to path /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/yarn-python-bins0uRw67ec.json
[2020/05/20-18:54:13.019] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.019] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Looking for resource rec-YfRNgeLJ.zip.nodecompression in localFiles
[2020/05/20-18:54:13.019] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.019] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Distributed cache has file : /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-YfRNgeLJ.zip.nodecompression
[2020/05/20-18:54:13.020] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.020] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Reassign file python-package-dataikuapi to path /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-YfRNgeLJ.zip.nodecompression
[2020/05/20-18:54:13.020] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.020] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Looking for resource contributed_python_processorskB6B5ITd.json in localFiles
[2020/05/20-18:54:13.020] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.020] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Distributed cache has file : /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/contributed_python_processorskB6B5ITd.json
[2020/05/20-18:54:13.021] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.020] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Reassign file dku.shaker.contrib to path /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/contributed_python_processorskB6B5ITd.json
[2020/05/20-18:54:13.021] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.021] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Looking for resource rec-BHQI2Oqk.zip.nodecompression in localFiles
[2020/05/20-18:54:13.021] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.021] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Distributed cache has file : /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-BHQI2Oqk.zip.nodecompression
[2020/05/20-18:54:13.021] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.021] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Reassign file python-package-dataiku to path /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-BHQI2Oqk.zip.nodecompression
[2020/05/20-18:54:13.022] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.022] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Looking for resource rec-ITk9gUDV.zip.nodecompression in localFiles
[2020/05/20-18:54:13.022] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.022] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Distributed cache has file : /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-ITk9gUDV.zip.nodecompression
[2020/05/20-18:54:13.022] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.022] [Executor task launch worker for task 0] [INFO] [dku.shaker.resources] - Reassign file python-lib-instance to path /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-ITk9gUDV.zip.nodecompression
[2020/05/20-18:54:13.030] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.029] [Executor task launch worker for task 0] [INFO] [dip.shaker.yarn.python] - Unzip /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-YfRNgeLJ.zip.nodecompression to /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/__dku_python_packages/dataikuapi (zip=rec-YfRNgeLJ.zip.nodecompression package=dataikuapi)
[2020/05/20-18:54:13.045] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.045] [Executor task launch worker for task 0] [INFO] [dip.shaker.yarn.python] - Unzip /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-BHQI2Oqk.zip.nodecompression to /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/__dku_python_packages/dataiku (zip=rec-BHQI2Oqk.zip.nodecompression package=dataiku)
[2020/05/20-18:54:13.129] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.128] [Executor task launch worker for task 0] [INFO] [dip.shaker.yarn.python] - Unzip instance lib /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88/userFiles-4af4f1f3-7d6b-422b-8a9a-b26b869c12b3/rec-ITk9gUDV.zip.nodecompression to /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/__dku_instance_python_lib
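The "Looking for resource / Distributed cache has file / Reassign file" exchange above is the executor resolving files the driver shipped alongside the job (the dataiku and dataikuapi Python packages, JSON descriptors, the instance Python lib) and unpacking them under the job's sparkbased-recipe directory. The executor-side behaviour matches Spark's standard file-distribution API, SparkContext.addFile on the driver and SparkFiles.get inside tasks; a minimal sketch of that generic mechanism, with a hypothetical file name (this is not DSS's dku.shaker.resources code itself):

    import java.nio.file.{Files, Paths}
    import org.apache.spark.SparkFiles
    import org.apache.spark.sql.SparkSession

    object DistCacheSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().master("local[1]").appName("dist-cache-sketch").getOrCreate()
        val sc = spark.sparkContext

        // Driver side: create and register a resource; Spark copies it into each
        // executor's userFiles-* directory, the "distributed cache" named in the log.
        Files.write(Paths.get("/tmp/example-resource.json"), "{}".getBytes("UTF-8"))
        sc.addFile("/tmp/example-resource.json")

        // Executor side (inside a task): resolve the local copy by bare file name,
        // much as the log's "Reassign file ... to path ..." lines do.
        val localPath = sc.parallelize(Seq(1)).map(_ => SparkFiles.get("example-resource.json")).first()
        println(localPath)

        spark.stop()
      }
    }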
[INFO] [dku.utils] - [2020/05/20-18:54:13.167] [Executor task launch worker for task 0] [INFO] [org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator] - Code generated in 10.905944 ms
[2020/05/20-18:54:13.294] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.294] [Executor task launch worker for task 0] [INFO] [org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator] - Code generated in 67.727458 ms
[2020/05/20-18:54:13.344] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.344] [Executor task launch worker for task 0] [INFO] [dku.spark.dataset.remote] - closing dataset stream
[2020/05/20-18:54:13.358] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.356] [Executor task launch worker for task 0] [ERROR] [org.apache.spark.executor.Executor] - Exception in task 0.0 in stage 0.0 (TID 0)
[2020/05/20-18:54:13.359] [null-err-50] [INFO] [dku.utils] - org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) => double)
[2020/05/20-18:54:13.359] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)
[2020/05/20-18:54:13.360] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
[2020/05/20-18:54:13.360] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:636)
[2020/05/20-18:54:13.360] [null-err-50] [INFO] [dku.utils] - at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
[2020/05/20-18:54:13.361] [null-err-50] [INFO] [dku.utils] - at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
[2020/05/20-18:54:13.361] [null-err-50] [INFO] [dku.utils] - at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
[2020/05/20-18:54:13.362] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)
[2020/05/20-18:54:13.362] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)
[2020/05/20-18:54:13.362] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
[2020/05/20-18:54:13.362] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
[2020/05/20-18:54:13.363] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.Task.run(Task.scala:123)
[2020/05/20-18:54:13.363] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
[2020/05/20-18:54:13.363] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
[2020/05/20-18:54:13.363] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
[2020/05/20-18:54:13.363] [null-err-50] [INFO] [dku.utils] - at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[2020/05/20-18:54:13.364] [null-err-50] [INFO] [dku.utils] - at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[2020/05/20-18:54:13.364] [null-err-50] [INFO] [dku.utils] - at java.lang.Thread.run(Thread.java:748)
[2020/05/20-18:54:13.364] [null-err-50] [INFO] [dku.utils] - Caused by: java.util.NoSuchElementException: key not found: false
[2020/05/20-18:54:13.364] [null-err-50] [INFO] [dku.utils] - at scala.collection.MapLike$class.default(MapLike.scala:228)
[2020/05/20-18:54:13.364] [null-err-50] [INFO] [dku.utils] - at scala.collection.AbstractMap.default(Map.scala:59)
[2020/05/20-18:54:13.364] [null-err-50] [INFO] [dku.utils] - at scala.collection.MapLike$class.apply(MapLike.scala:141)
[2020/05/20-18:54:13.364] [null-err-50] [INFO] [dku.utils] - at scala.collection.AbstractMap.apply(Map.scala:59)
[2020/05/20-18:54:13.365] [null-err-50] [INFO] [dku.utils] - ... 17 more
[2020/05/20-18:54:13.381] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.380] [task-result-getter-0] [WARN] [org.apache.spark.scheduler.TaskSetManager] - Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) => double)
[2020/05/20-18:54:13.384] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.383] [task-result-getter-0] [ERROR] [org.apache.spark.scheduler.TaskSetManager] - Task 0 in stage 0.0 failed 1 times; aborting job
[2020/05/20-18:54:13.385] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.385] [task-result-getter-0] [INFO] [org.apache.spark.scheduler.TaskSchedulerImpl] - Removed TaskSet 0.0, whose tasks have all completed, from pool
[2020/05/20-18:54:13.392] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.392] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.TaskSchedulerImpl] - Cancelling stage 0
[2020/05/20-18:54:13.392] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.392] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.TaskSchedulerImpl] - Killing all running tasks in stage 0: Stage cancelled
[2020/05/20-18:54:13.395] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.394] [dag-scheduler-event-loop] [INFO] [org.apache.spark.scheduler.DAGScheduler] - ShuffleMapStage 0 (map at MLLibEvaluationJob.scala:142) failed in 5.227 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) => double)
[2020/05/20-18:54:13.399] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.399] [main] [INFO] [org.apache.spark.scheduler.DAGScheduler] - Job 0 failed: collect at BinaryClassificationMetrics.scala:195, took 5.294069 s
[2020/05/20-18:54:13.400] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.400] [main] [INFO] [dku.spark.mllib] - Cleaning up.
[2020/05/20-18:54:13.400] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.400] [main] [ERROR] [dku.spark.mllib] - Failed to cleanup jobs due to error.
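The root cause on the executor is Scala's `Map.apply` raising `java.util.NoSuchElementException: key not found: false` inside the scoring UDF (`Map2: (string) => double`). A plausible reading, though the log does not show the UDF's source, is that the evaluation recipe converts the target column to a numeric class index through a label-to-double map built from the classes the model saw at training time; the evaluation dataset then contains a target value, the string "false", that the map has no entry for (for example because the training target used a different casing or storage type). A minimal sketch that reproduces the same failure mode follows; `labelToDouble`, `KeyNotFoundRepro` and the column name are illustrative, not Dataiku code:

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

// Reproduces "Failed to execute user defined function ... Caused by:
// java.util.NoSuchElementException: key not found: false" (illustrative sketch).
object KeyNotFoundRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[1]").appName("repro").getOrCreate()
    import spark.implicits._

    // Hypothetical label map built from training-time classes; the evaluation
    // data below contains "false", which is absent from the map.
    val labelToDouble: Map[String, Double] = Map("true" -> 1.0)

    // Map.apply throws NoSuchElementException for a missing key, which Spark
    // wraps in "SparkException: Failed to execute user defined function".
    val toDouble = udf((label: String) => labelToDouble(label))

    Seq("true", "false").toDF("Churn")
      .select(toDouble($"Churn").as("label"))
      .show() // the action that makes the executor task fail

    spark.stop()
  }
}
```

Looking the value up with `labelToDouble.get(label)` (an `Option`) or `getOrElse`, or normalizing the evaluation dataset's target values to exactly the training classes before running the evaluation recipe, would avoid the hard task failure.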
[2020/05/20-18:54:13.400] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.400] [main] [ERROR] [dku.spark.env] - job failed
[2020/05/20-18:54:13.401] [null-err-50] [INFO] [dku.utils] - org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) => double)
[2020/05/20-18:54:13.403] [null-err-50] [INFO] [dku.utils] - Caused by: java.util.NoSuchElementException: key not found: false
[2020/05/20-18:54:13.404] [null-err-50] [INFO] [dku.utils] - Driver stacktrace:
[2020/05/20-18:54:13.404] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1891)
[2020/05/20-18:54:13.404] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1879)
[2020/05/20-18:54:13.404] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1878)
[2020/05/20-18:54:13.404] [null-err-50] [INFO] [dku.utils] - at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
[2020/05/20-18:54:13.404] [null-err-50] [INFO] [dku.utils] - at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
[2020/05/20-18:54:13.405] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1878)
[2020/05/20-18:54:13.405] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:927)
[2020/05/20-18:54:13.405] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:927)
[2020/05/20-18:54:13.405] [null-err-50] [INFO] [dku.utils] - at scala.Option.foreach(Option.scala:257)
[2020/05/20-18:54:13.405] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:927)
[2020/05/20-18:54:13.405] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2112)
[2020/05/20-18:54:13.405] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2061)
[2020/05/20-18:54:13.405] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2050)
[2020/05/20-18:54:13.406] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
[2020/05/20-18:54:13.406] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:738)
[2020/05/20-18:54:13.406] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
[2020/05/20-18:54:13.406] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
[2020/05/20-18:54:13.406] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
[2020/05/20-18:54:13.406] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
[2020/05/20-18:54:13.406] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:990)
[2020/05/20-18:54:13.406] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
[2020/05/20-18:54:13.407] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
[2020/05/20-18:54:13.407] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
[2020/05/20-18:54:13.407] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.rdd.RDD.collect(RDD.scala:989)
[2020/05/20-18:54:13.407] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4$lzycompute(BinaryClassificationMetrics.scala:195)
[2020/05/20-18:54:13.407] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4(BinaryClassificationMetrics.scala:146)
[2020/05/20-18:54:13.407] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions$lzycompute(BinaryClassificationMetrics.scala:148)
[2020/05/20-18:54:13.407] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions(BinaryClassificationMetrics.scala:148)
[2020/05/20-18:54:13.408] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.createCurve(BinaryClassificationMetrics.scala:226)
[2020/05/20-18:54:13.408] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.roc(BinaryClassificationMetrics.scala:86)
[2020/05/20-18:54:13.408] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.areaUnderROC(BinaryClassificationMetrics.scala:97)
[2020/05/20-18:54:13.408] [null-err-50] [INFO] [dku.utils] - at com.dataiku.dip.spark.ml.prediction.BinaryClassificationFixedPerf.auc$lzycompute(BinaryClassificationEvaluator.scala:224)
[2020/05/20-18:54:13.410] [null-err-50] [INFO] [dku.utils] - at com.dataiku.dip.spark.ml.prediction.BinaryClassificationFixedPerf.auc(BinaryClassificationEvaluator.scala:223)
[2020/05/20-18:54:13.411] [null-err-50] [INFO] [dku.utils] - at com.dataiku.dip.spark.MLLibEvaluationJob.getMetrics(MLLibEvaluationJob.scala:151)
[2020/05/20-18:54:13.411] [null-err-50] [INFO] [dku.utils] - at com.dataiku.dip.spark.MLLibEvaluationJob.<init>(MLLibEvaluationJob.scala:223)
[2020/05/20-18:54:13.411] [null-err-50] [INFO] [dku.utils] - at com.dataiku.dip.spark.MLLibEvaluationJob$.delayedEndpoint$com$dataiku$dip$spark$MLLibEvaluationJob$1(MLLibEvaluationJob.scala:27)
[2020/05/20-18:54:13.411] [null-err-50] [INFO] [dku.utils] - at com.dataiku.dip.spark.MLLibEvaluationJob$delayedInit$body.apply(MLLibEvaluationJob.scala:25)
[2020/05/20-18:54:13.411] [null-err-50] [INFO] [dku.utils] - at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
[2020/05/20-18:54:13.412] [null-err-50] [INFO] [dku.utils] - at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
[2020/05/20-18:54:13.412] [null-err-50] [INFO] [dku.utils] - at com.dataiku.dip.spark.SuicidalApp$$anonfun$delayedInit$1.apply$mcV$sp(package.scala:482)
[2020/05/20-18:54:13.412] [null-err-50] [INFO] [dku.utils] - at scala.App$$anonfun$main$1.apply(App.scala:76)
[2020/05/20-18:54:13.412] [null-err-50] [INFO] [dku.utils] - at scala.App$$anonfun$main$1.apply(App.scala:76)
[2020/05/20-18:54:13.413] [null-err-50] [INFO] [dku.utils] - at scala.collection.immutable.List.foreach(List.scala:392)
[2020/05/20-18:54:13.413] [null-err-50] [INFO] [dku.utils] - at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
[2020/05/20-18:54:13.413] [null-err-50] [INFO] [dku.utils] - at scala.App$class.main(App.scala:76)
[2020/05/20-18:54:13.413] [null-err-50] [INFO] [dku.utils] - at com.dataiku.dip.spark.MLLibEvaluationJob$.main(MLLibEvaluationJob.scala:25)
[2020/05/20-18:54:13.414] [null-err-50] [INFO] [dku.utils] - at com.dataiku.dip.spark.MLLibEvaluationJob.main(MLLibEvaluationJob.scala)
[2020/05/20-18:54:13.414] [null-err-50] [INFO] [dku.utils] - at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[2020/05/20-18:54:13.414] [null-err-50] [INFO] [dku.utils] - at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[2020/05/20-18:54:13.414] [null-err-50] [INFO] [dku.utils] - at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[2020/05/20-18:54:13.414] [null-err-50] [INFO] [dku.utils] - at java.lang.reflect.Method.invoke(Method.java:498)
[2020/05/20-18:54:13.415] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
[2020/05/20-18:54:13.415] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
[2020/05/20-18:54:13.415] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
[2020/05/20-18:54:13.415] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
[2020/05/20-18:54:13.415] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
[2020/05/20-18:54:13.416] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
[2020/05/20-18:54:13.416] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
[2020/05/20-18:54:13.416] [null-err-50] [INFO] [dku.utils] - at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
[2020/05/20-18:54:13.416] [null-err-50] [INFO] [dku.utils] - Caused by: org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) => double)
[2020/05/20-18:54:13.420] [null-err-50] [INFO] [dku.utils] - Caused by: java.util.NoSuchElementException: key not found: false
[2020/05/20-18:54:13.421] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.417] [main] [INFO] [dku.spark.entrypoint] - Spark: initializeApplication
[2020/05/20-18:54:13.421] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.417] [main] [INFO] [dku.spark.entrypoint] - Spark: initializeApplication: ALREADY done
[2020/05/20-18:54:13.422] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.417] [main] [INFO] [dku.spark.entrypoint] - Loading exec env
[2020/05/20-18:54:13.422] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.418] [main] [INFO] [dku.spark.exec] - Loading exec env with impersonation enabled=false user=null
[2020/05/20-18:54:13.422] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.419] [main] [INFO] [dku.remoterun.fileexchange.helper] - Running locally, dump error.json into cwd
[2020/05/20-18:54:13.433] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.433] [main] [INFO] [org.spark_project.jetty.server.AbstractConnector] - Stopped Spark@10f7c8ee{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
[2020/05/20-18:54:13.435] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.435] [main] [INFO] [org.apache.spark.ui.SparkUI] - Stopped Spark web UI at http://10.139.4.79:4040
[2020/05/20-18:54:13.446] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.446] [dispatcher-event-loop-7] [INFO] [org.apache.spark.MapOutputTrackerMasterEndpoint] - MapOutputTrackerMasterEndpoint stopped!
[2020/05/20-18:54:13.463] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.463] [main] [INFO] [org.apache.spark.storage.memory.MemoryStore] - MemoryStore cleared
[2020/05/20-18:54:13.463] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.463] [main] [INFO] [org.apache.spark.storage.BlockManager] - BlockManager stopped
[2020/05/20-18:54:13.473] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.472] [main] [INFO] [org.apache.spark.storage.BlockManagerMaster] - BlockManagerMaster stopped
[2020/05/20-18:54:13.476] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.476] [dispatcher-event-loop-0] [INFO] [org.apache.spark.scheduler.OutputCommitCoordinator$OutputCommitCoordinatorEndpoint] - OutputCommitCoordinator stopped!
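The driver frames above locate where the failure surfaces: `MLLibEvaluationJob.getMetrics` evaluates `BinaryClassificationFixedPerf.auc`, which calls MLLib's `BinaryClassificationMetrics.areaUnderROC`; that method lazily derives the ROC curve from confusion counts and only then runs a Spark job (the `collect at BinaryClassificationMetrics.scala:195` reported for Job 0). Nothing executes until that collect, which is why the UDF error appears during metric computation rather than when the scored data is defined. A sketch of the same call path against the public MLLib API (the input pairs are made-up values):

```scala
import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
import org.apache.spark.sql.SparkSession

// Sketch of the path the driver stack trace goes through: AUC is computed
// from (score, label) pairs, and areaUnderROC() is the first action that
// forces the upstream transformations (including any scoring UDF) to run.
object AucCallPathSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[1]").appName("auc-sketch").getOrCreate()
    val sc = spark.sparkContext

    // (probability of the positive class, numeric label in {0.0, 1.0})
    val scoreAndLabels = sc.parallelize(Seq((0.9, 1.0), (0.7, 1.0), (0.4, 0.0), (0.2, 0.0)))

    val metrics = new BinaryClassificationMetrics(scoreAndLabels)
    // Lazy until here: this call triggers the job that the log reports as
    // "collect at BinaryClassificationMetrics.scala:195".
    println(s"AUC = ${metrics.areaUnderROC()}")

    spark.stop()
  }
}
```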
[2020/05/20-18:54:13.505] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.505] [main] [INFO] [org.apache.spark.SparkContext] - Successfully stopped SparkContext
[2020/05/20-18:54:13.509] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.509] [Thread-1] [INFO] [org.apache.spark.util.ShutdownHookManager] - Shutdown hook called
[2020/05/20-18:54:13.510] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.510] [Thread-1] [INFO] [org.apache.spark.util.ShutdownHookManager] - Deleting directory /tmp/spark-fe4fea45-b6d0-4b26-962f-d745e10d8f88
[2020/05/20-18:54:13.513] [null-err-50] [INFO] [dku.utils] - [2020/05/20-18:54:13.513] [Thread-1] [INFO] [org.apache.spark.util.ShutdownHookManager] - Deleting directory /tmp/spark-69729266-0990-4c75-99b7-d3f5cfea2712
[2020/05/20-18:54:13.892] [FRT-43-FlowRunnable] [INFO] [dku.recipes.code.base] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Error file found, trying to throw it: /home/dataiku/DATA_DIR/jobs/CHURNALGO/Build_predict_tcs_2020-05-20T13-23-56.140/evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP/sparkbased-recipe/out9UtT1gEVnWeg/error.json
[2020/05/20-18:54:13.894] [FRT-43-FlowRunnable] [INFO] [dku.recipes.code.base] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Raw error is{"errorType":"org.apache.spark.SparkException","message":"Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) \u003d\u003e double)\n\tat org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)\n\tat org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)\n\tat org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:636)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)\n\tat org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)\n\tat org.apache.spark.scheduler.Task.run(Task.scala:123)\n\tat org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)\n\tat org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)\n\tat org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\nCaused by: java.util.NoSuchElementException: key not found: false\n\tat scala.collection.MapLike$class.default(MapLike.scala:228)\n\tat scala.collection.AbstractMap.default(Map.scala:59)\n\tat scala.collection.MapLike$class.apply(MapLike.scala:141)\n\tat scala.collection.AbstractMap.apply(Map.scala:59)\n\t... 
17 more\n\nDriver stacktrace:","detailedMessage":"Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) \u003d\u003e double)\n\tat org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)\n\tat org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)\n\tat org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:636)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)\n\tat org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)\n\tat org.apache.spark.scheduler.Task.run(Task.scala:123)\n\tat org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)\n\tat org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)\n\tat org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\nCaused by: java.util.NoSuchElementException: key not found: false\n\tat scala.collection.MapLike$class.default(MapLike.scala:228)\n\tat scala.collection.AbstractMap.default(Map.scala:59)\n\tat scala.collection.MapLike$class.apply(MapLike.scala:141)\n\tat scala.collection.AbstractMap.apply(Map.scala:59)\n\t... 
17 more\n\nDriver stacktrace:, caused by: SparkException: Failed to execute user defined function(Map2: (string) \u003d\u003e double), caused by: NoSuchElementException: key not found: false","detailedMessageHTML":"\u003cspan\u003e\u003cspan class\u003d\"err-msg\"\u003eJob aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) \u003d\u0026gt; double)\n\tat org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)\n\tat org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)\n\tat org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:636)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)\n\tat org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)\n\tat org.apache.spark.scheduler.Task.run(Task.scala:123)\n\tat org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)\n\tat org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)\n\tat org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\nCaused by: java.util.NoSuchElementException: key not found: false\n\tat scala.collection.MapLike$class.default(MapLike.scala:228)\n\tat scala.collection.AbstractMap.default(Map.scala:59)\n\tat scala.collection.MapLike$class.apply(MapLike.scala:141)\n\tat scala.collection.AbstractMap.apply(Map.scala:59)\n\t... 
17 more\n\nDriver stacktrace:\u003c/span\u003e\u003cspan class\u003d\"err-cause\"\u003e\u003cspan class\u003d\"err-type\"\u003e, caused by: SparkException:\u003c/span\u003e\u0026nbsp;\u003cspan class\u003d\"err-msg\"\u003eFailed to execute user defined function(Map2: (string) \u003d\u0026gt; double)\u003c/span\u003e\u003c/span\u003e\u003cspan class\u003d\"err-cause\"\u003e\u003cspan class\u003d\"err-type\"\u003e, caused by: NoSuchElementException:\u003c/span\u003e\u0026nbsp;\u003cspan class\u003d\"err-msg\"\u003ekey not found: false\u003c/span\u003e\u003c/span\u003e\u003c/span\u003e","stackTraceStr":"org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) \u003d\u003e double)\n\tat org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)\n\tat org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)\n\tat org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:636)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)\n\tat org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)\n\tat org.apache.spark.scheduler.Task.run(Task.scala:123)\n\tat org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)\n\tat org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)\n\tat org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\nCaused by: java.util.NoSuchElementException: key not found: false\n\tat scala.collection.MapLike$class.default(MapLike.scala:228)\n\tat scala.collection.AbstractMap.default(Map.scala:59)\n\tat scala.collection.MapLike$class.apply(MapLike.scala:141)\n\tat scala.collection.AbstractMap.apply(Map.scala:59)\n\t... 
17 more\n\nDriver stacktrace:\n\tat org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1891)\n\tat org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1879)\n\tat org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1878)\n\tat scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)\n\tat scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)\n\tat org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1878)\n\tat org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:927)\n\tat org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:927)\n\tat scala.Option.foreach(Option.scala:257)\n\tat org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:927)\n\tat org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2112)\n\tat org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2061)\n\tat org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2050)\n\tat org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)\n\tat org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:738)\n\tat org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)\n\tat org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)\n\tat org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)\n\tat org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)\n\tat org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:990)\n\tat org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)\n\tat org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)\n\tat org.apache.spark.rdd.RDD.withScope(RDD.scala:385)\n\tat org.apache.spark.rdd.RDD.collect(RDD.scala:989)\n\tat org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4$lzycompute(BinaryClassificationMetrics.scala:195)\n\tat org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4(BinaryClassificationMetrics.scala:146)\n\tat org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions$lzycompute(BinaryClassificationMetrics.scala:148)\n\tat org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions(BinaryClassificationMetrics.scala:148)\n\tat org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.createCurve(BinaryClassificationMetrics.scala:226)\n\tat org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.roc(BinaryClassificationMetrics.scala:86)\n\tat org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.areaUnderROC(BinaryClassificationMetrics.scala:97)\n\tat com.dataiku.dip.spark.ml.prediction.BinaryClassificationFixedPerf.auc$lzycompute(BinaryClassificationEvaluator.scala:224)\n\tat com.dataiku.dip.spark.ml.prediction.BinaryClassificationFixedPerf.auc(BinaryClassificationEvaluator.scala:223)\n\tat com.dataiku.dip.spark.MLLibEvaluationJob.getMetrics(MLLibEvaluationJob.scala:151)\n\tat com.dataiku.dip.spark.MLLibEvaluationJob.\u003cinit\u003e(MLLibEvaluationJob.scala:223)\n\tat com.dataiku.dip.spark.MLLibEvaluationJob$.delayedEndpoint$com$dataiku$dip$spark$MLLibEvaluationJob$1(MLLibEvaluationJob.scala:27)\n\tat com.dataiku.dip.spark.MLLibEvaluationJob$delayedInit$body.apply(MLLibEvaluationJob.scala:25)\n\tat 
scala.Function0$class.apply$mcV$sp(Function0.scala:34)\n\tat scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)\n\tat com.dataiku.dip.spark.SuicidalApp$$anonfun$delayedInit$1.apply$mcV$sp(package.scala:482)\n\tat scala.App$$anonfun$main$1.apply(App.scala:76)\n\tat scala.App$$anonfun$main$1.apply(App.scala:76)\n\tat scala.collection.immutable.List.foreach(List.scala:392)\n\tat scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)\n\tat scala.App$class.main(App.scala:76)\n\tat com.dataiku.dip.spark.MLLibEvaluationJob$.main(MLLibEvaluationJob.scala:25)\n\tat com.dataiku.dip.spark.MLLibEvaluationJob.main(MLLibEvaluationJob.scala)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\n\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\n\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\n\tat java.lang.reflect.Method.invoke(Method.java:498)\n\tat org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)\n\tat org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)\n\tat org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)\n\tat org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)\n\tat org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)\n\tat org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)\n\tat org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)\n\tat org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)\nCaused by: org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) \u003d\u003e double)\n\tat org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)\n\tat org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)\n\tat org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:636)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)\n\tat org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)\n\tat org.apache.spark.scheduler.Task.run(Task.scala:123)\n\tat org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)\n\tat org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)\n\tat org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)\n\tat java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)\n\tat java.lang.Thread.run(Thread.java:748)\nCaused by: java.util.NoSuchElementException: key not found: false\n\tat scala.collection.MapLike$class.default(MapLike.scala:228)\n\tat scala.collection.AbstractMap.default(Map.scala:59)\n\tat scala.collection.MapLike$class.apply(MapLike.scala:141)\n\tat scala.collection.AbstractMap.apply(Map.scala:59)\n\t... 
17 more\n","stackTrace":[{"file":"DAGScheduler.scala","line":1891,"function":"org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages"},{"file":"DAGScheduler.scala","line":1879,"function":"org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply"},{"file":"DAGScheduler.scala","line":1878,"function":"org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply"},{"file":"ResizableArray.scala","line":59,"function":"scala.collection.mutable.ResizableArray$class.foreach"},{"file":"ArrayBuffer.scala","line":48,"function":"scala.collection.mutable.ArrayBuffer.foreach"},{"file":"DAGScheduler.scala","line":1878,"function":"org.apache.spark.scheduler.DAGScheduler.abortStage"},{"file":"DAGScheduler.scala","line":927,"function":"org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply"},{"file":"DAGScheduler.scala","line":927,"function":"org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply"},{"file":"Option.scala","line":257,"function":"scala.Option.foreach"},{"file":"DAGScheduler.scala","line":927,"function":"org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed"},{"file":"DAGScheduler.scala","line":2112,"function":"org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive"},{"file":"DAGScheduler.scala","line":2061,"function":"org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive"},{"file":"DAGScheduler.scala","line":2050,"function":"org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive"},{"file":"EventLoop.scala","line":49,"function":"org.apache.spark.util.EventLoop$$anon$1.run"},{"file":"DAGScheduler.scala","line":738,"function":"org.apache.spark.scheduler.DAGScheduler.runJob"},{"file":"SparkContext.scala","line":2061,"function":"org.apache.spark.SparkContext.runJob"},{"file":"SparkContext.scala","line":2082,"function":"org.apache.spark.SparkContext.runJob"},{"file":"SparkContext.scala","line":2101,"function":"org.apache.spark.SparkContext.runJob"},{"file":"SparkContext.scala","line":2126,"function":"org.apache.spark.SparkContext.runJob"},{"file":"RDD.scala","line":990,"function":"org.apache.spark.rdd.RDD$$anonfun$collect$1.apply"},{"file":"RDDOperationScope.scala","line":151,"function":"org.apache.spark.rdd.RDDOperationScope$.withScope"},{"file":"RDDOperationScope.scala","line":112,"function":"org.apache.spark.rdd.RDDOperationScope$.withScope"},{"file":"RDD.scala","line":385,"function":"org.apache.spark.rdd.RDD.withScope"},{"file":"RDD.scala","line":989,"function":"org.apache.spark.rdd.RDD.collect"},{"file":"BinaryClassificationMetrics.scala","line":195,"function":"org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4$lzycompute"},{"file":"BinaryClassificationMetrics.scala","line":146,"function":"org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4"},{"file":"BinaryClassificationMetrics.scala","line":148,"function":"org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions$lzycompute"},{"file":"BinaryClassificationMetrics.scala","line":148,"function":"org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions"},{"file":"BinaryClassificationMetrics.scala","line":226,"function":"org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.createCurve"},{"file":"BinaryClassificationMetrics.scala","line":86,"function":"org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.roc"},{"file":"BinaryClassificationMetrics.scala","line":97,"function":"org.apache.spark.mllib.evaluation.BinaryClass
ificationMetrics.areaUnderROC"},{"file":"BinaryClassificationEvaluator.scala","line":224,"function":"com.dataiku.dip.spark.ml.prediction.BinaryClassificationFixedPerf.auc$lzycompute"},{"file":"BinaryClassificationEvaluator.scala","line":223,"function":"com.dataiku.dip.spark.ml.prediction.BinaryClassificationFixedPerf.auc"},{"file":"MLLibEvaluationJob.scala","line":151,"function":"com.dataiku.dip.spark.MLLibEvaluationJob.getMetrics"},{"file":"MLLibEvaluationJob.scala","line":223,"function":"com.dataiku.dip.spark.MLLibEvaluationJob.\u003cinit\u003e"},{"file":"MLLibEvaluationJob.scala","line":27,"function":"com.dataiku.dip.spark.MLLibEvaluationJob$.delayedEndpoint$com$dataiku$dip$spark$MLLibEvaluationJob$1"},{"file":"MLLibEvaluationJob.scala","line":25,"function":"com.dataiku.dip.spark.MLLibEvaluationJob$delayedInit$body.apply"},{"file":"Function0.scala","line":34,"function":"scala.Function0$class.apply$mcV$sp"},{"file":"AbstractFunction0.scala","line":12,"function":"scala.runtime.AbstractFunction0.apply$mcV$sp"},{"file":"package.scala","line":482,"function":"com.dataiku.dip.spark.SuicidalApp$$anonfun$delayedInit$1.apply$mcV$sp"},{"file":"App.scala","line":76,"function":"scala.App$$anonfun$main$1.apply"},{"file":"App.scala","line":76,"function":"scala.App$$anonfun$main$1.apply"},{"file":"List.scala","line":392,"function":"scala.collection.immutable.List.foreach"},{"file":"TraversableForwarder.scala","line":35,"function":"scala.collection.generic.TraversableForwarder$class.foreach"},{"file":"App.scala","line":76,"function":"scala.App$class.main"},{"file":"MLLibEvaluationJob.scala","line":25,"function":"com.dataiku.dip.spark.MLLibEvaluationJob$.main"},{"file":"MLLibEvaluationJob.scala","line":-1,"function":"com.dataiku.dip.spark.MLLibEvaluationJob.main"},{"file":"NativeMethodAccessorImpl.java","line":-2,"function":"sun.reflect.NativeMethodAccessorImpl.invoke0"},{"file":"NativeMethodAccessorImpl.java","line":62,"function":"sun.reflect.NativeMethodAccessorImpl.invoke"},{"file":"DelegatingMethodAccessorImpl.java","line":43,"function":"sun.reflect.DelegatingMethodAccessorImpl.invoke"},{"file":"Method.java","line":498,"function":"java.lang.reflect.Method.invoke"},{"file":"SparkApplication.scala","line":52,"function":"org.apache.spark.deploy.JavaMainApplication.start"},{"file":"SparkSubmit.scala","line":845,"function":"org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain"},{"file":"SparkSubmit.scala","line":161,"function":"org.apache.spark.deploy.SparkSubmit.doRunMain$1"},{"file":"SparkSubmit.scala","line":184,"function":"org.apache.spark.deploy.SparkSubmit.submit"},{"file":"SparkSubmit.scala","line":86,"function":"org.apache.spark.deploy.SparkSubmit.doSubmit"},{"file":"SparkSubmit.scala","line":920,"function":"org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit"},{"file":"SparkSubmit.scala","line":929,"function":"org.apache.spark.deploy.SparkSubmit$.main"},{"file":"SparkSubmit.scala","line":-1,"function":"org.apache.spark.deploy.SparkSubmit.main"},{"file":"CausedBy","line":0,"function":"Cause: org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) \u003d\u003e 
double)"},{"line":-1,"function":"org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext"},{"file":"BufferedRowIterator.java","line":43,"function":"org.apache.spark.sql.execution.BufferedRowIterator.hasNext"},{"file":"WholeStageCodegenExec.scala","line":636,"function":"org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext"},{"file":"Iterator.scala","line":409,"function":"scala.collection.Iterator$$anon$11.hasNext"},{"file":"Iterator.scala","line":409,"function":"scala.collection.Iterator$$anon$11.hasNext"},{"file":"Iterator.scala","line":409,"function":"scala.collection.Iterator$$anon$11.hasNext"},{"file":"ExternalSorter.scala","line":191,"function":"org.apache.spark.util.collection.ExternalSorter.insertAll"},{"file":"SortShuffleWriter.scala","line":62,"function":"org.apache.spark.shuffle.sort.SortShuffleWriter.write"},{"file":"ShuffleMapTask.scala","line":99,"function":"org.apache.spark.scheduler.ShuffleMapTask.runTask"},{"file":"ShuffleMapTask.scala","line":55,"function":"org.apache.spark.scheduler.ShuffleMapTask.runTask"},{"file":"Task.scala","line":123,"function":"org.apache.spark.scheduler.Task.run"},{"file":"Executor.scala","line":408,"function":"org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply"},{"file":"Utils.scala","line":1360,"function":"org.apache.spark.util.Utils$.tryWithSafeFinally"},{"file":"Executor.scala","line":414,"function":"org.apache.spark.executor.Executor$TaskRunner.run"},{"file":"ThreadPoolExecutor.java","line":1149,"function":"java.util.concurrent.ThreadPoolExecutor.runWorker"},{"file":"ThreadPoolExecutor.java","line":624,"function":"java.util.concurrent.ThreadPoolExecutor$Worker.run"},{"file":"Thread.java","line":748,"function":"java.lang.Thread.run"},{"file":"CausedBy","line":0,"function":"Cause: java.util.NoSuchElementException: key not found: 
false"},{"file":"MapLike.scala","line":228,"function":"scala.collection.MapLike$class.default"},{"file":"Map.scala","line":59,"function":"scala.collection.AbstractMap.default"},{"file":"MapLike.scala","line":141,"function":"scala.collection.MapLike$class.apply"},{"file":"Map.scala","line":59,"function":"scala.collection.AbstractMap.apply"},{"line":-1,"function":"org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext"},{"file":"BufferedRowIterator.java","line":43,"function":"org.apache.spark.sql.execution.BufferedRowIterator.hasNext"},{"file":"WholeStageCodegenExec.scala","line":636,"function":"org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext"},{"file":"Iterator.scala","line":409,"function":"scala.collection.Iterator$$anon$11.hasNext"},{"file":"Iterator.scala","line":409,"function":"scala.collection.Iterator$$anon$11.hasNext"},{"file":"Iterator.scala","line":409,"function":"scala.collection.Iterator$$anon$11.hasNext"},{"file":"ExternalSorter.scala","line":191,"function":"org.apache.spark.util.collection.ExternalSorter.insertAll"},{"file":"SortShuffleWriter.scala","line":62,"function":"org.apache.spark.shuffle.sort.SortShuffleWriter.write"},{"file":"ShuffleMapTask.scala","line":99,"function":"org.apache.spark.scheduler.ShuffleMapTask.runTask"},{"file":"ShuffleMapTask.scala","line":55,"function":"org.apache.spark.scheduler.ShuffleMapTask.runTask"},{"file":"Task.scala","line":123,"function":"org.apache.spark.scheduler.Task.run"},{"file":"Executor.scala","line":408,"function":"org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply"},{"file":"Utils.scala","line":1360,"function":"org.apache.spark.util.Utils$.tryWithSafeFinally"},{"file":"Executor.scala","line":414,"function":"org.apache.spark.executor.Executor$TaskRunner.run"},{"file":"ThreadPoolExecutor.java","line":1149,"function":"java.util.concurrent.ThreadPoolExecutor.runWorker"},{"file":"ThreadPoolExecutor.java","line":624,"function":"java.util.concurrent.ThreadPoolExecutor$Worker.run"},{"file":"Thread.java","line":748,"function":"java.lang.Thread.run"}]} [2020/05/20-18:54:13.895] [FRT-43-FlowRunnable] [INFO] [dku.recipes.code.base] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Now err: {"errorType":"org.apache.spark.SparkException","message":"Error in evaluate process: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) \u003d\u003e double)\n\tat org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIteratorForCodegenStage1.processNext(Unknown Source)\n\tat org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)\n\tat org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$13$$anon$1.hasNext(WholeStageCodegenExec.scala:636)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)\n\tat org.apache.spark.util.collection.ExternalSorter.insertAll(ExternalSorter.scala:191)\n\tat org.apache.spark.shuffle.sort.SortShuffleWriter.write(SortShuffleWriter.scala:62)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)\n\tat org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)\n\tat 
org.apache.spark.scheduler.Task.run(Task.scala:123)
	at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
	at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
	at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.util.NoSuchElementException: key not found: false
	at scala.collection.MapLike$class.default(MapLike.scala:228)
	at scala.collection.AbstractMap.default(Map.scala:59)
	at scala.collection.MapLike$class.apply(MapLike.scala:141)
	at scala.collection.AbstractMap.apply(Map.scala:59)
	... 17 more
Driver stacktrace:
detailedMessage: Error in evaluate process: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) => double) [... same executor-side trace as above ...] Driver stacktrace:, caused by: SparkException: Failed to execute user defined function(Map2: (string) => double), caused by: NoSuchElementException: key not found: false
[detailedMessageHTML: HTML-escaped rendering of the same error chain; omitted]
stackTraceStr: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) => double)
[... same executor-side frames as above, through "Caused by: java.util.NoSuchElementException: key not found: false" ...]
	... 17 more

Driver stacktrace:
	at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1891)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1879)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1878)
	at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
	at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
	at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1878)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:927)
	at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:927)
	at scala.Option.foreach(Option.scala:257)
	at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:927)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2112)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2061)
	at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2050)
	at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49)
	at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:738)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2061)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2082)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2101)
	at org.apache.spark.SparkContext.runJob(SparkContext.scala:2126)
	at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:990)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
	at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
	at org.apache.spark.rdd.RDD.withScope(RDD.scala:385)
	at org.apache.spark.rdd.RDD.collect(RDD.scala:989)
	at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4$lzycompute(BinaryClassificationMetrics.scala:195)
	at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.x$4(BinaryClassificationMetrics.scala:146)
	at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions$lzycompute(BinaryClassificationMetrics.scala:148)
	at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.confusions(BinaryClassificationMetrics.scala:148)
	at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.createCurve(BinaryClassificationMetrics.scala:226)
	at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.roc(BinaryClassificationMetrics.scala:86)
	at org.apache.spark.mllib.evaluation.BinaryClassificationMetrics.areaUnderROC(BinaryClassificationMetrics.scala:97)
	at com.dataiku.dip.spark.ml.prediction.BinaryClassificationFixedPerf.auc$lzycompute(BinaryClassificationEvaluator.scala:224)
	at com.dataiku.dip.spark.ml.prediction.BinaryClassificationFixedPerf.auc(BinaryClassificationEvaluator.scala:223)
	at com.dataiku.dip.spark.MLLibEvaluationJob.getMetrics(MLLibEvaluationJob.scala:151)
	at com.dataiku.dip.spark.MLLibEvaluationJob.<init>(MLLibEvaluationJob.scala:223)
	at com.dataiku.dip.spark.MLLibEvaluationJob$.delayedEndpoint$com$dataiku$dip$spark$MLLibEvaluationJob$1(MLLibEvaluationJob.scala:27)
	at com.dataiku.dip.spark.MLLibEvaluationJob$delayedInit$body.apply(MLLibEvaluationJob.scala:25)
	at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
	at com.dataiku.dip.spark.SuicidalApp$$anonfun$delayedInit$1.apply$mcV$sp(package.scala:482)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.collection.immutable.List.foreach(List.scala:392)
	at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
	at scala.App$class.main(App.scala:76)
	at com.dataiku.dip.spark.MLLibEvaluationJob$.main(MLLibEvaluationJob.scala:25)
	at com.dataiku.dip.spark.MLLibEvaluationJob.main(MLLibEvaluationJob.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
	at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:845)
	at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:161)
	at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:184)
	at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:86)
	at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:920)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:929)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) => double)
[... same executor-side frames as above ...]
Caused by: java.util.NoSuchElementException: key not found: false
	at scala.collection.MapLike$class.default(MapLike.scala:228)
	at scala.collection.AbstractMap.default(Map.scala:59)
	at scala.collection.MapLike$class.apply(MapLike.scala:141)
	at scala.collection.AbstractMap.apply(Map.scala:59)
	... 17 more
[stackTrace: structured JSON array repeating the same frames in machine-readable form (DAGScheduler.failJobAndIndependentStages → SparkContext.runJob → RDD.collect → BinaryClassificationMetrics.confusions/roc/areaUnderROC → BinaryClassificationFixedPerf.auc → MLLibEvaluationJob.getMetrics, caused by the Map2 UDF failure, caused by NoSuchElementException: key not found: false); omitted]
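The root cause in the chain above is the innermost exception, java.util.NoSuchElementException: key not found: false, thrown inside the generated Map2 (string => double) UDF. Map2 is DSS-internal, but the failure mode is the generic one of calling Scala's Map#apply with an absent key: the evaluation dataset apparently contains a target value (the string "false") that is not among the class labels recorded when the model was trained. A minimal Scala/Spark sketch of that failure mode, with a hypothetical training-time label mapping and hypothetical column names (this is not the actual DSS code):

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.udf

object KeyNotFoundRepro {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().master("local[1]").appName("key-not-found-repro").getOrCreate()
    import spark.implicits._

    // Hypothetical mapping built at training time: only the labels the
    // model was trained on ("Yes"/"No") are present.
    val targetMap: Map[String, Double] = Map("Yes" -> 1.0, "No" -> 0.0)

    // Map#apply throws java.util.NoSuchElementException for unknown keys;
    // Spark wraps it as "SparkException: Failed to execute user defined function".
    val targetToDouble = udf((label: String) => targetMap(label))

    // An evaluation dataset whose target column contains a value ("false")
    // never seen at training time fails on the first action.
    val evalDf = Seq("Yes", "No", "false").toDF("Churn")
    evalDf.select(targetToDouble($"Churn").as("target_num")).show()

    spark.stop()
  }
}

Running this locally fails with the same two-level chain seen in the log: SparkException: Failed to execute user defined function, caused by NoSuchElementException: key not found: false.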
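The driver-side frames also explain where the error surfaces. Spark evaluates lazily, so the Map2 UDF only executes when BinaryClassificationMetrics collects the confusion counts for the ROC curve; that is why the job dies inside MLLibEvaluationJob.getMetrics at the areaUnderROC call rather than while reading the evaluation dataset. A sketch of that call pattern (assuming a SparkSession named spark is in scope; the scores and labels are made-up values, not DSS output):

import org.apache.spark.mllib.evaluation.BinaryClassificationMetrics
import org.apache.spark.rdd.RDD

// (score, label) pairs; in the failing job each label goes through the
// Map2-style UDF, so the exception only appears once this RDD is computed.
val scoreAndLabels: RDD[(Double, Double)] = spark.sparkContext.parallelize(Seq(
  (0.91, 1.0), (0.15, 0.0), (0.55, 1.0), (0.08, 0.0)
))

val metrics = new BinaryClassificationMetrics(scoreAndLabels)
println(metrics.areaUnderROC())  // first action: triggers the RDD.collect seen in the driver stacktrace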
[2020/05/20-18:54:13.898] [FRT-43-FlowRunnable] [INFO] [dku.flow.activity] act.evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Run thread failed for activity evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP
com.dataiku.common.server.APIError$SerializedErrorException: Error in evaluate process: Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0, localhost, executor driver): org.apache.spark.SparkException: Failed to execute user defined function(Map2: (string) => double)
[... same executor-side frames and "Caused by: java.util.NoSuchElementException: key not found: false" as above ...]
Driver stacktrace:, caused by: SparkException: Failed to execute user defined function(Map2: (string) => double), caused by: NoSuchElementException: key not found: false
	at com.dataiku.dip.dataflow.exec.AbstractCodeBasedActivityRunner.handleErrorFile(AbstractCodeBasedActivityRunner.java:186)
	at com.dataiku.dip.dataflow.exec.AbstractCodeBasedActivityRunner.handleExecutionResult(AbstractCodeBasedActivityRunner.java:166)
	at com.dataiku.dip.dataflow.exec.AbstractCodeBasedActivityRunner.execute(AbstractCodeBasedActivityRunner.java:102)
	at com.dataiku.dip.dataflow.exec.AbstractSparkBasedRecipeRunner.runUsingSparkSubmit(AbstractSparkBasedRecipeRunner.java:315)
	at com.dataiku.dip.dataflow.exec.AbstractSparkBasedRecipeRunner.doRunSpark(AbstractSparkBasedRecipeRunner.java:131)
	at com.dataiku.dip.dataflow.exec.AbstractSparkBasedRecipeRunner.runSpark(AbstractSparkBasedRecipeRunner.java:103)
	at com.dataiku.dip.dataflow.exec.AbstractSparkBasedRecipeRunner.runSpark(AbstractSparkBasedRecipeRunner.java:88)
	at com.dataiku.dip.analysis.ml.prediction.flow.EvaluationRecipeRunner$1.run(EvaluationRecipeRunner.java:227)
	at com.dataiku.dip.analysis.ml.prediction.flow.EvaluationRecipeRunner.run(EvaluationRecipeRunner.java:273)
	at com.dataiku.dip.dataflow.jobrunner.ActivityRunner$FlowRunnableThread.run(ActivityRunner.java:380)
[2020/05/20-18:54:13.931] [ActivityExecutor-38] [INFO] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - activity is finished
[2020/05/20-18:54:13.931] [ActivityExecutor-38] [ERROR] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Activity failed
com.dataiku.common.server.APIError$SerializedErrorException: Error in evaluate process: [... identical exception chain and stack trace as in the "Run thread failed" entry above ...]
[2020/05/20-18:54:13.932] [ActivityExecutor-38] [INFO] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Executing default post-activity lifecycle hook
[2020/05/20-18:54:13.935] [ActivityExecutor-38] [INFO] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Removing samples for CHURNALGO.predict_tcs
[2020/05/20-18:54:13.937] [ActivityExecutor-38] [INFO] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Removing samples for CHURNALGO.metric_tcs
[2020/05/20-18:54:13.939] [ActivityExecutor-38] [INFO] [dku.flow.activity] running evaluate_on_Telco_Customer_Churn_prepared_distinct_1_NP - Done post-activity tasks
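Given the chain above, the practical fix is on the data side: make the target column of the evaluation dataset use exactly the class labels the model was trained on (for this Telco churn setup, presumably "Yes"/"No" rather than "true"/"false"), or clean out unseen labels before running the evaluation recipe. For illustration only, here is a defensive variant of the mapping UDF from the first sketch (reusing spark, evalDf, targetMap and the implicits from it) that yields null instead of throwing, so offending rows can be inspected or filtered:

import org.apache.spark.sql.functions.udf

// Returning Option[Double] makes the UDF produce a nullable column instead
// of throwing on labels absent from the training-time mapping.
val targetToDoubleSafe = udf((label: String) => targetMap.get(label))

val usable = evalDf
  .withColumn("target_num", targetToDoubleSafe($"Churn"))
  .filter($"target_num".isNotNull)  // drop rows whose label was never seen at training time

usable.show()

This is a sketch of the failure-avoidance idea, not the DSS fix; inside DSS the equivalent is to remap the target values in a Prepare step (or upstream recipe) so they match the training labels before the evaluate recipe runs.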