[2020/02/21-11:14:42.020] [ActivityExecutor-55] [INFO] [dku] running compute_python_chunk_example_1996-12 - ----------------------------------------
[2020/02/21-11:14:42.020] [ActivityExecutor-55] [INFO] [dku] running compute_python_chunk_example_1996-12 - DSS startup: jek version:5.1.5
[2020/02/21-11:14:42.020] [ActivityExecutor-55] [INFO] [dku] running compute_python_chunk_example_1996-12 - DSS home: /appl/dataiku/dss_data
[2020/02/21-11:14:42.020] [ActivityExecutor-55] [INFO] [dku] running compute_python_chunk_example_1996-12 - OS: Linux 3.10.0-1062.9.1.el7.x86_64 amd64 - Java: Oracle Corporation 1.8.0_242
[2020/02/21-11:14:42.020] [ActivityExecutor-55] [INFO] [dku.flow.jobrunner] running compute_python_chunk_example_1996-12 - Allocated a slot for this activity!
[2020/02/21-11:14:42.020] [ActivityExecutor-55] [INFO] [dku.flow.jobrunner] running compute_python_chunk_example_1996-12 - Run activity
[2020/02/21-11:14:42.022] [ActivityExecutor-55] [INFO] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Executing default pre-activity lifecycle hook
[2020/02/21-11:14:42.023] [ActivityExecutor-55] [INFO] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Checking if sources are ready
[2020/02/21-11:14:42.024] [ActivityExecutor-55] [DEBUG] [dku.db.internal] running compute_python_chunk_example_1996-12 - Borrowing a connection. Read-only: false
[2020/02/21-11:14:42.025] [ActivityExecutor-55] [INFO] [dku.dataset.hash] running compute_python_chunk_example_1996-12 - managed non-FS dataset #DELETED#.trans_month_partitioned/1996-12 has no propagated state, only checking if it has data
[2020/02/21-11:14:42.026] [ActivityExecutor-55] [DEBUG] [dku.dataset.hash] running compute_python_chunk_example_1996-12 - Readiness cache miss for dataset__#DELETED#__#DELETED#.trans_month_partitioned__1996-12
[2020/02/21-11:14:42.026] [ActivityExecutor-55] [INFO] [dip.connections.sqlserver] running compute_python_chunk_example_1996-12 - Log using kerberos, as #DELETED#.eu.rabonet.com (proxy=%{dssUserLogin})
[2020/02/21-11:14:42.026] [ActivityExecutor-55] [INFO] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Connecting to jdbc:#DELETED# with props: {"integratedSecurity":"true","instanceName":"DEINSTANCE00","authenticationScheme":"JavaKerberos","portNumber":"1433","databaseName":"Training","requestCredDeleg":"true"} conn=Dataiku_Training-2CHUpdZ
[2020/02/21-11:14:42.027] [ActivityExecutor-55] [DEBUG] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Driver version 7.2
[2020/02/21-11:14:42.027] [ActivityExecutor-55] [INFO] [dip.krb.login] running compute_python_chunk_example_1996-12 - Retrieving impersonated Kerberos credentials for #DELETED#
[2020/02/21-11:14:42.066] [ActivityExecutor-55] [INFO] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Driver: Microsoft JDBC Driver 7.2 for SQL Server (JDBC 4.2) 7.2.2.0 (7.2)
[2020/02/21-11:14:42.066] [ActivityExecutor-55] [INFO] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Database: Microsoft SQL Server 13.00.5622 (13.0) rowSize=8060 stmts=0
[2020/02/21-11:14:42.075] [ActivityExecutor-55] [INFO] [dku.sql.generic] running compute_python_chunk_example_1996-12 - Table null.trans_month_partitioned exists
[2020/02/21-11:14:42.076] [ActivityExecutor-55] [DEBUG] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Close conn=Dataiku_Training-2CHUpdZ
[2020/02/21-11:14:42.076] [ActivityExecutor-55] [INFO] [dku.dataset.hash] running compute_python_chunk_example_1996-12 - Caching readiness for dataset__#DELETED#__#DELETED#.trans_month_partitioned__1996-12 s=READY h=null
[2020/02/21-11:14:42.076] [ActivityExecutor-55] [INFO] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Checked source readiness #DELETED#.trans_month_partitioned -> true
[2020/02/21-11:14:42.076] [ActivityExecutor-55] [DEBUG] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Computing hashes to propagate BEFORE activity
[2020/02/21-11:14:42.077] [ActivityExecutor-55] [DEBUG] [dku.db.internal] running compute_python_chunk_example_1996-12 - Borrowing a connection. Read-only: false
[2020/02/21-11:14:42.078] [ActivityExecutor-55] [INFO] [dku.dataset.hash] running compute_python_chunk_example_1996-12 - managed non-FS dataset #DELETED#.trans_month_partitioned/1996-12 has no propagated state, only checking if it has data
[2020/02/21-11:14:42.078] [ActivityExecutor-55] [DEBUG] [dku.dataset.hash] running compute_python_chunk_example_1996-12 - Readiness cache miss for dataset__#DELETED#__#DELETED#.trans_month_partitioned__1996-12
[2020/02/21-11:14:42.079] [ActivityExecutor-55] [INFO] [dip.connections.sqlserver] running compute_python_chunk_example_1996-12 - Log using kerberos, as #DELETED#.eu.rabonet.com (proxy=%{dssUserLogin})
[2020/02/21-11:14:42.079] [ActivityExecutor-55] [INFO] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Connecting to jdbc:#DELETED# with props: {"integratedSecurity":"true","instanceName":"DEINSTANCE00","authenticationScheme":"JavaKerberos","portNumber":"1433","databaseName":"Training","requestCredDeleg":"true"} conn=Dataiku_Training-9BYsEYZ
[2020/02/21-11:14:42.080] [ActivityExecutor-55] [DEBUG] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Driver version 7.2
[2020/02/21-11:14:42.080] [ActivityExecutor-55] [INFO] [dip.krb.login] running compute_python_chunk_example_1996-12 - Retrieving impersonated Kerberos credentials for #DELETED#
[2020/02/21-11:14:42.107] [ActivityExecutor-55] [INFO] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Driver: Microsoft JDBC Driver 7.2 for SQL Server (JDBC 4.2) 7.2.2.0 (7.2)
[2020/02/21-11:14:42.107] [ActivityExecutor-55] [INFO] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Database: Microsoft SQL Server 13.00.5622 (13.0) rowSize=8060 stmts=0
[2020/02/21-11:14:42.111] [ActivityExecutor-55] [INFO] [dku.sql.generic] running compute_python_chunk_example_1996-12 - Table null.trans_month_partitioned exists
[2020/02/21-11:14:42.112] [ActivityExecutor-55] [DEBUG] [dku.connections.sql.provider] running compute_python_chunk_example_1996-12 - Close conn=Dataiku_Training-9BYsEYZ
[2020/02/21-11:14:42.112] [ActivityExecutor-55] [INFO] [dku.dataset.hash] running compute_python_chunk_example_1996-12 - Caching readiness for dataset__#DELETED#__#DELETED#.trans_month_partitioned__1996-12 s=READY h=null
[2020/02/21-11:14:42.112] [ActivityExecutor-55] [DEBUG] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Recorded 1 hashes before activity run
[2020/02/21-11:14:42.113] [ActivityExecutor-55] [DEBUG] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Building recipe runner of type
[2020/02/21-11:14:42.113] [ActivityExecutor-55] [DEBUG] [dku.job.activity] running compute_python_chunk_example_1996-12 - Filling source sizes
[2020/02/21-11:14:42.114] [ActivityExecutor-55] [DEBUG] [dku.job.activity] running compute_python_chunk_example_1996-12 - Done filling source sizes
[2020/02/21-11:14:42.114] [ActivityExecutor-55] [DEBUG] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Recipe runner built, will use 1 thread(s)
[2020/02/21-11:14:42.114] [ActivityExecutor-55] [DEBUG] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Starting execution thread: com.dataiku.dip.recipes.code.python.PythonRecipeRunner@4c8988bb
[2020/02/21-11:14:42.115] [ActivityExecutor-55] [DEBUG] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Execution threads started, waiting for activity end
[2020/02/21-11:14:42.115] [FRT-546-FlowRunnable] [INFO] [dku.flow.activity] act.compute_python_chunk_example_1996-12 - Run thread for activity compute_python_chunk_example_1996-12 starting
[2020/02/21-11:14:42.116] [FRT-546-FlowRunnable] [INFO] [dku.flow.python] act.compute_python_chunk_example_1996-12 - Starting execution of user's Python code
[2020/02/21-11:14:42.116] [FRT-546-FlowRunnable] [INFO] [dip.venv.selector] act.compute_python_chunk_example_1996-12 - Select in project with {"useBuiltinEnv":true,"preventOverride":false}
[2020/02/21-11:14:42.117] [FRT-546-FlowRunnable] [INFO] [dku.flow.abstract.python] act.compute_python_chunk_example_1996-12 - Dumping Python script to /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/script.py
[2020/02/21-11:14:42.119] [FRT-546-FlowRunnable] [WARN] [dku.code.projectLibs] act.compute_python_chunk_example_1996-12 - External libraries file not found: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/localconfig/projects/#DELETED#/lib/external-libraries.json
[2020/02/21-11:14:42.119] [FRT-546-FlowRunnable] [INFO] [dku.code.projectLibs] act.compute_python_chunk_example_1996-12 - EXTERNAL LIBS FROM #DELETED# is {"gitReferences":{},"pythonPath":["python"],"rsrcPath":["R"],"importLibrariesFromProjects":[]}
[2020/02/21-11:14:42.120] [FRT-546-FlowRunnable] [INFO] [dku.code.projectLibs] act.compute_python_chunk_example_1996-12 - chunkFolder is /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/localconfig/projects/#DELETED#/lib/R
[2020/02/21-11:14:42.120] [FRT-546-FlowRunnable] [INFO] [xxx] act.compute_python_chunk_example_1996-12 - RSRC PATH: []
[2020/02/21-11:14:42.120] [FRT-546-FlowRunnable] [INFO] [dku.recipes.code.base] act.compute_python_chunk_example_1996-12 - Writing dku-exec-env for local execution in /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/remote-run-env-def.json
[2020/02/21-11:14:42.121] [FRT-546-FlowRunnable] [INFO] [dku.code.envs.resolution] act.compute_python_chunk_example_1996-12 - Executing Python activity in builtin env
[2020/02/21-11:14:42.121] [FRT-546-FlowRunnable] [INFO] [dku.flow.abstract.python] act.compute_python_chunk_example_1996-12 - Execute activity command: ["/appl/dataiku/dss_data/bin/python","-u","/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/python-exec-wrapper.py","/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/script.py"]
[2020/02/21-11:14:42.121] [FRT-546-FlowRunnable] [INFO] [dku.recipes.code.base] act.compute_python_chunk_example_1996-12 - Run command insecurely, from user dataiku
[2020/02/21-11:14:42.123] [FRT-546-FlowRunnable] [INFO] [dku.security.process] act.compute_python_chunk_example_1996-12 - Setting exclusive access on /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ
[2020/02/21-11:14:42.123] [FRT-546-FlowRunnable] [DEBUG] [dku.security.fsacl] act.compute_python_chunk_example_1996-12 - exec chown: sudo -n /etc/dataiku-security/4Zpf4htwuFlRkVcrb8eSiHlS/execwrapper.sh acls chown --path /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ --owner #DELETED#
[2020/02/21-11:14:42.307] [null-err-548] [INFO] [dku.utils] - 2020-02-21 11:14:42,307 3613 INFO Reading security conf from /etc/dataiku-security/4Zpf4htwuFlRkVcrb8eSiHlS
[2020/02/21-11:14:42.307] [null-err-548] [INFO] [dku.utils] - 2020-02-21 11:14:42,307 3613 INFO Allowed path: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ in /appl/dataiku/dss_data
[2020/02/21-11:14:42.307] [null-err-548] [INFO] [dku.utils] - 2020-02-21 11:14:42,307 3613 INFO Reading security conf from /etc/dataiku-security/4Zpf4htwuFlRkVcrb8eSiHlS
[2020/02/21-11:14:42.310] [null-err-548] [INFO] [dku.utils] - 2020-02-21 11:14:42,310 3613 INFO Allowed group ids: [50002]
[2020/02/21-11:14:42.317] [null-err-548] [INFO] [dku.utils] - 2020-02-21 11:14:42,317 3613 INFO User #DELETED# belongs to groups: [503869135, 50002]
[2020/02/21-11:14:42.318] [null-err-548] [INFO] [dku.utils] - 2020-02-21 11:14:42,317 3613 INFO User #DELETED# belongs to group 50002 (dataiku_python_runner) -> allowed
[2020/02/21-11:14:42.318] [null-err-548] [INFO] [dku.utils] - 2020-02-21 11:14:42,318 3613 INFO Executing chown -Rh: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ -> #DELETED#
[2020/02/21-11:14:42.328] [Thread-521] [INFO] [dku.utils] - Done waiting for return value, got 0
[2020/02/21-11:14:42.331] [FRT-546-FlowRunnable] [DEBUG] [dku.security.fsacl] act.compute_python_chunk_example_1996-12 - exec setfacl: sudo -n /etc/dataiku-security/4Zpf4htwuFlRkVcrb8eSiHlS/execwrapper.sh acls setfacl --path /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ --perm u::rwx --perm g::--- --perm o:--- --perm d:u:dataiku:rwx --perm u:dataiku:rwx --recursive
[2020/02/21-11:14:42.524] [null-err-558] [INFO] [dku.utils] - 2020-02-21 11:14:42,524 3643 INFO Reading security conf from /etc/dataiku-security/4Zpf4htwuFlRkVcrb8eSiHlS
[2020/02/21-11:14:42.525] [null-err-558] [INFO] [dku.utils] - 2020-02-21 11:14:42,525 3643 INFO Allowed path: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ in /appl/dataiku/dss_data
[2020/02/21-11:14:42.525] [null-err-558] [INFO] [dku.utils] - 2020-02-21 11:14:42,525 3643 INFO Executing: ['setfacl', '-RP', '-m', 'u::rwx,g::---,o:---,d:u:dataiku:rwx,u:dataiku:rwx', '/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ']
[2020/02/21-11:14:42.538] [Thread-531] [INFO] [dku.utils] - Done waiting for return value, got 0
[2020/02/21-11:14:42.539] [FRT-546-FlowRunnable] [INFO] [dku.processes.cgroups] act.compute_python_chunk_example_1996-12 - Will use cgroups []
[2020/02/21-11:14:42.539] [FRT-546-FlowRunnable] [INFO] [dku.processes.cgroups] act.compute_python_chunk_example_1996-12 - Applying rules to used cgroups: []
[2020/02/21-11:14:42.540] [FRT-546-FlowRunnable] [INFO] [dku.processes.cgroups] act.compute_python_chunk_example_1996-12 - Rule path cpu/DSS/notebooks/
[2020/02/21-11:14:42.540] [FRT-546-FlowRunnable] [INFO] [dku.processes.cgroups] act.compute_python_chunk_example_1996-12 - Rule path memory/DSS/notebooks/
[2020/02/21-11:14:42.540] [FRT-546-FlowRunnable] [INFO] [dku.security.process] act.compute_python_chunk_example_1996-12 - Written config to /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/config.json
[2020/02/21-11:14:42.541] [FRT-546-FlowRunnable] [INFO] [dku.security.process] act.compute_python_chunk_example_1996-12 - Starting wrapper: ["sudo","-n","/etc/dataiku-security/4Zpf4htwuFlRkVcrb8eSiHlS/execwrapper.sh","execute","/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/config.json"]
[2020/02/21-11:14:42.642] [FRT-546-FlowRunnable] [INFO] [dku.security.process] act.compute_python_chunk_example_1996-12 - Reading pipes from wrapper
[2020/02/21-11:14:42.729] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,728 3662 INFO Reading security conf from /etc/dataiku-security/4Zpf4htwuFlRkVcrb8eSiHlS
[2020/02/21-11:14:42.733] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,730 3662 INFO Allowed group ids: [50002]
[2020/02/21-11:14:42.736] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,736 3662 INFO User #DELETED# belongs to groups: [503869135, 50002]
[2020/02/21-11:14:42.737] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,737 3662 INFO User #DELETED# belongs to group 50002 (dataiku_python_runner) -> allowed
[2020/02/21-11:14:42.737] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,737 3662 INFO Reading security conf from /etc/dataiku-security/4Zpf4htwuFlRkVcrb8eSiHlS
[2020/02/21-11:14:42.737] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,737 3662 INFO Allowed path: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ in /appl/dataiku/dss_data
[2020/02/21-11:14:42.737] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,737 3662 INFO [BF] Setting up process environment
[2020/02/21-11:14:42.738] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,738 3662 INFO [BF] Writing wrapper pipes and pid to wrapper stdout
[2020/02/21-11:14:42.738] [FRT-546-FlowRunnable] [INFO] [dku.security.process] act.compute_python_chunk_example_1996-12 - Pipes read as bytes: 713
[2020/02/21-11:14:42.738] [FRT-546-FlowRunnable] [INFO] [dku.security.process] act.compute_python_chunk_example_1996-12 - Pipes read as bytes: {"pipes": {"stdin": "/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/.stdin.mwr0916m", "stderr": "/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/.stderr.mwr0916m", "stdout": "/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/.stdout.mwr0916m"}, "pid": 3662}
[2020/02/21-11:14:42.738] [FRT-546-FlowRunnable] [INFO] [dku.security.process] act.compute_python_chunk_example_1996-12 - Pid read: 3662
[2020/02/21-11:14:42.738] [FRT-546-FlowRunnable] [INFO] [dku.security.process] act.compute_python_chunk_example_1996-12 - Pipes read: { "stdin": "/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/.stdin.mwr0916m", "stderr": "/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/.stderr.mwr0916m", "stdout": "/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/.stdout.mwr0916m" }
[2020/02/21-11:14:42.739] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,738 3662 INFO [Parent] Started process 3664
[2020/02/21-11:14:42.739] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,738 3664 INFO [Child] pgroup is set
[2020/02/21-11:14:42.739] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,739 3664 INFO [Child] opening stdin pipe: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/.stdin.mwr0916m
[2020/02/21-11:14:42.739] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,739 3664 INFO [Child] opened stdin pipe
[2020/02/21-11:14:42.739] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,739 3664 INFO [Child] opening stdout pipe: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/.stdout.mwr0916m
[2020/02/21-11:14:42.740] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,739 3664 INFO [Child] opened stdout
[2020/02/21-11:14:42.740] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:42,740 3664 INFO [Child] opening stderr pipe: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/.stderr.mwr0916m
[2020/02/21-11:14:42.740] [FRT-546-FlowRunnable] [INFO] [dku.recipes.code.base] act.compute_python_chunk_example_1996-12 - Process reads from nothing
[2020/02/21-11:14:42.740] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,740 3664 INFO [Child] opened stderr
[2020/02/21-11:14:42.741] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,740 3664 INFO [Child] about to close other fd
[2020/02/21-11:14:42.741] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,740 3664 INFO [Child] closed other fd
[2020/02/21-11:14:42.741] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,740 3664 INFO [Child] chdired
[2020/02/21-11:14:42.741] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,740 3664 INFO setting username=#DELETED# uid=503869135 gid=503869135
[2020/02/21-11:14:42.741] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,741 3664 INFO [Child] dropped privileges
[2020/02/21-11:14:42.742] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,741 3664 INFO [Child] Executing: /appl/dataiku/dss_data/bin/python : /appl/dataiku/dss_data/bin/python -u /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/python-exec-wrapper.py /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/script.py
[2020/02/21-11:14:42.840] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,840 INFO --------------------
[2020/02/21-11:14:42.840] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,840 INFO Dataiku Python entrypoint starting up
[2020/02/21-11:14:42.841] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,840 INFO executable = /appl/dataiku/dss_data/bin/python
[2020/02/21-11:14:42.841] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,840 INFO argv = ['/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/python-exec-wrapper.py', '/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/script.py']
[2020/02/21-11:14:42.841] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,840 INFO --------------------
[2020/02/21-11:14:42.841] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,840 INFO Looking for RemoteRunEnvDef in ./remote-run-env-def.json
[2020/02/21-11:14:42.841] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,840 INFO Found RemoteRunEnvDef environment: ./remote-run-env-def.json
[2020/02/21-11:14:42.841] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,840 INFO Running a DSS Python recipe locally, uinsetting env
[2020/02/21-11:14:42.842] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,841 INFO Setup complete, ready to execute Python code
[2020/02/21-11:14:42.842] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,841 INFO Sys path: ['/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ', '/appl/dataiku/dss_data/lib/python', '/appl/dataiku/dataiku-dss-5.1.5/python', '/appl/dataiku/dss_data/pyenv/lib64/python27.zip', '/appl/dataiku/dss_data/pyenv/lib64/python2.7', '/appl/dataiku/dss_data/pyenv/lib64/python2.7/plat-linux2', '/appl/dataiku/dss_data/pyenv/lib64/python2.7/lib-tk', '/appl/dataiku/dss_data/pyenv/lib64/python2.7/lib-old', '/appl/dataiku/dss_data/pyenv/lib64/python2.7/lib-dynload', '/usr/lib64/python2.7', '/usr/lib/python2.7', '/appl/dataiku/dss_data/pyenv/lib/python2.7/site-packages', '/appl/dataiku/dataiku-dss-5.1.5/python.packages']
[2020/02/21-11:14:42.842] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:42,841 INFO Script file: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/script.py
[2020/02/21-11:14:43.300] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:43,300 INFO Initializing dataset writer for dataset #DELETED#.python_chunk_example
[2020/02/21-11:14:44.895] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:44,895 INFO Starting dataframes iterator
[2020/02/21-11:14:45.128] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:45,128 INFO Start_once writer
[2020/02/21-11:14:45.128] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:45,128 INFO Initializing write session
[2020/02/21-11:14:45.137] [qtp1003307628-111] [INFO] [com.dataiku.dip.dataflow.streaming.DatasetWritingService] - Init write session: J2GxTAyNyd
[2020/02/21-11:14:45.138] [qtp1003307628-111] [DEBUG] [dku.jobs] - Command /tintercom/datasets/init-write-session processed in 7ms
[2020/02/21-11:14:45.140] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:45,140 INFO Starting RemoteStreamWriter
[2020/02/21-11:14:45.142] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:45,142 INFO Initializing write data stream (J2GxTAyNyd)
[2020/02/21-11:14:45.235] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:45,235 INFO Remote Stream Writer: start generate
[2020/02/21-11:14:45.235] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:45,235 INFO Waiting for data to send ...
[2020/02/21-11:14:45.237] [qtp1003307628-114] [INFO] [com.dataiku.dip.dataflow.streaming.DatasetWriter] - Creating output writer
[2020/02/21-11:14:45.237] [qtp1003307628-114] [INFO] [com.dataiku.dip.dataflow.streaming.DatasetWriter] - Initializing output writer
[2020/02/21-11:14:45.237] [qtp1003307628-114] [INFO] [dip.connections.sqlserver] - Log using kerberos, as #DELETED#.eu.rabonet.com (proxy=%{dssUserLogin})
[2020/02/21-11:14:45.238] [qtp1003307628-114] [INFO] [dku.connections.sql.provider] - Connecting to jdbc:#DELETED# with props: {"integratedSecurity":"true","instanceName":"DEINSTANCE00","authenticationScheme":"JavaKerberos","portNumber":"1433","databaseName":"Training","requestCredDeleg":"true"} conn=Dataiku_Training-5cOsgTm
[2020/02/21-11:14:45.238] [qtp1003307628-114] [DEBUG] [dku.connections.sql.provider] - Driver version 7.2
[2020/02/21-11:14:45.238] [qtp1003307628-114] [INFO] [dip.krb.login] - Retrieving impersonated Kerberos credentials for #DELETED#
[2020/02/21-11:14:45.266] [qtp1003307628-114] [INFO] [dku.connections.sql.provider] - Driver: Microsoft JDBC Driver 7.2 for SQL Server (JDBC 4.2) 7.2.2.0 (7.2)
[2020/02/21-11:14:45.266] [qtp1003307628-114] [INFO] [dku.connections.sql.provider] - Database: Microsoft SQL Server 13.00.5622 (13.0) rowSize=8060 stmts=0
[2020/02/21-11:14:45.337] [qtp1003307628-114] [INFO] [dku.sql.generic] - Checking schema compatibility of python_chunk_example
[2020/02/21-11:14:45.341] [qtp1003307628-114] [INFO] [dku.sql.generic] - Table dbo.#DELETED#_python_chunk_example exists
[2020/02/21-11:14:45.344] [qtp1003307628-114] [ERROR] [com.dataiku.dip.dataflow.streaming.DatasetWritingService] - Push data error during streaming: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table
java.sql.SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table
	at com.dataiku.dip.sql.GenericSQLDialect.dropAndRecreateTableOrPartition(GenericSQLDialect.java:503)
	at com.dataiku.dip.datasets.sql.SQLTableOutput$SQLTableOutputWriter.init(SQLTableOutput.java:115)
	at com.dataiku.dip.dataflow.streaming.DatasetWriter.<init>(DatasetWriter.java:77)
	at com.dataiku.dip.dataflow.streaming.DatasetWriter.build(DatasetWriter.java:114)
	at com.dataiku.dip.dataflow.streaming.DatasetWriter.build(DatasetWriter.java:106)
	at com.dataiku.dip.dataflow.streaming.DatasetWritingService.pushData(DatasetWritingService.java:251)
	at com.dataiku.dip.dataflow.kernel.slave.KernelSession.pushData(KernelSession.java:236)
	at com.dataiku.dip.dataflow.kernel.slave.KernelServlet.service(KernelServlet.java:228)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
	at org.eclipse.jetty.server.Server.handle(Server.java:462)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232)
	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table
	at com.dataiku.dip.utils.ErrorContext.iaef(ErrorContext.java:129)
	at com.dataiku.dip.sql.SchemaReader.isManagedSchemaCompatible(SchemaReader.java:126)
	at com.dataiku.dip.sql.SchemaReader.isManagedSchemaCompatible(SchemaReader.java:77)
	at com.dataiku.dip.sql.GenericSQLDialect.dropAndRecreateTableOrPartition(GenericSQLDialect.java:495)
	... 22 more
[2020/02/21-11:14:45.345] [qtp1003307628-114] [DEBUG] [dku.jobs] - Command /tintercom/datasets/push-data processed in 109ms
[2020/02/21-11:14:45.344] [qtp1003307628-121] [ERROR] [com.dataiku.dip.dataflow.streaming.DatasetWritingService] - Wait session error: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table
java.sql.SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table
	at com.dataiku.dip.sql.GenericSQLDialect.dropAndRecreateTableOrPartition(GenericSQLDialect.java:503)
	at com.dataiku.dip.datasets.sql.SQLTableOutput$SQLTableOutputWriter.init(SQLTableOutput.java:115)
	at com.dataiku.dip.dataflow.streaming.DatasetWriter.<init>(DatasetWriter.java:77)
	at com.dataiku.dip.dataflow.streaming.DatasetWriter.build(DatasetWriter.java:114)
	at com.dataiku.dip.dataflow.streaming.DatasetWriter.build(DatasetWriter.java:106)
	at com.dataiku.dip.dataflow.streaming.DatasetWritingService.pushData(DatasetWritingService.java:251)
	at com.dataiku.dip.dataflow.kernel.slave.KernelSession.pushData(KernelSession.java:236)
	at com.dataiku.dip.dataflow.kernel.slave.KernelServlet.service(KernelServlet.java:228)
	at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
	at org.eclipse.jetty.servlet.ServletHolder.handle(ServletHolder.java:738)
	at org.eclipse.jetty.servlet.ServletHandler.doHandle(ServletHandler.java:551)
	at org.eclipse.jetty.server.handler.ContextHandler.doHandle(ContextHandler.java:1111)
	at org.eclipse.jetty.servlet.ServletHandler.doScope(ServletHandler.java:478)
	at org.eclipse.jetty.server.handler.ContextHandler.doScope(ContextHandler.java:1045)
	at org.eclipse.jetty.server.handler.ScopedHandler.handle(ScopedHandler.java:141)
	at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
	at org.eclipse.jetty.server.Server.handle(Server.java:462)
	at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:279)
	at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:232)
	at org.eclipse.jetty.io.AbstractConnection$2.run(AbstractConnection.java:534)
	at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607)
	at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536)
	at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.IllegalArgumentException: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table
	at com.dataiku.dip.utils.ErrorContext.iaef(ErrorContext.java:129)
	at com.dataiku.dip.sql.SchemaReader.isManagedSchemaCompatible(SchemaReader.java:126)
	at com.dataiku.dip.sql.SchemaReader.isManagedSchemaCompatible(SchemaReader.java:77)
	at com.dataiku.dip.sql.GenericSQLDialect.dropAndRecreateTableOrPartition(GenericSQLDialect.java:495)
	... 22 more
[2020/02/21-11:14:45.346] [qtp1003307628-121] [DEBUG] [dku.jobs] - Command /tintercom/datasets/wait-write-session processed in 183ms
[2020/02/21-11:14:45.364] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:45,348 ERROR Exception caught while writing
[2020/02/21-11:14:45.365] [null-err-569] [INFO] [dku.utils] - Traceback (most recent call last):
[2020/02/21-11:14:45.365] [null-err-569] [INFO] [dku.utils] -   File "/appl/dataiku/dataiku-dss-5.1.5/python/dataiku/core/dataset_write.py", line 217, in run
[2020/02/21-11:14:45.365] [null-err-569] [INFO] [dku.utils] -     self.streaming_api.wait_write_session(self.session_id)
[2020/02/21-11:14:45.365] [null-err-569] [INFO] [dku.utils] -   File "/appl/dataiku/dataiku-dss-5.1.5/python/dataiku/core/dataset_write.py", line 180, in wait_write_session
[2020/02/21-11:14:45.365] [null-err-569] [INFO] [dku.utils] -     raise Exception(u'An error occurred during dataset write (%s): %s' % (id, decoded_resp["message"]))
[2020/02/21-11:14:45.365] [null-err-569] [INFO] [dku.utils] - Exception: An error occurred during dataset write (J2GxTAyNyd): SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table
[2020/02/21-11:14:46.230] [qtp1003307628-111] [INFO] [dku.jobs] - Connects using API ticket
[2020/02/21-11:14:46.230] [qtp1003307628-111] [DEBUG] [dku.jobs] - Received command : /tintercom/datasets/set-schema
[2020/02/21-11:14:46.230] [qtp1003307628-111] [INFO] [dku.job.slave] - Using cached grant: dataset::WRITE_SCHEMA::#DELETED#.python_chunk_example
[2020/02/21-11:14:46.230] [qtp1003307628-111] [INFO] [dku.job.slave] - Setting schema of dataset #DELETED#.python_chunk_example
[2020/02/21-11:14:46.230] [qtp1003307628-111] [INFO] [com.dataiku.dip.dataflow.streaming.DatasetWritingService] - Setting schema of #DELETED#.python_chunk_example
[2020/02/21-11:14:46.231] [qtp1003307628-111] [DEBUG] [dku.datasets.managed.helper] - Fixup partitioning,
datasetToFixup={"fullName":"#DELETED#.python_chunk_example","model":{"type":"SQLServer","managed":true,"name":"python_chunk_example","projectKey":"#DELETED#","checklists":{"checklists":[]},"checks":[],"customMeta":{"kv":{}},"flowOptions":{"virtualizable":false,"rebuildBehavior":"NORMAL","crossProjectBuildBehavior":"DEFAULT"},"readWriteOptions":{"preserveOrder":false,"writeBuckets":1,"forceSingleOutputFile":false,"defaultReadOrdering":{"enabled":false,"rules":[]}},"partitioning":{"ignoreNonMatchingFile":false,"considerMissingRequestedPartitionsAsEmpty":false,"dimensions":[{"name":"partition_month","params":{"period":"MONTH"},"type":"time"}]},"versionTag":{"versionNumber":8,"lastModifiedBy":{"login":"#DELETED#"},"lastModifiedOn":1582280085107},"creationTag":{"versionNumber":0,"lastModifiedBy":{"login":"#DELETED#"},"lastModifiedOn":1582278617843},"tags":[],"params":{"connection":"Dataiku_Training","notReadyIfEmpty":false,"mode":"table","partitioningType":"custom","normalizeDoubles":true,"table":"${projectKey}_python_chunk_example","schema":"dbo","tableCreationMode":"auto","writeInsertBatchSize":10000,"writeJDBCBadDataBehavior":"DISCARD_ROW","readColsWithUnknownTzAsDates":false,"readSQLDateColsAsDSSDates":true},"schema":{"columns":[{"name":"partition_month","type":"date"},{"name":"trans_id","type":"bigint"},{"name":"account_id","type":"bigint"},{"name":"date","type":"date"},{"name":"type","type":"string"},{"name":"operation","type":"string"},{"name":"amount","type":"double"},{"name":"balance","type":"double"},{"name":"k_symbol","type":"string"},{"name":"bank","type":"double"},{"name":"account","type":"double"},{"name":"chunkno","type":"bigint"}],"userModified":false},"metrics":{"probes":[{"type":"partitioning","enabled":true,"computeOnBuildMode":"WHOLE_DATASET","meta":{"name":"Partitioning statistics","level":1},"configuration":{}},{"type":"basic","enabled":true,"computeOnBuildMode":"PARTITION","meta":{"name":"Basic 
data","level":0},"configuration":{}},{"type":"records","enabled":true,"computeOnBuildMode":"NO","meta":{"name":"Record count","level":0},"configuration":{}}],"engineConfig":{"padRunsWithMetrics":false,"hive":{"active":true,"extraConf":[]},"basic":{},"dss":{"active":true,"selection":{"useMemTable":false,"filter":{"distinct":false,"enabled":false},"partitionSelectionMethod":"ALL","latestPartitionsN":1,"ordering":{"enabled":false,"rules":[]},"samplingMethod":"FULL","maxRecords":-1,"targetRatio":0.02,"withinFirstN":-1,"maxReadUncompressedBytes":-1}},"sql":{"active":true},"impala":{"active":true},"spark":{"active":true,"extraConf":[]},"python":{}},"displayedState":{"columns":[],"metrics":["basic:COUNT_COLUMNS","records:COUNT_RECORDS"]}},"metricsChecks":{"runOnBuild":false,"checks":[],"displayedState":{"checks":[]}},"customFields":{"gdpr_contains_personal_data":"UNSURE"}},"partitioning":{"ignoreNonMatchingFile":false,"considerMissingRequestedPartitionsAsEmpty":false,"dimensions":[{"name":"partition_month","params":{"period":"MONTH"},"type":"time"}]}} schemaToFixup={"columns":[{"name":"partition_month","type":"date"}, {"name":"trans_id","type":"bigint"}, {"name":"account_id","type":"bigint"}, {"name":"date","type":"date"}, {"name":"type","type":"string"}, {"name":"operation","type":"string"}, {"name":"amount","type":"double"}, {"name":"balance","type":"double"}, {"name":"k_symbol","type":"string"}, {"name":"bank","type":"string"}, {"name":"account","type":"double"}, {"name":"chunkno","type":"bigint"} ], "userModified": false } [2020/02/21-11:14:46.231] [qtp1003307628-111] [DEBUG] [dku.datasets.managed.helper] - After partitioning fixup, got schema: {"columns":[{"name":"partition_month","type":"date"}, {"name":"trans_id","type":"bigint"}, {"name":"account_id","type":"bigint"}, {"name":"date","type":"date"}, {"name":"type","type":"string"}, {"name":"operation","type":"string"}, {"name":"amount","type":"double"}, {"name":"balance","type":"double"}, 
{"name":"k_symbol","type":"string"}, {"name":"bank","type":"string"}, {"name":"account","type":"double"}, {"name":"chunkno","type":"bigint"} ], "userModified": false } [2020/02/21-11:14:46.232] [qtp1003307628-111] [INFO] [dip.connections.sqlserver] - Log using kerberos, as #DELETED#.eu.rabonet.com (proxy=%{dssUserLogin}) [2020/02/21-11:14:46.232] [qtp1003307628-111] [INFO] [com.dataiku.dip.dataflow.streaming.DatasetWritingService] - Schema was modified [2020/02/21-11:14:46.233] [qtp1003307628-111] [INFO] [dku.job.slave] - Propagating new schema of #DELETED#.python_chunk_example to backend [2020/02/21-11:14:46.249] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:46,248 INFO Start_once writer [2020/02/21-11:14:46.249] [null-err-569] [INFO] [dku.utils] - 2020-02-21 11:14:46,249 INFO Start_once writer [2020/02/21-11:14:46.250] [null-err-569] [INFO] [dku.utils] - *************** Recipe code failed ************** [2020/02/21-11:14:46.250] [null-err-569] [INFO] [dku.utils] - Begin Python stack [2020/02/21-11:14:46.250] [null-err-569] [INFO] [dku.utils] - Traceback (most recent call last): [2020/02/21-11:14:46.250] [null-err-569] [INFO] [dku.utils] - File "/appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/python-exec-wrapper.py", line 194, in [2020/02/21-11:14:46.250] [null-err-569] [INFO] [dku.utils] - exec(f.read()) [2020/02/21-11:14:46.250] [null-err-569] [INFO] [dku.utils] - File "", line 22, in [2020/02/21-11:14:46.250] [null-err-569] [INFO] [dku.utils] - File "/appl/dataiku/dataiku-dss-5.1.5/python/dataiku/core/dataset_write.py", line 413, in __exit__ [2020/02/21-11:14:46.250] [null-err-569] [INFO] [dku.utils] - self.close() [2020/02/21-11:14:46.251] [null-err-569] [INFO] [dku.utils] - File "/appl/dataiku/dataiku-dss-5.1.5/python/dataiku/core/dataset_write.py", line 404, in close [2020/02/21-11:14:46.251] [null-err-569] [INFO] [dku.utils] 
- self._start_once() [2020/02/21-11:14:46.251] [null-err-569] [INFO] [dku.utils] - File "/appl/dataiku/dataiku-dss-5.1.5/python/dataiku/core/dataset_write.py", line 310, in _start_once [2020/02/21-11:14:46.251] [null-err-569] [INFO] [dku.utils] - self.waiter.raise_on_failure() [2020/02/21-11:14:46.251] [null-err-569] [INFO] [dku.utils] - File "/appl/dataiku/dataiku-dss-5.1.5/python/dataiku/core/dataset_write.py", line 204, in raise_on_failure [2020/02/21-11:14:46.251] [null-err-569] [INFO] [dku.utils] - exec("raise self.exception_type, self.exception, self.traceback") [2020/02/21-11:14:46.252] [null-err-569] [INFO] [dku.utils] - File "/appl/dataiku/dataiku-dss-5.1.5/python/dataiku/core/dataset_write.py", line 217, in run [2020/02/21-11:14:46.252] [null-err-569] [INFO] [dku.utils] - self.streaming_api.wait_write_session(self.session_id) [2020/02/21-11:14:46.252] [null-err-569] [INFO] [dku.utils] - File "/appl/dataiku/dataiku-dss-5.1.5/python/dataiku/core/dataset_write.py", line 180, in wait_write_session [2020/02/21-11:14:46.252] [null-err-569] [INFO] [dku.utils] - raise Exception(u'An error occurred during dataset write (%s): %s' % (id, decoded_resp["message"])) [2020/02/21-11:14:46.252] [null-err-569] [INFO] [dku.utils] - Exception: An error occurred during dataset write (J2GxTAyNyd): SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table [2020/02/21-11:14:46.252] [null-err-569] [INFO] [dku.utils] - End Python stack [2020/02/21-11:14:46.269] [qtp1003307628-111] [DEBUG] [dku.jobs] - Command /tintercom/datasets/set-schema processed in 39ms [2020/02/21-11:14:46.274] [qtp1003307628-121] [INFO] [dku.jobs] - Connects using API ticket [2020/02/21-11:14:46.274] [qtp1003307628-121] [DEBUG] [dku.jobs] - Received command : /tintercom/datasets/init-write-session [2020/02/21-11:14:46.299] [qtp1003307628-111] [INFO] [dku.jobs] - Connects 
using API ticket
[2020/02/21-11:14:46.300] [qtp1003307628-111] [DEBUG] [dku.jobs] - Received command : /tintercom/datasets/wait-write-session
[2020/02/21-11:14:46.330] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:46,325 3662 INFO [Child] Process 3664 exited with exit=1 signal=0
[2020/02/21-11:14:46.330] [wrapper-stderr-564] [INFO] [dku.utils] - 2020-02-21 11:14:46,325 3662 INFO Full child code: 1
[2020/02/21-11:14:46.343] [FRT-546-FlowRunnable] [INFO] [dku.recipes.code.base] act.compute_python_chunk_example_1996-12 - Error file found, trying to throw it: /appl/dataiku/dss_data/jobs/#DELETED#/Build_python_chunk_example__1993_01_1998_12__2020-02-21T10-13-32.659/compute_python_chunk_example_1996-12/python-recipe/pyoutg5N6BhQ2iOZJ/error.json
[2020/02/21-11:14:46.344] [FRT-546-FlowRunnable] [INFO] [dku.recipes.code.base] act.compute_python_chunk_example_1996-12 - Raw error is{"errorType":"\u003ctype \u0027exceptions.Exception\u0027\u003e","message":"An error occurred during dataset write (J2GxTAyNyd): SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : \u0027DOUBLE\u0027 in dataset, \u0027NVARCHAR\u0027(-9:nvarchar) in table","detailedMessage":"At line 22: \u003ctype \u0027exceptions.Exception\u0027\u003e: An error occurred during dataset write (J2GxTAyNyd): SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : \u0027DOUBLE\u0027 in dataset, \u0027NVARCHAR\u0027(-9:nvarchar) in table","stackTrace":[]}
[2020/02/21-11:14:46.344] [FRT-546-FlowRunnable] [INFO] [dku.recipes.code.base] act.compute_python_chunk_example_1996-12 - Now err: {"errorType":"\u003ctype \u0027exceptions.Exception\u0027\u003e","message":"Error in Python process: An error occurred during dataset write (J2GxTAyNyd): SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : \u0027DOUBLE\u0027 in dataset, \u0027NVARCHAR\u0027(-9:nvarchar) in table","detailedMessage":"Error in Python process: At line 22: \u003ctype \u0027exceptions.Exception\u0027\u003e: An error occurred during dataset write (J2GxTAyNyd): SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : \u0027DOUBLE\u0027 in dataset, \u0027NVARCHAR\u0027(-9:nvarchar) in table","stackTrace":[]}
[2020/02/21-11:14:46.345] [FRT-546-FlowRunnable] [INFO] [dku.flow.activity] act.compute_python_chunk_example_1996-12 - Run thread failed for activity compute_python_chunk_example_1996-12
com.dataiku.common.server.APIError$SerializedErrorException: Error in Python process: At line 22: <type 'exceptions.Exception'>: An error occurred during dataset write (J2GxTAyNyd): SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table
    at com.dataiku.dip.dataflow.exec.AbstractCodeBasedActivityRunner.handleErrorFile(AbstractCodeBasedActivityRunner.java:186)
    at com.dataiku.dip.dataflow.exec.AbstractCodeBasedActivityRunner.handleExecutionResult(AbstractCodeBasedActivityRunner.java:166)
    at com.dataiku.dip.dataflow.exec.AbstractCodeBasedActivityRunner.execute(AbstractCodeBasedActivityRunner.java:102)
    at com.dataiku.dip.dataflow.exec.AbstractPythonRecipeRunner.executeScript(AbstractPythonRecipeRunner.java:48)
    at com.dataiku.dip.recipes.code.python.PythonRecipeRunner.run(PythonRecipeRunner.java:58)
    at com.dataiku.dip.dataflow.jobrunner.ActivityRunner$FlowRunnableThread.run(ActivityRunner.java:366)
[2020/02/21-11:14:46.517] [ActivityExecutor-55] [INFO] [dku.flow.activity] running compute_python_chunk_example_1996-12 - activity is finished
[2020/02/21-11:14:46.518] [ActivityExecutor-55] [ERROR] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Activity failed
com.dataiku.common.server.APIError$SerializedErrorException: Error in Python process: At line 22: <type 'exceptions.Exception'>: An error occurred during dataset write (J2GxTAyNyd): SQLException: Cannot write to partition 1996-12, table already exists but with an incompatible schema: Type mismatch for column 10 (bank) : 'DOUBLE' in dataset, 'NVARCHAR'(-9:nvarchar) in table
    at com.dataiku.dip.dataflow.exec.AbstractCodeBasedActivityRunner.handleErrorFile(AbstractCodeBasedActivityRunner.java:186)
    at com.dataiku.dip.dataflow.exec.AbstractCodeBasedActivityRunner.handleExecutionResult(AbstractCodeBasedActivityRunner.java:166)
    at com.dataiku.dip.dataflow.exec.AbstractCodeBasedActivityRunner.execute(AbstractCodeBasedActivityRunner.java:102)
    at com.dataiku.dip.dataflow.exec.AbstractPythonRecipeRunner.executeScript(AbstractPythonRecipeRunner.java:48)
    at com.dataiku.dip.recipes.code.python.PythonRecipeRunner.run(PythonRecipeRunner.java:58)
    at com.dataiku.dip.dataflow.jobrunner.ActivityRunner$FlowRunnableThread.run(ActivityRunner.java:366)
[2020/02/21-11:14:46.518] [ActivityExecutor-55] [INFO] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Executing default post-activity lifecycle hook
[2020/02/21-11:14:46.520] [ActivityExecutor-55] [INFO] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Removing samples for #DELETED#.python_chunk_example
[2020/02/21-11:14:46.521] [ActivityExecutor-55] [INFO] [dku.flow.activity] running compute_python_chunk_example_1996-12 - Done post-activity tasks
[2020/02/21-11:14:48.328] [qtp1003307628-114] [INFO] [dku.jobs] - Connects using API ticket
[2020/02/21-11:14:48.329] [qtp1003307628-114] [DEBUG] [dku.jobs] - Received command : /tintercom/datasets/wait-write-session
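Reading the entries above together: the chunked recipe pushed column 10 (`bank`) as `DOUBLE`, while the existing table stores it as `NVARCHAR`, and the later `set-schema` fixup already lists `bank` as `string`. A common way this arises in a chunked Python recipe is a partition (here 1996-12) whose `bank` values are all missing, so pandas infers `float64` for the column. A minimal sketch of a possible guard, assuming pandas >= 1.0; the DataFrame contents are invented for illustration, and only the column names come from the log:

```python
import pandas as pd

# Hypothetical chunk for partition 1996-12: every `bank` value is missing,
# so pandas infers float64 for the column -- which the write session then
# pushes as DOUBLE, clashing with the table's existing NVARCHAR column.
chunk = pd.DataFrame({
    "k_symbol": ["UROK", "SLUZBY"],          # invented sample values
    "bank": [float("nan"), float("nan")],    # all-null in this partition
})
assert chunk["bank"].dtype == "float64"

# Coerce the column to the nullable string dtype before handing the chunk
# to the dataset writer, so the pushed schema stays `string` even when the
# partition contains no bank values.
chunk["bank"] = chunk["bank"].astype("string")
```

Applying such a cast to each chunk before writing it (or, alternatively, fixing the dataset's schema once in DSS and dropping the output table so it is recreated with the corrected types) should keep the per-chunk schema consistent with the table.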