
NLP - The Visual Way

Level 1

I have a problem with the analysis part. Every time I run it, the training fails.

 

(screenshot of the training error attached)

14 Replies
Level 3

Hey Lisa, looks like you're a new community user. Welcome!

Without knowing too much about your setup, the error suggests there is an inf or NaN in the input dataset. I suspect that if you remove the records where the infs/NaNs occur, you will be OK.

To check this, you could sample your training data down to a small set of records that you know contains no infs/NaNs and run the analysis; if everything works, that was probably the issue. You can likely get more specific guidance about which column is causing the problem by checking the logs.
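To make the inf/NaN hunt concrete, here is a minimal pandas sketch (toy data, not the actual IMDB dataset) that locates and drops rows containing non-finite numeric values before training:

```python
import numpy as np
import pandas as pd

# Toy stand-in for the training data; 'polarity' is the target column,
# and two rows deliberately contain an inf and a NaN.
df = pd.DataFrame({
    "text": ["good movie", "bad movie", "great", "awful"],
    "polarity": [1.0, 0.0, np.inf, np.nan],
})

# Locate rows where any numeric column is non-finite.
numeric = df.select_dtypes(include=[np.number])
mask = np.isfinite(numeric).all(axis=1)
print(df[~mask].index.tolist())  # → [2, 3]

# Keep only the clean rows before training.
clean = df[mask]
```

Note that `np.isfinite` catches both inf and NaN in one pass, which is exactly the condition scikit-learn is complaining about.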

 

Level 1
Author

Thanks for your reply.
There are no zero or empty values, and I still get the error.
This model is part of the NLP - The Visual Way course (ML Practitioner).

Dataiker

Hi @Lisa2410, when you say it fails "every time", do you mean that you are not able to build any of the models? For example, are you able to build the first baseline model? At which step does the first failure occur?

Dataiku
Level 1
Author

At the processing step.
I am working on a local machine (Oracle VM).

Log:

ValueError: array must not contain infs or NaNs
[2020/10/16-10:33:06.765] [KNL-python-single-command-kernel-monitor-164] [INFO] [dku.kernels]  - Process done with code 0
[2020/10/16-10:33:06.765] [KNL-python-single-command-kernel-monitor-164] [INFO] [dip.tickets]  - Destroying API ticket for analysis-ml-DKU_TUTORIAL_NLP_VISUAL-I2w3z5H on behalf of admin
[2020/10/16-10:33:06.771] [KNL-python-single-command-kernel-monitor-164] [WARN] [dku.resource]  - stat file for pid 1938 does not exist. Process died?
[2020/10/16-10:33:06.772] [KNL-python-single-command-kernel-monitor-164] [INFO] [dku.resourceusage]  - Reporting completion of CRU:{"context":{"type":"ANALYSIS_ML_TRAIN","authIdentifier":"admin","projectKey":"DKU_TUTORIAL_NLP_VISUAL","analysisId":"fX041VEB","mlTaskId":"g4bYkwir","sessionId":"s1"},"type":"LOCAL_PROCESS","id":"S55KyLPITbMwL01t","startTime":1602844329233,"localProcess":{"pid":1938,"commandName":"/home/dataiku/dss/bin/python","cpuUserTimeMS":1880,"cpuSystemTimeMS":16970,"cpuChildrenUserTimeMS":0,"cpuChildrenSystemTimeMS":20,"cpuTotalMS":18870,"cpuCurrent":0.2836596084698362,"vmSizeMB":1483,"vmRSSMB":732,"vmHWMMB":732,"vmRSSAnonMB":713,"vmDataMB":817,"vmSizePeakMB":1483,"vmRSSPeakMB":732,"vmRSSTotalMBS":21160,"majorFaults":79,"childrenMajorFaults":0}}
[2020/10/16-10:33:06.772] [MRT-157] [INFO] [dku.kernels]  - Getting kernel tail
[2020/10/16-10:33:06.773] [MRT-157] [INFO] [dku.kernels]  - Trying to enrich exception: com.dataiku.dip.io.SocketBlockLinkKernelException: Failed to train : <type 'exceptions.ValueError'> : array must not contain infs or NaNs from kernel com.dataiku.dip.analysis.coreservices.AnalysisMLKernel@67a1c7b9 process=null pid=?? retcode=0
[2020/10/16-10:33:06.773] [MRT-157] [WARN] [dku.analysis.ml.python]  - Training failed
com.dataiku.dip.io.SocketBlockLinkKernelException: Failed to train : <type 'exceptions.ValueError'> : array must not contain infs or NaNs
	at com.dataiku.dip.io.SocketBlockLinkInteraction.throwExceptionFromPython(SocketBlockLinkInteraction.java:302)
	at com.dataiku.dip.io.SocketBlockLinkInteraction$AsyncResult.checkException(SocketBlockLinkInteraction.java:215)
	at com.dataiku.dip.io.SocketBlockLinkInteraction$AsyncResult.get(SocketBlockLinkInteraction.java:190)
	at com.dataiku.dip.io.SingleCommandKernelLink$1.call(SingleCommandKernelLink.java:208)
	at com.dataiku.dip.analysis.ml.prediction.PredictionTrainAdditionalThread.process(PredictionTrainAdditionalThread.java:74)
	at com.dataiku.dip.analysis.ml.shared.PRNSTrainThread.run(PRNSTrainThread.java:143)
[2020/10/16-10:33:06.774] [FT-TrainWorkThread-u4ApfB8z-156] [INFO] [dku.analysis.ml.python] T-g4bYkwir - Processing thread joined ...
[2020/10/16-10:33:06.775] [FT-TrainWorkThread-u4ApfB8z-156] [INFO] [dku.analysis.ml.python] T-g4bYkwir - Joining processing thread ...
[2020/10/16-10:33:06.775] [FT-TrainWorkThread-u4ApfB8z-156] [INFO] [dku.analysis.ml.python] T-g4bYkwir - Processing thread joined ...
[2020/10/16-10:33:06.775] [FT-TrainWorkThread-u4ApfB8z-156] [INFO] [dku.analysis.prediction] T-g4bYkwir - Train done
[2020/10/16-10:33:06.775] [FT-TrainWorkThread-u4ApfB8z-156] [INFO] [dku.analysis.prediction] T-g4bYkwir - Train done
[2020/10/16-10:33:06.782] [FT-TrainWorkThread-u4ApfB8z-156] [INFO] [dku.analysis.prediction] T-g4bYkwir - Publishing mltask-train-done reflected event

 

Dataiker

How did you create the project and import the data? Did you use the "+New Project" method, or another method? Can you confirm that your IMDB_train_prepared dataset has only 2 columns (text and polarity) with 25k records? Can you also send a job diagnostic? Thanks!
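One way to verify the column count and record count is to load the prepared data with pandas. The sketch below fabricates a tiny gzipped CSV in memory to stand in for the real `out-s0.csv.gz` that the Prepare recipe writes; against a real DSS install you would point `read_csv` at the managed dataset's path instead:

```python
import gzip
import io
import pandas as pd

# Fabricate a tiny gzipped CSV in memory (stand-in for out-s0.csv.gz).
buf = io.BytesIO()
with gzip.open(buf, "wt") as f:
    f.write("text,polarity\nGreat movie,1\nTerrible,0\n")
buf.seek(0)

# Load it the same way you would load the real file.
df = pd.read_csv(buf, compression="gzip")
print(list(df.columns))  # expect exactly ["text", "polarity"]
print(len(df))           # 2 in this toy file; expect 25000 for IMDB_train_prepared
```

If extra columns such as `sample` or `sentiment` show up here, the Prepare recipe's column-removal step did not apply as expected.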

Dataiku
Level 1
Author

There are 25,000 records in the train prepared dataset.

And here is the log of the Prepare recipe.

[13:21:09] [INFO] [dku] running compute_IMDB_train_prepared_NP - ----------------------------------------
[13:21:09] [INFO] [dku] running compute_IMDB_train_prepared_NP - DSS startup: jek version:7.0.2
[13:21:09] [INFO] [dku] running compute_IMDB_train_prepared_NP - DSS 
[13:21:09] [INFO] [dku] running compute_IMDB_train_prepared_NP - OS: Linux 3.10.0-1062.18.1.el7.x86_64 amd64 - Java: Oracle Corporation 1.8.0_242
[13:21:09] [INFO] [dku.flow.jobrunner] running compute_IMDB_train_prepared_NP - Allocated a slot for this activity!
[13:21:09] [INFO] [dku.flow.jobrunner] running compute_IMDB_train_prepared_NP - Run activity
[13:21:09] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Executing default pre-activity lifecycle hook
[13:21:09] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Checking if sources are ready
[13:21:09] [DEBUG] [dku.dataset.hash] running compute_IMDB_train_prepared_NP - Readiness cache miss for dataset__admin__DKU_TUTORIAL_NLP_VISUAL.IMDB_train__NP
[13:21:09] [INFO] [com.dataiku.dip.datasets.fs.FilesInFolderDatasetHandler] running compute_IMDB_train_prepared_NP - Build real handler with filter {"connection":"filesystem_folders","path":"DKU_TUTORIAL_NLP_VISUAL/TlQPqaXQ","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"RULES_INCLUDED_ONLY","excludeRules":[],"includeRules":[{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"/**/*"},{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"1_IMDB_train.csv.gz"}],"explicitFiles":[]}}
[13:21:09] [INFO] [dku.datasets.file] running compute_IMDB_train_prepared_NP - Building Filesystem handler config: {"connection":"filesystem_folders","path":"DKU_TUTORIAL_NLP_VISUAL/TlQPqaXQ","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"RULES_INCLUDED_ONLY","excludeRules":[],"includeRules":[{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"/**/*"},{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"1_IMDB_train.csv.gz"}],"explicitFiles":[]}}
[13:21:09] [INFO] [dku.datasets.ftplike] running compute_IMDB_train_prepared_NP - Enumerating Filesystem dataset prefix=
[13:21:09] [DEBUG] [dku.fs.local] running compute_IMDB_train_prepared_NP - Enumerating local filesystem prefix=/
[13:21:09] [DEBUG] [dku.fs.local] running compute_IMDB_train_prepared_NP - Enumeration done nb_paths=1 size=13466304
[13:21:09] [INFO] [dku.dataset.hash] running compute_IMDB_train_prepared_NP - Caching readiness for dataset__admin__DKU_TUTORIAL_NLP_VISUAL.IMDB_train__NP s=READY h=CqwG1DAvTJ91XhHSG1k55Q
[13:21:09] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Checked source readiness DKU_TUTORIAL_NLP_VISUAL.IMDB_train -> true
[13:21:09] [DEBUG] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Computing hashes to propagate BEFORE activity
[13:21:09] [DEBUG] [dku.dataset.hash] running compute_IMDB_train_prepared_NP - Readiness cache miss for dataset__admin__DKU_TUTORIAL_NLP_VISUAL.IMDB_train__NP
[13:21:09] [INFO] [com.dataiku.dip.datasets.fs.FilesInFolderDatasetHandler] running compute_IMDB_train_prepared_NP - Build real handler with filter {"connection":"filesystem_folders","path":"DKU_TUTORIAL_NLP_VISUAL/TlQPqaXQ","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"RULES_INCLUDED_ONLY","excludeRules":[],"includeRules":[{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"/**/*"},{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"1_IMDB_train.csv.gz"}],"explicitFiles":[]}}
[13:21:09] [INFO] [dku.datasets.file] running compute_IMDB_train_prepared_NP - Building Filesystem handler config: {"connection":"filesystem_folders","path":"DKU_TUTORIAL_NLP_VISUAL/TlQPqaXQ","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"RULES_INCLUDED_ONLY","excludeRules":[],"includeRules":[{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"/**/*"},{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"1_IMDB_train.csv.gz"}],"explicitFiles":[]}}
[13:21:09] [INFO] [dku.datasets.ftplike] running compute_IMDB_train_prepared_NP - Enumerating Filesystem dataset prefix=
[13:21:09] [DEBUG] [dku.fs.local] running compute_IMDB_train_prepared_NP - Enumerating local filesystem prefix=/
[13:21:09] [DEBUG] [dku.fs.local] running compute_IMDB_train_prepared_NP - Enumeration done nb_paths=1 size=13466304
[13:21:09] [INFO] [dku.dataset.hash] running compute_IMDB_train_prepared_NP - Caching readiness for dataset__admin__DKU_TUTORIAL_NLP_VISUAL.IMDB_train__NP s=READY h=CqwG1DAvTJ91XhHSG1k55Q
[13:21:09] [DEBUG] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Recorded 1 hashes before activity run
[13:21:09] [DEBUG] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Building recipe runner of type
[13:21:09] [INFO] [com.dataiku.dip.hive.HiveConfigurator] running compute_IMDB_train_prepared_NP - Hive support is disabled (no hadoop)
[13:21:09] [INFO] [com.dataiku.dip.impala.ImpalaConfigurator] running compute_IMDB_train_prepared_NP - Impala support is disabled (no hadoop)
[13:21:09] [INFO] [dip.cluster.selector] running compute_IMDB_train_prepared_NP - Build cluster settings hadoop=__builtin__ k8s=__builtin__ authSource=USER_FROM_UI
[13:21:09] [INFO] [dip.cluster.selector] running compute_IMDB_train_prepared_NP - Build cluster settings hadoop=__builtin__ k8s=__builtin__ authSource=USER_FROM_UI
[13:21:09] [INFO] [dku.recipes.engines] running compute_IMDB_train_prepared_NP - Resolved preferences projectKey=DKU_TUTORIAL_NLP_VISUAL recipeType=shaker global={"forbiddenEngines":[],"enginesPreferenceOrder":[],"forbiddenByRecipeType":{},"preferenceByRecipeType":{}} project={"forbiddenEngines":[],"enginesPreferenceOrder":[],"forbiddenByRecipeType":{},"preferenceByRecipeType":{}} pplusg={"forbiddenEngines":[],"enginesPreferenceOrder":[],"forbiddenByRecipeType":{},"preferenceByRecipeType":{}} recipe=null resolved={"forbiddenEngines":[],"enginesPreferenceOrder":[],"forbiddenByRecipeType":{},"preferenceByRecipeType":{}}
[13:21:09] [INFO] [dku.recipes.shaker] running compute_IMDB_train_prepared_NP - User-selected engine: null - used engine: DSS
[13:21:09] [INFO] [dku.flow.shaker] running compute_IMDB_train_prepared_NP - SET PAYLOAD {
  "columnsSelection": {
    "mode": "ALL"
  },
  "explorationSampling": {
    "_refreshTrigger": 0,
    "selection": {
      "filter": {
        "distinct": false,
        "enabled": false
      },
      "latestPartitionsN": 1,
      "maxRecords": 10000,
      "ordering": {
        "rules": [],
        "enabled": false
      },
      "withinFirstN": -1,
      "partitionSelectionMethod": "ALL",
      "maxStoredBytes": 104857600,
      "targetRatio": 0.02,
      "maxReadUncompressedBytes": -1,
      "samplingMethod": "HEAD_SEQUENTIAL"
    },
    "autoRefreshSample": false
  },
  "explorationFilters": [],
  "origin": "PREPARE_RECIPE",
  "exploreUIParams": {
    "autoRefresh": true
  },
  "steps": [
    {
      "preview": false,
      "metaType": "PROCESSOR",
      "disabled": false,
      "type": "ColumnsSelector",
      "params": {
        "columns": [
          "sample",
          "sentiment"
        ],
        "keep": false,
        "appliesTo": "COLUMNS"
      },
      "alwaysShowComment": false
    }
  ],
  "maxProcessedMemTableBytes": -1,
  "previewMode": "ALL_ROWS",
  "vizSampling": {
    "_refreshTrigger": 0,
    "autoRefreshSample": false
  },
  "analysisColumnData": {},
  "columnWidthsByName": {
    "sentiment": 100,
    "text": 300,
    "sample": 100,
    "polarity": 100
  },
  "sorting": [],
  "globalSearchQuery": "",
  "coloring": {
    "scheme": "MEANING_AND_STATUS",
    "individualColumns": [],
    "valueColoringMode": "HASH"
  }
}
[13:21:10] [INFO] [dku.flow.shaker] running compute_IMDB_train_prepared_NP - Shaker recipe, from IMDB_train of type FilesInFolder
[13:21:10] [DEBUG] [dku.job.activity] running compute_IMDB_train_prepared_NP - Filling source sizes
[13:21:10] [INFO] [com.dataiku.dip.datasets.fs.FilesInFolderDatasetHandler] running compute_IMDB_train_prepared_NP - Build real handler with filter {"connection":"filesystem_folders","path":"DKU_TUTORIAL_NLP_VISUAL/TlQPqaXQ","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"RULES_INCLUDED_ONLY","excludeRules":[],"includeRules":[{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"/**/*"},{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"1_IMDB_train.csv.gz"}],"explicitFiles":[]}}
[13:21:10] [INFO] [dku.datasets.file] running compute_IMDB_train_prepared_NP - Building Filesystem handler config: {"connection":"filesystem_folders","path":"DKU_TUTORIAL_NLP_VISUAL/TlQPqaXQ","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"RULES_INCLUDED_ONLY","excludeRules":[],"includeRules":[{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"/**/*"},{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"1_IMDB_train.csv.gz"}],"explicitFiles":[]}}
[13:21:10] [INFO] [dku.datasets.ftplike] running compute_IMDB_train_prepared_NP - Enumerating Filesystem dataset prefix=
[13:21:10] [DEBUG] [dku.fs.local] running compute_IMDB_train_prepared_NP - Enumerating local filesystem prefix=/
[13:21:10] [DEBUG] [dku.fs.local] running compute_IMDB_train_prepared_NP - Enumeration done nb_paths=1 size=13466304
[13:21:10] [DEBUG] [dku.job.activity] running compute_IMDB_train_prepared_NP - Done filling source sizes
[13:21:10] [INFO] [dku.datasets.file] running compute_IMDB_train_prepared_NP - Building Filesystem handler config: {"connection":"filesystem_managed","path":"DKU_TUTORIAL_NLP_VISUAL/IMDB_train_prepared","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[13:21:10] [INFO] [dku.datasets.ftplike] running compute_IMDB_train_prepared_NP - Clear partitions
[13:21:10] [INFO] [dku.datasets.ftplike] running compute_IMDB_train_prepared_NP - Clearing partition as a folder : 'NP'
[13:21:10] [INFO] [dku.datasets.ftplike] running compute_IMDB_train_prepared_NP - Done clearing partition 'NP'
[13:21:10] [INFO] [dku.datasets.file] running compute_IMDB_train_prepared_NP - Building Filesystem handler config: {"connection":"filesystem_managed","path":"DKU_TUTORIAL_NLP_VISUAL/IMDB_train_prepared","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[13:21:10] [INFO] [com.dataiku.dip.datasets.fs.FilesInFolderDatasetHandler] running compute_IMDB_train_prepared_NP - Build real handler with filter {"connection":"filesystem_folders","path":"DKU_TUTORIAL_NLP_VISUAL/TlQPqaXQ","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"RULES_INCLUDED_ONLY","excludeRules":[],"includeRules":[{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"/**/*"},{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"1_IMDB_train.csv.gz"}],"explicitFiles":[]}}
[13:21:10] [INFO] [dku.datasets.file] running compute_IMDB_train_prepared_NP - Building Filesystem handler config: {"connection":"filesystem_folders","path":"DKU_TUTORIAL_NLP_VISUAL/TlQPqaXQ","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"RULES_INCLUDED_ONLY","excludeRules":[],"includeRules":[{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"/**/*"},{"matchingMode":"FULL_PATH","mode":"GLOB","expr":"1_IMDB_train.csv.gz"}],"explicitFiles":[]}}
[13:21:10] [INFO] [dku.datasets.ftplike] running compute_IMDB_train_prepared_NP - Enumerating Filesystem dataset prefix=
[13:21:10] [DEBUG] [dku.fs.local] running compute_IMDB_train_prepared_NP - Enumerating local filesystem prefix=/
[13:21:10] [DEBUG] [dku.fs.local] running compute_IMDB_train_prepared_NP - Enumeration done nb_paths=1 size=13466304
[13:21:10] [INFO] [dku.flow.shaker] running compute_IMDB_train_prepared_NP - Using parallel runner
[13:21:10] [DEBUG] [dku.flow.stream] running compute_IMDB_train_prepared_NP - ADD PARTITION <partition:NP> files: 1
[13:21:10] [INFO] [dku.datasets.file] running compute_IMDB_train_prepared_NP - Building Filesystem handler config: {"connection":"filesystem_managed","path":"DKU_TUTORIAL_NLP_VISUAL/IMDB_train_prepared","notReadyIfEmpty":false,"filesSelectionRules":{"mode":"ALL","excludeRules":[],"includeRules":[],"explicitFiles":[]}}
[13:21:10] [WARN] [dku.fs.local] running compute_IMDB_train_prepared_NP - File does not exist: /home/dataiku/dss/managed_datasets/DKU_TUTORIAL_NLP_VISUAL/IMDB_train_prepared
[13:21:10] [WARN] [dku.fs.local] running compute_IMDB_train_prepared_NP - File does not exist: /home/dataiku/dss/managed_datasets/DKU_TUTORIAL_NLP_VISUAL/IMDB_train_prepared/out-s0.csv.gz
[13:21:10] [INFO] [dku.output.file] running compute_IMDB_train_prepared_NP - Writing base=/ split=0 chunk=0 -> target = out-s0.csv.gz
[13:21:10] [INFO] [dku.flow.shaker] running compute_IMDB_train_prepared_NP - Shaker recipe generated, 2 runnables
[13:21:10] [DEBUG] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Recipe runner built, will use 2 thread(s)
[13:21:10] [DEBUG] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Starting execution thread: com.dataiku.dip.dataflow.exec.stream.ParallelStreamSlaveRunnable@65197fe3
[13:21:10] [DEBUG] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Starting execution thread: com.dataiku.dip.dataflow.exec.stream.ParallelStreamControlRunnable@1bd27ecf
[13:21:10] [DEBUG] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Execution threads started, waiting for activity end
[13:21:10] [INFO] [dku.flow.activity] - Run thread for activity compute_IMDB_train_prepared_NP starting
[13:21:10] [INFO] [dku.pipeline] - start running
[13:21:10] [INFO] [dku.format] - Extractor run: limit=null totalRecords=0
[13:21:10] [INFO] [dku.flow.activity] - Run thread for activity compute_IMDB_train_prepared_NP starting
[13:21:10] [DEBUG] [dku.flow.stream] - Offering /1_IMDB_train.csv.gz for runners to take
[13:21:10] [INFO] [dku.flow.stream] - Offering poison
[13:21:10] [INFO] [dku.flow.stream] - Done offering, waiting for all runners to finish
[13:21:10] [INFO] [dku.flow.stream] - Have 0 done - nthreads 1
[13:21:10] [INFO] [dku.flow.stream] - Split runner has work to do : /1_IMDB_train.csv.gz
[13:21:10] [INFO] [dku] - getCompression filename=**1_IMDB_train.csv.gz**
[13:21:10] [INFO] [dku] - getCompression filename=**1_IMDB_train.csv.gz**
[13:21:10] [INFO] [dku.format] - Start compressed [GZIP] stream: /home/dataiku/dss/managed_folders/DKU_TUTORIAL_NLP_VISUAL/TlQPqaXQ/1_IMDB_train.csv.gz / totalRecsBefore=0
[13:21:10] [INFO] [dku] - getCompression filename=**1_IMDB_train.csv.gz**
[13:21:10] [INFO] [dku] - getCompression filename=**1_IMDB_train.csv.gz**
[13:21:17] [INFO] [dku.format] - after stream totalComp=13466304 totalUncomp=33526915 totalRec=25000
[13:21:17] [INFO] [dku.flow.stream] - Split runner done
[13:21:17] [INFO] [dku.format] - Extractor run done, totalCompressed=13466304 totalRecords=25000
[13:21:17] [INFO] [dku.flow.stream] - Parallel streamer done
[13:21:17] [INFO] [dku.flow.activity] - Run thread done for activity compute_IMDB_train_prepared_NP
[13:21:17] [INFO] [dku.pipeline] - done running
[13:21:17] [INFO] [dku.flow.activity] - Run thread done for activity compute_IMDB_train_prepared_NP
[13:21:17] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - activity is finished
[13:21:17] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Executing default post-activity lifecycle hook
[13:21:17] [INFO] [dip.cluster.selector] running compute_IMDB_train_prepared_NP - Build cluster settings hadoop=__builtin__ k8s=__builtin__ authSource=USER_FROM_UI
[13:21:17] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Activity successful, computing metrics and running checks
[13:21:17] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Activity checks produced 0 values
[13:21:17] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Activity is successful, computing timestamps to propagate
[13:21:17] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Activity is successful, propagating 3 timestamps
[13:21:17] [INFO] [dku.timestamps.sql] running compute_IMDB_train_prepared_NP - Starting to propagate 3 hashes
[13:21:17] [DEBUG] [dku.connections.sql.provider] running compute_IMDB_train_prepared_NP - Commit conn=dssdb-h2-flow_state-aYv5VPZ
[13:21:17] [INFO] [dku.timestamps.sql] running compute_IMDB_train_prepared_NP - Hashes propagated in 46 ms
[13:21:17] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Removing samples for DKU_TUTORIAL_NLP_VISUAL.IMDB_train_prepared
[13:21:17] [INFO] [dku.flow.activity] running compute_IMDB_train_prepared_NP - Done post-activity tasks


 

Level 1
Author

I used the "+New Project" method. It is one of the DSS tutorials.

Dataiker

Thanks for confirming the project setup. One more question: you've sent the log of the Prepare recipe this time. Did you get an error when running that recipe? From your screenshot, I thought you were able to build the prepared dataset successfully and that the problem came from building the model. The job diagnostic from the failed job is what would be most helpful.

Dataiku
Level 1
Author

Here is part of the log from building the model.

[2020/10/19-10:36:04.079] [MRT-397] [INFO] [dku.kernels]  - Getting kernel tail
[2020/10/19-10:36:04.080] [MRT-397] [INFO] [dku.kernels]  - Trying to enrich exception: com.dataiku.dip.io.SocketBlockLinkKernelException: Failed to train : <type 'exceptions.ValueError'> : array must not contain infs or NaNs from kernel com.dataiku.dip.analysis.coreservices.AnalysisMLKernel@70f05928 process=null pid=?? retcode=0
[2020/10/19-10:36:04.080] [MRT-397] [WARN] [dku.analysis.ml.python]  - Training failed
com.dataiku.dip.io.SocketBlockLinkKernelException: Failed to train : <type 'exceptions.ValueError'> : array must not contain infs or NaNs
	at com.dataiku.dip.io.SocketBlockLinkInteraction.throwExceptionFromPython(SocketBlockLinkInteraction.java:302)
	at com.dataiku.dip.io.SocketBlockLinkInteraction$AsyncResult.checkException(SocketBlockLinkInteraction.java:215)
	at com.dataiku.dip.io.SocketBlockLinkInteraction$AsyncResult.get(SocketBlockLinkInteraction.java:190)
	at com.dataiku.dip.io.SingleCommandKernelLink$1.call(SingleCommandKernelLink.java:208)
	at com.dataiku.dip.analysis.ml.prediction.PredictionTrainAdditionalThread.process(PredictionTrainAdditionalThread.java:74)
	at com.dataiku.dip.analysis.ml.shared.PRNSTrainThread.run(PRNSTrainThread.java:143)
[2020/10/19-10:36:04.082] [FT-TrainWorkThread-Iro4cBYS-396] [INFO] [dku.analysis.ml.python] T-g4bYkwir - Processing thread joined ...
[2020/10/19-10:36:04.083] [FT-TrainWorkThread-Iro4cBYS-396] [INFO] [dku.analysis.ml.python] T-g4bYkwir - Joining processing thread ...
[2020/10/19-10:36:04.083] [FT-TrainWorkThread-Iro4cBYS-396] [INFO] [dku.analysis.ml.python] T-g4bYkwir - Processing thread joined ...
[2020/10/19-10:36:04.083] [FT-TrainWorkThread-Iro4cBYS-396] [INFO] [dku.analysis.prediction] T-g4bYkwir - Train done
[2020/10/19-10:36:04.083] [FT-TrainWorkThread-Iro4cBYS-396] [INFO] [dku.analysis.prediction] T-g4bYkwir - Train done
[2020/10/19-10:36:04.090] [FT-TrainWorkThread-Iro4cBYS-396] [INFO] [dku.analysis.prediction] T-g4bYkwir - Publishing mltask-train-done reflected event
Dataiker

I believe you copied the error snippet from the log, but not the full log. Can you provide the full log?

Dataiku
Level 1
Author

Full log 

Dataiker

Sorry for the delay. Could you provide an export of the project, including datasets? I'd like to try to reproduce it on my machine. Thanks!

Dataiku
Level 1
Author

Project NLP Visual

Dataiker

Hi @Lisa2410, our engineers have looked into this problem and have been unable to reproduce it. It may be linked to this scikit-learn issue (https://github.com/scikit-learn/scikit-learn/issues/6665), which appears to be hardware-dependent. One way forward would be to try a machine with different specs and see whether the problem can be reproduced there.
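The linked issue reports randomized SVD occasionally producing non-finite values on certain hardware. As a workaround for narrowing this down, a defensive check between pipeline steps (a hypothetical helper, not a DSS feature) can name the step that produced the bad values instead of failing later inside training:

```python
import numpy as np

def assert_finite(X, step_name):
    """Raise early, naming the pipeline step that produced non-finite values."""
    X = np.asarray(X)
    if not np.isfinite(X).all():
        bad = np.argwhere(~np.isfinite(X))
        raise ValueError(
            f"{step_name} produced non-finite values at indices {bad[:5].tolist()}"
        )
    return X

# Example: a reduced feature matrix with one NaN, as the linked issue describes.
X_reduced = np.array([[0.1, 0.2], [np.nan, 0.4]])
try:
    assert_finite(X_reduced, "TruncatedSVD")
except ValueError as e:
    print(e)  # names the step and points at row 1, column 0
```

Run in a Python notebook against the intermediate arrays, this pinpoints whether the NaNs come from the input data or from a decomposition step.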

Dataiku