Job process died (killed - maybe out of memory ?) when trying to score dataset


I'm trying to score my dataset (about 800,000 rows and 720 columns). After about 5 minutes, I get the "Job process died (killed - maybe out of memory ?)" error. I noticed that if I subset it down to 10,000 rows, it runs as expected.

To build this 800K-row scoring dataset, I take a 5M-row table, inner join it with my scoring scope, and output the resulting 'scoring' table (about 800K rows), which is then fed into the model scoring step.
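
For concreteness, here is a minimal pandas sketch of the data flow described above. The table names, file names, and join key (full_table, scoring_scope, customer_id) are hypothetical stand-ins; the actual join runs inside the project, not in standalone Python.

```python
import pandas as pd

# Hypothetical inputs; names are stand-ins for the real tables in the project.
full_table = pd.read_parquet("full_table.parquet")        # ~5M rows, 720 columns
scoring_scope = pd.read_parquet("scoring_scope.parquet")  # keys defining the population to score

# Inner join the 5M-row table with the scoring scope to produce the ~800K-row scoring table
scoring_table = full_table.merge(scoring_scope, on="customer_id", how="inner")

# The full ~800K-row result is what fails with the out-of-memory kill during scoring,
# while a 10,000-row subset scores without error.
scoring_sample = scoring_table.head(10_000)
```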

I should note that when I previously used a SQL Server Data Source, I was able to score 900K+ row datasets with no problem. I am now using Snowflake Data Sources instead, so I'm wondering whether that is the cause, or whether it's because I'm joining the data before scoring.


Operating system used: Windows

