Has anyone experienced this error? I'm running a join recipe on the Spark engine and the run is taking ages. Every time I check the logs, I see this error. I also tried running on the Hive engine, but got an IOException.
From the Hive engine error log:
    HTTP code: , type: java.io.IOException
When an "OutOfMemoryError" happens inside a task (i.e. you see "Task failed ... OutOfMemoryError" in the logs), you typically need to increase the spark.executor.memory setting.
You can set this in the recipe settings (Advanced > Spark config) by adding the key spark.executor.memory. If you have not overridden it, the default value is 2g; you may want to try 4g, for example, and keep increasing it if the job still fails.
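For reference, here is a minimal sketch of the equivalent setting in a standalone PySpark session; the app name and the 4g value are illustrative assumptions, not taken from the original recipe:

    from pyspark.sql import SparkSession

    # Equivalent of adding spark.executor.memory under Advanced > Spark config.
    # Executor memory must be set before the SparkContext is created, as here.
    # "4g" is a starting point; keep increasing (8g, 16g, ...) if tasks still OOM.
    spark = (
        SparkSession.builder
        .appName("join-recipe-oom-test")  # hypothetical app name
        .config("spark.executor.memory", "4g")
        .getOrCreate()
    )

The same setting can also be passed on the command line when submitting a job, e.g. spark-submit --conf spark.executor.memory=4g your_job.py.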