set spark.sql.legacy.timeParserPolicy to LEGACY

Diaa
Level 2

Hi,

I'm running a PySpark recipe in DSS. While parsing a date column, I got an error asking me to set spark.sql.legacy.timeParserPolicy to LEGACY. How do I do that? On Stack Overflow, the suggested solution was to run

spark.conf.set("spark.sql.legacy.timeParserPolicy","LEGACY")

However, that results in an error stating that spark is not defined. This may be a beginner's question, so please bear with me.

Here is the beginning of my code:

import dataiku
from dataiku import spark as dkuspark
from pyspark import SparkContext
from pyspark.sql import SQLContext

from pyspark.sql import functions as func
from pyspark.sql import Row
from pyspark.sql.functions import col, unix_timestamp, round, Column, coalesce

sc = SparkContext.getOrCreate()
sqlContext = SQLContext(sc)

# Read recipe inputs
df = dataiku.Dataset("df")
df0 = dkuspark.get_dataframe(sqlContext, df)
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")  # fails: NameError: name 'spark' is not defined

# ... rest of the code
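For reference, the NameError seems to happen because spark normally refers to a SparkSession, which my script never creates. In plain PySpark (outside any DSS-specific setup), it would be defined before calling conf.set; a minimal sketch:

from pyspark.sql import SparkSession

# getOrCreate() returns the running session if one exists, otherwise starts one
spark = SparkSession.builder.getOrCreate()
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")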


Operating system used: Windows 10

ZachM
Dataiker

Hi @Diaa,

You can add Spark configuration to a PySpark recipe in the advanced settings, as shown below:

[Screenshot: the recipe's Advanced tab, showing the Spark configuration settings]
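For this error, the override to add under the Spark configuration is the key and value from the error message (the exact layout of the settings screen depends on your DSS version):

Key:   spark.sql.legacy.timeParserPolicy
Value: LEGACY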

 

Thanks,

Zach

Diaa
Level 2
Author

Thanks Zach, that did the job!

I appreciate it!! 🙂

Diaa
