set spark.sql.legacy.timeParserPolicy to LEGACY

Diaa

Hi,

I'm running a PySpark recipe in DSS. While parsing a date column I got an error asking me to set spark.sql.legacy.timeParserPolicy to LEGACY. How do I do that? On Stack Overflow, the suggested solution was to run

spark.conf.set("spark.sql.legacy.timeParserPolicy","LEGACY")

However, that results in a NameError stating that spark is not defined. This may be a beginner's question, so please bear with me.

Here is the beginning of my code:

import dataiku
from dataiku import spark as dkuspark
from pyspark import SparkContext
from pyspark.sql import SQLContext

from pyspark.sql import functions as func
from pyspark.sql import Row
from pyspark.sql.functions import col, unix_timestamp, round, Column, coalesce

sc = SparkContext.getOrCreate()
sqlContext = SQLContext(sc)

# Read recipe inputs
df = dataiku.Dataset("df")
df0 = dkuspark.get_dataframe(sqlContext, df)
spark.conf.set("spark.sql.legacy.timeParserPolicy","LEGACY")  # fails: NameError: name 'spark' is not defined

..... # code


Operating system used: Win 10


Answers
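The NameError happens because the snippet only creates a SparkContext and an SQLContext; the `spark` variable (a SparkSession) is never defined. A minimal sketch of one way to fix this, assuming Spark 3.x in DSS and keeping the dataset name "df" from the question: obtain the session with `SparkSession.builder.getOrCreate()` (which reuses the recipe's existing context) and set the option on it before reading the data.

```python
import dataiku
from dataiku import spark as dkuspark
from pyspark import SparkContext
from pyspark.sql import SQLContext, SparkSession

sc = SparkContext.getOrCreate()
sqlContext = SQLContext(sc)

# Define `spark` by reusing the active session, then set the option.
# LEGACY makes Spark 3.x fall back to the pre-3.0 date/time parser.
spark = SparkSession.builder.getOrCreate()
spark.conf.set("spark.sql.legacy.timeParserPolicy", "LEGACY")

# Read recipe inputs as before
df = dataiku.Dataset("df")
df0 = dkuspark.get_dataframe(sqlContext, df)
```

Setting it through the existing SQLContext with `sqlContext.setConf("spark.sql.legacy.timeParserPolicy", "LEGACY")` should have the same effect, since both write the same session-level configuration.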
