UDFs in SparkSQL
RSDonaldson
Dataiku DSS Core Designer, Dataiku DSS Adv Designer, Registered Posts: 1 ✭
Is it possible to use UDFs within a SparkSQL recipe? If so, how do I define one?
This is the function I am trying to use:
def CAPITALIZE(str_wrd):
    # Capitalize each word of a string; return None for non-string or blank input.
    # The isinstance check comes first so that non-string values don't raise on .isspace().
    if isinstance(str_wrd, str) and str_wrd and not str_wrd.isspace():
        final_wrd = []
        new_str = str_wrd.lower().split()
        for x in new_str:
            if x:
                x1 = x.capitalize()
                final_wrd.append(x1)
        result = " ".join(final_wrd)
    else:
        result = None
    return result

spark.udf.register("CAPITALIZE", CAPITALIZE)
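For reference, this is roughly how I would expect to call the registered UDF from Spark SQL in a plain PySpark session; the table and column names below ("my_table", "name") are just placeholders for illustration:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Register the Python function; with no return type given it defaults to StringType.
    spark.udf.register("CAPITALIZE", CAPITALIZE)

    # Call the registered UDF from SQL on a temp view / table.
    df = spark.sql("SELECT CAPITALIZE(name) AS name_cap FROM my_table")
    df.show()

What I don't know is whether (and where) I can do this registration so that the function is visible inside a SparkSQL recipe in DSS.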