Hive/Dremio table to PySpark DataFrame

sigma_loge Registered Posts: 1

import dataiku
from dataiku import spark as dkuspark
from pyspark import SparkContext
from pyspark.sql import SQLContext

sc = SparkContext.getOrCreate()
sqlContext = SQLContext(sc)

# Read recipe inputs
internal = dataiku.Dataset("internal22")  # internal22 is a Hive table
internal_df = dkuspark.get_dataframe(sqlContext, internal)

internal_df.count()  # returns 0, but the table actually has millions of records

Answers
