Conversion to Parquet fails in Hadoop HDFS

Benoni

$ hadoop version
Hadoop 3.1.2
Source code repository https://github.com/apache/hadoop.git -r 1019dde65bcf12e05ef48ac71e84550d589e5d9a
Compiled by sunilg on 2019-01-29T01:39Z
Compiled with protoc 2.5.0
From source with checksum 64b8bdd4ca6e77cce75a93eb09ab2a9
This command was run using /usr/local/hadoop/share/hadoop/common/hadoop-common-3.1.2.jar

I receive this error shortly after the recipe starts:

parquet/io/api/RecordConsumer, caused by: ClassNotFoundException: parquet.io.api.RecordConsumer

Looks like Java can't find the RecordConsumer class or the jar that contains it. Any ideas on how to fix this?
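One way to sanity-check that the class really is missing at runtime is to look for it on the Hadoop classpath and in the Parquet jars that DSS bundles. A rough sketch, assuming a default layout with DKUINSTALLDIR pointing at the DSS installation directory:

    # Does the Hadoop classpath expose any Parquet jars at all?
    hadoop classpath | tr ':' '\n' | grep -i parquet

    # Which DSS-bundled jar, if any, ships parquet/io/api/RecordConsumer?
    for jar in "$DKUINSTALLDIR"/lib/ivy/parquet-run/*.jar; do
        unzip -l "$jar" | grep -q 'parquet/io/api/RecordConsumer.class' && echo "found in $jar"
    done

If the second check finds the class but the first prints nothing, the jars exist on disk yet never make it onto the classpath, which is exactly what the fix below addresses.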

---SOLVED---

1. Locate env-hadoop.sh under bin/ in your DATA_DIR

2. sudo nano env-hadoop.sh

3. Find the line "export DKU_HADOOP_CP="

4. Add the following at the end of that line, inside the quotes (see the sketch after this list):

:$DKUINSTALLDIR/lib/ivy/parquet-run/*

5. Restart DSS
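For reference, a sketch of what the edited line might look like afterwards; the existing value of DKU_HADOOP_CP is install-specific and is shown here only as a placeholder:

    export DKU_HADOOP_CP="<existing classpath entries>:$DKUINSTALLDIR/lib/ivy/parquet-run/*"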


Best Answer

  • Clément_Stenac (Dataiker)

    Hi,

    Dataiku does not support "home-made" Hadoop distributions.

    You may have some success by editing the "bin/env-hadoop.sh" file, locating the "DKU_HIVE_CP" line, and adding at the end (within the quotes):


    :$DKUINSTALLDIR/lib/ivy/parquet-run/*

    Then restart DSS.
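    Before restarting, a quick grep from the DSS data directory can confirm the edit took; it should show the appended parquet-run entry on the relevant line:

    grep -nE 'DKU_(HADOOP|HIVE)_CP' bin/env-hadoop.sh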
