Member since: 05-13-2019 · Posts: 8 · Kudos Received: 3 · Solutions: 0
01-03-2023 07:39 AM
Hi, unfortunately Datagen is built with Java 11 and will not run on Java 8. However, you can install Java 11 on only one node, alongside Java 8, and leave Java 8 as the default (so you do not impact any of the CDP components or your programs). Then go to CM > Datagen > Configuration and set java_home_custom to the Java 11 path on that node. Datagen should start with it. Thanks.
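If it helps, here is a rough sketch of that setup on the Datagen node (the package name and paths are typical for RHEL-based systems and may differ in your environment):

# Install Java 11 alongside the existing Java 8 (Java 8 stays the default)
sudo yum install -y java-11-openjdk-devel
# Confirm the default java is still Java 8
java -version
# Find the Java 11 home to put into java_home_custom, e.g.:
ls -d /usr/lib/jvm/java-11-openjdk*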
12-01-2022 03:32 AM · 1 Kudo
Could you try adding 'home' at the end of the provided URL? It should be http://xxxxxx/airflow/home
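As a quick check before trying it in the browser (with xxxxxx standing for your host as above), you can ask the server for just the response headers:

curl -I http://xxxxxx/airflow/home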
10-28-2019 10:40 AM
Since Hadoop 2.8, it is possible to mark a directory as protected so that it cannot be deleted while it still contains files, using the fs.protected.directories property. From the documentation: "A comma-separated list of directories which cannot be deleted even by the superuser unless they are empty. This setting can be used to guard important system directories against accidental deletion due to administrator error." It does not exactly answer the question, but it is a possibility.
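As an illustration, the property is declared in hdfs-site.xml like this (the directory paths here are just examples):

<property>
  <name>fs.protected.directories</name>
  <value>/apps/hive/warehouse,/user/critical-data</value>
</property>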
10-14-2019 02:30 AM
Hello @sduraisankar93, if you are facing this issue, as you said, it is because you have not imported the module. I believe you should check this documentation on how to import and use HWC: https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/integrating-hive/content/hive_configure_a_spark_hive_connection.html

If you are using Zeppelin, please check this: https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/integrating-hive/content/hive_zeppelin_configuration_hivewarehouseconnector.html

Note that in Zeppelin the pyspark configuration might not work; a workaround is to set this configuration (via Ambari) in the Zeppelin-env.sh section:

export SPARK_SUBMIT_OPTIONS="--jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<version>.zip"

Then, to start a pyspark shell on your machines, launch this command:

pyspark --jars /usr/hdp/current/hive_warehouse_connector/hive-warehouse-connector-assembly-<version>.jar --py-files /usr/hdp/current/hive_warehouse_connector/pyspark_hwc-<version>.zip
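Once the shell is started with the jar and zip above, a minimal pyspark sketch of using HWC looks like this (it assumes the HiveServer2 JDBC URL and the other HWC Spark configs from the linked documentation are already set; names follow the HDP 3 docs):

from pyspark_llap import HiveWarehouseSession

# Build an HWC session from the existing SparkSession (spark)
hive = HiveWarehouseSession.session(spark).build()

# Simple smoke test: list databases through HWC
hive.showDatabases().show()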