Member since: 04-30-2019
Posts: 53
Kudos Received: 5
Solutions: 1
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1034 | 01-08-2020 07:09 AM
05-30-2021
01:17 AM
[hdfs@c****-node* hive-testbench-hive14]$ ./tpcds-build.sh
Building TPC-DS Data Generator
make: Nothing to be done for `all'.
TPC-DS Data Generator built, you can now use tpcds-setup.sh to generate data.

[hdfs@c4237-node2 hive-testbench-hive14]$ ./tpcds-setup.sh 2
TPC-DS text data generation complete. Loading text data into external tables.
make: *** [time_dim] Error 1
make: *** Waiting for unfinished jobs....
make: *** [date_dim] Error 1
Data loaded into database tpcds_bin_partitioned_orc_2.

INFO : OK
+---------------------+
| database_name       |
+---------------------+
| default             |
| information_schema  |
| sys                 |
+---------------------+
3 rows selected (1.955 seconds)
0: jdbc:hive2://c4237-node2.coelab.cloudera.c>

The tpcds_bin_partitioned_orc_2 database is not created, and I have some issues testing the TPC-DS queries. The steps I followed:

sudo -u hdfs -s
cd /home/hdfs
wget https://github.com/hortonworks/hive-testbench/archive/hive14.zip
unzip hive14.zip
export JAVA_HOME=/usr/jdk64/jdk1.8.0_77
export PATH=$JAVA_HOME/bin:$PATH
./tpcds-build.sh
beeline -i testbench.settings -u "jdbc:hive2://c****-node9.coe***.*****.com:10500/tpcds_bin_partitioned_orc_2"

I'm not able to test the TPC-DS queries; any help would be appreciated.
02-22-2021
01:48 AM
@ARP There is a bug: https://issues.apache.org/jira/browse/HIVE-24693. Kindly use these workaround properties and test your jobs:

set hive.parquet.timestamp.time.unit=nanos;
set hive.parquet.write.int64.timestamp=true;
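As a minimal sketch of applying the workaround (the table and column names below are hypothetical), the two properties can be set per-session in Beeline before touching the affected Parquet data:

```sql
-- Hypothetical session: apply the HIVE-24693 workaround, then query
SET hive.parquet.timestamp.time.unit=nanos;
SET hive.parquet.write.int64.timestamp=true;
SELECT event_id, event_ts FROM parquet_events LIMIT 10;
```

Session-level SET only affects the current connection; to make it permanent, add the properties to the Hive configuration instead.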
02-16-2021
02:22 AM
@ARP Try increasing fs.s3a.connection.maximum to 1500, and follow this doc for the finer S3 tuning parameters: https://docs.cloudera.com/documentation/enterprise/latest/topics/admin_hive_on_s3_tuning.html
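A sketch of how that setting would look in core-site.xml (the value 1500 follows the suggestion above; tune it for your own workload):

```xml
<!-- Raise the S3A connection pool size -->
<property>
  <name>fs.s3a.connection.maximum</name>
  <value>1500</value>
</property>
```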
02-16-2021
02:08 AM
@bigdataNico Kindly run hdfs dfs -rm -r /user/hive/.yarn/package/LLAP/* and restart LLAP.
01-25-2021
03:29 AM
You have to delete the files from HDFS. Use this command:

hdfs dfs -rm -r /user/hive/.yarn/package/LLAP

After deleting, run the commands below:

hdfs dfs -mkdir -p /user/hive/.yarn/package/LLAP
hdfs dfs -chown hive:hadoop /user/hive/.yarn/package/LLAP
hdfs dfs -chmod 755 /user/hive/.yarn/package/LLAP

Restart HiveServer2 after executing these commands.
01-25-2021
03:28 AM
@kevinmat0510 You have to delete the files from HDFS. Use this command: hdfs dfs -rm -r /user/hive/.yarn/package/LLAP
01-25-2021
02:52 AM
@kevinmat0510 The Hive 3 architecture was changed to support ACID v2, and bucket generation in Hive 3 is automatic: it splits the data implicitly. https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/using-hiveql/content/hive_3_internals.html You can't disable the generation of buckets; it's part of a complete architecture change in Hive 3. Refer to the Hive 3 ACID support details in the document. Thanks, Prakash
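To illustrate the point above (table and column names are hypothetical): in Hive 3 you do not declare buckets for ACID tables yourself, because a plain managed ORC table is transactional by default and the ACID layout is managed internally:

```sql
-- In Hive 3 (HDP 3.x defaults), this managed table is ACID v2 without
-- any CLUSTERED BY clause; bucketing for ACID is handled by Hive itself.
CREATE TABLE t_acid (id INT, name STRING) STORED AS ORC;
```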
01-25-2021
12:02 AM
@linzhongwei Relevant bug: https://issues.apache.org/jira/browse/HIVE-23111
01-24-2021
11:53 PM
@AnkitP The client is not able to make a connection to the NameNode at hostname:8020. Can you check whether you're able to perform telnet namenode-hostname 8020? If you're getting a connection error, then you need to open this port for connections. Thanks, Prakash
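If telnet is not installed, a quick connectivity check can be sketched in plain bash (the host name "namenode-hostname" is a placeholder; substitute your actual NameNode host):

```shell
# Hypothetical helper: succeed if a TCP connection to host:port works
# within 3 seconds, using bash's /dev/tcp pseudo-device.
check_port() {
  timeout 3 bash -c "cat < /dev/null > /dev/tcp/$1/$2" 2>/dev/null
}

if check_port namenode-hostname 8020; then
  echo "NameNode port 8020 reachable"
else
  echo "Cannot reach namenode-hostname:8020 - check DNS/firewall"
fi
```

If the check fails, verify DNS resolution of the NameNode host first, then firewall rules for port 8020.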
01-24-2021
11:49 PM
@shrimaha It looks like you're using the wrong JDBC connection string to connect to Hive. Could you please double-check your connection string details? https://community.cloudera.com/t5/Community-Articles/Working-with-Beeline/ta-p/247606 https://docs.cloudera.com/HDPDocuments/HDP2/HDP-2.6.5/bk_data-access/content/beeline-vs-hive-cli.html Thanks, Prakash
01-24-2021
11:39 PM
1 Kudo
@bigdataNico It's a known issue with the Cloudbreak deployment. Kindly refer to the "Known issues: Data lake" section in the document below. https://docs.cloudera.com/HDPDocuments/Cloudbreak/Cloudbreak-2.9.0/release-notes/content/cb_known-issues.html Delete the "/user/hive/.yarn/package/LLAP" file, then create a new directory in that location with the relevant permissions for the hive user. Start HiveServer2. Thanks, Prakash
12-16-2020
03:32 AM
@anrathen Can you adjust the Hive proxy-user settings and test once:

hadoop.proxyuser.hive.hosts=*
hadoop.proxyuser.hive.groups=*
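In core-site.xml those proxy-user settings would look like the fragment below. Note that "*" is fully permissive; in production you would usually narrow it to specific hosts and groups:

```xml
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```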
12-05-2020
06:31 AM
@henimaher It could be due to the bug https://issues.apache.org/jira/browse/HIVE-21866. Perform yarn app -destroy llap0 and start the LLAP service.
08-24-2020
07:07 AM
https://docs.cloudera.com/HDPDocuments/Ambari-2.7.5.0/ambari-release-notes/content/known_issues.html
08-17-2020
07:41 AM
@AdityaShaw Yes, with the help of YARN ACLs you can control which users submit applications to a specific YARN queue. Kindly follow these documents to enable YARN ACLs. https://docs.cloudera.com/HDPDocuments/HDP2/HDP-2.6.5/bk_yarn-resource-management/content/controlling_access_to_queues_with_acls.html https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/CapacityScheduler.html
08-17-2020
07:34 AM
@manicool7 Can you elaborate more on the issue? Are you ingesting data from any third party such as Tableau or Qlik Sense? Attach the error details as well.
08-17-2020
07:30 AM
@Sham Can you collect yarn logs -applicationId application_1594037379069_89091 > app89091.txt and attach it here.
08-14-2020
07:17 AM
@Nagamalleswara Thanks for your response. You need to open a support case with Cloudera to get the patch. Mention the bug and the issue; your issue will be analysed and an appropriate patch will be released.
08-12-2020
10:01 AM
@Nagamalleswara It's due to these bugs: https://issues.apache.org/jira/browse/HIVE-22416 and https://issues.apache.org/jira/browse/HIVE-9120. You need a fix for these bugs to overcome this issue. Thanks, Prakash
07-09-2020
03:24 AM
@Heri The suggestion is to use triggers: create a new table with a timestamp field and run the Sqoop incremental job on the newly created table. https://stackoverflow.com/questions/34806245/incremental-data-load-using-sqoop-without-primary-key-or-timestamp
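A minimal sketch of the trigger approach in MySQL syntax (table and column names here are hypothetical, not from the original question):

```sql
-- Shadow table carrying a last-modified timestamp for incremental import
CREATE TABLE orders_inc (
  order_id      INT,
  amount        DECIMAL(10,2),
  last_modified TIMESTAMP DEFAULT CURRENT_TIMESTAMP
                ON UPDATE CURRENT_TIMESTAMP
);

-- Trigger mirrors each new source row into the timestamped table
CREATE TRIGGER orders_ai AFTER INSERT ON orders
FOR EACH ROW
  INSERT INTO orders_inc (order_id, amount)
  VALUES (NEW.order_id, NEW.amount);
```

Sqoop can then run with --incremental lastmodified --check-column last_modified against orders_inc.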
07-09-2020
03:19 AM
@Mithun07 If the Sqoop import fails with a large data set, it will not write anything to HDFS; only successful imports write data to HDFS.
04-23-2020
03:51 AM
@TR7_BRYLE Did you restart the agents after making the changes? Can you attach the error stack trace?
04-20-2020
03:21 AM
1 Kudo
@TR7_BRYLE This issue occurs because Java (from 1.8.0-171) restricts TLSv1, which the Ambari agents use. By default, ambari-agent connects with TLSv1 unless force_https_protocol=PROTOCOL_TLSv1_2 is specified in ambari-agent.ini; hence the Ambari agent is not able to connect and communicate with the Ambari server.

To resolve this issue, add the following property to the [/etc/ambari-agent/conf/ambari-agent.ini] file under [security] and restart the ambari-agent:

force_https_protocol=PROTOCOL_TLSv1_2

https://community.cloudera.com/t5/Support-Questions/ambari-agents-cannot-reach-ambari-server-after-changing/td-p/193251

Thanks, Prakash
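For placement, the relevant part of /etc/ambari-agent/conf/ambari-agent.ini would look like this (only the one line is added; any existing keys under [security] stay as they are):

```ini
[security]
force_https_protocol=PROTOCOL_TLSv1_2
```

After saving the file, restart the agent with: ambari-agent restart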
04-20-2020
03:13 AM
@banshidhar_saho "Can we enable both Kerberos and LDAP authentication for HiveServer2 at the same time?" --> No, that's not possible; you can use either Kerberos or LDAP, but not both at the same time. Secondly, for a sample workflow.xml and job.properties for an Oozie job with a HiveServer2 action where LDAP authentication is enabled for HiveServer2: kindly refer to this documentation on specifying the LDAP credentials in the Hive action. https://oozie.apache.org/docs/5.1.0/DG_Hive2ActionExtension.html Thanks, Prakash
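A minimal sketch of such a hive2 action, following the Oozie hive2 action extension linked above (host names, file names, and the property name hivePassword are hypothetical placeholders):

```xml
<!-- workflow.xml fragment: hive2 action against an LDAP-enabled HiveServer2.
     The password comes from job.properties, e.g. hivePassword=... -->
<action name="hive2-query">
  <hive2 xmlns="uri:oozie:hive2-action:0.1">
    <jdbc-url>jdbc:hive2://hs2-host.example.com:10000/default</jdbc-url>
    <password>${hivePassword}</password>
    <script>query.sql</script>
  </hive2>
  <ok to="end"/>
  <error to="fail"/>
</action>
```

In job.properties you would define hivePassword along with the usual nameNode and oozie.wf.application.path entries; storing the password in a credentials store rather than plain text is preferable.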
02-21-2020
01:06 AM
@Folks You can follow this approach for any other kerberos keytab related issues with LLAP.
... View more
02-20-2020
02:09 AM
The LLAP service may sometimes fail to start up after an HDP upgrade with the following error:
Error: Caused by: org.apache.hadoop.security.KerberosAuthException: failure to login: for principal: hive/prakash.cloudera.com@CLOUDERA.COM from keytab /grid/0/yarn/local/usercache/hive/appcache/application_**********/container_***********_01_000001/keytabs/llap0.keytab javax.security.auth.login.LoginException: Pre-authentication information was invalid (24)
To resolve this issue, run the following command and start LLAP:
yarn app -destroy llap0
If the issue persists, then do the following:
Check that the hive.service.keytab file is available on all hosts; run kinit and test the connectivity.
If the file is missing, recreate the keytabs on the missing hosts either through Ambari or manually (see this article for manual keytab creation).
Verify the file in HDFS and copy it to the local filesystem, then perform kinit with the copied keytab:
hdfs:///user/hive/.yarn/keytabs/hive/hive.service.keytab
hdfs dfs -copyToLocal /user/hive/.yarn/keytabs/hive/hive.service.keytab /tmp/hive.keytab
kinit -kt /tmp/hive.keytab hive/prakash.cloudera.com@CLOUDERA.COM
kinit: Preauthentication failed while getting initial credentials
If you get this error, move the keytab file out of the HDFS location:
hdfs dfs -cp /user/hive/.yarn/keytabs/hive/hive.service.keytab /tmp/
hdfs dfs -rm /user/hive/.yarn/keytabs/hive/hive.service.keytab
Start LLAP component.
02-14-2020
06:37 AM
@hesham_eldib Could you please disable the properties hive.metastore.metrics.enabled and hive.server2.metrics.enabled in Advanced hiveserver2-site, then start HiveServer2?
02-13-2020
06:00 AM
@Asoka I think you're passing a hidden file in the --password-file option. This command worked for me:

sqoop export --connect jdbc:mysql://hostname/test --connection-manager org.apache.sqoop.manager.MySQLManager --export-dir /tmp/version --table HiveVersion --username hive --password-file file:///tmp/mysql-pass.txt
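A common pitfall with --password-file is a trailing newline, which Sqoop treats as part of the password. A sketch of creating the file safely (the password value "MyHivePassword" is a placeholder):

```shell
# Write the password with no trailing newline and lock down permissions
printf '%s' 'MyHivePassword' > /tmp/mysql-pass.txt
chmod 400 /tmp/mysql-pass.txt
```

For HDFS-resident password files, copy the file into HDFS afterwards and reference it with an hdfs:// URI instead of file://.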
01-20-2020
05:19 AM
@ana24 You can make use of the Hive CAST function to find the timestamp difference. Please refer to this article and frame your queries according to your case. http://sqlandhadoop.com/how-to-subtract-timestamp-date-time-in-hive/
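A minimal HiveQL sketch of a timestamp difference in seconds (the table and column names are hypothetical):

```sql
-- Convert both timestamps to epoch seconds, then subtract
SELECT order_id,
       unix_timestamp(end_ts) - unix_timestamp(start_ts) AS duration_seconds
FROM   orders;
```

unix_timestamp() is used here rather than a bare CAST because casting TIMESTAMP directly to BIGINT is deprecated in recent Hive versions.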
01-20-2020
04:54 AM
@saivenkatg55 It looks like the query is not retrieving any results from that table. Could you please attach the "show create table pcr_project" output? Meanwhile, verify the data and its ownership in the HDFS path.