Member since
04-30-2019
49
Posts
5
Kudos Received
1
Solution
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 443 | 01-08-2020 07:09 AM
01-25-2021
03:29 AM
You have to delete the directory from HDFS with this command:
hdfs dfs -rm -r -f /user/hive/.yarn/package/LLAP
After deleting it, run the commands below:
hdfs dfs -mkdir -p /user/hive/.yarn/package/LLAP
hdfs dfs -chown hive:hadoop /user/hive/.yarn/package/LLAP
hdfs dfs -chmod 755 /user/hive/.yarn/package/LLAP
Restart HiveServer2 after executing these commands.
01-25-2021
03:28 AM
@kevinmat0510 You have to delete the directory from HDFS with this command: hdfs dfs -rm -r -f /user/hive/.yarn/package/LLAP
01-25-2021
02:52 AM
@kevinmat0510 The Hive 3 architecture changed to support ACID v2, and Hive 3 generates buckets automatically, splitting the data implicitly. https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.5/using-hiveql/content/hive_3_internals.html You can't disable the generation of buckets; it's a fundamental architectural change in Hive 3. Refer to the Hive 3 ACID support details in the document. Thanks, Prakash
01-25-2021
12:02 AM
@linzhongwei Relevant bug: https://issues.apache.org/jira/browse/HIVE-23111
01-24-2021
11:53 PM
@exploring It's not able to make a connection to the NameNode at hostname:8020. Can you check whether you're able to run telnet namenode-hostname 8020? If you get a connection error, you need to open this port for connections. Thanks, Prakash
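A quick reachability check from the client host might look like this (namenode-hostname is a placeholder for your actual NameNode host, and 8020 is the default HDP NameNode RPC port):

```shell
# Check whether the NameNode RPC port is reachable from this host.
# Replace namenode-hostname with your actual NameNode hostname or IP.
nc -vz namenode-hostname 8020 \
  && echo "port 8020 reachable" \
  || echo "connection failed - check firewall rules and the NameNode bind address"
```

If the port is blocked, open it on any firewall between the client and the NameNode before retrying the job.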
01-24-2021
11:49 PM
@shrimaha It looks like you're using the wrong JDBC connection string to connect to Hive; could you please double-check your connection string details. https://community.cloudera.com/t5/Community-Articles/Working-with-Beeline/ta-p/247606 https://docs.cloudera.com/HDPDocuments/HDP2/HDP-2.6.5/bk_data-access/content/beeline-vs-hive-cli.html Thanks, Prakash
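For reference, typical Beeline connection strings look roughly like the sketches below (hostnames, database, credentials, and the Kerberos principal are placeholders; adjust them to your cluster):

```shell
# Non-secure cluster, HiveServer2 in binary transport mode (default port 10000):
beeline -u "jdbc:hive2://hs2-host:10000/default" -n myuser -p mypassword

# Kerberized cluster - the principal must be HiveServer2's own service principal:
beeline -u "jdbc:hive2://hs2-host:10000/default;principal=hive/_HOST@EXAMPLE.COM"
```

The two articles linked above cover the ZooKeeper-discovery and HTTP-transport variants of the URL as well.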
01-24-2021
11:39 PM
1 Kudo
@bigdataNico It's a known issue with the Cloudbreak deployment. Kindly refer to the "Known issues: Data lake" section in the document below. https://docs.cloudera.com/HDPDocuments/Cloudbreak/Cloudbreak-2.9.0/release-notes/content/cb_known-issues.html Delete the "/user/hive/.yarn/package/LLAP" directory, then create a new directory in that location with the relevant permissions for the hive user. Start HiveServer2. Thanks, Prakash
12-16-2020
03:32 AM
@anrathen Can you adjust the Hive proxy settings and test once:
hadoop.proxyuser.hive.hosts=*
hadoop.proxyuser.hive.groups=*
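In Ambari these are set under the HDFS core-site configs; on a plain Hadoop install they would land in core-site.xml roughly as below (a sketch; in production you would usually restrict the wildcards to specific hosts and groups rather than `*`):

```xml
<!-- Allow the hive user to impersonate other users (proxy user settings). -->
<property>
  <name>hadoop.proxyuser.hive.hosts</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.proxyuser.hive.groups</name>
  <value>*</value>
</property>
```

A restart of the affected HDFS/Hive services is needed for the change to take effect.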
12-05-2020
06:31 AM
@henimaher It could be due to bug https://issues.apache.org/jira/browse/HIVE-21866 Run yarn app -destroy llap0 and then start the LLAP service.
08-24-2020
07:07 AM
https://docs.cloudera.com/HDPDocuments/Ambari-2.7.5.0/ambari-release-notes/content/known_issues.html
08-17-2020
07:41 AM
@AdityaShaw Yes, with the help of YARN ACLs you can control which users may submit applications to a specific YARN queue. Kindly follow these documents to enable YARN ACLs. https://docs.cloudera.com/HDPDocuments/HDP2/HDP-2.6.5/bk_yarn-resource-management/content/controlling_access_to_queues_with_acls.html https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/CapacityScheduler.html
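As an illustration (the queue, user, and group names are hypothetical), restricting submission to one queue in capacity-scheduler.xml looks roughly like this:

```xml
<!-- Only user1 and members of group "analysts" may submit to the "etl" queue.
     The value format is "user1,user2 group1,group2" (users, space, groups). -->
<property>
  <name>yarn.scheduler.capacity.root.etl.acl_submit_applications</name>
  <value>user1 analysts</value>
</property>
<!-- Queue ACLs are inherited downward, so the root queue must not stay wide
     open (its default is "*"); a single space means "nobody at this level". -->
<property>
  <name>yarn.scheduler.capacity.root.acl_submit_applications</name>
  <value> </value>
</property>
```

Refresh the queues (or restart the ResourceManager) after changing the scheduler configuration.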
08-17-2020
07:34 AM
@manicool7 Can you elaborate more on the issue here? Are you ingesting data from any third-party tool (such as Tableau or Qlik Sense)? Please attach the error details as well.
08-17-2020
07:30 AM
@Sham Can you collect yarn logs -applicationId application_1594037379069_89091 > app89091.txt and attach it here.
08-14-2020
07:17 AM
@Nagamalleswara Thanks for your response. You need to open a support case with Cloudera to get the patch. Mention the bug and the issue; your issue will be analysed and an appropriate patch will be released.
08-12-2020
10:01 AM
@Nagamalleswara It's due to these bugs: https://issues.apache.org/jira/browse/HIVE-22416 and https://issues.apache.org/jira/browse/HIVE-9120 You need a fix for these bugs to overcome this issue. Thanks, Prakash
07-09-2020
03:24 AM
@Heri The suggestion is to use triggers: create a new table with a timestamp field and run the Sqoop incremental job on the newly created table. https://stackoverflow.com/questions/34806245/incremental-data-load-using-sqoop-without-primary-key-or-timestamp
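A sketch of such an incremental job against the trigger-maintained table (the host, database, table, and column names below are all hypothetical):

```shell
# Saved incremental import keyed on a trigger-maintained timestamp column.
# db-host, sales, orders_with_ts, updated_at and order_id are placeholders.
sqoop job --create incr_orders -- import \
  --connect jdbc:mysql://db-host/sales \
  --table orders_with_ts \
  --incremental lastmodified \
  --check-column updated_at \
  --last-value "2020-01-01 00:00:00" \
  --target-dir /user/hive/warehouse/orders_with_ts \
  --merge-key order_id
```

Running the saved job with `sqoop job --exec incr_orders` lets Sqoop track and advance `--last-value` between runs.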
07-09-2020
03:19 AM
@Mithun07 If the Sqoop import fails with a large data set, it will not write anything to HDFS; only successful imports write data to HDFS.
04-23-2020
03:51 AM
@TR7_BRYLE Did you restart the agents after making the changes? Can you attach the error stack trace?
04-20-2020
03:21 AM
1 Kudo
@TR7_BRYLE This issue occurs because Java (from 1.8.0_171 onward) restricts TLSv1, which the Ambari agents use. By default, ambari-agent connects using TLSv1 unless force_https_protocol=PROTOCOL_TLSv1_2 is specified in ambari-agent.ini; hence the Ambari agent is not able to connect and communicate with the Ambari server. To resolve this issue, add the following property under the [security] section of the ambari-agent.ini file [/etc/ambari-agent/conf/ambari-agent.ini] and restart the ambari-agent: force_https_protocol=PROTOCOL_TLSv1_2 https://community.cloudera.com/t5/Support-Questions/ambari-agents-cannot-reach-ambari-server-after-changing/td-p/193251 Thanks, Prakash
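The resulting section of /etc/ambari-agent/conf/ambari-agent.ini would look roughly like this (any other keys already present under [security] stay as they are):

```ini
[security]
force_https_protocol=PROTOCOL_TLSv1_2
```

Then restart the agent on every host: `ambari-agent restart`.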
04-20-2020
03:13 AM
@banshidhar_saho Can we enable both Kerberos and LDAP authentication for HiveServer2 at the same time? --> No, that's not possible; you can use either Kerberos or LDAP, but not both at the same time. Secondly, can you share a sample workflow.xml and job.properties for an Oozie job that has a HiveServer2 action where LDAP authentication is enabled for HiveServer2? --> Kindly refer to this documentation to specify the LDAP credentials in the Hive action. https://oozie.apache.org/docs/5.1.0/DG_Hive2ActionExtension.html Thanks, Prakash
02-21-2020
01:06 AM
@Folks You can follow this approach for any other Kerberos keytab-related issues with LLAP.
02-20-2020
02:09 AM
The LLAP service may sometimes fail to start up after an HDP upgrade with the following error:
Error: Caused by: org.apache.hadoop.security.KerberosAuthException: failure to login: for principal: hive/prakash.cloudera.com@CLOUDERA.COM from keytab /grid/0/yarn/local/usercache/hive/appcache/application_**********/container_***********_01_000001/keytabs/llap0.keytab javax.security.auth.login.LoginException: Pre-authentication information was invalid (24)
To resolve this issue, run the following command and start LLAP:
yarn app -destroy llap0
If the issue persists, then do the following:
1. Check that the hive.service.keytab file is available on all hosts, run kinit, and test the connectivity.
2. If the file is missing, recreate the keytabs on the missing hosts either through Ambari or manually (see this article for manual keytab creation).
3. Verify the file in HDFS and copy it to the local filesystem, then perform kinit with the copied keytab:
hdfs:///user/hive/.yarn/keytabs/hive/hive.service.keytab
hdfs dfs -copyToLocal /user/hive/.yarn/keytabs/hive/hive.service.keytab /tmp/hive.keytab
kinit -kt /tmp/hive.keytab hive/prakash.cloudera.com@CLOUDERA.COM
4. If kinit fails with the error below, move the keytab file out of the HDFS location:
kinit: Preauthentication failed while getting initial credentials
hdfs dfs -cp /user/hive/.yarn/keytabs/hive/hive.service.keytab /tmp/
hdfs dfs -rm /user/hive/.yarn/keytabs/hive/hive.service.keytab
5. Start the LLAP component.
02-14-2020
06:37 AM
@hesham_eldib Could you please disable the properties hive.metastore.metrics.enabled and hive.server2.metrics.enabled under Advanced hiveserver2-site, then start HiveServer2.
02-13-2020
06:00 AM
@Asoka I think you're passing a hidden file in the --password-file option. This command worked for me: sqoop export --connect jdbc:mysql://hostname/test --connection-manager org.apache.sqoop.manager.MySQLManager --export-dir /tmp/version --table HiveVersion --username hive --password-file file:///tmp/mysql-pass.txt
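The password file itself is easy to get wrong: a trailing newline becomes part of the password Sqoop reads and can make authentication fail. A sketch of creating it (the path and the password "secret" are placeholder values):

```shell
# Write the password with no trailing newline, then lock down permissions.
# /tmp/mysql-pass.txt and "secret" are placeholder values.
printf '%s' 'secret' > /tmp/mysql-pass.txt
chmod 400 /tmp/mysql-pass.txt

# Sanity check: the file should contain exactly the password bytes.
wc -c < /tmp/mysql-pass.txt   # 6 bytes for "secret", i.e. no newline
```

Using printf '%s' rather than echo is the key detail, since echo appends a newline by default.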
01-20-2020
05:19 AM
@ana24 You can make use of the Hive CAST function to find the timestamp difference. Please refer to this article and frame your queries according to your case. http://sqlandhadoop.com/how-to-subtract-timestamp-date-time-in-hive/
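The linked approach boils down to converting both timestamps to epoch seconds and subtracting, as Hive's unix_timestamp() does. The same arithmetic can be sketched locally with GNU date (the timestamp values below are made up) before framing it as a Hive query:

```shell
# Epoch-seconds subtraction, mirroring Hive's unix_timestamp() approach.
# The two timestamps are made-up sample values; requires GNU date (-d).
t1='2020-01-20 04:00:00'
t2='2020-01-20 05:30:00'
diff=$(( $(date -d "$t2" +%s) - $(date -d "$t1" +%s) ))
echo "$(( diff / 3600 ))h $(( diff % 3600 / 60 ))m"   # 1h 30m
```

In Hive the equivalent per-row expression would subtract unix_timestamp() values of the two columns and divide by 3600 for hours.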
01-20-2020
04:54 AM
@saivenkatg55 It looks like the query is not retrieving any results from the specific table. Could you please attach the "show create table pcr_project" output? Meanwhile, verify the data and its ownership in the HDFS path.
01-10-2020
07:24 AM
@Anibal_Linares It looks like your Hive Metastore process is not running; try restarting the Hive Metastore service from Ambari and check the process details again. If you see the process details but the alert persists, disable and re-enable the alert.
01-10-2020
06:55 AM
@Anibal_Linares The screenshot you attached shows the HiveServer2 process details; check for the Metastore process instead: ps -ef | grep hive-metastore "How will I know if Hive is wrong?" To answer your query: you will still get an alert if anything goes wrong, just after 60 seconds; if you feel that value is too high for your use case, then consider 30 seconds.
01-10-2020
05:47 AM
@Anibal_Linares Go to the Hive Metastore host and run ps -ef | grep hive. Make sure the Hive Metastore is up and running fine. If it is running fine, go to the Ambari alert: disable the alert, click Edit, and increase the Connection Timeout to 60 seconds. Perform this step and let me know whether it helps.
01-08-2020
07:41 AM
@Sambasivam Another way of collecting the data is from the YARN ATS through REST API calls. Please have a look and frame your REST API calls to access the data. https://docs.cloudera.com/HDPDocuments/HDP3/HDP-3.1.4/data-operating-system/content/rest_apis_for_querying_timeline_service_2.0.html https://hadoop.apache.org/docs/r3.1.0/hadoop-yarn/hadoop-yarn-site/TimelineServiceV2.html#Query_generic_entities