- Member since: 04-03-2017
- Posts: 164
- Kudos Received: 8
- Solutions: 4
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 2243 | 03-09-2021 10:47 PM |
| | 3275 | 12-10-2018 10:59 AM |
| | 5878 | 12-02-2018 08:55 PM |
| | 8685 | 11-28-2018 10:38 AM |
11-28-2018
07:57 AM
Hi, Can you share your workflow.xml and job.properties files? Also, please share the output of the command below for the HDFS path where you have placed the hive-site.xml file.
## hadoop fs -ls <path of hdfs where you have placed hive-site.xml>
Regards
Nitish
11-28-2018
02:39 AM
Hi, I suspect that you are not passing the shell script name under the <file> tag as well, and because of that Oozie is not able to localize the shell script, which causes the "No such file or directory" error. Can you please share the workflow.xml and job.properties files so we can check whether you are passing it the right way?
Regards
Nitish
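For reference, a shell action needs the script both in <exec> and in a <file> tag so that Oozie copies it into the launcher container. A minimal sketch, where the action name, myscript.sh, and ${appPath} are placeholder names and not taken from your job:

```xml
<action name="shell-node">
    <shell xmlns="uri:oozie:shell-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <exec>myscript.sh</exec>
        <!-- Without this <file> entry Oozie does not localize the script
             into the container, causing "No such file or directory". -->
        <file>${appPath}/myscript.sh#myscript.sh</file>
    </shell>
    <ok to="end"/>
    <error to="fail"/>
</action>
```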
11-28-2018
02:29 AM
Hi, The reason the Sqoop command fails during table creation is that Sqoop has to connect to the Hive metastore to create the table, but it is unable to do so, which causes this error:

"Logging initialized using configuration in jar:file:/u05/hadoop/yarn/nm/filecache/467/hive-exec.jar!/hive-log4j.properties
FAILED: SemanticException org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
Intercepting System.exit(1)"

Solution:-
As a solution, you should pass the hive-site.xml file in the Oozie Sqoop action.
Link:- https://oozie.apache.org/docs/4.1.0/DG_SqoopActionExtension.html
Kindly pass the hive-site.xml path under the <file> tag. This path must be on HDFS, not local.
NOTE:- Please make sure to pick up the client hive-site.xml file:
## cd /etc/hive/conf
## ls -ltr
Here you will see the hive-site.xml file. Please try the above and let us know if it helps. Once this issue is resolved, we will return to the Oozie hive2 connectivity error on which you started this thread.
Regards
Nitish
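As an illustration, the Sqoop action could look roughly like this; the action name, the JDBC connect string, MYTABLE, and the HDFS path /user/oozie/conf/hive-site.xml are all hypothetical placeholders for your own values:

```xml
<action name="sqoop-import">
    <sqoop xmlns="uri:oozie:sqoop-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <command>import --connect jdbc:oracle:thin:@db-host:1521/ORCL --table MYTABLE --hive-import</command>
        <!-- HDFS path, not local: the client hive-site.xml copied from
             /etc/hive/conf and uploaded to HDFS beforehand. -->
        <file>/user/oozie/conf/hive-site.xml#hive-site.xml</file>
    </sqoop>
    <ok to="end"/>
    <error to="fail"/>
</action>
```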
11-28-2018
02:24 AM
Hi, Just to be accurate, kindly check these 2 jars as per your CDH version and see whether there are multiple versions of either of them:
## hive-jdbc-1.1.0-cdh5.12.2-standalone.jar
## hive-service-1.1.0-cdh5.12.2.jar
Regards
Nitish
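To check for duplicate versions quickly, a small helper along these lines can count the copies of each jar prefix under a search directory; the /usr/lib path in the example call is an assumption, not your actual layout:

```shell
# find_dup_jars: report jar name prefixes that occur more than once
# under a directory tree. Usage: find_dup_jars <search_dir> <prefix>...
find_dup_jars() {
  dir="$1"; shift
  for prefix in "$@"; do
    # Count every jar whose name starts with this prefix.
    count=$(find "$dir" -name "${prefix}-*.jar" 2>/dev/null | wc -l | tr -d ' ')
    if [ "$count" -gt 1 ]; then
      echo "$prefix: $count copies"
      find "$dir" -name "${prefix}-*.jar" 2>/dev/null
    fi
  done
}

# Example: scan a client library location for the two jars in question.
find_dup_jars /usr/lib hive-jdbc hive-service
```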
11-28-2018
02:13 AM
Hi, Kindly check the stdout logs of the runs where you are not able to connect to Hive for any conflicting jars. As this is an intermittent failure, I suspect that could be the cause.
Regards
Nitish
11-26-2018
09:53 PM
Link:- https://www.tekstream.com/oracle-error-messages/ora-12505-tns-listener-does-not-currently-know-of-sid-given-in-connect-descriptor/
11-26-2018
09:41 PM
Hi, It looks like there is a connectivity issue from the NodeManagers (NM) to the Oracle database. Can you please run the command below from all the NodeManagers?
Note:- You need to install the telnet utility before running this command.
## telnet <oracle full hostname> <oracle port number>
Kindly check whether all the NMs are able to connect; if any of them is not, please check with your network team.
Regards
Nitish
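If telnet is not available, a bash-only probe can serve the same purpose when run from each NodeManager; check_oracle_port is a hypothetical helper name, and the 3-second timeout is an arbitrary choice:

```shell
# check_oracle_port: try to open a TCP connection to host:port using
# bash's /dev/tcp, with a timeout so a firewalled host does not hang.
check_oracle_port() {
  host="$1"; port="$2"
  if timeout 3 bash -c "exec 3<>/dev/tcp/$host/$port" 2>/dev/null; then
    echo "$host:$port reachable"
  else
    echo "$host:$port UNREACHABLE"
  fi
}

# Example: check_oracle_port oracle-db.example.com 1521
```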
11-26-2018
09:32 PM
1 Kudo
Hi, There are 2 options:
1. You can create a shell script (containing the Sqoop commands) and schedule it with cron as per your requirement. Link:- https://www.taniarascia.com/setting-up-a-basic-cron-job-in-linux/
2. Or you can create an Oozie workflow job (which will run the Sqoop action) and run it frequently through a coordinator. Link:- https://oozie.apache.org/docs/4.1.0/DG_SqoopActionExtension.html (to create workflow.xml) Link:- https://oozie.apache.org/docs/3.1.3-incubating/CoordinatorFunctionalSpec.html (to create the coordinator)
Regards
Nitish
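For option 1, the crontab entry could look like this; the script path and the 02:00 daily schedule are placeholders for your own values:

```shell
# Run the Sqoop import script every day at 02:00, appending all output
# to a log file. Install with: crontab -e
0 2 * * * /home/hadoop/scripts/sqoop_import.sh >> /var/log/sqoop_import.log 2>&1
```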
12-19-2017
02:37 AM
Hi, There are some internal Jiras that have been raised for Sqoop + Parquet + decimal columns, to make the import work without Sqoop changing the data type. Until that Jira is fixed, we have to change the data type ourselves.
Regards
Nitish
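In the meantime, one way to change the type yourself is to override the column mapping so the decimal column is imported as a string; AMOUNT, MYTABLE, and the connect string below are hypothetical examples, not values from your job:

```shell
sqoop import \
  --connect jdbc:oracle:thin:@db-host:1521/ORCL \
  --username <user> --password <password> \
  --table MYTABLE \
  --as-parquetfile \
  --map-column-java AMOUNT=String \
  --target-dir /user/hadoop/mytable
```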
12-05-2017
08:23 PM
Hi Dwill, Kindly run the Sqoop command like this:
sqoop import -D mapreduce.map.java.opts="-Doracle.net.crypto_checksum_client=REQUESTED -Doracle.net.crypto_checksum_types_client=SHA1 -Doracle.net.encryption_client=REQUIRED -Doracle.net.encryption_types_client=AES128" --connect <hostname> --username <> --password <> --table <> ... (command continues)
NOTE:- These encryption properties will work only with Oracle 12c.
Regards
Nitish