Member since: 02-15-2016
Posts: 7
Kudos Received: 3
Solutions: 1

My Accepted Solutions

| Title | Views | Posted |
| --- | --- | --- |
|  | 1125 | 04-13-2016 12:55 AM |
01-04-2018
10:23 PM
Any help is appreciated. I know some folks have experienced this before, but I have tried all the suggestions and am still facing the issue.
01-04-2018
05:21 AM
My error:

INFO: Resolved authority: <hostname>:10000
Error in .jcall(drv@jdrv, "Ljava/sql/Connection;", "connect", as.character(url)[1], :
  java.lang.NoClassDefFoundError: Could not initialize class org.apache.hive.service.auth.HiveAuthFactory

Here are my connection details in RStudio on Windows 7, connecting to the remote Hadoop server:

#loading libraries
library(rJava)
library(RJDBC)
library(DBI)

#init of the classpath (works with hadoop 2.6 on CDH 5.4 installation)
cp = c("C:/R/java/hive-jdbc.jar", "C:/R/java/hadoop-common.jar", "C:/R/java/hadoop-client-2.5.1.jar",
       "C:/R/java/commons-logging-commons-logging.jar", "C:/R/java/libthrift-0.9.2.jar",
       "C:/R/java/hive-service.jar", "C:/R/java/httpclient-4.2.5.jar", "C:/R/java/httpcore-4.2.5.jar",
       "C:/R/java/hive-jdbc-1.2.1000.2.6.1.39-2.jar")
.jinit(classpath=cp)
.jclassLoader()$setDebug(1L)

#init of the connection to Hive server
drv <- JDBC("org.apache.hive.jdbc.HiveDriver", "c:/R/java/hive-jdbc-2.0.0.jar", identifier.quote="`")
conn <- dbConnect(drv, "jdbc:hive2://xxxxxx:10000/raw_irf;", "hive", "")

My output in the console:

> #loading libraries
> library(rJava)
> library(RJDBC)
> library(DBI)
> #init of the classpath (works with hadoop 2.6 on CDH 5.4 installation)
> cp = c("C:/R/java/hive-jdbc.jar", "C:/R/java/hadoop-common.jar", "C:/R/java/hadoop-client-2.5.1.jar", "C:/R/java/commons-logging-commons-logging.jar","C:/R/java/libthrift-0.9.2.jar", "C:/R/java/hive-service.jar", "C:/R/java/httpclient-4.2.5.jar", "C:/R/java/httpcore-4.2.5.jar", "C:/R/java/hive-jdbc-1.2.1000.2.6.1.39-2.jar")
> .jinit(classpath=cp)
RJavaClassLoader: added 'C:/R/java/hive-jdbc.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/R/java/hive-jdbc.jar' to the internal class path
RJavaClassLoader: added 'C:/R/java/hadoop-common.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/R/java/hadoop-common.jar' to the internal class path
RJavaClassLoader: added 'C:/R/java/hadoop-client-2.5.1.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/R/java/hadoop-client-2.5.1.jar' to the internal class path
RJavaClassLoader: added 'C:/R/java/commons-logging-commons-logging.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/R/java/commons-logging-commons-logging.jar' to the internal class path
RJavaClassLoader: added 'C:/R/java/libthrift-0.9.2.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/R/java/libthrift-0.9.2.jar' to the internal class path
RJavaClassLoader: added 'C:/R/java/hive-service.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/R/java/hive-service.jar' to the internal class path
RJavaClassLoader: added 'C:/R/java/httpclient-4.2.5.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/R/java/httpclient-4.2.5.jar' to the internal class path
RJavaClassLoader: added 'C:/R/java/httpcore-4.2.5.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/R/java/httpcore-4.2.5.jar' to the internal class path
RJavaClassLoader: added 'C:/R/java/hive-jdbc-1.2.1000.2.6.1.39-2.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/R/java/hive-jdbc-1.2.1000.2.6.1.39-2.jar' to the internal class path
[1] 0
> .jclassLoader()$setDebug(1L)
> #init of the connection to Hive server
> drv <- JDBC("org.apache.hive.jdbc.HiveDriver", "c:/R/java/hive-jdbc-2.0.0.jar", identifier.quote="`")
RJavaClassLoader: added 'c:/R/java/hive-jdbc-2.0.0.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'c:/R/java/hive-jdbc-2.0.0.jar' to the internal class path
RJavaClassLoader: added 'C:/Users/sdrav01/AppData/Local/Continuum/Anaconda2/envs/rstudio/R/library/RJDBC/java/RJDBC.jar' to the URL class path loader
RJavaClassLoader: adding Java archive file 'C:/Users/sdrav01/AppData/Local/Continuum/Anaconda2/envs/rstudio/R/library/RJDBC/java/RJDBC.jar' to the internal class path
> conn <- dbConnect(drv, "jdbc:hive2://ucschdpprd03.steelroads.com:10000/raw_irf;","hive","")
Jan 04, 2018 12:15:25 AM org.apache.hive.jdbc.Utils parseURL
INFO: Supplied authorities: ucschdpprd03.steelroads.com:10000
Jan 04, 2018 12:15:25 AM org.apache.hive.jdbc.Utils parseURL
INFO: Resolved authority: ucschdpprd03.steelroads.com:10000
Jan 04, 2018 12:15:25 AM org.apache.hive.jdbc.Utils parseURL
INFO: Supplied authorities: ucschdpprd03.steelroads.com:10000
Jan 04, 2018 12:15:25 AM org.apache.hive.jdbc.Utils parseURL
INFO: Resolved authority: <hostname>:10000
Error in .jcall(drv@jdrv, "Ljava/sql/Connection;", "connect", as.character(url)[1], :
java.lang.NoClassDefFoundError: Could not initialize class org.apache.hive.service.auth.HiveAuthFactory

I searched Google and tried many different things over two days, but I am still getting the same error.
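For context on this error: a NoClassDefFoundError of the form "Could not initialize class X" means X's static initialization already failed once, which usually points at a classpath problem rather than a missing jar. Note that the classpath above mixes two Hive JDBC versions (hive-jdbc-1.2.1000.2.6.1.39-2.jar in .jinit() and hive-jdbc-2.0.0.jar in JDBC()). A minimal sketch of a cleaner setup, assuming a single standalone Hive JDBC jar copied from the cluster (the jar name and paths below are assumptions for illustration, not verified against this cluster):

```r
# Sketch: connect using ONE Hive JDBC jar instead of mixing 1.2.x and 2.0.0.
# A hive-jdbc-*-standalone.jar (shipped with the cluster, e.g. under
# /usr/hdp/current/hive-client/lib) bundles the Hive service/Thrift classes
# that HiveAuthFactory pulls in at class-load time. The path is hypothetical.
library(rJava)
library(RJDBC)
library(DBI)

cp <- c("C:/R/java/hive-jdbc-standalone.jar",  # hypothetical standalone jar
        "C:/R/java/hadoop-common.jar")         # Hadoop configuration classes
.jinit(classpath = cp)

# Point JDBC() at the same jar that is already on the rJava classpath,
# so only one driver version is ever loaded.
drv  <- JDBC("org.apache.hive.jdbc.HiveDriver", cp[1], identifier.quote = "`")
conn <- dbConnect(drv, "jdbc:hive2://<hostname>:10000/raw_irf", "hive", "")
```

The key point is that the jar handed to JDBC() and the jars given to .jinit() should come from the same Hive release; loading 1.2.x and 2.0.0 classes side by side is exactly the kind of mix that can make a static initializer like HiveAuthFactory's fail this way.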
04-13-2016
12:55 AM
It was a permission issue in Ranger. The issue has been resolved. Thank you very much for your help.
04-05-2016
02:26 PM
Thank you for your response. I am able to parse the file based on your suggestion. However, the loader is not able to load the data into the Phoenix table:

16/04/05 10:16:15 INFO mapreduce.CsvBulkLoadTool: Loading HFiles from /tmp/c0fdbfa0-383d-4f7a-bb0a-d41c58f3742b/INTM.EQUIP_KEY
16/04/05 10:16:15 WARN mapreduce.LoadIncrementalHFiles: managed connection cannot be used for bulkload. Creating unmanaged connection.
16/04/05 10:16:15 INFO zookeeper.RecoverableZooKeeper: Process identifier=hconnection-0x30022bf5 connecting to ZooKeeper ensemble=ucschdpdev01.railinc.com:2181
16/04/05 10:16:15 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=ucschdpdev01.railinc.com:2181 sessionTimeout=90000 watcher=hconnection-0x30022bf50x0, quorum=ucschdpdev01.railinc.com:2181, baseZNode=/hbase
16/04/05 10:16:15 INFO zookeeper.ClientCnxn: Opening socket connection to server ucschdpdev01.railinc.com/10.160.230.141:2181. Will not attempt to authenticate using SASL (unknown error)
16/04/05 10:16:15 INFO zookeeper.ClientCnxn: Socket connection established to ucschdpdev01.railinc.com/10.160.230.141:2181, initiating session
16/04/05 10:16:15 INFO zookeeper.ClientCnxn: Session establishment complete on server ucschdpdev01.railinc.com/10.160.230.141:2181, sessionid = 0x153c6f93c7f1f5a, negotiated timeout = 40000
16/04/05 10:16:15 WARN mapreduce.LoadIncrementalHFiles: Skipping non-directory hdfs://ucschdpdev01.railinc.com:8020/tmp/c0fdbfa0-383d-4f7a-bb0a-d41c58f3742b/INTM.EQUIP_KEY/_SUCCESS
16/04/05 10:16:15 INFO hfile.CacheConfig: CacheConfig:disabled
16/04/05 10:16:16 INFO mapreduce.LoadIncrementalHFiles: Trying to load hfile=hdfs://ucschdpdev01.railinc.com:8020/tmp/c0fdbfa0-383d-4f7a-bb0a-d41c58f3742b/INTM.EQUIP_KEY/0/344df66eb2644474b3ac5da7ecbe767c first=1 last=999999
16/04/05 10:27:29 INFO client.RpcRetryingCaller: Call exception, tries=10, retries=35, started=673701 ms ago, cancelled=false, msg=row '' on table 'INTM.EQUIP_KEY' at region=INTM.EQUIP_KEY,,1459834472681.60a4b8f4fad454e419242d08a51660ad., hostname=ucschdpdev03.railinc.com,16020,1459451121171, seqNum=2
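For anyone hitting the same symptom: RpcRetryingCaller looping for minutes with no hard error during the HFile load usually means the region server is rejecting or never completing the bulk-load RPC. As the 04-13-2016 reply above confirms, the cause here turned out to be a Ranger permission issue rather than a problem with the HFiles themselves. A quick sanity check, assuming the hbase shell client is available on the edge node:

```bash
# Verify the user running the bulk load can actually touch the target table.
# If this scan hangs or is denied while the table is otherwise healthy,
# suspect the authorization layer (Ranger HBase policies in this cluster)
# rather than the loader itself.
echo "scan 'INTM.EQUIP_KEY', {LIMIT => 1}" | hbase shell
```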
04-04-2016
12:32 PM
1 Kudo
Getting the following error when trying to bulk load HDFS data into Phoenix. The data is delimited by the Ctrl-A character. The command used is as follows:

hadoop jar /usr/hdp/2.3.4.0-3485/phoenix/phoenix-4.4.0.2.3.4.0-3485-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool --table INTM.TEST_DATA \
  --input /data/test_data/20160329145829/HDFS_TEST_DATA.csv \
  --zookeeper localhost:2181:/hbase

Error message:

16/04/04 08:00:46 INFO mapreduce.Job: Task Id : attempt_1459451088217_0193_m_000008_0, Status : FAILED
Error: java.lang.RuntimeException: java.lang.RuntimeException: Error on record, CSV record does not have enough values (has 1, but needs 14), record =[2808976522139491A0301939984852009-08-22 08:49:46.961000UMEMCVSNRAIL2009-08-22 08:49:46.961000UMEMCVSNRAILNative\etlload2016-03-29 14:58:31.763751]
at org.apache.phoenix.mapreduce.CsvToKeyValueMapper.map(CsvToKeyValueMapper.java:176)
at org.apache.phoenix.mapreduce.CsvToKeyValueMapper.map(CsvToKeyValueMapper.java:67)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:146)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: java.lang.RuntimeException: Error on record, CSV record does not have enough values (has 1, but needs 14), record =[2808976522139491A0301939984852009-08-22 08:49:46.961000UMEMCVSNRAIL2009-08-22 08:49:46.961000UMEMCVSNRAILNative\etlload2016-03-29 14:58:31.763751]
at org.apache.phoenix.mapreduce.CsvToKeyValueMapper$MapperUpsertListener.errorOnRecord(CsvToKeyValueMapper.java:261)
at org.apache.phoenix.util.csv.CsvUpsertExecutor.execute(CsvUpsertExecutor.java:168)
at org.apache.phoenix.util.csv.CsvUpsertExecutor.execute(CsvUpsertExecutor.java:136)
at org.apache.phoenix.mapreduce.CsvToKeyValueMapper.map(CsvToKeyValueMapper.java:157)
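The mapper error is telling: the record shown contains all fourteen fields run together, yet the parser reports "has 1, but needs 14" values, which is what happens when CsvBulkLoadTool splits on its default comma delimiter while the data is actually Ctrl-A separated. A sketch of the same command with the delimiter passed explicitly, assuming this Phoenix version's -d/--delimiter option accepts a raw \001 byte (produced below with bash's $'...' quoting):

```bash
# Same bulk load, but declaring Ctrl-A (\001) as the field delimiter
# instead of relying on CsvBulkLoadTool's default comma.
hadoop jar /usr/hdp/2.3.4.0-3485/phoenix/phoenix-4.4.0.2.3.4.0-3485-client.jar \
  org.apache.phoenix.mapreduce.CsvBulkLoadTool \
  --table INTM.TEST_DATA \
  --input /data/test_data/20160329145829/HDFS_TEST_DATA.csv \
  --zookeeper localhost:2181:/hbase \
  -d $'\001'
```

The 04-05-2016 reply above ("I am able to parse the file based on your suggestion") is consistent with the delimiter being the fix.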
Labels:
- Apache Phoenix