Member since: 05-19-2016
Posts: 93
Kudos Received: 17
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 1272 | 01-30-2017 07:34 AM
 | 780 | 09-14-2016 10:31 AM
11-06-2017
12:39 PM
I am trying to offload data from a Netezza external table to HDFS using Sqoop and am facing an issue.

Netezza external table create statement:

CREATE EXTERNAL TABLE NETEZZA_EMP_EXT(
employee_id integer,
employee_name character varying(100),
salary decimal (10,2))
USING (
dataobject('D:\test\testfile.txt')
remotesource 'JDBC'
delimiter ','
skiprows 1);

testfile.txt contains the column values below:

employeeid,employeename,salary
1,'John Lee',100000
2,'Marty Short', 120000
3,'Jane Mars', 150000

Sqoop command:

sqoop import --direct --driver org.netezza.Driver --connection-manager org.apache.sqoop.manager.GenericJdbcManager --connect jdbc:netezza://**.***.***.***:5480/***** --username *** --password **** --query "select employee_id, employee_name, salary from NETEZZA_EMP_EXT WHERE \$CONDITIONS " --target-dir /user/aps/test142 --delete-target-dir -m 1

Error:

Error: java.io.IOException: SQLException in nextKeyValue
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:277)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:556)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:80)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:91)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:145)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:64)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:787)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
Caused by: org.netezza.error.NzSQLException: ERROR: Transaction rolled back by client
at org.netezza.internal.QueryExecutor.getNextResult(QueryExecutor.java:280)
at org.netezza.internal.QueryExecutor.execute(QueryExecutor.java:76)
at org.netezza.sql.NzConnection.execute(NzConnection.java:2819)
at org.netezza.sql.NzStatement._execute(NzStatement.java:849)
at org.netezza.sql.NzPreparedStatament.executeQuery(NzPreparedStatament.java:172)
at org.apache.sqoop.mapreduce.db.DBRecordReader.executeQuery(DBRecordReader.java:111)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:235)
... 12 more

Please help.
08-20-2017
04:01 AM
I am an experienced Java programmer and want to shift into Data Science. I would like to apply machine learning algorithms to very large data sets (a few TB to a few PB). Which language is preferable to use: Scala, Python, or R?
06-29-2017
05:05 AM
Is there any certification for HBase? How should I prepare for it?
06-07-2017
04:44 AM
2 Kudos
I am an experienced Java programmer and want to shift my career into Data Science and Machine Learning. Which language should I learn: R or Python?
05-08-2017
11:21 AM
I am trying to load data from an RDBMS into a Kafka topic. I am using the Kafka JDBC connector provided by Confluent. My questions are:
1. If I have millions of rows in the database, how can I proceed with the multithreading option? (A minimal config sketch follows below.)
2. Can we use multiple brokers with Kafka Connect?
3. How can we implement security in this offload from the RDBMS to the Kafka topic?
4. Let's say my server goes down during the data offload. How will it behave after the Kafka server restarts?
Please help. This is urgent.
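For context, a minimal source-connector config sketch of what I am starting from (property names as I understand them from the Confluent JDBC connector documentation; the connection URL, table, and topic prefix are placeholders):

name=rdbms-source-sketch
connector.class=io.confluent.connect.jdbc.JdbcSourceConnector
# tasks.max is the Connect parallelism knob; as I understand it the JDBC source
# assigns tables to tasks, so a single table will not gain from extra tasks (question 1)
tasks.max=4
# placeholder JDBC URL
connection.url=jdbc:mysql://dbhost:3306/mydb
table.whitelist=my_table
# stream new rows by a monotonically increasing id column
mode=incrementing
incrementing.column.name=id
topic.prefix=rdbms-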
04-19-2017
12:59 PM
I would like to ingest data from an RDBMS into a cloud platform such as Azure HDInsight blob storage, using any one of Kafka, Spark Streaming, Storm, or Flume. We can also use more than one tool from that list. My first preference is Kafka, and I do not want to use NiFi. What is the best practice from an architectural perspective? Please suggest.
03-31-2017
07:20 AM
I am planning for a Spark certification. Could you please let me know which one is best from a career perspective for a Spark developer?
1. Databricks Certified Developer (from Databricks)
2. HDP Certified Apache Spark Developer (from Hortonworks)
3. CCA Spark and Hadoop Developer (from Cloudera)
Please suggest.
02-15-2017
05:07 AM
I would like to grant permissions on an HBase table using the Java API. Is there a Java API available for this in HBase 1.1.2? A sketch of what I have in mind is below.
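For context, a minimal, untested sketch of the call I believe HBase exposes for this (org.apache.hadoop.hbase.security.access.AccessControlClient; it assumes the AccessController coprocessor is enabled on the cluster, and the table and user names are placeholders):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.security.access.AccessControlClient;
import org.apache.hadoop.hbase.security.access.Permission;

public class GrantSketch {
    public static void main(String[] args) throws Throwable {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf)) {
            // Grant READ and WRITE on table "hbase_aps" to user "aps";
            // a null family/qualifier applies the grant to the whole table.
            AccessControlClient.grant(conn, TableName.valueOf("hbase_aps"), "aps",
                    null, null, Permission.Action.READ, Permission.Action.WRITE);
        }
    }
}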
02-13-2017
05:28 AM
As per the AWS support team, HDC 1.11 is the latest on AWS and it uses HDP 2.5.
The TDCH version for that is "Teradata Connector for Hadoop 1.5.1".
Hortonworks has an article that explains how to use this connector: https://community.hortonworks.com/articles/53531/importing-data-from-teradata-into-hive.html
02-09-2017
02:25 PM
@mqureshi Thanks for your reply. This works for an existing database, but we are facing this issue for a non-existent database. It looks like the issue is not resolved yet.
02-09-2017
12:27 PM
I am trying to connect to a non-existent Hive database using a JDBC URL. It does not throw any error and instead connects to the default database, which it should not. This issue occurs with hive-jdbc-1.2.1; if we use version hive-jdbc-0.12.0-cdh5.1.2 then we get a proper exception saying the database does not exist. I am using HDP 2.4.2 and Hive 1.2.1.2.4, and I do not want to use the Cloudera jar. My code snapshot is given below, where "acb" is an unknown database.

Class.forName("org.apache.hive.jdbc.HiveDriver");
conn = DriverManager.getConnection("jdbc:hive2://*****.***.***:10000/acb");

Is this a bug in the Hive JDBC driver? Please help to fix this issue ASAP.
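A workaround I am considering, as an untested sketch (the assumption is that an explicit USE statement forces HiveServer2 to validate the database name even when the database part of the URL is silently ignored; uses java.sql.Connection and java.sql.Statement, same as the snippet above):

Class.forName("org.apache.hive.jdbc.HiveDriver");
Connection conn = DriverManager.getConnection("jdbc:hive2://*****.***.***:10000/default");
try (Statement st = conn.createStatement()) {
    // Expected to throw "Database does not exist" if "acb" is missing.
    st.execute("USE acb");
}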
02-09-2017
12:08 PM
I want to offload data from Teradata to HDFS in HDC for AWS using Sqoop. Which Teradata connector can we use for the data offload? Please advise; this is urgent.
02-08-2017
12:03 PM
@Artem Ervits Yes, I am not able to connect to HBase in HDInsight. I have the internal IP address for ZooKeeper but do not have a public IP for it. So how can I raise the firewall request?
02-08-2017
09:20 AM
@Josh Elser I am using an HDInsight cluster in the cloud. Is it possible to raise a firewall request for a cloud host/port from a local machine?
02-08-2017
09:19 AM
@Artem Ervits I am using an HDInsight cluster in the cloud. Is it possible to raise a firewall request for a cloud host/port from a local machine?
02-08-2017
09:18 AM
@Josh Elser I am using an HDInsight cluster in the cloud. Is it possible to raise a firewall request for a cloud host/port?
02-08-2017
07:50 AM
I want to connect to HBase in HDInsight using the Java API. My client machine is located outside the HDInsight cluster. I am getting the error below:

Mon Feb 06 15:40:51 IST 2017, RpcRetryingCaller{globalStartTime=1486374714593, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: org.apache.hadoop.hbase.MasterNotRunningException: Can't get connection to ZooKeeper: KeeperErrorCode = ConnectionLoss for /hbase-unsecure
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTableNames(HBaseAdmin.java:413)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTableNames(HBaseAdmin.java:397)
at com.bigframe.hbase.api.HDInsightHbase.main(HDInsightHbase.java:43)
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: org.apache.hadoop.hbase.MasterNotRunningException: Can't get connection to ZooKeeper: KeeperErrorCode = ConnectionLoss for /hbase-unsecure
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1533)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1553)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1704)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
... 4 more
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: Can't get connection to ZooKeeper: KeeperErrorCode = ConnectionLoss for /hbase-unsecure
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.checkIfBaseNodeAvailable(ConnectionManager.java:906)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.access$400(ConnectionManager.java:545)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1483)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1524)
... 8 more
Caused by: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase-unsecure
at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1045)
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:221)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:541)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.checkIfBaseNodeAvailable(ConnectionManager.java:895)
... 11 more

My code snapshot is below:

Configuration config = HBaseConfiguration.create();
config.set("hbase.zookeeper.quorum", "zookeepernode0,zookeepernode1,zookeepernode2");
config.set("hbase.zookeeper.property.clientPort", "2181");
config.set("hbase.cluster.distributed", "true");
config.set("zookeeper.znode.parent","/hbase-unsecure");
HBaseAdmin admin = new HBaseAdmin(config);
TableName[] tm = admin.listTableNames();
for(int i = 0; i<tm.length;i++){
System.out.println(tm[i].getNameAsString());
}

The same code runs from inside the HDInsight cluster, so I do not think there is any issue in the Java code. Please suggest how to run this code from outside the cluster. As it is a cloud platform, I do not know the firewall IP/port details. This is urgent. Please help.
02-08-2017
06:59 AM
This issue was resolved after setting the proper configuration in Hive.
02-08-2017
06:18 AM
I am trying to execute the query below from the Java API:

CREATE TABLE hive_aps (key struct<f1:string, f2:string>, value string) ROW FORMAT DELIMITED COLLECTION ITEMS TERMINATED BY '~' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping"=":key,f:c1") TBLPROPERTIES ("hbase.table.name" = "hbase_aps", "hbase.mapred.output.outputtable" = "hbase_aps")

And I am getting the error below:

java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:org.apache.hadoop.hbase.client.RetriesExhaustedException: Can't get the locations
at org.apache.hadoop.hbase.client.RpcRetryingCallerWithReadReplicas.getRegionLocations(RpcRetryingCallerWithReadReplicas.java:312)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:156)
at org.apache.hadoop.hbase.client.ScannerCallableWithReplicas.call(ScannerCallableWithReplicas.java:60)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithoutRetries(RpcRetryingCaller.java:200)
at org.apache.hadoop.hbase.client.ClientScanner.call(ClientScanner.java:327)
at org.apache.hadoop.hbase.client.ClientScanner.nextScanner(ClientScanner.java:302)
at org.apache.hadoop.hbase.client.ClientScanner.initializeScannerInConstruction(ClientScanner.java:167)
at org.apache.hadoop.hbase.client.ClientScanner.<init>(ClientScanner.java:162)
at org.apache.hadoop.hbase.client.HTable.getScanner(HTable.java:794)
at org.apache.hadoop.hbase.MetaTableAccessor.fullScan(MetaTableAccessor.java:602)
at org.apache.hadoop.hbase.MetaTableAccessor.tableExists(MetaTableAccessor.java:366)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:405)
at org.apache.hadoop.hbase.client.HBaseAdmin.tableExists(HBaseAdmin.java:415)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:214)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:690)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:683)
at sun.reflect.GeneratedMethodAccessor77.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:159)
at com.sun.proxy.$Proxy14.createTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:757)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4322)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:314)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1745)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1491)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1289)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1156)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1151)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:197)
at org.apache.hive.service.cli.operation.SQLOperation.access$300(SQLOperation.java:76)
at org.apache.hive.service.cli.operation.SQLOperation$2$1.run(SQLOperation.java:253)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hive.service.cli.operation.SQLOperation$2.run(SQLOperation.java:264)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:296)
at com.bigframe.perl.api.HiveHbaseIntegrationHDInsight.main(HiveHbaseIntegrationHDInsight.java:72)

The query above runs fine from the Hive CLI and creates the table in Hive as well as in HBase. I am getting this issue only from the Java API. I am also able to create a simple Hive table from Java; only when I try Hive and HBase together do I get the above issue. Please help. This is urgent.
02-06-2017
12:53 PM
@Artem Ervits Thanks for your quick reply. How can I check this? Using telnet? I am running the command below from the client machine:

telnet zookeepernode0 2181

I am getting the error below:

telnet: Unable to connect to remote host: Connection refused

Please suggest.
02-06-2017
12:21 PM
I am getting the error below from the client machine (outside the Hadoop cluster) when trying to access HBase from the Java API:

Mon Feb 06 15:40:51 IST 2017, RpcRetryingCaller{globalStartTime=1486374714593, pause=100, retries=35}, org.apache.hadoop.hbase.MasterNotRunningException: org.apache.hadoop.hbase.MasterNotRunningException: Can't get connection to ZooKeeper: KeeperErrorCode = ConnectionLoss for /hbase-unsecure
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:147)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:3917)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTableNames(HBaseAdmin.java:413)
at org.apache.hadoop.hbase.client.HBaseAdmin.listTableNames(HBaseAdmin.java:397)
at com.bigframe.hbase.api.HDInsightHbase.main(HDInsightHbase.java:43)
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: org.apache.hadoop.hbase.MasterNotRunningException: Can't get connection to ZooKeeper: KeeperErrorCode = ConnectionLoss for /hbase-unsecure
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1533)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$MasterServiceStubMaker.makeStub(ConnectionManager.java:1553)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.getKeepAliveMasterService(ConnectionManager.java:1704)
at org.apache.hadoop.hbase.client.MasterCallable.prepare(MasterCallable.java:38)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:124)
... 4 more
Caused by: org.apache.hadoop.hbase.MasterNotRunningException: Can't get connection to ZooKeeper: KeeperErrorCode = ConnectionLoss for /hbase-unsecure
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.checkIfBaseNodeAvailable(ConnectionManager.java:906)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.access$400(ConnectionManager.java:545)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStubNoRetries(ConnectionManager.java:1483)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$StubMaker.makeStub(ConnectionManager.java:1524)
... 8 more
Caused by: org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase-unsecure
at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1045)
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:221)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:541)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.checkIfBaseNodeAvailable(ConnectionManager.java:895)
... 11 more

My code snapshot is below:

Configuration config = HBaseConfiguration.create();
config.set("hbase.zookeeper.quorum", "zookeepernode0,zookeepernode1,zookeepernode2");
config.set("hbase.zookeeper.property.clientPort", "2181");
config.set("hbase.cluster.distributed", "true");
config.set("zookeeper.znode.parent","/hbase-unsecure");
HBaseAdmin admin = new HBaseAdmin(config);
TableName[] tm = admin.listTableNames();
for(int i = 0; i<tm.length;i++){
System.out.println(tm[i].getNameAsString());
}

The same code runs fine from inside the Hadoop cluster. Should I raise a firewall request for the hbase.zookeeper.quorum nodes from the client machine? A small reachability probe I could run is sketched below. Please help. This is urgent.
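A minimal reachability probe, as a sketch (host names taken from the configuration above; the 5-second timeout is an arbitrary choice). If the connect fails here, the ZooKeeper port is blocked from the client machine and no HBase client setting will help:

import java.net.InetSocketAddress;
import java.net.Socket;

public class ZkProbe {
    public static void main(String[] args) throws Exception {
        String[] hosts = {"zookeepernode0", "zookeepernode1", "zookeepernode2"};
        for (String host : hosts) {
            try (Socket s = new Socket()) {
                // Success means the ZooKeeper client port is open from this machine.
                s.connect(new InetSocketAddress(host, 2181), 5000);
                System.out.println(host + ":2181 reachable");
            } catch (Exception e) {
                System.out.println(host + ":2181 NOT reachable: " + e.getMessage());
            }
        }
    }
}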
02-06-2017
12:00 PM
It could be a firewall issue for the ZooKeeper quorum nodes. Can you please check from your client machine using the ping command, like below?

ping <zookeeper quorum node>

You may have to process firewall requests for all your ZooKeeper quorum nodes from the client machine.
01-30-2017
03:58 PM
I am trying to execute the query below in a Kerberized cluster, with the Hive principal and keytab, from the Java API. This query creates a table in Hive as well as in HBase.

CREATE TABLE hive_aps (key struct<f1:string, f2:string>, value string) ROW FORMAT DELIMITED COLLECTION ITEMS TERMINATED BY '~' STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES (\"hbase.columns.mapping\"=\":key,f:c1\") TBLPROPERTIES (\"hbase.table.name\" = \"hbase_aps\", \"hbase.mapred.output.outputtable\" = \"hbase_aps\")

And I am getting the error below:

java.sql.SQLException: Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:org.apache.hadoop.hbase.security.AccessDeniedException: org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=hive, scope=default, params=[namespace=default,table=default:hbase_aps,family=f],action=CREATE)
at org.apache.hadoop.hbase.security.access.AccessController.requireNamespacePermission(AccessController.java:622)
at org.apache.hadoop.hbase.security.access.AccessController.preCreateTable(AccessController.java:991)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost$11.call(MasterCoprocessorHost.java:216)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.execOperation(MasterCoprocessorHost.java:1140)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preCreateTable(MasterCoprocessorHost.java:212)
at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1533)
at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:454)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55401)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:95)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:226)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.translateException(RpcRetryingCaller.java:240)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:140)
at org.apache.hadoop.hbase.client.HBaseAdmin.executeCallable(HBaseAdmin.java:4083)
at org.apache.hadoop.hbase.client.HBaseAdmin.createTableAsyncV2(HBaseAdmin.java:723)
at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:644)
at org.apache.hadoop.hbase.client.HBaseAdmin.createTable(HBaseAdmin.java:577)
at org.apache.hadoop.hive.hbase.HBaseStorageHandler.preCreateTable(HBaseStorageHandler.java:217)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:670)
at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.createTable(HiveMetaStoreClient.java:663)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.invoke(RetryingMetaStoreClient.java:156)
at com.sun.proxy.$Proxy9.createTable(Unknown Source)
at org.apache.hadoop.hive.ql.metadata.Hive.createTable(Hive.java:717)
at org.apache.hadoop.hive.ql.exec.DDLTask.createTable(DDLTask.java:4271)
at org.apache.hadoop.hive.ql.exec.DDLTask.execute(DDLTask.java:311)
at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:89)
at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1728)
at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1485)
at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1262)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1126)
at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1121)
at org.apache.hive.service.cli.operation.SQLOperation.runQuery(SQLOperation.java:154)
at org.apache.hive.service.cli.operation.SQLOperation.runInternal(SQLOperation.java:183)
at org.apache.hive.service.cli.operation.Operation.run(Operation.java:257)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatementInternal(HiveSessionImpl.java:419)
at org.apache.hive.service.cli.session.HiveSessionImpl.executeStatement(HiveSessionImpl.java:400)
at sun.reflect.GeneratedMethodAccessor21.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy20.executeStatement(Unknown Source)
at org.apache.hive.service.cli.CLIService.executeStatement(CLIService.java:263)
at org.apache.hive.service.cli.thrift.ThriftCLIService.ExecuteStatement(ThriftCLIService.java:486)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1317)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$ExecuteStatement.getResult(TCLIService.java:1302)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:562)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: org.apache.hadoop.hbase.ipc.RemoteWithExtrasException(org.apache.hadoop.hbase.security.AccessDeniedException): org.apache.hadoop.hbase.security.AccessDeniedException: Insufficient permissions (user=hive, scope=default, params=[namespace=default,table=default:hbase_aps,family=f],action=CREATE)
at org.apache.hadoop.hbase.security.access.AccessController.requireNamespacePermission(AccessController.java:622)
at org.apache.hadoop.hbase.security.access.AccessController.preCreateTable(AccessController.java:991)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost$11.call(MasterCoprocessorHost.java:216)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.execOperation(MasterCoprocessorHost.java:1140)
at org.apache.hadoop.hbase.master.MasterCoprocessorHost.preCreateTable(MasterCoprocessorHost.java:212)
at org.apache.hadoop.hbase.master.HMaster.createTable(HMaster.java:1533)
at org.apache.hadoop.hbase.master.MasterRpcServices.createTable(MasterRpcServices.java:454)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$2.callBlockingMethod(MasterProtos.java:55401)
at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2114)
at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:101)
at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:130)
at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:107)
at java.lang.Thread.run(Thread.java:745)
at org.apache.hadoop.hbase.ipc.RpcClientImpl.call(RpcClientImpl.java:1226)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient.callBlockingMethod(AbstractRpcClient.java:213)
at org.apache.hadoop.hbase.ipc.AbstractRpcClient$BlockingRpcChannelImplementation.callBlockingMethod(AbstractRpcClient.java:287)
at org.apache.hadoop.hbase.protobuf.generated.MasterProtos$MasterService$BlockingStub.createTable(MasterProtos.java:58320)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation$4.createTable(ConnectionManager.java:1821)
at org.apache.hadoop.hbase.client.HBaseAdmin$5.call(HBaseAdmin.java:728)
at org.apache.hadoop.hbase.client.HBaseAdmin$5.call(HBaseAdmin.java:724)
at org.apache.hadoop.hbase.client.RpcRetryingCaller.callWithRetries(RpcRetryingCaller.java:126)
... 50 more
)
at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:167)
at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:155)
at org.apache.hive.jdbc.HiveStatement.execute(HiveStatement.java:210)
at org.apache.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:333)
at com.bigframe.hive.api.HiveClient_Kerberos.doCreateHiveTable(HiveClient_Kerberos.java:114)
at com.bigframe.hive.api.HiveClient_Kerberos.main(HiveClient_Kerberos.java:220)

This query is able to create the table in Hive but fails for HBase due to a permission issue, because the Hive principal does not have permission to create tables in HBase. What should I do? The kind of grant I think is missing is sketched below. Please help. This is urgent.
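For reference, a sketch of the grant I believe is missing (to be run from the hbase shell by a user with HBase admin rights; the permission letters R/W/X/C/A follow HBase's AccessController convention, and granting all of them may be broader than needed):

# full rights for the hive user (broad)
grant 'hive', 'RWXCA'
# or, narrower: create permission on the default namespace only
grant 'hive', 'C', '@default'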
01-30-2017
07:34 AM
This issue has been fixed after setting the properties below in kms-site.xml:

hadoop.kms.proxyuser.hive.users=*
hadoop.kms.proxyuser.hive.hosts=*

Now I am not getting any authentication error. The kms-site.xml form is sketched below.
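In kms-site.xml these take the standard Hadoop property form (a sketch, assuming the usual configuration layout):

<property>
  <name>hadoop.kms.proxyuser.hive.users</name>
  <value>*</value>
</property>
<property>
  <name>hadoop.kms.proxyuser.hive.hosts</name>
  <value>*</value>
</property>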
01-30-2017
06:40 AM
@rguruvannagari Thanks a lot. After setting the Hive property, this issue is resolved.
01-30-2017
06:15 AM
I am trying to execute the Hive select query below using the Java API in a Kerberized cluster, using the Hive principal and keytab. Apart from select queries, all other queries (create, drop, load) run fine.

select * from abc90

And I am getting the authentication error below:

org.apache.hive.service.cli.HiveSQLException: java.io.IOException: java.io.IOException: java.lang.reflect.UndeclaredThrowableException
at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:352)
at org.apache.hive.service.cli.operation.OperationManager.getOperationNextRowSet(OperationManager.java:223)
at org.apache.hive.service.cli.session.HiveSessionImpl.fetchResults(HiveSessionImpl.java:716)
at sun.reflect.GeneratedMethodAccessor29.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:78)
at org.apache.hive.service.cli.session.HiveSessionProxy.access$000(HiveSessionProxy.java:36)
at org.apache.hive.service.cli.session.HiveSessionProxy$1.run(HiveSessionProxy.java:63)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)
at org.apache.hive.service.cli.session.HiveSessionProxy.invoke(HiveSessionProxy.java:59)
at com.sun.proxy.$Proxy20.fetchResults(Unknown Source)
at org.apache.hive.service.cli.CLIService.fetchResults(CLIService.java:456)
at org.apache.hive.service.cli.thrift.ThriftCLIService.FetchResults(ThriftCLIService.java:672)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1557)
at org.apache.hive.service.cli.thrift.TCLIService$Processor$FetchResults.getResult(TCLIService.java:1542)
at org.apache.thrift.ProcessFunction.process(ProcessFunction.java:39)
at org.apache.thrift.TBaseProcessor.process(TBaseProcessor.java:39)
at org.apache.hadoop.hive.thrift.HadoopThriftAuthBridge$Server$TUGIAssumingProcessor.process(HadoopThriftAuthBridge.java:562)
at org.apache.thrift.server.TThreadPoolServer$WorkerProcess.run(TThreadPoolServer.java:285)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.IOException: java.io.IOException: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:512)
at org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:419)
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:143)
at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1745)
at org.apache.hive.service.cli.operation.SQLOperation.getNextRowSet(SQLOperation.java:347)
... 24 more
Caused by: java.io.IOException: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:892)
at org.apache.hadoop.crypto.key.KeyProviderDelegationTokenExtension.addDelegationTokens(KeyProviderDelegationTokenExtension.java:86)
at org.apache.hadoop.hdfs.DistributedFileSystem.addDelegationTokens(DistributedFileSystem.java:2291)
at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:121)
at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodesInternal(TokenCache.java:100)
at org.apache.hadoop.mapreduce.security.TokenCache.obtainTokensForNamenodes(TokenCache.java:80)
at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:206)
at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:315)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextSplits(FetchOperator.java:367)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:299)
at org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:450)
... 28 more
Caused by: java.lang.reflect.UndeclaredThrowableException
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1727)
at org.apache.hadoop.crypto.key.kms.KMSClientProvider.addDelegationTokens(KMSClientProvider.java:874)
... 38 more
Caused by: org.apache.hadoop.security.authentication.client.AuthenticationException: Authentication failed, status: 403, message: Forbidden
at org.apache.hadoop.security.authentication.client.AuthenticatedURL.extractToken(AuthenticatedURL.java:274)
at org.apache.hadoop.security.authentication.client.PseudoAuthenticator.authenticate(PseudoAuthenticator.java:77)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:128)
at org.apache.hadoop.security.authentication.client.KerberosAuthenticator.authenticate(KerberosAuthenticator.java:214)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.authenticate(DelegationTokenAuthenticator.java:128)
at org.apache.hadoop.security.authentication.client.AuthenticatedURL.openConnection(AuthenticatedURL.java:215)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.doDelegationTokenOperation(DelegationTokenAuthenticator.java:285)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticator.getDelegationToken(DelegationTokenAuthenticator.java:166)
at org.apache.hadoop.security.token.delegation.web.DelegationTokenAuthenticatedURL.getDelegationToken(DelegationTokenAuthenticatedURL.java:371)
at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:879)
at org.apache.hadoop.crypto.key.kms.KMSClientProvider$2.run(KMSClientProvider.java:874)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1709)

This is very urgent. Can anyone please help?
01-29-2017
08:23 AM
@Pranay Vyas I have analyzed the log and found that a select query from any table gets an authentication error (Authentication failed, status: 403, message: Forbidden). Apart from select queries, all other queries work. I am not sure why I am getting an authentication error only for select queries. The select query is able to get the result set, but the issue occurs while iterating through the result set. Please help.
01-29-2017
08:22 AM
@Ed Berezitsky Thanks for your reply. I have analyzed the log and found that a select query from any table gets an authentication error (Authentication failed, status: 403, message: Forbidden). Apart from select queries, all other queries work. I am not sure why I am getting an authentication error only for select queries. The select query is able to get the result set, but the issue occurs while iterating through the result set. Please help.