Member since: 08-13-2013
Posts: 16
Kudos Received: 2
Solutions: 2

My Accepted Solutions
Title | Views | Posted
---|---|---
 | 2657 | 10-28-2013 02:12 AM
 | 8758 | 08-22-2013 10:51 PM
10-28-2013
02:12 AM
1 Kudo
The above problem has been solved by adding hbase.jar to /usr/lib/hadoop/lib and by listing jar files such as hive-hbase-handler.jar, hbase.jar, guava.jar, zookeeper.jar, and hive-contrib.jar in the aux path (hive.aux.jars.path) in hive-site.xml, as given below:

<property>
<name>hive.aux.jars.path</name>
<description>Specify paths to jars for hbase integration - must update each CDH release!</description>
<value>file:///usr/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.3.0.jar,file:///usr/lib/hive/lib/hbase.jar,file:///usr/lib/hive/lib/guava-11.0.2.jar,file:///usr/lib/hive/lib/zookeeper.jar,file:///usr/lib/hive/lib/hive-contrib-0.10.0-cdh4.3.0.jar</value>
</property>

Thanks, Surbhi
10-28-2013
12:41 AM
1 Kudo
Hey, thanks for your reply. It works on both CentOS and Windows. I have added all hosts' IPs and their names to the /etc/hosts file. On Windows the hosts file is located at C:\Windows\System32\drivers\etc\hosts. On CentOS, type vi /etc/hosts to open the hosts file. Thank you so much for helping me solve this problem. Surbhi Singh
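For reference, entries in the hosts file use the same IP-then-hostname layout on both systems; a minimal illustrative sketch (addresses and names are taken from the cluster described in the question below):

```
192.168.3.100    Cloudera-new
192.168.3.101    Cluster1
```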
09-19-2013
11:46 PM
Hello, I am trying to connect to HBase from a Java application to create, insert, and delete data in an HBase table. I have installed Cloudera Standard Version on 7 hosts, named as shown below:

Server: Cloudera-new
Host1: Cluster1
Host2: Cluster2
Host3: Cluster3
Host4: Cluster4
Host5: Cluster5
Host6: Cluster6
Host7: Cluster7

Here is my /etc/hosts file:

192.168.3.100 Cloudera-new Cloudera-new
192.168.3.101 Cluster1 Cluster1
192.168.3.102 Cluster2 Cluster2
192.168.3.103 Cluster3 Cluster3
192.168.3.104 Cluster4 Cluster4
192.168.3.105 Cluster5 Cluster5
192.168.3.106 Cluster6 Cluster6
192.168.3.107 Cluster7 Cluster7

All services are started on Cluster1; the remaining hosts run some, but not all, of the services. Now I want to connect to the HBase Master running on Cluster1 from a Java application. Here is my program:

package Hive;

import java.sql.SQLException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class hbasecommand {
    private static String driverName = "org.apache.hadoop.hbase.HBaseConfiguration";

    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }
        System.out.println("Running connecting test...");
        System.out.println(" 1. From program: Hello from MySimpleHBase");
        System.out.println(" 2. 
From program: Create a HBase config"); Configuration config = HBaseConfiguration.create(); config.set("hbase.master", "192.168.3.101:60000"); config.set("hbase.zookeeper.quorum", "192.168.3.101"); config.set("hbase.zookeeper.property.clientPort", "2181"); try { String tableName = "hbase_table"; System.out.println("Enetered"); System.out.println(tableName); HBaseAdmin admin = new HBaseAdmin(config); System.out.println("===========Delete table========"); admin.disableTable(tableName); admin.deleteTable(tableName); System.out.println("delete table " + tableName + " ok."); } catch (Exception e) { e.printStackTrace(); } } } When i run this program error occur: Running connecting test... 1. From program: Hello from MySimpleHBase 2. From program: Create a HBase config Enetered hbase_table 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:zookeeper.version=3.4.5-cdh4.3.0--1, built on 05/28/2013 02:01 GMT 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:host.name=RSPL-NDA-005 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:java.version=1.7.0_03 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:java.vendor=Oracle Corporation 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:java.home=C:\Program Files\Java\jre7 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:java.class.path=C:\Users\Surbhi Singh\workspace\HiveTesting\bin;C:\Users\Surbhi Singh\Desktop\hive\commons-logging-1.0.4.jar;C:\Users\Surbhi Singh\Desktop\hive\commons-logging-api-1.0.4.jar;C:\Users\Surbhi Singh\Desktop\hive\guava-11.0.2.jar;C:\Users\Surbhi Singh\Desktop\hive\hadoop-core.jar;C:\Users\Surbhi Singh\Desktop\hive\hbase.jar;C:\Users\Surbhi Singh\Desktop\hive\hive-contrib-0.10.0-cdh4.3.0.jar;C:\Users\Surbhi Singh\Desktop\hive\hive-exec-0.10.0-cdh4.3.1.jar;C:\Users\Surbhi Singh\Desktop\hive\hive-hbase-handler-0.10.0-cdh4.3.0.jar;C:\Users\Surbhi Singh\Desktop\hive\hive-jdbc-0.10.0-cdh4.3.1.jar;C:\Users\Surbhi 
Singh\Desktop\hive\hive-metastore-0.10.0-cdh4.3.1.jar;C:\Users\Surbhi Singh\Desktop\hive\hive-service-0.10.0-cdh4.3.1.jar;C:\Users\Surbhi Singh\Desktop\hive\libfb303-0.9.0.jar;C:\Users\Surbhi Singh\Desktop\hive\libthrift-0.9.0-cdh4-1.jar;C:\Users\Surbhi Singh\Desktop\hive\log4j-1.2.16.jar;C:\Users\Surbhi Singh\Desktop\hive\mysql-connector-java.jar;C:\Users\Surbhi Singh\Desktop\hive\mysql-connector-java-5.1.25-bin.jar;C:\Users\Surbhi Singh\Desktop\hive\slf4j-api-1.6.1.jar;C:\Users\Surbhi Singh\Desktop\hive\slf4j-log4j12-1.6.1.jar;C:\Users\Surbhi Singh\Desktop\hive\zookeeper.jar;C:\Users\Surbhi Singh\Desktop\hive\commons-lang-2.4.jar;C:\Users\Surbhi Singh\Desktop\hive\commons-configuration-1.6.jar;C:\Users\Surbhi Singh\Desktop\hive\hadoop-common.jar 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:java.library.path=C:\Program Files\Java\jre7\bin;C:\Windows\Sun\Java\bin;C:\Windows\system32;C:\Windows;C:/Program Files/Java/jre7/bin/client;C:/Program Files/Java/jre7/bin;C:/Program Files/Java/jre7/lib/i386;C:\csvn\bin\;C:\csvn\Python25\;C:\Windows\system32;C:\Windows;C:\Windows\System32\Wbem;C:\Windows\System32\WindowsPowerShell\v1.0\;C:\Program Files\TortoiseSVN\bin;C:\xampp\mysql\bin;;E:\Setup\eclipse;;. 
13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:java.io.tmpdir=C:\Users\SURBHI~1\AppData\Local\Temp\ 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:java.compiler=<NA> 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:os.name=Windows 7 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:os.arch=x86 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:os.version=6.1 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:user.name=Surbhi Singh 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:user.home=C:\Users\Surbhi Singh 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Client environment:user.dir=C:\Users\Surbhi Singh\workspace\HiveTesting 13/09/20 11:21:38 INFO zookeeper.ZooKeeper: Initiating client connection, connectString=192.168.3.101:2181 sessionTimeout=180000 watcher=hconnection 13/09/20 11:21:38 INFO zookeeper.RecoverableZooKeeper: The identifier of this process is 8136@RSPL-NDA-005 13/09/20 11:21:42 INFO zookeeper.ClientCnxn: Opening socket connection to server 192.168.3.101/192.168.3.101:2181. 
Will not attempt to authenticate using SASL (unknown error) 13/09/20 11:21:42 INFO zookeeper.ClientCnxn: Socket connection established to 192.168.3.101/192.168.3.101:2181, initiating session 13/09/20 11:21:42 INFO zookeeper.ClientCnxn: Session establishment complete on server 192.168.3.101/192.168.3.101:2181, sessionid = 0x1413531a8eb0a86, negotiated timeout = 60000 13/09/20 11:21:45 INFO client.HConnectionManager$HConnectionImplementation: getMaster attempt 0 of 10 failed; retrying after sleep of 1004 java.net.UnknownHostException: unknown host: Cluster1 at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.<init>(HBaseClient.java:276) at org.apache.hadoop.hbase.ipc.HBaseClient.createConnection(HBaseClient.java:255) at org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1111) at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:974) at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:86) at $Proxy5.getProtocolVersion(Unknown Source) at org.apache.hadoop.hbase.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:138) at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:711) at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:126) at Hive.hbasecommand.main(hbasecommand.java:33) 13/09/20 11:21:46 INFO client.HConnectionManager$HConnectionImplementation: getMaster attempt 1 of 10 failed; retrying after sleep of 1000 java.net.UnknownHostException: unknown host: Cluster1 at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.<init>(HBaseClient.java:276) at org.apache.hadoop.hbase.ipc.HBaseClient.createConnection(HBaseClient.java:255) at org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1111) at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:974) at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:86) at $Proxy5.getProtocolVersion(Unknown Source) at 
org.apache.hadoop.hbase.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:138)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:711)
at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:126)
at Hive.hbasecommand.main(hbasecommand.java:33)
13/09/20 11:21:47 INFO client.HConnectionManager$HConnectionImplementation: getMaster attempt 2 of 10 failed; retrying after sleep of 1002
java.net.UnknownHostException: unknown host: Cluster1
at org.apache.hadoop.hbase.ipc.HBaseClient$Connection.<init>(HBaseClient.java:276)
at org.apache.hadoop.hbase.ipc.HBaseClient.createConnection(HBaseClient.java:255)
at org.apache.hadoop.hbase.ipc.HBaseClient.getConnection(HBaseClient.java:1111)
at org.apache.hadoop.hbase.ipc.HBaseClient.call(HBaseClient.java:974)
at org.apache.hadoop.hbase.ipc.WritableRpcEngine$Invoker.invoke(WritableRpcEngine.java:86)
at $Proxy5.getProtocolVersion(Unknown Source)
at org.apache.hadoop.hbase.ipc.WritableRpcEngine.getProxy(WritableRpcEngine.java:138)
at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getMaster(HConnectionManager.java:711)
at org.apache.hadoop.hbase.client.HBaseAdmin.<init>(HBaseAdmin.java:126)
at Hive.hbasecommand.main(hbasecommand.java:33)

I can't understand why this "unknown host" error occurs. Please help me find a solution as soon as possible. Thanks, Surbhi Singh
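The repeated UnknownHostException above means the machine running the client (per the log, a Windows workstation) cannot resolve the hostname Cluster1 that the cluster hands back, even though the IP-based settings connect to ZooKeeper fine. A small hedged sketch for checking name resolution from the same JVM (host names are taken from the post; adjust to your environment):

```java
import java.net.InetAddress;
import java.net.UnknownHostException;

public class ResolveCheck {
    public static void main(String[] args) {
        // Names the HBase client will need to resolve; "Cluster1" is the
        // name from the post and will only resolve if it is in the client's
        // hosts file or DNS.
        String[] hosts = {"localhost", "Cluster1"};
        for (String h : hosts) {
            try {
                InetAddress addr = InetAddress.getByName(h);
                System.out.println(h + " -> " + addr.getHostAddress());
            } catch (UnknownHostException e) {
                System.out.println(h + " -> cannot resolve (add it to the client's hosts file)");
            }
        }
    }
}
```

If Cluster1 does not resolve here, adding `192.168.3.101 Cluster1` to the client machine's hosts file should make the getMaster call succeed.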
08-22-2013
10:56 PM
The above problem has been solved by adding hbase.jar to /usr/lib/hadoop/lib and by listing jar files such as hive-hbase-handler.jar, hbase.jar, guava.jar, zookeeper.jar, and hive-contrib.jar in the aux path in hive-site.xml. Thanks, Surbhi
08-22-2013
10:51 PM
The above problem has been solved by adding the aux path (hive.aux.jars.path) in hive-site.xml.
08-22-2013
12:12 AM
Hello, I am trying to insert data into an HBase table through a Hive table. From the command line, I am able to create and insert data into the HBase table through Hive. My queries for creating and inserting data into HBase are given below:

CREATE TABLE hbase_test5(key int COMMENT 'key', value string COMMENT 'value') STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:value") TBLPROPERTIES ("hbase.table.name" = "hbase_test5");

INSERT OVERWRITE TABLE hbase_test4 SELECT * FROM testhivehbase WHERE key=1;

testhivehbase is a table created in Hive. Now I am trying to create and insert data into the HBase table through JDBC. My Java program is given below:

package Hive;

import java.sql.SQLException;
import java.sql.Connection;
import java.sql.ResultSet;
import java.sql.Statement;
import java.sql.DriverManager;

public class Hbasehive {
    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    /**
     * @param args
     * @throws SQLException
     */
    public static void main(String[] args) throws SQLException {
        try {
            System.out.println("hi");
            Class.forName(driverName);
            Connection con = DriverManager.getConnection("jdbc:hive://192.168.1.177:10001/default", "", "");
            System.out.println("hello");
            Statement stmt = con.createStatement();
            stmt.executeQuery("drop table hbase_test5");
            String sql = "CREATE TABLE hbase_test5(key int COMMENT 'key', value string COMMENT 'value') STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES (\"hbase.columns.mapping\" = \":key,cf1:value\") TBLPROPERTIES (\"hbase.table.name\" = \"hbase_test5\")";
            System.out.println("Running: " + sql);
            ResultSet res = stmt.executeQuery(sql);
            System.out.println("Create Table Successfully");
            String sql1 = "INSERT OVERWRITE TABLE hbase_test5 SELECT * FROM testhivehbase WHERE key=1";
            System.out.println("Running: " + sql1);
            ResultSet res1 = stmt.executeQuery(sql1);
            System.out.println("Inserted Successfully");
        } catch (ClassNotFoundException e) {
            System.out.println("--->" + e.getMessage());
            e.printStackTrace();
            System.exit(1);
        }
        System.out.println("Create Table-Successful");
        // load data into table
        // NOTE: filepath has to be local to the hive server
        // NOTE: /tmp/a.txt is a ctrl-A separated file with two fields per line
    }
}

Error:

Diagnostic Messages for this Task:
java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:72)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:413)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.ja
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec

Please help me find a solution.

Thanks & Regards,
Surbhi Singh
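For context, this class of map-task failure was later reported (in the replies above, dated 08-22-2013) as fixed by registering the HBase integration jars with Hive. A minimal sketch of the hive-site.xml property involved (jar names and versions are illustrative and must match the installed release):

```xml
<property>
  <name>hive.aux.jars.path</name>
  <value>file:///usr/lib/hive/lib/hive-hbase-handler-0.10.0-cdh4.3.0.jar,file:///usr/lib/hive/lib/hbase.jar,file:///usr/lib/hive/lib/guava-11.0.2.jar,file:///usr/lib/hive/lib/zookeeper.jar,file:///usr/lib/hive/lib/hive-contrib-0.10.0-cdh4.3.0.jar</value>
</property>
```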
08-15-2013
10:46 PM
I have followed the same steps as you mentioned, but the error still occurs. Please verify it and provide me another solution.

bin/hive --auxpath /usr/lib/hive/lib/hive-hbase-handler-0.11.0.1.3.0.0-107.jar, /usr/lib/hive/lib/hbase-0.94.6.1.3.0.0-107-security.jar, /usr/lib/hive/lib/zookeeper-3.4.5.1.3.0.0-107.jar, /usr/lib/hive/lib/guava-11.0.2.jar, -hiveconf hbase.master=localhost:60000
13/08/15 22:42:42 WARN conf.HiveConf: DEPRECATED: Configuration property hive.metastore.local no longer has any effect. Make sure to provide a valid value for hive.metastore.uris if you are connecting to a remote metastore.
Logging initialized using configuration in jar:file:/usr/lib/hive/lib/hive-common-0.11.0.1.3.0.0-107.jar!/hive-log4j.properties
hive> CREATE TABLE hbase_table_5(key int, value string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val") TBLPROPERTIES ("hbase.table.name" = "xxx");
OK
Time taken: 4.99 seconds
hive> INSERT OVERWRITE TABLE hbase_table_5 SELECT * FROM user WHERE key=1;
Total MapReduce jobs = 1
Launching Job 1 out of 1
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_201307292222_0028, Tracking URL = http://cloudera:50030/jobdetails.jsp?jobid=job_201307292222_0028
Kill Command = /usr/lib/hadoop/libexec/../bin/hadoop job -kill job_201307292222_0028
Hadoop job information for Stage-0: number of mappers: 1; number of reducers: 0
2013-08-15 22:44:08,760 Stage-0 map = 0%, reduce = 0%
2013-08-15 22:44:29,849 Stage-0 map = 100%, reduce = 100%
Ended Job = job_201307292222_0028 with errors
Error during job, obtaining debugging information...
Job Tracking URL: http://cloudera:50030/jobdetails.jsp?jobid=job_201307292222_0028
Examining task ID: task_201307292222_0028_m_000002 (and more) from job job_201307292222_0028
Task with the most failures(4):
-----
Task ID: task_201307292222_0028_m_000000
URL: http://cloudera:50030/taskdetails.jsp?jobid=job_201307292222_0028&tipid=task_201307292222_0028_m_000000
-----
Diagnostic Messages for this Task:
java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:425)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:365)
at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1232)
at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 9 more
Caused by: java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:93)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:64)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.MapRunner.configure(MapRunner.java:34)
... 14 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:88)
... 17 more
Caused by: java.lang.RuntimeException: Map operator initialization failed
at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:121)
... 22 more
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.NullPointerException
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.initializeOp(FileSinkOperator.java:385)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
at org.apache.hadoop.hive.ql.exec.SelectOperator.initializeOp(SelectOperator.java:62)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
at org.apache.hadoop.hive.ql.exec.FilterOperator.initializeOp(FilterOperator.java:78)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:451)
at org.apache.hadoop.hive.ql.exec.Operator.initializeChildren(Operator.java:407)
at org.apache.hadoop.hive.ql.exec.TableScanOperator.initializeOp(TableScanOperator.java:186)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
at org.apache.hadoop.hive.ql.exec.MapOperator.initializeOp(MapOperator.java:543)
at org.apache.hadoop.hive.ql.exec.Operator.initialize(Operator.java:375)
at org.apache.hadoop.hive.ql.exec.ExecMapper.configure(ExecMapper.java:100)
... 22 more
Caused by: java.lang.NullPointerException
at org.apache.hadoop.hive.ql.exec.FileSinkOperator.initializeOp(FileSinkOperator.java:322)
... 38 more
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.MapRedTask
MapReduce Jobs Launched:
Job 0: Map: 1 HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
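One thing worth checking in the invocation above (an observation, not a confirmed diagnosis): the value passed to --auxpath is a single comma-separated list, and the spaces after the commas make the shell split it into several separate arguments, so Hive may only see the first jar. A small sketch of that word-splitting behavior (printargs is an illustrative stand-in helper, not a Hive command):

```shell
# Word-splitting demo: spaces after commas break what should be one
# comma-separated argument into several arguments.
printargs() { echo "$# argument(s): $*"; }

printargs a.jar,b.jar,c.jar      # one argument
printargs a.jar, b.jar, c.jar    # three arguments
```

With that in mind, the --auxpath list would be written as one unbroken token, e.g. `--auxpath /path/one.jar,/path/two.jar`, with no spaces around the commas.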
08-13-2013
10:43 PM
Thanks, but nothing happened; the same error occurred again. Can you please list the steps for inserting data into an HBase table through Hive? I think I am going wrong somewhere. Please help!
08-13-2013
06:14 AM
I am trying to insert data into an HBase table through Hive.

Create table query:
CREATE TABLE hbase_testing(id int, name string, password string) STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler' WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val") TBLPROPERTIES ("hbase.table.name" = "hbase_testing");

Insert query:
INSERT OVERWRITE TABLE hbase_test2 SELECT * FROM user WHERE key=1;

Error:
java.lang.RuntimeException: Error in configuring object
at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:72)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:130)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:413)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.ja
(The same stack trace repeats three more times.)
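Since the question above hinges on the Hive-HBase column mapping, one hedged observation (not a confirmed diagnosis of this particular trace): hbase.columns.mapping is expected to carry exactly one entry per Hive column, and the CREATE TABLE above declares three columns but maps only two. A minimal sketch of that count check (plain Java, no cluster needed):

```java
public class MappingCheck {
    public static void main(String[] args) {
        // Hive columns and the mapping string from the CREATE TABLE above.
        String[] hiveColumns = {"id", "name", "password"};
        String mapping = ":key,cf1:val"; // value of hbase.columns.mapping
        int entries = mapping.split(",").length;
        System.out.println("Hive columns: " + hiveColumns.length
                + ", mapping entries: " + entries);
        if (entries != hiveColumns.length) {
            System.out.println("Mismatch: every Hive column needs a mapping entry");
        }
    }
}
```

A mapping such as ":key,cf1:name,cf1:password" would give each of the three columns an entry; whether that resolves this specific failure is untested here.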
Labels:
- Apache Hadoop
- Apache HBase
- Apache Hive
- Security