Member since: 01-08-2017
Posts: 36
Kudos Received: 1
Solutions: 2
My Accepted Solutions
Title | Views | Posted
---|---|---
 | 5398 | 09-09-2017 10:36 PM
 | 31853 | 03-28-2017 04:13 AM
12-10-2017
10:51 PM
Oh OK Syam, I got it. I did the installation part manually.
10-04-2017
12:26 AM
Hi,

I tried installing HBase manually (steps below), but after I run the hbase shell command I get the following error message: zookeeper.RecoverableZooKeeper: ZooKeeper exists failed after 4 attempts. Please advise. Note: the detailed error messages are included at the end of this post.

First
Unzip hbase-1.2.6-bin.tar.gz with the following command:
[hduser@storage Downloads]$ tar zxvf hbase-1.2.6-bin.tar.gz

Second
Unzipping hbase-1.2.6-bin.tar.gz creates a folder named hbase-1.2.6. Move this folder to /home/hduser, then verify that the folder is there.

Third
After moving the folder in step two, create a shortcut (symbolic) link with the following command:
[hduser@storage ~]$ ln -s hbase-1.2.6 hbase

Fourth
Edit the bash profile and add the following statements:
[hduser@storage ~]$ vi ~/.bash_profile
export HBASE_HOME=/home/hduser/hbase-1.2.6/
export PATH=$PATH:$HBASE_HOME/bin/

Fifth
Go to /home/hduser/hbase-1.2.6/conf, edit hbase-site.xml, add the configuration shown below, then save and exit:
[hduser@storage conf]$ cd /home/hduser/hbase-1.2.6/conf
[hduser@storage conf]$ ls -al
total 52
drwxr-xr-x. 2 hduser hadoop 4096 Oct 4 13:45 .
drwxr-xr-x. 8 hduser hadoop 4096 Oct 4 13:44 ..
-rw-r--r--. 1 hduser hadoop 1811 Dec 27 2015 hadoop-metrics2-hbase.properties
-rw-r--r--. 1 hduser hadoop 4537 Jan 29 2016 hbase-env.cmd
-rw-r--r--. 1 hduser hadoop 7468 Jan 29 2016 hbase-env.sh
-rw-r--r--. 1 hduser hadoop 2257 Dec 27 2015 hbase-policy.xml
-rw-r--r--. 1 hduser hadoop 1225 Oct 4 13:45 hbase-site.xml
-rw-r--r--. 1 hduser hadoop 1214 Oct 4 13:40 hbase-site.xml~
-rw-r--r--. 1 hduser hadoop 4603 May 29 14:29 log4j.properties
-rw-r--r--. 1 hduser hadoop 10 Dec 27 2015 regionservers
[hduser@storage conf]$

<configuration>
  <property>
    <name>hbase.rootdir</name>
    <value>hdfs://localhost:8020/hbase</value>
  </property>
  <property>
    <name>hbase.cluster.distributed</name>
    <value>false</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <value>/home/hduser/spark/zoo_data</value>
  </property>
</configuration>

Sixth
Refresh the bash profile using the following command:
[hduser@storage ~]$ . ~/.bash_profile

Seventh
Start HBase.
Note: if you see a Java-related error while triggering the start-hbase.sh command, follow these steps:
1. Find the Java home location:
[hduser@storage bin]$ echo $JAVA_HOME
/usr/local/jdk1.8.0_111
[hduser@storage bin]$
Copy this location: /usr/local/jdk1.8.0_111
2. Edit the hbase-env.sh file located in the /home/hduser/hbase-1.2.6/conf folder:
[hduser@storage conf]$ pwd
/home/hduser/hbase-1.2.6/conf
3. In that file, set JAVA_HOME to the location copied above:
[hduser@storage conf]$ vi hbase-env.sh
export JAVA_HOME=/usr/local/jdk1.8.0_111
Then trigger start-hbase.sh again.

Eighth
Start HBase in shell mode.

The complete error message is shown below:

[hduser@storage bin]$ stop-hbase.sh
stopping hbase................................................................................................................................................
[hduser@storage bin]$ start-hbase.sh
starting master, logging to /home/hduser/hbase-1.2.6//logs/hbase-hduser-master-storage.castrading.com.out
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option PermSize=128m; support was removed in 8.0
Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=128m; support was removed in 8.0
[hduser@storage bin]$ hbase shell
2017-10-04 14:55:30,688 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/hduser/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/hadoop-2.6.5/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
2017-10-04 14:55:48,556 ERROR [main] zookeeper.RecoverableZooKeeper: ZooKeeper exists failed after 4 attempts
2017-10-04 14:55:48,558 WARN [main] zookeeper.ZKUtil: hconnection-0x46aa712c0x0, quorum=localhost:2181, baseZNode=/hbase Unable to set watcher on znode (/hbase/hbaseid)
org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1045)
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:220)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:419)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:905)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:648)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:450)
at org.jruby.javasupport.JavaMethod.invokeStaticDirect(JavaMethod.java:362)
at org.jruby.java.invokers.StaticMethodInvoker.call(StaticMethodInvoker.java:58)
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:312)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:169)
at org.jruby.ast.CallOneArgNode.interpret(CallOneArgNode.java:57)
at org.jruby.ast.InstAsgnNode.interpret(InstAsgnNode.java:95)
at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:169)
at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:191)
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:302)
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:144)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:148)
at org.jruby.RubyClass.newInstance(RubyClass.java:822)
at org.jruby.RubyClass$i$newInstance.call(RubyClass$i$newInstance.gen:65535)
at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroOrNBlock.call(JavaMethod.java:249)
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:292)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:135)
at home.hduser.hbase_minus_1_dot_2_dot_6.bin.hirb.__file__(/home/hduser/hbase-1.2.6//bin/hirb.rb:131)
at home.hduser.hbase_minus_1_dot_2_dot_6.bin.hirb.load(/home/hduser/hbase-1.2.6//bin/hirb.rb)
at org.jruby.Ruby.runScript(Ruby.java:697)
at org.jruby.Ruby.runScript(Ruby.java:690)
at org.jruby.Ruby.runNormally(Ruby.java:597)
at org.jruby.Ruby.runFromMain(Ruby.java:446)
at org.jruby.Main.doRunFromMain(Main.java:369)
at org.jruby.Main.internalRun(Main.java:258)
at org.jruby.Main.run(Main.java:224)
at org.jruby.Main.run(Main.java:208)
at org.jruby.Main.main(Main.java:188)
2017-10-04 14:55:48,569 ERROR [main] zookeeper.ZooKeeperWatcher: hconnection-0x46aa712c0x0, quorum=localhost:2181, baseZNode=/hbase Received unexpected KeeperException, re-throwing exception
org.apache.zookeeper.KeeperException$ConnectionLossException: KeeperErrorCode = ConnectionLoss for /hbase/hbaseid
at org.apache.zookeeper.KeeperException.create(KeeperException.java:99)
at org.apache.zookeeper.KeeperException.create(KeeperException.java:51)
at org.apache.zookeeper.ZooKeeper.exists(ZooKeeper.java:1045)
at org.apache.hadoop.hbase.zookeeper.RecoverableZooKeeper.exists(RecoverableZooKeeper.java:220)
at org.apache.hadoop.hbase.zookeeper.ZKUtil.checkExists(ZKUtil.java:419)
at org.apache.hadoop.hbase.zookeeper.ZKClusterId.readClusterIdZNode(ZKClusterId.java:65)
at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:105)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.retrieveClusterId(ConnectionManager.java:905)
at org.apache.hadoop.hbase.client.ConnectionManager$HConnectionImplementation.<init>(ConnectionManager.java:648)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:238)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:218)
at org.apache.hadoop.hbase.client.ConnectionFactory.createConnection(ConnectionFactory.java:119)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(JavaMethod.java:450)
at org.jruby.javasupport.JavaMethod.invokeStaticDirect(JavaMethod.java:362)
at org.jruby.java.invokers.StaticMethodInvoker.call(StaticMethodInvoker.java:58)
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:312)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:169)
at org.jruby.ast.CallOneArgNode.interpret(CallOneArgNode.java:57)
at org.jruby.ast.InstAsgnNode.interpret(InstAsgnNode.java:95)
at org.jruby.ast.NewlineNode.interpret(NewlineNode.java:104)
at org.jruby.ast.BlockNode.interpret(BlockNode.java:71)
at org.jruby.evaluator.ASTInterpreter.INTERPRET_METHOD(ASTInterpreter.java:74)
at org.jruby.internal.runtime.methods.InterpretedMethod.call(InterpretedMethod.java:169)
at org.jruby.internal.runtime.methods.DefaultMethod.call(DefaultMethod.java:191)
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:302)
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:144)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:148)
at org.jruby.RubyClass.newInstance(RubyClass.java:822)
at org.jruby.RubyClass$i$newInstance.call(RubyClass$i$newInstance.gen:65535)
at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroOrNBlock.call(JavaMethod.java:249)
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:292)
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:135)
at home.hduser.hbase_minus_1_dot_2_dot_6.bin.hirb.__file__(/home/hduser/hbase-1.2.6//bin/hirb.rb:131)
at home.hduser.hbase_minus_1_dot_2_dot_6.bin.hirb.load(/home/hduser/hbase-1.2.6//bin/hirb.rb)
at org.jruby.Ruby.runScript(Ruby.java:697)
at org.jruby.Ruby.runScript(Ruby.java:690)
at org.jruby.Ruby.runNormally(Ruby.java:597)
at org.jruby.Ruby.runFromMain(Ruby.java:446)
at org.jruby.Main.doRunFromMain(Main.java:369)
at org.jruby.Main.internalRun(Main.java:258)
at org.jruby.Main.run(Main.java:224)
at org.jruby.Main.run(Main.java:208)
at org.jruby.Main.main(Main.java:188)
HBase Shell; enter 'help<RETURN>' for list of supported commands.
Type "exit<RETURN>" to leave the HBase Shell
Version 1.2.6, rUnknown, Mon May 29 02:25:32 CDT 2017
hbase(main):001:0> exit
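As a first diagnostic, it is worth checking whether anything is actually listening on the ZooKeeper client port, since ConnectionLoss for /hbase/hbaseid usually means the embedded ZooKeeper (which HBase starts itself when hbase.cluster.distributed=false) never came up. A minimal sketch, assuming the master log path follows the .out file shown above:

# Check whether any process is listening on ZooKeeper's client port 2181.
netstat -lnt | grep 2181

# List the running Java processes; in standalone mode you should see HMaster.
jps

# ZooKeeper "four letter word" health check; a healthy server answers "imok".
echo ruok | nc localhost 2181

# If HMaster is missing, the startup failure reason is usually in its log.
tail -n 100 /home/hduser/hbase-1.2.6/logs/hbase-hduser-master-storage.castrading.com.log

It is also worth confirming that the hdfs://localhost:8020 value in hbase.rootdir matches the fs.defaultFS port in Hadoop's core-site.xml; a mismatch there can also keep the master from starting.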
10-03-2017
10:09 PM
Hi While triggering spark-shell command. I received following error message. I wonder if anyone could guide me to get rid from the error. [hduser@storage bin]$ spark-shell Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties Setting default log level to "WARN". To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel). 17/10/04 12:44:38 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable 17/10/04 12:44:46 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hduser/spark-2.2.0-bin-hadoop2.6/jars/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/hduser/spark/jars/datanucleus-core-3.2.10.jar." 17/10/04 12:44:46 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hduser/spark/jars/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/hduser/spark-2.2.0-bin-hadoop2.6/jars/datanucleus-api-jdo-3.2.6.jar." 17/10/04 12:44:46 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/home/hduser/spark-2.2.0-bin-hadoop2.6/jars/datanucleus-rdbms-3.2.9.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/home/hduser/spark/jars/datanucleus-rdbms-3.2.9.jar." java.lang.IllegalArgumentException: Error while instantiating 'org.apache.spark.sql.hive.HiveSessionStateBuilder': at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1053) at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130) at org.apache.spark.sql.SparkSession$$anonfun$sessionState$2.apply(SparkSession.scala:130) at scala.Option.getOrElse(Option.scala:121) at org.apache.spark.sql.SparkSession.sessionState$lzycompute(SparkSession.scala:129) at org.apache.spark.sql.SparkSession.sessionState(SparkSession.scala:126) at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938) at org.apache.spark.sql.SparkSession$Builder$$anonfun$getOrCreate$5.apply(SparkSession.scala:938) at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99) at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99) at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230) at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40) at scala.collection.mutable.HashMap.foreach(HashMap.scala:99) at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:938) at org.apache.spark.repl.Main$.createSparkSession(Main.scala:97) ... 47 elided Caused by: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. 
Current permissions are: rwx------; at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106) at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:193) at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:105) at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:93) at org.apache.spark.sql.hive.HiveSessionStateBuilder.externalCatalog(HiveSessionStateBuilder.scala:39) at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog$lzycompute(HiveSessionStateBuilder.scala:54) at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:52) at org.apache.spark.sql.hive.HiveSessionStateBuilder.catalog(HiveSessionStateBuilder.scala:35) at org.apache.spark.sql.internal.BaseSessionStateBuilder.build(BaseSessionStateBuilder.scala:289) at org.apache.spark.sql.SparkSession$.org$apache$spark$sql$SparkSession$$instantiateSessionState(SparkSession.scala:1050) ... 61 more Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------ at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522) at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:191) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:264) at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:362) at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:266) at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66) at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65) at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:194) at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194) at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:194) at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97) ... 70 more Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rwx------ at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612) at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554) at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508) ... 84 more <console>:14: error: not found: value spark import spark.implicits._ ^ <console>:14: error: not found: value spark import spark.sql ^ Welcome to ____ __ / __/__ ___ _____/ /__ _\ \/ _ \/ _ `/ __/ '_/ /___/ .__/\_,_/_/ /_/\_\ version 2.2.0 /_/ Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_111) Type in expressions to have them evaluated. Type :help for more information. scala>
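The root cause is spelled out in the last "Caused by": the HDFS directory /tmp/hive is only writable by its owner (rwx------). A minimal sketch of the usual remedy, assuming the hdfs CLI from the Hadoop install is on the PATH:

# Inspect the current permissions of the Hive scratch dir on HDFS.
hdfs dfs -ls /tmp

# Make /tmp/hive writable so spark-shell's embedded Hive client can use it.
hdfs dfs -chmod -R 777 /tmp/hive

# Verify the change took effect, then retry spark-shell.
hdfs dfs -ls /tmp

Note there is also a local /tmp/hive on the Linux filesystem; if the error persists, relaxing its permissions with chmod -R 777 /tmp/hive as well is a common follow-up.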
10-03-2017
09:58 PM
Sorry for the late response. No, I could not find the exact solution for those errors. I did follow all the steps mentioned in this post, but that did not work. As a result, I uninstalled Hive and re-installed a different Hive version, which works for me. I spent many days on this issue trying to find the exact solution, but could not figure it out.
09-10-2017
01:22 AM
Can you try uninstalling Hive, re-installing it, and then running the same command again?
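In case it helps, this is roughly what I mean by a clean re-install (a sketch only; the paths are examples from this thread, and on Hive 2.x the metastore schema usually has to be re-initialized with schematool):

# Move the old installation aside rather than deleting it outright.
mv /home/hduser/Softwares/apache-hive-2.0.1-bin /home/hduser/Softwares/apache-hive-2.0.1-bin.bak

# Unpack a fresh tarball and point HIVE_HOME at it.
tar zxvf apache-hive-2.0.1-bin.tar.gz -C /home/hduser/Softwares/
export HIVE_HOME=/home/hduser/Softwares/apache-hive-2.0.1-bin
export PATH=$PATH:$HIVE_HOME/bin

# Re-initialize the metastore schema (use mysql instead of derby if applicable).
schematool -dbType derby -initSchema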
09-10-2017
12:04 AM
Hello everyone,

While triggering the following insert query:

insert into table orders_sequence select * from orders_another;

I am getting the following error message:

FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask

My objective here is to enable compression in Hive and check the file size after compression, but the problem is that after enabling compression, the insert command no longer works. I have attached a screenshot below where you can find the detailed steps I triggered to enable compression and the error that occurred after the insert command. Any help will be highly appreciated.

Note: the issue occurs after the commands below are triggered, where I simply enabled MapReduce output compression, changed the codec to Snappy, and enabled compression. After this, when I run the insert command mentioned above, it shows the FAILED error (highlighted with a red circle in the screenshot).

I even checked the following settings as well:

hive> set mapred.reduce.tasks;
mapred.reduce.tasks=-1
hive> set hive.exec.reducers.max;
hive.exec.reducers.max=1009
hive> set hive.exec.reducers.bytes.per.reducer;
hive.exec.reducers.bytes.per.reducer=256000000
hive> set mapred.tasktracker.reduce.tasks.maximum;
mapred.tasktracker.reduce.tasks.maximum=2
hive>
hive> set hive.auto.convert.join;
hive.auto.convert.join=true
hive> set hive.auto.convert.join=false;
hive> set hive.auto.convert.join;
hive.auto.convert.join=false
hive> set hive.auto.convert.join=true;
hive> set hive.exec.dynamic.partition;
hive.exec.dynamic.partition=true
hive> set hive.exec.dynamic.partition.mode;
hive.exec.dynamic.partition.mode=strict
hive> set hive.exec.dynamic.partition.mode=nonstrict;
hive> set hive.exec.dynamic.partition.mode;
hive.exec.dynamic.partition.mode=nonstrict
hive>
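For reference, the compression settings I am describing can be reproduced from the shell like this (a sketch; these are the classic MRv1-style property names that Hive accepts, with the failing insert re-run at the end):

# Enable output compression with the Snappy codec for this session,
# then re-run the insert that fails.
hive --hiveconf hive.exec.compress.output=true \
     --hiveconf mapred.output.compression.codec=org.apache.hadoop.io.compress.SnappyCodec \
     --hiveconf mapred.output.compression.type=BLOCK \
     -e 'insert into table orders_sequence select * from orders_another;'

Since return code 2 from MapRedTask is only a generic wrapper code, the actual cause will be in the logs of the failed map/reduce tasks; checking whether the Snappy native library is actually available to the task JVMs would be a sensible first step.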
09-09-2017
10:36 PM
I was able to solve the issue. I simply uninstalled and re-installed Hive, and with that it works. The select query is now able to show output without any issue.
05-05-2017
02:44 AM
Please find the output of the commands below:

1. Start the HiveMetaStore server: hive --service metastore &

[hduser@storage Desktop]$ hive --service metastore &
[1] 13900
[hduser@storage Desktop]$ which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)
Starting Hive Metastore Server
[hduser@storage Desktop]$

2. Start the HiveServer2 server: hiveserver2 &

[hduser@storage Desktop]$ hiveserver2 &
[2] 14114
[hduser@storage Desktop]$ which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)
[hduser@storage Desktop]$

3. Start beeline: beeline -u 'jdbc:hive2://localhost:10000/default' -n hive

[hduser@storage Desktop]$ beeline -u 'jdbc:hive2://localhost:10000/default' -n hive
which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)
ls: cannot access /home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-jdbc-*-standalone.jar: No such file or directory
Connecting to jdbc:hive2://localhost:10000/default
17/05/05 17:37:50 [main]: INFO jdbc.HiveConnection: Transport Used for JDBC connection: null
Error: Could not open client transport with JDBC Uri: jdbc:hive2://localhost:10000/default: java.net.ConnectException: Connection refused (Connection refused) (state=08S01,code=0)
Beeline version 2.0.1 by Apache Hive
beeline>

4. Run the SELECT query to see if it fails the same way.

Question: at step 4, you mentioned triggering the select query. Do you want me to trigger the select query from hive> or from beeline>? I tried it from hive>, and it shows an error. Output below:

[hduser@storage Desktop]$ hive
which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)
Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> show databases;
OK
default
hive
Time taken: 1.436 seconds, Fetched: 2 row(s)
hive> use hive;
OK
Time taken: 0.031 seconds
hive> show tables;
OK
orders
Time taken: 0.29 seconds, Fetched: 1 row(s)
hive> select * from orders;
Exception in thread "fc443fd9-8e62-4269-8df3-fae076efb9c2 main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/io/HdfsUtils$HadoopFileStatus
at org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:545)
at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:237)
at org.apache.hadoop.hive.ql.Context.getExtTmpPathRelTo(Context.java:429)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFileSinkPlan(SemanticAnalyzer.java:6437)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8961)
at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8850)

5. If it still fails, check the HiveServer2 server log file, whose location is defined in the /home/hduser/Softwares/apache-hive-2.0.1-bin/conf/hive-log4j2.properties file by the properties property.hive.log.dir and property.hive.log.file. Check the error there.

Answer: there is no hive-log4j2.properties file. The steps I ran are below:

[hduser@storage Desktop]$ more /home/hduser/Softwares/apache-hive-2.0.1-bin/conf/hive-log4j2.properties
/home/hduser/Softwares/apache-hive-2.0.1-bin/conf/hive-log4j2.properties: No such file or directory
[hduser@storage Desktop]$ cd /home/hduser/Softwares/apache-hive-2.0.1-bin/conf/hive-log4j2.properties
bash: cd: /home/hduser/Softwares/apache-hive-2.0.1-bin/conf/hive-log4j2.properties: No such file or directory
[hduser@storage Desktop]$ cd /home/hduser/Softwares/apache-hive-2.0.1-bin/conf/
[hduser@storage conf]$ pwd
/home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ ls -ltr hive-log4j2.properties.*
-rw-r--r--. 1 hduser hadoop 2758 Apr 23 2016 hive-log4j2.properties.template
[hduser@storage conf]$

I will look forward to hearing from you. Thank you,
Ujjwal Rana
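For what it is worth, two things stand out in this output, so here is a short diagnostic sketch. First, "Connection refused" on port 10000 usually just means HiveServer2 was not listening yet (it can take a while after "hiveserver2 &" before the port opens). Second, the install looks like it mixes versions: beeline reports 2.0.1 while the logging line loads hive-common-2.1.1.jar, and hive-jdbc-*-standalone.jar is missing, which would fit the NoClassDefFoundError:

# Is HiveServer2 listening on its default Thrift port yet?
netstat -lnt | grep 10000

# Look for mismatched hive-jdbc / hive-common jar versions in the lib dir.
ls /home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ | grep -E 'hive-(jdbc|common)'

# Once port 10000 is open, retry the beeline connection.
beeline -u 'jdbc:hive2://localhost:10000/default' -n hive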
04-07-2017
01:41 AM
I would have to write out all the steps of my Hadoop installation, but I think the .bashrc below will show you the locations for Hadoop and Hive. Please advise.

# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
. /etc/bashrc
fi

# Set Hadoop-related environment variables
#export HADOOP_HOME=/home/hduser/hadoop
export HADOOP_HOME=/home/hduser/hadoop-2.6.5
export HADOOP_INSTALL=/home/hduser/hadoop-2.6.5

#Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/local/jdk1.8.0_111
export PATH=$PATH:$JAVA_HOME/bin
PATH=$PATH:$HOME/bin
export PATH

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
lzohead () {
hadoop fs -cat $1 | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin

# Add Pig bin / directory to PATH
export PIG_HOME=/home/hduser/pig-0.15.0
export PATH=$PATH:$PIG_HOME/bin

# User specific aliases and functions
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

export SCALA_HOME=/home/hduser/scala/
export PATH=$PATH:$SCALA_HOME:/bin/

# Add Sqoop bin / directory to PATH
export SQOOP_HOME=/home/hduser/Softwares/sqoop
export PATH=$PATH:$SQOOP_HOME/bin/

# Add Hive bin / directory to PATH
export HIVE_HOME=/home/hduser/Softwares/apache-hive-2.0.1-bin
export PATH=$PATH:$HIVE_HOME/bin/
export HIVE_CONF_DIR=$HIVE_HOME/conf

[hduser@storage ~]$
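A quick sanity check after sourcing the file, to confirm the locations actually resolve (just a sketch):

# Reload the profile and print the relevant homes.
source ~/.bashrc
echo "HADOOP_HOME=$HADOOP_HOME"
echo "HIVE_HOME=$HIVE_HOME"

# Both commands should resolve to the bin directories exported above.
which hadoop hive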
04-06-2017
05:30 PM
Please find my answers inline below:

- When did the job fail: did it fail before MR was launched or after? If after, did the mapper or the reducer fail?
Answer: I did not notice that, but after logging into hive, if I trigger a select query then I get that error.

- Have you tried using beeline (you need to start HiveServer2) to see if it also fails there?
Answer: No. I am new to this. Can you give me the URL for beeline so that I can follow the steps accordingly?

- How many nodes do you have in the cluster? Is it just one?
Answer: It's just one. I have installed VMware, and inside VMware I have installed Linux as the OS, Hadoop, etc.

So isn't there any way to find the solution for the error, or is it just a bug?
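On the beeline part: beeline ships inside the Hive distribution itself ($HIVE_HOME/bin/beeline), so there is nothing separate to download. A minimal sketch of the usual single-node workflow, assuming HiveServer2's default port 10000 and no authentication:

# 1. Start HiveServer2 in the background; it serves the Thrift/JDBC endpoint.
hiveserver2 &

# 2. Give it time to open port 10000, then connect with beeline over JDBC.
beeline -u 'jdbc:hive2://localhost:10000/default' -n hduser

# 3. Inside beeline, run the same query that fails in the hive CLI:
#      select * from orders;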
04-06-2017
01:52 AM
Please find the output from below. [hduser@storage Desktop]$ strings /proc/29018/environ ORBIT_SOCKETDIR=/tmp/orbit-hduser HADOOP_DATANODE_OPTS=-Dhadoop.security.logger=ERROR,RFAS HOSTNAME=storage.castrading.com HADOOP_IDENT_STRING=hduser IMSETTINGS_INTEGRATE_DESKTOP=yes PIG_HOME=/home/hduser/pig-0.15.0 SHELL=/bin/bash TERM=xterm XDG_SESSION_COOKIE=4179b5ce21a6e7668e4c33d800000012-1491373771.913553-599059921 HADOOP_HOME=/home/hduser/hadoop-2.6.5 HISTSIZE=1000 HADOOP_PID_DIR= HADOOP_PREFIX=/home/hduser/hadoop-2.6.5 GTK_RC_FILES=/etc/gtk/gtkrc:/home/hduser/.gtkrc-1.2-gnome2 WINDOWID=46137348 QTDIR=/usr/lib64/qt-3.3 SQOOP_HOME=/home/hduser/Softwares/sqoop QTINC=/usr/lib64/qt-3.3/include YARN_HOME=/home/hduser/hadoop-2.6.5 IMSETTINGS_MODULE=none USER=hduser LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.tbz=01;31:*.tbz2=01;31:*.bz=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36: HADOOP_HEAPSIZE=256 SSH_AUTH_SOCK=/tmp/keyring-zl52DP/socket.ssh GNOME_KEYRING_SOCKET=/tmp/keyring-zl52DP/socket MALLOC_ARENA_MAX=4 HADOOP_SECURE_DN_PID_DIR= USERNAME=hduser SESSION_MANAGER=local/unix:@/tmp/.ICE-unix/3448,unix/unix:/tmp/.ICE-unix/3448 HADOOP_SECURE_DN_LOG_DIR=/ HIVE_AUX_JARS_PATH= DESKTOP_SESSION=gnome HADOOP_COMMON_LIB_NATIVE_DIR=/home/hduser/hadoop-2.6.5/lib/native MAIL=/var/spool/mail/hduser PATH=/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/usr/local/jdk1.8.0_111/bin HADOOP_HDFS_HOME=/home/hduser/hadoop-2.6.5 QT_IM_MODULE=xim HIVE_HOME=/home/hduser/Softwares/apache-hive-2.0.1-bin HADOOP_CLIENT_OPTS=-Xmx512m -Dlog4j.configurationFile=hive-log4j2.properties 
PWD=/home/hduser/Desktop HADOOP_COMMON_HOME=/home/hduser/hadoop-2.6.5 HADOOP_YARN_HOME=/home/hduser/hadoop-2.6.5 JAVA_HOME=/usr/local/jdk1.8.0_111 XMODIFIERS=@im=none HADOOP_INSTALL=/home/hduser/hadoop-2.6.5 GDM_KEYBOARD_LAYOUT=us HADOOP_CLASSPATH=/home/hduser/Softwares/apache-hive-2.0.1-bin/conf:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-core-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-fate-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-start-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-trace-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/activation-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-1.6.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-1.9.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-launcher-1.9.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr-2.7.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr4-runtime-4.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr-runtime-3.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/aopalliance-1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-commons-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-tree-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/avro-1.7.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/bonecp-0.8.0.RELEASE.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-avatica-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-core-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-linq4j-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-cli-1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-codec-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-collections-3.2.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-compiler-2.7.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-compress-1.9.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-dbcp-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-el-1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-httpclient-3.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-io-2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-lang-2.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-lang3-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-logging-1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-math-2.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-pool-1.5.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-vfs2-2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-client-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-framework-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-recipes-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-api-jdo-4.2.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-core-4.1.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-rdbms-4.1.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/derby-10.10.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/disruptor-3.3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/eigenbase-properties-1.1.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/fastutil-6.5.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-b
in/lib/findbugs-annotations-1.3.9-1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-jaspic_1.0_spec-1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-jta_1.1_spec-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/groovy-all-2.4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/gson-2.2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guava-14.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guice-3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guice-assistedinject-3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hamcrest-core-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-annotations-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-client-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-common-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-common-1.1.1-tests.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop2-compat-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop2-compat-1.1.1-tests.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop-compat-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-prefix-tree-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-procedure-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-protocol-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-server-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-accumulo-handler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-ant-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-beeline-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-cli-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-contrib-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-exec-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hbase-handler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hplsql-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hwi-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-jdbc-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-client-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-common-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-server-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-tez-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-metastore-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-orc-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-serde-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-service-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-0.23-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-common-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-scheduler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-storage-api-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-testutils-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/htrace-core-3.1.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/httpclient-4.4.jar:/home/hduser/Softwares/apache-hive-2
.0.1-bin/lib/httpcore-4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ivy-2.4.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-annotations-2.4.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-core-2.4.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-databind-2.4.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-jaxrs-1.9.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jamon-runtime-2.3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/janino-2.7.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jasper-compiler-5.5.23.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jasper-runtime-5.5.23.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.inject-1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.jdo-3.2.0-m3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.servlet-3.0.0.v201112011016.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jcodings-1.0.8.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jcommander-1.32.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jdo-api-3.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jersey-server-1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-all-7.6.0.v20120127.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-all-server-7.6.0.v20120127.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-sslengine-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-util-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jline-2.12.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/joda-time-2.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/joni-2.1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jpam-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/json-20090211.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-2.1-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-api-2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-api-2.1-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsr305-3.0.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jta-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/junit-4.11.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/libfb303-0.9.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/libthrift-0.9.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-1.2-api-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-api-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-core-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-web-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/mail-1.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-api-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-provider-svn-commons-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-provider-svnexe-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-core-2.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-core-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-json-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-jvm-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/mysql-connector-java-5.1.40-bin.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/netty-3.7.0.Final.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/netty-all-4.0.23.Final.jar:/home/hduser/Softwares/apach
e-hive-2.0.1-bin/lib/opencsv-2.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/org.abego.treelayout.core-1.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/paranamer-2.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/parquet-hadoop-bundle-1.8.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/plexus-utils-1.5.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/protobuf-java-2.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/regexp-1.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/servlet-api-2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/servlet-api-2.5-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/snappy-0.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/snappy-java-1.0.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ST4-4.0.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/stax-api-1.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/stringtemplate-3.2.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/super-csv-2.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tempus-fugit-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-api-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-core-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-hbase-compat-1.0-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/transaction-api-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-api-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-common-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-core-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-discovery-api-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-discovery-core-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-zookeeper-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/velocity-1.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/zookeeper-3.4.6.jar::/usr/local/jdk1.8.0_111/lib/tools.jar::/home/hduser/hadoop-2.6.5/contrib/capacity-scheduler/*.jar HADOOP_CONF_DIR=/home/hduser/hadoop-2.6.5/etc/hadoop LANG=en_US.UTF-8 GNOME_KEYRING_PID=3438 SERVICE_LIST=beeline cli hbaseimport hbaseschematool help hiveburninclient hiveserver2 hiveserver hplsql hwi jar lineage llap metastore metatool orcfiledump rcfilecat schemaTool version GDM_LANG=en_US.UTF-8 HADOOP_PORTMAP_OPTS=-Xmx512m HADOOP_OPTS=-Djava.library.path=/home/hduser/hadoop-2.6.5/lib -Djava.net.preferIPv4Stack=true -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/hduser/hadoop-2.6.5/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/hduser/hadoop-2.6.5 -Dhadoop.id.str=hduser -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx512m -Dlog4j.configurationFile=hive-log4j2.properties -Dhadoop.security.logger=INFO,NullAppender HADOOP_SECONDARYNAMENODE_OPTS=-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender GDMSESSION=gnome HISTCONTROL=ignoredups SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass SHLVL=2 HOME=/home/hduser HADOOP_SECURE_DN_USER= HADOOP_NAMENODE_OPTS=-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender GNOME_DESKTOP_SESSION_ID=this-is-deprecated HADOOP_MAPRED_HOME=/home/hduser/hadoop-2.6.5 LOGNAME=hduser QTLIB=/usr/lib64/qt-3.3/lib CVS_RSH=ssh 
HADOOP_HOME_WARN_SUPPRESS=true CLASSPATH=/home/hduser/Softwares/apache-hive-2.0.1-bin/conf:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-core-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-fate-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-start-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-trace-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/activation-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-1.6.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-1.9.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-launcher-1.9.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr-2.7.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr4-runtime-4.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr-runtime-3.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/aopalliance-1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-commons-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-tree-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/avro-1.7.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/bonecp-0.8.0.RELEASE.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-avatica-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-core-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-linq4j-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-cli-1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-codec-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-collections-3.2.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-compiler-2.7.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-compress-1.9.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-dbcp-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-el-1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-httpclient-3.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-io-2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-lang-2.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-lang3-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-logging-1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-math-2.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-pool-1.5.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-vfs2-2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-client-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-framework-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-recipes-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-api-jdo-4.2.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-core-4.1.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-rdbms-4.1.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/derby-10.10.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/disruptor-3.3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/eigenbase-properties-1.1.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/fastutil-6.5.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/findbugs-annotations-1.3.9-1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-jaspic_1.0_spec-1.0.j
ar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-jta_1.1_spec-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/groovy-all-2.4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/gson-2.2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guava-14.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guice-3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guice-assistedinject-3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hamcrest-core-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-annotations-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-client-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-common-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-common-1.1.1-tests.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop2-compat-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop2-compat-1.1.1-tests.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop-compat-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-prefix-tree-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-procedure-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-protocol-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-server-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-accumulo-handler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-ant-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-beeline-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-cli-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-contrib-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-exec-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hbase-handler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hplsql-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hwi-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-jdbc-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-client-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-common-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-server-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-tez-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-metastore-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-orc-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-serde-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-service-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-0.23-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-common-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-scheduler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-storage-api-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-testutils-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/htrace-core-3.1.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/httpclient-4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/httpcore-4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ivy-2.4.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-annotations-2.4.0.jar:/home/hduser/Softwares/apache-hive-
2.0.1-bin/lib/jackson-core-2.4.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-databind-2.4.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-jaxrs-1.9.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jamon-runtime-2.3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/janino-2.7.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jasper-compiler-5.5.23.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jasper-runtime-5.5.23.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.inject-1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.jdo-3.2.0-m3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.servlet-3.0.0.v201112011016.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jcodings-1.0.8.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jcommander-1.32.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jdo-api-3.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jersey-server-1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-all-7.6.0.v20120127.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-all-server-7.6.0.v20120127.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-sslengine-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-util-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jline-2.12.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/joda-time-2.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/joni-2.1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jpam-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/json-20090211.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-2.1-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-api-2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-api-2.1-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsr305-3.0.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jta-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/junit-4.11.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/libfb303-0.9.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/libthrift-0.9.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-1.2-api-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-api-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-core-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-web-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/mail-1.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-api-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-provider-svn-commons-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-provider-svnexe-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-core-2.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-core-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-json-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-jvm-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/mysql-connector-java-5.1.40-bin.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/netty-3.7.0.Final.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/netty-all-4.0.23.Final.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/opencsv-2.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/org.abego.treelayout.core-1.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/paranamer-2.3.jar:/home/hduser/Softw
ares/apache-hive-2.0.1-bin/lib/parquet-hadoop-bundle-1.8.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/plexus-utils-1.5.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/protobuf-java-2.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/regexp-1.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/servlet-api-2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/servlet-api-2.5-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/snappy-0.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/snappy-java-1.0.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ST4-4.0.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/stax-api-1.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/stringtemplate-3.2.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/super-csv-2.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tempus-fugit-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-api-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-core-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-hbase-compat-1.0-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/transaction-api-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-api-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-common-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-core-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-discovery-api-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-discovery-core-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-zookeeper-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/velocity-1.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/zookeeper-3.4.6.jar::/usr/local/jdk1.8.0_111/lib/tools.jar::/home/hduser/hadoop-2.6.5/contrib/capacity-scheduler/*.jar:/home/hduser/hadoop-2.6.5/etc/hadoop:/home/hduser/hadoop-2.6.5/share/hadoop/common/lib/*:/home/hduser/hadoop-2.6.5/share/hadoop/common/*:/home/hduser/hadoop-2.6.5/share/hadoop/hdfs:/home/hduser/hadoop-2.6.5/share/hadoop/hdfs/lib/*:/home/hduser/hadoop-2.6.5/share/hadoop/hdfs/*:/home/hduser/hadoop-2.6.5/share/hadoop/yarn/lib/*:/home/hduser/hadoop-2.6.5/share/hadoop/yarn/*:/home/hduser/hadoop-2.6.5/share/hadoop/mapreduce/lib/*:/home/hduser/hadoop-2.6.5/share/hadoop/mapreduce/* HADOOP_NFS3_OPTS= DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-eFQEXskxRI,guid=1c48465a2229e73e4a2f8e1c00000063 LESSOPEN=||/usr/bin/lesspipe.sh %s SCALA_HOME=/home/hduser/scala/ WINDOWPATH=1 DISPLAY=:0.0 HADOOP_USER_CLASSPATH_FIRST=true G_BROKEN_FILENAMES=1 XAUTHORITY=/var/run/gdm/auth-for-hduser-wCDdK5/database HIVE_CONF_DIR=/home/hduser/Softwares/apache-hive-2.0.1-bin/conf COLORTERM=gnome-terminal [hduser@storage Desktop]$
04-05-2017
11:12 PM
Please find the output below.

First:

[hduser@storage Desktop]$ hive
CLI: hive
which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)
Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>

Second:

[hduser@storage Desktop]$ ps aux | grep CliDriver
hduser 29018 29.9 5.3 2249484 217232 pts/0 Sl+ 14:08 0:38 /usr/local/jdk1.8.0_111/bin/java -Xmx256m -Djava.library.path=/home/hduser/hadoop-2.6.5/lib -Djava.net.preferIPv4Stack=true -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/hduser/hadoop-2.6.5/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/hduser/hadoop-2.6.5 -Dhadoop.id.str=hduser -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx512m -Dlog4j.configurationFile=hive-log4j2.properties -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-cli-2.0.1.jar org.apache.hadoop.hive.cli.CliDriver CLI: hive
hduser 29161 0.0 0.0 103384 808 pts/3 S+ 14:10 0:00 grep CliDriver
[hduser@storage Desktop]$

Third:

[hduser@storage Desktop]$ strings /proc/29161/environ
strings: '/proc/29161/environ': No such file
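A side note on the third command: PID 29161 is the grep process itself, which has already exited by the time strings runs, hence the "No such file" error. A minimal sketch for reading the environment of the running CliDriver JVM instead, assuming pgrep is available:

# take the PID of the CliDriver JVM, not the grep that searched for it
PID=$(pgrep -f org.apache.hadoop.hive.cli.CliDriver | head -n 1)
# /proc/PID/environ is NUL-separated; translate to newlines for readability
tr '\0' '\n' < /proc/$PID/environ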
04-04-2017
11:38 PM
Please find the output below:

[hduser@storage Desktop]$ hive --hiveconf hive.root.logger=DEBUG,console
which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)
Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
2017-04-05T14:37:29,204 INFO [main] SessionState: Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true 2017-04-05T14:37:29,208 DEBUG [main] conf.VariableSubstitution: Substitution is on: hive 2017-04-05T14:37:29,570 DEBUG [main] lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)]) 2017-04-05T14:37:29,584 DEBUG [main] lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)]) 2017-04-05T14:37:29,586 DEBUG [main] lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, valueName=Time, value=[GetGroups]) 2017-04-05T14:37:29,591 DEBUG [main] impl.MetricsSystemImpl: UgiMetrics, User and group related metrics 2017-04-05T14:37:29,790 DEBUG [main] security.Groups: Creating new Groups object 2017-04-05T14:37:29,797 DEBUG [main] util.NativeCodeLoader: Trying to load the custom-built native-hadoop library... 2017-04-05T14:37:29,799 DEBUG [main] util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path 2017-04-05T14:37:29,799 DEBUG [main] util.NativeCodeLoader: java.library.path=/home/hduser/hadoop-2.6.5/lib 2017-04-05T14:37:29,799 WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... 
using builtin-java classes where applicable 2017-04-05T14:37:29,800 DEBUG [main] util.PerformanceAdvisory: Falling back to shell based 2017-04-05T14:37:29,803 DEBUG [main] security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping 2017-04-05T14:37:29,810 DEBUG [main] security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000 2017-04-05T14:37:29,822 DEBUG [main] security.UserGroupInformation: hadoop login 2017-04-05T14:37:29,825 DEBUG [main] security.UserGroupInformation: hadoop login commit 2017-04-05T14:37:29,834 DEBUG [main] security.UserGroupInformation: using local user:UnixPrincipal: hduser 2017-04-05T14:37:29,835 DEBUG [main] security.UserGroupInformation: Using user: "UnixPrincipal: hduser" with name hduser 2017-04-05T14:37:29,835 DEBUG [main] security.UserGroupInformation: User entry: "hduser" 2017-04-05T14:37:29,836 DEBUG [main] security.UserGroupInformation: UGI loginUser:hduser (auth:SIMPLE) 2017-04-05T14:37:29,919 INFO [main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore 2017-04-05T14:37:29,961 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.storeManagerType value null from jpox.properties with rdbms 2017-04-05T14:37:29,962 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.validateConstraints value null from jpox.properties with false 2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.autoStartMechanismMode value null from jpox.properties with checked 2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.validateTables value null from jpox.properties with false 2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.Multithreaded value null from jpox.properties with true 2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.rdbms.initializeColumnInfo value null from jpox.properties with NONE 2017-04-05T14:37:29,964 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.cache.level2.type value null from jpox.properties with none 2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.connectionPoolingType value null from jpox.properties with BONECP 2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionUserName value null from jpox.properties with hive 2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.autoCreateAll value null from jpox.properties with false 2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.NonTransactionalRead value null from jpox.properties with true 2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.transactionIsolation value null from jpox.properties with read-committed 2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionURL value null from jpox.properties with jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true 2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.validateColumns value null from jpox.properties with false 2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.identifierFactory value null from jpox.properties with datanucleus1 2017-04-05T14:37:29,971 DEBUG [main] 
metastore.ObjectStore: Overriding javax.jdo.PersistenceManagerFactoryClass value null from jpox.properties with org.datanucleus.api.jdo.JDOPersistenceManagerFactory 2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.cache.level2 value null from jpox.properties with false 2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.rdbms.useLegacyNativeValueStrategy value null from jpox.properties with true 2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding hive.metastore.integral.jdo.pushdown value null from jpox.properties with false 2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.DetachAllOnCommit value null from jpox.properties with true 2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionDriverName value null from jpox.properties with org.apache.derby.jdbc.EmbeddedDriver 2017-04-05T14:37:29,972 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.plugin.pluginRegistryBundleCheck value null from jpox.properties with LOG 2017-04-05T14:37:30,025 DEBUG [main] metastore.ObjectStore: datanucleus.schema.autoCreateAll = false 2017-04-05T14:37:30,025 DEBUG [main] metastore.ObjectStore: datanucleus.schema.validateTables = false 2017-04-05T14:37:30,025 DEBUG [main] metastore.ObjectStore: datanucleus.rdbms.useLegacyNativeValueStrategy = true 2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.schema.validateColumns = false 2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: hive.metastore.integral.jdo.pushdown = false 2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.autoStartMechanismMode = checked 2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.rdbms.initializeColumnInfo = NONE 2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: javax.jdo.option.Multithreaded = true 2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.identifierFactory = datanucleus1 2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.transactionIsolation = read-committed 2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: javax.jdo.option.ConnectionURL = jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true 2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: javax.jdo.option.DetachAllOnCommit = true 2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: javax.jdo.option.NonTransactionalRead = true 2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: javax.jdo.option.ConnectionDriverName = org.apache.derby.jdbc.EmbeddedDriver 2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: datanucleus.schema.validateConstraints = false 2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: javax.jdo.option.ConnectionUserName = hive 2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: datanucleus.cache.level2 = false 2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: datanucleus.plugin.pluginRegistryBundleCheck = LOG 2017-04-05T14:37:30,028 DEBUG [main] metastore.ObjectStore: datanucleus.cache.level2.type = none 2017-04-05T14:37:30,028 DEBUG [main] metastore.ObjectStore: javax.jdo.PersistenceManagerFactoryClass = org.datanucleus.api.jdo.JDOPersistenceManagerFactory 2017-04-05T14:37:30,029 DEBUG [main] metastore.ObjectStore: datanucleus.storeManagerType = rdbms 2017-04-05T14:37:30,029 DEBUG [main] metastore.ObjectStore: datanucleus.connectionPoolingType = BONECP 2017-04-05T14:37:30,029 INFO [main] 
metastore.ObjectStore: ObjectStore, initialize called 2017-04-05T14:37:31,267 DEBUG [main] bonecp.BoneCPDataSource: JDBC URL = jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true, Username = hive, partitions = 1, max (per partition) = 10, min (per partition) = 0, idle max age = 60 min, idle test period = 240 min, strategy = DEFAULT 2017-04-05T14:37:32,132 INFO [main] metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order" 2017-04-05T14:37:34,938 DEBUG [main] bonecp.BoneCPDataSource: JDBC URL = jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true, Username = hive, partitions = 1, max (per partition) = 10, min (per partition) = 0, idle max age = 60 min, idle test period = 240 min, strategy = DEFAULT 2017-04-05T14:37:35,150 DEBUG [main] metastore.MetaStoreDirectSql: Direct SQL query in 1.803389ms + 0.059481ms, the query is [SET @@session.sql_mode=ANSI_QUOTES] 2017-04-05T14:37:35,175 INFO [main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL 2017-04-05T14:37:35,185 DEBUG [main] metastore.ObjectStore: RawStore: org.apache.hadoop.hive.metastore.ObjectStore@7103ab0, with PersistenceManager: org.datanucleus.api.jdo.JDOPersistenceManager@b0964b2 created in the thread with id: 1 2017-04-05T14:37:35,185 INFO [main] metastore.ObjectStore: Initialized ObjectStore 2017-04-05T14:37:35,345 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:7234) 2017-04-05T14:37:35,418 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 0, isactive true at: org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:7247) 2017-04-05T14:37:35,442 DEBUG [main] metastore.ObjectStore: Found expected HMS version of 2.1.0 2017-04-05T14:37:35,453 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.start(ObjectStore.java:2502) 2017-04-05T14:37:35,461 DEBUG [main] metastore.MetaStoreDirectSql: Direct SQL query in 1.297211ms + 0.019608ms, the query is [SET @@session.sql_mode=ANSI_QUOTES] 2017-04-05T14:37:35,500 DEBUG [main] metastore.MetaStoreDirectSql: getDatabase: directsql returning db default locn[hdfs://storage.castrading.com:9000/user/hive/warehouse] desc [Default Hive database] owner [public] ownertype [ROLE] 2017-04-05T14:37:35,503 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 0, isactive true at: org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.commit(ObjectStore.java:2552) 2017-04-05T14:37:35,505 DEBUG [main] metastore.ObjectStore: db details for db default retrieved using SQL in 51.37478ms 2017-04-05T14:37:35,506 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3313) 2017-04-05T14:37:35,506 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3670) 2017-04-05T14:37:35,546 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at: org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3676) 2017-04-05T14:37:35,550 DEBUG [main] metastore.ObjectStore: Rollback transaction, isActive: true at: 
org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3325) 2017-04-05T14:37:35,555 DEBUG [main] metastore.HiveMetaStore: admin role already exists InvalidObjectException(message:Role admin already exists.) at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3316) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101) at com.sun.proxy.$Proxy21.addRole(Unknown Source) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:580) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:569) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:371) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219) at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108) at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3349) at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217) at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:204) at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:331) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:292) at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:262) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:247) at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543) at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at 
org.apache.hadoop.util.RunJar.main(RunJar.java:136) 2017-04-05T14:37:35,560 INFO [main] metastore.HiveMetaStore: Added admin role in metastore 2017-04-05T14:37:35,562 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3313) 2017-04-05T14:37:35,562 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3670) 2017-04-05T14:37:35,566 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at: org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3676) 2017-04-05T14:37:35,567 DEBUG [main] metastore.ObjectStore: Rollback transaction, isActive: true at: org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3325) 2017-04-05T14:37:35,569 DEBUG [main] metastore.HiveMetaStore: public role already exists InvalidObjectException(message:Role public already exists.) at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3316) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101) at com.sun.proxy.$Proxy21.addRole(Unknown Source) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:589) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:569) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:371) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219) at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108) at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3349) at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217) at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:204) at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:331) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:292) at 
org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:262) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:247) at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543) at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 2017-04-05T14:37:35,570 INFO [main] metastore.HiveMetaStore: Added public role in metastore 2017-04-05T14:37:35,590 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:4063) 2017-04-05T14:37:35,590 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3670) 2017-04-05T14:37:35,593 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at: org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3676) 2017-04-05T14:37:35,594 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore.listPrincipalMGlobalGrants(ObjectStore.java:4579) 2017-04-05T14:37:35,620 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at: org.apache.hadoop.hive.metastore.ObjectStore.listPrincipalMGlobalGrants(ObjectStore.java:4587) 2017-04-05T14:37:35,621 DEBUG [main] metastore.ObjectStore: Rollback transaction, isActive: true at: org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:4266) 2017-04-05T14:37:35,623 DEBUG [main] metastore.HiveMetaStore: Failed while granting global privs to admin InvalidObjectException(message:All is already granted by admin) at org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:4099) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101) at com.sun.proxy.$Proxy21.grantPrivileges(Unknown Source) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:603) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:569) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:371) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219) at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67) at 
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104) at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108) at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3349) at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217) at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:204) at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:331) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:292) at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:262) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:247) at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543) at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at org.apache.hadoop.util.RunJar.main(RunJar.java:136) 2017-04-05T14:37:35,627 INFO [main] metastore.HiveMetaStore: No user is added in admin role, since config is empty 2017-04-05T14:37:35,891 INFO [main] metastore.HiveMetaStore: 0: get_all_functions 2017-04-05T14:37:35,895 INFO [main] HiveMetaStore.audit: ugi=hduser ip=unknown-ip-addr cmd=get_all_functions 2017-04-05T14:37:35,896 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at: org.apache.hadoop.hive.metastore.ObjectStore.getAllFunctions(ObjectStore.java:7549) 2017-04-05T14:37:35,916 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 0, isactive true at: org.apache.hadoop.hive.metastore.ObjectStore.getAllFunctions(ObjectStore.java:7553) 2017-04-05T14:37:36,238 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false 2017-04-05T14:37:36,240 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false 2017-04-05T14:37:36,240 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false 2017-04-05T14:37:36,240 DEBUG [main] hdfs.BlockReaderLocal: dfs.domain.socket.path = 2017-04-05T14:37:36,281 DEBUG [main] hdfs.DFSClient: No KeyProvider found. 
2017-04-05T14:37:36,454 DEBUG [main] retry.RetryUtils: multipleLinearRandomRetry = null 2017-04-05T14:37:36,511 DEBUG [main] ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@ca93621 2017-04-05T14:37:36,532 DEBUG [main] ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@54bca971 2017-04-05T14:37:37,431 DEBUG [main] util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled. 2017-04-05T14:37:37,440 DEBUG [main] sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection 2017-04-05T14:37:37,479 DEBUG [main] ipc.Client: The ping interval is 60000 ms. 2017-04-05T14:37:37,481 DEBUG [main] ipc.Client: Connecting to storage.castrading.com/192.168.0.227:9000 2017-04-05T14:37:37,524 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser: starting, having connections 1 2017-04-05T14:37:37,526 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #0 2017-04-05T14:37:37,537 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #0 2017-04-05T14:37:37,538 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 91ms 2017-04-05T14:37:37,600 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #1 2017-04-05T14:37:37,603 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #1 2017-04-05T14:37:37,604 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 5ms 2017-04-05T14:37:37,606 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #2 2017-04-05T14:37:37,608 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #2 2017-04-05T14:37:37,611 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 5ms 2017-04-05T14:37:37,616 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #3 2017-04-05T14:37:37,617 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #3 2017-04-05T14:37:37,619 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 3ms 2017-04-05T14:37:37,620 DEBUG [main] hdfs.DFSClient: /tmp/hive/hduser/10f8dcc5-0c5b-479d-92f3-87a848c6d188: masked=rwx------ 2017-04-05T14:37:37,624 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #4 2017-04-05T14:37:37,630 
DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #4 2017-04-05T14:37:37,636 DEBUG [main] ipc.ProtobufRpcEngine: Call: mkdirs took 13ms 2017-04-05T14:37:37,642 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #5 2017-04-05T14:37:37,643 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #5 2017-04-05T14:37:37,643 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms 2017-04-05T14:37:37,662 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #6 2017-04-05T14:37:37,663 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #6 2017-04-05T14:37:37,667 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 8ms 2017-04-05T14:37:37,667 DEBUG [main] hdfs.DFSClient: /tmp/hive/hduser/10f8dcc5-0c5b-479d-92f3-87a848c6d188/_tmp_space.db: masked=rwx------ 2017-04-05T14:37:37,668 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #7 2017-04-05T14:37:37,670 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #7 2017-04-05T14:37:37,670 DEBUG [main] ipc.ProtobufRpcEngine: Call: mkdirs took 3ms 2017-04-05T14:37:37,672 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #8 2017-04-05T14:37:37,675 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #8 2017-04-05T14:37:37,675 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases. hive>
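As an aside, the DEBUG output above already exposes the metastore wiring (the MySQL JDBC URL, the hive user, and the expected schema version 2.1.0). A more direct way to confirm the same details, also used elsewhere in this thread, is schematool:

# print the metastore connection URL, user, and schema version Hive will use
schematool -dbType mysql -info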
04-04-2017
04:01 AM
Please find the details of the .bashrc file below. Also, I do have the hive-exec-2.0.1.jar file inside the lib folder. In the previous reply you mentioned the filename hive-exec-2.2.0-SNAPSHOT.jar ($HIVE_HOME/lib/hive-exec-2.2.0-SNAPSHOT.jar), which is not available, but hive-exec-2.0.1.jar is available.

# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
    . /etc/bashrc
fi

# Set Hadoop-related environment variables
#export HADOOP_HOME=/home/hduser/hadoop
export HADOOP_HOME=/home/hduser/hadoop-2.6.5
export HADOOP_INSTALL=/home/hduser/hadoop-2.6.5

# Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
export JAVA_HOME=/usr/local/jdk1.8.0_111
export PATH=$PATH:$JAVA_HOME/bin
PATH=$PATH:$HOME/bin
export PATH

# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.
#
lzohead () {
    hadoop fs -cat $1 | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin

# Add Pig bin/ directory to PATH
export PIG_HOME=/home/hduser/pig-0.15.0
export PATH=$PATH:$PIG_HOME/bin

# User specific aliases and functions
export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
export SCALA_HOME=/home/hduser/scala/
export PATH=$PATH:$SCALA_HOME:/bin/

# Add Sqoop bin/ directory to PATH
export SQOOP_HOME=/home/hduser/Softwares/sqoop
export PATH=$PATH:$SQOOP_HOME/bin/

# Add Hive bin/ directory to PATH
export HIVE_HOME=/home/hduser/Softwares/apache-hive-2.0.1-bin
export PATH=$PATH:$HIVE_HOME/bin/
export HIVE_CONF_DIR=$HIVE_HOME/conf
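For the jar question above, a quick way to confirm which hive-exec jar this install actually ships (a minimal sketch, assuming the HIVE_HOME set in the .bashrc):

# list the hive-exec jar(s) present in this Hive install
ls -l $HIVE_HOME/lib/hive-exec-*.jar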
04-04-2017
03:26 AM
Nope, I don't have that file inside the lib folder. I am using the apache-hive-2.0.1-bin version, so I downloaded the tar.gz for that version, but I did not find the jar file you mentioned. Can you recommend a URL where I can download the file for the apache-hive-2.0.1-bin version? Yes, I installed Hadoop and Hive manually.
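For reference, the Apache release archive keeps old Hive releases; assuming the standard archive layout, the 2.0.1 binary tarball can be fetched with:

# download the apache-hive-2.0.1-bin release from the Apache archive
wget https://archive.apache.org/dist/hive/hive-2.0.1/apache-hive-2.0.1-bin.tar.gz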
04-03-2017
11:11 PM
I am a bit confused. Do you mean to run those commands after I log in to Hive?

[hduser@storage Desktop]$ jps
7041 NameNode
7891 NodeManager
7143 DataNode
7928 Jps
7291 SecondaryNameNode
7789 ResourceManager

[hduser@storage Desktop]$ hive
which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)
Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive> ps aux | grep HiveServer2 ;
NoViableAltException(26@[]) at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1099) at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:204) at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:440) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:319) at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1249) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1295) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1178) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1166) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:236) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:782) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:721) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
FAILED: ParseException line 1:0 cannot recognize input near 'ps' 'aux' '|'
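The ParseException at the end is just Hive trying to parse the ps command as SQL: process checks like this belong in a regular Linux shell, not at the hive> prompt. Two equivalent ways (the second relies on the Hive CLI's ! shell escape):

# option 1: run it from a second terminal, outside Hive
ps aux | grep HiveServer2

# option 2: from inside the Hive CLI, '!' executes a shell command
hive> !ps aux | grep HiveServer2;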
04-03-2017
10:13 PM
Please find the output below:

[hduser@storage Desktop]$ ps aux | grep HiveServer2
hduser 4479 0.0 0.0 103384 812 pts/0 S+ 13:08 0:00 grep HiveServer2

[root@storage ~]# strings /proc/4479/environ
strings: '/proc/4479/environ': No such file
03-30-2017
10:10 PM
Hello everyone. While running a SQL SELECT statement inside Hive, I get the following error messages. I have included the SQL statements and the output below. Any suggestion will be highly appreciated.

hive (default)> show tables;
OK
order_items
Time taken: 0.35 seconds, Fetched: 1 row(s)

hive (default)> select count(1) from order_items;
Exception in thread "d413467f-6da8-4ebc-bf93-730e15b4b23f main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/io/HdfsUtils$HadoopFileStatus at org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:545) at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:237) at org.apache.hadoop.hive.ql.Context.getExtTmpPathRelTo(Context.java:429) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFileSinkPlan(SemanticAnalyzer.java:6437) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8961) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8850) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9703) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9596) at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:291) at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10103) at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:228) at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:239) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:473) at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:319) at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1249) at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1295) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1178) at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1166) at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:236) at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187) at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403) at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:782) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:721) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at org.apache.hadoop.util.RunJar.main(RunJar.java:136) Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.io.HdfsUtils$HadoopFileStatus at java.net.URLClassLoader.findClass(URLClassLoader.java:381) at java.lang.ClassLoader.loadClass(ClassLoader.java:424) at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331) at java.lang.ClassLoader.loadClass(ClassLoader.java:357) ... 30 more
[hduser@storage Softwares]$
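One hedged observation on the NoClassDefFoundError: org.apache.hadoop.hive.io.HdfsUtils is a Hive-internal class, and this error pattern usually points at jars from two different Hive releases on the same classpath. Earlier output in this thread does show hive-common-2.1.1.jar sitting inside the apache-hive-2.0.1-bin lib directory, so a check like the following (a sketch, assuming HIVE_HOME is set) may be worthwhile:

# list Hive core jars and eyeball the version suffixes for a mismatch
ls $HIVE_HOME/lib | grep -E '^hive-(common|exec|metastore|serde)-' | sort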
03-28-2017
04:13 AM
Hello everyone. I am now able to solve the Hive issue. Below is what I did. Please do not hesitate to ask if the steps below do not work:

Step 1: Make sure you have followed these steps accordingly. In my case I had created hive-site.xml manually ([hduser@storage conf]$ vi hive-site.xml), so I deleted that hive-site.xml and recreated it from the template:

[hduser@storage conf]$ pwd
/home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ cd /home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ pwd
/home/hduser/Softwares/apache-hive-2.0.1-bin/conf
[hduser@storage conf]$ cp hive-default.xml.template hive-site.xml

Step 2: Create the directory /tmp/hive (a sketch of this step follows after these steps).

Step 3: Now edit the hive-site.xml created in Step 1 and make sure the following properties are set:

<property>
 <name>javax.jdo.option.ConnectionURL</name>
 <value>jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true</value>
 <description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
 <name>javax.jdo.option.ConnectionDriverName</name>
 <value>com.mysql.jdbc.Driver</value>
 <description>Driver class name for a JDBC metastore</description>
</property>
<property>
 <name>hive.metastore.warehouse.dir</name>
 <value>/user/hive/warehouse</value>
 <description>location of default database for the warehouse</description>
</property>
<property>
 <name>javax.jdo.option.ConnectionUserName</name>
 <value>hive</value>
 <description>Username to use against metastore database</description>
</property>
<property>
 <name>javax.jdo.option.ConnectionPassword</name>
 <value>hive</value>
 <description>password to use against metastore database</description>
</property>
<property>
 <name>hive.querylog.location</name>
 <value>/tmp/hive</value>
 <description>Location of Hive run time structured log file</description>
</property>
<property>
 <name>hive.exec.local.scratchdir</name>
 <value>/tmp/hive</value>
 <description>Local scratch space for Hive jobs</description>
</property>
<property>
 <name>hive.downloaded.resources.dir</name>
 <value>/tmp/hive</value>
 <description>Temporary local directory for added resources in the remote file system.</description>
</property>

Step 4: Make sure you have added the following line inside hive-env.sh (adjust the HADOOP_HOME location and version to your install):

export HADOOP_HOME=/home/hduser/hadoop-2.6.5

Step 5: Check the jps status and then try starting Hive:

Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>
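Step 2 above is terse, so here is a minimal sketch of creating the scratch directory; the permission bits and the HDFS twin directory are my assumptions, not part of the original steps:

# local directory referenced by hive.exec.local.scratchdir and friends
mkdir -p /tmp/hive
chmod 1777 /tmp/hive          # assumption: world-writable with sticky bit

# assumption: Hive's default HDFS scratch dir is also /tmp/hive
hdfs dfs -mkdir -p /tmp/hive
hdfs dfs -chmod -R 777 /tmp/hive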
03-28-2017
02:54 AM
I am also facing the same issue; here is my status: http://community.cloudera.com/t5/Batch-SQL-Apache-Hive/Hive-not-opening-working-Exception-in-thread-quot-main-quot-java/m-p/52291#M1860?eid=1&aid=1
03-24-2017
04:44 AM
Hi Again As per the url which you have given i trigger the mysql statement accordingly but still i am getting same issue. Below is what i did. Please help. Will look forward to hear from you. As per the above url. Below is what i did but still it showing same issue. Please help i am stuck almost three weeks now. I don't see any help from anyone. Will wait for your feedback mysql> grant all on *.* to 'hive'@'192.168.0.227' identified by 'hive'; Query OK, 0 rows affected (0.00 sec) mysql> flush privileges; Query OK, 0 rows affected (0.00 sec) [hduser@storage lib]$ mysql -u hive -h 192.168.0.227 -p Enter password: Welcome to the MySQL monitor. Commands end with ; or \g. Your MySQL connection id is 13 Server version: 5.1.73 Source distribution Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved. Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners. Type 'help;' or '\h' for help. Type '\c' to clear the current input statement. mysql> [hduser@storage Desktop]$ hive which: no hbase in (/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/) Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.1.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient at org.apache.hadoop.util.RunJar.main(RunJar.java:136) Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226) at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366) Caused by: java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES) at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873) at java.sql.DriverManager.getConnection(DriverManager.java:208) at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361) at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416) ... 70 more [hduser@storage Desktop]$ Also please find the output of hive-site.xml again <?xml version="1.0" encoding="UTF-8"?> <?xml-stylesheet type="text/xsl" href="configuration.xsl"?> <!-- Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. 
You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. See accompanying LICENSE file. -->

<!-- Put site-specific property overrides in this file. -->

<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true</value>
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
<description>location of default database for the warehouse</description>
</property>
<property>
<name>javax.jdo.optioin.ConnectionUserName</name>
<value>hive</value>
<description>MYSQL username</description>
</property>
<property>
<name>javax.jdo.optioin.ConnectionPassword</name>
<value>hive</value>
<description>MYSQL Password</description>
</property>
<property>
<name>datanucleus.autoCreateSchema</name>
<value>true</value>
</property>
<property>
<name>datanucleus.fixedDatastore</name>
<value>true</value>
</property>
<property>
<name>datanucleus.autoCreateTables</name>
<value>True</value>
</property>
</configuration>
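One thing stands out in this file: the username and password property names are spelled javax.jdo.optioin.* instead of javax.jdo.option.*. Hive only recognizes the correctly spelled names, and its built-in default for javax.jdo.option.ConnectionUserName is APP, which would explain why every connection attempt in the logs is made as 'APP' rather than 'hive'. A corrected snippet, assuming the values themselves stay unchanged:

<!-- Corrected property names: "option", not "optioin". With these in place,
     Hive should connect to MySQL as hive/hive instead of falling back to
     its default connection user, APP. -->
<property>
<name>javax.jdo.option.ConnectionUserName</name>
<value>hive</value>
<description>MYSQL username</description>
</property>
<property>
<name>javax.jdo.option.ConnectionPassword</name>
<value>hive</value>
<description>MYSQL Password</description>
</property>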
03-17-2017
05:30 PM
My understanding is that it should take the username and password hive, i.e. the ones defined inside hive-site.xml. If you look at hive-site.xml, the username and password are both set to hive, but I cannot understand where it is taking the APP user from. Also, hive-site.xml is located inside the conf directory. By the way, I tried with the following:

<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://localhost/storage.castrading.com</value>
</property>

I also tried the following, but with no progress:

<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://storage.castrading.com/hive?createDatabaseIfNotExist=true</value>
</property>

Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://localhost/storage.castrading.com, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------
java.sql.SQLException: Access denied for user 'APP'@'localhost' (using password: YES)
at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906)

hive-site.xml location:

[hduser@storage conf]$ pwd
/home/hduser/Softwares/apache-hive-2.1.1-bin/conf
[hduser@storage conf]$ ls -ltr
total 276
-rw-r--r--. 1 hduser hadoop 2662 Nov 29 05:32 parquet-logging.properties
-rw-r--r--. 1 hduser hadoop 2060 Nov 29 05:32 ivysettings.xml
-rw-r--r--. 1 hduser hadoop 2925 Nov 29 05:32 hive-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop 2925 Nov 29 05:32 hive-log4j2.properties
-rw-r--r--. 1 hduser hadoop 1596 Nov 29 05:32 beeline-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop 2719 Nov 29 05:32 llap-cli-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop 2274 Nov 29 05:32 hive-exec-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop 4353 Nov 29 05:35 llap-daemon-log4j2.properties.template
-rw-r--r--. 1 hduser hadoop 2378 Nov 29 05:35 hive-env.sh.template
-rw-r--r--. 1 hduser hadoop 229198 Nov 30 03:46 hive-default.xml.template
-rw-r--r--. 1 hduser hadoop 2378 Mar 15 14:20 hive-env.sh~
-rw-r--r--. 1 hduser hadoop 1918 Mar 15 19:07 hive-site.xmlbackup
-rw-r--r--. 1 hduser hadoop 1913 Mar 18 08:26 hive-site.xml
[hduser@storage conf]$

hive-site.xml output:

<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://storage.castrading.com/hive?createDatabaseIfNotExist=true</value>
<!-- Note: I even tried jdbc:mysql://localhost/hive?createDatabaseIfNotExist=true -->
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
<description>location of default database for the warehouse</description>
</property>
<property>
<name>javax.jdo.optioin.ConnectionUserName</name>
<value>hive</value>
<description>MYSQL username</description>
</property>
<property>
<name>javax.jdo.optioin.ConnectionPassword</name>
<value>hive</value>
<description>MYSQL Password</description>
</property>
<property>
<name>datanucleus.autoCreateSchema</name>
<value>true</value>
</property>
<property>
<name>datanucleus.fixedDatastore</name>
<value>true</value>
</property>
<property>
<name>datanucleus.autoCreateTables</name>
<value>True</value>
</property>
</configuration>

[hduser@storage conf]$ schematool -dbType mysql -info
which: no hbase in
(/home/hduser/Softwares/apache-hive-2.1.1-bin/bin:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/) Metastore connection URL: jdbc:mysql://storage.castrading.com/hive?createDatabaseIfNotExist=true Metastore Connection Driver : com.mysql.jdbc.Driver Metastore connection User: APP org.apache.hadoop.hive.metastore.HiveMetaException: Failed to get schema version. Underlying cause: java.sql.SQLException : Access denied for user 'APP'@'storage' (using password: YES) SQL Error code: 1045 Use --verbose for detailed stacktrace. *** schemaTool failed *** [hduser@storage conf]$
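Worth noting in the schematool output above: it reports "Metastore connection User: APP" even though hive-site.xml sets the user to hive. That is consistent with the misspelled javax.jdo.optioin.* property names pointed out earlier; Hive never reads the configured credentials and falls back to its default user APP. A quick sanity check, shown as a sketch using the conf path from this post:

# List the JDO connection properties in hive-site.xml. Only names spelled
# "javax.jdo.option.*" are recognized by Hive; "javax.jdo.optioin.*" is ignored.
grep -n "javax.jdo" /home/hduser/Softwares/apache-hive-2.1.1-bin/conf/hive-site.xml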
03-16-2017
10:20 PM
Hi Saranvisa, I followed the URL you provided, but I don't see that issue on my side. I am able to log in to localhost as well as storage.castrading.com; the mysql output is below. Also, please find the output of hive-site.xml below.

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!-- Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing, software distributed under the License is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License. See accompanying LICENSE file. -->

<!-- Put site-specific property overrides in this file. -->

<configuration>
<property>
<name>javax.jdo.option.ConnectionURL</name>
<value>jdbc:mysql://storage.castrading.com/hive?createDatabaseIfNotExist=true</value>
<!-- Note: I even tried jdbc:mysql://localhost/hive?createDatabaseIfNotExist=true -->
<description>JDBC connect string for a JDBC metastore</description>
</property>
<property>
<name>javax.jdo.option.ConnectionDriverName</name>
<value>com.mysql.jdbc.Driver</value>
<description>Driver class name for a JDBC metastore</description>
</property>
<property>
<name>hive.metastore.warehouse.dir</name>
<value>/user/hive/warehouse</value>
<description>location of default database for the warehouse</description>
</property>
<property>
<name>javax.jdo.optioin.ConnectionUserName</name>
<value>hive</value>
<description>MYSQL username</description>
</property>
<property>
<name>javax.jdo.optioin.ConnectionPassword</name>
<value>hive</value>
<description>MYSQL Password</description>
</property>
<property>
<name>datanucleus.autoCreateSchema</name>
<value>true</value>
</property>
<property>
<name>datanucleus.fixedDatastore</name>
<value>true</value>
</property>
<property>
<name>datanucleus.autoCreateTables</name>
<value>True</value>
</property>
</configuration>

[hduser@storage ~]$ mysql -u hive -p -hlocalhost
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 208
Server version: 5.1.73 Source distribution
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql> \q
Bye

[hduser@storage ~]$ mysql -u hive -p -hstorage.castrading.com
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 209
Server version: 5.1.73 Source distribution
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.
mysql>

[hduser@storage ~]$ mysql -u hive -p
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 214
Server version: 5.1.73 Source distribution
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its affiliates.
Other names may be trademarks of their respective owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> show databases;
+--------------------+
| Database           |
+--------------------+
| information_schema |
| hive               |
| metastore          |
| mysql              |
| retail_db          |
| retail_db_backup   |
+--------------------+
6 rows in set (0.10 sec)

mysql>
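One useful cross-check here: MySQL treats 'hive'@'localhost', 'hive'@'storage.castrading.com', and 'hive'@'%' as separate accounts, so it helps to confirm which account each of the logins above actually matched. From inside any of these mysql sessions, using standard MySQL built-ins (a sketch):

-- USER() is the login you asked for; CURRENT_USER() is the account
-- (user@host pattern) the server actually authenticated you as.
SELECT USER(), CURRENT_USER();

-- Show every host pattern that exists for the hive user.
SELECT user, host FROM mysql.user WHERE user = 'hive';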
03-16-2017
03:37 AM
Hello everyone,

I am receiving an error while triggering the hive command; the full error log output is at the bottom. For reference, I followed the steps below. Any suggestion will be highly appreciated; it has been two weeks now and I have not been able to solve the issue. Please help.

[hduser@storage Softwares]$ vi ~/.bashrc
# Add Hive bin / directory to PATH
export HIVE_HOME=/home/hduser/Softwares/apache-hive-2.1.1-bin
export PATH=$PATH:$HIVE_HOME/bin/

Created the following directory inside HDFS:

[hduser@storage ~]$ hadoop fs -ls /user/hive/
17/03/16 18:26:24 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Found 1 items
drwxrwxr-x - hduser supergroup 0 2017-03-13 18:23 /user/hive/warehouse
[hduser@storage ~]$

Created the metastore for Hive:

[hduser@storage bin]$ mysql -u root -p
Enter password:
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 2
Server version: 5.1.73 Source distribution
Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.
Oracle is a registered trademark of Oracle Corporation and/or its affiliates. Other names may be trademarks of their respective owners.
Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

mysql> create database hive;
Query OK, 1 row affected (0.05 sec)

mysql> use hive;
Reading table information for completion of table and column names
You can turn off this feature to get a quicker startup with -A
Database changed

mysql> CREATE USER 'hive'@'storage.castrading.com' IDENTIFIED BY 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> grant all privileges on hive to 'hive'@'storage.castrading.com' identified by 'hive' with grant option;
Query OK, 0 rows affected (0.08 sec)

mysql> grant all privileges on hive.* to 'hive'@'storage.castrading.com' identified by 'hive' with grant option;
Query OK, 0 rows affected (0.09 sec)

mysql> grant all on *.* to 'hive'@'%' identified by 'hive';
Query OK, 0 rows affected (0.00 sec)

mysql> flush privileges;
Query OK, 0 rows affected (0.00 sec)

mysql> use hive;
Database changed

mysql> show tables;
Empty set (0.00 sec)

mysql> select user,host from mysql.user;
+------------+------------------------+
| user       | host                   |
+------------+------------------------+
|            | %                      |
| %          | %                      |
| retail_dba | %                      |
| root       | %                      |
| root       | 127.0.0.1              |
| retail_dba | 192.168.0.227          |
| root       | 192.168.0.227          |
| root       | localhost              |
| root       | sotrage.castrading.com |
| hive       | storage.castrading.com |
| root       | storage.castrading.com |
+------------+------------------------+
11 rows in set (0.00 sec)

mysql> show tables;
+---------------------------+
| Tables_in_hive            |
+---------------------------+
| BUCKETING_COLS            |
| CDS                       |
| COLUMNS_V2                |
| DATABASE_PARAMS           |
| DBS                       |
| DB_PRIVS                  |
| DELEGATION_TOKENS         |
| FUNCS                     |
| FUNC_RU                   |
| GLOBAL_PRIVS              |
| IDXS                      |
| INDEX_PARAMS              |
| KEY_CONSTRAINTS           |
| MASTER_KEYS               |
| NOTIFICATION_LOG          |
| NOTIFICATION_SEQUENCE     |
| NUCLEUS_TABLES            |
| PARTITIONS                |
| PARTITION_EVENTS          |
| PARTITION_KEYS            |
| PARTITION_KEY_VALS        |
| PARTITION_PARAMS          |
| PART_COL_PRIVS            |
| PART_COL_STATS            |
| PART_PRIVS                |
| ROLES                     |
| ROLE_MAP                  |
| SDS                       |
| SD_PARAMS                 |
| SEQUENCE_TABLE            |
| SERDES                    |
| SERDE_PARAMS              |
| SKEWED_COL_NAMES          |
| SKEWED_COL_VALUE_LOC_MAP  |
| SKEWED_STRING_LIST        |
| SKEWED_STRING_LIST_VALUES |
| SKEWED_VALUES             |
| SORT_COLS                 |
| TABLE_PARAMS              |
| TAB_COL_STATS             |
| TBLS                      |
| TBL_COL_PRIVS             |
| TBL_PRIVS                 |
| TYPES                     |
| TYPE_FIELDS               |
| VERSION                   |
+---------------------------+ 46 rows in set (0.00 sec) mysql> Also copied mysql-connector-java*.jar file inside /apache-hive/lib folder as well HIVE LOG OUTPUT hduser@storage ~]$ hive which: no hbase in (/home/hduser/Softwares/apache-hive-2.1.1-bin/bin:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/) Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.1.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:591) at 
org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at org.apache.hadoop.util.RunJar.main(RunJar.java:136) Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226) at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366) at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310) at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266) at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558) ... 9 more Caused by: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1654) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101) at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386) at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640) at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236) at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221) ... 14 more Caused by: java.lang.reflect.InvocationTargetException at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652) ... 23 more Caused by: javax.jdo.JDOFatalDataStoreException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://storage.castrading.com:3306/hive?createDatabaseIfNotExist=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). 
Original Exception: ------ java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES) at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873) at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4417) at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1278) at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2253) at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083) at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:806) at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:425) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361) at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416) at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483) at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606) at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301) at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133) at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965) at java.security.AccessController.doPrivileged(Native Method) at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960) at 
javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166) at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808) at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701) at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:515) at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:544) at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:399) at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:336) at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:297) at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73) at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133) at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58) at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:599) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:564) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:630) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238) at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101) at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386) at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640) at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236) at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221) at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366) at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310) at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266) at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558) at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705) at 
org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at org.apache.hadoop.util.RunJar.main(RunJar.java:136) ------ NestedThrowables: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://storage.castrading.com:3306/hive?createDatabaseIfNotExist=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES) at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873) at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4417) at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1278) at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2253) at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083) at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:806) at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:425) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361) at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416) at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483) at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606) at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301) at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133) at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821) at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965) at java.security.AccessController.doPrivileged(Native Method) at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960) at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166) at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808) at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701) at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:515) at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:544) at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:399) at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:336) at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:297) at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73) at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133) at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58) at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:599) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:564) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:630) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238) at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101) at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386) at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640) at 
org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236) at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221) at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366) at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310) at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266) at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558) at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at org.apache.hadoop.util.RunJar.main(RunJar.java:136) ------ at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:529) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:834) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965) at java.security.AccessController.doPrivileged(Native Method) at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960) at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166) at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808) at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701) at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:515) at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:544) at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:399) at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:336) at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:297) at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73) at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133) at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58) at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:599) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:564) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:630) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) at 
org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238) at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70) ... 28 more Caused by: java.sql.SQLException: Unable to open a test connection to the given database. JDBC url = jdbc:mysql://storage.castrading.com:3306/hive?createDatabaseIfNotExist=true, username = APP. Terminating connection pool (set lazyInit to true if you expect to start your database after your app). Original Exception: ------ java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES) at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873) at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4417) at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1278) at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2253) at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083) at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:806) at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:425) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361) at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416) at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483) at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606) at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301) at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133) at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821) at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:338) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:217) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965) at java.security.AccessController.doPrivileged(Native Method) at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960) at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166) at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808) at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701) at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:515) at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:544) at org.apache.hadoop.hive.metastore.ObjectStore.initializeHelper(ObjectStore.java:399) at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:336) at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:297) at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73) at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133) at org.apache.hadoop.hive.metastore.RawStoreProxy.<init>(RawStoreProxy.java:58) at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:67) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:599) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:564) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:630) at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:416) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78) at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84) at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:6490) at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:238) at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:70) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1652) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:80) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:130) at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:101) at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3367) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3406) at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3386) at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3640) at 
org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:236) at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:221) at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366) at org.apache.hadoop.hive.ql.metadata.Hive.create(Hive.java:310) at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:290) at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:266) at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:558) at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531) at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:705) at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641) at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:498) at org.apache.hadoop.util.RunJar.run(RunJar.java:221) at org.apache.hadoop.util.RunJar.main(RunJar.java:136) ------ at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.jolbox.bonecp.PoolUtil.generateSQLException(PoolUtil.java:192) at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:422) at com.jolbox.bonecp.BoneCPDataSource.getConnection(BoneCPDataSource.java:120) at org.datanucleus.store.rdbms.ConnectionFactoryImpl$ManagedConnectionImpl.getConnection(ConnectionFactoryImpl.java:483) at org.datanucleus.store.rdbms.RDBMSStoreManager.<init>(RDBMSStoreManager.java:296) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:606) at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301) at org.datanucleus.NucleusContextHelper.createStoreManagerForProperties(NucleusContextHelper.java:133) at org.datanucleus.PersistenceNucleusContextImpl.initialise(PersistenceNucleusContextImpl.java:420) at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:821) ... 
58 more Caused by: java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES) at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906) at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873) at com.mysql.jdbc.MysqlIO.secureAuth411(MysqlIO.java:4417) at com.mysql.jdbc.MysqlIO.doHandshake(MysqlIO.java:1278) at com.mysql.jdbc.ConnectionImpl.coreConnect(ConnectionImpl.java:2253) at com.mysql.jdbc.ConnectionImpl.connectOneTryOnly(ConnectionImpl.java:2284) at com.mysql.jdbc.ConnectionImpl.createNewIO(ConnectionImpl.java:2083) at com.mysql.jdbc.ConnectionImpl.<init>(ConnectionImpl.java:806) at com.mysql.jdbc.JDBC4Connection.<init>(JDBC4Connection.java:47) at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.lang.reflect.Constructor.newInstance(Constructor.java:423) at com.mysql.jdbc.Util.handleNewInstance(Util.java:425) at com.mysql.jdbc.ConnectionImpl.getInstance(ConnectionImpl.java:410) at com.mysql.jdbc.NonRegisteringDriver.connect(NonRegisteringDriver.java:328) at java.sql.DriverManager.getConnection(DriverManager.java:664) at java.sql.DriverManager.getConnection(DriverManager.java:208) at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361) at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416) ... 70 more [hduser@storage ~]$
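For what it is worth, once the javax.jdo.option.* property names are spelled correctly (see the note further up this page), schematool gives a quick confirmation of the fix: it prints the connection settings Hive actually resolved, so it should then report "Metastore connection User: hive" and a schema version instead of the access-denied error. A sketch, using the dbType from this setup:

# Verify the metastore connection after correcting hive-site.xml; this should
# now authenticate as 'hive' rather than Hive's default user 'APP'.
schematool -dbType mysql -info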
01-19-2017
03:01 AM
So you mean I have to use a third-party tool like Pentaho, Informatica, etc. for the ETL job? I thought there was a feature available inside Hadoop itself for creating a job like the one shown in this URL, but my understanding was wrong then, right?

https://drive.google.com/open?id=0B-wEtRLWeFvMMGt1LWJUbURsTDA
01-19-2017
02:58 AM
So you mean I have to use a third-party tool like Pentaho, Informatica, etc. in order to create the ETL job?
01-18-2017
12:35 AM
Hi mbigelow, I am not looking for a third-party ETL tool like Pentaho. My confusion here is whether I can design the ETL job in Apache NiFi or not. For example, the URL below contains a screenshot of an ETL job designed in Pentaho; I want to design the same kind of job in Apache NiFi as well. Is that possible? If it is, I wonder if you can recommend any URL where I can learn how to design an ETL job in Apache NiFi like the Pentaho one. Please advise.

https://drive.google.com/file/d/0B-wEtRLWeFvMMGt1LWJUbURsTDA/view

Will look forward to hearing from you. Thank you,
Ujjwal Rana
01-18-2017
12:29 AM
Hi mbigelow, I am not looking for a third-party tool like Pentaho. My only concern here is whether I can design an ETL job directly in Apache NiFi. To be clearer, below is the URL of a screenshot of an ETL job designed in Pentaho; I want to design the same job in Apache NiFi too. Is that possible? Moreover, if it is possible, I wonder if you can recommend any URL where I can learn how to design an ETL job like the one shown below.

https://drive.google.com/open?id=0B-wEtRLWeFvMMGt1LWJUbURsTDA

Will look forward to hearing from you. Thank you