Member since: 01-08-2017
- 36 Posts
- 1 Kudos Received
- 2 Solutions

My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 19528 | 09-09-2017 10:36 PM |
| | 41839 | 03-28-2017 04:13 AM |
04-05-2017 11:12 PM
Please find the output below.

First:

    [hduser@storage Desktop]$ hive
    CLI: hive
    which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)

    Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
    Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
    hive>

Second:

    [hduser@storage Desktop]$ ps aux | grep CliDriver
    hduser   29018 29.9  5.3 2249484 217232 pts/0  Sl+  14:08   0:38 /usr/local/jdk1.8.0_111/bin/java -Xmx256m -Djava.library.path=/home/hduser/hadoop-2.6.5/lib -Djava.net.preferIPv4Stack=true -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/hduser/hadoop-2.6.5/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/hduser/hadoop-2.6.5 -Dhadoop.id.str=hduser -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx512m -Dlog4j.configurationFile=hive-log4j2.properties -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-cli-2.0.1.jar org.apache.hadoop.hive.cli.CliDriver CLI: hive
    hduser   29161  0.0  0.0 103384   808 pts/3    S+   14:10   0:00 grep CliDriver
    [hduser@storage Desktop]$

Third:

    [hduser@storage Desktop]$ strings /proc/29161/environ
    strings: '/proc/29161/environ': No such file
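A note on the third command above: PID 29161 belongs to the `grep CliDriver` process itself, which has already exited by the time `strings` runs, hence the "No such file" error. A minimal sketch of reading the environment of the actual CLI process instead (assuming a Linux /proc filesystem and `pgrep` being available):

```
# Find the Hive CLI JVM (pgrep never matches itself) and dump its
# environment; /proc/<pid>/environ is NUL-separated, so translate
# the NUL bytes into newlines before filtering.
HIVE_PID=$(pgrep -f org.apache.hadoop.hive.cli.CliDriver | head -1)
tr '\0' '\n' < "/proc/${HIVE_PID}/environ" | grep -E 'HIVE|HADOOP'
```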
						
					
04-04-2017 11:38 PM
Please find the output below:

[hduser@storage Desktop]$ hive --hiveconf hive.root.logger=DEBUG,console
which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)
Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
2017-04-05T14:37:29,204  INFO [main] SessionState: Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
2017-04-05T14:37:29,208 DEBUG [main] conf.VariableSubstitution: Substitution is on: hive
2017-04-05T14:37:29,570 DEBUG [main] lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
2017-04-05T14:37:29,584 DEBUG [main] lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
2017-04-05T14:37:29,586 DEBUG [main] lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, valueName=Time, value=[GetGroups])
2017-04-05T14:37:29,591 DEBUG [main] impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
2017-04-05T14:37:29,790 DEBUG [main] security.Groups: Creating new Groups object
2017-04-05T14:37:29,797 DEBUG [main] util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
2017-04-05T14:37:29,799 DEBUG [main] util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2017-04-05T14:37:29,799 DEBUG [main] util.NativeCodeLoader: java.library.path=/home/hduser/hadoop-2.6.5/lib
2017-04-05T14:37:29,799  WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform...
using builtin-java classes where applicable  2017-04-05T14:37:29,800 DEBUG [main] util.PerformanceAdvisory: Falling back to shell based  2017-04-05T14:37:29,803 DEBUG [main] security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping  2017-04-05T14:37:29,810 DEBUG [main] security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000  2017-04-05T14:37:29,822 DEBUG [main] security.UserGroupInformation: hadoop login  2017-04-05T14:37:29,825 DEBUG [main] security.UserGroupInformation: hadoop login commit  2017-04-05T14:37:29,834 DEBUG [main] security.UserGroupInformation: using local user:UnixPrincipal: hduser  2017-04-05T14:37:29,835 DEBUG [main] security.UserGroupInformation: Using user: "UnixPrincipal: hduser" with name hduser  2017-04-05T14:37:29,835 DEBUG [main] security.UserGroupInformation: User entry: "hduser"  2017-04-05T14:37:29,836 DEBUG [main] security.UserGroupInformation: UGI loginUser:hduser (auth:SIMPLE)  2017-04-05T14:37:29,919  INFO [main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore  2017-04-05T14:37:29,961 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.storeManagerType value null from  jpox.properties with rdbms  2017-04-05T14:37:29,962 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.validateConstraints value null from  jpox.properties with false  2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.autoStartMechanismMode value null from  jpox.properties with checked  2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.validateTables value null from  jpox.properties with false  2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.Multithreaded value null from  jpox.properties with true  2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.rdbms.initializeColumnInfo value null from  jpox.properties with NONE  2017-04-05T14:37:29,964 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.cache.level2.type value null from  jpox.properties with none  2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.connectionPoolingType value null from  jpox.properties with BONECP  2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionUserName value null from  jpox.properties with hive  2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.autoCreateAll value null from  jpox.properties with false  2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.NonTransactionalRead value null from  jpox.properties with true  2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.transactionIsolation value null from  jpox.properties with read-committed  2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionURL value null from  jpox.properties with jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true  2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.validateColumns value null from  jpox.properties with false  2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.identifierFactory value null from  jpox.properties with datanucleus1  
2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.PersistenceManagerFactoryClass value null from  jpox.properties with org.datanucleus.api.jdo.JDOPersistenceManagerFactory  2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.cache.level2 value null from  jpox.properties with false  2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.rdbms.useLegacyNativeValueStrategy value null from  jpox.properties with true  2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding hive.metastore.integral.jdo.pushdown value null from  jpox.properties with false  2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.DetachAllOnCommit value null from  jpox.properties with true  2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionDriverName value null from  jpox.properties with org.apache.derby.jdbc.EmbeddedDriver  2017-04-05T14:37:29,972 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.plugin.pluginRegistryBundleCheck value null from  jpox.properties with LOG  2017-04-05T14:37:30,025 DEBUG [main] metastore.ObjectStore: datanucleus.schema.autoCreateAll = false  2017-04-05T14:37:30,025 DEBUG [main] metastore.ObjectStore: datanucleus.schema.validateTables = false  2017-04-05T14:37:30,025 DEBUG [main] metastore.ObjectStore: datanucleus.rdbms.useLegacyNativeValueStrategy = true  2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.schema.validateColumns = false  2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: hive.metastore.integral.jdo.pushdown = false  2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.autoStartMechanismMode = checked  2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.rdbms.initializeColumnInfo = NONE  2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: javax.jdo.option.Multithreaded = true  2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.identifierFactory = datanucleus1  2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.transactionIsolation = read-committed  2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: javax.jdo.option.ConnectionURL = jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true  2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: javax.jdo.option.DetachAllOnCommit = true  2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: javax.jdo.option.NonTransactionalRead = true  2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: javax.jdo.option.ConnectionDriverName = org.apache.derby.jdbc.EmbeddedDriver  2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: datanucleus.schema.validateConstraints = false  2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: javax.jdo.option.ConnectionUserName = hive  2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: datanucleus.cache.level2 = false  2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: datanucleus.plugin.pluginRegistryBundleCheck = LOG  2017-04-05T14:37:30,028 DEBUG [main] metastore.ObjectStore: datanucleus.cache.level2.type = none  2017-04-05T14:37:30,028 DEBUG [main] metastore.ObjectStore: javax.jdo.PersistenceManagerFactoryClass = org.datanucleus.api.jdo.JDOPersistenceManagerFactory  2017-04-05T14:37:30,029 DEBUG [main] metastore.ObjectStore: datanucleus.storeManagerType = rdbms  2017-04-05T14:37:30,029 DEBUG [main] metastore.ObjectStore: 
datanucleus.connectionPoolingType = BONECP  2017-04-05T14:37:30,029  INFO [main] metastore.ObjectStore: ObjectStore, initialize called  2017-04-05T14:37:31,267 DEBUG [main] bonecp.BoneCPDataSource: JDBC URL = jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true, Username = hive, partitions = 1, max (per partition) = 10, min (per partition) = 0, idle max age = 60 min, idle test period = 240 min, strategy = DEFAULT  2017-04-05T14:37:32,132  INFO [main] metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"  2017-04-05T14:37:34,938 DEBUG [main] bonecp.BoneCPDataSource: JDBC URL = jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true, Username = hive, partitions = 1, max (per partition) = 10, min (per partition) = 0, idle max age = 60 min, idle test period = 240 min, strategy = DEFAULT  2017-04-05T14:37:35,150 DEBUG [main] metastore.MetaStoreDirectSql: Direct SQL query in 1.803389ms + 0.059481ms, the query is [SET @@session.sql_mode=ANSI_QUOTES]  2017-04-05T14:37:35,175  INFO [main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL  2017-04-05T14:37:35,185 DEBUG [main] metastore.ObjectStore: RawStore: org.apache.hadoop.hive.metastore.ObjectStore@7103ab0, with PersistenceManager: org.datanucleus.api.jdo.JDOPersistenceManager@b0964b2 created in the thread with id: 1  2017-04-05T14:37:35,185  INFO [main] metastore.ObjectStore: Initialized ObjectStore  2017-04-05T14:37:35,345 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:7234)  2017-04-05T14:37:35,418 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 0, isactive true at:      org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:7247)  2017-04-05T14:37:35,442 DEBUG [main] metastore.ObjectStore: Found expected HMS version of 2.1.0  2017-04-05T14:37:35,453 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.start(ObjectStore.java:2502)  2017-04-05T14:37:35,461 DEBUG [main] metastore.MetaStoreDirectSql: Direct SQL query in 1.297211ms + 0.019608ms, the query is [SET @@session.sql_mode=ANSI_QUOTES]  2017-04-05T14:37:35,500 DEBUG [main] metastore.MetaStoreDirectSql: getDatabase: directsql returning db default locn[hdfs://storage.castrading.com:9000/user/hive/warehouse] desc [Default Hive database] owner [public] ownertype [ROLE]  2017-04-05T14:37:35,503 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 0, isactive true at:      org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.commit(ObjectStore.java:2552)  2017-04-05T14:37:35,505 DEBUG [main] metastore.ObjectStore: db details for db default retrieved using SQL in 51.37478ms  2017-04-05T14:37:35,506 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3313)  2017-04-05T14:37:35,506 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3670)  2017-04-05T14:37:35,546 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at:      org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3676)  2017-04-05T14:37:35,550 DEBUG [main] 
metastore.ObjectStore: Rollback transaction, isActive: true at:      org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3325)  2017-04-05T14:37:35,555 DEBUG [main] metastore.HiveMetaStore: admin role already exists  InvalidObjectException(message:Role admin already exists.)      at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3316)      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)      at java.lang.reflect.Method.invoke(Method.java:498)      at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)      at com.sun.proxy.$Proxy21.addRole(Unknown Source)      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:580)      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:569)      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:371)      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)      at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)      at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219)      at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67)      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)      at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548)      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)      at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)      at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108)      at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3349)      at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)      at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:204)      at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:331)      at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:292)      at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:262)      at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:247)      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543)      at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516)      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)      at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)      at java.lang.reflect.Method.invoke(Method.java:498)      at org.apache.hadoop.util.RunJar.run(RunJar.java:221)      at org.apache.hadoop.util.RunJar.main(RunJar.java:136)  2017-04-05T14:37:35,560  INFO [main] metastore.HiveMetaStore: Added admin role in metastore  2017-04-05T14:37:35,562 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3313)  2017-04-05T14:37:35,562 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3670)  2017-04-05T14:37:35,566 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at:      org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3676)  2017-04-05T14:37:35,567 DEBUG [main] metastore.ObjectStore: Rollback transaction, isActive: true at:      org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3325)  2017-04-05T14:37:35,569 DEBUG [main] metastore.HiveMetaStore: public role already exists  InvalidObjectException(message:Role public already exists.)      at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3316)      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)      at java.lang.reflect.Method.invoke(Method.java:498)      at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)      at com.sun.proxy.$Proxy21.addRole(Unknown Source)      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:589)      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:569)      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:371)      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)      at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)      at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219)      at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67)      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)      at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548)      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)      at 
org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)      at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108)      at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3349)      at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)      at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:204)      at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:331)      at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:292)      at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:262)      at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:247)      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543)      at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516)      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)      at java.lang.reflect.Method.invoke(Method.java:498)      at org.apache.hadoop.util.RunJar.run(RunJar.java:221)      at org.apache.hadoop.util.RunJar.main(RunJar.java:136)  2017-04-05T14:37:35,570  INFO [main] metastore.HiveMetaStore: Added public role in metastore  2017-04-05T14:37:35,590 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:4063)  2017-04-05T14:37:35,590 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3670)  2017-04-05T14:37:35,593 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at:      org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3676)  2017-04-05T14:37:35,594 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore.listPrincipalMGlobalGrants(ObjectStore.java:4579)  2017-04-05T14:37:35,620 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at:      org.apache.hadoop.hive.metastore.ObjectStore.listPrincipalMGlobalGrants(ObjectStore.java:4587)  2017-04-05T14:37:35,621 DEBUG [main] metastore.ObjectStore: Rollback transaction, isActive: true at:      org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:4266)  2017-04-05T14:37:35,623 DEBUG [main] metastore.HiveMetaStore: Failed while granting global privs to admin  InvalidObjectException(message:All is already granted by admin)      at org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:4099)      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)      at java.lang.reflect.Method.invoke(Method.java:498)      at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)      at com.sun.proxy.$Proxy21.grantPrivileges(Unknown Source)      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:603)      at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:569)      at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:371)      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)      at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)      at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)      at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219)      at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67)      at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)      at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)      at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)      at java.lang.reflect.Constructor.newInstance(Constructor.java:423)      at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548)      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)      at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)      at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)      at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108)      at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3349)      at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)      at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:204)      at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:331)      at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:292)      at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:262)      at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:247)      at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543)      at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516)      at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)      at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)      at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)      at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)      at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)      at java.lang.reflect.Method.invoke(Method.java:498)      at org.apache.hadoop.util.RunJar.run(RunJar.java:221)      at org.apache.hadoop.util.RunJar.main(RunJar.java:136)  2017-04-05T14:37:35,627  INFO [main] metastore.HiveMetaStore: No user is added in admin role, since config is empty  2017-04-05T14:37:35,891  INFO [main] metastore.HiveMetaStore: 0: get_all_functions  2017-04-05T14:37:35,895  INFO [main] HiveMetaStore.audit: ugi=hduser    ip=unknown-ip-addr    cmd=get_all_functions      2017-04-05T14:37:35,896 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:      org.apache.hadoop.hive.metastore.ObjectStore.getAllFunctions(ObjectStore.java:7549)  2017-04-05T14:37:35,916 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 0, isactive true at:      
org.apache.hadoop.hive.metastore.ObjectStore.getAllFunctions(ObjectStore.java:7553)  2017-04-05T14:37:36,238 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false  2017-04-05T14:37:36,240 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false  2017-04-05T14:37:36,240 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false  2017-04-05T14:37:36,240 DEBUG [main] hdfs.BlockReaderLocal: dfs.domain.socket.path =  2017-04-05T14:37:36,281 DEBUG [main] hdfs.DFSClient: No KeyProvider found.  2017-04-05T14:37:36,454 DEBUG [main] retry.RetryUtils: multipleLinearRandomRetry = null  2017-04-05T14:37:36,511 DEBUG [main] ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@ca93621  2017-04-05T14:37:36,532 DEBUG [main] ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@54bca971  2017-04-05T14:37:37,431 DEBUG [main] util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.  2017-04-05T14:37:37,440 DEBUG [main] sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection  2017-04-05T14:37:37,479 DEBUG [main] ipc.Client: The ping interval is 60000 ms.  2017-04-05T14:37:37,481 DEBUG [main] ipc.Client: Connecting to storage.castrading.com/192.168.0.227:9000  2017-04-05T14:37:37,524 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser: starting, having connections 1  2017-04-05T14:37:37,526 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #0  2017-04-05T14:37:37,537 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #0  2017-04-05T14:37:37,538 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 91ms  2017-04-05T14:37:37,600 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #1  2017-04-05T14:37:37,603 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #1  2017-04-05T14:37:37,604 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 5ms  2017-04-05T14:37:37,606 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #2  2017-04-05T14:37:37,608 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #2  2017-04-05T14:37:37,611 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 5ms  2017-04-05T14:37:37,616 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #3  2017-04-05T14:37:37,617 DEBUG [IPC Client (409778321) connection to 
storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #3  2017-04-05T14:37:37,619 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 3ms  2017-04-05T14:37:37,620 DEBUG [main] hdfs.DFSClient: /tmp/hive/hduser/10f8dcc5-0c5b-479d-92f3-87a848c6d188: masked=rwx------  2017-04-05T14:37:37,624 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #4  2017-04-05T14:37:37,630 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #4  2017-04-05T14:37:37,636 DEBUG [main] ipc.ProtobufRpcEngine: Call: mkdirs took 13ms  2017-04-05T14:37:37,642 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #5  2017-04-05T14:37:37,643 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #5  2017-04-05T14:37:37,643 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms  2017-04-05T14:37:37,662 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #6  2017-04-05T14:37:37,663 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #6  2017-04-05T14:37:37,667 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 8ms  2017-04-05T14:37:37,667 DEBUG [main] hdfs.DFSClient: /tmp/hive/hduser/10f8dcc5-0c5b-479d-92f3-87a848c6d188/_tmp_space.db: masked=rwx------  2017-04-05T14:37:37,668 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #7  2017-04-05T14:37:37,670 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #7  2017-04-05T14:37:37,670 DEBUG [main] ipc.ProtobufRpcEngine: Call: mkdirs took 3ms  2017-04-05T14:37:37,672 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #8  2017-04-05T14:37:37,675 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #8  2017-04-05T14:37:37,675 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms  Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.  hive> 
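One detail worth noticing in the debug output above: the effective javax.jdo.option.ConnectionDriverName is still org.apache.derby.jdbc.EmbeddedDriver even though the ConnectionURL points at MySQL. A quick sketch for pulling just the effective metastore connection settings out of a debug run like this one (the property names are the ones shown in the log above):

```
# Filter a DEBUG-level startup for the metastore connection properties;
# mismatched driver/URL/username combinations show up immediately.
hive --hiveconf hive.root.logger=DEBUG,console 2>&1 \
  | grep -E 'ConnectionURL|ConnectionDriverName|ConnectionUserName'
```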
						
					
04-04-2017 04:01 AM
Please find the details of my .bashrc file below. Also, I do have the hive-exec-2.0.1.jar file inside the lib folder. In your previous reply you mentioned the SNAPSHOT filename ($HIVE_HOME/lib/hive-exec-2.2.0-SNAPSHOT.jar), which is not available here, but hive-exec-2.0.1.jar is.

    # .bashrc

    # Source global definitions
    if [ -f /etc/bashrc ]; then
            . /etc/bashrc
    fi

    # Set Hadoop-related environment variables
    #export HADOOP_HOME=/home/hduser/hadoop
    export HADOOP_HOME=/home/hduser/hadoop-2.6.5
    export HADOOP_INSTALL=/home/hduser/hadoop-2.6.5

    # Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)
    export JAVA_HOME=/usr/local/jdk1.8.0_111
    export PATH=$PATH:$JAVA_HOME/bin
    PATH=$PATH:$HOME/bin
    export PATH

    # Some convenient aliases and functions for running Hadoop-related commands
    unalias fs &> /dev/null
    alias fs="hadoop fs"
    unalias hls &> /dev/null
    alias hls="fs -ls"

    # If you have LZO compression enabled in your Hadoop cluster and
    # compress job outputs with LZOP (not covered in this tutorial):
    # Conveniently inspect an LZOP compressed file from the command
    # line; run via:
    #
    # $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
    #
    # Requires installed 'lzop' command.
    #
    lzohead () {
        hadoop fs -cat $1 | lzop -dc | head -1000 | less
    }

    # Add Hadoop bin/ directory to PATH
    export PATH=$PATH:$HADOOP_HOME/bin

    # Add Pig bin/ directory to PATH
    export PIG_HOME=/home/hduser/pig-0.15.0
    export PATH=$PATH:$PIG_HOME/bin

    # User specific aliases and functions
    export HADOOP_INSTALL=$HADOOP_HOME
    export HADOOP_MAPRED_HOME=$HADOOP_HOME
    export HADOOP_COMMON_HOME=$HADOOP_HOME
    export HADOOP_HDFS_HOME=$HADOOP_HOME
    export YARN_HOME=$HADOOP_HOME
    export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
    export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
    export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
    export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

    export SCALA_HOME=/home/hduser/scala/
    export PATH=$PATH:$SCALA_HOME:/bin/

    # Add Sqoop bin/ directory to PATH
    export SQOOP_HOME=/home/hduser/Softwares/sqoop
    export PATH=$PATH:$SQOOP_HOME/bin/

    # Add Hive bin/ directory to PATH
    export HIVE_HOME=/home/hduser/Softwares/apache-hive-2.0.1-bin
    export PATH=$PATH:$HIVE_HOME/bin/
    export HIVE_CONF_DIR=$HIVE_HOME/conf
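Since the thread hinges on which Hive jars are actually on the classpath, a quick listing like the following may help (a sketch; it uses the HIVE_HOME exported in the .bashrc above):

```
# List the hive-exec and hive-common jars shipped in this install;
# a version mismatch between them (e.g. 2.0.1 vs 2.1.1, as the startup
# banners in this thread suggest) is a common cause of NoClassDefFoundError.
ls -l "$HIVE_HOME"/lib/hive-exec-*.jar "$HIVE_HOME"/lib/hive-common-*.jar
```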
						
					
04-04-2017 03:26 AM
Nope, I don't have that file inside the lib folder. I am using the apache-hive-2.0.1-bin version, so I downloaded the tar.gz file of that version, but I did not find the jar file you mentioned. Can you recommend a URL where I can download the file for apache-hive-2.0.1-bin?

Yes, I installed Hadoop and Hive manually.
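For reference, Apache keeps all old Hive releases in its archive, including 2.0.1; a sketch of re-fetching the official tarball and checking its lib folder (the URL follows the standard Apache archive layout, so worth verifying in a browser first):

```
# Fetch the official apache-hive-2.0.1-bin release from the Apache
# archive and list the hive-exec jar it actually ships.
wget https://archive.apache.org/dist/hive/hive-2.0.1/apache-hive-2.0.1-bin.tar.gz
tar -xzf apache-hive-2.0.1-bin.tar.gz
ls apache-hive-2.0.1-bin/lib | grep hive-exec
```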
						
					
04-03-2017 11:11 PM
I am a bit confused. Do you mean to run those commands after I log in to Hive?

    [hduser@storage Desktop]$ jps
    7041 NameNode
    7891 NodeManager
    7143 DataNode
    7928 Jps
    7291 SecondaryNameNode
    7789 ResourceManager

    [hduser@storage Desktop]$ hive
    which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)

    Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
    Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
    hive> ps aux | grep HiveServer2 ;
    NoViableAltException(26@[])
        at org.apache.hadoop.hive.ql.parse.HiveParser.statement(HiveParser.java:1099)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:204)
        at org.apache.hadoop.hive.ql.parse.ParseDriver.parse(ParseDriver.java:166)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:440)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:319)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1249)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1295)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1178)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1166)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:236)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:782)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:721)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    FAILED: ParseException line 1:0 cannot recognize input near 'ps' 'aux' '|'
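The ParseException at the end happens because ps aux is a shell command, not HiveQL, so the Hive parser rejects it. It needs to run in a regular terminal, or through the Hive CLI's shell escape, which forwards commands prefixed with "!" to the shell (a sketch; pipe behaviour through the escape can vary by version):

```
# Run from a second terminal, outside the hive> prompt:
ps aux | grep HiveServer2

# Or, from inside the Hive CLI, via the "!" shell escape
# (typed at the hive> prompt):
#   !ps aux;
```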
						
					
04-03-2017 10:13 PM
Please find the output below:

    [hduser@storage Desktop]$ ps aux | grep HiveServer2
    hduser    4479  0.0  0.0 103384   812 pts/0    S+   13:08   0:00 grep HiveServer2

    [root@storage ~]# strings /proc/4479/environ
    strings: '/proc/4479/environ': No such file
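Note that the single match above is the grep command itself, which means no HiveServer2 process was actually running, and PID 4479 was gone by the time strings ran. Two common ways to avoid matching the grep process (a sketch):

```
# Bracket trick: the pattern [H]iveServer2 does not match the grep
# command line itself, so an empty result really means "not running".
ps aux | grep '[H]iveServer2'

# pgrep matches against the full command line and never matches itself.
pgrep -fl HiveServer2
```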
						
					
03-30-2017 10:10 PM
Hello everyone,

When triggering a SQL SELECT statement inside Hive I get the following error messages. The SQL statement and the output are shown below. Any suggestion will be highly appreciated.

    hive (default)> show tables;
    OK
    order_items
    Time taken: 0.35 seconds, Fetched: 1 row(s)
    hive (default)> select count(1) from order_items;
    Exception in thread "d413467f-6da8-4ebc-bf93-730e15b4b23f main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/io/HdfsUtils$HadoopFileStatus
        at org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:545)
        at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:237)
        at org.apache.hadoop.hive.ql.Context.getExtTmpPathRelTo(Context.java:429)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFileSinkPlan(SemanticAnalyzer.java:6437)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8961)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8850)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9703)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9596)
        at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:291)
        at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10103)
        at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:228)
        at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:239)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:473)
        at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:319)
        at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1249)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1295)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1178)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1166)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:236)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:782)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:721)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:498)
        at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.io.HdfsUtils$HadoopFileStatus
        at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
        at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
        at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
        ... 30 more
    [hduser@storage Softwares]$
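A NoClassDefFoundError like this usually points at mismatched Hive jars on the classpath; note that the startup banners earlier in this thread show hive-common-2.1.1.jar being loaded from an apache-hive-2.0.1-bin install. A sketch for finding which jar, if any, contains the missing class (assumes HIVE_HOME points at the install, as in the .bashrc shown earlier):

```
# Search every jar in the Hive lib directory for the missing class;
# if nothing prints, the class is simply absent from this install.
for j in "$HIVE_HOME"/lib/*.jar; do
  unzip -l "$j" 2>/dev/null | grep -q 'hive/io/HdfsUtils' && echo "$j"
done
```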
						
					
Labels:
- Apache Hive
- Apache Spark
03-28-2017 04:13 AM (1 Kudo)
Hello everyone,

I am now able to solve the Hive issue. Below is what I did. Please do not hesitate to ask if the steps below do not work for you.

Step 1: Make sure you have followed the following steps accordingly. In my case I had created hive-site.xml manually ([hduser@storage conf]$ vi hive-site.xml), so I deleted hive-site.xml again and regenerated it from the template:

    [hduser@storage conf]$ pwd
    /home/hduser/Softwares/apache-hive-2.0.1-bin/conf
    [hduser@storage conf]$ cd /home/hduser/Softwares/apache-hive-2.0.1-bin/conf
    [hduser@storage conf]$ pwd
    /home/hduser/Softwares/apache-hive-2.0.1-bin/conf
    [hduser@storage conf]$ cp hive-default.xml.template hive-site.xml

Step 2: Create the directory /tmp/hive.

Step 3: Now edit the hive-site.xml created in Step 1 and make sure the following information is present:

    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true</value>
        <description>JDBC connect string for a JDBC metastore</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>Driver class name for a JDBC metastore</description>
    </property>
    <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>
        <description>location of default database for the warehouse</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionUserName</name>
        <value>hive</value>
        <description>Username to use against metastore database</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionPassword</name>
        <value>hive</value>
        <description>password to use against metastore database</description>
    </property>
    <property>
        <name>hive.querylog.location</name>
        <value>/tmp/hive</value>
        <description>Location of Hive run time structured log file</description>
    </property>
    <property>
        <name>hive.exec.local.scratchdir</name>
        <value>/tmp/hive</value>
        <description>Local scratch space for Hive jobs</description>
    </property>
    <property>
        <name>hive.downloaded.resources.dir</name>
        <value>/tmp/hive</value>
        <description>Temporary local directory for added resources in the remote file system.</description>
    </property>

Step 4: Make sure you have added the following line inside hive-env.sh (note: define the HADOOP_HOME location and version according to your install):

    export HADOOP_HOME=/home/hduser/hadoop-2.6.5

Step 5: Check the jps status and then try starting Hive:

    Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
    Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
    hive>
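If Hive still fails to start after these steps, the metastore schema itself may be out of sync with the client. As a supplementary check (not part of the original steps; assumes $HIVE_HOME/bin is on the PATH as in the .bashrc earlier), Hive's schematool can report and initialize the schema:

```
# Report the metastore schema version recorded in MySQL versus the
# version the local Hive client expects.
schematool -dbType mysql -info

# For a brand-new, empty metastore database only:
schematool -dbType mysql -initSchema
```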
						
					
03-24-2017 04:44 AM
Hi again,

As per the URL you gave, I triggered the MySQL statements accordingly, but I am still getting the same issue. Below is what I did. Please help; I have been stuck for almost three weeks now and will wait for your feedback.

    mysql> grant all on *.* to 'hive'@'192.168.0.227' identified by 'hive';
    Query OK, 0 rows affected (0.00 sec)

    mysql> flush privileges;
    Query OK, 0 rows affected (0.00 sec)

    [hduser@storage lib]$ mysql -u hive -h 192.168.0.227 -p
    Enter password:
    Welcome to the MySQL monitor.  Commands end with ; or \g.
    Your MySQL connection id is 13
    Server version: 5.1.73 Source distribution

    Copyright (c) 2000, 2013, Oracle and/or its affiliates. All rights reserved.

    Oracle is a registered trademark of Oracle Corporation and/or its
    affiliates. Other names may be trademarks of their respective
    owners.

    Type 'help;' or '\h' for help. Type '\c' to clear the current input statement.

    mysql>

    [hduser@storage Desktop]$ hive
    which: no hbase in (/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.1.1-bin/bin/)

    Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.1.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
    Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
            at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
    Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient
        at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:226)
        at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:366)
    Caused by: java.sql.SQLException: Access denied for user 'APP'@'storage' (using password: YES)
        at com.mysql.jdbc.SQLError.createSQLException(SQLError.java:964)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3970)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:3906)
        at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:873)
        at java.sql.DriverManager.getConnection(DriverManager.java:208)
        at com.jolbox.bonecp.BoneCP.obtainRawInternalConnection(BoneCP.java:361)
        at com.jolbox.bonecp.BoneCP.<init>(BoneCP.java:416)
        ... 70 more
    [hduser@storage Desktop]$

Also, please find the output of hive-site.xml again:

    <?xml version="1.0" encoding="UTF-8"?>
    <?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
    <!--
      Licensed under the Apache License, Version 2.0 (the "License");
      you may not use this file except in compliance with the License.
      You may obtain a copy of the License at

          http://www.apache.org/licenses/LICENSE-2.0

      Unless required by applicable law or agreed to in writing, software
      distributed under the License is distributed on an "AS IS" BASIS,
      WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
      See the License for the specific language governing permissions and
      limitations under the License. See accompanying LICENSE file.
    -->

    <!-- Put site-specific property overrides in this file. -->
    <configuration>
    <property>
        <name>javax.jdo.option.ConnectionURL</name>
        <value>jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true</value>
        <description>JDBC connect string for a JDBC metastore</description>
    </property>
    <property>
        <name>javax.jdo.option.ConnectionDriverName</name>
        <value>com.mysql.jdbc.Driver</value>
        <description>Driver class name for a JDBC metastore</description>
    </property>
    <property>
        <name>hive.metastore.warehouse.dir</name>
        <value>/user/hive/warehouse</value>
        <description>location of default database for the warehouse</description>
    </property>
    <property>
        <name>javax.jdo.optioin.ConnectionUserName</name>
        <value>hive</value>
        <description>MYSQL username</description>
    </property>
    <property>
        <name>javax.jdo.optioin.ConnectionPassword</name>
        <value>hive</value>
        <description>MYSQL Password</description>
    </property>
    <property>
        <name>datanucleus.autoCreateSchema</name>
        <value>true</value>
    </property>
    <property>
        <name>datanucleus.fixedDatastore</name>
        <value>true</value>
    </property>
    <property>
        <name>datanucleus.autoCreateTables</name>
        <value>True</value>
    </property>
    </configuration>
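One clue in the stack trace above: the metastore connection is attempted as user 'APP', the DataNucleus/Derby default, which suggests the username and password properties are not being picked up from hive-site.xml at all. A quick sketch for eyeballing the javax.jdo property names the file actually defines (HIVE_CONF_DIR as exported in the .bashrc shown earlier):

```
# Print every javax.jdo property name in hive-site.xml; compare them
# character by character against the documented spellings.
grep -n 'javax.jdo' "$HIVE_CONF_DIR"/hive-site.xml
```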
						
					