SELECT query inside Hive not working

Contributor

Hello Everyone,

When I run a SQL SELECT statement inside Hive, I get the error message shown below. I have included the statement and the full output. Any suggestions will be highly appreciated.

 

hive (default)> show tables;
OK
order_items
Time taken: 0.35 seconds, Fetched: 1 row(s)
hive (default)> select count(1) from order_items;
Exception in thread "d413467f-6da8-4ebc-bf93-730e15b4b23f main" java.lang.NoClassDefFoundError: org/apache/hadoop/hive/io/HdfsUtils$HadoopFileStatus
    at org.apache.hadoop.hive.common.FileUtils.mkdir(FileUtils.java:545)
    at org.apache.hadoop.hive.ql.Context.getStagingDir(Context.java:237)
    at org.apache.hadoop.hive.ql.Context.getExtTmpPathRelTo(Context.java:429)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genFileSinkPlan(SemanticAnalyzer.java:6437)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPostGroupByBodyPlan(SemanticAnalyzer.java:8961)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genBodyPlan(SemanticAnalyzer.java:8850)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9703)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.genPlan(SemanticAnalyzer.java:9596)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.genOPTree(CalcitePlanner.java:291)
    at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:10103)
    at org.apache.hadoop.hive.ql.parse.CalcitePlanner.analyzeInternal(CalcitePlanner.java:228)
    at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:239)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:473)
    at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:319)
    at org.apache.hadoop.hive.ql.Driver.compileInternal(Driver.java:1249)
    at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1295)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1178)
    at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1166)
    at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:236)
    at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:187)
    at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:403)
    at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:782)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:721)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.io.HdfsUtils$HadoopFileStatus
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 30 more
[hduser@storage Softwares]$
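
A NoClassDefFoundError like this normally means that org/apache/hadoop/hive/io/HdfsUtils is not on the classpath the CLI actually loaded. A minimal way to check which jar (if any) under $HIVE_HOME/lib ships that class, assuming the JDK's jar tool is on the PATH:

# scan every Hive jar for the missing class (illustrative sketch)
for j in $HIVE_HOME/lib/*.jar; do
    jar tf "$j" | grep -q 'org/apache/hadoop/hive/io/HdfsUtils' && echo "found in: $j"
done

If no jar reports the class, that usually points to jars from different Hive releases being mixed in the same lib directory, which is worth ruling out.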

1 ACCEPTED SOLUTION

Contributor

I was able to solve the issue. I simply uninstalled and re-installed Hive, and with that it works. The SELECT query now returns results without any issue.
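
For anyone hitting the same problem, a rough outline of such a clean re-install (a sketch only; the tarball name and target directory are illustrative and should match the release you actually download):

# remove the broken installation and unpack a freshly downloaded release
rm -rf /home/hduser/Softwares/apache-hive-2.0.1-bin
tar -xzf apache-hive-2.0.1-bin.tar.gz -C /home/hduser/Softwares/
# HIVE_HOME in .bashrc should point at the new directory; reload the shell settings
source ~/.bashrc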


24 REPLIES

Contributor

Please find the details of the .bashrc file below. Also, I do have the hive-exec-2.0.1.jar file inside the lib folder. In your previous reply you mentioned the hive-exec SNAPSHOT filename ($HIVE_HOME/lib/hive-exec-2.2.0-SNAPSHOT.jar), which is not available, but hive-exec-2.0.1.jar is.

 

# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
        . /etc/bashrc
fi

# Set Hadoop-related environment variables
#export HADOOP_HOME=/home/hduser/hadoop
export HADOOP_HOME=/home/hduser/hadoop-2.6.5
export HADOOP_INSTALL=/home/hduser/hadoop-2.6.5

#Set JAVA_HOME (we will also configure JAVA_HOME directly for Hadoop later on)

export JAVA_HOME=/usr/local/jdk1.8.0_111
export PATH=$PATH:$JAVA_HOME/bin
PATH=$PATH:$HOME/bin
export PATH


# Some convenient aliases and functions for running Hadoop-related commands
unalias fs &> /dev/null
alias fs="hadoop fs"
unalias hls &> /dev/null
alias hls="fs -ls"

# If you have LZO compression enabled in your Hadoop cluster and
# compress job outputs with LZOP (not covered in this tutorial):
# Conveniently inspect an LZOP compressed file from the command
# line; run via:
#
# $ lzohead /hdfs/path/to/lzop/compressed/file.lzo
#
# Requires installed 'lzop' command.

 

#
lzohead () {
    hadoop fs -cat $1 | lzop -dc | head -1000 | less
}

# Add Hadoop bin/ directory to PATH
export PATH=$PATH:$HADOOP_HOME/bin

# Add Pig bin / directory to PATH
export PIG_HOME=/home/hduser/pig-0.15.0
export PATH=$PATH:$PIG_HOME/bin

# User specific aliases and functions

export HADOOP_INSTALL=$HADOOP_HOME
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

export SCALA_HOME=/home/hduser/scala/
export PATH=$PATH:$SCALA_HOME:/bin/

# Add Sqoop bin / directory to PATH
export SQOOP_HOME=/home/hduser/Softwares/sqoop
export PATH=$PATH:$SQOOP_HOME/bin/

# Add Hive bin / directory to PATH
export HIVE_HOME=/home/hduser/Softwares/apache-hive-2.0.1-bin
export PATH=$PATH:$HIVE_HOME/bin/
export HIVE_CONF_DIR=$HIVE_HOME/conf
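
As a quick cross-check of what is actually sitting in that lib folder, the core Hive jars can be listed so that any mixed versions stand out (a minimal sketch using standard tools):

# list the main Hive jars and their versions in one glance
ls $HIVE_HOME/lib | grep -E 'hive-(common|exec|cli|metastore)-'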

 

Super Guru
The .bashrc looks all right. Please run the command below:

hive --hiveconf hive.root.logger=DEBUG,console

and then copy and paste the output here for review.
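
Since the DEBUG output can be very long, it may help to capture it to a file at the same time (a small convenience, assuming tee is available):

# keep a copy of everything printed to the console while still seeing it live
hive --hiveconf hive.root.logger=DEBUG,console 2>&1 | tee hive-debug.log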

Contributor

Please find the output below:

 

[hduser@storage Desktop]$ hive --hiveconf hive.root.logger=DEBUG,console
which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)

Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
2017-04-05T14:37:29,204  INFO [main] SessionState:
Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
2017-04-05T14:37:29,208 DEBUG [main] conf.VariableSubstitution: Substitution is on: hive
2017-04-05T14:37:29,570 DEBUG [main] lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginSuccess with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, valueName=Time, value=[Rate of successful kerberos logins and latency (milliseconds)])
2017-04-05T14:37:29,584 DEBUG [main] lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.loginFailure with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, valueName=Time, value=[Rate of failed kerberos logins and latency (milliseconds)])
2017-04-05T14:37:29,586 DEBUG [main] lib.MutableMetricsFactory: field org.apache.hadoop.metrics2.lib.MutableRate org.apache.hadoop.security.UserGroupInformation$UgiMetrics.getGroups with annotation @org.apache.hadoop.metrics2.annotation.Metric(sampleName=Ops, always=false, about=, type=DEFAULT, valueName=Time, value=[GetGroups])
2017-04-05T14:37:29,591 DEBUG [main] impl.MetricsSystemImpl: UgiMetrics, User and group related metrics
2017-04-05T14:37:29,790 DEBUG [main] security.Groups:  Creating new Groups object
2017-04-05T14:37:29,797 DEBUG [main] util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
2017-04-05T14:37:29,799 DEBUG [main] util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
2017-04-05T14:37:29,799 DEBUG [main] util.NativeCodeLoader: java.library.path=/home/hduser/hadoop-2.6.5/lib
2017-04-05T14:37:29,799  WARN [main] util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2017-04-05T14:37:29,800 DEBUG [main] util.PerformanceAdvisory: Falling back to shell based
2017-04-05T14:37:29,803 DEBUG [main] security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.ShellBasedUnixGroupsMapping
2017-04-05T14:37:29,810 DEBUG [main] security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
2017-04-05T14:37:29,822 DEBUG [main] security.UserGroupInformation: hadoop login
2017-04-05T14:37:29,825 DEBUG [main] security.UserGroupInformation: hadoop login commit
2017-04-05T14:37:29,834 DEBUG [main] security.UserGroupInformation: using local user:UnixPrincipal: hduser
2017-04-05T14:37:29,835 DEBUG [main] security.UserGroupInformation: Using user: "UnixPrincipal: hduser" with name hduser
2017-04-05T14:37:29,835 DEBUG [main] security.UserGroupInformation: User entry: "hduser"
2017-04-05T14:37:29,836 DEBUG [main] security.UserGroupInformation: UGI loginUser:hduser (auth:SIMPLE)
2017-04-05T14:37:29,919  INFO [main] metastore.HiveMetaStore: 0: Opening raw store with implementation class:org.apache.hadoop.hive.metastore.ObjectStore
2017-04-05T14:37:29,961 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.storeManagerType value null from  jpox.properties with rdbms
2017-04-05T14:37:29,962 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.validateConstraints value null from  jpox.properties with false
2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.autoStartMechanismMode value null from  jpox.properties with checked
2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.validateTables value null from  jpox.properties with false
2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.Multithreaded value null from  jpox.properties with true
2017-04-05T14:37:29,963 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.rdbms.initializeColumnInfo value null from  jpox.properties with NONE
2017-04-05T14:37:29,964 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.cache.level2.type value null from  jpox.properties with none
2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.connectionPoolingType value null from  jpox.properties with BONECP
2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionUserName value null from  jpox.properties with hive
2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.autoCreateAll value null from  jpox.properties with false
2017-04-05T14:37:29,966 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.NonTransactionalRead value null from  jpox.properties with true
2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.transactionIsolation value null from  jpox.properties with read-committed
2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionURL value null from  jpox.properties with jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true
2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.schema.validateColumns value null from  jpox.properties with false
2017-04-05T14:37:29,967 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.identifierFactory value null from  jpox.properties with datanucleus1
2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.PersistenceManagerFactoryClass value null from  jpox.properties with org.datanucleus.api.jdo.JDOPersistenceManagerFactory
2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.cache.level2 value null from  jpox.properties with false
2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.rdbms.useLegacyNativeValueStrategy value null from  jpox.properties with true
2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding hive.metastore.integral.jdo.pushdown value null from  jpox.properties with false
2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.DetachAllOnCommit value null from  jpox.properties with true
2017-04-05T14:37:29,971 DEBUG [main] metastore.ObjectStore: Overriding javax.jdo.option.ConnectionDriverName value null from  jpox.properties with org.apache.derby.jdbc.EmbeddedDriver
2017-04-05T14:37:29,972 DEBUG [main] metastore.ObjectStore: Overriding datanucleus.plugin.pluginRegistryBundleCheck value null from  jpox.properties with LOG
2017-04-05T14:37:30,025 DEBUG [main] metastore.ObjectStore: datanucleus.schema.autoCreateAll = false
2017-04-05T14:37:30,025 DEBUG [main] metastore.ObjectStore: datanucleus.schema.validateTables = false
2017-04-05T14:37:30,025 DEBUG [main] metastore.ObjectStore: datanucleus.rdbms.useLegacyNativeValueStrategy = true
2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.schema.validateColumns = false
2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: hive.metastore.integral.jdo.pushdown = false
2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.autoStartMechanismMode = checked
2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.rdbms.initializeColumnInfo = NONE
2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: javax.jdo.option.Multithreaded = true
2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.identifierFactory = datanucleus1
2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: datanucleus.transactionIsolation = read-committed
2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: javax.jdo.option.ConnectionURL = jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true
2017-04-05T14:37:30,026 DEBUG [main] metastore.ObjectStore: javax.jdo.option.DetachAllOnCommit = true
2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: javax.jdo.option.NonTransactionalRead = true
2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: javax.jdo.option.ConnectionDriverName = org.apache.derby.jdbc.EmbeddedDriver
2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: datanucleus.schema.validateConstraints = false
2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: javax.jdo.option.ConnectionUserName = hive
2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: datanucleus.cache.level2 = false
2017-04-05T14:37:30,027 DEBUG [main] metastore.ObjectStore: datanucleus.plugin.pluginRegistryBundleCheck = LOG
2017-04-05T14:37:30,028 DEBUG [main] metastore.ObjectStore: datanucleus.cache.level2.type = none
2017-04-05T14:37:30,028 DEBUG [main] metastore.ObjectStore: javax.jdo.PersistenceManagerFactoryClass = org.datanucleus.api.jdo.JDOPersistenceManagerFactory
2017-04-05T14:37:30,029 DEBUG [main] metastore.ObjectStore: datanucleus.storeManagerType = rdbms
2017-04-05T14:37:30,029 DEBUG [main] metastore.ObjectStore: datanucleus.connectionPoolingType = BONECP
2017-04-05T14:37:30,029  INFO [main] metastore.ObjectStore: ObjectStore, initialize called
2017-04-05T14:37:31,267 DEBUG [main] bonecp.BoneCPDataSource: JDBC URL = jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true, Username = hive, partitions = 1, max (per partition) = 10, min (per partition) = 0, idle max age = 60 min, idle test period = 240 min, strategy = DEFAULT
2017-04-05T14:37:32,132  INFO [main] metastore.ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
2017-04-05T14:37:34,938 DEBUG [main] bonecp.BoneCPDataSource: JDBC URL = jdbc:mysql://192.168.0.227/hive?createDatabaseIfNotExist=true, Username = hive, partitions = 1, max (per partition) = 10, min (per partition) = 0, idle max age = 60 min, idle test period = 240 min, strategy = DEFAULT
2017-04-05T14:37:35,150 DEBUG [main] metastore.MetaStoreDirectSql: Direct SQL query in 1.803389ms + 0.059481ms, the query is [SET @@session.sql_mode=ANSI_QUOTES]
2017-04-05T14:37:35,175  INFO [main] metastore.MetaStoreDirectSql: Using direct SQL, underlying DB is MYSQL
2017-04-05T14:37:35,185 DEBUG [main] metastore.ObjectStore: RawStore: org.apache.hadoop.hive.metastore.ObjectStore@7103ab0, with PersistenceManager: org.datanucleus.api.jdo.JDOPersistenceManager@b0964b2 created in the thread with id: 1
2017-04-05T14:37:35,185  INFO [main] metastore.ObjectStore: Initialized ObjectStore
2017-04-05T14:37:35,345 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:7234)
2017-04-05T14:37:35,418 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 0, isactive true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getMSchemaVersion(ObjectStore.java:7247)
2017-04-05T14:37:35,442 DEBUG [main] metastore.ObjectStore: Found expected HMS version of 2.1.0
2017-04-05T14:37:35,453 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.start(ObjectStore.java:2502)
2017-04-05T14:37:35,461 DEBUG [main] metastore.MetaStoreDirectSql: Direct SQL query in 1.297211ms + 0.019608ms, the query is [SET @@session.sql_mode=ANSI_QUOTES]
2017-04-05T14:37:35,500 DEBUG [main] metastore.MetaStoreDirectSql: getDatabase: directsql returning db default locn[hdfs://storage.castrading.com:9000/user/hive/warehouse] desc [Default Hive database] owner [public] ownertype [ROLE]
2017-04-05T14:37:35,503 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 0, isactive true at:
    org.apache.hadoop.hive.metastore.ObjectStore$GetHelper.commit(ObjectStore.java:2552)
2017-04-05T14:37:35,505 DEBUG [main] metastore.ObjectStore: db details for db default retrieved using SQL in 51.37478ms
2017-04-05T14:37:35,506 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3313)
2017-04-05T14:37:35,506 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3670)
2017-04-05T14:37:35,546 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3676)
2017-04-05T14:37:35,550 DEBUG [main] metastore.ObjectStore: Rollback transaction, isActive: true at:
    org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3325)
2017-04-05T14:37:35,555 DEBUG [main] metastore.HiveMetaStore: admin role already exists
InvalidObjectException(message:Role admin already exists.)
    at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3316)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
    at com.sun.proxy.$Proxy21.addRole(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:580)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:569)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:371)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3349)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:204)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:331)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:292)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:262)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:247)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
2017-04-05T14:37:35,560  INFO [main] metastore.HiveMetaStore: Added admin role in metastore
2017-04-05T14:37:35,562 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3313)
2017-04-05T14:37:35,562 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3670)
2017-04-05T14:37:35,566 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3676)
2017-04-05T14:37:35,567 DEBUG [main] metastore.ObjectStore: Rollback transaction, isActive: true at:
    org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3325)
2017-04-05T14:37:35,569 DEBUG [main] metastore.HiveMetaStore: public role already exists
InvalidObjectException(message:Role public already exists.)
    at org.apache.hadoop.hive.metastore.ObjectStore.addRole(ObjectStore.java:3316)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
    at com.sun.proxy.$Proxy21.addRole(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:589)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:569)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:371)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3349)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:204)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:331)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:292)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:262)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:247)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
2017-04-05T14:37:35,570  INFO [main] metastore.HiveMetaStore: Added public role in metastore
2017-04-05T14:37:35,590 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:4063)
2017-04-05T14:37:35,590 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3670)
2017-04-05T14:37:35,593 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getMRole(ObjectStore.java:3676)
2017-04-05T14:37:35,594 DEBUG [main] metastore.ObjectStore: Open transaction: count = 2, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore.listPrincipalMGlobalGrants(ObjectStore.java:4579)
2017-04-05T14:37:35,620 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 1, isactive true at:
    org.apache.hadoop.hive.metastore.ObjectStore.listPrincipalMGlobalGrants(ObjectStore.java:4587)
2017-04-05T14:37:35,621 DEBUG [main] metastore.ObjectStore: Rollback transaction, isActive: true at:
    org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:4266)
2017-04-05T14:37:35,623 DEBUG [main] metastore.HiveMetaStore: Failed while granting global privs to admin
InvalidObjectException(message:All is already granted by admin)
    at org.apache.hadoop.hive.metastore.ObjectStore.grantPrivileges(ObjectStore.java:4099)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:101)
    at com.sun.proxy.$Proxy21.grantPrivileges(Unknown Source)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles_core(HiveMetaStore.java:603)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultRoles(HiveMetaStore.java:569)
    at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:371)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.<init>(RetryingHMSHandler.java:78)
    at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:84)
    at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5762)
    at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.<init>(HiveMetaStoreClient.java:219)
    at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.<init>(SessionHiveMetaStoreClient.java:67)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
    at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1548)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.<init>(RetryingMetaStoreClient.java:86)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:132)
    at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:104)
    at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:3080)
    at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:3108)
    at org.apache.hadoop.hive.ql.metadata.Hive.getAllFunctions(Hive.java:3349)
    at org.apache.hadoop.hive.ql.metadata.Hive.reloadFunctions(Hive.java:217)
    at org.apache.hadoop.hive.ql.metadata.Hive.registerAllFunctionsOnce(Hive.java:204)
    at org.apache.hadoop.hive.ql.metadata.Hive.<init>(Hive.java:331)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:292)
    at org.apache.hadoop.hive.ql.metadata.Hive.getInternal(Hive.java:262)
    at org.apache.hadoop.hive.ql.metadata.Hive.get(Hive.java:247)
    at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:543)
    at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:516)
    at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:712)
    at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:648)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
    at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
2017-04-05T14:37:35,627  INFO [main] metastore.HiveMetaStore: No user is added in admin role, since config is empty
2017-04-05T14:37:35,891  INFO [main] metastore.HiveMetaStore: 0: get_all_functions
2017-04-05T14:37:35,895  INFO [main] HiveMetaStore.audit: ugi=hduser    ip=unknown-ip-addr    cmd=get_all_functions    
2017-04-05T14:37:35,896 DEBUG [main] metastore.ObjectStore: Open transaction: count = 1, isActive = true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getAllFunctions(ObjectStore.java:7549)
2017-04-05T14:37:35,916 DEBUG [main] metastore.ObjectStore: Commit transaction: count = 0, isactive true at:
    org.apache.hadoop.hive.metastore.ObjectStore.getAllFunctions(ObjectStore.java:7553)
2017-04-05T14:37:36,238 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.use.legacy.blockreader.local = false
2017-04-05T14:37:36,240 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.read.shortcircuit = false
2017-04-05T14:37:36,240 DEBUG [main] hdfs.BlockReaderLocal: dfs.client.domain.socket.data.traffic = false
2017-04-05T14:37:36,240 DEBUG [main] hdfs.BlockReaderLocal: dfs.domain.socket.path =
2017-04-05T14:37:36,281 DEBUG [main] hdfs.DFSClient: No KeyProvider found.
2017-04-05T14:37:36,454 DEBUG [main] retry.RetryUtils: multipleLinearRandomRetry = null
2017-04-05T14:37:36,511 DEBUG [main] ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcRequestWrapper, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@ca93621
2017-04-05T14:37:36,532 DEBUG [main] ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@54bca971
2017-04-05T14:37:37,431 DEBUG [main] util.PerformanceAdvisory: Both short-circuit local reads and UNIX domain socket are disabled.
2017-04-05T14:37:37,440 DEBUG [main] sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
2017-04-05T14:37:37,479 DEBUG [main] ipc.Client: The ping interval is 60000 ms.
2017-04-05T14:37:37,481 DEBUG [main] ipc.Client: Connecting to storage.castrading.com/192.168.0.227:9000
2017-04-05T14:37:37,524 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser: starting, having connections 1
2017-04-05T14:37:37,526 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #0
2017-04-05T14:37:37,537 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #0
2017-04-05T14:37:37,538 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 91ms
2017-04-05T14:37:37,600 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #1
2017-04-05T14:37:37,603 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #1
2017-04-05T14:37:37,604 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 5ms
2017-04-05T14:37:37,606 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #2
2017-04-05T14:37:37,608 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #2
2017-04-05T14:37:37,611 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 5ms
2017-04-05T14:37:37,616 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #3
2017-04-05T14:37:37,617 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #3
2017-04-05T14:37:37,619 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 3ms
2017-04-05T14:37:37,620 DEBUG [main] hdfs.DFSClient: /tmp/hive/hduser/10f8dcc5-0c5b-479d-92f3-87a848c6d188: masked=rwx------
2017-04-05T14:37:37,624 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #4
2017-04-05T14:37:37,630 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #4
2017-04-05T14:37:37,636 DEBUG [main] ipc.ProtobufRpcEngine: Call: mkdirs took 13ms
2017-04-05T14:37:37,642 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #5
2017-04-05T14:37:37,643 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #5
2017-04-05T14:37:37,643 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
2017-04-05T14:37:37,662 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #6
2017-04-05T14:37:37,663 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #6
2017-04-05T14:37:37,667 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 8ms
2017-04-05T14:37:37,667 DEBUG [main] hdfs.DFSClient: /tmp/hive/hduser/10f8dcc5-0c5b-479d-92f3-87a848c6d188/_tmp_space.db: masked=rwx------
2017-04-05T14:37:37,668 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #7
2017-04-05T14:37:37,670 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #7
2017-04-05T14:37:37,670 DEBUG [main] ipc.ProtobufRpcEngine: Call: mkdirs took 3ms
2017-04-05T14:37:37,672 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser sending #8
2017-04-05T14:37:37,675 DEBUG [IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser] ipc.Client: IPC Client (409778321) connection to storage.castrading.com/192.168.0.227:9000 from hduser got value #8
2017-04-05T14:37:37,675 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 4ms
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>

Super Guru
Hmm, it is strange that you don't get the following output like I do:

17/03/29 07:20:04 [6af15552-8d3f-43a2-8d6e-5e2c03fc2d80 main]: DEBUG CliDriver: CliDriver inited with classpath /opt/hive/conf:/opt/hive/lib/accumulo-core-1.6.0.jar:/opt/hive/lib/accumulo-fate-1.6.0.jar:/opt/hive/lib/accumulo-start-1.6.0.jar:/opt/hive/lib/accumulo-trace-1.6.0.jar:/opt/hive/lib/activation-1.1.jar:/opt/hive/lib/aether-api-0.9.0.M2.jar....

I was expecting this so that I could check the list of JAR files loaded by the Hive CLI.

I might need to use the same downloaded version as yours, because my version is built from the Hive source code.

I'm not too sure why at this stage.
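
If the CLASSPATH line never shows up in the DEBUG output, another way to see which jars the CLI JVM actually loaded is to print the classpath property from inside the Hive shell (a sketch; it assumes the system: namespace of SET is supported by this build):

hive> set system:java.class.path;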

Super Guru
Can you please do the following:

1. Start the Hive CLI: hive
2. In another shell window, run the following command:

ps aux | grep CliDriver

and note down the PID of the Hive CLI process.

3. Run:

strings /proc/<hive-cli-PID>/environ

Then copy and paste the outputs here for me to have a look (see also the combined one-liner below).

Thanks
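
The two steps above can also be combined into a single command if pgrep is available (a sketch; adjust the pattern if more than one CliDriver process happens to be running):

# grab the PID of the running Hive CLI and dump its environment in one go
strings /proc/$(pgrep -f CliDriver | head -1)/environ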

Contributor

Please find the output below.

 

First:

[hduser@storage Desktop]$ hive CLI: hive
which: no hbase in (/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/)

Logging initialized using configuration in jar:file:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>

 

Second:

[hduser@storage Desktop]$ ps aux | grep CliDriver
hduser   29018 29.9  5.3 2249484 217232 pts/0  Sl+  14:08   0:38 /usr/local/jdk1.8.0_111/bin/java -Xmx256m -Djava.library.path=/home/hduser/hadoop-2.6.5/lib -Djava.net.preferIPv4Stack=true -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/hduser/hadoop-2.6.5/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/hduser/hadoop-2.6.5 -Dhadoop.id.str=hduser -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx512m -Dlog4j.configurationFile=hive-log4j2.properties -Dhadoop.security.logger=INFO,NullAppender org.apache.hadoop.util.RunJar /home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-cli-2.0.1.jar org.apache.hadoop.hive.cli.CliDriver CLI: hive
hduser   29161  0.0  0.0 103384   808 pts/3    S+   14:10   0:00 grep CliDriver
[hduser@storage Desktop]$

 

Third:

[hduser@storage Desktop]$ strings /proc/29161/environ
strings: '/proc/29161/environ': No such file

 

Super Guru
The PID for CliDriver is 29018, so please run

strings /proc/29018/environ

instead.
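
To narrow that output down to the part that matters here, the CLASSPATH entry can be split on colons and filtered for the Hive jars (a sketch assuming standard GNU tools):

# show only the Hive jar entries from the CLI's CLASSPATH
strings /proc/29018/environ | grep '^CLASSPATH=' | tr ':' '\n' | grep -E 'hive-(common|exec|cli)-'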

Contributor

Please find the output below.

 

[hduser@storage Desktop]$ strings /proc/29018/environ
ORBIT_SOCKETDIR=/tmp/orbit-hduser
HADOOP_DATANODE_OPTS=-Dhadoop.security.logger=ERROR,RFAS
HOSTNAME=storage.castrading.com
HADOOP_IDENT_STRING=hduser
IMSETTINGS_INTEGRATE_DESKTOP=yes
PIG_HOME=/home/hduser/pig-0.15.0
SHELL=/bin/bash
TERM=xterm
XDG_SESSION_COOKIE=4179b5ce21a6e7668e4c33d800000012-1491373771.913553-599059921
HADOOP_HOME=/home/hduser/hadoop-2.6.5
HISTSIZE=1000
HADOOP_PID_DIR=
HADOOP_PREFIX=/home/hduser/hadoop-2.6.5
GTK_RC_FILES=/etc/gtk/gtkrc:/home/hduser/.gtkrc-1.2-gnome2
WINDOWID=46137348
QTDIR=/usr/lib64/qt-3.3
SQOOP_HOME=/home/hduser/Softwares/sqoop
QTINC=/usr/lib64/qt-3.3/include
YARN_HOME=/home/hduser/hadoop-2.6.5
IMSETTINGS_MODULE=none
USER=hduser
LS_COLORS=rs=0:di=01;34:ln=01;36:mh=00:pi=40;33:so=01;35:do=01;35:bd=40;33;01:cd=40;33;01:or=40;31;01:mi=01;05;37;41:su=37;41:sg=30;43:ca=30;41:tw=30;42:ow=34;42:st=37;44:ex=01;32:*.tar=01;31:*.tgz=01;31:*.arj=01;31:*.taz=01;31:*.lzh=01;31:*.lzma=01;31:*.tlz=01;31:*.txz=01;31:*.zip=01;31:*.z=01;31:*.Z=01;31:*.dz=01;31:*.gz=01;31:*.lz=01;31:*.xz=01;31:*.bz2=01;31:*.tbz=01;31:*.tbz2=01;31:*.bz=01;31:*.tz=01;31:*.deb=01;31:*.rpm=01;31:*.jar=01;31:*.rar=01;31:*.ace=01;31:*.zoo=01;31:*.cpio=01;31:*.7z=01;31:*.rz=01;31:*.jpg=01;35:*.jpeg=01;35:*.gif=01;35:*.bmp=01;35:*.pbm=01;35:*.pgm=01;35:*.ppm=01;35:*.tga=01;35:*.xbm=01;35:*.xpm=01;35:*.tif=01;35:*.tiff=01;35:*.png=01;35:*.svg=01;35:*.svgz=01;35:*.mng=01;35:*.pcx=01;35:*.mov=01;35:*.mpg=01;35:*.mpeg=01;35:*.m2v=01;35:*.mkv=01;35:*.ogm=01;35:*.mp4=01;35:*.m4v=01;35:*.mp4v=01;35:*.vob=01;35:*.qt=01;35:*.nuv=01;35:*.wmv=01;35:*.asf=01;35:*.rm=01;35:*.rmvb=01;35:*.flc=01;35:*.avi=01;35:*.fli=01;35:*.flv=01;35:*.gl=01;35:*.dl=01;35:*.xcf=01;35:*.xwd=01;35:*.yuv=01;35:*.cgm=01;35:*.emf=01;35:*.axv=01;35:*.anx=01;35:*.ogv=01;35:*.ogx=01;35:*.aac=01;36:*.au=01;36:*.flac=01;36:*.mid=01;36:*.midi=01;36:*.mka=01;36:*.mp3=01;36:*.mpc=01;36:*.ogg=01;36:*.ra=01;36:*.wav=01;36:*.axa=01;36:*.oga=01;36:*.spx=01;36:*.xspf=01;36:
HADOOP_HEAPSIZE=256
SSH_AUTH_SOCK=/tmp/keyring-zl52DP/socket.ssh
GNOME_KEYRING_SOCKET=/tmp/keyring-zl52DP/socket
MALLOC_ARENA_MAX=4
HADOOP_SECURE_DN_PID_DIR=
USERNAME=hduser
SESSION_MANAGER=local/unix:@/tmp/.ICE-unix/3448,unix/unix:/tmp/.ICE-unix/3448
HADOOP_SECURE_DN_LOG_DIR=/
HIVE_AUX_JARS_PATH=
DESKTOP_SESSION=gnome
HADOOP_COMMON_LIB_NATIVE_DIR=/home/hduser/hadoop-2.6.5/lib/native
MAIL=/var/spool/mail/hduser
PATH=/usr/local/jdk1.8.0_111/bin:/usr/lib64/qt-3.3/bin:/usr/local/bin:/usr/bin:/bin:/usr/local/sbin:/usr/sbin:/sbin:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/home/hduser/bin:/home/hduser/scala//bin/:/home/hduser/Softwares/sqoop//bin/:/usr/local/jdk1.8.0_111/bin:/home/hduser/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/pig-0.15.0/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/bin:/home/hduser/hadoop-2.6.5/sbin:/home/hduser/scala/:/bin/:/home/hduser/Softwares/sqoop/bin/:/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/:/usr/local/jdk1.8.0_111/bin
HADOOP_HDFS_HOME=/home/hduser/hadoop-2.6.5
QT_IM_MODULE=xim
HIVE_HOME=/home/hduser/Softwares/apache-hive-2.0.1-bin
HADOOP_CLIENT_OPTS=-Xmx512m  -Dlog4j.configurationFile=hive-log4j2.properties
PWD=/home/hduser/Desktop
HADOOP_COMMON_HOME=/home/hduser/hadoop-2.6.5
HADOOP_YARN_HOME=/home/hduser/hadoop-2.6.5
JAVA_HOME=/usr/local/jdk1.8.0_111
XMODIFIERS=@im=none
HADOOP_INSTALL=/home/hduser/hadoop-2.6.5
GDM_KEYBOARD_LAYOUT=us
HADOOP_CLASSPATH=/home/hduser/Softwares/apache-hive-2.0.1-bin/conf:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-core-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-fate-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-start-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-trace-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/activation-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-1.6.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-1.9.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-launcher-1.9.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr-2.7.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr4-runtime-4.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr-runtime-3.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/aopalliance-1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-commons-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-tree-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/avro-1.7.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/bonecp-0.8.0.RELEASE.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-avatica-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-core-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-linq4j-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-cli-1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-codec-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-collections-3.2.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-compiler-2.7.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-compress-1.9.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-dbcp-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-el-1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-httpclient-3.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-io-2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-lang-2.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-lang3-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-logging-1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-math-2.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-pool-1.5.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-vfs2-2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-client-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-framework-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-recipes-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-api-jdo-4.2.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-core-4.1.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-rdbms-4.1.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/derby-10.10.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/disruptor-3.3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/eigenbase-properties-1.1.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/fastutil-6.5.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/findbugs-annotations-1.3.9-1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-jaspic_1.0_spec-1.0.jar:/home/hduser/Software
s/apache-hive-2.0.1-bin/lib/geronimo-jta_1.1_spec-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/groovy-all-2.4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/gson-2.2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guava-14.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guice-3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guice-assistedinject-3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hamcrest-core-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-annotations-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-client-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-common-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-common-1.1.1-tests.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop2-compat-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop2-compat-1.1.1-tests.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop-compat-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-prefix-tree-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-procedure-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-protocol-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-server-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-accumulo-handler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-ant-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-beeline-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-cli-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-contrib-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-exec-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hbase-handler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hplsql-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hwi-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-jdbc-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-client-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-common-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-server-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-tez-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-metastore-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-orc-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-serde-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-service-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-0.23-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-common-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-scheduler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-storage-api-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-testutils-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/htrace-core-3.1.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/httpclient-4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/httpcore-4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ivy-2.4.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-annotations-2.4.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-co
re-2.4.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-databind-2.4.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-jaxrs-1.9.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jamon-runtime-2.3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/janino-2.7.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jasper-compiler-5.5.23.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jasper-runtime-5.5.23.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.inject-1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.jdo-3.2.0-m3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.servlet-3.0.0.v201112011016.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jcodings-1.0.8.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jcommander-1.32.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jdo-api-3.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jersey-server-1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-all-7.6.0.v20120127.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-all-server-7.6.0.v20120127.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-sslengine-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-util-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jline-2.12.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/joda-time-2.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/joni-2.1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jpam-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/json-20090211.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-2.1-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-api-2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-api-2.1-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsr305-3.0.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jta-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/junit-4.11.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/libfb303-0.9.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/libthrift-0.9.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-1.2-api-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-api-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-core-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-web-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/mail-1.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-api-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-provider-svn-commons-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-provider-svnexe-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-core-2.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-core-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-json-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-jvm-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/mysql-connector-java-5.1.40-bin.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/netty-3.7.0.Final.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/netty-all-4.0.23.Final.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/opencsv-2.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/org.abego.treelayout.core-1.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/paranamer-2.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-b
in/lib/parquet-hadoop-bundle-1.8.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/plexus-utils-1.5.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/protobuf-java-2.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/regexp-1.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/servlet-api-2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/servlet-api-2.5-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/snappy-0.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/snappy-java-1.0.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ST4-4.0.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/stax-api-1.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/stringtemplate-3.2.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/super-csv-2.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tempus-fugit-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-api-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-core-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-hbase-compat-1.0-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/transaction-api-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-api-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-common-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-core-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-discovery-api-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-discovery-core-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-zookeeper-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/velocity-1.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/zookeeper-3.4.6.jar::/usr/local/jdk1.8.0_111/lib/tools.jar::/home/hduser/hadoop-2.6.5/contrib/capacity-scheduler/*.jar
HADOOP_CONF_DIR=/home/hduser/hadoop-2.6.5/etc/hadoop
LANG=en_US.UTF-8
GNOME_KEYRING_PID=3438
SERVICE_LIST=beeline cli hbaseimport hbaseschematool help hiveburninclient hiveserver2 hiveserver hplsql hwi jar lineage llap metastore metatool orcfiledump rcfilecat schemaTool version
GDM_LANG=en_US.UTF-8
HADOOP_PORTMAP_OPTS=-Xmx512m
HADOOP_OPTS=-Djava.library.path=/home/hduser/hadoop-2.6.5/lib -Djava.net.preferIPv4Stack=true -XX:-PrintWarnings -Djava.net.preferIPv4Stack=true -Dhadoop.log.dir=/home/hduser/hadoop-2.6.5/logs -Dhadoop.log.file=hadoop.log -Dhadoop.home.dir=/home/hduser/hadoop-2.6.5 -Dhadoop.id.str=hduser -Dhadoop.root.logger=INFO,console -Dhadoop.policy.file=hadoop-policy.xml -Djava.net.preferIPv4Stack=true -Xmx512m  -Dlog4j.configurationFile=hive-log4j2.properties  -Dhadoop.security.logger=INFO,NullAppender
HADOOP_SECONDARYNAMENODE_OPTS=-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender
GDMSESSION=gnome
HISTCONTROL=ignoredups
SSH_ASKPASS=/usr/libexec/openssh/gnome-ssh-askpass
SHLVL=2
HOME=/home/hduser
HADOOP_SECURE_DN_USER=
HADOOP_NAMENODE_OPTS=-Dhadoop.security.logger=INFO,RFAS -Dhdfs.audit.logger=INFO,NullAppender
GNOME_DESKTOP_SESSION_ID=this-is-deprecated
HADOOP_MAPRED_HOME=/home/hduser/hadoop-2.6.5
LOGNAME=hduser
QTLIB=/usr/lib64/qt-3.3/lib
CVS_RSH=ssh
HADOOP_HOME_WARN_SUPPRESS=true
CLASSPATH=/home/hduser/Softwares/apache-hive-2.0.1-bin/conf:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-core-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-fate-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-start-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/accumulo-trace-1.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/activation-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-1.6.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-1.9.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ant-launcher-1.9.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr-2.7.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr4-runtime-4.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/antlr-runtime-3.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/aopalliance-1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-commons-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/asm-tree-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/avro-1.7.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/bonecp-0.8.0.RELEASE.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-avatica-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-core-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/calcite-linq4j-1.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-cli-1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-codec-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-collections-3.2.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-compiler-2.7.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-compress-1.9.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-dbcp-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-el-1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-httpclient-3.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-io-2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-lang-2.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-lang3-3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-logging-1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-math-2.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-pool-1.5.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/commons-vfs2-2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-client-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-framework-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/curator-recipes-2.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-api-jdo-4.2.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-core-4.1.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/datanucleus-rdbms-4.1.7.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/derby-10.10.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/disruptor-3.3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/eigenbase-properties-1.1.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/fastutil-6.5.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/findbugs-annotations-1.3.9-1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-annotation_1.0_spec-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/geronimo-jaspic_1.0_spec-1.0.jar:/home/hduser/Softwares/apach
e-hive-2.0.1-bin/lib/geronimo-jta_1.1_spec-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/groovy-all-2.4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/gson-2.2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guava-14.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guice-3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/guice-assistedinject-3.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hamcrest-core-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-annotations-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-client-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-common-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-common-1.1.1-tests.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop2-compat-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop2-compat-1.1.1-tests.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-hadoop-compat-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-prefix-tree-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-procedure-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-protocol-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hbase-server-1.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-accumulo-handler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-ant-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-beeline-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-cli-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-common-2.1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-contrib-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-exec-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hbase-handler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hplsql-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-hwi-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-jdbc-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-client-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-common-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-server-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-llap-tez-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-metastore-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-orc-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-serde-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-service-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-0.23-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-common-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-shims-scheduler-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-storage-api-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-testutils-2.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/htrace-core-3.1.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/httpclient-4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/httpcore-4.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ivy-2.4.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-annotations-2.4.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-core-2.4.
2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-databind-2.4.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jackson-jaxrs-1.9.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jamon-runtime-2.3.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/janino-2.7.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jasper-compiler-5.5.23.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jasper-runtime-5.5.23.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.inject-1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.jdo-3.2.0-m3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/javax.servlet-3.0.0.v201112011016.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jcodings-1.0.8.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jcommander-1.32.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jdo-api-3.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jersey-server-1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-all-7.6.0.v20120127.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-all-server-7.6.0.v20120127.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-sslengine-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jetty-util-6.1.26.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jline-2.12.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/joda-time-2.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/joni-2.1.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jpam-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/json-20090211.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-2.1-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-api-2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsp-api-2.1-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jsr305-3.0.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/jta-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/junit-4.11.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/libfb303-0.9.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/libthrift-0.9.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-1.2-api-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-api-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-core-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/log4j-web-2.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/mail-1.4.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-api-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-provider-svn-commons-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/maven-scm-provider-svnexe-1.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-core-2.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-core-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-json-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/metrics-jvm-3.1.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/mysql-connector-java-5.1.40-bin.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/netty-3.7.0.Final.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/netty-all-4.0.23.Final.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/opencsv-2.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/org.abego.treelayout.core-1.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/paranamer-2.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/
parquet-hadoop-bundle-1.8.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/pentaho-aggdesigner-algorithm-5.1.5-jhyde.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/plexus-utils-1.5.6.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/protobuf-java-2.5.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/regexp-1.3.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/servlet-api-2.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/servlet-api-2.5-6.1.14.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/snappy-0.2.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/snappy-java-1.0.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/ST4-4.0.4.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/stax-api-1.0.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/stringtemplate-3.2.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/super-csv-2.2.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tempus-fugit-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-api-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-core-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/tephra-hbase-compat-1.0-0.6.0.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/transaction-api-1.1.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-api-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-common-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-core-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-discovery-api-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-discovery-core-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/twill-zookeeper-0.6.0-incubating.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/velocity-1.5.jar:/home/hduser/Softwares/apache-hive-2.0.1-bin/lib/zookeeper-3.4.6.jar::/usr/local/jdk1.8.0_111/lib/tools.jar::/home/hduser/hadoop-2.6.5/contrib/capacity-scheduler/*.jar:/home/hduser/hadoop-2.6.5/etc/hadoop:/home/hduser/hadoop-2.6.5/share/hadoop/common/lib/*:/home/hduser/hadoop-2.6.5/share/hadoop/common/*:/home/hduser/hadoop-2.6.5/share/hadoop/hdfs:/home/hduser/hadoop-2.6.5/share/hadoop/hdfs/lib/*:/home/hduser/hadoop-2.6.5/share/hadoop/hdfs/*:/home/hduser/hadoop-2.6.5/share/hadoop/yarn/lib/*:/home/hduser/hadoop-2.6.5/share/hadoop/yarn/*:/home/hduser/hadoop-2.6.5/share/hadoop/mapreduce/lib/*:/home/hduser/hadoop-2.6.5/share/hadoop/mapreduce/*
HADOOP_NFS3_OPTS=
DBUS_SESSION_BUS_ADDRESS=unix:abstract=/tmp/dbus-eFQEXskxRI,guid=1c48465a2229e73e4a2f8e1c00000063
LESSOPEN=||/usr/bin/lesspipe.sh %s
SCALA_HOME=/home/hduser/scala/
WINDOWPATH=1
DISPLAY=:0.0
HADOOP_USER_CLASSPATH_FIRST=true
G_BROKEN_FILENAMES=1
XAUTHORITY=/var/run/gdm/auth-for-hduser-wCDdK5/database
HIVE_CONF_DIR=/home/hduser/Softwares/apache-hive-2.0.1-bin/conf
COLORTERM=gnome-terminal
[hduser@storage Desktop]$

avatar
Super Guru
Hmm, I can see that /home/hduser/Softwares/apache-hive-2.0.1-bin/lib/hive-exec-2.0.1.jar is already loaded in the Hive CLI, so this is hard to figure out without looking at the issue live.
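One quick check worth trying is to see whether any jar under the Hive lib directory actually contains the class named in the NoClassDefFoundError. This is only a sketch; it assumes the install path shown in your CLASSPATH output above, so adjust it if your layout differs:

HIVE_LIB=/home/hduser/Softwares/apache-hive-2.0.1-bin/lib
for jar in "$HIVE_LIB"/*.jar; do
  # unzip -l lists a jar's entries without extracting it
  if unzip -l "$jar" 2>/dev/null | grep -q 'org/apache/hadoop/hive/io/HdfsUtils'; then
    echo "class found in: $jar"
  fi
done

If nothing is printed, the class is simply not on the classpath, which would explain the error; if some jar is printed, the problem is more likely a conflict between jars.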

I also have a couple of other questions:

- When did the job fail? Did it fail before the MR job was launched or after? If after, did the mapper or the reducer fail?

- Have you tried using Beeline (you need to start HiveServer2 first) to see if it also fails there? (See the sketch after these questions.)

- How many nodes do you have in the cluster? Is it just one?
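Regarding the Beeline question, here is a minimal sketch of what I mean, assuming HiveServer2 listens on its default port 10000 and using the install path from your environment output (adjust paths and user as needed):

# start HiveServer2 in one terminal (or in the background)
/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/hiveserver2

# then, from another terminal, connect with Beeline and re-run the query
/home/hduser/Softwares/apache-hive-2.0.1-bin/bin/beeline \
  -u jdbc:hive2://localhost:10000 -n hduser \
  -e "select count(1) from order_items;"

If the same NoClassDefFoundError shows up there as well, that points at the installation itself rather than the CLI.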

avatar
Contributor

Please find my answers inline below:

- When did the job fail? Did it fail before the MR job was launched or after? If after, did the mapper or the reducer fail?

Answer: I did not pay attention to that, but after logging into Hive, whenever I run a SELECT query I get that error.

- Have you tried using Beeline (you need to start HiveServer2 first) to see if it also fails there?

Answer: No. I am new to this; can you give me the URL for Beeline so that I can follow the steps accordingly?

- How many nodes do you have in the cluster? Is it just one?

Answer: It is just one. I have installed VMware, and inside the VM I have installed Linux as the OS, Hadoop, etc.

So is there any way to find a solution for this error, or is it just a bug?