Member since: 05-31-2018 · Posts: 4 · Kudos Received: 0 · Solutions: 0
06-28-2018 12:50 PM
Hello, I have a Hadoop cluster managed by Apache Ambari 2.5.1.0, with two edge nodes. I installed Flume agents on both and added the Flume configuration through the Ambari UI. Now we want different Flume configurations on the two edge-node hosts. How can this be done?
Labels: Apache Flume
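For illustration only (the agent names and source/sink types below are assumptions, not from the post), a single flume.conf can define several independently named agents, and each edge node then runs only the agent whose name it is started with (`flume-ng agent -n agent1 -f flume.conf ...`):

```properties
# Hypothetical flume.conf defining two agents; edge node 1 starts "agent1",
# edge node 2 starts "agent2".

# --- agent1: tails a local log file into HDFS ---
agent1.sources  = src1
agent1.channels = ch1
agent1.sinks    = sink1
agent1.sources.src1.type = exec
agent1.sources.src1.command = tail -F /var/log/app/app.log
agent1.sources.src1.channels = ch1
agent1.channels.ch1.type = memory
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = hdfs:///flume/agent1/events
agent1.sinks.sink1.channel = ch1

# --- agent2: listens on a netcat port and logs events ---
agent2.sources  = src2
agent2.channels = ch2
agent2.sinks    = sink2
agent2.sources.src2.type = netcat
agent2.sources.src2.bind = 0.0.0.0
agent2.sources.src2.port = 44444
agent2.sources.src2.channels = ch2
agent2.channels.ch2.type = memory
agent2.sinks.sink2.type = logger
agent2.sinks.sink2.channel = ch2
```

Within Ambari itself, per-host overrides of a service configuration are normally done with Config Groups (Manage Config Groups), which let a subset of hosts carry a different flume.conf from the cluster default.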
06-01-2018 07:48 AM
Hello, I am running HDP 2.6.1.0-129. I have an external jar, example.jar, containing a SerDe for serialized Flume data files. In Custom hive-site I added a new parameter:

name: hive.aux.jars.path
value: hdfs:///user/libs/

I saved the new configuration and restarted the Hadoop components, and later restarted the whole cluster. Afterwards, in the Hive client, I ran:

select * from example_serealized_table

and Hive returned the error: FAILED: RuntimeException MetaException(message:org.apache.hadoop.hive.serde2.SerDeException java.lang.ClassNotFoundException: Class com.my.bigtable.example.model.gen.TSerializedRecord not found)

How can I solve this problem? P.S. Adding the jar in the current session (add jar hdfs:///user/libs/example-spark-SerializedRecord.jar;) did not help, and neither did putting the *.jar in a local folder; the problem is the same.
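One thing worth checking: hive.aux.jars.path generally expects a comma-separated list of full jar paths rather than a bare directory, so pointing it at hdfs:///user/libs/ may be why the class is never loaded. A sketch of the Custom hive-site entry (the jar name is taken from the P.S. above; adjust to the actual file):

```xml
<!-- Custom hive-site: point hive.aux.jars.path at the jar file itself,
     not the containing directory; comma-separate multiple jars -->
<property>
  <name>hive.aux.jars.path</name>
  <value>hdfs:///user/libs/example-spark-SerializedRecord.jar</value>
</property>
```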
05-31-2018 12:10 PM
I added an auxiliary jar to Hive via Advanced hive-env, parameter "hive-env template", by appending at the end:

export HIVE_AUX_JARS_PATH="$HIVE_AUX_JARS_PATH,/opt/libs/spark-1.1.7.jar"

After this change, the Hive console started printing a lot of DEBUG output:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/libs/spark-1.1.7.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/2.6.1.0-129/hadoop/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [ch.qos.logback.classic.util.ContextSelectorStaticBinder]
20:09:06.519 [main] DEBUG org.apache.hadoop.util.Shell - setsid exited with exit code 0
log4j:WARN No such property [maxFileSize] in org.apache.log4j.DailyRollingFileAppender.
Logging initialized using configuration in file:/etc/hive/2.6.1.0-129/0/hive-log4j.properties
20:09:07.637 [main] DEBUG org.apache.hadoop.security.authentication.util.KerberosName - Kerberos krb5 configuration not found, setting default realm to empty
20:09:07.675 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login
20:09:07.675 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - hadoop login commit
20:09:07.680 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - using local user:UnixPrincipal: user1
20:09:07.680 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Using user: "UnixPrincipal: user1" with name user1
20:09:07.680 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - User entry: "user1"
20:09:07.680 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - Assuming keytab is managed externally since logged in from subject.
20:09:07.681 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - UGI loginUser:user1 (auth:SIMPLE)
20:09:08.466 [Finalizer] DEBUG org.apache.hadoop.fs.azure.NativeAzureFileSystem - finalize() called.
20:09:08.467 [Finalizer] DEBUG org.apache.hadoop.fs.azure.NativeAzureFileSystem - finalize() called.
20:09:08.763 [main] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.DataTransferSaslUtil - DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
20:09:09.253 [Thread-7] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL client skipping handshake in unsecured configuration for addr = /192.168.63.22, datanodeId = DatanodeInfoWithStorage[192.168.63.22:50010,DS-13731d20-37a7-4e96-adc7-22883dc36760,DISK]
20:09:09.953 [Thread-10] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL client skipping handshake in unsecured configuration for addr = /192.168.63.22, datanodeId = DatanodeInfoWithStorage[192.168.63.22:50010,DS-13731d20-37a7-4e96-adc7-22883dc36760,DISK]
20:09:11.015 [main] INFO org.apache.tez.client.TezClient - Tez Client Version: [ component=tez-api, version=0.7.0.2.6.1.0-129, revision=bbcfb9e8d9cc93fb586b32199eb9492528449f7c, SCM-URL=scm:git:https://git-wip-us.apache.org/repos/asf/tez.git, buildTime=2017-05-31T03:06:59Z ]
20:09:11.111 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:user1 (auth:SIMPLE) from:org.apache.hadoop.yarn.client.RMProxy.getProxy(RMProxy.java:163)
20:09:11.309 [main] DEBUG org.apache.hadoop.security.UserGroupInformation - PrivilegedAction as:user1 (auth:SIMPLE) from:org.apache.hadoop.yarn.client.AHSProxy.getProxy(AHSProxy.java:49)
20:09:11.319 [main] INFO org.apache.tez.client.TezClient - Session mode. Starting session.
20:09:11.323 [main] INFO org.apache.tez.client.TezClientUtils - Using tez.lib.uris value from configuration: /hdp/apps/2.6.1.0-129/tez/tez.tar.gz
20:09:11.506 [main] INFO org.apache.tez.client.TezClient - Tez system stage directory hdfs://server01:8020/tmp/hive/user1/_tez_session_dir/864e3b98-3cfd-4a8e-a487-51668116d7e0/.tez/application_1526574235371_0024 doesn't exist and is created
20:09:11.509 [main] DEBUG org.apache.tez.client.TezClientUtils - AppMaster capability = <memory:512, vCores:1>
20:09:11.510 [main] DEBUG org.apache.tez.client.TezClientUtils - Command to launch container for ApplicationMaster is : $JAVA_HOME/bin/java -Xmx409m -Djava.io.tmpdir=$PWD/tmp -server -Djava.net.preferIPv4Stack=true -Dhdp.version=2.6.1.0-129 -XX:+PrintGCDetails -verbose:gc -XX:+PrintGCTimeStamps -XX:+UseNUMA -XX:+UseParallelGC -Dlog4j.configuratorClass=org.apache.tez.common.TezLog4jConfigurator -Dlog4j.configuration=tez-container-log4j.properties -Dyarn.app.container.log.dir=<LOG_DIR> -Dtez.root.logger=INFO,CLA -Dsun.nio.ch.bugLevel='' org.apache.tez.dag.app.DAGAppMaster --session 1><LOG_DIR>/stdout 2><LOG_DIR>/stderr
20:09:11.555 [Thread-13] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL client skipping handshake in unsecured configuration for addr = /192.168.63.22, datanodeId = DatanodeInfoWithStorage[192.168.63.22:50010,DS-13731d20-37a7-4e96-adc7-22883dc36760,DISK]
20:09:11.667 [Thread-15] DEBUG org.apache.hadoop.hdfs.protocol.datatransfer.sasl.SaslDataTransferClient - SASL client skipping handshake in unsecured configuration for addr = /192.168.63.22, datanodeId = DatanodeInfoWithStorage[192.168.63.22:50010,DS-13731d20-37a7-4e96-adc7-22883dc36760,DISK]
20:09:13.186 [main] INFO org.apache.tez.client.TezClient - The url to track the Tez Session: http://server01:8088/proxy/application_1526574235371_0024/
How can I disable this DEBUG output in Hive?
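The "Actual binding" line above shows that SLF4J picked up a logback binding bundled inside /opt/libs/spark-1.1.7.jar, and logback falls back to a DEBUG-level console configuration when it finds no logback.xml on the classpath. One possible fix, sketched under the assumption that the logback binding stays on the classpath, is to supply a logback.xml that raises the root level:

```xml
<!-- logback.xml (placed on Hive's classpath): raise the root level so the
     logback binding bundled in spark-1.1.7.jar stops emitting DEBUG output -->
<configuration>
  <appender name="STDOUT" class="ch.qos.logback.core.ConsoleAppender">
    <encoder>
      <pattern>%d{HH:mm:ss.SSS} [%thread] %-5level %logger - %msg%n</pattern>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="STDOUT"/>
  </root>
</configuration>
```

Alternatively, removing the org/slf4j/impl classes (and any bundled logback configuration) from the fat jar, or using a jar that does not shade a logging backend, would leave Hive's own slf4j-log4j12 binding in charge and restore the normal log4j settings.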
Labels: Apache Hive