Member since: 08-19-2021
Posts: 4
Kudos Received: 0
Solutions: 0
08-29-2021
02:38 PM
Hello,
I need help loading data from HDFS into Druid. I am working on an Ambari platform and have installed Apache Druid on one of the data nodes. When I launch it, I use the "Load data" wizard in the Druid web console and choose HDFS; there is an input field for the file path, but when I enter the path an error occurs. If anyone has experience with Apache Druid, please help me find a solution.
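Since the actual error message isn't shown, one common cause worth checking is that the `druid-hdfs-storage` extension is not loaded (it must appear in `druid.extensions.loadList` in Druid's common runtime properties). For reference, a minimal sketch of the `ioConfig` section that the "Load data" wizard generates for HDFS input looks like the following; the namenode host and file path here are hypothetical placeholders, not values from your cluster:

```json
{
  "ioConfig": {
    "type": "index_parallel",
    "inputSource": {
      "type": "hdfs",
      "paths": "hdfs://namenode-host:8020/path/to/data.csv"
    },
    "inputFormat": {
      "type": "csv",
      "findColumnsFromHeader": true
    }
  }
}
```

If the extension is missing, the wizard typically fails as soon as it tries to sample the HDFS path. Posting the exact error text would make it much easier to diagnose.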
Labels:
- Apache Ambari
- Apache Hadoop
- HDFS
08-22-2021
03:31 AM
Hello, thanks for your reply. Yes, it's a fresh deployment. I have run the schema tool against MySQL, and when I want to start the Hive shell I have to start the metastore first, but I couldn't.
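For a fresh deployment, it may help to confirm the metastore schema really was initialized before starting the service. A sketch of the usual checks with Hive's own `schematool` (run on the metastore host; assumes `HIVE_HOME/bin` is on the PATH):

```shell
# Show the schema version schematool finds in the configured MySQL database;
# an error here usually means the schema was never initialized or the JDBC
# settings in hive-site.xml are wrong.
schematool -dbType mysql -info

# Initialize the schema if it does not exist yet (only on a fresh database).
schematool -dbType mysql -initSchema
```

Also make sure the MySQL JDBC driver jar is present in `$HIVE_HOME/lib`, since `schematool` and the metastore both need it to reach the database.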
08-20-2021
05:46 AM
Hi, I have a problem when I start `hive --service metastore`: there are no errors and no output. I need help solving this. This is the logging info:

SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/apache-hive-3.1.2-bin/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/hdp/3.0.1.0-187/hadoop/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Hive Session ID = 91139bb4-147c-44a8-9b89-d32a47b5d9b8
2021-08-20T13:34:37,266 INFO [main] SessionState: Hive Session ID = 91139bb4-147c-44a8-9b89-d32a47b5d9b8
Logging initialized using configuration in jar:file:/usr/hdp/3.0.1.0-187/apache-hive-3.1.2-bin/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
2021-08-20T13:34:37,342 INFO [main] SessionState: Logging initialized using configuration in jar:file:/usr/hdp/3.0.1.0-187/apache-hive-3.1.2-bin/lib/hive-common-3.1.2.jar!/hive-log4j2.properties Async: true
2021-08-20T13:34:37,345 DEBUG [main] conf.VariableSubstitution: Substitution is on: hive
2021-08-20T13:34:37,470 DEBUG [main] security.SecurityUtil: Setting hadoop.security.token.service.use_ip to true
2021-08-20T13:34:37,509 DEBUG [main] security.Groups: Creating new Groups object
2021-08-20T13:34:37,514 DEBUG [main] util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
2021-08-20T13:34:37,515 DEBUG [main] util.NativeCodeLoader: Loaded the native-hadoop library
2021-08-20T13:34:37,517 DEBUG [main] security.JniBasedUnixGroupsMapping: Using JniBasedUnixGroupsMapping for Group resolution
2021-08-20T13:34:37,517 DEBUG [main] security.JniBasedUnixGroupsMappingWithFallback: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMapping
2021-08-20T13:34:37,521 DEBUG [main] security.Groups: Group mapping impl=org.apache.hadoop.security.JniBasedUnixGroupsMappingWithFallback; cacheTimeout=300000; warningDeltaMs=5000
2021-08-20T13:34:37,555 DEBUG [main] core.Tracer: sampler.classes = ; loaded no samplers
2021-08-20T13:34:37,559 DEBUG [main] core.Tracer: span.receiver.classes = ; loaded no span receivers
2021-08-20T13:34:37,606 DEBUG [main] gcs.GoogleHadoopFileSystemBase: GHFS version: 1.9.0.3.0.1.0-187
2021-08-20T13:34:37,759 DEBUG [main] impl.DfsClientConf: dfs.client.use.legacy.blockreader.local = false
2021-08-20T13:34:37,759 DEBUG [main] impl.DfsClientConf: dfs.client.read.shortcircuit = true
2021-08-20T13:34:37,759 DEBUG [main] impl.DfsClientConf: dfs.client.domain.socket.data.traffic = false
2021-08-20T13:34:37,759 DEBUG [main] impl.DfsClientConf: dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
2021-08-20T13:34:37,778 DEBUG [main] hdfs.DFSClient: Sets dfs.client.block.write.replace-datanode-on-failure.min-replication to 0
2021-08-20T13:34:37,806 DEBUG [main] hdfs.HAUtilClient: No HA service delegation token found for logical URI hdfs://cs-cdp
2021-08-20T13:34:37,806 DEBUG [main] impl.DfsClientConf: dfs.client.use.legacy.blockreader.local = false
2021-08-20T13:34:37,806 DEBUG [main] impl.DfsClientConf: dfs.client.read.shortcircuit = true
2021-08-20T13:34:37,806 DEBUG [main] impl.DfsClientConf: dfs.client.domain.socket.data.traffic = false
2021-08-20T13:34:37,806 DEBUG [main] impl.DfsClientConf: dfs.domain.socket.path = /var/lib/hadoop-hdfs/dn_socket
2021-08-20T13:34:37,836 DEBUG [main] retry.RetryUtils: multipleLinearRandomRetry = null
2021-08-20T13:34:37,871 DEBUG [main] ipc.Server: rpcKind=RPC_PROTOCOL_BUFFER, rpcRequestWrapperClass=class org.apache.hadoop.ipc.ProtobufRpcEngine$RpcProtobufRequest, rpcInvoker=org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker@63f34b70
2021-08-20T13:34:37,876 DEBUG [main] ipc.Client: getting client out of cache: org.apache.hadoop.ipc.Client@411341bd
2021-08-20T13:34:38,554 DEBUG [client DomainSocketWatcher] unix.DomainSocketWatcher: org.apache.hadoop.net.unix.DomainSocketWatcher$2@2fbd6ab: starting with interruptCheckPeriodMs = 60000
2021-08-20T13:34:38,564 DEBUG [main] shortcircuit.DomainSocketFactory: The short-circuit local reads feature is enabled.
2021-08-20T13:34:38,577 DEBUG [main] sasl.DataTransferSaslUtil: DataTransferProtocol not using SaslPropertiesResolver, no QOP found in configuration for dfs.data.transfer.protection
2021-08-20T13:34:38,632 DEBUG [main] ipc.Client: The ping interval is 60000 ms.
2021-08-20T13:34:38,633 DEBUG [main] ipc.Client: Connecting to cs-cdp-master01/10.16.100.230:8020
2021-08-20T13:34:38,661 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root: starting, having connections 1
2021-08-20T13:34:38,666 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root sending #0 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2021-08-20T13:34:38,677 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root got value #0
2021-08-20T13:34:38,677 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 75ms
2021-08-20T13:34:38,725 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root sending #1 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2021-08-20T13:34:38,727 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root got value #1
2021-08-20T13:34:38,727 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
2021-08-20T13:34:38,728 DEBUG [main] exec.Utilities: HDFS dir: /tmp/hive with schema null, permission: rwx-wx-wx
2021-08-20T13:34:38,729 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root sending #2 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2021-08-20T13:34:38,730 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root got value #2
2021-08-20T13:34:38,730 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
2021-08-20T13:34:38,735 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root sending #3 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2021-08-20T13:34:38,736 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root got value #3
2021-08-20T13:34:38,736 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
2021-08-20T13:34:38,739 DEBUG [main] hdfs.DFSClient: /tmp/hive/root/91139bb4-147c-44a8-9b89-d32a47b5d9b8: masked={ masked: rwx------, unmasked: rwx------ }
2021-08-20T13:34:38,743 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root sending #4 org.apache.hadoop.hdfs.protocol.ClientProtocol.mkdirs
2021-08-20T13:34:38,749 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root got value #4
2021-08-20T13:34:38,750 DEBUG [main] ipc.ProtobufRpcEngine: Call: mkdirs took 7ms
2021-08-20T13:34:38,752 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root sending #5 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2021-08-20T13:34:38,753 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root got value #5
2021-08-20T13:34:38,753 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 1ms
2021-08-20T13:34:38,777 DEBUG [main] nativeio.NativeIO: Initialized cache for IDs to User/Group mapping with a cache timeout of 14400 seconds.
2021-08-20T13:34:38,779 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root sending #6 org.apache.hadoop.hdfs.protocol.ClientProtocol.getFileInfo
2021-08-20T13:34:38,780 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root got value #6
2021-08-20T13:34:38,780 DEBUG [main] ipc.ProtobufRpcEngine: Call: getFileInfo took 2ms
2021-08-20T13:34:38,780 DEBUG [main] hdfs.DFSClient: /tmp/hive/root/91139bb4-147c-44a8-9b89-d32a47b5d9b8/_tmp_space.db: masked={ masked: rwx------, unmasked: rwx------ }
2021-08-20T13:34:38,781 DEBUG [IPC Parameter Sending Thread #0] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root sending #7 org.apache.hadoop.hdfs.protocol.ClientProtocol.mkdirs
2021-08-20T13:34:38,787 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root got value #7
2021-08-20T13:34:38,787 DEBUG [main] ipc.ProtobufRpcEngine: Call: mkdirs took 7ms
2021-08-20T13:34:39,648 INFO [91139bb4-147c-44a8-9b89-d32a47b5d9b8 main] metastore.HiveMetaStoreClient: Trying to connect to metastore with URI thrift://cs-cdp-data01:9083
2021-08-20T13:34:39,683 INFO [91139bb4-147c-44a8-9b89-d32a47b5d9b8 main] metastore.HiveMetaStoreClient: Opened a connection to metastore, current connections: 1
2021-08-20T13:34:39,695 DEBUG [91139bb4-147c-44a8-9b89-d32a47b5d9b8 main] security.Groups: GroupCacheLoader - load.
2021-08-20T13:34:39,708 INFO [91139bb4-147c-44a8-9b89-d32a47b5d9b8 main] metastore.HiveMetaStoreClient: Connected to metastore.
2021-08-20T13:34:39,708 INFO [91139bb4-147c-44a8-9b89-d32a47b5d9b8 main] metastore.RetryingMetaStoreClient: RetryingMetaStoreClient proxy=class org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient ugi=root (auth:SIMPLE) retries=1 delay=1 lifetime=0
Hive Session ID = f6f6da0d-5833-471f-950a-4ca80da604b6
2021-08-20T13:34:40,098 INFO [pool-7-thread-1] SessionState: Hive Session ID = f6f6da0d-5833-471f-950a-4ca80da604b6
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
2021-08-20T13:36:19,941 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root: closed
2021-08-20T13:36:19,941 DEBUG [IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root] ipc.Client: IPC Client (443110940) connection to cs-cdp-master01/10.16.100.230:8020 from root: stopped, remaining connections 0
Labels:
- Apache Ambari
- Apache Hive
08-19-2021
05:26 PM
Hi, I have a problem with `hive --service metastore`: when I execute it there is no output and no errors, just "Starting Hive Metastore Server". This is my hive-site.xml:

<configuration>
  <property>
    <name>javax.jdo.option.ConnectionURL</name>
    <value>jdbc:mysql://10.16.100.236/hive?createDatabaseIfNotExist=true</value>
    <description>metadata is stored in a MySQL server</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionDriverName</name>
    <value>com.mysql.cj.jdbc.Driver</value>
    <description>MySQL JDBC driver class</description>
  </property>
  <property>
    <name>hive.metastore.schema.verification</name>
    <value>false</value>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionUserName</name>
    <value>hive</value>
    <description>user name for connecting to mysql server</description>
  </property>
  <property>
    <name>javax.jdo.option.ConnectionPassword</name>
    <value>Anem@2021</value>
    <description>password for connecting to mysql server</description>
  </property>
  <property>
    <name>datanucleus.autoCreateSchema</name>
    <value>false</value>
  </property>
  <property>
    <name>hive.metastore.uris</name>
    <value>thrift://10.16.100.236:9083</value>
    <description>Thrift server hostname and port</description>
  </property>
</configuration>
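Worth noting: "no output after 'Starting Hive Metastore Server'" is often normal, because `hive --service metastore` runs in the foreground and logs elsewhere. A sketch of how to check whether the metastore actually came up (commands assume they are run on the metastore host; the grep target `9083` is the Thrift port from the config above):

```shell
# Run the metastore in the foreground with logs redirected to the console,
# so startup errors (e.g. JDBC failures) become visible immediately.
hive --service metastore --hiveconf hive.root.logger=DEBUG,console

# From another terminal, check whether the Thrift port is listening.
netstat -tlnp | grep 9083
```

If the port is listening, clients configured with `hive.metastore.uris=thrift://10.16.100.236:9083` should be able to connect; if it never opens, the console log from the first command should show the underlying failure.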
Labels:
- Apache Hive
- HDFS