Member since: 08-08-2017
Posts: 1652
Kudos Received: 30
Solutions: 11
My Accepted Solutions
| Title | Views | Posted |
|---|---|---|
| | 1935 | 06-15-2020 05:23 AM |
| | 15610 | 01-30-2020 08:04 PM |
| | 2085 | 07-07-2019 09:06 PM |
| | 8140 | 01-27-2018 10:17 PM |
| | 4616 | 12-31-2017 10:12 PM |
03-15-2019 12:09 PM
@Jay so I need to change the default port? Is that the case?
03-15-2019 08:55 AM
@Jay, for now we started the Thrift server and it is up, but it will go down soon; this happens after roughly an hour or a little more. For now we get:
nc -l 10016
Ncat: bind to :::10016: Address already in use. QUITTING.
But after some time the Thrift server goes down in Ambari, and then the nc command no longer gives the result above. In any case, you said "Or change the spark thrift server setting to use specific address instead of '0.0.0.0' and then see if it works". Do you mean changing the default port from 10016 to another one, for example 10055? 0.0.0.0 is not a real address, or did I misunderstand you?
03-15-2019 07:35 AM
We have an Ambari cluster with two Thrift servers. The first Thrift server always fails with "Address already in use" on the master-node1 machine. We get the following error in the Thrift server log (under /var/log/spark2):
19/03/08 08:42:59 ERROR ThriftCLIService: Error starting HiveServer2: could not start ThriftBinaryCLIService
org.apache.thrift.transport.TTransportException: Could not create ServerSocket on address 0.0.0.0/0.0.0.0:10016.
at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:109)
at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:91)
at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:87)
at org.apache.hive.service.auth.HiveAuthFactory.getServerSocket(HiveAuthFactory.java:241)
at org.apache.hive.service.cli.thrift.ThriftBinaryCLIService.run(ThriftBinaryCLIService.java:66)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.net.BindException: Address already in use (Bind failed)
at java.net.PlainSocketImpl.socketBind(Native Method)
at java.net.AbstractPlainSocketImpl.bind(AbstractPlainSocketImpl.java:387)
at java.net.ServerSocket.bind(ServerSocket.java:375)
at org.apache.thrift.transport.TServerSocket.<init>(TServerSocket.java:106)
... 5 more
The default port for the Thrift server is 10016, so we ran netstat to find out who is using the port:
netstat -tulpn | grep 10016
We get nothing back, meaning no application is using port 10016, so we do not understand how the log can say "Address already in use" when no application is using the port. Any suggestion? For comparison, here is what we get on the good node (master-node2):
# netstat -tulpn | grep 10016
tcp6 0 0 :::10016 :::* LISTEN 26092/java
# ps -ef | grep 26092
hive 26092 1 6 07:14 ? 00:01:34 /usr/jdk64/jdk1.8.0_112/bin/java -Dhdp.version=2.6.4.0-91 ........
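One thing worth checking: `netstat -tulpn` only lists sockets in the LISTEN state, so a socket lingering in TIME_WAIT (or one owned by another user, invisible without root) can still make the bind fail while netstat appears empty. A minimal diagnostic sketch, assuming `ss` and `lsof` are available on the node, using port 10016 from the post:

```shell
#!/bin/sh
PORT=10016

# Show sockets on the port in *every* state, not only LISTEN; a lingering
# TIME_WAIT socket can block a fast restart even though nothing "uses" it.
ss -tan 2>/dev/null | grep ":$PORT" || echo "no sockets on port $PORT"

# Map the port back to a PID (run as root to see other users' sockets).
if command -v lsof >/dev/null 2>&1; then
  lsof -i ":$PORT" || true
fi
```

Running these as root right after the failed start should show whether anything, in any state, is still attached to the port.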
03-07-2019 04:42 PM
How can we recursively copy jar files from HDFS (the jar files are under subfolders) to a local folder?
Example:
export hdfs_folder=/app/lib
export local_folder=/home/work_app
Under /app/lib we have the following subfolders containing the jar files:
/app/lib/folder_jar1
/app/lib/folder_jar2
Under each of the folders above we have jar files. The following command will copy only the jar files directly under /app/lib, but not the jar files under subfolders such as /app/lib/folder_jar1, /app/lib/folder_jar2, etc.:
hadoop fs -copyToLocal $hdfs_folder/*.jar $local_folder
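A glob like `$hdfs_folder/*.jar` does not descend into subdirectories, but `hdfs dfs -ls -R` lists the tree recursively, so each match can be copied one by one. A sketch under the paths from the post (field 8 of the `ls -R` output is the full HDFS path; the `hdfs` calls themselves of course need a running cluster):

```shell
#!/bin/sh
hdfs_folder=/app/lib
local_folder=/home/work_app

# 'hdfs dfs -ls -R' prints one entry per line; the 8th field is the path.
# Keep only the *.jar entries and copy each one into the local folder.
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -ls -R "$hdfs_folder" | awk '{print $8}' | grep '\.jar$' |
  while IFS= read -r jar; do
    hdfs dfs -copyToLocal "$jar" "$local_folder/"
  done
fi
```

Note that this flattens the result: jars with the same filename in different subfolders would overwrite each other in `$local_folder`.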
Labels:
- Apache Hadoop
02-26-2019 09:10 PM
I searched on Google but could not find it: is it possible to create a link between an HDFS folder and a local folder? For example, we want a link between folder_1 in HDFS and the local folder /home/hdfs_mirror.
HDFS folder:
su hdfs
$ hdfs dfs -ls /hdfs_home/folder_1
Linux local folder:
ls /home/hdfs_mirror
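HDFS is not part of the local filesystem, so a plain `ln -s` cannot point into it directly. One common workaround is the HDFS NFS Gateway, an optional Hadoop service that exposes HDFS as an NFS v3 export which can be mounted and then symlinked. A sketch with a hypothetical gateway hostname (`nfs-gateway.example.com` is a placeholder, and the gateway service must actually be running):

```shell
#!/bin/sh
GATEWAY=nfs-gateway.example.com   # placeholder: your NFS Gateway host

# The gateway only speaks NFSv3; mount the HDFS root, then symlink the
# HDFS folder we care about to the desired local path.
if ping -c1 -W1 "$GATEWAY" >/dev/null 2>&1; then
  sudo mkdir -p /mnt/hdfs
  sudo mount -t nfs -o vers=3,proto=tcp,nolock "$GATEWAY:/" /mnt/hdfs
  ln -s /mnt/hdfs/hdfs_home/folder_1 /home/hdfs_mirror
fi
```

After this, `ls /home/hdfs_mirror` would list the HDFS folder's contents through the mount; without a gateway (or a FUSE mount such as hadoop-hdfs-fuse), there is no way to symlink HDFS paths locally.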
02-26-2019 02:17 PM
But in case there are also subfolders under /hdp, what can we do then?
02-25-2019 10:50 PM
The simple way to copy a folder from HDFS to a local folder is like this:
su hdfs -c 'hadoop fs -copyToLocal /hdp /tmp'
In the example above, we copy the hdp folder from HDFS to /tmp/local_folder. But we have a more complicated case. Let's say under /hdp (HDFS) we have subfolders:
/hdp/folder1
/hdp/folder2
/hdp/folder3/folder_a
...
And in each subfolder we have files. In that case, how do we recursively copy only the files under /hdp to the local dir /tmp/local_folder?
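One possible approach: list the tree with `hdfs dfs -ls -R` and keep only regular files (directory entries start with `d`, files with `-`), copying each file into the single local target. A sketch assuming the paths from the post:

```shell
#!/bin/sh
src=/hdp
dest=/tmp/local_folder
mkdir -p "$dest"

# Directory entries in 'ls -R' output start with 'd', plain files with '-'.
# Copy every file, flattening the tree into one local directory.
if command -v hdfs >/dev/null 2>&1; then
  hdfs dfs -ls -R "$src" | grep '^-' | awk '{print $8}' |
  while IFS= read -r f; do
    hdfs dfs -copyToLocal "$f" "$dest/$(basename "$f")"
  done
fi
```

Since the target is flat, files sharing a basename in different subfolders would overwrite each other; keep the subfolder structure instead by dropping the `basename` and recreating directories if that matters.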
02-12-2019 01:30 PM
Example: after we click on the Create App button.
02-12-2019 01:27 PM
But when I click on the Create App button (on the right side), we get a black screen. Why?
02-12-2019 01:25 PM
I solved it by running:
hadoop fs -mkdir /user/slider