Why am I getting "The short-circuit local reads feature cannot be used because libhadoop cannot be loaded"?
Labels: Apache Spark
Created 03-27-2017 08:50 AM
I have installed Spark on Red Hat/CentOS 6.
Installed: Java 1.8, spark-2.1.0-bin-hadoop2.7, Scala 2.12.
The HADOOP_CONF_DIR environment variable is set to the Hadoop config directory, which contains hdfs-site.xml and core-site.xml.
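To check whether those files are actually being picked up, a minimal sketch like the following can be used (ConfCheck is just a throwaway name; a null yarn.resourcemanager.address means no yarn-site.xml on the classpath sets it, so YARN clients fall back to the 0.0.0.0:8032 default):

```scala
import org.apache.hadoop.conf.Configuration

// Throwaway diagnostic: prints what the Hadoop client actually resolves.
object ConfCheck {
  def main(args: Array[String]): Unit = {
    // new Configuration() loads core-default.xml plus any core-site.xml
    // found on the classpath; HADOOP_CONF_DIR must be on that classpath.
    val conf = new Configuration()
    conf.addResource("hdfs-site.xml")
    println("fs.defaultFS = " + conf.get("fs.defaultFS"))
    // null here means no yarn-site.xml on the classpath defines the
    // ResourceManager address, so the client falls back to 0.0.0.0:8032.
    println("yarn.resourcemanager.address = " + conf.get("yarn.resourcemanager.address"))
  }
}
```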
While executing, I get the warning below and I am not able to write to HDFS:
17/03/27 03:48:18 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/03/27 03:48:18 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.4.124.113:4040
17/03/27 03:48:18 INFO SparkContext: Added JAR file:/storm/Teja/spark/target/uber-spark_kafka-0.0.1-SNAPSHOT.jar at spark://10.4.124.113:50101/jars/uber-spark_kafka-0.0.1-SNAPSHOT.jar with timestamp 1490600898913
17/03/27 03:48:20 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
17/03/27 03:48:20 INFO RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
17/03/27 03:48:21 INFO Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 0 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
17/03/27 03:48:22 INFO Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 1 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime=1000 MILLISECONDS)
17/03/27 03:48:23 INFO Client: Retrying connect to server: 0.0.0.0/0.0.0.0:8032. Already tried 2 time(s); retry policy is RetryUpToMaximumCountWithFixedSleep(maxRetries=50, sleepTime
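The job itself does nothing exotic; a minimal sketch of the kind of write that fails (HdfsWriteCheck and the output path are placeholders, not my actual code) is:

```scala
import org.apache.spark.sql.SparkSession

object HdfsWriteCheck {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("HdfsWriteCheck")
      .getOrCreate()

    // Trivial dataset written to HDFS; the path is a placeholder.
    val data = spark.sparkContext.parallelize(Seq("a", "b", "c"))
    data.saveAsTextFile("hdfs:///tmp/hdfs-write-check")

    spark.stop()
  }
}
```

Note that the DomainSocketFactory line is only a warning: it means the native libhadoop library cannot be loaded, so HDFS short-circuit local reads are disabled, but it does not by itself block writes. The repeated RMProxy retries against 0.0.0.0:8032 (the YARN default ResourceManager address) are what keep the job from running, and they suggest yarn-site.xml is not on the client's classpath.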
Created 03-30-2017 12:51 AM
