Created 02-27-2016 01:56 AM
Below is what I get when I initiate spark-shell. Can the warning below be ignored?

[root@hdp-m hdfs]# spark-shell
16/02/27 01:52:04 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.5.2
      /_/

Using Scala version 2.10.4 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_60)
Type in expressions to have them evaluated.
Type :help for more information.
16/02/27 01:52:17 WARN MetricsSystem: Using default name DAGScheduler for source because spark.app.id is not set.
Spark context available as sc.
16/02/27 01:52:25 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
16/02/27 01:52:27 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/02/27 01:52:28 WARN DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.
SQL context available as sqlContext.
Created 02-27-2016 01:57 AM
@Prakash Punj Yes, you can ignore that warning
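If the warning clutters the console, it can also be silenced through log4j rather than ignored each time. A minimal sketch, assuming a default Spark install where log4j.properties lives under $SPARK_HOME/conf (copy it from log4j.properties.template if it does not exist yet):

```
# In $SPARK_HOME/conf/log4j.properties:
# raise the NativeCodeLoader logger above WARN so the message is suppressed
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```

This only hides the message; Spark continues to fall back to the builtin-java classes.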
Created 02-16-2022 06:47 PM
This answer may not help the person who asked, but I am posting it anyway in case somebody else benefits from it.
There are multiple ways to solve this, as explained in this article.
For me, setting the environment variables below solved the problem.
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
# Or, if the Hadoop libraries are installed under /usr/lib/:
export LD_LIBRARY_PATH=/usr/lib/hadoop/lib/native
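The fix above can be checked from the shell before restarting spark-shell. A hedged sketch, assuming HADOOP_HOME is already set in your session (adjust the path if your distribution installs under /usr/lib/hadoop instead); the `hadoop checknative` command reports whether the native Hadoop library actually loads:

```shell
# Point the dynamic linker at Hadoop's bundled native libraries,
# keeping any existing LD_LIBRARY_PATH entries.
export LD_LIBRARY_PATH="$HADOOP_HOME/lib/native:$LD_LIBRARY_PATH"

# Confirm the variable now contains the native lib directory.
echo "$LD_LIBRARY_PATH"

# Then verify the native library loads (uncomment on a cluster node):
# hadoop checknative -a
```

If `hadoop checknative -a` still shows `hadoop: false`, the native build does not match your platform and the warning will persist, but it remains harmless.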
Thanks