<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Sandbox HDP 2.5.0 - Spark 1.6.2 - Issues: GPLNativeCodeLoader: Could not load native gpl library - LzoCodec: Cannot load native-lzo without native-hadoop in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Sandbox-HDP-2-5-0-Spark-1-6-2-Issues-GPLNativeCodeLoader/m-p/123160#M85904</link>
    <description>&lt;P&gt;@anandi Thanks, the word count fix works great! However, I applied the fix by editing/adding the properties in Ambari, so they won't get overwritten if I make another change at that level. In my opinion, that is preferable to editing the config file directly.&lt;/P&gt;</description>
    <pubDate>Thu, 01 Sep 2016 08:37:23 GMT</pubDate>
    <dc:creator>Former Member</dc:creator>
    <dc:date>2016-09-01T08:37:23Z</dc:date>
    <item>
      <title>Sandbox HDP 2.5.0 - Spark 1.6.2 - Issues: GPLNativeCodeLoader: Could not load native gpl library - LzoCodec: Cannot load native-lzo without native-hadoop</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sandbox-HDP-2-5-0-Spark-1-6-2-Issues-GPLNativeCodeLoader/m-p/123158#M85902</link>
      <description>&lt;P&gt;Sandbox HDP-2.5.0 TP, Spark 1.6.2 - I am encountering the following errors while running a simple word count on spark-shell:&lt;/P&gt;&lt;P&gt;ERROR GPLNativeCodeLoader: Could not load native gpl library&lt;/P&gt;&lt;P&gt;ERROR LzoCodec: Cannot load native-lzo without native-hadoop&lt;/P&gt;&lt;PRE&gt;[root@sandbox ~]# cd $SPARK_HOME
[root@sandbox spark-client]# ./bin/spark-shell --master yarn-client --driver-memory 512m --executor-memory 512m --jars /usr/hdp/2.5.0.0-817/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-817.jar&lt;/PRE&gt;&lt;P&gt;The following code is submitted at the Spark CLI:&lt;/P&gt;&lt;PRE&gt;val file = sc.textFile("/tmp/data")
val counts = file.flatMap(line =&amp;gt; line.split(" ")).map(word =&amp;gt; (word, 1)).reduceByKey(_ + _)
counts.saveAsTextFile("/tmp/wordcount")&lt;/PRE&gt;
&lt;P&gt;This yields the following errors:&lt;/P&gt;&lt;P&gt;ERROR GPLNativeCodeLoader: Could not load native gpl library&lt;/P&gt;&lt;P&gt;ERROR LzoCodec: Cannot load native-lzo without native-hadoop&lt;/P&gt;&lt;P&gt;The same error appears with or without adding the --jars parameter below:&lt;/P&gt;&lt;P&gt;--jars /usr/hdp/2.5.0.0-817/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-817.jar&lt;/P&gt;&lt;P&gt;Full Log:&lt;/P&gt;&lt;PRE&gt;[root@sandbox ~]# cd $SPARK_HOME
[root@sandbox spark-client]# ./bin/spark-shell --master yarn-client --driver-memory 512m --executor-memory 512m --jars /usr/hdp/2.5.0.0-817/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-817.jar
16/08/27 16:28:23 INFO SecurityManager: Changing view acls to: root
16/08/27 16:28:23 INFO SecurityManager: Changing modify acls to: root
16/08/27 16:28:23 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/08/27 16:28:23 INFO HttpServer: Starting HTTP Server
16/08/27 16:28:23 INFO Server: jetty-8.y.z-SNAPSHOT
16/08/27 16:28:23 INFO AbstractConnector: Started SocketConnector@0.0.0.0:43011
16/08/27 16:28:23 INFO Utils: Successfully started service 'HTTP class server' on port 43011.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.2
      /_/
Using Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_101)
Type in expressions to have them evaluated.
Type :help for more information.
16/08/27 16:28:26 INFO SparkContext: Running Spark version 1.6.2
16/08/27 16:28:26 INFO SecurityManager: Changing view acls to: root
16/08/27 16:28:26 INFO SecurityManager: Changing modify acls to: root
16/08/27 16:28:26 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/08/27 16:28:26 INFO Utils: Successfully started service 'sparkDriver' on port 45506.
16/08/27 16:28:27 INFO Slf4jLogger: Slf4jLogger started
16/08/27 16:28:27 INFO Remoting: Starting remoting
16/08/27 16:28:27 INFO Remoting: Remoting started; listening on addresses :[akka.tcp://sparkDriverActorSystem@10.0.2.15:44829]
16/08/27 16:28:27 INFO Utils: Successfully started service 'sparkDriverActorSystem' on port 44829.
16/08/27 16:28:27 INFO SparkEnv: Registering MapOutputTracker
16/08/27 16:28:27 INFO SparkEnv: Registering BlockManagerMaster
16/08/27 16:28:27 INFO DiskBlockManager: Created local directory at /tmp/blockmgr-0776b175-5dd7-49b9-adf7-f2cbd85a1e1b
16/08/27 16:28:27 INFO MemoryStore: MemoryStore started with capacity 143.6 MB
16/08/27 16:28:27 INFO SparkEnv: Registering OutputCommitCoordinator
16/08/27 16:28:27 INFO Server: jetty-8.y.z-SNAPSHOT
16/08/27 16:28:27 INFO AbstractConnector: Started SelectChannelConnector@0.0.0.0:4040
16/08/27 16:28:27 INFO Utils: Successfully started service 'SparkUI' on port 4040.
16/08/27 16:28:27 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://10.0.2.15:4040
16/08/27 16:28:27 INFO HttpFileServer: HTTP File server directory is /tmp/spark-61ecb98e-989c-4396-9b30-032c4d5a2b90/httpd-857ce699-7db0-428c-9af5-1dca4ec5330d
16/08/27 16:28:27 INFO HttpServer: Starting HTTP Server
16/08/27 16:28:27 INFO Server: jetty-8.y.z-SNAPSHOT
16/08/27 16:28:27 INFO AbstractConnector: Started SocketConnector@0.0.0.0:37515
16/08/27 16:28:27 INFO Utils: Successfully started service 'HTTP file server' on port 37515.
16/08/27 16:28:27 INFO SparkContext: Added JAR file:/usr/hdp/2.5.0.0-817/hadoop/lib/hadoop-lzo-0.6.0.2.5.0.0-817.jar at http://10.0.2.15:37515/jars/hadoop-lzo-0.6.0.2.5.0.0-817.jar with timestamp 1472315307772
spark.yarn.driver.memoryOverhead is set but does not apply in client mode.
16/08/27 16:28:28 INFO TimelineClientImpl: Timeline service address: http://sandbox.hortonworks.com:8188/ws/v1/timeline/
16/08/27 16:28:28 INFO RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050
16/08/27 16:28:28 INFO Client: Requesting a new application from cluster with 1 NodeManagers
16/08/27 16:28:28 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2250 MB per container)
16/08/27 16:28:28 INFO Client: Will allocate AM container, with 896 MB memory including 384 MB overhead
16/08/27 16:28:28 INFO Client: Setting up container launch context for our AM
16/08/27 16:28:28 INFO Client: Setting up the launch environment for our AM container
16/08/27 16:28:28 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.5.0.0-817/spark/spark-hdp-assembly.jar
16/08/27 16:28:28 INFO Client: Preparing resources for our AM container
16/08/27 16:28:28 INFO Client: Using the spark assembly jar on HDFS because you are using HDP, defaultSparkAssembly:hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.5.0.0-817/spark/spark-hdp-assembly.jar
16/08/27 16:28:28 INFO Client: Source and destination file systems are the same. Not copying hdfs://sandbox.hortonworks.com:8020/hdp/apps/2.5.0.0-817/spark/spark-hdp-assembly.jar
16/08/27 16:28:29 INFO Client: Uploading resource file:/tmp/spark-61ecb98e-989c-4396-9b30-032c4d5a2b90/__spark_conf__5084804354575467223.zip -&amp;gt; hdfs://sandbox.hortonworks.com:8020/user/root/.sparkStaging/application_1472312154461_0006/__spark_conf__5084804354575467223.zip
16/08/27 16:28:29 INFO SecurityManager: Changing view acls to: root
16/08/27 16:28:29 INFO SecurityManager: Changing modify acls to: root
16/08/27 16:28:29 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); users with modify permissions: Set(root)
16/08/27 16:28:29 INFO Client: Submitting application 6 to ResourceManager
16/08/27 16:28:29 INFO YarnClientImpl: Submitted application application_1472312154461_0006
16/08/27 16:28:29 INFO SchedulerExtensionServices: Starting Yarn extension services with app application_1472312154461_0006 and attemptId None
16/08/27 16:28:30 INFO Client: Application report for application_1472312154461_0006 (state: ACCEPTED)
16/08/27 16:28:30 INFO Client:
         client token: N/A
         diagnostics: AM container is launched, waiting for AM container to Register with RM
         ApplicationMaster host: N/A
         ApplicationMaster RPC port: -1
         queue: default
         start time: 1472315309252
         final status: UNDEFINED
         tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1472312154461_0006/
         user: root
16/08/27 16:28:31 INFO Client: Application report for application_1472312154461_0006 (state: ACCEPTED)
16/08/27 16:28:32 INFO YarnSchedulerBackend$YarnSchedulerEndpoint: ApplicationMaster registered as NettyRpcEndpointRef(null)
16/08/27 16:28:32 INFO YarnClientSchedulerBackend: Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -&amp;gt; sandbox.hortonworks.com, PROXY_URI_BASES -&amp;gt; http://sandbox.hortonworks.com:8088/proxy/application_1472312154461_0006), /proxy/application_1472312154461_0006
16/08/27 16:28:32 INFO JettyUtils: Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
16/08/27 16:28:32 INFO Client: Application report for application_1472312154461_0006 (state: RUNNING)
16/08/27 16:28:32 INFO Client:
         client token: N/A
         diagnostics: N/A
         ApplicationMaster host: 10.0.2.15
         ApplicationMaster RPC port: 0
         queue: default
         start time: 1472315309252
         final status: UNDEFINED
         tracking URL: http://sandbox.hortonworks.com:8088/proxy/application_1472312154461_0006/
         user: root
16/08/27 16:28:32 INFO YarnClientSchedulerBackend: Application application_1472312154461_0006 has started running.
16/08/27 16:28:32 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 34124.
16/08/27 16:28:32 INFO NettyBlockTransferService: Server created on 34124
16/08/27 16:28:32 INFO BlockManagerMaster: Trying to register BlockManager
16/08/27 16:28:32 INFO BlockManagerMasterEndpoint: Registering block manager 10.0.2.15:34124 with 143.6 MB RAM, BlockManagerId(driver, 10.0.2.15, 34124)
16/08/27 16:28:32 INFO BlockManagerMaster: Registered BlockManager
16/08/27 16:28:32 INFO EventLoggingListener: Logging events to hdfs:///spark-history/application_1472312154461_0006
16/08/27 16:28:36 INFO YarnClientSchedulerBackend: Registered executor NettyRpcEndpointRef(null) (sandbox.hortonworks.com:39728) with ID 1
16/08/27 16:28:36 INFO BlockManagerMasterEndpoint: Registering block manager sandbox.hortonworks.com:38362 with 143.6 MB RAM, BlockManagerId(1, sandbox.hortonworks.com, 38362)
16/08/27 16:28:57 INFO YarnClientSchedulerBackend: SchedulerBackend is ready for scheduling beginning after waiting maxRegisteredResourcesWaitingTime: 30000(ms)
16/08/27 16:28:57 INFO SparkILoop: Created spark context..
Spark context available as sc.
16/08/27 16:28:58 INFO HiveContext: Initializing execution hive, version 1.2.1
16/08/27 16:28:58 INFO ClientWrapper: Inspected Hadoop version: 2.7.1.2.5.0.0-817
16/08/27 16:28:58 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.1.2.5.0.0-817
16/08/27 16:28:58 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
16/08/27 16:28:58 INFO ObjectStore: ObjectStore, initialize called
16/08/27 16:28:58 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored
16/08/27 16:28:58 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
16/08/27 16:28:59 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/08/27 16:28:59 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/08/27 16:29:00 INFO ObjectStore: Setting MetaStore object pin classes with hive.metastore.cache.pinobjtypes="Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order"
16/08/27 16:29:01 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/08/27 16:29:01 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/08/27 16:29:02 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MFieldSchema" is tagged as "embedded-only" so does not have its own datastore table.
16/08/27 16:29:02 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MOrder" is tagged as "embedded-only" so does not have its own datastore table.
16/08/27 16:29:02 INFO MetaStoreDirectSql: Using direct SQL, underlying DB is DERBY
16/08/27 16:29:02 INFO ObjectStore: Initialized ObjectStore
16/08/27 16:29:02 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/08/27 16:29:02 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/08/27 16:29:03 INFO HiveMetaStore: Added admin role in metastore
16/08/27 16:29:03 INFO HiveMetaStore: Added public role in metastore
16/08/27 16:29:03 INFO HiveMetaStore: No user is added in admin role, since config is empty
16/08/27 16:29:03 INFO HiveMetaStore: 0: get_all_databases
16/08/27 16:29:03 INFO audit: ugi=root  ip=unknown-ip-addr      cmd=get_all_databases
16/08/27 16:29:03 INFO HiveMetaStore: 0: get_functions: db=default pat=*
16/08/27 16:29:03 INFO audit: ugi=root  ip=unknown-ip-addr      cmd=get_functions: db=default pat=*
16/08/27 16:29:03 INFO Datastore: The class "org.apache.hadoop.hive.metastore.model.MResourceUri" is tagged as "embedded-only" so does not have its own datastore table.
16/08/27 16:29:03 INFO SessionState: Created local directory: /tmp/6ebb0a60-b229-4dad-94a3-e2386ba7b4ec_resources
16/08/27 16:29:03 INFO SessionState: Created HDFS directory: /tmp/hive/root/6ebb0a60-b229-4dad-94a3-e2386ba7b4ec
16/08/27 16:29:03 INFO SessionState: Created local directory: /tmp/root/6ebb0a60-b229-4dad-94a3-e2386ba7b4ec
16/08/27 16:29:03 INFO SessionState: Created HDFS directory: /tmp/hive/root/6ebb0a60-b229-4dad-94a3-e2386ba7b4ec/_tmp_space.db
16/08/27 16:29:03 INFO HiveContext: default warehouse location is /user/hive/warehouse
16/08/27 16:29:03 INFO HiveContext: Initializing HiveMetastoreConnection version 1.2.1 using Spark classes.
16/08/27 16:29:03 INFO ClientWrapper: Inspected Hadoop version: 2.7.1.2.5.0.0-817
16/08/27 16:29:03 INFO ClientWrapper: Loaded org.apache.hadoop.hive.shims.Hadoop23Shims for Hadoop version 2.7.1.2.5.0.0-817
16/08/27 16:29:04 INFO metastore: Trying to connect to metastore with URI thrift://sandbox.hortonworks.com:9083
16/08/27 16:29:04 INFO metastore: Connected to metastore.
16/08/27 16:29:04 INFO SessionState: Created local directory: /tmp/83a1e2d3-8c24-4f12-9841-fab259a77514_resources
16/08/27 16:29:04 INFO SessionState: Created HDFS directory: /tmp/hive/root/83a1e2d3-8c24-4f12-9841-fab259a77514
16/08/27 16:29:04 INFO SessionState: Created local directory: /tmp/root/83a1e2d3-8c24-4f12-9841-fab259a77514
16/08/27 16:29:04 INFO SessionState: Created HDFS directory: /tmp/hive/root/83a1e2d3-8c24-4f12-9841-fab259a77514/_tmp_space.db
16/08/27 16:29:04 INFO SparkILoop: Created sql context (with Hive support)..
SQL context available as sqlContext.
scala&amp;gt; val file = sc.textFile("/tmp/data")
16/08/27 16:29:20 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 234.8 KB, free 234.8 KB)
16/08/27 16:29:20 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 28.1 KB, free 262.9 KB)
16/08/27 16:29:20 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 10.0.2.15:34124 (size: 28.1 KB, free: 143.6 MB)
16/08/27 16:29:20 INFO SparkContext: Created broadcast 0 from textFile at &amp;lt;console&amp;gt;:27
file: org.apache.spark.rdd.RDD[String] = /tmp/data MapPartitionsRDD[1] at textFile at &amp;lt;console&amp;gt;:27
scala&amp;gt; val counts = file.flatMap(line =&amp;gt; line.split(" ")).map(word =&amp;gt; (word, 1)).reduceByKey(_ + _)
16/08/27 16:29:35 ERROR GPLNativeCodeLoader: Could not load native gpl library
java.lang.UnsatisfiedLinkError: no gplcompression in java.library.path
        at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1889)
        at java.lang.Runtime.loadLibrary0(Runtime.java:849)
        at java.lang.System.loadLibrary(System.java:1088)
        at com.hadoop.compression.lzo.GPLNativeCodeLoader.&amp;lt;clinit&amp;gt;(GPLNativeCodeLoader.java:32)
        at com.hadoop.compression.lzo.LzoCodec.&amp;lt;clinit&amp;gt;(LzoCodec.java:71)
        at java.lang.Class.forName0(Native Method)
        at java.lang.Class.forName(Class.java:278)
        at org.apache.hadoop.conf.Configuration.getClassByNameOrNull(Configuration.java:2147)
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2112)
        at org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:132)
        at org.apache.hadoop.io.compress.CompressionCodecFactory.&amp;lt;init&amp;gt;(CompressionCodecFactory.java:179)
        at org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
        at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:78)
        at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:136)
        at org.apache.spark.rdd.HadoopRDD.getInputFormat(HadoopRDD.scala:189)
        at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:202)
        at org.apache.spark.rdd.RDD$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
        at org.apache.spark.rdd.RDD$anonfun$partitions$2.apply(RDD.scala:242)
        at org.apache.spark.rdd.RDD$anonfun$partitions$2.apply(RDD.scala:240)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.rdd.RDD.partitions(RDD.scala:240)
        at org.apache.spark.Partitioner$.defaultPartitioner(Partitioner.scala:65)
        at org.apache.spark.rdd.PairRDDFunctions$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331)
        at org.apache.spark.rdd.PairRDDFunctions$anonfun$reduceByKey$3.apply(PairRDDFunctions.scala:331)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:150)
        at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:111)
        at org.apache.spark.rdd.RDD.withScope(RDD.scala:323)
        at org.apache.spark.rdd.PairRDDFunctions.reduceByKey(PairRDDFunctions.scala:330)
        at $line19.$read$iwC$iwC$iwC$iwC$iwC$iwC$iwC$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:29)
        at $line19.$read$iwC$iwC$iwC$iwC$iwC$iwC$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:34)
        at $line19.$read$iwC$iwC$iwC$iwC$iwC$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:36)
        at 
$line19.$read$iwC$iwC$iwC$iwC$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:38)                                                    &lt;/LI&gt;&lt;LI&gt;        at $line19.$read$iwC$iwC$iwC$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:40)                                                         &lt;/LI&gt;&lt;LI&gt;        at $line19.$read$iwC$iwC$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:42)                                                              &lt;/LI&gt;&lt;LI&gt;        at $line19.$read$iwC$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:44)                                                                   &lt;/LI&gt;&lt;LI&gt;        at $line19.$read$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:46)                                                                        &lt;/LI&gt;&lt;LI&gt;        at $line19.$read.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:48)                                                                             &lt;/LI&gt;&lt;LI&gt;        at $line19.$read$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:52)                                                                            &lt;/LI&gt;&lt;LI&gt;        at $line19.$read$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)                                                                             &lt;/LI&gt;&lt;LI&gt;        at $line19.$eval$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:7)                                                                             &lt;/LI&gt;&lt;LI&gt;        at $line19.$eval$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)                                                                             &lt;/LI&gt;&lt;LI&gt;        at $line19.$eval.$print(&amp;lt;console&amp;gt;)                                                                                &lt;/LI&gt;&lt;LI&gt;        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)                                                    &lt;/LI&gt;&lt;LI&gt;        at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)                                  &lt;/LI&gt;&lt;LI&gt;        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)                          &lt;/LI&gt;&lt;LI&gt;        at java.lang.reflect.Method.invoke(Method.java:606)                                                               &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)                                     &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)                                     &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)                                         &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)                                               &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)                                               &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)                                       &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)                                   &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)                                                 &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)                                           &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)                                             &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$loop(SparkILoop.scala:670)                  &lt;/LI&gt;&lt;LI&gt;        
at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1.apply$mcZ$sp(SparkILoop.s&lt;/LI&gt;&lt;LI&gt;cala:997)                                                                                                                 &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1.apply(SparkILoop.scala:94&lt;/LI&gt;&lt;LI&gt;5)                                                                                                                        &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop$anonfun$org$apache$spark$repl$SparkILoop$process$1.apply(SparkILoop.scala:94&lt;/LI&gt;&lt;LI&gt;5)                                                                                                                        &lt;/LI&gt;&lt;LI&gt;        at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)                         &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$process(SparkILoop.scala:945)               &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)                                                &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.Main$.main(Main.scala:31)                                                                &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.repl.Main.main(Main.scala)                                                                    &lt;/LI&gt;&lt;LI&gt;        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)                                                    &lt;/LI&gt;&lt;LI&gt;        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)                                  &lt;/LI&gt;&lt;LI&gt;        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)                          &lt;/LI&gt;&lt;LI&gt;        at 
java.lang.reflect.Method.invoke(Method.java:606)                                                               &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$runMain(SparkSubmit.scala:731)       &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)                                        &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)                                             &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)                                               &lt;/LI&gt;&lt;LI&gt;        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)                                                    &lt;/LI&gt;&lt;LI&gt;16/08/27 16:29:35 ERROR LzoCodec: Cannot load native-lzo without native-hadoop                                            &lt;/LI&gt;&lt;LI&gt;16/08/27 16:29:35 INFO FileInputFormat: Total input paths to process : 1                                                  &lt;/LI&gt;&lt;LI&gt;counts: org.apache.spark.rdd.RDD[(String, Int)] = ShuffledRDD[4] at reduceByKey at &amp;lt;console&amp;gt;:29                           &lt;/LI&gt;&lt;LI&gt;scala&amp;gt;                                                                                                                    &lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Please help to fix this issue.&lt;/P&gt;</description>
      <pubDate>Mon, 29 Aug 2016 04:29:16 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sandbox-HDP-2-5-0-Spark-1-6-2-Issues-GPLNativeCodeLoader/m-p/123158#M85902</guid>
      <dc:creator>anandi</dc:creator>
      <dc:date>2016-08-29T04:29:16Z</dc:date>
    </item>
    <item>
      <title>Re: Sandbox HDP 2.5.0 - Spark 1.6.2 - Issues: GPLNativeCodeLoader: Could not load native gpl library - LzoCodec: Cannot load native-lzo without native-hadoop</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sandbox-HDP-2-5-0-Spark-1-6-2-Issues-GPLNativeCodeLoader/m-p/123159#M85903</link>
      <description>&lt;P&gt;Resolution done on Spark 2.0.0.&lt;/P&gt;&lt;P&gt;Resolution for the Spark Submit issue: add a java-opts file in /usr/hdp/current/spark2-client/conf/&lt;/P&gt;&lt;PRE&gt;&lt;/PRE&gt;&lt;OL&gt;&lt;LI&gt;[root@sandbox conf]# cat java-opts&lt;/LI&gt;&lt;LI&gt;-Dhdp.version=2.5.0.0-817&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Spark Submit working example:&lt;/P&gt;&lt;PRE&gt;&lt;/PRE&gt;&lt;OL&gt;&lt;LI&gt;[root@sandbox spark2-client]# ./bin/spark-submit --class org.apache.spark.examples.SparkPi --master yarn --deploy-mode cluster --driver-memory 2g --executor-memory 2g --executor-cores 1 examples/jars/spark-examples*.jar 10&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:57 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:58 WARN shortcircuit.DomainSocketFactory: The short-circuit local reads feature cannot be used because libhadoop cannot be loaded.&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:58 INFO client.RMProxy: Connecting to ResourceManager at sandbox.hortonworks.com/10.0.2.15:8050&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:58 INFO yarn.Client: Requesting a new application from cluster with 1 NodeManagers&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:58 INFO yarn.Client: Verifying our application has not requested more than the maximum memory capability of the cluster (7680 MB per container)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:58 INFO yarn.Client: Will allocate AM container, with 2248 MB memory including 200 MB overhead&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:58 INFO yarn.Client: Setting up container launch context for our AM&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:58 INFO yarn.Client: Setting up the launch environment for our AM container&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:58 INFO yarn.Client: Preparing resources for our AM container&lt;/LI&gt;&lt;LI&gt;16/08/29 17:44:58 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:00 INFO yarn.Client: Uploading resource file:/tmp/spark-38890bfc-d672-4c7d-bef9-d646c420836b/__spark_libs__3503948162159958877.zip -&amp;gt; hdfs://sandbox.hortonworks.com:8020/user/root/.sparkStaging/application_1472397144295_0006/__spark_libs__3503948162159958877.zip&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 INFO yarn.Client: Uploading resource file:/usr/hdp/2.5.0.0-817/spark2/examples/jars/spark-examples_2.11-2.0.0.jar -&amp;gt; hdfs://sandbox.hortonworks.com:8020/user/root/.sparkStaging/application_1472397144295_0006/spark-examples_2.11-2.0.0.jar&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 INFO yarn.Client: Uploading resource file:/tmp/spark-38890bfc-d672-4c7d-bef9-d646c420836b/__spark_conf__4613069544481307021.zip -&amp;gt; hdfs://sandbox.hortonworks.com:8020/user/root/.sparkStaging/application_1472397144295_0006/__spark_conf__.zip&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 WARN yarn.Client: spark.yarn.am.extraJavaOptions will not take effect in cluster mode&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 INFO spark.SecurityManager: Changing view acls to: root&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 INFO spark.SecurityManager: Changing modify acls to: root
&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 INFO spark.SecurityManager: Changing view acls groups to:&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 INFO spark.SecurityManager: Changing modify acls groups to:&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 INFO yarn.Client: Submitting application application_1472397144295_0006 to ResourceManager&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:01 INFO impl.YarnClientImpl: Submitted application application_1472397144295_0006&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:02 INFO yarn.Client: Application report for application_1472397144295_0006 (state: ACCEPTED)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:02 INFO yarn.Client:&lt;/LI&gt;&lt;LI&gt;         client token: N/A&lt;/LI&gt;&lt;LI&gt;         diagnostics: AM container is launched, waiting for AM container to Register with RM&lt;/LI&gt;&lt;LI&gt;         ApplicationMaster host: N/A&lt;/LI&gt;&lt;LI&gt;         ApplicationMaster RPC port: -1&lt;/LI&gt;&lt;LI&gt;         queue: default&lt;/LI&gt;&lt;LI&gt;         start time: 1472492701409&lt;/LI&gt;&lt;LI&gt;         final status: UNDEFINED&lt;/LI&gt;&lt;LI&gt;         tracking URL: &lt;A href="http://sandbox.hortonworks.com:8088/proxy/application_1472397144295_0006/"&gt;http://sandbox.hortonworks.com:8088/proxy/application_1472397144295_0006/&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;         user: root&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:03 INFO yarn.Client: Application report for application_1472397144295_0006 (state: ACCEPTED)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:04 INFO yarn.Client: Application report for application_1472397144295_0006 (state: ACCEPTED)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:05 INFO yarn.Client: Application report for application_1472397144295_0006 (state: ACCEPTED)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:06 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:06 INFO yarn.Client:&lt;/LI&gt;&lt;LI&gt;         client token: N/A&lt;/LI&gt;&lt;LI&gt;         diagnostics: N/A&lt;/LI&gt;&lt;LI&gt;         ApplicationMaster host: 10.0.2.15&lt;/LI&gt;&lt;LI&gt;         ApplicationMaster RPC port: 0&lt;/LI&gt;&lt;LI&gt;         queue: default&lt;/LI&gt;&lt;LI&gt;         start time: 1472492701409&lt;/LI&gt;&lt;LI&gt;         final status: UNDEFINED&lt;/LI&gt;&lt;LI&gt;         tracking URL: &lt;A href="http://sandbox.hortonworks.com:8088/proxy/application_1472397144295_0006/"&gt;http://sandbox.hortonworks.com:8088/proxy/application_1472397144295_0006/&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;         user: root
&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:07 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:08 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:09 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:10 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:11 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:12 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:13 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:14 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:15 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:16 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:17 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:18 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:19 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:20 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:21 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:22 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:23 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:24 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:25 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:26 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:27 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:28 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:29 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:30 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:31 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:32 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:33 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:34 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:35 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:36 INFO yarn.Client: Application report for application_1472397144295_0006 (state: RUNNING)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:37 INFO yarn.Client: Application report for application_1472397144295_0006 (state: FINISHED)&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:37 INFO yarn.Client:&lt;/LI&gt;&lt;LI&gt;         client token: N/A&lt;/LI&gt;&lt;LI&gt;         diagnostics: N/A&lt;/LI&gt;&lt;LI&gt;         ApplicationMaster host: 10.0.2.15&lt;/LI&gt;&lt;LI&gt;         ApplicationMaster RPC port: 0&lt;/LI&gt;&lt;LI&gt;         queue: default&lt;/LI&gt;&lt;LI&gt;         start time: 1472492701409&lt;/LI&gt;&lt;LI&gt;         final status: SUCCEEDED&lt;/LI&gt;&lt;LI&gt;         tracking URL: &lt;A href="http://sandbox.hortonworks.com:8088/proxy/application_1472397144295_0006/"&gt;http://sandbox.hortonworks.com:8088/proxy/application_1472397144295_0006/&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;         user: root&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:37 INFO util.ShutdownHookManager: Shutdown hook called&lt;/LI&gt;&lt;LI&gt;16/08/29 17:45:37 INFO util.ShutdownHookManager: Deleting directory /tmp/spark-38890bfc-d672-4c7d-bef9-d646c420836b&lt;/LI&gt;&lt;LI&gt;[root@sandbox spark2-client]#&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Resolution for the Spark Shell issue (LZO codec): add the following two lines to your spark-defaults.conf&lt;/P&gt;&lt;PRE&gt;&lt;/PRE&gt;&lt;OL&gt;&lt;LI&gt;spark.driver.extraClassPath /usr/hdp/current/hadoop-client/lib/hadoop-lzo-0.6.0.2.5.0.0-817.jar&lt;/LI&gt;&lt;LI&gt;spark.driver.extraLibraryPath /usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64&lt;/LI&gt;&lt;/OL&gt;&lt;P&gt;Spark Shell working example:&lt;/P&gt;&lt;PRE&gt;&lt;/PRE&gt;&lt;OL&gt;&lt;LI&gt;[root@sandbox spark2-client]# ./bin/spark-shell --master yarn --deploy-mode client --driver-memory 2g --executor-memory 2g --executor-cores 1&lt;/LI&gt;&lt;LI&gt;Setting default log level to "WARN".&lt;/LI&gt;&lt;LI&gt;To adjust logging level use sc.setLogLevel(newLevel).&lt;/LI&gt;&lt;LI&gt;16/08/29 17:47:09 WARN yarn.Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.&lt;/LI&gt;&lt;LI&gt;16/08/29 17:47:21 WARN spark.SparkContext: Use an existing SparkContext, some configuration may not take effect.&lt;/LI&gt;&lt;LI&gt;Spark context Web UI available at &lt;A href="http://10.0.2.15:4041/"&gt;http://10.0.2.15:4041&lt;/A&gt;&lt;/LI&gt;&lt;LI&gt;Spark context available as 'sc' (master = yarn, app id = application_1472397144295_0007).&lt;/LI&gt;&lt;LI&gt;Spark session available as 'spark'.&lt;/LI&gt;&lt;LI&gt;Welcome to&lt;/LI&gt;&lt;LI&gt;      ____              __&lt;/LI&gt;&lt;LI&gt;     / __/__  ___ _____/ /__&lt;/LI&gt;&lt;LI&gt;    _\ \/ _ \/ _ `/ __/ '_/
                                                                       &lt;/LI&gt;&lt;LI&gt;   /___/ .__/\_,_/_/ /_/\_\   version 2.0.0                                                                                                                                &lt;/LI&gt;&lt;LI&gt;      /_/                                                                                                                                                                  &lt;/LI&gt;&lt;LI&gt;Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.7.0_101)                                                                                                      &lt;/LI&gt;&lt;LI&gt;Type in expressions to have them evaluated.                                                                                                                                &lt;/LI&gt;&lt;LI&gt;Type :help for more information.                                                                                                                                           &lt;/LI&gt;&lt;LI&gt;scala&amp;gt; sc.getConf.getAll.foreach(println)                                                                                                                                  &lt;/LI&gt;&lt;LI&gt;(spark.eventLog.enabled,true)                                                                                                                                              &lt;/LI&gt;&lt;LI&gt;(spark.yarn.scheduler.heartbeat.interval-ms,5000)                                                                                                                          &lt;/LI&gt;&lt;LI&gt;(hive.metastore.warehouse.dir,file:/usr/hdp/2.5.0.0-817/spark2/spark-warehouse)                                                                                            &lt;/LI&gt;&lt;LI&gt;(spark.repl.class.outputDir,/tmp/spark-fa16d4d3-8ec8-4b0e-a1da-5a2dffe39d08/repl-5dd28f29-ae03-4965-a535-18a95173b173)                                                     
&lt;/LI&gt;&lt;LI&gt;(spark.yarn.am.extraJavaOptions,-Dhdp.version=2.5.0.0-817)&lt;/LI&gt;&lt;LI&gt;(spark.yarn.containerLauncherMaxThreads,25)&lt;/LI&gt;&lt;LI&gt;(spark.driver.extraJavaOptions,-Dhdp.version=2.5.0.0-817)&lt;/LI&gt;&lt;LI&gt;(spark.driver.extraLibraryPath,/usr/hdp/current/hadoop-client/lib/native:/usr/hdp/current/hadoop-client/lib/native/Linux-amd64-64)&lt;/LI&gt;&lt;LI&gt;(spark.driver.appUIAddress,&lt;A href="http://10.0.2.15:4041/"&gt;http://10.0.2.15:4041&lt;/A&gt;)&lt;/LI&gt;&lt;LI&gt;(spark.driver.host,10.0.2.15)&lt;/LI&gt;&lt;LI&gt;(spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_URI_BASES,&lt;A href="http://sandbox.hortonworks.com:8088/proxy/application_1472397144295_0007"&gt;http://sandbox.hortonworks.com:8088/proxy/application_1472397144295_0007&lt;/A&gt;)&lt;/LI&gt;&lt;LI&gt;(spark.yarn.preserve.staging.files,false)&lt;/LI&gt;&lt;LI&gt;(spark.home,/usr/hdp/current/spark2-client)&lt;/LI&gt;&lt;LI&gt;(spark.app.name,Spark shell)&lt;/LI&gt;&lt;LI&gt;(spark.repl.class.uri,spark://10.0.2.15:37426/classes)&lt;/LI&gt;&lt;LI&gt;(spark.ui.port,4041)&lt;/LI&gt;&lt;LI&gt;(spark.yarn.max.executor.failures,3)&lt;/LI&gt;&lt;LI&gt;(spark.submit.deployMode,client)&lt;/LI&gt;&lt;LI&gt;(spark.yarn.executor.memoryOverhead,200)&lt;/LI&gt;&lt;LI&gt;(spark.ui.filters,org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter)&lt;/LI&gt;&lt;LI&gt;(spark.driver.extraClassPath,/usr/hdp/current/hadoop-client/lib/hadoop-lzo-0.6.0.2.5.0.0-817.jar)&lt;/LI&gt;&lt;LI&gt;(spark.executor.memory,2g)&lt;/LI&gt;&lt;LI&gt;(spark.yarn.driver.memoryOverhead,200)&lt;/LI&gt;&lt;LI&gt;(spark.hadoop.yarn.timeline-service.enabled,false)&lt;/LI&gt;&lt;LI&gt;(spark.executor.extraLibraryPath,/usr/hdp/current/hadoop-client/lib/native)&lt;/LI&gt;&lt;LI&gt;(spark.app.id,application_1472397144295_0007)&lt;/LI&gt;&lt;LI&gt;(spark.executor.id,driver)&lt;/LI&gt;&lt;LI&gt;(spark.yarn.queue,default)&lt;/LI&gt;&lt;LI&gt;(spark.org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter.param.PROXY_HOSTS,sandbox.hortonworks.com)&lt;/LI&gt;&lt;LI&gt;(spark.eventLog.dir,hdfs:///spark-history)&lt;/LI&gt;&lt;LI&gt;(spark.master,yarn)&lt;/LI&gt;&lt;LI&gt;(spark.driver.port,37426)&lt;/LI&gt;&lt;LI&gt;(spark.yarn.submit.file.replication,3)&lt;/LI&gt;&lt;LI&gt;(spark.sql.catalogImplementation,hive)&lt;/LI&gt;&lt;LI&gt;(spark.driver.memory,2g)&lt;/LI&gt;&lt;LI&gt;(spark.jars,)&lt;/LI&gt;&lt;LI&gt;(spark.executor.cores,1)&lt;/LI&gt;&lt;LI&gt;scala&amp;gt; val file = sc.textFile("/tmp/data")&lt;/LI&gt;&lt;LI&gt;file: org.apache.spark.rdd.RDD[String] = /tmp/data MapPartitionsRDD[1] at textFile at &amp;lt;console&amp;gt;:24&lt;/LI&gt;&lt;LI&gt;scala&amp;gt; val counts = file.flatMap(line =&amp;gt; line.split(" ")).map(word =&amp;gt; (word, 1)).reduceByKey(_ + _)&lt;/LI&gt;&lt;LI&gt;counts: org.apache.spark.rdd.RDD[(String, Int)] = ShuffledRDD[4] at reduceByKey at &amp;lt;console&amp;gt;:26&lt;/LI&gt;&lt;LI&gt;scala&amp;gt; counts.take(10)&lt;/LI&gt;&lt;LI&gt;res1: Array[(String, Int)] = Array((hadoop.tasklog.noKeepSplits=4,1), (log4j.logger.org.apache.hadoop.yarn.server.resourcemanager.RMAppManager$ApplicationSummary=${yarn.server.resourcemanager.appsummary.logger},1), (Unless,1), (this,4), (hadoop.mapreduce.jobsummary.log.file=hadoop-mapreduce.jobsummary.log,1), (under,4), (log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601},2), (log4j.appender.DRFAAUDIT.layout=org.apache.log4j.PatternLayout,1), (AppSummaryLogging,1), (log4j.appender.RMAUDIT.layout=org.apache.log4j.PatternLayout,1))&lt;/LI&gt;&lt;LI&gt;scala&amp;gt;&lt;/LI&gt;&lt;/OL&gt;</description>
      <pubDate>Wed, 31 Aug 2016 03:15:28 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sandbox-HDP-2-5-0-Spark-1-6-2-Issues-GPLNativeCodeLoader/m-p/123159#M85903</guid>
      <dc:creator>anandi</dc:creator>
      <dc:date>2016-08-31T03:15:28Z</dc:date>
    </item>
    <item>
      <title>Re: Sandbox HDP 2.5.0 - Spark 1.6.2 - Issues: GPLNativeCodeLoader: Could not load native gpl library - LzoCodec: Cannot load native-lzo without native-hadoop</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Sandbox-HDP-2-5-0-Spark-1-6-2-Issues-GPLNativeCodeLoader/m-p/123160#M85904</link>
      <description>&lt;P&gt;@anandi Thanks, the word count fix works great! However, I applied the fix by editing/adding the properties in Ambari so that they won't get overwritten if I make another change at that level. In my opinion that is preferable to editing the config file directly.&lt;/P&gt;</description>
      <pubDate>Thu, 01 Sep 2016 08:37:23 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Sandbox-HDP-2-5-0-Spark-1-6-2-Issues-GPLNativeCodeLoader/m-p/123160#M85904</guid>
      <dc:creator>Former Member</dc:creator>
      <dc:date>2016-09-01T08:37:23Z</dc:date>
    </item>
  </channel>
</rss>