<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>question Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD in Support Questions</title>
    <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29889#M22335</link>
<description>&lt;P&gt;I tried every combination of capitalizations. None of them worked.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I think the key thing to notice is that, when I use "import sqlContext.implicits._" in spark-shell and then run:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;case class DimC(ID:Int, Name:String, City:String, EffectiveFrom:Int, EffectiveTo:Int)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;it throws the error below, but it works perfectly the second time and creates the class.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;15/07/21 04:48:12 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore&lt;BR /&gt;15/07/21 04:48:12 INFO ObjectStore: ObjectStore, initialize called&lt;BR /&gt;15/07/21 04:48:12 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/flume-ng/lib/datanucleus-api-jdo-3.2.1.jar."&lt;BR /&gt;15/07/21 04:48:12 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/flume-ng/lib/datanucleus-rdbms-3.2.1.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/hive/lib/datanucleus-rdbms-3.2.9.jar."&lt;BR /&gt;15/07/21 04:48:12 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. 
The URL "file:/usr/lib/hive/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/flume-ng/lib/datanucleus-core-3.2.2.jar."&lt;BR /&gt;15/07/21 04:48:12 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored&lt;BR /&gt;15/07/21 04:48:12 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored&lt;BR /&gt;15/07/21 04:48:13 WARN HiveMetaStore: Retrying creating default database after error: Error creating transactional connection factory&lt;BR /&gt;javax.jdo.JDOFatalInternalException: Error creating transactional connection factory&lt;BR /&gt;at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)&lt;BR /&gt;at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)&lt;BR /&gt;at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)&lt;BR /&gt;at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)&lt;BR 
/&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)&lt;BR /&gt;at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)&lt;BR /&gt;at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.&amp;lt;init&amp;gt;(RawStoreProxy.java:56)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.&amp;lt;init&amp;gt;(RetryingHMSHandler.java:66)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5601)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:193)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&amp;lt;init&amp;gt;(SessionHiveMetaStoreClient.java:74)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1486)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&amp;lt;init&amp;gt;(RetryingMetaStoreClient.java:64)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2841)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2860)&lt;BR /&gt;at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:229)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:225)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext$QueryExecution.&amp;lt;init&amp;gt;(HiveContext.scala:373)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:80)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:49)&lt;BR /&gt;at org.apache.spark.sql.DataFrame.&amp;lt;init&amp;gt;(DataFrame.scala:131)&lt;BR /&gt;at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)&lt;BR /&gt;at org.apache.spark.sql.SQLContext.createDataFrame(SQLContext.scala:319)&lt;BR /&gt;at org.apache.spark.sql.SQLContext$implicits$.rddToDataFrameHolder(SQLContext.scala:254)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:28)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:33)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:35)&lt;BR /&gt;at 
$line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:37)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:39)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:41)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:43)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:45)&lt;BR /&gt;at $line25.$read$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:47)&lt;BR /&gt;at $line25.$read$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:49)&lt;BR /&gt;at $line25.$read.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:51)&lt;BR /&gt;at $line25.$read$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:55)&lt;BR /&gt;at $line25.$read$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at $line25.$eval$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:7)&lt;BR /&gt;at $line25.$eval$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at $line25.$eval.$print(&amp;lt;console&amp;gt;)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)&lt;BR /&gt;at 
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)&lt;BR /&gt;at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)&lt;BR /&gt;at org.apache.spark.repl.Main$.main(Main.scala:31)&lt;BR /&gt;at org.apache.spark.repl.Main.main(Main.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)&lt;BR /&gt;at 
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;NestedThrowablesStackTrace:&lt;BR /&gt;java.lang.reflect.InvocationTargetException&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)&lt;BR /&gt;at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)&lt;BR /&gt;at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)&lt;BR /&gt;at org.datanucleus.store.AbstractStoreManager.&amp;lt;init&amp;gt;(AbstractStoreManager.java:240)&lt;BR /&gt;at org.datanucleus.store.rdbms.RDBMSStoreManager.&amp;lt;init&amp;gt;(RDBMSStoreManager.java:286)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)&lt;BR /&gt;at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)&lt;BR /&gt;at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)&lt;BR /&gt;at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)&lt;BR /&gt;at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)&lt;BR /&gt;at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)&lt;BR /&gt;at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)&lt;BR /&gt;at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)&lt;BR /&gt;at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)&lt;BR /&gt;at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.&amp;lt;init&amp;gt;(RawStoreProxy.java:56)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)&lt;BR /&gt;at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.&amp;lt;init&amp;gt;(RetryingHMSHandler.java:66)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5601)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:193)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&amp;lt;init&amp;gt;(SessionHiveMetaStoreClient.java:74)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1486)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&amp;lt;init&amp;gt;(RetryingMetaStoreClient.java:64)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2841)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2860)&lt;BR /&gt;at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)&lt;BR /&gt;at 
org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:229)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:225)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext$QueryExecution.&amp;lt;init&amp;gt;(HiveContext.scala:373)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:80)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:49)&lt;BR /&gt;at org.apache.spark.sql.DataFrame.&amp;lt;init&amp;gt;(DataFrame.scala:131)&lt;BR /&gt;at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)&lt;BR /&gt;at org.apache.spark.sql.SQLContext.createDataFrame(SQLContext.scala:319)&lt;BR /&gt;at org.apache.spark.sql.SQLContext$implicits$.rddToDataFrameHolder(SQLContext.scala:254)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:28)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:33)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:35)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:37)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:39)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:41)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:43)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:45)&lt;BR /&gt;at $line25.$read$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:47)&lt;BR /&gt;at $line25.$read$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:49)&lt;BR /&gt;at $line25.$read.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:51)&lt;BR /&gt;at 
$line25.$read$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:55)&lt;BR /&gt;at $line25.$read$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at $line25.$eval$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:7)&lt;BR /&gt;at $line25.$eval$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at $line25.$eval.$print(&amp;lt;console&amp;gt;)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)&lt;BR /&gt;at 
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)&lt;BR /&gt;at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)&lt;BR /&gt;at org.apache.spark.repl.Main$.main(Main.scala:31)&lt;BR /&gt;at org.apache.spark.repl.Main.main(Main.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;Caused by: java.lang.ExceptionInInitializerError&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at java.lang.Class.newInstance(Class.java:374)&lt;BR /&gt;at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:47)&lt;BR /&gt;at 
org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)&lt;BR /&gt;at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)&lt;BR /&gt;at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)&lt;BR /&gt;at org.datanucleus.store.rdbms.ConnectionFactoryImpl.&amp;lt;init&amp;gt;(ConnectionFactoryImpl.java:85)&lt;BR /&gt;... 114 more&lt;BR /&gt;Caused by: java.lang.SecurityException: sealing violation: package org.apache.derby.impl.services.locks is sealed&lt;BR /&gt;at java.net.URLClassLoader.getAndVerifyPackage(URLClassLoader.java:388)&lt;BR /&gt;at java.net.URLClassLoader.defineClass(URLClassLoader.java:417)&lt;BR /&gt;at java.net.URLClassLoader.access$100(URLClassLoader.java:71)&lt;BR /&gt;at java.net.URLClassLoader$1.run(URLClassLoader.java:361)&lt;BR /&gt;at java.net.URLClassLoader$1.run(URLClassLoader.java:355)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at java.net.URLClassLoader.findClass(URLClassLoader.java:354)&lt;BR /&gt;at java.lang.ClassLoader.loadClass(ClassLoader.java:425)&lt;BR /&gt;at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)&lt;BR /&gt;at java.lang.ClassLoader.loadClass(ClassLoader.java:358)&lt;BR /&gt;at java.lang.ClassLoader.defineClass1(Native Method)&lt;BR /&gt;at java.lang.ClassLoader.defineClass(ClassLoader.java:800)&lt;BR /&gt;at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)&lt;BR /&gt;at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)&lt;BR /&gt;at java.net.URLClassLoader.access$100(URLClassLoader.java:71)&lt;BR /&gt;at java.net.URLClassLoader$1.run(URLClassLoader.java:361)&lt;BR /&gt;at java.net.URLClassLoader$1.run(URLClassLoader.java:355)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at 
java.net.URLClassLoader.findClass(URLClassLoader.java:354)&lt;BR /&gt;at java.lang.ClassLoader.loadClass(ClassLoader.java:425)&lt;BR /&gt;at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)&lt;BR /&gt;at java.lang.ClassLoader.loadClass(ClassLoader.java:358)&lt;BR /&gt;at java.lang.Class.forName0(Native Method)&lt;BR /&gt;at java.lang.Class.forName(Class.java:190)&lt;BR /&gt;at org.apache.derby.impl.services.monitor.BaseMonitor.getImplementations(Unknown Source)&lt;BR /&gt;at org.apache.derby.impl.services.monitor.BaseMonitor.getDefaultImplementations(Unknown Source)&lt;BR /&gt;at org.apache.derby.impl.services.monitor.BaseMonitor.runWithState(Unknown Source)&lt;BR /&gt;at org.apache.derby.impl.services.monitor.FileMonitor.&amp;lt;init&amp;gt;(Unknown Source)&lt;BR /&gt;at org.apache.derby.iapi.services.monitor.Monitor.startMonitor(Unknown Source)&lt;BR /&gt;at org.apache.derby.iapi.jdbc.JDBCBoot.boot(Unknown Source)&lt;BR /&gt;at org.apache.derby.jdbc.EmbeddedDriver.boot(Unknown Source)&lt;BR /&gt;at org.apache.derby.jdbc.EmbeddedDriver.&amp;lt;clinit&amp;gt;(Unknown Source)&lt;BR /&gt;... 124 more&lt;BR /&gt;15/07/21 04:48:13 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore&lt;BR /&gt;15/07/21 04:48:13 INFO ObjectStore: ObjectStore, initialize called&lt;BR /&gt;15/07/21 04:48:13 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/flume-ng/lib/datanucleus-api-jdo-3.2.1.jar."&lt;BR /&gt;15/07/21 04:48:13 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. 
The URL "file:/usr/lib/flume-ng/lib/datanucleus-rdbms-3.2.1.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/hive/lib/datanucleus-rdbms-3.2.9.jar."&lt;BR /&gt;15/07/21 04:48:13 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/hive/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/flume-ng/lib/datanucleus-core-3.2.2.jar."&lt;BR /&gt;15/07/21 04:48:13 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored&lt;BR /&gt;15/07/21 04:48:13 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored&lt;BR /&gt;java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient&lt;/P&gt;</description>
    <pubDate>Tue, 21 Jul 2015 11:54:26 GMT</pubDate>
    <dc:creator>Saeed.Barghi</dc:creator>
    <dc:date>2015-07-21T11:54:26Z</dc:date>
    <item>
      <title>Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29878#M22328</link>
<description>&lt;P&gt;Hi all,&lt;/P&gt;&lt;P&gt;I am trying to create a DataFrame from a text file, which gives me the error: "&lt;STRONG&gt;value toDF is not a member of org.apache.spark.rdd.RDD&lt;/STRONG&gt;"&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;The only solution I can find online is to import&amp;nbsp;SQLContext.implicits._, which in turn throws "&lt;STRONG&gt;not found: value SQLContext&lt;/STRONG&gt;"&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I googled this new error but couldn't find anything. The funny part is that the piece of code I am using works in spark-shell, but fails when I try to build it using &lt;U&gt;sbt package&lt;/U&gt;.&lt;/P&gt;&lt;P class="p1"&gt;I am using Cloudera's QuickStart VM; my Spark version is 1.3.0 and my&amp;nbsp;Scala version is 2.10.4.&lt;/P&gt;&lt;P class="p1"&gt;&amp;nbsp;&lt;/P&gt;&lt;P class="p1"&gt;Any help is highly appreciated,&lt;/P&gt;&lt;P class="p1"&gt;Cheers.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Here is my piece of code:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;import...........&lt;/P&gt;&lt;P&gt;import SQLContext.implicits._&lt;/P&gt;&lt;P&gt;...&lt;/P&gt;&lt;P&gt;class Class_1() extends Runnable {&lt;BR /&gt;val conf = new SparkConf().setAppName("TestApp")&lt;BR /&gt;val sc = new SparkContext(conf)&lt;/P&gt;&lt;P&gt;val sqlContext= new org.apache.spark.sql.SQLContext(sc)&lt;BR /&gt;var fDimCustomer = sc.textFile("DimCustomer.txt")&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;def loadData(fileName:String) {&lt;/P&gt;&lt;P&gt;fDimCustomer = sc.textFile("DimCustomer.txt")&lt;BR /&gt;&lt;BR /&gt;case class DimC(ID:Int, Name:String)&lt;BR /&gt;var dimCustomer1 = fDimCustomer.map(_.split(',')).map(r=&amp;gt;DimC(r(0).toInt,r(1))).toDF&lt;BR /&gt;dimCustomer1.registerTempTable("Cust_1")&lt;BR /&gt;&lt;BR /&gt;val customers = sqlContext.sql("select * from Cust_1")&lt;BR /&gt;customers.show()&lt;/P&gt;&lt;P&gt;}&lt;/P&gt;&lt;P&gt;......&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 09:34:59 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29878#M22328</guid>
      <dc:creator>Saeed.Barghi</dc:creator>
      <dc:date>2022-09-16T09:34:59Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29879#M22329</link>
      <description>&lt;P&gt;I think you're missing the package name. org.apache.spark.sql.SQLContext...&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2015 11:14:17 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29879#M22329</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2015-07-21T11:14:17Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29882#M22330</link>
      <description>I have it as&lt;BR /&gt;val sqlContext= new org.apache.spark.sql.SQLContext(sc)&lt;BR /&gt;&lt;BR /&gt;How should it be?</description>
      <pubDate>Tue, 21 Jul 2015 11:22:47 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29882#M22330</guid>
      <dc:creator>Saeed.Barghi</dc:creator>
      <dc:date>2015-07-21T11:22:47Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29883#M22331</link>
      <description>&lt;P&gt;I meant in the import; you're missing the implicits, I think.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;import org.apache.spark.sql.SQLContext.implicits._&lt;/PRE&gt;</description>
      <pubDate>Tue, 21 Jul 2015 11:32:39 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29883#M22331</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2015-07-21T11:32:39Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29884#M22332</link>
      <description>I tried that; I still get the same errors.</description>
      <pubDate>Tue, 21 Jul 2015 11:35:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29884#M22332</guid>
      <dc:creator>Saeed.Barghi</dc:creator>
      <dc:date>2015-07-21T11:35:50Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29885#M22333</link>
      <description>&lt;P&gt;This is the content of my .sbt file:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;name := "WatchApp"&lt;/P&gt;&lt;P&gt;version := "1.0"&lt;/P&gt;&lt;P&gt;scalaVersion := "2.10.4"&lt;/P&gt;&lt;P&gt;libraryDependencies ++= Seq (&lt;BR /&gt;"org.apache.spark" %% "spark-core" % "1.3.0",&lt;BR /&gt;"org.apache.spark" %% "spark-sql" % "1.3.0"&lt;BR /&gt;)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Can you spot any mistakes in it?&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2015 11:37:41 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29885#M22333</guid>
      <dc:creator>Saeed.Barghi</dc:creator>
      <dc:date>2015-07-21T11:37:41Z</dc:date>
    </item>
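    <!-- Editor's note: the build.sbt above is valid as written. One common refinement (a sketch, not something the thread requires) is to mark the Spark artifacts as "provided", so they are excluded from the packaged jar and supplied by spark-submit at runtime:

    ```scala
    // build.sbt sketch (assumptions: Spark 1.3.0 and Scala 2.10.4, as in the thread).
    name := "WatchApp"

    version := "1.0"

    scalaVersion := "2.10.4"

    // "provided" keeps Spark out of the jar produced by `sbt package`;
    // the cluster's spark-submit puts these classes on the classpath.
    libraryDependencies ++= Seq(
      "org.apache.spark" %% "spark-core" % "1.3.0" % "provided",
      "org.apache.spark" %% "spark-sql"  % "1.3.0" % "provided"
    )
    ```
    -->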
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29887#M22334</link>
      <description>&lt;P&gt;Ah, I think I'm mistaken. Try this; note the capitalization:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;PRE&gt;import sqlContext.implicits._&lt;/PRE&gt;</description>
      <pubDate>Tue, 21 Jul 2015 11:42:07 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29887#M22334</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2015-07-21T11:42:07Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29889#M22335</link>
      <description>&lt;P&gt;I tried every combination of capitalization. None of them works.&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I reckon the key thing to notice is that when I use "import sqlContext.implicits._" in spark-shell and then run:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;case class DimC(ID:Int, Name:String, City:String, EffectiveFrom:Int, EffectiveTo:Int)&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;it throws the error below the first time, but works perfectly the second time and creates the class.&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;15/07/21 04:48:12 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore&lt;BR /&gt;15/07/21 04:48:12 INFO ObjectStore: ObjectStore, initialize called&lt;BR /&gt;15/07/21 04:48:12 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/flume-ng/lib/datanucleus-api-jdo-3.2.1.jar."&lt;BR /&gt;15/07/21 04:48:12 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/flume-ng/lib/datanucleus-rdbms-3.2.1.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/hive/lib/datanucleus-rdbms-3.2.9.jar."&lt;BR /&gt;15/07/21 04:48:12 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. 
The URL "file:/usr/lib/hive/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/flume-ng/lib/datanucleus-core-3.2.2.jar."&lt;BR /&gt;15/07/21 04:48:12 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored&lt;BR /&gt;15/07/21 04:48:12 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored&lt;BR /&gt;15/07/21 04:48:13 WARN HiveMetaStore: Retrying creating default database after error: Error creating transactional connection factory&lt;BR /&gt;javax.jdo.JDOFatalInternalException: Error creating transactional connection factory&lt;BR /&gt;at org.datanucleus.api.jdo.NucleusJDOHelper.getJDOExceptionForNucleusException(NucleusJDOHelper.java:587)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:788)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)&lt;BR /&gt;at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)&lt;BR /&gt;at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)&lt;BR /&gt;at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)&lt;BR 
/&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)&lt;BR /&gt;at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)&lt;BR /&gt;at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.&amp;lt;init&amp;gt;(RawStoreProxy.java:56)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.&amp;lt;init&amp;gt;(RetryingHMSHandler.java:66)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5601)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:193)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&amp;lt;init&amp;gt;(SessionHiveMetaStoreClient.java:74)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at 
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1486)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&amp;lt;init&amp;gt;(RetryingMetaStoreClient.java:64)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2841)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2860)&lt;BR /&gt;at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:229)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:225)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext$QueryExecution.&amp;lt;init&amp;gt;(HiveContext.scala:373)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:80)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:49)&lt;BR /&gt;at org.apache.spark.sql.DataFrame.&amp;lt;init&amp;gt;(DataFrame.scala:131)&lt;BR /&gt;at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)&lt;BR /&gt;at org.apache.spark.sql.SQLContext.createDataFrame(SQLContext.scala:319)&lt;BR /&gt;at org.apache.spark.sql.SQLContext$implicits$.rddToDataFrameHolder(SQLContext.scala:254)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:28)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:33)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:35)&lt;BR /&gt;at 
$line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:37)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:39)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:41)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:43)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:45)&lt;BR /&gt;at $line25.$read$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:47)&lt;BR /&gt;at $line25.$read$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:49)&lt;BR /&gt;at $line25.$read.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:51)&lt;BR /&gt;at $line25.$read$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:55)&lt;BR /&gt;at $line25.$read$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at $line25.$eval$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:7)&lt;BR /&gt;at $line25.$eval$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at $line25.$eval.$print(&amp;lt;console&amp;gt;)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)&lt;BR /&gt;at 
org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)&lt;BR /&gt;at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)&lt;BR /&gt;at org.apache.spark.repl.Main$.main(Main.scala:31)&lt;BR /&gt;at org.apache.spark.repl.Main.main(Main.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)&lt;BR /&gt;at 
org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;NestedThrowablesStackTrace:&lt;BR /&gt;java.lang.reflect.InvocationTargetException&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)&lt;BR /&gt;at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:325)&lt;BR /&gt;at org.datanucleus.store.AbstractStoreManager.registerConnectionFactory(AbstractStoreManager.java:282)&lt;BR /&gt;at org.datanucleus.store.AbstractStoreManager.&amp;lt;init&amp;gt;(AbstractStoreManager.java:240)&lt;BR /&gt;at org.datanucleus.store.rdbms.RDBMSStoreManager.&amp;lt;init&amp;gt;(RDBMSStoreManager.java:286)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at org.datanucleus.plugin.NonManagedPluginRegistry.createExecutableExtension(NonManagedPluginRegistry.java:631)&lt;BR /&gt;at org.datanucleus.plugin.PluginManager.createExecutableExtension(PluginManager.java:301)&lt;BR /&gt;at org.datanucleus.NucleusContext.createStoreManagerForProperties(NucleusContext.java:1187)&lt;BR /&gt;at org.datanucleus.NucleusContext.initialise(NucleusContext.java:356)&lt;BR /&gt;at 
org.datanucleus.api.jdo.JDOPersistenceManagerFactory.freezeConfiguration(JDOPersistenceManagerFactory.java:775)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.createPersistenceManagerFactory(JDOPersistenceManagerFactory.java:333)&lt;BR /&gt;at org.datanucleus.api.jdo.JDOPersistenceManagerFactory.getPersistenceManagerFactory(JDOPersistenceManagerFactory.java:202)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at javax.jdo.JDOHelper$16.run(JDOHelper.java:1965)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at javax.jdo.JDOHelper.invoke(JDOHelper.java:1960)&lt;BR /&gt;at javax.jdo.JDOHelper.invokeGetPersistenceManagerFactoryOnImplementation(JDOHelper.java:1166)&lt;BR /&gt;at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:808)&lt;BR /&gt;at javax.jdo.JDOHelper.getPersistenceManagerFactory(JDOHelper.java:701)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getPMF(ObjectStore.java:365)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.getPersistenceManager(ObjectStore.java:394)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.initialize(ObjectStore.java:291)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.ObjectStore.setConf(ObjectStore.java:258)&lt;BR /&gt;at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:73)&lt;BR /&gt;at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.&amp;lt;init&amp;gt;(RawStoreProxy.java:56)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RawStoreProxy.getProxy(RawStoreProxy.java:65)&lt;BR /&gt;at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.newRawStore(HiveMetaStore.java:579)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.getMS(HiveMetaStore.java:557)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:606)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:448)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.&amp;lt;init&amp;gt;(RetryingHMSHandler.java:66)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingHMSHandler.getProxy(RetryingHMSHandler.java:72)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStore.newRetryingHMSHandler(HiveMetaStore.java:5601)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.HiveMetaStoreClient.&amp;lt;init&amp;gt;(HiveMetaStoreClient.java:193)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient.&amp;lt;init&amp;gt;(SessionHiveMetaStoreClient.java:74)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.MetaStoreUtils.newInstance(MetaStoreUtils.java:1486)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.&amp;lt;init&amp;gt;(RetryingMetaStoreClient.java:64)&lt;BR /&gt;at org.apache.hadoop.hive.metastore.RetryingMetaStoreClient.getProxy(RetryingMetaStoreClient.java:74)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.Hive.createMetaStoreClient(Hive.java:2841)&lt;BR /&gt;at org.apache.hadoop.hive.ql.metadata.Hive.getMSC(Hive.java:2860)&lt;BR /&gt;at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:453)&lt;BR /&gt;at 
org.apache.spark.sql.hive.HiveContext.sessionState$lzycompute(HiveContext.scala:229)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.sessionState(HiveContext.scala:225)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext$QueryExecution.&amp;lt;init&amp;gt;(HiveContext.scala:373)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:80)&lt;BR /&gt;at org.apache.spark.sql.hive.HiveContext.executePlan(HiveContext.scala:49)&lt;BR /&gt;at org.apache.spark.sql.DataFrame.&amp;lt;init&amp;gt;(DataFrame.scala:131)&lt;BR /&gt;at org.apache.spark.sql.DataFrame$.apply(DataFrame.scala:51)&lt;BR /&gt;at org.apache.spark.sql.SQLContext.createDataFrame(SQLContext.scala:319)&lt;BR /&gt;at org.apache.spark.sql.SQLContext$implicits$.rddToDataFrameHolder(SQLContext.scala:254)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:28)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:33)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:35)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:37)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:39)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:41)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:43)&lt;BR /&gt;at $line25.$read$$iwC$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:45)&lt;BR /&gt;at $line25.$read$$iwC$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:47)&lt;BR /&gt;at $line25.$read$$iwC.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:49)&lt;BR /&gt;at $line25.$read.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:51)&lt;BR /&gt;at 
$line25.$read$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:55)&lt;BR /&gt;at $line25.$read$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at $line25.$eval$.&amp;lt;init&amp;gt;(&amp;lt;console&amp;gt;:7)&lt;BR /&gt;at $line25.$eval$.&amp;lt;clinit&amp;gt;(&amp;lt;console&amp;gt;)&lt;BR /&gt;at $line25.$eval.$print(&amp;lt;console&amp;gt;)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)&lt;BR /&gt;at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:856)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:901)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:813)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:656)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:664)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:669)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:996)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)&lt;BR /&gt;at 
org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:944)&lt;BR /&gt;at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:944)&lt;BR /&gt;at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1058)&lt;BR /&gt;at org.apache.spark.repl.Main$.main(Main.scala:31)&lt;BR /&gt;at org.apache.spark.repl.Main.main(Main.scala)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)&lt;BR /&gt;at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)&lt;BR /&gt;at java.lang.reflect.Method.invoke(Method.java:606)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)&lt;BR /&gt;at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)&lt;BR /&gt;Caused by: java.lang.ExceptionInInitializerError&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)&lt;BR /&gt;at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)&lt;BR /&gt;at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)&lt;BR /&gt;at java.lang.reflect.Constructor.newInstance(Constructor.java:526)&lt;BR /&gt;at java.lang.Class.newInstance(Class.java:374)&lt;BR /&gt;at org.datanucleus.store.rdbms.connectionpool.AbstractConnectionPoolFactory.loadDriver(AbstractConnectionPoolFactory.java:47)&lt;BR /&gt;at 
org.datanucleus.store.rdbms.connectionpool.BoneCPConnectionPoolFactory.createConnectionPool(BoneCPConnectionPoolFactory.java:54)&lt;BR /&gt;at org.datanucleus.store.rdbms.ConnectionFactoryImpl.generateDataSources(ConnectionFactoryImpl.java:238)&lt;BR /&gt;at org.datanucleus.store.rdbms.ConnectionFactoryImpl.initialiseDataSources(ConnectionFactoryImpl.java:131)&lt;BR /&gt;at org.datanucleus.store.rdbms.ConnectionFactoryImpl.&amp;lt;init&amp;gt;(ConnectionFactoryImpl.java:85)&lt;BR /&gt;... 114 more&lt;BR /&gt;Caused by: java.lang.SecurityException: sealing violation: package org.apache.derby.impl.services.locks is sealed&lt;BR /&gt;at java.net.URLClassLoader.getAndVerifyPackage(URLClassLoader.java:388)&lt;BR /&gt;at java.net.URLClassLoader.defineClass(URLClassLoader.java:417)&lt;BR /&gt;at java.net.URLClassLoader.access$100(URLClassLoader.java:71)&lt;BR /&gt;at java.net.URLClassLoader$1.run(URLClassLoader.java:361)&lt;BR /&gt;at java.net.URLClassLoader$1.run(URLClassLoader.java:355)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at java.net.URLClassLoader.findClass(URLClassLoader.java:354)&lt;BR /&gt;at java.lang.ClassLoader.loadClass(ClassLoader.java:425)&lt;BR /&gt;at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)&lt;BR /&gt;at java.lang.ClassLoader.loadClass(ClassLoader.java:358)&lt;BR /&gt;at java.lang.ClassLoader.defineClass1(Native Method)&lt;BR /&gt;at java.lang.ClassLoader.defineClass(ClassLoader.java:800)&lt;BR /&gt;at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)&lt;BR /&gt;at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)&lt;BR /&gt;at java.net.URLClassLoader.access$100(URLClassLoader.java:71)&lt;BR /&gt;at java.net.URLClassLoader$1.run(URLClassLoader.java:361)&lt;BR /&gt;at java.net.URLClassLoader$1.run(URLClassLoader.java:355)&lt;BR /&gt;at java.security.AccessController.doPrivileged(Native Method)&lt;BR /&gt;at 
java.net.URLClassLoader.findClass(URLClassLoader.java:354)&lt;BR /&gt;at java.lang.ClassLoader.loadClass(ClassLoader.java:425)&lt;BR /&gt;at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)&lt;BR /&gt;at java.lang.ClassLoader.loadClass(ClassLoader.java:358)&lt;BR /&gt;at java.lang.Class.forName0(Native Method)&lt;BR /&gt;at java.lang.Class.forName(Class.java:190)&lt;BR /&gt;at org.apache.derby.impl.services.monitor.BaseMonitor.getImplementations(Unknown Source)&lt;BR /&gt;at org.apache.derby.impl.services.monitor.BaseMonitor.getDefaultImplementations(Unknown Source)&lt;BR /&gt;at org.apache.derby.impl.services.monitor.BaseMonitor.runWithState(Unknown Source)&lt;BR /&gt;at org.apache.derby.impl.services.monitor.FileMonitor.&amp;lt;init&amp;gt;(Unknown Source)&lt;BR /&gt;at org.apache.derby.iapi.services.monitor.Monitor.startMonitor(Unknown Source)&lt;BR /&gt;at org.apache.derby.iapi.jdbc.JDBCBoot.boot(Unknown Source)&lt;BR /&gt;at org.apache.derby.jdbc.EmbeddedDriver.boot(Unknown Source)&lt;BR /&gt;at org.apache.derby.jdbc.EmbeddedDriver.&amp;lt;clinit&amp;gt;(Unknown Source)&lt;BR /&gt;... 124 more&lt;BR /&gt;15/07/21 04:48:13 INFO HiveMetaStore: 0: Opening raw store with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore&lt;BR /&gt;15/07/21 04:48:13 INFO ObjectStore: ObjectStore, initialize called&lt;BR /&gt;15/07/21 04:48:13 WARN General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/hive/lib/datanucleus-api-jdo-3.2.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/flume-ng/lib/datanucleus-api-jdo-3.2.1.jar."&lt;BR /&gt;15/07/21 04:48:13 WARN General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. 
The URL "file:/usr/lib/flume-ng/lib/datanucleus-rdbms-3.2.1.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/hive/lib/datanucleus-rdbms-3.2.9.jar."&lt;BR /&gt;15/07/21 04:48:13 WARN General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/usr/lib/hive/lib/datanucleus-core-3.2.10.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/usr/lib/flume-ng/lib/datanucleus-core-3.2.2.jar."&lt;BR /&gt;15/07/21 04:48:13 INFO Persistence: Property datanucleus.cache.level2 unknown - will be ignored&lt;BR /&gt;15/07/21 04:48:13 INFO Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored&lt;BR /&gt;java.lang.RuntimeException: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.ql.metadata.SessionHiveMetaStoreClient&lt;/P&gt;</description>
      <pubDate>Tue, 21 Jul 2015 11:54:26 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29889#M22335</guid>
      <dc:creator>Saeed.Barghi</dc:creator>
      <dc:date>2015-07-21T11:54:26Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29926#M22336</link>
      <description>&lt;P&gt;Has anybody else had this problem? Please help!&lt;/P&gt;</description>
      <pubDate>Wed, 22 Jul 2015 10:27:02 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29926#M22336</guid>
      <dc:creator>Saeed.Barghi</dc:creator>
      <dc:date>2015-07-22T10:27:02Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29994#M22337</link>
      <description>&lt;P&gt;OK, I finally fixed the issue. Two things needed to be done:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;1- Import the implicits:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; Note that this should be done only after an instance of&amp;nbsp;org.apache.spark.sql.SQLContext has been created. It should be written as:&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp;val sqlContext = new org.apache.spark.sql.SQLContext(sc)&lt;BR /&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; import sqlContext.implicits._&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&lt;STRONG&gt;2- Move the case class outside of the method:&lt;/STRONG&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; The case class, which defines the schema&amp;nbsp;of the DataFrame, should be declared outside of the method that needs it. You can read more about it here:&lt;/P&gt;&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp;&amp;nbsp;&lt;A href="https://issues.scala-lang.org/browse/SI-6649" target="_blank"&gt;https://issues.scala-lang.org/browse/SI-6649&lt;/A&gt;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Cheers.&lt;/P&gt;</description>
      <pubDate>Thu, 23 Jul 2015 23:47:14 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/29994#M22337</guid>
      <dc:creator>Saeed.Barghi</dc:creator>
      <dc:date>2015-07-23T23:47:14Z</dc:date>
    </item>
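The two fixes in the accepted answer above can be sketched together in one minimal Spark 1.x outline. This is a hedged sketch, not the poster's actual program: the `DimC` fields are taken from the original question, while the object name, master setting, and sample row are made up for illustration. It requires the Spark libraries on the classpath.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Fix 2: the case class lives at top level, outside any method,
// so the compiler can generate the type information that toDF needs.
case class DimC(ID: Int, Name: String, City: String,
                EffectiveFrom: Int, EffectiveTo: Int)

object ToDFDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("demo").setMaster("local"))
    val sqlContext = new org.apache.spark.sql.SQLContext(sc)
    // Fix 1: import the implicits only AFTER the SQLContext instance exists;
    // this brings the RDD-to-DataFrame conversions into scope.
    import sqlContext.implicits._

    // Illustrative row only; toDF now resolves because both fixes are in place.
    val df = sc.parallelize(Seq(DimC(1, "Alice", "Sydney", 20150101, 20151231))).toDF()
    df.show()
  }
}
```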
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/31564#M22338</link>
      <description>Can you show me how you write the case class to define the schema, and how to use it in your method? Thanks so much</description>
      <pubDate>Fri, 04 Sep 2015 20:09:06 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/31564#M22338</guid>
      <dc:creator>Tong</dc:creator>
      <dc:date>2015-09-04T20:09:06Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/33535#M22339</link>
      <description>&lt;P&gt;Hi,&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;Could you share your program?&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am getting the single error mentioned below:&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;[info] Compiling 1 Scala source to /home/sumeet/SimpleSparkProject/target/scala-2.11/classes...&lt;BR /&gt;[error] /home/sumeet/SimpleSparkProject/src/main/scala/SimpleApp.scala:16: value toDF is not a member of org.apache.spark.rdd.RDD[Auction]&lt;BR /&gt;[error] val auction = ebay.toDF()&lt;BR /&gt;[error] ^&lt;/P&gt;&lt;P&gt;import org.apache.spark.SparkContext&lt;BR /&gt;import org.apache.spark.SparkContext._&lt;BR /&gt;import org.apache.spark.sql._&lt;BR /&gt;object SimpleApp {&lt;BR /&gt;def main(args: Array[String]) {&lt;BR /&gt;val sc = new SparkContext("local", "Simple App", "/usr/local/spark-1.4.0-incubating",&lt;BR /&gt;List("target/scala-2.10/simple-project_2.10-1.0.jar"))&lt;BR /&gt;val sqlContext = new org.apache.spark.sql.SQLContext(sc)&lt;BR /&gt;import sqlContext.implicits._&lt;BR /&gt;val ebayText = sc.textFile("/home/sumeet/Desktop/useful huge sample data/ebay.csv")&lt;BR /&gt;ebayText.first()&lt;BR /&gt;case class Auction(auctionid: String, bid: Float, bidtime: Float, bidder: String, bidderrate: Integer, openbid: Float, price: Float)&lt;BR /&gt;val ebay = ebayText.map(_.split(",")).map(p =&amp;gt; Auction(p(0),p(1).toFloat,p(2).toFloat,p(3),p(4).toInt,p(5).toFloat,p(6).toFloat))&lt;BR /&gt;ebay.first()&lt;BR /&gt;ebay.count()&lt;BR /&gt;val auction = ebay.toDF()&lt;BR /&gt;auction.show()&lt;BR /&gt;}&lt;BR /&gt;}&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 28 Oct 2015 20:09:21 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/33535#M22339</guid>
      <dc:creator>sumeet89</dc:creator>
      <dc:date>2015-10-28T20:09:21Z</dc:date>
    </item>
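The compile error in the post above is the case-class scoping issue from the accepted answer: `Auction` is defined inside `main`, so no implicit conversion to DataFrame can be resolved for the RDD. A hedged sketch of the fix follows (the file path is shortened and the jar-list constructor argument dropped for brevity; it is not the poster's exact program and needs Spark on the classpath):

```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

// Moved out of main(): case classes used with toDF must be top level.
case class Auction(auctionid: String, bid: Float, bidtime: Float,
                   bidder: String, bidderrate: Integer,
                   openbid: Float, price: Float)

object SimpleApp {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local", "Simple App")
    val sqlContext = new SQLContext(sc)
    import sqlContext.implicits._ // after the SQLContext exists

    val ebayText = sc.textFile("ebay.csv") // path shortened for the sketch
    val ebay = ebayText.map(_.split(",")).map(p =>
      Auction(p(0), p(1).toFloat, p(2).toFloat, p(3),
              p(4).toInt, p(5).toFloat, p(6).toFloat))

    val auction = ebay.toDF() // now compiles: Auction is top level
    auction.show()
  }
}
```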
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/49448#M22340</link>
      <description>&lt;P&gt;Hi, thank you! It resolved the similar issue that I was facing. However, could you please explain why this is done, and what exactly the implicit does in this case? Reply appreciated. Sorry for reopening this post.&lt;/P&gt;</description>
      <pubDate>Mon, 16 Jan 2017 00:11:04 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/49448#M22340</guid>
      <dc:creator>AshwiniPatil</dc:creator>
      <dc:date>2017-01-16T00:11:04Z</dc:date>
    </item>
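As a rough intuition for the question above (plain Scala, no Spark needed): `import sqlContext.implicits._` brings implicit conversions into scope that bolt extra methods such as `toDF` onto `RDD`; without the import, the compiler cannot find them and reports "value toDF is not a member of". The same mechanism can be demonstrated with an implicit class (the names `RichInt` and `squared` are invented for this sketch):

```scala
object Implicits {
  // Until this is imported, Int has no `squared` method.
  implicit class RichInt(private val n: Int) extends AnyVal {
    def squared: Int = n * n
  }
}

object Demo extends App {
  import Implicits._   // analogous to `import sqlContext.implicits._`
  println(3.squared)   // prints 9
}
```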
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/61263#M22341</link>
      <description>&lt;P&gt;package org.example.textclassification&lt;/P&gt;&lt;P&gt;import org.apache.predictionio.controller.P2LAlgorithm&lt;BR /&gt;import org.apache.predictionio.controller.Params&lt;/P&gt;&lt;P&gt;import org.apache.spark.SparkContext&lt;BR /&gt;import org.apache.spark.SparkContext._&lt;BR /&gt;import org.apache.spark.rdd.RDD&lt;BR /&gt;import org.apache.spark.ml.classification.LogisticRegression&lt;BR /&gt;import org.apache.spark.sql.DataFrame&lt;BR /&gt;import org.apache.spark.sql.functions&lt;BR /&gt;import org.apache.spark.sql.SQLContext&lt;BR /&gt;import org.apache.spark.sql.UserDefinedFunction&lt;/P&gt;&lt;P&gt;import grizzled.slf4j.Logger&lt;/P&gt;&lt;P&gt;case class LRAlgorithmParams(regParam: Double) extends Params&lt;/P&gt;&lt;P&gt;class LRAlgorithm(val ap: LRAlgorithmParams)&lt;BR /&gt;extends P2LAlgorithm[PreparedData, LRModel, Query, PredictedResult] {&lt;/P&gt;&lt;P&gt;@transient lazy val logger = Logger[this.type]&lt;/P&gt;&lt;P&gt;def train(sc: SparkContext, pd: PreparedData): LRModel = {&lt;/P&gt;&lt;P&gt;// Import SQLContext for creating DataFrame.&lt;BR /&gt;val sql: SQLContext = new SQLContext(sc)&lt;BR /&gt;import sql.implicits._&lt;/P&gt;&lt;P&gt;val lr = new LogisticRegression()&lt;BR /&gt;.setMaxIter(10)&lt;BR /&gt;.setThreshold(0.5)&lt;BR /&gt;.setRegParam(ap.regParam)&lt;/P&gt;&lt;P&gt;val labels: Seq[Double] = pd.categoryMap.keys.toSeq&lt;/P&gt;&lt;P&gt;val data = labels.foldLeft(pd.transformedData.toDF)( //transform to Spark DataFrame&lt;BR /&gt;// Add the different binary columns for each label.&lt;BR /&gt;(data: DataFrame, label: Double) =&amp;gt; {&lt;BR /&gt;// function: multiclass labels --&amp;gt; binary labels&lt;BR /&gt;val f: UserDefinedFunction = functions.udf((e : Double) =&amp;gt; if (e == label) 1.0 else 0.0)&lt;/P&gt;&lt;P&gt;data.withColumn(label.toInt.toString, f(data("label")))&lt;BR /&gt;}&lt;BR /&gt;)&lt;/P&gt;&lt;P&gt;// Create a logistic regression model for each class.&lt;BR 
/&gt;val lrModels : Seq[(Double, LREstimate)] = labels.map(&lt;BR /&gt;label =&amp;gt; {&lt;BR /&gt;val lab = label.toInt.toString&lt;/P&gt;&lt;P&gt;val fit = lr.setLabelCol(lab).fit(&lt;BR /&gt;data.select(lab, "features")&lt;BR /&gt;)&lt;/P&gt;&lt;P&gt;// Return (label, feature coefficients, and intercept term.&lt;BR /&gt;(label, LREstimate(fit.weights.toArray, fit.intercept))&lt;/P&gt;&lt;P&gt;}&lt;BR /&gt;)&lt;/P&gt;&lt;P&gt;new LRModel(&lt;BR /&gt;tfIdf = pd.tfIdf,&lt;BR /&gt;categoryMap = pd.categoryMap,&lt;BR /&gt;lrModels = lrModels&lt;BR /&gt;)&lt;BR /&gt;}&lt;/P&gt;&lt;P&gt;def predict(model: LRModel, query: Query): PredictedResult = {&lt;BR /&gt;model.predict(query.text)&lt;BR /&gt;}&lt;BR /&gt;}&lt;/P&gt;&lt;P&gt;case class LREstimate (&lt;BR /&gt;coefficients : Array[Double],&lt;BR /&gt;intercept : Double&lt;BR /&gt;)&lt;/P&gt;&lt;P&gt;class LRModel(&lt;BR /&gt;val tfIdf: TFIDFModel,&lt;BR /&gt;val categoryMap: Map[Double, String],&lt;BR /&gt;val lrModels: Seq[(Double, LREstimate)]) extends Serializable {&lt;/P&gt;&lt;P&gt;/** Enable vector inner product for prediction. */&lt;BR /&gt;private def innerProduct (x : Array[Double], y : Array[Double]) : Double = {&lt;BR /&gt;x.zip(y).map(e =&amp;gt; e._1 * e._2).sum&lt;BR /&gt;}&lt;/P&gt;&lt;P&gt;/** Define prediction rule. 
*/&lt;BR /&gt;def predict(text: String): PredictedResult = {&lt;BR /&gt;val x: Array[Double] = tfIdf.transform(text).toArray&lt;/P&gt;&lt;P&gt;// Logistic Regression binary formula for positive probability.&lt;BR /&gt;// According to MLLib documentation, class labeled 0 is used as pivot.&lt;BR /&gt;// Thus, we are using:&lt;BR /&gt;// log(p1/p0) = log(p1/(1 - p1)) = b0 + xTb =: z&lt;BR /&gt;// p1 = exp(z) * (1 - p1)&lt;BR /&gt;// p1 * (1 + exp(z)) = exp(z)&lt;BR /&gt;// p1 = exp(z)/(1 + exp(z))&lt;BR /&gt;val pred = lrModels.map(&lt;BR /&gt;e =&amp;gt; {&lt;BR /&gt;val z = scala.math.exp(innerProduct(e._2.coefficients, x) + e._2.intercept)&lt;BR /&gt;(e._1, z / (1 + z))&lt;BR /&gt;}&lt;BR /&gt;).maxBy(_._2)&lt;/P&gt;&lt;P&gt;PredictedResult(categoryMap(pred._1), pred._2)&lt;BR /&gt;}&lt;/P&gt;&lt;P&gt;override def toString = s"LR model"&lt;BR /&gt;}&lt;/P&gt;&lt;P&gt;&amp;nbsp;&lt;/P&gt;&lt;P&gt;I am getting the same error in my code. Can you help me fix it?&lt;/P&gt;</description>
      <pubDate>Thu, 26 Oct 2017 14:44:52 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/61263#M22341</guid>
      <dc:creator>abhideoria23</dc:creator>
      <dc:date>2017-10-26T14:44:52Z</dc:date>
    </item>
    <item>
      <title>Re: Spark/Scala Error: value toDF is not a member of org.apache.spark.rdd.RDD</title>
      <link>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/313725#M225681</link>
      <description>&lt;P&gt;Import the implicits:&lt;/P&gt;&lt;PRE&gt;import sc.implicits._&lt;/PRE&gt;&lt;P&gt;where sc is:&lt;/P&gt;&lt;PRE&gt;val sc = SparkSession&lt;BR /&gt;  .builder()&lt;BR /&gt;  .appName("demo")&lt;BR /&gt;  .master("local")&lt;BR /&gt;  .getOrCreate()&lt;BR /&gt;&lt;BR /&gt;import sc.implicits._&lt;/PRE&gt;</description>
      <pubDate>Thu, 25 Mar 2021 08:01:13 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Support-Questions/Spark-Scala-Error-value-toDF-is-not-a-member-of-org-apache/m-p/313725#M225681</guid>
      <dc:creator>raviverma</dc:creator>
      <dc:date>2021-03-25T08:01:13Z</dc:date>
    </item>
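Building on the SparkSession answer above: in Spark 2.x the same two rules from earlier in the thread still apply, only the entry point changes (and `spark` is the conventional name for the session rather than `sc`). A minimal hedged sketch, with an invented `Person` case class and sample row, assuming Spark 2.x on the classpath:

```scala
import org.apache.spark.sql.SparkSession

// The case class must still be top level, outside any method.
case class Person(name: String, age: Int)

object Spark2Demo {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("demo")
      .master("local")
      .getOrCreate()
    // Import the implicits only after the session exists.
    import spark.implicits._

    val df = spark.sparkContext.parallelize(Seq(Person("Alice", 30))).toDF()
    df.show()
    spark.stop()
  }
}
```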
  </channel>
</rss>

