<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Build failed for spark thriftserver - CDH5.10.0 in Archives of Support Questions (Read Only)</title>
    <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Build-failed-for-spark-thriftserver-CDH5-10-0/m-p/52105#M56908</link>
    <description>&lt;P&gt;I'm trying to compile Spark 1.6.0-cdh5.10.0 on a Docker container (that should act as an edge node for the cluster, running Jupyter).&lt;/P&gt;&lt;P&gt;This is the source:&lt;/P&gt;&lt;P&gt;&lt;A href="http://archive.cloudera.com/cdh5/cdh/5/spark-1.6.0-cdh5.10.0-src.tar.gz" target="_blank"&gt;http://archive.cloudera.com/cdh5/cdh/5/spark-1.6.0-cdh5.10.0-src.tar.gz&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I am using the following Maven command:&lt;/P&gt;&lt;PRE&gt;build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=$V_CDH -Phive -Phive-thriftserver -DskipTests clean package&lt;/PRE&gt;&lt;P&gt;but the build cannot get past the thrift server module.&lt;/P&gt;&lt;P&gt;I always get this error:&lt;/P&gt;&lt;PRE&gt;[error] /usr/local/spark-1.6.0-cdh5.10.0/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala:42: class SparkExecuteStatementOperation needs to be abstract, since method cancel in class Operation of type (x$1: org.apache.hive.service.cli.OperationState)Unit is not defined
[error] private[hive] class SparkExecuteStatementOperation(
[error]                     ^
[error] /usr/local/spark-1.6.0-cdh5.10.0/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala:252: method cancel overrides nothing.
[error] Note: the super classes of class SparkExecuteStatementOperation contain the following, non final members named cancel:
[error] def cancel(x$1: org.apache.hive.service.cli.OperationState): Unit
[error]   override def cancel(): Unit = {
[error]                ^
[error] /usr/local/spark-1.6.0-cdh5.10.0/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/server/SparkSQLOperationManager.scala:42: method newExecuteStatementOperation overrides nothing.
[error] Note: the super classes of class SparkSQLOperationManager contain the following, non final members named newExecuteStatementOperation:
[error] def newExecuteStatementOperation(x$1: org.apache.hive.service.cli.session.HiveSession,x$2: String,x$3: java.util.Map[String,String],x$4: Boolean,x$5: Long): org.apache.hive.service.cli.operation.ExecuteStatementOperation
[error]   override def newExecuteStatementOperation(&lt;/PRE&gt;&lt;P&gt;Any ideas?&lt;/P&gt;&lt;P&gt;Thanks in advance,&lt;/P&gt;&lt;P&gt;Lorenzo&lt;/P&gt;</description>
    <pubDate>Fri, 16 Sep 2022 11:14:50 GMT</pubDate>
    <dc:creator>lorenz984b</dc:creator>
    <dc:date>2022-09-16T11:14:50Z</dc:date>
    <item>
      <title>Build failed for spark thriftserver - CDH5.10.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Build-failed-for-spark-thriftserver-CDH5-10-0/m-p/52105#M56908</link>
      <description>&lt;P&gt;I'm trying to compile Spark 1.6.0-cdh5.10.0 on a Docker container (that should act as an edge node for the cluster, running Jupyter).&lt;/P&gt;&lt;P&gt;This is the source:&lt;/P&gt;&lt;P&gt;&lt;A href="http://archive.cloudera.com/cdh5/cdh/5/spark-1.6.0-cdh5.10.0-src.tar.gz" target="_blank"&gt;http://archive.cloudera.com/cdh5/cdh/5/spark-1.6.0-cdh5.10.0-src.tar.gz&lt;/A&gt;&lt;/P&gt;&lt;P&gt;I am using the following Maven command:&lt;/P&gt;&lt;PRE&gt;build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=$V_CDH -Phive -Phive-thriftserver -DskipTests clean package&lt;/PRE&gt;&lt;P&gt;but the build cannot get past the thrift server module.&lt;/P&gt;&lt;P&gt;I always get this error:&lt;/P&gt;&lt;PRE&gt;[error] /usr/local/spark-1.6.0-cdh5.10.0/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala:42: class SparkExecuteStatementOperation needs to be abstract, since method cancel in class Operation of type (x$1: org.apache.hive.service.cli.OperationState)Unit is not defined
[error] private[hive] class SparkExecuteStatementOperation(
[error]                     ^
[error] /usr/local/spark-1.6.0-cdh5.10.0/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkExecuteStatementOperation.scala:252: method cancel overrides nothing.
[error] Note: the super classes of class SparkExecuteStatementOperation contain the following, non final members named cancel:
[error] def cancel(x$1: org.apache.hive.service.cli.OperationState): Unit
[error]   override def cancel(): Unit = {
[error]                ^
[error] /usr/local/spark-1.6.0-cdh5.10.0/sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/server/SparkSQLOperationManager.scala:42: method newExecuteStatementOperation overrides nothing.
[error] Note: the super classes of class SparkSQLOperationManager contain the following, non final members named newExecuteStatementOperation:
[error] def newExecuteStatementOperation(x$1: org.apache.hive.service.cli.session.HiveSession,x$2: String,x$3: java.util.Map[String,String],x$4: Boolean,x$5: Long): org.apache.hive.service.cli.operation.ExecuteStatementOperation
[error]   override def newExecuteStatementOperation(&lt;/PRE&gt;&lt;P&gt;Any ideas?&lt;/P&gt;&lt;P&gt;Thanks in advance,&lt;/P&gt;&lt;P&gt;Lorenzo&lt;/P&gt;</description>
      <pubDate>Fri, 16 Sep 2022 11:14:50 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Build-failed-for-spark-thriftserver-CDH5-10-0/m-p/52105#M56908</guid>
      <dc:creator>lorenz984b</dc:creator>
      <dc:date>2022-09-16T11:14:50Z</dc:date>
    </item>
    <item>
      <title>Re: Build failed for spark thriftserver - CDH5.10.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Build-failed-for-spark-thriftserver-CDH5-10-0/m-p/52107#M56909</link>
      <description>&lt;P&gt;Yes, the thrift server isn't shipped or supported, in part because it doesn't work with the later Hive shipped by CDH. I don't think it will build with this profile enabled.&lt;/P&gt;</description>
      <pubDate>Mon, 13 Mar 2017 12:38:18 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Build-failed-for-spark-thriftserver-CDH5-10-0/m-p/52107#M56909</guid>
      <dc:creator>srowen</dc:creator>
      <dc:date>2017-03-13T12:38:18Z</dc:date>
    </item>
    <item>
      <title>Re: Build failed for spark thriftserver - CDH5.10.0</title>
      <link>https://community.cloudera.com/t5/Archives-of-Support-Questions/Build-failed-for-spark-thriftserver-CDH5-10-0/m-p/52108#M56910</link>
      <description>Ok,&lt;BR /&gt;Thanks,&lt;BR /&gt;Lorenzo</description>
      <pubDate>Mon, 13 Mar 2017 12:43:48 GMT</pubDate>
      <guid>https://community.cloudera.com/t5/Archives-of-Support-Questions/Build-failed-for-spark-thriftserver-CDH5-10-0/m-p/52108#M56910</guid>
      <dc:creator>lorenz984b</dc:creator>
      <dc:date>2017-03-13T12:43:48Z</dc:date>
    </item>
  </channel>
</rss>