Cloudera Employee Ana

[ANNOUNCE] CDS 2.3 Release 3 Powered by Apache Spark Released

We are happy to announce CDS 2.3 Release 3 Powered by Apache Spark. You can download the parcel and apply it directly to provisioned clusters without disrupting your currently running Spark workloads.
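After activating the parcel through Cloudera Manager, one quick sanity check is to confirm the Spark 2 version on a gateway host. A minimal sketch, assuming the CDS parcel's `spark2-submit` launcher is on the PATH (the fallback message is just illustrative):

```shell
#!/bin/sh
# Check whether the CDS parcel's Spark 2 launcher is available on this host,
# and if so, print the line of its version banner that reports the version.
if command -v spark2-submit >/dev/null 2>&1; then
    spark2-submit --version 2>&1 | grep -m1 'version'
else
    # Hosts without the activated parcel will not have spark2-submit.
    echo "spark2-submit not found: activate the CDS parcel on this host first"
fi
```

Running this on each gateway host after activation confirms the new parcel is the one serving Spark 2 jobs.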


This component is generally available and is supported on CDH 5.9 and higher, and Cloudera Manager 5.11 and higher.

This is purely a maintenance release; it includes all fixes from the Apache Spark 2.3.1 upstream release, omitting test-only changes. For more information, see the Apache Spark 2.3.1 upstream release notes. Fixed issues include:


  • SPARK-16451 - [REPL] Spark-shell / pyspark should finish gracefully when "SaslException: GSS initiate failed" is hit
  • SPARK-17756 - [PYTHON][STREAMING] java.lang.ClassCastException returned when using 'cartesian' with DStream.transform
  • SPARK-24029 - Set the "reuse address" flag on listen sockets
  • SPARK-24216 - [SQL] Spark TypedAggregateExpression uses getSimpleName, which is not safe in Scala
  • SPARK-24369 - [SQL] Correct handling for multiple distinct aggregations that have the same argument set
  • SPARK-24468 - [SQL] DecimalType 'adjustPrecisionScale' might fail when scale is negative
  • SPARK-24495 - [SQL] SortMergeJoin with duplicate keys produces wrong results
  • SPARK-24506 - [UI] Add UI filters to tabs added after binding
  • SPARK-24542 - [SQL] Hive UDF series UDFXPathXXXX allows users to pass carefully crafted XML to access arbitrary files
  • SPARK-24548 - [SQL] JavaPairRDD to Dataset<Row> in Spark generates ambiguous results
  • SPARK-24552 - Task attempt numbers are reused when stages are retried
  • SPARK-24578 - [CORE] Reading remote cache block behavior changes and causes timeout issue
  • SPARK-24583 - [SQL] Wrong schema type in InsertIntoDataSourceCommand
  • SPARK-24589 - [CORE] OutputCommitCoordinator might allow duplicate commits



Download CDS 2.3 Release 3 Powered by Apache Spark.

Read the documentation.

Want to become a pro Spark user?  Sign up for Apache Spark Training.
