Release Notes

MongoDB Connector for Spark 2.3.1

Released on October 8, 2018

  • Updated Mongo Java Driver to 3.8.2
  • SPARK-206 Updated Spark dependency to 2.3.2
  • SPARK-210 Added ReadConfig.samplePoolSize to improve the performance of inferring schemas (see the sketch after this list)
  • SPARK-216 Updated the UDF helpers so that the JavaScript (no scope) and Regex (no options) helpers are no longer overwritten.
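
A minimal sketch of how the new samplePoolSize setting might be applied when inferring a schema. The option key, URI, and collection below are assumptions for illustration, not taken from these release notes.

    import com.mongodb.spark.MongoSpark
    import com.mongodb.spark.config.ReadConfig
    import org.apache.spark.sql.SparkSession

    object SamplePoolSizeSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("samplePoolSize-sketch")
          // Placeholder input URI: database "test", collection "events"
          .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.events")
          .getOrCreate()

        // Assumed option key "samplePoolSize" (SPARK-210): the size of the pool of
        // documents from which the schema-inference sample is drawn.
        val readConfig = ReadConfig(
          Map("samplePoolSize" -> "10000"),
          Some(ReadConfig(spark.sparkContext)))

        val df = MongoSpark.load(spark, readConfig) // schema inferred from the sampled documents
        df.printSchema()

        spark.stop()
      }
    }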

MongoDB Connector for Spark 2.3.0

Released on July 30, 2018

  • SPARK-156 Updated Spark dependency to 2.3.0. Dropped Scala 2.10 support.
  • SPARK-174 Updated MongoDB Java driver to 3.8.0.
  • SPARK-133 Added support for MapType when inferring the schema.
  • SPARK-186 Added configuration to disable auto pipeline manipulation with Spark SQL.
  • SPARK-188 Removed minKey/maxKey bounds from partitioners. Partitioners that produce empty query bounds no longer modify the pipeline.
  • SPARK-164 Added the ordered property to WriteConfig.
  • SPARK-192 Added the WriteConfig.forceInsert property. DataFrame overwrites automatically set forceInsert to true (both options appear in the sketch after this list).
  • SPARK-178 Log partitioner errors to provide clearer feedback to users.
  • SPARK-102 Added AggregationConfig to configure reads from MongoDB.
  • SPARK-197 Fixed BSON compatibility for non-nullable struct fields.
  • SPARK-199 Optimized Row to Document conversion.
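
A short sketch of how the ordered and forceInsert write properties added in this release might be supplied when saving a DataFrame. The option keys and the URI are assumptions for illustration; confirm them against the connector's configuration reference.

    import com.mongodb.spark.MongoSpark
    import com.mongodb.spark.config.WriteConfig
    import org.apache.spark.sql.SparkSession

    object WriteConfigSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("writeconfig-sketch")
          // Placeholder output URI: database "test", collection "people"
          .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.people")
          .getOrCreate()

        import spark.implicits._
        val people = Seq(("Ada", 36), ("Grace", 45)).toDF("name", "age")

        // Assumed option keys (SPARK-164, SPARK-192):
        //   ordered     - execute bulk writes as an ordered batch
        //   forceInsert - always insert rather than replace existing documents
        val writeConfig = WriteConfig(
          Map("ordered" -> "false", "forceInsert" -> "true"),
          Some(WriteConfig(spark.sparkContext)))

        MongoSpark.save(people, writeConfig)
        spark.stop()
      }
    }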

MongoDB Connector for Spark 2.2.3

Released on June 19, 2018

  • SPARK-187 Fixed schema inference for decimal values whose scale is larger than their precision.

MongoDB Connector for Spark 2.2.2

Released on April 18, 2018

  • SPARK-150 Fixed MongoShardedPartitioner to work with compound shard keys.
  • SPARK-147 Fixed writing Datasets that use compound shard keys; see WriteConfig#shardKey.
  • SPARK-157 Fixed a MongoPaginateByCountPartitioner exception when a query matched a single item.
  • SPARK-158 Fixed null handling for String columns.
  • SPARK-173 Improved error messages for cursor not found exceptions.

MongoDB Connector for Spark 2.2.1

Released on October 31, 2017

  • SPARK-151 Fixed a MongoSamplePartitioner $match range bug.

MongoDB Connector for Spark 2.2.0

Released on July 13, 2017

  • SPARK-127 Fixed a Scala 2.10 compiler error for Java bean type inference.
  • SPARK-126 Support Spark 2.2.0. Updated Spark dependency to 2.2.0.

MongoDB Connector for Spark 2.1.0

Released on July 12, 2017

  • SPARK-125 Updated Spark dependency to 2.1.1.
  • SPARK-124 Made the maximum batch size for bulk updates and inserts configurable.
  • SPARK-106 Added MongoSpark.load helpers for Java users using a SparkSession.
  • SPARK-100 Added WriteConfig.replaceDocument to configure how Datasets are saved (these options appear in the sketch after this list).
  • SPARK-39 Added support for Decimal type.
  • SPARK-112 Fixed custom partition key bug in MongoSamplePartitioner.
  • SPARK-122 Ensured pagination partitioners can use a covered query.
  • SPARK-101 Added support for partial collection partitioning for non-sharded partitioners.
  • SPARK-103 Ensured partitioners handle empty collections.
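
The write properties and load helpers from this release might be wired up as in the sketch below. The option keys mirror the property names listed above but are assumptions here, and the URIs point at a hypothetical local deployment.

    import com.mongodb.spark.MongoSpark
    import org.apache.spark.sql.SparkSession

    object ConnectorTwoOneSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("connector-2.1.0-sketch")
          // Placeholder URIs: database "test", collection "people"
          .config("spark.mongodb.input.uri", "mongodb://127.0.0.1/test.people")
          .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/test.people")
          .getOrCreate()

        import spark.implicits._
        val people = Seq(("Ada", 36), ("Grace", 45)).toDF("name", "age")

        people.write
          .format("com.mongodb.spark.sql.DefaultSource")
          .mode("append")
          .option("replaceDocument", "false") // assumed key for WriteConfig.replaceDocument (SPARK-100)
          .option("maxBatchSize", "512")      // assumed key for the bulk write batch size (SPARK-124)
          .save()

        // SparkSession-based load helper; the Java helpers added in SPARK-106 mirror this call
        val readBack = MongoSpark.load(spark)
        readBack.show()

        spark.stop()
      }
    }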

MongoDB Connector for Spark 2.0.0

Released on November 1, 2016

  • First Spark 2.0.0 release