[SPARK-48773][FOLLOW-UP] `spark.conf.set` should not fail when setting `spark.default.parallelism`

### What changes were proposed in this pull request?
`spark.conf.set` should not fail when setting `spark.default.parallelism`.

### Why are the changes needed?
This fixes a behavior change: before `SPARK-48773`, setting `spark.default.parallelism` through the Spark session did not fail and was a no-op.

### Does this PR introduce _any_ user-facing change?
Yes. Before `SPARK-48773`, `spark.conf.set("spark.default.parallelism")` did not fail and was a no-op. After `SPARK-48773`, `spark.conf.set("spark.default.parallelism")` fails with a `CANNOT_MODIFY_CONFIG` exception. With this follow-up, the original behavior is restored: `spark.conf.set("spark.default.parallelism")` does not fail and is a no-op.

### How was this patch tested?
Manual testing.

### Was this patch authored or co-authored using generative AI tooling?
No

Closes #48526 from amaliujia/SPARK-48773.

Authored-by: Rui Wang <rui.wang@databricks.com>
Signed-off-by: Wenchen Fan <wenchen@databricks.com>
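The restored semantics can be illustrated with a small stand-in outside Spark. The `RuntimeConf` class below is hypothetical (it is not Spark's actual implementation); it only sketches the intended behavior under this follow-up: attempts to set a session-unmodifiable key like `spark.default.parallelism` are silently ignored rather than raising an exception.

```python
import warnings

class RuntimeConf:
    """Hypothetical stand-in for a session-level runtime config.

    Sketches the restored behavior: setting "spark.default.parallelism"
    is a no-op instead of raising a CANNOT_MODIFY_CONFIG error.
    """

    # Keys the session cannot modify; setting them is silently ignored.
    _NO_OP_KEYS = {"spark.default.parallelism"}

    def __init__(self):
        self._entries = {}

    def set(self, key, value):
        if key in self._NO_OP_KEYS:
            # No-op: warn (optional) but do not raise and do not store.
            warnings.warn(f"Ignoring attempt to set static config {key!r}")
            return
        self._entries[key] = value

    def get(self, key, default=None):
        return self._entries.get(key, default)


conf = RuntimeConf()
conf.set("spark.sql.shuffle.partitions", "8")  # ordinary config: stored
conf.set("spark.default.parallelism", "100")   # no-op, raises nothing
print(conf.get("spark.sql.shuffle.partitions"))  # → 8
print(conf.get("spark.default.parallelism"))     # → None
```

The design choice mirrored here is the pre-`SPARK-48773` behavior the follow-up restores: tolerate the write for backward compatibility instead of surfacing an error.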