Add Spark 4.0 support via deequ:2.0.14-spark-4.0 #259
Open
m-aciek wants to merge 1 commit into awslabs:master
Conversation
m-aciek (Author)
This is now ready for review; CI tests pass on my fork: https://github.com/m-aciek/python-deequ/actions/runs/24196839467
chenliu0831 (Contributor)
approved these changes
Apr 15, 2026
LGTM. I'm not sure if we would like to keep maintaining the Py4j approach though.
Contributor
@m-aciek we need your commit to have verified signatures
- Add "4.0" entry to SPARK_TO_DEEQU_COORD_MAPPING in configs.py - Widen pyspark optional dep bound to <5.0.0 in pyproject.toml - Replace scala.collection.JavaConversions (removed in Scala 2.13) with JavaConverters in scala_utils.py and profiles.py - Replace scala.collection.Seq.empty() (inaccessible via Py4J in Scala 2.13) with to_scala_seq(jvm, jvm.java.util.ArrayList()) in analyzers.py and checks.py - Add Spark 4.0.0 to CI matrix with Java 17; use include: style to pair each Spark version with its required Java version - Fix CI for Spark 4.0: - use Python 3.9 and version-marker pyspark dep - use pip install instead of poetry add - install pandas>=2.0.0 required by PySpark 4.0 - Fix empty Seq compatibility across Scala 2.12 and 2.13 Fixes awslabs#258
m-aciek force-pushed from d80a2cd to 6aa90c3
m-aciek (Author)
@chenliu0831 I've set up the verification and squashed the commits
Closes #258
Summary
"4.0": "com.amazon.deequ:deequ:2.0.14-spark-4.0"toSPARK_TO_DEEQU_COORD_MAPPINGinconfigs.py>=2.4.7,<3.4.0to>=2.4.7,<5.0.0inpyproject.tomlscala.collection.JavaConversions(removed in Scala 2.13) withJavaConvertersinscala_utils.pyandprofiles.pyscala.collection.Seq.empty()(inaccessible via Py4J in Scala 2.13) with an empty Java list converted viato_scala_seqinanalyzers.pyandchecks.pyinclude:style so each Spark version carries its required Java versionRoot causes fixed
Spark 4 uses Scala 2.13, which introduced two breaking changes affecting pydeequ:
- scala.collection.JavaConversions was removed; replaced by JavaConverters with explicit .asScala()/.asJava() calls
- scala.collection.Seq.empty() is not accessible via Py4J reflection; replaced with to_scala_seq(jvm, jvm.java.util.ArrayList()), which constructs an empty Scala Seq via the already-fixed converter

Test plan
- CI run with SPARK_VERSION=4.0.0 / pyspark==4.0.0

PR authored with assistance from Claude Code
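The test plan amounts to roughly the following setup sketch, assembled from the commit message's CI notes (Java 17, Python 3.9, pandas>=2.0.0 for PySpark 4.0); the pytest target path is an assumption.

```shell
# Sketch of the Spark 4.0 CI leg; requires Java 17 on PATH.
export SPARK_VERSION=4.0.0
pip install "pyspark==${SPARK_VERSION}" "pandas>=2.0.0"
python -m pytest tests/  # hypothetical test path
```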