What versions of Spark are supported for Data Engineering Spark Jobs? I currently have a large amount of PySpark code and want to make sure it will work out of the box.
Currently, Spark 3.2 is the only supported version; additional versions will be supported in the future.
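As a quick sanity check before migrating, a job can inspect the Spark version at runtime and flag a mismatch. The sketch below uses the standard PySpark `SparkSession` API; the app name and warning messages are illustrative, not part of the platform.

```python
from pyspark.sql import SparkSession

# Create or reuse the active Spark session (assumes a standard PySpark setup).
spark = SparkSession.builder.appName("version-check").getOrCreate()

# spark.version returns the runtime version string, e.g. "3.2.1".
major_minor = ".".join(spark.version.split(".")[:2])

if major_minor == "3.2":
    print(f"Spark {spark.version} matches the supported 3.2 line.")
else:
    print(f"Warning: running Spark {spark.version}; only 3.2 is currently supported.")
```

Running this at the top of an existing PySpark job is a low-cost way to confirm the environment matches what the code was written against.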