Add Java 21 to the supported version matrix (particularly for Spark 4.0) #4059

@james-willis

Description

What is the problem the feature request solves?

The installation docs list only Java 17 for Spark 4.0 (experimental). Java 21 is an LTS release, roughly 2.5 years old, officially supported by Spark 4, and the rest of the Spark ecosystem (Arrow, Netty, Iceberg, Delta) has already shipped Java 21 compatibility. Comet is now one of the few remaining pieces without official Java 21 validation.

Describe the potential solution

Add a Spark 4.0 / JDK 21 row to the matrix in pr_build_linux.yml and spark_sql_test.yml, and extend the conditional JAVA_TOOL_OPTIONS so the existing --add-opens/--add-exports flags also apply on JDK 21. As prior art, Spark 4's own launcher scripts already set the correct flags for both JDK versions. Then update the compatibility matrix in installation.md.
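A rough sketch of what the workflow change could look like. The job, key, and flag details below are hypothetical (the actual matrix layout and the full --add-opens/--add-exports list in Comet's workflows may differ); it only illustrates adding the JDK 21 row and widening the JDK-version condition:

```yaml
# Hypothetical fragment for pr_build_linux.yml / spark_sql_test.yml;
# real job names, matrix keys, and the full flag list may differ.
jobs:
  spark-sql-test:
    strategy:
      matrix:
        include:
          - spark-version: "4.0"
            java-version: "17"
          - spark-version: "4.0"
            java-version: "21"   # new row proposed by this issue
    steps:
      - uses: actions/setup-java@v4
        with:
          distribution: temurin
          java-version: ${{ matrix.java-version }}
      # Extend the existing JDK-17-only condition so the module flags
      # also apply on JDK 21 (illustrative subset of the flags).
      - name: Set JAVA_TOOL_OPTIONS
        if: ${{ matrix.java-version == '17' || matrix.java-version == '21' }}
        run: |
          echo "JAVA_TOOL_OPTIONS=--add-opens=java.base/java.lang=ALL-UNNAMED \
          --add-opens=java.base/java.nio=ALL-UNNAMED" >> "$GITHUB_ENV"
```

The full set of module flags to carry over should be taken from Spark 4's launcher scripts, which already encode the options needed on JDK 17 and 21.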

Additional context
