What is the problem the feature request solves?
The installation docs list Java 17 as the only supported JDK for Spark 4.0 (experimental). Java 21 has been LTS for roughly 2.5 years, is officially supported by Spark 4, and the rest of the Spark ecosystem (Arrow, Netty, Iceberg, Delta) has already shipped Java 21 compatibility. Comet is now one of the few remaining components without official Java 21 validation.
Describe the potential solution
Add a Spark 4.0 / JDK 21 row to the matrix in `pr_build_linux.yml` and `spark_sql_test.yml`, and extend the conditional `JAVA_TOOL_OPTIONS` so the existing `--add-opens`/`--add-exports` flags also apply on JDK 21. As prior art, Spark 4's own launcher scripts already set the correct flags for both JDK versions. Then update the compatibility matrix in `installation.md`.
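The change above could look roughly like the following. This is a hypothetical sketch, not the actual workflow contents: the job name, matrix keys, and the exact flag list are assumptions, and the real `--add-opens`/`--add-exports` set should be copied from the existing JDK 17 configuration (or from Spark 4's launcher scripts).

```yaml
# Hypothetical sketch for pr_build_linux.yml / spark_sql_test.yml.
# Job and key names are illustrative; align with the existing workflow.
jobs:
  build:
    strategy:
      matrix:
        include:
          - spark-version: "4.0"
            java-version: 17
          - spark-version: "4.0"   # new row: validate Spark 4.0 on JDK 21
            java-version: 21
    steps:
      - name: Set JVM module flags
        # Widen the existing JDK 17 condition so the same module-opening
        # flags also apply on JDK 21 (flag shown here is an example only).
        if: ${{ matrix.java-version >= 17 }}
        run: |
          echo "JAVA_TOOL_OPTIONS=--add-opens=java.base/java.nio=ALL-UNNAMED" >> "$GITHUB_ENV"
```

Because `JAVA_TOOL_OPTIONS` is picked up by every JVM launched in the job, this covers both the Maven build and the forked test JVMs without touching individual surefire/scalatest configs.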
Additional context