### Describe the bug
The new Spark 4.1 test `SPARK-53968 reading the view after allowPrecisionLoss is changed` in `SQLViewSuite` fails with Comet enabled. The test stores `DECIMAL(38, 18)` values and computes `unit_price + COALESCE(shipping_price, 0)` through a CTE wrapped in a view. Comet returns values approximately 10x smaller than expected:
| Row | Expected | Actual (with Comet) |
|-------|-----------------------|----------------------|
| part1 | 100.00000000000000000 | 10.00000000000000000 |
| part2 | 100.00000000000000000 | 10.00000000000000000 |
| part3 | 300.23000000000000000 | 30.02300000000000000 |
The plan involves `CometBroadcastHashJoin → CometProject(unit_price + COALESCE(shipping_price, 0E-18)) → CometExchange(rangepartitioning) → CometSort`.
### Steps to reproduce
Run Spark 4.1.1's `SQLViewSuite` with Comet enabled. The reproducer is the test body itself (see `sql/core/src/test/scala/org/apache/spark/sql/execution/SQLViewSuite.scala`, test `SPARK-53968 reading the view after allowPrecisionLoss is changed`).
### Expected behavior
`unit_price + COALESCE(shipping_price, 0)` should match Spark's results (100, 100, 300.23).
### Workaround
The test is currently tagged `IgnoreComet(...)` in `dev/diffs/4.1.1.diff`.
### Additional context
PR #4093 enables Spark 4.1.1 in the `Spark SQL Tests` workflow. The 10x discrepancy and the `DECIMAL(38, 18)` schema strongly suggest a precision/scale handling bug in Comet's decimal addition, or in the view-resolution path, that drops one digit of scale.
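The 10x-off values are consistent with Spark's decimal-addition typing rules: with `spark.sql.decimalOperations.allowPrecisionLoss=true`, `DECIMAL(38, 18) + DECIMAL(38, 18)` first widens to precision 39, then is truncated back to `DECIMAL(38, 17)`, i.e. exactly one digit of scale is dropped. A minimal sketch of that arithmetic (the constants mirror Spark's `DecimalType`; the scale-mismatch example at the end is a hypothetical illustration, not a confirmed diagnosis of where Comet diverges):

```python
# Sketch of Spark's decimal-addition result type under allowPrecisionLoss,
# following the rules in DecimalType.adjustPrecisionScale.
MAX_PRECISION = 38
MINIMUM_ADJUSTED_SCALE = 6

def add_result_type(p1, s1, p2, s2):
    """Result (precision, scale) of DECIMAL(p1,s1) + DECIMAL(p2,s2)."""
    scale = max(s1, s2)
    precision = max(p1 - s1, p2 - s2) + scale + 1
    if precision <= MAX_PRECISION:
        return precision, scale
    # allowPrecisionLoss: keep the integral digits, shrink the scale
    # (but not below MINIMUM_ADJUSTED_SCALE).
    int_digits = precision - scale
    adjusted_scale = max(MAX_PRECISION - int_digits,
                         min(scale, MINIMUM_ADJUSTED_SCALE))
    return MAX_PRECISION, adjusted_scale

print(add_result_type(38, 18, 38, 18))  # (38, 17): one digit of scale dropped

# Hypothetical mismatch: if the sum's unscaled value is produced at
# scale 17 but re-read assuming scale 18, every value shrinks by 10x,
# matching the observed failure (100 -> 10, 300.23 -> 30.023).
unscaled = 100 * 10**17        # 100.0 encoded at scale 17
print(unscaled / 10**18)       # 10.0
```

If this is the mechanism, the fix would be to make Comet's decimal addition agree with Spark on the adjusted `DECIMAL(38, 17)` result type rather than the input scale of 18.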