fix(spark): Use wrapping addition/subtraction in SparkDateAdd and SparkDateSub #19377
Merged
Jefffrey merged 3 commits into apache:main on Jan 6, 2026
Conversation
In Spark, the `date_add` and `date_sub` functions do not raise an error when the addition or subtraction overflows a 32-bit integer, regardless of whether the ANSI evaluation mode is in effect.
This was trickier to test than usual: the wrapped Date32 values are computed correctly, but _formatting_ the results fails because the dates fall outside chrono's supported range. The tests therefore cast the output to int and compare against the exact values.
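For illustration, a minimal sketch of the wrapping semantics (the function names here are hypothetical, not the actual kernel code). Date32 represents a date as days since the Unix epoch in an i32, so adding or subtracting a large day count can overflow:

```rust
// Sketch only: Spark wraps on overflow instead of raising an error,
// so the kernels use wrapping arithmetic rather than checked arithmetic.
fn spark_date_add_days(days_since_epoch: i32, num_days: i32) -> i32 {
    days_since_epoch.wrapping_add(num_days)
}

fn spark_date_sub_days(days_since_epoch: i32, num_days: i32) -> i32 {
    days_since_epoch.wrapping_sub(num_days)
}

fn main() {
    // '1969-01-01' is day -365; subtracting i32::MAX wraps around.
    assert_eq!(spark_date_sub_days(-365, i32::MAX), 2_147_483_284);
    // '2016-07-30' is day 17_012; adding i32::MAX wraps negative.
    assert_eq!(spark_date_add_days(17_012, i32::MAX), -2_147_466_637);
}
```

The two assertions mirror the values Spark itself returns in the transcript below.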
martin-g
approved these changes
Dec 19, 2025
query I
SELECT date_sub('1969-01-01'::date, 2147483647::int)::int;
----
2147483284
Member
Apache Spark 4.0.1 returns the same results:
❯ ./bin/spark-sql
WARNING: Using incubator modules: jdk.incubator.vector
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
25/12/19 13:27:39 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
25/12/19 13:27:41 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 2.3.0
25/12/19 13:27:41 WARN ObjectStore: setMetaStoreSchemaVersion called but recording version is disabled: version = 2.3.0, comment = Set by MetaStore mgrigorov@192.168.x.y
Spark Web UI available at http://localhost:4040
Spark master: local[*], Application Id: local-1766143660589
spark-sql (default)> SELECT UNIX_DATE(date_add('2016-07-30', 2147483647));
-2147466637
Time taken: 0.028 seconds, Fetched 1 row(s)
spark-sql (default)> SELECT UNIX_DATE(date_sub('1969-01-01', 2147483647));
2147483284
Time taken: 0.03 seconds, Fetched 1 row(s)
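These outputs agree with plain i32 wrapping (day numbers computed here for illustration): UNIX_DATE('2016-07-30') is 17012, and 17012 + 2147483647 exceeds i32::MAX, wrapping to 2147500659 - 2^32 = -2147466637; likewise UNIX_DATE('1969-01-01') is -365, and -365 - 2147483647 wraps to 2147483284.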
Jefffrey
approved these changes
Jan 3, 2026
Contributor
cc @andygrove
Contributor
Thanks @mzabaluev & @martin-g
Which issue does this PR close?
Rationale for this change
In Spark, the `date_add` and `date_sub` functions do not raise an error when the addition or subtraction overflows a 32-bit integer, regardless of whether the ANSI evaluation mode is in effect.

What changes are included in this PR?

Changed the implementations to use `wrapping_add` and `wrapping_sub`.

Are these changes tested?

Updated `date_add.slt` to expect overflowed results rather than errors.

Are there any user-facing changes?

The functions behave more like Spark, even if Spark is doing the wrong thing.
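As a sketch of the behavioral change (assuming the previous implementation used checked arithmetic that surfaced an error on overflow; the names below are illustrative, not the crate's API):

```rust
// Illustrative contrast, not the actual DataFusion code.
// Before: checked arithmetic turned overflow into an error.
fn date_add_checked(days: i32, n: i32) -> Result<i32, String> {
    days.checked_add(n)
        .ok_or_else(|| "integer overflow in date_add".to_string())
}

// After: wrapping arithmetic silently wraps, matching Spark.
fn date_add_wrapping(days: i32, n: i32) -> i32 {
    days.wrapping_add(n)
}

fn main() {
    assert!(date_add_checked(17_012, i32::MAX).is_err());
    assert_eq!(date_add_wrapping(17_012, i32::MAX), -2_147_466_637);
}
```

Wrapping matches Spark's two's-complement semantics, at the cost of silently producing out-of-range dates.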