Conversation

@subkanthi
Collaborator

closes: #85

@subkanthi
Collaborator Author

subkanthi commented Jan 6, 2026

**Before fix:**

spark-sql (test)> select * from test.test4.partitions;
**{"_time_day":1970-01-21}**	0	86400	1	274952	0	0	0	0	2026-01-05 17:15:08.795	1674173899564276772
Time taken: 0.147 seconds, Fetched 1 row(s)

**After fix:**

select * from test.test10.partitions;
**{"_time_day":2026-01-05}**	0	86400	1	274952	0	0	0	0	2026-01-06 16:47:20.632	6618988112230024927
Time taken: 0.189 seconds, Fetched 1 row(s)
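
The 1970-01-21 value before the fix is what you get when an epoch-millisecond timestamp is divided by the number of microseconds in a day, i.e. the day transform treats millis as micros and lands roughly 1000x too close to the epoch. Below is a minimal sketch of that arithmetic, assuming millis-encoded source timestamps; the class and constant names are illustrative and not the project's actual code.

```java
import java.time.Instant;
import java.time.LocalDate;

// Hypothetical sketch of the partition-day arithmetic behind the before/after
// values above; names are illustrative, not the project's code.
public class DayTransformSketch {
    // Divisor a microsecond-based "day" transform would use.
    private static final long MICROS_PER_DAY = 86_400L * 1_000_000L;

    public static void main(String[] args) {
        // Epoch *milliseconds* for a timestamp inside the UTC day 2026-01-05.
        long epochMillis = Instant.parse("2026-01-05T12:00:00Z").toEpochMilli();

        // Before the fix: millis fed straight into the micros-based transform,
        // so the result is ~1000x too small (about 20 days after the epoch).
        long wrongDays = epochMillis / MICROS_PER_DAY;
        System.out.println(LocalDate.ofEpochDay(wrongDays));   // 1970-01-21

        // After the fix: scale millis to micros before applying the transform.
        long correctDays = Math.floorDiv(epochMillis * 1_000L, MICROS_PER_DAY);
        System.out.println(LocalDate.ofEpochDay(correctDays)); // 2026-01-05
    }
}
```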


spark-sql (test)> select count(*), min(_time), max(_time) from `test10` where _time >= '2026-01-01 18:00:00';
86400	2026-01-04 18:00:00	2026-01-05 17:59:59
Time taken: 0.083 seconds, Fetched 1 row(s)

spark-sql (test)> select current_timezone();
America/Chicago
Time taken: 0.057 seconds, Fetched 1 row(s)


spark-sql (test)> select count(*), min(_time), max(_time) from `test10` where _time >= '2026-01-04 18:05:00' and _time <= '2026-01-04 18:10:00';
301	2026-01-04 18:05:00	2026-01-04 18:10:00
Time taken: 0.118 seconds, Fetched 1 row(s)
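
The min/max above are rendered in the Spark session timezone (America/Chicago, UTC-6 in January), so the rows covering the UTC day 2026-01-05, the same day the `_time_day` partition value refers to, display as 2026-01-04 18:00:00 through 2026-01-05 17:59:59 local time. A minimal sketch of that conversion, assuming the stored values are UTC (hypothetical class name):

```java
import java.time.LocalDateTime;
import java.time.ZoneId;
import java.time.ZoneOffset;
import java.time.ZonedDateTime;

// Illustration only: render the UTC bounds of partition day 2026-01-05 in the
// America/Chicago session timezone to match the query output above.
public class SessionTimezoneSketch {
    public static void main(String[] args) {
        ZoneId session = ZoneId.of("America/Chicago");

        ZonedDateTime utcStart = LocalDateTime.parse("2026-01-05T00:00:00").atZone(ZoneOffset.UTC);
        ZonedDateTime utcEnd   = LocalDateTime.parse("2026-01-05T23:59:59").atZone(ZoneOffset.UTC);

        System.out.println(utcStart.withZoneSameInstant(session).toLocalDateTime()); // 2026-01-04T18:00
        System.out.println(utcEnd.withZoneSameInstant(session).toLocalDateTime());   // 2026-01-05T17:59:59
    }
}
```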

@shyiko merged commit 1eee30c into master Jan 9, 2026
1 check passed


Development

Successfully merging this pull request may close these issues.

Partition min/max incorrectly calculated for timestamp (millis)
