Pandas version checks
- [X] I have checked that this issue has not already been reported.
- [X] I have confirmed this bug exists on the latest version of pandas.
- [X] I have confirmed this bug exists on the main branch of pandas.
Reproducible Example
import io

import pandas as pd

if __name__ == "__main__":
    df = pd.DataFrame(
        [
            ["2023-09-29 02:55:54"],
            ["2023-09-29 02:56:03"],
        ],
        columns=["timestamp"],
        dtype="datetime64[ms]",
    )
    serialized = df.to_json()
    print(serialized)
    # Got: {"timestamp":{"0":1695956,"1":1695956}}
    # Should be: {"timestamp":{"0":1695956154000,"1":1695956163000}}
    deserialized = pd.read_json(io.StringIO(serialized), convert_dates=["timestamp"])
    print(pd.to_datetime(deserialized["timestamp"], unit="ms"))
    # Got:
    # 0   1970-01-01 00:28:15.956
    # 1   1970-01-01 00:28:15.956
    # Instead of:
    # 0   2023-09-29 02:55:54
    # 1   2023-09-29 02:56:03
Issue Description
When a DataFrame contains a column with dtype datetime64[ms] and one converts it to JSON using df.to_json(), the serialized values are truncated and do not correspond to the original timestamps. Converting the JSON back therefore yields values near the epoch instead of the original dates. See the example above.
Expected Behavior
The JSON values for the timestamp column should correspond to the given date strings so that the correct date/time can be restored. It works when using datetime64[ns]; see the sketch below.
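For comparison, a minimal sketch reusing the example frame with the default nanosecond dtype, where the round trip works (the expected epoch-millisecond output is taken from the report above):

import io

import pandas as pd

# Same frame as in the report, but with nanosecond resolution.
df = pd.DataFrame(
    [
        ["2023-09-29 02:55:54"],
        ["2023-09-29 02:56:03"],
    ],
    columns=["timestamp"],
    dtype="datetime64[ns]",
)
serialized = df.to_json()
print(serialized)
# {"timestamp":{"0":1695956154000,"1":1695956163000}}
print(pd.read_json(io.StringIO(serialized), convert_dates=["timestamp"]))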
Installed Versions
Comment From: jbrockmendel
xref #55827
Comment From: Ricardus312
I encountered this same problem with the to_datetime64() function when upgrading to version 2.2.
According to the documentation, until version 2.0.3 the pandas.Timestamp.to_datetime64 function returned "a numpy.datetime64 object with 'ns' precision". Since version 2.1.4 that function returns "a numpy.datetime64 object with same precision".
This change has broken critical parts of my code that assumed conversion with nanosecond precision, and now, depending on the fractional-seconds precision of the argument passed, the precision of the returned value varies unpredictably.
I recommend downgrading to an older version of pandas to work around it.
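A hedged illustration of the change (assuming pandas >= 2.1, where Timestamp.to_datetime64 keeps the Timestamp's own unit; Timestamp.as_unit is one way to force nanosecond precision explicitly):

import pandas as pd

# On pandas >= 2.1, the returned numpy.datetime64 keeps the Timestamp's unit.
ts = pd.Timestamp("2023-09-29 02:55:54").as_unit("ms")
print(ts.to_datetime64())                # datetime64 with 'ms' precision
# Coercing to nanoseconds first restores the pre-2.1 behavior.
print(ts.as_unit("ns").to_datetime64())  # datetime64 with 'ns' precision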
Comment From: lithomas1
If you specify date_format="iso" in the to_json call, the round-tripping happens successfully.
(Note: the deserialized dtype will still be datetime64[ns] for values that are in range for datetime64[ns]. I don't know if we should be reading values that are in-bounds for datetime64[ns] as non-nano.)
It looks like we can do better with preserving a dtype for non-nano datetimes, though.
Running
import io

import pandas as pd

if __name__ == "__main__":
    df = pd.DataFrame(
        [
            ["1000-09-29 02:55:54"],
            ["1000-09-29 02:56:03"],
        ],
        columns=["timestamp"],
        dtype="datetime64[ms]",
    )
    serialized = df.to_json(date_format="iso")
    print(serialized)
    deserialized = pd.read_json(io.StringIO(serialized), convert_dates=["timestamp"])
    print(deserialized)
    print(deserialized.dtypes)
I get object dtype for the deserialized JSON timestamp column.
Output
{"timestamp":{"0":"1000-09-29T02:55:54.000","1":"1000-09-29T02:56:03.000"}}
timestamp
0 1000-09-29T02:55:54.000
1 1000-09-29T02:56:03.000
timestamp object
dtype: object
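One possible user-side recovery, sketched under the assumption that astype to non-nanosecond datetime units (supported since pandas 2.0) can parse the ISO strings back into a millisecond-resolution column:

# Convert the ISO strings back to a millisecond-resolution datetime column.
deserialized["timestamp"] = deserialized["timestamp"].astype("datetime64[ms]")
print(deserialized.dtypes)
# timestamp    datetime64[ms]
# dtype: object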
Comment From: Tikonderoga
Adding date_unit='ns' to the to_json call makes serialization work properly, even if the index or column dtype is still datetime64[ms].
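A minimal sketch of this workaround, applied to the frame from the original report:

import io

import pandas as pd

df = pd.DataFrame(
    [
        ["2023-09-29 02:55:54"],
        ["2023-09-29 02:56:03"],
    ],
    columns=["timestamp"],
    dtype="datetime64[ms]",
)
# Per the comment above, writing epoch values at nanosecond resolution
# avoids the truncation even though the column dtype stays datetime64[ms].
serialized = df.to_json(date_unit="ns")
print(serialized)
deserialized = pd.read_json(io.StringIO(serialized), convert_dates=["timestamp"])
print(deserialized)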