Code Sample, a copy-pastable example if possible

import datetime

import pandas as pd

# First, create a `Series` whose dtype is int64, float64, or object (any dtype except datetime64).
series = pd.Series({'value': 123})
# >>> series
# value    123
# dtype: int64

# Assigning a datetime64 value to a new label coerces it to int64.
series.at['date'] = datetime.datetime.now()
# >>> series
# value                    123
# date     1520605566114680000
# dtype: object

# Assigning to an existing label does not coerce the value.
series.at['date'] = datetime.datetime.now()
# >>> series
# value                           123
# date     2018-03-09 14:26:19.945935
# dtype: object

Problem description

When a datetime64 value is assigned to a new label of a non-datetime64 Series, it is implicitly converted to int64. Assigning to an existing label does not convert it.

Expected Output

A datetime64 value should never be implicitly converted to int64.
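Until this is fixed, one possible workaround (an assumption on my part, not an official recommendation) is to cast the Series to object dtype before adding the datetime-valued label, so there is no numeric dtype for the value to be coerced into:

```python
import datetime

import pandas as pd

series = pd.Series({'value': 123})          # dtype: int64
series = series.astype(object)              # sidestep numeric coercion
series.at['date'] = datetime.datetime.now()

# The datetime is stored unchanged instead of being coerced to int64.
print(series)
```

This trades away the int64 dtype of the original Series, which may or may not be acceptable depending on the use case.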

Output of pd.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.6.4.final.0
python-bits: 64
OS: Darwin
OS-release: 17.4.0
machine: x86_64
processor: i386
byteorder: little
LC_ALL: None
LANG: zh_CN.UTF-8
LOCALE: zh_CN.UTF-8

pandas: 0.22.0
pytest: None
pip: 9.0.1
setuptools: 38.5.2
Cython: None
numpy: 1.14.1
scipy: 1.0.0
pyarrow: None
xarray: None
IPython: None
sphinx: None
patsy: None
dateutil: 2.6.1
pytz: 2018.3
blosc: None
bottleneck: None
tables: None
numexpr: None
feather: None
matplotlib: 2.2.0
openpyxl: None
xlrd: 1.1.0
xlwt: None
xlsxwriter: 1.0.2
lxml: None
bs4: 4.6.0
html5lib: None
sqlalchemy: None
pymysql: 0.7.11.None
psycopg2: None
jinja2: None
s3fs: None
fastparquet: None
pandas_gbq: None
pandas_datareader: None

Comment From: jreback

Thanks, duplicate of #6942. This was fixed for DataFrames in the last release, but Series takes a slightly different path in some cases. A PR to fix this would be welcome.