• [x] I have checked that this issue has not already been reported.

  • [x] I have confirmed this bug exists on the latest version of pandas. (only in current master; dropna=False didn't exist previously)

  • [x] (optional) I have confirmed this bug exists on the master branch of pandas.


Code Sample, a copy-pastable example

>>> import pandas as pd
>>> df = pd.DataFrame(
...     {
...         'a': [pd.to_datetime('2019-02-12'), pd.to_datetime('2019-02-12'),
...               pd.to_datetime('2019-02-13'), pd.NaT],
...         'b': [1, 2, 3, 4],
...     }
... )
>>>
>>> dfg = df.groupby(['a'], dropna=False)
>>> len(dfg)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "./venv/pandas-1/lib64/python3.8/site-packages/pandas/core/groupby/groupby.py", line 539, in __len__
    return len(self.groups)
  File "./venv/pandas-1/lib64/python3.8/site-packages/pandas/core/groupby/groupby.py", line 558, in groups
    return self.grouper.groups
  File "pandas/_libs/properties.pyx", line 33, in pandas._libs.properties.CachedProperty.__get__
  File "./venv/pandas-1/lib64/python3.8/site-packages/pandas/core/groupby/ops.py", line 257, in groups
    return self.groupings[0].groups
  File "pandas/_libs/properties.pyx", line 33, in pandas._libs.properties.CachedProperty.__get__
  File "./venv/pandas-1/lib64/python3.8/site-packages/pandas/core/groupby/grouper.py", line 598, in groups
    return self.index.groupby(Categorical.from_codes(self.codes, self.group_index))
  File "./venv/pandas-1/lib64/python3.8/site-packages/pandas/core/arrays/categorical.py", line 606, in from_codes
    dtype = CategoricalDtype._from_values_or_dtype(
  File "./venv/pandas-1/lib64/python3.8/site-packages/pandas/core/dtypes/dtypes.py", line 365, in _from_values_or_dtype
    dtype = CategoricalDtype(categories, ordered)
  File "./venv/pandas-1/lib64/python3.8/site-packages/pandas/core/dtypes/dtypes.py", line 252, in __init__
    self._finalize(categories, ordered, fastpath=False)
  File "./venv/pandas-1/lib64/python3.8/site-packages/pandas/core/dtypes/dtypes.py", line 406, in _finalize
    categories = self.validate_categories(categories, fastpath=fastpath)
  File "./venv/pandas-1/lib64/python3.8/site-packages/pandas/core/dtypes/dtypes.py", line 579, in validate_categories
    raise ValueError("Categorical categories cannot be null")
ValueError: Categorical categories cannot be null

Problem description

When using groupby with dropna=False, calling len() on the resulting GroupBy object raises a ValueError. This works fine in older versions of pandas (0.25.x), which always dropped NA rows, and even on the current master (1.1.0.dev0+2054.gc15f08084) len(...) still works fine when dropna=True is used. The interesting bit (for me at least) is that categoricals are somehow involved even though there are no categoricals in the DataFrame to begin with: the traceback shows .groups building a Categorical from the group index, and with dropna=False that index contains NaT, which Categorical categories do not allow. Perhaps this is a code path that was not meant to be executed.

Expected Output

No exception should be raised (there isn't any categorical involved, after all), and len(...) doesn't modify anything, so an exception is entirely unexpected; len(dfg) should simply return the number of groups.
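
For reference, the expected result on the example above (assuming the NaT row forms its own group, which is the point of dropna=False: two distinct dates plus the NaT group):

>>> len(dfg)
3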

Output of pd.show_versions()

>>> pd.show_versions()

INSTALLED VERSIONS
------------------
commit           : c15f0808428b5cdb4eaea290fb26a71c3534630b
python           : 3.8.3.final.0
python-bits      : 64
OS               : Linux
OS-release       : 5.7.7-200.fc32.x86_64
Version          : #1 SMP Wed Jul 1 19:53:01 UTC 2020
machine          : x86_64
processor        : x86_64
byteorder        : little
LC_ALL           : None
LANG             : en_AU.UTF-8
LOCALE           : en_AU.UTF-8

pandas           : 1.1.0.dev0+2054.gc15f08084
numpy            : 1.19.0
pytz             : 2020.1
dateutil         : 2.8.1
pip              : 20.1
setuptools       : 46.1.3
Cython           : None
pytest           : None
hypothesis       : None
sphinx           : None
blosc            : None
feather          : None
xlsxwriter       : None
lxml.etree       : None
html5lib         : None
pymysql          : None
psycopg2         : None
jinja2           : None
IPython          : None
pandas_datareader: None
bs4              : None
bottleneck       : None
fsspec           : None
fastparquet      : None
gcsfs            : None
matplotlib       : None
numexpr          : None
odfpy            : None
openpyxl         : None
pandas_gbq       : None
pyarrow          : None
pytables         : None
pyxlsb           : None
s3fs             : None
scipy            : None
sqlalchemy       : None
tables           : None
tabulate         : None
xarray           : None
xlrd             : 1.2.0
xlwt             : None
numba            : None

Comment From: jorisvandenbossche

@ssche thanks for the report! (cc @charlesdong1991)

Comment From: charlesdong1991

take

Comment From: HalfWhitt

Hello, may I ask if anyone's figured out yet how to fix this? I just encountered the bug myself and was rather mystified. Is the dropna=False option even usable at all, perhaps with a workaround?

Comment From: ssche

@HalfWhitt I think you can use dfg.ngroups instead of len(dfg) (note: this is a very specific error and should not be conflated with the general use of dropna=False).
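
A minimal sketch of that workaround on the example from the report (the assumption being that ngroups takes a code path that never materialises the .groups mapping, so it avoids the failing Categorical construction):

>>> dfg = df.groupby(['a'], dropna=False)
>>> dfg.ngroups  # counts groups without building dfg.groups
3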

Comment From: HalfWhitt

@ssche Oh, interesting... I don't actually need len() myself, but I was trying to do list(dfg)... After finding that it, dfg.groups, and (as mentioned here) len(dfg) all threw the same error, I just assumed the groups object didn't work at all with dropna=False.

But you prompted me to look at it again, and now that I've tried iterating over it and found that that works, I guess I can just do [group for group in dfg] instead, which feels a little silly (and really makes me wonder how iteration can work while list(dfg) fails), but works perfectly well I suppose.
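
A likely explanation (an assumption based on CPython behaviour, not verified against the pandas source): list(dfg) pre-sizes the result by calling the object's __len__ as a length hint, which is exactly the method that raises here, whereas a plain loop or comprehension only needs __iter__. Note that iterating a GroupBy yields (name, group) pairs:

>>> # works: only __iter__ is called, never the failing __len__
>>> groups = [grp for name, grp in dfg]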

Comment From: nir-ml

How about converting the datetime column to a string column before using groupby?

>>> df['a'] = df['a'].astype(str)
>>> dfg = df.groupby(['a'], dropna=False)
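
One caveat worth noting with this approach: astype(str) converts NaT into the literal string 'NaT' (since str(pd.NaT) == 'NaT'), so the missing values become an ordinary group keyed by that string rather than a true NA group, and dropna no longer has any effect on them.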

Comment From: rhshadrach

I now get 3 in the OP, could use tests.

Comment From: ssche

I now get 3 in the OP, could use tests.

Still happening in 2.1.1 and 2.2.3.

Comment From: rhshadrach

@ssche - the fix is in main, will be released in 3.0.

Comment From: palbha

@rhshadrach - Do we mark this as closed if the fix is in main already?

Comment From: rhshadrach

@palbha - no, this still needs tests.

Comment From: asharmalik19

I would like to add the necessary tests for this issue.

Comment From: asharmalik19

take