Pandas version checks

  • [X] I have checked that this issue has not already been reported.

  • [X] I have confirmed this bug exists on the latest version of pandas.

  • [ ] I have confirmed this bug exists on the main branch of pandas.

Reproducible Example

```
>>> import pandas as pd
>>> import numpy as np
>>> pd.__version__
'2.2.1'
>>> a = pd.DataFrame(np.random.randint(2, size=(3,4))).astype(pd.SparseDtype(int, fill_value=0))
>>> a
   0  1  2  3
0  0  0  1  0
1  0  1  0  1
2  0  1  1  0
dtype: Sparse[int64, 0]
>>> (a>0).sum(axis=1)
0    True
1    True
2    True
dtype: Sparse[bool, False]
>>> b = pd.DataFrame(np.random.randint(2, size=(3,4)))
>>> (b>0).sum(axis=1)
0    3
1    4
2    2
dtype: int64
>>> import pandas as pd
>>> import numpy as np
>>> pd.__version__
'1.5.3'
>>> a = pd.DataFrame(np.random.randint(2, size=(3,4))).astype(pd.SparseDtype(int, fill_value=0))
>>> a
   0  1  2  3
0  1  1  0  0
1  0  0  1  0
2  0  0  1  1
>>> (a>0).sum(axis=1)
0    2
1    1
2    2
dtype: int64
>>> b = pd.DataFrame(np.random.randint(2, size=(3,4)))
>>> (b>0).sum(axis=1)
0    1
1    4
2    1
dtype: int64

```

Issue Description

Summing a sparse boolean DataFrame along an axis returns Sparse[bool] rather than an integer dtype, so the counts are collapsed to True/False.

Expected Behavior

I would expect the sum of a sparse boolean array to be an int, matching the behavior on a dense array.
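
Until this is resolved, one way to sidestep the issue is to densify before reducing. A minimal sketch of that workaround (using fixed data instead of the random input above so the expected counts are deterministic; this is an illustration, not the eventual pandas-internals fix):

```python
import numpy as np
import pandas as pd

# Fixed data so the expected row counts are known in advance.
data = np.array([[0, 0, 1, 0],
                 [0, 1, 0, 1],
                 [0, 1, 1, 0]])
a = pd.DataFrame(data).astype(pd.SparseDtype(int, fill_value=0))

# (a > 0) is Sparse[bool]; densify it first so the row sums are computed
# on a plain bool DataFrame and come back as int64 counts.
counts = (a > 0).sparse.to_dense().sum(axis=1)
print(counts.tolist())  # [1, 2, 2]
```

This trades away the memory savings of the sparse representation for the duration of the reduction, which is usually acceptable for a one-off aggregation.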

Installed Versions

This issue is observed when upgrading from 1.5.3 to 2.2.1.

Comment From: CompRhys

Potentially this is an edge case of https://pandas.pydata.org/docs/dev/whatsnew/v2.1.0.html#dataframe-reductions-preserve-extension-dtypes. It could also be intended behavior, but it seems very counter-intuitive to me.

Comment From: rhshadrach

Thanks for the report - having the result be Sparse[bool] does look incorrect to me; I would think it should be Sparse[int]. Another potential source of the change is #54341; a git bisect is needed to tell.

Further investigations and PRs to fix are welcome!

Comment From: dontgoto

take

Comment From: dontgoto

I would take a look at this issue if that's ok with you, @CompRhys.

Comment From: CompRhys

I wouldn't know where to start in the internals, so I'm very grateful if you would like to tackle it! @dontgoto