Pandas version checks

  • [x] I have checked that this issue has not already been reported.

  • [x] I have confirmed this bug exists on the latest version of pandas.

  • [x] I have confirmed this bug exists on the main branch of pandas.

Reproducible Example

import pandas as pd
import pyarrow as pa

decimal_type = pd.ArrowDtype(pa.decimal128(3, scale=2))

series = pd.Series([1, None], dtype=decimal_type)

pd.to_numeric(series, errors="coerce")

Issue Description

pandas.to_numeric fails to coerce a PyArrow decimal Series that contains NA values: the NA values get dropped internally, leading to a length mismatch between the converted values and the original index:

import pandas as pd
import pyarrow as pa

decimal_type = pd.ArrowDtype(pa.decimal128(3, scale=2))

series = pd.Series([1, None], dtype=decimal_type)

pd.to_numeric(series, errors="coerce")
---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[13], line 8
      4 decimal_type = pd.ArrowDtype(pa.decimal128(3, scale=2))
      6 series = pd.Series([1, None], dtype=decimal_type)
----> 8 pd.to_numeric(series, errors="coerce")

File /opt/homebrew/lib/python3.13/site-packages/pandas/core/tools/numeric.py:319, in to_numeric(arg, errors, downcast, dtype_backend)
    316         values = ArrowExtensionArray(values.__arrow_array__())
    318 if is_series:
--> 319     return arg._constructor(values, index=arg.index, name=arg.name)
    320 elif is_index:
    321     # because we want to coerce to numeric if possible,
    322     # do not use _shallow_copy
    323     from pandas import Index

File /opt/homebrew/lib/python3.13/site-packages/pandas/core/series.py:575, in Series.__init__(self, data, index, dtype, name, copy, fastpath)
    573     index = default_index(len(data))
    574 elif is_list_like(data):
--> 575     com.require_length_match(data, index)
    577 # create/copy the manager
    578 if isinstance(data, (SingleBlockManager, SingleArrayManager)):

File /opt/homebrew/lib/python3.13/site-packages/pandas/core/common.py:573, in require_length_match(data, index)
    569 """
    570 Check the length of data matches the length of the index.
    571 """
    572 if len(data) != len(index):
--> 573     raise ValueError(
    574         "Length of values "
    575         f"({len(data)}) "
    576         "does not match length of index "
    577         f"({len(index)})"
    578     )

ValueError: Length of values (1) does not match length of index (2)

This seems to be due to this conversion to a numpy type setting the dtype to object, which makes this condition evaluate to false, so the NA values are never re-added and the final values array ends up shorter than the original index.
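
To make the failure mode concrete, here is a minimal hand-written sketch (for illustration only, not the actual pandas source) of the drop-and-reinsert pattern: the NA positions are remembered in a mask, only the non-NA values are converted, and the NA slots are supposed to be written back afterwards. If that write-back step is skipped, the converted values (length 1 here) get paired with the original 2-element index, which is exactly the length mismatch in the traceback above.

import numpy as np
import pandas as pd

series = pd.Series([1, None], dtype="Float64")

mask = series.isna().to_numpy()                       # remember where the NAs were
converted = series[~mask].to_numpy(dtype="float64")   # convert only the non-NA values

# The write-back step: without it, `converted` would be handed straight to the
# Series constructor together with the original (longer) index.
result = np.full(len(series), np.nan, dtype="float64")
result[~mask] = converted
print(pd.Series(result, index=series.index))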

Expected Behavior

I'd expect the series to get converted (to values of decimal.Decimal type, with dtype=object) without raising an exception, preserving the null elements.
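
For concreteness, the expected output would look roughly like this hand-constructed Series (assembled manually for illustration; pd.to_numeric does not currently produce this for the example above):

import decimal
import pandas as pd

expected = pd.Series([decimal.Decimal("1.00"), None], dtype=object)
print(expected)
# 0    1.00
# 1    None
# dtype: object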

Installed Versions

INSTALLED VERSIONS
------------------
commit : 0691c5cf90477d3503834d983f69350f250a6ff7
python : 3.13.2
python-bits : 64
OS : Darwin
OS-release : 24.5.0
Version : Darwin Kernel Version 24.5.0: Tue Apr 22 19:53:27 PDT 2025; root:xnu-11417.121.6~2/RELEASE_ARM64_T6041
machine : arm64
processor : arm
byteorder : little
LC_ALL : en_CA.UTF-8
LANG : None
LOCALE : en_CA.UTF-8

pandas : 2.2.3
numpy : 2.2.2
pytz : 2025.1
dateutil : 2.9.0.post0
pip : 25.0
Cython : None
sphinx : None
IPython : 8.32.0
adbc-driver-postgresql : None
adbc-driver-sqlite : None
bs4 : 4.13.4
blosc : None
bottleneck : None
dataframe-api-compat : None
fastparquet : None
fsspec : 2025.2.0
html5lib : None
hypothesis : 6.125.2
gcsfs : None
jinja2 : 3.1.5
lxml.etree : None
matplotlib : 3.10.3
numba : None
numexpr : None
odfpy : None
openpyxl : None
pandas_gbq : None
psycopg2 : None
pymysql : None
pyarrow : 19.0.0
pyreadstat : None
pytest : None
python-calamine : None
pyxlsb : None
s3fs : None
scipy : 1.15.2
sqlalchemy : 2.0.38
tables : None
tabulate : None
xarray : 2025.1.2
xlrd : None
xlsxwriter : None
zstandard : 0.23.0
tzdata : 2025.1
qtpy : None
pyqt5 : None

Comment From: arthurlw

Confirmed on main. PRs and investigations are welcome. From a quick look, I do think the .dropna() from your link above is causing this issue.

Thanks for raising this!

Comment From: chilin0525

take

Comment From: simonjayhawkins

Expected Behavior

I'd expect the series to get converted (to values of decimal.Decimal type, with dtype=object) without raising an exception, preserving the null elements.

the docs for pandas.to_numeric state that "The default return dtype is float64 or int64 depending on the data supplied. Use the downcast parameter to obtain other dtypes."

the whole point of pandas.to_numeric is to "Convert argument to a numeric type." and the return is "Numeric if parsing succeeded."

So returning an object array does not seem appropriate?

Also note that a traditional object array does not properly support null values (#32931), so I'm not so sure that putting pd.NA values in an object array is ideal?
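
For comparison, here is how to_numeric already behaves for a nullable (masked) numeric input, as observed on pandas 2.x; it returns a proper numeric extension dtype and keeps the NA, rather than falling back to object:

import pandas as pd

s = pd.Series([1, None], dtype="Int64")
out = pd.to_numeric(s)
print(out.dtype)  # Int64: still a numeric dtype, NA preserved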

Comment From: kzvezdarov

Expected Behavior

I'd expect the series to get converted (to values of decimal.Decimal type, with dtype=object) without raising an exception, preserving the null elements.

the docs for pandas.to_numeric state that "The default return dtype is float64 or int64 depending on the data supplied. Use the downcast parameter to obtain other dtypes."

the whole point of pandas.to_numeric is to "Convert argument to a numeric type." and the return is "Numeric if parsing succeeded."

So returning an object array does not seem appropriate?

Also note that a traditional object array does not properly support null values (#32931), so I'm not so sure that putting pd.NA values in an object array is ideal?

Makes sense; to be honest that was just my best guess after inspecting the partially constructed output with a debugger.

Comment From: simonjayhawkins

@mroeschke @jorisvandenbossche

Matt, interested in your views on how this should behave today with the "arrow dtypes", and Joris on the future of Decimal types (or other new numeric-like types) in general.

Comment From: mroeschke

IMO, if an ExtensionDtype's _is_numeric is True, to_numeric should be a no-op for data passed with that dtype, including the arrow dtypes. So, alternatively, the float64 or int64 noted in the documentation should be expanded to cover all types that claim to be "numeric".
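
A rough sketch of that suggestion (a hypothetical helper, not a pandas API; ExtensionDtype._is_numeric is a private flag, and ArrowDtype reports True for decimal types):

import pandas as pd
import pyarrow as pa

def to_numeric_passthrough(series: pd.Series) -> pd.Series:
    # Hypothetical helper: if the dtype already claims to be numeric,
    # return the input unchanged instead of converting it.
    dtype = series.dtype
    if isinstance(dtype, pd.api.extensions.ExtensionDtype) and dtype._is_numeric:
        return series
    return pd.to_numeric(series)

s = pd.Series([1, None], dtype=pd.ArrowDtype(pa.decimal128(3, scale=2)))
print(to_numeric_passthrough(s))  # returned as-is, NAs and dtype preserved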

Comment From: chilin0525

Hi @simonjayhawkins @mroeschke, I've opened a PR for this issue and implemented the corresponding test case. Could you confirm whether the test result looks correct to you? Thanks 🙏

Comment From: Vernon-codes

take @mroeschke

Comment From: simonjayhawkins

@Vernon-codes there is already a PR open to address this issue: #61659