Code Sample, a copy-pastable example if possible

In [2]: s = pd.Series(dtype='object')

In [3]: s.loc['myint'] = 1

In [4]: s.loc['myfloat'] = 2.

In [5]: s
Out[5]: 
myint      1.0
myfloat    2.0
dtype: float64

Problem description

When the first value is added to an empty Series, pandas infers a dtype from that value and sets it on the Series ( https://github.com/pandas-dev/pandas/issues/19576#issuecomment-363875752 ). This can be nice... except when the user passed a specific dtype at construction.

Empty Series should (by default) have no dtype set (or an "Any" dtype), and inference should only be done in that case; see the sketch below.
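
For contrast, a minimal sketch of the two cases under the behaviour reported above (the variable names and the final astype re-cast are illustrative, not part of the original report):

import pandas as pd

# Case 1: no dtype requested -- inferring the dtype from the first value is fine.
s1 = pd.Series()
s1.loc['x'] = 1.0          # becomes float64, which is the desired inference

# Case 2: an explicit dtype was requested -- it should be kept, not re-inferred.
s2 = pd.Series(dtype='object')
s2.loc['x'] = 1.0          # with the pandas version reported here, this also ends up float64

# Possible workaround under the current behaviour: re-cast after assignment.
s2 = s2.astype(object)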

Expected Output
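
Presumably the Series should keep the dtype requested at construction, i.e. something like:

In [5]: s
Out[5]: 
myint        1
myfloat    2.0
dtype: object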

Output of pd.show_versions()

INSTALLED VERSIONS
------------------
commit: None
python: 3.5.3.final.0
python-bits: 64
OS: Linux
OS-release: 4.9.0-5-amd64
machine: x86_64
processor: 
byteorder: little
LC_ALL: None
LANG: it_IT.UTF-8
LOCALE: it_IT.UTF-8

pandas: 0.19.2
nose: 1.3.7
pip: 9.0.1
setuptools: 33.1.1
Cython: 0.25.2
numpy: 1.13.3
scipy: 0.18.1
statsmodels: None
xarray: None
IPython: 5.1.0
sphinx: 1.4.9
patsy: 0.4.1+dev
dateutil: 2.5.3
pytz: 2016.7
blosc: None
bottleneck: 1.2.0
tables: 3.3.0
numexpr: 2.6.1
matplotlib: 2.0.0
openpyxl: 2.3.0
xlrd: 1.0.0
xlwt: None
xlsxwriter: 0.9.6
lxml: 3.7.1
bs4: 4.5.3
html5lib: 0.999999999
httplib2: 0.9.2
apiclient: None
sqlalchemy: 1.0.15
pymysql: None
psycopg2: None
jinja2: 2.8
boto: None
pandas_datareader: None