When doing some data analysis with pandas, the Python shell crashes and I get "Segmentation fault (core dumped)". This is a little hard to reproduce since it only happens when the workload is somewhat heavy. But basically, I'm just reading DataFrames and using things like groupby(), apply(), .loc[x, y], itertuples() and regular slicing. I'm not writing to or updating a DataFrame, just reading and transforming.
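To make the report more concrete, here is a minimal sketch of the kinds of read-only operations involved. The data and column names are hypothetical stand-ins for the real dataset, which I can't share:

```python
import pandas as pd
import numpy as np

# Hypothetical data standing in for the real (heavier) dataset.
df = pd.DataFrame({
    "group": np.repeat(["a", "b", "c"], 4),
    "x": np.arange(12, dtype=float),
    "y": np.arange(12, dtype=float) ** 2,
})

# Read-only transformations of the kind described above.
means = df.groupby("group")["x"].mean()                         # groupby + aggregation
scaled = df.groupby("group")["y"].apply(lambda s: s / s.max())  # groupby + apply
cell = df.loc[0, "x"]                                           # label-based access via .loc[x, y]
head = df[:5]                                                   # regular slicing
total = sum(row.x for row in df.itertuples())                   # iteration with itertuples()
```

On the small frame above everything runs fine; the crash only appears with the real, heavier workload.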

Also, I did some memory profiling and there appears to be no memory leak (peak memory is stable at ~166MB).
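For reference, a minimal way to check peak memory usage from the standard library, assuming Linux (where `ru_maxrss` is reported in kilobytes; on macOS it is bytes):

```python
import resource

import numpy as np
import pandas as pd

# A stand-in workload: repeated read-only groupby aggregations.
df = pd.DataFrame(np.random.randn(100000, 4), columns=list("abcd"))
for _ in range(10):
    df.groupby(df["a"] > 0).mean()

# Peak resident set size of this process so far (KB on Linux).
peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print("peak RSS: %.1f MB" % (peak_kb / 1024.0))
```

A stable peak across repeated runs of the workload is what suggested there was no leak.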

Output of pd.show_versions():

commit: None
python: 2.7.6.final.0
python-bits: 64
OS: Linux
OS-release: 3.19.0-56-generic
machine: x86_64
processor: x86_64
byteorder: little
LC_ALL: None
LANG: en_US.UTF-8

pandas: 0.17.0
nose: 1.3.4
pip: 6.0.6
setuptools: 20.3.1
Cython: 0.23.4
numpy: 1.11.0
scipy: 0.16.1
statsmodels: 0.6.1
IPython: 4.1.2
sphinx: None
patsy: 0.4.1
dateutil: 2.4.0
pytz: 2014.10
blosc: None
bottleneck: None
tables: None
numexpr: None
matplotlib: 1.5.0
openpyxl: 2.3.1
xlrd: None
xlwt: None
xlsxwriter: None
lxml: 3.4.0
bs4: None
html5lib: None
httplib2: None
apiclient: None
sqlalchemy: 0.9.8
pymysql: None
psycopg2: 2.5.4 (dt dec pq3 ext)

Comment From: jreback

this is not a very useful report. If you can put up a minimal reproducible example then we can see what the issue might be. Further, you can try upgrading and see if that solves the issue.