# Your code here
import pandas as pd
import sqlalchemy

sql = 'select * from stg.first_table limit 10'
pg_engine = sqlalchemy.create_engine('postgresql+psycopg2://localhost:3433/TEST')

pg_df = pd.read_sql(sql, con=pg_engine)

print(pg_df)  # success, type(pg_df) == <class 'pandas.core.frame.DataFrame'>

# pg_df.to_sql('stg.temp_20170731', pg_engine, if_exists='replace', index=False, chunksize=10)

pd.io.sql.to_sql(pg_df, 'stg.temp_20170731', pg_engine,
                 if_exists='replace', index=False, chunksize=10)  # finishes without any errors, but the
                                                                  # table stg.temp_20170731 does not exist

Problem description

I'm trying to write a pandas DataFrame to a PostgreSQL database using to_sql. The command completes without any errors, but the table stg.temp_20170731 still does not exist.
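One quick way to see where (or whether) the table actually ended up is to list the tables per schema with SQLAlchemy's inspector. A minimal, self-contained sketch, reusing the connection URI from the snippet above (illustrative only):

import sqlalchemy

# Build an engine and ask the inspector which tables exist in each schema.
insp = sqlalchemy.inspect(
    sqlalchemy.create_engine('postgresql+psycopg2://localhost:3433/TEST'))
print(insp.get_table_names(schema='stg'))     # tables in the stg schema
print(insp.get_table_names(schema='public'))  # tables in the default schema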

Comment From: jorisvandenbossche

@Alex-qin Thanks for the report. It is difficult to say much based on that alone (we cannot reproduce it, and a reproducible example for SQL is always difficult). Could you run the same code as above, but with echo=True when you create the engine (see http://docs.sqlalchemy.org/en/latest/core/engines.html#engine-creation-api)? That will print a lot of output during the operations, which you can then paste here.

The stg. prefix, is this some kind of schema? Do you see another table added somewhere?
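For reference, a minimal sketch of the suggested run with echoing enabled, reusing the connection URI and table names from the report above (illustrative only, not the reporter's actual setup):

import pandas as pd
import sqlalchemy

# echo=True makes SQLAlchemy log every statement it sends, so the output
# shows whether a CREATE TABLE and the INSERTs are actually emitted.
pg_engine = sqlalchemy.create_engine(
    'postgresql+psycopg2://localhost:3433/TEST', echo=True)

pg_df = pd.read_sql('select * from stg.first_table limit 10', con=pg_engine)
pd.io.sql.to_sql(pg_df, 'stg.temp_20170731', pg_engine,
                 if_exists='replace', index=False, chunksize=10)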

Comment From: Alex-qin

@jorisvandenbossche Thanks! My problem is solved; it was my fault. The schema prefix 'stg.' is not necessary. The working code: pdsql.to_sql(pg_df, 'temp_20170731', if_exists='replace', index=False, chunksize=10).
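For readers hitting the same thing, a hedged sketch of the two variants, using the names from this thread (the stg schema and connection URI are only illustrative). If you do want the table inside the stg schema, pass it via the separate schema keyword of to_sql rather than embedding it in the table name:

import pandas as pd
import sqlalchemy

pg_engine = sqlalchemy.create_engine('postgresql+psycopg2://localhost:3433/TEST')
pg_df = pd.read_sql('select * from stg.first_table limit 10', con=pg_engine)

# Variant 1: no schema prefix; the table lands in the default schema
# (typically "public" on PostgreSQL).
pg_df.to_sql('temp_20170731', pg_engine,
             if_exists='replace', index=False, chunksize=10)

# Variant 2: keep the stg schema by passing it via the schema keyword.
pg_df.to_sql('temp_20170731', pg_engine, schema='stg',
             if_exists='replace', index=False, chunksize=10)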

Comment From: jorisvandenbossche

Good to hear it is solved!

Comment From: Shree2404

Do we need to create the table before inserting? Won't it be created automatically when we upload data from a Python DataFrame?

Comment From: beloved23

Use the snippet below.

self.read_file.to_sql("table_name", con=engine, schema='SCHEME_NAME', index=False, if_exists="replace")

Then, in your PostgreSQL database:

select * from SCHEME_NAME."table_name";