Suggestion: introduce a function, something like pandas.sql_list_tables(conn), to look up the available tables in a database. It would be a shortcut for
pd.read_sql("SELECT * FROM sqlite_master WHERE type='table'", conn, index_col=["name"])
or
pd.read_sql("SELECT name FROM sqlite_master WHERE type='table'", conn),
similar to this R module.
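For illustration, a minimal sketch of what such a helper could look like for a plain sqlite3 connection; the name sql_list_tables is only this suggestion, not an existing pandas API.

```python
import sqlite3

import pandas as pd


def sql_list_tables(conn):
    # Hypothetical helper (name taken from the suggestion above, not a pandas API):
    # query the sqlite_master catalog for user tables and return their names.
    return pd.read_sql(
        "SELECT name FROM sqlite_master WHERE type='table'", conn
    )["name"].tolist()


conn = sqlite3.connect("example.db")  # placeholder path to an existing SQLite file
print(sql_list_tables(conn))
```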
Comment From: sornars
I was looking into this, and while implementing it I wondered whether it is strictly necessary. If you're using an SQLAlchemy engine, you can call engine.table_names() to get this information. As far as I can tell, the only DBAPI-compatible interface pandas supports directly is sqlite, in which case you'd need to run the query above.
Is the intention to wrap engine.table_names() in SQLDatabase, create an equivalent method in SQLiteDatabase that runs the query supplied by @DSLituiev, and expose both behind a new interface in the pandas.io.sql module?
Edit: For reference: https://github.com/sornars/pandas/commit/3411d7bcf392fe0846444b5cf073ea383e027582
If this is what's intended then I'll write up some tests and open up a PR.
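A rough sketch of that split, for context only: the class and method names here are illustrative stand-ins, not the actual pandas internals, and engine.table_names() is the SQLAlchemy API of that era (newer SQLAlchemy uses sqlalchemy.inspect(engine).get_table_names()).

```python
class SQLDatabaseSketch:
    """Illustrative stand-in for the SQLAlchemy-backed pandas.io.sql.SQLDatabase."""

    def __init__(self, engine):
        self.engine = engine

    def list_tables(self):
        # Defer to the engine; table_names() existed in SQLAlchemy < 1.4,
        # replaced later by sqlalchemy.inspect(self.engine).get_table_names().
        return self.engine.table_names()


class SQLiteDatabaseSketch:
    """Illustrative stand-in for the DBAPI-backed pandas.io.sql.SQLiteDatabase."""

    def __init__(self, con):
        self.con = con

    def list_tables(self):
        # Fall back to the sqlite_master query from the original suggestion.
        cur = self.con.execute(
            "SELECT name FROM sqlite_master WHERE type='table'"
        )
        return [row[0] for row in cur.fetchall()]
```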
Comment From: DSLituiev
Thank you @sornars! Looks good to me.
Comment From: mroeschke
Yeah, the pandas SQL functionality is for reading and writing data. Database introspection can be done with SQLAlchemy, so closing.
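For anyone landing here, the SQLAlchemy route looks roughly like this (the database URL is a placeholder):

```python
from sqlalchemy import create_engine, inspect

engine = create_engine("sqlite:///example.db")  # placeholder URL
print(inspect(engine).get_table_names())
```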