gbq.py currently returns an error if the result of a query is what Google considers 'large'. The Google API allows jobs to be sent with a flag that permits large results. It would be very beneficial to expose this as an option in the BigQuery connector.

Comment From: jorisvandenbossche

cc @jacobschaer @sean-schaefer

Comment From: ghost

I've made an improvement to the to_gbq function in Pull Request #10857. The pull request hasn't been merged yet, but keep an eye on it.

The improvement adds a destination_table argument to the to_gbq function, so that you can redirect the results of a query directly to a BigQuery table instead of a pandas DataFrame. When destination_table is specified, I set allowLargeResults to true.

Note: a destination table is required to enable large results in BigQuery. The Google BigQuery documentation for allowLargeResults is available at the following link, under the heading "Returning large query results": https://cloud.google.com/bigquery/querying-data?hl=en
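To make the coupling concrete, here is a minimal sketch of the jobs.insert request body the BigQuery REST API expects when large results are enabled. The field names (allowLargeResults, destinationTable) come from the BigQuery API; the project, dataset, and table names are placeholders, and the query itself is only illustrative:

```python
# Sketch of a BigQuery jobs.insert request body with large results enabled.
# allowLargeResults is only honored when destinationTable is also set,
# which is why the proposed pandas argument ties the two together.
job_body = {
    "configuration": {
        "query": {
            "query": "SELECT word FROM [publicdata:samples.shakespeare]",
            "allowLargeResults": True,
            # Placeholder identifiers; a real call needs your own
            # project, dataset, and table names.
            "destinationTable": {
                "projectId": "my-project",
                "datasetId": "my_dataset",
                "tableId": "query_results",
            },
        }
    }
}
```

This body would be passed to the jobs().insert() call of the Google API client; the point is simply that the flag and the destination table travel together in the same query configuration.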

Comment From: jacobschaer

Agree with @parthea1. The reason we didn't add it originally was that we didn't want to support creating tables. We were also concerned about how to expose this intuitively to users who aren't very familiar with BigQuery, since when reading large results you are also writing them to a table. I'm not particularly interested in implementing this feature myself, but I agree it would be nice to have.

Comment From: parthea

@jreback Can this be closed, or do you expect a PR for this?

See my related PR here: #11209

Issue #13531 is also a duplicate of this one. Setting allowLargeResults requires a destination table, so you are effectively saving the result to a gbq table. If we decide to support this, it would be trivial to save the results to a destination table without setting allowLargeResults.

Comment From: jreback

moved issue to https://github.com/pydata/pandas-gbq/issues/15