Description
Thanks for stopping by to let us know something could be better!
PLEASE READ: If you have a support contract with Google, please create an issue in the support console instead of filing on GitHub. This will ensure a timely response.
Is your feature request related to a problem? Please describe.
I'd like a way to run a BigQuery emulator as a local Docker container which I can use for local integration testing.
Describe the solution you'd like
I'm able to achieve something similar with GCS using https://github.com/oittaa/gcp-storage-emulator and setting the STORAGE_EMULATOR_HOST environment variable, which is read in the storage Client constructor and used to set the api_endpoint (see here and here). I've found I can construct a BQ Client using something like bigquery.Client(client_options={"api_endpoint": "http://localhost:8080"}) and the calls will be redirected to that host/port. However, that requires changing every piece of code that creates a client, whereas with the storage environment variable I can redirect without any code changes. I'd like this library to implement similar logic: check for a BIGQUERY_EMULATOR_HOST (or similarly named) environment variable and use it to set the api_endpoint the same way STORAGE_EMULATOR_HOST does, so that I could then stub out a mock BigQuery emulator to use in testing.
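To illustrate, here is a minimal sketch of the requested behavior as a user-side helper. The BIGQUERY_EMULATOR_HOST variable name is an assumption (mirroring STORAGE_EMULATOR_HOST); the library does not currently read it.

```python
import os

def emulator_client_options(env_var="BIGQUERY_EMULATOR_HOST"):
    """Return client_options pointing at an emulator if the (assumed)
    environment variable is set, otherwise None (use the real endpoint)."""
    host = os.environ.get(env_var)
    if host:
        return {"api_endpoint": host}
    return None

# Usage sketch (requires google-cloud-bigquery):
# from google.cloud import bigquery
# client = bigquery.Client(client_options=emulator_client_options())
```

Ideally this check would live inside the bigquery Client constructor itself, so no calling code needs to change.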
Describe alternatives you've considered
As described above, I've looked at changing the client constructors in my code base to pass api_endpoint, and that is an acceptable workaround, but a consistent way to specify an alternate endpoint via an environment variable seems better.
Additional context
Hopefully the links above are sufficient extra context.