Before you begin
To load data into BigQuery, you must set up billing and ensure that you have read access to the data source and write access to the destination table.
- Sign in to your Google account. If you don't already have one, sign up for a new account.
- Select or create a Cloud Platform project.
- Enable billing for your project.
- Note the path to the Google Cloud Storage bucket containing the data you are loading into BigQuery. The path takes the following form: gs://[BUCKET]/[OBJECT]. For information on viewing buckets and objects in Google Cloud Storage, see Quickstart Using the Console.
- Ensure that you have read access to your data source. If you are loading content from Google Cloud Storage, and you are an owner of the project that contains your data source, you probably have read access. To set read access on a Cloud Storage object, see Creating and Managing Access Control Lists (ACLs) in the Cloud Storage documentation.
- Ensure that you have write access to your destination table. If you are the owner of the dataset that contains your destination table, you probably have write access.
To set write access to a dataset in BigQuery:
- Go to the BigQuery web UI.
- In the navigation, hover on a dataset ID, click the down arrow icon next to the ID, and click Share dataset.
- Add a person, give that person edit access, then click Save changes.
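The gs://[BUCKET]/[OBJECT] paths mentioned above split into a bucket name and an object path; a minimal sketch of that split (parse_gcs_uri is a hypothetical helper for illustration, not part of any Google library):

```python
def parse_gcs_uri(uri):
    """Split a gs://[BUCKET]/[OBJECT] path into its bucket and object parts.

    parse_gcs_uri is a hypothetical helper shown for illustration only.
    """
    prefix = "gs://"
    if not uri.startswith(prefix):
        raise ValueError("not a Cloud Storage URI: " + uri)
    # Everything up to the first "/" is the bucket; the rest is the object.
    bucket, _, obj = uri[len(prefix):].partition("/")
    return bucket, obj

print(parse_gcs_uri("gs://mybucket/info.csv"))  # → ('mybucket', 'info.csv')
```

Note that the object part may itself contain slashes, since Cloud Storage object names can look like nested paths.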
Loading data from Google Cloud Storage
BigQuery supports loading data from these storage classes:
- Multi-Regional
- Regional
- Nearline
- Standard
- Durable Reduced Availability
To load data from Google Cloud Storage:
Web UI
- Go to the BigQuery web UI.
- In the navigation, hover on a dataset ID, click the down arrow icon next to the ID, and click Create new table.
- Under Source Data, select Google Cloud Storage for the Location.
- Specify the location of the source data using the path:
gs://[BUCKET]/[OBJECT]
- Under Destination Table, enter a value for the destination table name.
- In the Schema section, input the table's schema.
- Click the Create Table button.
Command-line
Use the bq load command and include a Cloud Storage URI for the source
argument:
bq load [DATASET].[TABLE_NAME] [PATH_TO_SOURCE] [SCHEMA]
where:
- [DATASET].[TABLE_NAME] is a fully-qualified table name, where [DATASET] represents an existing dataset.
- [PATH_TO_SOURCE] is a fully-qualified Cloud Storage URI.
- [SCHEMA] is a valid schema. The schema can be a local file, or it can be typed inline as part of the command.
For example, each of the following commands loads data from Cloud Storage:
bq load ds.new_tbl gs://mybucket/info.csv ./info_schema.json
bq load ds.small gs://mybucket/small.csv name:integer,value:string
bq load ds.small gs://mybucket/small.csv field1,field2,field3
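The inline [SCHEMA] form in the last two commands corresponds to the JSON schema-file form used in the first (such as ./info_schema.json). A rough sketch of that mapping, assuming untyped fields default to STRING as they do for bq load (inline_schema_to_json is a hypothetical helper, not part of the bq tool):

```python
import json

def inline_schema_to_json(inline):
    """Convert a bq inline schema such as "name:integer,value:string"
    into the JSON schema-file form accepted by bq load.

    Hypothetical helper for illustration; fields given without a type
    (e.g. "field1,field2,field3") default to STRING.
    """
    fields = []
    for part in inline.split(","):
        name, _, ftype = part.partition(":")
        fields.append({"name": name, "type": (ftype or "string").upper()})
    return fields

print(json.dumps(inline_schema_to_json("name:integer,value:string"), indent=2))
```

Writing the result to a file and passing that file's path as [SCHEMA] is equivalent to typing the schema inline.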
C#
For more on installing and creating a BigQuery client, refer to BigQuery Client Libraries.
Go
For more on installing and creating a BigQuery client, refer to BigQuery Client Libraries.
Java
For more on installing and creating a BigQuery client, refer to BigQuery Client Libraries.
Node.js
For more on installing and creating a BigQuery client, refer to BigQuery Client Libraries.
PHP
For more on installing and creating a BigQuery client, refer to BigQuery Client Libraries.
Python
For more on installing and creating a BigQuery client, refer to BigQuery Client Libraries.
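As one illustration, a minimal Python sketch of a Cloud Storage load using the google-cloud-bigquery client library. This is a sketch under assumptions (placeholder dataset and table names, schema auto-detection instead of an explicit schema), not an official sample:

```python
def load_csv_from_gcs(dataset_id, table_id, gcs_uri):
    """Load a CSV object from Cloud Storage into a BigQuery table.

    Sketch only: assumes `pip install google-cloud-bigquery` and
    application default credentials. The import is local to the
    function so the module parses without the package installed.
    """
    from google.cloud import bigquery

    client = bigquery.Client()
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        autodetect=True,  # let BigQuery infer the schema from the CSV
    )
    # The destination may be given as a "dataset.table" string; the
    # project defaults to the client's project.
    load_job = client.load_table_from_uri(
        gcs_uri, f"{dataset_id}.{table_id}", job_config=job_config
    )
    load_job.result()  # block until the load job completes
    return load_job
```

Here autodetect stands in for an explicit schema; in practice you can pass a schema in job_config just as the command-line examples do.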
Ruby
For more on installing and creating a BigQuery client, refer to BigQuery Client Libraries.
For more information about loading data with a POST request, see loading data with a POST request.
What's next
- Learn how to stream data one record at a time.
- Learn about supported data formats.
- Learn about querying data.
- Learn about exporting data.