This article describes how to import data from your CRM/ERP systems, mailing services, or other online or offline sources into Google BigQuery.
To import data to Google BigQuery, use one of the following methods:
- Upload files in CSV or JSON format using the BigQuery UI or the command line
- SDKs for .NET, Java, PHP, and Python
- ETL applications
- The ODBC driver from CData
- Upload data from Google Sheets using the OWOX BI BigQuery Reports add-on
Consider the encoding and delimiters
Google BigQuery supports the UTF-8 (default) and ISO-8859-1 encodings. The supported delimiters are comma (default) and tab.
For more details, see the Google documentation.
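If your export doesn't already match these defaults, you can re-encode it locally before uploading. A minimal sketch using only Python's standard library, assuming a semicolon-delimited ISO-8859-1 input file (the file names are placeholders):

```python
import csv

def to_bigquery_csv(src_path, dst_path):
    """Convert a semicolon-delimited ISO-8859-1 CSV into the
    BigQuery defaults: UTF-8 encoding, comma delimiter."""
    with open(src_path, newline="", encoding="iso-8859-1") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        reader = csv.reader(src, delimiter=";")
        writer = csv.writer(dst)  # comma-delimited by default
        for row in reader:
            writer.writerow(row)
```

If your source file already uses UTF-8 and commas, no conversion is needed.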
The recommended table schema
To import data correctly and analyze it later, make sure you follow the recommended Google BigQuery table structure. Refer to our documentation on schema definition for your data type:
Uploading data via Google BigQuery web interface (using transaction data as an example)
We recommend uploading data via the BigQuery web UI. To do that, prepare a CSV file with your data and upload it following the instructions below.
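A transaction file like this can be assembled with Python's stdlib `csv` module. The column names below are illustrative, not a required schema:

```python
import csv

# Illustrative transaction rows; in practice these come from your CRM/ERP export.
transactions = [
    {"transaction_id": "T-1001", "user_id": "U-42", "revenue": "9.99"},
    {"transaction_id": "T-1002", "user_id": "U-43", "revenue": "19.50"},
]

with open("transactions.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(
        f, fieldnames=["transaction_id", "user_id", "revenue"]
    )
    writer.writeheader()  # header row; BigQuery can skip it on import
    writer.writerows(transactions)
```

Keeping a header row is convenient for readability; you'll tell BigQuery to skip it in the Advanced options step below.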
- Open the Google BigQuery web interface
- Create a new dataset in your Google BigQuery project. Select the project in the Resources section, then click Create dataset in the window to the right:
- Specify the Dataset ID and Data location. Note: the data location must be the same as the location of all the other data you want to blend in OWOX BI.
Then, click Create dataset:
- Create a new table: Click the 'plus' icon to the right of the dataset name:
- Select Create table from: Upload, then specify the file location and format (CSV), and enter a name for the new table:
- Define the data schema in the Schema section. You can add fields one by one by clicking Add Field and following this data schema, or click Edit as Text and paste the full schema in JSON format (copy it from the snippet right after the screenshot below):
Copy the schema in JSON here:
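A BigQuery JSON schema is a list of field objects, each with at least a name and a type. A small sketch that sanity-checks such a schema before you paste it; the example schema and its field names are illustrative, not the snippet from this article:

```python
import json

# A subset of BigQuery field types, enough for this illustration.
VALID_TYPES = {"STRING", "INTEGER", "FLOAT", "NUMERIC",
               "BOOLEAN", "TIMESTAMP", "DATE", "RECORD"}

def check_schema(schema_json):
    """Return the parsed schema if every field has a name and a known type."""
    fields = json.loads(schema_json)
    for field in fields:
        assert field.get("name"), "each field needs a name"
        assert field["type"] in VALID_TYPES, f"unknown type: {field.get('type')}"
    return fields

# Illustrative schema only:
example = '''[
  {"name": "transaction_id", "type": "STRING", "mode": "REQUIRED"},
  {"name": "user_id",        "type": "STRING", "mode": "NULLABLE"},
  {"name": "revenue",        "type": "FLOAT",  "mode": "NULLABLE"}
]'''
```

Running a check like this locally catches typos (a missing comma, a misspelled type) before the BigQuery UI rejects the paste.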
- Set Advanced options. Set Header rows to skip to 1 so the data is imported without errors. You can also specify the number of errors allowed, so that table creation doesn't fail on a few bad rows:
After you click Create table, the data is uploaded and the table appears in BigQuery:
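The same upload can also be scripted with the `bq` command-line tool mentioned earlier. A sketch that assembles such a command; the dataset, table, and file names are placeholders, and the command should only be run after authenticating with the Google Cloud SDK:

```python
def build_bq_load_command(dataset, table, csv_path, schema_path,
                          skip_rows=1, max_bad=0):
    """Assemble a `bq load` command mirroring the UI steps above:
    skip_leading_rows corresponds to "Header rows to skip", and
    max_bad_records to the number of errors allowed."""
    return [
        "bq", "load",
        "--source_format=CSV",
        f"--skip_leading_rows={skip_rows}",
        f"--max_bad_records={max_bad}",
        f"{dataset}.{table}",
        csv_path,
        schema_path,
    ]

# Placeholder names for illustration:
cmd = build_bq_load_command("my_dataset", "transactions",
                            "transactions.csv", "schema.json")
```

Passing the command to `subprocess.run(cmd)` (or printing and running it in a shell) performs the same load as the web UI walkthrough above.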