How to upload transaction data into Google BigQuery?

There are several methods of importing transaction data from your CRM to Google BigQuery:

  1. Uploading files in CSV or JSON format via the BigQuery web UI or the bq command-line tool
  2. A set of Python scripts from OWOX that lets you automate data import to Google BigQuery using Google Cloud Functions
  3. SDKs for .NET, Java, PHP, and Python
  4. The ODBC driver from CData
  5. ETL applications
  6. Uploading data from Google Sheets using the OWOX BI BigQuery Reports add-on

We recommend the first method: uploading data via the BigQuery web UI. This article describes that method step by step.

Consider the encoding and delimiters

Google BigQuery supports the UTF-8 (default) and ISO-8859-1 encodings. The supported delimiters are comma (default) and tab. See the Google documentation for more details.
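
If you later load the same file programmatically (for example, with the google-cloud-bigquery Python client), the encoding and delimiter are set on the load job configuration. The sketch below is illustrative only; the non-default values are needed only if your CSV export is not UTF-8 and comma-delimited:

    from google.cloud import bigquery

    # Illustrative load configuration for a CSV export that uses
    # ISO-8859-1 encoding and tab delimiters (both non-default values).
    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        encoding="ISO-8859-1",   # UTF-8 is the default
        field_delimiter="\t",    # comma is the default
    )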

The recommended table schema

To import data on transactions, products, and customers from a CRM, ERP, or any other accounting system to Google BigQuery, we recommend using this table schema. This ensures that all your data is taken into account in OWOX BI.

Uploading data via Google BigQuery web interface

You can upload data manually. To do that, prepare a CSV file with your data and upload it via the BigQuery web interface.

  1. Open the Google BigQuery web interface.
  2. Create a new dataset in your Google BigQuery project: click the down arrow icon to the right of the project name, then click Create new dataset.
  3. Create a new table: click the plus icon to the right of the dataset name.
  • Check Create from source, then specify the file location (File upload) and format (CSV), and enter a name for the new table.
  • Define the data schema. You can add fields one by one by clicking Add Field and following the recommended schema, or click Edit as Text and paste the full schema in JSON format (copy it from the snippet below).

Copy the schema in JSON here:
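
As an illustration only, here is a minimal sketch of a schema in the JSON format BigQuery expects. The field names below (transactionId, transactionDateTime, userId, product, price, quantity) are placeholders, not the OWOX BI recommendation; replace them with the fields from the recommended schema:

    [
        {"name": "transactionId",       "type": "STRING",    "mode": "REQUIRED"},
        {"name": "transactionDateTime", "type": "TIMESTAMP", "mode": "REQUIRED"},
        {"name": "userId",              "type": "STRING",    "mode": "NULLABLE"},
        {"name": "product",             "type": "STRING",    "mode": "NULLABLE"},
        {"name": "price",               "type": "FLOAT",     "mode": "NULLABLE"},
        {"name": "quantity",            "type": "INTEGER",   "mode": "NULLABLE"}
    ]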

  • Set additional Options. Set Header rows to skip (for example, 1 if your CSV file contains a header row) so the data is imported without errors. You can also specify the number of errors allowed in the uploaded data or skip rows to avoid inconsistencies.
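
    The same two options are available when loading programmatically. Below is a minimal sketch with the google-cloud-bigquery Python client; the destination table my-project.crm_data.transactions and the file name transactions.csv are placeholders:

    from google.cloud import bigquery

    client = bigquery.Client()  # assumes credentials are already configured

    job_config = bigquery.LoadJobConfig(
        source_format=bigquery.SourceFormat.CSV,
        skip_leading_rows=1,   # "Header rows to skip"
        max_bad_records=10,    # number of bad rows tolerated before the job fails
    )

    # Placeholder project, dataset, table, and file names.
    with open("transactions.csv", "rb") as source_file:
        load_job = client.load_table_from_file(
            source_file,
            "my-project.crm_data.transactions",
            job_config=job_config,
        )

    load_job.result()  # waits for the load job to finish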


    After you click Create Table, the table is uploaded to BigQuery and you can use it as a data source in OWOX BI Attribution and Smart Data.
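
To confirm that the upload succeeded, you can run a quick row count against the new table, for example with the same Python client (the table name is the same placeholder as above):

    from google.cloud import bigquery

    client = bigquery.Client()
    # "my-project.crm_data.transactions" is a placeholder table name.
    query = "SELECT COUNT(*) AS row_count FROM `my-project.crm_data.transactions`"
    for row in client.query(query).result():
        print(row.row_count)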