Note: This pipeline is available for the subscription plans Marketing Data, Reports & Attribution, and Enterprise.
1. On the OWOX BI main page, click Create pipeline:
2. As a source, select hotline.ua:
3. As a destination, select Google BigQuery:
4. Select or add new access to a hotline.ua account from which you want to export ad cost data:
5. Select or add new access to a Google BigQuery account where you want to store the exported data:
6. Select a Google BigQuery project and create a dataset you want to upload your data to (or choose an existing one):
Note: To set up the data collection successfully, your Google account must have the "BigQuery Data Editor" and "BigQuery User" roles in the project where you want to collect data. Without these roles, BigQuery won't let you upload the data.
7. Specify the date that marks the beginning of the period you want to upload cost data for:
You can schedule the start of the import on a date in the future or select a past date to upload historical data from your hotline.ua account.
Also at this step, specify the VAT rate to exclude from the exported costs. This keeps your hotline.ua cost data consistent with cost data from other services, where VAT is excluded automatically.
Finally, select the currency you want your cost data to be additionally converted into.
The pipeline uploads ad cost data into two separate fields in your BigQuery data table. One field contains costs in the original currency — just as they appear in the ad service. The other field contains costs converted into the currency you select when creating the pipeline.
The converted costs are useful in reports where you need cost data from different sources presented in a uniform currency.
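To make the relationship between the two cost fields concrete, here is a minimal sketch of the transformation described above. The function name, field names, and rates are hypothetical illustrations, not OWOX BI's actual implementation or schema:

```python
# Hypothetical sketch: exclude VAT from a gross cost, then convert the
# net cost into the currency selected when creating the pipeline.

def prepare_cost_row(gross_cost: float, vat_rate: float, fx_rate: float) -> dict:
    """Return a row with the original-currency cost (VAT excluded)
    and the same cost converted into the selected currency.

    gross_cost -- cost as reported by hotline.ua, VAT included
    vat_rate   -- VAT rate specified for the pipeline, e.g. 0.20 for 20%
    fx_rate    -- exchange rate from the original currency to the selected one
    """
    net_cost = round(gross_cost / (1 + vat_rate), 2)  # exclude VAT
    return {
        "cost": net_cost,                                # original currency
        "cost_converted": round(net_cost * fx_rate, 2),  # selected currency
    }

row = prepare_cost_row(gross_cost=120.0, vat_rate=0.20, fx_rate=0.025)
print(row)  # {'cost': 100.0, 'cost_converted': 2.5}
```

For example, a 120.00 UAH gross cost with a 20% VAT rate becomes 100.00 in the original-currency field, and the converted field holds that net amount multiplied by the exchange rate.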
8. Click Create pipeline.
Done! The pipeline will upload data to a partitioned table in the selected BigQuery dataset. Data is uploaded daily, covering the previous day.
OWOX BI will also update data already uploaded to BigQuery if the data in the hotline.ua account changes retrospectively. The update period for historical data is 21 days, so 21 days after the import, the data in BigQuery is fully up to date.
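The 21-day update window can be sketched as a simple date check. This is an illustration only; the helper name is hypothetical, and the exact boundary behavior is an assumption:

```python
from datetime import date, timedelta

# OWOX BI re-checks historical data for this many days after import.
REFRESH_WINDOW_DAYS = 21

def is_final(report_date: date, today: date) -> bool:
    """Return True once a day's data is past the 21-day update window
    and can be treated as final in BigQuery (boundary assumed exclusive)."""
    return today - report_date > timedelta(days=REFRESH_WINDOW_DAYS)

# Data from 31 days ago is past the window; data from 12 days ago may still change.
print(is_final(date(2024, 1, 1), today=date(2024, 2, 1)))   # True
print(is_final(date(2024, 1, 20), today=date(2024, 2, 1)))  # False
```

In practice, this means you should expect figures for roughly the last three weeks to be restated as hotline.ua updates them.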
You can find the table's data schema in this article.