This article describes the setup steps for the LinkedIn Ads → Google BigQuery pipeline.
1. On the OWOX BI main page, click the 'New' button and choose 'Pipeline':
2. As a service provider, select 'LinkedIn Ads':
3. Choose the 'LinkedIn Ads → Google BigQuery' pipeline:
Then click the 'Create & Setup' button:
4. Select or add new access to the LinkedIn account from which you want to export ad cost data. Note that the provided access must include at least one LinkedIn ad account.
5. Select or add new access to a Google BigQuery account where you want to store the exported data:
6. Select a Google BigQuery project and the dataset you want to upload your data to (or create a new one):
Note: To set up the data collection, your Google account must be granted both the BigQuery Data Editor and BigQuery User roles for the destination project. Otherwise, BigQuery won’t let you upload the data.
To check or grant the permissions, go to the Identity and Access Management (IAM) page in your Google Cloud Platform project. Read more in the Google documentation.
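If you prefer to create the destination dataset outside the OWOX BI interface, or want to verify that your account has sufficient permissions, a minimal sketch with the google-cloud-bigquery Python client could look like the one below. The project ID, dataset name, and location are placeholders; use your own values.

```python
# Minimal sketch: create the destination dataset and check permissions.
# Assumes the google-cloud-bigquery package is installed and Application
# Default Credentials point to the same Google account used in OWOX BI.
# "my-gcp-project" and "linkedin_ads" are placeholder names.
from google.api_core.exceptions import Forbidden
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

dataset = bigquery.Dataset("my-gcp-project.linkedin_ads")
dataset.location = "US"  # pick the location where the data should be stored

try:
    client.create_dataset(dataset, exists_ok=True)
    print("Dataset is ready:", dataset.dataset_id)
except Forbidden:
    # A 403 here usually means the account is missing the
    # BigQuery Data Editor and/or BigQuery User roles on the project.
    print("Insufficient permissions: check the IAM roles on the project.")
```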
7. Specify the settings for your pipeline:
- The date that marks the beginning of the period you want to upload cost data for. You can set either a future or a past date. If you choose a past date, review the existing limitations on historical data import first.
- Default source/medium values you want to apply to the imported data. OWOX BI uses these defaults only if the actual UTM tags cannot be retrieved. Read more
Important: The combination of values google/organic for UTM source/medium is not allowed in the LinkedIn Ads → Google BigQuery pipeline.
At any time, you can change the source/medium settings on the pipeline page.
Note that the changes will be applied to the newly imported data. Historical data will be updated with the new source/medium values within the preset actualization window only.
- You can also connect your new pipeline to the Cost Data pipeline to attribute the imported ad costs to user sessions. Read more
8. Click the 'Create pipeline' button.
Done! The pipeline will upload data to a partitioned table in the selected BigQuery dataset. The data is uploaded daily, for the previous day.
OWOX BI updates the imported data during the preset actualization window if cost data changes retrospectively in your ad account.
The data schema can be found in this article.
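To spot-check what the pipeline has delivered, you can query the destination table directly. The sketch below is only an example: the dataset, table, and column names (`linkedin_ads`, `linkedin_ads_costs`, `date`, `cost`) are placeholders, so replace them with the actual names from your pipeline and the schema linked above.

```python
# Minimal sketch: sum yesterday's imported cost data.
# Table and column names are placeholders; adjust them to your schema.
from google.cloud import bigquery

client = bigquery.Client(project="my-gcp-project")

query = """
    SELECT date, SUM(cost) AS total_cost
    FROM `my-gcp-project.linkedin_ads.linkedin_ads_costs`
    WHERE date = DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
    GROUP BY date
"""

for row in client.query(query).result():
    print(row.date, row.total_cost)
```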
Within this pipeline, OWOX BI also processes UTM tags in ad links that were modified via URL tracking or shortening services (for example, ad.doubleclick.net, weborama, bit.ly).
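As an illustration of what UTM tag extraction involves (this is not OWOX BI's actual implementation), the sketch below pulls utm_source and utm_medium from a final landing-page URL once any redirects from a tracking or shortening service have been resolved, falling back to default values when the tags are missing.

```python
# Illustration only: extract UTM tags from a resolved landing-page URL,
# falling back to default source/medium values when the tags are absent.
from urllib.parse import parse_qs, urlparse

final_url = "https://example.com/landing?utm_source=linkedin&utm_medium=cpc"

params = parse_qs(urlparse(final_url).query)
utm_source = params.get("utm_source", ["(default source)"])[0]
utm_medium = params.get("utm_medium", ["(default medium)"])[0]
print(utm_source, utm_medium)
```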