{"id":1930,"date":"2022-10-26T14:32:23","date_gmt":"2022-10-26T12:32:23","guid":{"rendered":"https:\/\/webmasterei-prange.de\/transfer-google-ads-api-data-to-bigquery\/"},"modified":"2025-10-06T10:17:27","modified_gmt":"2025-10-06T08:17:27","slug":"transfer-google-ads-api-data-to-bigquery","status":"publish","type":"post","link":"https:\/\/webmasterei-prange.de\/en\/transfer-google-ads-api-data-to-bigquery\/","title":{"rendered":"Transfer Google Ads API data to BigQuery"},"content":{"rendered":"\n<p>The article shows the transfer of Google Ads API data to Google BigQuery using custom reporting, which can be configured individually.<\/p>\n\n<h2 class=\"wp-block-heading\" id=\"toc_Mogliche_Ubertragungswege\">Possible transmission routes<\/h2>\n\n<p>You can transfer the Google Ads API data into BigQuery in several ways. One option is to create an ETL (Extract, Transform, Load) process using a third-party ETL tool. These tools extract the data from the Google Ads API, convert it to the right format, and then upload it to BigQuery.<\/p>\n\n<p>Another way is through 2 existing <a href=\"https:\/\/cloud.google.com\/bigquery\/docs\/google-ads-transfer\" target=\"_blank\" rel=\"noreferrer noopener\">Google Transfer Services for Google Ads<\/a>, which already come with BigQuery. Unfortunately, the API version of the service often lags behind here and so you don&#8217;t get the latest data types that Google has introduced. Especially with regard to Pmax campaigns and their asset groups, this is currently a limitation that is worth overcoming.<\/p>\n\n<p>A simple approach to transferring data to BigQuery is an Apps Script, which can be adapted to your own data needs.<\/p>\n\n<h2 class=\"wp-block-heading\" id=\"toc_Apps_Script_zur_Ubertragung_von_Google_Ads_API\">Apps Script for transferring Google Ads API data to BigQuery<\/h2>\n\n<p>The script is installed in the Google Apps Script environment and executed daily. 
During execution, the script reads the configured tables and queries them via the Google Ads API. The data is then transferred to Google BigQuery.<\/p>\n\n<h2 class=\"wp-block-heading\" id=\"toc_Vor_und_Nachteile_dieser_Losung\">Advantages and disadvantages of this solution<\/h2>\n\n<p>Advantages:<\/p>\n\n<ul class=\"wp-block-list\">\n<li>Flexible: you can respond to API innovations individually.<\/li>\n\n\n\n<li>Data-efficient: only the configured tables are transferred.<\/li>\n\n\n\n<li>Cost-effective, as no additional infrastructure needs to be set up.<\/li>\n<\/ul>\n\n<p>Disadvantages:<\/p>\n\n<ul class=\"wp-block-list\">\n<li>The daily transfer volume is limited by API quotas.<\/li>\n\n\n\n<li>If the data is needed in real time, this solution is not suitable.<\/li>\n\n\n\n<li>The data is stored in BigQuery as delivered by the API, so additional reporting must be set up to prepare the data for analysis.<\/li>\n<\/ul>\n\n<h2 class=\"wp-block-heading\" id=\"toc_Installation\">Installation<\/h2>\n\n<ol class=\"wp-block-list\">\n<li>Install script<\/li>\n\n\n\n<li>Set the transfer window<\/li>\n\n\n\n<li>Specify data with the GAQL Query Builder<\/li>\n\n\n\n<li>Start the transfer<\/li>\n<\/ol>\n\n<h3 class=\"wp-block-heading\" id=\"toc_Script_installieren\">Install script<\/h3>\n\n<p>Copy the script below into a Google Ads script or an Apps Script project. All script configuration takes place in the CONFIG variable. 
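<\/p>\n\n<p>The CONFIG values also control how retrieved rows are batched before upload: the script serializes each row as JSON and loads the result in chunks of CHUNK_SIZE lines. Here is a standalone sketch of that chunk-and-serialize step (plain JavaScript; the sample rows are illustrative, not real API output):<\/p>\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"js\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">\/\/ Hypothetical sample rows standing in for Google Ads API results.\nvar rows = [\n  {campaign_id: 1, metrics_clicks: 10},\n  {campaign_id: 2, metrics_clicks: 5},\n  {campaign_id: 3, metrics_clicks: 7}\n];\n\n\/\/ Split the rows into chunks, as the script does with CHUNK_SIZE.\nfunction chunkArray(arr, chunkSize) {\n  var res = [];\n  for (var i = 0; i &lt; arr.length; i += chunkSize) {\n  res.push(arr.slice(i, i + chunkSize));\n  }\n  return res;\n}\n\nvar chunks = chunkArray(rows, 2); \/\/ two chunks: [row1, row2] and [row3]\n\/\/ Each chunk becomes newline-delimited JSON for one BigQuery load job.\nvar ndJson = chunks[0].map(JSON.stringify).join('\\n');<\/pre>\n\n<p>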
The rest should only be edited if you know what you are doing.<\/p>\n\n<p>In the script environment, BigQuery must also be enabled as an advanced service.<\/p>\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"891\" height=\"635\" src=\"https:\/\/webmasterei-prange.de\/wp-content\/uploads\/2025\/09\/Google-Ads-Script-erweiterte-APIs.png.webp\" alt=\"Google Ads script advanced APIs\" class=\"wp-image-1249\"\/><\/figure>\n\n<p>Then the script&#8217;s CONFIG variable must be configured.<\/p>\n\n<h3 class=\"wp-block-heading\" id=\"toc_Variablen_konfigurieren\">Configure variables<\/h3>\n\n<p>All settings the script needs are made in the CONFIG variable. The variable names should be self-explanatory; otherwise, the comment behind each one helps.<\/p>\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">BIGQUERY_PROJECT_ID: 'BQPID', \/\/ BigQuery Project ID\nBIGQUERY_DATASET_ID: 'BGDSID', \/\/ BigQuery Dataset ID\nLOCATION: 'EU', \/\/ Location of the dataset\n\nEXPERIATION: 2 * 31536000000, \/\/ Data expiration time in milliseconds\nDEBUG: { WRITE: true}, \/\/ If WRITE is false, no data is sent to BigQuery. Useful during configuration. 
\n\n\/\/ Truncate existing data, otherwise data will be appended.\nTRUNCATE_EXISTING_DATASET: false, \/\/ If set to true, the dataset will be cleared before the data transfer\nTRUNCATE_EXISTING_TABLES: false, \/\/ If set to true, tables will be cleared before the data transfer\n\n\/\/ Back up reports to Google Drive.\nWRITE_DATA_TO_DRIVE: false, \/\/ DEPRECATED\n\/\/ Folder to put all the intermediate files.\nDRIVE_FOLDER: 'INSERT_FOLDER_NAME', \/\/ DEPRECATED\n\n\/\/ Default date range over which statistics fields are retrieved.\nDEFAULT_DATE_RANGE: 'YESTERDAY', \/\/ EXAMPLES: 'YESTERDAY' OR '2022-08-01,2022-09-07'\nDEFAULT_DATA_LIMIT: '',\nCHUNK_SIZE: 5000, \/\/ Number of lines the script loads to BigQuery in one insert operation. Lower this value if you get errors here. 
<\/pre>\n\n<h3 class=\"wp-block-heading\" id=\"toc_Daten_festlegen_mit_dem_GAQL_Query_Builder\">Specify data with the GAQL Query Builder<\/h3>\n\n<p>The API calls are defined in the CONFIG.REPORTS array. Each call is an object with the variables shown in the table and in the example. For building the API calls I strongly recommend using the <strong>\n  <a href=\"https:\/\/developers.google.com\/google-ads\/api\/fields\/v11\/overview_query_builder\" target=\"_blank\" rel=\"noreferrer noopener\">Google Ads API Query Builder<\/a>\n<\/strong>, as it shows which fields may be combined in the first place.<\/p>\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Variable<\/th><th>Value<\/th><\/tr><\/thead><tbody><tr><td>NAME<\/td><td>The table name in the Google Ads API, e.g. &#8216;<a href=\"https:\/\/developers.google.com\/google-ads\/api\/fields\/v11\/asset_group_product_group_view\" target=\"_blank\" rel=\"noreferrer noopener\">asset_group_product_group_view<\/a>&#8217;<\/td><\/tr><tr><td>CONDITIONS<\/td><td>A \u201cWHERE\u201d statement can be inserted here<\/td><\/tr><tr><td>FIELDS<\/td><td>The fields required in the report.<\/td><\/tr><tr><td>RETURN_FIELDS<\/td><td>Fields that the API sends without being requested. Necessary to create the table fields in BigQuery.<\/td><\/tr><tr><td>PARTITION_FIELD<\/td><td>Necessary if the segments_date field is to be used for partitioning. 
Leave blank in all other cases.<\/td><\/tr><\/tbody><\/table><\/figure>\n\n<p>Additionally, each field must be defined as<\/p>\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">'field.name':{'format':'INTEGER','alias':'field_name'}<\/pre>\n\n<figure class=\"wp-block-table\"><table class=\"has-fixed-layout\"><thead><tr><th>Variable<\/th><th>Value<\/th><\/tr><\/thead><tbody><tr><td>FieldName<\/td><td>The name of the field as shown in the <a href=\"https:\/\/developers.google.com\/google-ads\/api\/fields\/v11\/overview\" target=\"_blank\" rel=\"noreferrer noopener\">Google Ads API.<\/a><\/td><\/tr><tr><td>Format<\/td><td><a href=\"https:\/\/cloud.google.com\/bigquery\/docs\/reference\/standard-sql\/data-types\" target=\"_blank\" rel=\"noreferrer noopener\">Data types as defined for BigQuery.<\/a><\/td><\/tr><tr><td>Alias<\/td><td>A conversion of the field name that works as a column name in Google BigQuery.<\/td><\/tr><\/tbody><\/table><\/figure>\n\n<p>The following code example shows how the fields are used to build the actual query during the API call.<\/p>\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">var query =  \n  'SELECT ' + fieldNames.join(',') +\n  ' FROM ' + reportConfig.NAME + ' ' + reportConfig.CONDITIONS<\/pre>\n\n<p>Here is an example of a fully configured call to &#8216;ad_group_ad&#8217;:<\/p>\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">{\n  NAME: 'ad_group_ad',\n  CONDITIONS: '',\n  FIELDS: 
{\n\/\/ Fields have the API name in the first object level. They inherit the alias name and the format for the field in BigQuery tables.   \n  'campaign.id':{'format':'INTEGER','alias':'campaign_id'},\n  'ad_group.id':{'format':'INTEGER','alias':'adGroup_id'},\n  'ad_group_ad.ad.responsive_search_ad.descriptions':{'format':'JSON','alias':'adGroupAd_ad_responsive_search_ad_descriptions'},\n  'ad_group_ad.ad.responsive_search_ad.headlines':{'format':'JSON','alias':'adGroupAd_ad_responsive_search_ad_headlines'},\n  'ad_group_ad.ad_strength':{'format':'STRING','alias':'adGroupAd_ad_strength'},\n  'segments.date':{'format':'DATE','alias':'segments_date'},  \n  'campaign.name':{'format':'STRING','alias':'campaign_name'},\n  'ad_group.name':{'format':'STRING','alias':'adGroup_name'},\n  'ad_group_ad.resource_name':{'format':'STRING','alias':'adGroupAd_resource_name'},\n  'ad_group.resource_name':{'format':'STRING','alias':'adGroup_resource_name'},\n  'metrics.clicks':{'format':'INTEGER','alias':'metrics_clicks'},  \n  'metrics.conversions':{'format':'FLOAT64','alias':'metrics_conversions'},  \n  'metrics.impressions':{'format':'INTEGER','alias':'metrics_impressions'},  \n  'metrics.cost_micros':{'format':'INTEGER','alias':'metrics_cost_micros'}  \n  },\n  RETURN_FIELDS: {\n\/\/ Return fields are fields we did not request, but the API delivers them anyway. Needed for correct field mapping towards BigQuery. \n  'adGroupAd_ad':{'alias':'adGroupAd_ad','format':'JSON'},\n  'adGroupAd_ad.id':{'format':'INTEGER','alias':'adGroupAd_ad_id'}\n  },\n  PARTITION_FIELD:'segments_date',\n  },<\/pre>\n\n<h3 class=\"wp-block-heading\" id=\"toc_Ubertragung_starten\">Start the transfer<\/h3>\n\n<p>If all components are configured correctly, the Google Ads API call can be started.<\/p>\n\n<h2 class=\"wp-block-heading\" id=\"toc_Das_komplette_Script_zur_freien_Nutzung\">The complete script for free use:<\/h2>\n\n<p>Here I publish the script as a complete version. 
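<\/p>\n\n<p>One detail worth understanding before reading the full listing: when the script reads result rows, it derives each BigQuery column name by joining the result segment (e.g. metrics, adGroup) with the camelCase element name the API returns, split on uppercase letters. A standalone sketch of that conversion (plain JavaScript; the function name and sample values are illustrative):<\/p>\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"js\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">\/\/ Convert a camelCase API element name into the snake_case column name,\n\/\/ prefixed with its segment, mirroring the transformation in the script.\nfunction toColumnName(segment, element) {\n  var name = element.split(\/(?=[A-Z])\/).join('_').toLowerCase();\n  return segment + '_' + name;\n}\n\nvar col = toColumnName('metrics', 'costMicros'); \/\/ 'metrics_cost_micros'<\/pre>\n\n<p>This matches aliases such as 'metrics_cost_micros' and 'adGroup_id' used in the report configurations above.<\/p>\n\n<p>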
I am looking forward to comments, feedback, improvements and, of course, your extensions of the retrieval configurations in CONFIG.REPORTS.<\/p>\n\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"js\" data-enlighter-theme=\"\" data-enlighter-highlight=\"\" data-enlighter-linenumbers=\"\" data-enlighter-lineoffset=\"\" data-enlighter-title=\"\" data-enlighter-group=\"\">var CONFIG = {\n  BIGQUERY_PROJECT_ID: 'BQPID', \/\/ BigQuery Project ID\n  BIGQUERY_DATASET_ID: 'BGDSID', \/\/ BigQuery Dataset ID\n  LOCATION: 'EU', \/\/ Location of the dataset\n\n  EXPERIATION: 2 * 31536000000, \/\/ Data expiration time in milliseconds\n  DEBUG: { WRITE: true}, \/\/ If WRITE is false, no data is sent to BigQuery. Useful during configuration. \n\n  \/\/ Truncate existing data, otherwise data will be appended.\n  TRUNCATE_EXISTING_DATASET: false, \/\/ If set to true, the dataset will be cleared before the data transfer\n  TRUNCATE_EXISTING_TABLES: false, \/\/ If set to true, tables will be cleared before the data transfer\n\n  \/\/ Back up reports to Google Drive.\n  WRITE_DATA_TO_DRIVE: false, \/\/ DEPRECATED\n  \/\/ Folder to put all the intermediate files.\n  DRIVE_FOLDER: 'INSERT_FOLDER_NAME', \/\/ DEPRECATED\n\n  \/\/ Default date range over which statistics fields are retrieved.\n  DEFAULT_DATE_RANGE: 'YESTERDAY', \/\/ EXAMPLES: 'YESTERDAY' OR '2022-08-01,2022-09-07'\n  DEFAULT_DATA_LIMIT: '',\n  CHUNK_SIZE: 5000, \/\/ Number of lines the script loads to BigQuery in one insert operation. Lower this value if you get errors here. 
\n\n  \/\/ Lists of reports and fields to retrieve from Google Ads.\n  REPORTS: [\n  {\n  NAME: 'search_term_view',\n  CONDITIONS: '',\n  FIELDS: {\n  'campaign.id':{'format':'INTEGER','alias':'campaign_id'},\n  'campaign.name':{'format':'STRING','alias':'campaign_name'},\n  'campaign.status':{'format':'STRING','alias':'campaign_status'},\n  'campaign.advertising_channel_type':{'format':'STRING','alias':'campaign_advertising_channel_type'},\n  'search_term_view.search_term':{'format':'STRING','alias':'searchTermView_search_term'},\n  'search_term_view.status':{'format':'STRING','alias':'searchTermView_status'},\n  'segments.date':{'format':'DATE','alias':'segments_date'},\n  'metrics.clicks':{'format':'INT64','alias':'metrics_clicks'},\n  'metrics.conversions':{'format':'FLOAT64','alias':'metrics_conversions'},\n  'metrics.conversions_value':{'format':'FLOAT64','alias':'metrics_conversions_value'},\n  'metrics.cost_micros':{'format':'INT64','alias':'metrics_cost_micros'},\n  'metrics.impressions':{'format':'INT64','alias':'metrics_impressions'},\n  },\n  RETURN_FIELDS: {\n\/\/ Return fields are fields we did not request, but the API delivers them anyway. Needed for correct field mapping towards BigQuery. \n  'adGroupAd_ad':{'alias':'adGroupAd_ad','format':'JSON'},\n  'adGroupAd_ad.id':{'format':'INTEGER','alias':'adGroupAd_ad_id'}\n  },\n  PARTITION_FIELD:'segments_date',\n  }\n  ],\n};\n\n\/\/ Impose a limit on the size of BQ inserts: 10MB - 512Kb for overheads.\nvar MAX_INSERT_SIZE = 10 * 1024 * 1024 - 512 * 1024;\n\/**\n  * Main method\n  *\/\nfunction main() {\n  createDataset();\n  for (var i = 0; i  &lt;  CONFIG.REPORTS.length; i++) {\n  var reportConfig = CONFIG.REPORTS[i];\n  createTable(reportConfig);\n  }\n\n  var jobIds = processReports();\n  waitTillJobsComplete(jobIds);\n}\n\n\n\/**\n  * Creates a new dataset.\n  *\n  * If a dataset with the same id already exists and the truncate flag\n  * is set, will truncate the old dataset. 
If the truncate flag is not \n  * set, then will not create a new dataset.\n  *\/\nfunction createDataset() {\n  if (datasetExists()) {\n  if (CONFIG.TRUNCATE_EXISTING_DATASET) {\n  BigQuery.Datasets.remove(CONFIG.BIGQUERY_PROJECT_ID,\n  CONFIG.BIGQUERY_DATASET_ID, {'deleteContents' : true});\n  Logger.log('Truncated dataset.');\n  } else {\n  Logger.log('Dataset %s already exists. Will not recreate.',\n  CONFIG.BIGQUERY_DATASET_ID);\n  return;\n  }\n  }\n\n  \/\/ Create new dataset.\n  var dataSet = BigQuery.newDataset();\n  dataSet.friendlyName = CONFIG.BIGQUERY_DATASET_ID;\n  dataSet.datasetReference = BigQuery.newDatasetReference();\n  dataSet.datasetReference.projectId = CONFIG.BIGQUERY_PROJECT_ID;\n  dataSet.datasetReference.datasetId = CONFIG.BIGQUERY_DATASET_ID;\n  dataSet.location = CONFIG.LOCATION\n  dataSet = BigQuery.Datasets.insert(dataSet, CONFIG.BIGQUERY_PROJECT_ID);\n  Logger.log('Created dataset with id %s.', dataSet.id);\n}\n\n\/**\n  * Checks if dataset already exists in project.\n  *\n  * @return  {boolean}  Returns true if dataset already exists.\n  *\/\nfunction datasetExists() {\n  \/\/ Get a list of all datasets in project.\n  var datasets = BigQuery.Datasets.list(CONFIG.BIGQUERY_PROJECT_ID);\n  var datasetExists = false;\n  \/\/ Iterate through each dataset and check for an id match.\n  if (datasets.datasets != null) {\n  for (var i = 0; i  &lt;  datasets.datasets.length; i++) {\n  var dataset = datasets.datasets[i];\n  if (dataset.datasetReference.datasetId == CONFIG.BIGQUERY_DATASET_ID) {\n  datasetExists = true;\n  break;\n  }\n  }\n  }\n  return datasetExists;\n}\n\n\/**\n  * Creates a new table.\n  *\n  * If a table with the same id already exists and the truncate flag\n  * is set, will truncate the old table. 
If the truncate flag is not \n  * set, then will not create a new table.\n  *\n  * @param  {Object}  reportConfig Report configuration including report name,\n  * conditions, and fields.\n  *\/\nfunction createTable(reportConfig) {\n  if (tableExists(reportConfig.NAME)) {\n  if (CONFIG.TRUNCATE_EXISTING_TABLES) {\n  BigQuery.Tables.remove(CONFIG.BIGQUERY_PROJECT_ID,\n  CONFIG.BIGQUERY_DATASET_ID, reportConfig.NAME);\n  Logger.log('Truncated table %s.', reportConfig.NAME);\n  } else {\n  Logger.log('Table %s already exists. Will not recreate.',\n  reportConfig.NAME);\n  return;\n  }\n  }\n\n  \/\/ Create new table.\n  var table = BigQuery.newTable();\n  var schema = BigQuery.newTableSchema();\n  schema.fields = createBigQueryFields(reportConfig);\n  table.schema = schema;\n  table.friendlyName = reportConfig.NAME;\n  table.time_partitioning = {\"type\":\"DAY\", \"expirationMs\":31536000000};\n  if(reportConfig.PARTITION_FIELD){\n  table.time_partitioning.field = reportConfig.PARTITION_FIELD\n  }\n  table.tableReference = BigQuery.newTableReference();\n  table.tableReference.datasetId = CONFIG.BIGQUERY_DATASET_ID;\n  table.tableReference.projectId = CONFIG.BIGQUERY_PROJECT_ID;\n  table.tableReference.tableId = reportConfig.NAME;\n\n  table = BigQuery.Tables.insert(table, CONFIG.BIGQUERY_PROJECT_ID,\n  CONFIG.BIGQUERY_DATASET_ID);\n\n  Logger.log('Created table with id %s.', table.id);\n}\n\nfunction createBigQueryFields(reportConfig){\n  var bigQueryFields = [];\n\n  \/\/ Add each field to table schema.\n  var fieldNames = Object.keys(reportConfig.FIELDS);\n  for (var i = 0; i  &lt;  fieldNames.length; i++) {\n  var fieldName = fieldNames[i];\n  var bigQueryFieldSchema = BigQuery.newTableFieldSchema();\n  bigQueryFieldSchema.description = fieldName;\n  bigQueryFieldSchema.name = reportConfig.FIELDS[fieldName].alias;\n  bigQueryFieldSchema.type = reportConfig.FIELDS[fieldName].format;\n\n  bigQueryFields.push(bigQueryFieldSchema);\n  }\n  
if(reportConfig.RETURN_FIELDS){\n  var fieldNames = Object.keys(reportConfig.RETURN_FIELDS);\n  for (var i = 0; i  &lt;  fieldNames.length; i++) {\n  var fieldName = fieldNames[i];\n  var bigQueryFieldSchema = BigQuery.newTableFieldSchema();\n  bigQueryFieldSchema.description = fieldName;\n  bigQueryFieldSchema.name = reportConfig.RETURN_FIELDS[fieldName].alias;\n  bigQueryFieldSchema.type = reportConfig.RETURN_FIELDS[fieldName].format;\n\n  bigQueryFields.push(bigQueryFieldSchema);\n  }\n  }\n  return bigQueryFields\n}\n\/**\n  * Checks if table already exists in dataset.\n  *\n  * @param  {string}  tableId The table id to check existence.\n  *\n  * @return  {boolean}  Returns true if table already exists.\n  *\/\nfunction tableExists(tableId) {\n  \/\/ Get a list of all tables in the dataset.\n  var tables = BigQuery.Tables.list(CONFIG.BIGQUERY_PROJECT_ID,\n  CONFIG.BIGQUERY_DATASET_ID);\n  var tableExists = false;\n  \/\/ Iterate through each table and check for an id match.\n  if (tables.tables != null) {\n  for (var i = 0; i  &lt;  tables.tables.length; i++) {\n  var table = tables.tables[i];\n  if (table.tableReference.tableId == tableId) {\n  tableExists = true;\n  break;\n  }\n  }\n  }\n  return tableExists;\n}\n\n\/**\n  * Process all configured reports\n  *\n  * Iterates through each report to: retrieve Google Ads data,\n  * backup data to Drive (if configured), load data to BigQuery.\n  *\n  * @return {Array.&lt;string&gt;} jobIds The list of all job ids.\n  *\/\nfunction processReports() {\n  var jobIds = [];\n\n  \/\/ Iterate over each report type.\n  for (var i = 0; i  &lt;  CONFIG.REPORTS.length; i++) {\n  var reportConfig = CONFIG.REPORTS[i];\n  Logger.log('Running report %s', reportConfig.NAME);\n  \/\/ Get data as an array of CSV chunks.\n  var jsonRows = retrieveAdsReport(reportConfig);\n   \n  if(CONFIG.DEBUG.WRITE){\n  var chunks = chunkArray(jsonRows, CONFIG.CHUNK_SIZE)\n\n  for(var c = 0; c  &lt;  chunks.length;c++){\n  var chunk = 
chunks[c];\n  var ndJson = chunk.map(JSON.stringify).join('\\n')\n  var blobData = Utilities.newBlob(ndJson, 'application\/json');\n  var jobId = loadDataToBigquery(reportConfig,blobData);\n  jobIds.push(jobId);\n  }\n  } \/\/ END DEBUG WRITE\n  }\n  return jobIds;\n}\n\nfunction chunkArray(arr, chunkSize){\n  var res = [];\n  for (var i = 0; i  &lt;  arr.length; i += chunkSize) {\n  var chunk = arr.slice(i, i + chunkSize);\n  res.push(chunk);\n  }\n  return res;\n}\n\n\n\/**\n  * Retrieves Google Ads data as JSON and formats the fields\n  * to the format BigQuery expects.\n  *\n  * @param  {Object}  reportConfig Report configuration including report name,\n  * conditions, and fields.\n  *\n  * @return {!Array&lt;Object&gt;} Report rows as JSON objects.\n  *\/\nfunction retrieveAdsReport(reportConfig) {\n  var fieldNames = Object.keys(reportConfig.FIELDS);\n  var dateRange = setDateRange(CONFIG.DEFAULT_DATE_RANGE);\n  Logger.log(fieldNames.join(','))\n  var query =  \n  'SELECT ' + fieldNames.join(',') +\n  ' FROM ' + reportConfig.NAME + ' ' + reportConfig.CONDITIONS\n  if(reportConfig.FIELDS['segments.date']){\n  query = query + ' WHERE segments.date BETWEEN ' + dateRange\n  }\n  query = query + ' '+ CONFIG.DEFAULT_DATA_LIMIT  \n  var rows = AdsApp.search(query);\n  var jsonRows = [];\n  while (rows.hasNext()) {\n  var row = rows.next();\n  var jsonRow = {}\n  for (seg in row) {\n  for(el in row[seg]){\n  \/\/ Transform the camelCase element name into a snake_case column name.\n  var name = el.split(\/(?=[A-Z])\/).join('_').toLowerCase();\n  jsonRow[seg +'_'+ name] = row[seg][el]\n  }\n  }\n  delete jsonRow.campaign_resource_name;\n  delete jsonRow.shoppingPerformanceView_resource_name;\n\n  jsonRows.push(jsonRow)\n  }\n  return jsonRows;\n}\n\nfunction transformFields(reportConfig,search,replace) {\n  var transformedFields = {}\n  for(field in reportConfig.FIELDS){\n  var newName = field.replace(search,replace)\n  transformedFields[newName] = 
reportConfig.FIELDS[field]\n  }\n  return transformedFields\n}\n\n\/**\n  * Creates a BigQuery insert job to load newline-delimited JSON data.\n  *\n  * @param  {Object}  reportConfig Report configuration including report name,\n  * conditions, and fields.\n  * @param  {Blob}  data Report data as a newline-delimited JSON blob.\n  *\n  * @return  {string}  jobId The job id for upload.\n  *\/\nfunction loadDataToBigquery(reportConfig, data) {\n  \/\/ Create the data upload job.\n  var job = {\n  configuration: {\n  load: {\n  destinationTable: {\n  projectId: CONFIG.BIGQUERY_PROJECT_ID,\n  datasetId: CONFIG.BIGQUERY_DATASET_ID,\n  tableId: reportConfig.NAME\n  },\n  \/\/skipLeadingRows: skipLeadingRows ? skipLeadingRows : 0, \n  \/\/nullMarker: '--',\n  source_format:'NEWLINE_DELIMITED_JSON',\n  time_partitioning: {'type':\"DAY\"},\n  schemaUpdateOptions:[\"ALLOW_FIELD_ADDITION\",\"ALLOW_FIELD_RELAXATION\"],\n  schema: {fields: createBigQueryFields(reportConfig)}\n  }\n  }\n  };\n  if(reportConfig.PARTITION_FIELD){\n  job.configuration.load.time_partitioning.field = reportConfig.PARTITION_FIELD\n  }\n  var insertJob = BigQuery.Jobs.insert(job, CONFIG.BIGQUERY_PROJECT_ID, data);\n  Logger.log('Load job started for %s. 
Check on the status of it here: ' + \n  'https:\/\/bigquery.cloud.google.com\/jobs\/%s', reportConfig.NAME,\n  CONFIG.BIGQUERY_PROJECT_ID);\n  return insertJob.jobReference.jobId;\n}\n\n\/**\n  * Polls until all jobs are 'DONE'.\n  *\n  * @param {Array.&lt;string&gt;} jobIds The list of all job ids.\n  *\/\nfunction waitTillJobsComplete(jobIds) {\n  var complete = false;\n  var remainingJobs = jobIds;\n  while (!complete) {\n  if (AdsApp.getExecutionInfo().getRemainingTime()  &lt;  5){\n  Logger.log('Script is about to timeout, jobs ' + remainingJobs.join(',') +\n  ' are still incomplete.');\n  }\n  remainingJobs = getIncompleteJobs(remainingJobs);\n  if (remainingJobs.length == 0) {\n  complete = true;\n  }\n  if (!complete) {\n  Logger.log(remainingJobs.length + ' jobs still being processed.');\n  \/\/ Wait 5 seconds before checking status again.\n  Utilities.sleep(5000);\n  }\n  }\n  Logger.log('All jobs processed.');\n}\n\n\/**\n  * Iterates through jobs and returns the ids for those jobs\n  * that are not 'DONE'.\n  *\n  * @param {Array.&lt;string&gt;} jobIds The list of job ids.\n  *\n  * @return {Array.&lt;string&gt;} remainingJobIds The list of remaining job ids.\n  *\/\nfunction getIncompleteJobs(jobIds) {\n  var remainingJobIds = [];\n  for (var i = 0; i  &lt;  jobIds.length; i++) {\n  var jobId = jobIds[i];\n  var getJob = BigQuery.Jobs.get(CONFIG.BIGQUERY_PROJECT_ID, jobId,{'location':CONFIG.LOCATION});\n  if (getJob.status.state != 'DONE') {\n  remainingJobIds.push(jobId);\n  }\n  }\n  return remainingJobIds;\n}\n\n\nfunction setDateRange(parameter){\n  var rangeStatement = ''\n  const now = new Date();\n  if(parameter == 'YESTERDAY'){\n  const yesterday = new Date(now.getTime() - 1000 * 60 * 60 * 24);\n  rangeStatement = '\"'+ formatDate(yesterday) +'\" AND \"'+formatDate(yesterday)+'\"'\n  } else {\n  var range = parameter.split(\",\")\n  const start = Date.parse(range[0])\n  const end = Date.parse(range[1])\n  rangeStatement = '\"' + range[0] + '\" 
AND \"' + range[1] + '\"'\n  }\n  return rangeStatement\n}\n\n\/\/ FORMAT A DATE TO yyyy-MM-dd\nfunction formatDate(date){\n  const timeZone = AdsApp.currentAccount().getTimeZone();\n  date = Utilities.formatDate(date, timeZone, 'yyyy-MM-dd')\n  return date\n}<\/pre>\n\n<p><a href=\"https:\/\/gist.github.com\/Webmasterei\/62605575476efa551d17dffa356ebe11#file-google-ads-api-custom-call-tutorial-js\">Google Ads API Custom Call &#8211; tutorial.js<\/a> hosted with \u2764 by <a href=\"https:\/\/github.com\/\">GitHub<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The article shows the transfer of Google Ads API data to Google BigQuery using custom reporting, which can be configured individually. Possible transmission routes You can transfer the Google Ads API data into BigQuery in several ways. One option is to create an ETL (Extract, Transform, Load) process using a&#8230;<\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_kad_blocks_custom_css":"","_kad_blocks_head_custom_js":"","_kad_blocks_body_custom_js":"","_kad_blocks_footer_custom_js":"","_kad_post_transparent":"","_kad_post_title":"","_kad_post_layout":"","_kad_post_sidebar_id":"","_kad_post_content_style":"","_kad_post_vertical_padding":"","_kad_post_feature":"","_kad_post_feature_position":"","_kad_post_header":false,"_kad_post_footer":false,"_kad_post_classname":"","slim_seo":{"title":"Efficiently transfer Google Ads API data to BigQuery","description":"Step-by-step guide: How to load Google Ads API data into BigQuery daily using Apps Script or Transfer Services \u2013 flexible, automated & scalable."},"footnotes":""},"categories":[28,32],"tags":[],"class_list":["post-1930","post","type-post","status-publish","format-standard","hentry","category-big-query","category-google-ads-scripts"],"taxonomy_info":{"category":[{"value":28,"label":"Big Query"},{"value":32,"label":"Google Ads 
Scripts"}]},"featured_image_src_large":false,"author_info":{"display_name":"admin","author_link":"https:\/\/webmasterei-prange.de\/en\/author\/admin\/"},"comment_info":0,"category_info":[{"term_id":28,"name":"Big Query","slug":"big-query","term_group":0,"term_taxonomy_id":28,"taxonomy":"category","description":"","parent":0,"count":12,"filter":"raw","cat_ID":28,"category_count":12,"category_description":"","cat_name":"Big Query","category_nicename":"big-query","category_parent":0},{"term_id":32,"name":"Google Ads Scripts","slug":"google-ads-scripts","term_group":0,"term_taxonomy_id":32,"taxonomy":"category","description":"","parent":0,"count":10,"filter":"raw","cat_ID":32,"category_count":10,"category_description":"","cat_name":"Google Ads Scripts","category_nicename":"google-ads-scripts","category_parent":0}],"tag_info":false,"_links":{"self":[{"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/posts\/1930","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/comments?post=1930"}],"version-history":[{"count":1,"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/posts\/1930\/revisions"}],"predecessor-version":[{"id":1931,"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/posts\/1930\/revisions\/1931"}],"wp:attachment":[{"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/media?parent=1930"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/categories?post=1930"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/webmasterei-prange.de\/en\/wp-json\/wp\/v2\/tags?post=1930"}],"curies":[{"name":"wp","href":"https:\
/\/api.w.org\/{rel}","templated":true}]}}