Are you already using Matomo and want to expand your web analytics in a data-driven way? Then you’ve come to the right place. Here you’ll learn how to close Matomo’s reporting gaps using Google Looker Studio and the Matomo API, taking your custom web analyses to a new level.

Why Matomo?

Matomo has become widely popular as an open-source web analytics solution, especially since the GDPR took effect in 2018. With a market share of 6% in Germany, Matomo ranks second only to Google. As a website operator, you retain full control over your data by hosting it on your own infrastructure. Unlike with Google Analytics, this minimizes the risk of sharing data with third parties and lets you respond better to data protection requirements. However, Matomo does have limitations in certain areas: the user interface feels dated, rights management is restricted, and extensions such as conversion exports often incur additional fees.

Looker Studio in combination with Matomo (API)

The biggest pain point with Matomo is often the limited reporting. This is where Looker Studio comes in: By connecting to the Matomo API, you can retrieve visit data and then store it in BigQuery (keyword: Matomo BigQuery). There, it can be transformed as needed and finally flexibly processed via Looker Studio (keyword: Matomo Looker Studio).

Technical Process

The data flow from Matomo to visualization in Looker Studio can be summarized in four steps:

  1. Querying the Matomo API
    A Google Cloud Function regularly retrieves data via the Live.getLastVisitsDetails endpoint.
  2. Storage in BigQuery
    The data is temporarily stored in Google Cloud Storage and then loaded into a BigQuery table.
  3. Further processing via Dataform
    A Dataform package handles the transformation of your raw data and makes it available for various applications.
  4. Output in Looker Studio
    You can integrate the processed data into Looker Studio either via a native BigQuery connection or another Cloud Function.

This way, you bypass the limitations of the Matomo interface and gain more flexibility in your web analyses.


Data flow within the solution: A Google Cloud Function retrieves data from Matomo and stores it in BigQuery.

First Success: Accessing the Matomo API

To access the Matomo API, you need an API token, which you create in your Matomo user account. The token is sent as a parameter with each request so that only authorized persons have access. Since Matomo version 5, API retrieval is only possible via POST request (in version 4, a simple browser call sufficed). You can also control pagination using filter_limit and filter_offset to extract even large amounts of data from Matomo in smaller chunks.

In practice, a Google Cloud Function usually handles the cyclical querying of Matomo data. It stores the data temporarily and then imports it into BigQuery, which even allows for intraday analyses. How often you actually retrieve your data also depends on factors such as costs and data volume; queries at four-hour intervals have proven effective.
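The paginated POST retrieval described above can be sketched in Python as follows. This is a minimal sketch, not the article’s actual Cloud Function: the Matomo host URL is a placeholder, and the `post` parameter is injectable purely so the paging logic can be tested without a live server.

```python
import json
import urllib.parse
import urllib.request

# Assumption: replace with your own Matomo installation's URL.
MATOMO_URL = "https://matomo.example.com/index.php"

def build_request(token, id_site, date="yesterday", limit=500, offset=0):
    """Form fields for a Live.getLastVisitsDetails POST request.
    Since Matomo 5, the token must travel in the POST body."""
    return {
        "module": "API",
        "method": "Live.getLastVisitsDetails",
        "idSite": id_site,
        "period": "day",
        "date": date,
        "format": "JSON",
        "token_auth": token,
        "filter_limit": limit,
        "filter_offset": offset,
    }

def http_post(data):
    """Send the form data to Matomo and decode the JSON response."""
    body = urllib.parse.urlencode(data).encode()
    with urllib.request.urlopen(MATOMO_URL, data=body, timeout=60) as resp:
        return json.loads(resp.read())

def fetch_all_visits(token, id_site, limit=500, post=http_post):
    """Page through the endpoint via filter_offset until a page shorter
    than filter_limit signals the last chunk."""
    offset, visits = 0, []
    while True:
        page = post(build_request(token, id_site, limit=limit, offset=offset))
        visits.extend(page)
        if len(page) < limit:
            return visits
        offset += limit
```

In a Cloud Function you would call `fetch_all_visits` on each trigger and hand the result to the BigQuery load step.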

Example API call in a Cloud Function: The Live.getLastVisitsDetails endpoint is used.

Matomo getLastVisitsDetails ≠ Raw Tracking Data

The endpoint Live.getLastVisitsDetails does not provide pure raw data, but already processed information. This includes, for example, cleaned URL parameters such as gclid, fbclid, or msclkid.

  • Goals are included, while certain events (e.g., addEcommerceItem) appear only indirectly, for instance as an abandoned shopping cart (ecommerceAbandonedCart).
  • When importing into BigQuery, omit irrelevant fields such as icon paths or various aggregations to keep your dataset lean.
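Such pruning before the BigQuery load can be sketched like this. The concrete drop lists are assumptions for illustration; adjust them to the fields you actually see in your responses.

```python
# Sketch: strip presentation-only fields from a visit record before
# loading it into BigQuery. DROP_SUFFIXES and DROP_FIELDS are
# assumptions; tailor them to your own schema.

DROP_SUFFIXES = ("Icon", "IconSVG")   # e.g. browserIcon and similar icon paths
DROP_FIELDS = {"countryFlag"}          # hypothetical further fields to discard

def prune_visit(visit):
    """Return a copy of the visit dict without presentation fields."""
    return {
        key: value
        for key, value in visit.items()
        if key not in DROP_FIELDS and not key.endswith(DROP_SUFFIXES)
    }
```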

Typical information you receive via Live.getLastVisitsDetails includes:

  • Visit details: Session duration (in seconds and human-readable form), number of pages viewed, etc.
  • Referrer Information: Type of referrer (search engine, direct entry, social media), name, URL, keyword.
  • Device Information: Device type (desktop, tablet, smartphone), operating system, browser, installed plugins.
  • Location Data: City, region, country (with flag), continent, geo-coordinates, language settings.
  • Page and Event Data: Visited URLs, titles, time spent, load times, event categories and actions.
  • E-Commerce Data: Product information (name, SKU, price, category), abandoned carts, order ID, revenue, taxes, shipping, discounts, ordered items.

Here you can find the complete table with all fields of the Live.getLastVisitsDetails endpoint. With this broad spectrum of data, you can gain deep insights into your visitors’ behavior and significantly improve your web analysis in e-commerce and online marketing.

Data transformation with Dataform

To efficiently process your Matomo data, it’s worth taking a look at Dataform. This tool is integrated into the Google Cloud Platform and supports you in managing, automating, and orchestrating data pipelines. Advantages of Dataform:

  • Seamless Git integration for version control.
  • SQL workflows can be efficiently managed and automated.
  • Incremental MERGE logic simplifies data updates.
  • A compilation graph always gives you a visual overview of tables and dependencies.

Typically, you create two base tables from the raw data using Dataform, e.g., actions and sessions, which you later split or aggregate further. This way, you only load the data you really need into Looker Studio – saving time and keeping your reports clear. Take the configuration of the actions table as an example:

  • Incremental processing (config.type = "incremental").
  • Only a defined period is updated via partitioned tables.
  • Assertions ensure that no duplicate entries are created.
  • pre_operations delete recently imported sessions so that visits that were still incomplete at the last run can be re-captured.
  • JavaScript functions enable central CASE statements (e.g., for channel names).
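The points above can be sketched in a Dataform SQLX file roughly as follows. Apart from the rawdata source, all table and column names here (action_ts, action_id, referrerType, and so on) are illustrative assumptions, not the actual package:

```sqlx
config {
  type: "incremental",              // only new data is merged on each run
  bigquery: {
    partitionBy: "DATE(action_ts)"  // hypothetical timestamp column
  },
  assertions: {
    uniqueKey: ["action_id"]        // guards against duplicate entries
  }
}

js {
  // Central CASE expression, reusable across table definitions
  const channelName = `
    CASE
      WHEN referrerType = 'search' THEN 'Organic Search'
      WHEN referrerType = 'direct' THEN 'Direct'
      ELSE 'Other'
    END`;
}

pre_operations {
  -- Delete the most recent day so visits that were still open
  -- during the last import are re-captured in full
  DELETE FROM ${self()}
  WHERE DATE(action_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)
}

SELECT
  action_id,
  action_ts,
  page_url,
  ${channelName} AS channel_name
FROM ${ref("rawdata")}
${when(incremental(),
  `WHERE DATE(action_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 1 DAY)`)}
```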
Data flow in Dataform: Data is loaded from the rawdata table and written into actions and sessions.

Triggers, Timing, Messages

A reliable data pipeline requires well-thought-out orchestration. Instead of fixed times, triggers and Pub/Sub messages are usually the better choice:

  1. Cloud Scheduler: Triggers a Pub/Sub message and starts the Cloud Function.
  2. Cloud Function: Loads the raw data from Matomo into BigQuery and writes a log entry upon success.
  3. Log sink: Detects the entry and triggers a workflow.
  4. Workflow: Performs an update of your Dataform package and starts Dataform.
  5. Looker Studio: Retrieves the updated data from BigQuery.

This ensures that no step starts before the previous one has been successfully completed.
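As a sketch of step 3, the log sink could filter for the Cloud Function’s success entry roughly like this; the function name and log message are assumptions. The four-hour interval mentioned earlier corresponds to the Cloud Scheduler cron expression 0 */4 * * *.

```
resource.type = "cloud_function"
resource.labels.function_name = "matomo-import"
jsonPayload.message = "matomo_import_success"
```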

Sequence of triggers and messages: Ensuring a smooth process.

Data Usage – Looker Studio as an Example

Once your data is in BigQuery, you can analyze it with various tools. Looker Studio is a good choice because it’s free and has a low entry barrier (a Google account is sufficient). A multi-page Looker Studio dashboard can display data on acquisition, behavior, e-commerce, and technical details, among other things. With this flexible structure, you have far more freedom in compiling your reports than the Matomo interface allows.

Looker Studio Report: Overview page of the multi-page report.

Copying Dashboards with the LSD Cloner

Looker Studio itself doesn’t offer integrated versioning for dashboards. This is where the Looker Studio Dashboard Cloner (LSD Cloner) can help you:

  • It copies existing dashboards using the Looker Studio Linking API.
  • A JSON configuration file specifies how data sources and dashboard names should be adjusted.
  • The tool is a simple npm package that automatically generates a link for you.

This allows you to quickly duplicate and customize dashboards for different projects or clients.

Conclusion

With the Matomo API endpoint Live.getLastVisitsDetails, you get detailed insights into visitor behavior on your website. When you combine this data via BigQuery and Looker Studio (keywords: Matomo BigQuery, Matomo Looker Studio), you extend your web analytics far beyond Matomo’s standard interface. Especially in e-commerce environments, this setup enables more precise conversion analysis and a deeper understanding of your users’ behavior. Thanks to Dataform, you can transform your data efficiently and update it incrementally, while Looker Studio lets you create flexible, appealing dashboards.

Overall, this solution is ideal if you work professionally with Matomo and want to conduct state-of-the-art web analytics. If you need a ready-made solution, send us an inquiry: we have developed a fully automated solution for transferring Matomo data to BigQuery.

Matomo BigQuery Importer + Looker Studio Dashboard