Latest attribution app doc changes (#995)
* Latest attribution app doc changes

* Change default to dollar

* Make paths to non conv table more clear

* Rephrase
agnessnowplow authored Sep 2, 2024
1 parent a3fd1fc commit d93a7cb
Showing 4 changed files with 33 additions and 23 deletions.
52 changes: 31 additions & 21 deletions docs/data-apps/attribution-modeling/index.md
@@ -1,25 +1,26 @@
---
-title: "Attribution Modeling"
+title: "Marketing Attribution"
sidebar_position: 3
-sidebar_label: "Attribution Modeling"
+sidebar_label: "Marketing Attribution"
---

:::caution

-This data app is currently in Private Preview and features may change without notice.
+This data app is currently in Public Preview and features may change without notice.

:::

In today's increasingly complex digital world, users often take multi-channel journeys before converting. Assigning credit across multiple touchpoints is vital to getting an accurate picture of the efficacy of your marketing channels, yet requires merging disparate datasets and running complex calculations.

-Our **Attribution modeling** app (together with the [Snowplow Attribution dbt package](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-models/dbt-attribution-data-model/index.md)) lowers the barrier to entry for your marketing team through the following features:
+Our **Marketing Attribution** app (together with the [Snowplow Attribution dbt package](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-models/dbt-attribution-data-model/index.md)) lowers the barrier to entry for your marketing team through the following features:

- Incremental SQL model in your warehouse for cost-effective computation
- Choice of first-touch, last-touch, linear and positional methods, with additional filters and transforms available
- Reports for conversions, revenue, spend and Return On Advertising Spend (ROAS) per channel and campaign
- Option to specify your own touchpoint and advertising spend tables
- Intermediate tables that you can build your own attribution models on top of

![](images/overview.png)
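The four methods listed above differ only in how they split credit for a conversion across the touchpoints of a path. A minimal Python sketch of the idea (illustrative only, not the package's SQL; the 40/20/40 position-based split is an assumed convention):

```python
def attribute(path, value, method="linear"):
    """Split a conversion value across the channels in a path.

    `path` is an ordered list of channel names, e.g. ["Display", "Search", "Email"].
    Returns a list of (channel, credit) pairs in path order.
    """
    n = len(path)
    if n == 0:
        return []
    if method == "first_touch":
        weights = [1.0] + [0.0] * (n - 1)
    elif method == "last_touch":
        weights = [0.0] * (n - 1) + [1.0]
    elif method == "linear":
        weights = [1.0 / n] * n
    elif method == "position_based":
        # Assumed 40/20/40 convention: 40% each to first and last touch,
        # the remaining 20% shared equally across the middle touchpoints.
        if n == 1:
            weights = [1.0]
        elif n == 2:
            weights = [0.5, 0.5]
        else:
            mid = 0.2 / (n - 2)
            weights = [0.4] + [mid] * (n - 2) + [0.4]
    else:
        raise ValueError(f"unknown method: {method}")
    return [(ch, value * w) for ch, w in zip(path, weights)]

# A three-touch path with a 100-unit conversion, credited position-based.
print(attribute(["Display", "Search", "Email"], 100.0, "position_based"))
```

Whatever the method, the per-touchpoint credits always sum to the original conversion value, which is what makes the channel-level totals comparable across methods.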

## Requirements

@@ -41,46 +42,55 @@ Use the toggle `Last N days View (Dynamic)` to choose whether you would like to

**Defining a Last N Days (Dynamic) View**

-The so-called `Dynamic` views are to be used for generating datasets that have a rolling conversion window of last nth day and will be refreshed automatically (e.g. Last 30 days). The app will save the last-refreshed date with the View configurations and any subsequent day a user logs back in the app, a query will run in the background to look for any newly processed conversion event in the conversion source and if there is, the dynamic datasets are refreshed by running all the queries that are needed to generate data for the charts to populate. Once the update finishes the conversion window should display the new date range.
+The so-called `Dynamic` views generate datasets with a rolling conversion window over the last N days (e.g. Last 30 days) and are refreshed automatically. The app saves the last-refreshed date with the View configuration; on any subsequent day a user opens the app, a background query checks the conversion source for newly processed conversion events and, if any are found, the dynamic datasets are refreshed by re-running all the queries needed to populate the charts.

-If you choose this option, set the `auto-update days`: the number of days since the last conversion event defined here will define the conversion window.
+If you choose this option, set the `auto-update days`: the number of days since the last conversion event, which defines the conversion window. The conversion window currently in use can be checked on the `Settings` page, which displays a table with information on all the created views.
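The refresh logic described above can be sketched roughly as follows (the function names, and anchoring the window on the newest conversion date, are assumptions for illustration, not the app's documented internals):

```python
from datetime import date, timedelta

def needs_refresh(last_refreshed: date, latest_conversion: date) -> bool:
    """A dynamic view is stale once the conversion source contains a
    conversion newer than the saved last-refreshed date."""
    return latest_conversion > last_refreshed

def conversion_window(latest_conversion: date, auto_update_days: int):
    """Rolling `Last N days` window: N days ending on the newest
    conversion event (inclusive on both ends)."""
    start = latest_conversion - timedelta(days=auto_update_days - 1)
    return start, latest_conversion

# A 30-day window ending on the newest conversion date.
start, end = conversion_window(date(2024, 9, 2), 30)
```

Under these assumptions, each login only triggers recomputation when `needs_refresh` is true, which is what keeps the dynamic views cheap day to day.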

**Defining a Custom Date Range (static) View**

Non-dynamic views will have to be given a name and will typically be used to generate a fixed dataset (e.g. Jan, Q1, 2023) to avoid having to recalculate the analysis for subsequent users.

Define a fixed conversion window by selecting the appropriate date range with the date picker tool (which gets activated by clicking on the default date range).

-#### 1.2 Set a `currency symbol` (defaults to £)

-#### 1.3 Decide if you would like to `use Non-Conversions`:
-
-Use this with caution, currently it uses the `snowplow_attribution_paths_to_non_conversion` table as is without considering the conversion period. The intention is to make this a fully automated feature in the near future so watch for updates on this.

+#### 1.2 Set a currency symbol (defaults to $)

### 2. Connect your Data Sources:

1. Select your schema that contains the derived unified and attribution tables: this will trigger an update which checks for any tables with the names closest to what the app expects.
2. After the update completes, check whether the auto-detected source tables match your expectations; if they are not correct, you can change them to any other existing tables.

+There is an optional `snowplow_attribution_paths_to_non_conversion` table select box, which is not relevant for most users; in that case the first option, `Do not use paths_to_non_conversion table`, should be selected. This drop-and-recompute table calculates the paths your customers have followed that have not led to a conversion.
+
+Please note that this table is not recalculated by the app, so it should only be used with a fixed view that covers the same period as the latest data model run; the `snowplow_attribution_paths_to_non_conversion` table is consumed by the `Path Summary` page.
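Conceptually, the paths-to-non-conversion table holds the ordered touchpoint path of each user who never converted. A toy reconstruction in Python (illustrative only; the real table is built in SQL by the dbt package):

```python
def paths_to_non_conversion(touchpoints, converted_users):
    """touchpoints: iterable of (user_id, tstamp, channel) tuples.
    Returns {user_id: [channel, ...]} in timestamp order, restricted
    to users with no conversion."""
    paths = {}
    for user, ts, channel in sorted(touchpoints, key=lambda t: (t[0], t[1])):
        paths.setdefault(user, []).append(channel)
    return {u: p for u, p in paths.items() if u not in converted_users}

events = [("u1", 1, "Search"), ("u2", 1, "Display"),
          ("u2", 2, "Email"), ("u1", 2, "Email")]
# u1 converted, so only u2's path remains:
print(paths_to_non_conversion(events, {"u1"}))  # {'u2': ['Display', 'Email']}
```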

3. Overwrite the attribution_manifest table. Most likely the schema name will have to be modified. Please keep the `schema_name.table_name` notation here. Make sure you press enter once modified.
4. (Optional but recommended) Specify the Spend Source: this will most likely be a view created on top of the table that holds your marketing spend data. The view should align the field names with what the app expects: it needs `campaign`, `channel`, `spend` and `spend_tstamp` for the analysis to work. Providing this enables the Return On Advertising Spend (ROAS) calculation in your overview. Make sure you press enter once modified.
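Given a spend view with the expected columns, ROAS per channel is attributed revenue divided by advertising spend. A small sketch under those assumptions (the column names come from the doc; the aggregation shown is the conventional ROAS definition, not necessarily the app's exact query):

```python
def roas_per_channel(attributed_revenue, spend_rows):
    """attributed_revenue: {channel: revenue} from the attribution model.
    spend_rows: iterable of (campaign, channel, spend, spend_tstamp)
    matching the expected spend-view columns.
    ROAS = attributed revenue / advertising spend, per channel."""
    spend = {}
    for campaign, channel, amount, spend_tstamp in spend_rows:
        spend[channel] = spend.get(channel, 0.0) + amount
    # Channels with no recorded spend are skipped to avoid division by zero.
    return {ch: rev / spend[ch]
            for ch, rev in attributed_revenue.items() if spend.get(ch)}

spend_view = [("summer_sale", "Search", 200.0, "2024-06-01"),
              ("summer_sale", "Email", 50.0, "2024-06-01")]
print(roas_per_channel({"Search": 800.0, "Email": 100.0}, spend_view))
# {'Search': 4.0, 'Email': 2.0}
```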

-Once happy with all the imputs press `Create View` button. It will first run a validation against the data sources making sure it has all the fields it needs. After that it will run the queries that generate the data necessary to populate the dashboards. They will be saved as csv files that app will read from when selecting the View on the sidebar.
+Once you are happy with all the inputs, press the `Create View` button. The app will first validate the data sources to make sure all the required fields are present, displaying the problems if something is not correct; otherwise it saves the view and the dashboards are ready to be explored. The first time a dashboard page is visited, the relevant query runs once and the data is cached to speed up subsequent dashboard explorations for other users.


## Using the Dashboard

-Once the Data Analyst or Engineer that knows how to set up the view configured one, users that are only interested in the Dashboard can just use the Attribution Dashboard to review the results of the analysis. There are various filters that make this interactive. Because the data is already saved users can make any of these interactive changes without affecting the warehouse to avoid expensive queries or laggy information retrieval.
+Once at least one View has been configured by a Data Analyst or Engineer, users who are only interested in the Dashboard can simply use the dashboards created by the app to review the results of the analysis.

+### Dashboard Filters

-### Sidebar filters
+There are various filters at the top of each dashboard page that make data exploration interactive. Because the queries are cached, users can apply any of these changes without touching the warehouse, avoiding expensive queries and laggy information retrieval.

-Filters on the sidebar refer to all of the dashboard tabs so you only have to change them once. The only exception to this is the Attribution Type, where
+1. select which `View` to use from a dropdown
+2. make changes within `View Settings`

-- select which `View` to use
-- toggle between using `Campaign` or `Channel`
+Once you click on `View Settings` at the top of each page you can:
+- select which Attribution Type to use (`First Touch`, `Last Touch`, `Linear` or `Position Based`)
+- choose whether `Campaign` or `Channel` should be considered for paths
+
+Optional filters:
+- on relevant pages there are additional filters such as `Remove paths with only 1 touchpoint` or `Number of Items`, which reduce the items shown within specific charts

+## Editing / Deleting Views
+On the `Settings` page there is an easy way to edit or delete existing views:
+
-### Dashboard specific filters
+- **deletion**: click the X next to the view
+- **editing**: click on the name of the view and the app will take you to the view configuration page, where you can make amendments. Once ready, click save again to overwrite the existing view configuration

-- `Remove paths with only 1 touchpoint`: this exclude paths with only one touch point to make it more useful in certain scenarios
-- `Top N Filter`: filter your charts according to the top nth value
+![](images/edit_delete_views.png)
@@ -48,7 +48,7 @@ This package consists of a series of dbt models that produce the following table
- `snowplow_attribution_channel_attributions`: By channel path and conversion level incremental table that attributes the conversion value based on various algorithms
- `snowplow_attribution_overview`: The user defined report view (potentially showing ROAS)
- `snowplow_attribution_path_summary`: For each unique path, a summary of associated conversions, optionally non-conversions and revenue
-- `snowplow_attribution_paths_to_non_conversion`: Customer id and the the paths the customer has followed that have not lead to conversion. Optional drop and recompute table, disabled by default.
+- `snowplow_attribution_paths_to_non_conversion`: Customer id and the paths the customer has followed that have not led to conversion. Optional drop and recompute table, disabled by default.

In the [Quick Start](/docs/modeling-your-data/modeling-your-data-with-dbt/dbt-quickstart/index.md) section you will find a step-by-step guide on how to operate the package as a whole.

@@ -100,7 +100,7 @@ The purpose of this package is to allow an incremental, efficient way to do mark

In the below guide we will walk you through the data transformation process step-by-step in order for you to see how the source data changes downstream. This will give you and your team a transparent and easy-to-understand way to see how this package will lead you to valuable insights.

-We also provide the **[Attribution Data App](/docs/data-apps/attribution-modeling/index.md)** specifically to help your analysis by visualizing the output in the form of interactive dashboards as well as letting you capture datasets for comparison. It works in tandem with the package and will auto-update daily in case your package has been processed since then.
+We also provide the **[Marketing Attribution Data App](/docs/data-apps/attribution-modeling/index.md)** specifically to help your analysis by visualizing the output in the form of interactive dashboards as well as letting you capture datasets for comparison. It works in tandem with the package and auto-updates daily whenever the package has been processed since the last refresh.

## Sources you are going to need

