Experiment-level performance reports: Straight split and test & deploy experiments

An overview of the performance report displayed on the results tab for email, push and SMS tests using a standard testing methodology.


This article provides an overview of the performance reports generated for individual experiments across email, push and SMS tests that have used a standard testing methodology (i.e. straight splits or test and deploy).

  • Please refer to this article for more information about performance reporting for individual email, push or SMS experiments that have used a dynamic optimization methodology.

  • For more information about performance reporting at the account level, please check out this article.

What is an experiment-level performance report?

Experiment-level performance reports summarize the performance of your Phrasee experiments based on the results received from the test.

Where can I find the report?

You can find the performance report for individual experiments under the 'results' tab for any experiment in Phrasee that has result data.

Experiment-level performance reporting

How is an experiment-level performance report generated?

Phrasee automatically generates the report once it receives the results of an experiment, either via one of our technology integrations or when you upload them to the platform.

What does the report show?

The exact metrics the report displays will depend on the channel and testing methodology of the experiment.

1. Winning copy variant

The experiment's winning copy variant is highlighted at the top of your performance report. This variant was determined to be the winner based on the optimization metric selected during the experiment's initial set-up (Step 1). For example, if you chose to optimize your copy variants based on open rate, the winner is the variant that received the highest open rate.
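As a loose illustration (the variant names, figures and data structure below are invented for the example, not Phrasee's actual data model), selecting a winner by optimization metric amounts to taking the variant with the highest value for the chosen metric:

```python
# Hypothetical variant results; names and figures are illustrative only.
variants = [
    {"name": "Variant A", "open_rate": 0.21, "click_rate": 0.034},
    {"name": "Variant B", "open_rate": 0.24, "click_rate": 0.029},
    {"name": "Variant C", "open_rate": 0.19, "click_rate": 0.041},
]

# The optimization metric is chosen during the experiment's set-up (Step 1).
optimization_metric = "open_rate"

# The winner is simply the variant with the highest value for that metric.
winner = max(variants, key=lambda v: v[optimization_metric])
print(winner["name"])  # Variant B
```

Had `optimization_metric` been `"click_rate"` instead, the same comparison would have declared Variant C the winner.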

For experiments that used a test and deploy methodology, the copy variant displayed is the one that was determined to be the winner when the testing window completed.

Note: In some cases, engagement recorded after the testing period has ended may cause a different variant to overtake the one selected as the winner at the end of the test. This isn't cause for concern, but it does mean the variant declared as the winner in the report can differ from the variant ranked number one in the results table below the performance report.

All subsequent data displayed under the winning variant is based on this variant. Directly below it you will see your experiment's uplift and incremental event results, followed by the results from the winning variant's final send.

2. Uplift results

Uplift calculations will be displayed for any email, push or SMS experiments that have used either a test and deploy or straight splits testing methodology.

Uplift in this report view is calculated as the percentage change between the key metric for the winning copy variant (e.g. opens, clicks) and the same metric for the lowest-ranked copy variant in the experiment.

The exact metric of the uplift will depend on the channel the experiment was deployed on and the result data available. These include:

  • Email: Open Uplift, Click Uplift

  • Push: Direct Open Uplift, Influenced Open Uplift

  • SMS: Click Uplift
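A minimal sketch of that percentage-change calculation (the open rates below are invented for illustration):

```python
def uplift(winner_rate: float, lowest_rate: float) -> float:
    """Percentage change from the lowest-ranked variant's rate
    to the winning variant's rate."""
    return (winner_rate - lowest_rate) / lowest_rate * 100

# Illustrative open rates: winner 24%, lowest-ranked variant 19%.
open_uplift = uplift(0.24, 0.19)
print(f"{open_uplift:.1f}%")  # 26.3%
```

The same formula applies to whichever metric the channel reports (clicks for SMS, direct or influenced opens for push, and so on).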

3. Incremental events

Incremental events are calculated only for experiments that have used a test and deploy methodology.

They are a cumulative figure representing the additional number of times people opened or clicked your email, push notification or SMS message as a result of your Phrasee experiment.

Incremental events are calculated by taking the difference in the relevant rate (e.g. open rate, click rate) between the winning variant and the human control, and multiplying that difference by the final audience size.

Note: Before final results are entered, the incremental events number is estimated based on the audience size entered during the experiment's initial set-up (Step 1). Once final results are entered, the report is adjusted based on the sum of the split audience and the final send audience.
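As a rough sketch of that calculation (assuming the rate difference is multiplied by the audience size; all figures below are invented for illustration):

```python
def incremental_events(winner_rate: float,
                       control_rate: float,
                       audience_size: int) -> int:
    """Additional opens/clicks attributable to the winning variant
    versus the human control, scaled to the full audience."""
    return round((winner_rate - control_rate) * audience_size)

# Illustrative: winner open rate 24%, human control 20%, audience 50,000.
print(incremental_events(0.24, 0.20, 50_000))  # 2000
```

Per the note above, the audience size used here is an estimate from set-up until final results arrive, at which point it becomes the sum of the split audience and the final send audience.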

Depending on the channel and result data available, you will see slightly different incremental event metrics:

  • Email: Incremental Opens, Incremental Clicks

  • Push: Incremental Direct Opens, Incremental Influenced Opens

  • SMS: Incremental Clicks

4. Winning variant results

Below the uplift and incremental event results, the results for your winning variant are displayed in orange.

For straight split experiments, these are the results from the winning variant as determined by the optimization metric you selected during the experiment's initial set-up (Step 1).

For test and deploy experiments, the performance metrics shown here are based on the results from the final send. The results achieved during the testing window are displayed in the variant results table.

5. Variant results table

The variant results table summarizes the key results for each variant. You can sort the table by clicking the arrows in each column header.

The results displayed here and in the winning variant summary will depend on the channel and result data available to Phrasee:

Result metrics available to display*

  • Email

    - Total number of recipients - required
    - Opens (unique) - required
    - Open Rate - required
    - Clicks (unique) - required
    - Click Rate (unique) - required
    - Unsubscribes (unique) - optional
    - Conversions - optional
    - Revenue - optional

  • Push

    - Recipients - required
    - Direct Opens (unique) - required
    - Direct Open Rate - required
    - Influenced Opens (unique) - required
    - Influenced Open Rate (unique) - required
    - Conversions - optional
    - Revenue - optional

  • SMS

    - Recipients - required
    - Clicks (unique) - required
    - Click Rate (unique) - required
    - Conversions - optional
    - Revenue - optional

*Note: Some result data must be entered in order for Phrasee to generate a performance report, while other metrics are optional. These requirements are indicated in the table above.
