Running A/B tests

Learn how to run A/B tests in Claspo

Written by Vitalii Shumilov

A/B testing (or split testing) is a methodology for researching user experience. When running a test, you split your audience between different variations of your widget or launcher to determine which of them performs better.

In Claspo, you can run an A/B test to compare:

  • 2 variations of a widget.

  • 2 variations of a widget attached to the same launcher.

  • 2 variations of a widget attached to 2 variations of a launcher.

You cannot run A/B tests for widgets that:

  • Contain errors.

  • Have unsaved changes.

  • Are used in another A/B test.

Running an A/B test for 2 variations of a widget

To run an A/B test for 2 widgets:

1. In the left-hand panel, select A/B Testing and click the New test button.

2. Enter the test title and test description (optional) into the corresponding fields.

3. Click Select widget A and select the widget from the list in the Link the widget window.

NOTE: You can find a widget by entering its name in the Search box, or use the following filters:

  • By update: Sort the widgets in the list by the date they were last updated.

  • Creation time: Select a time period from the dropdown list or specify a custom period of widget creation.

4. Click Select widget B and select the widget from the list in the Link the widget window.

5. Enter a value in the Traffic allocation field for one of the widgets.

NOTE: The traffic allocation value specifies how displays are distributed between widgets A and B. The default value is 50% for each widget, which means displays are evenly distributed among all site or page visitors (see the conceptual sketch after these steps).

6. Click Start test.

If your widgets are not published, the Start testing dialog opens. Select Publish both widgets and start the test in the Start testing window.
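
Conceptually, a traffic allocation split behaves like a weighted random draw per visitor. The TypeScript snippet below is a minimal sketch of that idea only; it is not Claspo's actual implementation, and the names `TrafficSplit` and `pickVariant` are hypothetical.

```typescript
// Hypothetical sketch of traffic allocation -- not Claspo's actual code.

interface TrafficSplit {
  widgetA: number; // percentage of displays for widget A, e.g. 50
  widgetB: number; // percentage for widget B; the two values sum to 100
}

// Assign a visitor to widget A or widget B in proportion to the split.
function pickVariant(split: TrafficSplit): "A" | "B" {
  return Math.random() * 100 < split.widgetA ? "A" : "B";
}

// Default 50/50 split: displays are evenly distributed among visitors.
// A 70/30 split would show widget A to roughly 70% of visitors instead.
const variant = pickVariant({ widgetA: 50, widgetB: 50 });
console.log(`This visitor sees widget ${variant}`);
```

In practice, A/B testing tools typically also persist each visitor's assignment (for example, in a cookie) so that a returning visitor keeps seeing the same variation.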

The test appears in the A/B testing window with the Active status.

Active tests are marked with a play icon beside their names; completed tests are marked with a tick icon.

To stop an active test and leave both widgets published, expand the Stop dropdown menu and select Stop.

To stop an active test and unpublish both widgets, expand the Stop dropdown menu, select Stop and unpublish both, then click Stop in the dialog box.

NOTE: Once you stop an active test, you cannot relaunch it.

To view the test statistics, click the test name.

Running an A/B test for 2 variations of a widget attached to the same launcher

If one of your widgets has the Launcher click based option enabled in its triggering rules, you can run an A/B test for 2 widgets attached to that launcher.

When a page visitor clicks the launcher during the test, either widget A or widget B is shown, depending on the traffic allocation values applied.
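
To illustrate the mechanics, here is a rough, hypothetical TypeScript sketch (not Claspo's actual code): the visitor is assigned one of the two widgets according to the allocation, and every click on the launcher opens the assigned widget. The `openWidget` helper, the `#launcher` element id, and the `allocationA` value are all made up for the example.

```typescript
// Hypothetical sketch -- not Claspo's actual code.
// The variant is drawn once per visitor, so repeated clicks on the
// launcher keep opening the same widget for that visitor.

type WidgetId = "widgetA" | "widgetB";

function openWidget(id: WidgetId): void {
  console.log(`Showing ${id}`); // stand-in for rendering the widget
}

const allocationA = 50; // widget A's share of displays, in percent
const assigned: WidgetId =
  Math.random() * 100 < allocationA ? "widgetA" : "widgetB";

// One launcher, two possible widgets behind it.
document.getElementById("launcher")?.addEventListener("click", () => {
  openWidget(assigned);
});
```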

To run an A/B test for 2 widgets attached to the same launcher:

  1. In the left-hand panel, select A/B Testing and click the New test button.

  2. Enter the test title and test description (optional) into the corresponding fields.

  3. Click Select widget A and select the widget from the list in the Link the widget window. This widget must have the Launcher click based option enabled in its triggering rules.

    NOTE: You can find a widget by entering its name in the Search box, or use the following filters:

    • By update: Sort the widgets in the list by the date they were last updated.

    • Creation time: Select a time period from the dropdown list or specify a custom period of widget creation.

  4. Click Select widget B and select the widget from the list in the Link the widget window. This widget must not have the Launcher click based option enabled in its triggering rules.

  5. Enter a value in the Traffic allocation field for one of the widgets.

  6. Click Start test.

    If your widgets are not published, the Start testing dialog opens. Select Publish both widgets and start the test in the Start testing window.

Running an A/B test for 2 variations of a widget attached to 2 variations of a launcher

If both of your widgets have the Launcher click based option enabled in their triggering rules, you can run an A/B test for 2 widgets attached to 2 different variations of a launcher.

A page visitor sees one of the launchers and its associated widget, depending on the traffic allocation values applied.

In this case, 2 different tests are run:

  • Test for launchers.

  • Test for widgets.

Statistics are collected independently for each test.

NOTE: To test 2 widgets that are both bound to launchers, you must create a separate test for the launchers.
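
The hypothetical sketch below (again, not Claspo's actual code) illustrates why the two tests report independently: one allocation draw selects the launcher/widget pair a visitor gets, but launcher impressions and clicks feed the launcher test, while widget displays feed the widget test. All names and counters here are invented for the example.

```typescript
// Hypothetical sketch -- not Claspo's actual code.
// One draw picks the launcher/widget pair; each test keeps its own counters.

type Variant = "A" | "B";

const launcherStats = {
  A: { shown: 0, clicked: 0 },
  B: { shown: 0, clicked: 0 },
};
const widgetStats = { A: { shown: 0 }, B: { shown: 0 } };

function onVisit(allocationA: number, visitorClicks: boolean): void {
  // The visitor gets one launcher together with its associated widget.
  const v: Variant = Math.random() * 100 < allocationA ? "A" : "B";
  launcherStats[v].shown++; // counts toward the launcher test

  if (visitorClicks) {
    launcherStats[v].clicked++; // launcher test: click-through
    widgetStats[v].shown++; // widget test: the widget was displayed
  }
}

// Simulate some traffic with a 50/50 split.
for (let i = 0; i < 1000; i++) onVisit(50, Math.random() < 0.2);
console.log(launcherStats, widgetStats);
```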

To run an A/B test for 2 variations of a widget attached to 2 variations of a launcher:

  1. In the left-hand panel, select A/B Testing and click the New test button.

  2. Enter the test title and test description (optional) into the corresponding fields.

  3. Click Select widget A and select the widget from the list in the Link the widget window.

    NOTE: You can find a widget by entering its name in the Search box, or use the following filters:

    • By update: Sort the widgets in the list by the date they were last updated.

    • Creation time: Select a time period from the dropdown list or specify a custom period of widget creation.

  4. Click Select widget B and select the widget from the list in the Link the widget window.

  5. Enter a value in the Traffic allocation field for one of the widgets.

  6. Click Start test.

    If your widgets are not published, the Start testing dialog opens. Select Publish both widgets and start the test in the Start testing window.
