Running an A/B test for 2 variations of a widget attached to the same launcher
Written by Vitalii Shumilov
Updated over a week ago

A/B testing (or split testing) is a methodology for researching user experience. When running a test, you split your audience between different variations of your widget or launcher to determine which performs better.


NOTE:

You cannot run A/B tests for widgets that:

  • Contain errors.

  • Have unsaved changes.

  • Are already used in another A/B test.


If one of your widgets has the Launcher click based option enabled in its triggering rules, you can run an A/B test for 2 widgets attached to that launcher.

When a page visitor clicks the launcher during the test, it shows either widget A or widget B, depending on the traffic allocation values applied.
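The article does not document how the platform implements this split internally, but the idea can be illustrated with a minimal sketch: hash a visitor identifier into a bucket in [0, 1) and compare it against widget A's traffic share. The function name `pick_widget` and the `visitor_id` parameter are hypothetical, chosen only for this example; hashing (rather than random choice) keeps the assignment stable, so the same visitor sees the same variant on every click.

```python
import hashlib

def pick_widget(visitor_id: str, allocation_a: float) -> str:
    """Deterministically assign a visitor to widget A or widget B.

    allocation_a is widget A's share of traffic, from 0.0 to 1.0.
    Hashing the visitor ID keeps the choice stable across clicks.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 10_000 / 10_000  # value in [0, 1)
    return "A" if bucket < allocation_a else "B"
```

For example, with `allocation_a=0.7`, roughly 70% of visitors would be routed to widget A and the rest to widget B, and a given visitor's assignment would not change between page loads.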

To run an A/B test for 2 widgets attached to the same launcher:

  1. In the left-hand side panel, select A/B Testing and click the New test button.

  2. Enter the test title and test description (optional) into the corresponding fields.

  3. Click Select widget A and select the widget from the list in the Link the widget window. This widget must have the Launcher click based option enabled in its triggering rules.

    NOTE: You can enter the widget name in the Search box to find a widget by name, or use the following filters:

    • By update: Sort the widgets in the list by their update dates.

    • Creation time: Select a creation time item from the dropdown list, or select period to specify a custom period of widget creation.

  4. Click Select widget B and select the widget from the list in the Link the widget window. This widget must not have the Launcher click based option enabled in its triggering rules.

  5. Enter the traffic share for one of the widgets in the Traffic allocation field.

  6. Click Start test.

    If your widgets are not published, the Start testing dialog opens. Select Publish both widgets and start the test.

The test appears in the A/B testing window with the Active status.

Active tests are marked with a play icon beside their names; completed tests are marked with a tick icon.

To stop an active test and leave both widgets published, expand the Stop dropdown menu and select Stop.

To stop an active test and unpublish both widgets, expand the Stop dropdown menu and select Stop and unpublish both, then click Stop in the dialog box.

NOTE: Once you stop an active test, you cannot relaunch it.

To view the test statistics, click the test name.
