Running an A/B test for 2 variations of a widget attached to 2 variations of a launcher

Learn how to set up and run an A/B test for 2 variations of a widget attached to 2 variations of a launcher

Written by Vitalii Shumilov
Updated over 9 months ago

A/B testing (or split testing) is a methodology for researching user experience. When running a test, you split your audience between different variations of your widget or launcher to determine which of them performs better.


NOTE:

You cannot run A/B tests for widgets that:

  • Contain errors.

  • Have unsaved changes.

  • Are used in another A/B test.


If both your widgets have the Launcher click based option enabled in their triggering rules, you can run an A/B test for 2 widgets attached to 2 different variations of a launcher.

The page visitor will see one of the launchers and its associated widget, depending on the traffic allocation values applied.
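Conceptually, traffic allocation in split testing is often implemented by bucketing each visitor deterministically, so the same visitor keeps seeing the same variation. Below is a minimal illustrative sketch of that general technique; it is not this product's actual implementation, and the names `assign_variant`, `visitor_id`, and `traffic_to_a` are invented for the example:

```python
import hashlib

def assign_variant(visitor_id: str, traffic_to_a: int) -> str:
    """Deterministically bucket a visitor into variation A or B.

    traffic_to_a is the percentage of traffic (0-100) allocated to
    variation A; the remainder goes to variation B. Hashing the visitor
    ID keeps the assignment stable across page views.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100  # uniform bucket in 0-99
    return "A" if bucket < traffic_to_a else "B"

# The same visitor always lands in the same bucket:
assert assign_variant("visitor-42", 50) == assign_variant("visitor-42", 50)
```

Hashing (rather than random assignment on every page view) is what keeps the experience consistent for a returning visitor while still splitting overall traffic according to the allocation value.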

In this case, 2 different tests are run:

  • Test for launchers.

  • Test for widgets.

Statistics are collected independently for each test.


NOTE:

To test 2 widgets that are both bound to launchers, you must create a separate test for the launchers.


To run an A/B test for 2 variations of a widget attached to 2 variations of a launcher:

  1. In the left-hand side panel, select A/B Testing and click the New test button.

  2. Enter the test title and test description into the corresponding fields.

    NOTE: You can enter the widget name in the Search box to search for a widget by its name.

    Or you can use the following filters:

    • By update: Sort the widgets in the list by their update dates.

    • Creation time: Select a creation time item from the dropdown list, or select Period to specify a custom period of widget creation.

  3. Click Select widget A and select the widget from the list in the Link the widget window.

  4. Click Select widget B and select the widget from the list in the Link the widget window.

  5. Enter a value in the Traffic allocation field for one of the widgets.

  6. Click Start test.

    If your widgets are not published, the Start testing dialog window opens. Select Publish both widgets and start the test in the Start testing window.

The test appears in the A/B testing window with the Active status.

Active tests are marked with a play icon beside their names; completed tests are marked with a tick icon.

To stop the active test and leave both widgets published, expand the Stop dropdown menu and select Stop.

To stop the active test and unpublish both widgets, expand the Stop dropdown menu and select Stop and unpublish both, then click Stop in the dialog box.

NOTE: When you stop the active test, you will not be able to relaunch it.

To view the test statistics, click the test name.
