What is A/B Testing?
A/B testing (also known as split testing) is a controlled experiment that compares two versions of a webpage by showing one version to half of your visitors and the other version to the rest. By comparing each version's performance on goals like signup rate or revenue, you can empirically determine which version performs better and quantify the improvement.
A/B testing can be used to do a soft roll-out of a new design for a particular webpage, or to test different:
- Headlines and marketing copy
- Layouts and site workflows
- Designs and style elements
A simple example experiment could test a new homepage design or a different headline. 50% of visitors would see the original page, and 50% would see the new design. After running the test, you may find that the new design outperforms the original by 20% in revenue with 99% statistical significance. Or you may find that the new design hurts your conversion rate, or any other goal you choose to measure the variations on.
When a visitor is first enrolled in an experiment, they see either the original page or one of the variations (with equal chance) and will see the same variation on each visit to ensure a consistent experience. Each experiment can be configured to run for a specific percentage of your total audience and to target specific audience cohorts. Experiments can be activated immediately on pageload when a certain URL is visited, or programmatically via a javascript function call. The performance of experiments is tracked by defining Goals.
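Conceptually, this kind of sticky assignment works like the sketch below: store the visitor's variation on first enrollment and reuse it on later visits. This is an illustrative sketch only, not Inspectlet's actual implementation; the cookie name and helper function are hypothetical.

    // Illustrative sketch of sticky variation assignment (hypothetical;
    // not Inspectlet's actual implementation).
    function getVariation(experimentId, numVersions) {
      var cookieName = 'exp_' + experimentId;
      var match = document.cookie.match(new RegExp('(?:^|; )' + cookieName + '=(\\d+)'));
      if (match) {
        // Returning the stored choice keeps the visitor's experience consistent.
        return parseInt(match[1], 10);
      }
      // First enrollment: pick the original (0) or a variation, with equal chance.
      var variation = Math.floor(Math.random() * numVersions);
      document.cookie = cookieName + '=' + variation + '; path=/; max-age=' + 60 * 60 * 24 * 365;
      return variation;
    }

Here numVersions counts the original plus all variations, so an A/B test with one variation would pass 2.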
Creating Your First Experiment
To get started, create a new experiment and enter the URL of the webpage you want to test. Once a URL is entered, the visual editor will load your original webpage into the “Original” tab. To start editing your webpage, create a new variation.
After adding a variation, the editor loads your original webpage in the new variation tab. You can switch to Interact Mode if you need to navigate inside the page to reach the element you want to change. Once you've located it, switch back to Edit Mode to make your changes.
Click on the element to open the edit menu, which lets you make changes to that element.
Tracking Goals to Measure Performance
Goals let you define the metrics that determine which variation wins the experiment. Each variation's performance is tracked individually for each goal, making analysis possible from many contexts (e.g., a particular page change may decrease your engagement rate but increase signups).
There are four kinds of goals that can be tracked for experiments:
- Track URL Views - Track total or unique views of a URL pattern.
- Track Custom Events - Track any custom event triggered via javascript (see the example after this list).
- Track Clicks on a Page Element - Track clicks on any element on your website.
- Track Engagement - Track how many unique visitors engage with each variation.
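For custom event goals, you fire the event from your own javascript when the visitor completes the action you care about. The snippet below is a hypothetical illustration: the 'customEvent' command name is a placeholder, and the exact call to use is shown in the dashboard when you create the goal.

    // Hypothetical example: fire a custom event goal when a signup form is
    // submitted. The 'customEvent' command name is a placeholder; copy the
    // exact snippet from your dashboard when you create the goal.
    document.getElementById('signup-form').addEventListener('submit', function () {
      window.__insp = window.__insp || [];
      window.__insp.push(['customEvent', 'signup-completed']);
    });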
Adjusting the Experiment Frequency
The Experiment Frequency lets you choose what percent of your total visitors get enrolled in this experiment. For example, a frequency of 25% means 1 out of 4 visitors to this page will be enrolled in the experiment. The other three will see the original page, and the enrolled visitor will see either the original or one of the variations.
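Conceptually, frequency gating is just a random enrollment check before variation assignment, along the lines of this sketch (illustrative, not the actual implementation):

    // Illustrative sketch of frequency gating (not the actual implementation).
    // With frequency = 0.25, roughly 1 in 4 visitors enters the experiment;
    // everyone else simply sees the original page.
    function shouldEnroll(frequency) {
      return Math.random() < frequency; // frequency as a fraction, e.g. 0.25 for 25%
    }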
Experiment Activation Settings
Activation Mode
Your experiment can activate immediately on pageload (the default), or it can be set to manual mode and activated via a javascript function call. Once the experiment is activated, the variation code runs and modifies the page elements. If the elements aren't on the page yet, they will be modified as soon as they are added to the page.
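For example, in manual mode a single-page app might activate the experiment once the relevant view has rendered. The command name below is a hypothetical placeholder; use the exact function call provided in your experiment's activation settings.

    // Hypothetical example of manual activation in a single-page app.
    // 'activateExperiment' is a placeholder command name; copy the exact
    // call from your experiment's activation settings.
    function onCheckoutViewRendered() {
      window.__insp = window.__insp || [];
      window.__insp.push(['activateExperiment', 'YOUR_EXPERIMENT_ID']);
    }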
Activation URL
Experiments run on a user-defined URL pattern, such as:
- URL contains /plans
- URL matches regex \/profile-\d+
- URL does not match regex \/profile-\d+
- any URL (activates on every page)
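As a quick illustration of how the regex patterns above behave, here is the same pattern tested against a few URLs in javascript:

    // How the regex pattern above matches various URLs.
    var pattern = /\/profile-\d+/;
    console.log(pattern.test('https://example.com/profile-42'));  // true  -> activates
    console.log(pattern.test('https://example.com/profile-abc')); // false -> doesn't activate
    console.log(pattern.test('https://example.com/plans'));       // false -> doesn't activate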
Javascript Loader Options
There are two different javascript loader options. Both loaders are small, cross-browser, asynchronous, non-blocking, and don't disrupt downloading of your other page assets and resources.
The A/B loader is useful for preventing page change flickering on page load. Without it, a visitor could briefly see the original page before Inspectlet loads and changes the page content to match the variation the visitor is enrolled in. With the A/B loader, your page content is automatically hidden for up to a second while Inspectlet's javascript loads alongside the rest of your page's images and assets. This timeout can be adjusted to your preference in the code.

In most cases, the real-world cost of loading the javascript is imperceptible, about 30-40ms (less than one tenth of a second). This delay is generally unavoidable in A/B testing if you want to avoid page change flickering, since the script has to load before your page content is made visible. The A/B loader is highly optimized to not delay the visual rendering pipeline, and it doesn't delay the page's DOMContentLoaded event.
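The general anti-flicker technique behind the A/B loader looks like the sketch below: hide the page immediately, then reveal it once the variation code has run or a safety timeout fires. This is a generic illustration, not the actual loader code; the real A/B loader snippet from your dashboard handles this for you, and the timeout value here is the adjustable one mentioned above.

    // Generic anti-flicker sketch (illustrative; the actual A/B loader
    // snippet from your dashboard implements this for you).
    (function () {
      // Hide the page immediately so visitors never see the original flash.
      var style = document.createElement('style');
      style.id = 'ab-hide';
      style.textContent = 'body { opacity: 0 !important; }';
      document.head.appendChild(style);

      function reveal() {
        var s = document.getElementById('ab-hide');
        if (s) s.parentNode.removeChild(s);
      }

      // Safety timeout: never keep the page hidden longer than 1 second.
      setTimeout(reveal, 1000);

      // The loader would call reveal() as soon as the variation code has run.
      window.__abReveal = reveal;
    })();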
In summary: if you're using Inspectlet without running any A/B tests (for session recordings, heatmaps, form analytics), use the regular loader. If you're running A/B tests (and also using other products), we recommend using the A/B loader to avoid any visible page flickering.
Deploying your Experiment Live
Once you've made the changes to your variation, click "Save". When you've added all variation data and configured experiment settings like goals, experiment frequency, and activation settings, click Deploy to push your experiment live. Once an experiment is deployed, your website visitors will see it immediately.
Viewing Experiment Results
The Results page lets you see how each variation in your experiment is performing. Results are available immediately after a goal has been completed.