A/B testing recommendations
Once you have finished the frontend integration for Recommendations AI, you may want to evaluate how well the recommendations perform on your site. The best way to test changes is usually a live A/B test.
A/B testing for Recommendations may be useful for testing and comparing various changes:

- Existing recommender vs. Google Recommendations AI
- Google Recommendations vs. no recommender
- Various models or serving changes (CTR vs. CVR optimized, or changes to price re-ranking & diversity)
- UI changes or different placements on a page
There are some more A/B testing tips for Recommendations AI here.
In general, A/B testing involves splitting traffic into two or more groups, serving a different version of a page to each group, and then reporting the results. You may have a custom in-house A/B testing framework or a third-party A/B testing tool; as an example, we'll show how to use Google Optimize to run a basic A/B test for Recommendations AI.
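If you do roll your own split, the core mechanic is deterministic bucketing: hash a stable visitor id so each user consistently lands in the same group. A minimal sketch — the rolling hash and variant labels below are illustrative assumptions, not part of any particular framework:

```javascript
// Deterministic bucketing sketch for an in-house A/B framework.
// The hash function and variant names here are illustrative assumptions.
function hashString(s) {
  let h = 0;
  for (let i = 0; i < s.length; i++) {
    h = (h * 31 + s.charCodeAt(i)) >>> 0; // keep it an unsigned 32-bit value
  }
  return h;
}

// Map a stable user id to one of the experiment's variants.
function assignVariant(userId, variants) {
  return variants[hashString(userId) % variants.length];
}
```

Because the assignment depends only on the user id, the same visitor sees the same variant on every page load, with no server-side state required.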
Google Optimize & Analytics
If you're already using Google Analytics, Google Optimize provides easy-to-manage A/B experiments and displays the results as a Google Analytics report. To use Google Optimize, link your Optimize account to Google Analytics and install the Optimize tag on your site. Once the tag is installed, you can create and run new experiments without any server-side code changes.
Google Optimize is primarily designed for frontend tests: UI or DOM changes, CSS tweaks, and so on. Optimize can also add JavaScript to each variant of an experiment, which is useful when testing content that is displayed via an Ajax call (e.g. our cloud function). An A/B experiment with server-side rendered content is possible, but usually requires a redirect test or the Optimize JavaScript API.
As an example, let's assume we want to test two different models on the same page: Similar Items and Others You May Like. Both models take a product id as input and are well-suited for a product details page placement. For this example we'll assume a cloud function or other service is running that returns the recommendations in HTML format. These results can then be inserted into a div and displayed on page load.
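The page-load flow just described might look like the following sketch; the cloud function URL, placement ids, product id, and div id are placeholders for whatever your own service uses:

```javascript
// Hypothetical cloud function endpoint that returns recommendations as HTML.
const RECS_ENDPOINT = 'https://example.cloudfunctions.net/getRecommendations';

// Build the request URL for a given placement and product id.
function buildRecsUrl(placementId, productId) {
  const params = new URLSearchParams({ placement: placementId, productId: productId });
  return RECS_ENDPOINT + '?' + params.toString();
}

// Fetch the pre-rendered HTML and insert it into the target div.
function loadRecs(placementId, productId, targetDivId) {
  return fetch(buildRecsUrl(placementId, productId))
    .then((res) => res.text())
    .then((html) => {
      document.getElementById(targetDivId).innerHTML = html;
    });
}

// In the browser, populate the recommendations div on page load.
if (typeof document !== 'undefined') {
  document.addEventListener('DOMContentLoaded', () => {
    loadRecs('similar_items', 'SKU-12345', 'recs-div');
  });
}
```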
The basic steps to configure an Optimize experiment here are:

1. Click Create Experience in the Google Optimize control panel.
2. Give your experience a name and select A/B test as the experience type.
3. Add 2 variants: one for Others You May Like and another for Similar Items.
4. Set the variant weights to 0% for Original and 50%/50% for the 2 variants.
5. Edit each variant and add your Ajax call to "Global JavaScript code" to populate the div.
6. Add a URL match rule under Page targeting to match all of your product detail pages.
7. Choose primary and secondary objectives for your experiment: for example, Revenue or Transactions as the primary objective, and Recommendation Clicks or another useful metric as the secondary.
8. Change any other optional settings, such as Traffic allocation, as necessary.
9. Verify your installation and click Start to begin your experiment.
In this scenario we have an empty <div> on the page by default, and we create two variants that call our cloud function with a different placement id on each variant. You could instead use an existing <div> containing the current recommendations as the Original and create just one variant, but that causes unneeded calls to the recommender and may make the display flicker as the existing <div> content is replaced.
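With the empty-div approach, each variant's "Global JavaScript code" can be identical except for the placement id it requests. A sketch, again with a placeholder endpoint, placement names, and div id:

```javascript
// Builds the cloud function URL for a placement (the endpoint, placement
// names, product id source, and div id below are placeholder assumptions).
function variantUrl(placementId, productId) {
  return 'https://example.cloudfunctions.net/getRecommendations' +
    '?placement=' + encodeURIComponent(placementId) +
    '&productId=' + encodeURIComponent(productId);
}

// Variant 1 requests 'others_you_may_like'; variant 2 swaps in 'similar_items'.
if (typeof document !== 'undefined') {
  fetch(variantUrl('others_you_may_like', document.body.dataset.productId))
    .then(function (res) { return res.text(); })
    .then(function (html) {
      document.getElementById('recs-div').innerHTML = html;
    });
}
```

Because the only difference between the two snippets is the placement id, the Original group's empty div stays untouched and neither variant triggers a second, wasted call to the recommender.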