This article is no longer maintained, so its content might be out of date.
Have you noticed something changing recently on Webmaker, and you're not sure why? If so, it's possible you're seeing one of the variations of our A/B testing strategy.
What is A/B Testing and why do we do it?
A/B testing (and its slightly more complex relative, multivariate testing) allows us to show different versions of our pages to different users and to measure which version has the greatest impact on the goals of a particular website or tool. These goals include things like encouraging more people to create an account or to remix a make.
When we run an A/B test, we're trying to find out whether a piece of content or a particular design idea will encourage more people to take an action. We do this by showing different versions of something, such as a button, to thousands of visitors and measuring the impact of each version.
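To make the mechanics concrete, here is a minimal sketch (not Webmaker's actual implementation) of how a test like this works: each visitor is randomly assigned to one variant, and we count how many in each group take the action. The variant names and conversion rates below are invented for illustration:

```python
import random

def assign_variant(variants=("A", "B")):
    # Each visitor is randomly placed into one variant bucket.
    return random.choice(variants)

# Simulate 10,000 visitors; the per-variant click rates here are made up.
rates = {"A": 0.10, "B": 0.12}
views = {"A": 0, "B": 0}
clicks = {"A": 0, "B": 0}

random.seed(42)
for _ in range(10_000):
    v = assign_variant()
    views[v] += 1
    if random.random() < rates[v]:  # did this visitor click the button?
        clicks[v] += 1

for v in ("A", "B"):
    print(f"Variant {v}: {clicks[v]}/{views[v]} = {clicks[v] / views[v]:.1%}")
```

In a real test the assignment would be sticky (the same visitor always sees the same variant), but the counting logic is essentially this.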
Sometimes even the smallest change to a colour or a word on a button can have a significant impact on how people use a website. By carefully measuring and testing our design ideas, we can work towards building the best tools possible and get even more people involved in Webmaker.
Testing design variations requires lots of visitors to view a page so that we can get a meaningful measurement. This means we can't test all our design changes all the time, but where we can, we will try to use data to help make the tools you use more effective for even more people.
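The reason lots of visitors are needed can be shown with a standard two-proportion z-test (a common way to check significance; the source doesn't say which method Webmaker uses, and the numbers below are invented). With the same underlying 10% vs 12% click rates, a small sample can't distinguish the variants, while a large one can:

```python
import math

def z_score(conv_a, n_a, conv_b, n_b):
    # Two-proportion z-test: how many standard errors apart are the rates?
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# 10% vs 12% click rates, measured on 100 visitors per variant:
# |z| is about 0.45, well under the usual 1.96 significance threshold.
print(z_score(10, 100, 12, 100))

# The same rates measured on 10,000 visitors per variant:
# |z| is about 4.5, a clearly significant difference.
print(z_score(1000, 10_000, 1200, 10_000))
```

This is why only changes on busy pages can be tested reliably: with too few visitors, even a real difference looks like noise.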
Ways you can get involved in our A/B testing:
- Submit ideas for things you think we could test
- See the results of tests we have run
- Give feedback on (and challenge!) the conclusions we draw from our results
- Learn about the testing process, and how design decisions affect how people use the web
Visit our wiki page to learn more and get involved: