
Experimentation Platform

Accelerating software innovation through trustworthy experimentation


Seven Rules of Thumb for Web Site Experimenters

By Ron Kohavi, Alex Deng, Roger Longbotham, and Ya Xu


Appears in KDD 2014, Aug 24-27, New York, NY. 

Seven Rules of Thumb for Web Site Experimenters: Paper (PDF), Slides (PDF), and Video


ACMRef: Ron Kohavi, Alex Deng, Roger Longbotham, and Ya Xu.  2014. Seven Rules of Thumb for Web Site Experimenters. In Proceedings of the 20th ACM SIGKDD international conference on Knowledge discovery and data mining (KDD '14), pp. 1857-1866.  DOI: 10.1145/2623330.2623341


© 2014. This is the author's version of the work. It is posted here for your personal use. Not for redistribution. The definitive version is published at KDD '14, August 24–27, 2014, New York, NY, USA.



Web site owners, from small sites to the largest properties, including Amazon, Facebook, Google, LinkedIn, Microsoft, and Yahoo, attempt to improve their web sites, optimizing for criteria ranging from repeat usage and time on site to revenue. Having been involved in running thousands of controlled experiments at Amazon, LinkedIn, and multiple Microsoft properties, we share seven rules of thumb for experimenters, generalized from these experiments and their results. These are principles that we believe have broad applicability in web optimization and analytics beyond controlled experiments, yet they are not provably correct, and in some cases exceptions are known.

To support these rules of thumb, we share multiple real examples, most of them published here for the first time. Some rules of thumb have been stated before, such as “speed matters,” but we describe the assumptions in the experimental design and share additional experiments that refined our understanding of where speed matters most: certain areas of the web page are more critical than others.

This paper serves two goals. First, it can guide experimenters with rules of thumb that can help them optimize their sites. Second, it provides the KDD community with new research challenges on the applicability, exceptions, and extensions to these rules, one of the goals for KDD’s industrial track.
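As background for the controlled experiments discussed above, the sketch below shows one common way a two-variant (A/B) experiment on a conversion metric can be evaluated: a two-proportion z-test. The function, metric, and numbers are illustrative assumptions for this page, not examples or methods taken from the paper.

```python
from math import erf, sqrt

def two_proportion_ztest(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for the difference between two conversion rates.

    conv_a/conv_b: number of converting users in control/treatment.
    n_a/n_b: number of users exposed to control/treatment.
    Returns (z statistic, two-sided p-value).
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis of no difference.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Illustrative numbers: 10,000 users per variant, 5.0% vs 5.6% conversion.
z, p = two_proportion_ztest(conv_a=500, n_a=10_000, conv_b=560, n_b=10_000)
```

With these made-up counts the lift is not significant at the conventional 0.05 level, which is itself a reminder of why large sample sizes matter in web experimentation.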

What people wrote

  1. KDD Reviewer (anonymous): It really only presents a series of anecdotal evidence for all sorts of optimization tips & tricks for websites. Having said that: this is the beauty of the paper: it mentions a large series of real-world experiments with positive and negative(!) outcomes, and hence there is lots to be learned from.
    ...I am sure it'll make for a fun and interesting talk and lots of nodding in the audience...I'd like to see more papers like this one.
  2. KDD Reviewer (anonymous): The paper is quite entertaining to read, and brings together experiences from many different products and organizations. In that sense, it functions somewhat as a review paper for online experimentation.
  3. Neal Ungerleider: Lessons In Website Testing From A Master (8/25/2014)
  4. Steve Blank: 7 lessons in getting your website right from a master @ronnyk (9/15/2014)



@inproceedings{Kohavi2014SevenRules,
  author = {Kohavi, Ron and Deng, Alex and Longbotham, Roger and Xu, Ya},
  title = {Seven Rules of Thumb for Web Site Experimenters},
  booktitle = {Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining},
  series = {KDD '14},
  year = {2014},
  isbn = {978-1-4503-2956-9},
  location = {New York, New York, USA},
  pages = {1857--1866},
  numpages = {10},
  doi = {10.1145/2623330.2623341},
  acmid = {2623341},
  publisher = {ACM},
  address = {New York, NY, USA},
  keywords = {a/b testing, controlled experiments, randomized experiments},
}
