Thursday, September 3, 2009

Website Engineering

This article is written in collaboration with Maxim Gorelkin.

There are two basic ways to build bridges: from the bottom up and from the top down. The former is the one more familiar to us and, unfortunately, still employed: build a bridge and then see what happens; if it withstands all loads (at least for some time), so much the better. The latter is a much more difficult approach, and one from a different discipline entirely, engineering: we define the requirements for the bridge and the metric for its evaluation - the load-carrying capacity - and, using the strength of materials, we design, build, and test a model. Only after we are satisfied with the result do we decorate the built bridge with, for example, statues of lions lying on their front paws. My preference for the second way explains my preference for the term “website engineering” over the traditional “website development”.

We shall start by defining the key metric of a website’s effectiveness. Typically, this is the conversion rate - the percentage of visitors who take a desired action (e.g., buy something from your online store or spend a specified dollar amount). The essence of the engineering approach is that, when building websites, we must guarantee specified levels of their effectiveness. Adaptive websites, described in my previous article, define a model intended to solve this problem.
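For illustration, here is a minimal sketch in Python of this metric and its sampling uncertainty (the function names and the normal-approximation interval are illustrative choices of mine, not part of any standard web-analytics library):

```python
import math

def conversion_rate(conversions, visitors):
    """Fraction of visitors who took the desired action."""
    return conversions / visitors

def conversion_interval(conversions, visitors, z=1.96):
    """Approximate 95% confidence interval for the conversion rate
    (normal approximation to the binomial; z=1.96 for 95%)."""
    p = conversions / visitors
    margin = z * math.sqrt(p * (1 - p) / visitors)
    return max(0.0, p - margin), min(1.0, p + margin)
```

For 30 conversions out of 1,000 visitors this gives a 3% rate with roughly a ±1% margin - a first hint at how much traffic a reliable measurement of effectiveness actually requires.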

The issue with today's web development is the lack of an engineering approach and of such models. Yes, you can construct several alternative landing pages for A/B split or multivariate testing and collect statistics for several months in order to find the best solution. However, as Tim Ash has demonstrated, your result may depend on the chosen testing method and data analysis techniques. Or there may be no statistically significant differences between the alternatives, and consequently you are unable to choose the best page. Suppose you get lucky and, after months of testing, you optimize your website - only to discover that its web traffic has changed and you must start the process from scratch. The same applies to web analytics: yes, you have found that, for example, some number of users visited certain pages of your site and made certain clicks, but how do you interpret this? What motivated them to do it? And what actions does such “knowledge” suggest you take to improve your site? And if you find completely chaotic user behavior on your site, what do you do then?
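To make the “no statistically significant difference” stalemate concrete, here is a sketch of the standard pooled two-proportion z-test, one common way (among several - which is exactly the point about method-dependent results) to compare two landing pages; the function name is mine:

```python
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Pooled two-proportion z-test: returns (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via the error function.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value
```

With 30 conversions out of 1,000 visitors on page A versus 33 out of 1,000 on page B, the p-value comes out far above 0.05: after all that traffic, the data still cannot distinguish the pages.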

Web testing (preferably adaptive, for example adaptive multivariate testing), web analytics, and web usage mining (discovering patterns of user behavior on your site) should become part of your website. Put another way, your website must be self-testing, self-analyzing, and “intelligent” enough to extract practical knowledge of user behavior from these tests and analyses and use it for its adaptivity. By the way, for the mentioned patterns to become knowledge about user behavior on your website, they must be formulated as statistical hypotheses and constantly verified for accuracy.
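One way a self-testing site can keep learning from live traffic, instead of freezing variants for a months-long test, is a multi-armed-bandit scheme. The epsilon-greedy sketch below is a hypothetical illustration of the idea (class and method names are mine), not a method prescribed here: the site mostly serves the variant with the best observed conversion rate, but keeps exploring the others so it can notice when traffic changes.

```python
import random

class EpsilonGreedy:
    """Serve the best-performing page variant most of the time,
    but explore a random variant with probability epsilon."""

    def __init__(self, n_variants, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = [0] * n_variants     # times each variant was shown
        self.successes = [0] * n_variants  # conversions per variant

    def choose(self):
        if random.random() < self.epsilon:
            return random.randrange(len(self.counts))  # explore
        rates = [s / c if c else 0.0
                 for s, c in zip(self.successes, self.counts)]
        return rates.index(max(rates))                 # exploit

    def update(self, variant, converted):
        self.counts[variant] += 1
        if converted:
            self.successes[variant] += 1
```

In a simulation with two variants converting at very different rates, traffic quickly concentrates on the better one - while the residual exploration keeps “listening” for a change in user behavior.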

Next, let's assume you have defined the metrics for your website's effectiveness and measure them regularly to determine how effective (or ineffective) your site is. The problem, however, is more difficult still: learning how to manage these metrics to achieve sustainable improvement. How can this be done? One area of quality control - statistical process control - developed techniques for stabilizing a process before bringing it under control and improving production quality. It seems to me that there is a direct analogy here with web traffic and with controlling it to improve the website’s effectiveness.
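This analogy can be made concrete with a p-chart for daily conversion rates: days whose rate falls outside three-sigma control limits signal that the underlying traffic process has shifted, rather than mere noise. The sketch below applies the standard SPC p-chart convention to web metrics on my own initiative; the function name is mine:

```python
import math

def p_chart_flags(daily_conversions, daily_visitors):
    """Flag days whose conversion rate falls outside the
    three-sigma control limits of a p-chart."""
    p_bar = sum(daily_conversions) / sum(daily_visitors)  # center line
    flagged = []
    for day, (c, n) in enumerate(zip(daily_conversions, daily_visitors)):
        sigma = math.sqrt(p_bar * (1 - p_bar) / n)
        lcl = max(0.0, p_bar - 3 * sigma)  # lower control limit
        ucl = p_bar + 3 * sigma            # upper control limit
        if not (lcl <= c / n <= ucl):
            flagged.append(day)
    return p_bar, flagged
```

A day flagged by such a chart is precisely the moment when yesterday's optimization may no longer apply and the site should re-test and re-adapt.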

Summing up: website engineering is about the computability of a website’s effectiveness from the characteristics of its web traffic, and of its web traffic from the website’s elements - its content, navigation, and so on.

P.S. Another example comes from the field of algorithmic trading: this type of trading has become a money-making machine - making a lot of money - without direct human intervention. This ability stems from the fact that these machines are becoming more intelligent and adaptive. Today, their development draws on such disciplines as complexity theory, chaos theory, mathematical game theory, cybernetics, models of quantum dynamics, and so forth. For e-commerce to set similarly ambitious goals, it must attain a similar level of sophistication, and the application of artificial intelligence algorithms is a modest start at best. But here we are treading on the territory of advanced engineering, based on modern science.

Comments:

  1. Good. A few thoughts. Two problems for intelligent web applications:
    1) The optimization problem - determining how the site should present itself to encourage the user to take the desired action(s) (the goals).
    2) The learning problem - the site needs to concurrently learn the dynamics of the environment: A) how users transition in, out of, and within the site; B) how each site option alters the frequency/value of the desired actions (goals).

    This is complicated since the online environment is likely not completely stationary - as you mention in your article. So the site will need to continuously 'listen' to the environment by trying out options that appear suboptimal.

    My take is that if the objective is to have the site self-optimize, then a bandit-type approach is probably better than statistical process control. On the other hand, if you are trying to extract insights that you will apply elsewhere or use as priors on other problems, then a hypothesis-testing method is your best bet.

    Cheers
