This article is written in collaboration with Maxim Gorelkin.
If you use Web Analytics to determine the effectiveness of your website, you may have noticed that around 99% of your visitors leave within the first three seconds without having done anything. And this after you've spent a small fortune on both search engine optimization, which maximizes traffic from Google, and landing page optimization, which increases the attractiveness of your site. Why, then, are these efforts and investments not yielding the expected results?
You have to deal with a very diverse and demanding web audience. Traditional static websites try to satisfy this audience by approximating the "typical" user. The problem is that every visitor is unique and is looking for something very specific. And since the Internet arms him with nearly unlimited access to resources that meet his demand, he no longer wishes to fit user templates or make compromises. Not to mention that the Internet itself keeps evolving to determine his needs more accurately and meet his requirements more quickly. These are the conditions under which your site must exist, and they dictate the rules of the game.
Another trend is to populate your site with all relevant information in order to meet the diverse requirements of all users. The problem with this approach is that the majority of web surfers cannot cope with this "abundance," cannot decide whether anything there interests them, and quickly leave the site for a more practical "solution." The use of Web Analytics and Web Mining obliterates many of our illusions about the usefulness of our websites.
As Peter Norvig, Director of Google Research, demonstrated as far back as 2001 in his web manifesto "Adaptive Software," in these circumstances you must change the paradigm and begin applying intelligent, adaptive systems. And this is exactly the trend we now see on the Internet. In our case it is the development of adaptive websites, which adjust their content and interfaces to better meet the needs and preferences of each individual user. Do not demand the impossible of your visitor: he will never search your site for anything. Today it is your problem to identify what each user is looking for and to present it to him in the most appealing fashion. Remember: in your Internet business, you compete with… Google!
Now let's describe the basic properties of adaptive websites. First, we know from Web Analytics that the bulk of web traffic (around 85-90%) comes from search engines such as Google, Bing, Yahoo, Ask, AOL, and so forth, with the majority coming from Google (around 75-80%), which in response to any query still throws out on the order of a million results, amid which the user must find his "answer" as quickly as possible. If a search engine brought a visitor looking for "wedding shoes" to my online footwear shop, whose homepage is populated with the latest-fashion sneakers, then I will most likely lose this potential customer, even if the exact link he's seeking sits at the very bottom of the page. This is the New Internet, where the site must "guesstimate" (be sufficiently "smart" for this) the needs of the user, know how he "arrived" there, and in real time reconstruct itself with potential matches for this specific visitor. Is this possible? Yes and no. Under the old static-site paradigm: no. Under the new paradigm of adaptive websites: very often, yes! As for an always-yes, forget it; even a "yes" that accounts for a 20-30% conversion rate would be the envy of Amazon and Flowers.com during the Christmas shopping season. The secret lies in the information the search engine brings with every referred customer: where he came from (the referring URL), which browser and operating system he's using, and, most importantly, what he's searching for. More precisely, the search query he submitted, which the site must translate into what he's intending to find. (*)
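To make this concrete, here is a minimal sketch of pulling the search query out of the HTTP referrer. It assumes the engine passes the query in a `q` or `p` parameter, as Google, Bing, and Yahoo did at the time of writing; the engine-to-parameter map is illustrative, not exhaustive.

```python
from urllib.parse import urlparse, parse_qs

# Illustrative map: which query-string parameter each engine uses.
SEARCH_PARAMS = {"google": "q", "bing": "q", "yahoo": "p", "ask": "q", "aol": "q"}

def search_query(referrer):
    """Return the search phrase from a referrer URL, or None if not a search hit."""
    parsed = urlparse(referrer)
    host = parsed.netloc.lower()
    params = parse_qs(parsed.query)
    for engine, param in SEARCH_PARAMS.items():
        if engine in host and param in params:
            return params[param][0]
    return None

print(search_query("http://www.google.com/search?q=wedding+shoes"))
# → wedding shoes
```

Once the site knows the visitor arrived searching for "wedding shoes," it can promote the wedding section instead of the sneakers on the default homepage.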
Second, the site should analyze in real time what the user is doing: what he's reading, what he's clicking on, which links he's selecting. It should "formulate" and test hypotheses about the user's objectives and generate, for example, new advertisements or recommendations, "watching" his reactions and using this information to offer better-suited content (improving itself at the same time). Even if the site doesn't have what the user is searching for, it can find it on the Internet and offer links.
Third, one doesn't search for information on adaptive sites, since the pages offer it up front, as Amazon does with its "customers who bought this also bought," Netflix with its Cinematch, Pandora, and so forth. They recommend content that their visitors may not even have known about. (**)
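The "customers who bought this also bought" idea reduces, in its simplest form, to co-occurrence counting over past orders. This is a minimal sketch of that idea, not Amazon's actual algorithm; the order histories are fabricated for illustration.

```python
from collections import Counter

# Fabricated order baskets for illustration.
orders = [
    {"wedding shoes", "veil"},
    {"wedding shoes", "veil", "garter"},
    {"sneakers", "socks"},
]

def also_bought(item, orders, top=2):
    """Items most often bought together with `item`, by co-occurrence count."""
    counts = Counter()
    for basket in orders:
        if item in basket:
            counts.update(basket - {item})
    return [other for other, _ in counts.most_common(top)]

print(also_bought("wedding shoes", orders))
# → ['veil', 'garter']
```

Production recommenders refine this with normalization and collaborative filtering, but co-occurrence already recommends items the visitor never searched for.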
In summary, the idea of adaptive websites isn't novel, having been formulated around ten years ago, but it can now be implemented on the combination of Machine Learning and Rich Internet Applications such as AJAX and Microsoft Silverlight.
(*) To do this, I analyze my web log for constant, static patterns of visitor behavior based on the search keywords submitted to the search engines. I then find combinations of words that "work" on the site and those that do not. If there is a significant amount of the latter that we cannot ignore, we build specialized content pages for the most common combinations. The problem is that my web traffic is complex, dynamic, and sporadic, so many patterns have a dynamic character; to discover and apply them, I rely on machine learning algorithms. And to yield more accurate results, I must complicate the matter further and enrich the search queries with demographic data about the visitors, obtained from the same search engines.
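The first, static step of that analysis can be sketched as grouping log entries by search phrase and comparing conversion rates; phrases that convert poorly are candidates for specialized pages. The log records below are fabricated for illustration.

```python
# Fabricated web-log extract: (search phrase, did the visit convert?)
log = [
    ("wedding shoes", True), ("wedding shoes", True),
    ("wedding shoes", False), ("cheap sneakers", False),
    ("cheap sneakers", False),
]

def conversion_by_query(log):
    """Conversion rate per search phrase: phrases that 'work' vs. those that don't."""
    stats = {}
    for query, converted in log:
        visits, conversions = stats.get(query, (0, 0))
        stats[query] = (visits + 1, conversions + converted)
    return {q: conversions / visits for q, (visits, conversions) in stats.items()}

print({q: round(r, 2) for q, r in conversion_by_query(log).items()})
# → {'wedding shoes': 0.67, 'cheap sneakers': 0.0}
```

The dynamic patterns mentioned above need more than this static tally, which is where the machine learning algorithms come in.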
(**) Analyzing every individual visitor is extremely complicated, which is why adaptive websites cluster similar users into groups, build a model for each group, and then in real time try to classify each visitor into one of them; allowing for unavoidable occasional errors, they continue to improve both the collection of models and the matching mechanism.
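The real-time half of that scheme can be as simple as nearest-centroid classification: assign the visitor to the cluster whose centroid is closest to his behavior vector. The feature vectors and cluster names below are invented; a real site would learn the centroids offline, for example with k-means.

```python
import math

# Invented centroids over a 2-feature behavior vector,
# e.g. (share of views in "sale" pages, share of views in "bridal" pages).
centroids = {
    "bargain hunters": [0.9, 0.1],
    "bridal shoppers": [0.1, 0.9],
}

def classify(visitor):
    """Assign a visitor's behavior vector to the nearest cluster centroid."""
    return min(centroids, key=lambda c: math.dist(centroids[c], visitor))

print(classify([0.2, 0.8]))
# → bridal shoppers
```

Misclassifications are the "unavoidable occasional errors": logging them and re-clustering periodically is how both the models and the matching improve over time.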