
Web Analytics

Web analytics, or metrics, are measures of behavior of website users that are automatically collected across entire visitor populations or large samples. Data such as the number of visitors to a website, where they are from, which pages they view, and which links they click, can be measured, collected, analyzed and reported. Web analytics are relevant to usability practitioners in that they can provide insight into the large-scale behavior of website users to understand and improve (optimize) the website.

Web analytics cannot provide answers to questions about user motivations or underlying needs and goals. Web analytics may indicate that users are abandoning a checkout process at a particular point, but they cannot be used to explain why this is happening. Usability testing of the issues that are found through web analytics brings the deeper understanding needed to fix these usability problems. Additionally, in-person observations of users can lead to insight that informs what metrics are worthwhile to collect, and how to interpret them.

Related Links

Formal Publications

Clifton, B. (2008) Advanced Web Metrics with Google Analytics. Wiley

A guide to getting the most benefit out of Google's free web analytics system.

King, A. (2008) Website Optimization. O'Reilly Media

Includes a chapter on website optimization metrics that explains how to analyze the statistical data returned by web analytics tools such as Google Analytics and WebTrends.

Web Resources

The Web Analytics Association is working to standardize the web metrics industry, educate people about web metrics, unite professionals and influence legislation that affects the web analytics industry as a whole.

The Wikipedia entry on Web Analytics defines terms and explains web metrics in more depth.

Web Analysis Tools

Examples of web analysis tools include Webtrends, Google Analytics, Mint, Crazy Egg, Tealeaf, and ClickTale. More examples can be found on Wikipedia.

Published studies

Kaniasty, Eva. Web Analytics and Usability Testing: Triangulate your way to better design recommendations. UPA 2009 Conference.

Detailed description

There are two categories of web analytics:

  • Off-site web analytics
  • On-site web analytics

Off-site web analytics refers to web measurement and analysis, regardless of whether you own or maintain a website. It includes the measurement of a website's potential audience (opportunity), share of voice (visibility), and buzz (comments) that is happening on the Internet as a whole.

On-site web analytics measure a visitor's journey once on your website, including what drives the visit and its conversions (that is, when users take a desired action on the website, such as requesting a white paper). For example, which landing pages encourage people to make a purchase? On-site web analytics typically compare data about user behavior against key performance indicators such as purchases, downloads, enrollments, or any activity that ties into the goals of the organization. Analysis of this data then guides improvements to the website and helps measure a marketing campaign's audience response.

Historically, web analytics has referred to on-site visitor measurement. However, this has blurred as vendors are producing tools that span both categories.

Web Analytics Data

Commonly measured data includes the number of times web pages are displayed (page views), how many individuals visited the website, the paths users took through the website, where users abandon the website, conversion rates (primarily for e-commerce sites), feature usage, and the frequency of actions of interest, such as errors. Web analytics are usually provided through a tool that uses code embedded in the site's pages to collect data. Most of these tools present information about the website in charts and graphs and also provide ways to download the data for deeper analysis.
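To make the path and abandonment measurements above concrete, the following minimal sketch (with purely hypothetical step names and visitor counts) computes the per-step drop-off and overall conversion rate of a checkout funnel, the kind of figures an analytics tool derives from its collected page-view data:

```python
# Hypothetical checkout funnel: visitors reaching each step (illustrative numbers).
funnel = [
    ("product page", 1000),
    ("cart", 400),
    ("shipping details", 250),
    ("payment", 180),
    ("confirmation", 150),
]

def funnel_report(steps):
    """Return (step name, visitors, drop-off rate vs. previous step) tuples."""
    report = []
    prev = None
    for name, count in steps:
        # Drop-off = fraction of the previous step's visitors who abandoned here.
        drop = 0.0 if prev is None else (prev - count) / prev
        report.append((name, count, drop))
        prev = count
    return report

for name, count, drop in funnel_report(funnel):
    print(f"{name:18s} {count:5d}  drop-off {drop:.0%}")

# Overall conversion: visitors completing the funnel / visitors entering it.
conversion = funnel[-1][1] / funnel[0][1]
print(f"overall conversion rate: {conversion:.1%}")
```

A report like this locates *where* abandonment occurs (here, most sharply between the product page and the cart), which is exactly the kind of finding that usability testing can then explain.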

Web analytics data differs from behavioral data collected via direct observation in several ways. First, analytics capture only the outcomes of behaviors (such as clicks and purchases). The long and complex chain of cognitive processes leading up to a logged action can only be surfaced and understood through in-person observation by a trained researcher. For example, a lack of clicks on an important link could be due to low discoverability or low perceived utility. Just because analytics show that only a small percentage of users perform a certain action does not mean the concept lacks merit; in other words, analytics cannot diagnose an issue as a design problem versus a strategy problem. Second, although it is tempting to treat analytics data as the complete picture because of its quantitative scale, the data is often incomplete or "unclean": data logging and processing can be incomplete, and non-human (robotic) activity pollutes the data with outliers that skew averages if not cleansed.

A/B Testing

A/B testing (experimentation) can be used in combination with web metric analysis to measure the impact of design, wording, or algorithm factors. In a typical simple A/B test, two variations of an interface are presented to two randomly selected samples of users, and the impact of the two variations on key success measures, such as conversion, is statistically analyzed.
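One common way to perform the statistical analysis mentioned above is a two-proportion z-test comparing the conversion rates of the two variants. The sketch below uses only the Python standard library; the visitor and conversion counts are hypothetical:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value) for H0: equal conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis.
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical results: variant A converts 120 of 2400 visitors,
# variant B converts 150 of 2400.
z, p = two_proportion_z_test(120, 2400, 150, 2400)
print(f"z = {z:.2f}, p = {p:.3f}")
```

If the p-value falls below the chosen significance threshold (conventionally 0.05), the difference in conversion rates is unlikely to be due to chance alone; otherwise the test is inconclusive and a larger sample or longer run may be needed.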

A/B testing that pits two drastically different designs against each other often raises more questions than it answers: with so many differences between the variants, decision makers cannot tell which aspects of the "winning" variant contributed to its success. A/B testing is most powerful when the variants are informed by formative user research and in-person usability testing, and when the test runs long enough to account for learning curves and novelty effects.


Lifecycle: Evaluation
Sources and contributors: 
Eva Kaniasty, Carol Smith, Beverly Freeman, Donn DeBoard
Released: 2012-04
© 2010 Usability Professionals Association