Why Web analytics won’t help interaction design

The data provided through Web analytics offer promise to interaction designers by pointing to potential user experience problems. But interaction designers who think they should base critical design decisions on Web data are misguided at best and downright irresponsible at worst.

Web user experience practitioners have recently embraced web-traffic measurement as a user experience research method. This is not necessarily a bad thing — gathering design insight from a variety of sources is always advisable. The problem comes when interaction designers (and the data analysts who advise them) base critical user experience decisions on these often incomplete and even misleading data.

My beef with Web analytics boils down to two points:

  1. Web analytics are notoriously unreliable. Web traffic researchers continue to struggle with accurate visitor counts and missing data points, compromising both the validity and reliability of the method (Chatham, 2005). Practitioners are dogged by day-to-day limitations in both the techniques and the underlying technology, which limit their ability to reliably produce analyses that are universally accepted as legitimate (Wiggins, 2007).
  2. More troubling is the claim that Web analytics capture meaningful data about the user experience. Certainly Web analytics capture important information about server loads, form completion, navigation patterns, and browser types. The unspoken belief, however, is that user keystrokes and mouse clicks represent the sum total of what there is to know about a Web site visitor’s experience. If you base user experience decisions on Web traffic measurement, you assume that an individual person intentionally initiates these keystrokes and mouse clicks for meaningful reasons.

Ask yourself: have you ever initiated a mouse click unintentionally? Have you ever typed in a URL by mistake? These mistakes of intention are not registered by Web analytics tools.

In his book Observing the User Experience, Kuniavsky invokes the example of a “jewelry store clerk” who gains a “much better understanding of customers” by watching everything the customer does.

Web analytics are not equivalent to a jewelry store clerk gathering subtle, nuanced information about a person visiting the store. They are the equivalent of a blindfolded, deaf jewelry store clerk who communicates with store visitors through a complex system of tapping, visitors who may or may not know that particular clerk’s unique tapping language.

Interaction designers must base critical user experience decisions on the results of qualitative research, rich with “thick description” and subtle cues. Basing such decisions on Web analytics would remove all insight into users’ actual intentions.

Web analytics tools have their place in interaction design. That place should be limited to:

  • Measuring appreciable increases in specific, observable goals, such as form completion
  • Serving as a final check after a battery of in-person, qualitative usability tests
  • Monitoring infrastructure
  • Measuring lead generation and campaign effectiveness
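The first of these uses, counting specific, observable goals such as form completion, is the kind of thing analytics genuinely does well. A minimal sketch, using a hypothetical and highly simplified log format (the event names and visitor IDs are invented for illustration):

```python
# Hypothetical, simplified log entries: (visitor_id, event)
events = [
    ("v1", "form_view"), ("v1", "form_submit"),
    ("v2", "form_view"),
    ("v3", "form_view"), ("v3", "form_submit"),
    ("v4", "form_view"),
]

# Visitors who saw the form, and visitors who submitted it
viewers = {vid for vid, e in events if e == "form_view"}
submitters = {vid for vid, e in events if e == "form_submit"}

# Completion rate: share of form viewers who went on to submit
completion_rate = len(submitters & viewers) / len(viewers)
print(f"form completion rate: {completion_rate:.0%}")  # 2 of 4 viewers submitted: 50%
```

Note what this counts and what it cannot: it tells you *that* half the viewers submitted, but nothing about why the other half walked away.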

Any other use of Web analytics reduces interaction design to nothing more than the blind counting of meaningless signals.

This is an abridged version of Ladner, S. (forthcoming). “Watching the Web: Suggestions for Improving Web-based Measurement.” In Jansen, J., Spink, A. and Taksa, I. (eds.). Handbook of Log File Analysis. Idea Group: Hershey, Pennsylvania.

Further Reading:

Chatham, B. (2005). What’s On Web Analytics Users’ Minds? Forrester Research.

Gassman, B. (2005). How to Choose An Advanced Solution for Web Analytics. Stamford: Gartner Research.

Kuniavsky, M. (2003). Observing the User Experience: A Practitioner’s Guide to User Research. San Francisco: Morgan Kaufmann.

LeCompte, M., & Schensul, J. (1999). Designing and Conducting Ethnographic Research. Walnut Creek: AltaMira Press.

Wiggins, A. (2007). Data Driven Design: Using Web Analytics to Improve Information Architecture. Conference paper.


5 responses to “Why Web analytics won’t help interaction design”

  1. I don’t agree. Web analytics tools have their place, but don’t forget the actual analysis is done by a person. That person usually comes with some form of experience.

  2. You’re right, Dave, a person is always involved. But the problem comes when people put too much faith in the numbers spouted out by analytics tools, even when they’re reviewed by an experienced analyst.

    My point is that there are fundamental limitations to the data you get in these tools, and they are no substitute for in-person observation.

  3. Sam,
    Pretty brave of you to step out there with this opinion – some might think it’s heresy 🙂

    I particularly love your jewelry clerk analogy. Way too much is assumed based on the data that web analytics collects. There are gaping holes, including:

    1. Web analytics only captures what people did, not what they didn’t do, or what they started to do but lost patience with before they finished doing it. To understand why, it is not sufficient to ask people, because they can only comment on what they did and saw during the visit, a visit which will only consume a fraction of the site. Owners also have to be able to perform complete site discovery, so they can understand the environment they have created and that visitors are reacting to. They need to be able to know with certainty that everything is working like it should: scripts are executing, links are working, pages are being served, forms are advancing to the next level. They need to be able to drill down on the data in multiple ways, slicing and dicing to find that needle in the haystack when the situation requires it. And they need the ability to adjust their queries of the data in a fast and flexible way, as their needs change.

    2. Web analytics is prone to systemic errors – untagged pages, beacons that are not firing. Lots of WA pundits argue that accuracy doesn’t matter and that getting a sample is enough. That MAY be true if the sample is truly random. Systemic error means that the data about a particular action – page visit, for example – is missing 100% of the time. This leads to incorrect assumptions and bad decisions. Without validation of the implementation, site owners can have no confidence in the decisions they are making because they can have no confidence in their data.

    Complete site discovery, full functional verification, the ability to perform multi-variate searches of the site structure, and the ability to respond quickly to changes in the market and the business environment are crucial to online success. The evaluation of structure, correct and complete web analytics data, AND the opinions of users are all vital elements in achieving online success.

  4. Pingback: Webanalyticsbook » Web analytics vs. design

  5. The traffic has to be tied to the behaviors…one without the other is only half the story. If you don’t know what people came to do and whether or not they felt like they were successful in their attempts, you know absolutely nothing.
