The data provided through Web analytics offer promise to interaction designers by pointing to potential user experience problems. But interaction designers who think they should base critical design decisions on Web data are misguided at best and downright irresponsible at worst.
Web user experience practitioners have recently embraced web-traffic measurement as a user experience research method. This is not necessarily a bad thing — gathering design insight from a variety of sources is always advisable. The problem comes when interaction designers (and the data analysts who advise them) base critical user experience decisions on these often incomplete and even misleading data.
My beef with Web analytics boils down to two points:
- Web analytics are notoriously unreliable. Web traffic researchers continue to struggle with inaccurate visitor counts and missing data points, compromising both the validity and the reliability of the method (Chatham, 2005). Practitioners are dogged by day-to-day limitations in both the techniques and the underlying technology, which limit their ability to reliably produce analyses that are universally accepted as legitimate (Wiggins, 2007).
- More troubling are the claims that Web analytics capture meaningful data about the user experience. Certainly, Web analytics capture important information about server loads, form completion, navigation patterns, and browser types. The unspoken belief, however, is that user keystrokes and mouse clicks represent the sum total of what there is to know about a Web site visitor’s experience. If you base user experience decisions on Web traffic measurement, you assume that an individual person intentionally initiates these keystrokes and mouse clicks for meaningful reasons.
Ask yourself: have you ever initiated a mouse click unintentionally? Have you ever inadvertently typed in a URL? These mistakes of intention are not registered by Web analytics tools.
In his book Observing the User Experience, Kuniavsky likens the insight offered by Web analytics to that of a “jewelry store clerk,” who gains a “much better understanding of customers” by watching everything the customer does.
Web analytics are not equivalent to a jewelry store clerk gathering subtle, nuanced information about a person visiting their store. They are the equivalent of a blindfolded, deaf jewelry store clerk who uses a complex system of tapping to communicate with store visitors, who may or may not know the unique tapping language of that particular clerk.
Interaction designers must base critical user experience decisions on the results of qualitative research, rich with “thick description” and subtle cues. Basing such decisions on Web analytics would remove all insight into users’ actual intentions.
Web analytics tools have their place in interaction design. They should be limited to:
- Measuring appreciable increases in specific, observable goals, such as form completion
- Serving as a final test after a battery of in-person, qualitative usability tests
- Monitoring infrastructure
- Generating leads and measuring campaign effectiveness
Any other use of Web analytics reduces interaction design to nothing more than the blind counting of meaningless signals.
This is an abridged version of Ladner, S. (forthcoming). “Watching the Web: Suggestions for Improving Web-based Measurement.” In Jansen, J., Spink, A. and Taksa, I. (eds.). Handbook of Log File Analysis. Hershey, Pennsylvania: Idea Group.
Chatham, B. (2005). What’s On Web Analytics Users’ Minds? Forrester Research.
Gassman, B. (2005). How to Choose An Advanced Solution for Web Analytics. Stamford: Gartner Research.
Kuniavsky, M. (2003). Observing the User Experience: A Practitioner’s Guide to User Research. San Francisco: Morgan Kaufmann.
LeCompte, M., & Schensul, J. (1999). Designing and Conducting Ethnographic Research. Walnut Creek: AltaMira Press.
Wiggins, A. (2007). Data Driven Design: Using Web Analytics To Improve Information Architecture.