Web Analytics as Business Meteorology

I was interested to read an article the other day about using web analytics. While pretty inconclusive (it’s a PR piece, after all), it got me thinking.

Are web analytics useful in UX design?

Having worked in some very “data led” environments, I’ve often had access to statistical reports of various kinds: from aggregate traffic through to counts of specific interactions, surveys, and other quantitative measures, all covering various aspects of web, mobile and call centre activity.

Most are produced at regular intervals, and serve some higher business purpose (I hope!). But from the perspective of UX design, they are strangely impotent.

Take aggregate traffic reports. Web sites have a basic “fingerprint” that seldom changes. With very, very few exceptions, such reports are stubbornly unsurprising. Once you’ve seen the high-level traffic pattern of a site, you may as well never look at it again.

[Chart: the eternal buzzsaw of weekly traffic. Nothing to see here. Zzzzz.]

Even traffic breakdowns are seldom worth looking at more than a couple of times. Home page traffic at or near the top; “about us” and the FAQ at or near the bottom, and variously obvious stuff in between. That, barring an earthquake, is how it will always be.

Similarly, when a report compares against a previous time period, why is such a comparison useful? Is there something special about last week, month or year that it should be compared to?

Of more promise are interaction counts: bounce rates, click-throughs, search criteria, and so on. But these usually lack enough informative context for a defensible design response straight away. A bounce rate might be low, but conversion on that page might also be disastrous. Use of a search filter might be low, but is it driving repeat visits? Of the people who click that link, how many ask for refunds? Ask a few simple contextual questions that pique your curiosity and you soon progress beyond what most analysis can provide.
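To make the point concrete, here is a toy calculation of the bounce-rate trap. The page names and figures are entirely made up for illustration; the only point is that a single metric, read in isolation, can rank pages exactly backwards:

```python
# Hypothetical data: a metric in isolation vs. the same metric with context.
# page: (sessions, bounces, conversions) -- all figures invented.
pages = {
    "landing-a": (1000, 200, 5),    # low bounce rate, but barely converts
    "landing-b": (1000, 550, 90),   # high bounce rate, yet converts well
}

def rates(sessions, bounces, conversions):
    """Return (bounce_rate, conversion_rate) as fractions of sessions."""
    return bounces / sessions, conversions / sessions

for page, (sessions, bounces, conversions) in pages.items():
    bounce_rate, conversion_rate = rates(sessions, bounces, conversions)
    print(f"{page}: bounce {bounce_rate:.0%}, conversion {conversion_rate:.1%}")
```

Ranked by bounce rate alone, “landing-a” looks the healthier page; add the conversion column and the picture reverses. That reversal is the missing context the raw report never supplies.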

No S**t, Sherlock!

But even if you have the missing context filled in, and assuming you know what to ask for, this seems to lead to another phenomenon: the confirmation of what you knew anyway.

Confirmation may be better than nothing of course, but if the purpose of analytics is to confirm you in your judgement of human nature, then after a while there’s not a lot of point in seeking out analytical evidence for that – particularly since getting good stats is such hard work.

What Lovely Weather We’re Having…

Instead it seems to me that such reports serve as a sort of corporate weather report. People like talking about the weather: if it’s good, you feel good; if it’s bad, you hope it will improve. But aside from the digital equivalent of cloud seeding (buying more traffic), you don’t think you can actually change it.

Perhaps the unspoken but obvious reason for this is that quantitative measures cannot, without a lot of luck and deductive effort, tell you why something happened.

Consequently, you are almost always better off consulting your own judgement about what might change the weather than scanning the weather reports for inspiration. And the absolute last person you should ask about this is an analyst/meteorologist.

Certainly one of the things I find refreshing about MailOnline is that design decisions are made unashamedly from gut instinct. Of course, we check the “weather report”, out of respect or habit. But we don’t conduct randomised controlled trials, or use “sophisticated” analytics of the kind Omniture Discover would provide. In the end, we have the guts to be confident in what we think our readers want. And I doubt any charts would really change that.