Summary: To make the most of analytics data, UX professionals need to integrate it where it can add value to qualitative processes rather than distract resources from them.
Jen Cardello | November 17, 2013
Analytics has traditionally been used to inform marketing strategy and tactics, but we now see more usability and user-experience professionals relying on this quantitative-data source to aid in research and design.
The biggest issue with analytics is that it can very quickly become a distracting black hole of “interesting” data without any actionable insight.
Probably the worst thing you can do is teach someone how to use an analytics system and hope she will get you some interesting findings. Even a “free” analytics service will cost you a fortune if it redirects resources from more productive uses. Many beginners are stumped by one of three hurdles:
- Scope of metrics: So many things can be measured, but which are meaningful?
- Difference between metrics: Which metrics best answer specific questions?
- Interface complexity: How do you get the analytics system to tell you what you want to find out?
Because of this last point, many people end up jumping into the deep end and focusing on the tool instead of the work it is intended to support. With that in mind, we recommend that UX professionals step back and think about how analytics data can supplement current methods and processes.
After interviewing a variety of UX teams regarding their use of analytics and other web data, we discovered some interesting high-value UX uses for analytics. We’ll cover 3 in this article (and more in our course on Analytics and User Experience):
- Issue indication: Notifying the team of potential problems reaching goals
- Investigation: Identifying potential causes of issues
- Triangulation: Adding data to supplement qualitative research
Use #1: Issue Indication
Some UX teams collaborate with optimization specialists while designing the site or launching new features to develop and implement a measurement plan. UX teams receive reports daily or weekly in order to monitor the site’s ability to meet stated goals. They can use the web metrics to diagnose specific issues or rely on them as clues to guide further investigations.
A measurement plan consists of:
- Goals/macro conversions: These are the big-picture actions that users need to complete on the site in order for it to be successful. Examples include the number of purchase completions and the number of lead submissions.
- Desirable actions/micro conversions: These are smaller actions that, when combined, support the meeting of the goal—such as progressing along a lead-generation funnel. Examples include visiting a specific page, clicking a particular link, or entering data in a form.
- Web metrics: These are web-analytics data that indicate whether these desirable actions occur; they help UX teams identify potential issues.
Measurement Plan Example
| Goal | Desirable Action by Site Visitors | Web Metric | Description |
| --- | --- | --- | --- |
| 50 consulting leads per month | Visit consulting service section | Unique page views | “Unique page views” indicates how many site visitors visited this specific page. “Unique” means that the page is not counted multiple times if the same user visits it multiple times. |
| | Read about consulting services | Average time on page | “Time on page” indicates the average amount of time spent on the page. More or less time is not necessarily a good thing, so it’s important to always look at this data in context and compare it to similar time periods. |
| | Download whitepapers, enter data into form fields, submit the lead form | Events | “Events” refers to any user action on the site that you are tracking. In Google Analytics (and many other systems) you can define which event instances should be recorded. Examples of events include downloading a PDF, clicking a particular button, entering text in a field, and watching a video. |
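A measurement plan like the one above rolls up naturally into conversion rates. A minimal Python sketch, using made-up numbers (the field names and values are hypothetical, not from any real report):

```python
# Roll up hypothetical measurement-plan metrics into conversion rates.
# All numbers are invented for illustration.
metrics = {
    "unique_page_views": 4200,    # visits to the consulting section
    "whitepaper_downloads": 630,  # micro conversion (an event)
    "lead_submissions": 52,       # macro conversion (the goal)
}

def rate(numerator, denominator):
    """Conversion rate as a percentage, rounded to one decimal place."""
    return round(100 * numerator / denominator, 1)

micro_rate = rate(metrics["whitepaper_downloads"], metrics["unique_page_views"])
macro_rate = rate(metrics["lead_submissions"], metrics["unique_page_views"])
print(f"micro conversion: {micro_rate}%")  # 15.0%
print(f"macro conversion: {macro_rate}%")  # 1.2%
```

Tracking both rates over time, rather than raw counts, makes weekly reports comparable across traffic fluctuations.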
Use #2: Investigation
In this mode, UX teams—either on their own or with the assistance of optimization specialists—develop hypotheses for macro-conversion issues and use analytics to prove or disprove them. Several problem categories guide the investigation: traffic, technical, content, navigation, and visual design. Most of the examples we provide below are from Google Analytics’ free offering.
1) Traffic Issues
Example investigation: Determine if there is one traffic source (Google, Bing, Yahoo, direct, email campaigns, etc.) that is responsible for a decrease in page visitors.

Useful analytics report: Pages (filtered by the page URI and using Source as the secondary dimension). In Google Analytics, you can generate page-specific reports that display where the traffic to the page originated (search, email, direct, etc.).
Example Pages report (in Google Analytics) filtered by URI and using Source as Secondary dimension
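The same filtering can be done offline on an exported report. A sketch, assuming a hypothetical CSV export whose column names are invented for illustration:

```python
# Group a (hypothetical) exported Pages report by traffic source to see
# which source accounts for a drop in visits to one page.
import csv
import io
from collections import Counter

# Stand-in for a CSV export from the analytics tool; columns are assumed.
export = """page,source,unique_page_views
/consulting,google,1200
/consulting,direct,400
/consulting,email,150
/pricing,google,900
"""

views_by_source = Counter()
for row in csv.DictReader(io.StringIO(export)):
    if row["page"] == "/consulting":  # filter by the page URI
        views_by_source[row["source"]] += int(row["unique_page_views"])

print(views_by_source.most_common())  # google first, then direct, then email
```

Comparing the same breakdown for two time periods shows which source drove the decrease.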
2) Technical Issues
Example investigation: Determine if a page element is not loading properly.

Useful analytics report: Event Pages. The Event Pages report lists all the pages where events are tracked. You can select the specific page being investigated to get metrics on that specific page’s events.
Example Event Pages report in Google Analytics
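A sharp drop in a page’s event counts is a useful clue (not proof) that an element stopped loading or firing. A tiny sketch over hypothetical weekly counts:

```python
# Flag a likely loading/tracking problem by comparing (hypothetical)
# weekly event counts for one page: a sharp drop is a clue, not proof.
weekly_events = {"week 1": 540, "week 2": 525, "week 3": 12}

baseline = weekly_events["week 1"]
# Flag any week below half the baseline; the 50% threshold is arbitrary.
alerts = [week for week, n in weekly_events.items() if n < 0.5 * baseline]
print(alerts)  # ['week 3']
```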
3) Content and 4) Visual-Design Issues
- Determine if new wording may not effectively communicate the benefits of or the process for taking a specific action.
- Determine if imagery, typography, colors, and/or layout are distracting from calls to action (CTAs).
Useful analytics report: In-Page Analytics. In-Page Analytics indicates which links users select.

Note: You need to set up Enhanced Link Attribution in the Google Analytics admin to see separate percentages for links on a page that all have the same destination. We also recommend supplementing In-Page Analytics data with click-tracking services like Clicktale and/or CrazyEgg.
Example In-Page Analytics indicating what links users are selecting
An example of a CrazyEgg Heatmap display; clicks can also be expressed in numbers via the Overlay view.
5) Navigation Issues
Example investigation: Determine if specific links/buttons are not being clicked.

Useful analytics report: Pages (filtered by the page URI and selecting the Navigation Summary tab). Navigation Summary is a tab you can select from any Pages report (illustrated below). It details from which website pages people came before visiting the page of interest and where they went after visiting that page.
Example Pages report with Navigation Summary selected
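The same previous/next-page tally can be hand-rolled from per-session page sequences if you have them. A sketch over invented data:

```python
# From (hypothetical) per-session page sequences, tally where users go
# after the page of interest -- a hand-rolled Navigation Summary.
from collections import Counter

sessions = [
    ["/home", "/consulting", "/contact"],
    ["/home", "/consulting", "/home"],
    ["/pricing", "/consulting", "/contact"],
]

page = "/consulting"
next_pages = Counter()
for path in sessions:
    # Walk consecutive (previous, next) page pairs within each session.
    for prev, nxt in zip(path, path[1:]):
        if prev == page:
            next_pages[nxt] += 1

print(next_pages)  # /contact seen twice, /home once
```

If the link you care about rarely appears as a “next page,” that corroborates the hypothesis that it is not being clicked.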
Use #3: Triangulation
In this mode, the UX team uses analytics to verify findings derived from qualitative research (e.g., usability testing) and gather additional clues to help in defining a solution. If the original usability test was run with about 5 users — as we often recommend — then there is always the risk that estimates like success rates will be wrong. But such a quick test has the advantage of rapidly pinpointing a potential trouble spot, which can then be instrumented for targeted collection of a few thousand analytics data points that support much more accurate estimates.
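The sampling risk can be quantified. A sketch using the standard Wilson score interval (pure Python, no libraries) shows how wide the uncertainty is for 4 of 5 usability-test participants succeeding, versus the same 80% rate measured over a few thousand instrumented analytics data points:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score confidence interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

# 4 of 5 usability-test participants succeeded:
lo5, hi5 = wilson_interval(4, 5)
# The same 80% rate over 2,000 tracked attempts (hypothetical count):
lo2k, hi2k = wilson_interval(1600, 2000)
print(f"n=5:    {lo5:.2f} to {hi5:.2f}")    # roughly 0.38 to 0.96
print(f"n=2000: {lo2k:.2f} to {hi2k:.2f}")  # roughly 0.78 to 0.82
```

The five-user estimate could plausibly be anywhere from under 40% to over 95%, while the large analytics sample pins it down to a few percentage points.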
Examples of usability-test findings to verify with analytics data:
Finding: Study participants don’t know where to find information about a topic because the word used on the site is different from the one they use.
Additional questions to answer: Are people searching for the terms that participants mentioned in the study?

Useful analytics report: Search Terms. The Search Terms report lists the terms users enter into the website’s own search box (not web-wide search). You can download the search-term lists for any time period and conduct more extensive analysis. Terminology that users typed into your own search box is a prime candidate for rewriting your content in user-centered language.
Example Search Terms report in Google Analytics
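Once downloaded, a search-terms export is easy to tally for the most frequent user vocabulary. A sketch over invented terms:

```python
# Tally a (hypothetical) downloaded search-terms export to find the words
# users actually type -- candidates for user-centered page terminology.
from collections import Counter

search_terms = [
    "price list", "pricing", "cost", "price list",
    "how much", "price list", "cost",
]

# Normalize case and whitespace before counting.
counts = Counter(term.lower().strip() for term in search_terms)
for term, n in counts.most_common(3):
    print(term, n)  # "price list" tops the list
```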
Finding: A feature is not used or a page is not accessed because study participants didn’t notice the link.
Additional question to answer: Where are users going instead?

Useful analytics reports: Pages (filtered by the page URI and selecting the Navigation Summary tab) and In-Page Analytics (see examples above in the Investigation section)
Finding: A form is not being completed because people don’t feel comfortable providing required information.
Additional question to answer: On which fields are people abandoning the form?

Useful analytics report: Event Pages (see example above in the Investigation section)
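If field-level interaction events are tracked, the last field touched in each abandoned session points at the drop-off field. A sketch over invented sessions and field names:

```python
# From (hypothetical) field-level interaction events, find the last field
# users touched before abandoning the form. Field names are made up.
from collections import Counter

# One ordered list of touched fields per abandoned session.
abandoned_sessions = [
    ["name", "email", "phone"],
    ["name", "email", "phone"],
    ["name", "email"],
    ["name", "email", "phone", "company"],
]

last_field = Counter(fields[-1] for fields in abandoned_sessions)
print(last_field.most_common(1))  # "phone" is the most common drop-off point
```

A field that dominates this tally (here, a phone number) is a strong candidate for the discomfort the usability test surfaced.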
Quantitative data is increasingly becoming a key ingredient in usability and user-experience work. With this change comes the need for UX professionals to become familiar with the language and tools and determine which metrics and features are useful in UX practice.
Adding analytics to UX work enables us to:
- Take early action to prevent unnecessary conversion decreases
- Quickly prove/disprove causation/correlation theories
- Better persuade the more data-oriented stakeholders in our organizations
Learning analytics systems can be daunting because they are complex and have been purpose-built for marketing activities, not UX. The learning process can be improved by starting with existing UX processes and determining where metrics can add value in those processes.
Beyond what is covered here, there are many more analytics metrics that can be incorporated into UX research and design processes. We will cover these in our course, Analytics and User Experience.