Originally posted on Medium.

While working on a project earlier this year, we learned an interesting tidbit from research: users tended to have just a few simple needs when it came to accessing data. Given that the tool we were working on is fairly robust, the lightweight nature of the most common use cases was a surprise… there is more “tool” available than is necessarily needed.

This reflects a broader trend among analytics providers: there’s a popular interface-first reflex when it comes to building data products. We opt for flexibility over convenience, often attempting to satisfy all plausible use cases rather than optimizing for the most frequent ones. Here’s how a range of analytics use cases tends to be addressed in a one-size-fits-all way:

The approach is, in a way, straightforward. Give the user access to the data. If they have a question, they can go to the interface and “tell” it what they need. They then digest the information, isolating some (hopefully) meaningful insight before disseminating it to their peers, supervisors, or other stakeholders.

The interface-first approach is capable of satisfying many use cases for end users. Analytics tools can be used for all sorts of purposes — from status updates to fact-finding to open-ended exploration — but it’s not unusual to see a user base rally around a few lightweight ones (hint: open-ended exploration is usually not among them). Let’s think through a simple use case: a gym owner wants to know if member attendance has changed this week, as weather has been especially nice. Here’s how that scenario looks for an interface-first analytics tool:


  • User has a question: has gym attendance changed this week due to unusual weather conditions? 
  • User logs in to her gym attendance analytics web tool and navigates to the attendance traffic explorer. She adjusts the settings to compare this week against prior weeks’ activity. 
  • User observes the data and determines that gym attendance this week looks pretty similar to recent weeks.
  • User writes an email to her staff and managers, telling them that attendance is normal, and to keep up the good work.

Sound familiar? The user’s needs were met eventually, but she had to have enough motivation (inspiration and curiosity, or some external impetus) to overcome the friction (proactively logging in, manually finding and digesting the data, composing communications). Perhaps, here, she did.

But she won’t always have that motivation… maybe there’s a lot going on that needs her attention. The towel deliveries are late from the cleaning service, so she’s on the phone all morning. Or the bathroom exploded, so she’s running the front desk while one of her staff members cleans things up. She doesn’t have time to monitor attendance trends… ones that may or may not be interesting at all, anyway. 

What she needs is less a data tool, more a data-savvy assistant. The gym can’t afford to staff a full-time assistant for the owner, but digital products can step in just fine… they’re pretty good with numbers, after all.

By optimizing towards more lightweight use cases — e.g., monitoring and reporting, rather than exploration and data mining — we can take the user out of much of the work that needs to happen. Here’s the prior scenario (gym attendance this week) given this data-assistant product approach:

  • Product is always monitoring gym attendance figures (robots don’t sleep). Attendance numbers this week look normal (within a certain percentage of algorithmic expectations), considering recent weeks’ traffic and year-over-year comparisons. Since things are normal, Product does not notify the user (a rough sketch of this check follows the list). 
  • Regardless, Product will email an attendance update to the gym owner and all staff at the end of the week, per user settings (weekly attendance updates are turned on). 
  • User knows that the Product will notify her if gym attendance is unusual, so she doesn’t worry about it. 
  • (At the end of the week:) User sees the usual weekly attendance update email from Product on her phone. She glances at it to verify that everything is running smoothly. She knows that the rest of her staff sees the same info, so she doesn’t bother emailing anyone about it.
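
To make that “no news is good news” logic concrete, here’s a minimal sketch of what the assistant’s weekly check could look like, written in Python. Everything in it is hypothetical: the blended baseline, the 15% tolerance, the send_email callback, and the settings dictionary are illustrative assumptions, not any particular product’s implementation.

```python
# A minimal sketch of the "data assistant" check described above.
# All names and numbers (the baseline blend, the 15% tolerance,
# the send_email callback, the settings dict) are hypothetical.

from statistics import mean


def expected_attendance(recent_weeks, same_week_last_year):
    """Blend recent weeks' traffic with a year-over-year comparison."""
    return 0.7 * mean(recent_weeks) + 0.3 * same_week_last_year


def is_abnormal(this_week, expected, tolerance=0.15):
    """Flag the week only if it strays from expectations by more than the tolerance."""
    return abs(this_week - expected) / expected > tolerance


def run_weekly_check(this_week, recent_weeks, same_week_last_year, settings, send_email):
    expected = expected_attendance(recent_weeks, same_week_last_year)

    # Interrupt the owner only when something looks unusual ("no news is good news").
    if is_abnormal(this_week, expected):
        send_email(settings["owner"],
                   f"Attendance alert: {this_week} visits vs ~{expected:.0f} expected")

    # Always send the scheduled weekly summary if the user has it turned on.
    if settings.get("weekly_summary", True):
        for person in [settings["owner"], *settings.get("staff", [])]:
            send_email(person,
                       f"Weekly attendance: {this_week} visits (expected ~{expected:.0f})")


if __name__ == "__main__":
    # Example with made-up numbers: a normal week, so only the summary goes out.
    run_weekly_check(
        this_week=412,
        recent_weeks=[405, 398, 420, 415],
        same_week_last_year=430,
        settings={"owner": "owner@gym.example",
                  "staff": ["desk@gym.example"],
                  "weekly_summary": True},
        send_email=lambda to, body: print(f"-> {to}: {body}"),
    )
```

The specific arithmetic isn’t the point; the point is that the decision about whether to interrupt the user is made by the product, not by the user opening an interface.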

There are still several steps here, but most of the work is picked up by the product. If the product is marketed and positioned correctly, the user will have the correct expectation that “no news is good news,” and she’ll get on with the rest of her work without having to bother with attendance metrics. Even when she does give some attention to the product (via scheduled emails), it’s in the comfort of her own inbox… no behavior change required. 

Sounds great, right? Some products are moving in this direction — more channel automation, less interface navigation — but the interface-first reflex still abounds, especially in early-stage products. One of the unfortunate motivators behind this approach is usage metrics: product owners want to see that people are using their products. Product engagement (e.g., daily user logins) seems like a good proxy for “value” to the end user, so it becomes the KPI. If engagement goes down, that’s a failing KPI, even if users are still realizing value via their email inboxes (e.g., through automated reports or alerts).

Some analytics tools really are robust instruments meant for power users — think of a Search Marketing manager at a large consumer eCommerce company — but a lot of data products would be well-served by streamlining interactions around value (e.g., “I want to know when my gym attendance is abnormal”) rather than simply supplying users with tools.