The New Frontier of Designing With Data

by Jared Spool and Kathleen Barrett

There is a scene in The Hunt for Red October in which Courtney B. Vance, playing Seaman Jones, tells the captain of his submarine that, while the data is telling them one thing, he’s discerned a different interpretation based on his experience working on submarines and his knowledge of the software and its inherent limitations. It goes a little something like this:

Capt. Bart Mancuso: [after hearing Jones’s findings] Have I got this straight, Jonesy? A $40 million computer tells you you’re chasing an earthquake, but you don’t believe it, and you come up with this on your own?

Seaman Jones: Yes, sir.

As the story goes, Vance’s character had experience detecting enemy submarines beyond the limitations of his sub’s detection software. By combining his experience with the sub’s data analysis, Seaman Jones had learned when to ignore the computer models to find the enemy submarine.

Businesses today swim in data. We know every transaction and every action our users take with our product or service. But all that data has limitations. Knowing how to interpret the data, by pushing beyond its limitations, is where the real value is.

Winner Winner, Chicken Dinner

Back in the 90s, Lynne Whitehorn’s team was given a challenge: use data to perfect the roasted chicken. At the time, she was Boston Chicken’s Chief Implementation Architect and Coder, before the company expanded its product line and became Boston Market.

Being in the roasted chicken business, each Boston Chicken store needed to ensure it always had product to sell. At the time, in 1992, it took 1 hour 45 minutes to roast a bird. If employees underestimated the number of customers they’d have in a day, they would run out of chicken. Yet, if they roasted too many, they had a lot of expensive waste from uneaten birds.

Could Lynne and her team create an application that would tell store managers exactly how many chickens to roast and when? Because it was the 90s, they had the advantage of personal computers in the stores collecting data about every sale, including how many chickens they were selling and when. Could the team analyze that data to predict when to roast each new chicken?

The Skill Of Predicting Perfect Poultry Production

Boston Chicken already had a simple process for predicting chicken demand. Yet, data showed some stores still regularly had days where they ran out of food to sell, or were throwing out too much unsold chicken. Stores needed something else to more accurately predict demand.

As they were studying which stores had less spoilage, they found something interesting. Some store managers were particularly sharp at nailing the right amount of chicken to sell. They’d somehow learned how to make just enough chicken. “The most proficient managers turned out to be really good at retail food service,” Lynne explained.

These managers were taking context into account. Was there a ball game at the nearby stadium that day? That would push up demand. Was a snowstorm coming? Demand would go up before the roads got bad, but then plummet once the storm started.

Lynne remembers asking, “How do we capture that?” Her team needed to develop something sophisticated yet simple for employees and managers to use across all their store locations.

“We had two goals,” says Lynne. “On the one hand, we wanted to make the most sophisticated, complex, advanced, quick-service restaurant back office that people had never seen before. And we wanted people to use it with an 8th grade reading level and floppy gloves on. It had to be a nuclear submarine that you could drive with a play school steering wheel.”

Lynne directed her team to do something that was unheard of for IT teams in the 90s. She sent them into those stores, so they could understand how the managers became so good at predicting their store’s needs. She and her team embarked on user research.

Quality Data Needs Qualitative Insights

As Lynne and her team returned to headquarters to start building their revised prediction system, she realized something critical. Their experience in the store, watching their users interact with the current system, was just as essential in informing the prediction system’s design as the underlying data model.

Lynne decided her team wouldn’t work in isolation. Each team member would spend hours in stores observing how employees and customers interacted. Their observations would inform the data sets as they built the system that managers used to estimate how many birds to cook and when. They tested their prototypes in stores as they iterated over designs. They observed staff using the system to ensure the interface was easy for them to understand.

Lynne’s team extended the data they collected for each store. For example, they considered how fixed-timing traffic lights, which hold drivers for a set period before the light changes, can delay a lunch rush at a nearby store, unlike demand-based signals that adjust their timing to traffic patterns. Weather and geographic data inform customer behaviors: people eat more chicken in the fall, and poor weather can keep people at home. Local events influence how busy a store is.

These are standard data inputs in today’s terms, explains Lynne, but at the time it was unusual for a business to compile data sets this broad in scope. And the only way they learned this was by spending time in the stores, seeing how store managers were making their operation decisions.
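To make that concrete, here is a minimal sketch of the kind of calculation such a system might perform. This is our illustration, not Lynne’s actual code: the function names, signals, and multipliers are all hypothetical, and the only number taken from the story is the 1-hour-45-minute roast time.

```python
from datetime import datetime, timedelta

# From the story: in 1992, it took 1 hour 45 minutes to roast a bird.
ROAST_TIME = timedelta(hours=1, minutes=45)

def forecast_demand(baseline_per_hour, stadium_event=False, snowstorm_hours_away=None):
    """Adjust an hourly sales baseline with contextual signals.

    baseline_per_hour: {hour_of_day: expected chickens}, from historical sales.
    The multipliers below are hypothetical; a real system would fit them
    from data on events, weather, and traffic.
    """
    forecast = {}
    for hour, chickens in baseline_per_hour.items():
        demand = chickens
        if stadium_event:
            demand *= 1.5  # a ball game at the nearby stadium pushes demand up
        if snowstorm_hours_away is not None:
            if snowstorm_hours_away > 0:
                demand *= 1.3  # demand rises before the roads get bad...
            else:
                demand *= 0.4  # ...then plummets once the storm starts
        forecast[hour] = demand
    return forecast

def roast_schedule(forecast, day):
    """Tell the manager when to start each hour's birds: 1h45m before they sell."""
    return [(day.replace(hour=hour) - ROAST_TIME, round(demand))
            for hour, demand in sorted(forecast.items())]

# Example: a Saturday lunch rush with a game at the nearby stadium.
baseline = {11: 20, 12: 45, 13: 30}  # chickens per hour, from past sales
for start_at, birds in roast_schedule(forecast_demand(baseline, stadium_event=True),
                                      datetime(1992, 6, 6)):
    print(f"Start {birds} birds at {start_at:%H:%M}")
```

The point of the sketch is that once you account for the cook time, the roast schedule falls out of the demand forecast; the real work is learning the contextual adjustments, which is exactly what the store visits surfaced.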

The impact of the prediction system Lynne and her team developed and deployed was enormous. It was credited with adding $167 million to Boston Chicken’s value, says Lynne. She remembers the CEO at the time saying publicly that the system could do daily and hourly reporting at a time when McDonald’s only had weekly reporting on store performance. “We could cut overheads by meaningful amounts, which equals less spoilage,” Lynne said.

The end product Lynne and her team delivered, and their use of predictive data, not only increased Boston Chicken’s bottom line and value, but also changed the way the company looked at customers and the employees who delivered the service.

The Mysterious Unused Feature

Sam Nordstrom, Global Product Manager on Intuit’s Quickbooks Payments team, found himself with a different challenge. He had a mystery to solve and he was hoping the data could help him understand it.

Sam’s team had built a payment processing feature into their accounting software package. Quickbooks customers could send invoices containing a Pay Now link directly to their clients. Using that interface, those clients could pay quickly, helping the Quickbooks customers with their cash flow.

“We have excellent online payment features for invoices that help you track it from being received to paid,” explains Sam. The problem: customers weren’t using it.

Everyone wants to get paid, and there is, as Sam explains, a lot of research showing that small business owners get stressed when they have to invoice their clients. “When you are a small to medium sized business, getting paid is not a guarantee. You don’t have recourse if someone doesn’t pay,” explains Sam. “It’s hard and tricky if you don’t hear anything. That’s where the project came in. It offered a solution.” Yet, Sam’s users weren’t taking advantage of it.

Sam had been monitoring the usage data that came from Intuit’s analytics team. The numbers indicated that customers weren’t using the feature. But, staring at the numbers, he couldn’t tell why.

Observing Users In The Wild

Intuit has a long history of watching users interact with their products. Back in 1983, when founder Scott Cook shipped Intuit’s flagship product, Quicken, he instituted a user research program he called “Follow me home.” Intuit employees would hang out in computer stores and watch people shop for the software. When they identified a purchaser, the employee would approach and ask to go with them to see them install and use the application.

Those early efforts at understanding how people use the products were still baked into Intuit’s DNA. Over the years, they learned to make the research process more effective (and a little less creepy sounding).

Sam and his team picked a few users to visit. They focused on Quickbooks customers who were issuing invoices, but whose clients weren’t using the payment function. As Sam and his team watched their users, they began to see a pattern. Users bypassed the payments functionality in an odd way.

To take advantage of Sam’s payments functionality, users had to request that Quickbooks send their client an email with an invoice. The Quickbooks team had implemented a lovely built-in email function that would attach the invoice and a link to make a payment.

Sam’s team observed users sending invoices by email, but not with the built-in functionality. Instead, they were going through a convoluted process. The user would save their client’s invoice as a PDF. Then they’d switch to their mail client, where they’d compose a new email to the client and attach the invoice.

Uncovering the Why

The beauty of observing users is you can ask them “why?” And that’s exactly what Sam did.

In essence, what they found was that users were comfortable using their personal mail clients to communicate with their existing customers. That’s where they wanted to be doing business.

Sam’s users wanted to add the invoice to the email correspondence they were already having with the client. That wasn’t something Quickbooks’ email capability could do easily. That’s why these users were jumping through the convoluted save-as-PDF hoops.

Might this be why the payments function wasn’t being used? How widespread was this problem? Sam needed to return to the data to find out.

Going Back To Numbers

Intuit had a data analytics team, but they weren’t attached to any specific product team. Their job was to collect and disseminate information to every team that asked.

The small data analytics team was frequently backed up, handling too many requests from every product team.

For Sam, accessing measures specific to the payments functionality was time-consuming. He had to develop a hypothesis and file a ticket with the data analytics team to get an answer. He was competing for the analysts’ time with managers placing similar requests for products that had 900,000 to 2 million users per month.

Sam’s functionality only had 60,000 users. Needless to say, it took a while for Sam to hear back from anyone. He decided to take matters into his own hands and start pulling data from the Quickbooks analytics himself.

He tried to connect the dots in the numbers and, at first, bungled it a bit. Putting the complete story together was hard.

Upping the Data Science Game

When Sam first dove into the numbers, he realized that Quickbooks had three distinct sets of data. The analytics team was reporting what many corporate analytics teams report: which pages customers were visiting and how long they were staying on them. That was only one type of data Intuit knew about its customers.

Because Quickbooks is an accounting system, Intuit’s customers enter a lot of information about their businesses. Intuit knows names, phone numbers, who their clients are, and many other details about the nature of each business. That’s the second dataset Sam had at his disposal.

Sam also had access to the transactions Intuit customers made. He could see how many invoices they sent out and how quickly those invoices got paid. That transactional information formed a third dataset.

The problem he had was that these three datasets were not connected. They’d been designed independently of each other, for completely different purposes. Tying the knowledge together to say “show me all the steps that customers who send out more than 100 invoices per month go through when they aren’t paid using the payments system” turned out to be really hard.
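To give a flavor of what that entails, here is a rough sketch of that exact question expressed against the three datasets, assuming they at least share a customer identifier. The file and column names are our invention, not Intuit’s schema.

```python
import pandas as pd

# Three independently designed datasets (hypothetical schemas):
clickstream = pd.read_csv("clickstream.csv")    # customer_id, page, timestamp
profiles = pd.read_csv("profiles.csv")          # customer_id, business_name, industry
transactions = pd.read_csv("transactions.csv")  # customer_id, invoice_id, sent_at, paid_via

# 1. Customers who send out more than 100 invoices per month.
transactions["month"] = pd.to_datetime(transactions["sent_at"]).dt.to_period("M")
monthly = (transactions.groupby(["customer_id", "month"])
           .size()
           .rename("invoices")
           .reset_index())
high_volume = monthly.loc[monthly["invoices"] > 100, "customer_id"].unique()

# 2. Their invoices that were NOT paid through the payments system.
off_system = transactions[
    transactions["customer_id"].isin(high_volume)
    & (transactions["paid_via"] != "quickbooks_payments")
]

# 3. Join back to the clickstream to reconstruct the steps those customers
#    go through; profiles (the second dataset) could be merged the same way.
steps = (off_system[["customer_id"]].drop_duplicates()
         .merge(clickstream, on="customer_id")
         .sort_values(["customer_id", "timestamp"]))
```

The query itself is the easy part; the hard work was making independently designed systems line up well enough for joins like this to be possible at all.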

Sam assembled a small team of data scientists to make the three datasets work together. After much effort, they could start to see results.

Combining Observations and Numbers Strategically

Data revealed to Sam and his team how people were using the product. They used that same data to recruit specific users for more observations. They could identify the right type of customers to recruit, based on behavior, which opened the door for their product team to talk with the customer research group. “The whole process is dramatically improved by having the analytical infrastructure,” explains Sam.

At first, the team focused on the product workflow to figure out what the actual user interaction looked like. Then they went through countless iterations for how to make the feature discoverable within the product, without, as Sam explains, “being too interfering.” In one iteration, some members of Sam’s team wanted to create a drag-and-drop option for invoices. They tested that feature early, but it didn’t stick.

The team tried designs that would encourage customers to use the built-in email feature. It wasn’t working. That’s when it struck them that maybe that wasn’t the right way to solve the problem. Maybe they needed to meet customers where they already were?

“Is this the right way to solve a problem?” Sam and his team wondered. “Is forcing or incentivizing a new type of behavior what our customers want? Or do they want our solution to adapt to how they are used to doing the work?”

A New Hypothesis To Test

The original behavior Sam and his team observed was showing up in the datasets. They could see users saving invoices as PDFs. They could see those invoices being paid through competing payment systems (like Paypal), by credit card, or by check. And they could see that customers who used Sam’s payment system were getting paid faster on average. Much faster.
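That kind of comparison is straightforward once the transaction data is wired up. Here is a sketch of one way to check it, again with hypothetical column names rather than Intuit’s actual analysis:

```python
import pandas as pd

transactions = pd.read_csv("transactions.csv")  # invoice_id, sent_at, paid_at, paid_via

# Days from sending an invoice to getting paid.
transactions["days_to_paid"] = (
    pd.to_datetime(transactions["paid_at"]) - pd.to_datetime(transactions["sent_at"])
).dt.days

# Split invoices into the two cohorts described above.
transactions["cohort"] = transactions["paid_via"].eq("quickbooks_payments").map(
    {True: "payments system", False: "PDF / PayPal / card / check"}
)

# Medians are robust against a handful of very late payers.
print(transactions.groupby("cohort")["days_to_paid"].median())
```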

If they could get those customers to use the payment system, they’d get the benefit of improved cash flow. Yet, they couldn’t find a way to deliver that benefit with the built-in email feature.

Sam’s team formed a new hypothesis: if they could build the invoicing functionality into Gmail itself, they could get these customers to take advantage of the payment capability. The team saw in their data that half of their drop-off customers used Gmail to send invoices. It was a big enough trend to warrant the business case to develop something for Gmail first, explains Sam.

The team built an extension to Gmail that could fetch Quickbooks invoices and insert them into a message. Sam’s customers could reply to their existing thread with their client, inserting the invoice directly into the reply. And that message would contain the link to Sam’s payment functionality.

After several iterations, they got something customers loved using. And, as they suspected, those customers’ clients were paying using Quickbooks Payments. Sam could see in the data that those customers were being paid substantially faster.

Sam acknowledges it wasn’t easy to get to this point. Yet, it was worth it. “It’s not like we dug into the numbers and then we figured the whole thing out. It was this constant back and forth of one type of input telling us, ‘hey, we should go and do this thing.’ And then we would go do that thing. And then the other type of input would say, ‘You know what? There might be a better way to do this.’”

Changing How Intuit Uses Data

At the start of the project, Sam’s team had a bottleneck: a single, centralized analytics process at Intuit that required them to file tickets for data reports. Data analysts were spending 80-90% of their days answering these kinds of ticketed questions.

Now, Sam and his team have created a self-service data analytics capability. They are taking this across the company as a use case to show others what they did, how it worked, and how to standardize the data. Sam’s peers can see the power of combining observations with rich datasets to reveal valuable, meaningful insights.

These days, the analyst working with Sam’s team is freed up to take a deep look at the analytics. “We can try machine learning personalization use cases that are only possible if you have someone who has time to do them. This is how you can get more out of your data partner by taking care of the reporting on your own,” says Sam.

“I’m pretty darn excited about the last couple of months. We’ve been leveraging data in a rich and cool way because we have freed up data professionals to do more robust work. We are getting much more diverse in the things we are seeing in our product roadmaps and the features we are rolling out.”

The New Frontier of Designing With Data

Sam and Lynne took advantage of their unique situations to look beyond the status quo and solve a problem: how to make their products better, more efficient, and more useful to their customers. They spent time observing their customers to surface the value of the product. They used their datasets to test how close they could come to providing that value to customers.

Conventional wisdom would have us think that a dataset can show us the what of a user experience, while observations expose the why. This is a dangerous oversimplification.

Both Lynne and Sam employed a process that was far more nuanced and iterative. They relied on continual customer visits and rich datasets to glean insights throughout the process. Their teams needed to become comfortable using both, blending those insights into product decisions.

Make no mistake, establishing a new process for harnessing and interpreting data wasn’t easy for either Sam’s or Lynne’s team. They were operating in uncharted territory. They had to learn a whole new way of working with observations and data. They needed to be comfortable with uncertainty while they figured it all out. It’s not a one-size-fits-all process: every organization and product team has a unique set of challenges, datasets, customers, goals, and processes to sort out.

Most importantly, both Sam and Lynne took advantage of a moment to build skills and expertise in new areas. That provided immediate solutions and long-term value to their products.

That’s where the real wisdom can be found in the search for establishing meaningful metrics in design. When we rely on arbitrary, out-of-the-box solutions to determine meaning, we never get the full picture of the product experience.

We already know what the value of our products is. It takes effort and patience to align the right data streams to our customers and business goals, and to connect the dots.

Jared Spool is a Maker of Awesomeness at Center Centre/UIE. He researches and writes about user experience design strategy and practice. Kathleen Barrett is a digital strategist and writer. She’s spent part of her career working to understand what the available data can tell her and her teams about creating the best user experiences.
