Fast Path to a Great UX – Increased Exposure Hours

by Jared M. Spool

As we’ve been researching what design teams need to do to create great user experiences, we’ve stumbled across an interesting finding. It’s the closest thing we’ve found to a silver bullet when it comes to reliably improving the designs teams produce. This solution is so simple that we didn’t believe it at first. After all, if it was this easy, why isn’t everyone already doing it?

To make sure, we’ve spent the last few years working directly with teams, showing them what we found and helping them do it themselves. By golly, it actually worked. We were stunned.

The solution? Exposure hours. The number of hours each team member is exposed directly to real users interacting with the team’s designs or its competitors’ designs. There is a direct correlation between this exposure and the improvements we see in the designs that team produces.

It Makes Perfect Sense: Watch Your Users

For more than 20 years, we’ve known that teams that spend time watching users see improvements in their designs. Yet we still see many teams with regular user research programs that produce complicated, unusable products. We couldn’t understand why, until now.

Each team member has to be exposed directly to the users themselves. Teams with dedicated user research professionals who watch the users and then report the results through documents or videos don’t deliver the same benefits. It’s the direct exposure to the users that drives the improvements we see in the design.

Over the years, there has been plenty of debate over how many participants are enough for a study. It turns out we were looking in the wrong direction. When you focus on the hours of exposure, the number of participants disappears as an important discussion. We found two hours of direct exposure with one participant could be as valuable as (if not more valuable than) eight participants at 15 minutes each. Those two hours with one participant, seeing the detailed subtleties and nuances of their interactions with the design, can deliver a tremendous amount of actionable value to the team when done well.

First Forays: Field Visits

As we watched different teams go through this process, we started to notice some repeatable patterns. For example, many teams spent little time watching their users. Often these teams had successful, profitable products that had evolved over many years into very complicated designs, chock full of features that users found hard to find and often frustrating to use.

Before they began watching users, the teams would frequently find themselves at odds in meetings. They knew that the product was getting more complex, but nobody had any real information about how it was being used. Stakeholders would ask for features without giving the team any useful details for implementing them. An attitude of “Let’s build it, and if we get it wrong, we’ll fix it” would prevail.

For teams like these, we often choose a field visit as their first foray into watching their users. Field visits are great because we get to see what users do in their natural environment, and they don’t require prior knowledge of what the proper tasks for the design are. We interview each user, uncover their goals and objectives, and then ask them to use the product or service to accomplish those goals.

A typical field visit is two hours. With ten to twelve visits, each team member can usually get at least eight hours of exposure to a minimum of four different users, each trying to use the design in interesting ways.
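
To make the arithmetic concrete, here is a minimal scheduling sketch in Python. The team size, the number of observers per visit, and the round-robin rotation are assumptions for illustration only, not a prescribed process; the point is simply that twelve two-hour visits, shared across a ten-person team, give every member at least four users and eight hours of exposure.

    from collections import defaultdict

    # Hypothetical rotation: 10 team members share 12 two-hour field visits,
    # with 4 observers sitting in on each visit (all numbers assumed).
    TEAM = [f"member_{i}" for i in range(1, 11)]   # assumed 10-person team
    NUM_VISITS = 12                                # one user per visit
    HOURS_PER_VISIT = 2
    OBSERVERS_PER_VISIT = 4                        # assumed seats per visit

    hours = defaultdict(int)
    users_seen = defaultdict(set)

    for visit in range(NUM_VISITS):
        # Simple round-robin: take the next block of team members for this visit.
        observers = [TEAM[(visit * OBSERVERS_PER_VISIT + k) % len(TEAM)]
                     for k in range(OBSERVERS_PER_VISIT)]
        for member in observers:
            hours[member] += HOURS_PER_VISIT
            users_seen[member].add(f"user_{visit + 1}")

    for member in TEAM:
        print(f"{member}: {hours[member]} hours across "
              f"{len(users_seen[member])} different users")

Running this rotation gives every member either 8 or 10 hours of exposure across four or five different users, which matches the field-visit math described above.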

The results are typically a list of easy fixes. One recent 12-visit venture with a 10-member team produced 350 items on their list of quick fixes. The product improvements started showing up in just a matter of weeks.

A Minimum of Every Six Weeks

We saw many teams that conducted a study once a year or even less often. These teams struggled virtually as much as teams that didn’t do any research at all. Their designs became more complex, and their users reported more frustration as the teams kept adding new features and capabilities.

The teams with the best results were those that kept up the research on an ongoing basis. Six weeks seems to be the bare minimum interval for a two-hour exposure dose. Teams whose members spent at least two hours every six weeks saw far greater improvements to their design’s user experience than teams who didn’t meet the minimum. And teams with more frequent exposure, say, two hours every three weeks, saw even better results.

We think there are two reasons the frequency turns out to be important. The first is the way memory works. It’s harder to remember someone you met more than six weeks ago than someone you met last week. If we want our users and their needs to be present in our minds as we’re creating our designs, we need to see them regularly.

The second reason has to do with the pain of an ongoing frustration. It’s painful to watch someone struggle with your design. It’s even more painful to come back a few weeks later and see someone else struggle with the same problem. The more times we’re exposed to those struggles, the more frustrated we get and the more we want to fix those problems. (And the happier we’ll be when we finally see someone breeze right through with our new design.)

Some problems are particularly gnarly. Seeing these problems repeat, in the field and in the lab, gives us insight into the nuances behind their potential causes. Testing out new design ideas can help us get to a solution faster, and a regular exposure program makes that testing even more effective.

By holding to a six-week minimum for our exposure, we leverage these two factors, making our users and their needs the driver of the design work we’re doing on any given day.

Types of Exposure to Users

Field visits aren’t the only form of exposure we found that works. Usability tests, both in-person and remote, can be very effective. (We found a mixture of the two works better than 100% remote sessions.) Once you know the tasks users naturally perform with the design (because you discovered them during your field visits), it’s easy to construct realistic scenarios for usability testing.

For folks heavily involved with a style of self-design, using the design themselves for real work can also contribute. (For more about self-design, see my recent article, Actually, You Might Be Your User.) Again, validating these results with other methods, such as field visits and usability testing, helps you understand what your users experience that you don’t when using the design.

Watching users work with competitive designs is also important. Seeing them work through those same tasks with someone else’s design can help identify gaps in your own design. It also makes it easy to point out where your advantages lie.

The Team of Influencers

Our research had a finding that took us by surprise: Teams that excluded non-design personnel didn’t see the same advantages as teams that included those people.

For example, we worked with teams where only the designers and developers had regular exposure to their users. Stakeholders, such as product managers and executives, along with other non-design folks, like technical support liaisons and quality assurance management, didn’t participate in the field studies or usability tests. While the core design team became very familiar with what users needed and wanted, they were constantly battling with these other individuals, who didn’t have the same experiences.

The tipping point came when we found teams where all of these other folks were participating in the user research studies. No longer did they assert their own opinions about the design direction over what the research findings were telling the teams. Having the execs, stakeholders, and other non-design folks be part of the exposure program produced a more user-focused process overall.

Exposure is easy to measure: just count the hours each person has spent participating in the studies. We’re seeing teams make it part of their quarterly performance reviews, sending a clear message about the importance of user experience, especially when all the influencers are measured the same way.
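
Because the metric is just a count of hours, very little tooling is needed to track it. Below is a minimal Python sketch under assumed data (the session log, the names, and the quarter boundaries are hypothetical) that totals each person’s exposure hours for a quarter and flags anyone falling below the two-hours-every-six-weeks pace.

    from datetime import date

    # Hypothetical log of research sessions: who attended, when, and for how long.
    sessions = [
        {"person": "alice", "date": date(2024, 1, 10), "hours": 2.0},
        {"person": "bob",   "date": date(2024, 1, 10), "hours": 2.0},
        {"person": "alice", "date": date(2024, 2, 20), "hours": 2.0},
        {"person": "carol", "date": date(2024, 3, 5),  "hours": 1.0},
    ]

    quarter_start, quarter_end = date(2024, 1, 1), date(2024, 3, 31)

    # Two hours every six weeks works out to roughly 4.3 hours over a 13-week quarter.
    quarterly_minimum = 2 * (13 / 6)

    totals = {}
    for s in sessions:
        if quarter_start <= s["date"] <= quarter_end:
            totals[s["person"]] = totals.get(s["person"], 0.0) + s["hours"]

    for person, hours in sorted(totals.items()):
        status = "meets minimum" if hours >= quarterly_minimum else "below minimum"
        print(f"{person}: {hours:.1f} exposure hours this quarter ({status})")

With these sample numbers, everyone falls short of the roughly 4.3-hour quarterly pace, which is exactly the kind of gap a quarterly review would surface.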

The Challenge: Two Hours Every Six Weeks For Everyone

Granted, we admit our data could be flawed. There could be other factors at work. However, we’ve tested every possible theory, spent time reviewing every factor we could imagine, and we keep coming back to this one item: get every member of the team to spend two hours every six weeks watching users, and you’ll likely see a great user experience appear before your very eyes.

About the Author

Jared M. Spool is a co-founder of Center Centre and the founder of UIE. In 2016, with Dr. Leslie Jensen-Inman, he opened Center Centre, a new design school in Chattanooga, TN to create the next generation of industry-ready UX Designers. They created a revolutionary approach to vocational training, infusing Jared’s decades of UX experience with Leslie’s mastery of experience-based learning methodologies.
