Who We Are

We are your guide to timely, fact-based insights and actionable solutions. We connect the dots between the research and your business objectives.

Who We Work With

Some of the most iconic brands in the world have trusted us since 1983, and we are proud to have clients who continue to work with us after 30 years of successful business together.

What We Do

We deliver on our promise: to provide the highest-quality marketing research insights with a reliable team of industry experts you can count on every step of the way.

How We Do It

We care about the quality of our deliverables. We care about the impact that our insights and solutions have on your organization. We care about our relationships with our clients and with each other.

Let’s Connect

We know that trust must be built between us over time. So, let’s get started.

Copyright © 2024 KS&R, Inc.
All Rights Reserved
Seeing What Users Don’t Say

How Observational Research Helped Fine-Tune a New Screen Experience

Our client, a marine manufacturer, sought comprehensive user feedback on a prototype of a new multi-function display (MFD) screen currently in development for a watercraft. The primary goal was to understand how watercraft owners interact with the prototype, and how the MFD's content and usability align with their expectations and preferences.

Over the course of two days, we conducted 45-minute in-person in-depth interviews with watercraft owners.

The discussion kicked off with a short warm-up in which we discussed each owner's current display screen and the information they prioritize. We then asked participants to complete a range of tasks using the fully functional prototype the client provided.

All interactions were video recorded so we could observe how participants actually used the interface – where participants naturally went first, how they approached each task, and what tripped them up along the way. We didn't just rely on what participants said; we looked closely at what they did. For example, virtually every participant instinctively navigated to the same area to complete a specific task, but that path didn't actually lead to the solution they were looking for, revealing a disconnect between user expectations and how the prototype was structured.

Once the task portion was complete, we asked participants to fill out a worksheet where they rated their experience across key dimensions like ease of navigation and error recovery.

On paper, most participants gave the prototype high marks. They understood there would be a learning curve and didn’t want to “ding” the experience for features still in progress. But the observational data told a deeper story.

Participants were clearly trying to apply interaction patterns they had learned from other devices, and when those expectations weren't met, it caused some minor frustrations. Those types of insights didn't always come out in direct conversation, but they were easy to spot in the footage, highlighting just how critical and powerful observation can be when testing new digital experiences. The feedback is already helping our client rethink certain interaction flows and ensure the final version of the MFD is built with real-world behavior in mind.