How Observational Research Helped Fine-Tune a New Screen Experience
Our client, a marine manufacturer, sought comprehensive user feedback on a prototype of a new multi-function display (MFD) screen in development for a watercraft. The primary goal was to understand how watercraft owners interact with the prototype, and how the MFD's content and usability align with their expectations and preferences.
Over the course of two days, we conducted 45-minute in-person in-depth interviews with watercraft owners.
Each session opened with a short warm-up covering the owner's current display screen and the information they prioritize. We then asked participants to complete a range of tasks using the fully functional prototype the client provided.
All interactions were video recorded so we could observe how participants actually used the interface: where they naturally went first, how they approached each task, and what tripped them up along the way. We didn't just rely on what participants said; we looked closely at what they did. For example, virtually every participant instinctively navigated to the same area to complete a specific task, but that path didn't actually lead to the solution they were looking for, revealing a disconnect between user expectations and how the prototype was structured.
Once the task portion was complete, we asked participants to fill out a worksheet where they rated their experience across key dimensions like ease of navigation and error recovery.
On paper, most participants gave the prototype high marks. They understood there would be a learning curve and didn’t want to “ding” the experience for features still in progress. But the observational data told a deeper story.
Participants were clearly trying to apply interaction patterns they had learned from other devices, and when those expectations weren't met, minor frustrations followed. Those types of insights didn't always surface in direct conversation, but they were easy to spot in the footage, highlighting just how critical and powerful observation can be when testing new digital experiences. The feedback is already helping our client rethink certain interaction flows and ensure the final version of the MFD is built with real-world behavior in mind.