Survey Results Dashboard

Project Overview

Employee Engagement Software (EES) is a type of enterprise software that large companies use to understand how satisfied their employees are. EES has three distinct applications: one to create surveys, another for employees to take the surveys, and a third to review the survey results. When I joined the team, we knew the survey results experience was in dire need of a redesign.

Discovery

Personas

There are two primary users for the survey results dashboard: Cathys and Ginas.

Cathys, who are HR business professionals, focus on analyzing company-wide engagement data. They have a keen interest in dissecting results by specific demographics and use their data skills for in-depth analysis. Additionally, Cathys are responsible for presenting key findings and actionable steps to the senior leadership team.

Ginas, on the other hand, are people managers — basically anyone with direct reports. Their main concern lies in the well-being of their immediate team rather than the entire organization. Given their busy schedules, most Ginas prefer understanding the key takeaways from the results without spending a lot of time navigating the system.

Uncovering Pain Points

I was happy to find out that a lot of discovery work had been done before I joined the team. Between the data pulled from multiple user surveys (e.g. customer satisfaction surveys, exit surveys, lost business surveys) and our stakeholders’ conversations with clients, we had a solid understanding of our shortcomings.

Problems with the Results Dashboard

Challenging Navigation

The dashboard's navigation was extremely fragmented — when users tried to get a deeper understanding of specific results, they would be plopped onto a different page and lose their original context. Each data analysis tool was siloed on a separate page as well.

Unclear Takeaways

The dashboard was missing a “too long; didn’t read” (TLDR) version of the results. There were plenty of analysis tools geared toward Cathy to help her comb through the data, but that process is time-consuming and not particularly manager-friendly.

The existing Strengths and Opportunities section attempted to pull out key takeaways. Unfortunately, so much (and often conflicting) data was displayed that managers didn't even understand WHY those items were Strengths and Opportunities. This made our algorithm seem untrustworthy.

Product Strategy

For the new dashboard, we wanted to keep both the results and the analysis tools on a single page. The idea was to start high-level at the top and gear the first few sections toward the Ginas (managers). The data analysis tools, geared more toward the Cathys (HR business professionals), are found further down the dashboard.

I was extremely involved in the design of the top-of-dashboard components geared toward Ginas, specifically the Key Category, Strengths & Opportunities, and Topic Overview sections. These components needed to be high-level enough for managers, while still containing insightful nuggets of data that HR business professionals could find useful.

(Side note: I was also heavily involved with the Comments Analysis component, which has its own case study.)

Iterate, Iterate, Iterate

I led weekly design meetings with stakeholders, architects, front-end developers, product owners, and business analysts. On these calls we would discuss design concepts and iterate based on business requirements, technical constraints, and end-user needs.

Usability Testing

Once we had a team consensus on our favorite (and feasible) experiences for the top of the dashboard, we began usability testing. My main responsibilities were collaborating on the test script, building the prototype, moderating usability tests, and presenting the findings to the stakeholders.

Our first round of usability testing was with Gina (manager) users. There was certainly room for improvement in our initial design. We learned:

  • The trend line in the Key Category would be more useful if the scores were displayed outright rather than in a tooltip.
  • In the Opportunity section, managers wanted the flexibility to change the comparison. The drop shadows felt outdated as well.
  • Our attempt at creating a blue heat map with the Topic Overview tiles was unclear and perceived as decorative.

After iterating based on the feedback, we ran another round of usability testing and spoke to both managers and HR business professionals. In these sessions, users flew through the tasks and found the experience modern and accommodating to their preferences.

Solutions

Key Category

The objective of the Key Category component was to immediately answer the question “Overall, how did I do?” I wanted the number to be the very first thing the user looks at, and for it to feel almost like their grade or main score for the survey.

Strengths & Opportunities

In the Strengths & Opportunities redesign, we split the Strengths and Opportunities into two views. For the Opportunities, our algorithm pulls 3 questions with the most room for improvement. Instead of showing all metrics upfront, we display the most concerning comparison by default to drive home why the question is an opportunity. The ‘more’ icon displays additional details, data visualizations, and related comments in context — goodbye jumping to different pages!
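
To make that selection logic concrete, here is a minimal sketch in TypeScript. The data shapes and names (QuestionResult, topOpportunities, signed delta comparisons) are my own assumptions for illustration, not the product's actual implementation:

```typescript
// Hypothetical data shapes — the real model isn't described in this case study.
interface Comparison {
  label: string; // e.g. "vs. company average", "vs. last survey"
  delta: number; // signed difference; negative means trailing the comparison
}

interface QuestionResult {
  id: string;
  text: string;
  score: number; // favorability score, assumed 0–100
  comparisons: Comparison[];
}

// Pull the questions with the most room for improvement (lowest scores),
// and surface each one's most concerning comparison by default.
function topOpportunities(questions: QuestionResult[], count = 3) {
  return [...questions]
    .sort((a, b) => a.score - b.score) // lowest scores first
    .slice(0, count)
    .map((q) => ({
      ...q,
      // The largest negative delta is what drives home why this question
      // is an opportunity (assumes each question has at least one comparison).
      defaultComparison: q.comparisons.reduce((worst, c) =>
        c.delta < worst.delta ? c : worst
      ),
    }));
}
```

Showing only the single worst comparison by default keeps the card scannable for Ginas, while the expanded view still serves Cathys who want every metric.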

Topic Overview

The Topic Overview section is a new dashboard component that helps both managers and HR business professionals understand and compare the scoring of their question topics. By default, the categories are sorted from highest score to lowest score. The user can also sort these based on different comparisons — like biggest change compared to last survey. Clicking on any of the topic tiles will reveal an in-context popup with a breakdown of the topic's data.
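
A sketch of the tile-sorting behavior could look like the following, again with assumed types (Topic, SortMode) rather than the actual implementation:

```typescript
// Hypothetical types for illustration only.
interface Topic {
  name: string;
  score: number;
  previousScore?: number; // score from the last survey, if one exists
}

type SortMode = "score" | "changeSinceLastSurvey";

// Default: highest score first. Alternative comparisons, like biggest
// change since the last survey, reorder the tiles accordingly.
function sortTopics(topics: Topic[], mode: SortMode = "score"): Topic[] {
  const change = (t: Topic) =>
    t.previousScore === undefined ? 0 : Math.abs(t.score - t.previousScore);
  return [...topics].sort((a, b) =>
    mode === "score" ? b.score - a.score : change(b) - change(a)
  );
}
```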