🎯 Q4 Inc. Data Filtering Research

ROLE: Product Designer, UX Researcher

TIMELINE: May - June 2022

TEAM: 2 Product Designers, 2 Product Managers, 2 Frontend Developers, 3 Backend Developers, 1 QA Analyst

TOOLS: Figma, Maze (Usability Testing), Dovetail (Research Documentation)

Engagement Analytics (EA) is a brand new application in development for Q4’s investor relations software platform, Capital Connect. The EA application centralizes and aggregates investor engagement data across a company’s digital landscape, helping streamline investor relations officers’ workflows. Servicing over 58% of the S&P 100, Q4 has a significant existing client base that can benefit from EA.

During the early Alpha and Beta development stages, design took the opportunity to research new UX and UI patterns to improve the data filtering experience within the application, with the following overarching goals:

  1. Find new directions and opportunities for the filtering experience 

  2. Improve usability of our existing design

How might we best present and systemize the Engagement Analytics data filtering experience so users can conveniently surface relevant investor engagement data?


HIGHLIGHTS: Primary research, secondary research, usability testing, A/B testing, prototyping, wireframing


🌎 Context

Filtering in Engagement Analytics

Engagement Analytics leverages a variety of data that can be customized through the application’s filtering tools. This is largely utilized within the Institution List page - a comprehensive list of engaged institutions. Each institution has associated engagement data, including engagement types, shareholder status, activist status, event attendance, and other relevant ownership data.
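For illustration, here is a minimal sketch of the kind of record the Institution List filters operate on, and how applied filters might compose. The field names and filter shape are hypothetical stand-ins, not Q4’s actual data model:

```typescript
// Hypothetical shape of an Institution List row; field names are
// illustrative, not Q4's actual schema.
interface InstitutionEngagement {
  institutionName: string;
  engagementTypes: ("email" | "eventAttendance" | "websiteVisit")[];
  shareholderStatus: "current" | "past" | "non-holder";
  isActivist: boolean;
  eventsAttended: number;
  ownership: { shares: number; percentOutstanding: number };
}

// A filter is a predicate over rows; applied filters compose with AND.
type Filter = (row: InstitutionEngagement) => boolean;

const applyFilters = (
  rows: InstitutionEngagement[],
  filters: Filter[]
): InstitutionEngagement[] => rows.filter((row) => filters.every((f) => f(row)));
```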

Opportunity

As mentioned, given the early development stage of the application at the time (Alpha to Beta), design was given the opportunity to research new UX and UI patterns to improve the data filtering experience within the application. Its pre-existing state hosted a small set of filters (mentioned above) with an open-face design. In consideration of the platform’s future state and continuous development, design wanted to build for greater scalability: with a larger number of filters, EA would need a more size-resilient, flexible design.

The resulting filter patterns and designs are to be implemented in Engagement Analytics and documented in the Q4 design system as a standard reference as more Q4 products are transitioned into Capital Connect.


🔎 Research

In this research, we wanted to improve upon or validate existing filter patterns in EA - specifically, to find new directions and opportunities for the filtering experience and to improve the usability of our existing design.

Given the nature of our goals, we decided to conduct rough desk research first, followed by formal usability testing.

Secondary Research

Early stage research consisted of foundational desk research to gather insight into common filtering behavior. Documentation was collected from sources such as Nielsen Norman Group to establish a strong initial understanding of industry-standard filtering patterns. Once we generated a better understanding of the problem space, we had a solid basis to gauge direction for our own upcoming research study.

This secondary research revealed that our original designs lacked essential elements such as a ‘Clear All’ function and a visible dynamic count for applied filters. Beyond this, we collected a variety of new features and concepts to test surrounding positioning, data fetching, and result display.
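As a rough illustration of those two missing elements, consider a filter-state model where the applied-filter count is derived from state (so a badge can update dynamically) and a single action resets everything. This is a minimal sketch with hypothetical names, not the production implementation:

```typescript
// Filter state maps a filter key (e.g. "shareholderStatus") to its
// currently selected values.
type FilterState = Map<string, string[]>;

// Dynamic count: only filters with at least one selected value are
// counted, so the badge updates as selections change.
const appliedCount = (state: FilterState): number =>
  [...state.values()].filter((values) => values.length > 0).length;

// 'Clear All' resets to an empty state in one action rather than
// requiring users to remove filters one by one.
const clearAll = (): FilterState => new Map();
```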

Primary Research

Given our short timeline, we decided to conduct usability testing through a product research platform called Maze. The flexibility of this platform allowed us to utilize Figma prototypes to collect data on key performance indicators such as click rates, success rates, and task duration. 

We utilized A/B testing to compare and contrast old and new concepts, as seen in the designs below. In doing so, we ultimately wanted to discover preferences in the filter adding/removal experience, then synthesize findings into potential hybrid designs.
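To give a sense of how the two variants were compared, here is a hedged sketch of tallying success rate and average task duration per design. The result shape is an assumption for illustration, not Maze’s actual export format:

```typescript
// Assumed shape of one participant's result on one task; an
// illustrative stand-in, not Maze's export schema.
interface TaskResult {
  variant: "A" | "B";
  succeeded: boolean;
  durationSeconds: number;
}

// Summarize a variant's success rate and average task duration so the
// two designs can be compared side by side.
function summarize(results: TaskResult[], variant: "A" | "B") {
  const subset = results.filter((r) => r.variant === variant);
  const successes = subset.filter((r) => r.succeeded).length;
  return {
    successRate: successes / subset.length,
    avgDurationSeconds:
      subset.reduce((sum, r) => sum + r.durationSeconds, 0) / subset.length,
  };
}
```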

Participant Criteria

We were able to recruit a total of 48 participants from across the company for testing. While it would have been ideal to solicit insight from our targeted end users (investor relations officers), our limited timeline and access to those users meant we made do with our resources and targeted Q4 employees, who naturally have strong awareness of and familiarity with the IR landscape.

Usability Test Structure

The test was structured to guide the user through the filter adding and removal processes (with click-through prototypes), revealing user preferences and insight with both qualitative and quantitative follow-up questions for each respective task:

  • Overall, which design made adding filters easier?

  • What did you find particularly useful in *preferred design A or B* while adding filters?

  • Overall, which design made removing filters easier?

  • What did you find particularly useful in *preferred design A or B* while removing filters?

Once both tasks were completed, the user was asked to select distinctly useful features and provide overall feedback on their filtering experience with the following questions:

  • Now that you have explored features across both designs, please select any features you found distinctly useful in accomplishing your tasks

  • Do you have any final thoughts or feedback on your experience?