⚠️ NDA protected. The full case study can be accessed via the private link provided on my resume. Please contact kailin.chan@uwaterloo.ca for inquiries. Thanks a bunch! ⚠️
🎯 Q4 Inc. Data Filtering Research
ROLE: Product Designer, UX Researcher
TIMELINE: May - June 2022
TEAM: 2 Product Designers, 2 Product Managers, 2 Frontend Developers, 3 Backend Developers, 1 QA Analyst
TOOLS: Figma, Maze (Usability Testing), Dovetail (Research Documentation)
Engagement Analytics (EA) is a brand-new application in development for Q4’s investor relations software platform, Capital Connect. The EA application centralizes and aggregates investor engagement data across a company’s digital landscape, helping streamline investor relations officers’ workflows. Servicing over 58% of the S&P 100, Q4 has a significant existing client base that can benefit from EA.
During the early Alpha and Beta development stages, design took the opportunity to research new UX and UI patterns to improve the data filtering experience within the application, with the following overarching goals:
Find new directions and opportunities for the filtering experience
Improve usability of our existing design
How might we best present and systemize the Engagement Analytics data filtering experience so users can conveniently surface relevant investor engagement data?
HIGHLIGHTS: Primary research, secondary research, usability testing, A/B testing, prototyping, wireframing
🌎 Context
Filtering in Engagement Analytics
Engagement Analytics leverages a variety of data that can be customized through the application’s filtering tools. This is largely utilized within the Institution List page - a comprehensive list of engaged institutions. Each institution has associated engagement data, including types of engagements, shareholder status, activist status, event attendance, and other relevant ownership data.
Opportunity
As mentioned, given the early development stage of the application at the time (Alpha to Beta), design was given the opportunity to research new UX and UI patterns to improve the data filtering experience within the application. Its pre-existing state hosted a small number of filters (mentioned above) with an open-face design. In consideration of the platform’s future state and continuous development, design wanted to solve for greater scalability: with a larger number of filters, EA would need a more size-resilient, flexible design.
The resulting new filter patterns and designs are to be implemented in Engagement Analytics and documented in the Q4 design system as a reference standard as more Q4 products are transitioned into Capital Connect.
🔎 Research
In this research, we wanted to improve upon or validate the existing filter patterns in EA. Specifically, we wanted to find new directions and opportunities for the filtering experience and improve the usability of our existing design. Given the nature of our goals, we decided to first conduct exploratory desk research, followed by formal usability testing.
Secondary Research
Early-stage research consisted of foundational desk research to gather insights on common filtering behavior. Documentation was collected from sources such as Nielsen Norman Group to establish a strong initial understanding of industry-standard filtering patterns. Once we had generated a better understanding of the problem space, we had a solid basis for gauging the direction of our own upcoming research study.
This secondary research revealed that our original designs lacked essential elements such as a ‘Clear All’ function and a visible dynamic count of applied filters. Beyond this, we collected a variety of new features and concepts to test around positioning, data fetching, and result display.
Primary Research
Given our short timeline, we decided to conduct usability testing through a product research platform called Maze. The flexibility of this platform allowed us to utilize Figma prototypes to collect data on key performance indicators such as click rates, success rates, and task duration.
We utilized A/B testing to compare old and new concepts, as seen in the designs below. In doing so, we ultimately wanted to discover preferences in the filter adding/removal experience and then synthesize findings into potential hybrid designs.
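As a rough illustration of how KPIs like these might be tallied from a test export (Maze’s real export format and our actual analysis aren’t shown here; the field names and rows below are invented), here’s a minimal Python sketch:

```python
# Hypothetical sketch only: Maze's real export format differs, and the
# field names and rows below are invented for illustration.

from statistics import mean

# Each record: (variant, task, succeeded, duration_seconds, misclicks)
results = [
    ("A", "add_filter", True, 42.0, 1),
    ("B", "add_filter", True, 18.5, 0),
    ("A", "remove_filter", False, 61.0, 3),
    ("B", "remove_filter", True, 24.0, 1),
    # ...one row per participant per task
]

def summarize(variant: str, task: str) -> dict:
    """Per-variant KPIs: success rate, average duration, average misclicks."""
    rows = [r for r in results if r[0] == variant and r[1] == task]
    return {
        "success_rate": mean(1.0 if r[2] else 0.0 for r in rows),
        "avg_duration_s": mean(r[3] for r in rows),
        "avg_misclicks": mean(r[4] for r in rows),
    }

for task in ("add_filter", "remove_filter"):
    for variant in ("A", "B"):
        print(task, variant, summarize(variant, task))
```

Comparing these summaries side by side per task is what let us contrast the A and B flows quantitatively before layering in the qualitative preference questions.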
Participant Criteria
We recruited a total of 48 participants from across the company for testing. While it would have been ideal to solicit insight from our targeted end users (investor relations officers), our limited timeline and access to specific users meant making do with our resources: we targeted Q4 employees, who naturally have strong awareness of and familiarity with the IR landscape.
Usability Test Structure
The test was structured to guide the user through the adding and removal processes (with click-through prototypes), revealing user preferences and insights with both qualitative and quantitative follow-up questions for each respective task:
Overall, which design made adding filters easier?
What did you find particularly useful in *preferred design A or B* while adding filters?
Overall, which design made removing filters easier?
What did you find particularly useful in *preferred design A or B* while removing filters?
Once both tasks were completed, the user was asked to select distinctly useful features and provide overall feedback on their filtering experience with the following questions:
Now that you have explored features across both designs, please select any features you found distinctly useful in accomplishing your tasks
Do you have any final thoughts or feedback on your experience?
🎯 Results
Outputs/Deliverables
The resulting new filter patterns and designs echoed findings from usability testing. A hybrid approach, where users are provided open-face access to filters alongside an “All Filters” function, met many of our user needs.
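As a loose illustration of the pattern (not the actual implementation, which is under NDA), the hybrid model can be thought of as a few high-priority filters pinned open-face, with the full catalogue reachable through the “All Filters” function, plus the “Clear All” action and dynamic applied-filter count surfaced in our secondary research. All names below are hypothetical:

```python
# Hypothetical model of the hybrid filtering pattern: a few pinned
# filters shown open-face, with the full set behind an "All Filters"
# panel. Names and structure are illustrative, not Q4's implementation.

from dataclasses import dataclass, field

@dataclass
class FilterState:
    pinned: list[str]                      # filters shown open-face
    all_filters: list[str]                 # full catalogue ("All Filters" panel)
    applied: dict[str, str] = field(default_factory=dict)

    def apply(self, name: str, value: str) -> None:
        if name not in self.all_filters:
            raise ValueError(f"unknown filter: {name}")
        self.applied[name] = value

    @property
    def applied_count(self) -> int:
        # Drives the dynamic count badge next to the filter controls.
        return len(self.applied)

    def clear_all(self) -> None:
        # The "Clear All" function surfaced in our secondary research.
        self.applied.clear()

state = FilterState(
    pinned=["Engagement Type", "Shareholder Status"],
    all_filters=["Engagement Type", "Shareholder Status",
                 "Activist Status", "Event Attendance"],
)
state.apply("Activist Status", "Active")
print(state.applied_count)  # 1
state.clear_all()
```

The appeal of the hybrid is scalability: the pinned set keeps common filters one click away, while the catalogue behind “All Filters” can grow without crowding the page.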
Impact
According to data collected during usability testing, the new design for the adding process showed an improvement in task time of 241%, and the new removal process an improvement of 194%, producing an overall improvement in task time of 217%.
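For context on how a task-time improvement can exceed 100%: these figures are consistent with measuring the gain relative to the new (faster) design’s time, i.e. a 241% improvement means the old flow took roughly 3.4 times as long. A minimal sketch with invented timings (the real test data is NDA-protected):

```python
# Illustrative only: task-time improvement measured relative to the new
# (faster) time, which is how gains above 100% can arise. The timings
# below are invented; actual test data is NDA-protected.

def improvement_pct(old_s: float, new_s: float) -> float:
    """Percent improvement of the new design over the old, relative to new."""
    return (old_s - new_s) / new_s * 100

print(improvement_pct(old_s=68.0, new_s=20.0))  # 240.0 -> comparable to our ~241%
print(improvement_pct(old_s=44.0, new_s=15.0))  # ~193.3 -> comparable to our ~194%
```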
Next Steps
The resulting new filter patterns and designs are to be implemented in Engagement Analytics and documented in the Q4 design system. Learnings from this research can now be used as a standard reference across the company’s products as more Q4 products are transitioned into Capital Connect.
💡Takeaways
This being one of my first experiences with formal usability testing, I learned a whole lot along the way. Here is a breakdown of my major takeaways and learnings!
Usability Testing
Use casual, straightforward language during user testing
Utilize visual aids (e.g. diagrams, references) and help text
Ensure your tests are presented with structure; users should be provided with transparency on what to expect going into testing
Brace the user for what’s to come - any form of guidance helps reduce cognitive overload and frustration!
Cross Functional Collaboration
Sync with PMs and other relevant stakeholders on business goals and value
Sync with developers to gauge feasibility and scope
Leverage your team’s big dev/pm/design brains! Even quick little brainstorming sessions can go a long way!
Always ask questions and never hesitate to ask for help - this especially pays off in the long run!
Above all, this experience really showed me the beauty of usability testing and validating designs with research. While it’s easy to assume you know best as a designer, usability testing helps identify potential oversights and opens doors to new perspectives. It’s fascinating to see the power of research, and how a well-tested design can impact a product’s functionality!