Blue Nile offers nearly 10,000 jewelry items online, so helping customers find what they want is imperative. I rebuilt the interaction design of the site navigation, product search, and filter functions. These collaborative efforts reduced user error, cut the bounce rate by 25%, and increased engagement by 18%.
Blue Nile was an early innovator in e-commerce, introducing an online buying model for diamonds and engagement rings in 1999 and adding jewelry in 2012. The early-2000s user interface was built out of necessity, with little consideration of human-computer interaction (HCI) guidelines or intuitive interaction design (IxD). As more online shopping competitors entered the market with sleek, mobile-responsive websites, the lack of a dedicated user experience expert started to impact the bottom line. Relying on customers' patience and focus to search, navigate, and filter products was not profitable.
Take the simple task of wanting to see only Rose Gold necklaces. A single click overwhelmed the user with filter options. The layout forced users to play "Where's Waldo" to find the Rose Gold filter and hid the results, leaving users wondering whether the filter had been applied. This was one of MANY interactions to fix.
DISCOVERING INSIGHTS
Before starting this project, we examined user reviews and Google Analytics data. Next, we developed a common language for communicating. Finally, we examined the current filter experience, how our competitors were failing/succeeding, and which e-commerce companies were driving filter innovation.
As we began, we quickly realized we needed a common language to promote accurate communication between departments and stakeholders. So we started by defining each element of the filter/sort functionality and distributing a visual aid for quick reference.
As we gathered data from different departments, we realized we needed to audit the filters on more than 350 catalog pages and build a master list of every filter set, name, and icon. During this process, we discovered redundant catalog pages, missing filter sets, and errors in the filter counts, all of which would have to be addressed as we moved through the project.
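For illustration only, here is a minimal TypeScript sketch of the shape a master-list entry might take, along with a check for count mismatches. The field names and issue labels are assumptions for this sketch, not the actual audit format.

```typescript
// Hypothetical shape of one entry in the master filter audit list.
// Field names are illustrative, not Blue Nile's actual schema.
interface FilterAuditEntry {
  catalogPage: string;   // e.g. "/jewelry/necklaces"
  filterSet: string;     // e.g. "Metal", "Gemstone", "Price"
  filterName: string;    // e.g. "Rose Gold"
  iconAsset?: string;    // path to an icon, if the filter uses one
  productCount: number;  // count displayed next to the filter
  issues: ("redundant-page" | "missing-set" | "count-mismatch")[];
}

// Flag entries whose displayed count disagrees with the actual catalog count.
function findCountErrors(
  entries: FilterAuditEntry[],
  actualCounts: Map<string, number> // key: `${catalogPage}|${filterName}`
): FilterAuditEntry[] {
  return entries.filter((entry) => {
    const actual = actualCounts.get(`${entry.catalogPage}|${entry.filterName}`);
    return actual !== undefined && actual !== entry.productCount;
  });
}
```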
We performed a heuristic evaluation (an expert-based research method, in contrast to user-centered methods) and reviewed existing research on digital filtering and sorting functionality. I did a deep dive into the recommendations of the Nielsen Norman Group and the Baymard Institute, among others, took copious notes, and provided a summary of my findings to stakeholders.
With a firmer idea of what to look for, both in terms of functionality and aesthetics, I performed a competitive analysis to understand how our competitors were filtering their products. I also looked outside the jewelry industry for inspiration from other e-commerce companies that I felt were meeting their users' needs in a particularly effective and attractive way.
As part of our company style guide, we already had two user persona "couples" with defined demographics and psychographics. However, when it came to filter interactions, those key personas weren't enough, and we needed a different way to segment our target audiences. Instead of a biographical breakdown, we segmented our audience into the five types of shoppers to better predict their behavior.
PROTOTYPE LEARNINGS
While researching filter styles, I kept sketches and notes on the most promising approaches to the new filter design. With a few strong sketches already in hand, we focused on building digital wireframes.
After a brief whiteboard brainstorming session for initial filter design concepts — inspired by our research — we began wireframing for all three device types (desktop, tablet, & mobile) and exploring the application of brand-style colors and patterns to the base wireframe models.
We completed a deep dive on menu placement, dropdown models, filter menus, sort functionality, applied filter notification, product count, filter count, and even type-area length to accommodate localization (language translation).
Given the simplicity of the design, the availability of pre-existing assets, and the speed with which a high-fidelity prototype could be assembled, we opted for a hi-fi prototype for testing. I used a combination of Adobe and Axure software to model user interaction with the two most commonly used filters: metals and gemstones.
VOICE OF THE USERS
Due to the number of filters, we wanted to perform card sort testing to understand how users defined jewelry-related terms — and how helpful those terms were. Once we had our final recommendations for our list of filters and filter sets, we performed usability testing on interactive prototypes to understand how our changes would be received.
Performing an open card sort test allowed us to understand how users grouped similar filters. This effort aimed to understand if filter types were grouped correctly and how the users named those groups. We worked with Optimal Workshop, which provides a card sort testing platform. I then watched the videos, documented the results, and presented the findings to the design team.
Performing the closed card sort test helped us understand how users fit content into a pre-existing structure. Users were asked to place filters into pre-named groups chosen specifically to reveal how users defined words or concepts with similar meanings, such as "ring" vs. "band", and how significant those minor differences were to users when it came to filtering.
Once our filter names and sets were sorted out, we updated our prototypes and began A/B testing to analyze how the proposed changes would impact the user's success rate and brand impression. Testing the original filters against the new filters resulted in an overwhelmingly positive response — especially on mobile devices.
Once we had our testing results, we could complete the final concepts for handoff to the development team.
AN INTERACTIVE BLUEPRINT
While the underlying physical structure of what we were asking the development team to build was relatively simple in design, the sheer number of moving parts multiplied by the complexities of the algorithms involved provided quite the challenge.
Our overhaul of the product filter functionality required significant changes to the algorithms that populate the filter and sort menus and the product results. I felt it was essential that the developers and quality assurance team had a point of reference, so I created a cheat sheet with use cases and visual examples for every if/else pathway a user might take. I also provided a list of icon use cases to help developers identify filters requiring additional imagery.
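The cheat sheet itself was a design document, but as a rough illustration of the kind of branching it covered, here is a minimal TypeScript sketch of toggling a filter value and recomputing results. The names (Product, FilterState, toggleFilter) are hypothetical and not Blue Nile's production code.

```typescript
interface Product {
  id: string;
  attrs: Record<string, string>; // e.g. { metal: "Rose Gold", gemstone: "Diamond" }
}

interface FilterState {
  catalog: Product[];                 // the full, unfiltered product list
  selected: Map<string, Set<string>>; // filter set name -> selected values
}

// Toggling a filter value: selecting an already-active value clears it,
// and an empty set is removed entirely so its counts reset.
function toggleFilter(state: FilterState, set: string, value: string): FilterState {
  const selected = new Map(state.selected);
  const values = new Set(selected.get(set) ?? []);

  if (values.has(value)) {
    values.delete(value);
  } else {
    values.add(value);
  }

  if (values.size === 0) {
    selected.delete(set);
  } else {
    selected.set(set, values);
  }

  return { catalog: state.catalog, selected };
}

// A product is shown only if it matches a selected value in every active set.
function results(state: FilterState): Product[] {
  return state.catalog.filter((product) =>
    [...state.selected.entries()].every(([set, values]) =>
      values.has(product.attrs[set])
    )
  );
}
```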
EXPERIENCE AN EXPERIENCE
The final product was a sleek, refined experience that allowed users to quickly discover products, change their sort order, and add their bling to their shopping bag with efficiency.
It was important that users could not only intuit how the filters functioned but also find the filter sets they used most frequently readily available, while still having access to less-used filters. To achieve this balance, we hid any filter sets that did not fit in a single streamlined row (dependent on device size) and placed them in a list of additional filter sets.
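As a rough sketch of how that split might work, here is a hypothetical TypeScript function dividing filter sets into a visible row and an overflow list based on available width. The chip widths, ordering, and the overflow control are assumptions for illustration.

```typescript
// Splits filter sets into one visible row plus an overflow list,
// based on the available row width for the current device size.
interface FilterSetChip {
  name: string;
  width: number; // rendered width in pixels
}

function splitFilterRow(
  sets: FilterSetChip[],      // ordered by usage frequency, most-used first
  rowWidth: number,           // available width for this device size
  moreButtonWidth: number     // room reserved for the overflow control
): { visible: FilterSetChip[]; overflow: FilterSetChip[] } {
  const visible: FilterSetChip[] = [];
  const overflow: FilterSetChip[] = [];
  let used = moreButtonWidth;

  for (const set of sets) {
    if (overflow.length === 0 && used + set.width <= rowWidth) {
      visible.push(set);
      used += set.width;
    } else {
      overflow.push(set); // everything after the first miss stays in order
    }
  }
  return { visible, overflow };
}
```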
On mobile, the top user pain point was a lack of confidence that a chosen filter had been applied, since the filter UI takes up the entire mobile screen. To address this, we added a placebo button that lets users "Apply" their settings, even though the developers had coded the filters to apply on selection. The addition of this single non-functional button resulted in a drastic reduction in user error.
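A minimal sketch of that interaction, assuming hypothetical handler names and callbacks rather than the production code: filtering happens the moment a value is selected, and the "Apply" button only dismisses the full-screen panel.

```typescript
// Filters apply immediately on selection; the "Apply" button is a placebo
// that only closes the full-screen mobile filter panel.
interface MobileFilterPanel {
  applyFilter: (set: string, value: string) => void; // updates results immediately
  closePanel: () => void;                            // dismisses the full-screen UI
}

function onFilterSelected(panel: MobileFilterPanel, set: string, value: string): void {
  panel.applyFilter(set, value); // results update as soon as the user taps a filter
}

function onApplyTapped(panel: MobileFilterPanel): void {
  // No filtering happens here: the button exists only to reassure users
  // that their selections "took" before the panel is dismissed.
  panel.closePanel();
}
```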
IMPACTING THE BUSINESS
The new catalog filter system succeeded on multiple levels. Google Analytics page tracking showed a 25% reduction in bounce rate on catalog pages and increased click-through rates. Data tracking revealed that users applied multiple filters per page, including those hidden in the "More Filter" dropdown menu. Error-case logging showed a reduction in how often users accidentally reset their filters or filtered their results down to zero products. Finally, compared to benchmark studies, user testing revealed a 169% improvement in filter discoverability. Overall, these changes resulted in higher conversion rates and improved user feedback scores.