Shazeeye's Blog Thoughts on User Experience, Technology and Business

CX Measurement Program: Monitoring and improving the voice of our customers

I rolled out our Net Promoter Score (NPS) / Customer Experience (CX) Measurement Program in August 2013, starting with the consumer audience (people who use our coupons). In 2014 it was extended to our clients (companies such as Kraft and P&G that put coupons on our site) and publishers (bloggers who add our coupons to their sites for a share of the revenue). From the initial round of measuring loyalty and satisfaction, we gathered feedback and built an action plan to listen and act on it. "We're Listening" is one way we communicate with our consumers to let them know what we're doing to address their concerns. I'm happy to discuss in detail how we prioritize this feedback and our approach to addressing it.

Help Center: Improving support efficiency and customer satisfaction

Before the new Help Center (bottom right image), consumers had only a one-page FAQ sheet (top right image) to help resolve their issues. It was static content, updated once every few months because pushing updates to production required engineering effort.

I spearheaded the design and execution of the new Help Center, which delivered the following benefits:

Improved case deflection: "Ask Sam" is an avatar that helps with case deflection. Consumers could search or browse articles, and if they didn't find what they wanted they could "Ask Sam". Before putting them in touch with a support agent, Sam asked a few questions and showed articles that could help them resolve the issue on their own.

Improved customer support efficiency: Maintaining the Help Center became far easier. Support agents now spent less than 10% of their time updating content and could direct consumers to self-resolve via the help articles. They were also in control: reporting gave them insight into which content got more views and higher satisfaction (via consumer ratings of articles).

Improved customer satisfaction: Better organization of content by product category helps consumers find information faster, and the addition of a search field and related searches further improves findability. The biggest jump in satisfaction came from improved content: customers were delighted by the image-heavy articles, and step-by-step instructions with pictures made it easy for consumers to resolve their issues.

CVS Photo Center: Increasing revenue by 42% by improving the online photo experience by 44%

The new design of the website (right) tested 44% better than the old design (left). Users were 15% more successful at completing tasks and 12% faster on the new design than on the old one. The top design differentiators are listed below:

Better information architecture: Menus are bucketed into user-defined categories with activity-oriented labels (Old Design: My Photo Center, Order Prints, Order Gifts, Organize Photos, Share Photos, Order Status, Products and Pricing; New Design: Upload Photos, View and Edit, Share, Order Prints, Create Gifts).

Registration not mandatory as it frustrated users: Most users find registration a time-wasting step, so to increase conversion registration was encouraged but not required. Non-registered users could not save their photos on the site for more than a few days, which nudged them toward registering.

Photo-related activities (upload, organize, edit, order, share) better integrated: In the old design these activities existed in silos; to move from photo editing to sharing, users had to navigate via the menu from the upload page to the share page. The new design has a contextual panel on the right from which any activity (share, order prints, create gifts, edit, organize) can be started from the current page.

Simple experience to cater to our audience (soccer moms, amateurs and seniors): Step-by-step instructions are always visible for all photo activities, and tips related to the current activity (for example, uploading or cropping a photo) appear on the right where needed.

Improving the findability of information by 14% on the IELTS (language testing service) website

How many of us have left a website because it was difficult to find the information we wanted? The not-so-surprising answer: 62% of online shoppers have given up at least once. Poor information architecture can also hurt productivity: 35% of an employee's time is wasted looking for information, and over the last 10 years that number has been as high as 50%. The biggest surprise is that you can even be sued for poor design, as when IAM sued Razorfish in 2000. To improve findability, good designers map similar tasks into buckets that match user mental models, and those buckets become the information architecture, or menu, of a website. In the picture below, a participant matches tasks (left side) into buckets (right side) and also proposes labels for those buckets. Results (below) of this exercise are analyzed across 20 or more people using a card sorting tool. This mapping led to the menu of the IELTS website, and information findability improved by 14%.
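The core of the card-sort analysis above can be sketched as a co-occurrence count: for each pair of tasks, count how many participants placed them in the same bucket, and pairs grouped together by most participants suggest shared menu categories. A minimal sketch in Python (the task names and groupings below are hypothetical, not the actual IELTS data):

```python
from itertools import combinations
from collections import Counter

# Each participant's card sort: bucket label -> tasks (hypothetical data)
sorts = [
    {"Prepare": ["practice tests", "sample questions"],
     "Register": ["book a test", "find a centre"]},
    {"Study": ["practice tests", "sample questions", "find a centre"],
     "Booking": ["book a test"]},
    {"Prep": ["practice tests", "sample questions"],
     "Apply": ["book a test", "find a centre"]},
]

# Count how often each pair of tasks lands in the same bucket
pair_counts = Counter()
for sort in sorts:
    for bucket in sort.values():
        for a, b in combinations(sorted(bucket), 2):
            pair_counts[(a, b)] += 1

# High-count pairs are candidates for the same menu heading
for pair, n in pair_counts.most_common():
    print(f"{pair}: grouped together by {n} of {len(sorts)} participants")
```

A real card-sorting tool runs the same idea at scale (a similarity matrix plus clustering over 20+ participants), but the co-occurrence count is the underlying signal.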

Segmenting customers for a corporate intranet

You would be surprised how many companies are unaware of their customer segments. For a photo sharing site the usual answer is "Isn't it anyone wanting to share photos?"; for a corporate intranet it is "It's everyone who works for the company." Segmentation is key to understanding markets, sub-markets and the factors influencing them. At Infosys, the consulting company where I worked for 4 years, we set out to segment the customers of our corporate intranet. We sent out a survey targeting an equal number of employees from each of the 20 job roles at Infosys. The survey consisted of 30 questions covering intranet usage behavior, preferences across the various sections of the intranet and demographic data. 100 responses were analyzed using a card sorting tool (which groups similar behavior and preferences together) to arrive at seven personas, or segments. The seven segments on the right may have overlapping behavior, but each is clearly distinguishable from the others.
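The segmentation step above is essentially clustering: respondents whose answers are similar are grouped, and each group becomes a persona. As an illustration only (the survey data and the two synthetic segments below are hypothetical, and the real analysis used a card sorting tool, not this exact algorithm), here is a plain k-means sketch:

```python
import random

random.seed(0)

# Hypothetical survey data: each row is one employee's answers (e.g. usage
# frequency and a preference score), normalized to 0-1. Two synthetic
# behavioural segments stand in for the real 100 responses.
responses = (
    [[random.gauss(0.2, 0.05), random.gauss(0.8, 0.05)] for _ in range(50)]
    + [[random.gauss(0.9, 0.05), random.gauss(0.1, 0.05)] for _ in range(50)]
)

def kmeans(points, k, iterations=20):
    """Plain k-means: assign each point to its nearest centroid, then re-average."""
    # Deterministic init: pick evenly spaced points as starting centroids
    centroids = [points[i * len(points) // k] for i in range(k)]
    for _ in range(iterations):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(
                range(k),
                key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centroids[i])),
            )
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster
        centroids = [
            [sum(dim) / len(c) for dim in zip(*c)] if c else centroids[i]
            for i, c in enumerate(clusters)
        ]
    return centroids, clusters

centroids, clusters = kmeans(responses, k=2)
print([len(c) for c in clusters])  # → [50, 50]: each synthetic segment recovered
```

With real survey data you would cluster on all 30 answers and inspect each cluster's centroid to write the persona description.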

Redefining a design process through a collaborative tool

Before the tool was designed, around 3-5 designers from a pool of 30 would judge a website's compliance with a set of usability guidelines and list the issues, each with a screenshot, in PowerPoint. This required designers to evaluate the website individually and then consolidate their findings into a heuristic evaluation report. The process was time-consuming, evaluators overlapped in the issues they identified, and the wording of issues was inconsistent. Thus came the idea to develop a collaborative evaluation tool. The tool eliminated redundancy because issues were now identified collaboratively in real time. The process was significantly faster, as reports were automated and issues were identified by choosing from an existing database. Most importantly, the quality of evaluations improved: designers/evaluators could build on issues identified by others or refine and edit existing issues. Read my post on the future of business collaboration tools for further insights and a video of the tool.

Creating a metric for heuristic evaluation skills and defining global standards

Heuristic evaluation is a discount usability engineering method in which three or more evaluators assess an interface's compliance with a set of heuristics. Because the quality of the evaluation depends heavily on the evaluators' skills, it is critical to measure those skills to ensure evaluations meet a certain standard. This study provides a framework to quantify heuristic evaluation skills. Quantification is based on the number of unique issues identified by each evaluator as well as the severity of each issue. Unique issues are categorized into eight user interface parameters, and severity into three levels. A benchmark computed from the collated evaluations is used to compare skills across applications as well as within them. The result of this skill measurement divides evaluators into levels of expertise. Two case studies illustrate the process and its applications; further studies will help define an expert's profile.
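The scoring idea can be sketched as a severity-weighted share of the benchmark: sum the weights of the unique issues an evaluator found and divide by the weighted total of all issues in the collated benchmark. The weights and issues below are illustrative placeholders, not the published values from the actual framework:

```python
# Illustrative severity weights (three levels, as in the framework); the
# real published weights may differ.
SEVERITY_WEIGHTS = {"minor": 1, "major": 2, "critical": 3}

# Benchmark: the collated set of unique issues found by all evaluators,
# each tagged with a severity (hypothetical issues).
benchmark = {
    "unlabeled icon": "minor",
    "no undo on delete": "critical",
    "inconsistent terminology": "major",
    "error message lacks recovery steps": "major",
}

def heqs_percent(found_issues, benchmark):
    """Evaluator score: severity-weighted share of the benchmark issues found."""
    total = sum(SEVERITY_WEIGHTS[sev] for sev in benchmark.values())
    found = sum(SEVERITY_WEIGHTS[benchmark[i]] for i in found_issues if i in benchmark)
    return 100 * found / total

score = heqs_percent({"no undo on delete", "unlabeled icon"}, benchmark)
print(f"{score:.0f}%")  # (3 + 1) / (1 + 3 + 2 + 2) = 50%
```

An evaluator who finds one critical and one minor issue thus scores higher than one who finds two minor issues, which is the point of weighting by severity rather than counting issues alone.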

An online heuristic evaluation competition was hosted on the World Usability Day website in November 2007 by the Usability Professionals' Association (UPA), Bangalore. Twenty contestants from the U.S. and India, with heuristic evaluation experience ranging from 0 to 10 years, participated. Contestants were judged on a quantitative framework that assigns weights to the severity of identified issues (Kirmani & Rajasekaran, 2007). Results indicated that the group, with an average of 2 years of heuristic evaluation experience, achieved a mean HEQS% of 8% in 1 hour. Previous studies found that evaluators identify a mean of 34% of issues, but did not address issue quality (Nielsen & Landauer, 1993). Further studies on heuristic expertise quality would make the standards more reliable.

