An estimated 1.3 billion people live with some form of visual impairment, and they are often unable to perform simple tasks due to a lack of accessibility features. I helped redesign grocery store kiosks made by NCR to make them universally accessible.

My Role    

  • UX Research: conducted observations and led the affinity mapping and journey mapping processes.

  • UX Design: prototyped experiences at low and high fidelity through concept sketches, wireframes, and interactive prototypes.

  • Industry Partner Communication: represented the team by communicating weekly with industry partners on project updates, scheduling meetings, and sharing resources. Presented project updates with purposeful slide decks.

Timeline    

August 2019 - December 2019 

 

*Project is currently in progress 

Primary Prototyping Tools

Adobe XD, Figma

Team Members

Anjali, Yujin, and Nektar


Results

  • Average SUS (System Usability Scale) Score: 74.1 (see the scoring sketch below)

  • Average No. of Clicks: 5

  • Average Time on Task: 34 Seconds

  • Best Features: AR Scanner, Image Recognition

  • Features Needing Improvement: NFC Pairing
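
For context, the SUS number above comes from the standard ten-item System Usability Scale questionnaire. Below is a minimal sketch of the standard SUS scoring formula; the example responses are illustrative, not actual participant data.

```python
# Minimal sketch of standard SUS (System Usability Scale) scoring.
# Each participant answers ten items on a 1-5 Likert scale; odd items are
# positively worded, even items negatively worded.

def sus_score(responses):
    """Return the 0-100 SUS score for one participant's ten responses."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly ten responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if i % 2 == 1:          # odd-numbered items: contribution is response - 1
            total += r - 1
        else:                   # even-numbered items: contribution is 5 - response
            total += 5 - r
    return total * 2.5          # scale the 0-40 sum to 0-100

# Hypothetical participant; real study responses are not shown here.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 2]))  # 82.5
```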

Background

NCR Corporation, previously known as National Cash Register, describes itself as "the #1 global POS software provider for retail and hospitality, and the #1 provider of multi-vendor ATM software."

They sell self-service kiosks, point-of-sale terminals, automated teller machines, check processing systems, barcode scanners, and business consumables. In this project, our group focused on the SS90 kiosk typically used in grocery stores.

The Problem

Currently, NCR's kiosks offer few accessibility features for visually impaired users. Users can turn up the kiosk's volume and use an on-screen navigation pad, but there is no way to adjust text size or hear all of the options presented on each screen.

Our job was to add accessibility features to the SS90 kiosk for visually impaired users.

Process

Competitive Analysis

We studied self-service checkout kiosks and software that currently have accessibility features. 

We wanted to understand what accessibility features were already available to users in self-service kiosks made by competitors.

We looked at three competitor products: Vispero's JAWS kiosk, Olea's Automated Passport Kiosk, and Frank Mayer and Associates' Grocery Kiosk. For each, we evaluated ADA compliance, support for assistive technology products, customizability of the kiosk screen, and ease of use.

From our analysis, we learned that many existing products have been unsuccessful in the visually impaired market.

From this competitive analysis, we know that the new XR7 kiosk should:

1.  Be ADA compliant

2. Be easily customizable to meet each user's needs

3. Support external assistive technology products

"The kiosks aren't really something I use. I get frustrated because sometimes the voice works and sometimes it doesn't. I'd love to be able to use it one day though."

User 1 - Fully Blind

Observations, Interviews, & Surveys

To better understand users with visual impairments, we observed them using assistive technology. We also observed people using the XK7 kiosks at NCR. Next, we interviewed users to learn how individuals with visual impairments used assistive and everyday technology, shopped for groceries, and received assistance with technology in stores.

Finally, we created a survey to find out users' likes and dislikes regarding self-service kiosks in grocery stores. We also asked preliminary questions to uncover the specific details of their preferences regarding assistive technologies.

Results

Below are the findings from our observations, interviews, and survey.

Affinity Map

Next, we created an Affinity Map to organize the information we learned from our users.

Our users were unable to use self-service kiosks in their current state, but we wondered whether the assistive technologies they already used could point to a solution.

Understanding The Users

Based on the data obtained, we created Empathy Maps and User Personas to better empathize with the users and their stories.

The two main user groups are represented by Michael and Lily.

Next, we mapped out the User Journey to identify points of frustration in the self-service kiosk check-out process.

We used the Journey Map to identify touch points and potential opportunities.

Finally, we created storyboards of our users' current journeys.

These provide additional context for understanding the roadblocks visually impaired users currently face when using a self-service kiosk.

Storyboard 1

Storyboard 2

User Pain Points

From the research conducted, we identified many challenges that visually impaired users face when using a self-service kiosk.

Below are a few of the major pain points we prioritized for the first iterations, with priority based on user feedback.

Kiosks are currently not adaptable.


Problems to Solve

Pain point identification led us to the following problems to solve:

How might we make self-service kiosks easily customizable?

How might we make self-service kiosks easier to navigate and easier to recover from errors?

How might we enable self-service kiosks to provide feedback to our users in multiple ways?

Brainstorm

We used our research to help us conduct an informed brainstorming session. Together we considered different ways to solve the pain points identified.

After considering the feasibility and creativity of each idea, we narrowed our ideas down to three different potential solutions that could solve our problem.


Potential Solutions

We used our feasibility-versus-creativity graph and the ideas in its optimal zone to help us create three possible solutions. Next, based on user feedback, we created detailed sketches and wireframes of each potential solution.

We referred back to all of our user research, user pain points, and "How Might We" questions in order to make sure we stayed focused on user needs.

AR Navigation & Item Checkout

This application uses AR to help users navigate the store. It also allows users to scan aisles and sections within aisles in order to identify the correct products and add those products to their shopping cart.

Tap to Change Accessibility App 

This application allows users to tap their phone to a self-checkout kiosk so the kiosk instantly customizes itself according to the accessibility settings on the user's phone. Users can also add items to their shopping list before or while shopping, and the same tap-to-pair action sends that list to the kiosk.
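
To make the interaction concrete, here is a minimal sketch of the kind of data the phone might hand the kiosk during the tap-to-pair step. The field names, values, and JSON-over-NFC framing are illustrative assumptions, not NCR's actual format.

```python
# Hypothetical tap-to-pair payload: the phone's accessibility preferences plus
# the user's shopping list, serialized as JSON for transfer to the kiosk.
# All field names and values here are illustrative assumptions.
import json

pairing_payload = {
    "accessibility": {
        "text_size": "extra_large",   # mirrors the phone's display setting
        "high_contrast": True,
        "screen_reader": True,        # kiosk reads every on-screen option aloud
        "speech_rate": 1.25,
        "volume": 0.8,
    },
    "shopping_list": [
        {"name": "whole wheat bread", "quantity": 1},
        {"name": "almond milk", "quantity": 2},
    ],
}

# On tap, the phone could write this JSON into an NFC (NDEF) record; the kiosk
# would apply the settings and load the list as soon as it reads the record.
message = json.dumps(pairing_payload)
print(message)
```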

Smart Cart

Smart Cart is a smart shopping cart that scans items as you put them in and adds them to your virtual shopping cart. The cart reads item names aloud. Once the cart reaches the self-checkout kiosk, it automatically uploads all the items in the cart to the kiosk.


We presented our solutions to our users for user testing. 

Below, I highlight some of the feedback we received.

Tap to Change Accessibility App

Users loved the ease of being able to tap their phone to the self-checkout kiosk and have it instantly customize to their needs. However, users were unsure how to navigate through the app.


AR Navigation & Item Checkout

The ability to have in-store AR navigation and checkout at our users' fingertips was well received. Unfortunately, concerns were raised about the safety risk of walking while using this solution.


Smart Cart

The ability of the smart grocery cart to read items aloud and instantly add them to the user's virtual shopping cart on the kiosk was very helpful. However, some users worried about the time it would take to place potential purchases into the cart just to identify each item.

Final Solution

Feedback from our users helped us to choose one solution that we felt would best solve the problem. Our users loved having AR to navigate the store and help identify products, but many were very concerned about how they would walk, use the app, and navigate the busy grocery store at the same time. A Smart Cart seemed great to users, but they wanted a more intuitive way to help them identify items.

We decided to go with the Tap to Change Accessibility solution, with a few extra integrations to implement and test with our users through multiple iterations. This solution allows users to access grocery shopping information before arriving at the store and to add items to their shopping list. A 3D map helps users find items in the store. After shopping, users tap their phone to pair the app to the kiosk; this one action sends their customized accessibility settings to the kiosk as well as their shopping list. If users choose to check out at the kiosk, computer vision recognizes items held up in front of the kiosk in a new scanning interaction that eliminates barcodes.
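
To illustrate the barcode-free scanning interaction, here is a minimal sketch that assumes a computer-vision classifier trained on the store's catalog. The classifier, catalog, and spoken-feedback call are hypothetical placeholders, not NCR's actual pipeline.

```python
# Sketch of the barcode-free scan: classify the item held up to the camera,
# look it up in the store catalog, and confirm it aloud for the shopper.
# The classifier and catalog below are illustrative stand-ins.

def announce(text):
    """Stand-in for the kiosk's text-to-speech output."""
    print(f"[kiosk says] {text}")

def scan_item(frame, classifier, catalog, confidence_threshold=0.9):
    """Identify the item in the camera frame and add it to the order."""
    label, confidence = classifier(frame)
    if confidence < confidence_threshold:
        announce("I couldn't recognize that item. Please hold it a little closer.")
        return None
    item = catalog.get(label)
    if item:
        announce(f"Added {item['name']} for ${item['price']:.2f}.")
    return item

# Example with a dummy classifier that always "sees" a loaf of bread.
catalog = {"bread_whole_wheat": {"name": "whole wheat bread", "price": 3.49}}
dummy_classifier = lambda frame: ("bread_whole_wheat", 0.97)
scan_item(frame=None, classifier=dummy_classifier, catalog=catalog)
```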

Iteration 1

In iteration 1, we completed user testing and feedback sessions to help us understand how our final solution would meet our users' needs. Users thought the system seemed intuitive and fresh. Many of our users were smiling while interacting with the kiosk.

They loved the elimination of barcodes with our feature that lets users hold items in front of the kiosk to scan them. Users were rushing to try this feature again by scanning the next item! Users also responded positively to the new interaction of pairing their phones to the kiosk.

Our users experienced some challenges when trying to pair the phone to the kiosk. Although they liked the feature, the wording seemed unclear. Also, since this was a brand new interaction, users wanted more feedback from both the phone and the kiosk on how to complete the pairing.

When trying to add new items to their shopping list or grocery cart, users did not like having to use the plus-sign icon; they wanted to be able to tap the image itself to add an item to their list or cart. Finally, users wondered how this solution would prevent people from taking items without adding them to their shopping list in the mobile app.

Iteration 2

Our findings from iteration 1 led us to add a new feature to our iteration 2 prototype: an AR Scanner. Users loved that this helped them identify items while grocery shopping. They also enjoyed the elimination of barcode scanning through the kiosk's Image Recognition.

Our users needed more support during the NFC pairing process. They expected animations in addition to the visual instructions to help them understand how the pairing works. They also wondered whether the AR Scanner could account for scanning and pricing multiple items at a time.

Next Steps


User Test & Iterate. We will test iteration 2 of this design with our users to make sure that we are meeting our users' needs. We plan to take the feedback from iteration 2 and use it to create the next iteration of our design.

This project is currently in progress as part of a semester-long class. Please feel free to come back in a few weeks to see the results of our next steps.