
Usability Study of

MoPOP's Mobile Website

PROBLEM

The Museum of Pop Culture (MoPOP) faced a common problem: a mobile website that was not user-friendly. Although the majority of the MoPOP website's visitors (60%) browsed on mobile devices, most tickets purchased through the website were still bought on desktop. This hurt overall online ticket sales, and the website design team sought help from usability researchers to improve the site.

SOLUTION

Our team designed a two-pronged study to test the ticketing flow of the MoPOP website on mobile browsers. As a team of four usability researchers, we each conducted an individual heuristic evaluation of the website's two major ticketing flows and then synthesized our results. We then conducted usability studies with six mobile users who had never used the website before but wished to purchase a ticket to the Museum of Pop Culture. Our findings helped drastically shape the next phase of design for the ticketing flow.

OVERVIEW

The Museum of Pop Culture, also known as MoPOP, was looking to improve pages and workflows on its mobile website, MoPOP.org. The purpose of this study was to identify usability issues with the MoPOP ticket-ordering process on mobile phones. Since 60% of the users who access MoPOP.org do so on mobile devices, we focused our efforts on the responsive site. The study examined two workflows: purchasing a general admission museum ticket and purchasing a ticket to an exhibition. We identified the usability issues in these workflows in order to make the MoPOP site easier to use on a smartphone and increase mobile ticket purchases in the future.

ROLE

User Recruitment

Heuristic Evaluation

Usability Test Facilitation

Usability Test Notetaking

Analysis

Report Writing

Presentation

The specific objectives of this study include:

  1. Identify issues that might deter MoPOP mobile users from completing an online ticket purchase.

  2. Assess the ease of viewing MoPOP museum event details and purchasing an online ticket.

  3. Determine user satisfaction with mobile ticket conversion.

The questions we investigated are based on the current interaction flow of MoPOP.org.

[Image: MoPOP.org interaction map]

We answered the following research questions as part of this study:

  1. What is the path users take when completing an online ticket purchase?

  2. What is the path users take when searching for museum events?

  3. Is the number of steps users take to complete each task in each planned scenario in line with our assumed metrics? If not, why not?

  4. How successfully do users:

    • Navigate through the General Admission ticket process (Flow #1)

    • Navigate through the Event/Exhibition ticket process (Flow #2)

  5. What obstacles or frustrations prevent users from making an online ticket conversion?

MEET THE TEAM


Proshonjit Mitra

Samantha Baker

Rachel Binnicker

Anna Davies

PROCESS & TIMELINE

[Image: project timeline]

PARTICIPANT PROFILES

The user group defined for this project was fairly broad, as the bulk of museum attendees are tourists within a wide age range. We tested the MoPOP ticket-ordering workflows with six participants between the ages of 18 and 35. Four of our participants were male and two were female.

[Image: participant profiles]

The most important criterion in our screener was whether or not the participant had ever been to MoPOP. We only wanted to test first-time visitors in order to more closely mimic MoPOP's main demographic: tourists who are new to Seattle. Every participant we tested indicated in the screener that they had never visited MoPOP, but participant 6 later told us he had been to MoPOP. We decided to keep his data in our data set because he had never purchased a ticket to MoPOP online. Our participants used a range of cell phones and browsers to access the MoPOP mobile site, ensuring that any findings shared between participants were not bugs or byproducts of a specific phone or browser.

METHODS

We used two methods to evaluate the MoPOP ticket-ordering process: a heuristic evaluation and a think-aloud usability study. The heuristic evaluation allowed us to do a detailed analysis from a professional perspective, and the think-aloud user study gave us insights on real site use and participant attitudes toward the process.

HEURISTIC EVALUATION
  • We performed the heuristic evaluation by each analyzing the site individually, then coming together to synthesize the key findings from the results.

  • We compiled violations and successes of the site against the ten heuristics outlined by the Nielsen Norman Group.

USABILITY STUDY
  • We also conducted a usability study with six participants; testing took place between February 26th and March 6th.

  • We tested two similar user flows through the ticketing process, using participants' own mobile devices.

USABILITY STUDY DETAILS

Prior to beginning the test, each participant signed a video release consent form and completed a pre-test questionnaire to determine their familiarity and previous experience with buying tickets on their mobile phones. The usability test began with the moderator providing general information regarding the purpose of the study, then the participant completed the workflows. We encouraged participants to think aloud as they completed their tasks. Each participant was given a promo code specific to the study that allowed them to complete one ticket conversion at no cost.

Half of the participants began with task A, and the other half began with task B.

Task A. Purchase a general admission ticket for MoPOP

Task B. Purchase a ticket for a special exhibition

Following the completion (or incompletion) of the initial task, the participant was instructed to complete the secondary path, but to stop at the point of entering personal information and not attempt to complete a second purchase. Therefore, half of our participants completed workflow A and the other half completed workflow B.

Workflow A. Purchase a general admission ticket -> then attempt to purchase a ticket for special exhibition but stop at the point of entering personal information

Workflow B. Purchase a ticket for special exhibition -> then attempt to purchase a general admission ticket but stop at the point of entering personal information

After each workflow, the moderator asked a few post-task questions. After the user completed both tasks they completed a short survey and debrief.

FINDINGS & RECOMMENDATIONS

SUCCESSES

Overall, participants were satisfied with their experience using the MoPOP mobile site. When asked if the participants felt this process was similar to online purchases on other websites, most replied with “yes, for the most part.” There was considerable overlap between the findings of the heuristic evaluation and usability testing. Some of the notable successes from the usability testing as well as the unique findings from the heuristic evaluation are presented below.

HEURISTIC EVALUATION
  • Recognition rather than recall: Consistent with industry standards

  • Flexibility and efficiency of use: Option to create account

  • Aesthetic and minimalist design: Aesthetically pleasing

  • Help users recognize, diagnose, and recover from errors: Clear language surrounding errors

USABILITY STUDY
  • "Buy Tickets" button is big and noticeable

  • Users can make a purchase without creating an account

  • Quick links to Tickets page from search engine

  • Phone number input field takes multiple formats

  • System recognizes an existing user

OPPORTUNITIES

As mentioned previously, we led our participants through two flows. Flow A = General Admission ticket purchase -> Exhibition ticket, and Flow B = Exhibition ticket -> General Admission ticket. Upon making a conversion, the task was marked as successfully completed. The data below represents the completion rate for each task described above in the methods section.

[Image: task completion rates]

Three out of three Flow A participants failed their first task, but all succeeded at the second. Flow B had similar results: two out of three participants failed their first task but successfully completed their second. We therefore found the mobile website to be learnable; all of our participants completed their second task, even those who failed the first.

Many of our findings from the two methods overlapped, but the heuristic evaluation revealed opportunities that did not arise in the usability test. These are presented below:

Visibility of System Status

No visual indication of where you are in the process of purchases

There is no indication of how far along the user is in the purchasing process until the user gets to the billing information screen. At that point, there are three steps listed at the top of the screen, but the highlighted section is incorrect.

Inconsistency of how buttons change after being pressed

A couple of buttons provide some visual feedback when they are pressed, but there is no consistency. Some change color to a shade darker and others underline the words on the button.

User Control and Freedom

No clear backout option in some places

There is no back button (other than the browser default back button) for the user to take a step back and change their visit date if they wish to once they are in the personal information section.

No clear exhibition end date

There are notes above the visit date calendar that show when special exhibitions open, but not when they end. In order to determine when an exhibition ends the user must abandon the ticket buying process and go to another page.

Checkout as guest option is scrolled off screen

Users have the option to log in, register, or check out as a guest, but the guest option is only visible when one scrolls to the bottom of the page.

Consistency and Standards

User is unnecessarily prompted to log out

Once the order process is complete, the user is prompted to log out even if they never created a log in.

Inconsistent button styling

Buttons are not all one color on the site; some are white with pink outline and some are pink inside.

Error Prevention

No information validation

When entering personal information, the user is able to enter junk data without triggering an error; for example, letters are accepted in the zip code field.
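A fix for this heuristic violation is straightforward client-side validation before submission. The sketch below is a minimal example, assuming a US 5-digit zip format; the actual billing form may need to accept other postal formats.

```typescript
// Minimal client-side validation sketch for the zip code field.
// The 5-digit (optionally ZIP+4) US format is an assumption; the
// real form may need to accept international postal codes too.
function isValidZip(value: string): boolean {
  return /^\d{5}(-\d{4})?$/.test(value.trim());
}

// Rejects junk data like "abcde" before the form is submitted,
// rather than letting it through silently.
```

In practice this check would run on the field's change event, flagging the input inline instead of allowing junk data to pass through unnoticed.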

Recognition rather than recall

Ticket prices not obvious at every step

The ticket prices are not carried through the process so a user must remember the price of each ticket from the initial pricing chart. 

Aesthetic and Minimalist Design

Finishing action is not distinguished from other actions

The button that completes the purchase is styled like every other button, so the finishing action does not stand out from intermediate actions.

The format is not always optimized for mobile

In some places, there is a bulk of information that is not completely legible on mobile. For example, the big grid of prices is above the buy tickets button and users have to scroll down to find the buy ticket option. 

No padding between certain elements

There are some elements that do not have padding between them. For example, the ‘Continue shopping’ and ‘Proceed to checkout’ buttons.

Confusing menu layout

Users can get lost deep in the hamburger menu. It becomes difficult to discern open sections from closed ones, and the hierarchy becomes fairly confusing on mobile devices.

Unstyled elements

Many of the inputs and dropdowns throughout the purchase process are unstyled HTML.

Unnecessary information on General Admission tickets

In the General Admission ticket purchase flow, there is information about exhibitions but no context for it.

Help Users Recognize, Diagnose and Recover From Errors

No error highlighting

When a user fails to enter required information into the form, the system alerts them to this fact, but it does not take the user to the error or highlight exactly where it occurred.
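A common remedy is to locate the first invalid required field and scroll to it. The sketch below shows one way to express that logic; the field names and validators are illustrative assumptions, not MoPOP's actual form fields.

```typescript
// Sketch: find the first invalid required field so the page can
// scroll to it and highlight it, instead of showing a generic alert.
// Field names and validators here are illustrative assumptions.
type Validator = (value: string) => boolean;

function firstInvalidField(
  values: Record<string, string>,
  validators: Record<string, Validator>,
  order: string[],
): string | null {
  for (const field of order) {
    const check = validators[field];
    if (check && !check(values[field] ?? "")) return field;
  }
  return null; // all fields valid
}

// In the browser, the caller could then do something like:
//   document.getElementById(field)?.scrollIntoView();
// and add an "error" CSS class to highlight the offending input.
```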

USABILITY ISSUES

[Image: usability issue types]
Major Issues

Billing information fields are cut off

Issue found in 6/6 studies

The billing information fields do not fit on the screen, and the first five characters are cut off. This makes it very difficult for users to enter their personal information; in some cases, such as the state field, the user cannot see any of their entry.

[Screenshot: billing information fields cut off]

RECOMMENDATION: Change the organizational structure to a single-column layout (for mobile) so that all the text fields and labels are clearly visible. This should help improve usability as the users will be able to see what they’re typing, and they would not have to scroll and zoom as much as they do now.

Promo code field is not in a standard location

Issue found in 5/6 studies

Several of the users tried to enter the promo code into the gift certificate field without success. None of the participants who struggled went back to the first page to find the field, suggesting the first page is not an intuitive place for the promo code field. Additionally, almost every user said they expected to enter the promo code on the same page as the billing information.

[Screenshot: promo code field location]

RECOMMENDATION: Moving the "Promo Code" field from its current position to sit next to the "Gift Certificate" field should alleviate this issue. Popular e-commerce websites (e.g. Amazon.com) often combine the two into a single field, labeled something like "Promo Code or Gift Certificate," that accepts both types of codes even though the codes' structures generally differ.
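A combined field works because the backend can classify the code by its structure. The sketch below illustrates the idea; both code formats here are hypothetical, standing in for whatever patterns MoPOP's ticketing system actually uses.

```typescript
// Sketch of a single "Promo Code or Gift Certificate" field that
// dispatches on the code's structure. Both formats below are
// hypothetical assumptions, not MoPOP's real code patterns.
type CodeType = "promo" | "gift" | "unknown";

function classifyCode(code: string): CodeType {
  const c = code.trim().toUpperCase();
  if (/^[A-Z]{4}\d{4}$/.test(c)) return "promo"; // e.g. four letters + four digits
  if (/^GC-\d{10}$/.test(c)) return "gift";      // e.g. "GC-" + ten digits
  return "unknown";
}
```

With a classifier like this, one input can route promo codes and gift certificates to the correct redemption path without the user having to know the difference.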

Ticket delivery method is not clear

Issue found in 2/6 studies

These participants expressed that they did not know how the ticket would be delivered, or were frustrated that they were not given a choice of delivery method. There is no indication that the ticket will be delivered to the email the user enters during checkout.

[Screenshot: no indication of ticket delivery method]

RECOMMENDATION: Before the user clicks the submit/purchase button, there should be a clear indication that the ticket will be delivered to the email address they have provided. Ideally, this information appears while the user is filling out their email address, which also encourages them to enter a valid one. It can additionally be displayed on the checkout page.

No indication of a button being clicked

Issue found in 5/6 studies

Most of the buttons in these workflows give no visual indication that they have been clicked. Users expressed confusion multiple times because they were not sure whether to wait for the system to respond or to click the button again. The issue was compounded by the slightly long time the 'Add to cart' button takes to refresh the page, which left users with duplicates of the same ticket in their shopping cart.

[Animation: no visual feedback on button click]

RECOMMENDATION: Every button click needs a visual confirmation so that the user knows their input was registered. This could be an animation around the button itself to signify that it is being pressed, or a loading animation shown instantly until the next screen loads or the next action is performed.
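Beyond visual feedback, the duplicate-ticket problem can be prevented by ignoring repeat taps while a request is pending. The sketch below shows one minimal guard; the addToCart callback is an assumption standing in for the site's real add-to-cart request.

```typescript
// Sketch of a guard that ignores repeat taps on "Add to cart" while
// a request is still pending, preventing duplicate tickets in the
// cart. addToCart is a hypothetical stand-in for the real request.
function makeSingleSubmit(addToCart: () => Promise<void>) {
  let pending = false;
  return async (): Promise<boolean> => {
    if (pending) return false; // ignore the repeat tap
    pending = true;            // good moment to show a spinner, too
    try {
      await addToCart();
    } finally {
      pending = false;         // re-enable once the request settles
    }
    return true;
  };
}
```

Wiring the spinner into the same guard gives the user the missing visual confirmation and removes the incentive to tap twice in the first place.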
 

Unnecessary scrolling

Issue found in 3/6 studies

Most screens in these workflows do not resize to fit a phone screen. This caused users to do a lot of scrolling up and down and side to side to see everything on the screen. Several users commented on this being a frustrating part of the process.

[Animation: excessive scrolling on non-responsive screens]

RECOMMENDATION: Everything on the page should be displayed within the width of the mobile screen. A single-column layout, as mentioned earlier, will solve part of this problem. While elements like the ticket price table are an important source of information, forcing users to scroll and zoom places an unnecessary burden on them. This can be addressed by making the table's width adjust to the mobile screen so that everything fits. Beyond that, consider removing from the mobile site any unnecessary information that takes up too much screen real estate.

Confusing workflow to purchase an exhibition ticket

Issue found in 3/6 studies

Many of the participants initially tried to purchase an exhibition ticket through the large "Buy Tickets" button. From there, users could not find a clear distinction between general admission and exhibition tickets. Users were not inclined to click on the list of exhibitions in the menu at first, expecting instead to find exhibition tickets through the "Buy Tickets" option.

[Animation: exhibition ticket purchase workflow]

RECOMMENDATION: Although the banner images lead straight to purchasing some of the special exhibition tickets, users who come directly to the Tickets page should also be given a clear choice between a museum admission ticket and a special exhibition ticket. This could take the form of separate buttons for each ticket type to clearly display the difference. A separate theme (matching the banner's theme) could also make the exhibition ticketing route distinct from the general admission route.

POST TEST SURVEY RESULTS

After the participants completed both tasks, they were asked to complete a short Google Forms survey about their general experience using the site. We separated this survey from the post-study debrief, which was conducted verbally, to reduce the chance that users would answer more kindly to spare our feelings. The most interesting insight from the post-test survey was that most participants would still rather buy their tickets online than at the ticket counter, even though the majority failed at least one of the ticket-ordering tasks. Half of the participants also agreed they were able to use the system with ease.

At the end of the study users were asked if they would prefer to complete this purchase on a desktop or mobile device if given the choice. All users responded with desktop computer. Most users explained that this was because of the billing information being difficult to enter. Many also explained that the information on the screen would probably be easier to see on a desktop with less scrolling.

[Image: post-test survey results]

FUTURE STUDIES

Although most of our interviews went smoothly and according to plan, there were a few things we’d probably do differently if we were to conduct this study again.

Fewer facilitator questions between tasks
Some of the questions we asked between tasks were repetitive, and the answers we got from them had diminishing returns. This could be because the users mostly followed a very similar flow during both tasks, so they did not have anything different to say. Had we anticipated this, we would have made a unique set of questions for each task and tried to elicit different pieces of information. We could also have cut down the total number of questions to make the study shorter.

Testing with older adults
Although we began recruitment with the age range of 18-48 in mind, we only ended up with participants between the ages of 18 and 35, and thus we missed out on a major piece of the demographic. If we were to do this again, we would plan accordingly and include at least one or two people aged 36-48, to ensure wider demographic coverage.

Invite stakeholders to observe and participate in tests
Even after conducting the heuristic evaluations, our users managed to surprise us with more usability issues. Having stakeholders from the MoPOP design and development teams join us for heuristic evaluations and/or user tests would help us communicate our findings more effectively.

Explore different study locations and setup
Because we wanted to make our participants more comfortable and be mindful of their schedules, we allowed them to suggest a spot to conduct their study. As a result, most of our studies were conducted in coffee shops. Although coffee shops are a public and relaxed setting, it was not as quiet as a secluded lab. It would be interesting to explore new environments to conduct future studies.

Aside from these, we might consider recording the entire test if we had a smaller team and only one person was conducting an interview. Fortunately, we did not need to do that because we always went in teams of two or three, and took plenty of notes during each session.

APPENDIX

Here is the full report and some of the materials we used to conduct this study for your reference:
