Project Azamusk: User Testing

This post outlines the testing process for Project Azamusk, my senior capstone project completed in April 2021. Over the course of eight months, the team scheduled and conducted the following test plans around the course deliverable deadlines (Alpha, Beta, Gamma, and Final).

Note: September was spent on planning the project, managing scope, and creating a production plan.

Stage 1: Prototyping (October - December)
  • Mechanics prototypes
    • Developed in Unity
    • Low-fidelity internal testing of how important mechanics would function, such as how an object can be moved around on a grid
Stage 2: Preliminary User Testing (January - February)
  • Controls prototype (Alpha)
    • Developed in Unity, tested in browser via itch.io for convenience
    • Low-fidelity external testing of camera view and object movements
  • Puzzle design prototype
    • 3D mockups created in Maya
    • Low-fidelity external testing of key game concept and puzzle progression
  • User interface (Beta)
    • Developed in Adobe XD, tested in browser
    • Medium-fidelity external testing of user interface design
    • Unity implementation of the interface was in progress, built to match the designs and updated based on user testing feedback
Stage 3: User Testing (March - April)
  • Playtesting (Gamma/Final)
    • Unity game, tested in browser via itch.io or by downloading the .exe
    • High-fidelity external testing of gameplay
  • Quality assurance
    • Checked for bugs and unintended challenges in the game

I ran the user testing (with one other teammate to limit researcher bias) for the controls prototype, the user interface, and some playtesting sessions. The team also conducted puzzle design testing using a “Wizard of Oz” method with a 3D mock-up in Maya. I was not running this study, so I won’t go in depth, but the goal was essentially to check whether the game concept was fun for players. The main findings were the need for an interface that helps players understand their options for solving the puzzles, and the need to make interactable objects more distinct from the environment.

Due to the remote work situation, all user testing was conducted remotely via video conferencing. Participants were not required to turn on their cameras, but shared their screens throughout the individual sessions while researchers observed and took notes. The goals, methods, findings, and informed changes from the user tests I conducted are described below.

Controls Prototype

Research goal: How intuitive are the controls?
Measures: time duration of task completion, observed behaviour and feedback
Method:
  1. Pre-test questionnaire (about prior gaming experience and preferences)
  2. Observations with "think-out-loud" (using an itch.io prototype in browser)
  3. Post-test questionnaire (feedback regarding the controls)
Screenshot of the controls prototype

The task was to locate and move 3 spheres to the objective (green zone). Participants were asked to complete this task without being told anything about the controls; instead, they had to discover the controls themselves.

In this prototype, camera movement is controlled with WASD, zoom with the mouse scroll wheel, and free rotation with Q and E. Participants could also left-click spheres to select them and right-click grid spaces to move the selected sphere there, with pathfinding that routes around obstacles even over large distances.
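The obstacle-avoiding movement described above can be sketched with a simple breadth-first search on the grid. This is only an illustrative Python sketch under my own assumptions; the actual game logic was written in Unity and is not shown in this post:

```python
from collections import deque

def find_path(grid, start, goal):
    """Breadth-first search on a 2D grid.
    grid: 2D list where 0 = walkable, 1 = obstacle.
    start, goal: (row, col) tuples.
    Returns the shortest list of cells from start to goal, or None."""
    rows, cols = len(grid), len(grid[0])
    queue = deque([start])
    came_from = {start: None}  # also serves as the visited set
    while queue:
        cell = queue.popleft()
        if cell == goal:
            # Reconstruct the path by walking back to the start
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == 0
                    and (nr, nc) not in came_from):
                came_from[(nr, nc)] = cell
                queue.append((nr, nc))
    return None  # no route around the obstacles

# Example: a 3x3 grid with a wall in the middle column
grid = [[0, 1, 0],
        [0, 1, 0],
        [0, 0, 0]]
path = find_path(grid, (0, 0), (0, 2))  # routes around the wall
```

Because BFS explores cells in order of distance from the start, the first time it reaches the goal it has found a shortest route, which is why the sphere can take sensible detours even over large distances.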

We recruited 17 participants from friends and acquaintances (8 male, 9 female) as a selective sample representative of our target audience, aged 20 to 35. The results from our pre-test questionnaire indicated a mix of experience levels and preferences, which was ideal for testing how new players unfamiliar with Real-Time Strategy (RTS) game controls would approach our game, since we chose a similar control scheme.

Findings and informed changes:
Participants took an average of approximately 3 minutes to figure out the controls and complete the task. The fastest time was just under a minute while the slowest time was around 6 minutes. Unsurprisingly, we found that participants who self-reported more experience with RTS games generally spent less time completing the task.

Positive feedback was received for WASD view movement and scrolling to zoom. Some participants had difficulty finding the controls to rotate the view, but once they discovered Q and E they generally found them easy to use, although a bit slow. For the next prototype, we made rotation faster by snapping to 90-degree angles rather than rotating freely. We also made the arrow keys mirror WASD, to provide more options to users.
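The 90-degree snap comes down to a single rounding step. A minimal sketch, assuming the camera's yaw is tracked in degrees (the function name and signature are hypothetical, not taken from the project code):

```python
def snap_rotation(yaw_degrees, step=90):
    """Snap a rotation angle to the nearest multiple of `step`
    degrees, normalised to the range [0, 360)."""
    return round(yaw_degrees / step) * step % 360

snap_rotation(37)   # -> 0   (rounds down to the nearest 90)
snap_rotation(47)   # -> 90  (rounds up)
snap_rotation(350)  # -> 0   (wraps past 360)
```

Rotating in fixed increments like this made turning the camera faster and more predictable than the free rotation tested in the prototype.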

We observed mixed reactions to using right-click to move the ball, with a trend based on prior gaming experience. Participants less experienced with games did not find right-click intuitive for moving objects and usually tried to left-click a grid square instead. Although this is a concern, we ultimately decided not to change this control because left-click was already used for other types of actions. However, we planned to make this control explicit and clearer to the user.

User Interface Testing

Research goal: How understandable is the user interface?
Measures: observed behaviour and feedback
Method:
  1. Pre-test questionnaire (about prior gaming experience, expectations, and preferences)
  2. Observations with "think-out-loud" (using an Adobe XD prototype in browser)
  3. Post-test questionnaire (feedback regarding the UI)
Screenshot of Adobe XD prototype help pop-up screen

Screenshot of Adobe XD prototype robotic arm interface

Screenshot of Adobe XD prototype temperature jet machine interface

We recruited 18 participants from friends and acquaintances (10 male, 8 female). Participants were asked to complete two introductory levels guided only by the information on screen; for instance, hints appeared when the participant pressed the Help button. The first level simply involved using the robotic arm to pick up the capsule and move it to the receptacle. The second level required activating a temperature jet machine to burn bushes and clear the path to the receptacle.

At the time of testing, the UI implementation in Unity was still in progress, so we tested an Adobe XD prototype built from screenshots of the most recent Unity build with the UI art overlaid. We did not want to delay UI testing, as the interface is an important part of how the player interacts with game elements. Limitations of this method included a fixed camera view, left-click interactions only, and a finite set of possible solutions. The participants were made aware of these limitations.

Findings and informed changes:
Graph of responses for understanding what to do for the first level
We found that the first level was very clear: the great majority of participants understood the primary objective of the game, which is to pick up the capsule using the arm and place it on the receptacle.

Graph of responses for understanding what to do for the second level

The second level, however, may have been too large a difficulty increase. We saw high variation across the ratings, with a mean of 3.28. We also observed some confusion caused by presenting the participant with multiple new features at once.

For the next prototype, we provided more explicit instructions in the introductory levels to inform the player what they should do, for example highlighting what the player should click next, paired with short text instructions. We also aimed to lower the learning curve by introducing fewer new elements in each introductory level.

Moreover, we received feedback on the understandability of the icons chosen and their locations. We also found that users preferred the floating UI attached to machines over having buttons fixed at the bottom right of the screen (see the robotic arm interface for this prototype). We made these informed changes for the next functional prototype in Unity.
Screenshot of plans for UI design updates

Playtesting

With less than a month before the deadline, the remaining work was primarily bug fixes, additional game levels, and refinement. At this stage, participants could either play the game in browser via itch.io or download the .exe for a better experience (some lag may occur in browser). The goal of further playtesting was to gather general feedback on the experience, aiming to answer broader questions:
  • How is the gameplay?
  • Is it too easy/challenging?
  • Are there any bugs?
Overall, we followed our schedule for user testing and made informed changes based on the feedback received for each functional prototype, in order to improve our game and deliver the best possible final product at the end of April 2021.
