The Master Chief Collection includes 6 different games from the Halo franchise, presented in one package with a unified UX that ties them together. In Halo: The Master Chief Collection, players enjoy competing against each other in head-to-head first-person shooter matches. Players do this through our matchmaking system, which pairs players based on their skill level and game preferences.
We support matchmaking across 6 different multiplayer titles, dozens of game modes, and a variety of team configurations for our social multiplayer experience. We created a single screen called the Match Composer to present this variety of options to the user in a compact and simple way.
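The pairing idea described above can be sketched roughly in code. This is a minimal illustration, not the actual matchmaking implementation: the `Player` type, `find_match` function, gamertags, and the skill-window threshold are all assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class Player:
    gamertag: str
    skill: int        # hypothetical skill rating
    preferences: set  # game modes the player opted into

def find_match(queue, candidate, max_skill_gap=100):
    """Return the first queued player within the skill window who
    shares at least one game-mode preference with the candidate."""
    for other in queue:
        if other.gamertag == candidate.gamertag:
            continue
        if (abs(other.skill - candidate.skill) <= max_skill_gap
                and other.preferences & candidate.preferences):
            return other
    return None

queue = [Player("Spartan117", 1500, {"Slayer", "CTF"}),
         Player("Arbiter", 1520, {"Slayer"})]
match = find_match(queue, Player("Noble6", 1480, {"Slayer"}))
print(match.gamertag)  # → Spartan117
```

A real system would weigh many more signals (latency, party size, search time), but the core loop is the same: filter by skill distance, then by preference overlap.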
Users can easily plan and add a route on the map interface. When the user presses the "add" icon, a control panel appears on the right. Users can set the starting point of the route and enter the route distance and angle change in both absolute and relative terms.
(1) The new navigation gives users a visual way to recognize the game they want to play, which is faster than recalling the order of the pre-set games in the system.
(2) With the pre-set games distributed horizontally, users can choose a preset game faster with the controller.
(3) The preset visual background change (demonstrated in the main gif) aesthetically engages users to play the preset games we offer.
*Note: the images were found online. In the real design demo, we will replace them with images from our library.
(1) Display all game sizes that players can choose during the match process, so users can view every game size option at once. It also helps users select a game size faster with the controller.
(2) If the player chooses a game size that doesn't match the Game Preset, the game selection slides back to "Custom".
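The snap-back behavior in point (2) can be sketched as a small selection rule. The `PRESET_SIZES` mapping and `select_game_size` function are hypothetical names invented for illustration; the real presets and supported sizes may differ.

```python
# Hypothetical mapping of game presets to the team sizes they support.
PRESET_SIZES = {
    "Big Team Battle": {"8v8"},
    "Team Slayer": {"4v4"},
    "Custom": {"1v1", "2v2", "4v4", "8v8"},  # Custom accepts any size
}

def select_game_size(current_preset, chosen_size):
    """Return the preset to display after the player picks a game size.
    If the size isn't supported by the current preset, the selection
    slides back to "Custom", mirroring the behavior described above."""
    if chosen_size in PRESET_SIZES.get(current_preset, set()):
        return current_preset
    return "Custom"

print(select_game_size("Team Slayer", "4v4"))  # stays on the preset
print(select_game_size("Team Slayer", "8v8"))  # slides back to "Custom"
```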
(1) The previous design of the Game Selection displays only the series numbers, which might confuse new users. For instance, users might not know whether the "2" icon represents Halo 2 or Halo 2: Anniversary. Therefore, the redesigned version displays every Halo title with its full logo name, so users don't have to read the description dialogue to know which game to select.
(2) Instead of using checkboxes for selection, I adopted a light-versus-dark treatment for selected versus unselected games and categories.
(3) In addition, to make the designs higher fidelity, I recreated all the icons and used strong color contrast to emphasize the selected ones.
I usually validate designs by creating user personas and customer journey maps to analyze the user flow and evaluate whether we have solved existing user pain points in the system. I also conduct qualitative and quantitative research with both the primary and secondary target audiences of our product.
For qualitative studies, I use the "Think Aloud" method: I give users usability tasks and record their initial impressions, time on task, completion rates, and any usability issues or concerns that are raised. I also conduct user interviews to ask for users' feedback on the product. Quantitatively, I use A/B testing with a control group and a test group: the control group uses the old design, and the test group uses the new design, so we can observe user behavior in both groups and evaluate whether we have successfully improved the user experience. Sometimes I divide users into "expert users" and "new users" to make sure the design is easy for new users to understand.
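One common way to read the A/B results described above is a two-proportion z-test on completion rates. This is a generic statistics sketch with made-up numbers, not data from this project; the function name and counts are illustrative.

```python
from math import sqrt

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Z statistic for the difference between two completion rates
    (control vs. test), using the pooled-proportion standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Illustrative numbers: 70/100 completed the task in the control group
# (old design), 85/100 in the test group (new design).
z = two_proportion_z(70, 100, 85, 100)
print(round(z, 2))  # → 2.54; |z| > 1.96 suggests significance at the 5% level
```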
I would collaborate with the data analytics, research, design, and engineering teams.
To start with, I would host a kickoff meeting and invite everyone who is on this project, such as the data scientists, researchers, engineers, and other designers, so we could align our vision together and reduce communication friction.
I would then work with the data analytics team to get any existing data we have on the current system, such as users' click-through rates and drop-off rates on the current interface. It would also be helpful to get user sequence data and navigation order so we can analyze the current user flow.
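Drop-off rates of the kind mentioned above are typically read off a step-by-step funnel. A minimal sketch, assuming hypothetical step names and user counts (the `funnel` data and `drop_off_rates` helper are invented for illustration):

```python
# Hypothetical funnel counts: how many users reach each step of the
# match flow, in order.
funnel = [("Main Menu", 1000), ("Match Composer", 640),
          ("Game Size", 480), ("Searching", 450)]

def drop_off_rates(funnel):
    """Fraction of users lost between each pair of consecutive steps."""
    return {f"{a} -> {b}": 1 - n_b / n_a
            for (a, n_a), (b, n_b) in zip(funnel, funnel[1:])}

for step, rate in drop_off_rates(funnel).items():
    print(f"{step}: {rate:.0%} drop off")
```

A step with an unusually high drop-off rate is a good candidate for closer qualitative study.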
After that, I would run a design sprint and lead the brainstorming sessions, including other designers on the team to explore as many ideas as possible. We would analyze personas, user pain points, and the customer journey, and evaluate ideas on impact-versus-effort metrics to arrive at feasible MVPs. Once prototypes are ready, I would host meetings with the engineers to discuss the technical feasibility of the prototypes and narrow down our design options. I would then propose research methods and work with the researchers to gather both qualitative and quantitative user data on the proposed prototypes.
Finally, after several more rounds of iteration, I would check the designs against our existing design system (if we don't have one, I would create one so our future designs stay aligned) and against accessibility standards to make sure the design is inclusive. I would then create design specs for the engineers, host meetings to walk through the designs, and finally hand them off to engineering.
To measure success of my design, I would look at the following key metrics:
Qualitative Research Feedback
(1) User initial reaction: what are users' first impressions and emotions when they see our design?
(2) User satisfaction rate: after users use our product, what are their thoughts?
(3) Think Aloud: have users "think aloud" and tell us what is on their minds as they use our product. Look for key signals: whether they experience confusion or frustration, get lost in the navigation, etc.
(4) User interviews: do users find that our product helps them complete their goals? Do they find it intuitive and easy to use?
Quantitative Research Feedback
(1) User time on task: compared to the previous design, how long do users take to complete a task?
(2) Number of taps: how many controller button presses do users need to complete the same task?
(3) User completion rate: how many users are able to successfully complete their tasks without dropping off the page?
(4) User retention rate: how many users come back and use this feature again?
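The four quantitative metrics above can all be computed from per-session test records. A minimal sketch with made-up data; the `sessions` records and field names are hypothetical:

```python
# Hypothetical per-session records from a usability test of the new design.
sessions = [
    {"user": "A", "seconds": 42, "taps": 9,  "completed": True,  "returned": True},
    {"user": "B", "seconds": 65, "taps": 14, "completed": True,  "returned": False},
    {"user": "C", "seconds": 80, "taps": 20, "completed": False, "returned": False},
]

n = len(sessions)
avg_time = sum(s["seconds"] for s in sessions) / n         # time on task
avg_taps = sum(s["taps"] for s in sessions) / n            # number of taps
completion_rate = sum(s["completed"] for s in sessions) / n
retention_rate = sum(s["returned"] for s in sessions) / n

print(f"time on task: {avg_time:.1f}s, taps: {avg_taps:.1f}, "
      f"completion: {completion_rate:.0%}, retention: {retention_rate:.0%}")
```

Comparing these numbers between the old and new design (as in the A/B setup above) is what turns them into a success measure rather than a snapshot.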