In my previous blog post, I stated that currently there is no such thing as the metaverse. While I stand by that statement, this does not mean we cannot look ahead and think about what (QA) challenges an actual metaverse might throw at us.
In this blog, I will dive into the intricacies of ‘Usability’ in the metaverse. Traditionally, usability in IT has focused on some of the following elements:
- Do the visuals support and facilitate the user journey?
- Is the user’s attention naturally drawn to the right focal point?
- Are all controls and navigation self-explanatory?
- Is there uniformity in the formatting?
- Does the solution look right on different screen formats?
- Are all icons and images clear?
These evaluation criteria make sense, as we are often talking about a 2D product, almost exclusively consumed through a screen – whether that be a smartphone, tablet, or PC – and controlled through touch or mouse-and-keyboard input.
When talking about usability, there are additional factors to consider in the metaverse. Not only is there literally a 3rd dimension to factor in – both digitally and physically – but the input paradigm also needs to be reconsidered.
A 3rd dimension poses usability challenges for even a basic user journey. Let’s take the example of a sign-up journey, which in most cases looks something like this: the user clicks the ‘create account’ button, enters their personal data, hits ‘confirm’, the page reloads, and a ‘sign-up successful’ page appears. What would such a journey look like in a 3D space? Will I see a digital terminal I can visit to enter my details, or will there be some giant form floating in the air? Do I use a physical or digital keyboard, or is there another medium of entry? And how will I get confirmation of a successful sign-up, given that a full ‘page refresh’ of our digital space is probably not the best idea?
On the topic of input paradigms: when an actual metaverse eventually comes to fruition, it is likely that just about any digital device will act as an entry gate to it. On smartphones or PCs, you will likely move your avatar around the 3D space as you would in any video game today: using mouse and keyboard, virtual joysticks, or the motion capabilities of the device. However, with the advent of the metaverse, there will be additional platforms and means through which we connect. Take the example of a complete VR setup: ergonomics and spatial needs pop up as important usability criteria, with potential concerns such as:
- How long can I comfortably wear the VR goggles?
- With a wired setup, do I risk getting entangled or tripping?
- With a wireless setup, is the battery backpack light enough to wear comfortably?
- Are my controllers easy to use and properly sized?
- How much unobstructed surface area do I need to navigate comfortably and freely?
Going beyond pure physical ergonomics, there are other aspects to consider:
- How do I visually distinguish objects that allow interaction versus background objects?
- How do we minimize the risk of motion sickness?
- In a 3D world, how do we pull the user’s focus to where we want it? For example, if a user needs to look at something that is behind them, how will they know?
- When the visual experience becomes paramount, how do we assist visually impaired or colour-blind users?
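One common pattern video games use for the ‘behind the user’ problem above is an off-screen indicator: compare the user’s gaze direction with the direction to the point of interest, and show a directional cue (an arrow, a sound, a haptic pulse) whenever the target falls outside the field of view. A minimal 2D sketch of that check, where the function name and the 90° default field of view are purely my own illustrative assumptions:

```python
import math

def needs_turn_cue(user_forward, user_pos, target_pos, fov_deg=90.0):
    """Return True when the target lies outside the user's field of view,
    i.e. the UI should display a directional cue guiding the user to turn.

    Vectors are 2D (x, z) tuples: a top-down view of the scene.
    """
    # Direction from the user to the target, normalised.
    dx, dz = target_pos[0] - user_pos[0], target_pos[1] - user_pos[1]
    dist = math.hypot(dx, dz)
    if dist == 0:
        return False  # target is at the user's own position
    to_target = (dx / dist, dz / dist)

    # Cosine of the angle between gaze direction and target direction.
    dot = user_forward[0] * to_target[0] + user_forward[1] * to_target[1]

    # Outside half the field of view means the user cannot see it.
    return dot < math.cos(math.radians(fov_deg / 2))

# User looks along +z; target directly behind them: cue needed.
print(needs_turn_cue((0.0, 1.0), (0.0, 0.0), (0.0, -5.0)))  # True
# Target straight ahead: no cue needed.
print(needs_turn_cue((0.0, 1.0), (0.0, 0.0), (0.0, 5.0)))   # False
```

In a real engine this would run every frame against the headset’s tracked orientation; the point here is only that the QA question above maps to a concrete, testable behaviour.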
As a QA engineer, these are important factors to consider when determining your test approach and setup. As a whole, QA professionals will likely need to take their existing business and functional knowledge and extend it with important lessons from video game QA engineers, who already have extensive experience with the challenges a digital 3D world entails. Our knowledge domain will need to expand, and creativity and curiosity will become an increasingly important part of the job description.