UX design rocks Virtual Reality

Capturly · Published in Prototypr · 7 min read · May 28, 2017

For most users, virtual reality is still something out of sci-fi movies: entering a computer-generated 3D environment. It wraps our brain and our field of view in an “illusion” that makes the experience feel real (at least, that’s what it’s supposed to do).

It can also place people in locations and let them experience things they couldn’t otherwise reach.

So when people start bragging about what an awesome weekend they had getting drunk, I respond with:

“That’s cute, I was playing pinball with planets.”

How can we categorize the experiences?

1. Hyper-immersive emotional: where we influence users’ senses

Ever wondered what flying feels like? Well, I sure did. But wait, there’s more: a project called 5Gum Erlebe takes the VR experience to a whole new level. For the ultimate multi-sense teaser, you can fly through a tunnel and smell different flavors corresponding to the colors surrounding you.

2. POV experiences: for example, an educational journey to a place users can’t visit at the moment

3. Gaming: Gamers are often open-minded skeptics — in other words, a hard target. Their expectations are among the highest, if not the highest.

The current technology behind VR:

“Without a large open space available, players run the risk of walking into walls or furniture while playing. Plus, not all players want to have to constantly run around to play games in virtual reality — for many, sitting comfortably is an important part of the video game experience.”

But even when not walking around, interesting things can happen. In some cases users, especially first-timers, merge with the environment so well that they try to move as they would in a real situation.

Talk about “I’ll believe it when I see it,” right?

Ouch

There are quite a few solutions out there, but the main problem is freedom of movement, since you’re mostly limited by the wires connected to the hardware. There are, of course, mobile VR solutions, e.g. Gear VR or Google Cardboard, but those require powerful mobile devices — not to mention that even on a 4K display you can count the pixels from up close, and you are limited by battery life. Needless to say, this leaves a rather limited environment (mostly constrained by budget) to test and develop such equipment, since it’s a niche technology kept alive by enthusiasm.

The HTC Vive has an elegant solution called the Chaperone feature: this safety function warns users before they contact an object in the real world and lets them see the control cords so they don’t get tangled. Speaking of functionality and safety, designers have to make sure that these products fit everyone. That means more than just sizes: they have to be usable by visually impaired and otherwise physically handicapped users too.
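The core idea behind a Chaperone-style boundary can be sketched in a few lines: given the play-area size and the headset position, fade in a warning grid as the user approaches an edge. A minimal sketch (illustrative names and thresholds, not the actual Vive API), assuming a rectangular play area centered on the origin:

```python
# Sketch of a Chaperone-style boundary warning (illustrative, not the
# real HTC Vive API). The play area is a rectangle centered on (0, 0).

def boundary_warning(pos, half_width, half_depth, warn_margin=0.4):
    """Return a warning opacity between 0.0 and 1.0 for a headset
    position (x, z) in meters.

    0.0 = well inside the play area (grid invisible),
    1.0 = at or past the boundary (grid fully visible).
    """
    x, z = pos
    # Distance to the nearest wall along either axis.
    dist = min(half_width - abs(x), half_depth - abs(z))
    if dist >= warn_margin:
        return 0.0  # safely inside: keep the warning grid hidden
    # Fade the grid in linearly over the last `warn_margin` meters.
    return min(1.0, 1.0 - dist / warn_margin)

# Example: a 3 m x 2 m play area (half sizes 1.5 m and 1.0 m).
print(boundary_warning((0.0, 0.0), 1.5, 1.0))  # center -> 0.0
print(boundary_warning((1.4, 0.0), 1.5, 1.0))  # 10 cm from a wall -> ~0.75
```

Each frame, the renderer would use the returned opacity to blend the grid over the scene, so the warning grows smoothly instead of popping in.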

The importance of haptic feedback:

aka. “So close, yet so far away.”

You are there, you can see it, inspect it, but still, you know that it’s a lie. This is mostly down to the lack of haptic feedback, which at this point is basically nonexistent. There are devices available, such as sensory gloves, but imagine trying a fishing simulator. So many factors influence the fisherman beyond what can be seen in the environment: the weight of the object (the fishing pole in our example), the pulling force when you have a catch, and so on. The lack of these haptic cues adds up, ending in a fun but rather empty experience. Developments have started, such as sensory gloves and projects like “Pretender” that stimulate the user’s muscles with impulses, but these are still in the research phase and not ready for the mass consumer base.

“So. What do I do?” asks the user.

“Nothing… just sit back and watch.”

All this means that the user experience is still a passive, sit-back one. The user is there to observe. That’s the problem: it’s not interactive, and users don’t feel that they’re in control.

This makes it very hard to monetize such products.

How to design for such tools?

Immersion is king. As soon as it’s broken, the user loses the experience of VR. Designers can learn a lot from the gaming industry, because ever since gaming has existed, it has been one of the few activities that can pull users out of reality and into a virtual scenario — and not just for minutes. But how can video games be so successful at generating feelings and emotions and locking users’ attention for so long? After all, the only connection between the two worlds is a screen, a controller or keyboard-and-mouse combo, and a headset. If players turn their head away from the screen even for a moment, they are technically out of the immersion — yet they slip back in immediately. How can this be a more immersive experience than a POV video, for example?

Sound design brings a lot to the table. For example, when making AAA-grade military games, experts spend a lot of time on field recording and analyzing sound sources to bring the most out of an in-game battle scenario.
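Beyond good source material, VR sound design leans on spatialization: sounds get quieter with distance and shift between the ears as the user turns. A minimal sketch of the two basic building blocks, inverse-distance attenuation and constant-power panning (the formulas are textbook simplifications, not any particular engine’s API):

```python
import math

def spatialize(source_pos, listener_pos, ref_dist=1.0):
    """Very simplified mono-to-stereo spatialization in a 2D plane.

    Returns (left_gain, right_gain): inverse-distance attenuation
    combined with constant-power panning based on the source's
    left/right (x) offset from the listener.
    """
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[1] - listener_pos[1]
    dist = math.hypot(dx, dz)
    # Inverse-distance rolloff, clamped so nearby sounds don't blow up.
    gain = ref_dist / max(dist, ref_dist)
    # Pan position: -1 = fully left, +1 = fully right.
    pan = 0.0 if dist == 0 else max(-1.0, min(1.0, dx / dist))
    # Constant-power pan law: map pan to an angle in [0, pi/2].
    angle = (pan + 1.0) * math.pi / 4.0
    return gain * math.cos(angle), gain * math.sin(angle)

# A sound 2 m directly to the listener's right: attenuated to half
# gain, and almost entirely in the right channel.
left, right = spatialize((2.0, 0.0), (0.0, 0.0))
```

In a real VR engine this runs per source per frame, driven by head-tracking data, and is usually replaced by proper HRTF-based binaural rendering — but the intuition is the same.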

The freedom of movement

In the 1950s the Navy experimented with a simulator where pilots could test their helicopter-flying skills. They reported dizziness, nausea, sweating, and disorientation — in other words, motion sickness — yet there was no actual movement involved in the tests. Thus simulator sickness was born.

In first-person shooter video games, users often don’t have legs, their arms are locked in a certain position, and so on. This means they don’t experience simulator sickness. Another factor is that users perform simple combinations of horizontal and vertical movements. Freedom of navigation is highly dependent on the input device itself: with a keyboard-and-mouse combination you can target objects very precisely, while on controllers you get some aiming assistance. In VR, users feel like a human gyroscope, able to perform movements as in real life — however, this adds sickness to the formula.

To compensate for this, you often get an avatar — basically a persona, from a UX perspective — that has freely moving limbs, just like in reality.

“Okay so this should make navigation more convenient right?”

Not quite.

It can help reduce the sickness, but if your real physique differs from the avatar’s, it can work against immersion, so it’s not a guarantee. Simulator sickness is one of the biggest enemies of the VR experience: users locked out of the real world often end up lifting the headset. So how can it be made more comfortable?

The Oculus Rift addresses this with positional tracking, and higher-resolution displays also help a lot.

Your nose knows the answer!

David Whittinghill, an assistant professor in Purdue University’s Department of Computer Graphics Technology, has a simpler yet more “human” solution.

“We’ve discovered putting a virtual nose in the scene seems to have a stabilizing effect” — D. Whittinghill

He confirmed that this method reduced the effect of simulator sickness by 13.5 percent. If we think of our nose as a fixed reference point, the idea makes a lot of sense. Of course, this won’t solve every issue related to the problem, but this small addition to the HUD is definitely a step toward a more comfortable user experience.
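The reason the virtual nose works as a fixed reference point is that it is defined in camera (view) space rather than world space: no matter how the head turns, it sits at the same offset in the user’s field of view. A minimal sketch of that idea, using a 2D top-down rotation for brevity (all names and the 5 cm offset are illustrative):

```python
import math

def world_to_view(point, cam_pos, cam_yaw):
    """Transform a world-space point into camera (view) space, 2D top-down."""
    dx = point[0] - cam_pos[0]
    dz = point[1] - cam_pos[1]
    c, s = math.cos(-cam_yaw), math.sin(-cam_yaw)
    return (c * dx - s * dz, s * dx + c * dz)

# The nose is defined directly in view space: a fixed offset in front
# of the eyes, drawn at the same spot every frame.
NOSE_VIEW_POS = (0.0, 0.05)  # 5 cm in front of the eyes, centered

def nose_position_world(cam_pos, cam_yaw):
    """Where the nose ends up in world space: it simply follows the camera."""
    c, s = math.cos(cam_yaw), math.sin(cam_yaw)
    x, z = NOSE_VIEW_POS
    return (cam_pos[0] + c * x - s * z, cam_pos[1] + s * x + c * z)

# Whatever the head does, the nose stays fixed in view space:
for yaw in (0.0, math.pi / 3, math.pi):
    world = nose_position_world((1.0, 2.0), yaw)
    view = world_to_view(world, (1.0, 2.0), yaw)
    assert abs(view[0] - NOSE_VIEW_POS[0]) < 1e-9
    assert abs(view[1] - NOSE_VIEW_POS[1]) < 1e-9
```

Anything rendered this way — a nose, a cockpit, a helmet rim — gives the visual system a stable anchor while the rest of the scene moves, which is exactly why such elements reduce simulator sickness.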

What can VR offer in the future?

One thing is for sure: the goal for UX designers is to turn the passive aspect of VR into an intuitive, interactive experience. For this to happen, development and UX have to work hand in hand. The tech world has to offer a latency-free, simulator-sickness-free environment where the user can move around as naturally as possible. User experience designers have to approach this challenge as a new medium: understand what to communicate to users to help them get familiar with it, identify which visual or audiovisual cues work best, and work out how those cues can guide users through this new environment.

Conclusion:

VR is walking a hard path. What leads to the final solution? We don’t know. But that’s not a bad thing: knowing what not to do and experiencing failures are both part of the development and design process. This means UX designers have to focus on future technologies, and developers have to be super user-centric.

Please write a comment — your feedback means a lot to me! Sharing and recommending is also a big motivation. I appreciate every bit of help! ❤


Full-scale analytics for your online business. We share content about UX, Website Analytics, E-commerce and CRO. Visit us at https://capturly.com!