Extended Reality Oxygen Flow Meter

As a novice UX/UI designer, I was very intrigued to join this start-up and dive into the world of XR (extended reality). This was my first time designing something that wasn’t 2D, and I was challenged and stretched to think and design in a different way.


Some of the challenges I faced with this project included: #1, not being able to have direct communication with the client; #2, only being able to user test within the team, without measurable success metrics; and #3, having to design within the color and button constraints of the MRTK 2 kit.

Oxygen flow meter prototype made in Figma

3D Oxygen flow meter made in Unity

Understanding the Issue

The challenge was to redesign an extended reality oxygen flow meter using the Microsoft HoloLens MRTK 2 components. Users were having difficulty changing the oxygen rate quickly with the existing buttons. The redesigned oxygen flow meter would let the user quickly select their prescribed oxygen level with a slider instead of a button.

Goal

The goal of this project was to redesign our extended reality oxygen flow meter user experience.

Process

The users are participating in an emergency scenario with a patient who is having difficulty breathing. In this emergency scenario it is important that the user can change the oxygen flow rate from 0 to 15 quickly. We inferred this challenge from a video of medical students trying to get from 0 to 15 quickly on the oxygen flow meter. One user was observed holding their hand in the air for over 15 seconds and tapping 17 times continuously to reach the desired oxygen level.

Discovering a pain point

The problem

Here are some of the insights and problems found from watching the video of the medical students using our app; these same insights informed the solution:

  • Insight #1: Arm fatigue from holding the arm in the air for long periods while tapping 14+ times; difficulty going from 0 L to 14 L quickly; difficulty selecting and changing the oxygen level.

  • Insight #2: The oxygen flow rate administration does not reflect how oxygen flow is adjusted in a realistic scenario.


Based on these insights, we decided to redesign our button-controlled oxygen flow meter as a slider that mimics a realistic oxygen flow adjustment experience.

Research

Arm fatigue in VR/XR is a real problem (known as “gorilla arm” in the VR/XR world). By observing and addressing the user’s physical ergonomics and the ways they manipulate their virtual environment, we can make the experience much more comfortable. In our scenario we observed the medical students’ arms raised in the air for extended periods while tapping excessively, and this is something we wanted to minimize.
I tested “tapping” against “dragging and sliding”: it takes approximately 12 seconds to tap 15 times in the air, versus about 4 seconds to drag.
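The informal comparison above can be sketched as a back-of-the-envelope model. The per-tap time below is derived from my rough test (12 seconds for 15 taps), not a measured MRTK latency, and the function and constant names are illustrative:

```python
# Back-of-the-envelope comparison of tap vs. drag interaction time.
# The per-gesture timings are assumptions from an informal test,
# not measured values.

TAP_TIME_S = 0.8    # assumed time per mid-air tap (12 s / 15 taps)
DRAG_TIME_S = 4.0   # assumed time for one continuous drag gesture

def time_to_reach(target_rate: int, start_rate: int = 0) -> dict:
    """Estimate time to change the flow rate with each interaction style."""
    taps_needed = abs(target_rate - start_rate)  # one tap per 1 L/min step
    return {
        "taps": taps_needed,
        "tap_seconds": taps_needed * TAP_TIME_S,
        "drag_seconds": DRAG_TIME_S,             # a slider is one gesture
    }

print(time_to_reach(15))
```

Even this crude model shows why the tap design scales badly: the cost grows linearly with the target flow rate, while a drag stays roughly constant.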

Analyzing current designs

To the left, contained in the yellow box, is the current oxygen flow meter user experience. The buttons increase the rate in increments of 1; the user taps continuously until the desired flow rate is reached.

A realistic oxygen flow meter

Our current oxygen flow meter experience is very different from adjusting a real oxygen flow meter. A real oxygen flow meter is usually a knob mounted on the wall (see the photo to the right). In a realistic scenario, the oxygen flow is adjusted by turning the knob until the floating “ball” in the tube aligns with the line of the desired number. Our current design does not show a flow meter at all and has buttons instead of a knob.

Design

After researching, I started exploring and sketching a variety of designs. These designs were tested and iterated on based on feedback within the team.

Iterations

I had four different design ideas, which I narrowed down based on feedback and testing within the team.

  • #1: My first idea was to go directly to the holographic image of the oxygen valve, turn the knob, and watch the floating ball move. After discussion, we found that we could not make the holographic image move or respond to the user’s touch, so we decided against that option.

  • #2: My second idea was to superimpose a design and outline of the oxygen flow meter over the holographic image. After further discussion we decided that this might obstruct the view of the holographic image.

  • #3: My third idea was a half-circle slider. The user would move their finger along the half circle, and the floating ball on the vertical oxygen rate display would move as the user slides their finger.

  • #4: My fourth idea was a straight floating slider next to the 3D flow meter.

Idea #1: Manipulate 3D image

Idea #2: Manipulate superimposed design

Idea #3: Manipulate half-circle slider

Idea #4: Manipulate floating slider next to 3D flow meter

After further discussion, we found that a straight slider would be easier to implement, as it is already part of the MRTK (Mixed Reality Toolkit) components. As a team we decided that design #4 would work best for our purposes because it would allow the user to choose accurately and move quickly. This design was also the easiest to see in XR, as thin lines can get lost in the user’s background. It was also easier for the developers to build, since the MRTK components were already in use in our app. The user is not able to directly manipulate the 3D oxygen flow meter, so with the floating menu adjacent to it, the user can manipulate the “ball” on the slider and move it to their desired number.
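As a sketch of how a straight slider maps onto discrete flow rates: MRTK 2’s slider reports a normalized value between 0 and 1, so the app has to snap that value to whole liters per minute. This Python sketch is illustrative only; the function names and the 0–15 L/min range are assumptions from our scenario, not MRTK API:

```python
# Snapping a normalized slider value to whole-liter flow rates.
# Assumes the XR slider reports a position in [0, 1]; names and the
# 0-15 L/min range are illustrative assumptions, not MRTK API.

MAX_FLOW_L_PER_MIN = 15

def slider_to_flow_rate(slider_value: float) -> int:
    """Map a normalized slider position to the nearest whole L/min step."""
    clamped = min(max(slider_value, 0.0), 1.0)  # guard against overshoot
    return round(clamped * MAX_FLOW_L_PER_MIN)

def flow_rate_to_slider(flow_rate: int) -> float:
    """Inverse mapping, e.g. to restore a saved prescription level."""
    return flow_rate / MAX_FLOW_L_PER_MIN
```

Snapping to whole liters matters here because prescriptions are given in integer L/min, so the slider should never leave the ball between two marks.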

Iterations

Designs + Solutions

Insight #1: Arm Fatigue

Solution to insight #1: Our solution was to remove the buttons and make the experience a slider instead. The slider allows the user to increase the oxygen from 0 to 14 in less than 5 seconds. This decreases the user’s arm fatigue, removes repetitive gestures, and lets the user complete the task very quickly.

Before

After

Insight #2: Unrealistic User Experience

Solution to insight #2: To make the experience more realistic and closer to a real scenario, we needed a real oxygen flow meter displayed. Our current scenario did not show any kind of oxygen tank at all, and only displayed the “prescription menu” seen below. To make the experience more realistic, we displayed a holographic image of an oxygen flow meter next to the interactive slider design. See image below.

Before

After

What I Learned

I was stretched and challenged in so many ways during this project.

  • It was really interesting to keep in mind haptics and gestures while designing for the 3D space. I had to think about how long a user would be holding their hand up, or if certain gestures should be buttons vs sliders. User fatigue and posture were new things I had to think about incorporating into my designs.

  • I also learned a lot about business needs and goals, and how they can often trump the quality of the final product.

  • I learned to focus on the flow and usability of the product rather than the UI, because I was unable to control any of the UI in this project.

Next Steps

Moving forward there are a few things I would want to do with this project.

  • My #1 frustration was probably not being able to form a direct relationship with the client. This was difficult because I was unable to hear their pain points directly or do user testing to tailor the product to my users. In the future I would love to do user interviews and user testing directly with the medical students.

  • Because I was unable to speak with the client directly, I was unable to do user testing and measure the success of the app. Next time I would want to user test and measure the product’s success.
