

AR Health Project

AR tutorials and overviews on how to use lancing devices and glucose meters.

My Role

XR Product Design and UX 

Format

XR (Adobe Aero)

Year

2023

AR Health Glucose Meter
Main Objectives 

Currently, there is a lack of clear and informative instructional tutorials on using home medical devices. Existing solutions are geared toward commercial use, offering virtual consultation in the health and manufacturing industries. This website provides an overview of medical lancing devices and links to an AR medical device tutorial.

My Role 

I was responsible for UX and product design of the AR medical device as well as the AR tutorial experience. 

I also modelled the medical device parts and conducted user research and testing to design the AR user interface.

Features of the AR experience:

  • Video tutorials for each part of the medical lancing device and meter

  • Short descriptions of the purpose of each part of the lancing device and meter

  • An annotated, interactive model with tooltips

  • Step-by-step tutorials on how to use the lancing device and meter


User Research

Research was conducted into the existing VR/AR education market and the ways VR/AR is used to enhance the learning experience for new users. Users were also surveyed about their familiarity with lancing devices and the clarity of the instructions provided with the products they used.

From this research it was determined that there was a lack of clear and concise home medical instruction for lancing devices. Users described difficulties with taking a blood sample, reducing the pain of a blood sugar check, and disposing of or maintaining their device. Looking at the market, most competitors focused on one-on-one video consultation on how to use home medical devices.

Competitive Analysis 

Direct and indirect competitors were analyzed for their business models, use cases and features.

These were the main research findings: 

  • The main focus of competitors is to provide live AR consultation through video chat 

  • Competitors focus mainly on the manufacturing and health industries, with an emphasis on onboarding

  • Users like information overlays, concise instructions and status notifications

Affinity Diagram

Interview data findings were organized into an Affinity Diagram as well as an Empathy Map. These diagrams helped to determine common user problems and narrow down the target audience for the application. 

A persona, a user insight statement, and a problem statement were created to reflect user issues and goals.

User Persona

Thomas has been asked to manage his health to avoid the risk of type 2 diabetes. He has a busy schedule and wants to find the simplest and least painful method of measuring his glucose levels.

User Insight

Thomas Brown, a user who has diabetes, is looking for a simple method to measure his glucose levels. He has a busy schedule and does not have the time to parse extensive medical instructions and terminology when using devices.

Problem Statement

The learning app is focused on teaching novices how to use a device to measure blood-sugar levels. Device manufacturers generally provide dense paper or online instructions consisting of text or static images. Terminology for medical concepts or device parts is either missing or poorly explained.

How can we provide patients with accessible, contextual instructions so they can intuitively take measurements to manage their blood-sugar levels?

User Journey Map

A user journey map was then created to describe Thomas' scenario, convey his internal thoughts, and show how features of the medical application can aid his routine blood sugar check-up.

Model Assets for AR

Prototype Iterations

Prototype Objectives

The AR/VR experience required transferring the existing 2D UI (menus and customization options) for use in 3D space. Other assets, such as 3D models, were examined for file compatibility within an AR environment.

Since the medical devices are projected onto a physical environment, sizing, object placement, and the visual clarity of components were considered. Because users navigate a smaller, fixed space, context-sensitive menus that could be toggled were also needed. Finally, an interaction method needed to be determined for selecting user interface elements.

UX Design

At the start of the design process, a flowchart was used as the basis for the prototype. The flowchart assisted in determining the interaction order and what assets needed to be created.

The Figma prototype was created to determine what actions users would take to view tutorials, as well as the general form factor of components. The prototype uses toggleable buttons to show or hide labels for the lancing device or the meter. When clicked, the labels show a basic description of the purpose of each device part.

Additionally, a button section was added to show a video tutorial with step-by-step instructions on using the lancing device and meter. Steps are grouped into categories such as how to take a blood sample, how to dispose of the lancet, and how to get a glucose reading with the meter.

Creating 3D Objects

The Blender 3.5 release includes the MaterialX Python package, which makes it straightforward to write Python scripts for material extraction. This is incorporated into a stand-alone script as well as a Blender add-on that automates the extraction of geometry (meshes), animation, and materials.
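As a rough illustration of this kind of material extraction, here is a minimal sketch, assuming Blender 3.5+ with the bundled MaterialX Python bindings. The helper name extract_material and the mapping shown (only the Principled BSDF base color) are simplifications for illustration, not the actual add-on code.

# Minimal sketch: write a Blender material's base color out as a MaterialX material.
import bpy
import MaterialX as mx

def extract_material(mat, out_path):  # hypothetical helper name
    # Find the Principled BSDF node and read its base color.
    bsdf = next(n for n in mat.node_tree.nodes if n.type == 'BSDF_PRINCIPLED')
    r, g, b, _ = bsdf.inputs['Base Color'].default_value

    # Build an equivalent MaterialX standard_surface material.
    doc = mx.createDocument()
    shader = doc.addNode('standard_surface', mat.name + '_shader', 'surfaceshader')
    shader.setInputValue('base_color', mx.Color3(r, g, b))
    doc.addMaterialNode(mat.name + '_material', shader)
    mx.writeToXmlFile(doc, out_path)

for mat in bpy.data.materials:
    if mat.use_nodes:
        extract_material(mat, mat.name + '.mtlx')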

All meshes are separated into individual files and imported as model and user-interface parts. This separation is required because Aero does not preserve the scene hierarchy or the separation of parts on import, and it allows each part to be placed individually and to have its own actions and triggers.
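The per-part export can itself be scripted. Below is a minimal sketch, assuming each device and UI part is a separate mesh object in the Blender scene; the helper name export_parts and the output folder are placeholders.

# Minimal sketch: export each mesh object to its own glTF file so parts
# can be imported into Adobe Aero individually.
import bpy
import os

def export_parts(out_dir):  # hypothetical helper name
    os.makedirs(out_dir, exist_ok=True)
    for obj in bpy.context.scene.objects:
        if obj.type != 'MESH':
            continue
        # Select only this object so the exporter writes it alone.
        bpy.ops.object.select_all(action='DESELECT')
        obj.select_set(True)
        bpy.context.view_layer.objects.active = obj
        bpy.ops.export_scene.gltf(
            filepath=os.path.join(out_dir, obj.name + '.glb'),
            use_selection=True,
        )

export_parts(bpy.path.abspath('//aero_parts'))  # folder next to the .blend file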

Converting 2D Images to 3D Models

Creating the 3D User Interface

Icons, logos, and labels are either exported from Figma as SVG files or sourced from public domain websites and asset libraries. Though it is possible to import unedited SVG files directly into Aero, previous experience has shown that the content is very hard to view and select because of the flat geometry generated.

A script that takes an SVG file and produces 3D meshes along with MaterialX materials is used for batch conversion of SVG files.
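A minimal sketch of how such a batch conversion might be scripted in Blender is shown below (geometry only; material generation is omitted). The input and output folders are placeholders and the extrusion depth is an assumed value.

# Minimal sketch: batch-convert SVG files into extruded 3D meshes in Blender.
import bpy
import glob
import os

EXTRUDE_DEPTH = 0.01  # assumed thickness so icons are not flat in AR

for svg_path in glob.glob('/path/to/icons/*.svg'):  # placeholder input folder
    before = set(bpy.data.objects)
    bpy.ops.import_curve.svg(filepath=svg_path)
    new_curves = [o for o in set(bpy.data.objects) - before if o.type == 'CURVE']
    if not new_curves:
        continue

    # Give the flat shapes some depth, then convert the curves to meshes.
    bpy.ops.object.select_all(action='DESELECT')
    for obj in new_curves:
        obj.data.extrude = EXTRUDE_DEPTH
        obj.select_set(True)
    bpy.context.view_layer.objects.active = new_curves[0]
    bpy.ops.object.convert(target='MESH')

    # Export the converted icon as its own glTF file.
    name = os.path.splitext(os.path.basename(svg_path))[0]
    bpy.ops.export_scene.gltf(filepath='/path/to/out/' + name + '.glb',
                              use_selection=True)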

A Python script is used to read in images and create 3D geometry through the following steps (a minimal sketch follows the list):

  1. Import an image.

  2. Use its dimensions to create a 3D rectangular shape with a user-defined height. The default width of the geometry is 1 but can be changed.

  3. Create a material and use the image as the base color input. The material may be unlit or shaded.

  4. Save the results out to a glTF geometry file and a MaterialX material file per image.
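Below is a minimal sketch of this process, assuming the Pillow, trimesh, and MaterialX Python packages are available. The helper name image_to_3d, the file paths, and the default thickness are illustrative assumptions rather than the production script.

# Minimal sketch: turn a 2D image into a thin 3D rectangle plus a MaterialX material.
import os
from PIL import Image   # read image dimensions
import trimesh          # build and export simple geometry
import MaterialX as mx  # write the material definition

def image_to_3d(image_path, height=0.02, width=1.0):  # hypothetical helper name
    name = os.path.splitext(os.path.basename(image_path))[0]

    # 1. Import the image and read its pixel dimensions.
    px_w, px_h = Image.open(image_path).size

    # 2. Create a rectangular box: width defaults to 1, depth follows the image
    #    aspect ratio, and the user-defined height gives the shape its thickness.
    depth = width * (px_h / px_w)
    box = trimesh.creation.box(extents=(width, depth, height))

    # 3. Create a MaterialX material that uses the image as the base color input.
    doc = mx.createDocument()
    img = doc.addNode('image', name + '_image', 'color3')
    img.setInputValue('file', image_path, 'filename')
    shader = doc.addNode('standard_surface', name + '_shader', 'surfaceshader')
    shader.addInput('base_color', 'color3').setConnectedNode(img)
    doc.addMaterialNode(name + '_material', shader)

    # 4. Save one geometry file and one material file per image.
    box.export(name + '.glb')
    mx.writeToXmlFile(doc, name + '.mtlx')

image_to_3d('label_lancet_cap.png')  # placeholder input image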

Audio Description

Voiced audio description was added to provide additional context and instruction when users access the device tutorial or want to learn about a specific part’s purpose for the lancet/meter devices.

ChatGPT was used to generate generic instruction text for the meter and lancet parts. The generated text was then modified to incorporate more specific technical terms and instructions for device usage.

To create audio clips, the instructions were fed into the text-to-speech program eSpeak to create a separate audio clip for each step. The audio was then exported in WAV format and compressed to ensure that the files could be played in Adobe Aero.
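For illustration, here is a minimal sketch of generating per-step clips, assuming the espeak command-line tool is installed; the step texts are placeholders rather than the actual instructions.

# Minimal sketch: generate one WAV clip per instruction step with eSpeak.
import subprocess

steps = [  # placeholder instruction text
    'Twist off the cap of the lancing device.',
    'Insert a new lancet and remove its protective cover.',
]

for i, text in enumerate(steps, start=1):
    # -w writes the synthesized speech to a WAV file; -s sets the speaking rate.
    subprocess.run(['espeak', '-s', '150', '-w', 'step_%02d.wav' % i, text],
                   check=True)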

Other methods of audio generation were explored, such as the Mozilla TTS library and Google's Speech Synthesis Markup Language for coding audio instructions. The Mozilla TTS library allows you to train a model to generate custom, more natural-sounding speech; it will be examined in the future for generating longer-form audio content. The markup language will also be explored further because of its customization options for the tone, voices, and language of the audio.

Using SSML (Speech Synthesis Markup Language), you can control attributes of the speech, such as the duration and strength of pauses.
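As an illustration only, here is a short sketch of wrapping instruction text in SSML to slow the speech and insert a pause between steps; the text shown is a placeholder.

# Minimal sketch: build an SSML string that slows speech and adds a pause.
step_text = 'Place a test strip into the meter.'  # placeholder text

ssml = (
    '<speak>'
    f'<prosody rate="slow">{step_text}</prosody>'
    '<break time="750ms" strength="strong"/>'  # pause before the next step
    'Then gently touch the strip to the blood sample.'
    '</speak>'
)
print(ssml)  # this markup would be passed to an SSML-aware TTS engine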

Instruction Card in AR For Using the Lancing Device

Demo Video

Clickable labels/cards in AR provide short audio descriptions of the purpose of each device part.
