Timeline: Oct 2017 - Present
Project Type: Internship
My Role: Technical Product Manager, UX Designer
Tools: Unity, C#
Skills: Mixed Reality Design, Object Tracking
Pyrus (formerly named PearMed) is a healthcare startup that uses VR, AR, MR, and other cutting-edge technology to build tools that enhance the healthcare system. They are based in CoMotion Labs, a co-working space at UW. I'm working alongside their CTO, Ryan James, as well as their dev team, Paul Schneider and Travis Bailey. We recently finished a big project this past month (December 2017) involving visualizing burns.
For this project, we partnered with Harborview Medical Center to build a HoloLens application that lets burn victims visualize how their burn will heal over time. Because this was considered a research project for Harborview, our job was to create a proof of concept that this technology could deliver a compelling user experience for burn victims. We had just under a month to deliver this MVP.
The healthcare industry is always looking for better ways to communicate with patients and educate them about their medical conditions. This is especially relevant when explaining to burn victims what their healing process will look like. Currently, doctors show them photos of other patients' burns at various time periods (1 month, 9 months, 1 year, etc.) and point out specific aspects of their burn. This approach is too generalized and fails to capture the nuances of how a particular burn changes.
For example, if the burn is located on a joint area, there will be stretch marks on the skin. But the severity of the stretching can be reduced if the patient regularly exercises that part of the body. Many other factors affect the healing process as well, such as skin bulging, variance in the degree of the burn, and skin pigmentation.
Using Mixed Reality, we wanted to make this experience personalized and compelling for the patient.
Below are images of the types of burns doctors would show their patients. These images can be hard to look at.
Patients would put on a HoloLens and look at their arm. The HoloLens would use image recognition to create a 3D mesh of their burn areas and overlay that mesh on the burn locations, changing its pigmentation, thickness, and shape as the patient stepped through different healing phases. This way, patients would be able to see how their specific burn would change over time.
Most importantly, because we would be using image recognition, we would be able to identify specific facets of the burn, such as varying degrees of severity across different areas (for example, a burn that is third-degree at the center but second-degree on the outer edges would heal differently). We could also identify where stretch marks would start to show based on the location of the burn (for example, on top of or next to a joint, where the skin is often stretched and retracted). We could even show what the burn would look like with versus without proper daily stretching. There is also a collaborative component: loved ones could put on another HoloLens and look at the burn simultaneously. This personalized experience would be massively compelling to burn victims and bring a new degree of understanding and education to those patients.
However, this is the long-term concept for the product. We were starting from scratch, and our immediate goal was to create a proof of concept.
To create the 3D meshes of these burns, Travis took the images of existing burns shown above and wrapped them around a 3D model of an arm. From there, he wrote a script to sharpen the edges of the burn. Travis also manually adjusted the mesh of the model at the 9-month phase to capture the swelling that happens in the burn areas. He then wrote a script to fade between the phases, showing the progression of the burn from one phase to the next. We also had to account for the brightness, opacity, and lighting of the model to help it look more realistic. The tools used were Blender and GIMP.
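The phase-fading behavior could be sketched roughly like the following Unity C# component. This is a hypothetical reconstruction, not the actual script: it assumes a custom shader with a second texture slot (`_NextTex`) and a `_Blend` float that crossfades between the current and next healing phase.

```
using UnityEngine;

// Hypothetical sketch of fading between healing-phase textures.
// Assumes a shader exposing _MainTex (current phase), _NextTex (next phase),
// and a _Blend float in [0, 1] that crossfades between the two.
public class BurnPhaseFader : MonoBehaviour
{
    public Texture[] phaseTextures;    // e.g. initial, 1 month, 9 months, 1 year
    public float secondsPerPhase = 3f; // how long each crossfade takes

    private Renderer rend;
    private float elapsed;

    void Start()
    {
        rend = GetComponent<Renderer>();
    }

    void Update()
    {
        elapsed += Time.deltaTime;
        float t = elapsed / secondsPerPhase;

        // Which pair of phases are we between, and how far along?
        int phase = Mathf.Min((int)t, phaseTextures.Length - 2);
        float blend = Mathf.Clamp01(t - phase);

        rend.material.SetTexture("_MainTex", phaseTextures[phase]);
        rend.material.SetTexture("_NextTex", phaseTextures[phase + 1]);
        rend.material.SetFloat("_Blend", blend);
    }
}
```

The mesh swelling at the 9-month phase could be blended the same way (e.g. with blend shapes), but texture crossfading alone captures the pigmentation changes described above.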
Arm tracking was a section I was highly involved in since it had a strong influence on the user experience. We knew the optimal approach would be image recognition, so patients could simply look at their arm and it would just work. But because we had only a month to finish an MVP, we had to trade a perfect user experience for a quicker deliverable. Given the scope of the project, this was acceptable.
We explored Microsoft's Mixed Reality Toolkit Unity library, but the most promising tool we found was HoloLensARToolKit, a QR code tracking library for the HoloLens. The toolkit uses this QR code cube to track the location and rotation of the source:
We considered several user experiences to make the QR code as frictionless as possible. These were the approaches we explored before arriving at the QR cube:
First, we thought about having the user hold a piece of paper with the QR code on it, but we quickly realized that forcing users to hold something wasn't always possible due to limited muscle movement. We would also need two QR codes to track both the wrist and the elbow.
The next idea was to put two QR codes on a flat surface, one for the wrist and the other for the elbow, and have the user line their arm up to them. That way, the patient could keep their arm relaxed. The issue with this approach was that patients couldn't rotate their arm and look at their burn from multiple angles; the hologram no longer followed their arm.
The QR code wristband consisted of four to five QR codes evenly spaced along an adjustable band. This was a step in the right direction: it didn't force the patient to use their muscles while still allowing the hologram to follow their arm. (Logistically, patients would wrap their arm before putting on the wristband to prevent damage to the burned skin.) Unfortunately, we ran into complications making the band adjustable for different wrist sizes. Shrinking the wristband would start covering up QR codes, which introduced a lot of unnecessary complication. We would also need two bands, one for the wrist and one for the elbow area.
This concept kept the wristband but used the QR cube instead. With this approach, we only needed to track one part of the arm, since we could use the rotation of the cube to rotate the burn model. This was also when we thought of using velcro for the wristband material, which was much more flexible and easier to use than a "watch-like" band.
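In Unity terms, the idea is simple: once the tracking library reports the cube's pose, the burn model just follows it with a fixed offset. A minimal sketch, assuming the toolkit updates a `Transform` for the tracked cube (the field names and offsets here are placeholders, not the toolkit's actual API):

```
using UnityEngine;

// Hypothetical sketch: keep the burn hologram anchored to the tracked QR cube.
// markerTransform is whatever Transform the tracking library updates each frame;
// the offsets describe where the forearm sits relative to the wristband cube,
// and would be tuned by hand during calibration.
public class BurnModelAnchor : MonoBehaviour
{
    public Transform markerTransform;               // tracked QR cube
    public Vector3 positionOffset;                  // cube-to-forearm offset
    public Quaternion rotationOffset = Quaternion.identity;

    void LateUpdate()
    {
        if (markerTransform == null) return;

        // One tracked point plus its rotation is enough to place and
        // orient the whole arm model as the patient moves their arm.
        transform.position = markerTransform.TransformPoint(positionOffset);
        transform.rotation = markerTransform.rotation * rotationOffset;
    }
}
```

This is why the cube beat the multi-code wristband: a single rigid marker gives both position and rotation, so no second tracking point is needed.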
This was the final result:
After we had the burn model made and the arm tracking working, we put it all together and demoed it at Harborview Medical Center in front of physicians and residents in the burn department.
This internship has been an incredible experience for me. I've been able to work with the HoloLens and develop a mixed reality application, both for the first time. I'm learning more about the capabilities and limitations of the technology, and I've pushed myself to think about how to apply MR to important problem spaces such as healthcare.