How a University Researcher Is Looking to Revolutionize Surgery with Xbox

Adil Khan
4 min read · Dec 26, 2020
Credit: XBOX

Dr. Helena Mentis of the Human-Centered Computing department at the University of Maryland, Baltimore County has led a revolution in utilizing “gaming” technology for surgical applications. More specifically, she and her team were able to harness Xbox Kinect gaming technology for use by surgeons in visualizing medical images during surgery. This technology was used by vascular surgeons at St. Thomas’ Hospital in London (Brimelow, 2012). The interdisciplinary application of this technology is remarkable. A surgeon at the hospital, Dr. Tom Carrell, was able to gesture to an Xbox Kinect sensor in front of him to manipulate “a 3D image of the patient’s damaged aorta” (Brimelow, 2012).

This technology solves two usability issues. First, it saves the surgeon from having to direct someone else to manipulate the images, a handoff that invites mistakes which could have disastrous consequences during surgery. Second, the surgeon is able to manipulate the images himself or herself without contaminating his or her hands.

Credit: Kenton O’Hara

Before this user-centered solution came about, Dr. Carrell would be “shouting out across the operating theatre to tell someone to go up, down, left right” (Brimelow, 2012). The whole process of image manipulation is easier with the Kinect technology, and, again, the touchless aspect is ideal for a surgeon who has to keep his or her hands sterile. Dr. Carrell says as much: “…with the Kinect I’m able to get the position that I want quickly — and also without me having to handle non-sterile things like a keyboard or mouse during the procedure” (Brimelow, 2012). The overall result is a technology which is “easy to use” (Brimelow, 2012). The user is able to manipulate images in more than one way in order to reach the exact view he or she desires. Specifically, he or she can pan, zoom in and out, rotate, and lock the image being viewed (Brimelow, 2012). The user is also able to place markers to ensure precision (Brimelow, 2012).
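To make the interaction concrete, here is a minimal, hypothetical sketch of how recognized gestures might map to the view operations described above (pan, zoom, rotate, lock). The class and gesture names are illustrative only — they are not taken from the actual system used at St. Thomas’ Hospital:

```python
# Hypothetical sketch: dispatching recognized touchless gestures to
# the image-view operations described in the article. Not the real system.

class ImageView:
    """Tracks the state of a 3D medical image view."""

    def __init__(self):
        self.pan = [0.0, 0.0]   # x/y offset
        self.zoom = 1.0         # scale factor
        self.rotation = 0.0     # degrees
        self.locked = False     # a locked view ignores further gestures

    def apply_gesture(self, gesture, *args):
        """Dispatch a recognized gesture to the matching view operation."""
        if gesture == "lock":
            self.locked = not self.locked
            return
        if self.locked:
            return  # ignore stray movements while the view is locked
        if gesture == "pan":
            dx, dy = args
            self.pan[0] += dx
            self.pan[1] += dy
        elif gesture == "zoom":
            (factor,) = args
            self.zoom *= factor
        elif gesture == "rotate":
            (degrees,) = args
            self.rotation = (self.rotation + degrees) % 360


view = ImageView()
view.apply_gesture("pan", 5.0, -2.0)
view.apply_gesture("zoom", 2.0)
view.apply_gesture("rotate", 90.0)
view.apply_gesture("lock")        # surgeon locks the view in place
view.apply_gesture("zoom", 10.0)  # ignored while locked
print(view.pan, view.zoom, view.rotation, view.locked)
```

The “lock” gesture here models the freeze behavior mentioned in the article: once the surgeon has the view he or she wants, further hand movements should not disturb it.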

Credit: Douglas Gantenbein, Microsoft Corporation

There is, however, apparently a learnability hurdle to overcome before a user becomes accustomed to the system’s affordances (Brimelow, 2012). About this Dr. Carrell says, “The sensitivity is the main thing, but it’s very simple gestures, like on a smart-phone. Once you know the gestures it’s very intuitive” (Brimelow, 2012).

Credit: Alessio Pierluigi Placitelli

This technology is a classic example of a user-centered design solution to a real problem. A serious issue in the surgical community was addressed by identifying the two above-described needs, namely reducing error in image manipulation and allowing the surgeon to take full control without compromising the sterility of his or her hands while operating.

A potential improvement that might be made to this technology relates to its development (Ibañez et al., 2014). Specifically, it has been noted by the Campo group that the use of a tool known as EasyGR may aid in quicker development of custom gestures for the Kinect (Ibañez et al., 2014). EasyGR uses machine learning algorithms to streamline the process of gesture recognition for developers (Ibañez et al., 2014). An illustration of how this or a similar tool might be used is shown in the storyboard below, which has been adapted from images from the Campo group’s research paper entitled “Easy gesture recognition for Kinect”:
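For a sense of the kind of work a tool like EasyGR automates, here is a toy sketch of gesture classification: comparing a recorded sequence of hand positions against labeled training examples with a nearest-neighbor rule. This is illustrative only — it does not reproduce the actual EasyGR pipeline or its APIs, and all names and data below are made up:

```python
# Toy sketch of gesture recognition: nearest-neighbor matching of a
# recorded hand-position sequence against labeled training gestures.
# Illustrative only; not the EasyGR implementation.
import math

def distance(seq_a, seq_b):
    """Sum of pointwise Euclidean distances between two equal-length
    sequences of (x, y) hand positions."""
    return sum(math.dist(a, b) for a, b in zip(seq_a, seq_b))

def classify(sample, training_set):
    """Return the label of the training gesture closest to the sample."""
    return min(training_set, key=lambda item: distance(sample, item[1]))[0]

# Hypothetical training data: a horizontal swipe vs. a vertical swipe.
training = [
    ("swipe_right", [(0, 0), (1, 0), (2, 0), (3, 0)]),
    ("swipe_up",    [(0, 0), (0, 1), (0, 2), (0, 3)]),
]

# A noisy observation of a rightward swipe.
observed = [(0, 0), (1.1, 0.1), (2.0, 0.0), (2.9, -0.1)]
print(classify(observed, training))  # → swipe_right
```

A developer-facing tool shifts this burden from hand-coding distance thresholds to simply recording labeled examples, which is the time savings the Campo group describes.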

I personally found this topic interesting and important because it touches a research area I am specifically interested in: the interdisciplinary relationship between user-experience research and design, computer science, and medicine. I developed this interest because I have a background in biotechnology and medicine as well as human-centered computing, from both professional and academic perspectives.

This is definitely an area of interest for me outside of this class. In fact, I was fortunate to join the lab of Dr. Mentis here at UMBC as a research assistant about three months ago, and I have even assisted in user research. I am generally drawn to applying human-centered computing and user-centered design to medicine and biotechnology. Apart from telecommunication and telemedicine solutions, I am also greatly interested in seeing how the concepts of human-centered design can be applied to the field of bioengineering innovation and design to create an optimal user experience that may result in saving lives.

Works Cited:

Brimelow, Adam. “Trial of ‘Touchless’ Gaming Technology in Surgery.” BBC News, BBC, 31 May 2012, www.bbc.com/news/health-18238356.

Ibañez, Rodrigo, et al. “Easy Gesture Recognition for Kinect.” Advances in Engineering Software, Elsevier, 26 July 2014, www.sciencedirect.com/science/article/abs/pii/S0965997814001161.

