News Release

Virtual reality tool quantifies physics of a doctor's touch

Peer-Reviewed Publication

University at Buffalo

A doctor's hands are two of the most important diagnostic tools he or she has, allowing the physician to detect subtle signs of disease or injury just by touching a patient.

Exercising that expertise has always required the presence of two individuals in the same physical space at the same time: doctor and patient. Until now.

University at Buffalo researchers are developing a system that will allow physicians to use a new form of virtual reality, called physically based VR, to store information about what they feel during an exam. That information then will be accessible to the examining physician at a later time or to consulting physicians at another location, allowing them to experience the exam as though they had performed it themselves.

They will report on the progress of their work July 23-24 at the World Congress on Medical Physics and Biomedical Engineering in Chicago.

With this "Virtual Human Model for Medical Applications," physicians will wear a customized virtual-reality glove during the patient examination that collects data on what the physician is feeling through sensors located in the glove's fingertips.

Thenkurussi Kesavadas, Ph.D., UB assistant professor of mechanical and aerospace engineering, director of the university's Virtual Reality Lab and the project's principal investigator, explained that at this time, there is no way that a physician at a second site can share that experience without personally examining the patient. In very serious cases -- such as when a patient has been diagnosed at a small, rural hospital -- the patient may have to be airlifted to a more comprehensive medical facility where he or she can be examined in person.

The VR system under development at UB could make some of those costly, not to mention traumatic, airlifts unnecessary.

"Using our customized data-collection glove and the detailed understanding we are developing about the physics behind a doctor's touch during an exam, we expect within two to three years to have a device in use that will allow a physician to use medical palpation virtually and in real-time," said Kesavadas.

"The system will enhance the current method of clinical palpation by transforming it from a qualitative to a quantitative examination," said James Mayrose, research assistant professor in the UB Department of Emergency Medicine, doctoral candidate in the UB Department of Mechanical and Aerospace Engineering, a senior designer of the glove and co-investigator on the project.

Mayrose and medical professionals in the UB departments of Emergency Medicine and Radiology are conducting studies of the glove with human subjects at the Erie County Medical Center.

The UB work represents a departure from the usual route taken by researchers studying VR for use in medical applications, Kesavadas noted.

"Just about everyone who is looking at virtual medicine right now is interested in surgical applications," he said.

But those applications are many years away.

Kesavadas sees no reason to wait to reap the benefits of VR for diagnostics.

"This system could revolutionize imaging in medicine," said Anthony Billittier, M.D., medical director for the Office of Prehospital Care at the Erie County Medical Center, and co-director of the Calspan-UB Research Center's Center for Transportation Injury Research (CenTIR), which is funding the work.

Billittier is particularly excited about the UB researchers' creation of a database of information that accurately describes the biomechanical properties of soft tissue under various conditions.

"Right now, if a patient has been in a car crash and has abdominal pain, for example, we can use ultrasound in the trauma room to tell us if there is fluid in the abdomen," explained Billittier, "but we can't really tell why -- is it a shattered spleen or a lacerated liver? A database with information in it that could tell us that, just based on the consistency of what the physician is feeling, could allow the surgeons to go right into the operating room without having to obtain a CAT scan. It could save time and, with many injuries, that's absolutely critical."

The UB research group is modeling the soft tissue and organs of the human abdomen on the computer, using atomic-unit-type modeling that breaks human tissues into pieces measuring no more than 8 mm.
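To make that decomposition concrete, here is a minimal sketch in Python of dividing a tissue region into elements no larger than 8 mm on a side. The 8 mm limit comes from the release; the cubic-grid scheme and the example dimensions are illustrative assumptions, not the UB group's actual code.

```python
# Sketch: split a tissue region's bounding box into grid elements
# no larger than 8 mm per side (limit from the release; the grid
# scheme and example dimensions are illustrative assumptions).
import math

def decompose(extent_mm, max_element_mm=8.0):
    """Return per-axis element counts and actual element sizes (mm)
    for a box of dimensions extent_mm = (x, y, z)."""
    counts = [math.ceil(d / max_element_mm) for d in extent_mm]
    sizes = [d / n for d, n in zip(extent_mm, counts)]
    return counts, sizes

# Example: a roughly liver-sized region of 210 x 160 x 120 mm.
counts, sizes = decompose((210.0, 160.0, 120.0))
```

Each axis is divided just finely enough that no element exceeds the size limit, which keeps the element count (and the cost of the force-response equations fitted to each piece) as small as possible.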

The system takes as its raw material the Visible Human Data Set developed by the National Institutes of Health that features complete, digitized data sets of the human body.

Using a very powerful graphics computer, the researchers "supersample" smaller and smaller sections of the data set for a given body part or organ, enabling them to get more and more detailed pictures of each one and develop increasingly complex equations about how each tiny section will respond to applied forces. They then create layers of these sections, gradually building the collection of samples into the complete organ.
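The supersampling step can be pictured as repeated subdivision of a section of the data set, with each pass yielding smaller pieces that get their own, more detailed force-response equations. The quadrant-splitting scheme below is an illustrative assumption, not the researchers' actual method.

```python
# Illustrative sketch of "supersampling": each refinement pass splits
# a region of the data set into quadrants, giving progressively finer
# sections to model. The quadtree scheme is an assumption, not the
# UB group's actual algorithm.

def refine(region, levels):
    """region = (x0, y0, x1, y1); return the sub-regions left
    after `levels` rounds of quadrant subdivision."""
    regions = [region]
    for _ in range(levels):
        next_regions = []
        for (x0, y0, x1, y1) in regions:
            mx, my = (x0 + x1) / 2, (y0 + y1) / 2
            next_regions += [(x0, y0, mx, my), (mx, y0, x1, my),
                             (x0, my, mx, y1), (mx, my, x1, y1)]
        regions = next_regions
    return regions

sections = refine((0.0, 0.0, 512.0, 512.0), levels=3)  # 4**3 = 64 sections
```

Layering such refined sections back together, as the release describes, rebuilds the complete organ from its samples.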

"Our big contribution is that we are writing algorithms to model how soft tissue deforms as a real mass, rather than just as a surface, which is what many groups are currently doing. No one else is doing this in real time," said Kevin Chugh, a UB doctoral student in computer science who is a co-author on the research.

"We will be able to touch the model with a haptic thimble -- the physically based VR counterpart of a computer mouse -- on the screen, apply the 'force' using a 'haptics' feedback system and show how it deforms and then bounces back when the force is withdrawn."

The work is based on a solid understanding of the physics behind what happens when pressure is applied to different parts of the human body.

"While the physician is doing a palpation on a patient, the computer -- through the VR glove -- is picking up all the information about what anatomic-force characteristics the doctor's finger is feeling," said Kesavadas.

He noted that only a handful of groups in the U.S. are doing atomic-unit modeling for an interactive VR environment.

The system will have emergency-services, military and battlefield applications.

UB's Virtual Human Model also could enhance the training of new physicians in one of the most formidable medical procedures they have to learn -- intubation.

"Intubation is a very touchy procedure," Billittier said. "It involves putting a plastic tube into a patient's airway or windpipe when that patient may not be breathing or may be in shock."

He pointed out that medical students and emergency-medical technicians now learn the mechanics of the procedure first on a mannequin -- which is not very lifelike -- and then switch to real patients, with widely varying degrees of success, under the close supervision of an experienced physician.

A computerized VR simulator for intubation based on information being gathered for the Virtual Human Model database would be a real boon, said Billittier, who is working with Kesavadas' group to develop one. Such a system could provide visual feedback of real airways while the student is maneuvering the tube into the simulated patient on the screen; the simulation would provide the student with the proper tactile or pressure feedback, felt through the haptic interface.

Kesavadas estimates the research team could develop such a system within a couple of years.

###

Story, image and video available at http://www.buffalo.edu/news


