Smartphone Science

A professor harnesses smartphone technology to create a Virtual Reality biofeedback tool for lung cancer patients.
Lung cancer and other lung afflictions share a common symptom: reduced lung capacity. The associated sensation of breathlessness can have a significant impact on quality of life and on a patient's outlook toward treatment and healing. Following surgery for lung cancer, learning to live with reduced lung function can be bewildering and challenging. Currently, patients use a spirometer to monitor their level of respiratory functioning. The advantage is that the device is low-tech and relatively inexpensive; the disadvantage is that it is bulky and provides a measurement that must be interpreted. Miad Faezipour, Ph.D., Assistant Professor of Computer Science and Engineering and Biomedical Engineering, is researching a novel alternative: using smartphone technology to create a portable, user-friendly Virtual Reality biofeedback tool for patients with lung cancer and other breathing disorders.

Virtual Reality has been the stuff of movies and video games for decades, but it has entered clinical therapy only more recently. The use of Virtual Reality imagery to help cancer patients visualize their immune system destroying a malignancy was documented in the mid-1990s. Biofeedback, the process of translating physiological measurements into meaningful data that patients can use for self-analysis and self-regulation, is a therapy that has been used to manage migraines, chronic pain, and high blood pressure.


Faezipour aims to produce a Virtual Reality biofeedback application that patients could use on their smartphones. A patient would connect a hands-free device to a smartphone and breathe into the microphone a few times; a 3-D virtual simulation of the lungs expanding and contracting would then appear on the screen, along with simple coaching to better regulate breathing at that moment and increase the percentage of oxygen in the blood.

This application of Virtual Reality technology is a series of complex systems that must function with precision and interact rapidly to be effective for biofeedback. Each phase of the breathing cycle produces a distinctive acoustic signal, which is detected, recorded, analyzed, and translated into a virtual image of the patient's lungs on the smartphone for viewing and breath adjustment. The biofeedback is expected to alert the patient when breathing is abnormal before the change would otherwise be noticeable, allowing him or her to adjust subsequent breaths and increase blood oxygenation.
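As a rough illustration of that chain, the sketch below frames microphone audio, computes a short-time energy measure, and assigns a coarse phase label that a coaching display could react to. The frame size, threshold, and function names are illustrative assumptions, not details of Faezipour's implementation.

import numpy as np

FRAME_LEN = 1024  # samples per analysis frame; an 8 kHz mono input is assumed

def frame_energy(frame):
    # Short-time energy of one audio frame.
    return float(np.mean(np.asarray(frame, dtype=float) ** 2))

def classify_phase(energy, rising, threshold=1e-4):
    # Coarse phase label: a pause between breaths, or inhalation/exhalation
    # guessed from whether the energy envelope is rising or falling.
    if energy < threshold:
        return "pause"
    return "inhalation" if rising else "exhalation"

def biofeedback_step(prev_energy, frame):
    # Process one frame and return (phase label, energy) for the display.
    energy = frame_energy(frame)
    phase = classify_phase(energy, rising=energy > prev_energy)
    return phase, energy

# Feed a few synthetic frames through the loop in place of live microphone data.
rng = np.random.default_rng(0)
prev = 0.0
for _ in range(5):
    frame = rng.normal(0.0, 0.02, FRAME_LEN)
    phase, prev = biofeedback_step(prev, frame)
    print(phase, round(prev, 6))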


Faezipour's research involves multiple stages and incorporates biology, biomedical engineering, computer science and engineering, and electrical engineering. The technology must be able to individually detect, record, analyze, and classify particular breathing movements: inhalation, exhalation, and the pauses between these phases. However, rather than relying on the precise and expensive instruments found in a controlled clinical setting, Faezipour is researching how to accomplish these tasks with a simple hand-held device. Patients must also contend with the interfering sounds of everyday settings, such as conversation, keyboard clicks, and TV broadcasts. Using algorithms that differentiate the acoustics of breathing from background noise and segment the breathing movements for classification, Faezipour hopes to produce an easy-to-use application that patients can use at home or at work.
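One simple way to approach that separation, sketched below, is to estimate a noise floor from the quietest stretches of audio and keep only the sustained, higher-energy stretches as candidate breath segments. The frame length, the 20 percent noise-floor estimate, and the threshold factor are assumptions made for illustration; they are not taken from Faezipour's algorithms.

import numpy as np

def segment_breaths(signal, fs=8000, frame_ms=50, k=3.0):
    # Return (start, end) sample indices of candidate breath segments.
    frame_len = int(fs * frame_ms / 1000)
    n_frames = len(signal) // frame_len
    frames = np.asarray(signal[:n_frames * frame_len], dtype=float).reshape(n_frames, frame_len)
    energy = np.mean(frames ** 2, axis=1)

    # Estimate the noise floor from the quietest 20 percent of frames, then
    # flag frames whose energy exceeds that floor by a factor of k.
    floor = np.mean(np.sort(energy)[: max(1, n_frames // 5)])
    active = energy > k * floor

    segments, start = [], None
    for i, on in enumerate(active):
        if on and start is None:
            start = i
        elif not on and start is not None:
            segments.append((start * frame_len, i * frame_len))
            start = None
    if start is not None:
        segments.append((start * frame_len, n_frames * frame_len))
    return segments

# Example: a quiet recording with one louder "breath" burst in the middle.
fs = 8000
rng = np.random.default_rng(1)
audio = rng.normal(0.0, 0.01, 2 * fs)
audio[fs // 2 : fs] += rng.normal(0.0, 0.1, fs // 2)
print(segment_breaths(audio, fs=fs))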

The signal analysis stage of Faezipour's research is close to completion, with an 86 percent accuracy rating. She hopes to achieve a 100 percent accuracy rating, which she deems necessary for the continued advancement and eventual release of the breath signal project. Michael Autuori, Ph.D., Professor of Biology, and Spiros Katsifis, Ph.D., Professor of Biology, have collaborated on the design, and Ph.D. student Ahmad Abushakra has served as Faezipour's research assistant.

A preliminary version of this virtual therapy framework application has been devised that monitors breathing movements and integrates a visual effect of the lungs inflating and deflating as the patient inhales and exhales.
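A sketch of how such a visual effect might be driven is shown below: a smoothed breathing envelope is normalized and mapped onto a scale factor applied to the 3-D lung model, so the rendering grows on inhalation and shrinks on exhalation. The scale range and function are illustrative assumptions rather than details of the published framework.

import numpy as np

def lung_scale(envelope, min_scale=0.85, max_scale=1.15):
    # Map a breathing envelope (arbitrary units) onto per-frame scale factors
    # for the 3-D lung model: min_scale at full exhale, max_scale at full inhale.
    env = np.asarray(envelope, dtype=float)
    span = env.max() - env.min()
    if span == 0:
        return np.full_like(env, min_scale)
    normalized = (env - env.min()) / span
    return min_scale + normalized * (max_scale - min_scale)

# Example: one slow breath cycle rendered as a sequence of model scale factors.
t = np.linspace(0.0, 2.0 * np.pi, 8)
print(np.round(lung_scale(np.sin(t) + 1.0), 3))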


The final product, a user-friendly, accurate smartphone application, will have the capacity to project a detailed animation of the patient's lung function. The remaining challenges are to complete a high-definition animation integrated with the acoustic signal transmission, and to identify possible breathing or lung-function disorders and diseases by further analyzing the acoustic signal of the breath. In this way, the actual breath analysis of a patient will be visualized on a smartphone for use in real-time biofeedback therapy.


Faezipour foresees the completion of this integrated framework within the next two years.

Currently, most research into medical applications of smartphone technology is focused on cardiac care. Faezipour envisions her research as a means to improve the lives of many individuals throughout the world. By aiding in the treatment of lung cancer, the application can improve lung functionality and contribute to a better quality of life for cancer patients and for others with breathing disorders. Overall, equipping individuals with the tools to regulate and improve their own health not only empowers them, but also motivates them to progress along the road to recovery.

Faezipour joined the engineering faculty in July 2011, secured a UB Seed Money Grant in her first six months, and quickly established the Digital/Biomedical Embedded Systems and Technology (D-BEST) Laboratory in the School of Engineering's technology building. Her research interests lie in the broad areas of biomedical signal processing and behavior analysis, high-speed packet processing architectures, and digital/embedded systems.

Computer Science and Engineering Ph.D. student Ahmad Abushakra has been working with Faezipour as a research assistant; together they have already published two journal articles on this research in the IEEE Journal of Biomedical and Health Informatics and have presented components of the work at several world-class IEEE conferences.