Three engineering students from the University of Bridgeport—Wafa Elmannai, Eugene Gerety, and Reem Mahjoub—have won all three prizes at the 25th annual Connecticut Symposium on Microelectronics & Optoelectronics (CMOC) for their research involving sensors, image processing, and other technologies.
Judges selected their research from 35 entries submitted by teams from university engineering schools throughout Connecticut, including those at UConn, Southern Connecticut State University, and Yale.
Optoelectronics, the combined use of electronics and light, is used to process, store, transmit, and display information for applications such as communications, computing, and consumer electronics. Microelectronics involves extremely small electronic devices used across a variety of industries.
UB Associate Vice President for Graduate Studies and Research Dr. Khaled Elleithy said the students’ research “has the potential to greatly improve processes used in everything from identity screening to crisis response to aiding the blind. We are very proud that their work has been recognized at CMOC, where universities and business come together each year to share resources related to microelectronics, optoelectronics, biosensors, energy and emerging technologies.”
About the UB winners:
Wafa Elmannai is designing an intelligent, highly precise, and economically accessible framework to guide visually impaired people.
According to World Health Organization (WHO) estimates, there are 285 million visually impaired people worldwide, including 39 million blind individuals. Elmannai’s dissertation, “An Accurate Data Fusion System for Visually Impaired,” proposes a platform to assist them with obstacle detection and avoidance, as well as providing a navigational service.
The platform gathers inputs from a camera, compass, GPS, gyroscope, ultrasonic sensor, and PIR motion sensors. A data fusion algorithm then combines this information to assist the visually impaired user. Features that can be implemented include, but are not limited to, route guidance and navigation, character recognition and text reading, obstacle detection and avoidance, and finding lost items.
This novel electronic travel aid uses computer vision technologies to facilitate visually impaired people’s mobility both indoors and outdoors. The proposed approximate measurement method increases the accuracy of the obstacle-avoidance system.
Elmannai is a doctoral candidate in Computer Science and Engineering who previously won the Connecticut Technology Council’s Women of Innovation Award.
Eugene P. Gerety’s research involves something people use every day: two-dimensional barcodes.
Found on driver’s licenses, identity cards, boarding passes, as QR codes on marketing materials, and even at sports events to track athletes, 2D barcodes use shapes and symbols to pack in a ton of data, from someone’s address to their fingerprint.
The hitch: retrieving information. Says Gerety: “The current practical limit on data density is about 2K bytes per square inch, using ‘standard’ barcode printing/scanning hardware. Beyond that, the features of the barcode become so small that a type of image distortion known as aliasing occurs, making it impossible to obtain information.”
That’s where Gerety comes in. His research, “Code-Independent Technique with Alias Disambiguation for Data Extraction from Extreme High-Density 2D Printed Bit Field Images,” aims to make it easy to retrieve information from 2D barcodes with densities as high as 8K bytes per square inch. At the same time, he has ways to make the 2D codes highly durable and copy-proof, so they can endure high heat (e.g., a clothes dryer), being torn in half, or exposure to electrical and magnetic conditions that would destroy a chip card, and still be usable. That would make them more secure, more usable, and cheaper than data chips.
Gerety’s project uses spectral and spatial processing techniques to model and “reverse” the aliasing process. Gerety, a doctoral student, previously earned his bachelor’s and master’s degrees from UB. He currently serves as senior staff systems engineer at Philips/Respironics in Wallingford, CT.
Reem Mahjoub’s research focuses on optimizing the performance of mobile robots used on the battlefield, in medical applications, and in other critical tasks via so-called Wireless Sensor and Actor Networks, or WSANs.
Wireless Sensor and Actor Networks are collections of sensors and actors, such as robots, collaborating over a wireless medium to perform designated tasks, like detecting forest fires. Sensor nodes with limited power resources are responsible for sensing events and transmitting them to actor nodes, which can collect, process, and transmit data and perform various actions. Maintaining inter-actor connectivity is critical to a WSAN, as failure at one point may result in communication loss and performance failure.
Mahjoub, a doctoral student in Computer Science and Engineering, proposes a grid-based model for detecting critical actor failures and restoring the network while limiting service interruption. The proposed model improves overall network performance, especially when it is essential not only to provide a monitoring service but also to ensure that the network keeps running and needed actions are taken automatically.
For instance, the model can be deployed to improve monitoring of crop environments in response to climate change or disease. Sensor nodes can sense environmental parameters including crop diseases, water level, humidity, and temperature, while actor nodes process the sensed data and activate actions through attached devices that control CO2 levels, ventilation, heating, sprinklers, humidity, and lighting. By providing real-time information, the model ensures that the crop is continuously monitored and that, in the event of a failure, connectivity and performance are quickly restored.
Media contact: Leslie Geary, (203) 576-4625, email@example.com