Design of Human Assist System for Communication
Department of Mechatronics Engineering, Center for Nano Technology, Bharath Institute of Science & Technology, Bharath University, Chennai, India
With the development of Human-Computer Interfaces (HCIs), methods have emerged to help people with disabilities communicate. Unlike traditional HCIs (a keyboard, a mouse, etc.), modern HCIs play an important role in rehabilitation. However, users with severe paralysis have only a few ways to control and work with applications; for them, methods based on eye movement, blinking, or voice can be used. In this project, we focus on implementing an IR-sensor-based EOG HCI and a voice-to-text processor that is cheap, portable, and noninvasive. In the eye-ball control section, IR sensors are placed close to the eye; when they sense eye-ball movement, the sensed value is passed to a comparator, which evaluates the voltage received from the sensors. If the voltage is about 0.5 V, a signal is automatically sent to the remote section to control applications on the PC connected to it; this voltage is reached only when eye-ball movement is sensed. In the receiver section, a microcontroller is interfaced to a Zigbee module and the PC: the Zigbee receiver receives the value, and the controller interprets the sensed movement and controls the corresponding application on the PC. In the tooth-click application, a micro-switch sensor is placed on a tooth; when the user clicks the teeth, the sensor detects it and a mouse-click operation is performed for the selected application. In the last phase of the project, a MEMS sensor is used for the head-nodding application: the MEMS sensor can be tilted in three positions, like the head, so each head position can control a particular application.
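The eye-ball control stage can be modeled in software as a simple threshold decision: the comparator trips at the 0.5 V reference stated above, and only a tripped comparison is forwarded to the remote (Zigbee) section. A minimal sketch, assuming the 0.5 V reference and treating the packet format as a hypothetical illustration:

```python
# Reference voltage of the hardware comparator, as given in the design.
EYE_THRESHOLD_V = 0.5

def comparator(sensor_voltage):
    """Software model of the hardware comparator: True when the IR
    sensor output indicates eye-ball movement (about 0.5 V or more)."""
    return sensor_voltage >= EYE_THRESHOLD_V

def transmit_if_moved(sensor_voltage):
    """Forward a packet to the remote section only when the comparator
    trips; otherwise nothing is transmitted. The packet layout here is
    an assumption, not taken from the paper."""
    if comparator(sensor_voltage):
        return {"event": "eye_move", "voltage": sensor_voltage}
    return None
```

In the described hardware this comparison happens in analog circuitry before transmission; the sketch only makes the decision rule explicit.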
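On the receiver side, the microcontroller reads an event from the Zigbee module and selects a PC action, e.g. a mouse click for a tooth click. A sketch of that dispatch, where the event names and the action table are illustrative assumptions (the paper does not specify which PC applications are controlled):

```python
# Illustrative event-to-action table; the concrete PC actions are
# assumptions for the sketch, not taken from the paper.
ACTIONS = {
    "eye_move": "move_cursor",
    "tooth_click": "mouse_click",
}

def handle_packet(event):
    """Receiver-side dispatch: map an event name received over Zigbee
    to the matching PC action. Unknown events are ignored (None)."""
    return ACTIONS.get(event)
```

A real implementation would read framed bytes from the Zigbee module's serial interface and emulate input on the PC; the table-lookup structure is the point here.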
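The head-nodding stage can likewise be sketched as a classifier over accelerometer readings: three tilt positions plus a neutral rest state. The three-position scheme follows the description above; the axis assignments and the dead-band threshold are illustrative assumptions:

```python
# Assumed dead-band (in g) separating "neutral" from a deliberate tilt.
TILT_DEADBAND_G = 0.3

def classify_tilt(ax, ay):
    """Classify MEMS accelerometer readings (in g) into one of three
    head positions or neutral. The three positions follow the paper;
    the axes and the 0.3 g threshold are assumptions for the sketch."""
    if ax > TILT_DEADBAND_G:
        return "tilt_right"
    if ax < -TILT_DEADBAND_G:
        return "tilt_left"
    if ay > TILT_DEADBAND_G:
        return "tilt_forward"
    return "neutral"
```

Each of the three non-neutral labels would then be bound to one PC application, matching the one-position-per-application mapping described above.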