This paper is published in Volume 2, Issue 6, 2017
Area
Image Processing and IOT
Author
Divya Sree .V
Org/Univ
Jawaharlal College of Engineering and Technology, India
Keywords
Horn–Schunck Method, HCI, Image Processing, Python, Visual Basic, Gestures
Citations
IEEE
Divya Sree .V, "An IOT Integrated Gesture Recognition Using Image Processing For Speech Impaired People," International Journal of Advance Research, Ideas and Innovations in Technology, vol. 2, no. 6, 2017, www.IJARnD.com.
APA
Divya Sree .V (2017). An IOT Integrated Gesture Recognition Using Image Processing For Speech Impaired People. International Journal of Advance Research, Ideas and Innovations in Technology, 2(6). www.IJARnD.com.
MLA
Divya Sree .V. "An IOT Integrated Gesture Recognition Using Image Processing For Speech Impaired People." International Journal of Advance Research, Ideas and Innovations in Technology 2.6 (2017). www.IJARnD.com.
Abstract
Gesture is one of the most powerful and expressive ways of communication between humans and computers. The method proposed here provides a real-time gesture recognition system for speech-impaired people, reducing the communication gap between the mute community and the rest of the world. The proposed approach is capable of detecting gestures with good accuracy. Gestures shown in front of the system's built-in camera are used for image processing, and the Horn–Schunck optical flow algorithm is used to track movements across the video frames; a sketch of this step is given below. A gesture library is created by subdividing the data frames, with the focus on hand movements. For each gesture shown, the corresponding text and speech output is generated in MATLAB. The generated result is used to update a website built in VB (Visual Basic). Python is used for the IoT integration: MATLAB and VB are connected via Python. VB provides the text and speech output along with the generated website. The speech and text output for the hand gestures shown can be accessed on any device by sharing the VB application and providing the IP address of the source device. The matching score over multiple instances of training and testing is around 98% in MATLAB, and VB gives 100% accuracy. The system is portable, so disabled users can carry it with them at their convenience. The basic objective of this project is to develop a computer-based intelligent system that enables speech-impaired people to communicate with all other people using their natural hand gestures.
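The abstract names the Horn–Schunck optical flow algorithm as the motion-tracking step but does not give implementation details (the paper's version runs in MATLAB). The following is a minimal illustrative sketch of the standard Horn–Schunck iteration in Python/NumPy; the function name, the smoothness weight alpha, and the iteration count are assumptions chosen for the example, not values taken from the paper.

```python
# Minimal Horn–Schunck optical flow sketch (illustrative; the paper's
# implementation is in MATLAB and its exact parameters are not stated).
import numpy as np
from scipy.ndimage import convolve

def horn_schunck(frame1, frame2, alpha=1.0, n_iter=100):
    """Estimate a dense flow field (u, v) between two grayscale frames."""
    frame1 = frame1.astype(np.float32)
    frame2 = frame2.astype(np.float32)

    # Spatial (Ix, Iy) and temporal (It) intensity derivatives,
    # averaged over the two frames as in the original formulation.
    kx = np.array([[-1.0, 1.0], [-1.0, 1.0]]) * 0.25
    ky = np.array([[-1.0, -1.0], [1.0, 1.0]]) * 0.25
    kt = np.ones((2, 2)) * 0.25
    Ix = convolve(frame1, kx) + convolve(frame2, kx)
    Iy = convolve(frame1, ky) + convolve(frame2, ky)
    It = convolve(frame2, kt) - convolve(frame1, kt)

    # Kernel for the neighbourhood average of the flow field.
    avg = np.array([[1/12, 1/6, 1/12],
                    [1/6,  0.0, 1/6],
                    [1/12, 1/6, 1/12]])

    u = np.zeros_like(frame1)
    v = np.zeros_like(frame1)
    for _ in range(n_iter):
        u_avg = convolve(u, avg)
        v_avg = convolve(v, avg)
        # Horn–Schunck update: brightness-constancy data term
        # balanced against the smoothness weight alpha.
        num = Ix * u_avg + Iy * v_avg + It
        den = alpha ** 2 + Ix ** 2 + Iy ** 2
        u = u_avg - Ix * num / den
        v = v_avg - Iy * num / den
    return u, v
```

In a pipeline like the one described, the magnitude of (u, v) between consecutive frames could be thresholded to isolate the moving hand region before the frames are matched against the gesture library; this post-processing step is an assumption here, not a detail stated in the abstract.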