Machine Vision


ASL Hand Gesture Recognition

I'd like to make an ASL Hand Gesture Recognition program. http://en.wikipedia.org/wiki/American_Sign_Language (ASL). Here's a list of the different hand gestures: http://en.wikipedia.org/wiki/File:ABC_pict.png

I think I can handle hand shapes that have at least one finger open.

How do you handle the hand gestures that have fingers closed, like A, E, M, N, S, and T? How about the moving hand gestures like J and Z?

0 Kudos
Message 1 of 4
(4,709 Views)

Hi ognamidk,

 

Welcome to the NI Discussion Forums!

 

Unfortunately, this is a highly complex problem in my opinion, and I am certainly not a domain expert in this area, so please take everything I say with that in mind.

 

Are you using the NI Vision Development Module (VDM)? Also, what general approach do you have in mind? Are you trying to track the finger joint positions first or are you simply doing some kind of pattern recognition or template matching on the image?

 

A quick search online reveals that this problem is quite open-ended and is being worked on by numerous research groups in academia, which also confirms that it is highly complex. See, for example: "Finger Detection for Sign Language Recognition", "Real-Time American Sign Language Recognition Using Desk and Wearable Computer Based Video", and "Automated Sign Language Recognition - Past, Present, and Future".

 

Regardless of the technique, pre-processing of the image appears to be highly important: you want to isolate the hand and remove any non-relevant information. One option is background subtraction, paired with edge detection to find the outline of the hand and fingers. With regards to the actual recognition, I do not think a pattern matching algorithm in VDM will be able to decipher the letter. Perhaps machine learning is something you can investigate?
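To make the pre-processing idea concrete, here is a minimal NumPy sketch (not VDM code; a VDM implementation would use the equivalent vision VIs). It assumes you have a grayscale frame and a reference background image of the same size; the threshold value and the toy 8x8 "hand" are illustrative assumptions, not values from the original post.

```python
import numpy as np

def subtract_background(frame, background, threshold=30):
    """Binary mask of pixels that differ from the background.

    frame and background are 2-D uint8 grayscale arrays of equal shape.
    The threshold (assumed value) suppresses small lighting differences.
    """
    diff = np.abs(frame.astype(np.int16) - background.astype(np.int16))
    return (diff > threshold).astype(np.uint8)

def edge_magnitude(mask):
    """Approximate edge strength of the mask with simple finite-difference gradients."""
    gy, gx = np.gradient(mask.astype(float))
    return np.hypot(gx, gy)

# Toy example: flat background, with a bright square standing in for the hand.
background = np.zeros((8, 8), dtype=np.uint8)
frame = background.copy()
frame[2:6, 2:6] = 200  # the "hand" region

mask = subtract_background(frame, background)
edges = edge_magnitude(mask)
outline = edges > 0  # non-zero gradient marks the hand's outline
```

In a real setup you would also clean the mask up (for example with morphological open/close operations) before extracting the outline, since camera noise makes the raw difference image speckled.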

 

Lastly, in my opinion, robustly detecting hand gestures that involve motion would require some kind of temporal segmentation algorithm. Another online search for "segmentation of hand gestures" reveals that even this problem in itself is very complex.
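As a rough illustration of temporal segmentation, one simple baseline is to threshold the motion energy between consecutive frames and take the contiguous run of "active" frames as the gesture. The sketch below is NumPy-only and the frame sequence, sizes, and threshold are all made-up illustrative values; real J/Z recognition would still need trajectory classification on top of this.

```python
import numpy as np

def motion_energy(frames):
    """Mean absolute difference between each pair of consecutive frames."""
    frames = np.asarray(frames, dtype=np.int16)
    return np.abs(np.diff(frames, axis=0)).mean(axis=(1, 2))

def segment_gesture(frames, threshold=5.0):
    """Return (start, end) frame indices spanning the detected motion,
    or None if no frame-to-frame change exceeds the threshold."""
    energy = motion_energy(frames)
    active = np.where(energy > threshold)[0]
    if active.size == 0:
        return None
    return int(active[0]), int(active[-1] + 1)

# Toy sequence: static, then a bright stripe jumps across the frame, then static.
frames = [np.zeros((6, 6), dtype=np.uint8) for _ in range(6)]
for t, col in zip((2, 3), (1, 3)):
    frames[t] = np.zeros((6, 6), dtype=np.uint8)
    frames[t][:, col] = 255  # the moving "fingertip" stripe

span = segment_gesture(frames)  # frame range containing the motion
```

Once the moving portion is isolated, you could track the fingertip position within that window and compare its path against templates for J and Z.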

 

In closing, I'm not sure what your general approach to this problem is or what timeframe you are aiming for, but I do believe this will be a challenging task. While we can certainly provide advice on the use of our tools and software, we are unfortunately not domain experts in this field and will not be able to provide much support in the algorithm development phase. Hopefully someone else on the forums can provide more insight.

 

Feel free to share your thoughts on the topics I brought up. Best of luck to you with your application!

Message 2 of 4
(4,688 Views)

Hi

I am interested in building a sign language translator using LabVIEW. Hand signals are captured using bend sensors, accelerometers, and contact sensors. These signals are processed and passed to the gesture recognition module in LabVIEW.

 

Kindly guide me with this.

 

Thanks 

0 Kudos
Message 3 of 4
(4,602 Views)

Hi rasathi,

 

If you are using bend sensors, accelerometers, and contact sensors, this is no longer a vision-based solution, correct? In that case, this is not the appropriate board on the forum for this question, as the original post pertains to vision and image processing based solutions to the problem. I would recommend posting on the general LabVIEW boards instead: http://forums.ni.com/t5/LabVIEW/bd-p/170

 

Furthermore, as mentioned in my previous post, I can certainly provide assistance with the implementation of the algorithm, but as I am not a domain expert in this application, I cannot provide much assistance in terms of algorithm development.

0 Kudos
Message 4 of 4
(4,582 Views)