Google's new hand-tracking AI could be a revolutionary tool for speech-impaired people

Millions of speech-impaired people use sign language to communicate with others. New technologies appear all the time, yet we are still waiting for one that can detect complex gestures and translate them for everyone else. Various related tools have been introduced, but each covers only a narrow slice of the problem and so remains limited. Google's work on a real-time hand-tracking algorithm could lead to the revolutionary product we have been looking for.

A detailed look at how the new algorithm works

The new algorithm uses a few clever shortcuts to detect complex hand movements in real time and respond accordingly, using nothing more than a smartphone and its camera. Earlier systems relied on powerful desktop hardware and could only recognize a limited set of gestures. This technology detects complex hand gestures in real time, entirely on a mobile phone, and it can even track multiple hands at once.
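For readers who want to see what this looks like in practice, here is a minimal sketch of real-time, multi-hand tracking on ordinary hardware. It assumes Google's MediaPipe Hands Python package (pip install mediapipe opencv-python), which is how this hand-tracking research was later made available; the article itself does not name a specific library.

# Minimal real-time hand tracking sketch, assuming the MediaPipe Hands Python API.
import cv2
import mediapipe as mp

mp_hands = mp.solutions.hands
mp_draw = mp.solutions.drawing_utils

cap = cv2.VideoCapture(0)  # phone or webcam feed
with mp_hands.Hands(max_num_hands=2,             # the pipeline can track several hands
                    min_detection_confidence=0.5,
                    min_tracking_confidence=0.5) as hands:
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV delivers BGR frames
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_hand_landmarks:
            for hand_landmarks in results.multi_hand_landmarks:
                # Draw the tracked hand skeleton on top of the video frame
                mp_draw.draw_landmarks(frame, hand_landmarks, mp_hands.HAND_CONNECTIONS)
        cv2.imshow("hand tracking", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
            break
cap.release()
cv2.destroyAllWindows()

Run it and wave one or both hands in front of the camera; the landmark skeleton is drawn live on the video feed, all on the local device.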

Hand movements are often quick and complex, and computers find them hard to follow. Detecting them both accurately and quickly is genuinely difficult; even multi-camera, depth-sensing rigs have failed to track every hand movement accurately and respond to it efficiently.

Rather than trying to detect the whole hand at once, the new algorithm extracts a small, well-chosen set of information from the hand and fingers, which is enough to track movements accurately. Working with less data means it can respond more quickly and effectively.

Instead of estimating the position and size of the whole hand, the algorithm focuses only on detecting the palm. This makes the system more robust to movement than earlier approaches that had to handle rectangular hand images of varying shapes and proportions, which produced less accurate results.


Once the palm is recognized, the fingers can be analyzed separately. A second model places 21 coordinate points across the hand, based on the size and angle of the palm among other cues, so that each finger can be identified on its own.
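The 21 points are exposed directly by the tracking model, one set per detected hand. The short sketch below, again assuming the MediaPipe Hands Python API, reads them out for a single image; the file name is just an example.

# Reading the 21 landmark coordinates predicted for one hand in a still image.
import cv2
import mediapipe as mp

image = cv2.imread("hand.jpg")  # example file name
with mp.solutions.hands.Hands(static_image_mode=True, max_num_hands=1) as hands:
    results = hands.process(cv2.cvtColor(image, cv2.COLOR_BGR2RGB))

if results.multi_hand_landmarks:
    landmarks = results.multi_hand_landmarks[0].landmark
    print(len(landmarks))             # 21 points: the wrist plus 4 joints per finger
    for i, lm in enumerate(landmarks):
        # x and y are normalized to the image width and height; z is relative depth
        print(i, lm.x, lm.y, lm.z)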

The researchers manually annotated these 21 points on more than 30,000 images of hand gestures in different situations, giving the machine the training data it needed to learn to detect hand movements quickly.
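To give a rough idea, a single annotated training example might look something like the following. The actual annotation format used by the researchers is not described in the article, so this is purely illustrative.

# Hypothetical sketch of one manually annotated training example:
# an image path plus the 21 (x, y) keypoints a human labeller placed on the hand.
example = {
    "image": "hands/00001.jpg",
    "keypoints": [  # 21 points: wrist, then 4 joints for each of the 5 fingers
        (0.52, 0.88),  # 0: wrist
        (0.44, 0.79),  # 1: thumb base
        # remaining 19 (x, y) pairs, one per joint
    ],
}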

Once the exact hand pose has been detected, the system compares it against known sign-language gestures to find matching symbols for letters or numbers.
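The article does not say exactly how this comparison is done, but a simple way to picture it is nearest-neighbour matching: flatten the 21 landmarks into a pose vector and find the stored sign-language pose it is closest to. The reference templates and distance measure below are hypothetical, not Google's actual method.

# Toy illustration of the matching step: compare a detected hand pose (21 landmarks)
# against stored reference poses for sign-language letters.
import numpy as np

def to_vector(landmarks):
    # Flatten 21 (x, y, z) landmarks into a 63-value pose vector,
    # translated so the wrist (landmark 0) sits at the origin.
    pts = np.array([[lm.x, lm.y, lm.z] for lm in landmarks])
    pts -= pts[0]                      # make the pose translation-invariant
    return pts.flatten()

def match_sign(pose_vector, templates):
    # Return the template label whose pose vector is closest to the detected pose.
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        dist = np.linalg.norm(pose_vector - template)
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label, best_dist

# templates would be built from annotated examples, e.g. {"A": vector, "B": vector, ...}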



Together, these steps produce fast and accurate results, and the whole pipeline runs on an ordinary smartphone rather than a complex desktop setup or the cloud.

This new algorithm is only a first step toward technology that can detect sign language. It can also be picked up by other researchers, who may find in it the missing piece needed to make similar existing systems accurate.

New technology that could mean the world to others

The technology is still a work in progress and has not been launched in any Google product yet, which means researchers are free to build on it however they like. Google is known for its efforts to make life easier for its users, and this new algorithm is a step toward doing the same for speech-impaired people. We hope this hand-tracking work reaches the wider research and development community and helps produce a truly revolutionary product for people with special needs. If developed further, it could power a range of applications that make it easier for speech-impaired people to communicate with others.
