29 August 2019

Google has developed Artificial Intelligence to understand sign language

A great many people communicate through sign language, yet technology that can understand this complex mode of communication and convert it into speech has not yet been invented.  Google's Artificial Intelligence Lab is paving the way for an innovation in this field: it has developed an algorithm that can track hand movements in real time.
Google uses a smartphone and its camera, together with machine learning, to track the movement of the hand and every finger in real time.


Google sources say the goal is to use the mobile phone to let computers and other devices decode sign language.
Gesture signs
The similarity between many hand gestures makes translating sign language into spoken language in real time a challenge.  In addition, the speed at which signs are made, and the small variations between signers, can affect the process.  Turning gestures into speech is a challenge in itself.
Google Artificial Intelligence Lab
Even with multiple cameras and depth-sensing equipment, understanding every movement of the hands and turning gestures into speech in real time is difficult.  Google's Artificial Intelligence Lab found that if the computer is given less data to process, translation becomes faster.
Movements of the palm
Google's system interprets the palm rather than the movement of the entire hand.  It only needs to understand the movement of the fingers relative to the palm, and does not need to process large images.  This speeds up the process of turning gestures into speech.
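The idea above can be sketched in a few lines: first locate the much simpler palm region in the frame, then analyse only that small crop instead of the whole image.  This is a minimal illustration, not Google's actual code; the function names and the fixed bounding box are assumptions made for the example.

```python
# Sketch of the palm-first idea: find the palm region, then work only
# on that crop. detect_palm() is a hypothetical stand-in -- a real
# detector would be a trained model.

def detect_palm(frame):
    """Pretend palm detector: returns a bounding box (row, col, height, width).
    Here the palm is assumed to sit in a fixed 4x4 region of the frame."""
    return (2, 2, 4, 4)

def crop(frame, box):
    """Cut the frame down to the detected palm region."""
    r, c, h, w = box
    return [row[c:c + w] for row in frame[r:r + h]]

# A toy 10x10 "frame" of pixel values.
frame = [[row * 10 + col for col in range(10)] for row in range(10)]

palm_box = detect_palm(frame)
palm_crop = crop(frame, palm_box)

# The later stages now only have to look at 4x4 = 16 values instead of
# 100 -- the data reduction that makes real-time tracking feasible.
print(len(palm_crop), len(palm_crop[0]))  # 4 4
```

The point of the sketch is only the data reduction: every later step runs on the small crop, not the full camera frame.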
21 coordinates
When the hand moves, most of the meaningful motion comes from the fingers.  A special algorithm understands finger movement: it places 21 coordinates across the hand, from the wrist to the tip of each finger.  These coordinates help pinpoint the position of each finger and determine the pose of the hand.
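A small sketch of how 21 coordinates can describe a hand pose: each landmark is an (x, y) point (real models also predict depth), and simple comparisons between landmarks reveal the pose.  The landmark numbering and the `is_index_extended` helper are assumptions made for this illustration, not the published scheme.

```python
# Hypothetical 21-landmark sketch. Indices follow a common convention
# (wrist = 0, then four points per finger), assumed here for illustration.

WRIST = 0
INDEX_PIP = 6   # middle joint of the index finger
INDEX_TIP = 8   # tip of the index finger

def is_index_extended(landmarks):
    """A finger is (roughly) extended when its tip is higher in the image
    than its middle joint. In image coordinates, y grows downward."""
    return landmarks[INDEX_TIP][1] < landmarks[INDEX_PIP][1]

# 21 synthetic landmarks for a hand with the index finger pointing up.
landmarks = [(0.5, 0.9)] * 21
landmarks[INDEX_PIP] = (0.5, 0.5)
landmarks[INDEX_TIP] = (0.5, 0.2)

print(is_index_extended(landmarks))  # True
```

Rules like this, applied across all 21 points, are how a fixed set of coordinates can distinguish one hand pose from another.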
Machine Learning System
To train the finger-recognition model, the 21 points had to be manually annotated on 30,000 photos of hands in different poses and lighting conditions.  Behind every artificial intelligence system there is a great deal of human labour.
Source code is free
This algorithm has not yet been used in any Google product, so Google is giving it away for free: anyone can take the source code and build their own system.  Google sources said the source code has been made available to all researchers in the hope that it will change how the technology is used for the better.
