Human-robot interaction for cooperative task handling
Humans and robots are expected to interact in many settings and to handle various tasks together, such as constructing an outpost on the Moon. As robots acquire more human-like communication capabilities, human-robot interaction (HRI) becomes more natural. This thesis developed a static gesture recognition system that was integrated with the existing speech recognition and navigation capabilities of a mobile robot to handle tasks such as fetching an object. The gesture recognition system first detects the hands using skin color detection and then segments them. A set of features is extracted using the Hu invariant moments. These features are then fed into two different classifiers, i.e., a Support Vector Machine (SVM) and a Neural Network (NN). The system was integrated with the speech recognition and navigation systems of a Pioneer 3-AT mobile robot and successfully tested. A performance comparison of the SVM and NN classifiers is also provided.
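The feature-extraction step described above can be sketched as follows. This is a minimal illustration, not the thesis's implementation: it assumes the hand has already been segmented into a binary mask (the skin-color detection stage is omitted), and it computes the seven Hu invariant moments in pure NumPy from central moments normalized by the zeroth moment. The resulting 7-element vector is what would be fed to the SVM or NN classifier.

```python
import numpy as np

def hu_moments(mask):
    """Compute the 7 Hu invariant moments of a binary hand mask.

    Central moments make the features translation-invariant; the
    eta normalization adds scale invariance; the Hu combinations
    add rotation invariance.
    """
    ys, xs = np.nonzero(mask)
    m00 = float(len(xs))            # zeroth moment = area in pixels
    xbar, ybar = xs.mean(), ys.mean()
    dx, dy = xs - xbar, ys - ybar   # coordinates relative to centroid

    def eta(p, q):
        # normalized central moment eta_pq = mu_pq / m00^(1 + (p+q)/2)
        return np.sum(dx**p * dy**q) / m00 ** (1 + (p + q) / 2.0)

    e20, e02, e11 = eta(2, 0), eta(0, 2), eta(1, 1)
    e30, e03, e21, e12 = eta(3, 0), eta(0, 3), eta(2, 1), eta(1, 2)

    h1 = e20 + e02
    h2 = (e20 - e02) ** 2 + 4 * e11 ** 2
    h3 = (e30 - 3 * e12) ** 2 + (3 * e21 - e03) ** 2
    h4 = (e30 + e12) ** 2 + (e21 + e03) ** 2
    h5 = ((e30 - 3 * e12) * (e30 + e12)
          * ((e30 + e12) ** 2 - 3 * (e21 + e03) ** 2)
          + (3 * e21 - e03) * (e21 + e03)
          * (3 * (e30 + e12) ** 2 - (e21 + e03) ** 2))
    h6 = ((e20 - e02) * ((e30 + e12) ** 2 - (e21 + e03) ** 2)
          + 4 * e11 * (e30 + e12) * (e21 + e03))
    h7 = ((3 * e21 - e03) * (e30 + e12)
          * ((e30 + e12) ** 2 - 3 * (e21 + e03) ** 2)
          - (e30 - 3 * e12) * (e21 + e03)
          * (3 * (e30 + e12) ** 2 - (e21 + e03) ** 2))
    return np.array([h1, h2, h3, h4, h5, h6, h7])
```

In practice the same feature vector is produced by OpenCV's `cv2.moments` followed by `cv2.HuMoments`; the explicit version above shows why the features do not change when the hand moves within the frame, which is what makes them suitable inputs for a static gesture classifier.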
"Human-robot interaction for cooperative task handling"
ETD Collection for Tennessee State University.