Gesture-based wireless single-armed robot in Cartesian 3D space using Kinect
Human–Machine Interaction (HMI) has always played an important role in everyday life, motivating research in the area of intelligent service robots. Conventional methods such as remote controllers or wearables cannot cater to the high demands of some environments. To overcome this, the challenge is to develop vision-based gesture recognition techniques. Most human interaction with the environment depends on our ability to navigate freely and to use our hands and arms to manipulate objects. An ideal interface for robot teleoperation would be inexpensive, person-independent, and easy to use, requiring no wearable equipment and little or no user training. This project deals with controlling an Arduino-based two-wheeled robot with an arm mounted on top. This armed robot, used as a prototype, is controlled through various gestures of the arms and legs. For gesture recognition, we make use of the skeletal tracking ability of the Kinect sensor, a product of Microsoft. The tracked gestures are transmitted over Bluetooth, making the controls wireless. Since the robot may operate out of the operator's line of sight, it also captures video of its environment and transmits it over radio frequency in real time for display on a screen. Guided by the received video, the operator steers the robot and uses the arm to pick and place objects with the help of predetermined gestures. This approach not only eases operation of the robot but also offers benefits on both the software and hardware side: since the Kinect sensor handles gesture recognition, it reduces the coding complexity of image processing and eliminates the cost of image-processing software such as MATLAB or Wolfram Mathematica. This makes our project more effective, efficient and reliable.
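To illustrate the kind of mapping from tracked skeleton to robot command that the abstract describes, here is a minimal Python sketch. It is purely illustrative: the joint names, thresholds, and single-character command codes are our own assumptions, not the project's actual gesture set or protocol, and joint coordinates are assumed to arrive as (x, y, z) tuples in metres with y pointing up, as in the Kinect camera frame.

```python
# Hypothetical gesture classifier: maps Kinect skeletal joint positions
# to single-character drive commands that could be sent over Bluetooth.
# Joint names, thresholds, and command letters are illustrative only.

def classify_gesture(joints):
    """joints: dict mapping joint name -> (x, y, z) in metres, y up."""
    head_y = joints["head"][1]
    right_hand_y = joints["hand_right"][1]
    left_hand_y = joints["hand_left"][1]

    if right_hand_y > head_y and left_hand_y > head_y:
        return "F"  # both hands raised above the head -> move forward
    if right_hand_y > head_y:
        return "R"  # only right hand raised -> turn right
    if left_hand_y > head_y:
        return "L"  # only left hand raised -> turn left
    return "S"      # no recognised gesture -> stop
```

On the operator's PC, the returned character would then be written to the Bluetooth serial link (for example with a serial-port library), and the Arduino on the robot would read it and drive the wheels or arm accordingly.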