Imirok: Real-time Imitative Robotic Arm Control for Home Robot Applications (Computer Project)

ABSTRACT:

Training home robots to behave like humans can help people with their daily chores and repetitive tasks. In this paper, we present Imirok, a system that remotely controls robotic arms by mirroring user motion, using low-cost, off-the-shelf mobile devices and a webcam.

The motion tracking algorithm detects user motion in real time, without classifier training or a pre-defined action set. Experimental results show that the system achieves 90% precision and recall on motion detection with a blank background, and is robust to cluttered backgrounds and changes in user-to-camera distance.

The System Architecture of Imirok.

MOTION TRACKING:

A. Design Considerations:

Our goal is to infer user motions from a video stream captured by a webcam facing the user. To track the motion of the user's hand, we need to decide what kind of image feature is both robust and well-suited for real-time tracking of moving points across consecutive frames. The intuition is that if we randomly pick a point on a uniform surface to track, we are unlikely to find the same point in the next frame.

B. Image Feature Extraction:

First, each input image from the video sequence is converted to a grayscale image, since color information is not required. Then we extract corner features, which have been shown to be useful for object tracking in video. The corner feature extraction algorithm finds a list of points that are relatively reliable to track.
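As a rough sketch (our illustration, not the paper's code), the grayscale step applies the standard Rec. 601 luminance weights; corner extraction would then run on the resulting intensity image (in OpenCV, via functions such as goodFeaturesToTrack):

```python
# Standard luminance (Rec. 601) conversion used by most vision libraries.
def to_grayscale(pixel):
    """Convert one (R, G, B) pixel to a single intensity value."""
    r, g, b = pixel
    return 0.299 * r + 0.587 * g + 0.114 * b

# A tiny 2x2 "frame" of RGB pixels, for illustration only.
frame = [[(255, 0, 0), (0, 255, 0)],
         [(0, 0, 255), (255, 255, 255)]]
gray = [[round(to_grayscale(p)) for p in row] for row in frame]
print(gray)  # → [[76, 150], [29, 255]]
```

Note that white maps to full intensity (255) while pure red, green, and blue map to their respective luma weights, which is why edges between differently colored objects remain visible after the conversion.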

C. Optical-Flow-based Motion Tracking:

Since we need real-time gesture recognition rather than offline video motion analysis, we adopt the Lucas-Kanade algorithm, a sparse tracking method that significantly reduces computational cost compared with dense tracking. Our approach works as follows. First, given an input frame, we extract at most Nmax salient corner points that are suitable for tracking. Then, by comparing the current frame with the previous one, we track the displacement of each corner point across frames.
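The per-frame step after tracking can be sketched as follows (our illustration, not the authors' code): keep only the corners that actually moved between frames, then take their centroid as the hand position, as in the red circles of Figure 2. In a real pipeline the point correspondences would come from the Lucas-Kanade tracker (e.g. OpenCV's calcOpticalFlowPyrLK); the displacement threshold below is an assumed value.

```python
MIN_DISPLACEMENT = 2.0  # pixels; assumed threshold to filter out static corners

def moving_centroid(prev_pts, curr_pts, thresh=MIN_DISPLACEMENT):
    """Centroid of corners that moved by at least `thresh` pixels, or None."""
    moving = [c for p, c in zip(prev_pts, curr_pts)
              if ((c[0] - p[0]) ** 2 + (c[1] - p[1]) ** 2) ** 0.5 >= thresh]
    if not moving:
        return None  # no motion detected in this frame
    n = len(moving)
    return (sum(x for x, _ in moving) / n, sum(y for _, y in moving) / n)

prev_pts = [(10, 10), (50, 50), (80, 20)]
curr_pts = [(10, 10), (56, 50), (80, 28)]   # first corner is static
print(moving_centroid(prev_pts, curr_pts))  # → (68.0, 39.0)
```

Filtering static corners keeps background points (which the tracker also finds) from pulling the centroid away from the moving hand.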

Figure 2. Screenshot (a) of the motion tracking system. The green lines connect the previous and current positions of each moving point. The dark and bright red circles are the centroids of the filtered moving corners in the previous (b) and current (c) frames, respectively.

ROBOTIC ARM CONTROL:

In this paper, we present an intuitive way of controlling a robotic arm to follow motions made by a human operator. One challenge is to establish a one-to-one mapping from joints of the human body to sections of the robot arm. In this section, we present the motion-mapping techniques in Imirok and the design of the motor control circuits in the current implementation.
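A minimal motion-to-command mapping might look like the following (the paper does not spell out its exact mapping, so the dominant-axis rule and dead-zone size here are our assumptions): the dominant axis of the centroid's displacement between frames selects one of the four arm commands.

```python
def map_motion(dx, dy, dead_zone=5):
    """Return 'left'/'right'/'up'/'down', or None inside the dead zone.

    dx, dy: centroid displacement in pixels; dead_zone is an assumed value.
    """
    if max(abs(dx), abs(dy)) < dead_zone:
        return None                      # ignore small jitter
    if abs(dx) >= abs(dy):
        return 'right' if dx > 0 else 'left'
    # image y grows downward, so negative dy means the hand moved up
    return 'down' if dy > 0 else 'up'

print(map_motion(20, 3))    # → right
print(map_motion(-2, -15))  # → up
```

The dead zone matters in practice: corner localization is noisy, and without it the arm would twitch constantly even when the operator holds still.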

Motion Mapping and Motion Control Circuit.

IMPLEMENTATION:

For the prototype implementation, we use SunSPOT nodes for 802.15.4 radio communication between the vision-based motion tracking device (a laptop with a webcam) and the OWI-535 robotic arm. A TI SN754410 H-bridge IC is used for motor control. The image feature extraction and motion tracking algorithms are implemented in C++ using the OpenCV library.
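For one motor channel of an SN754410 H-bridge, the drive logic reduces to a small truth table over the two direction inputs (1A, 2A) and the enable pin. The pin names follow the datasheet; how the channels are wired to the specific arm motors in Imirok is not stated in the source, so this table is only an illustrative sketch.

```python
# Illustrative drive table for one SN754410 half-bridge pair (one DC motor).
H_BRIDGE = {
    'forward': {'1A': 1, '2A': 0, 'EN': 1},  # current flows one way
    'reverse': {'1A': 0, '2A': 1, 'EN': 1},  # current flows the other way
    'brake':   {'1A': 0, '2A': 0, 'EN': 1},  # both outputs low: fast brake
    'coast':   {'1A': 0, '2A': 0, 'EN': 0},  # enable low: outputs high-impedance
}

def drive(command):
    """Return the pin levels to set for a given motor command."""
    return H_BRIDGE[command]

print(drive('forward'))  # → {'1A': 1, '2A': 0, 'EN': 1}
```

In a real controller, pulsing EN with PWM on top of this table would also give speed control.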

EVALUATION:

In this section, we present the evaluation results of motion tracking performance of Imirok, and discuss possible factors that influence the performance.

A. Background Complexity:

Intuitively, a simple background behind the human operator will yield better tracking performance than a complex background containing multiple objects of different colors. We evaluate the performance of Imirok under both blank and cluttered backgrounds. For the blank background, a human operator stands in front of a mono-color wall, issuing left, right, up, and down commands by moving his upper arm.

B. Distance to Camera:

The human operator stands in front of a mono-color wall at different distances from the camera. As Figures 7 and 8 show, the precision and recall rates of Imirok remain nearly constant as the distance changes. For all cases ranging from 1 m to 3 m, the average precision and recall rates are over 85%.
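For reference, precision and recall here follow the standard definitions over detected motion commands; the counts in the example below are made up for illustration and are not the paper's data.

```python
def precision_recall(tp, fp, fn):
    """tp: correct detections, fp: spurious detections, fn: missed motions."""
    precision = tp / (tp + fp)  # fraction of detections that were real motions
    recall = tp / (tp + fn)     # fraction of real motions that were detected
    return precision, recall

# e.g. 18 correctly detected motions, 2 false detections, 2 missed motions
p, r = precision_recall(tp=18, fp=2, fn=2)
print(p, r)  # → 0.9 0.9
```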

Fig. 7 and Fig. 8. Distance to Camera.

C. Frame Rate:

A promising and interesting research direction is to implement the whole Imirok system on mobile devices. A key challenge in doing so is reducing the computational complexity. In this subsection, we compare motion tracking performance while varying the frame rate of the optical-flow algorithm.
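The trade-off being measured can be sketched simply: processing only every k-th frame cuts the optical-flow workload by roughly a factor of k, at the cost of coarser motion samples. The skip factor below is our illustrative knob, not a parameter taken from the paper.

```python
def downsample(frames, skip):
    """Keep every `skip`-th frame from the input sequence."""
    return frames[::skip]

frames = list(range(30))            # 30 frames = 1 second of video at 30 fps
print(len(downsample(frames, 2)))   # 15 fps-equivalent → 15
print(len(downsample(frames, 3)))   # 10 fps-equivalent → 10
```

The catch is that larger inter-frame displacements make sparse optical flow harder to track reliably, which is why frame rate cannot be lowered arbitrarily.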

CONCLUSION:

In this paper, we have presented the design, implementation, and evaluation of Imirok, an imitative robotic arm control system. The system detects user motion in real time, without requiring model training before use.

The motion tracking approach achieves 90% precision and recall with a blank background, and shows acceptable robustness to cluttered backgrounds and changes in user-to-camera distance. The real-time operation, robust performance, and intuitive imitative human-robot interaction make the system particularly desirable for controlling home robots in a smart home environment.

FUTURE WORK:

There are plenty of research opportunities for future work. Our approach can be extended to increase the degrees of freedom in controlling the robotic arm. In addition, controlling two arms simultaneously can be enabled by adding a motion-orientation clustering algorithm after motion tracking. To further enable robotic control using mobile phone cameras, it is also important to balance the trade-off between computational cost, power consumption, transmission bandwidth, and the performance of vision-based motion tracking.

Source: Carnegie Mellon University
Authors: Heng-Tze Cheng | Zheng Sun | Pei Zhang
