TABLE I
FUNDAMENTAL GRAMMARS OF XACRO

Command   | Definition                                    | Usage
Property  | <xacro:property name="pi" value="3.14" />     | <... value="${2*pi}" ... />
Argument  | <xacro:arg name="use_gui" default="false" />  | <... use_gui:=true ... />
Macro     | <xacro:macro name="arm" params="side" />      | <xacro:arm side="left" />
Including | <xacro:include filename="other_file.xacro" /> | -
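To illustrate how the commands in Table I compose, the following is a minimal, hypothetical Xacro file; the robot name, macro body, and numeric values are illustrative and not taken from this paper:

<?xml version="1.0"?>
<robot xmlns:xacro="http://www.ros.org/wiki/xacro" name="demo_arm">
  <!-- Property: a named constant, referenced later as ${pi} -->
  <xacro:property name="pi" value="3.14" />
  <!-- Argument: overridable at launch time, e.g. use_gui:=true -->
  <xacro:arg name="use_gui" default="false" />
  <!-- Macro: a parameterized block, expanded once per invocation -->
  <xacro:macro name="arm" params="side">
    <link name="${side}_arm" />
  </xacro:macro>
  <!-- Including: pulls definitions from another file (placeholder path, so commented out) -->
  <!-- <xacro:include filename="other_file.xacro" /> -->
  <xacro:arm side="left" />
  <xacro:arm side="right" />
</robot>

Processing this file with xacro expands the two macro calls into left_arm and right_arm links; expressions such as ${2*pi} are evaluated at expansion time.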
Fig. 11. Snapshots of the motion trajectory

The residual imitation errors can be attributed to several factors. The first is the structural difference between human and robot: each human arm has 7 DOF, whereas the robot arm has only 5, and the rotational axes of the wrists do not coincide. Moreover, the rotational ranges of some robot joints are limited by the mechanical design (e.g., the robot's arm cannot be raised above its shoulder). The second is the mismatch between the skeletal model visualized from the BVH data and the wearer's real motion. Joint rotation in the human body is realized through the skeleton, while wearable sensors can only be fixed to the skin or clothing, so relative angular displacement exists between skin and skeleton. Hence errors become apparent when particular behaviors are carried out. Other factors include accumulated drift errors and variations in the placement of the wearable sensors on the body. Nevertheless, there are possible solutions to these limitations. For example, sensors can be bound tightly to the limbs to prevent relative displacement between the sensors and the skin, and human motion can be confined to a certain range to achieve higher accuracy. In addition, reasonable compensation for the errors resulting from the relative angular displacement between skin and skeleton can render the motion retargeting more reliable, as sketched below.
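As a concrete sketch of such a compensation (the per-joint constant offset and the clamping to joint limits are illustrative assumptions, not a formulation given above), the command for robot joint j could be computed as

    \hat{\theta}_j = \min\left(\theta_j^{\max},\ \max\left(\theta_j^{\min},\ \theta_j^{\mathrm{meas}} - \Delta_j\right)\right)

where \theta_j^{\mathrm{meas}} is the joint angle estimated from the wearable sensors, \Delta_j is a constant offset calibrated per joint to absorb the skin-skeleton angular displacement, and [\theta_j^{\min}, \theta_j^{\max}] are the robot's mechanical joint limits, so that the commanded angle never exceeds what the mechanism allows.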
VI. CONCLUSIONS
In this paper, a novel human-in-the-loop system for human motion imitation on a humanoid robot has been proposed. The system enables real-time imitation through an accurate motion capture system, visualization terminals for different motion systems, fast mapping algorithms and reliable data transfer methods. Experiments with different gestures have demonstrated that the system is feasible, real-time and accurate. Future work will lay more emphasis on the development of mapping algorithms and the accuracy of human motion imitation on a humanoid robot. Encouraged by the Tri-Co Robot Initiative, we hope this work will contribute to the enhancement of robot interaction capabilities.
REFERENCES
[1] E. Yavşan and A. Uçar, "Gesture imitation and recognition using Kinect sensor and extreme learning machines," Measurement, vol. 94, pp. 852–861, 2016.
[2] H. Ding, X. Yang, N. Zheng, M. Li, Y. Lai, and H. Wu, "Tri-Co robot: a Chinese robotic research initiative for enhanced robot interaction capabilities," National Science Review, p. nwx148, 2017.
[3] M. Riley, A. Ude, K. Wade, and C. G. Atkeson, "Enabling real-time full-body imitation: a natural way of transferring human movement to humanoids," in Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), 2003, pp. 2368–2374, vol. 2.
[4] A. Durdu, H. Cetin, and H. Komur, "Robot imitation of human arm via artificial neural network," in International Conference on Mechatronics - Mechatronika, 2015, pp. 370–374.
[5] S. H. Hyon, J. G. Hale, and G. Cheng, "Full-body compliant human-humanoid interaction: Balancing in the presence of unknown external forces," IEEE Transactions on Robotics, vol. 23, no. 5, pp. 884–898, 2007.
[6] I. J. Ding, C. W. Chang, and C. J. He, "A Kinect-based gesture command control method for human action imitations of humanoid robots," in International Conference on Fuzzy Theory and Its Applications, 2014, pp. 208–211.
[7] A. Bindal, A. Kumar, H. Sharma, and W. K. Kumar, "Design and implementation of a shadow bot for mimicking the basic motion of a human leg," in International Conference on Recent Developments in Control, Automation and Power Engineering, 2015, pp. 361–366.
[8] S. Gobee, M. Muller, V. Durairajah, and R. Kassoo, "Humanoid robot upper limb control using Microsoft Kinect," in 2017 International Conference on Robotics, Automation and Sciences (ICORAS), Nov. 2017, pp. 1–5.
[9] X. Meng, J. Pan, and H. Qin, "Motion capture and retargeting of fish by monocular camera," in International Conference on Cyberworlds, 2017, pp. 80–87.
[10] H. Dai, B. Cai, J. Song, and D. Zhang, "Skeletal animation based on BVH motion data," pp. 1–4, 2010.
[11] N. E. N. Rodriguez, G. Carbone, and M. Ceccarelli, "Anthropomorphic design and operation of a new low-cost humanoid robot," in IEEE/RAS-EMBS International Conference on Biomedical Robotics and Biomechatronics, 2006, pp. 933–938.
[13] L. Gong, C. Gong, Z. Ma, L. Zhao, Z. Wang, X. Li, X. Jing, H. Yang, and C. Liu, "Real-time human-in-the-loop remote control for a life-size traffic police robot with multiple augmented reality aided display terminals," in 2017 2nd International Conference on Advanced Robotics and Mechatronics (ICARM), Aug. 2017, pp. 420–425.
[14] Z. Wang, L. Gong, Q. Chen, Y. Li, C. Liu, and Y. Huang, Rapid Developing the Simulation and Control Systems for a Multifunctional Autonomous Agricultural Robot with ROS. Springer International Publishing, 2016.