It is normal for a robot to move around the places where people live their daily lives. Around such a robot there are many moving objects: walking people, moving cars, dogs, cats, and so on. Moreover, a moving robot sees a moving scene through its own eyes.
In this situation, the robot observes the motions of people who are facing it. The robot must recognize these human motions and respond to the people by generating its own motions and suitable utterances with a synthesized voice.
If a robot cannot produce suitable actions based on its perception of the motions of the people facing it, the robot appears uncooperative, and such a robot is not acceptable to human society.
We have already developed a new algorithm called Time-space Continuous Dynamic Programming (TSCDP), which enables a robot to realize the functions mentioned above, as required for a robot eye, and thus to cooperate well with our society.
Namely, TSCDP is implemented on a robot. TSCDP works on the time-varying image
captured by the eye of a moving robot, so that the robot can recognize human motions against a moving background.
Occlusion also occurs often in daily life: blocking objects may stand between the robot and the focused person who is making motions.
TSCDP also tolerates partial occlusion of the person seen by the robot.
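To illustrate the matching idea behind this family of methods, the following is a minimal one-dimensional sketch of Continuous Dynamic Programming (CDP) spotting, the classical precursor of TSCDP: a reference motion pattern is matched against a streaming input without pre-segmentation, since the start and end frames of a match are left free. This is a simplified illustration under assumed scalar features, not the authors' full two-dimensional time-space formulation; the function name, threshold, and local distance are hypothetical choices for the sketch.

```python
import numpy as np

def cdp_spotting(input_seq, ref_seq, threshold):
    """Minimal 1-D Continuous DP sketch: spot occurrences of ref_seq
    anywhere in input_seq without pre-segmentation (free start/end)."""
    T, R = len(input_seq), len(ref_seq)
    INF = np.inf
    # D[t, r]: minimum accumulated local distance for a warping path
    # that ends at input frame t and reference frame r.
    D = np.full((T, R), INF)
    for t in range(T):
        for r in range(R):
            d = abs(input_seq[t] - ref_seq[r])  # local distance
            if r == 0:
                # A match may start at any input frame: reset the path.
                D[t, r] = d
            else:
                prev = min(
                    D[t - 1, r - 1] if t >= 1 else INF,  # both advance
                    D[t - 1, r] if t >= 1 else INF,      # input advances
                    D[t, r - 1],                         # reference advances
                )
                D[t, r] = d + prev
    # Normalized score at each input frame for a completed reference;
    # local minima below the threshold indicate spotted motions.
    scores = D[:, R - 1] / R
    hits = [t for t in range(T) if scores[t] < threshold]
    return hits, scores
```

For example, embedding the reference pattern [1, 2, 3, 2, 1] inside a longer input makes the score drop to zero exactly at the frame where the embedded pattern ends, without any prior segmentation of the input.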
The attached picture shows the recognition of the motion "S" made by a focused person,
captured by the moving camera of a robot against a moving background in which people are crossing the scene.
The functions realized by our proposed algorithm seem quite difficult to achieve with so-called Deep
Learning, because Deep Learning is weak at recognizing motions captured by a moving
camera in a moving scene.
This research uses the algorithm proposed by the following paper:
 Yuki Niitsuma, Syunpei Torii, Yuichi Yaguchi & Ryuichi Oka: "Time-segmentation and position-free recognition of air-drawn gestures and characters in videos", Multimedia Tools and Applications, An International Journal, ISSN 1380-7501, Volume 75, Number 19, pp. 11615-11639.