Workshop on Movement Analytics and Gesture Recognition for Human-Machine Collaboration in Industry 4.0

The Workshop on Movement Analytics and Gesture Recognition for Human-Machine Collaboration in Industry 4.0 will be hosted by the 12th International Conference on Computer Vision Systems (ICVS 2019), which will be held in Thessaloniki, Greece, on 23-25 September 2019.

Context and general objectives

A collaborative robot is an autonomous machine that can share a workspace with a worker without physical barriers, while following health and safety standards. Collaborative robotics has created the right conditions for designing Human-Robot Collaboration (HRC) that combines human intelligence with the power of the robot, following a simple criterion: the complementarity of skills. Nevertheless, in industry, "we always start with manual work", as the Executive Vice-President of Toyota put it. Today, even though significant progress has been made in training robots by demonstration, the simple automation of tasks within mixed workspaces remains a priority. However, mixed workspaces are not necessarily collaborative. For example, if a robot can anticipate the professional gestures of the worker, then it can adapt its motion dynamically in space and time, and the worker can adopt more ergonomic "green" postures.

Computer vision systems, together with recent progress in deep and machine learning, open up broad potential for innovation by re-thinking collaborative robots as real partners. The robot must be able not only to detect human presence (e.g. a worker or maintenance engineer), but also to recognise and predict the specific actions and/or gestures the worker performs (e.g. screwing, assembling). To achieve this, human pose estimation, object detection and scene understanding in general are beneficial for augmenting the robot's level of perception. Beyond humanoid robots, Automated Guided Vehicles in factories should also be able to detect human intentions (e.g. stopping when a human is about to cross their motion trajectory, detecting and identifying collaborative workspaces) as well as understand human commands (e.g. whether or not to load a pallet, or to return to the starting point).
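
The kind of perception pipeline described above can be prototyped quite simply once pose keypoints are available. The following is a minimal, illustrative sketch (not a method endorsed by the workshop): it classifies short windows of 2D pose keypoints into gestures with a nearest-centroid rule; the gesture labels, function names and feature choice below are hypothetical.

```python
import numpy as np

# Hypothetical gesture vocabulary for an assembly workstation (illustration only).
GESTURES = ["screw", "assemble", "idle"]

def window_features(keypoints):
    """Turn a window of 2D pose keypoints into a feature vector.

    keypoints: array of shape (T, J, 2) -- T frames, J joints, (x, y) each.
    Centring and scaling make the features invariant to where the worker
    stands and to camera distance.
    """
    centred = keypoints - keypoints.mean(axis=(0, 1), keepdims=True)
    scale = np.linalg.norm(centred, axis=-1).max() + 1e-8
    return (centred / scale).reshape(-1)

def train_centroids(windows, labels):
    """Nearest-centroid "training": average the feature vectors per gesture."""
    feats = np.stack([window_features(w) for w in windows])
    labels = np.array(labels)
    return {g: feats[labels == g].mean(axis=0) for g in set(labels)}

def classify(window, centroids):
    """Assign the gesture whose centroid is closest to the observed window."""
    f = window_features(window)
    return min(centroids, key=lambda g: np.linalg.norm(f - centroids[g]))
```

In practice the keypoints would come from an off-the-shelf pose estimator and the classifier would be replaced by a sequence model, but the interface (windows of keypoints in, gesture labels out) stays the same.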

Topics

This workshop will focus on the most recent advances in pose estimation, gesture recognition and movement analytics for Human-Machine Collaboration in Industry 4.0. It aims to bring together researchers from different disciplines, such as robotics, computer vision, data analysis, intelligent systems, ergonomics and intelligent vehicles, to share their experiences on these aspects and on how they can benefit Human-Machine Collaboration.

Papers are solicited on all areas, including but not limited to the following research topics:

  • Deep learning for pose estimation
  • Human modelling
  • Professional gesture recognition
  • Scene understanding for smart workspaces
  • Vision-based automatic ergonomic assessments
  • Extraction and visualisation of movement analytics
  • Vision-based gestural interaction with automated guided vehicles or drones
  • Human-robot rhythmic interaction
  • Internet of Things and computer vision in Industry 4.0
  • Contactless robot learning through gestures
  • Human style learning for robotics
  • Benchmarks, methods and datasets for professional gestures
  • Gestures and bio-inspired systems and robots
  • Machine learning for human data
  • Augmented capabilities for workers and machines

Keynote Speaker

Patrick Hénaff received an M.S. degree in Robotics from Pierre et Marie Curie University (Paris 6) in 1989, and a Ph.D. in Robotics in 1994 from the Paris Robotics Laboratory, Pierre et Marie Curie University. From 1997 to 2013, he was an Associate Professor at the Institute of Technology of the University of Cergy-Pontoise. He worked as a researcher at the LISV (Systems Engineering Laboratory of the University of Versailles) from 1997 to 2009. From 2009 to August 2013 he was a researcher at the ETIS lab, University of Cergy-Pontoise, CNRS UMR 8051.

Since September 2013, Patrick Hénaff has been a full Professor at the École des Mines de Nancy (computer science department), Institut Mines-Télécom, University of Lorraine. He works as a researcher in the Neurorhythms team at the LORIA lab (CNRS UMR 7503). He is also head of the "Complex Systems, Artificial Intelligence and Robotics" department at LORIA and head of the Computer Science department of Mines Nancy.

His research interests concern bio-inspired control in robotics. The aim is to better understand the learning mechanisms involved in low-level motor control in humans, to model them and to integrate them into robot controllers so as to give robots the ability to learn movements. In particular, he develops computational models of adaptive neural networks dedicated to controlling the rhythmic movements of humanoid robots during their physical and/or social interactions with humans, or to better controlling their walking gaits.

Keynote Presentation Abstract

Vision-based Human-Robot motor coordination using adaptive central pattern generators
(30-minute talk)

The talk concerns the design of adaptive neural controllers for humanoid robots, to make them able to learn to interact appropriately with humans. These controllers are inspired by biological structures located in the spinal cord, called central pattern generators (CPGs), which are dedicated to the genesis of rhythmic movements. The CPGs and the plasticity mechanisms they incorporate allow interlimb motor coordination and interpersonal synchronization in human interactions.

The first difficulty of this approach to controlling humanoids concerns the CPG model, which must behave like a chaotic oscillator in order to synchronize with an external signal, and the determination of efficient proprioceptive and/or exteroceptive feedback to create this signal. The second difficulty concerns the plasticity mechanisms that can be incorporated into the CPG to allow the robot to learn motor coordination when it interacts with humans, particularly through rhythmic tasks.

The presentation will focus on these issues. We will show how optic flow and plastic CPGs can be used to make humanoid robots learn motor coordination with a human partner performing various rhythmic movements, and consequently trigger the emergence of synchrony.
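
As a self-contained illustration of the entrainment principle behind such controllers (a minimal sketch, not the model presented in the talk), the following adaptive Hopf oscillator adjusts its intrinsic frequency until it phase-locks to an external rhythmic signal, such as one extracted from optic flow; all parameter values are assumptions chosen for the example.

```python
import numpy as np

def adaptive_hopf(signal, dt=0.01, mu=1.0, gamma=8.0, eps=0.9, omega0=3.0):
    """Adaptive-frequency Hopf oscillator entrained by an external signal.

    signal: 1-D array sampled every dt seconds (e.g. a rhythmic feature
            extracted from optic flow). The intrinsic frequency `omega`
            adapts until the oscillator phase-locks to the input.
    Returns the oscillator output x(t) and the learned frequency omega(t).
    """
    x, y, omega = 1.0, 0.0, omega0
    xs, omegas = [], []
    for f in signal:
        r2 = x * x + y * y
        # Hopf limit-cycle dynamics with the external signal injected on x.
        dx = gamma * (mu - r2) * x - omega * y + eps * f
        dy = gamma * (mu - r2) * y + omega * x
        # Frequency adaptation: pull omega toward the input's frequency.
        domega = -eps * f * y / (np.sqrt(r2) + 1e-9)
        x, y, omega = x + dt * dx, y + dt * dy, omega + dt * domega
        xs.append(x)
        omegas.append(omega)
    return np.array(xs), np.array(omegas)

# Example: entrain to a 1 Hz "human" rhythm; omega should converge near 2*pi rad/s.
t = np.arange(0.0, 120.0, 0.01)
x_out, omega_t = adaptive_hopf(np.sin(2 * np.pi * 1.0 * t))
print(f"learned frequency ~ {omega_t[-1]:.2f} rad/s (target ~ {2 * np.pi:.2f})")
```

Plasticity in this sense simply means that the frequency (and, in richer models, the coupling) of the oscillator is updated online from the sensory signal, so the robot's rhythm drifts toward the partner's rhythm instead of being fixed in advance.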

Several videos of simulations and experiments will illustrate the presentation, and the talk will close with conclusions and perspectives.

Presentation Keywords: Humanoid robotics, neural control, Central Pattern Generator, sensorimotor coordination, synchronization, motor coordination.

Important dates

EXTENDED - Paper Submission: 30 June 2019
EXTENDED - Notification of acceptance: 10 July 2019
Camera-Ready papers: 15 July 2019
Conference : 23-25 September 2019

Procedure

Please email your submission before 30 June 2019, 23:59 CET (extended deadline), following the instructions below:

  • Submission to: info@aimove.eu
  • Subject: “Movement analytics and gesture recognition for Human-Machine Collaboration in Industry 4.0”
  • Include authors' names, affiliations and contact information
  • Attached file: PDF
  • Additional links or illustrations are welcome.

At least one author of an accepted submission is required to attend the workshop and must register for the main ICVS conference. Accepted papers will be published in the adjunct conference proceedings.

Workshop Organizers

  • Sotiris Manitsaris, Senior Researcher, S/T Project Leader, Centre for Robotics, MINES ParisTech, PSL University
  • Alina Glushkova, Postdoctoral Fellow, Centre for Robotics, MINES ParisTech, PSL University
  • Dimitrios Menychtas, Postdoctoral Fellow, Centre for Robotics, MINES ParisTech, PSL University


Acknowledgements

We acknowledge support from the CoLLaboratE project (H2020-FoF, grant agreement No. 820767), which is funded by the EU’s Horizon 2020 Research and Innovation Programme.