Workshop on Movement Analytics and Gesture Recognition for Human-Machine Collaboration in Industry 4.0

The Workshop on Movement Analytics and Gesture Recognition for Human-Machine Collaboration in Industry 4.0 will be hosted by the 12th International Conference on Computer Vision Systems (ICVS 2019), which will be held in Thessaloniki, Greece, on 23-25 September 2019.

Context and general objectives

A collaborative robot is an autonomous machine able to share a workspace with a worker without physical barriers, in compliance with health and safety standards. Collaborative robotics has created the conditions for a Human-Robot Collaboration (HRC) that combines human intelligence with the power of the robot, following a simple criterion: the complementarity of skills. Nevertheless, in industry, "we always start with manual work", as the Executive Vice-President of Toyota has said. Today, even though significant progress has been made in training robots by demonstration, the simple automation of tasks within mixed workspaces remains a priority. Mixed workspaces, however, are not necessarily collaborative. For example, if a robot were able to anticipate the professional gestures of the worker, it could dynamically adapt its motion in space and time, and the worker could adopt more ergonomic "green postures".

Computer vision systems, together with recent progress in deep/machine learning, open a broad potential for innovation by re-thinking collaborative robots as real partners. The robot must be able not only to detect the human presence (e.g. a worker or maintenance engineer), but also to recognise and predict specific actions and/or gestures the worker performs (e.g. screwing, assembling, etc.). To achieve this goal, human pose estimation, object detection and scene understanding in general are beneficial for augmenting the perception level of the robot. Beyond humanoid robots, Automated Guided Vehicles in factories should also be able to detect human intentions (e.g. stop when a human is about to cross their motion trajectory, detect and identify collaborative workspaces, etc.) as well as understand human commands (e.g. whether or not to load a pallet, to return to the starting point, etc.).

Topics

This workshop will focus on the most recent advances in pose estimation, gesture recognition and movement analytics for Human-Machine Collaboration in Industry 4.0. It aims to bring together researchers from different disciplines, such as robotics, computer vision, data analysis, intelligent systems, ergonomics and intelligent vehicles, to share their experiences and discuss how these advances can benefit Human-Machine Collaboration.

Papers are solicited on all areas, including but not limited to the following research topics:

  • Deep learning for pose estimation
  • Human modelling
  • Professional gesture recognition
  • Scene understanding for smart workspaces
  • Vision-based automatic ergonomic assessments
  • Extraction and visualisation of movement analytics
  • Vision-based gestural interaction with automated guided vehicles or drones
  • Human-robot rhythmic interaction
  • Internet of Things and computer vision in Industry 4.0
  • Contactless robot learning through gestures
  • Human style learning for robotics
  • Benchmarks, methods and datasets for professional gestures
  • Gestures and bio-inspired systems and robots
  • Machine learning for human data
  • Augmented capabilities for workers and machines

Important dates

Paper submission: 10 June 2019
Notification of acceptance: 1 July 2019
Camera-ready papers: 15 July 2019
Conference: 23-25 September 2019

Procedure

Please email your submission by 23:59 CET on 10 June 2019, following the instructions below:

  • Submission to: info@aimove.eu
  • Subject: “Movement analytics and gesture recognition for Human-Machine Collaboration in Industry 4.0”
  • Include authors' names, affiliations and contact information
  • Attached file: PDF
  • Additional links or illustrations are welcome.

At least one author of an accepted submission is required to attend the workshop and must register for the main ICVS conference. Accepted papers will be published in the adjunct conference proceedings.

Workshop Organizers

  • Sotiris Manitsaris, Senior Researcher, S/T Project Leader, Centre for Robotics, MINES ParisTech, PSL Université Paris
  • Alina Glushkova, Postdoctoral Fellow, Centre for Robotics, MINES ParisTech, PSL Université Paris
  • Dimitrios Menychtas, Postdoctoral Fellow, Centre for Robotics, MINES ParisTech, PSL Université Paris


Acknowledgements

We acknowledge support from the CoLLaboratE project (H2020-FoF, grant agreement No. 820767), which is funded by the EU’s Horizon 2020 Research and Innovation Programme.