Increasing Human-Robot Awareness for a better cooperation in a collaborative environment
Niccolò Lucci
PhD Student
DEIB - Conference Room "E. Gatti" (Building 20)
May 17th, 2023
12.10 pm
Contacts:
Simone Formentin
Research Line:
Control systems
Abstract
On May 17th, 2023 at 12.10 pm Niccolò Lucci, PhD Student in Information Technology, will give a seminar on "Increasing Human-Robot Awareness for a better cooperation in a collaborative environment" in the DEIB Conference Room "E. Gatti" (Building 20).
Collaborative robotics is a growing trend in the automation industry and is expected to keep growing in the coming years. Up to now, companies have mostly used collaborative robots (cobots) as traditional robots without fences: since they are lightweight and have rounded shapes, they cannot harm the human operator.
Unfortunately, this is a very sub-optimal use of cobots, since their advantages are not limited to safety: they also bring high flexibility to the production process. This means they can be combined with the dexterity of a human to achieve the most flexible system possible and, at the same time, can be easily reprogrammed by a non-skilled operator. This research aims at making human-robot interaction more integrated, both from a robot programming point of view, making the robot intuitive to use, and from a collaborative point of view, making the collaboration as natural and seamless as possible. This is made possible by a complete digitalization of all the components of the workspace: the robot, the human, the manipulated objects, and the executed tasks. It is indeed paramount to give the robotic agent more awareness, both of the environment it is in (humans, objects, ...) and of the task it needs to perform (programming). Consequently, it is necessary to gather all the information available from the workspace or from human task demonstrations and digitalize it to enhance the robot's perception. In light of the above, this research project focuses on creating an architecture that makes human-robot collaboration easier to achieve and, at the same time, provides a straightforward interface to reprogram the robot to perform a different task.