Education, research and development in the field of IT and Robotics
World-class IT education in Russia in English
Innopolis University has 17 laboratories and 9 research centers, which conduct research in artificial intelligence, robotics, big data, software development, and information security.
The university’s project-based activities are aimed at implementing grant-based and commercial projects, as well as at enhancing the availability of education in IT areas.
Although deformable linear objects (DLOs), such as cables, are widely used across many fields and activities, robotic manipulation of these objects is considerably more complex than rigid-body manipulation and remains an open challenge. In this paper, we introduce a new framework in which two robotic arms cooperatively manipulate a DLO from an initial shape to a desired one. Based on visual servoing and computer vision techniques, a perception approach is proposed to detect the DLO and sample it as a set of virtual feature points. Then, a manipulation planning approach is introduced that maps the motion of the manipulators' end-effectors to the motion of the DLO points through a Jacobian matrix. To avoid excessive stretching of the DLO, the planning approach generates a path for each DLO point, forming profiles between the initial and desired shapes. It is guaranteed that all these inter-shape profiles are reachable and maintain the cable length constraint. The framework, including the aforementioned approaches, is validated in real-life experiments.
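To give a rough sense of the mapping described in the abstract, the short Python sketch below illustrates how end-effector increments can be pushed through an assumed Jacobian to predict feature-point displacements, and how a cable-length check could reject over-stretched intermediate profiles. The variable names, matrix shapes, and tolerance are illustrative assumptions, not the authors' implementation.

import numpy as np

# Minimal sketch (assumed shapes, not the paper's code):
# J maps the stacked incremental motion of two end-effectors (12 values)
# to the planar displacements of K virtual feature points on the DLO.

def dlo_point_displacements(J, dx_ee):
    """Approximate feature-point displacements: dP ~= J @ dx_ee.
    J     : (2K x 12) Jacobian (assumed shape).
    dx_ee : (12,) stacked incremental end-effector motion.
    Returns an array of shape (K, 2)."""
    return (J @ dx_ee).reshape(-1, 2)

def cable_length(points):
    """Polyline length of the sampled DLO shape (K x 2 array)."""
    segments = np.diff(points, axis=0)
    return np.linalg.norm(segments, axis=1).sum()

def within_length_constraint(points, nominal_length, tol=0.02):
    """Reject intermediate profiles that would over-stretch the cable;
    the 2% tolerance is an arbitrary illustrative value."""
    return cable_length(points) <= nominal_length * (1.0 + tol)

In this spirit, each intermediate shape profile between the initial and desired configurations would be checked against the length constraint before the corresponding end-effector motion is executed.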
Read the article
Authors:
Karam Almaghout (Robotics Institute, Innopolis University, k.almaghout@innopolis.university)
Alexandr Klimchik (Robotics Institute, Innopolis University, A.Klimchik@innopolis.ru)
in Proceedings of the Third International Conference Nonlinearity, Information and Robotics 2022, August 24, 2022