Comparison and development of grasp predictors for robotic Task and Motion Planning

Manipulating (picking, moving, placing) objects is one of the core functionalities of industrial robots. In order to manipulate an object, the robot first has to grasp it successfully. In bulk production, hard-coded grasping poses can be sufficient, but the demand for increasingly flexible robot systems, capable of rapidly adapting to different, previously unseen objects, is growing. Hand-crafted grasp poses are not sufficient in such applications, so automatic solutions are sought. Most current grasp detection pipelines rank candidates by a predicted probability of grasp success and select the best one. However, the easiest way to grasp an object may not be the best choice for every task (e.g. an obstructed best-grasp pose, no collision-free place pose for a given grasp pose, or special required motions such as insertion). A grasp predictor that can take such task (and robot system) constraints into account might be better suited for a real robot application. MoveIt Task Constructor provides a flexible framework for implementing robot manipulation tasks and carrying out the motion planning for them. The task constraints, environment, and robot setup are also easily accessible through MoveIt's functionality.
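The idea of rating grasps by task feasibility rather than by predicted success alone can be sketched as follows. This is a minimal illustration, not MoveIt or PointNetGPD code: the `GraspCandidate` fields, `task_aware_score`, and `best_grasp` are hypothetical placeholders for values a real pipeline would obtain from the grasp network and the motion planner.

```python
from dataclasses import dataclass

# Hypothetical structures for illustration -- not part of MoveIt or PointNetGPD.
@dataclass
class GraspCandidate:
    pose_id: str
    success_prob: float    # e.g. predicted by a network such as PointNetGPD
    reachable: bool        # robot kinematics allow reaching the grasp pose
    place_feasible: bool   # a collision-free place pose exists for this grasp

def task_aware_score(g: GraspCandidate) -> float:
    """Combine predicted grasp success with hard task/robot constraints."""
    if not (g.reachable and g.place_feasible):
        return 0.0  # infeasible for the task, regardless of grasp quality
    return g.success_prob

def best_grasp(candidates):
    """Return the highest-scoring task-feasible candidate, or None."""
    feasible = [g for g in candidates if task_aware_score(g) > 0.0]
    return max(feasible, key=task_aware_score) if feasible else None

candidates = [
    GraspCandidate("top",  0.95, reachable=True, place_feasible=False),
    GraspCandidate("side", 0.80, reachable=True, place_feasible=True),
]
# The grasp with the highest predicted success ("top") is ruled out
# by the task constraints, so the "side" grasp is selected instead.
print(best_grasp(candidates).pose_id)
```

In a real system the hard feasibility checks would come from the motion planner (e.g. reachability and place-pose queries via MoveIt), and softer constraints could enter as weighted penalties instead of a binary cut-off.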

The goal of the project is to implement a grasp detector in this framework (likely based on PointNetGPD) that also considers task constraints and the robot setup when rating the grasp candidates.

Detailed tasks:

  • Implement a grasp detector in the mentioned framework (likely based on PointNetGPD) that also considers task constraints and the robot setup when rating the grasp candidates;
  • Implement alternative grasp proposal methods for comparison, potentially helping with the development of the new grasp pose detection algorithm.
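Comparing the new detector against alternative grasp proposal methods could follow a simple evaluation pattern such as the one below. The method names and trial outcomes are illustrative placeholders, not results; the point is only the shape of the comparison (success rate per method over repeated trials).

```python
def success_rate(outcomes):
    """Fraction of trials in which the proposed grasp led to a
    successful, task-feasible pick (True) rather than a failure (False)."""
    return sum(outcomes) / len(outcomes) if outcomes else 0.0

# Placeholder per-method trial logs; a real study would collect these
# from simulation or hardware experiments.
trial_outcomes = {
    "task-aware detector": [True, True, False, True],
    "baseline sampler":    [True, False, False, True],
}

for method, outcomes in sorted(trial_outcomes.items()):
    print(f"{method}: {success_rate(outcomes):.2f}")
```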

Over the course of the project, the student will become involved with the various research projects of the Antal Bejczy Center for Intelligent Robotics.