Ciklum's most important contribution to RoBUTCHER is the development of the computer vision algorithms that predict the cutting trajectories for the primal cuts of the pig carcass. Our team participated in the experiment design, data collection and annotation, and created the framework for training the deep neural networks that generate the path plan and trajectories.
By: Anton Popov, Ciklum
The AI model for predicting the cuts is continuously updated based on new data from the workshops. The sequence and the parameters of the cuts were defined: e.g. for the shoulder, they are the shoulder neck, shoulder ribs and shoulder chest cuts, followed by several internal shoulder chest cuts. The data was relabelled to avoid collisions with the shoulder blade and with the other robot, and the model was retrained on the updated data.
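As an illustration, the shoulder cut sequence described above could be represented as a simple ordered configuration. The structure, names and the two internal cuts listed here are assumptions for illustration only, not the project's actual configuration format:

```python
# Hypothetical encoding of the shoulder cut sequence mentioned in the text;
# the real RoBUTCHER configuration format is not shown in this article.
SHOULDER_CUT_SEQUENCE = [
    "shoulder_neck",
    "shoulder_ribs",
    "shoulder_chest",
    "shoulder_chest_internal_1",  # illustrative: the article only says "several"
    "shoulder_chest_internal_2",
]

def next_cut(completed):
    """Return the next cut to execute, or None once the sequence is finished."""
    for cut in SHOULDER_CUT_SEQUENCE:
        if cut not in completed:
            return cut
    return None
```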
The Ciklum team implemented not only deep learning solutions but also post-processing algorithms for the predicted path plans, to make the predictions more robust. The post-processing consists of:
- skeletonization of the neural network prediction using Lee's method;
- approximation of the skeletonized mask for each camera by a set of straight lines using the Iterative Hough Transform algorithm;
- combining the estimated lines into a polyline representing the rough cutting trajectory;
- detailing the rough cutting trajectory with a larger number of points;
- projecting these points onto the pig carcass in 3D space;
- merging trajectories obtained from different views into a single trajectory where needed.
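The line-fitting step can be sketched as follows. This is a simplified, pure-NumPy illustration of a greedy iterative Hough transform (vote all points into a (theta, rho) accumulator, take the strongest line, remove its inliers, repeat); the function name and parameters are assumptions, and the project's actual implementation operating on real skeletonized masks is not shown here.

```python
import numpy as np

def iterative_hough_lines(points, n_lines=2, n_theta=180, dist_tol=1.5):
    """Greedy iterative Hough transform sketch (illustrative only).

    points: iterable of (x, y) pixel coordinates, e.g. from a skeletonized mask.
    Returns a list of (theta, rho) line parameters, where
    x*cos(theta) + y*sin(theta) = rho.
    """
    pts = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    lines = []
    for _ in range(n_lines):
        if len(pts) < 2:
            break
        # rho for every (point, angle) pair, rounded to integer accumulator bins
        rhos = pts[:, 0:1] * np.cos(thetas) + pts[:, 1:2] * np.sin(thetas)
        r = np.rint(rhos).astype(int)
        rho_min = int(r.min())
        acc = np.zeros((n_theta, int(r.max()) - rho_min + 1), dtype=int)
        for ti in range(n_theta):
            np.add.at(acc[ti], r[:, ti] - rho_min, 1)
        # strongest line = accumulator cell with the most votes
        ti, ri = np.unravel_index(int(acc.argmax()), acc.shape)
        theta, rho = float(thetas[ti]), float(ri + rho_min)
        lines.append((theta, rho))
        # drop the inliers of this line before searching for the next one
        d = np.abs(pts[:, 0] * np.cos(theta) + pts[:, 1] * np.sin(theta) - rho)
        pts = pts[d > dist_tol]
    return lines

# Toy "skeleton": an L-shaped set of pixels (a horizontal and a vertical segment)
skeleton_pixels = [(x, 0) for x in range(11)] + [(10, y) for y in range(1, 11)]
lines = iterative_hough_lines(skeleton_pixels)
```

On this toy input the sketch recovers the vertical segment (theta near 0, rho near 10) and the horizontal one (theta near pi/2, rho near 0), which would then be joined into a polyline.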
Certain manipulations are also carried out on the trajectory to ensure the correct and stable operation of the robots:
- adjusting the edges of the trajectory to the boundaries of the carcass using a depth map and an RGB edge detector;
- setting the orientation of the trajectory;
- deepening the trajectory inside the pig to the specified depth values;
- discretizing the merged trajectory with a predefined step value;
- estimating, at each point of the trajectory, the vectors describing the knife orientation in space;
- adding approach points at the trajectory edges;
- aligning the trajectory by fixing a specific coordinate to a constant value along the entire trajectory;
- rotating the knife orientation in a specific plane;
- setting a static depth for the cutting trajectory;
- extending the trajectory at the edges;
- providing multiple passes of the knife along the trajectory.
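The discretization and orientation-estimation steps can be illustrated with a minimal sketch: resampling a 3D polyline at a fixed arc-length step and estimating a unit tangent at each sample. The function name is an assumption, and the real pipeline derives full knife-orientation vectors rather than just tangents:

```python
import numpy as np

def resample_polyline(points, step):
    """Discretize a 3D polyline at a fixed arc-length step and estimate a
    unit tangent direction at each sample (illustrative sketch only)."""
    pts = np.asarray(points, dtype=float)
    seg = np.diff(pts, axis=0)                    # per-segment vectors
    seg_len = np.linalg.norm(seg, axis=1)         # per-segment lengths
    cum = np.concatenate([[0.0], np.cumsum(seg_len)])  # cumulative arc length
    s = np.arange(0.0, cum[-1] + 1e-9, step)      # fixed-step sample positions
    samples = np.empty((len(s), pts.shape[1]))
    for i, si in enumerate(s):
        j = np.searchsorted(cum, si, side="right") - 1
        j = min(j, len(seg) - 1)
        t = (si - cum[j]) / seg_len[j]            # position within segment j
        samples[i] = pts[j] + t * seg[j]
    # finite-difference tangents, normalized to unit length
    tangents = np.gradient(samples, s, axis=0)
    tangents /= np.linalg.norm(tangents, axis=1, keepdims=True)
    return samples, tangents

# Straight 10-unit segment sampled every 2 units of arc length
samples, tangents = resample_polyline([(0, 0, 0), (10, 0, 0)], step=2.0)
```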
Ciklum also developed an AI solution that locates the gripping point on the pig limb and sends it to the robot, so that the robot can grip the shoulder and ham of the pig and assist in the automated cutting. The Ciklum team also participated in the MFC integration activities and in testing the performance of the MFC after its transfer to MRI. The knowledge gained in RoBUTCHER will be used in future projects developing customized AI/CV solutions in various domains where high variability of the objects and a lack of experimental data are the main challenges.
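As a purely illustrative sketch of what locating a gripping point on a limb might involve (the actual RoBUTCHER solution is a learned AI model; none of the names or the heuristic below come from the project), one simple geometric baseline picks the limb-mask pixel lying farthest along the mask's principal axis:

```python
import numpy as np

def grip_point_from_mask(mask):
    """Hypothetical geometric heuristic: return the (x, y) pixel of a binary
    limb mask that lies farthest along the mask's principal axis."""
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    centered = pts - pts.mean(axis=0)
    # dominant eigenvector of the covariance matrix = limb's principal axis
    _, vecs = np.linalg.eigh(np.cov(centered.T))
    axis = vecs[:, -1]                      # eigenvector of largest eigenvalue
    proj = centered @ axis                  # signed position along the axis
    best = pts[np.argmax(np.abs(proj))]     # most extreme pixel along the axis
    return int(best[0]), int(best[1])

# Toy elongated horizontal "limb" mask
mask = np.zeros((10, 30))
mask[4:6, :] = 1
p = grip_point_from_mask(mask)
```

On this toy mask the heuristic returns a pixel at one end of the strip; a real system would also have to choose which end to grip and map the pixel into robot coordinates.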
The figures below show examples of the trajectories predicted by Ciklum’s AI, visualized inside the 3D image of the pig carcass in the Ocellus environment.