The WUR Agro Food Robotics initiative is a joint program of several research groups at Wageningen University & Research. The program aims to bring new knowledge into practice through feasibility studies, functional designs, prototype development, testing, validation, and support for new product implementations.
In recent decades, food production in greenhouses has been confronted with growing production facilities, increasing labor demands, and rising consumer expectations for product quality. Many operations, such as harvesting, are still performed manually. However, the availability of a skilled workforce willing to accept repetitive tasks under harsh greenhouse climate conditions is decreasing rapidly. Robotics and sensing technologies offer an alternative that makes crop production more efficient and more sustainable.
Within the scope of the European research project "Clever Robots for Crops" (CROPS) and the subsequent "Sweet Pepper Robot" project (SWEEPER), the WUR developed a robot to pick sweet peppers. The prototype comprises the following modules: a tool to cut and catch the pepper; a combined color and 3D camera; an industrial six-degrees-of-freedom robot arm; and computers and electronics, all mounted on a battery-powered platform that moves the robot autonomously through the greenhouse. Once the camera system has located a ripe pepper, the robot arm positions the tool on top of the crop stem. The arm then moves the tool a few centimeters down, and a vibrating knife cuts the pepper from the plant near the main stem.
Object detection with MVTec HALCON
A central function of the SWEEPER robot is the detection of ripe crops. For successful operation, the 3D location of each crop must be determined with high accuracy. The chosen solution is based on an RGB-D camera that simultaneously reports color and depth information. Using this camera and a custom-built LED-based flash illumination system, RGB images of the plant are acquired both from overview distance and at close range. To enable high frame-rate operation, a straightforward shape- and color-based detection algorithm was implemented using HALCON. The algorithm scans each acquired image for regions matching the target color thresholds. Detected regions are then refined by discarding those that fall outside predefined minimum/maximum size bounds. To remove further misdetections, additional shape parameters are calculated. Finally, the camera's depth information is used to compute the volume of the detected regions. This information is then used to prune further false detections, avoid non-harvestable crop clusters, and define harvest priorities. The exact 3D location of the center of mass is calculated from the depth information extracted for the detected region using a standard pixel-to-world transformation. Given the subset of regions classified as peppers to be harvested, a harvest-sequencing methodology was defined. The robot arm then approaches the target under visual-servo control, which keeps the target in the center of the images until it is reached.
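The detection pipeline described above (color thresholding, size-based region pruning, and pixel-to-world back-projection of the region's center of mass) can be sketched in Python. Note that the actual SWEEPER implementation uses HALCON operators; the thresholds, camera intrinsics, and size bounds below are illustrative placeholders, not values from the project:

```python
import numpy as np

# Illustrative parameters only -- the real system calibrates these.
FX, FY = 600.0, 600.0          # focal lengths in pixels
CX, CY = 64.0, 48.0            # principal point in pixels
MIN_AREA, MAX_AREA = 20, 5000  # allowed region size in pixels

def color_mask(rgb):
    """Mark pixels whose red channel dominates green and blue (ripe-red heuristic)."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (r > 120) & (r > g + 40) & (r > b + 40)

def connected_regions(mask):
    """Label 4-connected components with an explicit stack (no external deps)."""
    labels = np.zeros(mask.shape, dtype=int)
    h, w = mask.shape
    current = 0
    for y in range(h):
        for x in range(w):
            if mask[y, x] and labels[y, x] == 0:
                current += 1
                stack = [(y, x)]
                labels[y, x] = current
                while stack:
                    cy, cx = stack.pop()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and mask[ny, nx] and labels[ny, nx] == 0:
                            labels[ny, nx] = current
                            stack.append((ny, nx))
    return labels, current

def region_centroid_3d(labels, label, depth):
    """Back-project the region's pixels with the pinhole model and average them."""
    ys, xs = np.nonzero(labels == label)
    z = depth[ys, xs]
    x3d = (xs - CX) * z / FX
    y3d = (ys - CY) * z / FY
    return np.array([x3d.mean(), y3d.mean(), z.mean()])

def detect_peppers(rgb, depth):
    """Return (area, 3D centroid) for each color region that passes the size filter."""
    labels, n = connected_regions(color_mask(rgb))
    hits = []
    for lbl in range(1, n + 1):
        area = int((labels == lbl).sum())
        if MIN_AREA <= area <= MAX_AREA:  # size-based pruning of misdetections
            hits.append((area, region_centroid_3d(labels, lbl, depth)))
    return hits
```

For brevity the sketch filters on region size only; the additional shape parameters and volume checks mentioned above would be further predicates applied per region in `detect_peppers` before a candidate is accepted.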
You can download the full success story here.