hand_eye_calibration (Operator)

Name

hand_eye_calibration — Perform a hand-eye calibration.

Signature

hand_eye_calibration( : : X, Y, Z, Row, Col, NumPoints, RobotPoses, CameraParam, Method, QualityType : CameraPose, CalibrationPose, Quality)

Herror T_hand_eye_calibration(const Htuple X, const Htuple Y, const Htuple Z, const Htuple Row, const Htuple Col, const Htuple NumPoints, const Htuple RobotPoses, const Htuple CameraParam, const Htuple Method, const Htuple QualityType, Htuple* CameraPose, Htuple* CalibrationPose, Htuple* Quality)

void HandEyeCalibration(const HTuple& X, const HTuple& Y, const HTuple& Z, const HTuple& Row, const HTuple& Col, const HTuple& NumPoints, const HTuple& RobotPoses, const HTuple& CameraParam, const HTuple& Method, const HTuple& QualityType, HTuple* CameraPose, HTuple* CalibrationPose, HTuple* Quality)

HPose HCamPar::HandEyeCalibration(const HTuple& X, const HTuple& Y, const HTuple& Z, const HTuple& Row, const HTuple& Col, const HTuple& NumPoints, const HPoseArray& RobotPoses, const HString& Method, const HTuple& QualityType, HPose* CalibrationPose, HTuple* Quality) const

HPose HCamPar::HandEyeCalibration(const HTuple& X, const HTuple& Y, const HTuple& Z, const HTuple& Row, const HTuple& Col, const HTuple& NumPoints, const HPoseArray& RobotPoses, const HString& Method, const HString& QualityType, HPose* CalibrationPose, double* Quality) const

HPose HCamPar::HandEyeCalibration(const HTuple& X, const HTuple& Y, const HTuple& Z, const HTuple& Row, const HTuple& Col, const HTuple& NumPoints, const HPoseArray& RobotPoses, const char* Method, const char* QualityType, HPose* CalibrationPose, double* Quality) const

HPose HCamPar::HandEyeCalibration(const HTuple& X, const HTuple& Y, const HTuple& Z, const HTuple& Row, const HTuple& Col, const HTuple& NumPoints, const HPoseArray& RobotPoses, const wchar_t* Method, const wchar_t* QualityType, HPose* CalibrationPose, double* Quality) const   (Windows only)

static HPose HPose::HandEyeCalibration(const HTuple& X, const HTuple& Y, const HTuple& Z, const HTuple& Row, const HTuple& Col, const HTuple& NumPoints, const HPoseArray& RobotPoses, const HCamPar& CameraParam, const HString& Method, const HTuple& QualityType, HPose* CalibrationPose, HTuple* Quality)

static HPose HPose::HandEyeCalibration(const HTuple& X, const HTuple& Y, const HTuple& Z, const HTuple& Row, const HTuple& Col, const HTuple& NumPoints, const HPoseArray& RobotPoses, const HCamPar& CameraParam, const HString& Method, const HString& QualityType, HPose* CalibrationPose, double* Quality)

static HPose HPose::HandEyeCalibration(const HTuple& X, const HTuple& Y, const HTuple& Z, const HTuple& Row, const HTuple& Col, const HTuple& NumPoints, const HPoseArray& RobotPoses, const HCamPar& CameraParam, const char* Method, const char* QualityType, HPose* CalibrationPose, double* Quality)

static HPose HPose::HandEyeCalibration(const HTuple& X, const HTuple& Y, const HTuple& Z, const HTuple& Row, const HTuple& Col, const HTuple& NumPoints, const HPoseArray& RobotPoses, const HCamPar& CameraParam, const wchar_t* Method, const wchar_t* QualityType, HPose* CalibrationPose, double* Quality)   (Windows only)

static void HOperatorSet.HandEyeCalibration(HTuple x, HTuple y, HTuple z, HTuple row, HTuple col, HTuple numPoints, HTuple robotPoses, HTuple cameraParam, HTuple method, HTuple qualityType, out HTuple cameraPose, out HTuple calibrationPose, out HTuple quality)

HPose HCamPar.HandEyeCalibration(HTuple x, HTuple y, HTuple z, HTuple row, HTuple col, HTuple numPoints, HPose[] robotPoses, string method, HTuple qualityType, out HPose calibrationPose, out HTuple quality)

HPose HCamPar.HandEyeCalibration(HTuple x, HTuple y, HTuple z, HTuple row, HTuple col, HTuple numPoints, HPose[] robotPoses, string method, string qualityType, out HPose calibrationPose, out double quality)

static HPose HPose.HandEyeCalibration(HTuple x, HTuple y, HTuple z, HTuple row, HTuple col, HTuple numPoints, HPose[] robotPoses, HCamPar cameraParam, string method, HTuple qualityType, out HPose calibrationPose, out HTuple quality)

static HPose HPose.HandEyeCalibration(HTuple x, HTuple y, HTuple z, HTuple row, HTuple col, HTuple numPoints, HPose[] robotPoses, HCamPar cameraParam, string method, string qualityType, out HPose calibrationPose, out double quality)

Description

The operator hand_eye_calibration determines the 3D pose of a robot (“hand”) relative to a camera (“eye”). With this information, the results of image processing can be transformed into the coordinate system of the robot, which can then, e.g., grasp an inspected part. Please note that hand_eye_calibration does not support 3D sensors. A hand-eye calibration including 3D sensors is only supported by the operator calibrate_hand_eye. That operator furthermore provides a more user-friendly interface to the hand-eye calibration, since the reference coordinate systems are indicated explicitly.

There are two possible configurations of robot-camera (hand-eye) systems: The camera can either be mounted on the robot or be stationary and observe the robot. Note that the term robot is used here for any mechanism that moves objects. Thus, you can use hand_eye_calibration to calibrate many different systems, from pan-tilt heads to multi-axis manipulators.

In essence, systems suitable for hand-eye calibration are described by a closed chain of four Euclidean transformations. In this chain, two non-consecutive transformations are either known from the robot controller or computed from calibration points observed by the camera system. The two remaining, constant transformations are computed by the hand-eye calibration procedure.

A hand-eye calibration is performed similarly to the calibration of the external camera parameters (see camera_calibration): You acquire a set of images of a calibration object, determine correspondences between known calibration points and their projections in the images, and pass them to hand_eye_calibration via the parameters X, Y, Z, Row, Col, and NumPoints. If you use the standard calibration plate, the correspondences can be determined very easily with the operators find_caltab and find_marks_and_pose. Furthermore, the camera is identical for the complete calibration sequence and is specified by the internal camera parameters in CameraParam. The internal camera parameters are calibrated beforehand using the operator calibrate_cameras or camera_calibration.

In contrast to the camera calibration, the calibration object is not moved manually. This task is delegated to the robot, which either moves the camera (mounted camera) or the calibration object (stationary camera). The robot's movements are assumed to be known and are therefore also used as input for the calibration (parameter RobotPoses).

The output of hand_eye_calibration consists of the two poses CameraPose and CalibrationPose. Their pose type is identical to the pose type of the first input robot pose.
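If your application requires a different pose type, the results can be converted with convert_pose_type. A minimal sketch (the chosen type codes 'Rp+T', 'gba', 'point' are only an example):

* Convert the resulting camera pose to an explicitly chosen pose type.
convert_pose_type (CameraPose, 'Rp+T', 'gba', 'point', CameraPoseConverted)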

Basically, two hand-eye configurations can be distinguished and are discussed in more detail below, followed by general information about the process of hand-eye calibration.

Moving camera (mounted on a robot)

In this configuration, the calibration object remains stationary and the camera is moved to different positions by the robot. The main idea behind the hand-eye calibration is that the information extracted from a calibration image, i.e., the pose of the calibration object relative to the camera (the external camera parameters), can be seen as a chain of poses or homogeneous transformation matrices, from the calibration object via the base of the robot to its tool (end effector) and finally to the camera:

Moving camera:  cam_H_cal  =  cam_H_tool  *  tool_H_base  *  base_H_cal
                                  |              |               |
                             CameraPose     RobotPoses    CalibrationPose

From the set of calibration images, the operator hand_eye_calibration determines the two transformations at the ends of the chain, i.e., the pose of the robot tool in camera coordinates (cam_H_tool, CameraPose) and the pose of the calibration object in the robot base coordinate system (base_H_cal, CalibrationPose).

In contrast, the transformation in the middle of the chain, tool_H_base, is known but changes for each calibration image, because it describes the pose of the robot moving the camera, or to be more exact its inverse pose (the pose of the base coordinate system in robot tool coordinates). You must specify these inverse robot poses for the calibration images in the parameter RobotPoses.

Note that when calibrating SCARA robots, it is not possible to determine the Z translation of CalibrationPose. To eliminate this ambiguity, the Z translation of CalibrationPose is internally set to 0.0 and CameraPose is calculated accordingly. It is necessary to determine the true translation in Z after the calibration (see calibrate_hand_eye).
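To illustrate how the calibration results are typically used in this configuration, the following sketch transforms an object pose measured in camera coordinates into robot base coordinates. The variables BaseToolPose (the robot pose base_H_tool at acquisition time) and ObjInCamPose (cam_H_obj, e.g., obtained from a pose estimation step) are illustrative assumptions:

* Moving camera: base_H_obj = base_H_tool * tool_H_cam * cam_H_obj,
* where tool_H_cam is the inverse of CameraPose (cam_H_tool).
pose_to_hom_mat3d (CameraPose, cam_H_tool)
hom_mat3d_invert (cam_H_tool, tool_H_cam)
pose_to_hom_mat3d (BaseToolPose, base_H_tool)
pose_to_hom_mat3d (ObjInCamPose, cam_H_obj)
hom_mat3d_compose (base_H_tool, tool_H_cam, base_H_cam)
hom_mat3d_compose (base_H_cam, cam_H_obj, base_H_obj)
hom_mat3d_to_pose (base_H_obj, ObjInBasePose)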

Stationary camera

In this configuration, the robot grasps the calibration object and moves it in front of the camera. Again, the information extracted from a calibration image, i.e., the pose of the calibration object in camera coordinates (the external camera parameters), is equal to a chain of poses or homogeneous transformation matrices, this time from the calibration object via the robot's tool to its base and finally to the camera:

Stationary camera:  cam_H_cal  =  cam_H_base  *  base_H_tool  *  tool_H_cal
                                      |              |               |
                                 CameraPose     RobotPoses    CalibrationPose

Analogously to the configuration with a moving camera, the operator hand_eye_calibration determines the two transformations at the ends of the chain, here the pose of the robot base coordinate system in camera coordinates (cam_H_base, CameraPose) and the pose of the calibration object relative to the robot tool (tool_H_cal, CalibrationPose).

The transformation in the middle of the chain, base_H_tool, describes the pose of the robot moving the calibration object, i.e., the pose of the tool relative to the base coordinate system. You must specify the robot poses for the calibration images in the parameter RobotPoses.

Note that when calibrating SCARA robots, it is not possible to determine the Z translation of CalibrationPose. To eliminate this ambiguity, the Z translation of CalibrationPose is internally set to 0.0 and CameraPose is calculated accordingly. It is necessary to determine the true translation in Z after the calibration (see calibrate_hand_eye).
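Analogously to the moving-camera sketch above, an object pose measured in camera coordinates can be transformed into robot base coordinates using the inverse of CameraPose (cam_H_base). ObjInCamPose (cam_H_obj) is again an illustrative assumption:

* Stationary camera: base_H_obj = base_H_cam * cam_H_obj,
* where base_H_cam is the inverse of CameraPose (cam_H_base).
pose_to_hom_mat3d (CameraPose, cam_H_base)
hom_mat3d_invert (cam_H_base, base_H_cam)
pose_to_hom_mat3d (ObjInCamPose, cam_H_obj)
hom_mat3d_compose (base_H_cam, cam_H_obj, base_H_obj)
hom_mat3d_to_pose (base_H_obj, ObjInBasePose)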

Additional information about the calibration process

The following sections discuss individual questions arising from the use of hand_eye_calibration. They are intended as a guideline for using the operator in an application, as well as an aid to understanding the operator.

How do I get 3D calibration points and their projections?

3D calibration points given in the world coordinate system (X, Y, Z) and their associated projections in the image (Row, Col) form the basis of the hand-eye calibration. In order to perform a successful hand-eye calibration, you need at least three images of the 3D calibration points that were acquired under different poses of the manipulator. In each image, at least four points must be available in order to internally compute the pose that transfers the calibration points from their world coordinate system into the camera coordinate system.

In principle, you can use arbitrary known points for the calibration. However, it is usually most convenient to use the standard calibration plate, e.g., the one that can be generated with gen_caltab. In this case, you can use the operators find_caltab and find_marks_and_pose to extract the position of the calibration plate and of the calibration marks, and the operator caltab_points to read the 3D coordinates of the calibration marks (see also the description of camera_calibration).

The parameter NumPoints specifies the number of 3D calibration points used for each pose of the manipulator, i.e., for each image. With this, the 3D calibration points, which are stored in a linearized fashion in X, Y, and Z, and their corresponding projections (Row, Col) can be associated with the corresponding pose of the manipulator (RobotPoses). Note that, in contrast to the operator camera_calibration, the 3D coordinates of the calibration points must be specified for each calibration image, not only once, and thus can vary for each image of the sequence.

How do I acquire a suitable set of images?

The following conditions, especially if using a standard calibration plate, should be considered:

How do I obtain the poses of the robot?

In the parameter RobotPoses, you must pass the poses of the robot in the calibration images (moving camera: pose of the robot base in robot tool coordinates; stationary camera: pose of the robot tool in robot base coordinates) in a linearized fashion. We recommend creating the robot poses in a separate program and saving them in files using write_pose. In the calibration program, you can then read and accumulate them in a tuple, as shown in the example program below. In addition, we recommend saving the pose of the robot tool in robot base coordinates independently of the hand-eye configuration. When using a moving camera, you then invert the poses after reading them, before accumulating them. This is also shown in the example program.

Via the Cartesian interface of the robot, you can typically obtain the pose of the tool in base coordinates in a notation that corresponds to the pose representations with the codes 0 or 2 (OrderOfRotation = 'gba' or 'abg', see create_pose). In this case, you can directly use the pose values obtained from the robot as input for create_pose.
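For example, such a separate program could create and save one pose per calibration image as sketched below. The pose values and the file name pattern are illustrative (the pattern matches the example program at the end of this section):

* Read Tx, Ty, Tz (in [m]) and Rx, Ry, Rz (in [°]) from the robot
* controller, then create and save the pose of the tool in base
* coordinates for image i.
create_pose (Tx, Ty, Tz, Rx, Ry, Rz, 'Rp+T', 'abg', 'point', ToolInBasePose)
write_pose (ToolInBasePose, 'robpose_' + i$'02d' + '.dat')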

If the Cartesian interface of your robot describes the orientation in a different way, e.g., with the ZYZ representation (R = Rz(phi1) * Ry(phi2) * Rz(phi3)), you can create the corresponding homogeneous transformation matrix step by step using the operators hom_mat3d_rotate and hom_mat3d_translate and then convert the matrix into a pose using hom_mat3d_to_pose. The following example code creates a pose from the ZYZ representation described above:

hom_mat3d_identity (HomMat3DIdent)
* Rotate about z by phi3, then about y by phi2, then about z by phi1.
hom_mat3d_rotate (HomMat3DIdent, phi3, 'z', 0, 0, 0, HomMat3DRotZ)
hom_mat3d_rotate (HomMat3DRotZ, phi2, 'y', 0, 0, 0, HomMat3DRotZY)
hom_mat3d_rotate (HomMat3DRotZY, phi1, 'z', 0, 0, 0, HomMat3DRotZYZ)
* Append the translation and convert the matrix into a pose.
hom_mat3d_translate (HomMat3DRotZYZ, Tx, Ty, Tz, base_H_tool)
hom_mat3d_to_pose (base_H_tool, RobPose)

Please note that the hand-eye calibration only works if the robot poses RobotPoses are specified with high accuracy!

What is the order of the individual parameters?

The length of the tuple NumPoints corresponds to the number of different poses of the manipulator and thus to the number of calibration images. The parameter NumPoints determines the number of calibration points used at the individual poses. If the standard calibration plate is used, this means 49 points per pose (image). If, for example, 15 images were acquired, NumPoints is a tuple of length 15, where all elements of the tuple have the value 49.
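Such a tuple can conveniently be generated with the HDevelop function gen_tuple_const:

* 15 images with 49 calibration points each.
NumPoints := gen_tuple_const(15, 49)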

The number of images in the sequence, which is determined by the length of NumPoints, must also be taken into account for the tuples of the 3D calibration points and the extracted 2D marks, respectively. Hence, for 15 calibration images with 49 calibration points each, the tuples X, Y, Z, Row, and Col must contain 15 · 49 = 735 values each. These tuples are ordered according to the image the respective points lie in, i.e., the first 49 values correspond to the 49 calibration points in the first image. The order of the 3D calibration points and of the extracted 2D calibration points must be the same in each image.

The length of the tuple RobotPoses also depends on the number of calibration images. If, for example, 15 images and therefore 15 poses are used, the length of the tuple RobotPoses is 105 (15 times 7 pose parameters). The first seven values thus determine the pose of the manipulator for the first image, and so on.
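These length constraints can be checked before calling the operator. A small sketch using the accumulator variables of the example program below (the variable names are illustrative):

* Each calibration point contributes one entry to the coordinate tuples;
* each image contributes seven entries to the robot pose tuple.
if (|XCoord| != sum(NumMarker) or |RobotPoses| != 7 * |NumMarker|)
    throw ('inconsistent input tuple lengths')
endif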

Algorithm and output parameter

The parameter Method determines the type of algorithm used for the hand-eye calibration: With 'linear', a linear algorithm is chosen, which is fast but in many practical situations not accurate enough. 'nonlinear' selects a non-linear algorithm, which results in the most accurately calibrated poses and is therefore the method of choice.

For the calibration of SCARA robots, the parameter Method must be set to 'scara_linear' or 'scara_nonlinear', respectively. While the arm of an articulated robot has rotary joints typically covering 6 degrees of freedom (3 translations and 3 rotations), SCARA robots have two parallel rotary joints and one parallel prismatic joint, covering only 4 degrees of freedom (3 translations and 1 rotation). Loosely speaking, an articulated robot is able to tilt its end effector while a SCARA robot is not.

The parameter QualityType switches between different possibilities for assessing the quality of the calibration result returned in Quality. 'error_pose' stands for the pose error of the complete chain of transformations. To be more precise, a tuple with four elements is returned: the root-mean-square error of the translational part, the root-mean-square error of the rotational part, the maximum translational error, and the maximum rotational error. With 'standard_deviation', a tuple with 12 elements containing the standard deviations of the two poses is returned: The first six elements refer to the camera pose, the others to the pose of the calibration points. With 'covariance', the full 12x12 covariance matrix of both poses is returned. Like poses, the standard deviations and covariances are specified in the units [m] and [°]. Note that selecting 'linear' or 'scara_linear' for the parameter Method enables only the output of the pose error ('error_pose').
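For example, with QualityType = 'error_pose', the four elements of Quality can be unpacked as follows (the variable names are illustrative):

* Unpack the 'error_pose' quality measures.
RmsTransError := Quality[0]
RmsRotError := Quality[1]
MaxTransError := Quality[2]
MaxRotError := Quality[3]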

Execution Information

Parameters

X (input_control)  number-array (real / integer)

Linear list containing all the x coordinates of the calibration points (in the order of the images).

Y (input_control)  number-array (real / integer)

Linear list containing all the y coordinates of the calibration points (in the order of the images).

Z (input_control)  number-array (real / integer)

Linear list containing all the z coordinates of the calibration points (in the order of the images).

Row (input_control)  number-array (real / integer)

Linear list containing all the row coordinates of the calibration points (in the order of the images).

Col (input_control)  number-array (real / integer)

Linear list containing all the column coordinates of the calibration points (in the order of the images).

NumPoints (input_control)  integer-array (integer)

Number of the calibration points for each image.

RobotPoses (input_control)  pose-array (real / integer)

Known 3D pose of the robot for each image (moving camera: robot base in robot tool coordinates; stationary camera: robot tool in robot base coordinates).

CameraParam (input_control)  campar (real / integer / string)

Internal camera parameters.

Method (input_control)  string (string)

Method of hand-eye calibration.

Default value: 'nonlinear'

List of values: 'linear', 'nonlinear', 'scara_linear', 'scara_nonlinear'

QualityType (input_control)  string(-array) (string)

Type of quality assessment.

Default value: 'error_pose'

List of values: 'covariance', 'error_pose', 'standard_deviation'

CameraPose (output_control)  pose (real / integer)

Computed relative camera pose: 3D pose of the robot tool (moving camera) or robot base (stationary camera), respectively, in camera coordinates.

CalibrationPose (output_control)  pose (real / integer)

Computed 3D pose of the calibration points in robot base coordinates (moving camera) or in robot tool coordinates (stationary camera), respectively.

Quality (output_control)  real(-array) (real)

Quality assessment of the result.

Example (HDevelop)

* Note that, in order to use this code snippet, you must provide
* the camera parameters, the calibration plate description file,
* the calibration images and their number (NumImages), the robot
* poses, and the hand-eye configuration (IsMovingCameraConfig,
* 'true' or 'false').
read_cam_par('campar.dat', CameraParam)
CalDescr := 'caltab.descr'
caltab_points(CalDescr, X, Y, Z)
* Initialize the accumulator tuples.
RCoord := []
CCoord := []
XCoord := []
YCoord := []
ZCoord := []
NumMarker := []
RobotPoses := []
* Process all calibration images.
for i := 0 to NumImages-1 by 1
  read_image(Image, 'calib_'+i$'02d')
  * Find marks on the calibration plate in every image.
  find_caltab(Image, CalPlate, CalDescr, 3, 150, 5)
  find_marks_and_pose(Image, CalPlate, CalDescr, CameraParam, 128, 10, 18, \
                      0.9, 15, 100, RCoordTmp, CCoordTmp, StartPose)
  * Accumulate 2D and 3D coordinates of the marks.
  RCoord := [RCoord, RCoordTmp]
  CCoord := [CCoord, CCoordTmp]
  XCoord := [XCoord, X]
  YCoord := [YCoord, Y]
  ZCoord := [ZCoord, Z]
  NumMarker := [NumMarker, |RCoordTmp|]
  * Read pose of the robot tool in robot base coordinates.
  read_pose('robpose_'+i$'02d'+'.dat', RobPose)
  * Moving camera? Invert pose.
  if (IsMovingCameraConfig == 'true')
    pose_to_hom_mat3d(RobPose, base_H_tool)
    hom_mat3d_invert(base_H_tool, tool_H_base)
    hom_mat3d_to_pose(tool_H_base, RobPose)
  endif
  * Accumulate robot poses.
  RobotPoses := [RobotPoses, RobPose]
endfor
*
* Perform hand-eye calibration.
*
hand_eye_calibration(XCoord, YCoord, ZCoord, RCoord, CCoord, NumMarker, \
                     RobotPoses, CameraParam, 'nonlinear', 'error_pose', \
                     CameraPose, CalibrationPose, Error)
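The computed poses are typically saved for later use in the application, e.g. (the file names are illustrative):

* Save the calibration results.
write_pose (CameraPose, 'campose_final.dat')
write_pose (CalibrationPose, 'calpose_final.dat')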

Result

The operator hand_eye_calibration returns the value 2 (H_MSG_TRUE) if the given parameters are correct. Otherwise, an exception is raised.

Possible Predecessors

find_marks_and_pose, camera_calibration, calibrate_cameras

Possible Successors

write_pose, convert_pose_type, pose_to_hom_mat3d, disp_caltab, sim_caltab

Alternatives

calibrate_hand_eye

See also

find_caltab, find_marks_and_pose, disp_caltab, sim_caltab, write_cam_par, read_cam_par, create_pose, convert_pose_type, write_pose, read_pose, pose_to_hom_mat3d, hom_mat3d_to_pose, caltab_points, gen_caltab, calibrate_hand_eye

References

K. Daniilidis: “Hand-Eye Calibration Using Dual Quaternions”; International Journal of Robotics Research, Vol. 18, No. 3, pp. 286-298; 1999.
M. Ulrich, C. Steger: “Hand-Eye Calibration of SCARA Robots Using Dual Quaternions”; Pattern Recognition and Image Analysis, Vol. 26, No. 1, pp. 231-239; January 2016.

Module

Calibration