
vector_to_proj_hom_mat2d (Operator)

Name

vector_to_proj_hom_mat2d — Compute a projective transformation matrix using given point correspondences.

Signature

vector_to_proj_hom_mat2d( : : Px, Py, Qx, Qy, Method, CovXX1, CovYY1, CovXY1, CovXX2, CovYY2, CovXY2 : HomMat2D, Covariance)

Herror T_vector_to_proj_hom_mat2d(const Htuple Px, const Htuple Py, const Htuple Qx, const Htuple Qy, const Htuple Method, const Htuple CovXX1, const Htuple CovYY1, const Htuple CovXY1, const Htuple CovXX2, const Htuple CovYY2, const Htuple CovXY2, Htuple* HomMat2D, Htuple* Covariance)

void VectorToProjHomMat2d(const HTuple& Px, const HTuple& Py, const HTuple& Qx, const HTuple& Qy, const HTuple& Method, const HTuple& CovXX1, const HTuple& CovYY1, const HTuple& CovXY1, const HTuple& CovXX2, const HTuple& CovYY2, const HTuple& CovXY2, HTuple* HomMat2D, HTuple* Covariance)

HTuple HHomMat2D::VectorToProjHomMat2d(const HTuple& Px, const HTuple& Py, const HTuple& Qx, const HTuple& Qy, const HString& Method, const HTuple& CovXX1, const HTuple& CovYY1, const HTuple& CovXY1, const HTuple& CovXX2, const HTuple& CovYY2, const HTuple& CovXY2)

HTuple HHomMat2D::VectorToProjHomMat2d(const HTuple& Px, const HTuple& Py, const HTuple& Qx, const HTuple& Qy, const char* Method, const HTuple& CovXX1, const HTuple& CovYY1, const HTuple& CovXY1, const HTuple& CovXX2, const HTuple& CovYY2, const HTuple& CovXY2)

static void HOperatorSet.VectorToProjHomMat2d(HTuple px, HTuple py, HTuple qx, HTuple qy, HTuple method, HTuple covXX1, HTuple covYY1, HTuple covXY1, HTuple covXX2, HTuple covYY2, HTuple covXY2, out HTuple homMat2D, out HTuple covariance)

HTuple HHomMat2D.VectorToProjHomMat2d(HTuple px, HTuple py, HTuple qx, HTuple qy, string method, HTuple covXX1, HTuple covYY1, HTuple covXY1, HTuple covXX2, HTuple covYY2, HTuple covXY2)

Description

vector_to_proj_hom_mat2d determines the homogeneous projective transformation matrix HomMat2D that optimally fulfills the equations

  (Qx, Qy, 1)^T ≃ HomMat2D * (Px, Py, 1)^T   (equality up to a scale factor)

given by at least 4 point correspondences.

If fewer than 4 pairs of points (Px,Py), (Qx,Qy) are given, no unique solution exists. If exactly 4 pairs are supplied, the matrix HomMat2D transforms them in exactly the desired way. If more than 4 point pairs are given, vector_to_proj_hom_mat2d seeks to minimize the transformation error. To achieve such a minimization, several different algorithms are available; the algorithm to use is chosen with the parameter Method. Method='dlt' uses a fast and simple, but also rather inaccurate, estimation algorithm; Method='normalized_dlt' offers a good compromise between speed and accuracy; Method='gold_standard' performs a mathematically optimal but slower optimization.
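The normalized DLT technique referenced by Method='normalized_dlt' can be sketched outside HALCON. The following NumPy implementation (the function names `normalize` and `dlt_homography` are illustrative, not part of the HALCON API) conditions the points, solves the linear system by SVD, and denormalizes the result:

```python
import numpy as np

def normalize(pts):
    # Similarity transform: centroid to origin, mean distance to sqrt(2).
    c = pts.mean(axis=0)
    d = np.sqrt(((pts - c) ** 2).sum(axis=1)).mean()
    s = np.sqrt(2) / d
    return np.array([[s, 0, -s * c[0]],
                     [0, s, -s * c[1]],
                     [0, 0, 1.0]])

def dlt_homography(p, q):
    """Estimate H with q ~ H p from >= 4 correspondences (normalized DLT)."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    Tp, Tq = normalize(p), normalize(q)
    ph = np.c_[p, np.ones(len(p))] @ Tp.T   # normalized homogeneous points
    qh = np.c_[q, np.ones(len(q))] @ Tq.T
    A = []
    for (x, y, w), (u, v, t) in zip(ph, qh):
        # Two independent rows per correspondence of the DLT system A h = 0.
        A.append([0, 0, 0, -t * x, -t * y, -t * w, v * x, v * y, v * w])
        A.append([t * x, t * y, t * w, 0, 0, 0, -u * x, -u * y, -u * w])
    _, _, Vt = np.linalg.svd(np.asarray(A))
    Hn = Vt[-1].reshape(3, 3)               # null vector = least-squares solution
    H = np.linalg.inv(Tq) @ Hn @ Tp         # undo both normalizations
    return H / H[2, 2]
```

With exactly 4 correspondences the system has an exact null vector; with more, the smallest singular vector minimizes the algebraic error, which is what distinguishes 'dlt'/'normalized_dlt' from the geometrically optimal 'gold_standard'.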

If 'gold_standard' is used and the input points have been obtained from an operator like points_foerstner, which provides a covariance matrix specifying the accuracy of each point, these covariances can be taken into account via the input parameters CovYY1, CovXX1, CovXY1 for the points in the first image and CovYY2, CovXX2, CovXY2 for the points in the second image. The covariances are symmetric 2×2 matrices: CovXX1/CovXX2 and CovYY1/CovYY2 are lists of diagonal entries, while CovXY1/CovXY2 contain the off-diagonal entry, which appears twice in a symmetric matrix. If a Method other than 'gold_standard' is used or the covariances are unknown, the covariance parameters can be left empty.
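The mapping from the three flat tuples to per-point covariance matrices can be sketched as follows (the helper `point_covariances` is a hypothetical illustration, not a HALCON operator):

```python
import numpy as np

def point_covariances(cov_xx, cov_yy, cov_xy):
    # cov_xx/cov_yy hold the diagonal entries, cov_xy the off-diagonal
    # entry that appears twice in each symmetric 2x2 covariance matrix.
    return [np.array([[xx, xy],
                      [xy, yy]])
            for xx, yy, xy in zip(cov_xx, cov_yy, cov_xy)]
```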

In contrast to hom_vector_to_proj_hom_mat2d, points at infinity cannot be used to determine the transformation in vector_to_proj_hom_mat2d. If this is necessary, hom_vector_to_proj_hom_mat2d must be used. If the correspondence between the points has not been determined, proj_match_points_ransac should be used to determine the correspondence as well as the transformation.

If the points to transform are specified in standard image coordinates, their row coordinates must be passed in Px and their column coordinates in Py. This is necessary to obtain a right-handed coordinate system for the image. In particular, this assures that rotations are performed in the correct direction. Note that the (x,y) order of the matrices quite naturally corresponds to the usual (row,column) order for coordinates in the image.

Attention

It should be noted that homogeneous transformation matrices refer to a general right-handed mathematical coordinate system. If a homogeneous transformation matrix is used to transform images, regions, XLD contours, or any other data that has been extracted from images, the row coordinates of the transformation must be passed in the x coordinates, while the column coordinates must be passed in the y coordinates. Consequently, the order of passing row and column coordinates follows the usual order (RowRowRowRowrow,ColumnColumnColumnColumncolumn). This convention is essential to obtain a right-handed coordinate system for the transformation of iconic data, and consequently to ensure in particular that rotations are performed in the correct mathematical direction.

Furthermore, it should be noted that if a homogeneous transformation matrix is used to transform images, regions, XLD contours, or any other data that has been extracted from images, it is assumed that the origin of the coordinate system of the homogeneous transformation matrix lies in the upper left corner of a pixel. The image processing operators that return point coordinates, however, assume a coordinate system in which the origin lies in the center of a pixel. Therefore, to obtain a consistent homogeneous transformation matrix, 0.5 must be added to the point coordinates before computing the transformation.
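The half-pixel shift described above can be expressed as a one-line helper (the name `to_corner_coords` is illustrative, not part of the HALCON API):

```python
def to_corner_coords(rows, cols):
    # Shift pixel-center coordinates (as returned by point extractors
    # such as points_foerstner) to the pixel-corner convention assumed
    # by homogeneous matrices applied to iconic data.
    return [r + 0.5 for r in rows], [c + 0.5 for c in cols]
```

Applying this to both point sets before estimating the matrix yields a transformation consistent with operators like projective_trans_image.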

Execution Information

Parameters

Px (input_control)  point.x-array → HTuple (real / integer)

Input points in image 1 (row coordinate).

Py (input_control)  point.y-array → HTuple (real / integer)

Input points in image 1 (column coordinate).

Qx (input_control)  point.x-array → HTuple (real)

Input points in image 2 (row coordinate).

Qy (input_control)  point.y-array → HTuple (real)

Input points in image 2 (column coordinate).

Method (input_control)  string → HTuple (string)

Estimation algorithm.

Default value: 'normalized_dlt'

List of values: 'dlt', 'gold_standard', 'normalized_dlt'

CovXX1 (input_control)  real-array → HTuple (real)

Row coordinate variance of the points in image 1.

Default value: []

CovYY1 (input_control)  real-array → HTuple (real)

Column coordinate variance of the points in image 1.

Default value: []

CovXY1 (input_control)  real-array → HTuple (real)

Covariance of the points in image 1.

Default value: []

CovXX2 (input_control)  real-array → HTuple (real)

Row coordinate variance of the points in image 2.

Default value: []

CovYY2 (input_control)  real-array → HTuple (real)

Column coordinate variance of the points in image 2.

Default value: []

CovXY2 (input_control)  real-array → HTuple (real)

Covariance of the points in image 2.

Default value: []

HomMat2D (output_control)  hom_mat2d → HHomMat2D, HTuple (real)

Homogeneous projective transformation matrix.

Covariance (output_control)  real-array → HTuple (real)

9×9 covariance matrix of the projective transformation matrix.

Possible Predecessors

proj_match_points_ransac, proj_match_points_ransac_guided, points_foerstner, points_harris

Possible Successors

projective_trans_image, projective_trans_image_size, projective_trans_region, projective_trans_contour_xld, projective_trans_point_2d, projective_trans_pixel

Alternatives

hom_vector_to_proj_hom_mat2d, proj_match_points_ransac, proj_match_points_ransac_guided

References

Richard Hartley, Andrew Zisserman: “Multiple View Geometry in Computer Vision”; Cambridge University Press, Cambridge; 2000.
Olivier Faugeras, Quang-Tuan Luong: “The Geometry of Multiple Images: The Laws That Govern the Formation of Multiple Images of a Scene and Some of Their Applications”; MIT Press, Cambridge, MA; 2001.

Module

Calibration

