
vector_to_essential_matrix (Operator)

Name

vector_to_essential_matrix — Compute the essential matrix given image point correspondences and known camera matrices, and reconstruct 3D points.

Signature

vector_to_essential_matrix( : : Rows1, Cols1, Rows2, Cols2, CovRR1, CovRC1, CovCC1, CovRR2, CovRC2, CovCC2, CamMat1, CamMat2, Method : EMatrix, CovEMat, Error, X, Y, Z, CovXYZ)

Herror T_vector_to_essential_matrix(const Htuple Rows1, const Htuple Cols1, const Htuple Rows2, const Htuple Cols2, const Htuple CovRR1, const Htuple CovRC1, const Htuple CovCC1, const Htuple CovRR2, const Htuple CovRC2, const Htuple CovCC2, const Htuple CamMat1, const Htuple CamMat2, const Htuple Method, Htuple* EMatrix, Htuple* CovEMat, Htuple* Error, Htuple* X, Htuple* Y, Htuple* Z, Htuple* CovXYZ)

void VectorToEssentialMatrix(const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& CovRR1, const HTuple& CovRC1, const HTuple& CovCC1, const HTuple& CovRR2, const HTuple& CovRC2, const HTuple& CovCC2, const HTuple& CamMat1, const HTuple& CamMat2, const HTuple& Method, HTuple* EMatrix, HTuple* CovEMat, HTuple* Error, HTuple* X, HTuple* Y, HTuple* Z, HTuple* CovXYZ)

HHomMat2D HHomMat2D::VectorToEssentialMatrix(const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& CovRR1, const HTuple& CovRC1, const HTuple& CovCC1, const HTuple& CovRR2, const HTuple& CovRC2, const HTuple& CovCC2, const HHomMat2D& CamMat2, const HString& Method, HTuple* CovEMat, HTuple* Error, HTuple* X, HTuple* Y, HTuple* Z, HTuple* CovXYZ) const

HHomMat2D HHomMat2D::VectorToEssentialMatrix(const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& CovRR1, const HTuple& CovRC1, const HTuple& CovCC1, const HTuple& CovRR2, const HTuple& CovRC2, const HTuple& CovCC2, const HHomMat2D& CamMat2, const HString& Method, HTuple* CovEMat, double* Error, HTuple* X, HTuple* Y, HTuple* Z, HTuple* CovXYZ) const

HHomMat2D HHomMat2D::VectorToEssentialMatrix(const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& CovRR1, const HTuple& CovRC1, const HTuple& CovCC1, const HTuple& CovRR2, const HTuple& CovRC2, const HTuple& CovCC2, const HHomMat2D& CamMat2, const char* Method, HTuple* CovEMat, double* Error, HTuple* X, HTuple* Y, HTuple* Z, HTuple* CovXYZ) const

static void HOperatorSet.VectorToEssentialMatrix(HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, HTuple covRR1, HTuple covRC1, HTuple covCC1, HTuple covRR2, HTuple covRC2, HTuple covCC2, HTuple camMat1, HTuple camMat2, HTuple method, out HTuple EMatrix, out HTuple covEMat, out HTuple error, out HTuple x, out HTuple y, out HTuple z, out HTuple covXYZ)

HHomMat2D HHomMat2D.VectorToEssentialMatrix(HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, HTuple covRR1, HTuple covRC1, HTuple covCC1, HTuple covRR2, HTuple covRC2, HTuple covCC2, HHomMat2D camMat2, string method, out HTuple covEMat, out HTuple error, out HTuple x, out HTuple y, out HTuple z, out HTuple covXYZ)

HHomMat2D HHomMat2D.VectorToEssentialMatrix(HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, HTuple covRR1, HTuple covRC1, HTuple covCC1, HTuple covRR2, HTuple covRC2, HTuple covCC2, HHomMat2D camMat2, string method, out HTuple covEMat, out double error, out HTuple x, out HTuple y, out HTuple z, out HTuple covXYZ)

Description

For a stereo configuration with known camera matrices, the geometric relation between the two images is defined by the essential matrix. The operator vector_to_essential_matrix determines the essential matrix EMatrix from, in general, at least six given point correspondences that fulfill the epipolar constraint:

(X2, Y2, 1) · EMatrix · (X1, Y1, 1)^T = 0,

where (X1, Y1, 1) and (X2, Y2, 1) are the direction vectors of a corresponding point pair in camera 1 and camera 2 (see below).
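As an illustration, the constraint can be checked numerically for an essential matrix built from a known relative pose. The following is a NumPy sketch with made-up pose values, not the HALCON API; it uses the decomposition E = [t]x · R of an essential matrix into relative rotation R and translation t.

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t]x with skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

# Made-up relative pose: rotation about the y axis plus a translation.
angle = 0.1
R = np.array([[np.cos(angle), 0.0, np.sin(angle)],
              [0.0, 1.0, 0.0],
              [-np.sin(angle), 0.0, np.cos(angle)]])
t = np.array([1.0, 0.0, 0.2])
E = skew(t) @ R                    # essential matrix of this pose

# A 3D point seen from camera 1; camera 2 sees it at R @ P + t.
P = np.array([0.3, -0.2, 4.0])
v1 = P / P[2]                      # direction vector (X1, Y1, 1)
P2 = R @ P + t
v2 = P2 / P2[2]                    # direction vector (X2, Y2, 1)
residual = v2 @ E @ v1             # epipolar constraint, vanishes up to rounding
```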

The operator vector_to_essential_matrix is designed to deal only with a linear camera model. This is in contrast to the operator vector_to_rel_pose, which also encompasses lens distortions. The internal camera parameters are passed by the arguments CamMat1 and CamMat2, which are 3x3 upper triangular matrices describing an affine transformation. The relation between the vector (X, Y, 1), defining the direction from the camera to the viewed 3D point, and its (projective) 2D image coordinates (col, row, 1) is:

(col, row, 1)^T = CamMat · (X, Y, 1)^T,   CamMat = ((a, s, u), (0, b, v), (0, 0, 1)).

The focal length is denoted by f. The two diagonal elements of the camera matrix are scaling factors that combine the focal length with the pixel size in column and row direction, s describes a skew factor, and the entries of the last column indicate the principal point. Mainly, these are the elements known from the camera parameters as used, for example, in calibrate_cameras. Alternatively, the elements of the camera matrix can be described in a different way; see e.g. stationary_camera_self_calibration.
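A minimal NumPy sketch of such an upper triangular camera matrix follows; the particular parametrization (focal length divided by the pixel sizes on the diagonal) and all numeric values are assumptions for illustration, not taken from this page.

```python
import numpy as np

def camera_matrix(f, sx, sy, s, cx, cy):
    # Upper triangular camera matrix; diagonal entries f/sx, f/sy are an
    # assumed parametrization (focal length over pixel size).
    return np.array([[f / sx, s,      cx],
                     [0.0,    f / sy, cy],
                     [0.0,    0.0,    1.0]])

K = camera_matrix(f=0.008, sx=7.4e-6, sy=7.4e-6, s=0.0, cx=320.0, cy=240.0)

direction = np.array([0.02, -0.01, 1.0])   # (X, Y, 1): ray towards the 3D point
col, row, w = K @ direction                # projective pixel coordinates (col, row, 1)
```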

The point correspondences (Rows1, Cols1) and (Rows2, Cols2) are typically found by applying the operator match_essential_matrix_ransac. Multiplying the image coordinates by the inverses of the camera matrices results in the 3D direction vectors, which can then be inserted into the epipolar constraint.
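This normalization step can be sketched in NumPy as follows (K1 and the point lists are illustrative values, not the HALCON API):

```python
import numpy as np

# An assumed camera matrix of camera 1 for illustration.
K1 = np.array([[1200.0, 0.0, 320.0],
               [0.0, 1200.0, 240.0],
               [0.0, 0.0, 1.0]])

rows1 = np.array([100.0, 150.0, 210.0])
cols1 = np.array([400.0, 90.0, 330.0])

# Projective image points are (col, row, 1); one point per column.
pts1 = np.stack([cols1, rows1, np.ones_like(rows1)])
# Direction vectors (X, Y, 1) are obtained by multiplying with K1^-1.
dirs1 = np.linalg.inv(K1) @ pts1
```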

The parameter Method decides whether the relative orientation between the cameras is of a special type and which algorithm is applied for its computation. If Method is either 'normalized_dlt' or 'gold_standard', the relative orientation is arbitrary. Choosing 'trans_normalized_dlt' or 'trans_gold_standard' means that the relative motion between the cameras is a pure translation. The typical application of this special motion case is the scenario of a single fixed camera looking onto a moving conveyor belt. In this case the minimum required number of corresponding points is just two instead of six in the general case.
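Why two correspondences suffice in the pure-translation case can be seen from the structure of the essential matrix: with identity rotation, E reduces to the skew-symmetric matrix [t]x, and each correspondence (v1, v2) constrains the translation by t · (v1 × v2) = 0. The following NumPy sketch (illustrative values, not the HALCON implementation) recovers t up to scale from two points:

```python
import numpy as np

t_true = np.array([0.6, -0.2, 0.1])    # made-up translation of camera 2

def observe(P):
    """Direction vectors of 3D point P in camera 1 and in the translated camera 2."""
    v1 = P / P[2]
    P2 = P + t_true
    return v1, P2 / P2[2]

v1a, v2a = observe(np.array([0.5, 0.3, 4.0]))
v1b, v2b = observe(np.array([-0.7, 0.1, 6.0]))

# Each correspondence yields a vector orthogonal to t; two of them
# determine t (up to scale and sign) as their cross product.
c1 = np.cross(v1a, v2a)
c2 = np.cross(v1b, v2b)
t_est = np.cross(c1, c2)
t_est = t_est / np.linalg.norm(t_est)
```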

The essential matrix is computed by a linear algorithm if 'normalized_dlt' or 'trans_normalized_dlt' is chosen. With 'gold_standard' or 'trans_gold_standard' the algorithm gives a statistically optimal result. Here, 'normalized_dlt' and 'gold_standard' stand for the direct linear transformation and the gold standard algorithm, respectively. All methods return the coordinates (X, Y, Z) of the reconstructed 3D points. The optimal methods also return the covariances of the 3D points in CovXYZ: if n is the number of points, the 3x3 covariance matrices are concatenated and stored in a tuple of length 9n. Additionally, the optimal methods return the covariance of the essential matrix in CovEMat.
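The linear estimation step can be sketched as follows. Each correspondence of direction vectors (v1, v2) gives one linear equation v2^T · E · v1 = 0 in the nine entries of E; the entries are recovered as the null vector of the stacked system via SVD, and the essential-matrix property (two equal singular values, one zero) is enforced afterwards. This NumPy sketch mirrors the idea behind 'normalized_dlt' but is not HALCON's implementation; the plain linear solution needs at least eight correspondences, while HALCON's methods also handle the minimal cases stated above.

```python
import numpy as np

def estimate_essential_linear(dirs1, dirs2):
    """Linear essential-matrix estimate from direction vectors of shape (n, 3)."""
    # One row per correspondence: kron(v2, v1) flattens v2^T E v1 = 0
    # into a linear equation on the row-major entries of E.
    A = np.stack([np.kron(v2, v1) for v1, v2 in zip(dirs1, dirs2)])
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)              # null vector of A, reshaped
    # Project onto the essential manifold: singular values (s, s, 0).
    U, S, Vt = np.linalg.svd(E)
    s = (S[0] + S[1]) / 2.0
    return U @ np.diag([s, s, 0.0]) @ Vt
```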

If an optimal gold standard algorithm is chosen, the covariances of the image points (CovRR1, CovRC1, CovCC1, CovRR2, CovRC2, CovCC2) can be incorporated into the computation. They can be provided, for example, by the operator points_foerstner. If the point covariances are unknown, which is the default, empty tuples are passed. In this case the optimization algorithm internally assumes uniform and equal covariances for all points.

The value Error indicates the overall quality of the optimization process and is the root-mean-square Euclidean distance in pixels between the points and their corresponding epipolar lines.
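This error measure can be sketched in NumPy as follows (not the HALCON implementation; averaging over both images is an assumption here). Pixel-space epipolar lines are obtained via the fundamental matrix F = K2^-T · E · K1^-1:

```python
import numpy as np

def point_line_distance(line, point):
    """Distance of projective 2D point (x, y, 1) to line (a, b, c): a*x + b*y + c = 0."""
    return abs(float(line @ point)) / np.hypot(line[0], line[1])

def rms_epipolar_error(E, K1, K2, pts1, pts2):
    """RMS distance (pixels) between points and the epipolar lines of their partners.

    pts1, pts2: arrays of shape (n, 3) with projective pixel coordinates (col, row, 1).
    """
    F = np.linalg.inv(K2).T @ E @ np.linalg.inv(K1)
    d = []
    for p1, p2 in zip(pts1, pts2):
        d.append(point_line_distance(F @ p1, p2))    # line of p1 in image 2
        d.append(point_line_distance(F.T @ p2, p1))  # line of p2 in image 1
    return float(np.sqrt(np.mean(np.square(d))))
```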

For the operator vector_to_essential_matrix a special configuration of scene points and cameras exists: if all 3D points lie in a single plane and are additionally all closer to one of the two cameras, the solution for the essential matrix is not unique but twofold. As a consequence, both solutions are computed and returned by the operator. This means that all output parameters are of double length and the values of the second solution are simply concatenated behind those of the first.
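Unpacking the outputs can be sketched as follows (plain Python/NumPy with hypothetical helper names, not part of HALCON): the 9n-tuple CovXYZ splits into n matrices of shape 3x3, and in the twofold-solution case every output tuple splits into two halves.

```python
import numpy as np

def split_cov_xyz(cov_xyz, n_points):
    """Reshape a flat 9n tuple into n covariance matrices of shape (3, 3)."""
    return np.asarray(cov_xyz).reshape(n_points, 3, 3)

def split_solutions(values):
    """Split a doubled output tuple into (first solution, second solution)."""
    half = len(values) // 2
    return values[:half], values[half:]
```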

Execution Information

Parameters

Rows1 (input_control)  number-array  HTuple (real / integer)

Input points in image 1 (row coordinate).

Restriction: length(Rows1) >= 6 || length(Rows1) >= 2

Cols1 (input_control)  number-array  HTuple (real / integer)

Input points in image 1 (column coordinate).

Restriction: length(Cols1) == length(Rows1)

Rows2 (input_control)  number-array  HTuple (real / integer)

Input points in image 2 (row coordinate).

Restriction: length(Rows2) == length(Rows1)

Cols2 (input_control)  number-array  HTuple (real / integer)

Input points in image 2 (column coordinate).

Restriction: length(Cols2) == length(Rows1)

CovRR1 (input_control)  number-array  HTuple (real / integer)

Row coordinate variance of the points in image 1.

Default value: []

CovRC1 (input_control)  number-array  HTuple (real / integer)

Covariance of the points in image 1.

Default value: []

CovCC1 (input_control)  number-array  HTuple (real / integer)

Column coordinate variance of the points in image 1.

Default value: []

CovRR2 (input_control)  number-array  HTuple (real / integer)

Row coordinate variance of the points in image 2.

Default value: []

CovRC2 (input_control)  number-array  HTuple (real / integer)

Covariance of the points in image 2.

Default value: []

CovCC2 (input_control)  number-array  HTuple (real / integer)

Column coordinate variance of the points in image 2.

Default value: []

CamMat1 (input_control)  hom_mat2d  HHomMat2D, HTuple (real / integer)

Camera matrix of the 1st camera.

CamMat2 (input_control)  hom_mat2d  HHomMat2D, HTuple (real / integer)

Camera matrix of the 2nd camera.

Method (input_control)  string  HTuple (string)

Algorithm for the computation of the essential matrix and for special camera orientations.

Default value: 'normalized_dlt'

List of values: 'gold_standard', 'normalized_dlt', 'trans_gold_standard', 'trans_normalized_dlt'

EMatrix (output_control)  hom_mat2d  HHomMat2D, HTuple (real)

Computed essential matrix.

CovEMat (output_control)  real-array  HTuple (real)

9x9 covariance matrix of the essential matrix.

Error (output_control)  real(-array)  HTuple (real)

Root-mean-square of the epipolar distance error.

X (output_control)  real-array  HTuple (real)

X coordinates of the reconstructed 3D points.

Y (output_control)  real-array  HTuple (real)

Y coordinates of the reconstructed 3D points.

Z (output_control)  real-array  HTuple (real)

Z coordinates of the reconstructed 3D points.

CovXYZ (output_control)  real-array  HTuple (real)

Covariance matrices of the reconstructed 3D points.

Possible Predecessors

match_essential_matrix_ransac

Possible Successors

essential_to_fundamental_matrix

Alternatives

vector_to_rel_pose, vector_to_fundamental_matrix

See also

stationary_camera_self_calibration

References

Richard Hartley, Andrew Zisserman: “Multiple View Geometry in Computer Vision”; Cambridge University Press, Cambridge; 2003.
J. Chris McGlone (editor): “Manual of Photogrammetry”; American Society for Photogrammetry and Remote Sensing; 2004.

Module

3D Metrology

