Name
vector_to_essential_matrix — Compute the essential matrix given image point correspondences and known camera matrices, and reconstruct 3D points.
vector_to_essential_matrix( : : Rows1, Cols1, Rows2, Cols2, CovRR1, CovRC1, CovCC1, CovRR2, CovRC2, CovCC2, CamMat1, CamMat2, Method : EMatrix, CovEMat, Error, X, Y, Z, CovXYZ)
Herror T_vector_to_essential_matrix(const Htuple Rows1, const Htuple Cols1, const Htuple Rows2, const Htuple Cols2, const Htuple CovRR1, const Htuple CovRC1, const Htuple CovCC1, const Htuple CovRR2, const Htuple CovRC2, const Htuple CovCC2, const Htuple CamMat1, const Htuple CamMat2, const Htuple Method, Htuple* EMatrix, Htuple* CovEMat, Htuple* Error, Htuple* X, Htuple* Y, Htuple* Z, Htuple* CovXYZ)
Herror vector_to_essential_matrix(const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& CovRR1, const HTuple& CovRC1, const HTuple& CovCC1, const HTuple& CovRR2, const HTuple& CovRC2, const HTuple& CovCC2, const HTuple& CamMat1, const HTuple& CamMat2, const HTuple& Method, HTuple* EMatrix, HTuple* CovEMat, HTuple* Error, HTuple* X, HTuple* Y, HTuple* Z, HTuple* CovXYZ)
void VectorToEssentialMatrix(const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& CovRR1, const HTuple& CovRC1, const HTuple& CovCC1, const HTuple& CovRR2, const HTuple& CovRC2, const HTuple& CovCC2, const HTuple& CamMat1, const HTuple& CamMat2, const HTuple& Method, HTuple* EMatrix, HTuple* CovEMat, HTuple* Error, HTuple* X, HTuple* Y, HTuple* Z, HTuple* CovXYZ)
HHomMat2D HHomMat2D::VectorToEssentialMatrix(const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& CovRR1, const HTuple& CovRC1, const HTuple& CovCC1, const HTuple& CovRR2, const HTuple& CovRC2, const HTuple& CovCC2, const HHomMat2D& CamMat2, const HString& Method, HTuple* CovEMat, HTuple* Error, HTuple* X, HTuple* Y, HTuple* Z, HTuple* CovXYZ) const
HHomMat2D HHomMat2D::VectorToEssentialMatrix(const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& CovRR1, const HTuple& CovRC1, const HTuple& CovCC1, const HTuple& CovRR2, const HTuple& CovRC2, const HTuple& CovCC2, const HHomMat2D& CamMat2, const HString& Method, HTuple* CovEMat, double* Error, HTuple* X, HTuple* Y, HTuple* Z, HTuple* CovXYZ) const
HHomMat2D HHomMat2D::VectorToEssentialMatrix(const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& CovRR1, const HTuple& CovRC1, const HTuple& CovCC1, const HTuple& CovRR2, const HTuple& CovRC2, const HTuple& CovCC2, const HHomMat2D& CamMat2, const char* Method, HTuple* CovEMat, double* Error, HTuple* X, HTuple* Y, HTuple* Z, HTuple* CovXYZ) const
void HOperatorSetX.VectorToEssentialMatrix(
[in] VARIANT Rows1, [in] VARIANT Cols1, [in] VARIANT Rows2, [in] VARIANT Cols2, [in] VARIANT CovRR1, [in] VARIANT CovRC1, [in] VARIANT CovCC1, [in] VARIANT CovRR2, [in] VARIANT CovRC2, [in] VARIANT CovCC2, [in] VARIANT CamMat1, [in] VARIANT CamMat2, [in] VARIANT Method, [out] VARIANT* EMatrix, [out] VARIANT* CovEMat, [out] VARIANT* Error, [out] VARIANT* X, [out] VARIANT* Y, [out] VARIANT* Z, [out] VARIANT* CovXYZ)
IHHomMat2DX* HHomMat2DX.VectorToEssentialMatrix(
[in] VARIANT Rows1, [in] VARIANT Cols1, [in] VARIANT Rows2, [in] VARIANT Cols2, [in] VARIANT CovRR1, [in] VARIANT CovRC1, [in] VARIANT CovCC1, [in] VARIANT CovRR2, [in] VARIANT CovRC2, [in] VARIANT CovCC2, [in] IHHomMat2DX* CamMat2, [in] BSTR Method, [out] VARIANT* CovEMat, [out] VARIANT* Error, [out] VARIANT* X, [out] VARIANT* Y, [out] VARIANT* Z, [out] VARIANT* CovXYZ)
static void HOperatorSet.VectorToEssentialMatrix(HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, HTuple covRR1, HTuple covRC1, HTuple covCC1, HTuple covRR2, HTuple covRC2, HTuple covCC2, HTuple camMat1, HTuple camMat2, HTuple method, out HTuple EMatrix, out HTuple covEMat, out HTuple error, out HTuple x, out HTuple y, out HTuple z, out HTuple covXYZ)
HHomMat2D HHomMat2D.VectorToEssentialMatrix(HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, HTuple covRR1, HTuple covRC1, HTuple covCC1, HTuple covRR2, HTuple covRC2, HTuple covCC2, HHomMat2D camMat2, string method, out HTuple covEMat, out HTuple error, out HTuple x, out HTuple y, out HTuple z, out HTuple covXYZ)
HHomMat2D HHomMat2D.VectorToEssentialMatrix(HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, HTuple covRR1, HTuple covRC1, HTuple covCC1, HTuple covRR2, HTuple covRC2, HTuple covCC2, HHomMat2D camMat2, string method, out HTuple covEMat, out double error, out HTuple x, out HTuple y, out HTuple z, out HTuple covXYZ)
For a stereo configuration with known camera matrices, the geometric relation
between the two images is defined by the essential matrix.
The operator vector_to_essential_matrix determines the essential
matrix EMatrix from, in general, at least six given point
correspondences that fulfill the epipolar constraint:
    (X2 Y2 1) * EMatrix * (X1 Y1 1)^T = 0
The operator vector_to_essential_matrix is designed to deal
only with a linear camera model. This is in contrast to the
operator vector_to_rel_pose, which also handles lens distortions.
The internal camera parameters are passed in the arguments
CamMat1 and CamMat2, which are
3x3 upper triangular matrices describing an affine
transformation. The relation between the vector (X,Y,1), defining the
direction from the camera to the viewed 3D point, and its (projective)
2D image coordinates (col,row,1) is:
/ col \ / X \ / f/Sx s Cx \
| row | = CamMat * | Y | where CamMat = | 0 f/Sy Cy | .
\ 1 / \ 1 / \ 0 0 1 /
The focal length is denoted by f, Sx and Sy are scaling
factors, s describes a skew factor, and (Cx,Cy)
indicates the principal point.
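As a plain-Python sketch of the projection equation above (all parameter values are hypothetical):

```python
# Project a direction vector (X, Y, 1) to pixel coordinates (col, row)
# with a 3x3 upper triangular camera matrix
#   ( f/Sx  s    Cx )
#   ( 0     f/Sy Cy )
#   ( 0     0    1  )

def project(cam_mat, direction):
    """Apply CamMat to (X, Y, 1); the last row is (0, 0, 1), so w = 1."""
    col = cam_mat[0][0] * direction[0] + cam_mat[0][1] * direction[1] + cam_mat[0][2]
    row = cam_mat[1][1] * direction[1] + cam_mat[1][2]
    return col, row

cam_mat = [[800.0, 0.0, 320.0],   # f/Sx = f/Sy = 800, no skew,
           [0.0, 800.0, 240.0],   # principal point (320, 240)
           [0.0, 0.0, 1.0]]

print(project(cam_mat, [0.1, -0.05, 1.0]))   # → (400.0, 200.0)
```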
In essence, these are the elements known from the camera parameters as used,
for example, in calibrate_cameras. Alternatively, the elements
of the camera matrix can be described in a different way; see, e.g.,
stationary_camera_self_calibration.
The point correspondences
(Rows1,Cols1) and (Rows2,Cols2)
are typically found by applying the operator
match_essential_matrix_ransac. Multiplying the image
coordinates by the inverse of the camera matrices results in the 3D
direction vectors, which can then be inserted into the epipolar constraint.
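Because the camera matrix is upper triangular with a unit last row, its inverse can be applied analytically. A plain-Python sketch of this normalization step (hypothetical values):

```python
# Recover the direction vector (X, Y, 1) from pixel coordinates (col, row)
# by inverting the upper triangular camera matrix analytically.

def pixel_to_direction(cam_mat, col, row):
    """Back-substitution through CamMat: solve CamMat * (X, Y, 1)^T = (col, row, 1)^T."""
    fx, s, cx = cam_mat[0]
    _, fy, cy = cam_mat[1]
    y = (row - cy) / fy
    x = (col - cx - s * y) / fx
    return [x, y, 1.0]

cam_mat = [[800.0, 0.0, 320.0],
           [0.0, 800.0, 240.0],
           [0.0, 0.0, 1.0]]

print(pixel_to_direction(cam_mat, 400.0, 200.0))   # → [0.1, -0.05, 1.0]
```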
The parameter Method decides whether the relative orientation
between the cameras is of a special type and which algorithm is to be applied
for its computation.
If Method is either 'normalized_dlt' or
'gold_standard', the relative orientation is arbitrary.
Choosing 'trans_normalized_dlt' or 'trans_gold_standard'
means that the relative motion between the cameras is a pure translation.
The typical application for this special motion case is the
scenario of a single fixed camera looking onto a moving conveyor belt.
In this case the minimum required number of corresponding points is just two
instead of six in the general case.
The essential matrix is computed by a linear algorithm if
'normalized_dlt' or 'trans_normalized_dlt' is chosen.
With 'gold_standard' or 'trans_gold_standard'
the algorithm gives a statistically optimal result.
Here, 'normalized_dlt' and 'gold_standard' stand for the
direct linear transformation and the gold standard algorithm, respectively.
All methods return the coordinates (X,Y,Z)
of the reconstructed 3D points. The optimal methods also return
the covariances of the 3D points in CovXYZ.
Let n be the number of points;
then the 3x3 covariance matrices are concatenated and
stored in a tuple of length 9n.
Additionally, the optimal methods return the
covariance of the essential matrix in CovEMat.
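A plain-Python sketch of how such a tuple of length 9n can be unpacked into n row-major 3x3 covariance matrices (the values here are hypothetical):

```python
# Unpack a flat CovXYZ tuple of length 9n into n 3x3 covariance matrices,
# one per reconstructed 3D point, assuming row-major concatenation.

def unpack_cov_xyz(cov_tuple):
    assert len(cov_tuple) % 9 == 0
    return [[cov_tuple[k + 3 * i: k + 3 * i + 3] for i in range(3)]
            for k in range(0, len(cov_tuple), 9)]

cov = [1, 0, 0, 0, 1, 0, 0, 0, 1,      # identity for point 0
       2, 0, 0, 0, 2, 0, 0, 0, 2]      # scaled identity for point 1
mats = unpack_cov_xyz(cov)
print(len(mats), mats[1][2][2])   # → 2 2
```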
If one of the gold standard algorithms is chosen, the covariances of the image
points (CovRR1, CovRC1, CovCC1, CovRR2,
CovRC2, CovCC2) can be incorporated in the computation.
They can be provided, for example, by the operator points_foerstner.
If the point covariances are unknown, which is the default, empty tuples
are passed. In this case the optimization algorithm internally assumes
uniform and equal covariances for all points.
The value Error indicates the overall quality of the optimization
process; it is the root-mean-square Euclidean distance in pixels between the
points and their corresponding epipolar lines.
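This error measure can be sketched in plain Python. The example below evaluates it for one image only and in normalized coordinates with hypothetical data; the operator itself averages over both images and reports the distance in pixels:

```python
# RMS distance between points and their epipolar lines l = E * p1.
import math

def mat_vec(m, v):
    return [sum(m[i][j] * v[j] for j in range(3)) for i in range(3)]

def epipolar_distance(e, p1, p2):
    """Perpendicular distance from p2 to the line (a, b, c) = E * p1."""
    a, b, c = mat_vec(e, p1)
    return abs(a * p2[0] + b * p2[1] + c) / math.hypot(a, b)

E = [[0.0, 0.0, 0.0],        # pure x translation, as above
     [0.0, 0.0, -1.0],
     [0.0, 1.0, 0.0]]
pairs = [([0.3, 0.2, 1.0], [0.7, 0.2, 1.0]),      # exact correspondence
         ([0.1, -0.1, 1.0], [0.4, -0.08, 1.0])]   # 0.02 off the epipolar line
rms = math.sqrt(sum(epipolar_distance(E, p1, p2) ** 2 for p1, p2 in pairs)
                / len(pairs))
print(round(rms, 4))   # → 0.0141
```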
For the operator vector_to_essential_matrix a special configuration
of scene points and cameras exists: if all 3D points lie in a single plane
and are additionally all closer to one of the two cameras, the solution
for the essential matrix is not unique but twofold.
As a consequence, both solutions are computed and returned by the operator.
In this case all output parameters are of double length, and the values
of the second solution are simply concatenated after the values of the
first one.
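Splitting such doubled output tuples back into the two solutions is straightforward; a plain-Python sketch with hypothetical data (n points per solution):

```python
# Split a possibly doubled output tuple into (first, second) solutions.

def split_solutions(values, n):
    """Return (first, second) if the tuple holds two solutions, else (values, None)."""
    if len(values) == 2 * n:
        return values[:n], values[n:]
    return values, None

x = [1.0, 2.0, 3.0, 1.1, 2.1, 3.1]   # two solutions for n = 3 points
first, second = split_solutions(x, 3)
print(first, second)   # → [1.0, 2.0, 3.0] [1.1, 2.1, 3.1]
```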
- Multithreading type: reentrant (runs in parallel with non-exclusive operators).
- Multithreading scope: global (may be called from any thread).
- Processed without parallelization.
Input points in image 1 (row coordinate).
Restriction: length(Rows1) >= 6 (general methods) or length(Rows1) >= 2 ('trans_*' methods)
Input points in image 1 (column coordinate).
Restriction: length(Cols1) == length(Rows1)
Input points in image 2 (row coordinate).
Restriction: length(Rows2) == length(Rows1)
Input points in image 2 (column coordinate).
Restriction: length(Cols2) == length(Rows1)
Row coordinate variance of the points in image 1.
Default value: []
Covariance of the points in image 1.
Default value: []
Column coordinate variance of the points in image 1.
Default value: []
Row coordinate variance of the points in image 2.
Default value: []
Covariance of the points in image 2.
Default value: []
Column coordinate variance of the points in image 2.
Default value: []
Camera matrix of the 1st camera.
Camera matrix of the 2nd camera.
Algorithm for the computation of the
essential matrix and for special camera orientations.
Default value: 'normalized_dlt'
List of values: 'gold_standard', 'normalized_dlt', 'trans_gold_standard', 'trans_normalized_dlt'
Computed essential matrix.
9x9 covariance matrix of the
essential matrix.
Root-Mean-Square of the epipolar distance error.
X (output_control) real-array → HTuple (real)
X coordinates of the reconstructed 3D points.
Y (output_control) real-array → HTuple (real)
Y coordinates of the reconstructed 3D points.
Z (output_control) real-array → HTuple (real)
Z coordinates of the reconstructed 3D points.
Covariance matrices of the reconstructed 3D points.
match_essential_matrix_ransac,
essential_to_fundamental_matrix,
vector_to_rel_pose,
vector_to_fundamental_matrix,
stationary_camera_self_calibration
Richard Hartley, Andrew Zisserman: “Multiple View Geometry in
Computer Vision”; Cambridge University Press, Cambridge; 2003.
J. Chris McGlone (editor): “Manual of Photogrammetry”;
American Society for Photogrammetry and Remote Sensing; 2004.
3D Metrology