
proj_match_points_distortion_ransac_guided (Operator)

Name

proj_match_points_distortion_ransac_guided — Compute a projective transformation matrix and the radial distortion coefficient between two images by finding correspondences between points based on known approximations of the projective transformation matrix and the radial distortion coefficient.

Signature

proj_match_points_distortion_ransac_guided(Image1, Image2 : : Rows1, Cols1, Rows2, Cols2, GrayMatchMethod, MaskSize, HomMat2DGuide, KappaGuide, DistanceTolerance, MatchThreshold, EstimationMethod, DistanceThreshold, RandSeed : HomMat2D, Kappa, Error, Points1, Points2)

Herror T_proj_match_points_distortion_ransac_guided(const Hobject Image1, const Hobject Image2, const Htuple Rows1, const Htuple Cols1, const Htuple Rows2, const Htuple Cols2, const Htuple GrayMatchMethod, const Htuple MaskSize, const Htuple HomMat2DGuide, const Htuple KappaGuide, const Htuple DistanceTolerance, const Htuple MatchThreshold, const Htuple EstimationMethod, const Htuple DistanceThreshold, const Htuple RandSeed, Htuple* HomMat2D, Htuple* Kappa, Htuple* Error, Htuple* Points1, Htuple* Points2)

Herror proj_match_points_distortion_ransac_guided(Hobject Image1, Hobject Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& GrayMatchMethod, const HTuple& MaskSize, const HTuple& HomMat2DGuide, const HTuple& KappaGuide, const HTuple& DistanceTolerance, const HTuple& MatchThreshold, const HTuple& EstimationMethod, const HTuple& DistanceThreshold, const HTuple& RandSeed, HTuple* HomMat2D, HTuple* Kappa, HTuple* Error, HTuple* Points1, HTuple* Points2)

HTuple HImage::ProjMatchPointsDistortionRansacGuided(const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& GrayMatchMethod, const HTuple& MaskSize, const HTuple& HomMat2DGuide, const HTuple& KappaGuide, const HTuple& DistanceTolerance, const HTuple& MatchThreshold, const HTuple& EstimationMethod, const HTuple& DistanceThreshold, const HTuple& RandSeed, HTuple* Kappa, HTuple* Error, HTuple* Points1, HTuple* Points2) const

void ProjMatchPointsDistortionRansacGuided(const HObject& Image1, const HObject& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HTuple& GrayMatchMethod, const HTuple& MaskSize, const HTuple& HomMat2DGuide, const HTuple& KappaGuide, const HTuple& DistanceTolerance, const HTuple& MatchThreshold, const HTuple& EstimationMethod, const HTuple& DistanceThreshold, const HTuple& RandSeed, HTuple* HomMat2D, HTuple* Kappa, HTuple* Error, HTuple* Points1, HTuple* Points2)

HHomMat2D HImage::ProjMatchPointsDistortionRansacGuided(const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HString& GrayMatchMethod, Hlong MaskSize, const HHomMat2D& HomMat2DGuide, double KappaGuide, double DistanceTolerance, const HTuple& MatchThreshold, const HString& EstimationMethod, const HTuple& DistanceThreshold, Hlong RandSeed, double* Kappa, double* Error, HTuple* Points1, HTuple* Points2) const

HHomMat2D HImage::ProjMatchPointsDistortionRansacGuided(const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HString& GrayMatchMethod, Hlong MaskSize, const HHomMat2D& HomMat2DGuide, double KappaGuide, double DistanceTolerance, Hlong MatchThreshold, const HString& EstimationMethod, double DistanceThreshold, Hlong RandSeed, double* Kappa, double* Error, HTuple* Points1, HTuple* Points2) const

HHomMat2D HImage::ProjMatchPointsDistortionRansacGuided(const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const char* GrayMatchMethod, Hlong MaskSize, const HHomMat2D& HomMat2DGuide, double KappaGuide, double DistanceTolerance, Hlong MatchThreshold, const char* EstimationMethod, double DistanceThreshold, Hlong RandSeed, double* Kappa, double* Error, HTuple* Points1, HTuple* Points2) const

HHomMat2D HHomMat2D::ProjMatchPointsDistortionRansacGuided(const HImage& Image1, const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HString& GrayMatchMethod, Hlong MaskSize, double KappaGuide, double DistanceTolerance, const HTuple& MatchThreshold, const HString& EstimationMethod, const HTuple& DistanceThreshold, Hlong RandSeed, double* Kappa, double* Error, HTuple* Points1, HTuple* Points2) const

HHomMat2D HHomMat2D::ProjMatchPointsDistortionRansacGuided(const HImage& Image1, const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const HString& GrayMatchMethod, Hlong MaskSize, double KappaGuide, double DistanceTolerance, Hlong MatchThreshold, const HString& EstimationMethod, double DistanceThreshold, Hlong RandSeed, double* Kappa, double* Error, HTuple* Points1, HTuple* Points2) const

HHomMat2D HHomMat2D::ProjMatchPointsDistortionRansacGuided(const HImage& Image1, const HImage& Image2, const HTuple& Rows1, const HTuple& Cols1, const HTuple& Rows2, const HTuple& Cols2, const char* GrayMatchMethod, Hlong MaskSize, double KappaGuide, double DistanceTolerance, Hlong MatchThreshold, const char* EstimationMethod, double DistanceThreshold, Hlong RandSeed, double* Kappa, double* Error, HTuple* Points1, HTuple* Points2) const

void HOperatorSetX.ProjMatchPointsDistortionRansacGuided(
[in] IHUntypedObjectX* Image1, [in] IHUntypedObjectX* Image2, [in] VARIANT Rows1, [in] VARIANT Cols1, [in] VARIANT Rows2, [in] VARIANT Cols2, [in] VARIANT GrayMatchMethod, [in] VARIANT MaskSize, [in] VARIANT HomMat2dGuide, [in] VARIANT KappaGuide, [in] VARIANT DistanceTolerance, [in] VARIANT MatchThreshold, [in] VARIANT EstimationMethod, [in] VARIANT DistanceThreshold, [in] VARIANT RandSeed, [out] VARIANT* HomMat2d, [out] VARIANT* Kappa, [out] VARIANT* Error, [out] VARIANT* Points1, [out] VARIANT* Points2)

IHHomMat2DX* HImageX.ProjMatchPointsDistortionRansacGuided(
[in] IHImageX* Image2, [in] VARIANT Rows1, [in] VARIANT Cols1, [in] VARIANT Rows2, [in] VARIANT Cols2, [in] BSTR GrayMatchMethod, [in] Hlong MaskSize, [in] IHHomMat2DX* HomMat2dGuide, [in] double KappaGuide, [in] double DistanceTolerance, [in] VARIANT MatchThreshold, [in] BSTR EstimationMethod, [in] VARIANT DistanceThreshold, [in] Hlong RandSeed, [out] double* Kappa, [out] double* Error, [out] VARIANT* Points1, [out] VARIANT* Points2)

IHHomMat2DX* HHomMat2DX.ProjMatchPointsDistortionRansacGuided(
[in] IHImageX* Image1, [in] IHImageX* Image2, [in] VARIANT Rows1, [in] VARIANT Cols1, [in] VARIANT Rows2, [in] VARIANT Cols2, [in] BSTR GrayMatchMethod, [in] Hlong MaskSize, [in] double KappaGuide, [in] double DistanceTolerance, [in] VARIANT MatchThreshold, [in] BSTR EstimationMethod, [in] VARIANT DistanceThreshold, [in] Hlong RandSeed, [out] double* Kappa, [out] double* Error, [out] VARIANT* Points1, [out] VARIANT* Points2)

static void HOperatorSet.ProjMatchPointsDistortionRansacGuided(HObject image1, HObject image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, HTuple grayMatchMethod, HTuple maskSize, HTuple homMat2DGuide, HTuple kappaGuide, HTuple distanceTolerance, HTuple matchThreshold, HTuple estimationMethod, HTuple distanceThreshold, HTuple randSeed, out HTuple homMat2D, out HTuple kappa, out HTuple error, out HTuple points1, out HTuple points2)

HHomMat2D HImage.ProjMatchPointsDistortionRansacGuided(HImage image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, string grayMatchMethod, int maskSize, HHomMat2D homMat2DGuide, double kappaGuide, double distanceTolerance, HTuple matchThreshold, string estimationMethod, HTuple distanceThreshold, int randSeed, out double kappa, out double error, out HTuple points1, out HTuple points2)

HHomMat2D HImage.ProjMatchPointsDistortionRansacGuided(HImage image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, string grayMatchMethod, int maskSize, HHomMat2D homMat2DGuide, double kappaGuide, double distanceTolerance, int matchThreshold, string estimationMethod, double distanceThreshold, int randSeed, out double kappa, out double error, out HTuple points1, out HTuple points2)

HHomMat2D HHomMat2D.ProjMatchPointsDistortionRansacGuided(HImage image1, HImage image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, string grayMatchMethod, int maskSize, double kappaGuide, double distanceTolerance, HTuple matchThreshold, string estimationMethod, HTuple distanceThreshold, int randSeed, out double kappa, out double error, out HTuple points1, out HTuple points2)

HHomMat2D HHomMat2D.ProjMatchPointsDistortionRansacGuided(HImage image1, HImage image2, HTuple rows1, HTuple cols1, HTuple rows2, HTuple cols2, string grayMatchMethod, int maskSize, double kappaGuide, double distanceTolerance, int matchThreshold, string estimationMethod, double distanceThreshold, int randSeed, out double kappa, out double error, out HTuple points1, out HTuple points2)

Description

Given a set of coordinates of characteristic points (Rows1, Cols1) and (Rows2, Cols2) in both input images Image1 and Image2, which must have identical size, and given known approximations HomMat2DGuide and KappaGuide for the transformation matrix and the radial distortion coefficient between Image1 and Image2, proj_match_points_distortion_ransac_guided automatically determines corresponding points, the homogeneous projective transformation matrix HomMat2D, and the radial distortion coefficient Kappa that optimally fulfill the following equation (equality up to scale):

  (r_2, c_2, 1)^T ≃ HomMat2D * (r_1, c_1, 1)^T

Here, (r_1, c_1) and (r_2, c_2) denote image points that are obtained by undistorting the input image points with the division model (see calibrate_cameras):

  r = r_d / (1 + Kappa * (r_d^2 + c_d^2))
  c = c_d / (1 + Kappa * (r_d^2 + c_d^2))

Here, (r_d, c_d) denote the distorted image points, specified relative to the image center, and w and h denote the width and height of the input images. Thus, proj_match_points_distortion_ransac_guided assumes that the principal point of the camera, i.e., the center of the radial distortions, lies at the center of the image.

The returned Kappa can be used to construct camera parameters for rectifying images or points (see change_radial_distortion_cam_par, change_radial_distortion_image, and change_radial_distortion_points):

  CamPar = [0.0, Kappa, 1.0, 1.0, 0.5*(w-1), 0.5*(h-1), w, h]
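
For example, the returned Kappa can be turned into such a parameter tuple and used to undistort point coordinates with change_radial_distortion_points. This is only a minimal sketch; RowsDist and ColsDist stand for any point tuples in pixel coordinates of the input images:

  get_image_size (Image1, Width, Height)
  CamParDist := [0.0,Kappa,1.0,1.0,0.5*(Width-1),0.5*(Height-1),Width,Height]
  * Target parameters: same camera, but without radial distortion.
  change_radial_distortion_cam_par ('fixed', CamParDist, 0, CamPar)
  * Undistort the (illustrative) point tuples RowsDist, ColsDist.
  change_radial_distortion_points (RowsDist, ColsDist, CamParDist, CamPar, \
                                   RowsRect, ColsRect)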

The approximations HomMat2DGuide and KappaGuide can, for example, be calculated with proj_match_points_distortion_ransac on lower resolution versions of Image1 and Image2. See the example below.

The matching process is based on characteristic points, which can be extracted with point operators like points_foerstner or points_harris. The matching itself is carried out in two steps: first, gray value correlations of mask windows around the input points in the first and the second image are determined and an initial matching between them is generated using the similarity of the windows in both images. Then, the RANSAC algorithm is applied to find the projective transformation matrix and radial distortion coefficient that maximize the number of correspondences under the above constraint.

The size of the mask windows used for the matching is MaskSize x MaskSize. Three metrics for the correlation can be selected. If GrayMatchMethod has the value 'ssd', the sum of the squared gray value differences is used, 'sad' means the sum of absolute differences, and 'ncc' is the normalized cross correlation. For details please refer to binocular_disparity. The metric is minimized ('ssd', 'sad') or maximized ('ncc') over all possible point pairs. A matching found in this way is only accepted if the value of the metric is below the value of MatchThreshold ('ssd', 'sad') or above that value ('ncc').
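
As an illustration, the same call could be parameterized with 'sad' instead of 'ncc'; MatchThreshold then acts as an upper bound on the sum of absolute differences rather than a lower bound on the correlation. The concrete values (10, 5, 20, 2) are only illustrative and taken from the suggested values listed below:

  proj_match_points_distortion_ransac_guided (Image1, Image2, Rows1, Cols1, \
                                              Rows2, Cols2, 'sad', 10, \
                                              HomMat2DGuide, KappaGuide, 5, 20, \
                                              'gold_standard', 2, 0, HomMat2D, \
                                              Kappa, Error, Points1, Points2)
  * With 'sad', candidate matches whose sum of absolute differences exceeds 20
  * are rejected; with 'ncc', matches below the threshold would be rejected.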

To increase the algorithm's performance, the search area for the match candidates is limited based on the approximate transformation specified by HomMat2DGuide and KappaGuide. Only points within a distance of DistanceTolerance around the point in Image2 that is obtained when transforming a point in Image1 via HomMat2DGuide and KappaGuide are considered for the matching.
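
This restriction can be pictured by explicitly predicting, for the points of Image1, their expected positions in Image2 under the guide parameters: undistort with KappaGuide, apply HomMat2DGuide, and distort again. The following lines only illustrate that geometry and are not the operator's internal implementation (Width and Height denote the image size):

  get_image_size (Image1, Width, Height)
  CamParGuide := [0.0,KappaGuide,1.0,1.0,0.5*(Width-1),0.5*(Height-1), \
                  Width,Height]
  CamParNone := [0.0,0.0,1.0,1.0,0.5*(Width-1),0.5*(Height-1),Width,Height]
  * Undistort the points of image 1, map them projectively, and redistort them.
  change_radial_distortion_points (Rows1, Cols1, CamParGuide, CamParNone, \
                                   RowsU, ColsU)
  projective_trans_pixel (HomMat2DGuide, RowsU, ColsU, RowsT, ColsT)
  change_radial_distortion_points (RowsT, ColsT, CamParNone, CamParGuide, \
                                   RowsPred, ColsPred)
  * Only points of image 2 within DistanceTolerance of (RowsPred, ColsPred)
  * are considered as match candidates.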

After the initial matching has been completed, a randomized search algorithm (RANSAC) is used to determine the projective transformation matrix HomMat2D and the radial distortion coefficient Kappa. It tries to find the parameters that are consistent with a maximum number of correspondences. For a point to be accepted, the distance to its corresponding transformed point must not exceed the threshold DistanceThreshold. Consequently, DistanceThreshold should be smaller than DistanceTolerance.

The parameter EstimationMethod determines which algorithm is used to compute the projective transformation matrix. A linear algorithm is used if EstimationMethod is set to 'linear'. This algorithm is very fast and returns accurate results for small to moderate noise of the point coordinates and for most distortions (except for small distortions). For EstimationMethod = 'gold_standard', a mathematically optimal but slower optimization is used, which minimizes the geometric reprojection error. In general, it is preferable to use EstimationMethod = 'gold_standard'.

The value Error indicates the overall quality of the estimation procedure and is the mean symmetric Euclidean distance in pixels between the points and their corresponding transformed points.

Point pairs consistent with the above constraints are considered to be corresponding points. Points1 contains the indices of the matched input points from the first image and Points2 contains the indices of the corresponding points in the second image.
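
For example, the coordinates of the corresponding points can be gathered from these indices with the tuple operation subset:

  RowsMatch1 := subset(Rows1,Points1)
  ColsMatch1 := subset(Cols1,Points1)
  RowsMatch2 := subset(Rows2,Points2)
  ColsMatch2 := subset(Cols2,Points2)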

The parameter RandSeed can be used to control the randomized nature of the RANSAC algorithm, and hence to obtain reproducible results. If RandSeed is set to a positive number, the operator returns the same result on every call with the same parameters because the internally used random number generator is initialized with RandSeed. If RandSeed = 0, the random number generator is initialized with the current time. In this case the results may not be reproducible.

Parallelization

Parameters

Image1 (input_object)  singlechannelimage → object (byte / uint2)

Input image 1.

Image2 (input_object)  singlechannelimage → object (byte / uint2)

Input image 2.

Rows1 (input_control)  point.y-array → (real / integer)

Input points in image 1 (row coordinate).

Restriction: length(Rows1) >= 5

Cols1 (input_control)  point.x-array → (real / integer)

Input points in image 1 (column coordinate).

Restriction: length(Cols1) == length(Rows1)

Rows2 (input_control)  point.y-array → (real / integer)

Input points in image 2 (row coordinate).

Restriction: length(Rows2) >= 5

Cols2 (input_control)  point.x-array → (real / integer)

Input points in image 2 (column coordinate).

Restriction: length(Cols2) == length(Rows2)

GrayMatchMethod (input_control)  string → (string)

Gray value match metric.

Default value: 'ncc'

List of values: 'ncc', 'sad', 'ssd'

MaskSize (input_control)  integer → (integer)

Size of gray value masks.

Default value: 10

Typical range of values: 3 ≤ MaskSize ≤ 15

Restriction: MaskSize >= 1

HomMat2DGuide (input_control)  hom_mat2d → (real)

Approximation of the homogeneous projective transformation matrix between the two images.

KappaGuide (input_control)  real → (real)

Approximation of the radial distortion coefficient in the two images.

DistanceTolerance (input_control)  real → (real)

Tolerance for the matching search window.

Default value: 20.0

Suggested values: 0.2, 0.5, 1.0, 2.0, 3.0, 5.0, 10.0, 20.0, 50.0

Restriction: DistanceTolerance > 0

MatchThreshold (input_control)  number → (integer / real)

Threshold for gray value matching.

Default value: 0.7

Suggested values: 0.9, 0.7, 0.5, 10, 20, 50, 100

EstimationMethod (input_control)  string → (string)

Algorithm for the computation of the projective transformation matrix.

Default value: 'gold_standard'

List of values: 'gold_standard', 'linear'

DistanceThreshold (input_control)  number → (real / integer)

Threshold for transformation consistency check.

Default value: 1

Restriction: DistanceThreshold > 0

RandSeed (input_control)  integer → (integer)

Seed for the random number generator.

Default value: 0

HomMat2D (output_control)  hom_mat2d → (real)

Computed homogeneous projective transformation matrix.

Kappa (output_control)  real → (real)

Computed radial distortion coefficient.

Error (output_control)  real → (real)

Root-Mean-Square transformation error.

Points1 (output_control)  integer-array → (integer)

Indices of matched input points in image 1.

Points2 (output_control)  integer-array → (integer)

Indices of matched input points in image 2.

Example (HDevelop)

* Determine approximate values for the transformation and the
* radial distortion on images reduced to half resolution.
Factor := 0.5
zoom_image_factor (Image1, Image1Zoomed, Factor, Factor, 'constant')
zoom_image_factor (Image2, Image2Zoomed, Factor, Factor, 'constant')
points_foerstner (Image1Zoomed, 1, 2, 3, 200, 0.3, 'gauss', 'true', \
                  Rows1, Cols1, _, _, _, _, _, _, _, _)
points_foerstner (Image2Zoomed, 1, 2, 3, 200, 0.3, 'gauss', 'true', \
                  Rows2, Cols2, _, _, _, _, _, _, _, _)
get_image_size (Image1Zoomed, Width, Height)
proj_match_points_distortion_ransac (Image1Zoomed, Image2Zoomed, \
                                     Rows1, Cols1, Rows2, Cols2, \
                                     'ncc', 10, 0, 0, Height, Width, \
                                     0, 0.5, 'gold_standard', 2, 0, \
                                     HomMat2D, Kappa, Error, \
                                     Points1, Points2)
* Adapt the approximations to the full image resolution.
hom_mat2d_scale_local (HomMat2D, Factor, Factor, HomMat2DGuide)
hom_mat2d_scale (HomMat2DGuide, 1.0/Factor, 1.0/Factor, 0, 0, \
                 HomMat2DGuide)
KappaGuide := Kappa*Factor*Factor
* Perform the guided matching on the full-resolution images.
points_foerstner (Image1, 1, 2, 3, 200, 0.3, 'gauss', 'true', \
                  Rows1, Cols1, _, _, _, _, _, _, _, _)
points_foerstner (Image2, 1, 2, 3, 200, 0.3, 'gauss', 'true', \
                  Rows2, Cols2, _, _, _, _, _, _, _, _)
proj_match_points_distortion_ransac_guided (Image1, Image2, \
                                            Rows1, Cols1, \
                                            Rows2, Cols2, \
                                            'ncc', 10, \
                                            HomMat2DGuide, \
                                            KappaGuide, 5, 0.5, \
                                            'gold_standard', 2, 0, \
                                            HomMat2D, Kappa, \
                                            Error, Points1, Points2)
* Remove the radial distortion from both images and create a mosaic.
get_image_size (Image1, Width, Height)
CamParDist := [0.0,Kappa,1.0,1.0,0.5*(Width-1),0.5*(Height-1), \
               Width,Height]
change_radial_distortion_cam_par ('fixed', CamParDist, 0, CamPar)
change_radial_distortion_image (Image1, Image1, Image1Rect, \
                                CamParDist, CamPar)
change_radial_distortion_image (Image2, Image2, Image2Rect, \
                                CamParDist, CamPar)
concat_obj (Image1Rect, Image2Rect, ImagesRect)
gen_projective_mosaic (ImagesRect, MosaicImage, 1, 1, 2, HomMat2D, \
                       'default', 'false', MosaicMatrices2D)
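
As a possible follow-up (purely illustrative, not part of the original example), the matched interest points could be marked in the first image to check the result visually:

dev_display (Image1)
gen_cross_contour_xld (Crosses1, subset(Rows1,Points1), subset(Cols1,Points1), \
                       6, 0.785398)
dev_display (Crosses1)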

Possible Predecessors

points_foerstner, points_harris

Possible Successors

vector_to_proj_hom_mat2d_distortion, change_radial_distortion_cam_par, change_radial_distortion_image, change_radial_distortion_points, gen_binocular_proj_rectification, projective_trans_image, projective_trans_image_size, projective_trans_region, projective_trans_contour_xld, projective_trans_point_2d, projective_trans_pixel

Alternatives

proj_match_points_distortion_ransac

See also

proj_match_points_ransac, proj_match_points_ransac_guided, hom_vector_to_proj_hom_mat2d, vector_to_proj_hom_mat2d

References

Richard Hartley, Andrew Zisserman: “Multiple View Geometry in Computer Vision”; Cambridge University Press, Cambridge; 2003.
Olivier Faugeras, Quang-Tuan Luong: “The Geometry of Multiple Images: The Laws That Govern the Formation of Multiple Images of a Scene and Some of Their Applications”; MIT Press, Cambridge, MA; 2001.

Module

Matching

