Class VisOdomDualTrackPnP<T extends boofcv.struct.image.ImageBase<T>,​Desc extends boofcv.struct.feature.TupleDesc>

  • All Implemented Interfaces:
    org.ddogleg.struct.VerbosePrint

    public class VisOdomDualTrackPnP<T extends boofcv.struct.image.ImageBase<T>,​Desc extends boofcv.struct.feature.TupleDesc>
    extends VisOdomBundlePnPBase<VisOdomDualTrackPnP.TrackInfo>
    Stereo visual odometry algorithm which tracks features independently in the left and right images and then matches those tracks together. The idea behind this tracker is that the expensive task of associating features between the left and right cameras only needs to be done once, when a track is spawned. Triangulation is used to estimate each feature's 3D location. Motion is estimated robustly using a RANSAC-type algorithm, provided by the user, which internally uses a PnP-type algorithm. Estimated motion is relative to the left camera. FUTURE WORK: Save visual tracks without stereo matches and do monocular tracking on them. This is useful for stereo systems with only a small amount of overlap.
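The "estimate motion robustly with RANSAC, scoring by inlier count" idea above can be illustrated with a minimal, self-contained sketch. This is plain Java with no BoofCV types, estimating a 1-D translation instead of a full PnP pose; it is an illustration of the technique, not the library's implementation:

```java
import java.util.Random;

public class RansacSketch {
    /** Robustly estimate a 1-D translation t such that b[i] ~= a[i] + t,
     *  tolerating outliers, in the spirit of the RANSAC + PnP step. */
    static double estimateTranslation(double[] a, double[] b, double tol, int iters) {
        Random rand = new Random(42);
        double best = 0;
        int bestInliers = -1;
        for (int it = 0; it < iters; it++) {
            int i = rand.nextInt(a.length);      // minimal sample: one correspondence
            double t = b[i] - a[i];              // candidate model from the sample
            int inliers = 0;
            for (int j = 0; j < a.length; j++)   // score the model by counting inliers
                if (Math.abs(b[j] - (a[j] + t)) <= tol) inliers++;
            if (inliers > bestInliers) {         // keep the model with the most support
                bestInliers = inliers;
                best = t;
            }
        }
        return best;
    }

    public static void main(String[] args) {
        double[] a = {0, 1, 2, 3, 4};
        double[] b = {5, 6, 7, 99, 9};           // true shift is 5; index 3 is an outlier
        System.out.println(estimateTranslation(a, b, 0.1, 50)); // prints 5.0
    }
}
```

In the real class the model is an `Se3_F64` motion fit by a PnP solver inside the user-supplied `ModelMatcher`, and the inlier test is a reprojection-error check rather than a 1-D residual, but the sample/fit/score loop is the same.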
    • Constructor Detail

      • VisOdomDualTrackPnP

        public VisOdomDualTrackPnP​(double epilolarTol,
                                   boofcv.abst.tracker.PointTracker<T> trackerLeft,
                                   boofcv.abst.tracker.PointTracker<T> trackerRight,
                                   boofcv.abst.feature.describe.DescribeRegionPoint<T,​Desc> describe,
                                   boofcv.abst.feature.associate.AssociateDescription2D<Desc> assocL2R,
                                   boofcv.abst.geo.Triangulate2ViewsMetric triangulate2,
                                   org.ddogleg.fitting.modelset.ModelMatcher<georegression.struct.se.Se3_F64,​Stereo2D3D> matcher,
                                   org.ddogleg.fitting.modelset.ModelFitter<georegression.struct.se.Se3_F64,​Stereo2D3D> modelRefiner,
                                   boofcv.abst.geo.bundle.BundleAdjustment<boofcv.abst.geo.bundle.SceneStructureMetric> bundleAdjustment)
        Specifies internal algorithms and parameters.
        Parameters:
        epilolarTol - Tolerance in pixels for enforcing the epipolar constraint
        trackerLeft - Tracker used for left camera
        trackerRight - Tracker used for right camera
        describe - Describes features in tracks
        assocL2R - Association of features from the left to the right camera
        triangulate2 - Triangulation for estimating 3D location from stereo pair
        matcher - Robust motion model estimation with outlier rejection
        modelRefiner - Non-linear refinement of motion model
        bundleAdjustment - Bundle adjustment used to refine the estimated motion and scene structure
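For intuition on what `epilolarTol` and `triangulate2` contribute, here is a minimal, self-contained sketch in plain Java. It assumes rectified stereo (where the epipolar constraint reduces to matched features lying on nearly the same image row, and depth follows from disparity); it is not BoofCV's implementation:

```java
public class StereoSketch {
    /** Rectified-stereo epipolar check: a valid left-right match must lie on
     *  (nearly) the same image row, within a pixel tolerance. */
    static boolean passesEpipolar(double yLeft, double yRight, double tol) {
        return Math.abs(yLeft - yRight) <= tol;
    }

    /** Depth from disparity for rectified stereo:
     *  z = focal * baseline / (xLeft - xRight). */
    static double depth(double xLeft, double xRight, double focalPx, double baseline) {
        return focalPx * baseline / (xLeft - xRight);
    }

    public static void main(String[] args) {
        // Rows differ by 0.6 px, within a 1.5 px tolerance -> accepted
        System.out.println(passesEpipolar(120.4, 121.0, 1.5)); // prints true
        // 10 px disparity, 500 px focal length, 0.12 m baseline -> 6 m depth
        System.out.println(depth(400, 390, 500, 0.12));        // prints 6.0
    }
}
```

The actual class works with general (not necessarily rectified) geometry, so the tolerance is applied to the epipolar constraint via `assocL2R` and the 3D point comes from the user-supplied `Triangulate2ViewsMetric`, but the roles of the tolerance and the triangulation are as above.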
    • Method Detail

      • setCalibration

        public void setCalibration​(boofcv.struct.calib.StereoParameters param)
        Specifies the stereo parameters. Note that classes passed into the constructor are maintained externally. For example, the RANSAC distance model might need to have the stereo parameters passed to it separately, since there is no generic way to handle that here.
      • process

        public boolean process​(T left,
                               T right)
        Updates motion estimate using the stereo pair.
        Parameters:
        left - Image from left camera
        right - Image from right camera
        Returns:
        true if motion estimate was updated and false if not
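Since the estimated motion is relative to the left camera, callers typically compose each successful per-frame estimate into a global pose. A minimal sketch of that accumulation pattern, using 2-D rigid transforms in plain Java as a stand-in for `Se3_F64` (the names and the 2-D simplification are illustrative, not BoofCV API):

```java
public class PoseSketch {
    /** 2-D rigid transform {angle, tx, ty} standing in for Se3_F64;
     *  maps a point p to R(angle)*p + t. */
    static double[] compose(double[] a, double[] b) {
        double c = Math.cos(b[0]), s = Math.sin(b[0]);
        return new double[] {
            a[0] + b[0],                  // rotations add
            c * a[1] - s * a[2] + b[1],   // a's translation rotated by b, then offset
            s * a[1] + c * a[2] + b[2]
        };
    }

    public static void main(String[] args) {
        double[] worldToCurrent = {0, 0, 0};    // identity at start
        double[][] frameMotions = {{0, 1, 0}, {0, 1, 0}, {0, 0, 2}};
        for (double[] motion : frameMotions)    // accumulate per-frame estimates
            worldToCurrent = compose(worldToCurrent, motion);
        System.out.println(java.util.Arrays.toString(worldToCurrent)); // prints [0.0, 2.0, 2.0]
    }
}
```

In real use, the per-frame motion would only be composed when process returns true; a false return means the pose was not updated for that frame.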
      • isFault

        public boolean isFault()
        If there are no candidate tracks then a fault has occurred.
        Returns:
        true if a fault occurred, false if not