This lecture first reviews the previous lecture on image stitching, which handled two images related by rotation/zoom but no translation. Epipolar geometry relates cameras at two positions, and stereo depth estimation recovers depth from two images. For stereo depth estimation, two questions arise: how do we recover the relationship between the cameras (if it is not already known), and how do we find the corresponding point x' for a point x? This lecture covers these topics.
1.Epipolar Geometry and Stereo Vision Computer Vision Jia-Bin Huang, Virginia Tech Many slides from S. Seitz and D. Hoiem
2.Last class: Image Stitching Two images with rotation/zoom but no translation f f . x x X
3.This class: Two-View Geometry Epipolar geometry Relates cameras from two positions Stereo depth estimation Recover depth from two images
6.Correspondence Problem We have two images taken from cameras with different intrinsic and extrinsic parameters How do we match a point in the first image to a point in the second? How can we constrain our search? x ?
7.Key idea: Epipolar constraint
8.Potential matches for x have to lie on the corresponding line l’ . Potential matches for x’ have to lie on the corresponding line l . Key idea: Epipolar constraint x x’ X x’ X x’ X
9. Epipolar geometry: notation. Baseline – line connecting the two camera centers. Epipolar Plane – plane containing the baseline (1D family). Epipoles – intersections of the baseline with the image planes; equivalently, projections of the other camera center.
10. Epipolar Lines – intersections of the epipolar plane with the image planes (always come in corresponding pairs).
11.Example: Converging cameras
12.Example: Motion parallel to image plane
13.Example: Forward motion What would the epipolar lines look like if the camera moves directly forward?
14.e e’ Example: Forward motion Epipole has same coordinates in both images. Points move along lines radiating from e: “Focus of expansion”
15.Epipolar constraint: Calibrated case. Given the intrinsic parameters of the cameras, convert to normalized coordinates by pre-multiplying all points with the inverse of the calibration matrix, and set the first camera's coordinate system to world coordinates: x̂ = K^-1 x is a homogeneous 2D point (a 3D ray towards X); x is the 2D pixel coordinate (homogeneous); X is the 3D scene point; X' is the same scene point in the 2nd camera's 3D coordinates.
16.Epipolar constraint: Calibrated case. Given the intrinsic parameters of the cameras, convert to normalized coordinates as on the previous slide, and define some R and t that relate X to X': X = R X' + t, with x̂ and x̂' equal to X and X' up to a scale factor.
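The conversion to normalized coordinates is a single matrix multiply. A minimal NumPy sketch (the calibration matrix K here is an assumed example, not from the slides):

```python
import numpy as np

# Assumed example intrinsics: focal length 500 px, principal point (320, 240).
K = np.array([[500.,   0., 320.],
              [  0., 500., 240.],
              [  0.,   0.,   1.]])

def normalize(x_pixel, K):
    """x_hat = K^-1 x: convert a homogeneous 2D pixel coordinate into
    normalized coordinates, i.e. a 3D ray towards the scene point X."""
    return np.linalg.inv(K) @ x_pixel

# A pixel at the principal point maps to the ray along the optical axis.
x_hat = normalize(np.array([320., 240., 1.]), K)  # -> [0., 0., 1.]
```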
17.Epipolar constraint: Calibrated case. Because x̂, t, and R x̂' are co-planar (the two viewing rays and the baseline all lie in the epipolar plane), x̂ · [t × (R x̂')] = 0.
18.Essential Matrix (Longuet-Higgins, 1981) Essential matrix X x x’
19.Properties of the Essential matrix E = [t×]R, where [t×] is the skew-symmetric matrix of t (the hat on normalized coordinates is dropped below to simplify notation): E x' is the epipolar line associated with x' (l = E x'); E^T x is the epipolar line associated with x (l' = E^T x); E e' = 0 and E^T e = 0; E is singular (rank two); E has five degrees of freedom (3 for R, 2 for t, because E is only defined up to scale).
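These properties are easy to check numerically. A sketch that builds E = [t×]R from an assumed relative pose (the specific R and t are illustrative) and verifies the rank-two property:

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix [t_x] with skew(t) @ v == np.cross(t, v)."""
    return np.array([[   0., -t[2],  t[1]],
                     [ t[2],    0., -t[0]],
                     [-t[1],  t[0],    0.]])

# Assumed relative pose: small rotation about the z-axis, translation along x.
c, s = np.cos(0.1), np.sin(0.1)
R = np.array([[c, -s, 0.], [s, c, 0.], [0., 0., 1.]])
t = np.array([1., 0., 0.])

E = skew(t) @ R                    # essential matrix
rank = np.linalg.matrix_rank(E)    # -> 2 (E is singular)
```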
20.Epipolar constraint: Uncalibrated case If we don’t know K and K’ , then we can write the epipolar constraint in terms of unknown normalized coordinates: X x x’
21.The Fundamental Matrix (Faugeras and Luong, 1992). Without knowing K and K', we can define a similar relation by substituting the unknown normalized coordinates x̂ = K^-1 x and x̂' = K'^-1 x': x^T F x' = 0, with F = K^-T E K'^-1.
22.Properties of the Fundamental matrix F x' is the epipolar line associated with x' (l = F x'); F^T x is the epipolar line associated with x (l' = F^T x); F e' = 0 and F^T e = 0; F is singular (rank two): det(F) = 0; F has seven degrees of freedom: 9 entries, but defined only up to scale and subject to det(F) = 0.
23.Estimating the Fundamental Matrix 8-point algorithm Least squares solution using SVD on equations from 8 pairs of correspondences Enforce det (F)=0 constraint using SVD on F 7-point algorithm Use least squares to solve for null space (two vectors) using SVD and 7 pairs of correspondences Solve for linear combination of null space vectors that satisfies det (F)=0 Minimize reprojection error Non-linear least squares Note: estimation of F (or E) is degenerate for a planar scene.
24.8-point algorithm Solve a system of homogeneous linear equations: write down the system A f = 0, where f stacks the nine entries of F and each of the 8 point correspondences contributes one linear equation (one row of A).
25.8-point algorithm Solve a system of homogeneous linear equations Write down the system of equations Solve f from A f = 0 using SVD Matlab: [U, S, V] = svd(A); f = V(:, end); F = reshape(f, [3 3])';
26.Need to enforce singularity constraint
27.8-point algorithm Solve a system of homogeneous linear equations Write down the system of equations Solve f from A f = 0 using SVD Matlab: [U, S, V] = svd(A); f = V(:, end); F = reshape(f, [3 3])'; Resolve det(F) = 0 constraint using SVD Matlab: [U, S, V] = svd(F); S(3,3) = 0; F = U*S*V';
28.8-point algorithm Solve a system of homogeneous linear equations Write down the system of equations Solve f from A f = 0 using SVD Resolve det(F) = 0 constraint by SVD Notes: Use RANSAC to deal with outliers (sample 8 points) How to test for outliers? Solve in normalized coordinates: mean = 0, standard deviation ~= (1,1,1), just like with estimating the homography for stitching
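The full normalized 8-point pipeline above (minus the RANSAC loop) can be sketched in NumPy, mirroring the Matlab snippets; in this sketch the returned F satisfies x2^T F x1 ≈ 0 for correspondences supplied as N x 3 homogeneous rows:

```python
import numpy as np

def normalize_pts(pts):
    """Translate points to mean 0 and scale so the mean distance from the
    origin is sqrt(2); returns transformed points and the 3x3 transform T."""
    mean = pts[:, :2].mean(axis=0)
    scale = np.sqrt(2) / np.linalg.norm(pts[:, :2] - mean, axis=1).mean()
    T = np.array([[scale, 0., -scale * mean[0]],
                  [0., scale, -scale * mean[1]],
                  [0., 0., 1.]])
    return (T @ pts.T).T, T

def eight_point(x1, x2):
    """Estimate F from N >= 8 correspondences x1 <-> x2 (N x 3 homogeneous),
    so that x2^T F x1 = 0."""
    x1n, T1 = normalize_pts(x1)
    x2n, T2 = normalize_pts(x2)
    # Each correspondence gives one row of the homogeneous system A f = 0.
    A = np.stack([np.kron(p2, p1) for p1, p2 in zip(x1n, x2n)])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce the det(F) = 0 (rank-two) constraint: zero the smallest
    # singular value.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.
    F = U @ np.diag(S) @ Vt
    # De-normalize back to pixel coordinates.
    return T2.T @ F @ T1
```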
29.Comparison of homography estimation and the 8-point algorithm (assume we have matched points x, x' with outliers). Homography (no translation): correspondence relation x' = H x; normalize image coordinates; RANSAC with 4 points; solution via SVD; de-normalize. Fundamental matrix (translation): correspondence relation x^T F x' = 0; normalize image coordinates; RANSAC with 8 points; initial solution via SVD; enforce det(F) = 0 by SVD; de-normalize.
30.7-point algorithm Faster and potentially more robust because it needs fewer points, but degenerate cases must be checked.
32.Comparison of estimation algorithms: 8-point: Av. Dist. 1 = 2.33 pixels, Av. Dist. 2 = 2.18 pixels. Normalized 8-point: Av. Dist. 1 = 0.92 pixel, Av. Dist. 2 = 0.85 pixel. Nonlinear least squares: Av. Dist. 1 = 0.86 pixel, Av. Dist. 2 = 0.80 pixel.
33.We can get projection matrices P and P' up to a projective ambiguity. Code: function P = vgg_P_from_F(F): [U,S,V] = svd(F); e = U(:,3); P = [-vgg_contreps(e)*F e]; See HZ p. 255-256. If we know the intrinsic matrices (K and K'), we can resolve the ambiguity (the second camera matrix then takes the form [K' * rotation | K' * translation]).
34.From epipolar geometry to camera calibration Estimating the fundamental matrix is known as "weak calibration". If we know the calibration matrices of the two cameras, we can estimate the essential matrix: E = K^T F K'. The essential matrix gives us the relative rotation and translation between the cameras, i.e. their extrinsic parameters.
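Under the slide's convention (x^T F x' = 0 in pixel coordinates), recovering E is one line. A round-trip sketch with assumed intrinsics and an assumed essential matrix built from R = I, t = (1, 0, 0):

```python
import numpy as np

def essential_from_fundamental(F, K, Kp):
    """E = K^T F K', following the slide's convention that
    x^T F x' = 0 holds in pixel coordinates."""
    return K.T @ F @ Kp

# Assumed example intrinsics for the two cameras.
K  = np.array([[500., 0., 320.], [0., 500., 240.], [0., 0., 1.]])
Kp = np.array([[450., 0., 300.], [0., 450., 220.], [0., 0., 1.]])
# Assumed essential matrix: E = [t_x] for R = I, t = (1, 0, 0).
E_true = np.array([[0., 0., 0.], [0., 0., -1.], [0., 1., 0.]])
F = np.linalg.inv(K).T @ E_true @ np.linalg.inv(Kp)   # F = K^-T E K'^-1
E_rec = essential_from_fundamental(F, K, Kp)          # recovers E_true
```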
35.Let’s recap… Fundamental matrix song
36.Moving on to stereo… Fuse a calibrated binocular stereo pair (image 1, image 2) to produce a dense depth map. Many of these slides adapted from Steve Seitz and Lana Lazebnik
37.Basic stereo matching algorithm For each pixel in the first image Find corresponding epipolar line in the right image Search along epipolar line and pick the best match Triangulate the matches to get depth information Simplest case: epipolar lines are scanlines When does this happen?
38.Simplest Case: Parallel images Image planes of cameras are parallel to each other and to the baseline Camera centers are at same height Focal lengths are the same
39.Simplest Case: Parallel images Image planes of cameras are parallel to each other and to the baseline Camera centers are at same height Focal lengths are the same Then, epipolar lines fall along the horizontal scan lines of the images
40.Simplest Case: Parallel images Epipolar constraint: R = I, t = (T, 0, 0). The y-coordinates of corresponding points are the same.
41.Depth from disparity. With focal length f, baseline B (distance between the camera centers O and O'), and a scene point X at depth z projecting to x and x': disparity = x - x' = fB/z. Disparity is inversely proportional to depth.
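Inverting this relation converts a disparity map to depth pointwise. A minimal sketch with assumed values for f and B:

```python
import numpy as np

def depth_from_disparity(disparity, f, B):
    """z = f * B / disparity: disparity is inversely proportional to depth.
    Zero disparity (a point at infinity) maps to inf."""
    disparity = np.asarray(disparity, dtype=float)
    with np.errstate(divide='ignore'):
        return f * B / disparity

# Assumed f = 500 px and baseline B = 0.1 m: 10 px disparity -> 5 m depth.
z = depth_from_disparity([10., 25.], f=500., B=0.1)  # -> [5.0, 2.0]
```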
42.Stereo image rectification
43.Stereo image rectification Reproject image planes onto a common plane parallel to the line between camera centers Pixel motion is horizontal after this transformation Two homographies (3x3 transform), one for each input image reprojection C. Loop and Z. Zhang. Computing Rectifying Homographies for Stereo Vision . IEEE Conf. Computer Vision and Pattern Recognition, 1999 .
44.Rectification example
45.Basic stereo matching algorithm If necessary, rectify the two stereo images to transform epipolar lines into scanlines For each pixel x in the first image: find the corresponding epipolar scanline in the right image, search the scanline and pick the best match x', compute disparity x - x', and set depth(x) = fB/(x - x')
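The per-pixel scanline search above can be sketched as a brute-force window search with an SSD matching cost (the window size and disparity range are assumed parameters):

```python
import numpy as np

def stereo_match(left, right, max_disp, win=3):
    """Per-pixel disparity for rectified grayscale images: search the same
    scanline in the right image and pick the window with minimum SSD cost."""
    h, w = left.shape
    half = win // 2
    disp = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half + max_disp, w - half):
            ref = left[y - half:y + half + 1, x - half:x + half + 1]
            # Positive disparity d: the right-image match sits at x - d.
            costs = [np.sum((ref - right[y - half:y + half + 1,
                                         x - d - half:x - d + half + 1])**2)
                     for d in range(max_disp + 1)]
            disp[y, x] = np.argmin(costs)
    return disp
```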
46.Correspondence search (matching cost as a function of disparity along the left/right scanlines). Slide a window along the right scanline and compare contents of that window with the reference window in the left image. Matching cost: SSD or normalized correlation.
47.Correspondence search example: SSD matching cost along the left/right scanlines (figure).
48.Correspondence search example: normalized correlation along the left/right scanlines (figure).
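A hypothetical helper (not from the slides) showing why normalized correlation is often preferred over SSD: it discounts affine intensity differences (gain and bias) between the two windows:

```python
import numpy as np

def ncc(w1, w2):
    """Normalized correlation between two equal-size windows, in [-1, 1];
    higher is a better match, invariant to gain and bias changes."""
    a = w1 - w1.mean()
    b = w2 - w2.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom > 0 else 0.0

w = np.arange(9.).reshape(3, 3)
score = ncc(w, 2.0 * w + 5.0)  # perfect match despite gain 2 and bias 5
```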
49.Effect of window size (W = 3 vs. W = 20). Smaller window: more detail, but more noise. Larger window: smoother disparity maps, but less detail and failures near boundaries.
50.Failures of correspondence search Textureless surfaces Occlusions, repetition Non-Lambertian surfaces, specularities
51.Results with window search: window-based matching compared with ground truth on the test data (figure).
52.How can we improve window-based matching? So far, matches are independent for each point What constraints or priors can we add?
53.Stereo constraints/priors Uniqueness For any point in one image, there should be at most one matching point in the other image
54.Stereo constraints/priors Uniqueness For any point in one image, there should be at most one matching point in the other image Ordering Corresponding points should be in the same order in both views
55.Stereo constraints/priors Uniqueness For any point in one image, there should be at most one matching point in the other image Ordering Corresponding points should be in the same order in both views Ordering constraint doesn’t hold
56.Priors and constraints Uniqueness For any point in one image, there should be at most one matching point in the other image Ordering Corresponding points should be in the same order in both views Smoothness We expect disparity values to change slowly (for the most part)
57.Stereo matching as energy minimization: given images I1 and I2, find the disparity map D that minimizes an energy of the form E(D) = E_data(I1, I2, D) + λ E_smooth(D), where the data term compares windows W1(i) and W2(i + D(i)) and the smoothness term penalizes disparity changes between neighbors. Energy functions of this form can be minimized using graph cuts. Y. Boykov, O. Veksler, and R. Zabih, Fast Approximate Energy Minimization via Graph Cuts, PAMI 2001
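A per-scanline sketch of such an energy, simplified to pixelwise data terms (the slides use windows W1(i), W2(i + D(i))); the truncated-linear smoothness term and the λ value are illustrative assumptions, not from the slides:

```python
import numpy as np

def scanline_energy(left, right, D, lam=1.0, trunc=2.0):
    """E(D) = sum_i (I1(i) - I2(i + D(i)))^2              data term
            + lam * sum_i min(|D(i) - D(i+1)|, trunc)     smoothness prior
    Assumes i + D(i) stays inside the right scanline."""
    idx = np.arange(len(left))
    data = np.sum((left - right[idx + D]) ** 2)
    smooth = np.sum(np.minimum(np.abs(np.diff(D)), trunc))
    return data + lam * smooth
```

Minimizing this energy jointly over all pixels (rather than picking each match independently) is exactly what graph cuts do for the 2D version.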
58.Many of these constraints can be encoded in an energy function and solved using graph cuts. Results: graph cuts vs. ground truth (compare with the window-based result before). For the latest and greatest: http://www.middlebury.edu/stereo/ Y. Boykov, O. Veksler, and R. Zabih, Fast Approximate Energy Minimization via Graph Cuts, PAMI 2001
59.Things to remember Epipolar geometry Epipoles are intersections of the baseline with the image planes Matching point in the second image is on a line passing through its epipole Fundamental matrix maps from a point in one image to a line (its epipolar line) in the other Can solve for F given corresponding points (e.g., interest points) Can recover canonical camera matrices from F (with projective ambiguity) Stereo depth estimation Estimate disparity by finding corresponding points along scanlines Depth is inversely proportional to disparity
60.Next class: structure from motion