Following the lectures on interest point detection, feature tracking, and optical flow, this chapter covers fitting and alignment, i.e., finding the transformation parameters that best align matched points. Fitting means finding the parameters of a model that best fit the data; alignment means finding the parameters of the transformation that best align matched points. Methods for fitting and alignment include global optimization / parameter search, least-squares fitting, robust least squares, iterative closest point (ICP), hypothesize-and-test, the generalized Hough transform, and RANSAC.

1. Fitting and Alignment (Computer Vision). Jia-Bin Huang, Virginia Tech. Many slides from S. Lazebnik and D. Hoiem.

2. Administrative stuff. Homework grading policy: graduate students are graded out of 600 points; undergraduate students out of 525 points. For example, grad students need to complete on average 25 points of extra credit. HW 1 extra credit is due 11:59 PM Friday 9/23. Competition: edge detection (submission link, leaderboard). Anonymous feedback: lectures are too fast, too many slides.

3. Where are we? Interest points: find distinct and repeatable points in images (Harris → corners, DoG → blobs; SIFT → feature descriptor). Feature tracking and optical flow: find the motion of a keypoint/pixel over time (Lucas-Kanade: brightness consistency, small motion, spatial coherence; handle large motion with iterative updates + pyramid search). Fitting and alignment (this class): find the transformation parameters that best align matched points. Object instance recognition (next class): keypoint-based object instance recognition and search.

4. Review: Harris Corner Detector. Second moment matrix: $M = g(\sigma_I) * \begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix}$. 1. Compute image derivatives $I_x$, $I_y$. 2. Square of derivatives: $I_x^2$, $I_y^2$, $I_x I_y$. 3. Gaussian filter: $g(I_x^2)$, $g(I_y^2)$, $g(I_x I_y)$. 4. Cornerness function (large when both eigenvalues are strong): $har = \det(M) - \alpha\,[\operatorname{trace}(M)]^2 = g(I_x^2)\,g(I_y^2) - [g(I_x I_y)]^2 - \alpha\,[g(I_x^2) + g(I_y^2)]^2$. 5. Non-maxima suppression (optionally, blur first).
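
A condensed MATLAB sketch of these five steps. The derivative kernel, Gaussian size, and alpha = 0.05 are typical choices rather than values from the slide, and im is a stand-in image:

    % Harris cornerness for a grayscale double image im
    im = rand(64);                              % stand-in image for illustration
    dx = [-1 0 1];                              % 1. image derivatives
    Ix = conv2(im, dx, 'same');  Iy = conv2(im, dx', 'same');
    u = -3:3;  g1 = exp(-u.^2 / (2 * 1.5^2));   % build a Gaussian kernel by hand
    g1 = g1 / sum(g1);  g = g1' * g1;
    Sx2 = conv2(Ix.^2, g, 'same');              % 2.-3. smoothed squared derivatives
    Sy2 = conv2(Iy.^2, g, 'same');
    Sxy = conv2(Ix .* Iy, g, 'same');
    alpha = 0.05;                               % 4. cornerness function
    har = Sx2 .* Sy2 - Sxy.^2 - alpha * (Sx2 + Sy2).^2;
    % 5. non-maxima suppression would follow (compare each pixel to its local max)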

5. Review: Find local maxima in the position-scale space of the Difference-of-Gaussian. (Figure: DoG responses across scales $\sigma_1, \ldots, \sigma_5$; the detector returns a list of $(x, y, s)$.) K. Grauman, B. Leibe.

6. Review: SIFT Descriptor [Lowe, ICCV 1999]. Histogram of oriented gradients: captures important texture information; robust to small translations / affine deformations. K. Grauman, B. Leibe.

7. Review: Lucas-Kanade Tracker. Brightness consistency: $I(x, y, t) = I(x + u, y + v, t + 1)$. Small motion. Spatial coherence.

8. Dealing with larger movements: iterative refinement. 1. Initialize $(x', y') = (x, y)$ at the original $(x, y)$ position. 2. Compute $(u, v)$ from the Lucas-Kanade equations, using the second moment matrix for the feature patch in the first image. 3. Shift the window by $(u, v)$: $x' = x' + u$; $y' = y' + v$. 4. Recalculate $I_t = I(x', y', t+1) - I(x, y, t)$. 5. Repeat steps 2-4 until the change is small. Use interpolation for subpixel values.

9. Dealing with larger movements: coarse-to-fine registration. (Figure: Gaussian pyramids of image 1 at time $t$ and image 2 at time $t+1$; run iterative L-K at the coarsest level, upsample the estimated flow, and run iterative L-K again at each finer level.)

10. Fitting [ˈfidiNG]: find the parameters of a model that best fit the data. Alignment [əˈlīnmənt]: find the parameters of the transformation that best align matched points.

11. Fitting and alignment. Choose a parametric model to represent a set of features. Simple models: lines, circles. Complicated models: car, face shape. Source: K. Grauman.

12. Fitting and Alignment: design challenges. Design a suitable goodness-of-fit measure: the similarity should reflect application goals; encode robustness to outliers and noise. Design an optimization method: avoid local optima; find the best parameters quickly.

13. Fitting and Alignment: methods. Global optimization / search for parameters: least squares fit, robust least squares, iterative closest point (ICP). Hypothesize and test: generalized Hough transform, RANSAC.

14. Simple example: fitting a line.

15. Least squares line fitting. Data: $(x_1, y_1), \ldots, (x_n, y_n)$. Line equation: $y_i = m x_i + b$. Find $(m, b)$ to minimize $E = \sum_{i=1}^{n} (y_i - m x_i - b)^2$. Stacking the equations as $Ap = y$ with rows $[x_i \;\; 1]$ and $p = [m; b]$, Matlab: p = A \ y. Modified from S. Lazebnik.
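
To make this concrete, a minimal MATLAB sketch of the vertical least-squares fit on synthetic data (the slope 2.5, intercept 1.0, and noise level are made-up illustrative values):

    % Synthetic noisy line data (true m = 2.5, b = 1.0 are illustrative)
    n = 100;
    x = 10 * rand(n, 1);
    y = 2.5 * x + 1.0 + 0.3 * randn(n, 1);
    % Stack the equations y_i = m*x_i + b into A*p = y with p = [m; b]
    A = [x, ones(n, 1)];
    p = A \ y;                                % least-squares solution, as on the slide
    fprintf('m = %.3f, b = %.3f\n', p(1), p(2));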

16. Problem with "vertical" least squares: not rotation-invariant; fails completely for vertical lines. Slide from S. Lazebnik.

17. Total least squares. If $a^2 + b^2 = 1$, then the distance between point $(x_i, y_i)$ and the line $ax + by + c = 0$ is $|a x_i + b y_i + c|$. Unit normal: $N = (a, b)$. Proof: http://mathworld.wolfram.com/Point-LineDistance2-Dimensional.html. Slide modified from S. Lazebnik.

18. Total least squares. Find $(a, b, c)$ to minimize the sum of squared perpendicular distances $E = \sum_{i=1}^{n} (a x_i + b y_i + c)^2$ subject to $a^2 + b^2 = 1$. Slide modified from S. Lazebnik.

19. Total least squares. Find $(a, b, c)$ to minimize the sum of squared perpendicular distances. Setting $\partial E / \partial c = 0$ gives $c = -(a\bar{x} + b\bar{y})$; substituting back, $E = \|An\|^2$ where the rows of $A$ are $(x_i - \bar{x},\; y_i - \bar{y})$ and $n = (a, b)^T$. The solution is the eigenvector corresponding to the smallest eigenvalue of $A^T A$. See details on the Rayleigh quotient: http://en.wikipedia.org/wiki/Rayleigh_quotient. Slide modified from S. Lazebnik.
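
A companion MATLAB sketch of total least squares under these definitions, showing the centering step and the smallest-eigenvector solution (synthetic data as in the previous sketch):

    % Synthetic noisy line data (same illustrative values as above)
    n = 100;  x = 10 * rand(n, 1);  y = 2.5 * x + 1.0 + 0.3 * randn(n, 1);
    % Minimize sum (a*x + b*y + c)^2 subject to a^2 + b^2 = 1
    xm = mean(x);  ym = mean(y);
    A = [x - xm, y - ym];                     % centered data, one row per point
    [V, D] = eig(A' * A);                     % 2x2 symmetric eigenproblem
    [~, k] = min(diag(D));                    % pick the smallest eigenvalue
    a = V(1, k);  b = V(2, k);                % unit normal (a, b)
    c = -(a * xm + b * ym);
    fprintf('line: %.3f x + %.3f y + %.3f = 0\n', a, b, c);

Unlike the vertical fit, this form handles vertical lines (a = 1, b = 0) without trouble.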

20. Recap: two common optimization problems. Problem 1: minimize $\|Ax - b\|^2$; solution: $x = (A^T A)^{-1} A^T b$ (Matlab: x = A \ b). Problem 2: minimize $\|Ax\|^2$ subject to $\|x\| = 1$; solution: $x$ is the eigenvector of $A^T A$ with the smallest eigenvalue (Matlab: [V, D] = eig(A' * A); take the column of V for the smallest eigenvalue).

21. Least squares (global) optimization. Good: clearly specified objective; optimization is easy. Bad: may not be what you want to optimize; sensitive to outliers (bad matches, extra points); doesn't allow you to get multiple good fits (detecting multiple objects, lines, etc.).

22. Robust least squares (to deal with outliers). General approach: minimize $\sum_i \rho\big(u_i(x_i, \theta); \sigma\big)$, where $u_i(x_i, \theta)$ is the residual of the $i$-th point w.r.t. model parameters $\theta$, and $\rho$ is a robust function with scale parameter $\sigma$. The robust function $\rho$ favors a configuration with small residuals and gives a constant penalty for large residuals. Slide from S. Savarese.

23. Robust Estimator. 1. Initialize: e.g., choose $\theta$ by a least squares fit and set the scale $\sigma$ from the residuals (e.g., $\sigma = 1.5 \cdot \operatorname{median}(|u_i|)$). 2. Choose params to minimize $\sum_i \frac{u_i^2}{\sigma^2 + u_i^2}$, e.g., by numerical optimization. 3. Compute a new $\sigma$ from the current residuals. 4. Repeat (2) and (3) until convergence.
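
A rough MATLAB sketch of this loop for the line model above. The robust function shown is $\rho(u; \sigma) = u^2 / (\sigma^2 + u^2)$ and the $1.5 \cdot \operatorname{median}$ scale rule is one common choice; both are assumptions rather than requirements:

    % Synthetic line data contaminated with gross outliers
    n = 100;  x = 10 * rand(n, 1);  y = 2.5 * x + 1.0 + 0.3 * randn(n, 1);
    out = rand(n, 1) < 0.2;  y(out) = 30 * rand(sum(out), 1);
    A = [x, ones(n, 1)];
    resid = @(q) y - (q(1) * x + q(2));       % residuals for line y = m*x + b
    p = A \ y;                                % step 1: initialize by least squares
    for iter = 1:10
        sigma = 1.5 * median(abs(resid(p)));  % step 3: rescale from current residuals
        cost = @(q) sum(resid(q).^2 ./ (sigma^2 + resid(q).^2));
        p = fminsearch(cost, p);              % step 2: numerical optimization
    end
    fprintf('robust fit: m = %.3f, b = %.3f\n', p(1), p(2));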

24. Other ways to search for parameters (when no closed-form solution exists). Line search: for each parameter, step through values and choose the value that gives the best fit; repeat until no parameter changes. Grid search: propose several sets of parameters, evenly sampled in the joint set; choose the best (or top few), sample joint parameters around the current best, and repeat. Gradient descent: provide an initial position (e.g., random), then locally search for better parameters by following the gradient.

25. Hypothesize and test. 1. Propose parameters: try all possible; each point votes for all consistent parameters; or repeatedly sample enough points to solve for the parameters. 2. Score the given parameters: the number of consistent points, possibly weighted by distance. 3. Choose from among the set of parameters: the global or local maxima of the scores. 4. Possibly refine the parameters using the inliers.

26. Hough Transform: outline. 1. Create a grid of parameter values. 2. Each point votes for a set of parameters, incrementing those cells in the grid. 3. Find the maximum or local maxima in the grid.

27. Hough transform. Given a set of points, find the curve or line that best explains the data points. Under the model $y = mx + b$, each image point $(x_0, y_0)$ maps to the line $b = -x_0 m + y_0$ in $(m, b)$ Hough space. P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959. Slide from S. Savarese.

28. Hough transform. (Figure: sample points in $(x, y)$ image space and their accumulated vote counts in $(m, b)$ Hough space.) Slide from S. Savarese.

29. Hough transform. Issue: the parameter space $[m, b]$ is unbounded. Use a polar representation for the parameter space: $x \cos\theta + y \sin\theta = \rho$, so each image point votes along a sinusoid in $(\theta, \rho)$ space. P.V.C. Hough, Machine Analysis of Bubble Chamber Pictures, Proc. Int. Conf. High Energy Accelerators and Instrumentation, 1959. Slide from S. Savarese.
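
A compact MATLAB sketch of the $(\theta, \rho)$ voting scheme, using a hand-rolled accumulator rather than any toolbox function (the 1-degree and 1-unit bin sizes are arbitrary choices):

    % Synthetic edge points on a line, plus random clutter
    pts = [(1:50)', 0.5 * (1:50)' + 10];      % points on y = 0.5*x + 10
    pts = [pts; 60 * rand(20, 2)];            % clutter
    thetas = (0:179) * pi / 180;              % 1-degree theta bins
    rmax = ceil(max(sqrt(sum(pts.^2, 2))));
    rhos = -rmax:rmax;                        % 1-unit rho bins
    H = zeros(numel(rhos), numel(thetas));    % accumulator grid
    for i = 1:size(pts, 1)
        r = pts(i, 1) * cos(thetas) + pts(i, 2) * sin(thetas);
        ri = round(r) + rmax + 1;             % map rho value to bin index
        for j = 1:numel(thetas)
            H(ri(j), j) = H(ri(j), j) + 1;    % one vote per (theta, rho)
        end
    end
    [~, k] = max(H(:));                       % strongest line wins
    [ri, ti] = ind2sub(size(H), k);
    fprintf('theta = %d deg, rho = %d\n', ti - 1, rhos(ri));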

30. Hough transform: experiments. (Figure: input features and the resulting votes.) Slide from S. Savarese.

31. Hough transform: experiments, noisy data. (Figure: features and votes.) Need to adjust the grid size or smooth. Slide from S. Savarese.

32. Hough transform: experiments. Issue: spurious peaks due to uniform noise. (Figure: features and votes.) Slide from S. Savarese.

33. 1. Image → Canny

34. 2. Canny → Hough votes

35. 3. Hough votes → Edges: find peaks and post-process.

36. Hough transform example. http://ostatic.com/files/images/ss_hough.jpg

37. Hough transform for circles, for a fixed radius $r$. Circle: center $(a, b)$ and radius $r$, i.e., $(x - a)^2 + (y - b)^2 = r^2$. (Figure: image space vs. Hough space.) Equation of a circle? Equation of the set of circles that all pass through a point? Adapted by Devi Parikh from Kristen Grauman.

38. Hough transform for circles, for a fixed radius $r$. (Figure: each image point votes for a circle of candidate centers in $(a, b)$ Hough space; the intersection collects the most votes for the true center.) Kristen Grauman.

39. Hough transform for circles, for an unknown radius $r$. The Hough space becomes three-dimensional: $(a, b, r)$. (Figure: image space vs. Hough space.) Kristen Grauman.

40. Hough transform for circles, for an unknown radius $r$. (Figure: each image point votes along a cone surface in $(a, b, r)$ Hough space.) Kristen Grauman.

41. Hough transform for circles, for an unknown radius $r$ but known gradient direction $\theta$. Each image point votes only along its gradient direction rather than over the entire circle of candidate centers. (Figure: image space vs. Hough space.) Kristen Grauman.
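
A MATLAB sketch of the fixed-radius case, voting for centers $(a, b)$ (the radius, accumulator size, and sampling density are illustrative):

    % Synthetic edge points on a circle of radius 20 centered at (100, 80)
    t = linspace(0, 2 * pi, 100)';
    edge_pts = [100 + 20 * cos(t), 80 + 20 * sin(t)];
    r = 20;  W = 200;                          % known radius, accumulator size
    Hc = zeros(W, W);                          % votes over centers (a, b)
    phis = linspace(0, 2 * pi, 60);
    for i = 1:size(edge_pts, 1)
        a = round(edge_pts(i, 1) - r * cos(phis));   % candidate centers
        b = round(edge_pts(i, 2) - r * sin(phis));
        ok = a >= 1 & a <= W & b >= 1 & b <= W;
        for k = find(ok)
            Hc(b(k), a(k)) = Hc(b(k), a(k)) + 1;     % vote
        end
    end
    [~, m] = max(Hc(:));
    [bc, ac] = ind2sub(size(Hc), m);
    fprintf('center near (%d, %d)\n', ac, bc);       % expect (100, 80)

With an unknown radius, the same loop runs over a third accumulator dimension for r; with a known gradient direction, the inner loop over phis collapses to a single direction.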

42. Example: detecting circles with Hough. (Figure: original image, edge map, and votes for the penny radius.) Note: a different Hough transform (with separate accumulators) was used for each circle radius (quarters vs. penny). Slide credit: Kristen Grauman.

43. Example: detecting circles with Hough. (Figure: original image, edge map, votes for the quarter radius, and combined detections.) Coin-finding sample images from Vivek Kwatra. Slide credit: Kristen Grauman.

44. Generalized Hough for object detection. Instead of indexing displacements by gradient orientation, index them by matched local patterns. (Figure: training image; each "visual codeword" stores displacement vectors to the object center.) B. Leibe, A. Leonardis, and B. Schiele, Combined Object Categorization and Segmentation with an Implicit Shape Model, ECCV Workshop on Statistical Learning in Computer Vision, 2004. Source: L. Lazebnik.

45. Generalized Hough for object detection. Instead of indexing displacements by gradient orientation, index by "visual codeword". (Figure: test image; matched codewords vote for the object center via their stored displacements.) B. Leibe, A. Leonardis, and B. Schiele, Combined Object Categorization and Segmentation with an Implicit Shape Model, ECCV Workshop on Statistical Learning in Computer Vision, 2004. Source: L. Lazebnik.

46. Hough transform conclusions. Good: robust to outliers (each point votes separately); fairly efficient (much faster than trying all sets of parameters); provides multiple good fits. Bad: some sensitivity to noise; bin size trades off between noise tolerance, precision, and speed/memory, and it can be hard to find the sweet spot; not suitable for more than a few parameters, since grid size grows exponentially. Common applications: line fitting (also circles, ellipses, etc.); object instance recognition (parameters are position/scale/orientation); object category recognition (parameters are position/scale).

47. RANSAC (RANdom SAmple Consensus), Fischler & Bolles, 1981. Algorithm: 1. Sample (randomly) the number of points required to fit the model. 2. Solve for model parameters using the samples. 3. Score by the fraction of inliers within a preset threshold of the model. Repeat 1-3 until the best model is found with high confidence.

48-51. Line fitting example (illustrations by Savarese): the same algorithm with sample size #=2 for a line; each iteration samples two points, solves for the line through them, and scores it by its fraction of inliers.
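
A self-contained MATLAB sketch of RANSAC line fitting as outlined above (the threshold 0.5, 200 iterations, and 30% outlier rate are illustrative choices):

    % Line data with roughly 30% gross outliers
    n = 100;  x = 10 * rand(n, 1);
    y = 2.5 * x + 1.0 + 0.2 * randn(n, 1);
    out = rand(n, 1) < 0.3;  y(out) = 30 * rand(sum(out), 1);
    t = 0.5;  best_inl = false(n, 1);
    for iter = 1:200
        k = randperm(n, 2);                          % 1. sample 2 points
        m = (y(k(2)) - y(k(1))) / (x(k(2)) - x(k(1)));
        b = y(k(1)) - m * x(k(1));                   % 2. solve for (m, b)
        inl = abs(y - (m * x + b)) < t;              % 3. score by inlier count
        if sum(inl) > sum(best_inl), best_inl = inl; end
    end
    A = [x(best_inl), ones(sum(best_inl), 1)];       % refine using the inliers
    p = A \ y(best_inl);
    fprintf('RANSAC fit: m = %.3f, b = %.3f\n', p(1), p(2));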

52. How to choose parameters?
Number of samples N: choose N so that, with probability p, at least one random sample is free from outliers (e.g., p = 0.99), given outlier ratio e: $N = \log(1 - p) / \log\big(1 - (1 - e)^s\big)$.
Number of sampled points s: the minimum number needed to fit the model.
Distance threshold δ: choose δ so that a good point with noise is likely (e.g., with probability 0.95) within the threshold; for zero-mean Gaussian noise with std. dev. σ, $t^2 = 3.84\,\sigma^2$.

Required N for p = 0.99, by sample size s and proportion of outliers e:

    s \ e    5%   10%   20%   25%   30%   40%    50%
    2         2     3     5     6     7    11     17
    3         3     4     7     9    11    19     35
    4         3     5     9    13    17    34     72
    5         4     6    12    17    26    57    146
    6         4     7    16    24    37    97    293
    7         4     8    20    33    54   163    588
    8         5     9    26    44    78   272   1177

Modified from M. Pollefeys.
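
The table follows directly from the sample-count formula; a MATLAB one-liner reproduces any entry:

    % Number of RANSAC samples for confidence p, outlier ratio e, sample size s
    p = 0.99;  e = 0.30;  s = 4;
    N = ceil(log(1 - p) / log(1 - (1 - e)^s));
    fprintf('N = %d\n', N);                    % 17, matching the table entry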

53. RANSAC conclusions. Good: robust to outliers; applicable for a larger number of objective-function parameters than the Hough transform; optimization parameters are easier to choose than for the Hough transform. Bad: computation time grows quickly with the fraction of outliers and the number of parameters; not as good for getting multiple fits (though one solution is to remove inliers after each fit and repeat). Common applications: computing a homography (e.g., image stitching); estimating the fundamental matrix (relating two views).

54. RANSAC Song

55. What if you want to align but have no prior matched pairs? The Hough transform and RANSAC are not applicable. Important applications: medical imaging (match brain scans or contours); robotics (match point clouds).

56. Iterative Closest Points (ICP) Algorithm. Goal: estimate the transform between two dense sets of points. 1. Initialize the transformation (e.g., compute the difference in means and scale). 2. Assign each point in {Set 1} to its nearest neighbor in {Set 2}. 3. Estimate transformation parameters, e.g., by least squares or robust least squares. 4. Transform the points in {Set 1} using the estimated parameters. 5. Repeat steps 2-4 until the change is very small.
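
A minimal MATLAB sketch of this loop for the translation-only case, with brute-force nearest neighbors and synthetic outlier-free data (a real implementation would substitute a k-d tree and a richer transform):

    % ICP, translation only: align source P (Set 1) to target Q (Set 2)
    Q = 10 * rand(50, 2);                      % target points
    P = Q - repmat([3.0, -2.0], 50, 1);        % source = target shifted by (3, -2)
    t = mean(Q) - mean(P);                     % 1. initialize from means
    nn = zeros(size(P, 1), 1);
    for iter = 1:20
        Pt = P + repmat(t, size(P, 1), 1);
        for i = 1:size(Pt, 1)                  % 2. nearest neighbor in Q
            d = sum((Q - repmat(Pt(i, :), size(Q, 1), 1)).^2, 2);
            [~, nn(i)] = min(d);
        end
        t_new = mean(Q(nn, :) - P, 1);         % 3. least-squares translation
        if norm(t_new - t) < 1e-9, break; end  % 5. stop when the change is tiny
        t = t_new;                             % 4. apply the update
    end
    fprintf('estimated translation: (%.2f, %.2f)\n', t(1), t(2));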

57. Example: solving for translation. Given matched points in {A} and {B} (pairs $A_1 \leftrightarrow B_1$, $A_2 \leftrightarrow B_2$, $A_3 \leftrightarrow B_3$ in the figure), estimate the translation of the object: $x_i^B = x_i^A + t_x$, $y_i^B = y_i^A + t_y$.

58. Example: solving for translation, least squares solution. 1. Write down the objective function: $E = \sum_i \big[(x_i^A + t_x - x_i^B)^2 + (y_i^A + t_y - y_i^B)^2\big]$. 2. Derived solution: compute the derivative, set it to zero, and solve; for translation this gives $t_x = \frac{1}{n}\sum_i (x_i^B - x_i^A)$ and $t_y = \frac{1}{n}\sum_i (y_i^B - y_i^A)$. 3. Computational solution: write in the form $Ax = b$ and solve using the pseudo-inverse (or eigenvalue decomposition for homogeneous systems).
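
A tiny MATLAB sketch of the computational route, stacking the translation model into Ax = b (three made-up correspondences):

    % Solve for (tx, ty) from matched points: B_pts = A_pts + [tx, ty]
    A_pts = [1 2; 4 1; 3 5];
    B_pts = A_pts + repmat([2.0, -1.0], 3, 1);
    n = size(A_pts, 1);
    A = [repmat([1 0], n, 1); repmat([0 1], n, 1)];   % one row per equation
    b = [B_pts(:, 1) - A_pts(:, 1); B_pts(:, 2) - A_pts(:, 2)];
    tvec = A \ b;                                     % pseudo-inverse solution
    fprintf('tx = %.2f, ty = %.2f\n', tvec(1), tvec(2));

The result equals the mean of the coordinate differences, matching the derived solution above.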

59. Example: solving for translation, RANSAC solution. Problem: outliers among the matches (e.g., $A_4 \leftrightarrow B_4$, $A_5 \leftrightarrow B_5$ in the figure). 1. Sample a set of matching points (one pair). 2. Solve for the transformation parameters. 3. Score the parameters with the number of inliers. 4. Repeat steps 1-3 N times.

60. Example: solving for translation, Hough transform solution. Problem: outliers, multiple objects, and/or many-to-one matches. 1. Initialize a grid of parameter values. 2. Each matched pair casts a vote for consistent values. 3. Find the parameters with the most votes. 4. Solve using least squares with the inliers.

61. Example: solving for translation, ICP solution. Problem: no initial guesses for correspondence. 1. Find the nearest neighbor for each point. 2. Compute the transform using the matches. 3. Move the points using the transform. 4. Repeat steps 1-3 until convergence.

62. HW 2: Feature tracker. Keypoint detection: compute the second moment matrix, apply the Harris corner criterion, threshold, and perform non-maximum suppression. Tracking: Kanade-Lucas-Tomasi tracking; show keypoint trajectories; show points that have moved out of frame.

63. HW 2: Shape Alignment. Global transformation (similarity, affine, perspective); iterative closest point algorithm.

64. HW 2: Local Feature Matching. Express the location of the detected object; implement the ratio-test feature matching algorithm.

65. Things to remember. Least squares fit: closed-form solution; robust to noise; not robust to outliers. Robust least squares: improves robustness to outliers; requires iterative optimization. Hough transform: robust to noise and outliers; can fit multiple models; only works for a few parameters (1-4 typically). RANSAC: robust to noise and outliers; works with a moderate number of parameters (e.g., 1-8). Iterative Closest Point (ICP): for local alignment only; does not require initial correspondences.

66. Next week: object instance recognition.