
Computer Vision: 3. Fitting

Mohammed Hachama

April 02, 2020

Transcript

  1. Master Mathematical Analysis and Applications, Course M1 - S2, Computer Vision

    Fitting and alignment, Week 5. Mohammed Hachama, [email protected], http://hachama.github.io/home/, University of Khemis Miliana, 2020.
  2. SVD and QR decomposition
  3. Factorization: SVD

    • Any matrix X can be decomposed as X_{n×p} = U_{n×n} Σ_{n×p} V^T_{p×p},
    • U, V are orthogonal and σ_1 ≥ ... ≥ σ_r are the singular values.
    • Σ is the n×p matrix whose leading diagonal entries are σ_1, ..., σ_r and whose remaining entries are zero, i.e. Σ = diag(σ_1, ..., σ_r) padded to size n×p.
    • Columns of V are normalized eigenvectors of X^T X.
    • Columns of U are normalized eigenvectors of X X^T.
    • u_j and v_j share the same eigenvalue λ_j, where σ_j = √λ_j.
    • Every matrix X of rank r has exactly r nonzero singular values.
  4. Factorization: SVD

    • Full and compact (reduced) SVD
      • Full SVD: X = U Σ V^T
      • Compact SVD: X = U_1 Σ_1 V_1^T, i.e., X = Σ_{i=1}^{r} σ_i u_i v_i^T.
  5. Computing the compact SVD

    procedure compact_SVD(A):
      λ, V ← eig(A^H A)       ▷ eigenvalues and eigenvectors of A^H A
      σ ← √λ                  ▷ singular values of A
      σ ← sort(σ)             ▷ sort the singular values from greatest to least
      V ← sort(V)             ▷ sort the eigenvectors the same way
      r ← count(σ ≠ 0)        ▷ number of nonzero singular values (the rank of A)
      σ_1 ← σ_{:r}            ▷ keep only the positive singular values
      V_1 ← V_{:,:r}          ▷ keep only the corresponding eigenvectors
      U_1 ← A V_1 / σ_1       ▷ construct U with array broadcasting
      return U_1, σ_1, V_1^H
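The procedure above is pseudocode; as a rough NumPy sketch (an addition, not from the slides), the following implements the same steps, using eigh on A^H A and a small tolerance `tol` (an assumed parameter) to decide which singular values count as nonzero.

```python
import numpy as np

def compact_svd(A, tol=1e-10):
    """Compact SVD of A following the slide's procedure (illustrative sketch)."""
    # Eigen-decomposition of A^H A (Hermitian, so eigh applies).
    lam, V = np.linalg.eigh(A.conj().T @ A)
    sigma = np.sqrt(np.clip(lam, 0.0, None))      # singular values of A
    order = np.argsort(sigma)[::-1]               # sort from greatest to least
    sigma, V = sigma[order], V[:, order]
    r = np.sum(sigma > tol)                       # numerical rank of A
    sigma1, V1 = sigma[:r], V[:, :r]              # keep the positive singular values
    U1 = A @ V1 / sigma1                          # columns u_i = A v_i / sigma_i
    return U1, sigma1, V1.conj().T

# Quick check against NumPy's own SVD.
A = np.random.rand(5, 3)
U1, s1, V1h = compact_svd(A)
print(np.allclose(U1 * s1 @ V1h, A))              # True up to rounding
```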
  6. Factorization: SVD

    • Intuition: X can be seen as a linear transformation. This transformation can be decomposed into three sub-transformations: 1. rotation, 2. re-scaling, 3. rotation. These three steps correspond to the three factors U, Σ, and V^T (applied in the order V^T, then Σ, then U).
  7. Applications of SVD: 1. Pseudoinverse

    • The pseudoinverse of a rectangular matrix X is X^+ = V diag(σ_1^{-1}, ..., σ_p^{-1}) U^T.
    • If X has linearly independent columns (X^T X is invertible): X^+ = (X^T X)^{-1} X^T. This is a left inverse, as X^+ X = I.
    • If X has linearly independent rows (X X^T is invertible): X^+ = X^T (X X^T)^{-1}. This is a right inverse, as X X^+ = I.
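For illustration (not part of the slides), the snippet below builds X^+ from the reduced SVD and checks it against NumPy's pinv and the left-inverse formula; the 6×3 test matrix is arbitrary.

```python
import numpy as np

X = np.random.rand(6, 3)                      # tall matrix with independent columns

# Pseudoinverse from the reduced SVD: X^+ = V diag(1/sigma) U^T.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

print(np.allclose(X_pinv, np.linalg.pinv(X)))             # matches NumPy's pinv
print(np.allclose(X_pinv @ X, np.eye(3)))                 # left inverse: X^+ X = I
print(np.allclose(X_pinv, np.linalg.inv(X.T @ X) @ X.T))  # (X^T X)^{-1} X^T
```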
  8. Applications of SVD: 2. Low-Rank Matrix Approximations

    • If A is an m × n matrix of rank r < min{m, n}, store the matrices U_1, Σ_1 and V_1 instead of A.
      • Storing A: mn values.
      • Storing U_1, Σ_1 and V_1: mr + r + nr values.
    • Example: if A is 100 × 200 and has rank 20:
      • Storing A: 20,000 values.
      • Storing U_1, Σ_1 and V_1: 6,020 entries.
  9. Applications of SVD: 2. Low-Rank Matrix Approximations

    • The truncated SVD keeps only the first s < r singular values, plus the corresponding columns of U and V: A_s = Σ_{i=1}^{s} σ_i u_i v_i^H. The resulting matrix A_s has rank s and is only an approximation to A, since r − s nonzero singular values are neglected.
  10. Applications of SVD: 2. Low-Rank Matrix Approximations - Image compression
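As a hedged sketch of the compression idea (added here, not in the original deck), the function below keeps only s singular values of a 2-D grayscale array; the synthetic image and the choice s = 20 are assumptions made for the demo.

```python
import numpy as np

def compress_image(img, s):
    """Rank-s approximation of a 2-D grayscale image via truncated SVD."""
    U, sigma, Vt = np.linalg.svd(img.astype(float), full_matrices=False)
    # Keep the s largest singular values and the corresponding vectors.
    approx = U[:, :s] * sigma[:s] @ Vt[:s, :]
    m, n = img.shape
    stored = m * s + s + n * s            # values kept instead of m*n
    return approx, stored / (m * n)       # approximation and storage ratio

# Example with a synthetic "image" of size 100 x 200.
img = np.outer(np.linspace(0, 1, 100), np.linspace(0, 1, 200)) + 0.01 * np.random.rand(100, 200)
approx, ratio = compress_image(img, s=20)
print(ratio, np.linalg.norm(img - approx) / np.linalg.norm(img))
```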
  11. QR Factorization

    • Any matrix A_{n×p} with linearly independent columns admits a unique decomposition A_{n×p} = Q_{n×p} R_{p×p}, where Q^T Q = I and R is upper triangular.
    • QR is a useful factorization when n > p.
    • Full QR decomposition: A_{m×n} = Q_{m×m} [R; 0]_{m×n} = [Q_1 | Q_2] [R; 0], where [R; 0] denotes R stacked on top of a zero block and Q = [Q_1 | Q_2].
    • Reduced QR decomposition: A_{m×n} = (Q_1)_{m×n} R_{n×n}.
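A quick NumPy illustration (an addition, not from the slides) of the reduced and full forms on an arbitrary tall matrix:

```python
import numpy as np

A = np.random.rand(6, 3)                    # tall matrix, m > n

# Reduced QR: Q1 is 6x3 with orthonormal columns, R is 3x3 upper triangular.
Q1, R = np.linalg.qr(A, mode='reduced')
print(Q1.shape, R.shape, np.allclose(Q1 @ R, A))

# Full QR: Q is 6x6 orthogonal, R_full is 6x3 (R stacked on a zero block).
Q, R_full = np.linalg.qr(A, mode='complete')
print(Q.shape, R_full.shape, np.allclose(Q @ R_full, A))
```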
  12. Least squares
  13. Least squares

    • Fitting a line to a point cloud
      • Data: (x_1, y_1), ..., (x_n, y_n)
      • Line equation: y = ax + b
      • Find (a, b) to minimize E(a, b) = Σ_{i=1}^{n} (y_i − a x_i − b)^2
  14. Least squares

    • Fitting a line to a point cloud:
      E(a, b) = Σ_{i=1}^{n} (y_i − a x_i − b)^2 = ||Ac − d||^2, with
      A = [x_1 1; ...; x_n 1],  c = (a, b)^T,  d = (y_1, ..., y_n)^T.
    • Cons:
      • Fails for vertical lines
      • Not rotation-invariant
  15. Least squares

    • Minimize the function min_{c∈R^n} E(c), where E(c) = ||Ac − d||^2.
    • The problem always has a solution (A of size m × n).
    • The solution is unique ⇔ rank(A) = n.
    • Resolution:
      • Normal equations
      • Orthogonal methods
      • SVD
  16. Least squares

    • Minimize the function min_{c∈R^n} E(c), where E(c) = ||Ac − d||^2.
    • Euler-Lagrange condition: ∂E/∂c = 2(A^T A)c − 2 A^T d = 0.
    • Normal equations: (A^T A) c = A^T d.
    • (A^T A) c = A^T d  ⟹  c = A^+ d, when A has linearly independent columns and where A^+ = (A^T A)^{-1} A^T.
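As an illustrative sketch (not from the slides), the snippet below fits y = ax + b to synthetic noisy points via the normal equations and cross-checks the result with np.linalg.lstsq; the data and noise level are made up for the demo.

```python
import numpy as np

# Synthetic noisy points roughly on y = 2x + 1.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, 50)

A = np.column_stack([x, np.ones_like(x)])   # rows (x_i, 1)
d = y

# Normal equations: (A^T A) c = A^T d.
c = np.linalg.solve(A.T @ A, A.T @ d)
print("a, b =", c)

# Cross-check with NumPy's least-squares solver.
c_ref, *_ = np.linalg.lstsq(A, d, rcond=None)
print(np.allclose(c, c_ref))
```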
  17. Least squares

    • Solution using the QR factorization A = QR:
      ĉ = (A^T A)^{-1} A^T d
        = ((QR)^T (QR))^{-1} (QR)^T d
        = (R^T Q^T Q R)^{-1} R^T Q^T d
        = (R^T R)^{-1} R^T Q^T d
        = R^{-1} R^{-T} R^T Q^T d
        = R^{-1} Q^T d
    • Algorithm:
      1. Compute the QR factorization A = QR.
      2. Matrix-vector product x = Q^T d.
      3. Solve Rc = x by back substitution.
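A minimal sketch of these three steps (added for illustration; the slides do not prescribe a library), using np.linalg.qr and SciPy's solve_triangular for the back substitution, on the same synthetic line-fitting data as in the previous snippet:

```python
import numpy as np
from scipy.linalg import solve_triangular

# Same line-fitting setup: rows of A are (x_i, 1), d holds the y_i.
rng = np.random.default_rng(0)
x = rng.uniform(0, 10, 50)
d = 2.0 * x + 1.0 + rng.normal(0, 0.5, 50)
A = np.column_stack([x, np.ones_like(x)])

Q, R = np.linalg.qr(A, mode='reduced')   # step 1: QR factorization
z = Q.T @ d                              # step 2: matrix-vector product
c = solve_triangular(R, z, lower=False)  # step 3: back substitution on Rc = z
print("a, b =", c)
```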
  18. Orthogonal least squares

    • Fitting a line ax + by = d to a point cloud:
      E(a, b, d) = Σ_{i=1}^{n} (a x_i + b y_i − d)^2
      ∂E/∂d = Σ_{i=1}^{n} −2(a x_i + b y_i − d) = 0  ⟹  d = a x̄ + b ȳ.
  19. Orthogonal least squares

    • Fitting a line to a point cloud. With x̄ = (1/n) Σ_{i=1}^{n} x_i and ȳ = (1/n) Σ_{i=1}^{n} y_i,
      E = Σ_{i=1}^{n} (a(x_i − x̄) + b(y_i − ȳ))^2 = (UN)^T (UN),
      where U is the n×2 matrix with rows (x_i − x̄, y_i − ȳ) and N = (a, b)^T.
    • Minimizing E subject to ||N||^2 = 1: N is the eigenvector of U^T U associated with the smallest eigenvalue.¹
    ¹ Proof: exercise.
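An illustrative NumPy sketch of this recipe (not from the slides): center the points, take the eigenvector of U^T U associated with the smallest eigenvalue as the normal N = (a, b), and recover d = a·x̄ + b·ȳ. The near-vertical synthetic data is an assumption chosen to show a case where ordinary least squares fails.

```python
import numpy as np

# Synthetic points near the vertical line x = 3.
rng = np.random.default_rng(1)
y = rng.uniform(0, 10, 100)
x = 3.0 + rng.normal(0, 0.1, 100)

# Center the data and build U with rows (x_i - x_bar, y_i - y_bar).
x_bar, y_bar = x.mean(), y.mean()
U = np.column_stack([x - x_bar, y - y_bar])

# Normal N = (a, b): eigenvector of U^T U for the smallest eigenvalue.
eigvals, eigvecs = np.linalg.eigh(U.T @ U)
a, b = eigvecs[:, 0]                     # eigh returns eigenvalues in ascending order
d = a * x_bar + b * y_bar                # fitted line: a*x + b*y = d
print(f"{a:.3f} x + {b:.3f} y = {d:.3f}")
```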
  20. RANSAC
  21. Robust estimation

    • Outliers are points that don't "fit" the model.
  24. Algorithm

    • RANSAC, or "RANdom SAmple Consensus", is an iterative method to robustly estimate the parameters of a mathematical model from a set of observed data which contains outliers.
    • Until N iterations have occurred:
      • Draw a random sample of S points from the data
      • Fit the model to that set of S points
      • Classify data points as outliers or inliers
      • Refit the model to the inliers while ignoring the outliers
    • Use the best fit from this collection, using the fitting error as a criterion (see the sketch below).
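A minimal RANSAC line-fitting sketch (an illustration, not the course's reference implementation); the iteration count, the sample size S = 2, the inlier threshold, and the selection rule (more inliers first, then lower refit error) are assumptions layered on the outline above.

```python
import numpy as np

def ransac_line(x, y, n_iters=100, thresh=0.5, rng=None):
    """Robustly fit y = a*x + b with a basic RANSAC loop (illustrative sketch)."""
    rng = np.random.default_rng() if rng is None else rng
    A_all = np.column_stack([x, np.ones_like(x)])
    best_score, best_model = None, None
    for _ in range(n_iters):
        # Draw a minimal random sample of S = 2 points and fit a candidate line.
        i, j = rng.choice(len(x), size=2, replace=False)
        if x[i] == x[j]:
            continue
        a = (y[j] - y[i]) / (x[j] - x[i])
        b = y[i] - a * x[i]
        # Classify points as inliers/outliers by their residual to the candidate.
        inliers = np.abs(y - (a * x + b)) < thresh
        if inliers.sum() < 2:
            continue
        # Refit the model to the inliers only.
        c, *_ = np.linalg.lstsq(A_all[inliers], y[inliers], rcond=None)
        err = np.mean((y[inliers] - A_all[inliers] @ c) ** 2)
        # Keep the best fit: more inliers first, then lower fitting error.
        score = (inliers.sum(), -err)
        if best_score is None or score > best_score:
            best_score, best_model = score, c
    return best_model

# Inlier points on y = 2x + 1 plus some gross outliers.
rng = np.random.default_rng(2)
x = rng.uniform(0, 10, 100)
y = 2 * x + 1 + rng.normal(0, 0.2, 100)
y[:20] = rng.uniform(0, 30, 20)        # contaminate 20 points
print(ransac_line(x, y, rng=rng))      # should be close to (2, 1)
```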
  25. Example: Line fitting
  35. Example: Compute a mapping between two images

    • Several hundred key points are extracted from each image; the goal is to match them and compute the transformation which minimises a given criterion.
    • There may be outliers (incorrect matches) which will corrupt the estimation.
    • RANSAC: choose a subset of the points from one image, match these to the other image and compute the transformation which minimises the re-projection error. Choose another subset of the points, match them and compute another transformation. Repeat this a number of times, using a different subset of key points each time. Then select the transformation which has the minimum re-projection error.
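As one way to run this in practice (an assumption; the slides do not prescribe a tool), OpenCV's findHomography can perform the RANSAC loop internally on matched key points; the file names, the ORB detector, and the 5-pixel re-projection threshold below are placeholders.

```python
import cv2
import numpy as np

# Load the two images to align (paths are placeholders).
img1 = cv2.imread('image1.png', cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread('image2.png', cv2.IMREAD_GRAYSCALE)

# Detect and describe key points (ORB chosen here for simplicity).
orb = cv2.ORB_create(nfeatures=500)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors; some matches will inevitably be outliers.
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)
src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# Estimate the homography robustly with RANSAC (re-projection threshold in pixels).
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
print(H, int(inlier_mask.sum()), "inliers out of", len(matches))
```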
  36. Template matching
  37.

    • Source image (I): the image in which we expect to find a match to the template image.
    • Template image (T): the patch which is compared against the source image to detect the best matching area.
  38.

    • Compare the template against the source image by sliding it:
      • move the patch one pixel at a time;
      • at each location, compute a similarity metric.
  39.

    • Store the metric at each location in the result matrix (image) R. The best match is then the extremum of R (a minimum for the squared-difference metrics below, a maximum for the correlation metrics).
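A naive sketch of how R is filled with the plain sum-of-squared-differences metric (added for illustration; a real implementation would use an optimized library routine); I and T are assumed to be 2-D grayscale arrays.

```python
import numpy as np

def match_template_ssd(I, T):
    """Result matrix R(y, x) = sum over the patch of (T - I_patch)^2 (naive sketch)."""
    h, w = T.shape
    H, W = I.shape
    R = np.empty((H - h + 1, W - w + 1))
    for y in range(R.shape[0]):
        for x in range(R.shape[1]):
            patch = I[y:y + h, x:x + w]
            R[y, x] = np.sum((T.astype(float) - patch) ** 2)
    return R

# Toy example: the template is an exact crop of the source.
I = np.random.rand(40, 60)
T = I[10:20, 25:40].copy()
R = match_template_ssd(I, T)
print(np.unravel_index(np.argmin(R), R.shape))   # best match at (row, col) = (10, 25)
```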
  40.

    • Metrics
      • Sum of squared differences:
        R(x, y) = Σ_{x',y'} (T(x', y') − I(x + x', y + y'))^2
      • Normalized sum of squared differences:
        R(x, y) = Σ_{x',y'} (T(x', y') − I(x + x', y + y'))^2 / sqrt( Σ_{x',y'} T(x', y')^2 · Σ_{x',y'} I(x + x', y + y')^2 )
      • Cross-correlation:
        R(x, y) = Σ_{x',y'} T(x', y') · I(x + x', y + y')
      • Normalized cross-correlation:
        R(x, y) = Σ_{x',y'} T(x', y') · I(x + x', y + y') / sqrt( Σ_{x',y'} T(x', y')^2 · Σ_{x',y'} I(x + x', y + y')^2 )
  41.

    • Metrics
      • Correlation coefficient:
        R(x, y) = Σ_{x',y'} T'(x', y') · I'(x + x', y + y'),
        where T'(x', y') = T(x', y') − (1/(w·h)) Σ_{x'',y''} T(x'', y'')
        and I'(x + x', y + y') = I(x + x', y + y') − (1/(w·h)) Σ_{x'',y''} I(x + x'', y + y'').
      • Normalized correlation coefficient:
        R(x, y) = Σ_{x',y'} T'(x', y') · I'(x + x', y + y') / sqrt( Σ_{x',y'} T'(x', y')^2 · Σ_{x',y'} I'(x + x', y + y')^2 )
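These metrics correspond to the modes of OpenCV's matchTemplate; using OpenCV here is an assumption rather than something the slides require, and the image paths below are placeholders.

```python
import cv2

# Placeholder paths; substitute the actual source image and template patch.
I = cv2.imread('source.png', cv2.IMREAD_GRAYSCALE)
T = cv2.imread('template.png', cv2.IMREAD_GRAYSCALE)

for name, method in [('SQDIFF_NORMED', cv2.TM_SQDIFF_NORMED),
                     ('CCORR_NORMED', cv2.TM_CCORR_NORMED),
                     ('CCOEFF_NORMED', cv2.TM_CCOEFF_NORMED)]:
    R = cv2.matchTemplate(I, T, method)                  # result matrix R
    min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(R)
    # For the squared-difference metric the best match is the minimum of R,
    # for the correlation metrics it is the maximum.
    best = min_loc if method == cv2.TM_SQDIFF_NORMED else max_loc
    print(name, best)
```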