Slide 1

Computer Vision: Introduction (Week 1)
NHSM - 4th year - Spring 2025
Prof. Mohammed Hachama ([email protected])
http://hachama.github.io/home/

Slide 2

Outline

• Introduction
  • Computer vision
  • Some applications
  • Organization of the course
• Fitting and alignment
  • Matrix decomposition
  • Least squares
  • RANSAC
  • Template matching

Slide 3

Introduction

Slide 5

Extract "information" from pixels

Figure 1: What we see.
Figure 2: What the computer sees.

Slide 6

Extract "information" from pixels

Figure 1: Input image.

Slide 7

Extract "information" from pixels

Figure 1: Input image.
Figure 2: Geometric information.

Slide 8

Extract "information" from pixels

Figure 1: Input image.
Figure 2: Semantic information.

Slide 10

Industrial vision

• Optical character recognition (OCR)

Slide 11

Industrial vision

• Visual inspection for quality assurance

Slide 12

Computer animation

• Use of retro-reflective markers to capture actors' motion

Slide 13

Image editing

Slide 14

Vision-based biometrics

Slide 16

Organization of the course

• Course web page: canvas.instructure.com/courses/11329701
• Grading
  • Class participation: 3 points
  • One midterm exam: 10 points
    • Written test (5 pts)
    • Lab exam (5 pts)
  • Programming assignments: HW1 (1 pt) + HW2 (3 pts) + HW3 (3 pts)
  • Final exam

Slide 17

Content

• Introduction
• Feature Description, Detection, and Matching
  • Data fitting and the RANSAC algorithm
  • Harris, SIFT, HOG, etc.
• Motion Estimation
• Multiple Image Geometry and 3D Reconstruction
  • Camera calibration and epipolar geometry
  • Reconstruction from one or multiple images: Shape from Shading, stereo vision, Structure from Motion, etc.
• Object Detection and Recognition
  • Sparse Coding and Dictionary Learning
  • Deep Learning

Slide 18

Fitting and alignment

Slide 20

SVD and QR decomposition

Slide 21

SVD factorization

• Any matrix X can be decomposed as $X_{n \times p} = U_{n \times n}\, \Sigma_{n \times p}\, V^T_{p \times p}$, where U and V are orthogonal and $\sigma_1 \geq \dots \geq \sigma_r$ are the singular values.
• $\Sigma$ is rectangular diagonal, $\Sigma = \mathrm{diag}(\sigma_1, \dots, \sigma_r, 0, \dots, 0)$, padded with zero rows or zero columns depending on whether $n > p$ or $n < p$.
• The columns of V are normalized eigenvectors of $X^T X$.
• The columns of U are normalized eigenvectors of $X X^T$.
• $u_j$ and $v_j$ share the same eigenvalue $\lambda_j$, with $\sigma_j = \sqrt{\lambda_j}$.
• Every matrix X of rank r has exactly r nonzero singular values.
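A quick NumPy check of these properties (the matrix below is arbitrary; np.linalg.svd returns the singular values already sorted in decreasing order):

    import numpy as np

    X = np.random.default_rng(0).standard_normal((5, 3))
    U, s, Vt = np.linalg.svd(X)                  # full SVD: U is 5x5, Vt is 3x3

    # Squared singular values are the eigenvalues of X^T X (sorted decreasingly here).
    lam = np.linalg.eigvalsh(X.T @ X)[::-1]
    assert np.allclose(np.sqrt(lam), s)

    # U and V are orthogonal.
    assert np.allclose(U.T @ U, np.eye(5))
    assert np.allclose(Vt @ Vt.T, np.eye(3))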

Slide 22

SVD factorization

• Full and compact (reduced) SVD:
  • Full SVD: $X = U \Sigma V^T$
  • Compact SVD: $X = U_1 \Sigma_1 V_1^T$, i.e., $X = \sum_{i=1}^{r} \sigma_i u_i v_i^T$.

Slide 23

Computing the compact SVD

Algorithm 1: compact_SVD(A)
  1. λ, V ← eig(A^T A)            ▷ eigenvalues and eigenvectors of A^T A
  2. σ ← sqrt(λ)                  ▷ singular values of A
  3. Sort σ from greatest to least, and sort the eigenvectors (columns of V) the same way.
  4. r ← count(σ ≠ 0)             ▷ number of nonzero singular values (the rank of A)
  5. σ_1 ← σ[:r]                  ▷ keep only the positive singular values
  6. V_1 ← V[:, :r]               ▷ keep only the corresponding eigenvectors
  7. U_1 ← A V_1 / σ_1            ▷ construct U with array broadcasting
  8. return U_1, σ_1, V_1^T
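A minimal NumPy implementation of Algorithm 1 (a sketch; the tolerance used to decide which singular values count as nonzero is an assumption):

    import numpy as np

    def compact_svd(A, tol=1e-10):
        # Eigenvalues and eigenvectors of the symmetric matrix A^T A.
        lam, V = np.linalg.eigh(A.T @ A)
        # Singular values (clip tiny negative round-off before the square root).
        sigma = np.sqrt(np.clip(lam, 0, None))
        # Sort from greatest to least and reorder the eigenvectors accordingly.
        order = np.argsort(sigma)[::-1]
        sigma, V = sigma[order], V[:, order]
        # Keep only the nonzero singular values (numerical rank of A).
        r = int(np.sum(sigma > tol))
        sigma1, V1 = sigma[:r], V[:, :r]
        # U_1 = A V_1 / sigma_1 (broadcasting divides each column by its sigma).
        U1 = A @ V1 / sigma1
        return U1, sigma1, V1.T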

Slide 24

SVD factorization

• Intuition: X can be seen as a linear transformation. This transformation decomposes into three sub-transformations: 1. rotation, 2. re-scaling, 3. rotation. These three steps correspond to the three factors $V^T$, $\Sigma$, and U.

Slide 25

Applications of SVD

• The pseudoinverse of a rectangular matrix X is
  $X^+ = V \, \mathrm{diag}(\sigma_1^{-1}, \dots, \sigma_p^{-1}) \, U^T$
• If X has linearly independent columns ($X^T X$ is invertible): $X^+ = (X^T X)^{-1} X^T$. This is a left inverse, as $X^+ X = I$.
• If X has linearly independent rows ($X X^T$ is invertible): $X^+ = X^T (X X^T)^{-1}$. This is a right inverse, as $X X^+ = I$.
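A small NumPy check of the pseudoinverse formulas (the matrix values are illustrative only):

    import numpy as np

    X = np.array([[1., 0.], [0., 2.], [1., 1.]])     # 3x2, linearly independent columns

    # Pseudoinverse built from the SVD: X+ = V diag(1/sigma) U^T.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    X_pinv = Vt.T @ np.diag(1.0 / s) @ U.T

    # It matches NumPy's pinv and acts as a left inverse for this X.
    assert np.allclose(X_pinv, np.linalg.pinv(X))
    assert np.allclose(X_pinv @ X, np.eye(2))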

Slide 26

Applications of SVD

• If A is an m × n matrix of rank r < min{m, n}, store the matrices U_1, Σ_1, and V_1 instead of A.
  • Storing A: mn values.
  • Storing U_1, Σ_1, and V_1: mr + r + nr values.
• Example: if A is 100 × 200 and has rank 20:
  • Storing A: 20,000 values.
  • Storing U_1, Σ_1, and V_1: 6,020 values.

Slide 27

Applications of SVD

• The truncated SVD keeps only the first s < r singular values, plus the corresponding columns of U and V:
  $A_s = \sum_{i=1}^{s} \sigma_i u_i v_i^T$.
  The resulting matrix $A_s$ has rank s and is only an approximation of A, since r − s nonzero singular values are neglected.
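A short NumPy sketch of a rank-s (truncated) approximation; the matrix size and the value of s are arbitrary choices for illustration:

    import numpy as np

    A = np.random.default_rng(0).standard_normal((100, 200))
    s = 20

    # Keep the s largest singular values and the matching columns of U and V.
    U, sig, Vt = np.linalg.svd(A, full_matrices=False)
    A_s = U[:, :s] @ np.diag(sig[:s]) @ Vt[:s, :]

    # The spectral-norm error of the best rank-s approximation is the first neglected singular value.
    print(np.linalg.norm(A - A_s, 2), sig[s])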

Slide 29

QR Factorization

• Any matrix $A_{n \times p}$ with linearly independent columns admits a unique decomposition $A_{n \times p} = Q_{n \times p} R_{p \times p}$, where $Q^T Q = I$ and R is upper triangular.
• QR is a useful factorization when n > p.
• Full QR decomposition: $A_{m \times n} = Q \begin{pmatrix} R \\ 0 \end{pmatrix} = \begin{pmatrix} Q_1 & Q_2 \end{pmatrix} \begin{pmatrix} R \\ 0 \end{pmatrix}$, with Q of size m × m.
• Reduced QR decomposition: $A_{m \times n} = Q_1 R$, with $Q_1$ of size m × n and R of size n × n.
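A brief NumPy illustration of the reduced and full ("complete") QR modes; the matrix size is arbitrary:

    import numpy as np

    A = np.random.default_rng(1).standard_normal((6, 3))

    Q1, R = np.linalg.qr(A, mode='reduced')      # Q1: 6x3, R: 3x3
    Qf, Rf = np.linalg.qr(A, mode='complete')    # Qf: 6x6, Rf: 6x3

    assert np.allclose(Q1 @ R, A)
    assert np.allclose(Q1.T @ Q1, np.eye(3))     # orthonormal columns: Q^T Q = I
    assert np.allclose(np.triu(R), R)            # R is upper triangular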

Slide 31

Least squares

Slide 32

Least squares

• Fitting a line to a point cloud
  • Data: $(x_1, y_1), \dots, (x_n, y_n)$
  • Line equation: $y = ax + b$
  • Find (a, b) minimizing $E(a, b) = \sum_{i=1}^{n} (y_i - a x_i - b)^2$

Slide 33

Least squares

• Fitting a line to a point cloud:
  $E(a, b) = \sum_{i=1}^{n} (y_i - a x_i - b)^2 = \|Ac - d\|^2$, with
  $A = \begin{pmatrix} x_1 & 1 \\ \vdots & \vdots \\ x_n & 1 \end{pmatrix}, \quad c = \begin{pmatrix} a \\ b \end{pmatrix}, \quad d = \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix}$
• Cons:
  • Fails for vertical lines
  • Not rotation-invariant
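A minimal NumPy version of this line fit (the data points are made up for illustration):

    import numpy as np

    x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
    y = np.array([1.1, 2.9, 5.2, 7.1, 8.8])

    # Build the design matrix A = [x 1] and solve min ||Ac - d||^2.
    A = np.column_stack([x, np.ones_like(x)])
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)
    print(a, b)    # slope and intercept of the fitted line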

Slide 34

Least squares

• Minimize the function $\min_{c \in \mathbb{R}^n} E(c)$, where $E(c) = \|Ac - d\|^2$ and A is of size m × n.
• The problem always has a solution.
• The solution is unique ⇔ rank(A) = n.
• Solution methods:
  • Normal equations
  • Orthogonal methods (QR)
  • SVD

Slide 35

Least squares

• Minimize the function $\min_{c \in \mathbb{R}^n} E(c)$, where $E(c) = \|Ac - d\|^2$.
• Optimality condition (gradient set to zero): $\frac{\partial E}{\partial c} = 2(A^T A)c - 2A^T d = 0$.
• Normal equations: $A^T A \, c = A^T d$, hence $c = A^+ d$ when A has linearly independent columns, where $A^+ = (A^T A)^{-1} A^T$.

Slide 36

Least squares

• Solution using the QR factorization A = QR:
  $\hat{c} = (A^T A)^{-1} A^T d = ((QR)^T (QR))^{-1} (QR)^T d = (R^T Q^T Q R)^{-1} R^T Q^T d = (R^T R)^{-1} R^T Q^T d = R^{-1} R^{-T} R^T Q^T d = R^{-1} Q^T d$
• Algorithm:
  1. Compute the QR factorization A = QR.
  2. Compute the matrix-vector product $x = Q^T d$.
  3. Solve $Rc = x$ by back substitution.
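A sketch of this three-step algorithm using NumPy and SciPy (the function and variable names are illustrative):

    import numpy as np
    from scipy.linalg import solve_triangular

    def lstsq_qr(A, d):
        # 1. Reduced QR factorization A = QR.
        Q, R = np.linalg.qr(A, mode='reduced')
        # 2. Matrix-vector product x = Q^T d.
        x = Q.T @ d
        # 3. Solve the triangular system R c = x by back substitution.
        return solve_triangular(R, x, lower=False)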

Slide 38

RANSAC

Slide 39

Robust estimation

• Outliers are points that don't "fit" the model.

Slide 42

Algorithm

• RANSAC ("RANdom SAmple Consensus") is an iterative method to robustly estimate the parameters of a mathematical model from a set of observed data that contains outliers.
• Until N iterations have occurred:
  • Draw a random sample of S points from the data.
  • Fit the model to that set of S points.
  • Classify the data points as outliers or inliers.
  • Re-fit the model to the inliers, ignoring the outliers.
• Keep the best fit from this collection, using the fitting error as the criterion.
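A compact Python sketch of RANSAC for the line-fitting example (the threshold, iteration count, and the ranking by inlier count with fitting error as tie-breaker are illustrative choices, not prescribed by the slides):

    import numpy as np

    def ransac_line(x, y, n_iters=100, thresh=0.5, seed=0):
        """Robustly fit y = a*x + b; returns the (a, b) of the best re-fitted model."""
        rng = np.random.default_rng(seed)
        A = np.column_stack([x, np.ones_like(x)])
        best_score, best_params = None, None
        for _ in range(n_iters):
            # Draw a minimal random sample (2 points define a line) and fit it.
            idx = rng.choice(len(x), size=2, replace=False)
            params, *_ = np.linalg.lstsq(A[idx], y[idx], rcond=None)
            # Classify points as inliers/outliers by their residual.
            inliers = np.abs(A @ params - y) < thresh
            if inliers.sum() < 2:
                continue
            # Re-fit the model on the inliers only.
            params, *_ = np.linalg.lstsq(A[inliers], y[inliers], rcond=None)
            err = np.mean((A[inliers] @ params - y[inliers]) ** 2)
            score = (inliers.sum(), -err)
            if best_score is None or score > best_score:
                best_score, best_params = score, params
        return best_params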

Slide 43

Example: Line fitting

Slide 53

Example: Compute a mapping between two images

• Several hundred keypoints are extracted from each image; the goal is to match them and compute the transformation that minimizes a given criterion.
• There may be outliers (incorrect matches) that would corrupt the estimation.
• RANSAC: choose a subset of the keypoints from one image, match them to the other image, and compute the transformation that minimizes the re-projection error. Choose another subset of keypoints, match them, and compute another transformation. Repeat several times, using a different subset of keypoints each time, then select the transformation with the minimum re-projection error.
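A hedged OpenCV sketch of this pipeline (file names and the RANSAC reprojection threshold are placeholders; SIFT requires an OpenCV build that includes it):

    import cv2
    import numpy as np

    img1 = cv2.imread('image1.jpg', cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread('image2.jpg', cv2.IMREAD_GRAYSCALE)

    # Detect keypoints, compute descriptors, and match them between the two images.
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des1, des2)

    # Robustly estimate the homography: RANSAC marks incorrect matches as outliers.
    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)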

Slide 55

Template matching

Slide 56

Template matching

• Source image (I): the image in which we expect to find a match to the template image.
• Template image (T): the patch that is compared against the source image to detect the area of highest match.

Slide 57

Template matching

• Compare the template against the source image by sliding it:
  • Move the patch one pixel at a time.
  • At each location, compute a similarity metric.

Slide 58

Template matching

• Store the metric in the result matrix (image) R.
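A plain-NumPy sketch of this sliding-window computation with the SSD metric (a slow reference implementation for clarity; in practice one would use an optimized routine such as OpenCV's matchTemplate):

    import numpy as np

    def match_template_ssd(I, T):
        """Return the result matrix R of SSD scores for every placement of T in I."""
        ih, iw = I.shape
        th, tw = T.shape
        R = np.empty((ih - th + 1, iw - tw + 1))
        Tf = T.astype(float)
        for y in range(R.shape[0]):
            for x in range(R.shape[1]):
                diff = I[y:y + th, x:x + tw].astype(float) - Tf
                R[y, x] = np.sum(diff ** 2)    # sum of squared differences
        return R

    # Usage: R = match_template_ssd(I, T); the best match is at np.unravel_index(np.argmin(R), R.shape).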

Slide 59

Template matching

• Metrics
  • Sum of squared differences:
    $R(x, y) = \sum_{x', y'} \bigl( T(x', y') - I(x + x', y + y') \bigr)^2$
  • Normalized sum of squared differences:
    $R(x, y) = \dfrac{\sum_{x', y'} \bigl( T(x', y') - I(x + x', y + y') \bigr)^2}{\sqrt{\sum_{x', y'} T(x', y')^2 \cdot \sum_{x', y'} I(x + x', y + y')^2}}$
  • Cross-correlation:
    $R(x, y) = \sum_{x', y'} T(x', y') \cdot I(x + x', y + y')$
  • Normalized cross-correlation:
    $R(x, y) = \dfrac{\sum_{x', y'} T(x', y') \cdot I(x + x', y + y')}{\sqrt{\sum_{x', y'} T(x', y')^2 \cdot \sum_{x', y'} I(x + x', y + y')^2}}$

Slide 60

Template matching

• Metrics
  • Correlation coefficient:
    $R(x, y) = \sum_{x', y'} T'(x', y') \cdot I'(x + x', y + y')$,
    where $T'(x', y') = T(x', y') - \frac{1}{w \cdot h} \sum_{x'', y''} T(x'', y'')$ and $I'$ denotes the image patch with its mean subtracted in the same way.
  • Normalized correlation coefficient:
    $R(x, y) = \dfrac{\sum_{x', y'} T'(x', y') \cdot I'(x + x', y + y')}{\sqrt{\sum_{x', y'} T'(x', y')^2 \cdot \sum_{x', y'} I'(x + x', y + y')^2}}$
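These metrics correspond to the method flags of OpenCV's matchTemplate (TM_SQDIFF, TM_SQDIFF_NORMED, TM_CCORR, TM_CCORR_NORMED, TM_CCOEFF, TM_CCOEFF_NORMED). A brief usage sketch, with placeholder file names:

    import cv2

    I = cv2.imread('scene.png', cv2.IMREAD_GRAYSCALE)
    T = cv2.imread('patch.png', cv2.IMREAD_GRAYSCALE)

    # One score per template placement; how to read it depends on the metric.
    R = cv2.matchTemplate(I, T, cv2.TM_CCOEFF_NORMED)
    min_val, max_val, min_loc, max_loc = cv2.minMaxLoc(R)
    top_left = max_loc               # for the TM_SQDIFF* metrics the best match is min_loc
    h, w = T.shape
    bottom_right = (top_left[0] + w, top_left[1] + h)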