CM20219 – Coursework Part 1 – 2020/2021
Image processing and 2D geometry
Dr. Wenbin Li
1. Introduction
In this coursework, you will implement functions for filtering images using convolutions, for
transforming 2D coordinates, and for warping images according to 2D transforms, as used for
example in panorama stitching.
This assignment is worth 10% of the total marks for the unit. The second half of the coursework
(on WebGL) is worth 20% of the total marks, and the exam accounts for the remaining 70%
of the total marks.
2. Assessment
In this coursework, you will be writing functions and code snippets using MATLAB. Some tasks
require you to extend a given piece of code (which you can download from the unit’s Moodle
page), while others ask you to write code from scratch. Please read the instructions
carefully, and should you have any questions, please ask one of the tutors.
This coursework covers the first half of the unit (weeks 1–6). The first lab of the semester will
be a brief introduction to the MATLAB programming language and graphical user interface.
Should you have any MATLAB-related questions after this session, please ask one of the tutors
who are remotely available to help you. The following four labs (weeks 3–6) focus on specific
topics covered in lectures: convolutions, 2D transforms and image warping.
You are encouraged to finish each lab's exercises in the designated
week (see below). In the live online interactive learning sessions, you can ask questions of the
tutors and the lecturer. Note that you should (and may have to) work on this coursework in
your own time before the sessions.
The last opportunity to get help is during consolidation week (week 6). This coursework is
entirely assessed by your code submitted to Moodle.
The coursework deadline is 6 November 2020, 20:00 via Moodle, i.e. the Friday of
consolidation week. Feedback will be given via Moodle within two weeks of the deadline.
Summary of labs for this coursework:
Week 2: Introduction to MATLAB
Week 3: Convolution Lab
Week 4: Convolution Lab
Week 5: Transforms Lab
Week 6: Image Warping Lab and deadline for code submission (consolidation week)




3. Getting started with MATLAB
Useful resources:
• https://www.mathworks.com/help/pdf_doc/matlab/getstart.pdf
MATLAB Primer, a guide for getting started with MATLAB.
• https://uk.mathworks.com/support/learn-with-matlab-tutorials.html
Introduction, tutorials, documentation and videos for learning MATLAB.
4. Learning outcomes
After completing this coursework, and attending the relevant lectures, you should be able to:
• Construct appropriate convolution kernels for common image filtering operations.
• Efficiently compute convolutions, using Fast Fourier Transforms if necessary.
• Devise and apply 2D transformations for animating and aligning 2D objects.
• Warp images using homographies and parametric models, e.g. for lens undistortion.
5. Plagiarism
You must complete this coursework individually. If you copy code from another student and
include it in your project without clear attribution, then you have committed plagiarism.
Plagiarism is a serious academic offence. For details on plagiarism and how to avoid it, please
visit http://www.bath.ac.uk/library/help/infoguides/plagiarism.html. Undetected plagiarism
degrades the quality of your degree, as it interferes with our ability to assess you and prevents
you learning through properly attempting the coursework. Consequently, if we detect any
plagiarism you will receive zero marks for the coursework and be referred to the Director of
Studies for disciplinary action. Such action may affect your ability to continue your studies at
the University. Note that properly attributed code, while allowed in your submission, will not
contribute towards your marks. The report marks will also only be applicable to sections you
have coded yourself.

6. Convolution Lab (3.3% of marks for the unit)
This lab implements image convolution of greyscale images, a basic image filtering function
that is implemented in many computer vision systems (e.g. for edge detection) and most image
editing programs such as Photoshop (e.g. for image sharpening).
6.1. Basic convolution [40%]
Implement basic convolution by translating the pseudo-code below to MATLAB. Write a function
function result = basic_convolution(image, kernel)
that takes as input a grayscale image (2D matrix) and a filtering kernel (2D matrix), and
returns the convolved image result as a greyscale image with the same size and datatype as
the input image. This operation can be described by the following pseudocode:
for each image row in input image:
    for each pixel in image row:

        set accumulator to zero

        for each kernel row in kernel:
            for each element in kernel row:

                if element position corresponding* to pixel position then
                    multiply element value by corresponding* pixel value
                    add result to accumulator
                endif

        set output image pixel to accumulator

(* input image pixels are found relative to the kernel’s origin)
Source: https://en.wikipedia.org/wiki/Kernel_(image_processing)#Convolution
Demonstrate the functionality of your function by running the following:
image = im2double(imread('cameraman.tif'));
kernel = ones(5) / 25;
filtered = basic_convolution(image, kernel);
subplot(121); imshow(image); title('Input image');
subplot(122); imshow(filtered); title('Filtered image');
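For reference, the pseudocode above can be sketched in Python/NumPy as follows (illustrative only — your submission must be a MATLAB function; the border pixels are left at zero here, since border handling is the next task):

```python
import numpy as np

def basic_convolution(image, kernel):
    """Nested-loop convolution. The output has the same size and dtype
    as the input; pixels whose kernel window would fall outside the
    image are left at zero."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    # True convolution is correlation with the flipped kernel.
    flipped = kernel[::-1, ::-1]
    result = np.zeros_like(image)
    for y in range(kh // 2, ih - kh // 2):
        for x in range(kw // 2, iw - kw // 2):
            acc = 0.0
            for ky in range(kh):
                for kx in range(kw):
                    # Input pixels are found relative to the kernel's origin.
                    acc += flipped[ky, kx] * image[y + ky - kh // 2, x + kx - kw // 2]
            result[y, x] = acc
    return result
```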

6.2. Border handling [20%]
Improve your implementation from the previous exercise (name it “extended_convolution”)
by first centring the filtered image (so that the content of the input and filtered images is not
shifted between them), and then filling in the border regions by extending/replicating the edge
pixels of the image (‘clamp-to-edge’). A perfect result will match MATLAB’s built-in function
“imfilter” (with ‘replicate’) exactly, i.e. with a sum of squared differences (SSD) of 0 (zero):
filtered = extended_convolution(image, kernel);
reference = imfilter(image, kernel, 'replicate');
difference = 0.5 + 10 * (filtered - reference);
ssd = sum((filtered(:) - reference(:)) .^ 2);
subplot(131); imshow(filtered); title('Extended convolution');
subplot(132); imshow(reference); title('Reference result');
subplot(133); imshow(difference); title(sprintf('Difference (SSD=%.1f)',ssd));
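A NumPy sketch of the clamp-to-edge variant, assuming odd kernel dimensions (again illustrative — the coursework requires MATLAB; note that MATLAB's imfilter performs correlation by default, so for asymmetric kernels such as [–1, 0, 1] compare against imfilter(image, kernel, 'replicate', 'conv')):

```python
import numpy as np

def extended_convolution(image, kernel):
    """Convolution with clamp-to-edge borders: the image is padded by
    replicating its edge pixels, then a centred convolution is applied,
    so the output is not shifted and has no black border."""
    kh, kw = kernel.shape              # odd dimensions assumed
    py, px = kh // 2, kw // 2
    padded = np.pad(image, ((py, py), (px, px)), mode='edge')
    flipped = kernel[::-1, ::-1]       # convolution = correlation with flipped kernel
    ih, iw = image.shape
    result = np.empty_like(image)
    for y in range(ih):
        for x in range(iw):
            result[y, x] = np.sum(flipped * padded[y:y + kh, x:x + kw])
    return result
```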

6.3. Image filtering [10%]
Design and demonstrate 3×3 convolution kernels for:
• computing horizontal, vertical and (any) diagonal image gradients, and
• sharpening an image using unsharp masking.
Implement Gaussian low-pass filtering from scratch, using a 5×5 kernel with a standard
deviation of 1 pixel. Demonstrate the filtering results on the ‘cameraman’ image used
above.
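One way to construct such kernels is sketched below in NumPy (illustrative; the gradient and sharpening kernels shown are one valid choice among several, and the coursework expects MATLAB code):

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Build a normalised size×size Gaussian low-pass kernel with the
    given standard deviation in pixels, centred on the middle element."""
    ax = np.arange(size) - size // 2          # e.g. [-2, -1, 0, 1, 2]
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx**2 + yy**2) / (2.0 * sigma**2))
    return k / k.sum()                        # normalise so sum(K) == 1

# Example 3x3 gradient / sharpening kernels (one option of many):
grad_h = np.array([[-1, 0, 1]] * 3) / 3.0     # horizontal gradient (Prewitt)
sharpen = np.array([[0, -1, 0],
                    [-1, 5, -1],
                    [0, -1, 0]])              # unsharp-masking style sharpening
```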
6.4. Exploiting the convolution theorem [30%]
Apply the convolution theorem to speed up the convolution. You can use MATLAB’s functions
for 2D Fast Fourier Transform (FFT), “fft2” and its inverse “ifft2”. Aim to produce the same
result as in exercise 2 above (with SSD=0). Finally, compare run times of your FFT-based
convolution function with “extended_convolution” above for different kernel sizes, using
“tic” before your code and “toc” after it.
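The frequency-domain route can be sketched as follows (NumPy, illustrative; this zero-pads the borders, so to reach SSD = 0 against the clamp-to-edge version you would replicate-pad the input first and crop afterwards):

```python
import numpy as np

def fft_convolution(image, kernel):
    """Convolution via the convolution theorem: pointwise multiplication
    in the frequency domain, padded to the full linear-convolution size
    to avoid circular wrap-around, then cropped back to the input size."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    fh, fw = ih + kh - 1, iw + kw - 1
    F = np.fft.fft2(image, (fh, fw)) * np.fft.fft2(kernel, (fh, fw))
    full = np.real(np.fft.ifft2(F))
    # Crop the central region so the output matches the input size.
    y0, x0 = kh // 2, kw // 2
    return full[y0:y0 + ih, x0:x0 + iw]
```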
6.5. Marking scheme for Convolution Lab
Task Weight Description
1. Basic convolution [40%]
a 70% Result of convolution looks blurred (using 5×5 average filter).
b 10% Output image has same dimensions and datatype as input.
c 20% Function works for general (2n+1)×(2m+1) kernels; try [–1, 0, 1].
2. Border handling [20%]
a 50% Plausible convolution result (centred, no black border).
b 25% Near-perfect match to library function (SSD < 0.1).
c 25% SSD < 0.1 for kernel [–1, 0, 1].
3. Image filtering [10%]
a 60% Show kernel + convolution results for (1) horizontal, (2) vertical, (3)
any diagonal gradients, (4) unsharp masking. The gradient kernels can
be Prewitt or Sobel filters or even something like [–1, 0, 1].
b 30% Show code computing kernel for 5×5 Gaussian blur (σ = 1) + the
convolution result when it is applied.
c 10% Is the Gaussian kernel normalised? Check if sum(K(:)) is 1.
4. Convolution theorem [30%]
a 60% Plausible convolution result.
b 10% SSD < 0.1 for a 5×5 average filter (or another appropriate kernel).
c 30% Run-time comparisons for different kernel sizes.
7. Transforms Lab (3.3% of marks for the unit)
This lab will explore 2D matrix transformations. You should download the supporting MATLAB
code from Moodle, unzip it into a working directory, and use it to perform the following tasks.
The tutors will provide assistance regarding the use of MATLAB.
7.1. Compound transform [20%]
Run the program “rotate2D”. This shows rotation of a blue square about the origin. Read the
code (type “edit rotate2D”), and ensure you understand how it works (with reference to
your lecture notes).
Using a compound matrix transform (a single transform combining translation and rotation)
modify “rotate2D” so that the square rotates about the red point. Keep your code general so
that modifying “point” makes the square rotate about the new position.
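The compound transform can be sketched as follows (NumPy, column-vector convention; the helper name is made up, and your solution must modify the supplied MATLAB code):

```python
import numpy as np

def rotation_about(point, theta):
    """Single compound 3x3 matrix rotating 2D homogeneous column vectors
    by theta radians about `point`: translate the point to the origin,
    rotate, then translate back, i.e. T(point) @ R(theta) @ T(-point)."""
    px, py = point
    c, s = np.cos(theta), np.sin(theta)
    T_back = np.array([[1, 0, px], [0, 1, py], [0, 0, 1]], float)
    R = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], float)
    T_to = np.array([[1, 0, -px], [0, 1, -py], [0, 0, 1]], float)
    return T_back @ R @ T_to
```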
7.2. Post-multiplication [20%]
Modify “rotate2D” to use row vectors to represent points (e.g. “p=[x y 1]”) rather than the
column vector format currently used (e.g. “p=[x; y; 1]”).
Use the transposition rule discussed in lectures, i.e. that changing between row and column
vector format requires the matrix transformations to be composed in the opposite order, and
each component matrix to be transposed, and the transforms to be post-multiplied instead of
pre-multiplied:
(ABC)⊤ = C⊤B⊤A⊤ where A, B and C are matrices
p′ = ABCp where p and p′ are column vectors
p′⊤ = p⊤C⊤B⊤A⊤ where p⊤ and p′⊤ are row vectors
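The transposition rule is easy to check numerically (a throwaway NumPy check, not part of the MATLAB exercise):

```python
import numpy as np

rng = np.random.default_rng(0)
A, B, C = rng.random((3, 3)), rng.random((3, 3)), rng.random((3, 3))
p = np.array([2.0, 3.0, 1.0])       # homogeneous point

col = A @ B @ C @ p                 # column-vector convention (pre-multiply)
row = p @ C.T @ B.T @ A.T           # row-vector convention (post-multiply)
assert np.allclose(col, row)        # both conventions give the same point
```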
7.3. Articulated motion [30%]
Run the program “earthmoon”, which shows a square moon (in black) orbiting a square earth
(in blue). It is an example of articulated motion.
Modify the program to add a new, second moon that orbits the earth.
Add a third moon to orbit the second moon, instead of the earth.
Hint: You will need to place the third moon into the
second moon’s reference frame. See how the
existing (first) moon is already placed in the earth’s
reference frame using matrix multiplication.
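The hint can be made concrete with a small sketch (NumPy; the helper names, angles and orbit radii are all made up — the supplied earthmoon code has its own structure). The key idea is that the third moon's matrix is composed on top of the second moon's matrix:

```python
import numpy as np

def T(tx, ty):
    """Homogeneous 2D translation matrix."""
    return np.array([[1, 0, tx], [0, 1, ty], [0, 0, 1]], float)

def R(theta):
    """Homogeneous 2D rotation about the origin (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], float)

# Illustrative nesting of reference frames:
M_earth = R(0.3)                            # earth spinning at the origin
M_moon2 = M_earth @ R(0.7) @ T(4.0, 0)      # second moon orbits the earth
M_moon3 = M_moon2 @ R(1.1) @ T(1.5, 0)      # third moon orbits the SECOND moon
centre3 = M_moon3 @ np.array([0.0, 0.0, 1.0])  # third moon in world coordinates
```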
7.4. Estimating transformations [30%]
Run the program “transforms”, which transforms a
shape (a black ‘F’) in various ways. Modify the eight
transforms (a)–(h) at the top of the program to align
the input shape with each of the dotted shapes, as
shown on the right. Note that transformed outlines
should match the dotted shapes in both shape and
colour.

7.5. Marking scheme for Transforms Lab
Task Weight Description
1. Compound transform [20%]
a 40% Square rotates about the point.
b 40% Using a single compound transform (check code).
c 20% Changing “point” makes the square rotate about the new point.
2. Post-multiplication [20%]
a 40% Square rotates about the point.
b 40% Using a single compound transform (check code).
c 20% Changing “point” makes the square rotate about the new point.
3. Articulated motion [30%]
a 50% A second moon is orbiting the Earth.
b 50% A third moon is orbiting the second moon.
4. Estimating transforms [30%]
a 12.5% Input shape is transformed to align with thick outline “a” (red).
b 12.5% Input shape is transformed to align with thick outline “b” (green).
c 12.5% Input shape is transformed to align with thick outline “c” (blue).
d 12.5% Input shape is transformed to align with thick outline “d” (cyan).
e 12.5% Input shape is transformed to align with thick outline “e” (purple).
f 12.5% Input shape is transformed to align with thick outline “f” (yellowish).
g 12.5% Input shape is transformed to align with thick outline “g” (brown).
h 12.5% Input shape is transformed to align with thick outline “h” (dark purple).

8. Image Warping Lab (3.3% of marks for the unit)
This lab will explore image warping. You should download the supporting MATLAB code from
Moodle, unzip it into a working directory, and use it to perform the following tasks. The tutors
will provide assistance regarding the use of MATLAB.
8.1. Forward and backward warping [30%]
Run the program “forwardmap”. The code will warp a 2D image of Mona Lisa via a rotation
about a point. Observe that forward warping transforms every source pixel independently,
leading to gaps between the transformed pixels in the target image.
Create a copy of “forwardmap.m” called “backwardmap.m”, and modify it to perform backward
warping using the same 2D transform ‘M’. Backward warping computes for each target image
pixel, where in the source image it originated, and uses the colour sampled from the nearest
pixel. Note that this should not leave any gaps in the warped image.
Experiment by changing the matrix transformation ‘M’, for example by changing the rotation
angle or the scaling transform.
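The backward-warping loop can be sketched like this (NumPy, nearest-neighbour sampling, illustrative only — your solution modifies the supplied MATLAB code):

```python
import numpy as np

def backward_warp(src, M):
    """Backward warping: for each target pixel, invert M to find where it
    came from in the source, and sample the nearest source pixel. Because
    every target pixel is assigned, no gaps can appear."""
    h, w = src.shape[:2]
    Minv = np.linalg.inv(M)
    out = np.zeros_like(src)
    for y in range(h):
        for x in range(w):
            sx, sy, sw = Minv @ np.array([x, y, 1.0])
            sx, sy = sx / sw, sy / sw              # divide by homogeneous coord
            ix, iy = int(round(sx)), int(round(sy))
            if 0 <= ix < w and 0 <= iy < h:        # skip out-of-bounds sources
                out[y, x] = src[iy, ix]
    return out
```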
8.2. Linear interpolation [20%]
Extend your backward-warping program to sample pixel colours from the source image using
bilinear interpolation. Handle the edge-cases carefully for full marks, e.g. when sampling pixels
on the boundary of the source image.
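A bilinear sampler with clamped boundary handling might look like this (NumPy sketch under the same caveats as above):

```python
import numpy as np

def bilinear_sample(img, x, y):
    """Bilinearly interpolate img at continuous (x, y). Coordinates are
    clamped to the image rectangle so boundary samples are safe."""
    h, w = img.shape
    x = min(max(x, 0.0), w - 1.0)
    y = min(max(y, 0.0), h - 1.0)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0                      # fractional offsets
    top = (1 - fx) * img[y0, x0] + fx * img[y0, x1]
    bot = (1 - fx) * img[y1, x0] + fx * img[y1, x1]
    return (1 - fy) * top + fy * bot
```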
8.3. Lens undistortion [20%]
This exercise applies a different warping function that is commonly used for removing lens
distortion from images, i.e. to make straight lines straight again. The polynomial lens distortion
model uses the following steps for computing the location (u′, v′) to sample from the source
image for a target image pixel (u, v):

x = (u − cx) / fx
y = (v − cy) / fy
r² = x² + y²
x′ = x · (1 + k1·r² + k2·r⁴ + k3·r⁶)
y′ = y · (1 + k1·r² + k2·r⁴ + k3·r⁶)
u′ = fx·x′ + cx
v′ = fy·y′ + cy

Here, fx and fy are the focal lengths of the camera, (cx, cy) is called the principal point or
centre of projection, and k1, k2 and k3 are lens distortion coefficients.
Implement polynomial lens undistortion using the steps above, and then load the image
“window.jpg” and undistort it using the following camera calibration parameters:
    [ fx  0   cx ]   [ 474.53  0       405.96 ]
K = [ 0   fy  cy ] = [ 0       474.53  217.81 ]
    [ 0   0   1  ]   [ 0       0       1      ]

[k1 k2 k3] = [−0.27194  0.11517  −0.029859]
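The per-pixel sampling location from the steps above can be sketched as (NumPy; the function name is made up, and in the full solution this feeds the backward-warping loop):

```python
import numpy as np

def undistort_location(u, v, fx, fy, cx, cy, k1, k2, k3):
    """Polynomial lens model: source-image location (u', v') to sample
    for target pixel (u, v), following the steps in the text."""
    x = (u - cx) / fx                          # normalised camera coords
    y = (v - cy) / fy
    r2 = x * x + y * y
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    xp, yp = x * radial, y * radial            # apply radial distortion
    return fx * xp + cx, fy * yp + cy          # back to pixel coords

# Calibration parameters from the coursework:
fx = fy = 474.53
cx, cy = 405.96, 217.81
k1, k2, k3 = -0.27194, 0.11517, -0.029859
```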


8.4. Homographies [20%]
We will now explore how to calculate the transforms for image warping from corresponding
points, rather than just typing in a matrix manually. This is the first step for stitching photos
into panoramas.
The technical name for the 3×3 matrix transformations derived in this way is “homography”.
Run the program “homogdemo”. You will see two pictures of the library from the parade (move
the figures apart if they overlap), taken from slightly different points of view, labelled “left” and
“right”, respectively. You will be prompted to click on four points on the left image, then four
corresponding points on the right image. For best results, ensure you click on points that are
quite well spaced out and co-planar, i.e. lie on the same flat surface in the photograph (e.g.
the front of the library). Make sure you click the points in the same order in the left and right
images, and that you do not click three or more co-linear points (points lying on the same line).
Integrate the function “calchomography” into the code to compute the homography that maps
a point on the left image to a point on the right image.
Modify the code so that, after the homography has been estimated, the user is prompted to
click on points on the left image. Then use the homography to work out where the points will
be on the right image and plot them on the right image for demonstration purposes. Remember
to divide by the homogeneous coordinate in your code.
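Transferring points through a homography, including the division by the homogeneous coordinate, can be sketched as (NumPy; the function name is made up — in the coursework you use the supplied calchomography and plot in MATLAB):

```python
import numpy as np

def apply_homography(H, pts):
    """Map 2D points through a 3x3 homography H. pts is an (N, 2) array;
    the result is divided through by the homogeneous coordinate w."""
    pts_h = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous
    mapped = (H @ pts_h.T).T                          # (N, 3) results
    return mapped[:, :2] / mapped[:, 2:3]             # divide by w
```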
8.5. Image alignment [10%]
Use your code from the previous task to again estimate the homography matrix ‘M’ mapping
points from the left-hand image of the parade to the right-hand image of the parade. Use the
MATLAB command “save mymatrix M” to save this matrix to the file “mymatrix.mat”.
Plug the estimated homography ‘M’ into the backward warping code for Task 1, and warp the
left-hand image. You may want to load the saved matrix using “load mymatrix”.
Objects in the left-hand image (like the library door) will be “moved” rightwards to a new
position, as they would appear had they been photographed from the viewpoint of the right-
hand image. Demonstrate that this is the case.

8.6. Marking scheme for Image Warping Lab
Task Weight Description
1. Backward warping [30%]
a 80% Working backward warping.
b 20% Show warping result for a different transform M.
2. Linear interpolation [20%]
a 80% Test linear interpolation using 4x zoom or so.
b 20% Handling edge cases correctly (may need to look at code).
3. Lens undistortion [20%]
a 20% Implementation of function given in coursework document.
b 60% Plausibly undistorted image (straight lines are straight).
c 20% Warped image is free from visual artefacts
(using bilinear interpolation).
4. Homographies [20%]
a 50% Integrate calchomography to compute homography H.
b 50% Transfer clicked points to the other image using the homography H.
5. Image alignment [10%]
a 100% Plausible image alignment, shown by overlapping both input images
on top of each other (no blending required).
6. Readme file [0%]
a 100% Instructions for how to run the code (5% deducted if missing).

