QBUS1040: Foundations of Business Analytics
Final Exam Marking Guide
Semester 1, 2019
This is a 150-minute in-class exam with 10 minutes of reading time.
You may not use any books, notes, or computer programs (e.g., Python) except for a personal hand-
written double-sided cheat sheet. The cheat sheet cannot be larger than A4. Throughout this exam we
use standard mathematical notation; in particular, we do not use (and you may not use) notation from
any computer language, or from any strange or non-standard mathematical dialect (e.g., physics).
This exam consists of nine problems. You will write your answers directly in this exam paper. You
should use a pen, not a pencil or marker. You should use scratch paper (which you will not turn in) to
do your rough work. For problems that ask for a free-form answer, you must write in the
box below the problem. We won’t read anything outside the boxes.
Note that problems have unequal weight. Some are easy. Others, not so much.
Good luck!
Your SID:
(For QBUS1040 staff only)
Question: 1 2 3 4 5 6 7 8 9 Total
Points: 10 10 10 10 10 10 10 10 10 90
Score:
1. Appropriate response For each of the situations (a) and (b) described below, choose the most
appropriate response from among the five choices. You can choose only one for each situation.
(a) (5 points) An intern working for you develops several different models to predict the daily
demand for a product. How should you choose which model is the best one, the one to put into
production?
© Regularization
© Gram-Schmidt algorithm
© k-means algorithm
© Validation
© Least squares classifier
(b) (5 points) An image can be represented by a vector. A colleague needs a method to group the
images according to the similarity of their corresponding vectors. What do you suggest?
© Regularization
© Gram-Schmidt algorithm
© k-means algorithm
© Validation
© Least squares classifier
Solution:
(a) Validation
(b) k-means algorithm
2. Choose one of the responses Always, Never, or Sometimes for each of the statements below. 'Always'
means the statement is always true, 'Never' means it is never true, and 'Sometimes' means it can
be true or false, depending on the particular values of the matrix or matrices. You do not need to
justify your answers.
(a) (2 points) An upper triangular matrix has linearly independent columns.
© Always © Never © Sometimes
(b) (2 points) The rows of a tall non-square matrix are linearly dependent.
© Always © Never © Sometimes
(c) (2 points) The product of two lower triangular matrices is lower triangular.
© Always © Never © Sometimes
(d) (2 points) The product of two orthogonal matrices is orthogonal.
© Always © Never © Sometimes
(e) (2 points) The KKT matrix is invertible.
© Always © Never © Sometimes
Solution:
(a) Sometimes. The zero matrix is upper triangular and has linearly dependent columns. The
identity matrix is upper triangular and has linearly independent columns.
(b) Always. This is true by the independence-dimension inequality.
(c) Always. For i < j, the (i, j) entry of the product LM is Σ_k L_ik M_kj; L_ik = 0 unless k ≤ i and M_kj = 0 unless k ≥ j, and no k satisfies both when i < j, so every entry above the diagonal is zero.
(d) Always. Let U, V be two orthogonal n × n matrices. Then
(UV)^T (UV) = V^T U^T U V = V^T I V = V^T V = I.
(e) Sometimes. The KKT matrix is invertible if and only if the constraint matrix has linearly independent rows and the stacked matrix of the objective and constraint matrices has linearly independent columns.
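As a quick numerical check (not part of the exam solution), statements (c) and (d) can be verified on small illustrative matrices; the sizes and random seed below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

# (c) The product of two lower triangular matrices is lower triangular:
# the strictly upper part of the product is zero.
L1 = np.tril(rng.standard_normal((4, 4)))
L2 = np.tril(rng.standard_normal((4, 4)))
product_is_lower = np.allclose(np.triu(L1 @ L2, k=1), 0)

# (d) The product of two orthogonal matrices is orthogonal.
# The Q factor of a QR factorization of a random matrix is orthogonal.
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
V, _ = np.linalg.qr(rng.standard_normal((4, 4)))
W = U @ V
product_is_orthogonal = np.allclose(W.T @ W, np.eye(4))

print(product_is_lower, product_is_orthogonal)  # True True
```

Of course a numerical check on one example is not a proof; the algebraic arguments above are what the marking guide asks for.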
3. Left and right inverses of a vector. Suppose that x is a nonzero n-vector with n > 1. Determine if
x has a left or right inverse. If the left or right inverse does not exist, state that it does not exist. If
a left or right inverse exists, then give one. You do not need to justify your answer.
(a) (5 points) Does x have a left inverse? If so, give one. If not say so.
Solution: Yes, x has a left inverse, in fact, many left inverses. Here is a simple one, which
is the pseudo-inverse: x† = (x^T x)^{-1} x^T = (1/‖x‖^2) x^T. You can check that x†x = 1.
(b) (5 points) Does x have a right inverse? If so, give one. If not say so.
Solution:
No, x does not have a right inverse. To have a right inverse its rows must be linearly inde-
pendent, but since x is tall, that is not possible by the independence-dimension inequality.
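The left inverse in part (a) is easy to check numerically; the vector below is an arbitrary example:

```python
import numpy as np

# The pseudo-inverse (1/||x||^2) x^T is a left inverse of any nonzero vector x.
x = np.array([1.0, 2.0, -2.0])   # any nonzero n-vector with n > 1
x_dagger = x / (x @ x)           # represents the row vector (1/||x||^2) x^T
print(x_dagger @ x)              # approximately 1.0
```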
4. (10 points) Weighted Gram matrix. Consider a multi-objective least squares problem with matrices
A_1, ..., A_k and positive weights λ_1, ..., λ_k. The matrix G = λ_1 A_1^T A_1 + ··· + λ_k A_k^T A_k is called the
weighted Gram matrix; it is the Gram matrix of the stacked matrix Ã (given in (1)) associated with
the multi-objective problem,

Ã = [√λ_1 A_1; ...; √λ_k A_k]   (blocks stacked vertically).   (1)

Show that G is invertible provided there is no nonzero vector x that satisfies A_1 x = 0, ..., A_k x = 0.
Solution: Suppose Gx = 0. Then

0 = x^T G x
  = λ_1 x^T A_1^T A_1 x + ··· + λ_k x^T A_k^T A_k x
  = λ_1 ‖A_1 x‖^2 + ··· + λ_k ‖A_k x‖^2.

Since each λ_i is positive and each ‖A_i x‖^2 is nonnegative, this is only possible if A_1 x = 0, ..., A_k x = 0. By our assumption, the only vector that satisfies this is x = 0. Hence Gx = 0 holds only if x = 0, and therefore G is invertible.
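The construction can be illustrated numerically; the matrices, weights, and seed below are arbitrary stand-ins (here A_1 alone already has linearly independent columns, so the hypothesis holds):

```python
import numpy as np

rng = np.random.default_rng(1)
A1 = rng.standard_normal((5, 3))
A2 = rng.standard_normal((4, 3))
lambdas = [2.0, 0.5]   # positive weights

# Weighted Gram matrix G = lambda_1 A1^T A1 + lambda_2 A2^T A2.
G = lambdas[0] * A1.T @ A1 + lambdas[1] * A2.T @ A2

# G equals the Gram matrix of the stacked matrix with blocks sqrt(lambda_i) A_i.
A_stacked = np.vstack([np.sqrt(lambdas[0]) * A1, np.sqrt(lambdas[1]) * A2])
same = np.allclose(G, A_stacked.T @ A_stacked)

# Invertibility check: G has full rank.
invertible = np.linalg.matrix_rank(G) == G.shape[0]
print(same, invertible)  # True True
```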
5. (10 points) Immigration. The population dynamics of a country is given by x_{t+1} = Ax_t + u,
t = 1, ..., T, where the 100-vector x_t gives the population age distribution in year t, and u gives
the immigration age distribution (with negative entries meaning emigration), which we assume is
constant (i.e., does not vary with t). You are given A, x_1, and x^des, a 100-vector that represents a
desired population distribution in year 4. We seek u that achieves x_4 = x^des.
Give a matrix formula for u. If your formula only makes sense when some conditions hold (for
example invertibility of one or more matrices), say so.
Solution:
x_2 = Ax_1 + u
x_3 = Ax_2 + u = A^2 x_1 + Au + u
x_4 = Ax_3 + u = A^3 x_1 + A^2 u + Au + u = A^3 x_1 + (A^2 + A + I)u
Therefore x_4 = x^des when
u = (A^2 + A + I)^{-1}(x^des − A^3 x_1),
provided the matrix A^2 + A + I is invertible.
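The formula can be sanity-checked by simulating the dynamics forward; the matrix A, x_1, and x^des below are arbitrary made-up data, not real demographics:

```python
import numpy as np

n = 100
rng = np.random.default_rng(2)
A = np.eye(n) + 0.01 * rng.standard_normal((n, n))   # stand-in dynamics matrix
x1 = rng.uniform(0, 1, n)
x_des = rng.uniform(0, 1, n)

# u = (A^2 + A + I)^{-1} (x_des - A^3 x1), assuming A^2 + A + I is invertible.
M = A @ A + A + np.eye(n)
u = np.linalg.solve(M, x_des - np.linalg.matrix_power(A, 3) @ x1)

# Simulate x2, x3, x4 and check that x4 = x_des.
x = x1
for _ in range(3):
    x = A @ x + u
print(np.allclose(x, x_des))  # True
```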
6. (10 points) Adjacency matrix of reversed graph. Suppose A is the adjacency matrix of a directed
graph, with A_ij = 1 if there is a directed edge from node j to node i, and A_ij = 0 otherwise.
The reversed graph is obtained by reversing the directions of all the edges of the original graph.
What is the adjacency matrix of the reversed graph? (Express your answer in terms of A.)
Solution: The adjacency matrix of the reversed graph is A^T. To see this, note that A_ij = 1
means there is an edge in the original graph from node j to node i. In the reversed graph, that
edge goes from node i to node j, so its adjacency matrix Ã satisfies Ã_ji = 1. This argument
goes both ways, so Ã = A^T.
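The transpose relationship is easy to confirm on a small example; the edge list below is arbitrary:

```python
import numpy as np

# Convention from the problem: A[i, j] = 1 for a directed edge from j to i.
edges = [(0, 1), (1, 2), (2, 0), (0, 2)]   # illustrative (from, to) pairs
n = 3

A = np.zeros((n, n), dtype=int)
for j, i in edges:
    A[i, j] = 1                 # edge from j to i

# Reversed graph: each edge (j, i) becomes (i, j).
A_rev = np.zeros((n, n), dtype=int)
for j, i in edges:
    A_rev[j, i] = 1             # reversed edge from i to j

print(np.array_equal(A_rev, A.T))  # True
```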
7. (10 points) Likert classifier. A response to a question has the options Strongly Disagree, Disagree,
Neutral, Agree, or Strongly Agree, encoded as −2, −1, 0, 1, 2, respectively. You wish to build a multi-
class classifier that takes a feature vector x and predicts the response. A multi-class least squares
classifier builds a separate (continuous) predictor for each response versus the others. Suggest a
simpler classifier, based on one continuous regression model f̃(x) that is fit to the numbers that code
the responses, using least squares. Your answer should be in English.
Solution: We simply round the continuous prediction f̃(x) to the nearest of −2, −1, 0, 1, 2, and
then predict that response.
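A minimal sketch of this classifier, with made-up features and synthetic Likert codes (the data and the helper `predict` are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
X = rng.standard_normal((50, 4))                  # made-up feature matrix
theta_true = np.array([1.0, -0.5, 0.3, 0.2])
y = np.clip(np.round(X @ theta_true), -2, 2)      # synthetic codes in {-2,...,2}

# Fit one continuous least squares regression model to the coded responses.
theta, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(x):
    """Round the continuous prediction to the nearest code in {-2, -1, 0, 1, 2}."""
    return int(np.clip(np.round(x @ theta), -2, 2))

pred = predict(X[0])
print(pred in {-2, -1, 0, 1, 2})  # True
```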
8. (10 points) Computational complexity. Suppose R is an n×n upper triangular matrix with nonzero
diagonal entries and b is an n-vector. What is the computational complexity of solving a system of
linear equations Rx = b with back substitution? You should provide an answer in terms of flops and
justify it.
Solution: About n^2 flops.
Back substitution computes x_i = (b_i − Σ_{j>i} R_ij x_j)/R_ii for i = n, n−1, ..., 1. Step i takes 2(n − i) + 1 flops (n − i multiplications, n − i subtractions, and one division), so the total is the sum of odd numbers 1 + 3 + ··· + (2n − 1). When n is even, pair the first term with the last, the second with the second-to-last, and so on; each pair sums to 2n, and since there are n/2 pairs, the total is (n/2)(2n) = n^2. A similar argument can be made when n is odd.
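A short sketch of back substitution (the function name and test data are illustrative); each step performs the 2(n − i) + 1 flops counted above:

```python
import numpy as np

def back_substitution(R, b):
    """Solve Rx = b for upper triangular R with nonzero diagonal."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n - 1, -1, -1):
        # (n - 1 - i) multiplies, as many subtractions, and one division
        x[i] = (b[i] - R[i, i + 1:] @ x[i + 1:]) / R[i, i]
    return x

rng = np.random.default_rng(4)
R = np.triu(rng.standard_normal((5, 5))) + 5 * np.eye(5)  # nonzero diagonal
b = rng.standard_normal(5)
x = back_substitution(R, b)
print(np.allclose(R @ x, b))  # True
```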
9. (10 points) Interpreting model fitting results. Five different models are fit using the same training
data set, and tested on the same (separate) test set (which has the same size as the training set). The
RMS prediction errors for each model, on the training and test sets, are reported below. Comment
briefly on the results for each model. You should write a sentence or two for each model. You might
mention whether the model’s predictions are good or bad, whether it is likely to generalize to unseen
data, or whether it is over-fit. You are also welcome to say that you don’t believe the results, or
think the reported numbers are fishy.
Model Train RMS Test RMS
A 1.355 1.423
B 0.633 0.633
C 0.211 5.073
D 5.033 0.899
E 9.760 9.165
Solution:
A: This is a good model, and it will likely generalise.
B: These results are suspicious, since it is unlikely that the train and test RMS errors would
be so close. For example, maybe the model was accidentally tested on the training set. If
the numbers are correct, then this is a very good model, and it would likely generalise.
C: The model is over-fit: it fits the training data very well but predicts far worse on the test set.
D: Something is wrong, or you got lucky; a model should not predict much better on unseen data than on its own training data. Probably the former.
E: This is a bad model, but it will likely generalise, since its train and test errors are similarly poor.
END OF EXAM
