CSCI-UA 9473 Introduction to Machine Learning
Assignment 1: Gradient descent
Given date: Sept 14
Due date: Sept 23
Total: 10pts
Question 1. (5pts) Local vs global minima and gradient descent
We consider the following function:

$$
\begin{align}
F(x_1, x_2) ={}& 3(1 - x_1)^2 \exp\!\left(-x_1^2 - (x_2 + 1)^2\right) \tag{1}\\
&- 10\left(x_1/5 - x_1^3 - x_2^5\right) \exp\!\left(-x_1^2 - x_2^2\right) \tag{2}\\
&- \frac{1}{3}\exp\!\left(-(x_1 + 1)^2 - x_2^2\right) \tag{3}
\end{align}
$$

The surface plot of this function is given below together with its contour plot. The function has a single global minimum located near $(0.23, -1.62)$, shown in red in the contour plot.

We want to implement gradient descent iterations on that function. Starting from a random initial point $(x_1^{(0)}, x_2^{(0)})$, code the following updates

$$
\begin{align}
x_1^{(k+1)} &= x_1^{(k)} - \eta \,\mathrm{grad}_{x_1} F(x_1, x_2) \tag{4}\\
x_2^{(k+1)} &= x_2^{(k)} - \eta \,\mathrm{grad}_{x_2} F(x_1, x_2) \tag{5}
\end{align}
$$

where $\mathrm{grad}_{x_i} F(x_1, x_2)$ represents the gradient of $F(x_1, x_2)$ with respect to $x_i$. Choose a sufficiently small learning rate $\eta$ and plot the iterates (in white) on the contour plot. Repeat your experiments for various initial iterates.
In [15]:
from mpl_toolkits.mplot3d import Axes3D  # registers the 3d projection (needed on older Matplotlib)
import matplotlib.pyplot as plt
import numpy as np
fig = plt.figure()
ax = fig.add_subplot(projection='3d')  # fig.gca(projection='3d') is deprecated in Matplotlib >= 3.4
# Make data.
x = np.linspace(-3, 3, 100)
y = np.linspace(-3, 3, 100)
x1, x2 = np.meshgrid(x, y)
F = (3*(1 - x1)**2 * np.exp(-x1**2 - (x2 + 1)**2)
     - 10*(x1/5 - x1**3 - x2**5) * np.exp(-x1**2 - x2**2)
     - (1/3) * np.exp(-(x1 + 1)**2 - x2**2))
# Plot the surface.
surf = ax.plot_surface(x1, x2, F, linewidth=0, alpha=1, cmap='viridis')
plt.show()
In [27]:
fig1, ax = plt.subplots(constrained_layout=True)
contour = ax.contourf(x1, x2, F, cmap='viridis')
plt.scatter(0.23, -1.62,c='r',marker='X')
plt.show()
In [ ]:
# put your solution here
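One possible sketch of the requested iterations (not the official solution): it approximates $\mathrm{grad}_{x_i} F$ with central finite differences rather than the analytic derivatives, and the learning rate $\eta = 0.01$, the iteration budget, and the helper name F_fun (chosen to avoid clashing with the array F above) are assumptions made for illustration.

In [ ]:
import numpy as np
import matplotlib.pyplot as plt

def F_fun(x1, x2):
    # Same function F as in the surface-plot cell above.
    return (3*(1 - x1)**2 * np.exp(-x1**2 - (x2 + 1)**2)
            - 10*(x1/5 - x1**3 - x2**5) * np.exp(-x1**2 - x2**2)
            - (1/3) * np.exp(-(x1 + 1)**2 - x2**2))

def grad_F(x1, x2, h=1e-6):
    # Central finite-difference approximation of the gradient.
    g1 = (F_fun(x1 + h, x2) - F_fun(x1 - h, x2)) / (2*h)
    g2 = (F_fun(x1, x2 + h) - F_fun(x1, x2 - h)) / (2*h)
    return g1, g2

eta = 0.01                                 # assumed learning rate
x1_k, x2_k = np.random.uniform(-2, 2, 2)   # random initial iterate
path = [(x1_k, x2_k)]
for _ in range(2000):                      # assumed iteration budget
    g1, g2 = grad_F(x1_k, x2_k)
    x1_k -= eta*g1                         # update (4)
    x2_k -= eta*g2                         # update (5)
    path.append((x1_k, x2_k))
path = np.array(path)

# Iterates (in white) on the contour plot, global minimum in red.
grid = np.linspace(-3, 3, 100)
X1, X2 = np.meshgrid(grid, grid)
fig, ax = plt.subplots(constrained_layout=True)
ax.contourf(X1, X2, F_fun(X1, X2), cmap='viridis')
ax.plot(path[:, 0], path[:, 1], '.-', c='w', markersize=3)
ax.scatter(0.23, -1.62, c='r', marker='X')
plt.show()

Rerunning the cell with different random initial iterates shows that some starting points descend into one of the local minima of $F$ rather than the global one near $(0.23, -1.62)$.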
Question 2. (5pts) Regression through the normal equations
We consider the simple regression problem below, similar to the one discussed in class. Find the model $t(x) = \beta_0 + \beta_1 x$ that minimizes the sum of squares loss

$$
\ell(\beta) = \frac{1}{2N}\sum_{i=1}^{N}\left(t^{(i)}_{\mathrm{noisy}} - (\beta_0 + \beta_1 x^{(i)})\right)^2 \tag{6}
$$

using the Normal Equations. To do this:

Start by building the matrix $\tilde{X}$ with

$$
\tilde{X} = \begin{bmatrix} 1 & x^{(1)} \\ \vdots & \vdots \\ 1 & x^{(N)} \end{bmatrix} \tag{7}
$$

Then compute the matrix $\tilde{X}^T \tilde{X}$ and the vector $\tilde{X}^T t$, where $t = [t^{(1)}_{\mathrm{noisy}}, \ldots, t^{(N)}_{\mathrm{noisy}}]^T$.

Finally solve the equations through

$$
\beta_{\mathrm{OLS}} = \left(\tilde{X}^T \tilde{X}\right)^{-1} \tilde{X}^T t \tag{8}
$$

using the function np.linalg.inv from the linear algebra package. Plot the result in green on top of the plot below and compare it with the true (blue) model, which is unknown in practice.
In [29]:
import numpy as np
import matplotlib.pyplot as plt
x = np.linspace(0, 5, 10)                 # input samples
noise = np.random.normal(0, .3, len(x))   # Gaussian noise
beta_true = np.random.normal(0, 1, 2)     # true (unknown) coefficients
t = beta_true[0] + beta_true[1]*x         # true linear model
tnoisy = t + noise                        # noisy targets
plt.scatter(x, tnoisy, c='r')             # noisy observations (red)
plt.plot(x, t)                            # true model (blue)
plt.show()
In [ ]:
# put your code here
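A minimal sketch of the normal-equation solve (one possibility, not the official solution), assuming the arrays x, t, and tnoisy defined in the cell above are still in scope:

In [ ]:
import numpy as np
import matplotlib.pyplot as plt

N = len(x)
Xtilde = np.column_stack((np.ones(N), x))   # matrix of Eq. (7)
A = Xtilde.T @ Xtilde                       # X~^T X~
b = Xtilde.T @ tnoisy                       # X~^T t
beta_OLS = np.linalg.inv(A) @ b             # Eq. (8)

plt.scatter(x, tnoisy, c='r')               # noisy observations
plt.plot(x, t)                              # true model (blue)
plt.plot(x, Xtilde @ beta_OLS, c='g')       # OLS fit (green)
plt.show()

In practice np.linalg.solve(A, b) is numerically preferable to forming the inverse explicitly, but the assignment asks for np.linalg.inv.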
