Name: Abdullah Sohail Syed
Roll No.: 19L-1389
Assignment: 01
Course: Applied Machine Learning
Part a:
Since the hypothesis is a first-order polynomial, $h_\theta(x) = \theta_0 + \theta_1 x$, the generalized cost function with $m = 8$ data points becomes:
$$J(\theta) = \sum_{i=1}^{m} \frac{1}{2(8)}\left(y^{(i)} - \left(\theta_0 + \theta_1 x^{(i)}\right)\right)^2 = \frac{1}{16}\sum_{i=1}^{m}\left(y^{(i)} - \theta_0 - \theta_1 x^{(i)}\right)^2$$
Hence, after substituting the data points, the equation becomes:
$$
\begin{aligned}
J(\theta) = \frac{1}{16}\Big[&(2 - \theta_0 - \theta_1(0))^2 + (1.5 - \theta_0 - \theta_1(0.5))^2 + (1.5 - \theta_0 - \theta_1(1))^2 \\
&+ (1 - \theta_0 - \theta_1(1.5))^2 + (1 - \theta_0 - \theta_1(2))^2 + (0 - \theta_0 - \theta_1(2.5))^2 \\
&+ (-0.5 - \theta_0 - \theta_1(3))^2 + (-2 - \theta_0 - \theta_1(3.5))^2\Big]
\end{aligned}
$$
$$J(\theta) = \frac{1}{2}\theta_0^2 + \frac{7}{4}\theta_0\theta_1 + \frac{35}{16}\theta_1^2 - \frac{9}{16}\theta_0 + 0.34375\,\theta_1 + 0.921875$$
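As a quick check on this expansion, the coefficients follow from the sums over the eight data points:

$$\sum_i x^{(i)} = 14,\quad \sum_i \left(x^{(i)}\right)^2 = 35,\quad \sum_i y^{(i)} = 4.5,\quad \sum_i x^{(i)}y^{(i)} = -2.75,\quad \sum_i \left(y^{(i)}\right)^2 = 14.75$$

so, for example, the $\theta_0\theta_1$ coefficient is $\tfrac{2(14)}{16} = \tfrac{7}{4}$ and the constant term is $\tfrac{14.75}{16} = 0.921875$.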
Partial Derivatives of 𝐽(𝜃)
$$\frac{\partial J(\theta)}{\partial \theta_0} = \theta_0 + 1.75\,\theta_1 - 0.5625 = 0 \;\Rightarrow\; \theta_0 + 1.75\,\theta_1 = 0.5625$$

$$\frac{\partial J(\theta)}{\partial \theta_1} = 1.75\,\theta_0 + 4.375\,\theta_1 + 0.34375 = 0 \;\Rightarrow\; 1.75\,\theta_0 + 4.375\,\theta_1 = -0.34375$$
After solving simultaneously, we get the following values of 𝜃0 and 𝜃1
𝜃0 = 2.333
𝜃1 = −1.011904762 ≅ −1.012
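As a numerical cross-check, a minimal NumPy sketch (assuming the eight $(x, y)$ points substituted above) solves the same normal equations:

```python
import numpy as np

# The eight (x, y) points substituted above.
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
y = np.array([2.0, 1.5, 1.5, 1.0, 1.0, 0.0, -0.5, -2.0])

# Design matrix [1, x] so that theta = (theta_0, theta_1).
X = np.column_stack([np.ones_like(x), x])

# Minimizing J(theta) is equivalent to solving the normal
# equations X^T X theta = X^T y.
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # approx [ 2.3333, -1.0119 ]
```

This reproduces $\theta_0 \approx 2.333$ and $\theta_1 \approx -1.012$.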
Part b:
Since the hypothesis is again a first-order polynomial, the generalized weighted cost function becomes:
$$J(\theta) = \sum_{i=1}^{m} \frac{1}{2}\,w^{(i)}\left(y^{(i)} - \left(\theta_0 + \theta_1 x^{(i)}\right)\right)^2 = \frac{1}{2}\sum_{i=1}^{m} w^{(i)}\left(y^{(i)} - \theta_0 - \theta_1 x^{(i)}\right)^2$$
Hence, after substituting the data points and weights, the equation becomes:
$$
\begin{aligned}
J(\theta) = \frac{1}{2}\Big[&(0.1)(2 - \theta_0 - \theta_1(0))^2 + (0.1)(1.5 - \theta_0 - \theta_1(0.5))^2 \\
&+ (0.1)(1.5 - \theta_0 - \theta_1(1))^2 + (0.1)(1 - \theta_0 - \theta_1(1.5))^2 \\
&+ (0.1)(1 - \theta_0 - \theta_1(2))^2 + (0.1)(0 - \theta_0 - \theta_1(2.5))^2 \\
&+ (0.1)(-0.5 - \theta_0 - \theta_1(3))^2 + (0.3)(-2 - \theta_0 - \theta_1(3.5))^2\Big]
\end{aligned}
$$
$$J(\theta) = \frac{1}{2}\theta_0^2 + 2.1\,\theta_0\theta_1 + 2.975\,\theta_1^2 - 0.05\,\theta_0 + 1.675\,\theta_1 + 1.1375$$
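As in Part a, the coefficients of this weighted quadratic can be checked against the weighted sums over the data:

$$\sum_i w^{(i)} = 1,\quad \sum_i w^{(i)}x^{(i)} = 2.1,\quad \sum_i w^{(i)}\left(x^{(i)}\right)^2 = 5.95,\quad \sum_i w^{(i)}y^{(i)} = 0.05,\quad \sum_i w^{(i)}x^{(i)}y^{(i)} = -1.675,\quad \sum_i w^{(i)}\left(y^{(i)}\right)^2 = 2.275$$

giving, for example, the $\theta_1^2$ coefficient $\tfrac{5.95}{2} = 2.975$ and the constant term $\tfrac{2.275}{2} = 1.1375$.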
Partial Derivatives of 𝐽(𝜃)
$$\frac{\partial J(\theta)}{\partial \theta_0} = \theta_0 + 2.1\,\theta_1 - 0.05 = 0 \;\Rightarrow\; \theta_0 + 2.1\,\theta_1 = 0.05$$

$$\frac{\partial J(\theta)}{\partial \theta_1} = 2.1\,\theta_0 + 5.95\,\theta_1 + 1.675 = 0 \;\Rightarrow\; 2.1\,\theta_0 + 5.95\,\theta_1 = -1.675$$
After solving simultaneously, we get the following values of 𝜃0 and 𝜃1
𝜃0 = 2.4773
𝜃1 = −1.155844156 ≅ −1.156
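The weighted fit can be checked the same way; a minimal NumPy sketch using the weighted normal equations $X^\top W X\,\theta = X^\top W y$, with the weights read off the expression above (0.1 for the first seven points, 0.3 for the last one):

```python
import numpy as np

# Same data points, with the weights used in Part b
# (0.1 for the first seven points, 0.3 for the last one).
x = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
y = np.array([2.0, 1.5, 1.5, 1.0, 1.0, 0.0, -0.5, -2.0])
w = np.array([0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.3])

X = np.column_stack([np.ones_like(x), x])
W = np.diag(w)

# Weighted least squares: solve X^T W X theta = X^T W y.
theta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(theta)  # approx [ 2.4773, -1.1558 ]
```

This matches $\theta_0 \approx 2.477$ and $\theta_1 \approx -1.156$.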
Part c: