CS559-B: Machine learning

assignment 4

Homework assignments will be done individually: each student must hand in their own
answers. Use of partial or entire solutions obtained from others or online is strictly
prohibited. Electronic submission on Canvas is mandatory.
1. Clustering (10 points) Suppose we clustered a set of N data points using two different clustering
algorithms: k-means and Gaussian mixtures. In both cases we obtained 5 clusters, and in both cases
the centers of the clusters are exactly the same. Can a few (say 3) points that are assigned to different
clusters in the k-means solution be assigned to the same cluster in the Gaussian mixture solution? If not,
explain. If so, sketch an example or explain in 1-2 sentences.
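As a warm-up for thinking about this question (not part of the assignment), the sketch below contrasts the two assignment rules on a hypothetical 1-D example: two components share their centers with the k-means solution, but have unequal variances. All numbers here (centers, variances, the query point) are made up for illustration.

```python
import math

# Hypothetical example: two 1-D components whose centers double as the
# k-means centers, but whose GMM variances differ.
mu = [0.0, 3.0]     # shared centers (assumed)
var = [0.25, 9.0]   # unequal component variances (assumed)
pi = [0.5, 0.5]     # equal mixture weights

def normal_pdf(x, m, v):
    """Density of N(m, v) at x."""
    return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

x = 1.4
# k-means: hard assignment to the nearest center.
kmeans_cluster = min(range(2), key=lambda j: abs(x - mu[j]))
# GMM: soft responsibilities, then the most responsible component.
resp = [pi[j] * normal_pdf(x, mu[j], var[j]) for j in range(2)]
gmm_cluster = max(range(2), key=lambda j: resp[j])

print(kmeans_cluster, gmm_cluster)   # prints "0 1"
```

The point x = 1.4 is nearer to center 0 (distance 1.4 vs 1.6), so k-means assigns it there, while the much wider Gaussian at 3.0 gives it higher likelihood, so the GMM assigns it to component 1 — which is the kind of disagreement the question is probing.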
[Embedded figure, panels (a) and (b): screenshots of 10-601 Machine Learning (Fall 2009), Homework 2 Solutions (due Wednesday, September 16, 10:30 am), Problem 1 "Bayesian networks and factor graphs [12 pts]". Each panel reproduces Figure 1 of that handout — a factor graph (a) and Bayesian networks (b, c) over the variables A, B, C, D — together with the excerpted solution, which states that for the factor graph (a) both A ⊥ C | B, D and B ⊥ D | A, C hold, since all paths between the queried variables are inactive once the conditioning variables are observed. These are the networks referred to in Problem 2 below.]
2. Bayesian Networks (10 points) Do the following statements hold in each of the above networks?
Please explain your reasoning.
  • A ⊥ C | B, D
  • B ⊥ D | A, C
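Queries like these can be checked mechanically with the moralized-ancestral-graph test for d-separation. The sketch below is illustrative only — the actual graphs are in the figure above; the edge list used at the bottom is a hypothetical collider network, not one of the assignment's graphs.

```python
from collections import defaultdict
from itertools import combinations

def d_separated(edges, x, y, z):
    """Return True if x is d-separated from y given z in the DAG `edges`,
    using the moralized ancestral graph criterion:
    1. keep only ancestors of {x, y} and z,
    2. marry co-parents and drop edge directions,
    3. delete z; x and y are independent iff they are then disconnected."""
    parents = defaultdict(set)
    for u, v in edges:
        parents[v].add(u)

    # 1. ancestral set of the query variables
    anc, stack = set(), [x, y, *z]
    while stack:
        n = stack.pop()
        if n not in anc:
            anc.add(n)
            stack.extend(parents[n])

    # 2. moralize and drop directions
    nbrs = defaultdict(set)
    for u, v in edges:
        if u in anc and v in anc:
            nbrs[u].add(v); nbrs[v].add(u)
    for v in anc:
        for a, b in combinations(parents[v] & anc, 2):
            nbrs[a].add(b); nbrs[b].add(a)

    # 3. search from x, never passing through observed nodes z
    seen, stack = {x}, [x]
    while stack:
        n = stack.pop()
        for m in nbrs[n]:
            if m == y:
                return False           # x reaches y: not d-separated
            if m not in seen and m not in z:
                seen.add(m); stack.append(m)
    return True

# Hypothetical example (NOT a graph from the figure): collider A -> C <- B
collider = [('A', 'C'), ('B', 'C')]
print(d_separated(collider, 'A', 'B', set()))    # True: path blocked at C
print(d_separated(collider, 'A', 'B', {'C'}))    # False: observing C opens it
```

Running the same function on the figure's three graphs (once you transcribe their edges) answers both bullets directly.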
[Figure 1: scatter plot of the 10 data points (5.9, 3.2), (4.6, 2.9), (6.2, 2.8), (4.7, 3.2), (5.5, 4.2), (5.0, 3.0), (4.9, 3.1), (6.7, 3.1), (5.1, 3.8), (6.0, 3.0), together with the three initialized cluster centers μ1 = (6.2, 3.2) (red), μ2 = (6.6, 3.7) (green), μ3 = (6.5, 3.0) (blue).]

3. K-means (30 points) Given the matrix X whose rows represent different data points, you are asked to perform a k-means clustering on this dataset using the Euclidean distance as the distance function. Here k is chosen as 3. The Euclidean distance d between vectors x and y, both in R^d, is defined as d(x, y) = sqrt( sum_{i=1}^{d} (x_i - y_i)^2 ). All data in X were plotted in Figure 1; the rows of X are the 10 points
(5.9, 3.2), (4.6, 2.9), (6.2, 2.8), (4.7, 3.2), (5.5, 4.2), (5.0, 3.0), (4.9, 3.1), (6.7, 3.1), (5.1, 3.8), (6.0, 3.0).
The centers of the 3 clusters were initialized as μ1 = (6.2, 3.2) (red), μ2 = (6.6, 3.7) (green), μ3 = (6.5, 3.0) (blue).
(a) What's the center of the first cluster (red) after one iteration? (Answer in the format [x1, x2]; round your results to three decimal places.)
(b) What's the center of the second cluster (green) after two iterations?
(c) What's the center of the third cluster (blue) when the clustering converges?
(d) How many iterations are required for the clusters to converge?
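One way to work through parts (a)-(d) is to run Lloyd's algorithm in a few lines of code. The sketch below is one possible implementation (not required by the assignment), using the data points and initial centers as read off Figure 1; it stops when the cluster assignments no longer change. Note that the answer to (d) depends on how you count the final no-change pass; this sketch counts the iteration in which assignments first stop changing.

```python
import numpy as np

# Data points read off Figure 1 and the three initial centers from the problem.
X = np.array([[5.9, 3.2], [4.6, 2.9], [6.2, 2.8], [4.7, 3.2], [5.5, 4.2],
              [5.0, 3.0], [4.9, 3.1], [6.7, 3.1], [5.1, 3.8], [6.0, 3.0]])
centers = np.array([[6.2, 3.2],   # cluster 1 (red)
                    [6.6, 3.7],   # cluster 2 (green)
                    [6.5, 3.0]])  # cluster 3 (blue)

def kmeans(X, centers, max_iter=100):
    """Lloyd's algorithm; returns final centers, labels, and iteration count."""
    centers = centers.copy()
    labels = None
    for it in range(1, max_iter + 1):
        # Assignment step: nearest center under squared Euclidean distance.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        new_labels = d2.argmin(axis=1)
        # Update step: each center moves to the mean of its assigned points.
        for j in range(len(centers)):
            if np.any(new_labels == j):
                centers[j] = X[new_labels == j].mean(axis=0)
        if labels is not None and np.array_equal(labels, new_labels):
            return centers, labels, it    # assignments stable: converged
        labels = new_labels
    return centers, labels, max_iter

centers_out, labels, n_iter = kmeans(X, centers)
print(np.round(centers_out, 3), n_iter)
```

Calling `kmeans(X, centers, max_iter=1)` gives the centers after exactly one iteration, which is what part (a) asks for.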

4. Expectation Maximization (EM) (50 points) In this question you will implement the EM algorithm for Gaussian mixture models. A good read on Gaussian-mixture EM can be found at this link. A sample dataset for this problem can be downloaded from the Canvas files. For this problem:

  • n is the number of training points
  • f is the number of features
  • k is the number of Gaussians
  • X is an n × f matrix of training data
  • w is an n × k matrix of membership weights; w(i, j) is the probability that x_i was generated by Gaussian j
  • π is a k × 1 vector of mixture weights (Gaussian prior probabilities); π_i is the prior probability that any point belongs to cluster i
  • μ is a k × f matrix containing the means of each Gaussian
  • Σ is an f × f × k tensor of covariance matrices; Σ(:, :, i) is the covariance of Gaussian i

(a) Expectation: Complete the function [w] = Expectation(X, k, π, μ, Σ). This function takes in a set of parameters of a Gaussian mixture model and outputs the membership weights of each data point.
(b) Maximization of means: Complete the function [μ] = MaximizeMean(X, k, w). This function takes in the training data along with the membership weights, and calculates the new maximum-likelihood mean for each Gaussian.
(c) Maximization of covariances: Complete the function [Σ] = MaximizeCovariance(X, k, w, μ). This function takes in the training data along with the membership weights and means for each Gaussian, and calculates the new maximum-likelihood covariance for each Gaussian.
(d) Maximization of mixture weights: Complete the function [π] = MaximizeMixtures(k, w). This function takes in the membership weights, and calculates the new maximum-likelihood mixture weight for each Gaussian.
(e) EM: Put everything together and implement the function [π, μ, Σ] = EM(X, k, π0, μ0, Σ0, nIter). This function runs the EM algorithm for nIter steps and returns the parameters of the underlying GMM. Note: since this code will call your other functions, make sure that they are correct first. A good way to test your EM function offline is to check that the log-likelihood, log P(X | π, μ, Σ), is increasing with each iteration of EM.
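The signatures above look MATLAB-style; as one illustrative sketch (not the required submission format), the same five pieces can be written in Python/NumPy as below. Function names mirror the assignment's; the small `reg * I` term added to each covariance is my own addition for numerical stability, not part of the assignment's spec.

```python
import numpy as np

def gaussian_density(X, mu_j, sigma_j):
    """N(x | mu_j, sigma_j) evaluated at every row of X."""
    f = X.shape[1]
    diff = X - mu_j
    inv = np.linalg.inv(sigma_j)
    norm = 1.0 / np.sqrt((2 * np.pi) ** f * np.linalg.det(sigma_j))
    mahal = np.einsum('ij,jk,ik->i', diff, inv, diff)   # per-row (x-mu)^T S^-1 (x-mu)
    return norm * np.exp(-0.5 * mahal)

def expectation(X, k, pi, mu, sigma):
    """E-step: n x k membership weights w[i, j] = P(z_i = j | x_i)."""
    w = np.empty((X.shape[0], k))
    for j in range(k):
        w[:, j] = pi[j] * gaussian_density(X, mu[j], sigma[:, :, j])
    return w / w.sum(axis=1, keepdims=True)

def maximize_mean(X, k, w):
    """M-step for means: responsibility-weighted average of the data."""
    return (w.T @ X) / w.sum(axis=0)[:, None]

def maximize_covariance(X, k, w, mu, reg=1e-6):
    """M-step for covariances (reg * I added for numerical stability)."""
    f = X.shape[1]
    sigma = np.empty((f, f, k))
    for j in range(k):
        diff = X - mu[j]
        sigma[:, :, j] = (w[:, j, None] * diff).T @ diff / w[:, j].sum()
        sigma[:, :, j] += reg * np.eye(f)
    return sigma

def maximize_mixtures(k, w):
    """M-step for mixture weights: average responsibility per component."""
    return w.mean(axis=0)

def log_likelihood(X, k, pi, mu, sigma):
    """log P(X | pi, mu, sigma); should not decrease across EM iterations."""
    dens = sum(pi[j] * gaussian_density(X, mu[j], sigma[:, :, j]) for j in range(k))
    return np.log(dens).sum()

def em(X, k, pi0, mu0, sigma0, n_iter):
    """Run n_iter full EM iterations from the given initial parameters."""
    pi, mu, sigma = pi0.copy(), mu0.copy(), sigma0.copy()
    for _ in range(n_iter):
        w = expectation(X, k, pi, mu, sigma)      # E-step
        mu = maximize_mean(X, k, w)               # M-steps
        sigma = maximize_covariance(X, k, w, mu)
        pi = maximize_mixtures(k, w)
    return pi, mu, sigma
```

Tracking `log_likelihood` across iterations, as the note in part (e) suggests, is the quickest offline sanity check that all four sub-functions fit together correctly.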
