CMPT 413/713 Natural language processing

Homework 0, Fall 2021

Due: Thursday Sep 16th, 2021
Instructions

Answers are to be submitted by 11:59pm of the due date as a PDF file through Canvas. Go to Canvas, select the HW0-C activity and submit your answer as answer.pdf. This assignment is to be done individually.

1 Derivatives (10pt)

Provide the derivative with respect to x for each of the following (assume log is the natural logarithm):
(a) f(x) = \frac{20x^{0.5} + 1}{x^2} (2pt)

(b) f(x) = (e^x + 1)^2 (2pt)

(c) f(x) = \log(cx^2 + x) (2pt)

(d) f(x) = e^{(x-a)^2} (2pt)

(e) f(x) = 1 - \frac{1}{e^x} (2pt)
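
If you want to double-check a hand-computed derivative, a symbolic tool such as SymPy can help. The sketch below is only illustrative: the function f is a placeholder, not one of parts (a)-(e), and it assumes the sympy package is installed.

    # Minimal sketch: verifying a hand-computed derivative with SymPy.
    # The function below is an illustrative placeholder, not one of parts (a)-(e).
    import sympy as sp

    x = sp.symbols('x')
    f = x**3 + sp.exp(2 * x)               # expression differentiated by hand
    hand_answer = 3 * x**2 + 2 * sp.exp(2 * x)

    # simplify() of the difference is 0 exactly when the two expressions agree
    assert sp.simplify(sp.diff(f, x) - hand_answer) == 0
    print(sp.diff(f, x))                   # 3*x**2 + 2*exp(2*x)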

2 Linear Algebra (6pt)

For each of the following pairs of matrices, compute their product. Specify "invalid" if the two matrices cannot be multiplied.
(a) \begin{bmatrix} 1 & 3 \\ 4 & 2 \end{bmatrix} \begin{bmatrix} 2 & 1 \\ 6 & 0 \end{bmatrix} (2pt)

(b) \begin{bmatrix} 1 & 6 & 3 \\ 2 & 1 & 3 \end{bmatrix} \begin{bmatrix} 3 \\ 5 \end{bmatrix} (2pt)

(c) \begin{bmatrix} 4 & 0 & 1 \\ 1 & 3 & 8 \end{bmatrix} \begin{bmatrix} 0 & 8 & 2 \\ 5 & 2 & 1 \end{bmatrix}^\top (2pt)
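
To sanity-check answers of this kind numerically, NumPy's @ operator can be used; a product is only defined when the inner dimensions agree. The matrices below are illustrative placeholders, not the ones from (a)-(c).

    # Minimal sketch: checking dimension compatibility and computing a matrix product.
    import numpy as np

    A = np.array([[1., 0., 2.],
                  [3., 1., 4.]])           # shape (2, 3)
    B = np.array([[1., 2.],
                  [0., 1.],
                  [5., 0.]])               # shape (3, 2)

    if A.shape[1] != B.shape[0]:
        print("invalid: inner dimensions do not match")
    else:
        print(A @ B)                       # result has shape (2, 2)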

3 Linear Algebra (5pt)

Suppose that x is a column vector of length N (x \in \mathbb{R}^{N \times 1}), and W is a matrix with D rows and N columns (W \in \mathbb{R}^{D \times N}). Given that

y = Wx
(a) What is the dimension of y? (2pt)

(b) Suppose we are interested in the derivative of y with respect to x. How many values does this derivative contain? Use N, D to denote the final result. (2pt)

(c) Let W_{i,j} be the (i, j)-th element of W. What would be \frac{\partial y_1}{\partial x_3}? (1pt)
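
The shape bookkeeping in this section can be checked numerically; the sketch below uses illustrative sizes for D and N and only demonstrates the shapes of W, x, and y = Wx.

    # Minimal sketch: shapes involved in y = Wx (illustrative D and N).
    import numpy as np

    D, N = 4, 3
    W = np.random.randn(D, N)              # W has D rows and N columns
    x = np.random.randn(N, 1)              # x is a column vector of length N
    y = W @ x
    print(W.shape, x.shape, y.shape)       # the three shapes, for these illustrative sizes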

4 Linear Algebra (5pt)

(a) Let M = \begin{bmatrix} xz & z \\ x^2 z & e^x \\ e^x z & 1 \end{bmatrix}. What is the derivative of M with respect to x? (3pt)

(b) What is the rank of M if x = z = 1? (2pt)
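
A rank answer can be checked numerically with np.linalg.matrix_rank; the matrix in the sketch below is an illustrative placeholder, not M evaluated at x = z = 1.

    # Minimal sketch: checking the rank of a numeric matrix.
    import numpy as np

    A = np.array([[1., 2.],
                  [2., 4.],
                  [3., 6.]])               # every row is a multiple of (1, 2)
    print(np.linalg.matrix_rank(A))        # 1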

5 Cosine similarity (6pt)

Cosine similarity is a measure of similarity between two non-zero vectors of an inner product space. It is
defined as follows:
sim_{\cos}(a, b) = \frac{a \cdot b}{\|a\| \, \|b\|}
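
As a quick illustration of this definition, the sketch below computes the cosine similarity of two small vectors with NumPy; the vectors are illustrative, not the ones in part (a).

    # Minimal sketch: cosine similarity of two vectors.
    import numpy as np

    def cosine_similarity(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    a = np.array([1., 0., 1.])
    b = np.array([0., 1., 1.])
    print(cosine_similarity(a, b))         # 0.5
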
(a) Given a = (2, 2, 1) and b = (0, 1, 3), what is the cosine similarity of the two vectors a and b? (2pt)

(b) What are the minimum and maximum values the cosine similarity can take? (2pt)

(c) Give an example of when the cosine similarity is maximized but the Euclidean distance between the two vectors is greater than zero. (2pt)

6 Probability (5pt)

Anna has a weighted coin that has a probability p_h of landing heads up and a probability p_t of landing tails up.
(a) If p_h = 0.6, what is p_t? (1pt)

(b) Assuming that the flips are independent, if she flips the coin 10 times, what is the probability that she gets 10 heads in a row? (2pt)

(c) Assuming that the flips are independent, if she flips the coin 10 times, what is the probability that she gets 7 heads and 3 tails? (2pt)
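
For checking answers like these, the binomial formula can be evaluated directly in Python; the head probability below is illustrative, not the one given in the question.

    # Minimal sketch: probability of exactly k heads in n independent flips.
    from math import comb

    p_h = 0.5                              # illustrative head probability
    n, k = 10, 7
    p_exactly_k = comb(n, k) * p_h**k * (1 - p_h)**(n - k)
    print(p_exactly_k)                     # 0.1171875 for a fair coin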

7 Probability (8pt)

Twenty percent (20%) of the population has a genetic defect that can lead to a specific disease in later
life. Of the people who have this defect, forty percent (40%) will develop the disease. A test is developed
for detecting the genetic defect. The test can detect the defect ninety percent (90%) of the time and gives
a false positive 5% of the time.
(a) What is the probability that Joe (a random person) tests negative for the defect? (3pt)

(b) Joe just got the happy news that the test came back negative; what is the probability that Joe will develop the disease in the future? (5pt)
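
Both parts follow the law of total probability and Bayes' rule; the sketch below spells out that structure with illustrative numbers rather than the ones in the question.

    # Minimal sketch: total probability and Bayes' rule for a diagnostic test.
    # All numbers are illustrative placeholders.
    p_defect = 0.5                         # P(defect)
    p_pos_given_defect = 0.99              # P(test positive | defect)
    p_pos_given_no_defect = 0.10           # P(test positive | no defect), false positive rate

    # P(negative) by total probability
    p_neg = (1 - p_pos_given_defect) * p_defect + (1 - p_pos_given_no_defect) * (1 - p_defect)
    # P(defect | negative) by Bayes' rule
    p_defect_given_neg = (1 - p_pos_given_defect) * p_defect / p_neg
    print(p_neg, p_defect_given_neg)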

8 Probability (10pt)

Let X be a continuous random variable with p.d.f.

f_X(x) = \begin{cases} 2e^{-x} & \text{for } 1 < x < 2 \\ 0 & \text{otherwise} \end{cases}

Compute the expectation and variance, E(X) and Var(X).
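
Expectation and variance of a continuous random variable can be checked by numerical integration, e.g. with scipy.integrate.quad; the density in the sketch below (uniform on (0, 1)) is illustrative, not the one defined above.

    # Minimal sketch: E(X) and Var(X) for a continuous density via numerical integration.
    from scipy.integrate import quad

    def pdf(x):
        return 1.0 if 0.0 < x < 1.0 else 0.0   # illustrative uniform density on (0, 1)

    E_X, _ = quad(lambda x: x * pdf(x), 0.0, 1.0)
    E_X2, _ = quad(lambda x: x**2 * pdf(x), 0.0, 1.0)
    var_X = E_X2 - E_X**2
    print(E_X, var_X)                      # 0.5 and 1/12 (approximately 0.0833)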

9 Information Theory (10pt)

Suppose there are two distributions P and Q, and three values x_1, x_2, and x_3.
                  x_1     x_2     x_3
distribution P    0.10    0.80    0.10
distribution Q    0.25    0.40    0.35
(a) Compute the entropy of P and the entropy of Q. (4pt)

(b) Calculate the cross-entropy H(P, Q). (4pt)

(c) Given arbitrary distributions P and Q, when is the cross-entropy H(P, Q) minimized? When is it maximized? (2pt)
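
A small helper makes these quantities easy to check numerically. The sketch below computes entropy and cross-entropy in bits with NumPy; the distributions p and q are illustrative, not the P and Q in the table.

    # Minimal sketch: entropy H(p) and cross-entropy H(p, q) in bits.
    import numpy as np

    def entropy(p):
        p = np.asarray(p, dtype=float)
        p = p[p > 0]                       # treat 0 * log 0 as 0
        return float(-np.sum(p * np.log2(p)))

    def cross_entropy(p, q):
        p = np.asarray(p, dtype=float)
        q = np.asarray(q, dtype=float)
        mask = p > 0
        return float(-np.sum(p[mask] * np.log2(q[mask])))

    p = [0.5, 0.25, 0.25]                  # illustrative distributions
    q = [0.25, 0.25, 0.5]
    print(entropy(p), cross_entropy(p, q)) # 1.5 and 1.75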

10 Information Theory (10pt)

The joint distribution of two random variables X and Y is as follows:
        x = a    x = b    x = c    x = d
y = a   1/4      1/16     1/16     1/8
y = b   0        3/16     1/16     0
y = c   0        1/32     1/16     1/32
y = d   1/32     1/32     1/16     0
(a) What is the mutual information I(X; Y) between the two random variables X and Y, in bits? (5pt)

(b) What is the range of the mutual information between any two random variables? When is it minimized? (5pt)
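
Mutual information can likewise be computed directly from a joint table, using I(X; Y) = \sum_{x,y} p(x, y) \log_2 \frac{p(x, y)}{p(x) p(y)}. The sketch below does this with NumPy; the joint table is an illustrative independent one, not the table in the question.

    # Minimal sketch: mutual information in bits from a joint table p(x, y).
    # Rows index values of X and columns index values of Y.
    import numpy as np

    def mutual_information(joint):
        joint = np.asarray(joint, dtype=float)
        px = joint.sum(axis=1, keepdims=True)   # p(x), shape (|X|, 1)
        py = joint.sum(axis=0, keepdims=True)   # p(y), shape (1, |Y|)
        mask = joint > 0                        # skip zero cells (0 * log 0 = 0)
        return float(np.sum(joint[mask] * np.log2(joint[mask] / (px @ py)[mask])))

    joint = np.array([[0.25, 0.25],
                      [0.25, 0.25]])            # X and Y independent here
    print(mutual_information(joint))            # 0.0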