Statistics Problems: Markov Chains and Probabilities


(1) Let {X_n, n = 1, 2, ...} be a sequence of independent random variables with state space S_{X_n} = {0, 1}, P[X_n = 0] = 2/3 and P[X_n = 1] = 1/3. Let Y_n = Σ_{i=1}^n X_i.
(a) Find the first-order probability mass function of Y_n.
(b) Find E[Y_n] and C_Y(n, n + k).
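Since the X_i are independent Bernoulli(1/3) variables, Y_n is Binomial(n, 1/3), so E[Y_n] = n/3 and, because the increments past time n are independent of Y_n, C_Y(n, n+k) = Cov(Y_n, Y_{n+k}) = Var(Y_n) = 2n/9. A minimal Monte Carlo sketch of that check (the values of n, k, the sample size, and the seed are arbitrary choices, not part of the problem):

```python
import numpy as np

rng = np.random.default_rng(0)
n, k, trials = 10, 5, 200_000

# Simulate X_i in {0,1} with P[X_i = 1] = 1/3, then the partial sums Y_n.
X = (rng.random((trials, n + k)) < 1/3).astype(float)
Y = X.cumsum(axis=1)
Yn, Ynk = Y[:, n - 1], Y[:, n + k - 1]

# (a)/(b) Y_n ~ Binomial(n, 1/3): E[Y_n] = n/3.
print(Yn.mean())                # should be close to 10/3
# Cov(Y_n, Y_{n+k}) = n * Var(X_1) = 2n/9.
print(np.cov(Yn, Ynk)[0, 1])    # should be close to 20/9
```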

(2) Suppose that S = Z. Let X_n, n ≥ 1, be a sequence of independent identically distributed random variables with P(X_1 = 1) = p and P(X_1 = -1) = q = 1 - p. Define S_n = Σ_{i=1}^n X_i for n ≥ 1 and S_0 = 0. Show that S_n is a Markov chain with transition probabilities

p(j|i) = p, if j = i + 1
         q, if j = i - 1
         0, otherwise
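The Markov property follows from the independence of the increments: S_{n+1} = S_n + X_{n+1}, and X_{n+1} is independent of the past. As an illustrative (not required) numerical check, one can simulate many walks and verify that the empirical one-step transition frequencies out of a given state match p and q; the value of p, the walk length, and the seed below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(1)
p, trials = 0.5, 200_000   # hypothetical p; q = 1 - p

# Many independent walks of 6 steps; columns of S are S_1, ..., S_6.
steps = np.where(rng.random((trials, 6)) < p, 1, -1)
S = np.cumsum(steps, axis=1)

# Empirical transition out of state i = 1 at time n = 5:
# P(S_6 = i+1 | S_5 = i) should be close to p, and P(S_6 = i-1 | S_5 = i) to q.
at1 = S[:, 4] == 1
print((S[at1, 5] == 2).mean())   # close to p
print((S[at1, 5] == 0).mean())   # close to q
```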

(3) Suppose that random variables X and Y are N(0, 0; σ_1^2, σ_2^2; r), so that E(X^2) = σ_1^2 and E(Y^2) = σ_2^2. f(Y|X) is a normal density with mean (r σ_2 / σ_1) X and variance σ_2^2 (1 - r^2). If

E(Y^2 | X) = ((r σ_2 / σ_1) X)^2 + σ_2^2 (1 - r^2),

find E(XY) and E(X^2 Y^2).
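Iterating expectations through the conditional moments above gives E(XY) = r σ_1 σ_2 and E(X^2 Y^2) = σ_1^2 σ_2^2 (1 + 2r^2) (the latter also follows from Isserlis' theorem). A Monte Carlo sanity check with arbitrary hypothetical parameter values:

```python
import numpy as np

rng = np.random.default_rng(2)
s1, s2, r = 1.5, 2.0, 0.6   # hypothetical sigma_1, sigma_2, r
cov = [[s1**2, r * s1 * s2], [r * s1 * s2, s2**2]]

# Jointly normal (X, Y) with zero means and the covariance above.
X, Y = rng.multivariate_normal([0, 0], cov, size=500_000).T

print((X * Y).mean(), r * s1 * s2)                        # both close to 1.8
print((X**2 * Y**2).mean(), s1**2 * s2**2 * (1 + 2 * r**2))
```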

(4) We consider a discrete-time and discrete-space Markov chain X_n with state space {0, 1, 2, 3, 4, 5} modeling the wealth of a gambler. The gambler starts with an initial capital of 4 dollars at time n = 0. At each round he tosses a coin and gets a head with probability 1/4 or a tail with probability 3/4. Every time he gets a head he earns one dollar, and every time he gets a tail he loses one dollar. He stops playing whenever he is ruined, i.e. his capital X_n goes down to 0, or when his capital reaches 5, i.e. X_n = 5.
(a) Give the transition probability matrix of this Markov chain.
(b) Compute P^(2).
(c) Compute E[X_2 | X_0 = 2].
(d) What is the gambler's probability of ruin in one step?
(e) What is the gambler's probability of ruin in exactly 4 steps?
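A sketch of the matrix computations, assuming (to make the statement self-consistent) that the state space is {0, 1, ..., 5} with 0 and 5 absorbing, up-probability 1/4 and down-probability 3/4 on the interior states:

```python
import numpy as np

p, q = 0.25, 0.75
# States 0..5; 0 (ruin) and 5 (target) are absorbing.
P = np.zeros((6, 6))
P[0, 0] = P[5, 5] = 1.0
for i in range(1, 5):
    P[i, i + 1] = p
    P[i, i - 1] = q

P2 = P @ P                        # (b) the two-step matrix P^(2)
e = P2[2] @ np.arange(6)          # (c) E[X_2 | X_0 = 2]
ruin1 = P[4, 0]                   # (d) ruin in one step from capital 4
# (e) first hit of 0 at step 4: mass on 0 gained between P^3 and P^4
ruin4 = np.linalg.matrix_power(P, 4)[4, 0] - np.linalg.matrix_power(P, 3)[4, 0]
print(e, ruin1, ruin4)            # 1.0  0.0  0.31640625 = (3/4)^4
```

From capital 4 the only way to be ruined in exactly 4 steps is four tails in a row, which is why (e) collapses to (3/4)^4.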

(5) Consider the process X(t), t ≥ 0, defined by

X(t) = 1, if t ≤ Y
       0, if t > Y

where Y is a uniformly distributed random variable on the interval (0, 1).
(a) Compute, for t ∈ (0, 1), the first-order probability mass of X, f(t, x).
(b) Give the expectation and the variance of X.
(c) Compute the autocorrelation function of X, E[X(t_1) X(t_2)], for t_1, t_2 ∈ (0, 1).
(d) What is the distribution of an increment of X? Does X have stationary increments?
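Since X(t) = 1 exactly when t ≤ Y, P[X(t) = 1] = 1 - t, and X(t_1)X(t_2) = 1 exactly when Y is at least max(t_1, t_2), so E[X(t_1)X(t_2)] = 1 - max(t_1, t_2). A minimal simulation of those two facts (the sample times t_1, t_2, the sample size, and the seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
Y = rng.random(500_000)          # Y ~ Uniform(0, 1)
t1, t2 = 0.3, 0.7                # hypothetical sample times

X1 = (t1 <= Y).astype(float)     # X(t) = 1 if t <= Y, else 0
X2 = (t2 <= Y).astype(float)

print(X1.mean())                 # P[X(t1) = 1] = 1 - t1 = 0.7
print((X1 * X2).mean())          # E[X(t1) X(t2)] = 1 - max(t1, t2) = 0.3
```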

(6) X_1, X_2, X_3, ... is a sequence of independent random variables, each with mean 0 and variance 1. We define Y_k to be X_1 + X_2 + ... + X_k. If k < j, what is the correlation coefficient between Y_k and Y_j?
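Here Cov(Y_k, Y_j) = k, Var(Y_k) = k, and Var(Y_j) = j, so the correlation coefficient is k / sqrt(kj) = sqrt(k/j). A quick check by simulation, using standard normal steps as one valid choice of mean-0, variance-1 variables (k, j, the sample size, and the seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
k, j, trials = 4, 9, 200_000

# Any iid mean-0, variance-1 steps work; standard normals are one choice.
X = rng.standard_normal((trials, j))
Yk = X[:, :k].sum(axis=1)
Yj = X.sum(axis=1)

# Cov(Yk, Yj) = k, Var(Yk) = k, Var(Yj) = j, so corr = sqrt(k/j).
print(np.corrcoef(Yk, Yj)[0, 1])   # close to sqrt(4/9) = 2/3
```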
(7) We consider a sequence of independent and identically distributed random variables X_1, X_2, ..., X_n, ... with common mean μ and variance σ^2 > 0. We also assume that the variables X_i have finite third and fourth moments. Let

T_N = (1/N) Σ_{i=1}^N (X_i - μ)^2.

(a) Are the variables (X_i - μ)^2 independent and identically distributed?
(b) What is their common mean? Their common variance?
(c) Does T_N have a limit in the almost sure sense? What is it?
(d) Compute E[T_N].
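Since the (X_i - μ)^2 are themselves iid with mean σ^2, the strong law of large numbers gives T_N → σ^2 almost surely, and E[T_N] = σ^2 for every N. A sketch of the convergence, using Exponential(1) variables (so μ = 1 and σ^2 = 1) as one hypothetical choice with finite fourth moment:

```python
import numpy as np

rng = np.random.default_rng(5)
mu = 1.0                  # X_i ~ Exponential(1): mean 1, variance 1
N = 1_000_000
X = rng.exponential(1.0, N)

# Running averages T_N of the iid variables (X_i - mu)^2; by the SLLN
# they converge almost surely to sigma^2 = 1.
T = np.cumsum((X - mu) ** 2) / np.arange(1, N + 1)
print(T[999], T[99_999], T[-1])   # settling toward 1.0
```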
(8) An asset price is modeled by using a sequence of independent and identically distributed random variables X_1, X_2, ... with common density function f. We say that a record price occurs at time n if X_n > max(X_1, ..., X_{n-1}).
(a) Compute P[a record price occurs at time n].
Next, consider the Bernoulli variables I_i defined as

I_i = 1, if a record occurs at time i;
      0, otherwise.

(b) Let Y_n be the number of records by time n. Express the variable Y_n in terms of the Bernoulli variables I_i.
(c) Compute the expectation and variance of I_i.
(d) Calculate the variance of the number of records by time n. Hint: do not attempt to calculate the sum.
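By exchangeability, each of the first n observations is equally likely to be the largest, so a record occurs at time n with probability 1/n, and the expected number of records by time n is the harmonic number H_n. A short simulation of both facts (n, the number of trials, and the seed are arbitrary; the first observation counts as a record):

```python
import numpy as np

rng = np.random.default_rng(6)
n, trials = 20, 100_000

U = rng.random((trials, n))                  # continuous iid "prices"
# A record at time i means the value equals the running maximum
# (ties have probability zero for continuous variables).
rec = U == np.maximum.accumulate(U, axis=1)

print(rec[:, n - 1].mean())                  # P[record at n] = 1/n = 0.05
Hn = sum(1 / i for i in range(1, n + 1))
print(rec.sum(axis=1).mean(), Hn)            # E[# records by n] = H_n
```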
(9) Consider a random walk Y_n = Σ_{i=1}^n X_i with P(X_1 = 1) = P(X_1 = -1) = 0.5, started from Y_0 = 0.
Find the following probabilities:
(a) P(Y_4 = k) for all possible values of k.
(b) P(Y_n ≥ 0) for n = 1, 2, 3, 4.
(c) P(Y_n ≠ 0), for n = 1, 2, 3, 4.
(d) P(|Y_n| ≤ 2), for n = 1, 2, 3, 4.
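With only 2^4 = 16 equally likely step sequences, all of these probabilities can be checked by brute-force enumeration; a sketch for (a) and for the event that the walk never revisits 0 up to time 4:

```python
from itertools import product
from collections import Counter

# Enumerate all 16 equally likely step sequences of the symmetric walk.
paths = []
for steps in product([1, -1], repeat=4):
    y, path = 0, [0]
    for s in steps:
        y += s
        path.append(y)
    paths.append(path)

# (a) distribution of Y_4: counts out of 16 equally likely paths
# (Y_4 = +/-4 once each, +/-2 four times each, 0 six times).
print(Counter(p[-1] for p in paths))
# P(Y_n != 0 for all n = 1..4): count paths avoiding 0 after time 0.
print(sum(all(y != 0 for y in p[1:]) for p in paths), "/ 16")   # 6 / 16 = 3/8
```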
(10) Suppose that X_1, X_2, ... are independent identically distributed random variables with mean μ and Y_n = Σ_{i=1}^n (X_i - μ). Show that Y_n is a martingale.
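The key identity is E[Y_{n+1} | X_1, ..., X_n] = Y_n + E[X_{n+1} - μ] = Y_n. As an informal spot check, one can simulate many paths, bin them by the value of Y_n, and verify that the average of Y_{n+1} within a bin matches the bin's average Y_n; the Exponential step distribution, bin, n, and seed below are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(8)
mu, n, trials = 0.5, 6, 400_000
# Hypothetical choice: X_i ~ Exponential with mean mu = 0.5.
X = rng.exponential(mu, (trials, n + 1))
Y = np.cumsum(X - mu, axis=1)        # columns are Y_1, ..., Y_{n+1}

# Martingale spot check: among paths whose Y_n lies in a narrow bin,
# the average of Y_{n+1} should match the average of Y_n.
mask = (Y[:, n - 1] > 0.4) & (Y[:, n - 1] < 0.6)
print(Y[mask, n - 1].mean(), Y[mask, n].mean())   # nearly equal
```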
(11) Let {X_n, n = 1, 2, ...} be a discrete-time and discrete-state stochastic process such that, for every n, i, j and every i_1, ..., i_{n-1},

P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_1 = i_1) = P(X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1})

(a) Is this process Markovian? Explain.
(b) Can you make it Markovian? In other words, can you model the same process in a slightly different way, changing in particular its state space, so that the new process you obtain is Markovian?
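For (b), the standard device is to enlarge the state to the pair Z_n = (X_{n-1}, X_n), which carries exactly the two values the transition law depends on. A sketch with a hypothetical second-order binary process (the transition table p and the seed are invented for illustration): the empirical transition frequencies out of each pair state match the table, confirming the pair chain is Markov by construction.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(7)
# Hypothetical second-order binary process: the chance of the next 1
# depends on the last TWO values via p[(x_{n-1}, x_n)].
p = {(0, 0): 0.9, (0, 1): 0.2, (1, 0): 0.5, (1, 1): 0.7}

X = [0, 0]
for _ in range(200_000):
    X.append(int(rng.random() < p[(X[-2], X[-1])]))

# Transition frequencies out of each pair state Z_n = (X_{n-1}, X_n):
# the fraction of 1-moves out of pair z should be close to p[z].
counts = defaultdict(lambda: [0, 0])
for a, b, c in zip(X, X[1:], X[2:]):
    counts[(a, b)][c] += 1
for z in sorted(counts):
    n0, n1 = counts[z]
    print(z, round(n1 / (n0 + n1), 3))   # close to p[z]
```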