MATH 5763 - Stochastic Processes, Section 001 - Spring 2011
MWF 1:30-2:20 p.m., 117 PHSC
Instructor:
Nikola Petrov, 802 PHSC, (405)325-4316, npetrov AT math.ou.edu
Office Hours:
Monday 2:30-3:30 p.m., Wednesday 3:30-4:30 p.m., or by appointment.
Prerequisite:
Basic calculus-based probability theory at the level of MATH 4733
(including axioms of probability, random variables, expectation,
probability distributions, independence, conditional probability).
The class will also require knowledge of elementary analysis
(including sequences, series, continuity),
linear algebra (including linear spaces, eigenvalues,
eigenvectors),
and ordinary differential equations (at the level
of MATH 3113 or MATH 3413).
Course description:
The theory of stochastic processes studies systems that evolve
randomly in time;
it can be regarded as the "dynamical" part of probability theory.
It has many important practical applications,
as well as applications in other branches
of mathematics such as partial differential equations.
This course is a graduate-level
introduction to stochastic processes,
and should be of interest to students of mathematics,
statistics, physics, engineering, and economics.
The emphasis will be on the fundamental concepts,
but we will avoid using the theory of Lebesgue measure
and integration in any essential way.
Many examples of stochastic phenomena in applications
and some modeling issues will also be discussed in class
and given as homework problems.
Text:
Mario Lefebvre,
Applied Stochastic Processes,
1st edition, Springer, 2006,
ISBN-10: 0387341714,
ISBN-13: 978-0387341712,
which is freely available online to OU students
through the
OU Library.
(At the end of the course we may also use parts of the book
Hui-Hsiung Kuo, Introduction to Stochastic Integration,
1st edition, Springer, 2007,
also freely available online to OU students
through the OU Library.)
Main topics (a tentative list):
- a brief review of probability theory;
- discrete Markov chains: Chapman-Kolmogorov equations,
persistence and transience, generating functions,
stationary distributions, reducibility, limit theorems, ergodicity;
- continuous Markov processes:
Poisson process, birth-death and branching processes,
embedding of a discrete-time Markov chain
in a continuous-time Markov process;
- conditional expectation, martingales;
- stationary processes (autocorrelation function,
spectral representation);
- renewal processes, queues;
- diffusion processes, Wiener processes (Brownian motion);
- introduction to stochastic differential equations, Ito calculus;
- Fokker-Planck equation, Black-Scholes option-pricing formula,
Ornstein-Uhlenbeck process.
Homework:
- Homework 1, due Friday, January 28.
- Homework 2, due Friday, February 4.
- Homework 3, due Friday, February 18.
- Homework 4, due Friday, February 25.
- Homework 5, due Friday, March 4.
- Homework 6, due Friday, March 11.
- Homework 7, due Friday, April 1.
- Homework 8, due Friday, April 15.
- Homework 9, due Friday, April 22.
- Homework 10, due Friday, April 29.
- Homework 11, due Friday, May 6.
Content of the lectures:
- Lecture 1 (Wed, Jan 19):
Review of probability:
sample space, examples, events, elementary events,
σ-algebra (σ-field),
operations with events (complement,
union, intersection), disjoint events,
De Morgan's laws,
probability (probability measure),
probability space (pages 1-3 of Sec. 1.1).
- Lecture 2 (Fri, Jan 21):
Review of probability (cont.):
elementary properties of probability measures
(including the inclusion-exclusion formula),
conditional probability,
partitions of the sample space,
law of total probability, Bayes' formula,
independent events, independent family of events
(pages 3, 5-8 of Sec. 1.1).
Random variables:
random variables (RVs),
(cumulative) distribution function (c.d.f.) of a RV
(pages 8-9 of Sec. 1.2).
- Lecture 3 (Mon, Jan 24):
Random variables (cont.):
properties of c.d.f.s,
important discrete RVs (Bernoulli, binomial, Poisson, geometric),
important continuous RVs (uniform, exponential, normal/Gaussian,
standard normal);
conditional c.d.f. of a RV conditioned on an event,
conditional p.m.f./p.d.f. of a RV conditioned on an event;
expectation of a discrete RV expressed as a sum
over the elements of the sample space S
and as a sum over the set S_X of values of X,
example - expectation of an indicator function of an event
(pages 9-16 of Sec. 1.2).
- Lecture 4 (Wed, Jan 26):
Random variables (cont.):
expectation of a function of a RV,
nth moment of a RV, variance of a RV,
moment-generating function (m.g.f.) of a RV,
computing the moments from the m.g.f.,
characteristic function of a RV
(pages 16-20 of Sec. 1.2).
Random vectors:
definition, (joint) c.d.f. of a random vector,
marginal p.m.f.s and p.d.f.s of a random vector,
independence of the components of a random vector;
conditional distribution functions,
conditional expectation, tower rule
E[E[X|Y]]=E[X]
(pages 21-27 of Sec. 1.3).
- Lecture 5 (Fri, Jan 28):
Random processes:
definition of a stochastic process (random process),
classification of random processes:
discrete-time or continuous-time,
discrete-state space or continuous-state space
(pages 47-48 of Sec. 2.1).
Markov chains:
Markov property,
examples: 1-dimensional and 2-dimensional random walks,
Ehrenfests' urn model
(pages 73-75 of Sec. 3.1).
- Lecture 6 (Mon, Jan 31):
Discrete-time Markov chains:
time-homogeneous discrete-time discrete-state space Markov chains,
one-step and n-step transition probabilities,
one-step and n-step transition matrices P and P^(n),
stochastic and doubly-stochastic matrices,
Chapman-Kolmogorov equations,
matrix form of the Chapman-Kolmogorov equations,
an example
(pages 77-80 of Sec. 3.2).
- Lecture 7 (Wed, Feb 2):
Cancelled due to weather.
- Lecture 8 (Fri, Feb 4):
Cancelled due to weather.
- Lecture 9 (Mon, Feb 7):
Discrete-time Markov chains (cont.):
probability ρ_ij(n)
of visiting state j for the first time
in n steps starting from state i,
probability ρ_ii(n)
of first return to state i in n steps,
representation of p_ii(n)
as a sum over k from 1 to n of ρ_ii(k)·p_ii(n-k),
examples of computing ρ_ij(n)
for some simple Markov chains,
initial distribution
a = (a_0, a_1, a_2, ...)
of a Markov chain,
a_i = P(X_0 = i),
distribution
a(n) = (a_0(n), a_1(n), a_2(n), ...)
at time n, formula for the evolution of the probability
distribution: a(n) = a·P^n,
examples: simple 1-dim random walk on Z,
simple 1-dim random walk on Z_+ with reflecting
and absorbing boundary conditions at 0,
a Markov chain coming from sums of i.i.d. random variables
(pages 80-85 of Sec. 3.2).
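A quick numerical illustration of the Chapman-Kolmogorov equations and of the evolution formula a(n) = a·P^n - a minimal sketch in Python with NumPy (the three-state matrix below is made up for the example, not taken from the text):

    import numpy as np

    # A made-up 3-state transition matrix (each row sums to 1).
    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])

    # Chapman-Kolmogorov in matrix form: the n-step transition matrix
    # is the n-th power of the one-step matrix P.
    n = 5
    P_n = np.linalg.matrix_power(P, n)

    # Evolution of the distribution: a(n) = a P^n for a row vector a.
    a = np.array([1.0, 0.0, 0.0])      # start in state 0 with probability 1
    a_n = a @ P_n
    print(P_n)
    print(a_n, a_n.sum())              # a(n) is again a probability vector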
- Lecture 10 (Wed, Feb 9):
Cancelled due to weather.
- Lecture 11 (Fri, Feb 11):
Properties of Markov chains:
accessibility of state j from state i,
i→j,
communicating states i↔j,
properties of the relation ↔
(reflexivity, symmetry, transitivity),
↔ as an equivalence relation,
equivalence classes with respect to ↔,
closed sets of the state space,
irreducible MCs, irreducibility criteria,
examples, recurrent and transient states,
probability f_ij
of eventual visit of j starting from i,
probability f_ii
of eventual return to state i,
expressing f_ij
as a sum of the first-visit probabilities
ρ_ij(n),
Decomposition Theorem
(pages 85-87 of Sec. 3.2).
- Lecture 12 (Mon, Feb 14):
Properties of Markov chains (cont.):
an example of identifying closed irreducible sets of recurrent states
and of sets of transient states, and structure of the stochastic
matrix; a necessary and sufficient criterion of recurrence
of a state in terms of the expected value of the number of returns
to this state, recurrence is a class property
(pages 87-89 of Sec. 3.2).
- Lecture 13 (Wed, Feb 16):
Properties of Markov chains (cont.):
average number μ_i of transitions for first
return to state i,
positive recurrent and null-recurrent states,
criterion for null-recurrence,
type of recurrence is a class property,
recurrent states of a finite MC are positive recurrent
(pages 87-90 of Sec. 3.2).
Periodic and aperiodic states, remarks about periodicity,
examples;
limiting probabilities π_i,
limiting probability distribution
π = (π_0, π_1, π_2, ...),
ergodic states,
Ergodic Theorem (giving conditions for existence and uniqueness
of a limiting probability distribution,
relation between π_i and μ_i,
and an algorithm for computing π)
(pages 87-95 of Sec. 3.2).
- Lecture 14 (Fri, Feb 18):
Properties of Markov chains (cont.):
examples showing the importance of the conditions
in the Ergodic Theorem,
examples of computing π,
proposition giving π for the case of
an irreducible and aperiodic MC over a finite set
with a doubly stochastic transition matrix,
an example of computing π
for a MC with infinite state space
(a random walk with a partially absorbing boundary)
(pages 96-98 of Sec. 3.2).
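A minimal numerical sketch of computing a limiting distribution π (the three-state chain is a made-up example): solve πP = π together with the normalization Σ_i π_i = 1, and compare with the rows of P^n for large n, which the Ergodic Theorem says converge to π:

    import numpy as np

    # A made-up irreducible, aperiodic 3-state chain.
    P = np.array([[0.5, 0.3, 0.2],
                  [0.1, 0.6, 0.3],
                  [0.2, 0.2, 0.6]])

    # Solve pi (P - I) = 0 together with sum(pi) = 1 (a consistent
    # overdetermined linear system, solved here by least squares).
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)

    # Ergodic Theorem: every row of P^n approaches pi for large n.
    print(np.linalg.matrix_power(P, 100))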
- Lecture 15 (Mon, Feb 21):
Absorption problems:
definition of probabilities
r_i^(n)(C)
and r_i(C)
of absorption by the recurrent class C
after exactly n steps and eventual absorption
if starting at state i;
theorem giving r_i(C)
in terms of the transition probabilities (p_ij),
example - the gambler's ruin problem;
martingales (pages 100-104 of Sec. 3.2).
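For the gambler's ruin chain the absorption (ruin) probabilities satisfy a linear system, which is easy to solve numerically; a small illustrative sketch (the values of N and p below are my own choices):

    import numpy as np

    # Gambler's ruin on {0, 1, ..., N}: win one unit with probability p,
    # lose one with probability q = 1 - p; 0 and N are absorbing.
    # r_i = probability of eventual ruin (absorption at 0) starting from i
    # satisfies r_i = p*r_{i+1} + q*r_{i-1}, with r_0 = 1 and r_N = 0.
    N, p = 10, 0.45
    q = 1.0 - p
    A = np.zeros((N + 1, N + 1))
    b = np.zeros(N + 1)
    A[0, 0] = 1.0; b[0] = 1.0          # boundary condition r_0 = 1
    A[N, N] = 1.0; b[N] = 0.0          # boundary condition r_N = 0
    for i in range(1, N):
        A[i, i - 1], A[i, i], A[i, i + 1] = -q, 1.0, -p
    r = np.linalg.solve(A, b)
    print(r)    # for p = q = 1/2 this would give r_i = 1 - i/N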
- Lecture 16 (Wed, Feb 23):
Continuous-time discrete-state space MCs:
definition, transition functions
p_ij(t),
stationary (time-homogeneous) MCs,
irreducibility, analogue of the condition of being
a stochastic matrix for p_ij(t)
(pages 121-122 of Sec. 3.3).
- Lecture 17 (Fri, Feb 25):
Continuous-time discrete-state space MCs (cont.):
embedded chain, Chapman-Kolmogorov equations,
evolution of the occupation probabilities
p_j(t) = P(X_t = j)
expressed in terms of the initial occupation probabilities
p_i(0)
and the transition probabilities
p_ij(t),
memorylessness, memorylessness properties of the exponential random
variables;
definition of a Poisson process (counting process)
(pages 109, 110, and 123 of Sec. 3.3, pages 231-232 of Sec. 5.1).
- Lecture 18 (Mon, Feb 28):
Poisson process:
derivation of the distribution of N(t)
for a Poisson process N
by induction and by the method of generating functions.
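A simulation sketch of the Poisson process built from i.i.d. Exp(λ) interarrival times (all numerical values are my own choices); the sample mean and variance of N(t) should both be close to λt, consistent with N(t) being Poisson(λt):

    import numpy as np

    rng = np.random.default_rng(0)
    lam, t, runs = 2.0, 5.0, 20000
    counts = np.empty(runs)
    for k in range(runs):
        s, n = 0.0, 0
        while True:
            s += rng.exponential(1.0 / lam)   # next interarrival time
            if s > t:
                break
            n += 1
        counts[k] = n
    print(counts.mean(), counts.var())        # both close to lam*t = 10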
- Lecture 19 (Wed, Mar 2):
Poisson process (cont.):
properties of the Γ(α,λ) distribution:
moment-generating function, proof that the sum
of j independent Exp(λ) RVs
is a Γ(j,λ) RV,
c.d.f. of a Γ(j,λ) RV
(pages 115-119 of Sec. 3.3, pages 237-238 of Sec. 5.1).
- Lecture 20 (Fri, Mar 4):
Continuous-time Markov chains (cont.):
instantaneous transition rates (infinitesimal parameters)
ν_ij, generating matrix (generator)
G, properties of G (the sum of the elements
ν_ij in each row of G is zero),
stochastic semigroup P_t
(pages 124-126 of Sec. 3.3).
- Lecture 21 (Mon, Mar 7):
Continuous-time Markov chains (cont.):
standard semigroups, Kolmogorov backward and forward equations,
boundary conditions for the Kolmogorov equations,
exponential of a square matrix (possibly infinite),
solution of an initial-value problem for a linear first-order
system of ordinary differential equations with constant coefficients,
the stochastic semigroup in terms of its generator:
P_t = exp(tG),
detailed solution for a continuous-time, two-state MC
(pages 126-131 of Sec. 3.3).
- Lecture 22 (Wed, Mar 9):
Continuous-time Markov chains (cont.):
Laplace transform, linearity of the Laplace transform,
Laplace transform of a derivative of a function,
solving the problem of a two-state continuous-time Markov process
by using the Laplace transform method (a sketch);
stationary distribution π
of a stochastic semigroup P_t,
recurrent and transient states,
positive recurrent and null recurrent states
of a continuous-time Markov chain,
irreducible Markov chains,
Ergodic Theorem for continuous-time Markov processes, remarks,
finding stationary distributions from the generator:
πG = 0
(pages 138-140 of Sec. 3.3).
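A small numerical sketch tying the last few lectures together (the two-state generator is a made-up example, and SciPy's matrix exponential stands in for the series exp(tG)): compute P_t = exp(tG) and check that the distribution π with πG = 0 is the long-time limit of the rows of P_t:

    import numpy as np
    from scipy.linalg import expm

    # Two-state chain: rate a from state 0 to 1, rate b from 1 to 0;
    # the rows of the generator G sum to zero.
    a, b = 1.5, 0.7
    G = np.array([[-a,  a],
                  [ b, -b]])

    t = 2.0
    P_t = expm(t * G)                  # stochastic semigroup P_t = exp(tG)
    print(P_t, P_t.sum(axis=1))        # each row of P_t sums to 1

    pi = np.array([b, a]) / (a + b)    # solves pi G = 0 with sum(pi) = 1
    print(pi @ G)                      # numerically zero
    print(expm(100.0 * G))             # rows converge to pi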
- Lecture 23 (Fri, Mar 11):
Continuous-time Markov chains (cont.):
balance equations; birth processes,
computing the expectation of the time T_n
for a birth process starting at X_0 = 1 to reach
X_t = n for the first time;
derivation of the infinitesimal-time evolution probabilities
and the Kolmogorov differential equations of a birth process,
solving the differential equations by using a generating function.
- Lecture 24 (Mon, Mar 21):
Continuous-time Markov chains (cont.):
computing the average
M(t)=E[X(t)|X(0)=i]
of a birth process by deriving and solving a differential
equation for M(t);
a birth-death-immigration-disaster process,
detailed derivation of the short-time transition probabilities
of a death-immigration process.
- Lecture 25 (Wed, Mar 23):
Compound Poisson processes:
definition of a compound Poisson process,
mean, variance, and moment generating function
of a compound Poisson process;
proof that the sum of two independent compound processes
coming from Poisson processes with rates λ_1
and λ_2 is a compound process
coming from a Poisson process with rate
λ_1 + λ_2
(pages 254-258 of Sec. 5.3).
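An illustrative Monte Carlo check of the compound Poisson moment formulas E[X_t] = λt·E[Y] and var(X_t) = λt·E[Y²] (the Exp(1) jump distribution and all parameter values are my own choices):

    import numpy as np

    # Compound Poisson process X_t = Y_1 + ... + Y_{N_t}, where N is a
    # rate-lam Poisson process and the Y's are i.i.d. Exp(1) jumps.
    rng = np.random.default_rng(1)
    lam, t, runs = 3.0, 2.0, 50000
    N = rng.poisson(lam * t, size=runs)   # number of jumps by time t
    X = np.array([rng.exponential(1.0, n).sum() for n in N])
    print(X.mean(), lam * t * 1.0)        # E[Y] = 1 for Exp(1)
    print(X.var(), lam * t * 2.0)         # E[Y^2] = 2 for Exp(1)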
- Lecture 26 (Fri, Mar 25):
Nonhomogeneous Poisson processes:
definition, intensity function λ(t),
mean-value function m(t)
(such that m(0)=0,
m'(t)=λ(t)),
proof that the distribution of the increment
N_{s+t} - N_s
is Poisson with parameter
m(s+t) - m(s),
"homogenizing" a nonhomogeneous Poisson process
(pages 250, 251, 253, 254 of Sec. 5.2).
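A sketch of simulating a nonhomogeneous Poisson process by thinning a homogeneous one of rate λ_max ≥ λ(t) (thinning is a standard simulation device, not necessarily the construction used in the text; the intensity below is invented for the example):

    import numpy as np

    rng = np.random.default_rng(2)
    T = 10.0
    def lam(t):                      # intensity function, bounded by 2
        return 1.0 + np.sin(t) ** 2
    lam_max = 2.0

    events, s = [], 0.0
    while True:
        s += rng.exponential(1.0 / lam_max)    # candidate event, rate lam_max
        if s > T:
            break
        if rng.uniform() < lam(s) / lam_max:   # keep with prob lam(s)/lam_max
            events.append(s)
    # E[N_T] = m(T), the integral of lam over [0, T].
    print(len(events))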
- Lecture 27 (Mon, Mar 28):
Doubly stochastic Poisson processes:
conditional (or "mixed") Poisson process,
proof that the conditional Poisson process
has stationary, but not independent increments,
best estimator of the rate of a Poisson process;
doubly stochastic Poisson process ("Cox process")
- only mentioned
(pages 258, 259, 262 of Sec. 5.4).
Renewal processes:
definition of a renewal process,
modified ("delayed") renewal process;
relations between the process N_t,
the times of the events T_n,
and the interevent times τ_n;
expression for the p.m.f. of N_t
in terms of the c.d.f. of T_n
(Proposition 5.6.1)
(pages 267-269 of Sec. 5.6).
- Lecture 28 (Wed, Mar 30):
Renewal processes (cont.):
"honesty" of a renewal process,
renewal function
mN(t);
definition of the Riemann-Stieltjes integral,
particular cases, applications
to computing expected values of discrete
and continuous random variables;
expected value of an N-valued random variable X
as a sum (over n from 1 to infinity)
of probabilites of X to be greater or equal to n,
expected value of a non-negative continuous random variable
X as an integral of
[1-FX(x)],
geometric meaning;
expressing mN(t)
in as a sum of the c.d.f.s of all the Tn's
(pages 267-271 of Sec. 5.6).
- Lecture 29 (Fri, Apr 1):
Renewal processes (cont.):
derivation of an integral equation for the renewal function
m_N(t) = E[N_t],
solving renewal-type equations by using the Laplace transform,
recursive formula for the c.d.f. of the event times
T_n expressed
through Riemann-Stieltjes integrals
(pages 271, 273-276 of Sec. 5.6).
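A Monte Carlo sketch that estimates the renewal function m_N(t) = E[N_t] directly from its definition (the Gamma(2,1) interevent distribution is an arbitrary choice made for the example):

    import numpy as np

    rng = np.random.default_rng(3)
    t, runs = 10.0, 20000
    counts = np.empty(runs)
    for k in range(runs):
        s, n = 0.0, 0
        while True:
            s += rng.gamma(2.0, 1.0)   # interevent time, mean E[tau] = 2
            if s > t:
                break
            n += 1
        counts[k] = n
    # For large t the renewal function grows roughly like t/E[tau].
    print(counts.mean(), t / 2.0)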
- Lecture 30 (Mon, Apr 4):
Renewal processes (cont.):
another derivation of the formula for the renewal function
m_N(t)
by performing a Laplace transformation on the formula
representing m_N(t)
as a sum of the c.d.f.s of all the T_n's,
relation between the Laplace transform of the p.d.f. of a random
variable and the moment generating function of the random variable;
an example of an application of renewal-type problems
- crossing a street.
Queues:
set-up of the problem, examples of queues
(queues with baulking, multiple servers, airline check-in, FIFO, LIFO,
group service, "student discipline", "continental queueing").
- Lecture 31 (Wed, Apr 6):
Queues (cont.):
A/B/k/s/... classification of the queues,
where A and B are deterministic (D),
or have Markovian
(M - with exponentially distributed interarrival/service times),
Erlang (or Gamma, Γ) or general (G) distributions;
traffic intensity, stability of a queue;
M(λ)/M(μ)/1 queue - see problem in
handout;
M(λ)/G/1 queue - construction of a discrete-time Markov
chain embedded in the queueing process
and derivation of the transition probability matrix of this Markov
chain.
Please read the
handout on sigma-fields.
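To go with the M(λ)/M(μ)/1 discussion, here is a simulation sketch of the waiting times in that queue via the Lindley recursion W_{n+1} = max(0, W_n + S_n - A_{n+1}) (a standard device, not necessarily the method of the handout; parameter values are my own):

    import numpy as np

    # Interarrival times A ~ Exp(lam), service times S ~ Exp(mu);
    # the queue is stable when the traffic intensity lam/mu < 1.
    rng = np.random.default_rng(4)
    lam, mu, n = 0.8, 1.0, 100000
    A = rng.exponential(1.0 / lam, n)
    S = rng.exponential(1.0 / mu, n)
    W = np.empty(n)
    W[0] = 0.0
    for i in range(n - 1):
        W[i + 1] = max(0.0, W[i] + S[i] - A[i + 1])
    # Long-run mean waiting time; for M/M/1 it equals lam/(mu*(mu - lam)).
    print(W[n // 2:].mean(), lam / (mu * (mu - lam)))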
- Lecture 32 (Fri, Apr 8):
General properties of stochastic processes:
cumulative distribution function
F(x_1,...,x_k; t_1,...,t_k),
probability mass function
p(x_1,...,x_k; t_1,...,t_k),
and probability density function
f(x_1,...,x_k; t_1,...,t_k)
of order k of a stochastic process
X = {X_t : t ∈ [0,∞)};
mean
m_X(t) = E[X_t],
autocorrelation function
R_X(t_1,t_2) = E[X_{t_1} X_{t_2}],
autocovariance function
C_X(t_1,t_2) = R_X(t_1,t_2) - m_X(t_1)·m_X(t_2),
variance
var(X_t) = C_X(t,t),
and autocorrelation coefficient
ρ_X(t_1,t_2)
of a stochastic process;
processes with independent increments,
processes with stationary increments,
strict-sense stationary (SSS, strongly stationary) processes,
wide-sense stationary (WSS, weakly stationary) processes,
an example of a WSS stochastic process that is not SSS
(Sec. 2.1, pages 52-53 of Sec. 2.2).
- Lecture 33 (Mon, Apr 11):
General properties of stochastic processes (cont.):
average power E[X_t²] of a stochastic process,
E[X_t²] of a WSS stochastic process
does not depend on t;
spectral density S_X(ω)
of a WSS process, properties of S_X(ω)
(pages 53-54 of Sec. 2.2).
Gaussian and Markov processes:
multinormal distribution of a random vector
X = (X_1,...,X_n) ∼ N(m, K),
vector of the means m, covariance matrix
K = (cov(X_i, X_j));
characteristic function φ_X(ω) = E[exp(iωX)]
of a random variable X,
(joint) characteristic function
φ_X(ω) = E[exp(i ω·X)]
of a multinormal random vector X
(Proposition 2.4.1);
if two components of
X = (X_1,...,X_n) ∼ N(m, K)
are uncorrelated, then they are independent;
Gaussian process {X_t}
- a continuous-time stochastic process with
(X_{t_1},...,X_{t_n})
being multinormal for any n and times
t_1,...,t_n;
if {X_t} is a Gaussian process
such that its mean m_X(t)
does not depend on t
and its autocovariance function
C_X(t_1,t_2)
depends only on t_2 - t_1,
then the process is SSS (Proposition 2.4.2);
definition of a Markov (or Markovian) process, examples
(random walk, Poisson process)
(pages 58-61 of Sec. 2.4).
- Lecture 34 (Wed, Apr 13):
Gaussian and Markov processes (cont.):
(first-order) density function
f(x;t),
conditional transition density function
p(x, x_0; t, t_0) = f_{X_t|X_{t_0}}(x|x_0);
integrals of f(x;t)
and
p(x, x_0; t, t_0)
over x are equal to 1;
expressing f(x;t)
as an integral of
f(x_0; t_0)·p(x, x_0; t, t_0)
over x_0;
more on the meaning of the p.d.f. of a continuous RV:
P(X ∈ (x, x+Δx]) ≈ f_X(x)·Δx,
generalization for jointly continuous random vectors:
P(X ∈ A) ≈ f_X(x)·vol(A),
where A is a small domain in R^k
containing x;
application to kth-order p.d.f.'s of a random process:
P(X_{t_1} ∈ (x_1, x_1+Δx_1], ..., X_{t_k} ∈ (x_k, x_k+Δx_k])
≈ f_{(X_{t_1},...,X_{t_k})}(x_1,...,x_k)·Δx_1···Δx_k;
Chapman-Kolmogorov equations for the
conditional transition density function
p(x, x_0; t, t_0) = f_{X_t|X_{t_0}}(x|x_0);
Dirac δ-function;
time-homogeneous processes
(pages 62-64 of Sec. 2.5).
The Wiener process:
definition of a Wiener process (Brownian motion)
W_t ∼ N(0, σ²t)
and a standard Wiener process
B_t ∼ N(0, t);
p.d.f. of order k of a Wiener process;
autocorrelation function
R_B(t,s) = E[B_t B_s] = min(t,s)
of a Wiener process
(pages 175, 177, 178 of Sec. 4.1).
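A simulation sketch of a standard Wiener process from independent Gaussian increments, with a Monte Carlo check of the autocorrelation R_B(t,s) = min(t,s) at one pair of times (all numerical choices are mine):

    import numpy as np

    rng = np.random.default_rng(5)
    paths, steps = 10000, 500
    dt = 1.0 / steps
    dB = rng.normal(0.0, np.sqrt(dt), size=(paths, steps))
    B = np.cumsum(dB, axis=1)            # B at times dt, 2*dt, ..., 1
    i, j = 149, 349                      # times t = 0.3 and s = 0.7
    print((B[:, i] * B[:, j]).mean())    # close to min(0.3, 0.7) = 0.3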
- Lecture 35 (Fri, Apr 15):
A digression on generalized functions (distributions):
test functions (infinitely smooth compactly supported functions),
Dirac δ-function δa,
generalized derivatives,
derivatives of δa;
example: generalized derivative
of the Heaviside (unit step) function:
Ha'=δa,
representing δ(x) as a limit of
(1/ε)χ[0,ε]
as ε→0+,
representing δa as a limit of
(2πε)-1/2exp{-x2/(2ε)}
as ε→0+.
Gaussian and Markov processes (remarks):
derivation of the Chapman-Kolmogorov equations,
consistency conditions among the conditional density functions.
The Wiener process (cont.):
characteristic functions,
characteristic function of W_t,
the Wiener process as a limit of simple random walk;
historical remarks (Robert Brown, Albert Einstein,
Marian Smoluchowski, Norbert Wiener, Andrey Kolmogorov).
- Lecture 36 (Mon, Apr 18):
The Wiener process (cont.):
m.g.f. and moments of B_t:
E[B_t^(2n+1)] = 0,
E[B_t²] = t,
E[B_t⁴] = 3t²,
E[B_t⁶] = 15t³;
short-time behavior:
E[ΔB_t] = 0,
E[(ΔB_t)²] = Δt;
nondifferentiability of B_t:
E[ΔB_t/Δt] = 0,
E[(ΔB_t/Δt)²] = 1/Δt → ∞
as Δt→0;
(Gaussian) white noise
ξ_t = dB_t/dt,
making sense of
ξ_t = dB_t/dt
by treating it as a functional (on test functions φ)
taking value ξ(φ) in the space of random variables,
moments of ξ(φ).
- Lecture 37 (Wed, Apr 20):
The Wiener process (cont.):
more on the meaning of ξ(φ) as
a measurement "smeared by φ",
writing the facts about moments of ξ(φ)
as consequences of E[ξ_t] = 0
and
E[ξ_t ξ_s] = δ(t-s).
Stochastic differential equations (SDEs) and Ito integrals:
discussion of the concept of a stochastic differential equation
and the meaning of its solution,
Fokker-Planck equation for the conditional transition
density function
p(x, x_0; t, t_0),
an example (the Fokker-Planck equation
and its solution for the Wiener process
B_t),
discretizing the SDE,
Ito's way of defining the integral
as a limit of left Riemann sums,
main reason for using left Riemann sums - independence
of B_t and the increment
ΔB_t := B_{t+Δt} - B_t
for any positive Δt
(due to the independence of the increments of the Wiener process).
- Lecture 38 (Fri, Apr 22):
SDEs and Ito integrals (cont.):
mean-square (L2-) convergence of functions,
examples, more on the meaning of the definition of Ito's integral.
- Lecture 39 (Mon, Apr 25):
SDEs and Ito integrals (cont.):
derivation of the formula
∫_0^t B_s dB_s = (B_t² - t)/2.
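A numerical check of this formula (an illustrative sketch, not from the text): approximate the Ito integral by left Riemann sums Σ_k B_{t_k}(B_{t_{k+1}} - B_{t_k}) along a simulated path of B and compare with (B_T² - T)/2:

    import numpy as np

    rng = np.random.default_rng(6)
    T, steps = 1.0, 100000
    dt = T / steps
    dB = rng.normal(0.0, np.sqrt(dt), steps)
    B = np.concatenate([[0.0], np.cumsum(dB)])
    ito_sum = np.sum(B[:-1] * dB)           # left endpoints, as in Ito's definition
    print(ito_sum, (B[-1] ** 2 - T) / 2)    # agreement up to O(sqrt(dt))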
- Lecture 40 (Wed, Apr 27):
Ito formula:
more on the meaning of stochastic integrals,
sketch of the derivation of the Ito formula,
examples.
- Lecture 41 (Fri, Apr 29):
Conditional expectation and martingales:
probability spaces (Ω, F, P),
random variables,
σ-algebra σ(X) generated by a random variable X,
σ-algebra σ(F_1,...,F_n) generated
by a collection of σ-algebras,
σ-algebra
σ(X_1,...,X_n)
generated by a family of random variables,
distribution (c.d.f.)
F_X(x) = P(X ≤ x)
of a random variable,
integrable (L1-) random variables,
expectation E[X] of a random variable X,
conditional expectation E[X|A]
of a random variable X conditioned on an event A,
conditional expectation E[X|F]
of a random variable X conditioned on a
σ-algebra F,
conditional expectation E[X|Y]
of a random variable X conditioned on another
random variable Y.
- Lecture 42 (Mon, May 2):
Conditional expectation and martingales (cont.):
filtration
F_1, F_2, ... of σ-algebras,
discussion of the meaning of
F_n
in the context of "coin tossing"
(F_n
is our knowledge at time n),
filtration
F_n = σ(Y_1,...,Y_n)
of σ-algebras generated by a sequence
Y_1, Y_2, ...
of random variables ("coin tosses"),
a sequence {X_n} of random variables
adapted to a filtration
{F_n}
of σ-algebras,
martingale with respect to a filtration of σ-algebras,
example - if X_n is a
symmetric one-dimensional random walk,
then X_n and
X_n² - n are martingales;
filtration {F_t}
of σ-algebras and martingales X_t
in the case of continuous time,
example - exponential martingale
exp(αB_t - α²t/2),
obtaining a family of polynomial martingales from the Taylor
expansion of the exponential martingale:
exp(αB_t - α²t/2) = 1 + B_t·α + (1/2)(B_t² - t)·α²
+ (1/6)(B_t³ - 3tB_t)·α³ + (1/24)(B_t⁴ - 6tB_t² + 3t²)·α⁴
+ (1/120)(B_t⁵ - 10tB_t³ + 15t²B_t)·α⁵ + ...
SDEs and Ito integrals (cont.):
Ito isometry.
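A Monte Carlo sketch of the Ito isometry E[(∫_0^T f dB)²] = ∫_0^T f(t)² dt for a deterministic integrand (the choice f(t) = t is my own example); the stochastic integral is approximated by left Riemann sums:

    import numpy as np

    rng = np.random.default_rng(7)
    T, steps, paths = 1.0, 1000, 100000
    dt = T / steps
    I = np.zeros(paths)
    for k in range(steps):
        I += (k * dt) * rng.normal(0.0, np.sqrt(dt), paths)  # f(t_k) dB_k
    print(I.mean())              # Ito integrals have mean zero
    print(I.var(), T ** 3 / 3)   # isometry: integral of t^2 over [0,1] is 1/3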
- Lecture 43 (Wed, May 4):
SDEs and Ito integrals (cont.):
Ito integrals are martingales.
Ornstein-Uhlenbeck process:
deterministic motion of a body in a viscous fluid
(the resistance force is proportional to the speed
and has a direction opposite to the velocity);
random fluctuations in the rate of change of the velocity
(due to molecular collisions),
Ornstein-Uhlenbeck process, Langevin equation,
derivation of the explicit form of X_t,
mean and variance of X_t,
behavior of the variance in the short-time and long-time limits
and in the limit of vanishing random force.
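A minimal Euler-Maruyama sketch for the Langevin equation dX_t = -β X_t dt + σ dB_t (the parameter names β and σ are mine), comparing the simulated mean and variance of X_t with the exact Ornstein-Uhlenbeck values:

    import numpy as np

    rng = np.random.default_rng(8)
    beta, sigma, x0 = 1.0, 0.5, 2.0
    T, steps, paths = 3.0, 3000, 20000
    dt = T / steps
    X = np.full(paths, x0)
    for _ in range(steps):
        X += -beta * X * dt + sigma * np.sqrt(dt) * rng.normal(size=paths)
    print(X.mean(), x0 * np.exp(-beta * T))   # exact mean x0*exp(-beta*T)
    print(X.var(), sigma ** 2 * (1 - np.exp(-2 * beta * T)) / (2 * beta))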
- Lecture 44 (Fri, May 6):
Ornstein-Uhlenbeck process (cont.):
Fokker-Planck equation for the conditional transition density
p(x, x_0; t, t_0)
of a stochastic process {X_t},
solution of the Fokker-Planck equation for the Ornstein-Uhlenbeck process,
deriving the equation for the moment generating function
M(θ, t | X_{t_0} = x_0)
of the Ornstein-Uhlenbeck process and solving it,
identifying the type of distribution of X_t
(conditioned on the event
{X_{t_0} = x_0}).
Simple population growth at a noisy rate:
derivation of the differential equation in the deterministic
and the stochastic cases, solving the SDE
dX_t = rX_t dt + αX_t dB_t
by using Ito's formula,
computing the expected value of the solution,
discussion of the behavior of the population
and its average in the weak-noise (α² < 2r)
and in the strong-noise (α² > 2r) cases.
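A Monte Carlo sketch for this SDE based on the explicit solution X_t = X_0·exp((r - α²/2)t + αB_t) obtained from Ito's formula (all numerical values are my own choices); it checks E[X_t] = X_0·e^{rt} and records the weak-noise/strong-noise dichotomy:

    import numpy as np

    rng = np.random.default_rng(9)
    r, alpha, x0, t, runs = 0.05, 0.3, 1.0, 2.0, 10 ** 6
    B_t = rng.normal(0.0, np.sqrt(t), runs)
    X_t = x0 * np.exp((r - alpha ** 2 / 2) * t + alpha * B_t)
    print(X_t.mean(), x0 * np.exp(r * t))   # sample mean vs x0*exp(r*t)
    # Weak noise (alpha^2 < 2r): X_t -> infinity a.s. as t grows;
    # strong noise (alpha^2 > 2r): X_t -> 0 a.s., though E[X_t] still grows.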
Attendance:
You are required to attend class on those days when an
examination is being given;
attendance during other class periods is also
strongly encouraged.
You are fully responsible for the
material covered in each class, whether or not you attend.
Make-ups for missed exams will be given only if
there is a compelling reason for the absence,
which I know about beforehand
and can document independently of your testimony
(for example, via a note or a phone call from a
doctor or a parent).
Grading:
Your grade will be determined by your performance
on the following coursework:
Homework (lowest grade dropped): 50%
Midterm exam: 20%
Final exam: 30%
Homework:
It is absolutely essential
to solve the assigned homework problems!
Homework assignments will be given
regularly throughout the semester
and will be posted on this web-site.
The homework will be due at the start
of class on the due date.
Each homework will consist of several problems,
of which some pseudo-randomly chosen problems will be graded.
Your lowest homework grade will be dropped.
All homework should be written on 8.5"×11" paper
with your name clearly written, and should be stapled.
No late homework will be accepted!
You are encouraged to discuss the homework problems
with other students.
However, you have to write your solutions clearly
and in your own words - this is the only way to
achieve real understanding!
It is advisable that you first write a draft
of the solutions and then copy them neatly.
Please write the problems in the same order
in which they are given in the assignment.
Exams:
There will be one take-home midterm and a comprehensive final.
All tests must be taken at the scheduled times,
except in extraordinary circumstances.
Please do not arrange travel plans that will prevent you
from taking any of the exams at the scheduled time.
Tentative date for the midterm exam: March 11 (Friday).
The final is scheduled for May 9 (Monday), 8:00-10:00 a.m.
Academic calendar for
Spring 2011.
Course schedule for
Spring 2011.
Policy on W/I Grades:
Through February 25 (Friday), you can withdraw
from the course with an automatic "W".
In addition, from February 28 (Monday) to May 6 (Friday),
you may withdraw and receive a "W" or "F"
according to your standing in the class.
Dropping after April 4 (Monday) requires a petition to the Dean.
(Such petitions are not often granted.
Furthermore, even if the petition
is granted, I will give you a grade
of "Withdrawn Failing" if you are
indeed failing at the time of your petition.)
Please check the dates in the Academic Calendar!
The grade of "I" (Incomplete)
is not intended to serve as
a benign substitute for the grade of "F".
I only give the "I" grade
if a student has completed the majority
of the work in the course
(for example everything except the final exam),
the coursework cannot be completed
because of compelling and verifiable problems
beyond the student's control, and the student expresses a clear
intention of making up the missed work as soon as possible.
Academic Misconduct: All cases of suspected academic misconduct will
be referred to the Dean of the College of Arts and Sciences for prosecution
under the University's Academic Misconduct Code. The penalties can be quite
severe. Don't do it!
For details on the University's
policies concerning academic integrity see the
Student's Guide to Academic Integrity
at the
Academic Integrity web-site.
For information on your rights to appeal charges
of academic misconduct consult the
Academic Misconduct Code.
Students are also bound by the provisions of the
OU Student Code.
Students With Disabilities:
The University of Oklahoma is committed to providing reasonable accommodation
for all students with disabilities. Students with disabilities who require
accommodations in this course are requested to speak with the instructor
as early in the semester as possible. Students with disabilities must be
registered with the Office of Disability Services prior to receiving
accommodations in this course. The Office of Disability Services is located
in Goddard Health Center, Suite 166: phone 405-325-3852 or TDD only
405-325-4173.