MATH 5763 - Stochastic Processes, Section 001 - Spring 2008
TR 12:00-1:15 p.m., 120 PHSC
Instructor:
Nikola Petrov, 802 PHSC, (405)325-4316, npetrov AT math.ou.edu
Office Hours:
Mon 2:30-3:30 p.m., Tue 2:45-3:30 p.m., or by appointment.
Prerequisite:
Basic calculus-based probability theory at the level of MATH 4733
(including axioms of probability, random variables, expectation,
probability distributions, independence, conditional probability).
The class will also require knowledge of elementary analysis
(including sequences, series, continuity),
linear algebra (including linear spaces, eigenvalues,
eigenvectors),
and ordinary differential equations (at the level
of MATH 3113 or MATH 3413).
Course description:
The theory of stochastic processes studies systems that evolve
randomly in time;
it can be regarded as the "dynamical" part of probability theory.
It has many important practical applications,
as well as applications in other branches
of mathematics such as partial differential equations.
This course is a graduate-level
introduction to stochastic processes,
and should be of interest to students of mathematics,
statistics, physics, engineering, and economics.
The emphasis will be on the fundamental concepts,
but we will avoid using the theory of Lebesgue measure
and integration in any essential way.
Many examples of stochastic phenomena in applications
and some modeling issues will also be discussed in class
and given as homework problems.
Text:
Mario Lefebvre,
Applied Stochastic Processes,
1st edition, Springer, 2006,
ISBN-10: 0387341714,
ISBN-13: 978-0387341712.
We will also use the book
Hui-Hsiung Kuo,
Introduction to Stochastic Integration,
1st edition, Springer, 2007,
which is freely available online to OU students
through the OU Library.
Homework
(solutions are deposited after the due date in the Chemistry-Mathematics
Library, 207 PHSC):
-
Homework 1, due date Thu, Jan 24.
-
Homework 2, due date Thu, Jan 31.
-
Homework 3, due date Thu, Feb 7.
-
Homework 4, due date Thu, Feb 14.
-
Homework 5, due date Thu, Feb 21.
-
Homework 6, due date Thu, Feb 28.
-
Homework 7, due date Thu, Mar 6.
-
Homework 8, due date Thu, Apr 3.
-
Homework 9, due date Thu, Apr 10.
-
Homework 10, due date Thu, Apr 17.
-
Homework 11, due date Thu, Apr 24.
-
Homework 12, due date Thu, May 1.
Content of the lectures:
-
Lecture 1 (Tue, Jan 15):
Elementary probability:
sample space, events,
σ-field (σ-algebra),
operations with events (complement,
union, intersection), disjoint events,
probability, conditional probability,
basic concepts of counting
(pages 1-5 of Sec. 1.1, handout on
counting).
-
Lecture 2 (Thu, Jan 17):
Elementary probability (cont.):
partitions of the sample space,
law of total probability, Bayes' formula,
independent events
(pages 5-8 of Sec. 1.1).
Random variables:
random variables (RVs),
(cumulative) distribution function (c.d.f.) of a RV,
properties of the c.d.f.'s,
conditional c.d.f.;
discrete RVs, probability mass function (p.m.f.)
of a discrete RV, conditional p.m.f.,
important discrete RVs;
continuous RVs, probability density function (p.d.f.)
of a continuous RV,
important continuous RVs
(pages 8-15 of Sec. 1.2).
-
Lecture 3 (Tue, Jan 22):
Random variables (cont.):
conditional p.m.f. p_X(x|A)
of a discrete RV X given an event A,
conditional c.d.f. F_X(x|A)
and p.d.f. f_X(x|A)
of a continuous RV X given an event A,
expectation of a RV, indicator RV 1_A of an event A,
expectation of the indicator RV of an event:
E[1_A] = P(A),
expectation of a function of a RV: E[g(X)],
n-th moment E[X^n] of X,
variance,
moment-generating function and characteristic function of a RV
(pages 10, 14, 16-20 of Sec. 1.2).
Random vectors:
random vectors, joint c.d.f. of a random vector,
p.m.f. of a discrete random vector,
p.d.f. of a continuous random vector,
marginal p.m.f. (resp. p.d.f.) of components of a random
vector, independence in terms of the p.m.f. (resp. p.d.f.);
conditional p.m.f.'s p_{X|Y}(x_i|y_j)
and conditional p.d.f.'s f_{X|Y}(x|y),
conditional expectation E[X|Y=y],
tower property of conditional expectation
E[X] = E[E[X|Y]],
application: expectation of the sum of a random number
of independent random variables
(pages 21-27 of Sec. 1.3).
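A quick numerical illustration of the last application (a hypothetical
Python/NumPy sketch, not part of the course materials): if N is
Poisson(λ) and independent of the i.i.d. summands X_i with mean μ, then
conditioning on N gives E[X_1 + ... + X_N] = E[N] E[X_1] = λμ.

    # Monte Carlo check of E[S] = E[N]*E[X] for a random sum S = X_1+...+X_N
    # (illustrative sketch; the parameter choices are arbitrary)
    import numpy as np

    rng = np.random.default_rng(0)
    lam, mu, n_trials = 3.0, 2.0, 100_000
    N = rng.poisson(lam, size=n_trials)              # random number of summands
    S = np.array([rng.exponential(mu, size=n).sum() for n in N])
    print(S.mean(), lam * mu)                        # both approximately 6.0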
-
Lecture 4 (Thu, Jan 24):
Random processes:
definition of a stochastic process (random process),
classification of random processes:
discrete-time or continuous-time,
discrete-state space or continuous-state space
(pages 47-48 of Sec. 2.1).
Markov chains:
Markov property,
examples: 1-dimensional and 2-dimensional random walks,
Ehrenfests' urn model;
time-homogeneous discrete-time discrete-state space
Markov chains,
one-step and n-step transition probabilities,
one-step and n-step transition matrices,
stochastic and doubly-stochastic matrices,
Chapman-Kolmogorov equations
(pages 77-80 of Sec. 3.2).
-
Lecture 5 (Tue, Jan 29):
Markov chains (cont.):
matrix form of the Chapman-Kolmogorov equations, examples,
probability ρ_ij(n)
of visiting state j for the first time
in n steps starting from state i,
probability ρ_ii(n)
of first return to state i in n steps,
representation of p_ij(n)
as a sum over k from 1 to n of
ρ_ij(k) p_jj(n-k),
examples of computing ρ_ij(n)
for some simple Markov chains,
initial distribution
a = (a_0, a_1, a_2, ...)
of a Markov chain,
a_i = P(X_0 = i),
distribution
a^(n) = (a_0^(n), a_1^(n), a_2^(n), ...)
at time n, formula for the evolution of the probability
distribution:
a^(n) = a P^n,
examples: simple 1-dim random walk on Z,
simple 1-dim random walk on Z_+ with reflecting
or absorbing boundary condition at 0
(pages 80-85 of Sec. 3.2).
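The evolution formula a^(n) = a P^n is easy to experiment with; here is
a minimal Python/NumPy sketch (the 3-state chain below is invented for
the illustration, not taken from the text).

    # evolve the distribution a^(n) = a P^n of a small Markov chain
    import numpy as np

    P = np.array([[0.50, 0.50, 0.00],
                  [0.25, 0.50, 0.25],
                  [0.00, 0.50, 0.50]])   # one-step transition matrix (rows sum to 1)
    a = np.array([1.0, 0.0, 0.0])        # initial distribution: start in state 0
    for n in range(6):
        print(n, a)
        a = a @ P                        # a^(n+1) = a^(n) P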
-
Lecture 6 (Thu, Jan 31):
Properties of Markov chains:
accessibility of state j from state i,
i→j,
communicating states i↔j,
properties of the relation ↔
(reflexivity, symmetry, transitivity),
↔ as an equivalence relation,
equivalence classes with respect to ↔,
closed sets of the state space,
irreducible MCs, irreducibility criteria,
examples, recurrent and transient states,
probability f_ij
of eventual visit of j starting from i,
probability f_ii
of eventual return to state i,
expressing f_ij
as a sum of the first-visit probabilities
ρ_ij(n),
Decomposition Theorem, example
(pages 85-87 of Sec. 3.2).
-
Lecture 7 (Tue, Feb 5):
Properties of Markov chains (cont.):
criterion for recurrence/transience of a state in terms of expected
numbers of returns to the state,
criterion for recurrence/transience of a state in terms of
the sum (over all n in N) of the return
probabilities to the state at time n,
recurrence is a class property,
average number μ_i of transitions until the first
return to state i,
positive recurrent and null-recurrent states,
criterion for null-recurrence,
type of recurrence is a class property,
recurrent states of a finite MC are positive recurrent
(pages 87-90 of Sec. 3.2).
-
Lecture 8 (Thu, Feb 7):
Properties of Markov chains (cont.):
periodic and aperiodic states, remarks about periodicity,
examples;
limiting probabilities π_i,
limiting probability distribution
π = (π_0, π_1, π_2, ...),
ergodic states,
Ergodic Theorem (giving conditions for existence and uniqueness
of a limiting probability distribution,
relation between π_i and μ_i,
and an algorithm for computing π),
examples showing the importance of the conditions
in the Ergodic Theorem
(pages 91-97 of Sec. 3.2).
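For an ergodic chain the limiting distribution can also be computed
directly by solving π P = π together with the normalization Σ_i π_i = 1;
a minimal Python/NumPy sketch (same invented 3-state chain as above)
follows.

    # stationary distribution: solve pi P = pi, sum(pi) = 1
    import numpy as np

    P = np.array([[0.50, 0.50, 0.00],
                  [0.25, 0.50, 0.25],
                  [0.00, 0.50, 0.50]])
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])   # (P^T - I) pi = 0 plus normalization
    b = np.append(np.zeros(n), 1.0)
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)                                      # (0.25, 0.5, 0.25) for this chain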
-
Lecture 9 (Tue, Feb 12):
Properties of Markov chains (cont.):
example: simple random walk with partially absorbing
boundary, solving difference equations
(pages 98-100 of Sec. 3.2).
-
Lecture 10 (Thu, Feb 14):
Properties of Markov chains (cont.):
absorption problems, definition of the probability
r_i(C)
of eventually entering a closed and irreducible set
C of recurrent states starting from
a transient state i,
main theorem about these probabilities,
example: computing the probabilities
of ruin r_i({0})
and winning r_i({k})
in the gambler's ruin problem
(with the goal set to k), definition of a martingale
(pages 100-104 of Sec. 3.2).
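The ruin/winning probabilities can be checked numerically by solving
the linear system coming from first-step analysis; a hypothetical
Python/NumPy sketch (p = 0.4 and goal k = 10 are arbitrary choices)
follows, compared against the familiar closed form for p ≠ 1/2.

    # gambler's ruin: h_i = P(reach k before 0 | start at i) solves
    # h_i = p*h_{i+1} + q*h_{i-1}, with h_0 = 0 and h_k = 1
    import numpy as np

    p, k = 0.4, 10
    q = 1.0 - p
    A = np.zeros((k + 1, k + 1))
    b = np.zeros(k + 1)
    A[0, 0] = 1.0                    # boundary condition h_0 = 0
    A[k, k] = 1.0; b[k] = 1.0        # boundary condition h_k = 1
    for i in range(1, k):
        A[i, i - 1], A[i, i], A[i, i + 1] = q, -1.0, p
    h = np.linalg.solve(A, b)
    i = np.arange(k + 1)
    closed = ((q / p)**i - 1.0) / ((q / p)**k - 1.0)
    print(np.max(np.abs(h - closed)))   # essentially zero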
-
Lecture 11 (Tue, Feb 19):
Continuous-time (discrete-state space) Markov chains:
definition of a continuous-time (discrete-state space)
stochastic process, approximating such a process
by a discrete-time (discrete-state space) Markov chain,
proof that the inter-event times are exponential random
variables, definition of a continuous-time Markov process,
time-homogeneous Markov process, transition probabilities,
(ir)reducibility of a continuous-time Markov process,
embedded discrete-time Markov chain in a continuous-time
Markov process, Chapman-Kolmogorov equations,
definition of a Poisson process (counting process)
(pages 121-123 of Sec. 3.3, pages 231-232 of Sec. 5.1).
-
Lecture 12 (Thu, Feb 21):
Poisson process:
derivation of the distribution of N(t)
for a Poisson process N
by induction and by the method of generating functions,
definition of the Poisson process as the process
with unit jumps up at gamma-distributed times
(or, equivalently, with exponentially distributed
interevent time intervals)
(pages 123, 133 of Sec. 3.3;
pages 233, 234, 237-238 of Sec. 5.1).
Please read pages 109-111, 115-116,
and Proposition 3.3.6 on page 119.
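The equivalence above suggests a simulation: generate i.i.d. Exp(λ)
interevent times and count the events in [0, t]; the counts should have
mean and variance λt. A hypothetical Python/NumPy sketch (parameters
arbitrary):

    # simulate a Poisson process via exponential interevent times
    import numpy as np

    rng = np.random.default_rng(1)
    lam, t, n_paths = 2.0, 5.0, 50_000
    counts = np.empty(n_paths, dtype=int)
    for j in range(n_paths):
        s, n = 0.0, 0
        while True:
            s += rng.exponential(1.0 / lam)   # next Exp(lam) interevent time
            if s > t:
                break
            n += 1
        counts[j] = n
    print(counts.mean(), counts.var(), lam * t)   # all approximately 10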
-
Lecture 13 (Tue, Feb 26):
Poisson process (cont.):
properties of the Γ(α,λ) distribution:
moment-generating function, proof that the sum
of j independent Exp(λ) RVs
is a Γ(j,λ) RV,
c.d.f. of a Γ(j,λ) RV;
derivation of the distribution of a Poisson process
N_t from the c.d.f.
of the Γ(j,λ)-distributed
times of the events
(pages 115-119 of Sec. 3.3, 237-239 of Sec. 5.1).
Continuous-time Markov chains (cont.):
instantaneous transition rates (infinitesimal parameters)
ν_ij, generating matrix (generator)
G
(pages 124-126 of Sec. 3.3).
-
Lecture 14 (Thu, Feb 28):
Continuous-time Markov chains (cont.):
Kolmogorov backward and forward equations,
initial-value problem for the transition probability
matrix
P_t = (p_ij(t)),
function of a matrix, exponential of a matrix, solution
P_t = exp(tG)
of the Kolmogorov equations with initial condition
P_0 = I;
example: a two-state continuous-time Markov process,
solving the problem by exponentiating the generator;
Laplace transform, linearity of the Laplace transform,
Laplace transform of a derivative
(pages 126-131 of Sec. 3.3).
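The formula P_t = exp(tG) can be tested numerically; a hypothetical
Python sketch for the two-state process (assuming NumPy and SciPy are
available; the rates a, b and the time t are arbitrary), compared with
the closed-form entry p_00(t) = b/(a+b) + a/(a+b) e^{-(a+b)t}:

    # matrix exponential solution of the Kolmogorov equations
    import numpy as np
    from scipy.linalg import expm

    a, b, t = 1.0, 2.0, 0.7
    G = np.array([[-a, a],
                  [b, -b]])          # generator: rows sum to 0
    Pt = expm(t * G)                 # P_t = exp(tG)
    print(Pt[0, 0], b/(a+b) + a/(a+b)*np.exp(-(a+b)*t))   # should agree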
-
Lecture 15 (Tue, Mar 4):
Continuous-time Markov chains (cont.):
solving the problem of a two-state continuous-time Markov process
by using the Laplace transform method;
stochastic semigroup P_t,
stationary distribution π
of a stochastic semigroup P_t,
recurrent and transient states,
positive recurrent and null recurrent states
of a continuous-time Markov chain,
irreducible Markov chains,
Ergodic Theorem for continuous-time Markov processes,
remarks
(pages 138-142 of Sec. 3.3).
-
Lecture 16 (Thu, Mar 6):
Continuous-time Markov chains (cont.):
finding stationary distributions from the generator:
πG = 0;
the sum of the elements in each row of G is zero;
meaning and signs of the rates ν_ij;
the holding time of the i-th state
is an Exp(-ν_ii) random variable;
transition probability matrix Q = (q_ij)
of the jump chain; computing the entries of Q:
q_ii = 0,
q_ij = -ν_ij/ν_ii
for i ≠ j.
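Both computations are one-liners numerically; a hypothetical
Python/NumPy sketch (the 3-state generator G is invented for the
example) finds π from πG = 0, Σ_i π_i = 1, and builds the jump-chain
matrix Q from G.

    import numpy as np

    G = np.array([[-3.0, 2.0, 1.0],
                  [1.0, -2.0, 1.0],
                  [2.0, 2.0, -4.0]])    # off-diagonal rates nu_ij >= 0, rows sum to 0
    n = G.shape[0]
    A = np.vstack([G.T, np.ones(n)])    # pi G = 0 plus normalization
    b = np.append(np.zeros(n), 1.0)
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)

    Q = -G / np.diag(G)[:, None]        # q_ij = -nu_ij/nu_ii for i != j
    np.fill_diagonal(Q, 0.0)            # q_ii = 0
    print(Q)                            # rows of Q sum to 1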
-
Lecture 17 (Tue, Mar 11):
Continuous-time Markov chains (cont.):
birth-death-immigration-disaster Markov processes
- derivation of the infinitesimal-time evolution probabilities
and differential equation,
birth process - solving the differential equations
by using a generating function
(pages 132, 133, 135, 136 of Sec. 3.3).
-
Lecture 18 (Thu, Mar 13):
Continuous-time Markov chains (cont.):
computing the expectation of the time T_n
for a birth process starting at X_0 = 1
to reach X_t = n for the first time.
More properties of Poisson processes:
counting the events in a Poisson process N_t
if the counter detects each event with probability p ∈ [0,1]
- the resulting counting process M_t
has distribution M_t ∼ Bin(N_t, p);
given that N_t = 1,
the time T_1 of the first event
has distribution T_1 ∼ Uniform(0,t]
(Proposition 5.1.5);
generalization: if t_1 < t_2,
N_{t_1} = i, and N_{t_2} = i+1,
then T_{i+1} ∼ Uniform(t_1,t_2];
the sum N_t = N_t^(1) + N_t^(2)
of the independent Poisson processes
N_t^(1) and N_t^(2)
with rates λ^(1) and λ^(2)
is a Poisson process with rate λ^(1) + λ^(2)
(Proposition 5.1.1)
(pages 240, 243 of Sec. 5.1).
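Both facts are easy to see in simulation; a hypothetical Python/NumPy
sketch (rates and p are arbitrary choices) follows.

    # thinning: M_t ~ Bin(N_t, p) has mean lam*p*t;
    # superposition: N^(1)_t + N^(2)_t is Poisson with rate lam1 + lam2
    import numpy as np

    rng = np.random.default_rng(2)
    lam1, lam2, p, t, n = 2.0, 3.0, 0.4, 1.0, 100_000
    N1 = rng.poisson(lam1 * t, size=n)
    N2 = rng.poisson(lam2 * t, size=n)
    M = rng.binomial(N1, p)                        # thinned counter
    print(M.mean(), lam1 * p * t)                  # approximately 0.8
    S = N1 + N2
    print(S.mean(), S.var(), (lam1 + lam2) * t)    # all approximately 5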
-
Lecture 19 (Tue, Mar 25):
Discussion of some problems from the midterm exam.
Nonhomogeneous Poisson processes:
definition, intensity function λ(t),
mean-value function m(t)
(such that m(0) = 0 and m'(t) = λ(t)),
proof that the distribution of the increment
N_{s+t} - N_s
is Poisson with parameter m(s+t) - m(s)
(pages 250, 251 of Sec. 5.2).
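A nonhomogeneous Poisson process can be simulated by "thinning" a
homogeneous one, which also checks the increment distribution above;
a hypothetical Python/NumPy sketch with the invented intensity
λ(u) = 1 + sin u, for which m(t) = t - cos t + 1:

    # increment N_{s+t} - N_s of a nonhomogeneous Poisson process
    import numpy as np

    rng = np.random.default_rng(3)
    lam = lambda u: 1.0 + np.sin(u)          # intensity, bounded by lam_max
    lam_max, s, t, n_paths = 2.0, 1.0, 2.0, 20_000
    counts = np.empty(n_paths, dtype=int)
    for j in range(n_paths):
        n_events = rng.poisson(lam_max * t)     # homogeneous events on (s, s+t]
        u = s + t * rng.random(n_events)        # their (uniform) locations
        counts[j] = np.sum(rng.random(n_events) < lam(u) / lam_max)  # keep w.p. lam(u)/lam_max
    m = lambda u: u - np.cos(u) + 1.0           # mean-value function
    print(counts.mean(), m(s + t) - m(s))       # both approximately 3.53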
-
Lecture 20 (Thu, Mar 27):
Nonhomogeneous Poisson processes (cont.):
"homogenizing" a nonhomogeneous Poisson process,
distribution of the time of the first event
given that N_t = 1
(pages 253, 254 of Sec. 5.2).
Compound Poisson processes:
definition of a compound Poisson process,
mean, variance, and moment generating function
of a compound Poisson process
(pages 254-256 of Sec. 5.3).
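The mean and variance formulas E[Y_t] = λt E[X] and V[Y_t] = λt E[X^2]
are easy to verify by Monte Carlo; a hypothetical Python/NumPy sketch
with Uniform(0,1) jumps (so E[X] = 1/2 and E[X^2] = 1/3):

    # compound Poisson process Y_t = X_1 + ... + X_{N_t}
    import numpy as np

    rng = np.random.default_rng(4)
    lam, t, n_paths = 2.0, 3.0, 50_000
    N = rng.poisson(lam * t, size=n_paths)
    Y = np.array([rng.random(n).sum() for n in N])   # Uniform(0,1) jumps
    print(Y.mean(), lam * t / 2)                     # approximately 3
    print(Y.var(), lam * t / 3)                      # approximately 2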
-
Lecture 21 (Tue, Apr 1):
Compound Poisson processes (cont.):
derivation of the expression for the
moment generating function of a compound Poisson process,
consequences of the Central Limit Theorem
for the behavior of a compound Poisson process
Y_t for large t
(pages 256, 257 of Sec. 5.3).
Doubly stochastic Poisson processes:
idea of conditional (or "mixed") Poisson processes
and doubly stochastic Poisson process ("Cox process")
(pages 258, 262 of Sec. 5.4).
Renewal processes:
definition of a renewal process,
modified ("delayed") renewal process;
relations between the process N_t,
the times of the events T_n,
and the interevent times τ_n;
expression for the p.m.f. of N_t
in terms of the c.d.f. of T_n
(Proposition 5.6.1),
"honesty" of a renewal process,
renewal function m_N(t);
definition of the Riemann-Stieltjes integral,
particular cases, applications
to computing expected values of discrete
and continuous random variables
(pages 267-271 of Sec. 5.6).
-
Lecture 22 (Thu, Apr 3):
Renewal processes (cont.):
recursive formula for the c.d.f. of the event times T_n,
expressed through Riemann-Stieltjes integrals;
expected value of an N-valued random variable X
as a sum (over n from 1 to infinity)
of the probabilities that X is greater than or equal to n,
expected value of a non-negative continuous random variable X
as an integral of [1 - F_X(x)],
geometric meaning;
integral equation for the renewal function
m_N(t) = E[N_t],
solving renewal-type equations by using the Laplace transform
(pages 271, 273-276 of Sec. 5.6).
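The survival-function formula E[X] = ∫_0^∞ [1 - F_X(x)] dx can be
sanity-checked numerically; a tiny Python/NumPy sketch for an
exponential random variable with rate 1/2 (mean 2):

    # E[X] as the area under the survival function 1 - F_X
    import numpy as np

    x = np.linspace(0.0, 60.0, 600_001)
    F = 1.0 - np.exp(-0.5 * x)         # c.d.f. of Exp(rate 1/2)
    dx = x[1] - x[0]
    print(np.sum(1.0 - F) * dx)        # approximately 2.0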
-
Lecture 23 (Tue, Apr 8):
Renewal processes (cont.):
finding the expected time E[T] to cross a street
with a Poissonian car flow
(derivation and solution of an integral equation for
E[T]);
remarks about the "renewal equation" approach to the problem
of the dynamics of a population with age-dependent
reproductive ability.
Queueing theory:
definition, classification of the queues,
examples of different problems and applications
of queueing theory.
-
Lecture 24 (Thu, Apr 10):
General properties of stochastic processes:
cumulative distribution function
F(x_1,...,x_k; t_1,...,t_k),
probability mass function
p(x_1,...,x_k; t_1,...,t_k),
and probability density function
f(x_1,...,x_k; t_1,...,t_k)
of order k of a stochastic process
X = {X_t : t ∈ [0,∞)};
mean m_X(t) = E[X_t],
autocorrelation function
R_X(t_1,t_2) = E[X_{t_1} X_{t_2}],
autocovariance function
C_X(t_1,t_2) = R_X(t_1,t_2) - m_X(t_1) m_X(t_2),
and autocorrelation coefficient
ρ_X(t_1,t_2)
of a stochastic process,
average power R_X(t,t) = E[X_t^2]
and variance V[X_t] = C_X(t,t)
of a stochastic process;
processes with independent increments,
processes with stationary increments,
strict-sense stationary (SSS, strongly stationary) processes,
wide-sense stationary (WSS, weakly stationary) processes,
examples;
spectral density S_X(ω)
of a WSS process
(Sec. 2.1, 2.2).
-
Lecture 25 (Tue, Apr 15):
Gaussian and Markov processes:
multinormal distribution of a random vector
X = (X_1,...,X_n) ∼ N(m, K),
vector of the means m, covariance matrix
K = (cov(X_i, X_j));
Gaussian process X_t
- a continuous-time stochastic process with
(X_{t_1},...,X_{t_n})
being multinormal for any n and times t_1,...,t_n;
Markov processes; examples;
(first-order) density function f(x;t),
conditional transition density function
p(x, x_0; t, t_0) = f_{X_t|X_{t_0}}(x|x_0);
integrals of f(x;t) and p(x, x_0; t, t_0)
over x are equal to 1;
expressing f(x;t) as an integral of
f(x_0; t_0) p(x, x_0; t, t_0)
over x_0
(pages 58-62 of Sec. 2.4).
-
Lecture 26 (Thu, Apr 17):
Gaussian and Markov processes (cont.):
more on the meaning of the p.d.f. of a continuous RV:
P(X ∈ (x, x+Δx]) ≈ f_X(x) Δx,
generalization for jointly continuous random vectors:
P(X ∈ A) ≈ f_X(x) vol(A),
where A is a small domain in R^k containing x;
application to k-th order p.d.f.'s of a random process:
P(X_{t_1} ∈ (x_1, x_1+Δx_1], ..., X_{t_k} ∈ (x_k, x_k+Δx_k])
≈ f_{(X_{t_1},...,X_{t_k})}(x_1,...,x_k) Δx_1 ··· Δx_k;
Chapman-Kolmogorov equations for the
conditional transition density function
p(x, x_0; t, t_0) = f_{X_t|X_{t_0}}(x|x_0);
time-homogeneous processes;
generalized functions ("distributions"):
test functions (infinitely smooth compactly supported functions),
Dirac δ-function and its derivatives,
generalized derivatives (example: generalized derivative
of the unit step function: u'(x) = δ(x)),
representing δ(x) as the limit of
(1/t) 1_{[0,t]}(x) as t→0+,
representing δ(x) as the limit of
(2πt)^{-1/2} exp{-x^2/(2t)} as t→0+
(pages 62-65 of Sec. 2.5).
The Wiener process:
definition of a Wiener process W_t ∼ N(0, σ^2 t)
and a standard Wiener process B_t ∼ N(0, t),
autocorrelation function
R_B(t,s) = E[B_t B_s] = min(t,s)
(pages 175-178 of Sec. 4.1).
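The autocorrelation formula can be observed in simulation; a
hypothetical Python/NumPy sketch builds standard Wiener paths from
independent N(0, Δt) increments and estimates E[B_t B_s] (the grid and
path count are arbitrary).

    # sample-path check that E[B_t B_s] = min(t, s)
    import numpy as np

    rng = np.random.default_rng(5)
    n_paths, n_steps, T = 50_000, 200, 2.0
    dt = T / n_steps
    dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    B = np.cumsum(dB, axis=1)              # B at times dt, 2dt, ..., T
    i, j = 49, 149                         # times t = 0.5 and s = 1.5
    print(np.mean(B[:, i] * B[:, j]))      # approximately min(0.5, 1.5) = 0.5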
-
Lecture 27 (Tue, Apr 22):
The Wiener process (cont.):
Wiener process (Brownian motion)
as a limit of a simple random walk,
a brief history of the theory of Brownian motion
(Robert Brown, Albert Einstein,
Norbert Wiener, Andrey Kolmogorov);
m.g.f. and moments of B_t:
E[B_t^{2n+1}] = 0,
E[B_t^2] = t,
E[B_t^4] = 3t^2,
E[B_t^6] = 15t^3;
short-time behavior:
E[ΔB_t] = 0,
E[(ΔB_t)^2] = Δt;
nondifferentiability of B_t:
E[ΔB_t/Δt] = 0,
E[(ΔB_t/Δt)^2] = 1/Δt → ∞
as Δt→0;
(Gaussian) white noise ξ_t = dB_t/dt,
making sense of ξ_t = dB_t/dt
by treating it as a functional (on test functions φ)
taking values ξ(φ) in the space of random variables,
moments of ξ(φ).
-
Lecture 28 (Thu, Apr 24):
The Wiener process (cont.):
writing the facts about the moments of ξ(φ)
as consequences of E[ξ_t] = 0
and E[ξ_t ξ_s] = δ(t-s).
Stochastic differential equations (SDEs) and Ito integrals:
discussion of the concept of a stochastic differential equation
and the meaning of its solution,
Fokker-Planck equation for the conditional transition
density function p(x, x_0; t, t_0),
an example (the Fokker-Planck equation
and its solution for the Wiener process B_t),
discretizing the SDE,
Ito's way of defining the integral
as a limit of left Riemann sums,
reasons for using left Riemann sums,
examples of Ito integrals,
Ito formula, examples of applications.
-
Lecture 29 (Tue, Apr 29):
SDEs and Ito integrals (cont.):
meaning of the convergence of the left Riemann-Stieltjes
sums to the Ito integral (mean-square convergence),
sketch of computing ∫_{t_0}^t B_s dB_s;
simple population growth at a noisy rate:
dX_t/dt = (r + α ξ_t) X_t
or, equivalently,
dX_t = r X_t dt + α X_t dB_t,
derivation of the solution
X_t = X_0 exp{(r - α^2/2) t + α B_t}.
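A hypothetical Python/NumPy sketch compares the Euler-Maruyama
discretization of dX_t = r X_t dt + α X_t dB_t with the exact solution
driven by the same noise (step size and parameters are arbitrary).

    # Euler-Maruyama vs. the exact solution X_t = X_0 exp{(r - a^2/2)t + a B_t}
    import numpy as np

    rng = np.random.default_rng(6)
    r, alpha, X0, T, n_steps = 1.0, 0.3, 1.0, 1.0, 10_000
    dt = T / n_steps
    dB = rng.normal(0.0, np.sqrt(dt), size=n_steps)
    X = X0
    for k in range(n_steps):
        X += r * X * dt + alpha * X * dB[k]          # one Euler-Maruyama step
    exact = X0 * np.exp((r - alpha**2 / 2) * T + alpha * dB.sum())
    print(X, exact)                                  # close for small dt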
-
Lecture 30 (Thu, May 1):
SDEs and Ito integrals (cont.):
simple population growth at a noisy rate:
computation of the expectation and the variance of X_t,
discussion of the interpretation of the results;
martingales with respect to a filtration
of σ-fields, examples of martingales
with respect to the σ-field generated
by a Brownian motion B_t:
B_t,
B_t^2 - t,
exp(βB_t - β^2 t/2);
more on the Fokker-Planck equation and the initial conditions
for it, looking for stationary solutions of the Fokker-Planck
equation (if such solutions exist).
Attendance:
You are required to attend class on those days when an
examination is being given;
attendance during other class periods is also
strongly encouraged.
You are fully responsible for the
material covered in each class, whether or not you attend.
Make-ups for missed exams will be given only if
there is a compelling reason for the absence,
which I know about beforehand
and can document independently of your testimony
(for example, via a note or a phone call from a
doctor or a parent).
Homework:
It is absolutely essential
to solve the assigned homework problems!
Homework assignments will be given
regularly throughout the semester
and will be posted on this web-site.
Usually the homework will be due at the start
of class on Thursday.
Each homework will consist of several problems,
of which some pseudo-randomly chosen problems will be graded.
Your lowest homework grade will be dropped.
All homework should be written on 8.5"×11" paper
with your name clearly written, and should be stapled.
No late homework will be accepted!
You are encouraged to discuss the homework problems
with other students.
However, you have to write your solutions clearly
and in your own words - this is the only way to
achieve real understanding!
It is advisable that you first write a draft
of the solutions and then copy them neatly.
Please write the problems in the same order
in which they are given in the assignment.
Shortly after a homework assignment's due date,
solutions to the problems from that assignment
will be placed on restricted reserve in
the Chemistry-Mathematics Library in 207 PHSC.
Exams:
There will be one take-home midterm and a comprehensive final.
All tests must be taken at the scheduled times,
except in extraordinary circumstances.
Main topics (a tentative list):
-
a brief review of probability theory;
-
discrete Markov chains: Chapman-Kolmogorov equations,
persistence and transience, generating functions,
stationary distributions, reducibility, limit theorems, ergodicity;
-
continuous Markov processes:
Poisson process, birth-death and branching processes,
embedding of a discrete-time Markov chain
in a continuous-time Markov process;
-
conditional expectation, martingales;
-
stationary processes (autocorrelation function,
spectral representation);
-
renewal processes, queues;
-
diffusion processes, Wiener processes (Brownian motion);
-
introduction to stochastic differential equations, Ito calculus;
-
Fokker-Planck equation, Black-Scholes option-pricing formula,
Ornstein-Uhlenbeck process.
Grading:
Your grade will be determined by your performance
on the following coursework:
Homework (lowest grade dropped): 50%
Take-home midterm exam: 20%
Final exam: 30%
Academic calendar for
Spring 2008.
Policy on W/I Grades:
Through February 22, you can withdraw
from the course with an automatic W. In addition,
it is my policy to give
any student a W grade,
regardless of his/her performance in the course,
through the extended drop period that ends on May 2.
However, after March 31, you can only drop
via petition to the Dean of your college.
Such petitions are not often granted.
Furthermore, even if the petition
is granted, I will give you a grade
of "Withdrawn Failing" if you are
indeed failing at the time of your petition.
The grade of I (Incomplete)
is not intended to serve as
a benign substitute for the grade of F.
I only give the I grade
if a student has completed the majority
of the work in the course
(for example everything except the final exam),
the coursework cannot be completed
because of compelling and verifiable problems
beyond the student's control, and the student expresses a clear
intention of making up the missed work as soon as possible.
Academic Misconduct: All cases of suspected academic misconduct will
be referred to the Dean of the College of Arts and Sciences for prosecution
under the University's Academic Misconduct Code. The penalties can be quite
severe. Don't do it!
For more details on the University's
policies concerning academic misconduct see
http://www.ou.edu/provost/integrity/.
See also the Academic Misconduct Code,
which is a part of the Student Code
and can be found at
http://www.ou.edu/studentcode/.
Students With Disabilities:
The University of Oklahoma is committed to providing reasonable accommodation
for all students with disabilities. Students with disabilities who require
accommodations in this course are requested to speak with the instructor
as early in the semester as possible. Students with disabilities must be
registered with the Office of Disability Services prior to receiving
accommodations in this course. The Office of Disability Services is located
in Goddard Health Center, Suite 166: phone 405-325-3852 or TDD only
405-325-4173.