MATH 4733 - Mathematical Theory of Probability, Section 001 - Fall 2007
TR 1:30-2:45 p.m., 117 PHSC
Instructor:
Nikola Petrov, 802 PHSC, (405)325-4316, npetrov AT math.ou.edu
Office Hours:
Mon 2:30-3:30 p.m., Tue 4:30-5:30 p.m., or by appointment.
Prerequisite:
2443 (Calculus and Analytic Geometry IV) or concurrent enrollment.
Course catalog description:
Probability spaces, counting techniques, random variables,
moments, special distributions, limit theorems. (F)
Text:
R. J. Larsen, M. L. Marx,
Introduction to Mathematical Statistics and Its Applications,
4th edition, Pearson/Prentice Hall, 2005,
ISBN-10: 0131867938, ISBN-13: 978-0131867932.
The course will cover major parts of chapters 2, 3, and 4.
Homework
(solutions are deposited after the due date in the Chemistry-Mathematics
Library, 207 PHSC):
- Homework 1, due Thu, Aug 30.
- Homework 2, due Thu, Sep 6.
- Homework 3, due Thu, Sep 13.
- Homework 4, due Thu, Sep 20.
- Homework 5, due Thu, Oct 4.
- Homework 6, due Thu, Oct 11.
- Homework 7, due Thu, Oct 18.
- Homework 8, due Thu, Oct 25.
- Homework 9, due Thu, Nov 1.
- Homework 10, due Thu, Nov 8.
- Homework 11, due Thu, Nov 29.
- Homework 12, due Thu, Dec 6.
Content of the lectures:
-
Lecture 1 (Tue, Aug 21):
Introduction:
remarks on the history of the modern Theory of Probability
(Sec. 2.1).
Sample spaces and the algebra of sets:
experiment, sample outcome, sample space, event, examples;
unions, intersections, and complements of sets, examples
(Sec. 2.2).
-
Lecture 2 (Thu, Aug 23):
The probability function:
Kolmogorov axioms,
elementary consequences of the axioms, examples
(Sec. 2.3).
Conditional probability:
motivation and formal definition of conditional probability,
examples
(pages 42-45 of Sec. 2.4).
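The definition of conditional probability can be checked by brute force on a small sample space. A short Python sketch (the two-dice example is my own illustration, not one from the text):

```python
from fractions import Fraction

# Sample space: all ordered outcomes of rolling two fair dice
# (an illustrative example, not taken from the textbook).
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

A = {w for w in omega if w[0] + w[1] == 8}   # event: the sum is 8
B = {w for w in omega if w[0] % 2 == 0}      # event: first die is even

def prob(event):
    """Probability under equally likely outcomes: |E| / |Omega|."""
    return Fraction(len(event), len(omega))

# Conditional probability by definition: P(A|B) = P(A n B) / P(B)
p_A_given_B = prob(A & B) / prob(B)
print(p_A_given_B)  # 1/6
```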
-
Lecture 3 (Tue, Aug 28):
Conditional probability (cont.):
more examples;
applying conditional probability to
higher-order intersections,
partitions, calculating "unconditional"
probabilities by partitioning
the sample space
(pages 46-57 of Sec. 2.4).
-
Lecture 4 (Thu, Aug 30):
Conditional probability (cont.):
more examples of applying conditional probability,
Bayes Theorem and its applications
(pages 58-69 of Sec. 2.4).
Independence:
motivation and definition of independence
of two events
(pages 69-70 of Sec. 2.5).
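Bayes' Theorem combines with the total-probability formula in the classic screening-test calculation. A small Python sketch (the numbers are hypothetical, chosen only to illustrate the formula):

```python
from fractions import Fraction

# Hypothetical numbers, chosen only to illustrate Bayes' theorem.
p_disease = Fraction(1, 100)          # prior P(D)
p_pos_given_d = Fraction(95, 100)     # sensitivity P(+|D)
p_pos_given_not_d = Fraction(5, 100)  # false-positive rate P(+|D^c)

# Total probability: P(+) = P(+|D)P(D) + P(+|D^c)P(D^c)
p_pos = p_pos_given_d * p_disease + p_pos_given_not_d * (1 - p_disease)

# Bayes' theorem: P(D|+) = P(+|D)P(D) / P(+)
p_d_given_pos = p_pos_given_d * p_disease / p_pos
print(p_d_given_pos)
```

Even with a fairly accurate test, the posterior probability of disease given a positive result stays small because the prior is small.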
-
Lecture 5 (Tue, Sep 4):
Independence (cont.):
independence of more than two events,
independence of A and B
implies independence of A and B^c,
independence of A, B and C
implies independence of A and B∩C,
as well as independence of A and B∪C;
deducing independence, repeated trials, examples
(pages 70-80 of Sec. 2.5).
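The fact that independence of A and B carries over to A and B^c can be verified by enumeration on a small sample space. A short sketch (the dice events are my own illustration):

```python
from fractions import Fraction

# Two independent fair-die rolls; a small check (not from the book)
# that independence of A and B implies independence of A and B^c.
omega = [(i, j) for i in range(1, 7) for j in range(1, 7)]

A = {w for w in omega if w[0] % 2 == 0}   # first die even
B = {w for w in omega if w[1] > 4}        # second die is 5 or 6
Bc = set(omega) - B                       # complement of B

def prob(event):
    return Fraction(len(event), len(omega))

assert prob(A & B) == prob(A) * prob(B)    # A and B are independent
assert prob(A & Bc) == prob(A) * prob(Bc)  # hence A and B^c are too
print("independence verified")
```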
-
Lecture 6 (Thu, Sep 6):
Independence (cont.):
more examples
(pages 80-85 of Sec. 2.5; Example 2.5.15 is optional).
Combinatorics:
general idea and classification of combinatorial problems:
ordered or unordered selection, distinct or identical objects,
with or without repetition;
the multiplication rule;
permutations of distinct objects, examples,
Stirling's formula for approximating n!
(pages 85-96 of Sec. 2.6).
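Stirling's formula, n! ≈ √(2πn)(n/e)^n, is easy to test numerically; the ratio of n! to the approximation approaches 1 as n grows. A quick sketch:

```python
import math

# Stirling's approximation: n! ~ sqrt(2*pi*n) * (n/e)^n.
def stirling(n):
    return math.sqrt(2 * math.pi * n) * (n / math.e) ** n

# The ratio n! / stirling(n) tends to 1 as n grows.
for n in (5, 10, 20):
    print(n, math.factorial(n) / stirling(n))
```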
-
Lecture 7 (Tue, Sep 11):
Combinatorics (cont.):
example of permutations with restrictions
(seating children at the movies,
permutations of "composite objects");
permutations of not necessarily distinct objects, examples;
combinations, binomial coefficients, examples
(pages 97-108 of Sec. 2.6).
-
Lecture 8 (Thu, Sep 13):
Combinatorics (cont.):
properties of binomial coefficients
- algebraic and combinatorial derivations,
Pascal's triangle, binomial formula
(pages 110-113 of Sec. 2.6).
Combinatorial probability:
probability of events in a sample space
with equally likely outcomes
(pages 113-115 of Sec. 2.7;
please read Examples 2.7.1 and 2.7.2).
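Pascal's rule (the identity behind Pascal's triangle) and the binomial formula can both be checked directly with integer arithmetic. A short sketch:

```python
import math

# Pascal's rule: C(n, k) = C(n-1, k-1) + C(n-1, k), for 0 < k < n.
for n in range(2, 10):
    for k in range(1, n):
        assert math.comb(n, k) == math.comb(n - 1, k - 1) + math.comb(n - 1, k)

# Binomial formula at x = 2, y = 3: sum_k C(n,k) x^k y^(n-k) = (x+y)^n.
n, x, y = 6, 2, 3
lhs = sum(math.comb(n, k) * x**k * y**(n - k) for k in range(n + 1))
assert lhs == (x + y) ** n
print("Pascal's rule and the binomial formula check out")
```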
-
Lecture 9 (Tue, Sep 18):
Combinatorial probability (cont.):
more examples
(pages 115-121 of Sec. 2.7).
-
Lecture 10 (Thu, Sep 20):
Binomial and hypergeometric probabilities:
random variables, discrete and continuous random variables;
binomial distribution, examples;
hypergeometric distribution, examples
(Sec. 3.1, 3.2).
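Both probability models reduce to counting with binomial coefficients. A minimal sketch of the two p.m.f.'s (the parameter values are arbitrary illustrations):

```python
import math
from fractions import Fraction

def binom_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p): C(n,k) p^k (1-p)^(n-k)."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def hypergeom_pmf(k, N, r, n):
    """P(k "successes" when drawing n objects without replacement
    from N objects, r of which are of the "success" type."""
    return Fraction(math.comb(r, k) * math.comb(N - r, n - k), math.comb(N, n))

# Sanity check: both p.m.f.'s sum to 1 over their support.
assert abs(sum(binom_pmf(k, 10, 0.3) for k in range(11)) - 1) < 1e-12
assert sum(hypergeom_pmf(k, 20, 8, 5) for k in range(6)) == 1
print(binom_pmf(3, 10, 0.3))
```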
-
Lecture 11 (Tue, Sep 25):
More examples.
-
Lecture 12 (Thu, Sep 27):
Hour exam 1.
-
Lecture 13 (Tue, Oct 2):
Discrete random variables (RV):
random variables, discrete and continuous random variables,
probability density function, p.d.f., p_X(x)
(also called probability mass function, p.m.f.),
cumulative distribution function, c.d.f., F_X(x),
examples, binomial and hypergeometric distributions
(Sec. 3.3).
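For a discrete RV, the c.d.f. F_X(x) = P(X <= x) is just the running sum of the p.m.f. over the support. A tiny sketch (the three-coin-toss example is my own illustration):

```python
from fractions import Fraction
from itertools import accumulate

# Illustrative discrete RV: X = number of heads in 3 fair coin tosses.
support = [0, 1, 2, 3]
pmf = [Fraction(1, 8), Fraction(3, 8), Fraction(3, 8), Fraction(1, 8)]

# The c.d.f. F_X(x) = P(X <= x) is the running sum of the p.m.f.
cdf = list(accumulate(pmf))
print(cdf)  # [1/8, 1/2, 7/8, 1]
```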
-
Lecture 14 (Thu, Oct 4):
Discrete random variables (cont.):
transformations of p_X(x)
under a linear change of variables Y = aX + b
(page 158 of Sec. 3.3).
Continuous random variables:
continuous RVs,
probability density function p_X(x) of a continuous RV,
cumulative distribution function F_X(x) of a continuous RV,
properties of F_X(x),
relationships between F_X(x) and p_X(x),
transformations of p_X(x)
under a linear change of variables Y = aX + b, examples
(Sec. 3.4).
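Under the linear change of variables Y = aX + b (a ≠ 0), the density transforms as f_Y(y) = f_X((y - b)/a) / |a|. A numerical sketch, using X ~ Exponential(1) as an illustrative choice and checking that the transformed density still integrates to 1:

```python
import math

# Linear change of variables: if Y = aX + b with a != 0, then
#   f_Y(y) = f_X((y - b) / a) / |a|.
# Illustrative choice (not from the book): X ~ Exponential(1).
a, b = 2.0, 1.0

def f_X(x):
    return math.exp(-x) if x >= 0 else 0.0

def f_Y(y):
    return f_X((y - b) / a) / abs(a)

# A Riemann sum of f_Y over a wide grid should be close to 1.
dy = 0.001
total = sum(f_Y(b + i * dy) * dy for i in range(40000))
print(total)  # close to 1
```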
-
Lecture 15 (Tue, Oct 9):
Expected values:
motivation, definitions of expected values
(average values, mean values, expectations)
of discrete and continuous random variables,
expectations of binomial, hypergeometric,
and exponential random variables,
St. Petersburg paradox,
median of a probability distribution
(pages 173-183 of Sec. 3.5).
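The defining sum E[X] = Σ_k k·p_X(k) can be evaluated exactly with rational arithmetic; for a binomial RV it reproduces the familiar mean n·p. A short sketch:

```python
from fractions import Fraction
import math

# Expected value of a discrete RV: E[X] = sum_k k * p_X(k).
# Checked here for X ~ Binomial(n, p), whose mean is n*p.
n, p = 8, Fraction(1, 4)

def pmf(k):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

mean = sum(k * pmf(k) for k in range(n + 1))
assert mean == n * p  # 8 * (1/4) = 2
print(mean)  # 2
```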
-
Lecture 16 (Thu, Oct 11):
Expected values (cont.):
expected value of a function of a random variable,
linearity of the expectation,
examples
(pages 186-192 of Sec. 3.5).
The variance:
motivation, definition,
properties, standard deviation, examples;
nth moment and nth central moment
(moment about the mean) of a random variable,
the expectation as the first moment,
and the variance as the second central moment,
skewness and kurtosis
(Sec. 3.6).
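The shortcut formula Var(X) = E[X²] − (E[X])² follows from expanding the second central moment. A tiny sketch on an illustrative three-point distribution (my own example, not the book's):

```python
from fractions import Fraction

# Moments of a discrete RV (illustrative distribution, not from the book).
# Var(X) = E[X^2] - (E[X])^2, i.e., the second central moment.
support = [0, 1, 2]
pmf = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]

def moment(r):
    """r-th moment E[X^r]."""
    return sum(x**r * p for x, p in zip(support, pmf))

mean = moment(1)
var = moment(2) - mean**2
print(mean, var)  # 1 1/2
```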
-
Lecture 17 (Tue, Oct 16):
Joint densities:
motivation, discrete and continuous joint densities, examples,
marginal densities, obtaining marginal densities
from a joint density, geometric probabilities
(pages 203-212 of Sec. 3.7).
-
Lecture 18 (Thu, Oct 18):
Joint densities (cont.):
joint c.d.f., obtaining the joint p.d.f.
from the joint c.d.f., marginal c.d.f.'s
and their relation with the joint c.d.f.,
multivariate densities,
independence of two random variables
in terms of their joint c.d.f. and
their joint p.d.f.
(pages 213-218 of Sec. 3.7).
Combining random variables:
p.d.f. of a sum of two independent
random variables,
proof in the discrete case;
do the proof in the continuous case yourselves
(pages 220-221 of Sec. 3.8).
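In the discrete case, the p.d.f. of the sum is the convolution P(X + Y = w) = Σ_x p_X(x)·p_Y(w − x), which is short to code directly. A sketch using two fair dice as an illustrative pair of independent RVs:

```python
from fractions import Fraction

# Discrete convolution: for independent X, Y with p.m.f.'s p_X, p_Y,
#   P(X + Y = w) = sum_x p_X(x) * p_Y(w - x).
# Illustrative example: the sum of two independent fair dice.
die = {k: Fraction(1, 6) for k in range(1, 7)}

def convolve(pX, pY):
    pW = {}
    for x, px in pX.items():
        for y, py in pY.items():
            pW[x + y] = pW.get(x + y, 0) + px * py
    return pW

p_sum = convolve(die, die)
print(p_sum[7])  # 1/6: seven is the most likely total
```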
-
Lecture 19 (Tue, Oct 23):
Combining random variables (cont.):
p.d.f. of a sum of two independent
random variables - example
of a sum of independent binomial RVs
(what is its average value?),
example of a sum of two independent exponential RVs
(what is its average value?),
p.d.f. of a quotient of random variables - an example
(pages 221-224 of Sec. 3.8).
Further properties of the mean and variance:
expectation of a function of two (or more) RVs;
expectation of a linear combination of RVs
(not necessarily independent!), examples
(pages 226-230 of Sec. 3.9).
-
Lecture 20 (Thu, Oct 25):
Further properties of the mean and variance (cont.):
more examples on expectation of a function of two RVs;
expected value of a product of independent RVs;
variance of a sum of independent RVs;
random sample, sample average,
expectation and variance of the sample average
(pages 230-236 of Sec. 3.9,
pages 218-219 of Sec. 3.7).
Order statistics:
order statistics of a random sample,
minimum and maximum values in a random sample
(pages 241-242 of Sec. 3.10).
-
Lecture 21 (Tue, Oct 30):
Order statistics (cont.):
computing the distributions of
the minimum and maximum values in a random sample,
example - lifetime of two lightbulbs
(pages 242-243 of Sec. 3.10).
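For an i.i.d. sample X_1, …, X_n with c.d.f. F, the extremes satisfy F_max(x) = F(x)^n and F_min(x) = 1 − (1 − F(x))^n. A numerical sketch of the lightbulb example: the minimum of two Exponential(λ) lifetimes is again exponential, with rate 2λ (the value of λ below is an arbitrary illustration):

```python
import math

# Order statistics of an i.i.d. sample with c.d.f. F:
#   F_max(x) = F(x)^n,    F_min(x) = 1 - (1 - F(x))^n.
# Sketch for n = 2 Exponential(lam) lifetimes: the minimum is Exp(2*lam).
lam, n = 0.5, 2

def F(x):          # Exponential(lam) c.d.f.
    return 1 - math.exp(-lam * x)

def F_min(x):
    return 1 - (1 - F(x)) ** n

for x in (0.1, 1.0, 3.0):
    assert abs(F_min(x) - (1 - math.exp(-2 * lam * x))) < 1e-12
print("min of two Exp(lam) lifetimes is Exp(2*lam)")
```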
Moment generating functions:
definition, expressing the moments
as derivatives of the m.g.f.,
examples: m.g.f.'s of binomial
and exponential RVs
(pages 257-259 and 261-262 of Sec. 3.12).
-
Lecture 22 (Thu, Nov 1):
Moment generating functions (cont.):
more examples: m.g.f.'s of geometric and normal RVs,
m.g.f. of a sum of two independent random variables:
M_{X+Y}(t) = M_X(t) M_Y(t),
examples of application: identifying the type
of the sum of a Bin(n,p) and a Bin(k,p) RV,
identifying the type
of the sum of a N(μ,σ²) and a N(ν,ρ²) RV
(pages 258, 260, and 263-268 of Sec. 3.12).
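The product rule for m.g.f.'s can be checked numerically: the m.g.f. of Bin(n, p) is (1 − p + p·e^t)^n, so the product of the m.g.f.'s of Bin(n,p) and Bin(k,p) equals the m.g.f. of Bin(n+k, p). A sketch (the parameter values are arbitrary):

```python
import math

# M.g.f. of Binomial(n, p): M(t) = (1 - p + p*e^t)^n.  The m.g.f. of a sum
# of independent RVs is the product of the m.g.f.'s, so Bin(n,p) + Bin(k,p)
# is identified as Bin(n+k, p) -- checked numerically at a few t values.
def mgf_binom(t, n, p):
    return (1 - p + p * math.exp(t)) ** n

n, k, p = 4, 7, 0.3
for t in (-1.0, 0.0, 0.5, 1.0):
    prod = mgf_binom(t, n, p) * mgf_binom(t, k, p)
    assert abs(prod - mgf_binom(t, n + k, p)) < 1e-9
print("M_{X+Y}(t) = M_X(t) M_Y(t) identifies the sum as Bin(n+k, p)")
```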
-
Lecture 23 (Tue, Nov 6):
Moment generating functions (cont.):
expressing the m.g.f. of the RV aW+b through
the m.g.f. of W
(pages 266-267 of Sec. 3.12).
The Poisson distribution:
motivation, detailed derivation
as a limiting case of the binomial distribution,
derivation of the m.g.f. of the Poisson
distribution from the m.g.f. of the
binomial distribution
(pages 276, 284-285 of Sec. 4.2).
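The limiting relationship is easy to see numerically: holding n·p = λ fixed, the Bin(n, p) p.m.f. approaches e^{−λ}λ^k/k! as n grows. A short sketch (λ = 2 is an arbitrary illustrative value):

```python
import math

# Poisson as a limit of binomials: with n*p = lam held fixed, the
# Bin(n, p) p.m.f. approaches e^{-lam} lam^k / k! as n grows.
lam = 2.0

def poisson_pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

def binom_pmf_np(k, n):
    p = lam / n
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

errs = {}
for n in (10, 100, 1000):
    errs[n] = max(abs(binom_pmf_np(k, n) - poisson_pmf(k)) for k in range(8))
    print(n, errs[n])  # the maximum error shrinks as n grows
```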
-
Lecture 24 (Thu, Nov 8):
The Poisson distribution (cont.):
sum of two independent Poisson RVs
with parameters λ and β, resp.,
is a Poisson random variable
with parameter λ+β
- three derivations (intuitive,
through convolution, and using m.g.f.'s);
interevent times in a Poisson(λ) process
are Exponential(λ) continuous RVs;
example - radioactive decay
(pages 220-222 of Sec. 3.8,
266-267 of Sec. 3.12, 289-292 of Sec. 4.2).
-
Lecture 25 (Tue, Nov 13):
The Poisson distribution (cont.):
example - radioactive decay.
The geometric distribution:
definition, p.d.f., m.g.f.,
mean, variance
(pages 317-319 of Sec. 4.4).
-
Lecture 26 (Thu, Nov 15):
Hour exam 2.
-
Lecture 27 (Tue, Nov 20):
The normal distribution:
reminder about meaning and properties of
binomial and normal distributions,
statement of the DeMoivre-Laplace central limit theorem
(Theorem 4.3.1), c.d.f. of the standard normal
distribution, an example
(pages 292-296 of Sec. 4.3).
-
Lecture 28 (Tue, Nov 27):
The normal distribution (cont.):
approximating a binomial random variable
by a normal random variable,
working with the table of the c.d.f. of
a standard normal random variable,
the continuity correction,
central limit theorem,
proof of the theorem
(pages 294-297, 301-302 of Sec. 4.3,
Appendix 4.A.2).
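The normal approximation with continuity correction, P(X ≤ k) ≈ Φ((k + 0.5 − np)/√(np(1−p))), can be compared directly against the exact binomial sum. A sketch (the standard normal c.d.f. Φ is written via math.erf; n, p, k are arbitrary illustrative values):

```python
import math

# Normal approximation to Bin(n, p) with continuity correction:
#   P(X <= k) ~ Phi((k + 0.5 - n*p) / sqrt(n*p*(1-p))),
# where Phi is the standard normal c.d.f., expressed via the error function.
def Phi(z):
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

n, p, k = 100, 0.5, 55
exact = sum(math.comb(n, j) * p**j * (1 - p) ** (n - j) for j in range(k + 1))
approx = Phi((k + 0.5 - n * p) / math.sqrt(n * p * (1 - p)))
print(exact, approx)  # the two values are close
```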
-
Lecture 29 (Thu, Nov 29):
The normal distribution (cont.):
more examples
(pages 302-307 of Sec. 4.3).
-
Lecture 30 (Tue, Dec 4):
The normal distribution (cont.):
the normal distribution as a model for individual
measurements, distribution of a sum of normal
random variables, distribution of the sample mean
of a random sample from a normally distributed population,
distribution of a linear combination
of normal random variables
(pages 307-314 of Sec. 4.3).
Conditional densities:
conditional probability (a reminder),
conditional probability density function
for discrete RVs, examples
(pages 249-251 of Sec. 3.11).
-
Lecture 31 (Thu, Dec 6):
Conditional densities (cont.):
using conditional probabilities
to find hierarchical distributions:
proof that if X~Poisson(λ)
and Y~Bin(X,p),
then Y~Poisson(λp);
setting up and discussing the physical meaning
and the "intuitive" solution of other problems:
if X~Poisson(λ)
and Y~Poisson(μ),
then given that X+Y=n,
X is Binomial with parameters
n and p=λ/(λ+μ)
(see Problem 3.11.9);
if X~Bin(n,p) and
Y~Bin(X,q),
then Y~Bin(n,pq).
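The first hierarchical result (Poisson "thinning") can be verified numerically by summing over the conditioning variable: P(Y = k) = Σ_x P(X = x)·P(Bin(x, p) = k) should equal the Poisson(λp) p.m.f. A sketch (λ and p are arbitrary illustrative values, and the infinite sum is truncated where the Poisson tail is negligible):

```python
import math

# Poisson thinning: if X ~ Poisson(lam) and Y | X ~ Bin(X, p), then
# Y ~ Poisson(lam * p).  Checked by truncating the sum over x.
lam, p = 3.0, 0.4

def poisson(k, mu):
    return math.exp(-mu) * mu**k / math.factorial(k)

def pY(k, cutoff=80):
    # P(Y = k) = sum_x P(X = x) * P(Bin(x, p) = k); the Poisson(lam)
    # tail beyond the cutoff is negligible.
    return sum(
        poisson(x, lam) * math.comb(x, k) * p**k * (1 - p) ** (x - k)
        for x in range(k, cutoff)
    )

for k in range(6):
    assert abs(pY(k) - poisson(k, lam * p)) < 1e-9
print("Y is Poisson(lam * p)")
```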
-
Final exam:
Monday, Dec 10, 1:30-3:30 p.m.
Attendance:
You are required to attend class on those days when an
examination is being given;
attendance during other class periods is also
strongly encouraged.
You are fully responsible for the
material covered in each class, whether or not you attend.
Make-ups for missed exams will be given only if
there is a compelling reason for the absence,
which I know about beforehand
and can document independently of your testimony
(for example, via a note or a phone call from a
doctor or a parent).
You should come to class on time;
if you miss a quiz because you came late,
you won't be able to make up for it.
Homework:
It is absolutely essential
to solve a large number of problems on a regular basis!
Homework assignments will be given
regularly throughout the semester
and will be posted on this website.
Usually the homeworks will be due at the start
of class on Thursday.
Each homework will consist of several problems,
of which some pseudo-randomly chosen problems will be graded.
Your lowest homework grade will be dropped.
All homework should be written on 8.5"×11" paper
with your name clearly written, and should be stapled.
No late homework will be accepted!
You are encouraged to discuss the homework problems
with other students.
However, you have to write your solutions clearly
and in your own words - this is the only way to
achieve real understanding!
It is advisable that you first write a draft
of the solutions and then copy them neatly.
Please write the problems in the same order
in which they are given in the assignment.
Shortly after a homework assignment's due date,
solutions to the problems from that assignment
will be placed on restricted reserve in
the Chemistry-Mathematics Library in 207 PHSC.
Quizzes:
Short pop-quizzes will be given in class at random times;
your lowest quiz grade will be dropped.
Often the quizzes will use material
that has been covered very recently
(even in the previous lecture),
so you have to make every effort to keep up
with the material and to study the corresponding
sections from the book right after
they have been covered in class.
Exams:
There will be two midterms and a (comprehensive) final.
The approximate dates for the midterms are
September 27 and November 1.
The final is scheduled for Monday, December 10, 1:30-3:30 p.m.
All tests must be taken at the scheduled times,
except in extraordinary circumstances.
Please do not arrange travel plans that prevent you
from taking any of the exams at the scheduled time.
Grading:
Your grade will be determined by your performance
on the following coursework:
Pop-quizzes (lowest grade dropped): 15%
Homework (lowest grade dropped): 15%
Two in-class midterms: 20% each
Final Examination: 30%
Academic calendar for Fall 2007.
Policy on W/I Grades:
Through September 23, you can withdraw
from the course with an automatic W. In addition,
it is my policy to give
any student a W grade,
regardless of his/her performance in the course,
through the extended drop period that ends on December 7.
However, after October 29, you can only drop
via petition to the Dean of your college.
Such petitions are not often granted.
Furthermore, even if the petition
is granted, I will give you a grade
of "Withdrawn Failing" if you are
indeed failing at the time of your petition.
The grade of I (Incomplete)
is not intended to serve as
a benign substitute for the grade of F.
I only give the I grade
if a student has completed the majority
of the work in the course
(for example everything except the final exam),
the coursework cannot be completed
because of compelling and verifiable problems
beyond the student's control, and the student expresses a clear
intention of making up the missed work as soon as possible.
Academic Misconduct: All cases of suspected academic misconduct will
be referred to the Dean of the College of Arts and Sciences for prosecution
under the University's Academic Misconduct Code. The penalties can be quite
severe. Don't do it!
For more details on the University's
policies concerning academic misconduct see
http://www.ou.edu/provost/integrity/.
See also the Academic Misconduct Code,
which is a part of the Student Code
and can be found at
http://www.ou.edu/studentcode/.
Students With Disabilities:
The University of Oklahoma is committed to providing reasonable accommodation
for all students with disabilities. Students with disabilities who require
accommodations in this course are requested to speak with the instructor
as early in the semester as possible. Students with disabilities must be
registered with the Office of Disability Services prior to receiving
accommodations in this course. The Office of Disability Services is located
in Goddard Health Center, Suite 166: phone 405-325-3852 or TDD only
405-325-4173.