TOC
1. Probability spaces
  1.1 Definition
  1.2 Properties of a probability measure
  1.3 Sigma field
2. Integration with respect to a probability measure
  2.1 F-measurable
  2.2 Integration
  2.3 Convergence Theorems
3. Conditional Expectation
  3.1 Definition
  3.2 Important Properties
4. Martingales
  4.1 Intro and Definition
  4.2 Examples
5. Stopping times
  5.1 Definition
  5.2 Stopped process
  5.3 Doob's optional stopping theorem
  5.4 Doob's maximum inequality
1. Probability spaces
1.1 Definition
A probability space is a triple $(\Omega, \mathcal{F}, P)$, where
- $\Omega$ is called the sample space, consisting of all possible outcomes of the random experiment
- $\mathcal{F}$ is a collection of events (subsets of $\Omega$), called a $\sigma$-field (a collection of information)
- $P$ is a probability measure which assigns to each event $A$ a number $P(A) \in [0,1]$ (the probability of $A$) in such a way that $P(\Omega) = 1$ and:
- If $A_1, A_2, \dots$ is a sequence of disjoint (mutually exclusive) events, then $P\left(\bigcup_{n=1}^{\infty} A_n\right) = \sum_{n=1}^{\infty} P(A_n)$
Example 1
Choose a number randomly from the interval $[0,1]$.
In this case, $\Omega = [0,1]$. The basic events are sub-intervals $[a,b] \subset [0,1]$: $[a,b]$ is the event that the number we choose lies in the interval $[a,b]$. Choose $\mathcal{F}$ so that it contains all such intervals. For the probability $P$, we set $P([a,b]) = b - a$, the length of the interval.
Example 2
Toss a coin. $\Omega = \{H, T\}$, $\mathcal{F} = \{\emptyset, \{H\}, \{T\}, \Omega\}$. $P(A)$ is the probability of the event $A$. If the coin is fair, then $P(\{H\}) = P(\{T\}) = 1/2$.
1.2 Properties of a probability measure
Proposition 1: If $A_1, A_2, \dots$ is a sequence of events, then $P\left(\bigcup_{n=1}^{\infty} A_n\right) \le \sum_{n=1}^{\infty} P(A_n)$.
Proof: Let $B_1 = A_1$, $B_2 = A_2 \setminus A_1$, …, $B_n = A_n \setminus (A_1 \cup \cdots \cup A_{n-1})$. Then the $B_n$ are disjoint and $\bigcup_n B_n = \bigcup_n A_n$, so $P\left(\bigcup_n A_n\right) = \sum_n P(B_n)$. Moreover, $B_n \subset A_n$, so $P(B_n) \le P(A_n)$. Thus $P\left(\bigcup_n A_n\right) \le \sum_n P(A_n)$.
Proposition 2: If $A_n \uparrow A$, that is, $A_1 \subset A_2 \subset \cdots$ and $A = \bigcup_{n=1}^{\infty} A_n$, then $P(A_n) \to P(A)$ as $n \to \infty$.
Proof: Assume $A_n \uparrow A$. Set $B_1 = A_1$ and $B_n = A_n \setminus A_{n-1}$ for $n \ge 2$. Then $A_n = B_1 \cup \cdots \cup B_n$ and $A = \bigcup_{n=1}^{\infty} B_n$. Therefore, $(B_n)$ is a disjoint sequence. Thus
$$P(A) = \sum_{n=1}^{\infty} P(B_n) = \lim_{n\to\infty} \sum_{k=1}^{n} P(B_k) = \lim_{n\to\infty} P(A_n).$$
Similarly, if $A_n \downarrow A$, that is, $A_1 \supset A_2 \supset \cdots$ and $A = \bigcap_{n=1}^{\infty} A_n$, then $P(A_n) \to P(A)$ as $n \to \infty$.
1.3 Sigma field
A collection $\mathcal{F}$ of events (subsets of $\Omega$) (information) is called a $\sigma$-field if it satisfies the following properties:
- $\emptyset \in \mathcal{F}$ and $\Omega \in \mathcal{F}$
- if $A \in \mathcal{F}$, then $A^c \in \mathcal{F}$
- if $A_1, A_2, \dots$ is a sequence of events belonging to $\mathcal{F}$, then $\bigcup_{n=1}^{\infty} A_n$ also belongs to $\mathcal{F}$
Examples
- The smallest $\sigma$-field: $\{\emptyset, \Omega\}$
- The biggest $\sigma$-field: $2^{\Omega}$, the collection of all subsets of $\Omega$
- For any event $A$: $\{\emptyset, A, A^c, \Omega\}$ is a $\sigma$-field
Def 1.2 For any collection $D$ of events, the smallest $\sigma$-field that contains $D$ is called the $\sigma$-field generated by $D$, written as $\sigma(D)$.
Example: If $D = \{A\}$, then $\sigma(D) = \{\emptyset, A, A^c, \Omega\}$.
2. Integration with respect to a probability measure
We define the integral $\int_{\Omega} X \, dP$ of a random variable $X$ with respect to the probability measure $P$.
2.1 F-measurable
A random variable $X$ is said to be $\mathcal{F}$-measurable (an $\mathcal{F}$-random variable, or $\mathcal{F}$-determined) if the events of the form $\{a \le X \le b\}$ belong to $\mathcal{F}$ for all choices of $a, b$.
$X: \Omega \to \mathbb{R}$ is a function; $X$ is a random variable. "$X$ is $\mathcal{F}$-measurable" means that $\{\omega : a \le X(\omega) \le b\} \in \mathcal{F}$ for all $a, b$.
For example: Consider the coin tossing game with 2 rounds.
$\Omega = \{HH, HT, TH, TT\}$. $\mathcal{F}_1 = \{\emptyset, \Omega, \text{"first coin is head"}, \text{"first coin is tail"}\}$. Let $A$ = "second coin is head". Then $\mathbf{1}_A$ is not $\mathcal{F}_1$-measurable. But it is $\mathcal{F}_2$-measurable, where $\mathcal{F}_2 = 2^{\Omega}$ contains the information of both rounds.
Meaning: "$X$ is $\mathcal{F}$-measurable" means that $\mathcal{F}$ contains the information of $X$.
Properties:
- If $X$ and $Y$ are $\mathcal{F}$-measurable, then so are $X + Y$, $X - Y$, $XY$ and $X/Y$ (when $Y \ne 0$)
- If $X_1, X_2, \dots$ are $\mathcal{F}$-measurable, then so are $\sup_n X_n$, $\inf_n X_n$, $\limsup_n X_n$ and $\liminf_n X_n$
- If $X_1, X_2, \dots$ are $\mathcal{F}$-measurable and $X_n \to X$, then $X$ is $\mathcal{F}$-measurable
2.2 Integration
Define the integral $E[X] = \int_{\Omega} X \, dP$ in three steps:
1. Integration of a non-negative discrete random variable.
- Let $X$ be a non-negative discrete random variable with values $x_1, x_2, \dots$
- In this case, we define that
$$E[X] = \int_{\Omega} X \, dP = \sum_{i} x_i \, P(X = x_i)$$
$E[X]$ is called the integral of $X$ with respect to $P$.
2. Integration of a non-negative random variable
Let $X$ be a non-negative random variable. For $n \ge 1$, set
$$A_{n,k} = \left\{ \frac{k}{2^n} \le X < \frac{k+1}{2^n} \right\}, \qquad k = 0, 1, 2, \dots$$
Then $\{A_{n,k} : k \ge 0\}$ forms a partition of the sample space $\Omega$, that is, $\Omega = \bigcup_{k \ge 0} A_{n,k}$. Define the $n$-th approximation of $X$ by
$$X_n = \sum_{k \ge 0} \frac{k}{2^n} \mathbf{1}_{A_{n,k}}$$
so that
$$X_n \le X \le X_n + \frac{1}{2^n}$$
for all $n$. Thus for every $\omega$, $X_n(\omega) \to X(\omega)$ as $n \to \infty$. Since the dyadic partition at level $n+1$ refines the one at level $n$, it holds that $X_n \le X_{n+1}$ for all $n$.
Hence $E[X_n]$ is non-decreasing and $\lim_{n\to\infty} E[X_n]$ exists (possibly infinite). Since each $X_n$ is discrete, $E[X_n]$ is already defined as above. Now we define
$$E[X] = \int_{\Omega} X \, dP = \lim_{n \to \infty} E[X_n]$$
(a numerical sketch of this approximation follows step 3 below).
Note that $E[X] = +\infty$ is allowed in the definition.
3. Integration of a random variable (general case)
- Let $X$ be a given random variable
- In this case, we define that $X^+ = \max(X, 0)$ and $X^- = \max(-X, 0)$
Note that $X = X^+ - X^-$ and $|X| = X^+ + X^-$, with $X^+, X^- \ge 0$.
Define:
$$E[X] = E[X^+] - E[X^-]$$
if at least one of them is finite.
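As a numerical illustration of step 2, here is a minimal Python sketch (the choice $X(\omega) = \omega^2$ on $\Omega = [0,1]$ with $P$ the length measure is a hypothetical example, not one from the lecture): the dyadic approximations $E[X_n]$ increase towards $E[X] = 1/3$.

```python
import numpy as np

# A minimal numerical sketch of step 2. Omega = [0,1] with P = length measure;
# X(omega) = omega**2 is a hypothetical non-negative random variable with E[X] = 1/3.
omega = np.linspace(0.0, 1.0, 1_000_001)   # fine grid standing in for Omega
X = omega**2

def dyadic_approximation(x, n):
    """n-th approximation X_n = k/2^n on the event {k/2^n <= X < (k+1)/2^n}."""
    return np.floor(x * 2**n) / 2**n

for n in [1, 2, 4, 8, 16]:
    X_n = dyadic_approximation(X, n)
    # E[X_n]: averaging over the grid approximates the integral w.r.t. the length measure
    print(f"n = {n:2d}   E[X_n] = {X_n.mean():.6f}")
print("E[X]  = 1/3 =", 1 / 3)
```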
Definition: Integrable
A random variable $X$ is said to be integrable if $E[X^+]$ and $E[X^-]$ are finite.
Since $|X| = X^+ + X^-$, thus
$X$ is integrable $\iff E[|X|] < \infty$.
Example: For an event $A$, define
$$\mathbf{1}_A(\omega) = \begin{cases} 1, & \omega \in A \\ 0, & \omega \notin A \end{cases}$$
$\mathbf{1}_A$ is called the indicator of $A$ (non-negative & discrete).
Then $E[\mathbf{1}_A] = 1 \cdot P(A) + 0 \cdot P(A^c) = P(A)$.
For any event $A$, define
$$\int_A X \, dP = E[X \mathbf{1}_A]$$
as the integration of $X$ on the event $A$.
Properties of integration
If $X, Y$ are integrable, then:
- $E[aX + bY] = aE[X] + bE[Y]$, where $a, b$ are constants
- If $X \le Y$, then $E[X] \le E[Y]$
- If $X \ge 0$, then $E[X] \ge 0$; moreover, if $E[X] = 0$, then $X = 0$ almost surely
- If $|X| \le Y$ and $Y$ is integrable, then $X$ is also integrable
2.3 Convergence Theorems
Suppose $X_n(\omega) \to X(\omega)$ for each $\omega$. It is not true in general that $E[X_n] \to E[X]$.
Example: Let $\Omega = [0,1]$ and let $P$ be the generalized length (Lebesgue measure). Define $X_n = n \mathbf{1}_{(0, 1/n)}$.
Then $X_n(\omega) \to 0$ for every $\omega$. However, $E[X_n] = n \cdot \tfrac{1}{n} = 1 \not\to 0$.
The convergence $E[X_n] \to E[X]$ does hold if certain conditions are satisfied.
Monotone Convergence Theorem
Suppose $X, X_1, X_2, \dots$ are non-negative random variables with
- $0 \le X_1 \le X_2 \le \cdots$, and
- $X_n \to X$ as $n \to \infty$; then
$$\lim_{n\to\infty} E[X_n] = E[X]$$
Example: Let $\Omega = [0,1]$, and let $P$ be the generalized length.
Define an increasing sequence of non-negative random variables $X_n$ with limit $X$.
Then clearly, $0 \le X_1 \le X_2 \le \cdots$ and $X_n \to X$.
By using the Monotone Convergence Theorem, we conclude $\lim_{n\to\infty} E[X_n] = E[X]$.
Dominated Convergence Theorem
Let $X, X_1, X_2, \dots$ be random variables such that $X_n \to X$ as $n \to \infty$. In addition, suppose there is a fixed integrable random variable $Y$ such that $|X_n| \le Y$ for all $n$. Then
$$\lim_{n\to\infty} E[X_n] = E[X]$$
Example: Let $\Omega = [0,1]$, and let $P$ be the generalized length.
Define a sequence $X_n$ with pointwise limit $X$.
From the definition, $X_n \to X$ as $n \to \infty$.
On the other hand, $|X_n| \le Y$ for a fixed integrable random variable $Y$.
Using the Dominated Convergence Theorem, $\lim_{n\to\infty} E[X_n] = E[X]$.
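A small numerical sketch of both theorems (the random variables below are my own hypothetical choices, not the ones defined in the lecture): on $\Omega = [0,1]$ with $P$ the length measure, $X_n = \min(1/\sqrt{\omega}, n)$ increases to $X = 1/\sqrt{\omega}$ with $E[X] = 2$ (monotone convergence), while $Y_n = \omega^{1/n} \to 1$ is dominated by the integrable bound $1$ (dominated convergence).

```python
import numpy as np

rng = np.random.default_rng(0)
omega = rng.uniform(size=1_000_000)   # samples from Omega = [0,1], P = length measure

# Monotone convergence: X_n = min(1/sqrt(omega), n) increases to X = 1/sqrt(omega),
# whose integral is E[X] = 2, so E[X_n] should increase towards 2.
X = 1.0 / np.sqrt(omega)
for n in [1, 2, 5, 10, 100, 1000]:
    print(f"MCT: n = {n:5d}   E[X_n] = {np.minimum(X, n).mean():.4f}   (limit E[X] = 2)")

# Dominated convergence: Y_n = omega**(1/n) -> 1 pointwise on (0,1], with |Y_n| <= 1,
# an integrable dominating bound, so E[Y_n] -> 1.
for n in [1, 2, 5, 10, 100, 1000]:
    print(f"DCT: n = {n:5d}   E[Y_n] = {(omega ** (1.0 / n)).mean():.4f}   (limit 1)")
```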
3. Conditional Expectation
3.1 Definition
Given two events $A, B$, $P(A \mid B) = \dfrac{P(A \cap B)}{P(B)}$ is the conditional probability of $A$ given $B$.
Let $(\Omega, \mathcal{F}, P)$ be a probability space. Let $\mathcal{G} \subset \mathcal{F}$ be a $\sigma$-field (a collection of events, i.e. information). Given two events $A, B$, we know how to calculate the conditional probability $P(A \mid B)$.
We now define the conditional expectation of $X$ given $\mathcal{G}$, denoted by $E[X \mid \mathcal{G}]$, which will again be a random variable, regarded as the best estimate of $X$ based on the information provided by $\mathcal{G}$.
(Definition) A random variable $Z$ is called the conditional expectation of $X$ given $\mathcal{G}$, written as $Z = E[X \mid \mathcal{G}]$, if
- $Z$ is $\mathcal{G}$-measurable (determined by $\mathcal{G}$)
- $\int_A Z \, dP = \int_A X \, dP$ for any event $A \in \mathcal{G}$
This means that the two random variables $Z$ and $X$ have the same average (integration) over any event in $\mathcal{G}$.
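A minimal sketch of this definition on a finite space (Python; the two-round coin toss and the payoff $X$ = number of heads are hypothetical illustration choices): take $\mathcal{G}$ generated by the first toss, build $E[X \mid \mathcal{G}]$ by averaging $X$ over each atom of $\mathcal{G}$, and check the defining integral property.

```python
from itertools import product

# Omega: two fair coin tosses; each outcome has probability 1/4.
Omega = [''.join(w) for w in product('HT', repeat=2)]   # ['HH', 'HT', 'TH', 'TT']
P = {w: 0.25 for w in Omega}
X = {w: w.count('H') for w in Omega}    # hypothetical payoff X = number of heads

# G = sigma-field generated by the first toss; its atoms are {H*} and {T*}.
atoms = [[w for w in Omega if w[0] == c] for c in 'HT']

# E[X | G] is constant on each atom: the P-weighted average of X over that atom.
Z = {}
for atom in atoms:
    avg = sum(P[w] * X[w] for w in atom) / sum(P[w] for w in atom)
    for w in atom:
        Z[w] = avg
print("E[X | G] =", Z)    # 1.5 on the first-toss-heads atom, 0.5 on the other

# Defining property: Z and X have the same integral over every event A in G.
for A in [[], atoms[0], atoms[1], Omega]:
    lhs = sum(P[w] * Z[w] for w in A)
    rhs = sum(P[w] * X[w] for w in A)
    print(f"A = {A}:  int_A Z dP = {lhs:.2f},  int_A X dP = {rhs:.2f}")
```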
3.2 Important Properties
- Linearity: $E[aX + bY \mid \mathcal{G}] = a E[X \mid \mathcal{G}] + b E[Y \mid \mathcal{G}]$ for constants $a, b$
- Taking the expectation: $E\big[E[X \mid \mathcal{G}]\big] = E[X]$
- If $Y$ is a random variable determined by $\mathcal{G}$ ($\mathcal{G}$-measurable), then $E[XY \mid \mathcal{G}] = Y \, E[X \mid \mathcal{G}]$ ($Y$ can be taken out as a constant)
In particular, $E[Y \mid \mathcal{G}] = Y$
- If $X$ is independent of $\mathcal{G}$, then $E[X \mid \mathcal{G}] = E[X]$
- If $\mathcal{G}_1$ is another $\sigma$-field (which is smaller than $\mathcal{G}$) such that $\mathcal{G}_1 \subset \mathcal{G}$, then
$$E\big[E[X \mid \mathcal{G}] \mid \mathcal{G}_1\big] = E\big[E[X \mid \mathcal{G}_1] \mid \mathcal{G}\big] = E[X \mid \mathcal{G}_1],$$
that is, if we take the conditional expectation twice, we end up with the smaller $\sigma$-field
Proof of Properties
(2)
Let $A = \Omega$. Since $\Omega \in \mathcal{G}$, we have
$$E\big[E[X \mid \mathcal{G}]\big] = \int_{\Omega} E[X \mid \mathcal{G}] \, dP = \int_{\Omega} X \, dP = E[X]$$
(the definition of the conditional expectation).
(4)
Assume $X$ is independent of $\mathcal{G}$. Let $Z = E[X]$ (a constant). To prove $Z = E[X \mid \mathcal{G}]$, we need to check (i) and (ii):
- Since $Z = E[X]$ is a constant, it is certainly $\mathcal{G}$-measurable (determined by $\mathcal{G}$)
- For any $A \in \mathcal{G}$, we need to show that $\int_A X \, dP = \int_A E[X] \, dP$. Since $X$ is independent of $\mathcal{G}$, it follows that
$$\int_A X \, dP = E[X \mathbf{1}_A] = E[X] \, E[\mathbf{1}_A] = E[X] \, P(A) = \int_A E[X] \, dP$$
(5)
Prove $E\big[E[X \mid \mathcal{G}] \mid \mathcal{G}_1\big] = E[X \mid \mathcal{G}_1]$.
Let $Y = E[X \mid \mathcal{G}]$ and $Z = E[X \mid \mathcal{G}_1]$. We need to show that $Z = E[Y \mid \mathcal{G}_1]$; check two things:
- $Z$ is $\mathcal{G}_1$-measurable. Since $Z = E[X \mid \mathcal{G}_1]$, by definition, $Z$ is $\mathcal{G}_1$-measurable
- Check $\int_A Z \, dP = \int_A Y \, dP$ for every $A \in \mathcal{G}_1$.
By definition, $\int_A Z \, dP = \int_A X \, dP$ for every $A \in \mathcal{G}_1$.
Since $\mathcal{G}_1 \subset \mathcal{G}$, we have $A \in \mathcal{G}$; because $Y = E[X \mid \mathcal{G}]$, thus $\int_A Y \, dP = \int_A X \, dP = \int_A Z \, dP$.
Prove $E\big[E[X \mid \mathcal{G}_1] \mid \mathcal{G}\big] = E[X \mid \mathcal{G}_1]$: because $E[X \mid \mathcal{G}_1]$ is $\mathcal{G}$-measurable (as $\mathcal{G}_1 \subset \mathcal{G}$), the equation is valid.
4. Martingales
4.1 Intro and Definition
Martingales are mathematical models for fair games.
For a family of random variables $\{X_t, t \in T\}$, denote by $\sigma(X_t, t \in T)$ the smallest $\sigma$-field containing the events of the form $\{a \le X_t \le b\}$ for all choices of $t, a, b$.
$\sigma(X_t, t \in T)$ is called the $\sigma$-field generated by $\{X_t, t \in T\}$. Random variables determined by $\sigma(X_t, t \in T)$ are functions of the $X_t$, $t \in T$.
Example: Consider a series of games decided by the tosses of a coin, in which in each round we either win \$1 with probability $p$ or lose \$1 with probability $1-p$. Let $X_n$ denote the net gain in the $n$-th round. Then $X_1, X_2, \dots$ are independent random variables with
$$P(X_n = 1) = p, \qquad P(X_n = -1) = 1 - p,$$
and so $E[X_n] = 2p - 1$.
The total net gain after the $n$-th round is given by $S_n = X_1 + \cdots + X_n$ and $E[S_n] = n(2p - 1)$.
Let $\mathcal{F}_n$ denote the $\sigma$-field generated by $X_1, \dots, X_n$ ($\mathcal{F}_n$ contains all the information up to round $n$). Then $\mathcal{F}_1 \subset \mathcal{F}_2 \subset \cdots$, and $\mathcal{F}_n$ can be regarded as the history of the games up to time $n$ (the $n$-th round). The average gain after the $(n+1)$-th round given the history up to time $n$ is
$$E[S_{n+1} \mid \mathcal{F}_n] = E[S_n + X_{n+1} \mid \mathcal{F}_n] = S_n + E[X_{n+1} \mid \mathcal{F}_n] = S_n + E[X_{n+1}] = S_n + 2p - 1,$$
since $S_n$ is determined by $\mathcal{F}_n$ ($\mathcal{F}_n$-measurable) and $X_{n+1}$ is independent of $\mathcal{F}_n$.
Thus
$$E[S_{n+1} \mid \mathcal{F}_n] \begin{cases} = S_n & \text{if } p = 1/2 \\ \ge S_n & \text{if } p \ge 1/2 \\ \le S_n & \text{if } p \le 1/2 \end{cases}$$
In the first case, the game is fair and $(S_n)$ is called a martingale. In the second case, it is called a submartingale. In the third case, it is called a supermartingale.
Definition: A sequence of random variables $(M_n)_{n \ge 1}$ is said to be a martingale (submartingale, or supermartingale) with respect to an increasing sequence of $\sigma$-fields $\mathcal{F}_1 \subset \mathcal{F}_2 \subset \cdots$ if:
- $M_n$ is determined by $\mathcal{F}_n$ ($\mathcal{F}_n$-measurable)
- $M_n$ is integrable, i.e. $E[|M_n|] < \infty$
- sequence correlation:
- Martingale: $E[M_{n+1} \mid \mathcal{F}_n] = M_n$
- Submartingale: $E[M_{n+1} \mid \mathcal{F}_n] \ge M_n$
- Supermartingale: $E[M_{n+1} \mid \mathcal{F}_n] \le M_n$
Immediate properties
- The notion of a martingale is a mathematical formulation of a fair game
- If $(M_n)$ is a martingale, then $E[M_{n+1}] = E\big[E[M_{n+1} \mid \mathcal{F}_n]\big] = E[M_n]$
The expectation remains constant: $E[M_n] = E[M_1]$ for all $n$
- If $(M_n)$ is a martingale, then we have
$$E[M_{n+2} \mid \mathcal{F}_n] = E\big[E[M_{n+2} \mid \mathcal{F}_{n+1}] \mid \mathcal{F}_n\big] = E[M_{n+1} \mid \mathcal{F}_n] = M_n$$
More generally, it holds that $E[M_m \mid \mathcal{F}_n] = M_n$ for any $m > n$.
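A short simulation sketch of the fair case $p = 1/2$ (the path count and horizon are arbitrary choices): the sample mean of $S_n$ stays near $E[S_1] = 0$ for every $n$, as the constant-expectation property predicts.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_rounds = 100_000, 50

# Fair game (p = 1/2): each round we win +1 or lose -1 with equal probability.
steps = rng.choice([-1, 1], size=(n_paths, n_rounds))
S = steps.cumsum(axis=1)          # S_n = X_1 + ... + X_n, the net gain after n rounds

# For a martingale, E[S_n] = E[S_1] = 0 for every n; sample means should hover near 0.
for n in [1, 10, 25, 50]:
    print(f"n = {n:2d}   sample mean of S_n = {S[:, n - 1].mean():+.4f}")
```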
4.2 Examples
Ex1: Let $X_1, X_2, \dots$ be independent, integrable random variables with $E[X_n] = 0$ for all $n$. Prove that $S_n = X_1 + \cdots + X_n$ is a martingale with respect to $\mathcal{F}_n = \sigma(X_1, \dots, X_n)$.
Proof:
(i) Since $S_n$ is a function of $X_1, \dots, X_n$, it is $\mathcal{F}_n$-determined.
(ii) $E[|S_n|] \le E[|X_1|] + \cdots + E[|X_n|] < \infty$, so $S_n$ is integrable.
(iii) $E[S_{n+1} \mid \mathcal{F}_n] = E[S_n + X_{n+1} \mid \mathcal{F}_n] = S_n + E[X_{n+1} \mid \mathcal{F}_n] = S_n + E[X_{n+1}] = S_n$.
So $(S_n)$ is a martingale.
Ex2: Let $X$ be an integrable random variable and $\mathcal{F}_1 \subset \mathcal{F}_2 \subset \cdots$ be a sequence of increasing $\sigma$-fields. Define $M_n = E[X \mid \mathcal{F}_n]$. Then $(M_n)$ is a martingale w.r.t. $(\mathcal{F}_n)$.
Proof:
(i) By the definition of the conditional expectation, $M_n = E[X \mid \mathcal{F}_n]$ is $\mathcal{F}_n$-measurable.
(ii) $E[|M_n|] = E\big[|E[X \mid \mathcal{F}_n]|\big] \le E\big[E[|X| \mid \mathcal{F}_n]\big] = E[|X|] < \infty$, thus $M_n$ is integrable.
(iii) $E[M_{n+1} \mid \mathcal{F}_n] = E\big[E[X \mid \mathcal{F}_{n+1}] \mid \mathcal{F}_n\big] = E[X \mid \mathcal{F}_n] = M_n$ (tower property).
Ex3: If $X_1, X_2, \dots$ are independent, integrable random variables with $E[X_n] = 1$ for all $n$, then
$$M_n = X_1 X_2 \cdots X_n$$
is a martingale w.r.t. $\mathcal{F}_n = \sigma(X_1, \dots, X_n)$.
Proof:
(i) $M_n$ is a function of $X_1, \dots, X_n$, thus is $\mathcal{F}_n$-measurable.
(ii) Since the $X_i$ are integrable and independent, $E[|M_n|] = E[|X_1|] \cdots E[|X_n|] < \infty$, thus $M_n$ is integrable.
(iii) $E[M_{n+1} \mid \mathcal{F}_n] = E[M_n X_{n+1} \mid \mathcal{F}_n] = M_n E[X_{n+1} \mid \mathcal{F}_n] = M_n E[X_{n+1}] = M_n$.
Ex4: Let $S_n = X_1 + \cdots + X_n$ be the net gain process in a series of fair games. Then $P(X_n = 1) = P(X_n = -1) = 1/2$, and $E[X_n] = 0$, $E[X_n^2] = 1$. We already know that $(S_n)$ is a martingale w.r.t. $\mathcal{F}_n = \sigma(X_1, \dots, X_n)$. Prove that $M_n = S_n^2 - n$ is also a martingale w.r.t. $(\mathcal{F}_n)$.
Proof:
(i) Since $M_n = S_n^2 - n$ is a function of $X_1, \dots, X_n$, $M_n$ is $\mathcal{F}_n$-measurable.
(ii) As $E[|M_n|] \le E[S_n^2] + n = 2n < \infty$, $M_n$ is integrable.
(iii) $E[M_{n+1} \mid \mathcal{F}_n] = E\big[(S_n + X_{n+1})^2 - (n+1) \mid \mathcal{F}_n\big] = S_n^2 + 2 S_n E[X_{n+1}] + E[X_{n+1}^2] - (n+1) = S_n^2 + 1 - (n+1) = S_n^2 - n$.
Thus $E[M_{n+1} \mid \mathcal{F}_n] = M_n$, and $(M_n)$ is a martingale.
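A quick simulation check of Ex4 (a sketch with arbitrary parameters): since $M_n = S_n^2 - n$ is a martingale, its sample mean should stay near $E[M_1] = 0$ for all $n$.

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_rounds = 200_000, 40

steps = rng.choice([-1, 1], size=(n_paths, n_rounds))   # fair +/-1 games
S = steps.cumsum(axis=1)
n = np.arange(1, n_rounds + 1)
M = S**2 - n                                            # Ex4: M_n = S_n^2 - n

# Since (M_n) is a martingale, E[M_n] = E[M_1] = 0 for every n.
for k in [1, 5, 20, 40]:
    print(f"n = {k:2d}   sample mean of S_n^2 - n = {M[:, k - 1].mean():+.4f}")
```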
Ex5: Let $X_1, X_2, \dots$ be a sequence of independent random variables. Set $S_n = X_1 + \cdots + X_n$. Define $M_n$ (a given function of $S_n$ and $n$). Prove $(M_n)$ is a martingale w.r.t. $\mathcal{F}_n = \sigma(X_1, \dots, X_n)$.
Proof:
(i) Because $M_n$ is a function of $X_1, \dots, X_n$, $M_n$ is $\mathcal{F}_n$-measurable.
(ii) $E[|M_n|] < \infty$, thus $M_n$ is integrable.
(iii) $E[M_{n+1} \mid \mathcal{F}_n] = M_n$ (condition 3), computed as in Ex1 and Ex4.
5. Stopping times
Example: Let $S_n$ be the net gain after the $n$-th round in a series of games. Let $T$ be the first time at which the net gain reaches 100, i.e.
$$T = \min\{n : S_n = 100\}$$
Such random times are called stopping times.
5.1 Definition
Let $\mathcal{F}_1 \subset \mathcal{F}_2 \subset \cdots$ be an increasing sequence of $\sigma$-fields. A random variable $T$ taking values in the set $\{1, 2, 3, \dots\} \cup \{+\infty\}$ is called a stopping time (or optional time) w.r.t. $(\mathcal{F}_n)$ if for every $n$
$$\{T = n\} \in \mathcal{F}_n$$
Example
Let $S_n$ be the net gain process in a series of games. Set $\mathcal{F}_n = \sigma(X_1, \dots, X_n)$.
Define
$$T = \min\{n : S_n = 100\}$$
Then $T$ is a stopping time w.r.t. $(\mathcal{F}_n)$.
Indeed, for every $n$,
$$\{T = n\} = \{S_1 \ne 100, \dots, S_{n-1} \ne 100, S_n = 100\} \in \mathcal{F}_n$$
5.2 Stopped process
Let $T$ be a stopping time. Let $(M_n)$ be a sequence of random variables. Set $T \wedge n = \min(T, n)$. Define
$$M_n^T = M_{T \wedge n}$$
$(M_n^T)$ is called the stopped process of $(M_n)$ at the stopping time $T$.
Proposition: If $(M_n)$ is a martingale w.r.t. $(\mathcal{F}_n)$, then the stopped process $(M_{T \wedge n})$ is also a martingale (stopped martingale) w.r.t. $(\mathcal{F}_n)$. In particular, $E[M_{T \wedge n}] = E[M_1]$ for all $n$.
Proof
Note that
$$M_{T \wedge n} = \sum_{k=1}^{n-1} M_k \mathbf{1}_{\{T = k\}} + M_n \mathbf{1}_{\{T \ge n\}}$$
This immediately implies that $M_{T \wedge n}$ is $\mathcal{F}_n$-measurable and integrable. It remains to check property (iii) in the definition of a martingale. We claim
$$M_{T \wedge (n+1)} - M_{T \wedge n} = (M_{n+1} - M_n) \mathbf{1}_{\{T > n\}}$$
That is, if $T \le n$, the left side is equal to the right side, which is $0$; while if $T > n$, then the left side is also equal to the right side, which is $M_{n+1} - M_n$. Thus
$$E[M_{T \wedge (n+1)} - M_{T \wedge n} \mid \mathcal{F}_n] = E\big[(M_{n+1} - M_n) \mathbf{1}_{\{T > n\}} \mid \mathcal{F}_n\big]$$
Since $\{T > n\} = \{T \le n\}^c \in \mathcal{F}_n$, it follows that
$$E\big[(M_{n+1} - M_n) \mathbf{1}_{\{T > n\}} \mid \mathcal{F}_n\big] = \mathbf{1}_{\{T > n\}} E[M_{n+1} - M_n \mid \mathcal{F}_n] = 0$$
Thus $E[M_{T \wedge (n+1)} \mid \mathcal{F}_n] = M_{T \wedge n}$, and $(M_{T \wedge n})$ is a martingale (stopped martingale).
Example
Let $T = \max\{n \le 100 : S_n = 0\}$, the last time in $[0, 100]$ at which the net gain is $0$. Is it a stopping time?
$\{T = 3\}$ = {3 is the last time in $[0, 100]$ such that $S_n = 0$} = $\{S_3 = 0, S_4 \ne 0, \dots, S_{100} \ne 0\}$ = $\{S_3 = 0\} \cap \{S_4 \ne 0\} \cap \cdots \cap \{S_{100} \ne 0\} \notin \mathcal{F}_3$. Thus $T$ is not a stopping time.
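A simulation sketch of the stopped martingale (the hitting level 10 and the horizon are hypothetical choices): with $T$ the first time $S_n$ hits the level, $E[S_{T \wedge n}]$ stays near $E[S_1] = 0$ at every fixed $n$, even though many paths have already been frozen at the level.

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, horizon, level = 100_000, 200, 10

steps = rng.choice([-1, 1], size=(n_paths, horizon))
S = steps.cumsum(axis=1)                      # fair-game net gain S_1, ..., S_200

# T = first n with S_n = level (a stopping time); infinity if not hit within the horizon.
hit = S >= level
T = np.where(hit.any(axis=1), hit.argmax(axis=1) + 1, np.inf)

# Stopped process S_{T ∧ n}: freeze each path at the level once it has been reached.
stopped = S.copy()
for i in range(n_paths):
    if np.isfinite(T[i]):
        stopped[i, int(T[i]) - 1:] = level

# The stopped process is still a martingale: E[S_{T ∧ n}] = E[S_1] = 0 for every n.
for n in [1, 50, 100, 200]:
    print(f"n = {n:3d}   sample mean of S_(T∧n) = {stopped[:, n - 1].mean():+.4f}")
```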
5.3 Doob's optional stopping theorem
In general, $E[M_T] = E[M_1]$ will not be true. For example, in a series of fair games, define $T = \min\{n : S_n = 100\}$; then $E[S_T] = 100 \ne 0 = E[S_1]$. However, the following theorem holds.
Theorem: Let $(M_n)$ be a martingale and $T$ an almost surely finite stopping time. In each of the following three cases, we have $E[M_T] = E[M_1]$:
- $T$ is bounded, i.e. there is an integer $N$ such that $T \le N$ almost surely
- The sequence $(M_{T \wedge n})$ is bounded, in the sense that there is a constant $K$ such that $|M_{T \wedge n}| \le K$ for all $n$
- $E[T] < \infty$ and the steps are bounded, i.e. there is a constant $K$ such that $|M_{n+1} - M_n| \le K$ for all $n$
Proof
We use the dominated convergence theorem. First of all, $M_{T \wedge n} \to M_T$ almost surely as $n \to \infty$, since $T < \infty$ almost surely. Since the stopped process $(M_{T \wedge n})$ is a martingale, we always have $E[M_{T \wedge n}] = E[M_1]$.
- Since $T \le N$, we have $M_{T \wedge N} = M_T$ and thus $E[M_T] = E[M_{T \wedge N}] = E[M_1]$.
- Since $|M_{T \wedge n}| \le K$ is bounded for all $n$, it follows from the dominated convergence theorem that $E[M_T] = \lim_{n\to\infty} E[M_{T \wedge n}] = E[M_1]$.
- Write
$$M_{T \wedge n} = M_1 + \sum_{k=2}^{T \wedge n} (M_k - M_{k-1})$$
Consequently,
$$|M_{T \wedge n}| \le |M_1| + K T,$$
and the right-hand side is integrable because $E[T] < \infty$. By virtue of the dominated convergence theorem, we have $E[M_T] = \lim_{n\to\infty} E[M_{T \wedge n}] = E[M_1]$.
Application of the Theorem
Gambler's ruin problem
Gamblers $A$ and $B$ play a series of games against each other in which a fair coin is tossed repeatedly. In each game gambler $A$ wins or loses \$1 according as the toss results in a head or a tail, respectively. The initial capital of gambler $A$ is \$a and that of gambler $B$ is \$b, and they continue playing until one of them is ruined. Determine the probability that $A$ will be ruined and also the expected number of games played.
Let $S_n$ be the fortune of gambler $A$ after the $n$-th game. Then
$$S_n = a + X_1 + \cdots + X_n,$$
where $X_1, X_2, \dots$ are independent r.v.'s with $P(X_i = 1) = P(X_i = -1) = \tfrac12$.
The game will stop at the time gambler $A$ or $B$ is ruined, i.e. the game stops at
$$T = \min\{n : S_n = 0 \text{ or } S_n = a + b\}$$
$(S_n)$ forms a martingale (with $S_0 = a$). Since $S_T = 0$ or $S_T = a + b$, we have
$$E[S_T] = 0 \cdot P(S_T = 0) + (a + b) \, P(S_T = a + b).$$
Note that $\{S_T = 0\}$ is the event that $A$ is ruined, so $P(S_T = a + b) = 1 - P(A \text{ is ruined})$.
Since $0 \le S_{T \wedge n} \le a + b$ for all $n$, it follows from Doob's theorem that $E[S_T] = E[S_0] = a$, which is
$$(a + b)\big(1 - P(A \text{ is ruined})\big) = a.$$
Thus
$$P(A \text{ is ruined}) = \frac{b}{a + b}.$$
Next we try to find the expected number of games, $E[T]$.
Define $M_n = (S_n - a)^2 - n$. It has been proved (Ex4) that $(M_n)$ is a martingale. So the stopped process $(M_{T \wedge n})$ is also a martingale. Thus we have
$$E\big[(S_{T \wedge n} - a)^2\big] - E[T \wedge n] = E[M_{T \wedge n}] = E[M_0] = 0, \qquad \text{i.e.} \quad E\big[(S_{T \wedge n} - a)^2\big] = E[T \wedge n].$$
This yields, by the convergence theorems, that
$$E[T] = E\big[(S_T - a)^2\big] = a^2 \cdot \frac{b}{a + b} + b^2 \cdot \frac{a}{a + b} = ab.$$
($E[T \wedge n] \to E[T]$ uses the monotone convergence theorem, and $E[(S_{T \wedge n} - a)^2] \to E[(S_T - a)^2]$ uses the dominated convergence theorem, since $0 \le (S_{T \wedge n} - a)^2 \le (a + b)^2$.)
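A Monte Carlo sketch of both answers (Python; $a = 3$, $b = 7$ are arbitrary sample values): the simulated ruin probability and expected game length should be close to $b/(a+b) = 0.7$ and $ab = 21$.

```python
import numpy as np

rng = np.random.default_rng(4)
a, b = 3, 7                       # initial capitals of A and B (arbitrary sample values)
n_games = 20_000

ruined, lengths = 0, []
for _ in range(n_games):
    fortune, t = a, 0
    while 0 < fortune < a + b:    # play until A is ruined (0) or B is ruined (a + b)
        fortune += 1 if rng.random() < 0.5 else -1
        t += 1
    ruined += (fortune == 0)
    lengths.append(t)

print(f"P(A ruined): simulated {ruined / n_games:.4f}   theory b/(a+b) = {b/(a+b):.4f}")
print(f"E[T]:        simulated {np.mean(lengths):.2f}    theory ab = {a * b}")
```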
Example 1
Write $X_i$ for the independent steps of the walk. Then
$$P(X_i = 1) = p, \qquad P(X_i = -1) = q = 1 - p, \qquad p \ne q,$$
with $E\big[(q/p)^{X_i}\big] = p \cdot \tfrac{q}{p} + q \cdot \tfrac{p}{q} = 1$.
Define $M_n = (q/p)^{S_n}$. Then, as we have seen before (Ex3), $(M_n)$ is a martingale w.r.t. $\mathcal{F}_n$. Let $T$ be the first time at which the random walk hits $0$ or $N$. Since $(M_{T \wedge n})$ is bounded, by Doob's optional theorem
$$E[M_T] = E[M_0] = (q/p)^{S_0}.$$
On the other hand,
$$E[M_T] = 1 \cdot P(S_T = 0) + (q/p)^{N} \cdot P(S_T = N).$$
Therefore,
$$P(S_T = 0) = \frac{(q/p)^{S_0} - (q/p)^{N}}{1 - (q/p)^{N}}.$$
Example 2
Doubling Strategy: In each round of a fair game, bet on a coin toss; start by betting \$1, double the bet after every loss, and stop the game as soon as the total gain equals 1 (i.e. at the first win).
Thus $P(X_n = 1) = P(X_n = -1) = \tfrac12$, and the stake in round $n$ is $2^{n-1}$. Define $M_n$ = net gain after $n$ rounds, which means
$$M_n = \begin{cases} 1 & \text{if a win has occurred by round } n \\ -(2^n - 1) & \text{if the first } n \text{ rounds are all losses} \end{cases}$$
Is $(M_n)$ a martingale? Yes:
- By definition, $M_n$ is $\mathcal{F}_n$-measurable ($M_n$ is an expression of $X_1, \dots, X_n$)
- $E[|M_n|] \le 2^n < \infty$, thus $M_n$ is integrable
- $E[M_{n+1} \mid \mathcal{F}_n] = M_n$, since each round is fair
Thus $(M_n)$ is a martingale.
Is Doob's Optional Stopping theorem applicable? No.
Let $T = \min\{n : X_n = 1\}$ be the round of the first win (the only circumstance in which the game ends). Then $M_T = 1$ almost surely, so $E[M_T] = 1 \ne 0 = E[M_1]$.
While $E[M_{T \wedge n}] = E[M_1] = 0$, because $(M_{T \wedge n})$ is a stopped martingale. Why does the theorem not apply:
- $T$ is not bounded; that is, $T$ can be any finite number
- $(M_{T \wedge n})$ is not bounded: on the event $\{T > n\}$, $M_{T \wedge n} = -(2^n - 1)$, which can go to $-\infty$
- The steps $|M_{n+1} - M_n|$ are not bounded, since the stakes $2^n$ are not bounded.
Some properties of the game
- Almost surely win \$1 if playing long enough
- The loss before the win can be arbitrarily large (it is not bounded)
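A simulation sketch of the doubling strategy (the horizon values below are arbitrary cutoffs): at any fixed $n$ the sample mean of $M_{T \wedge n}$ is near $0$, yet $M_T = 1$ on every path, which is exactly the failure of optional stopping described above.

```python
import numpy as np

rng = np.random.default_rng(5)
n_paths = 500_000

# T = round of the first win in a fair game: P(T = n) = 2^(-n), so P(T < infinity) = 1.
T = rng.geometric(0.5, size=n_paths)

# Net gain of the doubling strategy: M_{T∧n} = 1 if T <= n, and -(2^n - 1) otherwise.
for n in [1, 4, 8, 12]:
    M = np.where(T <= n, 1.0, -(2.0**n - 1.0))
    print(f"n = {n:2d}   sample mean of M_(T∧n) = {M.mean():+.4f}   (theory: 0)")

# Yet M_T = 1 on every path, so E[M_T] = 1 != 0 = E[M_1]: Doob's theorem cannot be
# applied, since T is unbounded, M_{T∧n} is unbounded below, and the stakes keep growing.
print(f"E[M_T] = 1 on every path,   sample E[T] = {T.mean():.3f}   (theory: 2)")
```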
5.4 Doob's maximum inequality
(Theorem.1) Let $(M_n)$ be a martingale such that $E[|M_n|^p] < \infty$ for some $p \ge 1$. Then for every $\lambda > 0$:
$$P\Big(\max_{1 \le k \le n} |M_k| \ge \lambda\Big) \le \frac{E[|M_n|^p]}{\lambda^p},$$
and if $p > 1$,
$$E\Big[\max_{1 \le k \le n} |M_k|^p\Big] \le \Big(\frac{p}{p-1}\Big)^p E[|M_n|^p].$$
That is, for martingales, the probability and the expectation of the running maximum can be controlled by the expectation of $|M_n|^p$.
Proof
Let
$$T = \min\{k : |M_k| \ge \lambda\} \wedge n.$$
$T$ is a bounded stopping time. Since $(|M_n|^p)$ is a submartingale, we particularly have
$$E[|M_T|^p] \le E[|M_n|^p].$$
Note that
$$\Big\{\max_{1 \le k \le n} |M_k| \ge \lambda\Big\} = \{|M_T| \ge \lambda\},$$
and
$$E[|M_T|^p] \ge E\big[|M_T|^p \mathbf{1}_{\{|M_T| \ge \lambda\}}\big] \ge \lambda^p P(|M_T| \ge \lambda).$$
We have
$$\lambda^p P\Big(\max_{1 \le k \le n} |M_k| \ge \lambda\Big) \le E[|M_T|^p] \le E[|M_n|^p].$$
This immediately gives the first inequality.
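A numerical sketch of the first inequality with $p = 1$ (the walk length and the level $\lambda$ are arbitrary choices): for the fair-game martingale $S_n$, the simulated $P(\max_{k \le n} |S_k| \ge \lambda)$ indeed sits below $E[|S_n|]/\lambda$.

```python
import numpy as np

rng = np.random.default_rng(6)
n_paths, n, lam = 200_000, 100, 15.0

steps = rng.choice([-1, 1], size=(n_paths, n))
S = steps.cumsum(axis=1)                         # fair-game martingale S_1, ..., S_n

running_max = np.abs(S).max(axis=1)              # max_{1 <= k <= n} |S_k|
lhs = (running_max >= lam).mean()                # P(max |S_k| >= lambda)
rhs = np.abs(S[:, -1]).mean() / lam              # E[|S_n|] / lambda   (the p = 1 bound)
print(f"P(max |S_k| >= {lam}) = {lhs:.4f}  <=  E[|S_n|]/lambda = {rhs:.4f}")
```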
For a real-valued $(\mathcal{F}_n)$-adapted process $(X_n)$ and an interval $[a, b]$ with $a < b$, define a sequence of stopping times as follows:
$$T_0 = 0, \qquad S_k = \min\{n \ge T_{k-1} : X_n \le a\}, \qquad T_k = \min\{n \ge S_k : X_n \ge b\}, \quad k \ge 1.$$
We set
$$U_n([a, b]) = \max\{k : T_k \le n\}.$$
Then $U_n([a, b])$ is the number of upcrossings of $(X_m)_{m \le n}$ for the interval $[a, b]$.
(Theorem.2) Let $(X_n)$ be a supermartingale. We have
$$E\big[U_n([a, b])\big] \le \frac{E[(X_n - a)^-]}{b - a}.$$
Proof
We can assume that the process is stopped at $n$, i.e. $X_m = X_{m \wedge n}$; otherwise consider the process $X_{m \wedge n}$, which is again a supermartingale and has the same upcrossings of $[a, b]$ up to time $n$. Set
$$D_k = X_{T_k \wedge n} - X_{S_k \wedge n}.$$
Then $E[D_k] \le 0$ by the optional stopping theorem for supermartingales, and $D_k \ge b - a$ on $\{T_k \le n\}$, while $D_k \ge -(X_n - a)^-$ on $\{S_k \le n < T_k\}$ and $D_k = 0$ on $\{S_k > n\}$. Note that $\{T_k \le n\}$ is the event that the process has at least $k$ upcrossings for the interval $[a, b]$. Hence, we have
$$0 \ge E[D_k] \ge (b - a) P(T_k \le n) - E\big[(X_n - a)^- \mathbf{1}_{\{S_k \le n < T_k\}}\big].$$
Thus,
$$(b - a) P(T_k \le n) \le E\big[(X_n - a)^- \mathbf{1}_{\{S_k \le n < T_k\}}\big].$$
Adding the above inequality from $k = 1$ to $\infty$ (the events $\{S_k \le n < T_k\}$ are disjoint in $k$) we obtain
$$(b - a) E\big[U_n([a, b])\big] = (b - a) \sum_{k \ge 1} P(T_k \le n) \le E[(X_n - a)^-].$$
(Theorem.3) Let $(M_n)$ be a martingale such that $\sup_n E[|M_n|] \le K$ for some constant $K$. Then
$$\lim_{n \to \infty} M_n \text{ exists}$$
almost surely.
Proof
For $a < b$, set
$$U_\infty([a, b]) = \lim_{n \to \infty} U_n([a, b]).$$
$U_\infty([a, b])$ is the number of upcrossings of the interval $[a, b]$ by the process $(M_n)$. By Theorem.2 we have
$$E\big[U_n([a, b])\big] \le \frac{E[(M_n - a)^-]}{b - a} \le \frac{|a| + K}{b - a}.$$
This yields that $E[U_\infty([a, b])] \le \dfrac{|a| + K}{b - a} < \infty$, where $K = \sup_n E[|M_n|]$. Set
$$\Lambda_{a,b} = \Big\{\liminf_{n \to \infty} M_n < a < b < \limsup_{n \to \infty} M_n\Big\}.$$
Then $\Lambda_{a,b} \subset \{U_\infty([a, b]) = \infty\}$ and hence $P(\Lambda_{a,b}) = 0$. Since
$$\Big\{\lim_{n \to \infty} M_n \text{ does not exist}\Big\} = \bigcup_{a < b,\ a, b \in \mathbb{Q}} \Lambda_{a,b},$$
where $\mathbb{Q}$ denotes the set of rational numbers, we deduce that
$$P\Big(\lim_{n \to \infty} M_n \text{ does not exist}\Big) = 0.$$
Hence $\lim_{n \to \infty} M_n$ exists almost surely.
Example 1
Let $\mathcal{F}_1 \subset \mathcal{F}_2 \subset \cdots$ be a sequence of increasing $\sigma$-fields. Let $X$ be an integrable random variable. Set $M_n = E[X \mid \mathcal{F}_n]$. Explain why the limit
$$\lim_{n \to \infty} M_n$$
exists.
Solution: We knew from previous examples (Ex2) that $(M_n)$ is a martingale. Moreover, we have
$$E[|M_n|] = E\big[|E[X \mid \mathcal{F}_n]|\big] \le E\big[E[|X| \mid \mathcal{F}_n]\big] = E[|X|] < \infty.$$
By Theorem.3, $\lim_{n \to \infty} M_n$ exists almost surely.
Example 2
Let $X_1, X_2, \dots$ be independent random variables with $E[X_n^2] < \infty$. If $\sum_{n=1}^{\infty} \operatorname{Var}(X_n) < \infty$ and $\sum_{n=1}^{\infty} E[X_n]$ converges, show that $\sum_{n=1}^{\infty} X_n$ converges almost surely.
Solution: Set $M_n = \sum_{k=1}^{n} \big(X_k - E[X_k]\big)$. It is easy to verify that $(M_n)$ is a martingale. Furthermore,
$$E[|M_n|] \le \sqrt{E[M_n^2]} = \sqrt{\sum_{k=1}^{n} \operatorname{Var}(X_k)} \le \sqrt{\sum_{k=1}^{\infty} \operatorname{Var}(X_k)} < \infty.$$
It follows from the martingale convergence theorem that
$$\lim_{n \to \infty} M_n = \lim_{n \to \infty} \sum_{k=1}^{n} \big(X_k - E[X_k]\big)$$
exists. Since by assumption
$$\sum_{k=1}^{\infty} E[X_k]$$
exists, we conclude that
$$\sum_{k=1}^{\infty} X_k = \lim_{n \to \infty} M_n + \sum_{k=1}^{\infty} E[X_k]$$
exists.
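A simulation sketch of this example (the choice $X_n = Z_n / n$ with $Z_n$ i.i.d. standard normal is my own hypothetical example satisfying the assumptions): individual partial-sum paths settle down to a finite limit.

```python
import numpy as np

rng = np.random.default_rng(7)
N = 100_000

# Hypothetical choice satisfying the assumptions: X_n = Z_n / n with Z_n iid standard
# normal, so E[X_n] = 0 and sum_n Var(X_n) = sum_n 1/n^2 < infinity.
n = np.arange(1, N + 1)
for path in range(3):
    X = rng.standard_normal(N) / n
    partial = X.cumsum()                         # partial sums of the series
    print(f"path {path}:  S_100 = {partial[99]:+.4f}   "
          f"S_10000 = {partial[9999]:+.4f}   S_100000 = {partial[-1]:+.4f}")
```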