- Probability Theory and Stochastic Processes - PTSP
- Note
**Jawaharlal Nehru Technological University Anantapur (JNTU) College of Engineering (CEP), Pulivendula, Andhra Pradesh, India - JNTUACEP** - 7 Topics

- Contents and Preface
- Probability, measure and integration
- Conditional expectation and Hilbert spaces
- Stochastic Processes: general theory
- Martingales and Stopping times
- The Brownian motion
- Markov, Poisson and Jump processes

Contents

- Preface
- Chapter 1. Probability, measure and integration
  - 1.1. Probability spaces and σ-fields
  - 1.2. Random variables and their expectation
  - 1.3. Convergence of random variables
  - 1.4. Independence, weak convergence and uniform integrability
- Chapter 2. Conditional expectation and Hilbert spaces
  - 2.1. Conditional expectation: existence and uniqueness
  - 2.2. Hilbert spaces
  - 2.3. Properties of the conditional expectation
  - 2.4. Regular conditional probability
- Chapter 3. Stochastic Processes: general theory
  - 3.1. Definition, distribution and versions
  - 3.2. Characteristic functions, Gaussian variables and processes
  - 3.3. Sample path continuity
- Chapter 4. Martingales and stopping times
  - 4.1. Discrete time martingales and filtrations
  - 4.2. Continuous time martingales and right continuous filtrations
  - 4.3. Stopping times and the optional stopping theorem
  - 4.4. Martingale representations and inequalities
  - 4.5. Martingale convergence theorems
  - 4.6. Branching processes: extinction probabilities
- Chapter 5. The Brownian motion
  - 5.1. Brownian motion: definition and construction
  - 5.2. The reflection principle and Brownian hitting times
  - 5.3. Smoothness and variation of the Brownian sample path
- Chapter 6. Markov, Poisson and Jump processes
  - 6.1. Markov chains and processes
  - 6.2. Poisson process, Exponential inter-arrivals and order statistics
  - 6.3. Markov jump processes, compound Poisson processes
- Bibliography
- Index

Preface

These are the lecture notes for a one quarter graduate course in Stochastic Processes that I taught at Stanford University in 2002 and 2003. This course is intended for incoming master's students in Stanford's Financial Mathematics program, for advanced undergraduates majoring in mathematics, and for graduate students from Engineering, Economics, Statistics or the Business school.

One purpose of this text is to prepare students for a rigorous study of Stochastic Differential Equations. More broadly, its goal is to help the reader understand the basic concepts of measure theory that are relevant to the mathematical theory of probability, and how they apply to the rigorous construction of the most fundamental classes of stochastic processes. Towards this goal, we introduce in Chapter 1 the relevant elements from measure and integration theory, namely, the probability space and the σ-fields of events in it, random variables viewed as measurable functions, their expectation as the corresponding Lebesgue integral, independence, distribution and various notions of convergence. This is supplemented in Chapter 2 by the study of the conditional expectation, viewed as a random variable defined via the theory of orthogonal projections in Hilbert spaces.

After this exploration of the foundations of Probability Theory, we turn in Chapter 3 to the general theory of Stochastic Processes, with an eye towards processes indexed by a continuous time parameter, such as the Brownian motion of Chapter 5 and the Markov jump processes of Chapter 6. With this in mind, Chapter 3 is about the finite dimensional distributions and their relation to sample path continuity. Along the way we also introduce the concepts of stationary and Gaussian stochastic processes.

Chapter 4 deals with filtrations, the mathematical notion of information progression in time, and with the associated collection of stochastic processes called martingales.
We treat both discrete and continuous time settings, emphasizing the importance of right-continuity of the sample path and filtration in the latter case. Martingale representations are explored, as well as maximal inequalities, convergence theorems and applications to the study of stopping times and to the extinction of branching processes.

Chapter 5 provides an introduction to the beautiful theory of the Brownian motion. It is rigorously constructed here via Hilbert space theory and shown to be a Gaussian martingale process with stationary independent increments, continuous sample path, and the strong Markov property. A few of the many explicit computations known for this process are also demonstrated, mostly in the context of hitting times, running maxima, and sample path smoothness and regularity.

Chapter 6 provides a brief introduction to the theory of Markov chains and processes, a vast subject at the core of probability theory to which many textbooks are devoted. We illustrate some of the interesting mathematical properties of such processes by examining the special case of the Poisson process, and more generally, that of Markov jump processes.

As is clear from the preceding, it normally takes more than a year to cover the scope of this text, even more so given that the intended audience for this course has only minimal prior exposure to stochastic processes (beyond the usual elementary probability class covering only discrete settings and variables with probability density function). While students are assumed to have taken a real analysis class dealing with Riemann integration, no prior knowledge of measure theory is assumed here. The unusual solution to this set of constraints is to provide rigorous definitions, examples and theorem statements, while forgoing the proofs of all but the easiest derivations. At this somewhat superficial level, one can cover everything in a one semester course of forty lecture hours (and if one has highly motivated students such as I had at Stanford, even a one quarter course of thirty lecture hours might work).

In preparing this text I was much influenced by Zakai's unpublished lecture notes [Zak]. Revised and expanded by Shwartz and Zeitouni, they are used to this day for teaching Electrical Engineering PhD students at the Technion, Israel. A second source for this text is Breiman's [Bre92], which was the intended textbook for my class in 2002, until I realized it would not do given the preceding constraints. The resulting text is thus a mixture of these influences with some digressions and additions of my own. I thank my students out of whose work this text materialized.
Most notably I thank Nageeb Ali, Ajar Ashyrkulova, Alessia Falsarone and Che-Lin Su, who wrote the first draft out of notes taken in class; Barney Hartman-Glaser, Michael He, Chin-Lum Kwa and Chee-Hau Tan, who used their own class notes a year later in a major revision, reorganization and expansion of this draft; and Gary Huang and Mary Tian, who helped me with the intricacies of LaTeX. I am much indebted to my colleague Kevin Ross for providing many of the exercises and all the figures in this text. Kevin's detailed feedback on an earlier draft of these notes has also been extremely helpful in improving the presentation of many key concepts.

Amir Dembo
Stanford, California
January 2008

CHAPTER 1

Probability, measure and integration

This chapter is devoted to the mathematical foundations of probability theory. Section 1.1 introduces the basic measure theory framework, namely, the probability space and the σ-fields of events in it. The next building blocks are random variables, introduced in Section 1.2 as measurable functions ω ↦ X(ω). This allows us to define the important concept of expectation as the corresponding Lebesgue integral, extending the horizon of our discussion beyond the special functions and variables with density to which elementary probability theory is limited. As much of probability theory is about asymptotics, Section 1.3 deals with various notions of convergence of random variables and the relations between them. Section 1.4 concludes the chapter by considering independence and distribution, the two fundamental aspects that differentiate probability from (general) measure theory, as well as the related and highly useful technical tools of weak convergence and uniform integrability.

1.1. Probability spaces and σ-fields

We shall define here the probability space (Ω, F, P) using the terminology of measure theory. The sample space Ω is the set of all possible outcomes ω ∈ Ω of some random experiment or phenomenon. Probabilities are assigned by a set function A ↦ P(A) to A in a subset F of all possible sets of outcomes. The event space F represents both the amount of information available as a result of the experiment conducted and the collection of all events of possible interest to us. A pleasant mathematical framework results by imposing on F the structural conditions of a σ-field, as done in Subsection 1.1.1. The most common and useful choices for this σ-field are then explored in Subsection 1.1.2.

1.1.1. The probability space (Ω, F, P). We use 2^Ω to denote the set of all possible subsets of Ω.
The event space is thus a subset F of 2^Ω, consisting of all allowed events, that is, those events to which we shall assign probabilities. We next define the structural conditions imposed on F.

Definition 1.1.1. We say that F ⊆ 2^Ω is a σ-field (or a σ-algebra), if
(a) Ω ∈ F,
(b) if A ∈ F then Aᶜ ∈ F as well (where Aᶜ = Ω \ A),
(c) if Aᵢ ∈ F for i = 1, 2, . . ., then also ⋃_{i=1}^∞ Aᵢ ∈ F.

Remark. Using De Morgan's law you can easily check that if Aᵢ ∈ F for i = 1, 2, . . . and F is a σ-field, then ⋂ᵢ Aᵢ ∈ F as well. Similarly, you can show that a σ-field is closed under countably many elementary set operations.
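As a small concrete illustration (not from the text; the function names here are our own), the conditions of Definition 1.1.1 can be checked mechanically for a finite sample space. The sketch below verifies that the power set 2^Ω is always a σ-field, and demonstrates the De Morgan argument of the remark: an intersection is the complement of a union of complements, so closure under complements and unions forces closure under intersections. For a finite collection of events, closure under countable unions reduces to closure under pairwise unions.

```python
from itertools import chain, combinations

def power_set(omega):
    """Return 2^Omega, the collection of all subsets of omega, as frozensets."""
    items = list(omega)
    return {frozenset(c)
            for c in chain.from_iterable(combinations(items, r)
                                         for r in range(len(items) + 1))}

def is_sigma_field(F, omega):
    """Check conditions (a)-(c) of Definition 1.1.1 for a finite omega."""
    omega = frozenset(omega)
    if omega not in F:                              # (a) Omega is in F
        return False
    if any(omega - A not in F for A in F):          # (b) A in F implies A^c in F
        return False
    return all(A | B in F for A in F for B in F)    # (c) closure under unions

omega = {1, 2, 3}
F = power_set(omega)
print(is_sigma_field(F, omega))   # prints True: 2^Omega is always a sigma-field

# De Morgan's law: A ∩ B = (A^c ∪ B^c)^c, hence the intersection also lies in F.
A, B = frozenset({1, 2}), frozenset({2, 3})
lhs = A & B
rhs = frozenset(omega) - ((frozenset(omega) - A) | (frozenset(omega) - B))
assert lhs == rhs and lhs in F
```

By contrast, a collection such as {Ω, ∅, {1}} is not a σ-field: it fails condition (b), since the complement {2, 3} of {1} is missing.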
